Curl is a tool to transfer data from or to a server, using one of the supported protocols:

DICT, FILE, FTP, FTPS, GOPHER, HTTP, HTTPS, IMAP, IMAPS, LDAP, LDAPS, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMB, SMBS, SMTP, SMTPS, TELNET and TFTP. 

The curl command has the following functionality:

  • Multiple URLs
  • Username and password support
  • IPv6 support
  • Retry failed download
  • URL globbing/sequences
  • Win32 support
  • Large file support
  • GnuTLS support
  • DarwinSSL support
  • Schannel support
  • CyaSSL support
  • PolarSSL support
  • AxTLS support
  • SSL Session ID
  • SSL Private Certificate
  • netrc support
  • Metalink support
  • IDN support
  • Bandwidth limiting
  • Happy eyeballs
  • SOCKS
  • TFTP
  • SCP upload/download
  • SFTP upload/download
  • HTTP Proxy
  • HTTP Resume
  • HTTP Ranges
  • Follow HTTP Redirects
  • HTTP Post
  • HTTP Post Chunked
  • HTTP Put
  • Cookie support
  • HTTP 1.1
  • HTTP 2 (plain text upgrade)
  • HTTP 2 (TLS ALPN)
  • HTTP 2 (TLS NPN)
  • HTTP persistent connections
  • HTTPS
  • HTTP Digest Auth
  • HTTP NTLM Auth
  • HTTP Negotiate Auth
  • HTTP multipart formpost
  • HTTP gzip/deflate compression
  • FTP resume
  • FTP ranges
  • FTP active mode
  • FTP SSL
  • FTP upload
  • FTP Kerberos
  • FTP Connection re-use
  • GOPHER

One-Liners

Save the output of a URL to a file:
    curl -o website.html https://domain.com
    curl -o archive.zip https://domain.com/file.zip
    curl https://domain.com > website.html
Save with the same name as the remote file:
    curl -O https://domain.com/file.zip
    curl -O https://domain.com/file.zip -O https://domain.com/file2.zip
Download a file securely over SSH (SFTP):
    curl -u user sftp://server.domain.com/path/to/file
Access a password-protected page:
    curl -u username:password https://domain.com
Get HTTP header information:
    curl -I http://domain.com
Access an FTP server:
    curl ftp://ftp.domain.com --user username:password
Download a file via FTP:
    curl ftp://ftp.domain.com/file.zip --user username:password
    curl -u ftpuser:password -O ftp://ftp_pub/public_html/index.html
Upload a file to an FTP server:
    curl -T file.zip ftp://ftp.domain.com/ --user username:password
    curl -u ftpuser:password -T linuxtechi.txt ftp://ftp_pub/public_html/
Upload multiple files to an FTP server:
    curl -u ftpuser:password -T "{linuxtechi1.txt,linuxtechi2.txt}" ftp://ftp_pub/public_html/
Delete a file from an FTP server:
    curl ftp://ftp_pub/public_html/ -X 'DELE linuxtechi.zip' --user ftpuser:password
Follow URL redirects:
    curl -L http://domain.com
Show debug-level details:
    curl -v http://domain.com
Use a proxy to download a file:
    curl -x proxy.server.com:3128 https://domain.com
Limit the data transfer rate:
    curl --limit-rate 1k -O https://domain.com/file.zip
Download a file only if it was modified after a given date:
    curl -z 1-Jan-17 https://domain.com
Download a file only if it was modified before a given date:
    curl -z -1-Jan-17 https://domain.com
Resume a download:
    curl -C - -O https://domain.com/file.zip
Verify the server against a given CA certificate:
    curl --cacert new-ca.crt https://domain.com
Ignore SSL certificate errors:
    curl -k https://domain.com

URL syntax

  • You can specify multiple URLs or parts of URLs by writing part sets within braces as in:
http://site.{one,two,three}.com
  • You can get sequences of alphanumeric series by using [] as in:
 ftp://ftp.numericals.com/file[1-100].txt
 ftp://ftp.numericals.com/file[001-100].txt
 ftp://ftp.letters.com/file[a-z].txt
  • Nested sequences are not supported, but you can use several ones next to each other:
http://any.org/archive[1996-1999]/vol[1-4]/part{a,b,c}.html
  • You can specify a step counter for the ranges to get every Nth number or letter:
 http://www.numericals.com/file[1-100:10].txt
 http://www.letters.com/file[a-z:2].txt
  • If you specify a URL without a protocol:// prefix, curl will attempt to guess which protocol you want. It defaults to HTTP, but tries other protocols based on often-used host name prefixes; for example, for host names starting with "ftp." curl will assume you want to speak FTP.
  • Curl will attempt to re-use connections for multiple file transfers, so that getting many files from the same server will not do multiple connects/handshakes. This improves speed. Of course, this is only done on files specified on a single command line and cannot be used between separate curl invocations (see the example below).
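
As a minimal sketch combining the globbing above with output naming: the "#1" variable in -o expands to the current value of the first glob, so each downloaded file gets its own name (www.numericals.com and the file names are the placeholder examples used above):
 curl "http://www.numericals.com/file[1-100:10].txt" -o "file_#1.txt"
Because all ten files are requested in a single invocation against the same server, curl re-uses the connection between transfers.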

Switches

Switch Description/Usage
-# Progress Meter
-a Append in FTP/SFTP
-A <agent string> User-agent string to send, e.g. "Mozilla/4.0"
--anyauth Tells curl to figure out the authentication method by itself
-b <name=data> Cookie ("NAME1=VALUE1; NAME2=VALUE2"). If there is no '=' character, the string is treated as a filename to read cookies from.

This is only used as input. No cookies will be stored in the file. To store cookies, use -c or -D

-c <file name> Cookie-jar(file curl should use to save all cookies)
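
A minimal sketch of how -b and -c work together: the first command saves any cookies returned by a login into a jar, the second sends them back on a later request (example.com, the paths and the form fields are placeholders):
curl -c cookies.txt -d "user=alice&pass=secret" https://example.com/login
curl -b cookies.txt https://example.com/account
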
-D <file> Write the protocol headers to the specified file.
-B FTP/LDAP - Enable ASCII transfer.
--ciphers <list of ciphers> List of ciphers to be used.
--compressed Request a compressed response using one of the algorithms curl supports
--connect-timeout <seconds> Maximum time in seconds that the connection to the server may take.
-C <offset> Continue/Resume a previous file transfer at the given offset.

Use "-C -" to tell curl to automatically find out where/how to resume the transfer.

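For instance, to resume an interrupted download and let curl work out the offset itself (example.com and the file name are placeholders):
curl -C - -O https://example.com/big-archive.zip
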
--crlfile <file> (HTTPS/FTPS) Provide a file using PEM format with a Certificate Revocation List that may specify peer certificates that are to be considered revoked.
-d <data> HTTP - Sends the specified data in a POST request to the HTTP server, emulating a user filling in an HTML form and pressing the submit button.
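
For example, posting two form fields (example.com, the path and the field names are placeholders); by default curl sends them as application/x-www-form-urlencoded:
curl -d "name=admin&shoesize=12" https://example.com/form.cgi
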
--digest Enables HTTP Digest authentication.
-e <URL> Sends the "Referer Page" information to the HTTP server.
--engine <name> Select the OpenSSL crypto engine to use for cipher operations. Use "--engine list" to view the available engines.
--cert <certificate[:password]> Use the specified client certificate file. Certificate must be in PEM format.

If the optional password isn't specified, it will be prompted for.
This assumes that the certificate file contains both the private key and the certificate concatenated; use "--cert" and "--key" to specify them separately.

--cert-type <type> Tells curl what certificate type the provided certificate is in. PEM, DER and ENG are recognized types. If not specified, PEM is assumed.
--cacert <CA certificate> Tells curl to use the specified certificate file to verify the peer.
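
A minimal sketch of client-certificate authentication with server verification against a local CA bundle (all file names and the pass phrase are placeholders):
curl --cert client.pem:s3cret --cert-type PEM --cacert ca-bundle.crt https://example.com/
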
-G Makes all data specified with -d be used in an HTTP GET request instead of a POST.
-H <header> Extra header to use when getting a web page. You may specify any number of extra headers.
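
For example, combining -G and -H: the -d data is appended to the URL as a query string and a custom header is sent (example.com and the values are placeholders):
curl -G -d "q=curl" -d "lang=en" -H "Accept: application/json" https://example.com/search
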
-i Include the HTTP-header in the output.
--interface <name> Perform an operation using a specified interface, IP address or host name. eg: curl --interface eth0:1 http://www.netscape.com/
-I Fetch the HTTP-header only.
-j Discard all "session cookies". Same effect as if a new session is started.
--config <config file> Specify which config file to read curl arguments from.
--limit-rate <speed> Specify the maximum transfer rate you want curl to use.
--max-filesize <bytes> Specify the maximum size (in bytes) of a file to download. If the requested file is larger than this limit, the transfer does not start and curl returns exit code 63.
--negotiate Enables Negotiate (SPNEGO) authentication. Primarily meant as support for Kerberos5 authentication, but it may also be used together with other authentication methods.
--ntlm Enables NTLM authentication.
-o <file> Write output to file instead of stdout.
-O Write output to a local file named like the remote file we get. The file will be saved in the current working directory.
--proto <protocols> Tells curl to limit which protocols it may use for the transfer. Each protocol name in the list may be prefixed with:

+ Permit this protocol in addition to protocols already permitted.
- Deny this protocol, removing it from the list of protocols already permitted.
= Permit only this protocol.
Ex: --proto -ftps uses the default protocols, but disables ftps
--proto -all,https,+http only enables http and https
--proto =http,https also only enables http and https

--pass <phrase> (SSL/SSH) Pass phrase for the private key.
--pubkey <key> (SSH) Public key file name.
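
A minimal sketch of key-based SFTP authentication, using --key for the private key and --pass for its passphrase (host, user and key paths are placeholders):
curl -u demo: --key ~/.ssh/id_rsa --pubkey ~/.ssh/id_rsa.pub --pass 'keypassphrase' sftp://example.com/home/demo/file.txt
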
--retry <num> If a transient error is returned, curl will retry this number of times before giving up. Setting the number to 0 makes curl do no retries (which is the default).
--retry-delay <seconds> Make curl sleep this amount of time between each retry
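
For example, retrying a flaky download up to 5 times with 10 seconds between attempts (example.com and the file name are placeholders):
curl --retry 5 --retry-delay 10 -O https://example.com/file.zip
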
-s Silent mode. Don't show progress meter or error messages.
-S When used with -s, it makes curl show an error message if it fails.
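
For example, hiding the progress meter while still reporting failures (example.com is a placeholder):
curl -sS -o page.html https://example.com/
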
--ssl Try to use SSL/TLS for the connection. Reverts to a non-secure connection if the server doesn't support SSL/TLS.
--ssl-reqd Require SSL/TLS for the connection. Terminates the connection if the server doesn't support SSL/TLS.
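
For example, refusing the transfer unless the FTP server upgrades the connection to TLS (host and credentials are placeholders):
curl --ssl-reqd ftp://ftp.example.com/file.txt --user username:password
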
-T <file> This transfers the specified local file to the remote URL.

curl -T "{file1,file2}" http://www.uploadtothissite.com
curl -T "img[1-1000].png" ftp://ftp.picturemania.com/upload/

--trace <file> Enables a full trace dump of all incoming and outgoing data
--trace-ascii <file> Like --trace, but leaves out the hex part and only shows the ASCII part of the dump.
--trace-time Prepends a time stamp to each trace or verbose line that curl displays.
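
For example, writing a readable, time-stamped trace of a request to a file (example.com and the file name are placeholders):
curl --trace-ascii dump.txt --trace-time https://example.com/
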
-u <user:password> Specify user and password to use for server authentication.
-U <user:password> Specify user and password to use for proxy authentication.
--url <URL> Specify a URL to fetch.
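
For example, authenticating to both a proxy and the target server (proxy, host and credentials are placeholders; -x, described below, selects the proxy):
curl -x proxy.example.com:3128 -U proxyuser:proxypass -u siteuser:sitepass https://example.com/
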
-v Makes the fetching more verbose/talkative.
-w <format> Defines what to display on stdout after a completed and successful operation.
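
For example, printing only the HTTP status code of a request (example.com is a placeholder; %{http_code} is one of the variables -w understands):
curl -s -o /dev/null -w "%{http_code}\n" https://example.com/
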
-x <[protocol://][user:password@]proxyhost[:port]> Use the specified HTTP proxy. If the port number is not specified, it is assumed to be 1080.
-X <command> Specifies a custom request method to use when communicating with the HTTP server. The specified method will be used instead of the one otherwise used (which defaults to GET).
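
For example, sending a DELETE request to a REST-style endpoint (example.com and the path are placeholders):
curl -X DELETE https://example.com/api/items/42
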
-y <time> Sets the speed-time period used together with -Y: if a download runs slower than the -Y speed limit for this many seconds, it gets aborted.
-Y <speed> If a download is slower than this given speed, in bytes per second, for speed-time seconds it gets aborted.
-z <date expression>|<file> Request a file that has been modified later than the given time and date, or one that has been modified before that time.
--max-redirs <num> Set the maximum number of redirects to follow.
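
For example, following at most 5 redirects (example.com is a placeholder):
curl -L --max-redirs 5 http://example.com/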

References

https://www.computerhope.com/unix/curl.htm
