Curl command has the following functionality:
<div style="column-count:3;-moz-column-count:3;-webkit-column-count:3">
* Multiple URLs
* Usernames and Passwords support
* IPv6 support
* Retry failed download
* URL globbing/sequences
* Win32 support
* Large file support
* GnuTLS support
* DarwinSSL support
* Schannel support
* CyaSSL support
* PolarSSL support
* axTLS support
* SSL Session ID
* SSL Private Certificate
* netrc support
* Metalink support
* IDN support
* Bandwidth limiting
* Happy eyeballs
* SOCKS
* TFTP
* SCP upload/download
* SFTP upload/download
* HTTP Proxy
* HTTP Resume
* HTTP Ranges
* Follow HTTP Redirects
* HTTP Post
* HTTP Post Chunked
* HTTP Put
* Cookie support
* HTTP 1.1
* HTTP 2 (plain text upgrade)
* HTTP 2 (TLS ALPN)
* HTTP 2 (TLS NPN)
* HTTP persistent connections
* HTTPS
* HTTP Digest Auth
* HTTP NTLM Auth
* HTTP Negotiate Auth
* HTTP Multipart Post
* HTTP Deflate gzip
* FTP resume
* FTP ranges
* FTP active mode
* FTP SSL
* FTP upload
* FTP Kerberos
* FTP Connection re-use
* GOPHER
</div>
 
= One-Liners =
 
* Save the output of the URL to a file
curl -o archive.zip https://domain.com/file.zip
curl -o website.html https://domain.com

* Save with name same as remote file
curl -O https://domain.com/file.zip
curl -O https://domain.com/file.zip -O https://domain.com/file2.zip

* Download files securely via SSH
curl -u user sftp://server.domain.com/path/to/file

* Access a password-protected site
curl -u username:password https://domain.com

* Get HTTP header information
curl -I http://domain.com

* Access an FTP server
curl ftp://ftp.domain.com --user username:password

* Download a file via FTP
curl ftp://ftp.domain.com/file.zip --user username:password
curl -u ftpuser:password -O ftp://ftp_pub/public_html/index.html

* Upload a file to the FTP server
curl -T file.zip ftp://ftp.domain.com/ --user username:password
curl -u ftpuser:password -T linuxtechi.txt ftp://ftp_pub/public_html/
 
* To upload multiple files to the FTP server
curl -u ftpuser:password -T "{linuxtechi1.txt,linuxtechi2.txt}" ftp://ftp_pub/public_html/
 
* Deleting files from ftp server
curl ftp://ftp_pub/public_html -X 'DELE linuxtechi.zip' --user ftpuser:password
 
* Handle URL redirects
curl -L http://domain.com
 
* Debug level details
curl -v http://domain.com
 
* Using proxy to download a file
curl -x proxy.server.com:3128 https://domain.com
 
* Limit data transfer rate
curl --limit-rate 1024B -O https://domain.com
 
* Download file modified after a given date
curl -z 1-Jan-17 https://domain.com
 
* Download file modified before a given date
curl -z -1-Jan-17 https://domain.com
 
* Resume a download
curl -C - -O https://domain.com/file.zip
 
* Verifying SSL certificate
curl --cacert new-ca.crt https://domain.com
 
* Ignoring the ssl certificate warning
curl -k https://domain.com
 
* Getting information about supported methods
curl -i -X OPTIONS http://10.107.88.68:8082
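
* Send data with an HTTP POST (a supplementary example; the URL, form fields, and JSON body below are placeholders)
curl -d "name=value&email=user@example.com" https://domain.com/form
curl -H "Content-Type: application/json" -d '{"name":"value"}' https://domain.com/api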
 
= Scripts =
 
* Testing Response Times:
while true; do curl -s -w 'Testing Response Time for :%{url_effective}\n\nLookup Time:\t\t%{time_namelookup}\nConnectTime:\t\t%{time_connect}\nPre-transfer Time:\t%{time_pretransfer}\nStart-transfer Time:\t%{time_starttransfer}\n\nTotal Time:\t\t%{time_total}\n' -o /dev/null https://google.com ; sleep 10; done
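
* Polling HTTP status and total time (a minimal sketch along the same lines; the URL and the 10-second interval are placeholders):
while true; do curl -s -o /dev/null -w '%{url_effective} %{http_code} %{time_total}s\n' https://google.com ; sleep 10; done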
 
=URL syntax=
 
*You can specify a step counter for the ranges to get every Nth number or letter:
<nowiki> http://www.numericals.com/file[1-100:10].txt
http://www.letters.com/file[a-z:2].txt</nowiki>
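
For example, assuming the same placeholder hosts, each matched URL can be saved under its own name with -O, or via the '#1' output variable that expands to the current glob value:
<nowiki>curl -O "http://www.letters.com/file[a-z:2].txt"
curl -o "file_#1.txt" "http://www.numericals.com/file[1-100:10].txt"</nowiki>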
 
=Switches=
{| class="wikitable"
|-
!Switch !! Description/Usage
|-
| -# || Progress Meter
|-
| -a || Append in FTP/SFTP
|-
| -A <agent string> || User-Agent string, e.g. "Mozilla/4.0"
|-
| --anyauth || Tells curl to figure out auth method
|-
| -b <name=data> || Cookie data ("NAME1=VALUE1; NAME2=VALUE2"). If there is no '=', the argument is treated as a filename to read cookies from.<br />
This is only used as input; no cookies will be stored in that file. To store cookies, use -c or -D.<br />
|-
| -c <file name> || Cookie-jar(file curl should use to save all cookies)
|-
| -D <file> || Write the protocol headers to the specified file.
|-
| -B || FTP/LDAP - Enable ASCII transfer.
|-
| --ciphers <list of ciphers> || [https://www.openssl.org/docs/man1.1.0/apps/ciphers.html List of ciphers] to be used.
|-
| --compressed || Request a compressed response using one of the algorithms curl supports
|-
| --connect-timeout <seconds> || Maximum time in seconds that the connection to the server may take.
|-
| -C <offset> || Continue/Resume a previous file transfer at the given offset.<br />
Use "-C -" to tell curl to automatically find out where/how to resume the transfer.
|-
| --crlfile <file> || (HTTPS/FTPS) Provide a file using PEM format with a Certificate Revocation List that may specify peer certificates that are to be considered revoked.
|-
| -d <data> || HTTP - Sends the specified data in a POST request to the HTTP server, emulating a user who has filled in an HTML form and pressed the submit button. Prefix the value with @ to read the data from a file.
|-
| --digest || Enables HTTP Digest authentication.
|-
| -e <URL> || Sends the "Referer Page" information to the HTTP server.
|-
| --engine <name> || Select the OpenSSL crypto engine to use for cipher operations. Use "--engine list" to view list
|-
| --cert <certificate[:password]> || Use the specified client certificate file. The certificate must be in PEM format.<br />
If the optional password isn't specified, curl will prompt for it.<br />
This assumes the certificate file contains both the private key and the certificate concatenated. Use "--cert" and "--key" to specify them separately.
|-
| --cert-type <type> || Tells curl what certificate type the provided certificate is in. PEM, DER and ENG are recognized types. If not specified, PEM is assumed.
|-
| --cacert <CA certificate> || Tells curl to use the specified certificate file to verify the peer.
|-
| -G || Makes all data specified with -d be used in an HTTP GET request instead of a POST.
|-
| -H <header> || Extra header to use when getting a web page. You may specify any number of extra headers.
|-
| -i || Include the HTTP-header in the output.
|-
| --interface <name> || Perform an operation using a specified interface, IP address or host name. eg: curl --interface eth0:1 http://www.netscape.com/
|-
| -I || Fetch the HTTP-header only.
|-
| -j || Discard all "session cookies". Same effect as if a new session is started.
|-
| --config <config file> || Specify which config file to read curl arguments from.
|-
| --limit-rate <speed> || Specify the maximum transfer rate you want curl to use.
|-
| --max-filesize <bytes> || Specify the maximum size (in bytes) of a file to download. If the file is larger, curl returns exit code 63.
|-
| --negotiate || Enables Negotiate (SPNEGO) authentication. Primarily meant to support Kerberos5 authentication, but it may also be used along with other authentication methods.
|-
| --ntlm || Enables NTLM authentication.
|-
| -o <file> || Write output to file instead of stdout.
|-
| -O || Write output to a local file named like the remote file we get. The file will be saved in the current working directory.
|-
| --proto <protocols> || + Permit this protocol in addition to protocols already permitted.<br />
- Deny this protocol, removing it from the list of protocols already permitted.<br />
= Permit only this protocol <br />
Ex: '''--proto -ftps''' uses the default protocols, but disables ftps<br />
'''--proto -all,https,+http''' only enables http and https<br />
'''--proto =http,https''' also only enables http and https
|-
| --pass <phrase> || (SSL/SSH) Pass phrase for the private key.
|-
| --pubkey <key> || (SSH) Public key file name.
|-
| --retry <num> || If a transient error is returned, curl will retry this number of times before giving up. Setting the number to 0 disables retries (the default).
|-
| --retry-delay <seconds> || Make curl sleep this amount of time between each retry
|-
| -s || Silent mode. Don't show progress meter or error messages.
|-
| -S || When used with -s, makes curl show an error message if it fails.
|-
| --ssl || Try to use SSL/TLS for the connection. Reverts to a non-secure connection if the server doesn't support SSL/TLS.
|-
| --ssl-reqd || Require SSL/TLS for the connection. Terminates the connection if the server doesn't support SSL/TLS.
|-
| -T <file> || This transfers the specified local file to the remote URL.<br />
curl -T "{file1,file2}" http://www.uploadtothissite.com<br />
curl -T "img[1-1000].png" ftp://ftp.picturemania.com/upload/
|-
| --trace <file> || Enables a full trace dump of all incoming and outgoing data
|-
| --trace-ascii <file> || Like --trace, but leaves out the hex part and only shows the ASCII part of the dump.
|-
| --trace-time || Prepends a time stamp to each trace or verbose line that curl displays.
|-
| -u <user:password> || Specify user and password to use for server authentication.
|-
| -U <user:password> || Specify user and password to use for proxy authentication.
|-
| --url <URL> || Specify a URL to fetch.
|-
| -v || Makes the fetching more verbose/talkative.
|-
| -w <format> || Defines what to display on stdout after a completed and successful operation.
|-
| -x <[protocol://][user:password@]proxyhost[:port]> || Use the specified HTTP proxy. If the port number is not specified, it is assumed to be 1080.
|-
| -X <command> || Specifies a custom request method to use when communicating with the HTTP server. The specified request will be used instead of the method otherwise used (which defaults to GET)
|-
| -y <nowiki><time></nowiki> || Speed-time period: if a download is slower than the speed limit (-Y) in bytes per second for this many seconds, the download gets aborted.
|-
| -Y <speed> || Speed limit: if a download is slower than this given speed, in bytes per second, for the speed-time period (-y), it gets aborted.
|-
| <nowiki>-z <date expression>|<file></nowiki> || Request a file that has been modified later than the given time and date, or one that has been modified before that time.
|-
| --max-redirs <num> || Set maximum number of redirection-followings allowed.
|-
| -0 || Forces curl to issue its requests using HTTP 1.0 instead of its internally preferred HTTP 1.1.
|-
| -1 || Forces curl to use TLS version 1 when negotiating with a remote TLS server.
|-
| -2 || Forces curl to use SSL version 2 when negotiating with a remote SSL server.
|-
| -3 || Forces curl to use SSL version 3 when negotiating with a remote SSL server.
|-
| -4 || Tells libcurl to resolve names to IPv4 addresses only.
|-
| -6 || Tells libcurl to resolve names to IPv6 addresses only.
|}
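
As an illustrative combination of several switches from the table (the URL, file name, and numeric values are placeholders, not recommendations), a download that resumes a partial file, retries transient errors, limits bandwidth, and bounds the connection time could look like:
curl -O -C - --retry 3 --retry-delay 5 --connect-timeout 10 --limit-rate 500k https://domain.com/file.zip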
 
= Header Modifications =
 
Basic syntax for spoofing user agent:
curl -A "UserAgentString" https://aman.info.tm
 
Basic syntax for User Agent Spoofing along with other headers:
curl -A [user-agent] -H [headers] "https://aman.info.tm"
 
Two methods to spoof the User-Agent:
curl -L -A "Mozilla/5.0" https://aman.info.tm
curl -L -H "user-agent: Mozilla/5.0" https://aman.info.tm
 
 
One of the most common cases of a site serving different HTML and CSS is a stripped-down mobile version; you can retrieve the iPhone-specific source code with:
curl -A "Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_3_3 like Mac OS X; en-us) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8J2 Safari/6533.18.5" https://aman.info.tm
 
Some sites do this with other browsers too. This would be Chrome 12 in Mac OS X 10.6.8:
curl -A "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_8) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.112 Safari/534.30" https://aman.info.tm
 
Other Examples
{{UC}}
curl -L -H "Host: aman.info.tm" -H "Cache-Control: max-age=0" -H "Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8" -H "User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.89 Safari/537.36" -H "HTTPS: 1" -H "DNT: 1" -H "Referer: https://www.google.com/" -H "Accept-Language: en-US,en;q=0.8,en-GB;q=0.6,es;q=0.4" -H "If-Modified-Since: Thu, 23 Jul 2015 20:31:28 GMT" --compressed https://aman.info.tm
 
curl -L -H "User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.89 Safari/537.36" https://aman.info.tm
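
To check whether a site actually serves different content per User-Agent, one rough approach (the two agent strings below are abbreviated placeholders) is to fetch the page with each string and compare the reported download sizes:
for ua in "Mozilla/5.0" "Mozilla/5.0 (iPhone; CPU iPhone OS 4_3_3 like Mac OS X)"; do curl -s -L -A "$ua" -o /dev/null -w "$ua => %{size_download} bytes\n" https://aman.info.tm; done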
 
<br />