Curl

Curl is a tool to transfer data from or to a server, using one of the supported protocols: DICT, FILE, FTP, FTPS, GOPHER, HTTP, HTTPS, IMAP, IMAPS, LDAP, LDAPS, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMB, SMBS, SMTP, SMTPS, TELNET and TFTP.

The curl command has the following functionality:
 * Multiple URLs
 * Usernames and Passwords support
 * IPv6 support
 * Retry failed download
 * URL globbing/sequences
 * Win32 support
 * Large file support
 * GnuTLS support
 * DarwinSSL support
 * Schannel support
 * CyaSSL support
 * PolarSSL support
 * axTLS support
 * SSL Session ID
 * SSL Private Certificate
 * netrc support
 * Metalink support
 * IDN support
 * Bandwidth limiting
 * Happy eyeballs
 * SOCKS
 * TFTP
 * SCP upload/download
 * SFTP upload/download
 * HTTP Proxy
 * HTTP Resume
 * HTTP Ranges
 * Follow HTTP Redirects
 * HTTP Post
 * HTTP Post Chunked
 * HTTP Put
 * Cookie support
 * HTTP 1.1
 * HTTP 2 (plain text upgrade)
 * HTTP 2 (TLS ALPN)
 * HTTP 2 (TLS NPN)
 * HTTP persistent connections
 * HTTPS
 * HTTP Digest Auth
 * HTTP NTLM Auth
 * HTTP Negotiate Auth
 * HTTP Multipart Post
 * HTTP Deflate/gzip compression
 * FTP resume
 * FTP ranges
 * FTP active mode
 * FTP SSL
 * FTP upload
 * FTP Kerberos
 * FTP Connection re-use
 * GOPHER
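Several of these features combine well on a single command line. A hedged sketch (host and file name are placeholders) that retries a flaky download, caps the bandwidth, and resumes a partial transfer:

```shell
# Retry up to 5 times with a 2-second pause between attempts,
# limit the transfer to 100 KB/s, and resume (-C -) from wherever
# a previous partial download left off. example.com is a placeholder.
curl --retry 5 --retry-delay 2 --limit-rate 100K -C - -O https://example.com/big.iso
```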

= One-Liners =

curl -o website.html https://domain.com
curl -o archive.zip https://domain.com/file.zip
curl https://domain.com > website.html
 * Save the output of the URL to a file

curl -O https://domain.com/file.zip
curl -O https://domain.com/file.zip -O https://domain.com/file2.zip
 * Save with the same name as the remote file

curl -u user sftp://server.domain.com/path/to/file
curl -u username:password https://domain.com
 * Download files with a username and password (SFTP over SSH, or HTTPS)

curl -I http://domain.com
 * Get HTTP header information

curl ftp://ftp.domain.com --user username:password
 * Access an FTP server

curl ftp://ftp.domain.com/file.zip --user username:password
curl -u ftpuser:password -O ftp://ftp_pub/public_html/index.html
 * Download a file via FTP

curl -T file.zip ftp://ftp.domain.com/ --user username:password
curl -u ftpuser:password -T linuxtechi.txt ftp://ftp_pub/public_html/
 * Upload a file to the FTP server

curl -u ftpuser:password -T "{linuxtechi1.txt,linuxtechi2.txt}" ftp://ftp_pub/public_html/
 * Upload multiple files to an FTP server (note the brace-glob syntax for -T)

curl ftp://ftp_pub/public_html -X 'DELE linuxtechi.zip' --user ftpuser:password
 * Delete a file from an FTP server

curl -L http://domain.com
 * Follow URL redirects

curl -v http://domain.com
 * Debug level details

curl -x proxy.server.com:3128 https://domain.com
 * Using proxy to download a file

curl --limit-rate 1K -O https://domain.com/file.zip
 * Limit data transfer rate

curl -z 1-Jan-17 https://domain.com
 * Download file modified after a given date

curl -z -1-Jan-17 https://domain.com
 * Download file modified before a given date
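-z also accepts a local file name instead of a date; curl then uses that file's modification time as the cut-off, which is handy for refreshing a cached copy. A hedged sketch (host and file names are placeholders):

```shell
# Re-download page.html only if the remote copy is newer than the
# local file's modification time. example.com is a placeholder host.
curl -z page.html -o page.html https://example.com/page.html
```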

curl -C - -O https://domain.com/file.zip
 * Resume a download (-C - lets curl work out the offset itself)

curl --cacert new-ca.crt https://domain.com
 * Verify the server certificate against a specific CA certificate

curl -k https://domain.com
 * Ignore SSL certificate warnings (insecure)

curl -i -X OPTIONS http://10.107.88.68:8082
 * Getting information about supported methods

= Scripts =

while true; do
  curl -s -w 'Testing Response Time for: %{url_effective}\n\nLookup Time:\t\t%{time_namelookup}\nConnect Time:\t\t%{time_connect}\nPre-transfer Time:\t%{time_pretransfer}\nStart-transfer Time:\t%{time_starttransfer}\n\nTotal Time:\t\t%{time_total}\n' -o /dev/null https://google.com
  sleep 10
done
 * Test response times every 10 seconds

= URL Syntax =

http://site.{one,two,three}.com
 * You can specify multiple URLs or parts of URLs by writing part sets within braces, as above.

ftp://ftp.numericals.com/file[1-100].txt
ftp://ftp.numericals.com/file[001-100].txt
ftp://ftp.letters.com/file[a-z].txt
 * You can get sequences of alphanumeric series by using [] as above.

http://any.org/archive[1996-1999]/vol[1-4]/part{a,b,c}.html
 * Nested sequences are not supported, but you can use several ones next to each other:

http://www.numericals.com/file[1-100:10].txt http://www.letters.com/file[a-z:2].txt
 * You can specify a step counter for the ranges to get every Nth number or letter:
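Globbing pairs naturally with the output option: each bracket or brace match is available as #1, #2, ... inside the -o template, so every URL in a series gets its own local file name. A hedged sketch (the host is a placeholder):

```shell
# #1 expands to each matched value of the [1-5] range in turn, so the
# downloads land in saved_1.txt through saved_5.txt.
# www.numericals.com is a placeholder host.
curl -s "http://www.numericals.com/file[1-5].txt" -o "saved_#1.txt"
```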


 * If you specify a URL without a protocol:// prefix, curl will attempt to guess which protocol you want.
 * It will then default to HTTP but try other protocols based on often-used host name prefixes.
 * For example, for host names starting with "ftp." curl will assume you want to speak FTP.


 * Curl will attempt to re-use connections for multiple file transfers, so that getting many files from the same server will not do multiple connects / handshakes.
 * This improves speed.
 * Of course this is only done on files specified on a single command line and cannot be used between separate curl invocations.
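The reuse is visible with the -w write-out variable %{num_connects}, which reports how many new connections each transfer actually opened. A hedged sketch (example.com paths are placeholders):

```shell
# Two files from the same host in one invocation: the second transfer
# should reuse the first connection, so its num_connects stays at 0.
curl -s -o a.html -o b.html -w '%{url_effective}: %{num_connects} new connection(s)\n' \
     https://example.com/a.html https://example.com/b.html
```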

= Switches =

= Header Modifications =

Basic syntax for spoofing the user agent:

curl -A "UserAgentString" https://aman.info.tm

Basic syntax for user agent spoofing along with other headers:

curl -A [user-agent] -H [headers] "https://aman.info.tm"

Two methods to spoof the User-Agent:

curl -L -A "Mozilla/5.0" https://aman.info.tm
curl -L -H "user-agent: Mozilla/5.0" https://aman.info.tm

One of the most common cases of differing HTML and CSS source is websites with stripped-down mobile versions; you could retrieve the iPhone-specific source with:

curl -A "Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_3_3 like Mac OS X; en-us) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8J2 Safari/6533.18.5" https://aman.info.tm

Some sites do this with other browsers too. This would be Chrome 12 on Mac OS X 10.6.8:

curl -A "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_8) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.112 Safari/534.30" https://aman.info.tm

Other examples:

curl -L -H "Host: aman.info.tm" -H "Cache-Control: max-age=0" \
  -H "Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8" \
  -H "User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.89 Safari/537.36" \
  -H "HTTPS: 1" -H "DNT: 1" -H "Referer: https://www.google.com/" \
  -H "Accept-Language: en-US,en;q=0.8,en-GB;q=0.6,es;q=0.4" \
  -H "If-Modified-Since: Thu, 23 Jul 2015 20:31:28 GMT" --compressed https://aman.info.tm

curl -L -H "User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.89 Safari/537.36" https://aman.info.tm
