My Commands
Networking
Quick Traceroute
traceroute -n -w 1 -m 10 4.2.2.2
Save Packet Captures
tcpdump -s 0 -i eth0 host 10.1.1.1 -v -w /tmp/packet_capture.cap
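Read a saved capture back without name resolution (same path as above):
tcpdump -nn -r /tmp/packet_capture.cap | head -20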
Docker Packet Captures
docker exec -it 428947239426349 tcpdump -N -A 'port 80' -w capture.pcap
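Copy the capture out of the container (assuming tcpdump wrote capture.pcap to the container's working directory):
docker cp 428947239426349:capture.pcap ./capture.pcap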
Netstat: list applications along with open port numbers
netstat -anp | grep 8080
netstat -an | grep 8080
Netstat: list programs and their open ports
netstat -lntp
Active Internet connections (only servers)
Proto Recv-Q Send-Q Local Address   Foreign Address  State   PID/Program name
tcp   0      0      0.0.0.0:111     0.0.0.0:*        LISTEN  800/rpcbind
tcp   0      0      0.0.0.0:8080    0.0.0.0:*        LISTEN  1522/nginx
Show listening services and the applications behind them
sudo netstat -tulpn
netstat -lnt
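On newer distros that ship without net-tools, ss accepts the same flags:
ss -lntp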
Ping a Range:
for i in {131..140}; do ping -c 1 -W 1 10.52.1.$i; done   # -W 1 = 1s timeout on Linux; use -t 1 on macOS/BSD
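If nmap is installed, a ping sweep covers the same range in one command:
nmap -sn 10.52.1.131-140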
Check Public IP by CLI:
curl ifconfig.io
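Alternatives if ifconfig.io is unreachable:
curl ifconfig.me
dig +short myip.opendns.com @resolver1.opendns.com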
List Files
Sort List by Time
ls -lhtra
Sort List by Size
ls -lhSra
Do not sort; list entries in directory order
ls -U
Bash
Execute a command in another dir and return to the original dir
(cd /etc && ls -a)
Copy/move all files from subdirectories into the current dir
cp ./*/* .
mv ./*/* .
Loop Commands
for i in `find . -type f`; do echo $i; cat $i; done | grep terminate
while true; do cmd1; cmd2; sleep 2; done   # cmd1/cmd2 are placeholders for your own commands
Redirect standard error to /dev/null:
find / -name 'geeqierc.xml' 2>/dev/null
Flush logs without deleting them
for i in *; do >$i; done
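Truncate a single file to zero bytes (coreutils; path is just an example):
truncate -s 0 /var/log/example.log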
Quickly back up a file:
cp some_file.py{,.orig}
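The braces expand to "some_file.py some_file.py.orig"; restore the same way:
cp some_file.py{.orig,}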
Grep
Filter comments from a config file
grep -vE '^#|^;|^$' server.conf
Filter multiple strings
pstree | grep -e docker -e ssh
Archives
Extract "*.gz" file
gunzip FILE_NAME.gz
Extract "*.tar.gz" file
tar zxf FILE_NAME.tar.gz
Extract "*.tar.bz2" file
tar jxf FILE_NAME.tar.bz2
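Modern GNU tar detects the compression automatically on extraction:
tar xf FILE_NAME.tar.gz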
Extract multiple archives into subdirectories:
for i in `find $(pwd) -type f -name '*.gz'`; do echo $i; j=$(echo $i | cut -d '.' -f1); echo $j; mkdir $j; tar xvzf $i -C $j; done
Extract files from similarly named directories:
for i in `find . -name 'tech_node*'`; do cd $i; sudo tar xvzf node.tar.gz; cd ..; done
Creating a Tar file from a directory:
tar -zcvf /tmp/log.tar.gz /opt/avi/log/*
Testing Archives without extracting:
tar tvf logs.tar.gz
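Test a gzip archive's integrity without extracting:
gunzip -t FILE_NAME.gz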
Generate 100 HTTP requests
sudo apt-get install parallel
seq 100 | parallel --no-notice -j0 --joblog log curl -s http://10.107.88.91/welcome.png/{} ">" {}.txt
cut -f 4 log   # column 4 of the joblog is each job's runtime
seq 100 | parallel --no-notice -j0 --joblog log curl http://10.107.88.91/welcome.png/{} ">" {}.txt
Finding Old Logs
- Find and delete files more than 30 days old:
find /tmp/report_ftw -type f -mtime +30
find /tmp/report_ftw -type f -mtime +30 -name "messages.*" -exec rm -f {} \;
- List and delete gz files older than 30 days:
find /var/log -type f -mtime +30 -name "*.gz" -exec ls {} \;
find /var/log -type f -mtime +30 -name "*.gz" -exec rm -f {} \;
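GNU find can also delete matches itself (put -delete after the tests):
find /var/log -type f -mtime +30 -name "*.gz" -delete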
Searching Multiple text files
for i in log*; do echo $i ; cat $i | egrep -vi "error|not|warning|false" ; done
Replace a keyword in all files at once
find ./ -name \*.tf -exec sed -i "s/cybernetnews/cybernet/g" {} \;
find ./ -type f -readable -writable -exec sed -i "s/cybernetnews/cybernet/g" {} \;
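Preview which files would change before running sed:
grep -rl 'cybernetnews' .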
Cisco
R1(config-router)#do sh run | section ospf
R1(config-router)#do sh run | s ospf
R1(config-router)#do sh run | include ospf
R1(config-router)#do sh run | i ospf
Top Command
top
Interactive keys:
E => cycle through memory units (KB, MB, GB)
1 => CPU details for each core
m => memory graph
c => complete path
k => kill <pid>
M => sort by memory usage
P => sort by CPU usage
R => results in ascending order
top -o %CPU
top -o %MEM
top -b -n 1 > top.txt
CPU Limit
sudo apt-get install cpulimit
ps aux | grep matho-prime   # find the PID of the process
sudo cpulimit -b -l 50 -p 16299   # 16299 is the PID & 50 is the CPU %
Text Editor
Nano Search
Ctrl + W
vi
Ctrl + b => one page back
Ctrl + f => one page forward
dd => cut line
<n>dd => cut n lines
yy => copy line
<n>yy => copy n lines
p => paste
File Sharing
Check Samba Shares
sudo apt install smbclient
smbclient //10.140.196.7/share -U aman
HTTP Proxy through SSH Tunnel
ssh -L 127.0.0.1:19443:10.52.201.10:443 aman@10.52.1.138
Access = https://127.0.0.1:19443
Jump Server = aman@10.52.1.138
Remote Server = 10.52.201.10:443
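Test the tunnel (-k because the remote certificate will not match 127.0.0.1):
curl -k https://127.0.0.1:19443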
Curl
- Test Site Reliability:
for i in {1..999}; do echo -n $i ' '; curl http://google.com -s -w %{http_code} -o /dev/null -m 1; echo ""; sleep 1; done
for i in {01..999}; do echo -n $i HTTP Code:' '; /usr/bin/time -qf " Real:%e User:%U Sys:%S" curl http://google.com -s -w %{http_code} -o /dev/null -m 1; sleep 1; done
while true; do curl http://google.com -s -w %{http_code} -o /dev/null -m 1; echo ""; sleep 1; done
- Testing Response Times:
while true; do curl -s -w 'Testing Response Time for :%{url_effective}\n\nLookup Time:\t\t%{time_namelookup}\nConnectTime:\t\t%{time_connect}\nAppconnect:\t\t%{time_appconnect}\nPre-transfer Time:\t%{time_pretransfer}\nStart-transfer Time:\t%{time_starttransfer}\n\nTotal Time:\t\t%{time_total}\n' -o /dev/null https://google.com ; sleep 10; done
- Testing Time taken & Response Code:
for i in {1..999}; do echo -n $i ; curl -skL -w ' http code: %{http_code}\tTotal Time: %{time_total}\n' -o /dev/null https://google.com ; sleep 1; done
- Switches:
curl -I ==> Response Headers only (HEAD)
curl -v ==> Request & Response Headers
curl -k ==> No certificate validation
curl -H "user-agent: Mozilla/5.0" ==> Custom header
curl -L ==> Handle URL redirects
curl -X <method> ==> Custom request method (DELETE, POST, PUT, GET); otherwise defaults to GET; use with -d data
curl -d or curl -F ==> POST
curl -T ==> PUT
- See just Request & Response Headers:
curl -vs google.com 2>&1 >/dev/null | sed '/^* /d; /bytes data]$/d; s/> //; s/< //'
- Do not use cache (servers or proxies in the middle can ignore this):
curl -H 'Cache-Control: no-cache' http://www.example.com
- Output specific lines from multiline output:
curl -skL https://aman.info.tm | awk '/Articles/ && NR>=178 && NR<=180'
curl -skL https://aman.info.tm | awk 'NR>=178 && NR<=180' | grep Articles
Rsync
rsync -avz --progress --partial /home/user/Downloads/ pi@192.168.1.35:/media/temp
rsync -avzP /home/user/Downloads/ pi@192.168.1.35:/media/temp
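Dry run first to preview what would transfer (-n = --dry-run):
rsync -avzn /home/user/Downloads/ pi@192.168.1.35:/media/temp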
Dig
Show just Errors:
while true; do var=$(dig @10.1.1.83 example.com); if [[ $var != *"NOERROR"* ]]; then echo $var; fi; done
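For just the answer records:
dig +short @10.1.1.83 example.com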