My Commands

__NOINDEX__

= Networking =

Quick traceroute:
 traceroute -n -w 1 -m 10 4.2.2.2

Save packet captures:
 tcpdump -s 0 -i eth0 host 10.1.1.1 -v -w /tmp/packet_capture.cap

Netstat: list applications along with open port numbers:
 netstat -anp | grep 8080
 netstat -an | grep 8080

Netstat: list listening programs and their open ports:
 netstat -lntp

Sample output:
 Active Internet connections (only servers)
 Proto Recv-Q Send-Q Local Address           Foreign Address         State       PID/Program name
 tcp        0      0 0.0.0.0:111             0.0.0.0:*               LISTEN      800/rpcbind
 tcp        0      0 0.0.0.0:8080            0.0.0.0:*               LISTEN      1522/nginx

Show listening services and the applications behind them:
 sudo netstat -tulpn
 netstat -lnt

Ping a range (-W 1 is a one-second reply timeout on Linux ping):
 for i in {131..140}; do ping -c 1 -W 1 10.52.1.$i; done

= List Files =

Sort list by time, most recent last:
 ls -lhtra

Sort list by size, largest last:
 ls -lhSra

Do not sort; list entries in directory order:
 ls -U
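A quick way to see the size-sort flags in action, using throwaway files in a scratch directory (file names invented for the demo):

```shell
cd "$(mktemp -d)"                       # scratch directory for the demo
printf 'xxxxxxxxxxxxxxxxxxxx\n' > large.txt
printf 'x\n' > small.txt
ls -S                                   # -S sorts largest first: large.txt, small.txt
ls -Sr                                  # -r reverses the sort: small.txt, large.txt
```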

= Bash =

Execute a command in another directory and return to the original one (the parentheses run it in a subshell):
 (cd /etc && ls -a)
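The parentheses matter: the cd happens inside a subshell, so the parent shell's working directory never changes. A minimal check in a scratch directory:

```shell
cd "$(mktemp -d)" && mkdir sub          # scratch directory with a subdirectory
start="$PWD"
( cd sub && pwd )                       # the cd only affects the subshell
[ "$PWD" = "$start" ] && echo "still in original dir"
```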

Copy/move all files from subdirectories into the current directory (the trailing dot is the destination):
 cp ./*/* .
 mv ./*/* .

Loop commands:
 for i in `find . -type f`; do echo $i; cat $i; done | grep terminate
 while true; do cmd1; cmd2; sleep 2; done
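The backtick form breaks on file names containing spaces; a sketch of a whitespace-safer variant using a while read loop (sample files invented for the demo):

```shell
cd "$(mktemp -d)"
printf 'all good\n' > a.txt
printf 'please terminate now\n' > 'b file.txt'   # name with a space
# word-splitting in `for i in $(find ...)` would mangle 'b file.txt';
# reading one line at a time keeps each path intact
find . -type f | while read -r i; do echo "$i"; cat "$i"; done | grep terminate
```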

Redirect standard error to /dev/null:
 find / -name 'geeqierc.xml' 2>/dev/null

Flush (truncate) logs without deleting them:
 for i in *; do > $i; done
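`> file` truncates in place, so the files keep their inodes and permissions and any process still writing to them is unaffected. A demo with throwaway files:

```shell
cd "$(mktemp -d)"
echo "old entries" > app.log
echo "more entries" > web.log
for i in *; do > "$i"; done             # truncate every file to zero bytes
wc -c *.log                             # both files still exist, now 0 bytes each
```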

Quickly back up a file:
 cp some_file.py{,.orig}
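The `{,.orig}` is plain brace expansion: the shell rewrites it into the two arguments `some_file.py some_file.py.orig` before cp ever runs. Demo in a scratch directory:

```shell
cd "$(mktemp -d)"
echo 'print("hi")' > some_file.py
cp some_file.py{,.orig}                 # expands to: cp some_file.py some_file.py.orig
cmp some_file.py some_file.py.orig && echo "backup is identical"
```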

= Grep =

Filter comments and blank lines from a config file:
 grep -vE '^#|^;|^$' server.conf
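The pattern drops lines starting with `#`, lines starting with `;`, and empty lines. Tried against a made-up server.conf:

```shell
cd "$(mktemp -d)"
cat > server.conf <<'EOF'
# main settings
; legacy comment style

port=8080
host=0.0.0.0
EOF
grep -vE '^#|^;|^$' server.conf         # only the two key=value lines survive
```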

Filter on multiple strings:
 pstree | grep -e docker -e ssh

= Archives =

Extract a "*.gz" file:
 gunzip FILE_NAME.gz

Extract a "*.tar.gz" file:
 tar zxf FILE_NAME.tar.gz

Extract a "*.tar.bz2" file:
 tar jxf FILE_NAME.tar.bz2

Extract multiple archives into subdirectories named after each archive:
 for i in `find $(pwd) -type f -name '*.gz'`; do echo $i; j=$(echo $i | cut -d '.' -f1); echo $j; mkdir $j; tar xvzf $i -C $j; done

Extract files from similarly named directories:
 for i in `find . -name 'tech_node*'`; do cd $i; sudo tar xvzf node.tar.gz; cd ..; done

Create a tar file from a directory:
 tar -zcvf /tmp/log.tar.gz /opt/avi/log/*

Test an archive without extracting it:
 tar tvf logs.tar.gz
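The create, list, and extract commands above fit together as a round trip; a sketch with a throwaway logs directory:

```shell
cd "$(mktemp -d)"
mkdir -p logs && echo "boot ok" > logs/app.log
tar -zcf logs.tar.gz logs                   # create (c) a gzipped (z) archive
tar -tzf logs.tar.gz                        # list (t) contents without extracting
mkdir out && tar -xzf logs.tar.gz -C out    # extract (x) into another directory
cat out/logs/app.log                        # prints: boot ok
```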

= Generate 100 HTTP Requests =

 sudo apt-get install parallel
 seq 100 | parallel --no-notice -j0 --joblog log curl -s http://10.107.88.91/welcome.png/{} ">" {}.txt
 cut -f 4 log
 seq 100 | parallel --no-notice -j0 --joblog log curl http://10.107.88.91/welcome.png/{} ">" {}.txt

= Finding Old Logs =

* Find, then delete, files older than 30 days:

 find /tmp/report_ftw -type f -mtime +30
 find /tmp/report_ftw -type f -mtime +30 -name "messages.*" -exec rm -f {} \;

* List, then delete, gz files older than 30 days:

 find /var/log -type f -mtime +30 -name "*.gz" -exec ls {} \;
 find /var/log -type f -mtime +30 -name "*.gz" -exec rm -f {} \;
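`-mtime +30` matches files whose modification time is more than 30 days in the past; GNU touch's -d flag can fake that for a safe dry run in a scratch directory:

```shell
cd "$(mktemp -d)"
touch -d '40 days ago' old.gz               # backdated: matched by -mtime +30
touch new.gz                                # fresh: not matched
find . -type f -mtime +30 -name '*.gz'                     # lists only ./old.gz
find . -type f -mtime +30 -name '*.gz' -exec rm -f {} \;   # deletes only old.gz
ls
```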

= Searching Multiple Text Files =

Print each file name, then its contents with noisy lines filtered out:
 for i in log*; do echo $i; grep -Evi "error|not|warning|false" $i; done

= Replace a Keyword in All Files at Once =

 find ./ -name \*.tf -exec sed -i "s/cybernetnews/cybernet/g" {} \;
 find ./ -type f -readable -writable -exec sed -i "s/cybernetnews/cybernet/g" {} \;
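A dry run of the find + sed combination on made-up .tf files, to confirm the in-place replace touches every match in every file:

```shell
cd "$(mktemp -d)"
echo 'site  = "cybernetnews"' > main.tf
echo 'owner = "cybernetnews"' > vars.tf
find ./ -name \*.tf -exec sed -i "s/cybernetnews/cybernet/g" {} \;
grep -h cybernet *.tf                   # both files now say cybernet
```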

= Cisco =

 R1(config-router)#do sh run | section ospf
 R1(config-router)#do sh run | s ospf
 R1(config-router)#do sh run | include ospf
 R1(config-router)#do sh run | i ospf

= Top Command =

Interactive keys:
* E : cycle through memory units (KB, MB, GB)
* 1 : CPU details for each core
* m : memory graph
* c : complete command path
* k : kill a process
* M : sort by memory usage
* P : sort by CPU usage
* R : sort results in ascending order

 top -o %CPU
 top -o %MEM
 top -b -n 1 > top.txt

= Text Editor =

Nano search:
 Ctrl + W

vi:
 Ctrl + b => one page back
 Ctrl + f => one page forward

= File Sharing =

Check Samba shares:
 sudo apt install smbclient
 smbclient //10.140.196.7/share -U aman

= HTTP Proxy through SSL Tunnel =

ssh -L 127.0.0.1:19443:10.52.201.10:443 aman@10.52.1.138

 Access        = https://127.0.0.1:19443
 Jump Server   = aman@10.52.1.138
 Remote Server = 10.52.201.10:443

= Curl =

Test site reliability:
 for i in {1..999}; do echo -n $i ' '; curl http://google.com -s -w %{http_code} -o /dev/null -m 1; echo ""; sleep 1; done
 for i in {01..999}; do echo -n $i HTTP Code:' '; /usr/bin/time -qf " Real:%e User:%U Sys:%S" curl http://google.com -s -w %{http_code} -o /dev/null -m 1; sleep 1; done

Switches:
 curl -I                           ==> response headers only (HEAD)
 curl -v                           ==> request & response headers
 curl -k                           ==> no certificate validation
 curl -H "user-agent: Mozilla/5.0" ==> custom header
 curl -L                           ==> follow URL redirects
 curl -X                           ==> custom request method (GET, POST, PUT, DELETE); defaults to GET; use with -d for data
 curl -d or curl -F                ==> POST
 curl -T                           ==> PUT

See just the request & response headers:
 curl -vs google.com 2>&1 > /dev/null | sed '/^* /d; /bytes data]$/d; s/> //; s/< //'

Do not use cache (servers or proxies in the middle can ignore this):
 curl -H 'Cache-Control: no-cache' http://www.example.com

Output specific lines from multiline output:
 curl -skL https://aman.info.tm | awk '/Articles/ && NR>=178 && NR<=180'
 curl -skL https://aman.info.tm | awk 'NR>=178 && NR<=180' | grep Articles
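The same NR-windowing works on any stream; seq stands in for the curl output here so the line arithmetic is easy to check:

```shell
seq 100 105 | awk 'NR>=2 && NR<=4'          # lines 2-4 of the stream: 101 102 103
seq 100 105 | awk '/02/ && NR>=2 && NR<=4'  # same window, only lines matching /02/: 102
```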