0x01 Web Logs
Web access logs record raw information about the requests a Web server receives and the errors that occur at runtime. Security analysis of Web logs not only helps us locate the attacker, but also helps us reconstruct the attack path and find and fix the security vulnerabilities in the website.
Let’s look at an Apache access log:
127.0.0.1 - - [11/Jun/2018:12:47:22 +0800] "GET /login.html HTTP/1.1" 200 786 "-" "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/66.0.3359.139 Safari/537.36"
From this single access log entry, we can clearly see which IP a user came from, when they visited, what operating system and browser they were using, which page of the website they requested, and whether the request succeeded.
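When a single field is needed, it can be pulled out with awk; a minimal sketch (the field numbers assume the default Apache combined log format shown above, and access.log is a placeholder file name):

# $1 = client IP, $4 = timestamp, $7 = requested URL, $9 = HTTP status code
awk '{print $1, $4, $7, $9}' access.log | head -n 5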
This article introduces the general approach and some common techniques of Web log security analysis.
0x02 Log Analysis Techniques
Security analysis of Web logs can start from either of two directions, and both can be used to reconstruct the entire attack process step by step.

The first: determine the time range of the intrusion and use it as the clue; search for suspicious log entries within that window, investigate them further, and finally identify the attacker and reconstruct the attack process.

The second: after compromising a website, an attacker usually leaves a backdoor to maintain access and make it easy to come back. We can find that file and use it as the starting clue for the analysis, as sketched below.
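As a minimal sketch of both starting points (the date, the time window, and the backdoor file name shell.php are placeholders for illustration, not values from a real case):

# Approach 1: carve out everything logged inside the suspected intrusion window
grep '11/Jun/2018:12:' access.log > suspect_window.log

# Approach 2: start from a known backdoor file and list who requested it, and when
grep 'shell.php' access.log | awk '{print $1, $4, $7}' | sort | uniq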
Common analysis tools:
On Windows, EmEditor is recommended for log analysis; it handles very large files and its search performance is decent.

On Linux, shell commands are used to query and analyze the logs.

Shell-based log analysis generally combines grep, awk, and other commands; below are several common log analysis and statistics techniques.
Apache Log Analysis techniques:
1. List the IPs with the most visits:
cut -d- -f 1 log_file | sort | uniq -c | sort -rn | head -20

2. Count how many distinct IPs visited that day:
awk '{print $1}' log_file | sort | uniq | wc -l

3. Count how many times a specific page was accessed:
grep "/index.php" log_file | wc -l

4. Count how many pages each IP accessed:
awk '{++S[$1]} END {for (a in S) print a,S[a]}' log_file

5. Sort the per-IP page counts from low to high:
awk '{++S[$1]} END {for (a in S) print S[a],a}' log_file | sort -n

6. See which pages a specific IP accessed:
grep ^111.111.111.111 log_file | awk '{print $1,$7}'

7. Exclude search engines and count the day's visiting IPs:
awk '{print $12,$1}' log_file | grep ^\"Mozilla | awk '{print $2}' | sort | uniq | wc -l

8. Count how many IPs visited during the 14:00 hour on 21 June 2018:
awk '{print $4,$1}' log_file | grep 21/Jun/2018:14 | awk '{print $2}' | sort | uniq | wc -l
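These one-liners can also be bundled into a small helper script and run against any log file in one go; a rough convenience sketch (the script name apache_log_stats.sh and its argument handling are made up for illustration):

#!/bin/bash
# apache_log_stats.sh - run a few of the statistics above against one access log
log_file="$1"

echo "== Top 20 client IPs =="
awk '{print $1}' "$log_file" | sort | uniq -c | sort -rn | head -20

echo "== Distinct client IPs =="
awk '{print $1}' "$log_file" | sort -u | wc -l

echo "== Pages accessed per IP (top 20) =="
awk '{++S[$1]} END {for (a in S) print S[a], a}' "$log_file" | sort -rn | head -20

It would be invoked as, for example, ./apache_log_stats.sh /var/log/httpd/access_log.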
0x03 Log Analysis Case
Web log analysis example: on an intranet server reached through an Nginx reverse proxy, multiple image Trojans were uploaded to a directory of one of its websites. Although image Trojans cannot be parsed (executed) under IIS 7, we still want to find out who uploaded them and by what path.

Here we run into a problem: because proxy forwarding is configured, the access logs record only the proxy server's IP, not the visitor's real IP. How, then, do we distinguish different visitors and identify the attack source?

This is really a log misconfiguration on the administrator's side, but the good news is that we can use the browser fingerprint (the User-Agent string) to distinguish different access sources and reconstruct the attack path.
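As a rough sketch of how to enumerate the distinct fingerprints hiding behind the single proxy IP (assuming an Apache/Nginx combined-format log named access.log, where the User-Agent is the sixth quote-delimited field; both are assumptions, not details from this case):

# count requests per User-Agent string; each distinct string is one "browser fingerprint"
awk -F '"' '{print $6}' access.log | sort | uniq -c | sort -rn | head -20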
1. Locate the attack source
First, search for the records that accessed the image Trojan; only one was found. Because all of the access logs record only the proxy IP, the attack path cannot be reconstructed from the IP alone, so the browser fingerprint is used to locate the attacker instead.
Browser fingerprint:
Mozilla/4.0+(compatible;+MSIE+7.0;+Windows+NT+6.1;+WOW64;+Trident/7.0;+SLCC2;+.NET+CLR+2.0.50727;+.NET+CLR+3.5.30729;+.NET+CLR+3.0.30729;+.NET4.0C;+.NET4.0E)
2. Search for related log records
By filtering the log records associated with the browser’s fingerprint, you can clearly see the attacker’s attack path.
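A minimal sketch of this filtering step; access.log and attacker_trace.log are placeholder names, and grep -F treats the fingerprint as a fixed string so the '+' signs are not interpreted as regex metacharacters:

# pull every log entry made with this exact fingerprint; -F keeps the '+' signs literal
grep -F 'Mozilla/4.0+(compatible;+MSIE+7.0;+Windows+NT+6.1;+WOW64' access.log > attacker_trace.log
wc -l attacker_trace.log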
3. Interpreting the log entries found, the attacker's approximate access path is as follows:
b. The attacker accessed …aspx and msgsebd.aspx
c. The attacker accessed xzuser.aspx
d. The attacker sent multiple POST requests to this page (suspected upload through a flaw in its upload module)
e. The attacker accessed the uploaded image Trojan
Opening the website and visiting xzuser.aspx confirms that the attacker uploaded the image Trojan through the upload function on this page. At the same time, an unauthorized access vulnerability was discovered on the site: by visiting a specific URL, the attacker could enter the admin backend without logging in. Log analysis thus located the vulnerabilities so they could be fixed.
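The same filtering can help scope the unauthorized-access vulnerability, for example by listing every fingerprint that requested the backend page. A sketch only; /admin/default.aspx stands in for the real backend URL, which is not named here, and the quote-delimited User-Agent field again assumes a combined-format log:

# which fingerprints reached the backend page, and how often?
grep '/admin/default.aspx' access.log | awk -F '"' '{print $6}' | sort | uniq -c | sort -rn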
0x04 Log Statistical Analysis Techniques
Crawler statistics:
grep -E 'Googlebot|Baiduspider' /www/logs/access.2019-02-23.log | awk '{ print $1 }' | sort | uniq
Browser statistics:
cat /www/logs/access.2019-02-23.log | grep -v -E 'MSIE|Firefox|Chrome|Opera|Safari|Gecko|Maxthon' | sort | uniq -c | sort -r -n | head -n 100
IP statistics:
grep '23/May/2019' /www/logs/access.2019-02-23.log | awk '{print $1}' | awk -F'.' '{print $1"."$2"."$3"."$4}' | sort | uniq -c | sort -r -n | head -n 10
2206 219.136.134.13
1497 182.34.15.248
1431 211.140.143.100
1431 119.145.149.106
1427 61.183.15.179
1427 218.6.8.189
1422 124.232.150.171
1421 106.187.47.224
1420 61.160.220.252
1418 114.80.201.18
Network segment:
cat /www/logs/access.2019-02-23.log | awk '{print $1}' | awk -F'.' '{print $1"."$2"."$3".0"}' | sort | uniq -c | sort -r -n | head -n 200
Domain statistics:
cat /www/logs/access.2019-02-23.log |awk '{print $2}'|sort|uniq -c|sort -rn|more
HTTP Status:
cat /www/logs/access.2019-02-23.log |awk '{print $9}'|sort|uniq -c|sort -rn|more
5056585 304
1125579 200
7602 400
5 301
URL statistics:
cat /www/logs/access.2019-02-23.log |awk '{print $7}'|sort|uniq -c|sort -rn|more
File traffic statistics:
cat /www/logs/access.2019-02-23.log |awk '{sum[$7]+=$10}END{for(i in sum){print sum[i],i}}'|sort -rn|more
grep ' 200 ' /www/logs/access.2019-02-23.log |awk '{sum[$7]+=$10}END{for(i in sum){print sum[i],i}}'|sort -rn|more
URL access statistics (URLs with query parameters):
cat /www/logs/access.2019-02-23.log | awk '{print $7}' | egrep '\?|&' | sort | uniq -c | sort -rn | more
Script running speed:
Find the slowest scripts:
# assumes the request processing time is appended as the last field of each log entry
grep -v 0$ /www/logs/access.2019-02-23.log | awk -F '\" ' '{print $4" "$1}' | awk '{print $1" "$8}' | sort -n -k 1 -r | uniq > /tmp/slow_url.txt
IP, URL extraction:
# tail -f /www/logs/access.2019-02-23.log | grep '/test.html' | awk '{print $1" "$7}'
MSSQL log analysis will be shared tomorrow. Remember to follow and give it a thumbs up.