Log Analysis on Web Servers for Incident Response

Log analysis is critical for identifying malicious activities such as SQL Injection, Directory Traversal, and other web-based attacks. By focusing on access and error logs, incident responders can detect and mitigate these threats effectively.


Key Log Files and Their Locations

| Web Server  | Log Type         | Log Location                        |
|-------------|------------------|-------------------------------------|
| Apache      | Access Logs      | /var/log/apache2/access.log         |
| Apache      | Error Logs       | /var/log/apache2/error.log          |
| Nginx       | Access Logs      | /var/log/nginx/access.log           |
| Nginx       | Error Logs       | /var/log/nginx/error.log            |
| IIS         | Access Logs      | C:\inetpub\logs\LogFiles\W3SVC1\    |
| System Logs | General Activity | /var/log/syslog, /var/log/messages  |


Case 1: SQL Injection Analysis (Apache)

SQL Injection occurs when an attacker inserts malicious SQL into input fields or URL parameters in order to manipulate the queries the application sends to its database.

Steps to Analyze SQL Injection Logs

  1. View Access Logs:

    cat /var/log/apache2/access.log
  2. Search for Common SQL Injection Payloads: Filter requests containing SQL keywords (case-insensitively, since payloads often mix upper and lower case):

    cat access.log | grep -iE "%27|--|union|select|from|or|@|version|char|varchar|exec"
  3. Filter by HTTP 200 Responses: A bare grep "200" also matches dates and response sizes, so match the status field and combine it with the injection pattern to focus on successful attempts:

    cat access.log | grep -iE "union|select" | grep " 200 "
  4. URL Decode Suspicious Entries: Decode encoded URLs to reveal clear SQL queries:

    echo "/cat.php?id=1%20UNION%20SELECT%20..." | python3 -c "import urllib.parse; print(urllib.parse.unquote(input()))"
  5. Check Admin Panel Access: Once the attacking IP is identified (192.168.2.232 in this example), verify whether it reached sensitive areas such as the admin panel:

    cat access.log | grep 192.168.2.232 | grep admin/index.php

Example Decoded URL:

/cat.php?id=1 UNION SELECT 1,concat(login,':',password),3,4 FROM users;

This query attempts to extract user credentials (the login and password columns) from the users table.
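
Steps 2–4 can be combined into a single pass. The sketch below scans the access log, URL-decodes each request, and prints entries that both contain common SQL keywords and returned HTTP 200. It is a minimal illustration, assuming the default combined log format; the log path and keyword list are examples and should be tuned to your environment.

    #!/usr/bin/env python3
    """Sketch: flag possible SQL injection attempts in an Apache access log."""
    import re
    from urllib.parse import unquote

    LOG_PATH = "/var/log/apache2/access.log"   # adjust to your server
    SQLI = re.compile(r"union|select|concat|%27|--|char\(|exec", re.IGNORECASE)
    # Combined log format: IP ident user [time] "METHOD URL PROTO" status size ...
    LINE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+)[^"]*" (\d{3})')

    with open(LOG_PATH, errors="replace") as log:
        for entry in log:
            m = LINE.match(entry)
            if not m:
                continue
            ip, when, method, url, status = m.groups()
            decoded = unquote(url)             # reveal %20, %27, and similar encodings
            if status == "200" and SQLI.search(decoded):
                print(f"{ip} [{when}] {method} {decoded} -> {status}")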


Case 2: Directory Traversal Attack (Nginx)

Directory Traversal involves escaping the web root directory by injecting path sequences such as ../ into file parameters.

Steps to Detect Directory Traversal

  1. Filter for Directory Traversal Attempts: Escape the dots (a bare grep "../" treats . as a regex wildcard and matches almost everything) and include URL-encoded variants:

    cat /var/log/nginx/access.log | grep -iE '\.\./|%2e%2e'
  2. Check for Sensitive File Access: Look for attempts to access files like /etc/passwd:

    cat access.log | grep "/etc/passwd"
  3. Inspect Successful Attempts: Focus on traversal requests that returned HTTP 200 by matching the status field:

    cat access.log | grep '\.\./' | grep " 200 "

Observation:

If you notice /etc/passwd being downloaded, for example via an attacker request such as:

wget http://example.com/../../../../../etc/passwd

This indicates a successful traversal exploit.
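
The greps above can miss URL-encoded traversal sequences. The sketch below decodes each request before checking for ../ or direct /etc/passwd access, and marks which attempts returned HTTP 200. It is a minimal illustration, assuming the default Nginx combined log format; the path and patterns are examples.

    #!/usr/bin/env python3
    """Sketch: flag directory traversal attempts (plain or URL-encoded) in an Nginx access log."""
    import re
    from urllib.parse import unquote

    LOG_PATH = "/var/log/nginx/access.log"     # adjust to your server
    LINE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+)[^"]*" (\d{3})')

    with open(LOG_PATH, errors="replace") as log:
        for entry in log:
            m = LINE.match(entry)
            if not m:
                continue
            ip, when, method, url, status = m.groups()
            decoded = unquote(url)             # turns %2e%2e%2f back into ../
            if "../" in decoded or "/etc/passwd" in decoded:
                label = "SUCCESS" if status == "200" else "attempt"
                print(f"{label}: {ip} [{when}] {method} {decoded} -> {status}")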


Case 3: Log Analysis on IIS Web Servers

IIS logs provide rich details about HTTP requests, including timestamps, client IPs, and request details.

Sample Log Entry:

2024-11-12 12:34:56 192.168.1.1 GET /index.html 200

| Field         | Description                       |
|---------------|-----------------------------------|
| Timestamp     | Event time                        |
| Client IP     | IP address of the request origin  |
| HTTP Method   | Type of request (GET, POST)       |
| URL Requested | Resource path                     |
| HTTP Status   | Server response (200, 404, 500)   |

Steps to Detect Suspicious Activities:

  1. Search for SQL Injection (the /I switch makes findstr case-insensitive):

    findstr /I /C:"union select" C:\inetpub\logs\LogFiles\W3SVC1\*
  2. Detect Directory Traversal:

    findstr /C:"../" C:\inetpub\logs\LogFiles\W3SVC1\*
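
For a field-aware search, the sketch below reads the #Fields: directive that IIS writes at the top of each W3C log and then inspects the decoded URL stem and query string of every request. It is a minimal illustration, assuming the default W3C columns (c-ip, cs-uri-stem, cs-uri-query, sc-status) are enabled; the path and patterns are examples.

    #!/usr/bin/env python3
    """Sketch: flag SQL injection and traversal patterns in IIS W3C logs by named field."""
    import glob
    import re
    from urllib.parse import unquote

    LOG_GLOB = r"C:\inetpub\logs\LogFiles\W3SVC1\*.log"   # adjust to your site ID
    SUSPICIOUS = re.compile(r"union\s+select|\.\./|%2e%2e", re.IGNORECASE)

    for path in glob.glob(LOG_GLOB):
        fields = []
        with open(path, errors="replace") as log:
            for entry in log:
                if entry.startswith("#Fields:"):
                    fields = entry.split()[1:]         # column names follow the directive
                    continue
                if entry.startswith("#") or not fields:
                    continue
                row = dict(zip(fields, entry.split()))
                target = unquote(row.get("cs-uri-stem", "") + "?" + row.get("cs-uri-query", ""))
                if SUSPICIOUS.search(target):
                    print(f"{path}: {row.get('c-ip', '?')} {target} -> {row.get('sc-status', '?')}")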

Additional Tools for Enhanced Analysis

  1. Wireshark:

    • Inspect network traffic and POST request payloads.

  2. ModSecurity (mod_security) or mod_log_forensic:

    • Enhance Apache logging by capturing detailed request and response data.

  3. Splunk / ELK Stack:

    • Centralize logs for real-time correlation and visualization.


Key Log Fields for Analysis

Access Logs:

Example:

192.168.1.1 - - [12/Nov/2024:12:34:56 +0000] "GET /index.html HTTP/1.1" 200 1024
  • IP Address: 192.168.1.1

  • Timestamp: [12/Nov/2024:12:34:56 +0000]

  • Request Method: GET

  • URL: /index.html

  • Status Code: 200 (Success)

  • Response Size: 1024 bytes
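
Once the field layout is clear, quick summaries help spot outliers before digging into individual requests. The sketch below counts requests per client IP and per status code with a naive whitespace split; it assumes the combined format shown above, and the log path is illustrative.

    #!/usr/bin/env python3
    """Sketch: summarize an access log by client IP and HTTP status code."""
    from collections import Counter

    LOG_PATH = "/var/log/apache2/access.log"   # illustrative path
    ips, statuses = Counter(), Counter()

    with open(LOG_PATH, errors="replace") as log:
        for entry in log:
            parts = entry.split()              # naive split; assumes no spaces inside the request path
            if len(parts) < 10:
                continue
            ips[parts[0]] += 1                 # field 1: client IP
            statuses[parts[8]] += 1            # field 9: HTTP status code

    print("Top client IPs:", ips.most_common(5))
    print("Status codes:", statuses.most_common())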

Error Logs:

Example:

[Wed Nov 12 12:34:56 2024] [error] [client 192.168.1.1] File does not exist: /var/www/html/favicon.ico
  • Error Level: [error]

  • Client IP: 192.168.1.1

  • Message: File not found error.
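
A similar pass over the error log shows which clients and error levels dominate. The sketch below matches the legacy format shown above; note that Apache 2.4 adds module and process-ID fields ([core:error] [pid ...] [client IP:port]), so the pattern may need adjusting on newer servers.

    #!/usr/bin/env python3
    """Sketch: count Apache error-log entries per level and per client IP."""
    import re
    from collections import Counter

    LOG_PATH = "/var/log/apache2/error.log"    # illustrative path
    # Legacy format: [timestamp] [level] [client IP] message
    ENTRY = re.compile(r"^\[[^\]]+\] \[(?P<level>[^\]]+)\] \[client (?P<ip>[^\]\s]+)\]")

    levels, clients = Counter(), Counter()
    with open(LOG_PATH, errors="replace") as log:
        for line in log:
            m = ENTRY.match(line)
            if m:
                levels[m.group("level")] += 1
                clients[m.group("ip")] += 1

    print("Entries by level:", levels.most_common())
    print("Noisiest client IPs:", clients.most_common(5))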


Best Practices for Log Analysis

  1. Centralize Logs: Use tools like the ELK Stack (Elasticsearch, Logstash, Kibana) or Graylog to correlate and visualize log data across multiple systems.

  2. Automate Routine Checks: Create reusable shell scripts for recurring analysis tasks:

    # Example: monitor failed SSH logins (prints date, time, and source IP;
    # field positions assume the standard "Failed password for USER from IP" format)
    grep "Failed password" /var/log/auth.log | awk '{print $1, $2, $3, $11}'
  3. Set Alerts for Critical Events (see the sketch after this list):

    • Repeated failed login attempts.

    • High-frequency 404/500 errors.

    • Unauthorized access patterns.

  4. Log Retention Policies: Ensure logs are stored securely and retained for an adequate period to support long-term investigations.
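
As a starting point for the alerting in item 3 above, the sketch below flags client IPs with an unusual volume of 404/500 responses. It assumes the combined log format, and the path and threshold are illustrative; in practice this logic would run from cron or inside your SIEM.

    #!/usr/bin/env python3
    """Sketch: flag client IPs generating bursts of 404/500 responses."""
    from collections import Counter

    LOG_PATH = "/var/log/apache2/access.log"   # illustrative path
    THRESHOLD = 50                             # tune to your normal traffic volume
    errors = Counter()

    with open(LOG_PATH, errors="replace") as log:
        for entry in log:
            parts = entry.split()
            if len(parts) >= 9 and parts[8] in ("404", "500"):
                errors[parts[0]] += 1

    for ip, count in errors.most_common():
        if count >= THRESHOLD:
            print(f"ALERT: {ip} generated {count} 404/500 responses")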


Key Points

Web log analysis is a cornerstone of effective incident response. By focusing on patterns related to SQL Injection, Directory Traversal, and other web exploits, analysts can quickly detect malicious activities and take appropriate action. Mastering these techniques and leveraging automation ensures a proactive defense against evolving threats.
