TryHackMe Advent of Cyber 2024, Day 17: Investigating with Splunk

Every December, TryHackMe's Advent of Cyber delivers 24 free daily cybersecurity challenges, offering hands-on scenarios that simulate real-world attacks and defenses. Designed for beginners and professionals alike, it's an exciting, gamified way to explore topics like threat hunting, penetration testing, cryptography, and more. This event is perfect for building skills, gaining practical experience, and spreading some cybersecurity cheer during the festive season!


Learning Objectives

  1. Learn how to extract custom fields in Splunk.
  2. Learn to create a parser for custom logs.
  3. Filter and narrow down the search results using Search Processing Language (SPL).

Tools Overview

  • Splunk SIEM: A powerful tool for log aggregation and real-time analytics. It helps identify and investigate security incidents by providing centralized visibility into system and application logs.
  • Search Processing Language (SPL): Splunk's query language designed for filtering, parsing, and analyzing log data. It allows advanced searches and data visualization.
  • Custom Log Parsing: The process of extracting relevant fields from raw logs to create structured data for better analysis and insights.

Task Walkthrough

Overview

In today's task, we used Splunk to investigate an incident involving unauthorized access and deletion of CCTV recordings. The investigation focused on parsing custom logs, identifying malicious activities, and correlating data from multiple sources to pinpoint the attacker.

Steps

Task 1: Accessing Splunk

  1. Connect to the Splunk SIEM: Navigate to the URL provided in the connection card.
  2. Search Logs: Use the query index=* to display all logs. Ensure the time range is set to "All time."
  3. Identify Data Sources: Verify the datasets available by checking the sourcetype field. This task included:
    • web_logs: Logs of web server connections.
    • cctv_feed: Logs related to CCTV access.
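
A quick way to enumerate the available data sources before drilling down is a sketch like the following, which lists every index/sourcetype pair with its event count:

```spl
index=* | stats count by index, sourcetype
```

Confirming both data sources exist up front saves time later, since every subsequent query targets one of them explicitly.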

Task 2: Extracting Custom Fields

  1. Select a Sample Log: Choose a log from cctv_feed as a baseline.
  2. Use Regular Expressions: Highlight fields (e.g., Timestamp, Event, User_id, UserName, and Session_id) to extract relevant data.
  3. Validate the Regex: Ensure the pattern works across all logs.
  4. Save and Apply: Save the extraction rules for consistent parsing.
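
An extraction can also be tested ad hoc with Splunk's rex command before saving it; a sketch using the pattern refined in Task 3 (field names match those highlighted above):

```spl
index=cctv_feed
| rex field=_raw "^(?P<timestamp>\d+\-\d+\-\d+\s+\d+:\d+:\d+)\s+(?P<Event>(Login\s\w+|\w+))\s+(?P<user_id>\d+)?\s?(?P<UserName>\w+)\s+.*?(?P<Session_id>\w+)$"
| table timestamp Event user_id UserName Session_id
```

Running rex inline like this shows immediately which events parse and which come back with empty fields, without committing anything to the saved extraction.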

Task 3: Fixing Parsing Issues

  1. Identify Parsing Errors: Some logs failed to parse due to format variations.
  2. Update Regex: Use the expression ^(?P<timestamp>\d+\-\d+\-\d+\s+\d+:\d+:\d+)\s+(?P<Event>(Login\s\w+|\w+))\s+(?P<user_id>\d+)?\s?(?P<UserName>\w+)\s+.*?(?P<Session_id>\w+)$ to handle inconsistencies.
  3. Reapply Parsing: Validate that all logs are now correctly parsed.
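
Before reapplying the updated pattern in Splunk, it can help to sanity-check it offline. Splunk field extractions use PCRE, which shares the (?P<name>...) named-group syntax with Python's re module. A minimal sketch; the two sample log lines are hypothetical stand-ins for the two format variations described above (one event with a user ID, one without):

```python
import re

# The extraction pattern from Task 3, unchanged.
PATTERN = re.compile(
    r"^(?P<timestamp>\d+\-\d+\-\d+\s+\d+:\d+:\d+)\s+"
    r"(?P<Event>(Login\s\w+|\w+))\s+"
    r"(?P<user_id>\d+)?\s?(?P<UserName>\w+)\s+"
    r".*?(?P<Session_id>\w+)$"
)

# Hypothetical sample lines covering both formats seen in the task:
# a login event carrying a numeric user ID, and a deletion event without one.
samples = [
    "2024-12-17 10:23:01 Login successful 1001 alice via web rij5uu4gt204q0d3eb7jj86okt",
    "2024-12-17 11:05:12 Delete mmalware recording rij5uu4gt204q0d3eb7jj86okt",
]

for line in samples:
    m = PATTERN.match(line)
    assert m is not None, f"failed to parse: {line}"
    print(m.group("Event"), m.group("UserName"), m.group("Session_id"))
```

If a real sample fails the assertion, the pattern needs another revision before it is saved back into Splunk.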

Task 4: Investigating Logs

  1. Count Events by User: Use index=cctv_feed | stats count(Event) by UserName to analyze activity.
  2. Identify Rare Events: Use index=cctv_feed | rare Event to find anomalies like deletion events.
  3. Correlate Session IDs: Trace activities associated with specific Session IDs to uncover the attack timeline.
  4. Filter Suspicious Logs: Narrow down results using queries such as:
    • index=cctv_feed *failed* | table _time UserName Event Session_id
    • index=cctv_feed *Delete*
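
Step 3 can be expressed as a query as well; a sketch, with <SESSION_ID> as a placeholder for the session identifier recovered from the deletion event:

```spl
index=cctv_feed Session_id="<SESSION_ID>"
| table _time UserName Event Session_id
| sort _time
```

Sorting by _time turns the session's events into a ready-made attack timeline.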

Task 5: Correlating Web Logs

  1. Link Session IDs: Cross-reference Session IDs from cctv_feed with web_logs.
  2. Identify Attacker's IP: Use the query index=web_logs clientip="<IP_ADDRESS>" to trace activities.
  3. Confirm Attacker's Actions: Examine all associated events to establish a timeline.
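
The cross-referencing in steps 1 and 2 can be sketched as two queries run separately: the first free-text-searches web_logs for the session identifier found in this task to surface the attacker's client IP, and the second pivots on that IP (no web_logs field names beyond clientip are assumed here):

```spl
index=web_logs "rij5uu4gt204q0d3eb7jj86okt"

index=web_logs clientip="<IP_ADDRESS>" | table _time clientip _raw | sort _time
```
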

Questions and Solutions

  1. How many logs were captured associated with the successful login?
    • Answer: 642
  2. What is the Session_id associated with the attacker who deleted the recording?
    • Answer: rij5uu4gt204q0d3eb7jj86okt
  3. What is the name of the attacker found in the logs, who deleted the CCTV footage?
    • Answer: mmalware

Recap of Learning Objectives

1. Learn how to extract custom fields in Splunk

Custom field extraction transforms unstructured logs into structured data, enabling targeted queries and insightful analysis. This task demonstrated using regex to parse fields such as timestamps and user IDs, ensuring that critical data was accessible for investigation.

2. Learn to create a parser for custom logs

By crafting and refining regex patterns, we ensured accurate parsing across all logs. This step emphasized the importance of tailoring parsers to handle varied log formats for consistent data interpretation.

3. Filter and narrow down the search results using SPL

SPL's versatility allowed efficient filtering and correlation of data from different sources. Techniques like rare and stats provided insights into rare events and activity counts, aiding in identifying malicious behaviors.
