Static analysis report on an instructor-assigned training target. Git repository with at least 3 commits. Report: 5-8 pages. Due: Week 14.
Overview
The capstone is a structured firmware analysis report on an instructor-assigned training target: a small consumer IoT device or legacy router whose firmware is publicly and legally available. You perform static analysis on one binary extracted from the firmware image, document your findings, and compare your findings to any prior public research on the device.
You are NOT exploiting the device, NOT running code on physical hardware, and NOT testing against any live infrastructure. You are analyzing bytes on your workstation.
Deliverable: A Git repository hosted on GitHub or GitLab with at least 3 commits. The repository contains your report and any supporting artifacts (screenshots, annotated disassembly listings, the extracted binary if the license permits redistribution).
Time: Weeks 13-14, approximately 8-10 hours total.
Approved target list
The instructor will assign a target from this list, or approve an alternative you propose:
| Target | Firmware source | Architecture | Why it is appropriate |
|---|---|---|---|
| TP-Link TL-WR841N (any version) | tp-link.com/support | MIPS32 | Common training target; prior research widely available; squashfs filesystem |
| Netgear R6700v3 | netgear.com/support | ARM | Multiple prior CVEs; httpd binary is well-studied at RE-101 level |
| D-Link DIR-615 (any version) | dlink.com/support | MIPS32 | Classic router; well-documented firmware format |
| ASUS RT-N56U | asus.com/support | MIPS32 | Broad prior research; multiple filesystem variants |
| Custom course-provided training image | (provided by instructor) | x86-64 | Designed for RE-011; guaranteed appropriate complexity |
Self-proposed alternatives must have: legally-available firmware, at least one public CVE or prior security finding, and a suitable (non-kernel-level) binary for analysis.
The custom course-provided training image is the recommended target for self-paced students who want a controlled scope.
Repository structure
```
re011-capstone-[yourname]/
├── README.md      # brief description (2-3 sentences)
├── report.md      # the main analysis report (see below)
├── figures/
│   └── *.png      # Ghidra screenshots, annotated disassembly
└── artifacts/
    └── (optional) # extracted binary, strings output, etc.
                   # do NOT include full firmware images (may be large)
```
Git requirement: At minimum 3 commits showing work over time. Acceptable commit structure:
- Initial commit: scoping document (from Week 13) + README
- In-progress commit: partial report + first figures
- Final commit: complete report
A single commit with everything is a flag for rushed work; it does not satisfy the 3-commit requirement.
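The three-commit structure above can be sketched as a command sequence. This is illustrative only: the repository name, commit messages, and file contents are placeholders, and the local `git config` lines are just so the commits succeed on a fresh machine.

```shell
# Illustrative commit history matching the structure above.
git init re011-capstone-yourname && cd re011-capstone-yourname
git config user.email "student@example.com"   # placeholder identity
git config user.name  "Student"

echo "RE-011 capstone: static analysis of the assigned target." > README.md
git add README.md
git commit -m "Initial commit: scoping document and README"

echo "# Report (draft)" > report.md
git add report.md
git commit -m "In progress: partial report, first figures"

echo "# Report (final)" > report.md
git add report.md
git commit -m "Final: complete report"

git rev-list --count HEAD   # prints 3 -- the minimum commit count
```

`git rev-list --count HEAD` is also a quick self-check before submission.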
Report structure and requirements
Section 1: Target overview (1 page)
- Device name, model, and firmware version analyzed
- Firmware download source (URL or description)
- SHA-256 hash of the firmware image (run `sha256sum firmware.bin`)
- `binwalk` scan output (quoted or summarized: what components were found)
- Filesystem type and what was extracted
- Brief `find` listing of the extracted filesystem root (not exhaustive -- show the key directories)
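The Section 1 checklist maps onto a short command sequence. A sketch, assuming the image is saved as `firmware.bin`; the stand-in file created here is only so the hashing step runs as written -- in the lab, hash the real vendor download, and uncomment the `binwalk` lines (binwalk must be installed separately).

```shell
# Stand-in file so the hashing step is runnable as-is; in the lab,
# firmware.bin is the image downloaded from the vendor support site.
printf 'not-a-real-image' > firmware.bin
sha256sum firmware.bin       # record the full 64-hex-digit hash in Section 1

# The remaining steps require binwalk (distro package or pip):
# binwalk firmware.bin       # component scan -- quote or summarize this output
# binwalk -e firmware.bin    # extract; squashfs typically lands in _firmware.bin.extracted/
# find squashfs-root -maxdepth 2 -type d   # key directories for the brief listing
```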
Section 2: Target binary (1 page)
- Name and path of the binary you chose to analyze (e.g., `/usr/sbin/httpd`, `/bin/busybox`, `/usr/bin/app_daemon`)
- Why you chose it (what made it interesting)
- Architecture identified (from `file` and `readelf -h`)
- Stripped or unstripped (from `readelf -s` and `nm`)
- Import table summary: which library functions does it call? (Quote the most interesting 10-15 from `nm -D`)
- Ghidra import: what processor configuration did you select? Did the auto-analyzer identify functions correctly?
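The triage steps in this checklist can be sketched as follows. `/bin/sh` is used as a stand-in so the commands run anywhere; in the lab, point `BIN` at your extracted binary (for example `squashfs-root/usr/sbin/httpd`, if that is the path in your extraction).

```shell
# Stand-in target so the commands are runnable; replace with your binary.
BIN=/bin/sh
file "$BIN"                # architecture, endianness, static vs dynamic linkage
readelf -h "$BIN"          # ELF header: machine type drives the Ghidra processor choice
readelf -s "$BIN" | head   # .symtab entries; little or no output suggests stripped
nm -D "$BIN" | head -n 15  # dynamic imports: pick the most interesting 10-15 to quote
```

For a MIPS router binary, `readelf -h` reporting `MIPS R3000` big-endian corresponds to Ghidra's `MIPS:BE:32` processor selection mentioned in Section 6.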
Section 3: Findings (2-3 pages)
This is the graded core. Document every salient finding with specific evidence.
Each finding entry:
Finding: [Brief title]
Type: [hardcoded credential / dangerous function / outdated library / debug interface / exposed endpoint / other]
Location: [binary name, function name or address, Ghidra screenshot filename]
Evidence: [The string, address, function call, or version string that you observed]
Security implication: [Why this matters; what an attacker could do with this information]
Minimum: two findings. Strong reports have three to five. A finding without specific evidence (function address, string content, library version) earns no credit.
Evidence types that earn credit:
- Hardcoded string literal (show the string and how you found it)
- Direct call to `strcpy`, `gets`, or `sprintf` with external input as an argument (show the call site address and what the argument sources are)
- Library version string (show the string, the version number, and a CVE or advisory for that version)
- Debug service binary present in the filesystem (show the path and what it does)
- Default credential in a configuration file extracted from the filesystem
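Several of these evidence types start from simple string hunting. A sketch, again using `/bin/sh` as a stand-in so the commands run as written; in the lab, run them against your extracted binary and filesystem root (the `squashfs-root` paths below are assumptions about your extraction layout).

```shell
# String hunting over a stand-in binary; substitute your extracted binary.
BIN=/bin/sh
strings -n 8 "$BIN" | head                # longer printable strings: paths, URLs, formats
strings "$BIN" | grep -iE 'passw|admin|debug|telnet' || true   # credential-ish hits

# Library version strings for CVE lookups, e.g.:
# strings squashfs-root/lib/libssl.so* | grep -i 'openssl'
# Default credentials in extracted configuration files, e.g.:
# grep -ri 'passw' squashfs-root/etc/ 2>/dev/null
```

A grep hit is a lead, not a finding: trace the string to a specific function or config path in Ghidra before writing it up.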
Section 4: Comparison to prior work (1 page)
Identify at least one prior public finding on this device (a CVE, a security advisory, a conference talk, a blog post). Describe:
- What was found in the prior work (brief summary, not a long quote)
- Whether your analysis independently corroborates it, contradicts it, or addresses a different part of the attack surface
- What you found that the prior work did not address (or acknowledge that the prior work was more thorough and explain why)
This section tests whether you can situate your own analysis in the broader research context -- a skill required in all professional RE work.
Section 5: What would be next (1 page)
Write what you would do next if this were a real engagement. Include:
- What dynamic analysis would add (which function would you debug, what hypothesis would you test)
- What hardware access would enable (if you had UART access to the physical device, what would you do first)
- What the most plausible exploitation path would be, if any finding points toward one (at the concept level -- RE-011 does not require exploit development)
- What RE-101 would cover that RE-011 does not
Section 6: Tooling narrative (0.5 page)
Which tools did you use, in which order, and what did each contribute? Write this as a brief practitioner reflection, not a list. Example format: "I started with binwalk to understand the firmware structure, then extracted the squashfs filesystem. After surveying the filesystem with find and strings, I chose httpd as the target binary because its import table included functions commonly associated with web form processing. I imported it into Ghidra using the MIPS:BE:32 processor..."
Grading rubric
Tier 1: Pass/fail gate
Before scoring:
- The firmware source is legal and the source URL or description is provided.
- The binary analyzed is from the assigned target (not a different binary).
- The technical mechanism described in Section 3 is accurate.
- No paragraph-level copying from public writeups.
Reports that fail Tier 1 are returned for revision. Students who pass Tier 1 proceed to Tier 2.
Tier 2: Scored rubric
| Dimension | Weight | Strong | Weak |
|---|---|---|---|
| Technical depth (Section 3) | 40% | Findings cite specific addresses, strings, or function calls; security implication is reasoned, not generic | Findings say "the binary handles web requests" without specific evidence; implications are generic ("could be vulnerable") |
| Analytical clarity (all sections) | 30% | Non-specialist reader could follow the analysis; tools explained on first mention; findings described in concrete terms | Assumes reader knows MIPS calling conventions; findings are not connected to evidence; jargon unexplained |
| Research situatedness (Section 4) | 30% | Prior work accurately cited and compared; independent assessment of how your analysis relates | Prior work summarized without comparison to your own findings; Section 4 is a bibliography, not a comparison |
Submission
Submit your Git repository URL to the instructor by the end of Week 14. Confirm:
- The repository is publicly accessible (or the instructor has been added as a collaborator).
- The repository has at least 3 commits.
- `report.md` is present and complete.
- All figures referenced in the report are present in `figures/`.
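The checklist above can be sketched as a pre-submission sanity script. A scratch repository is built here so the script runs end-to-end; in practice, run only the check lines from your real repository root (the figure filename is a placeholder).

```shell
# Scratch repo so the checks below are runnable end-to-end; in practice,
# run the check lines from your actual repository root.
mkdir -p demo-repo/figures && cd demo-repo
git init -q .
git config user.email "student@example.com" && git config user.name "Student"
printf 'See figures/ghidra_main.png for the call site.\n' > report.md
touch figures/ghidra_main.png
git add -A && git commit -qm "initial"
git commit -qm "in progress" --allow-empty
git commit -qm "final" --allow-empty

# --- the actual pre-submission checks ---
test -f report.md                                 # report present
test "$(git rev-list --count HEAD)" -ge 3         # at least 3 commits
for f in $(grep -oE 'figures/[A-Za-z0-9._-]+' report.md | sort -u); do
    test -f "$f" || { echo "missing: $f"; exit 1; }   # every referenced figure exists
done
echo "all submission checks passed"
```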
Lab 9 of 9. Final deliverable for RE-011. The firmware analysis skill developed here is directly prerequisite to RE-101's SB6141 lab target work.