SEC-101 Lab 8: CVE-Record-Walk Worksheet

Week: 11
Graded: Yes
Time estimate: 60-75 minutes
Tools: Browser only; nvd.nist.gov; cve.mitre.org; first.org/cvss/calculator/3.1; cisa.gov/known-exploited-vulnerabilities-catalog


Learning objective

Given a CVE record, identify the vulnerable component, affected versions, CVSS v3.1 base-score breakdown, and proof-of-concept availability. Cross-reference NVD, MITRE, and one independent advisory. Score the vulnerability manually using CVSS v3.1 and compare your result to NVD's score. (Bloom's L4: Analyze -- read a CVE record correctly and decompose the CVSS score into its constituent decisions.)


Setup

No software installation. You need a browser and access to the following sites:

  • nvd.nist.gov (National Vulnerability Database)
  • cve.mitre.org (MITRE CVE records)
  • first.org/cvss/calculator/3.1 (CVSS v3.1 calculator)
  • cisa.gov/known-exploited-vulnerabilities-catalog (CISA KEV)

Step-by-step instructions

Step 1: Choose your CVE (5 min)

Go to https://nvd.nist.gov/vuln/search and search for a CVE published within the past 24 months. Pick one that:

  • Has a CVSS v3.1 base score of 7.0 or higher (to make the scoring exercise meaningful).
  • Affects a product or technology you recognize (a programming language runtime, a web server, a browser, a cloud service, an operating system).
  • Has at least two independent write-ups beyond the NVD/MITRE records.

Record the CVE ID (e.g., CVE-2024-NNNN) and the product name.

If you are unsure which CVE to pick, look at the CISA KEV catalog (https://www.cisa.gov/known-exploited-vulnerabilities-catalog) and filter by date. KEV entries are, by definition, known to be exploited in the wild, and they tend to have extensive public documentation.
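As a quick sanity check on the ID you record, a pattern match can catch transcription errors. A minimal sketch; the 4-or-more-digit sequence number follows the CVE ID syntax in use since 2014, and CVE-2021-44228 (Log4Shell) is used here only as a known-valid example:

```python
import re

# CVE IDs have the form CVE-YYYY-NNNN..., where the sequence
# part is four or more digits.
CVE_ID = re.compile(r"^CVE-\d{4}-\d{4,}$")

print(bool(CVE_ID.match("CVE-2021-44228")))  # True  (Log4Shell)
print(bool(CVE_ID.match("CVE-24-1")))        # False (malformed year and sequence)
```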

Step 2: The MITRE CVE record (10 min)

Go to https://cve.mitre.org and search for your CVE. Record:

  • The official description (the one-paragraph summary).
  • The CNA (CVE Numbering Authority) that assigned the CVE. Is it the vendor, a research organization, or MITRE itself?
  • The list of references. How many references are listed? Are they vendor advisories, security researcher blogs, or news articles?

Step 3: The NVD entry (15 min)

Go to https://nvd.nist.gov/vuln/detail/[YOUR-CVE-ID]. Record:

  • The CVSS v3.1 base score and the vector string (a compact notation like CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H).
  • The CWE classification (what type of weakness does NVD attribute this to?).
  • The CPE entries: which specific product versions are affected?
  • The reference list: note any references that are technical write-ups rather than vendor advisories.
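The vector string is machine-readable, and pulling it apart programmatically is a useful way to double-check the fields you record. A minimal Python sketch; the example vector is the generic one shown above, not tied to any particular CVE:

```python
def parse_cvss_vector(vector: str) -> dict:
    """Split a CVSS v3.1 vector string into a {metric: value} dict."""
    prefix, _, metrics = vector.partition("/")
    if prefix != "CVSS:3.1":
        raise ValueError(f"unexpected prefix: {prefix}")
    return dict(pair.split(":") for pair in metrics.split("/"))

metrics = parse_cvss_vector("CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H")
print(metrics["AV"])  # N -> network attack vector
print(metrics["S"])   # U -> scope unchanged
```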

Step 4: The vendor advisory (10 min)

Find the vendor's security advisory for your CVE. It may be linked from the NVD references, or you may need to search the vendor's security bulletin page directly.

Record:

  • The advisory date.
  • The affected versions listed by the vendor.
  • The patch or workaround the vendor recommends.
  • Any credit the vendor gives to the reporting researcher.

Step 5: One independent technical write-up (10 min)

Find one independent write-up: a security researcher's blog post, a Project Zero analysis, a conference paper, a CERT advisory, or a technical article from a security vendor (not the product vendor). The write-up must provide technical detail beyond the CVE description.

Record:

  • The author and publication.
  • What technical detail the write-up provides that the CVE record does not (the mechanism, the proof-of-concept, the exploitation conditions, the remediation approach).

Step 6: Score the CVSS v3.1 yourself (15 min)

Go to https://www.first.org/cvss/calculator/3.1. Work through the eight base metrics for your CVE. For each metric, write your chosen value and a one-sentence justification.

Metric                   | Your value | Justification
Attack Vector (AV)       |            |
Attack Complexity (AC)   |            |
Privileges Required (PR) |            |
User Interaction (UI)    |            |
Scope (S)                |            |
Confidentiality (C)      |            |
Integrity (I)            |            |
Availability (A)         |            |

Record the resulting base score. Compare to NVD's assigned score. If they differ, write a brief note explaining where your assessment differs from NVD's and why.
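If you want to check your arithmetic, the base-score equations are published in the FIRST CVSS v3.1 specification. A sketch for the scope-unchanged (S:U) case only; the weights and the Roundup rule are taken from the specification, and a changed scope (S:C) would need different PR weights, a different impact formula, and a 1.08 multiplier:

```python
# Metric weights from the FIRST CVSS v3.1 specification (scope unchanged).
WEIGHTS = {
    "AV": {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.20},
    "AC": {"L": 0.77, "H": 0.44},
    "PR": {"N": 0.85, "L": 0.62, "H": 0.27},  # PR weights differ when scope is changed
    "UI": {"N": 0.85, "R": 0.62},
    "CIA": {"H": 0.56, "L": 0.22, "N": 0.0},  # shared by C, I, and A
}

def roundup(x: float) -> float:
    """Spec-defined Roundup: smallest one-decimal value >= x, with
    integer arithmetic to avoid floating-point edge cases."""
    i = round(x * 100000)
    return i / 100000 if i % 10000 == 0 else (i // 10000 + 1) / 10

def base_score(av, ac, pr, ui, c, i, a):
    """CVSS v3.1 base score for a scope-unchanged vector."""
    iss = 1 - (1 - WEIGHTS["CIA"][c]) * (1 - WEIGHTS["CIA"][i]) * (1 - WEIGHTS["CIA"][a])
    impact = 6.42 * iss
    exploitability = (8.22 * WEIGHTS["AV"][av] * WEIGHTS["AC"][ac]
                      * WEIGHTS["PR"][pr] * WEIGHTS["UI"][ui])
    if impact <= 0:
        return 0.0
    return roundup(min(impact + exploitability, 10))

# The example vector from Step 3 (AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H):
print(base_score("N", "L", "N", "N", "H", "H", "H"))  # 9.8
```

Working a vector through by hand and then confirming against this (or the first.org calculator) makes the comparison in this step concrete.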


Deliverable

A completed worksheet covering all six steps. Use the structured formats provided (the table in Step 6, the record fields in Steps 2-5), plus your own words where questions are open-ended.

Required elements:

  • CVE ID and product name.
  • MITRE record: description, CNA, reference count.
  • NVD entry: CVSS vector string, CWE, affected versions.
  • Vendor advisory: date, versions, patch recommendation.
  • Independent write-up: author, publication, what technical detail it adds.
  • CVSS scoring table with justifications.
  • Score comparison: your score vs. NVD's score, with explanation if they differ.

Total length: 400-600 words outside of tables and record fields.


Grading rubric

Criterion                                                              | Points | Notes
CVE selection: severity >= 7.0, recognized product, within 24 months   | 10     | Out-of-range CVEs accepted if justified; points not deducted
MITRE record: all three fields recorded                                | 15     | CNA identification is specifically checked
NVD entry: vector string, CWE, affected versions recorded              | 20     | Vector string must be complete, not just the score
Vendor advisory: all three fields; source identified                   | 15     | "I couldn't find an advisory" must explain where you looked
Independent write-up: technical addition documented                    | 15     | "It summarizes the CVE" does not score; must identify a specific technical addition
CVSS scoring table: all eight metrics with one-sentence justification  | 20     | Missing justifications: 2 points deducted per metric
Score comparison: difference noted and explained (if any)              | 5      | "Same as NVD" earns full credit if true; an unexplained difference does not
Total                                                                  | 100    |

picoCTF connection

Lab 8 builds the research discipline for the capstone. The picoCTF Binary Exploitation challenges you've been working through rely on mechanisms that appear in real CVEs: buffer overflow (CWE-121/122), format string vulnerability (CWE-134), integer overflow (CWE-190). After completing this lab, pick a picoCTF binary challenge you found confusing and search NVD for real-world CVEs with the same CWE. Reading how the real vulnerability was disclosed (the CVE record, the vendor response, the researcher's write-up) deepens the technical model behind the CTF challenge.


Lab 8 of 9. Next: Lab 9 (Capstone -- historical CVE explainer report, Weeks 13-14).