Before I founded AeoliTech, I spent years at Los Alamos National Laboratory helping systems achieve Authority to Operate under NIST 800-53 High. That is the most rigorous federal security baseline there is. When you carry a system through ATO at a national laboratory, you learn very quickly what separates a defensible evidence package from a pile of screenshots that an assessor will tear apart in the first hour. Those lessons translate directly to CMMC Level 2, and they are the reason our clients pass their C3PAO assessments on the first attempt.
What I Learned at LANL
At Los Alamos, the assessment process was unforgiving. The assessors were not consultants trying to help you pass. They were security professionals whose job was to find every weakness in your implementation. If your evidence was ambiguous, they interpreted it against you. If your documentation had gaps, they assumed the worst. If your staff could not explain how a control worked during an interview, the control was marked as not implemented regardless of what your SSP said.
NIST 800-53 High has over 400 controls. CMMC Level 2 has 110. The scope is smaller, but the assessment rigor under CMMC is converging toward what I experienced at LANL. C3PAO assessors are trained to use the same Examine, Interview, and Test methodology. They are looking for the same things: evidence that controls are not just documented but implemented, operating, and effective.
The Difference Between Checkbox and Defensible
A checkbox evidence package says: "We have a policy." A defensible evidence package says: "We have a policy, here is when it was approved, here is who approved it, here is evidence that staff were trained on it, here is a log showing the policy is enforced technically, and here is a recent audit showing the enforcement is working."
Checkbox vs. Defensible Evidence
Checkbox Approach
- ❌ Generic policy template with company name swapped in
- ❌ Single screenshot of a configuration page
- ❌ "We use MFA" with no proof of enforcement
- ❌ Audit logs exist but no evidence of review
- ❌ Training slides with no attendance records
- ❌ Network diagram from two years ago
Defensible Approach
- ✅ Policy with approval signatures, revision history, annual review dates
- ✅ Configuration export with timestamp, system name, and baseline comparison
- ✅ Conditional access policy export showing MFA enforcement + sign-in logs showing MFA challenges
- ✅ Audit log samples + documented review process + findings from recent review
- ✅ Training completion records with dates, names, quiz scores
- ✅ Current network diagram matching actual scan results
Evidence Organization: The LANL Method
At LANL, we organized evidence by control family, then by individual control, then by assessment objective. Every piece of evidence was tagged with metadata: the date it was collected, the system it came from, the person who collected it, and which specific assessment objective it satisfied. This sounds like overkill until you are sitting across from an assessor who asks for evidence of AC-2(3) and you can produce it in under 30 seconds.
```
evidence-package/
  AC - Access Control/
    AC.L2-3.1.1 - Authorized Access Control/
      EXAMINE/
        AC.L2-3.1.1_policy_v3.2_approved_2026-03-15.pdf
        AC.L2-3.1.1_azure-ad-conditional-access-export_2026-04-01.json
        AC.L2-3.1.1_access-control-list_fileserver01_2026-04-01.csv
      INTERVIEW/
        AC.L2-3.1.1_interview-guide_sysadmin-role.md
      TEST/
        AC.L2-3.1.1_unauthorized-access-attempt-log_2026-04-01.png
        AC.L2-3.1.1_least-privilege-verification_2026-04-01.pdf
    AC.L2-3.1.2 - Transaction & Function Control/
      ...
  AU - Audit & Accountability/
    AU.L2-3.3.1 - System Auditing/
      EXAMINE/
        AU.L2-3.3.1_siem-configuration-export_2026-04-01.json
        AU.L2-3.3.1_audit-policy-gpo-export_2026-04-01.html
        AU.L2-3.3.1_log-retention-policy_v2.1.pdf
  ...
```

This structure does three things. First, it demonstrates organizational maturity to the assessor before they even look at the content. Second, it makes the assessment faster because the assessor can find what they need without asking you to dig through a shared drive. Third, it forces you to identify gaps before the assessment because empty folders are impossible to ignore.
What Assessors Look for in the First 10 Minutes
Every assessor I have worked with, at LANL and in the CMMC ecosystem, forms an initial impression within the first 10 minutes of reviewing your evidence package. That impression colors the entire assessment. Here is what they look for:
1. Currency of Evidence
Are the screenshots and exports recent? Evidence older than 90 days raises questions. Evidence older than 6 months is often rejected. The assessor wants to see that your controls are operating now, not that they were operating last year.
2. Consistency Across Artifacts
Does your SSP match your network diagram? Does your network diagram match your scan results? Does your asset inventory match the systems in your audit logs? Inconsistencies are the fastest way to lose assessor confidence.
3. Specificity Over Generality
Generic statements like "encryption is used for data at rest" are worthless. Specific statements like "BitLocker with AES-256 is enforced via GPO on all domain-joined endpoints, verified by monthly compliance scan" demonstrate real implementation.
4. Evidence of Ongoing Operation
A one-time configuration screenshot proves you set something up once. Periodic audit reports, regular review meeting minutes, and trend data from monitoring tools prove the control is operating continuously.
Common Evidence Failures I Have Seen
After years of building evidence packages and reviewing others, these are the failure patterns that come up again and again:
Evidence Anti-Patterns
The Screenshot Graveyard
Hundreds of unlabeled screenshots dumped into a folder. No context, no timestamps visible, no mapping to specific controls. The assessor has to guess what each screenshot is supposed to prove.
The Template SSP
An SSP clearly generated from a template with placeholder text still visible, or worse, another company's name in the headers. This happens more often than you would think, and it is an immediate credibility killer.
The Aspirational Policy
Policies that describe what the organization plans to do rather than what it actually does. "The organization shall implement" instead of "The organization implements." Assessors read policies carefully and notice tense.
The Missing Link
Evidence that proves a control exists but not that it is effective. You show that audit logging is configured, but you cannot show that anyone reviews the logs. You show that vulnerability scanning runs, but you cannot show that findings are remediated.
The Stale Package
Evidence collected six months before the assessment that no longer reflects the current environment. Systems have been added, configurations have changed, staff have turned over, but the evidence package was never refreshed.
How LANL Rigor Applies to CMMC L2
The LANL ATO process taught me that evidence quality is a function of process maturity, not effort. Organizations that have mature processes produce good evidence naturally because the evidence is a byproduct of how they operate. Organizations that lack mature processes have to manufacture evidence, and manufactured evidence always looks manufactured.
For CMMC Level 2, this means the best preparation is not assembling an evidence package. It is building the operational processes that generate evidence continuously. When your access reviews happen on schedule and are documented, you have evidence for AC controls. When your vulnerability scans run weekly and findings are tracked to closure, you have evidence for RA and SI controls. When your incident response team conducts tabletop exercises quarterly, you have evidence for IR controls.
The Evidence Maturity Ladder
Reactive (Most Contractors)
Evidence is collected in a rush before the assessment. Quality is poor, gaps are common, and the package does not reflect actual operations.
Periodic (Better)
Evidence is collected on a schedule (monthly or quarterly). Quality improves but gaps still appear between collection cycles.
Continuous (LANL Standard)
Evidence is generated automatically as a byproduct of operational processes. The evidence package is always current because it is always being updated.
Building Your Evidence Package: A Practical Guide
Here is the process I recommend, based on what worked at LANL and what I have refined through dozens of CMMC engagements:
Step 1: Map Every Practice to Evidence Sources
For each of the 110 CMMC Level 2 practices, identify where the evidence lives in your environment. Which system generates the log? Which tool exports the configuration? Which SharePoint site holds the policy? Document this mapping before you collect anything.
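That mapping can live in a spreadsheet, but keeping it as structured data makes the gap check automatic. A minimal sketch, where the source names, owners, and locations are illustrative placeholders rather than a prescribed inventory:

```python
# One entry per practice: where each piece of evidence comes from.
# All values below are example placeholders for your own environment.
EVIDENCE_SOURCES: dict[str, list[dict[str, str]]] = {
    "AC.L2-3.1.1": [
        {"phase": "EXAMINE", "source": "conditional access policy export",
         "owner": "IT admin", "location": "identity provider"},
        {"phase": "TEST", "source": "failed-logon log sample",
         "owner": "SOC analyst", "location": "SIEM"},
    ],
    "AU.L2-3.3.1": [
        {"phase": "EXAMINE", "source": "audit policy GPO export",
         "owner": "sysadmin", "location": "domain controller"},
    ],
}

def unmapped_practices(all_practices: list[str]) -> list[str]:
    """Practices with no documented evidence source -- the gaps to fix first."""
    return [p for p in all_practices if not EVIDENCE_SOURCES.get(p)]
```

Run `unmapped_practices` against the full list of 110 practice IDs and anything it returns is a practice you cannot currently evidence at all.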
Step 2: Automate Collection Where Possible
Every piece of evidence you can collect automatically is a piece you do not have to remember to collect manually. Script your configuration exports. Schedule your scan reports. Set up automated evidence pulls from your SIEM, your identity provider, and your endpoint management platform.
Step 3: Tag and Organize Immediately
Every artifact gets a filename that includes the practice ID, a description, and a date. Every artifact goes into the correct folder in your evidence structure. Do this at collection time, not the week before the assessment.
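The naming convention is easy to state and easy to drift from, so it is worth enforcing mechanically. A sketch of a validator for the `<practice-id>_<description>_<date>.<ext>` pattern used in the tree above:

```python
import re

# Convention from this article: <practice-id>_<description>_<YYYY-MM-DD>.<ext>
ARTIFACT_NAME = re.compile(
    r"^[A-Z]{2}\.L2-\d+\.\d+\.\d+_"   # practice ID, e.g. AC.L2-3.1.1
    r"[A-Za-z0-9._-]+_"                # free-form description
    r"\d{4}-\d{2}-\d{2}"               # collection date
    r"\.[a-z0-9]+$"                    # extension
)

def check_artifact_name(filename: str) -> bool:
    """True if the artifact follows the ID_description_date convention."""
    return ARTIFACT_NAME.match(filename) is not None
```

A pre-assessment sweep that flags every filename failing this check catches the unlabeled screenshots before the assessor does.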
Step 4: Conduct Internal Reviews
Have someone who did not collect the evidence review it. Can they understand what it proves without explanation? If not, add context. An assessor will not call you to ask what a screenshot means. They will mark the practice as insufficiently evidenced.
Step 5: Refresh Before Assessment
In the 30 days before your C3PAO assessment, refresh all evidence. Replace anything older than 90 days. Verify that your SSP matches your current environment. Run a final gap check against your evidence structure to identify any empty folders.
The Bottom Line
The evidence package is where most CMMC assessments are won or lost. Not because the controls are hard to implement, but because proving implementation to a third party requires a level of documentation discipline that most organizations have never needed before. The lessons from LANL are simple: be specific, be current, be organized, and let your evidence tell a story of operational maturity rather than last-minute compliance.
Build an Evidence Package That Passes on the First Attempt
AeoliTech's PolicyCortex automates evidence collection and organization across all 110 CMMC Level 2 practices. Talk to us about assessment readiness.
Schedule a CMMC Readiness Call

Leonard Esere
Founder & CEO, AeoliTech
Leonard carried systems through full ATO at Los Alamos National Laboratory under NIST 800-53 High before founding AeoliTech. That experience informs every evidence package and assessment preparation engagement the company delivers.