
What Counts as Evidence?

In ASPICE, evidence is any artifact or observable that allows an assessor to judge whether a Base Practice (BP) indicator is achieved. The PAM defines two types of indicators:

  • Work Product (WP) indicators - tangible artifacts (documents, tool outputs, reports) whose existence and content demonstrate that a BP was performed
  • Process Performance (PP) indicators - observable behaviors or direct testimony from interviewees showing the activity is actually performed as described

Both types matter. A perfectly written Software Requirements Specification (WP indicator) paired with interview testimony revealing that requirements are never reviewed before being approved (PP indicator gap) still produces a finding. Evidence is the combination of what exists on paper and what actually happens in practice.

📋 Learning Objectives

  • Distinguish WP indicators from PP indicators and know when each is primary
  • Identify the highest-credibility evidence type for each ASPICE process
  • Build an evidence package that minimizes assessor time and maximizes clarity
  • Handle the most common evidence gap: no documentation of activities performed months before the assessment
  • Know which tool exports, report formats, and metadata elements assessors specifically look for

Evidence Quality Hierarchy

Not all evidence carries equal weight. Assessors implicitly apply a quality hierarchy when evaluating evidence. Understanding this hierarchy allows you to focus preparation effort where it has the highest return.

| Tier | Evidence Type | Examples | Assessor Credibility |
| --- | --- | --- | --- |
| 1 (Highest) | Tool-generated artifacts with audit trails | DOORS export with requirement IDs, links, change history; Jira/GitLab issue history; CI/CD build logs; coverage tool reports with timestamps | Very high - tool timestamps and change histories cannot be easily fabricated; they show the actual sequence of work |
| 2 | Formally reviewed and signed-off documents | SRS with review signature sheet; architecture document with reviewer names and review date; test plan with approval record | High - signatures and dates demonstrate that a process step actually occurred |
| 3 | Meeting records and email threads with traceable decisions | Review meeting minutes with attendees and action items; change request approval emails; phase gate sign-off records | Medium-High - contemporaneous records demonstrate process execution even when formal documents are thin |
| 4 | Interview testimony corroborated by multiple interviewees | Three engineers independently describing the same review process consistently | Medium - consistent testimony is credible; single-person testimony for all BPs is weak |
| 5 (Lowest) | Process documents and procedure descriptions | Process definition documents, work instruction PDFs, organizational process guidelines | Low alone - describes what should happen, not what did happen; must be combined with Tier 1–4 evidence of actual execution |

⚠️ The Process Document Trap

A common preparation mistake: organizations submit beautifully formatted process definition documents (Tier 5) as their primary evidence. Assessors will respond: "This tells me what you plan to do. Now show me that you actually did it on the assessed project." If Tier 1–4 evidence of actual execution cannot be produced, the process document alone cannot support a CL1 rating. It can only contribute as context.
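Tier 1 evidence can often be harvested directly from the version control system rather than written up by hand. A minimal sketch of the idea - the pipe-delimited log format (e.g. from `git log --pretty=format:'%H|%an|%ad|%s'`), the `R-0042`-style requirement-ID pattern, and the field names are illustrative assumptions, not a prescribed convention:

```python
import re

# Hypothetical requirement-ID scheme; adapt the pattern to your project.
REQ_ID = re.compile(r"\bR-\d{4}\b")

def commits_with_requirement_refs(log_text):
    """Parse pipe-delimited 'git log' lines and keep commits whose subject
    mentions a requirement ID - timestamped, tool-generated (Tier 1) evidence."""
    evidence = []
    for line in log_text.strip().splitlines():
        commit, author, date, subject = line.split("|", 3)
        ids = REQ_ID.findall(subject)
        if ids:
            evidence.append({"commit": commit, "author": author,
                             "date": date, "requirements": ids})
    return evidence

# Sample export (fabricated data for illustration only)
sample = (
    "a1b2c3|Alice|2024-03-01|Implement R-0042 input validation\n"
    "d4e5f6|Bob|2024-03-02|Refactor logging\n"
    "0718aa|Alice|2024-03-03|Fix R-0042 and R-0051 boundary checks"
)
for entry in commits_with_requirement_refs(sample):
    print(entry["commit"], entry["requirements"])
```

Filtering like this turns a raw log export into a focused evidence item: commits that demonstrably reference requirements, with authors and dates the assessor can verify against the live repository.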

Per-Process Evidence Guide

The following table details the highest-impact evidence items per process. "Highest-impact" means: the item that, if missing or poor quality, most reliably results in a finding. These are the documents assessors ask for first.

| Process | Must-Have Evidence | Common Gap |
| --- | --- | --- |
| SWE.1 | SRS with unique IDs, review sign-off sheet, traceability matrix to SYS.2, requirement priority/release column | Traceability matrix missing or covers only a subset of requirements |
| SWE.2 | SW Architecture Document with static block diagram + at least one dynamic (sequence/timing) diagram, interface specification, requirement-to-component allocation table | Dynamic view absent; interface spec is a list of function signatures without timing or error semantics |
| SWE.3 | Detailed design per unit (pseudocode, flowchart, or detailed function description), coding guidelines document, static analysis report with suppression justifications, source code in CM | Detailed design only exists at architectural level; individual unit design is undocumented |
| SWE.4 | Unit test strategy (coverage targets, methods, environment), unit test cases with trace to SWE.3, test execution results, coverage report | Test strategy not documented; coverage report exists but coverage targets not defined or gaps not addressed |
| SWE.5 | SW integration test plan with integration order rationale, integration test cases traced to SWE.2 interfaces, test results with defect links for failures, regression test records | Integration order not documented; test cases trace to SWE.1 requirements rather than SWE.2 architecture interfaces |
| SWE.6 | SW qualification test specification traced to SWE.1, test report with pass/fail/blocked status, requirements coverage summary, regression test execution after changes | Requirements coverage not measured; "test report" is an executed script log without traceability metadata |
| SUP.1 | QA plan with auditor role and independence statement, QA audit records for the assessed project (document-specific), nonconformance tracking records | QA auditor is the project lead or a team member; QA records reference generic process checks, not project-specific artifacts |
| SUP.8 | CM plan, version history from VCS (Git log or equivalent) for all WPs, baseline records (tag names, dates, content lists), build manifest | Non-code WPs (SRS, arch doc) not in CM; baselines defined in the CM plan but never actually created |
| SUP.10 | Change request procedure, CR records with status history, impact analysis per CR, approval signatures, implementation verification records | CR process exists on paper but all CRs are in "approved" status with no evidence of impact analysis or implementation verification |
| MAN.3 | Project plan (scope, schedule, resources), status meeting minutes referencing plan vs actual, risk log, corrective action records for deviations, project closure report | Plan exists but status meetings do not reference it; no corrective action records when milestones are missed |

Handling Retrospective Evidence

Retrospective evidence arises when work was performed but not documented at the time. This is the most common evidence challenge for teams new to ASPICE: processes were followed but the paper trail was not created. What can be done?

What Is Acceptable Retrospective Evidence

  • Email archives: Review emails, approval threads, and discussion chains with timestamps are contemporaneous records even if not formal documents. An email thread showing "Requirements v1.3 reviewed by [names], approved on [date], comments resolved" is valid review evidence.
  • VCS history: Git commit messages, branch names, PR/MR review threads, and commit timestamps are tool-generated (Tier 1) retrospective evidence. A PR description that says "implements requirement R-0042 per detailed design section 5.3" provides implicit traceability evidence.
  • Meeting minutes retroactively formalized: Notes from meetings that were held but not formally recorded can be formalized retrospectively - provided they are clearly labeled as post-hoc reconstruction and the attendees can confirm the accuracy. Assessors accept this with lower credibility than contemporaneous records.
  • Reconstructed traceability matrices: A traceability matrix created before the assessment by tracing existing requirements and tests is acceptable - the completeness of the traces is assessable, and inconsistencies (requirements with no test) will be visible. Do not claim the matrix was produced during development if it was not; be transparent about when it was created.
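A reconstructed matrix is essentially a cross-reference of requirement IDs against the tests that cite them, and building it programmatically makes the gaps visible immediately. A minimal sketch - the ID scheme and test names are illustrative assumptions:

```python
def build_trace_matrix(requirement_ids, test_cases):
    """Map each requirement ID to the test cases that reference it.
    test_cases: dict of test name -> set of requirement IDs it covers.
    Requirements left with an empty list are visible coverage gaps."""
    matrix = {req: [] for req in requirement_ids}
    for test, covered in test_cases.items():
        for req in covered:
            if req in matrix:
                matrix[req].append(test)
    return matrix

# Fabricated sample data for illustration
requirements = ["R-0041", "R-0042", "R-0043"]
tests = {
    "test_input_validation": {"R-0042"},
    "test_boundary_checks": {"R-0042", "R-0043"},
}
matrix = build_trace_matrix(requirements, tests)
gaps = [req for req, tc in matrix.items() if not tc]
print("Uncovered requirements:", gaps)  # R-0041 has no test
```

The point of automating the reconstruction is exactly what the bullet above describes: inconsistencies such as a requirement with no test surface mechanically, before the assessor finds them.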

What Is NOT Acceptable

  • Review records with today's date for a review that allegedly happened 6 months ago
  • Backdated document version timestamps (file system metadata will betray this)
  • Claiming tool-generated reports cover a period when the tool was not in use for those artifacts
  • Meeting minutes created for events that never occurred

🔍 The Honest Gap Statement

When contemporaneous evidence cannot be produced for a completed activity, the most credible approach with an assessor is transparency: "This activity was performed but not documented at the time. We now have [existing evidence type]. In future projects, we have implemented [process change] to ensure [work product] is created during the activity." Assessors respect organizations that acknowledge gaps and demonstrate systemic improvement. They do not respect fabricated evidence - and they are experienced enough to find it.

Organizing the Document Package

The document package submitted before the assessment is the first impression assessors receive of the organization's process maturity. A well-organized package communicates process discipline before a single interview question is asked. A chaotic package - mislabeled files, inconsistent versions, missing items - signals exactly the opposite.

Recommended Package Structure

Organize by process, not by document type. Assessors work process by process; they should not need to search across folders to assemble evidence for a single process.

Assessment_Evidence_Package/
├── 00_Overview/
│   ├── Assessment_Contract.pdf
│   ├── Project_Overview.pdf          (1-2 pages: project scope, phase, team)
│   └── Document_Index.xlsx           (maps every document to its process(es))
├── SWE.1_SW_Requirements/
│   ├── SRS_v3.2_baseline.pdf
│   ├── SRS_Review_Records.pdf
│   └── Traceability_SYS2_to_SWE1.xlsx
├── SWE.2_SW_Architecture/
│   ├── SW_Architecture_Design_v2.1.pdf
│   └── Component_Interface_Spec.pdf
├── SWE.3_Detailed_Design/
│   ├── Detailed_Design_Document.pdf
│   ├── Coding_Guidelines.pdf
│   └── Static_Analysis_Report_v2.1.pdf
├── SWE.4_Unit_Verification/
│   ├── Unit_Test_Strategy.pdf
│   ├── Unit_Test_Report_v1.3.pdf
│   └── Coverage_Report_lcov.html
├── SWE.5_SW_Integration/
│   ├── SW_Integration_Test_Plan.pdf
│   └── SW_Integration_Test_Results.pdf
├── SWE.6_SW_Qualification/
│   ├── SW_Qualification_Test_Spec.pdf
│   ├── SW_Qualification_Test_Report.pdf
│   └── Req_Coverage_Matrix.xlsx
├── SUP.1_Quality_Assurance/
│   ├── QA_Plan.pdf
│   └── QA_Audit_Records_ProjectAlpha.pdf
├── SUP.8_Config_Management/
│   ├── CM_Plan.pdf
│   ├── Baseline_Records.pdf
│   └── Git_Log_Export.csv
├── SUP.10_Change_Management/
│   ├── CR_Process_Description.pdf
│   └── CR_Sample_Records.pdf
└── MAN.3_Project_Management/
    ├── Project_Plan_v4.mpp (or PDF export)
    ├── Status_Reports_Last3.pdf
    └── Risk_Register.xlsx

The Document Index

The document index (00_Overview/Document_Index.xlsx) is the highest-value single artifact you can create for the assessment. It is a table that maps every document to: the ASPICE process(es) it evidences, the version submitted, the baseline date, and the CM location. Assessors use this to navigate the package and to quickly identify what is missing. Creating this index also forces you to verify that evidence exists for every process in scope before you submit - a valuable self-check.
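That self-check can be automated once the index exists in a parseable form. A hedged sketch - the in-scope process list and the (filename, process) pair format are assumptions standing in for whatever your Document_Index.xlsx actually contains:

```python
# Processes in the assessment scope (assumed; adjust to your contract)
IN_SCOPE = {"SWE.1", "SWE.2", "SWE.3", "SWE.4", "SWE.5", "SWE.6",
            "SUP.1", "SUP.8", "SUP.10", "MAN.3"}

def processes_without_evidence(index_rows):
    """index_rows: list of (filename, process) pairs, e.g. exported from
    the document index. Returns in-scope processes with nothing mapped."""
    covered = {process for _, process in index_rows}
    return sorted(IN_SCOPE - covered)

# Fabricated partial index for illustration
rows = [
    ("SRS_v3.2_baseline.pdf", "SWE.1"),
    ("Traceability_SYS2_to_SWE1.xlsx", "SWE.1"),
    ("SW_Architecture_Design_v2.1.pdf", "SWE.2"),
]
missing = processes_without_evidence(rows)
print("Processes with no mapped evidence:", missing)
```

Running a check like this before submission turns the index from a navigation aid into the completeness gate the paragraph above describes.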

Live Interview Evidence

Documents submitted before the assessment are the foundation, but on-site interviews are where assessors verify, probe, and challenge. The interview is not a presentation - it is a verification activity. Engineers should be prepared to navigate directly to any document in the submitted package, demonstrate tool usage live, and explain their own work in their own words.

Live Demonstration Best Practices

  • Open your requirements tool, not a PDF export - showing DOORS or Polarion live demonstrates that the tool is actively used, not just used to generate a snapshot for the assessment. Live tools also allow assessors to click links and verify traces themselves.
  • Show version history - in Git, in DOORS change history, in the requirements tool - to demonstrate that work products were maintained over time, not created as a single artifact for the assessment.
  • Show a complete vertical slice - select one SWE.1 requirement and walk the assessor through the complete chain: the SYS.2 requirement it derives from, the SWE.2 component that implements it, the SWE.3 unit, the SWE.4 unit test, and the SWE.6 qualification test. Being able to do this live for a randomly selected requirement is the highest-quality evidence of functional traceability.
  • Show defect records - closed defects from review records and test execution demonstrate that processes actually caught and resolved issues. A process with no defects found in reviews is a process where reviews are not being done rigorously - assessors know this.
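The vertical slice described above can be rehearsed offline before the assessment by modeling the trace links as data and walking them for a sample requirement. A sketch under stated assumptions - the link structure and every artifact ID here are hypothetical:

```python
# Hypothetical trace links: artifact ID -> IDs that refine or verify it.
TRACE = {
    "SYS2-010": ["SWE1-R-0042"],
    "SWE1-R-0042": ["SWE2-COMP-Filter", "SWE6-QT-117"],
    "SWE2-COMP-Filter": ["SWE3-UNIT-filter_update"],
    "SWE3-UNIT-filter_update": ["SWE4-UT-301"],
}

def walk_slice(artifact, depth=0):
    """Depth-first walk over the trace links, returning the indented chain
    an assessor would ask to see for a randomly selected requirement."""
    chain = [("  " * depth) + artifact]
    for child in TRACE.get(artifact, []):
        chain.extend(walk_slice(child, depth + 1))
    return chain

print("\n".join(walk_slice("SYS2-010")))
```

If a walk like this dead-ends for any sampled requirement (a component with no unit, a unit with no test), you have found the gap in rehearsal rather than in the interview.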

Preparing Interviewees

Coach interviewees on three things: (1) know your role in the process - what you personally do, not what the process document says should happen; (2) know where the key artifacts are - be able to navigate to them in under 30 seconds; (3) answer questions directly - "I don't know" is acceptable; "let me show you the document that answers that" is better. Coached, scripted answers where the interviewee clearly does not understand what they are reading are immediately recognizable and damaging.

Summary & Key Takeaways

✅ Key Takeaways

  • Evidence = WP indicators (documents) + PP indicators (observable behavior). Both matter; documents alone are insufficient.
  • Tier 1 evidence (tool-generated, timestamped) has the highest credibility. Tier 5 evidence (process description documents) is context only - not proof of execution.
  • The highest-impact evidence items: for SWE.1, the traceability matrix to SYS.2; for SUP.1, QA audit records with specific project artifact references; for MAN.3, status reports that reference the project plan explicitly.
  • Retrospective evidence is acceptable if it is contemporaneous (emails, VCS logs) or transparently labeled as post-hoc reconstruction. Backdated or fabricated documents are immediately recognizable and destroy credibility.
  • Organize the document package by process, not by document type. Include a Document Index mapping every artifact to its process(es).
  • The most powerful live evidence: walk an assessor through one complete traceability chain (SYS.2 → SWE.1 → SWE.2 → SWE.3 → SWE.4 → SWE.6 → SYS.5) in your live tools.

What's Next

Continue to the Hands-On: Bidirectional Traceability Setup exercise to build a complete traceability structure from scratch - from OEM stakeholder requirements through software unit tests - using a realistic ECU project scenario.
