
Overview & Work Product Characteristic (WPC) Primer

ASPICE defines not just that work products must exist, but what they must contain. These content requirements are called Work Product Characteristics (WPCs). They are listed in the PAM for each work product referenced by a Base Practice. Producing a document titled "Software Requirements Specification" is not sufficient - the document must contain specific elements or the assessor will rate the corresponding BPs as Partially or Not achieved.

📋 What You Will Do In This Chapter

  • Study the WPCs for SRS (SWE.1), SAD (SWE.2), and Review Records (all processes)
  • Work through annotated templates showing which section maps to which WPC/BP
  • Identify gaps in a deliberately imperfect template and classify them
  • Build a reusable template checklist mapped to PAM v3.1 requirements

How WPCs Work in an Assessment

During data collection, an assessor reviewing your SRS will mentally map every section against the WPC list for that work product. If a required WPC element is absent or insufficient, it weakens the corresponding BP rating. Common WPC checks include:

  • Are requirements uniquely identified? (WPC: unique identifier per item)
  • Is a verification method defined for each requirement? (WPC: verification method or acceptance criteria per requirement)
  • Is there a section documenting review status? (WPC: review/approval record)
  • Are change history entries present? (WPC: change history / revision history)
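These checks are mechanical enough to run automatically before an assessment. Below is a minimal sketch of such a pre-check; the record layout (`req_id`, `verification`, `source` fields) is our own illustrative assumption, not something the PAM mandates.

```python
# Minimal pre-assessment WPC check over requirement records.
# The field names (req_id, verification, source) are illustrative
# assumptions, not mandated by the PAM.

VALID_METHODS = {"Test", "Analysis", "Inspection", "Demonstration"}

def wpc_findings(requirements):
    """Return a list of WPC gaps for a list of requirement dicts."""
    findings = []
    seen_ids = set()
    for req in requirements:
        rid = req.get("req_id", "")
        if not rid:
            findings.append(f"missing unique ID: {req!r}")
        elif rid in seen_ids:
            findings.append(f"duplicate ID: {rid}")
        seen_ids.add(rid)
        if req.get("verification") not in VALID_METHODS:
            findings.append(f"{rid or '?'}: verification method missing or invalid")
        if not req.get("source"):
            findings.append(f"{rid or '?'}: no source reference (mark Internal if derived)")
    return findings

reqs = [
    {"req_id": "SRS-042", "verification": "Test", "source": "STS-PremCar-AWD-107"},
    {"req_id": "SRS-043", "verification": "TBD", "source": "STS-PremCar-AWD-107"},
]
print(wpc_findings(reqs))  # flags SRS-043's undefined verification method
```

A script like this does not replace the assessor's judgment on content quality; it only catches the structural gaps before they become findings.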

Work Product | Referenced by BPs | Key WPCs (Summary)
Software Requirements Specification (SRS) | SWE.1.BP1, BP2, BP5, BP6, BP8 | Unique requirement IDs; functional & non-functional requirements; traceability to source; verification method per requirement; approval status; change history
Software Architectural Design (SAD) | SWE.2.BP1, BP2, BP3, BP4, BP5 | SW component decomposition; component interfaces (port names, data types, direction); requirements allocation to components; static and dynamic architecture views; design rationale
Review Record | GP 2.2.4 (all processes); SWE.1.BP3, SWE.2.BP6 | Work product reviewed (ID, version); review date; reviewers; issues found with ID; issue severity; disposition (accepted/rejected/deferred); updated version after review
Traceability Matrix | SWE.1.BP6, SWE.2.BP5, SWE.6.BP4 | Source requirement ID; target requirement ID (or test case ID); link status (covered/not covered); coverage metrics; baseline reference
Project Management Plan | MAN.3.BP1–BP5; GP 2.1.1–2.1.6 | Project scope; milestones and schedule; resource plan with named roles; responsibility matrix (RACI); risk register reference; process objectives per in-scope ASPICE process

SRS Template (SWE.1)

Below is an annotated SRS template structure. Each section is labeled with the WPC it satisfies and the BP it supports. Use this as the baseline structure for your project SRS. The annotations in brackets are guidance notes - remove them from your actual document.

📄 Software Requirements Specification - Template Structure

Document ID: [PROJECT-SRS-001]  |  Version: [1.0]  |  Status: [Draft / Under Review / Approved]

Project: [Project name]  |  ECU: [ECU name]  |  Date: [YYYY-MM-DD]

Prepared by: [Name, Role]  |  Reviewed by: [Name, Role]  |  Approved by: [Name, Role]


1. Introduction [WPC: document scope and purpose]

  1.1 Purpose - What this document defines and its role in the project

  1.2 Scope - Which system or subsystem is covered; what is excluded

  1.3 Definitions, Acronyms, Abbreviations - [define all domain-specific terms used in requirements]

  1.4 References - [list the STS version, interface documents, standards referenced] [WPC: traceability to source documents]


2. System Context [WPC: operating environment description - BP4]

  2.1 External Interfaces - Other ECUs, sensors, actuators that interact with this software

  2.2 Operational Modes - Normal, degraded, safe state, power modes

  2.3 Constraints - Hardware constraints (MCU, memory), AUTOSAR BSW version, tool chain


3. Requirements [WPC: unique IDs, functional and non-functional separation - BP1, BP2]

  3.1 Functional Requirements

  [REQ-ID] | [Requirement text - one testable statement per requirement] | [Source STS ID] | [Priority: High/Med/Low] | [Verification Method: Test/Analysis/Inspection/Demonstration] | [ASIL: QM/A/B/C/D]

Example:

SRS-042 | The WheelSpeedProcessor shall filter the raw wheel speed sensor signal using a moving average filter with a configurable window size of 4–16 samples. | STS-PremCar-AWD-107 | High | Test | QM

  3.2 Non-Functional Requirements [performance, timing, memory, MISRA compliance, etc.]

SRS-NF-001 | The SlipController main function shall execute within a maximum of 500 µs on the target AURIX TC387 at 300 MHz. | STS-PremCar-AWD-198 | High | Test | QM

  3.3 Interface Requirements [CAN signals, diagnostic services, AUTOSAR port specs]

  3.4 Constraints from Standards [ISO 26262, MISRA C, AUTOSAR compliance statements]


4. Traceability [WPC: bidirectional traceability - BP6]

  4.1 Requirements Source Traceability - Table mapping each SRS requirement to its source STS item (or marking it as an internal/derived requirement with justification)

  4.2 Requirements Coverage - Summary of total STS items and their SRS decomposition coverage (must show no orphaned STS items)


5. Review and Approval [WPC: approval status - BP8; review record reference - GP 2.2.4]

  5.1 Review History - Reference to review record document ID and version reviewed

  5.2 Approval Signatures - Name, role, date, signature (or electronic approval record)


6. Change History [WPC: change history - GP 2.2.2/2.2.3]

  [Version] | [Date] | [Author] | [Change Description] | [Changed Sections]
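The pipe-delimited requirement rows in section 3.1 of the template lend themselves to mechanical validation. Below is a parsing sketch; the six column labels are our own names for the fields shown above, not a standard schema.

```python
# Parse a pipe-delimited SRS requirement row (format from section 3.1
# of the template). The field names are our own labels for the columns.

FIELDS = ("req_id", "text", "source", "priority", "verification", "asil")

def parse_req_row(row: str) -> dict:
    """Split one requirement row into a field dict; reject malformed rows."""
    parts = [p.strip() for p in row.split("|")]
    if len(parts) != len(FIELDS):
        raise ValueError(f"expected {len(FIELDS)} fields, got {len(parts)}")
    return dict(zip(FIELDS, parts))

row = ("SRS-042 | The WheelSpeedProcessor shall filter the raw wheel speed "
       "sensor signal using a moving average filter with a configurable "
       "window size of 4–16 samples. | STS-PremCar-AWD-107 | High | Test | QM")
req = parse_req_row(row)
print(req["req_id"], req["verification"])  # SRS-042 Test
```

Once rows parse into dicts, the WPC checks (unique IDs, verification method, source) become one-line filters over the parsed list.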

Critical SRS Mistakes That Fail ASPICE Assessments

Mistake | WPC/BP Violated | How to Fix
Requirements written as vague design statements: "The system shall be implemented efficiently" | BP1 - requirement not testable; BP5 - no verification method possible | Every requirement must be unambiguous, singular, and testable. Replace with: "The algorithm shall execute within X ms under Y conditions."
Requirements with no unique ID - numbered only by section (e.g., "4.2.1") | BP2 - no unique stable identifier; BP6 - traceability impossible if document structure changes | Use a project-consistent ID scheme: SRS-nnn or [PROJ]-SW-REQ-nnn. IDs must be stable across document versions - renumbering invalidates traceability.
Source column empty for "obvious" requirements | BP6 - bidirectional traceability incomplete | Every requirement must have a source entry. Internal/derived requirements must be marked explicitly with a justification for why they have no STS parent.
Verification method column blank or "TBD" | BP5 - verification criteria not defined | Define the verification method at requirement write time, not at test planning time. Valid methods: Test, Analysis, Inspection, Demonstration.
SRS approved by the requirements engineer who wrote it | GP 2.2.4 - independent review required | The reviewer must be different from the author. For safety-relevant requirements, a safety engineer or systems engineer should be a required reviewer.
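Several of these mistakes can be caught by simple text heuristics before a human review. A rough sketch follows; the banned-phrase list is our own and deliberately incomplete, so treat it as a lint pass, not a quality gate.

```python
import re

# Heuristic scan for untestable wording, compound requirements, and
# undefined verification methods. The vague-phrase list is illustrative.
VAGUE = re.compile(
    r"\b(efficient(ly)?|acceptable|appropriate|user[- ]friendly|fast|as needed)\b",
    re.IGNORECASE)

def scan_requirement(req_id: str, text: str, verification: str) -> list:
    issues = []
    if VAGUE.search(text):
        issues.append(f"{req_id}: vague wording - not testable (BP1)")
    if text.count("shall") > 1:
        issues.append(f"{req_id}: compound requirement - split it (BP2)")
    if verification.strip().upper() in ("", "TBD"):
        issues.append(f"{req_id}: verification method undefined (BP5)")
    return issues

print(scan_requirement(
    "SRS-099", "The system shall be implemented efficiently.", "TBD"))
```

A well-formed requirement like SRS-042 from the template passes all three checks; the example above fails two of them.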

SAD Template (SWE.2)

The Software Architectural Design document must satisfy SWE.2's WPCs. Unlike the SRS, which is requirements-focused, the SAD must convey design decisions: decomposition rationale, interface specifications, and allocation tables.

📄 Software Architectural Design - Template Structure

Document ID: [PROJECT-SAD-001]  |  Version: [1.0]  |  Status: [Draft / Approved]


1. Architecture Overview [WPC: architecture description - BP1]

  1.1 Architecture Scope - What SW elements are covered; boundaries

  1.2 Architecture Principles - Key design decisions: AUTOSAR SWC decomposition strategy, layering model, separation of concerns rationale

  1.3 Static Architecture View - Component diagram showing all SWCs, BSW modules used, and the RTE. ARXML-based tools should export this.

  1.4 Dynamic Architecture View - Sequence or communication diagram showing key runtime interactions (e.g., periodic task activation, data exchange flows)


2. Software Component Descriptions [WPC: SW component description with interfaces - BP1, BP3]

For each SWC, provide:

[Component Name] | [Responsibility] | [Port Name] | [Port Direction: Provided/Required] | [Interface Type: Sender-Receiver / Client-Server] | [Data Type] | [AUTOSAR Port Interface Name]

Example:

SlipController | Calculates target slip ratio and generates valve commands | ValveCommands | Provided | Sender-Receiver | SlipCtrl_ValveCmd_Type | If_SlipCtrl_ValveCmd


3. Requirements Allocation [WPC: SW requirements allocated to SW elements - BP2]

  3.1 Allocation Table - Map every SRS requirement to the SW component(s) that implement it

SRS-042 | WheelSpeedProcessor | WS_Filter | Allocated

  Note: every SRS requirement must appear in this table. Unallocated requirements are a Finding.
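The "no unallocated requirements" rule is easy to verify mechanically. Below is a sketch, assuming the requirement IDs and the allocation table are available as plain Python data (the IDs are from the examples above; the data layout is our own).

```python
# Find SRS requirements missing from the allocation table (SWE.2.BP2).
# The IDs and the mapping layout are illustrative.

def unallocated(srs_ids, allocation):
    """allocation maps SRS ID -> list of implementing components."""
    return sorted(rid for rid in srs_ids if not allocation.get(rid))

srs_ids = ["SRS-042", "SRS-043", "SRS-156"]
allocation = {
    "SRS-042": ["WheelSpeedProcessor"],
    "SRS-043": ["WheelSpeedProcessor"],
}
print(unallocated(srs_ids, allocation))  # each entry here is a Finding
```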


4. Interface Specifications [WPC: interface specifications - BP3]

For each inter-component interface:

[Interface Name] | [Provider Component] | [Consumer Component] | [Data Type] | [Communication Pattern] | [Update Rate / Trigger] | [Min/Max/Resolution/Unit] | [Initial Value]


5. Design Decisions and Rationale [WPC: design decisions - implicit in BP1; required for CL2 review evidence]

  Document the "why" for non-obvious architectural choices. Examples: why a Sender-Receiver port was chosen over Client-Server, why a component was split into two, why a particular BSW module is used or bypassed.


6. Dynamic Behavior [WPC: dynamic behavior - BP4]

  6.1 Task Mapping - Which runnables execute in which OS tasks at what rates

  6.2 Startup and Shutdown Sequences - Init order dependencies

  6.3 Error Handling Behavior - How the architecture handles DEM events, mode requests, degradation


7. Traceability to SRS [WPC: bidirectional traceability - BP5]

  Reference to the central Traceability Matrix (TRM-001) or inline table showing SRS→SAD component allocation coverage


8. Review and Change History - Same structure as SRS sections 5 and 6

⚠️ AUTOSAR-Specific SAD Note

In AUTOSAR Classic projects, the SAD is often partially generated from the ARXML - the component and port definitions come from the AUTOSAR Composition Description. This is acceptable evidence, but the ARXML export alone is not sufficient. The SAD must also include: design rationale, allocation table, dynamic behavior sections, and review records. The generated ARXML satisfies the structural description WPCs; the manually authored sections satisfy the design decision and allocation WPCs.

Review Record Template

The review record is the single most frequently checked evidence artifact for GP 2.2.4 - and the most commonly missing or insufficient one. Every work product that goes through a review must have a review record. The record must answer: what was reviewed, by whom, when, what issues were found, and how each issue was resolved.

📄 Review Record - Template Structure

Review ID: RVW-[WP-ID]-[nnn]  |  Review Type: [Peer Review / Inspection / Walkthrough]

Work Product Reviewed: [Document ID, Title, Version]  |  Review Date: [YYYY-MM-DD]

Review Moderator: [Name, Role]  |  Author (not reviewer): [Name, Role]

Reviewers: [Name 1, Role] | [Name 2, Role] | [Name 3, Role]

Review Checklist Used: [Checklist ID or "none"]  |  Preparation Time (hrs): [total across reviewers]

Review Meeting Duration: [hh:mm]


Review Issue Log:

Issue ID | Section/Page | Issue Description | Severity [Major/Minor/Observation] | Owner | Due Date | Disposition [Accepted/Rejected/Deferred] | Closed in Version

Example rows:

RVW-001-001 | Sec 3.1, SRS-043 | Verification method column empty | Minor | RE Lead | 2024-08-29 | Accepted | v2.2

RVW-001-002 | Sec 4.1 | 8 requirements (SRS-156, 201, 245...) have no STS source link | Major | RE Lead | 2024-08-30 | Accepted | v2.2

RVW-001-003 | Sec 5 | Timing requirement SRS-NF-003 lacks quantitative acceptance criteria | Observation | RE Lead | 2024-09-05 | Accepted | v2.3


Review Summary:

Total issues found: [n] | Major: [n] | Minor: [n] | Observations: [n]

Review outcome: [Approved with corrections / Re-review required / Rejected]

Next steps: [e.g., "Author to close all Major issues by [date]; minor issues closed in next regular update"]

Review Close Date: [YYYY-MM-DD]  |  All issues closed: [Yes / No - if No, explain]

❌ What Assessors Reject as Insufficient Review Evidence

  • "We discussed it in the project meeting and everyone agreed" - no document = no evidence
  • An email thread saying "looks good to me" - does not show what version was reviewed, what was checked, or what issues were found
  • A review record with 0 issues - assessors read this as meaning either that the review was not serious or that the document is trivially simple. A real review of a 200-requirement SRS should surface at least several issues.
  • A review record where the author is also listed as a reviewer - independence requires that the author not review their own work. The author may attend to answer questions, but must not be counted as a reviewer.
  • Issue list without disposition - knowing issues were found is not useful without knowing how each was resolved
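Some of these rejection criteria can be screened automatically before a record reaches an assessor. A sketch over a hypothetical record structure (the dict layout is our own assumption):

```python
# Screen a review record for the independence and disposition gaps
# listed above. The record layout is an assumed structure.

def review_record_gaps(record):
    gaps = []
    if record["author"] in record["reviewers"]:
        gaps.append("author listed as reviewer - independence violated (GP 2.2.4)")
    if not record["issues"]:
        gaps.append("zero issues logged - review depth questionable")
    for issue in record["issues"]:
        if not issue.get("disposition"):
            gaps.append(f"{issue['id']}: no disposition recorded")
    return gaps

record = {
    "author": "A. Author",
    "reviewers": ["B. Reviewer", "A. Author"],
    "issues": [{"id": "RVW-001-001", "disposition": "Accepted"},
               {"id": "RVW-001-002", "disposition": None}],
}
print(review_record_gaps(record))  # two gaps: independence and missing disposition
```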

Traceability Matrix Template

The traceability matrix (TRM) connects requirements across the V-model. A complete TRM covers the full chain: STS (SYS.2) → SRS (SWE.1) → Architecture (SWE.2) → Test Cases (SWE.6). In practice, horizontal and vertical traceability are sometimes split into separate artifacts, but a single integrated matrix is easier to audit.

📄 Traceability Matrix - Column Structure (Excel / Requirements Tool)

Tab 1: Horizontal Traceability (STS → SRS)

STS ID | STS Title/Description | SRS ID(s) | Coverage Status | Notes

STS-PremCar-AWD-107 | Wheel speed signal filtering | SRS-042, SRS-043, SRS-044 | Covered | -

STS-PremCar-AWD-198 | ABS response timing | SRS-NF-001 | Covered | Performance req

STS-PremCar-AWD-255 | Diagnostic fault storage | (none) | NOT COVERED | Pending - open action AR-041


Tab 2: Vertical Traceability (SRS → Architecture → Tests)

SRS ID | SRS Title | SAD Component | SWE.4 Test Case ID | SWE.6 Test Case ID | SWE.6 Test Result | Coverage Status

SRS-042 | WS filtering | WheelSpeedProcessor/WS_Filter | UT-WS-042-001 | QT-042-001 | PASS | Covered

SRS-043 | WS filter window size | WheelSpeedProcessor/WS_Filter | UT-WS-043-001 | QT-043-001 | PASS | Covered

SRS-156 | [Internal req - no STS source] | NOT ALLOCATED | (none) | (none) | - | NOT COVERED


Tab 3: Coverage Summary

Total STS items: 147 | Covered by SRS: 144 | Not covered: 3 | Coverage: 97.9%

Total SRS requirements: 312 | With architecture allocation: 311 | Not allocated: 1 | Coverage: 99.7%

Total SRS requirements: 312 | With SWE.6 test case: 308 | Not tested: 4 | Test coverage: 98.7%
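The coverage percentages in the summary tab are a single formula; computing them rather than hand-typing them avoids the drift risk noted below for spreadsheet-based matrices. A sketch (note that whether 144/147 displays as 97.9% or 98.0% depends on whether the convention is truncation or rounding; pick one and keep it):

```python
# Recompute the Tab 3 coverage figures instead of hand-typing them.

def coverage(total: int, covered: int) -> float:
    """Percent covered, rounded to one decimal place."""
    return round(100.0 * covered / total, 1)

rows = [("STS -> SRS", 147, 144),
        ("SRS -> architecture", 312, 311),
        ("SRS -> SWE.6 tests", 312, 308)]
for name, total, covered in rows:
    print(f"{name}: {covered}/{total} = {coverage(total, covered)}% "
          f"({total - covered} not covered)")
```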

Tool Options for Traceability Management

Tool | Scale | ASPICE Evidence Strength | Notes
Excel / Google Sheets | <300 requirements | Moderate - manual maintenance risk; can drift | Acceptable for small projects; must be version-controlled; coverage formula must be correct
IBM DOORS / DOORS Next | Large (1000+ req) | Strong - purpose-built, full audit trail | Industry standard for complex ECUs; steep learning curve; expensive licensing
Polarion ALM (Siemens) | Medium-Large | Strong - full lifecycle traceability, ASPICE dashboards | Widely used at Tier-1; integrates with Jenkins/Jira
Jama Connect | Medium | Strong - good review workflow, trace dashboards | Good for Agile teams; less common in traditional automotive
Jira + Zephyr/XRay | Medium | Moderate - Jira not natively a requirements tool; traceability needs careful plugin config | Used in Agile automotive teams; assessors accept it but scrutinize the link structure more carefully

Template Review Exercise

Below is a deliberately flawed SRS requirement table excerpt. Identify every WPC or BP gap, classify each as Finding or Weakness, and state how you would fix it.

📄 SRS Excerpt - ABS SW Requirements v1.0 (section 3.1, first 6 requirements)

# | Requirement Text | Source | Priority | Verification | ASIL
3.1.1 | The system shall process wheel speed data. | STS-107 | High | Test | QM
3.1.2 | The filtering algorithm shall use an efficient implementation. | (none) | High | Test | QM
3.1.3 | The WheelSpeedProcessor shall apply a moving average filter with a window of 8 samples to the raw sensor data, and reject samples deviating more than 15% from the running average. | STS-107 | High | Test | QM
3.1.4 | The ABS ECU shall detect wheel lockup and shall activate the hydraulic pressure reduction and shall log a DTC and shall send a CAN signal to the dashboard. | STS-112 | High | Test | ASIL-B
3.1.5 | Response time shall be acceptable. | STS-198 | High | TBD | ASIL-B
3.1.6 | The software shall comply with MISRA C:2012. | (none) | Medium | Inspection | QM

❓ Find All Gaps - Then Check Model Answer

Before reading below: identify every gap in the table above, which BP/WPC it violates, Finding vs. Weakness, and how to fix it.

✅ Model Answer

#3.1.1 - "Process wheel speed data" is not a testable requirement. It describes a function at too high a level. Finding → BP1 (requirement not specified adequately). Fix: Replace with a specific, measurable statement; if 3.1.3 already specifies the filtering behavior, either delete 3.1.1 as redundant or rewrite it as a distinct, testable requirement that 3.1.3 refines.

#3.1.2 - "Efficient implementation" is not measurable; no source STS ID. Two violations: Finding → BP1 (not testable) and Finding → BP6 (no upstream traceability). Also uses row number as ID - not a stable unique ID. Fix: Give it a project-standard ID (SRS-nnn), define quantitative efficiency criteria (e.g., max CPU usage %), add source STS ID or mark as internal requirement.

#3.1.3 - Well-written: specific, measurable, testable, has source and verification. No gaps. Strength. (Note: it lacks a stable ID - numbered 3.1.3 - same ID scheme problem as 3.1.2.)

#3.1.4 - Contains 4 distinct requirements in one statement (compound "shall...and shall..."). Finding → BP2 (requirement not structured - multiple requirements in one item makes traceability and test coverage ambiguous). Fix: Split into 4 separate requirements: SRS-nnn (lockup detection), SRS-nnn (pressure reduction activation), SRS-nnn (DTC logging), SRS-nnn (CAN signal). Each gets its own ID and test case.

#3.1.5 - "Acceptable" is not measurable; verification method is TBD. Finding → BP1, BP5. Fix: Replace "acceptable" with a specific timing value from the STS - e.g., "≤ 20 ms from wheel lockup detection to first pressure reduction pulse." Change Verification to "Test."

#3.1.6 - MISRA compliance is a valid non-functional requirement; Inspection is the correct verification method. No source STS - this is likely an internal/project standard constraint, which is acceptable if labeled as "Internal" in the Source column. Weakness → BP6 (source not documented, though absence is justifiable). Fix: Add "Source: Internal/Project Standard" in the Source column.

Global issues: The entire table uses section numbers (3.1.1–3.1.6) as IDs rather than stable project-wide IDs. If section 3 is restructured, all IDs change - breaking every traceability link. Finding → BP2 (lack of stable unique identifiers). Fix: Assign SRS-nnn IDs to all requirements, independent of document structure.
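The compound-requirement gap in 3.1.4 is exactly the kind a lint rule can flag before review. A rough sketch follows; the splitting heuristic (breaking on "and shall") is our own and deliberately naive, so it only suggests where a human should split.

```python
import re

# Naive compound-requirement detector: splits on "and shall" to count
# clauses, as in 3.1.4 above. A heuristic only - it cannot replace review.

def shall_clauses(text: str) -> list:
    """Split a requirement into its 'shall' clauses."""
    parts = re.split(r"\band shall\b", text)
    return [p.strip(" .") for p in parts]

req_3_1_4 = ("The ABS ECU shall detect wheel lockup and shall activate the "
             "hydraulic pressure reduction and shall log a DTC and shall "
             "send a CAN signal to the dashboard.")
clauses = shall_clauses(req_3_1_4)
print(len(clauses))  # 4 clauses -> should be 4 separate SRS items
```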

ASPICE Work Product Readiness Checklist

Use this checklist before submitting work products for assessor pre-review. Each item maps to a specific WPC, BP, or GP. Every "No" is a gap that will appear as a finding or weakness in the assessment.

Check Item | WPC/BP/GP | Status

SRS
Every requirement has a stable unique ID (not a section number) | SWE.1.BP2 | [ ] Yes [ ] No
Every requirement is testable (quantitative, unambiguous, singular) | SWE.1.BP1 | [ ] Yes [ ] No
Every requirement has a verification method (Test/Analysis/Inspection/Demonstration) | SWE.1.BP5 | [ ] Yes [ ] No
Every requirement has a source STS ID or is marked as Internal with justification | SWE.1.BP6 | [ ] Yes [ ] No
Every STS item has at least one SRS requirement derived from it (no orphaned STS items) | SWE.1.BP6 | [ ] Yes [ ] No
SRS is version-controlled with change history | GP 2.2.2/2.2.3 | [ ] Yes [ ] No
SRS has a formal review record (version reviewed, reviewers, issues, dispositions) | GP 2.2.4 | [ ] Yes [ ] No
SRS has approval signature(s) from authorized person(s) | SWE.1.BP8 | [ ] Yes [ ] No

SAD
All SWCs are described with their ports, data types, and interface direction | SWE.2.BP1, BP3 | [ ] Yes [ ] No
Requirements allocation table covers 100% of SRS requirements | SWE.2.BP2 | [ ] Yes [ ] No
Dynamic behavior (task mapping, runnable timing, startup sequence) documented | SWE.2.BP4 | [ ] Yes [ ] No
Design rationale for key decisions documented | SWE.2.BP1 (implicit) | [ ] Yes [ ] No
SAD has review record and approval | GP 2.2.4, BP6 | [ ] Yes [ ] No

Review Records (all processes)
Review record exists for every key work product (SRS, SAD, SDD, Test Plan, Test Report) | GP 2.2.4 | [ ] Yes [ ] No
Review records show reviewer names, not just "team" | GP 2.2.4 | [ ] Yes [ ] No
Review records list issues found (0 issues on a real document is a red flag) | GP 2.2.4 | [ ] Yes [ ] No
All review issues have disposition (Accepted/Rejected/Deferred) and closed version | GP 2.2.4 | [ ] Yes [ ] No

Traceability
Traceability matrix covers all STS → SRS links (or explains gaps) | SWE.1.BP6 | [ ] Yes [ ] No
Traceability matrix covers all SRS → SWE.6 test case links | SWE.6.BP4 | [ ] Yes [ ] No
Coverage report shows quantitative metrics (% covered, # uncovered items) | SWE.6.BP4, SWE.1.BP6 | [ ] Yes [ ] No
Traceability matrix is version-controlled and current | GP 2.2.2/2.2.3 | [ ] Yes [ ] No
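A checklist like this can also live next to the work products as data and be evaluated automatically, so that every "No" surfaces as a prospective finding before the pre-review. A minimal sketch (the item list is an abbreviated, illustrative subset of the table above):

```python
# Evaluate a readiness checklist: every "No" is a prospective finding.
# The item list here is an abbreviated, illustrative subset.

CHECKLIST = [
    ("Stable unique requirement IDs", "SWE.1.BP2"),
    ("Verification method per requirement", "SWE.1.BP5"),
    ("Review record with dispositions", "GP 2.2.4"),
]

def open_gaps(answers: dict) -> list:
    """answers maps item text -> True (Yes) / False (No or unanswered)."""
    return [f"{item} ({ref})" for item, ref in CHECKLIST
            if not answers.get(item, False)]

answers = {"Stable unique requirement IDs": True,
           "Verification method per requirement": False,
           "Review record with dispositions": True}
print(open_gaps(answers))  # the single "No" item, tagged with its BP
```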

What's Next

Continue to Improvement Action Planning - the final chapter in the ASPICE module - where you will learn how to translate assessment findings into a structured, OEM-reportable Improvement Action Plan, prioritize gaps by CL impact, and build a realistic timeline for achieving target capability levels.
