| BP | Practice | Description | Typical evidence | Common finding |
|----|----------|-------------|------------------|----------------|
| SWE.1.BP1 | Specify software requirements | Define and document each software requirement. Each requirement must be uniquely identified, correct, verifiable, and traceable. | Software Requirements Specification (SRS) with numbered requirements and attributes (priority, type, source) | Requirements stated as design decisions ("the system shall use a 16-bit counter") rather than true requirements ("the system shall count events up to 65535") |
| SWE.1.BP2 | Structure software requirements | Organize requirements with unique IDs and a logical structure (chapters, groups) that supports navigation, review, and change management. | SRS with hierarchical numbering, requirement IDs (e.g., SRS-FEAT-0042), and a consistent naming convention | Flat list of requirements with sequential numbers only and no grouping by function or feature area, which makes impact analysis impractical |
| SWE.1.BP3 | Analyze software requirements | Evaluate requirements for completeness, consistency, technical feasibility, and correctness. Document the analysis results and resolve identified issues. | Requirements review record with analysis checklist results, or documented review meeting minutes with an issues list | Review is performed mentally but not recorded, or only a sign-off page exists without an issue log showing what was actually analyzed |
| SWE.1.BP4 | Analyze impact on operating environment | Identify and document how software requirements affect other system elements: hardware, other ECUs, vehicle networks, and external systems. | Interface analysis section in the SRS, or a separate Interface Control Document (ICD) showing affected interfaces | SRS treats the software in isolation, with no analysis of how requirements drive hardware sizing, bus load, or interaction with adjacent ECUs |
| SWE.1.BP5 | Develop verification criteria | For each requirement, identify how it will be verified (test, analysis, inspection, or demonstration) and define acceptance criteria specific enough to make verification objective. | Verification method column in the SRS, or a separate verification cross-reference matrix (VCRM); acceptance criteria must be measurable, not just "the function works correctly" | Verification method defined generically ("test") without acceptance criteria, or no verification method at all for non-functional requirements such as response time or memory usage |
| SWE.1.BP6 | Ensure consistency and establish bidirectional traceability | Every SWE.1 requirement must trace upward to a SYS.2 system requirement that justifies its existence, and every SYS.2 system requirement must trace downward to at least one SWE.1 requirement that realizes it. Both directions must be documented. | Traceability matrix (spreadsheet, DOORS links, Polarion associations) showing SRS ID ↔ System Req ID mappings, plus a coverage report showing that all SYS.2 requirements are covered | SRS requirements carry source annotations ("from STS §3.2.1") but no formal, machine-readable trace links, so the assessor cannot verify coverage without manual inspection; or traceability exists but 15–20% of requirements are unmapped (rated P, not F) |
| SWE.1.BP7 | Identify the content of the software product release notes | Specify which requirements will be covered in which software release, providing input to release planning so stakeholders know what will be delivered and when. | Release plan or sprint plan showing requirement allocation to releases; requirements marked with a "target release" attribute | Release planning exists as a schedule, but requirements are not explicitly tagged to releases, so the assessor cannot determine which requirements are in scope for the release under assessment |
| SWE.1.BP8 | Ensure agreement and communicate requirements | Requirements must be formally agreed with all affected parties (OEM, system engineers, architects, test team) and distributed. "Communicated" means acknowledged receipt, not just emailed. | Review sign-off page signed by the relevant stakeholders; distribution records; minutes from requirements reviews attended by external parties | "We sent the SRS to the customer" without documented acknowledgment or formal approval; internal distribution also missing, e.g. test engineers never formally received the SRS they are supposed to use |