
SWE.1 Purpose, Scope & Outcomes

Official Purpose Statement (ASPICE v3.1/v4.0): "The purpose of the Software Requirements Analysis Process is to transform the software-related parts of the system requirements into a set of software requirements."

SWE.1 sits at the top of the left leg of the software V-model. It is the first process assessors evaluate because failures here cascade through the entire SWE chain. If requirements are missing, ambiguous, or untraceable, every downstream process (architecture, design, testing) is building on a broken foundation. SWE.1 is the #1 source of CL1 partial ratings in automotive supplier assessments.

📋 Learning Objectives

  • Recite SWE.1's eight Base Practices and the specific work products each requires
  • Write requirements that satisfy ASPICE's testability, consistency, and traceability criteria
  • Set up a traceability matrix linking SWE.1 output (SRS) to SYS.2 system requirements
  • Conduct and document a requirements review that satisfies GP 2.2.4
  • Identify and fix the five most common SWE.1 findings in real assessments

SWE.1 Process Outcomes (PRM Level)

The six process outcomes define what "SWE.1 done right" looks like from the PRM perspective. These are what assessors ultimately verify, even though they collect evidence at the BP level:

| Outcome ID | Statement | Assessed Via |
|---|---|---|
| O1 | Software requirements are defined and documented | SRS exists with structured, identified requirements |
| O2 | Software requirements are consistent with system requirements | Traceability matrix SRS ↔ System Tech. Spec.; no orphaned requirements |
| O3 | Software requirements are evaluated for testability | Verification method assigned; test-unfriendly requirements flagged and resolved |
| O4 | Software requirements are prioritized | Priority or must-have/should-have attribute on each requirement |
| O5 | Impact of proposed changes to SW requirements is analyzed | Change request records with impact analysis before re-baselining the SRS |
| O6 | SW requirements are agreed and communicated to affected parties | Review sign-off records; distribution/notification log |

The 8 Base Practices - Complete Reference

These are the definitive BP definitions from the ASPICE PAM v3.1. During an assessment, each BP is rated individually. A BP rated "P" (Partially achieved) can keep PA 1.1 from reaching "F" (Fully achieved), which blocks CL1.

| BP | Name | What Assessors Check | Required Work Products | Common Failure Mode |
|---|---|---|---|---|
| BP1 | Specify Software Requirements | Functional and non-functional requirements are defined. Each requirement covers: function, input, output, timing, accuracy, interfaces. Non-functional covers performance, resource constraints, safety constraints, regulatory compliance. | Software Requirements Specification (SRS) | Only functional requirements documented; performance, memory, and timing requirements absent or vague ("shall be fast") |
| BP2 | Structure Software Requirements | Every requirement has a unique ID (e.g., SRS-042). Requirements are grouped logically (by feature, interface, or system mode). Attributes defined: status, source, priority, ASIL/QM, verification method. | SRS with ID scheme and attribute structure | Sequential numbering without hierarchy; no attributes; no classification of safety requirements |
| BP3 | Analyze Software Requirements | Requirements analyzed for completeness (all SYS.2 items covered), consistency (no contradictions), feasibility (implementable on target HW), correctness (no technical errors). Analysis results are documented. | Review records, analysis results, requirement status report | Analysis done verbally in meetings; no written analysis records; feasibility assessment missing |
| BP4 | Analyze Impact on Operating Environment | Software requirements are analyzed for their effects on other system elements - other ECUs, the communication bus (CAN load), HW resources (flash, RAM, CPU cycles), mechanical systems. External interfaces to other ECUs documented. | Interface control document, updated SRS with interface requirements, communication matrix | ECU-internal requirements documented, but external interface and bus load impacts not analyzed |
| BP5 | Develop Verification Criteria | For each software requirement, the method of verification is defined: Test (most common), Analysis, Inspection, or Demonstration. Acceptance criteria are quantified - not "shall be verified by test" but "test case TC-042 shall confirm voltage output = 5.0V ± 0.1V at load 100–500mA". | Verification method matrix, or verification column in the SRS attribute table | All requirements assigned "T" (Test) without specifying what the test will verify; no acceptance criteria; untestable requirements not flagged |
| BP6 | Ensure Consistency and Bidirectional Traceability | Every SW requirement traces to at least one SYS.2 system requirement. Every SW-relevant SYS.2 requirement has at least one SWE.1 requirement covering it. No orphans in either direction. Traceability is maintained across requirement changes. | Traceability matrix (SRS ↔ STS/SRS-sys), or DOORS/Polarion/Jama link table | Traceability matrix exists but is stale - updated at project kickoff and never maintained as requirements evolved; 20–30% of requirements have broken links |
| BP7 | Identify Content of Software Release Notes | Which requirements will be implemented in which release of the software. Release planning maps requirements to software release versions. The delta between releases is documented. | Release plan, software release notes template, feature-to-release mapping | Release planning done informally or only in Jira tickets, without explicit requirement-to-release traceability |
| BP8 | Ensure Agreement and Communicate Requirements | Requirements are formally reviewed by all affected parties (architecture team, test team, HW team, safety team if ASIL-rated). The review outcome (approve/reject/conditional) is documented. Requirements are distributed to all development stakeholders. | Review sign-off record, distribution log, meeting minutes with attendees and version reviewed | Requirements distributed over email with no acknowledgment record; review meeting held, but minutes only say "SRS reviewed and approved" without an attendee list or version number |

Writing ASPICE-Compliant Requirements

The most common source of SWE.1 findings is not a missing document - it is a document full of requirements that do not meet ASPICE's quality criteria. These are the rules that separate requirements that pass assessment from those that generate findings.

The SMART-V Requirement Template

A compliant software requirement must be: Singular (one requirement per ID), Measurable (quantified acceptance criteria), Attainable (feasible on the target HW), Relevant (traces to a system-level need), Traceable (has upstream and downstream links), Verifiable (a concrete test can confirm it).
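Several of these criteria can be checked mechanically to a first approximation. Below is a minimal linter sketch in Python; the vague-term list, the unit regex, and the singularity heuristic are illustrative assumptions, not an official ASPICE rule set:

```python
import re

# Illustrative red-flag terms that usually indicate a non-measurable requirement.
# Naive substring matching - good enough for a first-pass sketch.
VAGUE_TERMS = {"quickly", "fast", "robust", "safe", "user-friendly",
               "appropriate", "sufficient", "as soon as possible"}

# A measurable requirement normally carries at least one quantified criterion:
# a number followed by a unit (ms, %, V, rpm, ...).
QUANTIFIER = re.compile(r"\d+(\.\d+)?\s*(ms|s|%|V|mA|rpm|°C|bytes?|kB|MB)",
                        re.IGNORECASE)

def lint_requirement(req_id: str, text: str) -> list[str]:
    """Return a list of findings for one requirement (empty = passes these checks)."""
    findings = []
    lowered = text.lower()
    for term in sorted(VAGUE_TERMS):
        if term in lowered:
            findings.append(f"{req_id}: vague term '{term}' - not measurable")
    if not QUANTIFIER.search(text):
        findings.append(f"{req_id}: no quantified acceptance criterion found")
    # Crude singularity heuristic: multiple 'shall' clauses joined by 'and'.
    if " and " in lowered and lowered.count(" shall ") > 1:
        findings.append(f"{req_id}: possibly not singular - consider splitting")
    return findings

print(lint_requirement("SRS-010", "The system shall respond quickly to user inputs."))
print(lint_requirement("SRS-010a", "The BCM shall respond to CAN 0x18F within 50ms."))
```

A linter like this cannot prove a requirement is good, but it reliably catches the "shall be fast" class of findings before an assessor does.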

Good vs. Bad Requirements - Side-by-Side

| ID | ❌ Non-Compliant Requirement | Problem | ✅ Compliant Rewrite |
|---|---|---|---|
| SRS-010 | "The system shall respond quickly to user inputs." | Not measurable - "quickly" has no quantified criterion. Not verifiable. | "The BCM shall respond to a door unlock CAN message (0x18F) within 50ms (95th percentile over 1000 test cycles) at ambient temperatures -40°C to +85°C." |
| SRS-011 | "The software shall be safe." | Not measurable, not singular, and not a software requirement - safety is a system attribute derived via ISO 26262 hazard analysis. | Decompose into: "SRS-011a: The software shall detect sensor signal loss within 20ms and transition to fallback mode (safe state as defined in SS-042). [ASIL-B] SRS-011b: The fallback activation shall be logged as DTC P0571 via UDS DEM." |
| SRS-012 | "The ECU shall support CAN communication." | Not specific - which CAN ID? Which message? What data length? What cycle time? No verification criteria. | "The EMS shall transmit CAN message ID 0x3E9 (Engine Status) on the CAN-HS bus at 10ms cycle ±1ms jitter. Payload shall contain: byte 0 = engine speed (0.5 rpm/LSB), bytes 1–2 = coolant temp (0.1°C/LSB, offset -40°C). [Trace: STS-089]" |
| SRS-013 | "The system shall store fault codes." | Combines multiple concerns (store, what codes, when, how long, how many). Not singular. | Split into: "SRS-013a: The DEM shall support storage of up to 20 DTCs in NVM. SRS-013b: A DTC shall be stored when the corresponding DEM event is confirmed (2 consecutive driving cycles). SRS-013c: DTCs shall persist through ECU power cycle. SRS-013d: DTCs shall be clearable via UDS Service 0x14." |

Non-Functional Requirement Categories - Don't Skip These

ASPICE BP1 explicitly requires non-functional requirements. These are the categories assessors check for completeness:

  • Timing & Performance: Worst-case task execution times (WCET), response time targets, interrupt latency budgets, CAN bus load limits (≤40% recommended for HS-CAN)
  • Memory: Flash memory budget (with margin), RAM allocation per task/module, stack depth limits
  • Reliability/Availability: Allowed MTBF, diagnostic coverage targets (for ISO 26262 ASIL requirements)
  • Interface Compatibility: Baud rates, voltage levels, communication matrix compliance, bootloader interface spec
  • Standards Compliance: Which AUTOSAR version, which UDS services, which OBD-II PIDs - referenced by document and version
  • Calibration: Which parameters are calibratable, their ranges, their default values
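The CAN bus load budget from the first bullet can be estimated directly from the communication matrix. The sketch below assumes classic CAN 2.0A frames with 11-bit identifiers and uses a common worst-case approximation for frame size and bit stuffing; the message set and bitrate are hypothetical:

```python
# Worst-case classic CAN 2.0A (11-bit ID) frame size approximation:
# 47 fixed bits + 8 bits per data byte, plus worst-case bit stuffing
# (roughly one stuff bit per 4 bits of the stuffable region of 34 + 8*dlc bits).
def frame_bits(dlc: int) -> int:
    return 47 + 8 * dlc + (34 + 8 * dlc - 1) // 4

def bus_load_percent(messages, bitrate_bps: float) -> float:
    """messages: iterable of (dlc_bytes, cycle_time_s) pairs."""
    bits_per_second = sum(frame_bits(dlc) / cycle for dlc, cycle in messages)
    return 100.0 * bits_per_second / bitrate_bps

# Hypothetical HS-CAN (500 kbit/s) message set: (DLC, cycle time in seconds)
msgs = [(8, 0.010), (8, 0.010), (8, 0.020), (4, 0.100)]
load = bus_load_percent(msgs, 500_000)
print(f"Estimated worst-case bus load: {load:.1f}%")  # compare against the ≤40% budget
```

Running this kind of estimate against the communication matrix during BP4 analysis is cheap evidence that bus-load impact was actually considered.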

⚠️ Assessment Reality

Assessors will open your SRS and sample 10–15 requirements at random. For each one, they will check: Does it have an ID? Is it measurable? Can it be tested? Does it trace to a system requirement? Does it have a verification method assigned? If 3 out of 10 sampled requirements fail any of these - expect a Major Finding on SWE.1 BP1 or BP5, and PA 1.1 will not achieve "F".

Bidirectional Traceability - Setup & Maintenance

BP6 (bidirectional traceability) is the single most failed BP in SWE.1 assessments. Understanding exactly what it requires - and how to set it up correctly - is essential.

What "Bidirectional" Actually Means

Traceability has two directions, and both must be complete:

  • Downward (Derivation): Every SWE.1 software requirement must point to the SYS.2 system requirement(s) it was derived from. This proves your SRS is grounded in customer/system needs.
  • Upward (Coverage): Every SW-relevant SYS.2 requirement must have at least one SWE.1 requirement covering it. This proves nothing was dropped in translation.

An orphaned SWE.1 requirement (one with no upstream SYS.2 link) is called a derived requirement. Derived requirements are allowed - they represent implementation decisions not mandated by the customer - but they must be explicitly labeled as derived and justified. An unlabeled orphan looks like a traceability gap to an assessor.
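Both directions can be checked mechanically. A minimal sketch, assuming link data has been exported from the requirements tool as a dictionary; all IDs are hypothetical, and a "DERIVED" marker stands in for an explicit derived-requirement label:

```python
# Hypothetical link export: SRS-ID -> set of upstream STS-IDs.
# "DERIVED" marks a justified derived requirement with no upstream link.
srs_links = {
    "SRS-001": {"STS-042"},
    "SRS-002": {"STS-089"},
    "SRS-015": {"STS-103", "STS-104"},
    "SRS-028": {"DERIVED"},   # labeled derived - allowed
    "SRS-099": set(),         # unlabeled orphan - a finding
}
sts_sw_relevant = {"STS-042", "STS-089", "STS-103", "STS-104", "STS-200"}

# Downward direction: every SRS item needs an upstream link or a Derived label.
unlabeled_orphans = [rid for rid, ups in srs_links.items() if not ups]

# Upward direction: every SW-relevant STS item needs at least one covering SRS item.
covered = set().union(*srs_links.values()) - {"DERIVED"}
uncovered_sts = sorted(sts_sw_relevant - covered)

print("Unlabeled SRS orphans:", unlabeled_orphans)  # -> ['SRS-099']
print("Uncovered STS items:", uncovered_sts)        # -> ['STS-200']
```

Running such a check on every baseline (or every CR closure) is what keeps both traceability directions complete as requirements evolve.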

Traceability Matrix Structure (Minimum Viable)

If you are not using a requirements management tool, a well-structured Excel matrix satisfies ASPICE for projects up to ~300 requirements:

| SRS-ID | Requirement Text (Short) | Type | ASIL | Priority | Verification Method | ↑ STS-ID (Upstream) | ↓ SWE.2-Comp (Downstream) | ↓ SWE.6-TC (Downstream) | Status |
|---|---|---|---|---|---|---|---|---|---|
| SRS-001 | BCM unlock response ≤50ms | Functional/Timing | QM | Must | Test | STS-042 | BCM_DoorCtrl | TC-BCM-001 | Approved |
| SRS-002 | CAN message 0x3E9 at 10ms cycle | Interface | QM | Must | Test | STS-089 | COM_BusStack | TC-COM-014 | Approved |
| SRS-015 | Fallback mode activation on sensor loss | Safety | ASIL-B | Must | Test + Analysis | STS-103, STS-104 | SafetyMgr | TC-SAFE-007 | Approved |
| SRS-028 | NVM read retry max 3 attempts | Robustness | QM | Should | Test | [Derived - design decision] | NVM_Handler | TC-NVM-003 | Approved |

Key rules: every row must have a value in the upstream column (STS-ID or [Derived - justification]). Every row must have a downstream link to a test case before the SWE.6 qualification test phase. Empty cells = traceability gaps = findings.
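The "empty cells = gaps" rule can be enforced with a trivial scan over the exported matrix rows. A sketch, assuming the Excel matrix is exported as one dict per row with the column names shown below (all IDs hypothetical):

```python
# Excel-style matrix exported as rows of dicts (e.g., via csv.DictReader).
rows = [
    {"SRS-ID": "SRS-001", "Upstream": "STS-042", "SWE.6-TC": "TC-BCM-001"},
    {"SRS-ID": "SRS-028", "Upstream": "[Derived - design decision]", "SWE.6-TC": "TC-NVM-003"},
    {"SRS-ID": "SRS-031", "Upstream": "", "SWE.6-TC": ""},   # two gaps
]

def traceability_gaps(rows):
    """Flag every empty upstream or downstream-test cell as a gap."""
    gaps = []
    for row in rows:
        if not row["Upstream"].strip():
            gaps.append(f"{row['SRS-ID']}: empty upstream cell")
        if not row["SWE.6-TC"].strip():
            gaps.append(f"{row['SRS-ID']}: no downstream test case")
    return gaps

print(traceability_gaps(rows))
```

Even for a spreadsheet-based matrix, a scripted gap scan like this can run as part of the weekly status metric instead of relying on manual inspection.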

Toolchain Options

| Tool | Scale | ASPICE Coverage | Notes |
|---|---|---|---|
| Excel / Google Sheets | <300 req | Manual - works, but the maintenance burden is high | OK for small projects; traceability breaks under change management pressure |
| IBM DOORS Classic | Any | Native link architecture; supports both directions; formal baselines; DXL scripting for custom reports | Industry standard in Tier-1; steep learning curve; expensive |
| IBM DOORS Next (DNG) | Any | Full bidirectional links; module/artifact traceability | Web-based; better UX than DOORS Classic; integrates with RTC/EWM |
| Polarion ALM | Any | Full traceability + test management + CM integration | Strong in European Tier-1s; good ASPICE-specific workflows available |
| Jama Connect | Medium–Large | Bidirectional links; suspect links on change; coverage reports | Popular in North American automotive; integrates with Jira |
| Jira + Xray | Small–Medium | Partial - Jira natively lacks formal upstream traceability; Xray adds test linkage | Viable for CL1–CL2 if structured carefully; assessors are sometimes skeptical |

🔍 The Stale Matrix Problem

The most common SWE.1 finding in projects with requirements tools: the traceability matrix was set up at project kickoff, never maintained through change requests, and by assessment time 25–40% of links are stale or broken. The fix is process, not tooling: every CR (SUP.10) must include a traceability update step as a mandatory gate before the CR is closed.
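The "suspect link" mechanism that the fix relies on can be approximated even without a tool: store a hash of the upstream requirement text when a link is created, and flag the link whenever the current text no longer matches. A minimal sketch with hypothetical IDs and requirement texts:

```python
import hashlib

def digest(text: str) -> str:
    """Short content hash of a requirement's text."""
    return hashlib.sha256(text.encode()).hexdigest()[:8]

# Each trace link stores the hash of the upstream requirement text at link time.
links = {
    "SRS-001": {"sts_id": "STS-042",
                "sts_hash_at_link": digest("Unlock doors on request")},
}
# Current STS baseline - the text was changed by a CR since the link was made.
sts_current = {"STS-042": "Unlock doors on request within 100ms"}

def suspect_links(links, sts_current):
    """Flag links whose upstream text changed since the link was created."""
    return [srs for srs, link in links.items()
            if digest(sts_current[link["sts_id"]]) != link["sts_hash_at_link"]]

print(suspect_links(links, sts_current))  # -> ['SRS-001']: re-confirm before CR closure
```

Tools like Jama and DOORS implement this natively as suspect-link flagging; the point of the sketch is that the same gate can be scripted into the CR checklist for spreadsheet-based projects.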

Requirements Review - Meeting CL2 GP 2.2.4

The requirements review is where CL2 GP 2.2.4 (Work Product review) is demonstrated for the SRS. A verbal "we checked it" does not satisfy this Generic Practice. Here is the exact evidence structure assessors expect.

Required Review Record Components

| Component | Why Required | Example |
|---|---|---|
| Review ID & Date | Traceability to a specific review event | REV-SRS-004, 2024-03-15 |
| Document/Version Reviewed | Proves the reviewed version is known - not "we reviewed the general idea" | SRS_BCM_v2.3.docx, SHA: a4f9b2c |
| Attendees (name + role) | Demonstrates cross-functional coverage (arch, test, safety, customer if applicable) | K. Müller (SWE.1 owner), T. Singh (SWE.2), P. Novak (SWE.6 test), J. Park (Safety) |
| Review Checklist Used | Proves systematic rather than ad-hoc review; checklist items per ASPICE criteria | Attach completed checklist: IDs? Measurable? Traceable? Testable? Consistent? |
| Issues Found (numbered list) | Evidence the review was actually done - zero issues is a red flag for assessors | ISSUE-001: SRS-042 lacks timing criterion. ISSUE-002: SRS-088 has no upstream trace. ... |
| Issue Disposition | Every issue must be resolved before the WP is re-baselined; proves the review loop closed | ISSUE-001 → Fixed in SRS v2.4 by adding "≤50ms at 85°C". Status: Closed. |
| Review Decision | Formal outcome: Approved / Approved with conditions / Rejected | "Approved conditional on closure of ISSUE-001 and ISSUE-002. Re-review not required." |
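A record with these components can also be gated mechanically before the review is closed. A minimal sketch; the field names and gate rules are illustrative, not a prescribed GP 2.2.4 schema:

```python
from dataclasses import dataclass, field

@dataclass
class ReviewRecord:
    review_id: str
    date: str
    document_version: str
    attendees: list                               # (name, role) pairs
    issues: list = field(default_factory=list)    # (issue_id, status) pairs
    decision: str = ""

    def gate_check(self) -> list:
        """Findings that would block closing this review under GP 2.2.4."""
        problems = []
        if not self.document_version:
            problems.append("no document version recorded")
        if not self.attendees:
            problems.append("no attendee list")
        open_issues = [i for i, status in self.issues if status != "Closed"]
        if self.decision.startswith("Approved") and open_issues:
            problems.append(f"approved with open issues: {open_issues}")
        return problems

rec = ReviewRecord("REV-SRS-004", "2024-03-15", "SRS_BCM_v2.3",
                   attendees=[("K. Müller", "SWE.1 owner")],
                   issues=[("ISSUE-001", "Closed"), ("ISSUE-002", "Open")],
                   decision="Approved")
print(rec.gate_check())  # -> ["approved with open issues: ['ISSUE-002']"]
```

Wiring a check like this into the milestone gate makes "review record incomplete" findings mechanically impossible, rather than a matter of discipline.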

Review Checklist - ASPICE SWE.1 Criteria

The following checklist, completed per review, directly maps to the ASPICE SWE.1 BPs. Use it as a template:

  • ☐ Every requirement has a unique ID (BP2)
  • ☐ Every requirement is singular - one requirement per ID (BP2)
  • ☐ Functional requirements are unambiguous - no "shall be fast", "shall be safe" (BP1)
  • ☐ Non-functional requirements cover timing, memory, interfaces, and regulatory constraints (BP1)
  • ☐ Every requirement has a verification method assigned (BP5)
  • ☐ Every requirement has acceptance criteria quantified (BP5)
  • ☐ Every requirement traces to a SYS.2 requirement or is explicitly labeled as Derived (BP6)
  • ☐ No SYS.2 SW-relevant requirement is uncovered (BP6 - coverage direction)
  • ☐ ASIL/safety classification applied to all safety-relevant requirements (BP2)
  • ☐ Interface requirements reference the communication matrix version (BP4)
  • ☐ Requirements analyzed for feasibility against target HW spec (BP3)
  • ☐ Release mapping defined for all requirements (BP7)

CL2 for SWE.1 - Generic Practices Applied

CL2 requires PA 2.1 (Performance Management) and PA 2.2 (Work Product Management) on top of a fully achieved CL1. Here is what each Generic Practice looks like concretely for SWE.1:

| Generic Practice | Concrete SWE.1 Evidence | Typical Gap |
|---|---|---|
| GP 2.1.1 - Process objectives identified | Project quality plan states: "SWE.1 shall deliver a reviewed, baselined SRS covering 100% of SW-relevant STS items by milestone M2 (2024-04-30)." | Objectives are generic ("write requirements") with no metric, milestone, or coverage target |
| GP 2.1.2 - Performance planned | Project schedule shows SRS activities: requirements elicitation (W1–W4), internal review (W5), rework (W6), customer review (W7), baseline (W8). Resource assignment per activity. | Plan exists for the delivery milestone only - no activity-level schedule for requirements engineering work |
| GP 2.1.3 - Performance monitored | Weekly status report shows: requirements written/total (e.g., 147/200 = 73.5%), open issues from last review (12 open, 8 closed this week), delta to plan (3 days behind on elicitation). | Status tracked in the project manager's head; no documented actuals-vs.-planned record |
| GP 2.1.4 - Roles assigned | RACI: SRS author = K. Müller; SRS reviewers = T. Singh + J. Park (safety); SRS approver = project lead; SRS CM custodian = configuration manager. | Everyone knows their role informally, but no document records it; when the assessor asks "who is responsible for SRS traceability?", answers differ between team members |
| GP 2.2.1 - WP content requirements defined | SRS template exists with mandatory sections: scope, definitions, functional requirements (ID + text + attribute table), non-functional requirements, interface requirements, traceability appendix. | No template; each engineer writes the SRS in their own style with a different structure and missing sections |
| GP 2.2.2 - WP documentation requirements defined | CM plan states: SRS is a controlled document, stored in /ProjectX/SRS/ in Polarion, versioned with semantic versioning (major.minor.patch), baselined at M1, M2, and final release. | SRS stored on a shared drive with no version control; overwritten in place with no history |
| GP 2.2.3 - WP identified & controlled | Polarion shows SRS version history: v1.0 → v1.1 → v2.0 with date, author, change reason. Baseline M2 tagged. Current working version clearly distinguished from the baseline. | Multiple "final" versions: SRS_v3_FINAL_reviewed_corrected2.docx - no CM discipline |
| GP 2.2.4 - WP reviewed | Review record REV-SRS-004 showing version reviewed, attendees, 23 issues found, all dispositioned, SRS re-baselined as v2.1 after closure. (See review template above.) | Review happened, but the only note is "SRS reviewed in project meeting 15.03" - no version, no issue list, no attendees recorded |

Top 5 SWE.1 Assessment Findings - With Fixes

| # | Finding | BP/GP Failed | Root Cause | Fix |
|---|---|---|---|---|
| 1 | Requirements are not measurable: "shall be robust," "shall be fast," "shall support communication" | BP1, BP5 | Requirements written by system engineers with domain knowledge but no training in ASPICE-compliant requirement writing | Mandatory requirements-writing training; introduce a peer-review checklist that blocks non-measurable requirements from advancing; require acceptance criteria for every requirement before SRS review |
| 2 | Bidirectional traceability incomplete: 30%+ of SRS requirements have no upstream STS link or are unlabeled as derived | BP6 | Traceability matrix set up at project start, never maintained during engineering changes; no CM step requiring a traceability update in the change process | Add traceability update as a mandatory gate in the CR closure checklist (SUP.10); weekly traceability coverage metric in status reporting; use a tool with "suspect link" functionality that highlights broken links on change |
| 3 | SRS review not documented: no review record, no issue list, no version tracked | GP 2.2.4 | Review held informally in a design meeting; minutes not structured to capture review-specific data | Introduce a standard review record template (as above); make review record creation a milestone gate; train project leads that informal review minutes do not satisfy GP 2.2.4 |
| 4 | Non-functional requirements missing: timing, memory, and interface constraints not in the SRS | BP1, BP4 | Engineers focus on functional behavior; non-functional requirements are "in the head" of experienced engineers and never written down | Add dedicated non-functional requirement sections to the SRS template with mandatory subsections (timing budget, memory budget, interface specification per bus); validate completeness against the HW spec sheet during BP3 analysis |
| 5 | SRS version control absent: document overwritten, no history, no baseline at milestones | GP 2.2.2, GP 2.2.3 | Document management done through file shares; only code in Git; no discipline around document CM | Either extend the CM tool (e.g., a requirements management tool with versioning) or enforce Git for SRS documents; define explicit baselines at project milestones in the CM plan and enforce them through milestone review gates |

✅ SWE.1 CL2 Readiness Checklist

  • ✅ SRS template with mandatory sections exists and is used
  • ✅ Every requirement has a unique ID, is measurable, and has acceptance criteria
  • ✅ Non-functional requirements cover timing, memory, interfaces, and compliance
  • ✅ 100% of SW-relevant STS items covered; 100% of SRS items have upstream link or Derived label
  • ✅ Downstream trace to SWE.2 components and SWE.6 test cases established
  • ✅ Formal review record with version, attendees, issues, dispositions, and sign-off
  • ✅ SRS under CM: versioned, baselined at M2, history auditable
  • ✅ Project plan has SWE.1 milestones, resource assignments, and coverage targets
  • ✅ Weekly status tracking documented with actuals vs. planned requirements count

What's Next

Continue to SWE.2 - Software Architectural Design, where requirements allocated in SWE.1 are decomposed into software components with defined interfaces, static and dynamic views, and the traceability chain is extended downward from SRS to architectural components.
