
SWE.3 Purpose, Scope & Outcomes

Official Purpose Statement (ASPICE v3.1): "The purpose of the Software Detailed Design and Unit Construction Process is to provide an evaluated detailed design for the software units and to produce software units."

SWE.3 is the third step on the left leg of the V-model and is unique in that it combines a design activity (detailed design) with an implementation activity (unit construction - i.e., writing code). This dual nature makes it the most code-intensive ASPICE process and the one where design documentation most frequently lags behind implementation in real projects.

📋 Learning Objectives

  • Explain the difference between SWE.2 architectural design and SWE.3 detailed design in terms of required specificity
  • Describe the six SWE.3 Base Practices and the artifacts each produces
  • Apply static analysis to source code and document the results in ASPICE-compliant form
  • Set up coding guidelines and apply them consistently across a software unit
  • Trace source code units back to the SWE.2 architectural components they implement

SWE.3 Process Outcomes

| Outcome | Statement | Assessed Via |
|---|---|---|
| O1 | A detailed design for the SW units is developed | Detailed design document (pseudocode, flowcharts, data structure specs) sufficient to guide implementation without additional design decisions |
| O2 | SW units are produced | Source code exists; compiles; code review records show the implementation was evaluated |
| O3 | Consistency and bidirectional traceability between SW architectural design and SW units is established | Traceability: SAD component → SW unit source files, and the reverse |
| O4 | The detailed design is agreed and evaluated | Design review/code review records with version, attendees, findings, dispositions |

The 6 Base Practices - Complete Reference

| BP | Name | What Assessors Check | Work Products | Common Failure |
|---|---|---|---|---|
| BP1 | Develop SW detailed design | The detailed design goes below the architectural level: internal data structures, algorithm logic, error handling per code path, local state variables. It is specific enough that two different developers could independently produce interchangeable implementations. Assessors check: does the detailed design explain how the component works, not just what it does? | Software Detailed Design Document (SDDD), pseudocode, structured flowcharts, Doxygen-annotated header files (if used as design specification) | Detailed design is a copy of the architectural description with no additional detail; "the implementation defines the design" - code written first, design never produced |
| BP2 | Evaluate the detailed design | The detailed design is reviewed before coding (or at latest concurrently). The review must cover: correctness of logic, consistency with the SWE.2 architecture, completeness of error handling, compliance with coding guidelines, and feasibility within resource budgets. Review findings are documented and resolved. | Design review record (version reviewed, attendees, findings, status) | Code review conducted post-implementation is treated as the "design review" - by then the design is the code, and the review loses its purpose |
| BP3 | Produce SW units | Source code is developed according to the detailed design and coding guidelines. Code compiles without errors and without suppressed warnings. File headers contain: component ID, version, author, date, change history. The code implements the detailed design - no unexplained deviations. | Source code files (C/C++/Ada/etc.) in version control with complete history | Source code exists but has no file headers, no change history, suppress-all-warnings compiler flags enabled, or significant deviations from the detailed design with no documented reason |
| BP4 | Apply coding guidelines | A defined coding guideline (MISRA-C, AUTOSAR C++ Coding Guidelines, or project-specific) is applied. Compliance is verified by static analysis tools (not just manual review). Exceptions to guidelines are documented, justified, and approved - not silently suppressed. | Coding guideline document; static analysis configuration and reports; documented rule deviations (deviations register) | MISRA-C adopted as guideline but static analysis shows hundreds of violations; violations suppressed without review; no deviations register; "we use MISRA as a reference, not a mandate" |
| BP5 | Perform static analysis | Static analysis tools (LDRA, QA-C, PC-lint, Polyspace, Coverity, etc.) are run on all source code. Results are reviewed: violations are classified as genuine findings or accepted deviations; genuine findings are fixed; accepted deviations are justified. A static analysis report is produced. The goal is not zero violations - it is that all violations are accounted for. | Static analysis tool report; violation review records; deviations register | Static analysis run as a formality at project end; hundreds of violations with no systematic review; results not linked to the coding guideline; violations in safety-critical code not prioritized |
| BP6 | Establish bidirectional traceability | Every SW unit (source file or module) traces to the SWE.2 architectural component it implements. Every SWE.2 component has at least one SW unit implementing it. In addition, the detailed design document traces to the architectural component, and the source code traces to the detailed design. | Extended traceability matrix: SAD Component → SDDD section → Source file(s) | Traceability matrix stops at the SWE.2 component level; no link from component to source file; "everyone knows BCM_DoorCtrl.c implements the DoorCtrl component" is not documented evidence |
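BP3's file-header expectation (component ID, version, author, date, change history) can be sketched directly in C. Everything in the sketch below is a hypothetical illustration - the component name, SAD reference, CR numbers, author, and the custom Doxygen tags (which would need to be defined via Doxygen ALIASES) are all assumptions, not a mandated format:

```c
/**
 * @file       BCM_DoorCtrl.c
 * @brief      Door control logic for the Body Control Module
 * @component  BCM_DoorCtrl   (implements SAD-3.2; all IDs hypothetical)
 * @version    01.03.00
 * @author     J. Doe
 * @date       2024-03-14
 *
 * Change history:
 *  01.03.00  2024-03-14  J. Doe   CR-042: fix debounce counter reset logic
 *  01.02.00  2024-02-01  J. Doe   CR-031: add sensor fault handling
 *  01.01.00  2024-01-10  J. Doe   CR-020: initial implementation
 */

/* Version macros let build scripts and integrators check the header
   against the released baseline. */
#define BCM_DOORCTRL_SW_MAJOR_VERSION 1u
#define BCM_DOORCTRL_SW_MINOR_VERSION 3u
```

Keeping the change history in the header (in addition to version control) gives an assessor one place to see that every code change traces to an approved change request.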

What Detailed Design Actually Means at the Unit Level

The most common SWE.3 misunderstanding: "detailed design = UML class diagram." In embedded automotive C development, detailed design is typically expressed through three complementary artifacts:

1. Function-Level Pseudocode / Structured Description

For each non-trivial function, a design description is produced before implementation. The description must include:

  • Function purpose (one sentence)
  • Algorithm steps (numbered, structured, pseudocode level - language-independent)
  • Decision points and their conditions (if/else branches with guard conditions explicitly stated)
  • Error handling paths (what happens when each error condition occurs)
  • Local data structures used (type, purpose, valid range)

Example for a sensor fault detection function:

Function: SafetyMgr_CheckSensorHealth()
Purpose: Monitor all safety-relevant sensors for out-of-range signals and trigger the safe state if a fault persists beyond the debounce threshold
Algorithm:
  1. For each sensor ID in SafeSensor_Table (SENSOR_TEMP, SENSOR_PRESS, SENSOR_SPEED):
  2.   Read raw ADC value via Sensor_GetRaw(sensorId)
  3.   If value < MIN_VALID[sensorId] OR value > MAX_VALID[sensorId]:
  4.     Increment FaultCounter[sensorId]
  5.     If FaultCounter[sensorId] >= FAULT_DEBOUNCE_COUNT (3 cycles):
  6.       Report DEM event SENSOR_FAULT_EVENT[sensorId]
  7.       Trigger SafeState_Request(REASON_SENSOR_FAULT)
  8.   Else: Reset FaultCounter[sensorId] to 0
Error handling: If Sensor_GetRaw() returns SENSOR_COMM_ERROR: treat as a fault (worst case); increment FaultCounter; do NOT reset on the next cycle without a confirmed valid reading
Constraints: WCET ≤ 100 µs; called from 10 ms cyclic task; non-reentrant
Trace: SRS-015 (sensor fault detection), SRS-016 (safe state activation), SAD: SafetyMonitor component
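A design description at this level of detail translates almost mechanically into code. The sketch below is one possible C implementation of the design above - the driver stub, the omitted DEM call, and all numeric thresholds are illustrative assumptions; only the identifiers come from the design description:

```c
#include <stdbool.h>
#include <stdint.h>

#define NUM_SAFE_SENSORS     3u
#define FAULT_DEBOUNCE_COUNT 3u      /* cycles, per the design description */
#define SENSOR_COMM_ERROR    0xFFFFu /* assumed out-of-band driver error code */

typedef enum { SENSOR_TEMP = 0, SENSOR_PRESS = 1, SENSOR_SPEED = 2 } SensorId;

/* Illustrative valid ranges in ADC counts. */
static const uint16_t MIN_VALID[NUM_SAFE_SENSORS] = { 100u,  50u,  10u };
static const uint16_t MAX_VALID[NUM_SAFE_SENSORS] = { 900u, 950u, 800u };

static uint8_t FaultCounter[NUM_SAFE_SENSORS];
static bool    SafeStateRequested;

/* Test stub standing in for the real ADC driver. */
static uint16_t stubbedRaw[NUM_SAFE_SENSORS];
static uint16_t Sensor_GetRaw(SensorId sensorId) { return stubbedRaw[sensorId]; }

static void SafeState_Request(void) { SafeStateRequested = true; }

void SafetyMgr_CheckSensorHealth(void)
{
    for (uint8_t id = 0u; id < NUM_SAFE_SENSORS; id++)
    {
        const uint16_t raw = Sensor_GetRaw((SensorId)id);

        /* A communication error is treated as a fault (worst case), so it
           increments the counter and never resets it. */
        const bool fault = (raw == SENSOR_COMM_ERROR)
                        || (raw < MIN_VALID[id]) || (raw > MAX_VALID[id]);
        if (fault)
        {
            if (FaultCounter[id] < FAULT_DEBOUNCE_COUNT)
            {
                FaultCounter[id]++;
            }
            if (FaultCounter[id] >= FAULT_DEBOUNCE_COUNT)
            {
                /* Report DEM event SENSOR_FAULT_EVENT[id] here (omitted). */
                SafeState_Request();
            }
        }
        else
        {
            FaultCounter[id] = 0u; /* reset only on a confirmed valid reading */
        }
    }
}
```

Note how each numbered algorithm step maps to one construct in the code - that one-to-one mapping is exactly the evidence a BP1 assessor is looking for.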

2. Data Structure Specifications

Complex data structures (lookup tables, ring buffers, state databases, calibration parameter structs) must be defined at design time with: element names, data types, valid ranges, access patterns (read-only, read-write, initialized at startup), and memory section allocation.
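One common way to carry such a specification into the code is to encode it in the type and its comments. The sketch below is a hypothetical example - the struct name, field ranges, and the remark about calibration memory sections are illustrative assumptions:

```c
#include <stdint.h>

/* Design-time specification of a sensor calibration table, expressed in C.
   All names and valid ranges are hypothetical. */
typedef struct
{
    uint16_t minValidRaw;   /* ADC counts; valid range 0..1023; read-only */
    uint16_t maxValidRaw;   /* ADC counts; valid range 0..1023; read-only */
    uint8_t  debounceCount; /* cycles;     valid range 1..10;   read-only */
} SensorCalib;

/* Access pattern: read-only, initialized at startup. In a real project this
   object would typically be mapped to a dedicated calibration memory section
   (e.g., via a MemMap header) so the calibration tool can patch it without
   reflashing code. */
static const SensorCalib SensorCalib_Table[3] =
{
    { 100u, 900u, 3u },  /* SENSOR_TEMP  */
    {  50u, 950u, 3u },  /* SENSOR_PRESS */
    {  10u, 800u, 3u },  /* SENSOR_SPEED */
};
```

Declaring the table const and documenting each field's range in place keeps the data structure specification and the implementation from drifting apart.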

3. Coding Standards Compliance (MISRA-C)

For most automotive C projects, the MISRA-C:2012 standard is the required guideline. Key rules that consistently appear in static analysis violations and assessor findings:

| MISRA-C:2012 Rule | Category | Violation Pattern | ASPICE Risk |
|---|---|---|---|
| Rule 15.5 | Advisory | Function has multiple exit points; the rule allows only a single return statement at the end of the function | Low - advisory, but safety teams often mandate it |
| Rule 14.4 | Required | Non-Boolean expression used where an essentially Boolean type is required (e.g., if (ptr) instead of if (ptr != NULL)) | Medium - consistently flagged in reviews |
| Rule 11.3 | Required | Cast between pointers to different object types | High - safety-critical; pointer type punning can cause undefined behavior |
| Rule 10.1 | Required | Operand of inappropriate essential type (e.g., mixing signed and unsigned without an explicit cast) | High - leads to integer overflow in safety-critical calculations |
| Rule 17.7 | Required | Return value of a non-void function not used | High - error return codes silently ignored; missed fault detection |
| Rule 8.7 | Advisory | Functions/objects referenced in only one translation unit should have internal linkage (static) | Medium - symbol pollution; increases link time |
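Rule 17.7 is worth a concrete illustration, since ignored error returns are the pattern most directly tied to missed fault detection. The sketch below uses a simplified stand-in for the AUTOSAR Std_ReturnType (normally a uint8 typedef), and Nvm_Write with its block IDs is a hypothetical driver, not a real API:

```c
#include <stdint.h>

/* Simplified stand-in for the AUTOSAR Std_ReturnType. */
typedef enum { E_OK = 0, E_NOT_OK = 1 } Std_ReturnType;

static uint32_t faultCount = 0u;

/* Hypothetical NVM driver: blocks 0..7 exist, anything else fails. */
static Std_ReturnType Nvm_Write(uint8_t blockId)
{
    return (blockId < 8u) ? E_OK : E_NOT_OK;
}

void StoreConfig(uint8_t blockId)
{
    /* Violating form (Rule 17.7):
       Nvm_Write(blockId);  - status silently discarded, so a failed
       write is never detected. */

    /* Compliant form: check the return value and act on the error. */
    const Std_ReturnType ret = Nvm_Write(blockId);
    if (ret != E_OK)
    {
        faultCount++; /* in a real ECU: report a DEM event instead */
    }
}
```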

⚠️ Static Analysis Gotcha

Running MISRA-C analysis and suppressing all violations with /* MISRA deviation: approved */ comments (without a corresponding entry in the deviations register) is worse than not running static analysis at all. Assessors will check whether suppressions are reviewed and logged. A clean deviations register with 15 justified exceptions is far stronger evidence than a codebase with 500 unsuppressed violations or 500 silently suppressed ones.
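For illustration, a deviation that is both suppressed in code and anchored in the register might look like the sketch below. The register ID, dates, rule choice, and the PC-lint-style suppression syntax are all assumptions - each tool has its own suppression mechanism, and the point is only that the in-code comment must carry the register reference so the suppression can be audited:

```c
#include <stdint.h>

/* Reading a memory-mapped register requires an integer-to-pointer cast,
   which violates MISRA-C:2012 Rule 11.4. The suppression comment below
   carries the deviations-register ID so an assessor can trace it. */
static uint32_t MemMap_ReadReg(uintptr_t addr)
{
    /* Deviation DEV-MISRA-012 (Rule 11.4), approved 2024-03-14,
       deviations register v1.4: the cast is required for memory-mapped
       register access; callers are restricted to the peripheral map. */
    /*lint -e{923} DEV-MISRA-012 */
    return *(volatile const uint32_t *)addr;
}
```

The suppression is scoped to a single statement, justified in place, and cross-referenced - the opposite of a blanket "approved" comment with no register entry.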

Top SWE.3 Findings & How to Prevent Them

| # | Finding | BP Failed | Prevention |
|---|---|---|---|
| 1 | Detailed design does not exist - implementation preceded design; design document written retrospectively by reverse-engineering the code | BP1, BP2 | Gate: no coding until the detailed design is reviewed and approved. Use milestone reviews enforced in the project plan (GP 2.1.2, 2.1.3). Treat the design review as a CM baseline event. |
| 2 | Static analysis run but results not reviewed: tool outputs 800 violations, none triaged or resolved | BP4, BP5 | Define a maximum allowable open violation count (e.g., zero required-rule violations, fewer than 50 advisory with all reviewed). Make static analysis closure a release gate. Assign an owner for the violation review. |
| 3 | Coding guideline exists as a document nobody reads: violations everywhere, no deviations register | BP4 | Integrate static analysis into the CI/CD pipeline. Block merges that introduce required-rule violations. Treat deviations as formal change requests with approval. |
| 4 | Traceability from source code to architectural component absent or unmaintained | BP6 | Add file-level Doxygen annotations (e.g., @component BCM_DoorCtrl, @SAD-ref SAD-3.2). Validate traceability coverage in CI. |
| 5 | Source code not in version control, or in version control with no meaningful commit messages | GP 2.2.3 | Enforce Git commit conventions, e.g. [CR-042][BCM_DoorCtrl] Fix sensor debounce counter reset logic. Every commit references a CR or work item. |

✅ SWE.3 CL2 Readiness Checklist

  • ✅ Detailed design document exists with algorithm descriptions, data structures, error handling per function
  • ✅ Detailed design review completed before coding - review record with version, attendees, issues, dispositions
  • ✅ Source code compiles without errors; no blanket warning suppression
  • ✅ Coding guideline (MISRA-C:2012 or equivalent) applied; scope and mandatory rules defined
  • ✅ Static analysis configured and run; all violations triaged; required-rule violations = zero; deviations register complete
  • ✅ Source files have headers: component ID, version, trace to SAD component and SRS requirements
  • ✅ Source code under CM (Git or equivalent) with meaningful commit history traceable to CRs
  • ✅ Traceability: SAD component → SDDD section → source file(s) documented

What's Next

Continue to SWE.4 - Software Unit Verification, where the source code produced in SWE.3 is verified against the detailed design through unit testing, static analysis review closure, and coverage measurement.
