
Lab Overview & Objectives

This hands-on lab takes you through the full process capability scoring workflow for a realistic automotive ECU project scenario. Rather than abstract theory, you will work with actual evidence samples and apply the N/P/L/F rating scale to produce CL ratings - the same workflow a Competent Assessor follows in a real assessment.

📋 Lab Objectives

  • Apply the N/P/L/F rating scale to real evidence samples for SWE.1 and MAN.3
  • Distinguish Findings from Weaknesses based on evidence quality and coverage
  • Apply the CL achievement rules to produce a final CL rating for each process
  • Build a complete process capability profile for a 4-process assessment scope
  • Identify the highest-priority improvement actions from the resulting profile

⚠️ How to Use This Lab

Work through each exercise independently before reading the reference solution. The value of the lab is in practicing the rating judgment - not just reading the answer. ASPICE rating is a skill developed through repeated application, not memorization of rules.

The Project Scenario

You are assessing Project Phoenix - the development of a new Body Control Module (BCM) ECU at a Tier-1 automotive supplier, Apex Automotive Systems GmbH. The project has been running for 14 months and is in late SWE.4 (unit verification) phase, heading toward SWE.5 integration testing. The OEM customer has requested an ASPICE assessment as part of series production qualification.

Project Facts

| Parameter | Value |
| --- | --- |
| ECU | Body Control Module (BCM) - controls lighting, window actuators, central locking, rain sensor |
| Team size | 8 software engineers, 1 project manager, 1 QA engineer (part-time, 50%) |
| Development approach | V-model with 2-week status reviews; AUTOSAR Classic BSW from vendor; application SWC developed in-house |
| Requirements tool | Microsoft Excel (SRS spreadsheet, ~320 rows) |
| Version control | Git (code and documents in same repository) |
| Issue tracking | JIRA |
| Assessment scope | SWE.1, SWE.2, MAN.3, SUP.8 - all at CL2 target |
| ASPICE version | v3.1 |

Evidence Available

The following work products have been provided for assessment:

  • SRS_BCM_v2.3.xlsx - 320 requirements, columns: ID, Requirement Text, Priority, Source Chapter (free text), Verification Method
  • Architecture_BCM_v1.5.docx - Component diagram (PowerPoint embedded), component descriptions (1–2 paragraphs each), no formal interface specifications
  • SRS_Review_Meeting_Notes_2024-09-12.docx - Meeting notes from SRS review: attendees listed (4 engineers), discussion summary, 3 issues listed as "to be resolved"
  • Project_Plan_BCM_v3.xlsx - Gantt chart with phases: Requirements, Architecture, Detailed Design, Coding, Unit Test, Integration Test. No resource allocation column. Last updated 6 months ago.
  • Status_Report_Nov2024.pptx - Monthly status deck: RAG status per phase, no planned vs. actual comparison table
  • Git log - All code and documents in Git; no tags or release branches; commit history shows frequent direct pushes to main

Exercise 1: Rate SWE.1 PA 1.1

Review the SWE.1 evidence above and answer the following questions. For each BP, decide whether it is N, P, L, or F based on the evidence. Then determine the PA 1.1 rating for SWE.1.

| BP | Evidence Available | Your Rating | Reasoning |
| --- | --- | --- | --- |
| SWE.1.BP1 - Specify requirements | SRS_BCM_v2.3.xlsx with 320 requirements, text filled in for all | Your answer here | |
| SWE.1.BP2 - Structure requirements | IDs present (row numbers 1–320); no grouping or hierarchy visible in column structure | Your answer here | |
| SWE.1.BP3 - Analyze requirements | Review meeting notes exist; 3 issues listed but no disposition column or status updates | Your answer here | |
| SWE.1.BP4 - Analyze operating environment impact | No interface analysis section in SRS; no ICD exists | Your answer here | |
| SWE.1.BP5 - Develop verification criteria | Verification Method column exists; values are all "Test" for functional requirements; no acceptance criteria column; non-functional requirements have no verification method | Your answer here | |
| SWE.1.BP6 - Bidirectional traceability | Source Chapter column filled in with text like "from STS §4.2"; no machine link; 47 requirements have Source = "internal" with no STS reference | Your answer here | |
| SWE.1.BP7 - Content of release notes | No release-specific tagging in SRS; no release plan linked to requirements | Your answer here | |
| SWE.1.BP8 - Ensure agreement and communicate | Review meeting notes signed by 4 engineers; no evidence of OEM/system engineer acknowledgment; no formal distribution record | Your answer here | |

Based on your BP ratings: What is your PA 1.1 rating for SWE.1? What CL does this achieve?

[Work through this before reading the solution below.]

Exercise 2: Rate PA 2.1 & PA 2.2 for MAN.3

Using the project management evidence (Project_Plan_BCM_v3.xlsx and Status_Report_Nov2024.pptx), rate each Generic Practice for MAN.3 at CL2.

| GP | Evidence Available | Your Rating |
| --- | --- | --- |
| GP 2.1.1 - Identify performance objectives for MAN.3 | Project plan has a "Project Goals" tab with 3 high-level objectives: "On-time delivery," "Zero critical defects at SOP," "ASPICE CL2." No process-specific objectives for MAN.3 itself. | |
| GP 2.1.2 - Plan MAN.3 performance | Gantt chart shows project phases with start/end dates. Project management activities (status meetings, steering committee) not explicitly scheduled as distinct tasks. | |
| GP 2.1.3 - Monitor and adjust MAN.3 performance | Monthly status deck shows RAG status per phase. No planned vs. actual table. No documented corrective actions. One slide says "Architecture phase delayed by 2 weeks" with no follow-up action. | |
| GP 2.1.4 - Define responsibilities | Project plan has a team list tab with names and roles: PM, Lead Engineer, QA, 6 Developers. No RACI for specific management activities. | |
| GP 2.1.5 - Identify and allocate resources | Project plan has a resource column showing person-days per phase. QA shown as 50% allocation across all phases. | |
| GP 2.1.6 - Manage interfaces | No documented communication plan. Status meetings are held weekly (confirmed by calendar invites). OEM communication handled ad-hoc by PM. | |
| GP 2.2.1 - Define WP requirements for MAN.3 WPs | No template defined for status reports or project plans. Status report format varies between months. | |
| GP 2.2.2 - Define documentation/control requirements | Git used for all files. No CM plan specifying how project management documents are versioned or baselined. | |
| GP 2.2.3 - Identify and control WPs | Project plan is in Git (Excel file). Status reports in a shared folder, inconsistent file naming (some have dates, some have v1/v2). | |
| GP 2.2.4 - Review and adjust WPs | Project plan sent to OEM PM by email in Month 1; reply email says "Looks reasonable." No formal review record. Status reports not reviewed - authored and presented directly. | |

Rate PA 2.1 and PA 2.2 for MAN.3. Does MAN.3 achieve CL2? Why or why not?

Exercise 3: Classify Findings

For each observation below, classify it as: Strength, Weakness, or Finding. Justify your classification by identifying which BP or GP is affected and what impact the gap has on CL achievement.

| # | Observation | Your Classification | Affected BP/GP |
| --- | --- | --- | --- |
| 1 | "47 requirements in the SRS have Source = 'internal' with no traceability to the STS (System Technical Specification). The remaining 273 requirements have source chapter references but no machine-traceable links to specific STS requirement IDs." | | |
| 2 | "The architecture document contains a component diagram and component descriptions but no interface specifications. The lead architect explained the interface design but could not produce a documented interface specification during the interview." | | |
| 3 | "The SRS review meeting notes document 3 open issues from the September 2024 review. Follow-up JIRA tickets were created for all 3 issues. Two are now closed (with SRS update referenced). One remains open, aged 8 weeks, assigned to an engineer currently on leave." | | |
| 4 | "All code changes to main branch require a pull request reviewed and approved by at least one other engineer before merge. 100% of 247 merged PRs since project start have review comments and approval records in Git." | | |
| 5 | "The project plan was last updated 6 months ago. Actual progress has deviated from the plan (architecture delayed 2 weeks, unit testing started 3 weeks late) but these deviations are not reflected in the current plan version." | | |
| 6 | "No configuration baselines have been defined or established. All files are in Git with commit history, but no tagged release points, no baseline manifests, and no defined milestone where a baseline should be created." | | |

Exercise 4: Produce the CL Profile

Based on all your ratings from Exercises 1–3, complete the following capability profile. Assume the following additional PA ratings for processes not yet fully analyzed:

  • SWE.2: PA 1.1 = L (architecture exists but interfaces not specified - finding from Observation 2); PA 2.1 = L; PA 2.2 = P (no review records for architecture)
  • SUP.8: PA 1.1 = P (CM tool used but no baselines - finding from Observation 6); PA 2.1 = L; PA 2.2 = L

| Process | PA 1.1 | PA 2.1 | PA 2.2 | CL Achieved | Target CL | Met? |
| --- | --- | --- | --- | --- | --- | --- |
| SWE.1 | (from Ex.1) | L | P | | CL2 | |
| SWE.2 | L | L | P | | CL2 | |
| MAN.3 | F | (from Ex.2) | (from Ex.2) | | CL2 | |
| SUP.8 | P | L | L | | CL2 | |

Question: Which two processes should be the top priority for improvement, and what are the specific actions required to reach CL2?

Reference Solutions & Worked Explanations

Exercise 1 Reference Solution: SWE.1 PA 1.1 Rating

| BP | Rating | Rationale |
| --- | --- | --- |
| SWE.1.BP1 | F | 320 requirements defined with text. While quality varies, the practice of specifying requirements is Fully performed. |
| SWE.1.BP2 | P | IDs exist (row numbers) but no functional grouping or hierarchy. Row numbers are not stable identifiers - inserting or deleting a row renumbers everything below it and breaks references. Structure is insufficient for change management. |
| SWE.1.BP3 | P | Review was performed (meeting notes exist) but the 3 issues have no documented disposition status. Analysis completeness is questionable with no checklist. |
| SWE.1.BP4 | N | No interface analysis present. BCM interfaces with multiple vehicle networks (LIN for actuators, CAN for body domain) - this analysis is expected and absent. |
| SWE.1.BP5 | P | Verification method column present but generic ("Test"). No acceptance criteria. Non-functional requirements (response time, memory) have no verification method at all. |
| SWE.1.BP6 | P | Source chapter references exist but are free text, not machine-traceable. 47 requirements (15%) have no upstream trace at all. Bidirectional coverage is demonstrably incomplete. |
| SWE.1.BP7 | N | No release planning linked to requirements. No tagging or release allocation visible in SRS. |
| SWE.1.BP8 | P | Internal review sign-off present. No OEM acknowledgment or formal distribution to architects/test team documented. |

PA 1.1 Rating for SWE.1: P (Partially Achieved)

Two BPs are rated N - BP4 (interface impact analysis) and BP7 (release planning), with no evidence at all - and five are rated P, including BP6 (traceability) with 15% of requirements completely unlinked. Only BP1 is F. With seven of eight BPs below L, PA 1.1 cannot reach L. PA 1.1 = P.
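One way to sanity-check an aggregate like this is to map each BP rating to the midpoint of its ISO 33020 achievement band and average. This is purely an illustrative cross-check (the standard does not mandate a numeric aggregation formula; assessors rate on judgment), sketched here with our own function and constant names:

```python
# Nominal midpoints of the ISO 33020 rating bands:
# N = 0-15%, P = >15-50%, L = >50-85%, F = >85-100%.
MIDPOINT = {"N": 7.5, "P": 32.5, "L": 67.5, "F": 92.5}

def aggregate(bp_ratings):
    """Average the BP ratings and map the result back to a band.
    Illustrative cross-check only -- not an ASPICE-mandated formula."""
    score = sum(MIDPOINT[r] for r in bp_ratings) / len(bp_ratings)
    if score > 85:
        return "F"
    if score > 50:
        return "L"
    if score > 15:
        return "P"
    return "N"

# SWE.1 BP1..BP8 from the reference solution above
print(aggregate(["F", "P", "P", "N", "P", "P", "N", "P"]))  # P
```

For the SWE.1 ratings above, the average lands at 33.75%, squarely in the Partially-achieved band, which matches the judgment-based PA 1.1 = P.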

CL Achieved: CL0 (Incomplete). PA 1.1 must be L or F to achieve CL1. PA 1.1 = P means CL1 is not achieved, so CL0 is the result - despite significant engineering work existing.
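The strict cumulative rule applied here can be expressed compactly: to achieve CL n, every PA below level n must be F and every PA at level n must be at least L. A minimal sketch (function and grouping names are our own):

```python
RATING_ORDER = {"N": 0, "P": 1, "L": 2, "F": 3}

# Process attributes grouped by the capability level they belong to
# (CL1-CL3 shown; the pattern continues for CL4/CL5).
PA_BY_LEVEL = {1: ["PA 1.1"], 2: ["PA 2.1", "PA 2.2"], 3: ["PA 3.1", "PA 3.2"]}

def capability_level(ratings):
    """Strict cumulative rule: to achieve CL n, every PA below level n
    must be F, and every PA at level n must be at least L."""
    achieved = 0
    for level in sorted(PA_BY_LEVEL):
        pas = PA_BY_LEVEL[level]
        if any(pa not in ratings for pa in pas):
            break  # not rated at this level
        if not all(RATING_ORDER[ratings[pa]] >= RATING_ORDER["L"] for pa in pas):
            break  # this level's PAs must be L or F
        lower = [p for lv in PA_BY_LEVEL if lv < level for p in PA_BY_LEVEL[lv]]
        if not all(ratings[p] == "F" for p in lower):
            break  # all lower-level PAs must be F
        achieved = level
    return achieved

# SWE.1 from this lab: PA 1.1 = P, so CL0 despite real engineering work
print(capability_level({"PA 1.1": "P", "PA 2.1": "L", "PA 2.2": "P"}))  # 0
# MAN.3: PA 1.1 = F, but PA 2.1 = P and PA 2.2 = N, so CL1
print(capability_level({"PA 1.1": "F", "PA 2.1": "P", "PA 2.2": "N"}))  # 1
```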

🔍 The Surprise Finding

This is a common shock for teams who believe they "obviously have CL1." A CL0 rating does not mean the software is bad or that engineers are incompetent. It means specific process requirements are not met at a documentation and traceability level. BP4, BP5, BP6, and BP7 are fixable in 2–4 weeks of focused work - none require re-engineering the software itself.

Exercise 2 Reference Solution: MAN.3 PA 2.1 & PA 2.2

| GP | Rating | Rationale |
| --- | --- | --- |
| GP 2.1.1 | L | High-level project objectives exist. MAN.3-specific performance objectives are not defined, but the overall project objectives partially cover this. Weakness: process-specific objectives missing. |
| GP 2.1.2 | L | Gantt chart provides planning for phases. Project management activities not explicitly scheduled. Largely meets planning intent but with gaps in management activity granularity. |
| GP 2.1.3 | P | RAG status reported monthly but no planned vs. actual comparison documented. The 2-week architecture delay was noted but no corrective action recorded. This is a Finding - monitoring is performed but not documented with the required rigor. |
| GP 2.1.4 | L | Team list with roles defined. No RACI for management activities. Largely sufficient - roles are clear even without a formal RACI. |
| GP 2.1.5 | F | Resource allocation documented per phase with person-day estimates. QA 50% clearly shown. Resource planning is well implemented. |
| GP 2.1.6 | P | No communication plan. OEM communication is ad-hoc. Interface management is not defined. This is a significant gap for a supplier project with regular OEM interaction. |
| GP 2.2.1 | N | No template or content requirement defined for any project management work product. Status report format varies - no defined requirements. |
| GP 2.2.2 | P | Git used but no CM plan specifying how PM documents are managed. Ad-hoc rather than defined. |
| GP 2.2.3 | P | Project plan in Git (good). Status reports in shared folder with inconsistent naming (bad). Not consistently controlled. |
| GP 2.2.4 | N | No formal review record for project plan. Email reply is not a review record. Status reports not reviewed at all. This is a Finding. |

PA 2.1 = P (GP 2.1.3 and GP 2.1.6 are both P; because performance monitoring and interface management are central to managing MAN.3 at CL2, these two gaps drag the aggregate to P despite four GPs at L or F)

PA 2.2 = N (GP 2.2.1 = N and GP 2.2.4 = N - two out of four GPs rated N)

MAN.3 CL: CL1 (PA 1.1 = F assumed; but PA 2.1 = P and PA 2.2 = N → CL2 not achieved)
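The planned vs. actual comparison missing under GP 2.1.3 is mechanically simple to produce once the data is captured. A minimal sketch, with hypothetical dates chosen to match the delays noted in the scenario (2 weeks on architecture, 3 weeks on unit testing):

```python
from datetime import date

def deviation_report(plan):
    """Planned vs. actual comparison per phase -- the table GP 2.1.3
    expects in every status report, with slips flagged for action."""
    lines = []
    for phase, (planned, actual) in plan.items():
        slip = (actual - planned).days
        if slip > 0:
            lines.append(f"{phase}: {slip} days late -> corrective action required")
    return lines

# Hypothetical dates; only the slip durations come from the scenario
plan = {
    "Architecture (end)": (date(2024, 5, 17), date(2024, 5, 31)),
    "Unit Test (start)": (date(2024, 9, 2), date(2024, 9, 23)),
}
for line in deviation_report(plan):
    print(line)
```

Producing and archiving this table monthly, with a corrective-action note per flagged row, is the core of closing the GP 2.1.3 gap.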

Exercise 3 Reference Solution: Finding Classification

| # | Classification | Rationale |
| --- | --- | --- |
| 1 | Finding | SWE.1.BP6 - 47 of 320 requirements (15%) have no upstream trace at all, and the remaining references are free text rather than machine-traceable links, so coverage is at best roughly 85%. This systemic gap drives PA 1.1 to P for SWE.1 and prevents CL1 achievement. |
| 2 | Finding | SWE.2.BP3 - Interface specifications are absent despite an interface-heavy component (the BCM controls multiple actuator buses). The absence prevents SWE.5 integration testing from having a formal baseline to test against. PA 1.1 for SWE.2 is impacted. |
| 3 | Weakness | GP 2.2.4 - Review was performed and JIRA tickets were created (positive). One open issue aged 8 weeks is a weakness - the review process works but follow-through is incomplete. Not a Finding because 2 of 3 issues are resolved and the process exists. |
| 4 | Strength | GP 2.2.4 for code review - 100% PR review with documented approval for all 247 merges is excellent evidence of systematic work product review. Significantly exceeds minimum requirements. This should be noted as a strength in the assessment report. |
| 5 | Finding | GP 2.1.3 - The project plan has not been updated to reflect actual deviations, and no corrective action was documented for the delays. Monitoring is effectively absent for planning purposes. This is a Finding, not just a Weakness, because it means CL2 management discipline is not demonstrated. |
| 6 | Finding | SUP.8.BP3 - No configuration baselines established. Git history exists (which is good), but without defined baselines you cannot reproduce a specific configuration state at any historical milestone. This is a fundamental SUP.8 requirement, and its absence drives PA 1.1 for SUP.8 to P. |
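The coverage figure behind Observation 1 can be measured directly from the SRS export. A minimal sketch, assuming the spreadsheet is saved as CSV and that its source column is named `Source` (the actual sheet calls it "Source Chapter" - adjust the key accordingly), with the literal value `internal` meaning no upstream trace:

```python
import csv

def trace_coverage(srs_csv_path):
    """Share of requirements carrying an upstream STS reference.
    Assumes a CSV export with a 'Source' column, where the literal
    value 'internal' means no upstream trace exists."""
    with open(srs_csv_path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    if not rows:
        return 0.0
    traced = sum(1 for r in rows if r["Source"].strip().lower() != "internal")
    return traced / len(rows)
```

On the scenario's numbers (273 of 320 traced), this returns roughly 0.85 - and that is before considering that the surviving references are free text rather than links to specific STS IDs.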

Exercise 4 Reference Solution: CL Profile

| Process | PA 1.1 | PA 2.1 | PA 2.2 | CL Achieved | Target CL | Met? | Priority Action |
| --- | --- | --- | --- | --- | --- | --- | --- |
| SWE.1 | P | L | P | 0 | CL2 | No | Fix BP4 (interface analysis), BP6 (traceability links + 100% coverage), BP7 (release tagging), BP5 (acceptance criteria) |
| SWE.2 | L | L | P | 1 | CL2 | No | Create formal interface specification document (BP3); then PA 1.1 → F; also fix GP 2.2.4 for architecture review records |
| MAN.3 | F | P | N | 1 | CL2 | No | Document planned vs. actual in status reports (GP 2.1.3); define WP templates and create formal review records (GP 2.2.1, GP 2.2.4) |
| SUP.8 | P | L | L | 0 | CL2 | No | Define and execute CM baselining at key milestones; create CM plan with baseline strategy (BP3) |

Top 2 Priority Processes:

  1. SWE.1 - PA 1.1 = P means CL0. Four BPs need work. Fixing SWE.1 also fixes upstream traceability that impacts SWE.2, SWE.5, and SWE.6. Highest cascading value.
  2. SUP.8 - PA 1.1 = P due to missing baselines. This is a single structural fix (define baseline events and execute them) that can move SUP.8 from CL0 to CL2 relatively quickly. Baselines also benefit SWE traceability and change management.
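The SUP.8 fix is structurally small: tag each milestone and record exactly what is in it. A baseline manifest can be as simple as a file-to-hash map; the sketch below is our own illustration (format, naming, and scope are not prescribed by the scenario), and in practice it would be paired with an annotated Git tag at the same milestone:

```python
import hashlib
import json
from pathlib import Path

def write_baseline_manifest(root, out_file):
    """Record a SHA-256 hash for every file under root so a
    configuration state can be verified or reproduced later.
    Illustrative only: a real CM plan also fixes the naming,
    timing, and approval of each baseline."""
    root = Path(root)
    manifest = {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*"))
        if p.is_file()
    }
    Path(out_file).write_text(json.dumps(manifest, indent=2, sort_keys=True))
    return manifest
```

Defining the milestones where this runs (e.g. at each release candidate) and storing the manifest alongside the tag addresses both the missing tags and the missing manifests from Observation 6.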

Summary & Module 1 Completion Checklist

✅ Module 1 Completion Checklist

  • ☐ Can explain ASPICE's purpose, lineage (TR 15504 → ISO 33000 → v4.0), and governance (intacs, HIS, PISA)
  • ☐ Can name all 7 process groups, their prefixes, and scope
  • ☐ Can recite the purpose of SWE.1–SWE.6 and the V-model pairing structure
  • ☐ Can explain the CL scale (0–5), Process Attributes, and N/P/L/F ratings with the strict cumulative rule
  • ☐ Can walk through the full BP list for SWE.1, SWE.2, and SWE.3
  • ☐ Can list the 10 Generic Practices at CL2 (6 from PA 2.1, 4 from PA 2.2) and give concrete evidence examples for each
  • ☐ Can name the three assessment types and walk through the 6-phase assessment lifecycle
  • ☐ Can classify a set of observations as Strength / Weakness / Finding
  • ☐ Have applied the rating algorithm to produce a CL profile from raw evidence in the hands-on lab

Module 2 Preview

Module 2 dives deep into each SWE process individually - SWE.1 through SWE.6 - with worked examples of compliant vs. non-compliant work products, detailed BP analysis for each process, and hands-on exercises for building traceability matrices. Start with SWE.1: Software Requirements Analysis.

Module 2 Starts Here

Continue to Module 2 - SWE.1: Software Requirements Analysis for the deep-dive process chapter on ASPICE's most-assessed and most-failed process.
