
What Is an Improvement Action Plan?

An Improvement Action Plan (IAP) is the formal output of an ASPICE assessment that did not meet its target Capability Levels - which is the case for most first assessments. It is a structured, time-bound plan that maps each assessment finding to a specific corrective action, an owner, a deadline, and a verification method. Without a credible IAP, OEM customers have no basis to trust that identified gaps will be closed.

The IAP is not optional after a finding is raised. Most OEM Supplier Quality Agreements (SQAs) specify a mandatory IAP submission deadline - typically 30 to 90 days after the assessment report is issued. Failure to submit an IAP on time can trigger escalation to the supplier scorecard or procurement review.

📋 Learning Objectives

  • Read and interpret an ASPICE assessment report - understand what each finding means for CL ratings
  • Prioritize findings by their CL-blocking impact vs. effort to remediate
  • Build a complete IAP with root cause analysis, corrective actions, owners, and milestones
  • Apply root cause techniques (5-Why, Ishikawa) to systemic ASPICE gaps
  • Define a realistic re-assessment timeline and prepare OEM-reportable progress evidence

IAP vs. Corrective Action Plan (CAP)

These terms are sometimes used interchangeably, but in the ASPICE context they have distinct scopes. A CAP addresses a specific, isolated defect - a single requirement without a trace link, a missing review record for one document. An IAP addresses the systemic process-level gaps that produced those defects. The goal of ASPICE improvement is not to fix individual document issues but to change the process so that those issues do not recur. An IAP that lists "add missing traceability links" as its only action for SWE.1 will satisfy neither assessors nor OEMs - it needs to address the root cause of why traceability was missing in the first place.

Reading Your Assessment Report

An ASPICE assessment report from a Competent Assessor typically contains: an executive summary (overall CL profile), a process profile table (CL per process), a detailed finding list, and appendices with evidence references. Understanding how to read this report is the prerequisite to building a valid IAP.

The Process Profile Table

The process profile is the most important page in the report. It shows, for each assessed process, the PA ratings and the resulting CL. A typical first-assessment profile for a Tier-1 ECU supplier looks like this:

Process | PA 1.1 | PA 2.1 | PA 2.2 | CL Achieved | CL Target | Gap?
SWE.1 | L | - | - | 1 | 2 | ❌ PA 1.1 not F; CL2 not reachable
SWE.2 | F | L | P | 1 | 2 | ❌ PA 2.2 = P; CL2 not achieved
SWE.3 | L | - | - | 1 | 2 | ❌ PA 1.1 not F
SWE.4 | F | F | F | 2 | 2 | ✅ Met
SWE.5 | F | L | L | 2 | 2 | ✅ Met
SWE.6 | F | L | L | 2 | 2 | ✅ Met
SYS.2 | F | P | L | 1 | 2 | ❌ PA 2.1 = P
SYS.3 | F | L | L | 2 | 2 | ✅ Met
SYS.4 | F | L | L | 2 | 2 | ✅ Met
SYS.5 | F | L | L | 2 | 2 | ✅ Met
SUP.1 | F | L | L | 2 | 2 | ✅ Met
SUP.8 | L | - | - | 1 | 2 | ❌ PA 1.1 not F
SUP.10 | F | L | L | 2 | 2 | ✅ Met
MAN.3 | F | P | L | 1 | 2 | ❌ PA 2.1 = P

In this example profile, 8 processes met the CL2 target and 6 have gaps: SWE.1, SWE.3, and SUP.8 are held below F on PA 1.1, while SWE.2, SYS.2, and MAN.3 have level-2 PA gaps. The IAP must address all of these gaps.
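
The CL derivation behind a profile like this can be sketched in a few lines. The rules below follow the ISO/IEC 33020 rating scheme (CL1 requires PA 1.1 at least Largely achieved; CL2 requires PA 1.1 Fully achieved plus PA 2.1 and PA 2.2 at least Largely achieved). The function and the small process dictionary are illustrative, not assessment-tool output.

```python
# Sketch of CL derivation from PA ratings per the ISO/IEC 33020 rating rules.
# RANK maps the N/P/L/F scale to comparable integers.

RANK = {"N": 0, "P": 1, "L": 2, "F": 3}

def capability_level(pa11: str, pa21: str = "-", pa22: str = "-") -> int:
    """Return the achieved CL (0-2) for one process from its PA ratings."""
    if RANK.get(pa11, -1) < RANK["L"]:
        return 0  # PA 1.1 below L: CL1 not achieved
    if pa11 == "F" and all(RANK.get(pa, -1) >= RANK["L"] for pa in (pa21, pa22)):
        return 2  # PA 1.1 fully achieved and both level-2 PAs at least L
    return 1      # CL1 achieved, CL2 not (yet) reachable

# A few rows from the example profile above
profile = {
    "SWE.1": ("L", "-", "-"),
    "SWE.2": ("F", "L", "P"),
    "SWE.4": ("F", "F", "F"),
    "SYS.2": ("F", "P", "L"),
}

for process, ratings in profile.items():
    cl = capability_level(*ratings)
    print(f"{process}: CL{cl}" + ("" if cl >= 2 else "  <- gap vs CL2 target"))
```

Note how a single PA 2.x rating of P is enough to block CL2 even when every base practice is fully in place - which is why the finding list distinguishes PA 1.1 findings from PA 2.1/2.2 findings.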

Understanding the Finding List

The finding list provides the detail behind each PA rating. Each finding typically includes: a finding ID, the process and PA it affects, the finding description, the evidence reviewed, the assessor's rationale for the rating, and a recommended corrective action (not mandatory - the supplier defines the actual corrective action in the IAP). When reading findings, distinguish between:

  • PA 1.1 Findings - these affect the CL1 base. Until these are addressed, CL2 is unreachable for that process regardless of how strong your management practices are.
  • PA 2.1 / PA 2.2 Findings - these affect the CL2 layer. The BP-level work is done, but the management discipline around it is insufficient.
  • Weaknesses - these are noted improvement areas but do not prevent the current CL from being achieved. They are still important: in a follow-up assessment, an unaddressed weakness can be escalated to a finding if the assessor determines the situation has not improved.

Prioritizing Findings by CL Impact

Not all findings are equally important to fix first. The prioritization framework below uses two dimensions: CL-blocking impact (does this finding prevent CL target from being achieved?) and remediation effort (how hard is it to fix?). High-impact, low-effort items must be addressed immediately. High-impact, high-effort items need early project planning to have any chance of being closed in time.

Finding Type | CL-Blocking Impact | Typical Remediation Effort | Priority
PA 1.1 = P or N (incomplete BPs) - traceability gaps, missing work products, no review evidence | 🔴 Highest - CL1 not achieved; CL2 impossible | Medium - requires rework of work products | P1 - address first
PA 2.2 findings - WPs not in CM, no review records | 🔴 High - blocks CL2 achievement directly | Low-Medium - procedural changes + retrospective evidence | P1 - address first
PA 2.1 findings - no documented monitoring, no process objectives | 🟠 High - blocks CL2 for the affected processes | Low - requires adding documentation discipline to existing practices | P1 - address in parallel
PA 1.1 weaknesses - minor coverage gaps, minor WPC gaps | 🟡 Medium - doesn't block CL1 but may become a finding if not addressed | Low - targeted document updates | P2 - address before follow-up assessment
PA 2.1/2.2 weaknesses - minor management gaps | 🟢 Low - CL2 currently achieved despite weakness | Low | P3 - good practice improvement
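
The two-dimension prioritization above reduces to a simple sort: highest CL-blocking impact first, with lower effort breaking ties. A minimal sketch, with illustrative finding records and numeric scores that are assumptions, not part of any ASPICE standard:

```python
# Sketch of the impact/effort triage: sort findings so CL-blocking items come
# first, and within equal impact, lower-effort items first.

IMPACT = {"PA1.1-finding": 3, "PA2.x-finding": 2, "weakness": 1}
EFFORT = {"low": 1, "medium": 2, "high": 3}

findings = [
    {"id": "F-SWE1-003", "kind": "PA1.1-finding", "effort": "medium"},
    {"id": "F-MAN3-001", "kind": "PA2.x-finding", "effort": "low"},
    {"id": "W-SWE4-002", "kind": "weakness",      "effort": "low"},
    {"id": "F-SWE3-001", "kind": "PA1.1-finding", "effort": "high"},
]

def triage(items):
    """Highest CL-blocking impact first; tie-break on lower effort."""
    return sorted(items, key=lambda f: (-IMPACT[f["kind"]], EFFORT[f["effort"]]))

order = [f["id"] for f in triage(findings)]
print(order)  # PA 1.1 findings first (low effort before high), then PA 2.x, then weaknesses
```

The sort key encodes the table's rule directly: a high-effort PA 1.1 finding still outranks an easy PA 2.x finding, because only the former blocks the CL1 base.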

The CL-Impact Triage: Which Processes to Fix First

Using the example profile from above, the triage order is:

  1. SWE.1 (PA 1.1 = L) - traceability gaps are the likely driver. This blocks all 6 qualification test cases that link back to the affected requirements. Must be P1.
  2. SWE.3 (PA 1.1 = L) - detailed design documentation incomplete. Since SWE.4 (unit tests) depends on SWE.3 detailed design, this gap may invalidate some unit test evidence too. Must be P1.
  3. SUP.8 (PA 1.1 = L) - configuration management is the foundation for all GP 2.2 evidence. If CM is weak, all work product management evidence is suspect. Must be P1.
  4. MAN.3 (PA 2.1 = P) - project monitoring not documented. This is a P1 because it affects not just MAN.3 but the GP 2.1.3 evidence for every process. Fix the monitoring discipline once and it helps all processes.
  5. SYS.2 (PA 2.1 = P) - same pattern as MAN.3 but scoped to SYS.2 process activities.
  6. SWE.2 (PA 2.2 = P) - SAD review records incomplete. Targeted fix: complete review records for the SAD.

IAP Structure & Template

An IAP submitted to an OEM should be structured, traceable to the assessment report, and contain enough detail to verify progress at a follow-up review. Below is the standard IAP table structure with a worked example for the SWE.1 traceability finding.

📄 Improvement Action Plan - Table Format

IAP Header: Supplier: [AutoSoft GmbH] | Project: [ABS-ECU SW v2.3] | Assessment Date: [2024-09-15] | Report Version: [v1.0] | IAP Version: [1.0] | IAP Submission Date: [2024-10-15]


Column headers:

IAP-ID | Finding Reference | Process | PA Affected | Finding Description | Root Cause | Corrective Action | Action Type | Owner | Start Date | Due Date | Completion Criteria | Status | Evidence of Completion


Example row (SWE.1 traceability finding):

IAP-001 | F-SWE1-003 | SWE.1 | PA 1.1 (BP6) | 9 of 312 SRS requirements (SRS-156, SRS-201, SRS-245, SRS-246, SRS-289, SRS-301, SRS-302, SRS-303, SRS-312) have no upstream traceability link to STS. Root cause analysis pending. | Process gap: requirements intake procedure does not include a mandatory STS coverage check step. Tooling gap: TRM-001 not linked to requirement creation workflow in SharePoint, so new requirements can be added without triggering a trace-link creation step. | (1) Update requirements intake process to include mandatory STS-ID entry for all new requirements - process document update. (2) Add SharePoint list validation to make STS-ID field mandatory. (3) Retrospectively add missing STS links for all 9 requirements and re-baseline SRS and TRM-001. (4) Run updated intake process on next STS change batch and verify 100% link coverage. | Process improvement + retroactive remediation | RE Lead (Maria Fischer) | 2024-10-01 | 2024-11-01 | (a) Updated process document approved and distributed. (b) SharePoint validation active. (c) SRS v2.2 baselined with all 9 links resolved. (d) TRM-001 v1.6 showing 100% upstream coverage. | Open | [to be completed]


Action Types: Use consistent classification across all IAP rows:

R - Remediation (fix the specific gap in existing evidence)

P - Process improvement (change the process to prevent recurrence)

T - Training (address a competency gap)

T+P - Training plus process improvement (competency gap in context of process)
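
An IAP row maps naturally onto a small data structure, which also lets you automate the sanity checks an OEM reviewer applies: a named individual as owner, verifiable completion criteria, a recognized action-type code. The `IapRow` class and its `problems` method below are a hypothetical sketch using the column headers above; the field values mirror the IAP-001 worked example.

```python
# Sketch of one IAP row with reviewer-style validation checks.

from dataclasses import dataclass, field

ACTION_TYPES = {"R", "P", "T", "T+P", "P+R"}  # codes from the classification above

@dataclass
class IapRow:
    iap_id: str
    finding_ref: str
    process: str
    action_type: str
    owner: str
    due_date: str                       # ISO date, e.g. "2024-11-01"
    completion_criteria: list = field(default_factory=list)
    status: str = "Open"

    def problems(self) -> list:
        """Return reviewer objections; an empty list means the row is credible."""
        issues = []
        if self.action_type not in ACTION_TYPES:
            issues.append(f"unknown action type {self.action_type!r}")
        if self.owner.strip().lower() in ("", "the team", "the department"):
            issues.append("owner must be a named individual")
        if not self.completion_criteria:
            issues.append("completion criteria missing - progress cannot be verified")
        return issues

row = IapRow("IAP-001", "F-SWE1-003", "SWE.1", "P+R",
             owner="RE Lead (Maria Fischer)", due_date="2024-11-01",
             completion_criteria=["SRS v2.2 baselined", "TRM-001 v1.6 at 100% coverage"])
print(row.problems())  # [] - no objections

bad = IapRow("IAP-099", "F-XXX-001", "SWE.2", "X", owner="the team", due_date="")
print(bad.problems())  # three objections: action type, owner, criteria
```

The same checks could run over the whole IAP table before each submission, so a row like `bad` never reaches the OEM.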

Complete IAP for the Example Profile

IAP-ID | Process/PA | Core Action | Type | Due
IAP-001 | SWE.1 / PA 1.1 (BP6) | Update requirements intake process with mandatory STS ID; fix 9 missing links; re-baseline SRS and TRM | P+R | 4 weeks
IAP-002 | SWE.3 / PA 1.1 (BP1) | Complete detailed design documentation for 3 components that have only stub-level design; hold peer reviews; baseline SDD | R | 6 weeks
IAP-003 | SUP.8 / PA 1.1 (BP2) | Define formal baseline milestones in CM plan; establish baselines for all current work products; configure mandatory version tagging in SharePoint | P+R | 3 weeks
IAP-004 | MAN.3 / PA 2.1 (GP 2.1.3) | Formalize project status meeting template to include planned vs. actual comparison table; retroactively document actuals in minutes for last 4 meetings | P+R | 2 weeks
IAP-005 | SYS.2 / PA 2.1 (GP 2.1.3) | Add SYS.2 process activities to project status monitoring scope; assign SYS.2 process objectives in QA plan | P | 2 weeks
IAP-006 | SWE.2 / PA 2.2 (GP 2.2.4) | Complete SAD v1.8 review, produce review record with full issue log and dispositions; update SAD review record v1.7 to reflect resolved issue #14 | R | 3 weeks

Root Cause Analysis for ASPICE Gaps

OEM quality engineers reviewing an IAP will evaluate the quality of the root cause analysis. A root cause that simply restates the finding ("the traceability was missing because we didn't add it") is not a root cause - it is a symptom description. A credible RCA identifies why the existing process allowed the gap to occur, which then enables an effective corrective action that prevents recurrence.

5-Why Analysis: SWE.1 Traceability Gap

Problem: 9 SRS requirements have no upstream STS link.

Why 1: Why do 9 requirements have no STS link?
→ They were added to the SRS after the initial version was written, and the STS ID was not entered.

Why 2: Why was the STS ID not entered when requirements were added?
→ The requirements intake process does not require an STS ID - it is a recommended practice, not a mandatory field.

Why 3: Why is STS ID entry not mandatory?
→ When the requirements process was set up, the traceability requirement was described verbally in team onboarding but not enforced by the process document or tool configuration.

Why 4: Why was it not enforced?
→ The process document was written by a QA engineer who assumed the requirement was obvious and self-enforcing; no technical or procedural enforcement was designed in.

Why 5: Why was there no enforcement mechanism?
→ Root cause: The process design step did not include a verification of enforcement - there was no check that the process as written would actually be followed without additional controls.

Root cause: Process design does not include an enforcement validation step; requirements intake process has no mandatory STS-ID field and no tool-level validation to enforce it.

Corrective action: (1) Update process design practice to require enforcement validation. (2) Make STS-ID a mandatory field with tool-level validation. (3) Retrain team on updated process. (4) Add STS coverage check to definition-of-done for new requirements.

Common Root Cause Patterns in ASPICE Gaps

Gap Pattern | Common Root Cause | Corrective Action Type
Review records missing across multiple processes | Review is done informally (verbal); no review record template or habit; team doesn't know what a compliant review record looks like | Training (what reviews require) + Process (mandatory review record template + step in workflow) + Tooling (review record in SharePoint with mandatory fields)
Work products not in CM / unversioned | CM plan exists but covers only code; documents stored ad hoc in personal drives or informal SharePoint folders without version discipline | Process (extend CM plan to cover all ASPICE WPs) + Tooling (configure SharePoint with version control for document libraries) + Procedure (define what CM means for documents)
Project monitoring not documented (GP 2.1.3) | Monitoring happens in the project manager's head or in informal chats; formal status meetings exist but minutes don't capture planned vs. actual | Process (standard status meeting template with planned/actual table) + Habit (PM commits to completing template before publishing minutes)
Detailed design incomplete (SWE.3) | Pressure to code early; design documentation deferred and never caught up; no exit criteria defined for SWE.3 before SWE.4 begins | Process (define SWE.3 exit criteria - design review passed + documentation complete - as prerequisite for SWE.4 start) + Project Planning (allocate SWE.3 time in schedule)
QA not independent enough (SUP.1) | QA engineers are in the same team as developers; organizational structure doesn't create functional independence | Organizational (move QA to a separate cost center or reporting line) - this is the hardest fix and may require a management decision beyond project scope

Timeline & Milestone Planning

A realistic IAP timeline needs to account for the actual effort required to fix each gap, the project team's availability (they still have development work to do), and the OEM's follow-up review schedule. Most OEMs schedule a follow-up assessment 3–6 months after the initial assessment, which gives roughly 13–26 weeks to close all P1 findings.

IAP Timeline Template

Week | Activity | IAP Items | Output
1–2 | IAP kickoff: assign owners, confirm due dates, complete RCA for all P1 findings | All | Signed IAP v1.0 submitted to OEM
2–3 | Quick wins: documented monitoring template + retroactive status meeting minutes update | IAP-004, IAP-005 | Updated PSM template; 4 backdated-but-compliant meeting minutes
3–4 | CM plan update: define baselines for all WPs, establish formal version control discipline | IAP-003 | Updated CM plan v2.0; baseline manifest for all current WPs
3–5 | SWE.2 SAD review completion: complete v1.8 review, update v1.7 record | IAP-006 | RVW-SAD-001 v2 (updated); RVW-SAD-002 (v1.8 review)
4–8 | SWE.1 traceability closure: update intake process, fix 9 missing links, re-baseline SRS + TRM | IAP-001 | Updated intake process; SRS v2.2; TRM v1.6 at 100% coverage
5–10 | SWE.3 detailed design completion: write and review SDD for 3 incomplete components | IAP-002 | SDD-001 v1.4 (complete); review records for each SDD section
8–10 | Internal verification: run internal ASPICE gap check against all P1 findings; confirm all IAP items closed | All | Internal gap check report; updated IAP with completion evidence
10–12 | IAP progress report to OEM: submit evidence package and updated IAP status | All | IAP v2.0 with completion evidence; ready for follow-up assessment
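
A timeline like this is worth checking mechanically: any item that finishes after the internal verification window opens (week 8 in the template) is a schedule risk, because there is nothing left to verify it against. The sketch below encodes the template's week ranges; the `schedule_risks` helper and the cutoff value are assumptions for illustration.

```python
# Sketch of a schedule-consistency check over the IAP timeline template.
# plan maps each IAP item to its (start week, end week) from the table above.

plan = {
    "IAP-001": (4, 8),
    "IAP-002": (5, 10),
    "IAP-003": (3, 4),
    "IAP-004": (2, 3),
    "IAP-005": (2, 3),
    "IAP-006": (3, 5),
}

VERIFICATION_START = 8  # internal gap check begins in week 8

def schedule_risks(plan, cutoff=VERIFICATION_START):
    """Flag items still open when internal verification starts."""
    return sorted(iap for iap, (_, end) in plan.items() if end > cutoff)

print(schedule_risks(plan))  # IAP-002 (SWE.3 design) runs into the verification window
```

In the template this flags IAP-002: the SWE.3 design work deliberately overlaps the verification window, which is exactly the kind of dependency to call out in the IAP kickoff.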

Realistic Effort Estimates for Common Fixes

Fix Item | Typical Effort | Caveat
Add missing traceability links (per 10 requirements) | 4–8 hours RE effort | Longer if source STS is ambiguous or STS has been updated since SRS was written
Complete a review record for an existing approved document | 4–6 hours (review meeting + record) | Requires assembling reviewers who reviewed the document - gets harder if months have passed
Complete detailed design for one SW component (medium complexity) | 2–5 days design + 1 day review | Highly variable; complex state machines or safety-relevant components take longer
Update CM plan and establish baselines | 2–3 days (CM engineer) | Establishing retrospective baselines can take longer if version history is scattered
Create and roll out a status meeting template + backfill 4 meetings | 1–2 days PM effort | Backfilling requires memory or team input; future meetings are trivial with the template
Update requirements intake process + tool validation | 3–5 days (process engineer + IT) | SharePoint list validation setup is fast; process document approval chain may take 1–2 weeks

OEM Reporting & Follow-Up Assessment

IAP Submission and Progress Reporting

After submitting the IAP to the OEM, the supplier is typically required to provide progress reports at defined intervals (e.g., monthly). The progress report is a simple update of the IAP table: for each action, update the Status field and add an evidence reference. OEM quality engineers do not want to re-read the whole IAP each month - they want to see which items changed status and what evidence proves it.

Use clear status codes: Open (not started), In Progress (started, not complete), Completed Pending Verification (supplier considers it done, awaiting assessor confirmation), Closed (confirmed complete in follow-up review).
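
These four status codes imply a lifecycle, and a progress report can reject impossible jumps (e.g., Open straight to Closed, which would skip both supplier completion and assessor confirmation). The transition table below is one reasonable interpretation of the codes, not an OEM-mandated rule; the bounce-back from verification to In Progress is an assumption reflecting a failed confirmation.

```python
# Sketch of the IAP status lifecycle as a small state machine.

ALLOWED = {
    "Open": {"In Progress"},
    "In Progress": {"Completed Pending Verification"},
    "Completed Pending Verification": {"Closed", "In Progress"},  # verification may bounce it back
    "Closed": set(),
}

def advance(current: str, new: str) -> str:
    """Return the new status, or raise if the transition is not allowed."""
    if new not in ALLOWED.get(current, set()):
        raise ValueError(f"illegal status transition: {current!r} -> {new!r}")
    return new

status = "Open"
for step in ("In Progress", "Completed Pending Verification", "Closed"):
    status = advance(status, step)
print(status)  # Closed
```

Enforcing this in whatever tool holds the IAP table keeps the monthly report honest: an item can only appear as Closed if an assessor confirmation step actually happened.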

What OEMs Check in a Follow-Up Assessment

A follow-up assessment is not a full re-assessment. It is a targeted review of the specific processes and PAs where gaps were found. The assessor will:

  1. Review the IAP and confirm that every P1 finding has a corresponding completed action with evidence
  2. For each closed action, review the evidence - the same way as in the original assessment (not just reading the IAP claim)
  3. Re-rate the affected BPs and PAs based on the new evidence
  4. Check that process improvements have actually been applied (not just documented) - interviewees will be asked if they are aware of and following the new process
  5. Mark confirmed actions as resolved; note any findings that remain open

❌ Most Common IAP Failure in Follow-Up Assessments

The most frequent reason a follow-up assessment still fails is: the immediate defect was fixed (the 9 missing trace links were added) but the process was not changed (the intake procedure still has no mandatory STS-ID field). The assessor will check new requirements added since the original assessment - and if they also have missing trace links, the corrective action is considered insufficient and the finding remains open. Always close both the remediation (fix the existing gap) and the process improvement (prevent recurrence) before claiming an action as complete.
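
This closure rule can be checked mechanically before any action is reported as complete: its action type should cover both remediation (R) and process improvement (P). The `closure_gaps` helper below is an illustrative sketch reusing the action-type codes from the IAP template; the data is made up.

```python
# Sketch of the "both halves closed" rule: flag IAP items whose action type
# lacks either R (fix the existing gap) or P (prevent recurrence).

def closure_gaps(actions):
    """Return IAP items missing R or P - candidates for the
    'fixed the defect but not the process' failure mode."""
    return sorted(iap for iap, kind in actions.items()
                  if not {"R", "P"} <= set(kind.split("+")))

actions = {
    "IAP-001": "P+R",   # process change plus retroactive fix - complete
    "IAP-004": "P+R",
    "IAP-005": "P",     # process only - acceptable only if there was no defect to fix
    "IAP-006": "R",     # remediation only - recurrence not prevented
}

print(closure_gaps(actions))  # items missing one of the two halves
```

Flagged items are not automatically wrong (a pure process gap may legitimately need no remediation), but each one deserves an explicit justification in the IAP before being claimed as closed.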

Escalation Paths When IAP Cannot Be Closed on Time

If it becomes clear that a P1 finding cannot be closed within the OEM's deadline (due to project schedule, resource constraints, or technical complexity), proactive communication is essential. The worst outcome is for the OEM to discover during the follow-up assessment that an action is not complete - without any prior warning. The right approach: inform the OEM quality contact as early as possible, provide a revised timeline with justification, propose a partial closure plan (remediation complete but process improvement pending), and request a timeline extension in writing through the SQA escalation process. Most OEMs will accept a credible extension request if it is raised proactively with evidence of progress - they will not accept a surprise at the follow-up meeting.

✅ Summary: The Characteristics of a Good IAP

  • Every finding from the assessment report has a corresponding IAP entry - nothing is silently ignored
  • Root cause analysis goes to the process level, not just the symptom level
  • Corrective actions include both remediation (fix the gap) and process improvement (prevent recurrence)
  • Completion criteria are specific and verifiable - not "improve traceability" but "TRM v1.6 showing 100% upstream coverage of all 312 SRS requirements"
  • Owners are named individuals, not "the team" or "the department"
  • Due dates are realistic given project workload - a plan with all items due in 2 weeks is not credible
  • Progress is reported proactively; delays are communicated before they become surprises

What's Next

This completes the ASPICE Assessment Preparation module. Continue to Hands-On: Mock Assessment Exercise to practice all the concepts - reading process evidence, rating BPs and PAs, classifying findings, and drafting IAP entries - in a realistic simulated assessment scenario.
