
What Is Assessment Planning?

Assessment planning is the structured phase before any ASPICE assessment begins - the work that determines what will be assessed, how deeply, by whom, over what time period, and against which version of the PAM. A poorly planned assessment produces unreliable ratings, scope disputes mid-audit, and wasted stakeholder time. A well-planned assessment produces defensible, repeatable results that actually drive process improvement.

ASPICE assessments fall into three categories, each with different planning requirements:

  • Supplier assessment (external) - Initiated by an OEM or Tier-1 to evaluate a supplier's processes. The OEM's supplier quality team or a certified third-party assessor leads it. The supplier has limited influence over scope.
  • Internal assessment (self-assessment) - Initiated by the supplier organization to identify gaps before an OEM audit or to drive improvement. Lower formality; no intacs certification required for the lead, though certified leads produce more comparable results.
  • Capability determination (joint) - A collaborative assessment agreed between customer and supplier, often used for new program onboarding. Both parties negotiate scope and share findings.

📋 Learning Objectives

  • Define all elements required in an ASPICE Assessment Contract per ISO 33002
  • Select appropriate process instances given a project portfolio and OEM scope requirements
  • Determine interview duration and assessor team size based on scope and target CL
  • Identify the specific preparation artifacts the assessed organization must provide before Day 1
  • Recognize the five most common planning failures that cause assessment findings to be disputed

The Planning Timeline

For a full HIS-scope assessment of a medium-size ECU project targeting CL2, the typical planning timeline runs 4–6 weeks before the on-site interviews. This is not bureaucratic overhead - it is the time needed to gather, organize, and pre-review the evidence package so that interview days are spent on verification, not hunting for documents.

| Weeks Before | Activity | Owner |
| --- | --- | --- |
| T-6 weeks | Assessment Contract drafted and agreed. Scope, version, CL targets, confidentiality terms defined. | Customer + Supplier Lead |
| T-4 weeks | Process instance list confirmed. Project selected. Key interviewees identified and calendar-blocked. | Assessed Organization |
| T-3 weeks | Document package submitted: plans, specifications, test records, CM logs, review records. | Assessed Organization |
| T-2 weeks | Assessors pre-review document package. Gap list prepared. Interview questions drafted per process. | Assessment Team |
| T-1 week | Clarification meeting: assessors raise pre-review questions; organization provides missing evidence or context. | Both |
| T-0 | On-site interviews. Evidence presented and verified. Preliminary ratings discussed. | Both |
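Once the on-site date is fixed, the week offsets in the timeline above translate directly into calendar deadlines. A minimal sketch - the milestone names and the six-week horizon mirror the table and are illustrative, not normative:

```python
from datetime import date, timedelta

# Planning milestones as (activity, weeks before on-site Day 1).
# Names paraphrase the timeline table above.
MILESTONES = [
    ("Assessment Contract agreed", 6),
    ("Process instances and interviewees confirmed", 4),
    ("Document package submitted", 3),
    ("Assessor pre-review complete", 2),
    ("Clarification meeting", 1),
    ("On-site interviews begin", 0),
]

def planning_schedule(onsite_start: date) -> list[tuple[str, date]]:
    """Return (activity, due date) pairs counting back from on-site Day 1."""
    return [(name, onsite_start - timedelta(weeks=w)) for name, w in MILESTONES]

for activity, due in planning_schedule(date(2025, 6, 2)):
    print(f"{due.isoformat()}  {activity}")
```

Generating the schedule this way makes it obvious when the contract must be signed for a given interview date - and when it is already too late to plan properly.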

Defining the Assessment Scope

Scope definition is where most assessment disputes originate. "Scope" has three distinct dimensions in ASPICE, all of which must be explicitly agreed before the assessment begins:

Dimension 1: Process Scope

Which process IDs from the PRM are being assessed. The starting point is the HIS default scope (SWE.1–6, SYS.2–5, SUP.1, SUP.8, SUP.10, MAN.3), but OEMs frequently modify this. Common expansions:

  • Adding SUP.9 (Problem Resolution Management) - especially for projects with a significant maintenance/sustaining engineering component
  • Adding MAN.5 (Risk Management) and MAN.6 (Measurement) - required by some OEMs for CL3 re-assessments
  • Adding ACQ.4 (Supplier Monitoring) - when the assessed organization itself has software sub-suppliers
  • Adding HWE.1–HWE.4 (v4.0) - for organizations with both HW and SW development responsibility

Common compressions: if a supplier has a clearly bounded software-only role with no system engineering responsibility, SYS.2–SYS.5 may be partially or fully out of scope.

Dimension 2: Capability Level Target

The CL target per process must be specified in advance - you cannot assess for CL2 and then decide mid-assessment to also rate CL3. The assessor's interview depth, evidence requests, and ratings are calibrated to the target CL. A typical supplier audit specifies: SWE.1–SWE.6 at CL2; SYS processes at CL2; SUP and MAN processes at CL2.

Assessors follow the ordinal scale rule of ISO/IEC 33020: to achieve CLn, the Process Attributes at level n must be rated Largely (L) or Fully (F) achieved, and all Process Attributes at lower levels must be rated Fully (F) achieved. This means to demonstrate CL2, the organization must first demonstrate PA 1.1 at F, then demonstrate PA 2.1 and PA 2.2 both at L or F. Failing PA 1.1 means CL0 regardless of how strong the PA 2.1 evidence looks.
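The rating logic can be sketched as a small function. This follows the ISO/IEC 33020 rule (lower-level PAs must be Fully achieved, level-n PAs at least Largely achieved); the PA names are real, but the function itself is an illustrative sketch, not assessment tooling:

```python
# Process Attributes per capability level (shown up to CL2 for brevity).
PA_BY_LEVEL = {1: ["PA 1.1"], 2: ["PA 2.1", "PA 2.2"]}

def achieved_cl(ratings: dict[str, str]) -> int:
    """Highest CL whose rating conditions are met, per ISO/IEC 33020.

    Ratings use the N/P/L/F scale. CLn requires all PAs below level n
    rated F, and the level-n PAs rated L or F.
    """
    level = 0
    for cl in sorted(PA_BY_LEVEL):
        lower_full = all(
            ratings.get(pa) == "F"
            for lower in range(1, cl)
            for pa in PA_BY_LEVEL[lower]
        )
        at_least_largely = all(
            ratings.get(pa) in ("L", "F") for pa in PA_BY_LEVEL[cl]
        )
        if lower_full and at_least_largely:
            level = cl
        else:
            break
    return level
```

Note the consequence of the rule: a project rated L on PA 1.1 reaches CL1, but cannot reach CL2 until PA 1.1 improves to F, no matter how strong its PA 2.x evidence is.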

Dimension 3: Organizational Scope

Which organizational units, sites, and teams are in scope. For a multinational Tier-1 with development centers in Germany, India, and China, the assessment may scope only the German site, or all sites, or a specific functional team. This matters because:

  • Different sites may have different process implementations
  • Evidence must be collected from the in-scope organizational unit - evidence from an out-of-scope site cannot substitute
  • CL3 (Established Process) requires a standard defined process at the organizational level, which means the organizational boundary must be wide enough to include the group that defines and maintains the standard process

⚠️ The Scope Creep Risk

It is common for assessors to discover during interviews that a process is performed differently than documented, or that a different team is actually responsible. If this expands the intended scope, it can invalidate planned interview time and create rating ambiguity. Pre-assessment document review is the mitigation - surface surprises before Day 1, not during it.

The Assessment Contract

The Assessment Contract is a formally defined artifact per ISO/IEC 33002 (Requirements for Performing Process Assessment). It is not optional - without a signed Assessment Contract, an assessment does not meet the requirements for a conformant ASPICE assessment and its results cannot be officially reported as CL ratings. For internal assessments, a less formal equivalent is acceptable, but the same information should still be documented.

Mandatory Elements of an Assessment Contract

| Element | What It Must Specify | Why It Matters |
| --- | --- | --- |
| Purpose | Why this assessment is being conducted (e.g., "determine process capability in preparation for series production release") | Determines assessment depth and how findings will be used |
| Scope - Process | Exact list of process IDs and the PAM version (e.g., "ASPICE PAM v3.1") | Prevents scope disputes; defines what is and is not being rated |
| Scope - Organizational | Named organizational units, sites, teams, and project(s) in scope | Evidence from out-of-scope units is inadmissible |
| Target CL | CL target per process or globally | Calibrates interview depth and evidence requirements |
| Assessment team | Lead assessor name + intacs certification level; team member names; roles | Lead must be intacs-certified; team competency requirements are normative in ISO 33002 |
| Dates and logistics | On-site interview dates, document submission deadline, preliminary results date | Coordinates all parties; late evidence submission is a common dispute trigger |
| Confidentiality | How assessment findings, ratings, and evidence are handled, stored, and shared | Assessed organizations have IP rights in their process documentation; findings are often commercially sensitive |
| Output specification | What deliverables the assessed organization will receive: ratings per process, findings list, Assessment Record | The Assessment Record is the formal output - it must be provided to the assessed organization per ISO 33002 |
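One way to keep mandatory elements from slipping through is to treat the contract as a checklist and verify it before signing. A hypothetical sketch - the field names paraphrase the table above and are not the ISO 33002 wording:

```python
from dataclasses import dataclass, fields

@dataclass
class AssessmentContract:
    """Mandatory elements of an Assessment Contract (illustrative model)."""
    purpose: str = ""
    process_scope: str = ""      # process IDs + exact PAM version
    org_scope: str = ""          # units, sites, teams, project(s)
    target_cl: str = ""          # per process or globally
    assessment_team: str = ""    # lead + intacs level, members, roles
    dates: str = ""              # on-site dates, submission deadline
    confidentiality: str = ""    # handling of findings and evidence
    output_spec: str = ""        # ratings, findings list, Assessment Record

    def missing_elements(self) -> list[str]:
        """Names of mandatory elements not yet filled in."""
        return [f.name for f in fields(self) if not getattr(self, f.name).strip()]

draft = AssessmentContract(
    purpose="CL2 readiness before series production release",
    process_scope="HIS default scope, ASPICE PAM v3.1",
)
print(draft.missing_elements())  # six elements still undefined in this draft
```

A draft with any element still empty should not be signed - each missing field maps directly to one of the dispute triggers described in this chapter.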

Lead Assessor Certification Requirements

For an assessment to be reported as a conformant ASPICE assessment, the Lead Assessor must hold an active intacs certification. intacs defines three levels:

  • Provisional Assessor - has completed the intacs exam and is building supervised assessment experience. Can participate as a team member but cannot lead a conformant assessment independently.
  • Competent Assessor - has led conformant assessments under supervision and been formally endorsed. Can lead conformant assessments independently.
  • Principal Assessor - can mentor others and validate assessments. Can sign off on assessments where the lead is a Competent Assessor.

Always verify the Lead Assessor's certification status directly on the intacs website (intacs.eu) before the assessment begins. Certificates have expiry dates and require active renewal via continued assessment practice.

Process Instance Selection

A process instance is a specific occurrence of a process being executed - for example, the SWE.1 process as it was performed on Project Alpha for ECU firmware v2.3. A large development organization may have many simultaneous projects, each being a separate process instance of SWE.1.

Why Instance Selection Is Critical

ASPICE does not assess abstract process descriptions - it assesses evidence from real project work. The instances selected determine what evidence the assessor will review. Assessed organizations sometimes want to show their best project (lowest risk, most mature). OEM assessors often want to see the most challenging active project (highest risk, newest team).

ISO 33002 requires that the assessed organization provide input to instance selection, but the lead assessor has final authority in a supplier assessment. For self-assessments, the organization selects instances themselves - but selecting only the "easy" instances produces misleading results and will be visible to any OEM who later conducts a proper assessment.

Instance Selection Criteria

| Criterion | Consideration |
| --- | --- |
| Project lifecycle phase | Ideally in an active development phase with SWE.1–SWE.4 ongoing. A project in SWE.6 (qualification testing) allows review of end-to-end artifacts but may miss the planning and design processes in action. |
| Project risk/complexity | OEM assessors prefer medium-to-high complexity projects. A simple I/O ECU with 50 requirements does not demonstrate the same capability as a powertrain ECU with 2,000 requirements. |
| Process maturity | For a CL3 assessment (Established Process), the instance must be executed using the organizational standard process. A project that was exempted from the standard process cannot serve as a CL3 instance. |
| Data availability | All work products from SWE.1 requirements specification through test reports must be accessible during the assessment. Archived projects with deleted evidence are problematic. |
| Multiple instances | For CL3 assessment, typically 2–3 instances are reviewed to demonstrate that the standard process is deployed consistently, not just once. |

Mapping Instances to Processes

Not all processes have the same instance boundary. SWE.1–SWE.6 typically share a project instance - the same project serves as the instance for all six. But SUP.8 (Configuration Management) and MAN.3 (Project Management) may have organization-level instances that cross multiple projects. Clarify these boundaries explicitly in the Assessment Contract to prevent assessors from arriving with expectations the organization cannot meet.
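One way to make these boundaries explicit in the contract is a simple classification per process ID. The mapping below is illustrative - SUP.8 and MAN.3 only may be organization-level, so the actual split has to be agreed per assessment:

```python
# Illustrative instance-boundary classification. The SWE processes share a
# project instance; SUP.8 and MAN.3 are shown here as organization-level,
# which is one common arrangement, not a fixed rule.
INSTANCE_BOUNDARY = {
    **{f"SWE.{i}": "project" for i in range(1, 7)},
    "SUP.8": "organization",
    "MAN.3": "organization",
}

def instances_needed(process_ids: list[str]) -> dict[str, set[str]]:
    """Group in-scope processes by the kind of instance they require."""
    grouped: dict[str, set[str]] = {}
    for pid in process_ids:
        grouped.setdefault(INSTANCE_BOUNDARY.get(pid, "project"), set()).add(pid)
    return grouped
```

Recording this grouping in the contract tells the assessors in advance which evidence comes from the selected project and which comes from organization-wide systems.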

Assessment Team & Roles

Assessor Team Sizing

A full HIS-scope assessment (14 processes, CL2 target) typically requires 2–4 assessors for 3–5 on-site days. The sizing depends on scope depth, number of process instances, and whether pre-assessment document review was thorough. A rough guide:

| Scope | CL Target | Assessors | On-site Days |
| --- | --- | --- | --- |
| HIS default (14 processes) | CL1–CL2 | 2–3 | 3–4 |
| HIS default (14 processes) | CL2–CL3 | 3–4 | 4–5 |
| Extended scope (18+ processes) | CL2–CL3 | 4–5 | 5–7 |
| Self-assessment, subset (6–8 processes) | CL1–CL2 | 1–2 | 2–3 |
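The sizing guide can be expressed as a simple lookup. The bands below are taken from the table above and remain a rough guide, not a normative rule - the final sizing call belongs to the lead assessor:

```python
# (scope, CL band) -> ((min, max assessors), (min, max on-site days)),
# transcribed from the sizing table above.
SIZING = {
    ("HIS default", "CL1-CL2"): ((2, 3), (3, 4)),
    ("HIS default", "CL2-CL3"): ((3, 4), (4, 5)),
    ("extended", "CL2-CL3"): ((4, 5), (5, 7)),
    ("self-assessment subset", "CL1-CL2"): ((1, 2), (2, 3)),
}

def team_estimate(scope: str, cl_band: str) -> str:
    """Return the rough sizing band for a given scope and CL target."""
    assessors, days = SIZING[(scope, cl_band)]
    return f"{assessors[0]}-{assessors[1]} assessors, {days[0]}-{days[1]} on-site days"
```

Factors like a thin pre-assessment document review or many process instances push a real assessment toward the upper end of each band.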

Roles in the Assessed Organization

  • Assessment Coordinator - single point of contact on the supplier side. Manages logistics, coordinates interviewees, owns the document submission package. The workload of this role is routinely underestimated - plan for ~50% of one FTE in the 3 weeks before the on-site days.
  • Process Owners / Interviewees - the engineers and managers who actually perform the processes. SWE interviewees should be working-level engineers (not managers); MAN.3 interviewees should include the project manager and a team member for a balanced view.
  • Process Improvement Lead (SPICE Coordinator) - in organizations with active ASPICE programs, this role prepares the evidence package, coaches interviewees, and tracks gap closure. In less mature organizations, this role may not exist - a significant risk factor.

💡 Who Should Be in the Interview Room

Assessors prefer interviewing the people who do the work - not their managers. A requirements engineer explaining how SWE.1 is actually performed is far more credible than a project manager explaining what the process document says. Include managers for MAN.3 and strategic process questions, but for SWE processes, have the working engineers present. Ideally: 2–3 people per process interview, all of whom personally perform that process on the assessed project.

Interview & Evidence Logistics

The Pre-Assessment Document Package

The document package submitted 2–3 weeks before the on-site interviews should be structured to allow assessors to pre-verify work product existence and structure before spending interview time on it. A well-organized package significantly reduces on-site friction. Minimum contents:

| Process Area | Documents to Include |
| --- | --- |
| SWE.1 | Software Requirements Specification (full), traceability matrix to SYS requirements, SWE.1 review records |
| SWE.2 | Software Architectural Design document, interface specifications, traceability to SWE.1 requirements |
| SWE.3 | Detailed design documents (or selected representative units), coding guidelines, static analysis report |
| SWE.4 | Unit test plan, test cases (sample), test execution results, coverage report |
| SWE.5 | Integration test plan, integration order, test results, regression test records |
| SWE.6 | Qualification test specification, test report, requirements coverage matrix |
| SUP.1 | QA plan, QA audit records for the assessed project, nonconformance log |
| SUP.8 | CM plan, baseline records, change log from version control system |
| SUP.10 | Change request procedure, sample CRs with approval records and implementation tracing |
| MAN.3 | Project plan, status reports, milestone tracking records, corrective action log |
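An assessment coordinator can sanity-check the package against this list before submission rather than discovering gaps during the assessors' pre-review. A hypothetical sketch with two processes shown; the document keys are illustrative shorthand, not standard names:

```python
# Required documents per process, abbreviated from the table above.
# Keys are illustrative shorthand for the real work product names.
REQUIRED = {
    "SWE.1": ["srs", "traceability_matrix", "review_records"],
    "SUP.8": ["cm_plan", "baseline_records", "change_log"],
    # remaining in-scope processes would be listed the same way
}

def package_gaps(submitted: dict[str, list[str]]) -> dict[str, list[str]]:
    """Return, per process, the required documents not yet in the package."""
    return {
        proc: missing
        for proc, docs in REQUIRED.items()
        if (missing := [d for d in docs if d not in submitted.get(proc, [])])
    }

gaps = package_gaps({
    "SWE.1": ["srs", "review_records"],
    "SUP.8": ["cm_plan", "baseline_records", "change_log"],
})
print(gaps)  # SWE.1 is still missing its traceability matrix
```

Running a check like this against the contract's document submission deadline turns the "package too late" pitfall below into a tracked, visible risk instead of a Day 1 surprise.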

Interview Day Structure

A typical assessment day follows this pattern: 30-minute opening (assessor presents process scope and approach), 60–90 minutes per process (interview + evidence review), 30-minute daily wrap-up (preliminary findings discussed). Processes are clustered logically: SWE.1–SWE.2 in one block, SWE.3–SWE.4 in another, with SUP and MAN processes typically on the final day.

Assessors will ask for specific documents to be shown on-screen or printed. Do not pre-filter which version of a document to show - assessors will ask for the actual working version, not a cleaned-up version created for the assessment. Showing a version that is inconsistent with the rest of the evidence package is a significant credibility issue.

🔍 The "Assessment Theater" Anti-Pattern

Assessment theater is when an organization creates work products specifically for the assessment that do not reflect how work is actually done. Examples: traceability matrices created the week before by someone who was not involved in requirements; review records that are complete and perfect with no comments or defects; project plans backdated to look like they were created at project kickoff. Experienced assessors recognize assessment theater immediately - internal inconsistencies between documents, timestamps that contradict claimed work sequences, and interviewees who cannot explain artifacts attributed to them. The consequence is not just a finding - it damages the credibility of the entire assessment.

Common Planning Pitfalls

| Pitfall | What Happens | How to Avoid |
| --- | --- | --- |
| Scope not in writing | Assessor expects SYS.1–SYS.5; supplier prepared only SYS.2–SYS.4. Day 1 dispute consumes planning time. | Always have a signed Assessment Contract with exact process IDs listed. |
| Wrong PAM version | Supplier prepared for v3.1; OEM runs v4.0. BPs differ; evidence package doesn't cover v4.0-specific requirements. | Confirm exact PAM version in the contract. Request it in writing if the OEM's SQA is ambiguous. |
| Interviewees not available | Key requirements engineer on leave; manager covers but cannot answer implementation questions. Evidence credibility drops. | Block interviewee calendars as soon as dates are confirmed. Name backups for each role. |
| Document package too late | Package arrives 3 days before on-site. Assessors cannot pre-review; on-site time is consumed reading documents rather than verifying them. | Set a hard submission deadline in the contract (minimum 2 weeks before on-site). |
| Instance mismatch | Supplier presents documents from Project A, but most interviewees worked on Project B. Evidence and testimony don't align. | Agree on specific project instance(s) in writing. Confirm that all interviewees have worked on those instances. |
| No process improvement lead | No one owns the preparation. Evidence package is assembled the week before by whoever is available. Quality is inconsistent. | Assign an ASPICE coordinator role at least 6 weeks before the assessment. This is a real, substantial time commitment. |

Summary & Key Takeaways

✅ Key Takeaways

  • Scope has three dimensions: process IDs, CL target, and organizational unit. All three must be explicitly agreed in the Assessment Contract.
  • The Assessment Contract is a formal requirement per ISO 33002 for a conformant assessment. It must include purpose, scope, team, dates, confidentiality, and output specification.
  • The Lead Assessor must hold a current, active intacs certification (Competent or Principal level). Verify on intacs.eu.
  • Process instance selection determines what evidence is reviewed - be deliberate. For CL3, use 2–3 instances to demonstrate organizational deployment.
  • The pre-assessment document package should be submitted at least 2 weeks before on-site. Late submission turns interview days into reading sessions.
  • "Assessment theater" - creating documents specifically for the audit - is immediately recognizable to experienced assessors and destroys credibility.
  • Plan for a dedicated ASPICE coordinator spending ~50% of their time in the 3–6 weeks before an OEM assessment. It is a substantial effort.

What's Next

The next chapter covers Capability Levels 0–5 in depth - exactly what each Process Attribute requires, how the N/P/L/F rating scale works in practice, and what distinguishes a CL2 Managed Process from a CL3 Established Process in terms of concrete evidence and organizational structure.

