© KUGLER MAAG CIE GmbH
Automotive SPICE 3.0
What is new and what
has changed?
Fabio Bella
Klaus Hoermann
Bhaskar Vanamali
Steffen Herrmann
Markus Müller
December 2015
Version 2015-12-05
About the Trainer: Markus Mueller
Qualification & Experience
• intacs™-certified Principal Assessor and trainer, intacs™ Advisory Board member,
who
• conducted more than 50 assessments, many of them for OEMs
• trained more than 300 ISO/IEC 15504 provisional assessors from leading car
manufacturers (OEMs) and suppliers
• advised OEM representatives on the development of Automotive SPICE®
• Project leader of several change and improvement projects based on ISO/IEC
15504 and CMM/CMMI®
• Providing consultancy, coaching, and active support in several ECU development
projects in automotive
• E.g. project leader for the implementation of a project control office (PCO) in the
electronics development of a major car manufacturer, which today controls more
than 100 ECU development projects
• Married with 2 children
• Director Operations at Kugler Maag Cie
• Over 15 years of experience in industry and research projects
• Assisting medium-size companies as well as international
corporations, primarily in the automotive industry
• PMI Project Management Professional
• Very experienced trainer, moderator, and management coach
• Speaker at conferences and co-author of books
Introducing myself
Dr. Klaus Hoermann
• Principal and Partner at Kugler Maag Cie
• Leader of the intacs™ working group “Exams”
• intacs™ SPICE Principal Assessor, intacs™ SPICE Instructor
• Volkswagen-certified Software Quality Improvement Leader (SQIL)
• CMMI® SCAMPI Lead Appraiser (CMMI Institute-Certified)
• CMMI® Instructor (CMMI Institute-Certified)
• Scrum Master (Scrum.org certified)
About the trainer: Bhaskar Vanamali
Qualification & Experience
• intacs™-certified Principal Assessor and trainer, VDA AK13 member, who
• conducted more than 90 assessments, many of them for OEMs
trained more than 200 ISO/IEC 15504 provisional assessors from leading car
manufacturers (OEMs) and suppliers
• advised OEM representatives on the development of Automotive SPICE®
• Project leader of several change and improvement projects based on SPICE and
CMM/CMMI®
• Providing consultancy, coaching, and active support in several ECU development
projects in automotive
• Member of ISO-working group for system and SW-engineering processes
• Married with 4 children
• Process Director at Kugler Maag Cie
• Over 15 years of experience in industry and process
improvement
• Assisting medium-size companies as well as international
corporations, primarily in the automotive industry
• Very experienced trainer, moderator, and management coach
• Speaker at conferences and co-author of books
• Executive Summary
• Introduction
• Overview of the Changes
• Changes in Detail
• And Finally
• Need more Advice?
• Contact Information
Contents
• Version 3.0 comprises many small changes and improvements, some structural changes, and a few changes that will increase project effort.
• Structural changes:
• The engineering processes were divided into the two groups System (SYS) and Software (SWE).
• The tip of the V was changed: unit construction and unit verification have been separated into two processes.
• A “Plug-In Concept” allows integration of mechanical and hardware processes (not provided by Automotive SPICE).
• Content-wise, the HIS scope remains mainly the same (however, the names of some processes have changed).
• A few changes will cause additional effort for projects; e.g., evaluation of alternative solutions is required for system and software architectures according to defined criteria. The evaluation result, including a rationale for the architecture/design selection, has to be recorded.
• The Measurement Framework was adapted to the changes in ISO 33020.
• Minor changes for capability levels 1-3, major changes for capability levels 4-5.
• Automotive SPICE 3.0 is not yet mandatory. This will be decided by the VDA Quality Management Board with the release of the new “Blue/Gold Volume” (Sep. 2016).
• In the meantime, the VDA AK 13 will develop interpretation guidelines for Automotive SPICE 3.0 and also a guideline for performing assessments (planned for April 2016).
Executive Summary
Introduction
• Automotive SPICE Vers. 3.0 will replace the Automotive SPICE Process
Assessment Model (PAM) 2.5 and the Process Reference Model (PRM) 4.5.
• ASPICE Vers. 3.0 comprises the PRM and the PAM in one single document.
• Automotive SPICE is no longer using ISO 12207 as guidance.
• Automotive SPICE® is a registered trademark of the Verband der
Automobilindustrie e.V. (VDA)
• Automotive SPICE 3.0 has been created by the Working Group 13 of the
Quality Management Center (QMC) within the German Association of the
Automotive Industry (Verband der Automobilindustrie e.V., VDA) with the
representation of members of the Automotive Special Interest Group (SIG) –
review only, and with the agreement of The SPICE User Group. This agreement
is based on a validation of the Automotive SPICE 3.0 version regarding any ISO
copyright infringement and the statements given from VDA QMC to the SPICE
User Group regarding the current and future development of Automotive
SPICE.
Basic Facts and Acknowledgements
• Employees of Volkswagen, Continental, Schaeffler, ZF, Brose, Ford, BMW,
Daimler, Knorr-Bremse
• Secretary: Bhaskar Vanamali (KMC)
• Contact VDA QMC: Dr. Jan Morenzin
Members VDA AK13
• Automotive SPICE 3.0 was published in July 2015 and may be used for
assessments in agreement with the sponsor.
• Automotive SPICE 2.3 is still the version considered mandatory by the
VDA. Automotive SPICE versions 2.3 or 2.5 may still be used.
• Mandatory rules for the Automotive SPICE 3.0 transition are decided by the
VDA Quality Management Board with the release of the new “Blue/Gold
Volume” (Sep. 2016).
• In the meantime the VDA AK 13 will develop interpretation guidelines for
Automotive SPICE 3.0 and also a guideline for performing assessments
(Blue/Gold Volume by VDA, planned date is September 2016).
Automotive SPICE 3.0 Deployment and Timeline
• Timeline of publications by AK13 and transition time
• Caution: the publication date of the Blue/Gold Volume of the VDA is a target date,
and the transition time of one year (Period II) is still under discussion.
Blue/Gold Volume and ASPICE 3.0 - Deployment and Timeline (1/2)
[Timeline] July 2015: release of Automotive SPICE 3.0 (start of Period I). End of 2016: release of the Blue/Gold Volume (start of Period II, the transition period). End of 2017: start of Period III.
Blue/Gold Volume and ASPICE 3.0 - Deployment and Timeline (2/2)
Entry / Exit
• Period I: entry at ASPICE 3.0 release; exit at BG Volume release
• Period II: entry at BG Volume release; exit at end of transition (~1 year)
• Period III: entry at end of transition; open end

iNTACS Trainings
• Period I: update of trainings to ASPICE 3.0
• Period II: update of trainings to the BG Volume
• Period III: none

Certified assessors
• Period I: no implication for any grade
• Period II: upgrade training needed for Competent and Principal; no implication for Provisional assessors
• Period III: still upgrade training needed for Competent and Principal; no implication yet for Provisional assessors

Additional trainings
• Period I: update trainings possible, but no official trainings
• Period II: iNTACS- and VDA WG13-approved upgrade trainings
• Period III: still iNTACS- and VDA WG13-approved upgrade trainings

Provisional assessors
• Period I: no specific requirements
• Period II: upgrade training in discussion, but not clear how to enforce it
• Period III: upgrade training in discussion, but not clear how to enforce it

Assessments
• Period I: assessments for any PAM from 2.3 to 3.0 are possible
• Period II: assessments for PAM 3.0 only accepted if the lead assessor underwent upgrade training
• Period III: in discussion: for German OEMs, assessments have to be performed based on PAM 3.0
• There will be no official upgrade trainings.
• The iNTACS training materials for Provisional and Competent Assessors will be
updated (Update for Provisional Assessor training planned for early 2016)
• iNTACS instructors are not required to be upgraded
• To be able to perform Automotive SPICE 3.0 assessments: no additional
requirements
• Automotive SPICE 3.0 Assessments are acknowledged as EE-1.
• Certification/Recertification: No changes
Rules for the Period from July 2015 until the Guidelines are
published – Period I
• iNTACS training materials will be updated. There will be an official upgrade
training. iNTACS will provide all changes, AK13 will perform reviews. iNTACS
training providers will perform the trainings.
• iNTACS instructors need to be upgraded. Bhaskar Vanamali and Pierre Metz
will perform the upgrade trainings.
• iNTACS training providers will provide upgrade trainings for all assessor grades.
• To be able to perform Automotive SPICE 3.0 assessments: The lead assessor
must have participated in official upgrade training.
• Automotive SPICE 3.0 assessments are acknowledged as EE-1 if the assessor
has passed an upgrade training (or has attended a new provisional course).
Assessments from previous models (2.3 upwards) are still acknowledged as EE-1.
• Certification/Recertification:
• Provisional: There are no changes to recertification. New Provisional Course.
• Competent: Need upgrade unless they took a new Provisional Course.
• Principals: Need upgrade
Rules for the Period from the publication of the Guidelines until the
end of the Transition Period (date tbd) – Period II
• Automotive SPICE 3.0 Assessments are acknowledged as EE-1 if the assessor
has passed an upgrade training (or has attended a new provisional course).
• Whether assessments from previous models (2.3 upwards) are still
acknowledged as EE-1 is not yet decided.
• Certification/Recertification:
• Provisional: There are no planned changes to recertification. New Provisional Course.
However, discussion on mandatory upgrade trainings for Provisional Assessors which
were trained based on old trainings.
• Competent: Need upgrade unless they took a new Provisional Course.
• Principals: Need upgrade
Rules for the time after the end of the Transition Period (date tbd)
– Period III
Overview of the Changes
• Adaptation to ISO/IEC 15504-5:2012
• Automotive SPICE 2.5 was based on ISO/IEC 15504-5:2006.
• Adaptation to the ISO/IEC 33000 series
• including the updated measurement framework
• Some basic structures, concepts, and terminology needed updates.
• Assessment indicators needed updates
(particularly Base Practices for the HIS scope)
Motivation for Updating Automotive SPICE
Chapter 1 Introduction: editorial adaptation to the 33000 series; notes regarding the combined PRM/PAM in this document.
Chapter 2 Statement of compliance: adaptation to the 33000 series.
Chapter 3 Introduction: optimized for better understanding and adapted to the 33000 series.
Chapter 4 Process reference model and performance indicators (Level 1): the acronym ENG changed to SYS and SWE, the structure of the processes has changed, and the Base Practices of the HIS scope have been reworked.
Chapter 5 Process capability levels and process attributes: adapted to the measurement framework of ISO/IEC 33020.
Annex A Conformity of the process assessment and reference model: conformity statement adapted to ISO/IEC 33004:2015.
Annex B Work product characteristics: modifications to the work product characteristics according to the changes in chapter 4.
Annex C Terminology: updated to recent standards; new terminology introduced.
Annex D Key Concepts: added the new major concepts relative to Annex D of AS 2.5; the traceability diagram (Annex E in PAM 2.5) is now in Annex D.
Annex E Reference Standards: updated references to other standards.
Overview of Main Changes (Document view)
The new Annex D was extended to include a clarification of new Automotive
SPICE key concepts and is a good starting point to understand the differences
between V 3.0 and V 2.5, in particular with respect to Capability Level 1.
Annex D Key Concepts
• D.1 The “Plug-in” Concept
• D.2 The Tip of the “V”
• D.3 Terms “Element”, “Component”, “Unit”, and “Item”
• D.4 Traceability and Consistency
• D.5 “Agree” and “Summarize and Communicate”
• D.6 “Evaluate”, “Verification Criteria” and “Ensuring compliance”
• D.7 The Relation Between “Strategy” and “Plan”
Overview of Main Changes
Structural Changes in Version 3.0: “Plug-In Concept”
[Diagram] System level: SYS.1-SYS.5. Domain level: SWE.1-SWE.6 (part of the Automotive SPICE® 3.0 PAM) as well as HWE.1-HWE.4 and MEE.1-MEE.4 (not developed by the VDA, not included in the Automotive SPICE® 3.0 PAM). Supporting processes across both levels: MAN.3, ACQ.4, SUP.1, SUP.8, SUP.9, SUP.10.
Legend: SYS = System Engineering, SWE = Software Engineering, HWE = Hardware Engineering, MEE = Mechanical Engineering.
Structural Changes in Version 3.0: The Tip of the V
[V-model diagram] Left side: SYS.2 System Requirements Analysis, SYS.3 System Architectural Design, SWE.1 SW Requirements Analysis, SWE.2 SW Architectural Design, SWE.3 SW Detailed Design and Unit Construction. Right side: SWE.4 SW Unit Verification, SWE.5 SW Integration and Integration Test, SWE.6 SW Qualification Test, SYS.4 System Integration and Integration Test, SYS.5 System Qualification Test.
Terms “Element”, “Component”, “Unit”, and “Item”
• A system architecture specifies the elements of the system.
• A software architecture specifies the elements of the software.
• Software elements are hierarchically decomposed into smaller elements down to the software
components which are at the lowest level of the software architecture.
• Software components are described in the detailed design.
• A software component consists of one or more software units.
• Items on the right side of the V-model are the implemented counterparts of elements and
components on the left side. This can be a 1:1 or m:n relationship, e.g. an item may represent
more than one implemented element.
[Diagram] System Engineering Process Group (SYS): SYS.1 Requirements Elicitation, SYS.2 System Requirements Analysis, SYS.3 System Architectural Design, SYS.4 System Integration and Integration Test, SYS.5 System Qualification Test. Software Engineering Process Group (SWE): SWE.1 Software Requirements Analysis, SWE.2 Software Architectural Design, SWE.3 Software Detailed Design and Unit Construction, SWE.4 Software Unit Verification, SWE.5 Software Integration and Integration Test, SWE.6 Software Qualification Test. The diagram annotates where elements, components, units, and items appear along the V.
• Traceability and consistency were formerly addressed by one single base
practice on the right side of the V and have now been split into two base
practices.
• Traceability refers to the existence of references or links between work
products. Traceability supports coverage analysis, impact analysis,
requirements implementation status tracking etc.
• Consistency means that
• All traceability references/links are available (i.e., nothing is missing)
• All traceability references/links are correct (i.e., not linking to the wrong work
product)
• Consistency has to be proven by technical review of the traceability
• New traceability requirements have been added:
• Between test cases and test results
• Between change requests and work products affected by these change requests
(SUP.10)
Traceability and Consistency
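The coverage and consistency checks described above can be sketched with a toy link model; all identifiers below are made up for illustration and are not from the PAM:

```python
# Toy traceability model: links from test cases back to requirements.
requirements = {"REQ-1", "REQ-2", "REQ-3"}
test_cases = {"TC-1", "TC-2"}
links = {"TC-1": "REQ-1", "TC-2": "REQ-9"}  # note: REQ-9 does not exist

# Coverage analysis: requirements without any linked test case.
uncovered = requirements - set(links.values())

# Consistency: every link must point at an existing work product
# (a link to a missing requirement is an incorrect reference).
dangling = {tc: req for tc, req in links.items() if req not in requirements}

print(sorted(uncovered))  # ['REQ-2', 'REQ-3']
print(dangling)           # {'TC-2': 'REQ-9'}
```

In practice such checks run over a traceability matrix exported from the requirements and test management tools; the point here is only that traceability is the existence of links, while consistency additionally requires that no link is missing or wrong.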
Traceability and Consistency
[Diagram] Bidirectional traceability and consistency links connect the work products along the V: stakeholder requirements, system requirements, system architecture, software requirements, software architecture, software detailed design, and software units on the left side; the corresponding test specifications and test cases (system qualification, system integration, software qualification, software integration, unit) with their test results and static verification results on the right side; and change requests to the affected work products (SUP.10 BP8). Each link is required by a pair of base practices, e.g. SYS.2 BP7/BP8, SYS.3 BP6/BP7, SWE.1 BP7/BP8, SWE.2 BP7/BP8, SWE.3 BP5/BP6, SWE.4 BP5/BP6, SWE.5 BP7/BP8, SWE.6 BP5/BP6, SYS.4 BP7/BP8, and SYS.5 BP5/BP6.
• The information flow on the left side of the “V” is ensured through a base practice
“Communicate agreed ‘work product x’”. The term “agreed” means that there is a joint
understanding among all stakeholders regarding the content of the work product.
• The information flow on the right side of the “V” is ensured through a base practice
“Summarize and communicate results”. The term “summarize” refers to abstracted
information, resulting from test executions, that is made available to all relevant parties.
“Agree” and “Summarize and Communicate”
[V-model diagram] Each process on the left side of the V (SYS.2, SYS.3, SWE.1, SWE.2, SWE.3) contains a BP “communicate agreed …”; each process on the right side (SWE.4, SWE.5, SWE.6, SYS.4, SYS.5) contains a BP “summarize and communicate …”.
“Evaluate”, “Verification Criteria” and “Ensuring compliance”
[Diagram] Verification criteria are defined in SYS.2.BP5 and SWE.1.BP5 and feed into SUP.2 Verification. “Ensure compliance” is addressed by SYS.3.BP3, SYS.5.BP2, SWE.5.BP3, and SWE.6.BP2. SWE.3.BP4 covers “Evaluate”, and SWE.4.BP2 covers the criteria for unit verification.
• Verification criteria are used as input for the development of the test cases or
other verification measures that ensure compliance with the requirements.
Verification criteria are used only in the context of the System Requirements
Analysis (SYS.2) and Software Requirements Analysis (SWE.1) processes.
Verification aspects which cannot be covered by testing are covered by the
verification process (SUP.2).
• Criteria for unit verification ensure compliance of the source code with the
software detailed design and the non-functional requirements. Possible
criteria for unit verification include unit test cases, unit test data, coverage
goals, and coding standards and guidelines, e.g. MISRA. For unit testing,
such criteria shall be defined in a unit test specification. This unit test
specification may be implemented, e.g., as a script in an automated test bench.
“Evaluate”, “Verification Criteria” and “Ensuring compliance”
• Evaluation of alternative solutions is required for system and software
architectures. The evaluation has to be done according to defined criteria.
Such evaluation criteria may include quality characteristics like modularity,
reliability, security, and usability, or results of make-or-buy or reuse analysis.
The evaluation result including a rationale for the architecture/design
selection has to be recorded.
• Compliance with an architectural design (SWE.5.BP3) means that the
specified integration tests are capable of proving that interfaces and relevant
interactions (e.g. dynamic behavior) between
- the software units,
- the software items and
- the system items
fulfill the specification given by the architectural design.
“Evaluate”, “Verification Criteria” and “Ensuring compliance”
• SYS.3.BP5: Evaluate alternative system architectures. Define evaluation
criteria for architecture design. Evaluate alternative system architectures
according to the defined criteria. Record the rationale for the chosen system
architecture. [OUTCOME 1]
• NOTE 3: Evaluation criteria may include quality characteristics (modularity,
maintainability, expandability, scalability, reliability, security and usability) and
results of make-buy-reuse analysis.
New practice in System Architectural Design: Evaluate alternative
system architectures
Both terms, “Strategy” and “Plan”, are commonly used across the following processes of the Automotive
SPICE 3.0 PAM:
• SYS.4 System Integration and Integration Test
• SYS.5 System Qualification Test
• SWE.4 Software Unit Verification
• SWE.5 Software Integration and Integration Test
• SWE.6 Software Qualification Test
• SUP.1 Quality Assurance
• SUP.8 Configuration Management
• SUP.9 Problem Resolution Management
• SUP.10 Change Request Management
The Relation Between “Strategy” and “Plan”
[Diagram] BP 1 defines the strategy, which is documented in the related plan work product: at CL 1, the process-specific plan (WP 08-nn); at CL 2 or higher, the generic plan (WP 08-00) applies in addition.
• Capability Level 1:
Each of these processes requires the development of a process-specific
strategy. The strategy always corresponds to a process-specific “Plan”. For each
process-specific “Plan” there are process-specific work product characteristics
defined (e.g. “08-52 Test Plan”, “08-04 Configuration Management Plan”).
Scheduling (e.g. the old SUP.10 BP10) has been moved to Level 2.
• Capability Level 2 or higher:
Each process-specific “Plan” (WP 08-nn) inherits the work product
characteristics represented by the generic plan (WP 08-00). This means that
for a process-specific “Plan” both the process-specific characteristics (WP 08-nn)
and the generic characteristics (WP 08-00) apply.
• BPs for proceeding have been deleted.
The Relation Between “Strategy” and “Plan”
• Color code for different elements
• Red for PRM elements (Process ID, name, purpose and outcomes)
• Green for base practices and generic practices
• Blue for output work products and generic resources
• Italics for content from ISO 330xx (e.g. measurement framework)
No Separation of PAM and PRM in v3.0
Rating scale as defined by ISO/IEC 33020
• N (Not achieved): There is little or no evidence of achievement of the defined process attribute in the assessed process.
• P (Partially achieved): There is some evidence of an approach to, and some achievement of, the defined process attribute in the assessed process. Some aspects of achievement of the process attribute may be unpredictable.
• L (Largely achieved): There is evidence of a systematic approach to, and significant achievement of, the defined process attribute in the assessed process. Some weaknesses related to this process attribute may exist in the assessed process.
• F (Fully achieved): There is evidence of a complete and systematic approach to, and full achievement of, the defined process attribute in the assessed process. No significant weaknesses related to this process attribute exist in the assessed process.
Optional Rating Scale
• P- (Partially achieved): There is some evidence of an approach to, and some achievement of, the defined process attribute in the assessed process. Many aspects of achievement of the process attribute may be unpredictable.
• P+ (Partially achieved): There is some evidence of an approach to, and some achievement of, the defined process attribute in the assessed process. Some aspects of achievement of the process attribute may be unpredictable.
• L- (Largely achieved): There is evidence of a systematic approach to, and significant achievement of, the defined process attribute in the assessed process. Many weaknesses related to this process attribute may exist in the assessed process.
• L+ (Largely achieved): There is evidence of a systematic approach to, and significant achievement of, the defined process attribute in the assessed process. Some weaknesses related to this process attribute may exist in the assessed process.
Extended Rating Scheme - percentages
• P-: > 15 % to 32.5 %
• P+: > 32.5 % to 50 %
• L-: > 50 % to 67.5 %
• L+: > 67.5 % to 85 %
AK13 has not decided whether this will be part of the guidelines or out of scope.
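Read as interval boundaries, the NPLF scale of ISO/IEC 33020 (N up to 15 %, P up to 50 %, L up to 85 %, F above 85 %) and this optional refinement amount to a simple mapping. The function below is an illustration of that reading, not part of the PAM:

```python
def nplf(pct: float) -> str:
    """Map an achievement percentage to the NPLF rating scale."""
    if pct <= 15:
        return "N"
    if pct <= 50:
        return "P"
    if pct <= 85:
        return "L"
    return "F"

def refined(pct: float) -> str:
    """Optionally refine P and L with the -/+ sub-ratings shown above."""
    base = nplf(pct)
    if base == "P":
        return "P-" if pct <= 32.5 else "P+"
    if base == "L":
        return "L-" if pct <= 67.5 else "L+"
    return base

print(refined(40))  # P+
print(refined(60))  # L-
```

Note that N and F have no sub-ratings; the refinement only splits the P and L bands at their midpoints (32.5 % and 67.5 %).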
Rating and Aggregation methods
• ISO/IEC 33020 identifies three rating methods (R1, R2, and R3) and different
aggregation methods:
• aggregation within one process (one-dimensional, vertical aggregation)
• across multiple process instances (one-dimensional, horizontal aggregation)
• both (two-dimensional, matrix aggregation)
• These are usually used for assessments of organizational maturity.
• AK13 has not decided whether this will be part of the guidelines or out of scope.
• In Automotive SPICE V 3.0 the rating methods are only briefly explained and
the approaches referenced.
Rating method R1 (which has the strongest requirements):
• The approach to process attribute rating shall satisfy the following conditions:
• a) Each process outcome of each process within the scope of the assessment
shall be characterized for each process instance, based on validated data;
• b) Each process attribute outcome of each process attribute for each process
within the scope of the assessment shall be characterised for each process
instance, based on validated data;
• c) Process outcome characterisations for all assessed process instances shall be
aggregated to provide a process performance attribute achievement rating;
• d) Process attribute outcome characterisations for all assessed process
instances shall be aggregated to provide a process attribute achievement
rating.
Example of a Rating Method
Horizontal Aggregation
[Table] Ratings of the indicators (rows) across process instances (columns), aggregated horizontally into one rating per indicator:

           Inst. 1  Inst. 2  Inst. 3  Inst. 4  Inst. 5  Inst. 6  |  Aggregated
GP 3.2.1      P        P        N        P        P        L     |      P
GP 3.2.2      P        L        L        L        F        L     |      L
GP 3.2.3      L        L        L        L        L        L     |      L
GP 3.2.4      F        F        F        F        F        L     |      F
GP 3.2.5      L        L        P        P        L        L     |      L
GP 3.2.6      P        L        P        P        N        L     |      P
Vertical Aggregation
[Table] Ratings of the indicators (rows) per process instance (columns), aggregated vertically into one rating per instance:

            Inst. 1  Inst. 2  Inst. 3  Inst. 4  Inst. 5  Inst. 6
GP 3.2.1       P        P        N        P        P        L
GP 3.2.2       P        L        L        L        F        L
GP 3.2.3       L        L        L        L        L        L
GP 3.2.4       F        F        F        F        F        L
GP 3.2.5       L        L        P        P        L        L
GP 3.2.6       P        L        P        P        N        L
Aggregated     P        L        P        L        P        L
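One plausible way to reproduce such an aggregation is to map each rating to the midpoint of its percentage band and average over the instances. This is an illustration under that assumption, not the normative ISO/IEC 33020 definition:

```python
# Midpoints of the NPLF percentage bands
# (N: 0-15 %, P: >15-50 %, L: >50-85 %, F: >85-100 %).
MIDPOINT = {"N": 7.5, "P": 32.5, "L": 67.5, "F": 92.5}

def to_rating(pct: float) -> str:
    """Convert a percentage back to an NPLF rating."""
    if pct <= 15:
        return "N"
    if pct <= 50:
        return "P"
    if pct <= 85:
        return "L"
    return "F"

def aggregate(ratings):
    """Aggregate a row (horizontal) or column (vertical) of ratings."""
    return to_rating(sum(MIDPOINT[r] for r in ratings) / len(ratings))

# Horizontal aggregation of the GP 3.2.1 row from the table above:
print(aggregate(["P", "P", "N", "P", "P", "L"]))  # P
```

With this mapping the function reproduces all six aggregated row ratings of the horizontal-aggregation table above (P, L, L, F, L, P); the same function applied column-wise yields the vertical aggregation.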
Changes in Detail
General Changes in V3.0
• See Key concepts (previous slides)
• In the testing processes (SYS.4, SYS.5, SWE.5, SWE.6) test cases have to be selected
based on the test strategy of the relevant test step.
• Removal of Level 2 activities from BPs
• Planning and monitoring activities were removed from BPs (e.g. SUP-processes)
• Reviews beyond consistency checks were removed from BPs (e.g. SWE.2-4)
Outcomes V2.5
As a result of successful implementation of this process
1 the scope of the work for the project is defined;
2 the feasibility of achieving the goals of the project with
available resources and constraints is evaluated;
3 the tasks and resources necessary to complete the
work are sized and estimated;
4 interfaces between elements in the project, and with
other project and organizational units, are identified
and monitored;
5 plans for the execution of the project are developed,
implemented and maintained;
6 progress of the project is monitored and reported; and
7 actions to correct deviations from the plan and to
prevent recurrence of problems identified in the project
are taken when project goals are not achieved.
Outcomes V3.0
As a result of successful implementation of this process
1 the scope of the work for the project is defined;
2 the feasibility of achieving the goals of the project with
available resources and constraints is evaluated;
3 the activities and resources necessary to complete the
work are sized and estimated;
4 interfaces within the project, and with other projects
and organizational units, are identified and monitored;
5 plans for the execution of the project are developed,
implemented and maintained;
6 progress of the project is monitored and reported; and
7 corrective action is taken when project goals are not
achieved, and recurrence of problems identified in the
project is prevented.
• Mainly rewording but no new content
MAN.3 Project Management – Outcomes
Base Practices V2.5
1 Define the scope of work.
2 Define project life cycle
3 Determine and maintain estimates for project
Attributes
4 Define project activities
5 Define skill needs
6 Define and maintain project schedule
7 Identify and monitor project interfaces
8 Establish project plan
9 Implement the project plan
10 Monitor project attributes
11 Review and report progress of the project
12 Act to correct deviations
Base Practices V3.0
1 Define the scope of work.
2 Define project life cycle
3 Evaluate feasibility of the project
4 Define, monitor and adjust project activities
5 Determine, monitor and adjust project estimates and
resources
6 Ensure required skills, knowledge, and experience
7 Identify, monitor and adjust project interfaces and
agreed commitments
8 Define, monitor and adjust project schedule
9 Ensure consistency
10 Review and report progress of the project
Link to 3, 4, 5, 6, 7, 10
MAN.3 Project Management – Base Practices
Changes in V3.0 – MAN.3 Project Management
• “Establish project plan” and “Implement the project plan” have been deleted, as they
caused confusion in the past
• Instead, all aspects of planning have to be identified, monitored and adjusted
(estimates, activities, schedules, plans, interfaces and commitments)
• Across all artifacts, consistency has to be established (specific BP) – no traceability,
just consistency
• The scope of work used to contain a check of feasibility; in 3.0 a specific BP for
feasibility has been introduced
• The project plan 08-12 is still an output work product of MAN.3 (see Annex B), but the
references to risk management were removed. However, risks are mentioned in
BP5, and MAN.5 is referenced in the corresponding Note 6
Outcomes V2.5
As a result of successful implementation of this process
1 a strategy for conducting quality assurance is
developed, implemented and maintained;
2 quality assurance is performed independent of the
activity or project being performed;
3 evidence of quality assurance is produced and
maintained;
4 adherence of products, processes and activities to
agreed requirements are verified, documented, and
communicated to the relevant parties;
5 problems and/or non-conformance with agreed
requirements are identified, recorded, communicated
to the relevant parties, tracked and resolved; and
6 quality assurance has the independence and authority
to escalate problems to appropriate levels of
management.
Outcomes V3.0
As a result of successful implementation of this process
1 a strategy for performing quality assurance is
developed, implemented, and maintained;
2 quality assurance is performed independently and
objectively without conflicts of interest;
3 non-conformances of work products, processes, and
process activities with relevant requirements are
identified, recorded, communicated to the relevant
parties, tracked, resolved, and further prevented;
4 conformance of work products, processes and
activities with relevant requirements is verified,
documented, and communicated to the relevant
parties;
5 authority to escalate non-conformances to appropriate
levels of management is established; and
6 management ensures that escalated non-
conformances are resolved.
• Focus is more on checking conformance and ensuring resolution of non-conformances
SUP.1 Quality Assurance – Outcomes
Base Practices V2.5
1 Develop project quality assurance strategy
2 Develop and maintain an organisation structure
which ensures that quality assurance is carried out
and reported independently
3 Develop and implement a plan for project quality
assurance based on a quality assurance strategy
4 Maintain evidence of quality assurance
5 Assure quality of work products
6 Assure quality of process activities
7 Track and record quality assurance activities
8 Report quality assurance activities and results
9 Ensure resolution on non-conformances
10 Implement an escalation mechanism
Base Practices V3.0
1 Develop project quality assurance strategy
2 Assure quality of work products
3 Assure quality of process activities
4 Summarize and communicate quality assurance
activities and results
5 Ensure resolution of non-conformances
6 Implement an escalation mechanism
SUP.1 Quality Assurance – Base Practices
Seite 47
Changes in V3.0 – SUP.1 Quality Assurance
• Overall, the process has been simplified while the content remains similar
• In addition to independence, objectivity of quality assurance is now required.
• It is clarified that escalation has to lead to management attention and actions.
Seite 48
Outcomes V2.5
As a result of successful implementation of this process
1 a configuration management strategy is developed;
2 all items generated by a process or project are
identified, defined and baselined according to the
configuration management strategy;
3 modifications and releases of the items are controlled;
4 modifications and releases are made available to
affected parties;
5 the status of the items and modification requests are
recorded and reported;
6 the completeness and consistency of the items is
ensured; and
7 storage, handling and delivery of the items are
controlled.
Outcomes V3.0
As a result of successful implementation of this process
1 a configuration management strategy is developed;
2 all configuration items generated by a process or
project are identified, defined and baselined according
to the configuration management strategy;
3 modifications and releases of the configuration items
are controlled;
4 modifications and releases are made available to
affected parties;
5 the status of the configuration items and modifications
is recorded and reported;
6 the completeness and consistency of the baselines is
ensured; and
7 storage of the configuration items is controlled.
• Some rewording but in essence the outcomes have not changed
SUP.8 Configuration Management – Outcomes
Seite 49
Base Practices V2.5
1 Develop a configuration management strategy
2 Identify configuration items
3 Establish a configuration management system
4 Establish branch management strategy
5 Establish baselines
6 Maintain configuration item description
7 Control modifications and releases
8 Maintain configuration item history
9 Report configuration status
10 Verify the information about configured items
11 Manage the backup, storage, archiving, handling
and delivery of configuration items
Base Practices V3.0
1 Develop a configuration management strategy
2 Identify configuration items
3 Establish a configuration management system
4 Establish branch management strategy
5 Control modifications and releases
6 Establish baselines
7 Report configuration status
8 Verify the information about configured items
9 Manage the storage of configuration items and
baselines
SUP.8 Configuration Management – Base Practices
Seite 50
Changes in V3.0 – SUP.8 Configuration Management
• Overall, the process has been simplified while the content remains similar
• No new content
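The baselining aspect (BP "Establish baselines" and the outcome on completeness and consistency of baselines) can be sketched in a few lines of Python. All item names, version numbers, and the helper function below are hypothetical illustrations, not taken from the standard:

```python
# Minimal sketch of a baseline as a frozen snapshot of configuration
# items (SUP.8 "Establish baselines"). Names/versions are illustrative.

items = {"SW-REQ-SPEC": "1.3", "SW-ARCH": "0.9", "SRC-MAIN": "2.1"}

def create_baseline(configuration_items):
    """Freeze the current item versions so the baseline cannot drift."""
    return frozenset(configuration_items.items())

bl_a = create_baseline(items)
items["SRC-MAIN"] = "2.2"        # later modification, controlled separately
bl_b = create_baseline(items)

# Consistency check: which items changed between the two baselines?
changed = {name for name, _ in bl_b - bl_a}
print(changed)  # {'SRC-MAIN'}
```

The frozen snapshot means an established baseline cannot drift; later modifications lead to a new baseline that can be diffed against the old one for the completeness/consistency check.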
Seite 51
Outcomes V2.5
As a result of successful implementation of this process
1 a problem management strategy is developed;
2 problems are recorded, identified and classified;
3 problems are analysed and assessed to identify
acceptable solution(s);
4 problem resolution is implemented;
5 problems are tracked to closure; and
6 the status of all problem reports is known
Outcomes V3.0
As a result of successful implementation of this process
1 a problem resolution management strategy is
developed;
2 problems are recorded, uniquely identified and
classified;
3 problems are analyzed and assessed to identify an
appropriate solution;
4 problem resolution is initiated;
5 problems are tracked to closure; and
6 the status of problems and their trend are known
• Some rewording but in essence the outcomes have not changed
SUP.9 Problem Resolution Management – Outcomes
Seite 52
Base Practices V2.5
1 Develop a problem resolution management strategy
2 Establish a consistent problem resolution
management proceeding
3 Identify and record the problem
4 Investigate and diagnose the cause and the impact
of the problem
5 Execute urgent resolution action, where necessary
6 Raise alert notifications, where necessary
7 Initiate change request
8 Track problems to closure
9 Analyze problem trends
Base Practices V3.0
1 Develop a problem resolution management strategy
2 Identify and record the problem
3 Record the status of problems
4 Diagnose the cause and determine the impact of the
problem
5 Authorize urgent resolution action
6 Raise alert notifications
7 Initiate problem resolution
8 Track problems to closure
9 Analyze problem trends
SUP.9 Problem Resolution Management – Base Practices
Seite 53
Changes in V3.0 – SUP.9 Problem Resolution Management
• Only minor changes regarding terminology
• A change request no longer has to be initiated (however, problems still have to be
tracked to closure, and initiating a change request may be one way to do so)
• Accordingly, there are no planning aspects on Level 1 anymore
• The proceeding is part of the strategy/plan.
Seite 54
Outcomes V2.5
As a result of successful implementation of this process
1 a change management strategy is developed;
2 requests for changes are recorded and identified;
3 dependencies and relationships to other change
requests are identified;
4 criteria for confirming implementation of the change
request are defined;
5 requests for change are analysed, prioritized, and
resource requirements estimated;
6 changes are approved on the basis of priority and
availability of resources;
7 approved changes are implemented and tracked to
closure; and
8 the status of all change requests is known
Outcomes V3.0
As a result of successful implementation of this process
1 a change request management strategy is developed;
2 requests for changes are recorded and identified;
3 dependencies and relationships to other change
requests are identified;
4 criteria for confirming implementation of change
requests are defined;
5 requests for change are analyzed, and resource
requirements are estimated;
6 changes are approved and prioritized on the basis of
analysis results and availability of resources;
7 approved changes are implemented and tracked to
closure;
8 the status of all change requests is known; and
9 bi-directional traceability is established between
change requests and affected work products
• Major change: traceability to affected work products is now included.
• Other than that, no major changes
SUP.10 Change Request Management – Outcomes
Seite 55
Base Practices V2.5
1 Develop a change request management strategy
2 Establish a consistent change request management
proceeding
3 Identify and record the change request
4 Record the status of change requests
5 Establish the dependencies and relationships to
other change requests
6 Assess the impact of the change
7 Analyze and prioritize change requests
8 Approve change requests before implementation
9 Identify and plan the verification and validation
activities to be performed for implemented changes
10 Schedule and allocate the change request
11 Review the implemented change
12 Change requests are tracked until closure
Base Practices V3.0
1 Develop a change request management strategy
2 Identify and record the change requests
3 Record the status of change requests
4 Analyze and assess change requests
5 Approve change requests before implementation
6 Review the implementation of change requests
7 Track change requests to closure
8 Establish bidirectional traceability
Has been moved to Level 2
New BP/aspect of SUP.10
SUP.10 Change Request Management – Base Practices
Seite 56
Changes in V3.0 – SUP.10 Change Request Management
• There are two major changes:
• The planning aspects (scheduling and planning of verification and validation) have been
moved to Level 2
• The traceability between change requests and affected work products has been
introduced
• Other than that there are no changes except for wording.
• The proceeding is part of the strategy/plan.
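The new traceability requirement (BP "Establish bidirectional traceability" between change requests and affected work products) can be illustrated with a minimal sketch. The data model, IDs, and helper function are hypothetical, not prescribed by Automotive SPICE:

```python
# Minimal sketch of bidirectional traceability between change
# requests and affected work products (SUP.10). Names are illustrative.

# Forward links: change request -> affected work products
cr_to_wp = {
    "CR-001": ["SYS-REQ-12", "SW-REQ-40"],
    "CR-002": ["SW-ARCH-7"],
}

# Derive the backward direction so both can be navigated
wp_to_cr = {}
for cr, wps in cr_to_wp.items():
    for wp in wps:
        wp_to_cr.setdefault(wp, []).append(cr)

def affected_by(work_product):
    """Backward trace: which change requests touch this work product?"""
    return wp_to_cr.get(work_product, [])

print(affected_by("SW-REQ-40"))  # ['CR-001']
```

Keeping both directions navigable is the point of "bidirectional": impact analysis goes forward (CR to work products), while reviews of a work product go backward (work product to CRs).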
Seite 57
Outcomes V2.5
As a result of successful implementation of this process
1 joint activities between the customer and the supplier
are performed as needed;
2 all information, agreed upon for exchange, is
transferred between the supplier and the customer;
3 information on progress is exchanged regularly with
the supplier;
4 performance of the supplier is monitored against the
agreed requirements; and
5 changes to the agreement, if needed, are negotiated
between the customer and the supplier and
documented with the agreement
Outcomes V3.0
As a result of successful implementation of this process
1 joint activities, as agreed between the customer and
the supplier, are performed as needed;
2 all information, agreed upon for exchange, is
communicated regularly between the supplier and
customer;
3 performance of the supplier is monitored against the
agreements; and
4 changes to the agreement, if needed, are negotiated
between the customer and the supplier and
documented in the agreement
• Some rewording but in essence the outcomes have not changed
ACQ.4 Supplier Monitoring – Outcomes
Seite 58
Base Practices V2.5
1 Agree on joint processes and joint interfaces
2 Exchange all relevant information
3 Review technical development with the supplier
4 Review progress of the supplier
5 Track open items
6 Act to correct deviations
7 Agree on changes
Base Practices V3.0
1 Agree on and maintain joint processes
2 Exchange all agreed information
3 Review technical development with the supplier
4 Review progress of the supplier
5 Act to correct deviations
Linked to 1, 3, 4 and 5
ACQ.4 Supplier Monitoring – Base Practices
Seite 59
Changes in V3.0 – ACQ.4 Supplier Monitoring
• No major changes, but the approach was simplified
Seite 61
Discuss in small working groups and document the results on a flip chart:
• What are the major changes? (summarize)
• Where do you see barriers in changing your process to ASPICE 3.0 compliance?
• How do you estimate the investment (higher / same / less)? Why?
• What will be the impact on assessments (imagine: now based on 3.0 instead
of 2.x)?
Group Work / Discussion
Seite 62
Outcomes V2.5 – ENG.2
As a result of successful implementation of this process
1 a defined set of system requirements is established;
2 system requirements are categorized and analyzed for
correctness and testability;
3 the impact of the system requirements on the
operating environment is evaluated;
4 prioritization for implementing the system requirements
is defined;
5 the system requirements are approved and updated as
needed;
6 consistency and bilateral traceability are established
between customer requirements and system
requirements;
7 changes to the customer’s requirements baseline are
evaluated for cost, schedule and technical impact; and
8 the system requirements are communicated to all
affected parties and baselined
Outcomes V3.0
As a result of successful implementation of this process
1 a defined set of system requirements is established;
2 system requirements are categorized and analyzed for
correctness and verifiability;
3 the impact of system requirements on the operating
environment is analyzed;
4 prioritization for implementing the system requirements
is defined;
5 the system requirements are updated as needed;
6 consistency and bidirectional traceability are
established between stakeholder requirements and
system requirements;
7 the stakeholder requirements are evaluated for cost,
schedule and technical impact; and
8 the system requirements are agreed and
communicated to all affected parties
• Some rewording but in essence the outcomes have not changed
SYS.2 System Requirements Analysis – Outcomes
Seite 63
Base Practices V2.5 – ENG.2
1 Identify System Requirements
2 Analyze system requirements
3 Determine the impact on the operating environment
4 Prioritize and categorize system requirements
5 Evaluate and update system requirements
6 Ensure consistency and bilateral traceability of
customer requirements to system requirements
7 Communicate system requirements
Base Practices V3.0
1 Specify system requirements
2 Structure system requirements
3 Analyze system requirements
4 Analyze the impact on the operating environment
5 Develop verification criteria
6 Establish bidirectional traceability
7 Ensure consistency
8 Communicate agreed system requirements
Linked to 1-8
SYS.2 System Requirements Analysis – Base Practices
Seite 64
Changes in V3.0 – SYS.2 System Requirements Analysis
• Traceability and consistency are separated (see key concepts)
• Verification criteria explicitly required in BP5
• Other than that mainly rewording
Seite 65
Outcomes V2.5 – ENG.3
As a result of successful implementation of this process
1 a system architectural design is defined that identifies
the elements of the system and meets the defined
systems requirements;
2 the system requirements are allocated to the elements
of the system;
3 internal and external interfaces of each system
element are defined;
4 verification between the system requirements and the
system architectural design is performed;
5 consistency and bilateral traceability are established
between system requirements and system
architectural design; and
6 the system requirements, the system architectural
design, and their relationships are baselined and
communicated to all affected parties.
Outcomes V3.0
As a result of successful implementation of this process
1 a system architectural design is defined that identifies
the elements of the system;
2 the system requirements are allocated to the elements
of the system;
3 the interfaces of each system element are defined;
4 the dynamic behavior objectives of the system
elements are defined;
5 consistency and bidirectional traceability are
established between system requirements and system
architectural design; and
6 the system architectural design is agreed and
communicated to all affected parties
• New aspect: dynamic behavior; other than that, the outcomes have not changed in essence
New aspect of system architecture
SYS.3 System Architectural Design – Outcomes
Seite 66
Base Practices V2.5 – ENG.3
1 Define system architectural design
2 Allocate System Requirements
3 Define Interfaces
4 Develop verification criteria
5 Verify System Architectural Design
6 Ensure consistency and bilateral traceability of
system requirements to system architectural design
7 Communicate system architectural design
Base Practices V3.0
1 Develop system architectural design
2 Allocate system requirements
3 Define interfaces of system elements
4 Describe dynamic behavior
5 Evaluate alternative system architectures
6 Establish bidirectional traceability
7 Ensure consistency
8 Communicate agreed system architectural design
New aspects of the system architecture process
SYS.3 System Architectural Design – Base Practices
Seite 67
Changes in V3.0 – SYS.3 System Architectural Design
• New:
• Dynamic behavior has to be explicitly addressed (before only implicitly)
• Design alternatives have to be documented
• Traceability and consistency are separated (see key concepts)
• No verification criteria required anymore
• Review only for consistency check, otherwise on Level 2
Seite 68
Outcomes V2.5 – ENG.9
As a result of successful implementation of this process
1 a system integration and system integration test strategy is
developed for system elements consistent with the system
architectural design according to the priorities and
categorization of the system requirements;
2 a test specification for system integration test is developed
to verify compliance with the system architectural design,
including the interfaces between system elements;
3 a system is integrated as defined by the
integration strategy;
4 the integrated system elements are verified using the test
cases;
5 results of system integration testing are recorded;
6 consistency and bilateral traceability are established
between system architectural design and system integration
test specification including test cases; and
7 a regression strategy is developed and applied for re-
testing the system elements when changes are made
Outcomes V3.0
As a result of successful implementation of this process
1 a system integration strategy consistent with the project
plan, the release plan and the system architectural design
is developed to integrate the system items;
2 a system integration test strategy including the regression
test strategy is developed to test the system item
interactions;
3 a specification for system integration test according to the
system integration test strategy is developed that is suitable
to provide evidence for compliance of the integrated system
items with the system architectural design, including the
interfaces between system items;
4 system items are integrated up to a complete integrated
system according to the integration strategy;
5 test cases included in the system integration test
specification are selected according to the system
integration test strategy and the release plan;
6 system item interactions are tested using the selected test
cases and the results of system integration testing are
recorded;
7 consistency and bidirectional traceability between the
elements of the system architectural design and test cases
included in the system integration test specification and
bidirectional traceability between test cases and test results
is established; and
8 results of the system integration test are summarized and
communicated to all affected parties
The appropriate test cases need to be selected
to support the test strategy (incl. the regression
test strategy)
SYS.4 System Integration and Integration Test – Outcomes
Seite 69
Base Practices V2.5 – ENG.9
1 Develop system integration strategy
2 Develop system integration test strategy
3 Develop a test specification for system integration
4 Integrate system elements
5 Verify the integrated system
6 Record the results of system integration testing
7 Ensure consistency and bilateral traceability of
system architectural design to the system integration
test specification
8 Develop regression testing strategy and perform
regression testing
Base Practices V3.0
1 Develop system integration strategy
2 Develop system integration test strategy including
regression test strategy
3 Develop specification for system integration test
4 Integrate system items
5 Select test cases
6 Perform system integration test
7 Establish bidirectional traceability
8 Ensure consistency
9 Summarize and communicate results
New aspects of the system integration and integration
test process
SYS.4 System Integration and Integration Test – Base Practices
Seite 70
Changes in V3.0 – SYS.4 System Integration and Integration Test
• Test cases have to be selected according to the test strategy including the regression
test strategy
• Traceability and consistency are separated (see key concepts)
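The new selection step (BP "Select test cases" according to the test strategy, including the regression test strategy) could look roughly like the sketch below. The test-case records, interface IDs, and selection rule are hypothetical assumptions for illustration only:

```python
# Minimal sketch of selecting integration test cases according to a
# regression test strategy (SYS.4 "Select test cases"). The rule
# "changed interfaces plus a smoke set" is an assumed example strategy.

test_cases = [
    {"id": "TC-1", "covers": "IF-CAN-1", "smoke": True},
    {"id": "TC-2", "covers": "IF-CAN-2", "smoke": False},
    {"id": "TC-3", "covers": "IF-LIN-1", "smoke": False},
]

def select(changed_interfaces, run_smoke=True):
    """Pick test cases touching changed interfaces, plus the smoke set."""
    return [tc["id"] for tc in test_cases
            if tc["covers"] in changed_interfaces
            or (run_smoke and tc["smoke"])]

print(select({"IF-LIN-1"}))  # ['TC-1', 'TC-3']
```

The point is that the selection is derived from the strategy and the release plan rather than always re-running the full suite; the same pattern applies to SYS.5 qualification testing.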
Seite 71
Outcomes V2.5 – ENG.10
As a result of successful implementation of this process
1 a strategy is developed to test the system according to
the priorities of and categorization of the system
requirements;
2 a test specification for system test of the integrated
system is developed that demonstrates compliance
with the system requirements;
3 the integrated system is verified using the test cases;
4 results of system testing are recorded;
5 consistency and bilateral traceability are established
between system requirements and the system test
specification including test cases; and
6 a regression test strategy is developed and applied for
re-testing the integrated system when a change in
system elements is made
Outcomes V3.0
As a result of successful implementation of this process
1 a system qualification test strategy including
regression test strategy consistent with the project plan
and release plan is developed to test the integrated
system;
2 a specification for system qualification test of the
integrated system according to the system qualification
test strategy is developed that is suitable to provide
evidence for compliance with the system requirements;
3 test cases included in the system qualification test
specification are selected according to the system
qualification test strategy and the release plan;
4 the integrated system is tested using the selected test
cases and the results of system qualification test are
recorded;
5 consistency and bidirectional traceability are
established between system requirements and test
cases included in the system qualification test
specification and between test cases and test results;
and
6 results of the system qualification test are summarized
and communicated to all affected parties
The appropriate test cases need to be selected
to support the test strategy (incl. the regression
test strategy)
SYS.5 System Qualification Test – Outcomes
Seite 72
Base Practices V2.5 – ENG.10
1 Develop system test strategy
2 Develop test specification for system test
3 Verify integrated system
4 Record the results of system testing
5 Ensure consistency and bilateral traceability of
system requirements to the systems test
specification
6 Develop system regression test strategy and
perform testing
Base Practices V3.0
1 Develop system qualification test strategy including
regression test strategy
2 Develop specification for system qualification test
3 Select test cases
4 Test integrated system
5 Establish bidirectional traceability
6 Ensure consistency
7 Summarize and communicate results
New aspects of the system qualification test process
SYS.5 System Qualification Test – Base Practices
Seite 73
Changes in V3.0 – SYS.5 System Qualification Test
• New name: system test → system qualification test (from ISO/IEC 15504-5:2012)
• Test cases have to be selected according to the test strategy including the regression
test strategy
• Traceability and consistency are separated (see key concepts)
Seite 74
Outcomes V2.5 – ENG.4
As a result of successful implementation of this process
1 the software requirements to be allocated to the
software elements of the system and their interfaces
are defined;
2 software requirements are categorized and analyzed
for correctness and testability;
3 the impact of software requirements on the operating
environment is evaluated;
4 prioritization for implementing the software
requirements is defined;
5 the software requirements are approved and updated
as needed;
6 consistency and bilateral traceability are established
between system requirements and software
requirements; and consistency and bilateral traceability
are established between system architectural
design and software requirements;
7 changes to the software requirements are evaluated
for cost, schedule and technical impact; and
8 the software requirements are baselined and
communicated to all affected parties
Outcomes V3.0
As a result of successful implementation of this process
1 the software requirements to be allocated to the
software elements of the system and their interfaces
are defined;
2 software requirements are categorized and analyzed
for correctness and verifiability;
3 the impact of software requirements on the operating
environment is analyzed;
4 prioritization for implementing the software
requirements is defined;
5 the software requirements are updated as needed;
6 consistency and bidirectional traceability are
established between system requirements and
software requirements; and consistency and
bidirectional traceability are established between
system architectural design and software
requirements;
7 the software requirements are evaluated for cost,
schedule and technical impact; and
8 the software requirements are agreed and
communicated to all affected parties
• Some rewording but in essence the outcomes have not changed
SWE.1 Software Requirements Analysis – Outcomes
Seite 75
Base Practices V2.5 – ENG.4
1 Identify software requirements
2 Analyze software requirements
3 Determine the impact on the operating environment
4 Prioritize and categorize software requirements
5 Evaluate and update software requirements
6 Ensure consistency and bilateral traceability of
system requirements to software requirements
7 Ensure consistency and bilateral traceability of
system architectural design to software requirements
8 Communicate software requirements
Base Practices V3.0
1 Specify software requirements
2 Structure software requirements
3 Analyze software requirements
4 Analyze the impact on the operating environment
5 Develop verification criteria
6 Establish bidirectional traceability
7 Ensure consistency
8 Communicate agreed software requirements
Linked to 1-8
SWE.1 Software Requirements Analysis – Base Practices
Seite 76
Changes in V3.0 – SWE.1 Software Requirements Analysis
• Traceability and consistency are separated (see key concepts)
• Verification criteria explicitly required in BP5
• Other than that mainly rewording
Seite 77
Outcomes V2.5 – ENG.5
As a result of successful implementation of this process
1 a software architectural design is defined that identifies
the components of the software and meets the defined
software requirements; (ENG.5 – 1)
2 the software requirements are allocated to the
elements of the software; (ENG.5 – 2)
3 internal and external interfaces of each software
component are defined; (ENG.5 – 3)
4 the dynamic behaviour and resource consumption
objectives of the software components are defined;
(ENG.5 – 4)
5 consistency and bilateral traceability are established
between software requirements and software
architectural design; (ENG.5 – 6)
Outcomes V3.0
As a result of successful implementation of this process
1 a software architectural design is defined that identifies
the elements of the software;
2 the software requirements are allocated to the
elements of the software;
3 the interfaces of each software element are defined;
4 the dynamic behavior and resource consumption
objectives of the software elements are defined;
5 consistency and bidirectional traceability are
established between software requirements and
software architectural design; and
6 the software architectural design is agreed and
communicated to all affected parties
New aspect of software architecture
• Outcomes between SWE.2 and SWE.3 (formerly ENG.5 and ENG.6) have been rearranged
SWE.2 Software Architectural Design – Outcomes
Seite 78
Base Practices V2.5 – ENG.5
1 Develop software architectural design (ENG.5 –
BP1)
2 Allocate software requirements (ENG.5 – BP2)
3 Define interfaces (ENG.5 – BP3)
4 Describe dynamic behaviour (ENG.5 – BP4)
5 Define resource consumption objectives (ENG.5 –
BP5)
6 Develop Verification Criteria (ENG.5 – BP7)
7 Verify Software Design (ENG.5 – BP8)
8 Ensure consistency and bilateral traceability of
software requirements to software architectural
design (ENG.5 – BP9)
Base Practices V3.0
1 Develop software architectural design
2 Allocate software requirements
3 Define interfaces of software elements
4 Describe dynamic behavior
5 Define resource consumption objectives
6 Evaluate alternative software architectures
7 Establish bidirectional traceability
8 Ensure consistency
9 Communicate agreed software architectural design
New aspects of the software architecture process
SWE.2 Software Architectural Design – Base Practices
Seite 79
Changes in V3.0 – SWE.2 Software Architectural Design
• New:
• Design alternatives have to be documented
• The architecture has to be agreed upon and communicated
• The processes of design and unit construction (ENG.5/6) have been split into three different
processes (SWE.2 – 4)
• Traceability and consistency are separated (see key concepts)
• Verification criteria not required explicitly
• Review only for consistency check, otherwise on Level 2
Seite 80
Outcomes V2.5 – ENG.5/6
As a result of successful implementation of this process
1 a detailed design is developed that describes software
units that can be implemented and tested; (ENG.5 – 5)
2 internal and external interfaces of each software
component are defined; (ENG.5 – 3)
3 the dynamic behaviour and resource consumption
objectives of the software components are defined;
(ENG.5 – 4)
4 consistency and bilateral traceability are established
between software architectural design and software
detailed design. (ENG.5 – 7)
5 software units defined by the software design are
produced (ENG.6 – 3)
6 consistency and bilateral traceability are established
between software detailed design and software units;
(ENG.6 – 6)
Outcomes V3.0
As a result of successful implementation of this process
1 a detailed design is developed that describes software
units;
2 interfaces of each software unit are defined;
3 the dynamic behavior of the software units is defined;
4 consistency and bidirectional traceability are
established between software requirements and
software units; and consistency and bidirectional
traceability are established between software
architectural design and software detailed design; and
consistency and bidirectional traceability are
established between software detailed design and
software units;
5 the software detailed design and the relationship to the
software architectural design is agreed and
communicated to all affected parties; and
6 software units defined by the software detailed design
are produced
• Outcomes between SWE.2 and SWE.3 (formerly ENG.5 and ENG.6) have been rearranged
New aspect of software detailed design
SWE.3 Software Detailed Design and Unit Construction – Outcomes
Seite 81
Base Practices V2.5 – ENG.5/6
1 Develop detailed design (ENG.5 – BP6)
2 Define interfaces (ENG.5 – BP3)
3 Describe dynamic behaviour (ENG.5 – BP4)
4 Analyze software units (ENG.6 – BP2)
5 Prioritize and categorize software units (ENG.6 –
BP3)
6 Develop Verification Criteria (ENG.5 – BP7)
7 Verify Software Design (ENG.5 – BP8)
8 Ensure consistency and bilateral traceability of
software architectural design to software detailed
design (ENG.5 – BP10)
9 Ensure consistency and bilateral traceability of
software detailed design to software units (ENG.6 –
BP8)
10 Ensure consistency and bilateral traceability of
software requirements to software units (ENG.6 –
BP9)
11 Develop software units (ENG.6 – BP4)
Base Practices V3.0
1 Develop software detailed design
2 Define interfaces of software units
3 Describe dynamic behavior
4 Evaluate software detailed design
5 Establish bidirectional traceability
6 Ensure consistency
7 Communicate agreed software detailed design
8 Develop software units
New aspects of SWE.3
Linked to BPs 5 and 6
SWE.3 Software Detailed Design and Unit Construction – Base
Practices
Seite 82
Changes in V3.0 – SWE.3 Software Detailed Design and Unit
Construction
• New:
• The design has to be agreed upon and communicated
• The processes of design and unit construction (ENG.5/6) have been split into three different
processes (SWE.2 – 4)
• Analysis and prioritization of the detailed design/units (ENG.6 BP2/3) is covered in the
evaluation of the detailed design (SWE.3 BP4)
• Traceability and consistency are separated (see key concepts)
• Verification criteria not explicitly required anymore
• Review only for consistency check, otherwise on Level 2
Seite 83
Outcomes V2.5 – ENG.6
As a result of successful implementation of this process
1 a unit verification strategy is developed for software
units consistent with the software design;
(ENG.6 – 1)
2 software units defined by the software design are
analyzed for correctness and testability;
(ENG.6 – 2)
3 software units are verified according to the unit
verification strategy;
(ENG.6 – 4)
4 results of unit verification are recorded; (ENG.6 – 5)
and
5 consistency and bilateral traceability are established
between software detailed design and software units;
(ENG.6 – 6)
Outcomes V3.0
As a result of successful implementation of this process
1 a software unit verification strategy including
regression strategy is developed to verify the software
units;
2 criteria for software unit verification are developed
according to the software unit verification strategy that
are suitable to provide evidence for compliance of the
software units with the software detailed design and
with the non-functional software requirements;
3 software units are verified according to the software
unit verification strategy and the defined criteria for
software unit verification and the results are recorded;
4 consistency and bidirectional traceability are
established between software units, criteria for
verification and verification results; and
5 results of the unit verification are summarized and
communicated to all affected parties.
• Outcomes between SWE.3 and SWE.4 (formerly ENG.6) have been rearranged
New aspect of software unit verification
SWE.4 Software Unit Verification – Outcomes
Seite 84
Base Practices V2.5 – ENG.6
1 Define a unit verification strategy (ENG.6 – BP1)
2 Analyze software units (ENG.6 – BP2)
3 Prioritize and categorize software units (ENG.6 –
BP3)
4 Develop unit verification criteria (ENG.6 – BP5)
5 Verify software units (ENG.6 – BP6)
6 Record the results of unit verification (ENG.6 – BP7)
7 Ensure consistency and bilateral traceability of
software units to test specification for software units
(ENG.6 – BP10)
Base Practices V3.0
1 Develop software unit verification strategy including
regression strategy
2 Develop criteria for unit verification
3 Perform static verification of software units
4 Test software units
5 Establish bidirectional traceability
6 Ensure consistency
7 Summarize and communicate results
New aspects of the software unit verification process
Covered in SWE.3
SWE.4 Software Unit Verification – Base Practices
Seite 85
Changes in V3.0 – SWE.4 Software Unit Verification
• New:
• The processes of design and unit construction (ENG.5/6) have been split into three different
processes (SWE.2 – 4)
• All verification activities on Unit Level are covered in this process
• Traceability and consistency are separated (see key concepts)
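The SWE.4 base practices (develop criteria, test units, record and summarize results) can be sketched in a few lines. This is an illustrative example only: the unit, the criterion IDs, and the summary format are hypothetical, and a real project would use its unit test framework and test management tooling instead.

```python
def saturate(value, lo, hi):
    """Software unit under test: clamp value into [lo, hi]."""
    return max(lo, min(hi, value))

# Criterion ID -> test case. Keeping the mapping explicit gives
# traceability in both directions (cf. SWE.4 BP5): every criterion
# has a test case and every test case names its criterion.
test_cases = {
    "UVC-001": lambda: saturate(5, 0, 10) == 5,    # in-range value unchanged
    "UVC-002": lambda: saturate(-3, 0, 10) == 0,   # below range clamps to lo
    "UVC-003": lambda: saturate(42, 0, 10) == 10,  # above range clamps to hi
}

# Record results per criterion (cf. BP4) and summarize for
# communication to affected parties (cf. BP7).
results = {cid: case() for cid, case in test_cases.items()}
summary = f"{sum(results.values())}/{len(results)} unit test cases passed"
```

The point of the sketch is the structure: verification criteria exist before testing, each result is recorded against its criterion, and a summary is produced rather than raw logs alone.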
Seite 86
Outcomes V2.5 – ENG.7
As a result of successful implementation of this process
1 a software integration and integration test strategy is
developed for software items consistent with the software
design according to the priorities and categorization of the
software requirements;
2 a test specification for software integration is developed that
ensures compliance with the software architectural design and
software detailed design allocated to the items;
3 software units and software items are integrated as defined by
the integration strategy;
4 integrated software items are verified using the test cases;
5 results of software integration testing are recorded;
6 consistency and bilateral traceability are established between
software architectural design and software detailed design to
software integration test specification including test cases; and
7 a regression strategy is developed and applied for re-
integrating and re-verifying software items when a change in
software items (including associated requirements, design and
code) occurs
Outcomes V3.0
As a result of successful implementation of this process
1 a software integration strategy consistent with the project plan,
release plan and the software architectural design is
developed to integrate the software items;
2 a software integration test strategy including the regression
test strategy is developed to test the software unit and
software item interactions;
3 a specification for software integration test according to the
software integration test strategy is developed that is suitable
to provide evidence for compliance of the integrated software
items with the software architectural design, including the
interfaces between the software units and between the
software items;
4 software units and software items are integrated up to a
complete integrated software according to the integration
strategy;
5 Test cases included in the software integration test
specification are selected according to the software integration
test strategy, and the release plan;
6 integrated software items are tested using the selected test
cases and the results of software integration test are recorded;
7 consistency and bidirectional traceability are established
between the elements of the software architectural design and
the test cases included in the software integration test
specification and between test cases and test results; and
8 results of the software integration test are summarized and
communicated to all affected parties
New aspect of software integration and integration
test
• Some rewording
SWE.5 Software Integration and Integration Test – Outcomes
Seite 87
Base Practices V2.5 – ENG.7
1 Develop software integration strategy
2 Develop software integration test strategy
3 Develop test specification for software integration test
4 Integrate software units and software items
5 Verify the integrated software
6 Record the results of software integration testing
7 Ensure consistency and bilateral traceability of
software architectural design and software detailed
design to
software integration test specification
8 Develop regression testing strategy and perform
regression testing
Base Practices V3.0
1 Develop software integration strategy
2 Develop software integration test strategy including
regression test strategy
3 Develop specification for software integration test
4 Integrate software units and software items
5 Select test cases
6 Perform software integration test
7 Establish bidirectional traceability
8 Ensure consistency
9 Summarize and communicate results
New aspects of the software integration and
integration test process
SWE.5 Software Integration and Integration Test – Base Practices
Seite 88
Changes in V3.0 – SWE.5 Software Integration and Integration Test
• New:
• Selection of test cases based on the test strategy
• Traceability and consistency are separated (see key concepts)
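The new "select test cases" practice (SWE.5 BP5) is easy to picture as a query over the traceability data. The sketch below shows one possible regression strategy, selecting the integration test cases traced to software items changed in the current release; all IDs and the shape of the traceability table are hypothetical.

```python
# Test case -> software items whose interaction it covers
# (in practice this mapping comes from the test management or ALM tool).
traceability = {
    "IT-010": {"com_stack", "diag_manager"},
    "IT-011": {"diag_manager"},
    "IT-012": {"nvm_driver"},
}

# Software items touched in this release, e.g. derived from change requests.
changed_items = {"diag_manager"}

# Regression strategy: re-run every test case that covers a changed item.
selected = sorted(tc for tc, items in traceability.items()
                  if items & changed_items)
```

Other strategies (risk-based selection, release-plan scoping) would change the filter condition, but the mechanism — selection driven by the documented strategy rather than ad hoc — is what BP5 asks for.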
Seite 89
Outcomes V2.5 – ENG.8
As a result of successful implementation of this process
1 a strategy is developed to test the integrated software
according to the priorities and categorization of the
software requirements;
2 a test specification for software test of the integrated
software is developed that demonstrates compliance
to the software requirements;
3 the integrated software is verified using the test cases;
4 results of software testing are recorded;
5 consistency and bilateral traceability are established
between software requirements and software test
specification including test cases; and
6 a regression test strategy is developed and applied for
re-testing the integrated software when a change in
software items occurs
Outcomes V3.0
As a result of successful implementation of this process
1 a software qualification test strategy including
regression test strategy consistent with the project plan
and release plan is developed to test the integrated
software;
2 a specification for software qualification test of the
integrated software according to the software
qualification test strategy is developed that is suitable
to provide evidence for compliance with the software
requirements;
3 test cases included in the software qualification test
specification are selected according to the software
qualification test strategy and the release plan;
4 the integrated software is tested using the selected
test cases and the results of software qualification test
are recorded;
5 consistency and bidirectional traceability are
established between software requirements and
software qualification test specification including test
cases and between test cases and test results; and
6 results of the software qualification test are
summarized and communicated to all affected parties
SWE.6 Software Qualification Test – Outcomes
The appropriate test cases need to be selected
to support the test strategy (incl. the regression
test strategy)
Seite 90
Base Practices V2.5 – ENG.8
1 Develop software test strategy
2 Develop test specification for software test
3 Verify integrated software
4 Record the results of software testing
5 Ensure consistency and bilateral traceability of
software requirements to software test specification
6 Develop regression test strategy and perform
regression testing
Base Practices V3.0
1 Develop software qualification test strategy including
regression test strategy
2 Develop specification for software qualification test
3 Select test cases
4 Test integrated software
5 Establish bidirectional traceability
6 Ensure consistency
7 Summarize and communicate results
New aspects of the software qualification test process
SWE.6 Software Qualification Test – Base Practices
Seite 91
Changes in V3.0 – SWE.6 Software Qualification Test
• New name: software test → software qualification test (from ISO 15504-5:2012)
• New:
• Selection of test cases based on the test strategy
• Traceability and consistency are separated (see key concepts)
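The separated traceability and consistency outcome of SWE.6 amounts to a check in both directions: every software requirement is covered by at least one qualification test case, and every test case traces back to a known requirement. A minimal sketch with hypothetical IDs:

```python
# Requirements baseline and test-case-to-requirement links
# (in practice exported from the requirements and test tools).
requirements = {"SRS-001", "SRS-002", "SRS-003"}
test_to_req = {"QT-01": "SRS-001", "QT-02": "SRS-002", "QT-03": "SRS-002"}

# Forward direction: requirements with no qualification test case.
covered = set(test_to_req.values())
uncovered_reqs = requirements - covered

# Backward direction: test cases pointing at unknown requirements.
dangling_tests = {t for t, r in test_to_req.items()
                  if r not in requirements}
```

A non-empty result in either set is a consistency finding; here SRS-003 would show up as uncovered and need either a test case or a justified exclusion.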
Seite 92
Vers. 3.0 / ISO 330xx
• GP 2.1.1 Identify the objectives for the
performance of the process.
• GP 2.1.2 Plan the performance of the
process to fulfill the identified objectives.
• GP 2.1.3 Monitor the performance of the
process against the plans.
• GP 2.1.4 Adjust the performance of the
process.
• GP 2.1.5 Define responsibilities and
authorities for performing the process.
• GP 2.1.6 Identify, prepare, and make
available resources to perform the process
according to plan.
• GP 2.1.7 Manage the interfaces between
involved parties.
Vers. 2.5 / ISO 15504
• GP 2.1.1 Identify the objectives for the
performance of the process.
• GP 2.1.2 Plan and monitor the
performance of the process to fulfill the
identified objectives.
• GP 2.1.3 Adjust the performance of the
process.
• GP 2.1.4 Define responsibilities and
authorities for performing the process.
• GP 2.1.5 Identify and make available
resources to perform the process
according to plan.
• GP 2.1.6 Manage the interfaces between
involved parties.
Changes to Generic Practices - CL2 (1/3)
Seite 93
• GP 2.1.2 Plan the performance of the process to fulfill the identified
objectives. [ACHIEVEMENT b]
• Plan(s) for the performance of the process are developed.
• The process performance cycle is defined.
• Key milestones for the performance of the process are established.
• Estimates for process performance attributes are determined and maintained.
• Process activities and tasks are defined.
• Schedule is defined and aligned with the approach to performing the process.
• Process work product reviews are planned.
• GP 2.1.3 Monitor the performance of the process against the plans.
[ACHIEVEMENT c]
• The process is performed according to the plan(s).
Process performance is monitored to ensure planned results are achieved and to
identify possible deviations.
Changes to Generic Practices - CL2 (2/3)
Seite 94
GP 2.1.6 Identify, prepare, and make available resources to perform the
process according to plan. [ACHIEVEMENT f, g]
• The human and infrastructure resources necessary for performing the process
are identified, made available, allocated, and used.
• The individuals performing and managing the process are prepared by
training, mentoring, or coaching to execute their responsibilities.
• The information necessary to perform the process is identified and made
available.
Changes to Generic Practices - CL2 (3/3)
Seite 95
GP 3.1.1 Define and maintain the standard process that will support the
deployment of the defined process. [ACHIEVEMENT a]
• A standard process is developed and maintained that includes the
fundamental process elements.
• …
GP 3.1.3 Identify the roles and competencies, responsibilities, and authorities
for performing the standard process. [ACHIEVEMENT c]
• Process performance roles are identified.
• Competencies for performing the process are identified.
• Authorities necessary for executing responsibilities are identified.
Changes to Generic Practices – CL3 (1/2)
Seite 96
GP 3.1.5 Determine suitable methods and measures to monitor the
effectiveness and suitability of the standard process. [ACHIEVEMENT e]
• Methods and measures for monitoring the effectiveness and suitability of the
process are determined.
• …
PA 3.2 Process deployment attribute
The process deployment attribute is a measure of the extent to which the
standard process is effectively deployed as a defined process to achieve its
process outcomes. As a result of full achievement of this attribute:
In addition, the following note was added to GP 3.2.6 Collect and analyze data
about performance of the process to demonstrate its suitability and
effectiveness. [ACHIEVEMENT f]
NOTE 1: Data about process performance may be qualitative or quantitative.
Changes to Generic Practices – CL3 (2/2)
Seite 97
Vers. 3.0 / ISO 330xx
• GP 4.1.1 Identify business goals.
• GP 4.1.2 Establish process information
needs.
• GP 4.1.3 Derive process measurement
objectives from process information needs.
• GP 4.1.4 Identify measurable relationships
between process elements.
• GP 4.1.5 Establish quantitative objectives.
• GP 4.1.6 Identify process measures that
support the achievement of the quantitative
objectives.
• GP 4.1.7 Collect product and process
measurement results through performing
the defined process.
Vers. 2.5 / ISO 15504
• GP 4.1.1 Identify process information
needs, in relation with business goals.
• GP 4.1.2 Derive process measurement
objectives from process information
needs.
• GP 4.1.3 Establish quantitative objectives
for the performance of the defined
process, according to the alignment of the
process with the business goals.
• GP 4.1.4 Identify product and process
measures that support the achievement of
the quantitative objectives for process
performance.
• GP 4.1.5 Collect product and process
measurement results through performing
the defined process.
• GP 4.1.6 Use the results of the defined
measurement to monitor and verify the
achievement of the process performance
objectives.
Changes to Generic Practices - CL4 (1/2)
Seite 98
GP 4.1.4 Identify measurable relationships between process elements.
[ACHIEVEMENT a, d]
Identify the relationships between process elements that contribute to the
derived measurement objectives.
Changes to Generic Practices - CL4 (2/2)
Seite 99
Vers. 3.0 / ISO 330xx
• GP 5.1.1 Define the process innovation
objectives for the process that support the
relevant business goals.
• GP 5.1.2 Analyze data of the process to
identify opportunities for innovation.
• GP 5.1.3 Analyze new technologies and
process concepts to identify opportunities
for innovation.
• GP 5.1.4 Define and maintain an
implementation strategy based on
innovation vision and objectives.
Vers. 2.5 / ISO 15504
• GP 5.1.1 Define the process improvement
objectives for the process that support the
relevant business goals.
• GP 5.1.2 Analyse measurement data of the
process to identify real and potential
variations in the process performance.
• GP 5.1.3 Identify improvement
opportunities of the process based on
innovation and best practices.
• GP 5.1.4 Derive improvement
opportunities of the process from new
technologies and process concepts. Impact
of new technologies on process
performance is identified and evaluated.
• GP 5.1.5 Define an implementation
strategy based on long-term improvement
vision and objectives.
Changes to Generic Practices – CL5
Seite 100
And finally…
Seite 101
• Meet the experts:
• We offer free half-day public seminars. Our experts will explain the changes
and answer your individual questions.
• For English language introductions see
www.kuglermaag.com/aspice-intro
• For German language introductions see
www.kuglermaag.de/aspice-intro
• Schedule a one day inhouse seminar:
• Our experts will explain the changes, answer your questions, and also examine
your individual situation to plan how to upgrade your organization to
Automotive SPICE 3.0 compliance.
• For update trainings see
www.kuglermaag.com/aspice-update (English version)
• www.kuglermaag.de/aspice-update (German version)
• Any other questions and concerns?
• See “contact information”
Need more advice?
Seite 102
About the Authors
Fabio Bella
• Process Director at Kugler Maag Cie, Country Manager for Italy
• intacsTM advisory board member
• intacsTM SPICE Principal Assessor, intacsTM SPICE Instructor
• TÜV Rheinland Functional Safety Engineer (Automotive)
• Volkswagen-certified Software Quality Improvement Leader (SQIL)
Dr. Klaus Hoermann
• Principal and Partner at Kugler Maag Cie
• Leader of the intacsTM working group “Exams”
• intacsTM SPICE Principal Assessor, intacsTM SPICE Instructor
• Volkswagen-certified Software Quality Improvement Leader (SQIL)
• CMMI® SCAMPI Lead Appraiser (CMMI Institute-Certified)
• CMMI® Instructor (CMMI Institute-Certified)
• Scrum Master (Scrum.org certified)
Bhaskar Vanamali
• Process Director at Kugler Maag Cie
• Member of the VDA AK13 (working on Automotive SPICE), member of the
SC7 WG10 (working on ISO 15504/ISO 33000)
• intacsTM SPICE Principal Assessor, intacsTM SPICE Instructor
• SixSigma Green Belt
• Volkswagen-certified Software Quality Improvement Leader (SQIL)
Seite 103
Contact information
KUGLER MAAG CIE GmbH
Leibnizstr. 11
70806 Kornwestheim, Germany
information@kuglermaag.com
www.kuglermaag.de
Phone +49 7154 1796 100
KUGLER MAAG CIE North America Inc.
Columbia Center, 201 W Big Beaver Rd,
Troy, MI 48084, USA
usa@kuglermaag.com
www.kuglermaag.com
Phone +1 248 687 1210
KUGLER MAAG CIE Central Eastern Europe
cee@kuglermaag.com
Phone +48 513 144 297
Seite 104
© KUGLER MAAG CIE GmbH
Thank you for your
attention.
Questions? Comments?
updated resume yathesh 18.09.2016
 

Último

design a four cylinder internal combustion engine
design a four cylinder internal combustion enginedesign a four cylinder internal combustion engine
design a four cylinder internal combustion enginepiyushsingh943161
 
Delhi Call Girls Saket 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
Delhi Call Girls Saket 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip CallDelhi Call Girls Saket 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
Delhi Call Girls Saket 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Callshivangimorya083
 
Vip Mumbai Call Girls Mumbai Call On 9920725232 With Body to body massage wit...
Vip Mumbai Call Girls Mumbai Call On 9920725232 With Body to body massage wit...Vip Mumbai Call Girls Mumbai Call On 9920725232 With Body to body massage wit...
Vip Mumbai Call Girls Mumbai Call On 9920725232 With Body to body massage wit...amitlee9823
 
Sales & Marketing Alignment_ How to Synergize for Success.pptx.pdf
Sales & Marketing Alignment_ How to Synergize for Success.pptx.pdfSales & Marketing Alignment_ How to Synergize for Success.pptx.pdf
Sales & Marketing Alignment_ How to Synergize for Success.pptx.pdfAggregage
 
John Deere 335 375 385 435 Service Repair Manual
John Deere 335 375 385 435 Service Repair ManualJohn Deere 335 375 385 435 Service Repair Manual
John Deere 335 375 385 435 Service Repair ManualExcavator
 
Business Bay Escorts $#$ O56521286O $#$ Escort Service In Business Bay Dubai
Business Bay Escorts $#$ O56521286O $#$ Escort Service In Business Bay DubaiBusiness Bay Escorts $#$ O56521286O $#$ Escort Service In Business Bay Dubai
Business Bay Escorts $#$ O56521286O $#$ Escort Service In Business Bay DubaiAroojKhan71
 
What Causes BMW Chassis Stabilization Malfunction Warning To Appear
What Causes BMW Chassis Stabilization Malfunction Warning To AppearWhat Causes BMW Chassis Stabilization Malfunction Warning To Appear
What Causes BMW Chassis Stabilization Malfunction Warning To AppearJCL Automotive
 
Delhi Call Girls Vikaspuri 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
Delhi Call Girls Vikaspuri 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip CallDelhi Call Girls Vikaspuri 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
Delhi Call Girls Vikaspuri 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Callshivangimorya083
 
Why Won't Your Subaru Key Come Out Of The Ignition Find Out Here!
Why Won't Your Subaru Key Come Out Of The Ignition Find Out Here!Why Won't Your Subaru Key Come Out Of The Ignition Find Out Here!
Why Won't Your Subaru Key Come Out Of The Ignition Find Out Here!AutoScandia
 
一比一原版(UVic学位证书)维多利亚大学毕业证学历认证买留学回国
一比一原版(UVic学位证书)维多利亚大学毕业证学历认证买留学回国一比一原版(UVic学位证书)维多利亚大学毕业证学历认证买留学回国
一比一原版(UVic学位证书)维多利亚大学毕业证学历认证买留学回国ezgenuh
 
Lucknow 💋 (Genuine) Escort Service Lucknow | Service-oriented sexy call girls...
Lucknow 💋 (Genuine) Escort Service Lucknow | Service-oriented sexy call girls...Lucknow 💋 (Genuine) Escort Service Lucknow | Service-oriented sexy call girls...
Lucknow 💋 (Genuine) Escort Service Lucknow | Service-oriented sexy call girls...anilsa9823
 
Sanjay Nagar Call Girls: 🍓 7737669865 🍓 High Profile Model Escorts | Bangalor...
Sanjay Nagar Call Girls: 🍓 7737669865 🍓 High Profile Model Escorts | Bangalor...Sanjay Nagar Call Girls: 🍓 7737669865 🍓 High Profile Model Escorts | Bangalor...
Sanjay Nagar Call Girls: 🍓 7737669865 🍓 High Profile Model Escorts | Bangalor...amitlee9823
 
如何办理女王大学毕业证(QU毕业证书)成绩单原版一比一
如何办理女王大学毕业证(QU毕业证书)成绩单原版一比一如何办理女王大学毕业证(QU毕业证书)成绩单原版一比一
如何办理女王大学毕业证(QU毕业证书)成绩单原版一比一opyff
 
如何办理麦考瑞大学毕业证(MQU毕业证书)成绩单原版一比一
如何办理麦考瑞大学毕业证(MQU毕业证书)成绩单原版一比一如何办理麦考瑞大学毕业证(MQU毕业证书)成绩单原版一比一
如何办理麦考瑞大学毕业证(MQU毕业证书)成绩单原版一比一ozave
 
How To Fix Mercedes Benz Anti-Theft Protection Activation Issue
How To Fix Mercedes Benz Anti-Theft Protection Activation IssueHow To Fix Mercedes Benz Anti-Theft Protection Activation Issue
How To Fix Mercedes Benz Anti-Theft Protection Activation IssueTerry Sayther Automotive
 
Greenery-Palette Pitch Deck by Slidesgo.pptx
Greenery-Palette Pitch Deck by Slidesgo.pptxGreenery-Palette Pitch Deck by Slidesgo.pptx
Greenery-Palette Pitch Deck by Slidesgo.pptxzohiiimughal286
 
How To Troubleshoot Mercedes Blind Spot Assist Inoperative Error
How To Troubleshoot Mercedes Blind Spot Assist Inoperative ErrorHow To Troubleshoot Mercedes Blind Spot Assist Inoperative Error
How To Troubleshoot Mercedes Blind Spot Assist Inoperative ErrorAndres Auto Service
 
Chapter-1.3-Four-Basic-Computer-periods.pptx
Chapter-1.3-Four-Basic-Computer-periods.pptxChapter-1.3-Four-Basic-Computer-periods.pptx
Chapter-1.3-Four-Basic-Computer-periods.pptxAnjieVillarba1
 
83778-77756 ( HER.SELF ) Brings Call Girls In Laxmi Nagar
83778-77756 ( HER.SELF ) Brings Call Girls In Laxmi Nagar83778-77756 ( HER.SELF ) Brings Call Girls In Laxmi Nagar
83778-77756 ( HER.SELF ) Brings Call Girls In Laxmi Nagardollysharma2066
 

Último (20)

design a four cylinder internal combustion engine
design a four cylinder internal combustion enginedesign a four cylinder internal combustion engine
design a four cylinder internal combustion engine
 
Delhi Call Girls Saket 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
Delhi Call Girls Saket 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip CallDelhi Call Girls Saket 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
Delhi Call Girls Saket 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
 
Vip Mumbai Call Girls Mumbai Call On 9920725232 With Body to body massage wit...
Vip Mumbai Call Girls Mumbai Call On 9920725232 With Body to body massage wit...Vip Mumbai Call Girls Mumbai Call On 9920725232 With Body to body massage wit...
Vip Mumbai Call Girls Mumbai Call On 9920725232 With Body to body massage wit...
 
Sales & Marketing Alignment_ How to Synergize for Success.pptx.pdf
Sales & Marketing Alignment_ How to Synergize for Success.pptx.pdfSales & Marketing Alignment_ How to Synergize for Success.pptx.pdf
Sales & Marketing Alignment_ How to Synergize for Success.pptx.pdf
 
John Deere 335 375 385 435 Service Repair Manual
John Deere 335 375 385 435 Service Repair ManualJohn Deere 335 375 385 435 Service Repair Manual
John Deere 335 375 385 435 Service Repair Manual
 
Business Bay Escorts $#$ O56521286O $#$ Escort Service In Business Bay Dubai
Business Bay Escorts $#$ O56521286O $#$ Escort Service In Business Bay DubaiBusiness Bay Escorts $#$ O56521286O $#$ Escort Service In Business Bay Dubai
Business Bay Escorts $#$ O56521286O $#$ Escort Service In Business Bay Dubai
 
What Causes BMW Chassis Stabilization Malfunction Warning To Appear
What Causes BMW Chassis Stabilization Malfunction Warning To AppearWhat Causes BMW Chassis Stabilization Malfunction Warning To Appear
What Causes BMW Chassis Stabilization Malfunction Warning To Appear
 
Delhi Call Girls Vikaspuri 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
Delhi Call Girls Vikaspuri 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip CallDelhi Call Girls Vikaspuri 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
Delhi Call Girls Vikaspuri 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
 
Why Won't Your Subaru Key Come Out Of The Ignition Find Out Here!
Why Won't Your Subaru Key Come Out Of The Ignition Find Out Here!Why Won't Your Subaru Key Come Out Of The Ignition Find Out Here!
Why Won't Your Subaru Key Come Out Of The Ignition Find Out Here!
 
一比一原版(UVic学位证书)维多利亚大学毕业证学历认证买留学回国
一比一原版(UVic学位证书)维多利亚大学毕业证学历认证买留学回国一比一原版(UVic学位证书)维多利亚大学毕业证学历认证买留学回国
一比一原版(UVic学位证书)维多利亚大学毕业证学历认证买留学回国
 
Lucknow 💋 (Genuine) Escort Service Lucknow | Service-oriented sexy call girls...
Lucknow 💋 (Genuine) Escort Service Lucknow | Service-oriented sexy call girls...Lucknow 💋 (Genuine) Escort Service Lucknow | Service-oriented sexy call girls...
Lucknow 💋 (Genuine) Escort Service Lucknow | Service-oriented sexy call girls...
 
Sanjay Nagar Call Girls: 🍓 7737669865 🍓 High Profile Model Escorts | Bangalor...
Sanjay Nagar Call Girls: 🍓 7737669865 🍓 High Profile Model Escorts | Bangalor...Sanjay Nagar Call Girls: 🍓 7737669865 🍓 High Profile Model Escorts | Bangalor...
Sanjay Nagar Call Girls: 🍓 7737669865 🍓 High Profile Model Escorts | Bangalor...
 
如何办理女王大学毕业证(QU毕业证书)成绩单原版一比一
如何办理女王大学毕业证(QU毕业证书)成绩单原版一比一如何办理女王大学毕业证(QU毕业证书)成绩单原版一比一
如何办理女王大学毕业证(QU毕业证书)成绩单原版一比一
 
如何办理麦考瑞大学毕业证(MQU毕业证书)成绩单原版一比一
如何办理麦考瑞大学毕业证(MQU毕业证书)成绩单原版一比一如何办理麦考瑞大学毕业证(MQU毕业证书)成绩单原版一比一
如何办理麦考瑞大学毕业证(MQU毕业证书)成绩单原版一比一
 
How To Fix Mercedes Benz Anti-Theft Protection Activation Issue
How To Fix Mercedes Benz Anti-Theft Protection Activation IssueHow To Fix Mercedes Benz Anti-Theft Protection Activation Issue
How To Fix Mercedes Benz Anti-Theft Protection Activation Issue
 
Greenery-Palette Pitch Deck by Slidesgo.pptx
Greenery-Palette Pitch Deck by Slidesgo.pptxGreenery-Palette Pitch Deck by Slidesgo.pptx
Greenery-Palette Pitch Deck by Slidesgo.pptx
 
How To Troubleshoot Mercedes Blind Spot Assist Inoperative Error
How To Troubleshoot Mercedes Blind Spot Assist Inoperative ErrorHow To Troubleshoot Mercedes Blind Spot Assist Inoperative Error
How To Troubleshoot Mercedes Blind Spot Assist Inoperative Error
 
Chapter-1.3-Four-Basic-Computer-periods.pptx
Chapter-1.3-Four-Basic-Computer-periods.pptxChapter-1.3-Four-Basic-Computer-periods.pptx
Chapter-1.3-Four-Basic-Computer-periods.pptx
 
83778-77756 ( HER.SELF ) Brings Call Girls In Laxmi Nagar
83778-77756 ( HER.SELF ) Brings Call Girls In Laxmi Nagar83778-77756 ( HER.SELF ) Brings Call Girls In Laxmi Nagar
83778-77756 ( HER.SELF ) Brings Call Girls In Laxmi Nagar
 
(INDIRA) Call Girl Surat Call Now 8250077686 Surat Escorts 24x7
(INDIRA) Call Girl Surat Call Now 8250077686 Surat Escorts 24x7(INDIRA) Call Girl Surat Call Now 8250077686 Surat Escorts 24x7
(INDIRA) Call Girl Surat Call Now 8250077686 Surat Escorts 24x7
 

Automotive SPICE® 3.0 - What is new and what has changed?

  • 1. Page 1 © KUGLER MAAG CIE GmbH Automotive SPICE 3.0 What is new and what has changed? Fabio Bella Klaus Hoermann Bhaskar Vanamali Steffen Herrmann Markus Müller December 2015 Version 2015-12-05
  • 2. About the Trainer: Markus Mueller Qualification & Experience • intacs™-certified Principal Assessor and trainer, intacs™ Advisory Board member, who • conducted more than 50 assessments, many of them for OEMs • trained more than 300 ISO/IEC 15504 provisional assessors from leading car manufacturers (OEMs) and suppliers • advised OEM representatives on the development of Automotive SPICE® • Project leader of several change and improvement projects based on ISO/IEC 15504 and CMM/CMMI® • Providing consultancy, coaching, and active support in several ECU development projects in automotive • E.g. project leader for the implementation of a project control office (PCO) in the electronics development of a major car manufacturer, which today controls more than 100 ECU development projects • Married with 2 children • Director Operations at Kugler Maag Cie • Over 15 years of experience in industry and research projects • Assisting medium-size companies as well as international corporations, primarily in the automotive industry • PMI Project Management Professional • Very experienced trainer, moderator, and management coach • Speaker at conferences and co-author of books
  • 3. Introducing myself Dr. Klaus Hoermann • Principal and Partner at Kugler Maag Cie • Leader of the intacs™ working group “Exams” • intacs™ SPICE Principal Assessor, intacs™ SPICE Instructor • Volkswagen-certified Software Quality Improvement Leader (SQIL) • CMMI® SCAMPI Lead Appraiser (CMMI Institute-Certified) • CMMI® Instructor (CMMI Institute-Certified) • Scrum Master (Scrum.org certified)
  • 4. About the trainer: Bhaskar Vanamali Qualification & Experience • intacs™-certified Principal Assessor and trainer, VDA AK13 member, who • conducted more than 90 assessments, many of them for OEMs • trained more than 200 ISO/IEC 15504 provisional assessors from leading car manufacturers (OEMs) and suppliers • advised OEM representatives on the development of Automotive SPICE® • Project leader of several change and improvement projects based on SPICE and CMM/CMMI® • Providing consultancy, coaching, and active support in several ECU development projects in automotive • Member of the ISO working group for system and SW engineering processes • Married with 4 children • Process Director at Kugler Maag Cie • Over 15 years of experience in industry and process improvement • Assisting medium-size companies as well as international corporations, primarily in the automotive industry • Very experienced trainer, moderator, and management coach • Speaker at conferences and co-author of books
  • 5. Contents • Executive Summary • Introduction • Overview of the Changes • Changes in Detail • And Finally • Need more Advice? • Contact Information
  • 6. Executive Summary • Version 3.0 comprises many small changes and improvements, some structural changes, and a few changes that will increase project effort. • Structural changes: • The engineering processes were divided into the two groups System (SYS) and Software (SWE). • The tip of the V was changed: unit construction and unit verification have been separated into two processes. • A “Plug-in Concept” allows integration of mechanical and hardware processes (not provided by Automotive SPICE). • Content-wise, the HIS scope remains mainly the same (however, the names of some processes have changed). • A few changes will cause additional effort for projects, e.g. the evaluation of alternative solutions is required for system and software architectures according to defined criteria. The evaluation result, including a rationale for the architecture/design selection, has to be recorded. • The measurement framework was adapted to the changes in ISO/IEC 33020 • Minor changes for capability levels 1-3, major changes for capability levels 4-5 • Automotive SPICE 3.0 is not yet mandatory. This will be decided by the VDA Quality Management Board with the release of the new “Blue/Gold Volume” (Sep. 2016). • In the meantime the VDA AK 13 will develop interpretation guidelines for Automotive SPICE 3.0 and also a guideline for performing assessments (planned for April 2016).
  • 8. Basic Facts and Acknowledgements • Automotive SPICE version 3.0 will replace the Automotive SPICE Process Assessment Model (PAM) 2.5 and the Process Reference Model (PRM) 4.5. • ASPICE version 3.0 comprises the PRM and the PAM in a single document. • Automotive SPICE no longer uses ISO/IEC 12207 as guidance. • Automotive SPICE® is a registered trademark of the Verband der Automobilindustrie e.V. (VDA). • Automotive SPICE 3.0 has been created by Working Group 13 of the Quality Management Center (QMC) within the German Association of the Automotive Industry (Verband der Automobilindustrie e.V., VDA), with the participation of members of the Automotive Special Interest Group (SIG) – review only – and with the agreement of The SPICE User Group. This agreement is based on a validation of the Automotive SPICE 3.0 version regarding any ISO copyright infringement and the statements given by the VDA QMC to the SPICE User Group regarding the current and future development of Automotive SPICE.
  • 9. Members VDA AK13 • Employees of Volkswagen, Continental, Schaeffler, ZF, Brose, Ford, BMW, Daimler, Knorr-Bremse • Secretary: Bhaskar Vanamali (KMC) • Contact VDA QMC: Dr. Jan Morenzin
  • 10. Automotive SPICE 3.0 Deployment and Timeline • Automotive SPICE 3.0 was published in July 2015 and may be used for assessments in agreement with the sponsor. • Automotive SPICE 2.3 is still the version considered mandatory by the VDA. Automotive SPICE versions 2.3 or 2.5 may still be used. • Mandatory rules for the Automotive SPICE 3.0 transition will be decided by the VDA Quality Management Board with the release of the new “Blue/Gold Volume” (Sep. 2016). • In the meantime the VDA AK 13 will develop interpretation guidelines for Automotive SPICE 3.0 and also a guideline for performing assessments (Blue/Gold Volume by the VDA, planned for September 2016).
  • 11. Blue/Gold Volume and ASPICE 3.0 – Deployment and Timeline (1/2) • Timeline of publications by AK13 and transition time • Caution: the publication time of the VDA Blue/Gold Volume is a target date, and the transition time of one year (Period II) is still under discussion. • [Timeline figure: July 2015 – release of Automotive SPICE 3.0, start of Period I; end of 2016 – release of the Blue/Gold Volume, start of Period II (= transition period); end of 2017 – end of the transition, start of Period III]
  • 12. Blue/Gold Volume and ASPICE 3.0 – Deployment and Timeline (2/2), by topic and period: • Entry: Period I – ASPICE 3.0 release; Period II – BG Volume release; Period III – end of transition. • Exit: Period I – BG Volume release; Period II – end of transition (~1 year); Period III – open end. • iNTACS trainings: Period I – update of trainings to ASPICE 3.0; Period II – update of trainings to the BG Volume; Period III – none. • Certified assessors: Period I – no implication for any grade; Period II – upgrade training needed for Competent and Principal, no implication for Provisional assessors; Period III – upgrade training still needed for Competent and Principal, no implication yet for Provisional assessors. • Additional trainings: Period I – update trainings possible, but no official trainings; Period II – iNTACS- and VDA WG13-approved upgrade trainings; Period III – still iNTACS- and VDA WG13-approved upgrade trainings. • Provisional assessors: Period I – no specific requirements; Periods II and III – upgrade training in discussion, but not clear how to enforce it. • Assessments: Period I – assessments for any PAM from 2.3 to 3.0 are possible; Period II – assessments for PAM 3.0 only accepted if the lead assessor underwent upgrade training; Period III – in discussion: for German OEMs, assessments have to be performed based on PAM 3.0.
  • 13. Rules for the period from July 2015 until the guidelines are published – Period I • There will be no official upgrade trainings. • The iNTACS training materials for Provisional and Competent Assessors will be updated (update of the Provisional Assessor training planned for early 2016). • iNTACS instructors are not required to be upgraded. • To be able to perform Automotive SPICE 3.0 assessments: no additional requirements. • Automotive SPICE 3.0 assessments are acknowledged as EE-1. • Certification/recertification: no changes.
  • 14. Rules for the period from publication of the guidelines until the end of the transition period (date tbd) – Period II • iNTACS training materials will be updated. There will be an official upgrade training. iNTACS will provide all changes, AK13 will perform reviews. iNTACS training providers will perform the trainings. • iNTACS instructors need to be upgraded. Bhaskar Vanamali and Pierre Metz will perform the upgrade trainings. • iNTACS training providers will provide upgrade trainings for all assessor grades. • To be able to perform Automotive SPICE 3.0 assessments: the lead assessor must have participated in an official upgrade training. • Automotive SPICE 3.0 assessments are acknowledged as EE-1 if the assessor has passed an upgrade training (or has attended a new Provisional course). Assessments from previous models (2.3 upwards) are still acknowledged as EE-1. • Certification/recertification: • Provisional: there are no changes to recertification; new Provisional course. • Competent: need an upgrade unless they took a new Provisional course. • Principal: need an upgrade.
  • 15. Rules for the time after the end of the transition period (date tbd) – Period III • Automotive SPICE 3.0 assessments are acknowledged as EE-1 if the assessor has passed an upgrade training (or has attended a new Provisional course). • Whether assessments from previous models (2.3 upwards) are still acknowledged as EE-1 is not yet decided. • Certification/recertification: • Provisional: there are no planned changes to recertification; new Provisional course. However, mandatory upgrade trainings are under discussion for Provisional Assessors who were trained on the old materials. • Competent: need an upgrade unless they took a new Provisional course. • Principal: need an upgrade.
  • 16. Overview of the Changes
  • 17. Motivation for Updating Automotive SPICE • Adaptation to ISO/IEC 15504-5:2012 • Automotive SPICE 2.5 was based on ISO/IEC 15504-5:2006. • Adaptation to the ISO/IEC 33000 series • including the updated measurement framework • Some basic structures, concepts, and terminology needed updates. • Assessment indicators needed updates (particularly Base Practices for the HIS scope).
  • 18. Overview of Main Changes (document view) • Chapter 1 (Introduction): editorial adaptation to the 33000 series; notes regarding the combined PRM/PAM in this document. • Chapter 2 (Statement of compliance): adaptation to the 33000 series. • Chapter 3 (Introduction): optimized for better understanding and adapted to the 33000 series. • Chapter 4 (Process reference model and performance indicators, Level 1): the acronym ENG changed to SYS and SWE, the structure of the processes has changed, and the Base Practices of the HIS scope have been reworked. • Chapter 5 (Process capability levels and process attributes): adapted to the measurement framework of ISO/IEC 33020. • Annex A (Conformity of the process assessment and reference model): conformity statement adapted to ISO/IEC 33004:2015. • Annex B (Work product characteristics): modifications to work product characteristics according to the changes in chapter 4. • Annex C (Terminology): updated to recent standards, introduction of new terminology. • Annex D (Key Concepts): added the new major concepts relative to Annex D of AS 2.5; the traceability diagram (Annex E of PAM 2.5) is now in Annex D. • Annex E (Reference Standards): updated references to other standards.
  • 19. Overview of Main Changes • The new Annex D was extended to include a clarification of new Automotive SPICE key concepts and is a good starting point to understand the differences between v3.0 and v2.5, in particular with respect to Capability Level 1. • Annex D Key Concepts: • D.1 The “Plug-in” Concept • D.2 The Tip of the “V” • D.3 Terms “Element”, “Component”, “Unit”, and “Item” • D.4 Traceability and Consistency • D.5 “Agree” and “Summarize and Communicate” • D.6 “Evaluate”, “Verification Criteria” and “Ensuring compliance” • D.7 The Relation Between “Strategy” and “Plan”
  • 20. Structural Changes in Version 3.0: “Plug-in Concept” • [Figure: at system level, SYS.1–SYS.5 together with MAN.3, ACQ.4, SUP.1, SUP.8, SUP.9 and SUP.10; at domain level, the plug-ins SWE.1–SWE.6, HWE.1–HWE.4 and MEE.1–MEE.4] • SYS (System Engineering) and SWE (Software Engineering) are part of the Automotive SPICE® 3.0 PAM. • HWE (Hardware Engineering) and MEE (Mechanical Engineering) were not developed by the VDA and are not included in the Automotive SPICE® 3.0 PAM.
  • 21. Structural Changes in Version 3.0: The Tip of the V • [V-model figure, left side: SYS.2 System Requirements Analysis, SYS.3 System Architectural Design, SWE.1 SW Requirements Analysis, SWE.2 SW Architectural Design, SWE.3 SW Detailed Design and Unit Construction; tip: SWE.4 SW Unit Verification; right side: SWE.5 SW Integration and Integration Test, SWE.6 SW Qualification Test, SYS.4 System Integration and Integration Test, SYS.5 System Qualification Test]
  • 22. Terms “Element”, “Component”, “Unit”, and “Item” • A system architecture specifies the elements of the system. • A software architecture specifies the elements of the software. • Software elements are hierarchically decomposed into smaller elements down to the software components, which are at the lowest level of the software architecture. • Software components are described in the detailed design. • A software component consists of one or more software units. • Items on the right side of the V-model are the implemented counterparts of elements and components on the left side. This can be a 1:1 or m:n relationship, e.g. an item may represent more than one implemented element. • [Figure: the SYS and SWE process groups of the V-model, annotated with the decomposition levels elements, components, units, and items]
  • 23. Traceability and Consistency • Traceability and consistency were formerly addressed by a single base practice on the right side of the V and have now been split into two base practices. • Traceability refers to the existence of references or links between work products. Traceability supports coverage analysis, impact analysis, requirements implementation status tracking, etc. • Consistency means that • all traceability references/links are available (i.e., nothing is missing) • all traceability references/links are correct (i.e., not linking to the wrong work product) • Consistency has to be proven by technical review of the traceability. • New traceability requirements have been added: • between test cases and test results • between change requests and work products affected by these change requests (SUP.10)
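The coverage and consistency checks described above amount to simple set operations over the link tables between work products. The sketch below shows the idea; the requirement and test-case identifiers are invented for illustration:

```python
# Hypothetical traceability data: software requirements, and test cases
# linked to the requirement they verify (all identifiers are invented).
requirements = {"REQ-1", "REQ-2", "REQ-3"}
test_case_links = {"TC-1": "REQ-1", "TC-2": "REQ-2", "TC-9": "REQ-9"}

# Coverage analysis: which requirements have at least one linked test case?
covered = {req for req in test_case_links.values() if req in requirements}
uncovered = requirements - covered

# Availability/correctness check: every link must point at an existing
# requirement; TC-9 -> REQ-9 is a dangling (incorrect) link.
dangling = {tc: req for tc, req in test_case_links.items()
            if req not in requirements}
```

Such a script can only flag missing or dangling links; as the slide notes, consistency still has to be proven by a technical review confirming that the links are semantically correct, not merely present.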
  • 24. Traceability and Consistency • [Figure: bidirectional traceability and consistency links along the V – from stakeholder requirements via system requirements, system architecture, software requirements, software architecture and software detailed design down to the software units; from each of these to the corresponding test specifications and test cases (unit test, software integration test, software qualification test, system integration test, system qualification test); from test cases to their test results, including static verification results; and from change requests to the affected work products. Each link is labeled with the responsible base practices, e.g. SYS.2 BP7/BP8, SYS.3 BP6/BP7, SWE.1 BP7/BP8, SWE.4 BP5/BP6, SUP.10 BP8]
  • 25. “Agree” and “Summarize and Communicate” • The information flow on the left side of the “V” is ensured through a base practice “Communicate agreed ‘work product x’”. The term “agreed” means that there is a joint understanding of all stakeholders regarding the content of the work product. • The information flow on the right side of the “V” is ensured through a base practice “Summarize and communicate results”. The term “summarize” refers to abstracted information resulting from test executions made available to all relevant parties. • [Figure: the V-model with the BPs “communicate agreed …” annotated on the left side and “summarize and communicate …” on the right side]
  • 26. “Evaluate”, “Verification Criteria” and “Ensuring compliance” • [Figure: the V-model annotated with the related base practices – verification criteria in SYS.2.BP5 and SWE.1.BP5 (feeding SUP.2 Verification), “evaluate” in SWE.3.BP4, criteria for unit verification in SWE.4.BP2, and “ensure compliance” in SYS.3.BP3, SYS.5.BP2, SWE.5.BP3 and SWE.6.BP2]
  • 27. “Evaluate”, “Verification Criteria” and “Ensuring compliance” • Verification criteria are used as input for the development of the test cases or other verification measures that ensure compliance with the requirements. Verification criteria are only used in the context of the System Requirements Analysis (SYS.2) and Software Requirements Analysis (SWE.1) processes. Verification aspects which cannot be covered by testing are covered by the verification process (SUP.2). • Criteria for unit verification ensure compliance of the source code with the software detailed design and the non-functional requirements. Possible criteria for unit verification include unit test cases, unit test data, coverage goals, and coding standards and guidelines, e.g. MISRA. For unit testing, such criteria shall be defined in a unit test specification. This unit test specification may be implemented e.g. as a script in an automated test bench.
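As a minimal sketch of the last point, a unit test specification can be written as an executable script, with each test case encoding one verification criterion derived from the detailed design. The unit `saturate` and its acceptance criteria below are invented for illustration:

```python
import unittest

# Hypothetical software unit under verification: clamp a raw value to the
# range stated in the (invented) detailed design.
def saturate(value, lo=0, hi=100):
    return max(lo, min(hi, value))

# Unit test specification as an executable script: one test case per
# verification criterion.
class SaturateUnitTest(unittest.TestCase):
    def test_value_within_range_passes_through(self):
        self.assertEqual(saturate(42), 42)

    def test_value_below_range_is_clamped_to_lower_bound(self):
        self.assertEqual(saturate(-5), 0)

    def test_value_above_range_is_clamped_to_upper_bound(self):
        self.assertEqual(saturate(250), 100)

# Execute the specification as an automated test bench would.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(SaturateUnitTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Coverage goals and coding-guideline checks (e.g. MISRA) would be additional criteria enforced by separate tooling, not by the test script itself.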
  • 28. “Evaluate”, “Verification Criteria” and “Ensuring compliance” • Evaluation of alternative solutions is required for system and software architectures. The evaluation has to be done according to defined criteria. Such evaluation criteria may include quality characteristics like modularity, reliability, security, and usability, or results of make-or-buy or reuse analysis. The evaluation result, including a rationale for the architecture/design selection, has to be recorded. • Compliance with an architectural design (SWE.5.BP3) means that the specified integration tests are capable of proving that the interfaces and relevant interactions (e.g. dynamic behavior) between the software units, the software items, and the system items fulfill the specification given by the architectural design.
  • 29. New practice in System Architectural Design: Evaluate alternative system architectures • SYS.3.BP5: Evaluate alternative system architectures. Define evaluation criteria for architecture design. Evaluate alternative system architectures according to the defined criteria. Record the rationale for the chosen system architecture. [OUTCOME 1] • NOTE 3: Evaluation criteria may include quality characteristics (modularity, maintainability, expandability, scalability, reliability, security and usability) and results of make-buy-reuse analysis.
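A weighted evaluation matrix is one common way to make such an evaluation recordable, though the BP does not prescribe any particular method. In this sketch the architecture names, criteria, weights, and scores are purely illustrative:

```python
# Illustrative evaluation of two alternative system architectures against
# defined criteria (all names, weights, and scores are invented).
criteria_weights = {"modularity": 0.4, "reliability": 0.4, "usability": 0.2}

alternatives = {
    "layered":    {"modularity": 4, "reliability": 3, "usability": 4},
    "monolithic": {"modularity": 1, "reliability": 4, "usability": 3},
}

def weighted_score(scores):
    # Weighted sum over the defined evaluation criteria.
    return sum(w * scores[c] for c, w in criteria_weights.items())

# Record the evaluation result per alternative as the rationale for the
# selection, then pick the best-scoring architecture.
rationale = {name: round(weighted_score(s), 2)
             for name, s in alternatives.items()}
chosen = max(rationale, key=rationale.get)
```

Recording both the per-criterion scores and the resulting ranking gives exactly the artifact SYS.3.BP5 asks for: the chosen architecture plus the rationale behind the choice.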
  • 30. Seite 30 Both terms “Strategy” and “Plan” are commonly used across the following processes of the Automotive SPICE 3.0 PAM: • SYS.4 System Integration and Integration Test • SYS.5 System Qualification Test • SWE.4 Software Unit Verification • SWE.5 Software Integration and Integration Test • SWE.6 Software Qualification Test • SUP.1 Quality Assurance • SUP.8 Configuration Management • SUP.9 Problem Resolution Management • SUP.10 Change Request Management The Relation Between “Strategy” and “Plan” [Diagram: BP1 defines the strategy, which is documented in the related plan work product; at CL 1 this is the process-specific plan (WP 08-nn), and at CL >= 2 the generic plan (WP 08-00) applies in addition]
  • 31. Seite 31 • Capability Level 1: Each of these processes requires the development of a process-specific strategy. The strategy always corresponds to a process-specific “Plan”. For each process-specific “Plan”, process-specific work product characteristics are defined (e.g. “08-52 Test Plan”, “08-04 Configuration Management Plan”). Scheduling (e.g. the old SUP.10 BP10) has been moved to Level 2. • Capability Level 2 or higher: Each process-specific “Plan” (WP 08-nn) inherits the work product characteristics represented by the generic plan (WP 08-00). This means that for a process-specific “Plan” both the process-specific characteristics (WP 08-nn) and the generic characteristics (WP 08-00) apply. • The BPs for establishing a proceeding have been deleted. The Relation Between “Strategy” and “Plan”
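The inheritance of work product characteristics described above can be pictured with a small sketch. The characteristic strings below are invented placeholders for illustration, not the actual WP characteristics from Annex B of the PAM:

```python
# Generic plan (WP 08-00): characteristics every plan inherits from CL 2 on.
# These strings are illustrative placeholders, not the real Annex B content.
GENERIC_PLAN_08_00 = {"objectives stated", "schedule defined", "resources considered"}

# A process-specific plan, e.g. the Test Plan (WP 08-52), adds its own
# characteristics (again placeholders):
TEST_PLAN_08_52 = {"test scope defined", "test methods identified"}

def applicable_characteristics(specific_wp, capability_level):
    """At CL 1 only the process-specific characteristics (WP 08-nn) apply;
    from CL 2 on, the generic plan characteristics (WP 08-00) apply too."""
    if capability_level >= 2:
        return specific_wp | GENERIC_PLAN_08_00
    return set(specific_wp)
```

So assessing the same plan at CL 1 checks only the process-specific characteristics, while at CL 2 or higher the union of both sets is expected.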
  • 32. Seite 32 • Color code for different elements • Red for PRM elements (Process ID, name, purpose and outcomes) • Green for base practices and generic practices • Blue for output work products and generic resources • Italics for content from ISO 330xx (e.g. measurement framework) No Separation of PAM and PRM in v3.0
  • 33. Seite 33 Rating scale as defined by ISO/IEC 33020:
  • N (Not achieved): There is little or no evidence of achievement of the defined process attribute in the assessed process.
  • P (Partially achieved): There is some evidence of an approach to, and some achievement of, the defined process attribute in the assessed process. Some aspects of achievement of the process attribute may be unpredictable.
  • L (Largely achieved): There is evidence of a systematic approach to, and significant achievement of, the defined process attribute in the assessed process. Some weaknesses related to this process attribute may exist in the assessed process.
  • F (Fully achieved): There is evidence of a complete and systematic approach to, and full achievement of, the defined process attribute in the assessed process. No significant weaknesses related to this process attribute exist in the assessed process.
  • 34. Seite 34 Optional rating scale:
  • P- (Partially achieved): There is some evidence of an approach to, and some achievement of, the defined process attribute in the assessed process. Many aspects of achievement of the process attribute may be unpredictable.
  • P+ (Partially achieved): There is some evidence of an approach to, and some achievement of, the defined process attribute in the assessed process. Some aspects of achievement of the process attribute may be unpredictable.
  • L- (Largely achieved): There is evidence of a systematic approach to, and significant achievement of, the defined process attribute in the assessed process. Many weaknesses related to this process attribute may exist in the assessed process.
  • L+ (Largely achieved): There is evidence of a systematic approach to, and significant achievement of, the defined process attribute in the assessed process. Some weaknesses related to this process attribute may exist in the assessed process.
  • 35. Seite 35 Extended Rating Scheme - percentages:
  • P-: > 15% to 32.5%
  • P+: > 32.5% to 50%
  • L-: > 50% to 67.5%
  • L+: > 67.5% to 85%
  • AK13 has not decided whether it is going to be part of the guidelines or out of scope
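The percentage bands above can be expressed as a small lookup. The refined P/L splits follow the table; the N (0-15%) and F (>85%) bounds are the standard ISO/IEC 33020 bands, carried over here as an assumption since the slide only lists the refined middle bands:

```python
def extended_rating(pct):
    """Map a process attribute achievement percentage to the extended
    rating scale. P-/P+/L-/L+ bands follow the table above; the N and F
    bounds (<= 15%, > 85%) are the standard ISO/IEC 33020 limits."""
    if pct <= 15:
        return "N"
    if pct <= 32.5:
        return "P-"
    if pct <= 50:
        return "P+"
    if pct <= 67.5:
        return "L-"
    if pct <= 85:
        return "L+"
    return "F"
```

For example, an attribute achieved at 60% would rate L-, while 80% would rate L+.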
  • 36. Seite 36 Rating and Aggregation methods • ISO/IEC 33020 identifies three rating methods (R1, R2 and R3) and different aggregation methods: • aggregation within one process (one-dimensional, vertical aggregation) • aggregation across multiple process instances (one-dimensional, horizontal aggregation) • both (two-dimensional, matrix aggregation) • They are typically used for assessments of organizational maturity • AK13 has not decided whether this is going to be part of the guidelines or out of scope • In Automotive SPICE V 3.0 the rating methods are only briefly explained and the approaches referenced
  • 37. Seite 37 Rating method R1 (which has the strongest requirements): • The approach to process attribute rating shall satisfy the following conditions: • a) Each process outcome of each process within the scope of the assessment shall be characterized for each process instance, based on validated data; • b) Each process attribute outcome of each process attribute for each process within the scope of the assessment shall be characterized for each process instance, based on validated data; • c) Process outcome characterizations for all assessed process instances shall be aggregated to provide a process performance attribute achievement rating; • d) Process attribute outcome characterizations for all assessed process instances shall be aggregated to provide a process attribute achievement rating. Example of a Rating Method
  • 38. Seite 38 Horizontal aggregation: the ratings of the indicators (rows) are aggregated across the process instances (columns), giving one rating per indicator:
            Inst.1  Inst.2  Inst.3  Inst.4  Inst.5  Inst.6  | Horizontal
  GP 3.2.1    P       P       N       P       P       L     |     P
  GP 3.2.2    P       L       L       L       F       L     |     L
  GP 3.2.3    L       L       L       L       L       L     |     L
  GP 3.2.4    F       F       F       F       F       L     |     F
  GP 3.2.5    L       L       P       P       L       L     |     L
  GP 3.2.6    P       L       P       P       N       L     |     P
  • 39. Seite 39 Vertical aggregation: the ratings of the indicators (rows) are aggregated within each process instance (column), giving one rating per process instance:
            Inst.1  Inst.2  Inst.3  Inst.4  Inst.5  Inst.6
  GP 3.2.1    P       P       N       P       P       L
  GP 3.2.2    P       L       L       L       F       L
  GP 3.2.3    L       L       L       L       L       L
  GP 3.2.4    F       F       F       F       F       L
  GP 3.2.5    L       L       P       P       L       L
  GP 3.2.6    P       L       P       P       N       L
  Vertical    P       L       P       L       P       L
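Both aggregation directions operate on the same matrix, once per row and once per column. The sketch below illustrates the mechanism; mapping each rating to the midpoint of its ISO/IEC 33020 percentage band and averaging is an assumption made for this illustration only, since the standard leaves the concrete aggregation calculus to the assessor, so the results need not match the slide's example exactly:

```python
# Illustrative aggregation of N/P/L/F ratings. The midpoint percentages
# assigned to each rating are an assumption made for this sketch.
MIDPOINT = {"N": 7.5, "P": 32.5, "L": 67.5, "F": 92.5}

def to_rating(pct):
    """Standard ISO/IEC 33020 bands: N <= 15 < P <= 50 < L <= 85 < F."""
    if pct <= 15:
        return "N"
    if pct <= 50:
        return "P"
    if pct <= 85:
        return "L"
    return "F"

def aggregate(ratings):
    """Aggregate several ratings (one row for horizontal aggregation, one
    column for vertical aggregation) into a single rating via the mean."""
    mean = sum(MIDPOINT[r] for r in ratings) / len(ratings)
    return to_rating(mean)

# Horizontal: GP 3.2.1 across the six process instances (row of the matrix)
row_gp_3_2_1 = ["P", "P", "N", "P", "P", "L"]
# Vertical: all six indicators within process instance 1 (column of the matrix)
col_inst_1 = ["P", "P", "L", "F", "L", "P"]
```

Matrix (two-dimensional) aggregation would apply the same idea over all cells at once.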
  • 41. Seite 41 General Changes in V3.0 • See Key concepts (previous slides) • In the testing processes (SYS.4, SYS.5, SWE.5, SWE.6) test cases have to be selected based on the test strategy of the relevant test step. • Removal of Level 2 activities from BPs • Planning and monitoring activities were removed from BPs (e.g. SUP-processes) • Reviews beyond consistency checks were removed from BPs (e.g. SWE.2-4)
  • 42. Seite 42 Outcomes V2.5 As a result of successful implementation of this process 1 the scope of the work for the project is defined; 2 the feasibility of achieving the goals of the project with available resources and constraints is evaluated; 3 the tasks and resources necessary to complete the work are sized and estimated; 4 interfaces between elements in the project, and with other project and organizational units, are identified and monitored; 5 plans for the execution of the project are developed, implemented and maintained; 6 progress of the project is monitored and reported; and 7 actions to correct deviations from the plan and to prevent recurrence of problems identified in the project are taken when project goals are not achieved. Outcomes V3.0 As a result of successful implementation of this process 1 the scope of the work for the project is defined; 2 the feasibility of achieving the goals of the project with available resources and constraints is evaluated; 3 the activities and resources necessary to complete the work are sized and estimated; 4 interfaces within the project, and with other projects and organizational units, are identified and monitored; 5 plans for the execution of the project are developed, implemented and maintained; 6 progress of the project is monitored and reported; and 7 corrective action is taken when project goals are not achieved, and recurrence of problems identified in the project is prevented. • Mainly rewording but no new content MAN.3 Project Management – Outcomes
  • 43. Seite 43 Base Practices V2.5 1 Define the scope of work 2 Define project life cycle 3 Determine and maintain estimates for project attributes 4 Define project activities 5 Define skill needs 6 Define and maintain project schedule 7 Identify and monitor project interfaces 8 Establish project plan 9 Implement the project plan 10 Monitor project attributes 11 Review and report progress of the project 12 Act to correct deviations Base Practices V3.0 1 Define the scope of work 2 Define project life cycle 3 Evaluate feasibility of the project 4 Define, monitor and adjust project activities 5 Determine, monitor and adjust project estimates and resources 6 Ensure required skills, knowledge, and experience 7 Identify, monitor and adjust project interfaces and agreed commitments 8 Define, monitor and adjust project schedule 9 Ensure consistency 10 Review and report progress of the project Linked to 3, 4, 5, 6, 7, 10 MAN.3 Project Management – Base Practices
  • 44. Seite 44 Changes in V3.0 – MAN.3 Project Management • The BPs “Establish project plan” and “Implement the project plan” have been deleted as they caused confusion in the past • Instead, all aspects of planning have to be identified, monitored and adjusted (estimates, activities, schedules, plans, interfaces and commitments) • Consistency has to be established across all artifacts (specific BP) – no traceability, just consistency • The scope of work used to include a feasibility check; in 3.0 a specific BP for feasibility has been introduced • The project plan 08-12 is still an output work product of MAN.3. Check Annex B: the references to risk management were removed. However, risks are mentioned in BP5, and MAN.5 is referenced in the corresponding Note 6
  • 45. 45Seite 45 Outcomes V2.5 As a result of successful implementation of this process 1 a strategy for conducting quality assurance is developed, implemented and maintained; 2 quality assurance is performed independent of the activity or project being performed; 3 evidence of quality assurance is produced and maintained; 4 adherence of products, processes and activities to agreed requirements are verified, documented, and communicated to the relevant parties; 5 problems and/or non-conformance with agreement requirements are identified, recorded, communicated to the relevant parties, tracked and resolved; and 6 quality assurance has the independence and authority to escalate problems to appropriate levels of management. Outcomes V3.0 As a result of successful implementation of this process 1 a strategy for performing quality assurance is developed, implemented, and maintained; 2 quality assurance is performed independently and objectively without conflicts of interest; 3 non-conformances of work products, processes, and process activities with relevant requirements are identified, recorded, communicated to the relevant parties, tracked, resolved, and further prevented; 4 conformance of work products, processes and activities with relevant requirements is verified, documented, and communicated to the relevant parties; 5 authority to escalate non-conformances to appropriate levels of management is established; and 6 management ensures that escalated non- conformances are resolved. • Focus is more on checking conformance and ensuring resolution of non-conformances SUP.1 Quality Assurance – Outcomes
  • 46. Seite 46 Base Practices V2.5 1 Develop project quality assurance strategy 2 Develop and maintain an organizational structure which ensures that quality assurance is carried out and reported independently 3 Develop and implement a plan for project quality assurance based on a quality assurance strategy 4 Maintain evidence of quality assurance 5 Assure quality of work products 6 Assure quality of process activities 7 Track and record quality assurance activities 8 Report quality assurance activities and results 9 Ensure resolution of non-conformances 10 Implement an escalation mechanism Base Practices V3.0 1 Develop project quality assurance strategy 2 Assure quality of work products 3 Assure quality of process activities 4 Summarize and communicate quality assurance activities and results 5 Ensure resolution of non-conformances 6 Implement an escalation mechanism SUP.1 Quality Assurance – Base Practices
  • 47. Seite 47 Changes in V3.0 – SUP.1 Quality Assurance • Overall the process was simplified while keeping similar content • In addition to independence, objectivity (no conflicts of interest) is now required for QA • It is clarified that escalation has to lead to management attention and actions
  • 48. 48Seite 48 Outcomes V2.5 As a result of successful implementation of this process 1 a configuration management strategy is developed; 2 all items generated by a process or project are identified, defined and baselined according to the Configuration management strategy; 3 modifications and releases of the items are controlled; 4 modifications and releases are made available to affected parties; 5 the status of the items and modification requests are recorded and reported; 6 the completeness and consistency of the items is ensured; and 7 storage, handling and delivery of the items are controlled. Outcomes V3.0 As a result of successful implementation of this process 1 a configuration management strategy is developed; 2 all configuration items generated by a process or project are identified, defined and baselined according to the configuration management strategy; 3 modifications and releases of the configuration items are controlled; 4 modifications and releases are made available to affected parties; 5 the status of the configuration items and modifications is recorded and reported; 6 the completeness and consistency of the baselines is ensured; and 7 storage of the configuration items is controlled. • Some rewording but in essence the outcomes have not changed SUP.8 Configuration Management – Outcomes
  • 49. 49Seite 49 Base Practices V2.5 1 Develop a configuration management strategy 2 Identify configuration items 3 Establish a configuration management system 4 Establish branch management strategy 5 Establish baselines 6 Maintain configuration item description 7 Control modifications and releases 8 Maintain configuration item history 9 Report configuration status 10 Verify the information about configured items 11 Manage the backup, storage, archiving, handling and delivery of configuration items Base Practices V3.0 1 Develop a configuration management strategy 2 Identify configuration items 3 Establish a configuration management system 4 Establish branch management strategy 5 Control modifications and releases 6 Establish baselines 7 Report configuration status 8 Verify the information about configured items 9 Manage the storage of configuration items and baselines SUP.8 Configuration Management – Base Practices
  • 50. 50Seite 50 Changes in V3.0 – SUP.8 Configuration Management • Overall simplifying the process with similar content • No new content
  • 51. 51Seite 51 Outcomes V2.5 As a result of successful implementation of this process 1 a problem management strategy is developed; 2 problems are recorded, identified and classified; 3 problems are analysed and assessed to identify acceptable solution(s); 4 problem resolution is implemented; 5 problems are tracked to closure; and 6 the status of all problem reports is known Outcomes V3.0 As a result of successful implementation of this process 1 a problem resolution management strategy is developed; 2 problems are recorded, uniquely identified and classified; 3 problems are analyzed and assessed to identify an appropriate solution; 4 problem resolution is initiated; 5 problems are tracked to closure; and 6 the status of problems and their trend are known • Some rewording but in essence the outcomes have not changed SUP.9 Problem Resolution Management – Outcomes
  • 52. 52Seite 52 Base Practices V2.5 1 Develop a problem resolution management strategy 2 Establish a consistent problem resolution management proceeding 3 Identify and record the problem 4 Investigate and diagnose the cause and the impact of the problem 5 Execute urgent resolution action, where necessary 6 Raise alert notifications, where necessary 7 Initiate change request 8 Track problems to closure 9 Analyze problem trends Base Practices V3.0 1 Develop a problem resolution management strategy 2 Identify and record the problem 3 Record the status of problems 4 Diagnose the cause and determine the impact of the problem 5 Authorize urgent resolution action 6 Raise alert notifications 7 Initiate problem resolution 8 Track problems to closure 9 Analyze problem trends SUP.9 Problem Resolution Management – Base Practices
  • 53. Seite 53 Changes in V3.0 – SUP.9 Problem Resolution Management • Only minor changes regarding terminology • There is no need to start a change request anymore (however, problems have to be tracked to closure, and initiating a change request might be an option) • Accordingly, there are also no planning aspects on Level 1 • The proceeding is part of the strategy/plan
  • 54. Seite 54 Outcomes V2.5 As a result of successful implementation of this process 1 a change management strategy is developed; 2 requests for changes are recorded and identified; 3 dependencies and relationships to other change requests are identified; 4 criteria for confirming implementation of the change request are defined; 5 requests for change are analysed, prioritized, and resource requirements estimated; 6 changes are approved on the basis of priority and availability of resources; 7 approved changes are implemented and tracked to closure; and 8 the status of all change requests is known Outcomes V3.0 As a result of successful implementation of this process 1 a change request management strategy is developed; 2 requests for changes are recorded and identified; 3 dependencies and relationships to other change requests are identified; 4 criteria for confirming implementation of change requests are defined; 5 requests for change are analyzed, and resource requirements are estimated; 6 changes are approved and prioritized on the basis of analysis results and availability of resources; 7 approved changes are implemented and tracked to closure; 8 the status of all change requests is known; and 9 bi-directional traceability is established between change requests and affected work products • Major change: traceability to affected work products is included • Other than that, no major changes SUP.10 Change Request Management – Outcomes
  • 55. Seite 55 Base Practices V2.5 1 Develop a change request management strategy 2 Establish a consistent change request management proceeding 3 Identify and record the change request 4 Record the status of change requests 5 Establish the dependencies and relationships to other change requests 6 Assess the impact of the change 7 Analyze and prioritize change requests 8 Approve change requests before implementation 9 Identify and plan the verification and validation activities to be performed for implemented changes 10 Schedule and allocate the change request 11 Review the implemented change 12 Change requests are tracked until closure Base Practices V3.0 1 Develop a change request management strategy 2 Identify and record the change requests 3 Record the status of change requests 4 Analyze and assess change requests 5 Approve change requests before implementation 6 Review the implementation of change requests 7 Track change requests to closure 8 Establish bidirectional traceability Has been moved to Level 2 New BP/aspect of SUP.10 SUP.10 Change Request Management – Base Practices
  • 56. 56Seite 56 Changes in V3.0 – SUP.10 Change Request Management • There are two major changes: • The planning aspects (scheduling and planning of verification and validation) have been moved to Level 2 • The traceability between Change requests and affected Work products has been introduced • Other than that there are no changes except for wording. • The proceeding is part of the strategy/plan.
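The new bidirectional traceability between change requests and affected work products can be pictured as a simple link structure that answers both directions of the trace. The record fields and IDs below are illustrative assumptions, not prescribed by the PAM:

```python
from dataclasses import dataclass, field

# Sketch of bidirectional traceability between change requests and the
# work products they affect (new in SUP.10 of v3.0). Field names and IDs
# are illustrative assumptions only.
@dataclass
class ChangeRequest:
    cr_id: str
    status: str = "open"
    affected_work_products: list = field(default_factory=list)

def work_products_for(crs, cr_id):
    """Forward direction: which work products does a given CR affect?"""
    return next(cr.affected_work_products for cr in crs if cr.cr_id == cr_id)

def change_requests_for(crs, wp_id):
    """Backward direction: which CRs touch a given work product?"""
    return [cr.cr_id for cr in crs if wp_id in cr.affected_work_products]
```

In practice this link table typically lives in the change management or ALM tool, so both queries can be answered in an assessment without manual cross-checking.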
  • 57. 57Seite 57 Outcomes V2.5 As a result of successful implementation of this process 1 joint activities between the customer and the supplier are performed as needed; 2 all information, agreed upon for exchange, is transferred between the supplier and the customer; 3 information on progress is exchanged regularly with the supplier; 4 performance of the supplier is monitored against the agreed requirements; and 5 changes to the agreement, if needed, are negotiated between the customer and the supplier and documented with the agreement Outcomes V3.0 As a result of successful implementation of this process 1 joint activities, as agreed between the customer and the supplier, are performed as needed; 2 all information, agreed upon for exchange, is communicated regularly between the supplier and customer; 3 performance of the supplier is monitored against the agreements; and 4 changes to the agreement, if needed, are negotiated between the customer and the supplier and documented in the agreement • Some rewording but in essence the outcomes have not changed ACQ.4 Supplier Monitoring – Outcomes
  • 58. 58Seite 58 Base Practices V2.5 1 Agree on joint processes and joint interfaces 2 Exchange all relevant information 3 Review technical development with the supplier 4 Review progress of the supplier 5 Track open items 6 Act to correct deviations 7 Agree on changes Base Practices V3.0 1 Agree on and maintain joint processes 2 Exchange all agreed information 3 Review technical development with the supplier 4 Review progress of the supplier 5 Act to correct deviations Linked to 1, 3, 4 and 5 ACQ.4 Supplier Monitoring – Base Practices
  • 60. 60Seite 60 Changes in V3.0 – ACQ.4 Supplier Monitoring • No major changes, but the approach was simplified
  • 61. Seite 61 Discuss in small working groups and document the results on a flip chart: • What are the major changes? (summarize) • Where do you see barriers in changing your process to achieve ASPICE 3.0 compliance? • How do you estimate the investment (higher / same / less)? Why? • What will be the impact on assessments (imagine: now based on 3.0 instead of 2.x)? Group Work / Discussion
  • 62. 62Seite 62 Outcomes V2.5 – ENG.2 As a result of successful implementation of this process 1 a defined set of system requirements is established; 2 system requirements are categorized and analyzed for correctness and testability; 3 the impact of the system requirements on the operating environment is evaluated; 4 prioritization for implementing the system requirements is defined; 5 the system requirements are approved and updated as needed; 6 consistency and bilateral traceability are established between customer requirements and system requirements; 7 changes to the customer’s requirements baseline are evaluated for cost, schedule and technical impact; and 8 the system requirements are communicated to all affected parties and baselined Outcomes V3.0 As a result of successful implementation of this process 1 a defined set of system requirements is established; 2 system requirements are categorized and analyzed for correctness and verifiability; 3 the impact of system requirements on the operating environment is analyzed; 4 prioritization for implementing the system requirements is defined; 5 the system requirements are updated as needed; 6 consistency and bidirectional traceability are established between stakeholder requirements and system requirements; 7 the stakeholder requirements are evaluated for cost, schedule and technical impact; and 8 the system requirements are agreed and communicated to all affected parties • Some rewording but in essence the outcomes have not changed SYS.2 System Requirements Analysis – Outcomes
  • 63. 63Seite 63 Base Practices V2.5 – ENG.2 1 Identify System Requirements 2 Analyze system requirements 3 Determine the impact on the operating environment 4 Prioritize and categorize system requirements 5 Evaluate and update system requirements 6 Ensure consistency and bilateral traceability of customer requirements to system requirements 7 Communicate system requirements Base Practices V3.0 1 Specify system requirements 2 Structure system requirements 3 Analyze system requirements 4 Analyze the impact on the operating environment 5 Develop verification criteria 6 Establish bidirectional traceability 7 Ensure consistency 8 Communicate agreed system requirements Linked to 1-8 SYS.2 System Requirements Analysis – Base Practices
  • 64. 64Seite 64 Changes in V3.0 – SYS.2 System Requirements Analysis • Traceability and consistency are separated (see key concepts) • Verification criteria explicitly required in BP5 • Other than that mainly rewording
  • 65. 65Seite 65 Outcomes V2.5 – ENG.3 As a result of successful implementation of this process 1 a system architectural design is defined that identifies the elements of the system and meets the defined systems requirements; 2 the system requirements are allocated to the elements of the system; 3 internal and external interfaces of each system element are defined; 4 verification between the system requirements and the system architectural design is performed; 5 consistency and bilateral traceability are established between system requirements and system architectural design; and 6 the system requirements, the system architectural design, and their relationships are baselined and communicated to all affected parties. Outcomes V3.0 As a result of successful implementation of this process 1 a system architectural design is defined that identifies the elements of the system; 2 the system requirements are allocated to the elements of the system; 3 the interfaces of each system element are defined; 4 the dynamic behavior objectives of the system elements are defined; 5 consistency and bidirectional traceability are established between system requirements and system architectural design; and 6 the system architectural design is agreed and communicated to all affected parties • New aspect with dynamic behavior but other than that the outcomes have not changed in essence New aspect of system architecture SYS.3 System Architectural Design – Outcomes
  • 66. 66Seite 66 Base Practices V2.5 – ENG.3 1 Define system architectural design 2 Allocate System Requirements 3 Define Interfaces 4 Develop verification criteria 5 Verify System Architectural Design 6 Ensure consistency and bilateral traceability of system requirements to system architectural design 7 Communicate system architectural design Base Practices V3.0 1 Develop system architectural design 2 Allocate system requirements 3 Define interfaces of system elements 4 Describe dynamic behavior 5 Evaluate alternative system architectures 6 Establish bidirectional traceability 7 Ensure consistency 8 Communicate agreed system architectural design New aspects of the system architecture process SYS.3 System Architectural Design – Base Practices
  • 67. 67Seite 67 Changes in V3.0 – SYS.3 System Architectural Design • New: • Dynamic behavior has to be explicitly addressed (before only implicitly) • Design alternatives have to be documented • Traceability and consistency are separated (see key concepts) • No verification criteria required anymore • Review only for consistency check, otherwise on Level 2
  • 68. 68Seite 68 Outcomes V2.5 – ENG.9 As a result of successful implementation of this process 1 a system integration and system integration test strategy is developed for system elements consistent with the system architectural design according to the priorities and categorization of the system requirements; 2 a test specification for system integration test is developed to verify compliance with the system architectural design, including the interfaces between system elements; 3 an integrated system is integrated as defined by the integration strategy; 4 the integrated system elements are verified using the test cases; 5 results of system integration testing are recorded; 6 consistency and bilateral traceability are established between system architectural design and system integration test specification including test cases; and 7 a regression strategy is developed and applied for re- testing the system elements when changes are made Outcomes V3.0 As a result of successful implementation of this process 1 a system integration strategy consistent with the project plan, the release plan and the system architectural design is developed to integrate the system items; 2 a system integration test strategy including the regression test strategy is developed to test the system item interactions; 3 a specification for system integration test according to the system integration test strategy is developed that is suitable to provide evidence for compliance of the integrated system items with the system architectural design, including the interfaces between system items; 4 system items are integrated up to a complete integrated system according to the integration strategy; 5 test cases included in the system integration test specification are selected according to the system integration test strategy and the release plan; 6 system item interactions are tested using the selected test cases and the results of system integration testing are recorded; 7 consistency and 
bidirectional traceability between the elements of the system architectural design and test cases included in the system integration test specification and bidirectional traceability between test cases and test results is established; and 8 results of the system integration test are summarized and communicated to all affected parties The appropriate test cases need to be selected to support the test strategy (incl. the regression test strategy) SYS.4 System Integration and Integration Test – Outcomes
  • 69. 69Seite 69 Base Practices V2.5 – ENG.9 1 Develop system integration strategy 2 Develop system integration test strategy 3 Develop a test specification for system integration 4 Integrate system elements 5 Verify the integrated system 6 Record the results of system integration testing 7 Ensure consistency and bilateral traceability of system architectural design to the system integration test specification 8 Develop regression testing strategy and perform regression testing Base Practices V3.0 1 Develop system integration strategy 2 Develop system integration test strategy including regression test strategy 3 Develop specification for system integration test 4 Integrate system items 5 Select test cases 6 Perform system integration test 7 Establish bidirectional traceability 8 Ensure consistency 9 Summarize and communicate results New aspects of the system integration and integration test process SYS.4 System Integration and Integration Test – Base Practices
  • 70. 70Seite 70 Changes in V3.0 – SYS.4 System Integration and Integration Test • Test cases have to be selected according to the test strategy including the regression test strategy • Traceability and consistency are separated (see key concepts)
  • 71. Seite 71 Outcomes V2.5 – ENG.10 As a result of successful implementation of this process 1 a strategy is developed to test the system according to the priorities and categorization of the system requirements; 2 a test specification for system test of the integrated system is developed that demonstrates compliance with the system requirements; 3 the integrated system is verified using the test cases; 4 results of system testing are recorded; 5 consistency and bilateral traceability are established between system requirements and the system test specification including test cases; and 6 a regression test strategy is developed and applied for re-testing the integrated system when a change in system elements is made Outcomes V3.0 As a result of successful implementation of this process 1 a system qualification test strategy including regression test strategy consistent with the project plan and release plan is developed to test the integrated system; 2 a specification for system qualification test of the integrated system according to the system qualification test strategy is developed that is suitable to provide evidence for compliance with the system requirements; 3 test cases included in the system qualification test specification are selected according to the system qualification test strategy and the release plan; 4 the integrated system is tested using the selected test cases and the results of system qualification test are recorded; 5 consistency and bidirectional traceability are established between system requirements and test cases included in the system qualification test specification and between test cases and test results; and 6 results of the system qualification test are summarized and communicated to all affected parties The appropriate test cases need to be selected to support the test strategy (incl. the regression test strategy) SYS.5 System Qualification Test – Outcomes
  • 72. Seite 72 SYS.5 System Qualification Test – Base Practices
    Base Practices V2.5 – ENG.10:
    1 Develop system test strategy
    2 Develop test specification for system test
    3 Verify integrated system
    4 Record the results of system testing
    5 Ensure consistency and bilateral traceability of system requirements to the system test specification
    6 Develop system regression test strategy and perform testing
    Base Practices V3.0:
    1 Develop system qualification test strategy including regression test strategy
    2 Develop specification for system qualification test
    3 Select test cases
    4 Test integrated system
    5 Establish bidirectional traceability
    6 Ensure consistency
    7 Summarize and communicate results
    → New aspects of the system qualification test process
  • 73. Seite 73 Changes in V3.0 – SYS.5 System Qualification Test
    • New name for system test → system qualification test (from ISO/IEC 15504-5:2012)
    • Test cases have to be selected according to the test strategy, including the regression test strategy
    • Traceability and consistency are separated (see key concepts)
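The test case selection described above (V3.0 outcome 3: cases are picked from the qualification test specification according to the test strategy and the release plan) can be illustrated with a minimal sketch. This is not part of Automotive SPICE itself; the data model and the strategy values ("full", "regression", "release") are hypothetical, chosen only to make the idea concrete.

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    id: str
    requirement_ids: list      # system requirements the case verifies
    regression: bool = False   # part of the regression suite

def select_test_cases(spec, strategy, release_features):
    """Pick the cases to run for a release.

    strategy: "full", "regression", or "release" (hypothetical values)
    release_features: requirement ids planned for this release
    """
    if strategy == "full":
        return list(spec)
    if strategy == "regression":
        return [tc for tc in spec if tc.regression]
    # "release": cases covering requirements in the release plan,
    # plus the regression suite demanded by the test strategy
    return [tc for tc in spec
            if tc.regression or set(tc.requirement_ids) & set(release_features)]

spec = [
    TestCase("TC-1", ["SYS-REQ-10"], regression=True),
    TestCase("TC-2", ["SYS-REQ-11"]),
    TestCase("TC-3", ["SYS-REQ-12"]),
]
selected = select_test_cases(spec, "release", ["SYS-REQ-11"])
print([tc.id for tc in selected])  # ['TC-1', 'TC-2']
```

The point of the new outcome is exactly this filtering step: the executed subset is a documented function of the strategy and the release plan, rather than an ad hoc choice.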
  • 74. Seite 74 SWE.1 Software Requirements Analysis – Outcomes
    Outcomes V2.5 – ENG.4: As a result of successful implementation of this process
    1 the software requirements to be allocated to the software elements of the system and their interfaces are defined;
    2 software requirements are categorized and analyzed for correctness and testability;
    3 the impact of software requirements on the operating environment is evaluated;
    4 prioritization for implementing the software requirements is defined;
    5 the software requirements are approved and updated as needed;
    6 consistency and bilateral traceability are established between system requirements and software requirements; and consistency and bilateral traceability are established between system architectural design and software requirements;
    7 changes to the software requirements are evaluated for cost, schedule and technical impact; and
    8 the software requirements are baselined and communicated to all affected parties.
    Outcomes V3.0: As a result of successful implementation of this process
    1 the software requirements to be allocated to the software elements of the system and their interfaces are defined;
    2 software requirements are categorized and analyzed for correctness and verifiability;
    3 the impact of software requirements on the operating environment is analyzed;
    4 prioritization for implementing the software requirements is defined;
    5 the software requirements are updated as needed;
    6 consistency and bidirectional traceability are established between system requirements and software requirements; and consistency and bidirectional traceability are established between system architectural design and software requirements;
    7 the software requirements are evaluated for cost, schedule and technical impact; and
    8 the software requirements are agreed and communicated to all affected parties.
    → Some rewording, but in essence the outcomes have not changed
  • 75. Seite 75 SWE.1 Software Requirements Analysis – Base Practices
    Base Practices V2.5 – ENG.4:
    1 Identify software requirements
    2 Analyze software requirements
    3 Determine the impact on the operating environment
    4 Prioritize and categorize software requirements
    5 Evaluate and update software requirements
    6 Ensure consistency and bilateral traceability of system requirements to software requirements
    7 Ensure consistency and bilateral traceability of system architectural design to software requirements
    8 Communicate software requirements
    Base Practices V3.0 (linked to V2.5 BPs 1–8):
    1 Specify software requirements
    2 Structure software requirements
    3 Analyze software requirements
    4 Analyze the impact on the operating environment
    5 Develop verification criteria
    6 Establish bidirectional traceability
    7 Ensure consistency
    8 Communicate agreed software requirements
  • 76. Seite 76 Changes in V3.0 – SWE.1 Software Requirements Analysis
    • Traceability and consistency are separated (see key concepts)
    • Verification criteria explicitly required in BP5
    • Other than that, mainly rewording
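The "traceability and consistency are separated" key concept recurs throughout the V3.0 processes: bidirectional traceability is a structural property (links exist in both directions), checked independently of content consistency. A minimal sketch of the structural half, using a hypothetical link table between system and software requirements, could look like this:

```python
def check_bidirectional_traceability(sys_reqs, sw_reqs, links):
    """Structural traceability check (consistency of content is a separate activity).

    links: software requirement id -> set of system requirement ids it traces to.
    Returns (system requirements with no downstream link,
             software requirements with no upstream link).
    """
    covered = set()
    for targets in links.values():
        covered |= targets
    orphaned_sys = set(sys_reqs) - covered                 # nothing traces down to them
    orphaned_sw = {sw for sw in sw_reqs if not links.get(sw)}  # they trace up to nothing
    return orphaned_sys, orphaned_sw

sys_reqs = {"SYS-1", "SYS-2"}
sw_reqs = {"SW-1", "SW-2", "SW-3"}
links = {"SW-1": {"SYS-1"}, "SW-2": set()}
missing_sys, missing_sw = check_bidirectional_traceability(sys_reqs, sw_reqs, links)
print(missing_sys)          # {'SYS-2'}
print(sorted(missing_sw))   # ['SW-2', 'SW-3']
```

In practice such link tables live in a requirements management tool; the sketch only shows why both directions must be evaluated, which is exactly what the separated BPs "Establish bidirectional traceability" and "Ensure consistency" reflect.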
  • 77. Seite 77 SWE.2 Software Architectural Design – Outcomes
    Outcomes V2.5 – ENG.5: As a result of successful implementation of this process
    1 a software architectural design is defined that identifies the components of the software and meets the defined software requirements; (ENG.5 – 1)
    2 the software requirements are allocated to the elements of the software; (ENG.5 – 2)
    3 internal and external interfaces of each software component are defined; (ENG.5 – 3)
    4 the dynamic behaviour and resource consumption objectives of the software components are defined; (ENG.5 – 4)
    5 consistency and bilateral traceability are established between software requirements and software architectural design; (ENG.5 – 6)
    Outcomes V3.0: As a result of successful implementation of this process
    1 a software architectural design is defined that identifies the elements of the software;
    2 the software requirements are allocated to the elements of the software;
    3 the interfaces of each software element are defined;
    4 the dynamic behavior and resource consumption objectives of the software elements are defined;
    5 consistency and bidirectional traceability are established between software requirements and software architectural design; and
    6 the software architectural design is agreed and communicated to all affected parties.
    → New aspect of software architecture
    → Outcomes between SWE.2 and SWE.3 (formerly ENG.5 and ENG.6) have been rearranged
  • 78. Seite 78 SWE.2 Software Architectural Design – Base Practices
    Base Practices V2.5 – ENG.5:
    1 Develop software architectural design (ENG.5 – BP1)
    2 Allocate software requirements (ENG.5 – BP2)
    3 Define interfaces (ENG.5 – BP3)
    4 Describe dynamic behaviour (ENG.5 – BP4)
    5 Define resource consumption objectives (ENG.5 – BP5)
    6 Develop verification criteria (ENG.5 – BP7)
    7 Verify software design (ENG.5 – BP8)
    8 Ensure consistency and bilateral traceability of software requirements to software architectural design (ENG.5 – BP9)
    Base Practices V3.0:
    1 Develop software architectural design
    2 Allocate software requirements
    3 Define interfaces of software elements
    4 Describe dynamic behavior
    5 Define resource consumption objectives
    6 Evaluate alternative software architectures
    7 Establish bidirectional traceability
    8 Ensure consistency
    9 Communicate agreed software architectural design
    → New aspects of the software architecture process
  • 79. Seite 79 Changes in V3.0 – SWE.2 Software Architectural Design
    • New:
      • Design alternatives have to be documented
      • The architecture has to be agreed upon and communicated
    • The processes of design and unit construction (ENG.5/6) have been split into three different processes (SWE.2–4)
    • Traceability and consistency are separated (see key concepts)
    • Verification criteria are not explicitly required
    • Review only for consistency check; otherwise covered on Level 2
  • 80. Seite 80 SWE.3 Software Detailed Design and Unit Construction – Outcomes
    Outcomes V2.5 – ENG.5/6: As a result of successful implementation of this process
    1 a detailed design is developed that describes software units that can be implemented and tested; (ENG.5 – 5)
    2 internal and external interfaces of each software component are defined; (ENG.5 – 3)
    3 the dynamic behaviour and resource consumption objectives of the software components are defined; (ENG.5 – 4)
    4 consistency and bilateral traceability are established between software architectural design and software detailed design; (ENG.5 – 7)
    5 software units defined by the software design are produced; (ENG.6 – 3)
    6 consistency and bilateral traceability are established between software detailed design and software units; (ENG.6 – 6)
    Outcomes V3.0: As a result of successful implementation of this process
    1 a detailed design is developed that describes software units;
    2 interfaces of each software unit are defined;
    3 the dynamic behavior of the software units is defined;
    4 consistency and bidirectional traceability are established between software requirements and software units; and consistency and bidirectional traceability are established between software architectural design and software detailed design; and consistency and bidirectional traceability are established between software detailed design and software units;
    5 the software detailed design and the relationship to the software architectural design is agreed and communicated to all affected parties; and
    6 software units defined by the software detailed design are produced.
    → Outcomes between SWE.2 and SWE.3 (formerly ENG.5 and ENG.6) have been rearranged
    → New aspect of software detailed design
  • 81. Seite 81 SWE.3 Software Detailed Design and Unit Construction – Base Practices
    Base Practices V2.5 – ENG.5/6:
    1 Develop detailed design (ENG.5 – BP6)
    2 Define interfaces (ENG.5 – BP3)
    3 Describe dynamic behaviour (ENG.5 – BP4)
    4 Analyze software units (ENG.6 – BP2)
    5 Prioritize and categorize software units (ENG.6 – BP3)
    6 Develop verification criteria (ENG.5 – BP7)
    7 Verify software design (ENG.5 – BP8)
    8 Ensure consistency and bilateral traceability of software architectural design to software detailed design (ENG.5 – BP10)
    9 Ensure consistency and bilateral traceability of software detailed design to software units (ENG.6 – BP8)
    10 Ensure consistency and bilateral traceability of software requirements to software units (ENG.6 – BP9)
    11 Develop software units (ENG.6 – BP4)
    Base Practices V3.0:
    1 Develop software detailed design
    2 Define interfaces of software units
    3 Describe dynamic behavior
    4 Evaluate software detailed design
    5 Establish bidirectional traceability
    6 Ensure consistency
    7 Communicate agreed software detailed design
    8 Develop software units
    → New aspects of SWE.3
    → Linked to BPs 5 and 6
  • 82. Seite 82 Changes in V3.0 – SWE.3 Software Detailed Design and Unit Construction
    • New: the design has to be agreed upon and communicated
    • The processes of design and unit construction (ENG.5/6) have been split into three different processes (SWE.2–4)
    • Analysis and prioritization of the detailed design/units (ENG.6 BP2/3) is covered by the evaluation of the detailed design (SWE.3 BP4)
    • Traceability and consistency are separated (see key concepts)
    • Verification criteria are no longer explicitly required
    • Review only for consistency check; otherwise covered on Level 2
  • 83. Seite 83 SWE.4 Software Unit Verification – Outcomes
    Outcomes V2.5 – ENG.6: As a result of successful implementation of this process
    1 a unit verification strategy is developed for software units consistent with the software design; (ENG.6 – 1)
    2 software units defined by the software design are analyzed for correctness and testability; (ENG.6 – 2)
    3 software units are verified according to the unit verification strategy; (ENG.6 – 4)
    4 results of unit verification are recorded; (ENG.6 – 5) and
    5 consistency and bilateral traceability are established between software detailed design and software units; (ENG.6 – 6)
    Outcomes V3.0: As a result of successful implementation of this process
    1 a software unit verification strategy including regression strategy is developed to verify the software units;
    2 criteria for software unit verification are developed according to the software unit verification strategy that are suitable to provide evidence for compliance of the software units with the software detailed design and with the non-functional software requirements;
    3 software units are verified according to the software unit verification strategy and the defined criteria for software unit verification and the results are recorded;
    4 consistency and bidirectional traceability are established between software units, criteria for verification and verification results; and
    5 results of the unit verification are summarized and communicated to all affected parties.
    → Outcomes between SWE.3 and SWE.4 (formerly ENG.6) have been rearranged
    → New aspect of software unit verification
  • 84. Seite 84 SWE.4 Software Unit Verification – Base Practices
    Base Practices V2.5 – ENG.6:
    1 Define a unit verification strategy (ENG.6 – BP1)
    2 Analyze software units (ENG.6 – BP2) (covered in SWE.3)
    3 Prioritize and categorize software units (ENG.6 – BP3) (covered in SWE.3)
    4 Develop unit verification criteria (ENG.6 – BP5)
    5 Verify software units (ENG.6 – BP6)
    6 Record the results of unit verification (ENG.6 – BP7)
    7 Ensure consistency and bilateral traceability of software units to test specification for software units (ENG.6 – BP10)
    Base Practices V3.0:
    1 Develop software unit verification strategy including regression strategy
    2 Develop criteria for unit verification
    3 Perform static verification of software units
    4 Test software units
    5 Establish bidirectional traceability
    6 Ensure consistency
    7 Summarize and communicate results
    → New aspects of the software unit verification process
  • 85. Seite 85 Changes in V3.0 – SWE.4 Software Unit Verification
    • New:
      • The processes of design and unit construction (ENG.5/6) have been split into three different processes (SWE.2–4)
      • All verification activities at unit level are covered in this process
    • Traceability and consistency are separated (see key concepts)
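SWE.4 bundles static verification and unit testing into one process and asks for the results to be summarized and communicated (BP7). A minimal sketch of such a per-unit summary follows; the unit names, finding texts, and pass/fail rule are invented for illustration and are not prescribed by Automotive SPICE.

```python
def summarize_unit_verification(units, static_findings, test_results):
    """One verdict per software unit: 'pass' only if static verification
    reported no findings and all recorded unit tests passed."""
    summary = {}
    for unit in units:
        clean = not static_findings.get(unit, [])     # no static findings
        tests = test_results.get(unit, [])
        tested_ok = bool(tests) and all(tests)        # at least one test, all passed
        summary[unit] = "pass" if clean and tested_ok else "fail"
    return summary

units = ["uart.c", "crc.c"]
static_findings = {"uart.c": [], "crc.c": ["coding-guideline violation"]}
test_results = {"uart.c": [True, True], "crc.c": [True]}
print(summarize_unit_verification(units, static_findings, test_results))
# {'uart.c': 'pass', 'crc.c': 'fail'}
```

The point is that both verification kinds feed one communicated result per unit, which is what distinguishes the V3.0 process from the scattered ENG.6 practices.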
  • 86. Seite 86 SWE.5 Software Integration and Integration Test – Outcomes
    Outcomes V2.5 – ENG.7: As a result of successful implementation of this process
    1 a software integration and integration test strategy is developed for software items consistent with the software design according to the priorities and categorization of the software requirements;
    2 a test specification for software integration is developed that ensures compliance with the software architectural design and the software detailed design allocated to the items;
    3 software units and software items are integrated as defined by the integration strategy;
    4 integrated software items are verified using the test cases;
    5 results of software integration testing are recorded;
    6 consistency and bilateral traceability are established between software architectural design and software detailed design to software integration test specification including test cases; and
    7 a regression strategy is developed and applied for re-integrating and re-verifying software items when a change in software items (including associated requirements, design and code) occurs.
    Outcomes V3.0: As a result of successful implementation of this process
    1 a software integration strategy consistent with the project plan, release plan and the software architectural design is developed to integrate the software items;
    2 a software integration test strategy including the regression test strategy is developed to test the software unit and software item interactions;
    3 a specification for software integration test according to the software integration test strategy is developed that is suitable to provide evidence for compliance of the integrated software items with the software architectural design, including the interfaces between the software units and between the software items;
    4 software units and software items are integrated up to a complete integrated software according to the integration strategy;
    5 test cases included in the software integration test specification are selected according to the software integration test strategy and the release plan;
    6 integrated software items are tested using the selected test cases and the results of software integration test are recorded;
    7 consistency and bidirectional traceability are established between the elements of the software architectural design and the test cases included in the software integration test specification and between test cases and test results; and
    8 results of the software integration test are summarized and communicated to all affected parties.
    → New aspect of software integration and integration test; some rewording
  • 87. Seite 87 SWE.5 Software Integration and Integration Test – Base Practices
    Base Practices V2.5 – ENG.7:
    1 Develop software integration strategy
    2 Develop software integration test strategy
    3 Develop test specification for software integration test
    4 Integrate software units and software items
    5 Verify the integrated software
    6 Record the results of software integration testing
    7 Ensure consistency and bilateral traceability of software architectural design and software detailed design to software integration test specification
    8 Develop regression testing strategy and perform regression testing
    Base Practices V3.0:
    1 Develop software integration strategy
    2 Develop software integration test strategy including regression test strategy
    3 Develop specification for software integration test
    4 Integrate software units and software items
    5 Select test cases
    6 Perform software integration test
    7 Establish bidirectional traceability
    8 Ensure consistency
    9 Summarize and communicate results
    → New aspects of the software integration and integration test process
  • 88. Seite 88 Changes in V3.0 – SWE.5 Software Integration and Integration Test
    • New: selection of test cases based on the test strategy
    • Traceability and consistency are separated (see key concepts)
  • 89. Seite 89 SWE.6 Software Qualification Test – Outcomes
    Outcomes V2.5 – ENG.8: As a result of successful implementation of this process
    1 a strategy is developed to test the integrated software according to the priorities and categorization of the software requirements;
    2 a test specification for software test of the integrated software is developed that demonstrates compliance to the software requirements;
    3 the integrated software is verified using the test cases;
    4 results of software testing are recorded;
    5 consistency and bilateral traceability are established between software requirements and software test specification including test cases; and
    6 a regression test strategy is developed and applied for re-testing the integrated software when a change in software items occurs.
    Outcomes V3.0: As a result of successful implementation of this process
    1 a software qualification test strategy including regression test strategy consistent with the project plan and release plan is developed to test the integrated software;
    2 a specification for software qualification test of the integrated software according to the software qualification test strategy is developed that is suitable to provide evidence for compliance with the software requirements;
    3 test cases included in the software qualification test specification are selected according to the software qualification test strategy and the release plan;
    4 the integrated software is tested using the selected test cases and the results of software qualification test are recorded;
    5 consistency and bidirectional traceability are established between software requirements and software qualification test specification including test cases and between test cases and test results; and
    6 results of the software qualification test are summarized and communicated to all affected parties.
    → The appropriate test cases need to be selected to support the test strategy (incl. the regression test strategy)
  • 90. Seite 90 SWE.6 Software Qualification Test – Base Practices
    Base Practices V2.5 – ENG.8:
    1 Develop software test strategy
    2 Develop test specification for software test
    3 Verify integrated software
    4 Record the results of software testing
    5 Ensure consistency and bilateral traceability of software requirements to software test specification
    6 Develop regression test strategy and perform regression testing
    Base Practices V3.0:
    1 Develop software qualification test strategy including regression test strategy
    2 Develop specification for software qualification test
    3 Select test cases
    4 Test integrated software
    5 Establish bidirectional traceability
    6 Ensure consistency
    7 Summarize and communicate results
    → New aspects of the software qualification test process
  • 91. Seite 91 Changes in V3.0 – SWE.6 Software Qualification Test
    • New name for software test → software qualification test (from ISO/IEC 15504-5:2012)
    • New: selection of test cases based on the test strategy
    • Traceability and consistency are separated (see key concepts)
  • 92. Seite 92 Changes to Generic Practices – CL2 (1/3)
    Vers. 3.0 / ISO 330xx:
    • GP 2.1.1 Identify the objectives for the performance of the process.
    • GP 2.1.2 Plan the performance of the process to fulfill the identified objectives.
    • GP 2.1.3 Monitor the performance of the process against the plans.
    • GP 2.1.4 Adjust the performance of the process.
    • GP 2.1.5 Define responsibilities and authorities for performing the process.
    • GP 2.1.6 Identify, prepare, and make available resources to perform the process according to plan.
    • GP 2.1.7 Manage the interfaces between involved parties.
    Vers. 2.5 / ISO 15504:
    • GP 2.1.1 Identify the objectives for the performance of the process.
    • GP 2.1.2 Plan and monitor the performance of the process to fulfill the identified objectives.
    • GP 2.1.3 Adjust the performance of the process.
    • GP 2.1.4 Define responsibilities and authorities for performing the process.
    • GP 2.1.5 Identify and make available resources to perform the process according to plan.
    • GP 2.1.6 Manage the interfaces between involved parties.
  • 93. Seite 93 Changes to Generic Practices – CL2 (2/3)
    GP 2.1.2 Plan the performance of the process to fulfill the identified objectives. [ACHIEVEMENT b]
    • Plan(s) for the performance of the process are developed.
    • The process performance cycle is defined.
    • Key milestones for the performance of the process are established.
    • Estimates for process performance attributes are determined and maintained.
    • Process activities and tasks are defined.
    • Schedule is defined and aligned with the approach to performing the process.
    • Process work product reviews are planned.
    GP 2.1.3 Monitor the performance of the process against the plans. [ACHIEVEMENT c]
    • The process is performed according to the plan(s).
    • Process performance is monitored to ensure planned results are achieved and to identify possible deviations.
  • 94. Seite 94 Changes to Generic Practices – CL2 (3/3)
    GP 2.1.6 Identify, prepare, and make available resources to perform the process according to plan. [ACHIEVEMENT f, g]
    • The human and infrastructure resources necessary for performing the process are identified, made available, allocated and used.
    • The individuals performing and managing the process are prepared by training, mentoring, or coaching to execute their responsibilities.
    • The information necessary to perform the process is identified and made available.
  • 95. Seite 95 Changes to Generic Practices – CL3 (1/2)
    GP 3.1.1 Define and maintain the standard process that will support the deployment of the defined process. [ACHIEVEMENT a]
    • A standard process is developed and maintained that includes the fundamental process elements.
    • …
    GP 3.1.3 Identify the roles and competencies, responsibilities, and authorities for performing the standard process. [ACHIEVEMENT c]
    • Process performance roles are identified.
    • Competencies for performing the process are identified.
    • Authorities necessary for executing responsibilities are identified.
  • 96. Seite 96 Changes to Generic Practices – CL3 (2/2)
    GP 3.1.5 Determine suitable methods and measures to monitor the effectiveness and suitability of the standard process. [ACHIEVEMENT e]
    • Methods and measures for monitoring the effectiveness and suitability of the process are determined.
    • …
    PA 3.2 Process deployment attribute: the process deployment attribute is a measure of the extent to which the standard process is effectively deployed as a defined process to achieve its process outcomes. As a result of full achievement of this attribute: …
    In addition, the following note was added to GP 3.2.6 Collect and analyze data about performance of the process to demonstrate its suitability and effectiveness. [ACHIEVEMENT f]
    • NOTE 1: Data about process performance may be qualitative or quantitative.
  • 97. Seite 97 Changes to Generic Practices – CL4 (1/2)
    Vers. 3.0 / ISO 330xx:
    • GP 4.1.1 Identify business goals.
    • GP 4.1.2 Establish process information needs.
    • GP 4.1.3 Derive process measurement objectives from process information needs.
    • GP 4.1.4 Identify measurable relationships between process elements.
    • GP 4.1.5 Establish quantitative objectives.
    • GP 4.1.6 Identify process measures that support the achievement of the quantitative objectives.
    • GP 4.1.7 Collect product and process measurement results through performing the defined process.
    Vers. 2.5 / ISO 15504:
    • GP 4.1.1 Identify process information needs, in relation with business goals.
    • GP 4.1.2 Derive process measurement objectives from process information needs.
    • GP 4.1.3 Establish quantitative objectives for the performance of the defined process, according to the alignment of the process with the business goals.
    • GP 4.1.4 Identify product and process measures that support the achievement of the quantitative objectives for process performance.
    • GP 4.1.5 Collect product and process measurement results through performing the defined process.
    • GP 4.1.6 Use the results of the defined measurement to monitor and verify the achievement of the process performance objectives.
  • 98. Seite 98 Changes to Generic Practices – CL4 (2/2)
    GP 4.1.4 Identify measurable relationships between process elements. [ACHIEVEMENT a, d]
    • Identify the relationships between process elements which contribute to the derived measurement objectives.
  • 99. Seite 99 Changes to Generic Practices – CL5
    Vers. 3.0 / ISO 330xx:
    • GP 5.1.1 Define the process innovation objectives for the process that support the relevant business goals.
    • GP 5.1.2 Analyze data of the process to identify opportunities for innovation.
    • GP 5.1.3 Analyze new technologies and process concepts to identify opportunities for innovation.
    • GP 5.1.4 Define and maintain an implementation strategy based on innovation vision and objectives.
    Vers. 2.5 / ISO 15504:
    • GP 5.1.1 Define the process improvement objectives for the process that support the relevant business goals.
    • GP 5.1.2 Analyse measurement data of the process to identify real and potential variations in the process performance.
    • GP 5.1.3 Identify improvement opportunities of the process based on innovation and best practices.
    • GP 5.1.4 Derive improvement opportunities of the process from new technologies and process concepts. Impact of new technologies on process performance is identified and evaluated.
    • GP 5.1.5 Define an implementation strategy based on long-term improvement vision and objectives.
  • 101. Seite 101 Need more advice?
    • Meet the experts: We offer free half-day public seminars. Our experts will explain the changes and answer your individual questions.
      • For English-language introductions see www.kuglermaag.com/aspice-intro
      • For German-language introductions see www.kuglermaag.de/aspice-intro
    • Schedule a one-day in-house seminar: Our experts will explain the changes, answer your questions, and also look at your individual situation and plan how to upgrade your organization to Automotive SPICE 3.0 compliance.
      • For update trainings see www.kuglermaag.com/aspice-update (English version) or www.kuglermaag.de/aspice-update (German version)
    • Any other questions or concerns? See “contact information”
  • 102. Seite 102 About the Authors
    Fabio Bella
    • Process Director at Kugler Maag Cie, Country Manager for Italy
    • intacs™ advisory board member
    • intacs™ SPICE Principal Assessor, intacs™ SPICE Instructor
    • TÜV Rheinland Functional Safety Engineer (Automotive)
    • Volkswagen-certified Software Quality Improvement Leader (SQIL)
    Dr. Klaus Hoermann
    • Principal and Partner at Kugler Maag Cie
    • Leader of the intacs™ working group “Exams”
    • intacs™ SPICE Principal Assessor, intacs™ SPICE Instructor
    • Volkswagen-certified Software Quality Improvement Leader (SQIL)
    • CMMI® SCAMPI Lead Appraiser (CMMI Institute-certified)
    • CMMI® Instructor (CMMI Institute-certified)
    • Scrum Master (Scrum.org-certified)
    Bhaskar Vanamali
    • Process Director at Kugler Maag Cie
    • Member of the VDA AK13 (working on Automotive SPICE), member of SC7 WG10 (working on ISO 15504/ISO 33000)
    • intacs™ SPICE Principal Assessor, intacs™ SPICE Instructor
    • Six Sigma Green Belt
    • Volkswagen-certified Software Quality Improvement Leader (SQIL)
  • 103. Seite 103 Contact information
    KUGLER MAAG CIE GmbH, Leibnizstr. 11, 70806 Kornwestheim, Germany
    information@kuglermaag.com · www.kuglermaag.de · Phone +49 7154 1796 100
    KUGLER MAAG CIE North America Inc., Columbia Center, 201 W Big Beaver Rd, Troy, MI 48084, USA
    usa@kuglermaag.com · www.kuglermaag.com · Phone +1 248 687 1210
    KUGLER MAAG CIE Central Eastern Europe
    cee@kuglermaag.com · Phone +48 513 144 297
  • 104. Seite 104 © KUGLER MAAG CIE GmbH Thank you for your attention. Questions? Comments?