5. WE SOLVE BIG TECHNICAL CHALLENGES!
WHY TICKETMASTER?
Handle huge traffic spikes during on-sales
Build e-commerce platforms that scale
Exploit channel fragmentation
Maximise benefits of SEO and social media
Distinguish between traffic from bots and fans
Take advantage of Big Data
11. DEVOPS STRATEGY OBJECTIVES
Same levels of understanding and skill
Break down barriers & silos
Align and bring teams closer together
Develop & release quality products fast
Ensure stability & reliability
4 GOALS: MAXIMISE FOR
Delivering business value
Efficiency & quality of development
Reliability of applications & environments
Service delivery
13. FLEXIBLE ROUTES TO TARGETS
Matrices provide the vision and targets
All teams have the same targets
Teams can plan routes flexibly
Choices based on needs and value
Routes can change as needs change
14. STANDARDISATION: Simplify Tools & Define Key Specifications
Less support overhead
Solve problems once
Share knowledge
Provide Guidelines & Best Practices
Shared common understanding
Definition of Terms
Speak the same language
18. WHAT HAVE WE LEARNT?
Standardised Tooling Changes how we do things
Good Reporting Changes how we communicate
Good Communication Changes what we believe in
Internal & External to Engineering
Culture is changing!
21. HOUSTON, WE HAVE A TECH DEBT PROBLEM!
13 PLATFORMS
5M LINES OF CODE
200 COMPONENTS
>9000 SERVERS
22. WHAT DID WE DO?
1. Definition: What problem are we trying to solve? Why do we care?
2. Process: Research (has anyone else done this?), Filter the data, Creation of model, Tooling and data capture
3. Results: Make the data usable
4. Report
24. THE PROBLEM IS BIGGER THAN US
Industry consensus:
• Technical debt is a big problem
• Software decays over time
• Immature industry
• Technical debt lacks credibility as a term
Costs are significant:
• Economic
• Maintenance (the 60/60 rule: roughly 60% of software cost is maintenance, and roughly 60% of that maintenance is enhancement)
• Psychological
25. 1. DEFINITION: WHY DO WE CARE?
Historically, software engineering has struggled to articulate the problem to a non-technical audience.
EFFICIENCY
TOTAL COST OF OWNERSHIP
DEBT MANAGEMENT
VISIBILITY OF DEBT COST
27. RESEARCH FINDINGS
KEY FINDING 1: TECHNICAL DEBT AS AN INDUSTRY TERM MEANS THE MAINTAINABILITY OF THE APPLICATION CODE
KEY FINDING 2: INFRASTRUCTURE DEBT ISN'T A GENERALLY WELL-KNOWN OR WIDELY USED CONCEPT
KEY FINDING 3: ARCHITECTURE DEBT IS A KNOWN CONCEPT WHICH IS NOT WIDELY USED, AND THERE IS NO CONSENSUS ON ITS DEFINITION
28. TICKETMASTER TECH DEBT MODEL
Application Debt:
• Code coverage
• Cyclomatic complexity
• Application performance
• Application security
• Application maintainability
Infrastructure Debt:
• Infrastructure performance
• Scalability
• Infrastructure security
• Infrastructure maintainability
• Continuity
• Repeatability
• Documentation
• Infrastructure complexity
Architecture Debt:
• Deviation from reference architecture
• Flexibility
• Single points of failure
• Architecture complexity
OUR APPROACH TO SOLVING THIS PROBLEM HAS GENERATED A LOT OF INTEREST: 13.5K VIEWS ON OUR BLOG POST
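The three-category model above can be sketched as a simple data structure. The metric names come from the slide; the structure and the helper function are illustrative assumptions, not the actual tooling.

```python
# Sketch only: the Ticketmaster tech debt model as a category -> metrics map.
# Metric names are from the slide; the representation itself is an assumption.
TECH_DEBT_MODEL = {
    "Application Debt": [
        "Code coverage",
        "Cyclomatic complexity",
        "Application performance",
        "Application security",
        "Application maintainability",
    ],
    "Infrastructure Debt": [
        "Infrastructure performance",
        "Scalability",
        "Infrastructure security",
        "Infrastructure maintainability",
        "Continuity",
        "Repeatability",
        "Documentation",
        "Infrastructure complexity",
    ],
    "Architecture Debt": [
        "Deviation from reference architecture",
        "Flexibility",
        "Single points of failure",
        "Architecture complexity",
    ],
}

def metrics_for(category: str) -> list:
    """Return the metrics tracked under one debt category."""
    return TECH_DEBT_MODEL[category]
```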
30. ARCHITECTURE DEBT
RESIDES IN THE DESIGN OF THE ENTIRE SYSTEM
APPLICATION DEBT
RESIDES IN THE SOFTWARE PACKAGE
INFRASTRUCTURE DEBT
RESIDES IN OPERATING ENVIRONMENTS
31. HOW ARE WE MEASURING IT?
It is essential that the process for measuring technical debt is:
• Repeatable
• Transparent
• Consistent
We have therefore aimed to collect as many of the metrics as possible via automated tooling:
• Application Debt: code coverage, cyclomatic complexity, time to interact, application security
• Infrastructure Debt: infrastructure performance, scalability
The remaining debt metrics are covered by a manual process.
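As one illustration of keeping an automated metric repeatable and transparent, the thresholds that map a raw tool reading onto a debt score can be published alongside the process. A minimal sketch, assuming a 1-5 debt scale and invented coverage thresholds (not Ticketmaster's published values):

```python
# Hypothetical thresholds mapping test coverage % onto the 1-5 debt scale
# (1 = low debt, 5 = high debt). Values are assumptions for illustration.
COVERAGE_THRESHOLDS = [(80, 1), (60, 2), (40, 3), (20, 4)]

def coverage_debt_score(coverage_pct: float) -> int:
    """Convert a raw coverage reading into a debt score via fixed bands."""
    for floor, score in COVERAGE_THRESHOLDS:
        if coverage_pct >= floor:
            return score
    return 5  # lowest coverage band = highest debt
```

Publishing the table makes the conversion transparent, and re-running the tool on the same code always yields the same score, which keeps the process repeatable and consistent.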
32. DATA CAPTURE
A high-level component diagram of the platform is made visible to the whole team.
Each team member scores the components on a pre-agreed scale of 1-5 (low debt to high debt).
Each component is discussed one by one to reach consensus among the team on the score (similar to an Agile planning session).
MANUAL MAPPING
The team identifies the work items required to remove or reduce the tech debt from each component.
Work items are ordered by the priority in which the team feels they should be addressed, i.e. which would have the greatest impact on day-to-day development.
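The scoring and prioritisation steps above can be sketched as follows. Using the median of individual votes as the starting point for the consensus discussion is our assumption, and the component names are invented:

```python
from statistics import median

# Illustrative sketch of the manual scoring step: each team member scores a
# component 1 (low debt) to 5 (high debt). Taking the median as the opening
# consensus figure is an assumption, not prescribed by the process above.
def consensus_score(votes):
    if not all(1 <= v <= 5 for v in votes):
        raise ValueError("scores must be on the agreed 1-5 scale")
    return median(votes)

# Hypothetical votes for two made-up components.
votes_by_component = {
    "checkout": [4, 5, 4, 3],
    "search": [2, 2, 3, 2],
}
scores = {c: consensus_score(v) for c, v in votes_by_component.items()}

# Highest-debt components surface first when ordering remediation work items.
priority_order = sorted(scores, key=scores.get, reverse=True)
```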
35. MAKE THE DATA USABLE
Heading: category and system
Definition: each definition is accompanied by its meaning
Score: current and target score are listed
Highlight note: main update since the last report, or a highlight note
Top items to be addressed: the highest-priority items to be addressed, with the risk of not addressing them, the benefit of the work, and the schedule (if known)
36. MAKE THE DATA USABLE
EXECUTIVE SUMMARY: ROLLED-UP SCORE
DETAILED BREAKDOWN
TECHNICAL BACKLOGS
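A rolled-up executive score can be derived from the detailed per-component breakdown. The unweighted mean below is an assumption for illustration; the actual roll-up method is not specified in the slides, and the component scores are invented:

```python
# Sketch of rolling a detailed breakdown up into one executive-summary score.
# An unweighted mean is assumed; a real report might weight by component size
# or business criticality.
def rolled_up_score(component_scores):
    """Average the per-component debt scores to one headline figure."""
    return round(sum(component_scores.values()) / len(component_scores), 1)

# Hypothetical detailed breakdown (1 = low debt, 5 = high debt).
breakdown = {"checkout": 4.0, "search": 2.0, "payments": 3.0}
headline = rolled_up_score(breakdown)
```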
37. TECH DEBT SUMMARY
14% YOY REDUCTION
WHAT WE LEARNT:
- APM IS HARD TO MEASURE
- APPLICATION & INFRASTRUCTURE DEBT CAN BE REDUCED TACTICALLY ALONGSIDE FEATURE WORK
- ARCHITECTURE DEBT NEEDS SIGNIFICANT DEDICATED RESOURCE AND A STRATEGIC PLAN TO SHIFT
40. WHERE ARE YOU GOING?
Where do you want to go in your career?
How are you going to get there? What do you need to learn?
What support is there for your career goals/ambitions?
What resources do you need to achieve your goals?
What are your next steps?
45. DEVELOPING A COMPETENCY MODEL
Taxi Driver Skills Framework
Skill types:
• What I know (knowledge, best practices, etc): Map Locations, Highway Code, Accounting
• How I do it (attitudinal, etc): Communication, Service
• What I do (functions, actions, activities, etc): Drive Taxi, Negotiate Traffic, Collect Fares
Competency levels (examples):
• Map Locations: Level 1 - knows cities; Level 2 - knows landmarks; Level 3 - knows post codes; Level 4 - knows streets
• Driving: Level 1 - drives set routes; Level 2 - selects fastest route; Level 3 - selects alternative routes in jams
• Communication: Level 1 - courteous; Level 2 - interested in customer; Level 3 - conversational
Capability: Professional Skills, Technical Skills, Behaviour Skills
46. ENGINEERING COMPETENCY MODEL
Engineering Skills Framework
• Professional Skills (Levels 1-5): e.g. Application Design, Writing Code, Testing
• Technical Skills (Levels 1-5): e.g. Developer tech skills, QA tech skills, etc.
• Behaviour Skills (Foundation, Professional, Advanced, Expert): e.g. Team Work, Communication
Example capability and competency for a Software Engineer I:
• Capability: Design & Coding, Level 2
• Competency: Software Construction - Design & Coding, Level 2
• Working with Others: Professional - recognised team player
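The framework's shape can be sketched as a mapping from skills to required levels, so that training needs fall out of a simple comparison. The skill names and level bands echo the slide; the role profile values below are invented for illustration:

```python
# Hypothetical role profile: the level a "Software Engineer I" role requires
# for each skill. Skill names echo the slide; the numbers are assumptions.
ROLE_PROFILE = {
    "Design & Coding": 2,
    "Working with Others": 3,  # e.g. the "Professional" band
}

def training_needs(current, required):
    """Skills where the engineer's current level is below what the role requires."""
    return {
        skill: level - current.get(skill, 0)
        for skill, level in required.items()
        if current.get(skill, 0) < level
    }

# An engineer at Design & Coding level 1 with no assessed Working with Others
# level would have gaps in both skills against this profile.
gaps = training_needs({"Design & Coding": 1}, ROLE_PROFILE)
```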
47. WHAT HAVE WE BASED PROFESSIONAL SKILLS ON?
CAREER MAPPING
Software Engineering Competency Model (SWECOM): an IEEE (Institute of Electrical and Electronics Engineers) standards-based framework for software engineering professionals, drawn together from 7 different best-practice reference guidelines and standards.
WHY?
• Defines activities rather than specific skills
• Can be applied across a broad range of roles
• Mapping is flexible and allows for team variations
48. SWECOM FRAMEWORK
QBR Q1 2015, MARCH 2015
SDLC Lifecycle Skills - 5 Knowledge Areas: Requirements, Design, Development, Testing, Support
These are the activities required to create, release and maintain software, which is what we DO!
Your core focus will differ depending on your role group.
49. SWECOM FRAMEWORK
Cross-cutting skills - 7 Knowledge Areas: Measurements, Configuration Management, Security, Quality, Systems Engineering, Process & Lifecycle, HCI / UX
These are activities that traverse the lifecycle skills (Requirements, Design, Development, Testing, Support).
50. BUILDING THE TM FRAMEWORK
Lifecycle skills: Requirements, Design, Development, Testing, Support
Cross-cutting skills: Measurements, Configuration Management, Security, Quality, Systems Engineering, Process & Lifecycle, HCI / UX
Technical skills by role group: QA, Developer, Front-end, Architects, DevOps
Behaviours: Service Excellence, Personal Effectiveness, Team Work, Visioning Process
The framework combines technical and behavioural skills.
51. COMPETENCY FRAMEWORK
• Consistency across roles in engineering teams
• Common language of skills
• Flexible framework which can be modified as required
• Training needs identified
• 5 levels of requirements, skills, knowledge and behaviour
54. PROGRESSION GUIDE
A consistent set of levels of attainment for each role:
• Cross-cutting skills (e.g. Security)
• SDLC skills (e.g. Development)
• Technical skills (e.g. Data storage)
• Behavioural skills (e.g. Working with others)
Core competencies are highlighted: these define the skill levels that must be reached in order to progress your career.
57. THANK YOU
Stephen Williams: VP Engineering
E: Stephen.Williams@Ticketmaster.co.uk
T: @Steve2358
Simon Tarry: Director of Engineering Strategy
E: Simon.Tarry@Ticketmaster.co.uk
T: @simontarry76
LNEJOBS.COM