Microsoft ALM Platform Overview
1. Microsoft Application Lifecycle Management
a 30-minute overview
Steve Lange
sr. developer technology specialist | microsoft – denver, co
stevenl@microsoft.com | slange.me
2. Agenda
• What is Application Lifecycle Management?
• Microsoft’s ALM Platform
• Focus Areas
– Implement process with minimal overhead
– Plan & manage projects
– Align roles across the lifecycle
– Report across project boundaries
• Summary
7. But first, what about you?
• Version Control
• Test Case Management
• Task Management
• Requirements Management
• Bug Tracking
• Automated Build
• Testing
• Development Automation Tool
14. Process Guidance & Automation
• Baked into Team Foundation Server
• Provides contextual guidance
(documentation)
• Delivered via Process Templates Enable predictability and
• Use templates out of the box, or repeatability across
projects
create your own
• Completely customizable
15. Process Templates
MSF for Agile Software Development
• Product planning based on user stories and story points
• Team progresses most work by moving from active to resolved to closed
• Team is not usually required to support rigorous audits
MSF for CMMI
• Product planning based on requirements and CRs
• Most work moves from proposed to active to resolved to closed
• Team is required to maintain rigorous audit trails
• Team is working toward CMMI appraisal
Visual Studio Scrum
• Development lifecycle follows the Scrum framework (based on Agile principles)
MSF = Microsoft Solutions Framework
CMMI = Capability Maturity Model Integration
18. Organization
Classification of work artifacts
– Where
– When
Integration organizes efforts
– Build
– Collaboration
Customize to meet team needs
– Not the other way around
31. Reporting: Not just for managers
Reports delivered via Reporting Services, Excel, and dashboards, including:
• Bug Status
• Bug Trends
• Reactivations
• Build Quality Indicators
• Build Success Over Time
• Build Summary
• Burndown & Burn Rate
• Release Burndown
• Sprint Burndown
• Remaining Work
• Requirements Progress
• Status on All Iterations
• Stories Overview
• Test Case Readiness
• Unplanned Work
33. Microsoft’s ALM Solution
Create happy teams, enabling success from idea to retirement
• Process with minimal overhead
• Plan & manage projects
• Align roles across the lifecycle
• Report across project boundaries
35. Thank You
Steve Lange
sr. developer technology specialist | microsoft
stevenl@microsoft.com | 303-918-0500
slange.me | @stevelange
Upcoming Office Hours:
– 11/18 @ 9:30 AM (Pacific)
– 12/2 @ 9:30 AM (Pacific)
– 12/16 @ 9:30 AM (Pacific)
– 12/30 @ 9:30 AM (Pacific)
– 1/13 (2012) @ 9:30 AM (Pacific)
(see blog for details)
37. Links & Resources
• Videos
– Video: Proactive Project Management with Visual Studio 2010
– Improving Developer-Tester Collaboration with Microsoft Visual Studio 2010
• Whitepapers
– What is ALM?
– IDC MarketScape Excerpt: IT Project and Portfolio Management 2010 Vendor Analysis
– The Forrester Wave: Agile Development Management Tools, Q2 2010
– Attaining Optimal Business Value from Agile Software Development
– White Paper: Reconciling the Agile Team with Enterprise Project Management
– Magic Quadrant for Integrated Software Quality Suites
• Microsoft Solutions Framework
– MSF for Agile homepage
– MSF for CMMI homepage
– Visual Studio Scrum 1.0 homepage
– MSF for Agile+SDL v5.0
38. Links & Resources
• Articles & Product Pages
– Microsoft Application Lifecycle Management
– Effective Team Development
– Heterogeneous Development
– Product homepages
• Visual Studio
• Team Foundation Server
• Test Professional
39. Case Studies
• Flextronics - Visual Studio 2010 helps Flextronics’ developers and QA teams work together
• Wintellect - Wintellect uses the testing tools in Visual Studio to speed up debugging
• AccessIT - Visual Studio 2010 helps Access IT give its customers better team collaboration
• Sogeti - Sogeti better understands legacy systems with Visual Studio 2010
• EPiServer - EPiServer tests software more effectively and efficiently with Visual Studio 2010
• Equiniti - Share Registrar Cuts Testing Time and Improves Application Lifecycle Management
• ICONICS - ICONICS is cutting cost and increasing productivity with Visual Studio 2010
• Penn National Insurance - Penn National Insurance boosted productivity and reduced testing time using Visual Studio 2010
• Readify - Using Visual Studio 2010, Readify saves time and money with virtual testing
• Länsförsäkringar AB - Swedish insurance company expects to cut software development time and costs by 20 percent
• 3M - Eliminating “no repro” bugs helps 3M accelerate delivery of products into the marketplace
40. Microsoft’s ALM Solution: Architecture
(Architecture diagram: planning, design, development, testing, and build/deploy capabilities — project plans, dependency graphs, UI and architecture design and validation, IntelliTrace, unit testing, code analysis, code metrics, code profiling, test impact analysis, test case and test data generation, web & load testing, database compare and deploy, Lab Management, Test Manager, gated check-in, branching, merging, and shelving via TFS Source, and TFS Build — all centered on Team Foundation Server, with feature requests, rich bugs, help tickets, and production issues flowing back through TFS Web Access.)
41. Bug Status Report
• Is the team fixing bugs
quickly enough to finish on
time?
• Is the team fixing high priority
bugs first?
• What is the distribution of
bugs by priority and severity?
• How many bugs are
assigned to each team
member?
42. Bug Trends Report
• How many bugs is the team
reporting, resolving, and
closing per day?
• What is the overall trend at
which the team is processing
bugs?
• Are bug activation and
resolution rates declining
toward the end of the
iteration as expected?
43. Reactivations Report
• How many bugs are being
reactivated?
• How many user stories are
being reactivated?
• Is the team resolving and
closing reactivated bugs at
an acceptable rate?
44. Build Quality Indicators Report
• What is the quality of the
software?
• How often are tests passing,
and how much of the code is
being tested?
• Based on the code and test
metrics, is the team likely to
meet target goals?
45. Build Success Over Time Report
• What parts of the project have produced software that is ready to be tested?
• What parts of the project are having trouble with regressions or bad check-ins?
• How well is the team testing the code?
46. Build Summary Report
• What is the status of all builds over time?
• Which builds succeeded?
• Which builds have a significant number of changes to the code?
• How much of the code was executed by the tests?
• Which builds are ready to install?
47. Burndown and Burn Rate Report
• Is the team likely to finish the iteration on time?
• Will the team complete the
required work, based on the
current burn rate?
• How much work does each
team member have?
48. Remaining Work Report
• What is the cumulative flow of
work?
• Is the team likely to finish the
iteration on time?
• Is the amount of work or
number of work items in the
iteration growing?
• Does the team have too much
work in progress?
• How is the team doing in
estimating work for the
iteration?
(Report views: Hours of Work | # of Work Items)
49. Status on All Iterations Report
• Is steady progress being made across all iterations?
• How many stories did the team complete for each iteration?
• How many hours did the team work for each iteration?
• For each iteration, how many bugs did the team find, resolve, or close?
50. Stories Overview Report (Agile)
• How much work does each story require?
• How much work has the team completed for each story?
• Are the tests for each story passing?
• How many active bugs does each story have?
51. Stories Progress Report (Agile)
• How much progress has the team made toward completing the work for each story?
• How much work must the team still perform to implement each user story?
• How much work did the team perform in the last calendar period?
52. Requirements Progress Report (CMMI)
• How much progress has the team made toward completing the work for each
requirement?
• How much work must the team still perform to implement each requirement?
• How much work did the team perform in the last calendar period?
53. Requirements Overview Report (CMMI)
• How much work does each Requirement require?
• How much work has the team completed for each Requirement?
• Are the tests for each Requirement passing?
• How many active bugs does each Requirement have?
54. Release Burndown (Scrum)
• How much work remains in the release?
• How quickly is your team working through the product backlog?
55. Sprint Burndown (Scrum)
• How much work remains in the sprint?
• Is your team on track to finish all work for the sprint?
• When will your team finish all work for the sprint?
• How much work for the sprint is in progress?
56. Unplanned Work Report
• How much work was
added after the
iteration started?
• Is too much work
being added during
the iteration?
57. Test Case Readiness Report
• When will all the test cases be ready to run?
• Will all the test cases be ready to run by the end of the iteration?
• How many test cases must the team still write and review?
• How many test cases are ready to be run?
58. Test Plan Progress Report
• How much testing has the
team completed?
• Is the team likely to finish
the testing on time?
• How many tests are left to
be run?
• How many tests are
passing?
• How many tests are failing?
• How many tests are
blocked?
Editor's Notes
From David Chappell’s “What is ALM?” http://go.microsoft.com/?linkid=9743693
The business analyst starts by adding user stories. [CLICK] Once the user stories have been entered, the developer creates tasks for implementing each user story. [CLICK] Meanwhile, the tester authors tests against those user stories. [CLICK] Now the developer writes code that implements a task and checks it into TFS. [CLICK] The check-ins are materialized into a build. [CLICK] The tester examines the build, notes the delivered changes, and deploys the build to the test environment (not shown). [CLICK] The tester begins testing the build by choosing a test and running it using Microsoft Test Manager. [CLICK] The tester identifies a bug and files it with one click – the bug is automatically associated with the test and the user story. [CLICK] The cycle can continue as the developer fixes the bug, associates a check-in, and then creates a build which the tester then pulls into test (and so on). [CLICK]
The Visual Studio 2010 family is made up of a central team server and a small selection of client-side tools. The team server—Team Foundation Server 2010—is the backbone of your application lifecycle management… [CLICK] …providing capabilities for source control management (SCM), build automation, work item tracking, and reporting. In this release we’ve expanded the capabilities of Team Foundation Server by adding a true test case management system… [CLICK] …and extended it with Lab Management 2010—a set of capabilities designed to better integrate both physical and virtual labs into the development process. We’ve heard your feedback as well, and we have made it easier to set up and maintain Team Foundation Server—in fact it can be installed, configured, and ready to use in as little as 20 minutes. [CLICK] On the client side we have reduced the complexity of our IDE offerings. For developers, you can choose between Visual Studio 2010 Professional, Premium, or Ultimate, with each subsequent product containing all of the features of its predecessor. For testers and business analysts we are introducing Test Professional—a new integrated test environment designed with manual testers in mind. [CLICK] For those people who participate in the development effort but for whom Visual Studio—the IDE—is not appropriate, including Java developers, project managers, and stakeholders, the Team Foundation Server extensibility model enables us to provide alternative interfaces. These include both Team Explorer—a standalone tool built with the Visual Studio shell—and Team Web Access. These tools enable anyone to work directly with Team Foundation Server. In October we announced the acquisition of Teamprise, a technology similar to Team Explorer for the Eclipse IDE on Windows, Linux, Mac OS X, and other Unix-based operating systems. That technology has been incorporated into the Visual Studio 2010 product line, and we will be announcing how we are productizing it very soon. The most important thing to know is that we will be releasing a Teamprise-based product, and it will also be included as an MSDN benefit for Visual Studio 2010 Ultimate customers. [CLICK] Of course we are continuing our cross-product integration with Microsoft Office® and Microsoft Expression. We have improved integration between Team Foundation Server and SharePoint Server with new SharePoint dashboards, and we have a new set of capabilities that make SharePoint development much easier than in the past. Across the board, the features and capabilities we built into Visual Studio 2010 are a result of the great feedback we have gotten from our customers. This release continues our commitment to enabling you, our customers, to build the right software, in the right way, to ensure success for your business. Throughout the rest of the day you will learn about a variety of capabilities in Visual Studio 2010 that make the process of developing software, by teams of any size, easier—whether it is by helping you streamline your development process, find and fix bugs more quickly, more easily understand existing systems, or automate repetitive processes.
After the team has started to find and fix bugs, you can track the team's progress toward resolving and closing bugs by viewing the Bug Status report. This report shows the cumulative bug count based on the bug state, priority, and severity.
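As a rough illustration of the tallies this report surfaces, the sketch below groups a handful of bug records by state, priority, and open assignments. The Bug class and sample data are hypothetical stand-ins, not the report's actual schema or warehouse query.
```python
# Minimal sketch, assuming a flat list of bug records; the fields and values are invented.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Bug:
    state: str      # e.g. "Active", "Resolved", "Closed"
    priority: int   # 1 (highest) .. 4
    severity: str   # e.g. "Critical", "High", "Medium", "Low"
    assigned_to: str

bugs = [
    Bug("Active", 1, "Critical", "dana"),
    Bug("Active", 2, "High", "lee"),
    Bug("Resolved", 1, "High", "dana"),
    Bug("Closed", 3, "Medium", "sam"),
]

by_state = Counter(b.state for b in bugs)                        # cumulative count by state
by_priority = Counter(b.priority for b in bugs)                  # distribution by priority
open_per_person = Counter(b.assigned_to for b in bugs if b.state != "Closed")

print("Bugs by state:      ", dict(by_state))
print("Bugs by priority:   ", dict(by_priority))
print("Open bugs per person:", dict(open_per_person))
```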
You can use the Bug Trends report to help track the rate at which your team is discovering and resolving bugs. This report shows a rolling or moving average of bugs being reported, resolved, and closed over time. When you manage a large team or a large number of bugs, you can monitor the Bug Trends report weekly to gain insight into how well the team is finding, resolving, and closing bugs. The Bug Trends report calculates a rolling average of the number of bugs that the team has opened, resolved, and closed, based on the filters that you specify. The rolling average is based on the seven days before the date for which it is calculated. That is, the report sums the number of bugs in each state for each of the seven days before the date, and then divides the result by seven.
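A minimal sketch of that seven-day rolling average, assuming a simple list of per-day counts (the sample numbers below are invented):
```python
def seven_day_rolling_average(daily_counts):
    """daily_counts: per-day counts of bugs in a given state, oldest first.
    For each day, average the counts over the seven preceding days."""
    averages = []
    for i in range(len(daily_counts)):
        preceding = daily_counts[max(0, i - 7):i]   # up to seven days before day i
        averages.append(sum(preceding) / 7)
    return averages

reported_per_day = [3, 5, 2, 4, 6, 1, 0, 2, 3]      # invented sample data
print(seven_day_rolling_average(reported_per_day))
```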
As the team resolves and closes bugs, you can use the Reactivations report to determine how effectively the team is fixing bugs. Reactivations generally refer to bugs that have been resolved or closed prematurely and then reopened. The reactivation rate is also referred to as the fault feedback ratio. You can use the Reactivations report to show either bugs or user stories that have been reactivated. As a product owner, you might want to discuss acceptable rates of reactivation with the team. A low rate of reactivations (for example, less than 5%) might be acceptable depending on your team's goals. However, a high or increasing rate of reactivations indicates that the team might need to diagnose and fix systemic issues. The Reactivations report shows an area graph of the number of bugs or stories that are in a resolved state or that have been reactivated from the closed state.
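To make the fault feedback ratio concrete, here is a small sketch with invented counts; the 5% threshold is simply the example figure mentioned above, not a fixed rule.
```python
def reactivation_rate(reactivated, resolved_or_closed):
    """Fraction of resolved/closed bugs that were later reopened."""
    if resolved_or_closed == 0:
        return 0.0
    return reactivated / resolved_or_closed

rate = reactivation_rate(reactivated=3, resolved_or_closed=80)
print(f"Reactivation rate: {rate:.1%}")        # 3.8%
print("Within the 5% example guideline:", rate < 0.05)
```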
The Build Quality Indicators report shows test coverage, code churn, and bug counts for a specified build definition. You can use this report to help determine how close portions of the code are to release quality. Ideally, test rates, bugs, and code churn would all produce the same picture, but they often do not. When you find a discrepancy, you can use the Build Quality Indicators report to examine the details of a specific build and data series. Because this report combines test results, code coverage from testing, code churn, and bugs, you can view many perspectives at the same time.
The Build Success Over Time report provides a pictorial version of the Build Summary report. The Build Success Over Time report displays the status of the last build for each build category run for each day. You can use this report to help track the quality of the code that the team is checking in. In addition, for any day on which a build ran, you can view the Build Summary for that day.
The Build Summary lists builds and provides information about test results, test coverage, code churn, and quality notes for each build. The data that appears in the Build Summary report is derived from the data warehouse. The report presents a visual display of the percentage of tests that are passing, code that is being tested, and changes in code across several builds. You can review the results for both manual and automatic builds, in addition to the most recent builds and continuous or frequent builds. The report lists the most recent builds first and contains build results that were captured during the specified time interval for all builds that were run, subject to the filters that you specified for the report. At a glance, you can determine the success or failure of several build definitions for the time period under review.
After a team has worked on one or more iterations, also known as sprints, you can determine the rate of team progress by reviewing the Burndown and Burn Rate report. Burndown shows the trend of completed and remaining work over a specified time period. Burn rate provides calculations of the completed and required rate of work based on the specified time period. In addition, a chart shows the amount of completed and remaining work that is assigned to team members. You can view the Burndown and Burn Rate report based on hours worked or number of work items that have been resolved and closed.
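A back-of-the-envelope sketch of the burn-rate arithmetic described above, with invented hours and day counts; the report itself derives these values from the data warehouse.
```python
# Actual burn rate   = work completed so far / working days elapsed
# Required burn rate = work remaining / working days left in the iteration
completed_hours = 120
remaining_hours = 80
days_elapsed = 6
days_remaining = 4

actual_rate = completed_hours / days_elapsed       # 20.0 hours/day
required_rate = remaining_hours / days_remaining   # 20.0 hours/day

print(f"Actual burn rate:   {actual_rate:.1f} h/day")
print(f"Required burn rate: {required_rate:.1f} h/day")
print("On track:", actual_rate >= required_rate)
```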
After the team has estimated its tasks and begun work, you can use the Remaining Work report to track the team's progress and identify any problems in the flow of work. The Remaining Work report summarizes the data that was captured during the specified time interval for each task, user story, or bug, based on the filter criteria that were specified for the report. The data is derived from the data warehouse. You can view this report in either the Hours of Work view or the Number of Work Items view. The first view displays the total number of hours of work for the specified time period and the team's progress toward completing that work. The second view displays the number of work items for the specified time period and the number of work items in each state. Each view provides an area graph that charts the progress of completed work against the total estimated work for the specified time duration.
After work has progressed on several iterations, also known as sprints, you can view team progress in the Status on All Iterations report. This report helps you track the team's performance over successive iterations. For each iteration defined for the product areas that you specify, the report displays the following information:
– Stories Closed: the number of user stories that have been closed. These values are derived from the current values specified for the iteration and the state of each user story.
– Progress (Hours): a two-bar numeric and visual representation of Original Estimate (grey), Completed (green), and Remaining (light blue), based on the rollup of hours defined for all tasks. These values are derived from the current values specified for the iteration and the hours for each task.
– Bugs: a numeric value and visual representation for all bugs, grouped by their current state: Active (blue), Resolved (gold), and Closed (green). These values are derived from the current values specified for the iteration and the state of each bug.
The Stories Overview report lists all user stories, filtered by area and iteration and in order of importance.
Work Progress
– % Hours Completed: a numeric value and visual representation showing the percentage of completed work, based on the rollup of baseline and completed hours for all tasks that are linked to the user story or its child stories.
– Hours Remaining: a numeric value for the rollup of all remaining hours for all tasks that are linked to the user story or its child stories.
Test Status
– Test Points: a numeric value representing the number of pairings of test cases with test configurations in a specific test suite. For more information about test points, see Reporting on Testing Progress for Test Plans.
– Test Results: a numeric value and visual representation showing the percentage of test cases, grouped according to the status of their most recent test run: Passed (green), Failed (red), or Not Run (black).
– Bugs: a numeric value and visual representation showing the number of bugs that are linked to the test case or user story: Active (blue) and Resolved (gold). If a user story is linked to one or more child stories, the values represent a rollup of all bugs for the user story and its child stories.
User Stories that Appear in the Report
The Stories Overview report lists and highlights user stories according to the following criteria:
– Stories appear in order of their importance, based on their assigned ranking.
– Stories appear in bold type when they are in the active or resolved state.
– Stories appear in normal type when they are in the closed state.
– Stories appear in gray type when their assigned iteration or area is outside the filtered set, but they have tasks or child stories that are within the filtered set of iterations or product areas.
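The hour rollups above walk each user story and its child stories. The sketch below shows one plausible way to compute such a rollup; the Story and Task classes are invented, and the percentage uses remaining hours as a stand-in for the baseline hours the actual report uses.
```python
# Hypothetical rollup of completed/remaining hours across a story and its children.
from dataclasses import dataclass, field

@dataclass
class Task:
    completed: float
    remaining: float

@dataclass
class Story:
    tasks: list = field(default_factory=list)
    children: list = field(default_factory=list)   # child Story objects

def rollup(story):
    completed = sum(t.completed for t in story.tasks)
    remaining = sum(t.remaining for t in story.tasks)
    for child in story.children:
        c, r = rollup(child)
        completed += c
        remaining += r
    return completed, remaining

epic = Story(tasks=[Task(4, 2)], children=[Story(tasks=[Task(6, 0), Task(3, 5)])])
done, left = rollup(epic)
print(f"% hours completed: {done / (done + left):.0%}, hours remaining: {left}")
```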
The Stories Progress report lists all user stories, filtered by product area and iteration, in order of importance. This report displays the following information for each user story:
– Progress (% Completed): a numeric value representing the percentage of completed work, based on the rollup of baseline and completed hours for all tasks that are linked to the user story or its child stories.
– Hours Completed: a visual representation of the completed hours, displayed as a dark green bar.
– Recently Completed: a visual representation of the hours completed within the time interval specified for Recent (Calendar) Days, displayed as a light green bar.
– Hours Remaining: the rollup of all remaining hours for all tasks that are linked to the user story or its child stories.
The Stories Progress report lists and highlights user stories according to the following criteria:
– Stories appear in order of their importance, based on their assigned ranking.
– Stories appear in bold type when they are in the active or resolved state.
– Stories appear in normal type when they are in the closed state.
– Stories appear in gray type when their assigned iteration or area is outside the filtered set but they have tasks or child stories that are within the filtered set of iterations or product areas.
The Requirements Progress report shows the status of completion as determined by the tasks that have been defined to implement the requirement.
The Requirements Overview report presents a snapshot of the work that has been performed for the filtered set of requirements to the current date.
By reviewing a release burndown report, you can understand how quickly your team has delivered backlog items and track how much work the team must still perform to complete a product release. A release burndown graph shows how much work remained at the start of each sprint in a release. The source of the raw data is your product backlog. Each sprint appears along the horizontal axis, and the vertical axis measures the effort that remained when each sprint started. The amount of estimated effort on the vertical axis is in whatever unit that your scrum team has decided to use (for example, story points or hours).
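As a small illustration of the release burndown data, the sketch below takes invented per-sprint remaining story points and derives how much was delivered between sprints.
```python
# Effort remaining in the product backlog at the start of each sprint (invented numbers).
remaining_at_sprint_start = {
    "Sprint 1": 200,
    "Sprint 2": 170,
    "Sprint 3": 135,
    "Sprint 4": 95,
}

sprints = list(remaining_at_sprint_start)
for earlier, later in zip(sprints, sprints[1:]):
    delivered = remaining_at_sprint_start[earlier] - remaining_at_sprint_start[later]
    print(f"{earlier} -> {later}: {delivered} points delivered")
```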
By reviewing a sprint burndown report, you can track how much work remains in a sprint backlog, understand how quickly your team has completed tasks, and predict when your team will achieve the goal or goals of the sprint. A sprint burndown report shows how much work remained at the end of specified intervals during a sprint. The source of the raw data is the sprint backlog. The horizontal axis shows days in a sprint, and the vertical axis measures the amount of work that remains to complete the tasks in the sprint. The work that remains is shown in hours. A sprint burndown graph displays the following pieces of data:
– The Ideal Trend line indicates an ideal situation in which the team burns down all of the effort that remains at a constant rate by the end of the sprint.
– The In Progress series shows how many hours remain for tasks that are marked as In Progress in a sprint.
– The To Do series shows how many hours remain for tasks that are marked as To Do in a sprint.
Both the In Progress and the To Do series are drawn based on the actual progress of your team as it completes tasks.
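A minimal sketch of the Ideal Trend line described above: a straight line from the starting hours down to zero by the end of the sprint. The sprint length and hours are invented sample values.
```python
def ideal_trend(total_remaining_hours, sprint_days):
    """Remaining hours on each day if the team burned down at a constant rate."""
    return [total_remaining_hours * (1 - day / sprint_days) for day in range(sprint_days + 1)]

print(ideal_trend(total_remaining_hours=120, sprint_days=10))
# Day 0 = 120 h, burning down 12 h/day, reaching 0 on day 10.
```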
Toward the end of an iteration, you can use the Unplanned Work report to determine how much work was added to the iteration that was not planned at the start of the iteration. You can view the unplanned work as measured by work items added, such as tasks, test cases, user stories, and bugs. Having unplanned work may be acceptable, especially if the team has scheduled a sufficient buffer for handling the load of unplanned work (for example, bugs). On the other hand, the unplanned work may represent a real problem if the team does not have the capacity to meet it and is forced to cut back on the planned work. The Unplanned Work report is useful when the team plans an iteration by identifying all work items that they intend to resolve or close during the course of the iteration. The work items that are assigned to the iteration by the plan completion date of the report are considered planned work. All work items that are added to the iteration after that date are identified as unplanned work.
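The planned/unplanned split hinges on when a work item was assigned to the iteration relative to the plan completion date. A small sketch with invented dates and items:
```python
from datetime import date

plan_completion_date = date(2011, 11, 1)   # invented plan completion date

work_items = [
    {"title": "Implement login", "added": date(2011, 10, 28)},
    {"title": "Fix crash on save", "added": date(2011, 11, 7)},
    {"title": "Write test cases", "added": date(2011, 10, 30)},
]

planned = [w for w in work_items if w["added"] <= plan_completion_date]
unplanned = [w for w in work_items if w["added"] > plan_completion_date]

print(f"Planned: {len(planned)}, Unplanned: {len(unplanned)}")
```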
The Test Case Readiness report provides an area graph that shows how many test cases are in the Design or Ready state over the time period that you specify. By reviewing this data, you can easily determine how quickly the team is designing test cases and making them ready for testing. When you create a test case, it is automatically set to the design state. After the team has reviewed and approved the test case, then a team member should change its state to Ready, which indicates that the test case is ready to be run.
The data that appears in the Test Plan Progress report is derived from the data warehouse and the test results that are generated when tests are run by using Microsoft Test Manager. The report presents an area graph that shows the most recent result of running any test in the specified test plans over time. For more information, see Running Tests. The horizontal axis shows days in a sprint or iteration, and the vertical axis shows test points. A test point is a pairing of a test case with a test configuration in a specific test suite. For more information about test points, see Reporting on Testing Progress for Test Plans.
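Since a test point is the pairing of a test case with a test configuration in a suite, the count is simply a cross product. A tiny sketch with invented cases and configurations:
```python
from itertools import product

test_cases = ["TC-101 Login", "TC-102 Checkout"]
configurations = ["Windows 7 / IE9", "Windows XP / IE8"]

test_points = list(product(test_cases, configurations))
print(f"{len(test_points)} test points")     # 2 cases x 2 configs = 4 points
for case, config in test_points:
    print(f"  {case} on {config}")
```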