13.
ADOPT THE DOMINANT PARADIGM
Tracking usability issues: to bug or not to bug? (Wilson and Coyle 2001)
14.
80% OF TARGET USERS WILL BE ABLE TO CREATE THE REFERENCE LAYOUT IN 20 MINUTES WITH DOCUMENTATION.
15.
ACCEPTANCE TESTS
Will include
• 30, 45, 60 and 90° angles.
• Elevation changes.
• Connecting horizontal to vertical segments.
• Junctions.
• Rectangular profile.
• Segment, elbow, tee, cross and cap elements.
Will not include
• Sloped duct.
17.
ADDITIONAL ACCEPTANCE CRITERIA
• Support user's choice of views (Plan, Section, 3D).
• Customer satisfaction survey: 80% are Somewhat or Completely Satisfied.
• System Usability Scale score of 80.5 or above.
• No new bugs.
• No new technical debt.
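The SUS threshold in the criteria above can be checked directly from questionnaire responses. Here is a minimal sketch of standard SUS scoring; the function name, the sample responses, and the pass/fail check are illustrative, not from the deck:

```python
def sus_score(responses):
    """Score one completed System Usability Scale questionnaire.

    `responses` is a list of 10 answers, each 1-5, in question order.
    Odd-numbered questions contribute (answer - 1); even-numbered
    questions contribute (5 - answer); the sum is scaled to 0-100.
    """
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5

# Hypothetical responses, checked against the slide's 80.5 threshold.
scores = [sus_score(r) for r in [
    [5, 1, 5, 1, 5, 1, 5, 1, 5, 1],   # fully positive respondent -> 100.0
    [4, 2, 4, 2, 4, 2, 4, 2, 4, 2],   # mildly positive respondent -> 75.0
]]
average = sum(scores) / len(scores)
print(average, average >= 80.5)       # 87.5 True
```

Because each score is already normalized to a 0-100 scale, the team criterion reduces to a single comparison against the average.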
21.
OUR CURRENT WORKFLOW
Place an initial part → Open dialog → Select part → Close dialog → Drag part to an open connector → Visually confirm snap → Click to place the part → Select a new open connector.
(Callout: one click quickly connects parts.)
24. 24
MAY– COMPLETION TIME / SESSION,
MINUTES
0
10
20
30
40
50
60
April May
Magnetic snap
Match dimensions on
snap
Tooltip explaining
shortcuts
Minutes
25.
JUNE – COMPLETION TIME PER SESSION
[Chart: completion time per session in minutes (0–60 scale), April through June. Annotated improvements: fast rotate and orient; parts dialog stays open.]
29.
JULY – COMPLETION TIME / SESSION
[Chart: completion time per session in minutes (0–60 scale), April through July. Annotations: single click part placement; better recruiting; more focused testing.]
35.
HEURISTICS: ASPECTS OF USABILITY
• Standards and guidelines
• Accessibility
• Error prevention
• Exploration
• Scalability
36.
ARTISTIC INSPIRATION & CREDIT
• Bauhaus Movement
• Piet Mondrian
• Wassily Kandinsky
• George Lucas
37.
Slides posted on LinkedIn. Thank you to Autodesk coworkers and friends for help with my presentation!
Editor's Notes
Okay, let’s get started. I’ll let you readjust your necks to my text direction, and set the stage for my presentation with a quick story. About ten years ago, I worked for an online dating website. We were getting killed by big-spending competition. We planned to save the business with a new service: an instant messaging bot that would find compatible users and help them connect. Over months of planning, we didn’t write any code. Instead we wrote and rewrote a specification with behaviors, prompts and responses. We made mockups and storyboards. While we worked, the website was sold and we continued polishing the spec for a big presentation. Eventually the big day came and we presented. The new bosses were not sold. Instead, they told us they were cancelling the project and laying off the entire team. My hundred-page spec was trashed. (transition) Does this type of story sound familiar to anyone? Looking back I bet an Agile process could have delivered a working prototype that would have been far more compelling for new management.
I hope I never again have to slave over a design spec. I’ve become a believer in Agile user experience processes. No specifications required! Build it, launch it, and learn! For me, the process has been a revolution.
My online dating time is behind me now. For about 4 years, I’ve been part of an Agile team as a user experience designer and researcher. My current team works on Autodesk Revit, 3D modeling desktop software for architectural design and building engineering. Last year our project was to support modeling complicated air duct, electrical, and piping systems to a fabrication level of detail. This would allow us to sell the software to a large new customer base.
Some believe that in Agile, no specs means no planning. It’s more accurate to say that, instead of the up-front planning in a traditional process, Agile means continual planning. Since there is no single specification document, Agile planning and design may happen at multiple levels, from each individual’s daily standup status up to a sweeping vision encompassing years of development.
One question, therefore, concerns how user experience fits into the Agile planning process. Where does UX fit into those different levels of planning? How can we ensure that UX work is continuous, not rigidly defined at the beginning or sloppily tacked on at the end?
I’ll come back to that. For now I’d like to turn the microphone around, and ask those of you who are working this way: How do you track your user experience in the context of ongoing Agile development?
My current Agile team was new at the beginning of last year. We had three developers and one quality assurance tester in New England and the same number in the UK. In addition, the team’s product owner was based “over there,” and there was me, the UX designer, here. Some of the team members I had worked with for several years, but for the UK members of the team, it was their first time working with a user experience designer.
The software our customers were using at the time had a long list of capabilities and worked with a vast array of components. Our first challenge was to select from the long list of things we *could* do, and choose a small piece of work. We needed something that could be done in a reasonably short time, and that would deliver customer value. Breaking work down into small chunks is a hallmark of Agile development.
One of the first planning tools we used was the project charter. Shown on the slide is the template we filled out over three days. The final document was useful, but the most valuable part of the process was the discussion itself.
Acceptance Criteria are the minimal documentation that ensures a sufficient implementation of acceptance tests. Acceptance Tests are the detailed specification of the system’s behavior for all meaningful scenarios, used to assert its correctness. From Chris Edwards, “Acceptance Criteria vs. Acceptance Tests: Minimize your documentation.” http://tracks.roojoom.com/r/467#/trek?page=1&_sm_au_=iVVfQrTqnt1SvW0H
I was stunned as we finished our chartering session. I asked myself, did that really just happen? Our key measure of success and Done-ness was not a list of features, but a user experience criterion. Whether we would ship depended on the results of user testing, not just on code stability and performance. Personally I felt great opportunity and also great responsibility.
We guessed that we could finish the work in six sprints, which fit our overall goal of keeping the project small and bounded.
Near the end of April, our QA tester was able to model the reference layout in 15 minutes. This was great! However, I was skeptical that a real user would be as successful. I explained that
I tested with internal users. The numbers here actually are worse than they might appear, in that we simply ran out of time in our session. The test users were able to create about half of the model in that time.
The team observed all the sessions, and together we discussed the issues we saw, their possible solutions, and estimates of the effort to fix things. Our software's UI toolkit made some changes more difficult than others. Our velocity at this point was about 20 story points, so the 55 points for changing to a modeless dialog was more than the team was willing to commit to. I believed customers might still struggle with the modal dialog, but if they could still succeed despite that, we could leave that work until later.
In the May testing, we made improvements, and then tested with customer users. I cut off the session on
At this point, I was growing concerned.