A multi-phase project identifying the core issues UX professionals encounter in AGILE environments and exploring potential solutions. All scrum roles were represented and engaged in the process. The core issues found are presented in this deck. A paper summarizing potential solutions to the issues we identified will follow.
The Graffiti Wall: An Emerging Method for Gathering Qualitative Feedback in a Public Setting
1. The Graffiti Wall
An Emerging Method for Gathering Qualitative Feedback in a Public Setting
Peter Roessler, User Researcher,
Salesforce.com
Anshu Agarwal, Usability Analyst,
Salesforce.com
UPA 2009 International Conference
2. Where We Currently Stand
The Graffiti Wall is an emerging method for qualitative asynchronous
data collection that functions in a public setting to gather feedback
from a large, diverse audience.
This approach overcomes some of the drawbacks of existing methods
and provides researchers with an alternative approach in specific
contexts.
3. What we will share today
• The back story on the workshop motivating exploration of an
alternative data collection method
• The formation of the method framework, including audience and
context considerations
• Our pilot experience with the Graffiti Wall method, including
findings, takeaways, and planned future work
5. Data Gathering → Synthesis → Brainstorming → Prototyping
6. AGILE/User Experience Workshop
• Large number of attendees; all
UE practitioners on AGILE
teams
• Brainstormed large data set of
UE-specific issues in AGILE
• Tight workshop schedule
precluded clustering the issue
sticky notes
• Band-aid solution: Artificial
topics were created for
solution brainstorming
7. Process broke down without proper analysis of issues
Data Gathering → Synthesis → Brainstorming → Prototyping
8. How might we recover the potential value
from the workshop data and prevent lost
effort?
9. Post-workshop Synthesis
• All issue sticky notes were clustered over several weeks
• Eleven (11) discrete issue buckets were created and a core
question summarizing the expressed concerns was formulated
for each
• Most of the original solution sticky notes from the workshop
brainstorm did not logically sort into the new issue buckets
10. OUR CHALLENGE
Determine the best way to repeat the solution brainstorming,
post-workshop, with the eleven (11) core questions that truly
concern UE staff on AGILE teams
12. Key Criteria
1. Open up the exercise to all roles in the AGILE development cycle
2. Identify well-attended events or venues where we could find all
AGILE team roles represented in one place
3. Given the heterogeneous audience, take into account variations in
their personality types and their preferred mechanisms for
providing feedback
13. Known flaw:
Group data-gathering methods such
as focus groups or group
brainstorming potentially
suppress the expression of ideas
for a subset of the participants
16. VARK Learning Styles
According to Neil Fleming, people have a sensory preference when
processing and exchanging information:
• V = Visual learner (prefers visual aids)
• A = Auditory learner (prefers lectures)
• R = Read/Write learner
• K = Kinesthetic learner (prefers experience)
The method should also allow participants to contribute to the data
collection in multiple ways to accommodate different learning styles.
17. Resulting Framework
• Ideal Venue: AGILE2008 Conference Toronto
• Make the working space as visible and public as possible to
maximize contributions
• Support emergent behavior by using a large amount of
physical space
• Make any contribution-sharing optional
• Have a moderator available to engage at a level driven by the
participant
• Eliminate any time constraints for the activity
• Seed the brainstorming space with visual contributions to
create awareness of the option
18. Final Plan
• Set up like a brainstorm from a content perspective
• Place in a space public to the target participants, much like an
expo or demo booth
• Have moderators present to answer questions from potential
participants
• Keep set up for the length of the event where it is installed
• Use a large amount of physical space to offer more
opportunities for private contributions as well as ad hoc
discussion among groups willing to engage
• Seed the installation with written and visual contributions to
serve as examples for participants
19. More on Physical Space
• Need at least 20 feet of wall
space
• Display the items driving your
activity on the top of each
foam board
• Use standard 4’x 8’ pieces of
foam board for the canvas
• Be prepared to move the
installation around
• Can use basic folding tables as
a working space
22. More on Process
• We visually tracked response volumes for each of our four primary roles,
based on assigned colors
• Photos and video were captured of the Graffiti Wall’s progression over the
course of the week
• For a variety of reasons, it was critical to have moderators present at all
times
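The role-volume tracking in the first bullet becomes a simple tally once the wall’s sticky notes are transcribed as (question, role) records, with each role inferred from its assigned note color. A minimal Python sketch; the question IDs, role labels, and records below are hypothetical stand-ins for transcribed wall data:

```python
from collections import Counter, defaultdict

# Hypothetical transcription of sticky notes as (question, role) pairs,
# where the role is inferred from each note's assigned color
# (e.g. blue = Developers/QA in the pilot).
notes = [
    ("Q1", "Developer/QA"), ("Q1", "User Experience"),
    ("Q1", "Product Manager"), ("Q2", "Developer/QA"),
    ("Q2", "Developer/QA"), ("Q2", "Scrum Master"),
]

# Overall response volume per role
per_role = Counter(role for _, role in notes)

# Coverage matrix: which roles responded to which question, and how often
coverage = defaultdict(Counter)
for question, role in notes:
    coverage[question][role] += 1

print(per_role.most_common())   # role volumes, highest first
print(dict(coverage["Q2"]))     # per-role counts for one question
```

A per-question gap check (roles with zero notes for a question) could then flag where a moderator should prompt for more contributions.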
35. Takeaways
+ (what worked)
• Disagreement was evident in responses
• Supporting both big-group and intimate interaction
• Contributions from a diverse audience
• Visual and auditory communication support
• Opportunities to moderate conversations
• Mobility of the Fome-Cor boards
Δ (what to change)
• A lot of partial threads
• Missing out on the conversation
• Constant installation management
• Instruction poster not effective
• Loss of supplies
40. Conclusion
The Graffiti Wall is an emerging method for qualitative asynchronous
data collection that functions in a public setting to gather feedback
from a large, diverse audience.
This approach overcomes some of the drawbacks of existing methods
by supporting both solitary engagement and lively group discussion
while also being inclusive of varied communication styles.
42. Audience Activity
1. How do you envision applying or adapting this technique to
your current work?
2. Comments, criticisms, or ideas for evolving the Graffiti Wall
technique that have not yet been discussed
3. Are there any tips or tricks for quick-and-dirty data analysis,
packaging, or distribution that were missed in the initial
workshop of this case study?
We have stood by this theory for some time now. Of course, it’s precisely this audience that may have me running out of the conference with my tail between my legs. We will be inviting your feedback, praise, and criticism so that we can collectively critique what is being shared and realize its potential benefit to the community. Moving forward, I will be building context for everyone that will help this position statement on the method make more sense.
We can think of a generic user-centered design process as interleaving divergent and convergent methods. Data gathering allows you to learn from your target population (“drinking from the firehose”). Synthesis/data analysis converges findings down into core observations, drivers, and issues. Brainstorming methods encourage divergent thinking once again to imagine potential solutions. Prototyping, piloting, and testing all help us to converge again on a final solution.
Held a workshop with UE practitioners using AGILE. The intent was to brainstorm a broad set of UE-specific issues experienced in AGILE environments. Unfortunately, the workshop utilized artificially suggested topics for bucketing the issues instead of allowing time for themes to emerge. The same buckets were used for solution brainstorming.
Looking at some of the aftermath: a large volume of contributions. Colors represented how folks felt about AGILE (loved it, hated it, were neutral, or just didn’t know). That dimension was not useful at the end of the day. Many stickies were not even placed into the categories, further proof that the approach taken in the workshop didn’t work.
We pulled all contributions out of the groupings and conducted a proper synthesis in a dedicated space over the course of several weeks. Eleven issue buckets emerged, and we articulated a question for each. Not surprisingly, the solution sticky notes did not fall logically into the new categories.
Our challenge: repeat the solution brainstorming with the new issue buckets. It would be very difficult to gather the same group (and volume) of practitioners to repeat the brainstorming exercise. All the questions were printed in the proceedings if you would like to review them; the point is that these were the basis for the subsequent solution brainstorming exercise.
We thought that if we were going to do this, it might be that much more informative if we solicited the opinions of all the core roles on a typical AGILE development team. First problem: where could we reach them all at once? Second problem: this also suggested the need for an approach that takes into account a spectrum of personality types. Designing with this in mind would not hinder, but rather theoretically increase, feedback quality and quantity.
The more we thought about this, the more obvious it became that this was a fundamental flaw in traditional methods such as focus groups: dominating personalities; a safe environment for sharing that is difficult to create; pressure to generate ideas under defined time constraints. What ideas or issues go unexpressed based on the environment in which we ask for feedback?
VARK is a framework first described by Neil Fleming of Lincoln University, New Zealand, in 1987. It provides users with a profile of their learning preferences: the ways they want to take in and give out information. It gave us a framework, as well as published evidence, that these variations in cognitive processing in fact exist and, in our case specifically, are worth designing for.
Reliably strong attendance of all AGILE community members (Developers/QA, User Experience, Product Managers, Scrum Masters). Significant unstructured time available to conference attendees.
We were aware of this practice from anecdotes of informal discussions; it is an approach often followed by members of industrial and other design teams. When applied externally to specific groups, we propose that the method may provide the benefits described.
You will need at least 20 feet of wall space or other vertical surface that will allow a large number of participants to freely interact. Use the top two feet of each foam board for displaying the items driving your activity (e.g., the UE issue questions we used). Use standard 4’x 8’ pieces of foam board for the canvas, since it is lightweight, can be placed along most vertical surfaces, and is easy to move around. Be prepared to move the installation at a moment’s notice to determine the physical location that maximizes awareness, interest, and participation. Use basic folding tables as a working space and an area to organize your supply bins.
Supplies:
• Several pieces of 4’x8’ foam board (number determined by number of research questions)
• Roll of plain white plotter paper for additional contribution area (optional)
• Pre-cut sheets of paper for table drawing and creating new topic categories
• Industrial or duct tape for securing plotter paper
• Large instruction poster to provide additional guidance for contributors
• Scissors
• Large supply tables and a few chairs
• Supply bins
• Colored and black Crayola® brand washable markers
• “Super sticky” Post-it® brand sticky notes; 4 colors and multiple sizes
• Large color-coding labels; 3 colors (optional)
• 5-10 clipboards to help participants write responses
• Digital camera for high-resolution photos
• Seed content under each question displayed to encourage initial participation and build interest
This is really based on how you’d like to segment your data. Facilitators must be present at all times: contributors ask a variety of questions about the open format; ad hoc group discussions may benefit from guidance; quality control is necessary to manage any additional process complexity, such as sticky-note colors assigned by participant role; and facilitators help to create a safe, friendly, approachable space.
Our prerequisite was placement in a high foot-traffic location. Few locations were less desirable than where the Graffiti Wall was set up on Day 1. The only access to this initial location was down a long hallway at the top of a set of escalators, while the bulk of the foot traffic was two floors away at the expo area and registration.
After failed attempts at asking for permission to relocate, we decided to try different spots, including the escalators and the Expo floor.
This routine of moving to a new spot each morning inadvertently increased interest among conference attendees.
When set up on the Expo floor, vendors would come by and check out the Graffiti Wall during their slow periods.
Feedback steadily built up over the course of each day; attendees had the flexibility to contribute at whatever time worked best for them. [View top to bottom, columns from left to right.]
By using colored sticky notes to distinguish the different roles within AGILE, we were able to confirm coverage from all roles across all proposed questions. Notably, significant numbers of blue sticky notes (assigned to Devs and QA) were present at the end. There was also a lot of additional interesting data to sift through.
Participants built off of each other’s contributions and illustrated their comfort with expressing conflicting opinions, often a major concern for us in focus groups or even in the lab, where any hint of your own opinions or the opinions of the majority often influences agreement. Statement by a Developer > response by UE > retort by Product Owner. Solution disagreement between two Developers. Would these conversations and conflicting opinions have come out so clearly in a focus group?
Significant amounts of both solitary engagement and lively group discussion were observed. Evidence that our method is inclusive of contributors with various preferences.
The contribution of visual artifacts by participants supports the theory that this is in fact a potentially unmet need, serving at least the visual learner in addition to read/write learners. The auditory dimension is not represented in the slides, but it was recognized by the moderators, given how some individuals who wanted to tell us what they thought engaged them in conversation.
Several variations to the method are worth further investigation in order to test its robustness and expand its potential research data gathering applications:
Experiment with the use of visual prompts as an alternative to verbal content; some examples of visual prompts are mockups, prototypes, and screenshots (CONCEPT VALIDATION). Facilitators could direct the form that the requested feedback takes (e.g., a list of use cases, creation of lo-fi sketches).
Solicit design ideas with user-centered artifacts as alternatives to verbal content, such as storyboards, scenarios of use, and even personas.
Explore a variety of other public settings that support a safe, anxiety-free environment for this method, such as a school or playground.
IDEAS: Pass out Post-it notes and have folks write down their answers after they contribute. Have a scribe write out audience answers and place them on the Wall. Place the Wall in a public conference space that will allow contributions *after* the session (Thursday 3:30-4:10pm). UAs, Researchers, Managers.