Teams are a fundamental part of the way we all work. Understanding the ins and outs of team decision making makes us better employees, better co-workers, and even better people. As developers and testers, we continuously make decisions. Most decisions are based on how the decision maker perceives the information at hand. That perception is driven by many factors including cognitive biases—the mental shortcuts we use that lead us to simplify, make quick decisions, and ultimately mess up when we’re trying to attack new problems. Biases can affect how information is perceived, and how teams and individuals behave. Biases are like optical illusions—we see what we want to see, even when we know better. Learn the cognitive biases that affect team thinking and take away strategies for how your team can work with and around these biases. Finally, discover how psychology can make your team more efficient and effective together.
The Impact of Cognitive Biases on Test and Project Teams
W17
Personal Excellence
5/7/2014 3:00:00 PM
The Impact of Cognitive Biases
on Test and Project Teams
Presented by:
Thomas Cagley
The David Consulting Group
Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073
888-268-8770 ∙ 904-278-0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com
Thomas Cagley
The David Consulting Group
VP of consulting for The David Consulting Group, Thomas Cagley is an authority in guiding
organizations through continuous process improvement. His areas of expertise encompass a
wide variety of methods and metrics including lean software development, agile software
development, quality integration, quality assurance, and the application of the Software
Engineering Institute’s Capability Maturity Model® Integration (CMMI) to achieve process
improvements. Thomas is an active member of the International Function Point Users Group
and a Certified Function Point Specialist. He is the editor of the Software Process and
Measurement Podcast (SPaMCAST), blogs at tcagley.wordpress.com, and can be reached at
t.cagley@davidconsultinggroup.com.
Software Process and Measurement Cast
www.spamcast.libsyn.com
Creative Commons 3.0 US Noncommercial, Attribution
Cognitive Bias and Effective Teams
Provided by
Thomas M. Cagley Jr.
t.cagley@davidconsultinggroup.com
+1.440.668.5717
And
Software Process and Measurement Cast
www.spamcast.net
Meghan E. Cagley
megcagley@gmail.com
March 2014
No man is an island,
Entire of itself,
Every man is a piece of the continent,
A part of the main.
- John Donne
John Donne may not have had project teams and cognitive biases in mind when he wrote his famous
poem; however, the description is apt. A cognitive bias is a pattern of behavior in which a person acts
differently than would seem rational in certain situations, based on inaccurate judgment or illogical
interpretation. Amos Tversky and Daniel Kahneman introduced the phrase cognitive bias in 1972 [1]. Biases
can affect how we make decisions, how teams and individuals behave, and even our perception of
ourselves. Biases are a part of nearly every human interaction. Every human has cognitive biases, and
because humans are all unique, the impact of each bias is unique to each individual. Cognitive biases are
an inescapable part of basic human nature. For a team to effectively deliver business value, it
must recognize the biases each member brings to the table and how those biases interact.
Cognitive biases work by causing an individual to perceive the world around them in a manner that is
outside of what typically would be considered logical. For example, someone who falls prey to the anchor
bias when buying a car will be influenced by the first price they are exposed to, even if it does not
match their expectations; this is why a company advertises a "regular" price before discounting it, to
establish a reference point for value. For example, many people believed that setting a high price for the
original iPhone and then dropping the price substantially 74 days later was an attempt to use the anchor
bias to create a perception that the iPhone was a good value for the money. Our cognitive biases are
neither good nor bad, so long as we are aware of them.
Project team members make decisions on a continuous basis. Most decisions are made based on how the
decision-maker sees the information he or she has at hand. One bias that can affect how information is
perceived is the illusory correlation: believing that a relationship exists between two or more variables
when that relationship doesn't necessarily exist. For example, a team that works more hours a week might
be perceived to have higher productivity, because working longer gives the perception of creating more
output. But, of course, hours and productivity aren't necessarily related. Once you fixate on one
relationship, you will often pay less attention to others. In this case, you will miss the level of effort
the team is expending; if you are paying for the work by the hour, you could run up quite a bill while
believing your productivity is good based on hours spent. These biases can, of course, impact the outcome
of decisions.

[1] Kahneman, Daniel; Shane Frederick (2002). "Representativeness Revisited: Attribute Substitution in
Intuitive Judgment". In Thomas Gilovich, Dale Griffin, and Daniel Kahneman (eds.), Heuristics and Biases:
The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press, pp. 51-52. ISBN 978-0-521-79679-8.
Biases develop as shortcuts that help us perceive information and make decisions quickly. Pattern
recognition biases helped early humans stay alive by recognizing situations where they would likely run
into predators [2]. The resulting decisions kept our ancestors alive, even if there were false positives
(you could have lots of false positives, but only one false negative, because that one false negative
would get you eaten).
Project teams (Agile or not) use or fall prey to a wide range of biases that affect perceptions and impact
decisions. A sample of common biases includes:
Anchor bias refers to the tendency to rely heavily on one piece of information when making decisions.
This bias is often seen when early estimates for a project or a task are made. The instant they are placed
on the table, they become a reference point against which all changes will be compared.
Clustering illusion (or clustering bias) is the tendency to see patterns or trends in clusters or streaks
within a smaller sample of data inside larger data sets. For example, a team I recently worked with had an
average velocity of 30 story points per sprint (ranging between 20 and 36). They had three sprints in a row
that delivered 40+ story points. While nothing significant had changed about how the team was working,
outsiders saw a pattern and believed something out of the ordinary was occurring. (If there is no
statistical significance to the data, what we are seeing is "common cause" variance [3].)
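The velocity streak can be tested numerically rather than eyeballed. The sketch below (all velocity numbers invented) applies a simple three-sigma control limit, a common way to separate common cause from special cause variation:

```python
import statistics

# Invented sprint velocities for a team averaging ~30 points (range 20-36)
history = [28, 33, 20, 31, 36, 24, 30, 22, 34, 29, 35, 26]
streak = [41, 43, 40]  # three sprints that look like a pattern

mean = statistics.mean(history)
sd = statistics.pstdev(history)
upper = mean + 3 * sd  # upper control limit

for v in streak:
    label = "special cause" if v > upper else "common cause"
    print(f"{v}: {label} (upper limit = {upper:.1f})")
```

With these figures every sprint in the streak falls inside the control limit: the "pattern" is ordinary common cause variation, exactly the situation described above.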
The curse of knowledge bias generates a filter that blocks the ability to think about a topic from a
different, generally less-informed perspective. In many cases, being an expert on a topic makes it very
difficult to see an out-of-the-box solution. This is one of the reasons significant changes in IT platforms
or concepts come when new players with less experience enter the organization. Similar to the curse of
knowledge bias is the status quo bias, the tendency to want things to stay relatively the same. It
creates a perception filter that limits the individual or team to data and concepts that make them
comfortable. The two are similar because knowledge and the status quo are linked.
[2] http://www.slideshare.net/elentini1976/the-portable-mba
[3] Common cause variance is a reflection of the capacity of the process rather than any special situation.
An availability cascade occurs when a concept becomes more and more plausible the more it is repeated
publicly; it is a self-reinforcing feedback loop. Perhaps that is why Pokémon became more popular the
more it was shown on the Cartoon Network. Daniel Pink, author of To Sell Is Human, pointed out in a July
9th Salesforce.com webinar that repetition increases processing fluency, which makes the repeated item
seem more true. Sales, marketing, and 24-hour news channels understand and use the availability cascade
to great effect.
A final example of a bias that affects behavior and perception is optimism bias: the tendency to be overly
optimistic about favorable outcomes. This bias is often exhibited in status reports or in promises made
early in a project. These are generally not intentional lies but rather, because of our optimism bias, the
belief that we can deliver more than is truly possible. Dr. Ricardo Valerdi, in Software Process and
Measurement Cast 84, suggests that optimism bias is one of the major reasons IT personnel are poor
estimators.
All biases can create blind spots. A good coach or leader will first be aware of his or her own biases and
then help the team understand its blind spots, while not abandoning the shortcuts that help us perceive
what is important and make timely decisions.
Biases can affect behavior. Neglect of probability is a type of cognitive bias common in IT organizations'
project planning and estimating, and in risk management. Neglect of probability creates a perception of
certainty. For example, most estimates are not certain and should be represented as a range based on
probability; techniques like Monte Carlo analysis can be used to generate these estimates. However,
almost all estimates are represented as a single number and, as a result, we ignore the continuum of
probability. Lottery ticket sales are a classic example of probability neglect: buying ten tickets instead
of one barely changes the (still minuscule) probability of winning, but that does not stop those who
believe it meaningfully increases their chances. In both cases, neglecting probability affects how we
behave and make decisions. Many of the filters and shortcuts that we develop to help us successfully
interact with our environment can negatively impact team effectiveness. A sample of common biases in this
category that affect IT teams includes the following:
A zero-risk bias reflects a preference for mitigating (finding a way to make the risk go away) a small
risk down to zero, rather than reducing a larger risk that cannot be driven to zero. This bias can be seen
when developing difficult, and therefore riskier, user stories. For example, a team will do the easy
pieces of work early in a sprint, leaving the more difficult work until late in the sprint, which
increases the likelihood of not completing stories during the iteration. By leaving the difficult portions
of work until later, we increase the chance that we will miss our commitment when something goes wrong, or
that the estimate will be wrong with no time to recover.
The bandwagon effect occurs when there is a tendency to adopt an idea because the crowd does. For
example, when an idea is shown on the cover of all the industry journals, teams tend to take it up with
gusto. When I was a child, I used the "everybody is doing it" refrain to try to convince my mother that
staying out all night was a good idea - she did not buy it. The bandwagon effect is more common in
teams that exhibit ideological homogeneity, because there are fewer alternative perspectives to provide
resistance. Where a common core belief is strongly held, groupthink can make new ideas more difficult to
adopt.
I am firmly convinced that if I watch my alma mater play football on television they will lose (however,
my wife suggests that I am not that powerful). This bias is called the illusion of control: the tendency
to overestimate one's (or a team's) degree of influence over external events. Teams or team members who
fall prey to the illusion of control can make poor decisions because they think they can control the
future.
The social desirability bias can be problematic, especially during retrospectives. The social desirability
bias is the tendency to over-report socially desirable behaviors while under-reporting undesirable ones.
Teams or individuals that fall prey to this bias have a hard time identifying and dealing with the tough
interpersonal issues that can occur on teams. When this bias is present, the coach or leader needs to
address it directly, focusing the team on the behaviors driving the delivery of value rather than on
socially desirable behaviors alone, or risk deep-seated dysfunction.
Team composition is important. Most of us would agree that a team's membership should have a broad set of
capabilities and that members should push each other intellectually. When a team is assembled by a leader
with a social comparison bias, however, membership decisions tend to be made so that no one on the team
competes with the leader's strengths. This type of bias (and there are a number of biases with similar
impacts) leads to teams that will rarely challenge the leader's perception of the status quo.
Biases drive behaviors. When biases generate unproductive behaviors, the effectiveness of the team and its
members will be reduced. When the biases drive behaviors that are benign, a coach or leader can help sort
out problems. However, some biases generate behavior that is far from benign; where a bias is causing
serious team issues, I strongly suggest involving a trained human relations specialist.
Motivational biases (also known as social biases and/or attribution biases) reflect the errors we make
when evaluating the rationale for both our own behavior as well as the behavior of others. Misperceptions
of what is driving behavior can cause communication problems among team members and erode team
trust. For example, a self-serving attribution bias occurs when success is attributed to internal factors and
failures to external factors. This type of bias can occur at the individual or the team level. While
attribution bias can improve self-esteem (or team-esteem), it can also cloud judgment by causing an
overestimation of capability. For example, if a team is able to deliver more than their long-term
productivity or velocity would predict, the team might believe that they have increased their capability to
deliver. If there haven’t been any fundamental changes, such as an infusion of knowledge, training or new
tools, the higher velocity may not be attributable to the team. A good coach will help teams examine these
types of biases during retrospectives.
A sample of common motivational biases that affect IT teams includes:
The halo effect occurs when our impression of a person influences how we interpret their specific traits.
Recently, I observed a discussion between a very charismatic .NET coder and a network administrator. The
coder's charisma increased the weight of his argument. As a result, the network administrator agreed to
approve a new set of transactions for production that he probably should not have; later, that approval
had to be renegotiated. The network administrator was motivated into a poor decision by the programmer's
charisma.
Illusion of transparency is a bias in which individuals overestimate how well others know them and/or
overestimate their own ability to understand what is driving someone else. It is an inescapable fact that
humans interpret each other's behaviors and actions, assigning a rationale to each action and reaction,
which in turn feeds how they act. When those interpretations are wrong, conflict can result, and conflict
can lead to reduced productivity.
Teams are a core feature of most modern IT organizations. By definition, all teams have a boundary.
Hardening of team boundaries can lead to intergroup bias. Intergroup bias motivates members of a
group to give preferential treatment to other members of the group. An overly hardened team boundary
can motivate a team to resist new ideas being introduced which leads to intellectual atrophy. Coaches and
leaders should encourage group cohesion but help teams avoid severe intergroup bias.
Fundamental attribution error refers to a scenario in which an individual overemphasizes
personality-based explanations for behaviors in others (e.g., they are lazy, they aren't very smart) while
underemphasizing the influence the situation had on driving the behavior. Fundamental attribution error
assigns a theory of motivation to someone else's behavior (often incorrectly), which then leads the
assigner to behave incorrectly in turn. When fundamental attribution error rears its ugly head on a team,
a coach, Scrum Master, or leader should uncover the problem, usually in a retrospective, and facilitate a
resolution. When the bias is deep-seated, it may be necessary to leverage professional human resource
facilitators or, ultimately, to change the team composition.
Cognitive biases can affect every project step in which humans are involved. The impact of cognitive
biases tends to cluster around how people behave and how they communicate. Cognitive biases are not just
psychological theory; as an example of their impact in real life, consider how common cognitive biases can
affect a typical planning meeting in an Agile project.
A typical Scrum planning session is a two-step affair. In the first step, the product owner starts by
selecting the work he or she wants to accomplish in the current sprint from the prioritized backlog. The
product owner is acting based on business value and input from the team and other stakeholders. While the
selection is made through an informed process, the mere fact that an amount of work has been placed on the
table leverages the anchor bias and sets an expectation of what is needed. In the second step, the team
negotiates with the product owner about what really can be accomplished, with the amount of work the
product owner has suggested in mind. The discussion that occurs during the first step can be impacted by
the cognitive biases of each participant. These biases can range from the halo effect (overweighting the
input of an individual based on personality attributes) to intergroup bias (ideas constrained by the
team's boundaries). In scenarios where the product owner and one or more senior team members agree, the
entire team can often be pulled into agreement by the bandwagon effect. Coaches need to ensure that the
backlog is well prioritized and that the product owner's initial selection is sensitive to the team's
capacity in order to minimize the impact of the anchor and other cognitive biases.
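One way to keep the initial selection tied to capacity rather than to the anchor is to cut the commitment off at the low end of the team's historical velocity. The sketch below uses an invented backlog and invented velocity figures:

```python
# Invented prioritized backlog: (story, story points), highest priority first
backlog = [("login", 5), ("search", 8), ("export", 3), ("audit", 8), ("theme", 2)]

def conservative_commitment(backlog, low_velocity):
    """Take stories in priority order until the low end of the team's
    historical velocity is reached; anything beyond is a stretch goal."""
    chosen, total = [], 0
    for story, points in backlog:
        if total + points > low_velocity:
            break  # stop at the first story that would exceed capacity
        chosen.append(story)
        total += points
    return chosen, total

chosen, total = conservative_commitment(backlog, low_velocity=20)
print(chosen, total)
```

Starting the negotiation from a capacity-bounded list, instead of from whatever was first placed on the table, weakens the anchor before the team ever begins task planning.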
During the second step of the planning process, the team breaks down the work and determines how much
they really can accomplish. The team begins with an understanding of what they are being asked for (the
other side of the anchor bias). Once they have the list from the product owner, they can begin to plan
(identify, order, and size tasks). The planning process is apt to be affected by many of the planning and
motivational biases noted during the first step, as well as others that are more prevalent among IT
personnel. For example, IT personnel are trained problem-solvers and generally have not met a problem they
do not think they can solve, which feeds the optimism bias (the tendency to be overoptimistic about
favorable outcomes). This can lead a team astray when addressing unique business or technical problems.
Another bias that impacts planning is the planning fallacy: the tendency to underestimate task completion
times - remember all those late nights trying to get things done. The power of group planning sessions is
that many minds can be brought to bear on the topic.
Assuming that the team has not fallen prey to intergroup bias, a diverse team will be able to avoid many
of the biases that plague individuals. During the team component of the planning process, the Scrum Master
or coach should observe and facilitate how the team interacts.
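The Monte Carlo technique mentioned earlier as a counter to neglect of probability and the planning fallacy can be sketched in a few lines. The task figures here are invented:

```python
import random

random.seed(7)  # fixed seed so the sketch is repeatable

# Invented tasks: (optimistic, most likely, pessimistic) hours
tasks = [(4, 8, 20), (2, 5, 12), (6, 10, 24)]

def simulate_total():
    # Sample each task from a triangular distribution and sum the draws
    return sum(random.triangular(low, high, mode) for low, mode, high in tasks)

runs = sorted(simulate_total() for _ in range(10_000))
p50 = runs[len(runs) // 2]
p80 = runs[int(len(runs) * 0.8)]
print(f"50% confident: <= {p50:.0f} hours; 80% confident: <= {p80:.0f} hours")
```

Reporting the 50th and 80th percentiles instead of a single number keeps the continuum of probability visible, rather than hiding it behind one falsely certain estimate.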
The discussion of cognitive biases is not a theoretical exercise. Even a relatively simple process, sprint
planning in Scrum, is influenced by the cognitive biases of the participants. Biases are powerful
psychological filters that affect how individuals and teams perceive the world around them and then how
they behave. Biases reflect shortcuts in how we interpret and react to stimuli. In many cases these
reactions are valuable; however, like many shortcuts, they can also cause problems.
Understanding how biases impact the way that individuals and teams perceive the world around them can
help teams make better decisions and therefore deliver value more effectively. When teams are
established, members begin the process of learning each other’s biases and capabilities. No amount of
team-building exercises can replace the knowledge gained by collaborating and creating value for the
organization. This learning curve is one of the reasons it is critical for teams to stay together over time.
As teams build an internal knowledge base and accumulate experience, not everything that is discovered
will be conducive to team effectiveness. Assuming that the issues are not pathological (e.g., ax murderers
or chronic liars), the team should use techniques such as periodic retrospectives to surface problems and
deal with them. For example, one team I was observing had been struggling with a member who tended to
dominate discussions based on his significant technical prowess in a specific technology. More junior
members were routinely cut out of conversations. The issue surfaced during a retrospective. The
technologist in question agreed that his dominance was causing a communication issue and that he would be
part of the solution. The team decided to adopt a signal technique, i.e., a safe word, to cut him off and
let others into the discussion. The team dealt with the problem without having to get managers involved.
The Scrum Master and/or coach should facilitate the process but should not solve the problem for the team.
Effective teams build an understanding of the capabilities and biases of each member and of the team as a
whole. The most effective means for a team to learn about itself is by working together toward a common
goal. As teams work together, techniques such as retrospectives provide a means to mitigate individual
biases that are counterproductive (at odds with team or organizational culture).
To paraphrase John Donne, no team member is an island but rather “part of the main.”