Short talk on 2 cognitive biases and reproducibility
1. Cognitive biases can lead to poor
reproducibility and replicability of science
Dorothy V. M. Bishop
Professor of Developmental Neuropsychology
University of Oxford
@deevybee
2. Two cognitive biases that make it
hard to do science well
• Schemata:
• see meaning in random noise
• communicate using simplified narratives
• Confirmation bias: selective attention/memory
• Failure to understand probability
• Misunderstand p-values; many false positives
• Misunderstand sampling error: many false negatives
• Asymmetric moral judgements
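The two probability failures above (inflated false positives, underpowered false negatives) can be illustrated with a small simulation. This sketch is not from the talk: the crude two-sample test with a |t| > 2.0 cutoff as a stand-in for p < .05, and the choice of n = 20 and d = 0.4, are illustrative assumptions.

```python
import random
import statistics

random.seed(42)

def significant(group_a, group_b, crit=2.0):
    """Crude Welch-style two-sample test: |t| > ~2.0 approximates p < .05."""
    na, nb = len(group_a), len(group_b)
    se = (statistics.variance(group_a) / na
          + statistics.variance(group_b) / nb) ** 0.5
    t = (statistics.mean(group_b) - statistics.mean(group_a)) / se
    return abs(t) > crit

def hit_rate(effect, n, runs=2000):
    """Proportion of simulated experiments declared 'significant'."""
    hits = 0
    for _ in range(runs):
        a = [random.gauss(0, 1) for _ in range(n)]
        b = [random.gauss(effect, 1) for _ in range(n)]
        hits += significant(a, b)
    return hits / runs

# No true effect: ~5% of experiments come out 'significant' anyway,
# and every one of those is a false positive.
print(f"false-positive rate (d=0, n=20): {hit_rate(0.0, 20):.2f}")

# Real but modest effect (d = 0.4) with n = 20 per group: power is low,
# so most experiments miss it (false negatives).
print(f"power (d=0.4, n=20):             {hit_rate(0.4, 20):.2f}")
```

Running more simulations like this, varying n and the effect size, is a quick way to build the intuition about sampling error that the slide says researchers lack.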
3. • Cherry-picking may not be deliberate
• We find it much easier to process and remember information
that agrees with our viewpoint
Confirmation bias: particular problem in citations
4. How common is this in practice?
Even when there are trial registries, we get publication bias
de Vries, Y. A., et al (2018). The cumulative effect of reporting and citation biases on
the apparent efficacy of treatments: the case of depression. Psychological Medicine
48, 2453-2455. doi:10.1017/S0033291718001873
5. How common is this in practice?
Even when there are trial registries, we get publication bias
We only know these studies exist because they are in a registry!
6. Publication bias and reporting bias conspire to mislead us
7. Inheritance of bias
• When we read a peer-reviewed paper, we tend
to trust the citations that back up a point
• When we come to write our own paper, we cite
the same materials
• If prior papers only cite materials agreeing with a
viewpoint, that viewpoint gets entrenched
• You won’t know – unless you explicitly search –
that there are other papers that give a different
picture
14. The key error is to treat a p-value as an indicator of the importance of a finding, regardless of context
I’m not testing whether J is ‘significant’ here – I’m testing whether any of 16 compounds
is ‘significant’.
The relevant probability is 1 – (probability that all 16 compounds are ‘nonsignificant’)
= 1 – .95^16 = 0.56
Simulated data from random normal distribution with mean = 0
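The slide’s arithmetic can be checked with a short simulation (a sketch, not part of the talk). It relies on the fact that under the null hypothesis a p-value is uniform on (0, 1), so screening 16 compounds at α = .05 yields at least one spurious ‘hit’ far more often than 5% of the time:

```python
import random

random.seed(0)

N_COMPOUNDS = 16
ALPHA = 0.05
RUNS = 10_000

# Under the null, each p-value is uniform on (0, 1), so every compound
# independently has a 5% chance of crossing the threshold by luck alone.
false_alarms = 0
for _ in range(RUNS):
    p_values = [random.random() for _ in range(N_COMPOUNDS)]
    if min(p_values) < ALPHA:
        false_alarms += 1

print(f"simulated P(at least one 'hit'): {false_alarms / RUNS:.2f}")
print(f"analytic 1 - 0.95**16:           {1 - 0.95 ** N_COMPOUNDS:.2f}")
# Both come out at roughly 0.56: with 16 tests on pure noise, a
# 'significant' compound is more likely than not.
```

This is exactly why the p-value for compound J cannot be read in isolation: the relevant question is the familywise probability across all 16 tests.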
15. To understand p-values:
SIMULATED DATA
“Just as lab scientists are not allowed to handle
dangerous substances without safety training,
researchers should not be allowed anywhere near a P
value or similar measure of statistical probability until
they have demonstrated understanding of what it
means.”
Bishop, D. V. M. (2020). World view: How scientists can stop fooling themselves. Nature, 583. 23 July 2020.
16. Professor Dorothy Bishop, FRS, FMedSci, FBA,
Department of Experimental Psychology,
Anna Watts Building,
Woodstock Road,
Oxford,
OX2 6GG.
@deevybee
http://deevybee.blogspot.com/2012/11/bishopblog-catalogue-updated-24th-nov.html
https://www.slideshare.net/deevybishop
https://orcid.org/0000-0002-2448-4033