Looking to the future: closing the gaps in our assessment approach
1. ORCID-CASRAI, May 18-19, 2015, University of Barcelona
Looking to the future: closing the gaps in our assessment approach
Paul Wouters
2. Outline
• Diagnosis: current state in the organisation of science
and scholarship
• Four key problem areas
• Searching for solutions
• A look at the future
7. • A severe imbalance between the dollars available for research and the
still-growing scientific community in the United States.
• The training pipeline produces more scientists than relevant positions in academia, government, and the private sector are capable of absorbing.
• Hyper-competition for the resources and positions that are required to
conduct science suppresses the creativity, cooperation, risk-taking, and
original thinking required to make fundamental discoveries.
• Overvaluing translational research is detracting from an equivalent
appreciation of fundamental research of broad applicability
• As competition for jobs and promotions increases, the inflated value
given to publishing in a small number of so-called “high impact” journals
has put pressure on authors to rush into print, cut corners, exaggerate
their findings, and overstate the significance of their work.
• Today, time for reflection is a disappearing luxury for the scientific
community.
• The quality of evaluation has declined
8. Research leaders face key questions
• How should we monitor our research?
• How can we profile ourselves to attract the right students
and staff?
• How should we divide funds?
• What is our scientific and societal impact?
• What is actually our area of expertise?
• How is our research connected across disciplines?
9. Research leaders need more, not less,
strategic intelligence
• Increasing demand for information about research:
– hyper-competition for funding
– globalization
– industry-academic partnerships
– interdisciplinary research challenges
– institutional demands on research & university management
• Increased supply of data about research:
– web-based research
– deluge of data-producing machines and sensors
– increased social scale of research: international teams
– large scale databases of publications, data, and applications
10. Four main problems in current academic
research
• The funding system
• The career structure
• The publication system
• The evaluation system
11. Funding system
• level of funding
• balance between project and infrastructure funding
• balance between blue sky and focused funding
• relationship between research and teaching
• one size fits all?
12. Career structure
• PhDs and postdocs as cheap labour
• hyper-competition
• mismatch between training and job opportunities
• lack of dual careers
• emerging separation between researchers and teachers
• increasing inequalities
• lack of diversity in workforce (this may be improving)
14. Codification by publication
• The publication system is the basis for communication,
teaching and codification
• Hence, all evaluation systems in science and scholarship
are in the end based on publications
• Commercial interests have been able to use the publication system as a source of vast profits
• Publishing for the smallest audience possible?
• Evaluation systems have developed on the basis of information (systems) about these publications:
– peer review in various formats
– scientometrics and bibliometrics
15. Evaluation Gap
➡ discrepancy between evaluation criteria and the social and economic functions of science
➡ evaluation methods (esp. qualitative) have not adapted to the increased scale of research
➡ available quantitative measures are often not applicable at the individual level
➡ lack of recognition for new types of work that researchers need to perform
16. Evaluations are liminal
One often has the feeling that there should have been
a clear-cut plan for the purpose and process of an
evaluation, but this is often not the case. (…) people
realize too late that they had very different notions
of plans for evaluation (…) The purpose of the
evaluation constitutes an ongoing controversy
rather than a common logical starting point.
(p. 15)
17. Evaluation Machines
• Primary function: make stuff auditable
• Mechanization of control – degradation of work and
trust? (performance paradox)
• Risks for evaluand and defensive responses
• What are their costs, direct and indirect?
• Microquality versus macroquality – lock-in
• Goal displacement & strategic behaviour
18. Constitutive effects
• Limitations of conventional critiques (e.g. ‘perverse or unintended effects’)
• Effects:
• Interpretative frames
• Content & priorities
• Social identities & relations (labelling)
• Spread over time and levels
• Not a deterministic process
• Democratic role of evaluations
19. Effects of indicators
• Intended effect: behavioural change
• Unintended effects:
– Goal displacement
– Structural changes
• The big unknown: effects on knowledge?
• Institutional rearrangements
• Does quality go up or down?
21. New trends in assessment
• Increased bibliometric services at university level
available through databases
• Increased self-assessment via “gratis bibliometrics” on the web (h-index, Publish or Perish, etc.; see the h-index sketch after this list)
• Emergence of altmetrics
• Increased demand for bibliometrics at the level of the
individual researcher
• Societal impact measurements required
• Career advice – where to publish?
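For readers who want to check the arithmetic behind such self-assessment tools: the h-index is the largest h such that at least h of a researcher's papers have received at least h citations each. A minimal sketch in Python, using made-up citation counts:

def h_index(citations):
    # Largest h such that at least h papers have >= h citations each.
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Five papers with these (made-up) citation counts give an h-index of 3.
print(h_index([10, 8, 5, 2, 1]))  # -> 3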
27. • Other relevant patterns:
– Twitter: stronger in the Social Sciences and General Medicine, weaker in the Natural Sciences and Humanities
(Chart: coverage by fields; source: Rodrigo Costas)
28. • Blogs and news media have a strong focus on multidisciplinary journals!
(Chart: coverage by fields; source: Rodrigo Costas)
29. CWTS Monitor - Meaningful Metrics
• A new interactive approach to bibliometric analysis
• Powerful web-based application:
– User-friendly reporting interface
– Robust, cleaned WoS database maintained by CWTS
– Fair and correct benchmarking with state-of-the-art indicators
– Highly configurable to the client's specific needs
• Professional bibliometric reporting in your hands
• Scientists affiliated with CWTS at Leiden University provide expert support
31. What can we do with altmetrics?
• Indicators for evaluation of scientific performance or
societal impact?
– Mendeley – strongest potential!
– Twitter, blogs, news media, etc. – not yet there…
• Two analytical opportunities (a query sketch follows this slide):
– Social media reception/interest of research topics
– Audiences/communities around topics and scientific publications
Source: Rodrigo Costas
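As an illustration of the kind of data behind such altmetrics, the sketch below queries the public Altmetric API for a single DOI. The endpoint pattern is the public v1 DOI lookup; the DOI shown is a placeholder, and the response field names used here are assumptions that should be checked against the live schema.

import json
import urllib.request

def altmetric_summary(doi):
    # Public Altmetric lookup for one DOI; returns a small summary dict.
    url = f"https://api.altmetric.com/v1/doi/{doi}"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return {
        "altmetric_score": data.get("score"),                    # assumed field name
        "tweets": data.get("cited_by_tweeters_count", 0),        # assumed field name
        "blogs_and_feeds": data.get("cited_by_feeds_count", 0),  # assumed field name
    }

# Hypothetical placeholder DOI:
# print(altmetric_summary("10.1000/example-doi"))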
32. Aim: to give researchers a voice in evaluation
➡ evidence-based arguments
➡ shift to a dialogue orientation
➡ selection of indicators
➡ narrative component
➡ Good Evaluation Practices
➡ envisioned as a web service
(Diagram: portfolio, influence, narrative)
33. ACUMEN Portfolio (see the data-structure sketch after this slide)
Career Narrative: links expertise, output, and influence together in an evidence-based argument; the included content is negotiated with the evaluator and tailored to the particular evaluation.
Output: publications, public media, teaching, web/social media, data sets, software/tools, infrastructure, grant proposals
Expertise: scientific/scholarly, technological, communication, organizational, knowledge transfer, educational
Influence: on science, on society, on the economy, on teaching
Evaluation Guidelines:
– aimed at both researchers and evaluators
– development of evidence-based arguments (what counts as evidence?)
– expanded list of research output
– establishing provenance
– taxonomy of indicators: bibliometric, webometric, altmetric
– guidance on the use of indicators
– contextual considerations, such as stage of career, discipline, and country of residence
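Purely as an illustration of how such a portfolio could be captured in software, here is a minimal sketch in Python; all class and field names are hypothetical and not part of any ACUMEN specification.

from dataclasses import dataclass, field
from typing import List

@dataclass
class AcumenPortfolio:
    # Evidence-based argument linking expertise, output and influence;
    # in practice negotiated with the evaluator.
    career_narrative: str
    expertise: List[str] = field(default_factory=list)  # e.g. "scientific/scholarly", "knowledge transfer"
    output: List[str] = field(default_factory=list)     # e.g. "publications", "data sets", "software/tools"
    influence: List[str] = field(default_factory=list)  # e.g. "on science", "on society"

portfolio = AcumenPortfolio(
    career_narrative="Links expertise, output and influence in an evidence-based argument.",
    expertise=["scientific/scholarly", "knowledge transfer"],
    output=["publications", "data sets", "software/tools"],
    influence=["on science", "on teaching"],
)
print(portfolio.output)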
36. Circulation of knowledge
• focus not only on knowledge production but perhaps
even more on use of existing knowledge
• differentiate the mission and roles of knowledge
institutions
• create regional knowledge centres
• reform education and teach new “21st century skills”
• invest in quality of teaching and training
• restructure the way we work: combine work with learning
41. Key challenges in research information
system building
• Will the information infrastructure contain high quality
data and indicators?
• Will it enable and support context- and mission-sensitive
research assessments?
• Will it enable application of research information for primary research purposes (e.g. in virtual research environments, VREs)?
• Will the public sector remain master in its own house or
will it hand over control to the private sector?
• Will it be possible to truly open up the research agenda to
all stakeholders – open science in a democratic society?
43. A look at the future
• universities have developed robust and reliable information systems with coupled databases and robust identifiers for objects and people (see the identifier-lookup sketch after this list)
• publication and citation data and all metadata about research are freely available (companies give these data away as part of their business model)
• the current publication system is replaced by a web-based open access platform – the role of the journal has radically changed (and there are far fewer journals around); the role of books has been upgraded
• universities have developed effective HRM policies with serious career prospects and less oversupply of cheap labour; not every researcher wants to become a professor anymore (or we create many more types of professorships)
• all universities and research institutes have adopted the Leiden Manifesto for Research Metrics: http://www.nature.com/news/bibliometrics-the-leiden-manifesto-for-research-metrics-1.17351
• research evaluation has become routine, but also context- and mission-sensitive and a crucial instrument in research itself – the current one-size-fits-all approach is abandoned: the evaluation machines are servicing research and not the other way around
• most global university rankings have disappeared or have transformed themselves into serious information services
• research funding consists of four components, each with different criteria and goals: scalable infrastructure; innovative projects; heritage funding; and applied and teaching-based research
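As a small illustration of what robust identifiers for objects and people already look like in practice, the sketch below retrieves a public ORCID record via the ORCID public API. The v3.0 /record endpoint and the sample iD are used for illustration only; treat the exact response structure as an assumption and inspect it before relying on it.

import json
import urllib.request

def fetch_orcid_record(orcid_id):
    # Read the public record for an ORCID iD as JSON.
    url = f"https://pub.orcid.org/v3.0/{orcid_id}/record"
    req = urllib.request.Request(url, headers={"Accept": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example with the well-known sample iD (illustration only):
# record = fetch_orcid_record("0000-0002-1825-0097")
# print(record.get("orcid-identifier", {}).get("path"))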
44. Thank you for your attention
p.f.wouters@cwts.leidenuniv.nl