Communicating the impact of our research can be essential for securing funding, forming research partnerships, building a case for tenure and promotion, or achieving other goals. But what does “impact” really encompass, and how do we show evidence of it? This session will highlight key strategies, resources, and services that can help you to successfully communicate your research impact.
Presenter: Erin Owens, Professor and Scholarly Communication Librarian, Newton Gresham Library, Sam Houston State University
2. NOTE: SLIDES WILL BE SHARED!
DON'T TRY TO WRITE DOWN EVERY RESOURCE.
SLIDES WILL BE AVAILABLE ONLINE:
https://shsulibraryguides.org/publish/slides
3. WHY WORRY ABOUT IMPACT?
• Inquiry for the sake of inquiry and knowledge-building is valid
• Nevertheless…
• Communicating the impact of our research is sometimes essential for:
• Securing funding
• Forming research partnerships
• Building a case for tenure and promotion
• Achieving other goals
4. WHAT IS IMPACT?
• The effect research has beyond academia
• “…when the knowledge generated by our research contributes to,
benefits and influences society, culture, our environment and the
economy”
• Something that other people or institutions gain or do—not
something that you as a researcher can ‘do’
(York University)
5. WHAT IS IMPACT?
• Academic Impact: “The demonstrable contribution that excellent
social and economic research makes in shifting understanding and
advancing scientific method, theory and application across and
within disciplines”
• Economic and Societal Impact: “The demonstrable contribution
that excellent social and economic research has on society and the
economy, and its benefits to individuals, organisations or nations”
(UK Research and Innovation, Economic and Social Research Council)
6. WHAT IS IMPACT?
• Areas of research impact could include:
• Academic / scholarly / intellectual impact
• Cultural impact
• Economic impact
• Environmental impact
• Social impact
• Impact on health and wellbeing
• Policy influence and change
• Impact on or change in practice (teaching, etc.)
• Legal impact
• Technological developments
8. ACHIEVING IMPACT
• Ask at a project’s start: What can you do to ensure that potential
beneficiaries will have the opportunity to engage with your
research?
• Be specific about who the possible beneficiaries of your research are.
• Make sure your dissemination methods and activities for driving impact are
appropriate for your beneficiaries/audience.
• Consider how you intend to evaluate and gather evidence of impact.
9. ACHIEVING IMPACT
• Publicizing Outputs
• SHSU MarCom, coverage of research on the SHSU website and social
media
• Promote through your professional networks, scholarly societies, social
media
• Disseminate to (relevant) online, print, and broadcast media outlets
• Discuss options with MarCom for a press release
• Disseminate directly to likely beneficiaries
• Submit less-academic summaries of work to blogs
• Post citations and abstracts to scholarly profile services
• Share open-access files (within boundaries permitted by publisher)
Image by Digital Photo and Design DigiPD.com from Pixabay
10. ACHIEVING IMPACT
Image by PublicDomainPictures from Pixabay
• Commercializing Outputs
• Patenting
• Licensing
• Work with the Technology and Commercialization
unit of SHSU Office of Research & Sponsored
Programs (ORSP)
11. ACHIEVING IMPACT
• Other factors that support impact:
• Establishing networks with research users, acknowledging their expertise and
role in making impact happen
• Involving users at all stages of the research, including user stakeholder and
participatory groups
• Developing good understanding of policy and practice contexts
• Committing to portfolios of research activity that build up a strong reputation with
research users
• Involving intermediaries and knowledge brokers as translators, amplifiers,
network providers
(UK Research and Innovation, Economic and Social Research Council)
12. EVIDENCE OF IMPACT: METRICS
• “Research metrics are measures used to quantify the influence or
impact of scholarly work”
• “Research metrics are used because of a desire for a quantifiable,
objective means of comparing scholarship”
• “However, they all have weaknesses”; criticism often centers on
• Limits of coverage of databases used to create metrics
• Failure to account for differences in scholarly output and citation rates among
disciplines
• Over-reliance…on quantitative rather than qualitative metrics in judging
scholarship.
(Ohio State University)
13. TRADITIONAL IMPACT METRICS
• Journal-level
• Citation-based (JIF)
• Acceptance rate
• Longevity
• Perceived prestige
• Item-level
• Articles published
• Citations to work
• Patents filed/granted
• Commercial profit
• Researcher-level
• H-index
Images captured 8-Sep-2022 from Google (right) and Web of Science (bottom)
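The h-index mentioned above has a simple definition: a researcher has index h if h of their papers have each been cited at least h times. A minimal sketch in Python, for illustration only; in practice the citation counts come from databases such as Web of Science or Google Scholar, whose coverage differs, so the same researcher can have several different h-indexes:

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the top `rank` papers all have >= rank citations
        else:
            break
    return h

# Hypothetical citation counts for five papers
print(h_index([10, 8, 5, 4, 3]))  # 4: four papers have at least 4 citations each
```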
14. TRADITIONAL METRICS: CITATIONS
• What does it mean when a work is cited?
• Many reasons for citing
• Many biases in citing
• “If we don’t know what citations mean, what does it mean when we count them?”
• “A lot of research may in fact labor in obscurity, but that doesn’t mean at some point it won’t
become of signal importance. Basic research takes time to germinate. Some won’t have any
measurable impact at all. But that doesn’t mean it’s low quality and it doesn’t mean that it
doesn’t have value.”
• Quotes from Karin Wulf (2022) on The Scholarly Kitchen
15. ALTERNATIVE METRICS (ALTMETRICS)
• Generally demonstrate engagement at the item-level
• Field-weighted citation counts
• Digital usage (views, downloads)
• Social media posts, likes, etc.
• Mendeley readers
• Syllabus adoption
• Book holdings in WorldCat libraries
• Researcher-level alternatives to h-index
• i10-index, G-index, E-index – different, but all have their own weaknesses
Image captured 8-Sep-2022 from Dimensions
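The researcher-level alternatives named above are likewise computed from a list of per-paper citation counts. A hedged sketch of two of them; the threshold of 10 in the i10-index is Google Scholar's convention, and the g-index here follows Egghe's definition, capped at the number of papers:

```python
def i10_index(citations):
    """Number of papers with at least 10 citations (Google Scholar's i10-index)."""
    return sum(1 for cites in citations if cites >= 10)

def g_index(citations):
    """Largest g such that the top g papers together have >= g^2 citations.
    This sketch caps g at the number of papers; some variants allow higher."""
    counts = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, cites in enumerate(counts, start=1):
        total += cites
        if total >= rank * rank:
            g = rank
    return g

# Same hypothetical five papers as before
sample = [10, 8, 5, 4, 3]
print(i10_index(sample))  # 1: only one paper has 10+ citations
print(g_index(sample))    # 5: the top 5 papers total 30 >= 25 citations
```

Note how the three indexes weight the same record differently: the g-index rewards a few highly cited papers more than the h-index does, while the i10-index ignores everything below its threshold.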
16. OTHER METRICS
• Sources of citations: geography, language, experts vs students, etc.
• Openness of your outputs, access beyond “the ivory tower”
• Influence on teaching, practice, policy
• Alignment with Sustainable Development Goals (SDGs) or other
roadmaps
• Research collaborations developed
• Awards, honors, appointments (e.g., National Academies), other
distinctions
17. CREATIVE WORKS METRICS
Image Source and More Details: University of Melbourne Guide: Research Impact for Fine Arts and Music
18. IDENTIFYING POTENTIAL METRICS
• Metrics Toolkit
• Snowball Metrics – esp. Ch 5, Output & Outcome Metrics
• Becker Medical Library Model for Assessment
of Research Impact
• Periodic Table of Scientometric Indicators
• Embrace the diversity of what impact might look like in your context
23. KEY STRATEGIES
• Start with your researcher story:
• Your values
• Your goals/agenda
• Your target beneficiaries
• Your discipline’s values
• Your discipline’s priorities or research “roadmap”
Diagram: “My Impact Story” at the intersection of Impact Examples & Metrics, My Discipline Context, and My Goals & Values
24. KEY STRATEGIES
• Add metrics and indicators that enhance your story, showing:
• Progress towards your goals
• Alignment with personal and disciplinary values
• Success in reaching target beneficiaries
• Consider including data visualizations
• Bar charts, pie charts, geographical engagement maps, collaboration or citation network maps
25. KEY STRATEGIES
• Integrate meaningful anecdotes
• Sharing one thoughtful and substantive tweet may mean more than a count
of tweets
• Know your field!
• Some metrics will mean a lot more/less to different disciplines
• Don’t waste time exhaustively cataloging “stuff” that doesn’t make sense for
your work or that your field doesn’t prioritize or value
27. CAVEATS
• Metrics are contextual: Do not try to compare across disciplines on
a metric like Journal Impact Factor, a citation count, an h-index,
etc.
• Some cannot even be compared between early-career and late-career
researchers
• Respect a metric’s intention and avoid unintended uses: for
example, Impact Factor describes a journal and should not be
used as a proxy to describe an article or researcher
• Read the San Francisco Declaration on Research Assessment
(DORA) and the Leiden Manifesto for more insight on the
responsible use of metrics
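One way to see why Impact Factor describes a journal and not an article: the JIF for year Y is an average, dividing the citations received in Y (to items the journal published in Y-1 and Y-2) by the number of citable items published in Y-1 and Y-2. A minimal sketch with hypothetical figures:

```python
def journal_impact_factor(cites_in_year, citable_items):
    """JIF for year Y: citations received in year Y to items published
    in years Y-1 and Y-2, divided by the citable items from Y-1 and Y-2.
    It is an average over the whole journal; a single article's citation
    count can sit far above or below it, which is why JIF should not be
    used as a proxy for article or researcher quality."""
    return cites_in_year / citable_items

# Hypothetical journal: 200 citations in 2022 to 80 items from 2020-2021
print(journal_impact_factor(200, 80))  # 2.5
```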
28. CAVEATS
• Research impact is complex: Do not trust attempts to reduce
it to a single metric or “score”
"Unfortunately, there isn't a simple formula to measure the reach, influence, mobilisation of knowledge
and impact in society. The journey to impact is far more heterogeneous than something measured by
an impact factor calculation.
"We need a far more nuanced and sophisticated approach using both quantitative and qualitative
approaches. A combination of the classic quantitative measures, combined with storytelling, narratives
around how research makes that journey to broader society. That can’t be done in a number."
- Tony Roche, CEO of Emerald Publishing (source)
29. KEY RESOURCES
• Tools for journal-level metrics
• * Journal Citation Reports (JCR)
• Scopus Metrics
• Scimago Journal and Country Rank
• Eigenfactor
• * MLA Directory (for arts and humanities)
* Indicates resources that must be accessed through the SHSU Library
30. KEY RESOURCES
• Tools for article-level metrics (including citations)
• Dimensions
• Google Scholar
• Publish or Perish (better options to aggregate Google Scholar data)
• * Web of Science (incl. Citation Network)
• Scite (for AI “classification” of citation context)
• Policy Commons (to find policy citations)
• iCite (Relative Citation Ratios for PubMed works)
• Kudos (create and host plain-language stories to promote research to non-academic audiences)
31. KEY RESOURCES
• Tools for other metrics
• Dimensions
• WorldCat
• Open Syllabus Project
• SHSU Library's Research Impact Guide
32. LIBRARY SUPPORT
• The Scholarly Communications Librarian can help you to:
• Identify appropriate potential metrics
• Understand and explain what a particular metric conveys (and does not)
• Learn to use key resources for finding metrics for your work
• Compile specific metrics for your work
• Draft a research impact narrative
• Discuss other related topics
33. THANK YOU! QUESTIONS?
Erin Owens
Professor, Scholarly Communications Librarian
Newton Gresham Library, SHSU
936-294-4567
eowens@shsu.edu
Revisit slides at https://shsulibraryguides.org/publish/slides
Editor’s Notes
Group discussion: What is impact?
What you need – What you do – What you deliver – Awareness and use of your outputs – Consequences of using your outputs
Note feedback loops along the way
The further you go along the path, the less control you have as the researcher
How many people have heard of Impact Factor?
Impact Factor as an example to discuss limits of coverage
Group discussion time: What does a citation mean? Why do we cite?
Not sure if it’s worth mentioning HuMetrics here or not…
Not just for hard science and social science; metrics look different in the creative fields, but there are still options
Now let’s look a little deeper at these tools…
Metrics Toolkit
Snowball
Becker
Examples of metrics as used in proposal; CV; and (last two) P&T portfolio
I’m not necessarily endorsing these as the best examples or models to follow, just to give you an idea of how others have communicated some of this