Here are a few thoughts on how libraries can engage with altmetrics:
- Educate researchers on both traditional metrics and altmetrics, how they complement each other, and their appropriate uses and limitations. Libraries are well positioned to provide neutral information.
- Track altmetric attention to library resources like subject guides, finding aids, datasets, etc. to demonstrate value and inform improvements.
- Advocate for standards and best practices in altmetrics to address issues like gaming, duplication, and disciplinary differences. Libraries can contribute to developing robust and equitable metrics.
- Partner with researchers to track the impact of library services like data management consultations, publishing support, etc. through altmetrics. This helps illustrate libraries' broader contributions.
This document summarizes key points from a discussion about impact metrics and altmetrics with university researchers:
- Researchers had varying familiarity with traditional impact metrics like journal impact factors, with some seeing little meaning or value in them. Biology researchers saw potential issues like variability across disciplines.
- Alternative metrics were newer to most researchers: one mathematician expressed skepticism due to the lack of an established "truth", while another saw potential value in altmetrics that aggregate many user-generated assessments.
- Bringing it back to libraries: traditional metrics focus on the research outputs libraries purchase and support. Altmetrics change the landscape by measuring broader impacts beyond citations, possibly including engagement with library resources, services, and online presences.
2. This work is licensed under:
http://creativecommons.org/licenses/by-nc-sa/4.0/deed.en_US
You are free to:
Share — copy and redistribute the material in any medium or format
Adapt — remix, transform, and build upon the material
Under the following terms:
Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
NonCommercial — You may not use the material for commercial purposes.
ShareAlike — If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original.
7. Research impact defined
“Impact is usually demonstrated by pointing to a record of the active consultation, consideration, citation, discussion, referencing or use of a piece of research.”
http://blogs.lse.ac.uk/impactofsocialsciences/introduction/
11. Journal Level Metrics
• Web of Science: Impact Factor (2- and 5-year), Cited half-life, Eigenfactor® Score, Article Influence® Score
• Scopus: SCImago Journal Rank (SJR), Source Normalized Impact per Paper (SNIP)
• Google Scholar: h5-index, h5-median
12. Journal Impact Factor
Average number of times articles published in a two-year (or five-year) period have been cited.
A = total number of times ALL articles published in the 2- (or 5-) year period were cited in WoS-indexed journals during the next year
B = total number of "citable items" (usually articles, reviews, proceedings, or notes; not editorials and letters to the editor) published in the 2- (or 5-) year period
Impact factor = A/B
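The A/B definition above is simple enough to sketch directly; a minimal illustration in Python (the citation and item counts below are invented for the example, not real journal data):

```python
def impact_factor(citations_to_prior_items: int, citable_items: int) -> float:
    """Journal Impact Factor as defined on the slide:
    A = citations received this year to items published in the
        prior 2 (or 5) years, counted in WoS-indexed journals;
    B = "citable items" published in that same window.
    """
    return citations_to_prior_items / citable_items

# e.g. a hypothetical journal whose prior-two-year articles drew
# 450 citations this year, from 180 citable items published:
print(round(impact_factor(450, 180), 2))  # 2.5
```

Note that the same division yields different values depending on which items a provider counts as "citable", which is one driver of the debate on the next slide.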
13. Journal Impact Factor Debate
• Vary across disciplines
• e.g., variation in time-to-publication
• Review journals have higher impact factors
• fewer articles per journal, cited more often
• Calculations easily skewed
• heavy editorial/letter/opinion content
• high/low percentage of review articles
14. Declared Dissent – 10,419
• “Impact factors declared unfit for duty”
Stephen Curry, LSE Blog
• “Do not resuscitate: the journal impact factor declared dead”
Brendan Crabb, The Conversation
• “Just say no to impact factors”
Ismael Rafols and James Wilsdon, The Guardian
DORA: San Francisco Declaration on Research Assessment
(Dec. 2012)
16. The h-index is NOT perfect
           Paper 1    Paper 2    Paper 3    Paper 4    Paper 5
Author A   3 cites    6 cites    100 cites  4 cites    1 cite
Author B   3 cites    6 cites    100 cites
Author C   400 cites  150 cites  3 cites    6 cites
In every case, h-index = 3
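The slide's point can be checked directly. A short sketch of the standard h-index calculation (largest h such that the author has h papers with at least h citations each), applied to the three authors' citation counts from the table above:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank   # this paper still clears the threshold
        else:
            break      # counts are descending, so no later paper can
    return h

# The three authors from the slide all land on the same h-index of 3,
# despite very different citation profiles:
print(h_index([3, 6, 100, 4, 1]))   # Author A -> 3
print(h_index([3, 6, 100]))         # Author B -> 3
print(h_index([400, 150, 3, 6]))    # Author C -> 3
```

Author C's 400- and 150-citation papers contribute nothing beyond the threshold, which is exactly the weakness the slide illustrates.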
19. Web metrics
• Webometrics: link analysis, web log file analysis (1990s)
• Bibliometric use: downloads, citations
• Google web page ranking
• Web 2.0 -> scientometrics 2.0 (Priem & Hemminger 2010)
20. Open Access
• Producing a highly complex information environment; touches all disciplines
• Many formats, e.g. websites, nanopubs, blog posts, code, images
• Versions, e.g. preprints, postprints, “same article, different places”, version of record
21. Identification Systems
Authors
• International Standard Name Identifier (ISNI)
• ORCID (subset of ISNI)
• Names Project (JISC)
• AuthorClaim, RePEc Author Service
• arXiv Author ID
• ResearcherID (Thomson Reuters)
• Scopus Author ID (Elsevier)
• Gravatars (Automattic)
Objects
• DOIs, PMIDs, URLs, URIs
22. Altmetrics
… are calculated from newer data sources, e.g.
Wikipedia, Mendeley, Twitter, Facebook, Weibo, blogs
… report the impact of a wider range of research
outputs, e.g. presentation slides, data sets, articles, code
38. The Debate
“Article-Level Metrics: An Ill-Conceived and Meretricious Idea” vs “Broaden your horizons: impact doesn’t need to be all about academic citations”
(Jeffrey Beall vs Euan Adie, Aug. 2013)
39. Moving forward
“to move out of its current pilot and proof-of-concept phase”
• Sloan Foundation grant for NISO initiative: creation of community-based standards; definitions, calculations, data classifications, and data-sharing practices
42. What do you think of when you
hear the term impact metrics?
“… I don't think that impact factors at the journal level necessarily captures the importance of the paper... I jokingly say that out of all the things I have published, the paper that I expect will have the longest life is the one that's got the lowest impact factor.”
Biology
43. What do you think of when you
hear the term impact metrics?
“Nothing. I don’t understand what you’re talking about. No clue. I understand ‘impact’, I understand those words separately. But not together.”
Mathematical and Computational Sciences
“Means nothing to me – zero.”
Political Science
44. What do you think of when you
hear the term impact metrics?
“I have a picture of that book that you find at the Robarts library on the 4th floor that is a collection of binders in colourful colours and when you open it, it looks like the phone book with the list of all the articles and researchers, so that is my idea of impact metrics.”
Language Studies
45. Have you considered/do you
use any alternative metrics?
“No. No trust in them.”
Mathematical and Computational Sciences
46. “… I give a lot of relevance to those metrics and I will tell you why… There was an event in the year 1900. It was some popular festivity in England and they were running a contest. There was a cow or a bull and people had to guess its weight, and the person who gets closest to the weight wins the cow, or something like that. And someone kept a record of the answers and it is very interesting. It was very wide-ranging, so let's say the weight of the cow was 500 kg. So the range was from 300-1000. The most accurate answer was the average of all the answers that people gave… So what is that telling you? It is telling you many things… It is telling you that this phase we are moving into, in which content is user generated, there is a lot of proof there. You have to remember, I am a mathematician, I don't care what people think, we build truth from the bottom up…”
Mathematical and Computational Sciences
48. Bringing it back to libraries
• Traditional metrics
  • Libraries provide the support
  • Libraries pay for the metrics
  • Libraries purchase the research output
• Altmetrics change the rules
  • Now what happens?
• It all depends on what libraries want to measure…
  • tweets and blogs about our buildings, spaces, and services
  • bookmarks of websites and LibGuides
  • likes on Facebook
  • downloads of librarian publications
- Three sections – time for relevant discussion/questions after each one. We are NOT here to explain metrics in detail, but rather to discuss conceptually what our role is and how we can help our faculty and grad students.
Mindy
- Not easily measured, and there is debate on the effectiveness of currently used measures.
The basic premise of most citation metrics is that the more an article is cited, the more impact it has. Counts of citations are then used to calculate a range of metrics that are believed to indicate the research impact of journals, journal articles, and the authors of those articles. Impact = reach and significance (based on assessment criteria from the UK Research Excellence Framework 2014).
- This is quantitative; what about the qualitative, particularly for the social sciences? DISCUSSION POINT
- Although this is the logical order, I will be talking about article-level metrics last, as they are the most discussed at this juncture and the most in flux.
Also: Immediacy Index and total cites.
The Journal Impact Factor is the average number of times articles from the journal published in the past two (or five) years have been cited in the JCR year.
The cited half-life is the median age of the articles that were cited in the JCR year: half of a journal's cited articles were published more recently than the cited half-life. Only journals cited 100 or more times in the JCR year have a cited half-life.
The Eigenfactor Score calculation is based on the number of times articles from the journal published in the past five years have been cited in the JCR year, but it also considers which journals have contributed these citations, so that highly cited journals influence the network more than lesser-cited journals.
The Article Influence Score determines the average influence of a journal's articles over the first five years after publication. It is calculated by dividing a journal's Eigenfactor Score by the number of articles in the journal, normalized as a fraction of all articles in all publications (1 = average).
The Immediacy Index is the average number of times an article is cited in the year it is published.
Total cites: the total number of citations to the journal in the JCR year.
SJR is weighted by the prestige of a journal: subject field, quality, and reputation of the journal have a direct effect on the value of a citation. SJR assigns relative scores to all of the sources in a citation network. Its methodology is inspired by the Google PageRank algorithm, in that not all citations are equal: a source transfers its own 'prestige', or status, to another source through the act of citing it. A citation from a source with a relatively high SJR is worth more than a citation from a source with a lower SJR.
SNIP is the ratio of a source's average citation count per paper to the citation potential of its subject field. The citation potential of a source's subject field is the average number of references per document citing that source; it represents the likelihood of being cited for documents in a particular field. A source in a field with a high citation potential tends to have a high impact per paper.
The h5-index is the h-index for articles published in the last 5 complete years. The h5-median for a publication is the median number of citations for the articles that make up its h5-index.
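The h5-index and h5-median definitions can be sketched as code, which may help when walking faculty through them; the citation counts below are hypothetical:

```python
import statistics

def h5_metrics(citations):
    """Given citation counts for a venue's papers from the last 5
    complete years, return (h5-index, h5-median):
    h5-index = h-index over those papers;
    h5-median = median citations of the papers in the h5 core."""
    counts = sorted(citations, reverse=True)
    h5 = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h5 = rank
    core = counts[:h5]                      # the papers forming the h5 core
    return h5, (statistics.median(core) if core else 0)

# hypothetical counts for seven papers from the last five years:
print(h5_metrics([40, 25, 18, 9, 7, 7, 3]))  # (6, 13.5)
```

Here six papers have at least six citations each, so h5 = 6, and the median of those six papers' counts is 13.5.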
Rita and Gail’s slides
“Impact factors declared unfit for duty” http://blogs.lse.ac.uk/impactofsocialsciences/2013/05/21/impact-factors-declared-unfit-for-duty/
“Do not resuscitate: the journal impact factor declared dead” http://theconversation.com/do-not-resuscitate-the-journal-impact-factor-declared-dead-14480
“Just say no to impact factors” http://www.theguardian.com/science/political-science/2013/may/17/science-policy
DORA http://am.ascb.org/dora/
Note: To use citation counts, a set timeframe (usually 1 year) must be included. Calculations are based on the data within each database, so calculations of the same metric will differ, as the papers and their citation counts differ with coverage. To expand: in our experience, for faculty whose subject area does not lend itself to these providers, e.g. linguistics, we have gone to the web to find other sources and helped them calculate their own h-index. The i10-index indicates the number of academic papers an author has written that have at least ten citations from others.
Reference to grad students. Highlight the limits of the h-index (plus it only uses data from its own database, so each provider may calculate a different value).
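For faculty calculating their own metrics by hand, the i10-index mentioned above is the simplest of all; a one-line sketch (the citation counts are made up):

```python
def i10_index(citations):
    """Google Scholar's i10-index: the number of papers with at
    least ten citations each."""
    return sum(1 for cites in citations if cites >= 10)

# hypothetical author with seven papers, three of which clear 10 citations:
print(i10_index([3, 6, 100, 4, 1, 12, 10]))  # 3
```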
- Teaser for next segment – other possibilities…
pam
Priem, J & Hemminger, B. H. (2010) Scientometrics 2.0: New metrics of scholarly impact on the social Web. First Monday 15 (7). Retrieved from http://dx.doi.org/10.5210%2Ffm.v15i7.2874
“on the slow side”: Open access on the conference circuit, posted on July 6, 2013 by Stephen Curry, Reciprocal Space http://occamstypewriter.org/scurry/
Global Research Council Action Plan http://grc.s2nmedia.com/sites/default/files/pdfs/grc_action_plan_open_access%20FINAL.pdf
Science Europe Statement http://www.scienceeurope.org/uploads/Public%20documents%20and%20speeches/SE_OA_Pos_Statement.pdf
House of Commons, Business, Innovation and Skills Committee, Fifth Report, Open Access http://www.publications.parliament.uk/pa/cm201314/cmselect/cmbis/99/9902.htm
Government and funder mandates are driving OA, e.g. the Finch report and funding now available from the RCUK; the FASTR bill introduced to Congress Feb. 14.
Picking up speed: 1200 journals added to DOAJ in 2012; 8566 journals (Jan. 2013); new open monograph publishing options.
All disciplines: MLA endorsement of OA; not just the sciences.
Different versions: pre, post, final. Multiple copies.
New modes of publishing: providing new ways to distribute research; new tools for authors for self-publishing; use of Creative Commons licences; a new look at metrics.
http://www.sherpa.ac.uk/romeo/statistics.php?la=en&fIDnum=|&mode=simple
ORCID (Open Researcher and Contributor ID), subset of International Standard Name Identifier http://orcid.org
Google Scholar http://scholar.google.com/citations
AuthorClaim http://authorclaim.org/
Names Project (JISC) http://names.mimas.ac.uk/
INSPIRE http://inspirehep.net/collection/
Enis, Matt. As University of Pittsburgh Wraps Up Altmetrics Pilot, Plum Analytics Announces Launch of Plum X. Library Journal, February 5, 2013.
PeerEvaluation, ResearchScorecard, ReaderMeter, Scholarometer
http://article-level-metrics.plos.org/researchers/
Widget for WordPress provides ALMs for one or more articles (http://wordpress.org/plugins/plos-alm-widget/)
From: Lin & Fenner. Altmetrics in evolution: defining and redefining the ontology of article-level metrics. Information Standards Quarterly, (2013) 25(2), 25.
“Single aggregated score for all metrics” Lin & Fenner 2013, p. 26.
Altmetric data from http://support.altmetric.com/knowledgebase/articles/83335-which-data-sources-does-altmetric-track-
www.plumanalytics.com/metrics.html
Author or group profiles with details and summaries of metrics.
“Artifact” or item view with altmetric counts.
Author or group visualizations weighted by impact.
Author, group, and artifact-level widgets.
Tracks second-level metrics, e.g. tweets of a news article about X.
Proposal to study, propose, and develop community-based standards or recommended practices in the field of alternative metrics. Todd Carpenter (NISO) and Nettie Lagace (NISO), March 19, 2013.
Sud & Thelwall: “to identify contexts in which it is reasonable to use them” (p. 11).
See also the ACUMEN EU-funded project (Academic Careers Understood through Measurement and Norms) http://research-acumen.eu/about
Mindy
- Teaching grads, and comments from grads about faculty thoughts on metrics; consider each a faculty workshop (talked to department heads); pressures in other countries; altmetrics catch our attention.