Todd Carpenter's presentation at the Electronic Resources in Libraries (ER&L) conference in Austin, TX, February 23, 2015. Todd discussed how new forms of assessment are entering the mainstream and why those metrics shouldn't be considered "alternative" any longer. The session also covered the advances made in 2014 on the NISO Alternative Assessment initiative, which was generously funded by the Alfred P. Sloan Foundation.
Carpenter - Let's remove "alt" from altmetrics - ER&L Presentation
1. “Alt” is German for “old”: It’s time to stop talking about metrics in terms of “alternative” metrics.
Todd Carpenter
Executive Director, NISO
February 23, 2015
2. About
• Non-profit industry trade association accredited by ANSI
• Mission of developing and maintaining technical standards related to the information, documentation, discovery, and distribution of published materials and media
• Volunteer-driven organization: 400+ contributors spread across the world
• Responsible (directly and indirectly) for standards like ISSN, DOI, Dublin Core metadata, DAISY digital talking books, OpenURL, MARC records, and ISBN
February 23, 2015
7. Since Jason Priem coined the term
• There have been some 6,200 scholarly articles (per Google Scholar)
• There have been 8 altmetrics conferences
• Even I have presented on altmetrics 12 times! (Just kidding)
8. More than just popularity
Research is pointing to the fact that there is a modest positive correlation between early-signal metrics (altmetrics) and later-signal metrics (citations).
Do altmetrics correlate with citations? Extensive comparison of altmetric indicators with citations from a multidisciplinary perspective, by Rodrigo Costas, Zohreh Zahedi, & Paul Wouters
Do Altmetrics Work? Twitter and Ten Other Social Web Services, by Mike Thelwall, Stefanie Haustein, Vincent Larivière, & Cassidy R. Sugimoto
Earlier web usage statistics as predictors of later citation impact, by Brody T, Harnad S, & Carr L (2006)
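The correlation claim above can be made concrete with a small sketch. Everything here is illustrative: the per-article counts are invented, and Spearman rank correlation is just one common choice for comparing an early signal against later citations (the studies cited each use their own data and methods).

```python
# Sketch: measuring a "modest positive correlation" between an
# early-signal metric (e.g., tweet counts) and later citation counts.
# The data below is made up for illustration only.
from statistics import mean

def spearman_rho(x, y):
    """Spearman rank correlation, with average ranks for ties."""
    def ranks(vals):
        order = sorted(range(len(vals)), key=lambda i: vals[i])
        r = [0.0] * len(vals)
        i = 0
        while i < len(order):
            j = i
            # extend j over a run of tied values
            while j + 1 < len(order) and vals[order[j + 1]] == vals[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1  # average 1-based rank for the tie group
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r
    rx, ry = ranks(x), ranks(y)
    mx, my = mean(rx), mean(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-article counts: early tweets vs. citations two years later
tweets    = [12, 3, 45, 0, 7, 22, 5, 31]
citations = [ 8, 2, 20, 1, 4,  2, 15, 10]
print(round(spearman_rho(tweets, citations), 2))  # → 0.65, a modest positive rho
```

In practice one would use a library routine (e.g., SciPy's `spearmanr`) on real datasets; the point is only that "modest positive correlation" means a rank correlation well above zero but well below 1.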
9. Would a researcher focus on only one data source or methodological approach?
Duke University - Information Initiative at Duke (IID)
10. There aren’t metrics and “altmetrics”; there are only metrics!
11. We have been using non-citation-based metrics for decades
23. Are we measuring scholarship using “inches” or “meters”?
Image: Flickr user karindalziel
24. I often sound like a broken record
• Defining what is to be counted = standards
• How to describe what to count = standards
• Identification of what to count = standards
• Procedures for counting or not = standards
• Aggregating counts from the network = standards
• Exchange of what was counted = standards
27. Alternative Assessment Initiative Phase 1
Meetings:
October 9, 2013 - San Francisco, CA
December 11, 2013 - Washington, DC
January 23-24 - Philadelphia, PA
Round of 1-on-1 interviews - March/April
Phase 1 report published in June 2014
28. Meeting Lightning Talks
• Expectations of researchers
• Exploring disciplinary differences in the use of social media in scholarly communication
• Altmetrics as part of the services of a large university library system
• Deriving altmetrics from annotation activity
• Altmetrics for Institutional Repositories: Are the metadata ready?
• Snowball Metrics: Global Standards for Institutional Benchmarking
• International Standard Name Identifier
• Altmetric.com, Plum Analytics, Mendeley reader survey
• Twitter Inconsistency
“Lightning” by snowpeak is licensed under CC BY 2.0
34. Potential work themes
• Definitions
• Application to types of research outputs
• Discovery implications
• Research evaluation
• Data quality and gaming
• Grouping, aggregating, and granularity
• Context
• Adoption
42. Potential work themes
• Definitions
• Application to types of research outputs
• Discovery implications
• Research evaluation
• Data quality and gaming
• Grouping, aggregating, and granularity
• Context
• Adoption
• Promotion
43. Alternative Assessment Initiative Phase 2
Presentations of Phase 1 report (June 2014)
Prioritization Effort (June - Aug 2014)
Project approval (Sept 2014)
Working group formation (Oct 2014)
Consensus Development (Nov 2014 - Dec 2015)
Trial Use Period (Dec 2015 - Mar 2016)
Publication of final recommendations (Jun 2016)
44. Community Feedback on Project Idea Themes (n=118)
[Chart: share of responses at each level, from “Unimportant” and “Of little importance” through “Moderately important,” “Important,” and “Very important”; y-axis 0-100%]
46. Top-ranked ideas (“very important” + “important” ≥ 70%)
• 87.9% - 1. Develop specific definitions for alternative assessment metrics.
• 82.8% - 10. Promote and facilitate use of persistent identifiers in scholarly communications.
• 80.8% - 12. Develop strategies to improve data quality through normalization of source data across providers.
• 79.8% - 4. Identify research output types that are applicable to the use of metrics.
• 78.1% - 6. Define appropriate metrics and calculation methodologies for specific output types, such as software, datasets, or performances.
• 72.5% - 13. Explore creation of standardized APIs or download or exchange formats to facilitate data gathering.
• 70.7% - 11. Research issues surrounding the reproducibility of metrics across providers.
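The rankings above are the share of respondents rating an idea “Important” or “Very important.” A minimal sketch of that tally, with invented counts for a single idea (only the five-point scale and the n=118 total come from the slides):

```python
# Sketch of the survey tally behind the ranked ideas: the combined share
# of "Important" + "Very important" responses. Counts are hypothetical.
LEVELS = ["Unimportant", "Of little importance", "Moderately important",
          "Important", "Very important"]

def top_two_share(counts):
    """Fraction of respondents choosing the top two importance levels."""
    total = sum(counts.values())
    top = counts.get("Important", 0) + counts.get("Very important", 0)
    return top / total if total else 0.0

# Hypothetical tallies for one project idea (n = 118 respondents)
idea = {"Unimportant": 2, "Of little importance": 4,
        "Moderately important": 9, "Important": 41, "Very important": 62}
print(f"{top_two_share(idea):.1%}")  # → 87.3% for these made-up counts
```

Each of the seven listed percentages is this statistic computed over the same 118 responses for a different idea.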
47. Alternative Assessments of our Assessment Initiative
• White paper downloaded 6,400 times
• 21 substantive comments received
• 120 in-person and virtual participants at the meetings
• These 3 meetings attracted 400 RSVPs for the live stream
• Goal was to generate about 40 ideas; in total, more than 250 were generated
• Project materials downloaded more than 22,000 times
• More than 500 direct tweets using the #NISOALMI hashtag
• Survey ranking of output by 118 people
• Six articles in traditional news publications
• 15 blog posts about the initiative