Todd Carpenter's talk at the Elsevier booth during the ALA Annual Conference in Las Vegas on the NISO Alternative Assessment Project and the results of the project's first phase.
1. Comparing digital apples to digital apples: An update on NISO's Alternative Assessment Initiative
Todd Carpenter
Executive Director, NISO
ALA Annual
June 28, 2014
2. About
• Non-profit industry trade association accredited by ANSI
• Mission of developing and maintaining technical standards related to information, documentation, discovery and distribution of published materials and media
• Volunteer-driven organization: 400+ contributors spread out across the world
• Responsible (directly and indirectly) for standards like ISSN, DOI, Dublin Core metadata, DAISY digital talking books, OpenURL, MARC records, and ISBN
16. Steering Committee
• Euan Adie, Altmetric
• Amy Brand, Harvard University
• Mike Buschman, Plum Analytics
• Todd Carpenter, NISO
• Martin Fenner, Public Library of Science (PLoS) (Chair)
• Michael Habib, Reed Elsevier
• Gregg Gordon, Social Science Research Network (SSRN)
• William Gunn, Mendeley
• Nettie Lagace, NISO
• Jamie Liu, American Chemical Society (ACS)
• Heather Piwowar, ImpactStory
• John Sack, HighWire Press
• Peter Shepherd, Project COUNTER
• Christine Stohn, Ex Libris
• Greg Tananbaum, SPARC (Scholarly Publishing Academic Resources Coalition)
17. Alternative Assessment Initiative Phase 1 Meetings
• October 9, 2013 - San Francisco, CA
• December 11, 2013 - Washington, DC
• January 23-24, 2014 - Philadelphia, PA
• Round of 1-on-1 interviews - March/April 2014
• Phase 1 report published June 9, 2014
18. Meetings' General Format
• Collocated with other industry meetings
• Morning: lightning talks, post-it brainstorming
• Afternoon: discussion groups
  – X
  – Y
  – Z
  – Report back/react
• Live streamed (video recordings are available)
19. Meeting Lightning Talks
• Expectations of researchers
• Exploring disciplinary differences in the use of social media in scholarly communication
• Altmetrics as part of the services of a large university library system
• Deriving altmetrics from annotation activity
• Altmetrics for Institutional Repositories: Are the metadata ready?
• Snowball Metrics: Global Standards for Institutional Benchmarking
• International Standard Name Identifier
• Altmetric.com, Plum Analytics, Mendeley reader survey
• Twitter Inconsistency
("Lightning" by snowpeak is licensed under CC BY 2.0)
21. SF Meeting Discussions
• Business Use Cases
  – Publishers want to serve authors, make money
  – People don't value a standard; they value something that helps them
  – … Couldn't identify a logical standards need that actors in the space would value, and best practices are of interest
• Quality Data Science
  – Themes: context, validation, provenance, quality, description metadata
  – We'll never get to the point where assessment can be done without a human in the loop, but discovery and recommendation can
• Definitions
  – Define "ALM" and "Altmetrics"
  – Map the landscape
22. DC Meeting Discussions
• Business and Use Cases
• Discovery - metrics only get generated if material is discovered
• Qualitative vs. Quantitative
• Identifying Stakeholders and their Values
  – stakeholders in outcomes / stakeholders in process of creating metrics
  – shared values but tensions
  – branding
• Definitions/Defining Impact
  – metrics and analyses
  – what led to success of citation?
  – how to be certain we are measuring the right things
• Future Proofing
  – what won't change
  – impact - hard to establish across disciplines
23. Philly Meeting Discussions
• Definitions
  – Define life cycle of scholarly output and associated metrics
  – Qualitative versus quantitative aspects - what is possible to define here
  – Consider other aspects of these data collections
• Standards
  – Develop definitions (what is a download? what is a view?)
  – Differentiate between scholarly impact versus popular/social use
  – Define sources/characteristics for metrics (social, commercial, scholarly)
• Data Integrity
  – Counter biases/gaming
  – Association with credible entities - e.g. ORCID iD vs. Gmail account
  – Reproducibility is key
  – Everyone needs to be at the table to establish overall credibility
• Use cases (3X)
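The "what is a download? what is a view?" question is the kind of definitional work Project COUNTER already does for usage statistics. As a purely illustrative sketch (not NISO's or COUNTER's actual rule set; the window length, event shape, and function name are assumptions), a double-click filter that collapses repeat requests into one countable event might look like:

```python
from datetime import datetime, timedelta

# Illustrative window only; the COUNTER Code of Practice defines the
# real double-click intervals per content format.
DOUBLE_CLICK_WINDOW = timedelta(seconds=30)

def count_downloads(events):
    """Collapse repeat requests by the same user for the same item that
    fall within DOUBLE_CLICK_WINDOW of the last counted request."""
    counted = []
    last_counted = {}  # (user, item) -> timestamp of last counted event
    for user, item, ts in sorted(events, key=lambda e: e[2]):
        key = (user, item)
        prev = last_counted.get(key)
        if prev is None or ts - prev > DOUBLE_CLICK_WINDOW:
            counted.append((user, item, ts))
            last_counted[key] = ts
    return counted

if __name__ == "__main__":
    t0 = datetime(2014, 6, 28, 9, 0, 0)
    raw = [
        ("u1", "doi:10.1000/1", t0),
        ("u1", "doi:10.1000/1", t0 + timedelta(seconds=10)),   # double-click: filtered
        ("u1", "doi:10.1000/1", t0 + timedelta(seconds=120)),  # new request: counted
        ("u2", "doi:10.1000/1", t0),                           # different user: counted
    ]
    print(len(count_downloads(raw)))  # 3
```

Even a toy rule like this shows why shared definitions matter: two providers that disagree only on the window length will report different "download" counts for identical traffic.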
25. Potential work themes
• Definitions
• Application to types of research outputs
• Discovery implications
• Research evaluation
• Data quality and gaming
• Grouping, aggregating, and granularity
• Context
• Adoption
33. Potential work themes
• Definitions
• Application to types of research outputs
• Discovery implications
• Research evaluation
• Data quality and gaming
• Grouping, aggregating, and granularity
• Context
• Adoption
• Promotion
34. Alternative Assessment Initiative Phase 2
• Presentations of Phase 1 report (June 2014)
• Prioritization effort (June - August 2014)
• Project approval (September 2014)
• Working group formation (October 2014)
• Consensus development (November 2014 - December 2015)
• Trial use period (December 2015 - March 2016)
• Publication of final recommendations (June 2016)