How the ViBRANT and eMonocot projects are building tools, including a modified implementation of Bourne and Fink's 'Scholar Factor', the Biodiversity Data Journal, and Scratchpad's user metrics and statistics modules.
9. ViBRANT and eMonocot
Projects based around communities
Provide tools for communities
• Quantify user contribution to multi-user projects
• Quantify how content generated by the community is used/reused
14. Citizen Science
• COMBER
• anymals+plants
• Citizen Science profile
Metrics allow for more competitive activities and for users to quantify their involvement
23. Who needs impact metrics?
• The authors of data (“Am I useful?”)
• People who employ data authors (“Do we have our priorities right?”)
• People who reuse the data (“Hidden gem?” “Avoid like the plague?”)
32. But the people weren’t happy.
“Of all the things I do, you only care about papers?”
“The only way you value my papers is by citation in other papers?”
“The impact of my career is measured by one number?”
37. But there are still problems
“This STILL only deals with my publications!”
“How do I get credit for data?”
“I wrote some useful biodiversity software – what about that?”
45. Are alt metrics useful?
[Annotations on the altmetric figure: “These measure broader public impact” / “These are fairly scholarly”]
• A good start
• Much broader than just scholarly citations
• Actually ignores traditional citations
• Still only applied to papers
53. Biodiversity Data Journal
Data papers (descriptions of data)
• Allows data to become citable
• Allows data to participate in standard credit/metric systems
• … allows data to participate in alt metrics
59. Scratchpads Statistics Module
What content exists on a Scratchpad site?
• Filter by user or taxonomic term
• How much content is there?
• What kind of content is it?
• How many registered users?
• How often is the content viewed?
65. Scratchpads Metrics Module
Puts the Scratchpad content in a broader context
• Users and content
• Opt-in (not enabled by default)
• Modular (pick and choose what to include)
• Available by end of the year
71. Scratchpads User Metrics
A partial (modified) implementation of the Scholar Factor proposed by Bourne and Fink
Inputs to the Scholar Factor:
• Citations
• Software / Data
• Grant / manuscript reviews
90. Summary
We’re not there yet, but we have…
• Modular framework to add services as they become available
• Multi-dimensional – not just papers and citations
• Not a one-size-fits-all approach
Editor’s Notes
This work has been done under the ViBRANT and eMonocot projects. Much of it relates to Scratchpads, which are, in simple terms, an online content management system for biodiversity data. (If you really want to know more, you may want to leave and go next door, as there is a summary of the project going on there.)
These projects focus on data collection and curation, and making it available for reuse. Both projects use Scratchpads for a large part of this process.
In all large projects we like to know how well we are doing. But this is a pretty selfish use of statistics and metrics.
Communities are the core of these projects. We build tools for communities.
So we also need to consider how we build metrics tools for communities.
This might mean allowing people to see how much they have contributed to a project, or, conversely, letting people who use the project see who has contributed.
We’d like a way to show people how the content they have generated is being used: who links to it, has it been shared socially, has it been traditionally cited, have people reused the data for novel purposes.
As part of ViBRANT there are a few citizen science projects, so we also need to consider how we build metrics tools for these communities.
Collecting observation data via mobile phone
We’re working on a citizen science platform for Scratchpads
Metrics provide an added layer of incentive for citizen scientists – they could form the basis of a gamification reward system and allow people to quantify their involvement.
People who reuse their data. All of these different people need metrics to answer specific questions.
Perhaps the stereotypical metric: the h-index.
We’ve found a way of creating a number out of one aspect of our output, and many people use it as a proxy for the rest of what scientists do.
There has been a move towards alternative metrics; this example uses the Altmetric service.
But there are still some problems with this method
Some parts of the metric, like Mendeley, can be considered to be fairly scholarly
Scratchpads metrics module
Scratchpads statistics module
It’s possible to publish data from a Scratchpad in the recently launched Biodiversity Data Journal, so an opportunity for some metrics activity.
Through a CrossRef DOI.
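Once a data paper has a CrossRef DOI, building a citable reference string from its metadata is mechanical. A minimal sketch in Python, where the record fields and the DOI shown are placeholders, not a real Biodiversity Data Journal entry:

```python
# Hypothetical data-paper record; the DOI is a placeholder, not a real one.
paper = {
    "authors": "Smith J, Jones K",
    "year": 2013,
    "title": "Checklist of example taxa",
    "journal": "Biodiversity Data Journal",
    "doi": "10.xxxx/example",
}

def citation(p):
    """Render a data paper as a citable string keyed on its CrossRef DOI."""
    return (f'{p["authors"]} ({p["year"]}) {p["title"]}. '
            f'{p["journal"]}. https://doi.org/{p["doi"]}')
```

Because the DOI resolves to the data paper, anything that counts links or citations to the DOI automatically counts towards the data.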
The Scratchpads statistics module provides information about the content on a given Scratchpad.
Filtering by user or taxonomic term means it’s easy to find out how much people have contributed, and what content is available for a given taxon.
If you’re looking for references or images you know how rich a resource this site might be.
Might give an indication of how active a community is
Do other people find the content useful?
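The kind of query the statistics module answers could be sketched like this. The field names and sample records below are hypothetical, not the actual Scratchpads (Drupal) schema:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Node:
    author: str
    taxon: str
    kind: str   # e.g. "image", "reference", "page"
    views: int

# Illustrative sample content for a Scratchpad site.
nodes = [
    Node("alice", "Poaceae", "image", 120),
    Node("alice", "Poaceae", "reference", 40),
    Node("bob", "Orchidaceae", "image", 75),
]

def site_statistics(nodes, user=None, taxon=None):
    """Filter content by user or taxonomic term, then summarise it."""
    selected = [n for n in nodes
                if (user is None or n.author == user)
                and (taxon is None or n.taxon == taxon)]
    return {
        "total": len(selected),
        "by_kind": dict(Counter(n.kind for n in selected)),
        "views": sum(n.views for n in selected),
    }

stats = site_statistics(nodes, taxon="Poaceae")
# stats == {"total": 2, "by_kind": {"image": 1, "reference": 1}, "views": 160}
```

The same summary, filtered by user instead of taxon, answers the "how much has this person contributed?" question.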
This isn’t perfect; in fact, it’s pretty experimental, so it will be on an opt-in basis.
Not every community wants the same set of criteria.
Traditional citations still play a part.
You also get credit for software you have written and datasets that you have created.
Also get credit for grant and manuscript reviews. This is perhaps the trickiest part – but we are working with Pensoft to include this for reviews of papers in their journals.
Some of these things take more effort than others, so they are weighted.
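A weighted aggregation like the one described might look like the sketch below. The weight values are invented for illustration; they are not the figures used by Bourne and Fink or by the Scratchpads implementation:

```python
# Illustrative weights only; the real Scholar Factor weighting may differ.
WEIGHTS = {"citations": 1.0, "software_data": 2.0, "reviews": 1.5}

def scholar_factor(counts, weights=WEIGHTS):
    """Weighted sum of a user's contributions across activity types."""
    return sum(weights.get(activity, 0.0) * n for activity, n in counts.items())

score = scholar_factor({"citations": 10, "software_data": 3, "reviews": 4})
# 10*1.0 + 3*2.0 + 4*1.5 = 22.0
```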
So we have seen the theoretical model; now for the implementation we have created.
The user metrics module deals with the aggregation, weighting and display of metrics data.
A number of helper modules are responsible for getting the required data from external sources.
It’s easy to write a module that can supply data – maybe as little as 20 lines of code.
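The helper-module idea could be sketched as a small plug-in registry. The real modules are Drupal (PHP) code, so everything below (the decorator name, the source name, the returned number) is purely illustrative:

```python
# Registry of metric sources; each helper module contributes one entry.
SOURCES = {}

def metric_source(name):
    """Register a function that fetches one metric for a user."""
    def register(fn):
        SOURCES[name] = fn
        return fn
    return register

@metric_source("mendeley_readers")
def mendeley_readers(user_id):
    # A real helper would call an external service here; this returns a stub.
    return 42

def collect_metrics(user_id):
    """Aggregate every registered source for one user."""
    return {name: fn(user_id) for name, fn in SOURCES.items()}
```

Adding or removing a data source is then just registering or dropping one small function, which is the mix-and-match, future-proofing property the modular design aims for.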
The modular design allows for a mix-and-match set of functionality that also has some degree of future-proofing. As a maybe slightly extreme example…
… we could add Facebook…
… and remove legacy systems.
One of the most basic web metrics available
We have done studies on links to Scratchpad content in articles on Google Scholar – and there has been a noticeable increase in recent years.
We also take some ideas from the altmetrics community.