Evaluation of research data services: What things should we evaluate? What are the proper metrics for judging success?

Charlotte L. Flynn, Syracuse University
charflynn5@gmail.com
References

Ball, R. J., & Medeiros, N. (2012). Teaching integrity in empirical research: A protocol for documenting data management and analysis. Journal of Economic Education, 43(2), 182-189.
Shadish, W. R., Cook, T. D., & Leviton, L. C. (1991). Foundations of program evaluation: Theories of practice. Sage.
Smith, N. L., & Brandon, P. R. (Eds.). (2008). Fundamental issues in evaluation. Guilford Press.
Trochim, W., Urban, J. B., Hargraves, M., Hebbard, C., Buckley, J., Archibald, T., Johnson, M., & Burgermaster, M. (2012). The guide to the systems evaluation protocol. Cornell Digital Print Services.
Urban, J. B., & Trochim, W. (2009). The role of evaluation in research-practice integration: Working toward the "Golden Spike." American Journal of Evaluation, 30(4), 538-553.
Example: TIER (Teaching Integrity in Empirical Research)
http://www.haverford.edu/TIER/
• A protocol for teaching students who are doing original analysis of quantitative data to document the steps of data management and analysis to facilitate replicability (see the sketch after this list).
• Students learn documentation experientially, in the context of learning research methods.
• For use in situations where:
  - Students do research projects using real data to engage in statistical inquiry (not prescribed analysis using assigned data).
  - Students receive instructor feedback and mentoring over the course of their project.
• Developed and implemented collaboratively over the past 10
years by Richard Ball, an economics faculty member, and Norm
Medeiros, a librarian, at Haverford College.
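For a concrete flavor of what "documenting the steps" means in practice, here is a minimal, hypothetical sketch of a TIER-style command file in Python. File names, variables, and directory layout are invented for illustration; the protocol itself, at the URL above, specifies the actual required components.

```python
"""Hypothetical sketch of a TIER-style command file.

TIER asks students to keep original data untouched and to record every
data-management and analysis step in re-runnable command files. Everything
below is illustrative, not TIER's prescribed layout.
"""
from pathlib import Path

import numpy as np
import pandas as pd

ORIGINAL = Path("original-data/survey_raw.csv")    # read-only original data
ANALYSIS = Path("analysis-data/survey_clean.csv")  # constructed by this script
RESULTS = Path("results/summary.csv")              # every reported number


def build_analysis_data() -> pd.DataFrame:
    """Document each transformation from original data to analysis data."""
    df = pd.read_csv(ORIGINAL)
    # Step 1: drop observations with missing income.
    df = df.dropna(subset=["income"])
    # Step 2: construct the variable actually used in the analysis.
    df["log_income"] = np.log(df["income"])
    ANALYSIS.parent.mkdir(exist_ok=True)
    df.to_csv(ANALYSIS, index=False)
    return df


def run_analysis(df: pd.DataFrame) -> None:
    """Produce the results file from the documented analysis data."""
    summary = df.groupby("major")["log_income"].agg(["count", "mean", "std"])
    RESULTS.parent.mkdir(exist_ok=True)
    summary.to_csv(RESULTS)


if __name__ == "__main__":
    run_analysis(build_analysis_data())
```

The point is experiential: the script is the documentation, so a reader can re-run the exact path from the original data to every reported number.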
We need a better way of conceptualizing evaluation than asking, "What metrics should we use?"
This question presumes:
• Evaluation = measurement.
• Uniformity of measurement and evaluation questions across projects.
• Evaluation as an afterthought.
• Measurement involves passive data collection that is the same for all projects at all phases of project development.
• Measurement does not involve collecting structured data in the context of an evaluation study.
• It is possible to measure project impact outside the context of an evaluation study and without taking into account the logic of how program activities lead to long-term outcomes.
This approach to evaluation of RDM interventions is a blunt
instrument that lacks mechanisms for informing improvement or
meaningful comparison of interventions.
If we hope to build an infrastructure of research data support,
more robust evaluation is in order. In the professional program
evaluation community, measurement choices depend upon
context. Central considerations are stakeholder questions and the
intended use of evaluation findings.
Problem & Solution
• An emerging role for information professionals in higher
education is supporting researchers in managing data.
• Evaluation of data support services and tools is limited.
• This inhibits both local project improvement and larger-scale decision-making.
• Local project and broad policy evaluation information needs
differ in emphasis and timeframe.
• Solution: Theory-based local evaluation supported by a
cyberinfrastructure that builds local evaluation capacity and a
research base that connects local activities to long-term
outcomes. Urban and Trochim’s (2009) Systems Evaluation
Protocol (SEP) is one such approach.
Factors to take into account before deciding what to measure
• Description
  - Key activities/elements participants experience that contribute to outcomes.
  - Informed by differences in information or values that shape stakeholder perceptions of the program.
• Phase of development
  - Life cycle analysis.
• Theory of change
  - Expresses stakeholder understanding of how the program brings about change.
  - Logic model → pathway model.
• Program-system links
  - Evidence that links local activities to long-term outcomes.
• Evaluation scope
  - Feasibility, credibility, accuracy, usefulness.
TIER Logic Model
Columns: Inputs → Activities → Outputs → Short-term outcomes → Medium-term outcomes → Long-term outcomes

Row 1
• Inputs: TIER protocol (procedures for assembling files to document analysis & data management → replicability).
• Activities: Faculty require students to follow the TIER protocol to document their data management and analysis.
• Outputs: Student research outputs are documented for replicability.
• Short-term outcomes: Students gain a more in-depth understanding of research methods.
• Medium-term outcomes: Students adopt documentation practices.
• Long-term outcomes: Professional research is documented.

Row 2
• Inputs: Instructor who devotes time to offering feedback on student projects.
• Activities: Faculty use student documentation of analysis & data management to offer better feedback on student work.
• Outputs: Higher-quality final student projects (e.g., fewer errors).
• Short-term outcomes: Students see the benefit of documentation (e.g., use it to find and correct errors).
• Medium-term outcomes: Increased researcher interest in repository ingest.
• Long-term outcomes: Professional research is of higher quality.

Row 3
• Inputs: Subject liaison librarian.
• Activities: Librarian supports students in implementing TIER.
• Outputs: Repository ingest of student research (study, data, & documentation).
• Short-term outcomes: Students view librarians as a source of data management support.
• Medium-term outcomes: Increased researcher-librarian data management collaboration.
• Long-term outcomes: Professional research data is publicly shared.

Row 4
• Inputs: Platforms for sharing outputs during the project and after.
• Activities: Librarian supports faculty with infrastructure to manage student output.
• Outputs: Greater librarian & faculty shared understanding of community documentation practices.
• Long-term outcomes: Opportunities for inquiry are expanded.
Context: TIER was developed for use in higher education contexts where faculty supervise student empirical research projects that use real data, offering feedback over the course of the project. It can be used in quantitative social science methods courses (economics, sociology, political science, psychology) or to advise thesis projects, and it works for individual or group projects.

Assumptions: Faculty in higher education settings that emphasize quality teaching (e.g., liberal arts colleges) are receptive to implementing innovations that optimize instructional quality. Academic library services are often structured to facilitate librarian-faculty collaboration via subject liaisons, which affords an opportunity to offer students experiential learning of data management best practices.
Phases of Development
• Programs have life cycles.
  - Less stability early on; trial and error.
  - Programs tend to stabilize as they mature.
  - Program phases: initiation, development, stability, dissemination.
• Evaluation should have a life cycle too.
  - Early on: rapid feedback for program improvement.
  - Once the program stabilizes and is disseminated, evaluation should entail greater social science rigor.
  - Over the course of a program, evaluation questions attend first to process, then change, then comparison, then generalization (see the sketch after this list).
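As a toy illustration of that pairing (the phase names and the process → change → comparison → generalization sequence come from the poster; the mapping structure itself is just my sketch, not part of the SEP):

```python
# Toy sketch: pairing the program life cycle with an evaluation life cycle.
# Phase names and the question sequence are from the poster; the dict and
# helper are illustrative constructions.
EVALUATION_FOCUS = {
    "initiation": "process: rapid feedback for program improvement",
    "development": "change: is the program producing its intended outcomes?",
    "stability": "comparison: how does it fare against alternatives?",
    "dissemination": "generalization: do effects hold across sites and populations?",
}


def evaluation_focus(phase: str) -> str:
    """Return the evaluation emphasis suited to a given program phase."""
    try:
        return EVALUATION_FOCUS[phase.lower()]
    except KeyError as err:
        raise ValueError(f"unknown phase: {phase!r}") from err


if __name__ == "__main__":
    for phase in EVALUATION_FOCUS:
        print(f"{phase:13} -> {evaluation_focus(phase)}")
```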
TIER Pathway Model
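The poster's pathway-model figure is not reproduced in this text. As a minimal sketch of the underlying idea (node names paraphrase one row of the logic model above; the edge set is illustrative, not the poster's actual diagram), a pathway model can be stored as a directed graph, which lets an evaluator trace exactly which long-term claims depend on a given activity:

```python
# Minimal sketch: a pathway model as a directed graph. Nodes paraphrase one
# row of the TIER logic model above; edges are illustrative, not the
# poster's full pathway diagram.
from collections import deque

EDGES = {
    "TIER protocol": ["faculty require TIER documentation"],
    "faculty require TIER documentation": ["student outputs documented for replicability"],
    "student outputs documented for replicability": ["deeper student understanding of research methods"],
    "deeper student understanding of research methods": ["students adopt documentation practices"],
    "students adopt documentation practices": ["professional research is documented"],
    "professional research is documented": [],
}


def downstream(node: str) -> list[str]:
    """Breadth-first walk: every outcome reachable from a node, i.e. the
    chain of claims an evaluation of that node would need evidence for."""
    seen: set[str] = set()
    order: list[str] = []
    queue = deque(EDGES[node])
    while queue:
        nxt = queue.popleft()
        if nxt not in seen:
            seen.add(nxt)
            order.append(nxt)
            queue.extend(EDGES[nxt])
    return order


if __name__ == "__main__":
    for outcome in downstream("TIER protocol"):
        print(outcome)
```

Encoding the pathways explicitly is what turns "what metrics should we use?" into "which causal link are we trying to test at this phase?"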