For many years metadata development activities have focused on developing and sharing metadata for discovering data. This is important. Once data are discovered, metadata supporting use and understanding become important. Efforts to encourage scientists and data providers to create those metadata have had limited success. This talk describes some approaches and tools for supporting the organizational change efforts required to integrate use and understanding metadata into organizational cultures. These approaches are described in terms of the ideas presented in Switch: How to Change Things When Change is Hard.
Metadata Evaluation and Improvement
1. Metadata Evaluation and Improvement
Ted Habermann
Director of Earth Science
The HDF Group
thabermann@hdfgroup.org
What looks like a people problem is often a situation problem.
To change someone’s behavior, you’ve got to change that person’s situation.
January 8-10, 2014
ESIP Winter 2014
2. Change Efforts Have Two Targets
Direct the Rider
Motivate the Elephant
3. Point to the Destination
Complete Documentation Can Be Overwhelming
4. Point to the Destination
Documentation Rewards Card
Discovery
Use
Understanding
5. Find the Feeling
Documentation Rewards Card
Discovery
Use
Understanding
Your data are important for science.
Complete documentation makes them trustworthy.
6. Shrink the Change
Documentation Rewards Card
Discovery
Use
Understanding
Your data are important for science.
Complete documentation makes them trustworthy.
7. Shrink the Change
Documentation Rewards Card
Identification
Connection
Extent
Content
Lineage
Acquisition
Your data are important for science.
Complete documentation makes them trustworthy.
9. Script the Critical Moves
Discovery
  Identification: Id, Title, Abstract, Resource Date, Topic Category, Theme Keyword, Metadata Contact, Science Contact
  Extent: Geospatial Bounding Box, Temporal Start/End, Vertical Min/Max, Place Keywords
  Connection: OnlineResource Linkage (URL), Name, Description, Function
Understanding
  Text Searches: Purpose, Extent Description, Lineage Statement, Project Keywords
  Distribution: Distributor Contact, Online Resource, Distribution Format, Data Center Keywords, Browse Graphic
  Content Information: Attribute Type, Attribute Names, Attribute Definitions, Attribute Units
  Acquisition Information: Instrument, Platform, Instrument Keywords, Platform Keywords
  Quality/Lineage: Sources, Process Steps, Quality Reports / Coverages
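The element lists above can be turned into a simple per-spiral completeness check. This is an illustrative sketch only: the element names come from the slide, but the flat-dictionary record structure and the spiral-to-element mapping are assumptions, not the rubric's actual implementation.

```python
# Sketch of a per-spiral completeness check. Element names are taken from
# the slide; the record structure (a flat dict keyed by element name) is a
# hypothetical simplification of a real metadata record.

SPIRALS = {
    "Identification": ["Id", "Title", "Abstract", "Resource Date",
                       "Topic Category", "Theme Keyword",
                       "Metadata Contact", "Science Contact"],
    "Extent": ["Geospatial Bounding Box", "Temporal Start/End",
               "Vertical Min/Max", "Place Keywords"],
    "Connection": ["Linkage (URL)", "Name", "Description", "Function"],
}

def spiral_scores(record):
    """Return {spiral: fraction of its elements present in the record}."""
    scores = {}
    for spiral, elements in SPIRALS.items():
        present = sum(1 for e in elements if record.get(e))
        scores[spiral] = present / len(elements)
    return scores

record = {"Title": "Example dataset", "Abstract": "A test record",
          "Linkage (URL)": "http://example.org/data"}
print(spiral_scores(record))
```

Scoring each spiral separately, rather than the record as a whole, is what makes the change "shrinkable": a provider can complete one spiral at a time.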
11. Identify Bright Spots
52 Records, 2 Groups
14 Bright Spots
Global Average: 2400+ Records
38 Opportunities for Improvement
Spiral abbreviations: ID – Identification, EX – Extent, CN – Connection, DI – Distribution, CO – Content, DE – Description, LI – Lineage, AC – Acquisition
12. Grow Your People - Community
13. Change Efforts Have Two Targets
Know Destination
Script Critical Moves
Find Bright Spots
Grow Your People
Find the Feeling
Shrink the Change
15. Acknowledgements
This work was partially supported by contract number NNG10HP02C from NASA.
Any opinions, findings, conclusions, or recommendations expressed in this material are
those of the author and do not necessarily reflect the views of NASA or The HDF Group.
July 24, 2013
BESSIG
Editor's Notes
Direct the rider: find the bright spots, script the critical moves, point to the destination. Motivate the elephant: find the feeling, shrink the change, grow your people.
The simplified destination is documentation for discovery, use, and understanding
The feeling we are aiming for is the satisfaction of creating trustworthy data. Everyone who collects data wants the data, and the processing of it, to be trusted.
We are already 1/3 of the way there.
Identification is the keystone of discovery and it has a lot of elements. We are actually more than 1/3 of the way there.
A complete picture of where we are headed broken up into do-able steps.
The rubric web application has information about how to accomplish the spirals and links to a community resource (wiki and other web pages) for more information.
This figure summarizes rubric scores for 52 records from the Alaska Satellite Facility (ASF). The x-axis shows the spirals used to calculate the score. The y-axis shows the differences between the scores and the global average computed from over 2400 EOS metadata records. Two groups were identified, and they had the same scores for all but one spiral (which is why only the red line is visible); the scores differed only in the description spiral. Fourteen of the 52 records had above-average scores (bright spots) and 38 had below-average scores (opportunities for improvement). The bright spots serve as good examples for the community.
The same analysis can be done for other groups of records within the EOS community. Each group has local and global bright spots and opportunities for improvement.
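The bright-spot analysis described above can be sketched in a few lines: compare each record's rubric score against a global average and flag the records that score above it. The scores and the global average below are made-up numbers for illustration; the real analysis compared 52 ASF records against an average computed from over 2400 EOS records.

```python
# Sketch of the bright-spot analysis: records scoring above the global
# average are bright spots; the rest are opportunities for improvement.
# All values here are hypothetical.

GLOBAL_AVERAGE = 0.55  # assumed global mean rubric score

record_scores = {      # assumed per-record rubric scores
    "record-01": 0.72,
    "record-02": 0.40,
    "record-03": 0.61,
    "record-04": 0.48,
}

bright_spots = [rid for rid, s in record_scores.items() if s > GLOBAL_AVERAGE]
opportunities = [rid for rid, s in record_scores.items() if s <= GLOBAL_AVERAGE]

print("Bright spots:", bright_spots)
print("Opportunities for improvement:", opportunities)
```

The same comparison works at any grouping level (a data center, a project, the whole community) simply by changing which records contribute to `record_scores` and which average is used as the baseline.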