A brief presentation on data-driven (or evidence-based) collection development: some of the things to watch out for, and advice on how to view your data.
2. Who is driving “Data-Driven Collection Development”?
In 2008, the financial markets fell apart. (CD, Sept., 2008)
In 2009, ICOLC issued a statement saying it considered the financial crisis of such significance that it could not “simply assume libraries and publishers share the same perspective.” This turned out to be quite prophetic. (Worst materials cancellation in SDSU library history that summer. Some publishers held their prices… others did not.)
We are meeting new accountability and budgetary demands placed upon the entire university, while we still strive to meet the faculty’s and students’ curricular and research needs. (We needed to demonstrate the collection’s use. Are we spending our money wisely?)
This is not a new concept. In a collection evaluation article from 1979, Paul Mosher advocated measuring a collection development policy’s effectiveness to keep from falling down the “bottomless pit” of library acquisitions.
I believe that libraries have been and continue to be superb stewards of their budgets; however, the current economic climate demands that we prove it. We are not being asked any more than any other department on campus is being asked. More importantly, we do have the means to prove it.
Mosher, P. (1979, Winter). Collection evaluation in research libraries: The search for quality. Library Resources & Technical Services, 23(1), 16-32.
Bullington, J. C. (2009). About ICOLC and the ICOLC Statement on the Global Economic Crisis and Its Impact on Consortial Licenses. Collaborative Librarianship, 1(4), 156-161. URL: http://www.collaborativelibrarianship.org/index.php/jocl/article/viewArticle/52
3. Who is driving “Data-Driven Collection Development”?
How do we prove it? (Data! Well, data and, as always, the subject expertise, outreach,
and liaison work of the librarians. Data must have a context in order to be interpreted
correctly.)
More data is available and collected than ever before, thanks to the widespread adoption of electronic resources in place of print materials. (Vendor usage and turn-away reports.)
Questions like “Is the collection reflecting the changing needs of our evolving academic programs, research interests, and new interdisciplinary studies?” can now be answered. (Now we can tell. We can see what is being used at a very granular level: article by article, ebook by ebook, request by request.)
There is no longer a need for as heavy a reliance on perceived institutional use. (We can see what our users actually use; now we must interpret the data to meet their continually evolving information demands.)
Combining different data sources gives a clearer picture of how materials are being used. (Usage, ILL, citation analysis: JCR or LJUR, and turn-away reports.) By placing this data in the hands of the subject specialists, we enable more informed discussions about the collection and its use.
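As a concrete illustration of what combining data sources can look like, here is a minimal Python sketch. The file names, column headers, and the cost-per-use calculation are hypothetical stand-ins for whatever exports your vendors, ILL system, and acquisitions records actually provide; they are not any particular vendor’s format.

    import csv
    from collections import defaultdict

    # One aggregate record per journal, keyed by ISSN.
    journals = defaultdict(lambda: {"downloads": 0, "turnaways": 0,
                                    "ill": 0, "cost": 0.0})

    def load(path, field, cast=int):
        # Fold one exported CSV (assumed to have 'issn' and `field`
        # columns) into the per-journal profile.
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                journals[row["issn"]][field] += cast(row[field])

    load("counter_usage.csv", "downloads")    # e.g., COUNTER full-text requests
    load("turnaways.csv", "turnaways")        # denied-access (turn-away) report
    load("ill_requests.csv", "ill")           # borrowing requests from ILL
    load("subscriptions.csv", "cost", float)  # annual subscription cost

    # Rank titles by cost per use; high turn-away or ILL counts alongside
    # a high cost per use suggest demand the subscription is not meeting.
    for issn, j in sorted(journals.items(),
                          key=lambda kv: kv[1]["cost"] / max(kv[1]["downloads"], 1),
                          reverse=True):
        cpu = j["cost"] / max(j["downloads"], 1)
        print(f"{issn}: ${cpu:.2f} per use, {j['turnaways']} turn-aways, "
              f"{j['ill']} ILL requests")

If the top of that ranking is dominated by little-used, high-cost titles while the turn-away and ILL columns point somewhere else entirely, that contrast is exactly the kind of context the subject specialists can interpret.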
12. What statistics to use?
There is no one model that fits all libraries. (Culture, programs,
organizational issues) Select the evaluation indicators as they
apply to your situation.
Collection practices will become more patron-centric. A patron-driven focus may seem counter to consortium-level collection development. (Declining budgets, shrinking building space, expanding and interdisciplinary programs.) It is increasingly imperative that consortia work with the collection needs of their individual libraries. Libraries must be able to articulate what those needs are for their patrons. What is core for your patrons? (Is it really Springer journals?) Use the data to make your case.
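One way to test “what is core” is to see how use actually distributes across publishers or packages. A minimal sketch, again with hypothetical file and column names: tally full-text downloads by publisher and see what share of total use each big deal really accounts for.

    import csv
    from collections import Counter

    # Tally downloads by publisher from a hypothetical usage export
    # that has 'publisher' and 'downloads' columns.
    use_by_publisher = Counter()
    with open("counter_usage.csv", newline="") as f:
        for row in csv.DictReader(f):
            use_by_publisher[row["publisher"]] += int(row["downloads"])

    total = sum(use_by_publisher.values())
    for publisher, downloads in use_by_publisher.most_common(10):
        share = downloads / total
        print(f"{publisher}: {downloads} downloads ({share:.1%} of all use)")

A package that is “core” in list price but marginal in actual share of use is the kind of finding this slide is asking you to bring to the consortium.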
13. Opportunities and Pitfalls
Hard statistics must guide collection
development, but they need to be
supplemented by input from the user
community and subject specialists.
– Opportunity to create a scalable, flexible and
tailored library collection for your users.
– Pitfall to misinterpret use patterns in a vacuum.
Data is most useful when it is shared and combined with qualitative information.
14. Opportunities and Pitfalls
Keep your data up to date. Maintaining a data bank of collection assessment data can quickly become overwhelming.
– Opportunity to tie your data to the University mission and
specific library goals. Support institutional effectiveness.
– Pitfall to collect so much data that it becomes overwhelming to manage and therefore unusable. “There is no time to do assessment.” Or, worse, “Mistakes were made.”
15. Opportunities and Pitfalls
Think globally; act locally. Approach your
assessment from a philosophical point of view.
Create local solutions to fit the philosophy. The
statistics and the data the library receives may
change, but the way the data is used should remain
the same.
– Opportunity to describe the collection to your patrons and
how it seeks to meet their information needs.
– Pitfall to bury them in numbers, bar graphs and
spreadsheets. Anecdotal and specific examples should be
used to explain the data. (Everyone likes a good story.)