“Thinking like an Educator” and Demonstrating Value Like a Service Provider

“Leveraging Emerging Learning Technologies to Promote Library Instruction”
By Beth Strickland, Laurie Alexander, Amanda Peters, and Catherine Morse
June 3, 2013, Educause Review: http://www.educause.edu/ero/article/leveraging-emerging-learning-technologies-promote-library-instruction

As we think about the value added by library services and instruction, the article “Leveraging Emerging Learning Technologies to Promote Library Instruction” highlights key elements of a successful library program — collaborate, assess, revise, and repeat. Two of the Value Report’s essential questions resounded as I read the work of these University of Michigan librarians:

1. How does the library contribute to the student experience?
2. How does the library contribute to student learning?

Determined to move beyond the traditional one-shot workshop and supported by the assistant dean of undergraduate education, these librarians collaborated with faculty to develop a for-credit research course. As they assessed their work, they realized there were key components they could enhance using learning technologies. Again, they collaborated with an instructional technologist and created a blended learning approach to the material. This work demonstrates their extension beyond the traditional role of library information literacy instruction and work in curriculum development:

In the area of student learning, academic libraries are in the middle of a paradigm shift. In the past, academic libraries functioned primarily as information repositories; now they are becoming learning enterprises (Bennett 2009, 194). This shift requires academic librarians to embed library services and resources in the teaching and learning activities of their institutions (Lewis 2007). In the new paradigm, librarians focus on information skills, not information access (Bundy 2004, 3); they think like educators, not service providers (Bennett 2009, 194). VAL Report p.37

The online component allowed them to monitor progress immediately instead of waiting for a bibliography or final project to review. This generated discussion among the course librarian, faculty member, and students in a way that was less evident in the face-to-face version of the class. As we think about how to develop similar classes at our own institutions, these University of Michigan librarians have given us a great model for conceiving such courses and for convincing constituents of the benefits of assessing and revising workshop and curriculum design.

Information Literacy Assessment and the National Survey of Student Engagement (NSSE)

A session at ALA in Anaheim entitled “Feasible, Scalable, and Measurable: Information Literacy Assessment and the National Survey of Student Engagement” focused on presenting a draft module for the 2013 NSSE survey that would include more focused questions on information literacy. The presenters, Polly Boruff-Jones (Drury University), Carrie Donovan (Indiana University), and Kevin Fosnacht (NSSE), have made their slides available here: http://cpr.iub.edu/uploads/ALA%202012.pdf

This project builds on earlier work creating experimental questions that were incorporated in the 2008 version of the survey. Those questions were not permanent, so the new module is developing specific questions to collect data on a regular basis. While the presentation focused on discussing the draft questions and soliciting feedback, it also included a demonstration of the NSSE Report Builder, a publicly accessible site that allows users to construct reports based on variables selected from the NSSE.

The NSSE website has a tools and services page (http://nsse.iub.edu/html/tools_and_services.cfm) with links to the Report Builder, Custom Analyses, and Data Analysis Resources. The same page allows users to search for participating institutions, so users who aren’t familiar with the NSSE and want to see which institutions participate can check this list. Keep in mind that many institutions do not complete the survey every year, so in addition to the complete list of 2012 participating institutions, check the separate list showing the years in which each institution completed the survey.

A couple of articles that may help readers get a better understanding of how to use NSSE are available from ACRL publications:

One example of an institution using the data for benchmark comparisons is Franklin & Marshall. Readers can see the benchmark comparisons for information literacy using the Level of Academic Challenge data.

So, you may have access to similar data for your institution and can use it to prepare your own analyses.  If your institution participates in the NSSE survey, have you reviewed the results?  Have you created similar benchmarks with your institutional data from the NSSE?  Please share how you are using NSSE data.

Getting Started with Small-Scale Local Studies

Following up on Megan’s article, in this and future postings we’re going to look at different aspects of measuring value as outlined in the Research Agenda at the end of the report. The section “What to Do Next” offers suggestions on how to get started. It is easy to feel overwhelmed in thinking about how to undertake an assessment project, or even to imagine how you might demonstrate the value your library adds on your campus. As noted in the “Get Started” section, “small-scale local studies are often more powerful within individual institutions.” So what kind of small-scale studies are manageable on your campus?

One example of a small-scale study comes from Hope College. Hope was one of the participants in the November Summit, and a podcast with their provost was uploaded last week. Upon returning to campus after the Summit, Kelly Jacobsma, Library Director, decided to test one possible hypothesis: could they correlate the number of library checkouts with student GPA? They ran a report on juniors and seniors using student ID numbers and matched checkout counts with GPAs. They found a strong positive correlation: students with more checkouts also tended to have higher GPAs. Obviously, this is just a simple test, but as one indicator, it is a start.

The second aspect is how to share this and tell our story in a way that is meaningful for the campus community. Each institution will need to decide what works best on its own campus. Another challenge, as Kelly pointed out, is that while we can get numbers for individual students from our circulation records, we cannot get similar data for use of journal articles. Knowing that our electronic databases and e-journals are a key resource for our students, but lacking a means to measure their use, makes for an interesting challenge. One of the questions shared in the report to demonstrate a possible approach to collecting data was “What percent of readings used in courses are available and accessed through the library?” All of us have some kind of course management system in use on our campuses, but how many libraries have access to the course pages for the classes? Moreover, the common practice is for professors to download the PDF and upload it to their course page instead of linking to the article in our systems.
Perhaps some of you have started to explore this issue and have thoughts to share: how could we create a meaningful but manageable study, or a means of gathering data, that would help demonstrate our value as it relates to journal articles used in classes?

This is just one example of gathering data. Several years ago I conducted a short one-question survey of our faculty through our Center for Scholarship and Teaching. With the cooperation of the Director, we sent an email via their listserv asking faculty a simple question: did they publish a book or article in the previous academic year? If they answered “yes,” we asked whether the library contributed in any way and provided a checklist of options, including journals and books owned by the library, interlibrary loan service, assistance from reference librarians, etc. Among those who responded, 75% credited the library with contributing. It was a short survey, but it let us tell our story in terms of how the library contributes to the published scholarship faculty produce on our campus.
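Tallying a survey like this is equally lightweight. The sketch below is a hypothetical illustration: the responses, the field names, and the service labels are all invented, but the arithmetic (the share of publishing respondents who credited the library, plus a count of which services they checked) mirrors the kind of summary described above.

```python
from collections import Counter

# Invented sample responses to a one-question survey like the one described
# above. Each publishing respondent said whether the library contributed
# and which services they checked off.
responses = [
    {"published": True, "library_helped": True,  "services": ["journals", "ILL"]},
    {"published": True, "library_helped": True,  "services": ["reference"]},
    {"published": True, "library_helped": False, "services": []},
    {"published": True, "library_helped": True,  "services": ["journals"]},
]

published = [r for r in responses if r["published"]]
helped = sum(r["library_helped"] for r in published)
pct = 100 * helped / len(published)
print(f"{pct:.0f}% of publishing respondents credited the library")

# Which services were cited most often
service_counts = Counter(s for r in published for s in r["services"])
print(service_counts.most_common())
```

Even a spreadsheet can do this, of course; the point is that a single well-chosen question yields a percentage you can quote when telling the library’s story.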

We invite you to look at the Research Agenda.  As stated in the report, “If each library identifies some part of the Research Agenda in this document, collects data, and communicates it…the profession will develop a body of evidence that demonstrates library impact in convincing ways.” [p94]  Last week Megan provided a survey form.  Please complete the survey, but also please respond if you can answer any of the following questions.

Do you have stories to share?  Do you have assessment techniques that have worked for you?  Help us build our body of evidence by telling us how you have compiled data or stories that demonstrate the impact your library has on your campus.