Leveraging Emerging Learning Technologies to Promote Library Instruction
By Beth Strickland, Laurie Alexander, Amanda Peters, and Catherine Morse
June 3, 2013, Educause Review: http://www.educause.edu/ero/article/leveraging-emerging-learning-technologies-promote-library-instruction

As we think about the value added by library services and instruction, the article “Leveraging Emerging Learning Technologies to Promote Library Instruction” highlights key elements of a successful library program – collaboration, assessment, revision, and repeat. Two of the Value Report’s essential questions resounded as I read the work of these University of Michigan librarians:

1. How does the library contribute to the student experience?
2. How does the library contribute to student learning?

Determined to move beyond the traditional one-shot workshop and supported by the assistant dean of undergraduate education, these librarians collaborated with faculty to develop a for-credit research course. As they assessed their work, they realized there were key components they could enhance using learning technologies. Again, they collaborated with an instructional technologist and created a blended learning approach to the material. This work demonstrates how they extended beyond the traditional role of library information literacy instruction and into curriculum development:

In the area of student learning, academic libraries are in the middle of a paradigm shift. In the past, academic libraries functioned primarily as information repositories; now they are becoming learning enterprises (Bennett 2009, 194). This shift requires academic librarians to embed library services and resources in the teaching and learning activities of their institutions (Lewis 2007). In the new paradigm, librarians focus on information skills, not information access (Bundy 2004, 3); they think like educators, not service providers (Bennett 2009, 194). VAL Report p.37

The online component allowed them to monitor progress immediately instead of waiting for a bibliography or final project to review. This generated discussion among the course librarian, faculty member, and students in a way that was not as evident in the face-to-face version of the class. As we think about how to develop similar classes at our own institutions, these University of Michigan librarians have given us a great model to help others conceive of, and convince constituents of, the benefits inherent in assessing and revising workshop and curriculum design.


Guest post by Debbie Krahmer, Learning Commons Librarian, Colgate University

[A version of this post was originally printed in the Fall 2012 Colgate University Libraries newsletter.]


The Core Curriculum is a key part of Colgate University’s liberal arts education. Only one category, Core Communities & Identities (Core CI), explicitly includes information literacy among its top three learning goals. Every Core CI class involves a research project of some kind, in the form of posters, videos, or research papers. This variety of research projects has been an obstacle to designing a suitable assessment instrument. How can you assess the impact of information literacy instruction across a diverse set of projects, geographical areas, and disciplines?

Grounded in ACRL’s Information Literacy guidelines, we based our assessment on “Project Information Literacy,” the national survey conducted by Alison Head and Michael Eisenberg. I worked with three teaching faculty from Core CI to survey over 500 students to determine how they were using and evaluating resources, and if attending a library session had any influence on the outcomes at the end of the semester.

The survey was administered at the beginning and end of the Fall 2011 and Spring 2012 semesters and is being repeated for the 2012–2013 academic year. It consisted of seven to eight questions selected from the 2010 PIL survey, administered through Google Forms. Using Google Forms helped cut down the time it took to collect the data, and the brevity of the survey allowed students to finish it in about 15 minutes. The majority of the 574 respondents were first-year or sophomore-level students.

The broad reach of the PIL survey made it the perfect model for developing a simple assessment tool that could be used for any course. Colgate was one of the participants in the 2010 survey, so we had not only national data but also Colgate-specific benchmarks to measure against.

Perhaps the most exciting result was that students who attended a library information literacy session showed a far more positive change in using and evaluating research materials over the course of the semester than those who did not. While all students tended to move from non-scholarly to scholarly sources over the semester, the students who had a library session reported using scholarly materials more than students who didn’t have a session. These students consulted online journal databases much more often, and generic Internet search engines much less often. Nearly 50% of the students who had a library session reported that they consulted a librarian in the course of the research project, while only 13% of the students without a library session did.

Overall, the survey supported the faculty’s perception that Core CI has a positive effect on students’ information literacy skill development. It also serves to support the need for hands-on, face-to-face interactions within the classroom between students and librarians. The simplicity of the survey doesn’t allow us to determine why the library sessions had such a positive influence, but it does give us a very simple way to show that there is value in inviting librarians into the classroom.

We’ve used the results, as well as the growing research around the PIL study, to determine areas where instruction should be improved. Perhaps more importantly than that, the conversation around information literacy standards and expectations has improved since the survey. There are more frank, open discussions of what faculty and librarians expect from students, a key component for continuing to improve our support of student learning.

You can view a poster demonstrating the results of the 2011-2012 Core CI Information Literacy Assessment at the Colgate University Digital Commons.


If you have similar studies that you would like to see highlighted in the Value blog, please submit using this form.


Recently I’ve been part of at least two separate conversations that included someone asking, “How does an institution measure the impact that changes in library spaces may have on the student experience?” In the Value Report, each of these questions included the use of library space as a possible area of correlation:

  • How does the library contribute to student achievement?
  • How does the library contribute to student learning?
  • How does the library contribute to the student experience?

In the section on student experience, one suggested technique was to issue a survey that “ask[s] library users to describe what the academic library has enabled them to accomplish.” (p.124) For libraries that have initiated changes in their spaces, the survey could ask what the new, improved spaces have enabled users to accomplish that they could not accomplish before. The following are just a few of the many available resources that may assist libraries in assessing their new spaces, or in planning as well as assessing the impact of new spaces.

The new ACRL Standards for Libraries in Higher Education (SLHE) include a principle on space, which reads “Libraries are the intellectual commons where users interact with ideas in both physical and virtual environments to expand learning and facilitate the creation of new knowledge.” The SLHE are designed to guide academic libraries in advancing and sustaining their role as partners in educating students, achieving their institutions’ missions, and positioning libraries as leaders in assessment and continuous improvement on their campuses. They articulate expectations for library contributions to institutional effectiveness, with evidence collected in ways most appropriate for each institution. It is worth remembering that the SLHE provide a strong framework for demonstrating the impact and value of all types of academic libraries. Librarians can use the standards to prepare for accreditation, demonstrate value of their library to student learning and success and institutional initiatives, and improve quality in many areas, including library space.

At ACRL 2007 in Baltimore, Rachel Applegate presented a paper entitled “Build It and Then What? Measuring Implementation and Outcomes of Information Commons,” which provides guidance on developing assessment programs for new Information Commons. Based on the experiences of the University Library (UL) of Indiana University–Purdue University Indianapolis (IUPUI), Applegate described outcomes the library was able to measure as a result of the new Information Commons. She shared evidence that database usage increased, based on data collected pre-Commons versus pilot-Commons. (p.171) This measurement demonstrated at least one significant change resulting from the implementation of the Information Commons. As stated in the article:

“But merely proving that they use an area does not capture what kind of outcome it has achieved.

Outcome evaluation is a core goal of the academic assessment and accountability movement. Inasmuch as academic libraries are indeed academic, they are called to participate in this effort to determine just what student learning outcomes have been achieved. For libraries this is very challenging, as they serve an assistive, not a direct, role in learning.” (p.171)

Challenging as it may be to demonstrate learning outcomes, Applegate shared further details on a grant-funded survey that collected data in a limited study of eight courses: four graduate and four undergraduate. Limited as the study may have been, it provided enough information to make the general assertion that the “results tend to show that library and Commons usage can be associated with positive outcomes, for undergraduates…” (p.172) You can read the complete details in the report, which is freely available online. She shares information on methodologies and provides detailed background on the nature of the studies. Most important, these are the types of studies that can be replicated in other libraries, including your own.

The main point is that projects such as these still require that some kind of data be collected and analyzed in order to demonstrate quantitatively how the changes have contributed to improving the student experience. In the ARL Learning Space Pre-Programming Tool Kit, the majority of the report focuses on assessment techniques for use after the planning process. The report includes a detailed example of a survey that could be administered at a workstation within a multimedia lab. Even though this is an ARL publication, that example could be applied to any type of academic library – community college, four-year, or research level. The important piece is that the survey might generate enough information to support a more detailed follow-up with selected participants. A follow-up could collect data demonstrating improvements in GPA, retention, or other correlations showing that more has taken place than simply improving the student experience as a result of creating a multimedia lab, an Information Commons, or new comfortable reading spaces.

The Value Report suggested that correlations with spaces and student achievement, student learning, and the student experience could all be compiled for the purpose of demonstrating value.  What is really needed now are more examples from libraries that have conducted studies and collected data and a sharing of those studies so that other libraries can learn and apply lessons learned in their own institutions.

Perhaps your library has recently initiated substantial changes in its spaces. Have you implemented a plan of assessment to determine how these changes are contributing to student achievement, learning, or the overall student experience? We’d like to hear from you if you have, and we would love to see and share any reports you may have generated. Use the “Share Your Research” feature to the right of this posting.

Working on a Value of Academic Libraries project at your institution? Share your research through our VAL project survey. We’ll highlight many of these projects on our blog.


With thanks to Megan Oakleaf and Kara Malenfant for contributions to this post and to John Pollitz for the topic suggestion.

© 2010-2012 Association of College & Research Libraries, a division of the American Library Association