Using a Pilot Study to Test and Assess a New Instruction Model

Introduction

Have you ever attended a workshop and promptly forgotten most of what you learned a few days later? Chances are almost everyone has experienced this phenomenon or knows someone who has. Given that library staff teach hundreds of library instruction sessions each semester through training workshops, course-integrated sessions, campus workshops, and more, this issue likely affects those who attend our instruction sessions as well.

Last summer, my colleague, Alex Rivera, and I explored a potential solution to this problem called Learning Boosters. Learning Boosters are adapted from a model called Training Boosters by cognitive scientist Art Kohn, and they apply theories developed by Jeffrey Karpicke and Henry Roediger that use repeated retrieval practice to enhance long-term information retention.

Learning Boosters consist of three tests that are sent to students days, weeks, and months after a workshop to reinforce 3-5 topics (a short sketch of this structure follows the list):

  • First booster (recall) is sent two days after the workshop and asks students to answer a series of multiple choice questions

  • Second booster (generative) is sent two weeks after the workshop and asks students to answer a series of fill-in-the-blank questions

  • Third booster (integrative) is sent two months after the workshop and asks students to reflect on how they applied the 3-5 topics reinforced since the workshop
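
As a rough illustration of that three-part structure, here is a minimal Python sketch; the class and field names are our own shorthand for this post, not anything exported from Qualtrics, and the two-month delay is approximated as 60 days.

    from dataclasses import dataclass
    from datetime import timedelta

    @dataclass
    class Booster:
        """One Learning Booster in the three-part sequence."""
        label: str            # recall, generative, or integrative
        question_format: str  # how the 3-5 topics are reinforced
        delay: timedelta      # how long after the workshop it is sent

    # The default sequence described above
    BOOSTER_SEQUENCE = [
        Booster("recall", "multiple choice", timedelta(days=2)),
        Booster("generative", "fill in the blank", timedelta(weeks=2)),
        Booster("integrative", "reflection on applying the topics", timedelta(days=60)),
    ]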

[Figure: Ebbinghaus forgetting curve diagram]
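
For context on why spaced retrieval matters, the forgetting curve is commonly modeled as exponential decay; one standard formulation (an approximation for illustration, not something we measured) is

    R = e^(-t/S)

where R is retention, t is time since the workshop, and S is the relative strength of the memory. Each booster prompts retrieval before too much has decayed, which is intended to flatten the curve.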

Our Pilot Study

A pilot study was developed and approved by the university’s IRB to investigate the effectiveness of Learning Boosters on long-term retention in the context of course-integrated library instruction. Five classes were chosen from the 2017 Summer Bridge Scholars Program: three classes served as a study group (i.e., they received boosters), and two served as a control group (no boosters). All classes received a pre- and post-test to evaluate long-term information retention. The Learning Boosters for the library session were developed around the following concepts, resources, and services:

  • Ask a Librarian instant messaging service

  • Peer Review

  • Course Reserves

  • ProQuest

  • Study Room Reservations

The pre- and post-tests and the Learning Boosters were created in Qualtrics. Links to the Qualtrics surveys were sent to all classes, with one exception, via a course announcement in the course management platform, Canvas; students in the remaining class received an email from the Graduate Student Instructor that included the Qualtrics links. The timeline for distributing the Learning Boosters was adjusted to fit the program schedule (i.e., 2 days, 2 weeks, and roughly 1 month after the session).
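
To make the adjusted timeline concrete, here is a minimal sketch that computes send dates from a session date; the offsets are the program-adjusted ones above, and the session date shown is purely illustrative.

    from datetime import date, timedelta

    def booster_send_dates(session_date, offsets):
        """Return the date each booster should be distributed."""
        return [session_date + offset for offset in offsets]

    # Program-adjusted schedule: 2 days, 2 weeks, and roughly 1 month
    ADJUSTED_OFFSETS = [timedelta(days=2), timedelta(weeks=2), timedelta(days=30)]

    # Hypothetical session date, for illustration only
    print(booster_send_dates(date(2017, 7, 10), ADJUSTED_OFFSETS))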

Results

The results of the pilot study were limited in that the data could not answer questions about long-term retention due to a lack of student participation, but they were valuable for assessing the implementation model. Overall, few students completed the Learning Boosters, and almost none completed the post-test. One study class instructor did, however, incentivize students to participate, which led to high participation in the pre-test and in all three Learning Boosters.

The pilot study found that the group with high participation rates showed an increase in retention of specific concepts, such as peer review, between the second and third boosters. Although the sample was small, this evidence of a positive impact was encouraging enough to begin scaling up the use of Learning Boosters in instruction sessions offered during Winter 2018.

The difference in participation rates between the study classes was a valuable, if somewhat unexpected, finding that greatly improved the research team’s understanding of how to implement Learning Boosters. The pilot study suggested that the more closely the librarian and course instructor partnered on implementing and incentivizing the boosters, the higher the participation. In the class with the highest participation rates, a librarian served as a co-instructor in the online course environment, and the course instructor included the Learning Boosters in the course syllabus and grading scheme to help motivate students.

Next Steps

Overall, we considered the pilot study a success: it gathered some evidence of positive results and exposed both barriers to implementing future iterations of Learning Boosters and potential solutions. The pilot study also clearly identified future areas of study, as well as data collection methods better suited to assessing long-term retention of information.

With the lessons learned in the pilot phase, the implementation of Learning Boosters has been expanded and improved for Winter 2018. A number of librarians have begun implementing Learning Boosters in the following course-integrated library instruction areas:

  • 8 English 125 sections

  • 1 English 124 section

  • 1 Women’s Studies course

  • 1 Theatre & Dance course

Each course uses a variation of the implementation model that proved successful in the pilot study. Participation rates for each use of Learning Boosters during Winter 2018 will be assessed in order to develop an ideal model that library instructors can use to maximize participation in future instruction sessions. Participation rates for two open workshops (i.e., not course-integrated) will also be assessed in Winter 2018.
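
For that assessment, the participation rate for each booster is simply the number of respondents divided by the number of students enrolled; a minimal sketch with made-up counts:

    def participation_rate(respondents, enrolled):
        """Share of enrolled students who completed a given booster."""
        return respondents / enrolled if enrolled else 0.0

    # Hypothetical counts for one section's three boosters
    enrolled = 24
    for label, respondents in [("recall", 20), ("generative", 17), ("integrative", 15)]:
        print(f"{label}: {participation_rate(respondents, enrolled):.0%}")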

In addition to assessing the implementation model with a larger pool of courses, we are conducting a longitudinal study to assess the impact Learning Boosters have on long-term retention in a number of English courses.

Lessons Learned

  • Using a small pilot to test an idea is helpful for determining the future steps of a project and for making necessary changes. The scale of a pilot allows changes to be made more quickly and easily than after a large-scale implementation.

  • The failure of some aspects of a pilot study can be just as valuable as the successes for informing future implementation of a project as well as future research directions.

  • Incentivizing students to participate and complete surveys is difficult! Partnering closely with course instructors and leveraging instructor support can help motivate students. Other incentives should also be used to increase participation in post-tests, especially if the post-test will take place after classes have ended.