Applying Iterative Design to Assessment

Background

During 2019, I was fortunate to collaborate with a group of library colleagues to plan and pilot three organizational change training opportunities. The purpose of this effort was to better understand the needs, desires, and possibilities for future training opportunities and/or structured conversations about change within the Library, for all library staff. Because we knew that several change opportunities related to Library spaces, services, and programs were coming, we wanted to develop and/or enhance change skills for individuals across the Library. Our hypothesis: everyone on staff might benefit from a structured and engaging discussion or activity about surviving, communicating, implementing, and leading change.

As part of our project goals, the Change Management Training Advisory Team developed three different opportunities for library staff to engage with the concepts and current practices of organizational change. By offering a variety of training formats and content, we felt we were meeting diverse learning needs and varied work schedules while connecting pre-existing knowledge to practice. The best part of this project was our ability to assess where our efforts were hitting the mark (or not!) while we were still developing the content and structure of these activities. As a former colleague of mine used to say, we were “building the plane while we were flying it.” The Advisory Team embraced our contributions to our Library’s culture of assessment by practicing a variety of feedback and assessment techniques, and we responded to results in a timely fashion by applying an iterative design lens to our work.

“Assessment is done throughout the course of a project for varying reasons: formative assessment is done to provide feedback for ongoing activities, and to inform any needed mid-course corrections; summative assessment is done to measure a project's overall success; longitudinal assessment tracks impacts beyond the duration or initial scope of the project.” (See https://serc.carleton.edu/research_education/assessment.html)


Types of Assessment

Because each of the three learning opportunities was different in format, audience, content, and level of engagement, our assessment strategies were varied as well. The first learning opportunity, a lecture-style presentation from an external consultant aimed at all library staff, focused on general knowledge about organizational change. The Advisory Team and our Library Assessment Specialist created a pre-presentation survey and a post-presentation survey. The pre-presentation survey gave the team a sense of where library staff were in their own views of change acceptance and in their confidence to face and to adapt to or influence change. The post-presentation survey then not only measured staff confidence in light of what they learned but also gathered feedback about the presentation format, content, and presenter. With a high response rate for both surveys, Advisory Team members brought to light staff concerns and strengths related to organizational change, resilience, and communication.

We learned that a lecture-style presentation was useful for some attendees and not so helpful for others. We decided to arrange a follow-up conversation for library staff to recap the presentation content and to give more staff a chance to delve into change management concepts and terminology, using case studies. Discussion attendees were invited to provide informal, in-the-moment feedback, which we were able to use going forward.

The second learning opportunity focused on the needs of supervisors and managers to develop leadership around change. Advisory Team members reached out to individuals to learn about participants’ experiences and thoughts post-workshop, and then we distributed a survey to those supervisors and managers who attended, with the hope that the Advisory Team would understand the impact of the workshop on attendees’ individual practice. We learned that managers and supervisors wanted environments where they could talk more about their challenges around change and communicating change to staff. As a result of that helpful feedback, the Advisory Team offered a small panel session with several workshop participants, during which they shared how they had applied ideas and strategies from the workshop. Attendees were invited to share feedback with the team via email or one-on-one.

The third learning opportunity in our pilot change training program consisted of two parts: a facilitated discussion about a specific article on change management, open to all interested library staff, and a by-invitation-only working session for small project teams or committees to apply a locally-created worksheet structured to help plan and communicate change. 

As the Advisory Team was honestly tired of surveys at this point, and really wanted some actionable, recordable data from participants, we presented a reflection set at the end of the article discussion session. Attendees were asked to respond individually on notecards to two prompts:

  • Was this facilitated discussion helpful for learning about communicating change and/or taking actionable steps? Briefly explain why/why not. 

  • If you were facilitating this discussion, what would you do to keep the conversation going or to create future opportunities for learning and engagement around change management?

For the hands-on experience with the change planning worksheet, we distributed a very brief post-session survey that asked about the workshop itself as well as the kinds of support we should offer in the future. Both qualitative approaches to assessment gave us rich themes and specific comments (or testimonials) to guide future strategic directions.

What Did We Learn About Assessment?

By approaching this project as iterative and experimental, we found opportunities to gather meaningful data and feedback that directly shaped the development of our activities. Any assessment technique can add value to a project; the Advisory Team found our formal and informal assessment steps valuable in planning the sequence of learning opportunities while allowing us to make critical adjustments in response to participant feedback.

While we used anonymous survey techniques throughout our pilot, the Advisory Team never felt locked into the “let’s-make-a-survey” path during the project. This flexible view helped the group respond organically to our process, making course corrections in response to collegial feedback. On the flip side, some days it felt like a team-based “rapid response” to the project’s assessment demands and needs, with little time to be scientific or rigorous in our approach.

Even though we took an iterative design approach to our project, the Advisory Team knew that we had to bake in assessment steps, not only to contribute to our Library’s assessment culture but also to understand what worked and did not work in our project. Pilot projects and experimental programs need to be fluid. Having an expectation that you will assess everything you do, including pilot or experimental efforts, helps everyone not only create a culture of assessment but also make meaning of their work on the project.

And finally, having folks on the team who knew something about assessment and evaluation aided and informed the Advisory Team’s conversations and project planning. Project teams are best served when their membership includes that skill or interest.