Tiny Studies

Stories and reflections from U-M Library assessment practitioners.
One orange lego in the middle of a blue lego base.

Posts in Tiny Studies

Two columns, left one labeled Seen and right one labeled Safe.
  • Denise Leyton
In three blog posts, the authors describe a multi-year library service design project. This second post describes the research process used to develop our user experience tool.
Two columns, one labeled Seen and the other labeled Safe, with a gray scale gradient.
  • Denise Leyton
In three blog posts, the authors describe a multi-year library service design project. This first post describes the origins and goals of the assessment project.
Photo of an empty lecture hall.
  • Jesus Ivan Espinoza
How do we begin applying a critical lens to assessing library instruction? Recently, the U-M Library Instructor College and the Feminist Pedagogy Reading Group discussed Maria Accardi’s book chapter "Teaching Against the Grain: Critical Assessment in the Library Classroom."
Iteration in the design thinking process: Understand, Explore, and Materialize categories, with steps of empathize, define, ideate, prototype, test, implement.
  • Karen A Reiman-Sendi
A project team charged with providing staff training activities approached its project assessment with an iterative design lens, allowing for responsive and timely development of multiple opportunities for staff engagement around organizational and personal change. The team tried out different assessment techniques suited to each opportunity offered.
Line image of questions to ask about data: what do we want to know, what could data show, who do we want to show, why do we want to know, and what does the data represent.
  • Kat King
Chances are the work processes you already have in place are generating data that you could be using to learn more about those processes. In this second blog post, the author continues to highlight steps for working with data that is generated by your daily tasks.
Line image of questions to ask about data: what do we want to know, what could data show, who do we want to show, why do we want to know, and what does the data represent.
  • Kat King
Chances are the work processes you already have in place are generating data that you could be using to learn more about those processes. In two blog posts, the author shares some steps for working with data that is generated by your daily tasks.
Circle with text in the center that reads It's all about building community.
  • Sheila Garcia
What does it mean to evaluate assessment practices through a DEIA lens? Sheila Garcia, Resident Librarian in Learning and Teaching, shares her personal journey applying a critical lens to her capstone project that centers the experiences of undergraduate language brokers.
Photo taken above a busy crowd. Some figures are actively walking and blurred by their movement; other figures are clear and sharp, standing, talking to, or watching other people in the crowd.
  • Ben Howell
How can we improve familiarity and credibility between the Library's experts, resources, and services and the students, faculty, and staff who use them? Whether we're building new relationships or reconnecting with patrons and colleagues during assessment or user research activities, we can apply marketing and communication best practices and tools to align clear, targeted communication with our key audiences.
Image of a bar chart and magnifying glass.
  • Joe Zynda
Assessment and research activities focused on the experiences of U-M Library faculty, staff, and students are happening regularly, and often the Library Human Resources (LHR) team is contributing to these activities, if not leading the research. This work can take a quantitative, qualitative, or hybrid approach, and can involve surveys, interviews, and/or some general number crunching. This post reviews some recent LHR assessment projects.
Text: Keep Calm and Don't Forget About IRB Review
  • Craig Smith
When planning an assessment project in the Library, one important step is to consider whether your project should be vetted by the Institutional Review Board (IRB) at U-M, a committee that ensures studies with human subjects are ethical, that subjects are protected from unnecessary psychological or physical risks, and that subjects are participating in a fully informed, voluntary manner. This post details when your data collection may be subject to a full IRB application and review process.