
Smathers Libraries Assessment and User Experience

Introduction

Pauline Bickford Cline, Doug Kiker, and Christy Shorey, all members of the Assessment Advisory Committee (AAC), presented this year at the Charleston Library Conference on the work of an AAC subcommittee, the Ask-A-Librarian Transcript Analysis Subcommittee. The presentation took place during the virtual conference week and included a live Q&A session over Zoom.

In the presentation, Pauline, Doug, and Christy discussed their analysis of an academic year's worth of Ask-A-Librarian chat transcripts. Beginning in September 2021, the Ask-A-Librarian subcommittee analyzed about 100 anonymized Ask-A transcripts per month, collected and de-identified by David Carnell. In some months the subcommittee analyzed more; in January 2022, for example, additional transcripts were reviewed when the UFDC platform was updated. This project took place during a year of change at the Libraries, which included the library website migration, the ILS migration to Alma/Primo VE, and the UFDC platform migration, as well as several simultaneous web page redesign projects based on usability testing.

The subcommittee developed a list of common chat topics and barriers faced by patrons, which were used to code each transcript. Each interaction was then rated against predetermined criteria, and recommendations for improvement were made based on how well the patron's needs were met. Using this data, the subcommittee developed large-scale recommendations to share with Assessment Librarian Laura Spears, Dean of Assessment Val Minson, Ask-A-Librarian coordinator Robin Fowler, and Library Technology Services. Based on these recommendations, changes are being made to the Ask-A-Librarian training program and resources as well as to the library website. The recommendations included reorganizing the study room web page and expanding Ask-A staff training ahead of the fall semester on topics such as the VPN and best practices for a reference interview. The data contained valuable information that helped improve library services and resources. The dataset and the Charleston presentation recording will be made available in UF's institutional repository.

Transcript Analysis

Each month, the Assessment Advisory Committee pulls a 10% sample (typically 100 to 150 transcripts) of Ask-A-Librarian interactions.

  • The transcripts are anonymized for both patron and Ask-A attendant.
  • The transcripts are then coded by topic (what the patron's question was about) and by barrier (what caused the issue).
  • The coding enables the Libraries to make informed decisions about improvements to library services based on frequent issues and questions that appear in the chat transcripts.
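As a rough sketch of the workflow described above, the monthly sampling and topic tallying might look like the following in Python. The record shape, function names, and use of a random sample are illustrative assumptions, not the Libraries' actual tooling:

```python
import random
from collections import Counter

# Topic codes used by the subcommittee (defined later in this page).
TOPIC_CODES = ["Access", "Policy", "Research", "Technology", "Broader UF", "N/A"]

def pull_sample(transcripts, fraction=0.10, seed=None):
    """Draw a random sample (default 10%) of the month's transcripts."""
    rng = random.Random(seed)
    k = max(1, round(len(transcripts) * fraction))
    return rng.sample(transcripts, k)

def tally_topics(coded_transcripts):
    """Count how often each topic code appears in a coded sample."""
    return Counter(t["topic"] for t in coded_transcripts)
```

A tally like this is what lets frequent topics and barriers surface across months, informing decisions about services and web pages.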

FAQ

What are the criteria for the rating system, and how did you approach rating each transcript?

Ratings use a four-star scale:

  • 1    Unsatisfactory response
  • 2    Some knowledge from the Ask-A attendant, but problem wasn't solved
  • 3    Could be more detailed, problem was solved
  • 4    Great response; problem solved and Ask-A attendant demonstrated comprehensive knowledge of the topic

It's important to note that this system is relatively simple in design and was not intended to be punitive or judgmental. There are scenarios in which Ask-A attendants must manage several chats of varying complexity at once, and the subcommittee acknowledges the challenges inherent in this type of work. Additionally, not all of the subcommittee members have worked Ask-A themselves. When rating an interaction, the subcommittee focused on objectively improving library services for patrons.
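Purely as an illustration, the four-star scale could be represented as a simple lookup table, with a helper for summarizing a month's ratings; these names are hypothetical, not the subcommittee's actual code:

```python
# Four-star rating scale, as described in the FAQ above.
RATING_SCALE = {
    1: "Unsatisfactory response",
    2: "Some knowledge shown, but the problem wasn't solved",
    3: "Problem solved; response could be more detailed",
    4: "Problem solved with comprehensive knowledge of the topic",
}

def mean_rating(ratings):
    """Average star rating across a set of coded interactions."""
    return sum(ratings) / len(ratings) if ratings else None
```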

How are transcripts anonymized? 

  • The transcript sample (10% of total monthly interactions) is pulled at the start of each month
  • A committee member manually reads through each interaction and replaces all identifying information (names, emails, phone numbers, etc.) with generic codes
  • Responders are coded with consistent identifiers across months: if responder John Doe is coded as R5 in the September report, he remains R5 in the October report, and so on
  • Patrons' names and identifiers are replaced with "Patron." This includes email addresses, so if a patron provides "JaneDoe@ufl.edu," it is changed to "patron@ufl.edu."
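The de-identification steps above might be sketched in Python as follows. The regular expressions, function names, and phone-number placeholder are illustrative assumptions; the committee performs this work manually, not with this code:

```python
import re

# Map a responder's real name to a stable code (R1, R2, ...)
# so the same person keeps the same code across monthly reports.
responder_codes = {}

def responder_code(name):
    """Return a consistent generic identifier for a responder."""
    if name not in responder_codes:
        responder_codes[name] = f"R{len(responder_codes) + 1}"
    return responder_codes[name]

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[A-Za-z]{2,}")
PHONE = re.compile(r"\(?\d{3}\)?[-.\s]?\d{3}[-.\s]?\d{4}")

def anonymize(text, responder_name, patron_name):
    """Replace identifying details with generic placeholders."""
    text = EMAIL.sub("patron@ufl.edu", text)   # e.g. JaneDoe@ufl.edu -> patron@ufl.edu
    text = PHONE.sub("XXX-XXX-XXXX", text)
    text = text.replace(responder_name, responder_code(responder_name))
    text = text.replace(patron_name, "Patron")
    return text
```

For example, `anonymize("Jane: my email is JaneDoe@ufl.edu -- John Doe", "John Doe", "Jane")` would yield a transcript line with "Patron," "patron@ufl.edu," and a stable responder code in place of the originals.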

How many committee members coded the transcripts?

  • Three to four members coded transcripts in any given month
  • Each member coded roughly 25 to 50 interactions per month

 

Topic Code Definitions

Access: Questions about inaccessible content or accessing a particular type of content or tool

Example: I’m seeing a paywall for this article. Can you help me access the full text?   

Policy: Questions about library policies and procedures, such as building rules and borrowing

Example: How long can I book a study room?

Research: Questions about research projects or researching a particular topic, including citation assistance and referral to subject specialists

Example: Can you help me find articles about polar exploration?

Technology: Issues and questions related to or caused by library software, platforms, or tools

Example: I see an error message when I try to log in to the VPN.

Broader UF: Issues and questions not related to the Libraries, such as parking or the registrar

Example: How do I register for classes?

N/A: The patron signed off before receiving help and a topic could not be determined

Data Set and Charleston Presentation

The data set and a recording of the Charleston presentation are available in UFDC: https://original-ufdc.uflib.ufl.edu/l/IR00011975/00001/citation


Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.