Good afternoon,
I’m reaching out because our team is rethinking how we approach ongoing assessment of our website, and I would love to hear about your experiences or any resources you might recommend.
Up to now, our work has mostly focused on localized, incremental improvements based on user feedback. While this has been valuable, we’re hoping to take a step back and build a more structured and holistic assessment framework—one that gives us a clearer overall view of the site, aligns with other assessment activities happening across the library, and better reflects how our users navigate between online and in‑person services.
We’re especially interested in approaches that treat the website as part of a larger user ecosystem—for example, booking a study room online and then using it onsite, looking up research support information before meeting a librarian, or discovering services on the website that connect directly with what happens at our service desks.
If you have models, strategies, examples, or even brief insights from similar work you’ve done, would you be willing to share them?
Thank you!
Myriam
Myriam Dupont, M.S.I.
Librarian - Planning, Assessment, and User Experience
Library - Office of Organizational Performance Management
Université Laval
Chat on Microsoft Teams
Pavillon Jean-Charles-Bonenfant, Room 3122
Québec (Québec) G1V 0A6
Confidentiality notice
Myriam,
I agree with viewing the website as part of a larger ecosystem. I would also say that a library’s assessment program is an ecosystem wherein various components should be mutually reinforcing. (This principle is reflected in a conference workshop that a colleague and I led a few years ago.) For example, quantitative assessment may answer some questions while leaving others unanswered or raising new ones. Since qualitative analysis generally reflects the experiences of a limited number of users, it may give rise to questions that need to be probed statistically.
Obviously, I think it makes sense to pay attention to whatever transactional data you can get regarding your website, but this needs to be complemented with other data. In my experience, survey data only goes so far. Usability studies can be really useful, but they can be labor-intensive to conduct. When I’ve done them, I’ve found it helpful to design the protocol around tasks that a user should be able to complete on the website. The usability study overtly tests the website’s ease of use, so any failures reflect on the site rather than on the user recruited to run the test.
Recently, I did a deep analysis of study room reservations submitted through the 25Live system. This included both transactions mediated by staff at the desk and those performed directly by end users who knew how to make reservations themselves. When I had follow-up questions that transactional data couldn’t answer, I contacted known users and got some valuable qualitative insights. I hope these tidbits are helpful in some way.
—Greg
Gregory A. Smith, Ed.D.
Director
Ehrhorn Law Library
(434) 592-4892

Liberty University | Training Champions for Christ since 1971
From: 'Myriam Dupont' via ARL ASSESS <arl-a...@arl.org>
Sent: Wednesday, January 28, 2026 3:45 PM
To: arl-a...@arl.org
Subject: [External] [ARL-ASSESS] Website assessment