MSU Main Library’s Digital Signage
User Testing & Developing Wayfinding Systems
Determining the Client’s Expectations
We were asked by the MSU Libraries User Experience Team to research and develop ways of
improving the digital wayfinding and signage throughout the Main Library. Our research focused on,
but was not limited to, the following:
- Who is using the system and why / why not?
- What are the pain points of using the system?
- Does the UI offer the information users want most, and in a clear way?
- What do target users actually want from the system?
- How does the digital system translate to the physical spaces?
- What are the main objectives and motivations for using the system?
Research Findings
Interviews / Surveys
We surveyed 25 randomly selected students and faculty members, then personally interviewed 5 of
them. Our results indicated that the kiosks were exhibiting some of the problems we had initially
anticipated. Many of the participants did not spend much time at the library and therefore had no
need for a wayfinding system. Additionally, few people actually struggled to find resources in the
library, and fewer than 25% of the participants had ever used a kiosk. However, despite the majority
of people having no use for a wayfinding aid, there was still a substantial population that could
benefit from the kiosks. Our research made it apparent that people were using the kiosks as intended,
but it was also clear the user experience needed many improvements. The more striking data point was
that over half of those surveyed did not realize the displays in the library were interactive. This
was something we felt the library should act on immediately. We estimated that adding some simple
touch-friendly signage could increase kiosk usage by roughly 50%.
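As a back-of-the-envelope sketch of that estimate, the arithmetic below uses concrete counts consistent with our survey percentages; the conversion rate for newly aware visitors is an assumption for illustration, not a measured value:

```python
# Back-of-the-envelope projection of kiosk uptake from our survey numbers.
# Counts are illustrative values consistent with the percentages above;
# the conversion rate is an assumption, not measured data.

surveyed = 25
current_users = 6            # 6/25 = 24%, consistent with "fewer than 25%" had used a kiosk
unaware = 13                 # 13/25 = 52%, consistent with "over half" unaware of interactivity
assumed_conversion = 0.25    # assumed share of newly aware visitors who would try a kiosk

new_users = unaware * assumed_conversion
increase = new_users / current_users

print(f"Projected new users: {new_users:.1f}")
print(f"Projected usage increase: {increase:.0%}")   # roughly 54% under these assumptions
```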
Card Sorting
Card sorting gave us a better idea of how users would organize information without being
pre-exposed to the interface. From our card sorting, we gathered that some of the information displayed
on the kiosks wasn't where users expected it to be. The arrangement of the cards gave us insight
into which aspects of the kiosks would benefit from more emphasis, as well as where items should
be nested.
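As a sketch of how card-sort data like ours can be analyzed, a pairwise co-occurrence count shows how often two cards land in the same group; the participant groupings and card names below are illustrative stand-ins, not our actual study data:

```python
from itertools import combinations
from collections import Counter

# Illustrative card-sort results: each participant's groupings of kiosk content.
# Card names here are hypothetical stand-ins, not our actual study data.
sorts = [
    [{"Floor Maps", "Room Finder"}, {"Hours", "Events", "Staff Directory"}],
    [{"Floor Maps", "Room Finder", "Staff Directory"}, {"Hours", "Events"}],
    [{"Floor Maps", "Hours"}, {"Room Finder", "Events", "Staff Directory"}],
]

# Count how often each pair of cards was placed in the same group.
co_occurrence = Counter()
for participant in sorts:
    for group in participant:
        for pair in combinations(sorted(group), 2):
            co_occurrence[pair] += 1

# Pairs grouped together most often suggest where items should be nested.
for pair, count in co_occurrence.most_common():
    print(f"{pair[0]} + {pair[1]}: {count}/{len(sorts)} participants")
```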
Contextual Inquiry
What We Did // Participants were given either a picture of a book and its call number, or the
name of a room in the library. They were instructed to use one of the kiosks to help them find the
book or room, explaining their thought process as they used the system. We recorded their actions,
taking care not to interfere with their decision-making process.
Good // In both cases our participants were able to complete their task; however, there were
several moments where they were frustrated or confused. The participant trying to locate the book
was frustrated by the speed of the map animations. Tuning the animation timing would be an easy fix
that could noticeably improve the user experience. Additionally, by the time the participant made it
up the stairs, he had already started to forget where the map had told him to go, which showed how
difficult it can be to locate an item far from the kiosk. The biggest struggle for the second
participant, who was instructed to find a specific room, was scrolling through the long list of
locations. Once he found the listing, he navigated his way to the room effortlessly.
Better // Both participants said they were not fans of the list system. They felt the ability
to quickly type in a call number or room name would be substantially faster than scrolling through
lists. While the system does work, it is clear there are a few simple pain points that should be
addressed. Additionally, we noticed the kiosk on the 4th floor only allowed users to search for
items on that floor; it could not help locate items anywhere else in the library. The kiosks on the
1st and 4th floors look identical but do not behave identically, which could create an inconsistent
user experience and cause frustration or confusion.
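A minimal sketch of the consistency check we had in mind, assuming a simple per-kiosk configuration; the field names here are hypothetical, not the kiosk software's actual settings:

```python
# Hypothetical per-kiosk configuration; field names are our illustration,
# not the actual kiosk software's settings.
kiosks = {
    "floor_1": {"search_scope": "whole_library"},
    "floor_4": {"search_scope": "current_floor"},  # the inconsistency we observed
}

# Identical-looking kiosks should behave identically; flag any that differ.
scopes = {cfg["search_scope"] for cfg in kiosks.values()}
if len(scopes) > 1:
    print("Inconsistent search scopes across kiosks:", scopes)
```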
Heuristic Evaluation
What We Did // We critically evaluated the digital touchscreen, taking detailed notes. Following
standard protocol, we then re-evaluated our findings against Nielsen's 10 usability heuristics
(a sketch of recording such findings follows the list):
- Visibility of system status
- Match between system and the real world
- User control and freedom
- Consistency and standards
- Error prevention
- Recognition rather than recall
- Flexibility and efficiency of use
- Aesthetic and minimalist design
- Help users recognize, diagnose, and recover from errors
- Help and documentation
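Here is a minimal sketch of how findings from such an evaluation can be recorded and prioritized, using Nielsen's common 0-4 severity scale; the issues are drawn from our findings below, but the severity ratings are illustrative:

```python
from dataclasses import dataclass

# Nielsen's severity scale: 0 = not a problem ... 4 = usability catastrophe.
@dataclass
class Finding:
    heuristic: str
    issue: str
    severity: int  # 0-4, ratings here are illustrative

findings = [
    Finding("Error prevention",
            "Searches with no match default to the A-C call numbers", 3),
    Finding("Help users recognize, diagnose, and recover from errors",
            "No error message is shown for unmatched searches", 3),
    Finding("Consistency and standards",
            "Feedback screen shows a web link that is not tappable", 2),
]

# Sort the worst problems to the top for the client report.
for f in sorted(findings, key=lambda x: x.severity, reverse=True):
    print(f"[{f.severity}] {f.heuristic}: {f.issue}")
```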
Good // Overall, the screens followed MSU brand standards and were consistent with the rest of
the MSU Libraries' media. The information was accessible, and an obvious “escape exit” in every
corner took users back to the homepage. The homepage showed the MSU Spartan icon, a title, a
“welcome” message, and the weather, date, and time in the header, with a scrolling announcement in
the footer, which gave a welcoming yet modern feel similar to a mobile app.
Better // There were some concerning glitches in the touchscreen interface. When a search for a
location or call number had no matching option, the interface defaulted to the A-C call numbers.
This was confusing; ideally it would not happen at all, or at worst an error message would be shown.
The map was useful, but it would be much more useful if directions were given. The feedback screen
needed to be redesigned: it directed users to the website, but the link was not actually tappable.
A QR code or a quick survey form would be more useful both for users and for the library's data
collection.
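As a sketch of that suggestion, a QR code for the feedback screen can be generated with the third-party qrcode package; the URL below is a placeholder, not the library's actual feedback form:

```python
# Sketch: generate a QR code pointing to a feedback form, so kiosk users can
# open the survey on their own phones. Requires: pip install qrcode[pil]
import qrcode

# Placeholder URL: the library's actual feedback form would go here.
FEEDBACK_URL = "https://example.org/library-kiosk-feedback"

img = qrcode.make(FEEDBACK_URL)     # returns a PIL image
img.save("kiosk_feedback_qr.png")   # ready to place on the feedback screen
```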
Usability Test
What We Did // We created a list of tasks for participants to complete on the touchscreen,
guided by the following questions:
- How easy are the tasks to complete?
- Which tasks are most difficult to complete?
- What frustrates the user?
- How long does it take the participant to complete each task?
- How does the user interact with the touchscreen?
- If prompted to repeat something, does the user remember?
- How accurately does the user complete the tasks?
- What works well on the touchscreen?
- What parts of the touchscreen need improvement?
- How helpful is the interface to the participant?
- Once the participant understands how the system works, do they seem more efficient and comfortable?
We then created a task list for the participant to complete (a sketch of per-task logging follows the list):
- Find an area of the library based on call number.
- Find an area of the library based on room name.
- Find a specific area on campus.
- Find information on a specific staff member.
- Reserve a room.
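Here is a minimal sketch of how per-task timing and success could be logged during such a session; the task names mirror the list above, and the logging helper is our own illustration, not part of any kiosk software:

```python
import time

# Simple per-task logger for a moderated usability session.
results = []

def run_task(name: str) -> None:
    input(f"Press Enter when the participant starts: {name}")
    start = time.monotonic()
    answer = input("Type y/n and press Enter when the task ends (succeeded?): ")
    results.append({
        "task": name,
        "seconds": round(time.monotonic() - start, 1),
        "succeeded": answer.strip().lower().startswith("y"),
    })

for task in [
    "Find an area of the library based on call number",
    "Find an area of the library based on room name",
    "Find a specific area on campus",
    "Find information on a specific staff member",
    "Reserve a room",
]:
    run_task(task)

for r in results:
    print(f"{r['task']}: {r['seconds']}s, succeeded={r['succeeded']}")
```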
Good // One notable thing that went really well in this test was the process of reserving a
room; the user felt it was easy to do and finished quickly. The task of finding a room by location
also went well for our participant, and she noted that the process of finding a room by name went
smoothly as well. The touchscreen was helpful in these cases.
Better // We believed the interface could be greatly improved through changes to the navigation
and to the placement of items. It is important that users know where to go on the screen in order
to complete their task. We removed some of the less useful tabs and made sure the correct content
was where users expected it to be. Our user became disheartened quickly when she could not find
what she was looking for.
Project Delivery
After providing the client with a comprehensive UX analysis of the current systems in place, we opted
to create a redesigned user interface that reflected what we learned from our research. Our redesigned
screens were imported into Adobe XD, where we built a functioning prototype for the client.
Check out the high-resolution redesigned screen samples below and watch
this video to learn more.