How to improve a website’s navigation through usability evaluation
The Archivists Round Table of Metropolitan New York, Inc. (A.R.T.) is a not-for-profit organization representing more than 400 archivists, librarians, and records managers in the New York metropolitan area. They were looking for ways to consolidate information so that it is more easily discoverable, findable, and accessible.
Our team conducted a moderated remote usability test to evaluate the website’s current navigation and content for first-time or inexperienced users. Through this study, we identified several opportunities to improve A.R.T.’s home, about, and events pages.
The challenge
To evaluate the satisfaction, effectiveness, and efficiency of the current user experience, we focused on the following questions:
Does the A.R.T. website clearly explain who they are and what they do?
Do visitors understand how to get engaged in the A.R.T. community and participate in related events and programs?
Is the current user experience usable on both mobile and desktop devices?
Our Goals
To understand:
What information users seek out to learn more about the organization
Areas of friction when trying to get involved with the organization
How users navigate the site on mobile and desktop devices
The Team
Danielle Kingberg
Yi Zhong
Yuki Shimano
My Role(s)
Project Manager - I took the initiative to facilitate the client engagements. I fostered face-to-face and virtual collaborative discussions documented in Google Docs, provided structure and methods to efficiently synthesize data in Miro and Google Sheets, and helped the team coordinate communications and moderated sessions.
UX Researcher - I helped plan and conduct the usability tests, moderating two and observing two sessions on mobile or desktop devices. I worked with the team to analyze results and surface significant usability issues, then to prioritize and consolidate actionable recommendations.
Reporting - I assisted in creating and delivering a presentation for the client, with compelling videos of highlighted findings along with mockups of our recommendations.
How we designed the usability test
A.R.T.’s audience was defined as prospective archiving students or archiving enthusiasts. In alignment with the goals and key questions, we created a test with the following parameters:
Participants
Nine participants were selected based on the master’s degree they were pursuing at Pratt’s School of Information:
Library and Information Science (MSLIS)
Museums and Digital Culture (MS)
Dual Degree in History of Art and Design (MSLIS/MA)
and on whether they frequented museums at least five times per month (including pre-COVID).
Scenario(s)
We thought about why new members may come to the site and created a different scenario within the context of each task.
Tasks
Scroll through the page without clicking on anything. What are your first impressions?
To help us understand users’ first impressions of the site.
Find an event you would be interested in attending
To understand the ease of learning about events. We asked additional probing questions: Was the information on the events page helpful? Is there anything missing that you would want to ask the contact?
You met a member from Princeton University at another event, but forgot to get their contact information. Use the website to contact the member.
To gauge the ease of searching for existing members to network and/or find a job. We also asked if this experience was what they expected to understand how the content supported their needs.
You’re interested in progressing your career. Use the website to learn about the mentorship programs available.
To see how effective the content is for new members to start or further their career. To further understand user needs, we asked additional questions during the task such as:
Is this experience what you expected?
Based on this information, how likely is it that you would sign up for the mentorship program? (1 - very unlikely, 5 - very likely)
Note: To measure usability in tasks 2-4, we consistently asked the following questions after each task:
Do you feel you have completed this task successfully?
On a scale of 1-5, with 5 being very easy, how would you rate this task?
Post Test Questions
After all tasks had been attempted, we asked a few more questions to understand their impressions of the overall experience and the website compared to competitors:
How does this experience compare to other archive websites that you’ve used? (1 - worse, 5 - better) Why?
Which of the following did you encounter as you were using the site? (Choose all that apply)
Hard to navigate, too much scrolling, not well organized, too much reading, text too small, too much clicking, none of the above
What frustrated you most about this site?
What did you like most about this experience?
Did this activity improve your understanding of what archivists do? If not, what would help you to better understand their role?
Is there anything else that you would like to share or discuss?
[mobile only] How was the overall experience with this website compared to others on your smartphone?
These post-test questions were particularly helpful for gathering qualitative feedback.
Analyzing Data
We used Miro to aggregate our findings for synthesizing and surfacing trends.
Initial Insights
34% of observations were positive or neutral — mostly related to the page layout.
Only 35% of the observations were ranked at a severity level of 3 or 4 according to Nielsen’s severity ratings.
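To show how these proportions can be tallied, here is a minimal, hypothetical TypeScript sketch that counts observations by Nielsen severity rating; the Observation interface and the sample data are illustrative only, not our actual study data.

```typescript
// Hypothetical sketch: tallying usability observations by Nielsen severity rating.
// The sample observations below are illustrative only, not our actual dataset.

interface Observation {
  note: string;
  severity: 0 | 1 | 2 | 3 | 4; // Nielsen's scale: 0 = not a problem, 4 = usability catastrophe
}

const observations: Observation[] = [
  { note: "Liked the overall page layout", severity: 0 },
  { note: "Repetitive links on the homepage", severity: 3 },
  { note: "Search results did not match the query", severity: 4 },
];

// Share of observations rated severity 3 or 4 (major problem or worse).
const major = observations.filter((o) => o.severity >= 3).length;
const share = Math.round((major / observations.length) * 100);
console.log(`${share}% of observations were rated severity 3 or 4`);
```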
Going Well
1. Participants saw many opportunities to engage, which gave them a good sense of what archivists do.
2. The website made good use of photos that illustrate what the organization is about.
3. Comparatively, the A.R.T. website is not as busy as other library websites.
Needs improvement
Overall Findings
Content appears too long and hard to navigate on the homepage
Inconsistent search bar purpose and results
Missing information on ways to get involved
Minor content fixes
I focused on the first two findings in this study.
Why did users have difficulty in navigating A.R.T.’s homepage?
The size of the graphics and the gray headers made it hard for participants to determine where to focus their attention. They also expressed frustration around repetitive information linked from the homepage (see figure 1).
Two participants indicated, “[most headers] on the main page directed to a second page that had the same content shown on the main page.”
67% of participants suggested better organization in the page layout (segmented headings, more brief descriptions, and better use of space) would make it easier to identify relevant information.
Mobile participants found the information too lengthy and hard to read through, stating, “since the mobile version doesn't have ‘Control + F’ key, I can't easily find things.”
The navigation menu was also difficult for mobile users. One participant stated, “I thought the menu only had ‘Home’ and ‘About’ at first!”
How can the A.R.T. make it easier to navigate through content on desktop and mobile?
Homepage layout recommendations:
Adopting a content card format, grouped and segmented by topic area, so users can easily understand each item’s purpose and priority. Each card would have a small image, a title set apart by color and bold text, and a brief summary of the topic with a hyperlink to more information (see the sketch after this list).
Adding a column for time-sensitive or highlighted topics to make better use of the margins.
Making the logo the home button to further reduce page redundancy.
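To make the card recommendation more concrete, here is a minimal sketch of how a homepage content card could be modeled and rendered; the ContentCard interface, element ID, and example card are hypothetical and not taken from the A.R.T. site.

```typescript
// Hypothetical sketch of the recommended content card: a small image, a distinctive
// title, a brief summary, and a single link to the full page (all names are illustrative).

interface ContentCard {
  title: string;
  imageUrl: string;
  summary: string;
  href: string; // links to the detail page instead of repeating its content
}

function renderCard(card: ContentCard): string {
  return `
    <article class="content-card">
      <img src="${card.imageUrl}" alt="" class="card-image" />
      <h3 class="card-title">${card.title}</h3>
      <p class="card-summary">${card.summary}</p>
      <a href="${card.href}">Learn more</a>
    </article>`;
}

// Example: one card per topic area, grouped on the homepage.
const eventsCard: ContentCard = {
  title: "Events",
  imageUrl: "/images/events-thumbnail.jpg",
  summary: "Monthly programs, workshops, and networking opportunities.",
  href: "/events",
};
document.querySelector("#homepage-cards")?.insertAdjacentHTML("beforeend", renderCard(eventsCard));
```

Linking each card to its detail page, rather than duplicating that page’s content, also addresses the redundancy participants flagged on the homepage.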
Overall layout recommendations:
Implementing a more responsive, modular format with a:
Home icon for quick access to the homepage
Search icon for quick navigation to key topics
Navigation Menu icon to access all tabs from a dropdown menu
In combination, these changes will improve navigation for mobile users; a minimal sketch of the pattern follows.
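As a rough illustration of this pattern, the sketch below keeps the home and search icons always visible and collapses the full menu behind a menu icon on small screens; the element IDs and breakpoint are assumptions, not the site’s actual markup.

```typescript
// Hypothetical sketch: collapse the full navigation menu behind a menu icon on small
// screens. The element IDs and the 768px breakpoint are assumptions for illustration.

const menuButton = document.querySelector<HTMLButtonElement>("#menu-icon");
const navMenu = document.querySelector<HTMLElement>("#nav-menu");
const mobileLayout = window.matchMedia("(max-width: 768px)");

function applyLayout(isMobile: boolean): void {
  // On mobile the menu starts collapsed; on desktop it is always visible.
  if (navMenu) navMenu.hidden = isMobile;
}

applyLayout(mobileLayout.matches);
mobileLayout.addEventListener("change", (e) => applyLayout(e.matches));

// Tapping the menu icon opens or closes the dropdown containing all tabs.
menuButton?.addEventListener("click", () => {
  if (navMenu) navMenu.hidden = !navMenu.hidden;
});
```

Keeping all tabs reachable from the dropdown addresses the participant who initially thought the menu only contained “Home” and “About.”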
Why did users have difficulty using the search feature?
44% of participants used the search bar to find information about topics ranging from mentorship programs to contact information of A.R.T. members.
11% of participants successfully found information they expected.
The A.R.T. member directory was not easily accessible to non-member users; participants suggested adding advanced filters to the search bar to help them understand what information they could find.
How can A.R.T. make the search bar more user-friendly?
While adding advanced filters could improve the usability of the search feature, a similar effect could be achieved with type-ahead suggestions and drop-down lists that inform the user of the information they can search for.
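As a rough sketch of the type-ahead idea, the example below filters a hypothetical list of searchable topics as the user types and displays the matches as suggestions; the topic list and element IDs are assumptions, not the actual site structure.

```typescript
// Hypothetical sketch: type-ahead suggestions that show users what they can search for.
// The topic list and element IDs are illustrative, not the actual A.R.T. site structure.

const searchableTopics = [
  "Events calendar",
  "Mentorship program",
  "Member directory",
  "Membership benefits",
  "About A.R.T.",
];

const searchInput = document.querySelector<HTMLInputElement>("#site-search");
const suggestionList = document.querySelector<HTMLUListElement>("#search-suggestions");

if (searchInput && suggestionList) {
  searchInput.addEventListener("input", () => {
    const query = searchInput.value.trim().toLowerCase();
    // Show only the topics matching what the user has typed so far.
    const matches = query
      ? searchableTopics.filter((topic) => topic.toLowerCase().includes(query))
      : [];
    suggestionList.innerHTML = matches.map((topic) => `<li>${topic}</li>`).join("");
  });
}
```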
Conclusion
A.R.T. was very appreciative of our recommendations. They mentioned how membership benefits are more important than ever, as archivists in the area are looking to build out their networks due to furloughs and untimely layoffs from the recent pandemic. They are looking forward to discussing the report and presentation with their board of directors to implement our recommendations.
Lessons learned
More pilot testing would have helped us frame the tasks to be more concise and open-ended, putting more focus on exploring information rather than navigation. The insights we compiled were good, but we could have surfaced more about the information itself if the tasks had been more focused on reading through the website’s content.
Participants were more authentic in their feedback after the following statement was added to the script: “We did not create this website ourselves, we are just conducting the usability test for it.” Students were much more comfortable with providing constructive feedback when they realized our team was only performing evaluative research and did not design the website.
Because we tested remotely, we ran into system errors with Validately (UserZoom Go’s mobile app), which resulted in most participants only being able to perform the test on desktop devices. Overall, this approach provided us with qualitative insights backed by evidence from 2 mobile participants and 7 desktop participants.
Opportunities for further research
Navigation — conducting card sorting and tree testing to help assess how users group and navigate the website to find information and reduce the number of clicks to access desired content.
Member experience — exploring the signup or login experience for members vs. non-members to understand what information A.R.T. would like to collect from users (e.g., photo, email address, etc.) and, similarly, what information member users vs. non-member users can access (e.g., what they can see about other members, events, or programs).
Mentorship program information — conducting interviews to understand the benefits that mentees or mentors are looking for in the mentorship program and updating the program page accordingly. Similarly, it would be worth exploring which networking activities members are interested in.
Outcomes
A.R.T. updated their “About” page to make it easier for users to understand their mission and how to get involved.
References
Peterson, Clarissa. “Mobile and Beyond.” In Learning Responsive Web Design: A Beginner’s Guide.
“Severity Ratings for Usability Problems: Article by Jakob Nielsen.” Accessed September 30, 2020. https://www.nngroup.com/articles/how-to-rate-the-severity-of-usability-problems/?lm=how-to-conduct-a-heuristic-evaluation&pt=article.
“Font Size Guidelines for Responsive Websites (2020 Update).” Accessed December 7, 2020. https://learnui.design/blog/mobile-desktop-website-font-size-guidelines.html.