This paper was refereed by Weave's peer reviewers.

Abstract

Conducting user research doesn't have to be difficult, time-consuming, or expensive. Your library website can be improved through user research even if you face design restrictions because of a prescribed branding scheme, a shared content management system, or some other constraint. At Simmons College in Boston, we recently performed a user research study that took an in-depth look at the content organization and wording of links on the library homepage. We conducted an in-person paper survey, using paper prototypes, to collect feedback from library users. Based on the research findings, we made significant updates to the library homepage that make it much easier for users to find the information they need.


 
At Simmons College Library, staff members regularly assess the content, organization, and design of the library website based on user feedback. Our most recent user research, conducted in July 2013, was a small-scale study that focused on the organization, wording, and content of the library homepage, and allowed us to enhance the homepage’s usability without overhauling its design and layout. Performing incremental user research enables the library to make small site improvements on a regular basis instead of waiting for a complete site redesign project. By giving our study the precise focus of improving content labeling and organization on the homepage, we made significant usability improvements to the most popular page on the library website without a major resource investment.

Simmons College mandates formal site-wide branding guidelines. The library has limited control over the site design and architecture because the site is housed within a content management system shared by the entire institution. A complete library site redesign was also off the table for some time because the institution had recently begun a campus-wide website redesign project.

These types of restrictions are not uncommon in libraries, especially when the website falls under the umbrella of a larger parent organization. Other libraries might find themselves restricted from using certain images, or constrained by content management system settings, but these are not reasons to forgo user testing. Libraries can still improve usability by focusing on elements of the site that are within their control to change. We decided to examine the potential for updating the content, labels, and organization of links on the library homepage, because these elements were within our control.

The Simmons College Library website was redesigned in 2010 by the Simmons Office of Online Communication and Design. In early 2012, librarians performed extensive think-aloud usability testing with a variety of users; this testing focused on the information architecture and content organization of the entire library site. As a result of the 2012 study, we renamed pages, updated the main navigation hierarchy, and improved content organization. While these changes did improve the site, we still received feedback from patrons that, due to a multitude of links and inconsistent labeling, the library homepage and sidebar were cluttered and confusing. For this reason, we decided to focus the 2013 study on the usability of the library homepage. The library homepage includes a tabbed search box, which is unique to the homepage, and a right-hand sidebar that appears on each page of the website (Fig. 1).

Figure 1

For the tabbed search box, we wanted to learn whether the labeling and order of the tabs should be updated. Since most students access course reserves through the college's learning management system, the “course reserves” tab of the search box was potentially obsolete. On the sidebar, we researched the ordering, grouping, and labeling of links, as well as the placement of the chat button and hours information. All of the homepage segments we researched were completely within our control to update through the campus-wide content management system.

The data informing these changes came from an in-person user research study carried out with a paper survey instrument, using large website screen captures to illustrate the questions. A small group of four library staff members constructed the survey, collected the data, and reviewed the results over the course of a month; the results were presented and changes were made during the following month. The entire time investment was around 12 hours of staff time and 8 hours of student worker time, and the library spent a small amount on snacks as incentives for participation.

After the most recent library website redesign, we performed traditional usability testing in 2012, which focused on the information architecture of the site. Both that testing and comments staff received at the reference desk told us the homepage was overwhelming and disorganized. Janice Redish (2012) writes in Letting Go of the Words: Writing Web Content That Works, “If you try to give equal emphasis to all things for all people on your home page, you'll end up satisfying no one.” The library homepage needed to be streamlined in order to become easier to use.

We were hesitant to take on another significant usability test, but Erica Reynolds (2008) writes, “You shouldn't wait for an entire redesign to implement usability studies. ...the secrets to patron-centered web design are to talk to your patrons and staff members and test, test, test” (p. 6). Steve Krug (2006) likewise reminds us that user research is an iterative process and must be done on a regular basis (p. 135). Performing smaller tests on an ongoing basis would allow us to continuously improve the usability of the site.

When creating this study, we decided to focus on improving the labeling of links, updating the organization of the sidebar "quick links," and reducing the amount of jargon on the library homepage. We saw that cutting out terminology unfamiliar to many of our users could greatly improve the homepage's usability.

Although reducing jargon is a smart goal for any library website, determining how to replace it can be difficult and a frequent source of staff deliberation. Our study therefore focused on collecting direct feedback from current students about the elements of the library homepage. This approach was our way of engaging in participatory design. Participatory design, as described in Participatory Design in Academic Libraries: Methods, Findings, and Implementations, is not limited to website testing, but may be used in other areas of library service design or resource selection as well. Rather than having library staff or other stakeholders determine design or content changes, users are asked directly what they prefer and expect (Foster, 2012, p. 1), and that research then informs the final decisions. In our study we gave users options to select from, rather than asking all open-ended questions, to ensure participants could complete the survey quickly and easily. We asked users to rank the importance of links on the homepage so we could determine what mattered most to them instead of making assumptions about their preferences. In this way, the information collected from users gave us clear reasoning for the changes we proposed to the homepage.

In the study we used a paper survey, supported by visual aids, to collect user feedback about the content and organization of the library homepage. The survey was distributed at the entrance to the library, and participants were asked to fill it out on the spot (it took about 5-7 minutes to complete) rather than take it with them and return it later. We chose to collect the survey in person, rather than online, for two reasons. First, many feedback surveys are distributed to the campus community via email, and the community has been experiencing online survey fatigue, evidenced by declining participation in recent online surveys. Second, we had read about the positive results and ease of execution of walk-up, in-person user testing in recent professional literature, such as “Adding Users to the Website Design Process” by Megan Tomeo (2012) and “The North Carolina State University Libraries Search Experience: Usability Testing Tabbed Search Interfaces for Academic Libraries” by Teague-Rector, Ballard, & Pauley (2011), and were interested in trying this approach. This method was a good fit for our library because it is not resource intensive and enabled us to collect user feedback quickly.

We set a goal of collecting feedback from at least 30 participants. The staff group constructing the study examined sample-size guidelines for other in-person data collection methods, such as card sorting. Although card sorting is a method for testing potential information architecture approaches, it is similar to our survey in that it uses an in-person, non-technological approach to determine user preferences. In his article “Card Sorting: How Many Users to Test,” Nielsen (2004) writes, “There is great variability in different people's mental models and in the vocabulary they use to describe the same concepts. We must collect data from a fair number of users before we can...determine how to accommodate differences among users”; he recommends a minimum of 15 participants, but notes that more would improve the correlation of the results. We doubled that minimum to 30 to ensure there would be at least 15 usable, complete responses and to allow for some variety in the types of participants. In the end we collected 57 complete, usable surveys.

One weakness of the study is that we did not survey a representative sample of users: anyone entering the library while we were conducting the survey could participate. The majority of students at Simmons College are graduate students; as of 2013 there were approximately 1,704 undergraduate students and 2,325 graduate students. There are many library science graduate students at Simmons College, and although we wanted them represented in this study, we want the library website to serve all populations equitably, so their opinions were not weighted differently from others'. However, we do not see a wide discrepancy in how users describe library resources in questions asked at the reference desk, so we did not think this weakness compromised the overall survey results. The major benefit of performing the survey this way was that we saved a significant amount of time by not having to formally recruit students and schedule specific meeting times. Because we engage in small-scale user testing on a regular basis, we can design our next method of inquiry around a representative sample to capture the opinions of users who may have been missed in this survey.

Since one goal of the study was to clarify language used on the homepage, we needed to ask participants about word choice. Staff members who participated in the survey design determined the list of terms to include. We brainstormed the most appropriate terms by reviewing well-designed academic library websites that served similar user populations, and by reviewing the phrases students most commonly used when asking reference questions. During a meeting of the survey design team, we looked at academic library websites (each participant suggested one or two) to generate ideas for terms to evaluate, and a member of the team looked through our reference desk statistics to determine the type of language students use. We also referred to Mark Polger's (2011) article, “Student Preferences in Library Website Vocabulary,” for guidance.

The survey instrument consisted of visual aids showing potential updates to the language and order of links on the library homepage, accompanied by questions about preferred language. After the surveys were collected, a student worker encoded the data in Excel so that the staff committee could examine trends and determine appropriate changes.

The survey instrument consisted of four components:

  1. Preferred terms used on the library homepage
  2. Order of tabs on the homepage search box
  3. Importance of links on the blue right-hand sidebar
  4. Open-ended feedback about the library website

Section 1: Preferred Terms on Homepage

We asked participants to select which term was easiest to understand from a list of related terms. Participants also wrote a short description of what they would expect to find under each choice. It was important for this section to be first so that the rest of the survey questions did not bias the respondents’ answers.

The list of preferred terms from which to select included:

  • "Books & Media" OR "Library Catalog"
  • "Article Search" OR "Databases"
  • "Library Guides" OR "Research Guides" or "Subject Guides"
  • "E-Resources" OR "Online Resources"
  • "Interlibrary Loan" OR "Information Delivery" OR "Materials Request"

Section 2: Order of Tabs on Search Box

Participants then ranked links in order of importance from 1-4, with 1 as most important and 4 as least important. As a reference point, we included a screen capture mock-up of the library homepage search box with the tab labels removed (Fig. 2). The topics included:

  • "E-Resources"
  • "Library Guides"
  • "Books & Media"
  • "Articles"
Figure 2

Section 3: Importance of Links on Sidebar

Finally, participants ranked two sets of links by importance from 1-5, with 1 as most important and 5 as least important. Participants viewed a picture of the library website with a callout to the sidebar as a reference (Fig. 3). We broke the complete list of quick links into two sets, one for resources and one for services, although participants were unaware of the designation. We separated the list because ranking five items from most important to least important is an easier task than assigning rankings to a longer list.

Resources links included:

  • Library Guides
  • Refworks
  • Full-Text Journals
  • Library Catalog
  • E-Resources

Services links included:

  • Chat with a Librarian
  • Group Study Rooms
  • Interlibrary Loan
  • Library Hours
  • Writing & Citing
Figure 3

Section 4: Open Feedback

Participants could then provide open-ended feedback about the library website. The survey provided several prompts, including:

  • "I would like to see..."
  • "I really love..."
  • "I can never find..."
  • "I don't understand..."
  • "Other"

These prompts made it easier for participants to provide focused feedback than a single open-ended question at the survey’s conclusion would have.

Data Review & Outcomes

A detail-oriented student worker organized the survey responses into an easy-to-interpret Excel spreadsheet. When examining the responses, we used the mode to determine the importance of each link that participants were asked to rank. The mode is useful for analyzing categorical data and allowed us to determine which value occurred most frequently in each set of responses (“Measures of Central Tendency,” 2013). For example, in the responses about the tabs of the library homepage search box, the label "Books & Media" was assigned a rank of 1 by 23 participants, more often than any other rank, so we interpreted "Books & Media" as the tab respondents deemed most important.
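
This analysis is straightforward to reproduce programmatically. The sketch below is a minimal Python illustration of how the modal rank for each search box tab can be computed; the sample responses are invented for demonstration (the actual analysis of the 57 surveys was done in Excel):

    from collections import Counter

    # Each row is one participant's ranking of the four search box tabs
    # (1 = most important, 4 = least important). These responses are
    # invented sample data, not the study's actual results.
    responses = [
        {"Books & Media": 1, "E-Resources": 2, "Articles": 3, "Library Guides": 4},
        {"Books & Media": 1, "E-Resources": 3, "Articles": 2, "Library Guides": 4},
        {"Books & Media": 2, "E-Resources": 1, "Articles": 3, "Library Guides": 4},
    ]

    def modal_rank(responses, item):
        """Return the rank most often assigned to `item` and its frequency."""
        counts = Counter(r[item] for r in responses)
        rank, frequency = counts.most_common(1)[0]
        return rank, frequency

    for tab in ["Books & Media", "E-Resources", "Articles", "Library Guides"]:
        rank, frequency = modal_rank(responses, tab)
        print(f"{tab}: modal rank {rank} ({frequency} of {len(responses)} responses)")

One caveat worth noting: the mode can tie, so an analysis like this should inspect the full frequency distribution rather than relying only on the single most common value.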

Although “Books & Media” was deemed the most important search tab, the term “Library Catalog” was strongly preferred over the wording “Books & Media” by participants. The survey designers were surprised at this outcome because we assumed “Books & Media” would be more user-friendly, which is why it was used throughout the survey when asking participants to rank links by importance. Other terms the participants preferred included: Article Search, Research Guides, Online Resources, and Interlibrary Loan (Fig. 4).

Figure 4

When asked to rank the importance of the terms on the search box tabs from most important (number 1) to least important (number 4), the results were (Fig. 5):

  1. Library Catalog
  2. E-Resources
  3. Articles
  4. Library Guides
Figure 5

When asked to rank the importance of resource sidebar links from most important (number 1) to least important (number 5), the results were (Fig. 6):

  1. Library Catalog
  2. E-Resources
  3. Full-Text Journals
  4. Library Guides
  5. Refworks
Figure 6

When asked to rank the importance of service sidebar links from most important (number 1) to least important (number 5), the results were (Fig. 7):

  1. Library Hours
  2. Group Study Rooms
  3. Writing & Citing
  4. Interlibrary Loan
  5. Chat with a Librarian
Figure 7

Other Observations

The open-ended questions, along with the short descriptions participants wrote for their preferred terms, provided additional insight into how users understand the library website. Multiple survey respondents reported that:

  • it is difficult to find a complete list of databases on the library website
  • they don’t understand what “A-Z List” means
  • they would like to see a list of databases organized by subject
  • they don’t understand what “Library Guides” means
  • they expect the Library Catalog to be the main search option
  • the Library Hours and Account Log In should be more prominent on the homepage

Recommended Improvements

The staff committee that created and carried out the survey met to review the findings after the data had been collated and the statistics organized. The committee used a combination of the qualitative and quantitative findings to create a brief proposal outlining recommended changes to the library homepage, including mocked-up illustrations of what the homepage could look like after the changes. There was no user testing of the mockups: we wanted the changes in place for the fall semester, and by the time library administration approved the mockups, the summer session had ended and there were no students on campus. Because the changes were content-focused, the majority could be made by the library's own staff CMS administrators. The following changes were recommended to the library's leadership committee:

1. Update the order and simplify the list of links on the blue navigation sidebar (Fig. 8). The updated sidebar should include the following items in this order:

  • Today’s Hours and a link to “View Complete Hours and Access Restrictions”
  • Library Account Log In
  • Resources
      • Library Catalog
      • Databases (Note: Although the term “Article Search” was preferred in the terminology section, a repeated comment in the open-ended feedback was that it was difficult to find a list of “Databases,” so that term warranted inclusion as a replacement for the link to the “E-Resources A-Z”)
      • Journal Title Search
      • Research Guides
      • Refworks
  • Services
      • Group Study Room Reservations
      • Writing & Citing
      • Interlibrary Loan
      • Course Reserves
      • Chat with a Librarian

2. Update the white tabbed search box on the homepage to better meet student research needs and usability expectations, and to highlight appropriate library resources (Fig. 8). The changes to the tabs should include the following:

  • Catalog - The basic catalog search box will stay the same, but will become the default tab on the homepage.
  • Databases - This tab will contain the quick list of popular e-resources and a link to the full list. We found that students generally do not know the term “e-resources,” and the majority of survey respondents noted they were unable to find a list of databases; we recommend the term “Databases” because we determined that is what users understand.
  • Articles - This tab will contain a search box that searches Academic Search Complete, with explanatory text stating what kind of resource Academic Search Complete is.
  • Research Guides - This tab will contain links to the subject, course, career, how-to, and faculty guides. We found that students do not understand the term “Library Guides” and prefer the term “Research Guides.”
Figure 8

3. Combine the “E-Resources by Subject” guide and the “A-Z List” guide into a single research guide entitled “Databases.” Although reorganizing the database pages was not in the original scope of the research, we received feedback that it was difficult to find a complete list of databases on the library website, and participants also noted that they did not understand what the “A-Z List” was. The new guide will contain drop-down menus of resources organized by subject on the first tab, and resources arranged alphabetically on the other tabs (Fig. 9).

Figure 9

Conclusion & Discussion

We made the recommended changes to the library website in August 2013 after receiving approval from library administration. In the first month after the changes, we received no reference questions or complaints about the new homepage. We also reviewed the Google Analytics for the library homepage and observed a decrease in the average time spent on the page: in March 2013, before the changes, the average time on the homepage was 4:57; in March 2014, after the updates, it had dropped to 4:27. As usability work is an iterative process, we plan to re-assess the homepage in 2014 after the debut and adoption of the new discovery platform.
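
To put that change in perspective, converting the two durations to seconds shows a drop of roughly ten percent. The following is a small illustrative calculation (not part of the study's own analysis):

    def mmss_to_seconds(duration: str) -> int:
        """Convert an 'm:ss' duration string, as reported by Google Analytics, to seconds."""
        minutes, seconds = duration.split(":")
        return int(minutes) * 60 + int(seconds)

    before = mmss_to_seconds("4:57")  # March 2013, before the changes
    after = mmss_to_seconds("4:27")   # March 2014, after the changes
    change = (before - after) / before
    print(f"Average time on homepage fell {before - after}s ({change:.1%})")  # 30s (10.1%)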

The project’s success is also due to the focus we placed on user preferences and understanding. By grounding the survey design and proposed changes in a participatory design process, we were able to use concrete findings to support the changes we proposed and eventually made to the site, which minimized the influence of library staff opinion on the site.

We found that a paper survey is an easy and inexpensive way to collect student feedback. It proved to be a great way to reach out to patrons and find out what they think about library issues, and the visual aids made it easy to illustrate specific sections of the webpage. A library does not need endless financial resources to conduct a survey like ours; it only needs staff who are willing to develop and execute this kind of plan. Although we offered small snack incentives, many users refused them and were simply happy to provide feedback for “free.”

By focusing the study on changes we could easily make to the library homepage within the parameters of the campus-wide CMS, it was much easier to implement changes once the study's results were analyzed. Although it may be tempting to perform a usability test and use the results to try to convince the campus webmaster (or whichever group is the ‘gatekeeper’ of your site) that a total redesign is needed, we found it much more useful to focus on attainable goals for the library homepage.

The library staff at our institution is lean, and we do not have any earmarked funding for usability studies. In terms of resources and budget, we used what we had available so as not to place insurmountable constraints on the study. The survey was conducted during the summer term, and the committee was made up of staff volunteers interested in engaging in user research. The snacks offered as incentives were funded from the library’s small marketing budget.

Even if a library has minimal control over the design or layout of its website, as at Simmons College Library, there are almost always small changes that can be made to improve the user experience. Although we do not have total control of our site's design and layout, we still want to improve usability in the areas we do control. Whatever the limitations on your library's website, you can use user experience research methods to collect feedback and transform user opinions into tangible changes that improve the website experience and, ultimately, the library experience.

References

  • Foster, N. F. (2012). Introduction. In N. F. Foster (Ed.), Participatory design in academic libraries: Methods, findings, and implementations (pp. 1-3). Washington, D.C.: Council on Library and Information Resources. http://www.clir.org/pubs/reports/pub155/pub155.pdf
  • Krug, S. (2006). Don't make me think. Berkeley, CA: New Riders.
  • Measures of central tendency. (2013). Laerd Statistics. https://statistics.laerd.com/statistical-guides/measures-central-tendency-mean-mode-median.php
  • Nielsen, J. (2004). Card sorting: How many users to test. http://www.nngroup.com/articles/card-sorting-how-many-users-to-test/
  • Polger, M. (2011). Student preferences in library website vocabulary. Library Philosophy & Practice, 69-84.
  • Redish, J. (2012). Letting go of the words: Writing web content that works (2nd ed.). [Books24x7 version] Available from http://0-common.books24x7.com.library.simmons.edu/toc.aspx?bookid=51014
  • Reynolds, E. (2008). The secret to patron-centered web design: Cheap, easy, and powerful usability techniques. Computers in Libraries, 28(6), 6-47.
  • Teague-Rector, S., Ballard, A., & Pauley, S. (2011). The North Carolina State University Libraries search experience: Usability testing tabbed search interfaces for academic libraries. Journal of Web Librarianship, 5(2), 80-95.
  • Tidal, J. (2012). Creating a user-centered library homepage: A case study. OCLC Systems & Services, 28(2), 90-100.
  • Tomeo, M. (2012). Adding users to the website design process. Public Services Quarterly, 8(4), 350-358.