Introduction
In Australian universities, approximately 90 percent of referencing guides—also known as citation guides—are authored and managed by libraries as opposed to other university departments (see appendix A). However, there is limited research available on best practices for designing guides of this type. At the University of Notre Dame (UNDA) Library, our referencing LibGuide (https://library.nd.edu.au/instruction/referencing/apa7) was the second-most-visited area of the library website. Beyond usage data, however, students’ experiences while using the guide were unknown. The numbers did not confirm that students found the content easy to interpret, nor the structure intuitive to navigate. Considering the impact referencing skills can have on student grades, we felt an investigation into students’ experiences, using an evidence-based approach, would benefit users and inform further refinement of the guide.
Literature Review
The “Framework for Information Literacy for Higher Education” acknowledged that we work and live in a “dynamic and often uncertain information ecosystem” (Association of College & Research Libraries, 2015, p. 7). Students confront this uncertainty when they interpret style rules for citing sources in their coursework. Citation is a complex activity that relates to several aspects of the “Framework,” including evaluating the source (Authority is Constructed and Contextual), understanding the context (Information Creation as a Process), and giving credit to the creator (Information Has Value).
Students may be expected to master referencing before commencing undergraduate study (Peters & Cadieux, 2019), although such instruction may be uneven. In a study on freshman students’ experiences of academic literacy, 61 percent of respondents reported that they had received no instruction on citation and referencing in high school (Hossain, 2022). Even professionals struggle with the intricacies of citation: reference list errors are common across disciplines (Unver et al., 2009), even within library science (Genzinger & Wills, 2017). Gravett and Kinchin depicted referencing as an “obscure academic practice” for students (2020b, p. 377), a process of compliance both alienating and stressful (2020a). In a research project on student perceptions of referencing, Neville (2010) found that 75 percent were critical of the practice, either disliking it outright or expressing concern, anxiety, or difficulty with the process.
While official sources of citation guidance such as the Publication Manual of the American Psychological Association (APA) are available and have been much improved in recent editions, they can still intimidate novices. Van Note Chism and Weerakoon (2012) observed that even postgraduates tended to avoid using the APA Manual, describing it as “dense” and “overwhelming.” Kupersmith’s (2012) analysis of the literature on library terminology revealed that the terms periodical, reference, and database were among the most misunderstood by users; yet these terms, along with other academic jargon, commonly appear in the APA Manual. Possibly in response to these obstacles, university libraries create in-house guides, providing both instructions and examples of correct citation practice.
Best Practices in the Design of Library Guides
Libraries often use LibGuides to create subject, topic, or instructional guides, with over 456,500 LibGuides in existence as of May 2022 (Springshare, 2022). In Australia, 45 percent of university libraries used LibGuides to provide referencing information (see appendix A). UX practices, including usability testing, can help to evaluate the use of such guides by revealing the complexity and diversity of information-seeking practices (Priestner & Borg, 2016). However, Logan and Spence (2021) found that 74 percent of libraries do not test their LibGuides for usability. Castro Gessner et al. (2015) suggested that many librarians rely on faith that their guides will be used as intended. Schmidt and Etches (2012) advised that to improve a website “you need to watch people interact with it” (p. 33).
In developing a UX program at Duke University Libraries, Daly et al. (2017) recounted how they moved from “a culture in which staff made decisions based on instinct or isolated experience to a responsive culture with students’ opinions and feedback at its center” (p. 268). However, Marquez and Downey (2015) observed that libraries tend to develop services in isolation, both from other library services and users themselves. As a result, librarians may be inclined to design guides based on their own mental model and work processes, not their users’ (Welker, 2016).
Along with conducting usability tests, libraries can follow established best practices for LibGuide design. At UNDA, we agreed with the findings of Oulette (2011) that students prefer clean, simple guides that provide information quickly and effectively. We also followed Welker’s (2016) advice on managing cognitive load by reducing text-heaviness and tab numbers on each page. Comprehensive literature reviews of LibGuide best practices (Goodsett et al., 2020; Bergstrom-Lynch, 2019) recommended consistency and clarity above all: employing guide templates and standards, avoiding jargon, decluttering guides, and “chunking” content. Like many other university libraries, UNDA adhered to this advice in our local best practice guidelines.
Design of Citation Guides
However, literature on the specific requirements of citation guide content and design is limited, despite the ubiquity of librarian-authored citation guides. Investigations into other forms of specialist LibGuide content are not uncommon; we found studies on LibGuides for East Asian studies (Chen & Chen, 2015), geology (Dougherty, 2013), aviation (Horton, 2022), and COVID-19 (Fraser-Arnott, 2020), among others. Many have written about the challenges for students in citing complex, born-digital sources (Gibbs et al., 2015; Gray et al., 2008; Greer & McCann, 2018), which reinforces the importance of making sure content is up to date and relevant.
The research we found to be closest in alignment with our project was that of Gibbs et al. (2015), who described their process of designing a referencing tool that helped students classify the resource they were citing. This “Roadmap to Referencing” tool added an interactive layer over their traditional, static citation guide to help students understand information sources. While we agreed that a similar tool would be useful for our students, we lacked the resources and personnel needed to make it possible.
Stevens (2016) recounted an investigation into student use of Purdue University’s Online Writing Lab in formatting citations. An in-class assessment activity found that students had difficulty not only in navigating the lab to find citation examples but also in applying them to their own work. Park et al. (2011) described the challenges of providing citation assistance to university students in an environment where librarians had become “citation vending machines” (p. 42): the difficulty of classifying sources, official citation manuals failing to provide modern examples, and differing standards among faculty members. While these articles’ findings all served as evidence of the need for well-designed citation instruction and help resources, the authors did not examine the design and content of citation guides in detail. Based on these gaps in the literature, there is a need to more closely examine the UX of citation guides.
Aims
Our main objectives in this research project were to determine whether:
users could efficiently locate the guide via the university and library websites.
the headings in the guide were clear and intuitive.
users could efficiently locate specific referencing examples within the guide.
users could effectively use the guide to identify how to reference a given resource.
Background
In 2016, the UNDA library carried out a focus group study investigating the referencing process and user experiences of the referencing guide (Dawe et al., 2021). Students and faculty met in separate groups of three to six people; overall, 35 students and 14 faculty members participated.
The focus group discussions revealed the citation process to be challenging in different ways for students and faculty. The faculty focus groups described the difficulty of conveying ideas about referencing as an ethical practice, rather than merely a set of formatting procedures, while maintaining consistent approaches in teaching. The student groups in turn described negative, even punitive, communication from faculty that left them feeling intimidated and confused. Students received mixed messages about referencing standards in the classroom or as part of assignment feedback. They also mentioned the challenges of citing online sources that didn’t fit easily into a given category.
When it came to feedback about the guide specifically, the discussion yielded fewer insights. We learned that students appreciated color-coded citation examples and the sheer number and variety of examples that the guide provided. However, it appeared that students often found it difficult to locate the type of information they needed or didn’t know what they needed from the guide at all. Comments tended to be positive but vague, leaving us with little information to make improvements.
Two years later, a member of the research team attended a presentation by library researcher Hollie White, who recounted a user experience (UX) research project she had conducted on a library website in her previous role at Duke University (2018). White described the recruitment and research process of “lobby testing,” in which researchers recruit students as they enter the building, lead them to a nearby table, and observe them completing pre-determined tasks on a laptop. White emphasized the value of observing users in action rather than relying on self-reporting. The simplicity and quickness of this process was appealing to our team member, who recalled the time- and resource-intensive nature of the focus group recruitment in our previous study (Dawe et al., 2021).
After attending the presentation, we read Daly and White’s (2015) project report, which included a test script and task details. While the sample size was small, the results gave enough information for the library team to take action in revising aspects of the website to suit student needs. At the University of North Carolina, Arnold et al. (2016) followed Daly and White with a similar library UX project related to resource discoverability. Both these projects revealed a procedure that was inexpensive, “painless,” and quick to set up, with Arnold et al. noting, “This method of testing will be practical for us to do at least 2–3 times each semester” (2016, “Next steps,” para. 1). These two UX studies were the impetus for our team in designing this project.
We employed an exploratory user testing process outlined by Rubin and Chisnell (2008). This involves testing a prototype site to see if basic functions, such as navigation, layout, and language, are sound before further development ensues. We designed our prototype based on the focus group comments gathered in our previous study (Dawe et al., 2021), and we incorporated new product features available in LibGuides (Springshare).
We adjusted terminology in favor of plain language, for example “Multimedia” became “Video & Audio,” and “Web Resources” became “Internet & Social Media.” Rather than long pages that relied on users to scroll or use in-page hyperlinks, we presented the content more laterally in tabbed boxes. Headings within pages emphasized type of content (for example Newspapers, Magazines, Audio) rather than type of element (title, edition, URL) as in the previous guide (see fig. 1). We added more examples to reflect common referencing queries, particularly in the Internet & Social Media category, for example PDFs, government documents, and reports. Information on secondary source citation was more prominent, as this had been a standout theme from the focus groups. Finally, we added a section called “Missing Information” to each page to help users deal with missing elements in a reference (for example dates or page numbers).
Methods: Student Usability Testing
Participants and Recruitment
We followed the “on-the-fly” usability testing approach (Arnold et al., 2016; Daly & White, 2015) that involved recruiting students as they entered the library and offering them a small incentive to undertake usability testing on the prototype guide. We distributed flyers and posters around all three UNDA campuses in the lead-up to the testing periods, and librarians included information about the testing in their usual undergraduate orientation presentations. During lunch hours in testing weeks, library student employees staffed a table at the library entrance to attract and welcome participants (see fig. 2), answer any questions, and showcase the free coffee vouchers and candy incentives. They also issued participant information sheets and consent forms and ushered recruits to the testing room.
Questions
Usability testing took place during lunch hours in weeks 4 and 7 of Semester 2, 2018. This timeline coincided with periods when undergraduate students were completing formative assessments. We changed some scenario questions for week 7, as we had achieved saturation on some of the week 4 responses and wanted to present different types of sources to participants (see appendix B). We worded some questions to avoid describing a source, as we wanted students to evaluate and classify the source in their own way. For example, instead of “You’ve been using information from this online newspaper [link] in your assignment,” it read, “You’ve been using information from this source [link] in your assignment.” In this way, students wouldn’t be seeking headings in the referencing guide that matched the terminology of the question.
Schmidt and Etches (2012) advised that a minimum of five testers can reveal 85 percent of the usability problems within a website. We recruited 53 students (38 from Fremantle and 15 from Sydney), which exceeded our expectations. All were undergraduates except for two postgraduate students and one faculty member. Students represented all schools, with the majority being in their second year and studying in a health-related field. Due to technical issues, only 48 of the 53 sessions are part of this analysis.
Procedure
We accepted all participants on a first come, first served basis. We presented consent forms, which informed participants that the session would be audio-recorded, but that all information would be kept confidential, with any published data de-identified. Participants then came to a room containing a computer with two screens. Two library staff members were in attendance: a facilitator and a notetaker. We based the testing script on those developed by Arnold et al. (2016) and Krug (2013). See appendix B. The questions centered on how students located the guide (navigation) and how they would use the prototype guide to cite a source (tasks). The facilitators read all questions aloud and provided paper copies to the participants.
To help us better understand a user’s perspective as they completed the tasks, we asked participants to “think aloud”—verbalize their thoughts and feelings (Bergstrom, 2013). To understand the efficiency of the process, notetakers documented the navigational pathways users took as well as any salient nonverbal data, including facial expressions, body language, and tone.
Analysis
We collated and parsed transcripts using NVivo software. King and Brooks (2017) and Nowell et al. (2017) guided our coding process, but we designed the coding labels. We coded navigational questions according to the approach the participants took to locate the guide, for example from the student portal, Google, etc. We coded task-based questions according to which headings students explored to find an answer, for example Video & Audio, Other Sources, etc. The second pass at coding identified other, unanticipated themes that emerged as students were thinking aloud or in general comments as the session concluded. These codes included labels such as Tabbed Boxes, University Website, or Referencing Instruction. We explore these further in the results section.
Methods: Online Testing with Faculty
Participants and Recruitment
Recruiting faculty for live testing during the allocated times proved to be challenging, so we designed an unmoderated online version of the usability test using the LibWizard tutorial platform. LibWizard allows participants to view and answer questions in a side panel while interacting with a live website—in this case, the prototype guide itself—in the main window (see fig. 3).
We sent participant information forms, consent forms, and links to the online test via a faculty email list in September 2018. Of the approximately 760 faculty members on the email list, we received 40 responses (23 from Fremantle, 16 from Sydney, and 1 from the Broome campus). We asked participants to identify their campus, school, and employment status (academic [permanent], sessional, adjunct). Participants represented all schools, with the majority being from the health-based disciplines. Most were permanently employed, with five sessional faculty responding. At the conclusion of the test, we encouraged faculty to contact us with any follow-up comments or questions, as we emphasized that refining the guide was an iterative and ongoing process.
Procedure
We presented participants with four scenario-based tasks that reflected common referencing situations faculty face (see appendix C). After completing each task, we asked participants to select the term(s) that best described their reactions and experience in undertaking the task (easy to find, satisfying, unsure where to go, confusing, frustrating), and they had the opportunity for open-ended responses. After the tasks section, we asked participants for their opinions about the guide’s content and usefulness as a referencing support tool.
Analysis
We downloaded responses into a spreadsheet and reviewed them for themes and salient comments.
Ethics Approval
Ethics approval for this project was granted by the School of Education’s Research Committee. To create awareness and to generate interest in assisting with testing among the greater library team, we created a “Usability Testing Kit” that included an FAQ document, the testing script, consent forms, the notetaking template, and a checklist for setting up the testing area (see appendix D). In the weeks leading up to recruitment, we trained library staff volunteers on notetaking and recruitment methods.
Results: Student Usability Testing
Navigation
We asked participants a set of navigation questions in the first part of the testing session, as an important goal of this project was to determine whether users could effectively and efficiently locate the guide on the library website.
Starting from the university home page, we asked the 48 participants how they would get to the guide on the library website. We included this question to help us understand how students located the guide, and how efficient—or circuitous—these paths might be. Among the varied responses, the most popular route (17) was navigating to the Students section of the website, and then selecting the library tile. The next most common were to navigate from the student home page via the MyND student portal, and then to the library website (10) or by using the search function on the university website (9).
Once at the library home page, most users found the link to the guide by either clicking on the referencing link in the central area of the home page or via a text link under the Find menu to the left of the page. However, it was clear both from the actions and comments of users that these access points were not prominent enough: “It just blends in,” “It’s very hidden, if you didn’t know where to look for it,” “I only know how to [find] it since I’m in third year.” Other comments mentioned how “busy” the library home page was, with lots of content competing for their attention: “It’s kind of a busy page I’d say, and because it’s just one generic link amongst a bunch, I wouldn’t have found it so easily.” One comment illustrated how the referencing guide link needed to be emphasized on the home page:
There are a lot of useful things in the library. And obviously the referencing guide is very useful for assignments. All this other stuff is very useful as well, “borrowing” and so on. I’d say I use the referencing guide more than this other stuff, so to put it on an equal weighting as the rest of it [doesn’t make sense].
Classifying Sources
Another goal was to understand whether the main headings for the guide were clear and intuitive to users. Specifically, we wanted to get a sense of the way students mentally categorized sources so we could match their expectations. Therefore, students finding the “answer” was less important to us than understanding where they looked for it. The main headings we chose for the prototype guide were In-Text Citations, Journals, Newspapers & Magazines, Books & eBooks, Video & Audio, Internet & Social Media, Religious & Classical Works, Interviews & Personal Communications, Figures, Images & Tables, and Other Sources.
We were pleased to see that certain questions appeared easy for participants to answer. One of the test scenarios gave participants a link to a YouTube video containing news coverage of a political speech. Most (23/24) of the Week 4 participants immediately located the Video & Audio section and the correct example. Similarly successful was a question posed on both weeks asking students to reference an e-book with three authors, with all participants looking under the Books & eBooks heading at the outset.
However, some of the sources were confusing to students as they didn’t fit within traditional categories. One example of this was a PDF of a government document. Of the 24 participants who had this question in Week 7, 12 verbally described it as a “report”—probably due to the word appearing on the front page of the source. However, there was no main heading for “report” in the guide, and participants made circuitous explorations around the guide: “I’d look to see if there [are] reports. Journals, Newspapers & Magazines, that doesn’t quite fit that kind of thing. I don’t think it would be under Internet and Social Media. I think, maybe Other Sources … possibly … see what it says under Specialist Subjects.…” Another participant stated, “There should be one [category] that’s more clear, specifically for reports.”
Some assessed the document based on its format (PDF), while others focused on content and structure. Participants’ narration provided insight into their reasoning: “I would try and figure out what it is first, whether it’s an e-book or a … it looks like an e-book actually”; “I’d imagine Interviews and Personal Communications, because it would only be sent out to a few people”; “I’d probably go to Journals, Newspapers & Magazines. It couldn’t be classified as a book, and it doesn’t seem to be anything else. Process of elimination.”
Table 1. First heading selected when referencing a PDF of a government document (Week 7).

| First Selected Heading | n=24 | Percent |
|---|---|---|
| Other Sources | 8 | 33.3 |
| Journals, Newspapers & Magazines | 5 | 20.8 |
| Internet & Social Media | 5 | 20.8 |
| Books & eBooks | 3 | 12.5 |
| Interviews & Personal Communications | 1 | 4.2 |
| In-Text Citations | 1 | 4.2 |
| Undecided | 1 | 4.2 |
Another challenge for participants was classifying a PowerPoint presentation from a government website (see Table 2). While most (16/24) eventually chose the Other Sources heading, it appeared they weren’t confident about the decision: “I’m just going to look at other sources because it would be a citation within a source that’s coming from a website to get the PowerPoint presentation … but I’m also second guessing that.” Several students took their time, considering each heading in turn.
Table 2. First heading selected when referencing a PowerPoint presentation from a government website (Week 7).

| First Selected Heading | n=24 | Percent |
|---|---|---|
| Other Sources | 16 | 66.7 |
| Internet & Social Media | 4 | 16.7 |
| Undecided | 3 | 12.5 |
| Interviews & Personal Communications | 1 | 4.2 |
We asked one group of participants to use a direct quote from a resource that had no page numbers, an online article from The Atlantic magazine. While students eventually settled on a category selection (see Table 3), it appeared to be a difficult decision. Many suspected The Atlantic was a newspaper, but the fact that it was online gave them pause: “Looks like a blog or something,” “This might be one of the types you’d avoid.” Confusion and frustration were common reactions: “This is a tough question,” “I was really stumped,” “I’m just clicking on everything!”
Table 3. First heading selected when referencing an online magazine article (The Atlantic).

| First Selected Heading | n=24 | Percent |
|---|---|---|
| Journals, Newspapers & Magazines | 10 | 41.7 |
| Internet & Social Media | 9 | 37.5 |
| In-Text Citations | 3 | 12.5 |
| Books & eBooks | 1 | 4.2 |
| Undecided | 1 | 4.2 |
Returning to the PowerPoint task, some assessed the source based on authorship, as the PowerPoint came from the Department of Health, and were looking for a “government document” category. Others considered the format more important and looked for a “PowerPoint” category: “I don’t know… it is a government document thing, but it’s still a PowerPoint.” The PowerPoint being online caused more confusion, and several considered the Internet & Social Media category before settling on Other Sources. Some were unable to decide: “nothing’s popping out,” “I’m stumped on that,” “I’m a bit lost.”
Secondary Source Citation
Participants from our previous study (Dawe et al., 2021) were emphatic about the need for clarity on referencing secondary sources. In the prototype guide, we placed this information in several places. We wanted to see if the way we had presented this information was clear and helpful. Thus, we gave participants in both weeks a scenario in which they would need to reference a source within a book they were reading. Several students made it clear that they felt this was not good practice in referencing: “Well if I found it in another resource, shouldn’t I reference that resource?” “If I cannot prove where they came from, I would be using unreliable information.” However, others were grateful that this information was more accessible to them: “This is like the biggest issue in referencing”; “I get that all the time. Does my head in”; “You know I’ve come across that problem before. And I just ended up by picking a different quote, because I wasn’t sure how to do it.”
In terms of choosing a heading, most selected Books & eBooks, likely because they were told that the source they were working with in the scenario (containing the quote) was a book. A small number either gave up looking for an answer or stated that they wouldn’t use this method. For the majority, it appeared that the term we used for this concept, “secondary citation (source within a source)” was descriptive enough that students recognized the heading quickly once they found it.
Table 4. First heading selected when referencing a secondary source (both weeks).

| First Selected Heading | n=48 | Percent |
|---|---|---|
| Books & eBooks | 16 | 33.3 |
| In-Text Citations | 12 | 25.0 |
| Other Sources | 11 | 22.9 |
| Undecided or wouldn’t use | 6 | 12.5 |
| Journals, Newspapers & Magazines | 3 | 6.3 |
Layout
Either during the activities or in final comments, 8 of the 48 participants mentioned that the prototype guide was an improvement on the previous iteration, that it seemed less “overwhelming,” more “straightforward,” and “streamlined.” One of the biggest changes we made in creating the guide was to house the information in tabbed boxes (fig. 4). We had not previously employed this function in our guides, as it did not satisfy the Web Content Accessibility Guidelines (WCAG); however, Springshare had recently made this function compliant. This change allowed us to compress the pages and reduce scrolling.
Thirteen participants commented positively about this change, unprompted: “it’s good to have these little sections [tabs] … instead of having to scroll down like before,” “I definitely liked the tabs up the top rather than having it search like one long list,” “I like the tabs you guys have put in here, before I had to keep going, going, going.” One student liked how the tabs contained more information, enabling the user to remain on a single page: “Don’t have to delve into different links and ‘click here’ and not know where you’re looking and click everywhere.”
Ten students made positive comments about the color-coded example images of how to reference common sources (fig. 4). “Color coding is good for me … I usually just follow the example”; “I’m understanding it very quickly looking at the visual.” There was also positive feedback for the condensed, printable versions of the guide: “I always print them out, and I have them with me for when I’m working on an assignment”; “In the past usually, I’ve used the quick guide which I printed out and stuck on my desk.”
Referencing and Citation Instruction
Participants raised several issues to do with referencing instruction, namely that the guide needed to illustrate “why” as well as “how”: “What helps me the most when I’m referencing is understanding … why they do it this way, cause then it makes sense”; “I needed more information just than the example. I needed an explanation of the example.” One student felt this lack of knowledge limited the types of sources she chose to use:
For me it would be better to still reference only books and journal articles, doesn’t matter if it’s limiting my sources but at least I’m on the safe side. It’s helping me play safe, but not actually helping me understand how referencing works.

Another noted the lack of consistency between the guide and messages from course instructors:
I don’t know if the library actually checks with the course coordinators about the standards they expect for referencing. From what I’ve seen, each lecturer does it slightly differently to the way we were taught in [course] and the way the Uni does referencing.
Results: Faculty Online Usability
Using the LibWizard tool, we gave the 40 faculty respondents several scenario-based referencing questions and asked them to locate relevant information within the guide. After each question, we asked participants to describe their experience by selecting a response from a list or adding their own comments. Most responses were positive, with participants selecting “Easy to Find” (21), “Satisfying” (9), or both (3) as their main reactions. Then we asked faculty more open-ended questions about whether the guide offered adequate support for students. Again, the responses were mainly positive: “I think this latest version is much more user friendly”; “I like the way boxes and tabs have been used to make the guide simpler and variations easier to find”; “The navigation is clear and the explanations easy to understand.”
Although the survey represented a small sample of staff members, some of the comments provided a window into a range of faculty views. Two participants acknowledged the broader issues affecting student referencing: “[the guide] appears clear to me, but it does also need academic staff to lead by example and educate students”; “The guide … has good explanations of the mechanics of [references]. The next step would be to include more on the where and why, to give students a deeper understanding of the purpose of referencing.”
Limitations
At a small institution such as ours, this method of usability testing recruitment was practical to implement and yielded surprisingly rich results. Though our interactions with students were brief, the knowledge we gained challenged our assumptions and provided new insights. However, this approach does favor the views of people who attend the library in person, which is a limitation. We recommend using this approach as part of a spectrum of other investigative methods to gain a fuller picture of user needs. This study is a step toward better understanding our community, not the final destination.
The study has other limitations that became apparent to us only after we collated the data. Upon reflection, starting students at the university home page and asking them to navigate to the library page may have biased some participant behavior; a more neutral, open-ended approach would have participants start with a blank browser window. Observing and interpreting participant emotions is an inexact science, as people express emotions differently. However, we found that asking participants to narrate their experiences as they completed tasks gave us a general idea of their concerns and preferences. Because we were unable to conduct live user testing on faculty, we did not gain any insight into their information-seeking behavior. The faculty survey sample was also small and could have been biased toward participants who held strong feelings about referencing.
Discussion
This project’s main goal was to determine whether the referencing guide was meeting our users’ needs in terms of efficiency, clarity, and effectiveness. By observing participant behavior and listening to their reactions, we gained a more accurate picture of the student experience to inform our guide’s redesign. We found usability testing to be an inexpensive and accessible way of conducting action research. The main investment was time, particularly for library staff training and data analysis.
The referencing guide is an important tool in our students’ academic lives. Based on student feedback, we decided to make it more prominent on the website by giving it a dedicated side tab and a central icon on the library home page. It was reassuring to see that students were able to find the guide by searching the university website. The most important—and well-received—changes we made were to reduce page lengths by displaying information in tabbed boxes. Several students approved of how the tabbed layout saved them time in locating the right example and gave them a chance to peruse other examples without scrolling.
Our library’s general approach to LibGuide design was already in line with the advice of previous researchers: to design simple, clear guides with users’ needs as the priority (Ouellette, 2011), to manage cognitive load (Welker, 2016), and to be clear and consistent (Bergstrom-Lynch, 2019; Goodsett et al., 2020). Our study offers more specific guidance on how to present the complex information inherent to referencing. Students told us they may avoid using sources if they do not know how to reference them, which may have implications for the process of academic exploration. Good referencing guide design can support students’ confidence in using novel information sources, knowing they can cite them correctly.
Now we have the data to make informed decisions about naming and categorization. The term “Internet & Social Media,” for example, appeared intuitive to students. Difficulties nevertheless remain when it comes to “born digital” resources and those that sit in gray areas. For example, library staff commonly assist students wanting to reference “floating” PDFs that have come up in a Google search—documents with an unclear provenance or context. When we posed this scenario to students, it proved to be challenging in terms of evaluation as well as referencing. Many struggled with this task and looked across most of the guide to find an answer. Because students looked for the term “PDF” as well as “Report” and “Government Document,” we included all three terms in our final guide, as the rules for referencing these in APA style are essentially the same. We situated the examples on both the Internet & Social Media and Other Sources pages.
We also found that guidance about secondary source citation did not fit into a simple category. As in our previous research (Dawe et al., 2021, p. 998), students were emphatic about their need for this information, and recounted previous instances where they were unable to find it. We decided to remedy this by adding secondary source citation examples on most pages in the guide. Overall, our approach was to meet students where they were looking rather than telling them where they should go.
We recommend including a Missing Information section on each relevant page of a guide to address questions about page numbers, DOIs, and other quirks of online documents. One student recounted his experience trying to find a “missing” DOI:
I remember I got stuck for a day trying to reference, going, ‘Where in God’s name?’ [laughs]. I looked up on all the different websites, I was [typing] ‘How do I find a DOI?,’ ‘Where is the DOI?,’ ‘DOI finder,’ and I was typing in the article name, and it was like ‘Oh, there’s no DOI showing up.’ [...] No one told me that some [journal articles] might not have a DOI, so I was like ‘Argh! All of them do, I’ll just look. If I look hard enough, I’ll find it.’ So I spent a solid day looking for this DOI.

This comment reveals how the referencing guide can teach students about the variety—and validity—of source types merely by displaying examples. We observed students scanning the examples to see where their source fit and using the example they felt was closest in type. For example, one student saw the example for blog post and pondered whether the source she found might fit into that category: “I would go to...is it a blog post? Not a blog post.” Displaying a variety of examples in referencing guides may encourage students to be more adventurous in their choice of source types.
Conclusion
Our study shows how citing sources continues to be a complex academic practice. Thoughtfully designed referencing guides can streamline the process. The iterative nature of UX research can teach library staff about how students use guides and identify areas for improvement.
Insights from this Project
We recommend taking the following actions when designing referencing guides:
Make the guide highly visible and easily accessible on library websites.
Find ways to reduce scrolling, for example through tabbed boxes.
Use descriptive language that students recognize, even if it doesn’t match a style manual’s terminology, for example “secondary citation (source within a source).”
Include diagrammatic, color-coded images to illustrate citation examples.
Provide a variety of examples to reflect diverse online sources, and be open to providing this information in more than one category.
Acknowledgments
Special thanks to Jackie Stevens (Manager, Research Services, University of Notre Dame Australia) for her work in supporting and implementing this project, including her contributions to early writing drafts. Thank you to the library team at the University of Notre Dame Australia for their work in revising the guide and recruiting and testing participants. Thank you to Janice Chan (Manager, Library Research Services, Curtin University) for her support as I completed the final drafts.
Note: This research was conducted while I was working at the University of Notre Dame Australia. I have since moved to Curtin University.
References
Arnold, S., Haefele, C., & Sharrar, G. (2016). On-the-fly usability testing findings. https://libux.web.unc.edu/2016/07/28/on-the-fly-usability-testing-findings
Association of College & Research Libraries. (2015). Framework for Information Literacy for Higher Education. https://www.ala.org/sites/default/files/acrl/content/issues/infolit/Framework_ILHE.pdf
Bergstrom-Lynch, Y. (2019). LibGuides by design: Using instructional design principles and user-centered studies to develop best practices. Public Services Quarterly, 15(3), 205–223. https://doi.org/10.1080/15228959.2019.1632245
Bergstrom, J. R. (2013). Moderating usability tests. http://web.archive.org/web/20190809131528/https://www.usability.gov/get-involved/blog/2013/04/moderating-usability-tests.html
Castro Gessner, G., Chandler, A., & Wilcox, W. S. (2015). Are you reaching your audience? The intersection between LibGuide authors and LibGuide users. Reference Services Review, 43(3), 491–508. http://www.emeraldinsight.com/doi/abs/10.1108/RSR-02-2015-0010
Chen, Y., & Chen, X. (2015). Web-based subject guides for East Asian studies: Current status, challenges, and recommendations. Internet Reference Services Quarterly, 20(1–2), 1–17. https://doi.org/10.1080/10875301.2015.1018474
Daly, E., Chapman, J., & Crichlow, T. (2017). Just ask them! Designing services and spaces on the foundation of student feedback. In S. Arnold-Garza & C. Tomlinson (Eds.), Students lead the library: The importance of student contributions to the academic library (pp. 267–282). Association of College and Research Libraries.
Daly, E., & White, H. (2015). Accessing online articles usability test. https://dukespace.lib.duke.edu/dspace/handle/10161/10847
Dawe, L., Stevens, J., Hoffman, B., & Quilty, M. (2021). Citation and referencing support at an academic library: Exploring student and faculty perspectives on authority and effectiveness. College & Research Libraries, 82(7), 991–1003. https://doi.org/10.5860/crl.82.7.991
Dougherty, K. (2013). Getting to the core of geology LibGuides. Science & Technology Libraries, 32(2), 145–159. https://doi.org/10.1080/0194262X.2013.777233
Fraser-Arnott, M. (2020). Academic library COVID-19 subject guides. The Reference Librarian, 61(3–4), 165–184. https://doi.org/10.1080/02763877.2020.1862021
Genzinger, P., & Wills, D. (2017). Giving credit: How well do librarians cite and quote their sources? Reference & User Services Quarterly, 57(1), 30–41. https://doi.org/10.5860/rusq.57.1.6440
Gibbs, C., Kooyman, B., Marks, K., & Burns, J. (2015). Mapping the “Roadmap”: Using action research to develop an online referencing tool. Journal of Academic Librarianship, 41(4), 422–428. https://doi.org/10.1016/j.acalib.2015.05.004
Goodsett, M., Miles, M., & Nawalaniec, T. (2020). Reimagining research guidance: Using a comprehensive literature review to establish best practices for developing LibGuides. Evidence Based Library and Information Practice, 15(1), 218–225. https://doi.org/10.18438/EBLIP29679
Gravett, K., & Kinchin, I. M. (2020a). Referencing and empowerment: Exploring barriers to agency in the higher education student experience. Teaching in Higher Education, 25(1), 84–97. https://doi.org/10.1080/13562517.2018.1541883
Gravett, K., & Kinchin, I. M. (2020b). The role of academic referencing within students’ identity development. Journal of Further and Higher Education, 45(3), 377–388. https://doi.org/10.1080/0309877X.2020.1766665
Gray, K., Thompson, C., Clerehan, R., Sheard, J., & Hamilton, M. (2008). Web 2.0 authorship: Issues of referencing and citation for academic integrity. The Internet and Higher Education, 11(2), 112–118. https://doi.org/10.1016/j.iheduc.2008.03.001
Greer, K., & McCann, S. (2018). Everything online is a website: Information format confusion in student citation behaviors. Communications in Information Literacy, 12(2), 150–165. https://files.eric.ed.gov/fulltext/EJ1205062.pdf
Horton, J. J. (2022). Survey of aviation online library research guides. Science & Technology Libraries, 41(4), 411–421. https://doi.org/10.1080/0194262X.2021.1999883
Hossain, Z. (2022). University freshmen recollect their academic integrity literacy experience during their K–12 years: Results of an empirical study. International Journal for Educational Integrity, 18, Article 4. https://doi.org/10.1007/s40979-021-00096-4
King, N., & Brooks, J. M. (2017). Template analysis for business and management students. Sage. https://doi.org/10.4135/9781473983304
Krug, S. (2013). Don’t make me think, revisited: A common sense approach to web usability (3rd ed.). New Riders.
Kupersmith, J. (2012). Library terms that users understand. https://escholarship.org/uc/item/3qq499w7
Logan, J., & Spence, M. (2021). Content strategy in LibGuides: An exploratory study. The Journal of Academic Librarianship, 47(1), 102282. https://doi.org/10.1016/j.acalib.2020.102282
Marquez, J., & Downey, A. (2015). Service design: An introduction to a holistic assessment methodology of library services. Weave: Journal of Library User Experience, 1(2). https://doi.org/10.3998/weave.12535642.0001.201
Neville, C. (2010). The complete guide to referencing and avoiding plagiarism. McGraw-Hill Education.
Nowell, L. S., Norris, J. M., White, D. E., & Moules, N. J. (2017). Thematic analysis: Striving to meet the trustworthiness criteria. International Journal of Qualitative Methods, 16(1). https://doi.org/10.1177/1609406917733847
Ouellette, D. (2011). Subject guides in academic libraries: A user-centred study of uses and perceptions. Canadian Journal of Information and Library Science, 35(4), 436–451. https://doi.org/10.1353/ils.2011.0024
Park, S., Mardis, L. A., & Ury, C. J. (2011). I’ve lost my identity—oh, there it is … in a style manual: Teaching citation styles and academic honesty. Reference Services Review, 39(1), 42–57. https://doi.org/10.1108/00907321111108105
Peters, M., & Cadieux, A. (2019). Are Canadian professors teaching the skills and knowledge students need to prevent plagiarism? International Journal for Educational Integrity, 15, Article 10. https://doi.org/10.1007/s40979-019-0047-z
Priestner, A., & Borg, M. (2016). Uncovering complexity and detail: The UX proposition. In A. Priestner & M. Borg (Eds.), User experience in libraries: Applying ethnography and human-centred design (pp. 1–8). Routledge. https://doi.org/10.4324/9781315548609
Rubin, J., & Chisnell, D. (2008). Handbook of usability testing: How to plan, design, and conduct effective tests (2nd ed.). John Wiley & Sons.
Schmidt, A., & Etches, A. (2012). User Experience (UX) design for libraries. ALA TechSource.
Springshare. (2022, May 2). 15 years of Springshare! Springshare blog. https://blog.springshare.com/2022/05/02/15-years-of-springshare/
Stevens, C. R. (2016). Citation generators, OWL, and the persistence of error-ridden references: An assessment for learning approach to citation errors. The Journal of Academic Librarianship, 42(6), 712–718. https://doi.org/10.1016/j.acalib.2016.07.003
Unver, B., Senduran, M., Unver Kocak, F., Gunal, I., & Karatosun, V. (2009). Reference accuracy in four rehabilitation journals. Clinical Rehabilitation, 23(8), 741–745. https://doi.org/10.1177/0269215508102968
Van Note Chism, N., & Weerakoon, S. (2012). APA, meet Google: Graduate students’ approaches to learning citation style. Journal of the Scholarship of Teaching & Learning, 12(2), 27–38. https://eric.ed.gov/?id=EJ978905
Welker, J. (2016). Making user-friendly guides: Navigation and content considerations. In A. W. Dobbs & R. L. Sittler (Eds.), Integrating LibGuides into library websites (pp. 107–126). Rowman & Littlefield.
White, H. (2018, March 13). Untitled presentation [Networking event]. ALIAWest, Perth, Western Australia.
Appendix A: Referencing Guides on Australian University Websites
| University | Is Guide Hosted or Authored by Library? | Platform |
|---|---|---|
| Australian Catholic University | Yes | LibGuide |
| Bond University | Yes | |
| Charles Darwin University | Yes | LibGuide |
| Charles Sturt University | Yes | Other |
| Central Queensland University | No | |
| Curtin University | Yes | Other |
| Deakin University | Combination | LibGuide, Other |
| Edith Cowan University | Yes | LibGuide |
| Federation University | Yes | Other |
| Flinders University | No | |
| Griffith University | Yes | Other |
| James Cook University | Yes | LibGuide |
| La Trobe University | Yes | LibGuide |
| Macquarie University | Yes | LibGuide |
| Monash University | Yes | Other |
| Murdoch University | Yes | LibGuide |
| Queensland University of Technology | Yes | Other |
| RMIT University | Yes | Other |
| Southern Cross University | Yes | LibGuide |
| Swinburne University of Technology | Yes | Other |
| The Australian National University | Yes | Other |
| The University of Adelaide | Yes | LibGuide, Other |
| The University of Melbourne | Yes | Other |
| The University of New England | Yes | |
| The University of Newcastle | Yes | Other |
| The University of Notre Dame Australia | Yes | LibGuide |
| The University of Queensland | Yes | Other |
| The University of Sydney | Yes | LibGuide |
| The University of Western Australia | Yes | LibGuide |
| Torrens University Australia | Yes | LibGuide, Other |
| University of Canberra | Yes | LibGuide |
| University of New South Wales | No | Other |
| University of South Australia | Yes | Other |
| University of Southern Queensland | Yes | Other |
| University of Tasmania | Yes | LibGuide |
| University of the Sunshine Coast | Yes | Other |
| University of Wollongong | Yes | LibGuide |
| Victoria University | Yes | LibGuide |
| Western Sydney University | Yes | Other |
Appendix B: Student Usability Testing
Author’s note: Please adapt and reuse the contents of this Appendix for your own UX research.
Facilitator Script
Hello, my name is _________, [observer] and this is _________ [note-taker] and we’re librarians here at _________ [library name]. Thank you so much for taking the time to help us today.
Before we begin, I have some information for you, and I’m going to read it to make sure that I cover everything. We’ve made some changes to the APA referencing guide and we’re asking people to try it out so we can see whether it works the way we intended. The session should take about 15 minutes. The first thing we want to make clear right away is that we’re testing the site, not you.
You can’t do anything wrong here—you won’t hurt our feelings and we won’t react negatively to anything you have to say. I will be reading out a series of questions to you, and you’ll have a digital copy to refer to (we will use that to go to external websites) as well as a printed copy. The questions relate to aspects of the referencing guide, and as you use the site, I’d like you, as much as possible, to try to think out loud: to say what you’re looking at, what you’re trying to do, and what you’re thinking. While you are working through the questions, _________ [note-taker] will be making notes on what she observes and how you interact with the guide. Our voices are also being recorded.
Your thoughts and experiences will be a big help to us. We’re doing this to improve the referencing guide, so we need to hear your honest reactions. Once we have finished the activities you will receive the coffee voucher as thanks for your participation.
We will start with two background questions that can help us understand the context for your responses:
Demographic Questions:
Students:
What degree are you studying for?
-
What year are you in?
Academic staff [if applicable]:
Which School are you from?
Which role best describes you?
Academic staff
Adjunct
Sessional
Other
So, you’re here at the University home page. How do you get to the referencing guide from here?
Wait for student to find Library referencing guide
-
Where else would it be helpful to see a link to the referencing guide?
Prompts:
The University home page? From Blackboard?
-
Is the link to the referencing guide on the Library website in a useful place?
Prompts:
Ask why yes or no. Alternative places?
Move to open beta Guide tab
The next few questions are scenario-based. Feel free to look at the handout as you need to, but I’ll read out each question as we go. And again, if you could talk through the process as you are doing it that would be great.
Scenario Questions
You’ve been using information from this source [link] in your assignment. Where in the Guide would you look to find examples on how to reference it?
If you wanted to quote something where would you look for guidance?
You’ve been asked to reference an e-book that has three authors. Where in the Guide would you look to find examples on how to reference it?
[Week 4 question]. You want to make reference to this [link] in an assignment (or research/teaching). Where in the Guide would you look to find examples on how to reference it?
[Week 7 question]. You want to make reference to this [link] in an assignment (or research/teaching). Where in the Guide would you look to find examples on how to reference it?
[Week 4 question]. You’ve found this graph about life expectancy from the Australian Bureau of Statistics that you want to include in your work, that is, you want to use the actual graph in your text. Where would you go to find examples on how to do this?
[Week 7 question]. You’ve found a journal article which doesn’t have a DOI. Where in the Guide would you go to find out how to reference it?
[Week 4 question]. You’ve been using this [link] in an assignment (or research/teaching). Where in the Guide would you look to find examples on how to reference it?
[Week 7 question]. You want to quote a sentence from this resource [link] but it doesn’t have any page numbers—where in the Guide would you look for examples on what to do?
-
You’ve been reading the book below and notice a quote that you want to use—the quote by Farid, but you can’t access the original source for this quote. Where in the Guide would you look to find examples on how to reference this?
Harrison, K., & Hefner, V. (2014). Virtually perfect: Image retouching and adolescent body image. Media Psychology, 17(2), 134–153. doi:10.1080/15213269.2013.770354
Do you have any final comments about your experience with this Guide?
Appendix C: Faculty Unmoderated Testing (LibWizard Module)
Author’s note: Please adapt and reuse the contents of this Appendix for your own UX research.
Demographic Questions:
Please select the option that best describes you:
Academic staff
Adjunct
Sessional
Other
School:
Campus:
Scenario Questions
-
You are creating a reference in Blackboard for [link] so that students can view it to prepare for a class.
Explore the APA Guide on the right to find examples on how to reference this material.
Which term(s) best describe your reactions and experience in locating the example? Select all that apply.
Easy to find
Satisfying
Unsure where to go
Confusing
Frustrating
Other / Please explain your response further and give examples
-
You need to create a reference for this type of source [link] for your course readings list. [Please click on the link to go to the source in a new tab. You may have to log in and/or refresh your browser to reload the page].
Explore the APA Guide on the right to find examples on how to reference this material.
Which term(s) best describe your reactions and experience in locating the example? Select all that apply.
Easy to find
Satisfying
Unsure where to go
Confusing
Frustrating
Other / Please explain your response further and give examples
-
You want to create a reference for an image in a PowerPoint slide for a class presentation.
Explore the APA Guide on the right to find examples on how to reference this material.
Which term(s) best describe your reactions and experience in locating the example? Select all that apply.
Easy to find
Satisfying
Unsure where to go
Confusing
Frustrating
Other / Please explain your response further and give examples
-
A student provided a secondary reference for a quote they used in an essay, as they weren’t able to access the original source to verify the quote and context.
Explore the APA Guide on the right to find examples on how to reference this material.
How well have we explained secondary referencing in the guide?
Very well
Quite well
Neutral
Not very well
Not well at all
Please provide any further comment you may have:
-
Looking across the guide in general, do you feel that it provides an adequate level of referencing support for students?
Yes
No
Please tell us why you have chosen this response.
-
Do you feel that the breadth of examples in the guide sufficiently covers the kinds of resources that you would encourage students to use in your course?
Yes
No
If not, what examples are missing?
Please add any other feedback on the guide or comments about your experience navigating it.
Appendix D: Usability Testing Kit
Author’s note: Please adapt and reuse the contents of this Appendix for your own UX research. In the text below, instances of [redacted] appear in the place of identifying data.
Project Overview and FAQs: Library Referencing Guide Redesign
What is this project?
Team:
Duration: Semester 2, 2018 & Summer 2019.
Outcome: A referencing guide that is easy to interpret and navigate
We are asking for staff and student feedback on a beta version of the APA referencing guide. We will be doing this through observed live testing and a survey. The testing will take place in the Library, on the Fremantle and Sydney campuses during Week 4 & Week 7*, Semester 2.
*These dates may be subject to change based on when the Ethics approval comes through.
Why are we doing this?
In 2016 the Library conducted focus group research investigating student and staff experiences of using the Referencing Guide. For an overview of the findings, please see the summary report. Overall, responses from students and staff were positive; however, points from the discussion about the challenges of referencing prompted ideas for revising the Guide. An unpublished beta version of the Guide (APA only) has been created which includes changes to layout, headings, and wording. Before the beta Guide is officially published on the Library website, the Library again seeks the opinions of its users: University staff and students. Based on the feedback, the Library will also redesign the other two guides (AGLC3 and Chicago) and launch them officially for Semester 1, 2019.
What are we trying to find out?
The purpose of this study is to evaluate the beta Guide based on the following questions:
Can users quickly and effectively locate the Guide via the University and Library websites?
Are the headings in the Guide clear and intuitive?
Can users efficiently locate specific referencing examples?
Can users effectively use the Guide to identify how a given resource might be referenced?
How are we getting user opinions?
We will get user opinions by conducting User Experience (UX) testing with staff and students. From the literature review:
Considering the stressful nature of completing assignments with accurate referencing, a user-centred referencing guide design is of the utmost importance. Libraries have traditionally depended on quantitative data (door counts, circulation statistics, downloads, pageviews) to measure a service’s success. However, such data is limited in what it can reveal (Appleton, 2016; Massis, 2018). Next to the Library’s discovery service, Summon, the Guide is the most accessed part of the Library website, yet little is known about patterns of use. More libraries are turning to User Experience (UX) research approaches for deeper insight into the user perspective. Typically used in the Human-Computer Interaction field, UX research aims to understand a user’s feelings, preferences, and perspectives about a service or interface (Bardzell, 2013). Libraries are increasingly using UX approaches to evaluate all services, physical as well as digital (Daly et al., 2017; MacDonald, 2016). Common UX practices include surveys, focus groups, one-on-one interviews, and usability testing (Massis, 2018). Of all these approaches, usability testing is the most effective when assessing websites. Schmidt and Etches (2012) state plainly that “to improve your website, you need to watch people interact with it” (p. 33). Krug (2013) also emphasizes the importance of observing users, as “not everyone uses the Web the way you do” (p. 114). Ideally these observations will reveal if users experience confusion, disorientation, or dissatisfaction with the Guide.
The data we will collect (survey and usability testing) will play a valuable role in shaping the structure, content, and navigation of the beta Guide.
How will the Usability Testing take place?
There are two ways we are testing the beta Guide:
Live Testing
The usability testing process will entail presenting participants with common referencing scenarios and then observing how they interact with the Guide to find the answers they need. Participants will be read the scenarios as well as be provided with a digital version on the screen with the scenario text and relevant links to tasks. As well as observing navigation, we would like to get a sense of how participants evaluate and classify sources (particularly online documents) before referencing them. For example, would they reference an eBook as a website, or a newspaper article as a journal article? Thus, links to various information sources will be provided within some of the scenarios for participants to view and classify.
Academic staff will also be invited to participate in the usability testing taking place in the Library at the scheduled times. A survey will also be provided to collect data from academic staff. A survey was chosen as another data collection method for staff as it is a convenient way for a wide variety of staff to provide their feedback. Some academic staff members are not on campus every day, some come to campus only for teaching sessions (e.g., sessional staff), and some are rarely on campus (e.g., adjunct staff). However, all staff members are likely to use the Guide in some way, and thus we hope to get feedback from as many staff as possible despite scheduling and location constraints. By working through the scenarios presented within the survey, our hope is that academic staff who do not participate in live usability testing will get enough experience navigating the beta Guide to provide meaningful feedback.
An “on-the-fly” recruiting method (also known as “lobby testing”), in which participants (usually students) are approached at random within the Library, has been used successfully by Duke University Libraries (Daly et al., 2017) and University of North Carolina (UNC) Libraries (Arnold et al., 2016). This recruitment method is especially beneficial for libraries that are short on staff time and resources. Although [institution name] has a small, dispersed library staff, we would still like to build a culture of regular, systematic evaluation. This informal recruiting method could provide the Library with an efficient, achievable method of gaining student feedback.
Survey
A survey will be created using LibWizard, a tutorial and survey tool subscribed to by the Library. This survey will contain scenario-based questions that reflect common situations in referencing. This will give academic staff a chance to explore the beta APA Guide and provide feedback on their experiences and their opinions of the Guide in general. LibWizard is a unique tool in that it provides a window onto a live website while questions appear alongside, so staff will not have to exit the survey to explore the Guide. A link to this survey will be sent via email to academic staff on all campuses by the University Librarian in the month prior to testing (13 August) and during Week 3 (10–14 September). The survey will be anonymous (although non-identifiable demographic questions will be asked) and will take approximately 10–15 minutes to complete.
How will library staff be involved?
[Name redacted] will be leading the project with [name redacted] as co-investigator. We already have our Liaison Librarians on board with doing live testing and a few IDAs to help with recruitment. However, help is needed from all team members with promoting the testing (including telling students about it on the desk), posting flyers, and helping with setup on the day, among other tasks. Your help is greatly appreciated!
How will we recruit participants?
Flyers and posters in Libraries, bathrooms, and student common areas indicating the dates of data collection activities.
Recruitment booth in Libraries, staffed by library staff on the day of data collection. The booth will have participant information sheets, consent forms, and further information (the observer gives a coffee voucher at the conclusion of the activity).
Emails to academic staff with the LibWizard link and usability testing dates, to be sent by the University Librarian on behalf of the project team one month prior to testing (13 August) and during Week 3 (10–14 September). The project will be announced by the University Librarian at the national L&TC meeting to encourage participation by staff.
When will this take place?
Week 4 (20–24 August): Tuesday 10am–12pm, Wednesday 12pm–2pm, Thursday 10am–12pm
Week 7 (10–14 September): Tuesday 10am–12pm, Wednesday 12pm–2pm, Thursday 10am–12pm
These dates may be subject to change.
What are some of the questions participants will be asked?
See the usability testing script for questions. Keep in mind these questions are a draft at this time.
What do participants have to do to participate?
Live testing participants will be given a participant information sheet and a consent form. Survey participants (staff) will be emailed a participant information sheet along with the survey; they will respond anonymously and will be asked to check a consent box on the form.
Do participants get anything out of it?
A coffee voucher from a local provider… and the satisfaction of helping out!
What does the beta version of the Guide look like?
Log in to LibGuides and find it under “Referencing Guide—beta version.”
References
Appleton, L. (2016). User experience (UX) in libraries: Let’s get physical (and digital). Insights: The UKSG Journal, 29(3), 224–227. https://doi.org/10.1629/uksg.317
Arnold, S., Haefele, C., & Sharrar, G. (2016). On-the-fly usability testing findings. https://libux.web.unc.edu/2016/07/28/on-the-fly-usability-testing-findings
Bardzell, J. (2013). Critical and cultural approaches to HCI. In S. Price, C. Jewitt, & B. Brown (Eds.), The SAGE handbook of digital technology research (pp. 130–143). SAGE.
Daly, E., Chapman, J., & Crichlow, T. (2017). Just ask them! Designing services and spaces on the foundation of student feedback. In S. Arnold-Garza & C. Tomlinson (Eds.), Students lead the library: The importance of student contributions to the academic library (pp. 267–282). Association of College and Research Libraries.
Krug, S. (2013). Don’t make me think, revisited: A common sense approach to web usability (3rd ed.). New Riders.
MacDonald, C. M. (2016). User experience librarians: User advocates, user researchers, usability evaluators, or all of the above? Proceedings of the Association for Information Science and Technology, 52(1), 1–10. https://doi.org/10.1002/pra2.2015.145052010055
Massis, B. (2018). The user experience (UX) in libraries. Information and Learning Sciences, 119(3/4), 241–244. https://doi.org/10.1108/ILS-12-2017-0132
Schmidt, A., & Etches, A. (2012). User experience (UX) design for libraries. ALA TechSource.
Referencing Guide Testing Checklist
1 week prior
Print out and post flyers around library, in bathrooms
Schedule IDAs as recruiters
Schedule Note-Takers
Procure Coffee Vouchers
Ensure iPad is available with Voice Record Pro app installed
Allocate area of library for recruitment booth
Allocate table for recruitment booth
Book study room for testing
Set up waiting area, if necessary
Day of testing
Ensure iPad is charged
Coffee vouchers ready
Set up recruiting table (see below)
Set up testing room (see below):
Set up chairs for waiting area outside study room
Print forms:
Consent form (10)
Participant Information Sheet (10)
Print testing sheets:
Testing Script (for Observer) (2)
Notes Sheet (for Note-Taker) (10)
Participant questions sheet (for Participant)
Pens/pencils
Bowl of lollies
Recruiting table setup
Hang posters off front of table
Place bowl of lollies on table
Consent forms, Participant Information Sheets printed
Pens and pencils
Testing room setup
Computer logged in with:
Referencing Guide beta opened in one tab
University home page opened in another tab
Make sure Firefox is set as the default browser
Make sure participant questions are onscreen
iPad logged in with app opened
Spare Consent forms, Participant Information Sheets
Coffee vouchers
Participant questions printed out
Script & Notes sheets ready
Pens and pencils
Water & glass
Participant Information Sheet—Usability Testing (Staff and Students)
Library Referencing Guide Redesign
You are invited to participate in the research project described below.
What is the project about?
The research project will investigate how students and staff use a beta (test) version of the redesigned Library Referencing Guide for APA style that we intend to launch in Semester 1, 2019. This project will help the Library to understand how students and staff navigate and use the Guide to find referencing examples. By observing people using the Guide, we hope to understand whether it functions properly and whether any improvements could be made. Overall, the aim of this project is for the Library to design a referencing guide that students and staff find easy to use.
Who is undertaking the project?
This project is being conducted by [name redacted] at the University of Notre Dame Australia.
What will I be asked to do?
If you consent to take part in this research study, it is important that you understand the purpose of the study and what you will be asked to do. Please make sure that you ask any questions you may have and that all your questions have been answered to your satisfaction before you agree to participate.
This study will take place in a private area within either the St Teresa’s or St Benedict’s Libraries during Semester 2, 2018. You’ll be provided with a computer to access the beta (test) referencing guide. Two staff members from the Library will guide the session: one to ask questions, and the other to take notes.
We will ask students which degrees they are studying for and which year they are in, and staff for their School and type of role. We will present you with a series of scenario-based questions that reflect common referencing situations. You will be asked to use the referencing guide to work through these questions, describing your process out loud as you go. We will also ask you about your general impressions of using the Guide.
Your voice will be audio-recorded, and notes will be taken about your interaction with the Guide. We estimate this process will take about 15 minutes.
Are there any risks associated with participating in this project?
There are no foreseeable risks associated with participating in this research project.
What are the benefits of the research project?
By participating in this research project, you will help to influence how the referencing guide is designed and presented. With your feedback, the Library will be better able to construct a guide that responds to staff and student needs. The end result will hopefully be an improved experience and reduced stress around the referencing process.
What if I change my mind?
Participation in this study is completely voluntary. Even if you agree to participate, you are free to withdraw at any time without giving a reason and with no negative consequences. You are also free to ask for any information which identifies you to be withdrawn from the study.
Will anyone else know the results of the project?
Information gathered about you will be held in strict confidence. This confidence will only be broken if required by law.
Information gathered about you (consent forms, notes, audio-recordings) will be held in strict confidence in a password protected file on the UNDA national drive (computer network). Only the researchers named on this form will have access to the file. Audio sessions will be transcribed with names removed. No data will be identifiable. Only quotes, questions and aggregated data will be published.
Once the study is completed, the data collected from you will be de-identified and stored securely in the University Library at UNDA for at least a period of five years. The complete set of de-identified data (transcribed audio-recordings) will also be uploaded to a data storage site that will be accessible to the wider research community for a minimum of five years. Should the research be published in the form of a journal article, selected de-identified data (graphs, tables, quotes from text transcripts) will also be uploaded to the UNDA institutional repository Research Online alongside an open access full-text version of the article for a minimum of five years.
The data may be used in future research but you will not be able to be identified. The results of the study will be published in internal University reports, a conference paper, and a journal article.
Will I be able to find out the results of the project?
Once we have analyzed the information from this study, a report will be made available to the public on the University Library website. An announcement will be made via email and on the Library’s blog alerting staff and students to the availability of the report. You can expect to receive this feedback by mid-2019.
Who do I contact if I have questions about the project?
If you have any questions about this project please feel free to contact [name redacted]. Alternatively, you can contact [name redacted]. We are happy to discuss with you any concerns you may have about this study.
What if I have a concern or complaint?
The study has been approved by the Human Research Ethics Committee at UNDA (approval number [redacted]). If you have a concern or complaint regarding the ethical conduct of this research project and would like to speak to an independent person, please contact [name redacted]. Any complaint or concern will be treated in confidence and fully investigated. You will be informed of the outcome.
How do I sign up to participate?
If you are happy to participate, please complete the attached survey and deliver it to a Library front desk, or scan and email it to [redacted].
Thank you for your time. This sheet is for you to keep.
Yours sincerely,
[name redacted]
UNIVERSITY LIBRARY
Student Consent Form
Library Referencing Guide Redesign
I agree to take part in this research project.
I have read the Information Sheet provided and been given a full explanation of the purpose of this study, the procedures involved, and what is expected of me.
I understand that I will be asked to complete a series of tasks on a computer, using a beta (test) version of the Library referencing guide. The tasks will be based on common referencing scenarios. My voice will be audio-recorded and notes will be taken about my interaction with the guide.
The researcher has answered all my questions and has explained possible problems that may arise as a result of my participation in this study.
I understand that I may withdraw from participating in the project at any time without prejudice, except where the data is de-identified and not re-identifiable.
I understand that all information provided by me is treated as confidential and will not be released by the researcher to a third party unless required to do so by law.
I agree that any research data gathered for the study may be published provided my name or other identifying information is not disclosed.
I understand that, if the sample size is small, this may have implications for protecting my identity.
I understand that research data gathered may be used for future research but my name and other identifying information will be removed.
Name of participant: ______________________
Signature of participant: ______________________
Date: ______________________
I confirm that I have provided the Information Sheet concerning this research project to the above participant, explained what participating involves, and have answered all questions asked of me.
Signature of Researcher: ______________________
Date: ______________________