Introduction

Data-driven. Metrics. Assessment. Often used in concert, these terms sit at the core of educational decision-making as it relates to accreditation, from modifying instruction in a single course to enacting sweeping change across curricular and research programs. While librarians often support such initiatives through information literacy instruction, data and information provision, and service to the institution, we are rarely at the forefront of, much less leaders in, the institutional accreditation process. This is particularly true within schools of business, for which accreditation by the Association to Advance Collegiate Schools of Business (AACSB International) is an essential differentiator.

At William & Mary’s Raymond A. Mason School of Business, the Level 1 AACSB team is primarily responsible for coordinating accreditation activities and preparing for the Peer Review Team’s site visit. This team, comprising the associate dean of faculty and academic affairs, the associate dean of finance and administration, the director of academic affairs, the business librarian (who reports to the School of Business), and the faculty director of accounting programs, also contributes significantly to the documentation, known as the Continuous Improvement Review (CIR) report, that demonstrates the school’s alignment with AACSB standards. In 2020, as the Level 1 team prepared for the Continuous Improvement Review, they identified a gap within the CIR report’s Strategic Management and Innovation section, of which intellectual contributions are a significant portion. The team recognized the need for both qualitative and quantitative data about impact (the impact of both faculty research and research initiatives), as well as for benchmarking against aspirant, competitor, and peer institutions’ research outputs.

Before this need arose, William & Mary Libraries had partnered with the Office of the Provost and the Office of Research and Graduate/Professional Studies to explore adding bibliometric platforms to the university’s research infrastructure portfolio. The business librarian served on the library’s research impact task force, which identified and evaluated multiple platforms and presented the options to the provost, deans, and their representatives. After careful consideration, SciVal was selected as the university’s bibliometric platform of record, and, because Scopus is the foundational index for SciVal, the university procured Scopus in tandem with SciVal. The business librarian, with knowledge of this initiative, proposed using data from SciVal (and Scopus, as the primary index) to answer questions about the impact of faculty scholarship for the AACSB Continuous Improvement Review. The team encouraged the business librarian to deepen her expertise in bibliometrics and investigate SciVal as a means of achieving this goal. They also agreed that any metric retrieval would depend on support from the business school’s Faculty Research Committee, which would need to be heavily involved in identifying the SciVal metrics with which the school could assess scholarship.

Literature Review

Recent literature, including working papers and other documents, details the history of AACSB and library engagement and suggests multiple roles for librarian involvement with AACSB and other accreditation efforts. Guth and Stonebraker (2021) offer a historical perspective on libraries’ relationship with the standards and identify changes from previous versions of the AACSB standards, which focused on the collections libraries provide, to the 2020 standards’ focus on curriculum and research support. Liu (2021) provides more concrete opportunities for engagement, suggesting that librarians “a) help raise business faculty’s awareness of alert services in library databases, digital commons, and emerging scholarly impact measurements such as Altmetrics; b) help faculty navigate the publication landscape; and c) assist in gathering evidence for faculty scholarly impact and publications” (para. 10). Finally, the AACSB Toolkit, developed by Guth with sponsorship from RUSA BRASS (Guth, 2022), provides librarians with templates and other avenues for seeking engagement with their respective business schools.

As Liu (2021) mentions, one opportunity for librarian engagement with AACSB involves bibliometrics, the practice of gathering and analyzing quantitative bibliographic data. There are three primary sources of “comprehensive citation data” (Sugimoto & Larivière, 2018, p. 18): Clarivate’s Web of Science, Elsevier’s Scopus, and Google Scholar. Statistical bibliography preceded all of them, enabling librarians to build and align their collections with various goals. As statistical bibliography grew, so did bibliometric theory, and in 1963 the Science Citation Index (SCI), Web of Science’s evolutionary ancestor, was born. In the mid-2000s, Elsevier launched Scopus as a competitor to Web of Science, with many Elsevier-published journals filling out the index (Sugimoto & Larivière, 2018). As these bibliographic indices became ubiquitous and reliance on citation metrics rose, so did concerns about the increasing use (or misuse) of research metrics, including but not limited to the Journal Impact Factor, citation counts, and the h-index, as replacements for qualitative judgment of an individual scholar’s research quality, potential, and organizational future.
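
To make concrete what such author-level metrics compute, the following sketch (not drawn from any cited source; the data are fabricated) calculates an h-index, the largest number h such that a scholar has h papers cited at least h times each:

```python
def h_index(citation_counts):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citation_counts, reverse=True)  # most-cited first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the rank-th paper still has >= rank citations
        else:
            break
    return h

# Five papers cited 10, 8, 5, 4, and 3 times yield an h-index of 4:
# four papers each have at least four citations.
print(h_index([10, 8, 5, 4, 3]))  # 4
```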

In 2012, the San Francisco Declaration on Research Assessment, colloquially known as DORA, was written as a response to the overreliance on metrics in research assessment and as a call for the “responsible use of metrics” by faculty, administrators, publishers, funders, and others (American Society for Cell Biology, 2022a, para. 4). DORA specifically identified problems with relying on the Journal Impact Factor, which had become conflated with publication quality, and advised against using such journal-level metrics as a substitute for the quality of an individual article. DORA tailored distinct recommendations for funding agencies, institutions, publishers, researchers, and bibliometric organizations: it advised transparency, avoiding the Journal Impact Factor in discussions of journal quality, creating article-level metrics to assess the work in question, creating spaces and opportunities to outline researcher contributions in multi-author articles, and making content-based evaluations (American Society for Cell Biology, 2022b). In 2015, the Leiden Manifesto emerged as another academic counter to the ethical issues surrounding the overuse and misuse of metrics. The Manifesto detailed 10 principles, including: pairing quantitative assessment of research with qualitative judgment; measuring research against an institution’s mission, strategy, and goals; keeping processes transparent; giving the evaluatee the opportunity to recreate the analysis; accounting for disciplinary differences in publication and citation habits; encouraging peer review of a researcher’s work; using a variety of metrics rather than relying on one that can be gamed, and displaying and communicating those metrics and associated data appropriately; and periodically evaluating the metrics themselves to determine whether they need to be modified (Hicks et al., 2015).

For librarians, the ethical and responsible evaluation of information, whether that information arises in the production or the assessment of knowledge, is an information literacy competency and a core tenet of the profession. Biblio- and scientometricians, too, value the responsible evaluation of metrics, and as part of the INORMS and LIS-Bibliometrics communities, they develop and share best practices for the ethical evaluation of research and its impact. In 2019, the INORMS Research Evaluation Working Group launched the SCOPE Process for the responsible evaluation of research. Tailored to research management groups and senior leaders across various industries, the five-step SCOPE Process helps leaders and faculty ensure that institutional metrics are aligned with and evaluated in concert with an institution’s mission and vision, rather than letting external forces (e.g., funding agencies or accrediting bodies) define metric importance (Himanen & Gadd, 2019).

Of course, authors, departments, institutions, librarians, accrediting bodies, and publishers also share a common goal: They want reassurance that their research contributions matter, are impactful, and are appropriately attributed to them. Enter publication and researcher identifiers, which differentiate publications and authors and enable the increased discovery and attribution of work within disciplines.

Selecting Tools

At William & Mary, Scopus is the bibliographic tool of record; it replaced Web of Science when the University Libraries discontinued their subscription shortly after Scopus’s procurement. Concurrently, William & Mary adopted SciVal, which relies on Scopus data to produce research impact reports. Although Web of Science remained available for assessing research impact for approximately six months after SciVal’s procurement, the AACSB Level 1 team agreed that it would be best to learn and utilize a tool with longevity, and it set Web of Science aside in favor of Scopus and SciVal. Scopus, SciVal, and Scopus IDs served as the foundation for the School of Business’s research impact analysis, as they supplied raw, accurate publication data about William & Mary’s research outputs, as well as individual faculty contributions. The business librarian also used Scopus and SciVal to examine relationships between the business school and its self-identified peer, aspirant, and competitor institutions by retrieving data from SciVal about the concentration of publications in the Financial Times Top 50 (FT50) journals within business and management.

Later, when considering and comparing research contributions across institutions, the business librarian retrieved data from Data Direct, AACSB’s data universe, on AACSB peer, aspirant, and competitor institutions, as well as the subset of institutions identified by the Faculty Research Committee as research peers. To better understand each business school in the context of its university ecosystem, the business librarian also utilized the Integrated Postsecondary Education Data System’s (IPEDS) “Compare Institutions” portal to retrieve data on each of these institutions, including enrollment, degrees conferred, faculty profile, prioritization of activities, and research emphasis.

Process

From August through mid-October 2020, the business librarian attended training and professional development sessions centered on bibliometrics, Scopus, and SciVal. These included sessions facilitated by Elsevier representatives on the data sources underlying both Scopus and SciVal, as well as the Bibliometric Training Series. As the National Institutes of Health (NIH) has noted, bibliometrics training opportunities are limited; the NIH therefore developed the Bibliometric Training Series as asynchronous modules, freely available to the public, that deepen understanding of bibliometric principles and practice (https://www.nihlibrary.nih.gov/services/bibliometrics/bibliometrics-training-series). To round out her training, the business librarian also sought information from LIS-Bibliometrics via their blog, The Bibliomagician (https://thebibliomagician.wordpress.com/), and from bibliometric experts in Virginia, including colleagues at Virginia Tech Libraries.

Upon deepening her understanding of bibliometrics and the ethical issues involved, the business librarian sought to understand the School of Business’s values related to scholarly research and the production of knowledge and to determine whether those values aligned with any existing metrics in Scopus and/or SciVal. In late October 2020, the business librarian guided a values-and-metrics translation exercise with the Faculty Research Committee, using the SCOPE Process to structure the discussions. She framed the exercise with the Leiden Manifesto and DORA and advised faculty that the analysis should be both qualitative and quantitative, without overreliance on a single metric. The business librarian also identified the need for the process to be transparent and replicable and encouraged faculty to log in to Scopus and SciVal and build their understanding of both.

The SCOPE Process involves five steps: start with what you value; consider context; identify all options for measuring; probe the measurements; and evaluate not only the metrics chosen but the evaluation process itself. The business librarian worked with the research committee to identify faculty, administrator, and institutional values and to determine how those aligned and diverged. She advised faculty to consider the context for measuring, taking into account Mason faculty values as well as industry values (such as the FT50, used in calculating the school’s research rank) within business publishing across academe. The business librarian also relied upon “Using SciVal Responsibly,” a working guide developed by INORMS, to explain the pros, cons, and pitfalls of various metrics as faculty considered which were most appropriate for use. Finally, the business librarian encouraged the Faculty Research Committee to consider the school’s definitions of the AACSB terms scholarly academic, scholarly practitioner, practice academic, and instructional practitioner (SA, SP, PA, and IP), as well as existing incentive structures that promoted research.

The research committee identified four values that were consistently embedded across business and school practice: productivity, citation counts (both inclusive and exclusive of self-citations), FT50 publication, and SCImago Journal Rank (SJR). Although the list included two different journal impact measures, the committee stipulated that these were industry-wide standards and recommended including that justification in the Continuous Improvement Review report. In collaboration with the Faculty Research Committee, the business librarian determined that multiple faculty values could be measured using SciVal, and she worked with the committee to translate these internal values into the external system’s metrics (a toy calculation after the list below illustrates the measures):

  • Overall productivity: scholarly output

  • SJR: number and percentage of publications in top-SJR business journals (top 1%, 5%, 10%, and 25%)

  • Citation counts (including and excluding self-citations): total citation counts, average citations per publication, and median citations per publication

  • FT50: total number and percentage of publications in FT50 journals in the sample
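
To make these translations concrete, the sketch below computes each measure from a fabricated publication sample; the field names are invented stand-ins for the data SciVal reports, not SciVal’s actual schema, and the real figures were retrieved from SciVal rather than computed by hand.

```python
# Fabricated publication records with invented field names.
publications = [
    {"citations": 42, "self_citations": 3, "ft50": True,  "sjr_percentile": 1},
    {"citations": 7,  "self_citations": 1, "ft50": False, "sjr_percentile": 18},
    {"citations": 0,  "self_citations": 0, "ft50": True,  "sjr_percentile": 4},
    {"citations": 15, "self_citations": 2, "ft50": False, "sjr_percentile": 60},
]

# Overall productivity: scholarly output.
output = len(publications)

# FT50: total and share of publications appearing in FT50 journals.
ft50_total = sum(p["ft50"] for p in publications)
ft50_share = 100 * ft50_total / output

# SJR: share of publications in top-ranked journals at each cutoff.
sjr_shares = {
    cutoff: 100 * sum(p["sjr_percentile"] <= cutoff for p in publications) / output
    for cutoff in (1, 5, 10, 25)
}

# Citation counts, inclusive and exclusive of self-citations.
cites_incl = sum(p["citations"] for p in publications)
cites_excl = cites_incl - sum(p["self_citations"] for p in publications)

print(output, ft50_total, ft50_share)  # 4 2 50.0
print(sjr_shares)                      # {1: 25.0, 5: 50.0, 10: 50.0, 25: 75.0}
print(cites_incl, cites_excl)          # 64 58
```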

To accurately retrieve data on these metrics, the business librarian created a hierarchical research group of William & Mary faculty who were employed in academic year 2020. To ensure that researcher profiles were accurate, she tagged faculty with known researcher IDs, including ORCiD and Scopus ID, identified their faculty qualification category (SA, SP, PA, or IP) and their area of focus within the business school, and uploaded the research group into SciVal. (This richer metadata proved invaluable: accounting has a separate AACSB accreditation process and was part of the joint Peer Review Team visit in February 2021, so we were able to analyze accounting faculty both as part of and separately from the whole.)
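
A minimal sketch of the roster metadata follows. The column names and values are illustrative only (they do not reflect SciVal’s actual import format), but they show how identifier and qualification tags make subgroup analysis, such as isolating accounting faculty, straightforward.

```python
import csv

# One row per faculty member; IDs disambiguate researchers, and the
# qualification and area tags allow subgroups to be analyzed separately.
# All values are fabricated examples.
roster = [
    {"name": "Researcher A", "orcid": "0000-0000-0000-0001",
     "scopus_id": "11111111111", "qualification": "SA", "area": "Accounting"},
    {"name": "Researcher B", "orcid": "0000-0000-0000-0002",
     "scopus_id": "22222222222", "qualification": "PA", "area": "Finance"},
]

with open("research_group.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(roster[0]))
    writer.writeheader()
    writer.writerows(roster)

# Subgroup selection, e.g., accounting faculty only:
accounting = [r for r in roster if r["area"] == "Accounting"]
```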

Using that group composition, the business librarian limited publications to the ranges 2015–2020 and 2010–2020, acknowledging that 2020 was an incomplete publication year and would not be fully indexed until July 2021. She ensured that data were filtered by the All Science Journal Classification (ASJC) system used in Scopus, retrieved the aforementioned metrics, and partnered with the associate dean of faculty and academic affairs to analyze the data. The team also identified limitations of the metrics data. For example, while citation counts indicate the number of times a particular article has been cited, the number alone does not address how that article is discussed within the literature. Additionally, the average citation rate can be skewed by highly cited outliers, which we addressed by assessing median citation rates for different types of scholarly contributions. Finally, along with the associate dean, the business librarian wrote much of the narrative related to metrics and scholarly impact in the school’s CIR report, which was submitted to the Peer Review Team in December 2020. In addition to including bibliometric data, the business librarian advised on additional ways to increase research visibility and impact, including ORCiD adoption and depositing preprints and postprints of faculty work into W&M ScholarWorks, William & Mary’s institutional repository.
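
The outlier problem noted above is easy to demonstrate with fabricated counts: one highly cited article pulls the mean far above the typical article, while the median is unaffected.

```python
from statistics import mean, median

# Fabricated citation counts: nine modestly cited articles and one outlier.
citations = [2, 3, 3, 4, 5, 5, 6, 7, 8, 250]

print(mean(citations))    # 29.3 -- dominated by the single outlier
print(median(citations))  # 5.0  -- reflects the typical article
```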

Later, after the Peer Review Team reviewed the CIR report, the business librarian was asked to deepen the school’s research analysis by determining how the School of Business’s publication standards, particularly the FT50, aligned with those of its peer, aspirant, and competitor institutions. Simultaneously, the research committee expressed interest in identifying a subset of research peers: institutions with similar numbers of students, programs, and faculty and a similar emphasis on research, teaching, and service, which might include but could also extend beyond the previously identified AACSB peers, aspirants, and competitors. Using AACSB Data Direct and IPEDS, the business librarian curated a file documenting these characteristics for all aspirant, peer, competitor, and potential research-peer institutions. Then, applying FT50 publication data from Scopus and SciVal, the business librarian determined the concentration of FT50 publications from 2015–2020, and again from 2018–2020. Not only did this inform the Research Committee’s selection of peers, it also enabled the school’s Steering Committee and the Peer Review Team to observe trends in FT50 publishing and William & Mary’s alignment with its peers.
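
The concentration calculation itself is a simple share. The sketch below uses fabricated institution-level counts; in practice, the publication totals and FT50 subsets came from Scopus and SciVal queries.

```python
# Fabricated counts of business/management publications, 2015-2020;
# "ft50" is the subset appearing in FT50 journals.
institutions = {
    "School A": {"total": 410, "ft50": 62},
    "School B": {"total": 295, "ft50": 71},
    "School C": {"total": 180, "ft50": 12},
}

# Concentration: the share of a school's output appearing in FT50 journals.
for name, counts in sorted(
    institutions.items(),
    key=lambda item: item[1]["ft50"] / item[1]["total"],
    reverse=True,
):
    share = 100 * counts["ft50"] / counts["total"]
    print(f"{name}: {counts['ft50']}/{counts['total']} = {share:.1f}% FT50")
# School B: 71/295 = 24.1% FT50
# School A: 62/410 = 15.1% FT50
# School C: 12/180 = 6.7% FT50
```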

Considerations

As librarians continue to serve in advisory capacities and share bibliometric knowledge, we must acknowledge the following considerations and challenges. First, while metrics can provide a helpful bird’s-eye view of institutional and faculty publishing behavior, many insights are lost when quantitative analysis lacks qualitative context. For example, citation metrics, which can be a useful tool for capturing overall reference to a particular work, lack qualitative assessment. Sentiment analysis would be a useful qualitative complement to citation metrics, as it would provide additional insight into how and why a particular work is referenced.

There are also challenges with using platforms like SciVal at face value to benchmark contributions against other institutions. SciVal generally lacks insights related to faculty attrition, and because of the interdisciplinary nature of the business discipline, the various subject classifications for journals, and differences in programs and majors by school, it can be difficult to compare apples to apples. Citation counts can be skewed by highly cited outliers, and some proprietary metrics are not easily replicated outside of the platform.

Being a scientometrician or bibliometrician requires functional expertise, and developing it is time-consuming. For librarians who wear multiple hats at their institutions, or whose institutions are understaffed, taking on bibliometric responsibilities may come at the expense of other critical services that libraries must offer.

Yet there are also opportunities for business librarians to demonstrate their own impact by helping faculty and administrators understand the cost of research and the methods and frequency of scholarly information consumption, as well as by sharing strategies for improving access to faculty publications. For example, the business librarian’s work at William & Mary resulted in the promotion and endorsement of additional researcher IDs, including the Open Researcher and Contributor ID (ORCiD), which disambiguates author names and increases the discoverability of scholarly work. ORCiD also has the potential to increase not only individual impact but institutional impact, including rankings. Further, Watermark Faculty Success’s ORCiD API enables data to be ingested into that system and then deployed to other internal and external platforms, with the potential to feed institutional repositories.
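
As an illustration of the discovery benefit, the sketch below retrieves a researcher’s works from ORCiD’s free public API (a separate interface from the Watermark integration described above). The iD shown belongs to Josiah Carberry, ORCiD’s fictitious demonstration researcher.

```python
import requests

# Query ORCiD's public API for the works attached to one ORCiD iD.
# 0000-0002-1825-0097 is Josiah Carberry, ORCiD's demonstration record.
orcid_id = "0000-0002-1825-0097"
response = requests.get(
    f"https://pub.orcid.org/v3.0/{orcid_id}/works",
    headers={"Accept": "application/json"},
    timeout=30,
)
response.raise_for_status()

# Each "group" bundles the versions of one work deposited by different
# sources; the first work-summary carries a representative title.
for group in response.json()["group"]:
    title = group["work-summary"][0]["title"]["title"]["value"]
    print(title)
```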

Because librarians understand the need for a “sustainable research infrastructure,” we can leverage research information management systems (RIMS) and our social media platforms and engagement tools to promote and celebrate published research. We can also partner with business schools to ensure that their high-impact journals are accessible in full text within our collections and, when they are not, offer them through an on-demand service. Finally, we can encourage open access publication by a) working with faculty to understand and negotiate copyright and Creative Commons publication rights, which might yield greater visibility for their work, and b) helping faculty upload content to institutional repositories and providing metrics on views, downloads, and global readership. All of these can positively influence not only an individual’s societal impact but also the societal impact of the business school.

References

American Society for Cell Biology. (2022a). About DORA. https://sfdora.org/about-dora/

American Society for Cell Biology. (2022b). San Francisco declaration on research assessment. https://sfdora.org/read/

Guth, L. (2022, July 21). AACSB standards toolkit for business librarians. https://brass.libguides.com/AACSB

Guth, L., & Stonebraker, I. (2021). AACSB accreditation standards: What they mean for business librarians past, present, and future. Journal of Business & Finance Librarianship, 27(1), 1–16. https://doi.org/10.1080/08963568.2021.1920173

Hicks, D., Wouters, P., Waltman, L., de Rijcke, S., & Rafols, I. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520, 429–431. https://doi.org/10.1038/520429a

Himanen, L., & Gadd, E. (2019, December 11). Introducing SCOPE: A process for evaluating responsibly. The Bibliomagician. https://thebibliomagician.wordpress.com/2019/12/11/introducing-scope-a-process-for-evaluating-responsibly/

Liu, G. (2021). Business librarians’ roles in supporting AACSB accreditation: A discussion about the potentials. Academic BRASS, 16(2). https://www.ala.org/rusa/sites/ala.org.rusa/files/content/sections/brass/Publications/Acad_BRASS/2021_fall_liu.pdf

Sugimoto, C. R., & Larivière, V. (2018). Measuring research: What everyone needs to know. Oxford University Press.