
Reshaping How Universities Can Evaluate the Research Impact of Open Humanities for Societal Benefit

Authors
  • Paul Longley Arthur (Edith Cowan University)
  • Lydia Hearn (Edith Cowan University)

Abstract

During the twenty-first century, for the first time, the volume of digital data has surpassed the amount of analog data. As academic practices increasingly become digital, opportunities arise to reshape the future of scholarly communication through more accessible, interactive, open, and transparent methods that engage a far broader and more diverse public. Yet despite these advances, the research performance of universities and public research institutes remains largely evaluated through publication and citation analysis rather than by public engagement and societal impact. This article reviews how changes to bibliometric evaluations toward greater use of altmetrics, including social media mentions, could enhance uptake of open scholarship in the humanities. In addition, the article highlights current challenges faced by the open scholarship movement, given the complexity of the humanities in terms of its sources and outputs that include monographs, book chapters, and journals in languages other than English; the use of popular media not considered as scholarly papers; the lack of time and energy to develop digital skills among research staff; problems of authority and trust regarding the scholarly or non-academic nature of social media platforms; the prestige of large academic publishing houses; and limited awareness of and familiarity with advanced digital applications. While peer review will continue to be a primary method for evaluating research in the humanities, a combination of altmetrics and other assessment of research impact through different data sources may provide a way forward to ensure the increased use, sustainability, and effectiveness of open scholarship in the humanities.

Keywords: social impact, altmetrics, bibliometrics, humanities, open scholarship, public engagement, research evaluation

How to Cite:

Arthur, P. L., & Hearn, L. (2021) “Reshaping How Universities Can Evaluate the Research Impact of Open Humanities for Societal Benefit”, The Journal of Electronic Publishing 24(1). doi: https://doi.org/10.3998/jep.788


Published on
12 Oct 2021
Peer Reviewed

Introduction

Evaluation has become a central part of the fabric of modern society, and public universities are increasingly judged against national policies that assess their levels of research and knowledge exchange in response to calls for greater public accountability, financial sustainability, and research impact (Murphy and Costa 2018; Watermeyer 2016). Yet defining, evaluating, and distinguishing the achievements of heterogeneous fields of research has made the assessment of university research outcomes increasingly complex (Haustein 2016). Until recently, research outputs at the university, faculty, and individual levels have been assessed primarily according to peer-reviewed publications and their citation analysis to illustrate quantifiable, material, and tangible results (Toledo 2018; Robinson-Garcia, van Leeuwen, and Rafols 2018). But this approach has never favored the humanities, for which journal impact factors and overall citation indices tend to be lower, with studies often focused on more localized contextual issues or published through detailed archives or manuscripts, the visible outcomes of which may be long-term (Hammarfelt and Haddow 2018; Ochsner, Hug, and Daniel 2016).

Throughout history, universities have been foundations for learning and knowledge creation. However, over the past two decades, with the massive growth and globalization of higher education, research evaluation schemes have placed increasing emphasis on the role that academics should play in addressing social issues and demands (Hazelkorn 2015; Boyer 1996). Thus, the mission of universities has been broadened not only to contribute to the creation of knowledge and higher education but also to bridge the gap between the makers and users of research through public engagement for societal benefit (Robinson-Garcia, van Leeuwen, and Rafols 2017; Beaulieu, Breton, and Brousselle 2018; Wallace and Rafols 2015). The emerging policy discourse around research now focuses on greater citizen participation in and co-construction of knowledge so that research findings can be shared, challenged, and, where possible, used and reproduced by and for the public and all intended users rather than being hidden behind publication paywalls and used by an elite few (Scanlon 2018; Veletsianos and Kimmons 2012; McKiernan 2017).

Spurred on by national research council agendas and the influence of the information age, open access, or full universal access to research outputs—such as the sharing of articles, data, methods, and educational resources—has become a fundamental principle for all universities (Berlin Declaration 2003; Budapest Open Access Initiative 2002). Today, open scholarship encompasses more than just open access and includes open data, open science, open society, open educational resources, and all other forms of openness in the scholarly and research environment, using digital or computational techniques or both (Lewis et al. 2015). In response, national policy directives have affirmed the importance of presenting research outputs in findable, accessible, interoperable, and reusable (FAIR) ways, with data made more easily available to connect academics with the communities they serve (Wilkinson and Dumontier 2016). Empowered by the rise of the social web and networked technologies, countless organizations—including journals, universities, research centers, funding agencies, and government and intergovernmental groups—have embraced open practices, making this part of their policy requirements.

The impact of open scholarship is becoming profound and changing the way academics and society produce, share, distribute, and create knowledge. Online publishing, the rise of open scholarship, personal (and often mobile) computing devices, social media, crowdsourcing, citizen scholars, and regulations for information access are all part of this changing picture (Arbuckle and Siemens 2015; Greenhow, Gleason, and Staudt Willet 2019). The standard model of research and scholarly communication is transforming from a closed, print-centric culture to the open sharing of knowledge and data among networks of researchers, organizations, and the broader community (Neylon 2015). With it, a new group of academics is producing and using online tools and platforms that offer opportunities to engage a far broader and more diverse public (Scanlon 2018). Through the use of these digital public spaces, academics can circumvent the power of traditional publishers and instead circulate their ideas via social media and research papers published in electronic, open access journals accessible to all (Neylon 2015; Murphy and Costa 2018; Tofield 2019). The boundaries between knowledge creation and knowledge dissemination are blurring, and academics’ roles and achievements are being redefined (Ren 2015).

Yet despite the advancements toward “open scholarship” and “responsible research and innovation” aimed at enhancing public engagement, outdated metrics that focus on publication and citation analysis remain the key approach to assessing research performance. The overall aim of this article is to explore how, as open scholarship is making research more freely available to the general public, new social metrics could enrich and reshape the way universities evaluate research impact for societal benefit. As a stepping-stone, the article reviews the history of research performance assessment by public universities and how this has both influenced and restricted the passage of academic research toward a broader mission for societal inclusion. The authors then examine the potential for alternative metrics as indicators for research assessment in the digital era and illustrate how open scholarship could influence and impact research outputs, with a focus on the humanities. The article provides a critique of altmetrics, analyzing some of the current concerns voiced by academics in this field of research and how altmetrics compare with traditional metrics. Finally, it looks at key drivers for change and examines how the diversification of research assessment focused on networked analysis and the mapping of the contexts in which researchers and diverse community stakeholders are engaged could offer new pathways for universities to assess impact and benefit, especially in the field of humanities. It concludes by making a case for universities to work together with key stakeholders to restructure their assessment policies in line with national and international calls for more open universal access to knowledge for societal benefit.

The History of Evaluating Academic Research Productivity

Introduced in 1665, the practice of peer-reviewing publications has become the dominant model for the dissemination of new knowledge through “scholarly communication” (Haustein, Sugimoto, and Larivière 2015). This has involved expert reviews to assess and verify the legitimacy of the academic knowledge being presented and, where necessary, to improve research outputs prior to their printed release to a broader intellectual audience (Moed et al. 1985). Yet as journals and publishers proliferated and researchers faced ever more literature to sift through, citation counts were proposed in 1927 as a way to quantitatively evaluate the importance of papers (Gross and Gross 1927); they have since become the principal method for measuring research outputs globally (Bornmann and Daniel 2008; Garfield 2006). Citation analysis, commonly known as bibliometrics, is used not only to filter information retrieval but, more significantly, as a tool for indicating “research productivity” or “research quality” through techniques such as the journal impact factor or h-index to rank and categorize publications and their citation levels (Haustein, Sugimoto, and Larivière 2015; Hirsch 2005).
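
The h-index mentioned above has a simple operational definition that a short sketch can make concrete: a researcher has index h if h of their outputs have each been cited at least h times. The Python sketch below is a generic illustration of that definition only; it is not tied to any particular bibliometric database or provider.

```python
def h_index(citation_counts):
    """Compute the h-index: the largest h such that the author has
    at least h papers with at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, citations in enumerate(counts, start=1):
        if citations >= rank:
            h = rank  # this paper still satisfies the threshold
        else:
            break
    return h

# Example: five papers with these citation counts give an h-index of 3.
print(h_index([10, 8, 5, 2, 1]))  # -> 3
```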

Today, bibliometric indicators remain the dominant form of research evaluation that universities use to illustrate their international research excellence ranking, seek grant funding, and review staff performance for hiring and promotion (Haustein 2016). However, with current policy demands for greater demonstration of the societal impact of research outputs, growing criticism has emerged around the nature of publishing and bibliometrics, with emphasis on their oversimplification and limited capacity to demonstrate behavioral or societal change (Hammarfelt and de Rijcke 2015; de Rijcke et al. 2016). This, together with the pressure on academics to “publish or perish,” is resulting in “salami” publishing of project outputs across numerous journals, self-plagiarism, honorary authorship, self-citation, and “citation cartels” (Haustein, Sugimoto, and Larivière 2015), and it has also placed pressure on universities to pursue new methods for research assessment. Within this changing environment, emphasis is now being directed toward the design of innovative assessment approaches for capturing “research impact” in terms of improving society, culture, economy, environment, health, and quality of life—or, in other words, how these public benefits have occurred because of this new knowledge (Research Excellence Framework 2015; Spaapen and van Drooge 2011).

To implement this shift in focus, numerous governments are introducing policies aimed at raising the level of collaboration and social relations between researchers, multidisciplinary stakeholders, and the broader community to promote greater citizen participation in the design, planning, and implementation of research projects for the co-creation of knowledge (Research Excellence Framework 2015; European Commission 2017). Such shifts in policy have been supported through new frameworks to assess the societal impact of research. The United Kingdom’s Research Excellence Framework 2014 is a well-cited example; it noted that publication channels or publisher classifications should not be used by panels to assess impact in the humanities and introduced a voluntary analysis of researchers’ outputs over a six-year period to evaluate achievements (Research Excellence Framework 2015).

Norway, Spain, and other European countries have introduced similar models, including, for example, inventories of case studies to assess the societal impact of researchers’ outputs according to criteria developed for their field of study as well as their career stage and level, emphasizing process evaluation to assess social outreach and public engagement (Toledo 2018). In the United States, the introduction of the Crowdsourcing and Citizen Science Act of 2015 has meant that citizens should not only be actively involved in research projects but also play a central role in defining the research agenda and hence its evaluation (Robinson-Garcia, van Leeuwen, and Rafols 2018). The recent National Dutch Research Agenda also focuses on collaborative efforts to engage researchers, communities, and industry in the setting of national research priorities. While these models are controversial and costly, they have raised awareness of the need to adapt research evaluations to different fields of research and their broader socioeconomic objectives in order to respond to societal needs (McKiernan 2017).

Alongside these policy changes, the rise of the web and in particular the social web has meant that universities are at a critical juncture, where multidisciplinary collaborative research underpinned by digital methods and standards is providing new avenues for researchers to openly engage with the broader community. Social media is becoming a central part of scholarly communication, enabling researchers to raise their visibility, actively connect with others, and freely disseminate their work to a larger audience (Haustein 2016; Rowlands et al. 2011). Publications, manuscripts, conference presentations, and policy statements can now be made openly available online; data, methods, and complex software tools can be shared through digital platforms to offer public spaces for citizen participation in knowledge-based activities; research plans, processes, and outcomes can be presented, discussed, and criticized openly through blogs, wikis, and other such forums, including online chats; and findings can be illustrated more visually through new peer-review approaches with anonymous or non-anonymous referees and with opportunities for the public to post open review comments, questions, and assessments (Haustein, Sugimoto, and Larivière 2015; McKiernan 2017; Bartling and Friesike 2014).

Advances in the digital world mean that research is entering a new era of design and development. Through open scholarship, research can now be created, produced, analyzed, and disseminated using digital or computational techniques, or both, to better connect academics with one another and to the communities they serve (Lewis et al. 2015). The social web, with its many applications, can promote participation, interconnections, social interaction, and user-generated content (Greenhow and Gleason 2014). These virtual public spaces allow for the sharing, reuse, analysis, and management of data in new ways and for researchers, community stakeholders, and local citizens to work together to explore and solve research challenges through new cross-disciplinary lenses (Murphy and Costa 2018).

Yet despite the potential for open scholarship, the system for scholarly publishing has continued to be dominated by for-profit publishing companies, in which the costs for publication in and open access to electronic articles, books, and documents have increased exponentially (Australasian Open Access Strategy Group 2018). This has meant that in addition to the substantial investment of public funds for research, major government contributions are required to sustain academic libraries. Due to growing concerns about the cost of access to scholarship, global coalitions have called for the development of online public library systems that would allow the full contents of published research and scholarly discourse to be offered in a freely accessible, fully searchable, interlinked form. Moreover, international governmental and private funding bodies have emphasized the need to focus on not only making research outputs freely available but more importantly presenting these in more FAIR ways so that they can be more easily understood by the broader community for societal benefit (Wilkinson and Dumontier 2016). In 2018, cOAlition S, a European consortium of major national research agencies and funders from twelve countries, launched Plan S (https://www.coalition-s.org/), which calls for researchers who benefit from state-funded research to publish their work in open journals or online platforms and to make data immediately available to all through open access repositories by 2021 (Science Connect 2019).

Notwithstanding these major strategies for changing research assessment and publishing policies, the complexity of the system, the financial costs imposed by for-profit publishers to make research outcomes openly available, and the lack of incentives offered by universities for open scholarship have produced inertia among researchers when it comes to adopting more open, efficient, and equitable ways of engaging with the broader public in the development and dissemination of research. Despite the significant opportunities offered by today’s digital landscape to make research more responsive and inclusive, universities, faculties, and researchers continue to place excessively high value on publications in specific prestigious journals and books that are ranked and assessed through traditional bibliometric indicators (Narayan and Luca 2017; McKiernan 2017).

To address these issues, alternative metrics—or altmetrics—are emerging as a new option to evaluate research activities from open sources (Priem 2014; Priem and Hemminger 2010), yet researchers’ lack of knowledge about altmetrics has limited their uptake and caused confusion and misuse (Lemke et al. 2019). To raise familiarity with and interest in these new approaches, the San Francisco Declaration on Research Assessment (Cagan 2013) and the Leiden Manifesto (Hicks et al. 2015) have provided recommendations and guidelines for funding agencies, academic institutions, individual researchers, journals, and organizations that supply the metrics. The policy environment is shifting, and moves are underway to develop more open, congruent ways for universities to reshape their assessment of research for societal benefit, but many barriers still remain (Toledo 2018; Beaulieu, Breton, and Brousselle 2018).

Alternative Metrics to Assess Public Engagement

Within this context, bibliometrics are now seen as the traditional way of measuring the impact of research solely through academic outlets, whereas altmetrics are considered a new approach to assessing the societal reach and impact of research by tracking and measuring public engagement through social media outlets (Bornmann 2014; Piwowar 2013). Although broadly referred to as the measurement of online activities and interactions relating to research output or scholarly content derived from social media or Web 2.0 platforms, the definition of altmetrics is unclear and changing with the emergence of new digital possibilities, including those via application programming interfaces (APIs) (Haustein 2016).

The term altmetrics was first introduced in the Altmetrics Manifesto, where it was defined in contrast to (and as excluding) bibliometrics (Priem et al. 2010). Yet altmetrics have often included article-level metrics (ALM), which can comprise webometrics and usage metrics, such as downloads, page view counts, or hyperlinks, to measure the usage of information on the World Wide Web (Almind and Ingwersen 1997)—indicators that have been available for some time and do not involve social media platforms (Haustein 2016). Accordingly, altmetrics can now be defined as the “study and use of scholarly impact measures based on activity in online tools and environments” (Priem 2014). In other words, altmetrics are moving away from citation analysis (and the analysis of downloaded data) toward social web usage analysis to observe engagement with research outputs on social media platforms, tools, and other scholarly activities. Most importantly, this approach includes various community user activities in social media environments (Bornmann 2014).

Within this framework, altmetrics can be seen as a heterogeneous subset of scholarly metrics that involve “indicators based on recorded events of acts (e.g., viewing, reading, saving, diffusing, mentioning, citing, reusing, modifying) related to scholarly documents (e.g., papers, books, blog posts, datasets, code) or scholarly agents (e.g., researchers, universities, funders, journals)” (Haustein 2016). As such, it is argued that the extent or frequency of “use” of, or public engagement in or with, research output through social media outlets can be seen as a way of assessing the real or direct impact of research for society (Neylon, Willmers, and King 2014). Moreover, it is suggested that by adopting altmetrics and promoting open scholarly communication, universities now have the potential to democratize access to information by making research freely available to all (McKiernan 2017; Murphy and Costa 2018).

Today, aggregators of altmetrics data (e.g., Altmetric.com, Plum Analytics, and Crossref Event Data) draw on different social media tools and platforms to collate and present indicators of who is engaging with a research output and why (Lin and Fenner 2013). For example, Plum Analytics aggregates five categories—captures, usage, citations, mentions, and social media—for over twenty different types of research outputs, including journal articles, books, videos, presentations, and datasets. It focuses primarily on scholars’ use to assess impact but also includes public usage. Altmetric.com tracks where published research is mentioned online to reflect the reach or popularity of the research output. Sources include academic citations and academic platforms, posts on social media and web forums (blogs, Facebook pages, Google+, news outlets, Reddit, Twitter, CiteULike), Wikipedia articles, and reference management on Mendeley. However, unlike Plum Analytics, it does not include information about citation and usage (Ortega 2018). By examining data on social media platforms like Twitter, which serve as an interface between academics and the broader community, these aggregators of altmetric data are analyzing the extent to which researchers are engaging in conversations with a wider public audience (Haustein, Bowman, and Costas 2015).
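
To indicate how such aggregators expose their data programmatically, the sketch below queries Altmetric.com’s public v1 REST API for a single DOI (this article’s own DOI is used purely as an example). The endpoint and the field names printed at the end reflect the publicly documented API as we understand it at the time of writing and should be read as assumptions rather than a stable specification.

```python
import requests

def fetch_altmetrics(doi):
    """Query Altmetric.com's public v1 API for attention data on one DOI.
    Returns the parsed JSON record, or None if the DOI is not tracked."""
    url = f"https://api.altmetric.com/v1/doi/{doi}"
    response = requests.get(url, timeout=10)
    if response.status_code == 404:  # DOI not tracked by the aggregator
        return None
    response.raise_for_status()
    return response.json()

# Print a few commonly reported indicators (field names are assumptions
# based on the public documentation and may change).
record = fetch_altmetrics("10.3998/jep.788")
if record:
    for key in ("score", "cited_by_tweeters_count", "cited_by_fbwalls_count",
                "cited_by_wikipedia_count", "readers_count"):
        print(key, record.get(key, 0))
```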

Thus, although bibliometric providers such as Google Scholar, Web of Science, and Scopus now include web mentions and users’ geographic locations in addition to citations, altmetrics providers aggregate classification data from social media platforms according to the following purposes and functionalities (Haustein 2016), as grouped in the brief sketch after this list:

  • Social networking (e.g., ResearchGate, Academia.edu, LinkedIn, Facebook)

  • Social bookmarking and reference management (e.g., Mendeley, CiteULike, Zotero)

  • Social data sharing, including datasets, software code, presentations, figures, and videos (e.g., YouTube, Figshare, GitHub, Jupyter)

  • Blogging and microblogging (e.g., ResearchBlogging, WordPress, Twitter)

  • Wikis (e.g., Wikipedia)

  • Social recommending, rating, and reviewing (e.g., Reddit, F1000Prime, Goodreads)
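
As a minimal sketch of how per-platform activity might be rolled up into the functional categories above, the following example uses invented source names and counts rather than any aggregator’s actual schema.

```python
from collections import defaultdict

# Hypothetical category map and per-source mention counts for one output.
CATEGORY = {
    "Twitter": "blogging and microblogging",
    "WordPress": "blogging and microblogging",
    "Mendeley": "bookmarking and reference management",
    "Wikipedia": "wikis",
    "Reddit": "recommending, rating, and reviewing",
    "Facebook": "social networking",
}
mentions = {"Twitter": 42, "Mendeley": 17, "Wikipedia": 2, "Reddit": 5}

# Aggregate raw per-platform counts into the broader functional categories.
by_category = defaultdict(int)
for source, count in mentions.items():
    by_category[CATEGORY.get(source, "other")] += count

for category, total in sorted(by_category.items()):
    print(f"{category}: {total}")
```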

As is apparent from these different platforms, the heterogeneity both within and between these categories is significant and reflects the diversity of users and their needs. Arguably, the benefits of altmetrics over traditional bibliometrics are numerous. First, a key advantage of altmetrics is their potential to demonstrate how widely the broader society is sharing and discussing university research products (Priem, Piwowar, and Hemminger 2012). Studies have shown that a broad public—professionals, government and industry staff, students, and non-academics—accesses and uses research but does not cite it in academic publications (McKiernan 2017). This indicates the limitation of using citation counts and illustrates how alternative metrics can offer ways of assessing the interest and opinions of a much wider audience, whose engagement should be incorporated alongside, and valued equally with, traditional bibliometric counts in the assessment of universities’ outputs and achievements (Hammarfelt 2014; McKiernan 2017). By making research outputs open and freely available, universities can help fulfill their mission of sharing research so that it has the broadest and quickest societal impact, while satisfying government and private funding agencies’ requirements to justify budgets through indications of societal benefit (Montgomery et al. 2018). Moreover, altmetrics can demonstrate different forms of impact, including, for example, changes to policy or clinical practice, environmental improvements, and education or cultural activities that emerge as a result of research outputs (Bornmann 2014).

A second benefit of altmetrics is their ability to measure the uptake or interest in a paper or other product of research within a few days of release, as it can be read, saved, bookmarked, and even discussed almost instantaneously (Priem 2014), circumventing several weaknesses of citations as indicators of academic importance (Lemke et al. 2019). Different platforms can enable researchers to track the interests of a range of users over time, assessing the traffic on search engines and the usage of specific components of the platforms to enable them to personalize and tailor their research to particular needs via the analysis of structured altmetrics data (Hearn, Miller, and Lester 2014).

Third, altmetrics not only provide more diverse kinds of data from different sources; they also enable the evaluation of a richer variety of products, not merely publications. For example, the sharing of data, software, grey literature, protocols, and slides among other products of research can encourage increased communication and collaboration within and between disciplines, allowing new forms of analyses with the potential to accelerate advances in knowledge (Piwowar 2013; Wouters, Zahedi, and Costas 2019). In addition to the impact of these products, altmetrics can help universities illustrate their efforts to overcome any negative ivory tower image and engage the public by tracking the views, usage, or circulation of online open courses; community, radio, television, and public media presentations; outreach events; and public impact stories through university websites and social media (McKiernan 2017; Murphy and Costa 2018).

Yet despite these benefits, the promotion and implementation of altmetrics have not come without significant barriers. Haustein (2016) outlined the “grand challenges of altmetrics” under the categories of data quality, heterogeneity, and technological dependencies. Bibliometric data are static, so biases and errors can be corrected, whereas altmetrics data sources are dynamic; accuracy, consistency, and replicability are therefore major issues, and problems can occur at the level of data providers (e.g., ResearchGate, Academia.edu, Mendeley, Twitter), data aggregators (e.g., Altmetric.com, Plum Analytics), and users (Haustein 2016). Moreover, usage changes continually, and findings based on the tracking of engagement may quickly become outdated. It is often difficult, if not impossible, to analyze the opinions and interests of different users (Robinson-Garcia, van Leeuwen, and Rafols 2018). Technical issues include discrepancies in altmetrics according to their data sources, many of which are owned by for-profit companies that operate under pressure to demonstrate outcomes, with the goal of increasing their revenues by addressing the demands of research funders and policymakers who require indicators of societal impact (Haustein 2016). The linking of social media with societal impacts can also be challenging, subjective, and problematic given the heterogeneity of users and sources and the lack of a conceptual foundation (Robinson-Garcia, van Leeuwen, and Rafols 2018). Moreover, like bibliometrics, leading altmetric providers (Altmetric.com, Plum Analytics, and Crossref Event Data) focus too much on numerical scores, usage, or attention as opposed to societal impact, and all rely on DOIs or other persistent identifiers, which are typically only assigned to traditional research outputs (Greenhow, Gleason, and Staudt Willet 2019; Robinson-Garcia, van Leeuwen, and Rafols 2018; Joly et al. 2015). Thus, while social media platforms can reach broad socioeconomic, demographic, and geographic domains, this does not imply that their users represent the general public (Alperin et al. 2018; Sugimoto et al. 2017). Measuring attention does not necessarily signify uptake or impact but perhaps better illustrates scholarly communication; hence a combination of traditional bibliometric data around citations complemented by altmetrics can offer a more complete picture of societal influence (Greenhow, Gleason, and Staudt Willet 2019; Wilsdon et al. 2015; Haustein 2016).

Moreover, while social media may enable a digital public sphere for online debate—through, for example, Facebook and Twitter—these fora may lead to low-grade discussion based on limited evidence and accompanied by insults, anger, and hostility (Murphy and Costa 2018). Also, given that not everyone nationally or internationally uses social media platforms, has the requisite literacy skills, or speaks the same language, there is a bias and an equity issue in measuring impact based on a specific sample of users, who may vary according to age, culture, socioeconomic status, and professional interest (Neylon, Willmers, and King 2014; Robinson-Garcia, van Leeuwen, and Rafols 2018; Greenhow, Gleason, and Staudt Willet 2019). Another concern about altmetrics is their susceptibility to gaming (Sugimoto et al. 2017). Moreover, it is difficult to make comparisons across disciplines in terms of societal impact achievements, and as yet, altmetrics have not overcome this barrier (Lemke et al. 2019).

Thus, while altmetrics are still in their early days compared with traditional published research outputs and bibliometrics, the use of social media offers promising opportunities for universities to capture the impact of research outputs for societal benefit (Ortega 2018; Díaz-Faes, Bowman, and Costas 2019). Increasing interest in altmetrics among government and private funding agencies is illustrated in recent policy documents, such as the Metric Tide Report (Wilsdon et al. 2015) and the European Commission Expert Group on Altmetrics (Wilsdon et al. 2017). Yet while numerous advances have been achieved in the fields of physics and medical and health sciences (Laporte 2017), the evaluation of research outputs in the field of humanities continues to produce tensions (Toledo 2018).

Current Barriers for the Humanities

Humanities scholars have continually struggled to accept the approaches used to evaluate research impact according to quantitative metrics, especially those based on traditional citation analysis (Hammarfelt and Haddow 2018; Ochsner, Hug, and Daniel 2016). The humanities are diverse, and research can be presented through a multitude of communication channels—including books, book chapters, monographs, art, music, and digital visualizations, among others—as opposed to the more classical peer-reviewed journal articles of the “hard” sciences; these channels consequently receive limited recognition and coverage in academic databases (Lemke et al. 2019; Toledo 2018). With international calls for more open scholarship and responsible research, altmetrics offer new hope for humanities scholars, yet their usage continues to be limited, with a series of barriers preventing their uptake (Hammarfelt and Haddow 2018; Hammarfelt 2017; Narayan et al. 2018; Suber 2017).

Whereas the fields of physics and mathematics have had their own subject-specific open access repository, arXiv, and the biomedical sciences have been supported through the PubMed Central digital archiving repository, allowing readers free access to either pre-print or post-print versions, the humanities have not yet created a publication “culture” focused on the use of open digital repositories (Narayan and Luca 2017; Lemke et al. 2019). Despite initiatives like Humanities Commons, limited knowledge of the FAIR principles and the value of open scholarship, both for community engagement and for making research outputs freely available to the public whose tax dollars support research, has meant that many humanities scholars view the self-archiving of their research outputs in academic repositories as merely a cumbersome administrative requirement (Armstrong 2014; Narayan and Luca 2017). Thus, while university repositories and academic social networks such as Academia.edu, ResearchGate, and LinkedIn offer great potential to break down current barriers to scholarly communication, they remain underutilized by humanities researchers and similarly are rarely viewed by users (Narayan and Luca 2017; Veletsianos 2016). Although most of these academic social media networks allow self-archiving, publisher policies can be challenging, raising concern among researchers about breaching copyright (Narayan et al. 2018; Kim 2011).

As a result, while digital environments and platforms can help researchers build connections and networks and encourage openness and sharing, humanities scholars’ use of social media continues to be fragmented and siloed (Veletsianos 2016), with most scholars conforming to traditional publishing practices assessed through peer review, which are seen as the most respectable means of sharing their work (Armstrong 2014; Peekhaus and Proferes 2015). With promotion and grant funding still largely based on classical bibliometrics, this reinforces scholars’ emphasis on publications in prestigious journals and with publishers that continue to be favorably weighted by academic committees (Odell, Coates, and Palmer 2016), resulting in a lack of incentives for researchers to digitally share their research, data, and code (Dermentzi and Papagiannidis 2018; Jamali, Nicholas, and Herman 2016). While not-for-profit open access platforms do exist in the humanities, some consider these to be of lower quality than the established journals run by commercial entities and monographs printed by top-tier private publishers (Peekhaus and Proferes 2015).

In compliance with national and international funding agencies’ requirements for evaluating research according to social impact, many commercial publishers are now offering open access to their publications, including monographs and specialized edited volumes, and have incorporated tools to provide information on views, downloads, usage, and even users’ geographical locations. However, this form of open access often comes at a price to authors in the form of high article processing charges (Toledo 2018; Montgomery et al. 2018). Fees to publish online are also sometimes charged to authors only after an article or book has been accepted, and the massive rise in the number of journals has raised concerns about the trustworthiness of new publishers (Tenopir et al. 2016). Some are being labeled “predatory” publishers for omitting or altering the peer-review process seen as necessary to guarantee the quality of the evaluation (Beall 2017). These criticisms are highly controversial, leaving many people confused and suspicious of the credibility of research on open scholarship sites. In some cases, this has resulted in public misunderstanding about the legitimacy of online sources (Tenopir et al. 2016).

Thus, although new digital options are available that offer libraries, universities, and research institutes the ability to transition a wide variety of monographs and journals from paywalled to open access content, their economic sustainability and diversity continue to be a challenge, and few humanities scholars are aware of the value of these alternative publication channels. The humanities are also not uniform or linear in nature, comprising fluid topics and varied contexts analyzed through different methodologies rather than focusing on single causal factors (Hammarfelt 2017; Laporte 2017). Hence, the intended audience of the humanities is diverse, differs from study to study, and can include international and national scholars, journalists, librarians, archivists, and the broader public (Hammarfelt 2017). But rather than having a global reach, most humanities studies are focused on local geographic communities. Their societal impact may therefore be confined to a small population, and the time span for the uptake of social benefits may exceed ten years, longer than standard evaluation cycles allow (Robinson-Garcia, van Leeuwen, and Rafols 2018). Similarly, popular media not considered scholarly publications—for example, Facebook, Twitter, magazines, and newspapers—could play a substantial role in taking research to the broader community, but these media are generally not given high recognition by universities (Laporte 2017; Murphy and Costa 2018).

The diversity of languages used in the humanities is another issue for reaching the broader community, requiring numerous selective channels (Toledo 2018). Leading databases, such as Web of Science, Google Scholar, and Scopus, currently index primarily English-language (or northern Atlantic) publications and research outputs, yet valuable work in the humanities is done in other languages, including traditional or historic languages with a narrow audience (Hammarfelt 2017; Laporte 2017). Many of these non-English and regional research outputs are not indexed, and much of this material is therefore lost to the community, even though it could provide valuable knowledge and should be readily available to all (Toledo 2018). Yet evaluating their benefits for society by counting data downloads and usage is inadequate.

Arguably one of the most significant barriers preventing the wider usage of social media platforms by humanities researchers lies in their lack of knowledge, skills, infrastructure, and time to use advanced digital applications to enhance scholarly communication and engagement with the broader community. In a recent study, Steffen Lemke and colleagues (2019) found that humanities and social science scholars felt that research-related communication on social media was too shallow given the groups represented on these platforms; that information overload was cumbersome, with one’s limited time better spent on other activities, especially given the pressure to publish; that social media platforms were unsuitable for academic research; and that data security and paywalls for open access were causes for concern. The study also demonstrated the researchers’ lack of familiarity with metrics, especially altmetrics, and with national and international moves to evaluate research according to its societal impact. Following an explanation of altmetrics, participants outlined concerns around their quality, reliability, inherent bias, manipulation, and restricted comparability, among other issues (Lemke et al. 2019). Their findings are similar to those of Bhuva Narayan and colleagues (2018), who illustrated the limited use of key tools and metrics by humanities and social science scholars, with greater emphasis still placed on bibliometrics: Google citations (25.8%), h-index (20.2%), Academia.edu and ResearchGate (13.5%), i10-index (10.1%), other metrics (10.1%), Scopus (6.7%), Altmetric.com (6.7%), Web of Science (3.4%), Kudos (2.2%), and Impactstory (1.1%). Moreover, the researchers found that approximately 70% were unaware of their university’s open access policies, which limited their use of open access publications (Narayan et al. 2018).

Thus, despite the possibilities that alternative metrics could offer the humanities to increase the visibility of how their research benefits society, numerous barriers—including knowledge and awareness—prevent these possibilities from being fully realized by researchers.

Ways Forward through Open Scholarship

Today we are at a critical juncture, where new digital technologies are opening valuable new avenues for research, enabling academics to start dialogues, share knowledge and information, and forge close collaborative connections with the broader community to achieve societal benefits. Advances in communication, information, and data technologies are transforming the lifestyles, behavior, interactions, and environments of our society and their relation to space, time, and knowledge. In response, government and private funding bodies are placing greater emphasis on citizen participation in and co-construction of new knowledge so that research outputs can be shared, analyzed, reused, and adapted in new ways, and on individuals and groups to work together to explore and solve societal challenges through new cross-disciplinary lenses (Scanlon 2018; Veletsianos and Kimmons 2012; McKiernan 2017).

Yet while technological advances are progressing rapidly, the indicators used by universities to evaluate the societal benefits of these advances are still at an embryonic stage, and the capacity of altmetrics to capture societal impact is being seriously questioned. Criticism of altmetrics focuses on whether the use of digital aggregators or online discussions results in societal changes and improvements (Robinson-Garcia, van Leeuwen, and Rafols 2018; Toledo 2018). For example, does online information about healthy lifestyles change the daily activities of individuals or communities? Academic digital information to date still centers largely on one-way communication, with altmetrics indicators primarily measuring linear counts similar to citation counts to justify investments and accountability—an approach that goes against the notion of two-way public engagement for societal impact (Haustein, Sugimoto, and Larivière 2015; Robinson-Garcia, van Leeuwen, and Rafols 2018; Spaapen and van Drooge 2011).

Nevertheless, new datasets and tools, together with technological innovation and the identification and analysis of data that was previously inaccessible, mean that research is entering a new era of design and development. This new era offers researchers fresh opportunities for discovery through the visualization of maps and timelines and the integration of different cross-disciplinary data, revealing previously invisible geographic, historical, and cultural patterns of association and rich new insights into the contexts in which public engagement occurs and how this can influence and benefit the wider public (Robinson-Garcia, van Leeuwen, and Rafols 2018; Costas et al. 2016). Rather than focusing solely on quantitative assessment of research outputs, for which altmetrics are as yet not sufficiently robust, digital network analysis can be used to assess and map, through case studies, the contexts in which researchers and diverse community stakeholders engage through two-way communication and thus how public engagement is improving societal benefit (Robinson-Garcia, van Leeuwen, and Rafols 2018; Joly et al. 2015). Such a shift toward the contextual mapping of networks of interaction offers universities promising new pathways to use social media and altmetrics data to evaluate both the qualitative and quantitative societal impact of their research. It could also open doors to new kinds of inquiry by stimulating greater citizen participation in the co-construction of knowledge and encouraging the maximum sharing of data and interoperability of digital collections and tools among individuals and groups, thereby together addressing shared research challenges.
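
A hypothetical sketch of what such contextual network mapping could look like in practice is given below: it builds a small bipartite graph linking research outputs to the community stakeholder groups that mention them, with the channel of engagement recorded on each edge. The data, output names, and group names are invented for illustration and are not drawn from the cited studies.

```python
import networkx as nx

# Hypothetical engagement events: (research output, stakeholder group, channel).
events = [
    ("oral-history-archive", "local museum", "blog"),
    ("oral-history-archive", "schoolteachers", "Twitter"),
    ("heritage-map", "city council", "policy brief"),
    ("heritage-map", "local museum", "Facebook"),
]

# Bipartite graph: outputs on one side, stakeholder groups on the other;
# each edge carries the channel through which the engagement occurred.
graph = nx.Graph()
for output, stakeholder, channel in events:
    graph.add_node(output, kind="output")
    graph.add_node(stakeholder, kind="stakeholder")
    graph.add_edge(output, stakeholder, channel=channel)

# Map the context of engagement: which groups does each output reach, and how?
for node, data in graph.nodes(data=True):
    if data["kind"] == "output":
        reach = [(nbr, graph.edges[node, nbr]["channel"])
                 for nbr in graph.neighbors(node)]
        print(node, "->", reach)
```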

For these new options to occur, universities need to take a leading role. They need to recognize our changing world and the role academics should now play in open scholarship that values public engagement (Murphy and Costa 2018). Not only are national and international funding agencies calling for universities to refocus their mission toward the translation of research outputs for societal good and to meet their new policies on open access and scholarly communication, but the pioneering nature of open digital access to information, research collaboration, and sharing and reuse of research data means teamwork is required to create changes not just at the institutional level but also, more importantly, at diverse stakeholder levels, including policymakers, university participants (researchers, senior university administrators, librarians, platform providers, and developers), and the broader community (Arthur et al. 2021). While academics have increasing access to universal distributed knowledge networks and practices of openness, researchers need to be equipped with the skills and tools and have access to leading-edge infrastructure to properly conduct research aligned with this global paradigm shift.

Universities must go further than merely signing on to the policies of the San Francisco Declaration on Research Assessment (Cagan 2013), the Leiden Manifesto (Hicks et al. 2015), and the Metric Tide Report (Wilsdon et al. 2015). Unless they acknowledge and provide support to researchers who engage with outreach activities and honor and value their achievements in open scholarship, academics will continue to place greater prestige on high-level peer-reviewed publications than on public engagement (McKiernan 2017). If universities are to signal to academics and the broader public that they are truly committed to their mission of knowledge exchange for societal benefit, then they must respond to the growing pressure placed on them to address the policies outlined in Table 1.

Table 1.

Actions required to illustrate commitments (McKiernan 2017; Cagan 2013; Hicks et al. 2015; Montgomery et al. 2018)

  • Measure performance against the research missions of the institution, group, or researcher and choose indicators from the start that take into account the diverse missions and fields of research and the socioeconomic and cultural contexts to which they apply.

  • Use multiple indicators to evaluate research impact; these may include influence on policy, industry, or practice rather than relying solely on traditional bibliometrics that evaluate academic excellence.

  • Eliminate the use of journal-level metrics and impact factors to evaluate research quality.

  • Use article-level metrics, such as citations, only as a quantitative measure to support qualitative assessments of social impact and take into account variation in citation practice according to fields and types of research.

  • Use alternative metrics, including online counts and media coverage, as a measure to assess broader impact and to map the contexts and processes through which researchers engage with diverse community stakeholders.

  • Recognize academics for the open sharing of their research outputs, including data and software, in public repositories or open access publications and count these in their performance appraisals.

  • Acknowledge preprints, even if not counted as highly as peer-reviewed documents.

  • Value and support outreach activities, such as blogging and popular media, as academic outputs and count these in their research assessments.

  • Adopt more flexible forms for assessment and allow researchers and faculties to use narrative summaries of impact stories, non-traditional outputs, and open scholarship practices to capture societal benefits.

  • Base individual researcher assessment on a qualitative judgment of their portfolio rather than primarily on publications, with comparisons and assessments according to the researcher’s field of study, career stage, age, gender, and other criteria.

Thus, university leaders need to commit to changing the culture and policy through long-term plans with clear logistical processes to adopt a more open and engaged environment (Tennant et al. 2019). This requires support for staff education, infrastructure, and financial incentives as well as collaborative efforts to overcome barriers currently hampering the uptake of national policy at individual, faculty, library, university, and national levels (Arthur et al. 2021). Investment in infrastructure is a key element for the enhancement of open access and public engagement. Rather than merely building university repositories, this enhancement must include development of advanced digital networked systems to make research available to other scholars and the broader public (Montgomery et al. 2018). These systems can provide greater visibility and accessibility and articulate the university’s mission.

Action Recommendations and Conclusions

This article critiques the history of public universities’ approach to assessing research performance and looks at how traditional bibliometrics have limited the institutions’ broader mission of research for societal benefit. While peer-review publications are—and will remain—one component of the evaluation of research outcomes, this metric alone does not illustrate societal impact. Moreover, the humanities’ emphasis on and recognition for single authorship of printed outputs fail to illustrate collaborative engagement in the research process and outputs.

In contrast, as this article highlights, in our digital era, the frontiers through which knowledge is being advanced and shared are reshaping the landscape in which academic research can have an impact on society. Open access and open scholarship are offering new avenues to make data more freely and easily available for use and reuse by the broader community. While the policy environment is shifting and moves are underway to develop more open, congruent ways for universities to reshape their assessment of research, this article highlights the many barriers that are preventing progress (Toledo 2018; Beaulieu, Breton, and Brousselle 2018; Robinson-Garcia, van Leeuwen, and Rafols 2018). While altmetrics are slowly becoming popular and offer potential benefits, new indicator sources need to be developed to make them transparent and robust, providing quality data to avoid repeating the mistakes of traditional bibliometrics (Haustein 2016; Díaz-Faes, Bowman, and Costas 2019; Zahedi and Costas 2018).

Within the humanities, the shift toward contextual mapping of networks of interaction and two-way communication with the broader community is offering promising new pathways in the use of social media in providing altmetrics data to evaluate both the qualitative and quantitative benefit of research on certain groups through improved public engagement (Robinson-Garcia, van Leeuwen, and Rafols 2018; Joly et al. 2015; Díaz‐Faes and Bordons 2017). These new approaches to research assessment also allow for variation between the different purposes, methodologies, and processes of the research being conducted and can provide funding agencies with greater clarity of the types or levels of societal impact by illustrating case studies and mapping the contexts and timelines within which these changes have occurred (Robinson-Garcia, van Leeuwen, and Rafols 2018; Toledo 2018; Wouters, Zahedi, and Costas 2019).

Yet to achieve these changes, universities must restructure their assessment policies to provide greater recognition of and support for open scholarship in line with national and international calls for more universal, unrestricted access to knowledge (Arthur et al. 2021). Achieving this will require the provision of training and facilities to give staff better familiarity with such initiatives and how they could benefit the public as well as research centers, faculties, and the university as a whole (Montgomery et al. 2018). Through the reforming of research assessment strategies and the promotion of open scholarship, universities could change the academic culture toward emphasis on social inclusion (McKiernan 2017). Yet mission statements without actions are unlikely to result in change. Universities should implement the action points summarized above and work collaboratively with national and international research councils, funding bodies, senior university leaders, peak bodies, community groups, and industry to address current barriers, as these changes cannot be achieved by institutions acting alone. This approach offers different forms of evaluation for diverse types of research across the broad fields of study and acknowledges the various assessment criteria required for evaluations at the individual, research institute, faculty, and university level (Toledo 2018).

With the current speed of digital advancements and their role and influence on engaging our societies, universities should not ignore the digital scholarly ecosystem. By embracing progressive open data and publication policies in ways that meet the needs of all researchers, users, and stakeholders, universities may gain significant benefits in the future development of our society, thereby attracting increasing visibility, funding, and recruitment (McKiernan 2017; Narayan and Luca 2017; Goodfellow 2013). Universities play a central role in the education and formation of our future society. Today, in our digital world, this requires endorsing these new principles to ensure the reshaping of how universities evaluate research impact to maximize societal benefit.

References

Almind, Tomas C., and Peter Ingwersen. 1997. “Informetric Analyses on the World Wide Web: Methodological Approaches to ‘Webometrics.’” Journal of Documentation 53 (4): 404–26.

Alperin, J. P., Carol Muñoz Nieves, Lesley Schimanski, Gustavo E. Fischman, Meredith T. Niles, and Erin C. McKiernan. 2018. “How Significant Are the Public Dimensions of Faculty Work in Review, Promotion, and Tenure Documents?” Preprint, Humanities Commons. http://dx.doi.org/10.17613/M6W950N35http://dx.doi.org/10.17613/M6W950N35

Arbuckle, Alyssa, and Ray Siemens. 2015. “Open Social Scholarship in Canada.” Federation for the Humanities and Social Sciences (blog). February 23, 2015. http://www.ideas-idees.ca/blog/open-social-scholarship-canadahttp://www.ideas-idees.ca/blog/open-social-scholarship-canada

Armstrong, Michelle. 2014. “Institutional Repository Management Models That Support Faculty Research Dissemination.” OCLC Systems & Services 30 (1): 43–51. https://doi.org/10.1108/OCLC-07-2013-0028https://doi.org/10.1108/OCLC-07-2013-0028

Arthur, Paul Longley, Lydia Hearn, Lucy Montgomery, Hugh Craig, Alyssa Arbuckle, and Ray Siemens. “Open Scholarship in Australia: A Review of Needs, Barriers and Opportunities.” Digital Scholarship in the Humanities. January 14, 2021. https://doi.org/10.1093/llc/fqaa063https://doi.org/10.1093/llc/fqaa063

Australasian Open Access Strategy Group. 2018. Supplementary submission to the House of Representatives Standing Committee on Education, Employment and Training Inquiry into the efficiency, effectiveness, and coherency of Australian Government funding for research. https://aoasg.files.wordpress.com/2018/07/sub-11-aoasg-22-jun-2018.pdfhttps://aoasg.files.wordpress.com/2018/07/sub-11-aoasg-22-jun-2018.pdf

Bartling, Sönke, and Sascha Friesike, eds. 2014. Opening Science: The Evolving Guide on How the Internet Is Changing Research, Collaboration and Scholarly Publishing. SpringerOpen. https://www.springer.com/gp/book/9783319000251https://www.springer.com/gp/book/9783319000251

Beall, Jeffrey. 2017. “What I Learned from Predatory Publishers.” Biochemia Medica 27 (2): 273–78

Beaulieu, Marianne, Mylaine Breton, and Astrid Brousselle. 2018. “Conceptualizing 20 Years of Engaged Scholarship: A Scoping Review.” PLoS ONE 13 (2). https://doi.org/10.1371/journal.pone.0193201https://doi.org/10.1371/journal.pone.0193201

Berlin Declaration. 2003. “Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities.” https://openaccess.mpg.de/67605/berlin_declaration_engl_pdfhttps://openaccess.mpg.de/67605/berlin_declaration_engl_pdf

Bornmann, Lutz. 2014. “Do Altmetrics Point to the Broader Impact of Research? An Overview of Benefits and Disadvantages of Altmetrics.” Journal of Informetrics 8 (4): 895–903. https://doi.org/10.1016/j.joi.2014.09.005https://doi.org/10.1016/j.joi.2014.09.005

Bornmann, Lutz, and Hans-Dieter Daniel. 2008. “What Do Citation Counts Measure? A Review of Studies on Citing Behavior.” Journal of Documentation 64 (1): 45–80. https://doi.org/10.1108/00220410810844150

Boyer, Ernest L. 1996. “The Scholarship of Engagement.” Bulletin of the American Academy of Arts and Sciences 49 (7): 18–33. https://doi.org/10.2307/3824459

Budapest Open Access Initiative. 2002. https://www.budapestopenaccessinitiative.org/

Cagan, Ross. 2013. “The San Francisco Declaration on Research Assessment.” Disease Models & Mechanisms 6 (4): 869–70. https://doi.org/10.1242/dmm.012955

Costas, Rodrigo, Jeroen van Honk, Zohreh Zahedi, and Clara Calero-Medina. 2016. “Discussing Practical Applications for Altmetrics: Social Media Profiles for African, European and North American Publications.” Presentation at Conference 3:AM, Bucharest, September 2016. https://doi.org/10.6084/m9.figshare.3980145.v1

de Rijcke, Sarah, Paul F. Wouters, Alex D. Rushforth, Thomas P. Franssen, and Björn Hammarfelt. 2016. “Evaluation Practices and Effects of Indicator Use—A Literature Review.” Research Evaluation 25 (2): 161–69. https://doi.org/10.1093/reseval/rvv038

Dermentzi, Eleni, and Savvas Papagiannidis. 2018. “Academics’ Intention to Adopt Online Technologies for Public Engagement.” Internet Research 28 (1): 191–212. https://doi.org/10.1108/IntR-10-2016-0302

Díaz-Faes, Adrián A., and María Bordons. 2017. “Making Visible the Invisible through the Analysis of Acknowledgements in the Humanities.” Aslib Journal of Information Management 69 (5): 576–90. https://doi.org/10.1108/AJIM-01-2017-0008

Díaz-Faes, Adrián A., Timothy D. Bowman, and Rodrigo Costas. 2019. “Towards a Second Generation of ‘Social Media Metrics’: Characterizing Twitter Communities of Attention around Science.” PLoS ONE 14 (5): e0216408. https://doi.org/10.1371/journal.pone.0216408

European Commission. 2017. “Towards a Horizon 2020 Platform for Open Access.” https://ec.europa.eu/research/openscience/pdf/information_note_platform_public.pdf

Garfield, Eugene. 2006. “Citation Indexes for Science: A New Dimension in Documentation through Association of Ideas.” International Journal of Epidemiology 35 (5): 1123–27.

Goodfellow, Robin. 2013. “Scholarly, Digital, Open: An Impossible Triangle?” Research in Learning Technology 21 (1). https://doi.org/10.3402/rlt.v21.21366

Greenhow, Christine, and Benjamin Gleason. 2014. “Social Scholarship: Reconsidering Scholarly Practices in the Age of Social Media.” British Journal of Educational Technology 45 (3): 392–402. https://doi.org/10.1111/bjet.12150

Greenhow, Christine, Benjamin Gleason, and K. Bret Staudt Willet. 2019. “Social Scholarship Revisited: Changing Scholarly Practices in the Age of Social Media.” British Journal of Educational Technology 50 (3): 987–1004. https://doi.org/10.1111/bjet.12772

Gross, P. L., and E. M. Gross. 1927. “College Libraries and Chemical Education.” Science 66 (1713): 385–89. https://doi.org/10.1126/science.66.1713.385

Hammarfelt, Björn. 2014. “Using Altmetrics for Assessing Research Impact in the Humanities.” Scientometrics 101 (2): 1419–30.

Hammarfelt, Björn. 2017. “Four Claims on Research Assessment and Metric Use in the Humanities.” Bulletin of the Association for Information Science and Technology 43 (5): 33–38. https://doi.org/10.1002/bul2.2017.1720430508

Hammarfelt, Björn, and Sarah de Rijcke. 2015. “Accountability in Context: Effects of Research Evaluation Systems on Publication Practices, Disciplinary Norms, and Individual Working Routines in the Faculty of Arts at Uppsala University.” Research Evaluation 24 (1): 63–77. https://doi.org/10.1093/reseval/rvu029

Hammarfelt, Björn, and Gaby Haddow. 2018. “Conflicting Measures and Values: How Humanities Scholars in Australia and Sweden Use and React to Bibliometric Indicators.” Journal of the Association for Information Science and Technology 69 (7): 924–35. https://doi.org/10.1002/asi.24043

Haustein, Stefanie. 2016. “Grand Challenges in Altmetrics: Heterogeneity, Data Quality and Dependencies.” Scientometrics 108 (1): 413–23. https://doi.org/10.1007/s11192-016-1910-9

Haustein, Stefanie, Timothy D. Bowman, and Rodrigo Costas. 2015. “Interpreting ‘Altmetrics’: Viewing Acts on Social Media through the Lens of Citation and Social Theories.” Preprint, submitted February 19, 2015. https://arxiv.org/abs/1502.05701

Haustein, Stefanie, Cassidy Sugimoto, and Vincent Larivière. 2015. “Guest Editorial: Social Media in Scholarly Communication.” Aslib Journal of Information Management 67 (3). https://doi.org/10.1108/AJIM-03-2015-0047

Hazelkorn, Ellen. 2015. Rankings and the Reshaping of Higher Education: The Battle for World-Class Excellence. Basingstoke, UK: Palgrave Macmillan.

Hearn, Lydia, Margaret Miller, and Leanne Lester. 2014. “Reaching Perinatal Women Online: The Healthy You, Healthy Baby Website and App.” Journal of Obesity 2014:573928. https://doi.org/10.1155/2014/573928

Hicks, Diana, Paul Wouters, Ludo Waltman, Sarah de Rijcke, and Ismael Rafols. 2015. “Bibliometrics: The Leiden Manifesto for Research Metrics.” Nature 520 (7548): 429–31. https://doi.org/10.1038/520429a

Hirsch, Jorge E. 2005. “An Index to Quantify an Individual’s Scientific Research Output.” Proceedings of the National Academy of Sciences 102 (46): 16569–72.

Jamali, Hamid R., David Nicholas, and Eti Herman. 2016. “Scholarly Reputation in the Digital Age and the Role of Emerging Platforms and Mechanisms.” Research Evaluation 25 (1): 37–49. https://doi.org/10.1093/reseval/rvv032

Joly, Pierre-Benoît, Ariane Gaunand, Laurence Colinet, Philippe Larédo, Stéphane Lemarié, and Mireille Matt. 2015. “ASIRPA: A Comprehensive Theory-Based Approach to Assessing the Societal Impacts of a Research Organization.” Research Evaluation 24 (4): 440–53. https://doi.org/10.1093/reseval/rvv015

Kim, Jihyun. 2011. “Motivations of Faculty Self-Archiving in Institutional Repositories.” Journal of Academic Librarianship 37 (3): 246–54. https://doi.org/10.1016/j.acalib.2011.02.017

Laporte, Steven. 2017. “Preprint for the Humanities – Fiction or a Real Possibility?” Studia Historiae Scientiarum 16:367–78.

Lemke, Steffen, Maryam Mehrazar, Athanasios Mazarakis, and Isabella Peters. 2019. “‘When You Use Social Media You Are Not Working’: Barriers for the Use of Metrics in Social Sciences.” Frontiers in Research Metrics and Analytics 3 (39). https://doi.org/10.3389/frma.2018.00039

Lewis, Vivian, Lisa Spiro, Xuemao Wang, and Jon E. Cawthorne. 2015. Building Expertise to Support Digital Scholarship: A Global Perspective. Washington, DC: Council on Library and Information Resources.

Lin, Jennifer, and Martin Fenner. 2013. “Altmetrics in Evolution: Defining and Redefining the Ontology of Article-Level Metrics.” Information Standards Quarterly 25 (2): 20–26.

McKiernan, Erin C. 2017. “Imagining the ‘Open’ University: Sharing Scholarship to Improve Research and Education.” PLoS Biology 15 (10). https://doi.org/10.1371/journal.pbio.1002614

Moed, H. F., W. J. M. Burger, J. G. Frankfort, and A. F. J. Van Raan. 1985. “The Use of Bibliometric Data for the Measurement of University Research Performance.” Research Policy 14 (3): 131–49.

Montgomery, Lucy, John Hartley, Cameron Neylon, Malcolm Gillies, Eve Gray, Carsten Herrmann-Pillath, Chun-Kai Huang, Joan Leach, Jason Potts, Xiang Ren, Katherine Skinner, Cassidy Sugimoto, and Katie Wilson. 2018. “Open Knowledge Institutions.” MIT Press Open.

Murphy, Mark, and Cristina Costa. 2018. “Digital Scholarship, Higher Education and the Future of the Public Intellectual.” Futures 111:205–12. https://doi.org/10.1016/j.futures.2018.04.011

Narayan, Bhuva, and Edward Luca. 2017. “Issues and Challenges in Researchers’ Adoption of Open Access and Institutional Repositories: A Contextual Study of a University Repository.” Information Research 22 (4). http://informationr.net/ir/22-4/rails/rails1608.html

Narayan, Bhuva, Edward J. Luca, Belinda Tiffen, Ashley England, Mal Booth, and Henry Boateng. 2018. “Scholarly Communication Practices in Humanities and Social Sciences: A Study of Researchers’ Attitudes and Awareness of Open Access.” Open Information Science 2 (1): 168. https://doi.org/10.1515/opis-2018-0013

Neylon, Cameron. 2015. “The End of the Journal: What Has Changed and What Stayed the Same.” Science in the Open (blog). November 29, 2015. http://cameronneylon.net/blog/the-end-of-the-journal-what-has-changed-what-stayed-the-same/

Neylon, Cameron, Michelle Willmers, and Thomas King. 2014. “Rethinking Impact: Applying Altmetrics to Southern African Research.” Scholarly Communication in Africa Programme, Paper 1, January 2014. https://open.uct.ac.za/bitstream/handle/11427/2285/SCAP_Neylon_RethinkingImpact_2014.pdf?sequence=1&isAllowed=y

Ochsner, Michael, Sven E. Hug, and Hans-Dieter Daniel. 2016. “Humanities Scholars’ Conceptions of Research Quality.” In Research Assessment in the Humanities: Towards Criteria and Procedures, edited by Michael Ochsner, Sven E. Hug, and Hans-Dieter Daniel, 43–69. SpringerOpen.

Odell, Jere, Heather Coates, and Kristi Palmer. 2016. “Rewarding Open Access Scholarship in Promotion and Tenure: Driving Institutional Change.” College & Research Libraries News 77 (7): 322–25. https://doi.org/10.5860/crln.77.7.9518

Ortega, José Luis. 2018. “Disciplinary Differences of the Impact of Altmetric.” FEMS Microbiology Letters 365 (7). https://doi.org/10.1093/femsle/fny049

Peekhaus, Wilhelm, and Nicholas Proferes. 2015. “How Library and Information Science Faculty Perceive and Engage with Open Access.” Journal of Information Science 41 (5): 640–61. https://doi.org/10.1177/0165551515587855

Piwowar, Heather. 2013. “Altmetrics: Value All Research Products.” Nature 493 (7431): 159.

Priem, Jason. 2014. “Altmetrics.” In Beyond Bibliometrics: Harnessing Multi-dimensional Indicators of Performance, edited by Blaise Cronin and Cassidy R. Sugimoto. Cambridge, MA: MIT Press.

Priem, Jason, and Bradley M. Hemminger. 2010. “Scientometrics 2.0: New Metrics of Scholarly Impact on the Social Web.” First Monday 15 (7). https://doi.org/10.5210/fm.v15i7.2874

Priem, Jason, Heather A. Piwowar, and Bradley M. Hemminger. 2012. “Altmetrics in the Wild: Using Social Media to Explore Scholarly Impact.” arXiv.org, Cornell University. https://arxiv.org/html/1203.4745v1

Priem, Jason, Dario Taraborelli, Paul Groth, and Cameron Neylon. 2010. “Altmetrics: A Manifesto.” http://altmetrics.org/manifesto

Ren, Xiang. 2015. “The Quandary between Communication and Certification: Individual Academics’ Views on Open Access and Open Scholarship.” Online Information Review 39 (5): 682–97. https://doi.org/10.1108/OIR-04-2015-0129

Research Excellence Framework. 2015. “Research Excellence Framework 2014: Overview Report by Main Panel D and Sub-panels 27 to 36.” Report. https://www.ref.ac.uk/2014/media/ref/content/expanel/member/Main%20Panel%20D%20overview%20report.pdf

Robinson-Garcia, Nicolas, Thed N. van Leeuwen, and Ismael Rafols. 2018. “Using Altmetrics for Contextualised Mapping of Societal Impact: From Hits to Networks.” Science and Public Policy 45 (6): 815–26. https://doi.org/10.1093/scipol/scy024

Rowlands, Ian, David Nicholas, Bill Russell, Nicholas Canty, and Anthony Watkinson. 2011. “Social Media Use in the Research Workflow.” Learned Publishing 24 (3): 183–95. https://doi.org/10.1087/20110306

Scanlon, Eileen. 2018. “Digital Scholarship: Identity, Interdisciplinarity, and Openness.” Frontiers in Digital Humanities 5 (3). https://doi.org/10.3389/fdigh.2018.00003

Science Connect. 2019. “Plan S: Making Full and Immediate Open Access a Reality.” https://www.coalition-s.org/

Spaapen, Jack, and Leonie van Drooge. 2011. “Introducing ‘Productive Interactions’ in Social Impact Assessment.” Research Evaluation 20 (3): 211–18.

Suber, Peter. 2017. “Why Is Open Access Moving So Slowly in the Humanities?” Blog of the APA. June 8, 2017. https://blog.apaonline.org/2017/06/08/open-access-in-the-humanities-part-2/

Sugimoto, Cassidy R., Sam Work, Vincent Larivière, and Stefanie Haustein. 2017. “Scholarly Use of Social Media and Altmetrics: A Review of the Literature.” Journal of the Association for Information Science and Technology 68 (9): 2037–62. https://doi.org/10.1002/asi.23833

Tennant, Jonathan, Jennifer E. Beamer, Jeroen Bosman, Björn Brembs, Neo Chung, Gail Clement, et al. 2019. “Foundations for Open Scholarship Strategy Development.” https://open-scholarship-strategy.github.io/site/

Tenopir, Carol, Kenneth Levine, Suzie Allard, Lisa Christian, Rachel Volentine, Reid Boehm, Frances Nichols, David Nicholas, Hamid R. Jamali, Eti Herman, and Anthony Watkinson. 2016. “Trustworthiness and Authority of Scholarly Information in a Digital Age: Results of an International Questionnaire.” Journal of the Association for Information Science and Technology 67 (10): 2344–61. https://doi.org/10.1002/asi.23598

Tofield, Andros. 2019. “The cOALition S and Plan S Explained: European Legislation Requiring Scientific Publications Resulting from Research Funded by Public Grants Must Be Published in Compliant Open Access Journals or Platforms from 2020.” European Heart Journal 40 (12): 952–53. https://doi.org/10.1093/eurheartj/ehz105

Toledo, Elea Giménez. 2018. “La evaluación de las Humanidades y de las Ciencias Sociales en revisión [Research assessment in humanities and social sciences in review].” Revista Española de Documentación Cientifica 41 (3). https://doi.org/10.3989/redc.2018.3.1552

Veletsianos, George. 2016. Social Media in Academia: Networked Scholars. New York: Routledge.

Veletsianos, George, and Royce Kimmons. 2012. “Networked Participatory Scholarship: Emergent Techno-Cultural Pressures toward Open and Digital Scholarship in Online Networks.” Computers & Education 58 (2): 766–74. https://doi.org/10.1016/j.compedu.2011.10.001

Wallace, Matthew L., and Ismael Rafols. 2015. “Research Portfolio Analysis in Science Policy: Moving from Financial Returns to Societal Benefits.” Minerva 53 (2): 89–115.

Watermeyer, Richard. 2016. “Public Intellectuals vs. New Public Management: The Defeat of Public Engagement in Higher Education.” Studies in Higher Education 41 (12): 2271–85. https://doi.org/10.1080/03075079.2015.1034261

Wilkinson, Mark D., and Michel Dumontier. 2016. “The FAIR Guiding Principles for Scientific Data Management and Stewardship.” Scientific Data 3. https://doi.org/10.1038/sdata.2016.18

Wilsdon, James, Liz Allen, Eleonora Belfiore, Philip Campbell, Stephen Curry, Steven Hill, Richard Jones, Roger Kain, Simon Kerridge, Mike Thelwall, Jane Tinkler, Ian Viney, Paul Wouters, Jude Hill, and Benjamin K. Johnson. 2015. “The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management.” Report. https://doi.org/10.13140/RG.2.1.4929.1363

Wilsdon, James, Judit Bar-Ilan, Robert Frodeman, Elisabeth Lex, Isabella Peters, and Paul Wouters. 2017. “Next-Generation Metrics: Responsible Metrics and Evaluation for Open Science.” Report of the European Commission Expert Group on Altmetrics. https://publications.europa.eu/en/publication-detail/-/publication/b858d952-0a19-11e7-8a35-01aa75ed71a1/language-en

Wouters, Paul, Zohreh Zahedi, and Rodrigo Costas. 2019. “Social Media Metrics for New Research Evaluation.” In Springer Handbook of Science and Technology Indicators, edited by Wolfgang Glänzel, Henk F. Moed, Ulrich Schmoch, and Mike Thelwall, 687–713. Cham, Switzerland: Springer.

Zahedi, Zohreh, and Rodrigo Costas. 2018. “General Discussion of Data Quality Challenges in Social Media Metrics: Extensive Comparison of Four Major Altmetric Data Aggregators.” PLoS ONE 13 (5): e0197326. https://doi.org/10.1371/journal.pone.0197326