Ethical Publishing: How Do We Get There?

The academic journal publishing model is deeply unethical: today, a few major, for-profit conglomerates control more than 50% of all articles in the natural sciences and social sciences, driving subscription and open-access publishing fees above levels that can be sustainably maintained by publicly funded universities, libraries, and research institutions worldwide. About a third of the costs paid for publishing papers is profit for these dominant publishers' shareholders, and about half covers the costs of keeping the system running, including lobbying, marketing fees, and paywalls. The paywalls in turn restrict access to scientific outputs, preventing them from being freely shared with the public and other researchers. Thus, money that the public is told goes into science is actually being funneled away from it, or used to limit access to it. Alternatives to this model exist and have increased in popularity in recent years, including diamond open-access journals and community-driven recommendation models. These are free of charge for authors and minimize costs for institutions and agencies, while making peer-reviewed scientific results publicly accessible. However, for-profit publishing agents have made change difficult by co-opting open-access schemes and creating journal-driven incentives that prevent an effective, collective transition away from profiteering. Here, we give a brief overview of the current state of the academic publishing system, including its most important, systemic problems. We then describe alternative systems. We explain the reasons why the move toward them can be perceived as costly to individual researchers, and we demystify common roadblocks to change. Finally, in view of the above, we provide guidelines and recommendations that academics at all levels can implement, to enable a more rapid and effective transition toward ethical publishing.


1 Introduction
Ideas are powerful things. They underpin our societies; they cause new technologies to spring into existence, and shape our very experience of the world. Increasingly, humanity's capacity to adapt to a changing world depends upon our capacity to generate, exchange, filter, modify and act on ideas, in light of new information. Humans are increasingly faced with acute crises (Díaz et al. 2019; Ripple et al. 2020; Steffen et al. 2018; Trisos, Merow, and Pigot 2020), and the free exchange of high-quality ideas and incoming information is thus of paramount importance to how we face these challenges. The medium we use for this exchange therefore needs to be fast, efficient, and widely accessible.
Unfortunately, the currently predominant model for academic exchange fails to meet these needs. A great number of studies have shown that traditional scientific publishing imposes unnecessary impediments to the sharing of ideas and information, in order to extract profit, and promotes a hierarchy of journal esteem as a proxy for the true quality of ideas (Aczel, Szaszi, and Holcombe 2021; Allahar 2021; Brembs et al. 2021; Houghton 2001; Larivière, Haustein, and Mongeon 2015). This model restricts the questions that we ask and who does the asking, while undermining the free and open evaluation of science that is at the heart of a healthy and dynamic culture.
But what is the ideal, ethical publishing system? Such a system should meet criteria of cost-efficiency, openness, transparency, and adaptiveness to the present requirements of scientific communication. In a nutshell, an ethical publishing system should enable every scientist to enter into open dialogue on their research processes and results, either with their peers or with the wider society; this requires minimal entry costs for both authors and readers. These criteria have already been extensively discussed in numerous initiatives, which are all compatible with discipline-specific rules of communication and exchange (Wilson 2018). However, traditional journals continue to play a huge role in the academic publishing system, dictating the way research is supported and assessed (Stoy, Morais, and Borrell-Damián 2019).
Many of us are aware of this problem and are willing to move towards a healthier model, but the incentives to maintain current publishing practices are strong, especially for young researchers (Tregoning 2018). Academia, therefore, is facing a collective action problem, in which behaviors that are perceived as beneficial to individuals in the short term end up being detrimental to the community, and ultimately to the public, in the longer term (Hardin 1968). So, are we doomed to passively witness the slow but inexorable degradation of academic publishing? If not, how might we steer a path towards an efficient and accessible system which promotes the open and free exchange, evaluation and dissemination of new ideas?
The authors of this manuscript take a realistic but optimistic view. We do not think we are doomed to a publication structure that hurts science and prevents its dissemination while enriching shareholders. At the same time, we acknowledge systemic obstacles associated with collective coordination and information sharing. Importantly though, we believe that community-oriented publishing systems are within our grasp: scientists hold the cards to create change, by re-appropriating the ways we publish and evaluate our research output (Logan 2017). Below, we sketch a path towards this goal. We first lay out some of the main problems with the current system, demystify alleged roadblocks, and set out guidelines that scientists of all backgrounds and career stages can follow. These guidelines can enable the transition to a truly ethical publishing system.
OPEN ACCESS - PTPBIO.ORG

2 Where Are We Now?
The academic publishing system is broken. Over the last 30 years, there has been a massive concentration of journal ownership in the hands of a few major publishing companies: more than 50% of all articles in the natural sciences and the social sciences are currently published by the largest five conglomerates, which effectively function as an oligopoly (Larivière, Haustein, and Mongeon 2015). This increase in ownership concentration has not translated into obvious benefits for the research community, library systems worldwide, or the public at large (Aczel, Szaszi, and Holcombe 2021). While the digitalization of the academic system has led to massive reductions in production costs for publishers, there have not been corresponding reductions in the prices of publication (for authors) and access (for readers). Instead, publication and subscription fees have increased considerably in recent decades (Houghton 2001; Rose-Wiles 2011), often via journal bundling (T. C. Bergstrom et al. 2014). Given that neither journals nor articles are perceived to be substitutable goods (largely as a consequence of marketing for brand recognition), the major publishers have sometimes been called a "conglomerate of monopolists": they have even freer rein than a traditional oligopoly in terms of devising new pricing strategies to drain away research funds (Brembs 2022).
Open-access (OA) schemes were originally proposed as a way to solve these problems. However, many of them have been co-opted by the major publishers, who have coupled them to high article-processing charges (APCs). This, in turn, has led to increased costs to researchers and to scientific institutions, further driving article-price hyper-inflation (Khoo 2019; Morrison et al. 2021).
Globally, research organizations pay approximately ten times more than is necessary to effect publication. The publishing market is worth about 11.5 billion dollars per year (Johnson, Watkinson, and Mabe 2018), for about 3 million articles published every year (see below). Hence, the current average price of publication is around $3500-4000 per article. Yet the effective cost of publication can be as low as $30 per article (e.g., the journal Discrete Analysis), or even $3 per article (Katz et al. 2019), when scientists voluntarily do the editorial work. The full cost, including the salary of the editorial staff, is estimated to be on the order of $300-400 per article on average (Alizon 2018; Grossmann and Brembs 2021). The total number of academic articles published every year is somewhere on the order of 2.5 million (size of the Web of Science database in 2018) or 3.5 million (size of the Dimensions database in 2018; see also Schimmer, Geschuhn, and Vogler 2015). The overall publication bill should therefore be on the order of 1 billion dollars per year, but research organizations actually pay roughly 10 times more than this in subscriptions and APCs (Johnson, Watkinson, and Mabe 2018). About a third of this amount is profit for publishers, and about half covers unwanted costs required to support the for-profit system, such as paywalls, lobbying, marketing fees (Grossmann and Brembs 2021), and million-dollar lawsuits against open-access initiatives (Schiermeier 2017). All in all, academia could save about $10 billion of public money every year if the publishing system were reformed. This would be a huge saving; enough to cover the salary and research costs of around a hundred thousand scientists across the world, every year.
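The arithmetic behind these figures can be checked directly. The sketch below simply recomputes them from the approximate values quoted above (market size, annual article count, and the $300-400 fair-cost estimate); all inputs are rounded figures from the cited sources, not new data.

```python
# Back-of-the-envelope check of the publishing-cost figures cited in the text.
market_usd = 11.5e9          # annual publishing market (Johnson, Watkinson, and Mabe 2018)
articles = 3.0e6             # articles per year (midpoint of the 2.5-3.5 million range)
fair_cost_per_article = 350  # midpoint of the $300-400 per-article estimate

avg_price = market_usd / articles             # what is actually paid per article
fair_bill = fair_cost_per_article * articles  # what the annual bill could be
savings = market_usd - fair_bill              # funds that could stay in research

print(round(avg_price))   # 3833, within the $3500-4000 range quoted above
print(fair_bill / 1e9)    # 1.05, i.e., on the order of 1 billion dollars per year
print(savings / 1e9)      # 10.45, i.e., roughly the $10 billion in potential savings
```

The rough agreement of these three outputs with the figures in the paragraph is the point: the "ten times more than necessary" claim follows mechanically from the cited market size and cost estimates.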
The costs linked to subscriptions and APCs, and in particular their inflationary trend, can be explained by journal metric-based assessment practices, rather than by the articles' intrinsic quality and impact. The most prestigious journals, which are deemed necessary for career progression, attract subscriptions and APCs no matter how high their cost. Payments to these journals act as a prestige rent granted to the dominant publishers, without connection to the actual costs of article editing and dissemination.
Additionally, only well-funded institutions and researchers can afford to pay subscriptions or guarantee the dissemination of their papers in high-APC, open-access venues. This exacerbates existing inequalities among scientists and the public, generating exclusion. Some publishers offer waivers or discounts for certain countries, but these are insufficient to solve the problem (Nabyonga-Orem et al. 2020). Researchers are segmented between those who can afford the costs (mainly those at publicly funded institutions, predominantly situated in wealthy countries) and those who cannot, even when they apply the same standards of quality in their research processes and outcomes.
The system also creates a vicious circle by influencing researchers themselves, and their strategic choices and behaviors. Researchers, subject to the esteem hierarchy (see below), increasingly feel the need to publish a large number of papers in the most prestigious journals they can manage. The inevitable result is a tendency towards minimum publishable units; speed and quantity are prioritized over quality (Hortal et al. 2019). This places yet another (distributed) burden on the research community, as the number of peer-review requests mounts, and the number of papers to read and assess becomes insurmountable. In addition, researchers are deterred from pursuing innovative, high-cost/high-gain research pathways (Morais et al. 2021; Saenen et al. 2021; Saenen and Borrell-Damián 2019; Stoy, Morais, and Borrell-Damián 2019).
In a context of increased academic precarization (acting as an incentive for competition), quick outcomes and exaggerated claims increase, even in the face of methodological flaws or experimental dead-ends. This environment hence favors short-term, poor research practices over long-term research efforts and integrity, rewarding them while potentially leaving unreliable or unreplicable science unnoticed (Brembs 2018; Errington et al. 2021; Sumner et al. 2014). There is, of course, little motivation for the dominant publishers themselves to reduce the pace and volume of output being published, so little to nothing is done to prevent these practices. More papers make for greater profit and also support brand recognition via metric-based rewards, like citation counts (Silva and Vuong 2021).
Researchers hence navigate a system in which their career aspirations are embedded, a system they know to be broken (Kayal, Ballard, and Kayal 2022). This leads to paradoxical behaviors; for example, they may publish articles in costly journals for the sake of career advancement, while also praising initiatives like Sci-Hub that illegally grant free access to those same articles (Resnick 2016). This paradoxical behavior reveals the crux of the issue. To produce science, scientists need a system where they can communicate their processes and results at an affordable cost. To engage with science, readers need a system where science is freely available.
3 Are There Alternative Systems?
The dominant commercial publishers are clearly not serving the interests of scientists, nor the interests of the public that pays for much of the work scientists do (Aczel, Szaszi, and Holcombe 2021). But are there other systems within reach?
The list of alternatives to the oligopolistic commercial journal model is long and diverse, offering a wide range of choices for scientists heading towards more ethical publishing (see Box 1). Increasingly, authors are self-archiving published manuscripts after peer review and placing unembargoed versions of articles in free institutional or funder-based repositories ("green OA"; e.g., see the lists of repositories in the OpenDOAR and ROAR databases). Indeed, an embargo-free rights retention strategy is a key component of the Plan S initiative for open access (https://www.coalition-s.org/rights-retention-strategy/).
Additionally, the practice of uploading manuscripts to preprint servers, like arXiv, Zenodo and bioRxiv, has skyrocketed in the last two decades (Abdill and Blekhman 2019; N. Fraser et al. 2021; Wang, Chen, and Glänzel 2020). This, in turn, has opened up new possibilities for subsequent open and transparent review of manuscripts by community members, without the participation of the dominant publishers; effectively a form of social niche construction. For example, commons-based manuscript-recommendation models like the Peer Community In (PCI) initiative (Guillemaud, Facon, and Bourguet 2019) provide a way for peer review to occur at no cost to reviewers, authors, or readers. PCI's associated diamond OA journal, the Peer Community Journal, grants acceptance to any manuscript that has been reviewed and recommended in the PCI system. Generally, diamond OA journals (e.g., SciPost Physics in physics, Insights in scholarly communication, Discrete Analysis in mathematics, Glossa in linguistics) make content fully available to readers for free, and also ensure there is no cost to the authors for publishing (Fuchs and Sandoval 2013).
Some desirable features for academics and the public are also found in a number of paid-for nonprofit OA journals, such as eLife and the Public Library of Science (PLOS). However, this type of journal can still channel some of the resources invested in publishing research out of the research domain, though not at the scale of, or via the same mechanisms as, for-profit journals (Box 1). Even though they might technically be nonprofit, paid-for nonprofit journals can still be driven by a profit motive, because the number of employees and the size of their salaries depend on the turnover of the publishing organization. Ultimately, high costs due to marketing and management salaries are imposed on researchers and research institutions via high APCs (Logan 2017). In contrast, diamond OA journals explicitly set APCs to zero, preventing such leakage of research funds away from the research domain.
Lastly, a more traditional publishing option is a society journal. These journals are typically controlled by well-established and well-connected communities of scientists, with a strong investment in individual fields and a long history of promoting and diffusing science (Schloss, Johnston, and Casadevall 2017). Commonly, however, societies use the services of a for-profit publisher and create a profit-sharing arrangement so that publication fees are in part reinvested in academia. The ethics of this approach depends upon how much of the profit is reinvested in the journal's field; it is, however, typically very difficult to find this out. Given the role and stature of societies in the academic community, they have great potential to play a major role in the transition towards more ethical publishing (Phillips 2019). Making profit-sharing arrangements public is a good first step, in that it gives authors and reviewers the information they need to make ethical choices, and so increases a society's negotiating power around profit-sharing. Until such transparency occurs, however, it will remain unclear whether society journals are truly a good investment for the long-term health of their corresponding fields.
How can we best inform ourselves about the different publishing models and how they prioritize ethical publishing practices? Some initiatives exist to guide scientists and provide valuable information. For example, in the field of ecology and evolutionary biology, DAFNEE (https://dafnee.isem-evolution.fr) is a database intended to inform scientists about the economic models, publication fees, and partnerships of different journals. DAFNEE only includes journals that have a minimum set of credentials to be qualified as "academia-friendly", i.e., owned or controlled by public institutions, nonprofit organizations, or groups of scientists such as learned societies. Some of these organizations partner with for-profit publishers, while others are entirely nonprofit and/or non-commercial. The user can query journals by topic, economic model, academic partnership, and publication fees, among other variables, to decide which journal best satisfies their needs as well as those of the academic system as a whole.

Box 1: The multidimensional landscape of academic publishing.

The academic publishing landscape is not simple, and a single dimension is far from enough to capture the complexity of different journal models. From an economic point of view, journals can be held by commercial companies (e.g., Elsevier, Springer Nature, Wiley), by nonprofit, commercial organizations (e.g., Public Library of Science, eLife), or by non-commercial organizations (e.g., the SciPost Foundation). Hence, a society journal may be non-commercial or run by a for-profit company, with none, some, or all of its profits re-circulating back into academia. Nonprofit commercial journals may have high, low or no subscription fees or APCs, depending on how much revenue they consider necessary to operate in the long run. Among the different criteria one could use to classify journal models, we focus here on two that we believe are particularly important: "openness" and "economic model". Together, they define a two-dimensional matrix (Figure 1) which allows us to visualize the pros, cons, and trade-offs of different journals along these axes.

In the bottom-left corner of the matrix (category 1), we find for-profit, commercial, closed-access journals. Many of them can be qualified as hybrid journals: authors must pay APCs to publish their articles with OA, and the journals also profit from placing paywalls that block access for those who cannot afford to pay to read. This is the least desirable model, and the motivation for publishing in these journals is generally driven by the esteem hierarchy, which commercial journals encourage and exploit (see main text).

In the bottom-right corner (category 2), we find a trade-off on resource investment, with commercial journals publishing all their content in OA (often referred to as Gold OA journals) but charging authors (sometimes very expensive) APCs. The vast majority of OA journals, including some society journals run by for-profit publishers and some nonprofit commercial journals with high salaries and marketing costs, fall into this category.

In the top-left corner (category 3), we find a different kind of trade-off. These are commercial journals that might not be for-profit and might not charge authors, but still restrict access to articles: readers have to pay, or be subscribed via their institutions, to access the content of the articles. Members of the public without resources or an institution to pay for the costs are generally left without access as well.

Finally, in the top-right corner (category 4), we find the most desirable model: non-commercial journals that publish all articles with OA. In ideal cases, they publish articles without APCs for authors (these are called "diamond OA journals"). They are directly supported by institutions that cover their running costs through financial and in-kind donations, maximizing both openness and re-circulation of resources back into academia. Though diamond OA is more desirable than categories 1, 2 and 3, we note that nonprofit commercial OA journals with reasonable, cost-reflective APCs (somewhere between category 4 and category 2) may play a beneficial role in the long term, by guaranteeing diversity in the publishing ecosystem.

We note, though, that distinctions along these two dimensions do not cover the full complexity of the publishing landscape. For example, journals with the same business model might have differences in ownership models (Figure 2), ranging from society, community, or nonprofit actors to commercial enterprises. This and other considerations not discussed here might also be important when choosing where to publish.

4 Why Haven't We Already Transitioned to Alternatives?
Before proposing ways towards healthier alternatives, like diamond OA, it is important to recognize that the for-profit academic publishing system is a stable one, and is difficult to perturb (Allahar 2021; Ponte, Mierzejewska, and Klein 2017; Schimmer, Geschuhn, and Vogler 2015). Academic publishing companies have, like any other companies, worked to promote and secure their business model. They have effectively exploited a growing trend of metric-based evaluation among researchers, by marketing and lobbying for attaching esteem and perceived objectivity to these metrics, even when such perceptions are in no way associated with the replicability or reliability of scientific results (Brembs 2018; Camerer et al. 2018; Errington et al. 2021; Serra-Garcia and Gneezy 2021). Researchers have come to believe that some journals are better than others, and so professional prestige is attached to publishing in some journals instead of others. This "esteem hierarchy" is a fiction come true: the belief that it is true is enough to make it true.
The esteem hierarchy is also pernicious. Colleagues on grant panels and promotion committees use it as a shorthand when assessing applications. Rewards flow to the individual who publishes high on the journal esteem hierarchy, regardless of the quality of that individual's work. Through their over-reliance on publication metrics, scientific institutions effectively entrust scientific output evaluation to metrics created in a commercial mindset without direct consideration for scientific merit (Johnston 2015). A logical consequence of the esteem hierarchy is that researchers compete for limited places in what they perceive to be the best journals. With limited supply and lots of demand, price inflation is a natural consequence of this runaway process.
Thus, the esteem hierarchy underpins the difficulty of transitioning to alternative models. Any new model faces an uphill battle against larger and older competitors who hold all the cards. Any researcher publishing with a new, low-APC, ethical journal pays the opportunity cost of esteem points.
On top of this, researchers are generally unaware of the cost of accessing the literature, including costs paid via institutional subscriptions. In fact, dominant publishers often keep institutional subscription costs under non-disclosure agreements, not only leaving researchers and research institutions in the dark about costs, but also enabling publishers to raise prices without scrutiny (T. Bergstrom 2014). Subscription prices have become more transparent in recent years, but access to literature and publication fees are still often considered infrastructure costs or are covered by institutional budgets. Therefore, the incentive to publish high in the esteem hierarchy cannot be counterbalanced by concerns about the dominant publishers' pricing, despite the exorbitant costs absorbed by libraries and research institutions. This is quickly becoming unsustainable under the status quo (Resnick and Belluz 2019). This is a very clear case in which what the individual perceives to be a short-term good is at odds with the long-term good of the scientific community and humanity as a whole (Casadevall and Fang 2014). To resolve this conflict, we need to find pathways that align the interests of the individual with those of the community, and that work to erode the esteem hierarchy that prevents alternative systems from becoming widespread.

5 What Can We Do?
Below, we list a set of suggested actions we can undertake as academics, at little to no personal cost to career prospects, in order to pull publishing out of the hands of the dominant commercial publishers and back into the control of scientists, while eroding the esteem hierarchy that commercial actors have encouraged over decades.
1. Recognize our specific leadership roles and leverage points.
The academic system has clear hierarchies, and they constrain the range of actions one can undertake. We recognize that differences in funding, seniority, cultural and academic background, gender, and ethnicity shape our capacity, energy, and time to enact change. We also recognize that those with the largest amount of power have the highest responsibility to enable this change. Yet, we can all contribute in different ways from our different positions, once we identify the leverage points that are within our reach: essentially, the people who engage with us regularly and are willing to listen to what we have to say. For a PhD student, that could mean one's supervisor or advisor, fellow lab members, or members of a student union. For a member of a grant review board, it could mean other reviewers or grant agency officers. For a member of a scientific society, it could mean fellow members or society officers.

2. Let people know our concerns.
The current perceived dependence on commercial journals poses a collective action problem: all academic actors would be better off if they cooperated, but they fail to do so because of perceived competition among us. A way to begin to defuse the social dilemma is by sharing information: letting our peers, advisors, mentors and mentees know that we disagree with the system, even if we feel constrained to operate within it. For a student, that can mean engaging in conversations about past and future publishing choices for joint work with one's supervisor or fellow lab members, especially before a research project begins. For a member of a grant review board, it can mean openly talking to fellow reviewers about the problems with evaluation metrics reliant on prestige derived from for-profit publishing. For an officer in a scientific society, it can involve calling for a re-evaluation of the society's choice of publication outlets. For an author of a study, it can mean raising awareness among co-authors of problems with commercial publications, and taking this dimension into account when choosing a journal to publish in. Social media can help to highlight papers published in ethical venues, as well as problems tied to commercial publishing systems. When talking about for-profit journals, we can also use words one would associate with negative feelings or behavior ("exploitative", "profiteering") rather than positive ones ("impact", "authoritative", "top").
At the same time, we should keep in mind that all academics are embedded in our current system and should not be ridiculed for trying to preserve their careers (but see Box 2 for a demystification of the perceived "costs" of publishing in nonprofit journals). All of these conversations will be most effective if approached with honesty and good faith, avoiding shaming, blaming or antagonizing the other party, or making unsubstantiated statements about publishing (see Box 3 for a guide to debunking false claims during such conversations).
Ultimately, the onus lies on the major sources of academic funding. If we have access to officers of public research institutions, funding agencies, negotiation consortiums, and learned societies (or know someone who might), it is crucial to engage in conversation with them, and explain the problems with the for-profit publishing system, without assuming that knowledge about it is widely disseminated.

Box 2: Is there a "cost" to faculty publishing in nonprofit or society journals?
Journals with high impact-factor scores are overwhelmingly owned by commercial publishers who, in turn, encourage the use of these scores. It is often suggested that publishing in such journals is beneficial to a scientist's career. We assessed the validity of this claim in the field of ecology and evolutionary biology by querying the DAFNEE database, which contains publishing information for this field. We sampled nine cities/states having recently hosted a major conference in this field: the Joint Evolution meeting (Society for the Study of Evolution, European Society for Evolutionary Biology, Society of Systematic Biologists, American Society of Naturalists) or the SMBE meeting (Society for Molecular Biology and Evolution). These were located in cities across three regions: Austin, Ottawa, Québec (Americas); Auckland, Queensland, Tokyo (Asia-Pacific); Manchester, Montpellier, Vienna (Europe). In each of these cities, a department of ecology, evolutionary biology or biology was identified. The name, surname, and academic positions of faculty members were retrieved from department web sites. For each scientist, the following statistics were collected: total number of citations over their career, h-index, and DAFNEE index. The h-index of a scientist (Hirsch 2005) is the largest number h such that h of their articles have each been cited at least h times. The DAFNEE index is the proportion of a scientist's articles published in society or nonprofit journals. The total number of citations and h-index were manually retrieved from Google Scholar. The DAFNEE index was automatically calculated based on PubMed records and DAFNEE journal annotations. Scientists having published 10 or fewer articles in journals surveyed by the DAFNEE curators were not included. A correlation analysis did not reveal a significant relationship between total citations and DAFNEE index (N = 246, r² = 0.0025, P > 0.05; Figure 3.A), or between h-index and DAFNEE index (N = 246, r² = 0.0037, P > 0.05). Adding scientists' gender, academic position and city/state as co-variables did not reveal any hidden relationship between citation metrics and the DAFNEE index. In one department (Institut des Sciences de l'Evolution de Montpellier) we could access scientists' ages and expand the data set by considering all of the scientists' publications instead of just PubMed records. Figure 3.B plots h-index as a function of age in this department, with green dots corresponding to scientists with a DAFNEE index above the median, and red dots corresponding to scientists below the median. The two regression lines did not differ significantly (covariance analysis, N = 62). These analyses suggest that, at least in the field of ecology and evolutionary biology, favoring society/nonprofit journals over purely commercial journals has no negative impact on the citation rate of faculty members.
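The two indices used in this analysis are mechanical to compute. A minimal sketch of both follows; the citation counts and journal flags in the usage lines are made-up illustration data, not values from the DAFNEE data set.

```python
def h_index(citations):
    """Hirsch (2005): the largest h such that h articles have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the top `rank` articles all have at least `rank` citations
        else:
            break
    return h

def dafnee_index(article_in_friendly_journal):
    """Proportion of a scientist's articles appearing in society/nonprofit journals."""
    return sum(article_in_friendly_journal) / len(article_in_friendly_journal)

# Hypothetical citation counts for one scientist's articles:
print(h_index([25, 8, 5, 3, 3, 1, 0]))          # 3
# Hypothetical per-article flags (True = society/nonprofit venue):
print(dafnee_index([True, False, True, True]))  # 0.75
```

The correlation analysis reported in the box then simply tests whether `dafnee_index` predicts `h_index` (or total citations) across scientists.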
The results above are perhaps not so surprising; indeed, both of the compared sets of journals (society/nonprofit vs. purely commercial) are highly heterogeneous. Many society journals are known for having a long history of publishing high-quality science, and a number of nonprofit OA journals are quite selective; one can clearly build a very solid CV while favoring these. This encouraging result should not hide the fact that many society/nonprofit journals have excessive subscription fees or APCs, which are only partly re-injected into academia, and/or do not offer OA to their contents (see main text and Box 1). The results likely apply to other scientists as well (PhD students, postdocs), since they imply that articles published in academia-friendly journals are not cited less than articles published in purely commercial journals.

3. Engage in collective action.

Leverage points can become clearer or more approachable if we act collectively. An academic union or a student organization can influence policy better than a single individual. Collective action works because of the power of social signaling: people are more willing to act if they know others are committed to acting in unison. Social "tipping points" (reviewed in Lenton 2020) can thus rapidly shift the structure of a system, if enough information flows between relevant actors. There are already declarations of non-collaboration with closed-science publishing practices (e.g., https://nofreeviewnoreview.org/), but these do not yet take into account the dumping of OA "rents" on academic researchers and institutions. Other collective initiatives, such as the San Francisco Declaration on Research Assessment (DORA), have brought individuals together to denounce the misuse of journal metrics, which drive the aforementioned esteem hierarchy. Similar initiatives could emerge against commercial journals, by collecting declarations of support from scientists en masse. A public statement in favor of ethical, nonprofit publishing or peer review systems by a large academic organization can help frame the conversation in a positive way, and steer others to modify their behavior (see, for example, the PCI Manifesto: https://peercommunityin.org/pci-manifesto/). As we engage in such efforts, we should keep a systemic viewpoint in mind: collective action should target not only publishing practices, but also the journal metric evaluation schemes that fuel them. Researchers should reach out to contacts in funding agencies, public representatives, and university administrators, many of whom already share their same concerns (CIHR et al. 2019; European Commission 2022; 2017; European University Association 2022; C. Fraser et al. 2021; Science Europe 2021; 2013; Science Europe and European University Association 2019; Carr and Towers 2020; Wilsdon 2016). Such expanding connections between actors can strengthen the leverage we all have to propose and implement changes in research assessment, towards a social tipping point. For example, this can be done by engaging with our student and early career researcher representatives, encouraging them to connect to organizations advocating for change on national and supranational scales (Berezko et al. 2021; Hnatkova, DiFranco, and Srinivas 2020). We can also formalize our support for existing collective actions that advocate for more ethical publishing: e.g., signing DORA, the Jussieu call (Bauin et al. 2017), or the Leiden Manifesto (Hicks et al. 2015). Finally, we can increase collective leverage by changing our own practices, for example when assessing peers and when judging the quality of scientific papers (see recommendation 7 below). Connecting the advocacy of all concerned stakeholders can increase the pace of transition in evaluation and publishing practices: policymakers will be pushed to initiate the necessary reforms more quickly and forcefully if a large number of scientists show their disapproval of the current situation, in word and deed.

Withdraw our free peer review or editing labor from commercial journals.
Increasingly, researchers are beginning to openly deny requests for peer review or editing from the dominant commercial publishers. This is a low-cost action that can have a high impact, especially if done openly and transparently. There are few (if any) repercussions from systematically denying one's labor in response to a confidential and private request to give labor away to a for-profit company. Importantly, we can let both the journal's contact person and our peers know why we are doing this (anonymizing the author names and article title to preserve confidentiality), and use it as an opportunity to question the assumptions behind the request. We can stress that the reasons are not necessarily about our own personal money or time per se (though they might be), but about the nature of the for-profit system in general.

Box 3: Debunking publication myths.
We often encounter false or unsubstantiated claims related to ethical publishing in our daily research practices. Here, we provide a short guide to debunking these claims when engaging in conversations with other members of our communities or research environments.

"All recognised journals in my field are unethical."
Many recognised journals are owned by learned societies or research institutions, whose members work to keep scientific resources within science. In ecology and evolution, this proportion amounts to about 50% (cf. dafnee.org).

"I will be cited less if I publish in an ethical journal."
See Box 2 for a demystification of this claim.

"I cannot publish in an ethical journal because my supervisor does not want to."
If you are a student or postdoc, it is always worth discussing publishing choices with your supervisor: you may be surprised. Research supervisors often use the opposite argument to justify themselves: "I cannot publish in an ethical journal because students and postdocs are signatories to the article, and this could harm their careers."

"I cannot publish in an ethical journal because students and postdocs are signatories to the article, and this could harm their careers."
Students and postdocs are often the most highly motivated to publish in ethical journals. They might use the opposite argument to explain why they do not publish in ethical journals: "I cannot publish in an ethical journal because my supervisor does not want to." If you are a supervisor, it is worth discussing publishing choices with your students and postdocs, especially at the beginning of a research project. See also Box 2.

"I have no leverage over unethical journals."
There are many ways in which we can exert leverage over the publishing system and the choices other scientists make. These include, but are not limited to: a) refusing to review or serve as an editor for these journals, b) choosing other journals for publication, c) discussing with members of editorial boards about switching publishers, and d) discussing with members of learned societies about abandoning contracts with dominant commercial publishers.

"Evaluation committees always use journals as indicators of quality."
Many review boards are increasingly using qualitative assessments and abandoning the use of journal metrics as a proxy for research quality. The number of DORA signatories bears witness to this. Additionally, the members of evaluation committees are themselves researchers. We can argue for increased use of this type of assessment, and refuse to sit on commissions that operate unethically.

"We will have to deal with the same system forever. Nothing will change."
The publishing system is changing, and we can accelerate that change. Every year, there are more DORA signatories. Impact factors are now banned from a growing number of evaluation committees. There are also international initiatives created to change research evaluation procedures, e.g., the Paris Call on research assessment (OSEC 2022), based on the European Commission report on reforming the research assessment process (European Commission 2021).
In response to the profiteering model of academic publishing, it has occasionally been suggested that reviewers be paid for their work. We do not believe that this is a good idea. First, the cost of peer review would simply be passed on to the community. Second, any payment would be unlikely to reflect the true value of the reviewer's time and expertise; researchers would essentially be devaluing their expertise, turning a pro bono contribution into an underpaid commercial activity. Finally, the introduction of payment for review can only reinforce the esteem hierarchy: dominant for-profit publishers have much greater financial resources than non-commercial publishers, so they can afford to pay more and, in return, charge higher APCs or subscription fees. Were they to actually begin paying reviewers (even small amounts), the system would work against the nonprofit or society publishing organizations that rely on the good will and trust built over years of engagement between academics. Thus, payment for peer review could run the risk of driving good actors out of the system. Instead, research institutions should more openly recognise peer review as an integral part of researchers' work, adapting research assessments to explicitly consider peer review as an important academic contribution.

Support healthy journals.
When deciding where to submit a paper, scientists normally aim to reach the widest possible audience. The set of potential target journals is typically chosen based on a combination of impact factor and thematic relevance. We suggest considering the journal's business model, and more generally the journal's publication ethics, as an additional important criterion. Indeed, our submission choices can have a strong impact on the structure of the publishing system (Logan 2017; Kayal, Ballard, and Kayal 2022). Whenever we have a choice between more or less equivalent options, we can favor diamond, nonprofit and/or society journals over purely commercial journals. By doing so, we retain (part of) the publication fees in academia, support the groups of scientists who run these journals, and strengthen their power to negotiate with private publishers, if they use their services. For example, roughly 50% of the articles in the field of ecology and evolution currently appear in journals that are not purely commercial (see Box 2). Increasing this percentage to, say, 70% would entail no strong disruption of scientists' habits, while having a huge impact on the regulation of the overall publishing market.
Some large institutions have already engaged in the creation or support of diamond OA journals (Becerril, Arianna, et al. 2021; Bosman et al. 2021) or of publishers of such journals (e.g., SciPost). However, such initiatives remain rare, the financial efforts behind them are limited, and these journals often have limited audiences. It is important for large institutions to be more ambitious and to create or support general or specialized diamond journals of high reputation: e.g., a "Max Planck Journal of Solar Physics", a "Stanford Journal of Genetics", a "Tokyo University Journal of Asian History", or a "Wellcome Trust Journal of Medicine" (Haspelmath 2015).
We can also support a healthy publication system by providing peer review and editorial support to existing nonprofit journals, fulfilling roles like associate editor, for example, and stepping out of such roles when they serve dominant for-profit publishers. Indeed, volunteering is mentioned as a challenge by many diamond OA journals "who expressed concerns about the reliance on the goodwill of volunteers and on the dedication of certain individuals who sustain journals who cannot be necessarily relied upon in the mid- to long-term" (Becerril, Arianna, et al. 2021). Hence, a more substantial commitment to running these journals is essential if we want them to play a significant role in the publishing landscape. Expanding initiatives like DAFNEE to other fields could also help inform authors of the hidden benefits (to both the individual and the community) of publishing in these journals. Research teams may also mobilize tools designed by librarians to help them reflect on, and decide between, OA journals. An example is the Publication Strategy and Open Science tool (https://tinyurl.com/publishing-strategy) developed by Jeroen Bosman and Bianca Kramer (2022).

Discuss switching to ethical publishing with existing journal owners.
Many journals are owned by scholarly societies or research institutions (national research institutes, universities, laboratories) but are published by dominant commercial publishers. Publication or subscription fees end up partly in the hands of the publisher and partly contribute to the research institution's or scholarly society's budget, in a generally unknown proportion. The budget of a learned society comes partly from this return from the private publisher, partly from membership subscriptions, and sometimes from donations from funding agencies or research institutions. In all these cases, 100% of these budgets originally comes from the scientific community (the scientists themselves, the research institutions and the laboratories). Indeed, subscriptions, APCs, memberships and donations all come from the academic community, and yet a substantial proportion does not return to the community, but rather ends up in the pockets of publisher shareholders, or covers costs unrelated to publication itself (marketing and lobbying) (Grossmann and Brembs 2021).
Convincing scholarly societies or research institutions to give up their association with large commercial conglomerates would be a way to end this pointless "bleed-out" of academic funds (Haspelmath 2015; Wilson 2018; Kayal, Ballard, and Kayal 2022). Such organizations could begin by making their profit-sharing arrangements explicit, which could in turn prompt their members to support a switch to ethical or nonprofit publishers, or even to organize the publication of their journals themselves under a fair open access model (www.fairopenaccess.org). Several journals formerly published by dominant commercial publishers have already pioneered this type of switch. For example, the whole editorial board of the Journal of Algebraic Combinatorics resigned from Springer in 2017 and set up Algebraic Combinatorics (algebraic-combinatorics.org), a new journal based at Centre Mersenne. Similarly, the editorial board of Lingua decided to quit Elsevier in 2015 and founded Glossa, a journal published by Ubiquity Press (Wilson 2018). Other such cases can be found in the Open Access Directory: http://oad.simmons.edu/oadwiki/ Such a switch would mean that the income of learned societies would come less, or not at all, from the share of subscriptions or APCs paid out by dominant commercial publishers. They would have to make up for this loss of income by obtaining direct grants from funding agencies or research institutions, or by running their journals autonomously while keeping APCs on the order of the real publishing costs. For the learned societies, this switch would make little difference. For the scientific community as a whole, it would generate considerable savings. Indeed, funding agencies and research institutions would gain substantially, because 100% of their direct grants to learned societies would actually benefit the scientific community (contributing to the budgets of learned societies and the real costs of publication), and publication costs would be kept to a minimum.
How can we convince learned societies and research institutions to switch? This can be done by proposing the idea as a point of discussion during general assemblies, by discussing directly with the teams in charge of learned societies or research institutions, or by talking to the editorial teams of journals. Associations specialized in this approach exist in numerous fields: LingOA in linguistics, MathOA in mathematics, and, more generally, the Free Journal Network and the Fair Open Access Alliance. They promote the fair access model, help journals switch towards fair access, and help them maintain this model. For instance, the switch performed by the editorial board of the Journal of Algebraic Combinatorics was assisted by MathOA. In turn, LingOA has helped four journals switch to fair OA: Glossa, Laboratory Phonology, the Italian Journal of Linguistics, and the Journal of Portuguese Linguistics (www.lingoa.eu/new-mission). As a point of reference, the Societies and Open Access Research initiative (SOAR, http://bit.ly/hoapsoar) has identified more than 1,000 societies publishing more than 1,000 OA journals. Society officers can use this resource to identify other societies within the same field that have previously gone through this process, to gather advice and learn about the consequences of flipping.

Promote ethical behavior by focusing on science, not journal names.
We can also promote ethical behavior in our daily academic duties. We can avoid evaluating colleagues on hiring and promotion committees based on bibliometric data alone. And we can reject requests to sit on evaluation committees unless we are given sufficient time to make a qualitative assessment of applicants' research based on the content of their research output. We can encourage our colleagues to cite papers based on merit, and not simply because they are published in esteemed journals. Similarly, we can encourage researchers to regularly check society and non-commercial journals. When informally discussing a paper (e.g., in journal clubs), we can steer conversations away from "where" the article was published to better focus on its content.
When performing research, we can aim to uphold ethical standards for data openness, responsibility, and accessibility from the start of the research process, e.g., by using the FAIR principles for data management and stewardship (Wilkinson et al. 2016) and the CARE principles for indigenous data governance (Carroll et al. 2021). We can also think critically about how the choice of a particular journal for submission might affect our ability to maintain such principles.
Recently, major changes in the procedures for project evaluation by funding agencies have resulted in a move away from simply counting articles. Many agencies are starting to require narratives and indications of societal impact: researchers can help move this forward by expressing public support for these requirements. For example, the University of Utrecht announced that it was abandoning the use of impact factors for the promotion and recruitment of its scientists (Woolston 2021) in favor of evaluations based on "qualitative measures, narrative and strategy first" (Utrecht University 2021). At the 2022 Paris Open Science European Conference (OSEC), Maria Leptin, the current president of the European Research Council (ERC), said that, having signed DORA, the ERC has completely banned the use of impact factors, and that its panels should instead look at the content of the proposed projects (ERC 2021). The National Health and Medical Research Council in Australia recently switched to evaluating track records based only on a researcher's best ten papers of the last ten years, prompting applicants to provide narratives justifying their choice of "best" (NHMRC 2022). Similarly, at the Centre National de la Recherche Scientifique (CNRS) in France, individual evaluation committees are abandoning impact factors in favor of narratives and qualitative explanations of researchers' contributions to the advancement of knowledge (Larousserie 2021). Indeed, Sylvie Rousset, head of open science at CNRS, recently stated that "We are working to stop delegating the scientific evaluation of researchers to the editors of journals, however prestigious they may be" (CNRS 2021). CNRS now requires the evaluation of researchers to be based on: (i) the results themselves, and not the fact that they may have been published in a prestigious journal; (ii) a limited number of such results; and (iii) a larger diversity of professional activities, including preprints, data sharing, software production, training, innovation, management and investment in open science, among other criteria. Similar approaches have been taken up across Europe: for example, UK Research and Innovation (UKRI), the Dutch National Research Council, the Swiss National Science Foundation, the Luxembourg Research Council, and the Health Research Board Ireland have all started implementing and discussing the strengths and challenges of "narrative CVs" (Hazlett 2021).

Conclusion
The scientific publishing system is now at a crossroads. The increasing strain that dominant commercial publishers place on library systems, research institutions and the public at large is untenable, and it is pushing many scientists to reconsider their preferred publication venues. Yet it is up to scientists themselves to create the actual alternatives that pave the way forward.
Thankfully, many such alternatives are emerging, including diamond OA journals and community peer review initiatives with no cost to reviewers, authors or readers of scientific output. Transitioning to ethical, low-cost alternatives is still within our power. Making the transition will naturally require leadership and thoughtful engagement by researchers, their professional societies, institutions and funding agencies. The benefits of this collective action would, however, be profound. Not only would it remove a massive cost burden from research institutions, it would also make the global research enterprise more efficient, equitable, and accessible. What is stopping us?
OPEN ACCESS - PTPBIO.ORG

Figure 1: Two of the dimensions of the publishing landscape: openness and economic model.

Figure 2: Incorporating a third dimension (ownership) into the publishing landscape matrix.

Figure 3: A. Total number of citations as a function of the DAFNEE index; N = 246 scientists from 9 universities across Europe, America, and Asia. Red = Professor. Blue = Associate Professor. Green = Assistant Professor. B. The h-index of scientists as a function of age. Green = society + nonprofit journal usage above average. Red = society + nonprofit journal usage below average. The lines denote a linear regression fitted to each of the two DAFNEE groups.
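The kind of comparison shown in Figure 3.B (and the covariance analysis mentioned in Box 2) can be sketched in a few lines of code. The example below is purely illustrative: the data are simulated, and `ancova_group_effect` is a hypothetical, minimal OLS-based analysis of covariance, not the analysis pipeline actually used in the study. It asks whether h-index differs between the two DAFNEE groups once age is controlled for.

```python
import numpy as np

def ancova_group_effect(age, y, group):
    """Fit y ~ 1 + age + group by ordinary least squares and return the
    coefficients plus the t-statistic of the group term.  The group term
    measures the vertical offset between the two regression lines at a
    given age (an analysis-of-covariance-style comparison)."""
    X = np.column_stack([np.ones_like(age), age, group])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    n, p = X.shape
    sigma2 = (resid @ resid) / (n - p)        # unbiased residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)     # covariance of the estimates
    return beta, beta[2] / np.sqrt(cov[2, 2])

# Simulated data (hypothetical, not the study's data): h-index grows with
# age in the same way in both groups, i.e., no true group effect.
rng = np.random.default_rng(0)
age = rng.uniform(30.0, 65.0, size=62)
group = (rng.random(62) > 0.5).astype(float)  # 1 = DAFNEE index above median
h = 0.8 * age - 10.0 + rng.normal(0.0, 4.0, size=62)
beta, t_group = ancova_group_effect(age, h, group)
print(f"age slope = {beta[1]:.2f}, group t-statistic = {t_group:.2f}")
```

With real data, a |t| well above roughly 2 would indicate that, at a given age, one group is cited differently from the other; in this simulation the group term is pure noise, since it plays no role in generating `h`.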