Many academics are critical of the current publishing system, but it is difficult to create a better alternative. This perspective relates to the sciences and social sciences, and discusses the primary purpose of academic journals as providing a seal of approval for perceived quality, impact, significance, and importance. The key issues considered include the role of anonymous refereeing, continuous rather than discrete frequency of publication, the avoidance of time wasting, and the pursuit of adventurous research. We give recommendations about the organization of journal articles; the roles of associate editors and referees; measuring the time frame for refereeing submitted articles in days and weeks rather than months and years; encouraging open access internet publishing; emphasizing the continuity of online publishing; viewing academic publishing as a continuous dynamic process; and improving research after publication. Citations, and functions thereof such as the journal impact factor and h-index, are the benchmark for evaluating the importance and impact of academic journals and published articles. Even in the very top journals, a high proportion of published articles is never cited, not even by the authors themselves. Top journal publications do not guarantee that published articles will make significant contributions, or that they will ever be highly cited. The COVID-19 world should encourage academics worldwide not only to rethink academic teaching, but also to re-evaluate key issues associated with academic journal publishing in the future.
Any discussion about the future is inevitably based on the past and present. It is also often based on opinions.
The primary purpose of the opinions expressed here is to encourage others to offer their carefully considered opinions on the future of academic journals in a COVID-19 world.
In the usual academic sequence of events, a scientist, or a group of scientists, works on a topic of narrow or broad interest, discovers something new and hopefully exciting, writes it up, and submits the research article to a journal for possible publication. This is followed by a circus of events, including a laborious submission process, anonymous referees, editorial decisions, rejections, possible revisions, and so on.
Many academics and researchers are critical of the current publishing system, but it is difficult to create a better alternative. Perhaps the COVID-19 world may force academics worldwide not only to rethink academic teaching, but also to re-evaluate academic journal publishing, including a view of academic publishing as a continuous dynamic process in which research may be improved after publication.
2. How to Evaluate Research?
It is well known that this publication system has many drawbacks. Citations are the benchmark for evaluating the importance and impact of academic journals. Even in the very top journals, a high proportion of published articles is never cited, not even by the authors themselves. A top journal publication does not guarantee that a published article makes a significant contribution, or that it will ever be highly cited.
The current tenure and promotion evaluation system works as a two-step procedure. We ask: “In which journal has the article appeared?” Then we multiply the number of pages by the “prestige index” of the journal, and (possibly) divide it by the number of authors, though this division would render hyperauthored articles with more than 1,000 authors essentially worthless to each author. Dividing by the number of authors is rarely undertaken in many disciplines, especially in the sciences. Hence, there would be no need to read the published research of those seeking tenure or promotion: we would only need to count points. There is some merit in this system because it is impartial and objective. Unfortunately, the presumed “prestige” of a journal has little predictive power in terms of future career trajectories.
Journals are evaluated according to their perceived prestige, impact, importance, significance, and quality. In essence, every known measure of a journal publication is based on citations, or a function thereof, such as total citations, the h-index (see Hirsch, 2005), and the impact factor, with the number of views and downloads sometimes also being considered in various outlets. Similarly, the most widely used measures for evaluating individual academics are total citations and the h-index.
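The h-index mentioned above has a simple definition: an author (or journal) has index h if h of their articles have at least h citations each. A minimal sketch of this computation, assuming only a list of per-article citation counts as input:

```python
def h_index(citations):
    """Return the largest h such that at least h articles
    have h or more citations each (Hirsch, 2005)."""
    counts = sorted(citations, reverse=True)
    h = 0
    # Walk down the ranked list; rank i qualifies while the
    # i-th most-cited article still has at least i citations.
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i
        else:
            break
    return h
```

For example, an author with articles cited 10, 8, 5, 4, and 3 times has h = 4: four articles with at least four citations each, but not five with at least five.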
A journal’s prestige is based on citations, as can also be seen in the marketing and advertising efforts of journal publishers, and individual published articles are judged by the journal’s prestige. Why this roundabout method? An article should be judged by its impact through citations. In this way, the misleading prestige of journals would no longer play a role, as it is the article itself that matters.
This might prove difficult for young researchers who might have had little time to make an impact. But these young researchers currently have a more serious problem, namely, that their research articles will only count when they have been published (or at least accepted) in leading journals, while the time between initial submission and final acceptance may be several years, followed by additional time taken for publication.
3. How to Improve Research after Publication?
It is worth noting that the Editors-in-Chief, the members of Editorial Boards, and the referees/reviewers of academic journals are guilty of two types of errors, namely rejecting good quality articles that should have been accepted, and accepting submissions that should have been rejected. Quality articles that were rejected may have been published elsewhere, possibly in respectable journals. However, what can be done with published articles that should not have been published, for whatever reason, including a failure to attract citations some years after publication?
A different perspective on journal publishing would be to view academic publishing as a continuous dynamic process, rather than an end result. In the traditional model of academic journals, a published article is finished. This may be convenient for the journal, the author, referencing, and citations analysis. On the other hand, if a published article is never truly finished, as would be the case in an ongoing dynamic process, it may subsequently undergo further refinement through revision, correction, extension, updating, commentary, and associated responses.
A step in the right direction would seem to be the inception of Sci, an innovative open access journal with transparent post-publication peer review (see Rittman and Vazquez (2019) for the launch and motivation of the journal in 2018, and Jacob, Rittman, Vazquez and Abdin (2019) for its continuing evolution). Academics can be solicited to provide commentary on published articles, although self-nomination is also possible, and the author(s) of the original publication are invited to respond. There is no anonymity in the review process after publication, but the decision to accept an article is made by academic editors who remain anonymous. Full disclosure at all stages of decision making might be a useful further development and evolution of the editorial process.
All of this takes time and effort, but it is worth pursuing as a genuine attempt to improve upon the original published research in a relatively timely manner. It is frequently difficult to attract independent experts on a given topic, while inviting the original reviewers for further review is self-defeating and not independent. Self-nomination as a reviewer may be reasonable, but it is not proactive.
Encouraging critical and focused reviews of published articles, without a strict limit on length, would encourage reviewers to provide in-depth analyses that would lead to improvements of the original publication.
The entry is from 10.3390/sci2040076
- Hirsch, J.E. An index to quantify an individual’s scientific research output. Proc. Natl. Acad. Sci. USA 2005, 102, 16569–16572.
- Rittman, M.; Vazquez, F. Sci—An open access journal with post-publication peer review. Sci 2019, 1, 1.
- Jacob, C.; Rittman, M.; Vazquez, F.; Abdin, A.Y. Evolution of Sci’s community-driven post-publication peer-review. Sci 2019, 1, 16.