Our Study is Published, But the Journey is not Finished!
DOI: 10.2138/gselements.16.4.229

Image reproduced with permission of Jason McDermott, @redpenblackpen
Each June, we receive e-mails from publishers celebrating the evolution of their journal impact factor (JIF). The JIF is a controversial metric (Callaway 2016), and it is worth asking, “What’s behind it?” In this age of “publish or perish” (Harzing 2007), we devote considerable time and effort to writing our papers and getting them published. But how much time and effort do we put into finding readers or ensuring that we are reaching the right audience? Are metrics such as the JIF good guides for how well we are doing at reaching our target audience?
What do journal-level metrics, or “bibliometrics”, such as the JIF actually measure? The answer is an arithmetic mean of citation counts. Take the example of Geochemical Journal (the journal on whose editorial board we serve) and consider citations in 2018 of articles published in 2016 and 2017 (n = 98): 21% of the articles were cited two or more times, 18% only once, and more than 60% received no citations at all. For Elements (n = 76), 71% of articles published in 2016 and 2017 were cited two or more times in 2018, 25% just once, and only 4% were not cited. Figure 1 shows citation distributions for six selected geochemistry journals, along with their JIFs and the percentage of citable items below the JIF. Between 56% and 70% of articles from these journals have citation counts below the reported JIF (Fig. 1): a few highly cited articles can inflate a JIF to a value that is not representative of the journal's articles as a whole. The JIF, like many other arithmetic means, is an inappropriate statistic for measuring the impact of individual papers (Tennant et al. 2019). Furthermore, such bibliometrics do not measure the visibility of your work or whether it is reaching your target audience. We need informative and readily available article-level metrics, such as article citation counts or “altmetrics”, along with other qualitative and quantitative measures that can truly gauge the “impact” of research.

Figure 1. Citation distributions for six selected geochemistry journals: the number of citations received in 2018 by articles published in 2016 and 2017. Each plot reports the 2018 journal impact factor (JIF) and, in parentheses, the percentage of citable items with citation counts below the JIF. The 2019 JIFs were released while this editorial was in production; the slight changes do not substantially affect our interpretation. The new JIFs are (A) Geochemical Journal: 1.149; (B) Elements: 3.507; (C) Geochimica et Cosmochimica Acta: 4.659; (D) Chemical Geology: 3.362; (E) Geochemistry, Geophysics, Geosystems: 3.275; (F) Geochemical Perspectives Letters: 4.452.
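To make concrete how a mean-based metric misrepresents a skewed distribution, here is a minimal sketch in Python. The citation counts are invented for illustration only; they mimic the shape of the distributions in Figure 1, not any journal's real data.

```python
# A toy illustration (invented numbers, not real journal data) of why an
# arithmetic mean such as the JIF misrepresents a skewed citation
# distribution: a couple of highly cited articles drag the mean far above
# what a typical article receives.
from statistics import mean, median

# Hypothetical citations received in 2018 by 20 articles published in 2016-2017
citations = [0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 2, 2, 2, 3, 3, 4, 5, 12, 41]

jif_style_mean = mean(citations)     # what a JIF-style metric reports
typical_article = median(citations)  # what the typical article actually received

print(f"JIF-style mean: {jif_style_mean:.3f}")  # 3.900
print(f"Median article: {typical_article}")     # 1.0

# Share of articles whose citation count falls below the mean, as in Figure 1
below = sum(c < jif_style_mean for c in citations) / len(citations)
print(f"Articles below the 'JIF': {below:.0%}")  # 80%
```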
The first step to ensuring that our work reaches the right audience is to make it widely accessible, for example through Open Access (Pourret 2020). But accessibility does not mean that our target audience will “see” our work. There are thousands of journals, and no one has the time or resources to read every publication. So, the second step is to draw attention to our work, create a community, and engage the public. As Ludden et al. (2015) noted, “We are not so good at promoting the social and economic impacts of geochemistry”. Online methods of communication (e.g., Twitter, Reddit, Facebook, blogs) have long had a bad reputation within scientific circles and are often not perceived as scholarly. Yet we should use these platforms to draw attention to our scientific work. It can be something as simple as writing a blog post, participating in a science communication podcast, tweeting our latest findings (Green 2019), or drawing a science-based comic strip or sketch (McDermott et al. 2018).
It is also important that the knowledge we produce quickly reaches the people for whom it is relevant. This is why engaged researchers are often visible in public as well as academic circles: they are frequent guests in traditional mass media, such as newspapers, radio, and television, and they are happy to give popular academic presentations to nonexperts. What matters most is that our knowledge is disseminated and reaches those who can understand and use it.
The value of a journal, and of the work published in it, also lies in the community it creates. Elements is an established journal in the geochemical community, well known for great papers on topics ranging from the critical zone to planetary systems. The magazine also publishes society news and lists of scientific events, with the intent of drawing together a community of researchers who would otherwise be disconnected. Elements is also used by many of you in classrooms (Pourret 2009), and many of you read the magazine simply to learn about unfamiliar topics within the Earth sciences. Apart from citations of the published articles, none of these uses is measurable by citation counts, yet all have a significant impact on the reader.
Fortunately, there are metrics (“alternative metrics”, popularly known as “altmetrics”) that can capture the impact of our work beyond citation counts in journals (Priem et al. 2010), for example Altmetric (https://www.altmetric.com/) and PlumX Metrics (https://plumanalytics.com/). Altmetrics have been shown to circumvent several weaknesses of citation counts as indicators of scientific attention, in three main ways. (1) They can be collected for articles, books, book chapters, presentations, figures, and so on. (2) They are available much faster than citation counts (Thelwall et al. 2013). (3) They can reflect the resonance of our work among nonscientific or nontraditional audiences, such as the mainstream media. In addition to altmetrics, there are other tentative alternatives to the JIF, such as the TOP Factor, which is based on the Transparency and Openness Promotion (TOP) Guidelines set out in Nosek et al. (2015).
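For readers who want to inspect such article-level attention data programmatically, the sketch below queries the free public endpoint of the Altmetric API for this editorial's own DOI. The endpoint path is documented by Altmetric, but the response field names used here are assumptions based on typical responses, hence the defensive .get() calls.

```python
# A minimal sketch of fetching article-level attention data from the free
# Altmetric public API. The response field names ("score",
# "cited_by_posts_count") are assumptions based on typical API responses;
# verify them against the current Altmetric documentation before relying on them.
import json
import urllib.error
import urllib.request

DOI = "10.2138/gselements.16.4.229"  # this editorial's own DOI
URL = f"https://api.altmetric.com/v1/doi/{DOI}"

try:
    with urllib.request.urlopen(URL, timeout=10) as response:
        data = json.load(response)
    print(f"Altmetric attention score: {data.get('score')}")
    print(f"Online mentions tracked:   {data.get('cited_by_posts_count')}")
except urllib.error.HTTPError as err:
    # The API answers 404 when it has no record of the DOI
    print(f"No altmetric record found (HTTP {err.code})")
```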
Metrics are needed. But metrics are numerous, and we must be wary of using any single one to measure the impact of our work. Both bibliometrics and altmetrics come with shortcomings and as-yet-unsolved challenges (Lemke et al. 2019). We need to work together to find more appropriate measures of quality for authors and research. For a start, research excellence should be remodeled around transparent workflows and accessible research results (Hicks et al. 2015).
Science must go on! So, where will you publish your work, and how will you measure its “value to the community”? A system based on bibliometric parameters favors “quantity over quality” and undervalues achievements such as societal impact. The best decisions are made by combining robust statistics with sensitivity to the purpose and nature of the research being assessed. There is a need for both quantitative and qualitative evidence; each is objective in its own way. Ultimately, researchers and their work should be assessed in a way that is fair and accurate, without over-reliance on publication metrics.
References
Callaway E (2016) Beat it, impact factor! Publishing elite turns against controversial metric. Nature 535: 210-211
Green T (2019) Maximizing dissemination and engaging readers: the other 50% of an author’s day: a case study. Learned Publishing 32: 395-405
Harzing A-W (2007) Publish or Perish. https://harzing.com/resources/publish-or-perish. Accessed June 2020
Hicks D, Wouters P, Waltman L, de Rijcke S, Rafols I (2015) Bibliometrics: the Leiden Manifesto for research metrics. Nature 520: 429-431
Lemke S, Mehrazar M, Mazarakis A, Peters I (2019) “When you use social media you are not working”: barriers for the use of metrics in social sciences. Frontiers in Research Metrics and Analytics 3, doi: 10.3389/frma.2018.00039
Ludden J, Albarède F, Coleman M (2015) The impact of geochemistry. Elements 11: 239-240
McDermott JE, Partridge M, Bromberg Y (2018) Ten simple rules for drawing scientific comics. PLOS Computational Biology 14, doi: 10.1371/journal.pcbi.1005845
Nosek BA and 38 coauthors (2015) Promoting an open research culture. Science 348: 1422-1425
Pourret O (2009) Elements in the classroom. Elements 5: 195
Pourret O (2020) Global flow of scholarly publishing and open access. Elements 16: 6-7
Priem J, Taraborelli D, Groth P, Neylon C (2010) Altmetrics: a manifesto. http://altmetrics.org/manifesto/. Accessed June 2020
Tennant JP and 15 coauthors (2019) Ten hot topics around scholarly publishing. Publications 7, doi: 10.3390/publications7020034
Thelwall M, Haustein S, Larivière V, Sugimoto CR (2013) Do altmetrics work? Twitter and ten other social web services. PLOS ONE 8, doi: 10.1371/journal.pone.0064841