It’s been about seven years since we developed a tool (a downloadable rubric and explainer) to help our faculty determine the credibility of a journal. You can download the tool at https://digitalcommons.lmu.edu/librarian_pubs/40/ and read about its development at https://jlsc-pub.org/articles/abstract/10.7710/2162-3309.2250/. People have told us – often! – how they use the tool in their personal decision-making about where to publish, and how they share it in their classrooms and with friends. When we developed the tool, I hoped its set of decision criteria would further the broader discussion about what an author needs in order to feel confident about submitting their work to a particular publisher.
I was delighted to see a recent article by Albro et al. (2024) that uses ours as one of two such tools in a critical evaluation of the credibility of open access library science journals. Our tool was selected for the evaluation for two reasons: its alignment “with the current practice of using descriptive frameworks that empower authors to critically evaluate a journal’s quality and credibility, rather than provide a binary decision of good or bad,” and its usability, in that it provides “guidance on application and a set list of criteria and measurements for determining quality or credibility” (p. 64).
The six-person team of evaluators applied our rubric and the Open Access Journal Quality Indicators checklist (Beaubien & Eckard, 2014) to a sample of 48 open access journals in the field of library science. The team preferred using our tool over the checklist because of the “specific rules for distinguishing between Good, Fair, and Poor for each criterion” (p. 72), though they found some ambiguity in applying the rubric, especially to smaller publishers and hybrid journals. They also reported lower inter-rater reliability for our rubric than for the checklist (70.3%, compared to 81.7%).
The team identified criteria and indicators that were similar across the two tools (see their Table 5, p. 73), as well as the top positive and negative attributes of library science journals as measured by the tools (Table 6, p. 76).
The authors offer a thoughtful consideration on p. 74, where they ask, “Researchers should question … [if] the tools [are] asking the right questions and how should the goals of the tool change over time, especially as open access resources expand and transform?” This aligns with my own thinking during the development of our tool: that “updates to existing tools are imperative, or new evaluation tools should be designed to account for the rapidly developing environment in which the research landscape now resides” (p. 75).
I am encouraged to see our field continue to engage in critical thinking about how an individual author may determine for themselves whether a particular journal is the appropriate outlet for their latest article, rather than relying on a prescribed list or tool that may miss a nuance or key consideration that matters to that author. Thank you to the authors for engaging with our rubric. It means a lot to me personally that an idea I had could be shaped into a useful tool that is still making an impact on the field, and I appreciate this critical eye on the work.
Albro, M., Serrao, J.L., Vidas, C.D., McElfresh, J.M., Sheffield, K.M., & Palmer, M. (2024). Applying Librarian-Created Evaluation Tools to Determine Quality and Credibility of Open Access Library Science Journals. portal: Libraries and the Academy 24(1), 59-81. https://doi.org/10.1353/pla.2024.a916990
Beaubien, S., & Eckard, M. (2014). Addressing Faculty Publishing Concerns with Open Access Journal Quality Indicators. Journal of Librarianship and Scholarly Communication 2(2), eP1133. https://doi.org/10.7710/2162-3309.1133
Blas, N., Rele, S., & Kennedy, M.R. (2019). The Development of the Journal Evaluation Tool to Evaluate the Credibility of Publication Venues. Journal of Librarianship and Scholarly Communication 7(General Issue), eP2250. https://doi.org/10.7710/2162-3309.2250
Rele, S., Kennedy, M.R., & Blas, N. (2017). Journal Evaluation Tool. LMU Librarian Publications & Presentations 40. https://digitalcommons.lmu.edu/librarian_pubs/40