Using our Journal Evaluation Tool to determine credibility of a journal

It’s been about seven years since we developed a tool (a downloadable rubric and explainer) to help our faculty determine the credibility of a journal. You can download the tool at https://digitalcommons.lmu.edu/librarian_pubs/40/ and read about its development at https://jlsc-pub.org/articles/abstract/10.7710/2162-3309.2250/. People have told us – often! – how they use the tool in their personal decision-making about where to publish, and how they share it in their classrooms and with friends. When we developed the tool, I hoped the set of decision criteria in it would further the broader discussion about what an author needs in order to feel confident about submitting their work to a particular publisher.

I was delighted to see a recent article by Albro et al. that uses ours as part of a critical evaluation of two such tools for determining the credibility of open access library science journals. Our tool was selected for the evaluation for two reasons: its alignment “with the current practice of using descriptive frameworks that empower authors to critically evaluate a journal’s quality and credibility, rather than provide a binary decision of good or bad,” and its usability, in that it provides “guidance on application and a set list of criteria and measurements for determining quality or credibility” (p. 64).

The six-person team of evaluators applied our rubric and the Open Access Journal Quality Indicators checklist (Beaubien & Eckard, 2014) to a sample of 48 open access journals in the field of library science. The team preferred using our tool over the other one because of the “specific rules for distinguishing between Good, Fair, and Poor for each criterion” (p. 72) but noted some ambiguity in applying the rubric, especially as it relates to smaller publishers and hybrid journals. They also noted that our tool’s inter-rater reliability was lower than that of the other tool (70.3%, compared to 81.7%).

The team identified some criteria and indicators that were similar across the tools (see their Table 5, on p. 73) and identified the top positive and negative attributes in library science journals, based on the two tools (Table 6, p. 76).

A thoughtful consideration from the authors appears on p. 74, where they ask, “Researchers should question … [if] the tools [are] asking the right questions and how should the goals of the tool change over time, especially as open access resources expand and transform?” The authors are aligned with my own thinking during the development of our tool, that “updates to existing tools are imperative, or new evaluation tools should be designed to account for the rapidly developing environment in which the research landscape now resides” (p. 75).

I am encouraged to see our field continue to engage in critical thinking about how an individual author may determine for themselves whether a particular journal is the appropriate outlet for their latest article, rather than relying on a prescribed list or tool that may miss some nuance or a key consideration that is important to that author. Thank you to the authors for engaging with our rubric. It means a lot to me personally that an idea I had has been shaped into a useful tool that is still making an impact on the field, and I appreciate this critical eye on the work.

Albro, M., Serrao, J.L., Vidas, C.D., McElfresh, J.M., Sheffield, K.M., & Palmer, M. (2024). Applying Librarian-Created Evaluation Tools to Determine Quality and Credibility of Open Access Library Science Journals. portal: Libraries and the Academy 24(1), 59-81. https://doi.org/10.1353/pla.2024.a916990.

Beaubien, S. & Eckard, M. (2014). Addressing Faculty Publishing Concerns with Open Access Journal Quality Indicators. Journal of Librarianship and Scholarly Communication 2(2), eP1133. https://doi.org/10.7710/2162-3309.1133

Blas, N., Rele, S., & Kennedy, M. R. (2019). The Development of the Journal Evaluation Tool to Evaluate the Credibility of Publication Venues. Journal of Librarianship and Scholarly Communication, 7(General Issue), eP2250. https://doi.org/10.7710/2162-3309.2250

Rele, S., Kennedy, M.R., & Blas, N. (2017). Journal Evaluation Tool. LMU Librarian Publications & Presentations 40. https://digitalcommons.lmu.edu/librarian_pubs/40


Posted in library, OA, publishers, writing | Comments Off on Using our Journal Evaluation Tool to determine credibility of a journal

Drop some positivity in the next article you peer review

We recently received feedback on an article we submitted for publication, and I was delighted that the comments were entirely positive in tone and productive. The reviewer(s) brought up some issues that we could act on in the revision, and we were happy to make those changes. Overall, it was a good vibes review experience. Upon reflection, it was nice because of the attitude of the reviewer’s comments.

Over the years I’ve become accustomed to critical reviews that are couched in either neutral or negative language. That’s fine, I can take a critical review, and I know reviewers are likely just trying to get through the task, to move on to their next one. I tend not to take negative comments personally. The main thing I’m looking for is productive comments, actionable changes I can make to improve the manuscript.

With that as my focus, I had lost sight of how the tone of a reviewer comment can improve the whole experience. A note in the margins like this, then, caught my eye: “Excellent point and very important!” I could tell that the reviewer was engaged with the article, and interested in what we wrote. There were other notes like that sprinkled throughout the review, and it made all the difference to me.

If you’re the reviewer that commented on this latest article, thank you. You’ve inspired me to be proactive in putting forward this kindness in my future reviews.

Posted in writing | Comments Off on Drop some positivity in the next article you peer review

Mentoring Academic Librarians for Research Success

The major take-away from this book chapter is that the feedback about the IRDL Mentor Program (in place from 2016 to the present), from both mentors and Scholars, has been overwhelmingly positive. In the chapter we describe the process used by the program to recruit and select mentors, the pairing of mentors with their Scholars, and the general administration of the IRDL mentor program. We offer strategies for making the mentor-Scholar relationship work and tips for the design of a formal mentoring program.

A consistent refrain from both the mentors and the Scholars is that the experience “is much better overall than other mentor/mentee experiences, especially in terms of the clear expectations and structure” (p. 250). In the chapter we aim to make the components of the program as transparent as possible, so that others may reproduce aspects of it in their own mentor programs. We provide specific guidance for how to administer and maintain a year-long program in the two appendices, the IRDL Mentoring Program Contract and the monthly reflection prompts, designed to keep communication between the mentors and their Scholar consistent over the course of the program.

Read this chapter and get the Contract and monthly reflection prompts at no cost to you, at https://digitalcommons.lmu.edu/librarian_pubs/140/.

Jason, D.P., III, Kennedy, M.R., & Brancolini, K.R. (2021). Mentoring Academic Librarians for Research Success. In L. J. Rod-Welch and B.E. Weeg (Eds.), Academic Library Mentoring: Fostering Growth and Renewal (pp. 241-262). Chicago, Illinois: Association of College and Research Libraries.

Posted in IRDL, writing | Comments Off on Mentoring Academic Librarians for Research Success

Complex and Varied: Factors Related to the Research Productivity of Academic Librarians in the United States

The major take-away from our new research is that librarians are motivated to conduct research, yet the factors leading to their success are complex and varied.

Kris and I have already conducted two studies (with five years between the first and second) on the attitudes, involvement, and perceived capabilities of librarians doing research, and as the time neared for another study, we partnered with two librarians (Kristin Hoffmann and Selinda Berg) doing similar work to conduct an updated study. We have admired and cited the research of these two over the years and it was a treat to get to work with them so closely on a research endeavor.

This updated study is still focused on academic librarians in the United States but this time uses the survey structure from our partners, adapting and extending it. The focus of the work was to identify the factors that have a positive effect on the research productivity of librarian-researchers. As we found in our previous studies, respondents believe that their master’s degree coursework did not prepare them to conduct research, but despite this they are research productive. The three factors of Individual Attributes, Peers and Community, and Institutional Structures and Supports continue to influence research productivity, with no single factor rising to be the main influence.

The usual limits of what we can learn from a web-based questionnaire that was administered during the COVID-19 pandemic apply. Future work will focus on the individual level, to allow for nuance in response and to better understand the fuller complexities and variation among librarians conducting research.

The manuscript has been accepted for publication in College & Research Libraries. The pre-print, appendices, and data may be accessed via https://digitalcommons.lmu.edu/librarian_pubs/141/.

Posted in library, OA, writing | Comments Off on Complex and Varied: Factors Related to the Research Productivity of Academic Librarians in the United States

Meeting Jerry Uelsmann

We moved to Gainesville, Florida from Austin, Texas in 1995, right after I completed my MFA in photography. I soon set out to meet the local art photographers and ended up becoming part of the University of Florida (UF) photography studio critique group, which included the photo faculty and students. There were a couple of well-known photographers on the faculty there, of which Jerry Uelsmann was one, and I felt honored to be invited to join them. My role was mainly to hang out and give informal feedback on the photos completed by UF photo students.

The first meeting I joined was in the art department conference room. We were seated around a large table, just getting started, and Jerry walked in. Wouldn’t you expect someone who was so well known to have an air of importance? He had a stack of books and mail with him which he thumped down on the table and said in kind of a goofy way, “Hi, I’m Jerry.” It turns out he was just a regular person, accessible and unassuming. Over time I would have that further confirmed as the group migrated around Gainesville, to do critiques in the studios where people were working. We ended up at Jerry’s house more than once to talk about his work and hear him think aloud about the creation of some of his photo montages. His darkroom setup was large, totally customized to his way of working, with multiple enlargers that he would outfit with different negatives to make the montages.

He was just getting started with digital photography when I met him and it was amusing to me to see a master in visual thinking being prompted to retrain himself in a new environment. I appreciated that he was so open about his methods, using a casual and exploratory approach to image making. I still remember him talking through the creation of one of his montages, saying, “I tried this with the hands in the sky and didn’t like it, so I put them over here instead.” It was affirming to have an accomplished photographer respond to an iteration of an image visually, making adjustments as he went, until he was satisfied with how it looked.

Jerry died this week (link to an obituary in the Gainesville Sun newspaper). I was lucky to have him as part of my life for a while.

A photo montage by Jerry Uelsmann, with hands holding a birds nest, in front of an archway, with bird wings

image from Museum of Contemporary Photography

Posted in art, writing | Comments Off on Meeting Jerry Uelsmann

A new e-resources usage statistics dashboard

New usage statistics dashboard alert! Check it out at https://whheresourceusage.shinyapps.io/dash/. Drop me a line here or on Twitter and let me know how much you love it.

screenshot of the new e-resources usage statistics dashboard

New dashboard alert!

***
For the last several years I have been annually publishing the usage statistics of our library’s licensed e-resources, using a Google Sites dashboard that my colleague @mars_bar85 designed. We started developing the dashboard in summer 2013, after constructing a brief document about our decisions and goals for the dashboard. The resulting dashboard still looks good after all these years: https://library.sites.google.com/site/eresstatistics/home.

Over this past year I’ve been learning R (mainly just for fun) and have grown really interested in the coding behind summary visualizations of data. I wanted to put into practice what I was learning, so I decided to update the e-resources usage statistics dashboard to include interactive elements, and this time to build it in R. I used a flexdashboard with some shiny components, and it is hosted (on a free account) at https://whheresourceusage.shinyapps.io/dash/. The components I’ve programmed allow for user interaction and more dynamism than our old dashboard. It’s time for this change.

The most exciting component of the new dashboard is an interactive table of all of our licensed databases and their usage statistics. Users can now search on demand to see the usage of any database they’re interested in, rather than emailing me to ask for those stats. They can sort alphabetically or by usage, to see our most-used/least-used databases, and can download the data for offline manipulation.
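If you’re curious what that kind of table looks like in code, here is a minimal sketch using the DT package, a common way to add a searchable, sortable table with a download button to a flexdashboard. This is an illustration, not my actual implementation; the data frame and its column names are made up for the example (the real dashboard reads our licensed-database usage data).

```r
# A minimal sketch of an interactive usage table with the DT package.
# The data frame below is a stand-in for our real usage statistics.
library(DT)

usage <- data.frame(
  database = c("Database A", "Database B", "Database C"),
  searches = c(1200, 350, 4800)
)

datatable(
  usage,
  extensions = "Buttons",          # enables the download buttons
  options = list(
    dom = "Bfrtip",                # "B" places the buttons above the table
    buttons = "csv",               # lets users download the data for offline use
    order = list(list(1, "desc"))  # default sort: most-used databases first
  )
)
```

Dropped into an R Markdown flexdashboard chunk, this renders as an interactive table; DT provides the search box and column sorting automatically, with no extra code.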

This new dashboard still adheres to our original goals of wanting “to communicate … simply, clearly, and quickly about electronic resource usage,” and now it does it in a more engaging way.

Any feedback on the new dashboard is welcome; I’m still learning! The source code is available on the site: https://whheresourceusage.shinyapps.io/dash/

Posted in e-resource mgmt, library, usage statistics | 2 Comments