Roller, Margaret R., and Paul J. Lavrakas. 2015. Applied Qualitative Research Design: A Total Quality Framework Approach. New York: Guilford Press.
If you’re interested in making sure that your qualitative research is trustworthy, and that your methods section doesn’t include statements like, “magic occurs here,” check this book out. The authors have developed a framework for designing qualitative research to be credible, analyzable, transparent, and useful. They apply the framework to the usual qualitative methods (interview, observation, and more), prompting the reader to consider the four framework components throughout each method.
This is one of those books that has you nodding along as you read. If you’re designing an in-depth interview, for example, of course you know that you’ll want to make sure that the people you’re interviewing will be able to provide the data you need to answer your research question, and that those people are representative of a larger population. The book ties this intentionality in choosing the right people back to the framework component of credibility. *nod* For your research to be credible, you’ll want to select your interviewees carefully. Yes, you know this already, but unless something like this book is prompting you to make sure to do it right, well…we know how tired designing good research can make a person.
Give this book a read to affirm what you want out of your qualitative research: that it be considered rigorous and well designed.
BONUS: I learned about reflexive journaling from this book and am excited to give their journaling template (page 42) a try.
ALA Publishing has a real nice (and short) written interview up with me and Cheryl on their website, about our book Marketing Your Library’s Electronic Resources: A How-To-Do-It Manual for Librarians, at http://www.alaeditions.org/blog/297/interview-marie-r-kennedy-and-cheryl-laguardia-effectively-promoting-electronic-resources. Here’s a little snippet from the interview:
The first edition has been one of our bestsellers. Why did you write a second edition, and what are some of the most useful updates?
We learned so much from our readers about their experiences using the first edition that we wanted to incorporate all that feedback and share it widely. In the first edition our readers found the marketing plan reports we included very helpful – in this edition we’ve added some more. To help you get moving on your own marketing plans faster we’ve created a downloadable template. Grab it, use the prompts to consider the essential steps in a marketing plan, and get going!
I posted previously about the sounds I like to listen to when I’m writing. Based on your recommendations I added quite a bit to the rotation. I am smitten with Moby’s Long Ambients (http://moby.com/la1/) – thanks for the suggestion, Mark!
I knew the latest iPhone OS update was going to break one of my favorite apps, FM3 Buddha Machine; I updated, and sure enough, it broke. It’s nonfunctional at the moment, but they’re working on a fix. In the meantime I’ve started listening to their Soundcloud station. They even have a piece called Monkey Mind, so of course I had to tell you about it. Here’s a link: https://soundcloud.com/christiaan-virant/fistful-of-buddha-03-monkey.
I think a lot about the accessibility of our library’s licensed e-resources. Sure, if there’s not already a clause in a license agreement about compliance with the Americans with Disabilities Act, we request that it be added. But what does that actually mean for our patrons? There are levels of accessibility, and simply having a clause in the license agreement doesn’t guarantee that our patrons will be able to satisfactorily view/hear/navigate the content.
I worked with our library’s e-resources licensing consortium to ask them to start collecting Voluntary Product Accessibility Templates (VPATs) when initiating conversations with a new vendor. I appreciate our consortium’s willingness to do this, as it gives its member libraries information up front about the expected use of the e-resource for patrons with disabilities. I’m just one person concerned about this, and I don’t have much power to effect change. But groups of people concerned about this, banded together, could.
And then I bumped into a particularly large group of people concerned about this, the Big Ten Academic Alliance. Here’s what they’ve accomplished so far (this is grabbed right from their site):
Because of the group:
- The Big Ten libraries have funded a pilot to provide selected vendors with third-party accessibility evaluations. Evaluations, along with any responses provided by vendors, are posted on the E-Resources Testing page. This program provides vendors with the information and opportunity to improve the accessibility of their products and gives members of the library community information about the accessibility of these works.
- The Big Ten Academic Alliance has also adopted model accessibility license language that can be found on the Standardized Accessibility License Language page. Library e-resource vendors may be approached about inserting this (or similar) text into BTAA Library consortial licenses or institutions’ individual licenses to ensure these contracts address accessibility concerns.
The stuff on their site is freely available; go check it out! I especially like that they provide standardized language to use when negotiating accessibility in a license agreement. And that tab on the site about the accessibility testing they’ve done on vendor e-resources, with the results posted publicly? Awesome!
An exciting new article I read today aligns with some thoughts I’ve had about the development of academic research networks. The article (cited below) looks carefully at the addition and loss of people in individual research networks, a process called “churn.” It categorizes churn in research networks as either exploratory, in which an individual researcher develops ties to new people or severs old relationships, or exploitative, in which a researcher depends on existing ties.
The article examines the research networks of a group of scientists over time, to see if the addition/loss of people in those scientists’ networks has an effect on production (measured outputs in this article are publications, grant submissions, successful grant submissions, and grant dollar amounts). A couple of interesting findings: adding new people to one’s research network did not have a significant effect on the number of publications a researcher completed, but severing old relationships did; network size had a positive effect on the number of publications completed. They suppose that “larger networks may also lead to a diversity of ideas” (p. 8).
I think these findings are especially meaningful to those who are young in their research productivity. We found in our own work, for example, that the novice librarian-researchers who participate in the Institute for Research Design in Librarianship have active, developing networks, with a lot of churn (cited below). Of course one would expect this at the outset of a research career, but it is useful to think that the addition of new people and the size of one’s network may be able to help these librarian-researchers get to their goals (publications) more efficiently.
Kennedy, Marie R., David P. Kennedy, and Kristine R. Brancolini. 2017. “The Evolution of the Personal Networks of Novice Librarian Researchers.” portal 17(1): 71-89.
Siciliano, Michael D., Erich W. Welch, and Mary K. Feeney. 2017 (in press). “Network Exploration and Exploitation: Professional Network Churn and Scientific Production.” Social Networks. doi: http://dx.doi.org/10.1016/j.socnet.2017.07.003