it’s monkey day. to celebrate, why don’t you take my survey?

we here at orgmonkey.net celebrate monkey day (december 14) religiously (see 2009, 2008, 2007), usually by organizing something or suggesting that others do so.  this year, why don’t you consider celebrating by contributing your response to a survey i’ve constructed?

if you’re a librarian working in an academic setting, then this survey is for you!
– – – –

We invite you to participate in a study of research skills and support for research.  You have been invited to this study because you are a librarian in an academic setting.

The purpose of this study is to learn how you would assess your own skills in completing discrete research tasks as well as to discover how your institution may support your research endeavors.  We plan to use the results of this survey to influence the curriculum of a proposed continuing education opportunity for librarians in an academic setting.

The survey is Web-based and is expected to take about 5 minutes to complete.  We will not gather any identifying information about you.

Your participation in the study is completely voluntary and no risks are anticipated for you as a result of participating. The study has been reviewed by the Office of Research and Sponsored Projects at Loyola Marymount University.

Thank you for participating in this study.
Sincerely, Kristine Brancolini and Marie Kennedy

BEGIN THE STUDY BY GOING TO THIS LINK: http://library.lmu.edu/departments/acquisitions_serials/Informed_Consent.htm

– – –
Make this monkey’s day and take the survey!

Posted in library, monkeys/bananas | 1 Comment

ACRL proficiencies for instruction librarians [spoiler: you don’t need to know how to do research]

i just finished reading theresa westbrock and sarah fabian’s article in the latest college & research libraries and am sad.  in their article titled “proficiencies for instruction librarians: is there still a disconnect between professional education and professional responsibilities?” they outline acrl’s proficiencies for instruction librarians from 1985 and then compare them with the 2007 updated proficiencies.  the thrust of the updated proficiencies is to focus on “broad areas of proficiency rather than a comprehensive list of skills” (p. 572).  are you wondering which “skills” got left behind?  you guessed it: research skills.

the 1985 proficiencies (http://library.csus.edu/services/inst/indiv/acrl_bis_profic.htm) include a section titled “ability to employ research and evaluation methodologies.”  here is the list of skills that fall under that heading:

  • Is able to design an evaluative instrument and to use survey techniques
  • Is able to interpret feedback and use it to modify activity
  • Is able to solicit and analyze student comments and attitudes
  • Understands the structure of information within various disciplines and the categories of tools necessary to use the information
  • Understands basic statistical concepts and methods
  • Understands validity and reliability measures for research use
  • Understands SPSS or other computerized statistical packages
  • Is able to develop a search strategy

looks pretty solid, doesn’t it?  if i were in an undergrad sociology class i’d feel confident about having that instruction librarian teach me how to gather and analyze data.

in the new proficiencies (http://www.ala.org/ala/mgrps/divs/acrl/standards/profstandards.pdf) the word “statistics” is mentioned once: “1.3. Maintains and regularly reports accurate statistics and other records reflecting own instruction activities.”

is it possible i’m misreading the new proficiencies and that having a clear understanding of how the basic research process works is embedded in the language somewhere that i’m missing?  crossing my fingers.

Westbrock, Theresa, and Sarah Fabian. 2010. “Proficiencies for Instruction Librarians: Is There Still a Disconnect Between Professional Education and Professional Responsibilities?” College & Research Libraries 71(6): 569-590.

Posted in articles i'm reading | Comments Off on ACRL proficiencies for instruction librarians [spoiler: you don’t need to know how to do research]

11 gaps identified in communication between research and practice in librarianship

Of the 11 gaps that the authors identified in communication between research and practice in librarianship, I feel the Publication Gap most acutely.  Very often the first thing I do when I want to answer a question in my field is to see how it may have already been answered in the literature.  More often than not I don’t find the answer, or I end up wading through a sea of weak literature only to give up in frustration.

“Publication gap.  The body of LIS research papers is small both in itself and as a proportion of the published literature (Feehan, Gragg, Havener, & Kester, 1987; Nour, 1985; Peritz, 1980).  To some writers, the emphasis on pragmatic issues (Goodall, 1998; Montanelli & Mak, 1998; Rayward, 1983; Saracevic & Perk, 1973; Williamson, 1999) and the low proportion of practitioner authors relative to the number of practitioners in the field (Enger, Quirk, & Stewart, 1988; Fisher, 1999; Mularski, 1991; Olsgaard & Olsgaard, 1980; Stephenson, 1990; Swigger, 1985) are evidence that the relationship between research and practice is troubled and requiring attention.” (p. 32) [emphasis mine]

Haddow, G., & Klobas, J. E. (2004). Communication of research to practice in library and information science: Closing the gap. Library & Information Science Research, 26(1), 29-43. doi:10.1016/j.lisr.2003.11.010

Posted in articles i'm reading | Comments Off on 11 gaps identified in communication between research and practice in librarianship

Library directors meeting


Posted in comic, library, management, marketing | Comments Off on Library directors meeting

…and this is how a half hour of the life of an e-resources librarian can slip away

http://0-sq4ya5rf2q.search.serialssolutions.com.linus.lmu.edu/?&url_ver=Z39.88-2004&url_ctx_fmt=info:ofi/fmt:kev:mtx:ctx&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.atitle=Review%3A%20Analysis%20of%20the%20evolutionary%20convergence%20for%20high%20performance%20swimming%20in%20lamnid%20sharks%20and%20tunas&rft.auinit=D&rft.aulast=Bernal&rft.date=2001&rft.epage=726&rft.genre=article&rft.issn=1095-6433&rft.issue=2-3&rft.spage=695&rft.title=Comparative%20Biochemistry%20and%20Physiology%20Part%20A%20Molecular%20and%20Integrative%20Physiology&rft.volume=129A&rfr_id=info:sid/www.isinet.com:WoK:BIOABS&rft.au=Dickson%2C%20K&rft.au=Shadwick%2C%20R&rft.au=Graham%2C%20J

there’s an extra “A” in this url that is causing it to fail to resolve, leaving our patron with an error instead of full text.  the extra “A” comes after the volume number (rft.volume=129A); there isn’t a volume 129A, only a volume 129.
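for the curious: a stray letter like this is the kind of thing a quick script can flag before a human squints at a kilometer-long OpenURL.  a minimal sketch (the resolver hostname below is hypothetical; the check is just “volume should be digits”):

```python
import re
from urllib.parse import urlparse, parse_qs

def check_volume(openurl):
    """Flag an OpenURL rft.volume value with a stray trailing letter (e.g. '129A').

    Returns (reported_volume, suggested_volume); the two are equal
    when nothing looks wrong.
    """
    params = parse_qs(urlparse(openurl).query)
    vol = params.get("rft.volume", [""])[0]
    m = re.fullmatch(r"(\d+)[A-Za-z]", vol)
    if m:
        return vol, m.group(1)  # strip the trailing letter
    return vol, vol

# a hypothetical resolver link with the same defect as the one above
print(check_volume("http://resolver.example.edu/?rft.volume=129A&rft.spage=695"))
```

this wouldn’t fix the upstream source (Web of Knowledge sent the bad volume), but it would tell you in a second where the link broke.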

:/

Posted in e-resource mgmt, library | 2 Comments

counting the troubleshooting of e-resources

we’ve lately felt as if we’re spending more time troubleshooting our library’s licensed e-resources.  saying “i feel like i’m spending a lot of time on this” doesn’t really cut it when it comes time to staff a task appropriately so we’re making an attempt to quantify our process.  knowing how much time we’re spending will tell us if we’ve got enough people tasked with the responsibility of responding to patron problem reports, since providing timely customer service is part of our mission.  enter: gimlet.  gimlet is usually used to count reference desk transactions so you can see how it can easily be adapted to count other kinds of service point interactions.  for example, the reference department gets emails requesting assistance for troubleshooting a research question, my department gets emails requesting assistance for troubleshooting e-resource access.  since our reference department already uses gimlet, we thought we’d give it a try too.

setting it up was super easy.  there are five categories built into the system, and you can define the variables that fall under those headings.  the five categories are: duration; question type; asked by; format; location.  under the heading of duration we added the variables: 1 interaction, same day; more than one interaction, same day; 1 interaction, next day; more than one interaction, next day; more than two days.  tracking how long it takes to resolve a problem, and how many interactions with the patron it takes to get the information we need, will help us understand the complexity of the problems we’re troubleshooting.  often we can get what we need from the patron in one interaction and resolve the problem the same day, but sometimes it can drag on for weeks.  being able to count how often we respond quickly and how often problems lag will be instrumental not only for identifying our own staffing needs but also for evaluating the timeliness of vendor responses.  if we’re waiting on a vendor to fix something and they consistently take more than two days, then we may choose not to work with that vendor in the future.

here are the other categories and the variables we’ve entered.  question type: user interface problem; patron education; broken URL; expired subscription; incorrect coverage listed.  asked by: faculty; student; librarian; staff; alumni; law school; visitor; other.  format: email; phone; walk-in; referral; roving ref; text chat.  location: on campus; off campus.
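for anyone who likes to see a setup laid out in one place, here’s the whole scheme as a plain data structure plus a toy tally.  this is just the labels from the post, not gimlet’s actual data model (i have no idea how gimlet stores things internally):

```python
from collections import Counter

# the five gimlet categories and the variables we defined under each
CATEGORIES = {
    "duration": [
        "1 interaction, same day", "more than one interaction, same day",
        "1 interaction, next day", "more than one interaction, next day",
        "more than two days",
    ],
    "question type": [
        "user interface problem", "patron education", "broken URL",
        "expired subscription", "incorrect coverage listed",
    ],
    "asked by": [
        "faculty", "student", "librarian", "staff",
        "alumni", "law school", "visitor", "other",
    ],
    "format": ["email", "phone", "walk-in", "referral", "roving ref", "text chat"],
    "location": ["on campus", "off campus"],
}

# made-up sample reports, tallied by duration to spot the laggards
reports = ["1 interaction, same day", "more than two days", "1 interaction, same day"]
counts = Counter(reports)
print(counts["more than two days"])  # 1
```

once real reports accumulate, a tally like this is exactly the evidence we want when arguing about staffing or about a vendor’s response time.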

we’ll be adding tags for the names of resources that are reported as problematic so we can get an idea of how many times an ebsco database goes wonky, for example.

we’ll be experimenting with the READ scale (shown as Difficulty in the screen shot) to help us assess the complexity of the questions we’re asked.  stay tuned for more on that later.

you can see from the screen shot that our database is a blank slate, ready to have questions asked and answered.  i’ll report back later to let you know how it’s going!  in the meantime, if you use gimlet to track the troubleshooting of e-resources i’d love to hear from you.

Gimlet

Posted in e-resource mgmt, library | 5 Comments