2014 MLA/DLA Conference

About a month ago, I attended the joint conference of the Maryland & Delaware Library Associations in Ocean City, MD. I took a different tack this year with my note-taking, and took everything in Evernote rather than in individual blog posts. I’ve finally had a chance to go back through my notes, clean things up a bit, and add links where appropriate. So, here they are!

All of these sessions were really excellent, and I have already tried to put some of what was presented in the event evaluation session into practice. I think my biggest takeaway from that talk was that people experience cognitive overload when you ask them to complete a complex evaluation at the end of a session. I went back to work and changed up what I did on the event evaluations for a recent conference.

We have four questions that we ask on these evals, which we use at our in-person training sessions. Generally speaking, the questions are “what did you like best,” “what could be improved,” “please rank these four aspects of the presentation on a Likert scale,” and “what else?” In my time doing these, we’d always started off with the rankings question, so I rearranged and rephrased a little bit to lead off with “how was this session valuable to you?” in hopes that I could get some better responses to that particular question.

I’m not sure it really made a big difference in the types of responses I got – librarians are perhaps more willing than most to tell you what they really thought, and my general impression was that I got about the same number of comments, at about the same level of depth, as usual. But I want to circle back and compare the kinds of answers I got from this model to what I’ve gotten in past years, and see if I can tweak further for future conferences. I’m not sure we’re quite ready for the one-question evaluation Joyce is using (as someone pointed out, some of us need the numbers), but I think it’s important to rethink these types of things occasionally and see if you can make them work better. I have the feeling that we can probably remove the “anything else?” question and perhaps even drop “what can we do better,” or maybe rephrase it somehow. (Maybe “is there something you wish we’d done differently?”)

 


MLA ’13: The Role of Librarians/Informationists in the Systematic Review

  • Systematic Review Reporting Quality in General Medical Journals: The Influence of Librarian Authorship
  • A Collaborative Approach to Systematic Review and Meta-Analysis Instruction
  • An Interdisciplinary Collaboration to Teach Systematic Review Methods
  • What Happens after: Outcomes of a Systematic Review Course

Abstracts here

Systematic Review Reporting Quality in General Medical Journals: The Influence of Librarian Authorship
Melissa Rethlefsen, AHIP, Ann Farrell, and Leah C. Osterhaus Trzasko

2011 – Institute of Medicine (IOM) standards for systematic reviews. 15 standards just for searching. One is “work with a librarian or other informationist.” So what is the impact of having a librarian on a systematic review team? And does the level of librarian involvement also improve the review? Hypothesis – yes, the reviews would be higher quality, with the librarian better able to help shape the direction of the search question, strategy, and reporting.

Developed a short form (to capture the level of participation and the search strategy), a checklist (based on the 15 recommended standards from the IOM), and a scale (a modified PRESS – Peer Review of Electronic Search Strategies).

Three levels of librarian participation: No/Unclear, Mentioned/Acknowledged, and Author. Also looked at whether the search is replicable – does the SR contain the line-by-line strategy for at least one of the databases used?

Looked at JAMA, BMJ, Lancet, PLoS Med, and Annals of Internal Medicine, 2008-2012. Two reviewers used the short form and checklist to review their assigned articles, then came to consensus either between the two reviewers or among all four.

630 SRs analyzed; 275 (about 44%) had replicable searches. 7% had a librarian author, and 23% mentioned/acknowledged a librarian. With a librarian as author, over 60% were replicable; with no librarian, less than 40% were. Librarian-authored SRs also met more standards (an average of 2 more).

Compared to no librarian involvement, SRs with a librarian author were more likely to meet 8 different IOM standards; compared to librarians merely mentioned/acknowledged, more likely to meet 6. [Sounds like she is referring to particular standards in these numbers]

Limitations – the definition of an SR is still nebulous. “No/unclear” may mask LIS contributions – hard to tell when there were common names, only first initials, etc. Raters also could have used more training. And the IOM standards are controversial.

Observations – searches were difficult to locate. Missing, broken links, etc.

Future – going to use another rating measurement and expand the journal selection. Also want to look at ways to preserve and present systematic review search strategies; having a standard would help.

Implications – librarians should ask for authorship so they’re better able to help formulate questions, search methods, the presentation of searches, etc. Know the IOM standards and be ready to advise.

Q&A/Comments:

-They didn’t contact authors to confirm whether they had a librarian’s help, or the level of involvement.
-Trends in numbers of authors? Haven’t really looked at that. Did notice that certain groups always include a librarian, but others don’t.
-“Cheater” systematic reviews – the ones students have to do in 6 weeks. There was lots of variation in what was considered a systematic review. Need to get involved with journal publishers about this – other standards are needed.
-There are some other terms in use, too – comprehensive review, quick review, etc.

 

A Collaborative Approach to Systematic Review and Meta-Analysis Instruction
Mark P. MacEachern and Whitney Townsend, Liaison Service Librarians at U Michigan Med Library

Integrated vs. standalone instruction – they’re moving towards a model of integrated instruction, where the librarian comes into the curriculum at established points rather than making students come to the library for a standalone session. “They don’t know what they don’t know.”

One point for this instruction is a pediatric clerkship for 3rd-year medical students – a session on how to use SRs in clinical practice. By then the students have had a lot of instruction on the primary literature. It’s a two-part session: the librarian leads the first part, covering what systematic reviews are, when to look for them, and how to find them. The second part is physician-led – a small-group activity. Students are given two different SRs on a topic that come to different conclusions, work through a checklist to evaluate the SRs, and discuss whether they would integrate the info into their clinical practice. The librarian chimes in on search strategies, etc.

Why collaborate on these? It’s integrated into the existing series of sessions the students get, and it’s a reminder of how to locate SRs once they’re out in the clinic. They also get to watch a physician and an information professional collaborating – two experts working together to solve important problems.

Graduate Medical Education – there’s a slide deck available for all librarians to start from when doing instruction on SRs. It allows them to introduce additional resources at a critical time, and gives librarians a foot in the door with the residents and fellows who are working on these projects.

Robert Wood Johnson Clinical Scholars Program – 2-3 year fellowships for MDs and MPHs; Clinical Masters in Research students also join in. Fellows work with a librarian on training to conduct an SR, walking through the whole process in detail over two two-hour sessions. Again, this encourages them to work with a librarian and makes a librarian more available to them – and it’s useful repetition for them to get the material again.

Epidemiology course – a one-week, graduate-level summer course. Students have to create an SR protocol that they will theoretically complete later. Mark teaches the search component of the course, which helps students see the importance of running a strong search – they see how the search impacts the analysis they can do. (All materials for this course are in the open.michigan repository: http://open.umich.edu/education/sph/epid757/summer2011 ). As a result of this collaboration, this particular faculty member has always brought in a librarian for his SRs, and librarians have been written into grants too.

Q&A/Comments:
-Do you have a policy about how much you work with them? They’ll push the envelope – when do you say you have to step back? There’s no official policy that he knows of; it’s up to individual librarians to set those boundaries. They also do a lot of consultations that don’t actually lead to a systematic review.
-(In answer to a question about co-authorship:) yes, they’re co-authors on some of them.
-There’s pressure for librarians to integrate – how did this come about?
-Someone in the audience had a class where one group of students read the IOM standards and came to find a librarian; then all the rest of their classmates came to see her. She contacted the professor to give a heads up, and was integrated into the course from there.

 

An Interdisciplinary Collaboration to Teach Systematic Review Methods
Claire Twose, Assoc Director for Public Health and Basic Science Info Services at Welch Library @ JHU
Lori Rosman, Peggy Gross, Donna D. Hesson, Julie M. Adamo, Tianjing Li, Ian Saldanha, Swaroop S. Vedula, and Kay Dickersin

Graduate epidemiology class intended to teach how to do SRs and meta-analyses in 8 weeks. All four librarians for this school are involved in the course in some way – librarians have actually been involved with the course from its beginning in the ’90s. Very popular course.

Small groups of students are given a list, from faculty, of SRs that need to be updated, then go through all the steps of doing an SR: protocol, searches, download, de-dupe, etc.

Librarian involvement was informal in the beginning: a 30-minute lecture in class, then circulating during research lab time. The professor did the theoretical part; librarians did the nuts and bolts. Faculty graded the search strategy.

In the last 3-4 years, the librarian has become a team member rather than a guest lecturer. Support is now more formalized, and has also shifted to online lectures. Librarians participate in class planning and feedback to students – more than just one in-class presentation, plus office hours, a roadmap document, and other handouts.

Evaluations show an increase in satisfaction with the course, and they can see a “clear improvement” in search strategies over the last year. There’s indirect faculty learning, too – faculty read all the librarian comments on the students’ search strategies. Also lots of benefits to the librarians! They reviewed strategies together to learn more themselves, and there are research opportunities to collaborate and co-author posters and papers with faculty.

Future – it will continue to be an 8-week course, but they’ll try to reduce the searching load for students: the librarians will do some research on the contributions of each database to the SRs. They’re also going to create more online presentations and tutorials.

Q&A/Comments
-They’re working on data to see what these students publish after the class.
-Librarians are involved in a two-week chunk of the course, when the students are doing the search component.
-Grading the search strategies (10 of them) took 8 hours total. They were graded on criteria the students had been given to follow (using the PRESS tool for this), with comments back suggesting terms, etc.
-Why EndNote vs. RefWorks? In her experience, the students download thousands of citations from each database and need to upload them somewhere, remove duplicates, create unique sets, etc. With RefWorks you interact with a server every time you make a change – a slow process. She suspects that was why. (A rough sketch of that de-dupe step is below.)
-Anecdotally, it seems these students are more likely to involve a librarian in SRs and other research in the future – this shows them the value.
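
An aside from me, not from the session: the de-dupe step they describe is easy to picture in code. Here’s a minimal Python sketch of matching citation records on a normalized title plus year – the field names and sample records are hypothetical, and real reference managers like EndNote use fuzzier matching than this.

    import re

    def norm_key(record):
        # Crude match key: lowercase title with punctuation and spacing
        # stripped out, plus the publication year (hypothetical fields).
        title = re.sub(r"[^a-z0-9]", "", record["title"].lower())
        return (title, record.get("year"))

    def dedupe(records):
        # Keep the first record seen for each match key.
        seen = set()
        unique = []
        for rec in records:
            key = norm_key(rec)
            if key not in seen:
                seen.add(key)
                unique.append(rec)
        return unique

    # Two downloads of the same article, formatted slightly differently:
    pubmed = [{"title": "Aspirin for X: a review", "year": 2012}]
    embase = [{"title": "Aspirin for X: A Review.", "year": 2012}]
    print(len(dedupe(pubmed + embase)))  # -> 1

Even this toy version shows why doing the work locally matters: with thousands of records, a tool that makes a server round-trip for every change is going to feel slow.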

 

What Happens after: Outcomes of a Systematic Review Course
Linda M. Hartman, AHIP – Reference Librarian at U Pitt, instructor for workshop on SRs for librarians
Barbara Folb, Mary L. Klem, Melissa A. Ratajeski, AHIP, Ahlam Saleh, Charles B. Wessel, and Andrea M. Ketchum, AHIP

Systematic Review Workshop – Nuts & Bolts for Librarians: a 2.5-day workshop with MLA CEs attached. Covers study design, being a co-investigator, project management, etc. The class started out as a self-study for her institution.

Survey – why? The MLA CE survey didn’t answer all the questions they were interested in re: what people learned, any change in institutional processes, etc. 113 respondents, pretty evenly distributed across the 8 workshops the timeframe covered. They also did a supplemental survey – the survey logic had been set up incorrectly! 80 people completed that.

Knowledge – what are they learning from the class? 19 questions assessed retention of key points; a high percentage of people got almost all of them right, with no trend by how long ago they took the class.

Professional practice – how many SR searches have you worked on since the class? Most said 1-5, mostly at academic health sciences libraries, though some people had worked on 40-50 searches. The question was maybe ambiguous – you do several searches for one SR. Grey literature – had that module of the course made an impact? For 70% of people it had.

Authorship – 63% asked for it, and 76% of those who asked were granted it (22% didn’t know yet). [If I have those numbers right, that leaves only about 2% who were actually refused.]

Peer review for a search? A controversial topic – there’s a standard on this now, but about 1/3 of respondents took the class before the standard was published. Those whose primary duty was doing SRs were most likely to do this. Also, if you’re in a small library you may not have the resources/colleagues to do a peer review.

Confidence – 70% agreed or strongly agreed that they could confidently do a high-quality SR search.

42% changed the SR services offered at their institutions after the course: they have improved search quality and documentation, added SR support as a formal service, and expanded training and tracking services for SRs.

Administrators are supporting this process (about 20% of those who took the survey were administrators).

Next up, prospective study on this – pre, post, and 6 months out.

Q&A/Comments:
-Re: peer review – have you considered a network of people who have taken the class and can help with peer review? There is a PRESS forum where you can peer review others’ searches and have your own searches peer reviewed.

MLA’13: Plenary II – Richard Besser, ABC News

The John P. McGovern Award Lecture. Richard Besser is ABC News’s senior health and medical editor, providing medical analysis and commentary for all ABC News broadcasts and platforms, including World News with Diane Sawyer, Good Morning America, and Nightline.

Overall an interesting talk from an engaging speaker, though on the Twitter backchannel there was some chatter about how relevant it was to medical librarians. The general theme was the importance of getting accurate, science-based information out, but there were no explicit connections to librarians. His focus was more on the end product – the narrative that carries the information in a way the public will pay attention to – and not so much on how that science is identified and gathered. This is certainly relevant to anyone trying to communicate technical information to an unfamiliar audience – for example, teaching students how to use databases. Personally, I don’t think it’s a bad thing if the keynotes at a conference aren’t laser-focused on how the speaker engages with librarians and libraries – it’s a good way to find new ways of thinking about how we do what we do.
