
Assessment in One-Shot Instruction Sessions: Preliminary Findings

May 4, 2018

Introduction

One-shot library instruction sessions provide little enough time to present information literacy concepts, let alone to assess student comprehension before and after the lesson. This semester I piloted a way to incorporate assessment into one-shot sessions for gen ed writing classes. The lesson plan for these classes focused on an introduction to the library's resources and basic research skills.

My goals for incorporating assessment into these classes were to:

  • determine the effectiveness of the library instruction session.
  • make improvements to the lesson plan.

To gather pre-test data, I created a self-paced online tutorial in LibWizard (Springshare) and a seven-question quiz on the topics covered in the tutorial. The tutorial aligned with my basic lesson plan, included brief videos I had created on these topics last semester, and incorporated some hands-on practice.

[Tutorial slides: "Topic Mapping" and "Your Turn"]

In addition to providing pre-test assessment data, my goals for implementing a pre-session tutorial and quiz were to:

  • distribute self-paced resources for students to re-use as needed.
  • increase time in class for more advanced topics.

Methods

Once I had the tutorial and quiz, I needed to distribute them to the students. Both could be linked directly from Springshare, but I wanted to host them on a platform the students were already familiar with: the learning management system (LMS). At West Chester University, the LMS is Desire2Learn (D2L). Hosting the pre-session module in D2L would also let professors see completion rates and use the quiz as extra credit if they wanted to.

I worked with on-campus D2L specialists to learn how to create and work with the kind of module I wanted. After the specialists set up a library course shell, I created a module with the embedded tutorial and a D2L version of the quiz.

At this point I contacted one of the writing professors, and she was happy to work with me to pilot this assessment in her class. D2L made it easy for me to add the professor into the library course so that she could copy the pre-session module into her writing course. She instructed the students to go through the tutorial and take the quiz prior to the day of their library instruction session. She also added me to her course so that I could view the results, but another method would be for the professor to export and send the results (this is what the professor for my second pilot class did).

On the day of the session, I delivered my lesson plan. As a class we took the quiz again to obtain post-test data, this time using Socrative. One student remarked that it was interesting to see the distribution of participant responses for each question.

Results

The pre-session D2L module and post-session Socrative quiz were administered to two classes. Once I had the pre- and post-test data, I calculated response accuracy for each response option. Some quiz questions allowed only one correct answer; for multi-select questions, I calculated accuracy separately for each response option.

Question 2: Which statements about topic mapping are true? (check all that apply)
☐ There is only one correct way to design a topic map.
☒ Topic mapping narrows down your focus for your assignment.
☒ Topic mapping develops keywords to use in the search bar.
☐ Topic mapping tells you what citation style to use.

Correct response: the option is checked when it should be checked, or unchecked when it should be unchecked.
Incorrect response: the option is checked when it should be unchecked, or unchecked when it should be checked.


The ideal for correct response accuracy is 100%; the ideal for incorrect response accuracy is 0%. I calculated an improvement rate for each type of answer (an increase in correct response accuracy between pre- and post-test; a decrease in incorrect response accuracy between pre- and post-test), then added those two improvement rates together to get a total improvement rate for each class: 21.37% for Class 1 and 12.89% for Class 2. Between the two classes, the improvement in student comprehension averaged 17.13%.
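As a minimal sketch of how this per-option scoring works (the answer key below mirrors the sample question above, but the student responses are hypothetical, not actual class data):

```python
# Minimal sketch: score each option of a multi-select question
# independently against the answer key. A response is "correct" for an
# option when the student's checked/unchecked state matches the key.

ANSWER_KEY = {  # True = the option should be checked
    "There is only one correct way to design a topic map.": False,
    "Topic mapping narrows down your focus for your assignment.": True,
    "Topic mapping develops keywords to use in the search bar.": True,
    "Topic mapping tells you what citation style to use.": False,
}

# Hypothetical responses: each set holds the options one student checked.
responses = [
    {"Topic mapping narrows down your focus for your assignment.",
     "Topic mapping develops keywords to use in the search bar."},
    {"Topic mapping narrows down your focus for your assignment.",
     "Topic mapping tells you what citation style to use."},
]

for option, should_check in ANSWER_KEY.items():
    matches = sum((option in checked) == should_check for checked in responses)
    correct_pct = 100 * matches / len(responses)   # ideal: 100%
    incorrect_pct = 100 - correct_pct              # ideal: 0%
    print(f"{correct_pct:6.1f}% correct, {incorrect_pct:5.1f}% incorrect  {option}")
```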

Response Accuracy to Information Literacy Quiz, Pre- and Post-Test (%)

                              Class 1                           Class 2
                              Pre-test  Post-test  Improvement  Pre-test  Post-test  Improvement
Correct response accuracy     73.41     86.31      12.90        79.05     84.45       5.40
Incorrect response accuracy   18.38      9.91       8.47        17.44      9.95       7.49
Total improvement                                  21.37                              12.89
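For concreteness, here is a small sketch of that improvement arithmetic using the figures from the table above; a spreadsheet would do the job just as well:

```python
# Improvement arithmetic from the table above: an increase in correct
# response accuracy plus a decrease in incorrect response accuracy,
# summed into a total improvement rate per class.

classes = {
    "Class 1": {"correct": (73.41, 86.31), "incorrect": (18.38, 9.91)},
    "Class 2": {"correct": (79.05, 84.45), "incorrect": (17.44, 9.95)},
}

totals = []
for name, acc in classes.items():
    correct_gain = acc["correct"][1] - acc["correct"][0]        # pre -> post increase
    incorrect_drop = acc["incorrect"][0] - acc["incorrect"][1]  # pre -> post decrease
    total = correct_gain + incorrect_drop
    totals.append(total)
    print(f"{name}: {correct_gain:.2f} + {incorrect_drop:.2f} = {total:.2f} total improvement")

print(f"Average improvement: {sum(totals) / len(totals):.2f}")  # 17.13
```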


I also administered the Socrative post-test to three additional gen ed writing classes without the pre-session module.

Response Accuracy to Information Literacy Quiz (%)

                              Post-test only classes              Classes 1 & 2
                              Class 3  Class 4  Class 5  Average  Average
Correct response accuracy     87.32    87.02    83.85    86.06    85.38
Incorrect response accuracy    4.90     8.98     7.86     7.25     9.93


Discussion

The 17.13% improvement in comprehension from pre- to post-test in classes 1 and 2 is modest but promising. However, comparing classes 1 and 2 against classes 3-5, which did not receive the pre-session tutorial, actually suggests lower comprehension when the tutorial was viewed: classes 1 and 2 averaged a lower percentage of correct responses and a higher percentage of incorrect responses than classes 3-5. This finding could be due to one of several factors:

  • Small sample size. The range of accuracies is broad (e.g. 12.89%-21.37% improvement in classes 1 and 2). Natural variations in class composition play a greater role when the sample size is so small.
  • Discrepancy in lesson delivery. Did my in-class delivery change or leave out details in classes 1 and 2 under the assumption that the tutorial was effective?
  • Ineffective tutorial. Was the tutorial confusing rather than helpful?

Clearly more data is needed to determine if administering this particular pre-session module is less effective than only administering a post-test.

Calculating the percentages was a time-consuming, manual process. The data exported from D2L and Socrative did not match each other in format. Also, Socrative counts a student who logged into the quiz but didn't answer a question as an incorrect response. For example, if 20 students were logged in but only 18 responded to a question, even if all 18 responded correctly, Socrative would still count two incorrect responses. I manually recalculated percentages based on the actual number of students responding, not the total logged in but potentially dormant. Future iterations of this assessment should include a process to automate or streamline the data collection and evaluation.
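Here is a rough sketch of that adjustment; the list-of-responses format is a simplified stand-in, since Socrative's actual export layout differs:

```python
# Recompute per-question accuracy so that students who were logged in
# but never answered are excluded from the denominator, rather than
# counted as incorrect. The input below is a simplified stand-in for
# the actual Socrative export.

# True = correct answer, False = incorrect, None = logged in but no answer.
question_responses = [True, True, False, True, None, None]

answered = [r for r in question_responses if r is not None]
correct = sum(answered)

naive_accuracy = 100 * correct / len(question_responses)  # Socrative-style
adjusted_accuracy = 100 * correct / len(answered)         # actual respondents only

print(f"naive: {naive_accuracy:.1f}%  adjusted: {adjusted_accuracy:.1f}%")
```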

Conclusion

Having completed these pilot classes, I made some progress on my initial goals for assessment.

  • Determine the effectiveness of the library instruction session. The assessment needs to be administered to a larger sample size of classes to get more accurate averages and determine if the pre-session module is more helpful than harmful.
  • Make improvements to the lesson plan. Analyzing the results of the quiz, both pre- and post- test, revealed troublesome topics for the students (particularly Boolean operators). I took this insight and adjusted my lesson delivery to take more time with Boolean.
  • Distribute self-paced resources for students to re-use as needed. Feedback from students is needed to determine the effectiveness of the tutorial. Analytics indicate students are not returning to the materials after completing the pre-session module.
  • Increase time in class for more advanced topics. This goal was more idealistic than I realized. I still spent considerable class time demonstrating the tutorial topics for clarification and for students who did not complete the pre-session module. Future iterations of this lesson plan should include more flipped classroom activities to increase student engagement and comprehension.

With further data collection and a streamlined process, this type of pre- and post-test assessment could be a viable and effective way to inform iterative lesson planning for one-shot library instruction sessions.


Evaluating Information Literacy Instruction

May 3, 2018

Recently, I was offered the privilege of an in-person interview with Bethany College in Bethany, West Virginia, for the position of Research and Instructional Librarian. Unfortunately, circumstances did not allow me to make the five-and-a-half-hour drive across Pennsylvania, and I had to decline the opportunity. Part of my interview would have included a presentation on how I would completely renovate the current information literacy instruction process at the college. With my limited background in academic libraries, this really was a challenge. With some research, however, I gathered some guidance on how I would have gone about the information literacy critique and renovation had I given the presentation.

An academic library is not an island; it cannot operate without connections to and support from other departments and campus units. Factors such as the library's budget, the educational needs of the academic community, and the resources currently available for instruction must be taken into consideration. For this reason, an academic library should have in place a written mission statement for its instructional program. This statement of purpose should consider the educational mission of the institution and needs that grow more intricate as diversity and inclusion expand within higher education. (I can testify to that, having been the secretary for the Center of International Education at my local community college for two years. I regularly interacted with students from all over the globe and often had to overcome language barriers, sometimes with the help of a translation app on my smartphone!) Not to be forgotten are the non-traditional students who take courses online. How does the library's mission statement reflect those students' needs? Will the impact of the library's information literacy instruction extend beyond the institution itself, influencing students' self-development, career paths, and lifelong learning?

The second step of evaluation is to identify the content of instruction. Content will vary among academic institutions, but it is critical that learning outcomes align closely with the Association of College & Research Libraries' Information Literacy Competency Standards for Higher Education. These standards help academic librarians frame campus-wide discussions of information literacy and, coupled with ACRL's Objectives for Information Literacy Instruction, provide guidance in establishing measurable outcomes for any given information literacy program.

Identifying the modes of instruction is the next step. Here I see surveys as a very useful tool for gauging just how and where students receive their information literacy instruction and what impact, if any, it makes on their education and coursework. Additionally, surveys designed for faculty, staff, and department administration can ascertain their needs as well. The feedback from these surveys can be a driving force in directing information literacy instruction to better benefit all involved. Perhaps students do not understand the value of a reference interview when conducting research for a project and do not know where to begin. Could embedded librarians make a difference in online courses? How do we reach out to distance learning students to create an atmosphere of inclusion so that they feel they belong to the academic family? Students will be conducting research at every stage of their education, whether as undergraduates, as interns, or while working on their capstones, so it is crucial that librarians implement effective instructional programs that reflect the total campus learning environment every step of the way. Even reaching out to incoming freshmen at orientation is desirable; making our presence known from the very beginning establishes the library as a dedicated and cherished cornerstone of our students' campus experience. Virtual tours of the library facilities and meet-the-staff videos can draw in distance learning students and increase the likelihood that they will use the library remotely.

Evaluation of instruction programs should be an ongoing process. Regular meetings with faculty, staff, and department administration can determine whether the specified outcomes are being met and whether any changes in direction are necessary. There needs to be ongoing support for continuing education, training, and development. New librarians such as myself, coming on board with no prior information literacy instruction experience, can develop and nurture these skills through structured training sessions. Continual training for librarians already conducting information literacy instruction will ensure that their skills are constantly challenged and sharpened. Sensitivity and responsiveness to changing technologies and the overall chemistry of the campus environment are essential. What worked even three years ago may not work now. Make sure your library has the essential tools needed to conduct effective instruction, budget permitting.

I wonder if I would have been offered the position at Bethany College, and if so, how I would have gone about redesigning and evaluating their current information literacy instruction programs. I certainly would have been up for the challenge. How does your library evaluate its information literacy instruction? What is your greatest obstacle when doing so? Do you feel as though you have open communication with other campus departments and faculty? For academic librarians, information literacy instruction is a crucial facet of the profession, one that requires reaching out to students to instill a lifelong appreciation for accurate information and the ability to find, retain, and use reliable resources.

Negan 701

Thanks to http://www.ala.org/acrl/standards/guidelinesinstruction for the help!

 

#BUDSC18 Call for Proposals

May 3, 2018

Bucknell University will host its fifth annual digital scholarship conference (#BUDSC18) from October 5th-7th. The theme of the conference is “Digital Scholarship: Expanding Access, Activism, and Advocacy.”

#BUDSC18 will bring together a community of practitioners–faculty, researchers, librarians, artists, educational technologists, students, administrators, and others–committed to promoting access to and through digital scholarship. We consider “access” in the broadest possible terms: accessible formats and technologies, access through universal design for learning, access to a mode of expression, access to stories that might not otherwise be heard or that might be lost over time, access to understanding and knowledge once considered beyond reach.

We encourage proposals that explore or critique digital scholarship as it relates to access, broadly conceived. Topics may include, but are not limited to, the following:

  • Accessibility of digital platforms and technology
  • Access to resources to engage in or produce digital scholarship
  • Digital scholarship and social change
  • Sustainability and future access to digital scholarship
  • Digital scholarship and multimodal/interdisciplinary access
  • Access to digital scholarship beyond the academy
  • The public mission of digital scholarship
  • Creating opportunities for diverse voices and perspectives
  • Designing for access, activism, and advocacy

Submissions may take the form of interactive presentations, project demos, electronic posters, panel discussions, work-in-progress sessions, workshops, lightning talks, or other creative formats.

We look forward to building on the success of the last four years, in which we have come together to discuss challenges, share working models, reflect on projects, and inspire new avenues for actively including students in public scholarly pursuits. For more information, please view our highlights from the 2017 meeting, the conference website, and this year's call.

Proposal Submission Form: https://goo.gl/forms/4nVllpVvaLEW9Jc02

Proposals are due: Friday, June 15th, 8:00 PM, Eastern Time (US).
Notifications will be sent by July 15th.
If you have any questions please contact: budsc@bucknell.edu

Money Smart Week video now available

May 1, 2018

The Money Smart Week presentation by Emily Mross and Olivia Sullivan is available at this link:

https://zoom.us/recording/play/OVUipP97FEtpUiBz9wxF0qbjo-67AcyBf-q6hb8317GkJ-L8ALVRtWAgy_S3d3S3

Link will be active for 30 days.

Thanks to all who participated!
~Erin

April 29, 2018

Revisiting the Discovery Tool: a Periodic Exercise

The academic library has always been a place to develop and adopt new technology, tools, and services. Over the last several years we have witnessed a great increase in the reliance on discovery tools. All the big players are here: EDS, Ex Libris, and OCLC are some of the most prominent names we have seen in Pennsylvania. And let us not forget Pennsylvania's own VuFind! Call them what you will, debate has continued as these tools are refined and adopted. It is always good to take a good, hard look at what we offer our patrons. As we instruct students in class or assist them at the research help desk, do we see an improvement in their ability to find quality sources? Are these tools really successful in drawing users to our resources, or will the familiarity of the "web" always come first? Does it matter? Let's look again at some of the concerns.

In 2014, Marc Parry wrote a piece in the Chronicle of Higher Education referring to these issues as "the messy world of discovery tools."[I] The word "messy" jumps out. His reporting identified a number of concerns. Have we made progress since then? Have we found a balance? One does not need to perform a rigorous study to reach similar observational conclusions about discovery tools today.

While a visit to the websites of many of the academic libraries in Pennsylvania will show that these products have a strong presence, not every school uses the same system. Sure, the products are similar, but as practice points out, even slight differences can confuse, delay, or even deter a new researcher from continuing to use them. How many times do students approach you saying they "found this source on Google Scholar, do we have it?" When you tell them, "Why yes, and here is how easy it is to find," are they really listening? Do you find students tuning you out when you try to lead an instruction session about a discovery tool? Practice shows this can be a daunting obstacle for even the most skilled instructor. Yet we somehow persevere in our commitment to illustrate how useful the discovery service can be. And they can be. WE know that in our library world, but put yourself in the chair of the student. We are asking them to learn a new way of seeking information for their scholarly efforts. They seem to want grab-and-go, and why not? That is what they are used to doing for everything else in their lives. Experience shows most of our first-year students feel they have been, and can continue to be, successful with the internet alone. Our discovery tools don't yet truly emulate the familiar web search. Any experienced library instructor will try to convey that when it comes to modern research, it is not simply one or the other: internet or discovery.

What about those subject-specific databases? Do we do a disservice to our students by tacitly pointing them away from the specific knowledge and tools provided by a focused database? Can you limit results in your discovery tool to “articles written by registered nurses” or would it be easier to point them to CINAHL? How about a company profile for business students? The databases are each different and information can be missed in large indexes of aggregated content. It is sometimes the case that students find the proprietary interface of a standalone database more intuitive and relevant.

Then one might want to consider what we tell/sell our students. "Hey, check out this box, it searches everything we have!" It sounds like the best way to go, and many students are drawn to the idea. But has this really come to fruition? Promises are made by vendors, yet in practice not every database and discovery service play well together, and relevant results can be missed. As for relevancy, are we convinced that relevancy ranking is not influenced by the vendor? Observation suggests that when a discovery tool vendor also packages content in databases, bias and business interests appear to affect results no matter how you configure your particular tool.

If the "one-stop box" approach is our primary answer to the "googlefication" of university research, why do most of our websites still offer links to databases, LibGuides, legacy catalogs, etc.? Probably because at our core we know we are not quite there with discovery layers alone. Every layer we add has the potential to stall research. It does not serve the mission of easy, quality research if students find an overwhelming number of results or a dead link at the end of a search. Are we making progress? Absolutely. Will these tools go away? Probably not.

Time and Effort

The work asked of us to maintain these tools is overwhelming. You may have a team at your college or university, or you may be solely responsible. Either way, the library world salutes you. While progress has been made, some issues are likely never to go away:

  • different indexing
  • ever-changing licensing agreements
  • decisions to go with a cheaper competitor
  • learning curves
  • interaction with campus IT departments
  • systems that don’t speak to each other
  • the dropping/adding of databases and records
  • how the results appear at your institution
  • broken links/gateway errors
  • lag time of vendor response to issues
  • vendor bias and competition

These are a few of the many concerns of supporting a discovery tool from the back end, and they don't even include uploading cataloging and holdings information for books and serials. The hours, months, and years of continuous work in simply maintaining these tools are incredible, and the task often feels insurmountable. So once again: thanks to all those in our libraries who try very hard to keep up. We know it is a necessary yet often untenable position to be in.

So what do we need to remember? First and foremost, we are making great improvements, but we need to keep assessing. Things are better, yet many of the same issues remain with each periodic review of discovery tools. This is unlikely to change anytime soon. We should also never forget the user experience: WE see the value, but do they? It should be considered one of the larger missions of our libraries to properly market the true value of these tools. Lastly, we should ask ourselves often whether discovery tools should be the only gateway to information students are expected to use.

Do YOU see a day when it will be enough to simply provide one box to rule them all?

 

[I] Parry, M. (2014, April 25). As Researchers Turn to Google, Libraries Navigate the Messy World of Discovery Tools. Chronicle of Higher Education. p. 18.

 

The Natural Enemy of the Librarian?

April 27, 2018

A friend of mine shared an article with me on Twitter a few weeks ago, "The Natural Enemy of the Librarian," and it made me laugh. First because I started reading it on my phone and it was a serious TL;DR. But also because, after I got on a computer, I was amused to find out that Charles Cutter had declared the architect the natural enemy of the librarian in 1876! Many of the points raised in the article have been reflected in the discussions happening in my library all year. As we have prepared for a library renovation that will start in just a few weeks, there has been an ongoing debate between form and function.

Since September, we have met with a parade of architects, vendors, project managers, book movers, regular movers, and campus partners to try to coordinate everything that needs to happen before any construction can occur. I'm happy to report that, while not all of these conversations have gone completely smoothly, I don't think any of these groups are our natural enemy.

I think some of the push and pull between design and practicality has been alleviated by the current trend in libraries to become more of a collaborative work space rather than a warehouse for books & journals. We were willing to sacrifice some shelf space to incorporate more open spaces. Both sides compromised on the service desk, which remains large but will now become a single point of service for both circulation and reference functions. I'm looking forward to seeing how the flexible furniture and open spaces can be made functional for classes that want to hold work days in the library but maybe don't need the formality of the library classroom.

In any case, at least at my library, I think we can officially declare an end to the 140-year war between architects and librarians! Unless they take away more book truck storage…

Happy World Intellectual Property Day

April 26, 2018

Let’s look at the case of Madonna and Tupac (https://www.theguardian.com/music/2017/jul/19/madonna-halts-tupac-breakup-letter-auction-whitney-sharon-stone).

Personal letters are at stake, and the question the courts seem to concentrate on is property, merely property: who owns the letters and what the owner can do with them. The content of the letters, however, is of greater concern. First, the information is personal, private, and intimate, not for public consumption. But probing minds always deem public figures fair game, exposing almost any detail of their personal lives. And to make matters worse, fame (or infamy) seems to depend not on the most honorable actions of famous people but on their worst sides; some very famous public figures live on the fact that their fame depends on how widely their awful decisions or actions are publicized. Second, the letters could be protected under copyright, and the author has considerable rights to keep them under wraps, currently up to 70 years after the author's death. Celebrating intellectual property with a special day should encourage us all to consider not only intellectual property but also privacy, moral rights, and dignity. Tupac Shakur died tragically; respect the deceased. Madonna is a stellar singer, actress, and accomplished artist; respect the living.