In past years, summative course feedback collected from Library 160 students at the final exam often included the assertion that they "already knew" the content covered by the course. Although course administrators assumed for years that these claims were probably exaggerated, the course had no objective means in place to determine the accuracy of such summative statements.
To measure whether this was indeed true, in Summer 2004 the instruction coordinator designed a pre-test to gain more accurate insight into what Library 160 students actually know on the first day of class in terms of research skills and information literacy concepts. This brief pre-test consists of 14 scored questions focused on selected course content: use of web search engines; awareness of the difference between advertisements and authoritative information on the web; identification of citations; identifying effective search strategies and their likely results; knowing how to find journal articles; and knowing how call numbers are arranged.
Results of the pre-test have documented conclusively that students entering the class are far from "knowing it all": mean scores for each session are typically below 50% correct, with mean scores of ISU Honors students only slightly better. For example, the Fall 2004 mean score was 6.02 of 14 (43.0% correct), while the Honors mean score that session was 7.37 (52.6% correct). The Fall 2005 mean score was 6.03 (43.1% correct), and the Honors mean score was 7.23 (51.6% correct). These pre-test data document the great need for the Library 160 course, despite the claims of some students, and form the foundation for subsequent student learning analyses for the course.
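The percentages above follow directly from the 14-question scoring. A minimal sketch of that conversion (the function name `percent_correct` is ours, introduced here only for illustration):

```python
# Convert a mean raw score on the 14-question pre-test to percent correct.
def percent_correct(mean_score, total_questions=14):
    return 100 * mean_score / total_questions

# Fall 2004 overall and Honors means, as reported above.
print(f"{percent_correct(6.02):.1f}%")  # 43.0%
print(f"{percent_correct(7.37):.1f}%")  # 52.6%
```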
The pass-fail rate of Library 160 is generally 90% pass, 10% fail, but this rate by itself does not speak to actual student learning outcomes, that is, what students actually learned as a result of taking the course. Toward this end, in Spring 2005 the instruction coordinator inserted pre-test questions into the Library 160 final exams, creating a pre-test/post-test structure to measure selected student learning outcomes.
Analyses of these pre-test/post-test data were complicated by a number of organizational, hardware, and software changes in the University's Test & Evaluation Service (TES) office, the campus-wide service that reads the "bubble sheet" exam forms and renders the results into machine-readable data files. These changes resulted in significant changes in reports, and some missing data. (For one thing, TES sent course administrators data files for only 253 students taking Spring 2005 final exams, as opposed to data files for 660 students taking the Spring 2005 pre-test.) Because of these data discrepancies, matched-pair t-test analyses are planned to match the 253 Spring 2005 data files student by student and verify the accuracy of the data reported below. Nonetheless, the initial analyses indicate quite positive student learning outcomes on each of the selected pre-test/post-test items; on some test items, the improvement between the pre-test and the final exam was dramatic. A few selected results follow.
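A matched-pair t-test of this kind compares each student's final-exam score against that same student's pre-test score. The sketch below shows the standard calculation; the ten score pairs are hypothetical placeholders, not actual Library 160 data, and `paired_t` is a name of our own, not part of the course's analysis tools:

```python
import math

# Hypothetical paired scores (out of 14) for ten students, matched by student.
# These numbers are placeholders for illustration, NOT actual course data.
pre  = [5, 6, 7, 4, 6, 8, 5, 7, 6, 5]
post = [9, 10, 11, 8, 9, 12, 9, 11, 10, 9]

def paired_t(before, after):
    """Return (mean difference, t statistic) for a matched-pair t-test."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # Sample variance of the differences (n - 1 in the denominator).
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d, mean_d / math.sqrt(var_d / n)

mean_d, t = paired_t(pre, post)
print(f"mean gain = {mean_d:.2f} points, t = {t:.2f} (df = {len(pre) - 1})")
```

Because each student serves as their own control, this design isolates the gain attributable to the course rather than differences between student groups.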
Knowing the difference between keyword and subject searches:
This specific item represents Spring 2005 students' highest achievement by far on the pre-test, as the majority (85.3%) of students were able to answer this item accurately. (The next highest performance was just 61.1% correct on a different pre-test item, and results went down from there.) Just the same, final exam scores still increased, showing modest yet positive student learning outcomes on this item.
Knowing what Boolean operators and truncation symbols are, and the effect they have on searches:
Despite having grown up with the Internet and computers, the great majority of students could not accurately answer this pre-test item that relies on basic knowledge of how computer databases - from web search engines to library research databases and online catalogs - are searched. The final exam shows a dramatic increase in student learning on this item, though obviously an achievement rate of just 32.8% for all 253 students in this sample is not adequate. Course administrators will investigate why more students are not learning or retaining this specific item.
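The underlying concept being tested can be illustrated with sets: AND narrows a result list to the intersection, OR broadens it to the union, and a truncation symbol such as * matches every word sharing the stem. The record titles and matching logic below are a hypothetical illustration of our own, not an actual database implementation:

```python
# Illustration of Boolean operators and truncation, using Python sets.
# The record titles are hypothetical placeholders, not real catalog records.
records = {
    "Libraries and the Internet",
    "Internet Search Engines",
    "Library Cataloging Basics",
    "History of Librarianship",
}

def matches(term):
    """Records containing the term; a trailing * truncates (prefix match)."""
    if term.endswith("*"):
        stem = term[:-1].lower()
        return {r for r in records
                if any(w.lower().startswith(stem) for w in r.split())}
    return {r for r in records if term.lower() in (w.lower() for w in r.split())}

# librar* matches "Libraries", "Library", and "Librarianship".
print(matches("librar*") & matches("internet"))  # AND: intersection (narrows)
print(matches("librar*") | matches("internet"))  # OR: union (broadens)
```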
Able to correctly identify a standard citation:
This is another item that clearly poses great challenges to students entering the course: the great majority cannot identify a standard citation on the first day of class. Knowing how to interpret citations leads directly to knowing how to access the described material. From these pre-test results, we can posit that most students retrieving such a citation from a database, print bibliography, or the free web would neither know what they were looking at nor know the appropriate next steps to find the item independently, as those steps vary somewhat by citation type. The final exam shows a large positive jump in student learning outcomes on this item, but again a success rate of only 56.1% of the 253 students in the sample is not adequate. Course administrators will address this item in an effort to increase overall student learning and retention.
Able to identify paid advertisements in web search engine search results:
Although today's undergraduate students have grown up with computers and the web, almost two-thirds did not know the difference between paid advertisements (which pay web search engines for prominent placement) and supposedly "relevant" search results. Students tended to answer uncritically that anything at the top of search results was relevant and credible. Again, final exam scores showed strong improvement in student learning outcomes.
Beyond Library 160, the Library has been strengthening student learning assessment across other components of its instruction program, including the individual class sessions taught by librarians. Training sessions for the library's teaching faculty have covered learning outcomes assessment, presenting assessment data in promotion and tenure portfolios, and the student learning assessment underway in Library 160; a session on peer review of teaching will take place in Spring 2006. All of these efforts are focused on measuring student learning outcomes.
Contents last modified: 2005-11-10. Copyright © 2000-2010, Iowa State University. All rights reserved.