Irrational Exuberance
(continued from yesterday's post)
In the study, I looked for a relationship between students' opinions of a resource and their ability to use it. I asked students to rate the effectiveness of each resource both before and after they completed the exercises. Students were irrationally exuberant when it came to terms and connectors searching. They ranked it the most effective resource for finding facts and rules both before and after completing the exercise, and their support for this search method was unshakable. Even students who missed questions using a terms and connectors search refused to give it a lower effectiveness rating.
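For readers unfamiliar with the method, a terms and connectors search is a Boolean query run against the full text of cases, using connectors such as /p (same paragraph), /s (same sentence), and the ! root expander. A query along the lines of landlord /p liab! /s snow ice (an invented example, not one drawn from the study's exercises) retrieves whatever cases happen to match the words, while the print digest requires the researcher to work through the topic and key number outline.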
However, students who missed questions using the print digest or KeySearch were not as kind, punishing those resources with lower ratings. Students said they were unwilling to give terms and connectors a lower effectiveness rating because it was so much easier to use than KeySearch or the print digest. This finding is in line with previous studies in which subjects placed a high level of confidence in the results of an electronic resource simply because it was easy to use.
Students overwhelmingly stated that it took them the least amount of time to feel satisfied and confident in their research when using terms and connectors and the most time when using the print digest or KeySearch. That quickly won confidence did not translate into better performance, however: students answered the most questions correctly using the digest, not terms and connectors searches. This result is in line with previous studies finding that researchers using electronic resources often stop researching too soon.
When I informed students that their answers confirmed some common shortcomings of electronic research, they were eager to learn more about these phenomena and about possible solutions. Many students admitted it was the first time they had heard anything negative about electronic searching and its effects on their ability to craft thoughtful arguments.
My students' ignorance of these shortcomings is no surprise, and it is likely replicated at law schools around the country. When the majority of us leave first-year LexisNexis and Westlaw training up to the vendors, students predictably learn only the good things about these products. LexisNexis and Westlaw training sessions don't examine the shortcomings of their products or the pitfalls of electronic searching in general, and they make no comparisons between electronic searching and print resources. During the sessions, students complete a carefully scripted series of exercises in which nothing goes wrong and relevant results come easily. Those of us who use these products daily know that, in reality, relevant results aren't always that easy to come by. We must be more involved in the training sessions, inform students of the shortcomings of electronic searching, and equip them with strategies to overcome them.
Librarians have also taken no initiative in developing information literacy standards for the discipline of law, even though such standards exist for more than thirty other disciplines.
See the excellent study by Kathryn Hesniak, Donna Nixon and Stephanie Burke-Farne on law students' lack of information literacy skills (40% of 1Ls did not know what the library catalog contained). We must be more active in this area if we expect our students to be effective researchers.
Tomorrow I will discuss what we can do about all of this.