Student Success to Date
Like all educators, we face a central question: We are teaching, but are students learning? And if they are, how can we demonstrate their skill attainment accurately?
In studying the effectiveness of our instructional program, we rely on several sources of information: evaluation of student work via e-portfolio; self-reports from students, including responses to supplemental questions on the Noel-Levitz survey; and anecdotal information. These sources differ considerably, and the resulting mix of quantitative and qualitative information provides alternative perspectives that form a kind of "triangulation."
Evaluation of student work
Evaluation of student work submitted in response to course assignments is the most direct measure we have of student success. Specific aspects of student work are evaluated by faculty members on a four-point scale: does not meet expectations, nearly meets expectations, meets expectations, and exceeds expectations. Evaluations of "meets" or "exceeds" are considered evidence of having met the competency. In evaluating the overall success of the information literacy effort, we have set a preliminary target that 80% of Champlain students will meet the competency by graduation.
Evidence to date shows that in the Class of 2011, 72% of students achieved competence, and in the Class of 2012, 77% of students met the competency (Figures 1 and 2).
Figures 1 and 2
Over the past two years, student achievement in some components of the competency -- defining the task, considering sources, and selecting sources -- has exceeded the 80% target (Figure 3). While short of our target overall, these initial data affirm that the target is attainable and already reveal some increases in student success.
(Note: For some components of the competency, we are not able to measure student achievement in all years of their work. For example, "considering sources" is currently measured only once, during students' second year of work. The achievement levels shown in Figure 3 are based on the last measuring point for each component.)
We also looked at how students' achievement of each component of the competency changed over time.
As shown in Figure 4, the Class of 2012 cohort performed much better by their third year than in their first, and for some components the improvement over those two years was dramatic. We will continue to track these measures as an indication of overall success.
In addition to examining student work in order to evaluate how well students are meeting the college competency, we also use it to guide our instructional planning. In the very first semester of the new Core curriculum, when the Class of 2011 began, evaluation of student work showed clearly that students were not doing well at attribution and citation: only 56% were citing and attributing competently. We realized that we hadn't actually included any instruction on the topic for this cohort. In subsequent years we added instruction -- and saw dramatic improvement. Seeing student outcomes allows us to monitor and adapt quickly and easily, continually refining our instruction in light of student progress.
Looking ahead, we anticipate continuing to use faculty evaluation of student work as a measure of progress. We have begun librarian evaluation of samples of student work, and we are interested in how closely librarian evaluation correlates with faculty evaluation. We are also exploring additional ways we can monitor our effectiveness.
Student self-reports
In the Noel-Levitz survey, which the College administers regularly, students are asked to rate two aspects of each item: how important the item is to them, and how satisfied they are with the College on that item. In supplemental questions about the College-wide competencies included on the 2011 survey, students ranked technology and information literacy as the most important of the seven competencies (the others include written and oral communication, quantitative literacy, ethical reasoning, global appreciation, and critical/creative thinking). When students were asked to rate satisfaction with the College's contribution to their development of these competencies, technology and information literacy again ranked at the top.
These results indicate that students are much more aware of the technology and information literacy competency than we might have guessed, and that they are very interested in developing their skills in this area.
Anecdotal information
The opportunity to embed library instruction within the Core curriculum is a wonderful advantage -- but it carries a responsibility. Today's students -- and faculty -- demand high value for the time we spend with them. In particular, students don't want to sit through instruction that feels repetitive or boring. This realization has pushed us to continually examine and refresh our instruction, and to integrate new ideas and new technologies. One example occurred when we incorporated students' own mobile phones, laptops, and other electronic devices into an instruction session, using them in place of traditional "clickers." The success of this session is best conveyed by the following anecdote from a faculty member teaching the Core course in which the new instructional technique was introduced:
I have taught this course in prior semesters, and taken part in its information literacy component. I assumed, therefore, when I opened my classroom door to the library representatives this year that I knew what to expect.
I couldn't have been more wrong, nor more pleasantly surprised.
When the first slide of the PowerPoint presentation flickered into view, I'm sure my students thought that they knew what to expect as well. But as soon as they heard the request to "take out and turn on your cell phones" they were shocked into an attentiveness that usually requires considerable coaxing to achieve... Students began to talk excitedly to one another... asking one another questions about their respective preferences for information resources. Students who had seldom spoken in class before this offered eager defenses for their particular choices, and animated discussions erupted at several tables - and this after only the first slide in the presentation.
The positive effect that this simple, but powerful, pedagogical tool had on student engagement and attitude in my classroom was clear and profound.
This broad range of evidentiary sources gives us greater confidence that we are seeing success not only in student achievement, but also in students' engagement with the topic and their recognition of its value to their education.