Having run many evaluations in classes at UMBC, in courses from physics to political science, information systems to art, I have met and talked with a wide range of students. These classes have also run the gamut from small freshman courses to large introductory courses to senior and graduate seminars. One student concern (sometimes more of a complaint) that is raised often enough to be troubling is the desire to have a clearer sense of what the next exam will be like. This concern produces requests for study guides, more review sessions, and a clear answer to the perennial question: "Will this be on the exam?"
I call this a concern because there are several ways of looking at this issue. On the one hand, it's possible that many of the students requesting this clarity are simply asking professors to do more of the work so that students can spend less time studying (and still do well). And of course there is a certain amount of truth to this view. Students in recent years are indeed spending much less time outside of class for every hour in class--one researcher reports that students spend only 0.3 to 1.0 hours for every hour in class (Gardiner, 1997). Below, I'll suggest some solutions for addressing this problem.
But another view of the situation is that though many students are motivated to do well, they're not sure what they're supposed to be doing well at. What does it mean when the professor tells students that they are supposed to "know" the material? What are we asking them to do, especially in terms of developing their conceptual skills? Are we asking them to concentrate on specific thinking skills, and do we teach our students in ways that identify and measure these skills for them?
Part of the problem lies in the way we structure courses, focusing primarily on the content and less carefully and consciously on the thinking skills we are trying to nurture. A simple but useful way of examining our demands on students is to consider Bloom's Taxonomy of Educational Objectives. Bloom breaks down thinking skills into six levels of increasing difficulty: Knowledge, Comprehension, Application, Analysis, Synthesis, and Evaluation. If you are unfamiliar with these levels, you may want to check out a fuller explanation, including resource links, and an interesting discussion of how these levels apply to developing multiple-choice exams.
If we want to create critical thinkers, we have to understand the level of thinking that we are attempting to develop and also the strategies students will need to learn in order to reach those levels. Because a course may focus so much on content, students may not feel compelled (or able) to move beyond memorization, a skill that may have served them well in the past but is far below the level you wish them to achieve. And if we keep adding more difficult content as the course develops but do not provide ways for students to examine their skills and improve on them, many will simply flounder. Students need to be taught how to begin to think like a professional in your discipline--and taught in stages throughout the semester because they will not develop these skills quickly or easily.
There are some simple ways of having students focus more on skills. Model the thinking behavior you wish them to emulate--don't just produce information. This means illustrating the thinking process--where things go right and wrong--rather than just illustrating results or correct procedures. And consider basing much of what you do on raising questions, not simply producing answers. On the Critical Thinking Consortium website, an article addresses this issue in detail, noting that
Thinking is not driven by answers but by questions. . . . Questions define tasks, express problems and delineate issues. Answers on the other hand, often signal a full stop in thought. Only when an answer generates a further question does thought continue its life as such. This is why it is true that only students who have questions are really thinking and learning. It is possible to give students an examination on any subject by just asking them to list all of the questions that they have about a subject, including all questions generated by their first list of questions. That we do not test students by asking them to list questions and explain their significance is again evidence of the privileged status we give to answers isolated from questions. That is, we ask questions only to get thought-stopping answers, not to generate further questions.
If we can give students assignments that ask them to generate questions, and work their way through to answers--and then provide feedback on their thought processes--we will begin to address areas of critical thinking. Weekly assignments can be the basis for reflection on, for example, whether a student's analytic skills produced the right approach to a problem or a text. And these assignments need to build, over the course of the semester, in complexity. They can be done individually or in groups, and they need not be graded.
Now it's time to return to the question I raised above about how we get students to do more outside of class, because the following example from Gibbs (1999-2000) not only helps with this issue, but also directly addresses what I have just been discussing. Gibbs describes ways of improving learning through making students do more outside of class--without changing the way a course has been taught.
Forbes and Spence describe a failing engineering class in which student performance was transformed by simply requiring students to submit problem sheets for peer assessment on six occasions during the course, while all lectures and tests remained the same.* The improvement resulted from the regular, structured work and peer feedback these submissions required, not from any change in the teaching itself.
We can make students do more outside of class, and they will respond if the assignments we require aren't just busy work. And this kind of skill-building work outside of class will create the opportunities students need to begin to think about our content in appropriate ways--and therefore not to worry so much about what to memorize for an exam.
* Forbes, D., & Spence, J. (1991). An experiment in assessment for a large class. In R. Smith (Ed.), Innovations in Engineering Education. London: Ellis Horwood.
The Critical Thinking Consortium offers a rich list of resources on university teaching.
WolcottLynch Associates: steps for creating better thinkers. A good list of educator resources and conference handouts.
Derek Bok Center for Teaching and Learning, Harvard University: Online Document: Twenty Ways to Make Lectures More Participatory.
Written resources available at the FDC
Bean, J. (2001). Engaging Ideas: The Professor's Guide to Integrating Writing, Critical Thinking, and Active Learning in the Classroom. San Francisco: Jossey-Bass.
Fink, D. (1999-2000). "Higher Level Learning: A Taxonomy for Identifying Different Kinds of Significant Learning." Teaching Excellence: Toward the Best in the Academy, Vol. 11, No. 2.
Gardiner, L. F. (1997). Redesigning Higher Education: Producing Dramatic Gains in Student Learning. ASHE-ERIC Higher Education Reports, 23(7). Washington, DC: Association for the Study of Higher Education.
Gibbs, G. (1999-2000). "Changing Student Learning Behavior Outside of Class." Teaching Excellence: Toward the Best in the Academy, Vol. 11, No. 1.
Halonen, J., & Brown-Anderson, F. (2002). "Teaching Thinking." In McKeachie, W., Teaching Tips, 11th ed. Boston: Houghton Mifflin, pp. 284-290.
Lee, V. (Ed.) (2004). Teaching and Learning Through Inquiry: A Guidebook for Institutions and Instructors. Sterling, VA: Stylus.
Weiss, C. (1992-93). "But How Do We Get Them to Think?" Teaching Excellence: Toward the Best in the Academy, Vol. 4, No. 5.