THE TEACHING ECONOMIST - William A. McEachern                 


Issue 35, Fall 2008

William A. McEachern, Editor


Many instructors continue to face classes that seem like the dawn of the dead, but most welcome opportunities to engage students more. Over the years, various methods and technologies have aimed to break through this barrier of indifference. One twist is the "clicker," an electronic personal response system that allows students to answer questions posed by the instructor during class. Depending on the format, students click their answers either directly or after discussing alternatives with nearby students. The clicker, about the size of a TV remote, sends a signal to the instructor's receiver, which compiles answers and projects the results on a screen.

Advocates believe that clickers boost class participation, offering instructors and students immediate feedback. Instructors can adjust their presentations in real time to help remedy deficiencies. Students must keep up with the material to answer the questions. Just the process of trying to come up with answers aids learning. Many students view clickers as a game (the TV game show Who Wants to Be a Millionaire uses the same approach). And clickers allow students to answer without having to speak up in class—a plus for shy students. Clickers can also be used to monitor class attendance and help get students to class on time (some instructors pose clicker questions at the beginning of class to discourage tardiness).

In a University of Wisconsin survey of 2,700 clicker-using students (Kaleta and Joosten, 2007), most agreed that clickers increased engagement, offered instant feedback, and helped them learn. Nearly all 27 instructors surveyed agreed that clickers increased engagement, helped them assess student knowledge, increased class discussion, and helped students learn. Some instructors said the class came alive as never before.

Is there a downside? Like any new technology, clickers have startup costs. In addition to the dollar cost of clickers ($25 to $40 each) and a receiver ($100 to $200), growing comfortable with the system can take weeks. Even when everyone's up to speed, clickers claim class time, especially while answers are getting aggregated and the results displayed. As one student noted, "It takes time—all the while, the class gets really loud, and I have to enjoy someone's story about a friend's picture on MySpace instead of learning." And complications arise when clickers are lost, stolen, malfunctioning, or mismatched with students; such problems often tie up an instructor's time.

How common is clicker use among economists? I surveyed a random sample of 60 principles of economics courses with syllabi posted on the web (ruling out courses taught exclusively online). I found no evidence that clickers were used anywhere in this sample. I then searched specifically for clickers in economics syllabi and found about a dozen courses that use them. In these instances, clicker questions were spaced throughout the presentation—about five or six questions per class. Clicker activity counted as 10% to 20% of the course grade—weighty enough to get students' attention.

In at least one course, students were penalized for poor attendance (as captured by clicker use) but not for wrong answers per se. With clickers figuring into the course grade, cheating can become an issue. Ted Bergstrom of UCSB warns students on his syllabus that he will regularly ask a few students whose clickers were used that day to introduce themselves after class and show him an ID: "If you are caught using more than one clicker or if your clicker registers [that day] and you are not in class, you will fail the course. It is that simple."

Although instructors and students are generally positive about clickers, evidence of student learning is hard to find. An analysis of 11 parallel courses taught at the University of Wisconsin found a small but statistically significant impact of clicker use on course grades (Kaleta and Joosten, 2007). In clicker classes taught during the fall of 2005, 85% of students earned a C or better versus 83% of students in the non-clicker classes taught during the fall of 2004 (apparently, there were no pretests or other controls, so these results are weak).
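The Wisconsin grade comparison illustrates why an uncontrolled two-point gap is hard to interpret. As a rough sketch — the study's enrollments are not reported here, so the 500-students-per-group figure below is purely hypothetical — a standard two-proportion z-test shows how the 85% versus 83% gap looks once sampling error is considered:

```python
from math import sqrt, erf

def two_prop_ztest(p1, n1, p2, n2):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-tailed p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical enrollments of 500 per group (NOT figures from the study).
z, p = two_prop_ztest(0.85, 500, 0.83, 500)
print(f"z = {z:.2f}, p = {p:.2f}")
```

Under these made-up enrollments the gap sits well within sampling noise; the study's own significance finding presumably rests on its actual sample sizes and grade data, which only underscores the caveat that results without pretests or other controls are weak.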

In another study, Margie Martyn (2007) taught four computer information systems classes in the fall of 2006. She used clickers in two classes (n=45) and a show of hands in two classes (n=47). A pretest found no significant difference between the two groups. The same PowerPoint® presentations, including the same questions, were used in all four classes. Martyn found no significant difference on the final exam score between the two groups. She suggests that her own inexperience with clickers could have contributed to the lack of improvement. Note that a show-of-hands approach becomes less practical as class size increases. I would have expected classes using clickers to do better, if only because the instructors, who tend to be positive about the experience, are also the ones who compose and grade the exams.

Clickers have been around for nearly a decade, but they have yet to catch on (at one state university for which I have recent figures, they were used by only 2% of the faculty).
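Martyn's null result may also reflect low statistical power. A back-of-the-envelope sketch — the 5% significance level and 80% power below are standard conventions, not figures from the study — of the smallest standardized effect that a two-sample comparison with roughly 45 students per group could reliably detect:

```python
from math import sqrt

# Normal quantiles for a two-sided 5% test and 80% power (standard values).
Z_ALPHA = 1.96  # z for alpha/2 = 0.025
Z_BETA = 0.84   # z for power = 0.80

def min_detectable_d(n_per_group):
    """Smallest Cohen's d detectable by a two-sample test at 80% power."""
    return (Z_ALPHA + Z_BETA) * sqrt(2 / n_per_group)

d = min_detectable_d(45)
print(f"minimum detectable effect: d = {d:.2f}")
```

With about 45 students per group, only effects near 0.6 standard deviations — a fairly large improvement for a single instructional tweak — would reliably show up, so "no significant difference" here is weak evidence that clickers have no effect.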
