Clickers: A study of Classroom Response System use at the University of Toronto


    Abstract

    We examined the use of classroom response systems (clickers) in various lecture-based courses at the University of Toronto (U of T). Over 30 U of T instructors were interviewed about their use of clickers in classes with a total enrollment of over 5,000 students. Students in these classes were also surveyed about their perception of the value of this technology. The objectives of our study were to evaluate the logistics of using clickers, their pedagogical value and associated teaching strategies, and students’ perception of their efficacy for learning. We discuss some of the successes and failures of using clickers as a teaching and learning tool.

    Introduction

    Handheld classroom response systems (clickers) have become increasingly popular in undergraduate teaching as a tool for engaging students and enriching learning environments (Beatty, 2004; Carnevale, 2005; Crouch & Mazur, 2001; Duncan, 2005). Used during lectures, clickers provide prompt feedback on student comprehension (Beatty, 2004; Brueckner & MacPherson, 2004; Burnstein & Lederman, 2003; Mazur, 1997).

    A systematic review of different clicker models suggests that many of the commercially available clickers are very similar (Burnstein & Lederman, 2003). In 2006, the University of Toronto adopted a single clicker vendor for its three campuses, encouraging all instructors to use the same system.1 This decision enables students to purchase one clicker for multiple classes, and it allows the university to offer resources and training to faculty on one system. By the spring of 2008, over 60 instructors at U of T were using these clickers as part of their teaching, and the U of T bookstores were reporting sales of over 10,000 clickers per year.

    We surveyed U of T faculty and students in order to determine the following:

    • what types of classes are using clickers most successfully?
    • what are the best pedagogical practices for teaching with clickers?
    • what are the best logistical practices for the administration and use of clickers?
    • do students believe clickers help them learn?

    Faculty Survey

    We conducted 32 interviews with faculty from various departments (e.g., departments in the Faculty of Arts and Science, Faculty of Medicine, School of Management). All of the interviewees had some experience with clickers. We asked several questions about the nature of the classes for which they used clickers, the logistics of their clicker use, their teaching styles, and their opinions about the advantages and disadvantages of teaching with clickers.

    Logistics

    Different instructors employed different practices for assigning grades for the use of clickers. Of the 32 faculty we interviewed, 6% assigned grades only for buying and registering a clicker, 16% assigned grades for correct answers when clickers were used for quizzes, 47% assigned participation grades (regardless of what students answered), and 31% did not assign grades for clicker use. These data are consistent with what others have suggested (e.g., Crouch & Mazur, 2001; Dufresne et al., 1996): many instructors in this study used clickers to encourage participation. This is reflected in the high proportion of instructors (53%, i.e., the 47% who graded participation plus the 6% who graded registration only) who based their grade assignment on participation and clicker registration.

    While most instructors prepared clicker questions before lectures, 50% of interviewees reported occasionally thinking of a clicker question in the middle of a class and asking it. As others have suggested (e.g., Crouch & Mazur, 2001; Dufresne et al., 1996), this had the effect of enlivening the class, enhancing ideas, and clarifying topics, but instructors in this study reported that, to be effective, spontaneous questions had to be simple.

    The clicker polling procedures that instructors reported in this study resemble a variety of procedures discussed in the literature (e.g., Beatty, 2004; Crouch & Mazur, 2001; Rao & DiCarlo, 2000). Instructors typically gave students between 30 seconds and 1 minute to answer a question before closing the voting; if calculations were involved, longer times of up to 2 or 3 minutes were allowed. All but one of the interviewees tended to show the class a histogram of the results of each vote immediately after the voting was closed. About 33% of the interviewees had occasionally shown the class a histogram of results during the vote, so that students could change their answer and see the effect on the histogram in real time. This introduced the potential of the histogram influencing the vote; one economics professor reported using this technique to teach about “herding” and to emphasize the value of independent thinking.

    The majority of interviewees (over 70%) said that they often expected, encouraged, or allowed students to discuss a clicker question both before and after voting. Faculty who did not allow discussion before the vote usually disallowed it because the correct answer counted for marks. One instructor regarded such discussion as cheating, but felt that “it was probably happening anyway”. Another did not allow discussion before the first time students voted on a question; if the students did not do well on the question, he had them discuss amongst themselves and re-vote. Of the 22 interviewees who allowed discussion after the voting, 91% asked the students to vote on the same question again after the discussion. About 50% said that they occasionally repeated a clicker question on a test or exam.

    Pedagogical Practices

    Faculty were asked what types of questions they used with the clickers. Consistent with what others have discussed (e.g., Reay et al., 2005), the majority of interviewees (84%) responded that they used clickers to ask conceptual questions; these questions had a single correct answer and were designed to check common misconceptions, apply quick problem-solving strategies, review or synthesize material, or combine readings with lecture material. Other types of questions included fact checking, questions that do not necessarily have a correct answer, and surveys about the class. Often questions were asked not to test the students but to generate discussion and make them think. Thirteen percent of interviewees said they sometimes asked a question and then, before giving the answer, asked the students to report their level of confidence in their own answer.

    For questions that did have a correct answer, not all faculty expected or hoped that a large majority of students would answer correctly. Thirty-four percent of interviewees indicated that they were aiming for approximately 50% correct answers, citing peer instruction (Mazur, 1997) as their motivation for striving for this average. In this study, this lower correct-response rate seemed to promote the vote, discuss, then vote again process: an initial question (prequestion) is posed to gauge how much students know, a discussion follows, and then a postquestion is posed to check whether students have understood the concept. Others describe this pedagogical method as one that improves students’ problem-solving abilities and performance on quizzes (Rao & DiCarlo, 2000; Ruhl, Hughes & Schloss, 1987), improves student engagement and learning outcomes (Beatty, 2004; Brueckner & MacPherson, 2004; Crouch & Mazur, 2001), and improves interactive classroom discourse and increases students’ active participation and ownership of their learning (Beatty, 2004; Dufresne et al., 1996; Rao & DiCarlo, 2000), while decreasing student anxiety (Owens & Walden, 2001) as well as lower-level learning and passive rote memorization of lecture material (Rao & DiCarlo, 2000).

    We asked faculty who were new to clickers what changes they might make to their pedagogy if they were to use clickers in the future. Most responded that they would put more effort into formulating questions, include more conceptual questions as opposed to fact checking, and encourage discussion before the vote. Some instructors were planning novel ideas, for example incorporating animations, graphs, and math tools to teach concepts such as game theory (e.g., the prisoner’s dilemma) or half-life (science-fiction computer game viewed from the perspective of the player).

    Advantages of clicker use

    Consistent with what others have reported (Beatty, 2004; Burnstein & Lederman, 2003; Carnevale, 2005; Crouch & Mazur, 2001; Dufresne et al., 1996; Mazur, 1997; Rao & DiCarlo, 2000), the most common advantage of clicker use, reported by 69% of interviewees, was student engagement: by using clickers, students are forced to think and make a decision in class, which helps engage them with the material.

    Of the other advantages reported, the most common were:

    • The instructor receives quick feedback on student understanding of course material.
    • The students receive quick feedback on their own understanding, and how they compare to the rest of the class.
    • Clicker use helps stimulate in-class discussion and peer instruction.
    • Clickers engage all students equally, including the quieter ones who would not normally be involved in a spoken discussion.

    Disadvantages of clicker use

    Again, consistent with the literature (e.g., Beatty, 2004; Burnstein & Lederman, 2003; Fies & Marshall, 2005), the most common disadvantage of clicker use reported by faculty was the administrative burden associated with the technology. This included registering student identification with clicker frequencies, enforcing policies about lost or forgotten clickers, and tabulating and posting clicker grades. Other common disadvantages were the extra time and energy instructors needed to devote to lecture preparation in order to use clickers effectively, and the fact that stopping the class for a clicker vote takes away from class time, so that less material can be covered.

    Most instructors agreed that they would not use clickers in small classes, such as those with fewer than 30 students; in these classes the administrative burden and other disadvantages outweigh the advantages. In larger classes, such as those with 70 or more students, the advantages are much greater and clearly outweigh the disadvantages.

    Correlations between Teaching Practices and Student Experience

    We asked all of our faculty interviewees if they would survey students in the classes in which they were using clickers. Students were asked whether they liked using clickers, and whether they believed using clickers helped their learning. These surveys were conducted in class using clickers.

    In a pilot student survey, involving three classes with a total of 670 students, responses were simply phrased as yes/no. The majority of students in all three classes said “yes” to both questions. In a larger student survey, involving six classes with a total of 715 students, a 4-point scale was used to indicate the level to which students liked the clickers and the level to which they thought clickers helped them learn. The results are shown in Table 1.

    Table 1 Results of student surveys in nine classes

    Do you like using clickers?

    • Fall ’07 (Intro Psychology, Geology, Physics; 670 students): yes 65%, no 28%
    • Spring ’08 (Physical Education, Astronomy, Civil Engineering, Psychology, Chemistry; 715 students): yes 68% (Loved it 27% + Liked it 41%), no 28% (Disliked it 14% + Hated it 14%)

    Do you believe clickers help you learn?

    • Fall ’07 (Intro Psychology, Geology, Physics; 670 students): yes 69%, no 30%
    • Spring ’08 (Physical Education, Astronomy, Civil Engineering, Psychology, Chemistry; 715 students): yes 47% (A lot 13% + A fair amount 34%), no 52% (Just a bit 32% + Nothing 20%)

    The data from the 4-point scale in the larger student survey were collapsed to match the binary (yes/no) scale of the pilot data. The yes/no results of the student surveys in all nine classes were then compared with some of the teaching practices reported by the interviewees. We performed logistic regression analyses to determine the likelihood of students reporting that “yes”, they like using clickers, and that “yes”, they believe clickers help them learn, given particular teaching practices reported in the faculty survey, specifically the instructors’ answers to the following four yes/no questions (a sketch of this type of analysis follows the list):

    1. Do you expect/encourage/allow students to discuss a clicker question amongst themselves before they vote?
    2. Do you ever think of a clicker question in the middle of a class and ask it?
    3. Do you ever display a histogram of vote results while voting is going on, so that students can see the results while deciding on or changing their answer?
    4. Do you ever have students discuss a clicker question after they have voted?
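
    For readers who want to see how such an analysis can be set up, the following is a minimal sketch in Python using statsmodels. The per-class counts, the variable name discuss_before, and the choice of library are illustrative assumptions rather than data or code from this study; only the overall approach, a binary student response modelled as a function of a binary instructor practice, follows the description above.

    # Hedged sketch (not the authors' actual analysis code): logistic regression of
    # aggregated student yes/no responses on a binary teaching-practice indicator.
    # All counts below are invented for illustration.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical per-class summary: numbers of students answering "yes, clickers
    # help me learn" and "no", plus the instructor's answer (1 = yes, 0 = no) to one
    # practice question, e.g. "Do you allow discussion before the vote?".
    classes = pd.DataFrame({
        "yes":            [120,  95, 210,  60,  80,  40,  55,  30,  70],
        "no":             [ 40,  60,  90,  70,  30,  85,  45,  60,  35],
        "discuss_before": [  1,   1,   1,   0,   1,   0,   1,   0,   1],
    })

    # Binomial GLM with a logit link fitted on (successes, failures) counts per class;
    # this is equivalent to a logistic regression on the individual student responses.
    X = sm.add_constant(classes[["discuss_before"]])
    result = sm.GLM(classes[["yes", "no"]], X, family=sm.families.Binomial()).fit()

    print(result.summary())
    # The exponentiated coefficient is the odds ratio: how much more (or less) likely
    # a student is to say clickers help when the instructor allows pre-vote discussion.
    print("Odds ratio:", np.exp(result.params["discuss_before"]))

    With real data, the sign of the fitted coefficient indicates whether a given practice is associated with more or fewer students reporting that clickers help their learning, which is the kind of relationship summarized in Figures 1 to 3.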

    Instructors’ teaching practices predicted students’ responses for three of the four teaching-practices questions. As shown in Figure 1, students were more likely to believe that clickers help their learning if instructors allowed them to discuss a clicker question amongst themselves before they voted. As shown in Figure 2, students were more likely to believe that clickers do not help their learning if instructors had students discuss a clicker question after they had voted. As shown in Figure 3, students were more likely to believe that clickers do not help their learning if instructors displayed the histogram while voting was going on, so that students could see the results while deciding on or changing their answer.

    Figure 1 Summary data of instructors allowing students to discuss a clicker question before the vote (Y/N), and students’ opinion that clickers help learning (histogram).

    Figure 2 Summary data of instructors allowing students to discuss a clicker question after the vote (Y/N), and students’ opinion that clickers help learning (histogram).

    Figure 3 Summary data of instructors’ practice of displaying the vote results during polling, and students’ opinion that clickers help learning (histogram).

    These results suggest that students believe discussion of questions before a vote helps their learning, but that discussion of questions after a vote does not. A possible explanation is that students perceive post-vote discussion as frivolous, taking away class time that could be used to cover more material. The results also suggest that students do not believe they learn more simply because instructors display the histogram during a vote; again, students may regard this practice as a frivolous use of the technology that detracts from lecture time and focus that could be spent on other material.

    Conclusions

    There are many ways to use clickers in class, as well as many reasons to use or not use them. As demonstrated in this study, students most often like them (Beatty, 2004; Brueckner & MacPherson, 2004; and Roschelle et al., 2004 also report this), but their value to students is determined not only by how, logistically, the technology is used, but more importantly by how and why, pedagogically, it is used by the instructor, as Beatty (2004), Brueckner and MacPherson (2004), and Mazur (1997) also suggest. Many of the faculty interviewed in this study reported that they had not previously thought about many of the issues raised in the interview, and said they would change their future teaching practices with clickers based on our interviews.

    Authors’ Biography

    A Faculty Learning Community is a group of trans-disciplinary faculty who engage in an active, collaborative program regarding undergraduate education. The U of T Faculty Learning Community is an informal group composed of teaching and research faculty at the University of Toronto. Each year, the group decides on a theme and carries out individual or group projects to investigate various aspects of that theme.

    References

    Beatty, I. 2004. Transforming Student Learning with Classroom Communication Systems. Educause Centre for Applied Research, Research Bulletin, 2004(3): 2-13.

    Brueckner, J.K., and MacPherson, B.R. 2004. Benefits from peer teaching in the dental gross anatomy laboratory. European Journal of Dental Education, 8: 72-77.

    Burnstein, R.A., and Lederman, L.M. 2003. Comparison of Different Commercial Wireless Keypad Systems. The Physics Teacher, 41(5): 272-275.

    Carnevale, D. 2005. Run a Class Like a Game Show: “Clickers” keep students involved. Chronicle of Higher Education, 51(42): B3.

    Crouch, C.H., and Mazur, E. 2001. Peer Instruction: Ten years of experience and results. American Journal of Physics, 69(9): 970-977.

    Dufresne, R.J., Wenk, L., Mestre, J.P., Gerace, W.J., and Leonard, W.J. 1996. Classtalk: A Classroom Communication System for Active Learning. Journal of Computing in Higher Education, 7(2): 3-47.

    Duncan, D. 2005. Clickers in the Classroom: How to Enhance Science Teaching Using Classroom Response Systems. Toronto: Pearson Prentice-Hall and Pearson Benjamin Cummings.

    Fies, C., and Marshall, J. 2005. Electronic Response Systems in Classrooms. American Association of Physics Teachers: Announcer, 34(4): 111.

    Mazur, E. 1997. Peer Instruction: A User’s Manual. Toronto: Prentice-Hall.

    Owens, L.D., and Walden, D.J. 2001. Peer Instruction in the learning laboratory: a strategy to decrease student anxiety. Journal of Nursing Education, 40(8): 375-377.

    Rao, S.P., and DiCarlo, S.E. 2000. Peer Instruction Improves Performance on Quizzes. Advances in Physiology Education, 24: 51-55.

    Reay, N.W., Bao, L., Pengfei, L., Warnakulasooriya, R., and Bough, G. 2005. Toward an effective use of voting machines in physics lectures. American Journal of Physics, 73(6): 554-558.

    Roschelle, J., Penuel, B., and Abrahamson, A.L. 2004. The networked classroom. Educational Leadership, 61(5): 50-54.

    Ruhl, K.L., Hughes, C., and Schloss, P. 1987. Using the pause procedure to enhance lecture recall. Teacher Education and Special Education, 10: 14-18.


    This page titled Clickers: A study of Classroom Response System use at the University of Toronto is shared under a not declared license and was authored, remixed, and/or curated by David Harrison.
