Learning Innovation Unit, Dublin City University
Altering the Learning Environment Using Classroom Response Systems
By Dr. Eilish McLoughlin and Dr. Odilla Finlayson, Centre for the Advancement of
Science and Mathematics Teaching & Learning (CASTeL)
Classroom response systems (CRS) are increasingly used to promote student interaction and hence enhance student learning, particularly in challenging large-lecture environments. This article describes the potential benefits of CRS and outlines how they have been used in DCU’s School of Physical Sciences and School of Chemical Sciences.
What are classroom response systems?
CRS consist of handheld transmitters (often referred to as ‘clickers’) which use infrared, radio or wireless signals to send students' responses to questions posed by the lecturer to a receiver connected to a computer at the front of the class. The responses are compiled, and the results can then be used to provide immediate feedback to the lecturer – and to the students if desired – about the understanding or misunderstanding of the material questioned (Figure I).
Figure I: How does CRS work?
Why use classroom response systems?
The benefits of teaching by questioning are widely recognized (Banks 2006; Duncan 2005; Hall et al 2005; Draper & Brown 2004; Beatty 2004; Wit 2003). Questioning can be used to:
- Assess student background knowledge.
- Test if what has been taught has been understood.
- Provoke a class discussion.
- Emphasise or reinforce a point.
- Introduce a new topic.
- Help students combine past material to reach an understanding of present material.
- Check students’ understanding of assigned reading, an activity or simulation.
- Test students’ intuition before a demonstration or before teaching a subject.
- Evaluate students’ interpretation of what is happening in a demonstration, to correct any misconceptions, or to lead students to a better understanding.
Draper (2004) suggests several possible applications of CRS:
- Assessment: as a substitute for a paper test.
- Formative feedback on learning within a class: the lecturer can discover if points have been understood by the students and/or may require some further clarification. This information is not always obtained in time with more traditional modes of teaching.
- Formative feedback to the lecturer on their teaching: for example, a lecturer can ask for feedback on the best and worst aspects of his/her teaching, and then attempt to correct any issues immediately.
- Peer assessment: students who are giving presentations can be graded instantly by their peers on the quality of their work.
- Community building: general questions (for example why students chose this particular class) could create a sense of mutual awareness within the group.
- Demonstrating human response experiments: where, for example, in psychology individual responses can be related to those of the group, showing mean and variability within the group.
- Initiating a discussion: students who have had to commit privately to a definite opinion are much more likely to feel the need to justify their answer in peer discussion.
Benefits of classroom response systems
CRS can engage students in class, encouraging them to become active participants in the learning process. Additionally, by providing frequent feedback to students about the limitations of their knowledge, CRS-based instruction helps them to take charge of their own learning. By providing feedback to a lecturer about their students’ background knowledge and preconceptions, CRS-based pedagogy can help a lecturer to design learning experiences appropriate to the group and explicitly confront and resolve misconceptions.
Many studies have reported positively on student satisfaction with classroom response systems, citing benefits such as making classes more interesting and improving attendance (Hall et al 2005; Duncan 2005; Draper & Brown 2004; Wit 2003). A study by Kennedy and Cutts (2005) examined individual students' response data over the course of a single semester and analysed it alongside end-of-semester and end-of-year examination results. The investigation showed that students who participated more frequently in CRS use, and who were frequently correct in their responses, performed better on formal assessments. Students who responded infrequently, even when their responses were correct, nevertheless performed poorly on formal assessments, suggesting that the level of student involvement during class is positively correlated with better learning outcomes.
While the use of CRS clearly offers many potential pedagogical benefits, it should be noted that the use of CRS in itself will not be sufficient to improve student learning: considered application of the technology, based on defined pedagogical goals, is necessary for successful deployment.
The pilot use of CRS in DCU’s School of Physical Sciences and School of Chemical Sciences
CRS have been used in a first year physics module on waves and optics delivered to approximately 30 students. During this module, continuous assessment was carried out on four separate occasions, twice using CRS and twice via paper-based tests. A range of question types was posed in both the written and electronic continuous assessment and in final examination questions, including: recall, calculation, interpretation, reasoning and application. Student performance in each of these individual assessment components did not differ significantly between the two forms of assessment.
CRS have also been used in second year chemistry lectures (groups of up to 80 students) for two main purposes: (i) to pose questions during lectures; and (ii) for continuous assessment. In the first case, CRS were useful for determining prior knowledge by posing foundation questions; immediate feedback from the group allowed the lecturer to revisit points that were not understood. However, this application did raise questions about how the lecturer should respond. For example, if 50% of students are correct, then it is likely to be beneficial for the lecturer to spend additional time on the problem area; however, if 90% of students are correct, then how much time should the lecturer spend reviewing the material? Another potential issue arises where CRS are used to determine understanding of the previous lecture: if there is not a 100% correct response, the lecturer must consider whether this is because the material has not been understood, or because it has not been reviewed.
In our experience, CRS can be useful for ongoing continuous assessment and for indicating where students are in their understanding, particularly in larger groups. Response time is an important factor here: if a question requires a calculation to be carried out, or demands more thought, then different students will need different amounts of time to respond. CRS can therefore be limiting for students who require more time. In addition, because the questions used tend to be multiple choice, the potential for guesswork can be an issue.
From a practical perspective, negative factors relate to the time required to set up and the bulkiness of the units: carrying 100 units and a laptop from one building to another takes additional effort and cannot be easily done if you have a preceding lecture! Additionally, distributing the units to the students takes about five minutes in larger classes.
Overall, the successful use of CRS depends on the balance between the benefits and practical issues. Robertson (2000) offers twelve tips for using CRS that we have found useful (Figure II).
Figure II: Tips for teaching with CRS (Robertson, 2000)
Conclusion
In our experience, the efficacy of CRS is highly dependent on the quality of the questions used. Creating effective questions can be challenging and time-consuming, and differs from creating examination and homework problems. However, well-designed CRS questions can direct students’ attention, stimulate specific cognitive processes, communicate information to the lecturer and students (via CRS-tabulated graphics in PowerPoint) and facilitate the articulation and discussion of ideas (Beatty et al 2006). Improved achievement of learning outcomes is really the result of a change in pedagogical focus, from passive to active learning, and does not result solely from the specific technology used. Without a focused, well-planned transformation of the large lecture format and pedagogical goals, the technology provides no advantage. If the manner in which the technology is implemented in class is neither meaningful nor interesting to the student, then student interaction and engagement lapse. In conclusion, student interaction can be facilitated via careful use of CRS, but asking the right question is more important than the technology!
The CRS that has been used in DCU is the Qwizdom Q4 (www.qwizdom.co.uk), used in conjunction with Microsoft PowerPoint. Several question formats are available on the Qwizdom Q4 units: multiple choice (conceptual or numeric), true/false, yes/no, rating scale, sequencing and numerical input.
The authors would like to acknowledge the support of the DCU Learning Innovation Fund. If you would like to use CRS in your teaching then please contact morag.munro@dcu.ie.
References
Banks, D. (ed.) 2006. Audience response systems in higher education: Applications and cases. Hershey, PA: Information Science Publishing.
Beatty, I. 2004. Transforming student learning with classroom communication systems. EDUCAUSE Center for Applied Research Bulletin, 3, pp. 1-13.
Beatty, I., Gerace, W., Leonard, W. & Dufresne, R. 2006. Designing effective questions for classroom response system teaching. American Journal of Physics, 74(1), pp. 31-39.
Draper, S. & Brown, M. 2004. Increasing Interactivity in Lectures Using an Electronic Voting System. Journal of Computer Assisted Learning, 20, pp. 81-94.
Draper, S. 2009. Electronic Voting Systems. Available from: <http://evs.psy.gla.ac.uk/>. [Accessed 15 November 2010].
Duncan, D. 2005. Clickers in the classroom: How to enhance science teaching using classroom response systems. San Francisco: Pearson Education.
Hall, S., Waitz, I., Brodeur, D., Soderholm, D. & Nasr, R. 2005. Adoption of Active Learning in a Lecture-based Engineering Class. IEEE Conference, Boston, MA, USA.
Kennedy, G. & Cutts, Q. 2005. The Association Between Students' Use of an Electronic Voting System and their Learning Outcomes. Journal of Computer Assisted Learning, 21(4), pp. 260-268.
Robertson, L. 2000. Twelve tips for using a computerised interactive audience response system. Medical Teacher, 22(3), pp. 237-239.
Wit, E. 2003. Who Wants to be… The Use of a Personal Response System in Statistics Teaching. MSOR Connections, 3(2), pp. 14-20.