

Journal of Educational Technology Development and Exchange (JETDE)
Volume 1, Issue 1, Article 8
June 2008

Student Response Systems in Higher Education: Moving Beyond Linear Teaching and Surface Learning

Harry L. Dangel
Charles Xiaoxue Wang

Recommended Citation:
Dangel, Harry L., & Wang, Charles Xiaoxue (2008). "Student Response Systems in Higher Education: Moving Beyond Linear Teaching and Surface Learning." Journal of Educational Technology Development and Exchange (JETDE), 1(1), Article 8. DOI: 10.18785/jetde.0101.08. Available at: http://aquila.usm.edu/jetde/vol1/iss1/8

Student Response Systems in Higher Education:
Moving Beyond Linear Teaching and Surface Learning

Harry L. Dangel
Charles Xiaoxue Wang
Georgia State University

Abstract: Over the past decade, instructors in colleges and universities increasingly have used Student Response Systems (SRSs), typically in large classes, to increase the level of student engagement and learning. Research shows that both students and instructors perceive SRSs to be beneficial, although evidence of improved learning has been less clear. Experts emphasize that instructors must consider how technology might enhance good pedagogy in order for increases in learning to occur. SRSs do increase student engagement and provide prompt feedback—two key practices that promote learning. However, professional groups propose goals for students in higher education that focus on deep learning rather than the knowledge-centered emphasis of many large classes. Recent research shows that SRSs coupled with pedagogical enhancements can promote deep learning when teaching and questioning strategies center on higher-level thinking skills. A framework integrating the levels of student responses with principles for good pedagogical practice is provided as a guide for using SRSs to foster deep learning.

Key words: teaching methods, educational technology, student learning, student response systems, large classes

1. Introduction

The glass is half-full. We can foresee asking students to compose a "minute paper" describing the muddiest point for them in class that day (Angelo & Cross, 1993) and then having them send their responses electronically for immediate compiling; or having students collaboratively develop a list of additional information they need before rendering a tentative diagnosis in a Problem-Based Learning class in nursing and then comparing those lists in real time. These are instructional possibilities when using the emerging Student Response Systems (SRSs), often referred to as clickers. On the other hand, after encouraging colleagues to use clickers in their large lecture classes several years ago, we learned that there was not clear evidence that student learning had increased. While colleagues were convinced that students were more engaged and motivated and that attendance had improved, examination performances did not provide the hoped-for evidence of increased learning. Our experience is not unusual, although such outcomes usually are not published.

This article presents a brief summary of SRS research, a framework for examining teaching and learning with SRSs in higher education, and suggested directions for using this technology to improve student learning.

1.1. What's in a name?

We typically think of SRSs as the clicker systems used in large lecture classes.

When using SRSs, instructors pose a question, problem, or statement, ask students to respond, and display the results. The systems have a variety of names: Audience Response Systems (e.g., Audience Response Systems in Higher Education, Banks, 2006), Classroom Response Systems (e.g., Classroom Response Systems: A Review of the Literature, Fies & Marshall, 2006), and Student Response Systems (e.g., Student Response Systems: A University of Wisconsin Study of Clickers, Kaleta & Joosten, 2007). Consider what is implied by each name and how what we call the technology suggests how we would use it. An audience response system suggests that the action is in the front—on the stage. Students are more likely to be viewed as consumers who receive the instruction rather than as full partners in the instructional process. We wonder whether using the terms class or classroom implies a focus on group learning (teach to the middle) as opposed to individual students, as in student response system. A few authors (e.g., Guthrie & Carlin, 2004) describe the technology as participation systems (a more collaborative approach) rather than response systems, which suggest a pedagogical orientation of the instructor directing and students reacting. For this paper, we will use the term Student Response Systems because this term appears in the literature with increasing frequency and the emerging work clearly views students as more than an audience.

The pedagogical principles used with SRSs originated with technologies as simple as the handheld slates or whiteboards on which children would write their answers to questions presented by the teacher. The teacher asks a question (e.g., "How do you write CAT?" or "How much is 2+2?"), and the students write their answers. A quick survey of the upheld slates or whiteboards enables the teacher to determine what portion of students responded correctly and adjust instruction accordingly. The modern equivalent of this simple technology is the erasable whiteboards used in some science courses (Crouch & Mazur, 2001). What SRSs add is the ability to display the pooled results of the class. Today's SRSs are typically clicker systems, although more interactive systems are being employed (e.g., using hand-held computers, PDAs, or text messages from mobile phones).

2. Beyond Surface Learning—What to Ask Students?

Recent reviews of the literature on the effects of using SRSs in higher education (Caldwell, 2007; Fies & Marshall, 2006; Judson & Sawada, 2002) provide a generally positive picture of the technology's impact on the classroom:

• Students and faculty consistently indicate that they have a positive view of SRSs, especially related to perceived improvements in attendance, engagement, and motivation (e.g., Hansen, 2007).
• SRSs are effectively used as pre-instruction assessments (pretests or checks on homework or readings), surveys of knowledge or opinions, formative and low-stakes assessments, comprehension checks during lectures, assessments to launch or stimulate discussions, and quizzes and tests.
• SRSs have a positive or neutral impact on student learning.
• Research on the efficacy of SRSs to promote student learning typically lacks the controls necessary to determine whether the technology or the accompanying pedagogical changes are responsible for apparent increases in learning.
• Evidence of a positive impact of SRSs is likely associated with the accompanying use of effective instructional practices (e.g., active learning).
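To make the basic SRS cycle described above concrete (pose a question, collect individual responses, display the pooled distribution), here is a minimal sketch in Python; the response data and function name are invented for illustration and are not drawn from any particular clicker system.

```python
from collections import Counter

def pooled_results(responses, options=("A", "B", "C", "D", "E")):
    """Tally individual clicker responses into a class-wide percentage distribution."""
    counts = Counter(responses)
    total = len(responses) or 1
    return {opt: round(100 * counts.get(opt, 0) / total) for opt in options}

# Hypothetical responses gathered from one question in a large lecture section.
responses = ["B", "A", "B", "C", "B", "D", "B", "A", "E", "B"]
for option, percent in pooled_results(responses).items():
    print(f"{option}: {'#' * (percent // 5):<20} {percent}%")
```

The same tallying works whether responses arrive from clickers, PDAs, or phones; the pooled display is what separates an SRS from a simple show of hands.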

When we look more closely at the research findings, we question the impact of SRSs on learning. Initial use of SRSs dates back nearly fifty years as an effort to maximize student participation (Horowitz, 2006) and appeared to assume that, with increased engagement, learning would follow. The titles of recent articles, Waking the Dead (Guthrie & Carlin, 2004), To Click or Not to Click (El-Rady, 2006), and Run a Class like a Game Show (Carnevale, 2005), emphasize the role that attendance and engagement have played for many faculty. Further, the multiple-choice format of the current commercial SRSs presents a challenge to faculty to move beyond asking questions about facts (recognizing a correct answer) presented in the lecture or text and to engage students in higher-level thinking.

We contrast these findings on SRSs with the high standards for learning that are being set for post-secondary students in the 21st century. The widely acclaimed white paper, Greater Expectations (Association of American Colleges and Universities, 2002), proposes developing intentional learners who are grounded in a liberal education and empowered to:

• effectively communicate orally, visually, in writing, and in a second language,
• understand and employ quantitative and qualitative analysis to solve problems,
• interpret and evaluate information from a variety of sources,
• understand and work within complex systems and with diverse groups,
• demonstrate intellectual agility and the ability to manage change, and
• transform information into knowledge and knowledge into judgment and action.

A similar emphasis on broad, integrated skills related to literacy in technology is echoed in a recent article from the EDUCAUSE Center for Applied Research (Moore, Fowler, Jesiek, Moore, & Watson, 2008).

The proposed goals for students are shown in Table 1 below and are consistent with the shift from teacher-centered to learner-centered instruction (Barr & Tagg, 1995; Weimer, 2002).

Table 1. New Competencies for Learning (Moore, et al., 2008, p. 5)
• From (current prevalent outlook): Re-visioning movements are institution-focused, on inputs, changing courses, curricula, programs. To (new learning vision): Re-visioning movements are student-focused, on what students need to know and be able to do; competencies and outcomes are central.
• From: Coverage of domain material and skills is via individualistic, passive, and teacher-centered modes of instruction. To: Increasing emphasis on hands-on, minds-on methods, authentic learning, and high concept/high touch capabilities.
• From: Students are approached and viewed as absolute knowers. To: Students are approached and viewed as independent and contextual knowers.
• From: Students are encouraged to develop problem-solving abilities. To: Students are encouraged to develop problem-solving and problem-posing abilities.
• From: Teaching of skills does not lead to flexible application of those skills. To: Teaching of portable skills occurs.
• From: Skills and competencies are highly compartmentalized. To: Information literacy, technology fluency, and domain knowledge are blended.
• From: Students treated as passive receivers of information and unengaged learners. To: Students treated as big-picture thinkers and critically engaged doers.

We highlight a number of the words and phrases in the New Learning Vision entries of Table 1 that seem important and challenging when using SRSs. For example, authentic learning means that course goals and activities should be anchored in meaningful, real-life assignments and assessments—a special challenge for an instructor constructing multiple-choice questions. Likewise, problem-posing abilities and big-picture thinkers appear to challenge some of the ways that SRSs have been used.

The complexity of student learning goals described in the AAC&U and EDUCAUSE reports requires rethinking traditional learning goals and classroom pedagogy. One of the tools available for faculty in designing goals for student learning along the lines suggested by these reports is Anderson and Krathwohl's (2001) revision of Bloom's Taxonomy of Educational Objectives. This seminal work redefines the original six components of cognitive learning objectives into active verbs that more accurately reflect what occurs in classrooms and a description of the activities that comprise deep learning (Biggs, 1996). The Cognitive Process Dimension consists of:

Remembering: Retrieving, recognizing, and recalling relevant knowledge from long-term memory.
Understanding: Constructing meaning from oral, written, and graphic messages through interpreting, exemplifying, classifying, summarizing, inferring, comparing, and explaining.
Applying: Carrying out or using a procedure through executing or implementing.
Analyzing: Breaking material into constituent parts, determining how the parts relate to one another and to an overall structure or purpose through differentiating, organizing, and attributing.
Evaluating: Making judgments based on criteria and standards through checking and critiquing.
Creating: Putting elements together to form a coherent or functional whole; reorganizing elements into a new pattern or structure through generating, planning, or producing.
(Anderson & Krathwohl, 2001, pp. 67-68)

It immediately becomes clear that the goals advocated by Moore, et al. (2008) are more aligned with cognitive skills beyond the "Remembering" level. The challenge for instructors is to facilitate students' ability to apply, analyze, evaluate, and create.

Instructors address the various levels of learning through the way they frame the questions they ask. For example, the stem for a Remembering-level question when using clickers is typically expressed as "Which is the…?" Such questions typically have a single desired answer drawn from the lecture or readings and would fit with the Current Prevalent Outlook entries in Table 1's arrangement of competencies. Such applications are, unfortunately, least likely to produce the kind of transformational learning envisioned for 21st-century students.

There are a number of recent examples of using SRSs for Understanding, Applying, and Analyzing (Beatty, Leonard, Gerace, & Dufresne, 2006; Beuckman, Rebello, & Zollman, 2006; Crossgrove & Curran, 2008). One stem for these items might be in the form of a survey ("What do you think is…?") that can provide the basis for discussions and exploring the elements of a concept.

Also, multiple-choice questions can be developed to assess students' conceptual understanding. Here is an example of a conceptual understanding question in astronomy:

Given a picture of the waning quarter moon, what portion of the moon is illuminated by the sun? (a) 25%; (b) 50%; (c) 75%; (d) 100%; (e) none of those (Wilson, 2008)

In order to select the correct response to the question above, students must analyze the question to determine what is being asked, then select the relevant information and discard any information that is not relevant. In this case, "what portion of the moon is illuminated by the sun" is relevant, while remembering the definitions of waning moon and quarter moon is not relevant and needs to be ignored. The sun always illuminates 50% of the moon regardless of what we can see from earth (except during lunar eclipses). Incorrect responses should be diagnostic to an instructor (e.g., if a student selects 25% as the answer, the instructor has some indication that the student was misled by the word quarter). Of course, designing and testing multiple-choice questions that tap into understanding, analyzing, and evaluating requires time and knowledge both of the content and of the types of errors and misconceptions of students.

The recent increase in the number of articles that address more complex learning using SRSs is encouraging. For example, after finding no significant differences in student performance on final examinations between science classes using clickers and those that did not, Crossgrove and Curran (2008) implemented the use of clickers for high-level thinking questions. As a result, their students did significantly better on multiple-choice questions that tested content taught using clickers than on content taught without clickers. These differences held across the categories of remembering, comprehending, and applying/analyzing questions.

There is still limited evidence of SRS applications in the Creating domain. Of course, students need an SRS technology that would afford them the tools to generate responses rather than just recognize response options. Beuckman, Rebello, and Zollman (2006) describe how their students used handheld computers to create answers. Their students had higher grades in physics courses when using handheld computers to construct responses compared to using a clicker system to select responses. PDAs, laptop computers, and even mobile phones offer the tools for producing individually created responses, given a system that can capture and display them.

An especially interesting use of questions and SRSs to foster higher levels of learning is described by Beatty, Gerace, Leonard, and Dufresne (2006) in their research on effective questioning when teaching physics. Their model of Question-Driven Instruction incorporates students' questions as the primary activity (rather than instructors' lecturing). This approach fundamentally changes the nature of teaching so that what the instructor says and does is guided by the students' questions and responses. The authors describe this approach as agile teaching because instructors are led by students' questions rather than instructors' questions leading students. When instructors do ask questions, they include questions that assess conceptual knowledge, procedural knowledge, and metacognitive knowledge.
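Returning to the astronomy example above, where incorrect options are meant to be diagnostic, the sketch below (Python; every misconception label other than the 25% case described in the text is an assumption added for illustration) shows one way an instructor might annotate the question's options and summarize what a class's responses suggest.

```python
from collections import Counter

# The moon-phase question above, with each option tagged by the misconception it may
# signal. Only the 25% diagnosis comes from the article; the other labels are assumptions.
moon_question = {
    "answer": "b",
    "options": {
        "a": ("25%", "misled by the word 'quarter'"),
        "b": ("50%", "correct: the sun always lights half the moon"),
        "c": ("75%", "assumed diagnosis: confused illuminated with visible portion"),
        "d": ("100%", "assumed diagnosis: thinks the sun lights the whole moon"),
        "e": ("none of those", "assumed diagnosis: unsure of the underlying concept"),
    },
}

def diagnose(responses, question):
    """Report, for each chosen option, how many students picked it and what that suggests."""
    lines = []
    for key, count in Counter(responses).most_common():
        text, note = question["options"][key]
        lines.append(f"{key} ({text}): {count} students - {note}")
    return "\n".join(lines)

print(diagnose(["a", "b", "b", "a", "d", "b", "a"], moon_question))
```

A summary like this lets the distribution of wrong answers, not just the percentage correct, guide the next few minutes of class.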

An instructor's guide to the effective use of personal response systems ("clickers") in teaching (CU Science Education Initiative & UBC Carl Wieman Science Education Initiative, 2008) provides the following guidance:

By far the most common failing is to make questions that are too easy. In this situation, students often see the questions as simply a quiz to keep them awake, and they are annoyed that they had to spend money on clickers only for this purpose. There is also some indication that, in the absence of any other form of feedback, easy questions may mislead students as to the difficulty of the questions they would expect to see on the exam. In extensive surveys of students in many different classes, students overwhelmingly see challenging questions as the most useful for their learning. Our observations have also supported the conclusions that such questions result in greater learning. (p. 7)

3. Beyond Linear Teaching—How Do We Promote the Learning We Desire?

In addition to redefining the learning outcomes to emphasize deep learning, we must consider how to redefine the pedagogy that guides our teaching. Consider this set of teaching tips for using an SRS (Robertson, 2000), which focuses exclusively on presentation techniques while almost ignoring elements related to promoting deep learning: keep questions short to optimize legibility, have no more than five answer options, do not make your questions overly complex, keep voting straightforward, allow time for discussion when designing your presentation, rehearse your presentation to ensure that it will run smoothly, provide clear instruction to your audience, and so on. Horowitz (2006) suggests a similar list.

Such guidance may be helpful, but it comes from a teacher-centered orientation and does little to address pedagogical issues, and just introducing new technologies does not address the pedagogical issue of how to improve student learning (Caldwell, 2007). As noted by Moore, et al. (2008), "... change starts with an examination of pedagogy and domain content if new learning is the aim. Only then can useful technologies and teaching strategies be matched to best achieve desired learning outcomes" (p. 3).

For the past two decades the Seven Principles for Good Practice in Undergraduate Education (Chickering & Gamson, 1987) has been a cornerstone of pedagogical renewal. The principles, with more than fifty years of evidence supporting their effectiveness (Sorcinelli, 1991), are the product of a group of scholars in higher education who incorporated a series of reports and research on student learning into a comprehensive set of principles. The principles have been expanded and applied to numerous classroom settings in the years since they were originally published (Hatfield, 1995; Fink, 2003; Richlin, 2006).

In Table 2 below, the seven principles for good practice (Chickering & Gamson, 1987) are listed in the left-hand column. The principles have been reordered, from those we judged to be most frequently employed with SRSs (e.g., emphasizing time on task by allocating sufficient time for learning) through those more qualitative principles which have less frequently been integrated with SRS use (e.g., communicating high expectations and respecting diverse talents and ways of knowing). It is interesting to analyze commonly used technology tools through the lens of the seven principles for good practice. For example, an instructor who is presenting information using PowerPoint often fails to address any of the seven principles. On the other hand, an instructor who has students work in teams to develop PowerPoint presentations is likely addressing several principles.

The Cognitive Process Dimension proposed by Anderson and Krathwohl (2001) is listed across the top of Table 2, with selected examples of the type of task that they suggest for each category.

The cells in Table 2 have been shaded, with the darkest cells representing the most commonly described applications of SRSs—such as using the multiple-choice recall questions from the database that comes with the textbook. This use of an SRS promotes the Remembering domain and typically employs Time-on-Task and Frequent Feedback. Of course, it is critical thinking and deep learning (i.e., the Understand, Analyze, and Evaluate levels) that represent the recommended goals for students (Association of American Colleges and Universities, 2002; Moore, et al., 2008) and the great challenge for instructors.

Much of the current literature about SRSs centers around two of the principles for good practice (i.e., giving prompt feedback and emphasizing time on task). For example, improved student attendance, improved student motivation, and improved student engagement (Horowitz, 2006; Kaleta & Joosten, 2007) are the most frequently cited benefits. To a large extent, these benefits are quantitative changes—related to attending and responding as opposed to being passive or even absent. While this is a positive outcome, the impact of using SRSs on deep learning is more elusive.

Table 2. Principles for good practice (Chickering and Gamson, 1987) mapped against cognitive learning outcomes (Anderson & Krathwohl, 2001)

Cognitive Learning Outcomes (columns): Remember (recognize, recall); Understand (interpret, classify, summarize, compare); Apply (execute, implement); Analyze (differentiate, organize, attribute); Evaluate (check, critique); Create (plan, produce).

Principles for Good Practice (rows, reordered): Emphasizes Time-on-Task; Gives Prompt Feedback; Encourages Faculty-Student Contact; Develops Student Cooperation; Encourages Active Learning; Communicates High Expectations; Respects Diverse Talents/Ways of Knowing.

Shading of the cells indicates how frequently each combination appears in the literature:
• Remember column, upper rows (Emphasizes Time-on-Task through Encourages Faculty-Student Contact): most common applications of SRSs.
• Remember column, Develops Student Cooperation and Encourages Active Learning rows: some use is evident in the literature.
• Understand, Apply, Analyze, and Evaluate columns: emerging uses of SRSs, primarily in select science programs—significant potential for increased application.
• Create column: areas for potential expansion with technologies which enable students to generate responses with SRSs.
• Communicates High Expectations and Respects Diverse Talents/Ways of Knowing rows: potential uses of SRSs remain largely untapped.

The reordered Seven Principles follow, with a brief description of how SRSs have been employed in higher education classrooms.

• Emphasizing time on task. "Allocating realistic amounts of time means effective learning for students and effective teaching for faculty" (Chickering & Gamson, 1987, p. 3). SRSs enable faculty to increase attendance and student participation (e.g., time on task). In addition, if SRSs can increase interest and engagement with the course content, students may increase time on task beyond the scheduled course time. On the other hand, taking class time for using SRSs typically reduces coverage of content (Caldwell, 2007). The end result can be more learning for less teaching.
• Giving prompt feedback. Here is the heart of what an SRS provides—prompt feedback on students' responses. But again, prompt feedback needs to provide the appropriate elements in order to be effective. While feedback using SRSs is appropriately prompt, feedback should also be directive and specific (Benson, Mattson, & Adler, 1995). That is to say, feedback needs to contain the guidance students need in order to independently construct correct responses in the future. This frequently means coupling Just-in-Time Teaching (JiTT) with SRS feedback in order to ensure students understand the concepts being taught (Caldwell, 2007).
• Encouraging student-faculty contact. While this principle is a core practice, its application with technology is typically cited as contacts that occur beyond the classroom setting. As such, communication technologies are reported as the most effective ways to encourage increased contact (Chickering & Ehrmann, 1996).
• Encouraging cooperation among students. Using SRSs has a clear, positive impact on students' learning of complex material when paired with student cooperation. Mazur's (1997) work in physics on the impact of peer instruction provides a model for combining effective pedagogy and SRSs to increase student learning (Beatty, Gerace, Leonard, & Dufresne, 2006).
• Encouraging active learning. Judson and Sawada (2002) conclude that when learning gains are seen, SRSs have been used to promote active learning. But for active learning to be effective, it must be more than just clicking on a multiple-choice answer. Chickering and Ehrmann (1996) emphasize that to implement active learning with technology, students "must talk about what they are learning, write reflectively about it, relate it to past experiences, and apply it to their daily lives" (p. 4). With regard to SRSs, just using a clicker is not sufficient to engage students in the principles of active learning.
• Communicating high expectations. Although communicating high expectations is not intrinsically woven into SRSs, the use of complex questions that require critical thinking can provide the opportunity to model deep learning within one's discipline (CU Science Education Initiative & UBC Carl Wieman SEI, 2008).

• Respecting diverse talents and ways of learning. This principle offers a significant opportunity for faculty to capture voices from the back of the class (e.g., students who do not contribute to whole-class discussions and who wait to judge the prevailing class sentiment before offering an opinion). Although SRSs using the format of multiple-choice responses do not capture all the diverse ideas present, a thoughtful instructor can, over time, integrate acceptable alternative explanations into the response options. Also, an instructor who uses the SRS to survey students or as a formative assessment should benefit from the additional information provided by the technology. For example, an instructor can instantly display the variety of views held by a class on an issue to be discussed in class and respectfully acknowledge the validity of that diversity. Thus far, there has been little research on how the use of SRSs impacts learning through this principle.

Beatty, et al. (2006) provide an excellent model of Table 2 in action. Instructors' questions which reflect effective pedagogical principles (Chickering & Gamson, 1987) can develop higher levels of learning (Anderson & Krathwohl, 2001). These authors suggest posing the following array of questions and having students respond with an SRS:

• Survey students' background knowledge and attitudes related to concepts in the lesson.
• Display response patterns and discuss evidence of perceptions and prior knowledge.
• Explore areas of disagreement and confusion.
• Identify relationships between similarities and differences in the concepts.
• Based on evidence of understanding, elaborate on applying, analyzing, and evaluating the concept.
• Examine how understanding of the concept relates to other contexts and concepts.

Teaching in this manner places student understanding at the core of classroom activities and, as such, an SRS becomes essential to agile teaching and deep learning.

4. SRSs as a System for Faculty Development

Although most of the literature appropriately describes the impact of SRSs on student learning, several authors hint at the potential of the technology as a tool for faculty learning (Banks, 2006). For example, while feedback to students is a critical step in learning, the feedback that an instructor receives about student misconceptions and error patterns in reasoning provides a potentially rich source of information about how one might need to restructure readings, lectures, and course activities to address student difficulties. Without the frequent interactions and systematic display of students' responses, many of the patterns of students' misinterpretation, lack of prior knowledge, or incomplete logic would go unnoticed. Beatty, et al. (2006) note that effective questions can identify students' beliefs and prior knowledge about a topic and instantly communicate and store these results.

Data from SRSs also might be used to facilitate the Scholarship of Teaching and Learning (SoTL). One of the primary obstacles to instructors publishing SoTL research is their perception that capturing evidence of student learning is difficult and time consuming (Dangel, 2004). The ability of SRSs to collect and store evidence of students' understanding and its changes over time provides an effective tool for researchers to document student learning. Examples of how SRSs can be used as a tool to capture evidence of student learning in order to evaluate pedagogical approaches are beginning to emerge (Kennedy, Cutts, & Draper, 2006).
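As a sketch of the kind of stored evidence described above, the snippet below (Python; the export format, field names, and data are hypothetical rather than taken from any real SRS) aggregates a logged set of responses so an instructor or SoTL researcher can see how the percentage of correct answers on a concept changes over time.

```python
import csv
from collections import defaultdict
from io import StringIO

# Hypothetical export from an SRS gradebook: one row per stored student response.
log = """student,question,week,response,correct
s01,moon_phase,3,a,0
s02,moon_phase,3,b,1
s03,moon_phase,3,d,0
s01,moon_phase,6,b,1
s02,moon_phase,6,b,1
s03,moon_phase,6,b,1
"""

def percent_correct_by_week(raw_csv):
    """Aggregate stored responses into percent correct per question per week."""
    totals = defaultdict(lambda: [0, 0])  # (question, week) -> [correct, attempts]
    for row in csv.DictReader(StringIO(raw_csv)):
        key = (row["question"], int(row["week"]))
        totals[key][0] += int(row["correct"])
        totals[key][1] += 1
    return {key: round(100 * c / n) for key, (c, n) in sorted(totals.items())}

for (question, week), pct in percent_correct_by_week(log).items():
    print(f"{question}, week {week}: {pct}% correct")
```

Kept over a semester, this kind of record is the sort of low-cost evidence of changing student understanding that the SoTL work cited above calls for.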

5. Guidance and Challenges in Using SRSs

Probably the biggest challenge for effectively implementing SRSs is the time and effort needed to restructure courses and develop suitable, complex questions. With current commercial systems, this means developing multiple-choice questions with an appropriate array of choices. Ideally, many of the multiple-choice questions address deep learning and include response options that provide diagnostic information about students' thinking and reasoning (e.g., lack of prior knowledge, incomplete reasoning, or faulty conclusions).

The research on SRSs has produced specific pedagogical recommendations which are in line with the educational goals noted above. First, instructors should ask questions that are appropriately challenging and require thinking skills beyond just remembering information. Second, students' questions, or even students' responses to instructors' questions, can effectively serve as the roadmap for teaching. As Shulman (1999) notes, unless we take seriously what a student already knows, teaching becomes very difficult. Students' questions and, in many cases, their incorrect responses can provide this information. Using SRSs to survey students' opinions and collect information about what students know requires instructors to adjust the way they engage students. Agile teaching (Beatty, et al., 2006) and just-in-time teaching (Caldwell, 2007) replace the preset PowerPoint presentation and lecture in the paradigm shift from teaching to learning.

Faculty often notice that using SRSs results in covering less material (Caldwell, 2007). Yet the potential of deeper learning as a result of reduced coverage is in line with pedagogical guidelines which call for emphasizing Big Ideas rather than coverage (Moore, et al., 2008; Wiggins & McTighe, 2005). Covering less while teaching more effectively is certainly acceptable when there is clear evidence that learning has increased. Or, as Gardner emphasizes, "The greatest enemy of understanding is coverage" (Gardner, 1993, p. 24).

The cost of the technology to use an SRS, whether clickers, PDAs, or hand-held computers, usually is borne by students. As such, they must be convinced that the cost is worth the benefit. Some textbook companies provide clickers at a reduced cost when faculty adopt their textbooks. Also, if a clicker is used in multiple classes, students will more likely accept the additional cost. And, as with any technology, increased support is needed because technical glitches are to be expected. Some students will forget to bring their clickers to class or lose them, resulting in lost time and possible frustration (Lowery, 2005).

SRS technology offers great promise for engaging students and promoting learning, but only if we use this tool with sound pedagogical principles to promote learning that will be meaningful to students in the future. Although the glass is only half-full, it is still being filled as researchers share new classroom applications for this emerging tool that are based on sound pedagogical practices.

References

Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching and assessing: A revision of Bloom's taxonomy of educational objectives (Complete ed.). New York: Longman.
Angelo, T., & Cross, P. (1993). Classroom assessment techniques: A handbook for college teachers (2nd ed.). San Francisco: Jossey-Bass.
Association of American Colleges and Universities. (2002). Greater expectations: A new vision for learning as a nation goes to college. Retrieved June 29, 2008 from http://www.aacu.org/gex/index.cfm
Barr, R. B., & Tagg, J. (1995). From teaching to learning: A new paradigm for undergraduate education. Change, 27(6), 12-26.

Banks, D. A. (2006). Audience response systems in higher education: Applications and cases. Hershey, PA: Information Science.
Biggs, J. (1996). Enhancing teaching through constructive alignment. Higher Education, 32, 347-364.
Beatty, I. D., Leonard, W. J., Gerace, W. J., & Dufresne, R. J. (2006). Designing effective questions for classroom response system teaching. American Journal of Physics, 74(1), 31-39.
Benson, Mattson, & Adler (1995). In Hatfield, S. R. (1995), The seven principles in action: Improving undergraduate education. Bolton, MA: Anker Publishing Co.
Beuckman, J., Rebello, N. S., & Zollman, D. (2006). Impact of a classroom interaction system on student learning. AIP Conference Proceedings, 883(1), 129. Retrieved June 13, 2008 from http://web.phys.ksu.edu/papers/2006/Beuckman_PERC2006.pdf
Caldwell, J. E. (2007). Clickers in large classrooms: Current research and best-practice tips. CBE-Life Sciences Education, 6, 9-20.
Carnevale, D. (2005). Run a class like a game show: 'Clickers' keep students involved. Chronicle of Higher Education, 51(42), B3.
Chickering, A. W., & Ehrmann, S. E. (1996). Implementing the seven principles: Technology as lever. American Association for Higher Education Bulletin, October, 3-6.
Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. The Wingspread Journal, 9(2), 1-16.
Crouch, C. H., & Mazur, E. (2001). Peer instruction: Ten years of experience and results. American Journal of Physics, 69, 970-977.
Crossgrove, K., & Curran, K. L. (2008). Using clickers in non-majors- and majors-level biology courses: Student opinion, learning, and long-term retention of course material. CBE-Life Sciences Education, 7, 146-154.
CU Science Education Initiative & UBC Carl Wieman Science Education Initiative. (2008). An instructor's guide to the effective use of personal response systems ("clickers") in teaching. Vancouver: University of British Columbia. Retrieved June 13, 2008 from http://www.cwsei.ubc.ca/resources/files/Clickers_Final_Version_04_08.pdf
Dangel, H. (2004, November). A faculty learning community for the scholarship of teaching and learning. Professional Organization Development Network Conference, Montreal, Canada.
El-Rady, J. (2006). To click or not to click: That is the question. Innovate: Journal of Online Education, 2(4). Retrieved June 28, 2008 from http://www.innovateonline.info/index.php?view=article&id=171
Fink, L. D. (2003). Creating significant learning experiences: An integrated approach to designing college courses. San Francisco: Jossey-Bass.
Fies, C., & Marshall, J. (2006). Classroom response systems: A review of the literature. Journal of Science Education and Technology, 15.
Gardner, H. W. (1993). Educating for understanding. The American School Board Journal, July, 20-24.
Guthrie, R. W., & Carlin, A. (2004). Waking the dead: Using interactive technology to engage passive listeners in the classroom. Proceedings of the Tenth Americas Conference on Information Systems, New York. Retrieved June 13, 2008 from http://www.mhhe.com/cps/docs/CPSWP_WakindDead082003.pdf
Hansen, C. R. (2007). An evaluation of a student response system used at Brigham Young University (Master's thesis). Retrieved June 13, 2008 from http://contentdm.lib.byu.edu/ETD/image/etd2127.pdf

Hatfield, S. R. (1995). The seven principles in action: Improving undergraduate education. Bolton, MA: Anker Publishing Co.
Horowitz, H. M. (2006). ARS revolution: Reflections and recommendations. In D. A. Banks (Ed.), Audience response systems in higher education: Applications and cases (pp. 53-63). Hershey, PA: Information Science.
Judson, E., & Sawada, D. (2002). Learning from past and present: Electronic response systems in college lecture halls. Journal of Computers in Mathematics and Science Teaching, 21(2), 167-181.
Kaleta, R., & Joosten, T. (2007). Student response systems: A University of Wisconsin study of clickers (Research Bulletin, Issue 6). Boulder, CO: EDUCAUSE Center for Applied Research. Retrieved June 13, 2008 from http://connect.educause.edu/Library/ECAR/StudentResponseSystemsAUn/40166
Kennedy, G. E., Cutts, Q., & Draper, S. W. (2006). Evaluating electronic voting systems in lectures: Two innovative methods. In D. A. Banks (Ed.), Audience response systems in higher education: Applications and cases (pp. 155-174). Hershey, PA: Information Science.
Lowery, R. C. (2005). Teaching and learning with interactive student response systems: A comparison of commercial products in the higher-education market. Annual meeting of the Southwestern Social Science Association, New Orleans. Retrieved June 20, 2008 from http://people.uncw.edu/lowery/SWSSA%20ms.pdf
Mazur, E. (1997). Peer instruction: A user's manual. Upper Saddle River, NJ: Prentice Hall.
Moore, A. H., Fowler, S. B., Jesiek, B. K., Moore, J. F., & Watson, C. E. (2008). Learners 2.0? IT and 21st-century learners in higher education (Research Bulletin, Issue 7). Boulder, CO: EDUCAUSE Center for Applied Research. Retrieved June 13, 2008 from http://connect.educause.edu/Library/ECAR/Learners20ITand21stCentur/46519
Richlin, L. (2006). Blueprint for learning: Constructing college courses to facilitate, assess, and document learning. Sterling, VA: Stylus.
Robertson, L. J. (2000). Twelve tips for using a computerized interactive audience response system. Medical Teacher, 22(3), 237-240.
Sorcinelli, M. D. (1991). Research findings on the seven principles. In A. W. Chickering & Z. F. Gamson (Eds.), Applying the seven principles of good practice in undergraduate education (pp. 13-25). San Francisco: Jossey-Bass.
Shulman, L. S. (1999). Taking learning seriously. Change, 13(4), 11-17.
Weimer, M. (2002). Learner-centered teaching: Five key changes to practice. San Francisco: Jossey-Bass.
Wiggins, G. P., & McTighe, J. (2005). Understanding by design (2nd ed.). Alexandria, VA: Association for Supervision and Curriculum Development.
Wilson, J. (2008). Using clickers in a large astronomy class. PRISM spring workshop, Georgia State University.

Contact the Authors

Harry L. Dangel, Ph.D.
Georgia State University
Email: hdangel@gsu.edu

Charles Xiaoxue Wang, Ph.D.
Georgia State University
Email: xwang10@gsu.edu
