
This article was downloaded by: [Laurentian University]

On: 23 April 2013, At: 03:34


Publisher: Routledge
Informa Ltd Registered in England and Wales Registered Number: 1072954 Registered
office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK

Educational Psychology: An
International Journal of Experimental
Educational Psychology
Publication details, including instructions for authors and
subscription information:
http://www.tandfonline.com/loi/cedp20

Using higher order thinking questions to foster critical thinking: a classroom study

Jerrold E. Barnett & Alisha L. Francis

Psychology, Sociology, and Counseling, Northwest Missouri State University, Maryville, MO, USA
Version of record first published: 05 Dec 2011.

To cite this article: Jerrold E. Barnett & Alisha L. Francis (2012): Using higher order thinking
questions to foster critical thinking: a classroom study, Educational Psychology: An International
Journal of Experimental Educational Psychology, 32:2, 201-211

To link to this article: http://dx.doi.org/10.1080/01443410.2011.638619

PLEASE SCROLL DOWN FOR ARTICLE

Full terms and conditions of use: http://www.tandfonline.com/page/terms-and-conditions

This article may be used for research, teaching, and private study purposes. Any
substantial or systematic reproduction, redistribution, reselling, loan, sub-licensing,
systematic supply, or distribution in any form to anyone is expressly forbidden.
The publisher does not give any warranty express or implied or make any representation
that the contents will be complete or accurate or up to date. The accuracy of any
instructions, formulae, and drug doses should be independently verified with primary
sources. The publisher shall not be liable for any loss, actions, claims, proceedings,
demand, or costs or damages whatsoever or howsoever caused arising directly or
indirectly in connection with or arising out of the use of this material.
Educational Psychology
Vol. 32, No. 2, March 2012, 201-211

Using higher order thinking questions to foster critical thinking: a classroom study

Jerrold E. Barnett* and Alisha L. Francis

Psychology, Sociology, and Counseling, Northwest Missouri State University, Maryville, MO, USA

(Received 28 February 2011; final version received 3 November 2011)

To determine if quizzes containing higher order thinking questions are related to critical thinking and test performance when utilised in conjunction with an immersion approach to instruction and effort-based grading, sections of an Educational Psychology course were assigned to one of three quizzing conditions. Quizzes contained factual multiple-choice questions, factual essay questions or essay items requiring higher order thinking. Critical thinking was measured with a pre-test/post-test design and the Watson-Glaser Critical Thinking Appraisal (Short Form). Classroom learning was assessed via multiple-choice and essay tests. Critical thinking increased equally across all sections. The section receiving higher order thinking quizzes performed significantly better than the other two sections on both the multiple-choice and essay portions of the classroom tests. The implications of these findings are discussed in the context of methodological approaches to encouraging critical thinking.
Keywords: cognition; instruction; higher education; critical thinking

Critical thinking has been the focus of scholarly contemplation for thousands of
years (Morgan, 1995). More recently, critical thinking instruction has been empha-
sised within the higher education curriculum (The Association of American Colleges
and Universities, 2011). Critical thinking is also included in curricular guidelines in
a number of disciplines (e.g. American Psychological Association, 2007) and is the
focus of a number of institution-wide efforts (e.g. Halonen, Westcott, & Stanny, 2007; Ross & Pellett, 2009). The weight placed upon critical thinking reflects, in part, the demands of our information-rich society (Angeli & Valanides, 2009; Halpern & Nummedal, 1995). Those demands are reflected, in turn, in the value employers place on the associated skills (Hart Research Associates, 2010).
Despite the rich history and extensive emphasis, however, questions remain
about the degree to which higher education is effective in establishing critical think-
ing competencies. Employers indicate critical thinking is among the areas in which
higher education should place greater emphasis (Hart Research Associates, 2010).
Indeed, Bok (2006) cites the failure to develop critical thinking as one of the major disappointments of contemporary higher education. Bok's critique centres on the
discrepancy between the emphasis placed upon critical thinking by faculty and
administrators and the amount of classroom time spent trying to facilitate it. In

*Corresponding author. Email: barnett@nwmissouri.edu

ISSN 0144-3410 print/ISSN 1469-5820 online


© 2012 Taylor & Francis
http://dx.doi.org/10.1080/01443410.2011.638619
http://www.tandfonline.com

contrast, Pascarella and Terenzini (2005) present a number of more optimistic findings in an extensive literature review summarising numerous evaluation studies of critical thinking and how the college experience facilitates its development.
The discrepancies in perceptions of critical thinking instruction may be the result of the various levels of analysis from which the topic can be considered. Pascarella and Terenzini's (2005) review focuses primarily at the programme level, especially
on the impact on the general education package. The failures to which Bok (2006)
makes reference may be the product of a lack of systematic, empirically based
methodology for teaching critical thinking within individual classes. This may ini-
tially appear inconsistent with the attention given to the topic within the teaching
literature, which includes a wealth of suggestions for structuring that instruction
(e.g. Dunn, Halonen, & Smith, 2008; Hooks, 2010). The challenge, however, is to
establish empirically validated methods (Paul, 1985). As Nummedal and Halpern (1995, p. 4) note, "learning to think critically is not an inevitable outcome of instruction". The purpose of the present study is to contribute to the empirical understanding of critical thinking instruction, using a quasi-experimental, pre-test/post-test design to test a method of critical thinking instruction.

The nature of critical thinking


Scholarly discussion and research generally represent three perspectives on critical
thinking: those related to inherent characteristics (dispositional perspective), those
related to cognitive development (emergent perspective) and those related to behav-
iours and abilities (state perspective) (Halonen, 1995). The interaction of those three
perspectives, in turn, offers a number of insights for critical thinking instruction.
Dispositional perspectives consider traits related to inherent intellectual ability
(Clifford, Boufal, & Kurtz, 2004; Halonen, 1995) as well as affective and cognitive
characteristics such as open-mindedness, attentiveness and inquisitiveness (Clifford
et al., 2004; Ennis, 1985; Facione, 1990). Recent empirical investigations have focused
on the role of need for achievement and related dispositions, with mixed results.
Although a focus on innate characteristics may be discouraging to instructors seeking to contribute to students' critical thinking skills, evidence suggests that students can benefit from critical thinking instruction (Angeli & Valanides, 2009; Halpern, 1998).
Emergent approaches consider critical thinking from the perspective of cognitive
development (Halonen, 1995). Work from this perspective is consistent with Piaget's (1969) and Perry's (1970) work, which suggests patterns of thinking change as individuals interact with the environment and encounter novel situations. To the degree that general education curriculums and other programmatic initiatives represent novel situations for students, the emergent perspective may also explain the improvements noted in Pascarella and Terenzini's (2005) review.
State perspectives focus on the behaviours required of a critical thinker and the
associated abilities (Halonen, 1995). Scholars working from this perspective high-
light a number of skills necessary for critical thinking, including interpretation, anal-
ysis, evaluation, inference, ability to clarify, decision-making and problem solving
(Ennis, 1985; Facione, 1990). The state perspective is represented in a number of
discussions of instructional techniques (e.g. Angeli & Valanides, 2009; Dunn et al.,
2008), including the present work. Consistent with this conceptualisation, our operational definition of critical thinking includes the ability to assess and apply evidence in order to support or evaluate an argument (Watson & Glaser, 2008).

Teaching for critical thinking


Within the state perspective, critical thinking skills are viewed as the product of
cognitive processes which are distinctly different from simple repetition and memo-
risation of details (Morgan, 1995). This conceptualisation is consistent with the cog-
nitive domain in Blooms (1956) taxonomy, which places higher level thinking at
one end of a continuum opposed by understanding and recall of basic facts. Ennis (1985, p. 47) notes that critical thinking incorporates "a good deal of the directly practical side of higher order thinking".
Although Bloom's taxonomy addresses the more general cognitive domain (Ennis, 1985), the continuum offers a number of insights when critical thinking is considered as a specific higher order process. The role of factual knowledge in lower order thinking suggests that embedding critical thinking instruction within subject matter instruction, via infusion and immersion approaches (Ennis, 1989), provides necessary details for the cognitive processes. As McPeck (1981, p. 3) notes, "thinking is always thinking about something", and the subject matter instruction provides that something. The practice of embedding critical thinking also has the potential to discourage the oversimplification of both critical thinking and subject matter knowledge (Facione, 1990). Embedding critical thinking within subject matter instruction is also consistent with an instructional emphasis on ideas, rather than skills and processes (Prawat, 1991).
Although there are a number of benefits to embedding critical thinking instruction within subject matter instruction, that instruction must be deliberately incorporated. As Halpern and Nummedal (1995, p. 82) note, improvements in critical thinking do not "readily develop as a spontaneous by-product of standard content area instruction". Similarly, instruction without appropriate structure will be less likely to empower students for continued, meaningful learning (Angeli & Valanides, 2009; Mayer, 2004). The role of cognitive skill within critical thinking underscores the importance of structuring educational experiences to provide independent practice in carrying out the related cognitive activities (Mayer, 2004; McKeachie, 1992). At the same time, simply incorporating critical thinking within a lecture format may encourage passive memorisation of linear sequences of information (Maiorana, 1990).
The time and effort required to incorporate critical thinking may seem daunting for instructors (Gray, 1993), creating a tension between subject matter instruction and critical thinking (Coles, 1993). Discussions of the immersion approach to critical thinking instruction, in which related tasks designed to cultivate critical thinking are included but explicit critical thinking instruction is not, suggest one approach to minimising demands on the instructor (Ennis, 1989; Prawat, 1991). When coupled with an immersion approach, writing tasks provide an opportunity to deliberately incorporate critical thinking structures and practice within subject matter instruction without requiring excessive time on the part of the instructor. Walvoord (n.d., p. 6) suggests a strategic approach to grading writing tasks in order to minimise time demands, including providing intensive comment-based grading on only a few assignments. Effort-based grading, which has been associated with increased learning (Swinton, 2010), also minimises time demands for the instructor while encouraging students to practise the thinking skills required by the assignment.
Renaud and Murray (2007, 2008) suggest that the use of written responses to
higher order thinking questions warrants further attention. When a writing task

requires students to work with content in a unique way, different from the presentation in the textbook or lecture, thinking is required (Gray, 1993). The process of writing engages thinking in a way that allows for examination and revision as the writer explores the meanings of and relationships among ideas (Appelbee, 1984; Schumacher & Nash, 1991). As such, writing represents both the process and the result of critical thinking (Bean, 1996). This is consistent with findings indicating that students who were required to take essay exams were significantly more likely to indicate that their critical thinking abilities improved (Tsui, 1999).
Not all writing tasks, however, contribute to critical thinking. Renaud and Murray's (2007, 2008) experimental investigations resulted in significant gains in subject-specific critical thinking. In contrast, critical thinking about general topics did not increase significantly. They discuss a number of limitations, however, created by the experimental design. A classroom-based, correlational study suggested that the use of higher order thinking questions is associated with gains in general critical thinking ability (Renaud & Murray, 2007).


Building upon Renaud and Murray's (2007, 2008) findings, the purpose of the
present study is to examine the effects of written higher order thinking questions on
critical thinking skills when utilised in conjunction with an immersion approach to
critical thinking instruction and effort-based grading. Critical thinking was opera-
tionalised in terms of the ability to assess and apply evidence in order to support or
evaluate an argument. We predicted that students who completed quizzes requiring
higher order thinking would demonstrate greater improvements in critical thinking
skills compared to students who completed quizzes that did not require higher order
thinking. In addition, given that the critical thinking practice was embedded within
subject matter instruction, we predicted students who completed quizzes requiring
higher order thinking would score higher on tests of subject matter knowledge com-
pared to students who completed quizzes that did not require higher order thinking.

Method
We employed a classroom-based, quasi-experimental, pre-test, post-test design to test
the predictions detailed above. Although the lack of random assignment to course
section calls into question some issues related to internal validity, use of intact classes
allows for improvements in ecological validity relative to experimental designs.

Design
Three sections of Educational Psychology were randomly assigned to one of three
experimental conditions. All sections were taught by the same instructor (the first author) and were as similar as possible. The variable that was systematically manipulated was the format of questions included in online quizzes, which were delivered
via a course management system. The quizzes always included content from the
textbook chapter assigned for the upcoming week. One section (A) was assigned
multiple-choice items that measured factual information in the chapter. A second
section (B) was assigned quizzes consisting of two or three essay questions that
required critical thinking about the assigned reading. A third section (C) was also
assigned quizzes with essay questions, but their items required only factual knowl-
edge of the reading. Instructional methods and classroom management techniques
were held constant across sections in order to reduce threats to validity.

The Watson-Glaser Critical Thinking Appraisal (Short Form) (Watson & Glaser, 2006) was administered to all students in attendance during the first week of the semester and again during the last week of classes. Students took either four (Section C, which met on Tuesdays and Thursdays for 75 min classes) or five (Sections A and B, which met on Mondays, Wednesdays and Fridays for 50 min each time) classroom tests.

Participants
A total of 147 students completed the Educational Psychology course during the fall
of 2008, when the study was conducted. There were 47 students in Section A, 49 in B
and 51 students in Section C. Most students enrolling in this course were sophomore or junior education majors. The students were predominantly female and white. Educational Psychology is usually the first psychology course taken by these students.
All students enrolled in the course were presented with an informed consent
statement containing an invitation to participate and all accepted. A few students
enrolled and later dropped the class: those students were not included in any analy-
sis. The number in the analyses presented below varies because some students did
not complete both the pre-test and the post-test of the Watson-Glaser, or skipped
one of the classroom tests.

Measures
Critical thinking
Students were administered the Watson-Glaser Critical Thinking Appraisal (Short Form). The test has 40 multiple-choice items, measures five subscales, and takes about 30 min to complete. Alpha reliabilities for the subscales ranged from .23 to .78. The reliability for the full scale was .71. Given the low reliabilities of some of the subscales, only total scores were analysed for this paper.
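The reliabilities above are internal-consistency (Cronbach's alpha) coefficients. As an illustrative sketch of how such a coefficient is computed from a respondents-by-items score matrix (the item responses below are invented for the example, not data from this study):

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for scores given as a list of per-respondent item lists."""
    k = len(items[0])                                  # number of items
    columns = list(zip(*items))                        # per-item score columns
    item_var_sum = sum(variance(col) for col in columns)
    total_var = variance([sum(row) for row in items])  # variance of total scores
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Invented example: five respondents answering four dichotomous items
responses = [
    [1, 1, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
print(round(cronbach_alpha(responses), 2))  # → 0.74
```

Low alpha values, such as those observed for some subscales here, indicate that the items within a subscale do not covary strongly enough to support score-level interpretation, which is why only total scores were analysed.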

Quizzes
Ten quizzes were administered to each section at approximately one week intervals
throughout the semester. All quizzes were delivered via a course management sys-
tem and taken outside of class. Question items were designed to sample the domain
of information discussed during class in the previous week with each section receiv-
ing questions over the same content areas. For example, considerable attention was
given to the theories of Piaget and Vygotsky in the developmental unit. Section A
received a multiple-choice question requiring students to identify a characteristic
which would be consistent with a student in the preoperational stage. A question on the quiz for Section B read: "Comparing Piaget and Vygotsky: Which of these theories do you think will be more useful in your teaching? Why?" In responding, the selection of the theory they feel will be more useful represents the "argument" referred to in the operationalisation of critical thinking, while the explanation of their selection corresponds to the "support" element, requiring the assessment and application of related information. Section C received a question directing them to name the stages and list a characteristic of students at each stage. Naming and listing information assesses the ability to accurately recall information without requiring the student to formulate and support an argument or assess evidence.

In the interest of grading efficiency, higher order thinking questions were graded
based on effort demonstrated. Given the example question above, full credit would
be given if a student picked a theory and made an appropriate argument consistent
with the theory. Similarly, feedback was rarely given to individual students and only
when their responses were in serious need of correction. Some feedback was given
during class time if several students made similar mistakes. Points for factual essay
questions were assigned based on the proportion of correct information included in
the response.

Classroom tests
Two sections (A and B) were administered five classroom tests, worth 45 points each. The tests consisted of 20 multiple-choice items, which were mostly factual, and five 5-point essays, which were a mixture of factual, application and critical thinking questions. Given class scheduling, Section C had four tests, with 25 multiple-choice items and six or seven essays per test. Questions evenly sampled the
domain of information discussed during the prior unit and were designed to evalu-
ate the ability to recall information, apply the related knowledge and demonstrate
critical thinking about the material. The multiple-choice items were identical across
the sections, but the essay questions varied to prevent cheating across sections.
Although the general topics and level of thinking required were consistent, the exact
content varied slightly. For example, one section answered a question about Piaget's theory and how to apply it to their specific area of teaching. Another section had an essay about applying Vygotsky's theory. All essay responses were scored by the
course instructor using a scoring rubric. A check on scoring accuracy was provided
by returning the exams to students, going over the scoring rubric for each essay
item, and allowing students to question the grading.

Results
Critical thinking
A repeated measures ANOVA, with section as a between-subjects factor and pre- and post-test scores on the Watson-Glaser as repeated measures, yielded only one significant finding. There was a significant improvement across the semester, F(1, 106) = 8.38, p = .005, η² = .073. Scores rose from a mean of 24.62 on the pre-test (SD = 5.63) to 25.46 (SD = 5.13) on the post-test.
There were no significant differences among the three sections, F(2, 106) = .06, p = .94, nor was there a significant section by time interaction, F(2, 106) = .24, p = .79. Pre-test and post-test scores, by scale and by section, are presented in Table 1.
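The effect sizes reported in this section appear to be partial eta-squared values, which can be recovered from an F statistic and its degrees of freedom. A minimal sketch, under that assumption, using the formula η² = F·df_effect / (F·df_effect + df_error):

```python
def partial_eta_squared(f_stat, df_effect, df_error):
    """Partial eta-squared recovered from an F statistic and its degrees of freedom."""
    return f_stat * df_effect / (f_stat * df_effect + df_error)

# Time effect on the Watson-Glaser reported above: F(1, 106) = 8.38
print(round(partial_eta_squared(8.38, 1, 106), 3))  # → 0.073
```

The same calculation reproduces the classroom-test effect size reported below (F(2, 142) = 15.90 gives η² ≈ .18), which supports the assumption that these are partial eta-squared values.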

Classroom tests
Due to unequal numbers of items on the different tests, all scores were converted to
percentages for analysis. Descriptive statistics by group are presented in Table 2.

Table 1. Descriptive statistics on the Watson-Glaser critical thinking scale by section.

                    Section A          Section B          Section C
                    (MC quizzes)       (CT questions)     (Factual questions)
                    M       SD         M       SD         M       SD
Inference
  Pre-test          3.76    1.86       3.58    1.62       3.73    1.64
  Post-test         3.61    1.62       3.43    1.53       3.72    1.52
Assumptions
  Pre-test          4.24    2.05       4.63    2.45       4.71    2.41
  Post-test         5.44    2.20       5.23    2.14       5.68    2.00
Deductions
  Pre-test          6.18    1.77       5.89    1.82       6.02    1.60
  Post-test         6.31    1.94       6.10    1.75       6.02    2.63
Interpretation
  Pre-test          3.85    1.42       3.47    1.64       4.15    1.35
  Post-test         3.74    1.50       4.05    1.36       3.84    1.40
Evaluation
  Pre-test          6.50    1.23       6.68    1.42       6.41    1.48
  Post-test         6.31    1.52       6.62    1.51       6.22    1.72
Entire measure
  Pre-test          24.53   5.18       24.26   5.63       25.02   6.09
  Post-test         25.41   4.93       25.48   4.95       25.50   5.57

Table 2. Means and standard deviations for classroom test scores.

                               Multiple-choice scores    Essay scores
                               n      M      SD          M      SD
Section A (MC quizzes)         47     .66    .10         .67    .17
Section B (CT questions)       49     .72    .10         .75    .16
Section C (Factual questions)  51     .62    .08         .70    .14

Multiple-choice scores
There was a significant difference across the sections on the multiple-choice portion of the exams, F(2, 142) = 15.90, p = .001, η² = .18. Post-hoc tests, using the Bonferroni solution to control for Type I error, found that Section B (higher order thinking questions) scored higher than Section A (multiple-choice quizzes), t(91) = 3.29, p = .001, η² = .105, and Section C (factual essay quizzes), t(93) = 5.78, p = .001, η² = .254. Sections A and C did not differ significantly from each other, t(98) = 1.95, p = .054.
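The Bonferroni solution controls the familywise Type I error rate by dividing the overall alpha level across the comparisons performed. A minimal sketch, assuming the conventional α = .05 and the three pairwise section comparisons made here:

```python
def bonferroni_threshold(family_alpha, n_comparisons):
    """Per-comparison significance threshold under the Bonferroni correction."""
    return family_alpha / n_comparisons

# Three pairwise comparisons (A vs. B, A vs. C, B vs. C) at familywise alpha = .05
print(round(bonferroni_threshold(0.05, 3), 4))  # → 0.0167
```

Under this stricter threshold, the Sections A and C comparison (p = .054) remains non-significant, as it would even at the uncorrected .05 level.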

Essay scores
A similar pattern was found for the essay scores on classroom tests, F(2, 142) = 5.31, p = .006, η² = .07. Section B scored significantly higher than Section A, t(91) = 3.15, p = .002, η² = .002. Section B did not differ significantly from Section C, t(93) = 1.57, p = .12. Once again, Sections A and C did not differ significantly from each other, t(98) = 1.82, p = .072.

Discussion
The results of the present study parallel those of Renaud and Murray (2007, 2008). Students who completed quizzes containing higher order thinking questions did not score significantly higher in general thinking ability, as measured by the Watson-Glaser. The section receiving higher order thinking quizzes did, however, perform significantly better than the other two sections on both the multiple-choice and essay portions of the classroom tests. While noting that learning is not limited to the specific context in which it takes place, McPeck (1990) suggests that the particular subject that is the topic of critical thinking generates unique standards for that thinking. The pattern of results may suggest the standards for thinking developed in the domain of Educational Psychology, focusing primarily upon application of theories and research findings to classrooms, do not generalise to the domains represented by the Watson-Glaser scores.
The findings also suggest that the immersion approach may not be an effective method for contributing to students' critical thinking skills. It may be productive to investigate the effects of writing tasks in conjunction with deliberate and explicit critical thinking instruction. Experimental manipulation of the effort-based grading practice can also provide insights into the effects of that practice. Students may have perceived the effort-based grading as representative of relatively low performance expectations. As a result, they may have experienced lower levels of motivation to engage with the material (Figlio & Lucas, 2004).
Scores of general critical thinking ability did increase significantly across the semester for all participants. The size of these gains was somewhat larger than typical gains reported in Pascarella and Terenzini (2005). One possible explanation is a testing effect, since students were retested only four months after the initial test. It is also possible that the gains represent real improvement consistent with the premise of the emergent perspective on critical thinking. Many of the participants were sophomores at a regional public university, many of the students are first generation college students, and the school has a strong commitment to general education. It is likely that many factors contributed to the development of critical thinking skills, and quizzes requiring critical thinking could be one contributing factor. Given estimations that cognitive skill proficiency requires 100 h of instruction and practice (Anderson, 1982), cumulative activities across the curriculum may contribute to an increase in general critical thinking skills independent of specific content areas. Similarly, consistent with emergent theories of cognitive development, instruction and practice across the curriculum may require students to interact with their environments and encounter novel situations in a manner which further contributes to increased proficiency. In contrast, the experimental dosage of 10 quizzes may have been too small to create a measurable difference between the conditions.
Effort-based grading and minimal feedback practices were employed in an effort to develop a method of critical thinking instruction which minimises demands on the instructor, but may also explain the lack of significant difference between the sections. Although effort-based grading has been associated with increased learning (Swinton, 2010), students may have perceived the standards as being relatively low. As a result, they may have experienced lower levels of motivation to engage with the material. Renaud and Murray (2007) questioned the effects of motivation in developing critical thinking skills, noting that their experimental setting reduced participants' efforts to engage with the task. In the present study, motivation to consider alternative approaches to thinking may have been further influenced by the minimal feedback (Winne & Nesbit, 2010). Feedback is not only important for correcting student errors (Kang, McDermott, & Roediger, 2007), but it can also decrease metacognitive failures (Butler, Karpicke, & Roediger, 2008). In order to contribute to general critical thinking ability, written tasks requiring higher order thinking may need to be accompanied by higher standards for performance applied in conjunction with feedback related to critical thinking skills.
Higher order thinking quizzes did, however, facilitate classroom performance more than other types of quizzes. This could be considered surprising, given that the factual multiple-choice and essay quizzes were a better match with the classroom tests given. One possibility is that higher order thinking questions encourage students to think deeply about the material, facilitating semantic encoding. Factual questions may encourage rote memory, or, in online format, lead students to rely on an open book. Another possibility is that higher order thinking questions require students to review and rethink the factual material along with the more complex forms of thinking. A meaningful review should be beneficial, with or without critical thinking. It is important to remember that this was not a test of online quizzes, since there was no absolute control group without any quizzes. Rather, these data suggest that if quizzes are being considered as a technique for teaching subject matter knowledge, then higher order thinking questions seem to be the most effective way to go.
One limitation of the present research is that no random assignment to treatment group was possible. While the sections did not differ on the pre-test of the Watson-Glaser, and all sections were taught by one instructor using consistent instructional methods and classroom management techniques, it is always possible that the sections differed in significant ways other than the types of quizzes they took. This is especially true since one section met on Tuesdays and Thursdays with four exams and the others met on Mondays, Wednesdays and Fridays and had five exams.

References
American Psychological Association. (2007). APA guidelines for the undergraduate psychol-
ogy major. Washington, DC: American Psychological Association.
Anderson, J.R. (1982). Acquisition of cognitive skill. Psychological Review, 89, 369-406.
Angeli, C., & Valanides, N. (2009). Instructional effects on critical thinking: Performance on ill-defined issues. Learning and Instruction, 19, 322-334.
Appelbee, A.N. (1984). Writing and reasoning. Review of Educational Research, 54, 577-596.
Bean, J. (1996). Engaging ideas: The professor's guide to integrating writing, critical thinking, and active learning in the classroom. San Francisco, CA: Jossey-Bass.
Bloom, B. (1956). Taxonomy of educational objectives: The classification of educational goals. New York, NY: David McKay Company.
Bok, D. (2006). Our underachieving colleges: A candid look at how much students learn
and why they should be learning more. Princeton, NJ: Princeton University Press.
Butler, A.C., Karpicke, J.D., & Roediger, H.L., III (2008). Correcting a metacognitive error: Feedback increases retention of low-confidence correct responses. Journal of Experimental Psychology: Learning, Memory & Cognition, 34, 918-928.
Clifford, J.S., Boufal, M.M., & Kurtz, J.E. (2004). Personality traits and critical thinking skills in college students: Empirical tests of a two-factor theory. Assessment, 11, 169-176.
Coles, M.J. (1993). Teaching thinking: Principles, problems and programmers. Educational
Psychology, 13, 333345.
Dunn, D., Halonen, J., & Smith, R. (2008). Teaching critical thinking in psychology: A
handbook of best practices. Malden, MA: Wiley-Blackwell.
Ennis, R.H. (1985). A logical basis for measuring critical thinking skills. Educational
Leadership, 43, 4448.
Ennis, R.H. (1989). Critical thinking and subject specificity: Clarification and needed research. Educational Researcher, 18, 4–10.
Facione, P.A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Newark, DE: American Philosophical Association.
Figlio, D.N., & Lucas, M.E. (2004). Do high grading standards affect student performance? Journal of Public Economics, 88, 1815–1834.
Gray, P. (1993). Engaging students' intellects: The immersion approach to critical thinking in psychology instruction. Teaching of Psychology, 20, 68–74.
Halonen, J.S. (1995). Demystifying critical thinking. Teaching of Psychology, 22, 75.
Halonen, J.S., Westcott, T.B., & Stanny, C.J. (2007). Strategies for assessing student learning in general education: The academic foundations model. Paper presented at the Association of American Colleges and Universities (AAC&U).
Halpern, D.F. (1998). Teaching critical thinking for transfer across domains. American Psychologist, 53, 449–455.
Halpern, D.F., & Nummedal, S.G. (1995). Closing thoughts about helping students improve how they think. Teaching of Psychology, 22, 82–83.
Hart Research Associates. (2010). Raising the bar: Employers' views on college learning in the wake of the economic downturn. Washington, DC: Hart Research Associates.
Hooks, B. (2010). Teaching critical thinking: Practical wisdom. New York, NY: Routledge.
Kang, S.H.K., McDermott, K.B., & Roediger, H.L. (2007). Test format and corrective feedback modify the effect of testing on long-term retention. European Journal of Cognitive Psychology, 19, 528–558.
Maiorana, V.P. (1990). The road from rote to critical thinking. Community Review, 11, 53–63.
Mayer, R. (2004). Should there be a three-strikes rule against pure discovery learning? The case for guided methods of instruction. American Psychologist, 59, 14–19.
McKeachie, W.J. (1992). Update. In D.J. Stroup & R.D. Allen (Eds.), Critical thinking: A collection of readings (p. 3). Dubuque, IA: Wm. C. Brown.
McPeck, J. (1981). Critical thinking and education. New York, NY: St. Martin's Press.
McPeck, J.E. (1990). Critical thinking and subject specificity: A reply to Ennis. Educational Researcher, 19, 10–12.
Morgan, W.R., Jr. (1995). Critical thinking – what does that mean? Journal of College Science Teaching, 24, 336–340.
Nummedal, S.G., & Halpern, D.F. (1995). Introduction: Making the case for "Psychologists teach critical thinking". Teaching of Psychology, 22, 4–5.
Pascarella, E., & Terenzini, P. (2005). How college affects students: A third decade of research. San Francisco, CA: Jossey-Bass.
Paul, R.W. (1985). Bloom's taxonomy and critical thinking instruction. Educational Leadership, 42, 36–39.
Perry, W. (1970). Forms of intellectual and ethical development in the college years: A scheme. New York, NY: Holt, Rinehart and Winston.
Piaget, J. (1969). Judgment and reasoning in the child. London: Routledge & K. Paul.
Prawat, R.S. (1991). The value of ideas: The immersion approach to the development of thinking. Educational Researcher, 20, 3–10.
Renaud, R.D., & Murray, H.G. (2007). The validity of higher-order questions as a process indicator of educational quality. Research in Higher Education, 48, 319–351.
Renaud, R.D., & Murray, H.G. (2008). A comparison of a subject-specific and a general measure of critical thinking. Thinking Skills and Creativity, 3, 85–93.
Ross, S., & Pellett, T. (2009). Improving critical thinking skills: A university-wide initiative. Paper presented at the Association of American Colleges and Universities (AAC&U).
Schumacher, G.M., & Nash, J.G. (1991). Conceptualizing and measuring knowledge change due to writing. Research in the Teaching of English, 25, 67–96.
Swinton, O.H. (2010). The effect of effort grading on learning. Economics of Education Review, 29, 1176–1182.
The Association of American Colleges and Universities. (2011). Liberal education and America's promise (LEAP). Retrieved from http://www.aacu.org/leap/index.cfm
Tsui, L. (1999). Courses and instruction affecting critical thinking. Research in Higher Education, 40, 185–200.
Walvoord, B.E. (n.d.). How to make grading fair, time-efficient, and conducive to learning. Retrieved from http://ctlgrading.project.mnscu.edu/vertical/Sites/%7BE4C565C9-1900-4ADC-8724-31FA8886F7BC%7D/uploads/%7B896CCE8D-7E15-437B-8527-EE064EF0D2C4%7D.PDF
Watson, G., & Glaser, E.M. (2006). Watson–Glaser critical thinking appraisal – short form. San Antonio, TX: Harcourt.
Watson, G., & Glaser, E.M. (2008). Watson–Glaser critical thinking appraisal short form manual. Upper Saddle River, NJ: Pearson Education.
Winne, P.H., & Nesbit, J.C. (2010). The psychology of academic achievement. Annual Review of Psychology, 61, 653–678.