Copyright © Taylor & Francis Group, LLC
Reading Achievement and Science Proficiency

JENNIFER G. CROMLEY
Temple University, Philadelphia, Pennsylvania, USA
Why Science?
Learning science is critical, not just for youth who wish to pursue
scientific careers but also for an informed citizenry (National Re-
search Council, 2007; Trefil, 2007). Students who wish to pursue
careers in fields as diverse as biology, medicine, forensics, aero-
nautics, and pharmaceutical research need a solid foundation in
scientific knowledge and the scientific method. Even high school
students who do not have a goal of entering a scientific field need
basic scientific knowledge in order to make decisions about per-
sonal health and to make voting decisions that are well informed
by scientific knowledge (e.g., about stem cell research, oil drilling,
and global warming; Trefil, 2007).
The need for science literacy has changed in recent decades
as national economies worldwide have become more intercon-
nected; both American and international employers seek em-
ployees across the globe who have advanced scientific and tech-
nical knowledge. For this reason, international comparisons of
students’ math and science skills and knowledge—such as TIMSS
and PISA—receive widespread attention. A small body of research
suggests that performance on science measures is relatively de-
pendent on reading skills (e.g., Dempster & Reddy, 2007). If this is
true, then one important prerequisite for raising science achieve-
ment may be to increase reading comprehension. That is, science
education reform efforts may need to intensively target reading
skills in order to increase science achievement.
Few researchers have reported data specifically on the relationship of reading comprehension to science proficiency, but three data sets from the Programme for International Student Assessment (PISA) include three samples, each with hundreds of thousands of 15-year-old students in more than 40 countries, who completed both reading literacy and science literacy assessments in 2000, 2003, and 2006 (Organisation for Economic Co-operation and Development [OECD], 2007).
Study 1
Method
PARTICIPANTS
Participants were 174,896 fifteen-year-old students from 43 countries across the world.2 They were sampled using a two-stage sampling process.
MEASURES
Students completed demographic questionnaires and pencil-
and-paper measures of science literacy and reading literacy. Each
measure was written by a different expert panel and was designed
around domain-specific content, process, and situation dimen-
sions. In each of these measures, about one half of the questions
were multiple-choice and one half were short answer or essay
questions. Different combinations of items were grouped together
in nine different assessment booklets such that the entire test
lasted 2 hours for each student.
One hundred and forty-one reading prompts were created,
representing 270 minutes of testing time; however, each student
only completed a subset of items during a 60-minute reading
testing period. The prompts included narrative, expository, ar-
gumentative, instructional/injunctive, and descriptive passages,
charts, diagrams, advertisements, maps, tables, matrices, graphs,
and forms. The tests were designed to measure information re-
trieval (20%), broad understanding of information (20%), in-
terpretation development (30%), content reflection (15%), and
form reflection (15%). Therefore, the first three aspects account
for understanding and use of information (70% of items), while
the remaining aspects require interpretation and evaluation (30%
of items). The document’s content was categorized according to
its intended use, including reading for private use (letters, fiction,
biographies; 28%), public use (official documents, public events;
28%), workplace (16%), and education (28%). Forty-five percent
of items required students to construct their own responses, 45%
were multiple choice, and the remaining 10% required students
PROCEDURE
Participants in 32 countries completed measures in
February–October 2000, and participants in an additional
11 countries completed identical measures in 2002. Within
each country, measures had to be completed within a 6-week
period. Measures were given by trained test administrators in
each country. The reading and science assessments each took
one hour (OECD, 2002).
DATA ANALYSIS
PISA 2000 used measurement methods similar to those of the U.S. National Assessment of Educational Progress (NAEP). PISA 2000 used a cluster rotation measurement design (sometimes called matrix sampling), in which each student answered a subset of items from a pool of questions. That is, not all students answered all possible questions. All measures were Rasch scaled (an Item Response Theory model).
Results
[Table: sample N, weighted N, and reading and science means (M) with standard errors (SE), by country. Note. All statistics are calculated at the student level within countries, use student-level weights, and are calculated using BSS resampling weights on 80 replications.]
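The replicate-weight machinery behind these standard errors can be sketched in a few lines. The function below is a minimal illustration of my own, not PISA's official code; it assumes NumPy arrays holding scores, final student weights, and the 80 replicate weights, and uses Fay's variant of balanced repeated replication with k = 0.5, as documented for PISA.

```python
import numpy as np

def brr_standard_error(scores, weights, rep_weights, fay_k=0.5):
    """Standard error of a weighted mean via balanced repeated replication.

    scores      : (n,) student scores
    weights     : (n,) final student weights
    rep_weights : (n, G) replicate weights (G = 80 for PISA)
    Fay's variant: var = sum_r (t_r - t)^2 / (G * (1 - k)^2)
    """
    full = np.average(scores, weights=weights)
    g = rep_weights.shape[1]
    # Re-estimate the statistic once per replicate weight
    reps = np.array([np.average(scores, weights=rep_weights[:, r])
                     for r in range(g)])
    variance = np.sum((reps - full) ** 2) / (g * (1 - fay_k) ** 2)
    return np.sqrt(variance)
```

With k = 0.5 and 80 replicates, the divisor reduces to 20, which is the constant that appears in the PISA manuals.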
[Table: student-level reading-science correlation (r) with standard error (SE), by country. Note. All statistics are conducted at the student level within countries, use student-level weights, and are calculated using BSS resampling weights on 80 replications.]
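Per-country coefficients of this kind are weighted Pearson correlations between the two score vectors. A minimal sketch (my own illustration, assuming NumPy arrays of reading scores, science scores, and student weights):

```python
import numpy as np

def weighted_pearson(x, y, w):
    """Pearson correlation of x and y under student weights w."""
    w = np.asarray(w, dtype=float)
    mx = np.average(x, weights=w)
    my = np.average(y, weights=w)
    # Weighted covariance and variances around the weighted means
    cov = np.average((x - mx) * (y - my), weights=w)
    var_x = np.average((x - mx) ** 2, weights=w)
    var_y = np.average((y - my) ** 2, weights=w)
    return cov / np.sqrt(var_x * var_y)
```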
[Figure: student-level reading-science correlation by country (y-axis range 0.65–0.95).]
Discussion
Given that these reading and science tests are both written as-
sessments, it is perhaps not surprising that students’ scores on
the two measures are highly correlated. The magnitude of the
correlation—a mean of .840—however, may be surprising. This
correlation did vary across the 43 countries, with a tighter linking
of reading and science proficiency in the countries with higher
mean reading scores.
Study 2
In study 2, I replicated these results with the PISA 2003 data, col-
lected with an entirely new set of 15-year-old participants, in a sim-
ilar set of countries, using very similar measures and procedures.
Method
PARTICIPANTS
Participants were 276,192 fifteen-year-old students from 41
countries across the world,4 sampled using the same two-stage pro-
cess as in PISA 2000.5 The sample size in each country had a me-
dian of 4,704 (with a range of 332 for Liechtenstein to 29,983 for
Mexico). The weighted sample size had a median of 111,831 per
country (with a range of 338 for Liechtenstein to 3,147,089 for the
United States), for a total of 19,155,864 students. Participants are
described in detail in Learning for Tomorrow’s World: First Results from
PISA 2003 (OECD, 2004). Sampling procedures are described in
detail in the PISA 2003 Technical Report (OECD, 2005a).
MEASURES
Students completed demographic questionnaires and mea-
sures of mathematics literacy, science literacy, and reading liter-
acy, with content (but not format) slightly modified from those
used in PISA 2000. The 28 reading prompts included narrative,
expository, and descriptive passages, charts, one map, graphs, and
forms. The tests were designed to measure searching for informa-
tion (literal comprehension; 29% of items) and interpreting and
evaluating texts (inferential comprehension; 71% of items). The
documents included recreational, workplace, citizenship, and educational uses.
PROCEDURE
Procedures were almost identical to those used in PISA 2000.
Participants completed measures in February–October 2003, and
within each country measures had to be completed within a 30-
day period (OECD, 2005a).
DATA ANALYSIS
As in the earlier study, PISA 2003 used a cluster rotation mea-
surement design with Rasch scaled items. Resampling weights spe-
cific to the PISA 2003 data set were used to analyze the plausi-
ble value scores (see the PISA 2003 Data Analysis Manual; OECD,
2005b). All analyses were conducted using the SPSS syntax sup-
plied by PISA for analyzing these two-stage matrix sampled Rasch
scale score data. Data analyses using resampling were the same as those described for Study 1.
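Because each PISA proficiency is reported as a set of plausible values, any statistic is computed once per plausible value and the results are then pooled. A hedged sketch of the pooling step (Rubin's rules, as described in the PISA Data Analysis Manual; the function name and argument layout are my own):

```python
import numpy as np

def combine_plausible_values(estimates, sampling_vars):
    """Pool one statistic computed per plausible value (Rubin's rules).

    estimates     : list of M point estimates, one per plausible value
    sampling_vars : matching list of BRR sampling variances
    Returns the pooled estimate and its total standard error.
    """
    m = len(estimates)
    point = np.mean(estimates)
    within = np.mean(sampling_vars)          # average sampling variance
    between = np.var(estimates, ddof=1)      # imputation variance across PVs
    total = within + (1 + 1 / m) * between   # Rubin's total variance
    return point, np.sqrt(total)
```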
Results
Country N Weighted N Reading M SE Rel. Science M SE Rel.
Australia 12,551 235,591 525.43 2.13 .83 525.05 2.10 .84
Austria 4,597 85,931 490.69 3.76 .87 490.99 3.44 .88
Belgium 8,796 111,831 506.99 2.58 .86 508.83 2.48 .85
Brazil 4,452 1,952,253 402.80 4.58 .76 389.62 4.35 .73
Canada 27,953 330,436 527.91 1.75 .83 518.75 2.02 .83
Czech Republic 6,320 121,183 488.54 3.46 .84 523.25 3.38 .82
Denmark 4,218 51,741 492.32 2.82 .82 475.22 2.97 .82
Finland 5,796 57,884 543.46 1.64 .80 548.23 1.92 .80
France 4,300 734,579 496.19 2.68 .84 511.23 2.99 .84
Germany 4,660 884,358 491.36 3.39 .88 502.34 3.64 .88
Greece 4,627 105,131 472.27 4.10 .78 481.02 3.82 .76
Hong Kong (China) 4,478 72,484 509.54 3.69 .85 539.50 4.26 .85
Hungary 4,765 107,044 481.87 2.47 .81 503.28 2.77 .81
Iceland 3,350 3,928 491.75 1.56 .83 494.75 1.47 .82
Indonesia 10,761 1,971,476 381.59 3.38 .70 395.04 3.21 .68
Ireland 3,880 54,850 515.48 2.63 .87 505.39 2.69 .85
Italy 11,639 481,521 475.66 3.04 .85 486.45 3.13 .84
Japan 4,704 1,240,054 498.11 3.92 .84 547.64 4.14 .85
Korea 5,444 533,504 534.09 3.09 .83 538.43 3.54 .84
Latvia 4,627 33,643 490.56 3.67 .80 489.13 3.89 .78
Liechtenstein 332 338 525.08 3.58 .84 525.18 4.33 .84
Luxembourg 3,923 4,080 479.42 1.48 .86 482.76 1.50 .85
Macao (China) 1,250 6,546 497.64 2.16 .78 524.68 3.03 .79
Mexico 29,983 1,071,650 399.72 4.09 .78 404.90 3.49 .72
The Netherlands 3,992 184,943 513.12 2.85 .87 524.37 3.15 .88
New Zealand 4,511 48,638 521.55 2.46 .86 520.90 2.35 .86
Norway 4,064 52,816 499.74 2.78 .82 484.18 2.87 .81
Poland 4,383 534,900 496.61 2.88 .82 497.78 2.86 .81
Portugal 4,608 96,857 477.57 3.73 .84 467.74 3.46 .81
Russian Federation 5,974 2,153,373 442.20 3.94 .76 489.29 4.14 .74
Serbia 4,405 68,596 411.74 3.56 .78 436.37 3.50 .76
Slovakia 7,346 77,067 469.16 3.12 .83 494.86 3.71 .82
Spain 10,791 344,372 480.54 2.60 .81 487.09 2.61 .80
Sweden 4,624 107,104 514.27 2.42 .83 506.12 2.72 .81
Switzerland 8,420 86,491 499.12 3.28 .84 512.98 3.69 .84
Thailand 5,236 637,076 419.92 2.81 .76 429.06 2.70 .74
Tunisia 4,721 150,875 374.62 2.81 .72 384.68 2.56 .69
Turkey 4,885 481,279 440.97 5.79 .82 434.22 5.89 .83
United Kingdom 9,535 698,579 507.01 2.46 .86 518.40 2.52 .87
United States 5,456 3,147,089 495.19 3.22 .87 491.26 3.08 .85
Uruguay 5,835 33,775 434.15 3.43 .77 438.37 2.90 .75
Note. All statistics are calculated at the student level within countries, use student-level
weights, and are calculated using BSS resampling weights on 80 replications.
Discussion
Country r SE
Australia .860 .017
Austria .871 .016
Belgium .784 .016
Brazil .813 .012
Canada .845 .018
Czech Republic .881 .012
Denmark .708 .017
Finland .836 .028
France .837 .017
Germany .718 .023
Greece .847 .015
Hong Kong (China) .820 .011
Hungary .779 .018
Iceland .878 .021
Indonesia .822 .020
Ireland .851 .030
Italy .866 .009
Japan .801 .016
Korea .745 .021
Latvia .707 .018
Liechtenstein .892 .020
Luxembourg .800 .014
Macao (China) .858 .011
Mexico .788 .014
The Netherlands .839 .019
New Zealand .736 .015
Norway .809 .020
Poland .849 .015
Portugal .754 .019
Russian Federation .599 .021
Serbia .796 .014
Slovakia .836 .043
Spain .641 .017
Sweden .876 .011
Switzerland .781 .017
Thailand .860 .017
Tunisia .871 .016
Turkey .784 .016
United Kingdom .796 .014
United States .813 .012
Uruguay .845 .018
Note. All statistics are conducted at the student level within countries, use student-
level weights, and are calculated using BSS resampling weights on 80 replications.
[Figure: correlation between reading and science at the student level, by country (y-axis range 0.55–0.90).]
Study 3
In Study 3, I replicated these results with the PISA 2006 data, also
collected with a new set of 15-year-old participants, in a similar set
of countries with similar measures and procedures.
Method
PARTICIPANTS
Participants were 398,750 fifteen-year-old students from 57 countries across the world,6 sampled using the same two-stage process as in PISA 2000 and 2003.7 Reading data for the United States were not usable because of an error in printing the instructions for the test booklets, leaving a sample of 393,139 students from
56 countries. The sample size in each of these countries had a median of 4,909 (with a range of 339 for Liechtenstein to 30,971 for Mexico).
MEASURES
Students completed demographic questionnaires and mea-
sures of mathematics literacy, science literacy, and reading liter-
acy, with content and format slightly modified from those used
in PISA 2000 and PISA 2003. The PISA 2006 measures were de-
signed to require less reading to more clearly distinguish between
scientific literacy and reading literacy.
PROCEDURE
Procedures were almost identical to those used in PISA 2000
and 2003. Participants completed measures in March–November
2006 (OECD, 2007).
DATA ANALYSIS
As in the earlier studies, PISA 2006 used a cluster rotation
measurement design with Rasch scaled items. Data analyses using resampling were the same as those described for Study 1.
Results
[Table: sample N, weighted N, and reading and science means (M) with standard errors (SE), by country. Note. All statistics are calculated at the student level within countries, use student-level weights, and are calculated using BSS resampling weights on 80 replications.]
[Table: student-level reading-science correlation (r) with standard error (SE), by country. Note. All statistics are conducted at the student level within countries, use student-level weights, and are calculated using BSS resampling weights on 80 replications.]
reading scores and science scores than did countries with lower
mean reading performance (see Figure 3).
As before, I ran a second regression of the reading-science correlation on the mean reading score and the standard error of that mean. Standard errors correlated significantly with mean reading score (r[54] = −.275, p = .041) but did not contribute significantly to the overall regression (F[2, 53] = 19.872, MSE = .002, p < .001, R² = .429, adjusted R² = .407); the beta weight for SE was nonsignificant (t < 1).
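The fit reported here is an ordinary least-squares regression of 56 country-level correlations on two predictors. A bare-bones sketch of such a fit (illustrative only; variable names are my own, and this omits any country weighting):

```python
import numpy as np

def ols(y, X):
    """OLS with an intercept; returns coefficients and R^2.

    y : (n,) outcome, e.g., per-country reading-science correlation
    X : (n,) or (n, p) predictors, e.g., mean reading score and its SE
    """
    X = np.asarray(X, dtype=float)
    if X.ndim == 1:
        X = X[:, None]
    X1 = np.column_stack([np.ones(len(y)), X])  # prepend intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    r2 = 1.0 - resid @ resid / np.sum((y - np.mean(y)) ** 2)
    return beta, r2
```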
Discussion
[Figure 3: student-level reading-science correlation plotted against mean reading score, by country (y-axis range 0.55–1.00).]
General Discussion
Acknowledgements
I thank the office of the Temple University Vice President for Re-
search for funds to support this research and Ulana Luciw and
Lindsey Snyder for editorial assistance with the manuscript.
Notes
1. Under NCLB, students must be tested at least once in each of three grade spans (once in late elementary school, grades 3–5; once in middle school, grades 6–9; and once in high school, grades 10–12).
2. The year and month of birth are included in the data set, but for privacy
reasons neither the exact age nor the date of data collection are given.
3. In schools with fewer than 35 eligible students, all 15-year-old students participated.
4. As in PISA 2000, the year and month of birth are included in the data set, but
for privacy reasons neither the exact age nor the date of data collection are
given.
5. As in PISA 2000, in schools with fewer than 35 eligible students, all 15-year-old students participated.
6. As in PISA 2000, the year and month of birth are included in the data set, but
for privacy reasons neither the exact age nor the date of data collection are
given.
7. As in PISA 2000, in schools with fewer than 35 eligible students, all 15-year-old students participated.
References
Best, R., Rowe, M., Ozuru, Y., & McNamara, D. (2005). Deep-level comprehension of science texts: The role of the reader and the text. Topics in Language Disorders, 25(1), 65–83.
Braunger, J., Donahue, D. M., Evans, K., & Galguera, T. (2005). Rethinking prepa-
ration for content area teaching: The reading apprenticeship approach. San Francisco:
Jossey-Bass.
Bray, G. B., Pascarella, E. T., & Pierson, C. T. (2004). Postsecondary education
and some dimensions of literacy development: An exploration of longitudinal
evidence. Reading Research Quarterly, 39(3), 306–330.
Cromley, J. G., Snyder, L. E., Luciw, U. A., & Tanaka, J. (2009, June). Testing the
fit of the DIME model of reading comprehension with biology text. Paper presented at
the meeting of the Society for the Scientific Study of Reading, Boston, MA.
Demps, D. L., & Onwuegbuzie, A. J. (2001). The relationship between eighth-
grade reading scores and achievement on the Georgia High School Gradua-
tion Test. Research in the Schools, 8(2), 1–9.
Dempster, E. R., & Reddy, V. (2007). Item readability and science achievement
in TIMSS 2003 in South Africa. Science Education, 91(6), 906–925.
Graesser, A. C., Singer, M., & Trabasso, T. (1994). Constructing inferences during
narrative text comprehension. Psychological Review, 101(3), 371–395.
Guthrie, J. T., Wigfield, A., Barbosa, P., Perencevich, K. C., Taboada, A., Davis,
M. H., et al. (2004). Increasing reading comprehension and engagement
Otero, J., León, J. A., & Graesser, A. C. (Eds.). (2002). The psychology of science text
comprehension. Hillsdale, NJ: Lawrence Erlbaum.
Purcell-Gates, V., Duke, N. K., & Martineau, J. A. (2007). Learning to read and write genre-specific text: Roles of authentic experience and explicit teaching. Reading Research Quarterly, 42(1), 8–45.
Ritchie, S., Rigano, D., & Duane, A. (2008). Writing an ecological mystery in class: Merging genres and learning science. International Journal of Science Education, 30(2), 143–166.
Romance, N. R., & Vitale, M. R. (2008, March). Science IDEAS: A knowledge-based
model for accelerating reading/literacy through in-depth science learning. Paper pre-
sented at the annual meeting of the American Educational Research Associa-
tion, New York.
Roth, K. J., Drucker, S. L., Garnier, H. E., Lemmens, M., Chen, C., Kawanaka,
T., et al. (2006). Teaching science in five countries: Results from the TIMSS
1999 video study statistical analysis report. Washington, DC: U.S. Department
of Education. Retrieved June 16, 2006, from http://www.nces.ed.gov
Shymansky, J. A., Yore, L. D., & Anderson, J. O. (2004). Impact of a school dis-
trict’s science reform effort on the achievement and attitudes of third- and
fourth-grade students. Journal of Research in Science Teaching, 41(8), 771–790.
Smith, M. C. (2000). The real-world reading practices of adults. Journal of Literacy
Research, 32(1), 25–52.
Stanovich, K. (2000). Progress in understanding reading: Scientific foundations and
new frontiers. New York: Guilford.
Trefil, J. (2007). Why science? New York: Teachers College Press.
van den Broek, P., Rapp, D. N., & Kendeou, P. (2005). Integrating memory-
based and constructionist processes in accounts of reading comprehension.
Discourse Processes, 39(2), 299–316.
Yore, L. D., Bisanz, G. L., & Hand, B. M. (2003). Examining the literacy com-
ponent of science literacy: 25 years of language arts and science research.
International Journal of Science Education, 25(6), 689–725.
Yore, L. D., & Treagust, D. F. (2006). Current realities and future possibilities:
Language and science literacy—Empowering research and informing instruc-
tion. International Journal of Science Education, 28(2–3), 291–314.
Zmach, C. C., Sanders, J., Patrick, J. D., Dedeoglu, H., Charbonnet, S., Henkel, M., et al. (2007). Infusing reading into science learning. Educational Leadership, 64(4), 62–66.