W. Ian O’Byrne
A Dissertation Proposal
Doctor of Philosophy
at the
University of Connecticut
2010
W. Ian O’Byrne, Dissertation Proposal 2
Abstract
Is it possible to teach adolescents how to think more critically about online information
by having them create online content? This investigation explores that question. First, it will
construct and validate an instrument designed to measure the critical evaluation skills used by
students while reading online. Second, it will examine the use of an instructional model that teaches adolescents to critically evaluate online information. Third, it will examine the effectiveness of this
instructional model in building the dispositions needed by students when reading online. Fourth,
it will examine themes and patterns that exist as groups of students comprehend and construct
online information.
Increasingly students are using the Internet to obtain information about both personal and
academic topics (Lubans, 1999; Jones & Madden, 2002; Shackleford, Thompson & James,
1999). Along with this trend there is a growing concern about the dubious nature of online
information, and users’ ability to evaluate this information (Alexander & Tate, 1999; Flanagin &
Metzger, 2000; Browne, Freeman & Williamson, 2000). Research shows that students are
"frequently fooled" when viewing online content (Leu et al., 2007; Johnson & Kaye, 1998; Rieh
& Belkin, 1998). Particularly, students are not able to judge the validity of a website, even when
given procedures to do so (Lubans, 1998, 1999). Because of the increasing use of the Internet in
our students’ lives, it is important that they become well versed in evaluating the validity and reliability of the information they encounter online.
The study will be conducted in three phases using multiple methods of data collection
and analysis. Phase one will guide students in an analysis of the argumentative techniques used
by online authors to manufacture sincerity, credibility, and relevance. Phase two will consist of
students constructing websites using the techniques that are used by authors of online
information. Phase three will encourage students to evaluate and critique websites constructed by
other students. Quantitative data will include students’ scores on instruments that will measure
the ability to critically evaluate online information and the dispositions of successful online
readers. Analysis of these data will be conducted using analysis of covariance to determine
posttest differences between the groups using the pretest as the covariate (Van Breukelen, 2006).
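The planned ANCOVA can be sketched with a small simulation. Every number below (group size, effect size, score distributions) is a hypothetical placeholder, not study data; the sketch only shows how a posttest difference between groups is estimated with the pretest as covariate:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100  # students per group, as planned in the proposal

# Simulated pretest/posttest critical-evaluation scores (illustrative only)
pretest = rng.normal(50, 10, 2 * n)
group = np.repeat([0.0, 1.0], n)           # 0 = control, 1 = intervention
true_effect = 5.0                          # hypothetical treatment gain
posttest = 10 + 0.8 * pretest + true_effect * group + rng.normal(0, 8, 2 * n)

# ANCOVA as a linear model: intercept, group indicator, pretest covariate
X = np.column_stack([np.ones(2 * n), group, pretest])
beta, *_ = np.linalg.lstsq(X, posttest, rcond=None)
intercept, adj_group_diff, pretest_slope = beta
print(f"adjusted posttest difference (intervention - control): {adj_group_diff:.2f}")
```

Adjusting for the pretest removes the portion of posttest variance that students brought with them, so the group coefficient estimates the treatment difference with greater precision than a raw posttest comparison.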
Qualitative data collection and analysis techniques will be used to enable evaluation process data
to complement the interpretation of quantitative results. Data analysis of qualitative data will
consist of a rigorous content analysis (Mayring, 2000) to inductively analyze (Patton, 2002)
focus group interviews, field observation data, and student artifacts. The expected findings will
contribute to theory development, research, and practice in the use of online information in
school settings.
Introduction
A generation of students that has grown up with the Internet (Pew Research Center, 2010) does not often critically examine the information it encounters (Alexander & Tate, 1999; Flanagin & Metzger, 2000; Browne, Freeman & Williamson, 2000; Bennett, Maton, & Kervin, 2008). This presents important challenges for educators as students increasingly rely on online information (Horrigan, 2010; Pew Research Center, 2010). Indeed, the Internet has
quickly become this generation’s defining technology for literacy, in part due to technology
facilitating access to an unlimited amount of online information and media (Rideout, Foehr, &
Roberts, 2010). Adolescents read online at a rate far greater than other segments of the world’s
population (Lenhart, Madden, & Hitlin, 2005). Moreover, adolescents’ consumption of digital
media for recreational use continues to increase: from 27 minutes per day in 1999, to 62 minutes
per day in 2004, to 89 minutes per day in 2009 (Rideout, Foehr, & Roberts, 2010).
While students read more online, important questions are being raised about their ability
to think critically and evaluate the information they encounter (Alexander & Tate, 1999;
Flanagin & Metzger, 2000; Browne, Freeman & Williamson, 2000). Research shows that
students are frequently deceived when viewing online content (Leu et al., 2007; Bennett, Maton,
& Kervin, 2008; Bråten, I., Strømsø, H.I., & Britt, M.A., 2009). In particular, students are not
always able to evaluate the validity of a website, even when given procedures to do so (Lubans, 1998, 1999). It is therefore important to examine effective ways to teach the critical evaluation of online information to adolescents.
This study examines an instructional model that teaches the critical evaluation of online
information. It has three purposes: (1) to evaluate a potentially promising instructional model
that teaches adolescents how to critically evaluate online content; (2) to examine the
effectiveness of this instructional model in building the dispositions needed by students when
reading online; (3) to examine the themes and patterns that exist as students think critically about
and construct online content. This quasi-experimental, mixed methods study (Shadish, Cook &
Campbell, 2002; Johnson & Onwuegbuzie, 2004) will investigate the extent to which critical
evaluation skills, required during online reading comprehension, can be improved using a three-phase instructional model.
A central challenge for educators today is that students do not always think critically
about information they encounter online. Research has raised questions about the ability of
students to evaluate online information (Alexander & Tate, 1999; Flanagin & Metzger, 2000;
Browne, Freeman & Williamson, 2000). Quite simply, many students appear not to have the
evaluation skills and strategies to succeed in this environment (Livingstone, 2004; Bennett,
Maton, & Kervin, 2008; Jewitt, 2008). Apparently, students mistakenly trust information they
read online (Leu et al., 2007; Johnson & Kaye, 1998; Rieh & Belkin, 1998). In particular,
students are not able to accurately judge the validity of a website, even when given procedures to
do so (Lubans, 1998, 1999). The lack of critical evaluation skill, while reading online
information, is also a problem among adults. A report in 2006, for example, showed that seventy-five percent of American adults rarely check the source and date of the health information that they find online (Fox, 2006).
Thus, it is clear that critical evaluation of online information is integral to the success of
online readers in their ability to evaluate and safely use the information they find (Lubans, 1999;
Shackleford, Thompson & James, 1999; Jones & Madden, 2002; Leu et al., 2008). Since online
information is commonly used to make decisions affecting the personal well being of
individuals, the ability to critically evaluate this information has become increasingly important.
It is likely that at least two elements contribute to this issue. First, students increasingly
rely upon the Internet as a source of information. Second, since anyone may publish anything online, the reliability of the information they find is far from guaranteed.
Increasingly, students in the United States are using the Internet to obtain information
about both academic and personal topics (Lubans, 1999; Jones & Madden, 2002; Shackleford,
Thompson, & James, 1999). With respect to academic topics, more than 90% of U.S. students
with home access to the Internet use online information for homework (Lenhart, Simon, &
Graziano, 2001). Over 70% of these students used the Internet as the primary source for
information on their most recent school report or project (Lenhart, Simon, & Graziano, 2001),
while only 24% of these students reported using the library for the same task (Lenhart, Simon, &
Graziano, 2001).
In regard to the use of Internet information for personal issues, we have the most
extensive data in the area of health information. A recent study showed that 55% of 7th to 12th
grade students reported using the Internet to look up health information about themselves or
someone they know (Rideout, Foehr, & Roberts, 2010). Moreover, it appears this pattern of
consulting the Internet for online health inquiries is not just limited to adolescents. Another study
showed that adults aged 18-30 use the Internet as a primary information resource when seeking
solutions to a health related problem (Estabrook, Witt, & Rainie, 2007). In 2001, almost half of
the 100 million Americans online reported using the Internet as a means to access health
information (Estabrook et al., 2001). On a typical day, 8 million American adults searched for health information online (Fox, 2006).
As students increasingly rely on the Internet for information in both their academic and personal lives, there is growing concern about the reliability of these sources
(Alexander & Tate, 1999; Flanagin & Metzger, 2000; Browne, Freeman & Williamson, 2000).
This is partially due to the fact that there are few filters to analyze, critically evaluate, and verify the accuracy and reliability of information published online (Flanagin & Metzger, 2000; Johnson &
Kaye, 1998; Rieh & Belkin, 1998). Additionally, traditional quality indicators are either difficult
to find or sometimes nonexistent (Fox, 2006). Examples of these indicators are as follows: facts
regarding authorship, vetted content information, and a trail of revision audits (Fox, 2006). Thus,
it appears clear that online reading requires substantial ability to think critically, evaluate
information, and judge the veracity of what is being read (Alexander & Tate, 1999; Flanagin &
Metzger, 2000; Johnson & Kaye, 1998; Rieh & Belkin, 1998), perhaps even more so than offline
reading.
When reading online, the sincerity of an information source is frequently in question (Brem, Russell, & Weems, 2001). The sincerity of an online information source has
been defined as the presentation of truthfulness in identity, intent, and information as a means to
determine honest social relationships (Trilling, 1972; Kolb, 1996; Dahlberg, 2001). Brem,
Russell, & Weems (2001) identified three typical web environments that represent different
levels of sincerity: hoaxes, weaker sincere sites, and stronger sincere sites. Hoax websites are
defined as website “fabrications” that have been created for entertainment purposes (Brem, Russell, & Weems, 2001, p. 198). Weaker sincere sites are identified as more “balanced between reputability and disreputability” (Brem, Russell, & Weems, 2001, p. 198) than hoax
websites or stronger sincere sites. The claims made are believable, and backed up by supporting
data found online, but do not stand up to close examination. Stronger sincere sites present
information that includes “professional markers” of organization and more credible experts (Brem, Russell, & Weems, 2001).
While finding credible information online is always challenging, one area of common
information use poses substantial risk of personal harm and illustrates the importance of being
able to accurately evaluate online information: the use of websites containing information about
health issues. The accuracy of online health information is often subject to debate, and when
incorrect has the potential to be harmful to an individual’s health (Berland et al., 2001; Cline &
Haynes, 2001). “Misinformation obtained from the Internet has the potential to produce
detrimental effects on health behavior outcomes” (Benotsch, Kalichman, & Weinhardt, 2004).
Due to the lack of dependability and reliability found in online information (Arunachalam,
1998), critical evaluation of online information is a crucial skill when searching for answers to
questions about health. The same can be said of many other important areas of one’s life.
This section will first examine the theoretical perspectives that frame this study.
Thereafter, I will examine the previous work on critical evaluation, multimodal design, and the dispositions of online readers.
Theoretical Frameworks
The nature of literacy is rapidly evolving as the Internet and other communication
technologies (ICTs) emerge (Coiro, Knobel, Lankshear & Leu, 2008). These changes demand an
expanded view of “text” to include visual, digital and other multimodal formats (Rose & Meyer,
2002; New London Group, 2000; Alvermann, 2002). A richer and more complex definition of
literacy requires a richer and more complex theoretical framing of research (Leu, O’Byrne,
Zawilinski, McVerry, & Everett-Cacopardo, 2009). Thus, this study uses a multiple theoretical perspectives approach (Labbo & Reinking, 1999), drawing on critical literacy, new literacies, and cognitive apprenticeship.
Critical literacy. This study is framed within the theoretical perspective of critical
literacy. Rooted in socio-cultural perspectives of reading, critical literacy uses learning to “build
access to literate practices and discourse resources” (Luke, 2000, p. 449) for use as social capital
in the community (Freebody & Luke, 1990; Lankshear & Knobel, 1998). Critical literacy moves
away from the “self” in critical reading to understand how texts interact in different contexts
(Luke, 2000). In this study, texts will be analyzed to determine purpose, which ideologies are
represented, and how students can “reject them or reconstruct them” (Cervetti, Pardales, & Damico, 2001). The reconstruction of online content in this study could be seen as “activism” (Morrell, 2002) or as cyberactivism (McCaughey & Ayers, 2003).
Activism in literacy can enhance the work-product with authenticity and increase the
author’s attention to audience (Brown, 2000; Oblinger, 2004; Tapscott, 1999). Activism
identifies the audience intended for a specific task that students are asked to complete (XXXX).
When authentically embedded into a critical literacy framework, activism allows the student to
become an “active viewer” (Davis, 1993) and empower himself or herself (Kellner & Share,
2005). Cyberactivism involves the use of a computer mediated communication (CMC) tool for
facilitating offline actions or change for the empowerment of online citizens and spreading
information (McCaughey & Ayers, 2003; Pickerill, 2003). A single website consolidating all student-created hoax websites, together with the instructional model and findings of this study, will provide a vehicle for students to express themselves as readers and writers of online information.
New Literacies. In addition, this study is framed within both the larger definition of New Literacies and the more specific definition of new literacies as it applies to online reading comprehension. According to Coiro, Knobel, Lankshear, and Leu (2008), New Literacies theory has four principles:
1. new skills, strategies, dispositions, and social practices are required by new technologies for information and communication;
2. the acquired skills are central to full civic, economic, and personal participation in a globalized community;
3. new literacies rapidly change as their defining technologies change;
4. new literacies are multiple, multimodal, and multifaceted.
In addition to a broader definition of New Literacies, this study is also framed within the more
specific theory of the new literacies of online reading comprehension (Leu, O’Byrne, Zawilinski, McVerry, & Everett-Cacopardo, 2009). Online reading comprehension is a process of problem-based inquiry, which requires new skills, strategies, and dispositions (Leu,
Kinzer, Cammack, & Coiro, 2004; Leu et al., 2008). An important skill identified within online
reading comprehension is the ability to evaluate critically the information that an individual
encounters online.
Cognitive apprenticeship. Finally, this study is framed within the theoretical perspective of cognitive apprenticeship. Critical evaluation is increasingly viewed as a situated activity (Brem, Russell, & Weems, 2001; Kiili, Laurinen & Marttunen, 2008). Thus, critical evaluation
requires an examination of the context, content, and contingencies that affect interpretation of
information by students (Bredo, 1994). Cognitive apprenticeship uses four dimensions (e.g.,
content, methods, sequence, sociology) to embed learning in activity and make deliberate the use
of the social and physical contexts present in the classroom (Brown, Collins, & Duguid, 1989;
Collins, Brown, & Newman, 1989). Cognitive apprenticeship includes the enculturation of
students into authentic practices through activity and social interaction (Brown, Collins, &
Duguid, 1989). Cognitive apprenticeship will inform this study by having students: collectively
solve problems, display multiple roles, confront ineffective strategies and misconceptions, as
well as provide collaborative work skills (Brown, Collins, & Duguid, 1989).
Cognitive apprenticeship has informed successful instructional models in reading, writing, and mathematics (Collins, Brown, & Newman, 1989). This work on
cognitive apprenticeship will inform the instructional model used in this study in two important
ways: 1) by defining the methods and sequencing used; and 2) by outlining reflective strategies
used by students.
Defining the methods and sequencing used. For the first element of cognitive apprenticeship informing this study, reliance will be placed on the complex pattern of goal setting and problem
solving known as “knowledge transformation” (Scardamalia & Bereiter, 1985). This will yield
information on skills and strategies used by the instructor, such as: modeling, coaching,
scaffolding, and then fading of instruction (Scardamalia & Bereiter, 1985; Scardamalia, Bereiter, & Steinbach, 1984).
Outlining reflective strategies used. The second element of cognitive apprenticeship that
informs this study includes the reflection strategies used by students. Students will be encouraged to compare their own performance with that of an expert, sensitizing them to the specifics of expert performance and to adjustments that may be made to
their own performance to get them to the expert level (Collins & Brown, 1988; Collins, Brown,
& Newman, 1989). Thus, the function of reflection indicates “co-investigation” and/or abstracted
replay by students (Scardamalia & Bereiter, 1983; Collins & Brown, 1988).
In addition to the theoretical perspectives that frame this study, several areas of previous
research guide the investigation. These include critical evaluation, multimodal design, and the dispositions of online readers.
Critical Evaluation. Critical evaluation has been defined as including the critical
thinking abilities used to: 1) question, analyze, and compare resources; 2) judge the quality of
information on various characteristics; and 3) defend an opinion with evidence from multiple
sources and prior knowledge (Coiro, 2008). Research on critical evaluation (Taylor, 1986; Tate
& Alexander, 1996; Metzger, 2007) has focused on a variety of information quality markers (e.g., accuracy, authority, objectivity, currency, and validity) but condenses to credibility and relevance as the two main constructs (Judd, Farrow, &
Tims, 2006; Kiili, Laurinen & Marttunen, 2008). Credibility is typically defined in terms of
expertise and trustworthiness (Judd, Farrow, & Tims, 2006), or the reliability of information
(Kiili, Laurinen, & Marttunen, 2008). Relevance is typically defined in terms of importance and
currency (Judd, Farrow, & Tims, 2006), or judgments about the essential nature of information (Kiili, Laurinen, & Marttunen, 2008), especially in relation to the task at hand.
Previous research on critical evaluation has also identified the situated nature of these
skills (Brown, Collins, & Duguid, 1989; Brem, Russell, & Weems, 2001). Researchers have
determined that as students critically evaluate they need to: negotiate multiple points of view,
consider conflicting information, and construct their own meaning (Hannafin & Land, 1997;
Hannafin & Land, 2000). Britt, Perfetti, Sandak & Rouet (1999) maintained that readers need to
construct a two level model while evaluating sources: integration (the text based, or internal
model) and situational (text mixed with prior knowledge and inter-text with other sources). Other research has examined how students evaluate the argumentation and sincerity of online information (Brem, Russell & Weems, 2001). Furthermore,
Graesser et al. (2007) found that successful online readers evaluate truth, relevance, quality,
impact, and claims made contemporaneously while evaluating the usefulness of the information
to the learner. As critical evaluation is viewed as more of a situated activity, researchers have
examined the varied context, content, and contingencies that affect a student’s ability to critically evaluate online information.
Multimodal Design. Originating from multimodalities (Kress & van Leeuwen, 2001;
Jewitt, 2008), multimodal design specifies the interchange between linguistic, visual, audio,
gestural, spatial and multimodal elements (New London Group, 2000). When integrated with a
New Literacies framework, students operate as “designers” and “apply critiqued knowledge of
the subject or topic synthesized from multimodal sources” (Kimber & Wyatt-Smith, 2006, p. 26).
Students construct “representations of new knowledge” and communicate this to others with the
intention of engaging their audience (Kimber & Wyatt-Smith, 2006, p. 26). As a pedagogical
tool, design holds together the “process and product” (New London Group, 2000), and allows
students to consider how literacy practices are used to determine truth (Street, 1984; Alvermann
& Hagood, 2000). To provide guidance in instruction and assessment of student work-product in
this study, the skills embedded in multimodal design need to be integrated. Therefore, given the
deictic nature of literacy (Leu, 2000), it is problematic to view creation of content using CMC
tools as belonging to only one skill set, such as: blogging, wikis, e-mail, social networks, word
processing. As these technologies converge, some experts believe the tools associated with
various CMC tools may merge as well (Fox, Anderson, & Rainie, 2005; Anderson & Rainie, 2008; Greenhow, Robelia, & Hughes, 2009). Thus, a broad spectrum of combined skills and strategies is needed when constructing online content.
Since this study integrates multiple lines of research from many fields (i.e.,
multiliteracies, new media, digital storytelling, digital literacy, gaming, and others) the
combination of all these skills will be referred to as online content construction (OCC). This
integration of skills originates from content creation as defined by Sonia Livingstone in her
theoretical definition of media literacy and the instability that ICT research presents (Livingstone, 2004). Since she maintains that it is difficult to “identify, in textual terms, how the Internet mediates the representation of knowledge” (Livingstone, 2004), the construct must be broad enough to allow for change in the future. Thus it
appears that integrating these multiple lines of research within OCC is consistent with this
reality.
Dispositions of online readers. Research on reading comprehension (RAND Reading Study Group, 2002) and critical evaluation (Rieh, 2002) suggests that learning is
a mixture of affective variables (Baker & Wigfield, 1999; Guthrie & Wigfield, 1997) and
motivational factors (Zimmerman & Bandura, 1994) that go beyond specific skills. To measure
not only the cognitive processes involved in this study, but also examine the affective dimension,
it is necessary that this study measure student dispositions. Carr & Claxton (2002) define
dispositions as a “tendency to edit, select, adapt, and respond to the environment in a recurrent,
characteristic kind of way.” Learning dispositions can be seen as a “pattern of behaviors, situated
in the context of the environment, that when recognized and developed by those who can
manipulate the environment may lead to gains in the acquisition of knowledge, skills and
understandings” (O’Byrne & McVerry, 2009). Due to the unlimited nature of the Internet
(Alvermann, 2004; Gross, 2004) these dispositions may be even more significant as individuals
read information online (Liaw, 2002; Lin, Wu, & Tsai, 2005; Coiro, 2007). Inclusion of a
measure of dispositions allows for recognition of the affective variables and skills that affect students as they read online.
In summary, the theoretical perspectives of critical literacy, new literacies, and cognitive
apprenticeship will inform this study. Additionally, previous research informing critical
evaluation, multimodal design, and dispositions of online reading will also guide this study.
Since credibility and relevance appear to most broadly define the critical evaluation of online
content (Judd, Farrow, & Tims, 2006; Kiili, Laurinen & Marttunen, 2008), these two factors will
be used to measure the central construct of this study, critical evaluation. The skills and strategies embedded in the instructional model will draw on cognitive apprenticeship and student reflection to address the research questions that guide this study.
Research Questions
Previous research indicates teaching adolescents how to critically evaluate what is read
online is complicated (Johnson & Kaye, 1998; Rieh & Belkin, 1998; Leu et al., 2007). Indeed,
we have yet to identify an instructional approach that satisfactorily accomplishes this goal
(Lubans, 1998, 1999). This study explores how critical evaluation skills might be improved through a three-phase instructional model. First, students will analyze the techniques authors use to make websites credible (Britt & Gabrys,
2002; Fogg, Marshall, Laraki, Osipovich, Varma, Fang, et al., 2001). Second, students will
construct websites while manufacturing markers of online sincerity (Brem, Russell, & Weems,
2001). Third, students will reflect (Collins & Brown, 1988; Collins, Brown, & Newman, 1989)
on the knowledge and strategies used while critically evaluating and constructing online
information.
This instructional model is expected to build students’ critical evaluation skills. Consistent with previous research (McInnis, 2001), it is expected that students will comprehend “the interactive product of text and context of various kinds” (Spiro, 1980). Additionally, cognitive apprenticeship theory suggests
(Brown, Collins, & Duguid, 1989; Collins, Brown, & Newman, 1989) that learning is enhanced when students are engaged in authentic activity and social interaction. Guided by this previous research, and framed within a multiple theoretical perspective, this study will address the following research questions:
RQ1. What are the levels of reliability and validity obtained from the construction and validation of a critical evaluation instrument that measures the critical thinking skills of adolescents as they read online?
Hypothesis: Based on previous results obtained in a pilot study (O’Byrne, 2009), it is hypothesized that the construction and validation process, employing solid psychometric techniques, will provide a valid and reliable instrument to be used in research.
RQ2. Does an instructional model that teaches the critical evaluation and construction of online content with varying levels of sincerity improve the critical thinking skills of adolescents as measured by this critical evaluation instrument?
Hypothesis: It is hypothesized that construction of online content with varying levels of sincerity will have a positive effect on students’ critical evaluation scores.
RQ3. Does an instructional model that teaches the critical evaluation and construction of online content with varying levels of sincerity improve student scores on an assessment that measures the dispositions of successful online readers?
Hypothesis: A previously developed instrument (O’Byrne & McVerry, 2009) has been effective in measuring the dispositions necessary for online reading comprehension. It is hypothesized that time spent reading, evaluating, and constructing online information will improve student scores on this measure.
RQ4. What are the themes and patterns that exist as groups of students comprehend and construct online information?
Expectation: It is expected that this qualitative analysis will identify some of the skills,
strategies, and dispositions previous research has identified as necessary for online
reading comprehension (Leu et al., 2008). It is also expected that new skills, strategies,
and dispositions will be identified that will further inform literacy practices in online
spaces.
Methods
This study employs a quasi-experimental, mixed methods framework (Shadish, Cook &
Campbell, 2002; Johnson & Onwuegbuzie, 2004), which tests the use of an instructional model
that will empower students as evaluators and constructors of online information (Salomon, 1997;
Fabritius, 1999).
RQ1. Does an instructional model that teaches the critical evaluation and construction of online content with varying levels of sincerity improve the critical thinking skills of adolescents?
Settings and Participants. The study will be conducted with a convenience sample of
two groups of seventh grade students and teachers from one school in Connecticut. The
convenience sample for this study has been selected because of expertise on the part of the
instructor in working with students in a one-to-one laptop classroom (Leu, Reinking et al., 2008).
Because the instructor is prequalified with the special skills, strategies, and dispositions needed
in a one-to-one laptop classroom, these techniques will more efficiently assist instruction in the
study. The student population will consist of 200 students regularly enrolled in English language
arts classes during the 2009-2010 school year. The teachers in the study will be those who are
regularly assigned to these students. The intervention group (n = 100) will consist of students in
one-to-one laptop classrooms, and a teacher with a high degree of competence in working in a
one-to-one laptop classroom. The control group (n = 100) will consist of a teacher from the same
middle school with access to a school computer lab, but without special training working in a
one-to-one laptop classroom. Thus, the intervention group will have a teacher skilled in working
in a one-to-one laptop classroom and capable of showing the students how to critically use the
resources provided.
To ensure that instruction in the control classrooms remains as close to the normal
English language arts curriculum as possible, data will be collected to provide a qualitative
account of the instruction that takes place in the control group classrooms during the study. The
researcher will collect: the curriculum provided by the school district, lesson plans created by the
control group classroom teacher, and observations of the control group classroom. Twice
during the study the researcher will observe the control group classroom for all sessions of the
class to take notes on the general curriculum being instructed. These three sources of data will be
used to create a general qualitative account of the content of instruction being presented in the
control group.
Major Dependent Variable. The critical evaluation instrument was based on a measure
used by Brem, Russell, & Weems (2001). This variation of the instrument was used in the pilot
of this study, and as a result is being thoroughly revised to match the hypothesized constructs of
credibility and relevance (Judd, Farrow & Tims, 2006; Kiili, Laurinen & Marttunen, 2008). The
instrument is currently undergoing content validation with experts. Further validation of the
instrument, including the use of Item Response Theory (IRT) and tests of internal consistency
will be conducted using results from this study. This section will define the constructs and subconstructs measured by the instrument, the content validation techniques being implemented, and the planned testing of validity and reliability.
Constructs and subconstructs measured. Development of the instrument began with a literature review to determine the subconstructs of the hypothesized
factors of credibility and relevance (Judd, Farrow & Tims, 2006; Kiili, Laurinen & Marttunen,
2008). These definitions were used to develop multiple-choice items that reflect the constructs
(credibility and relevance) and subconstructs. The tasks presented in each item are situated in activities that adolescents would engage in as they search for online information.
Credibility is defined in terms of expertise and trustworthiness (Judd, Farrow, & Tims,
2006) or the reliability of information (Kiili, Laurinen, & Marttunen, 2008). The subconstructs of
credibility are defined as evaluation of: author, purpose, source, content, argument, and accuracy
of online information (Kiili, Laurinen, & Marttunen, 2008). Evaluation of author is defined as
the creator or source of the information showing evidence of being knowledgeable, reliable, and truthful (Harris, 1997). Purpose is defined as consideration of the reason the information was created in order to determine the desired intent of the information (Harris, 1997). Source is defined as consideration
of the provider of the information and whether they are able to answer the desired question (Rieh
& Belkin, 1998; Strømsø & Bråten, 2010). Content is defined as consideration of the style or
manner in which information is presented (Harris, 1997; Kiili, Laurinen, & Marttunen, 2006).
Argument is defined as consideration of the quality of the claims made (Kiili, Laurinen, & Marttunen, 2006). Accuracy is defined as consideration of the vetted, verifiable nature of the information presented.
Relevance is defined in terms of importance and currency (Judd, Farrow, & Tims, 2006),
or judgments about the essential nature of information (Kiili, Laurinen, & Marttunen, 2008),
especially in relation to the task. The subconstructs of relevance are defined as evaluations of
relevance of topic and website, as well as usability and currency of information (Kiili, Laurinen,
& Marttunen, 2008). Relevance of topic is defined as consideration of the information as being
essential to the student’s task (Kiili, Laurinen, & Marttunen, 2008). Relevance of website is
defined as consideration of the value of a particular website in relation to other sources of online
information (Kiili, Laurinen, & Marttunen, 2008). Usability is defined as consideration of the
ease with which information can be accessed and used (Kiili, Laurinen, & Marttunen, 2008).
Currency of information is defined as consideration of the
information as being circulating and valid at the present time (Meola, 2004; Kiili, Laurinen, &
Marttunen, 2008).
Content validation. The instrument is currently undergoing a content validation phase with
experts familiar with critical evaluation
research to develop definitions for the constructs (McKenzie, Wood, Kotecki, Clark, & Brey,
1999). The six experts include professors and graduate students familiar with the field of critical
evaluation of online information research. The experts are rating the dimensionality of each of
the twenty multiple-choice items by indicating which of the constructs and subconstructs the
item measures. Any item identified by at least 90% of the experts as measuring its hypothesized
construct will be kept for further analysis (Gable & Wolf, 1993; McKenzie et al., 1999). A
Content Validity Index (CVI) (Rubio, Berg-Weger, Tebb, Lee, & Rauch, 2003) will be created
for each item using the feedback provided by the experts to test for multidimensionality of items.
For inclusion in the final version of the instrument to be used in this study, the CVI for each item
will need to exceed 0.70 (Rubio et al., 2003). Finally, the experts are encouraged to leave written
feedback that will be used to ensure the adequacy and accuracy of the definitions of constructs
and subconstructs.
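The item-screening rule described above (expert agreement on dimensionality, with a CVI above 0.70 for retention) can be sketched in a few lines of Python. The expert ratings below are hypothetical, not data from this study; each of the six experts marks whether an item measures its hypothesized construct.

```python
# Hypothetical dimensionality ratings from six experts:
# 1 = item rated as measuring its hypothesized construct, 0 = not.
expert_ratings = {
    "item_01": [1, 1, 1, 1, 1, 0],  # 5 of 6 experts agree
    "item_02": [1, 1, 1, 1, 1, 1],  # unanimous agreement
    "item_03": [1, 0, 1, 0, 1, 0],  # weak agreement
}

def content_validity_index(ratings):
    """Item-level CVI (Rubio et al., 2003): proportion of experts
    rating the item as measuring its hypothesized construct."""
    return sum(ratings) / len(ratings)

# Retain only items whose CVI exceeds the 0.70 threshold.
retained = {item: round(content_validity_index(r), 2)
            for item, r in expert_ratings.items()
            if content_validity_index(r) > 0.70}
print(retained)  # item_01 (0.83) and item_02 (1.0) survive; item_03 (0.5) does not
```

In practice the same proportion would also be compared against the 90% agreement criterion before the CVI screen.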
Planned testing of validity and reliability. Further validation of the critical evaluation
instrument will be conducted using results from student responses in the study. Item response
theory (IRT) will be used to summarize student response data into scores on the latent trait and
estimates of the item parameters (Hambleton & Swaminathan, 1985). IRT is appropriate for this
study since it provides explicit modeling of the probability of each possible response to each
item. To ensure reliability of the instrument, the researcher will also test for internal consistency,
or the degree of relatedness of the items. Cronbach’s coefficient alpha will be calculated (Pett et
al., 2003) to identify the total proportion of variance in a given scale that can be attributed to a
common source.
This calculation of Cronbach’s alpha requires several assumptions to be met: 1) the items are
independent; 2) the errors are uncorrelated; 3) the items are positively correlated; and 4) the
items are unidimensional. To test these assumptions an initial factor analysis will be conducted
using student scores to determine which items have positive inter-item correlations. The
inter-item correlation matrix will be checked to remove any items that are redundant, too
positively correlated, or negatively correlated with the other items. The factor structure of the
instrument will be checked to ensure that items do not load at least moderately on more than one
factor.
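As a sketch of the internal-consistency check described above, Cronbach's alpha can be computed directly from its definition, alongside the inter-item correlation matrix used to flag redundant or negatively correlated items. The score matrix here is hypothetical, not data from the study.

```python
import numpy as np

# Hypothetical score matrix: rows = students, columns = dichotomously
# scored items (1 = correct response).
scores = np.array([
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
    [1, 0, 1, 0],
])

def cronbach_alpha(x):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1)
    total_var = x.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Inter-item correlation matrix: near-1.0 entries suggest redundant items,
# negative entries suggest items to drop or reverse-score before factoring.
corr = np.corrcoef(scores, rowvar=False)

print(round(cronbach_alpha(scores), 3))
print(corr.shape)  # one row/column per item: (4, 4)
```

An IRT calibration would replace this classical summary with per-item difficulty and discrimination estimates, but the screening logic is the same.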
Research Question 2: Does an instructional model that teaches the critical evaluation and
construction of online content with varying levels of sincerity improve the critical thinking
skills of the students in the study?
Settings and Participants. The school, students, and instructors that have been described in RQ1
will also participate in RQ2. A power analysis was conducted to determine the required sample
size for the two groups in the study. The power analysis program G*Power (Faul, Erdfelder,
Buchner, & Lang, 2009) was used with a power estimate of 0.80, a Cohen’s d effect size of 0.35,
and an alpha level of 0.05. A medium effect size of 0.35 was selected, which is adequate for this
type of research (Cohen, 1992). With a conservative correlation of 0.60 between the dependent
variable and the covariate (i.e., the correlation between the pretest and posttest), the required
sample size per group was determined.
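A back-of-the-envelope check of this power analysis can be run with Python's standard library using the normal-approximation formula for a two-group comparison. The (1 − r²) covariate adjustment shown at the end is a standard consequence of using a correlated pretest as covariate (Van Breukelen, 2006); the specific figures printed are this sketch's computation, not numbers taken from the proposal.

```python
import math
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for a two-sided,
    two-sample mean comparison: n = 2 * (z_{1-a/2} + z_{power})^2 / d^2."""
    z = NormalDist().inv_cdf
    return 2 * (z(1 - alpha / 2) + z(power)) ** 2 / d ** 2

n = n_per_group(0.35)          # medium effect, d = 0.35 (Cohen, 1992)
print(math.ceil(n))            # roughly 129 students per group

# A pretest covariate correlated r = 0.60 with the posttest shrinks the
# ANCOVA error variance by (1 - r^2), reducing the required n.
n_ancova = n * (1 - 0.60 ** 2)
print(math.ceil(n_ancova))     # roughly 83 per group
```

G*Power uses the noncentral t distribution rather than this normal approximation, so its answer can differ by a participant or two.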
Instructional Model. The instructional model contains three phases with instruction
guided by modeling, coaching and then fading as detailed by Cognitive Apprenticeship theory
(Collins, Brown, & Newman, 1989). The intervention will be conducted four times a week
during the students’ regular ninety-minute English language arts class, and will last six
weeks. During the first phase, students will analyze the argumentative techniques used by OCC
authors to manufacture the appearance of sincerity and resultant credibility (Brem, Russell, &
Weems, 2001; Britt & Gabrys, 2002; Fogg, Marshall, Laraki, Osipovich, Varma, Fang, et al.,
2001). These techniques include multiple layers of argument used by OCC authors that provide
challenges to online readers as they try to evaluate credibility, accuracy, reasonableness, and
support (Harris, 1997; Brem, Russell, & Weems, 2001). These techniques include, but are not
limited to: hyperlinks, about us pages, images, video, testimonials, page layout, color choices,
and font selection. During the second phase, groups of students will collaboratively construct
websites with manufactured markers of online sincerity, known as hoax websites (Brem, Russell,
& Weems, 2001). During the third phase, students will evaluate
and critique the work-product of others and reflect upon the skills and strategies used by novice
and expert critical evaluators and authors of online information (Collins & Brown, 1988). The
cumulative work-product and lessons learned will be consolidated at one accessible online space.
Fidelity checks will be conducted to ensure consistency across various class sessions of
students during the study. A criteria checklist will be created to identify two levels of
information that should exist in the instruction given to treatment groups. This checklist will
consist of: a) general elements (e.g., time, topic, assignments) and b) specific activity level
elements. This checklist will be constructed and routinely edited by the instructor and researcher
weekly during lesson planning sessions. At least once during phase one, phase two, and
phase three of the instructional model an outside observer will conduct a fidelity check to ensure
consistency of instruction across all classes of students. The outside observer will be a researcher
with knowledge of the study and constructs involved. The observer will receive a checklist
identifying the general elements and specific activity level elements that should be evident in
instruction. The observer will watch all classes during one school day and take notes as to how
consistent instruction was across all classes, as identified by the checklist. These findings and
any additional comments will then be shared with the researcher and instructor.
Analysis of Covariance. The pretest scores on the critical evaluation instrument will first
be tested to determine whether differences exist between groups on the pretest. If differences do exist,
analysis of data for RQ2 will be conducted using a one-way analysis of covariance (ANCOVA)
test. This will analyze differences between groups on student scores of the critical evaluation
instrument, using the pretest as the covariate (Van Breukelen, 2006). The use of an ANCOVA
for analysis carries an assumption of homogeneity of regression slopes, which will be tested by
building an interaction between the covariate and treatment group. If this interaction is
significant, then an ANCOVA cannot be used in this analysis. In this case, the analysis will be
conducted using a multiple linear regression that retains the significant interaction term between
the pretest scores and treatment group. Differences between the pretest and posttest on the critical evaluation
instrument will suggest that the instructional model had an effect on students’ critical evaluation
skills.
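The slope-homogeneity check and covariate-adjusted group comparison described above can be sketched with ordinary least squares. The simulated scores below are purely illustrative (a true treatment effect of 4 points is built in); the logic, not the numbers, mirrors the planned analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 40                                       # hypothetical students per group
group = np.repeat([0.0, 1.0], n)             # 0 = control, 1 = treatment
pretest = rng.normal(50, 10, 2 * n)
posttest = 0.6 * pretest + 4.0 * group + rng.normal(0, 5, 2 * n)

# Full model with the covariate-by-treatment interaction, used to test the
# homogeneity-of-regression-slopes assumption of ANCOVA.
X_full = np.column_stack([np.ones(2 * n), pretest, group, pretest * group])

def t_statistic(X, y, j):
    """t statistic for coefficient j of an OLS fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[j, j])
    return beta[j] / se

t_interaction = t_statistic(X_full, posttest, 3)

# If |t_interaction| is non-significant, slopes are treated as homogeneous
# and the ANCOVA model (pretest as covariate, no interaction) is fit;
# otherwise the regression keeps the interaction term, as described above.
X_ancova = X_full[:, :3]
beta_ancova, *_ = np.linalg.lstsq(X_ancova, posttest, rcond=None)
adjusted_effect = beta_ancova[2]   # covariate-adjusted treatment effect
print(round(adjusted_effect, 2))
```

In applied work this would be run through a statistics package, but the decision rule is exactly the one spelled out in the paragraph above.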
Research Question 3: Does an instructional model that teaches the critical evaluation and
construction of online content with varying levels of sincerity improve student scores on an
instrument measuring the dispositions needed during online reading comprehension?
Settings and Participants. The school, students, and instructors that have been described in RQ1
and RQ2 will also participate in RQ3. A power analysis was conducted to determine the required
sample size for the two groups in the study. The power analysis program G*Power (Faul,
Erdfelder, Buchner, & Lang, 2009) was used with a power estimate of 0.80, a Cohen’s d effect
size of 0.35, and an alpha level of 0.05. A medium effect size of 0.35 was selected, which is
adequate for this type of research (Cohen, 1992). With a conservative correlation of 0.60
between the dependent variable and the covariate (i.e., the correlation between the pretest and
posttest), the required sample size per group was determined.
Instructional Model. The instructional method tested in RQ3 is the same method as described
for RQ2, following the three phases of the cognitive apprenticeship: 1) students will analyze the
techniques authors use to make websites credible
(Britt & Gabrys, 2002; Fogg, Marshall, Laraki, Osipovich, Varma, Fang, et al., 2001); 2) then
students will construct websites with manufactured markers of online sincerity (Brem, Russell, &
Weems, 2001); 3) and finally students will reflect (Collins & Brown, 1988) on the knowledge
and strategies used while critically evaluating and constructing online information.
Analysis of Covariance. The pretest scores on the DORC will first be tested to determine
whether differences exist between groups on the pretest. If differences do exist, analysis of data for RQ3
will be conducted using a one-way ANCOVA test. Student gains in the dispositions needed
while reading online will be measured by the Dispositions of Online Reading Comprehension
(DORC) instrument (O’Byrne & McVerry, 2009). The ANCOVA will be conducted using the
pretest as the covariate (Van Breukelen, 2006) to analyze differences between groups. The use of
an ANCOVA for analysis carries an assumption of homogeneity of regression slopes, which will
be tested by building an interaction between the covariate and treatment group. If this interaction
is significant, then an ANCOVA cannot be used in this analysis. In this case, the analysis will be
conducted using a multiple linear regression that retains the significant interaction term between
the pretest scores and treatment group. Differences between the pretest and posttest on the DORC will suggest that
the instructional method had an effect on the students’ dispositions needed for reading online
information.
Research Question 4. What are the themes and patterns that exist as groups of students
comprehend and construct online information?
Settings and Participants. The school and instructors described in RQ1, RQ2 and RQ3
are the same settings for RQ4. Qualitative data will be collected from the three top performing
and three bottom performing groups of students as identified by researcher and instructor
observations. These six groups will be purposely sampled to provide data that addresses the
research question. Observations of student
groups will be conducted for a period of two weeks before the study begins, while students work in
groups they normally use for their English language arts class. Selection of student groups will
follow modified criteria established by Oakley, Felder, Brent, and Elhajj (2004)
that will begin with exclusion of any groups that do not work well together. The objective of
RQ4 is to identify the benchmarks established by groups that work well together in this study
and to provide educators with the skills and strategies needed by students; for this reason, groups
of students that do not work well together will not be considered. After groups have been identified
that work well together, the final selection will be made using the checklist detailed in Table 1.
________________
Insert Table 1 about here
_________________
Determination of the selected groups will be made using observation of classroom group
behavior and completed group work. To ensure that these selected groups will be effective in
addressing RQ4, the researcher and instructor will exchange observation notes and make a final
decision as to which groups to follow by the end of the first week of the study.
Qualitative Data Collection. Three types of data will be collected to allow for
triangulation of findings (Denzin, 1978): interviews of student groups, researcher notes, and
student work-products.
Interviews of student groups. The researcher will conduct focus groups (Krueger & Casey,
2008) with the selected top three and bottom three groups to
understand how students engage with online information (Damico & Baildon, 2007). The main
research question of the focus groups will be RQ4. The focus groups will consist of one to two
meetings with each of the six groups of students, each lasting 30 minutes. The researcher will act
as the moderator, with the role being that of a “seeker of knowledge” (Krueger & Casey, 2008).
In the focus groups, students will be asked to detail choices made in the critical evaluation and
construction of online information. The students will be informed of the intent of the interviews
ahead of time so that they can prepare their thoughts, allaying concerns that students may have
difficulty considering metacognitive decisions while being interviewed (Ericsson & Simon,
1993; Ericsson, 2002). These interviews will be recorded using iShowU, a
screen capture software. The use of video screen captures is necessary to allow analysis of the
students describing design choices of their websites using their computers. These videotaped
interviews will be transcribed and coded using ELAN, a qualitative video analysis program.
Researcher notes. Researcher notes will include observational notes collected during the
study. The notes will include observations of group dynamics of the six selected groups and
indications of how effectively they work with their groups. The researcher will also take note of
specific strategies used, modifications made, or problems encountered while working with
groups.
Student work-products. Student work-products will consist of the hoax websites and graphic
organizers created by students during the study. In the instructional model, students will
construct graphic organizers to map out and plan representations of their hoax websites. These
graphic organizers will be full color maps of their websites that indicate plans of design choices.
The hoax websites will be constructed using iWeb (website construction software) and saved on
student laptops. These graphic organizers and final versions of the hoax websites will be
collected for analysis.
Analysis. Analysis of the data will be conducted with the intent of identifying
connections between the data and research questions (Thomas, 2006) to explain the decisions
students make as they critically evaluate and construct online content. Analysis will be
conducted in a multi-step process to inductively analyze (Patton, 2002) and ultimately develop
themes (Merriam, 2002) from the data. Initially, the researcher will closely read across the
interview data, researcher notes, and student work-product to gain insight into the diversity of
codes possible. During this initial close reading of the data, the researcher will take notes and
create preliminary inductive codes. The researcher will then revisit the preliminary codes and
examine their relationship to the research purposes (Thomas, 2006). These codes will then be
used to code all documents. After the initial coding, constant comparative methods (Glaser &
Strauss, 1967) will be used across all codes to collapse the preliminary codes into specific
categories. Once categories have been created, a rigorous content analysis (Mayring, 2000) will
be conducted to develop themes from the data.
Limitations
Several limitations impact the findings of this study. Given that there are no previous
instruments used to evaluate the ability to critically evaluate information during online reading
comprehension, this study uses a new instrument. Care has been taken, however, to ensure that
this instrument will provide valid and reliable data. The instrument was previously piloted and
revised thoroughly prior to this study. A systematic content validation using experts is currently
taking place. Further validation techniques, including an EFA and tests of internal construct
consistency will help to ensure that this instrument provides valid and reliable data.
Another limitation of the study is the setting. This school was selected because of its one-to-one
laptop environment. Additionally, the researcher served as a resource and instructor in the
classroom.
The addition of the extra instructor, the expertise on working with OCC tools and the presence of
one-to-one laptops could be perceived to be advantages for the intervention group. It is unknown
whether this study could be conducted with an instructor that did not have the skills and
dispositions needed to bring this instructional model to the average seventh grade student.
Another limitation would be the previous skill level of students in the study. It is
unknown what previous training the students may have received in OCC tool use, either in
school or out. Additionally, there are other cognitive factors that affect the success and failure of
students that cannot be measured by the critical evaluation instrument and DORC. Due to the
status of the participants as seventh grade students in a middle school, there are challenges and
opportunities that affect the study (Moje, 2002). Adolescents, specifically seventh graders, have
been shown to engage in identity construction as part of their literacy practices (Broughton &
Fairbanks, 2003). Identity construction in adolescent literacy and fiction is still being studied by
the research community (Moje, 2002; Bean & Moni, 2003). These literacy practices could be
seen as the construction of sincere and credible online information by students in this study. The
inclusion of qualitative data in the form of interviews and researcher notes is an attempt to
account for these factors.
Additionally, the use of a convenience sample in the study, rather than a random
sampling of students is a limitation. The sample of students was specifically selected because of
the presence of one-to-one laptops, and an instructor that regularly uses them effectively in
classroom practice. Thus, these results must be cautiously interpreted and only used to plan more
extensive research.
As students increasingly use the Internet for information seeking behaviors, the skills and
dispositions needed while comprehending these texts are vital. Research shows that students
rarely successfully deploy the healthy skepticism that is needed when searching and evaluating
information online. Research such as this continues the examination of those patterns but
extends it by questioning the effectiveness of instruction in these skills. While dispositions used
during online reading comprehension have been successfully measured (O’Byrne & McVerry,
2009), research needs to continuously examine the multifaceted dispositions that occur as
students read online.
Research has shown that direct instruction of these skills does not transfer into student
acquisition and use of these skills when reading online (Spires, Lee, Turner, & Johnson, 2008;
Greenhow, Robelia, & Hughes, 2009). The study will test a new model for teaching greater
understanding of credibility, relevance and sincerity of online information through the act of
constructing these markers (Brem, Russell & Weems, 2001). Future research can also investigate
whether content construction and computer-mediated communication can affect the operational,
academic and critical lenses (Damico, Baildon, & Campano, 2005) students must use while
reading online.
Results from this study will also point to a new field of research involving the content
construction habits of adolescents in online spaces. The ability to modify markers of credibility,
relevance, and sincerity shows students’ dexterity in not only recognizing these markers, but also
creating or remixing them. Further research can investigate methods to deploy scaffolding techniques
when constructing online information. Additionally, skill in the use of these markers of critical
elements of online information suggests that these same skills can be carried over to offline texts.
References
Ackerman, J. (1991). Reading, writing, and knowing: The role of disciplinary knowledge in
Alexander, P. A., & Jetton, T. L. (2002). Learning from text: A multidimensional and
(Eds.) Handbook of reading research, Volume III (pp. 285-310). Mahwah, NJ: Erlbaum.
Alexander, J. E., & Tate, M. A. (1999). Web wisdom: how to evaluate and create
Alvermann, D. & Hagood, M. (2000) Fandom and critical media literacy. Journal of
Lang.
Pew Internet and American Life Project.
Three Essential Strategies for Planning and Creating Effective Educational Websites.
Baker, L., & Wigfield, A. (1999). Dimensions of children’s motivation for reading and their
relations to reading activity and reading achievement. Reading Research Quarterly, 34,
452-476.
Bennett, S., Maton, K., & Kervin, L. (2008). The digital natives debate: A critical review
Benotsch, E., Kalichman, S., & Weinhardt, L. (2004). HIV-AIDS patients’ evaluation of
health information on the Internet: The digital divide and vulnerability to fraudulent
Berland, G., Elliott, M., Morales, L., Algazy, J., Kravitz, R., Broder, M., et al. (2001). Health
information on the Internet: accessibility, quality, and readability in English and Spanish.
Bråten, I., Strømsø, H.I., & Britt, M.A. (2009). Trust matters: Examining the role of
Brem, S. K., Russell, J., & Weems, L. (2001). Science on the Web: Student evaluations
Britt, M., Perfetti, C., Sandak, R., & Rouet, J. (1999). Content integration and source
den Broek (Eds.), Narrative, comprehension, causality, and coherence: Essays in honor
Britt, M. & Gabrys, G. (2002). Implications of document-level literacy skills for Web site
Brown, J. (2000). Growing up digital: how the Web changes work, education, and the ways
Brown, J., Collins, A., & Duguid, P. (1989). Situated Cognition and the Culture of Learning.
Browne, M. N., Freeman, K. E., & Williamson, C. L. (2000). The importance of critical
thinking for student use of the Internet. College Student Journal, 34(3), 391–398.
Carr, M. & Claxton, M. (2002). Tracking the development of learning dispositions. Assessment
Cervetti, G., Pardales, M., & Damico, J. (2001, April). A tale of differences: Comparing
the traditions, perspectives and educational goals of critical reading and critical literacy.
HREF/articles/cervetti/index.html
Cline, R. & Haynes, K. (2001) Consumer health information seeking on the Internet: the state of
critically evaluating content on the Internet. [Electronic Version]. New England Reading
Association Journal.
University of Connecticut.
online. Paper presented at the annual meeting of the National Reading Conference,
Orlando, FL.
Coiro, J., Knobel, M., Lankshear, C. & Leu, D. (Eds.) (2008). Handbook of research on
Collins, A., & Brown, J. (1988). The computer as a tool for learning through reflection. In H.
Mandi & A. Lesgold (Eds.), Learning issues for intelligent tutoring systems (pp. 1-18).
Collins, A., Brown, J., & Newman, S. (1989). Cognitive Apprenticeship: Teaching the craft of
Comrey, A. L., & Lee, H. B. (1992) A first course in factor analysis (2nd ed.). Hillsdale, NJ:
Erlbaum.
from http://jcmc.indiana.edu/vol7/issue1/dahlbergold.html
Damico, J., Baildon, M., & Campano, G. (2005). Integrating literacy, technology and disciplined
Technology, Humanities, Education and Narrative. Retrieved March 15, 2010, from then
journal.org/feature/92/
Damico, J., & Baildon, M. (2007). Examining ways readers engage with websites during think-
Davis, J. (1993). Media literacy: From activism to exploration. Media literacy: A report of the
Eagleton, M. B., Guinee, K. & Langlais, K. (2003). Teaching Internet literacy strategies:
the hero inquiry project. Voices from the Middle, 10, 3, 28–35
Eagleton, M., & Dobler, E. (2007). Reading the Web: Strategies for Internet inquiry.
experience without reactivity: Interpreting the verbal overshadowing effect within the
theoretical framework for protocol analysis. Applied Cognitive Psychology, 16, 981-987.
Estabrook, L., Witt, E., Rainie, L. (2007). Information Searches That Solve Problems.
Pew Internet & American Life Project: Washington, DC. Retrieved December 20, 2007,
from: http://www.pewinternet.org/pdfs/Pew_UI_LibrariesReport.pdf
cognitive framework for analysis of work content. Retrieved March 1, 2010, from:
http://www.shef.ac.uk/~is/publications/infres/fabritiu.html
Faul, F., Erdfelder, E., Buchner, A., & Lang, A. (2009). Statistical power analyses
using G*Power 3.1: Tests for correlation and regression analyses. Behavior Research
Fleiss, J. L., & Cohen, J. (1973). The equivalence of weighted kappa and the intraclass
Fogg, B.J., J. Marshall, O. Laraki, A. Osipovich, C. Varma, N. Fang, et al. (2001). What
makes web sites credible? A report on a large quantitative study. Presented to the
Fogg, B. J., Soohoo, C., Danielson, D. R., Marable, L., Stanford, J., & Tauber, E. R.
(2003). How do users evaluate the credibility of Web sites? A study with over 2,500
http://portal.acm.org/citation.cfm?
id=997097&coll=ACM&dl=ACM&CFID=36236037&CFTOKEN=18606069
Fox S. (2006). Online Health Search 2006. Washington, DC: Pew Internet & American Life
Project.
Fox, S., Anderson, J. Q., & Rainie, L. (2005, January). The Future of the Internet. Washington,
http://www.pewinternet.org/pdfs/PIP_Future_of_Internet.pdf
Freebody, P. & Luke, A. (1990). Literacies programs: Debates and demands in cultural
Gable, R. K., & Wolf, M. B. (1993). Instrument development in the affective domain: Measuring
attitudes and values in corporate and school settings. (2nd ed.). Boston, MA: Kluwer
Academic Publishers.
Gee, J. (2003). What video games have to teach us about learning and literacy.
Glaser, B. G. & Strauss, A. L. (1967). The Discovery of Grounded Theory: Strategies for
Graesser, A.C., Wiley, J., Goldman, S.R., O’Reilly, T., Jeon, M., & McDaniel, B. (2007).
SEEK Web Tutor: Fostering a critical stance while exploring the causes of volcanic
Greenhow, C., Robelia, B., & Hughes, J. (2009). Web 2.0 and classroom research: What path
Gross, E. F. (2004). Adolescent Internet use: What we expect, what teens report. Journal of
Guthrie, J. T., & Wigfield, A. (Eds.). (1997). Reading engagement: Motivating readers through
Edward Arnold.
Hannafin, M., & Land, S. (1997). The foundations and assumptions of technology-
Hannafin, M., & Land, S. (2000). Technology and student-centered learning in higher
Harris, R. (1997). Evaluating Internet research sources. Retrieved September 12, 2010,
from www.virtualsalt.com/evalu8it.htm
M. J. Metzger & A. J. Flanagin (Eds.), Digital Media, Youth, and Credibility. Cambridge,
Henry, L. (2006). SEARCHing for an answer: The critical role of new literacies while
Horrigan, J. (2010). Broadband Adoption and Use in America. Washington, D. C.: Federal
http://hraunfoss.fcc.gov/edocs_public/attachmatch/DOC-296442A1.pdf
Johnson, T., & Kaye, B. (1998). Cruising is believing? Comparing Internet and
Johnson, R.B. & Onwuegbuzie, A.J. (2004). Mixed methods research: A research
Jones, S. & Madden, M. (2002). The Internet goes to college: How students are living in
the future with today's technology. Washington DC: Pew Internet & American Life
http://www.pewinternet.org/pdfs/PIP_College_Report.pdf.
Judd, V., Farrow, L., & Tims, B. (2006). Evaluating public web site information: a
Kaiser Family Foundation. (2005). Generation M: Media in the lives of 8-18 year olds.
Kellner, D. & Share, J. (2005). Toward critical media literacy: Core concepts, debates,
organizations, and policy. Discourse: Studies in the Cultural Politics of Education, 26(3),
369-386.
Kiili, C., Laurinen, L., & Marttunen, M. (2008). Students evaluating Internet sources:
Kimber, K. & Wyatt-Smith, C. (2006). Using and creating knowledge with new
19-34.
Korp, P. (2006). Health on the Internet: implications for health promotion. Health Education
Kress, G., & van Leeuwen, T. (2001). Multimodal discourse: The modes and media of
Krueger, R. & Casey, M. (2008). Focus Groups: A practical guide for applied research.
Lankshear, C. & Knobel, M. (1998). Critical literacies and new technologies. Paper
Diego, CA.
Lankshear, C., & Knobel, M. (2003). New literacies: changing knowledge and classroom
Lenhart, A., Simon, M., & Graziano, M. (2001). The Internet and education: Findings of the Pew
Internet & American Life Project. Washington, DC: Pew Internet & American Life
Project.
Lenhart, A., Madden, M. & Hitlin, P. (2005). Teens and technology: Youth are leading
the transition to a fully wired and mobile nation. Washington DC: Pew Internet &
Leu, D. J. (2000). Literacy and technology: Deictic consequences for literacy education
in an information age. In M.L. Kamil, P. Mosenthal, R. Barr, & P.D. Pearson (Eds.),
Leu, D.J., Kinzer, C.K., Coiro, J., Cammack, D. (2004). Toward a theory of new
literacies emerging from the Internet and other information and communication
technologies. In R.B. Ruddell & N. Unrau (Eds.), Theoretical Models and Processes of
http://www.readingonline.org/newliteracies/lit_index.asp?HREF=/newliteracies /leu
Leu, D. J., Zawilinski, L., Castek, J., Banerjee, M., Housand, B., Liu, Y., et al. (2007).
What is new about the new literacies of online reading comprehension? In L. Rush, J.
Eakle, & A. Berger, (Eds.). Secondary school literacy: What research reveals for
Leu, D., Coiro, J., Castek, J., Hartman, D., Henry, L., & Reinking, D. (2008). Research
Leu, D. J., Reinking, D., Hutchinson, A., McVerry, J. G., Robbins, K., Rogers, A.,
Malloy, J., O’Byrne, W. I., Zawilinski, L. (2008). The TICA Project: Teaching the New
264-269.
Lin, C., Wu, S., & Tsai, R. (2005). Integrating perceived playfulness into expectation-
confirmation model for web portal context. Information & Management, 42, 683-693.
Livingstone, S. (2004). Media literacy and the challenge of new information and
Livingstone, S., & Bober, M. (2005). UK children go online: Final report of key
Lubans, J. (1998, April). How first-year university students use and regard Internet
Lubans, J. (1999, May). Students and the Internet. Available from Duke University
may 2000).
http://www.qualitative-research.net/fqs-texte/2-00/2-00mayring-e.htm
McKenzie, J. F., Wood, M. L., Kotecki, J. E., Clark, J. K., & Brey, R. A. (1999).
Establishing content validity: Using qualitative and quantitative steps. American Journal
Merriam, S.B. (2002). Introduction to qualitative research. In Merriam, S.B. and associates.
Metzger, M. (2007). Making sense of credibility on the web: Models for evaluating
among urban youth. Journal of Adolescent & Adult Literacy, 46, 72-77.
The New London Group. (1996). A pedagogy of multiliteracies: Designing social futures.
Oakley, B., Felder, R., Brent, R. & Elhajj, I. (2004). Turning Student Groups into Effective
Teams. Journal of Student Centered Learning, 2(1), 9–34. Retrieved on March 1, 2010
from http://www.ncsu.edu/felder-public/Papers/Oakley-paper(JSCL).pdf
L. Zawilinski (Chair), “Because the Internet Said So”: Critical Evaluation and
O'Byrne, W. I., & McVerry, J. G. (2009). Measuring the Dispositions of Online Reading
M. Hundley, R. Jiménez, & V. Risko (Eds.), The 57th National Reading Conference
http://www.jime.open.ac.uk/2004/8/oblinger-2004-8-disc-t.html
Orsmond, P., Merry, S. & Reiling, K. (2000) The use of student driven marking criteria
in peer and self assessment, Assessment & Evaluation in Higher Education, 25(1), 23–38.
Orsmond, P., Merry, S. & Reiling, K. (2002) The use of exemplars and formative
feedback when using student derived marking criteria in peer and self-assessment,
Patton, M. (2002). Qualitative Research and Evaluation Methods. Thousand Oaks, CA:
Sage.
Pett, M. A., Lackey, N. R., Sullivan, J. J. (2003) Making sense of factor analysis. The use of
factor analysis for instrument development in health care research. Thousand Oaks, CA:
Sage.
Pew Research Center. (2010). Millennials: A Portrait of Generation Next. Retrieved March 1,
University Press.
RAND Reading Study Group. (2002). Reading for understanding: Toward an R&D program in
Rideout, V., Foehr, U., & Roberts, D. (2010). Generation M2: Media in the lives of 8- to 18-year-olds. Menlo Park, CA: Kaiser Family Foundation. Retrieved from http://www.kff.org/entmedia/mh012010pkg.cfm
Rieh, S. (2002). Judgment of information quality and cognitive authority in the Web. Journal of
the American Society for Information Science and Technology, 53(2), 145-161.
Rieh, S., & Belkin, N. (1998). Understanding judgment of information quality and
cognitive authority in the WWW. Proceedings of the ASIS Annual Meeting, 35, 279–
289.
Rose, D. H., & Meyer, A. (2002). Teaching every student in the digital age: Universal design for learning. Alexandria, VA: ASCD. Retrieved from http://www.cast.org/teachingeverystudent/ideas/tes/
Rubio, D. M., Berg-Weger, M., Tebb, S. S., Lee, E. S., & Rauch, S. (2003). Objectifying content validity: Conducting a content validity study in social work research. Social Work Research, 27(2), 94–104.
Scardamalia, M., & Bereiter, C. (1983). Child as co-investigator: Helping children gain insight
into their own mental processes. In S. Paris, G. Olson, & H. Stevenson (Eds.), Learning
and motivation in the classroom (pp. 83-107). Hillsdale, NJ: Lawrence Erlbaum
Associates.
Scardamalia, M., & Bereiter, C. (1985). Fostering the development of self-regulation in children's knowledge processing. In S. F. Chipman, J. W. Segal, & R. Glaser (Eds.), Thinking and learning skills: Research and open questions (pp. 563-577). Hillsdale, NJ: Lawrence Erlbaum Associates.
Scardamalia, M., Bereiter, C., & Steinbach, R. (1984). Teachability of reflective processes in written composition. Cognitive Science, 8, 173-190.
Shackleford, J., Thompson, D., & James, M. (1999). Teaching strategy and assignment
design: assessing the quality and validity of information via the Web. Social Science
Spires, H., Lee, J., Turner, K., & Johnson, J. (2008). Having our say: Middle grade student perspectives on school, technologies, and academic engagement. Journal of Research on Technology in Education, 40(4), 497–515.
Street, B. (2003). What's "new" in New Literacy Studies? Critical approaches to literacy in theory and practice. Current Issues in Comparative Education, 5(2), 1–15.
Strømsø, H. I., & Bråten, I. (2010). The role of personal epistemology in the self-regulation of Internet-based learning. Metacognition and Learning, 5(1), 91–111.
Tapscott, D. (1999). Educating the Net generation. Educational Leadership, 56(5), 6–11.
Tate, M., & Alexander, J. (1996). Teaching critical evaluation skills for World Wide Web resources. Computers in Libraries, 16(10), 49–55.
Thomas, D. (2006). A general inductive approach for analyzing qualitative evaluation data. American Journal of Evaluation, 27(2), 237–246.
Van Breukelen, G. J. (2006). ANCOVA versus change from baseline: More power in randomized studies, more bias in nonrandomized studies. Journal of Clinical Epidemiology, 59(9), 920–925.
Wittel, A. (2000). Ethnography on the move: From field to net to Internet. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 1(1). Retrieved from http://www.qualitative-research.net/fqs-texte/1-00/1-00wittel-e.htm
Zimmerman, B. J., & Bandura, A. (1994). Impact of self-regulatory influences on writing course attainment. American Educational Research Journal, 31(4), 845–862.
In a few sentences tell us how you feel about using the Internet to learn in school.
Please answer the following question, “What is the best part of using the Internet in school?”
In a few sentences let us know the best way to learn something new on the Internet.
Please answer the following question, “What is the hardest thing about using the Internet in school?”
In a few sentences describe the type of person you think is really good at reading websites.