
W. Ian O’Byrne, Dissertation Proposal

Facilitating critical evaluation skills through content creation:

Empowering adolescents as readers and writers of online information

W. Ian O’Byrne

B. A., University of Massachusetts, Amherst, 1997

M. Ed., University of Massachusetts, Amherst, 2001

A Dissertation Proposal

Submitted in Partial Fulfillment of the

Requirements for the Degree of

Doctor of Philosophy

at the

University of Connecticut

2010

Abstract

Is it possible to teach adolescents how to think more critically about online information

by having them create online content? This investigation explores that question. First, it will

construct and validate an instrument designed to measure the critical evaluation skills used by

students while reading online. Second, it will examine the use of the critical evaluation

instrument in measuring the effectiveness of an instructional model in teaching students how to

critically evaluate online information. Third, it will examine the effectiveness of this

instructional model in building the dispositions needed by students when reading online. Fourth,

it will examine themes and patterns that exist as groups of students comprehend and construct

online information.

Increasingly, students are using the Internet to obtain information about both personal and

academic topics (Lubans, 1999; Jones & Madden, 2002; Shackleford, Thompson & James,

1999). Along with this trend there is a growing concern about the dubious nature of online

information, and users’ ability to evaluate this information (Alexander & Tate, 1999; Flanagin &

Metzger, 2000; Browne, Freeman & Williamson, 2000). Research shows that students are

"frequently fooled" when viewing online content (Leu et al., 2007; Johnson & Kaye, 1998; Rieh

& Belkin, 1998). In particular, students are not able to judge the validity of a website, even when

given procedures to do so (Lubans, 1998, 1999). Because of the increasing use of the Internet in

our students’ lives, it is important that they become well versed in evaluating the validity and

reliability of websites (Leu et al., 2008).

The study will be conducted in three phases using multiple methods of data collection

and analysis. Phase one will guide students in an analysis of the argumentative techniques used

by online authors to manufacture sincerity, credibility, and relevance. Phase two will consist of

students constructing websites using the techniques that are used by authors of online

information. Phase three will encourage students to evaluate and critique websites constructed by

other students. Quantitative data will include students’ scores on instruments that will measure

the ability to critically evaluate online information and the dispositions of successful online

readers. Analysis of these data will be conducted using analysis of covariance to determine

posttest differences between the groups using the pretest as the covariate (Van Breukelen, 2006).
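The planned analysis of covariance can be sketched in a few lines of code. The data below are simulated purely for illustration (the group sizes, score scales, and effect size are hypothetical); the adjusted group effect is the coefficient on the group indicator in a linear model that includes the pretest as a covariate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated scores for illustration only: 100 students per group,
# mirroring the study's intended design.
n = 100
pre_c = rng.normal(50, 10, n)                    # control pretest
pre_t = rng.normal(50, 10, n)                    # intervention pretest
post_c = 0.8 * pre_c + rng.normal(10, 5, n)      # control posttest
post_t = 0.8 * pre_t + 5 + rng.normal(10, 5, n)  # intervention posttest (+5 = assumed effect)

pretest = np.concatenate([pre_c, pre_t])
posttest = np.concatenate([post_c, post_t])
group = np.concatenate([np.zeros(n), np.ones(n)])  # 0 = control, 1 = intervention

# ANCOVA expressed as a linear model: posttest = b0 + b1 * pretest + b2 * group.
# b2 is the posttest group difference adjusted for the pretest covariate.
X = np.column_stack([np.ones(2 * n), pretest, group])
(b0, b1, b2), *_ = np.linalg.lstsq(X, posttest, rcond=None)
print(f"pretest slope: {b1:.2f}, adjusted group effect: {b2:.2f}")
```

A statistical package would additionally supply the F test and p value for the group term; this sketch only shows how the pretest covariate enters the adjustment.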

Qualitative data collection and analysis techniques will be used to enable evaluation process data

to complement the interpretation of quantitative results. Data analysis of qualitative data will

consist of a rigorous content analysis (Mayring, 2000) to inductively analyze (Patton, 2002)

focus group interviews, field observation data, and student artifacts. The expected findings will

contribute to theory development, research, and practice in the use of online information in

school settings.

Introduction

It is a paradox that history’s first generation of “always connected” individuals (Pew

Research Center, 2010) does not often critically examine the information with which they are

connected (Alexander & Tate, 1999; Flanagin & Metzger, 2000; Browne, Freeman &

Williamson, 2000; Bennett, Maton, & Kervin, 2008). This presents important challenges for

classroom learning, as the Internet becomes an increasingly common source of information

during content area instruction.

It is increasingly clear that this generation of adolescents is almost always connected to

online information (Horrigan, 2010; Pew Research Center, 2010). Indeed, the Internet has

quickly become this generation’s defining technology for literacy, in part due to technology

facilitating access to an unlimited amount of online information and media (Rideout, Foehr, &

Roberts, 2010). Adolescents read online at a rate far greater than other segments of the world’s

population (Lenhart, Madden, & Hitlin, 2005). Moreover, adolescents’ consumption of digital

media for recreational use continues to increase: from 27 minutes per day in 1999, to 62 minutes

per day in 2004, to 89 minutes per day in 2009 (Rideout, Foehr, & Roberts, 2010).

While students read more online, important questions are being raised about their ability

to think critically and evaluate the information they encounter (Alexander & Tate, 1999;

Flanagin & Metzger, 2000; Browne, Freeman & Williamson, 2000). Research shows that

students are frequently deceived when viewing online content (Leu et al., 2007; Bennett, Maton,

& Kervin, 2008; Bråten, Strømsø, & Britt, 2009). In particular, students are not

always able to evaluate the validity of a website, even when given procedures to do so (Lubans,

1998, 1999). Given the increasing importance of online information, it is necessary to

examine effective ways to teach the critical evaluation of online information to adolescents.

This study examines an instructional model that teaches the critical evaluation of online

information. It has three purposes: (1) to evaluate a potentially promising instructional model

that teaches adolescents how to critically evaluate online content; (2) to examine the

effectiveness of this instructional model in building the dispositions needed by students when

reading online; (3) to examine the themes and patterns that exist as students think critically about

and construct online content. This quasi-experimental, mixed methods study (Shadish, Cook &

Campbell, 2002; Johnson & Onwuegbuzie, 2004) will investigate the extent to which critical

evaluation skills, required during online reading comprehension, can be improved using a

three-phase instructional model designed to engage students as creators of online information.

Statement of the Problem

A central challenge for educators today is that students do not always think critically

about information they encounter online. Research has raised questions about the ability of

students to evaluate online information (Alexander & Tate, 1999; Flanagin & Metzger, 2000;

Browne, Freeman & Williamson, 2000). Quite simply, many students appear not to have the

evaluation skills and strategies to succeed in this environment (Livingstone, 2004; Bennett,

Maton, & Kervin, 2008; Jewitt, 2008). Many students mistakenly trust information they

read online (Leu et al., 2007; Johnson & Kaye, 1998; Rieh & Belkin, 1998). In particular,

students are not able to accurately judge the validity of a website, even when given procedures to

do so (Lubans, 1998, 1999). The lack of critical evaluation skill while reading online

information is also a problem among adults. A report in 2006, for example, showed that

seventy-five percent of American adults rarely check the source and date of health information that they

find online (Fox, 2006).



Thus, it is clear that critical evaluation of online information is integral to the success of

online readers in their ability to evaluate and safely use the information they find (Lubans, 1999;

Shackleford, Thompson & James, 1999; Jones & Madden, 2002; Leu et al., 2008). Since online

information is commonly used to make decisions affecting the personal well-being of

individuals, the ability to critically evaluate this information has become increasingly important

to individuals at home, work, and school (Korp, 2006).

It is likely that at least two elements contribute to this issue. First, students increasingly

rely upon the Internet as a source of information. Second, since anyone may publish anything

online there is far more information that may be questionable.

The reliance on the Internet as a source of information

Increasingly, students in the United States are using the Internet to obtain information

about both academic and personal topics (Lubans, 1999; Jones & Madden, 2002; Shackleford,

Thompson, & James, 1999). With respect to academic topics, more than 90% of U.S. students

with home access to the Internet use online information for homework (Lenhart, Simon, &

Graziano, 2001). Over 70% of these students used the Internet as the primary source for

information on their most recent school report or project (Lenhart, Simon, & Graziano, 2001),

while only 24% of these students reported using the library for the same task (Lenhart, Simon, &

Graziano, 2001).

In regard to the use of Internet information for personal issues, we have the most

extensive data in the area of health information. A recent study showed that 55% of 7th to 12th

grade students reported using the Internet to look up health information about themselves or

someone they know (Rideout, Foehr, & Roberts, 2010). Moreover, it appears this pattern of

consulting the Internet for online health inquiries is not just limited to adolescents. Another study

showed that adults aged 18-30 use the Internet as a primary information resource when seeking

solutions to a health related problem (Estabrook, Witt, & Rainie, 2007). In 2001, almost half of

the 100 million Americans online reported using the Internet as a means to access health

information (Estabrook et al., 2001). On a typical day, 8 million American adults searched for

information about health topics (Fox, 2006).

The questionable nature of online content

As a result of the significance that individuals give to online information in their

academic and personal lives, there is growing concern about the reliability of these sources

(Alexander & Tate, 1999; Flanagin & Metzger, 2000; Browne, Freeman & Williamson, 2000).

This is partially due to the fact that there are few filters to analyze, critically evaluate and verify

accuracy and reliability of information published online (Flanagin & Metzger, 2000; Johnson &

Kaye, 1998; Rieh & Belkin, 1998). Additionally, traditional quality indicators are either difficult

to find or sometimes nonexistent (Fox, 2006). These indicators include authorship information,

vetted content, and a trail of revision audits (Fox, 2006). Thus,

it appears clear that online reading requires substantial ability to think critically, evaluate

information, and judge the veracity of what is being read (Alexander & Tate, 1999; Flanagin &

Metzger, 2000; Johnson & Kaye, 1998; Rieh & Belkin, 1998), perhaps even more so than offline

reading.

Additionally, the sincerity of online information sources is always open to

question (Brem, Russell, & Weems, 2001). The sincerity of an online information source has

been defined as the presentation of truthfulness in identity, intent, and information as a means to

determine honest social relationships (Trilling, 1972; Kolb, 1996; Dahlberg, 2001). Brem,

Russell, & Weems (2001) identified three typical web environments that represent different

levels of sincerity: hoaxes, weaker sincere sites, and stronger sincere sites. Hoax websites are

defined as website “fabrications” that have been created for entertainment purposes, usually

invoking the ridiculous, but maintaining a “superficial appearance of scientific professionalism”

(Brem, Russell, & Weems, 2001, p. 198). Weaker sincere sites are identified as more “balanced

between reputability and disreputability” (Brem, Russell, & Weems, 2001, p. 198) than hoax

websites or stronger sincere sites. The claims made are believable, and backed up by supporting

data found online, but do not stand up to close examination. Stronger sincere sites present

information that includes: “professional markers” of organization, more credible experts, and an

“air of precision and authority” (Brem, Russell, & Weems, 2001).

While finding credible information online is always challenging, one area of common

information use poses substantial risk of personal harm and illustrates the importance of being

able to accurately evaluate online information: the use of websites containing information about

health issues. The accuracy of online health information is often subject to debate, and when

incorrect has the potential to be harmful to an individual’s health (Berland et al., 2001; Cline &

Haynes, 2001). “Misinformation obtained from the Internet has the potential to produce

detrimental effects on health behavior outcomes” (Benotsch, Kalichman, & Weinhardt, 2004).

Due to the lack of dependability and reliability found in online information (Arunachalam,

1998), critical evaluation of online information is a crucial skill when searching for answers to

questions about health. The same can be said for many other important areas of one’s life.

Background of the Study

This section will first examine the theoretical perspectives that frame this study.

Thereafter, I will examine the previous work on critical evaluation, multimodal design, and

dispositions that also inform this study.



Theoretical Frameworks

The nature of literacy is rapidly evolving as the Internet and other information and

communication technologies (ICTs) emerge (Coiro, Knobel, Lankshear & Leu, 2008). These changes demand an

expanded view of “text” to include visual, digital and other multimodal formats (Rose & Meyer,

2002; New London Group, 2000; Alvermann, 2002). A richer and more complex definition of

literacy requires a richer and more complex theoretical framing of research (Leu, O’Byrne,

Zawilinski, McVerry, & Everett-Cacopardo, 2009). Thus, this study uses a multiple theoretical

perspective approach (Labbo & Reinking, 1999), incorporating perspectives from critical

literacy, new literacies, and cognitive apprenticeship.

Critical literacy. This study is framed within the theoretical perspective of critical

literacy. Rooted in socio-cultural perspectives of reading, critical literacy uses learning to “build

access to literate practices and discourse resources” (Luke, 2000, p. 449) for use as social capital

in the community (Freebody & Luke, 1990; Lankshear & Knobel, 1998). Critical literacy moves

away from the “self” in critical reading to understand how texts interact in different contexts

(Luke, 2000). In this study, texts will be analyzed to determine purpose, which ideologies are

represented, and how students can “reject them or reconstruct them” (Cervetti, Pardales, & Damico,

2001, p. 8) to support their own experiences (Luke, 2000). Student construction, or

reconstruction of online content in this study could be seen as “activism” (Morrell, 2002) or as

“online cyberactivism” (McCaughey & Ayers, 2003).

Activism in literacy can enhance the work-product with authenticity and increase the

author’s attention to audience (Brown, 2000; Oblinger, 2004; Tapscott, 1999). Activism

identifies the audience intended for a specific task that students are asked to complete (XXXX).

When authentically embedded into a critical literacy framework, activism allows the student to

become an “active viewer” (Davis, 1993) and empower himself or herself (Kellner & Share,

2005). Cyberactivism involves the use of a computer mediated communication (CMC) tool for

facilitating offline actions or change for the empowerment of online citizens and spreading

information (McCaughey & Ayers, 2003; Pickerill, 2003). A single website, consolidating all

student-created hoax websites along with the instructional model and findings of this study,

will provide a vehicle for students to express themselves as readers and writers of online

information.

New Literacies. In addition, this study is framed within both the larger definition of New

Literacies, as well as the more specific definition of new literacies as it applies to online reading

comprehension (Leu, O’Byrne, Zawilinski, McVerry, Everett-Cacopardo, 2009). As noted by

Coiro, Knobel, Lankshear, & Leu (2008) New Literacies theory has four principles:

1. ICTs require new skills, strategies, and dispositions for their effective use;

2. the acquired skills are central to full civic, economic, and personal participation in a

globalized community;

3. they are deictic and change regularly;

4. they are multiple, multimodal, and multifaceted.

In addition to a broader definition of New Literacies, this study is also framed within the more

specific theory of the new literacies of online reading comprehension (Leu, O’Byrne, Zawilinski,

McVerry, Everett-Cacopardo, 2009). This perspective defines online reading comprehension as a

process of problem-based inquiry, which requires new skills, strategies and dispositions (Leu,

Kinzer, Cammack, & Coiro, 2004; Leu et al., 2008). An important skill identified within online

reading comprehension is the ability to evaluate critically the information that an individual

encounters online.

Cognitive Apprenticeship. Critical evaluation has been shown to be a situated activity

(Brem, Russell, & Weems, 2001; Kiili, Laurinen & Marttunen, 2008). Thus, critical evaluation

requires an examination of the context, content, and contingencies that affect interpretation of

information by students (Bredo, 1994). Cognitive apprenticeship uses four dimensions (i.e.,

content, methods, sequence, sociology) to embed learning in activity and make deliberate the use

of the social and physical contexts present in the classroom (Brown, Collins, & Duguid, 1989;

Collins, Brown, & Newman, 1989). Cognitive apprenticeship includes the enculturation of

students into authentic practices through activity and social interaction (Brown, Collins, &

Duguid, 1989). Cognitive apprenticeship will inform this study by having students collectively

solve problems, take on multiple roles, confront ineffective strategies and misconceptions, and

develop collaborative work skills (Brown, Collins, & Duguid, 1989).

Examples of cognitive apprenticeship frequently appear in the research on instructional

models in reading, writing, and mathematics (Collins, Brown, & Newman, 1989). This work on

cognitive apprenticeship will inform the instructional model used in this study in two important

ways: 1) by defining the methods and sequencing used; and 2) by outlining reflective strategies

used by students.

Defining the methods and sequencing used. In regard to cognitive apprenticeship

informing this study, reliance will be placed on the complex pattern of goal setting and problem

solving known as “knowledge transformation” (Scardamalia & Bereiter, 1985). This will yield

information on skills and strategies used by the instructor, such as: modeling, coaching,

scaffolding and then fading in instruction (Scardamalia & Bereiter, 1985; Scardamalia, Bereiter,

& Steinbach, 1984).

Outlining reflective strategies used. The second element of cognitive apprenticeship that

informs this study includes the reflection strategies used by students. Students will be

encouraged to alternate between novice and expert strategies in a problem-solving context,

sensitizing them to specifics of an expert performance, and adjustments that may be made to

their own performance to get them to the expert level (Collins & Brown, 1988; Collins, Brown,

& Newman, 1989). Thus, reflection functions as “co-investigation” and/or abstracted

replay by students (Scardamalia & Bereiter, 1983; Collins & Brown, 1988).

Research in Critical Evaluation, Multimodal Design and Dispositions

In addition to the theoretical perspectives that frame this study, several areas of previous

research guide the investigation. These include: critical evaluation, multimodal design, and

dispositions of online reading comprehension.

Critical Evaluation. Critical evaluation has been defined as including the critical

thinking abilities used to: 1) question, analyze, and compare resources; 2) judge the quality of

information on various characteristics; and 3) defend an opinion with evidence from multiple

sources and prior knowledge (Coiro, 2008). Research on critical evaluation (Taylor, 1986; Tate

& Alexander, 1996; Metzger, 2007) has focused on a variety of information quality markers

(e.g., accuracy, authority, comprehensiveness, coverage, currency, objectivity, reliability, and

validity) but condenses to credibility and relevance as the two main constructs (Judd, Farrow, &

Tims, 2006; Kiili, Laurinen & Marttunen, 2008). Credibility is typically defined in terms of

expertise and trustworthiness (Judd, Farrow, & Tims, 2006), or the reliability of information

(Kiili, Laurinen, & Marttunen, 2008). Relevance is typically defined in terms of importance and

currency (Judd, Farrow, & Tims, 2006), or judgments about the essential nature of information

(Kiili, Laurinen, & Marttunen, 2008), especially in relation to the task.

Previous research on critical evaluation has also identified the situated nature of these

skills (Brown, Collins, & Duguid, 1989; Brem, Russell, & Weems, 2001). Researchers have

determined that as students critically evaluate they need to: negotiate multiple points of view,

consider conflicting information, and construct their own meaning (Hannafin & Land, 1997;

Hannafin & Land, 2000). Britt, Perfetti, Sandak & Rouet (1999) maintained that readers need to

construct a two level model while evaluating sources: integration (the text based, or internal

model) and situational (text mixed with prior knowledge and inter-text with other sources).

Critical evaluation as a situated activity was also demonstrated as students evaluate

argumentation and sincerity of online information (Brem, Russell & Weems, 2001). Furthermore

Graesser et al. (2007) found that successful online readers evaluate truth, relevance, quality,

impact, and claims made contemporaneously while evaluating the usefulness of the information

to the learner. As critical evaluation is viewed as more of a situated activity, researchers have

examined the varied context, content and contingencies that affect a student’s ability to critically

evaluate online information.

Multimodal Design. Originating from multimodalities (Kress & van Leeuwen, 2001;

Jewitt, 2008), multimodal design specifies the interchange between linguistic, visual, audio,

gestural, spatial and multimodal elements (New London Group, 2000). When integrated with a

New Literacies framework, students operate as “designers” and “apply critiqued knowledge of

the subject or topic synthesized from multimodal sources” (Kimber & Wyatt-Smith, 2006, p. 26).

Students construct “representations of new knowledge” and communicate this to others with the

intention of engaging their audience (Kimber & Wyatt-Smith, 2006, p. 26). As a pedagogical

tool, design holds together the “process and product” (New London Group, 2000), and allows

students to consider how literacy practices are used to determine truth (Street, 1984; Alvermann

& Hagood, 2000). To provide guidance in instruction and assessment of student work-product in

this study, the skills embedded in multimodal design need to be integrated. Therefore, given the

deictic nature of literacy (Leu, 2000), it is problematic to view creation of content using CMC

tools as belonging to only one skill set, such as blogging, wikis, e-mail, social networks, or

word processing. As these technologies converge, some experts believe the skills associated

with various CMC tools may merge as well (Fox, Anderson, & Rainie, 2005; Anderson & Rainie,

2008; Greenhow, Robelia, & Hughes, 2009). Thus, a broad spectrum of combined skills and

tools may emerge.

Since this study integrates multiple lines of research from many fields (i.e.,

multiliteracies, new media, digital storytelling, digital literacy, gaming, and others) the

combination of all these skills will be referred to as online content construction (OCC). This

integration of skills originates from content creation as defined by Sonia Livingstone in her

theoretical definition of media literacy (2004) and the instability that ICT research presents.

Since she maintains that we must “identify, in textual terms, how the Internet mediates the

representation of knowledge, the framing of entertainment, and the conduct of communication”

(Livingstone, 2004), the construct must be broad enough to allow for change in the future. Thus it

appears that integrating these multiple lines of research within OCC is consistent with this

reality.

Dispositions. Recent theories of reading comprehension (Alexander & Jetton, 2002;

RAND Reading Study Group, 2002) and critical evaluation (Rieh, 2002) suggest that learning is

a mixture of affective variables (Baker & Wigfield, 1999; Guthrie & Wigfield, 1997) and

motivational factors (Zimmerman & Bandura, 1994) that go beyond specific skills. To measure

not only the cognitive processes involved in this study but also the affective dimension, it is

necessary that this study measure student dispositions. Carr & Claxton (2002) define

dispositions as a “tendency to edit, select, adapt, and respond to the environment in a recurrent,

characteristic kind of way.” Learning dispositions can be seen as a “pattern of behaviors, situated

in the context of the environment, that when recognized and developed by those who can

manipulate the environment may lead to gains in the acquisition of knowledge, skills and

understandings” (O’Byrne & McVerry, 2009). Due to the unlimited nature of the Internet

(Alvermann, 2004; Gross, 2004) these dispositions may be even more significant as individuals

read information online (Liaw, 2002; Lin, Wu, & Tsai, 2005; Coiro, 2007). Inclusion of a

measure of dispositions allows for recognition of the affective variables and skills that affect

students as they critically evaluate and construct online information.

Summary of Theoretical Perspectives and Previous Research

In summary, the theoretical perspectives of critical literacy, new literacies, and cognitive

apprenticeship will inform this study. Additionally, previous research informing critical

evaluation, multimodal design, and dispositions of online reading will also guide this study.

Since credibility and relevance appear to most broadly define the critical evaluation of online

content (Judd, Farrow, & Tims, 2006; Kiili, Laurinen & Marttunen, 2008), these two factors will

be used to measure the central construct of this study, critical evaluation. The skills and

strategies identified as OCC will be used in conjunction with perspectives on cognitive

apprenticeship and student reflection to address the research questions that guide this study.

Research Questions

Previous research indicates that teaching adolescents how to critically evaluate what is read

online is complicated (Johnson & Kaye, 1998; Rieh & Belkin, 1998; Leu et al., 2007). Indeed,

we have yet to identify an instructional approach that satisfactorily accomplishes this goal

(Lubans, 1998; 1999). This study explores how critical evaluation skills might be improved

using a three-phase model of instruction, based on principles of cognitive apprenticeship. First,

students will analyze the techniques authors use to make websites credible (Britt & Gabrys,

2002; Fogg, Marshall, Laraki, Osipovich, Varma, Fang, et al., 2001). Second, students will

construct websites while manufacturing markers of online sincerity (Brem, Russell, & Weems,

2001). Third, students will reflect (Collins & Brown, 1988; Collins, Brown, & Newman, 1989)

on the knowledge and strategies used while critically evaluating and constructing online

information.

This instructional model is expected to build students’ critical evaluation skills with

online information. By having students synthesize discourse elements (Ackerman, 1991;

McInnis, 2001) it is expected that students will comprehend “the interactive product of text and

context of various kinds” (Spiro, 1980). Additionally, Cognitive Apprenticeship theory suggests

(Brown, Collins, & Duguid, 1989; Collins, Brown, & Newman, 1989) that by engaging students

as “co-investigators” (Scardamalia & Bereiter, 1983) we will encourage them to reflect on

strategies they have or may need.

Guided by previous research in critical evaluation, multimodal design, and dispositions

and framed within a multiple theoretical perspective, this study will address the following

research questions:

RQ1. What are the levels of reliability and validity obtained from the construction and

validation of a critical evaluation instrument that measures the critical thinking skills of

adolescents as they read online?

Hypothesis: Results obtained in a pilot study (O’Byrne, 2009) indicated a need to

develop a multiple-choice version of the instrument. It is hypothesized that construction

of this version of the instrument, while employing sound psychometric techniques, will

provide a valid and reliable instrument to be used in research.
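The reliability portion of this hypothesis is often examined through an internal-consistency estimate. The sketch below uses hypothetical 0/1 item responses; for dichotomous multiple-choice items, Cronbach’s alpha is equivalent to KR-20:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal consistency of a (students x items) score matrix.

    For dichotomous (0/1) items this equals KR-20.
    """
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of students' total scores
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical responses: 6 students x 4 items (1 = correct, 0 = incorrect).
scores = np.array([
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
])
print(round(cronbach_alpha(scores), 2))  # → 0.83
```

Validity evidence (e.g., expert review of items, correlations with related measures) would still need to be gathered separately; this statistic addresses reliability only.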

RQ2. Does an instructional model that teaches the critical evaluation and construction of online

content with varying levels of sincerity improve the critical thinking skills of adolescents as

identified on the measure of critical evaluation validated in this study?

Hypothesis: Based on results obtained in a pilot study (O’Byrne, 2009), it is

hypothesized that construction of online content with varying levels of sincerity will have

a positive effect on the ability to critically evaluate online information.

RQ3. Does an instructional model that teaches the critical evaluation and construction of online

content with varying levels of sincerity improve student scores on an assessment that measures

the dispositions of online reading?

Hypothesis: The Dispositions of Online Reading Comprehension (DORC) instrument

(O’Byrne & McVerry, 2009) has been effective in measuring the dispositions necessary

for online reading comprehension. It is hypothesized that time spent reading, evaluating,

and constructing online content will improve scores on the DORC.

RQ4. What are the themes and patterns that exist as groups of students comprehend and

construct online information in a one-to-one laptop classroom?



Expectation: It is expected that this qualitative analysis will identify some of the skills,

strategies, and dispositions previous research has identified as necessary for online

reading comprehension (Leu et al., 2008). It is also expected that new skills, strategies,

and dispositions will be identified that will further inform literacy practices in online

spaces.

Methods and Procedures

This study employs a quasi-experimental, mixed methods framework (Shadish, Cook &

Campbell, 2002; Johnson & Onwuegbuzie, 2004), which tests the use of an instructional model

that will empower students as evaluators and constructors of online information (Salomon, 1997;

Fabritius, 1999).

RQ1. Can a valid and reliable instrument be constructed to measure the critical evaluation skills used by adolescents while reading online?

Settings and Participants. The study will be conducted with a convenience sample of

two groups of seventh grade students and teachers from one school in Connecticut. The

convenience sample was selected because of the instructor’s expertise in working with students in a one-to-one laptop classroom (Leu, Reinking et al., 2008). Because the instructor already possesses the special skills, strategies, and dispositions needed in such a classroom, instruction in the study will proceed more efficiently. The student population will consist of 200 students regularly enrolled in English language

arts classes during the 2009-2010 school year. The teachers in the study will be those who are

regularly assigned to these students. The intervention group (n = 100) will consist of students in

one-to-one laptop classrooms, and a teacher with a high degree of competence in working in a

one-to-one laptop classroom. The control group (n = 100) will consist of a teacher from the same

middle school with access to a school computer lab, but without special training working in a

one-to-one laptop classroom. Thus, the intervention group will have a teacher skilled in working

in a one-to-one laptop classroom and capable of showing the students how to critically use the

resources provided.

To ensure that instruction in the control classrooms remains as close to the normal

English language arts curriculum as possible, data will be collected to provide a qualitative

account of the instruction that takes place in the control group classrooms during the study. The

researcher will collect: the curriculum provided by the school district, lesson plans created by the

control group classroom teacher, and an observation of the control group classroom. Twice

during the study the researcher will observe the control group classroom for all sessions of the

class to take notes on the general curriculum being instructed. These three sources of data will be

used to create a general qualitative account of the content of instruction being presented in the

control group.

Major Dependent Variable. The critical evaluation instrument was based on a measure

used by Brem, Russell, and Weems (2001). A variation of the instrument was used in the pilot study and, as a result, is being thoroughly revised to match the hypothesized constructs of

credibility and relevance (Judd, Farrow & Tims, 2006; Kiili, Laurinen & Marttunen, 2008). The

instrument is currently undergoing content validation with experts. Further validation of the

instrument, including the use of Item Response Theory (IRT) and tests of internal consistency

will be conducted using results from this study. This section will define the constructs and subconstructs measured by the instrument, the content validation techniques being implemented, and the planned testing of validity and reliability.



Constructs and subconstructs measured. The construction of the critical evaluation

instrument began with a literature review to determine the subconstructs of the hypothesized

factors of credibility and relevance (Judd, Farrow & Tims, 2006; Kiili, Laurinen & Marttunen,

2008). These definitions were used to develop multiple-choice items that reflect the constructs

(credibility and relevance) and subconstructs. The tasks presented in each item are situated in activities that adolescents would engage in as they search for online information.

Credibility is defined in terms of expertise and trustworthiness (Judd, Farrow, & Tims,

2006) or the reliability of information (Kiili, Laurinen, & Marttunen, 2008). The subconstructs of

credibility are defined as evaluation of: author, purpose, source, content, argument, and accuracy

of online information (Kiili, Laurinen, & Marttunen, 2008). Evaluation of author is defined as

the creator or source of the information showing evidence of being knowledgeable, reliable, and

truthful (Harris, 1997). Evaluation of purpose is defined as the reader’s consideration of the intended purpose of the information (Harris, 1997). Source is defined as consideration

of the provider of the information and whether they are able to answer the desired question (Rieh

& Belkin, 1998; Strømsø & Bråten, 2010). Content is defined as consideration of the style or

manner in which information is presented (Harris, 1997; Kiili, Laurinen, & Marttunen, 2006).

Argument is defined as consideration of evidence provided by the author referencing claims

made (Kiili, Laurinen, & Marttunen, 2006). Accuracy is defined as consideration of the vetted,

reliable, and error-free nature of the information (Meola, 2004).

Relevance is defined in terms of importance and currency (Judd, Farrow, & Tims, 2006),

or judgments about the essential nature of information (Kiili, Laurinen, & Marttunen, 2008),

especially in relation to the task. The subconstructs of relevance are defined as evaluations of

relevance of topic and website, as well as usability and currency of information (Kiili, Laurinen,

& Marttunen, 2008). Relevance of topic is defined as consideration of the information as being

essential to the student’s task (Kiili, Laurinen, & Marttunen, 2008). Relevance of website is

defined as consideration of the essential nature of the website as a source of information in

relation to other sources of online information (Kiili, Laurinen, & Marttunen, 2008). Usability is

defined as consideration of the mode or medium of information presented as it relates to task

(Kiili, Laurinen, & Marttunen, 2008). Currency of information is defined as consideration of the

information as being in circulation and valid at the present time (Meola, 2004; Kiili, Laurinen, &

Marttunen, 2008).

Content validation techniques. In order to establish item validity, the instrument is

currently undergoing a content validation phase with experts familiar with critical evaluation

research to develop definitions for the constructs (McKenzie, Wood, Kotecki, Clark, & Brey,

1999). The six experts include professors and graduate students familiar with research on the critical evaluation of online information. The experts are rating the dimensionality of each of

the twenty multiple-choice items by indicating which of the constructs and subconstructs the

item measures. Any item identified by 90% of the experts as measuring the hypothesized construct will be kept for further analysis (Gable & Wolfe, 1993; McKenzie et al., 1999). A

Content Validity Index (CVI) (Rubio, Berg-Weger, Tebb, Lee, & Rauch, 2003) will be created

for each item using the feedback provided by the experts to test for multidimensionality of items.

For inclusion in the final version of the instrument to be used in this study, the CVI for each item

will need to exceed 0.70 (Rubio et al., 2003). Finally, the experts are encouraged to leave written

feedback that will be used to ensure the adequacy and accuracy of the construct definitions and items constructed (McKenzie et al., 1999).
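The two retention rules described above (90% expert agreement and a per-item CVI above 0.70) can be sketched computationally. This is an illustrative sketch only; the ratings matrix and item count are hypothetical, not the study's actual data.

```python
# Sketch of the expert-agreement screen (hypothetical ratings).
# Each row is one expert; each column is one multiple-choice item. A cell is
# 1 if that expert judged the item to measure its hypothesized construct.

def cvi(ratings):
    """Content Validity Index per item: proportion of experts endorsing it."""
    n_experts = len(ratings)
    n_items = len(ratings[0])
    return [sum(row[i] for row in ratings) / n_experts for i in range(n_items)]

# Six experts rating four items (illustrative values only).
ratings = [
    [1, 1, 0, 1],
    [1, 1, 1, 1],
    [1, 0, 1, 1],
    [1, 1, 0, 1],
    [1, 1, 1, 1],
    [1, 1, 1, 1],
]

scores = cvi(ratings)
retained = [i for i, s in enumerate(scores) if s > 0.70]
```

Note that with a panel of six, the 90% agreement criterion effectively requires all six experts to agree (5 of 6 is roughly 83%), while the CVI > 0.70 criterion is met by at least five of six endorsements.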



Planned testing of validity and reliability. Further validation of the critical evaluation

instrument will be conducted using results from student responses in the study. IRT will be used

to summarize student response data into estimates of each student’s standing on the latent trait and of the item parameters (Hambleton & Swaminathan, 1985). IRT is appropriate for this study

since it provides explicit modeling for the probability of each possible response to each item. To

ensure reliability of the instrument, the researcher will also test for internal consistency, or the

degree of relatedness of the items. Cronbach’s coefficient alpha will be calculated (Pett et al., 2003) to identify the proportion of total score variance in a given scale that can be attributed to a common source.

This calculation of Cronbach’s alpha requires several assumptions to be met: 1) the items are

independent; 2) the errors are uncorrelated; 3) the items are positively correlated; and 4) the

items are unidimensional. To test these assumptions an initial factor analysis will be conducted

using student scores to determine which items have positive inter-item correlations. The inter-

item correlation matrix will be checked to remove any items that are redundant, too positively

correlated, or negatively correlated with each other. Finally, multidimensionality of the

instrument will be checked, and any items that load at least moderately on more than one scale will be deleted.
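As a concrete illustration, Cronbach's alpha and the positive inter-item correlation check described above can be computed from a respondents-by-items score matrix. The sketch below uses NumPy and fabricated scores; it is not the study's analysis code, and the data are invented solely to show the calculation.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Fabricated responses: 6 students x 3 positively correlated items.
scores = np.array([
    [1, 1, 2],
    [2, 1, 2],
    [2, 2, 3],
    [3, 3, 3],
    [3, 4, 4],
    [4, 4, 5],
])

alpha = cronbach_alpha(scores)               # ~0.96 for this fabricated set
corr = np.corrcoef(scores, rowvar=False)     # inter-item correlation matrix
# Screening step: all off-diagonal correlations should be positive.
assert (corr[np.triu_indices(3, k=1)] > 0).all()
```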

Research Question 2: Does an instructional model that teaches the critical evaluation and

construction of online content with varying levels of sincerity improve the critical thinking

skills of adolescents as identified on the measure of critical evaluation validated in this

study?

Settings and Participants. The school, students, and instructors that have been described

in RQ1 are the same settings and participants as in RQ2.



Power Analysis. An a priori power analysis was conducted to determine adequate

sample size for the two groups in the study. A power analysis program called GPOWER (Faul,

Erdfelder, Buchner, & Lang, 2009) was used with a power estimate of 0.80, a Cohen’s d effect size of 0.35, and an alpha level of 0.05. A medium effect size of 0.35 was selected, which is considered adequate for research of this kind (Cohen, 1992). With a conservative correlation of 0.60 between the covariate and the dependent variable (i.e., between the pretest and posttest), the required sample size would be 170 participants.
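The logic of this calculation can be approximated in code. When a covariate correlates at r with the outcome, the ANCOVA error variance shrinks by a factor of (1 - r²), so the operative standardized effect becomes d/√(1 - r²). The sketch below uses statsmodels' TTestIndPower as a stand-in for the GPOWER program and should land near the 170 participants reported above; it is an illustrative approximation, not the proposal's actual calculation.

```python
import math
from statsmodels.stats.power import TTestIndPower

d = 0.35       # planned Cohen's d
r = 0.60       # pretest-posttest correlation (covariate vs. outcome)
alpha = 0.05
power = 0.80

# One covariate reduces error variance by (1 - r^2), so the effective
# standardized effect size grows accordingly.
d_adj = d / math.sqrt(1 - r**2)   # 0.35 / 0.8 = 0.4375

n_per_group = TTestIndPower().solve_power(
    effect_size=d_adj, alpha=alpha, power=power, ratio=1.0,
    alternative='two-sided')

total_n = 2 * math.ceil(n_per_group)   # roughly 166-170 participants overall
```

The small gap between this approximation and GPOWER's figure comes from the different test statistic (t-test here versus GPOWER's ANCOVA F-test with its covariate degree-of-freedom adjustment).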

Instructional Model. The instructional model contains three phases with instruction

guided by modeling, coaching and then fading as detailed by Cognitive Apprenticeship theory

(Collins, Brown, & Newman, 1989). The intervention will be conducted four times a week

during the students’ regular English language arts ninety-minute class time, and will last six

weeks. During the first phase, students will analyze the argumentative techniques used by OCC

authors to manufacture the appearance of sincerity and resultant credibility (Brem, Russell, &

Weems, 2001; Britt & Gabrys, 2002; Fogg, Marshall, Laraki, Osipovich, Varma, Fang, et al.,

2001). These techniques include multiple layers of argument used by OCC authors that provide

challenges to online readers as they try to evaluate credibility, accuracy, reasonableness, and

support (Harris, 1997; Brem, Russell, & Weems, 2001). These techniques include, but are not

limited to: hyperlinks, about us pages, images, video, testimonials, page layout, color choices,

and font selection. During the second phase, groups of students will collaboratively construct

websites by manufacturing these indicia of sincerity in their work-product, previously identified

as hoax websites (Brem, Russell, & Weems, 2001). During the third phase, students will evaluate

and critique the work-product of others and reflect upon the skills and strategies used by novice

and expert critical evaluators and authors of online information (Collins & Brown, 1988). The

cumulative work-product and lessons learned will be consolidated at one accessible online

location for review and use by others.

Fidelity checks will be conducted to ensure consistency across various class sessions of

students during the study. A criteria checklist will be created to identify two levels of

information that should exist in the instruction given to treatment groups. This checklist will

consist of: a) general elements (e.g., time, topic, assignments) and b) specific activity level

elements. This checklist will be constructed and routinely edited by the instructor and researcher

weekly during lesson-planning sessions. At least once during each of the three phases of the instructional model, an outside observer will conduct a fidelity check to ensure

consistency of instruction across all classes of students. The outside observer will be a researcher

with knowledge of the study and constructs involved. The observer will receive a checklist

identifying the general elements and specific activity level elements that should be evident in

instruction. The observer will watch all classes during one school day and take notes as to how

consistent instruction was across all classes, as identified by the checklist. These findings and

any additional comments will then be shared with the researcher and instructor.

Analysis of Covariance. The pretest scores on the critical evaluation instrument will first

be tested to determine whether pretest differences exist between groups. If differences do exist,

analysis of data for RQ2 will be conducted using a one-way analysis of covariance (ANCOVA)

test. This will analyze differences between groups on student scores of the critical evaluation

instrument, using the pretest as the covariate (Van Breukelen, 2006). The use of an ANCOVA

for analysis carries an assumption of homogeneity of regression slopes, which will be tested by

building an interaction between the covariate and treatment group. If this interaction is

significant, then an ANCOVA cannot be used in this analysis. In this case, the analysis will be

conducted using a multiple linear regression that retains the pretest-by-group interaction term in the model. Differences between groups on the posttest, after adjusting for the pretest, will suggest that the instructional model had an effect on students’ critical evaluation skills.
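The analysis plan above (ANCOVA as a linear model with the pretest as covariate, preceded by a homogeneity-of-slopes check via a pretest-by-group interaction) can be sketched with statsmodels on simulated data. The variable names, simulated effect size, and noise level are illustrative assumptions, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 100
pre = rng.normal(50, 10, size=2 * n)
group = np.repeat(['control', 'treatment'], n)
# Simulated posttest: depends on pretest, plus a treatment bump and noise.
post = (0.6 * pre
        + np.where(group == 'treatment', 4.0, 0.0)
        + rng.normal(0, 8, size=2 * n))
df = pd.DataFrame({'pre': pre, 'post': post, 'group': group})

# Homogeneity of regression slopes: test the pretest-by-group interaction.
slopes = smf.ols('post ~ pre * C(group)', data=df).fit()
interaction_p = slopes.pvalues['pre:C(group)[T.treatment]']

# ANCOVA-style model: group effect on posttest, adjusted for pretest.
ancova = smf.ols('post ~ pre + C(group)', data=df).fit()
effect = ancova.params['C(group)[T.treatment]']  # adjusted group difference
# If interaction_p were below .05, the proposal's fallback would be to report
# the regression that retains the interaction term instead of the ANCOVA.
```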

Research Question 3: Does an instructional model that teaches the critical evaluation and

construction of online content with varying levels of sincerity improve student scores on an

assessment that measures the dispositions of online reading?

Settings and Participants. The school, students, and instructors that have been described

in RQ1 are the same settings and participants as in RQ3.

Power Analysis. An a priori power analysis was conducted to determine adequate

sample size for the two groups in the study. A power analysis program called GPOWER (Faul,

Erdfelder, Buchner, & Lang, 2009) was used with a power estimate of 0.80, a Cohen’s d effect size of 0.35, and an alpha level of 0.05. A medium effect size of 0.35 was selected, which is considered adequate for research of this kind (Cohen, 1992). With a conservative correlation of 0.60 between the covariate and the dependent variable (i.e., between the pretest and posttest), the required sample size would be 170 participants.

Instructional Model. The instructional method tested in RQ3 is the same method as detailed in RQ2. It is a three-phase model of instruction based on principles of cognitive apprenticeship: 1) students will analyze the techniques authors use to make websites credible

(Britt & Gabrys, 2002; Fogg, Marshall, Laraki, Osipovich, Varma, Fang, et al., 2001); 2) then

students will construct websites with manufactured markers of online sincerity (Brem, Russell, &

Weems, 2001); 3) and finally students will reflect (Collins & Brown, 1988) on the knowledge

and strategies used while critically evaluating and constructing online information.

Analysis of Covariance. The pretest scores on the DORC will first be tested to ensure

that pretest differences do exist between groups. If differences do exist, analysis of data for RQ3

will be conducted using a one-way ANCOVA test. Student gains in the dispositions needed

while reading online will be measured by the Dispositions of Online Reading Comprehension

(DORC) instrument (O’Byrne & McVerry, 2009). The ANCOVA will be conducted using the

pretest as the covariate (Van Breukelen, 2006) to analyze differences between groups. The use of

an ANCOVA for analysis carries an assumption of homogeneity of regression slopes, which will

be tested by building an interaction between the covariate and treatment group. If this interaction

is significant, then an ANCOVA cannot be used in this analysis. In this case, the analysis will be

conducted using a multiple linear regression that retains the pretest-by-group interaction term in the model. Differences between groups on the posttest, after adjusting for the pretest, will suggest that the instructional method had an effect on the students’ dispositions needed for reading online information.

Research Question 4. What are the themes and patterns that exist as groups of students

comprehend and construct online information in a one-to-one laptop classroom?

Settings and Participants. The school and instructors described in RQ1, RQ2 and RQ3

are the same settings for RQ4. Qualitative data will be collected from the three top-performing and three bottom-performing groups of students as identified by researcher and instructor

observations. Student groups, heterogeneously composed of two to four students, will be

purposely sampled to provide data that addresses the research question. Observations of student

groups will begin for a period of two weeks before the study begins while students work in

groups they normally use for their English language arts class. Student groups will be selected using modified criteria established by Oakley, Felder, Brent, and Elhajj (2004), beginning with the exclusion of any groups that do not work well together. The objective of RQ4 is to identify the benchmarks established by groups that work well together in this study and to provide educators with the skills and strategies needed by students; for this reason, groups that do not work well together will not be considered. After groups have been identified

that work well together, the final selection will be made using the checklist detailed in Table 1.

________________

Insert Table 1 Here

_________________

Determination of the selected groups will be made using observation of classroom group

behavior and completed group work. To ensure that these selected groups will be effective in

addressing RQ4, the researcher and instructor will exchange observation notes and make a final

decision as to which groups to follow by the end of the first week of the study.

Qualitative Data Collection. Three types of data will be collected to allow for

triangulation of findings (Denzin, 1978): interviews of student groups, researcher notes, and

student work-products.

Interviews of student groups. Interviews of student groups will consist of think-aloud

focus groups (Krueger & Casey, 2008) with the selected top three and bottom three groups to

understand how students engage with online information (Damico & Baildon, 2007). The main

research question of the focus groups will be RQ4. The focus groups will consist of one to two

meetings with each of the six groups of students, each lasting 30 minutes. The researcher will act

as the moderator, with the role being that of a “seeker of knowledge” (Krueger & Casey, 2008).

In the focus groups, students will be asked to detail choices made in the critical evaluation and

construction of online information. The students will be informed of the intent of the interviews ahead of time so they can prepare their thoughts, allaying concerns that students may have difficulty articulating metacognitive decisions while being interviewed

(Ericsson & Simon, 1993; Ericsson, 2002). These interviews will be collected using iShowU, a

screen-capture application. The use of video screen captures is necessary to allow analysis of the students describing the design choices of their websites on their computers. These recorded interviews will be coded and transcribed using ELAN, a video analysis program for qualitative data.

Researcher notes. Researcher notes will include observational notes collected during the

study. The notes will include observations of group dynamics of the six selected groups and

indications of how effectively they work with their groups. The researcher will also take note of

specific strategies used, modifications made, or problems encountered while working with

groups.

Student work-products. Student work-products will include websites and graphic

organizers created by students during the study. In the instructional model, students will

construct graphic organizers to map out and plan representations of their hoax websites. These

graphic organizers will be full color maps of their websites that indicate plans of design choices.

The hoax websites will be constructed using iWeb (website construction software) and saved on

student laptops. These graphic organizers and final versions of their hoax websites will be

collected and analyzed.

Analysis. Analysis of the data will be conducted with the intent of identifying

connections between the data and research questions (Thomas, 2006) to explain the decisions

students make as they critically evaluate and construct online content. Analysis will be

conducted in a multi-step process to inductively analyze (Patton, 2002) and ultimately develop

themes (Merriam, 2002) from the data. Initially, the researcher will closely read across the

interview data, researcher notes, and student work-product to gain insight into the diversity of

codes possible. During this initial close reading of the data, the researcher will take notes and

create preliminary inductive codes. The researcher will then revisit the preliminary codes and

examine their relationship to the research purposes (Thomas, 2006). These codes will then be

used to code all documents. After the initial coding, constant comparative methods (Glaser &

Strauss, 1967) will be used across all codes to collapse the preliminary codes into specific

categories. Once categories have been created, a rigorous content analysis (Mayring, 2000) will be used to bring order to the data and identify themes.

Limitations

Several limitations impact the findings of this study. Given that there are no previous

instruments used to evaluate the ability to critically evaluate information during online reading

comprehension, this study uses a new instrument. Care has been taken, however, to ensure that

this instrument will provide valid and reliable data. The instrument was previously piloted and

revised thoroughly prior to this study. A systematic content validation using experts is currently

taking place. Further validation techniques, including an exploratory factor analysis (EFA) and tests of internal consistency, will help to ensure that this instrument provides valid and reliable data.

Another limitation of the study is the setting. This school was selected because of the

presence of a one-to-one laptop classroom, and an instructor skilled in teaching in this

environment. Additionally, the researcher served as a resource and instructor in the classroom.

The addition of the extra instructor, the expertise on working with OCC tools and the presence of

one-to-one laptops could be perceived to be advantages for the intervention group. It is unknown

whether this study could be conducted with an instructor that did not have the skills and

dispositions needed to bring this instructional model to the average seventh grade student.

Another limitation would be the previous skill level of students in the study. It is

unknown what previous training the students may have received in OCC tool use, either in

school or out. Additionally, there are other cognitive factors that affect the success and failure of

students that cannot be measured by the critical evaluation instrument and DORC. Due to the

status of the participants as seventh grade students in a middle school, there are challenges and

opportunities that affect the study (Moje, 2002). Adolescents, specifically seventh graders, have been shown to engage in identity construction as part of their literacy practices (Broughton &

Fairbanks, 2003). Identity construction in adolescent literacy and fiction is still being studied by

the research community (Moje, 2002; Bean & Moni, 2003). These literacy practices could be

seen as the construction of sincere and credible online information by students in this study. The

inclusion of qualitative data in the form of interviews and researcher notes is an attempt to

recognize these influences.

Additionally, the use of a convenience sample, rather than a random sample of students, is a limitation. The sample was specifically selected because of

the presence of one-to-one laptops, and an instructor that regularly uses them effectively in

classroom practice. Thus, these results must be cautiously interpreted and only used to plan more

systematic studies with randomized controlled trials.

Significance of the Study

As students increasingly use the Internet for information seeking, the skills and dispositions needed to comprehend these texts become vital.

rarely successfully deploy the healthy skepticism that is needed when searching and evaluating

information online. Research such as this continues the examination of those patterns but extends it by testing the effectiveness of instruction in these skills. While dispositions used

during online reading comprehension have been successfully measured (O’Byrne & McVerry,

2009), research needs to continuously examine the multifaceted dispositions that occur as

students critically evaluate and construct online information.

Research has shown that direct instruction of these skills does not transfer into student

acquisition and use of these skills when reading online (Spires, Lee, Turner, & Johnson, 2008;

Greenhow, Robelia, & Hughes, 2009). The study will test a new model for teaching greater

understanding of credibility, relevance and sincerity of online information through the act of

constructing these markers (Brem, Russell & Weems, 2001). Future research can also investigate

whether content construction and computer-mediated communication can affect the operational,

academic and critical lenses (Damico, Baildon, & Campano, 2005) students must use while

reading online.

Results from this study will also point to a new field of research involving the content

construction habits of adolescents in online spaces. The ability to modify markers of credibility,

relevance, and sincerity shows students’ dexterity not only to recognize these markers, but also to

create or remix them. Further research can investigate methods to deploy scaffolding techniques

when constructing online information. Additionally, skill in the use of these markers of critical

elements of online information suggests that these same skills can be carried over to offline

reading and writing.



References

Ackerman, J. (1991). Reading, writing, and knowing: The role of disciplinary knowledge in

comprehension and composing. Research in the Teaching of English, 25, 133-178.

Alexander, P. A., & Jetton, T. L. (2002). Learning from text: A multidimensional and

developmental perspective. In M. L. Kamil, P. Mosenthal, P. D. Pearson, and R. Barr

(Eds.) Handbook of reading research, Volume III (pp. 285-310). Mahwah, NJ: Erlbaum.

Alexander, J. E., & Tate, M. A. (1999). Web wisdom: how to evaluate and create

information quality on the Web. Hillsdale, NJ: Lawrence Erlbaum.

Alvermann, D. & Hagood, M. (2000) Fandom and critical media literacy. Journal of

Adolescent & Adult Literacy, 43, 436-446.

Alvermann, D. (Ed.). (2002). Adolescents and Literacies in a Digital World. New York: Peter

Lang.

Alvermann, D. (2004). Media, information communication technologies, and youth literacies: A

cultural studies perspective. American Behavioral Scientist, 48(1), 78-83.

Anderson, J. Q., & Rainie, L. (2008). The Future of the Internet III. 

Pew Internet and American Life Project.

Bailey, G. & Blythe, M. (1998). Outlining, Diagramming and Storyboarding –

Three Essential Strategies for Planning and Creating Effective Educational Websites.

Learning & Leading With Technology, 6-11.

Baker, L., & Wigfield, A. (1999). Dimensions of children’s motivation for reading and their

relations to reading activity and reading achievement. Reading Research Quarterly, 34,

452-476.

Bennett, S., Maton, K., & Kervin, L. (2008). The digital natives debate: A critical review

of the evidence. British Journal of Educational Technology, 19, 775–786.

Benotsch, E., Kalichman, S., & Weinhardt, L. (2004). HIV-AIDS patients’ evaluation of

health information on the Internet: The digital divide and vulnerability to fraudulent

claims. Journal of Consulting and Clinical Psychology, 72, 1004–1011.

Berland, G., Elliott, M., Morales, L., Algazy, J., Kravitz, R., Broder, M., et al. (2001). Health

information on the Internet: accessibility, quality, and readability in English and Spanish.

JAMA, 285, 2612-2621.

Brandt, S. (1997). Constructivism: Teaching for understanding on the Internet.

Communications of the ACM, 40, 112-116.

Bråten, I., Strømsø, H.I., & Britt, M.A. (2009). Trust matters: Examining the role of

source evaluation in students' construction of meaning within and across multiple

texts. Reading Research Quarterly, 44, 6-28.

Bredo, E. (1994). Reconstructing educational psychology: Situated cognition and Deweyian

pragmatism. Educational Psychologist, 29(1), 23-25.

Brem, S. K., Russell, J., & Weems, L. (2001). Science on the Web: Student evaluations

of scientific arguments. Discourse Processes, 32, 191–213.

Britt, M., Perfetti, C., Sandak, R., & Rouet, J. (1999). Content integration and source

separation in learning from multiple texts. In S. R. Goldman, A. C. Graesser, & P. van

den Broek (Eds.), Narrative, comprehension, causality, and coherence: Essays in honor

of Tom Trabasco (pp. 209-233). Mahwah, NJ: Erlbaum.

Britt, M. & Gabrys, G. (2002). Implications of document-level literacy skills for Web site

design. Behavior Research Methods, Instruments & Computers, 34, 170-176.



Brown, J. (2000). Growing up digital: how the Web changes work, education, and the ways

people learn. Change, March/April, 10–20.

Brown, J., Collins, A., & Duguid, P. (1989). Situated Cognition and the Culture of Learning.

Education Researcher, 18(1), 32-42.

Browne, M. N., Freeman, K. E., & Williamson, C. L. (2000). The importance of critical

thinking for student use of the Internet. College Student Journal, 34(3), 391–398.

Carr, M. & Claxton, M. (2002). Tracking the development of learning dispositions. Assessment

in Education, 9(1), 9-37.

Cervetti, G., Pardales, M., & Damico, J. (2001, April). A tale of differences: Comparing

the traditions, perspectives and educational goals of critical reading and critical literacy.

Reading Online, 4(9). Available: http://readingonline.org/articles/art_index.asp?HREF/articles/cervetti/index.html

Cline, R. & Haynes, K. (2001) Consumer health information seeking on the Internet: the state of

the art. Health Education Research, 16, 671–692.

Coiro, J. (2003). Rethinking comprehension strategies to better prepare students for

critically evaluating content on the Internet. [Electronic Version]. New England Reading

Association Journal.

Coiro, J. (2007). Exploring changes to reading comprehension on the Internet:

Paradoxes and possibilities for diverse adolescent readers. Unpublished doctoral

dissertation, University of Connecticut.

Coiro, J. (2008). Exploring the relationship between online reading



comprehension, frequency of Internet use, and adolescents’ dispositions toward reading

online. Paper presented at the annual meeting of the National Reading Conference,

Orlando, FL.

Coiro, J., Knobel, M., Lankshear, C. & Leu, D. (Eds.) (2008). Handbook of research on

new literacies. Mahwah, NJ: Lawrence Erlbaum Associates.

Collins, A., & Brown, J. (1988). The computer as a tool for learning through reflection. In H.

Mandl & A. Lesgold (Eds.), Learning issues for intelligent tutoring systems (pp. 1-18).

New York: Springer-Verlag.

Collins, A., Brown, J., & Newman, S. (1989). Cognitive apprenticeship: Teaching the craft of

reading, writing and mathematics. In L. B. Resnick (Ed.), Knowing, learning, and

instruction: Essays in honor of Robert Glaser. Hillsdale, NJ: Erlbaum.

Comrey, A. L., & Lee, H. B. (1992). A first course in factor analysis (2nd ed.). Hillsdale, NJ:

Erlbaum.

Dahlberg, L. (2001). Computer-mediated communication and the public sphere: A critical

analysis. Journal of Computer-Mediated Communication, 7(1). Retrieved May 1, 2009

from http://jcmc.indiana.edu/vol7/issue1/dahlbergold.html

Damico, J., Baildon, M., & Campano, G. (2005). Integrating literacy, technology and disciplined

inquiry in social studies: The development and application of a conceptual model.

Technology, Humanities, Education and Narrative. Retrieved March 15, 2010, from

http://thenjournal.org/feature/92/

Damico, J., & Baildon, M. (2007). Examining ways readers engage with websites during think-

aloud sessions. Journal of Adolescent & Adult Literacy, 51(3), 254–263.



Davis, J. (1993). Media literacy: From activism to exploration. Media literacy: A report of the

national leadership conference on media literacy (Appendix A) (ERIC Document

Reproduction Service No. ED 365 294).

Denzin, N. K. (1978). The research act: A theoretical introduction to sociological methods.

New York: McGraw-Hill.

Doneman, M. (1997). Multimediating. In C. Lankshear, C. Bigum, & C. Durant (Eds.),

Digital Rhetorics: Literacies and technologies in education–current practices and future

directions, Vol. 3 (pp. 131–148). Brisbane, Australia: QUT/DEETYA.

Eagleton, M. B., Guinee, K. & Langlais, K. (2003). Teaching Internet literacy strategies:

The hero inquiry project. Voices from the Middle, 10(3), 28–35.

Eagleton, M., & Dobler, E. (2007). Reading the Web: Strategies for Internet inquiry.

New York: Guilford.

Ericsson, K. A., & Simon, H. A. (1993). Protocol analysis: Verbal reports as data.

Cambridge, MA: MIT Press.

Ericsson, K. A. (2002). Toward a procedure for eliciting verbal expression of nonverbal

experience without reactivity: Interpreting the verbal overshadowing effect within the

theoretical framework for protocol analysis. Applied Cognitive Psychology, 16, 981-987.

Estabrook, L., Witt, E., & Rainie, L. (2007). Information searches that solve problems.

Washington, DC: Pew Internet & American Life Project. Retrieved December 20, 2007,

from: http://www.pewinternet.org/pdfs/Pew_UI_LibrariesReport.pdf

Fabritius, H. (1998, September 9). Information seeking in the newsroom: Application of the

cognitive framework for analysis of work content. Retrieved March 1, 2010, from:

http://www.shef.ac.uk/~is/publications/infres/fabritiu.html

Faul, F., Erdfelder, E., Buchner, A., & Lang, A. (2009). Statistical power analyses

using G*Power 3.1: Tests for correlation and regression analyses. Behavior Research

Methods, 41, 1149-1160.  

Flanagin, A. J., & Metzger, M. J. (2000). Perceptions of Internet information credibility.

Journalism and Mass Communication Quarterly, 77, 515–540.

Fleiss, J. L., & Cohen, J. (1973). The equivalence of weighted kappa and the intraclass

correlation coefficient as measures of reliability. Educational and Psychological

Measurement, 33, 613-619.

Fogg, B. J., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., et al. (2001). What

makes Web sites credible? A report on a large quantitative study. Paper presented at the

Computer-Human Interaction Conference, Seattle, WA.

Fogg, B. J., Soohoo, C., Danielson, D. R., Marable, L., Stanford, J., & Tauber, E. R.

(2003). How do users evaluate the credibility of Web sites? A study with over 2,500

participants. Proceedings of the 2003 Conference on Designing for User Experiences

(DUX’03). Retrieved April 15, 2008, from

http://portal.acm.org/citation.cfm?id=997097&coll=ACM&dl=ACM&CFID=36236037&CFTOKEN=18606069

Fox, S. (2006). Online health search 2006. Washington, DC: Pew Internet & American Life

Project.

Fox, S., Anderson, J. Q., & Rainie, L. (2005, January). The Future of the Internet. Washington,

DC: Pew Charitable Trusts. Retrieved September 29, 2010, from

http://www.pewinternet.org/pdfs/PIP_Future_of_Internet.pdf

Freebody, P. & Luke, A. (1990). Literacies programs: Debates and demands in cultural

context. Prospect: Australian Journal of TESOL 5(7), 7-16.

Gable, R. K., & Wolf, M. B. (1993). Instrument development in the affective domain: Measuring

attitudes and values in corporate and school settings. (2nd ed.). Boston, MA: Kluwer

Academic Publishers.

Gee, J. (2003). What video games have to teach us about learning and literacy.

New York: Palgrave Macmillan.

Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for

qualitative research. Chicago: Aldine.

Graesser, A.C., Wiley, J., Goldman, S.R., O’Reilly, T., Jeon, M., & McDaniel, B. (2007).

SEEK Web Tutor: Fostering a critical stance while exploring the causes of volcanic

eruption. Metacognition and Learning, 2(2-3), 89-105. doi:10.1007/s11409-007-9013-x

Greenhow, C., Robelia, B., & Hughes, J. (2009). Web 2.0 and classroom research: What path

should we take now? Educational Researcher, 38(4), 246–259.

Gross, E. F. (2004). Adolescent Internet use: What we expect, what teens report. Journal of

Applied Developmental Psychology, 25, 633– 649.

Guthrie, J. T., & Wigfield, A. (Eds.). (1997). Reading engagement: Motivating readers through

integrated instruction. Newark, DE: International Reading Association.

Halliday, M. A. K. (1994). An introduction to functional grammar (2nd ed.). London:

Edward Arnold.

Hambleton, R. K., & Swaminathan, H. (1985). Item response theory: Principles and applications.

Boston, MA: Kluwer-Nijhoff.

Hannafin, M., & Land, S. (1997). The foundations and assumptions of technology-

enhanced student-centered learning environments. Instructional Science, 25, 167-202.



Hannafin, M., & Land, S. (2000). Technology and student-centered learning in higher

education. Journal of Computing in Higher Education, 12(1), 3-30.

Harris, R. (1997). Evaluating Internet research sources. Retrieved September 12, 2010,

from www.virtualsalt.com/evalu8it.htm

Harris, F. (2007). Challenges to Teaching Credibility Assessment in Contemporary Schooling. In

M. J. Metzger & A. J. Flanagin (Eds.), Digital Media, Youth, and Credibility. Cambridge,

MA: The MIT Press.

Henry, L. (2006). SEARCHing for an answer: The critical role of new literacies while

reading on the Internet. The Reading Teacher, 59, 614–627.

Horrigan, J. (2010). Broadband Adoption and Use in America. Washington, D. C.: Federal

Communication Commission. Retrieved February 25, 2010 from

http://hraunfoss.fcc.gov/edocs_public/attachmatch/DOC-296442A1.pdf

Jewitt, C. (2008). Multimodality and Literacy in School Classrooms. Review of Research

in Education, 32(1), 241-267.

Jick, T. (1979). Mixing qualitative and quantitative methods: Triangulation in action.

Administrative Science Quarterly, 24, 602-611.

Johnson, T., & Kaye, B. (1998). Cruising is believing? Comparing Internet and

traditional sources on media credibility measures. Journalism & Mass Communication

Quarterly, 75, 325–340.

Johnson, R.B. & Onwuegbuzie, A.J. (2004). Mixed methods research: A research

paradigm whose time has come. Educational Researcher, 33, 14-26.

Jones, S. & Madden, M. (2002). The Internet goes to college: How students are living in

the future with today's technology. Washington DC: Pew Internet & American Life

Project. Retrieved January 20, 2007 from

http://www.pewinternet.org/pdfs/PIP_College_Report.pdf.

Judd, V., Farrow, L., & Tims, B. (2006). Evaluating public web site information: a

process and an instrument. Reference Services Review, 34(1), 12-32.

Kaiser Family Foundation. (2005). Generation M: Media in the lives of 8-18 year olds.

Retrieved March 9, 2005, from http://www.kff.org/entmedia/entmedia030905pkg.cfm

Kellner, D. & Share, J. (2005). Toward critical media literacy: Core concepts, debates,

organizations, and policy. Discourse: Studies in the Cultural Politics of Education, 26(3),

369-386.

Kiili, C., Laurinen, L., & Marttunen, M. (2008). Students evaluating Internet sources:

From versatile evaluators to uncritical readers. Journal of Educational Computing

Research, 39(1), 75-95.

Kimber, K. & Wyatt-Smith, C. (2006). Using and creating knowledge with new

technologies: a case for students-as-designers. Learning, Media and Technology, 31(1),

19-34.

Korp, P. (2006). Health on the Internet: implications for health promotion. Health Education

Research, 21(1), 78–86.

Kress, G. (2003). Literacy in the new media age. London: Routledge.

Kress, G., & van Leeuwen, T. (2001). Multimodal discourse: The modes and media of

contemporary communication. London: Arnold.

Krueger, R. & Casey, M. (2008). Focus Groups: A practical guide for applied research.

Thousand Oaks, CA: Sage.



Labbo, L. & Reinking, D. (1999). Negotiating the multiple realities of technology in

literacy research and instruction. Reading Research Quarterly, 34, 478-492.

Lankshear, C. & Knobel, M. (1998). Critical literacies and new technologies. Paper

presented at the American Educational Research Association convention in San

Diego, CA.

Lankshear, C., & Knobel, M. (2003). New literacies: changing knowledge and classroom

learning. Buckingham, UK: Open University Press.

Lemke, J. (2002). Travels in hypermodality. Visual Communication, 1(3), 299-325.

Lenhart, A., Simon, M., & Graziano, M. (2001). The Internet and education: Findings of the Pew

Internet & American Life Project. Washington, DC: Pew Internet & American Life

Project.

Lenhart, A., Madden, M. & Hitlin, P. (2005). Teens and technology: Youth are leading

the transition to a fully wired and mobile nation. Washington DC: Pew Internet &

American Life Project.

Leu, D. J. (2000). Literacy and technology: Deictic consequences for literacy education

in an information age. In M.L. Kamil, P. Mosenthal, R. Barr, & P.D. Pearson (Eds.),

Handbook of reading research: Volume III (pp.743-770). Mahwah, NJ: Erlbaum.

Leu, D. J., Kinzer, C. K., Coiro, J., & Cammack, D. (2004). Toward a theory of new

literacies emerging from the Internet and other information and communication

technologies. In R.B. Ruddell & N. Unrau (Eds.), Theoretical Models and Processes of

Reading (5th ed., pp. 1568-1611). Newark, DE: International Reading Association.

Retrieved October 15, 2008 from

http://www.readingonline.org/newliteracies/lit_index.asp?HREF=/newliteracies/leu

Leu, D. J., Zawilinski, L., Castek, J., Banerjee, M., Housand, B., Liu, Y., et al. (2007).

What is new about the new literacies of online reading comprehension? In L. Rush, J.

Eakle, & A. Berger, (Eds.). Secondary school literacy: What research reveals for

classroom practices (pp. 37-68). Urbana, IL: National Council of Teachers of English.

Leu, D., Coiro, J., Castek, J., Hartman, D., Henry, L., & Reinking, D. (2008). Research

on instruction and assessment in the new literacies of online reading comprehension. In

C. C. Block & S. Parris (Eds.), Comprehension instruction: Research-based

best practices (pp. 321-346). New York: Guilford Press.

Leu, D. J., Reinking, D., Hutchinson, A., McVerry, J. G., Robbins, K., Rogers, A.,

Malloy, J., O’Byrne, W. I., Zawilinski, L. (2008). The TICA Project: Teaching the New

Literacies of Online Reading Comprehension to Adolescents. An alternative symposium

presented at the National Reading Conference, Orlando, FL.

Leu, D. J., O’Byrne, W. I., Zawilinski, L. McVerry, J. G., & Everett-Cacopardo, H.

(2009). Expanding the New Literacies Conversation. Educational Researcher, 38(4),

264-269.

Liaw, S. S. (2002). Understanding user perceptions of world-wide web environments. Journal of

Computer Assisted Learning, 18(2), 137-148.

Lin, C., Wu, S., & Tsai, R. (2005). Integrating perceived playfulness into expectation-

confirmation model for web portal context. Information & Management, 42, 683-693.

Livingstone, S. (2004). Media literacy and the challenge of new information and

communication technologies. The Communication Review, 7(1), 3-14.

Livingstone, S., & Bober, M. (2005). UK children go online: Final report of key

project findings. Project Report. London: London School of Economics



and Political Science.

Lubans, J. (1998, April). How first-year university students use and regard Internet

resources. Available from Duke University Libraries Web site:

www.lib.duke.edu/staff/orgnztn/lubans/docs/1styear/firstyear.htm (accessed May 2000).

Lubans, J. (1999, May). Students and the Internet. Available from Duke University

Libraries Web site: www.lib.duke.edu/staff/orgnztn/lubans/docs/study3.htm (accessed

May 2000).

Luke, A. (2000). Critical literacy in Australia: A matter of context and standpoint.

Journal of Adolescent & Adult Literacy, 43, 448-461.

Mayring, P. (2000). Qualitative content analysis. Forum Qualitative Social

Research/Forum Qualitative Sozialforschung, 1(2). Retrieved September 24, 2008, from

http://www.qualitative-research.net/fqs-texte/2-00/2-00mayring-e.htm

McCaughey, M. & Ayers, M. (2003). Cyberactivism: Online activism in theory and

practice. New York: Routledge.

McInnis, R. (2001). Introduction: Defining Discourse Synthesis. In R. G. McInnis (Ed.),

Discourse Synthesis: Studies in Historical and Contemporary Social Epistemology.

Westport: Praeger Publishers.

McKenzie, J. F., Wood, M. L., Kotecki, J. E., Clark, J. K., & Brey, R. A. (1999).

Establishing content validity: Using qualitative and quantitative steps. American Journal

of Health Behavior, 23, 311-318.

Meola, M. (2004). Chucking the checklist: A contextual approach to teaching undergraduates

Web-site evaluation. portal: Libraries and the Academy, 4, 331–344.



Merriam, S. B. (2002). Introduction to qualitative research. In S. B. Merriam & Associates,

Qualitative research in practice (pp. 3-17). San Francisco: Jossey-Bass.

Metzger, M. (2007). Making sense of credibility on the web: Models for evaluating

online information and recommendations for future research. Journal of the

American Society for Information Science and Technology, 58(13), 2078–2091.

Morrell, E. (2002). Toward a critical pedagogy of popular culture: Literacy development

among urban youth. Journal of Adolescent & Adult Literacy, 46, 72-77.

The New London Group. (1996). A pedagogy of multiliteracies: Designing social futures.

Harvard Educational Review, 66, 60–92.

Nunnally, J. (1978). Psychometric theory. New York: McGraw-Hill.

Oakley, B., Felder, R., Brent, R. & Elhajj, I. (2004). Turning Student Groups into Effective

Teams. Journal of Student Centered Learning, 2(1), 9–34. Retrieved on March 1, 2010

from http://www.ncsu.edu/felder-public/Papers/Oakley-paper(JSCL).pdf

O’Brien, D. (2006). “Struggling” adolescents’ engagement in multimediating: Countering

the institutional construction of incompetence. In D. E. Alvermann, K. A. Hinchman, D.

W. Moore, S. F. Phelps, & D. R. Waff (Eds.), Reconceptualizing the literacies in

adolescents’ lives (pp. 29-46). Mahwah, NJ: Erlbaum.

O’Byrne, W. I. (2009). Facilitating Critical Thinking Skills through Content Creation. In

L. Zawilinski (Chair), “Because the Internet Said So”: Critical Evaluation and

Communication During Online Reading Comprehension. A paper presented at the 58th

Annual National Reading Conference, Albuquerque, NM.

O'Byrne, W. I., & McVerry, J. G. (2009). Measuring the Dispositions of Online Reading

Comprehension: A Preliminary Validation Study. In K. Leander, D. Rowe, D. Dickson,



M. Hundley, R. Jiménez, & V. Risko (Eds.), The 57th National Reading Conference

Yearbook. Oak Creek, WI: National Reading Conference.

Oblinger, D. (2004). The next generation of educational engagement. Journal of Interactive

Media in Education, 8. Retrieved June 27, 2009, from

http://www.jime.open.ac.uk/2004/8/oblinger-2004-8-disc-t.html

Orsmond, P., Merry, S., & Reiling, K. (2000). The use of student derived marking criteria

in peer and self-assessment. Assessment & Evaluation in Higher Education, 25(1), 23–38.

Orsmond, P., Merry, S., & Reiling, K. (2002). The use of exemplars and formative

feedback when using student derived marking criteria in peer and self-assessment.

Assessment & Evaluation in Higher Education, 27(4), 309–323.

Patton, M. (2002). Qualitative Research and Evaluation Methods. Thousand Oaks, CA:

Sage.

Pett, M. A., Lackey, N. R., & Sullivan, J. J. (2003). Making sense of factor analysis: The use of

factor analysis for instrument development in health care research. Thousand Oaks, CA:

Sage.

Pew Research Center. (2010). Millennials: A Portrait of Generation Next. Retrieved March 1,

2010 from http://pewresearch.org/millennials/

Pickerill, J. (2003). Cyberprotest: Environmental activism online. Manchester: Manchester

University Press.

RAND Reading Study Group. (2002). Reading for understanding: Toward an R&D program in

reading comprehension. Santa Monica, CA: Author.

Rideout, V., Foehr, U., & Roberts, D. (2010). Generation M2: Media in the Lives of 8- to 18-

Year-Olds. Retrieved February 1, 2010, from

http://www.kff.org/entmedia/mh012010pkg.cfm

Rieh, S. (2002). Judgment of information quality and cognitive authority in the Web. Journal of

the American Society for Information Science and Technology, 53(2), 145-161.

Rieh, S., & Belkin, N. (1998). Understanding judgment of information quality and

cognitive authority in the WWW. Proceedings of the ASIS Annual Meeting, 35, 279–

289.

Rose, D. H., & Meyer, A. (2002). Teaching every student in the digital age: Universal design for

learning. Alexandria, VA: ASCD. Retrieved December 25, 2009, from

http://www.cast.org/teachingeverystudent/ideas/tes/

Rubio, D. M., Berg-Weger, M., Tebb, S. S., Lee, E. S., & Rauch, S. (2003). Objectifying

content validity: Conducting a content validity study in social work research. Social Work

Research, 27, 94-104.

Salomon, G. (1997). Of mind and media. Phi Delta Kappan, 375-380.

Scardamalia, M., & Bereiter, C. (1983). Child as co-investigator: Helping children gain insight

into their own mental processes. In S. Paris, G. Olson, & H. Stevenson (Eds.), Learning

and motivation in the classroom (pp. 83-107). Hillsdale, NJ: Lawrence Erlbaum

Associates.

Scardamalia, M., & Bereiter, C. (1985). Fostering the development of self-regulation in

children’s knowledge processing. In S. F. Chipman, J. W. Segal, & R. Glaser (Eds.),

Thinking and learning skills: Research and open questions (pp. 563-577). Hillsdale, NJ:

Lawrence Erlbaum Associates.

Scardamalia, M., Bereiter, C., & Steinbach, R. (1984). Teachability of reflective processes in

written composition. Cognitive Science, 8, 173-190.



Shackleford, J., Thompson, D., & James, M. (1999). Teaching strategy and assignment

design: assessing the quality and validity of information via the Web. Social Science

Computer Review, 17(2), 196–208.

Shadish, W., Cook, T. & Campbell, D. (2002). Experimental and quasi-experimental

designs for generalized causal inference. New York: Houghton Mifflin.

Spires, H., Lee, J., Turner, K., & Johnson, J. (2008). Having our say: Middle grade student

perspectives on school, technologies, and academic engagement. Journal of Research on

Technology in Education, 40(4), 497-515.

Street, B. (1984). Literacy in theory and practice. Cambridge: Cambridge University

Press.

Street, B. (2003). What’s “new” in New Literacy Studies? Critical approaches to

literacy in theory and practice. Current Issues in Comparative Education, 5(2), 1–15.

Strømsø, H.I., & Bråten, I. (2010). The role of personal epistemology in the self-regulation of

Internet-based learning. Metacognition and Learning, 5, 91-111.

Tapscott, D. (1999). Educating the Net generation. Educational Leadership, 56(5), 6–11.

Tate, M., & Alexander, J. (1996). Teaching critical evaluation skills for World Wide Web

resources. Computers in Libraries, 16(10), 49-55.

Taylor, R. S. (1986). Value-added processes in information systems. Norwood, NJ: Ablex

Publishing.

Thomas, D. (2006). A general inductive approach for analyzing qualitative evaluation data.

American Journal of Evaluation, 27(2), 237-246.

Van Breukelen, G. J. (2006). ANCOVA versus change from baseline: More power in

randomized studies, more bias in nonrandomized studies. Journal of Clinical



Epidemiology, 59, 920–925.

Wittel, A. (2000). Ethnography on the move: From field to net to Internet. Forum:

Qualitative Social Research, 1(1). Retrieved on Dec. 24, 2006, from

http://www.qualitative-research.net/fqs-texte/1-00/1-00wittel-e.htm

Wysocki, A. (2001). Impossibly distinct: On form/content and word/image in

two pieces of computer-based interactive multimedia. Computers and Composition, 18,

209–234.

Zimmerman, B. J., & Bandura, A. (1994). Impact of self-regulatory influences on writing course

attainment. American Educational Research Journal, 31, 845-862.



Table 1 Criteria for identifying group performance

Students work independently and collaboratively


Delegate responsibility and know who is responsible for what
Establish expectations for work-product and work ethic
Resolve disagreements amicably
Keep personal issues from interfering with group business

Appendix A Critical Evaluation Items

The full instrument is available online at:


https://sites.google.com/site/criticalevaluationinstrument/

1. Which author is the most knowledgeable about volcanoes?


2. Which website has words and ideas that I can understand?
3. Which author used the most reliable details to support his or her argument?
4. Where do you look on a website to find out when it was written?
5. What is the reason this website was published?
6. Which link in the search results should you click on to answer the question?
7. What type of website is this?
8. Which website has the most up-to-date information?
9. Which website has information to prove the claim?
10. Which website uses information from the most reliable source?
11. Where do you click to learn more about an author?
12. Which of these websites about Pluto was created for students your age?
13. What is the author’s main argument?
14. Which website would be best to answer the question?
15. Who is the main audience of this website?
16. Which website has pictures and video to help answer the question?
17. Which website has the best information to disprove the claim?
18. Which link should you click to answer the question?
19. What section of the website should you read to learn about the parts of blood?
20. Which website uses information from the least reliable source?

Appendix B DORC Items

1. I trust what I read on the Internet.


2. Authors tell the truth when writing on the Internet.
3. I don’t like doing projects with other people when using the Internet.
4. I like to help others learn how to use the Internet.
5. It is important to keep your goal in mind when reading online.
6. I think about my opinion of a subject when reading websites.
7. I quit when I can’t find information online.
8. When I make a mistake when using the Internet I keep trying until I get it right.
9. I am ready to learn new things on the Internet even when they are hard.
10. When using the Internet I understand what I read by combining information from
different websites.
11. When searching online I often have to change the strategies I have used in the past.
12. Solving problems using the Internet often takes strategies I learned somewhere else.
13. Websites are full of opinions.
14. I like working with friends to make and post stuff online.
15. I enjoy working with classmates when using the Internet.
16. I can work with a partner to solve problems online.
17. When one strategy does not work to find information on the Internet I pick another
and keep trying.
18. I look for quicker and better ways to read online.
19. When searching online gets tough, I am willing to spend extra time.
20. When reading the Internet you have to look at information by moving between
different viewpoints.
21. When I choose a website to read I think back to what I already know.
22. When looking for information on websites I know there is one truthful answer.
23. When I work on the Internet with others they often teach me strategies or tricks.
24. Using the Internet requires me to make quick changes in how I read.
25. I think about the words I choose when I write an email or comment.
26. I believe the information I find on websites.
27. I try hard when using the Internet to learn new things.
28. You can trust the pictures on websites.
29. I think about how I am reading when I visit websites.
30. I never think about what I am doing as I use the Internet.

DORC: Open Response Items

In a few sentences tell us how you feel about using the Internet to learn in school.
Please answer the following question, “What is the best part of using the Internet in school?”
In a few sentences let us know the best way to learn something new on the Internet.
Please answer the following question, "What is the hardest thing about using the Internet in
school?"
In a few sentences describe the type of person you think is really good at reading websites.
