
Close Reading and Comprehension

Beth Duellman

Saint Mary's University of Minnesota

Schools of Graduate and Professional Programs

Portfolio Entry for Wisconsin Teacher Standards 7 and 8

EDUW 693 Instructional Design and Assessment

Ryan Ourada, M.S. Ed., Instructor

March 4, 2017
Entry Introduction

This entry documents completion of a comprehensive learning process to improve skills

related to instructional design (WTS 7) and instructional assessment (WTS 8). The seven steps

forming that brain-based learning process were adapted to organize sections in this entry:

A Brain-based Learning Process for All Ages | Applied to Improving Educator Effectiveness as a Seven-step Professional Learning Process

1. Expand perspectives (Learning assumes moving beyond what we already know/do.) | 1. Expand perspectives. (Standards for educators and education serve as a common starting point, representing collected wisdom of the profession. Vanguard ideas offer another option for proposed solutions for improving educator effectiveness.)

2. Assess abilities from evidence (Define: What to learn?) | 2. Assess current professional knowledge/skills/attitudes developmentally. (Determine areas most in need of improvement compared to standards for educators and student learning. Assess three types of evidence: (a) teaching practices based on educator standards, (b) whole-class and lowest-median-highest student performance based on academic standards that guide subject learning, and literacy standards that guide tasks to prove learning, and (c) student participation and learning environment evidence such as observations, ongoing student feedback, and anonymous student surveys. Reason inductively from assessment conclusions to define an inquiry question that addresses areas most in need of improvement.)

3. Learn | 3. Research/Learn from professional/credible sources for practical answers/insights to improve targeted areas.

4. Plan (connect learning) | 4. Incorporate learning into a plan. (Teachers in the Master of Education Program improve lesson plans.)

5. Try (and gather evidence) | 5. Implement plan and gather a-b-c evidence for assessment.

6. Post-assess from evidence (Define: What learned? What remains to learn?) | 6. Post-assess from evidence. Valid, developmental assessment requires evidence that provides direct comparisons of a-b-c evidence from multiple perspectives.

7. Reflect (Process entire learning experience for efficient and effective recall in future) | 7. Reflect. Process the entire learning experience from the personal perspective to strengthen brain connections for more efficient and effective learning next time: What actions/attitudes worked best/least to learn efficiently and effectively? What are my next learning steps in this area?
Learning Step 1: Start with Standards to Expand Perspectives

Learning Step 2: Assess Current Evidence Compared to Standards to Find Areas to Improve

Educator Standards
Targeted Descriptors from Wisconsin Standards for Teacher Development and Licensure

The descriptors listed for each Wisconsin Teacher Standard (WTS) on this page originated
from the Wisconsin Department of Public Instruction website.
Areas emphasized during EDUW 693 are marked with a distinguishing symbol.
Underlined text indicates two areas in each standard that emerged as most in need of
improvement after studying the descriptors and self-assessing current teacher performance.

Wisconsin Teacher Standard (WTS) 7: Teachers are able to plan different kinds of lessons.
The teacher organizes and plans systematic instruction based upon knowledge of subject
matter, students, the community, and curriculum goals.
Knowledge
The teacher understands learning theory, subject matter, curriculum development, and student
development and knows how to use this knowledge in planning instruction to meet curriculum
goals.
The teacher knows how to take contextual considerations (instructional materials, individual
student interests, needs and aptitudes, and community resources) into account in planning
instruction that creates an effective bridge between curriculum goals and students' experiences.
The teacher knows when and how to adjust plans based on student responses and other
contingencies.
Dispositions
The teacher values both long-term and short-term planning.
The teacher believes that plans must always be open to adjustment and revision based on student
needs and changing circumstances.
The teacher values planning as a collegial activity.
Performances
As an individual and a member of a team, the teacher selects and creates learning experiences that
are appropriate for curriculum goals, relevant to learners, and based upon principles of effective
instruction (e.g., that activate students' prior knowledge, anticipate preconceptions, encourage
exploration and problem-solving, and build new skills on those previously acquired).
The teacher plans for learning opportunities that recognize and address variation in learning
styles, learning differences, and performance modes.
The teacher creates lessons and activities that operate at multiple levels to meet the
developmental and individual needs of diverse learners and help each progress.
The teacher creates short-range and long-term plans that are linked to student needs and
performance, and adapts the plans to ensure and capitalize on student progress and motivation.
The teacher responds to unanticipated sources of input, evaluates plans in relation to short- and
long-range goals, and systematically adjusts plans to meet student needs and enhance learning.

Wisconsin Teacher Standard (WTS) 8: Teachers know how to test for student progress.
The teacher understands and uses formal and informal assessment strategies to evaluate and
ensure the continuous intellectual, social, and physical development of the learner.
Knowledge
The teacher understands the characteristics, uses, advantages, and limitations of different types of
assessments (e.g. criterion-referenced and norm-referenced instruments, traditional standardized
and performance-based tests, observation systems, and assessments of student work) for evaluating
how students learn, what they know and are able to do, and what kinds of experiences will support
their further growth and development.
The teacher knows how to select, construct, and use assessment strategies and instruments
appropriate to the learning outcomes being evaluated and to other diagnostic purposes.
The teacher understands measurement theory and assessment-related issues, such as validity,
reliability, bias, and scoring concerns.
Dispositions
The teacher values ongoing assessments as essential to the instructional process and recognizes
that many different assessment strategies, accurately and systematically used, are necessary for
monitoring and promoting student learning.
The teacher is committed to using assessment to identify student strengths and promote student
growth rather than to deny students access to learning opportunities.
Performances
The teacher appropriately uses a variety of formal and informal assessment techniques (e.g.
observation, portfolios of student work, teacher-made tests, performance tasks, projects, student
self-assessments, peer assessment, and standardized tests) to enhance her or his knowledge of
learners, evaluate students' progress and performances, and modify teaching and learning strategies.
The teacher solicits and uses information about students' experiences, learning behavior, needs, and
progress from parents, other colleagues, and the students themselves.
The teacher uses assessment strategies to involve learners in self-assessment activities, to help
them become aware of their strengths and needs, and to encourage them to set personal goals for
learning.
The teacher evaluates the effect of class activities on both individuals and the class as a whole,
collecting information through observation of classroom interactions, questioning, and analysis of
student work.
The teacher monitors his or her own teaching strategies and behavior in relation to student
success, modifying plans and instructional approaches accordingly.
The teacher maintains useful records of student work and performance and can communicate
student progress knowledgeably and responsibly, based on appropriate indicators, to students,
parents, and other colleagues.

Wisconsin Educator Effectiveness Expectations

Source: Danielson, C. (2007). Enhancing Professional Practice: A Framework for Teaching.

See Artifact A, Tables 2, 3, 4, 5, and 7 for a self-assessment of current skills associated with instructional design and assessment. Tables 1, 2, 3, 4, and 5 address five areas related to assessing instructional design: (1) appropriate starting points based on current student performance compared to developmental standards, (2) appropriate outcomes, (3) optimal learning processes, (4) engaged learning, and (5) assessment design.

Self-assessment of Instruction Related to WTS and Targeted Student Learning Objective(s)


For Wisconsin Teacher Standards (WTS) 7 and 8, I wanted to focus on strengthening my reading instruction around teaching main ideas and details. I teach 20 fourth grade students with a wide variety of needs and abilities. In my class, there are three English Language (EL) learners, including one Spanish-speaking student and two Hmong-speaking students. The class also includes four special education students with identified needs in the areas of learning disabilities (LD) in reading and math, autism, and emotional and behavioral disorders (EBD). Of my students, four have Individualized Education Plans (IEPs) with accommodations of extra time on tests, pull-out and push-in support, check-in/check-out sheets, and tests read aloud. I will work toward my targeted student learning objectives of identifying main ideas and details by improving how I instruct reading comprehension.

I chose six WTS 7 and 8 descriptors to guide my learning process. I started my focus on the WTS 7 performance descriptor that the teacher "selects and creates learning experiences that are appropriate for curriculum goals, relevant to learners, and based upon principles of effective instruction" to make reading comprehension meaningful for the students. In this research I connected with Beers and Probst (2013) when they said that "we want kids who dive into a text and can't begin to think of coming up for air until they know what happens" (p. 3). Choosing authentic, interesting literature is important to create the conditions for students to become lost in a book. Additionally, the WTS 8 performance descriptor that the teacher "evaluates the effect of class activities on both individuals and the class as a whole, collecting information through observation of classroom interactions, questioning, and analysis of student work" influences those choices. The school where I teach has been using a curriculum with texts that the students have not been connecting with. It is clear from our flat reading scores on the Wisconsin State Assessment System that a shift in our instruction needs to take place. As a school, we are working to incorporate higher quality literature to increase student engagement, with the expectation that increased engagement will produce increased student performance.

The disposition descriptor from WTS 7 that the teacher "believes that plans must always be open to adjustment and revision based on student needs and changing circumstances" supports my current focus on modifying my reading instruction. Based on analysis of my school's Forward Exam results, our reading scores for main ideas and details need improvement. In recent years, we have spent most of our time studying vocabulary and writing instruction. We are shifting our focus to reading instruction and student engagement this year. Working with my colleagues, I am strengthening my understanding of close reading and other reading instructional strategies. The WTS 8 disposition descriptor that the teacher "is committed to using assessment to identify student strengths and promote student growth rather than to deny access to learning opportunities" is evident in this change of focus in reading instruction. I want my students to succeed, and I use assessments to make sure the instruction they are receiving is effective. I currently use informal and formal assessments with my students, including the STAR assessment, district common reading assessments, classroom assessments, performance assessments, checklists, running records, and anecdotal records. All of these assessments help me determine the areas for improvement in my instructional strategies.

Implementing my plan for improved reading comprehension instruction will support my growth in the WTS 7 knowledge descriptor that the teacher "knows how to take contextual considerations (instructional materials, individual student interests, needs and aptitudes, and community resources) into account in planning instruction that creates an effective bridge between curriculum goals and students' experiences" as I engage students in reading comprehension strategies. Understanding my students and using that information to select engaging texts will help students improve their understanding of main ideas and details. With the increased engagement, students will read with a greater purpose and make more connections to the texts. These skills will improve the overall comprehension of the selected texts. The WTS 8 knowledge descriptor, which states that a teacher "knows how to select, construct, and use assessment strategies and instruments appropriate to the learning outcomes being evaluated and to other diagnostic purposes," will guide how I track the effectiveness of the improved instructional strategies implemented in reading instruction. I will use the assessments to target the growth of students in reading comprehension, especially their understanding of main ideas and details.

Student Performance Standards

The ultimate goal of improving instructional design and assessment is to achieve each student's developmental capabilities through confident and independently competent learning. My seven-step process will aim to improve instructional design for a reading lesson about close reading.

Artifact A, Tables 1a and 1b, show current assessment results for the targeted subject of instruction as of this writing. After Learning Step 6, the post-assessment results will also be included in each table for ease of direct comparisons. Standards addressing subject performance include the Wisconsin Common Core State Standards for English Language Arts. The Wisconsin Common Core State Standards for Literacy in All Subjects guided student tasks that provide evidence of learning in each subject. See Artifact A, Tables 1a and 1b for the specific targeted standards for this learning process.

Assessment of Student Performance Related to Targeted Student Learning Objective(s)

Currently, I teach reading, writing, math, social studies, and science. According to the STAR assessment, my students range from a second-grade reading level to a tenth-grade reading level. My STAR reading class average was below the district, state, and national averages. On the 2015-16 Forward Exam, 36.8% of students were proficient in reading and 0% of students were advanced in reading. Overall, there was a need to improve reading scores for this group of students.

Digging a little deeper into student reading performance, I found that students struggled on questions about main ideas and details. Students give basic constructed responses because they lack the skills to think about the text critically. Overall, students did not demonstrate a higher level of thinking. My current group of students needs more guidance as they tackle grade-level texts. With teacher prompting and guiding questions, students can understand the text but still lack the skills to analyze and evaluate it. Students follow a curriculum that teaches comprehension, but it often uses texts that the students do not enjoy or connect with. Missing from our curriculum is the opportunity to practice close reading with the texts. Students need repeated readings with short texts that they can use to construct meaning. Students were not making the connections they needed when comprehending the text. Utilizing close reading strategies and good literature will provide the modification in reading instruction needed to improve students' comprehension of main ideas and details.

Student Participation (Learning Environment) Expectations

See Artifact A, Tables 6, 7, and 8. Table 6 shows measurable teacher observations that form

ongoing data for measuring instructional effectiveness. Table 7 uses Danielson Framework

assessments that included learning environment aspects. Table 8 draws from WTS 8 expectations

that relate to self-assessment practices among students.

Assessment of Learning Environment While Learning Targeted Objective(s)

At Sam Davey, we use a balanced literacy approach to teaching reading. Good Habits, Great Readers is the reading curriculum and Lucy Calkins is the writing curriculum that guide our instruction of the Common Core State Standards (CCSS). More than two hours each day are devoted to literacy, with guided reading, shared reading, writing, and word work. Many of the basic skills are embedded within the program, but there is little additional time to properly instruct students on those skills. A master schedule and pacing calendar keep our schedule tight, without room for additional ideas. Our current reading curriculum does not provide any opportunities for close reading. The literature provided with the program is plentiful but lacking in quality. As a school, we are trying to find ways to incorporate better choices for literature and opportunities for close reading.

Analysis Conclusion and Essential Question to Guide Research

The essential question guiding professional growth for the EDUW 693 learning process is: How do I improve instructional design and assessment to achieve each student's developmental capabilities through confident and independently competent learning? The visual below shows the analysis, interpretation, and conclusion steps for reasoning inductively to a logical inquiry question best suited to my areas to improve:

1. Inductive Reasoning: Analysis Step

Gathered Data for Analysis, Grouped by Type of Evidence (next steps transferred from each pre-assessment at Artifact A), with the Key Idea Representing Each Next Step:

Instructional Design: underlined WTS 7 planning-lessons descriptors (page 3). Key ideas: Connections, Differentiation, Collaboration (PLC)
a. The teacher knows how to take contextual considerations (instructional materials, individual student interests, needs and aptitudes, and community resources) into account in planning instruction that creates an effective bridge between curriculum goals and students' experiences.
b. The teacher believes that plans must always be open to adjustment and revision based on student needs and changing circumstances.
c. As an individual and a member of a team, the teacher selects and creates learning experiences that are appropriate for curriculum goals, relevant to learners, and based upon principles of effective instruction (e.g., that activate students' prior knowledge, anticipate preconceptions, encourage exploration and problem solving, and build new skills on those previously acquired).

Instructional Design: next steps for Appropriate Outcomes (Table 2). Area to improve: Develop formative assessments to match learning objectives. Key idea: Formative Assessments

Instructional Design: next steps for Optimal Learning Processes (Table 3). Area to improve: Create cooperative groups to dig deeper in reading comprehension. Facilitate discourse. Key ideas: Cooperative Groups, Student Discourse

Instructional Design: next steps for Designing Engaged Learning (Table 4). Area to improve: Engage students with high-interest texts. Key idea: Student Participation

Assessment Design: underlined WTS 8 descriptors (page 4). Key ideas: Formative Assessment, Self-Regulation, Collaboration, Communication
a. The teacher knows how to select, construct, and use assessment strategies and instruments appropriate to the learning outcomes being evaluated and to other diagnostic purposes.
b. The teacher is committed to using assessment to identify student strengths and promote student growth rather than to deny access to learning opportunities.
c. The teacher evaluates the effect of class activities on both individuals and the class as a whole, collecting information through observation of classroom interactions, questioning, and analysis of student work.

Assessment Design: next steps for Designing Student Assessment (Table 5). Area to improve: Provide options for self-regulation. Key idea: Goal Setting

Current Student Performance in Academic Subject (Table 1a). Area to improve: Refer to details and examples in a text when explaining what the text says explicitly and when drawing inferences from the text. Key idea: Determine main ideas and details

Current Student Performance in Literacy Skills (Table 1b). Area to improve: Identify the reasons and evidence a speaker provides to support particular points. Key idea: Communication

Student Participation: next steps related to Instructional (Formative) Assessment (Table 6). Area to improve: Develop expectations for monitoring student self-assessment and reflection. Key idea: Expectations

Student Participation (Danielson): next steps related to Assessment Practices (Table 7). Area to improve: Create formative assessments to check for student understanding at the end of each lesson. Monitor growth. Key idea: Formative Assessment

Student Participation (WTS 8): next steps related to Assessment Practices (Table 8). Area to improve: Provide options for recruiting interest that increase individual choice and autonomy and enhance relevance, value, and authenticity. Key idea: Authenticity

2. Inductive Reasoning: Interpretation Step. Group key idea words into one or two focus topics.
Plan to foster student discourse, communication, and cooperative learning to engage students in learning.
Learn how to implement forms of formative assessment that include self-regulation.

3. Draw a Conclusion:
The general question guiding professional growth for this process: How do I improve instructional design and assessment to facilitate independent competence in achieving each student's developmental capabilities?
The specific inquiry question that emerged from my pre-assessments: How does reading instruction, including close reading, affect reading comprehension in the general education classroom?

Learning Step 3: Research to Find Answers/Insights

Introduction to Research Summary

The self-assessment, assessment of student performance, and learning environment assessment show that I need to get my students to think critically about text to identify main ideas and details. If students can comprehend texts using close reading, then student scores should increase on all forms of assessment. Current assessment scores show a class with a wide range of abilities that struggles overall compared to others. I want to learn more about instructional strategies to support my students' growth in comprehension strategies, especially those focused on main ideas and details. Additionally, I want my students to become more confident with close reading so they do it naturally without teacher prompting. Finally, my essential question to guide my learning is: How does reading instruction, including close reading, affect reading comprehension in the general education classroom?

Research Summary

Reading comprehension is the cornerstone of English Language Arts (ELA) in the intermediate grades. Most students can fluently read grade-level texts but need a much deeper understanding than in previous years. In the primary grades, students spend much of their energy learning to decode and read fluently. In fourth grade, students are asked to think and write critically about literature. It requires mental strength and grit to reach the deep understanding of the text required for reading comprehension.

Richardson (2009) describes using a balanced literacy approach for reading instruction, including read aloud, shared reading, independent reading, and guided reading. Read aloud gives students a chance to hear fluent reading, answer questions in a whole group, and share an interest in a book. Shared reading gives teachers the opportunity to teach grade-level skills in grade-level texts. Independent reading allows students a chance to build a love of reading with self-selected texts. Choosing their own books gives students more motivation to keep reading. Teachers need to make sure students have the right books, "books in which they can lose themselves and books in which they can find themselves" (Beers & Probst, 2013, p. 7). Guided reading is a small group of students receiving reading instruction in texts at their level. It is important for students to find books that interest them to increase reading volume (Richardson, 2009, p. 263).

Jackson (2016) claimed that it is important for students to receive direct instruction of strategies in order to develop evaluative comprehension (p. 2). One strategy mentioned by several studies (Jackson, 2016; Richardson, 2009) is the think-aloud. Using this strategy allows the teacher to model strategies while reading a shared text. The more often think-alouds occur in reading instruction, the more effective the instruction is for students to make those connections on their own. Jackson (2016) shared that repeatedly modeling strategies for students will foster their ability to internalize and apply the thinking process that should take place while interacting with a variety of texts (p. 7).

The most relevant research on improving reading comprehension points to close reading of texts. Fisher and Frey (2012) shared that close reading is "an instructional routine in which students critically examine a text, especially through repeated readings" (p. 179). Close reading is often associated with middle school and high school students but is now an approach being used in elementary classrooms. This approach must be accompanied by other essential instructional practices that are vital to reading development: interactive read-alouds and shared readings, teacher modeling and think-alouds, guided reading with leveled texts, collaborative reading and discussion, and independent reading and writing (Fisher & Frey, 2012, p. 180).

According to Boyles (2013), when engaging students with close reading, teachers need to explicitly teach "how to approach a text to uncover its multiple layers of meaning" (p. 41). It is important to evaluate the learning taking place to hold students accountable for reading comprehension. Beers and Probst (2013) shared what all educators hope to accomplish using close reading strategies: to create students who are "alert, observant, responsive, responsible, self-reliant readers, respecting their own perspectives and values but also willing to change their minds when evidence and reason demand" (p. 6).

Close reading can look different depending on the classroom, but certain key features are present in close reading in elementary classrooms. According to Fisher and Frey (2012), those features are short passages, complex texts, limited frontloading, repeated readings, text-dependent questions, and annotation (pp. 181-182). Short passages allow teachers to focus on the skills and get started with close reading. The texts chosen are complex, giving students the opportunity to struggle and persevere with the strategies. The struggle requires students to use repeated readings to build their understanding of a text. Teachers do not frontload these articles and instead let students build their understanding through critical thinking and repeated reading. In close reading, text-dependent questions are used so students have to provide evidence from the text. Finally, students need experience annotating the text using notes, highlighters, and circles (Fisher & Frey, 2012, pp. 181-182).

Close reading is a strategy to empower students to think critically to comprehend texts. It moves students to a higher level of thinking than previously taught in elementary schools. Teachers can integrate think-alouds into close reading lessons to model the strategy (Fisher & Frey, 2012, p. 184). Integrating this teaching strategy will help deepen students' understanding of complex texts, which is required by the Common Core State Standards. In closing, improving reading comprehension by implementing close reading strategies will create students who are "close and thoughtful readers whose entire lives will be enriched by books" (Beers & Probst, 2013, p. 7).

Research Conclusion

Guiding my research was the question: How does reading instruction, including close reading, affect reading comprehension in the general education classroom? The research pointed to implementing a balanced literacy approach to reading instruction. One area I had not implemented fully was close reading strategies. I began transforming my reading instruction by adding close reading strategies to our daily routine. Student engagement with the texts increased, and the discussion became much more student led. Moving forward, I plan to work with my school professional learning community (PLC) to plan common lessons using close reading strategies. I am also going to implement close reading in social studies and science so that students gain a better understanding of academic texts as well. With the addition of close reading strategies, reading comprehension will deepen and students will be prepared to tackle complex texts.

Research Implications for Implementation in Planning and Instruction

The essential question guiding professional growth for this process: How do I improve instructional design and assessment to achieve each student's developmental capabilities through confident and independently competent learning?

My specific inquiry question: How does reading instruction, including close reading, affect reading comprehension in the general education classroom?

Answers/insights from research and course learning that I plan to apply in planning and instruction for my targeted learning unit:

1. Design lessons using close reading strategies. Begin with Storyworks articles and Newsela articles. Apply Universal Design for Learning (UDL) strategies.

2. Deliver instruction of close reading strategies, including short passages, complex texts, limited frontloading, repeated readings, text-dependent questions, and annotation, in whole group and small group settings.

3. Plan for student self-reflection, including goal setting and assessment rubrics.

4. Compare assessment results of reading comprehension weekly as measured by easyCBM. Modify lessons based on assessment outcomes.

5. Facilitate student discourse by scaffolding questioning techniques. Foster collaboration and communication.

Learning Step 4: Plan, Incorporating Answers and Insights from Research

Artifact B-1 shows typical assessment criteria/tool and practices before this learning process.

Artifact B-2 shows improvements associated with the assessment criteria/tool and practices

connected to the targeted lesson.

Artifact C-1 is a typical lesson plan before this learning process. Artifact C-2 is the lesson

plan that resulted from research and in-class learning.

Learning Step 5: Implement Plan and Gather Evidence

Artifact D has student work samples with comments that explain how new

instructional design and assessment practices affected student learning. Other evidence related to

instruction, student performance, and learning environment is in the post-assessment notes in

Artifact A.
Learning Step 6: Post-Assess Evidence Compared to Pre-assessments and Standards

See Artifact A, which uses italicized type to distinguish post-assessment information from pre-assessment information.

Learning Step 7: Reflection on My Entire Learning Process

The learning process addressing WTS 7 and WTS 8 focused on improving standards-based instructional design and assessment to achieve each student's developmental capabilities through confident and independently competent learning. My specific area of inquiry that guided growth: How does reading instruction, including close reading, affect reading comprehension in the general education classroom?

The post-assessments summarized what worked and what did not work from the

perspectives of instructional outcomes, so this final step aims primarily at learning how I may use

my time more efficiently and effectively for future learning. Each area below summarizes the two

most significant conclusions that emerged from reflecting from the perspective of my processes and

practices as a learner:

Most Effective Actions/Attitudes in My Seven-Step Learning Process, with Evidence

1. My most effective action was incorporating regular exit tickets into reading instruction. This allowed me to focus on misconceptions and reteach expectations as needed. Each exit ticket contained one question that required students to use close reading strategies to construct a response. After the first lesson in this unit, 53% of students were proficient or advanced. After the final lesson in this unit, 72% of students were proficient or advanced.

2. Another effective action was fostering and providing more opportunity for purposeful student discussion. Using an observation checklist, I found that 15% of students were asking meaningful questions or citing text evidence in their discussion on the first day. Understanding the needs of my students, I created a bookmark with sentence stems for students to use during reading discussion. This improved student discourse during reading, with 80% of students asking meaningful questions or citing text evidence in their discussion on the final day.

Least Effective Actions/Attitudes in My Seven-Step Learning Process, with Evidence

1. The least effective action was using easyCBM to progress monitor student growth. I had been looking for an assessment similar to the STAR assessment to use to progress monitor student growth in reading comprehension. It seemed like easyCBM would do just that; however, I found the test to be time consuming, and it gave unpredictable results. The average score was 55% the first time we took the assessment, and it fell to 50% the second time. I did not continue to use the assessment after that and instead switched back to the STAR assessment for the final assessment.

2. Another action that I found less effective was using rubrics during the assessments. I predicted students would easily score proficient on their exit tickets by using the rubrics, since the rubrics made it clear how to earn all points on each question. Surprisingly, I found that students just checked each step off on the rubric but did not take the time to include evidence in their responses. Even though students had the information in front of them, they did not use it properly. For example, on exit ticket 5, 100% of students marked on their rubrics that they restated the question in their answer. After scoring exit ticket 5, only 69% of students actually restated the question in their answer.

My Next Steps for Professional Educator Improvement

1. I need to continue to create meaningful assessments to measure reading comprehension. I

like the checklists and exit tickets I created for formative assessment but need to expand that to

cover more standards. I would also like an assessment that would measure skills similar to the

STAR assessment.

2. I want to utilize Universal Design for Learning (UDL) to plan an integrated reading and social studies curriculum for the innovation zone at my school.


References

Beers, K., & Probst, R. (2013). Notice & note: Strategies for close reading. Portsmouth, NH: Heinemann.

Boyles, N. (2013). Closing in on close reading. Educational Leadership, 70(4), 36-41.

Fisher, D., & Frey, N. (2012). Close reading in elementary schools. The Reading Teacher, 66(3), 179-188.

Jackson, V. (2016). Applying the think-aloud strategy to improve reading comprehension of science content. Current Issues in Education, 19(2), 1-35.

Richardson, J. (2009). The next step in guided reading: Focused assessments and targeted lessons for helping every student become a better reader. New York, NY: Scholastic Inc.
Artifact A: Pre- and Post-Assessments
Instructional Design Practices Related to WTS 7
Instructional Assessment Practices Related to WTS 8

Italicized type distinguishes post-assessment additions (Learning Step 6) from the earlier
pre-assessment (Learning Step 2). Changes in assessment notes explain changed and/or unchanged
ratings. (Unchanged ratings generally represent improvements within the same developmental
range as the pre-assessment.) Rating codes: U=Unsatisfactory, B=Basic, P=Proficient,
D=Distinguished.
To view the targeted standards guiding the pre- and post-assessment, see the first page of
Artifact C. See Artifact D for evidence of student work supporting the post-assessment.

Define Developmental Level of Academic Outcomes as a Starting Measure of Effectiveness


ELA Writing Standard (1a)
ELA Key Ideas and Details Standard (1b)

Table 1a: Pre/Post Academic Student Performance Compared to PK-12 Vertical Standards
(Grade level represents the current level of proficiency based on a developmental assessment of significant subject standards for PK-12+ academic performance. Proficiency = performance meets all expectations at and below the rating.)

Skill Level: Lowest. Grade Level: 1 to 2. With guidance and support from adults, recall information to answer a question (1) and struggles to recall the information without support from adults (2).
Skill Level: Median. Grade Level: 3 to 4. Recall or gather information and sort evidence into provided categories (3) and determine relevancy and provide sources (4). Some struggle finding relevant information.
Skill Level: Highest. Grade Level: 5. Recall relevant information from experiences or gather relevant information from print and digital sources; summarize or paraphrase information in notes and finished work, and provide a list of sources (5).

Pre-assessment evidence source: Exit ticket assessment. Area to improve: Cite story evidence in written response.
Post-assessment evidence source: Checklists, exit slips, graphic organizers, activity sheets. Most improved area: Recall and gather relevant information and demonstrate that understanding in a written response.

Most Significant Evidence of Improvements in Subject Knowledge and Skills Outcomes


Specific comparisons are in Artifact D. These examples summarize evidence of greatest gains.
1. 53% of students were proficient on the first exit ticket, and 72% of students were proficient on the last exit ticket.
2. I implemented a guided writing approach for struggling students in guided reading to work on improving constructed responses; that work targets students performing in the grades 1-2 range. 90% of students were proficient on at least one exit ticket.
3. I created sentence stem bookmarks to improve discussion around the text. Improved discussion will improve the written response; students first need to be able to comprehend orally before moving to the written response. After implementation of the bookmarks, 80% of students were asking meaningful questions or citing text evidence in their discussions.

Table 1b: Pre/Post Academic Literacy Performance Compared to PK-12 Vertical Standards
(Grade level represents the current level of proficiency based on a developmental assessment of significant literacy standards for PK-12+. Proficiency = performance meets all expectations at and below the rating.)

Skill Level: Lowest. Grade Level: 3. Ask and answer questions to demonstrate understanding of a text, referring explicitly to the text as the basis for the answers (3).
Skill Level: Median. Grade Level: 4. Refer to details and examples in a text when explaining what the text says explicitly and when drawing inferences from the text (4).
Skill Level: Highest. Grade Level: 5. Quote accurately from a text when explaining what the text says explicitly and when drawing inferences from the text (5).

Pre-assessment evidence source: Exit tickets, activity sheets. Area to improve: Explicitly citing the text.
Post-assessment evidence source: Exit tickets, activity sheets. Most improved area: Refer to details and examples in a text when explaining what the text says explicitly.

Most Significant Evidence of Improvements in Literacy Outcomes


Specific comparisons are in Artifact D. These examples summarize evidence of greatest gains.
1. 21% of students were advanced on the first exit ticket and 33% of students were advanced on the final exit ticket. This increase was due to increased feedback, teacher think-alouds to demonstrate learning, and student rubrics.
2. 47% of students restated the question on the first exit ticket with the verbal directions, and 69% of students restated the question on the last exit ticket with the student self-assessment rubric.
3. In classroom discussions, 15% of students were asking meaningful questions or citing text evidence. I had students model discussions and provided students with sentence stems to scaffold their discussion techniques. Discussion is the foundation for the written response, so students need to strengthen their verbal comprehension first. By the end of the unit, 80% of students were asking meaningful questions or citing text evidence in their discussions.

Pre- and Post-Assessments of Design and Instruction Practices Related to WTS 7

Table 2: Pre- and Post-assessment of Instructional Design for Appropriate Outcomes


Danielson, A Framework for Teaching, Domain 1: Planning and Preparation, Component 1c: Setting Instructional Outcomes (pp. 51-53 and chart on p. 54).

Element: Value, sequence, and alignment. Rating: Basic to Proficient. Assessment based on Danielson Framework criteria: 1. (All) Outcomes represent moderately high expectations and rigor. 2. Most reflect important learning in the discipline. 3. Some outcomes connect to a sequence of learning in the discipline. 4. Some outcomes connect to a sequence of learning in related disciplines.
Element: Clarity. Rating: Proficient to Proficient. 1. Outcomes are clear, written in the form of student learning. 2. Most outcomes permit viable methods of assessment.
Element: Balance. Rating: Basic to Proficient. 1. Outcomes reflect several different types of learning, but no attempt to coordinate or integrate disciplines.
Element: Suitability for diverse learners. Rating: Proficient to Proficient. 1. Most outcomes are suitable for most students in the class and based on assessment of students' needs. 2. Needs of most individual students or groups are accommodated.

Pre-assessment evidence source: Lesson plans, assessment results. Area to improve: Outcomes integrate disciplines.
Post-assessment evidence source: Lesson Plan for Video Activity. Most improved area: Connection to writing unit (informational writing).

Most Significant Evidence of Improvements in Designing Appropriate Outcomes


1. Previously, all subjects were taught independently through separate curriculums, and outcomes did not coordinate or integrate disciplines. Outcomes in the lesson in the artifact connected the research of the author with the research in writing to complete informational writing. Lessons integrated social studies, science, writing, and reading. I would not have made those connections before.
2. Before this unit, there was not frequent formative assessment. I added exit tickets at the end of each lesson to inform my instruction. They gave me information to modify my lessons to support students based on their needs.
3. Outcomes reflected minimal types of learning before the unit. I added different types of outcomes based on different types of learning within one lesson.

Table 3: Pre- and Post-assessment of Instructional Design for Optimal Learning Processes
Danielson, A Framework for Teaching, Domain 1: Planning and Preparation, Component 1e: Designing Coherent Instruction (pp. 55-59 and chart on p. 60).

Element: Learning activities. Rating: Basic to Proficient. Assessment based on Danielson Framework criteria: 1. Some learning activities are suitable to students or to the instructional outcomes. 2. Most represent moderate cognitive challenge. 3. Some are differentiated for groups of students.
Element: Instructional materials and resources. Rating: Basic to Proficient. 1. Some of the materials and resources are suitable to students, support the instructional outcomes, and engage students in meaningful learning. 2. There is some evidence of appropriate use of technology and (at the upper rating) of student participation in selecting or adapting materials.
Element: Instructional groups. Rating: Proficient to Proficient. 1. Instructional groups partially support the instructional outcomes. 2. Instructional groups are appropriately varied for students and the different instructional outcomes. 3. Evidence of student choice in selecting the different patterns of instructional groups.
Element: Lesson and unit structure. Rating: Basic to Proficient. 1. The lesson or unit has a recognizable structure that organizes activities. 2. The structure is maintained throughout. 3. Even, coherent progression of activities. 4. Reasonable time allocations for each activity. 5. Some allowance for different pathways according to diverse student needs.

Pre-assessment evidence source: Unit outline, lesson format. Area to improve: Providing appropriate cognitive challenge.
Post-assessment evidence source: Lesson Plan for Video Activity. Most improved area: Students wrote compare/contrast pieces at an appropriate level for their cognitive ability. Some students wrote in an essay format, some used a Venn diagram, and some used talk-to-text on their iPads to record ideas.

Most Significant Evidence of Improvements in Designing Optimal Learning Processes


1. Before, all students completed the same activity with varied supports. I included multiple methods to meet the same learning target based on student need, as in the compare/contrast example above.
2. I usually group students based on beneficial partnerships. I gave students choice in forming groups throughout this unit. I also included an activity in each lesson that gave students choice; creating three questions for the class to answer gives each student flexibility and choice.
3. Students do not engage with the instructional materials connected to our district curriculum because they are not relevant. I decided to create this unit around instructional material that is more engaging. I used Storyworks by Scholastic. The texts were interesting and included different types and formats.

Table 4: Pre- and Post-assessment of Instructional Design for Engaged Learning

Danielson, A Framework for Teaching, Domain 3: Instruction, Component 3b: Using Questioning and Discussion Techniques and Component 3c: Engaging Students in Learning (combining rows in the charts on pages 82 and 85).

Element: Quality of questions. Rating: Basic to Proficient. Assessment based on Danielson Framework criteria: 1. Teacher's questions are high quality in cognitive challenge. 2. Students generally respond with some thoughtful responses. 3. Questions are asked with adequate time to respond.
Element: Discussion techniques. Rating: Basic to Proficient. 1. Teacher-student interaction with some attempt to engage students in genuine discussion. 2. Teacher steps aside when appropriate.
Element: Student participation. Rating: Basic to Proficient. 1. Teacher successfully engages all students in the discussion.
Element: Activities and assignments. Rating: Basic to Proficient. 1. Activities and assignments are inappropriate/appropriate to some/appropriate to all students' age or background. 2. No/Some/Almost all/All students are mentally/cognitively engaged in the activities and assignments in exploring content. 3. Students do not/sometimes/generally initiate or adapt activities and projects to enhance their understanding.

Pre-assessment evidence source: Discussion bookmarks. Area to improve: Student-to-student meaningful discussion. Engage all students.
Post-assessment evidence source: Sentence stem discussion and questioning bookmarks from the lesson. Most improved area: Student participation in discussion increased with the use of sentence stems.

Most Significant Evidence of Improvement in Designing Engaged Learning


1. Students created questions to use in class discussion, creating higher engagement in the learning activity. In the past, the questions were teacher led.
2. I provided students with sentence stems for discussion, which gave them more confidence to participate. There are not strong discussion leaders within the class, and students need the additional supports to know what to do.
3. I used multiple types of questions in lessons to engage all learners, varying between explicit and implicit questions as well as questions requiring higher-level thinking. This is demonstrated in my lessons.

Pre- and Post-Assessments of Assessment and Instruction Practices Related to WTS 8

Table 5: Pre- and Post-assessment of Assessment Design

Danielson, A Framework for Teaching, Domain 1: Planning and Preparation, Component 1f: Designing Student Assessments (pp. 59-63 and chart on p. 63).
Rating options: U=Unsatisfactory, B=Basic, P=Proficient, D=Distinguished

Element: Congruence with instructional outcomes. Rating: Basic to Proficient. Current evidence to support rating/area to improve: 1. None/Some/All instructional outcomes are assessed through the proposed assessment approach. 2. Assessment methodologies have/have not been adapted for groups/individuals as needed.
Element: Criteria and standards. Rating: Basic to Proficient. 1. No/unclear/clear criteria and standards. 2. Students do/do not contribute to development of assessment criteria.
Element: Design in formative assessments. Rating: Basic to Proficient. 1. Lesson plans include no/rudimentary/well-developed/well-designed formative assessment strategies for all instructional outcomes. 2. Lesson plans include no/minimal/particular/well-designed approaches to engaging students in assessment and correction of their work.
Element: Use for planning. Rating: Basic to Proficient. 1. No plans/Plans to use assessment results in designing future instruction. 2. Does not use/Uses assessment results to plan for whole class (basic) and/or group (proficient) and/or individual instruction. (Distinguished is all three levels.)

Pre-assessment evidence source: Lesson plan, unit plan. Area to improve: Students contribute to development of assessment criteria.
Post-assessment evidence source: Assessment artifacts. Most improved area: Worked together as a class to modify the short answer rubric into a student rubric.

Evidence of Improvements in Designing Effective Assessment Practices


1. I added independent formative assessment in the form of exit tickets. Previous formative assessment was through work completed as a group or with partners. These additional assessments give me a more accurate snapshot of individual knowledge and demonstration of standards.
2. Formative assessment was used to plan for and support individual growth during guided reading. With the assessments, I worked with each student based on the needs shown in the assessment. The formative assessment prompted my planning for individual instruction.
3. As a class, we modified the short answer rubric the teacher uses to score assessments into a more student-friendly checklist rubric. The format was based on student preference.
Three Pre- and Post-assessments of Participation/Learning Environment Related to Assessment Design

Table 6: Student Participation Related to Instructional (Formative) Assessment

Current approximate % of student learning/engagement observed by teacher during:
(a) teacher-guided formative assessments in classroom: 65% to 70%
(b) independent formative assessments in classroom: 50% to 60%
(c) formative peer assessments in classroom: 45% to 55%
Current approximate % of completion for assessments assigned as homework: 30% to 40%
Current overall accuracy in assessing learning using criteria or assessment tools: 55% to 60%
Current understanding of formative assessment as a valuable learning strategy: 80% to 85%

Table 7: Assessment Practices Based on Danielson Framework

Danielson, A Framework for Teaching, Domain 3: Instruction, Component 3d: Using Assessment in Instruction (pp. 86-89 and chart on p. 89).
Rating options: U=Unsatisfactory, B=Basic, P=Proficient, D=Distinguished

Element: Assessment criteria. Rating: Basic to Proficient. Current evidence to support rating/area to improve: 1. Students are not aware/know some/are fully aware of the criteria and performance standards by which their work will be evaluated. 2. Students have not/have contributed to the development of the criteria.
Element: Monitoring of student learning. Rating: Basic to Proficient. 1. Teacher does monitor progress of whole class (basic) and groups (proficient). 2. Teacher elicits no (basic)/makes limited use of (proficient)/actively and systematically elicits (distinguished) diagnostic information from individuals regarding their understanding and monitors individual progress.
Element: Feedback to students. Rating: Basic to Proficient. 1. Teacher's feedback to students is poor quality and untimely/uneven quality and untimely/high quality and timely/consistently high quality and timely. 2. Students do not/make use of the feedback in their learning with/without prompting (with = proficient, without = distinguished).
Element: Student self-assessment and monitoring of progress. Rating: Basic to Proficient. 1. Students do not/occasionally/frequently assess and monitor the quality of their own work against the assessment criteria and performance standards. 2. Students do not/rarely/occasionally/frequently make active use of that information in their learning.

Pre-assessment evidence source: Teacher recall, current assessment tools and practices. Area to improve: Students frequently assess and correct to expected quality based on criteria.
Post-assessment evidence source: Student assessment artifacts. Most improved area: Providing students with a rubric to self-assess their own responses.

Table 8: Assessment Practices Based on WTS 8 Teacher Standards

Rating options: U=Unsatisfactory, B=Basic, P=Proficient, D=Distinguished

Element: Criteria and Rating System. Rating: B to P. Questions to consider in rating current performance and defining areas to improve: Can students name expectations (what to know/do) for each learning step? For a task, can students explain the line between unacceptable (below proficiency range) and essentially proficient? ...between fully proficient and mastery (above proficiency range)? Does the rating system result in points/percentages/rating phrases that match the proficiency range for the task based on standards for the grade level (or temporarily adjusted expectations to raise overall PK-12 performance to standards)?

Element: Monitoring. Rating: B to P. Do all students participate willingly in formative assessment, knowing the environment is safe for making inevitable learning mistakes? Do students quickly and objectively provide evidence and ideas for improvement when the teacher solicits information about what worked best and what did not to achieve objectives? Do students use subject terminology and assessment criteria to question ratings and frame discussions/questions, rather than personal opinions/emotional thinking? Would students agree that the teacher maintains useful records of student work and performance and can communicate student progress understandably?

Element: Feedback. Rating: B to P. Do the class and/or groups and/or individuals receive immediate feedback at each mini-step of learning that confirms or corrects learning? Is the same confirm-or-adjust-instruction process happening on the teacher's part based on continual assessments of student learning and feedback? (In other words, students know the goal is to get it, and if they are trying and don't get it, the teacher accepts responsibility for finding a method that works: a learning TEAM.)

Element: Student-initiated Assessment. Rating: U to B. Do students consider continual informal and formal formative assessments as not only beneficial, but necessary for successful learning? Before deadlines, do students ask for additional formative assessments if unsure of performance or to ensure performance meets high expectations? Do students take responsibility for their own formative assessments and try to evaluate objectively, knowing it will help them become aware of their strengths and needs, and encourage them to set personal goals for learning?

Pre-assessment evidence source: Student rubrics. Area to improve: Students self-assess using the rubric combined with teacher score and feedback.
Post-assessment evidence source: Student rubrics. Most improved area: Students have rubrics to use to self-assess before turning in formative assessments.

Evidence of Improvements in Learning Environment Related to Assessment


1. Students were turning in exit tickets without fully understanding what was being scored. I verbally shared the requirements I would be using to score the assessments, but many did not follow those suggestions. Adding the student rubric provided a visual reminder of the criteria needed to be proficient.
2. Besides scoring and writing on rubrics, I occasionally pulled students individually to conference about the assessment. This gave me time to give more personal feedback and allow for student questions. I noticed increased performance after these conferences.
3. Students need to be able to connect the learning objectives in the lesson to the criteria in the formative assessment. I currently post and state the learning objectives for each class and give clear criteria for the assessment. I think I could do a better job making the connection between those areas throughout the lesson. I want students to be able to communicate and reflect on what proficiency would look like on the assessment.
Artifact B: Improved Assessment Design
Previous Example of an Assessment Method or Tool Before Improvement
Source: Fourth grade does not use regular, formative independent assessments in reading comprehension. We currently use guided reading work and unit tests for shared reading. Students can work together during guided reading, so that does not give us a fully independent assessment. Unit tests are based on the assessment guide from our reading curriculum, Good Habits, Great Readers from Houghton Mifflin, and are summative. I focused on implementing exit tickets at the end of each lesson. Previously, our discussion was verbal and the written response was modeled within our small guided reading groups. Feedback was immediate but not recorded. I added the student rubric after the first exit ticket. It improved scores, but not as drastically as I predicted. Below is the rubric fourth grade uses for short-answer questions on unit assessments.
Skill | 4 Advanced | 3 Proficient | 2 Minimal | 1 Basic
Restate question in answer | | Fully restated question in answer | Partially restated question in answer | Little or no response that meets criteria.
Answers the question | | Gives a complete answer | Gives a partial answer | Little or no response that meets criteria.
Provides evidence | Goes above and beyond expectation | Provides one complete detail to support answer | Partially complete | Little or no response that meets criteria.
Cites the text | Goes above and beyond expectation | Provides a detail from the text | Partially complete | Little or no response that meets criteria.
Writing Conventions | | 1-2 errors | 3-5 errors | More than 5 errors
Trial Example of an Assessment Method or Tool After Improvement
Text boxes indicate significant improvement or lack of improvement by comparison to usual previous outcomes based on progress toward PK-12 developmental expectations/standards. Explanations are summarized in the Post Assessment section.
I decided to design a simplified rubric that students could use when completing formative assessments. This encourages them to be complete and to double-check their work to make sure it meets all of the descriptors. I chose to take out the levels of proficiency because my students struggle with reading comprehension. I wanted the rubric to function like a checklist to make sure students included everything needed in a short answer. The previous rubric was somewhat vague in some of its descriptors.
Students self-assess while completing the exit ticket.
Students score with a plus or minus: either they have met the descriptor or they have not.
Exit Ticket Rubric
Use this rubric to complete your exit ticket. If you completed the descriptor, put a plus (+) in the
box next to each descriptor under student score. If you did not complete the descriptor, put a minus
(-) in the box next to each descriptor under student score.
Teacher Score | Student Score | Descriptor
              |               | Restate the question in the answer
              |               | Answer the question
              |               | Provide evidence
              |               | Cite the text
              |               | Writing conventions (capitalization, punctuation, grammar, spelling)
Descriptors are broken down into manageable pieces to allow for easier comprehension.
Artifact C: Improved Instructional Design
The two plans below show an example of previous planning practices compared to a plan excerpt created during the EDUW 693 course.
Previous Lesson Plan Example
This first lesson plan excerpt demonstrates the ideas that typically guided me when instructing students in how to comprehend reading passages by citing text evidence.
Working on answering questions with direct text evidence.
Trial Lesson Plan Example
This plan demonstrates understanding of EDUW 693 expectations for lesson design processes and
elements, guided by WTS 7 expectations. Color codes indicate applications of planning terminology and
practices aimed at aligning expectations, content, process, product, and assessment elements.
5 essential planning elements: objectives, content, process, product, assessment (3 types: diagnostic, formative, summative). One example of each in CAPITALS & YELLOW HIGHLIGHT.
5 different assessment tools/methods: five total formative or summative methods in red print.
6 levels of Bloom's Taxonomy (explain missing or eventual levels with the name of the level in upper case).
5 thinking patterns (place term next to synonym: Introduce/Define by group).
5 instructional strategies/techniques: see the 693 term sheet for ideas.
3 differentiation/variation/alternative strategies, highlighted in light gray (learning via 2+ strategies). DIFF = necessary for learning. VAR = appeals to sustain learning for most students. ALT = if needed. May differentiate: expectations (if capabilities), content, process, product, assessment tool/method.
Multiple intelligences: musical, visual, verbal, logical, body/kinesthetic, interpersonal (social), intrapersonal (solitary), natural, existential (reflecting inwardly/philosophically).
Learning styles: concrete/feeling, abstract/thinking, active/doing, reflective/watching, accommodating (feel + do), assimilating (think + watch), converging (think + do), diverging (feel + watch).
Differentiate by senses: see, hear, touch, smell, taste, do, emotion, setting.
1 use of technology to assist learning (green print).
1 example of making purposeful connections: expanding perspectives beyond academics to realities, interests, students' past/present/future, cultural/racial/ethnic awareness, gender sensitivity, etc.
Unit Plan that Incorporates the Use of Thinking Tools
Subject Area: English Language Arts
Grade Level(s): 4
Unit Overview
Unit Title: Detectives Close the Case on Reading
Unit Summary
Students will examine two texts to determine main ideas and details using close reading. Close reading is thoughtful, critical analysis of a text. Students will begin by reading an informational text about a space mission. The lesson will begin with vocabulary work and setting the purpose for reading. The teacher will scaffold the process of close reading an informational text by modeling question types and using think-alouds. Students will communicate and collaborate during this process. They will need to defend their responses with evidence from the text as well as critique the responses of others. Students will identify problems and solutions from the text.
Students will then read a second text, a fiction story about a lonely boy who makes a connection. The lesson will begin with vocabulary work and setting the purpose for reading. The teacher will scaffold the process of close reading the text by modeling question types and using think-alouds. Students will communicate and collaborate during this process. They will need to defend their responses with evidence from the text as well as critique the responses of others. Students will identify the climax of the plot and the theme of the story.
Students will work independently, with partners, and in small groups. Students will respond verbally and in written form. This mini-unit will not include the highest level of Bloom's Taxonomy (create). One of the main focuses of this unit is to use questioning to promote student discourse to grow in understanding main ideas and details while close reading a text.
Building the Foundation
Habits of Learning Taxonomy: Defend, support, critique, compare, contrast, use, demonstrate, discuss, explain
Standards
CCSS.RL.4.1, CCSS.RL.4.2, CCSS.RL.4.3 (literature - key ideas and details)
CCSS.RI.4.1, CCSS.RI.4.2, CCSS.RI.4.3 (informational - key ideas and details)
CCSS.SL.4.1 (speaking and listening)
CCSS.L.4.3 (vocabulary)
Learning Objectives
Students will identify main ideas and details when close reading a text.
Students will use vocabulary in written responses.
Students will identify problems and solutions in a nonfiction article.
Students will defend their responses with text evidence.
Students will identify the climax of the plot in a fiction story.
Students will summarize a text.
Curriculum-Framing Questions
Essential Question: How can I defend my answer with text evidence?
Unit Questions:
Look at the illustrations, headline, and large words on pages 4-5. What mood do they create?
What major problem is presented in the first section of this article?
What is one theme, or big idea, you think this story has?
Content Questions:
Lauren Tarshis writes that the mission to the moon would not be a "luxurious ride." How does she support this statement?
How does Tarshis create suspense in the last section of the article?
How has the main character changed throughout this story?
Student Assessment Plan
Assessment Summary
The assessment will take place throughout the unit. I am using easyCBM, a research-based reading comprehension assessment, as a benchmark assessment for this unit. Students should show 10% growth on this assessment. I will measure my students' background knowledge based on classroom performance, the benchmark assessment, and the product from the first day. This will inform my instruction as I move forward. Students will have an exit ticket at the end of the class period to demonstrate understanding of the central task. Some exit tickets will be individual while others will be completed with partners. Students will set personal learning goals at the beginning of the unit to provide motivation. I will use teacher observation during the lessons to determine if students are understanding the main ideas. Students will use checklists and/or rubrics when completing work. After the work has been completed, students will take a final unit assessment as well as the easyCBM. Students should score at least 80% on the unit assessment.
Assessment Timeline
Before Project Work Begins: easyCBM (reading comprehension tool), background knowledge check
While Students Work on Projects: exit slips, checklists, observation, self-reflection, rubrics, graphic organizers
After Project Work Ends: easyCBM, unit assessment
Unit Details
Approximate Time Needed: 2 weeks, 30 minutes per day.
Prerequisite Skills: How to use the magazine app, GAFE; possible Canvas quiz.
Procedures
Day 1: Introduce unit by activating and sharing background knowledge (5 minutes)
Preview vocabulary via slideshow (15 minutes)
Set purpose for reading (5 minutes)
Make predictions (5 minutes)
Day 2: Read text (20 minutes)
Teacher scaffolded close reading (10 minutes)
Day 3: Graphic organizer for main ideas and details (15 minutes)
Graphic organizer for main ideas and details (15 minutes)
Day 4: Close reading and comprehension questions in groups (15 minutes)
Video connection and whole group discussion (15 minutes)
Day 5: Vocabulary for second story via slideshow (15 minutes)
Set purpose for reading (10 minutes)
Predictions (5 minutes)
Day 6: Read text (20 minutes)
Character analysis graphic organizer (10 minutes)
Day 7: Close reading - teacher, group, individual (30 minutes)
Day 8: Plot graphic organizer (15 minutes)
Making inferences graphic organizer (15 minutes)
Day 9: Common Assessment
Day 10: easyCBM
Accommodations for Differentiated Instruction
Resource Student: Lower-lexile version of text, audio version of text, more time on tests, beneficial partnerships, verbally share ideas instead of writing the responses, read aloud questions.
English Language Learner: Guide students through the article section by section, circling unfamiliar words. Discuss the words and their meanings in groups. Point out that some parts of this story tell about events that happened before the story started. Look at the past perfect construction together (for example, "Ray had found," "He had shown").
Gifted Student: Have students do further research on Apollo 13, finding at least three new details about how astronauts and engineers responded to the disaster. Include these details in a written response. Retell the story from another character's point of view.
Materials and Resources Required For Unit
Printed Materials: Storyworks articles, rubrics, checklists, exit tickets, graphic organizers, questions
Supplies: No special materials are needed for this unit.
Technology - Hardware: Computer, projection camera, iPads
Technology - Software: Internet browser, word processing
Internet Resources: Magazines app, Google Drive (GAFE)
Other Resources: Virtual Field Trip from Microsoft Education Space Science
Individual Lesson Plan: Video Activity
Previous Lessons in Unit: Vocabulary, listen to article, Close reading, discussion, graphic organizer,
group discussion
OBJECTIVES: Students will defend their responses with video evidence. Students will compare
and contrast using text and video evidence.
Content Connections: Social Studies (famous events from American history), Science (space travel,
engineering)
Materials: Computer, projector, activity sheet, pencils, iPads
Activities: CONTENT
1. Recall the main idea of the article "Disaster in Space" from the previous day's work. (remembering)
2. Give examples from the text of problems faced by the astronauts. (understanding)
3. Predict how the author knew the information to write this article.
4. Preview activity sheet and learning objectives.
5. Watch the video about how the author, Lauren Tarshis, researched Apollo 13.
6. PROCESS Complete activity sheet using evidence from a video instead of a text. (applying)
7. Connect to informational writing project and the research they are doing. PROCESS
8. Compare and contrast strategies for citing text evidence and strategies for citing video evidence. (analyzing)
9. Discuss student responses to activity sheet. Defend responses and critique evidence.
(evaluating)
10. Compose 3 questions for other groups to answer based on the video or article. (creating)
PRODUCT
11. Exit ticket assessing students' understanding of citing evidence. ASSESSMENT
Assessment Methods:
1. Pre-Assessment before unit (diagnostic)
2. Observation during introduction of lesson (formative)
3. Checklist during discussion (formative)
4. Activity sheet (formative)
5. Exit slip (formative)
6. Rubric with student and teacher scores (formative)
7. Post-Assessment at end of unit (summative)
Instructional Strategies in this lesson:
1. Accountable talk
2. Sentence stem discussion and questioning bookmarks (pictured below)
3. Cooperative learning
4. Effective questioning
5. Graphic organizers
6. Identifying similarities and differences
7. Student self-assessment
8. Video
Differentiation in this lesson:
1. Talk-to-text for written response (DIFF)
2. Visual learning (VAR/MI)
3. Interpersonal (VAR/MI)
Artifact D: Examples of Lowest, Median, and Highest Student Work with Assessment Markings
Text boxes indicate areas that show significant improvement (or lack of improvement) by
comparison to usual previous outcomes based on progress toward PK-12 developmental
expectations/standards. Explanations are located in the Post Assessment section.
Before: High Student
Student does not restate the question or look back to the text for evidence. This student has the right idea that the astronauts have to figure out how to solve more problems.
Before: Median Student
Student restates the question but is unsure how to answer the question. After a long time of thinking, this student decided he/she could not answer it.
Before: Low Student
This student does not restate the question but does include the explosion that causes the new problems.
After: High Student
Restates question.
Answers question with evidence.
Cites the text.
The student was able to self-assess before turning in the exit ticket.
After: Median Student
Attempted to restate the question, although it does not make complete sense.
Answers the question using text evidence.
Much greater detail is provided throughout with the guidance of the rubric, as compared to the before sample.
After: Low Student
Restates the question.
Provides evidence but does not make the connection to the text clear.
Answer meets more of the requirements to be proficient than the original.