
Building Mathematicians:

A Reflection on Online Coaching in the Math Classroom

Douglas D. Harrington

Assignments & Final Report


MATC Program
College of Education
Michigan State University

TE 808 - Inquiry into Classroom Teaching & Learning


Fall 2016
INTRODUCTION

Context

This semester begins my fifth year teaching mathematics at a high school in suburban
southeast Michigan. From the very beginning, I have worked extensively with curriculum
development and, specifically, designing mathematics support courses for targeted students.
Presently, I am teaching two advanced courses (Honors Geometry), two regular-tracked
courses (Algebra 2), and a support class (Geometry Lab). Children of color make up approximately
30% of the high school's population; my Geometry Lab class is 80% African American and
English Language Learner students. The proportion of students from lower socio-economic homes in
the Geometry Lab class does not reflect the school demographic profile either. In all, the high
school ranks near the top of the state each year on standardized test scores and regularly
gains national recognition. In many cases, students are taking courses one to two levels higher
than their peers from neighboring communities.

Tension

Teaching mathematics to students who have had little success in previous courses can
prove daunting. Identifying their misconceptions while facilitating the formation of new
mathematical knowledge requires adequate preparation and focus. Given the proclivity of my
Geometry Lab students to evade engaging with the mathematics, I rarely have the opportunity to
focus on teaching that is both diagnostic and prescriptive. As a result, I grapple with finding
instructional methods which incorporate both review and preview of content. Because my
Geometry Lab students lack skills necessary for understanding foundational concepts in
Geometry, focusing on remediation fails to provide them with access to core content; however,
spending time preparing students for future content proves fruitless as their skills need
development. For some time I have been torn by this tension: Should this class be remedial or
progressive? Possibly both? A colleague suggested I take an approach that seeks to remedy
this tension: teaching mathematical content through the use of (blended) learning stations.

Justification

I want to study how blended learning stations address this tension because, first and
foremost, combining review and preview seamlessly into my classroom practice provides
students with the tools necessary for mathematical success in Geometry and beyond. Secondly,
admission to the class requires a teacher's recommendation, parental agreement, and a prior failure
in a core mathematics course. As a result, these are the students who would benefit most from the
effective reactionary and proactive instruction which learning stations could provide. Third, the
Geometry Lab class provides an opportunity to address mathematical disparities between my students
and their peers. If working in blended learning stations proves to effectively support my lab students,
I can adapt this process to fit the needs of my other courses as well. Finally, I will be able to
analyze whether students develop self-direction and increase in their ability to solve problems
creatively.

Research Question

Given the context, tension, justification, and wide range of mathematical abilities present
in my Geometry Lab students' thinking, the research question I chose to study is: How do
blended learning stations that review and preview using online coaching software affect
achievement in Geometry, both perceived and realized, for students with a history of obstacles
to mathematical success?

LITERATURE REVIEW

Overview

Using my research question as a guide, I reviewed literature on learning stations and
centers as well as blended learning. I begin my review by explaining the five big ideas I
synthesized from the literature. I conclude by summarizing, comparing, discussing, and
evaluating the relevance of these big ideas to the research question I want to answer.

Review

The five big ideas I synthesized from the literature are identified, explained, and
illustrated below.

Blended Learning Stations Are New

As I listened to a colleague recount her participation in a breakout session focusing on
the implementation of learning stations while at an obscure mathematics conference in northern
Michigan, I reveled in the novel, revolutionary idea of mathematics instruction through rotating
stations. Surprisingly, the use of stations to improve the quality of instruction and opportunities
for mastery for students has existed since at least the 1970s (Ukens, 1976); however,
comparing scholarly sources highlights a renewed focus on these ideas in recent years in light
of the Internet's transcendent impact (see dates in list of references). Given the ubiquity of personal
technology devices in today's culture, new opportunities arise to explore the old notions of
learning by stations through blended learning. The most comprehensive and exhaustive
exploration of blended learning and its effects I discovered while researching is Means,
Toyama, Murphy, and Baki (2013). Though they cite more than 100 assorted studies and
reflections regarding blended learning, only seven of the sources referenced date from before the year
2000 (pp. 40-46).

Stations Differ from Centers

Before analyzing existing research for foundational claims regarding my inquiry, clear
delineations between stations and centers in the classroom must be constructed. Tomlinson
(1999) provides the most comprehensive discussion of differentiated learning techniques and
suggests the fundamental difference between stations and centers lies at the intersection of
purpose and content. The driving force behind stations is an inherent flexibility provided by the
students' ability to work on different tasks, without the necessity of visiting every station, while still
targeting a specific set of skills or content goals (p. 62). On the other hand, centers do not
necessitate similar content links between areas. Centers exist independently of one another,
some with the purpose of addressing content standards while others are considered interest
centers (p. 76). As a whole, my research question is most interested in two of the three
station types which Ukens (1976) lays out in his short essay: directed discovery stations and skill
development stations.

The literature focuses primarily on the effects of blended learning and stations on the attitudes
and achievement of students. The next big ideas serve as a review of the most significant
impacts on students as well as the most effective ways to encourage these behaviors.

Blended Learning Increases Student Engagement

Blended learning stations promote positive perceptions about learning, resulting in
deeper student engagement. The research data overwhelmingly agree that blended learning
and/or stations encourage student engagement (Reiser & Butzin, 2000; Billingsley et al., 2009;
Lim & Morris, 2009; Delialioğlu, 2012; Murphy et al., 2014; Bhagat et al., 2016). Specifically, Murphy et al.
(2014) found that 62% of students were moderately engaged and 25% were highly engaged
while using Khan Academy (p. 10). Although differences exist in both scale and factors
analyzed, the support for blended learning's ability to engage students is unquestionable.

Increased engagement directly arises from the nature of the work occurring during
instruction. Wang, Han, and Yang (2015) explain that blended learning is adaptive and dynamic;
progress occurs individually and strategies adjust accordingly (pp. 389-390). In other words, the
power of blended learning lies in its inherent ability to differentiate learning. The organization of
multiple opportunities for learning encourages students to engage with the method most
personally successful.

Research further highlights that blended learning's differentiation leads to diverse,
acceptable methods for students to explain their learning. Furthermore, Reiser and Butzin (2000) argue
that the multiplicity of knowledge, the methods students employ to justify thinking, increases
engagement as students perceive more entry points to understanding. By allowing students to
work in multiple contexts while arranged in stations, instructors can provide at least one
environment in which each student can experience success. Students can learn through
traditional, direct instruction while also being given opportunities to construct their own
understandings, review and preview material, and engage in blended learning. Lim and Morris
(2009) also highlight that increases in perceived learning, as suggested by Reiser and Butzin (2000),
are a natural consequence of the organization of blended learning and lead to actual, realized
success (p. 289).

Blended Learning Increases Self-Directed Learning

The scholarly work analyzing blended learning provides evidence that these instructional
routines support students' ability to self-direct learning. Although research supports direct
instruction as best practice for students with emotional or behavioral disorders, these same
studies indicate that Computer Aided Instruction (CAI), when applied in a contained setting as a
learning intervention, increases these students' abilities to self-direct learning (Billingsley,
Scheuermann, & Webber, 2009). Furthermore, since increases in self-directed learning are linked
to increases in both positive learner identities and problem-solving abilities (Reiser & Butzin,
2000), incorporating online resources, like Khan Academy, into the students' work within
stations serves the overall purpose of the lab course and supports students beyond
mathematical content. As a result, blended learning encourages the development of positive
classroom behaviors supporting personal independence.

Coaching Software Facilitates Blended Learning Goals

Furthermore, research supports utilizing online coaching resources for blended learning
as a significant intervention in mathematics instruction. In fact, research supports the conclusion
that the resource itself does not hold any magical powers for increasing student achievement; rather,
correct implementation encourages students to independently access applicable content,
resulting in student growth. Specifically, Cargile (2015) finds that only 20 percent of total class
time should be devoted to Khan Academy (KA) work (p. 37), while Khan (2012) reveals only a fifth of the class
should be engaged in this work at any one time (p. 204). Furthermore, research reveals that KA
best serves as a supplement to direct instruction as students explore concepts which need
remediation or extension (Means et al., 2013; Murphy et al., 2014; Moore et al., 2014).

Relevance

Summary

In summation, the value of blended learning lies in its unique ability to provide
opportunities for students to engage mathematical content in multiple ways. This multiplicity is
especially important in support classrooms where strategic interventions can encourage the
development of positive habits and learner identities. As a result, blended learning interventions
directly affect the scale of student growth occurring in the classroom. In the end, the evidence
supports blended learning stations' ability to alter outcomes.

Connection

Overall, the research harmonizes like a large choir, each source supporting the others' notes while
still providing something new to the sound. The most common thread focused on the ability of
blended learning stations to increase independence and self-directed learning. Slight dissonance
occurs when Billingsley et al. (2009) comment that direct instruction provides the best means
for instructing students; however, the contentious notes are dampened by the data which
support CAI routines. Still, the studies seemingly lack detail about the time devoted to, and methods
of, blended instruction, making comparisons difficult.

Relevance

Since my students receive daily direct instruction in their standard Geometry course, I
create more space for students to exclusively explore Khan Academy for longer than research
normally recommends. Instructional routines built around this work in stations encourage
engagement with mathematical concepts and afford the opportunity to challenge my students
to independently control both progressive learning and their personal remediation. As a result, this
inquiry is focused on how stations, utilizing principles of blended learning, impact the success
and identities of struggling mathematics students.

Questions

After careful exploration of the research and discussions surrounding blended learning
stations, I still wonder about a few aspects of the instructional routine. First, what programs or
software perform best for creating blended learning spaces in the classroom? Although I
focused primarily on Khan Academy, discussions of other resources seemed to be lacking from
the general sources. Second, how is blended learning best organized in the classroom? The
scholarly voice is surprisingly mute on this question despite explaining in great detail what
students gain from the experience. Finally, I am anxiously wondering what new inferences or
evidence my inquiry can add to the existing knowledge.

DESIGN & METHOD

Participants

Sixteen high school students formed the focus group of my teacher inquiry project: one
senior, five juniors, and ten sophomores. Seven of these students were female and nine were
male. Seven participants were African American, one was Hispanic, one was Native American,
three were from Mediterranean or Middle Eastern states, and four students were White. The
participants composed my Geometry support class. In order to be admitted into the course,
students needed a score consistent with the lowest quartile of the school's population on the
PSAT, administered the previous year, as well as a teacher's recommendation. I selected my
Geometry support class because these students receive their primary mathematics instruction
elsewhere, so I can focus my attention almost entirely on intervention strategies, both
progressive and reactionary.

Setting

Only a few features of my classroom setting are truly important for answering my
research question. First, students have daily access to laptops and/or their own personal
devices (my classroom is considered Bring Your Own Device). Therefore, students work with
the technology that they are most comfortable with and that facilitates the completion of
the day's tasks. Secondly, my students are arranged in pairs around the classroom to encourage
them to work collaboratively but also to limit distractions from off-topic conversations.
Additionally, the coaching software prominently features help videos, so extra space is
necessary for students to listen to the audio. Finally, I circulate around the classroom and use
physical check-ins as well as notifications on my iPad to monitor the progress of students and
recommend appropriate interventions.

Data

Fieldnotes served as my first line of data collection. I chose this source because it allows
me to track my initial observations and reactions to the student work occurring in the classroom.
Furthermore, fieldnotes provide an opportunity to look at general trends of engagement, effort,
and achievement occurring throughout the course of my study. I envision fieldnotes providing an
added level of depth to simple observations by charting the significant moments and contexts
occurring during an instructional period.

Likert scale surveys composed my second source of data. I chose this source to gain
perspective on general class attitudes towards the implementation of blended learning stations
as intervention instructional routines. These broad reactions revealed specific students to track
as barometers for the entire population and to follow up with through in-depth interviews and
questionnaires. These initial reactions provided a foundation for the formulation of targeted
interview and questionnaire questions designed to unpack conflicting attitudes. Furthermore,
surveys provide a less-intensive way to include parental/guardian attitudes towards blended
learning as well.
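
For analysis, I later collapsed each item's 1-5 responses into negative, neutral, and positive counts, which is the format used in the response tables in the Findings section. The snippet below is a minimal Python sketch of that tallying step; the scores shown are hypothetical placeholders, not my actual survey data.

```python
from collections import Counter

def tally_likert(responses):
    """Collapse 1-5 Likert responses into negative/neutral/positive counts.

    Scores of 1-2 count as negative, 3 as neutral, and 4-5 as positive.
    A skipped item (None) is ignored, mirroring the survey's option to
    leave a statement uncircled and explain why instead.
    """
    buckets = Counter()
    for score in responses:
        if score is None:
            continue
        if score <= 2:
            buckets["negative"] += 1
        elif score == 3:
            buckets["neutral"] += 1
        else:
            buckets["positive"] += 1
    return buckets

# Hypothetical responses from the 16 participants to a single statement.
item_scores = [1, 2, 2, 1, 3, 2, 1, 4, 2, 1, 2, None, 1, 5, 2, 4]
print(tally_likert(item_scores))
# Counter({'negative': 11, 'positive': 3, 'neutral': 1})
```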

My third source of data was assessment scores from the students' regular Geometry course. Although
these scores are quantitative, they provide a lens for quickly measuring student achievement.
By charting student growth, I assessed the viability of the online coaching software to positively
impact student understanding in my course. These assessment scores were also
accompanied by the completion scores from the mentoring software.

Finally, questionnaires were my fourth source of data. I chose questionnaires because
they provided an easy transition into a formal interview if I wanted more depth to a student's
response. (Technically, interviews would constitute a fifth source of data; however, I view these
questionnaires and interviews as a single, interconnected source of data.) Questionnaires
provided the opportunity to identify why students held the attitudes expressed in their Likert
scale surveys. Parent questionnaires also help triangulate the information obtained in this way.

Timeline

My students began work with the blended learning stations approximately three weeks
before I formalized my plan of implementation. As a result, I made sure to survey initial
reactions to the work utilizing the online coaching software, and my data collection focused
on how attitudes shifted as well as the implications for student learning. I envisioned the
process as a two-week collection of data, the time-frame allowing for a unit assessment to take
place during the course of the inquiry and for surveys/questionnaires to be returned.

After the first day of the online coursework, I asked students to quickly reflect on four
questions. On October 12, students commented on their engagement, perceived value of online
mentoring, desire to continue the work, and suggestions for more productive learning. Although
the survey did not ask students to respond in-depth, these four questions became the backbone
of future surveys.

The table below outlines the timeframe for conducting my teacher inquiry after collecting
initial responses:

Table 1:
Week of 10/24-10/28: fieldnotes (two days), a Likert scale survey, Geometry grades, and updates from the primary Geometry teachers

Week of 10/31-11/4: student interviews (10/31), fieldnotes (11/1 and 11/3), a Likert scale survey (11/2), and student and parent questionnaires (11/4)

Week of 11/7-11/11: fieldnotes (two days), a Likert scale survey, Geometry grades, updates from the primary Geometry teachers, and exit interviews

Role

I will balance my role as a classroom teacher and teacher inquirer in multiple ways. First,
I will not allow my desire to analyze the effectiveness of the given resource to dictate the
instructional routines I employ for interventions or remediation. Rather than pushing the single
method, I will continue to work closely with the other Geometry teachers to effectively address
challenges to understanding. This includes pausing work in the online resource to organize
classroom explorations of current topics through high cognitive-demand tasks. Furthermore, I
will jot down quick, important observations, but will wait until the end of the day to reflect on the
day's work. Since the support class is the second to last hour of the day, this routine should be
very successful.

Second, I will use formative feedback from my inquiries to adjust learning opportunities
for my students. By being flexible with my implementation of blended learning and stations in my
classroom, I can meet the diverse demands of my student population. Each student possesses
different gaps in mathematical understanding which must be met with differentiation. One of the
factors that initially drew me to this question is the chance to provide more robust differentiation
than I could previously.

Issues

Should any ethical issues arise during the course of this inquiry project, there are a
few steps I would like to follow. Although I have already cleared my research with my
department head, I will be notifying parents and guardians of the work taking place in my
classroom and how I hope to use it to provide more diverse opportunities for their students'
growth. Secondly, I plan to review my observations/fieldnotes each week to ensure that all
students are being properly supported through the use of the mentoring software. As I discuss
the construction of the Geometry Lab course with my mentor (department chair), I hope to
reflect on the progress of the course collaboratively. Finally, I will be transparent with my
intentions and purposes to both parents and students.

DATA ANALYSIS

Procedures

I analyzed the data for this inquiry project by using both a priori and emergent themes.
This combination proved incredibly important as research already provided significant claims
which I could explore; however, I also wanted freedom to analyze themes which emerged from
my personal classroom.

I started my analysis by organizing and coding the observations contained in my
fieldnotes. I used three categories to quickly separate these recorded interactions: engagement,
identity, and other. Since I began the inquiry with student engagement and identity as my focus,
these were my a priori themes, while other significant events could still be identified and singled
out for further analysis. Then I tackled the student surveys, searching for affirmation or
contradiction of the themes which I previously identified. Since I chose not to conduct a member
check, student exit questionnaires were used to re-assess student perceptions without
introducing bias. These questions specifically focused on the a priori themes. Finally, parent
surveys and overall grade analysis were employed to ensure triangulation of any loose themes.

To identify themes, I used the following data analysis chart (see below). By
incorporating this into my analysis, I tracked how the sources provided data support for each
theme (allowing me to ensure triangulation before reflecting). Although the themes are not provided
in the table, they can be located in the Findings section of the report. If you are interested in the
design of my data sources, refer to the Appendix section to see the specific questions that were
asked as well as the organization of these tools.
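
The bookkeeping behind the chart can be sketched in a few lines of Python. The fragment below is hypothetical rather than the tool I actually used: it groups coded entries by theme and flags any theme supported by fewer than a chosen number of distinct sources, which is how I thought about triangulation while filling in the chart.

```python
from collections import defaultdict

# Each coded observation pairs a data source with the theme it supports.
# These entries are illustrative placeholders, not my actual fieldnotes.
coded_entries = [
    ("fieldnotes", "engagement"),
    ("student survey", "engagement"),
    ("exit interview", "engagement"),
    ("coaching software scores", "self-direction"),
    ("parent survey", "self-direction"),
    ("student survey", "identity"),
]

def support_by_theme(entries):
    """Group the distinct data sources that support each theme."""
    support = defaultdict(set)
    for source, theme in entries:
        support[theme].add(source)
    return support

def untriangulated(support, minimum_sources=3):
    """Return themes backed by fewer than `minimum_sources` distinct sources."""
    return [theme for theme, sources in support.items()
            if len(sources) < minimum_sources]

support = support_by_theme(coded_entries)
print(untriangulated(support))  # ['self-direction', 'identity'] for the sample entries
```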

In order to not prejudice my findings regarding student self-perceptions, I chose not to
conduct a member check before, during, or after the data collection. Since part of my inquiry
required me to reflect on changes in perceived success, revealing my observations to the
participants could bias their future feedback. Instead, I met weekly with the participants' other
Geometry teachers to discuss my observations and receive feedback. In this way, I built a check
for myself so that my own bias did not overwhelmingly alter my findings.

Data Sources (entries show how each source supported Theme #1 through Theme #5)

Student Survey: Reflection | Y | Y | Y | N

Fieldnotes: Number of disruptive episodes and observations | N | N | Comments about ability and observations | Formative assessments

Student Questionnaires / Exit Interviews: Y | N | Y | Y | N

Parent Noticings Survey: N | Y | Y | N | Y

Coaching Software Scores: Tracking the number of minutes | Number of minutes beyond class time | N | N | Y
Findings

Using my research question as a guide, the five themes I induced from the data sets are
identified, explained, and substantiated below.

Student Engagement Increased

The data collected highlights online coaching's ability to encourage deeper engagement
with content, specifically Geometry in my case. Although prior research already illuminated the
effectiveness of blended learning, coaching provided a nuanced view of student engagement.

Before embarking on the journey through mentorship, the participants expressed
skepticism and doubts. Not a single student had participated in online coaching prior to my
inquiry, and 14 out of 16 participants selected agree or strongly agree in response to the
statement, "I was skeptical about Khan Academy before we started working with it." Without any
prior experience and with possible bias present, students were ready to disengage. As one
participant reflected, "Honestly, I thought it was going to help me a little, not much I just
thought it would not really do anything." However, through their own words, the result proved
much different.

Engagement is a perilous thing to quantify, so I asked students to do the heavy lifting for
me. In response to being asked how engaged they were through our four weeks of Khan
Academy, I received the following feedback. Out of sixteen participants, eleven students gave
favorable scores regarding the experience on their Likert scale surveys, including thirteen
responses of strongly agree to "The extra practice Khan Academy provides is helpful for
understanding mathematics." Furthermore, the statement, "I complete Khan Academy work
when it is assigned," received no negative responses. In their final reflections, ten students used
the word "help," ten used "understanding," and eight used "thinking" when asked,
"Does Khan Academy help you engage better with mathematics?"

During the inquiry period, the number of classroom disruptions decreased. A disruption
consisted of an episode where multiple students went off task, while a positive episode marked
students working collaboratively, explaining thinking, or making positive comments. The chart
provided highlights this trend clearly. Furthermore, the average time to begin the task decreased
and the average time spent in Khan Academy tasks increased.

Given the trends observed, students clearly deepened their engagement with Geometry
concepts through the use of the coaching software. As new interventions were provided,
students recognized the connections to better understanding. Furthermore, they recognized the
significance of these events and completed tasks.

Self-Directed Learning Remains Constant

Surprisingly, recognition of the mentoring software's value seemingly did not translate to
an increase in self-directed application of the resource. The mentoring software provides an
opportunity to track the time every student spends within the site (including a breakdown for
specific questions). At the macro-level, the average time spent per student per week matched
almost entirely the requirements within class, suggesting that sporadic time, at best, was being
spent practicing outside of class. Across the four-week window, the average number of minutes
spent in Khan Academy only fluctuated by 3.4 minutes.
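
The macro-level comparison described above amounts to averaging each student's logged minutes per week and checking how far that average sits from the minutes assigned during class. The sketch below illustrates the calculation with hypothetical numbers; the actual figures came from the coaching software's reporting tools, and the 40-minute assignment is an assumption made only for this example.

```python
# Hypothetical weekly minute logs per student over the four-week window.
weekly_minutes = {
    "student_01": [42, 45, 40, 44],
    "student_02": [38, 41, 39, 43],
    "student_03": [47, 44, 46, 45],
}
ASSIGNED_MINUTES_PER_WEEK = 40  # in-class station time; assumed for this sketch

for student, log in weekly_minutes.items():
    average = sum(log) / len(log)
    beyond_class = average - ASSIGNED_MINUTES_PER_WEEK
    print(f"{student}: {average:.1f} min/week on average, "
          f"{beyond_class:+.1f} beyond class time")
```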

Despite all the positive reviews focusing on the usefulness of the resource for
understanding, students did not employ the mentoring software for their own understanding.
The tables below highlight student and parent responses regarding the use of the resource
outside the classroom.

Student Responses

Statement | Negative | Positive

"When I struggle with a math concept, I use Khan Academy on my own." | 11 | 3

"When I am stuck on a problem, I use Khan Academy videos to help." | 14 | 1

*Discrepancies in numbers are due to "neutral" being a possible response.

Parent Responses

Question | Yes | No

Have you observed your student working in Khan Academy at home? | 3 | 10

Has your student discussed their work in Khan Academy with you? | 0 | 13

Does your student use other resources at home besides the textbook and Khan Academy to study? | 4 | 9

* Only 13 of 16 parent surveys were returned.

Although students engaged with Khan Academy regularly and completed practice sets in stations
on a weekly basis, these practices did not extend beyond the classroom. Although I did not
explore whether or not self-directed learning occurred at all, simply put, Khan Academy did not
encourage students to master concepts on their own.

In some ways, I wonder whether the design of the mentoring program actually inhibits
the development of independence (contrary to some of the research I found). Given this
particular program's construction, students rely on intervention recommendations from the
teacher and then complete the task. As a result, students are not actively seeking out topics for
mastery but are relying on someone else's perceptions instead. That being said, this could be
an issue intrinsic to the program's design or simply a fault of my own implementation and action.
In the future, this is a valid sub-question for me to research.

In fact, this section actually poses more questions than answers. Could focusing on
problem-solving rather than procedural practice increase the ability for students to self-direct
learning? Could altering my implementation of the technology encourage students to begin
identifying areas of need rather than relying on teacher interventions? Does a reliance on
teacher interventions affect the depth to which students engage with the topic? These
sub-questions are like treasure maps in need of future exploration.

Motivation Remains Constant

In the same way that the implementation of Khan Academy in workstations did not
increase my Lab students' ability to self-direct learning, the work failed to increase student
motivation for mastering content. Since I began the data collection process with self-direction, or
independence, as a locus of focus due to the research, the surprising results regarding self-direction
pushed me to explore the root causes more deeply.

In a second survey, students were asked to comment on how they complete their
Geometry coursework (classwork, homework, and studying). The pie charts below represent a
summary of the information obtained from the survey.

The mentoring software does nothing with regard to building students' understanding of
the importance of the knowledge that they are learning in their course. Rather than application,
the software focuses on practice, lending itself to targeted intervention strategies for
remediation or to introducing connections between topics. It is the core teacher's responsibility to
introduce the context and develop the concept in application. In this regard, the result that
independence and motivation are not necessarily cultivated by Khan Academy is unsurprising.

However, it would be unfair to say that all students did not experience at least some
measure of increased motivation. The software actually features a structure of arbitrary points
awarded at the completion of each mission standard. In coding the exit questionnaires, I
noticed two students called out this feature specifically: "Khan Academy is like a competition
whoever gets the most point win (sic)." In this instance, the student actually experienced
decreased motivation because this competition affected his self-perception; in his words, "When
learning its great but those on the bottom feel well not intelligent (sic)."

In total, the marginal effects of Khan Academy on motivation are dubious at best. My
data simply does not support a strong conclusion. In hindsight, exploring and utilizing the
rewards system built into the software may have been an interesting avenue for exploring
motivation; however, this inquiry does not shed further light on the usefulness of the technology
for reviewing and previewing Geometry content.

Self-Perceptions Improved

Through meaningful interaction with the coaching software, my students strengthened or
established identities as mathematics learners. Given the previous mathematical histories of my
Geometry Lab students, this change cannot be overlooked or overstated.

The changes in my students' self-perceptions surfaced during the first coaching session
and developed throughout the course of the inquiry. During that first session, I recorded two
verbal outbursts which distracted the class but illustrate this point. The first student shouted,
"This is the first question I got right all year!" The second outburst featured a student
proclaiming, "I think I can be the teacher tomorrow, Mr. Harrington. After two wrong, I hit five
straight." Thankfully, these outbursts subsided into students taking on teacher roles within the
classroom as time progressed. During the final four Khan Academy work days, I recorded 21
instances of a student helping another complete the task at hand, compared to none during the
first four observation days.

Feeling accomplished through a perceived increase in understanding, students felt
comfortable venturing into roles previously unexplored. This accomplishment led to academic
risks that were not a part of my classroom culture previously. As previously mentioned in the
Student Engagement Increased section, the number of positive disruptive episodes increased
significantly, the most common verbal outburst being, "I get it!"

Student reflections confirmed my observations. Although no discernible change
occurred as students reflected on "I like mathematics," a significant change marked their
reflection on their progress. Eleven of sixteen participants claimed that their understanding of
mathematics is growing and that their Geometry grade improved during the year. Therefore, an
exploration of student perceptions cannot be devoid of commenting on their own mathematical
progress. In fact, the two are interwoven.

In a question on their questionnaire, students reflected on whether they could identify
improvement from the beginning of the year and explain how it happened. Fifteen students
reported experiencing growth, while nine made explicit mention of Khan Academy during their
reflections. Extended time to understand concepts and their connections formed the common
theme establishing a sense of mastery.

At the beginning of the course, only two students considered themselves gifted in
mathematics, thirteen considered math their least favorite subject, and sixteen of sixteen agreed
they had struggled in math class previously. By the conclusion of the inquiry, ten students
commented that they experienced fewer struggles as a result of their work and engagement. The
most plausible explanation for this turnaround is that students reconstructed their own mathematics
identities.

Actual Achievement Improved

Of all the themes, actual achievement proved to be the most challenging to triangulate
as there is primarily only one source of assessment feedback; however, I compared Geometry
course grades, Lab course grades, and Khan Academy scores. The Geometry course grades
muddy the water, but the overall trend is clear.

Khan Academy scores improved as the inquiry progressed, with the average number of
mastered concepts reaching 16. At the time of this reflection, the course covered approximately
30 practice sets, and the class averaged approximately 85% practiced, 52% mastered, and
71% at a level between the two. In total, only five problem sets remained marked as needing practice.

In the Lab course, 12 students experienced growth in their overall grade percentage,
one student experienced no change, and three students saw a decline. Since other
interventions were utilized during this inquiry, one must not assume that these results follow
directly from the blended learning; however, these results do support its effectiveness when
seen in concert with the other themes.

Finally, five student grade averages increased, eight remained constant, and three
declined in the regular Geometry course during this time frame. Despite growth in the other two
areas, Geometry grade changes are decidedly less positive. Given that the work expectations
vary between Geometry and Lab, this is a complex issue to unpack. At this juncture, further
exposition is not necessary as the overall trends still suggest a positive impact on student growth.

As previously mentioned, this data, standing on its own, is unremarkable; however,
when combined with the observations about increased student engagement, a strong case exists for
online mentoring positively impacting student achievement.

CONCLUSIONS & DISCUSSION

Summary

In the Data Analysis section, I identified five themes which emerged from the data set.
Online mentoring increased student engagement as measured by participation, completion, and
grade level success. Surprisingly, increased engagement did not clearly translate to measurable
increases in self-directed learning or motivation. In this sense, it is important to realize that on
the micro-level some students may have improved in these areas, but the macro-data highlights
a lack of significant change in either direction. At a glance, these results seem at odds;
however, a nuanced look at the students revealed complex changes.

Despite self-learning and motivation remaining relatively constant, student surveys and
interviews revealed significant improvements in student self-perceptions. Through carefully
crafted discussion and reflection at the beginning of the year, my students revealed that
mathematics presented a significant obstacle in forming their academic identities. Despite this
history, online coaching provided a safe place for students to fail, revise, and succeed -
reversing old trends of negativity.

This significant change in their mathematics learning identities translated to significant
achievement. Seventy-five percent of my students experienced improvement in their overall
grade in the Lab course during this inquiry, and about 31% (five of sixteen) also saw an increase
in their Geometry course.

Connections

Clearly, my inquiry results support the research behind blended learning and its positive
impact on student engagement. In fact, the increase in student engagement observed over the
course of this project is the most staggering result of my research. Almost every student
routinely challenged themselves in ways that previously did not occur in my classroom.
Analyzing the changes in negative and positive disruptions in my classroom highlights the
change in culture which occurred as a result of the time spent coaching my students. Although
the most dramatic increases occurred in engagement, these results, admittedly, are the most
unsurprising.

In contrast to the research literature, self-directed learning did not increase with the
application of blended learning work stations through the coaching software. Again, one must be
careful to note that this discussion is occurring at the macro-level - trends within my classroom
as a whole; however, Computer Aided Instruction (CAI), as discussed in the literature, did not
directly lead to self-directed learning in the case of online mentoring.

Just as the research indicated, my students developed positive learner identities as a
result of working through the coaching program as implemented in my course. Analyzing the
data provided the opportunity to see how the students' identities changed throughout the inquiry
period, but my initial question and data collection methods do not provide much insight into
why the coaching had this effect.

Implications

The findings from this inquiry project carry three major implications for my own
instructional practice as well as for the profession at large.

First, coaching technology provides new methods for remediation that promote positive
learner identities rather than reinforcing negative self-perceptions. The word remediation carries
negative connotations for students, an educational stigma, which can actually oppose the
purposes of a support class. Finding another avenue to provide specific support for student
understanding buffers students from the negative effects of remediation on self-perceptions. As
a result, this technology provides an invaluable resource for differentiating classroom
content.

Second, the online software effectively facilitates personalized interventions for students
with ease. As a result, targeted students receive the additional instruction that they need without
creating more work for students who do not need extra time with those concepts. In theory,
students can control their own learning by having access to future concepts as well as those
which can be assigned for remediation. This differentiation is possible using traditional
resources; however, the effort those actions demand can leave even the most well-intentioned
teachers feeling overwhelmed.

Third, the instant feedback provided by the software program is essential to furthering
student growth. Since the program tallies correct and incorrect responses (requiring three to five
consecutive correct responses), students can identify areas of strength and weakness
immediately. Furthermore, since the program rates each student's progress, the teacher can
spend time providing students with the direct instruction they need to master concepts. In
this way, the teacher can provide informed instruction, which is the desired result. Although this
theme was not triangulated with three data sets, students repeatedly mentioned being able to
understand what they know and don't know as the major advantage provided by the program.

Further Study

Three new questions were generated by the findings of this inquiry project. In the
process of interviewing my students, the importance of immediate learning feedback surfaced
multiple times. First, I wonder, "How does immediate feedback about student mastery
encourage student growth?" At the conclusion of the inquiry, I cannot help but wonder if I
missed the importance of this question to my endeavor in online coaching; however, in the
future, understanding the role that feedback plays in the process seems central to unlocking the
program's effectiveness. Second, I am left wondering why my students did not demonstrate an
increase in self-directed learning. This raises the question, "How does online coaching affect
students' abilities to self-direct their learning and alter motivation?" It is possible that the answer
is directly tied to the role instant feedback plays, as the students do not necessarily need to
take ownership; this is certainly a worthwhile endeavor. Finally, I wonder on a personal level,
"Through what instructional practices and routines can the effectiveness, in terms of improving
student achievement and fostering positive learner identities, of an online coaching program be
increased?" Given that the technology making this inquiry possible is relatively young, the
possibility for innovation is great. The research literature and my own exploration do
not shed light on when blended learning in this way is effective, but rather only that it is effective.
More studies should be completed analyzing what factors increase the effectiveness of such a
program.

Reflections

Two experiences feature prominently in my reflections on this inquiry project. I
summarize both and muse about their meaning to me and the teaching profession.

First, during the course of this inquiry, students clearly developed confidence in their
mathematical abilities or potential. One student in particular exclaimed aloud that she finally
understood something in Geometry class. I cannot forget this experience because it occurred 52
days into the school year. It is important to remember the power of mastering a concept.
Despite flaws and inadequacies in the design of the coaching software, the program created a
space for students to experience success - not only success, but repeated success. As I seek to
foster student growth in mathematics, this experience will stand as a stark reminder that
challenging student understanding is not the only goal in mathematics education; instead, the
goal is to prepare and support students to overcome intellectual obstacles by relying on prior
success.

Second, the exit interviews conducted at the conclusion of data collection highlighted the
importance of feedback on learning in ways I had not considered. Sure, formative assessments
provide the foundation for my instructional actions; however, instant feedback allows students
to alter their own actions, which can increase their understanding. While developing
my research question, I had not even considered how the program's feedback mechanism might
affect learner outcomes, but now I am wondering if it is, in fact, the most important factor. Hearing
the students recall the joy of reaching five correct solutions in a row, and the pain of completing four
only to falter on the fifth, highlights the enduring impact this design feature had on their
comprehension.

REFERENCES

Andreasen, J. B., & Hunt, J. H. (2012). Using math stations for commonsense inclusiveness.
Teaching Children Mathematics, 19(4), 238-246.

Arbaugh, J. (2008). Introduction: Blended learning: Research and practice. Academy of
Management Learning & Education, 7(1), 130-131.

Bhagat, K. K., Chang, C. N., & Chang, C. Y. (2016). The impact of the flipped classroom on
mathematics concept learning in high school. Journal of Educational Technology & Society,
19(3), 134-142.

Billingsley, G., Scheuermann, B., & Webber, J. (2009). A comparison of three instructional
methods for teaching math skills to secondary students with emotional/behavioral disorders.
Behavioral Disorders, 35(1), 4-18.

Cargile, L. A. (2015). Blending instruction with Khan Academy. The Mathematics Teacher,
109(1), 34-39.

Delialioğlu, Ö. (2012). Student engagement in blended learning environments with
lecture-based and problem-based instructional approaches. Journal of Educational Technology
& Society, 15(3), 310-322.

Edwards, C. (2012). Online learning: A middle school mathematics perspective. Mathematics
Teaching in the Middle School, 18(4), 244-247.

Lim, D. H., & Morris, M. L. (2009). Learner and instructional factors influencing learning
outcomes within a blended learning environment. Journal of Educational Technology & Society,
12(4), 282-293.

Means, B., Toyama, Y., Murphy, R. F., & Baki, M. (2013). The effectiveness of online and
blended learning: A meta-analysis of the empirical literature. Teachers College Record, 115(3).

Moore, A. J., Gillett, M. R., & Steele, M. D. (2014). Fostering student engagement with the flip.
The Mathematics Teacher, 107(6), 420-425.

Murphy, R. F., Gallagher, L., Krumm, A., Mislevy, J., & Hafter, A. (2014). Research on the use
of Khan Academy in schools. Menlo Park, CA: SRI Education.

Parsons, S., Dodman, S., & Burrowbridge, S. (2013). Broadening the view of differentiated
instruction. The Phi Delta Kappan, 95(1), 38-42.

Reiser, R., & Butzin, S. (2000). Using teaming, active learning, and technology to improve
instruction. Middle School Journal, 32(2), 21-29.

Schoenfeld-Tacher, R., McConnell, S., & Graham, M. (2001). Do no harm: A comparison of the
effects of online vs. traditional delivery media on a science course. Journal of Science
Education and Technology, 10(3), 257-265.

Tomlinson, C. A. (1999). The differentiated classroom: Responding to the needs of all learners.
Alexandria, VA: Association for Supervision and Curriculum Development.

Wang, Y., Han, X., & Yang, J. (2015). Revisiting the blended learning literature: Using a
complex adaptive systems framework. Journal of Educational Technology & Society, 18(2),
380-393.

APPENDIX

Appendix A
KHAN ACADEMY SURVEY
Directions: Complete the following survey by circling the number that applies to each statement
- if you think a statement does not apply to you, do not circle a response but explain why. For
each question, 1 represents Strongly Disagree and 5 represents Strongly Agree. Please
answer the survey honestly.
1. I was skeptical about Khan Academy before we started working with it.
1 (Strongly Disagree)   2 (Disagree)   3 (Neutral)   4 (Agree)   5 (Strongly Agree)

2. In the past, mathematics has been my least favorite subject.
1   2   3   4   5

3. In the past, I have struggled in math class.
1   2   3   4   5

4. I have been engaged and working hard in my regular Geometry class.
1   2   3   4   5

5. I have been engaged and working hard in my Geometry Lab class.
1   2   3   4   5

6. I have been engaged and working hard when assigned Khan Academy work.
1   2   3   4   5

7. I complete Khan Academy work when it is assigned.
1   2   3   4   5

8. Khan Academy helps me understand the mathematics better.
1   2   3   4   5

9. The extra practice Khan Academy provides is helpful for understanding mathematics.
1   2   3   4   5

10. When I struggle with a math concept, I use Khan Academy on my own (when it is not
assigned).
1   2   3   4   5

11. When I am stuck on a problem in Khan Academy, I use the Khan Academy videos to help.
1   2   3   4   5

12. The Khan Academy videos help me understand material better.
1   2   3   4   5

13. My Geometry grade has improved during the year.
1   2   3   4   5

14. My understanding of mathematics is growing.
1   2   3   4   5

15. Khan Academy has supported my work in my other Geometry class.
1   2   3   4   5

16. The Lab course has helped me in my Geometry course.
1   2   3   4   5

17. I am successful in my regular Geometry class.
1   2   3   4   5

18. I am successful in my Geometry Lab class.
1   2   3   4   5

19. I have given full effort in my regular Geometry class.
1   2   3   4   5

20. I have given full effort in my Geometry Lab class.
1   2   3   4   5

Appendix B
KHAN ACADEMY QUESTIONNAIRE
1. When Mr. Harrington mentioned that we were going to be using Khan Academy this year to
help learn Geometry, what was your reaction? Explain why you had that reaction.

2. Before you started working with Khan Academy, did you have any doubts that it would be
useful? Explain why you did or did not have those doubts.

3. How has working with Khan Academy changed your mathematical understanding? Explain
why Khan Academy has helped or why it has not.

4. Does Khan Academy help you engage better with mathematics? Explain your thinking.

5. Have you noticed improvement in your mathematics ability since beginning our work? Explain
why you have experienced growth or why you have not.

6. How engaged have you been with the Khan Academy work? What would increase your
desire to use Khan Academy? Give your overall impressions of Khan Academy.

7. What suggestions do you have?

Appendix C
PARENT/GUARDIAN SURVEY
Directions: Please answer each question as fully as possible. If the question is not applicable or
you are unable to answer the question, please leave the space blank.
1. Have you observed your student working in Khan Academy at home? If so, how often and
how long?

2. Has your student discussed their work in Khan Academy with you? If so, how do they
describe their interactions?

3. Does your student use other resources at home besides the textbook and Khan Academy to
study?

4. Have you noticed any improvements in your child's achievement? What specifically?

5. Any other comments?
