
Computers in Human Behavior 47 (2015) 90–97


Investigating student motivation in the context of a learning analytics intervention during a summer bridge program

Steven Lonn (a), Stephen J. Aguilar (b), Stephanie D. Teasley (c)

a USE Lab, Digital Media Commons, University Library, University of Michigan, United States
b Combined Program in Education and Psychology and USE Lab, Digital Media Commons, University Library, University of Michigan, United States
c School of Information and USE Lab, Digital Media Commons, University Library, University of Michigan, United States

Article history: Available online 10 August 2014

Keywords: Learning analytics; Motivation; Early warning systems; At-risk students; Design-based research

Abstract

Summer bridge programs are designed to improve retention and academic success among at-risk populations in postsecondary education by focusing on successful skills, behaviors, and high impact practices that promote academic performance. Recent research on these programs has focused primarily on how students' incoming demographics and prior academic performance predict academic performance at the university level. This study investigated changes in students' academic motivation orientations over the course of one bridge program, and how a learning analytics-based intervention was employed by academic advisors to inform their face-to-face meetings with students. The results of our study show that students' mastery orientation decreased over the course of the bridge program, and indicate that students' exposure to displays of their academic performance negatively predicts this change. The findings suggest that student perceptions of their goals and formative performance need to be carefully considered in the design of learning analytics interventions since the resulting tools can affect students' interpretations of their own data as well as their subsequent academic success.

© 2014 Elsevier Ltd. All rights reserved.

1. Introduction

Retention has been seen as a critical issue in higher education for decades. Admitting students who fail to graduate is devastating for the students, and has ramifications for institutional accountability and related revenue models. Indeed, there is a newfound sense of urgency to address this issue because of the new institutional rating model proposed by President Barack Obama to tie U.S. federal financial aid to graduation rates, tuition, and the percentage of lower-income student enrollment (Lewin, 2013). Consequently, postsecondary institutions are ever more interested in investing in viable and successful models to increase retention, particularly for groups of students with historically lower graduation rates, such as first-generation college students (Dennis, Phinney, & Chuateco, 2005) and students from low-SES families (Adelman, 2006; Walpole, 2003).

Students' academic persistence has been a well-researched topic. Tinto's (1987, 1993) longitudinal work has demonstrated that six areas—or components—contribute to students' decision to depart college before earning a degree: (1) pre-entry attributes, such as academic preparation, cultural background, and first-generation status; (2) students' goals, such as academic major and career choice, and level of commitment to achieving those goals; (3) students' institutional experiences, both formal and informal, with peers, faculty, and staff; (4) integration and balance between academic and social interactions; (5) re-examination and updating of goals and commitments; and (6) decision finalization based on students' cumulative experiences. Related to Tinto's model, Astin (1975) asserted that students who physically and psychologically involve themselves in the academic and social opportunities of the college environment are more likely to persist. Further, Astin (1984) argued that student involvement is a behavioral manifestation of the psychological construct of motivation. In order to better assess and monitor the metrics related to student persistence, many postsecondary institutions have turned to learning analytics tools utilizing models driven by pre-entry attributes to address their retention concerns. These attributes are generally paired with students' formal institutional experiences (i.e., grades), and this approach has dominated the higher education intervention landscape (e.g., Arnold & Pistilli, 2012).
To combat declining retention rates for at-risk student populations, many institutions have developed summer bridge programs (Gandara, 2001; Myers & Schirm, 1999; Terenzini & Wright, 1993). Historically, these programs have been designed to provide academic support and information regarding college campus life, orient students to the institutional culture, and develop at-risk students' self-esteem and self-efficacy (Ackermann, 1991; Fitts, 1989; Garcia & Paz, 2009; Kezar, 2001; Pascarella & Terenzini, 2005). Put simply, "summer bridge programs are intended to address important preparation and achievement gaps that are evident in the research [on retention and persistence]" (Colyar, 2011, p. 123). Although these programs are extremely varied in programmatic content and implementation, most aim to develop students' study and time management skills and their ability to utilize university services (e.g., tutoring centers, libraries), and to provide meaningful exposure to college course work and faculty (Cabrera, Miner, & Milem, 2013). Offering these programs during the transitional summer months between high school graduation and college matriculation is intentional. Tinto (1996) argued that in order to increase student retention, an institution must "ensure that students receive the guidance they need at the beginning of the journey through college to graduation" (p. 4). Furthermore, the residential component found in most summer bridge programs is based on Astin's (1984) finding that living on campus is the single most important factor and positive predictor of persistence for all students.

Students' intrinsic motivation to achieve their stated goals and their capacity to plan and utilize available resources are fundamentally linked to their retention in higher education (Allen & Bir, 2012). However, the learning analytics tools that are currently available for deployment at a large scale do not include measures of student motivation, as reported either by the students themselves or from assessments by their instructors or academic advisors. These measures are difficult to include in large-scale analytics tools since such information is not typically captured by institutional student information systems. Whereas the assessment and evaluation of retention programs once relied primarily on student satisfaction surveys (Astin, 1993; Garcia & Paz, 2009; Strayhorn, 2011; Walpole et al., 2008), today a reliance on institutional records of high school preparation, standardized test scores, and student aid serves as the "basis for models that evaluate the effects of interventions on attainment outcomes" (St. John, 2006, p. 100). A range of data sources representing a broader characterization of the student experience is needed if technological tools are to successfully produce models that represent all of the components of Tinto's original model. In response, this study investigates students' motivational orientations and how assessment of those orientations can inform a learning analytics-based intervention employed during a summer bridge program to support data-driven decisions and actions by academic advisors.

1.1. Research questions

The overarching research questions guiding this study are:

• RQ1: To what extent, if any, do students' motivational orientations change throughout the course of a summer bridge program?
• RQ2: What factors predict the changes in motivation, if any, that occur over the course of a summer bridge program?
• RQ3: What is the relationship between advisors' use of a learning analytics-powered Early Warning System and their students' academic performance during a summer bridge program?

This study is the result of working in partnership with summer bridge staff; we believe that such partnerships are necessary in order to better understand the different factors that affect students' motivational orientations within the context of the summer transition program. These factors can, in turn, inform the future designs of learning analytics tools so that new tools include non-cognitive as well as academic performance measures to ultimately improve student learning and retention.

2. Literature review

2.1. Summer bridge programs

The programmatic content and structure of summer bridge programs are, in most instances, inspired by Tinto's (1987, 1993) and Astin's (1984) foundational theories of student retention (Kezar, 2001). The content of these programs can vary widely, but typically includes accelerated mathematics, English or writing, and general "college knowledge" courses (Suzuki, Amrein-Beardsley, & Perry, 2012). Early evaluations and assessments of summer bridge programs were largely descriptive, relying on student satisfaction surveys (e.g., Ackermann, 1991), while more recent research has utilized comparison samples of non-participants, longitudinal analyses, and other empirical techniques (e.g., Allen & Bir, 2012; Cabrera et al., 2013; Strayhorn, 2011).

Assessments of the impact and success of summer bridge programs have yielded inconsistent results. While some studies indicate that summer bridge programs improve students' academic success (Strayhorn, 2011; Walpole et al., 2008), others show no impact (Fletcher, Newell, Newton, & Anderson-Rowland, 2001), and some indicate decreased academic success (Ackermann, 1991). Further complicating these findings, Myers and Schirm (1999) claim that summer bridge program outcomes are more social than academic. To investigate these mixed results, Cabrera et al. (2013) conducted a longitudinal study assessing the impact of the University of Arizona's New Start Summer Program (NSSP) on participants' first-year grade point average (GPA) and retention, controlling for incoming student characteristics. While programmatic participation significantly predicted first-year GPA and retention, this relationship became insignificant when controlling for first-year college experiences and student development.

As summer bridge programs have matured, research has begun to focus on the connections between student perceptions and academic success. For example, Suzuki et al. (2012) investigated students' confidence about college expectations and their sense of belonging following their participation in a five-week summer bridge program at Arizona State University. Bridge program participants had a higher likelihood of demonstrating these attributes than non-bridge students, and the program positively influenced their short-term retention. Bridge program participation also resulted in students forming valuable friendships, learning about skills known to increase student success (e.g., note taking, time management), increased feelings of security and confidence, and a greater sense of belonging at the institution. Similarly, Strayhorn (2011) investigated summer bridge students' academic self-efficacy, sense of belonging, and academic and social skills. His results indicated that summer bridge participation positively correlated with specific academic skills (e.g., use of technology, interpreting syllabi) and academic self-efficacy, but did not seem to affect students' sense of belonging or social skills. Students' positive beliefs about their academic skills and precollege aptitude also positively predicted first-semester grades in college, explaining approximately 30% of the variance in first-semester GPA.

2.2. Relevant learning analytics techniques and tools
Learning analytics techniques can utilize large data sets to provide decision makers with actionable information that can help determine the best course of action to improve learning outcomes (e.g., grades, retention) (EDUCAUSE Learning Initiative, 2011). Early Warning Systems (EWSs) are one approach for utilizing historical and formative educational data to identify students who might be at risk of academic failure, often in near real time. Arnold (2010) states that users of EWSs can leverage this "actionable intelligence" to direct students toward resources or changes in behavior in a timely manner. Prior work in this area has generated several useful proofs of concept, including Morris, Finnegan, and Wu's (2005) correlation of students' activity within a Learning Management System (LMS) with persistence in an online course. Macfadyen and Dawson's (2010) work classified specific online activities—supported by an LMS—that correctly identified 81% of students who failed the course. Goggins, Galyen, and Laffey (2010) found that students were able to use LMS feedback to identify what their peers were doing, and what they, in turn, might need to accomplish in order to catch up to the "herd." The Check My Activity tool, developed at the University of Maryland–Baltimore County, is one example of a tool that allows students to compare their LMS activity and grades to those of their peers (Fritz, 2011). More recently, Muñoz-Merino, Valiente, and Kloos (2013) found several parameters in the Khan Academy platform that can provide useful information about the learning process for students and teachers. Tool designers have utilized this prior work to create systems that can predict and inform users about the likelihood of a student's academic failure or success.

Course Signals, developed at Purdue University, is one example of an EWS that uses a predictive model composed of a student's current course performance, online LMS activity, prior academic history, and demographics to indicate the likelihood of academic failure to instructors (Arnold, 2012). Taking a slightly different approach, the University of Phoenix developed an EWS with a categorization scheme that takes several student factors into account when increasing or decreasing the student's priority for receiving a phone call from their academic advisor (Barber & Sharkey, 2012). Nearly all EWSs and similar systems are intended to guide the user toward matching specific students with appropriate help and resources in a timely manner to mitigate against an increasing likelihood of academic failure. Recent work has continued to develop and refine techniques to improve these processes, including using competency maps (Grann & Bushway, 2014), delivering feedback in Massive Open Online Courses (Coffrin, Corrin, de Barba, & Kennedy, 2014), and in a variety of blended computer programming courses (Blikstein, 2013).

The information presented in EWSs can reveal activities and patterns of student behavior that are otherwise opaque to intermediaries whose stated job function is to act upon this information, such as academic advisors (Duval, 2011; May, George, & Prévôt, 2011). Learning analytics can help these intermediaries understand and optimize student learning and the environments in which it occurs (Society of Learning Analytics Research, n.d.). Those who implement summer bridge programs can leverage learning analytics tools to quickly identify students in need of academic support and allow them to engage in sensemaking activities that support subsequent actions (Krumm, Waddington, Lonn, & Teasley, 2014). Combining traditional pre-entry characteristics, academic achievement data, and students' motivation orientations can further increase the customization and personalization of intermediaries' actions informed by learning analytics tools.

The study presented here extends the prior research on summer bridge programs by not only examining student perceptions about their motivational orientation and their related academic success, but also combining those data sources with the use of learning analytics tools collaboratively designed with bridge program leaders and staff to support timely data-driven decision making.
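The proofs of concept reviewed above share a common shape: derive per-student features from LMS activity, fit a model against a known outcome (e.g., course failure), and flag students whose predicted risk crosses an actionable threshold. The sketch below illustrates that general shape with logistic regression on synthetic data; the feature names, coefficients, and cutoff are illustrative assumptions, not the variables or models used by any of the systems cited above.

```python
# Illustrative early-warning classifier in the general style of the
# LMS-based proofs of concept discussed above. All data and feature
# choices are synthetic assumptions, not any published system's model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 200

# Synthetic per-student LMS features: course-site logins, forum posts,
# and the share of assignments submitted on time.
logins = rng.poisson(30, n)
forum_posts = rng.poisson(5, n)
on_time_share = rng.uniform(0.0, 1.0, n)
X = np.column_stack([logins, forum_posts, on_time_share])

# Synthetic outcome: failure becomes more likely as engagement drops.
logit = 2.5 - 0.05 * logins - 0.15 * forum_posts - 1.5 * on_time_share
failed = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, failed)

# Flag students whose predicted failure probability crosses a cutoff,
# producing the "actionable intelligence" an advisor could act on.
p_fail = model.predict_proba(X)[:, 1]
flagged = np.flatnonzero(p_fail > 0.5)
print(f"{flagged.size} of {n} students flagged for follow-up")
```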
2.3. Theoretical framework

Postsecondary retention is highly influenced by pre-entry characteristics and academic achievement, yet some level of individual desire, or motivation, is necessary to persist in higher education, particularly for students in groups with historically lower retention rates (Allen, 1999). Achievement Goal Theory is a useful framework to conceptualize and measure students' motivation orientations towards academic work in the context of competence-relevant activities (e.g., the kinds of learning activities prevalent in schooling) (see Elliot, 2005 for a review). Students' achievement goals—which range from mastery-oriented goals to performance-oriented goals—have effects on a variety of cognitive, affective, and behavioral outcomes (Rawsthorne & Elliot, 1999).

Intrinsic motivation—which is closely aligned with mastery-oriented learning—has direct implications for educational, occupational, and sport settings (Heyman & Dweck, 1992). Intrinsic motivation, defined as interest in and enjoyment of an activity for its own sake (Deci & Ryan, 1985; Lepper, 1981), can be juxtaposed against extrinsic motivations, such as getting a good grade or pleasing a teacher. Collins, Brown, and Newman (1987) state that when students are provided extrinsic rewards for previously intrinsically motivated activities (e.g., reading), they are less likely to perform the task on their own in the future. If neither extrinsic nor intrinsic motivations are present ("amotivation"), the student goes through the motions of learning but ultimately fails to actually learn the material (Deci & Ryan, 1985), and is therefore at risk of dropping out. Similarly, retention programs that focus primarily on helping students master course content but fail to develop adequate academic self-confidence, academic goals, institutional commitment, and social support may ultimately be unsuccessful in increasing rates of persistence (Lotkowski et al., 2004).

While motivation researchers have delineated achievement goals into distinct forms, Ames and Archer (1987) argue that the various formulations converge in mastery and performance goals. "Mastery goals focus on (and celebrate) the development of competence and task mastery, whereas performance goals focus on the demonstration of competence relative to others" (Rawsthorne & Elliot, 1999, p. 326). Performance goals may be further separated into performance-avoid orientations (i.e., the student avoids tasks that might signal incompetence) and performance-approach orientations (i.e., the student pursues tasks that reinforce their beliefs of competence) (Elliot & McGregor, 2001). Students with high mastery beliefs have demonstrated deeper processing outcomes (e.g., Nolen, 1988), while high performance-avoid beliefs have been linked to superficial learning strategies and cheating behaviors (e.g., Elliot & Dweck, 1988). Students with high performance-approach orientations need to be carefully monitored, as such beliefs can lead to maladaptive as well as adaptive behaviors. Contextual factors, like the programmatic content and structure of summer bridge programs, can influence students' mastery and performance goals and beliefs (Harackiewicz, Barron, Pintrich, Elliot, & Thrash, 2002). For example, a student who is typically performance-oriented might exhibit mastery-oriented behaviors if proper support is provided.

We use Achievement Goal Theory both to help us understand and measure the goals and beliefs of students within the Summer Bridge Program, and to help us detect any possible changes that occur during the 7-week term. Moreover, since mastery orientation is aligned with intrinsic motivation, we utilize it as a reliable proxy measure for intrinsic motivation. In general, we feel that this is a necessary step for any learning analytics intervention that aims to scale; without this preliminary work it would be difficult to know what non-cognitive factors (e.g., motivation) are at play for the students these interventions are meant to serve.
3. Methodology

Our research agenda is organized around principles of design-based research (Brown, 1992) as it involves "a series of approaches, with the intent of producing new theories, artifacts, and practices that account for and potentially impact learning and teaching in naturalistic settings" (Barab & Squire, 2004, p. 2). We collaborated closely with the leaders and staff of a summer bridge program to iterate the system design of our learning analytics-powered early warning system (EWS) intended for academic advisors (see Krumm et al., 2014; Lonn, Krumm, Waddington, & Teasley, 2012) and to implement student motivational surveys.

This EWS utilizes information from the institutional learning management system (LMS) to inform just-in-time advisor interventions designed to identify students' maladaptive academic behaviors and study habits before students fall further into academic jeopardy.

3.1. Setting

This study focuses on the Summer Bridge Program (referred to simply as "Bridge" by both program staff and students) that is situated within a large, four-year, more selective, lower transfer-in, and primarily residential university in the Midwestern United States with very high research activity (http://classifications.carnegiefoundation.org). Bridge began in 1975 as an effort to assist non-traditional students' transition from high school to college by providing highly structured introductory coursework. Today, Bridge offers intensive academic preparation, individualized academic advising, and a community-building living environment to an incoming cohort of over 200 students. Students are expected to attend all of the Bridge classes and workshops, while also meeting with their academic advisor on a regular basis.

Bridge students are enrolled in three courses for a seven-week term: (1) a mathematics course (remedial intermediate algebra (Math A), college-level intermediate algebra (Math B), or mathematical reasoning (Math C)); (2) an English or Writing course; and (3) a freshman seminar that serves as an introduction to social science. Students are sorted into the mathematics and English or Writing courses based on a combination of placement exam scores, prior academic history, and primary major interest. Of these courses, Bridge leaders and staff have stated that the mathematics courses are the most "high stakes" because they are designed to the specifications of the math department, including grading students on a curve.

3.2. Participants

Two hundred and sixteen students completed Bridge in the Summer 2013 term (3 students dropped from the program during the term). Most of the participants were female (62.5%) and identified as members of an underrepresented minority group (69.4%). The full student cohort was comprised of the following racial/ethnic groups: Black (46.8%), White (22.2%), Hispanic (13.9%), "two or more" (8.8%), Asian (6.5%), and Other (1.8%). Nearly a quarter (24.1%) of students reported that their parents' income was less than $25,000 per year, and many (22.2%) reported incomes of $25,000–$50,000 per year. One-fifth (21.8%) of Bridge students were first-generation college students and one-sixth (14.8%) were student athletes. Students' high school grade point averages (GPAs) ranged from 2.1 to 4.0 (M = 3.45), and composite American College Testing (ACT) scores ranged from 14 to 30 (M = 22.54).

3.3. Data sources

Two online surveys were distributed to Bridge students, one at the start of the term (12 items) and one at the end of the term (22 items). Students received course credit for survey completion. To measure Bridge students' motivational orientations, we used the Patterns of Adaptive Learning Scales (PALS; Midgley et al., 2000) to assess students' achievement goal orientations in both surveys. These items have been validated and used in multiple areas (e.g., Blumenfeld, 1992; Elliot & Harackiewicz, 1996), and produce factors that indicate a respondent's mastery goal orientation (e.g., "One of my goals in class is to learn as much as I can."), performance-approach orientation (e.g., "I want to do better than other students in my class."), and performance-avoidance orientation (e.g., "It's very important to me that I don't look stupid in my class."). The surveys also included questions that asked students about their perceptions of their college preparation, academic expectations and goals for Bridge courses, support from family and friends, and their interaction with their academic advisors. Student data from the institutional data warehouse, including pre-entry demographic and achievement data, were gathered to supplement the survey responses.
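Concretely, PALS-style goal-orientation scores are typically computed as the mean of each subscale's items, with internal consistency summarized by Cronbach's alpha (as reported with the results below). The following is a minimal sketch of that computation; the column names, item counts, and responses are illustrative placeholders, not the actual PALS items or this study's data.

```python
# Minimal sketch of scoring a goal-orientation subscale from survey data.
# Column names and item groupings are illustrative placeholders; see
# Midgley et al. (2000) for the actual PALS items and scales.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical responses on a 1-5 scale for three mastery-type items.
responses = pd.DataFrame({
    "mastery_1": [5, 4, 5, 3, 4],
    "mastery_2": [4, 4, 5, 2, 5],
    "mastery_3": [5, 3, 4, 3, 4],
})

mastery_score = responses.mean(axis=1)   # per-student subscale mean
alpha = cronbach_alpha(responses)        # scale reliability
print(f"M = {mastery_score.mean():.2f}, alpha = {alpha:.2f}")
```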
While the two surveys described above represent the main data sources described in this paper, our overall design-based research program investigates how academic advisors in Bridge and other at-risk student programs use our EWS, called Student Explorer. This system was designed to provide advisors with information about their students' engagement and performance to facilitate timely interventions (see Aguilar, Lonn, & Teasley, 2014; Krumm et al., 2014; Lonn, Aguilar, & Teasley, 2013; Lonn et al., 2012). Students' weekly progress updates were presented through a dashboard that provided representations of student engagement and performance, allowing advisors to readily identify students who were succeeding in any given course, beginning to show signs of falling behind, or struggling with their coursework (see Fig. 1). The formative data for this dashboard were provided by the institutional LMS and updated daily to align with the brisk pace of the 7-week term. For each course presented within Student Explorer, students were designated with an "E3" status of "Encourage" (green), "Explore" (yellow), or "Engage" (red), a categorization derived from the overall percentage of points earned, distance from the course average, and LMS course website logins (see Krumm et al., 2014 for additional details about the algorithm underlying this classification). Log data captured from academic advisors' use of Student Explorer and corresponding data from the student meeting appointment system were matched with student survey data. This allowed us to know how often an advisor looked at a particular student's data and whether advisors did so during a meeting with the student. (Note that it is not possible to know whether the student looked at the data view with the advisor; we can only note that it was possible.)
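To make the E3 categorization concrete, the sketch below shows one plausible rule-based reading of the three inputs named above. The thresholds are invented for illustration only; the actual algorithm is documented in Krumm et al. (2014).

```python
# Hypothetical rule-based "E3" categorization in the spirit of Student
# Explorer. Thresholds are invented for illustration; the published
# algorithm is described in Krumm et al. (2014).
def e3_status(pct_points: float, dist_from_avg: float, logins: int) -> str:
    """Return 'Encourage' (green), 'Explore' (yellow), or 'Engage' (red).

    pct_points:    overall percentage of points earned (0-100)
    dist_from_avg: student's percentage minus the course average
    logins:        LMS course-site logins in the current period
    """
    if pct_points >= 85 and dist_from_avg >= 0:
        return "Encourage"   # on track: succeeding in the course
    if pct_points < 70 or dist_from_avg < -10 or logins == 0:
        return "Engage"      # struggling: advisor should intervene
    return "Explore"         # early signs of falling behind

print(e3_status(92.0, 4.5, logins=12))    # -> Encourage
print(e3_status(76.0, -3.0, logins=5))    # -> Explore
print(e3_status(64.0, -15.0, logins=1))   # -> Engage
```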
Fig. 1. Example dashboard displays from the Student Explorer early warning system. The summary dashboard (left) presents the most recent formative data across courses. The course detail dashboard (right) presents all assignment details, a historical performance graph, and LMS login history for the specific course in which the selected student is enrolled.

4. Results

We conducted paired-sample t-tests to investigate whether students' goal orientations changed throughout the course of Bridge (RQ1) by comparing pre and post measures of students' mastery, performance-approach, and performance-avoid orientations. There were no significant differences between pre-bridge performance-approach scores (M = 2.8, SD = .93, α = .85) and post-bridge performance-approach scores (M = 2.7, SD = .99, α = .90); t(208) = .792, p = .43. There were also no significant differences between pre-bridge performance-avoid scores (M = 2.9, SD = .92, α = .77) and post-bridge performance-avoid scores (M = 2.8, SD = .93, α = .82); t(208) = .772, p = .44. There was, however, a statistically significant decrease from students' reported pre-bridge mastery scores (M = 4.7, SD = .55, α = .89) to their post-bridge mastery scores (M = 4.3, SD = .84, α = .93); t(208) = 6.53, p < .001.
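The paired comparisons above are standard dependent-samples t-tests on matched pre/post scale scores. A minimal sketch of the same analysis on synthetic scores (the generated arrays are placeholders, not the study's data):

```python
# Paired-samples t-test on pre/post mastery scores; synthetic data only,
# mirroring the form of the RQ1 comparison reported above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pre_mastery = np.clip(rng.normal(4.7, 0.55, 209), 1, 5)              # start of term
post_mastery = np.clip(pre_mastery - rng.normal(0.4, 0.8, 209), 1, 5)  # end of term

t_stat, p_value = stats.ttest_rel(pre_mastery, post_mastery)
print(f"t({pre_mastery.size - 1}) = {t_stat:.2f}, p = {p_value:.3f}")
```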
To better understand what might contribute to students' decrease in mastery orientations during Bridge (RQ2), we specified a multiple regression model of students' change in mastery (Table 1). We controlled for relevant demographic characteristics (e.g., athletic status, gender), previous academic expectations and support (e.g., support from family and friends), incoming mastery, as well as previous achievement (ACT scores). We were also interested in the relationship between students' understanding of their formative academic achievement data and their mastery perceptions. To measure this relationship, we included variables that measured students' self-reports about how often their academic advisor showed them Student Explorer data, as well as whether advisors used the Student Explorer EWS before, during, or after their meetings with students.

Table 1
Change in mastery orientation over the course of the bridge program (R² = .30).

Variable                                              b        SE b
Intercept                                             1.890    .974
Mastery orientation before summer bridge              .358**   .116
Demographic variables
  Student athlete                                     .294     .248
  Female                                              .045     .127
Achievement variables
  ACT composite score                                 .042     .031
1-on-1 advisor-student discussions (a)
  Graphs within Student Explorer shown to student     -.112*   .050
  Studying for bridge courses                         .058     .043
Advisor's use of Student Explorer
  Before meeting (count)                              .025     .049
  During meeting (count)                              .009     .030
  After meeting (count)                               .015     .137
Pre-enrollment support
  Family encouraged academic success                  .252**   .082
  Friends encouraged academic success                 .013     .062
  Excellent HS teachers                               .145*    .066

** p < .01. * p < .05. (a) Represents students' reports of the number of times each issue was discussed during 1-on-1 meetings.

Students' incoming mastery was positively associated with their outgoing mastery. However, students' self-reports of how often their advisors showed them their Student Explorer data negatively predicted the change in mastery over the course of bridge. Advisors' logged (actual) use of the Student Explorer EWS before, during, or after meetings with each student was not a significant predictor of the change in mastery.

Since Bridge programs are meant to ensure that students are prepared for college-level coursework, we wanted to understand what factors would predict student final grade outcomes in three of the courses offered during the seven-week term (RQ3): an English course, a remedial intermediate algebra course (Math A), and a college-level intermediate algebra course (Math B). We did not model the Math C, Writing, or freshman seminar courses because there was little grade variation. We used multiple regression to model which factors would predict student outcomes in each of these courses, and controlled for demographic characteristics (e.g., gender and athletic status), academic achievement measures (e.g., ACT scores, math pre-tests, and math midterms), high school experiences (e.g., perceived quality of high school teachers in general, and math/English preparation, specifically), and encouragement by family and friends. This allowed us to focus on the relationships between student motivation, advisors' use of Student Explorer, and course grade outcomes (Table 2).

Results of the multiple regression models indicated that athletic status negatively predicted English course grades, while students' reports of excellent high school teachers positively predicted English course grades. These variables, however, were not predictive of math course grades. For students in the remedial Math course (A), mastery orientation negatively predicted course grade. The extent to which Bridge advisors viewed students' data via Student Explorer after meeting with students also negatively predicted students' Math A course grades. For students in the college-level Math course (B), perceived family encouragement negatively predicted course grades. The midterm score positively predicted the final course grade for both Math courses. Finally, students' self-reports of how often their advisors showed them their Student Explorer data did not predict any course grade, suggesting that students' perceptions of their data affected their mastery orientation but not their grades.
Table 2
Predicting summer bridge course grade outcomes.

                                          English course    Math A course      Math B course
Variable                                  b       SE b      b         SE b     b        SE b
R²                                        .31               .63                .80
Mastery                                   .06     .07       -5.48*    2.58     3.33     4.81
Performance-approach                      .09     .01       2.00      3.29     11.85    8.93
Performance-avoid                         .10     .09       3.63      2.82     4.11     8.77
Student athlete                           -.66**  .26       –         –        1.92     15.15
Female                                    .11     .11       2.22      3.81     15.23    15.23
Student Explorer data shown to student    .06     .05       .65       1.62     3.72     4.29
Studying for bridge courses               .03     .04       1.85      1.37     .76      3.73
Student Explorer before meeting (count)   .04     .04       .63       1.62     1.85     4.48
Student Explorer during meeting (count)   .01     .03       1.62      .98      1.96     2.176
Student Explorer after meeting (count)    .36     .21       -13.42**  4.55     .44      7.62
Family encouragement                      .05     .07       1.09      2.53     -15.59*  7.24
Friend encouragement                      .04     .06       .54       1.90     8.00     5.73
Excellent HS teachers                     .12*    .06       3.76      2.39     7.24     8.19
ACT English score                         .01     .02       .88       1.67     7.46     4.15
HS English preparation                    .07     .05       –         –        –        –
ACT math                                  –       –         .82       .86      1.42     2.71
Math pre-test                             –       –         .61       .16      .43      .81
Math midterm                              –       –         .78***    .17      .96*     .41

*** p < .001. ** p < .01. * p < .05.
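The models behind Tables 1 and 2 are multiple (ordinary least squares) regressions with the controls described above. As a minimal sketch of how such a model is specified, the following fits a Table 1-style model with statsmodels on a synthetic data frame; the column names and generated values are illustrative stand-ins for the study's variables, not its data or results.

```python
# OLS model of change in mastery orientation, in the spirit of Table 1.
# The data frame and column names are synthetic stand-ins, not the
# study's variables or results.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 209
df = pd.DataFrame({
    "pre_mastery": rng.normal(4.7, 0.55, n),
    "act_composite": rng.normal(22.5, 3.0, n),
    "explorer_shown": rng.integers(0, 6, n),   # times advisor showed data
    "family_support": rng.integers(1, 6, n),
})
df["mastery_change"] = (-0.4 - 0.1 * df["explorer_shown"]
                        + 0.2 * (df["pre_mastery"] - 4.7)
                        + rng.normal(0, 0.5, n))

model = smf.ols(
    "mastery_change ~ pre_mastery + act_composite + explorer_shown"
    " + family_support", data=df).fit()
print(model.summary().tables[1])
```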

5. Discussion

While Achievement Goal Theory research has shown that high mastery orientation is adaptive for students in academic settings (Rawsthorne & Elliot, 1999), our results show that some features of the Bridge program may moderately decrease students' mastery goal orientations (RQ1). In an effort to understand this change, we were surprised to learn that showing students their own data within the context of the Student Explorer early warning system (EWS) may have contributed to this decrease (RQ2). Yet, we do not know which features of Student Explorer use contribute to this effect. Moreover, our findings indicate that student perceptions, rather than advisors' actual utilization of Student Explorer, are the strongest predictors of changes in student motivation. The lack of relationship between advisors' frequency of Student Explorer use and student motivational changes lends support to the idea that any "student-facing" learning analytics intervention will need to be developed and deployed with care. Such tools can affect students' interpretations of their own data and possibly their subsequent academic success. Howley and Rosé's (2014) recent finding that mastery learning-oriented goals are negatively correlated with seeking help from informal sources gives additional pause when considering how students might interpret and act upon learning analytics data.

Our investigation of the relationship between advisors' use of the Student Explorer EWS and students' academic performance as measured by course grades (RQ3) resulted in mixed findings. The extent to which Bridge advisors viewed students' data via Student Explorer after meeting with students was negatively associated with students' Math A course grades, after controlling for pre-entry demographics, students' perceptions about encouragement from family and friends, and formative course performance. This finding may be related to advisors spending more time outside of meetings trying to help the students whose formative course performance was already flagged by Student Explorer as far below the course average.

The analyses conducted to answer RQ3 also highlighted several surprising results with regard to students' perceptions. Specifically, mastery orientation negatively predicted Math A course grades, suggesting that the context of this math course, or students' placement in it, may have in some way been discouraging for students more prone to mastery achievement orientations. In addition, for the Math B course, students' perceptions about family encouragement negatively predicted course outcomes. The meaning of these findings is unclear from the data available in this study. They do, however, highlight a need for additional research investigating how Achievement Goal Theory constructs might be interpreted differently by the at-risk populations participating in summer bridge programs, who are selected specifically because they differ demographically and academically (in terms of preparedness) from the general student body at their respective institutions.

Learning analytics is designed to use data mining and statistical techniques to "help target instructional, curricular, and support resources" in support of student achievement (van Barneveld, Arnold, & Campbell, 2012). To that end, learning analytics should leverage resources to enable informed decision-making. The promise of learning analytics is allowing users to interact with tools that support their sensemaking and suggest actions that benefit students, either directly or indirectly (Krumm et al., 2014). Given the wide variability of course contexts, grading schemes, and learning objectives, academic advisors are a logical choice to first test and refine analytics efforts, leveraging their institutional knowledge of courses, instructors, and degree requirements in reference to the analytic information. The work presented here adds information about students and their motivational orientations into these analytics-based interventions, which compounds the complexity of advisors' future sensemaking tasks. New learning analytics-powered tools will need to balance students' academic goals, motivation to achieve those goals, prior academic skills and habits, formative performance, and current effort to support decision making about the most appropriate and timely recommended actions. Constructing the next generation of learning analytics tools that can guide users through these complex data sources to arrive at actionable decisions is a worthwhile challenge for this emergent field.

5.1. Limitations

The size and selection biases of an at-risk student population and the short duration of the Bridge program are all factors that limit the generalizability of this study. Nevertheless, the findings presented in this article lay important groundwork for future initiatives that seek to incorporate measures of student motivation into tools that employ learning analytics, especially in settings like summer bridge programs.
One of the limitations of our study is that, by design, EWSs encourage higher contact with students who are struggling academically than with students who are thriving, thus skewing user data from EWS actions. Specifically, advisors are likely to log into the EWS more frequently to check on a student with lower grades than on a student with higher grades, potentially explaining the trends demonstrated by our results, particularly the relationship between advisors' use of Student Explorer and their students' academic performance. Also, our lack of understanding of why advisors choose to use the EWS in a particular way (e.g., only viewing summary views, or only for students already having difficulties) is a limitation of learning analytics-based interventions where the focus is only on student outcome data. The data sources utilized by these tools can be informative about what, when, who, and how a student performed within a given learning environment, yet intentionality is very difficult to capture with current data structures and reporting mechanisms.

5.2. Future research

Our research agenda is focused on better understanding (A) what habits and behaviors students bring to their higher education contexts, (B) how those habits and behaviors mediate students' goals, performance, and effort, (C) what suggestions advisors make to remediate students' unproductive habits and behaviors based on the data presented in systems like the EWS, and finally, (D) what actions, if any, students take in response to their advisors' suggestions. Understanding this entire process will add needed clarity to the "feedback loops" between students, instructors, and academic advisors (Clow, 2012). To further this agenda, we propose to unpack the "design literacy" skills that are required to develop effective learning analytics-powered tools. Informed by definitions of affordances by Norman (2013) and Soegaard (2003), this form of literacy encompasses the skills required to unpack, investigate, and ultimately understand the affordances that are inherent in any data visualization within a given context. Properly scaffolding this form of literacy will require the creators of data representations (e.g., learning analytics researchers) to construct visualizations that have perceptible affordances that are sensitive and actionable to the needs of the intended – and implied – audiences. In short, we believe that in order to help academic advisors identify and act more effectively on data, learning analytics tools will need to facilitate visual communication from the data source and, through the informed intermediary, to the student.

5.3. Conclusion

Learning analytics interventions are designed to foster optimization of learning and learning environments (Society of Learning Analytics Research, n.d.) that can ultimately improve student performance and retention. The study presented here is one of the first attempts to understand how the use of analytic data might impact students' motivation. Our results indicate that even guided presentations of student performance data by trained advisors can have significant effects on students' academic achievement, motivational orientation, and ultimately, their decision to persist in college. Therefore, designers of learning analytics interventions should consider how best to collect and present such data. Motivational and psychological scales have proven useful for helping instructors and instructional designers tailor learning environments to learners' needs, yet they are difficult to marry with current learning analytics interventions. Thus, the next generation of learning analytics interventions must resolve the tension between the ease of scalability of current data sources (e.g., grades) and the richness of measures such as students' intentionality, goals, and motivations that provide direction about how to tailor learning environments to learners' needs.

Acknowledgments

This work was partially funded by an Exploring Learning Analytics Grant awarded by the Learning Analytics Task Force at the University of Michigan. The authors would like to thank their partners in the Summer Bridge Program and colleagues who helped develop earlier iterations of Student Explorer, the early warning system used in this investigation. Special thanks to William Gehring for his assistance with survey development and Richard Richter for his development work on the most recent version of Student Explorer. Finally, many thanks to the members of the USE Lab for their thoughtful feedback and suggestions throughout this project.

References

Ackermann, S. P. (1991). The benefits of summer bridge programs for underrepresented and low-income transfer students. Community/Junior College, 15, 211–224. http://dx.doi.org/10.1080/0361697910150209.
Adelman, C. (2006). The toolbox revisited: Paths to degree completion from high school through college. Washington, DC: U.S. Department of Education. http://www2.ed.gov/rschstat/research/pubs/toolboxrevisit/toolbox.pdf.
Aguilar, S., Lonn, S., & Teasley, S. D. (2014). Perceptions and use of an early warning system during a higher education transition program. In Proceedings of the fourth international conference on learning analytics and knowledge (pp. 113–117). Indianapolis, IN: ACM. http://dx.doi.org/10.1145/2567574.2567625.
Allen, D. (1999). Desire to finish college: An empirical link between motivation and persistence. Research in Higher Education, 40(4), 461–485. http://dx.doi.org/10.1023/A:1018740226006.
Allen, D. F., & Bir, B. (2012). Academic confidence and summer bridge learning communities: Path analytic linkages to student persistence. Journal of College Student Retention, 13(4), 519–548. http://dx.doi.org/10.2190/CS.13.4.f.
Arnold, K. E., & Pistilli, M. D. (2012). Course signals at Purdue: Using learning analytics to increase student success. In Proceedings of the second international conference on learning analytics and knowledge (pp. 267–270). Vancouver, BC, Canada: ACM. http://dx.doi.org/10.1145/2330601.2330666.
Astin, A. (1975). Preventing students from dropping out. San Francisco: Jossey-Bass.
Astin, A. (1984). Student involvement: A developmental theory for higher education. Journal of College Student Personnel, 25, 297–308.
Astin, A. W. (1993). Assessment for excellence: The philosophy and practice of assessment and evaluation in higher education. Westport: Oryx Press.
Barber, R., & Sharkey, M. (2012). Course correction: Using analytics to predict course success. In Proceedings of the second international conference on learning analytics and knowledge (pp. 259–262). Vancouver, BC, Canada. http://dx.doi.org/10.1145/2330601.2330664.
Blikstein, P. (2013). Multimodal learning analytics. In Proceedings of the third international conference on learning analytics and knowledge (pp. 102–106). Leuven, Belgium: ACM. http://dx.doi.org/10.1145/2460296.2460316.
Blumenfeld, P. (1992). Classroom learning and motivation: Clarifying and expanding goal theory. Journal of Educational Psychology, 84(3), 272–281. http://dx.doi.org/10.1037/0022-0663.84.3.272.
Cabrera, N. L., Miner, D. D., & Milem, J. F. (2013). Can a summer bridge program impact first-year persistence and performance? A case study of the new start summer program. Research in Higher Education, 54, 481–498. http://dx.doi.org/10.1007/s11162-013-9286-7.
Clow, D. (2012). The learning analytics cycle: Closing the loop effectively. In Proceedings of the second international conference on learning analytics and knowledge (pp. 134–138). Vancouver, BC, Canada: ACM. http://dx.doi.org/10.1145/2330601.2330636.
Coffrin, C., Corrin, L., de Barba, P., & Kennedy, G. (2014). Visualizing patterns of student engagement and performance in MOOCs. In Proceedings of the fourth international conference on learning analytics and knowledge (pp. 83–92). Indianapolis, IN: ACM. http://dx.doi.org/10.1145/2567574.2567586.
Collins, A., Brown, J. S., & Newman, S. E. (1987). Cognitive apprenticeship: Teaching the craft of reading, writing, and mathematics. In L. B. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser (pp. 453–494). Hillsdale, NJ: Lawrence Erlbaum Associates.
Colyar, J. (2011). Strangers in a strange land: Low-income students and the transition to college. In A. J. Kezar (Ed.), Recognizing and serving low-income students in higher education: An examination of institutional policies, practices, and culture (pp. 121–138). New York: Routledge.
Deci, E. L., & Ryan, R. M. (1985). Intrinsic motivation and self-determination in human behavior. New York: Plenum.
Dennis, J. M., Phinney, J. S., & Chuateco, L. I. (2005). The role of motivation, parental support, and peer support in the academic success of ethnic minority first-generation college students. Journal of College Student Development, 46, 223–236. http://dx.doi.org/10.1353/csd.2005.0023.
Duval, E. (2011). Attention please! Learning analytics for visualization and recommendation. In Proceedings of the first international conference on learning analytics and knowledge (pp. 9–17). Banff, Canada: ACM. http://dx.doi.org/10.1145/2090116.2090118.
Elliot, A. J. (2005). A conceptual history of the achievement goal construct. In Handbook of competence and motivation (pp. 52–72).
Elliot, E. S., & Dweck, C. S. (1988). Goals: An approach to motivation and achievement. Journal of Personality and Social Psychology, 54, 5–12. http://dx.doi.org/10.1037//0022-3514.54.1.5.
Elliot, A. J., & Harackiewicz, J. M. (1996). Approach and avoidance achievement goals and intrinsic motivation: A mediational analysis. Journal of Personality and Social Psychology, 70(3), 461–475. http://dx.doi.org/10.1037/0022-3514.70.3.461.
Elliot, A. J., & McGregor, H. A. (2001). A 2 × 2 achievement goal framework. Journal of Personality and Social Psychology, 80(3), 501–519. http://dx.doi.org/10.1037/0022-3514.80.3.501.
Fitts, J. D. (1989). A comparison of locus of control and achievement among remedial summer bridge and nonbridge students in community colleges in New Jersey (Unpublished doctoral dissertation). Rutgers University, New Brunswick, NJ.
Fletcher, S. L., Newell, D. C., Newton, L. D., & Anderson-Rowland, M. R. (2001). The WISE summer bridge program: Assessing student attrition, retention, and program effectiveness. Paper presented at the annual meeting of the American Society for Engineering Education, Albuquerque, NM. http://www.foundationcoalition.org/publications/journalpapers/fie01/01161.pdf.
Fritz, J. (2011). Classroom walls that talk: Using online course activity data of successful students to raise self-awareness of underperforming peers. Internet and Higher Education, 14(2), 89–97. http://dx.doi.org/10.1016/j.iheduc.2010.07.007.
Gandara, P. (2001). Paving the way to postsecondary education: K-12 interventions for underrepresented youth. Washington, DC: National Center for Education Statistics. http://nces.ed.gov/pubs2001/2001205.pdf.
Garcia, L. D., & Paz, C. C. (2009). Evaluation of summer bridge programs. About Campus, 14(4), 30–32. http://dx.doi.org/10.1002/abc.299.
Goggins, S., Galyen, K., & Laffey, J. (2010). Network analysis of trace data for the support of group work: Activity patterns in a completely online course. In Proceedings of the 16th ACM international conference on supporting group work (pp. 107–116). Sanibel Island, FL. http://dx.doi.org/10.1145/1880071.1880089.
Grann, J., & Bushway, D. (2014). Competency map: Visualizing student learning to promote student success. In Proceedings of the fourth international conference on learning analytics and knowledge (pp. 168–172). Indianapolis, IN: ACM. http://dx.doi.org/10.1145/2567574.2567622.
Harackiewicz, J. M., Barron, K. E., Pintrich, P. R., Elliot, A. J., & Thrash, T. M. (2002). Revision of achievement goal theory: Necessary and illuminating. Journal of Educational Psychology, 94(3). http://dx.doi.org/10.1037//0022-0663.94.3.638.
Heyman, G. D., & Dweck, C. S. (1992). Achievement goals and intrinsic motivation: Their relation and their role in adaptive motivation. Motivation and Emotion, 16, 231–247. http://dx.doi.org/10.1007/BF00991653.
Howley, I., & Rosé, C. P. (2014). Undergraduate attitudes toward help-seeking. In Proceedings of the 11th international conference of the learning sciences (pp. 1561–1562). Boulder, CO: International Society of the Learning Sciences.
Kezar, A. (2001). Summer bridge programs: Supporting all students. ERIC Digest. http://files.eric.ed.gov/fulltext/ED442421.pdf.
Krumm, A. E., Waddington, R. J., Lonn, S., & Teasley, S. D. (2014). A learning management system-based early warning system for academic advising in undergraduate engineering. In J. A. Larusson & B. White (Eds.), Learning analytics: From research to practice (pp. 103–119). New York: Springer Science+Business Media. http://dx.doi.org/10.1007/978-1-4614-3305-7_6.
Lepper, M. R. (1981). Intrinsic and extrinsic motivation in children: Detrimental effects of superfluous social controls. In W. A. Collins (Ed.), Aspects of the development of competence: The Minnesota symposium on child psychology (Vol. 14, pp. 155–214). Hillsdale, NJ: Lawrence Erlbaum Associates.
Lewin, T. (2013, August 22). Obama's plan aims to lower cost of college. The New York Times, p. A1. http://www.nytimes.com/2013/08/22/education/obamas-plan-aims-to-lower-cost-of-college.html.
Lonn, S., Krumm, A. E., Waddington, R. J., & Teasley, S. D. (2012). Bridging the gap from knowledge to action: Putting analytics in the hands of academic advisors. In Proceedings of the second international conference on learning analytics and knowledge (pp. 184–187). Vancouver, Canada: ACM. http://dx.doi.org/10.1145/2330601.2330647.
Lonn, S., Aguilar, S., & Teasley, S. D. (2013). Issues, challenges, and lessons learned when scaling up a learning analytics intervention. In Proceedings of the third international conference on learning analytics and knowledge (pp. 235–239). Leuven, Belgium: ACM. http://dx.doi.org/10.1145/2460296.2460343.
Lotkowski, V. A., Robbins, S. B., & Noeth, R. J. (2004). The role of academic and non-academic factors in improving college retention. Policy report. Iowa City, IA: ACT. http://www.act.org/research/policymakers/pdf/college_retention.pdf.
Macfadyen, L. P., & Dawson, S. (2010). Mining LMS data to develop an "early warning system" for educators: A proof of concept. Computers & Education, 54(2), 588–599. http://dx.doi.org/10.1016/j.compedu.2009.09.008.
May, M., George, S., & Prévôt, P. (2011). TrAVis to enhance online tutoring and learning activities: Real-time visualization of students tracking data. Interactive Technology and Smart Education, 8(1), 52–69. http://dx.doi.org/10.1108/17415651111125513.
Midgley, C., Maehr, M. L., Hruda, L. Z., Anderman, E., Anderman, L., Freeman, K. E., et al. (2000). Manual for the Patterns of Adaptive Learning Scales (PALS). Ann Arbor, MI: University of Michigan. http://www.umich.edu/~pals/manuals.html.
Morris, L. V., Finnegan, C., & Wu, S. (2005). Tracking student behavior, persistence, and achievement in online courses. The Internet and Higher Education, 8(3), 221–231. http://dx.doi.org/10.1016/j.iheduc.2005.06.009.
Muñoz-Merino, P. J., Valiente, J. A. R., & Kloos, C. D. (2013). Inferring higher level learning information from low level data for the Khan Academy platform. In Proceedings of the third international conference on learning analytics and knowledge (pp. 112–116). Leuven, Belgium: ACM. http://dx.doi.org/10.1145/2460296.2460318.
Myers, D., & Schirm, A. (1999, April). The impacts of Upward Bound: Final report for Phase I of the national evaluation. Report submitted to the U.S. Department of Education. Washington, DC: Mathematica Policy Research, Inc. http://mathematica-mpr.com/publications/PDFs/upwardph1.pdf.
Nolen, S. B. (1988). Reasons for studying: Motivational orientations and study strategies. Cognition and Instruction, 5, 269–287. http://dx.doi.org/10.1207/s1532690xci0504_2.
Norman, D. (2013). The design of everyday things: Revised and expanded edition. New York: Basic Books.
Pascarella, E. T., & Terenzini, P. T. (2005). How college affects students: Volume II, A third decade of research. San Francisco: Jossey-Bass.
Rawsthorne, L. J., & Elliot, A. J. (1999). Achievement goals and intrinsic motivation: A meta-analytic review. Personality and Social Psychology Review, 3(4), 326–344. http://dx.doi.org/10.1207/s15327957pspr0304_3.
Society of Learning Analytics Research (n.d.). About. http://www.solaresearch.org/mission/about/.
Soegaard, M. (2003). Affordances. http://www.interaction-design.org/encyclopedia/affordances.html.
St. John, E. P. (2006). Lessons learned: Institutional research as support for academic improvement. In E. P. St. John & M. Wilkerson (Eds.), Reframing persistence research to improve academic success: New directions for institutional research (Vol. 130, pp. 95–107). San Francisco: Jossey-Bass.
Strayhorn, T. (2011). Bridging the pipeline: Increasing underrepresented students' preparation for college through a summer bridge program. American Behavioral Scientist, 55(2), 142–159. http://dx.doi.org/10.1177/0002764210381871.
Suzuki, A., Amrein-Beardsley, A., & Perry, N. J. (2012). A summer bridge program for underprepared first-year students: Confidence, community, and re-enrollment. Journal of the First-Year Experience & Students in Transition, 24(2), 85–106.
Terenzini, P. T., & Wright, S. T. (1993). Students' personal growth during the first two years of college. Review of Higher Education, 10, 259–271. http://files.eric.ed.gov/fulltext/ED281463.pdf.
Tinto, V. (1987). Leaving college. Chicago, IL: University of Chicago Press.
Tinto, V. (1993). Leaving college: Rethinking the causes and cures of student attrition (2nd ed.). Chicago, IL: The University of Chicago Press.
Tinto, V. (1996). Reconstructing the first year of college. Planning for Higher Education, 25(1), 1–6.
van Barneveld, A., Arnold, K. E., & Campbell, J. P. (2012, January). Analytics in higher education: Establishing a common language. ELI Paper 1. http://www.educause.edu/Resources/AnalyticsinHigherEducationEsta/245405.
Walpole, M. (2003). Socioeconomic status and college: How SES affects college experiences and outcomes. The Review of Higher Education, 27(1), 45–73. http://dx.doi.org/10.1353/rhe.2003.0044.
Walpole, M., Simmerman, H., Mack, C., Mills, J. T., Scales, M., & Albano, D. (2008). Bridge to success: Insight into summer bridge program students' college transition. Journal of the First-Year Experience & Students in Transition, 20(1), 11–30. http://fyesit.metapress.com/content/f6k70720655h856k/.
