
Studies in Educational Evaluation 59 (2018) 29–40


Comparative effect of online summative and formative assessment on EFL student writing ability

Zohre Mohamadi
English Translation Department, Karaj Branch, Islamic Azad University, Karaj, Iran

ARTICLE INFO

Keywords:
Online summative assessment
Online formative assessment
Student portfolio writing
Collaborative writing

ABSTRACT

This study investigated the effect of online summative and formative assessment on the writing ability of 130 Iranian junior university students of English as a foreign language (EFL). Three assessment interventions in participants' writing performance were investigated over 27 sessions using a pretest/posttest time-series design: online summative assessment and online portfolio writing assessment, both conducted individually, and online collaborative formative assessment. Data were collected from students' individual writing in the online summative and portfolio formative assessments, and from collaborative writing in the online collaborative formative assessment in an e-writing forum. The writing performances were assessed using the International English Language Testing System (IELTS) rating scale. Paired-sample t-test and analysis of covariance results indicated improved writing ability in all interventions, with the greatest significant enhancement in the online collaborative writing assessment intervention. The results imply that using engaging technology and techniques along with appropriate assessment strategies is a powerful way of making learning efficient.

⁎ Corresponding author at: Islamic Azad University, Karaj Branch, Moazen Boulevard, Rajaeeshahr, Karaj, 31485-313, Iran.
E-mail address: Zohre.mohamadi@kiau.ac.ir.
https://doi.org/10.1016/j.stueduc.2018.02.003
Received 29 July 2017; Received in revised form 11 February 2018; Accepted 14 February 2018
0191-491X/ © 2018 Elsevier Ltd. All rights reserved.

1. Introduction

Education and assessment are so interwoven that one cannot have a comprehensive picture if either is missing from consideration. Assessment, whether summative (assessment for accreditation and validation) or formative (assessment for learning), is at the heart of education (Gikandi, Morrow, & Davis, 2011). Summative assessment determines whether predetermined learning outcomes have been achieved according to objectives programmed in advance, or whether the requirements for granting an accreditation or certification have been fulfilled (Llamas-Nistal, Fernández-Iglesias, González-Tato, & Mikic-Fonte, 2013). At the heart of formative assessment is assistance, in the form of feedback given to those involved in education; such feedback has no effect in a vacuum and must occur in a learning context to which it can be addressed (Hattie & Timperley, 2007). Teacher-student and student-student interaction in formative assessment mediates learning through scaffolding and assistance (Bennett, 2009), and learners give and receive feedback that fine-tunes their current level of language ability, as it taps the process of learning rather than its product (Tarighat & Khodabakhsh, 2016).

With the advancement of information and communication technology (ICT) in education, the unique capacity of ICT to connect people has received attention. ICT provides a medium in which learners can receive feedback through interaction, helping them notice their weaknesses and strengths (Yilmaz, 2017). How the advancement of ICT has affected education and learning is well documented (Vo, Zhu, & Diep, 2017); not equally well documented is how ICT has advanced assessment.

Looking to the future of e-assessment, Bennett (1998) describes three generations of e-assessment. The first uses designs based closely on paper-based tests, conducted online. The second generation includes multimedia, constructed response, automatic item generation, and automatic scoring. Llamas-Nistal et al. (2013) introduced a tool that blends classical exams with digital devices: learners take classical exams and are provided with automated grading and with statistical results and reports accessible at any time. The timely reports the tool produces constitute summative online assessment, helping learners shift from classical exams to digital online assessment. Another system, proposed by Rashad, Youssif, Abdel-Ghafar, and Labib (2008), is EAT, an electronic assessment system that provides information about students' answers and the time taken to answer. Yet another is Testweb, proposed by Dippel, Neundorf, and Yakimchuk (2008), which includes dynamic tests that change according to the answers the participants provide; Testweb also provides test ratings and statistical summaries. Shared among these electronic assessment systems are automatic test correction, automatic grading, and summary reports (Llamas-Nistal et al., 2013). Although the aforementioned studies are important and timely, they are summative in nature (Gikandi

et al., 2011), since online summative assessment is conducted through less integrated and asynchronous uses of computers such as wikis, blog writing, and asynchronous email, which employ objective tests or cloze and true/false tests and give teachers and students a digital report of correct answers (Siozos, Palaigeorgiou, Triantafyllakos, & Despotakis, 2009). The third generation began with more synchronous uses of computers, such as telecollaboration and forums for assessment. These are media in which assessment includes on-demand testing, e-portfolios, student modelling, formative assessment supporting learner autonomy, and diagnostic assessment (Pachler, Daly, Mor, & Mellar, 2010), all of which are formative in nature.

Despite important and timely research on online summative assessment, scant attention has been given to more integrated, communicative, and interactive platforms by which examinees' true knowledge and skills can be assessed in an authentic way (Huff & Sireci, 2001). Moreover, Kingston and Nash's (2011) meta-analysis of 300 articles on formative assessment indicated that studies on formative assessment are flawed, since the reported effect sizes make the results difficult to interpret. Besides, comparative studies of online formative and summative assessment deal with test behaviors such as test anxiety and confidence (Cassady & Gridley, 2005) and student engagement (Han & Finkelstein, 2013). Few studies have addressed the comparative potential of the two for student achievement (Broadbent, Panadero, & Boud, 2017), and, to the best of the researcher's knowledge, no comparative study of these assessment types has been conducted on a literacy skill such as writing. Missing from the aforementioned studies is whether interactive online platforms improve the quality of students' learning, which is a promising area for research. Although online learning is becoming a trend in developed countries, it is still in its infancy in less developed and developing countries. This requires researchers to investigate the potential of online education ecologies and how learning is affected by the lack of required infrastructure, such as internet speed and quality, a common problem in these countries, including Iran.

This study attempts to fill this void by investigating the comparative effect of online summative and formative assessment on the writing ability of Iranian EFL learners.

2. Background

In this section, theoretical background and empirical studies on the teaching and assessment of EFL writing and on the electronic assessment of writing are reviewed critically. This critical review helped the researcher design the study in the most logical and appropriate way.

2.1. Teaching and assessment of EFL writing

Writing has long been an interesting area of research for teaching practitioners and researchers. There are several reasons why writing has received special attention: a) writing is the least attended-to skill until school age, so it is the latest-needed and most demanding skill (Naghdipour, 2016); b) writing is often not taught as a separate skill but treated as a medium for practicing the structure and vocabulary of a language; and c) it is seen mainly as one of the skills examined in internal and public exams in many educational systems (Lee, 2010). Writing in most EFL contexts is taught through a traditional practice-examination oriented approach, which requires teachers to assign a topic, learners to write about that topic within a specified time limit, and teachers to provide feedback on the grammar and vocabulary of the submitted product (Shojaei & Motamedi, 2014). In traditional writing classes, writing was rendered to a single audience, the assessing teacher, and assessed with respect to the formal features of a text (Lee & Coniam, 2013; Vojak, Kline, Cope, McCarthey, & Kalantzis, 2011).

With the introduction of constructivist approaches to learning, reform movements shifted the trend of both education and research from achievement-based analysis towards process-based learning through mediational tools such as collaboration, interaction, and scaffolding (Miyazoe & Anderson, 2010). Scholars have tended to investigate the potential of collaborative writing in the construction and development of knowledge (Lei, 2008). Collaborative writing is defined as social negotiation between several writers in which knowledge is constructed and conveyed (Challob, Bakar, & Latif, 2016). It is a medium for scaffolding that involves students in problem solving and in finding the best way to communicate what they mean (Lin & Maarof, 2013). In collaborative writing, learners engage in collective activity with equity and mutuality of labor, and follow certain regulations to establish group harmony and achieve the outcome (Cho, 2017).

In line with these reforms in learning theories, assessment has also undergone a shift from assessment of learning to assessment for learning (Lee & Coniam, 2013). This requires teaching practitioners to move from assessment for certification and accountability purposes to a platform through which learners engage in self- and peer assessment, establish critical awareness through formative feedback, and close the gap between their current and desired performance (the zone of proximal development) (Vygotsky, 1980). New technologies provide great potential for connecting writing to situated learning practices in and out of the classroom, supporting collaborative writing and writing portfolios that can be assessed by peers, teachers, and teaching practitioners.

With the introduction of technology and computers into education and the urgent need to design web-based educational applications, research has investigated how online writing opportunities such as asynchronous email writing, wiki writing, and blog writing affect writing experiences from both education and assessment perspectives (Cope, Kalantzis, McCarthey, Vojak, & Kline, 2011). However, what is common to writing in EFL contexts, especially in the Middle East, is that although synchronous online writing is acknowledged theoretically for the authenticity and on-demand processing it brings, it is left behind in practice (Shojaei & Fatemi, 2016) due to the lack of required infrastructure such as internet quality and speed (Rabiee, Nazarian, & Gharibshaeyan, 2013). Research on the assessment of synchronous writing is even scarcer in these contexts. This study attempts to fill this void in the EFL context of Iran, and the results can be insightful for EFL contexts similar to Iran.

2.2. Assessment and electronic assessment of writing

Early studies of writing assessment mainly focused on how to design tools and tasks with high reliability for evaluation purposes (Nixon & McClay, 2007). Broad's (2003) review of many studies criticized them for being obsessed with finding ways of standardization to reduce the effect of bias in evaluation decisions. Various attempts were made to investigate assessment rubrics and their reliability (McMillan, Venable, & Varier, 2013). Critical assessment of rubrics was the initial step in reforming assessment. Rezaei and Lovorn (2010) questioned the reliability and validity of the rubrics used for assessment purposes by showing that raters, with or without rubrics, were concerned with the mechanics of writing rather than its content. The needs analysis of academic and science communication courses by Rakedzon and Baram-Tsabari (2017) helped the researchers design a rubric that was used for genre-sensitive evaluation. In contrast with the traditional focus on rubrics, reform movements in assessment emphasized the writing process, audience awareness, and topic knowledge. In addition, procedural facilitators such as rubrics, checklists, and dictionaries, together with assessment criteria such as organization and content as well as genre-specific evaluative components in rubrics, alongside structure and semantics, are reported to be implemented in large-scale educational assessment (Mo & Troia, 2017).

Although these studies are contributive to the field, they are

criticized for being summative in nature: they do not provide the social and process-oriented approach to writing advocated by sociocultural and constructivist approaches to education (Nixon & McClay, 2007), and they ignore the social nature of writing and the understanding of writing as human communication (Vojak et al., 2011). This indicates that assessment lags behind theories of education, learning, and teaching (Farhady & Hedayati, 2009). The irony is remarkable: while practitioners acknowledge the implementation of communicative approaches to learning and teaching, a discrete-point approach to assessment is still practiced in many educational contexts across the globe. The situation becomes even worse when it comes to the incorporation of ICT into assessment (Vojak et al., 2011). Part of this is because of the complexity of establishing authentic situations of language performance for evaluation purposes (Newhouse, 2011). Simulating a medium that represents real performance is costly: for sensitive, high-stakes occupations such as piloting or surgery, educational services may be required to bear the cost, but for less credential-critical learning such high-stakes expenditure may not be affordable. Siozos et al. (2009) named several challenges of incorporating technology into assessment. One is the test-mode effect, which assumes that paper-based (PB) and computer-based (CB) tests are not equivalent, and that the difference may trigger construct-irrelevant factors such as prior experience of, attitudes towards, and perceptions of technology (Huff & Sireci, 2001).

Another challenge is the type of questions, which are believed to have unintended washback effects if they are ill-formed (Huff & Sireci, 2001). This again brings teachers into action in designing creative questions; otherwise, the questions may promote rote learning (Conole & Warburton, 2005). The type of feedback is a further challenge. Feedback is an integral part of assessment (Evans, 2013). From a socio-constructivist perspective, feedback facilitates learning, since it helps students gain shared understanding and increased responsibility for acting on feedback (Carless, Salter, Yang, & Lam, 2011). Computer-assisted human feedback systems are designed to assist learners by offering in-text feedback such as "track changes", where learners hold a dialogue parallel to a conversation with the text (Cope et al., 2011) and accept or reject changes. However, few studies have worked on designs that instigate individualized feedback (Lew, Alwis, & Schmidt, 2010). Research indicated that the power of feedback to increase the length and type of responses was recognized in (formative) technology designs in which computers were seen as social actors, in comparison with summative or diagnostic designs (Jordan, 2012). Most CB assessment provides performance-oriented assessment, in which students are simply given confirmation of correct answers rather than more personalized qualitative assessment (Peterson & Irving, 2008). Accordingly, the software designers introduce is prepared in advance to respond to large groups of learners without taking specific groups into account; homogenization issues are therefore hard to address in online assessment (Hinostroza & Mellar, 2001).

The other challenge in online assessment is the complexity of validity and reliability issues (Gikandi et al., 2011). Validity of online assessment is defined as whether the assessment promotes learning (Black & Wiliam, 2009), and according to the literature review by Gikandi et al. (2011) it relies on three characteristics: a) authenticity of the assessment activity, that is, whether the activity requires real-life language processing such as analysis, problem solving, and decision making (Crisp & Ward, 2008; Lin, 2008); b) effective formative feedback, that is, whether the feedback is timely and ongoing, requires learners to make modifications, and is accompanied by well-designed rubrics shared between teachers and learners (Wang, Wang, & Huang, 2008); and c) multidimensional approaches and a variety of activities by which learners can demonstrate their abilities and foster autonomous learning (Crisp & Ward, 2008). According to Driessen, Van Der Vleuten, Schuwirth, Van Tartwijk, and Vermunt (2005), reliability is the degree to which what is assessed is a sufficient indicator of the intended knowledge. Gikandi et al. (2011, p. 2339) propose three characteristics on which the reliability of online assessment relies: "(1) opportunities for documenting and monitoring evidence of learning", through which learners and teachers can identify strengths and weaknesses and decide on remedial actions; (2) multiple sources of evidence of learning; and (3) "explicit clarity of learning goals and shared meaning of rubrics", which helps learners to be active participants in their assessment and to make principled decisions about their learning. Newhouse (2011) states that three general features are shared among all sorts of online assessment: they provide a medium for the presentation of knowledge, a record of performance indicators of competences, and a framework for the interpretation of evidence.

Despite these challenges, various studies provide insights into how online formative assessment can help learners expand their learning. Online formative assessment helps learners manage their own learning by providing feedback that helps them self-regulate (Nicol & Macfarlane-Dick, 2006). A comparison of students using online formative assessment with a control group indicated that students with opportunities for online formative assessment outperformed the counterpart group in subsequent summative tests (Olson & McDonald, 2004). A comparative study of the effect of online formative and summative assessment on test behaviors and beliefs, such as anxiety and perceived test threat, indicated no significant difference in students' anxiety level and test threat due to the test-mode effect (Cassady & Gridley, 2005); the results support the integration of online assessment, since it increases learner confidence and time on instruction. Another comparative study of formative and summative assessment mediated by feedback technologies suggested that the tool increased student engagement and student perception of technology use, and yielded more benefits from formative than from summative assessment (Han & Finkelstein, 2013). Kingston and Nash's (2011) meta-analysis of 300 articles on formative assessment indicated that studies on formative assessment are flawed, since the reported effect sizes make the results difficult to interpret. Mediation analysis indicated that students of art benefit more than those of science and mathematics, and that among the many approaches to formative assessment, professional formative assessment and computer-aided formative assessment were the most effective, having the highest effect sizes. Regarding computer-mediated feedback, parametric tests indicate a moderate positive correlation between automated writing evaluation and instructors' analytic ratings, and interviews with instructors indicated that teachers use automated writing evaluation strategically to respond to student needs (Li, Link, Ma, Yang, & Hegelheimer, 2014). Students' error ratios and error-correction ratios in a web-based error-correction practice mechanism, an online annotation system with an Error Correction Recommender that recommends error corrections of an essay to students, compares them with peers' corrections, and checks them against teacher corrections, evidenced writing development (Yeh, Lo, & Chu, 2014). Cheng's (2017) study of the effect of online automated feedback on students' reflective journals indicated that the experimental group, which received either teacher or automated feedback on writing quality, outperformed the control group not only in writing score but also in perceived efficacy of the feedback system. Stevenson and Phakiti's (2014) review of computer-generated feedback on the quality of writing indicated that research findings in this area are inconsistent, since its efficacy depends on the heterogeneity of participants, contexts, and designs, and on methodological issues (Stevenson, 2016). Cheng, Liang, and Tsai's (2015) study of the effect of online feedback on students' writing indicated that cognitive feedback, such as direct correction, helped more than metacognitive feedback (reflective comments) and affective feedback (praising comments).

In a study by Asoodar, Atai, Vaezi, and Marandi (2014), analysis of a self-perceived learning and sense-of-community questionnaire, semi-structured interviews, and participant observation of weblog writing suggested that students with a high sense of community perceive more learning, while students with a low sense of community perceive the least. This suggests a more


constructivist and social-interactional approach to learning, and in turn to assessment, which instigates more learner engagement. Behizadeh and Pang's (2016) study of the new wave of writing assessment indicated that, regardless of advancements in the theory of sociocultural assessment, there was no evidence of it in any form in 46 out of 50 US states, as most writing assessments are on-demand writing. The distinction between psychometric multiple-choice questions and on-demand writing tasks on the one hand, and sociocultural assessment practices on the other, is that the latter require production rather than recognition, projects rather than items, teacher judgment rather than mechanical scoring, reflection and appraisal on the part of both teachers and learners, and personalized, longitudinal, contextualized evaluation (Lam, 2017). A prototype of sociocultural assessment that has received considerable attention is portfolio writing. Portfolio writing has been used for intercultural awareness, teachers' professional development, and student assessment (Burner, 2014). "There is no one definition of portfolio" (Burner, 2014, p. 2); the purpose for which a portfolio is used decides its type. The features shared among all portfolio types are the archives learners keep to document their effort, strengths, weaknesses, and achievements in the writing development process, and reflection through self-assessment and selection of texts (Burner, 2014). How summative (showcase) and formative (learning) portfolio writing affect writing is well documented (Nezakatgoo, 2011). However, the electronic portfolio is in its infancy, and scant attention has been given to it. Accordingly, this research attempts to explore its potential in the formative sense of assessment. It is intended to investigate the comparative potential of online formative and summative assessment for writing ability, utilizing online writing forums where learners collaboratively wrote their writing assignments and were assessed through e-portfolios. To achieve the objectives of this study, the following research questions were formulated.

1. Does summative assessment have any significant effect on the improvement of the online writing ability of Iranian EFL learners?
2. Does online portfolio writing have any significant effect on the improvement of the online writing ability of Iranian EFL learners after removing (controlling for) the effect of summative assessment?
3. Does EWF have any significant effect on the improvement of the online writing ability of Iranian EFL learners after removing (controlling for) the effect of summative assessment and portfolio writing?

3. Method

In this section, the step-by-step procedure of the study is described. It includes information on the participant selection procedure, the instruments used, how the data gathered by each instrument were coded and analyzed, the research intervention in each assessment type, and the particular roles teachers and students played in each group.

3.1. Participants

3.1.1. Student participants
130 Iranian male and female junior university students studying English as a foreign language voluntarily participated in this study. The essay writing program was a free summer school program held in the researcher's institution. It was an extracurricular activity rather than a central part of the curriculum, meaning that students volunteered to enroll. More than 200 students enrolled for the program, but two criteria governed the inclusion and exclusion of participants. The first was the proficiency level of their writing ability: although students of heterogeneous writing ability enrolled, only those rated the same at the pretest stage of the research were included; the rest enjoyed the program but were excluded from the study. The second criterion was their computer literacy, as asked in the registration form. Those who declared using computers daily were included. The study was conducted in 10 classes of advanced writing with approximately 10 to 13 students in each.

3.1.2. Teacher participants
30 PhD students of EFL with 5-8 years of teaching experience were invited to take part in the study. The writing program was free for the students; therefore, the institute did not fund the program, and in order to secure maximum cooperation the teachers were paid by the researcher. The criteria for teacher selection were years of teaching experience in general and writing-teaching experience in particular.

3.1.3. Assistant researchers
Five assistant researchers (senior PhD students in EFL) contributed to the study on several occasions, including the participant selection phase, student briefing sessions on portfolio writing, acceptance of students' membership in the online writing forum (EWF), rating of writing quality in the summative assessment, and rating of portfolio writing and log analysis of the EWF in the formative assessments. They were also certificated IELTS examiners: all held relevant teaching qualifications, were trained and certified as IELTS examiners by Cambridge University's English Language Assessment, and were approved by the British Council or IDP: IELTS Australia.

3.2. Instruments

One instrument, the electronic writing forum, was used for collecting the data. The forum was configured differently in the three assessment interventions. In the online summative assessment stage, it was configured for students' individual writing; in the online portfolio formative assessment, for individual portfolio writing. The collaboration options of the forum were activated for students' collaborative writing in the collaborative formative assessment stage. How the performances were rated is described under each instrument below. Teacher feedback in the treatment sessions was given in each stage according to the rubrics mentioned under the respective instrument; assessment for research purposes used the IELTS rating scale.

3.2.1. Electronic writing
For the purpose of summative assessment, after classroom instruction that was similar in both content and procedure, students were asked to write essays individually on EWF (e-writingforum.ir) across various genres: classification and division, comparison and contrast, argumentation, narrative, and description. The writing topics were chosen on the basis of a topic familiarity questionnaire. Each student had an account on the website and was directed to log in and fill in the fields on the website, which corresponded to the components of an essay. When they finished and confirmed, a file compiled from the field contents was created and archived for both student self-evaluation and teacher comments and feedback. The writings were rated by the teachers for the purpose of teacher feedback, and by the assistant researchers for the purpose of research credibility. Teacher feedback included comments on the language, content, and organization of student writings. The sum of the assistant researchers' ratings of 7 essay writings was considered an index of student writing ability in the summative and formative assessments. This phase lasted 7 sessions. For summative assessment purposes, students' electronic writings before and after the 7 sessions were considered pretest and posttest measures.

For online formative assessment, two techniques, individual online portfolio writing and collaborative writing through the online writing forum (EWF), were used. In the first phase of the formative assessment

technique, online portfolio writing through EWF was utilized with the same students for 7 sessions and assessed in the 10th and 18th sessions as pretest and posttest measures. At the second phase of the formative assessment technique, collaborative writing through EWF was utilized with the same students for 7 sessions and assessed in the 19th and 27th sessions as pretest and posttest measures. There were significant agreements between the raters who rated the students' performance on a) the pretest of electronic writing (r (128) = 0.885, representing a large effect size, p = .000) and b) the posttest of electronic writing (r (128) = .915, representing a large effect size, p = .000).

3.2.2. E-writing forum (EWF)

A website named E-writing forum (e-writingforum.ir) was launched to achieve the objectives of the study. Some of the features of this website are as follows: (1) sharing finished files with anyone by uploading; (2) accepting or rejecting changes, which makes it possible to track changes and control what makes it into the writing tasks and what does not; (3) inline comments, which are provided through collaboration on specific pieces of text; and (4) discussion tools by which participants could share ideas, review changes and gather feedback in one place. The website also allowed uploading and downloading any sort of file. Students were supposed to plan, organize, monitor, analyze, synthesize, assess and evaluate their writings through this medium. It should be noted that for individual online electronic writing followed by online summative assessment, and for online portfolio writing followed by online formative assessment, some of EWF's potential could not be used because students were not grouped in the forum. Students could have interactions with teachers, though. The reason was the type of assessment intervention under study. Only in collaborative writing followed by online formative assessment did students have the collaboration potential of EWF to write their assignments collaboratively. It is also worth mentioning that the website was utilized in other projects by the author (Mohammadi, 2017; Mohamadi, 2018). The problems reported in the first implementation of the website were addressed and the website was improved for subsequent use in the present research.

Students collaboratively decided, discussed and wrote about the selected topics. Log analysis of EWF indicated how students negotiated form and meaning and attended to their writing. To analyze how formative assessment worked for the students, the researcher conducted log analysis to investigate the episodes in which learners repaired or managed discourse by attending to the form and content of their writing. Those episodes were taken as tokens of the potential of formative assessment in mediating writing ability. To identify negotiation of form and meaning, the framework of Varonis and Gass (1985), as cited in Ellis and Barkhuizen (2005), was used. The following episode indicates how the framework is used. The sequence "trigger, indicator, response and reaction to response" makes one negotiation unit.

Trigger A: when you speak do you translate from your mother language to English?

Indicator B: I didn't get the point

Response A: you know we have different cultures. We translate sentences in our

Reaction B: yes, I know it but sometime we need in some situations

A: but most of the students do. If they speak English every day, they can solve the problems

There were significant agreements between the raters who rated the students' performance on the electronic writing forum (r (128) = 0.980, representing a large effect size, p = .000). This was taken as the inter-rater reliability of the 5 assistant researchers in rating student writing on the online writing forum.

3.2.3. E-portfolio writing

Students were asked to write individual e-portfolios. Portfolio was new to both teachers and students. Therefore, following Imhof and Picard's (2009) guide, a debriefing session on what a portfolio is and how it should be dealt with was held so that any sources of misunderstanding could be eliminated from the portfolio performance. The purpose was to give the students a concrete example of what a portfolio is and how it should be written, and to give them the opportunity to start their own portfolio writing. The student writing portfolio consisted of a reflective evaluation of their growth, references to the evidence of growth by providing the best exemplar from their archive of writing, their future vision of the problems they have in writing and how they are going to solve them, and their evaluation of the feedback they received from the teachers and how they responded to the comments. The electronic portfolio writing on the online writing forum provided several facilities for students, including checkpoints, reflection prompts by which students' reflection was directed to give an appropriate account of their progress, an area for reviewing portfolios, and checking grades of portfolio assessment.

Students were provided a sample portfolio and a debriefing session on how to conduct it. Several suggestions on how to interpret the themes and how to provide the requested information were provided for each theme. Student portfolios were assessed according to a predetermined scoring scale. The electronic portfolios of students were assessed using the Carleton College Writing Portfolio Scoring Sheet (2017) since it is reported to be valid and reliable (Nezakatgoo, 2011). The assistant researchers wrote a brief summary in which comments on scores were given and important arguments and evidence were cited, consulted fellow assessors and discussed whether the assigned scores could be compared, discussed the assigned scores and the rationale by providing evidence and arguments, and determined whether to hold on to the original score or make adjustments. It should be mentioned that the personal views of assessors could not be completely eliminated, but care was taken to minimize this effect through briefing sessions on the assessing procedure. Besides, the inter-rater reliability of the five research assistants' scoring of students' electronic portfolio writing was considered as an index of reliability. There were significant agreements between the raters who rated the students' performance on portfolio writing (r (128) = 0.885, representing a large effect size, p = .000). This scoring sheet had items covering issues such as the appropriateness of rhetoric and diction for purpose and audience, whether the arguments were made logically, coherently and based on evidence, whether they contributed to analysis and synthesis, and whether the writing needed to be edited for language irregularities. There was an extra space for additional comments on student portfolio writing.

3.2.4. IELTS writing rating scale

Each task was assessed independently using the IELTS rating rubric. The rubric included detailed performance descriptors developed for describing written performance at the nine IELTS bands on Task Achievement (how appropriately, accurately and relevantly the response fulfils the requirements set out in the task), Coherence (overall clarity and fluency of the message), Lexical Resource (the range of vocabulary the test takers have used and the accuracy and appropriateness of that use in terms of the specific task) and Grammatical Range and Accuracy (the range and accurate use of the test takers' grammatical resource as manifested in their writing at the sentence level).

3.3. Procedure

After participant selection, several debriefing sessions were held on how to work with the e-writing forum website. Students were provided with teacher-directed instruction, with similar lesson plans both in terms of content and procedure, on essay writing of different genres across classes. As for the writing assignments for summative assessment, they


were asked to write on predetermined and in-class brainstormed topics on the website. The scores on the first online writing were taken as the pretest.

The electronic writing page of the website had fields which were supposed to be filled by student writing. Each field was associated with a different component of essay writing, such as the motivator, thesis statement, blueprint, first, second and third body paragraphs, clincher and conclusion. No time limit was imposed. The grammar and spelling checkers of the website were deactivated. Meanwhile, teachers assessed students' writing quality by providing comments of all types, including language, content and organization, on student writing through the website. Analytic evaluation through highlighting, crossing out the erroneous form and providing the correct form, as well as general assessment, were utilized. The online writing in the 9th session was taken as the posttest. This phase of the study lasted 7 sessions. One session before this phase and the last session of this stage were taken as pretest and posttest.

The second phase of the study started with students' portfolio writing, which marked the beginning of formative assessment. Since the study had a time-series design, the posttest of electronic writing was used as the pretest at this stage for the second round of treatment. Along with the online writing assignment in each session, students were asked to do online individual portfolio writing. Students were debriefed on student portfolio writing, and a sample portfolio and how it should be written were presented in class. Students' online portfolio writing was assessed as mentioned before, and students were provided with teacher comments on portfolio writing online through the writing website. Online portfolio writing lasted for 7 sessions. Teachers' assessment of student online portfolio writing was considered as online formative assessment. The online writing in the 18th session was considered as the posttest of writing quality after online formative portfolio writing.

Considering the time-series design of the study, the second posttest of online writing marked the beginning of the third phase of the study, which was online formative assessment through log analysis of collaborative writing on EWF (e-writing forum), and was considered as the pretest at this stage. At this stage, students were asked to make pairs in class, and the admin of the website joined the pairs in a group on the website. This time, students wrote their writing assignments collaboratively for 7 sessions. Teachers provided comments on students' writing the same way they did in the previous stages of the study. To account for individual writing quality, teachers coded student contributions to the forum through log analysis of EWF and the discourse repairs each student made. An open coding procedure was used to account for the negotiation of meaning and negotiation of form units each student instigated and responded to, according to the coding procedure mentioned in Section 3.2.2. Online formative assessment through EWF lasted 7 sessions. The last writing on EWF was considered as the third posttest of writing quality. A summary of the research interventions, including learner and teacher roles in each mode of assessment and how student writings were assessed for research purposes, is provided in Table 1.

4. Data analysis

The objectives of this study were threefold; it aimed at investigating the effect of summative assessment and of online formative assessment, with the two techniques of portfolio writing and online collaborative writing through the electronic writing forum (EWF), on the improvement of the writing ability of the Iranian EFL learners. Before discussing the results, it should be noted that the normality of the data, which was probed by computing the ratios of skewness and kurtosis over their respective standard errors, was retained. As displayed in Table 2, these ratios were lower than ± 1.96; hence the normality of the data.

5. Findings of the study

In this section, the results of the analyses on the effect of online summative assessment, online portfolio writing formative assessment and online collaborative formative assessment on students' writing ability are presented.

5.1. The effect of online summative assessment on students' writing ability

A paired-samples t-test was run to compare the students' means on the pretest and first posttest, i.e. summative assessment. Based on the results displayed in Table 3, it can be claimed that the participants had a higher mean on the posttest of summative assessment (M = 13.42, SD = 1.90) than on the pretest (M = 8.48, SD = 1.10).

The results of the paired-samples t-test (t (129) = 48.55, p = .000, r = 0.974, representing a large effect size) (Table 4) indicated that the students, after receiving summative assessment, made a significant improvement in their mean score from pretest to posttest (Fig. 1).

The first research question, which explores if online individual summative assessment affects writing ability, is answered as the findings suggest that students' writing ability improved as a result of online summative assessment. Given the summative nature of assessment conventions in Iran, students' improved gain scores through summative assessment were not surprising. However, its online feature might have helped them enjoy the benefits of the assessment context.

5.2. The effect of online portfolio writing formative assessment on students' writing ability

After the first posttest, the students received online portfolio assessment, which was followed by the second posttest. A repeated measures ANCOVA was run to compare the participants' means on the second and the first posttests, controlling for the possible effect of their entry writing ability as measured through the pretest. Before discussing the results, it should be noted that the assumptions of homogeneity of variances and homogeneity of regression slopes were not checked because the present study included one single group. However, the assumption of a linear relationship between the dependent variable (second posttest) and the covariate (pretest) was retained. As displayed in Table 5, the results of the ANOVA test (F (1, 125) = 339.20, p = .000) indicated that the statistical assumption that there was not a linear relationship between the two variables was rejected.

Based on the results displayed in Table 6, it can be concluded that the participants, after receiving portfolio assessment, had a higher mean on the second posttest (M = 20.87, SE = 0.11) than on the first posttest (M = 13.42, SE = .093), controlling for the effect of the pretest.

The results of the repeated measures ANCOVA (F (1, 128) = 11.47, p = .001, partial η² = 0.082, representing a moderate effect size) (Table 7) indicated that the participants, after receiving portfolio assessment, had a significantly higher mean on the posttest of online portfolio assessment than on the posttest of summative assessment after controlling for the effect of the pretest (Fig. 2).

Considering the second research question, which investigates if online individual portfolio writing affects writing ability, it should be mentioned that the findings confirm improvement in students' writing ability as a result of online individual portfolio writing. Although portfolio writing is not conventional in the Iranian education context, neither in assessment nor in class instruction, it had a significantly larger effect on writing ability than online summative assessment. This implies that portfolio writing has high potential in boosting students' writing ability. Accordingly, teachers and testing practitioners should give special attention to it.

5.3. The effect of online collaborative writing through EWF on students' writing ability

After the second posttest, the students received EWF, which was followed by the third posttest. A repeated measures ANCOVA was run


Table 1
Research intervention.

Online summative assessment (online individual writing through EWF)

Learner role: (1) they research and gather information using any source; (2) they provide the outline and give it back to the teacher, and the teacher provides pertinent comments; (3) they then plan and write the first draft; (4) they check the first draft and edit the writing individually; (5) they hand in their writing to the teacher, and the teacher comments on language, content and organization; (6) they receive the teacher's comments and revise the paper.

Teacher role: (1) explicit teaching of essay writing; (2) teacher lecturing on different genres of writing, tips for writing them and problems to avoid, through in-class PowerPoint presentation; (3) online analytic evaluation through highlighting, crossing out the erroneous form and providing the correct form.

Research measures: individual writings were measured in terms of language, content and organization.

Online formative assessment (individual electronic portfolio writing)

Learner role: reflective evaluation of their growth, references to evidence of growth, their evaluation of the feedback they received from the teacher, and responses to the comments.

Teacher role: (1) explicit teaching of essay writing; (2) teacher lecturing on different genres of writing, tips for writing them and problems to avoid, through in-class PowerPoint presentation; (3) analysis of portfolios in terms of the portfolio rubric, and online provision of reflection prompts directing learners to account for their progress after portfolio assessment.

Research measures: individual writings were measured in terms of language, content and organization.

Online formative assessment (collaborative writing through EWF)

Teacher role: (1) explicit teaching of essay writing; (2) teacher lecturing on different genres of writing, tips for writing them and problems to avoid, through in-class PowerPoint presentation; (3) analytic online feedback on student writing; (4) open coding of negotiation of form and meaning units to account for improvements in writing quality through discourse repair, accounting for individual writing by treating each unit as one score adding to the potential of the collaborative writing mode.

Learner role: (1) students choose their partners on the basis of their convenience; (2) they brainstorm about the topic, which is chosen considering topic familiarity; (3) they research and gather information using any source; (4) they provide the outline and give it back to the teacher, and the teacher provides pertinent comments; (5) they then plan and write the first draft; (6) they check the first draft according to the checklist provided by the teacher in advance; (7) each student edits the writing individually with different highlight colors so that, when handed in together, they can track each other's ideas and provide justification for the required revisions; (8) they hand in their writing to the teacher, and the teacher comments on language, content and organization; (9) students receive the teacher's comments and revise the paper together.

Research measure: individual writings were measured in terms of language, content and organization.
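The open coding of negotiation units summarized in Table 1 and Section 3.2.2 can be sketched as a simple tally over coded moves. The log format, the move labels and the function below are illustrative assumptions for this sketch, not the actual instrument used in the study; only the trigger, indicator, response, reaction-to-response sequence comes from the Varonis and Gass (1985) framework cited above.

```python
# Hypothetical sketch: counting complete negotiation units
# (trigger -> indicator -> response -> reaction to response)
# in an open-coded forum log.

UNIT = ["trigger", "indicator", "response", "reaction"]

def count_negotiation_units(moves):
    """Count non-overlapping trigger -> indicator -> response -> reaction
    sequences in a list of coded moves."""
    units, pos = 0, 0
    for move in moves:
        if move == UNIT[pos]:
            pos += 1
            if pos == len(UNIT):  # one full negotiation unit completed
                units += 1
                pos = 0
        elif move == UNIT[0]:     # a new trigger restarts the sequence
            pos = 1
    return units

# The sample episode from Section 3.2.2, coded move by move
# (the trailing follow-up turn falls outside the unit):
episode = ["trigger", "indicator", "response", "reaction", "other"]
print(count_negotiation_units(episode))  # -> 1
```

Each complete unit would then contribute one score to the student who instigated or responded to it, as described in the teacher-role row above.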

Table 2
Descriptive Statistics; Testing Normality Assumption.

                                    Skewness                          Kurtosis
                                    Statistic  Std. Error  Ratio      Statistic  Std. Error  Ratio
Pretest                             0.334      0.212       1.58       -0.668     0.422       -1.58
Posttest of Summative Assessment    0.293      0.212       1.38       -0.402     0.422       -0.95
Posttest of Online Portfolio        -0.015     0.212       -0.07      -0.224     0.422       -0.53
Posttest of EWF                     0.407      0.212       1.92       0.421      0.422       1.00

Fig. 1. Pretest and Posttest of Writing.
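The normality screen in Table 2 divides each skewness and kurtosis statistic by its standard error and checks the ratio against ±1.96. A minimal sketch of that check follows; the standard-error formulas are the usual small-sample ones (as reported by SPSS-style output), which the paper uses implicitly rather than states:

```python
import math

def se_skewness(n):
    """Standard error of sample skewness for a sample of size n."""
    return math.sqrt(6.0 * n * (n - 1) / ((n - 2) * (n + 1) * (n + 3)))

def se_kurtosis(n):
    """Standard error of sample (excess) kurtosis for a sample of size n."""
    return 2.0 * se_skewness(n) * math.sqrt((n * n - 1.0) / ((n - 3) * (n + 5)))

def is_normal_enough(statistic, std_error, critical=1.96):
    """The ratio test used in Table 2: |statistic / SE| must stay below 1.96."""
    return abs(statistic / std_error) < critical

n = 130  # sample size in the study
print(round(se_skewness(n), 3))   # -> 0.212, matching Table 2
print(round(se_kurtosis(n), 3))   # -> 0.422, matching Table 2
print(is_normal_enough(0.334, se_skewness(n)))  # pretest skewness -> True
```

With n = 130 both standard errors reproduce the values in Table 2, and every ratio in the table stays inside ±1.96, which is why the normality assumption was retained.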

to compare the participants' means on the third and second posttests, controlling for the possible effect of their entry writing ability as measured through the pretest and the posttest of summative assessment. That is to say, the students' performance on the posttest of EWF might have been affected by the pretest and also by the summative assessment administered in the first phase of the study.

Table 3
Descriptive Statistics; Pretest and Posttest of Summative Assessment.

                                    Mean    N    Std. Deviation  Std. Error Mean
Pretest                             8.48    130  1.108           0.097
Posttest of Summative Assessment    13.42   130  1.908           0.167

Table 4
Paired-Samples t-test; Pretest and Posttest of Summative Assessment.

Paired Differences                                                    t       df   Sig. (2-tailed)
Mean    Std. Deviation  Std. Error Mean  95% CI Lower  95% CI Upper
4.938   1.160           0.102            4.737         5.140          48.551  129  0.000

Table 5
Test of Linear Relationship between Posttest of Portfolio Assessment and Pretest.

Post2 * Pretest                              Sum of Squares  df   Mean Square  F        Sig.
Between Groups  (Combined)                   592.055         4    148.014      85.666   0.000
                Linearity                    586.075         1    586.075      339.202  0.000
                Deviation from Linearity     5.980           3    1.993        1.154    0.330
Within Groups                                215.976         125  1.728
Total                                        808.031         129
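The effect size reported with the paired-samples t-test in Table 4 (r = 0.974) is recoverable from t and its degrees of freedom. A quick check, assuming the common conversion r = sqrt(t² / (t² + df)):

```python
import math

def t_to_r(t, df):
    """Effect-size correlation from a t statistic: r = sqrt(t^2 / (t^2 + df))."""
    return math.sqrt(t * t / (t * t + df))

# Values from Table 4: t(129) = 48.551 for the pretest/posttest comparison.
print(round(t_to_r(48.551, 129), 3))  # -> 0.974, the "large effect size" reported
# The paired mean difference in Table 4 matches the descriptives in Table 3:
print(round(13.42 - 8.48, 2))         # -> 4.94
```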


Table 6
Descriptive Statistics; Posttests of Online Portfolio and Summative Assessment with Pretest.

Writing                             Mean     Std. Error  95% CI Lower Bound  95% CI Upper Bound
Posttest of Summative Assessment    13.423a  0.093       13.239              13.607
Posttest of Online Portfolio        20.877a  0.115       20.648              21.105

Fig. 2. Posttests of online portfolio and summative assessment with pretest.

Before discussing the results, it should be noted that the assumption of linear relationships between the dependent variable (third posttest) and the covariates (pretest and first posttest) was met. As displayed in Tables 8 and 9, the results of the ANOVA test (F (1, 125) = 170.96, p = .000) indicated that the statistical assumption that there was not a linear relationship between the posttest of EWF and the pretest was rejected. The results of the ANOVA test (F (1, 123) = 451.64, p = .000) indicated that the statistical assumption that there was not a linear relationship between the posttest of EWF and the posttest of summative assessment was rejected.

Based on the results displayed in Table 10, it can be concluded that the participants, after receiving EWF, had a higher mean on the third posttest (M = 35.81, SE = 0.18) than on the second posttest (M = 20.87, SE = 0.11), controlling for the effect of the pretest and the posttest of summative assessment.

The results of the repeated measures ANCOVA (F (1, 127) = 8.31, p = .005, partial η² = 0.061, representing a moderate effect size) (Table 11) indicated that the participants, after receiving EWF, had a significantly higher mean on the posttest of EWF than on the posttest of online portfolio assessment after controlling for the effect of the pretest and the posttest of summative assessment (Fig. 3).

Recalling the third research question, the researcher investigated if online collaborative writing affected students' writing ability. The findings indicated that online collaborative writing significantly affected students' writing ability in a positive way. Considering Iranian educational conventions, which provide little space for collaborative learning, the findings are of great value. This shows that teaching practitioners need to make principled decisions in revisiting educational conventions and pave the way for more integrated educational mediums.

6. Discussion

The objective of the study was to investigate the comparative effect of online summative assessment and online formative assessment (in two modes, collaborative e-writing and student electronic portfolio writing) on student writing ability. Although both forms of summative and formative assessment improved student writing quality, collaborative electronic writing had the highest significant impact on student writing ability. The potential benefits of formative portfolio assessment, in both the online portfolio writing and the collaborative log writing forms, can be attributed to its process-oriented nature. In both online modalities, students were engaged with processes such as collecting writing pieces, reflecting on them, and receiving scaffolding and feedback from the teacher (e-portfolio writing) or fellow students (collaborative writing in EWF). An inherent feature of summative and formative assessment is reflection. The reflection that occurred in EWF helped students improve their higher cognitive abilities (managing discourse through metacognitive strategies). Formative learning designs are supported by studies both in EFL (Hassaskhah & Sharifi, 2011; Roohani & Taheri, 2015) and in other contexts (Lin, Preston, Kharrufa, & Kong, 2016).

Integral to formative and summative assessment is feedback, which scaffolds learners to bridge the gap between their actual level of development, determined by individual problem solving, and their potential level of development, determined through collaboration with adult guidance or more capable peers. The results of this study indicated that formative assessment in the collaborative log writing (EWF) mode led to better student writing output, which confirms that students' language needs are recognized and responded to in the most "individualized, meaningful, timely, constant and manageable" way in collaboration with other students (Barootchi & Keshavarz, 2002; Li, Link, & Hegelheimer, 2015). It is supported by research that interaction in L2 includes interactional feedback on learners' language mistakes, and this helps learners notice the gap between their erroneous forms and target-like forms and encourages them to modify their output (Rassaei, 2013). In addition, the review of forty-four articles by Tenório, Bittencourt, Isotani, and Silva (2016) on the benefits of peer assessment supports the results of this study, since peer assistance in the collaborative log writing (EWF) type of formative assessment in online contexts was shown to improve student writing performance. Self-paced and self-regulated learning in online formative assessment in this research indicated how learners negotiate meaning and form to manage their communication, solve communication problems and repair discourse (in collaborative writing) and self-monitor and self-assess (e-portfolio writing). Learner autonomy is another possible benefit of formative assessment. In both types of formative assessment in this research, students gained more responsibility for their learning, which rests on the fact that student-centered learning is at the heart of portfolio assessment.

Table 7
Repeated Measures ANCOVA; Posttests of Online Portfolio and Summative Assessment with Pretest.

Effect             Test                Value  F       Hypothesis df  Error df  Sig.   Partial Eta Squared
Writing            Pillai's Trace      0.082  11.476  1              128       0.001  0.082
                   Wilks' Lambda       0.918  11.476  1              128       0.001  0.082
                   Hotelling's Trace   0.090  11.476  1              128       0.001  0.082
                   Roy's Largest Root  0.090  11.476  1              128       0.001  0.082
Writing * Pretest  Pillai's Trace      0.125  18.304  1              128       0.000  0.125
                   Wilks' Lambda       0.875  18.304  1              128       0.000  0.125
                   Hotelling's Trace   0.143  18.304  1              128       0.000  0.125
                   Roy's Largest Root  0.143  18.304  1              128       0.000  0.125
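The partial eta squared values in Table 7 (and later in Table 11) follow directly from F and the degrees of freedom. A minimal check, using the standard identity partial η² = F·df_hyp / (F·df_hyp + df_err):

```python
def partial_eta_squared(f, df_hyp, df_err):
    """Partial eta squared from an F statistic: F*df_h / (F*df_h + df_e)."""
    return f * df_hyp / (f * df_hyp + df_err)

# Table 7: Writing effect, F(1, 128) = 11.476
print(round(partial_eta_squared(11.476, 1, 128), 3))  # -> 0.082
# Table 11: Writing effect, F(1, 127) = 8.315
print(round(partial_eta_squared(8.315, 1, 127), 3))   # -> 0.061
```

These reproduced values (0.082 and 0.061) are the effect sizes the text describes as moderate for the portfolio and EWF comparisons respectively.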


Table 8
Test of Linear Relationship between Posttest of EWF and Pretest.

Post3 * Pretest                              Sum of Squares  df   Mean Square  F        Sig.
Between Groups  (Combined)                   1407.278        4    351.820      42.851   0.000
                Linearity                    1403.652        1    1403.652     170.962  0.000
                Deviation from Linearity     3.627           3    1.209        0.147    0.931
Within Groups                                1026.291        125  8.210
Total                                        2433.569        129

Table 9
Test of Linear Relationship between Posttest of EWF and Posttest of Summative Assessment.

Post3 * Post1                                Sum of Squares  df   Mean Square  F        Sig.
Between Groups  (Combined)                   1921.116        6    320.186      76.852   0.000
                Linearity                    1881.693        1    1881.693     451.648  0.000
                Deviation from Linearity     39.423          5    7.885        1.892    0.100
Within Groups                                512.453         123  4.166
Total                                        2433.569        129
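The linearity tests in Tables 8 and 9 decompose the between-groups sum of squares into a linear component and a deviation-from-linearity component; each F is that component's mean square over the within-groups mean square. A quick arithmetic check against the Table 9 values:

```python
# Values from Table 9 (posttest of EWF vs. posttest of summative assessment).
ss_between, df_between = 1921.116, 6
ss_linear, df_linear = 1881.693, 1
ss_within, df_within = 512.453, 123

# Deviation from linearity is the between-groups SS minus the linear SS.
ss_dev = ss_between - ss_linear      # -> 39.423
df_dev = df_between - df_linear      # -> 5
ms_dev = ss_dev / df_dev             # -> 7.885, the mean square in Table 9
ms_within = ss_within / df_within    # -> 4.166
f_dev = ms_dev / ms_within           # -> 1.892, the non-significant deviation F
print(round(ss_dev, 3), round(ms_dev, 3), round(f_dev, 3))
```

The non-significant deviation F (p = .100) is what justifies treating the relationship between the two posttests as linear, so the first posttest can serve as a covariate in the ANCOVA.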

Table 10
Descriptive Statistics; Posttests of EWF and Posttest of Portfolio with Covariates.

Writing                       Mean     Std. Error  95% CI Lower Bound  95% CI Upper Bound
Posttest of Online Portfolio  20.877a  0.111       20.657              21.097
Posttest of EWF               35.815a  0.182       35.455              36.175

a Covariates appearing in the model are evaluated at the following values: Pretest = 8.48, Post1 = 13.42.

Fig. 3. Posttest of EWF and online portfolio with pretest and summative assessment.

The findings are in line with many other studies when it comes to electronic portfolio assessment. Chang, Tseng, Liang, and Liao's (2013) investigation of web-based portfolio assessment with the purpose of promoting self-regulation indicated that technology-enhanced learning and interactive digital learning environments are helpful in students' regulating their own learning and increasing learning quality. The findings of this study are also supported by other studies in the literature as far as achievements through online collaboration are concerned. For example, a cluster analysis of online collaboration in terms of self-regulation and socioemotional interactions indicated that online collaboration resulted in intensive collaboration among members, positive socioemotional interactions and group regulatory behaviors (Kwon, Liu, & Johnson, 2014). In addition, the significant effect of computer-mediated collaboration in this study is supported by research on computer-mediated collaboration and its mediating role in interaction through reflection (Lavoué, Molinari, Prié, & Khezami, 2015). In their study, reflection in action and on action helped students to work on what works for interaction rather than on how to plan to improve interaction. Besides, reflection reports were more focused on others' actions at the beginning of the task, whereas later reflection reports were more on self-actions, which can be an incidence of scaffolding.

However, the instruments used in the aforementioned studies were web-based portfolio assessment applications and a self-regulation questionnaire, whereas in this study students were assessed through a web-based intervention and performance-based tests. In addition, the most common methods for gathering data in the aforementioned studies were student text analysis and observation, whereas this study was empirical, and this gives more credibility to the results. However, the researcher acknowledges the need for more triangulation of results through various data sources to increase the reliability and validity of further research. Besides, the assessment studies utilized means-based statistical analysis, which masks descriptive analysis of natural ecological class validity (Al-Jarrah, 2016). Despite its potential for communication and interaction, synchronous computer-mediated

Table 11
Repeated Measures ANCOVA; Posttests of EWF and Online Portfolio with Covariates.

Effect             Test                Value  F       Hypothesis df  Error df  Sig.   Partial Eta Squared
Writing            Pillai's Trace      0.061  8.315   1              127       0.005  0.061
                   Wilks' Lambda       0.939  8.315   1              127       0.005  0.061
                   Hotelling's Trace   0.065  8.315   1              127       0.005  0.061
                   Roy's Largest Root  0.065  8.315   1              127       0.005  0.061
Writing * Pretest  Pillai's Trace      0.092  12.792  1              127       0.000  0.092
                   Wilks' Lambda       0.908  12.792  1              127       0.000  0.092
                   Hotelling's Trace   0.101  12.792  1              127       0.000  0.092
                   Roy's Largest Root  0.101  12.792  1              127       0.000  0.092
Writing * Post1    Pillai's Trace      0.362  72.185  1              127       0.000  0.362
                   Wilks' Lambda       0.638  72.185  1              127       0.000  0.362
                   Hotelling's Trace   0.568  72.185  1              127       0.000  0.362
                   Roy's Largest Root  0.568  72.185  1              127       0.000  0.362


communication "has not been characterized as a venue where assessment and correction naturally takes place" (Akiyama, 2017, p. 59). This study contributed to the field by mitigating the unbalanced number of studies utilizing asynchronous mediated communication techniques in assessment, via utilizing a novel synchronous electronic writing forum for assessment purposes.

Despite its interesting findings, this study is limited in several ways. Missing from the study is an account of psycholinguistic aspects. There is an overwhelming number of studies suggesting a mediating effect of technology-enhanced learning environments on learner engagement and satisfaction (Henrie, Halverson, & Graham, 2015; Hu & Hui, 2012; Nguyen, Rienties, Toetenel, Ferguson, & Whitelock, 2017) and on teacher and student motivation (Biškupić, Lacković, & Jurina, 2015). Therefore, research is needed to investigate how to recognize and measure the related psychological aspects that prime learning, and to investigate their mediating role in testing behaviors and assessment quality. In addition, despite the contribution this study made, it might be affected by sources of error that the researcher failed to control for. For example, it is suggested by research (Nguyen et al., 2017) that the design of computer-made assessment may affect student engagement, satisfaction and pass rate. The time spent on task as a result of student engagement may mediate the results. The time students spent on each of the formative assessment tasks was not equal. This might affect the benefits and advantages students derive from the assessment tasks. Moreover, there is no clear-cut distinction between individual performance and collaborative performance, which means it is not clear whether individual performance mediates the collaborative one and, in turn, student writing ability, or vice versa. Therefore, as suggested in the literature (Kent, Laslo, & Rafaeli, 2016), patterns of interactivity in online collaborations should be investigated, which can inform us not only about collaboration and group dynamics but also about individual learning. Besides, it is suggested that the content and quality of interaction in collaboration is a predictor of in-

This study has important implications for improved pedagogical practices and instructional outcomes. As indicated in this research, assessment in a collaboration context had considerable potential for enhanced learning. Formative assessment through collaboration maximizes learning opportunities and enhances learner products. Student engagement and students' management of discourse make for an individualized learning environment. Formative assessment could descend into summative assessment if it does not take into account learners' monitoring of their own learning. The interactive feedback and decision making for the next step that occurred in the collaborative formative assessment design can increase teachers' awareness of their learners' understanding and progress, and can also help them understand how to set up a formative assessment system. The collaborative formative assessment investigated in this research can help testing practitioners assure the quality of technology-enhanced formative assessment because it creates contingency. Contingency is defined as real-time adjustments in teaching and professional decisions informed by on-demand, real classroom evidence, which help teachers decide when and how to intervene to better respond to learning problems (Sheard & Chambers, 2014). The potentials pertaining to the assessment techniques investigated in this study indicate that using engaging technology along with appropriate assessment strategies is a powerful way of making learning efficient. Resource developers and policy makers need to design and afford technologies that are best placed to assure the quality of formative assessment. According to Lucas, Oliveira, Farias, and Alencar (2017), research on synchronic technologies such as computers and software that promote collaborative learning suggests that they suffer from a number of problems, such as a lack of interplay for reuse activities, inefficacy in basing group work on distinct responsibilities, and inefficacy in group work conducted collaboratively and in parallel. Therefore, education technology designers can design and produce technologies that promote collaborative learning through ac-
dividual progress (Yücel & Usluel, 2016). Therefore, research is needed counting for interactional practice as well as individual accountability.
to investigate how interaction and engagements are mediated by
technology enhanced learning environment. Conflict of interest
In addition, achievement-based etic perspective may mask student
and teacher emic perspectives. Practitioners should make principled None
decisions when it comes to technology implementation in education.
For example, in education ecologies where conventional text-based Funding
instruction is practiced such as in Iran, it might be difficult to overcome
the idea of change towards more learner-directed and other material There is no funding for this research.
based instruction such as computers. Teaching practitioners and ma-
terial developers need to make principled decisions about initiating a References
change and monitoring the consequences of it when it comes to im-
plementation of computers in education. Therefore, multidimensional Akiyama, Y. (2017). Learner beliefs and corrective feedback in telecollaboration: A
emic perspective which takes into account student, teacher and macro longitudinal investigation. System, 64, 58–73.
Al-Jarrah, R. S. (2016). A suggested model of corrective feedback provision. Ampersand,
policy makers’ ideas is required to investigate the mediating factors 3, 98–107.
affecting the efficacy of computer induction education. The mediating Asoodar, M., Atai, M. R., Vaezi, S., & Marandi, S. S. (2014). Examining effectiveness of
factors may include how students' collaboration and group membership communities of practice in online English for academic purposes (EAP) assessment in
virtual classes. Computers & Education, 70, 291–300.
affect quality of their performance, how teacher in-service and pre- Barootchi, N., & Keshavarz, M. H. (2002). Assessment of achievement through portfolios
service induction programs improve the efficacy of computer assisted and teacher-made tests. Educational Research, 44(3), 279–288.
education, and how environmental support such as infrastructure ser- Behizadeh, N., & Pang, M. E. (2016). Awaiting a new wave: The status of state writing
assessment in the United States. Assessing Writing, 29, 25–41.
vices including internet quality and speed which are determined at
Bennett (1998). Reinventing assessment speculations on the future of large-scale educa-
policy making level can affect the efficacy of technology mediated tional testing. A Policy Information Perspective.
learning and assessment in practice. Bennett (2009). A critical look at the meaning and basis of formative assessment. Princeton:
Educational Testing Service.
Biškupić, I. O., Lacković, S., & Jurina, K. (2015). Successful and proactive e-learning
7. Conclusion environment fostered by teachers’ motivation in technology use. Procedia-Social and
Behavioral Sciences, 174, 3656–3662.
This study is an investigation of the impact of online formative and Black, P., & Wiliam, D. (2009). Developing the theory of formative assessment.
Educational Assessment, Evaluation and Accountability (Formerly: Journal of Personnel
summative student assessments on their writing ability. Online sum- Evaluation in Education), 21(1) 5.
mative assessment of student electronic writing helped learners im- Broad, B. (2003). What we really value: Beyond rubrics in teaching and assessing writing.
prove their writing from pretest to posttests. The online individual University Press of Colorado.
Broadbent, J., Panadero, E., & Boud, D. (2017). Implementing summative assessment
formative assessment of student writing through portfolio writing and with a formative flavour: A case study in a large class. Assessment & Evaluation in
online collaborative formative assessment through E-writing forum in- Higher Education, 1–16.
dicated that student performance was improved through technology Burner, T. (2014). The potential formative benefits of portfolio assessment in second and
foreign language writing contexts: A review of the literature. Studies in Educational
enhanced learning environment.

38
Z. Mohamadi Studies in Educational Evaluation 59 (2018) 29–40

Evaluation, 43, 139–149. Li, Z., Link, S., Ma, H., Yang, H., & Hegelheimer, V. (2014). The role of automated writing
Carless, D., Salter, D., Yang, M., & Lam, J. (2011). Developing sustainable feedback evaluation holistic scores in the ESL classroom. System, 44, 66–78.
practices. Studies in Higher Education, 36(4), 395–407. Lin, Q. (2008). Preservice teachers’ learning experiences of constructing e-portfolios
Cassady, J. C., & Gridley, B. E. (2005). The effects of online formative and summative online. The Internet and Higher Education, 11(3), 194–200.
assessment on test anxiety and performance. Journal of Technology, Learning, and Lin, O. P., & Maarof, N. (2013). Collaborative writing in summary writing: Student per-
Assessment, 4(1), n1. ceptions and problems. Procedia-Social and Behavioral Sciences, 90, 599–606.
Challob, A.a. I., Bakar, N. A., & Latif, H. (2016). Collaborative blended learning writing Lin, M., Preston, A., Kharrufa, A., & Kong, Z. (2016). Making L2 learners’ reasoning skills
environment: Effects on EFL students’ writing apprehension and writing perfor- visible: The potential of computer supported collaborative learning environments.
mance. English Language Teaching, 9(6), 229. Thinking Skills and Creativity, 22, 303–322.
Chang, C.-C., Tseng, K.-H., Liang, C., & Liao, Y.-M. (2013). Constructing and evaluating Llamas-Nistal, M., Fernández-Iglesias, M. J., González-Tato, J., & Mikic-Fonte, F. A.
online goal-setting mechanisms in web-based portfolio assessment system for facil- (2013). Blended e-assessment: Migrating classical exams to the digital world.
itating self-regulated learning. Computers & Education, 69, 237–249. Computers & Education, 62, 72–87.
Cheng, G. (2017). The impact of online automated feedback on students’ reflective Lucas, E. M., Oliveira, T. C., Farias, K., & Alencar, P. S. (2017). CollabRDL: A language to
journal writing in an EFL course. The Internet and Higher Education, 34, 18–27. coordinate collaborative reuse. Journal of Systems and Software.
Cheng, K.-H., Liang, J.-C., & Tsai, C.-C. (2015). Examining the role of feedback messages McMillan, J. H., Venable, J. C., & Varier, D. (2013). Studies of the effect of formative
in undergraduate students’ writing performance during an online peer assessment assessment on student achievement: So much more is needed. Practical Assessment,
activity. The Internet and Higher Education, 25, 78–84. Research & Evaluation, 18.
Cho, H. (2017). Synchronous web-based collaborative writing: Factors mediating inter- Miyazoe, T., & Anderson, T. (2010). Learning outcomes and students’ perceptions of
action among second-language writers. Journal of Second Language Writing. online writing: Simultaneous implementation of a forum, blog, and wiki in an EFL
Conole, G., & Warburton, B. (2005). A review of computer-assisted assessment. ALT-J, blended learning setting. System, 38(2), 185–199.
13(1), 17–31. Mo, Y., & Troia, G. A. (2017). Similarities and differences in constructs represented by US
Cope, B., Kalantzis, M., McCarthey, S., Vojak, C., & Kline, S. (2011). Technology-mediated States’ middle school writing tests and the 2007 national assessment of educational
writing assessments: Principles and processes. Computers and Composition, 28(2), progress writing assessment. Assessing Writing, 33, 48–67.
79–96. Mohamadi, Z. (2018). Comparative effect of project-based learning and electronic pro-
Crisp, V., & Ward, C. (2008). The development of a formative scenario-based computer ject-based learning on the development and sustained development of english idiom
assisted assessment tool in psychology for teachers: The PePCAA project. Computers & knowledge. Journal of Computing in Higher Education, 1–23.
Education, 50(4), 1509–1526. Mohamadi, Z. (2017). Interactional complexity development, interactional demonstrators
Dippel, H. C., Neundorf, V., & Yakimchuk, V. (2008). WebSIS–a web based portal with an and interaction density in collaborative and e-collaborative writing modalities.
integrated e-assessment environment. Interactive Computer Aided Learning ICL2008, Journal of Teaching Language Skills, 36(2), 75–102.
24(26.9). Naghdipour, B. (2016). English writing instruction in Iran: Implications for second lan-
Driessen, E., Van Der Vleuten, C., Schuwirth, L., Van Tartwijk, J., & Vermunt, J. (2005). guage writing curriculum and pedagogy. Journal of Second Language Writing, 32,
The use of qualitative research criteria for portfolio assessment as an alternative to 81–87.
reliability evaluation: A case study. Medical Education, 39(2), 214–220. Newhouse, C. P. (2011). Using IT to assess IT: Towards greater authenticity in summative
Evans, C. (2013). Making sense of assessment feedback in higher education. Review of performance assessment. Computers & Education, 56(2), 388–402.
Educational Research, 83(1), 70–120. Nezakatgoo, B. (2011). The effects of portfolio assessment on writing of EFL students.
Farhady, H., & Hedayati, H. (2009). Language assessment policy in Iran. Annual Review of English Language Teaching, 4(2), 231.
Applied Linguistics, 29, 132–141. Nguyen, Q., Rienties, B., Toetenel, L., Ferguson, R., & Whitelock, D. (2017). Examining
Gikandi, J. W., Morrow, D., & Davis, N. E. (2011). Online formative assessment in higher the designs of computer-based assessment and its impact on student engagement,
education: A review of the literature. Computers & Education, 57(4), 2333–2351. satisfaction, and pass rates. Computers in Human Behavior.
Han, J. H., & Finkelstein, A. (2013). Understanding the effects of professors’ pedagogical Nicol, D. J., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated
development with Clicker Assessment and Feedback technologies and the impact on learning: A model and seven principles of good feedback practice. Studies in Higher
students’ engagement and learning in higher education. Computers & Education, 65, Education, 31(2), 199–218.
64–76. Nixon, R., & McClay, J. K. (2007). Collaborative writing assessment: Sowing seeds for
Hassaskhah, J., & Sharifi, A. (2011). The role of portfolio assessment and reflection on transformational adult learning. Assessing Writing, 12(2), 149–166.
process writing. Asian EFL Journal, 13(1). Olson, B. L., & McDonald, J. L. (2004). Influence of online formative assessment upon
Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, student learning in biomedical science courses. Journal of Dental Education, 68(6),
77(1), 81–112. 656–659.
Henrie, C. R., Halverson, L. R., & Graham, C. R. (2015). Measuring student engagement in Pachler, N., Daly, C., Mor, Y., & Mellar, H. (2010). Formative e-assessment: Practitioner
technology-mediated learning: A review. Computers & Education, 90, 36–53. cases. Computers & Education, 54(3), 715–721.
Hinostroza, J. E., & Mellar, H. (2001). Pedagogy embedded in educational software de- Peterson, E. R., & Irving, S. E. (2008). Secondary school students’ conceptions of as-
sign: Report of a case study. Computers & Education, 37(1), 27–40. sessment and feedback. Learning and Instruction, 18(3), 238–250.
Hu, P. J.-H., & Hui, W. (2012). Examining the role of learning engagement in technology- Rabiee, A., Nazarian, Z., & Gharibshaeyan, R. (2013). An explanation for internet use
mediated learning and its effects on learning effectiveness and satisfaction. Decision obstacles concerning e-learning in Iran. The International Review of Research in Open
Support Systems, 53(4), 782–792. and Distributed Learning, 14(3), 361–376.
Huff, K. L., & Sireci, S. G. (2001). Validity issues in computer-based testing. Educational Rakedzon, T., & Baram-Tsabari, A. (2017). To make a long story short: A rubric for as-
Measurement: Issues and Practice, 20(3), 16–25. sessing graduate students’ academic and popular science writing skills. Assessing
Imhof, M., & Picard, C. (2009). Views on using portfolio in teacher education. Teaching Writing, 32, 28–42.
and Teacher Education, 25(1), 149–154. Rashad, A., Youssif, A. A., Abdel-Ghafar, R., & Labib, A. E. (2008). E-assessment tool: A
Jordan, S. (2012). Student engagement with assessment and feedback: Some lessons from course assessment tool integrated into knowledge assessment. Innovative techniques in in-
short-answer free-text e-assessment questions. Computers & Education, 58(2), struction technology, e-learning, e-assessment, and education. Springer7–12.
818–834. Rassaei, E. (2013). Corrective feedback, learners’ perceptions, and second language de-
Kent, C., Laslo, E., & Rafaeli, S. (2016). Interactivity in online discussions and learning velopment. System, 41(2), 472–483.
outcomes. Computers & Education, 97, 116–128. Rezaei, A. R., & Lovorn, M. (2010). Reliability and validity of rubrics for assessment
Kingston, N., & Nash, B. (2011). Formative assessment: A meta-analysis and a call for through writing. Assessing Writing, 15(1), 18–39.
research. Educational Measurement: Issues and Practice, 30(4), 28–37. Roohani, A., & Taheri, F. (2015). The effect of portfolio assessment on EFL learners’ ex-
Kwon, K., Liu, Y.-H., & Johnson, L. P. (2014). Group regulation and social-emotional pository writing ability. Iranian Journal of Language Testing, 5(1), 46–59.
interactions observed in computer supported collaborative learning: Comparison Sheard, M. K., & Chambers, B. (2014). A case of technology-enhanced formative assess-
between good vs. poor collaborators. Computers & Education, 78, 185–200. ment and achievement in primary grammar: How is quality assurance of formative
Lam, R. (2017). Taking stock of portfolio assessment scholarship: From research to assessment assured? Studies in Educational Evaluation, 43, 14–23.
practice. Assessing Writing, 31, 84–97. Shojaei, A., & Fatemi, M. (2016). The effect of E-assessment on writing skill of iranian
Lavoué, É., Molinari, G., Prié, Y., & Khezami, S. (2015). Reflection-in-action markers for upper intermediate efl learners. Journal of Fundamental and Applied Sciences, 8(2S),
reflection-on-action in Computer-supported collaborative learning settings. 1041–1057.
Computers & Education, 88, 129–142. Shojaei, A., & Motamedi, A. (2014). Applicability of E-assessment in iran as an efl context:
Lee, I. (2010). Writing teacher education and teacher learning: Testimonies of four EFL From fantasy to reality. International Journal of Language Learning and Applied
teachers. Journal of Second Language Writing, 19(3), 143–157. Linguistics World, 93.
Lee, I., & Coniam, D. (2013). Introducing assessment for learning for EFL writing in an Siozos, P., Palaigeorgiou, G., Triantafyllakos, G., & Despotakis, T. (2009). Computer based
assessment of learning examination-driven system in Hong Kong. Journal of Second testing using “digital ink”: Participatory design of a Tablet PC based assessment ap-
Language Writing, 22(1), 34–50. plication for secondary education. Computers & Education, 52(4), 811–819.
Lei, X. (2008). Exploring a sociocultural approach to writing strategy research: Mediated Stevenson, M. (2016). A critical interpretative synthesis: The integration of automated
actions in writing activities. Journal of Second Language Writing, 17(4), 217–236. writing evaluation into classroom writing instruction. Computers and Composition, 42,
Lew, M. D., Alwis, W., & Schmidt, H. G. (2010). Accuracy of students’ self-assessment and 1–16.
their beliefs about its utility. Assessment & Evaluation in Higher Education, 35(2), Stevenson, M., & Phakiti, A. (2014). The effects of computer-generated feedback on the
135–156. quality of writing. Assessing Writing, 19, 51–65.
Li, J., Link, S., & Hegelheimer, V. (2015). Rethinking the role of automated writing Tarighat, S., & Khodabakhsh, S. (2016). Mobile-assisted language assessment: Assessing
evaluation (AWE) feedback in ESL writing instruction. Journal of Second Language speaking. Computers in Human Behavior, 64, 409–413.
Writing, 27, 1–18. Tenório, T., Bittencourt, I. I., Isotani, S., & Silva, A. P. (2016). Does peer assessment in on-

39
Z. Mohamadi Studies in Educational Evaluation 59 (2018) 29–40

line learning environments work? A systematic review of the literature. Computers in Wang, T.-H., Wang, K.-H., & Huang, S.-C. (2008). Designing a web-based assessment
Human Behavior, 64, 94–107. environment for improving pre-service teacher assessment literacy. Computers &
Varonis, E. M., & Gass, S. (1985). Non-native/non-native conversations: A model for Education, 51(1), 448–462.
negotiation of meaning. Applied Linguistics, 6(1), 71–90. Yeh, S.-W., Lo, J.-J., & Chu, H.-M. (2014). Application of online annotations to develop a
Vo, H. M., Zhu, C., & Diep, N. A. (2017). The effect of blended learning on student per- web-based Error Correction Practice System for English writing instruction. System,
formance at course-level in higher education: A meta-analysis. Studies in Educational 47, 39–52.
Evaluation, 53, 17–28. Yilmaz, R. (2017). Exploring the role of E-learning readiness on student satisfaction and
Vojak, C., Kline, S., Cope, B., McCarthey, S., & Kalantzis, M. (2011). New spaces and old motivation in flipped classroom. Computers in Human Behavior.
places: An analysis of writing assessment software. Computers and Composition, 28(2), Yücel, Ü. A., & Usluel, Y. K. (2016). Knowledge building and the quantity: Content and
97–111. quality of the interaction and participation of students in an online collaborative
Vygotsky, L. S. (1980). Mind in society: The development of higher psychological processes. learning environment. Computers & Education, 97, 31–48.
Harvard university press.
