
http://www.ifets.info/journals/5_3/deepwell.html
Towards Capturing Complexity: an interactive framework for institutional
evaluation
Frances Deepwell
Centre for Higher Education Development, Coventry University
Address: CHED, Coventry University, Priory Street, Coventry CV1 5FB
United Kingdom
Tel: +44 24 76887590
Fax: +44 24 76887599
f.deepwell@coventry.ac.uk

ABSTRACT
This article describes the strategy we adopted to evaluate the
implementation of a virtual learning environment (VLE) across the
institution. The paper presents the theoretical model we developed
and the methods used. The availability of a VLE has affected many
aspects of University life and practices, and we are endeavouring to
understand the impact of the changes on staff and students in terms
of motivation, achievement and satisfaction, as well as on the
curriculum and modes of delivery. Through the presentation of this
University-wide case study, the article explores a possible
framework for an evaluation that captures the complexity of a
major development such as this.
Keywords: Evaluation, Virtual learning environment, WebCT,
Higher education research, Online learning

Introduction
This article discusses the evaluation of the implementation of a virtual learning
environment (VLE) at Coventry University, UK. The development and integration of
VLEs is demanding significant resources and investment across the sector in the UK.
As institution after institution adopts a system for online delivery of their teaching and
learning, it is timely to reflect on the process and the possible outcomes of these types

of adoption. The availability of a VLE affects many aspects of University life and
practices. At Coventry, we are endeavouring to understand the impact of the changes
on staff and students in terms of motivation, achievement and satisfaction as well as
on the curriculum and modes of delivery. Through the presentation of our evaluation
model, the article offers a framework for capturing the complexity of a major
development such as this. Examples of the findings from the first two years are used
to illustrate the nature of the Coventry evaluation model.

The evaluation strategy


As the title of this paper suggests, we are concerned with complexity. Our research is
both multi-disciplinary and multi-method and, similar to the large-scale evaluation
study carried out by Anderson et al. (2000), adopts an "eclectic approach to the
selection and combination of research methods". The evaluation research from which
our evidence is drawn is carried out by students and lecturers together with
educational researchers.
From the outset, there have been three distinct layers to our evaluation of the
implementation of a VLE, each with a discrete audience for the results. Firstly, the
evaluation is being used to guide the further development of the technology and the
associated staff and educational development. This serves the needs of the VLE
development team and the users (both teachers and learners). Secondly, it is informing
decision-making around policy and practice in relation to teaching and learning and
technology. The primary audience for this layer of evaluation is management and
administration. Whilst the infrastructure for online learning continues to be refined
and modified, the findings have also been able to influence decisions made at various
levels of University administration, both central and devolved. Thus the evaluation
can be perceived as a cycle of action-reflection-action-reflection (Deepwell & Cousin
2002), with interim results feeding into relevant forums across the institution. Thirdly,
it is a vehicle for us to disseminate our experiences to a wider academic community in
this field and thereby offer a contribution to theory-building in the area of technology
in teaching and learning. The maintenance of a web site indicating the evaluation
findings and approaches, together with conference presentations and articles such as
this, serves this third aim.

Background

In 1999, Coventry University launched a virtual learning environment comprising


WebCT plus additional web-based facilities (a module information directory, previous
exam papers, and study skills materials) to enhance teaching and learning at the
University. An automated procedure generated a web presence for all modules based
on a common template. Student registration on modules was updated on a nightly
basis using an automated routine. Use of the VLE was encouraged and centrally
supported, but not compulsory.
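To picture how such a routine might work, here is a minimal sketch of a nightly reconciliation between a student record export and the module spaces in a VLE. The file format, function names and the ModuleSpaceAdmin interface are hypothetical illustrations, not the actual Coventry/WebCT implementation.

```python
# A minimal sketch of a nightly enrolment sync, under assumed names and formats.
# ModuleSpaceAdmin stands in for whatever admin interface the VLE exposes;
# it is hypothetical, not the actual WebCT routine used at Coventry.
import csv

class ModuleSpaceAdmin:
    def enrol(self, student_id, module):
        print(f"enrol {student_id} on {module}")

    def unenrol(self, student_id, module):
        print(f"remove {student_id} from {module}")

def load_registrations(path):
    """Read (student_id, module_code) pairs exported from the student record system."""
    with open(path, newline="") as f:
        return {(row["student_id"], row["module_code"]) for row in csv.DictReader(f)}

def nightly_sync(vle, today_csv, yesterday_csv):
    """Reconcile VLE module spaces against the latest student record export."""
    current = load_registrations(today_csv)
    previous = load_registrations(yesterday_csv)
    for student_id, module in sorted(current - previous):
        vle.enrol(student_id, module)    # new registrations since last night
    for student_id, module in sorted(previous - current):
        vle.unenrol(student_id, module)  # withdrawals and module changes

# Example call (file names assumed):
# nightly_sync(ModuleSpaceAdmin(), "registrations_today.csv", "registrations_yesterday.csv")
```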
With such a large-scale move into a new area of teaching and learning, we needed to
ensure that as the implementation was rolled out we were informed of the impact and
consequences of the change, including unintended phenomena. The findings needed to
be delivered in as timely and insightful a way as possible, representing the many
voices touched by the development. This information would enable us to review
factors under our control in order to improve matters and provide evidence to others
of the need for change elsewhere in the institution. Our intention in conducting an
evaluation, therefore, was to inform developments as they progressed; the evaluation
has accordingly become more of a process (ongoing reflection and feedforward) than a
product. For these
reasons, we felt outsider evaluation would not yield as rich a return as an in-house
process. We are seeking a deeper understanding of the changes in the teaching and
learning environment both "prospectively" and "retrospectively", therefore our
research is evaluation for development and for knowledge rather than evaluation for
accountability (Chelimsky, 1997).

Evaluation models
In considering the nature of our evaluation, we can position it in regard to the three
measures outlined by Oliver and Conole (1998): authenticity, exploration and scale.
The evaluation study remains highly "authentic" in that it attempts to capture the
effects of the VLE as it operates in real time. The study is "exploratory" in that it
observes, describes and interprets events. Finally, it is "large scale"; it compiles the
big picture of what is happening, illustrated by university-wide phenomena as well as
smaller in-depth studies.
A number of theoretical models have been considered along the way. The introduction
of an institution-wide online learning environment can variously be interpreted as
institutional change, curriculum development, technology application, cultural shift,
learning opportunities, widening participation measure or even product evaluation;
each interpretation serving a discrete audience, or group of stakeholders (Pawson and
Tilley, 1997). The initial understanding of the nature of the intervention significantly

impacts on the evaluation strategy and model adopted. In the case of Coventry
University, the introduction of a VLE was clearly directed by senior management. The
intentions for such a move were a fundamental component of the University's teaching
and learning strategy. Concurrently there was an expectation that the universal
implementation of such a VLE would be designed to serve the needs of the teaching
and learning processes and new developments at both a macro and a micro level. At
the macro level, the institutional procedures and support mechanisms would embrace
the VLE. At the micro level, individual learners and teachers would be able to integrate
their use of the VLE with existing and evolving teaching and learning methods. A
lynchpin of our evaluation, therefore, is the question of how these intentions, both
implicit and explicit, have been met.
In our evaluation we have drawn on the strengths of illuminative evaluation and have
sought to study the innovation within the "learning milieu" (Parlett and Hamilton
1977). Its methods, too, are akin to our preferred methods of data collection, namely
observation, interviews, questionnaires, document analysis and background
information. We were concerned however not to adopt the outsider stance associated
with illuminative evaluation. We felt that an illuminative approach might produce a
one-off report with recommendations for actions around phenomena but would not
support the evolution of the innovation across time. Such an approach was therefore
seen as not sufficiently productive for our needs and we were drawn to Robert Stake's
"countenance" model (Stake 1967).

Countenance model of evaluation


The "countenance" model of evaluation seemed more appropriate because its
suggested matrices for descriptive and judgmental data are able to support the study of
an evolving programme across time, looking at the antecedents as well as the intended
and unintended consequences of the programme. Robert Stake's "countenance model"
(Stake, 1967) was originally formulated for curriculum studies in the late 1960s. The
countenance model aims to capture the complexity of an educational innovation or
change by comparing intended and observed outcomes at varying levels of operation.
The congruence between the intentional and the observational accounts provides the
basis for judging the success or otherwise of the innovation, whilst at the same time
allowing for the recording of unintended outcomes. A summary model of Stake's data
matrix is shown in Figure 1.

Figure 1: Stake's matrix for processing descriptive data (adapted)
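To indicate how we use the matrix in practice, here is a minimal sketch, in Python, of the descriptive side of the countenance framework: intents and observations are recorded at each of Stake's three levels (antecedents, transactions, outcomes) and paired for congruence judgements. The structure and field names are our own illustrative assumptions, and the example entries echo the usage figures reported later in this article.

```python
# Minimal sketch of Stake's descriptive data matrix: intents and observations
# recorded at each programme level, then paired for congruence judgements.
# The structure and example entries are illustrative, not the full dataset.
from dataclasses import dataclass, field

@dataclass
class Cell:
    intents: list = field(default_factory=list)
    observations: list = field(default_factory=list)

LEVELS = ("antecedents", "transactions", "outcomes")
matrix = {level: Cell() for level in LEVELS}

# Example drawn from the usage findings reported below.
matrix["transactions"].intents.append("over 10% active use of the VLE")
matrix["transactions"].observations.append("around 23% of modules using WebCT")

def congruence_pairs(matrix):
    """Yield each intent alongside the observations recorded at the same level."""
    for level in LEVELS:
        cell = matrix[level]
        for intent in cell.intents:
            yield level, intent, cell.observations

for level, intent, observed in congruence_pairs(matrix):
    print(f"{level}: intended {intent!r}; observed {observed}")
```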

Our evaluation, however, deviates from the model in two significant respects. Firstly,
it is conducted by internal agents who are active and reflective participants in the
development. Secondly, it is an inextricable part of the development cycle of the
implementation of a VLE in higher education. Therefore, our approach also lies close
to the emerging notion of "action evaluation" (Rothman and Friedman, 2002), and the
collaborative research findings continually feed into the further evolution of the online
learning developments. In this action evaluation model, intended outcomes are
compared to observed outcomes to reveal the areas for further action and revision of
intentions. It is flexible and cyclical in allowing the findings to change the direction of
enquiry. This is particularly relevant in the current case, where we were introducing an
innovation that has enabled new departures in practice.
Action evaluation is a form of evaluation that has its roots in conflict resolution. In
many regards, this is a valid metaphor for our research and evaluation into processes
of change at Coventry. Because of the large number of stakeholders and range of their
interests, there are inevitable conflicts which are in need of resolution in our higher
education context. Action evaluation, like action research, is iterative. It also attempts
to give voice to all stakeholders in the spirit of collaborative inquiry. In our case, this
involves gathering disparate and uneven data from the widest variety of sources.
Stake's countenance framework, combined with the stakeholder participation inherent
in action evaluation, is helping us achieve a long-range view of the development: a
view that expresses multiple voices, permits a range of data sources, and enables us
to observe and contrast the actual findings with stated intentions in an iterative and
dynamic cycle.
One further aspect of our evaluation is the medium through which we present the
results. For such a multi-layered and complex analysis of the intervention, we have

deployed the online medium itself. The hypertext environment of the web has enabled
a three-dimensional approach to capturing and presenting the analysis and some of the
underlying evidence. This inherently dynamic environment also enables the
evaluation to grow in depth over time, making it an organic, possibly interactive,
entity rather than a static report.

Online Learning at Coventry University: the Evaluation


Since 1999, students at the University have had access to an online environment for
each of their taught modules. Our approach to implementing the online facility across
the institution has been outlined previously (Deepwell and Syson, 1999). Briefly,
whilst the implementation of the online learning environment using WebCT is
centrally designed, it has been rolled out in such a way as to provide academic
colleagues with as much scope as possible in determining the extent to which, how
and whether they adopt the technology in their teaching. Administrative procedures
and generic support for students' learning are managed centrally; for example, each
module has a link to the module description and regulatory information held centrally.
Furthermore, the student class lists are uploaded directly from the student information
system and updated nightly. "Ownership" of the online modules rests with the
lecturers involved in delivering those modules. Access to the modules is through the
University homepage and is simplified for students. A recent upgrade has enabled us
to simplify this entry route for staff, too.
A comprehensive evaluation of such a dispersed and flexible implementation is far
from straightforward. Further complexity comes from it being a moving target - the
evaluation process is under way whilst the "intervention" is taking shape. Software
upgrades, changes in legislation and University policy together with the upsurge in
Internet usage amongst students and staff all affect the outcomes. Concurrently,
attitudes towards the use of computers in higher education are shifting, particularly in
non-technical disciplines, as availability of the technology becomes more widespread
and awareness of its usage grows across all sectors of the economy.

Data collection
The methods of data collection we used combine both quantitative and qualitative
data. Indeed the evaluation framework is sufficiently flexible to integrate quite diverse

data into coherent sets according to a structured framework (see Figure 1). Evaluation
data is derived from a wide range of sources, including:
focus groups (staff and students)
questionnaires
structured interviews
participant observations
messages to email lists
diary entries
critical incident analysis
subject-specific research papers
local reports (e.g. small group studies)
University reports (e.g. quality assurance)
student monitoring
server statistics
student feedback
staff development feedback
The server statistics were collected in order to give a broad-brush impression of
comparative usage across the University. They are not sufficiently detailed to provide
a clear map of usage. Due regard is also paid to the limitations of the software as
regards data capture. There is no audit trail of designer actions in WebCT, although
there is student tracking. With an earlier version of WebCT, the designer account was
a shared account and it was not evident whether one person or more had been actively
involved in WebCT developments. With the current version we are able to gauge how
many people have designer access, but are still unable easily to ascertain the
frequency, duration or nature of their interactions.
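As an indication of how a broad-brush usage measure can be derived from such statistics, the sketch below computes the percentage of module spaces per School showing any recorded activity. The (school, module) record format is an assumption for illustration; the actual WebCT server statistics are shaped by the software's own logging.

```python
# Hedged illustration: percentage of module spaces with any recorded activity,
# per School. The (school, module) hit format is assumed, not WebCT's actual log.
from collections import defaultdict

def usage_by_school(hits, modules_per_school):
    """hits: iterable of (school, module_code) pairs; returns % of modules active."""
    active = defaultdict(set)
    for school, module in hits:
        active[school].add(module)
    return {school: 100.0 * len(active[school]) / total
            for school, total in modules_per_school.items()}

sample_hits = [("CBS", "M101"), ("CBS", "M102"), ("MIS", "M201")]
print(usage_by_school(sample_hits, {"CBS": 4, "MIS": 10}))
# {'CBS': 50.0, 'MIS': 10.0} -- the kind of spread reported in the findings below
```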

Summary of interim research findings and their implications


The findings presented here are illustrative of the evaluation approach and represent a
discussion of the congruence between the intended and the observed during the
transaction phase and a statement of the actions taken or planned as a consequence. In
order to aid the reader, I have given each section a heading with a general theme that
encapsulates the issue presented and then listed three subsections: intended
transactions, observed transactions and actions. The findings mostly relate to the
period between September 1999 and July 2001 and where relevant I have noted
indicative sources of the data. Fuller presentation of the findings is available on the
Online Learning Research website at http://www.coventry.ac.uk/ched.

Usage
Intended transactions: over 10% active use. The initial usage target was set at
senior management level to be 10% uptake. From a preliminary analysis of the server
statistics, this was seen as an attainable target and therefore a "safe" measure of
success in VLE uptake. In the pilot year, when early adopters were able to use the
VLE but did not have administrative support, we were already observing usage levels
of around 10%.
Observed transactions: 23% use. An analysis of the server statistics showed that
around 23% of modules were using WebCT to some extent throughout the second
year of its full implementation. It was observed, however, that usage varies
considerably between Schools, ranging from over 50% of modules to under 10%.
Each School is a separate administrative unit and has a great deal of autonomy over its
expenditure. Significantly higher usage is made of the VLE in two of the Schools:
Business (CBS) and Computing (MIS - mathematical and information science). This
is evidenced by the server statistics and corroborated by quality monitoring and other
local reports from the Schools. In the case of the Business School, additional support
was obtained to convert teaching materials for the VLE.
Action. Quantitative evidence needs to be supplemented with background knowledge
and qualitative research (e.g. case studies, student interviews) in order to build a larger
picture of events, and triangulated in order to reconcile discrepancies in the findings. The
server statistics are used as a baseline snapshot of activity and welcomed by
University committees who can then chart the rise or fall of use and wield the data as
an argument for more investment or justification of resource expenditure.

Access
Intended transactions: full and equitable access. Access to the VLE was linked to
University, and modular, registration. In this way, registered students should have
access to all the modules for their current programme of study. Students would have
access to all module spaces, irrespective of whether lecturers were using the VLE. The
template was designed in particular to promote the communication tools.
Observed transactions: partial access. In the first year of operation, two principal
factors prevented full access. Firstly, in order to get a username, students had to be on
campus at least twice in person in order to register and this proved to be a barrier to
access for all students studying University courses at partner colleges or in the
workplace. Secondly, a few departments in the University were not yet using the
student record system for registration and enrolment, thus debarring their students
from automatic registration onto the VLE. Case studies, participant observations and
email reports in particular have provided evidence that student groups took advantage
of the VLE as a peer communication vehicle.
Action. Both of these barriers to access were overcome by the second year of
operation through changes to the username registration process and developments in
the student record system. Access problems remain a prominent feature of complaints
by students (data derived from error report logs, interviews and email messages to a
shared list), caused either by forgetting passwords, or incorrect details stored on the
student record system. Online module registration has emerged as a necessary
development of the system to improve access still further.

Ease of adoption
Intended transactions: adoption to be made as easy as possible. Great efforts were
made to simplify the entry routes into the VLE for both staff and students: direct
access from the University homepage, module registrations feeding into the student
lists within each module space in the VLE, a template containing the essential tools
with a common look and feel for each School of the University, and local points of call
for password and technical assistance.
Observed transactions: wide variation in adoption patterns. It is observably easier
for some than for others to adopt the VLE. There has been noticeable dissatisfaction
expressed by, for instance, some computer scientists who want a more robust and

adaptable system, and frustration by others who complain (via email messages to a group
list, memoranda, questionnaire feedback and in interviews, for example) of the
unreliability of the systems generally (e.g. obtaining passwords, student registration,
network speed). Local technical support is insufficient to address tutors' needs.
There is evidence that the poor interface design and known bugs in the version of the
VLE have discouraged adoption.
There are a number of tools available in the VLE, which broadly fall into two camps:
communication and presentation. Usage varies widely across the University. Server
statistics and questionnaire feedback show that the Business School has a relatively
low usage of communication tools, whereas the School of Health and Social Science
exhibits far stronger use of these tools than the presentation ones. There is also
variation at the micro-level, for example some of the most interesting case studies of
use of the communication tools occur in the School of Art and Design, although this is
one of the schools which uses the VLE the least overall.
Action. In the third year of operation the University has upgraded to a version of
WebCT with a substantially improved user interface and, although there are still some
valid usability issues, feedback from colleagues suggests that it no longer appears to
be a disincentive to use. Further action needs to be taken to encourage adoption, for
instance, by recommending third party software to create questions for online
assessments.

Training and support


Intended transactions: the provision of adequate training and support, both
technical and pedagogical. Local support was deemed an essential part of the rolling
out of the VLE. To this end, there was a 10-week familiarisation programme for a
group of technical support staff who assessed themselves against a competence list of
tasks associated with supporting the VLE, including skills in general software used by
colleagues. In each School there were also academic colleagues in the Teaching,
Learning and Assessment Task Force charged as local educational developers and
action researchers to support the adoption process (Beaty and Cousin, 2002).
Additionally, a team of three people in the Centre for Higher Education Development
took on the role of a central support team for WebCT. The computing services general
helpdesk supported student users.
Observed transactions: variable support. Studies of the effectiveness of the VLE
support show that the technical experts were rarely able to perform in this role
because they had not been given remission from other tasks. In-depth interviews

revealed that, in most cases, they did not promote their support role and soon lost the
knowledge they had acquired during the training programme through disuse. In a
couple of instances, extra, short-term appointments were made to produce teaching
materials in a format suitable for the VLE. The temporary appointments were
effective for the jobs they were appointed to do, although there is little evidence that
they have encouraged adoption beyond the close circle of colleagues who were able to
draw on their services. Local academic support on the other hand has developed more
successfully, as shown in feedback questionnaires and case studies, for instance.
Colleagues have been able to draw on the experience and expertise of members of our
teaching, learning and assessment Taskforce, some 25 or so academic colleagues who
have been seconded by the University to develop and disseminate innovations in their
practice. In addition, other support networks have emerged independently (e.g. buddy
systems). The team in CHED has continued to provide central support through online
information and discussions, training sessions and a busy telephone and email hotline.
Action. The short-term appointments that were made were "pump priming" the
implementation and have now been terminated. Empowering colleagues to work
directly with the VLE is the main thrust of the support provided both centrally and
locally. The staff development model is responding to the nature of support required
and is adopting more of a consultancy approach, with consultations tailored to
individual teaching and learning requirements.

Student engagement and staff effort


Intended transactions: online learning to be motivational for student
engagement. During the pilot year, analysis of some student groups showed a marked
improvement in student motivation and performance (including a 5% rise in end-of-year
results in one module); however, these studies were conducted only by those
typified as early adopters (Daniel, 1996). The intention of the full implementation was
to bring these motivational advantages of technology to enhance teaching and learning
to the majority of courses.
Observed transactions: required individual effort, time investment and
perseverance on part of staff. Case study and interview evidence has shown that
effective usage still requires additional individual effort of a kind expected from early
adopters. Technical hitches continue to occur, e.g. slow delivery of network
applications at peak periods, hardware failures and other blockages. Notwithstanding
this, a far wider user group has now become involved in using the VLE. Members of
staff who use it even for quite modest enhancements for their face-to-face delivery
have recommended the use of the VLE to their colleagues precisely because they feel

it does motivate their students. Our student-generated data suggest that engagement
with the online medium is particularly beneficial for part-time students, or those with
additional learning needs.
Action. Of special interest are those colleagues who have become engaged with the
online technology primarily for teaching reasons, despite expressed inexperience in
C&IT and expressed scepticism. Further analysis is required to ascertain levels of
student engagement.
Intended transactions: anytime, anyplace learning (24/7). It was intended that
student learning would be enhanced through wider availability of the learning
resources. Through the VLE template all students had links to the central web-based
learning resources of the University, namely the library, study skills and to centrally
held module information. Module tutors would build the web environment
organically, adding learning materials and other information gradually over the
academic year.
Observed transactions: anytime, anyplace module resources. There are a growing
number of modules using the VLE for specific learning activities. However, in most
cases, the VLE has been used as a repository for documents (among other things).
Learning resources have been added to the VLE, often in the form of module
handbooks, lecture notes or summaries. In a few subject areas, the VLE has become a
central rather than a supplemental resource for students. Many colleagues report
student pressure to build up the online learning elements of their modules, including
student/student and student/tutor communication. These reports are backed up by
student survey data.
Action. Further localised staff development is required, including the sharing of good
practice, to ensure that those colleagues who make comments such as "next year I'll use it
for more than just my lecture slides" are able to put their intentions into practice.

Unintended outcomes
Along with an analysis of intentions and how they have been met, the Coventry model
of evaluation allows for unintended outcomes emerging from the intervention. In our
case, one of the more unexpected outcomes of the implementation of the VLE was the
emergence of a trust culture amongst academic colleagues from different disciplines.
When the notions of a VLE were first proposed, it was presented as a virtual
classroom - the lecturer could control who was allowed to observe their teaching or
their students' learning. The fears lecturers expressed at the time (in focus groups and

dedicated seminars) were that the VLE would become a means by which they would
themselves be controlled and monitored by administrators or managers. More recent
evidence shows however that those lecturers who have achieved integration of the
VLE into their teaching practice actively want others to observe what they are doing
and often set up guest accounts for their colleagues locally and their peers in other
institutions.
Other unintended outcomes include re-engineering the student registration process
and new staff development opportunities for technicians. The evaluation is also
highlighting further areas for action, such as the conditions of contract for academic
and academic-related colleagues, wider availability of Internet connections in general
teaching rooms etc.

Conclusion
In this paper, I have attempted to offer a framework drawing on Stake and action
evaluation for capturing the complexity of online learning across an institution. As the
evaluation at Coventry University progresses, the stakeholder perspectives are shifting,
and the evaluation framework we are using accommodates this shift, as I have shown
above. Through the availability of a VLE there is increasing expertise in the media of
online learning and a general growth in familiarity with all forms of online practice.
The University is moving towards a natural plateau of usage in mixed mode delivery.
We are now in the fourth year of a five-year cycle of evaluation. An interim report was
presented on years one and two and a second report is being compiled for years three
and four. The question "does it make a difference?" has been answered through
evidence (e.g. joined-up thinking at senior management level, the growth in online
facilities, staff engagement, and increases in student Internet usage and IT competence
levels). The more elusive question "does it improve student learning?" is still open;
however, the signs are promising.

References
Anderson, C., Day, K., Haywood, J., Land, R., & Macleod, H. (2000). Mapping
the Territory: issues in evaluating large-scale learning technology
initiatives. Educational Technology & Society, 3 (4), 31-42.

Beaty, E., & Cousin, G. (2002). An Action-research Approach to Strategic
Development. In Macdonald, R. & Eggins, H. (Eds.) The Scholarship of
Academic Development, UK: SRHE and OUP.
Chelimsky, E., & Shadish, W. R. (1997). Evaluation for the 21st Century: A
handbook, London: Sage.
Daniel, J. (1996). Mega-universities and knowledge media: technology
strategies for higher education, London : Kogan Page.
Deepwell, F., & Cousin, G. (2002). A Developmental Framework for
Evaluating Institutional Change. Educational Developments, 3 (1),
http://www.seda.demon.co.uk/eddevs/vol3/eddevs31.html.
Deepwell, F. H., & Syson, A. J. (1999). Online learning at Coventry University:
You can lead a horse to water ... Educational Technology & Society, 2 (4),
http://ifets.ieee.org/periodical/vol_4_99/deepwell.html.
Harvey, J. (1998). Evaluation Cookbook, Learning Technology Dissemination
Initiative, http://www.icbl.hw.ac.uk/ltdi/.
Oliver, M. (2000). An introduction to the Evaluation of Learning
Technology. Educational Technology & Society, 3 (4),
http://ifets.ieee.org/periodical/vol_4_2000/intro.html.
Oliver, M., & Conole, G. (1998). Evaluating communication and information
technologies: a toolkit for practitioners. Active Learning, 8, 3-8.
Parlett, M., & Hamilton, D. (1977). Evaluation as Illumination. In Parlett M. &
Dearden G. (Eds.) Introduction to Illuminative Evaluation: Studies in Higher
Education, Cardiff-by-the-Sea, CA: Pacific Soundings Press.
Pawson, R., & Tilley, N. (1997). Realistic Evaluation, London: Sage.
Rothman, J., & Friedman, V. (2002). Action Evaluation: Helping to Define,
Assess, and Achieve Organizational Goals,
http://www.aepro.org/inprint/papers/aedayton.html.
Stake, R. E. (1967). The Countenance of Educational Evaluation. Teachers
College Record, 68, 523-540.
