
A Model for Supporting Electrical Engineering

with e-Learning

Thesis submitted for the degree of


Doctor of Philosophy
at the University of Leicester

by

Dursun Akaslan

Department of Computer Science


University of Leicester

2014


Declaration
I, Dursun Akaslan, hereby declare that this submission is my own work and that it is the
result of work done mainly during the period of registration. To the best of my
knowledge, it contains no material previously published or written by another person
nor material which to a substantial extent has been accepted for the award of any other
degree or diploma of the university or other institute of higher learning, except where
due acknowledgement has been made in the text. In addition, parts of this submission
appeared in the following conjoint publications (listed chronologically), to each of
which I have made substantial contributions:
1. Akaslan, Dursun and Law, Effie Lai-Chong, Measuring Teachers' Readiness for
E-learning in HEIs associated with the Subject of Electricity in Turkey, in
Proceedings of the 2011 IEEE Global Engineering Education Conference
(EDUCON), Amman, Jordan, 2011.
2. Akaslan, Dursun, Law, Effie Lai-Chong and Taskin, Sezai, Analysing Issues for
Applying E-learning to the Subject of Electricity in Higher Education in Turkey,
in Proceedings of the 2011 International Conference on Engineering Education
(ICEE), Belfast, Northern Ireland, 2011.
3. Akaslan, Dursun and Law, Effie Lai-Chong, Measuring Student E-learning
Readiness: A Case about the Subject of Electricity in HEIs in Turkey, in
Proceedings of the 2011 International Conference on Web-based Learning
(ICWL), Hong Kong, China, 2011.
4. Akaslan, Dursun, Law, Effie Lai-Chong and Taskin, Sezai, Analysis of Issues
for Implementing E-learning: The Student Perspective, in Proceedings of the
2012 IEEE Global Engineering Education Conference (EDUCON), Marrakesh,
Morocco, 2012.
5. Akaslan, Dursun and Law, Effie Lai-Chong, Analysing the Relationship
between ICT experience and attitude toward e-learning, in Proceedings of the
2012 European Conference on Technology Enhanced Learning (ECTEL), Saarbrücken, Germany, 2012.

Some of the publications given above have also, at the time of writing this thesis, been
cited a few times in other publications, theses or proceedings in various languages such as
English, Turkish and Indonesian. Moreover, I published the following paper during my
PhD study. While it is not directly associated with my thesis, it helped me collect
information about HEIs associated with the field of electricity in Turkey.

6. Akaslan, Dursun and Law, Effie Lai-Chong, E-learning in the Science of


Electricity in HEIs in terms of Environment and Energy, in Proceedings of the
2010 Postgraduate Research Student Conference (EMUA), Nottingham, United
Kingdom, 2010.


Dedication
I dedicate this thesis, which he will never read, to my father, Mahir Akaslan. In the midst
of my thesis work, he passed away from a heart attack in April 2011, at the age of 52, in
Erzurum, Turkey, while he was driving with my mother. In the last seconds of his life, he
tried to put the handbrake on to make sure that my mother was safe.

There are not, and never will be, enough words to describe how important my father is to
me and to the rest of my family. My father was a man who nurtured me, taught me,
dressed me, fought for me, held me, shouted at me, kissed me but, most importantly,
loved me unconditionally and always wanted the best for me. If you gain even a little
benefit from this thesis, please send your thoughts and prayers to my father by reciting
either Al-Fatihah (the Opening) or Ya-Sin from the Holy Quran.

Al-Fatihah
(The Opening)
In the name of Allah, the Gracious, the Merciful.
All praise belongs to Allah, Lord of all the worlds,
The Gracious, the Merciful,
Master of the Day of Judgment
Thee alone do we worship and Thee alone we implore for help.
Guide us in the right path
The path of those on whom Thou hast bestowed Thy blessings, those who have not
incurred Thy displeasure and those who have not gone astray.


Acknowledgements
First of all, I am grateful to the Turkish Government for sponsoring me to complete this
thesis, by covering my tuition fees and compulsory education expenses during my
academic studies in the United Kingdom.

Many thanks also go to the Department of Computer Science at the University of


Leicester, for providing me with a home in which to carry out my postgraduate
studies. Its facilities, especially the rich library, have been extremely valuable, and
without them this study would not have been possible.

I am extremely grateful to my supervisor, Dr. Effie L-C Law, whose support and
feedback helped me to complete this thesis successfully. She guided me from the
beginning, reading drafts of my chapters and commenting on them, and advising me on
my case studies to ensure that I collected good quality data for use in my analysis and
interpretation. I am also indebted to Prof. Dr. Rick Thomas, whose support and
suggestions helped me enhance the quality of this thesis. Towards the end of my thesis,
I was able to benefit from Prof. Dr. Thomas's years of research and teaching experience.

I am also grateful to the teachers, Assoc. Dr. Sezai Taskin, Dr. Yalcin Ezginci and
Assoc. Dr. Bayram Akdemir, for their involvement in the case study, in which students
from their institutions also volunteered to take part. I owe a considerable debt of
gratitude to the survey participants and interviewees from Turkey and the United
Kingdom.

Many friends and colleagues also deserve many thanks for helping me during the
process of data collection, analysis and writing up of this dissertation. I would like to
acknowledge a debt of gratitude that could never be repaid to my family. My dear wife,
Sariye Akaslan, and my little son, Tunay Akaslan, kept me motivated to get this thesis
finished and without their love, I could not have accomplished my study. Finally, I wish
to thank my parents for their moral support and for giving me the strength and courage to
complete this thesis and my brothers and sister for their valuable recommendations and
support throughout this thesis.


Abstract
The overall goal of this research work was to develop and evaluate a model for
supporting electrical engineering with e-learning. The model development was based on
survey data collected from representative teachers and students in Turkey, whereas the
model evaluation was conducted in the relevant HEIs in Turkey and the United
Kingdom. To develop the model, the study investigated the attitudes of representative
key stakeholders towards e-learning in Turkey by administering questionnaires and
interviews with teachers and students. Then the responses of the teachers and students
were compared. Based on the results, I proposed a model with a multi-dimensional
approach to e-learning: (1) self-directed learning by studying e-books, (2) self-assessment
by solving e-exercises, (3) teacher-directed learning by attending classroom sessions as
an integral part of blended learning, (4) teacher-assessment by solving e-exercises,
(5) computer-directed learning by playing e-games and (6) computer-assessment by
solving e-exercises.

To evaluate the applicability of the model in different conditions, a case-control study
was conducted to determine whether the model had the intended effect on the
participating students in HEIs in Turkey and the United Kingdom. As a result of the
case-control study, the effects of e-learning, blended learning and traditional learning
were verified. However, there were significant differences among the groups. The
overall scores indicated that e-learning and blended learning were more effective than
traditional learning. The results of our study indicated that the knowledge increase in
e-learners seemed to be gradual because they tended to study daily by completing each
activity on time. However, the traditional learners did not show the same pattern because
they usually did not read the core text and did not solve e-exercises regularly before the
classroom sessions. The results of the pre-placement, middle and post-placement tests
also supported these observations.

Table of Contents
Declaration ......................................................................................................................... i
Dedication ........................................................................................................................ iii
Acknowledgements.......................................................................................................... iv
Abstract ............................................................................................................................. v
Table of Contents ............................................................................................................. vi
Abbreviations ................................................................................................................. xiii
Figures ........................................................................................................................... xiv
Tables .............................................................................................................................. xv
Equations ...................................................................................................................... xvii
PART I: FUNDAMENTALS ........................................................................................... 1
CHAPTER 1: INTRODUCTION ..................................................................................... 3
1.1 Background and Definitions ................................................................................ 3
1.2 Motivation of the thesis ........................................................................................ 5
1.3 Purpose of the thesis ............................................................................................. 7
1.3.1 Development ............................................................................................... 9
1.3.2 Evaluation ................................................................................................... 9
1.4 Parameters of the thesis ...................................................................................... 10
1.4.1 Reasons for selecting the domain of Electrical Engineering ......................... 10
1.4.2 Reasons for selecting Turkey and the United Kingdom ................................ 11
1.5 Research questions of the thesis ......................................................................... 12
1.6 Structure of the thesis ......................................................................................... 13


CHAPTER 2: LITERATURE REVIEW ........................................................................ 14
2.1 Introduction ........................................................................................................... 14
2.2 Understanding E-learning ..................................................................................... 15
2.3.1 The First Interpretation: On-line Learning & Off-line Learning ................... 16


2.3.2 The Second Interpretation: Asynchronous & Synchronous e-Learning ........ 16


2.3 Learning Theories ................................................................................................. 17
2.4 Theoretical Underpinning ..................................................................................... 19
2.4.1 Blended Learning ........................................................................................... 19
2.4.2 Game-based and Mobile Learning ................................................................. 21
2.4.3 Student- and Teacher-Centred Learning ........................................................ 22
2.4.4 Open Learning ................................................................................................ 23
2.4.5 Distance Learning .......................................................................................... 24
2.4.6 Traditional Learning ...................................................................................... 25
2.4.7 Campus Learning ........................................................................................... 25
2.5 The History of e-Learning .................................................................................... 26
CHAPTER 3: METHODOLOGY .................................................................................. 28
3.1 Introduction ........................................................................................................... 28
3.2 Research Methods Available in Education and Social Science ............................ 29
3.2.1 Qualitative and Quantitative Research .......................................................... 30
3.2.2 Mixed-Methods Research .............................................................................. 30
3.3 The Research Design Adopted for the Study........................................................ 31
3.4 Research Methods used for the study ................................................................... 33
3.4.1 Web-based Questionnaire .............................................................................. 34
3.4.2 Semi-structured Interview.............................................................................. 46
3.5 Ethical Issues ........................................................................................................ 47
PART II: DEVELOPMENT ........................................................................................... 49
CHAPTER 4: FACTORS AFFECTING READINESS FOR E-LEARNING ............... 51
4.1 Introduction ............................................................................................................ 51
4.2 How to Measure Readiness for E-learning ............................................................ 52
4.2.1 Technology .................................................................................................... 54
4.2.2 People ............................................................................................................. 54
4.2.3 Institution ....................................................................................................... 56
4.2.4 Content ........................................................................................................... 56
4.2.5 Acceptance ..................................................................................................... 57
4.2.6 Training .......................................................................................................... 58
CHAPTER 5: MEASURING TEACHERS' E-LEARNING READINESS .................. 60
5.1 Introduction ........................................................................................................... 60
5.2 Methods ................................................................................................................ 60
5.2.1 Procedure ....................................................................................................... 60
5.2.2 Missing Data ................................................................................... 61
5.2.3 Research Group.............................................................................................. 62
5.2.4 Items............................................................................................................... 62
5.4 Results and Discussion ......................................................................................... 64
5.4.1 Initial Findings using Descriptive Statistics .................................................. 64
5.4.2 Comparative Findings using Inferential Statistics ......................................... 71
CHAPTER 6: ANALYSING ISSUES FOR APPLYING E-LEARNING: THE
TEACHER PERSPECTIVE ........................................................................................... 80
6.1 Introduction ........................................................................................................... 80
6.2 Methods ................................................................................................................ 80
6.3 Findings .................................................................................................................. 81
6.3.1 What is E-learning? ........................................................................................ 81
6.3.2 Issues and E-learning as a Solution ................................................................ 82
6.3.3 How to Implement E-learning ........................................................................ 89

CHAPTER 7: MEASURING STUDENTS' E-LEARNING READINESS .................. 93


7.1 Introduction ........................................................................................................... 93
7.3 Methodology .......................................................................................................... 93
7.3.1 Questionnaire Design ..................................................................................... 93
7.3.2 Procedure ........................................................................................................ 94


7.3.4 Items............................................................................................................... 95
7.3.5 Missing Data .................................................................................................. 95
7.4 Results and Discussion ......................................................................................... 96
7.4.1 Findings in the Factor Technology ........................................................... 100
7.4.2 Findings in the Factor People .................................................................... 101
7.4.3 Findings in the Factor Institution .............................................................. 102
7.4.4 Findings in the Factor Content .................................................................. 102
7.4.5 Findings in the Factor Acceptance and Training .................................... 103
CHAPTER 8: ANALYSING ISSUES FOR APPLYING E-LEARNING: THE
STUDENT PERSPECTIVE ......................................................................................... 104
8.1 Introduction ......................................................................................................... 104
8.2 Methodology ....................................................................................................... 104
8.2.1 Procedure and Items..................................................................................... 104
8.2.4 Research Group............................................................................................ 105
8.3 Findings .............................................................................................................. 107
8.3.1 What is E-learning? ..................................................................................... 108
8.3.2 Current Issues .............................................................................................. 109
8.3.3 How to Implement E-learning ..................................................................... 116
CHAPTER 9: COMPARING THE TEACHER AND STUDENT PERSPECTIVES ..... 119
9.1 Introduction ......................................................................................................... 119
9.2 A Model for Measuring Attitudes towards E-learning ....................................... 119
9.2.1 Attitude towards E-learning ......................................................................... 120
9.2.2 Experience with ICT .................................................................................... 121
9.3 Items.................................................................................................................... 122
9.4 Results ................................................................................................................. 123
9.4.1 Results of the analysis of close-ended items ............................................... 123
9.4.2 Analysis of the open-ended item ................................................................. 126


9.5 A Model for Delivering E-learning .................................................................... 131


PART III: EVALUATION ......................................................................................... 134
CHAPTER 10: THE PROCEDURES FOR EVALUATION ...................................... 136
10.1 Introduction ....................................................................................................... 136
10.2 A Model for Delivering e-Learning .................................................................. 137
10.2.1 Stage 1: Getting Prepared at Home for the Class ...................................... 138
10.2.2 Stage 2: Attending Lecture Sessions after Studying at Home .................... 139
10.2.3 Stage 3: Playing e-Game after Attending Lecture ...................................... 140
10.3 A Model for Conducting a Case-Control Study ................................................. 141
10.4 Selecting and Installing E-learning Platform ...................................................... 143
10.5 Developing and Integrating E-learning Materials into the E-learning Platform 143
10.6 Training Students before implementing e-learning ............................................ 148
10.7 Research Group ................................................................................................... 148
10.7.1 Sampling Method ........................................................................................ 149
10.7.2 Sampling Size ............................................................................................. 149
10.8 Measuring E-learning Readiness ........................................................................ 151
10.9 Assessment Methods......................................................................................... 152
CHAPTER 11: COMPARING THE E-LEARNING READINESS OF THE
ELECTRICAL ENGINEERING STUDENTS IN TURKEY AND THE UK ............. 153
11.1 Introduction ....................................................................................................... 153
11.2 Research Group................................................................................................. 153
11.3 Initial Findings .................................................................................................... 154
11.4 Inferential Findings ............................................................................................. 159
11.4.1 Differences between e-Learners in Turkey and the UK ............................. 159
11.4.2 Differences between Blended Learners in Turkey and the UK ................. 160
11.4.3 Differences between Blended and e-Learners in Selcuk University ......... 161
11.4.4 Differences between Blended and e-Learners in Leicester University...... 162

CHAPTER 12: STRUCTURAL EQUATION MODELLING .................................... 164


12.1 Introduction ....................................................................................................... 164
12.2 The First Structural Equation Modelling ........................................................... 164
12.3 The Second Structural Equation Modelling...................................................... 171
CHAPTER 13: EVALUATING E-LEARNING .......................................................... 178
13.1 Introduction ....................................................................................................... 178
13.2 Results and Discussion ....................................................................................... 178
13.2.1 Measuring Students' Knowledge at the Beginning of the Courses ............ 178
13.2.2 Measuring Students' Knowledge in the Middle of the Courses ................. 186
13.2.3 Measuring Students' Knowledge at the End of the Courses ...................... 190
CHAPTER 14: CONCLUSION ................................................................................... 197
14.1 Introduction ....................................................................................................... 197
14.2 Research Questions ........................................................................................... 197
14.2.1 Answer to Research Question 1 ................................................................. 197
14.2.2 Answer to Research Question 2 ................................................................. 198
14.2.3 Answer to Research Question 3 ................................................................. 199
14.2.4 Answer to Research Question 4 ................................................................. 199
14.2.5 Answer to Research Question 5 ................................................................ 200
14.3 Limitations ........................................................................................................ 201
14.3.1 Invitation of participants ............................................................................ 201
14.3.2 Limited timeframe for the participant availability ..................................... 202
14.3.3 The generalizability of results.................................................................... 202
14.4 Contributions .................................................................................................... 203
14.4.1 Finding Issues and Suggesting Solutions................................................... 203
14.4.2 Identifying Factors Affecting Individuals' Readiness for e-Learning ....... 204
14.5 Future Work ...................................................................................................... 205
14.5.1 Measuring Readiness for E-learning.......................................................... 205

14.5.2 E-learning Materials .................................................................................. 205


14.5.3 Digital Educational Games ........................................................................ 206
References ..................................................................................................................... 207
Appendix I: Teacher Questionnaire .............................................................................. 219
Appendix II: Student Questionnaire ............................................................................. 225


Abbreviations

ICT          Information and Communications Technology
e-Learning   Electronic Learning
HEI(s)       Higher Education Institution(s)
VLE          Virtual Learning Environment(s)
TAM          Technology Acceptance Model
OER          Open Education Resources
SEM          Structural Equation Modelling


Figures
Figure 1:Purpose of the study ........................................................................................... 8
Figure 2: The Conversational Framework ...................................................................... 18
Figure 3: Research Design adopted for the Study .......................................................... 32
Figure 4: An Assessment Model for Measuring Readiness for E-learning .................... 45
Figure 5: Ten factors of the concept "readiness for e-learning" ..................................... 53
Figure 6: Factors for Measuring Students' Readiness for E-learning ............................ 58
Figure 7: Step-by-step for Implementing E-learning ...................................................... 90
Figure 8: Primary Issues in the respective HEIs ........................................................... 109
Figure 9: A Model for Measuring Attitudes for E-learning.......................................... 121
Figure 10: A Model for the usage of different ICT for E-learning ................................ 121
Figure 11: A web-based approach for delivering e-learning ........................................ 133
Figure 12: A Model for Blended Learning ................................................................... 137
Figure 13: A Model for Case-Control Study ................................................................ 142
Figure 14: The 16th Slide of the Presentations for Chapter 6 ........................................ 145
Figure 15: The Development of E-learning Materials .................................................. 146
Figure 16: Learning Outcomes of the Chapter 2 .......................................................... 147
Figure 17: Diagram representing a measurement model for the four-factor solution .. 169
Figure 18: The Input Path Diagram for a Structural Equation Modelling .................... 170
Figure 19: Diagram representing a measurement model for the six-factor solution .... 176
Figure 20: The Input Path Diagram for a Structural Equation Modelling .................... 177
Figure 21: A Circuit Design......................................................................................... 190
Figure 22: Getting the Item (e.g. Star) .......................................................................... 192
Figure 23: Getting the Knowledge after the Item ......................................................... 193


Tables
Table 1: HEIs associated with the Subject of Electricity in Turkey in 2010 .................. 35
Table 2: HEIs associated with the Subject of Electricity in Turkey ............................... 36
Table 3: Subgroups of the respective HEIs in Turkey .................................................... 36
Table 4: Working Status of teaching staff in the respective HEIs in Turkey ................. 37
Table 5: Qualitative research questions included in the Interview Schedule ................. 46
Table 6: Overall Summary of Missing Values ............................................................... 62
Table 7: List of Items of the Teacher E-readiness Survey .............................................. 63
Table 8: Statistics of all the items in the study ................................................................ 65
Table 9: The overall mean score of the participants' responses for each factor .............. 66
Table 10: Statistics for the items for Gender Differences .............................................. 74
Table 11: Results of University Financial Mode Differences ........................................ 77
Table 12: List of item identifier, content and number of participants (N) ...................... 81
Table 13: The frequencies and percentage of research groups ........................................ 95
Table 14: Overall Summary of Missing Values ............................................................. 96
Table 15: List of Items of the Student E-readiness Survey Part I................................ 97
Table 16: List of Items of the Student E-readiness Survey Part II .............................. 98
Table 17: Scores of all the items in the study ................................................................. 99
Table 18: The overall mean score of the participants' responses for each factor ......... 100
Table 19: List of item identifier, content and number of participants .......................... 106
Table 20: Characteristics of Participants ...................................................................... 107
Table 21: Items of individual experiences with ICT and attitudes towards e-learning 122
Table 22: Statistics for the Items related to each factor ................................................. 123
Table 23: The mean score of the participants with various experience ........................ 125
Table 24: The relationship between e-learning and ICT .............................................. 126
Table 25: The relationship between views and attitudes towards e-learning ................ 128
Table 26: Book Chapters .............................................................................................. 145
Table 27: The number of participants in Turkey and the UK ....................................... 150
Table 28: The number of participants in Leicester and Selcuk University .................. 151
Table 29: Number, Mean and Standard Deviation of Items ......................................... 154
Table 30: Statistics for the items related to all factors .................................................. 156
Table 31: Statistics for each sub-factor of the factor traditional skills ......................... 158
Table 32: Statistics of e-Learners for factors in e-learning readiness........................... 159

Table 33: Statistics of e-Learners for each factor ......................................................... 160


Table 34: Statistics of Selcuk University for Blended and Electronic Learners .......... 162
Table 35: Statistics of Leicester University for Blended and Electronic Learners ....... 163
Table 36: Statistics for the Suitability of the data ......................................................... 165
Table 37: Identifying the Number of Factors ............................................................... 166
Table 38: Pattern Matrix for PCA with Oblimin Rotation of 14-Factor Solution ...... 167
Table 39: Statistics for the Suitability of the data ......................................................... 171
Table 40: Identifying the Number of Factors ............................................................... 172
Table 41: Pattern Matrix for PCA with Oblimin Rotation of 14-Factor Solution ...... 173
Table 42: Structure Matrix for PCA with Oblimin Rotation of 14-Factor Solution... 174
Table 43: Sample Questions for Topic 5 (Multi-dimensional Arrays) ......................... 180
Table 44: The Results of Pre-Placement Test for Turkey and United Kingdom ......... 182
Table 45: The Results of Pre-Placement Test for Selcuk and Leicester University .... 184
Table 46: The result of the Quiz for Turkey and United Kingdom ............................. 187
Table 47: The rate of increment or decrement on students' learning .......................... 188
Table 48: Post-Placement Tests for Turkey and United Kingdom .............................. 195
Table 49: The rate of increment or decrement on students' learning .......................... 196


Equations
Equation 1: Start Equation for the Sample Size ............................................................. 40
Equation 2: End Equation for the Teacher and Student Sample Size ............................. 41


PART I:
FUNDAMENTALS
Fundamentals of the Thesis: Background, Purpose and Research Questions

Part I opens this thesis with its fundamentals, spanning Chapters 1, 2 and 3.

IN THIS PART

Chapter 01
Introduction
It begins with the definition of the term e-learning, provides the rationales behind this
thesis, a detailed explanation of the purpose of the thesis, and the research questions of
the thesis. It also presents a discussion of why two countries, Turkey and the United
Kingdom, and the domain of Electrical Engineering were chosen for the study.

Chapter 02
Literature Review
It provides a literature review on e-learning, offering various interpretations of
e-learning, examining learning theories and listing the types of learning.

Chapter 03
Methodology
It concludes with the research design, describing the procedures for collecting,
analysing, interpreting and reporting the survey data.

CHAPTER 1: INTRODUCTION
1.1 Background and Definitions
Information and communications technology (ICT), or simply technology, has important
implications for education, in addition to its widespread impact on contemporary
society and economies (Martín-Rodríguez, et al., 2014). The impact of ICT can be
vividly seen in several aspects of today's education. For instance, Steffens (2008)
highlights that the use of ICT in the classroom has had some influence on how we think
about the roles of teachers and students. According to Laurillard (2007), the impact of
ICT creates a new kind of medium for the discovery, articulation and dissemination of
knowledge and therefore affects the knowledge learned and the skills developed within a
culture or society. Because this new kind of medium brings the two disciplines of
technology and education together, the number of studies aiming to find out how to
bring these two fields together is ever increasing. The key idea in bringing technology
and education together is to take advantage of technology in learning knowledge and in
developing skills.

On the other hand, many ambiguous terms are used to name the integration of
technology into education, and terminology is therefore another detail that differs.
Some authors, such as (Steffens, 2008), (Laurillard, 2008) and (Rodríguez, et al.,
2012), refer to the term technology-enhanced learning (TEL) when technology is used to
support teaching and learning. Additionally, Kirkwood and Price (2014) state that the
term TEL is increasingly being used in the UK, Europe and other parts of the world to
refer to the application of ICT to teaching and learning. Moreover, Guri-Rosenblit and
Gros (2011) note that TEL subsumes the older term e-learning because the term
e-learning is used with a confusing variety of meanings. However, as with the term
e-learning, Kirkwood and Price (2014) also add that there are no explicit statements
about what TEL actually means, due to the use of the word "enhanced" in the term TEL.
On the other hand, the terms technology-enhanced learning and e-learning are used
interchangeably. The Higher Education Academy (HEA) and the Joint Information
Systems Committee (JISC) used the terms e-learning and technology-enhanced learning
interchangeably in their recent reports.

For instance, expressions such as "e-learning, also known as technology-enhanced
learning" and "technology-enhanced learning, also known as e-learning" appeared in
the recent report of the HEA. Similarly, the expression "e-learning = enhanced learning"
was also used in publications of the JISC. In addition, those reports contain explicit
statements about what e-learning and technology-enhanced learning mean. According to
the recent report of the HEA, e-learning is concerned with computer technologies to
support learning, whether that learning is local or remote (HEA, 2014). To note here,
the word "local" is used to refer to the use of computer technologies on campus and the
word "remote" to their use at home or in the workplace. On the other hand, the JISC
defines e-learning as learning supported and facilitated through the use of ICT
(JISC, 2004).

The JISC also notes that the technology used to facilitate and support learning may
involve the use of some specific hardware (e.g. computers, interactive whiteboards,
digital cameras & mobile phones) and software (e.g. assistive software, email,
discussion boards, video conferencing, virtual learning environments & learning activity
management systems). However, devices such as smartphones and tablets are also
becoming popular. In sum, the term technology-enhanced learning tends to be accepted
as a generic term for the use of technology to support education, whereas the term
e-learning is used as a more specific term; in particular, the use of the Internet is a key
part of e-learning. Therefore, in this thesis I use the term electronic learning, or
e-learning for short, to work out the optimal technologies for the educational problems
and to consider the opinions of teachers and students in this research study. The use of
the word "electronic" is especially important for the ideas about education and
technology that are discussed and implemented in the thesis, because I believe that it
carries the sense of collaboration across the disciplines of technology and education in
the context of my thesis.

1.2 Motivation of the thesis


The role of technology in learning, especially e-learning, has been a high-profile topic in
academic and practitioner publications for some years. Additionally, a number of
organizations, such as JISC in the United Kingdom, are charged with supporting
education and research through the innovative use of ICT. For instance, one of the
eight core themes of JISC's work is that of e-learning (Anderson, 2008). Moreover, there
are many motivations behind the integration of technology into education by
organizations and individuals in the realm of e-learning. These motivations include
pedagogical considerations, the drive for innovation, meeting the needs of students and
maintaining a competitive profile (Weller, 2004). Laurillard adds that concerns about
how to use new technologies to support education are another motivation behind the
use of technological opportunities in education (Laurillard, 2008).

Therefore, the development of e-learning products and the provision of e-learning
opportunities is one of the most rapidly expanding areas in education and training,
especially at the level of higher education around the world, according to (Attwell,
2006) and (Yucel, 2006). Consequently, the number of studies aiming to find out how to
implement e-learning in HEIs is ever increasing, due to its potential benefits and claimed
drawbacks. Different benefits claimed for e-learning can be identified. Tangible benefits
of e-learning can be listed, based on the final reports of the Higher Education Academy
(HEA), the Association for Learning Technology (ALT) and the Joint Information
Systems Committee (JISC) and the works of authors who actively carry out research in
e-learning (Ferrell, et al., 2007), (Mackeogh & Fox, 2009), (Yamamoto & Aydın, 2009),
as follows:

- timely, as it reduces the time associated with marking and with the development of
subsequent activities;
- cost-effective, as it provides a significant reduction in delivery and e-assessment costs,
because automated marking takes seconds rather than hours, in addition to immediate
savings in printing costs;
- achievement, as it offers increased opportunities in terms of student retention and
active involvement in learning;
- consistent, as everyone gets the same standard content;
- easy to track, as it facilitates registration, monitoring of learning progress, testing and
record-keeping;
- empowering, as it allows learners to regulate their learning pace;
- interactivity, as students are part of learning by practising, analysing data and
performing tasks;
- economy, as it saves expenses such as transportation, accommodation, complementary
materials and commuting between home and university;
- flexibility, as there is no need for a tutor to be available throughout the entire process;
- productivity, as it can be adjusted and tailored to the needs of organizations and
individuals.

However, despite the benefits of e-learning, some researchers, such as (Chapnick, 2000),
(Anderson, 2002) and (Bean, 2003), propose that e-learning should be implemented very
carefully due to its potential drawbacks. Potential drawbacks of e-learning are claimed
by (Pollard & Hillage, 2001), (Bean, 2003), (Trinidad & Pearson, 2004) and
(Aydin & Tasci, 2005) as follows:

- costly, as conducting e-learning without careful planning most likely ends with cost
overruns;
- technology-dependent, as it needs access to appropriate hardware and software;
- unsuitability, as it is not suitable for developing some soft skills that rely on
interpersonal contact, and as it requires high levels of self-discipline and self-motivation;
- incompatibility, as it is sometimes not compatible with other systems and materials;
- expensive, as the cost of developing content and providing the essential infrastructure
is high;
- dependency, as it is still dependent on human support, such as management support
and help with the use of the software.

Nevertheless, many of the potential drawbacks might be minimized with careful
planning, such as creating implementation standards to deal with the incapability of
organizations and providing training for individuals to overcome their lack of requisite
skills. However, the main rationale behind those researchers' warnings is that e-learning
may not have the same effect for every individual, institution, organization or country.
In sum, it is important to integrate e-learning into education carefully in order to take
advantage of technology in learning knowledge and in developing skills. As a result,
many institutions and individuals embrace e-learning for delivering education, as it
potentially offers a range of benefits.

1.3 Purpose of the thesis


There are no clear methodologies and ways of thinking to bring together the disciplines
of technology and education, as they mutually challenge each other, according to
(Laurillard, 2007). In addition, Chartlon et al. highlight that technology in education
adds another dimension to creating knowledge products captured in learning designs and
lesson plans, because teachers need to know when, how and what technology to apply,
and to understand the impact of taking on such a challenge, due to the availability of a
variety of technological tools (Chartlon, et al., 2012). Therefore, it is important to
understand what technology offers and what education needs in order to develop clear
methodologies and ways of thinking that bring these two research fields together. It is
also noted that education has problems and technology has solutions looking for
problems (Laurillard, 2008). For instance, broadband technology is a solution for
students at a distance from the university campus because it allows those students to
have access to resources, multimedia and fast downloads similar to that of students
using terminals on campus (Mason & Rennie, 2004). However, it is also argued that the
solutions that new technologies in particular offer are solutions to problems that
education does not have (Laurillard, 2008). It is therefore important to make sure that
these two research fields, technology and education, fit together, and this
complementarity fuels clear methodologies for bringing technology and education
together to take advantage of technology in learning knowledge and in developing skills.

In this regard, it is crucial to analyse the educational problems and use this analysis to
identify solutions from what technology offers, because integrating technology into
education is not only a technical matter (Rodríguez, et al., 2012). Moreover, it is
important to work out the optimal technologies for the educational problems and to
consider the educational enterprise from the point of view of teachers and students,
because they are at the centre of the educational problems. Additionally, it is important
to note here that teachers attempting to integrate technology into their teaching also face
a variety of challenges in today's classrooms (Cowan, 2008). Such challenges are not
only associated with the use of technology; they are also related to many factors such as
pedagogical approaches and beliefs, teacher confidence, attitudes and skills relating to
ICT, school ICT infrastructure, supervision and technical support, involvement and
leadership of school principals, and the time spent by teachers on meetings, training,
exercises and lesson planning (Rodríguez, et al., 2012). Mor and Winters summarize
that the integration of technology into education is a challenge because it addresses
many issues ranging from learning theory to software engineering (Mor & Winters,
2007). In the light of the foregoing arguments, the purpose of this study is twofold, as
illustrated in Figure 1, namely development and evaluation.

[Figure 1 depicts the two-fold purpose of the study: (1) developing a model for
supporting electrical engineering and (2) evaluating the model for supporting electrical
engineering.]

Figure 1: Purpose of the study

1.3.1 Development
First, I aim to develop a model for supporting electrical engineering with e-learning. To
achieve this, I aim to carry out studies to find out how to bring technology and education
together in order to support electrical engineering with e-learning. The model
development is based on the survey data collected from representative teachers and
students in Turkey. To develop the model, I aim to investigate the attitudes of
representative key stakeholders towards e-learning by administering questionnaires and
interviews with teachers and students in higher education institutions (HEIs) associated
with the subject of electricity in Turkey. In this way, I believe that I can analyse the
educational problems in the domain of electrical engineering from the point of view of
teachers and students and then use this analysis to identify solutions from what
e-learning may offer.

1.3.2 Evaluation
Second, I aim to evaluate the model to find out whether it works in higher education
institutions in Turkey by addressing the needs of the domain of electrical engineering.
To evaluate the model, I aim to conduct a case-control study with students to find out
whether the model has had the intended effect on participating students in HEIs
associated with the domain of electrical engineering in Turkey. Additionally, I aim to
evaluate the applicability of the model in the United Kingdom, in addition to Turkey, to
find out whether the model works in a different situation. Moreover, I would like to
highlight that I decided to implement e-learning in Turkey and the United Kingdom
given the resources and time available. Due largely to the use of different languages in
HEIs in other countries, it was not possible to survey the attitudes of students and
teachers towards e-learning within the limited time of this PhD study. In this way, I
could focus on the quality of the research. By developing and evaluating such a model,
I could evaluate whether and to what extent the actual benefits of e-learning would be
applicable in the domain of electrical engineering.

1.4 Parameters of the thesis


1.4.1 Reasons for selecting the domain of Electrical Engineering
The field of electrical engineering was chosen for the study because the existing
literature clearly points out that the related knowledge and skills are important to almost
every aspect of human life, and any research to enhance education in the field is
worthwhile (Prasad, 2009). For example, the field of electrical engineering plays a
crucial role in both developing and developed countries, as it enhances productivity
throughout the economy, which those countries need if they want to compete
successfully amid rapid economic and technological change (Middleton, et al., 1991).
In this regard, electrical engineering is quite common in the target countries and
elsewhere, and students of this field generally find their first job with ease, because they
gain a wide scope of knowledge during their studies in areas such as control,
communication and energy. Moreover, graduates can work in most countries, because
the laws of electrical engineering are universal. Additionally, students from this field
graduate with strong computer skills, as they face computer-based problems every day.

Furthermore, rapid changes are observed in electrical and electronic engineering
compared with other subjects, and there are various modules in this field that students
have to complete. It is therefore valuable to introduce e-learning into HEIs associated
with the subject of electricity in order to stimulate innovation. Moreover, the support and
resources needed for running courses in electrical engineering are also quite different
from those for other disciplines, because both theory and practice play an important role
in electrical engineering. For instance, throughout this research study, I found that
teachers and students in this field have concerns about how e-learning might help to
enhance education in electrical engineering, because the field aims to increase the
practical skills of students, whereas the majority of the models proposed address the
needs of organizations and HEIs that do not involve practical work such as lab
experiments. Moreover, I examined the annual statistics of the Higher Education Council
in Turkey to find out about studies with keywords such as e-learning, online learning or
electronic learning in their titles or subtitles. They indicate that the first thesis about
e-learning was published in 2002 and that 96 more theses followed until 2013.


However, only three of them were in the context of electrical engineering in Turkey,
and none of them concerned the development and evaluation of an e-learning model.
There is therefore a need to carry out more research on e-learning in this domain, to
bring new innovations to the field and to help it deal with the new advancements in
electrical engineering.

1.4.2 Reasons for selecting Turkey and the United Kingdom


The second goal of this research study is to evaluate the model in order to find out
whether it works by addressing the needs of the domain of electrical engineering in
Turkey. To meet this aim, I planned to evaluate the e-learning model in HEIs associated
with the subject of electricity, based on empirical studies in Turkey. In addition to
Turkey, I also aimed to evaluate the applicability of the final model in a different
situation, namely in the United Kingdom. These two countries were chosen for the
implementation of the empirical studies of this project because they exemplified
different conditions for e-learning, such as the instructional language, infrastructures and
general attitudes towards educational technologies; the related variables were evaluated
with the tools developed, such as the surveys and e-courses.

By evaluating the model in Turkey and then in the United Kingdom, I aimed to contrast
those countries as a developing and a developed country, respectively. Due largely to the
use of different languages in HEIs in other countries, it was not possible to survey the
attitudes of students and teachers towards e-learning within the limited time of this PhD
study. Therefore, I planned to investigate the attitudes of different stakeholders,
especially teachers and learners, towards e-learning in electrical engineering in Turkey.
Once the final model emerged, I decided to implement e-learning in Turkey and the
United Kingdom given the resources and time available. In this way, I could focus on
the quality of the research. By developing and evaluating such a model, I could evaluate
whether and to what extent the actual benefits of e-learning would be applicable in the
domain of electrical engineering.


1.5 Research questions of the thesis


This thesis aimed to analyse the educational problems in the domain of electrical
engineering, to use this analysis to identify solutions from what e-learning offers from
the point of view of teachers and students, and then to evaluate the final model in
Turkey and the United Kingdom to examine whether it works. To achieve this aim, the
research questions have an important status. Several writers on research methods, such
as (Flick, 1998), (Mason, 2002), (Creswell, et al., 2003) and (Cohen, et al., 2007), have
emphasized the importance of research questions. The research question is viewed as a
critical part of the research process because it helps the researcher link his or her
literature review to the kinds of data that will be collected (Bryman, 2007). Therefore, I
seek answers to the following questions, based on the systematic literature review and
discussions between me and my supervisors:

1. What factors put a barrier to the readiness of teachers and learners for e-learning in
higher education institutions (HEIs) associated with the subject of electricity in Turkey?

2. What criteria should be used to select an e-learning platform before embarking on
e-learning in the domain of electrical engineering?

3. How should e-learning materials be developed to support electrical engineering from
the point of view of teachers and students in higher education institutions associated
with the subject of electricity in Turkey?

4. How should students be trained in order to get them ready for e-learning in the
domain of electrical engineering?

5. Is there any significant difference between e-learning, blended learning and
traditional learning in the domain of electrical engineering in Turkey and the United
Kingdom?


1.6 Structure of the thesis


This thesis is organized into three parts comprising 14 chapters. Part I, Fundamentals,
begins with Chapter 1, which discusses the purpose of the thesis and provides
information about its background. It continues with Chapter 2, which provides a
literature review on e-learning, and concludes with Chapter 3, which explains the
procedures for collecting, analysing, interpreting and reporting data.

Based on the purpose of the thesis, I divided the rest of the thesis in two parts:
development and evaluation. Part II starts with Chapter 4, Developing an e-Learning
Model, the largest section of the thesis. Chapter 4 reports the analyses of the factors that
might affect the readiness of organizations or individuals and describes a conceptual
framework for measuring different stakeholders in HEIs, especially those associated
with the subject of electricity. Chapter 5, 6, 7 and 8 report the analyses of the survey
data in Turkey. Chapter 9 presents the development of a model for delivering elearning by comparing the perspectives of students and teachers in Turkey.

Chapter 10 opens Part III, Evaluating the E-learning Model, with a chapter that
provides procedures for evaluating the model for e-learning based on the empirical
studies in Turkey and the UK. Chapter 11 and 12 report the measurement concerning
the readiness of students in Turkey and the UK and develop various structural equation
models. Chapter 13 describes a case-study conducted to assess the pedagogical value of
e-learning using a web-based, campus-based and mix mode of these approaches. It
concludes with Chapter 14, summarizing the key points of the thesis and answering the
research questions.


CHAPTER 2: LITERATURE REVIEW


2.1 Introduction
e-Learning as a research field is interdisciplinary because it brings together two research
fields, technology and education. Therefore, it is highly important to develop clear
methodologies and ways of thinking that define the disciplines of technology and
education. To develop clear methodologies and ways of thinking, the focus has to be
pedagogy, as Laurillard (2007) pointed out. She notes that we have to focus on what the
pedagogy requires rather than what technology offers, because technology offers a range
of different ways of engaging learners in the development of knowledge and skills. In
addition, theories of learning are there to guide our approach to providing what
pedagogy needs.

Therefore, it is important to focus on the learners and teachers to find out what they
need, because they are in the act of learning. In this way, we can develop clear methods
and activities of learning through e-learning. I therefore believe that I have to carry out a
literature review on e-learning to find out what it offers in terms of technology; that is to
say, I aim to understand the use of the word electronic in the term electronic learning.
Second, as theories of learning are there to guide our approach to providing clear
methods and activities of learning, I will examine learning theories. Finally, I will
include a discussion of a wider range of concepts, including blended learning,
game-based learning and the flipped classroom, because I believe they could provide a
theoretical underpinning for the ideas about the use of e-learning that I discuss and
implement in the thesis.


2.2 Understanding E-learning


The term e-learning stands for electronic learning and is usually used to refer to any use
of technology (e.g. computers) to learn (Donnelly, et al., 2012). However, there are
many ways to define the term e-learning, because a variety of related terms are used
interchangeably on account of the word electronic in the term. JISC defines e-learning
as learning supported and facilitated through the use of ICT (JISC, 2004). That is to say,
the word electronic in the term e-learning is considered as ICT in the JISC report.
However, JISC also limits the use of ICT to specific hardware and software. The word
software is used to refer to the use of applications, namely assistive software, email,
discussion boards, video conferencing, virtual learning environments and learning
activity management systems, and the word hardware to the use of tools, namely
computers, interactive whiteboards, digital cameras and mobile phones. In a similar
way, e-learning is defined in the report of the Higher Education Academy as the use of
computer technologies to support learning, whether that learning is local or remote
(HEA, 2014). That is to say, the word electronic stands for computer technologies in the
HEA report.

As seen above, in terms of technology, the word electronic in the term e-learning covers
a range of different ways of engaging learners in the development of knowledge and
skills, such as computers, mobile phones, interactive whiteboards, digital cameras,
email, video conferencing and so on. Therefore, the term e-learning is really just an
umbrella term because it comprises all forms of electronically supported learning. As a
result, many terms such as e-learning, open learning, distance learning and online
learning are used interchangeably, though they have important differences. I believe
that it is useful to interpret the different technologies that are treated as synonyms of
e-learning, such as online learning and offline learning.


2.2.1 The First Interpretation: On-line Learning & Off-line Learning


The term online learning is commonly used as a synonym of e-learning, though it does
not represent all the characteristics of e-learning. The word online is frequently used to
describe products, services or information that can be bought or used on the Internet.
Moore et al. (2010) questioned whether e-learning and online learning are the same.
They found that there are differences between the two terms and that those differences
also vary from continent to continent in terms of their usage. Oblinger & Oblinger
(2005) distinguished online learning from e-learning and described online learning as
wholly online learning. According to Oblinger & Oblinger (ibid), some other authors
(e.g. Benson, 2002; Carliner, 2004; Conrad, 2002) described online learning as access to
learning experiences via the use of some technologies.

2.2.2 The Second Interpretation: Asynchronous & Synchronous e-Learning


The terms asynchronous and synchronous e-learning are also commonly used to
differentiate some characteristics of e-learning. The word asynchronous is used to
describe something that is not happening or done at the same time or speed, the opposite
of the word synchronous. Hrastinski (2008) noted that e-learning has mainly relied on
asynchronous means for teaching and learning, but recent improvements in technology
have led to the growth of synchronous e-learning. The term synchronous e-learning
refers to learning that takes place live and in real time. However, it differs from other
forms of synchronous learning, such as lectures, product demonstrations and other
knowledge-dispersal activities, because synchronous e-learning takes place through
electronic means (Hyder, et al., 2007). Besides, Lado (2008) pointed out that
synchronous e-learning, which takes place live, can help students overcome
geographical barriers. Following both Lado (ibid) and Hyder et al. (2007), I interpret the
term synchronous e-learning as learning that takes place live through electronic means
but is independent of space. Technologies used for synchronous e-learning are usually
categorized as instant messaging, live webcasting, and audio and video conferencing
(Hyder, et al., 2007).


However, it should be noted that the examples given above do not imply that
synchronous e-learning takes place only via the Internet. Radio and TV programmes
broadcast live and in real time have also been successful means of synchronous learning.
Besides, synchronous e-learning might also happen in the classroom through the use of
whiteboards and slides. However, there is also a need to clarify the term asynchronous
e-learning in order to differentiate it from synchronous e-learning. The term
asynchronous e-learning is defined as learning that happens independently of time and
space (Lado, 2008). It is distinguished from synchronous e-learning because students
access materials intermittently on demand rather than continuously (Hyder, et al., 2007).
A discussion forum is a good example of asynchronous e-learning: a student can post a
message, and another student or the teacher can comment on the posting hours or days
later. Other examples of asynchronous e-learning include documents and web pages,
e-mail, podcasting, DVDs and CDs, recorded events and so on.

2.3 Learning Theories


e-Learning is about the use of technology to support learning, especially through the
Internet. Therefore, learning through technology plays an important role in creating an
effective and adaptable learning environment when referring to e-learning. As
mentioned above, terms such as e-learning and online learning are used interchangeably,
though they have important differences. However, the word learning is the most
important factor shared by all of them and is central to each of them (Race, 2005).
Therefore, it is important to understand theories of learning, as they are there to guide
our approach to providing what learners and teachers need. Theories of learning involve
several aspects of the learning process. For instance, Diana Laurillard and her colleagues
examined learning theories in detail to identify aspects of the learning process such as
the relationship between teacher and learner, and between theory and practice, as
illustrated in Figure 2. In the framework, the teacher is represented in dialogue with a
learner and each learner in dialogue with other learners. The framework also
characterizes the learning process as a series of activities by teachers and learners,
cycling between theory and practice, between the teacher and each learner, and between
and among learners.


Figure 2: The Conversational Framework

Mellow et al. (2011) describe the conversational framework in terms of learning
theories: first, a didactic form of teaching and learning appears when a teacher presents
ideas and the learner asks questions. The second theory in the framework is social
constructivism, in which learners need to discuss, debate and negotiate ideas. On the
other hand, learning through collaboration occurs when learners work in partnership to
share the outputs of their practice. Finally, to actualize constructionism, learners need to
use their ideas to achieve a goal in a practice environment. As mentioned above, theories
of learning indicate several different aspects of the learning process. However, Mellow
et al. (2011) point out that although the education community knows about learning
theories, it does not always apply this knowledge. For instance, the didactic form of
teaching and learning is still considered the dominant pedagogy for engineering
education, despite the large body of education research advocating other approaches
(Mills & Treagust, 2003). In addition, Weller (2004) notes that transferring a teaching
approach from one medium to another is not effective. For instance, there are a variety
of pedagogical reasons why simply reproducing the standard lecture hall as an online
lecture hall is ineffective. For these reasons, many educators find themselves adapting
their approach. Therefore, it is important to apply learning theories in practice.


2.4 Theoretical Underpinning


There are no clear methodologies and ways of thinking that bring together the
disciplines of technology and education (Laurillard, 2007). However, there are some
theoretical underpinnings for the ideas about learning and teaching with technology. For
example, game-based learning is often suggested purely for motivational reasons, but
there is a literature on game-based learning that draws on theories such as situated
learning, constructivism, social constructivism and scaffolding. Similarly, much of the
literature about online learning also includes conversational approaches to learning.

2.4.1 Blended Learning


It is important for higher education to find effective and flexible delivery models that
provide all students with more convenient access to quality learning experiences rather
than relying on traditional learning alone (George-Walker & Keeffe, 2010). Blended
learning has been proposed as one important solution to address both student learning
and higher education organisational needs (Macdonald, 2008). The most common
understanding of the term blended learning still seems to be that it mixes different
delivery modes, especially online and face-to-face teaching. However, the use of the
term blended learning is growing, and this growth causes the term to lose all its meaning
(Mason, 2005; Hofmann, 2006; Torrisi-Steele & Drew, 2013).

According to Mason (2005), blended learning is an amorphous term that can
theoretically be applied to almost any learning situation, because the word blended can
refer to almost anything, such as the technologies, the teaching methods, the learning
experiences of the students or the location of the learning events. For instance, Mason
and Rennie (2004) note that broadband technology allows students at a distance from
the university campus to have similar access to resources, multimedia and fast
downloads as students using terminals on campus. As the definitions and understandings
of the term blended learning are many, Torrisi-Steele & Drew (2013) undertook a
literature search using the Thomson Reuters Web of Knowledge citation database in
order to gain a better understanding of academics' design and implementation of
blended learning.

Key findings from this literature search point out that case and student-focused studies
of blended learning are undeniably useful and necessary. However, Torrisi-Steele &
Drew (2013) also note that professional development and support are needed to
formulate appropriate strategies to facilitate academics' effective implementation of
blended learning, especially in the integration of technology and the transformation of
practice. In practice, blended learning involves adding the use of ICT to traditional
learning and hence mixes different delivery modes (Verkroost, et al., 2008). Therefore,
researchers should search for the most appropriate combination of blended learning in
higher education (George-Walker & Keeffe, 2010). Nowadays, a number of different
modes of blended learning appear in higher education, such as the flipped classroom.
The flipped classroom (also called flipped teaching) is known as a form of blended
learning in which students start learning at home and then continue in class with
teachers and with other students.

Hughes (2007) highlights that the flipped classroom is a pedagogical concept and
method that replaces the standard lecture-in-class format, because students have
opportunities to review, discuss and explore course content with the teacher and other
students in class. Hughes also notes that there are many ways in which a classroom can
be flipped. However, the most common way to apply the flipped classroom approach is
to encourage students to view recorded lectures or read course materials outside of class
and then meet to engage in problem solving, discussion and practical application
exercises with their instructor and other students inside of class. Students in traditional
approaches do not have such opportunities because the teacher plays the role of
information conveyor, while the students assume a receiver role with the primary
responsibilities of listening and note-taking (Zappe, et al., 2009). It is also important to
note here that instructors may apply various teaching styles in higher education, but
time constraints often limit their teaching style to the traditional lecture format (Lage, et
al., 2000). The flipped classroom approach, by contrast, can encourage students to learn
both outside and inside of class. Strategies for flipping the classroom outside of class
and inside of class may vary.

Hughes (2007) suggests that moving the lecture out of the classroom may involve
selecting course content, deciding the organization of the content, choosing multimedia
to deliver the content, creating materials and making the materials available to students.
Moreover, in-class strategies may involve answering students' questions at the
beginning of class, facilitating individual or group activities and summarizing key
points. For instance, Zappe et al. (2009) used iTunes U to post video recordings of
lecture material with supplemental content to allow greater time for in-class problem
solving and to increase the opportunity for teacher-student interaction. However, it is
highly important to note that there is no single model for the flipped classroom
approach, because the term is used to describe almost any class structure that provides
strategies for learning outside and inside of class. As short video lectures are widely
used for students to view at home, broadband technology plays an important role in the
flipped classroom approach. Hence, the focus should be on models for supporting
learning with e-learning.

2.4.2 Game-based and Mobile Learning


Game-based learning is a relatively new concept. However, Liu et al. (2014) note that
there is little agreement on the definition of a game, because many terms such as games,
computer games and video games are used interchangeably. Digital games were initially
designed for the mass market as a form of entertainment with few educational
connections. As gaming has grown in popularity and become a defining characteristic of
young learners, digital games have also gained increasing interest from educators and
researchers (Sadera, et al., 2014). Researchers such as Schunk (1991) and Zimmerman
(2000) question whether the characteristics of games (e.g. carefully crafted storylines,
the quality of design, and dynamic and immediate feedback) might be leveraged to
support learning. Moreover, some researchers, such as Abrams (2009), Gerber and Price
(2011), Squire (2011) and Steinkuehler et al. (2011), have identified that game-based
learning can be used to enhance student learning in class-related activities.


For instance, Holmes (2011) suggests that appropriately designed computer games may
play a useful role in helping some struggling readers at home because of the fun and
motivational characteristics of computer games. In addition, mobile learning is also
becoming a popular concept in education (Mifsud, 2014). Mobile learning is defined by
Traxler (2009) as learning that allows students to access course resources and materials
from anywhere using mobile devices. Devices such as smartphones and tablets are
becoming the main tools of mobile learning because people can gain learning support or
obtain a great amount of knowledge from mobile devices connected to the Internet (Hsu
& Chen, 2010). Our lives are already surrounded by mobile learning; for example, it is
common to read a document on a smartphone while drinking a cup of coffee in a
Starbucks café. However, there are some concerns about the combination of mobile
devices with learning. Hoppe et al. (2003) highlight that mobile devices should be
adapted to learning needs. For instance, platforms such as Moodle support smartphones,
tablets and computers at the same time. However, it is also important to investigate the
opinions of teachers and students, as they are at the centre of learning, when adapting or
designing mobile technology for learning needs.

2.4.3 Student- and Teacher-Centred Learning


The term student-centred learning is used simply to express a system of providing
education and training which has the student at its heart, as opposed to teacher-centred
learning (Brandes & Ginnis, 1991). However, there seems to be considerable confusion
about what student-centred learning actually is and a lack of agreement about its
definition. Many terms, such as flexible learning, experiential learning and self-directed
learning, have been linked with student-centred learning (O'Neill & McMahon, 2005).
Lea et al. (2003) noted that the confusion about its definition arises from a range of
potential definitions, because different researchers and practitioners emphasize different
aspects of the learning and teaching process. Burnard (1999) interpreted the concept of
student-centred learning as meaning that the student not only has the opportunity to
choose what to study but may also choose how and why to study it. From the
interpretations of Burnard, O'Neill & McMahon (2005) emphasized that the concept of
choice in learning is a precondition for student-centred learning.


Similarly, students in open learning are also considered to have choice in their learning
(Race, 1994). There is also another term that I need to clarify in addition to open
learning, namely distance learning. For similar reasons, distance learning has also been
credited with providing study opportunities for those who cannot or do not want to take
part in campus learning (Holmberg, 1995). On the other hand, teacher-centred learning
is also an important concept in education. Brandes & Ginnis (1991) said that many
teachers find teacher-centred learning safe, natural, comfortable and appropriate,
because the student acts as a participator. In student-centred learning, by contrast,
students have the full responsibility for their own learning and for evaluating the results,
because the teacher acts only as a facilitator.

2.4.4 Open Learning

Open learning allows the learner to choose how to learn, when to learn, where to learn
and even what to learn, as far as possible within the constraints of any education and
training provision, because the learners learn in their own ways, at their own pace, in
their own place and at their own time (Race, 1994; Paine, 1989). However, open
learning with some limitations might also appear within traditional learning. Race
(1994) exemplified the use of open learning in a crowded lecture room: the teacher
might ask the class to spend a few minutes reading some handouts in order to answer
some questions, studying in their own ways and at their own pace. Race (1994)
highlighted that open learning brings more choices to open learners, namely: pace,
taking as long as is needed to complete a chunk of studying; place, choosing where to
learn (e.g. home, library, workplace); time, choosing when to do the learning; and
processes, choosing how to learn. Besides, Race (ibid) pointed out that open learners
should have the responsibility to make sensible choices. Race (2005) added that the
term distance learning is usually applied to open learning that takes place at a distance.
On the other hand, Rumble (1989) argued that the use of the terms open learning and
distance learning in practice is frequently misleading and noted that many contiguous
and distance learning systems are open in their practices.


2.4.5 Distance Learning


The term distance learning refers to learning that takes place at a distance and is
characterized by a clear separation in space and/or time of the majority of teaching and
learning activities (Barbara, 1993; Keegan, 1996; Race, 2005). However, there is no
universal agreement about the characteristics of distance learning, because its use varies
in practice. The popularity and use of distance learning have grown as more advanced
technology has become available. For example, although the first phase of distance
education was based on an ingenious idea for delivering instruction to a potentially
limitless audience, namely correspondence courses by mail in the mid-1800s, distance
learning was carefully supported with constructed texts and audio and video materials
(Matthews, 1999). After some advances in technology, distance education was also
supplemented with conventional broadcast radio and television. Williams et al. (1999)
described the development of distance education in three levels.

Level 1 considers the learners as passive because there is no opportunity for the learner
to interact with the instructor in real time. For instance, a distance learner may send a
message to his instructor but may receive a response only after a lengthy delay due to
mailing. Moreover, this level consists of only printed material, audio- and videotapes
and radio transmissions. In Level 2, distance learners have the ability to transmit
messages simultaneously and receive immediate feedback because of new technologies.
This level consists of two-way audio tele-training, electronic mail, computer-mediated
conferencing and so on. The third level consists of virtual environments and hybrid
networks and hence is considered highly interactive. In this level, depending on the
elements of the course being taught, there can be more than one primary mode of
delivery. This third level of distance learning is frequently named e-learning.


2.4.6 Traditional Learning


Traditional learning is simply defined as learning from people such as lecturers,
instructors and tutors (Race, 2005). McInnerney & Roberts (2009) note that traditional
learning cannot be easily characterized because it comes in a variety of forms. However,
they highlighted that the most common feature of traditional learning is that it comes
with the idea of the sage on the stage, with information provided by the tutor in the
classroom in the form of lectures and printed materials. Rashty (1999) also identified
some common characteristics of traditional learning, including: the tutor talks more
than the student; the learning is based on the whole class participating; the tutor teaches
according to the existing curriculum; the student mainly learns what but not how; the
student's motivation is low; the tutor is the authority and dictates the structure of the
lesson and the division of time; and there is almost no group or individual activity. As
traditional learning is closely related to the notion of campus learning, I explore the
latter in the following sub-section.

2.4.7 Campus Learning


The campus is mainly described as the land and buildings belonging to a college or
university. A campus usually consists of libraries, lecture halls, residence halls,
laboratories, student centres, dining halls, cafes, gyms, stadiums and so on. Beyond the
lecture theatre, there is a range of high-quality facilities in the service of students. Astin
(1999) also highlighted that residence halls on the campus have important effects on
student persistence in school. Therefore, campus learning differs from traditional
learning because student involvement outside of the classroom, such as in residence
halls, has also been linked to students' learning and development (Mull, 2002).


2.5 The History of e-Learning


The term e-learning most likely originated during the 1980s, though the origin of the
term is not certain (Moore, et al., 2010). Benson (2002) stated that online learning is a
newer version of distance learning. Like Benson, many authors perceive that there is a
relationship between e-learning and distance learning, though the two have important
differences as well as similarities. Moore et al. (2010) investigated the perceptions of
these two terms through a survey and found from the feedback that distance learning
was perceived as old-fashioned, belonging to a time in the past. Besides, many written
texts use the terms online learning, distance learning and e-learning interchangeably.
Murphy & Sharman (2006) also noted that the growth of communication technologies
has made distance learning easier in recent years and has led to the advent of the new
term e-learning. It is therefore important to go back to the origins of distance learning to
understand e-learning better globally, especially from the perspectives of Turkey and
the United Kingdom.

The term distance learning is defined as a form of study which is not led by teachers
present in the classroom but is supported by tutors at a distance from the students
(Sewart, et al., 1983). From this brief description, I interpret that two conditions must be
met before distance learning can happen: (i) the separation of the teacher and the learner
in terms of geography and time and (ii) the support of the teacher to the student with the
help of an organization. Sewart et al. (1983) therefore equated distance learning with
correspondence learning on the basis of these two conditions. The early pioneers of
distance learning were Isaac Pitman and Caleb Phillips. Some sources credit Isaac
Pitman as the early pioneer of distance learning: he started correspondence learning in
1840 in Bath in England and asked his students to copy short passages of the Bible and
return them for grading through the postal system. However, Caleb Phillips is also
recognized for teaching a new method of shorthand in 1728 in Boston (Holmberg,
2005), although the methods used by Pitman are considered much more modern. The
history of distance learning is interpreted in different ways, but its main features can be
identified and interpreted in terms of interactivity, technology and chronology. Many
words such as interactive, interactivity and interaction are derived from the word
interact to describe people or things that talk to each other, work together or affect each
other.

It is therefore easy to find the terms interaction and interactivity used interchangeably,
though they serve different purposes. The word interaction is mainly used to denote an
occasion when two or more people or things communicate with or react to each other,
such as interaction between two languages or interaction between teacher and learner.
On the other hand, the word interactivity describes the involvement of users in the
exchange of information with computers and the degree to which this happens. Besides,
Roblyer & Ekhaml (2000) highlighted that the focus of these words is different:
interaction focuses on people's behaviours, whereas interactivity focuses on the
characteristics of a system. According to these explanations, I interpret the word
interactivity here as the degree of interaction between things or people that talk to each
other, work together or affect each other.

Roblyer & Ekhaml (2000) considered interactivity a factor that plays a primary role in
the achievement and satisfaction of students in distance learning programmes and noted
that technologies supporting high interactivity seem necessary to allow person-to-person
and person-to-system interaction. However, it should also be noted that interaction
refers to an action and reaction chain, which can be verbal or non-verbal. From this
perspective, we can interpret the development of distance learning in terms of
interactivity and categorize it into groups. There are many ways to establish interactions
between things or people. However, the main methods used today are generally based
on verbal communication (e.g. oral and written communication) and non-verbal
communication (e.g. gesture, body language, posture, tone of voice, facial expressions,
touch). For example, Roblyer & Ekhaml (2000) found that many students never choose
distance learning because they believe it could never provide the qualities of a
face-to-face course. It seems that the main reason behind the perceptions of such
students is that all types of communication are used actively and together in traditional
learning, and this brings high interactivity into the classroom. It is therefore important to
evaluate the history of distance learning in terms of interactivity.

CHAPTER 3: METHODOLOGY
3.1 Introduction
Research is commonly identified as a step-by-step process by which the researcher can
extend his or her knowledge or find the answers to his or her questions (Matthews &
Ross, 2010). Therefore, a research study involves many stages, from planning and
design, through data collection, to data analysis and reporting (Cohen, et al., 2007). The
question, the research process and the answer are highlighted as important
characteristics of any research study (Matthews & Ross, 2010). According to Matthews
and Ross (2010), the first thing to consider in planning research is to identify the
purpose of the research, and this requires a critical review of the literature, because
otherwise the researcher may pose research questions whose answers are already
known. The second thing to consider is the research design, because the researcher has
to think about how he or she is going to do the research. A research design is therefore
considered an essential part of the research process because it has significant
implications for the quality of the data (Pallant, 2010). However, there is no single way
of planning research.

Cohen et al. (2007) highlight that research design is governed by the notion of fitness
for purpose. As the aim of this research is to bring together technology and education to
support electrical engineering with e-learning, the research design is crucial in this
thesis to make sure these two research fields, technology and education, fit together.
Once the researcher has identified the research problem, they are required to choose a
specific research design that best fits that problem. As a result, a number of frameworks
have been published to help researchers plan their studies. Cohen et al. (2007) published
a framework for planning research involving 24 items, from the statement of the
research problem to writing up the research. Creswell (2009) also proposed a systematic
guide for researchers to choose a research design in three steps, namely selecting the
research design, deciding the research study and describing the research method.
However, he also emphasized that there are a number of aspects of a research design
depending on the types of research designs, approaches and methods selected by the
researcher. A framework for planning research may encompass procedures for
collecting, analysing, interpreting and reporting data (Creswell & Clark, 2011).


Morrison (1993) also identified the issues that constitute a framework for planning
research in four main areas, namely orienting decisions, research design and
methodology, data analysis, and presenting and reporting the results. As I outlined the
research problem and the relevant literature in Chapters 1 and 2, the purpose of this
chapter is to present the methods adopted to answer the research questions. This chapter
therefore starts by giving an overview of the methods available in education and social
science. Then it describes the research design adopted for the study. Next, it discusses in
detail the stages of the research design, describing the sampling, the procedures for
collecting, analysing, interpreting and reporting data, and the limitations of the
techniques used in each stage. Finally, the chapter closes with the ethical concerns in
this research study.

3.2 Research Methods Available in Education and Social Science


Research is defined as human activity based on intellectual application in the
investigation of matter (Glenn, 2010). Therefore, research typically involves two
components, namely the search for truth and finding something new (Cohen, et al.,
2007; Gordon & Marian, 2006). For instance, Kothari (2004) notes that research aims to
find out the truth which is hidden. According to Glenn (2010), the purpose of research is
to discover, interpret and develop methods and systems in order to enhance human
knowledge on a wide variety of scientific matters of our world and the universe. The
basic types of research are as follows: descriptive vs. analytical, applied vs.
fundamental, quantitative vs. qualitative, and conceptual vs. empirical (Kothari, 2004).

However, these basic types of research point to three basic approaches to research,
namely qualitative, quantitative and mixed methods approaches. The qualitative
approach is concerned with subjective assessment of attitudes, opinions and behaviour
(Kothari, 2004). On the other hand, the quantitative approach has typically been more
directed at theory testing or verification (Punch, 2005). The distinction between
qualitative and quantitative research is often expressed in terms of using words and
numbers (Creswell, 2009). Similarly, open-ended questions frequently appear in
qualitative research and closed-ended ones in quantitative research. Mixed-methods
research simply combines the collection, analysis and interpretation of both qualitative
and quantitative data.


However, mixed-methods research also involves the use of both kinds of data in tandem,
so that the overall strength of a study is greater than that of either qualitative or
quantitative research alone (Creswell & Clark, 2011).

3.2.1 Qualitative and Quantitative Research


Some researchers, such as chemists, economists and sociologists, tend to focus on
quantitative research, while others, such as anthropologists, historians and political
scientists, rely on qualitative inquiry (Lapan, et al., 2012). There may be many reasons
for researchers to choose a qualitative method, such as exploring the experiences of a
particular group of people without making prior assumptions (McQueen & Knussen,
2013). A number of research methods are used to obtain qualitative data, such as
observation, ethnography, interviews and discourse analysis, in order to understand a
particular phenomenon better. While the quantitative approach often leads to
hypothesis testing, the qualitative approach leads to hypothesis generating (Auerbach &
Silverstein, 2003).

3.2.2 Mixed-Methods Research


Mixed-methods research is becoming a popular approach in several areas because it is
viewed as an approach that provides a better understanding of research problems
(Molina-Azorin, 2012). Mixed-method studies incorporate elements of both qualitative
and quantitative research (McQueen & Knussen, 2013). Creswell (2009) noted that
there are three general strategies in mixed-methods research: first (sequential mixed
methods), the researcher seeks to elaborate on the findings of one study (e.g. a survey)
with another study (e.g. interviews); second (concurrent mixed methods), the researcher
merges quantitative and qualitative studies in order to provide a comprehensive analysis
of the research problem; and third (transformative mixed methods), the researcher is
interested in using a theoretical lens as an overarching perspective within a design that
contains both quantitative and qualitative data.


The methodological pluralism of mixed-methods research is viewed as a key feature
because mixed-methods research frequently results in superior research compared with
a purely qualitative or quantitative approach (Johnson & Onwuegbuzie, 2004). For
example, a study may begin with a survey in which a theory is tested, followed by an
interview study involving detailed exploration with a few individuals. Furthermore, the
researcher may benefit from the use of both closed- and open-ended questions in a
survey at the same time and integrate the information in the interpretation of the overall
results. The timing of the two strands within a study is a major criterion for selecting a
research design in mixed-methods research, for example when the researcher
implements both the quantitative and qualitative strands during a single phase of the
research (Creswell & Clark, 2011). When the researcher collects, analyses and interprets
both qualitative and quantitative data at the same time, they can also obtain different but
complementary data on the same topic (Morse, 1991).

3.3 The Research Design Adopted for the Study


The overall goal of this research was to develop a model for supporting the domain of
electrical engineering in higher education institutions in Turkey with e-learning and then
to evaluate the model in higher education institutions associated with the subject of
electricity in Turkey and the United Kingdom. Therefore, this research adopted a
mixed-methods design to develop and evaluate the model, as illustrated in Figure 3. The
model development was based on survey data collected from representative teachers
and students in Turkey, whereas the model evaluation was conducted in the relevant
HEIs in Turkey and the United Kingdom using a case-control study.


Figure 3: Research Design adopted for the Study. The mixed-methods design comprises
two stages: Stage 1, the model development based on a survey study (Step 1,
quantitative: administering a web-based questionnaire; Step 2, qualitative: conducting
semi-structured interviews), and Stage 2, the model evaluation based on a case-control
study (Step 3, quantitative: conducting a case-control study; Step 4, qualitative:
administering a web-based questionnaire).

To develop the model, the study investigated the attitudes of representative key
stakeholders towards e-learning in Turkey by administering a web-based questionnaire
and semi-structured interviews, in 2010 with teachers and in 2011 with students; the
responses of teachers and students were then compared. To evaluate whether the model
works in Turkey and in a different setting, namely the United Kingdom, a case-control
study was conducted to determine whether the model had the intended effect on the
participating students in HEIs in Turkey and in the United Kingdom.


3.4 Research Methods Used for the Study


This research study involves both quantitative and qualitative studies, based on a survey
study with representative teachers and students and a case-control study with students.
Survey research is considered the bread and butter of mainstream social science because
it allows researchers to use representative samples to learn about people's beliefs,
behaviours and experiences (Abbott, 2013). The conduct of surveys using the Internet is
also becoming commonplace in many branches of social science. There are three types
of surveys, regardless of whether they are conducted using the Internet, namely
face-to-face interviews, telephone interviews and self-administered questionnaires.
However, Cohen et al. (2007) note that internet-based surveys have their own particular
features, although they have many features in common with paper-based surveys.
Abbott (2013) notes that online surveys are unique because of the way in which they are
delivered to respondents. Several writers on research methods point out that
internet-based surveys first appeared in the form of emails and then moved to emails
plus attachments of the questionnaire itself. In the meantime, researchers started to use
emails to direct potential respondents to a web site. Moreover, using emails and using
web sites to conduct surveys each have their own particular features.

For instance, while web-based surveys can include rich graphics, emails have the
attraction of immediacy. In addition, web-based surveys have the potential to reach
greater numbers of respondents, and emails can attract a greater response. However, the
particular features of web- and email-based surveys can be used together. For instance,
emails can direct potential participants to a web site at which the survey questionnaire is
located in HTML form. In this regard, in this study, I aim to send personalized
invitations via email to potential participants to direct them to a web site at which the
survey questionnaire is located. However, building online surveys from scratch would
be difficult for most researchers. To help researchers, several online companies have
developed templates for online surveys, such as SurveyMonkey
(https://www.surveymonkey.com) and LimeSurvey (http://www.limesurvey.org).
Therefore, I aim to use LimeSurvey for the online surveys because it is open source and
easy to install on a server. In addition, at the end of the questionnaire, I aim to invite
voluntary participants to take part in semi-structured interviews.
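As an illustration of this procedure, the following minimal sketch shows one way such
personalized email invitations could be generated and sent with Python's standard
library; the mail server, sender, recipients and survey link are hypothetical placeholders
rather than the actual details used in this study.

import smtplib
from email.message import EmailMessage

SMTP_HOST = "smtp.example.ac.uk"                               # hypothetical mail server
SENDER = "researcher@example.ac.uk"                            # hypothetical sender address
SURVEY_URL = "http://survey.example.org/index.php?sid=12345"   # hypothetical LimeSurvey link

participants = [                                               # hypothetical contact details
    {"name": "A. Teacher", "email": "a.teacher@example.edu.tr"},
    {"name": "B. Student", "email": "b.student@example.edu.tr"},
]

with smtplib.SMTP(SMTP_HOST) as server:
    for person in participants:
        msg = EmailMessage()
        msg["Subject"] = "Invitation to a survey on e-learning readiness"
        msg["From"] = SENDER
        msg["To"] = person["email"]
        msg.set_content(
            f"Dear {person['name']},\n\n"
            "You are invited to complete a short web-based questionnaire on e-learning:\n"
            f"{SURVEY_URL}\n\n"
            "Participation is voluntary and your answers will remain anonymous."
        )
        server.send_message(msg)                               # one personalized invitation per participant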


3.4.1 Web-based Questionnaire


While surveys are often the best way to collect data about the views of people, no
survey can achieve success without a clear idea of its purpose and, especially, without a
well-designed questionnaire (Crawford, 1997). Hence, various steps have been proposed
by researchers to prepare an effective questionnaire. A seven-step procedure is proposed
by Lambin et al. (2007) to assist the design of a questionnaire, namely determining the
information required; the type of questionnaire to be used; the content of individual
questions; the type of question to use; the wording of questions; the sequence of
questions; and pre-testing the questionnaire. Crawford (1997) also noted a number of
points: a questionnaire should meet the research objectives, obtain the most complete
and accurate information possible, make it easy for participants to give the necessary
information and for the interviewer to record the answers, and be brief and interesting.
Many of these steps are broadly categorized into two groups by Dray et al. (2011) for
conducting a questionnaire with a validated instrument. Dray et al. (2011) suggested a
two-stage approach. First, an initial questionnaire is developed with respect to the
constructs specified in a conceptual model, which should be grounded in the related
literature and should also address the objectives of the research study; discussions
among the researchers and subject-matter researchers then refine the items in the
questionnaire. Second, focus groups and interviews are conducted to explore prospective
participants' interpretations of the meaning of the individual items in the questionnaire.
In the light of these concepts, I aim to design a questionnaire using mixed-methods
research, as discussed in Section 3.3 in detail.

3.4.1.1 Defining the Information Required


A well-designed questionnaire should meet the objectives of the research and must
address the population about which the researcher wishes to generalise from the sample
data to be collected (Crawford, 1997). As the main objective of this study is to
investigate what principles are needed to implement e-learning in HEIs associated with
the subject of electricity in Turkey, it is important to obtain information about relevant
factors, such as the age and education of the target participants, and to specify the size
of the population.


3.4.1.2 Finding out about the Population


According to the official data for 2010 provided by OSYM, which stands for the Student
Selection and Placement Centre in Turkey, the number of programmes associated with
the subject of electricity that could be selected by Turkish students was 427. In addition
to the official OSYM data for 2010, there were ten more programmes (i.e. Electrical
Education) associated with the subject of electricity in Turkey. Those programmes were
closed down on 13 Nov 2009 while many students were still studying in them; hence, it
was also worthwhile to include those programmes in our study. The characteristics of all
these programmes are given in Table 1.

Table 1: HEIs associated with the Subject of Electricity in Turkey in 2010

                                              Inside of Turkey   Outside of Turkey
Program Title                                 n        %         n        %
P1: Electricity                               230      55.2      0        0.0
P2: Electrical Gen. Trans. and Distribution   7        1.7       1        5.0
P3: Rail Systems Technology                   2        0.5       0        0.0
P4: Electrical Appliance Technology           10       2.4       0        0.0
P5: Electrical Engineering                    6        1.4       0        0.0
P6: Electrical and Electronics Engineering    148      35.5      19       95.0
P7: Avionics                                  1        0.2       0        0.0
P8: Aircraft Electrics and Electronics        3        0.7       0        0.0
P9: Electrical Education                      10       2.4       0        0.0
Total                                         417      100.0     20       100.0

As 20 of these programmes were located outside of Turkey, namely in Northern Cyprus
and Bosnia and Herzegovina, they were excluded from our study. Table 2 shows the
number of students studying in the remaining institutions at the time of data collection.


Table 2: HEIs associated with the Subject of Electricity in Turkey

                                                       Capacity           Placed
Programs and their Titles                              n        %         n        %
Associate Programs      P1: Electricity                24295    39.58     20053    36.77
                        P2: Electrical Generation      680      1.11      622      1.14
                        P3: Rail Systems               110      0.18      110      0.20
                        P4: Electrical Appliance       820      1.34      593      1.09
Undergraduate Programs  P5: Electrical Engineering     2787     4.54      2787     5.11
                        P6: EE Engineering             29581    48.19     27257    49.98
                        P7: Avionics                   130      0.21      130      0.24
                        P8: Aircraft EE                490      0.80      490      0.90
                        P9: Electrical Education       2495     4.06      2495     4.57
Total                                                  61388    100.00    54537    100.00

EE: Electrical and Electronics

As seen from Table 2, the majority of students were studying in departments of
electrical and electronics engineering (49.98%) and in departments of electricity
(36.77%) across Turkey. Hence, the number of students studying in HEIs associated
with the subject of electricity in Turkey was 54537 as of 2010. Moreover, as the
education and training in several of the programs given in Table 2 were highly similar,
they could be categorized together, as illustrated in Table 3. For example, the programs
of electricity (P1), electrical generation, transmission and distribution (P2), rail systems
electrics and electronics technology (P3) and electrical appliance technology (P4) can be
categorized into vocational programs; the programs of electrical engineering (P5),
electrical and electronics engineering (P6), avionics (P7) and aircraft electrics and
electronics (P8) into engineering programs; and electrical education (P9) into education
programs, as illustrated in Table 3.

Table 3: Subgroups of the respective HEIs in Turkey

                              Programs         Capacity           Placed
No   Program Titles           n       %        n        %         n        %
1    Vocational Programs      249     59.8     25905    42.21     21378    39.2
2    Engineering Programs     158     37.8     32988    53.74     30664    56.24
3    Education Programs       10      2.4      2495     4.06      2495     4.57
     Total                    417     100      61388    100       54537    100
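To make the aggregation explicit, the short sketch below (not part of the original
analysis) recomputes the subgroup totals in Table 3 from the per-programme figures
reported in Tables 1 and 2.

# Per-programme figures copied from Tables 1 and 2:
# (number of programmes inside Turkey, student capacity, students placed)
programmes = {
    "P1": (230, 24295, 20053), "P2": (7, 680, 622),    "P3": (2, 110, 110),
    "P4": (10, 820, 593),      "P5": (6, 2787, 2787),  "P6": (148, 29581, 27257),
    "P7": (1, 130, 130),       "P8": (3, 490, 490),    "P9": (10, 2495, 2495),
}
subgroups = {
    "Vocational Programs":  ["P1", "P2", "P3", "P4"],
    "Engineering Programs": ["P5", "P6", "P7", "P8"],
    "Education Programs":   ["P9"],
}
for name, members in subgroups.items():
    n_prog = sum(programmes[p][0] for p in members)
    capacity = sum(programmes[p][1] for p in members)
    placed = sum(programmes[p][2] for p in members)
    print(name, n_prog, capacity, placed)
# Output matches Table 3, e.g. Vocational Programs: 249 programmes,
# capacity 25905, placed 21378.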

The number of teaching staff in associate (2-year) and undergraduate (4-year) programs,
according to their working status, is also illustrated in Table 4. The working status of
the majority of teaching staff was research assistant, assistant professor, instructor and
professor.

Table 4: Working Status of teaching staff in the respective HEIs in Turkey

Programs               Gender   Prof.   Assoc.   Assist.   Instr.   Spec.   Res.      Reader   Total    Total
                                        Prof.    Prof.                      Assist.            (n)      (%)
Vocational Programs    F        0       0        0         25       0       0         1        26       1.54
                       M        2       3        22        223      8       2         2        262      15.56
                       T        2       3        22        248      8       2         3        288      17.10
Engineering Programs   F        29      12       70        12       4       106       7        240      14.25
                       M        206     85       266       59       0       328       15       959      56.95
                       T        235     97       336       71       4       434       22       1199     71.20
Education Programs     F        1       1        8         0        0       6         0        16       0.95
                       M        12      12       70        32       0       53        2        181      10.75
                       T        13      13       78        32       0       59        2        197      11.70
Total                  F        30      13       78        37       4       112       8        282      16.75
                       M        220     100      358       314      8       383       19       1402     83.25
                       T        250     113      436       351      12      495       27       1684     100.00

F: Female; M: Male; T: Total; Prof.: Professor; Assoc. Prof.: Associate Professor;
Assist. Prof.: Assistant Professor; Instr.: Instructor; Spec.: Specialist; Res. Assist.: Research Assistant

Table 4 shows that the number of teaching staff in the respective HEIs was 1684. It
shows that the majority of teaching staff were male (83.25%) and the minority were
female (16.75%). Table 4 also displays that 17.10% of teaching staff were working in
vocational programmes, 71.20% in engineering programmes and 11.70% in education
programmes.

3.4.1.3 Determining the Sampling and Sample Size


There can be many practical reasons for selecting samples of a population. Lind et al.
(2010) noted some of the reasons for sampling: (i) contacting the whole population
would be time consuming; (ii) the cost of studying all the items in a population might be
prohibitive; (iii) checking all items in the population may be physically impossible; (iv)
some tests are destructive; and (v) the sample results are adequate. These reasons imply
that sampling is more feasible than studying all the items in the population in order to
measure or observe something. As the information gathered from the sample is used to
generalize findings to a population within the limits of random error, determining the
sample size is vital for this research. The determination of the sample size influences the
quality and accuracy of the research, since the size of the sample may be inappropriate,
inadequate or excessive (Bartlett, et al., 2001). Several factors play an important role in
accurately estimating the sample size, such as the margin of error, the type of data, the
confidence interval, etc. These factors are usually considered when the size of the
population (here, the teaching staff) is more than 200. As the number of teaching staff
in HEIs in Turkey associated with the field of electricity was 1684 in 2010, as illustrated
in Table 4, there was a need to deepen the understanding of sampling methods and to
apply them accordingly.

3.4.1.4 Sampling Method for the Research


There are many types of sampling, such as simple random sampling, systematic random
sampling, stratified random sampling and cluster sampling. Simple random sampling is
the most widely used method, in which each item or person in the population has the
same chance of being included. One way of ensuring that every member of teaching
staff in the population has the same chance of being chosen is first to write the name of
each member on a small slip of paper and deposit all of the slips in a box (Lind, et al.,
2010). After they have been thoroughly mixed, the first selection is made by drawing a
slip out of the box without looking at it, and this process is repeated until the sample
size is reached.
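The sketch below is the software equivalent of drawing slips from a box: it draws a
simple random sample without replacement, so that every member has the same chance
of being chosen. The roster of names is a hypothetical placeholder.

import random

teaching_staff = [f"staff-{i}" for i in range(1, 1685)]   # hypothetical roster of the 1684 teaching staff
sample = random.sample(teaching_staff, k=314)             # draw 314 names without replacement
print(len(sample))                                        # 314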


Many researchers commonly increase the sample size by some percentage to compensate
for persons whom the researcher is unable to contact (Israel, 1992b). However, given
the relatively short history of e-learning in Turkey, it is not clear what percentage should
be added to compensate for non-response. Inflating the sample size in order to obtain
the desired level of confidence and precision may be a solution to compensate for
non-response. However, once the sample size is inflated, the chance of being chosen is
no longer the same for every member of teaching staff. Instead of inflating the sample
size, since I had access to the contact details of most, if not all, teaching staff in the
target population (i.e. academics teaching various topics related to electricity), all
teaching staff could be invited to participate in our studies by sending an invitation to
their email addresses. With this process, every member of teaching staff in the
population has the same chance of being chosen. However, the characteristics of
teaching staff in the HEIs associated with the subject of electricity in Turkey are not the
same. As illustrated in Table 4, some of the teaching staff were working in vocational
programs, some in engineering programs and some in education programs. For example,
the majority of teaching staff in vocational programs had no foreign-language
proficiency, no national or international research in their fields and no master's or PhD
degree, yet they were required to teach the entire curriculum at higher education level
(Akaslan, et al., 2011). On the other hand, the majority of teaching staff in engineering
and education programs had foreign-language proficiency, research in their fields and
master's or PhD degrees. Hence, the population could be clearly divided into three
groups based on these characteristics, namely teaching staff in (i) vocational, (ii)
engineering and (iii) education programs. The population was therefore divided into
subgroups (called strata) and a sample was randomly selected from each stratum. To
achieve this, first, the sample size required for a desired level of confidence and
precision should be specified and, second, the percentage of each subgroup should be
determined according to the proportion of teaching staff in each type of programme, as
illustrated in Table 4.
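A minimal sketch of this proportional stratified selection is given below; the rosters are
hypothetical placeholders sized according to Table 4, and the overall sample size of 314
comes from the calculation in the next subsection.

import random

strata = {                                                # hypothetical rosters, sized as in Table 4
    "vocational":  [f"voc-{i}" for i in range(288)],
    "engineering": [f"eng-{i}" for i in range(1199)],
    "education":   [f"edu-{i}" for i in range(197)],
}
population_size = sum(len(roster) for roster in strata.values())   # 1684
sample_size = 314                                                  # required teacher sample size

stratified_sample = []
for name, roster in strata.items():
    share = round(sample_size * len(roster) / population_size)     # proportional allocation per stratum
    stratified_sample.extend(random.sample(roster, share))

print(len(stratified_sample))   # 315 after rounding (54 + 224 + 37), i.e. at least 314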


3.4.1.5 Sample Size for the Research


When designing a statistical study, determining the sample size is a major concern for
many researchers (Israel, 1992a; Lind, et al., 2010). A number of factors influence the
quality and accuracy of the sample size, such as the purpose of the study, the population
size and the confidence level. Three criteria are usually taken into account, in addition
to the purpose of the study and the population size, to determine the appropriate sample
size, namely (i) the level of precision (also called the margin of error or sampling error),
(ii) the level of confidence (also called risk) and (iii) the degree of variability. The first
factor, the allowable error, is designated as c in the formula; it is the amount that is
added to and subtracted from the sample mean to determine the end points of the
confidence interval (Lind, et al., 2010). Given the relatively short history of e-learning
in Turkey, a 5% margin of error is acceptable for the study; it is represented as 0.05 in
the formula.

The second factor is the confidence level or risk. It is based on the ideas examined under
the Central Limit Theorem (Israel, 1992a) and is shown as Z in the formula. The third
factor is the degree of variability, which refers to the distribution of attributes in the
population and is shown as p in the formula. A proportion of 0.5 is often used to
determine a more conservative sample size, as it indicates the maximum variability in a
population. In addition, some methods such as using a comparable study, a range-based
approach and conducting a pilot study are also recommended by Lind et al. (2010). To
calculate the sample size, Cochran (1963) recommended the following equation for the
infinite population:
n_0 = \frac{Z^2 \, p(1-p)}{c^2} = \frac{(1.96)^2 (0.5)(0.5)}{(0.05)^2} = 384.16

Z: Confidence Level; p: Degree of Variability; c: Allowable Error;
n_0: Sample Size for the Infinite Population

Equation 1: Start Equation for the Sample Size



However, for a finite population, the sample size can be reduced slightly because a given sample size provides proportionately more information for a small population (Israel, 1992a). Since the number of teaching staff and students in the respective HEIs was 1684 and 54537 respectively, the sample size can be adjusted using the finite population correction, where N is the population size.

n = \frac{n_0}{1 + \frac{n_0 - 1}{N}}

Teachers (n_0 = 384.16, N = 1684): n = 313 or 314 (Teacher Sample Size)
Students (n_0 = 664.01, N = 54537): n = 386 or 387 (Student Sample Size)

n_0: Sample Size for the Infinite Population; n: Sample Size for the Finite Population;
N: The size of the population

Equation 2: End Equation for the Teacher and Student Sample Size
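To make the arithmetic above easy to check, the following is a minimal Python sketch of Equations 1 and 2, using the figures reported for the teacher population (Z = 1.96 for the 95% confidence level, p = 0.5, c = 0.05 and N = 1684); the function names are illustrative and the sketch is not part of the original analysis.

```python
# Cochran's formula (Equation 1) followed by the finite population
# correction (Equation 2), reproduced here as a quick check.

def cochran_n0(z: float = 1.96, p: float = 0.5, c: float = 0.05) -> float:
    """Sample size for an infinite population."""
    return (z ** 2) * p * (1 - p) / (c ** 2)


def finite_correction(n0: float, population_size: int) -> float:
    """Adjust n0 for a finite population of the given size."""
    return n0 / (1 + (n0 - 1) / population_size)


n0 = cochran_n0()                        # 384.16
teacher_n = finite_correction(n0, 1684)  # roughly 313, taken as at least 314
print(f"n0 = {n0:.2f}, teacher sample size = {teacher_n:.1f}")
```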

As a result, at least 314 teaching staff and 387 students from the population were needed for the study. However, since the population was divided into subgroups based on the characteristics of teaching staff in the respective HEIs, 17.10% (53 or 54 persons) of the teacher sample should be selected from vocational programmes, 71.20% (224 or 225 persons) from engineering programmes and 11.70% (36 or 37 persons) from educational programmes to ensure that the sample could well represent the teacher population. With regard to students, 39.20% (151 or 152 persons) of the sample should be selected from vocational programmes, 56.24% (217 or 218 persons) from engineering programmes and 4.57% (29 or 30 persons) from educational programmes to ensure that the sample could well represent the student population.
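As an illustration only, the proportional allocation described above can be sketched as follows; the stratum percentages are those reported for the teacher population and the variable names are hypothetical.

```python
# Proportional allocation of the minimum teacher sample (314) across the
# three strata, using the percentages reported above.

TEACHER_SAMPLE = 314
strata_shares = {
    "vocational programmes": 0.1710,
    "engineering programmes": 0.7120,
    "educational programmes": 0.1170,
}

for stratum, share in strata_shares.items():
    allocated = share * TEACHER_SAMPLE  # e.g. 0.1710 * 314 = 53.7 persons
    print(f"{stratum}: {allocated:.1f}")
```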

3.4.1.5 Determining Scales


It is important to determine how variables are going to be measured because they are used to construct unobserved variables in SEM. The set of measured variables also forms a variance/covariance matrix that is used in SEM to test a theoretical model. Understanding the distinctions within variables is critical for identifying them. Distinctions within variables are also termed levels and options. It was noted by Maltby & Day (2002) that all variables should have a number of options. For example, variable 1 (whether a person owns a computer) has two options: "Yes, I own a computer" and "No, I do not own a computer", whereas variable 2 (whether a person is confident in the use of computers, expressed as a level of agreement) may have more than two levels (e.g. a five-point Likert scale): Strongly Agree, Agree, Neutral, Disagree and Strongly Disagree.

Given the number and types of levels (e.g. numeric or text), variables are broadly divided into two categories: quantitative and qualitative. Broadly, a quantitative variable is one in which the values are numeric, such as age and income, whereas a qualitative variable is one in which the values are strings, such as gender and nationality. In SPSS, there are three levels of measurement (Gray & Kinnear, 2012): (1) scale, measurement on an independent scale with units, such as height and weight; (2) ordinal, data in the form of ranks that are not measured on an independent scale with units, such as assigning the rank 1 to the heaviest and 10 to the lightest; and (3) nominal, merely labels or strings, such as gender or blood group.

However, different researchers categorize ordinal and discrete variables differently. Some researchers insist on perceiving those variables as categorical while others treat them as continuous (Maltby & Day, 2002). There is no strict agreement between authors about the position of rankings; it is described as a grey area by authors such as Maltby & Day (2002) and Gray & Kinnear (2012). This does not, however, preclude researchers from using rankings.

A 5-point Likert scale is commonly used to deal with public opinion. When participants are asked about their attitudes, survey researchers assume that responses reflect individuals' beliefs about the object, which can be based on a pre-existing judgement or on a newly formulated one (Krosnick, 2002).


First, a pre-existing judgement is made by participants when their responses reflect information or opinions that they have previously stored in memory. Second, a newly formulated judgement is formed by participants when the question itself prompts them to draw on relevant beliefs or attitudes. However, no-opinion options are routinely included in questions when participants are asked about an object of which they have no knowledge and no opinion. Some survey researchers, such as Bogart (1972), Converse and Presser (1986), Payne (1950) and Vaillancourt (1973), recommended that no-opinion options should be included in questions because, without them, participants may not wish to appear foolishly uninformed and may therefore fabricate or give arbitrary answers (Krosnick, 2002). However, the no-opinion option in a question is considered to be a different class of option in the 5-point Likert scale. It is used when a participant has no opinion or lacks enough information to form one. Hence, it is treated as missing data.

3.4.1.6 Dealing with Missing Data


Missing data is defined as the absence or unavailability of data on one or more measured variables for one or more cases (Schumacker & Lomax, 1996). The impact of missing data on quantitative educational research is a great concern not only to methodologists but also to educational researchers (Peng, et al., 2007). Since the objective of this study is to make valid inferences regarding the e-learning readiness of HEIs associated with the subject of electricity in Turkey, missing data can make the sample different from the population from which it was drawn, creating a biased sample. Therefore, it is suggested that missing data be dealt with in a way that reflects the population of inference (Wayman, 2003). There are a number of methods for dealing with missing data. These methods are broadly categorized into two groups, namely ad hoc methods (e.g. pairwise and listwise deletion) and principled statistical methods (e.g. single or multiple imputation). The use of ad hoc methods is common among educational researchers. However, it was noted in (Peng, et al., 2007) that organizations such as the American Psychological Association Task Force on Statistical Inference warned against the use of these methods for handling missing data due to their failure to take into account the mechanism that caused the missing data.


In the listwise deletion method, an entire record is excluded from analysis if any single value is missing. The listwise deletion method could be used in all analyses in this dissertation, but it would lead to the removal of a large amount of data whenever one or more values were missing in participants' responses, either because they selected the no-opinion response or because they did not complete the questionnaire fully. The removal of such an amount of data may have devastating effects on inferences, since non-respondents might have different response profiles compared to those who responded completely. In addition, since listwise deletion eliminates cases with missing values, it may bias results if the remaining cases are not representative of the population. Furthermore, it is also possible to code all missing data as zero in all analyses.

However, the conclusions would then be biased because the no-opinion response is a different class of option in the 5-point Likert scale. Peng et al. (2007) also warn that the use of listwise or pairwise deletion may cause a loss of information and statistical power, since it reduces the error degrees of freedom (df) in statistical tests and increases standard errors. Therefore, methodologists prefer principled statistical methods such as maximum likelihood and multiple imputation, as they perform better than ad hoc methods (Wayman, 2003). It is also important to understand the pattern of missing values before using one of the principled statistical methods such as multiple imputation. As a result, the multiple imputation technique was used in this study and the pooled results obtained from this technique have been used to analyse the readiness of participating teachers and students for e-learning.
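The analyses in this thesis were carried out in SPSS. Purely as an illustration of the idea, the sketch below shows how a comparable multiple-imputation-style workflow might look in Python using scikit-learn's IterativeImputer, assuming a data frame of 1-5 Likert responses in which the no-opinion option has been recorded as missing; the item names, data and number of imputations are hypothetical.

```python
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# Hypothetical Likert responses (1-5); NaN marks the no-opinion option.
responses = pd.DataFrame({
    "MAT1": [4, 5, np.nan, 3, 4],
    "MAT2": [4, np.nan, 4, 3, 5],
    "MAT3": [5, 4, 4, np.nan, 4],
})

# Draw several imputed data sets by varying the random seed, then pool the
# item means across imputations (a simplified stand-in for pooled results).
imputed_means = []
for seed in range(5):
    imputer = IterativeImputer(random_state=seed, sample_posterior=True)
    completed = pd.DataFrame(imputer.fit_transform(responses),
                             columns=responses.columns)
    imputed_means.append(completed.mean())

pooled_means = pd.concat(imputed_means, axis=1).mean(axis=1)
print(pooled_means)
```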

3.4.1.7 Determining Variables


According to Maltby & Day (2002), there are two important skills that a researcher should have: first, the ability to identify variables related to their discipline and, second, the ability to find out how variables are related to other variables. Both of these are central to any research. The variables used in this study are listed in Appendix 1 for the teacher questionnaire and in Appendix 2 for the student questionnaire.


3.4.1.8 Assessment Methods

The majority of the items in the questionnaire were evaluated with a five-point Likert scale with the leftmost and rightmost anchors being Strongly Disagree and Strongly Agree respectively. However, given the relatively short history of e-learning in Turkey and suggestions from the literature review, a new option, "Not applicable / No opinion / Don't know", was also included (see Chapter 5.3). These alternatives were ordered in such a way that responses could easily be coded on a five-point Likert-type scale where 1 indicates the lowest readiness value and 5 the highest. As the alternatives were coded as 1, 2, 3, 4 and 5 on a five-point Likert-type scale, Aydin and Tasci (2005) suggested that a mean score of 3.40 could be identified as the expected level of readiness, with scores above and below it indicating higher and lower levels of readiness for e-learning. They determined 3.40 as the expected level of readiness because the five-point scale comprises 4 intervals across 5 categories, and the ratio of 4 intervals to 5 categories is 0.8, as illustrated in Figure 4. The assessment model developed by Aydin and Tasci is used throughout the thesis as the initial assessment to indicate whether the institutions are adequately ready for e-learning. For only two items, two binary descriptors were presented, Yes and No, since those items were designed to find out about the participants' access to the Internet at home and at university.

[Figure: the five-point scale divided into four equal intervals of 0.8, with boundaries at 1.8 (Strongly Disagree/Disagree), 2.6 (Disagree/Neutral), 3.4 (Neutral/Agree) and 4.2 (Agree/Strongly Agree); mean scores up to 3.4 fall in the "Not Ready for E-learning" region and scores above 3.4 in the "Ready for E-learning" region.]

Figure 4: An Assessment Model for Measuring Readiness for E-learning
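A minimal sketch, for illustration only, of how responses could be coded and a mean score interpreted against the bands in Figure 4 (interval width 4/5 = 0.8, readiness threshold 3.40); the response labels and sample data are hypothetical.

```python
# Code 5-point Likert responses, treat the no-opinion option as missing,
# and interpret a mean score against the 0.8-wide bands of Figure 4.
CODES = {"Strongly Disagree": 1, "Disagree": 2, "Neutral": 3,
         "Agree": 4, "Strongly Agree": 5}  # anything else -> missing


def mean_score(responses):
    coded = [CODES[r] for r in responses if r in CODES]
    return sum(coded) / len(coded)


def interpret(mean):
    # Band boundaries fall at 1.8, 2.6, 3.4 and 4.2; 3.40 is the threshold.
    return "ready for e-learning" if mean > 3.40 else "not ready for e-learning"


sample = ["Agree", "Strongly Agree", "No opinion", "Neutral", "Agree"]
m = mean_score(sample)  # (4 + 5 + 3 + 4) / 4 = 4.0
print(f"mean = {m:.2f} -> {interpret(m)}")
```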


3.4.2 Semi-structured Interview


Interviews with students and teachers were semi-structured and based on the four components illustrated in Table 5 to elicit information for each component. Participating teachers and students were asked about their experience, intentions and knowledge for each component. The duration of the interviews ranged from 38 minutes to 120 minutes. Interviews were audio-taped and transcribed.
Table 5: Qualitative research questions included in the Interview Schedule

Structure     Questions
Issues        What are issues or inadequacies in your department?
E-learning    What is the meaning of e-learning for you?
Solutions     How can e-learning solve or help to solve issues in your department?
Strategies    How should e-learning be implemented in your department or program?

The analysis of qualitative data is viewed as an ongoing process that is best started as soon as data collection begins (McQueen & Knussen, 2013). Thematic analysis is a widely-used method for qualitative analysis across the social, behavioural and more applied sciences such as clinical, health, education and psychology (Howitt & Cramer, 2014). Braun and Clarke (2006) describe thematic analysis as a qualitative analytic method for identifying, analysing and reporting patterns (themes) within data. In addition, thematic analysis is also viewed as a tool used across different methods for analysing qualitative data rather than as a specific method.

For instance, Ryan and Bernard (2000) perceive thematic coding as an ongoing process performed within major methods such as grounded theory rather than as a specific method in its own right. However, I consider thematic analysis to be a method in its own right and apply it, following the procedures described by Braun and Clarke (2006), to the data obtained from the interviews in this research study. Determining what counts as a theme within the data set is central to thematic analysis. Braun and Clarke (2006) point out that a theme captures something important about the data in relation to the research question and represents some level of patterned response or meaning within the data set. The prevalence of something important about the data plays an important role in capturing a theme or pattern.

However, Braun and Clarke (2006) highlight that more instances of a specific theme across the data set or a data item do not necessarily mean that the theme itself is more crucial. In addition, capturing something important in relation to the overall research question may also determine the keyness of a theme within the data set. Therefore, researcher judgement is necessary to determine what a theme is within the data set. The interviews were recorded and later transcribed verbatim. Initial data analysis was undertaken using Microsoft Excel 2010 to support the classification of the text into particular themes. For each interview, the transcript was first read and descriptions were applied to the relevant sections of the text.

These descriptions were then interpreted to suggest possible meanings and finally grouped into themes. To guide the thematic analysis, pre-existing overall themes based on the interview schedule were created: issues (with sub-themes of issues in theory and practice), e-learning (with sub-themes of experience and intentions), solution (with sub-themes of benefits and drawbacks of e-learning) and strategies (with sub-themes of strategies in theory and practice). Each interview transcript was read through a number of times, and potential emergent codes were noted. Transcripts were then coded systematically, initial codes were sorted into potential themes and sub-themes, and themes were then reworked (collapsed, deleted and refined) to ensure that each theme had sufficient supporting data and that the data cohered meaningfully.

3.5 Ethical Issues


Just as it is important to strike a balance between spending and saving in retirement, it is also important for researchers to strike a balance between the demands placed on them in the search for truth and their subjects' rights and values, which may be threatened by the research (Cohen, et al., 2007). This is known as the costs/benefits ratio and is reflected as a major critical dilemma in the growth of the relevant literature. For example, a researcher may publish a description of an individual that is so rich in detail that their identity becomes obvious to others who know them (Lapan, et al., 2012). Cohen et al. (2007) specifically warn researchers that ethical problems can multiply unexpectedly when the research moves from the general to the particular and from the abstract to the concrete.


Ethical problems can occur at any stage of the research. For instance, the methods selected for data collection, the context of the research, the nature of the participants, the type of data collected and the publication of data can all raise ethical issues. Therefore, it is important to present a conspectus of the main issues that may confront me, such as informed consent and sponsored research. Qualitative researchers in particular may face complex ethical issues because they interact with participating individuals and communities (Lapan, et al., 2012). Much social research necessitates obtaining consent (Cohen, et al., 2007). Moreover, Nachmias and Nachmias (1992) suggest that obtaining consent is particularly important if participants are likely to be exposed to any stress, pain or invasion of privacy.


PART II:
DEVELOPMENT
Developing an E-learning Model in Electrical
Engineering: Empirical Studies in Turkey


IN THIS PART

This part covers many significant features that play a crucial role in developing a model for e-learning. Chapter 4 presents a detailed look at various factors that might affect the readiness of organizations and individuals and then depicts a framework for measuring the readiness of different stakeholders in HEIs. Chapters 5 and 7 delineate the measurement of the readiness of teachers and students for e-learning. Chapters 6 and 8 report the analyses of the issues for applying e-learning from the student and teacher perspectives. Part II is concluded with Chapter 9, where a model for delivering e-learning, developed by comparing the teacher and the student perspectives in Turkey, is described.

Chapter 04: Factors affecting Readiness for E-learning
Chapter 05: Teachers' Readiness for E-learning
Chapter 06: Analysing Issues from the Teacher Perspective
Chapter 07: Students' Readiness for E-learning
Chapter 08: Analysing Issues from the Student Perspective
Chapter 09: Comparing the Teacher and Student Perspectives in Turkey for developing a model for e-learning

CHAPTER 4: FACTORS AFFECTING READINESS FOR E-LEARNING


4.1 Introduction

Several models have already been designed to assess individuals' and organizations' readiness for e-learning. In the early years, these were mainly developed for commercial organizations rather than HEIs by e-learning researchers such as Chapnick (2000), Anderson (2002) and Bean (2003). The most frequently cited model was prepared by Chapnick (2001) for a commercial purpose. Chapnick (ibid) identified a list of 66 main factors that influenced individuals' readiness, classified as psychological, sociological, environmental, financial, human resource, equipment and content readiness. In addition, a model similar to Chapnick's was developed by Haney (2002), suggesting 70 factors under seven different categories. These and the other common models guide commercial rather than educational organizations in justifying whether they are ready for e-learning. However, Kaur and Abas (2004) criticized those models for not fully fitting the higher education sector and developed another model applicable to HEIs by considering eight dimensions: learner, management, personnel, content, technical, environmental, cultural, and financial readiness.

In spite of the fact that there are many models to assess individuals' or organizations' readiness for e-learning, every system, be it a commercial organization or an academic institution, should have its own way of measuring readiness for e-learning or any innovation (Rogers, 2003). It is also important to emphasize here that a standard model for measuring e-learning readiness may not work across countries (Aydin & Tasci, 2005). Hence, it is necessary and important to develop a model with factors influencing the e-learning readiness of HEIs, especially those associated with the subject of electricity in Turkey. It is also important to add that existing models mainly and similarly consider the views, needs and experiences of different stakeholders, such as policymakers, administrators, academics and learners, to measure e-learning readiness. Therefore, our theoretical framework was designed in a similar way and took those factors into consideration.


4.2 How to Measure Readiness for E-learning

E-learning readiness is defined not only in terms of attributes pertaining to an organization but also in terms of those pertaining to individuals. Hence, there is a need to generate a model for assessing individuals' readiness for e-learning in HEIs. The concepts "readiness for e-learning" and "readiness" should be defined conceptually and operationally; for example, the meaning of the concept "readiness" should be clarified. The eight-dimension model of Kaur and Abas (2004), as described earlier, is generic and seems to be applicable to any type of HEI. Moreover, existing models of e-learning readiness commonly consider the views, needs and experiences of individuals in organizations. Hence, I adopted the Kaur and Abas model because I aimed to measure e-learning readiness in Turkey with a focus on the subject of electricity. Specifically, the factors that I intended to measure were identified after detailed analyses of the existing e-learning readiness models combined with the cultural and environmental characteristics of the institutions associated with the subject of electricity, such as electrical engineering and rail systems, in Turkey.

Integrating these concepts resulted in the model presented in Figure 5: Technology, Experience and Confidence with ICT, Attitude towards E-learning and Others, Traditional Skills, Institution, Content, Acceptance and Training for E-learning. As the experiences, confidence, attitudes and skills of people are closely related, I explain those factors together. Basically, the model was designed with the following objectives: to investigate the extent to which individuals and the respective HEIs are ready for e-learning; to examine the perceptions of different stakeholders in the respective HEIs to find out whether they believe that e-learning would be free of effort and would enhance their respective work or study; and to discover whether individuals need training for e-learning before embarking on it.


[Figure: the ten factors ("phases") of the concept readiness for e-learning: Phase 01 Technology; Phase 02 Experience with ICT; Phase 03 Confidence with ICT; Phase 04 Attitude towards E-learning; Phase 05 Attitude towards Others; Phase 06 Traditional Skills; Phase 07 Institution; Phase 08 Content; Phase 09 Acceptance; Phase 10 Training for E-learning. The phases describing individuals are grouped under the label "People".]

Figure 5: Ten factors of the concept readiness for e-learning

In addition, each of these factors in the model subsumes a set of sub-factors. These sub-factors are also assumed to support the institutions, especially those associated with the subject of electricity in Turkey, and thus can be used to indicate their readiness for e-learning. Each factor and its sub-factors should be taken into consideration as much as possible during the assessment process. For instance, the stability of the internet connection, which is a sub-factor of technology, is essential for e-learning readiness; the lack of such a factor, among others, may result in failure.


4.2.1 Technology
Technology is the fundamental factor because e-learning, apart from other critical elements, is essentially based on computers and the Internet. Rogers (2003) described the readiness of technology in terms of two components, namely hardware and software. Hardware refers to the physical components whereas software is the information aspect of technology (Aydin & Tasci, 2005). The availability of both had to be investigated for my study because I aimed at implementing e-learning using an open-source web-based virtual learning environment (VLE). This required access to the Internet with a PC or laptop as hardware and a web browser such as Internet Explorer or Firefox as software. This motivated me to investigate access to the Internet at home and at university, because e-learning transcends temporal and location constraints. Furthermore, it was also important to find out the ease and flexibility of such access. For this purpose, I was interested in finding out how individuals connected to the Internet at home (for example, broadband or dial-up) and at university (for example, wired or wireless). This was investigated under the sub-factor of technology labelled stability. Easy and flexible access to the Internet is also associated with downloading and uploading speeds. While it may not be possible to find out these speeds by posing the related questions to the target group, it was possible as well as relevant to investigate to what extent they were satisfied with their internet connection at university and at home.

4.2.2 People

The experiences and confidence of people with ICT and their attitudes towards e-learning are another significant component of measuring readiness, as e-learning is implemented by people. These factors deal with the characteristics of individuals in HEIs. It is obvious that the more skilled the people working at an institution are, the more likely it is to have a successful e-learning implementation. It is hence deemed relevant to find out about individuals' self-reported competence, experience, confidence and anticipation in deploying various ICT for different purposes. The relevant skills, experiences, confidence levels and attitudes towards e-learning of the people concerned, namely researchers, lecturers, administrators and strategists, may have an effect on the integration of e-learning.

The readiness of individuals in those institutions is analysed by considering their own experiences and confidence in the use of various ICT, their attitudes towards e-learning and their traditional skills for e-learning.

For the sub-factor Experience with ICT, users' adoption of an innovation is highly associated with their usage of other functionally similar technologies (Park et al., 2009). Besides, system usage is significantly affected by previous experience with other systems (McFarland & Hamilton, 2006). As internet usage is affected by computer usage (Lin, 1998), both are significant factors that affect e-learning adoption. For the sub-factor Confidence with ICT, existing work on e-learning readiness, such as Asaari and Karia (2005), Aydin and Tasci (2005), Lopes (2007) and So (2005), tended to investigate the skills and confidence of individuals in particular usages of ICT. For instance, a person who searches for information about something for ten hours may not be more skilful than a person who searches for the same thing for only three hours if the latter is aware of a keyword system. For this reason, individuals' confidence in any particular ICT usage should be used to determine the level of readiness for e-learning, because there is generally a linear relationship between internet/software skills and confidence regarding e-learning (Agboola, 2006). For the sub-factor Attitude towards E-learning, the pessimistic or optimistic opinions or beliefs of individuals about e-learning are considered relevant. The actions that individuals take are assumed to be greatly influenced by their expectations regarding the likely consequences of those actions (Scheier & Carver, 1993). Scheier and Carver (1993) also emphasized that individuals who held optimistic beliefs about something continued to work towards the desired outcome even when their progress was slow, and they strived for it. This assumption motivated me to find out whether positive attitudes towards e-learning could be a significant factor influencing readiness for e-learning.


For the sub-factor Traditional Skills, individuals' experiences and confidence in the use of various ICT and their attitudes towards e-learning are critical success factors for e-learning. However, the factor people should be further refined by adding a new sub-factor to investigate individuals' traditional skills such as self-motivation, self-responsibility and time management. It is important to note here that the measurement of these traditional skills may not be a key factor for teachers. However, Clarke (2008) argued that e-students needed a foundation of traditional skills on which to build their e-learning skills in order to succeed in e-learning. Moreover, Dabbagh (2007) and Dray et al. (2011) also believed that the characteristics of students that made them successful in traditional learning could contribute to their success in e-learning. Chapnick (2000) also pointed out that the individual's state of mind was an important factor influencing the outcome of an e-learning initiative.

4.2.3 Institution
An institution is an environment which can be instantiated as a university with its faculties and departments (Clarke, 2008). It should support e-learning by offering a good infrastructure, a supportive culture, incentives, models and resources. By investigating the current strategy and curriculum of institutions as well as their facilities and personnel, it is relatively easy to judge their appropriateness for e-learning.

4.2.4 Content
Content is associated with the availability of existing content, its format, and its levels of interactivity, reusability and interoperability (Lopes, 2007). However, it is almost impossible for me to instantiate all these aspects because the curriculum currently applied in the HEIs associated with the subject of electricity is massive. Hence, I addressed the appropriateness of e-learning for enhancing the quality of learning and teaching electricity at a broad theoretical and practical level rather than at a fine-grained level.


4.2.5 Acceptance

This factor aimed to understand the degree to which a teacher believes that e-learning would be free of effort and would enhance his or her teaching. As there has been a high rate of failure of ICT initiatives intended to create development opportunities, a solid understanding of the determinants of user acceptance of particular ICT is crucial not only for theory building but also for effective practice (Park et al., 2009). A number of studies have aimed to understand the process of user acceptance of new initiatives. The often-cited related work is the Technology Acceptance Model (TAM), which was introduced by Davis (1989) to measure users' perceptions of new ICT in terms of two constructs: perceived usefulness and perceived ease of use. TAM is still valuable for understanding the determinants of individuals' adoption and use of ICT, while Venkatesh and Bala (2008) identified further relevant factors and thus augmented the model into TAM3. Nonetheless, in this study, I adhered to the original TAM, highlighting the significant role of perceived usefulness and perceived ease of use in determining acceptance of e-learning. I elaborate the two constructs subsequently.
For the sub-factor Perceived Usefulness, perceived usefulness is defined as the extent to which a user believes that using a system can support the attainment of his or her specific goal or need. According to Davis (1989), the tendency of individuals to adopt or not adopt an innovation depends on their belief about whether it will help them perform their work better. He developed fourteen items to measure the perceived usefulness of a system and found that there was a significantly positive relationship between the usage of a system and the users' perceived usefulness of it. The majority of these fourteen items were directly generated to measure the extent to which a system can enhance the performance of users. The rationale for using more than one item to measure perceived usefulness in terms of user performance was that Davis wanted to reduce the extraneous effects of individual items, as different individuals may assign different meanings to particular items. Generally speaking, the multi-item approach is intended to ensure the reliability of a questionnaire. Nonetheless, in the more recent TAM3 (Venkatesh & Bala, 2008), four instead of fourteen items on perceived usefulness were used without compromising reliability. Hence, I adopted the parsimonious approach for my study, given that a long questionnaire would demotivate participants from completing it.

For the sub-factor Perceived Ease of Use, perceived ease of use is defined as the extent to which the user believes that using a particular ICT, e-learning in our case, would be free of effort. Davis (1989) notes that individuals may believe in the usefulness of a given innovation but may find it difficult to use; the potential benefits of the application are then outweighed by the effort of using it. Similarly, instead of using Davis's fourteen items, I adopted as well as adapted the TAM3 approach and used two items to evaluate this construct.

4.2.6 Training

In addition to understanding how people in the institutions tend to accept or reject e-learning, it is also deemed relevant to evaluate whether the people in the institution need training for e-learning before embarking on it. Training for e-learning is important for e-learning readiness and should be considered in the process of implementing e-learning (Agboola, 2006). In sum, the final model is illustrated in Figure 6 in three stages: Readiness, Acceptance and Training.

[Figure: the factors organized in three steps. Step 1, Readiness for E-learning: Technology (hardware, software, stability), People (experience, confidence, attitude, traditional skills), Content (theory, practice) and Institution (university, faculty, department). Step 2, Acceptance for E-learning: perceived usefulness and perceived ease of use. Step 3, Training for E-learning: training teachers, training learners, training personnel and improving facilities.]

Figure 6: Factors for Measuring Students' Readiness for E-learning


The underlined factor Traditional Skills is only included in the model for students' e-learning readiness. This model is basically appropriate for measuring both students' and teachers' e-learning readiness, because the core factors and their subsuming attributes (or sub-factors) remain relevant. Presumably the model is generalizable for assessing e-learning readiness in developing countries such as Turkey, albeit some fine-tuning may be required for specific socioeconomic attributes.
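For reference, the structure shown in Figure 6 can be written down as a simple data structure; the sketch below follows the figure's naming and is illustrative only.

```python
# The three-step structure of Figure 6 written as a nested dictionary.
# "Traditional Skills" is used only when assessing students.
READINESS_MODEL = {
    "Step 1: Readiness for E-learning": {
        "Technology": ["Hardware", "Software", "Stability"],
        "People": ["Experience", "Confidence", "Attitude", "Traditional Skills"],
        "Content": ["Theory", "Practice"],
        "Institution": ["University", "Faculty", "Department"],
    },
    "Step 2: Acceptance for E-learning": ["Perceived Usefulness",
                                          "Perceived Ease of Use"],
    "Step 3: Training for E-learning": ["Training Teachers", "Training Learners",
                                        "Training Personnel", "Improving Facility"],
}

for step, factors in READINESS_MODEL.items():
    print(step, "->", factors)
```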


CHAPTER 5: MEASURING TEACHERS E-LEARNING READINESS


5.1 Introduction
Implementing e-learning in HEIs is influenced by various barriers and drivers. The majority of the barriers are related to the challenging issue of integrating e-learning into universities. Hence, it is deemed relevant to understand whether different stakeholders in HEIs tend to embrace or ostracize e-learning in their work. In addition, this can help us understand the traits and characteristics of teachers in the respective HEIs and shape our next stage. The objectives of this chapter are to investigate the extent to which teaching staff in the HEIs associated with the subject of electricity in Turkey were ready for e-learning and to compare the readiness of teachers in terms of various variables such as gender and age. It also examined two factors that presumably affected the perceptions of academic staff on e-learning under their readiness: the degree to which teachers believed that e-learning would be free of effort and would enhance their teaching, and whether teachers needed training on e-learning before embarking on it.

5.2 Methods
5.2.1 Procedure
Taking into account the sampling method and the sample size specified in Chapter 3, 417 programmes in Turkey were selected for the study in 2010. The participating institutions were identified according to whether they were associated with the subject of electricity, such as electrical and electronics engineering, based on the official data of the Higher Education Council in Turkey. The academic staff in those institutions, including administrators, strategists, researchers and lecturers, were chosen as participants who could provide data regarding their institutions' readiness for e-learning. I used the open-source LimeSurvey to implement the questionnaire in a web-based format. Teachers in HEIs offering the subject of electricity in Turkey were invited to participate in the survey by notifying the faculty or high-school secretaries and the heads of departments of the 417 programmes in the respective HEIs in Turkey.

I sent invitations via email on 16th March 2010 to all the faculty or high-school secretaries and the heads of departments of the 417 programmes. At the time of the survey, the number of teaching staff in the respective HEIs was 1684. Since the number of teaching staff in the respective HEIs was finite, the sample size with a 5% allowable error, a 95% confidence level and a 0.5 degree of variability was calculated as 316, as detailed in Chapter 3. To encourage the teaching staff in HEIs to participate and to reduce the non-response rate, a personal invitation and a reminder were also sent to the personal email addresses of the majority of the 1684 teachers in Turkey. When the sample size of 316 was reached, the survey was closed on 16th May 2010. By 16th May 2010, 342 individuals had responded to the survey; 289 of them fully completed the survey and 53 only partially. No incentive was offered to the participants, whose participation was entirely voluntary. Based on the sample size method, the responses of the first 54 persons from vocational programmes, 225 from engineering programmes and 37 persons from educational programmes were accepted and analysed in this research, as detailed in Chapter 3. However, 7.52% of the data was missing for two reasons: first, the partial answers of some participants and, second, the selection of the no-opinion option for some items.

5.2.2 Missing Data

Table 6 displays the overall summary of missing values for items in the background and readiness sections of the survey. Overall, 36 (73.47%) of the 49 variables in the questionnaire had at least one missing value. However, only 117 (37.03%) of the 316 cases contained at least one missing value, and 7.52% (i.e. 1165) of all values were missing. As a result, the multiple imputation technique (Peng et al., 2007) was used in this study and the pooled results obtained from this technique were used to analyse the readiness of the participating teachers for e-learning.
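As a hedged illustration of how the kind of counts summarized in Table 6 can be produced, the following Python sketch computes the number of incomplete variables, incomplete cases and the percentage of missing values from a hypothetical response data frame; the item names and data are illustrative only.

```python
import numpy as np
import pandas as pd

# Hypothetical survey data: rows are cases (respondents), columns are items.
data = pd.DataFrame({
    "TEC1": [5, 4, np.nan, 5],
    "TEC2": [4, np.nan, 3, 4],
    "EXP1": [5, 5, 4, 4],
})

incomplete_variables = int(data.isna().any(axis=0).sum())  # columns with >= 1 missing value
incomplete_cases = int(data.isna().any(axis=1).sum())      # rows with >= 1 missing value
missing_value_pct = 100 * data.isna().sum().sum() / data.size

print(incomplete_variables, incomplete_cases, round(missing_value_pct, 2))
```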


Table 6: Overall Summary of Missing Values

Items        Status        Variables n (%)    Cases n (%)     Values n (%)
Background   Complete      9 (100.0)          316 (100.0)     2212 (100.0)
             Incomplete    0 (0.00)           0 (0.00)        0 (0.00)
Readiness    Complete      6 (14.29)          199 (62.97)     12107 (91.22)
             Incomplete    36 (85.71)         117 (37.03)     1165 (8.78)
Total        Complete      13 (26.53)         199 (62.97)     14319 (92.48)
             Incomplete    36 (73.47)         117 (37.03)     1165 (7.52)

5.2.3 Research Group


The study revealed that the majority of the participants were male (85.4%) and the remainder female (14.6%). The age groups of the participants were categorised as follows: 4.4% under 24, 36.4% between 25 and 34, 37% between 35 and 44, 15.5% between 45 and 54 and 6.6% over 55 years old. This indicates that almost 90% of the participants were between 25 and 55 years old at the time of the survey. Another criterion used to categorise the participants was the type of affiliation: 88.9% of the participants were working in public universities across Turkey and the rest in private ones. Besides, the participants were classified according to their main roles: 38.3% researchers, 37.7% teachers, 4.7% administrators and 19.3% strategists. Furthermore, 17.1% of the participants were working in institutions offering two-year associate degree programmes while 82.9% were working in institutions offering four-year Bachelor's degree programmes.

5.2.4 Items
The teacher questionnaire (see Appendix I) contained 27 questions in total. There were altogether 64 items in the questionnaire, which gauged participants' self-reported perceptions of different aspects of e-learning and collected information about their background such as university, age and gender. Specifically, the nine factors depicted in Figure 6, except the factor traditional skills, correspond to the nine parts of the questionnaire. A list of the close-ended items analysed in this chapter is shown in Table 7.


Table 7: List of Items of the Teacher E-readiness Survey

Factor                          Item   Content
Technology                      I01    I have access to the Internet at home.
                                I02    I am satisfied with the stability of the home internet network.
                                I03    I have access to the Internet at university.
                                I04    I am satisfied with the stability of the university internet network.
Experience with ICT             I05    I use the internet as an information source.
                                I06    I use e-mail as the main communication tool.
                                I07    I use office software for content delivery and demonstration.
                                I08    I use social network sites (e.g. Facebook & Orkut).
                                I09    I use electrical software (e.g. AutoCAD & MATLAB).
                                I10    I use instant messaging (e.g. MSN, Yahoo).
Confidence with ICT             I11    I use computers confidently.
                                I12    I use web browsers (e.g. Internet Explorer) confidently.
                                I13    I use search engines (e.g. Google & MSN Search) confidently.
                                I14    I use digital file management tools confidently.
                                I15    I use authoring tools to create learning materials confidently.
Attitudes towards E-learning    I16    I have information about what e-learning is.
                                I17    I have enough ICT competency to prepare e-learning materials.
                                I18    I feel that I am ready to integrate e-learning in my teaching.
                                I19    I have enough time to prepare e-learning materials.
                                I20    I believe my students will like e-learning.
                                I21    I support the integration of e-learning in my own department.
Attitudes towards Others        I22    My top-level administration understands what e-learning is.
                                I23    My top-level administration supports the use of e-learning.
Institution                     I24    It is applied in my university.
                                I25    It is applied in my faculty / high school.
                                I26    It is applied in my department.
Content                         I27    It can enhance the quality of the theoretical part of the subject.
                                I28    It can enhance the quality of the practical part of the subject.
                                I29    It can be applied to the theoretical part of the subject electricity.
                                I30    It can be applied to the practical part of the subject electricity.
Acceptance                      I31    I believe that e-learning can improve the quality of my teaching.
                                I32    I believe that using e-learning can increase my productivity.
                                I33    I believe that e-learning is useful for my research.
                                I34    I believe that e-learning is more effective than the traditional one.
                                I35    I believe that it is easy for me to use e-learning tools.
                                I36    I believe that my students find it easy to use a VLE.
Training                        I37    I do not need training on e-learning.
                                I38    My students do not need training on e-learning.
                                I39    Technical and administrative personnel do not need training on e-learning.
                                I40    The facilities of the university are enough for e-learning.


5.4 Results and Discussion


This section is divided into two parts: the first part reports the descriptive statistics of the items in the study, whereas the second part compares the mean scores across variables such as the gender and age of the participants to find out whether there were significant differences with respect to these variables.

5.4.1 Initial Findings using Descriptive Statistics


This section reports the initial findings of the teacher survey by presenting the descriptive statistics of the items in the study in nine parts, based on the factors that may affect the readiness of teachers for e-learning: technology, experience and confidence with ICT, attitudes towards e-learning and others, institution, content, acceptance and training. The means and standard deviations of the scores of all the items in the study are presented in Table 8.

The overall results of the items associated with each factor are explained separately. In addition to the statistics of all the items, Table 9 illustrates the mean score of the participants' responses related to each factor and the overall mean score across factors. From Table 9, it can be observed that the overall mean score of the factors is higher than the expected level of readiness (MOVERALL = 3.50 > MEXPECTED = 3.40).


Table 8: Statistics of all the items in the study

Factor                          Item   Code   Std. Error Mean   Mean   Std. Deviation   Variance
Technology                      I01    TEC1   0.07              4.53   1.29             1.66
                                I02    TEC2   0.08              4.37   1.34             1.81
                                I03    TEC3   0.00              5.00   0.00             0.00
                                I04    TEC4   0.06              3.33   0.98             0.96
Experience with ICT             I05    EXP1   0.04              4.60   0.63             0.39
                                I06    EXP2   0.04              4.49   0.77             0.59
                                I07    EXP3   0.05              4.38   0.87             0.75
                                I08    EXP4   0.08              2.92   1.38             1.91
                                I09    EXP5   0.05              4.35   0.91             0.82
                                I10    EXP6   0.08              3.36   1.34             1.79
Confidence with ICT             I11    CNF1   0.04              4.45   0.69             0.47
                                I12    CNF2   0.04              4.43   0.67             0.45
                                I13    CNF3   0.04              4.47   0.66             0.44
                                I14    CNF4   0.05              3.95   0.92             0.84
                                I15    CNF5   0.05              3.79   0.95             0.90
Attitudes towards E-learning    I16    MAT1   0.05              3.65   0.92             0.85
                                I17    MAT2   0.06              3.68   0.99             0.97
                                I18    MAT3   0.05              3.64   0.95             0.91
                                I19    MAT4   0.05              2.76   0.97             0.94
                                I20    MAT5   0.05              3.54   0.97             0.94
                                I21    MAT6   0.06              3.19   0.99             0.98
Attitudes towards Others        I22    OTA1   0.05              3.23   0.93             0.87
                                I23    OTA2   0.05              3.48   0.87             0.75
Institution                     I24    INS1   0.11              3.15   1.95             3.82
                                I25    INS2   0.10              2.31   1.83             3.35
                                I26    INS3   0.10              2.19   1.79             3.22
Content                         I27    CNT1   0.05              3.77   0.91             0.82
                                I28    CNT2   0.06              3.27   1.10             1.21
                                I29    CNT3   0.05              3.89   0.88             0.77
                                I30    CNT4   0.06              3.09   1.11             1.24
Acceptance                      I31    TAM1   0.05              3.54   0.88             0.78
                                I32    TAM2   0.05              3.62   0.83             0.69
                                I33    TAM3   0.05              3.58   0.91             0.83
                                I34    TAM4   0.06              3.16   1.00             1.01
                                I35    TAM5   0.04              3.72   0.69             0.48
                                I36    TAM6   0.04              3.78   0.67             0.44
Training                        I37    TRA1   0.05              2.67   0.96             0.92
                                I38    TRA2   0.04              2.23   0.79             0.63
                                I39    TRA3   0.05              2.09   0.81             0.66
                                I40    TRA4   0.06              3.29   1.02             1.04
Overall                                       0.22              3.50   0.39             0.15

Table 9: The overall mean score of the participants' responses for each factor

Factor                          No of items   Mean   Std. Error Mean   Std. Deviation   Variance
Technology                      4             4.31   0.04              0.70             0.49
Experience with ICT             6             4.02   0.03              0.61             0.38
Confidence with ICT             5             4.22   0.03              0.62             0.38
Attitudes towards e-learning    6             3.41   0.04              0.68             0.46
Attitudes towards others        2             3.35   0.05              0.84             0.71
Institution                     3             2.55   0.08              1.44             2.06
Content                         4             3.51   0.04              0.77             0.59
Acceptance                      6             3.56   0.03              0.62             0.38
Training                        4             2.57   0.04              0.63             0.40
Overall                         40            3.50   0.02              0.39             0.39

Based on this result, it can be inferred that teachers in the HEIs surveyed in Turkey were in general ready for e-learning. The mean scores for the factors can also be used to identify areas for improvement in the participating institutions in Turkey. The mean scores for institution and training, the two lowest-scoring factors and both well below the expected readiness level, show that the participating teachers needed training for e-learning before embarking on it and that the infrastructure of the institutions was not sufficient for e-learning. Improvements should therefore definitely be made in those areas.

5.4.1.1 Findings in the Factor 1: Technology


For the technology factor, the participants were first asked about their ownership of a desktop or laptop computer connected to the Internet at home (TEC1) and at university (TEC3), and second about the stability of the Internet at home (TEC2) and at university (TEC4), because e-learning depends on access to the Internet and a computer. The descriptive statistics of the scores of all the items in the technology factor are presented in Table 8. All the participants reported that they had access to the Internet at university (TEC3), whereas only 88.3% (279) of them had access at home (TEC1). Besides, in order to find out how the participants accessed the Internet at home and at university, they were also asked how they usually did so: whether they connected to the Internet using broadband or dial-up at home, and using a wired or wireless connection at university.


As obtained, 80.1% (251) of the participants used broadband at home, whereas at university 64.9% (205) of the participants had access to the Internet using both wired and wireless connections, 32.6% (103) only wired and 2.5% (8) only wireless. For the stability of the Internet at home and at university, the participants were asked to what extent they were satisfied with the home and university networks. The mean scores of the participants' answers for items TEC2 and TEC4 in Table 8 indicate that the stability of the internet was sufficient at home but not at university. According to the views of the participants, the mean score of their responses for the stability of the Internet was under the expected readiness level at university (M=3.33 < MEXPECTED=3.40) but higher at home (M=4.37 > MEXPECTED=3.40). These findings seem to imply that the stability of the Internet at university should be improved to enhance teachers' e-learning experience before embarking on e-learning. The overall descriptive statistics of the scores for all the items in the technology factor are also computed, as illustrated in Table 9. The overall mean score indicates that readiness in the technology factor is sufficient because it is much higher than the expected readiness level (M=4.306 > MEXPECTED=3.40). Based on these results, it can be inferred that while the overall responses of the participants indicate that they are technologically ready for e-learning, it may not be possible to transcend location constraints for the participants without access to the internet at home.

5.4.1.2 Findings in the Factor 2: Experience with ICT

Table 8 illustrates the descriptive statistics of the scores of all the items designed to assess the participants' experience in the usage of different ICT for their work (items EXP1 to EXP6), including the mean, standard error of the mean, standard deviation and variance. From the table, it can be observed that the mean scores of items EXP1, EXP2, EXP3 and EXP5 are higher than the expected level of readiness (MEXPECTED=3.40) whereas those of items EXP4 and EXP6 are not.


Based on these results, it can be inferred that the participants' experience of ICT usage is mostly sufficient for e-learning, although their experience of using social network sites (EXP4) and instant messaging for synchronous communication (EXP6) is under the expected readiness level. These findings seem to imply that while the participants appear ready for e-learning in terms of their individual experience, their social experience with ICT does not indicate sufficient readiness.

5.4.1.3 Findings in the Factor 3: Confidence with ICT

For the factor confidence with ICT, five items were designed to assess the participants' confidence with ICT. Table 8 displays the standard deviations, numbers and mean scores for the questions associated with confidence in ICT usage. The results show that the participants in those institutions have a sufficient level of confidence in using particular ICT. The mean scores of all items related to the factor confidence are higher than the expected level of readiness, revealing that the participants are confident in using computers, web browsers, search engines, digital file management tools and authoring tools to create learning materials.

5.4.1.4 Findings in the Factor 4: Attitudes towards E-learning


For the factor attitudes towards e-learning, six items were designed to assess the attitudes of the participants towards e-learning (items MAT1 to MAT6). The items designed to evaluate the perceptions of the participants towards e-learning are shown in Table 8. It is apparent from this table that although the factor's overall mean score is slightly higher than the expected readiness level, it is lower than the scores for the factors technology, experience and confidence with ICT. As can be seen in Table 8, except for items MAT4 and MAT6, the mean scores of all the items were higher than the expected readiness level. These findings seem to imply that the participants have information regarding e-learning (MAT1), have sufficient competence with ICT (MAT2), feel that they are ready for e-learning (MAT3) and believe that their students will like e-learning (MAT5). However, the participants are afraid that they may not have time to prepare e-learning materials (MAT4). Furthermore, it seems that they are not ready to support the use of e-learning in their departments immediately (MAT6).


5.4.1.5 Findings in the Factor 5: Attitudes towards Others


For the factor attitudes towards others, two items were designed to evaluate the perceptions of the participants regarding what they think about their top-level administration and e-learning (items OTA1 and OTA2). The scores of the participants, who were asked to rate their opinions about their top-level administration regarding e-learning, are illustrated in Table 8. As seen in Table 8, the findings seem to imply that although their top-level administration would support the integration of e-learning, it may lack information regarding e-learning. This implies that the top-level administration in the respective HEIs supports the use of new technologies regardless of whether it has information about them or not.

5.4.1.6 Findings in the Factor 6: Institution


Three items were designed to assess the participants' institutions regarding e-learning. The participants were asked whether e-learning was currently implemented at three levels: their university, their faculty or high school, and their own department. The descriptive statistics of the scores of these three items are illustrated in Table 8. According to the findings, almost half (53.73%, 170) of the participants' universities, according to the responses of the teachers, apply e-learning in Turkey. Besides, the participants were also asked whether e-learning was applied in their own faculty or high school; 32.72% (104) of the participants believed that e-learning was currently applied in their faculties or high schools. Finally, 29.68% (94) of the departments associated with the subject of electricity currently implement e-learning, either officially or through the personal efforts of the teachers.


5.4.1.7 Findings in the Factor 7: Content

For the factor content, four items were designed to assess the perceptions of the participants regarding the subject of electricity (items CNT1 to CNT4). The participants were asked to what extent they agreed that e-learning can enhance the quality of the theoretical (CNT1) and practical (CNT2) parts of the subject of electricity and can be applied to those theoretical (CNT3) and practical (CNT4) parts. Table 8 shows that the participants believe that e-learning can enhance the quality of the theoretical part of the subject electricity (CNT1) and can also be applied to that part (CNT3).

However, the mean scores indicate that e-learning may not be applicable to the practical part of the subject electricity, because the mean scores of items CNT2 and CNT4 are under the expected readiness level. It is also interesting to note that the participants' belief that e-learning can enhance the practical part of the subject of electricity is stronger than their belief that e-learning can be applied to that part. Overall, these findings imply that the teacher participants consider that e-learning can be integrated into the theory to enhance the quality of courses on electrical engineering, but not into the practice.

5.4.1.8 Initial Findings in the Factor 8: Acceptance

For the factor acceptance, the participants were asked to give their opinions on six items measuring their acceptance of e-learning (items TAM1 to TAM6), based on the Technology Acceptance Model (Davis, 1989; Venkatesh & Bala, 2008). Table 8 shows the mean scores and standard deviations of the responses for those items. The mean scores of all items in Table 8, except item TAM4 (M=3.16, SD=1.00), are over the expected readiness level. It can be inferred that the participants hold positive perceptions towards e-learning. However, the participants do not believe that e-learning will enable them to accomplish their teaching more effectively than the campus-based approach. The main rationale for this result may be their belief that e-learning is not applicable to the practical part of the subject of electricity, as mentioned above. As a result, I may conclude that, overall, the participants believe that e-learning can help enhance their teaching and can be implemented without much effort.

5.4.1.9 Initial Findings in the Factor 9: Training

For the last part of the study, the participants were required to answer four questions to find out whether there was a need for training for e-learning before it is implemented (items TRA1 to TRA4). Table 8 presents the statistics of the scores of those items. The mean scores of items TRA1, TRA2 and TRA3 in Table 8 indicate that the participants see a strong need for training for e-learning for themselves, for their students and for technical and administrative personnel. Additionally, they think that their institutions do not have sufficient facilities to implement e-learning (TRA4).

5.4.2 Comparative Findings using Inferential Statistics


Independent-sample t-tests and one-way ANOVA tests were used to verify the statistical
significance of differences in mean scores on various variables, namely between males
and females, public and private universities, 2-year and 4-year degree programs, among
different regions and institutions, and among teachers, researchers, administrators and
strategists. However, the t-test and one-way ANOVA test cannot be computed for the
item TEC3 because the standard deviations of both groups are 0.
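
The group comparisons reported in the following subsections can, in principle, be reproduced with any standard statistical package. The short Python fragment below is only a minimal sketch of how an independent-sample t-test and a one-way ANOVA might be computed on such Likert-scale scores; the file name and the column names (teacher_readiness.csv, gender, region, TEC1) are hypothetical placeholders rather than details of the original data set.

    import pandas as pd
    from scipy import stats

    # Hypothetical data file; "gender", "region" and "TEC1" are placeholder column names.
    df = pd.read_csv("teacher_readiness.csv")

    # Independent-sample t-test: male vs. female mean scores on a single item.
    male = df.loc[df["gender"] == "male", "TEC1"].dropna()
    female = df.loc[df["gender"] == "female", "TEC1"].dropna()
    t, p = stats.ttest_ind(male, female)          # assumes equal variances by default
    print(f"t({len(male) + len(female) - 2}) = {t:.3f}, p = {p:.3f}")

    # One-way ANOVA: comparing the same item across more than two groups (e.g. regions).
    groups = [g["TEC1"].dropna() for _, g in df.groupby("region")]
    f_stat, p = stats.f_oneway(*groups)
    print(f"F = {f_stat:.3f}, p = {p:.3f}")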

5.4.2.1 Differences between Males and Females


Gender differences in readiness for e-learning are often considered a controversial topic,
as such differences are not consistently observed (So, 2005). Table 10 shows the
number, mean and standard deviation of the scores of the items in the study for females
and males. Independent-sample t-test results are also illustrated in Table 10 to verify the
statistical significance of differences in mean scores between males and females at the
level of 0.05. As shown in Table 10, no significant difference occurred between females
and males for the scores on the four items in the first factor of the study (i.e.
technology). However, the responses of both groups indicate that they are not satisfied
with the stability of the Internet at the university in which they are currently working.
Nevertheless, the overall mean of the items in the technology factor is higher than the
expected readiness level and no significant difference (t[314] = -0.321, p > 0.05)
occurred between females (M=4.337, SD=0.652) and males (M=4.301, SD=0.711) in
terms of the overall mean of the four items in this factor.
For the factor experience with ICT, the responses of the males for the item designed to
assess whether they use instant messaging such as MSN and Yahoo Messenger
(i.e. EXP6) are under the expected readiness level (M=3.308 < MEXPECTED=3.40) and
lower than those of the females. Moreover, the responses of both groups for the item
assessing whether they use social network sites such as Facebook and Orkut (i.e. EXP4)
are also under the expected readiness level. These findings imply that the individual
experiences of the participants, regardless of their gender, are considerably stronger
than their social experiences. However, in terms of the overall mean score of the items
in the factor experience with ICT, a significant difference (t[314] = -1.98, p < 0.05) was
found between males (M=3.99, SD=0.60) and females (M=4.18, SD=0.65). For the
third factor of the study (i.e. confidence), a significant difference (t[314] = -2.19, p <
0.05) occurred between females (M=4.67, SD=0.48) and males (M=4.44, SD=0.68)
with respect to the use of search engines such as Google and MSN Search (i.e. CNF3).
Besides, the responses of both females and males for all the items designed to measure
their confidence with ICT are higher than the expected readiness level. The overall
mean score of all the items in the factor confidence with ICT was also computed for
both females (M=4.24, SD=0.51) and males (M=4.21, SD=0.64), but no significant
difference was found (t[314] = -0.246, p > 0.05).

However, the belief of the female participants that their learners will like e-learning
(i.e. MAT5) is substantially weaker than that of the male ones in the fourth factor of the
study (i.e. attitudes towards e-learning). Moreover, males (M=3.26, SD=0.99) scored
significantly higher (t[314] = 2.21, p < 0.05) than females (M=2.91, SD=0.85) on the
item assessing whether they will support the use of e-learning in their department
(i.e. MAT6). However, the responses of both groups for this item are under the expected
readiness level and hence not sufficient for e-learning. Furthermore, the mean score of
the item designed to assess whether the participants have sufficient time to prepare
e-learning materials is also under the expected readiness level. This shows that both
groups are afraid that they may not have extra time for e-learning. Besides, the overall
mean score of all the six items in the factor attitude towards e-learning was calculated
as 3.41. With regard to the fifth factor of the study (i.e. attitude towards others), the
responses of both females and males for the item designed to assess whether their top
administration has information about e-learning (i.e. OTA1) are lower than the expected
readiness level.


Table 10: Statistics for the items for Gender Differences

Factors                      I    Items   Female (N=46)     Male (N=270)      t Value   p Value
                                          Mean     SD       Mean     SD
Technology                   I01  TEC1    4.65     1.14     4.51     1.31     -0.69     0.49
                             I02  TEC2    4.35     1.27     4.37     1.36      0.11     0.92
                             I03  TEC3    5.00     0.00     5.00     0.00      -         -
                             I04  TEC4    3.35     1.04     3.32     0.97     -0.16     0.87
Experience with ICT          I05  EXP1    4.68     0.52     4.59     0.64     -0.91     0.36
                             I06  EXP2    4.58     0.81     4.47     0.76     -0.92     0.36
                             I07  EXP3    4.51     0.79     4.36     0.88     -1.12     0.27
                             I08  EXP4    3.19     1.28     2.87     1.40     -1.45     0.15
                             I09  EXP5    4.44     0.98     4.34     0.90     -0.69     0.49
                             I10  EXP6    3.69     1.21     3.31     1.35     -1.78     0.08
Confidence with ICT          I11  CNF1    4.34     0.72     4.47     0.68      1.17     0.24
                             I12  CNF2    4.51     0.53     4.42     0.69     -0.88     0.38
                             I13  CNF3    4.67     0.48     4.44     0.68     -2.19     0.03
                             I14  CNF4    3.90     0.94     3.96     0.92      0.38     0.71
                             I15  CNF5    3.77     1.01     3.79     0.94      0.12     0.90
Attitudes towards E-learning I16  MAT1    3.64     0.88     3.65     0.93      0.05     0.96
                             I17  MAT2    3.54     0.99     3.70     0.99      1.05     0.30
                             I18  MAT3    3.61     0.93     3.65     0.96      0.22     0.83
                             I19  MAT4    2.55     0.88     2.80     0.98      1.58     0.12
                             I20  MAT5    3.37     1.01     3.57     0.96      1.34     0.18
                             I21  MAT6    2.90     0.87     3.24     1.00      2.22     0.03
Attitudes for Others         I22  OTA1    3.07     0.99     3.26     0.92      1.24     0.21
                             I23  OTA2    3.39     0.98     3.49     0.85      0.76     0.45
Institution                  I24  INS1    3.07     2.01     3.16     1.95      0.30     0.77
                             I25  INS2    1.94     1.70     2.37     1.85      1.49     0.14
                             I26  INS3    2.27     1.86     2.17     1.79     -0.34     0.74
Content                      I27  CNT1    3.51     1.01     3.82     0.88      2.14     0.03
                             I28  CNT2    2.94     1.00     3.32     1.11      2.17     0.03
                             I29  CNT3    3.57     0.99     3.94     0.84      2.74     0.01
                             I30  CNT4    2.75     1.01     3.15     1.12      2.30     0.02
Acceptance                   I31  TAM1    3.14     0.89     3.60     0.86      3.39     0.00
                             I32  TAM2    3.40     0.78     3.65     0.83      1.96     0.05
                             I33  TAM3    3.34     0.94     3.61     0.90      1.87     0.06
                             I34  TAM4    3.06     0.98     3.17     1.01      0.69     0.49
                             I35  TAM5    3.31     0.68     3.79     0.67      4.44     0.00
                             I36  TAM6    3.49     0.70     3.82     0.65      3.17     0.00
Training                     I37  TRA1    2.64     0.97     2.68     0.96      0.23     0.82
                             I38  TRA2    2.44     0.94     2.19     0.76     -1.91     0.06
                             I39  TRA3    2.33     0.93     2.05     0.79     -2.18     0.03
                             I40  TRA4    3.30     0.94     3.29     1.04     -0.06     0.95
Overall                                   3.43     0.37     3.51     0.39      1.37     0.17

Note: t and p values could not be computed for TEC3 because the standard deviations of both groups are 0.

However, both groups are in agreement that their top administration will support the use
of e-learning in their institutions. For the sixth factor of the study (i.e. institution), no
significant differences occurred between the responses of males and females. However,
the mean scores of all the items designed to assess whether e-learning is currently
applied in the participants' institutions are under the expected readiness level. For the
seventh factor of the study (i.e. content), significant differences occurred in all the items
associated with the content factor, with male participants scoring significantly higher
than female ones. However, the responses of both groups for the items designed to assess
whether e-learning is applicable to the practical part of the subject of electricity (CNT2)
and whether e-learning can enhance the quality of the practical part of the subject of
electricity (CNT4) are under the expected readiness level. With regard to the eighth
factor of the study (i.e. acceptance of e-learning), the responses of the females for all
the items designed to assess whether they accept e-learning in terms of perceived
usefulness and ease of use are lower than those of the males, and four of these items are
under the expected readiness level (MEXPECTED=3.40).
This shows that the females do not believe that using e-learning can enhance the quality
of their teaching (TAM1), that e-learning is useful for their research (TAM3), that
e-learning is better than the campus-based approach (TAM4), or that they themselves
will use e-learning tools with ease (TAM5). However, they believe that e-learning can
increase their productivity (TAM2) and that their students will use e-learning tools with
ease (TAM6). The responses of the males for all these items in the acceptance factor are
over the expected readiness level. In terms of the ninth factor of the study, which is
training for e-learning, the responses of both females and males show that teachers need
training for e-learning for themselves, for their students and for their technical and
administrative personnel. Besides, they think that the facilities of their institutions are
not sufficient for e-learning because the mean scores of all the responses are under the
expected readiness level. Interestingly, the females scored higher than the males on most
of these items. These results indicate that female participants do not believe in the
importance of training for e-learning as much as males do, or that they feel they are
more ready for e-learning and have the same feeling about other people, namely students
and technical and administrative personnel.


From Table 10, it can be observed that the overall mean score is higher than the
expected level of readiness for both females (MFEMALES=3.49 > MEXPECTED=3.40) and
males (MMALES=3.66 > MEXPECTED=3.40). However, no significant difference occurred
between these two groups (t[314]=1.33, p > 0.05). Based on these results, it can be
inferred that both female and male teachers in HEIs in Turkey are overall ready for
e-learning, although a few improvements are needed.

5.4.2.2 Differences between Private and Public Universities


Table 11 shows the differences between the participants who are working in private and
in public universities. For the first factor of the study, which is technology for e-learning,
no significant differences occurred between private and public universities. However,
the responses of the teachers in public universities were slightly higher than those of the
teachers in private ones. No significant differences were found in the second factor of
the study, which is experience with ICT, either. However, the responses of the
participants in private universities show that their usage of instant messaging such as
MSN and Yahoo Messenger was under the expected level of readiness and much lower
than the mean score of the participants in public universities. Moreover, the mean scores
of both groups for the item designed to assess whether they use social network sites
such as Facebook and Orkut were under the expected level of readiness. For the third
factor of the study, which is confidence with ICT, the responses of both groups for all
the items designed to assess whether they are confident in using several tools are higher
than the expected readiness level, and no significant differences were found between
private and public universities. With regard to the fourth factor of the study, which is
attitudes towards e-learning, a significant difference (t[314]=2.08, p < 0.05) was found
between private (M=3.88, SD=0.77) and public (M=3.51, SD=0.98) universities on the
measure of the item designed to assess whether they believe their students will like
e-learning (MAT5).


Table 11: Results of University Financial Mode Differences

Factors                      I    Items   Private (N=35)    Public (N=281)    t Value   p Value
                                          Mean     SD       Mean     SD
Technology                   I01  TEC1    4.54     1.29     4.53     1.29      0.06     0.96
                             I02  TEC2    4.37     1.35     4.37     1.35      0.02     0.98
                             I03  TEC3    5.00     0.00     5.00     0.00      -         -
                             I04  TEC4    3.29     1.13     3.33     0.96     -0.26     0.80
Experience with ICT          I05  EXP1    4.65     0.50     4.60     0.64      0.45     0.65
                             I06  EXP2    4.65     0.77     4.47     0.76      1.35     0.18
                             I07  EXP3    4.25     1.16     4.40     0.83     -0.98     0.33
                             I08  EXP4    2.89     1.41     2.92     1.38     -0.13     0.90
                             I09  EXP5    4.25     1.20     4.37     0.87     -0.71     0.48
                             I10  EXP6    2.99     1.37     3.41     1.33     -1.76     0.08
Confidence with ICT          I11  CNF1    4.57     0.56     4.44     0.70      1.03     0.31
                             I12  CNF2    4.56     0.52     4.41     0.69      1.23     0.22
                             I13  CNF3    4.60     0.50     4.46     0.68      1.20     0.23
                             I14  CNF4    4.06     1.00     3.94     0.91      0.72     0.47
                             I15  CNF5    3.67     1.13     3.80     0.93     -0.73     0.46
Attitudes towards E-learning I16  MAT1    3.57     1.04     3.66     0.91     -0.53     0.59
                             I17  MAT2    3.66     1.09     3.68     0.97     -0.08     0.94
                             I18  MAT3    3.63     0.98     3.64     0.95     -0.09     0.93
                             I19  MAT4    2.83     1.04     2.75     0.96      0.48     0.63
                             I20  MAT5    3.87     0.76     3.50     0.99      2.16     0.03
                             I21  MAT6    3.26     0.90     3.18     1.00      0.45     0.66
Attitudes for Others         I22  OTA1    3.53     0.89     3.20     0.93      2.02     0.05
                             I23  OTA2    3.78     0.85     3.44     0.86      2.18     0.03
Institution                  I24  INS1    3.86     1.77     3.06     1.96      2.29     0.02
                             I25  INS2    2.23     1.81     2.32     1.84     -0.26     0.80
                             I26  INS3    2.90     1.98     2.10     1.75      2.50     0.01
Content                      I27  CNT1    3.84     0.73     3.76     0.93      0.47     0.64
                             I28  CNT2    3.55     0.88     3.23     1.12      1.65     0.10
                             I29  CNT3    3.96     0.57     3.88     0.91      0.51     0.61
                             I30  CNT4    3.46     0.95     3.05     1.12      2.10     0.04
Acceptance                   I31  TAM1    3.83     0.67     3.50     0.90      2.10     0.04
                             I32  TAM2    3.89     0.59     3.58     0.85      2.10     0.04
                             I33  TAM3    3.50     0.90     3.58     0.91     -0.53     0.59
                             I34  TAM4    3.34     0.91     3.13     1.01      1.14     0.26
                             I35  TAM5    3.69     0.69     3.73     0.70     -0.27     0.79
                             I36  TAM6    3.75     0.61     3.78     0.67     -0.20     0.84
Training                     I37  TRA1    2.79     0.86     2.65     0.97      0.82     0.42
                             I38  TRA2    2.34     0.89     2.22     0.78      0.85     0.40
                             I39  TRA3    2.51     0.95     2.04     0.78      3.27     0.00
                             I40  TRA4    3.30     1.14     3.29     1.01      0.07     0.94
Overall Scores                            3.64     0.42     3.48     0.38      2.29     0.02

Note: t and p values could not be computed for TEC3 because the standard deviations of both groups are 0.

Moreover, the responses of both groups for the items designed to assess whether they
believe e-learning will enable them to accomplish their teaching more effectively than
the campus-based approach (MAT4) and whether they will support the use of e-learning
in their institution (MAT6) are under the expected level of readiness. For the fifth factor
of the study, which is attitude towards others, no significant difference was found
between the groups. However, the responses of the participants in private universities
show that their belief that their top administration understands (OTA1) and supports
(OTA2) e-learning is higher than the expected level of readiness and much stronger than
that of the participants in public universities. For the sixth factor of the study, which is
e-learning in the institution, a significant difference (t[314]=2.15, p < 0.05) was found
between private and public institutions on the measure of whether e-learning is currently
applied in their universities.

The responses of the participants in private universities for all the items designed to
assess whether e-learning is currently applied in their institutions (INS1, INS2 and INS3)
are much higher than those of the participants in public universities. In terms of the
seventh factor of the study, which is content for e-learning, no significant differences
were found; the tendency of the responses of the participants in public and private
universities is similar for all the items. The mean scores of both groups indicate that
e-learning can be applied to the theoretical part of the subject of electricity and can
enhance its quality (CNT1, CNT3). However, the same feeling does not exist for the
practical part of the subject of electricity, although the participants in private universities
still believe that e-learning can enhance the quality of the practical part even if it may
not be implemented in practice. With regard to the eighth factor of the study, which is
acceptance of e-learning, no significant differences were found. However, the mean
scores of the participants for all the items except TAM4 are higher than the expected
level of readiness. This indicates that participants in both public and private universities
do not believe that e-learning can offer a better approach than the classroom-based one.
Moreover, the overall tendency is for the mean scores of the participants in private
universities to be higher than those of the participants in public universities.


For the last part of the study, which is training for e-learning, a significant difference
(t[314]=2.79, p < 0.05) was found between private (M=2.44, SD=0.95) and public
(M=2.04, SD=0.76) universities on the measure of the item designed to assess whether
the participants believe that their technical and administrative personnel need training
for e-learning. In spite of this difference, the mean scores of all the items for both groups
are under the expected level of readiness. Table 11 also illustrates the overall mean score
of the participants' responses. From Table 11, it can be observed that the overall mean
score is higher than the expected level of readiness for both public
(MPUBLIC=3.48 > MEXPECTED=3.40) and private (MPRIVATE=3.64 > MEXPECTED=3.40)
universities. Moreover, a significant difference occurred between these two groups
(t[314]=2.25, p < 0.05). Based on these results, it can be inferred that teachers in both
public and private HEIs in Turkey are overall ready for e-learning, although a few
improvements are needed.


CHAPTER 6: ANALYSING ISSUES FOR APPLYING E-LEARNING: THE TEACHER PERSPECTIVE

6.1 Introduction
In Chapter 5, the readiness of the teaching staff in the HEIs associated with the subject
of electricity in Turkey was measured by administering a questionnaire, to find out
whether they tend to embrace or ostracise e-learning for their work. At the end of the
questionnaire, I conducted a research study to analyse those teachers' views on
e-learning by conducting semi-structured interviews. Specifically, the interviews
addressed four main aspects: (i) the current theoretical and practical issues in both
education and training in the respective HEIs; (ii) the definition of e-learning; (iii) the
advantages and disadvantages of e-learning, and whether e-learning could be a solution
for the current issues identified or whether it may create new issues in the HEIs; (iv) the
way e-learning should be implemented to solve actual issues in education and training
in those HEIs. Based on the understanding of these four aspects, a model is developed
that enables the realization of e-learning as a potential solution for resolving certain
issues in the Turkish HEIs offering the subject of electricity. Additionally, the model is
supported by the findings derived from the open-ended questions in the questionnaire.

6.2 Methods
417 departments/programs in the HEIs in Turkey were selected for the study. The
participating institutions were determined by considering whether they were associated
with the subject of electricity according to the official data in 2010 provided by the
OSYM (the Student Selection and Placement Centre) in Turkey. Two survey
techniques, a questionnaire and interviews, were employed for the study. The
questionnaire consists of eight sections with 39 quantitative and 8 open-ended items
which measure teachers' readiness for e-learning in the respective HEIs. The interview
is semi-structured with 4 questions. 424 staff from the HEIs (e.g. researchers,
strategists, lecturers, and administrators) responded to the questionnaire (for its detailed
design and the findings of the quantitative items, see Akaslan & Law, 2011) and 66 of
them indicated their willingness to take part in an online interview, but only 18 of them
attended it. Of the 18 interviewees, only 2 were female.


Nine, six and three of them are working in the areas of Electricity, Electrical and
Electronic Engineering, and Electrical Education, respectively. Table 12 displays the 8
open-ended items in the questionnaire and the 4 questions of the interviews, together
with the respective number of participants. The responses in the interviews have been
transcribed and analysed using the method described in Chapter 3.4.

Table 12: List of item identifier, content and number of participants (N)

Interview Items                                                                         N
I1  What are issues or inadequacies in your department?                                18
I2  What is the meaning of e-learning for you?                                         16
I3  How can e-learning solve or help to solve issues in your department?               16
I4  How should e-learning be implemented in your department or program?                16
Survey Items                                                                            F
Q1  What kind of ICT do you use in confidence or in difficulty?                        43
Q2  Can you elaborate your personal experiences and views on e-learning?               55
Q3  Is e-learning applied in your department / program?                                24
Q4  Is e-learning applied in your faculty / high school?                               22
Q5  Is e-learning applied in your university?                                          15
Q6  Can you elaborate how useful and how easy for you to use e-learning?               60
Q7  Can you elaborate the types of training you have in mind on e-learning?            95
Q8  How can e-learning help to solve current issues in the science of electricity?     56

6.3 Findings
In this section, I report the results of my analysis of the aforementioned items as
indicated in Table 12. Specifically, the analysis is based on the thematic analysis
approach (see Chapter 3.4) and focuses on several main issues.

6.3.1 What is E-learning?

The interviewees were asked to define e-learning in order to find out their interpretation
of this notion, which can be instantiated in different forms. The interviewees seemed to
converge on the understanding that e-learning is an internet-based and computer-supported
learning method.


The common understanding of the interviewees aligns with the author's conception of
designing a web-based learning environment, be it open source like Moodle or
proprietary like Blackboard. Selecting a form of e-learning that is consistent with the
understanding of the interviewees may facilitate the implementation of e-learning and
increase the chance of its uptake.

6.3.2 Issues and E-learning as a Solution

The interviewees were asked to describe issues in their institutes that hamper education
and training and to assess whether e-learning could resolve those issues. Issues were
first conceptualized and then the related concepts were integrated into categories. Some
of the main issues for which e-learning can serve as a solution are discussed in the
following (NB: each participant in the survey is designated with an identifier P1, P2,
..., Pn; this identifier and the source of the finding, interview or survey, are presented in
brackets):

6.3.2.1 Individual Issues


Learner-Based Issues: Nearly all applicants for undergraduate programs of the
universities in Turkey are selected and placed in accordance with the results of two
systems: the SS and the SGH. The SS is used to measure applicants' abilities, namely
analytical thinking, problem-solving and knowledge of the high school curriculum,
whereas the SGH is used for vocational secondary school graduates to apply for
placement in two-year vocational higher education programs which are compatible with
their high school majors, without measuring applicants' abilities. The SGH is seriously
criticized because, to be successful in the SS, applicants need to study hard for a
long time (interview, P3). The findings of the interviews have revealed that there is a
significant gap between the abilities of the students selected and placed in accordance
with the SGH and the expectations of two-year vocational higher education programs,
because the respective students lack the motivation, knowledge and ability required for
vocational higher education programs (interview, P2, P3).


This lack indicates that the students, who were expected to complete a higher-education-preparatory
curriculum in vocational secondary schools, are poorly prepared.
The insufficiency of the respective students for vocational higher education has led to
rote learning (interview, P3), unprepared attendance at modules (interview, P2),
diploma-oriented rather than industry-oriented learning (interview, P2, P3),
lower demand for information (interview, P2), lower usage of university facilities
(e.g. libraries, photocopy rooms) or usage only on exam days (interview, P2, P3),
note-taking anxiety and lack of competition.
However, e-learning offers solutions for these issues. Firstly, students' insufficient
knowledge of the high school curriculum may be boosted by arranging a variety of online
courses, such as mathematics, physics and chemistry, according to the needs of learners.
Secondly, as learners have a tendency towards rote learning, e-learning may offer several
different examples and their solutions for each related topic. Furthermore, topics may be
enriched with rich graphics, animations and videos to prevent rote learning.
Thirdly, e-learning may replace missed lessons (survey, P352) because students in
online courses are no longer dependent on fixed timetables or on classrooms where
students sit in front of teachers. Additionally, students can view topics again and again
until they better understand them. Fourthly, e-learning can host the digital versions of
textbooks, notes and documents and thus reduce the anxiety of students with regard to
note taking in classrooms and labs.

Fifthly, e-learning can help students to enhance their motivation. As e-learning materials
may be read and processed at any time, anywhere and at students' own pace
(survey, P302), students may have more time and interest to study online documents
and gain more details from them. Additionally, e-learning may render education and
training more appealing. Finally, students may discuss and exchange course contents
with other students via synchronous (e.g. chat rooms, conference systems, messengers)
or asynchronous (e.g. e-mail, forum) tools.


Teacher-Based Issues: The Higher Education Council (YÖK) in Turkey assigns few
teaching staff (e.g. one to three) to two-year vocational higher education programs and
assumes that they will teach the entire curriculum in those programs. However, the
abilities of these teachers are not sufficient for vocational higher education programs.
Firstly, the majority of teaching staff in vocational higher education programs have
published no research in their fields at the national or international level and hold no
master's or PhD degree, yet they are forced to teach the entire curriculum at higher
education level. Secondly, the majority of them have no foreign-language proficiency
(interview, P2). Thirdly, teaching staff teach several modules on which they do not have
advanced knowledge. This lack of research competence, foreign-language proficiency
and subject knowledge significantly reduces the quality of education and training and
causes difficulty in providing up-to-date information. Hence, it is recommended that the
Turkish Higher Education Council increase the number of teaching staff in these
programs and encourage these teachers to learn a foreign language, to conduct research
in their fields and to take a higher degree. Nevertheless, e-learning may also serve as
a solution to issues related to teachers.

Firstly, teachers in different universities may share ideas with national or international
researchers online. In this way, more teachers and learners may have up-to-date
knowledge according to the needs of the century. Secondly, collaboration between
universities may be improved as materials will be electronic; in this manner, institutions
may have more up-to-date information. In addition, teachers who are proficient in a
foreign language may translate important parts of international conference or journal
publications and distribute them to universities across Turkey. In order to achieve these
objectives, it seems that e-learning platforms should be enriched with new components.
Furthermore, e-learning may not only improve accessibility with respect to the place,
time and pace that suit the learner best but may also reduce teaching time
considerably. Therefore, teaching staff may have more time for research in their fields
and improve their competencies. Finally, teachers may support learners to learn more
comfortably at any time (survey, P146, P39, P53, P324, P316) and can interact with more
students (survey, P205). In this manner, teachers may have more opportunities to
respond to more questions raised by students and to update their knowledge more
frequently (survey, P131).

6.3.2.2 Financial Issues


Poor Infrastructure: The strategy of universities in Turkey for students to gain practical
hands-on experience relies on laboratories rather than industrial training. However, there
are many problems with laboratories across Turkey. Firstly, demand for laboratories is
high, due largely to large classes, whereas the number of labs is limited (interview, P3).
It is therefore not possible for students to conduct sufficient experiments to gain
hands-on skills. Secondly, the number of experimental sets in labs is limited. It is
therefore not possible to practise in labs individually, and students are forced to practise
in groups (interview, P1). Thirdly, the facilities of labs seem to be out of date
(interview, P1), even though laboratories such as PLC, pneumatics, microcontroller and
microprocessor labs require continuous modernization to meet the needs of the century.

However, as the cost of equipment in labs is high, modernization has been cut.
Fourthly, experimental sets in labs are designed with the aim of making experiments easy
for learners. However, these sets are criticized because their standards are not compatible
with commercial standards, as in industry all the parts of an experiment should be
integrated by the students themselves. Nevertheless, e-learning may offer solutions to
help students gain practical experience. It seems that the majority of issues related to
laboratories arise from the insufficient number of laboratories and equipment and the
insufficient modernization of this infrastructure. E-learning may support the development
of e-laboratories for illustrating the fundamentals of electricity, because it is possible to
simulate laboratory projects on a computer and on the internet.
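
As a purely illustrative example of what such a simulated exercise could look like, the short Python sketch below computes the charging curve of a series RC circuit from the standard first-order relation Vc(t) = Vs(1 - e^(-t/RC)); the component values are made up and the sketch is not part of any e-laboratory developed in this study.

    import math

    # Illustrative component values for a series RC charging circuit (not from the study).
    Vs = 5.0         # supply voltage in volts
    R = 10_000.0     # resistance in ohms
    C = 100e-6       # capacitance in farads
    tau = R * C      # time constant in seconds (1 s for these values)

    # Capacitor voltage sampled at 0 to 5 time constants: Vc(t) = Vs * (1 - exp(-t / tau)).
    for step in range(6):
        t = step * tau
        vc = Vs * (1.0 - math.exp(-t / tau))
        print(f"t = {t:3.1f} s  ->  Vc = {vc:.2f} V")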

Class Size: Young population growth in Turkey is likely to lead to an increase in
demand for higher education and also to an increase in class size. This is especially true
for institutions associated with the field of electricity (survey, P114; interview, P1, P3)
because higher education funding for the respective institutions is not sufficient and
staffing is inadequate. The large classes in the respective HEIs have an adverse impact
on education and training (interview, P3). In order to maintain education quality in
the respective institutions, class size should be between 25 and 30, whereas it varies
between 40 and 50.

The large classes in the institutions contribute to learning problems associated with
noise in classrooms and cause inadequate practice in laboratories (survey, P114), as the
number of experimental sets is limited. Furthermore, large classes increase stress and
reduce satisfaction for learners, and some students may need extra time to grasp
difficult concepts; however, it is not possible to offer more tuition to such students in
the classroom. E-learning may reduce the negative impacts of large classes in the
respective institutions. It may reduce stress and increase the satisfaction of learners
because e-learning offers self-pacing for quick and slow learners.

Insufficient Staffing: It seems that the Turkish Higher Education Council has a
tendency to maintain education and training with the minimum number of teaching staff
and assistants in HEIs associated with the field of electricity, especially in 2-year
vocational higher schools. The lack of research assistants in 2-year vocational higher
schools aggravates teachers' teaching load, which prevents them from conducting
research in their fields at the national or international level. However, this may be
solvable by recruiting top-class students in the classrooms as research or teaching
assistants. In this manner, students may be encouraged to study and appreciate their
profession, and the quality of experiments in laboratories may be higher because
students can benefit from the experiences of top-class students. Besides, there is
insufficient staffing in departments of electrical and electronics engineering, so students
are forced to become professional in the field of either electrics or electronics. However,
these problems may be resolved by offering e-learning in such departments. Universities
with sufficient academic staff may offer support to universities with insufficient staffing
by arranging online teacher-led instruction, because e-learning can essentially take place
anywhere.

6.3.2.3 Political Issues


Insufficient Status of Graduates: It is well recognized that vocational and technical
education and training is one of the most important factors for the advancement process
in developing countries. Accordingly, the main objective of technical education
faculties is to qualify students with systematic education and training, thereby
accrediting them to be specialists and instructors for vocational and technical high
schools in Turkey.

Graduates of technical education faculties can work as teachers in schools authorized by
the Ministry of Education, but they are not permitted to work in industry as engineers.
However, fewer than 5% of graduates are recruited as teachers for vocational and
technical high schools. The remaining majority have to face the problem of
unemployment, take a low-paid job or look for a job outside their field after graduation.
This gloomy prospect significantly reduces the motivation of learners in classrooms and
laboratories, as they are likely to be unemployed after graduation. These faculties were
therefore closed down on 13 November 2009, whereas the undesirable condition of
current students and graduates still continues. This issue could be addressed by
comparing the curriculum of graduates of technical education faculties (TEFs) with that
of engineering faculties. Graduates of the TEFs can be informed which modules required
to qualify as an engineer are missing from their own curriculum and decide accordingly
whether to complete them. However, there are thousands of graduates from the TEFs
and no university in Turkey can offer sufficient places to such graduates. Nevertheless,
all graduates of the TEFs could complete those missing modules online. It is possible to
design highly interactive modules for graduates of the TEFs. In this way, students may
study e-modules and, upon completion of such modules, they may be qualified as
engineers.

Double Majors: The term electrical and electronic engineering comprises two majors,
namely electrical engineering, covering the generation, transmission, control and use of
all forms of electrical power, and electronic engineering, including the expanding fields
of electronic communications, computers and electronic components. These very broad
and vibrant disciplines are delivered as a single undergraduate degree, namely electrical
and electronics engineering, in Turkey (interview, P1). This increases the number of
courses and requires more teaching staff in electrical and electronics engineering in
order to ensure that graduates are well qualified. However, the existence of this
undergraduate program is seriously criticised because there are too many courses in
electrical and electronics engineering with insufficient teaching staff. Furthermore, it is
not possible for students to study all the modules related to both electrics and electronics
(interview, P1). This leads to superficial learning because it is not possible to offer
detailed course contents.


It seems that learners can only learn a few things about everything related to electrics
and electronics, but they are not able to fully understand the relevance of course contents
(interview, P1). The number of optional modules enabling students to become well
qualified in electrics or electronics is not sufficient (interview, P1). It is therefore
important to address the deficiency between the disciplines of electrics and electronics
in the respective institutions by offering sufficient optional modules to students, or these
institutions should be divided into two different branches, namely electrical engineering
and electronics engineering (interview, P1). Thirdly, it seems that course contents are not
compatible with commercial standards (interview, P1).

However, e-learning may be used to resolve the issue of superficial learning engendered
by an excessive number of courses. Firstly, students may study at their own pace online
without teacher-led instruction. For this purpose, e-modules may be prepared using
interactive animations, simulations and illustrations of module contents such as electrical
machines, electromagnetic fields and electronic circuits. Therefore, students may spend
more time on fundamental concepts in order to fully grasp them and may have more
time to practise more experiments. Secondly, students may be supported by teacher-led
instruction online. In this way, students may get immediate feedback on what they have
learned. Additionally, these two approaches may be used at the same time: students may
study at their own pace and then meet with teachers to get feedback.

Lack of cooperation between university and industry: The quality of education and
training in HEIs associated with the subject of electricity highly depends on the proper
acquisition of both theoretical knowledge and practical skills. Academic institutions are
supposed to be the best place for acquiring theoretical knowledge, whereas industries are
the best setting for gaining practical skills (Chukwujekwu, 1976). It is therefore
important to set up a close collaboration between universities and industries to achieve
optimum education and training in the respective HEIs. However, this strategy is
neglected in Turkey, and there are many observations indicating this neglect. Firstly, the
duration of students' industrial training is restricted to only 40 days (interview, P2). It is
therefore impossible for students to ensure the relevance of course content and to gain
sufficient exposure to industry.


However, the industrial training experience should be between 5 and 15 months,
according to the duration of the degree programs. Secondly, universities have no
responsibility to find industrial training opportunities for students. Thirdly, there is no
mechanism to monitor students' industrial training performance (interview, P2);
universities should designate an industrial training officer whose teaching load is reduced
and who is paid for his/her effort and expenses in monitoring students' performance.
These issues related to industrial training contribute to unemployment, a shortage of
qualified personnel and inadequate learning experiences (interview, P2). However,
e-learning has the potential to strengthen collaboration between universities and
industries. Firstly, institutions using e-learning may ensure that training and education
standards are in line with occupational standards by acquainting teachers and learners
with advancements in the labour market across the world.

This may be implemented by recording practices in industries as videos or pictures and
illustrating them on the internet with technology-enhanced learning. Secondly,
e-learning may be used to inform teachers and researchers in universities about industrial
problems in order to increase their competence. Thirdly, the theoretical knowledge of
staff in industries may be updated by universities according to the latest developments;
this could be achieved by developing a network-based database and making it available
on the Internet through e-learning. Fourthly, the performance of students' industrial
training may be monitored under the framework of e-learning: a monitoring or tracking
component may be employed, so that learners can report their daily performance in
industries to their industrial training officer.
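
One conceivable shape for such a monitoring component is sketched below as a simple Python data record for a daily training report; every field name is a hypothetical illustration rather than a specification taken from the study.

    from dataclasses import dataclass, field
    from datetime import date
    from typing import List

    @dataclass
    class DailyTrainingReport:
        """Hypothetical record a student could submit to the industrial training officer."""
        student_id: str
        company: str
        report_date: date
        activities: List[str] = field(default_factory=list)   # tasks carried out that day
        supervisor_comment: str = ""                           # feedback from the company supervisor
        approved_by_officer: bool = False                      # sign-off by the training officer

    # Example: one day of industrial training logged by a student.
    report = DailyTrainingReport(
        student_id="S-0421",
        company="Example Energy Ltd",
        report_date=date(2011, 7, 18),
        activities=["Tested a three-phase induction motor", "Read single-line diagrams"],
    )
    print(report)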

6.3.3 How to Implement E-learning

The underlying assumptions of the teachers indicate that using technology may achieve
better learning outcomes. However, a model of e-learning should be illustrated and
explained to demonstrate what principles are needed to implement e-learning. The
results of the interviews have also led me to identify the stages of implementing
e-learning. These stages are illustrated in Figure 7 and are briefly explained below to
indicate what assumptions teachers are making at each stage:


Figure 7: Step-by-step for Implementing E-learning (Stage 1: Measuring Readiness for E-learning; Stage 2: Selecting or Developing an E-learning Platform; Stage 3: Developing Materials for E-learning; Stage 4: Training Individuals for the Platform; Stage 5: Delivering E-learning)

Stage 1: It is necessary to investigate the extent to which the HEIs associated with the
subject of electricity in Turkey are ready for e-learning. This is essential as many
factors can have an impact on e-learning. Firstly, e-learning is essentially based on
physical components, including computers and the internet. An investigation to find out
whether individuals in the respective institutions own computers and have reliable
internet connectivity is deemed useful (interview, P14; survey, P114). Secondly, the
characteristics of individuals must be discovered, as their confidence, experience,
anticipation and attitudes towards deploying various information and communication
tools have strong effects on the implementation of e-learning (survey, P168).

Stage 2: The findings of the survey indicate that the respective institutions are familiar
with the following e-learning platforms: Blackboard, ATutor, Moodle and Ninova, and
also with the following programming languages and tools: C++, C#, HTML, PHP, XML,
3Ds Max and Flash CS5. These should be the first choice when developing or selecting
an e-learning platform such as Moodle, because the respective HEIs are familiar with
them. This choice may give them a chance to develop the e-learning platform further, to
fix errors in it or to add new components when needed. However, international
e-learning practices should be examined before introducing e-learning into the
respective HEIs. It is therefore essential to identify the properties of the e-learning
platforms used by universities.

Stage 3: The following software tools are also widely used by the HEIs associated with
the field of electricity in Turkey: Microsoft Office, AutoCAD, LabVIEW, Matlab, 3Ds
Max, SolidWorks, Flash CS5, Simulink, Vissim, Corel Draw, Google Documents and
Wave, Facebook and MSN. Firstly, the above-mentioned tools should be considered
before and after introducing e-learning into the HEIs, as the institutions are familiar with
those tools.


As an example, LabVIEW rather than MultiSIM should be chosen to develop e-learning
materials, as being familiar with a tool saves time and gives individuals a chance to
understand, improve or change materials according to needs. Besides, it is essential to
describe the stages of developing e-materials. Mayes and Freitas (2000) state four steps
for developing materials: (i) to describe intended learning outcomes; (ii) to choose
learning and teaching activities that allow students to achieve the learning outcomes;
(iii) to design assessment tasks which test whether the learning outcomes have been
reached; (iv) to evaluate the achievement of outcomes. The styles of teaching staff and
institutions should be considered while developing e-materials. Teaching staff may
intend to make changes to e-materials. Therefore, the source code of e-materials should
also be accessible through e-learning platforms to create opportunities for teachers to
make such changes.

Stage 4: It is also essential to train teachers and their students before delivering
e-learning. To train teachers and their students effectively, the types of training that
teachers need should be investigated. To address this issue, the interviewees were asked
to define the types of training they need for e-learning. The results indicate that the
types of training teachers need may be classified as follows. Firstly, it is necessary to
inform teachers about e-learning in detail (survey, P397). Secondly, it is essential to
inform teachers about the responsibilities that e-learning will bring into the respective
HEIs (survey, P304, P276). Thirdly, various seminars should be arranged for teachers on
how to develop e-materials, and thus training for software tools such as Photoshop,
AutoCAD, SolidWorks and Flash CS5 should be provided (survey, P304, P131, P374,
P208, P203, P376, P161). Fourthly, teachers need detailed information on how to
implement e-learning for the theoretical and practical parts of the subject of electricity
(survey, P357, P408, P242). Fifthly, it is also necessary to arrange seminars to inform
teachers about how to integrate e-materials into the e-learning platform (survey, P213).
In summary, teachers mostly need training on how to prepare interactive e-learning
materials.


Stage 5: The final stage is to deliver e-learning in the respective HEIs in Turkey.
However, I do not intend to give details here of how to deliver e-learning; to address
this issue, the perspectives of students in addition to those of teachers should also be
analysed. Nevertheless, the results indicate that implementing e-learning will bring an
innovation to the respective HEIs and that e-learning should support face-to-face
education and training. Therefore, e-learning should be integrated with classroom
methods (survey, P276, P357, P59, P219, P269). This indicates that teachers mainly
support blended learning. The details of blended learning will be reported after
analysing students' opinions about e-learning.


CHAPTER 7: MEASURING STUDENTS' E-LEARNING READINESS


7.1 Introduction
The readiness of teachers for e-learning was measured in Chapter 5 by administering a
questionnaire, and their views on e-learning were analysed in Chapter 6 by conducting
semi-structured interviews in tandem. However, the attitudes of students towards
e-learning also play a crucial role in the successful integration of e-learning into the
respective HEIs in Turkey. Hence, it is also important to measure students' readiness for
e-learning and to analyse their views on e-learning.

The objective of this chapter is to understand whether students in the HEIs offering the
subject of electricity (e.g. departments of electrical and electronic engineering) in
Turkey are ready for e-learning in terms of its successful integration. To meet this aim, I
investigated the extent to which students believe that e-learning would be free of effort
and would enhance their learning. This is important for understanding the needs of
students before embarking on e-learning. I addressed these issues by adapting the
conceptual model I had previously developed in Chapter 4 for measuring teachers'
e-learning readiness. The adapted model has then been validated with students from
some of the HEIs which had been involved in the corresponding survey with their
teachers.

7.3 Methodology
7.3.1 Questionnaire Design
Based on the empirical findings and practical experiences of the previous study (see
Chapter 5), I adapted the questionnaire for assessing teachers' e-learning readiness to
measure students' e-learning readiness (see Appendix II). The adapted questionnaire
consisted of eight sections and was administered online, given that the web is an
efficient and effective means of distribution. As an introduction to the questionnaire,
definitions of several key terms were provided to maximize the participants' common
understanding of the items used, including: readiness, ICT, e-learning and readiness for
e-learning.

The first section consisted of several items to gather demographic data of the students,
including their age, gender, education level, academic year and affiliated institution.
Sections 2, 3, 4, 5 and 6 of the questionnaire were designed to measure the extent to
which the students and their affiliated institutions are ready for e-learning by
considering five main factors and several sub-factors (or attributes): technology, people,
institution and content. Section 7 of the survey was designed to understand the extent to
which students believe that e-learning would be free of effort and would enhance
learning. Section 8 was developed to evaluate whether the participants in the respective
institutions need training for e-learning before its integration. Furthermore, all the items
are measured with either a binary option or a five-point Likert scale (see Chapter 3.5),
and most of them are supplemented with a free text box for the participants to elaborate
their choice or rating. There were altogether 73 items in the questionnaire, which were
divided into three parts to investigate the self-reported perceptions of participants on
different aspects of e-learning. These items are explained in Section 7.3.4. Additionally,
the participants were invited to be interviewed to discuss four main questions: current
issues in their institutions; their perceptions towards e-learning; the advantages and
disadvantages of e-learning as a solution for the current issues; and their scenarios for
the implementation of e-learning. The findings of the interviews are reported in
Chapter 8.

7.3.2 Procedure
After the preparation of the questionnaire, the open-source LimeSurvey software was
used to implement the questionnaire in a web-based format. The students in the HEIs
offering the subject of electricity in Turkey were invited to participate in the survey by
notifying the secretaries in the respective institutions. The web-based survey was
launched on 9th January 2011 and closed on 20th April 2011. Table 13 illustrates the
distribution of the participants for each group. The majority of the participants were
male (491; 89.8%), a distinct contrast to the number of females (56; 10.2%). However, a
similar proportion of females and males existed in the targeted population, as the subject
of electricity is not usually preferred by females in Turkey. The age groups of the
participants revealed that 89.3% of the participants were less than 27 years old.


Another variable was the participants' educational level: 92.85% were studying for an
undergraduate degree and the rest were taking a postgraduate degree. Besides, the
participants were also classified according to their university affiliation: 14.4% were
from private universities and 85.6% from public universities.

Table 13: The frequencies and percentage of research groups

Gender                       N     %        University          N     %
Female                       56    10.2     Private             79    14.4
Male                         491   89.8     Public              468   85.6
Degree programme             N     %        Age Groups          N     %
Associate                    213   38.9     Less than 21        282   51.6
Bachelor                     293   53.6     Between 22 - 26     206   37.7
Course-based Postgraduate    25    4.6      Between 27 - 31     27    4.9
Research Postgraduate        16    2.9      More than 32        32    5.9

7.3.4 Items
There were 73 items in the questionnaire of different types, namely binary (7 items),
categorical (3 items) and five-point Likert-scale items with the leftmost and rightmost
anchors being "Strongly Disagree" and "Strongly Agree". The Likert-scale questions
were coded with 1 indicating the lowest readiness and 5 the highest. Table 15 and Table
16 display the list of items in the study.
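
The coding scheme and the comparison against the expected readiness level of 3.40 used throughout the analysis can be written down in a few lines. The Python sketch below is only an illustration: the file name, the column names and the label of the middle anchor ("Undecided") are assumptions, not details taken from the questionnaire.

    import pandas as pd

    # Map the Likert labels to scores 1 (lowest readiness) .. 5 (highest readiness).
    LIKERT = {"Strongly Disagree": 1, "Disagree": 2, "Undecided": 3,
              "Agree": 4, "Strongly Agree": 5}      # middle label is an assumption
    EXPECTED = 3.40                                  # expected readiness level used in the thesis

    df = pd.read_csv("student_responses.csv")        # hypothetical data file
    scores = df[["TEC01", "TEC02", "TEC03"]].replace(LIKERT)

    # Compare each item mean against the expected readiness level.
    for item, mean in scores.mean().items():
        status = "above" if mean > EXPECTED else "below"
        print(f"{item}: M = {mean:.2f} ({status} the expected level)")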

7.3.5 Missing Data

Table 14 displays the overall summary of missing values for items in the background
and readiness sections of the survey. Overall, 73 (82.95%) of the 88 variables in the
questionnaire had at least one missing value, and 342 of the 547 cases contained at
least one missing value. Overall, 22.23% of all values were missing, which is equal to
10700 values.


Table 14: Overall Summary of Missing Values

                            Variables          Cases              Values
Items                       n       %          n       %          n         %
Background   Complete       15      100.0      547     100.0      8205      100.0
             Incomplete     0       0.0        0       0.0        0         0.0
Readiness    Complete       0       0.0        205     37.48      29231     73.20
             Incomplete     73      100.0      342     62.52      10700     26.80
Total        Complete       15      17.05      205     37.48      37436     77.77
             Incomplete     73      82.95      342     62.52      10700     22.23

Since list-wise deletion would eliminate all the cases with missing values, it was
preferable to use one of the principled statistical methods, namely multiple imputation,
as such methods are better than ad-hoc ones. It is therefore important to understand the
pattern of missing values before applying a principled statistical method.
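
A minimal sketch of how multiple imputation could be carried out on the numeric item scores is given below; it only illustrates the principle (five imputed data sets whose results are pooled) and is not the procedure actually used in the thesis. The file name and the use of scikit-learn's IterativeImputer are assumptions.

    import pandas as pd
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401  (enables the imputer)
    from sklearn.impute import IterativeImputer

    df = pd.read_csv("student_responses.csv")   # hypothetical data file
    items = df.select_dtypes("number")          # keep only the numeric item scores

    # Create m = 5 completed data sets by drawing imputations from the posterior.
    imputed_means = []
    for m in range(5):
        imputer = IterativeImputer(sample_posterior=True, random_state=m)
        completed = pd.DataFrame(imputer.fit_transform(items), columns=items.columns)
        imputed_means.append(completed.mean())

    # Pool the results by averaging the item means over the five completed data sets.
    pooled = pd.concat(imputed_means, axis=1).mean(axis=1)
    print(pooled.round(2))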

7.4 Results and Discussion


In this section I analyse the responses of the participants in six parts: the first four parts
analyse the extent to which the student participants are ready for e-learning; the fifth
part reports the extent to which student participants believe that e-learning would be
free of effort and would enhance their learning; the sixth discusses whether students
need training for e-learning before embarking on it. Independent-sample t-tests and
one-way ANOVA were also used to verify the statistical significance of differences in
the mean scores of various variables (e.g. between males and females, between public
and private universities). The means and standard deviations of the scores of all the
items in the study are presented in Table 17.


Table 15: List of Items of the Student E-readiness Survey Part I

Technology
I01  TEC01   I have access to a computer connected to the Internet at home.
I02  TEC02   I am satisfied with the stability of the Internet access at home.
I03  TEC03   I am satisfied with the speed of the Internet access at home.
I04  TEC04   I have access to a computer connected to the Internet at university.
I05  TEC05   I am satisfied with the stability of the Internet access at university.
I06  TEC06   I am satisfied with the speed of the Internet access at university.
Experience with ICT
I07  EXP01   I use the internet as information source.
I08  EXP02   I use e-mail for the communication with my peers.
I09  EXP03   I use office software (e.g. Microsoft Office) for my coursework.
I10  EXP04   I use social network sites (e.g. Facebook & Orkut).
I11  EXP05   I use instant messaging software (e.g. MSN & Yahoo).
I12  EXP06   I use engineering software (e.g. AutoCAD & MATLAB).
Confidence with ICT
I13  CNF01   I use computers (e.g. notebooks, desktop computers) confidently.
I14  CNF02   I use web browsers (e.g. Internet Explorer) confidently.
I15  CNF03   I use search engines (e.g. Google & MSN Search) confidently.
I16  CNF04   I use digital file management tools confidently.
I17  CNF05   I use authoring tools to create learning materials confidently.
Attitudes towards E-learning
I18  MAT01   I have information about what e-learning is.
I19  MAT02   I have enough ICT competency to prepare my coursework.
I20  MAT03   I feel I am ready for e-learning.
I21  MAT04   I have enough time to prepare my coursework in e-format.
I22  MAT05   I support the use of e-learning in my department / programme.
I23  MAT06   I will like e-learning.
Attitudes towards Others
I24  OTA01   My teachers have enough information about e-learning.
I25  OTA02   My peers have enough information about e-learning.
I26  OTA03   My teachers support the use of e-learning in my department.
I27  OTA04   My peers support the use of e-learning in my department.
I28  OTA05   My teachers will like e-learning.
I29  OTA06   My peers will like e-learning.
Institution
I54  INST01  E-learning is applied in the university.
I55  INST02  E-learning is applied in my faculty / high school / institute.
I56  INST03  E-learning is applied in my department.
Content
I57  CNT01   It can be applied to the theoretical part of the subject electricity.
I58  CNT02   It can enhance the quality of the theoretical part of the subject.
I59  CNT03   It can be applied to the practical part of the subject electricity.
I60  CNT04   It can enhance the quality of the practical part of the subject.
Acceptance
I61  TAM01   E-learning will improve the quality of my learning experience.
I62  TAM02   E-learning will improve the quality of my outcomes.
I63  TAM03   E-learning will increase my productivity.
I64  TAM04   E-learning will be useful for my studies.
I65  TAM05   E-learning will be more effective than the traditional one.
I66  TAM06   E-learning tools will be easy to use for me.
I67  TAM07   E-learning tools will be easy to use for my teachers.
I68  TAM08   E-learning tools will be easy to use for my peers.
Training
I69  TRA01   I do not need training on e-learning.
I70  TRA02   My teachers do not need training on e-learning.
I71  TRA03   My peers do not need training on e-learning.
I72  TRA04   Technical and administrative personals do not need training.
I73  TRA05   The facilities of my university are enough for e-learning.

Table 16: List of Items of the Student E-readiness Survey Part II

Factor: Traditional Skills
Writing
I30  TRD01   I can start writing without feeling overwhelmed after receiving an assignment on a certain topic.
I31  TRD02   I present the ideas clearly in my own words without simply reproducing what I have read and heard about the topic.
I32  TRD03   I document, review and revise my writing of the topic iteratively with the help of a tool (e.g. MS Word).
Note-taking
I33  TRD04   I take notes of learning activities (e.g. lectures, books, seminars).
I34  TRD05   I note the details of the learning activity, specifying its objectives, processes and outcomes.
I35  TRD06   In note taking, I identify relationships between the concepts addressed in the learning activity with a tool (e.g. mind mapping).
Collaboration
I36  TRD07   I can work well in groups to implement a given collaborative task.
I37  TRD08   I am skilful to share and discuss knowledge with my teammates.
I38  TRD09   I manage my contribution to the group work professionally with the use of a tool (e.g. Google doc).
Reading
I39  TRD10   I can remember what I have just read when I get to the end of a chapter.
I40  TRD11   I know how to pick out what is important in the text and identify the main ideas.
I41  TRD12   I annotate the text with the use of a tool to document my reflection on its content.
Attendance
I42  TRD13   I attend classes regularly.
I43  TRD14   I carefully prepare myself for most class sessions.
I44  TRD15   I discuss issues in classes to clarify them and update my personal notes accordingly with the use of a specific tool.
Time Management
I45  TRD16   I make timetables and list of activities to organize my tasks.
I46  TRD17   I have discipline to plan and manage time during study.
I47  TRD18   I manage the integrity of my timetable efficiently with the use of a specific tool (e.g. Google calendar, Mobile Phone Calendar).
Self-Directed
I48  TRD19   I set my objectives and prioritize them when undertaking a task.
I49  TRD20   I keep track of the progress of a task and adjust my strategies.
I50  TRD21   I can evaluate my own performance and identify my strengths and weaknesses with the use of a specific tool.
Self-Motivation
I51  TRA22   I can concentrate on studying without being easily distracted.
I52  TRA23   My moods or personal problems seldom prevent me from completing my tasks.
I53  TRA24   I know how to sustain my motivation and persist to accomplish the task despite difficulties experienced.

The overall results of the items associated with each factor are explained separately. In addition to the statistics of all the items, Table 18 illustrates the mean score of the participants' responses related to each factor and the overall mean score across factors. From Table 18, it can be observed that the overall mean score of the factors is higher than the expected level of readiness (M(overall) = 3.44 > M(expected) = 3.40).

Table 17: Scores of all the items in the study

No  Item    M     SD        No  Item    M     SD        No  Item    M     SD
01  TEC01   4.26  1.53      26  OTA03   3.34  0.89      51  TRA22   3.42  0.91
02  TEC02   3.08  1.47      27  OTA04   3.23  0.89      52  TRA23   3.27  1.03
03  TEC03   2.85  1.49      28  OTA05   3.54  0.91      53  TRA24   3.75  0.84
04  TEC04   4.14  1.62      29  OTA06   3.47  0.94      54  INST01  2.52  1.64
05  TEC05   2.87  1.45      30  TRD01   3.51  0.90      55  INST02  2.42  1.61
06  TEC06   2.79  1.46      31  TRD02   3.55  0.85      56  INST03  2.34  1.57
07  EXP01   4.19  0.79      32  TRD03   3.48  1.00      57  CNT01   3.72  0.86
08  EXP02   3.68  1.13      33  TRD04   3.59  0.96      58  CNT02   3.66  0.89
09  EXP03   3.94  0.99      34  TRD05   3.30  1.02      59  CNT03   3.39  1.01
10  EXP04   3.82  1.14      35  TRD06   2.80  1.07      60  CNT04   3.46  1.01
11  EXP05   3.64  1.17      36  TRD07   3.83  0.91      61  TAM01   3.70  0.84
12  EXP06   3.77  1.07      37  TRD08   3.99  0.78      62  TAM02   3.54  0.88
13  CNF01   4.17  0.82      38  TRD09   3.26  1.14      63  TAM03   3.62  0.86
14  CNF02   4.16  0.85      39  TRD10   3.64  0.86      64  TAM04   3.82  0.75
15  CNF03   4.23  0.79      40  TRD11   3.83  0.77      65  TAM05   3.30  0.98
16  CNF04   4.29  0.81      41  TRD12   3.12  1.01      66  TAM06   3.78  0.76
17  CNF05   3.53  1.08      42  TRD13   3.92  0.85      67  TAM07   3.62  0.81
18  MAT01   3.21  0.99      43  TRD14   3.44  0.92      68  TAM08   3.58  0.81
19  MAT02   3.49  1.07      44  TRD15   3.34  1.03      69  TRA01   2.57  0.91
20  MAT03   3.61  0.92      45  TRD16   3.09  1.05      70  TRA02   2.45  0.80
21  MAT04   3.39  1.02      46  TRD17   3.24  1.05      71  TRA03   2.41  0.80
22  MAT05   3.75  0.97      47  TRD18   3.03  1.14      72  TRA04   2.38  0.81
23  MAT06   3.66  0.98      48  TRD19   3.93  0.71      73  TRA05   2.81  1.05
24  OTA01   3.26  0.99      49  TRD20   3.89  0.71          Overall 3.44  0.41
25  OTA02   2.91  0.85      50  TRD21   3.62  0.84

No: Item number; M: Mean of scores; SD: Standard deviation of scores
TEC: Technology; EXP: Experience; CNF: Confidence; MAT: Attitudes towards e-learning;
OTA: Attitudes towards others; TRD: Traditional skills; INST: Institution; CNT: Content;
TAM: Acceptance; TRA: Training for e-learning


Based on this result, it can be inferred that students in the respective HEIs in Turkey, within the limits of the HEIs surveyed, are overall ready for e-learning. The mean scores for the factors can also be used to identify areas of improvement in the participating institutions in Turkey. The mean scores for attitudes towards others, institution and training are lower than the expected readiness level, which shows that the participating students need training for e-learning before embarking on it, that the infrastructure of the institutions is not sufficient for e-learning, and that students do not hold positive opinions of their peers and teachers regarding e-learning. Hence, improvements should definitely be made in those areas.

Table 18: The overall mean score of the participants' responses for each factor

Factor                          No of items   Mean   Std. Error Mean   Std. Deviation   Variance
Technology                      6             3.33   0.05              1.06             1.12
Experience with ICT             6             3.84   0.03              0.71             0.51
Confidence with ICT             5             4.08   0.03              0.75             0.56
Attitudes towards e-learning    6             3.52   0.03              0.73             0.54
Attitudes towards others        6             3.29   0.03              0.72             0.52
Traditional Skills              24            3.49   0.02              0.57             0.32
Institution                     3             2.43   0.06              1.48             2.18
Content                         4             3.56   0.03              0.81             0.65
Acceptance                      8             3.62   0.03              0.69             0.48
Training                        5             2.52   0.03              0.69             0.47
Overall                         73            3.44   0.08              0.41             0.17

7.4.1 Findings in the Factor Technology


The participants were asked about their access to a desktop or laptop computer connected to the Internet at their residence (TEC01) and at university (TEC02). The majority of the participants reported that they have access to the Internet at their residence (445; 81.39%) and at university (430; 78.61%). Table 17 shows the mean scores for the items associated with students' access to the Internet and with the stability and speed of the Internet. According to this table, the speed and the stability of the Internet at home and at university are not sufficient, because the mean scores of these items are lower than the expected level of readiness (M(expected) = 3.40). This indicates that there is a lack of infrastructure in the respective HEIs.


This result can be interpreted to mean that students' access to the Internet is not sufficient for e-learning and must be improved before embarking on it. Therefore, the HEIs should identify proper strategies to ensure that e-learning is accessible to all if they are interested in e-learning. An independent-samples t-test was used to verify whether there is a significant difference in the overall mean score of the items in the technology factor between the student participants who study in private and in public universities. A significant difference (t(545) = 4.944, p < 0.001) was found between private (M = 3.86) and public (M = 3.24) universities. Moreover, the mean score of the students in public universities is slightly lower than the expected level of readiness for e-learning (M(expected) = 3.40). Therefore, it seems more urgent for the public universities than for the private ones across Turkey to improve Internet access before embarking on e-learning. Furthermore, I found that the mean scores of the participants on the stability and speed of the Internet access at their home and at the university decrease as their educational degree level becomes lower.
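As an illustration of this kind of comparison, the sketch below runs an independent-samples t-test with SciPy on two hypothetical arrays of technology-factor scores, one for students at public universities and one for students at private universities; the values are invented for demonstration and are not the collected data.

import numpy as np
from scipy import stats

# Hypothetical overall technology-factor scores (1-5 scale) per student
public_scores = np.array([3.1, 3.4, 2.9, 3.3, 3.5, 3.0, 3.2, 2.8])
private_scores = np.array([3.9, 4.1, 3.7, 3.8, 4.0, 3.6])

t_stat, p_value = stats.ttest_ind(public_scores, private_scores)
print(f"Public M = {public_scores.mean():.2f}, Private M = {private_scores.mean():.2f}")
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")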

7.4.2 Findings in the Factor People


The participants' experiences and confidence in the usage of different ICT for their study, their attitudes towards e-learning and their traditional skills were investigated to find out the extent to which they are ready for e-learning. Given the space limit, only the overall mean scores of all the related items are given in Table 17. As shown in the table, the mean scores of all the items related to the factor People are higher than the expected level of readiness, except the overall mean score of the items related to the students' attitudes towards others. Therefore, informing the students about the basic characteristics of e-learning is significant. With a better understanding, their attitudes towards others may improve.


7.4.3 Findings in the Factor Institution


The participants were also asked whether e-learning is currently implemented in their own institution at three levels: department, faculty and university. The data provided by the participants show that 37.92% of their universities (I54), 35.58% of their faculties (I55) and 33.43% of their own departments (I56) currently implement e-learning, either officially or through the personal efforts of their teachers. This finding can be interpreted to mean that the majority of the HEIs are not familiar with e-learning, and thus it is important to train academic as well as non-academic staff in how to succeed with e-learning before embarking on it. Table 17 displays the mean scores of the items related to the factor Institution.

7.4.4 Findings in the Factor Content


Table 17 provides the mean scores and the number of participants for each item related to the factor Content. The participants were asked to what extent they agree that e-learning can enhance the quality of the theoretical and practical parts of the subject of electricity and can be applied to the two parts. As shown in the table, the mean scores of all the items related to the factor Content are higher than the expected level of readiness (M(expected) = 3.40), except item I59.

This implies that the participants consider that e-learning can be integrated into theory and practice to enhance the quality of the courses on the subject of electricity. However, the students do not believe that e-learning can be applied to the practical part of electrical engineering. On the other hand, the overall mean score of the items related to the factor Content is still higher than the expected level of readiness (M(expected) = 3.40). Results of a one-way ANOVA indicated significant differences in the scores of I59 (F(3, 448) = 8.05, p < 0.01) and I60 (F(3, 448) = 3.02, p < 0.05) among the participants studying for the associate (M(I59) = 3.64, M(I60) = 3.62), bachelor (M(I59) = 3.20, M(I60) = 3.36), master (M(I59) = 3.40, M(I60) = 3.31) and doctorate (M(I59) = 3.55, M(I60) = 3.37) degrees.
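A one-way ANOVA of this kind can be reproduced in outline with scipy.stats.f_oneway, as in the hedged sketch below; the four degree-group arrays of I59 scores are invented purely for illustration.

from scipy import stats

# Hypothetical I59 ratings (1-5 Likert scale) grouped by degree programme
associate = [4, 3, 4, 4, 3, 4, 4]
bachelor  = [3, 3, 2, 4, 3, 3, 3]
master    = [3, 4, 3, 3, 4, 3, 4]
doctorate = [4, 3, 4, 3, 4, 3, 4]

f_stat, p_value = stats.f_oneway(associate, bachelor, master, doctorate)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")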


7.4.5 Findings in the Factor Acceptance and Training


The participants were asked to respond to eight items measuring their acceptance of e-learning (items I61 to I68). Table 18 shows the overall mean score of all the items related to the factor Acceptance. It can be interpreted that the participants hold positive attitudes towards e-learning, because their responses show that they believe e-learning would be free of effort and would enhance their learning.

The participants were asked to answer five questions to find out whether there is a need for training for e-learning before it is implemented (items I69 to I73). Table 17 indicates that the participants, their peers and their teachers strongly need training for e-learning and that their institutions do not have sufficient facilities to implement e-learning (I73). Therefore, there is a need to ensure a clear understanding of e-learning throughout the respective HEIs. This should include understanding the potential benefits of e-learning and how it can help students learn more effectively and efficiently.


CHAPTER 8: ANALYSING ISSUES FOR APPLYING E-LEARNING: THE STUDENT PERSPECTIVE
8.1 Introduction
In Chapter 7, the readiness of the students in the HEIs associated with the subject of electricity in Turkey was measured, by administering a questionnaire, to find out whether they tend to embrace or ostracise e-learning for their studies. At the end of the questionnaire, I conducted a research study to analyse those students' views on e-learning by conducting semi-structured interviews. The purpose of this chapter is to analyse the data obtained from the open-ended questions and the semi-structured interviews to reveal students' perspectives on issues for implementing e-learning.

Specifically, the interviews addressed four main aspects: (i) the current theoretical and
practical issues in both education and training in the respective HEIs; (ii) the definition
of e-learning; (iii) advantages and disadvantages of e-learning, to find out whether e-learning could be a solution for the current issues identified or whether it may create new
issues in the respective institutions; (iv) the way e-learning should be implemented to
solve actual issues in education and training in those institutions. Based on the findings
derived from the interviews, a model has been developed to illustrate what principles are
needed for applying e-learning in the respective institutions.

8.2 Methodology
8.2.1 Procedure and Items
Personalized invitations, followed by reminders to participate in the survey, were e-mailed to teachers and secretaries to notify their students about the research. The questionnaire consists of eight sections and 45 questions, each comprising one or more items. Altogether, 86 close-ended and 16 open-ended items were used to measure students' readiness for e-learning in the respective HEIs. The interview was semi-structured, with four main questions. The list of open-ended and interview items and the number of participants responding to them are given in Table 19.


8.2.4 Research Group


704 university students of different academic levels participated in the survey. At the end of the questionnaire, the participants were invited to participate in a semi-structured interview; 63 accepted the invitation, but only 33 of them actually attended the interview or answered the interview questions online. Table 20 shows the distribution of the participants according to their demographic characteristics. The majority of the participants were male (88.9%). The participants were categorized into different age groups; the distribution indicates that 90% of the participants were less than 26 years old. Another criterion for categorizing the participants is the type of university affiliation: 84.5% of the participants were in public universities across Turkey and the others were in private ones.

Furthermore, 31.7% of the students were in institutions offering 2-year associate degree programs, while the rest were registered in institutions offering 4-year Bachelor's, Taught or Research Postgraduate degree programs. The table also indicates that the majority of the participants were in departments of electrical and electronics engineering. Of the 33 interviewees, only 3 were female; 16, 9 and 6 were in the areas of Engineering, Electricity and Education, respectively. Six of the interviews were carried out using a whiteboard tool (i.e. GoToMeeting) and were recorded; the responses of these interviewees were first transcribed. The rest of the interviews were conducted via e-mail. These qualitative data, as well as the open-ended questions in the survey, were analysed using the method described in Chapter 3.


Table 19: List of item identifiers, content and number of participants

Interview Questions                                                              N
I1   What are the issues or inadequacies in the department / program you are
     currently working in?                                                       33
I2   What is the meaning of e-learning for you?                                  33
I3   How can e-learning solve or help to solve issues in your department or
     program?                                                                    33
I4   How should e-learning be implemented in your department or program?         33

Survey Items                                                                     N
Q1   Can you elaborate your experience of using ICT applications (e.g. which
     systems, for what purposes, how long)?                                      190
Q2   Can you elaborate your confidence in using ICT applications (e.g.
     difficulty, pleasure, frustration, confidence)?                             140
Q3   Can you elaborate your personal experiences and views on e-learning?        118
Q4   Can you elaborate your ratings regarding your teachers' and peers'
     knowledge of and attitudes towards e-learning?                              77
Q5   Can you elaborate whether and how you would like to be a successful
     learner?                                                                    96
Q6   Can you elaborate whether and how you are or would like to be a
     self-directed learner?                                                      78
Q7   Can you elaborate regarding e-learning applications being used by your
     university?                                                                 52
Q8   Can you elaborate regarding reasons why e-learning is not yet applied in
     your university?                                                            65
Q9   Can you elaborate e-learning applications being used by your faculty /
     high school / institute?                                                    33
Q10  Can you elaborate reasons why e-learning is not applied in your faculty /
     high school / institute?                                                    50
Q11  Can you elaborate e-learning applications being used by your department /
     program?                                                                    37
Q12  Can you elaborate reasons why e-learning is not yet applied in your
     department / program?                                                       49
Q13  Can you elaborate whether or why e-learning is (not) suitable for certain
     topics in the subject of electricity?                                       62
Q14  Can you elaborate whether it would be useful and easy to apply e-learning
     approaches and tools in your studies?                                       41
Q15  Can you elaborate the training you need for e-learning?                     45
Q16  Do you have any comments, suggestions or questions regarding e-learning
     or our research?                                                            88

Table 20: Characteristics of Participants

                                        Surveyed          Interviewed
Items                                   N      %          N      %
Gender
  Female                                78     11.1       3      9.9
  Male                                  626    88.9       30     90.1
University Mode
  Public                                595    84.5       33     100.0
  Private                               109    15.5       0      0.0
Age Group
  <21                                   348    49.4       7      21.2
  22-26                                 286    40.6       26     78.8
  >27                                   70     10.0       0      0.0
Degree
  Associate (2-Year)                    223    31.7       10     30.30
  Bachelor (4-Year)                     430    61.1       23     69.70
  Taught and Research Postgraduate      51     7.2        0      0.00
Institutions
  Aircraft Electrics and Electronics    17     2.4        1      3.03
  Electrical and Electronics Eng.       421    59.8       15     45.45
  Electrical Education                  25     3.6        6      18.18
  Electrical Appliance Technology       45     6.4        1      3.03
  Electrical Engineering                18     2.56       0      0.00
  Electricity                           178    25.3       10     30.30
Total                                   704    100        33     100.0

8.3 Findings
The results of our analysis of the aforementioned items given in Table 19 are reported in this section in four parts. First, the perceptions of the interviewees regarding the definition of e-learning are presented. Second, I discuss the main issues in the respective HEIs which hinder education and training. Third, I explain how e-learning can be applied as a solution to those issues. Finally, the principles needed to implement e-learning are elaborated. Our analysis is based on the principles of the thematic analysis approach (see Chapter 3.4).


8.3.1 What is E-learning?


The notion of e-learning is not easy to define, for two main reasons: first, it is used by many different individuals or organizations for different activities; second, it is implemented using different technologies. The 33 interviewees were asked to define e-learning in order to find out their interpretation of this notion, which can be instantiated in different forms. Besides, 118 participants of the survey expressed their personal experiences and views on e-learning, and 77 participants commented on their teachers' and peers' views on e-learning. The responses of the interviewees and the survey participants regarding the notion of e-learning can roughly be categorized into five groups based on their commonalities. The first group, based on seven of the interviews, commonly defined e-learning as the use of technology to enable them to study any subject at any time and in many different locations through the Internet or a computer, without having to go to the university campus.

Some of them emphasized synchronous interactions between teachers and students through the Internet as an integral part of this approach. However, students in this group had concerns about whether gaining hands-on skills through e-learning was possible. The second group, based on nine of the interviews, had the tendency to express e-learning as part of face-to-face education, in the shape of delivering lecture notes (e.g. PowerPoint slides, videos or documents) and activities (e.g. quizzes, simulations) over the Internet. The third group, based on eight of the interviewees, tended to mention the properties of e-learning (e.g. equal opportunity) rather than its definition. The fourth group, based on two of the interviews, did not directly address the definition of e-learning, but assumed that e-learning would enhance their learning and that it would be a solution for some issues in their departments. Finally, one interviewee highlighted the importance of training for e-learning, as he did not have sufficient information about this notion.


8.3.2 Current Issues


The interviewees were asked to describe issues in their institutions that hampered education and training and to assess whether e-learning could resolve those issues. The issues were first conceptualized and then the related concepts were integrated into categories. Some of the main issues for which e-learning can serve as a solution are discussed in the following (NB: each participant in the survey is designated with an identifier P1, P2, ..., Pn; this identifier and the source of the finding, interview or survey, are presented in brackets). The main concepts related to the issues are given in Figure 8.

Figure 8: Primary Issues in the respective HEIs


8.3.2.1 Financial Issues


Poor Infrastructure
This sub-section reports the effect of poor infrastructure on the learning experience of students in two contexts: lecture theatres and laboratories.

Lecture Theatres: The effect of lecture theatres (or classrooms) on students' learning experiences was analysed along several aspects. First, there are high demands for university programs in Turkey, but the size of lecture theatres is limited (Interview, P20, P32, P52). Second, it is vital that fitness for purpose is the first condition to be sought in planning a building (Fleming & Storr, 2000). However, the findings from the student interviews show that the buildings designed for delivering lectures in the institutions associated with the subject of electricity in Turkey are not fit for this purpose. Several students observed this lack of fitness; for example, lecture theatres lack quality for delivering lectures and are not conducive to positive learning experiences (Interview, P13, P29, P32, and P52). The function of lecture theatres is to amplify and clarify sound from front to back and to absorb it from back to front (Fleming & Storr, 2000). Moreover, the acoustic environment is stressed as crucial to the speech perception, performance, attention and participation of students in classrooms (Choi & McPherson, 2005). However, speech perception begins to fade away at the back of the classrooms, especially in rainy weather (Interview, P32, P52). Third, the facilities and seats of lecture theatres are criticized as inappropriate (Interview, P49, P13, P32). At first glance, issues associated with the size of lecture theatres are likely to be solved by allocating students into different class sections (branches). However, the current issues related to insufficient staffing and the lack of classrooms block this solution. E-learning may be used to overcome the lack of classrooms, whereas it may not help with poor laboratories, as hands-on skills are essential in engineering (Interview, P20). For example, students may receive lecture notes over the Internet via e-learning before they go to class, because it is not possible for every student to secure a place near the blackboard or instructor due largely to small classrooms (Interview, P32).


Laboratories: Overall, as distinct from the size of lecture theatres, the lack of laboratories was emphasized much more by students during the interviews. Issues related to laboratories can also be analysed along several dimensions. Firstly, the number of laboratories is limited, and there are even none in some institutions, while demand for those labs is high (Interview, P13, P20, P52, P80, P81). The lack of laboratories and insufficient experimental sets are causing an increase in the number of lecture hours in the respective institutions (Interview, P13 & P81). These insufficiencies in the labs have a negative impact on students' hands-on skills (Interview, P52). The second issue is related to the limited number of experimental sets in the labs in the targeted institutions. It was strongly stressed that the experimental sets are limited and some of them are missing (Interview, P4, P52, P53, P57 & P80). Thirdly, the majority of concepts given through lectures are abstract and only a few of them are transferred into practice via labs (Interview, P18, P57 & P53).

Additionally, the modernization of the labs is criticized as being too slow (Interview, P29). Moreover, the physical settings of the labs have an impact on students' learning experiences (Interview, P68). Finally, a student remarked that the lack of virtual labs can have a negative impact on his learning experience (Interview, P62). It seems that the majority of issues in laboratories are mainly associated with poor infrastructure, due largely to insufficient financial resources. However, e-learning may offer some solutions for the poor infrastructure. First, it is easy to set up experimental sets through e-learning, provided individuals have access to the Internet through a laptop or computer at home or at university. Ideally, students can simulate experimental sets on a computer or on the Internet immediately and get feedback to find out whether their experiments are working accurately (Interview, P2). Secondly, while the respective institutions offer a number of lecture and laboratory hours to students for each module, it is not possible to teach or gain practice in everything. However, the actual application of a term, method or theory given in the classroom could be put into practice by producing a computer model of the experimental sets in the laboratories.


To address these issues, software such as AutoCAD and ISIS could be used to overcome this limitation and support students (Interview, P4, P62). On the other hand, e-learning is not able to provide hands-on experience over the Internet or a computer (Interview, P13). However, this does not mean that e-learning must be implemented without campus-based education and training; the issue can be addressed by implementing e-learning together with traditional methods.

Insufficient Staffing
In 2011, the number of candidates for private and public universities was 1,688,804, of whom 46.7% (789,167) were actually selected and placed in a program at a public, private or open university in Turkey by ÖSYM (the Student Selection and Placement Center). Therefore, the growth of the young population tends to lead to an increase in demand for university programs and an increase in class size in Turkey (Akaslan, et al., 2011). This high demand also shows itself as insufficient staffing, according to the perceptions of the current Turkish higher education students. Insufficient staffing can be analysed along two aspects: insufficient teaching staff and insufficient research staff.

Insufficient Teaching Staff: It is stressed that the Turkish Higher Education Council tends to maintain education and training with minimum staffing, especially in vocational higher schools in Turkey (Akaslan, 2011). The lack of teaching staff was reflected by the interviewed students in different ways. First, as the number of teaching staff is limited, the existing staff are forced to teach several modules at the same time, irrespective of whether they have advanced knowledge of or interest in those modules (Interview, P16, P41 & P68). Second, the teaching load of the existing staff increases and hence their performance and interest decrease (Interview, P16, P65). As a result, some of the existing staff are criticized and described as academics who lack knowledge of and interest in their profession (Interview, P39, P16, P32, P65).


Insufficient Research Staff: In Turkey, the responsibilities of research staff are legally defined as those of assistants who help with the research, analyses and experiments carried out by HEIs. The lack of research assistants in 2-year vocational higher education schools increases the responsibilities of the teaching staff. Research assistants in 4-year HEIs were also criticized because they could not properly recognize and introduce the equipment in laboratories (Interview, P39). Therefore, it was suggested that academic staff in institutions should be divided into two groups: staff who only deliver lectures and staff who only conduct research (Interview, P32).

Class Size
The large classes in the respective HEIs are also perceived as having negative impacts on students' learning experiences. The over-populated class size was mentioned by some interviewees (Interview, P18, P32, P52 & P65). While many studies have discussed the effects of class size on students' performance, large classes are still an on-going issue in education, and in particular in training, in the respective institutions. It is recognized by many that large class sizes contribute to a decrease in student achievement due largely to various reasons (e.g. discipline, individualized instruction, classroom interaction, encouragement and feedback). However, positive effects of large classes have also been stated; for example, Martins and Walker note that a large class increases the probability that students in a class may also benefit from the questions of another student at the same time (Martins & Walker, 2006).

It is also true that a large class reduces the probability that a student may ask a question. Moreover, the effects of class size may not be the same for every country (Wößmann & West, 2006). However, the interviewed students mentioned the negative effects of over-populated classes: noise, blackboard visibility and visual angle. First, the design of lecture theatres and classes is often criticized: it is difficult to see the blackboard, particularly beyond the middle of the class, and students become unmotivated (Interview, P32).


Second, it is difficult to hear the speaker at the back of the classes (P52). Moreover, due largely to the design of the classes, a student's visual angle in the class is blocked by the bodies of other students (Interview, P32). However, e-learning may provide opportunities for HEIs to reduce the negative impacts of over-populated classes (Interview, P18). Some solutions for the issues related to class size were also stated by the survey participants. First, as some classroom activities are mainly based on the content of textbooks and are presented on the blackboard by the lecturer without extra information, e-learning "may save our time and increase our speed" (Survey, P574).

Individual Issues
Peer-based Issues: The effects of peers on the outcomes of students were also revealed through the interviews, in relation to attendance, ICT skills and secondary school knowledge. First, lectures in classes are a primary means of instruction for undergraduate courses in Turkey, and hence regular attendance at all class meetings is compulsory for students. Moreover, Golding notes that a positive correlation between the attendance and performance of students is stressed by many (Golding, 2011). However, some undergraduate students tend to go to classes late and reduce the class motivation (Interview, P39). Secondly, the lack of ICT skills of some students was also criticized during the interviews: the low ICT skills of some students in group studies lower the learning speed of other students and reduce their motivation (Interview, P16).

On the other hand, e-learning may help students communicate with each other better (Interview, P13). E-learning may also encourage some students not to attend traditional classes, as having access to lecture notes over the Internet will be possible via e-learning; however, the positive effects of having access to lecture notes outweigh the negative ones (Interview, P39, P49). Additionally, e-learning may deliver lectures much more visually and demonstrate the use of lecture concepts in industry (Interview, P65). Furthermore, students may study the notes repeatedly and watch sessions which they missed (Interview, P80).


Teacher-based Issues: It seems that some teachers have no student attendance policy, and this was highly criticized during the interviews. Some teachers have no policy to stop students attending classes late, and this lack of policy greatly reduces the motivation of regular students (Interview, P39). Secondly, the knowledge of teachers was considered insufficient by one student (Interview, P27). It seems that the primary rationale of this bias regarding teachers is highly associated with the policy of the Higher Education Council (YÖK). The policy of YÖK is to assign only a few teaching staff to 2-year vocational higher institutions and to expect that they will teach the entire curriculum in these schools; however, the performance of the teachers is ignored.

Political Issues
Status of Graduates: The status of the technical education faculties (TEF) is still an on-going issue, in spite of the fact that these faculties were closed down on 13 November 2009. While the main objective of the TEF was to qualify students, thereby accrediting them to be specialists and instructors for vocational high schools, their graduates are not permitted to work in industry as engineers and the majority (95%) of them are not recruited as teachers (Akaslan et al., 2011). It is revealed from the students' reports that this issue still needs to be solved.

Optional Modules: The lack of optional modules, especially in some departments of electrical and electronics engineering, is a well-known common issue due largely to insufficient staffing. In such departments, students are frequently forced to qualify in either electrics or electronics. However, the lack of optional modules is also a problem for other subjects. The policy of departments regarding optional modules is criticized, as there is no policy to offer a module related to a programming language (e.g. C++ or PHP), at least at the level of instruction (Interview, P18).

Insufficient Demand: HEIs in Turkey tend to provide practical hands-on experience and lectures to their students through laboratories and classrooms. However, the Higher Education Council establishes institutions, especially 2-year vocational higher schools, in towns without providing the essential facilities (e.g. accommodation, laboratories, lecture theatres). As a result, the actual class sizes in these institutions are getting smaller (Interview, P41).


8.3.3 How to Implement E-learning


The underlying assumptions of the majority of the students indicate that e-learning may help them achieve better outcomes (Interview, P85). On the other hand, a few students believe that e-learning may increase the number of current issues (Interview, P81). It is therefore deemed relevant to demonstrate and explain what stages and principles are needed to integrate e-learning into the current practices of the respective campus-based institutions in Turkey, and also to analyse the negative perspectives of some students towards e-learning. The proposed stages are briefly explained below to indicate what assumptions students are making at each stage.

8.3.3.1 Stage 1: Measuring Readiness for E-learning


According to the students' responses, it is essential to investigate the extent to which students and other stakeholders in the respective HEIs are ready for e-learning before implementing it. Firstly, it was mentioned in the interviews that some students are not able to use the Internet and that this may be a problem for e-learning (Interview, P16). E-learning is essentially based on the computer and the Internet, and the characteristics of individuals, namely their competence, experience, confidence and anticipation in utilizing various ICT, may have an effect on the implementation of e-learning (Akaslan & Law, 2011). Hence, it is deemed relevant to investigate students' readiness for e-learning in the respective HEIs. This is highly essential because students lacking ICT skills may not be able to work together with their peers online, especially in teamwork. Secondly, a student worried that e-learning may increase his responsibility (Interview, P20). Thirdly, some students have no idea about how e-learning may work (Interview, P27, P62, P78). This indicates that assessing students' e-learning readiness is important for understanding whether they need training for e-learning before embarking on it.


8.3.3.2 Stage 2: Selecting an E-learning Platform


There are a number of e-learning platforms available, which are either free to use (e.g. Moodle, Sakai, Ilias, Docebo) or must be purchased (e.g. Blackboard, FirstClass). They are designed to work either on a desktop or through a browser. The needs of the different stakeholders, especially teachers and students, should be investigated together with the particular characteristics of the respective HEIs. This is highly essential for specifying a list of requirements before selecting an e-learning platform.

First, the data obtained from the close-ended questions indicate that the participants use various operating systems (OS) for their studies, namely Linux (Survey, P3, P5, P30, P73, P154, P160, P186), Windows (Survey, P3, P10, P12, P181, P182, P183, P186) and Pardus (Survey, P5). However, it is not always possible to select a desktop-based e-learning platform that supports Linux, Mac and Windows at the same time. Besides, mobile operating systems are also commonly used on smart phones and tablets; for example, while the Android OS is commonly used on Samsung mobile devices, Apple uses its own operating system for iPads and iPhones. To address these issues, a web-based e-learning platform should be the first criterion for delivering education and training at any time and anywhere, because it only requires access to the Internet through a web browser such as Google Chrome or Safari.

Besides, the findings of the survey indicate that students are familiar with the following programming and markup languages: C++, C#, HTML, PHP, Java, XML, Flash (CS5), etc. The familiarity of students with these languages should be considered as the second criterion for selecting an e-learning platform. This may facilitate the integration of e-learning platforms created with those languages in three aspects: first, they can extend the platform by adding new components; second, they can fix errors in it; third, integrating the platform into their servers may be easier. For example, as the respective institutions teach the PHP language, they are familiar with the MySQL database system; if they decide to integrate the Moodle platform into their servers, this familiarity may save them considerable time.
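The selection reasoning described above can be made explicit as a simple weighted-criteria comparison. The Python sketch below is only an illustration of that reasoning under assumed weights and scores: the platform names are products mentioned in this chapter, but the criterion scores and weights are hypothetical and would have to be derived from the actual requirement list of an institution.

# Hypothetical weighted-criteria scoring for shortlisting an e-learning platform.
# Criteria: web-based delivery, staff familiarity with the implementation language, licence cost.
weights = {"web_based": 0.5, "language_familiarity": 0.3, "cost": 0.2}

candidates = {  # criterion scores on a 0-5 scale (illustrative only)
    "Moodle":     {"web_based": 5, "language_familiarity": 5, "cost": 5},
    "Sakai":      {"web_based": 5, "language_familiarity": 2, "cost": 5},
    "Blackboard": {"web_based": 5, "language_familiarity": 2, "cost": 1},
}

for name, scores in sorted(candidates.items(),
                           key=lambda item: -sum(weights[c] * item[1][c] for c in weights)):
    total = sum(weights[c] * scores[c] for c in weights)
    print(f"{name}: weighted score = {total:.2f}")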


8.3.3.3 Stage 3: Developing E-learning Materials


The participants were also surveyed about the ICT they are familiar with. Accordingly, a number of software tools, especially AutoCAD, Proteus, MATLAB, ISIS and PSpice, are used by the participants to succeed in their courses or to improve their skills. The familiarity of students with such tools should be taken into account before and after developing e-learning materials. For example, MATLAB and AutoCAD could be preferred for preparing worksheets associated with lecture notes to enhance students' learning, as they already use these tools in their institutions. Hence, students may save time because they are familiar with such tools.

8.3.3.4 Stage 4: Training Individuals for E-learning


The readiness of students for e-learning was also measured as part of the survey with closed-ended questions (see Chapter 7). While their readiness seems to be sufficient, it must still be strengthened to facilitate the effective adoption of e-learning in their institutions. Besides, with an open-ended question, the participants were asked about the training that they might need for e-learning. Some interviewees (Interview, P27, P62, P78, etc.) also expressed that they had no information about e-learning. It is therefore essential to inform them about the benefits and drawbacks of e-learning and about how to use e-learning platforms.
8.3.3.5 Stage 5: Delivering E-learning
The final stage is associated with the delivery of e-learning. The data obtained from the open-ended questions and interviews show that students mainly consider e-learning as a tool supporting campus-based activities. Besides, it is also true that e-learning is not meant to completely replace traditional education and training in the field of engineering, because the students stress that gaining hands-on skills through laboratories is highly important for them. Therefore, a blended learning approach to education and training, which combines face-to-face methods with online methods, should be developed. However, I do not intend to report the details of the blended learning approach without analysing and comparing the perceptions of students and teachers together.


CHAPTER 9: COMPARING THE TEACHER AND STUDENT PERSPECTIVES
9.1 Introduction
Several barriers hinder the integration of e-learning into HEIs. To address these concerns in the context of the HEIs associated with the subject of electricity in Turkey, I conducted two research studies to measure individuals' readiness for e-learning and to investigate how to implement e-learning in these HEIs: in 2010 with teachers (see Chapters 5 and 6) and in 2011 with students (see Chapters 7 and 8). Correspondingly, I administered a questionnaire and semi-structured interviews for each study in tandem. The results of the questionnaire and the analysis of the interviews were reported in the respective chapters for teachers and students separately.

However, no systematic comparisons between the responses of the two groups were made. In the questionnaires, two main aspects were investigated. First, both students and teachers were asked to rate their usage of different ICT and their attitudes towards e-learning using several close-ended items. Second, they were asked to elaborate their past or current experiences with e-learning, if any, and their attitudes towards e-learning using an open-ended item. This chapter aims to achieve two objectives: to find out whether teachers and students tend to embrace or ostracize e-learning when they have more (or less) usage of ICT, and to analyse the qualitative data on their personal experiences and views on e-learning. Besides, I analysed teachers' and students' ratings to find out whether they are significantly different from each other in the other aspects pertaining to e-learning.

9.2 A Model for Measuring Attitudes towards E-learning


Based on the framework given in Chapter 4, I constructed two surveys and implemented them with the respective target groups in the HEIs offering the subject of electricity in Turkey. As shown in Chapter 4, there are many factors affecting the ability of teachers or students to take advantage of e-learning in their own situations, such as their experience in using different ICT and their attitudes towards e-learning.


As I aimed to find out whether individuals tend to embrace or ostracize e-learning when they have more or less experience in using ICT, I focused on two attributes of the factor People in this chapter: experience with ICT and attitude towards e-learning.

9.2.1 Attitude towards E-learning


The attitudes of individuals towards e-learning are emphasized as an important aspect of predicting and improving e-learning usage (Liaw, et al., 2007). However, it is worth finding out what attitudes towards e-learning are before implementing it. The word "attitude" is described by the Oxford Dictionaries as "a settled way of thinking or feeling about something". The attitudes of individuals towards e-learning are usually evaluated by their agreement or disagreement with several statements about e-learning and can be measured with different scales, such as a Likert scale (Link & Marz, 2006). The measurement of individual attitudes also plays an important role in analysing user behaviour, because a strong correlation between attitude and behaviour has been found (Bertea, 2009). Personal attitudes have a major effect on the usage of information technology because they facilitate the creation of e-learning environments (Liaw, et al., 2007).

Different researchers (e.g., Rosenberg, 2001) have measured people's attitudes with different approaches. However, they agree that the attitudes of people should be measured using a multi-linear methodology. Specifically, some researchers used the Technology Acceptance Model (TAM) developed by Davis in 1989. The TAM was used to measure two constructs, perceived usefulness and ease of use, which denote the degree to which people believe that using a system would be useful and free of effort, respectively (Akaslan & Law, 2011). I also used the TAM to measure teachers' and students' beliefs about whether e-learning would be free of effort and useful for their respective tasks (as in Figure 9). As a result, five sub-factors were identified to measure attitudes towards e-learning: knowledge (information) about e-learning, ICT competencies, time, feeling of readiness for e-learning and thinking about others. Each sub-factor should be taken into consideration as much as possible during the assessment process.


Figure 9: A Model for Measuring Attitudes for E-learning (sub-factors of Attitude: Information, ICT Competencies, Feeling of Readiness, Time, Thinking about Others)

9.2.2 Experience with ICT


It is expected that the more skilled the people working at an institution are, the more likely they are to achieve a successful implementation of e-learning (Akaslan & Law, 2011). It is hence worth investigating individuals' experience in deploying various ICT for their different tasks. The use of e-learning is affected by a lack of knowledge of ICT, and it is known that the usage of a system is significantly affected by previous experience with other systems (Park et al., 2009). One study demonstrated that teachers who have little or no information about ICT find it difficult to use e-learning packages (Etukudo, 2011). As a result, I identified six sub-factors to find out about individuals' experience in deploying various ICT for e-learning (as in Figure 10): the Internet, e-mail, office software, engineering software, instant messaging and social network sites.

Figure 10: A Model for the Usage of Different ICT for E-learning (sub-factors of Experience with ICT: Internet, E-mail, Office Software, Social Network, Instant Messaging, Engineering Software)


9.3 Items
There were altogether 11 close-ended items in the questionnaire, which gauged the participating students' and teachers' self-reported personal experiences in using different ICT and attitudes towards e-learning. In addition, an open-ended item was used to elicit teachers' and students' experiences with e-learning, if any, and their attitudes towards e-learning. The list of close-ended items and the open-ended item is shown in Table 21.

Table 21: Items of individual experiences with ICT and attitudes towards e-learning

Part 1: Closed-ended Items

Factor 1: Experiences with ICT
  I1   Teacher: I use the Internet as information source.
       Student: I use the Internet as information source.
  I2   Teacher: I use e-mail as the main communication tool.
       Student: I use e-mail for the communication with my peers.
  I3   Teacher: I use the office software for content delivery and demonstration.
       Student: I use the office software for my coursework.
  I4   Teacher: I use social network sites.
       Student: I use social network sites.
  I5   Teacher: I use instant messaging.
       Student: I use instant messaging software.
  I6   Teacher: I use electrical software (e.g. AutoCAD).
       Student: I use engineering software (e.g. AutoCAD).

Factor 2: Attitudes towards E-learning
  I7   Teacher: I have enough information about what e-learning is.
       Student: I have enough information about what e-learning is.
  I8   Teacher: I have enough ICT competencies to prepare e-learning materials.
       Student: I have enough ICT competencies to prepare my coursework in electronic format.
  I9   Teacher: I feel that I am ready to integrate e-learning in my teaching.
       Student: I feel that I am ready for e-learning.
  I10  Teacher: I have enough time to prepare e-learning materials.
       Student: I have enough time to prepare my coursework in electronic format.
  I11  Teacher: I believe my students will like e-learning.
       Student: I believe my teachers will like e-learning.

Part 2: Open-ended Item
  I12  Can you elaborate your personal experiences of and attitudes towards e-learning?


9.4 Results
This section comprises two sub-sections reporting the results of analysing the closed-ended items and the open-ended item, respectively.

9.4.1 Results of the analysis of close-ended items


As mentioned earlier, my study aimed to investigate the relationship between people's experience with ICT and their attitudes towards e-learning by analysing the empirical data collected with the questionnaires. Table 22 shows the mean scores of the individual items related to each factor for both teachers and students. It also indicates the results of independent-samples t-tests used to verify the statistical significance of differences in the mean scores between the teacher and student participants. A row at the end of each block displays the overall mean score (M0) of the items related to individuals' experience with ICT and attitudes towards e-learning.

Table 22: Statistics for the items related to each factor

Factor 1: Experiences with ICT
Item   T      S      P      t       p
I1     4.65   4.25   4.40   -7.33   0.00
I2     4.53   3.80   4.07   -9.43   0.00
I3     4.46   4.04   4.19   -6.08   0.00
I4     2.95   3.90   3.55    9.85   0.00
I5     3.40   3.70   3.59    3.12   0.00
I6     4.40   3.90   4.08   -6.44   0.00
M0     4.07   3.93   3.98   -2.66   0.01

Factor 2: Attitudes towards E-learning
Item   T      S      P      t       p
I7     3.70   3.29   3.44   -5.37   0.00
I8     3.72   3.65   3.68   -0.83   0.41
I9     3.70   3.74   3.73    0.54   0.59
I10    2.81   3.54   3.27    9.45   0.00
I11    3.64   3.60   3.61   -0.63   0.53
M0     3.51   3.56   3.55    0.87   0.37

M0: Overall Mean; T: Teacher (N=280); S: Student (N=483); P: Pooled (N=763)

To reiterate, I aim to find out whether teachers and students tend to embrace or ostracize e-learning when they have more or less experience with ICT. The mean score of the participants' responses can be used to identify the level of their experience with ICT. Therefore, considering the assessment model in Chapter 3.5, I categorized the participants as having insufficient, neutral or sufficient experience with ICT and examined their attitudes accordingly.

The participants were categorized into three groups according to the overall mean score of their responses to the factor experience with ICT: (1) insufficient, (2) neutral and (3) sufficient experience. Table 23 illustrates the numbers of participants and the overall mean scores of the factors experience with ICT and attitudes towards e-learning for each group, in addition to the mean scores of the individual items related to each factor.
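The grouping rule applied here follows the score bands shown in the headers of Table 23. The small helper below expresses that rule as code; the cut-off values (1.00-2.59, 2.60-3.39, 3.40-5.00) are taken from the table, while the example scores are made up.

def experience_group(mean_score: float) -> str:
    """Map an overall ICT-experience mean (1-5 Likert scale) to a readiness group."""
    if mean_score <= 2.59:
        return "insufficient"
    elif mean_score <= 3.39:
        return "neutral"
    return "sufficient"

# Classify a few hypothetical participants by their overall mean score
for score in (2.12, 3.14, 4.19):
    print(f"M = {score:.2f} -> {experience_group(score)} experience")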

The overall results for each group indicate that when the participants have sufficient experience in the usage of different ICT for e-learning, they tend to take a positive attitude towards e-learning. On the other hand, when they do not have sufficient experience with ICT, their attitudes are not positive. This implies that the more experience the participants have in using various ICT, the more positive they feel about e-learning. A Pearson product-moment correlation coefficient was also used to examine the relationship between the participants' scores on their experience with ICT and their attitudes towards e-learning. A significant positive correlation was found for the student (r(398) = 0.340, p < 0.001) and the pooled (r(641) = 0.237, p < 0.001) participants who have sufficient experience in using various ICT.
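The correlation reported here can be computed with scipy.stats.pearsonr, as in the minimal sketch below; the two arrays stand in for each participant's overall ICT-experience score and overall attitude score and are invented solely for illustration.

import numpy as np
from scipy import stats

# Hypothetical per-participant overall scores (1-5 scale)
ict_experience = np.array([4.2, 3.8, 4.5, 3.6, 4.0, 4.8, 3.9, 4.4])
attitude       = np.array([3.9, 3.4, 4.1, 3.2, 3.8, 4.4, 3.5, 4.0])

r, p_value = stats.pearsonr(ict_experience, attitude)
print(f"r = {r:.3f}, p = {p_value:.3f}")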


Table 23: The mean scores of the participants with various levels of ICT experience

Group A: Attitudes of the participants with insufficient experience (mean scores between 1.00 and 2.59)

Experience with different ICT             Attitudes towards e-learning
      T      S      P      t      p             T      S      P      t      p
      N=7    N=27   N=34                        N=7    N=27   N=34
I1    3.00   2.70   2.76   -0.51  0.61    I7    2.43   2.48   2.47    0.09  0.92
I2    2.29   1.41   1.59   -2.50  0.02    I8    2.71   2.44   2.50   -0.51  0.62
I3    1.71   1.96   1.91    0.52  0.61    I9    2.43   2.74   2.68    0.55  0.59
I4    2.00   2.59   2.47    1.03  0.31    I10   2.43   2.74   2.68    0.61  0.55
I5    2.43   1.96   2.06   -0.86  0.39    I11   2.43   2.81   2.74    0.69  0.49
I6    1.29   2.07   1.91    1.72  0.09
M0    2.12   2.12   2.12   -0.01  0.99    M0    2.49   2.64   2.61    0.40  0.69

Group B: Attitudes of the participants with neutral experience (mean scores between 2.60 and 3.39)

Experience with different ICT             Attitudes towards e-learning
      T      S      P      t      p             T      S      P      t      p
      N=26   N=60   N=86                        N=26   N=60   N=86
I1    4.31   3.78   3.94   -3.03  0.00    I7    3.27   2.97   3.06   -1.31  0.19
I2    4.12   2.88   3.26   -5.16  0.00    I8    3.19   3.27   3.24    0.27  0.79
I3    3.73   3.33   3.45   -1.59  0.11    I9    3.19   3.55   3.44    1.57  0.12
I4    1.31   2.90   2.42    5.90  0.00    I10   2.58   3.28   3.07    2.72  0.01
I5    1.69   2.83   2.49    4.14  0.00    I11   3.35   3.42   3.40    0.34  0.73
I6    3.73   3.08   3.28   -2.25  0.03
M0    3.15   3.13   3.14   -0.25  0.80    M0    3.12   3.30   3.24    1.08  0.28

Group C: Attitudes of the participants with sufficient experience (mean scores between 3.40 and 5.00)

Experience with different ICT             Attitudes towards e-learning
      T      S      P      t      p             T      S      P      t      p
      N=247  N=396  N=643                       N=247  N=396  N=643
I1    4.73   4.43   4.54   -7.18  0.00    I7    3.79   3.39   3.54   -4.91  0.00
I2    4.64   4.11   4.31   -8.22  0.00    I8    3.80   3.79   3.80   -0.11  0.91
I3    4.62   4.29   4.41   -6.23  0.00    I9    3.79   3.84   3.82    0.63  0.53
I4    3.15   4.14   3.76   10.51  0.00    I10   2.85   3.63   3.33    9.82  0.00
I5    3.61   3.95   3.82    3.73  0.00    I11   3.71   3.68   3.69   -0.41  0.68
I6    4.56   4.15   4.31   -6.34  0.00
M0    4.22   4.18   4.19   -1.12  0.26    M0    3.59   3.67   3.64    1.44  0.15

T: Teachers; S: Students; P: Pooled Results; N: Number; M0: Overall Mean


9.4.2 Analysis of the open-ended item


At the end of the questionnaire, the participating students and teachers were finally asked to elaborate their experiences of and views on e-learning using an open-ended question. 147 of them expressed comments related to their current or past experiences with e-learning, explained their views on e-learning, or both. As I received a large number of comments from the participants, it is worth categorizing their views and experiences of e-learning into groups. Since I had already analysed their experience in using ICT and their attitudes using eleven closed-ended items, I could categorize their comments according to the levels of their attitudes towards e-learning and their experience with ICT. Additionally, each participant in the survey is designated with an identifier (e.g. P1, P2, P3, ..., Pn). The identifier is also accompanied by the letter T or S to show the source of the finding: teacher or student, respectively.

9.4.2.1 The comments of the participants related to their experiences with e-learning
The participants were asked to elaborate their experiences with e-learning, if any. Table 24 indicates the number of participants and the mean score of the items related to their experience in using different ICT, grouped according to whether they had experience of e-learning. As shown in Table 24, 68 out of the 147 participants had some experience of e-learning, but the rest either did not or preferred not to say. From the table, the relationship between the participants' experience with ICT and their experience of e-learning can be easily seen.

Table 24: The relationship between e-learning experience and ICT experience

                                Experience of the participants with ICT
Do you have experience          Insufficient     Neutral        Sufficient     Total
of e-learning?                  N      M         N      M       N      M       N      M
Yes, I have.                    2      2.00      4      3.17    62     4.23    68     4.10
Prefer not to say.              1      2.50      5      3.13    36     4.18    42     4.01
No, I do not have.              7      2.05      1      2.83    29     4.11    37     3.68
Total                           10     2.08      10     3.12    127    4.19    147    3.97

N: Number; M: Mean

The responses of the participants regarding their experiences of e-learning can be roughly categorized into five groups based on their commonalities. The first group, based on twenty of the teacher participants, had partially or entirely delivered some modules through e-learning; the mean score of these participants' responses to the six items related to their usage of different ICT was 3.97. The second group, based on eleven of the teacher participants, tended to express their experiences of e-learning in terms of developing e-learning materials; the mean score of their responses for their usage of different ICT was 4.14.

The third group, based on twenty of the student participants, tended to describe their experiences of e-learning as downloading e-learning materials (e.g. videos, lecture notes, presentations) over the Internet to enhance their knowledge of the respective modules in their department or to develop their personal skills; the mean score of this group regarding their ICT experience is 4.22. The fourth group, based on fifteen of the student participants, highlighted that they had studied some modules through e-learning (M = 4.14). Finally, two teachers had worked in a European project to gain e-learning experience (M = 3.58).

9.4.2.2 The comments of the participants related to their views on e-learning
The participants were asked to elaborate their views on e-learning, if any. The responses of the participants regarding their views on e-learning can also be categorized into three groups, negative, neutral and positive comments, as shown in Table 25. Table 25 also indicates the relationship between the participants' views on e-learning and their attitudes towards e-learning. These results indicate that the more positive the participants' attitudes towards e-learning are, the more positive things they mention about e-learning.


Table 25: The relationship between views on and attitudes towards e-learning

                                Attitudes of the participants towards e-learning
What do you think               Negative       Neutral        Positive       Total
about e-learning?               N      M       N      M       N      M       N      M
Negative comments               2      2.00    3      2.73    5      4.00    10     3.22
Neutral comments                11     1.93    16     2.99    67     4.02    94     3.64
Positive comments               0      -       5      3.12    38     4.04    43     3.93
Total                           13     1.94    24     2.98    110    4.03    147    3.67

N: Number; M: Mean

The negative comments of the participants related to their views on e-learning: The negative comments of the participants regarding e-learning can be categorized into three groups based on their similarities. The first group was based on four of the participants, who commonly ostracised e-learning based on their beliefs. A student in this group believed that e-learning would not be useful in his department and that it would lead to an increase in the unemployment rate (S626). Another student believed that e-learning was boring and distracting and that department modules should not be delivered using the Internet (S269).

Similarly, another student refused to use the Internet as it contains too much information (S60). The last student in this group did not believe that e-learning was better or more secure than even using e-mail (S219). The second group, based on four of the participants, mainly refused e-learning because of their practical concerns. They mainly emphasized that the subject of electricity was based on practice and hence e-learning was not suitable for the subject because students could not find opportunities to conduct experiments (T312, T352, S435 and S437). The third group, based on two of the participants, mainly repudiated e-learning based on their comparison of campus-based learning and e-learning. The first student explained that he had joined an e-learning course over the Internet. He believed that e-learning was weaker than the classroom environment and hence emphasized the benefits of the classroom environment (SP646). The second one also believed that learning in the classroom was more fluent using the blackboard (S280).


The neutral comments of the participants related to their views on e-learning: The comments of the participants who both embraced and ostracised e-learning at the same time were treated as neutral comments. The comments of these participants can be categorized into several groups based on their similarities. The first group, based on half of the participants, expressed that they did not have enough information about this notion and hence avoided embracing or ostracising e-learning for their respective tasks. The second group was based on the answers of several teachers and students who refused or accepted e-learning provided that some requirements were met. For example, one female teacher drew our attention to a possible risk of e-learning: "The Internet offers lots of information that students might not need at all. This would just make the students keep everything on their computer without analysing or even attempting to analyse it" (T291). This implies that e-learning should be implemented very carefully. We need to apply something different from the tutorial-led approach, such as a game-based learning approach, to ensure that there is some challenge for students in accessing information.

Particularly, a teacher observed his students' behaviours regarding e-learning (T357). He acknowledged that e-learning could bring many advantages and help students learn quickly, but that it could also distract students from lectures because it would be difficult to design regular schedules for online students. He also remarked that students in the classroom environment ask questions without hesitation but become lazy when they are online (e.g. sending e-mails). As a result, he suggested that students should learn in the campus setting but be supported by e-learning in case they were under exam pressure. Similarly, a student shared an opinion about this issue that confirmed the teacher's (S649). According to him, studying online would take more time because things such as Facebook and MSN drew his attention away. However, this changed in the classroom setting because he explicitly agreed on the necessity of following the teacher. Moreover, one male teacher noted that he worked as a lecturer at the Department of Distance Education, which offers a master's degree. He pointed out that the main difficulty in e-learning was to encourage students to sit at the computer for a while and to motivate them to study at their own pace (T121). The responses of the last group were based on their suggestions about developing e-learning materials.


A teacher with a high level of ICT experience and a positive attitude towards e-learning complained about the lack of a department in his institution responsible for preparing e-learning materials (T114). It seems that he used to prepare e-learning materials for his department but could not make them effectively accessible to students because of slow connectivity. A student shared his opinion about the importance of videos because he could view videos again and again until he understood the topics better. Similarly, two other students expressed that they had downloaded videos over the Internet to learn programs (e.g. AutoCAD, HTML) as amateurs. In this way, they learned several programs to a high level. Likewise, a teacher believed that teaching through a presentation was much more effective for students than talking alone. However, he suggested that enriching presentations with simulations and conducting experiments in such simulation-oriented packages could increase students' effective learning. For this reason, desktop recording tools are helpful in learning such packages.

The positive comments of the participants related to their views on e-learning: The positive comments of the participants regarding e-learning can be categorized into three groups based on their commonalities. The first group was based on sixteen of the participants, who commonly embraced e-learning because of its potential benefits such as providing flexibility (S455, S32), widening access (S260, S601), developing information skills (S526) and saving time (S598, S644). The second group, based on twenty-one of the participants, embraced e-learning based on their experiences of e-learning. A student emphasized that he could study in more detail and more effectively for modules that offered lecture notes and assignments on the Internet (S360). Specifically, a student emphasized that some modules were taught by the tutor on chalkboards exactly as in the textbook, without additional support. According to him, applying e-learning to such modules could be better for saving time and learning more quickly (S574). The comments of the last group were based on seven of the participants, who commonly supported e-learning because it might bring innovation into their departments.


9.5 A Model for Delivering E-learning


The analyses of the open-ended item and a list of the closed-ended items have led us to identify some stages for delivering e-learning (see Figure 11). First, the tutorial-led approach may still work in the classroom environment, but our findings indicate that it may not work alone through e-learning for two main reasons. First, our analyses indicate that students may be easily dissatisfied with reading e-learning materials as they have to sit at the computer reading online for a couple of hours per unit. Several external factors may also make this process much worse than expected; the Internet itself has the risk of distracting students from online learning. Second, students in the classroom environment at least have to take notes and hence have a chance to analyse what they learn. However, online students may not need to take notes because learning materials are at their fingertips. The note-taking skill of students is just one example. I also examined several traditional skills of students (e.g. self-motivation, self-responsibility and time management skills) as they can contribute to their success in e-learning (Akaslan & Law, 2011). However, our findings revealed that their traditional skills were not at a high level and that they even had no tendency to integrate ICT effectively into their classroom activities.

Because the lack of traditional skills, combined with the magnetism of the Internet and social network sites (e.g. Facebook), is a possible risk of e-learning, the tutorial-led approach should not be used alone in online learning and hence should be supported by other models such as scenario-based, assessment-driven or game-based approaches. Second, our findings also manifest the importance of accessibility in delivering e-learning. It seems that the biggest challenge is to design e-learning materials that are available to as many students as possible, with various Internet speeds, at any time and anywhere. This is highly important because I examined teachers' and students' access to the Internet at their home and university in addition to the stability of such access (Akaslan & Law, 2011). Our findings indicated that while the majority of them had access to the Internet at home and university, they were not satisfied with their Internet speed. This highlights that I need to think about the Internet speed while I am preparing e-learning materials.


However, this may not be enough in terms of accessibility because extra factors such as operating systems also affect students' access to e-learning. I also investigated students' and teachers' platforms to find out about their diversity (Akaslan et al., 2011, 2012). I found that teachers and students use various operating systems, namely Linux, Pardus, Mac and Windows, for their respective tasks. However, it is not possible to find a desktop-based e-learning platform that supports all operating systems at the same time. This challenge may be overcome provided that we deliver e-learning through a web browser, as all operating systems support at least one web browser from a range of options. This is also important when we take the potential market of smart phones and tablets into account, as they have different operating systems (e.g. Android) from laptops or desktops. Third, it seems that some teachers and students have concerns about how practical work can be implemented through online learning. Today, remote labs are used together with e-learning to help students increase their practical skills. However, this technology is relatively new and is not yet a way to replace the lab environment. Moreover, human beings are social creatures and need face-to-face interactions. The benefits of e-learning should therefore be combined with campus-based learning to improve students' usage of e-learning.

However, the respective institutions offer several modules to teach students software such as AutoCAD, MATLAB and programming languages (e.g. C++). Our findings also show that students have a tendency to learn such programs by watching videos prepared with desktop-recording tools. Such modules can be offered through e-learning effectively because students can also use their own computers as a lab. In this way, students may have fewer concerns about learning such packages. Overall, our findings encourage us to design a scenario-based approach for delivering e-learning through a web browser. In such an environment, enhancing students' motivation is highly important for improving e-learning usage. However, the benefits of campus-based education and training should also be integrated into this environment. For this reason, students should start with e-learning, continue with face-to-face education on the campus and then come back to e-learning to evaluate the things they have learned. This approach can enhance students' motivation and create some challenges for students to reach information at some degree of difficulty.


However, learning in such an environment takes time because of the game-based approach. Some students may be eager to learn without spending extra time on the levels of the game(s). For this reason, I should keep publishing e-learning materials (e.g. lecture notes, videos, exercises and a forum) through a virtual learning environment such as Moodle, as illustrated in Figure 11.

Figure 11: A web-based approach for delivering e-learning (Moodle linking lecture notes, videos, exercises and a forum)


PART III:
EVALUATION
Evaluating the Model in Electrical Engineering:
Empirical Studies in Turkey and the United
Kingdom


IN THIS PART
Chapter 10: Methodology for Evaluation
Chapter 11: Comparing the E-Learning Readiness of the Electrical Engineering Students in Turkey and the UK
Chapter 12: Structural Equation Modelling
Chapter 13: Evaluating e-Learning
Chapter 14: Conclusion

Chapter 10 opens Part III with a methodology that describes the procedures for evaluating the model for e-learning based on the empirical studies in Turkey and the UK. Chapter 11 reports the measurement of the readiness of students in Turkey and the UK and the comparison of the differences between the two countries. In addition, Chapter 12 presents the analyses of the pattern of correlations among a set of variables in the factors that might affect the readiness of the students in Turkey and the UK for e-learning and depicts the causal relationships between those factors. Chapter 13 reports a case-control study on assessing the pedagogical value of e-learning using a web-based, a campus-based and a mixed mode of these approaches by opening three different courses in Turkey and the UK. The part concludes with Chapter 14 by summarizing the key points of the thesis and answering the research questions.


CHAPTER 10: THE PROCEDURES FOR EVALUATION


10.1 Introduction
To ensure that the actual benefit of e-learning is valid under different conditions, in Part I, I illustrated a road map demonstrating what principles are needed to implement e-learning, namely (i) measuring readiness for e-learning; (ii) developing or selecting a platform for e-learning; (iii) developing materials for e-learning; (iv) training individuals for e-learning; and (v) delivering education and training through e-learning. After that, in Part II, I developed an e-learning model based on the attitudes of teachers and students who were working or studying in HEIs associated with the subject of electricity in Turkey. However, it was not possible to assess the pedagogical value of e-learning without evaluating the model. Hence, there was a need to conduct a case-control study to assess the pedagogical value of e-learning by designing and implementing an online course using a web-based, a campus-based and a mixed mode of these approaches.

Moreover, I could only evaluate the model by delivering e-learning in HEIs associated with the subject of electricity in Turkey. However, there was also a need to evaluate the model to find out its applicability in a different context, namely the United Kingdom. Hence, the goal of this chapter is to describe the procedure for evaluating the e-learning model in the domain of electricity in Turkey and the United Kingdom. First, I describe a model for conducting a case-control study in order to find out the pedagogical value of e-learning in higher education institutions associated with the subject of electricity in Turkey. Second, I explain the procedure for selecting and installing an e-learning platform, namely Moodle. Third, I justify why the e-course that I developed is relevant to the students in both countries. Fourth, I describe the strategy to train students for e-learning if they are not ready for it. Finally, I describe the procedure for inviting students to participate in our case-control study.


10.2 A Model for Delivering e-Learning


Our findings in Part II encouraged us to design a model for delivering e-learning through a web browser using a blended learning approach, as illustrated in Figure 12. In such an environment, the benefits of campus-based education and training remain and the motivation of students is enhanced with e-learning materials. As illustrated in Figure 12, students should start with e-learning, continue with the face-to-face learning setting on campus and then come back to e-learning to evaluate the things they have learned. The stages of such a model were identified after detailed analyses of our previous research studies (see Part II), the related literature and discussion among the researchers. It is obvious that face-to-face lectures or tutorials remain dominant in campus-based learning. However, using such a model, students can study at home and assess their learning before they attend campus-based lectures. The details of each stage in the model are explained as follows:

Figure 12: A Model for Blended Learning
Stage 1: Starting with e-learning (Internet-based) - self-directed learning (Step 1: Reading e-Book) and self-assessment (Step 2: Solving e-Exercise A)
Stage 2: Continuing with face-to-face learning (campus-based) - teacher-directed learning (Step 3: Attending Session) and teacher assessment (Step 4: Solving e-Exercise B)
Stage 3: Ending with e-learning (Internet-based) - computer-directed learning (Step 5: Playing e-Game) and computer assessment (Step 6: Solving e-Exercise C)

10.2.1 Stage 1: Getting Prepared at Home for the Class


For students, it is very important to attend the scheduled sessions to get the most from the lectures or tutorials designed for modules in HEIs. However, attendance alone does not guarantee success. To benefit from lectures (or tutorials), students must go to sessions regularly and be prepared. However, it is frustrating that students do not read, in advance of the class, the core text that the lecture or tutorial is to cover (Schwieter, 2008). Going to the class without being familiarised with key concepts and vocabulary prevents students from benefitting from lectures. Schwieter (2008) noted that many students attend the class without preparation and then postpone reviewing the information until just before the exam, cramming all of the material in so quickly that they are likely to forget most of it within a short period of time. It is highly recommended that students devote more than enough time to preparing themselves for their upcoming classes, though a host of reasons seem to stop students from getting prepared, such as the lack of textbooks. For example, the majority of modules in HEIs in Turkey and the UK provide a reading list rather than a main textbook.

However, as the number of available copies in libraries is not sufficient, students are forced to purchase the books or wait for other students to return them. Besides, the structure of modules in universities corresponds only roughly to the outlines of the textbooks in the reading list, because lecturers tend to encourage their students to read from multiple perspectives. For instance, one textbook may be good for a certain topic but not for the other topics. As a result, it becomes unclear to students which chapters should be read in advance of the class. Having the outline of the textbook correspond with the topics of the lectures is surely a great way to help students prepare for classes. Hence, a book must be provided to students to make sure that they have sufficient material for reading the core text and familiarizing themselves with key concepts and vocabulary well in advance of the class. The development of a textbook that corresponds with the outline of the module is also part of making sure that students familiarize themselves with key concepts and vocabulary before going to the class. Moreover, instructors should also make sure that they get their students to read the respective part of the textbook and prepare themselves for classes.


There are many ways in which instructors can assess whether or not their students are prepared for class. Some instructors ask a few study questions, some give a short quiz and some ask students to write a response to the reading. McMahon and Stark (2005) noted that one way to improve the quality of student preparation was to ask students to respond in writing to several thought-provoking questions before a class and then email their responses to the instructor, post them on a website or bring a hard copy to the class. However, it may not be possible for instructors to check whether or not students have read the core text, due largely to insufficient time. Besides, after collecting homework at the beginning of the class, it may not be possible to evaluate student responses immediately.

However, as part of class preparation, several questions, especially closed-ended questions that are due some time (e.g. 24 hours) before a class, might be prepared and posted using a VLE such as Blackboard or Moodle to encourage students to evaluate their own reading. In this way, instructors might monitor students' answers before the class and review their presentations. Besides, instructors might use their time effectively by skipping the slides that students have already learned and focusing on the slides with which students struggle. Therefore, it is recommended that a book be designed where each chapter, with text and exercises, can be used both online and offline, thereby helping students get prepared for the class and helping instructors review their presentations before the lecture starts.
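For illustration only, the following sketch shows how such pre-class, closed-ended questions and their deadlines might be organised programmatically before being posted to a VLE. The class names, the sample question and the 24-hour offset are hypothetical and are not taken from the actual course configuration used in this thesis.

from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List

@dataclass
class ClosedEndedQuestion:
    """A single closed-ended (multiple-choice) question for a pre-class self-check."""
    text: str
    options: List[str]
    correct_index: int

@dataclass
class PreClassQuiz:
    """A small bundle of questions that closes before the lecture session."""
    chapter: str
    questions: List[ClosedEndedQuestion]
    lecture_time: datetime

    @property
    def due_time(self) -> datetime:
        # Closing the quiz 24 hours before the lecture gives the instructor
        # time to review the answers and adjust the presentation slides.
        return self.lecture_time - timedelta(hours=24)

# Hypothetical example for one chapter of the course.
quiz = PreClassQuiz(
    chapter="Chapter 2: MATLAB Desktop",
    questions=[
        ClosedEndedQuestion(
            text="Which MATLAB window lists the variables currently in memory?",
            options=["Command Window", "Workspace", "Current Folder"],
            correct_index=1,
        )
    ],
    lecture_time=datetime(2012, 10, 15, 9, 0),
)
print(quiz.due_time)  # -> 2012-10-14 09:00:00

In practice, the due date and the question bank would of course be configured inside the VLE itself; the sketch only makes the workflow described above concrete.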

10.2.2 Stage 2: Attending Lecture Sessions after Studying at Home

A formal talk given to a group of students in order to teach them about a subject is a crucial part of the conventional educational system. As the main focus of the conventional educational system is on delivering the teacher's lectures to students, it has paid less attention to the learning aspect of education (Alonso et al., 2005). However, the use of ICT, especially the Internet, has significantly affected the way teachers teach, for example through the use of the whiteboard and projector in the classroom. On the other hand, ICT can also be used effectively to help students enhance their learning during the session. However, there are many issues that put a barrier between the students and the teacher in the classroom (Akaslan et al., 2011 & 2012).


Some of the issues expressed by both teachers and students (see Chapter 6 and Chapter 8) include the following: first, large classes contribute to learning problems associated with noise in the classroom and cause inadequate practice in laboratories; second, large classes increase stress and reduce satisfaction for learners; third, some students may need extra time to grasp difficult concepts; and fourth, the lecture theatres used for delivering the class lectures are not fit for purpose for several reasons, such as the lack of sound quality, an environment not conducive to gaining learning experiences, sound that is not amplified and carried clearly from front to back due to the large class size, speech perception fading away at the back of classrooms, inappropriate facilities and seats in the lecture theatres, lack of discipline and lack of individualised instruction.

However, we can overcome the shortcomings of classrooms by using ICT effectively. For example, instead of using the blackboard only, a projector with efficient presentations could be used, because it is not possible for every student to seize a place near the blackboard or the instructor. It may help the teacher to deliver his lectures effectively and enhance the students' learning. However, the students surveyed in 2011 were asked about their traditional skills, such as writing and note-taking, and it was found that there are many students who need extra time to grasp difficult concepts and to take notes. In addition to the use of a projector with effective presentations, the class lectures could be recorded and published to reduce the negative impacts of overpopulated classes. On the other hand, many students attend the lectures and then wait to study and review the information until just before the exam (Schwieter, 2008). It is hence crucial to find a way to encourage students to look over their notes after the class while the lecture is still fresh in their minds.

10.2.3 Stage 3: Playing e-Game after Attending Lecture

E-learning may be distracting since there is a risk that access to web-based resources such as social networks (e.g. Facebook) may distract students from online learning. Hence, this can undermine students' learning, especially for students with a low level of traditional skills such as note-taking and time management. However, a web-based game could be a solution to encourage students' learning while they are learning on the Internet.


There has been an interesting increase in the use of games for learning at all stages of formal education, from primary schools to higher education, especially computer games in recent years. Whitton (2012) pointed out that while there are many examples of the use of games in education, the main rationale is that learners simply find them motivational. Many serious-game supporters have been conducting research on how games can be used for learning, but with limited outcomes so far. It is not possible to identify a certain outcome regarding the pedagogical value of educational games for higher education students. However, a web-based game could be used to increase students' research skills and create competition between students while they are playing and learning the content.

10.3 A Model for Conducting a Case-Control Study


However, evaluating the model based on the blended learning approach in Figure 12 on its own might not help us find out the pedagogical value of e-learning without comparing it with campus-based learning and e-learning. Hence, a case-control study should be conducted to determine whether the model in Figure 12 has the intended effect on participating students in higher education institutions in Turkey and the United Kingdom. Therefore, a model for conducting a case-control study has been developed to find out the effects of three types of learning, namely e-learning, blended learning and traditional learning, as illustrated in Figure 13.

As illustrated in Figure 13, at the beginning and at the end of each type of learning, pre- and post-placement tests were applied to find out the current knowledge of students about the content of the courses. Placement tests are mainly used to measure students' ability in order to put those students into a particular class or group. However, instead of assigning students to an appropriate group, I used the results of the placement tests to find out the rate of increment or decrement in students' learning after the end of each type of learning course.
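The exact formula is not spelled out at this point, but, assuming the pre- and post-placement scores of a student are denoted by S_pre and S_post, one simple way to express such a rate of increment or decrement is the relative gain

\[ \text{Gain}(\%) = \frac{S_{\mathrm{post}} - S_{\mathrm{pre}}}{S_{\mathrm{pre}}} \times 100, \]

where a positive value indicates an increment and a negative value a decrement in performance. This is only a sketch of one possible definition, not necessarily the exact measure applied to the placement tests in the evaluation.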


Figure 13: A Model for Case-Control Study (sampling from the target population, measuring readiness for e-learning, and pre- and post-placement tests for each of the online and offline modes)

10.4 Selecting and Installing the E-learning Platform

An e-learning platform (i.e. a VLE) is required to deliver e-learning to the potential students. There are many e-learning platforms, including commercial (e.g. Blackboard and WebCT) and open-source systems (e.g. Sakai and Moodle). Moodle was selected among those systems as the e-learning platform for conducting the case-control study of this thesis, based on the findings obtained from the teachers and students surveyed in 2010 and 2011, as detailed in Chapters 5, 6, 7 and 8. To install Moodle as the e-learning platform, the latest version of Moodle at http://www.moodle.org was examined and its requirement list was used to find a stable hosting service for the installation, in order to ensure that the e-learning platform would work 24 hours nonstop without interruptions in the server. Two hosting accounts, a non-commercial and a commercial one, were used and the latest version of Moodle was installed under the domain http://www.dursunakaslan.com. Moreover, the latest version of LimeSurvey, an open-source survey application, was also integrated into Moodle to diversify the number of questions used to assess the understanding of students through e-exercises.

10.5 Developing and Integrating E-learning Materials into the E-learning Platform

It is not possible to conduct a case-control study without developing e-learning materials and integrating them into the e-learning platform. To deal with this issue, the undergraduate curriculum of the HEIs associated with the subject of electricity in Turkey was examined in detail. It was found that it contains various modules such as circuit theory, basic linear algebra, programming and electromagnetic waves. Additionally, the majority of those modules require high computer skills from students as they include the use of computers and programming. Hence, the teachers and students in the respective HEIs were surveyed about the ICT that they were familiar with. The findings indicated that the respective HEIs offered several modules to teach students programming skills using software such as AutoCAD, MATLAB and C++. The MATLAB software was selected among those systems as the content for the case-control study of this thesis, based on the findings obtained from the teachers and students surveyed in 2010 and 2011, as detailed in Chapter 5, Chapter 7 and Chapter 9.


Hence, a module titled Programming with MATLAB, comprising three parts and 15 topics, was designed. It contained several learning materials, including a chapter, exercises both with and without solutions, a presentation and a digital game for each topic, and these were digitalized in order to be integrated into Moodle. In order to develop an up-to-date book on Programming with MATLAB, the latest advancements in the MATLAB software were investigated. As a result, a book including 15 topics was developed, as illustrated in Table 26.

Part I, Fundamentals, was concerned with the fundamentals of MATLAB which students need to know in order to get started with programming in MATLAB, consisting of Chapters 1-6 of the book. The chapters were presented in a logical order to allow students to work through the fundamentals of MATLAB from the beginning to an advanced level.

Part II, Programming, was about programming with MATLAB, covering Chapters 7-12. I aimed at teaching the fundamentals of programming with MATLAB in Chapters 7, 8 and 9. The later chapters introduced more advanced topics in programming considering the MATLAB software and its environment.

Part III, Applications, was designed to demonstrate the use of MATLAB in practice, namely in real situations. I aimed to help students find information about how to develop a simple calculator, sort data in an external file and control an external device, namely a PIC16F628, through the serial port of a computer in order to control a race car.

In parallel with the chapters of the book, a list of exercises (called e-Exercise A), including six questions for each topic, was designed to measure the understanding of students while they study the e-book at home before they come to the classroom session, regardless of whether they are online or on the campus. Based on the chapters of the book, 15 presentations for the classroom sessions were also prepared using https://www.prezi.com to encourage students' learning during the class. For example, the 16th slide of the unit called Digital Images is given in Figure 14.


Table 26: Book Chapters

Part                    Chapter   Title
Part I: Fundamentals    01        Introduction to MATLAB
                        02        MATLAB Desktop
                        03        Current Folder and Workspace
                        04        Matrices
                        05        Multidimensional Arrays
                        06        Digital Images
Part II: Programming    07        Variables
                        08        Data Types
                        09        Conditionals and Loops
                        10        M-Files
                        11        Sorting Algorithms
                        12        Graphical User Interface
Part III: Practices     13        Graphing the Voltage of AC
                        14        Controlling a Race Car using PIC16F628
                        15        Designing a Windows Calculator
Figure 14: The 16th Slide of the Presentation for Chapter 6


In parallel with the topics of the presentations, a list of exercises (called e-Exercise B) was also designed to measure the understanding of students while they are listening and taking notes during the class. The results of the open-ended questions and the interviews with teachers in 2010 and students in 2011 also revealed that e-learning may be distracting, since there is a risk that access to web-based resources such as social networks (e.g. Facebook) may distract students from online learning. Hence, this can undermine students' learning, especially for students with a low level of traditional skills such as note-taking and time management. These findings helped us to design a web-based game to encourage students' learning after the classroom sessions. The web-based game was designed in parallel with the topics in Part III of the textbook. Finally, a list of exercises (called e-Exercise C), including six questions for each topic in Part III, was also designed to measure the understanding of students while they are playing the web-based game at home after they have attended the classroom sessions. In sum, six types of e-learning materials were designed and integrated into the e-learning platform after installing Moodle, as illustrated in Figure 15.

Figure 15: The Development of E-learning Materials (e-Book, e-Session, e-Game, e-Exercise A, e-Exercise B and e-Exercise C)

To develop the e-learning materials, the steps suggested by Mayes and Freitas (2000) for developing materials were followed: (i) describe the intended learning outcomes; (ii) choose learning and teaching activities that allow students to achieve the learning outcomes; (iii) design assessment tasks which test whether the learning outcomes have been reached; and (iv) evaluate the achievement of the outcomes. To achieve the first step, a short description of the learning outcomes was given in the first chapter of the e-book. Additionally, a detailed explanation of the learning outcomes for each chapter should be given at the beginning of each chapter. Figure 16 shows the description of the learning outcomes for Chapter 2.


Figure 16: Learning Outcomes of Chapter 2

To choose learning and teaching activities that allow students to achieve the learning outcomes, the qualitative survey data were examined in detail and the opinions of the teachers and students were taken into account. For instance, one teacher suggested that education should not only focus on the answer to the question "what" but that students should also find answers to the questions "how", "where" and "why". Therefore, I designed the activities in the case study to encourage students to learn about WHAT, WHY, HOW and WHERE for each concept. In this way, I tried to achieve the second step suggested by Mayes and Freitas (2000). For the third step, the students were encouraged to answer the exercises designed for each chapter. To deal with the fourth step, I applied a placement test at the beginning, in the middle and at the end of the courses. In this way, the students' learning was evaluated by comparing their performance during the courses.


10.6 Training Students before Implementing E-learning

As specified in Section 10.3, at the beginning of the case-control study, the readiness of the students in the domain of electrical engineering in both Turkey and the UK who chose either the blended or the e-learning mode was measured. Overall, the findings indicated that the students in both countries (i.e. Turkey and the UK) generally showed positive experiences, confidence and attitudes towards e-learning. Moreover, the results of the e-learning readiness survey showed that their readiness seemed to be sufficient. However, the qualitative survey data showed that their attitudes towards e-learning needed to be strengthened before implementing e-learning.

Therefore, at the beginning of the first course, the students were trained in the use of Moodle (e.g. changing their password, updating their profile, uploading their picture, sending messages to each other, reading PDF documents, solving exercises and watching recorded sessions) and the use of the whiteboard tool (e.g. joining online sessions, using their microphone and camera, chatting with other students and asking written questions during the session). In these training sessions, students were also informed about e-learning and its benefits and drawbacks. For the training sessions, a commercial whiteboard tool (i.e. http://www.gotomeeting.com/) was used. Using GoToMeeting, students were informed about the case study. For instance, students were shown how to log in to Moodle and how to change their password through a demonstration using the desktop-sharing feature of the respective whiteboard tool.

10.7 Research Group
The participating institutions were determined by considering whether they were associated with the subject of electricity, especially whether they had students of electrical engineering, in 2012 in both countries. Associate, Bachelor, Master and PhD students in those institutions were chosen as participants in our case-control study. However, for the reasons specified in Chapter 3, sampling was more feasible than studying all the students in the target countries to observe whether e-learning is effective. Hence, there was also a need to select one of the sampling methods for evaluating the model.

10.7.1 Sampling Method

Simple random sampling was chosen for the evaluation of the model in my study, as each student in the target populations has the same chance of being included (see Chapter 3). The sample needed for the study was calculated as 384 with a 5% allowable error, a 95% confidence level and 0.5 degree of variability, because the number of students in the respective HEIs in 2012 in both Turkey and the UK was unknown and was therefore treated as an infinite population.
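For orientation, these parameters are consistent with the standard Cochran formula for an infinite population, assuming z = 1.96 for the 95% confidence level, p = 0.5 for maximum variability and e = 0.05 for the allowable error:

\[ n_0 = \frac{z^2\, p\,(1-p)}{e^2} = \frac{(1.96)^2 (0.5)(0.5)}{(0.05)^2} \approx 384. \]

This is shown only as a worked check; the derivation the study relies on is the one referred to in Chapter 3.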

10.7.2 Sampling Size

Given the relatively short history of e-learning in Turkey, the students of all HEIs associated with the subject of electricity in Turkey in 2012 were invited, by sending an invitation to their department secretaries, in order to obtain the desired level of confidence and precision. For the UK, the universities located there were listed and 10 of them were selected randomly in order to use our limited resources. I aimed to evaluate whether the model is applicable to a context other than Turkey, though I was fully aware of the limitations of this sampling strategy. An invitation letter was sent to the department secretaries of the HEIs associated with the subject of electricity in those ten universities to encourage their students to join the case-control study.

While 1364 individuals responded to the invitation, the responses of only 964 participants were valid. Table 27 illustrates the distribution of the participants for each group. The valid responses of 776 participants were from Turkey and 188 were from the UK. The 776 participants were sufficient to obtain the desired level of confidence and precision in our sampling. However, as only 188 students from the UK participated in the case-control study, it was not possible to achieve the desired level of confidence and precision for the UK.


However, in order to use our limited resources effectively, neither a reminder was sent to those ten universities nor were new universities selected to compensate for non-response. The main rationale for this decision is that the demand on the e-learning platform was high, especially for the online sessions, and my server capacity was not sufficient. Therefore, the generalizability of the UK sample is limited in our study. Table 27 displays the number of participants for each mode of learning in Turkey and the UK. As seen in Table 27, the majority of the students selected the e-learning mode.

Table 27: The number of participants in Turkey and the UK

Country           Mode                  Number   Per cent
Turkey            E-learner             589      75.9
                  Blended Learner       113      14.6
                  Traditional Learner   74       9.5
                  Total                 776      100.0
United Kingdom    E-learner             162      86.2
                  Blended Learner       22       11.7
                  Traditional Learner   4        2.1
                  Total                 188      100.0

The students of the traditional and blended learning groups were selected from the University of Leicester in the UK and Selcuk University in Turkey. The numbers of students for each mode at the University of Leicester and Selcuk University are illustrated in Table 28.


Table 28: The number of participants in Leicester and Selcuk University

University   Mode                  Number   Per cent
Selcuk       E-learner             30       13.89
             Blended Learner       112      51.86
             Traditional Learner   74       34.26
             Total                 216      100.00
Leicester    E-learner             118      82.52
             Blended Learner       21       14.69
             Traditional Learner   4        2.80
             Total                 143      100.00

The valid responses of 216 participants were from Turkey and 143 were from the United Kingdom. The majority of the students from the University of Leicester selected the e-learning mode. On the other hand, the blended mode was the choice of the majority of the students at Selcuk University in Turkey.

10.8 Measuring E-learning Readiness

It was also important to understand whether the students in Turkey and the UK embraced or ostracised e-learning at the beginning of the case-control study. To address this issue, at the beginning of the case-control study, the readiness of the students in both Turkey and the UK was measured on ten factors, as explained in Chapter 4. However, the readiness of only the students who selected the blended or e-learning mode was measured. Once they had filled in the survey that measured their readiness for e-learning, an account was created for those students to log in to the e-learning platform, namely Moodle. In total, 886 accounts were created for the students who successfully completed the e-learning readiness survey at the beginning of the case study. The readiness of the students in Turkey and the UK for e-learning is analysed in Chapters 11 and 12.


10.9 Assessment Methods

The terms measurement, assessment and evaluation are considered essential components of teaching and learning in education and training. Without them, it is impossible to know whether students have learned or teachers have taught effectively. However, the terms measurement, assessment and evaluation are often used in higher education with different meanings, as these terms carry various meanings in an educational context. It is therefore important to interpret those terms at the outset, because they are often used interchangeably even though they are not the same.

Phillips et al. (2012) describe the process whereby teachers set specific tasks to judge the extent to which students can demonstrate outcomes as assessment, and use the term evaluation in the sense of making judgements about how effective the design of the learning environment is for supporting learning. Moreover, the term measurement is also used in the educational context. Measurement, like observation, is a technique used to assess the state or condition of a thing, such as the understanding of students about a specific topic.

Measurement is often described as the process of assigning numerals to objects, events, etc. according to certain rules (Tyler, 1963). The term evaluation is also used for the process of observing and measuring a thing for the purpose of judging it. As a result, both assessment and evaluation, using the measurement rather than the observation technique, were used to understand the state of e-learners, blended learners and traditional learners and to judge the pedagogical value of those modes. The results of e-Exercises A, B and C were used to measure the understanding of students of the corresponding e-learning materials. Additionally, the results of the pre-placement test, the post-placement test and the quiz were used to assess the pedagogical value of e-learning, traditional learning and blended learning.


CHAPTER 11: COMPARING THE E-LEARNING READINESS OF THE ELECTRICAL ENGINEERING STUDENTS IN TURKEY AND THE UK

11.1 Introduction
As specified in Chapter 10, at the beginning of the case-control study, the readiness of the students in the domain of electrical engineering in both Turkey and the UK who chose either the blended or the e-learning mode was measured. Note that the batch of Turkish students taking the e-readiness survey described below is different from that involved in 2011 (see Chapter 7).

In this chapter, I report the extent to which the participants from Turkey and the United Kingdom were ready for e-learning in terms of ten main factors (see Chapter 4). I first report the descriptive statistics of the items in the study (see Chapter 7 for the items) and then compare the mean scores across demographic variables such as gender and age of the participants to find out whether there were significant differences with respect to these variables.

11.2 Research Group

In total, 886 university students of different academic levels participated in the survey. At the end of the questionnaire, the participants were invited to take the pre-placement test in order to measure their current knowledge of Programming with MATLAB at the beginning of the online courses. The majority of the participants were male (79.9%, 708). The participants were categorized into different age groups; the distribution indicates that 87.4% of the participants were less than 26 years old. Another criterion used to categorize the participants was the type of university degree: 78.11% of the participants were studying for a Bachelor degree and the rest for a Master or PhD degree.


11.3 Initial Findings
The results of the descriptive statistics of the respective items are presented in ten parts corresponding to the ten phases depicted in Figure 5. Table 29 illustrates the overall mean and standard deviation scores of the participants' responses and the mean scores of the items related to each factor, such as technology and confidence. From Table 29, it can be observed that the overall mean scores were higher than the expected level of readiness (see Chapter 3.6) in both Turkey (M=3.53 > MEXPECTED=3.40) and the United Kingdom (M=3.78 > MEXPECTED=3.40).

Based on this result, it can be inferred that the students in Turkey and the United Kingdom, within the limits of the students surveyed, were overall ready for e-learning, although they might need some improvements. The mean scores for the factors can also be used to identify areas of improvement for the participating students. First of all, the mean scores for institution and training, the only pooled factors whose mean scores are lower than the expected readiness level (MINSTITUTION=2.65 and MTRAINING=2.44 < MEXPECTED=3.40), show that there was a need to improve university facilities and to train students for e-learning.

Table 29: Number, Mean and Standard Deviation of Items

Factor          No of    Turkey (N=702)    UK (N=184)       Pooled (N=886)
                Items    M      SD         M      SD        M      SD
Technology      6        3.82   0.82       4.32   0.66      3.93   0.81
Experience      6        3.92   0.61       4.12   0.56      3.96   0.60
Confidence      5        4.15   0.63       4.31   0.64      4.18   0.64
Attitude 1      6        3.79   0.60       3.90   0.66      3.81   0.62
Attitude 2      6        3.50   0.63       3.57   0.68      3.51   0.64
Tradition       24       3.72   0.57       3.79   0.58      3.73   0.58
Institutions    3        2.42   1.71       3.54   1.74      2.65   1.78
Content         4        3.75   0.73       3.70   0.73      3.74   0.73
Acceptance      8        3.88   0.60       3.84   0.63      3.87   0.61
Training        5        2.38   0.70       2.71   0.73      2.44   0.72
Overall         73       3.53   0.39       3.78   0.45      3.58   0.41

M: Mean; SD: Standard Deviation; N: Number

Table 30 also displays the means and standard deviations of all the items in the study except those of the factor traditional skills. The participants were asked about their access to a computer connected to the Internet at home and at university and then, if they had access, they were immediately asked about the stability and speed of that access. The majority of the participants reported that they had access to the Internet at their residence in Turkey (665; 94.7%) and the UK (178; 96.7%) and at university in Turkey (607; 86.5%) and in the UK (174; 94.6%).

Based on these results, it can be interpreted that almost all of the participants in both countries had access to the Internet at home. However, there was a gap regarding the rate of access to the Internet at university between Turkey and the United Kingdom. The stability and speed of the Internet that students used at the place they lived and at the university they studied were also surveyed in both countries.

The results show that students in Turkey were not satisfied with the stability and speed of the Internet at their university, whereas those in the United Kingdom were. Besides, it can be interpreted that the satisfaction of students in both countries with the stability of the Internet was much better than with its speed. Moreover, there was a significant difference between students in Turkey and the UK for all the items except access to the Internet at home.

As shown in Table 29, the mean scores of all the items related to the factor experience were higher than the expected readiness level. Based on these results, it can be inferred that the experiences of students in both countries are mostly sufficient for e-learning. However, the experiences of the students in the UK were better than those in Turkey. It can also be observed that there was a significant difference between students in the UK and Turkey in terms of their experience of using the Internet as an information source (I07), email to communicate with their peers (I08), office software for their coursework (I09) and engineering software (I12).


Table 30: Statistics for the items related to all factors

Factor                        Item   Turkey (N=702)   UK (N=184)      Pooled (N=886)
                                     M      SD        M      SD       M      SD       t       p
Technology                    I01    4.79   0.89      4.87   0.71     4.81   0.86     -1.13   0.26
                              I02    3.68   1.13      4.14   0.96     3.77   1.11     -5.15   0.00
                              I03    3.39   1.19      3.93   1.01     3.50   1.17     -5.73   0.00
                              I04    4.46   1.37      4.78   0.91     4.53   1.29     -3.04   0.00
                              I05    3.34   1.31      4.14   1.07     3.50   1.30     -7.61   0.00
                              I06    3.28   1.34      4.06   1.10     3.44   1.33     -7.32   0.00
Experience with ICT           I07    4.34   0.60      4.60   0.67     4.39   0.62     -5.06   0.00
                              I08    3.78   1.04      4.08   0.92     3.85   1.02     -3.54   0.00
                              I09    4.05   0.91      4.44   0.74     4.13   0.89     -5.36   0.00
                              I10    4.02   1.05      4.08   1.04     4.03   1.05     -0.66   0.51
                              I11    3.58   1.17      3.76   1.10     3.62   1.16     -1.89   0.06
                              I12    3.77   1.06      3.76   1.14     3.77   1.08     0.17    0.86
Confidence with ICT           I13    4.24   0.74      4.55   0.70     4.30   0.74     -5.13   0.00
                              I14    4.25   0.74      4.55   0.69     4.31   0.74     -5.03   0.00
                              I15    4.32   0.66      4.60   0.67     4.38   0.67     -5.12   0.00
                              I16    4.40   0.69      4.44   0.79     4.41   0.71     -0.79   0.43
                              I17    3.54   0.99      3.38   1.10     3.51   1.01     1.85    0.06
Attitudes towards E-learning  I18    3.37   0.85      3.55   0.88     3.41   0.86     -2.53   0.01
                              I19    3.53   0.92      3.90   0.92     3.61   0.93     -4.82   0.00
                              I20    4.02   0.74      4.08   0.79     4.03   0.75     -0.96   0.34
                              I21    3.65   0.87      3.83   0.86     3.69   0.87     -2.49   0.01
                              I22    4.09   0.78      4.07   0.77     4.09   0.78     0.29    0.77
                              I23    4.09   0.73      3.97   0.83     4.07   0.75     2.02    0.04
Attitudes towards Others      I24    3.35   0.86      3.53   0.87     3.39   0.87     -2.51   0.01
                              I25    3.13   0.81      3.44   0.81     3.20   0.82     -4.52   0.00
                              I26    3.58   0.86      3.67   0.84     3.59   0.85     -1.29   0.20
                              I27    3.47   0.79      3.55   0.81     3.49   0.80     -1.25   0.21
                              I28    3.72   0.79      3.62   0.77     3.70   0.78     1.54    0.12
                              I29    3.73   0.76      3.61   0.74     3.71   0.76     1.94    0.05
Institution                   I54    2.74   1.98      3.85   1.82     2.97   2.00     -6.87   0.00
                              I55    2.29   1.87      3.50   1.94     2.54   1.95     -7.76   0.00
                              I56    2.24   1.85      3.28   1.99     2.45   1.93     -6.72   0.00
Content                       I57    3.89   0.73      3.94   0.77     3.90   0.74     -0.92   0.36
                              I58    3.91   0.74      3.90   0.85     3.91   0.76     0.16    0.88
                              I59    3.56   1.02      3.48   0.98     3.54   1.01     0.96    0.34
                              I60    3.64   0.99      3.48   1.04     3.61   1.00     1.90    0.06
Acceptance                    I61    4.02   0.73      3.94   0.81     4.01   0.75     1.44    0.15
                              I62    3.86   0.78      3.81   0.76     3.85   0.77     0.77    0.44
                              I63    3.98   0.75      3.88   0.76     3.96   0.75     1.48    0.14
                              I64    4.11   0.69      4.08   0.72     4.10   0.69     0.49    0.62
                              I65    3.45   0.99      3.66   0.88     3.49   0.98     -2.62   0.01
                              I66    3.97   0.73      3.89   0.77     3.95   0.74     1.41    0.16
                              I67    3.84   0.77      3.72   0.76     3.82   0.77     1.91    0.06
                              I68    3.82   0.77      3.77   0.75     3.81   0.77     0.77    0.45
Training                      I69    2.31   0.90      2.55   1.06     2.36   0.94     -3.05   0.00
                              I70    2.35   0.90      2.59   0.86     2.40   0.89     -3.35   0.00
                              I71    2.29   0.81      2.62   0.81     2.36   0.82     -4.97   0.00
                              I72    2.21   0.87      2.58   0.85     2.28   0.88     -5.30   0.00
                              I73    2.72   1.07      3.18   1.00     2.82   1.07     -5.26   0.00

M: Mean; SD: Standard Deviation; N: Number

As shown in Table 29, the mean scores of all the items related to the factor confidence were higher than the expected level of readiness. It is also evident that the mean scores of the items related to the factor confidence were significantly higher than those related to the factor experience in Table 29. However, the mean score of the item designed to find out whether students use authoring tools, such as Movie Maker, to create learning materials (I17) was lower than the expected level of readiness for the students in the UK.

Moreover, there were also significant differences between students in the two countries for the items designed to investigate whether students used computers (I13), web browsers (I14), search engines (I15) and authoring tools (I17) confidently. The attitudes of the participants towards e-learning were also interesting to investigate. As indicated in Table 29, the mean scores of all the items related to the factor attitude towards e-learning were higher than the expected level of readiness. The responses of the participants were also analysed to find out about their attitudes towards others, namely their teachers and peers.

As shown in Table 29, the mean scores of all the items related to the factor attitude towards others were higher than the expected level of readiness in both countries, except for item I25 for the Turkish students. The traditional skills of the participants were also a crucial factor in measuring the readiness of those students. Table 31 shows the mean scores of all the items in the factor traditional skills, indicating that the sub-factors of traditional skills were higher than the expected level of readiness.


Table 31: Statistics for each sub-factor of the factor traditional skills

Sub-factor         Turkey (N=702)    UK (N=184)       Pooled (N=886)
(24 items)         M      SD         M      SD        M      SD       t       p
Writing            3.69   0.72       3.76   0.71      3.70   0.72     -1.26   0.21
Note Taking        3.53   0.80       3.83   0.75      3.59   0.80     -4.58   0.00
Collaboration      3.88   0.71       3.90   0.68      3.88   0.71     -0.44   0.66
Reading            3.73   0.63       3.78   0.69      3.74   0.64     -0.80   0.42
Attendance         3.82   0.69       3.81   0.73      3.82   0.70     0.22    0.83
Time Management    3.46   0.89       3.71   0.87      3.51   0.89     -3.40   0.00
Self-Directed      3.98   0.67       3.91   0.70      3.97   0.67     1.35    0.18
Self-Motivation    3.64   0.79       3.62   0.78      3.63   0.79     0.29    0.77
Overall            3.72   0.57       3.79   0.58      3.73   0.58     -1.53   0.13

M: Mean; SD: Standard Deviation; N: Number

As given in Table 29, the mean scores of all the items related to the factor institutions were lower than the expected level of readiness. This indicates that the facilities of the respective HEIs in both countries should be improved for e-learning. As given in Table 29, the mean scores of all the items related to the factor content were higher than the expected level of readiness. This might imply that the participating students in both countries believed that e-learning would enhance the quality of the subject of electricity in both theory and practice.

Additionally, from Table 29, it can be inferred that it is also possible to apply e-learning to the practical parts of electrical engineering. Towards the end of the survey, I also examined two factors that potentially affected the perceptions of the participating students about e-learning: the degree to which students believed that e-learning would be free of effort and would enhance their learning. Table 29 shows that the mean scores of all the items related to the factor acceptance were higher than the expected level of readiness. As given in Table 29, the mean scores of all the items related to the factor training were lower than the expected level of readiness. This shows that it is essential to train individuals for e-learning before delivering it.


11.4 Inferential Findings
Independent-samples t tests and one-way ANOVA tests were used to verify statistical differences in mean scores between various groups, namely between e-learners in Turkey and the UK, blended learners in Turkey and the UK, e-learners and blended learners in Turkey, and e-learners and blended learners in the UK.

11.4.1 Differences between e-Learners in Turkey and the UK

Table 32 shows the number, mean and standard deviation of the scores for the factors in the study, in addition to the results of the independent-samples t-test. Table 32 only shows the scores of the factors rather than the scores of all the individual items because the factor scores are more meaningful for this comparison.

Table 32: Statistics of e-Learners for factors in e-learning readiness

Factor              No of    Turkey (N=589)    UK (N=162)       t Test
                    Items    M      SD         M      SD        t       p
F01: Technology     6        3.92   0.76       4.34   0.64      -6.42   0.00
F02: Experience     6        3.98   0.58       4.15   0.51      -3.40   0.00
F03: Confidence     5        4.20   0.60       4.35   0.59      -2.88   0.00
F04: Attitude 1     6        3.85   0.57       3.94   0.63      -1.81   0.07
F05: Attitude 2     6        3.50   0.64       3.59   0.67      -1.62   0.11
F06: Tradition      24       3.75   0.55       3.81   0.56      -1.22   0.23
F07: Institutions   3        2.20   1.61       3.54   1.74      -9.24   0.00
F08: Content        4        3.75   0.73       3.73   0.73      0.30    0.77
F09: Acceptance     8        3.91   0.58       3.87   0.60      0.81    0.42
F10: Training       5        2.35   0.70       2.70   0.74      -5.61   0.00
Overall             73       3.54   0.37       3.80   0.43      -7.78   0.00

M: Mean; SD: Standard Deviation

As shown in Table 32, e-learners in the UK showed higher readiness than those in Turkey in the mean scores of all the factors except content (F08) and acceptance (F09). Moreover, the mean scores of the factor training (F10) were below the expected level of readiness for both countries.


Besides, the mean score of the factor institutions was also under the expected level of readiness for Turkey. Based on these results, it seems reasonable to infer that the students in the domain of electrical engineering were ready for e-learning. However, a caveat is that the relatively small number of British students in electrical engineering limits the generalizability of this claim for the UK. Moreover, training students for e-learning is an important step towards succeeding at e-learning in both countries. Besides, the participating students from Turkey did not believe that their institutions were ready for e-learning.

11.4.2 Differences between Blended Learners in Turkey and the UK


Table 33 indicates the number of items, mean and standard deviation scores of blended learners in Turkey and the United Kingdom, in addition to the results of the independent-samples t-test used to verify the statistical significance of the differences between the two countries. However, it should be noted that the results for blended learners do not represent the two countries as a whole because the blended learners came only from the University of Leicester and Selcuk University.
Table 33: Statistics of Blended Learners for each factor

Factors             No of   Turkey (N=113)    UK (N=22)         t Test
                    Items   M       SD        M       SD        t        p
F01: Technology     6       3.32    0.93      4.22    0.77      -4.22    0.00
F02: Experience     6       3.62    0.67      3.87    0.80      -1.54    0.13
F03: Confidence     5       3.90    0.73      3.99    0.88      -0.52    0.60
F04: Attitude 1     6       3.52    0.70      3.61    0.80      -0.54    0.59
F05: Attitude 2     6       3.49    0.57      3.42    0.76       0.55    0.58
F06: Tradition      24      3.51    0.66      3.60    0.69      -0.58    0.56
F07: Institutions   3       3.56    1.81      3.55    1.74       0.04    0.97
F08: Content        4       3.76    0.73      3.50    0.77       1.48    0.14
F09: Acceptance     8       3.75    0.69      3.68    0.84       0.38    0.71
F10: Training       5       2.51    0.69      2.73    0.69      -1.37    0.17
Overall             73      3.49    0.48      3.62    0.56      -1.07    0.29

M: Mean; SD: Standard Deviation

As shown in Table 33, the overall mean scores of blended learners were higher than the expected level of readiness in both countries. However, the mean score of the factor training (F10) was under the expected level of readiness in both countries. Besides, a significant difference (t[133] = -4.22, p < 0.001) was found on the measure of the items in the factor technology (F01) because the mean score of this factor was under the expected level of readiness (M0 = 3.40) in Turkey. Based on these results, training for e-learning is identified as an important area for succeeding at e-learning.

11.4.3 Differences between Blended and e-Learners in Selcuk University


Table 34 displays the number of items, mean and standard deviation scores of blended learners and electronic learners in Selcuk University, in addition to the results of the independent-samples t-test. Based on these results, the overall mean scores of both blended and electronic learners were higher than the expected level of readiness (M0 = 3.40). However, the mean score of e-learners (M = 3.51) was slightly higher than that of blended learners (M = 3.49). In spite of the higher readiness of e-learners, their mean scores on the items in the factors institutions (M = 2.07) and training (M = 2.10) were lower than the expected level of readiness. Based on this result, it can be inferred that e-learners did not believe that their institutions were ready for e-learning as much as blended learners did.

However, both blended and e-learners needed training for e-learning before embarking on it. Besides, significant differences between blended and e-learners in Selcuk University were also found on the measure of the items in all the factors except the factor attitude 2 (t[140] = -1.006, p > 0.05) and the factor content (t[140] = -0.238, p > 0.05).


Table 34: Statistics of Selcuk University for Blended and Electronic Learners

Factors             No of   Blended Learners   Electronic Learners   t Test
                    Items   (N=112)            (N=30)
                            M       SD         M       SD            t        p
F01: Technology     6       3.32    0.94       3.63    0.79          -1.64    0.10
F02: Experience     6       3.62    0.67       4.03    0.61          -3.02    0.00
F03: Confidence     5       3.90    0.74       4.19    0.59          -2.03    0.04
F04: Attitude 1     6       3.52    0.70       3.81    0.52          -2.10    0.04
F05: Attitude 2     6       3.49    0.57       3.60    0.47          -1.01    0.32
F06: Tradition      24      3.51    0.66       3.78    0.49          -2.15    0.03
F07: Institutions   3       3.57    1.81       2.07    1.66           4.11    0.00
F08: Content        4       3.75    0.73       3.79    0.79          -0.24    0.81
F09: Acceptance     8       3.74    0.70       4.11    0.56          -2.70    0.01
F10: Training       5       2.52    0.69       2.10    0.78           2.87    0.01
Overall             73      3.49    0.48       3.51    0.32          -0.19    0.85

M: Mean; SD: Standard Deviation

Besides, the mean score of blended learners for the factor technology was also slightly lower than the expected level of readiness. Based on this result, it can be interpreted that the blended learners at Selcuk University may not have sufficient access to the Internet at their residence and on campus. After examining the details of all the items in the factor technology, it was found that blended learners were not satisfied with the speed and stability of the Internet at their residence and university, though they had sufficient access at both home and campus.

11.4.4 Differences between Blended and e-Learners in Leicester University


In addition to the differences between blended and e-learners in Selcuk University, the mean scores of both blended and e-learners in the University of Leicester were also examined. Table 35 displays the number of items, mean and standard deviation scores of blended learners and electronic learners in the University of Leicester, in addition to the results of the independent-samples t-test.


Table 35: Statistics of Leicester University for Blended and Electronic Learners

Factors             No of   Blended Learners   Electronic Learners   t Test
                    Items   (N=21)             (N=118)
                            M       SD         M       SD            t        p
F01: Technology     6       4.18    0.77       4.28    0.66          -0.65    0.52
F02: Experience     6       3.86    0.82       4.14    0.54          -2.06    0.04
F03: Confidence     5       3.97    0.90       4.33    0.60          -2.30    0.02
F04: Attitude 1     6       3.60    0.82       3.92    0.65          -1.99    0.05
F05: Attitude 2     6       3.42    0.78       3.59    0.67          -1.03    0.31
F06: Tradition      24      3.60    0.71       3.82    0.55          -1.56    0.12
F07: Institutions   3       3.48    1.75       3.71    1.71          -0.58    0.56
F08: Content        4       3.53    0.78       3.70    0.72          -1.02    0.31
F09: Acceptance     8       3.69    0.86       3.85    0.62          -0.99    0.32
F10: Training       5       2.77    0.69       2.76    0.74           0.06    0.95
Overall             73      3.61    0.57       3.81    0.45          -1.80    0.08

M: Mean; SD: Standard Deviation

As seen in Table 35, the overall mean scores of blended learners (M = 3.61) and e-learners (M = 3.81) indicated that both groups were sufficiently ready for e-learning, though they needed training for e-learning. The mean score of the factor training was the only one below the expected level of readiness for both blended and e-learners. In sum, it seems that training for e-learning was a significant factor for succeeding at e-learning for students at the University of Leicester. Significant differences were also found on the measures of the factors experience (t[137] = -2.059, p < 0.05), confidence (t[137] = -2.229, p < 0.05) and attitude towards e-learning (t[137] = -1.987, p < 0.05).


CHAPTER 12: STRUCTURAL EQUATION MODELLING


12.1 Introduction
Factor analysis is used as a data reduction technique because it takes a large set of variables and looks for a way in which the data may be reduced or summarised using a smaller set of factors or components (Pallant, 2010). Factor analysis mainly consists of two parts, exploratory and confirmatory analysis, which are the main building blocks of structural equation modelling (SEM).

SEM is mainly used as a methodology for representing, estimating and testing a network of relationships between observed and unobserved variables (Suhr, 2006). Specifically, the analyses consisted of three parts. First, I used the SPSS package to implement an exploratory factor analysis on the items that I developed and measured using the close-ended questions (see Chapter 11).

Conducting a factor analysis involves three main steps, namely assessing the suitability of the data, extracting and rotating the factors, and interpreting them. I aimed to understand the patterns of correlation among a set of variables in the study to find out how well those variables represent their intended constructs. Second, I conducted a confirmatory factor analysis using the AMOS package to find out the causal relationships between factors, and finally I showed the validated model using path analysis with the support of the AMOS package.
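As a rough illustration of the exploratory step, the sketch below runs a principal component analysis on a participants-by-items matrix in MATLAB. The matrix `items` is a hypothetical placeholder; the analyses reported in this chapter were carried out with SPSS and AMOS.

```matlab
% Minimal sketch of the exploratory step on a (participants x items) matrix.
% 'items' is a placeholder for the 886 x 17 response matrix used in the study.
items = randn(886, 17);                   % placeholder data

[~, ~, latent] = pca(zscore(items));      % eigenvalues of the correlation matrix
explained = 100 * latent / sum(latent);   % percentage of variance per component

fprintf('%d components with eigenvalue > 1 (first explains %.1f%%)\n', ...
        sum(latent > 1), explained(1));
```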
12.2 The First Structural Equation Modelling
The 17 items of the scale were subjected to principal component analysis (PCA) (Pallant, 2010) using SPSS version 20. Prior to performing PCA, the suitability of the data for factor analysis was assessed, as given in Table 36.


Table 36: Statistics for the Suitability of the Data

Sample   No. of   Ratio of   % of Coefficients   KMO Measure of      Bartlett's Test
Size     Items    Items      greater than 0.3    Sampling Adequacy   of Sphericity
886      17       52.12      28.03               0.829               0.000

Inspection of the correlation matrix revealed the presence of many coefficients of 0.3 and above. That is to say, the percentage of coefficients of 0.3 and above among the items was calculated as 28.03%. The Kaiser-Meyer-Olkin value was 0.829, exceeding the recommended value of 0.6 (Kaiser, 1970, 1974), and Bartlett's Test of Sphericity (Bartlett, 1954) reached statistical significance, supporting the factorability of the correlation matrix. Principal components analysis revealed the presence of five components with eigenvalues exceeding 1, explaining 31.77%, 13.29%, 10.95%, 8.01% and 6.09% of the variance respectively. An inspection of the scree plot also revealed a clear break after the second and fourth components. Using Cattell's (1966) scree test, it was decided to retain four components. This was further supported by the results of Parallel Analysis, which showed only four components with eigenvalues exceeding the corresponding criterion values for a randomly generated data matrix of the same size (17 variables x 886 participants), as indicated in Table 37. The four-component solution explained a total of 64.09% of the variance, with component 1 contributing 31.77%, component 2 contributing 13.29%, component 3 contributing 10.95% and component 4 contributing 8.09%.
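The parallel-analysis decision rule described above can be sketched as follows. The code is illustrative only: a randomly generated placeholder stands in for the observed item responses, and it is not the procedure that produced the criterion values in Table 37.

```matlab
% Minimal sketch of parallel analysis for an N x p item matrix (here 886 x 17).
% A component is retained while its observed eigenvalue exceeds the mean
% eigenvalue obtained from randomly generated data of the same size.
N = 886; p = 17; nRep = 100;
items = randn(N, p);                          % placeholder for the observed responses

randEig = zeros(nRep, p);
for r = 1:nRep
    randEig(r, :) = sort(eig(corrcoef(randn(N, p))), 'descend')';
end
criterion = mean(randEig, 1);                 % criterion value per component

obsEig = sort(eig(corrcoef(items)), 'descend')';
retain = obsEig > criterion;                  % Accept / Reject decision per component
nRetain = sum(cumprod(double(retain)));       % count the leading 'Accept' decisions
fprintf('Components retained: %d\n', nRetain);
```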


Table 37: Identifying the Number of Factors

Component   Eigenvalue from PCA   Parallel Analysis   Decision
01          5.40                  1.25                Accept
02          2.26                  1.20                Accept
03          1.86                  1.16                Accept
04          1.38                  1.13                Accept
05          1.04                  1.10                Reject
06          0.82                  1.07                Reject
07          0.75                  1.04                Reject
08          0.65                  1.02                Reject
09          0.55                  0.99                Reject
10          0.53                  0.97                Reject
11          0.45                  0.94                Reject
12          0.36                  0.92                Reject
13          0.32                  0.90                Reject
14          0.21                  0.87                Reject
15          0.21                  0.84                Reject
16          0.14                  0.81                Reject
17          0.08                  0.78                Reject

To aid in the interpretation of these components, direct oblimin rotation was performed. The rotated solution revealed the presence of simple structure (Thurstone, 1947), with all components showing a number of strong loadings and all variables loading substantially on only one component. The interpretation of the four components was consistent with our initial research on the scale: the items designed to measure the confidence of individuals with ICT loaded strongly on component 1, the items designed to measure the access, stability and speed of the Internet at campus loaded strongly on component 2, the items designed to measure the access, stability and speed of the Internet at home loaded strongly on component 3, and the items designed to measure the experience of individuals with ICT loaded strongly on component 4, as indicated in Table 38. There was a weak positive correlation between components 1 and 2 (r = 0.20), a weak positive correlation between components 1 and 3 (r = 0.24), and a correlation between components 2 and 3 (r = 0.99).


Table 38: Pattern and Structure Matrix for PCA with Oblimin Rotation of the Four-Factor Solution

Item     Pattern Coefficients              Structure Coefficients            C
         1      2      3      4            1      2      3      4
F1: Confidence with ICT
I14      0.94   0.01   0.04  -0.06         0.92   0.19   0.25   0.36         0.86
I15      0.91   0.03   0.02  -0.02         0.91   0.21   0.23   0.38         0.83
I13      0.89   0.03   0.06  -0.01         0.91   0.22   0.27   0.40         0.83
I16      0.86  -0.02  -0.03  -0.01         0.85   0.15   0.17   0.36         0.72
I17      0.41  -0.09   0.03   0.30         0.53   0.05   0.18   0.47         0.36
F2: Internet at campus
I05      0.02   0.94   0.07   0.00         0.23   0.95   0.17   0.20         0.91
I06      0.04   0.93   0.05   0.00         0.24   0.94   0.15   0.20         0.89
I04     -0.08   0.87  -0.07   0.03         0.09   0.85   0.00   0.14         0.74
F3: Internet at home
I02      0.08   0.10   0.89  -0.01         0.30   0.20   0.91   0.23         0.85
I03      0.04   0.10   0.85   0.05         0.28   0.20   0.88   0.26         0.79
I01     -0.05  -0.12   0.81  -0.01         0.11  -0.05   0.79   0.12         0.64
F4: Experience with ICT
I11     -0.12  -0.09   0.01   0.72         0.41   0.22   0.22   0.72         0.54
I08     -0.08   0.05   0.08   0.72         0.26   0.17   0.21   0.71         0.51
I09      0.10   0.08   0.05   0.65         0.18   0.02   0.13   0.65         0.45
I12      0.11   0.10  -0.11   0.49         0.57   0.20   0.20   0.59         0.47
I10      0.04   0.00   0.01   0.45         0.32   0.20   0.03   0.54         0.32
I07      0.37   0.04   0.02   0.42         0.24   0.09   0.12   0.47         0.22

C: Communalities

Besides, a strong positive correlation was also found between components 1 and 4 (r = 0.44). After the four-component solution was obtained, a reliability analysis was also implemented using version 20 of the SPSS package to consider how confident I can be about using the solution to measure the factors. The Cronbach's alpha coefficient was 0.832, exceeding the recommended value of 0.7 (DeVellis, 2003). This is quite high, but I also examined whether the removal of one or more items from the scale might improve the internal consistency of those remaining. However, no further improvements were made by the removal of any other variables from the scale. So far, the number of factors present and whether those factors were correlated have been determined, the factors have been named, and the reliability analysis has been carried out.
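For reference, the formula behind this coefficient can be computed directly from an item matrix, as in the minimal sketch below; the matrix `items` is a placeholder, and the value of 0.832 reported above was obtained from SPSS.

```matlab
% Minimal sketch of Cronbach's alpha for a (participants x items) matrix.
% 'items' is a placeholder; the coefficient reported in the text came from SPSS.
items = randn(886, 17);                  % placeholder data
k = size(items, 2);

itemVar  = var(items, 0, 1);             % variance of each item
totalVar = var(sum(items, 2));           % variance of the summed scale score
alpha = (k / (k - 1)) * (1 - sum(itemVar) / totalVar);
fprintf('Cronbach''s alpha = %.3f\n', alpha);
```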
Now, I began by specifying the measurement model that I wish to test in the form of an input diagram, as illustrated in Figure 17, and the structural model, as indicated in Figure 18.


In the diagrams below, the hypothesized factors are represented by circles and the measured indicators of the factors are shown as rectangles. There are five indicators of Confidence with ICT (the subsets: Computers (I13), Web Browsers (I14), Search Engines (I15), Digital Files (I16) and Authoring Tools (I17)), three indicators of Internet at Campus (the subsets: Access (I04), Stability (I05) and Speed (I06)), three indicators of Internet at Home (the subsets: Access (I01), Stability (I02) and Speed (I03)), and six indicators of Experience with ICT (the subsets: Internet (I07), E-mail (I08), Office Software (I09), Social Network Sites (I10), Instant Messaging (I11) and Engineering Software (I12)).

Since the variables are only partly explained by the factors, each variable must also have a residual (the unexplained part). This residual variance is 1 − R², where R is just the beta value, that is, the correlation between the factor and the variable. However, the software I used, AMOS, displays R² rather than the residual. I denote these residual terms by the letter e.
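In standard notation (introduced here only for clarity; it is not taken from the AMOS output), this relationship can be written as:

```latex
% Standardized measurement equation for item x_i loading on factor F:
x_i = \lambda_i F + e_i, \qquad R_i^2 = \lambda_i^2, \qquad \operatorname{Var}(e_i) = 1 - R_i^2
```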
Now, there are two main hypotheses to be considered here: first, I can ask whether the four-factor model fits the data and, second, I am also interested in whether there is significant covariance between the factors. The confirmatory factor analysis was done using version 20 of AMOS Graphics, and the output path diagram, with beta coefficients and R² values entered, is illustrated in Figure 17. The chi-square for goodness of fit was calculated as 479.4 with 113 degrees of freedom, which is significant at p < 0.001. The NFI and CFI were calculated as 0.941 and 0.954, which are close to 1; values close to 1 are generally considered to indicate a good fit. Besides, the RMSEA index for the default model was computed as 0.061, which is lower than 0.1; values greater than 0.1 indicate a bad fit. Based on these results, it can be inferred that the model is satisfactory and hence fits the data.


Figure 17: Diagram representing a measurement model for the four-factor solution


Figure 18: The Input Path Diagram for a Structural Equation Modelling


12.3 The Second Structural Equation Modelling


The 23 items of the scale were subjected to principal components analysis (PCA) using
SPSS version 20. Prior to performing PCA, the suitability of data for factor analysis was
assessed as given in Table 39.
Table 39: Statistics for the Suitability of the Data

Sample   No. of   Ratio of   % of Coefficients   KMO Measure of      Bartlett's Test
Size     Items    Items      greater than 0.3    Sampling Adequacy   of Sphericity
886      23       38.52      36.84               0.881               0.000

Inspection of the correlation matrix revealed the presence of many coefficients of 0.3 and above. That is to say, the percentage of coefficients of 0.3 and above among the items was calculated as 36.84%. The Kaiser-Meyer-Olkin value was 0.881, exceeding the recommended value of 0.6 (Kaiser, 1970, 1974), and Bartlett's Test of Sphericity (Bartlett, 1954) reached statistical significance, supporting the factorability of the correlation matrix.

Principal components analysis revealed the presence of six components with eigenvalues exceeding 1, explaining 31.94%, 13.83%, 6.57%, 5.89%, 5.09% and 4.49% of the variance respectively. An inspection of the scree plot also revealed a clear break after the third and sixth components. Using Cattell's (1966) scree test, it was decided to retain six components. This was almost fully supported by the results of Parallel Analysis, which showed only five components with eigenvalues exceeding the corresponding criterion values for a randomly generated data matrix of the same size (23 variables x 886 participants), as indicated in Table 40.


Table 40: Identifying the Number of Factors

Component   Eigenvalue from PCA   Parallel Analysis   Decision
01          7.35                  1.30                Accept
02          3.18                  1.25                Accept
03          1.51                  1.22                Accept
04          1.35                  1.19                Accept
05          1.17                  1.16                Accept
06          1.03                  1.13                Accept
07          0.86                  1.10                Reject
08          0.77                  1.08                Reject
09          0.66                  1.06                Reject
10          0.64                  1.04                Reject
11          0.55                  1.01                Reject
12          0.52                  0.99                Reject
13          0.51                  0.97                Reject
14          0.47                  0.95                Reject
15          0.39                  0.93                Reject
16          0.37                  0.91                Reject
17          0.37                  0.89                Reject
18          0.32                  0.87                Reject
19          0.28                  0.84                Reject
20          0.22                  0.82                Reject
21          0.19                  0.80                Reject
22          0.16                  0.77                Reject
23          0.14                  0.73                Reject

The six-component solution explained a total of 67.80% of the variance. To aid in the interpretation of these components, direct oblimin rotation was performed. The rotated solution revealed the presence of simple structure (Thurstone, 1947), with all components showing a number of strong loadings and all variables loading substantially on only one component. The interpretation of the six components was almost consistent with our initial research on the scale: the items designed to measure the confidence of individuals with ICT loaded strongly on component 1, the items designed to measure the attitudes of individuals towards their teachers and peers loaded strongly on component 2, the items designed to measure the imaginary attitudes of individuals towards e-learning loaded strongly on component 3, the items designed to measure the individual experiences of individuals with ICT loaded strongly on component 4, the items designed to measure the solid attitudes of individuals towards e-learning loaded strongly on component 5, and the items designed to measure the social experiences of individuals with ICT loaded strongly on component 6, as indicated in Table 41.


Table 41: Pattern Matrix for PCA with Oblimin Rotation of the Six-Factor Solution
Items

Naming
Components

I14
I15
I13
I16
I17
I26
I24
I25
I27
I28
I29
I23
I22
I20
I08
I09
I12
I07
I18
I19
I21
I10
I11

F1

Confidence
with ICT

F2

Attitudes
towards
peers and
teachers

F3

Imaginary Attitude
towards e-learning

F4

Individual
Experience
with ICT

F5

Solid Attitude
towards e-learning

F6

Social Experience
with ICT

Pattern Coefficients
1

0.92
0.90
0.88
0.85
0.32
0.00
-0.01
0.01
0.08
-0.05
0.03
0.03
0.05
0.10
-0.14
0.06
0.04
0.33
0.07
0.21
-0.01
0.15
-0.09

-0.02
0.02
-0.02
0.05
0.01
0.84
0.83
0.77
0.76
0.66
0.53
0.06
0.01
-0.01
-0.02
0.03
0.05
-0.08
0.16
0.07
0.01
-0.01
0.03

0.04
0.01
-0.03
-0.02
0.03
-0.06
0.23
0.21
-0.17
-0.39
-0.51
-0.79
-0.78
-0.55
-0.04
0.01
0.01
-0.12
-0.04
-0.04
-0.37
-0.04
0.06

0.01
-0.01
0.02
-0.01
0.21
-0.02
0.01
0.08
0.00
0.01
0.05
0.09
0.07
0.04
0.76
0.71
0.66
0.56
0.02
0.13
-0.04
-0.11
0.18

-0.07
-0.02
-0.03
0.04
-0.29
0.04
-0.19
-0.24
0.10
0.11
0.15
-0.07
-0.16
-0.47
-0.04
0.01
-0.01
0.12
-0.71
-0.67
-0.62
0.10
-0.12

-0.02
0.05
0.03
0.03
0.02
-0.01
0.03
0.05
0.00
0.00
0.03
0.03
0.00
0.01
0.12
0.11
-0.15
0.04
0.07
-0.08
0.11
0.84
0.73

Co.
0.86
0.83
0.83
0.73
0.38
0.72
0.73
0.73
0.67
0.70
0.69
0.75
0.74
0.70
0.59
0.60
0.44
0.57
0.68
0.69
0.62
0.72
0.63

No strong correlations were found among the majority of the factors. For example, there was a weak positive correlation between components 1 and 2 (r = 0.11), a weak negative correlation between components 1 and 3 (r = -0.23) and a weak correlation between components 1 and 6 (r = 0.21). However, a strong positive correlation was found between components 1 and 4 (r = 0.42) and a strong negative correlation between components 1 and 5 (r = -0.31). After the six-component solution was obtained, a reliability analysis was also implemented using version 20 of the SPSS package to consider how confident I can be about using the solution to measure the factors.


Table 42: Structure Matrix for PCA with Oblimin Rotation of the Six-Factor Solution
Items

Naming
Components

I14
I15
I13
I16
I17
I26
I24
I25
I27
I28
I29
I23
I22
I20
I08
I09
I12
I07
I18
I19
I21
I10
I11

F1

Confidence
with ICT

F2

Attitudes
towards
peers and
teachers

F3

Imaginary Attitude
towards e-learning

F4

Individual
Experience
with ICT

F5

Solid Attitude
towards e-learning

F6

Social Experience
with ICT

Structure Coefficients
1

0.92
0.91
0.91
0.85
0.49
0.08
0.10
0.17
0.17
0.09
0.19
0.28
0.30
0.39
0.38
0.22
0.55
0.29
0.35
0.47
0.28
0.26
0.16

0.08
0.12
0.10
0.15
0.14
0.85
0.81
0.79
0.79
0.75
0.66
0.33
0.29
0.28
0.17
0.14
0.08
0.15
0.35
0.27
0.26
0.09
0.16

-0.17
-0.20
-0.23
-0.23
-0.13
-0.30
-0.06
-0.39
-0.09
-0.56
-0.66
-0.85
-0.84
-0.66
-0.14
-0.16
-0.25
-0.11
-0.25
-0.24
-0.49
-0.14
-0.07

0.39
0.38
0.40
0.35
0.43
0.13
0.18
0.17
0.28
0.14
0.21
0.28
0.28
0.32
0.76
0.75
0.68
0.65
0.33
0.42
0.24
0.13
0.35

-0.34
-0.30
-0.32
-0.24
-0.45
-0.16
-0.35
-0.14
-0.42
-0.10
-0.10
-0.28
-0.35
-0.62
-0.25
-0.25
-0.17
-0.21
-0.80
-0.78
-0.69
-0.05
-0.25

0.18
0.24
0.23
0.21
0.18
0.10
0.14
0.12
0.19
0.11
0.16
0.17
0.15
0.18
0.29
0.28
0.23
0.02
0.22
0.11
0.24
0.83
0.77

Co.
0.86
0.83
0.83
0.73
0.38
0.72
0.73
0.73
0.67
0.70
0.69
0.75
0.74
0.70
0.59
0.60
0.44
0.57
0.68
0.69
0.62
0.72
0.63

The Cronbach's alpha coefficient was 0.891, exceeding the recommended value of 0.7 (DeVellis, 2003). This is quite high, but I also examined whether the removal of one or more items from the scale might improve the internal consistency of those remaining. However, no further improvements were made by the removal of any other variables from the scale. So far, the number of factors present and whether those factors were correlated have been determined, the factors have been named, and the reliability analysis has been carried out.


Now, I began by specifying the measurement model that I wish to test in the form of an input diagram, as illustrated in Figure 20. In the diagrams below, hypothesized factors are represented by circles and the measured indicators of the factors are shown as rectangles. There are five indicators of Confidence with ICT (the subsets: Computers (I13), Web Browsers (I14), Search Engines (I15), Digital Files (I16) and Authoring Tools (I17)), four indicators of Individual Experience with ICT (the subsets: Internet (I07), E-mail (I08), Office Software (I09) and Engineering Software (I12)), two indicators of Social Experience with ICT (the subsets: Social Network Sites (I10) and Instant Messaging (I11)), three indicators of Solid Attitudes towards e-learning (the subsets: Enough Information (I18), Enough ICT Competencies (I19) and Enough Time (I21)), three indicators of Imaginary Attitudes towards e-learning (the subsets: Feeling Ready (I20), Supporting E-learning (I22) and Liking E-learning (I23)), and six indicators of Attitudes towards peers and teachers (the subsets: Enough Information for Teachers (I24) and Peers (I25), Teachers (I26) and Peers (I27) Supporting E-learning, and Teachers (I28) and Peers (I29) Liking E-learning).

Since the variables are only partly explained by the factors, each variable must also have a residual (the unexplained part). This residual variance is 1 − R², where R is just the beta value, that is, the correlation between the factor and the variable. However, the software I used, AMOS, displays R² rather than the residual. I denote these residual terms by the letter e. Now, there are two main hypotheses to be considered here: first, I can ask whether the six-factor model fits the data and, second, I am also interested in whether there is significant covariance between the factors.

The confirmatory factor analysis was done using version 20 of AMOS Graphics, and the output path diagram, with beta coefficients and R² values entered, is illustrated in Figure 19 and Figure 20. The chi-square for goodness of fit was calculated as 1723.4 with 215 degrees of freedom, which is significant at p < 0.001. The NFI and CFI were calculated as 0.841 and 0.857, which seem reasonably close to 1; values close to 1 are generally considered to indicate a good fit. Besides, the RMSEA index for the default model was computed as 0.089, which is lower than 0.1; values greater than 0.1 indicate a bad fit. Based on these results, it can be inferred that the model is satisfactory and hence fits the data.


Figure 19: Diagram representing a measurement model for the six-factor solution


Figure 20: The Input Path Diagram for a Structural Equation Modelling


CHAPTER 13: EVALUATING E-LEARNING


13.1 Introduction
To ensure that the actual benefit of e-learning is valid in different contexts, I developed a model by comparing the attitudes of teachers and students towards e-learning in Part II. However, it was not possible to assess the pedagogical value of e-learning without evaluating it. Hence, a model for conducting a case-control study was developed to find out the effects of three types of learning, namely e-learning, blended learning and traditional learning (see Chapter 10). After the development of the model, the e-learning platform Moodle was selected and installed, a number of e-learning materials were developed and the readiness of the students for e-learning was measured. Now, it is time to analyse the results of the pre-, middle and post-placement tests for each course to find out whether e-learning was effective. Hence, the main goal of this chapter is to investigate the pedagogical value of e-learning, blended and traditional learning by offering a course about Programming with MATLAB to students in HEIs in Turkey and the United Kingdom. In order to analyse the pedagogical value of e-learning, the knowledge increase between the pre- and post-placement tests is mainly used in this chapter.

13.2 Results and Discussion


This section is divided into three parts: the first part reports the results of the pre-placement test, the second part reports the results of the middle test, and the results of the post-placement test are analysed in detail in the last part.

13.2.1 Measuring Students' Knowledge at the Beginning of the Courses


At the beginning of each course, including e-learning, blended and traditional learning, the knowledge of students about the course contents (e.g. Programming with MATLAB) was measured using a placement test. In addition to the descriptive analyses of the results, one-way ANOVA was also used to verify the statistical significance of differences in mean scores between e-learners, blended learners and traditional learners.
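As an illustration of this comparison across the three modes of learning, a minimal MATLAB sketch is given below; the marks and group labels are placeholder values, and the analyses reported in this chapter were computed with SPSS.

```matlab
% Minimal sketch of a one-way ANOVA comparing the three modes of learning
% on placement-test marks (scores and labels are placeholders).
scores    = [26 31 22 14 11 9 2 3 1];                      % example marks out of 100
learnMode = {'e','e','e','blended','blended','blended', ...
             'traditional','traditional','traditional'};

[p, tbl, ~] = anova1(scores, learnMode, 'off');            % 'off' hides the boxplot
fprintf('F(%d,%d) = %.2f, p = %.3f\n', tbl{2,3}, tbl{3,3}, tbl{2,5}, p);
```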


To assess the knowledge of students about Programming with MATLAB, six questions for each topic were designed using different types of questions such as multiple choice, short answer, multiple short text, numerical input and multiple numerical input. For example, Table 43 illustrates the questions designed for Topic 5 and their options (6 questions in total). As seen in Table 43, more than one question for each topic was designed to measure the knowledge of students. Moreover, the questions were designed to measure not only the knowledge of students but also other skills, such as their attention. For example, question 29 was designed to find out which option returns the value of 9 if L is a 5-dimensional array, but the matrix A is used in the options. It was expected that careful students would choose option (e) (None). However, some students also chose option (a) (length(A) + ismatrix(A)) since length(L) + ismatrix(L) would return the value of 9.
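The reasoning behind question 29 can be checked directly in MATLAB, using the array L named in the question:

```matlab
% Q29 check: for a 5-dimensional array L of size 8x2x6x9x3,
% length(L) returns the largest dimension (9) and ismatrix(L) returns 0 (false),
% so length(L) + ismatrix(L) evaluates to 9.
L = rand(8, 2, 6, 9, 3);
disp(length(L) + ismatrix(L))   % displays 9
```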

Table 44 illustrates the number of students who took the placement test and the mean score of their marks for each topic. Moreover, it also shows the results of the one-way ANOVA used to verify the statistical differences between e-learners, blended learners and traditional learners. For Course 1 (Fundamentals of MATLAB, 6 topics), the majority of the participants were studying in HEIs in Turkey (90.58%, 481) and the remaining in the United Kingdom (9.42%, 50). The participating students in Course 1 were categorized as follows: 70.69% e-learners, 22.45% blended learners and 6.86% traditional learners in Turkey, and 62.00% e-learners, 36.00% blended learners and 2.00% traditional learners in the United Kingdom. For Course 2 (Programming with MATLAB, 6 topics), the majority of the participants were also studying in HEIs in Turkey (91.53%, 400) and the remaining in the United Kingdom (8.43%, 50).


Table 43: Sample Questions for Topic 5 (Multi-dimensional Arrays)

Q25. A is a 3-dimensional array as below. If B = A(31) + A(1,2,3), what is the value of B?
     Please choose only one of the following: (3.5 points)
     a) 309   b) 218   c) 24   d) 9012   e) None

Q26. Which of the following functions does not create a 4-dimensional array?
     Please choose only one of the following: (3.5 points)
     a) rand(1,2,6,4)   b) cat(4,2,2,2)   c) rand(1,1,1,1)   d) zeros(4,3,2,6)   e) None

Q27. What is the main purpose of the cat function in MATLAB?
     Please write your answer here: (3.5 points)

Q28. What is the main difference between a 3-dimensional array and a matrix? (3.5 points)

Q29. Which of the following returns a value of 9 if L is a 5-dimensional array of size
     8x2x6x9x3? Please choose only one of the following: (3.5 points)
     a) length(A) + ismatrix(A)   b) numel(A)   c) length(A)   d) size(A)   e) None

Q30. What command displays the number of elements in the array K if K is a 3-dimensional
     array of size 4x4x4? (3.5 points)


The participating students in Course 2 were categorized as follows: 65.00% e-learners, 26.75% blended learners and 8.25% traditional learners in Turkey, and 56.75% e-learners, 40.54% blended learners and 2.70% traditional learners in the United Kingdom. For Course 3 (Practices with MATLAB, 3 topics), the knowledge of students was not measured at the beginning since the topics were designed specifically for this study and were totally new in the field of MATLAB. For example, Topic 13 was about reading data from an external file, analysing it and writing it to an external file. Topic 14 covered the control of a PIC16F528 to develop the hands-on skills of students and aimed to teach how to switch LEDs on and off using a timer and how to control a race car or an artificial arm. Since the topics were specific, it was expected that no students would have prior knowledge about those topics; therefore their knowledge at the beginning of the course was not measured and was assumed to be 0.0.


Table 44: The Results of Pre-Placement Test for Turkey and United Kingdom

PROGRAMMING

FUNDAMENTALS

Topic

Max
Range

09
Points

0 12
Points

0 15
Points

0 18
Points

0 21
Points

0 24
Points

0-100
Points

09
Points

0 12
Points

0 15
Points

10

0 18
Points

11

0 21
Points

12

0 24
Points

0-100
Points

M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3

N
340
108
33
340
108
33
340
108
33
340
108
33
340
108
33
340
108
33
340
108
33
260
107
33
260
107
33
260
107
33
260
107
33
260
107
33
260
107
33
260
107
33

Turkey
M
F (p)

United Kingdom
N
M
F(p)

2.50
0.69
0.00
5.42
1.15
0.00
5.54
1.18
0.00
4.94
1.39
0.00
1.96
0.96
0.00
4.33
1.17
0.00
25.69
7.53
1
4.44
3.82
4.01
6.64
7.42
7.17
5.32
8.10
8.83
5.44
5.60
5.27
4.11
5.32
5.30
2.36
1.51
1.21
29.32
32.77
32.79

31
18
1
31
18
1
31
18
1
31
18
1
31
18
1
31
18
1
31
18
1
21
15
1
21
15
1
21
15
1
21
15
1
21
15
1
21
15
1
21
15
1

68.01
(0.00)
75.54
(0.00)
59.13
(0.00)
38.24
(0.00)
8.12
(0.00)
31.39
(0.00)
65.62
(0.000)
2.43
(0.08)
2.23
(0.11)
25.93
(0.00)
0.073
(0.93)
3.63
(0.03)
2.03
(0.13)
1.81
(0.16)

3.29
3.02
6.00
7.68
8.67
11.00
7.61
9.95
12.50
6.97
8.58
6.00
4.52
4.47
0.00
6.90
7.56
12.00
37.96
43.25
48.50
4.60
5.92
7.50
7.24
8.87
11.50
5.30
7.42
12.50
5.57
6.40
15.00
5.83
5.13
14.00
2.00
4.80
24.00
31.54
39.54
85.50

1.17
(0.32)
0.614
(0.545)
2.00
(0.15)
0.54
(0.59)
0.544
(0.584)
0.470
(0.628)
0.609
(0.548)
1.087
(0.349)
1.847
(0.173)
1.549
(0.227)
1.281
(0.291)
1.064
(0.356)
8.931
(0.001)
3.055
(0.060)

M1: E-learner, M2: Blended, M3: Traditional Learner


N
371
126
34
371
126
34
371
126
34
371
126
34
371
126
34
371
126
34
371
126
34
281
122
34
281
122
34
281
122
34
281
122
34
281
122
34
281
122
34
281
122
34

Both
M
2.57
1.03
0.18
5.61
2.22
0.32
5.71
2.43
0.37
5.11
2.42
0.18
2.17
1.46
0.00
4.54
2.08
0.35
26.71
12.64
2.4
4.46
4.08
4.11
6.69
7.59
7.29
5.32
8.01
8.93
5.45
5.70
5.56
4.24
5.30
5.56
2.33
1.92
1.88
29.48
33.60
34.34

F(p)
54.160
(0.000)
52.874
(0.000)
39.191
(0.000)
27.597
(0.000)
7.266
(0.000)
22.261
(0.000)
45.937
(0.000)
1.006
(0.366)
3.416
(0.034)
26.632
(0.000)
0.113
(0.893)
3.213
(0.041)
0.421
(0.657)
2.882
(0.057)

For Course 1, Table 44 shows the mean scores of the students' knowledge about the fundamentals of MATLAB (e.g. history, matrices, arrays, the workspace and digital images). It shows that there are significant differences between e-learners, blended learners and traditional learners in terms of the results of the placement test. The overall mean scores for Course 1 show that the mean score of e-learners is 26.71 out of 100 points, that of blended learners 12.64 out of 100 points and that of traditional learners 2.40 out of 100. However, the overall mean scores of the students in Turkey and the United Kingdom were quite different. First, the overall mean scores of the students in the United Kingdom were higher than those in Turkey for each mode. Second, there were no significant differences between e-learners, blended and traditional learners in the United Kingdom.

For Course 2, Table 44 also displays the mean scores of the students' knowledge about Programming with MATLAB (e.g. variables, conditionals, sorting algorithms and GUIs). Overall, I found that there were no significant differences between electronic, blended and traditional learners. However, the mean scores of both blended and traditional learners were higher than those of e-learners at the beginning of the course. The mean scores were calculated as 29.48 out of 100 points for e-learners, 33.60 out of 100 points for blended learners and 34.34 out of 100 points for traditional learners. However, the results of the placement test might be misleading because the backgrounds of students in Turkey and the United Kingdom are quite different. For this reason, I also calculated the mean score of each topic and the overall mean score of each course for Selcuk University and the University of Leicester to find out the differences between e-learners, blended and traditional learners, as illustrated in Table 45.


Table 45: The Results of Pre-Placement Test for Selcuk and Leicester University

PROGRAMMING

FUNDAMENTALS

Topic

Max
Range

09
Points

0 12
Points

0 15
Points

0 18
Points

0 21
Points

0 24
Points

0 100
Points

09
Points

0 12
Points

0 15
Points

10

0 18
Points

11

0 21
Points

12

0 24
Points

0 100
Points

M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3

N
23
107
33
23
107
33
23
107
33
23
107
33
23
107
33
23
107
33
23
107
33
21
106
33
21
106
33
21
106
33
21
106
33
21
106
33
21
106
33
21
106
33

Selcuk
M
F (p)

0.72
0.69
0.00
1.26
1.16
0.00
1.59
1.19
0.00
1.96
1.40
0.00
0.46
0.96
0.00
1.28
1.14
0.00
8.26
7.54
1.00
5.13
3.86
4.01
7.90
7.43
7.17
9.29
8.17
8.83
9.74
5.65
5.27
6.23
5.37
5.30
3.33
1.53
1.21
42.62
33.01
32.79

26
17
1
26
17
1
26
17
1
26
17
1
26
17
1
26
17
1
26
17
1
17
14
1
17
14
1
17
14
1
17
14
1
17
14
1
17
14
1
17
14
1

3.094
(0.048)
2.796
(0.064)
2.425
(0.092)
3.111
(0.047)
1.843
(0.162)
1.765
(0.174)
2.754
(0.067)
4.409
(0.014)
0.744
(0.477)
1.085
(0.340)
12.766
(0.000)
0.438
(0.646)
3.164
(0.045)
6.495
(0.002)

Leicester
M
F(p)
3.53
3.02
6.00
8.31
8.47
11.00
8.69
10.00
12.50
7.85
8.21
6.00
4.71
4.53
0.00
7.10
6.82
12.00
41.18
42.05
48.5
5.59
5.70
7.50
8.06
8.64
11.50
6.40
7.68
12.50
6.35
5.79
15.00
6.59
4.75
14.00
2.00
4.00
24.00
35.99
37.56
85.50

1.411
(0.256)
0.229
(0.796)
0.824
(0.446)
0.087
(0.916)
0.531
(0.592)
0.452
(0.640)
0.092
(0.912)
0.24
(0.801)
0.722
(0.494)
0.837
(0.443)
1.144
(0.332)
1.249
(0.302)
8.542
(0.001)
2.357
(0.113)

M1: E-learner, M2: Blended, M3: Traditional Learner


N
49
124
34
49
124
34
49
124
34
49
124
34
49
124
34
49
124
34
49
124
34
38
120
34
38
120
34
38
120
34
38
120
34
38
120
34
38
120
34
38
120
34

Both
M
2.21
1.01
0.18
5.00
2.16
0.32
5.36
2.40
0.37
5.08
2.33
0.18
2.71
1.45
0.00
4.37
1.92
0.35
25.723
12.28
2.40
5.34
4.07
4.11
7.97
7.57
7.29
7.99
8.11
8.93
8.22
5.67
5.56
6.39
5.30
5.56
2.74
1.82
1.88
39.65
33.54
34.89

F(p)
13.503
(0.000)
15.418
(0.000)
13.480
(0.00)
12.918
(0.000)
6.550
(0.002)
9.522
(0.000)
14.997
(0.000)
0.022
(0.979)
5.701
(0.004)
4.886
(0.008)
0.342
(0.710)
0.434
(0.648)
0.594
(0.553)
2.694
(0.070)

Table 45 shows the mean scores of students at the University of Leicester and Selcuk University regarding their placement test results. It shows that there are significant differences between e-learners, blended learners and traditional learners in Selcuk University according to their knowledge about the Fundamentals of MATLAB at the beginning of Course 1. However, no significant differences were found for students at the University of Leicester. Moreover, the mean scores of students at the University of Leicester for each mode were much better than those at Selcuk University at the beginning of the courses.

The overall mean scores of the courses and the mean scores of each topic show that the prior knowledge of students varies. They also indicate that the knowledge of e-learners at the beginning of the course was better than that of blended and traditional learners. However, this does NOT give any clue about the pedagogical value of e-learning, blended and traditional learning. For this reason, it is important to measure the students' knowledge about MATLAB at the beginning of each course. In this way, it is possible to find out the pedagogical value of e-learning, blended and traditional learning, because I will be able to calculate the increment or decrement in the knowledge of students after the end of the courses.


13.2.2 Measuring Students' Knowledge in the Middle of the Courses


After measuring the prior knowledge of the students at the beginning of each course, three different courses were started using the blended, e-learning and traditional approaches. For each topic, the blended and e-learners were first asked to study an e-book, which was specifically designed. For the traditional learners, the printed version of the e-book was handed out, or a sample of it was placed in the photocopy room of the respective department. In the middle of each course, including e-learning, blended and traditional learning, the knowledge of students about the course contents (e.g. Programming with MATLAB) was measured using a quiz. Table 46 shows the mean scores of the questions in the quiz for each topic and the overall result. In addition to the descriptive analyses of the results, one-way ANOVA was also used to verify the statistical significance of differences in mean scores between e-learners, blended learners and traditional learners.

Similarly, to assess the knowledge of students about Programming with MATLAB in the middle of the courses, six questions for each topic were also designed using different types of questions such as multiple choice, short answer, multiple short text, numerical input and multiple numerical input. As illustrated in Table 46, 275 students studied the e-book or its printed version (Step 1), solved e-exercise A (Step 2), attended the lectures on campus or online (Step 3), and solved e-exercise B (Step 4) for each topic in Course 1 (Fundamentals of MATLAB), 220 students in Course 2 (Programming with MATLAB) and 233 students in Course 3 (Practices with MATLAB).


Table 46: The result of the Quiz for Turkey and United Kingdom

PRACTICE

PROGRAMMING

FUNDAMENTALS

Max
Range
1

09
Points

0 12
Points

0 15
Points

0 18
Points

0 21
Points

0 24
Points

0 100
Points

09
Points

0 12
Points

0 15
Points

10

0 18
Points

11

0 21
Points

12

0 24
Points

0 100
Points

13

0 27
Points

14

0 33
Points

15

0 39
Points

0 100
Points

M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3

N
121
103
31
121
103
31
121
103
31
121
103
31
121
103
31
121
103
31
121
103
31
82
93
30
82
93
30
81
93
30
82
93
30
82
93
30
82
93
30
82
93
30
64
109
52
64
109
52
64
109
52
64
109
52

Turkey
M
F (p)
6.57
99.585
3.13
(0.000)
1.95
11.48
65.586
7.44
(0.000)
6.97
9.95
140.247
1.97
(0.000)
0.40
15.99
64.041
9.90
(0.000)
7.62
16.31
111.381
5.30
(0.000)
2.94
13.45
126.521
0.55
(0.000)
0.65
74.75
269.084
29.29
(0.000)
21.52
7.48
99.432
3.83
(0.000)
2.87
7.88
43.410
3.45
(0.000)
1.99
10.15
19.665
6.63
(0.000)
5.04
16.02
69.411
4.45
(0.000)
5.10
15.73
105.802
2.43
(0.000)
1.05
12.48
47.911
3.11
(0.000)
2.00
70.71
143.384
24.90
(0.000)
19.05
24.19
22.045
13.75
(0.000)
13.93
27.84
37.183
12.05
(0.000)
11.15
31.88
37.179
14.01
(0.000)
9.13
84.91
50.818
40.81
(0.000)
35.22

United Kingdom
N
M
F(p)
8
7.19
6.501
12
5.04
(0.020)
8
12.00
0.655
12 11.75
(0.429)
8
8.44
1.796
12 11.25
(0.197)
8
14.06
1.631
12 16.50
(0.218)
8
7.00
0.485
12
9.92
(0.495)
8
8.44
1.078
12 13.63
(0.313)
8
58.13
1.188
12 69.08
(0.290)
0
6
8.33
3.837
9
7.11
(0.072)
6
9.20
0.308
9
8.44
(0.589)
6
9.92
0.126
9
10.97
(0.728)
6
18.00
0.650
9
16.67
(0.435)
6
12.67
0.208
9
14.78
(0.656)
6
8.00
1.232
9
13.67
(0.287)
6
67.12
0.265
9
72.64
(0.615)
0
3
27.00
5
27.00
3
23.83
1.637
5
32.45
(0.248)
3
26.00
1.875
5
39.00
(0.220)
3
77.83
5
99.45 1.779
(0.231)
0
-

N
129
115
31
129
115
31
129
115
31
129
115
31
129
115
31
129
115
31
129
115
31
88
102
30
88
102
30
87
102
30
88
102
30
88
102
30
88
102
30
88
102
30
67
114
52
67
114
52
67
114
52
67
114
52

Both
M
F(p)
6.61
99.769
3.33
(0.000)
1.95
11.51
59.668
7.89
(0.000)
6.97
9.85
100.737
2.93
(0.000)
0.40
15.87
54.313
10.58
(0.000)
7.62
15.74
87.615
5.78
(0.000)
2.94
13.14
81.145
1.92
(0.000)
0.65
73.72
184.650
33.44
(0.000)
21.52
7.53
93.063
4.12
(0.000)
2.87
7.97
41.394
3.89
(0.000)
1.99
10.14
17.531
7.02
(0.000)
5.04
16.16
61.563
5.53
(0.000)
5.10
15.52
82.978
3.52
(0.00)
1.05
12.17
36.316
4.04
(0.000)
2.00
70.46
109.427
29.11
(0.000)
19.05
24.31
21.840
14.33
(0.000)
13.93
27.66
34.083
12.94
(0.000)
11.15
31.62
34.331
15.11
(0.000)
9.13
84.60
46.359
43.38
(0.000)
35.22

As illustrated in Table 46, significant differences between e-learners, blended and traditional learners were found for the students in Turkey, but no significant differences were found for the students in the United Kingdom. For Course 1, the mean score of e-learners was computed as M = 74.75 out of 100 points, that of blended learners as M = 29.29 out of 100 points and that of traditional learners as M = 21.52 out of 100 points. However, the mean scores of blended learners (M = 58.13) and e-learners (M = 69.08) in the United Kingdom were quite close. For Courses 2 and 3, the same patterns were also found for the students in the UK and in Turkey. However, the results of the quiz show that the gap between e-learners, blended and traditional learners in both countries narrows as the number of topics taught increases.

However, the mean scores of the quiz, like those of the placement test, might be misleading if they are read as the success of e-learners, blended and traditional learners. They do not show the pedagogical value of the e-learning, blended and traditional learning approaches, because the levels of the students at the beginning of each course were quite different. Hence, the best way to find out about the value of those three modes of learning is to compute the increment or decrement in the students' knowledge. Table 47 displays the differences between the placement test and the quiz to find out the increment rate of the courses on the students' knowledge. As shown, the pattern of the increment rate in Table 47 for each country was quite different from the pattern obtained in the results of the quiz, which was implemented in the middle of each course.
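As a minimal sketch of this computation (with placeholder marks and group labels; the actual figures are those reported in Table 47), each student's gain is the quiz mark minus the pre-placement mark, averaged per mode of learning:

```matlab
% Minimal sketch of the increment/decrement rate: quiz mark minus the
% pre-placement mark for each student, averaged per mode of learning.
% All values and labels below are hypothetical placeholders.
prePlacement = [26 30 12 14 2 3];          % marks out of 100 at the start
quiz         = [75 70 29 33 22 20];        % marks out of 100 in the middle
learnMode    = {'e','e','blended','blended','traditional','traditional'};

gain = quiz - prePlacement;                % knowledge increase per student
[grp, ~, idx] = unique(learnMode);
meanGain = accumarray(idx, gain(:), [], @mean);
disp(table(grp(:), meanGain, 'VariableNames', {'Mode', 'MeanGain'}))
```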
Table 47: The rate of increment or decrement on students' learning

                            Turkey                    United Kingdom            Both
Course  Mode         N      M       F (p)             N    M       F (p)        N      M       F (p)
1       E-learner    121    45.97   42.646 (0.000)    8    27.20   0.177        129    44.81   39.187 (0.000)
        Blended      103    21.69                     12   21.94   (0.679)      115    21.72
        Traditional  31     21.04                     -    -                    31     21.04
2       E-learner    82     41.34   102.681 (0.000)   6    42.74   2.590        88     41.44   95.344 (0.000)
        Blended      93     -6.91                     9    21.54   (0.132)      102    -4.40
        Traditional  30     2.76                      -    -                    30     2.76
3       E-learner    64     84.91   50.518 (0.000)    3    77.83   1.779        67     84.60   46.359 (0.000)
        Blended      109    40.81                     5    99.45   (0.231)      114    43.38
        Traditional  52     35.22                     -    -                    52     35.22

M: Mean; N: Number of students; F (p): one-way ANOVA result

The assessment of knowledge increase or decrease was done using the placement test preceding each mode of learning, followed by the middle test (quiz) for each mode. The differences between those tests are shown in Table 47. For Course 1, it seems that there are significant differences between e-learning, blended and traditional learning in Turkey. The increment rate of e-learning (M = 45.97 out of 100 points) was much better than that of blended (M = 21.69 out of 100 points) and traditional learning (M = 21.04 out of 100 points) in Turkey. Similarly, the increment rate of e-learning (M = 27.20 out of 100 points) was also better than that of blended learning (M = 21.94 out of 100 points) in the UK, but no significant differences were computed. Moreover, the knowledge increase under blended learning was almost the same in both countries. However, learners in Turkey using the e-learning approach learned much more than those in the United Kingdom. For Course 2, the same pattern between the three approaches held in Turkey, because the knowledge increase of e-learners was much better than that of blended and traditional learners. The increment rate of e-learners in the United Kingdom increased considerably, from 27.20 to 42.74 points out of 100. In contrast, there was no knowledge increase for blended learners in Course 2 in Turkey, where the change was -6.91 out of 100 points.

This implies that those students possibly selected the question options randomly in the pre-placement test of the course to get a higher mark. Furthermore, the pattern between e-learners and blended learners in the United Kingdom changed in Course 3, whereas the same pattern remained for the students in Turkey. Table 47 indicates that the knowledge increase of the blended learners in the United Kingdom was almost perfect (M = 99.45 out of 100 points), whereas it was calculated as M = 77.83 for e-learners in the United Kingdom. It is important to note that Course 3 was about practices with MATLAB, such as controlling a race car using the MATLAB software. For example, the circuit in Figure 21 was designed step by step using a matrix board in front of the blended learners in Turkey and the United Kingdom, and those students had a chance to improve their hands-on skills. However, the same circuit was designed online for e-learners through a CCTV camera.


Figure 21: A Circuit Design

This shows that laboratory use had a more positive effect on students' learning in the United Kingdom, but the same pattern is not valid for the students in Turkey, because the mean knowledge increase of e-learners for Course 3 (M = 84.91) was much better than that of blended (M = 40.81) and traditional learners (M = 35.22).

13.2.3 Measuring Students' Knowledge at the End of the Courses


After applying the quizzes for each course, a three-week-long break was applied for Courses 1 and 2. For Course 3, however, an electronic game was designed and applied to find out the effect of electronic games on the knowledge of blended learners and electronic learners. For traditional learners, instead of offering an e-game, a research study was assigned. It is important to note here that the knowledge that blended learners and e-learners would gain from playing the e-game was exactly the same as the knowledge that traditional learners would obtain from the research study. In addition, the knowledge level was much more advanced in both the e-game and the research study. The e-game was designed to encourage students to learn while they are playing. Screenshots of the e-game are illustrated in Figure 22 and Figure 23.


For example, as shown in the figures, when the user picks up the STAR item, a piece of new knowledge about serial communication appears on the bottom-right of the screen, as illustrated in Figure 23. The game was designed to encourage students to learn about WHAT, WHY, HOW and WHERE for each concept by collecting four items, namely a ROCK, HEART, STAR and KEY respectively. For example, it aims to teach students 'What is serial communication?' when they collect the ROCK item. After collecting the four items, the students were encouraged to go up to the next level.

Table 48 shows the mean score of the final exam for each topic and the overall score of each course. It is noted that the final exam for Courses 1 and 2 was applied three weeks after the quiz. However, the final exam for Course 3 was applied after the e-game for blended and e-learners and after the research study for the traditional learners. The overall results of Course 1 show that the mean score of e-learners (M = 84.48) was much better than that of blended (M = 70.75) and traditional learners (M = 58.29). The same pattern also holds for the students in Turkey and the United Kingdom based on the separate results. A significant difference between e-learners, blended and traditional learners was also found for the overall results in Turkey. The overall results of Course 2 were also computed, as illustrated in Table 48, and a similar pattern was discovered between e-learners, blended and traditional learners.


Figure 22: Getting the Item (e.g. Star)


Figure 23: Getting the Knowledge after the Item


With respect to the results of Course 3, the results of e-learners are still significantly better than those of blended and traditional learners in both countries. However, the rate of knowledge increase for each course is more important than the raw results. For this reason, I calculated the differences between the final exam and the middle exam to find out whether there was an increment or decrement in the knowledge of the students.

Table 48: Post-Placement Tests for Turkey and United Kingdom


Max
Range

PRACTICE

PROGRAMMING

FUNDAMENTALS

T
1

09
Points

0 12
Points

0 15
Points

0 18
Points

0 21
Points

0 24
Points

0 100
Points

09
Points

0 12
Points

0 15
Points

10

0 18
Points

11

0 21
Points

12

0 24
Points

0 100
Points

13

0 27
Points

14

0 33
Points

15

0 39
Points

0 100
Points

M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3
M1
M2
M3

N
58
109
52
58
109
52
58
109
52
58
109
52
58
109
52
58
109
52
58
109
52
58
109
52
58
109
52
58
109
52
58
109
52
58
109
52
58
109
52
58
109
52
56
98
32
56
98
32
56
98
32
56
98
32

Turkey
M
7.11
6.42
6.07
11.03
10.29
9.00
14.04
12.39
10.13
16.68
13.54
10.93
15.21
11.96
10.05
19.44
14.62
11.11
84.51
70.23
58.29
8.24
7.82
6.80
10.80
10.07
9.35
12.15
11.59
10.94
15.47
13.02
9.97
14.84
11.56
8.75
22.24
14.90
14.00
83.74
68.97
59.80
23.90
19.56
19.55
28.15
15.63
15.87
27.33
17.42
16.55
80.39
53.60
52.97

F (p)

N
2
4
2
4
2
4
2
4
2
4
2
4
2
4
0
2
4
2
4
2
4
2
4
2
4
2
4
2
4
0
2
4
2
4
2
4
2
4
0

3.39
(0.04)
7.82
(0.00)
27.09
(0.00)
31.27
(0.00)
17.35
(0.00)
37.99
(0.00)
43.58
(0.00)
10.64
(0.00)
8.14
(0.00)
1.95
(0.14)
21.79
(0.00)
18.56
(0.00)
13.68
(0.00)
24.22
(0.00)
13.74
(0.00)
52.07
(0.00)
25.31
(0.00)
46.03
(0.00)

United Kingdom
M
F(p)
6.56
0.073
6.84
(0.801)
12.00
12.00
13.75
2.667
15.00
(0.178)
18.00
18.00
12.25
0.571
14.88
(0.492)
20.00
2.510
17.33
(0.188)
83.56
0.248
85.05
(0.645)
9.00
0.444
8.93
(0.541)
11.75
0.267
11.88
(0.633)
11.88
0.727
13.75
(0.442)
16.50
0.267
17.25
(0.633)
14.00
0.444
14.88
(0.541)
26.00
0.776
34.00
(0.428)
89.13
1.413
100.68
(0.300)
27.00
1.754
21.38
(0.256)
33.00
1.281
29.79
(0.321)
35.75
0.775
31.69
(0.428)
96.75
3.047
83.85
(0.156)
-

N
60
113
52
60
113
52
60
113
52
60
113
52
60
113
52
60
113
52
60
113
52
60
113
52
60
113
52
60
113
52
60
113
52
60
113
52
60
113
52
60
113
52
58
102
32
58
102
32
58
102
32
58
102
32

Both
M
7.09
6.44
6.07
11.07
10.35
9.00
14.03
12.48
10.13
16.73
13.70
10.93
15.11
12.06
10.05
19.46
14.72
11.11
84.48
70.75
58.29
8.27
7.86
6.80
10.83
10.14
9.35
12.14
11.67
10.94
15.50
13.17
9.97
14.82
11.67
8.75
22.37
15.58
14.00
83.92
70.09
59.80
24.01
19.63
19.55
28.32
16.18
15.87
27.63
17.98
16.55
80.95
54.79
52.97

F(p)
3.375
(0.036)
8.400
(0.000)
27.784
(0.000)
32.208
(0.000)
16.841
(0.000)
39.515
(0.000)
43.410
(0.000)
11.446
(0.000)
8.699
(0.000)
1.991
(0.139)
22.633
(0.000)
18.976
(0.000)
12.340
(0.000)
23.789
(0.000)
14.658
(0.000)
49.416
(0.000)
24.727
(0.000)
44.619
(0.000)

Table 49 indicates the rate of knowledge increase of the students at the end of each course in both countries. It shows that the knowledge of the blended (M = 42.36) and traditional learners (M = 42.96) increased significantly over the three-week-long break, as their gains were much higher than those of e-learners (M = 2.72). The same pattern also holds for Course 2 for the students in Turkey.

Table 49: The rate of increment or decrement on students' learning

                            Turkey                    United Kingdom            Both
Course  Mode         N      M       F (p)             N    M        F (p)       N      M       F (p)
1       E-learner    57     2.72    157.613 (0.000)   2    2.56     0.357       59     2.71    125.602 (0.000)
        Blended      99     42.36                     4    -3.57    (0.583)     103    40.58
        Traditional  25     42.96                     -    -                    25     42.96
2       E-learner    56     11.85   66.025 (0.000)    2    8.48     0.555       58     11.74   62.857 (0.000)
        Blended      92     45.87                     4    16.01    (0.498)     96     44.63
        Traditional  29     44.82                     -    -                    29     44.82
3       E-learner    56     -7.86   9.539 (0.000)     2    -3.25    3.380       58     -7.70   8.868 (0.000)
        Blended      98     9.02                      4    -15.46   (0.140)     102    8.06
        Traditional  32     7.45                      -    -                    32     7.45

M: Mean; N: Number of students; F (p): one-way ANOVA result

Table 49 also shows the effect of the research study and the e-game on the students' success. According to the results in the table, blended learners in Turkey increased their knowledge by playing the e-game (M = 9.02) and traditional learners by carrying out the research study (M = 7.45). However, the knowledge increase did not continue for e-learners in Turkey, who showed a small decrease. Moreover, the effect of the e-game was not positive for e-learners and blended learners in the United Kingdom either. However, it should be noted that it was not possible to observe a knowledge increase for those e-learners because their final test results show that they had already obtained almost everything from the courses before the e-game.


CHAPTER 14: CONCLUSION


14.1 Introduction
The main goal of this thesis was to develop an e-learning model for electrical
engineering and then to evaluate it through empirical studies in Turkey and the United
Kingdom, albeit with limited participation. In order to develop the e-learning model, the
perspectives of students and teachers in HEIs associated with the subject of electricity
were obtained using different data collection techniques, namely questionnaires and
interviews. A conceptual framework was developed that covers measuring students' and
teachers' readiness for e-learning (Step 1), selecting and developing an e-learning platform,
namely Moodle (Step 2), developing e-learning materials including e-books, e-exercises,
presentations and games (Step 3), training students for e-learning (Step 4) and delivering
e-learning (Step 5).

After developing the e-learning model, it was evaluated based on the empirical studies in
Turkey and the United Kingdom. The pedagogical value of e-learning, blended learning and
traditional learning was evaluated by teaching three different courses on the MATLAB software.

14.2 Research Questions


Based on the purpose and objectives of the thesis, five research questions were
formulated. The results of our studies have answered these questions, as elaborated in
the following subsections.

14.2.1 Answer to Research Question 1


The first question I wanted to answer during my PhD was to find out about the factors that
might affect the readiness of different stakeholders in HEIs, especially teachers and
students in electrical engineering in Turkey and the United Kingdom. The factors that I
intended to measure were identified after detailed analyses of the existing e-learning
readiness models, combined with the cultural and environmental characteristics of the
institutions associated with the science of electricity in Turkey.

In general, ten main factors were identified as factors that might affect the readiness of
teachers and students: Technology, Experience with ICT, Confidence with ICT, Attitudes
towards E-learning and Others, Traditional Skills, Institution Readiness, Content,
Acceptance and Training for E-learning. Based on these factors, a model was also generated
for assessing teachers' and students' readiness for e-learning in Turkey and the United
Kingdom. The model was validated using different techniques, especially structural
equation modelling.
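As an illustration of what measuring readiness means in practice, the sketch below aggregates hypothetical 5-point Likert responses into factor-level mean scores; the response matrix and the grouping of items into factors are invented for the example and are not taken from the actual survey instrument:

% Hypothetical 5-point Likert responses: one row per respondent, one column per item.
responses = [4 5 3 4 2 5;
             3 4 4 4 3 4;
             5 5 4 3 4 4];

% Hypothetical grouping of items into two readiness factors (columns 1-3 and 4-6).
factorItems = {1:3, 4:6};

% Mean score per factor, averaged over items and respondents.
for k = 1:numel(factorItems)
    score = mean(mean(responses(:, factorItems{k})));
    fprintf('Factor %d readiness score: %.2f out of 5\n', k, score);
end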

14.2.2 Answer to Research Question 2


The second question I wanted to answer during my PhD was to find out how an e-learning
platform should be selected before embarking on e-learning. Before selecting an e-learning
platform, regardless of whether it is commercial or open-source, I measured the factors
that may affect the readiness of students and teachers. After that, I identified that a
number of e-learning platforms are available, which are either free to use (e.g. Moodle,
Ilias and Docebo) or commercial (e.g. Blackboard and FirstClass). Furthermore, the needs
of teachers and students, together with the particular characteristics of the respective
HEIs to which they belonged, were also investigated. The data obtained from our studies,
especially the data from the close-ended questions and interviews, indicate that the target
teachers and students use various operating systems in the UK and Turkey at both home and
campus, such as Linux, Windows, Mac and Pardus. To ensure the accessibility of the
e-learning platform, being able to work through a web browser is therefore deemed one of
its critical success criteria. Besides, the survey results indicate that selecting an
open-source e-learning platform can also facilitate the integration of e-learning into
HEIs, because the teachers and students in the target HEIs have experience with languages
such as HTML, Java and PHP. This not only facilitates the integration of e-learning but
also helps HEIs extend the e-learning platform by adding new components, as I added the
e-game to the Moodle platform. Choosing an open-source platform is also important for
being able to fix errors in it immediately.

14.2.3 Answer to Research Question 3


The third question I wanted to answer during my PhD was to find out how e-learning
materials should be developed to help students in the target institutions. For this
reason, the participants were first surveyed about the ICT applications they were familiar
with. It was found that the participating teachers and students in the target HEIs mainly
use AutoCAD, Proteus and MATLAB to succeed in their courses or to improve their students'
skills. Hence, the usefulness and familiarity of these applications in the target HEIs
were taken into account when choosing a topic and developing the e-learning materials. I
decided that MATLAB or AutoCAD could be selected as the main topic for delivering
e-learning because the target HEIs already use this software in many of their modules.
Secondly, the commercial licences of these programs were also important because they are
not open source; hence, preference was given to software already installed in the
university labs. Moreover, I found that MATLAB would be the best choice because of its
various uses in electrical engineering, such as image processing, communication and
programming, and because it supports the Windows, Mac, Linux and mobile platforms. Hence,
to address these issues, I decided to offer a course on Programming with MATLAB to HEIs in
Turkey and the United Kingdom. For this reason, based on the current advancements, I
developed an e-book to make sure that students had access to the topics. In addition, I
designed exercises for each topic in the book, and pre- and post-tests to be applied at
the beginning and at the end of the courses. Moreover, it was also important to encourage
students in the classroom by preparing high-quality presentations and an e-game. In
designing the e-learning materials, the opinions of the participating teachers and
students were taken into account.
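To give a flavour of the level targeted by the first course materials, an introductory exercise of the kind that could appear in such an e-book is sketched below; it is a representative example written for this summary rather than an excerpt from the actual materials:

% Exercise: use Ohm's law to plot the voltage across a 100-ohm resistor
% for currents between 0 and 0.5 A.
R = 100;              % resistance in ohms
I = 0:0.01:0.5;       % current in amperes
V = R * I;            % Ohm's law: V = I * R

plot(I, V, 'LineWidth', 1.5);
xlabel('Current (A)');
ylabel('Voltage (V)');
title('Voltage across a 100-ohm resistor');
grid on;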

14.2.4 Answer to Research Question 4


The fourth question I wanted to answer during my PhD was to find out how students should
be trained to get them ready for e-learning, if necessary, based on the results of the
e-learning readiness surveys. Hence, the readiness of the target students for e-learning
was measured as part of the survey with close-ended questions. I found that the readiness
of the target students seems to be sufficient.

However, their readiness for e-learning might also be strengthened to facilitate effective
adoption of e-learning in their institutions. Besides, in case they were not ready, the
participants were also surveyed to find out about the training that they might need for
e-learning. The overall results showed that those students needed knowledge about
e-learning and about how to use the e-learning platform. To address these issues, I
decided to design the first course to be easy, in order to encourage students to use the
platform and make sure they were familiar with e-learning. Hence, the first six topics
were designed to teach the fundamentals of the MATLAB software. In this way, I aimed to
increase the readiness of the students for e-learning and to get them familiar with
Moodle. Additionally, at the beginning of the first course, the students were trained in
the use of Moodle (e.g. changing their password, updating their profile, uploading their
picture, sending messages to each other, reading PDF documents, solving exercises and
watching recorded sessions) and the use of the White Board (e.g. joining online sessions,
using their microphones and cameras, chatting with other students and asking written
questions during the session).

14.2.5 Answer to Research Question 5


The final question I wanted to answer during my PhD was to find out the most appropriate
way of delivering e-learning in the target HEIs in Turkey and the United Kingdom.
According to the perspectives of the surveyed teachers and students, e-learning should be
considered as a tool supporting campus-based activities. They highlighted that e-learning
should not completely replace traditional learning in the field of engineering, because
the students stress that gaining hands-on skills through labs is highly important for
them. Moreover, it was also stressed that social interaction should be preserved by
maintaining campus-based activities. Therefore, I found that the blended learning approach
is the most appropriate way of delivering e-learning in the target HEIs in Turkey and the
United Kingdom.

14.3 Limitations
This study had some limitations that should be taken into account when interpreting its
findings. Additionally, the limitations of the current study can be used as a stepping
stone from which more in-depth work can be done to overcome the challenges of implementing
e-learning in developing countries.

14.3.1 Invitation of participants


The model development was based on the survey data collected from representative teachers
and students in Turkey, whereas the model evaluation was conducted in the relevant HEIs in
Turkey and the United Kingdom. To meet this aim, the web-based surveys were distributed to
the respective HEIs by sending invitations and reminders via email. However, surveying via
the Internet is criticised for producing biased samples, because only individuals who are
online and who are motivated to take the survey are included, potentially committing
self-selection bias errors (Lawton, 2005). As my surveys with teachers in 2010 and with
students in 2011 were implemented via the Internet, I considered whether my research
results would be biased. It would have been possible to invite the target students and
teachers of the current study by sending an invitation via post. However, this was not
possible within the limited resources of a PhD research project for the following reasons.
Firstly, the home addresses of the target students and teachers are unknown, given the
privacy concern. Furthermore, it was not possible to cover the postage expenses, as the
target populations of the current study, who work or study in the HEIs associated with the
subject of electricity across Turkey and the United Kingdom, are large. To deal with this
issue, I also designed a letter for the target HEIs to invite their teachers and students
via their notice boards. Nevertheless, I tend to conclude that such a bias is remarkably
low in my case, given the following considerations. Firstly, the mean scores of the items
vary between 1.00 and 5.00 on the Likert scale, which shows that many participants also
expressed (strong) disagreement with several items despite their (easy) access to ICT.
Furthermore, this biased-sample issue may have been a major problem in the early days of
the Internet revolution, according to Lawton (ibid.), but today the Internet and e-mail
have become a part of everyday life, especially at the level of higher education.

14.3.2 Limited timeframe for the participant availability


The perspectives of teachers and students in the relevant HEIs played a crucial role in
the current research study. However, the availability of teachers and students is confined
mainly to the start and end of academic terms. Based on the assumption that a web-based
survey can effectively and efficiently reach widely distributed respondents, I developed a
survey to assess the extent to which students are ready for e-learning. While the survey
questionnaire was ready in December 2011, I had to delay the start of the survey to the
March semester in order to wait for the returning students, because I could only invite
them via their department secretaries. Similarly, while the teacher questionnaire was
ready in February 2010, I had to extend the duration of that survey because the teachers'
time was occupied for many reasons, such as the registration of new and returning students
and their lectures.
14.3.3 The generalizability of results
As e-learning seems to be more mature in the UK, simple random sampling was used in the
current study to invite students in the UK to participate in the survey. To achieve this
aim, the universities located in the UK were listed and 10 of them were selected randomly
in order to use our limited resources. I aimed to evaluate whether the model is applicable
to a context other than Turkey, although I was fully aware of the limitations of this
sampling strategy. An invitation letter was sent to the department secretaries of the HEIs
associated with the subject of electricity in those ten universities to encourage their
students to take part in the case-control study. However, as only 188 valid student
responses were received from the UK, it was not possible to achieve the desired level of
confidence and precision for the UK. In order to use our limited resources effectively,
neither a reminder was sent to those ten universities nor were new universities selected
to compensate for the low response rate. Therefore, the generalizability of the UK sample
is limited in our study. The use of a case-control study in the United Kingdom is
limiting, and a larger-scale study could therefore provide higher generalizability of
results. Nonetheless, the methodology developed can contribute to future work when it is
feasible to involve a larger number of universities in the UK.
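For reference, the random selection of ten universities described above can be reproduced with a few lines of MATLAB; the list of names below is only a placeholder for the actual sampling frame:

% Hypothetical sampling frame of UK universities (placeholder names only).
universities = {'University A','University B','University C','University D', ...
                'University E','University F','University G','University H', ...
                'University I','University J','University K','University L'};

rng(2011);                               % fix the seed so the draw is reproducible
idx = randperm(numel(universities), 10); % simple random sample of 10, without replacement
selected = universities(idx);
disp(selected.');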

14.4 Contributions
The results of this study make several contributions to theory and practice associated
with e-learning. I believe that this study may be an interesting source for the audience
in the field of information and communication technology, especially those who consider
e-learning in higher education. Additionally, this study contributes to the areas of
computer science, electrical engineering, e-learning and education. Specifically, it
introduces novel thinking and techniques to the implementation of e-learning in the field
of electrical engineering in general. The key contributions of this study can be
articulated as follows.

14.4.1 Finding Issues and Suggesting Solutions


In this study, the issues in the higher education institutions associated with the subject
of electricity (e.g. electrical engineering) were identified from both the teachers' and
the students' perspectives. I found that the main issues, which hinder education and
training, can roughly be grouped into three main categories based on their commonalities,
namely financial issues, political issues and individual issues. Additionally, e-learning
as a potential solution for those issues was discussed with the same teachers and
students. As a result, according to the students' and teachers' perspectives, e-learning
is perceived as a solution for resolving some issues, and it brings some innovation to the
respective institutions in Turkey. For instance, I discussed the effect of lecture
theatres on students' experiences with the participating teachers and students and found
that lecture theatres have been lacking in quality for delivering lectures and are not
conducive to gaining positive learning experiences for several reasons, such as the
insufficient size of lecture theatres, high noise levels especially in rainy weather, and
inappropriate facilities and seats. As a solution, one interviewee noted that e-learning
may be used to overcome the lack of classrooms. Specifically, another interviewee
suggested that students may be provided with lecture notes over the Internet via
e-learning before they come to class, because it is not possible for every student to
seize a place near the blackboard or the instructor in a small classroom, nor is it
possible to remove noise completely from classrooms. Similarly, I discussed all the issues
which may put a barrier to education and training with the participating teachers and
students and suggested solutions via e-learning. I believe that the findings of this
thesis may help academia develop new strategies for implementing e-learning.

14.4.2 Identifying Factors Affecting Individuals' Readiness for e-Learning


It is true that e-learning may not have the same effect for every individual, institution,
organization or country, because the actual benefit of e-learning depends on the aim of
usage and the way e-learning is used. Therefore, it is highly important to identify the
factors which may affect the readiness of individuals and organizations for e-learning.
For this reason, I investigated the factors that may hinder the readiness of teachers and
students in higher education institutions associated with the subject of electricity in
Turkey. Specifically, I also measured the readiness of teachers and students to find out
the extent to which factors such as confidence, attitudes and skills relating to ICT have
an impact on teachers' and students' e-learning readiness. In this way, I believe that the
findings of the thesis may also arouse the interest of academia in measuring their
teachers' and students' readiness for e-learning. Indeed, I found that this study has,
already during my PhD registration, aroused the interest of researchers such as Soydal et
al. (2011) in Turkey, Ouma et al. (2013) in Kenya, Anter et al. (2014) in Iraq and Trayek
et al. (2014) in Palestine.

14.5 Future Work


This study aimed to develop an e-learning model in the domain of electrical engineering
from the perspectives of teachers and students in the HEIs associated with the subject of
electricity in Turkey, and to evaluate the model in the relevant HEIs in both Turkey and
the UK. The research identified several interesting topics for other researchers to pursue
in the foreseeable future. The following research studies are recommended.

14.5.1 Measuring Readiness for E-learning


This study focussed on the readiness of different stakeholders in higher education
institutions from both a developed country (i.e. the UK) and a developing country (i.e.
Turkey). However, more research studies should be conducted on e-learning readiness
because e-learning may not have the same effect for every individual, institution,
organization or country. It is highly important to find out the specific needs of
organizations and individuals, especially in different contexts. Eventually, comparing the
e-learning readiness of a sufficient number of developed and developing countries may
enable the development of a comprehensive e-learning maturity model, not only from the
technical but also, and perhaps more importantly, from the pedagogical, social and
organizational perspectives.

14.5.2 E-learning Materials


This study developed various e-learning materials, such as an e-book, exercises,
presentations and an e-game, to find out whether e-learning is effective. I found that the
lack of traditional study skills, combined with the magnetism of the Internet and social
network sites (e.g. Facebook), is a possible risk of e-learning. Hence, it is important to
ensure the quality and diversity of e-learning materials in order to encourage students to
learn while they are connected to the Internet.

14.5.3 Digital Educational Games


This study investigated the effects of the e-game on students' success at the level of
higher education. However, more research studies should be conducted in this area to find
out the potential effects of digital educational games in higher education. It seems to me
that it is not possible to develop a high-quality educational game within the limited
duration of a PhD, because many researchers with different skills (e.g. programmer,
animator and educator) should be involved in the development and evaluation of digital
educational games. Hence, it is important to point out that not only technical but also
pedagogical knowledge should be brought in to develop high-quality digital educational
games. Finally, I recommend that a PhD student should not take on the development and
evaluation of a high-quality digital educational game alone, especially at the level of
higher education, because it clearly requires teamwork.


References
1. Abbott, M. L., 2013. Understanding and Applying Research Design. Somerset:
Wiley.
2. Abrams, S. S., 2009. A gaming frame of mind: Digital contexts and academic
implications. Educational Media International, Volume 46, pp. 335-347.
3. Agboola, A. K., 2006. Assessing the Awareness and Perceptions of Academic
Staff in Using E-learning Tools for Instructional Delivery in a Post-Secondary
Institution: A Case Study. The Innovation Journal: The Public Sector Innovation
Journal, 11(3).
4. Akaslan, D. & Law, E. L.-C., 2011. Measuring Student E-learning Readiness: A
Case about the Subject of Electricity in Higher Education Institutions in Turkey.
Hong Kong, s.n.
5. Akaslan, D. & Law, E. L.-C., 2011. Measuring Teachers' Readiness for E-learning in Higher Education Institutions associated with the Subject of Electricity in Turkey. Amman, s.n.
6. Akaslan, D. & Law, E. L.-C., 2012. Analysing the Relationship between ICT
experience and attitude toward e-learning: Comparing the teacher and student
perspectives in Turkey. Germany, s.n., pp. 18-21.
7. Akaslan, D., Law, E. L.-C. & Taskin, S., 2012. Analysis of Issues for
Implementing e-Learning: the Student Perspective. Marrakesh, s.n., pp. 1-9.
8. Akaslan, D., Law, E. L.-C. & Taşkın, S., 2011. Analysing Issues for Applying E-learning to the Subject of Electricity in Higher Education in Turkey. Belfast, s.n.
9. Alonso, F., Lopez, G., Manrique, D. & Vines, J. M., 2005. An instructional
model for web-based e-learning education with a blended learning process
approach. British Journal of Educational Technology, 36(2), pp. 217-235.
10. Anderson, B., 2008. Contemporary Perspectives in E-Learning Research.
American Journal of Distance Education, 22(2), pp. 123-125.
11. Anderson, T., 2002. Is E-Learning Right for Your Organization?, s.l.: ASTD |
The World's Largest Training and Development Association.
12. Anter, S. A., A. M. A. & Mashhadany, Y. I. A., 2014. Proposed E-learning
system for Iraqi Universities. International Journal of Scientific and Research
Publications, 4(5), pp. 1-7.


13. Asaari, M. H. A. H. & Karia, N., 2005. Adult learners and e-learning readiness.
Athens, s.n.
14. Astin, A. W., 1999. Student involvement: A developmental theory for higher education. Journal of College Student Development, Volume 40, pp. 518-529.
15. Attwell, G., 2006. Evaluating E-learning: A Guide to the Evaluation of E-learning. Journal of Evaluate Europe Handbook Series, Volume 2.
16. Auerbach, C. F. & Silverstein, L. B., 2003. Qualitative Data : An introduction to
coding and analysis. New York: NYU Press.
17. Aydin, C. & Tasci, D., 2005. Measuring readiness for e-learning: Reflections
from an Emerging Country. Journal of Educational Technology and Society,
8(4), pp. 244-257.
18. Bartlett, J. E., Kotrlik, J. W. & Higgins, C. C., 2001. Organizational Research:
Determining Appropriate Sample Size in Survey Research. Journal of
Information Technology, Learning and Performance, 19(1), pp. 43-50.
19. Bean, M., 2003. Are you ready for e-learning? , Bothell: MediaPro Newsletter:
Tips and tricks of the trade.
20. Bernards, H. R. & Ryan, G. W., 2010. Analyzing Qualitative Data: Systematic
Approaches. Los Angeles: SAGE.
21. Brandes, D. & Ginnis, P., 1991. A Guide to Student-centred Learning. Oxford:
Basil Blackwell.
22. Braun, V. & Clarke, V., 2006. Using thematic analysis in psychology.
Qualitative Research in Psychology, 3(2), pp. 77-101.
23. Bryman, A., 2007. The Research Question in Social Research: What is its Role?.
International Journal of Social Research Methodology, 10(1), pp. 5-20.
24. Burnard, P., 1999. Carl Rogers and postmodernism: Challenged in nursing and
health sciences. Journal of Nursing and Health Sciences, 1(4), pp. 241-247.
25. Chapnick, S., 2000. Are you ready for e-learning?. [Online] Available at: http://www.learningcircuits.org/2000/Nov2000/Chapnick.htm [Accessed 31 March 2012].
26. Charlton, P., Magoulas, G. & Laurillard, D., 2012. Enabling creative learning design through semantic technologies. Technology, Pedagogy and Education, 21(2), pp. 231-253.

27. Choi, C. Y. & McPherson, B., 2005. Noise Levels in Hong Kong Primary
Schools: Implications for Classroom Listening. International Journal of
Disability, Development and Education, 52(4), pp. 345-360.
28. Clarke, A., 2008. eLearning skills. s.l.:Palgrave Macmillan.
29. Cohen, L., Manion, L. & Morrison, K., 2007. Research Methods in Education.
6th ed. London: Routledge.
30. Cowan, J. E., 2008. Strategies for Planning Technology-Enhanced Learning
Experiences. The Clearing House: A Journal of Educational Strategies, Issues
and Ideas, 82(2), pp. 55-59.
31. Crawford, I. M., 1997. Marketing Research and Information Systems. Rome:
Food and Agriculture Organization of the United Nations.
32. Creswell, J. W., 2009. Research Design: Qualitative, Quantitative, and Mixed
Methods Approaches. Third Edition ed. s.l.:Sage Publications.
33. Creswell, J. W. & Clark, V. L. P., 2011. Designing and Conducting Mixed
Methods Research. Thousands Oaks: Sage Publications.
34. Creswell, J. W., Plano Clark, V. L., Gutman, M. L. & Handson, W. E., 2003.
Advanced mixed methods research designs. In: A. Tashakkori & C. Teddlie,
eds. Handbook of mixed methods in social and behavioral research. Thousand
Oaks, CA: Sage.
35. Dabbagh, N., 2007. The online learner: Characteristics and Pedagogical
Implications. Journal of Contemporary Issues in Technology and Teacher
Education, 7(3), pp. 217-226.
36. Davis, F. D., 1989. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, pp. 319-340.
37. Donnelly, P., Benson, J. & Kirk, P., 2012. How to succeed at e-learning. Somerset: Wiley-Blackwell.
38. Dray, B. J. et al., 2011. Developing an instrument to assess student readiness for online learning: a validation study. Journal of Distance Education, 7(3), pp. 29-47.
39. Etukudo, U. E., 2011. E-learning and Teacher Preparation in Science and
Mathematics: The Paradigm for Utilization of Interactive Packages. Omoku,
s.n.
40. Ferrell, G. et al., 2007. CAMEL Tangible Benefits of E-Learning Project Final
Report, s.l.: JISC InfoNet, ALT, HEA.

41. Fleming, D. & Storr, J., 2000. The Impact of Lecture Theatre Design on
Learning Experience. s.l., PRRES 2000.
42. Flick, U., 1998. An introduction to qualitative research. London: Sage.
43. Fowler, C., 2004. JISC e-Learning Models Desk Study, s.l.: JISC.
44. George-Walker, L. D. & Keeffe, M., 2010. Self-determined blended learning: a
case study of blended learning design. Higher Education Research and
Development, 29(1), pp. 1-13.
45. Gerber, H. R. & Price, D., 2011. Twenty-first century adolescents, writing, and
new media: Meeting the challenge with game controllers.. English Journal,
Volume 101, pp. 68-72.
46. Glenn, J. C., 2010. Handbook of Research Methods. Jaipur: Global Media .
47. Gray, C. D. & Kinnear, P. R., 2012. IBM SPSS Statistics 19. Hove: Psychology
Press.
48. Guri-Rosenblit, S. & Gros, B., 2011. e-Learning: Confusing Terminology, Research Gaps and Inherent Challenges. Distance Education, 25(1), pp. 1-17.
49. Haney, D., 2002. Assessing organizational readiness for e-learning. Journal of Performance Improvement, 41(4), pp. 10-15.
50. HEA, 2014. Flexible Pedagogies: technology-enhanced learning, s.l.: The Higher Education Academy.
51. Hofmann, J., 2006. Why blended learning hasn't (yet) fulfilled its promises: Answers to those questions that keep you up at night. In: C. J. Bonk & C. R. Graham, eds. Handbook of blended learning: Global perspectives, local designs. San Francisco, CA: Pfeiffer, pp. 27-40.
52. Holmberg, B., 1995. Theory and Practice of Distance Education. 2nd ed. London: Routledge.
53. Holmberg, B., 2005. The Evolution, Principles and Practices of Distance Education, Oldenburg: BIS.
54. Holmes, W., 2011. Using game-based learning to support struggling readers at home. Learning, Media and Technology, 36(1), pp. 5-19.
55. Hoppe, H. U., Joiner, R., Milrad, M. & Sharples, M., 2003. Wireless and mobile technologies in education. Journal of Computer Assisted Learning, 19(3), pp. 255-259.
56. Horton, W., 2001. Evaluating E-learning. s.l.:ASTD.


57. Howitt, D. & Cramer, D., 2014. Introduction to Research Methods in Psychology. 2nd ed. s.l.:Prentice Hall.
58. Hrastinski, S., 2008. Asynchronous and synchronous learning. Journal of Educause Quarterly, Issue 4, pp. 51-55.
59. Hsu, T.-Y. & Chen, C.-M., 2010. A Mobile Learning Module for High School
Fieldwork. Journal of Geography, 109(4), pp. 141-149.
60. Hughes, G., 2007. Using blended learning to increase learner support and
improve retention. Teaching in Higher Education, 12(3), pp. 349-363.
61. Hughes, H., 2012. Introduction to Flipping the College Classroom. Chesapeake,
VA, AACE, pp. 2434-2438.
62. Hyder, K., Kwinn, A., Miazga, R. & Murray, M., 2007. The eLearning Guild's
Handbook on Synchronous e-Learning, Santa Rosa: The eLearning Guild.
63. Israel, G. D., 1992a. Determining Sample Size. Program Evaluation and
Organizational Development.
64. Israel, G. D., 1992b. Sampling Issues: Nonresponse. Program Evaluation and
Organizational Development.
65. JISC, 2004. Effective Practice with e-Learning: A good practice guide in
designing for learning, Bristol: JISC.
66. Johnson, B. & Onwuegbuzie, A., 2004. Mixed methods research: A research
paradigm whose time has come. Educational Researcher, Volume 33, pp. 14-26.
67. Kaur, K. & Abas, Z., 2004. An assessment of e-learning readiness at the Open
University Malaysia. Melbourne, s.n.
68. Kirkwood, A. & Price, L., 2014. Technology-enhanced learning and teaching in
higher education: what is enhanced and how do we know? A critical literature
review. Learning, Media and Technology, 39(1), pp. 6-36.
69. Kothari, C., 2004. Research Methodology : Methods and Techniques. Delhi:
New Age International .
70. Krosnick, J., 2002. The Causes of No-Opinion Responses to Attitude Measures in Surveys. In: R. M. Groves, D. A. Dillman, J. L. Eltinge & R. J. A. Little, eds. Survey Nonresponse. New York: Wiley.
71. Lado, A., 2008. Asynchronous e-Learning. Bucharest, s.n.
72. Lage, M. J., Platt, G. J. & Treglia, M., 2000. Inverting the Classroom: A
Gateway to Creating an Inclusive Learning Environment. The Journal of
Economic Education, 31(1), pp. 30-43.

73. Lambin, J.-J., Chumpitaz, R. & Schuiling, I., 2007. Market-Driven Management: Strategic and Operational Marketing. 2nd Edition ed. s.l.:Palgrave Macmillan.
74. Lapan, S. D., Quartaroli, M. T. & Riemer, F. J., 2012. Qualitative Research: An introduction to methods and designs. s.l.:Jossey-Bass.
75. Laurillard, D., 2007. Technology, pedagogy and education: concluding
comments. Technology, Pedagogy and Education, 16(3), pp. 357-360.
76. Laurillard, D., 2008. The teacher as action researcher: using technology to
capture pedagogic form. Studies in Higher Education, 33(2), pp. 139-154.
77. Lawton, L., 2005. Sampling Bias Concerns in Web-Based Research, Berkeley:
TeachSociety Research.
78. Lea, S. J., Stephenson, D. & Troy, J., 2003. Higher Education Students'
Attitudes to Student-centred Learning: Beyond 'educational bulimia'?. Journal of
Studies in Higher Education, 28(3), pp. 321-334.
79. Lin, C. A., 1998. Exploring personal computer adoption dynamics. Journal of
Broadcasting and Electronic Media, 42(1), pp. 95-112.
80. Lind, D. A., Marchal, W. G. & Wathen, S. A., 2010. Statistical Techniques in
Business and Economics. 14th Edition ed. Boston: McGraw-Hill.
81. Liu, M., Rosenblum, J. A., Horton, L. & Kang, J., 2014. Designing Science Learning with Game-Based Approaches. Computers in the Schools: Interdisciplinary Journal of Practice, Theory, and Applied Research, 31(1-2), pp. 84-102.
82. Lopes, C. T., 2007. Evaluating E-learning Readiness in a Health Sciences
Higher Education Institution. Porto, Proceedings of the IADIS International
Conference for E-learning .
83. Macdonald, J., 2008. Blended learning and online tutoring. 2nd ed. Hampshire:
Gower.
84. Mackeogh, K. & Fox, S., 2009. Strategies for embedding e-learning in
traditional universities. Electronic Journal of E-learning, 7(2), pp. 147-145.
85. Maltby, J. & Day, L., 2002. Early Success in Statistics. Harlow: Prentice Hall.
86. Martín-Rodríguez, Á., Fernández-Molina, J. C., Montero-Alonso, M. Á. & González-Gómez, F., 2014. The main components of satisfaction with e-learning. Technology, Pedagogy and Education.


87. Martins, P. & Walker, I., 2006. Student achievement and education production function: a case-study of the effect of class attendance. s.l.:Mimeo.
88. Mason, J., 2002. Qualitative researching. 2nd ed. London: Sage.
89. Mason, R., 2005. Blended Learning. Education, Communication & Information ,
5(3).
90. Mason, R. & Rennie, F., 2004. Broadband: the solution for rural elearning?.
International Review of Research in Open and Distance Learning, 5(1).
91. Matthews, B. & Ross, L., 2010. Research Methods: A practical guide for the social sciences. Essex: Pearson.
92. Matthews, D., 1999. The Origins of Distance Education and Its Use in the United States. T.H.E. Journal, 27(2), pp. 54-67.
93. Mayes, T. & Freitas, S., 2000. JISC e-Learning Models Desk Study: Review of e-Learning theories, frameworks and models. s.l., JISC.
94. McFarland, D. J. & Hamilton, D., 2006. Adding contextual specificity to the
technology acceptance model. Journal of Computer in Human Behavior, 22(3),
pp. 427-447.
95. McInnerney, J. M. & Roberts, T. S., 2009. Collaborative and Cooperative
Learning. In: P. L. Rogers, ed. Encyclopedia of Distance Learning. s.l.:Idea
Groups Inc.
96. McMahon, T. & Stark, T., 2005. Getting Students to Prepare for Class, Eugene:
University of Oregon.
97. McQueen, R. A. & Knussen, C., 2013. Introduction to Research Methods and
Statistics in Psychology: A practical guide for the undergraduate researcher.
2nd ed. Harlow: Pearson.
98. Mellow, G. O., Woolis, D. D. & Laurillard, D., 2011. In Search of a New
Developmental-Education Pedagogy. Change: The Magazine of Higher
Learning, 43(3), pp. 50-59.
99. Middleton, J., Ziderman, A. & Adams, A. V., 1991. Vocational and Technical
Education and Training. Washington: The World Bank.
100. Mifsud, L., 2014. Mobile learning and the sociomateriality of classroom practices. Learning, Media and Technology, 39(1), pp. 142-149.
101. Mills, J. E. & Treagust, D. F., 2003. Engineering Education - Is Problem-Based or Project-Based Learning the Answer?. Australasian Journal of Engineering Education.

102. Molina-Azorin, J. F., 2012. Mixed Methods Research in Strategic Management: Impact and Applications. Organizational Research Methods, 15(1), pp. 33-56.
103. Moore, J. L., Dickson-Deane, C. & Galyen, K., 2010. e-Learning, online learning, and distance learning environments: Are they the same?. Journal of Internet and Higher Education, Volume 14, pp. 129-135.
104. Morrison, K. R. B., 1993. Planning and Accomplishing School-Centred Evaluation. Dereham: Peter Francis.
105. Morse, J. M., 1991. Approaches to qualitative-quantitative methodological triangulation. Nursing Research, 40(2), p. 120.
106. Mor, Y. & Winters, N., 2007. Design Approaches in Technology-Enhanced Learning. Interactive Learning Environments, 15(1), pp. 61-75.
107. Mull, D. I., 2002. The effect of student involvement in residence halls on development and retention, s.l.: s.n.
108. Murphy, D. B. & Chris Sharman, R., 2006. Developing Skills for Distance Learning, Hong Kong: Open University of Hong Kong Press.
109. O'Neill, G. & McMahon, T., 2005. Student-centred learning: what does it mean for students and lecturers?. Dublin, AISHE.
110. Oblinger, D. G. & Oblinger, J. L., 2005. Educating the net generation. s.l.:Educause.
111. Ouma, G. O., Awuor, F. M. & Kyambo, B., 2013. Evaluation of E-Learning Readiness in Secondary Schools in Kenya. World Applied Programming, 3(10), pp. 493-503.
112. Paine, N., 1989. Open Learning in Transition: An Agenda for Action. London: Kogan Page.
113. Pallant, J., 2010. SPSS Survival Manual: A step by step guide to data analysis using the SPSS program. 4th Edition ed. New York: McGraw Hill.
114. Park, N., Roman, R., Lee, S. & Chung, J. E., 2009. User acceptance of a digital library system in developing countries: an application of the technology acceptance model. International Journal of Information Management, 29(3), pp. 196-209.
115. Patricio Rodríguez, M. N. & Dombrovskaia, L., 2012. ICT for education: a conceptual framework for the sustainable adoption of technology-enhanced learning environments in schools. Technology, Pedagogy and Education, 21(3), pp. 291-315.
116. Peng, C.-Y. J., Harwell, M. & Liou, S.-M., 2007. Advances in Missing Data Methods and Implications for Educational Research. In: Real Data Analysis. Charlotte, NC: Information Age Publishing, pp. 31-78.
117. Phillips, R., McNaught, C. & Kennedy, G., 2012. Evaluating e-learning: guiding research and practice. New York: Routledge.
118. Pollard, E. & Hillage, J., 2001. Exploring E-Learning. IES Report 376, s.l.: s.n.
119. Prasad, R., 2009. Fundamentals of Electrical Engineering, New Delhi: PHI Learning.
120. Punch, K. F., 2005. Introduction to social research: Quantitative and qualitative approaches. London: Sage.
121. Race, P., 1994. The Open Learning Handbook. London: Kogan Page.
122. Race, P., 2005. 500 Tips for Open and Online learning. London: RoutledgeFalmer.
123. Rahim, M. A., 1985. A Strategy For Managing Conflict In Complex Organizations. Human Relations, 3(38).
124. Roblyer, M. D. & Ekhaml, L., 2000. How Interactive are YOUR Distance Courses? A Rubric for Assessing Interaction in Distance Learning. Journal of Distance Learning Administration, 3(2).
125. Rodríguez, P., Nussbaum, M. & Dombrovskaia, L., 2012. ICT for education: a conceptual framework for the sustainable adoption of technology-enhanced learning environments in schools. Technology, Pedagogy and Education, 21(3), pp. 291-315.
126. Rogers, E. M., 2003. Diffusion of innovations. Fifth Edition ed. New York: Free Press.
127. Rosenberg, M. J., 2001. E-learning, strategies for delivering knowledge in the digital age. New York: McGraw-Hill.
128. Rumble, G., 1989. Open learning, distance learning, and the misuse of language. The Journal of Open, Distance and e-Learning, 4(2).
129. Ryan, G. W. & Bernard, H. R., 2000. Data management and analysis methods. In: N. Denzin & Y. Lincoln, eds. Handbook of qualitative research. s.l.:Sage, pp. 769-802.

130. Sadera, W. A., Li, Q., Song, L. & Liu, L., 2014. Digital Game-Based Learning. Computers in the Schools: Interdisciplinary Journal of Practice, Theory, and Applied Research, 31(1-1), pp. 1-1.
131. Scheier, M. F. & Carver, C. S., 1993. On the power of positive thinking. Journal of Current Directions in Psychological Science, 2(1), pp. 26-30.
132. Schumacker, R. E. & Lomax, R. G., 1996. A Beginner's Guide to Structural Equation Modelling. New Jersey: IEA.
133. Schunk, D., 1991. Self-efficacy and academic motivation. Educational Psychologist, 26(3), pp. 207-231.
134. Schwieter, J. W., 2008. Preparing Students for Class: A hybrid enhancement to language learning. Journal of College Teaching Methods and Styles, 4(6), pp. 41-49.
135. Secker, J., 2010. Copyright and e-Learning: a guide for practitioners. London: Facet Publishing.
136. Sewart, D., Keegan, D. & Holmberg, B., 1983. Distance Education: International Perspectives, Kent: Croom Helm.
137. Shift e-Learning, 2012. Top eight reasons why e-learning is an effective investment. [Online] Available at: http://www.shiftelearning.com/
138. So, K. K., 2005. The e-learning readiness of teachers in Hong Kong. s.l., s.n.
139. Soydal, İ., Alır, G. & Ünal, Y., 2011. Are Turkish universities ready for e-learning: A case of Hacettepe University Faculty of Letters. Information Services and Use, 31(3-4), pp. 281-291.
140. Squire, K., 2011. Video games and learning: Teaching and participatory culture in the digital age. New York, NY: Teachers College Press.
141. Startic, A. I. & Turk, Z., 2013. E-Learning and M-Learning for Students with Special Needs; Competence Registration in Design of Personalised Learning Environment. In: Outlooks and Opportunities in Blended and Distance Learning. Hershey: Information Science Reference (an imprint of IGI Global), pp. 273-190.
142. Steffens, K., 2008. Technology Enhanced Learning Environments for self-regulated learning: a framework for research. Technology, Pedagogy and Education, 17(3), pp. 221-232.

143. Steinkuehler, C., Compton-Lilly, C. & King, E., 2011. Reading in the context of online games. Chicago, IL, USA, s.n., pp. 222-229.
144. Suhr, D. D., 2006. The Basics of Structural Equation Modeling. Irvine, s.n.
145. Thomas, W. K., 1990. Organizational Behaviour And Management: Conflict And Conflict Management. Boston: PWS-KENT Publishing Company.
146. Topaloğlu, C. & Boylu, Y., 2006. Örgütiçi Çatışmaların Türleri: Otel İşletmeleri Açısından Ayrıntılı Bir İnceleme. Sosyal Bilimler Enstitüsü Dergisi (İLKE), Volume 16.
147. Torrisi-Steele, G. & Drew, S., 2013. The literature landscape of blended learning in higher education: the need for better understanding of academic blended practice. International Journal for Academic Development, 18(4), pp. 371-383.
148. Traxler, J., 2009. The evolution of mobile learning. In: R. Guy, ed. The evolution of mobile teaching and learning. Santa Rosa, CA: s.n., pp. 1-14.
149. Trayek, F. A. A., Ahmed, T. B. T. & Nordin, M. S., 2014. E-learning readiness and its correlates among secondary school teachers in Nablus, Palestine. London, Taylor and Francis Group, pp. 229-234.
150. Trinidad, S. & Pearson, J., 2004. Implementing and evaluating e-learning environments. s.l., s.n.
151. Venkatesh, V. & Bala, H., 2008. Technology acceptance model 3 and a research agenda on interventions. Decision Sciences, 39(2), pp. 273-315.
152. Verkroost, M., Meijerink, L., Lintsen, H. & Veen, W., 2008. Finding a balance of dimensions in blended learning. International Journal on E-Learning, 19(1), pp. 499-522.
153. Wayman, J. C., 2003. Multiple Imputation For Missing Data: What Is It And How Can I Use It?. Chicago, IL, Proceedings of the 2003 Annual Meeting of the American Educational Research Association.
154. Weller, M., 2004. Learning Objects and the e-learning cost dilemma. Open Learning: The Journal of Open, Distance and e-Learning, 19(3), pp. 293-302.
155. Weller, M., 2007. Virtual Learning Environments: Using, Choosing and Developing your VLE. New York: Routledge.

156. Whitton, N., 2012. Good game design is good learning design. In: N. Whitton & A. Moseley, eds. Using games to enhance learning and teaching: A beginner's guide. London: Routledge.
157. Williams, M. L., Paprock, K. & Covington, B., 1999. Distance Learning: The Essential Guide. Thousand Oaks: Sage.
158. Wößmann, L. & West, M. R., 2006. Class-Size Effects in School Systems Around the World: Evidence from Between-Grade Variation in TIMSS. European Economic Review, 50(3), pp. 695-736.
159. Yamamoto, G. T. & Aydın, C. H., 2009. E-learning in Turkey. In: E-learning Practices. s.l.:Anadolu University, pp. 961-987.
160. Yucel, A. S., 2006. E-learning Approach in Teacher Training. Turkish Online Journal of Distance Education, 7(11), pp. 123-131.
161. Zappe, S. et al., 2009. 'Flipping' the Classroom to Explore Active Learning in a Large Undergraduate Course. Oregon, s.n.
162. Zimmerman, B., 2000. Self-efficacy: An essential motive to learn. Contemporary Educational Psychology, 25(1), pp. 82-91.

Appendix I: Teacher Questionnaire


MEASURING TEACHERS' READINESS FOR E-LEARNING

Are you ready for e-learning? This survey is designed to investigate the use of information and communication technologies by higher education institutions associated with the science of electricity (e.g. department of electrical and electronic engineering, department of aircraft electric and electronics) in universities across Turkey. The information collected will be used to measure teachers' readiness for e-learning. The survey addresses variables of several dimensions that potentially influence the so-called e-learning readiness. The survey data will enable us to find out about how to implement e-learning in the higher education institutions associated with the science of electricity in Turkey, thereby shaping my future research work on this specific topic.

All the survey data will be handled in a strictly confidential manner. Your identity will remain anonymous in any form of our publications such as technical reports and scientific conference papers. The survey is divided into 7 sections that consist of 27 questions. It will take approximately 20 minutes to complete the survey. The survey is designed for the participation of teachers, researchers and administrators who are currently working in higher education institutions associated with the subject of electricity in Turkey. If you have any questions or suggestions related to the applicability of the survey to your department, you can contact the undersigned. I would like to thank you in advance for your time and effort. Your participation will bring an innovation to higher education institutions associated with the subject of electricity in Turkey.

Dursun Akaslan
University of Leicester, Department of Computer Science
University Road, Leicester, LE1 7RH, United Kingdom
Email: info@dursunakaslan.com.tr
Tel: +44 (0) 116 252 52 43 and Fax: +44 (0) 116 252 38 39

ATTENTION!
Only the questions marked with a red asterisk (*) are compulsory.

SECTION 1: GENERAL INFORMATION
This section is to collect information about the characteristics of the departments or programs related to electricity that you are studying.

Q01. (*) Which university are you currently working?
University: [ ]  Faculty / High School: [ ]  Department / Program: [ ]

Q02. (*) What is your gender?
o Female  o Male

Q03. (*) What is your age? Choose one of the following answers:
o 24 or less than 24 years old  o Between 25-34 years old  o Between 35-44 years old  o Between 45-54 years old  o Between 55-64 years old  o 65 or more than 65 years old

Q04. (*) What is your main responsibility at the university in which you are currently working? Choose one of the following answers.
o Teaching  o Research  o Administration

Q05. (*) Are you a member of the strategic decision making body at your university?
o Yes  o No

Q06. What is your main function in the strategic decision making body?

SECTION 2: TECHNOLOGY
This section is to find out whether you have access to computers and the Internet.

Q07. (*) Do you have access to the Internet at home?
o Yes [Go to Q08]  o No [Go to Q10]

Q08. (*) What type of Internet connection do you have at home? Choose one of the following answers.
o ADSL  o Modem  o Other: [ ]

Q09. Please elaborate the technical information about your home Internet connection, if available.

Q10. (*) If you have access to the Internet at your Faculty/High School, what type of Internet connection do you have there? Choose one of the following answers.
o Wired [Go to Q12]  o Wireless [Go to Q12]  o Both Wireless and Wired [Go to Q12]  o I do not have access to the Internet at university. [Go to Q11]

Q11. As you do not have access to the Internet at your university, could you please explain where you normally have access to the Internet?

Q12. (*) To what extent are you satisfied with the stability of your University network?
o Not at all  o A bit  o Medium  o Large  o Very Large

Q13. (*) To which extent do you agree with each of the following statements concerning your use of information and communication technologies (ICT)?
- I use the Internet as information source.
- I use e-mail as the main communication tool with my students and colleagues.
- I use office software for content delivery and demonstration.
- I use social network sites.
- I use electrical software.
- I use instant messaging.

Q14. (*) To which extent are you confident in using each of the following ICT applications?
- Computers
- Web Browsers
- Search Engines
- Digital File Management Tools
- Authoring Tools to Create Learning Materials

Q15. Please, elaborate other ICTs that you have experiences or confidences.

SECTION 3: INDIVIDUAL
This section is to find out about the degree to which you have experience and confidence in the use of ICT applications and your attitudes towards e-learning.

Q16. (*) To which extent do you agree with each of the following statements?
- I have information about what e-learning is.
- I have enough ICT competency to prepare e-learning materials.
- I feel that I am ready to integrate e-learning in my teaching.
- I have enough time to prepare e-learning materials.
- I believe my students will like e-learning.

Q17. Please elaborate your personal experiences and views on e-learning:

Q18. To what extent do you think that the top-level administration of your Faculty / High School:
- understand what e-learning is?
- support the use of e-learning in your university?

SECTION 4: INSTITUTION
This section is to find out whether higher education institutions associated with the subject of electricity in Turkey currently apply e-learning.

Q19. Is e-learning applied at the university in which you are currently working? Please, select all applicable.
o Yes, e-learning is applied in my department.
o Yes, e-learning is applied in my faculty / high school.
o No, e-learning is not applied in any of the departments in my university.

SECTION 5: CONTENT
This section is to find out whether e-learning is suitable for the subject of electricity in terms of two concepts: quality and practicality.

Q20. To what extent do you agree with each of the following statements concerning the use of e-learning in electricity education?
- E-learning can enhance the quality of the theoretical part of the subject electricity.
- E-learning can enhance the quality of the practical part of the subject electricity.
- E-learning can be applied to the theoretical part of the subject electricity.
- E-learning can be applied to the practical part of the subject electricity.

SECTION 6: ACCEPTANCE
This section is to find out whether you tend to accept or reject e-learning in terms of perceived usefulness and ease of use.

Q21. To what extent do you agree with each of the following statements concerning your beliefs about e-learning?
- I believe that e-learning can improve the quality of my teaching.
- I believe that using e-learning can increase my productivity.
- I believe that e-learning is useful for my research.
- I believe that e-learning enables me to accomplish my teaching more effectively than the traditional classroom-based approach.
- I believe that it is easy for me to use e-learning tools (e.g. virtual learning environment (VLE)).
- I believe that my students find it easy to use VLE.

Q22. Please, elaborate how useful and how easy it is for you to use e-learning in your work:

Q23. Our research goal is to implement e-learning in academic institutions providing education and training in the science of electricity in Turkey. If the results of this survey suggest that your institution seems to be ready for e-learning, how strong is your intention to support the integration of e-learning in your own department?
o Not at all  o A bit  o Medium  o Large  o Very large

SECTION 7: TRAINING
This section is to find out the perceived need of training on e-learning before embarking on it.

Q24. To what extent do you agree with each of the following statements concerning the need for e-learning training before embarking on it?
- I need training on e-learning.
- My students need training on e-learning.
- Technical and administrative personnel need training.
- The facilities of the university are not enough for e-learning.

Q25. Please, elaborate the types of training you have in mind:

SECTION 8: INTERVIEW REQUEST
Q26. (*) To deepen our understanding of various issues in connection to e-learning readiness, we would like to interview you and other respondents via phone. Would you like to be interviewed or to receive the results of the survey?
o I would like to be interviewed.
o I would like to learn the results of the survey.
o I would like both.
o I do not want either to be interviewed or to learn the results of the survey.

Q27. Please provide us the following information to contact you:
Full Name: [ ]
E-mail: [ ]

SECTION 9: COMMENT AREA
If you would like to give any other comment on e-learning or on this research, or you would like to express your ideas on how e-learning can help to solve current issues in the science of electricity, or to suggest anything, please feel free to write here:

Appendix II: Student Questionnaire


MEASURING STUDENTS' READINESS FOR E-LEARNING

Are you ready for e-learning? This survey is designed to investigate the use of information and communication technologies by higher education institutions associated with the science of electricity (e.g. department of electrical and electronic engineering, department of aircraft electric and electronics) in universities across Turkey. The information collected will be used to measure students' readiness for e-learning. The survey addresses variables of several dimensions that potentially influence the so-called e-learning readiness. The survey data will enable us to find out about how to implement e-learning in the higher education institutions associated with the science of electricity in Turkey, thereby shaping my future research work on this specific topic. All the survey data will be handled in a strictly confidential manner. Your identity will remain anonymous in any form of our publications such as technical reports and scientific conference papers.

The survey is divided into 7 sections that consist of 45 questions. It will take approximately 25 minutes to complete the survey. The survey is designed for the participation of undergraduate, master and PhD students who are currently studying in higher education institutions associated with the subject of electricity in Turkey. If you have any questions or suggestions related to the applicability of the survey to your department, you can contact the undersigned. I would like to thank you in advance for your time and effort. Your participation will bring an innovation to higher education institutions associated with the subject of electricity in Turkey.

Dursun Akaslan
University of Leicester, Department of Computer Science
University Road, Leicester, LE1 7RH, United Kingdom
Email: info@dursunakaslan.com.tr
Tel: +44 (0) 116 252 52 43 and Fax: +44 (0) 116 252 38 39

ATTENTION!
Only the questions marked with a red asterisk (*) are compulsory.
The survey consists of 45 questions in total.

PRELIMINARY INFORMATION FOR STUDENT SURVEY

What is Readiness?
Readiness is defined as willingness to do something or a state of being prepared for something. Readiness for something or being prepared for something means that individuals, in order to gain behaviour, must have the preliminary information, skill, familiarity, interest, attitude and maturation for that behaviour.

What are Information and Communication Technologies (ICT)?
Information and communication technologies (ICT) concern any device or application (e.g. the Internet, television, radio and computers) for communicating, creating, storing, disseminating and managing information electronically.

What is E-learning?
E-learning is defined as learning using the Internet, intranet or a computer network that transcends time and location constraints.

What is Readiness for E-learning?
Readiness for e-learning is defined as the mental or physical preparedness of an organization for some e-learning experience or action. There are several components that should be considered in the assessment process, namely technological, content, institutional and individual readiness.


SECTION 1: GENERAL INFORMATION

This section is to collect information about the characteristics of the department or program related to
electricity in which you are studying.

Q01. (*) At what Educational Level are you studying? Choose one of the following answers.
o Associate
o Bachelor
o Master
o Master without Thesis
o Doctorate
o Other [ ]

Q02. (*) In which University are you studying? Choose one of the following answers.
o Abant Izzet Baysal University
o Acibadem University
o Adiyaman University
o Adnan Menderes University
o Afyon Kocatepe University
o Other [ ]

Q03. (*) In which Faculty / High School / Institute are you studying? Choose one of the following answers.
o Faculty of Electrical and Electronics
o Institute of Science (and Technology)
o Vocational High School
o Faculty of Engineering and Architecture
o Faculty of Engineering
o School of Civil Aviation
o Faculty of Technical Education
o Faculty of Technology
o Other [ ]

Q04. (*) In which Department / Program are you studying? Choose one of the following answers.
o Aircraft Electrical and Electronics
o Avionics
o Electrical and Electronic Engineering
o Electrical Engineering
o Electrical Education
o Electricity
o Electrical Energy Generation, Transmission and Distribution
o Rail Systems Electrical and Electronic Technology
o Other [ ]

Q05. (*) In which year are you studying at your university? Choose one of the following answers.
o Foreign Language Preparatory
o Scientific Preparatory
o Year 1
o Year 2
o Year 3
o Year 4
o Other [ ]

Q06. (*) Is your university Private or Public? Choose one of the following answers.
o Private University
o Public University

Q07. (*) What is your gender?
o Female
o Male

Q08. (*) What is your age? Choose one of the following answers.
o 16 years old and under
o Between 17 and 21 years old
o Between 22 and 26 years old
o Between 27 and 31 years old
o 32 years old and over

Q09. (*) Do you live with your family? Choose one of the following answers.
o Yes [Go to Q10]
o No [Go to Q11]

Q10. (*) Do you have a private room to study at home?
o Yes
o No

Q11. (*) What kind of residence do you live in during your study at university? Choose one of the
following answers.
o Rented Flat/House/Apartment
o Relatives' Dwelling
o Hostel
o KYK Residence
o Private Residence
o University Residence
o Other [ ]

Q12. (*) How do you usually travel from your residence to university? Choose one of the following
answers.
o Walk
o Bicycle
o Public Bus
o Private Bus
o Private Car
o Other [ ]

Q13. (*) How many minutes do you usually spend on this travel? Choose one of the following answers.
o Less than 15 minutes
o Between 15 and 30 minutes
o Between 30 and 45 minutes
o Between 45 and 60 minutes
o Between 60 and 90 minutes
o Between 90 and 120 minutes
o More than 120 minutes

SECTION 2: TECHNOLOGY

This section is to find out whether you have access to computers and the Internet.

Q14. (*) Do you have access to a Desktop or Laptop Computer connected to the Internet at the residence
where you live?
o Yes [Go to Q15]
o No [Go to Q16]

Q15. (*) To what extent do you agree with the following statements?
o I am satisfied with the Stability of the Internet access at the place I live.
o I am satisfied with the Speed of the Internet access at the place I live.


Q16. (*) Do you have access to a Desktop or Laptop Computer connected to the Internet at the University
where you study?
o Yes [Go to Q17]
o No [Go to Q18]

Q17. (*) To what extent do you agree with the following statements?
o I am satisfied with the Stability of the Internet access at university.
o I am satisfied with the Speed of the Internet access at university.

SECTION 3: PEOPLE

This section is to find out the degree to which you have experience and confidence in the use of ICT
applications and your attitudes towards e-learning.

Q18. (*) To what extent do you agree with the following statements?
o I use the Internet as an information source.
o I use e-mail for communication with my peers.
o I use office software (e.g. Microsoft Office and OpenOffice) for my coursework.
o I use social network sites (e.g. Facebook, Orkut, etc.).
o I use instant messaging software (e.g. MSN, Yahoo and Skype).
o I use engineering software (e.g. AutoCAD, Matlab).

Q19. Please elaborate on your experience of using ICT applications (e.g. which systems, for what purposes,
how long):

Q20. (*) To what extent do you agree with the following statements?
o I use computers (e.g. notebooks, desktop computers) confidently.
o I use web browsers (e.g. Internet Explorer, Firefox) confidently.
o I use search engines (e.g. Google, MSN Search, etc.) confidently.
o I use digital file management tools (e.g. deleting or renaming a file on my computer) confidently.
o I use authoring tools to create learning materials (e.g. Movie Maker, Microsoft Publisher) confidently.

Q21. Please elaborate on your confidence in using ICT applications (e.g. difficulty, pleasure, frustration,
confidence):

Q22. (*) To what extent do you agree with the following statements?
o I have enough information about e-learning.
o I have enough ICT competencies to prepare my coursework in electronic format.
o I have enough time to prepare my coursework in electronic format.
o I feel I am ready for e-learning.
o I support the use of e-learning in my department / program.
o I will like e-learning.

Q23. Please elaborate on your personal experiences of and views on e-learning:

Q24. (*) To what extent do you agree with the following statements?
o My teachers have enough information about e-learning.
o My peers have enough information about e-learning.
o My teachers support the use of e-learning in my department / program.
o My peers support the use of e-learning in my department / program.
o My teachers will like e-learning.
o My peers will like e-learning.

Q25. Please elaborate on your above ratings regarding your teachers' and peers' knowledge of and attitudes
towards e-learning:

Q26. (*) To what extent do you agree with the following statements?
o I take notes of learning activities (e.g. lectures, books, seminars).
o I note the details of the learning activity, specifying its objectives, processes and outcomes.
o In note taking, I identify relationships between the concepts addressed in the learning activity with a
tool (e.g. mind mapping).
o I can work well in groups to implement a given collaborative task.
o I am skilful in sharing and discussing knowledge with my teammates.
o I manage my contribution to the group work professionally with the use of a tool (e.g. Google Docs).
o I make timetables and lists of activities to organize my tasks.
o I have the discipline to plan and manage my time during study.
o I manage the integrity of my timetable efficiently with the use of a specific tool (e.g. Google Calendar,
mobile phone calendar).

Q27. (*) To what extent do you agree with the following statements?
o I can remember what I have just read when I get to the end of a chapter.
o I know how to pick out what is important in the text and identify the main ideas.
o I annotate the text with the use of a tool to document my reflection on its content.
o I attend classes regularly.
o I carefully prepare myself for most class sessions.
o I discuss issues in classes to clarify them and update my personal notes accordingly with the use of a
specific tool.
o I can start writing without feeling overwhelmed after receiving an assignment on a certain topic.
o I present ideas clearly in my own words without simply reproducing what I have read and heard about
the topic.
o I document, review and revise my writing on the topic iteratively with the help of a tool (e.g. MS Word).


Q28. Please elaborate on whether and how you are, or would like to be, a successful learner:

Q29. (*) To what extent do you agree with the following statements?
o I set my objectives and prioritize them when undertaking a task.
o I keep track of the progress of a task and adjust my strategies.
o I can evaluate my own performance and identify my strengths and weaknesses with the use of a specific
tool.
o My moods or personal problems seldom prevent me from completing my tasks.
o I can concentrate on studying without being easily distracted.
o I know how to sustain my motivation and persist in accomplishing a task despite the difficulties
experienced.

Q30. Please elaborate on whether and how you are, or would like to be, a self-directed learner:

SECTION 4: INSTITUTION

This section is to find out whether higher education institutions associated with the subject of electricity in
Turkey currently apply e-learning.

Q31. (*) Is e-learning applied at the University in which you are studying?
o Yes [Go to Q32]
o No [Go to Q33]

Q32. Please elaborate on the e-learning applications being used by your University:

Q33. Please elaborate on the reasons why e-learning is not yet applied at your University:

Q34. (*) Is e-learning applied in the Faculty / High School / Institute in which you are studying?
o Yes [Go to Q35]
o No [Go to Q36]

Q35. Please elaborate on the e-learning applications, if any, being used by your Faculty / High School /
Institute:

Q36. Please elaborate on the reasons why e-learning is not yet applied in your Faculty / High School /
Institute:


Q37. (*) Is e-learning applied in the Department / Program in which you are studying?
o Yes [Go to Q38]
o No [Go to Q39]

Q38. Please elaborate on the e-learning applications, if any, being used by your Department / Program:

Q39. Please elaborate on the reasons why e-learning is not yet applied in your Department / Program:

SECTION 5: CONTENT

This section is to find out whether e-learning is suitable for the subject of electricity in terms of two
concepts: quality and practicality.

Q40. (*) To what extent do you agree with the following statements?
o E-learning can be applied to the Theoretical Part of the subject electricity.
o E-learning can enhance the quality of the Theoretical Part of the subject electricity.
o E-learning can be applied to the Practical Part of the subject electricity.
o E-learning can enhance the quality of the Practical Part of the subject electricity.

Q41. Please elaborate on whether and why e-learning is (not) suitable for certain topics in the subject of
electricity:

SECTION 6: ACCEPTANCE

This section is to find out whether e-learning is acceptable for your studies in terms of two concepts:
usefulness and ease of use.

Q42. (*) To what extent do you agree with the following statements?
o E-learning will improve the quality of my learning experience.
o E-learning will improve the quality of my outcomes.
o E-learning will increase my productivity.
o E-learning will be useful for my studies.
o E-learning will enable me to accomplish my studies more effectively than the traditional
classroom-based approach.
o E-learning tools will be easy to use for my teachers.
o E-learning tools will be easy to use for my peers.
o E-learning tools will be easy to use for me.

Q43. Please elaborate on whether it would be useful and easy to apply e-learning approaches and tools in
your studies:

SECTION 7: TRAINING

This section is to find out the perceived need for training on e-learning before embarking on it.

Q44. (*) To what extent do you agree with the following statements?
o I need training on e-learning.
o My teachers need training on e-learning.
o My peers need training on e-learning.
o Technical and administrative personnel need training on e-learning.
o The facilities of the university are not sufficient for e-learning.

Q45. Please elaborate on the training that you need for e-learning:

SECTION 8: COMMENT AREA

Do you have any comments, suggestions or questions regarding e-learning or our research? Please use the
following space to express your feelings, beliefs or concerns:
