
UNIVERSITY OF MAURITIUS RESEARCH JOURNAL Volume 17 2011

University of Mauritius, Réduit, Mauritius


Research Week 2009/2010

Conducting an Online Survey in an Academic Setting: Our Experience and Findings
Moloo R K*
Faculty of Engineering,
University of Mauritius
Reduit
Email: r.moloo@uom.ac.mu

Pudaruth S
Faculty of Engineering,
University of Mauritius
Reduit
Email: s.pudaruth@uom.ac.mu

Paper Accepted on 22 June 2011

Abstract
With the ever-increasing access to the Internet in Mauritius, online surveys might soon
replace traditional survey methods. If properly conducted, online surveys provide
an easy, cheap, reliable and quick way of obtaining and analysing valuable data
from a wide and diverse population. In this paper, we present the effectiveness of
an online survey carried out among students of the University of Mauritius to evaluate
the Introduction to Information Technology (IT) module. We share our experience
in designing and implementing the survey and in collecting the required
information. We show how effective, and how time- and cost-saving, online surveys can be
for collecting and analysing information in an academic institution. The survey further
analyses gender differences, mean completion time, time of day of response and the
effectiveness of open-ended and close-ended questions, aspects which have been only
sparsely considered by earlier researchers. We also present the varied advantages
online surveys can bring and our experience of the techniques we used to encourage
students to answer the survey forms. The survey was carried out online and targeted
1660 students. The response rate was 29.6%, covering 492 students.

Keywords: Online survey, response behaviour, ICT

*For correspondence and reprints



1. INTRODUCTION

The success of online surveys relies on the IT literacy of the sampled population.
The National ICT Strategic Plan (NICTSP) 2007-2011 [1] aims at making Mauritius a
digital island. The latest statistics show that Mauritius has a 29.6% internet
penetration rate, with 380,000 internet users [2]. This figure, however, includes
business users as well. An estimate of the users' age distribution can be obtained
from the demographic profile of Facebook [6] users in Mauritius. We firmly believe
that this profile reflects the age range of internet users in our country. Around 55%
of users fall in the 15-24 age range, 27% in the 25-34 age group, 8% in the 35-44
bracket, and 5% are aged above 45. Therefore, it can be reasonably inferred that the
majority of web users in Mauritius are in the 15-24 age group.

This can be explained by the fact that Information and Communication Technology (ICT)
is rather new in Mauritius, barely a decade old, and that a rapid expansion has been
noted in this sector with the advent of the 1998 ICTA Act [3], which aims at
empowering people and businesses with IT skills. Pushing our analysis further, it can
be deduced that the online survey audience in Mauritius will lie mainly within this
age group.

Based on the facts and figures mentioned above and on an ever-increasing IT literacy
rate, we firmly believe that online surveys have a bright future over traditional
methods in our country. To our knowledge, practically no methodological study on
online surveys has been carried out in Mauritius to determine the impact, validity
and response rates of this medium. Online polls or one-question surveys are often
seen on Mauritian websites, especially media websites, but these cannot be considered
full-fledged surveys. All the major firms specialised in conducting surveys in
Mauritius, including the Mauritius Central Statistics Office (the organisation
responsible for the official statistics of the country), acknowledge that response
rates via this medium are very low. Traditional ways of carrying out surveys in
Mauritius are mainly questionnaires and telephone interviews.

Previously, conducting an online survey required a web programmer to set up the
questionnaire online and to extract and report the data in a suitable format.
Nowadays, however, many free survey websites are available, such as surveymonkey [7],
esurveyspro [8] and surveygizmo [9], to mention a few. In addition, within the
academic environment, e-learning platforms like Moodle [10] and ATutor [5] provide
plug-ins for conducting such surveys. The advantage of using them is that they also
provide tools for analysing and exporting the data in suitable formats. Some of these
websites provide templates, customisation of questionnaires, and security measures
such as checking IP addresses, preventing users from answering more than once,
setting a time frame for completing questionnaires, and specifying start and end
dates. Their ease of use, flexibility and accessibility make them unrivalled means of
conducting surveys at practically no cost.

Following a decline in performance in the Introduction to Information Technology
(IT) module and requests from different faculties for change, the Computer Science
and Engineering (CSE) Department at the University of Mauritius decided to revamp
this module. Unlike traditional lecture-based courses, this module is delivered in a
tutorial-based mode, whereby students are provided with online materials for further
reading, practice questions, tutorials and labs. The module is mainly self-learning:
there are no formal practical classes, and students attend a one-hour tutorial-based
class every two weeks to discuss problems encountered in understanding the subject.

A committee was set up to this effect to look into the matter and, among other
things, it was decided to hold a survey for this module to gather the requirements of
the different stakeholders and their evaluation of the current situation. This was an
excellent opportunity for us to test the efficacy of online surveys at the University
of Mauritius (UoM). A questionnaire was devised to collect the required information,
and a free online e-learning platform powered by ATutor [5] was used to host the
survey. The questionnaire consisted of 29 questions, of which 22 were Multiple Choice
Questions (MCQs), 3 were Multiple Choice Multiple Answers Questions (MCMAQs) and 4
were open-ended questions. It took one day to put the survey online and test it.
Instructions were given on how to access and complete the survey, and tutors of the
module were asked to motivate their students to fill it in.

This survey targeted the whole population of 1660 students taking the Introduction to
IT module, drawn from the five faculties of UoM, namely Engineering, Law and
Management, Science, Humanities and Agriculture, in order to determine their needs.
Students were required to register and take the survey, and only one attempt was
allowed in order to avoid bias. It is to be noted that no incentives, such as gifts
or money, were given to students upon completion of the survey. The survey was
conducted from the 23rd to the 28th of November 2009. Students were given 5 days to
answer the survey form in their own free time, whether at home or on campus, and an
extension of one day was allowed after the deadline. On day 3, a reminder was sent to
them by email. A response rate of 29.6% was achieved, with 492 responses. The data
was collected, processed and analysed using the statistical tools of the ATutor [5]
platform.

The purpose of this paper, however, is not to comment on the survey's findings but
rather to share our experience of the implementation, design and conduct of online
surveys in an academic environment. We comment on the survey's response rate, overall
and per faculty. The benefits and shortcomings of an online survey are also outlined,
and we comment on the answering behaviour of students.

The rest of this paper is organised as follows. Section 2 gives an overview of
related work in conducting surveys. Section 3 discusses the methodology we used in
devising the questionnaire and in collecting and analysing information. Section 4
presents our analysis and interpretation of the results of the survey. In Section 5
we share our experience in carrying out the survey, and finally, in Section 6, we
present our conclusions.

2. LITERATURE REVIEW

The first World-Wide Web user survey was posted in January 1994 [13]. This survey was
done to get an idea of who uses the WWW and for what purposes. With 4777 respondents
in one month, it was very successful and paved the way for the innumerable surveys
that have been conducted online since then. In 1999, Sheehan and Hoy [14] surveyed
internet users in the United States using email. They demonstrated that email surveys
can be a viable method for collecting responses from users who are dispersed over a
wide geographical area. They also listed several other advantages such as wide reach,
cost benefits, ease of responding and reasonably high response rates compared with
postal surveys. The problems with email are that messages can be classified as spam
and that addresses expire quite rapidly, i.e., people create new email addresses all
the time and abandon older ones.

Kim Sheehan [15] examined five factors which could have affected the response rates
of email surveys conducted since 1986. The factors are: the year in which the study
was done, the total number of questions in the survey, the effect of pre-notification
emails, the number of follow-up contacts and, finally, the survey topic itself. Quite
surprisingly, the factor which affected the response rate the most was the year in
which the study was undertaken. This was explained by the fact that when the
technology was new, people were highly motivated to use it; nowadays, however, the
technology is often misused by spammers, which has created a mistrust among surfers
towards replying to email surveys. Response rates for many of the surveys oscillated
between 10% and 70%.

In his paper 'Classifying Response Behaviors in Web-based Surveys' [16], Bosnjak
analysed the respondents' answering process instead of the response rate. Traditional
surveys have categorised responses using only three classes: unit non-response, where
the person does not reply to the survey at all; item non-response, where the person
replies to only some of the questions; and complete response, where the person
replies to all the questions asked. Bosnjak, however, identified further classes of
responses. In particular, he identified people who look at all the survey questions
but do not answer any of them, as well as people who leave the survey half-way
through. The latter are termed drop-outs. These classes do not exist in postal
surveys.


Manfreda et al. [17] were amongst the first researchers to evaluate whether the
design of a web survey questionnaire has any impact on measurement error and on
partial and unit non-response. They argue that web questionnaires are too often
prepared by non-professionals who have little idea about questionnaire design.
Measurement errors result from inaccurate responses that stem from poor question
wording or questionnaire design. In [17], the authors identified three main issues
which affect the visual design of web questionnaires: the graphic layout, the
presentation of questions and the number of questions on one page. Partial
non-response was higher when logos were used on the web page. Partial non-response
was not significantly affected by placing each question on a separate page when the
survey was short; however, on average, respondents took 30% more time to complete the
survey. It was also suggested that additional research was required to determine the
ideal number of questions per page. Item non-response was smaller when logos were
used and when questions were placed on several web pages instead of a single page.

In [18], the authors examined a simple but interesting factor which they thought
could possibly impact the response rate. An experiment was set up to test the effect
of four different salutations on response rate. Four samples of 800 alumni from
Stanford University were selected. The salutations used were 'Dear Stanford Alum',
'Dear James', 'James', and 'Dear Mr. James'. Although, on average, the personalised
salutations drew more responses than the generic salutations, the differences were
not statistically significant. Thus, the authors suggested that if personalisation of
messages is too costly in terms of time, effort and money, the overall response rate
might not suffer much with a generic salutation. They went on to give some further
suggestions: a formal personalised salutation ('Dear Mr. James') may increase the
response rate amongst young and old people, and the formal personalised salutation
may do better with men than with women, who might respond more favourably to an
informal personalised salutation ('Dear Jane'). Nevertheless, they recommended that
these experiments be repeated on different types of populations, as the small
differences obtained in their study might be due to chance alone.

The researchers in [19] demonstrated how invalid responses might be detected among
web-survey responses. Since financial incentives are often given to people to respond
to online surveys, there is a possibility that some people will submit multiple
responses to the same survey. It is also very likely that some respondents will sign
in only to finish the questionnaire as quickly as possible, thereby decreasing the
confidence in the survey results. In this study, however, the authors successfully
implemented multiple validation protocols to remove as many invalid responses as
possible. While some automated tools were used, they believe that a manual review is
still essential at the end.

In his 2005 paper [20], Wright discusses in some depth both the advantages and
disadvantages of conducting online survey research. In particular, he discusses
issues such as access to geographically dispersed populations, time and cost. The
main disadvantage he described was the difficulty of obtaining an unbiased sample. He
went on to compare and contrast different types of online questionnaire authoring
software packages and web survey services, describing their strengths and limitations
to a reasonable extent. In a study [21] conducted by Ipathia Inc. through their
website supersurvey.com, the meta-data from 199 surveys was analysed. All the surveys
were paid ones. Half of the surveys received at least a 26% response rate. Generally,
the larger the invitation list, the lower the response rate. About 50% of survey
responses arrive within one day, and nearly all arrive within two weeks. Survey
invitations sent in the morning (before 9 a.m.) achieve higher response rates and
quicker response times. There is a decrease in response rate but an increase in
response time over the course of a working day. According to the ESOMAR Online
Research 2009 report, 20% of surveys were carried out online in 10 countries, while
the average for telephone research was 18% for these countries [22]. Cook et al. [23]
analysed the factors which could influence response rates for online surveys. The
factors considered were the saliency of the questions (or of the survey itself), the
incentives provided, whether the survey was conducted in an academic setting or with
a broader audience, the length of the online survey questionnaire, etc.


The effects of the timing of follow-ups, the types of incentives provided and the
presentation of the questionnaire on the response rate and response quality were
further discussed in [24]. However, these researchers felt that their results were
not conclusive and that they should be applied to other types of online surveys with
caution.

In [25], the suitability of online surveys as compared with mail surveys at a
national level was assessed. The researchers concentrated more on response quality
than on response rate or completion time. They concluded that both types of surveys
have similar response quality, but suggested that their findings might not be
relevant for countries with low Internet penetration.

In their widely acclaimed article [26], the authors demonstrated the importance of
evidence gathered through online surveys in U.S. court cases. In particular, they
outlined a few criticisms of internet surveys and explained when the results from
online surveys can be used as evidence and when they are not considered valid.

3. METHODOLOGY

An online survey methodology was chosen over other survey methods for the following
reasons:
1. As a research exercise, we wanted to test the effectiveness of conducting an
online survey, which had never been done at UoM.
2. Free online survey tools were available which enable questionnaires to be set up
and information to be collected and analysed at practically no cost.
3. In terms of time and money, online surveys are a very cheap medium. We did not
have much time to spend on face-to-face interviews, and printing questionnaires would
have been costly.
4. No sampling was needed, since with an online survey the whole student population
could be targeted.
5. Students taking the Introduction to IT module, which is an online, self-learning
module, were already familiar with the web and the IT environment. Hence, no
additional training was needed.
6. Most of the tutors already had the mailing lists of their students, so it was easy
to reach these students via email.
7. Studies have shown that online surveys are a more convenient medium for users,
since they enable them to respond in their own time, when they are free.

3.1 Questionnaire Design

The questions were prepared by a committee and devised according to the information
the committee required on the current Introduction to IT syllabus. Strategies to
reduce non-response [11] [12] were incorporated while preparing the questionnaire.
Care was taken not to make it too long, since respondents might lose interest, and it
was estimated that a user would take, on average, 15 minutes to fill it in. The
questionnaire consisted of 29 questions, of which 22 were Multiple Choice Questions
(MCQs), 3 were Multiple Choice Multiple Answers Questions (MCMAQs) and 4 were
open-ended questions. Questions ranged from demographic questions (e.g. gender,
faculty) to questions related to the evaluation of the current syllabus, including
questions on the mode of delivery, course contents and structure, practicals and
theory, and the elements respondents found relevant to them, per faculty. MCQs
provided 5 choices, with the last option being 'leave blank', in order to study
response behaviour. Most of the MCQs were rating questions ranging from 'strongly
agree' to 'strongly disagree'. Questions were arranged in a logical flow. The online
survey was conducted in the last week of November 2009.

3.2 Sampling

For this online survey, no sampling methodology was applied, since we targeted the
whole population of 1660 students taking the Introduction to IT module.


Nevertheless, we asked students to enter their respective faculties so that the
collected data could be stratified, allowing a better categorisation and analysis of
the response behaviour of the students.

3.3 Data Collection and Response rate

We chose the ATutor [5] e-learning platform to host our online survey. It took around
a day to familiarise ourselves with the system, set it up and test it. The tutors of
the Introduction to IT module were then asked to motivate their students to fill in
the online survey forms. It is to be noted that no rewards were given to students
upon completion of the survey form. Students were sent the instructions, together
with a link to the website, mostly via email. Students had to register on the website
with their email address to be able to access the survey form. Only one attempt at
the survey was allowed. The survey took place from the 23rd to the 28th of November
2009. The students were given 5 days to respond to the survey; an extension of one
day was allowed, and on day 3 a reminder was sent to the students via email. A
response rate of around 30% was achieved, with 492 responses. The survey covered each
of the 5 faculties of UoM, namely Engineering, Law and Management, Science,
Humanities and Agriculture, in order to determine their needs.
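
As a simple illustration of the arithmetic behind the rates reported in the next
section, the short Python sketch below computes the overall response rate and the
share of responses contributed by each faculty. Only the totals (1660 targeted, 492
responses) and the Law and Management and Science counts come from the survey; the
remaining per-faculty counts are placeholders chosen only so that the totals add up.

# Sketch of the response-rate arithmetic. The Engineering, Humanities and
# Agriculture counts below are placeholders, not the actual survey figures.
targeted_total = 1660

responses_per_faculty = {
    "Law and Management": 203,  # from the survey
    "Science": 134,             # from the survey
    "Engineering": 60,          # placeholder
    "Humanities": 50,           # placeholder
    "Agriculture": 45,          # placeholder
}

total_responses = sum(responses_per_faculty.values())          # 492
print(f"Overall response rate: {100 * total_responses / targeted_total:.1f}%")

for faculty, count in responses_per_faculty.items():
    share = 100 * count / total_responses
    print(f"{faculty}: {count} responses ({share:.1f}% of all responses)")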

4. DATA ANALYSIS AND INTERPRETATION

Overall, 29.6% of students responded to our survey, as shown in Figure 1. The highest
response percentage comes from the Faculty of Agriculture and the lowest from the
Faculty of Engineering. This is difficult to explain if we rely on the fact that
Engineering students are generally the strongest in IT while Agriculture students are
generally the weaker ones. Instructions for the survey were sent to the students via
email, and tutors and programme coordinators were asked to motivate their students to
fill in the survey form. It is to be noted that no rewards were promised to students;
the only motivational factor was the emotional appeal that this would help enhance
the course for future students. The results show that students who were regularly in
contact with their tutors and who were reached through their mailing lists showed a
much higher response rate than students who were contacted otherwise. In addition, it
was noted that the email reminder on day 3 gave a 6% boost to the number of
respondents.

Figure 1 Number of Respondents per Faculty

In terms of actual numbers, the Faculty of Law and Management was the clear winner,
with 203 (41.3%) responses. We also got a very significant number of responses from
the Faculty of Science: 134 (27.2%) students responded to the survey. It was not
properly understood why the response rate was so low for Engineering students. One
plausible reason for this large difference compared with the other faculties might be
the predominance of males in the Engineering faculty; indeed, the ratio of males to
females among Engineering students taking the CSE1010e module for the academic year
2008-2009 was 3:1. As explained below, girls showed more willingness than boys to
fill in the online survey form.

The ratio of girls to boys for the CSE1010e module as a whole is 55:45. For the
survey, however, the girls-to-boys ratio among respondents was 69:31. This difference
is quite marked, which leads us to believe that girls show a higher response rate for
online surveys.
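
As an illustration only (not part of the original analysis), one could check whether
this difference goes beyond chance with a chi-square goodness-of-fit test, assuming
the 69:31 split applies to all 492 respondents and taking the module's 55:45 ratio as
the expected distribution:

# Illustrative check: is the observed 69:31 gender split among respondents
# consistent with the 55:45 split of the module population?
from scipy.stats import chisquare

n = 492
observed = [0.69 * n, 0.31 * n]   # girls, boys among respondents
expected = [0.55 * n, 0.45 * n]   # girls, boys expected from the module ratio

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.1f}, p = {p_value:.2g}")
# A very small p-value would support the claim that girls responded at a
# higher rate than their share of the module population.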

4.1 Responses per day

Figures 2 and 3 below show the number and percentage of respondents for each faculty
and for each day that the survey was available online. Our aim in collecting this
data was to get an idea of the behaviour of respondents with respect to a deadline of
which they were aware. 20% of respondents filled in the survey form on the first day
itself. Day 4 saw the largest number of responses; this can be explained by the
reminder sent on the previous day, which caused an average rise of 6% in the response
rate. Contrary to our expectations, only 17% of respondents filled in the form on the
last official day of the survey. At the last minute of the first deadline, the survey
was extended by one day and respondents were made aware of this through another email
reminder; only a minority of 4% responded after the deadline was extended. Although
we can roughly speak of a downward trend in the number of respondents as the deadline
approached, there are many peaks and troughs in the graph. This leads us to conclude
that there is very little correlation between the number of respondents filling in a
survey form online and the number of days left before the deadline.
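
The weak relationship noted above can be quantified by correlating the day index with
the number of responses received on that day. The daily counts in the sketch below
are illustrative stand-ins, not the actual figures shown in Figures 2 and 3:

# Illustrative check of the day-versus-responses relationship.
# The daily counts are made-up stand-ins, not the actual survey data.
import numpy as np

days = np.arange(1, 7)                            # days 1-6, including the extension
daily_responses = np.array([98, 60, 75, 120, 85, 20])

r = np.corrcoef(days, daily_responses)[0, 1]
print(f"Pearson correlation between day and responses: {r:.2f}")
# A value close to zero matches our observation that proximity to the
# deadline had little effect on the number of responses on a given day.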

Figure 2 Number and Percentage of Respondents on each day

Figure 3 Percentage of Respondents on each day


4.2 Responses at various time intervals

The day was divided into three time intervals: 23:31-09:30, 09:31-17:30 and
17:31-23:30. The first interval corresponds to very late at night or early in the
morning. The second interval corresponds to the time during which the respondent was
very likely to be at the university or at work; the survey was conducted from Monday
to Friday, from the 23rd to the 27th of November, with an extension of one day
allowed on Saturday the 28th of November. The last interval corresponds to the time
during which the respondent is expected to be at home. We realise that there can be
situations where a respondent was at home but was counted as being at work or
university; the reverse is also possible, but since our data set is quite large, we
believe that the results can be trusted. As shown in Figures 4 and 5, 63% of
respondents preferred to fill in the survey form from the comfort of their home. We
argue that if all respondents had had internet facilities at home, this number would
have been even higher. Thus, we conclude that respondents viewed the survey as an
additional task which they had to complete but which did not require immediate
attention; they kept it as a secondary task to be completed at their own ease and
pace.

Figure 4 - Percentage of Respondents for each interval
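
As a minimal sketch (our own illustration, not taken from the ATutor platform), the
submission times recorded for each response can be assigned to the three intervals as
follows, assuming timestamps at minute granularity:

# Sketch: mapping a submission time of day to one of the three intervals.
from datetime import time

def interval_of(t: time) -> str:
    if time(9, 31) <= t <= time(17, 30):
        return "09:31-17:30 (at university or work)"
    if time(17, 31) <= t <= time(23, 30):
        return "17:31-23:30 (at home in the evening)"
    return "23:31-09:30 (late night or early morning)"

# Example: a form submitted at 21:05 falls in the evening interval.
print(interval_of(time(21, 5)))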

Figure 5 Time of day at which the survey form was filled in


4.3 Changing of preferences

It is worth pointing out that 12% of respondents modified their submissions at a
later time, as shown in Figure 6. The Science respondents showed a greater
willingness to change their survey preferences, while the Law and Management and
Engineering respondents were not keen on modifying their survey forms once they had
been submitted. In fact, respondents were not told that they could modify their
survey form after submission. Thus, we conclude that respondents showed a significant
degree of enthusiasm for modifying their survey preferences at a later time. We
believe that if respondents had been told that they could modify their answers even
after submission, more of them would have done so. This in turn shows the flexibility
of an online survey.

Figure 6 - Number of respondents modifying their survey forms at a later time

4.4 Mean time in completing the survey form

Figure 7 shows the mean time and standard deviation (in minutes) taken to complete
the survey form. As can be seen, a respondent took 16 minutes on average to complete
the survey, with a standard deviation of 11 minutes. It is to be noted that
respondents who took more than 1 hour or less than 1 minute to complete the form were
excluded from the calculation of the mean and standard deviation; these were
considered outliers.


Figure 7 - Mean time for completing the survey (in minutes)

Law and Management students were the fastest, while Agriculture students took the
longest time on average. Initially, we believed that 15 minutes would be a reasonable
time to complete the form fully and reliably. A significant number of students took
less than 5 minutes, but many of them did not complete the open-ended questions. Many
students also took more than 30 minutes, possibly because they had trouble with the
open-ended questions, which required them to generate some ideas. However, it was
very encouraging to see that very few respondents submitted completely blank forms.
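
The mean and standard deviation in Figure 7 were computed after discarding completion
times above one hour or below one minute. A minimal sketch of that calculation, using
illustrative completion times rather than the real data:

# Sketch: mean and standard deviation of completion times, excluding outliers.
# Times are in minutes; the list below is illustrative, not the real data.
import statistics

completion_times = [12.5, 0.4, 18.0, 7.2, 75.0, 22.3, 15.1, 9.8, 31.0, 14.6]

valid = [t for t in completion_times if 1 <= t <= 60]   # drop outliers

mean_time = statistics.mean(valid)
std_time = statistics.stdev(valid)
print(f"mean = {mean_time:.1f} min, std dev = {std_time:.1f} min (n = {len(valid)})")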

4.5 Response behaviour of students while answering Multiple Choice Multiple Answers Questions (MCMAQs)

Out of the 29 questions, only three were MCMAQs. Figure 8 shows the responses to
these questions. For Q4, respondents selected on average 2.56 choices out of 13. For
Q5, the average was 1, which suggests that respondents treated this question more
like an MCQ than an MCMAQ. However, when we look more closely at the detailed
results, we see that for all faculties except Law and Management the average number
of choices selected was higher than 1. It is difficult to explain why Law and
Management respondents selected fewer than 1 choice on average out of 6. In Q6, where
respondents had to select one or more choices out of four, 11% of respondents did not
choose any. Again, it was the Law and Management respondents who left the greatest
number of blanks. It was felt that perhaps respondents had difficulty understanding
this question, which would explain the significant number of blank answers it
received.
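
The per-question averages above come from counting how many options each respondent
ticked. A small sketch of that tally, using made-up answers where each respondent's
answer is the set of ticked options:

# Sketch: average number of options ticked for one MCMAQ (made-up answers).
answers = [{"a", "c"}, {"b"}, {"a", "d", "e"}, set(), {"c", "f"}]

average_selected = sum(len(a) for a in answers) / len(answers)
blanks = sum(1 for a in answers if not a)

print(f"average options selected: {average_selected:.2f}")
print(f"respondents leaving the question blank: {blanks} of {len(answers)}")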


Figure 8 - Respondents for the Multiple Choice Multiple Answers Questions (MCMAQs)

4.6 Responses to open ended questions

Figures 9 and 10 relate to the responses to the open-ended questions. 19% of
respondents left the open-ended questions blank, compared to 4% for the multiple
choice questions. It was noted that the last question, which asked respondents how
they think the module can be improved and invited their suggestions, got the greatest
number of responses. Furthermore, looking at the graph, it can be seen that the
number of blanks per question is very similar for each faculty. This leads us to
believe that respondents either responded to all four questions or to none. Responses
such as 'none' or 'nothing' were considered non-blank responses for the purpose of
this survey. Engineering students were the least eager to respond to open-ended
questions, while Agriculture and Science students offered the greatest number of
responses in percentage terms.

Figure 9 - Number of blank answers per faculty


Figure 10 - Respondents for the unanswered Open Ended Questions

4.7 How many respondents chose the option 'leave blank' when attempting the MCQs?

Overall, 4% of MCQ answers were blank, as shown in Figure 11. In terms of actual
numbers, this amounts to 411 unanswered MCQs out of a total of 10,824. Q18, Q19 and
Q20 drew the largest number of blank answers. These three questions were about the
practical content of the IT module. About 8% of students were not keen on answering
them, although they were relevant to them. This can be explained by the fact that,
although all the content for the practical sessions is available online to all
students, they do not have formal practical sessions at the University. This explains
the unwillingness of some students to comment on the practicals.
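
The 4% figure follows directly from the counts reported above: the 10,824 MCQ answers
match 492 respondents answering 22 MCQs each, and 411 of those answers were left
blank. The arithmetic, spelled out:

# Blank-answer rate for the MCQs, reproduced from the figures in the text.
respondents = 492
mcqs = 22
blank_answers = 411

total_answers = respondents * mcqs                 # 10,824 answers in all
blank_rate = 100 * blank_answers / total_answers   # about 3.8%, i.e. roughly 4%
print(f"{blank_answers}/{total_answers} MCQ answers were blank ({blank_rate:.1f}%)")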

The question which drew the fewest blanks was the gender question; only 1% of
respondents left it blank. Apparently, this was considered a very important question
to answer. Two other questions, namely 'How would you rate your IT knowledge prior to
CSE1010e?' and 'Which mode of delivery do you think would be most appropriate for the
Introduction to IT module?', also received a 99% response rate. On average, the last
three questions received fewer blank answers than the middle questions. It was not
clearly understood why 4% of respondents did not indicate their department.


Humanities students showed the greatest disposition to reply to MCQs. This result
contrasts with their reluctance to answer open-ended questions, where their response
rate was below average. Indeed, 41% of the MCQs were answered by all Humanities
respondents. This suggests that Humanities respondents were perhaps more aware of the
importance of the results than other respondents, since surveys are an important tool
in many of their studies.

Figure 11 - Number of 'leave blank' answers


5. OUR EXPERIENCE BASED ON THE ONLINE SURVEY CARRIED OUT

Here we share some of our experience in carrying out the online survey. Several
advantages were noticed, and they are enumerated below:
- Compared to mail and telephone surveys, online surveys are cheaper and have a
faster turn-around time [27].
- They are flexible. People are more likely to respond to online surveys because they
can complete them at any time.
- Online surveys are very easy to set up and manage. Many survey sites allow people
without programming skills to create relevant surveys very easily and rapidly.
- Online surveys eliminate interviewer bias. Respondents complete the survey by
themselves and there is no need for an interviewer.
- Once the survey is online, it is a simple step to promote it, either through email
(with a link enclosed), via a link from a website, or through other forms of
advertising. Anyone who has the link can be connected instantly to the survey.
- Multimedia content such as pictures, audio and video can be included. However,
these should only be used when it is certain that respondents have good broadband
facilities; otherwise, this can have an adverse effect on the response rate [24], as
respondents are then faced with long download times which reduce their motivation to
complete and submit the questionnaire.
- Statistical compilation is done in real time. Survey results can be monitored
throughout the duration of the survey. This is helpful for getting an idea of the
response rate and knowing when to send reminders.
- A user can take as much time as needed to complete the survey fully. He/she can
read the questions as many times as required. The respondent may also modify his/her
responses at a later time if the survey allows it.
- It is surely the best method of collecting data if the population to be sampled is
an international one.


- Although in this particular survey respondents had to register in the system with
their real names and IDs, it is possible for a survey to be conducted in an entirely
anonymous way. This ensures that the privacy and confidentiality of the data are
protected. We believe that this factor can indeed increase the response rate for
certain types of surveys.
However, online surveys do have shortcomings, and these are as follows:
- Analysing responses to open-ended questions is time-consuming.
- It is impossible to determine who actually completed the survey.
- It is difficult to know whether the respondents actually understood the questions
being asked.

5.1 Demographic response behavior

Our survey was performed in an academic setting in which we had control over the
respondents, and this can explain the commendable response rate. Students can easily
be monitored in an academic environment for the completion of survey forms, and in
our case we had the help of our colleagues in motivating and monitoring the students.
In addition, in a university environment, the population can be considered IT
literate, since every student must take an introductory course in IT, as mentioned in
the Introduction. Hence, it can be inferred that similar response behaviour can be
achieved in other academic environments, provided the population is IT literate.

The findings of the study may also be extended to the wider Mauritian population in
the 18-34 age range, for whom a similar response behaviour towards online surveys can
be expected. Based on the Facebook [6] demographic profile mentioned in the
Introduction, nearly 82% of Mauritian Facebook users are in the 18-34 age bracket,
the digital natives of our digital island, and our academic environment can fairly be
described as a subset of these digital natives with broadly the same behaviour.

Nevertheless, the findings of this survey cannot be generalised to the whole
Mauritian population, since the majority of the older Mauritian population is not
familiar with IT and the Internet. They also come from various literacy groups, which
makes response rates to internet surveys very low; such groups are more comfortable
with paper-and-pencil surveys or traditional face-to-face interviews. We also believe
that the findings are restricted to countries having more or less the same
characteristics as Mauritius in terms of Internet usage, IT literacy and ICT
infrastructure. Countries where IT is a luxury will not benefit from this kind of
survey.

6. CONCLUSION

The results of our online survey confirm our belief that online surveys can be
considered an effective, flexible, time- and cost-saving medium for collecting and
analysing information in academic institutions in Mauritius. The survey was carried
out at practically no cost using a free e-learning platform, ATutor [5]. Time was
spent only on setting up the online survey and contacting the students. It took us
around one day to set up the survey questionnaire online using ATutor [5] and one
week to collect the results. Analysis of the results was instantaneous thanks to the
statistical tools provided by the platform. The whole student population taking the
Introduction to IT module was targeted; this was possible because we had the mailing
lists of all the different batches of students.

A commendable response rate of 29.6% was obtained without any incentives, such as
gifts, being provided. The emotional appeal that the survey would help enhance the
course for future students was used as the motivational element for current students
to complete it. A reminder proved useful in boosting responses: a 6% increase in
responses was noted. MCQs were properly answered, with only 4% of answers using the
'leave blank' option. MCMAQs did confuse the students, since many of them chose only
one answer even though multiple answers could have been selected. Open-ended
questions were not popular among students of the different faculties, with an average
of 18.5% left unanswered.

However, care should be taken when generalising our survey findings to other
populations. Mauritius has a sound IT infrastructure and commendable IT literacy,
especially among the digital natives in the 18-34 age bracket; this explains the
responsiveness and the success of our survey. Also, the survey was conducted in an
academic environment, in which students are considered to be versed in IT, are
literate, and in which we had a fair degree of control over the students. It is
believed that online surveys with elderly Mauritians would be rather painful and that
non-response rates would be rather high, since they are not versed in IT and the use
of the internet.

Our survey also confirmed the flexibility provided by online surveys. Indeed, 55.5%
of respondents completed the survey after working hours (17:31-23:30), showing that,
compared with conventional survey methods, respondents could complete the survey at a
time convenient to them. The mean time taken to complete our survey was 15.5 minutes,
with a standard deviation of 11.2 minutes.

To conclude, it is strongly felt that online surveys have a promising future in
Mauritius. A major issue with any survey is the response rate. We believe that proper
monitoring, such as reminders, and incentives have a positive impact on the response.
Response quality depended on the proper setting of MCQs, MCMAQs and open-ended
questions. MCMAQs should be clearly labelled as multiple-answer questions, otherwise
students tend to choose only one answer. Unless necessary, open-ended questions
should be avoided. Proper questionnaire design, clarity of questions and proper use
of MCQs and clearly labelled MCMAQs improve the quality of the collected data.

Our work was carried out at university level, where a mailing list of students was
available, the level of computer literacy was high and the targeted age group was
between 18 and 24 years. Online surveys at a national level in Mauritius may prove
more difficult. Care should be taken when choosing the target audience, since
IT-literate Mauritians are mostly in the 15-34 age group, and older people may not
find online surveys suitable. A mailing list of people should be available and proper
incentives should be provided to obtain the desired response.


In the near future, we intend to carry out an online survey at a national level to
investigate response rate and response quality.

7. ACKNOWLEDGEMENTS

The authors would like to thank the year 2009 CSE 1010e survey sub-committee
for the design of the survey questionnaire and the CSE1010e committee for their
invaluable suggestions.

8. REFERENCES

[1] PriceWaterHouseCoopers, 2007. The National ICT Strategic Plan 2007-2011. [Online]
Available at http://www.gov.mu/portal/goc/telecomit/files/NICTSP.pdf [Accessed 10
December 2009]
[2] Internet World Stats. List of countries classified by Internet Penetration Rates.
[Online] Available at http://www.internetworldstats.com/list4.htm [Accessed 15
December 2009]
[3] ICTA, (2004,p6). The ICT Sector in Mauritius, An Overview. [Online]
Available at http://www.gov.mu/portal/goc/ncb/file/ictview.pdf [Accessed 14
December 2009]
[4] SAX, L. J., GILMARTIN, S. K. AND BRYANT, A. N. (2003). Assessing Response Rates
and Nonresponse Bias in Web and Paper Surveys. Research in Higher Education, 44(4),
409-432. [Online] Available at http://www.springerlink.com/content/v71hp772066t1q85/
[Accessed 14 December 2009]
[5] ATutor. Learning Management System [Online] Available at
http://www.atutor.ca/ [Accessed 10 December 2009]
[6] Facebook 2009. Advertise on Facebook. [Online] Available at
http://www.facebook.com/ads/create/ [Accessed 10 December 2009]
[7] surveymonkey 2009. [Online] Available at http://www.surveymonkey.com
[Accessed 5 December 2009]


[8] esurveyspro 2009. [Online] Available at http://www.esurveyspro.com/ [Accessed 2
December 2009]
[9] surveygizmo 2009. [Online] Available at http://www.surveygizmo.com/
[Accessed 2 December 2009]
[10]moodle 2009. What is Moodle. [Online] Available at http://moodle.org/
[Accessed 3 December 2009]
[11] TILLMAN, M. I., Amplitude Research Inc. Online Market Research Surveys -
Maximizing Your ROI. [Online] Available at
http://www.greenbook.org/marketing-research.cfm/online-market-research-surveys
[Accessed 13 December 2009]
[12] COOK ET AL. (2000). A meta-analysis of response rates in web or internet based
surveys. Educational and Psychological Measurement, 60(6), 821-836.
[13] PITKOW, J. AND RECKER, M. (1994). Results from the First World-Wide Web User
Survey. Computer Networks and ISDN Systems, 27(2), 243-254.
[14] SHEEHAN, K AND HOY, M. (1999). Using E-mail To Survey Internet
Users In The United States: Methodology and Assessment. Journal of
Computer Mediated Communication, 4(3).
[15] SHEEHAN, K. (2001). E-mail Survey Response Rates: A Review. Journal of
Computer Mediated Communication, 6(2).
[16]BOSNJAK, M AND TUTEN, T. (2001). Classifying Response Behaviors in
Web-based Surveys. Journal of Computer Mediated Communication, 6(3).
[17]MANFREDA, K., BATAGELJ, Z., AND VEHOVAR, V. (2002). Design of
Web Survey Questionnaires: Three Basic Experiments. Journal of Computer
Mediated Communication, 7(3).
[18]PEARSON, J AND LEVINE, R. (2003). Salutations and Response Rates to
Online Surveys. Fourth International Conference on the Impact of Technology
on the Survey Process.
[19] KONSTAN, J., ROSSER, B., ROSS, W., STANTON, J., AND EDWARDS,
W. (2005). The Story of Subject Naught: A Cautionary but Optimistic Tale of
Internet Survey Research. Journal of Computer Mediated Communication,
10(2).


[20] WRIGHT, K. (2005). Researching Internet-Based Populations: Advantages and
Disadvantages of Online Survey Research, Online Questionnaire Authoring Software
Packages, and Web Survey Services. Journal of Computer Mediated Communication, 10(3).
[21] HAMILTON, M. B., Ipathia Inc. Online Survey Response Rates and Times: Background
and Guidance from Industry. [Online] Available at
http://www.supersurvey.com/papers/supersurvey_white_paper_response_rates.pdf
[Accessed 2 February 2011]
[22]http://blog.vovici.com/blog/bid/22879/ESOMAR-Online-Research-2009
[23]COOK, C., HEATH, F., & THOMPSON, R. (2000). A meta-analysis of
response rates in Web- or Internet-based surveys. Educational & Psychological
Measurement, 60 (6), 821-36.
[24] DEUTSKENS, E., DE RUYTER, K., WETZELS, M., & OOSTERVELD, P. (2004). Response
rate and response quality of internet-based surveys: An experimental study. Marketing
Letters, 15, 21-36.
[25] DEUTSKENS, E., DE JONG, A., DE RUYTER, K., & WETZELS, M. (2006). Comparing the
generalizability of online and mail surveys in cross-national service quality
research. Marketing Letters, 17(2), 119-136.
[26]DALTON, J AND HICKEY, A. (2009). Proceed with Caution: Internet-Based
Survey Research. Bloomberg Finance, Vol 2, No. 2.
[27]DILLMAN, D. A. (2000). Mail and Internet surveys: The tailored design
method. 2nd ed. New York.Wiley
