Conducting an Online Survey in an Academic Setting: Our Experience and Findings
Moloo R K & Pudaruth S
Faculty of Engineering,
University of Mauritius
Reduit
Email: s.pudaruth@uom.ac.mu
Abstract
With ever-increasing access to the Internet in Mauritius, online surveys might soon
replace traditional survey methods. If properly conducted, online surveys provide
an easy, cheap, reliable and quick way of obtaining and analysing valuable data
from a wide and diverse population. In this paper, we present the effectiveness of
an online survey carried out among students of the University of Mauritius to evaluate
the Introduction to Information Technology (IT) module. We share our experience
in designing and implementing the survey and in collecting the required
information. We show how effective, time-saving and cost-saving online surveys can be in
collecting and analysing information in an academic institution. The survey further
analyses gender differences, mean completion time, time of day and the effectiveness of
open-ended and close-ended questions, aspects which have been sparsely considered by
earlier researchers. We also present the varied advantages online surveys can bring
and our experience of the techniques we used to encourage students to answer the
survey forms. The survey was carried out online and targeted 1660 students. The
response rate was 29.6%, covering 492 students.
1. INTRODUCTION
The success of online surveys relies on the IT literacy of the sampled population.
The National ICT Strategic Plan (NICTSP) 2007-2011 [1] aims to make
Mauritius a digital island. The latest statistics show that Mauritius has a 29.6%
Internet penetration ratio, with 380,000 Internet users [2]. This figure, however,
includes business users as well. The age distribution of these users can be
estimated from the demographic profile of Facebook [6] users in Mauritius. We firmly
believe that this demographic profile reflects the age range of Internet users in our
country. Around 55% of users fall in the 15-24 age range, 27% in the 25-34 age
group, 8% in the 35-44 age bracket, and 5% are aged above 45.
Therefore, it can be reasonably inferred that the majority of web users in Mauritius
are in the 15-24 age group.
Based on the facts and figures mentioned above, and with an ever-increasing IT literacy
ratio, we firmly believe that online surveys have a bright future over traditional
methods in our country. To our knowledge, practically no methodological study on
online surveys has been carried out in Mauritius to determine the impact, validity
and response rates of this kind of medium. Online polls or one-question surveys
often appear on some Mauritian websites, especially media websites, but these
cannot be considered full-fledged surveys. All the major firms specialised in
conducting surveys in Mauritius, including the Mauritius Central Statistics Office (the
organisation responsible for the official statistics pertaining to our country),
acknowledge that response rates are very low via this method. Traditional ways of
carrying out surveys in Mauritius are mainly questionnaires and phone interviews.
A committee was set up to this effect to look into the matter and, among other
things, it was decided to hold a survey for this module to gather the requirements of
the different stakeholders and their evaluation of the current situation. This
was an excellent opportunity for us to test the efficacy of online surveys at UoM.
A questionnaire was devised to collect the required information. A free online e-
learning platform powered by ATutor [5] was used to host the survey online. The
questionnaire consisted of 29 questions, out of which 22 were Multiple Choice
Questions (MCQs), 3 were Multiple Choice Multiple Answers questions
(MCMAQs) and 4 were open-ended questions. It took one day to put the survey
online and test it. Instructions were given on how to access and complete the survey.
Tutors of this module were asked to motivate their students to fill in the survey.
This survey targeted the whole student population of 1660 taking the Introduction to IT
module across the five faculties of UoM, namely Engineering, Law and
Management, Science, Humanities and Agriculture, to determine their needs.
Students were required to register and take up the survey. A limit of one attempt was
enforced to avoid bias. It is to be noted that no incentives such as gifts or money were
given to students upon completion of the survey. The survey was conducted from the
23rd to the 28th of November 2009. The students were given 5 days to
answer the survey form in their own free time, whether at home or on campus. An
extension of 1 day was also allowed after the deadline. On day 3, a reminder was
sent to them by email. A response rate of 29.6% was achieved, with 492 responses.
The data was collected, processed and analysed using the statistical tools of the
ATutor [5] platform.
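The headline response-rate figure can be reproduced with a one-line computation; the sketch below uses the counts reported in this paper (1660 targeted students, 492 responses).

```python
# Response-rate computation for the survey described above
# (figures from the paper: 1660 students targeted, 492 responses).
targeted = 1660
responses = 492

response_rate = responses / targeted * 100
print(f"Response rate: {response_rate:.1f}%")  # prints "Response rate: 29.6%"
```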
The purpose of this paper, however, is not to comment on the survey's findings but
rather to share our experience of the design, implementation and conduct of online
surveys in an academic environment. We comment on the survey's response rate,
overall and per faculty. The benefits and shortcomings of an online survey are also
outlined, and we comment on the answering behaviour of students.
2. LITERATURE REVIEW
The first world-wide web user survey was posted in January of 1994 [13]. This
survey was done to get an idea of who uses the WWW and for what purposes. This
survey, with 4777 respondents in one month, was very successful and paved the way
for the countless surveys that have been conducted online since then. In 1999,
Bartel [14] surveyed Internet users in the United States using emails. He
demonstrated that email surveys can be a viable method of collecting responses from
users who are dispersed over a wide geographical area. He also listed several other
advantages such as wide penetration, cost benefits, ease of responding and reasonably
high response rates when compared with postal surveys. The problems with emails
are that they can be classified as spam and that they expire quite rapidly, i.e., people
create new email addresses all the time and abandon older ones.
Kim Sheehan [15] examined five factors which could have affected the response rates
of email surveys conducted since 1986. The factors are: the year in which the study was
done, the total number of questions in the survey, the effect of pre-notification
emails, the number of follow-up contacts and, finally, the survey topic itself. Quite
surprisingly, the factor which affected the response rate the most was
the year in which the study was undertaken. This was explained by the fact that
when the technology was new, people were highly motivated to use it. Nowadays,
however, the technology is often misused by spammers, which has created a
mistrust among users and a reluctance to reply to email surveys. Response rates for
many of the surveys oscillated between 10% and 70%.
Manfreda et al. [17] were amongst the first researchers to evaluate whether the
design of a web survey questionnaire has any impact on measurement error and on
partial and unit non-response. They argue that web questionnaires are too often
prepared by non-professionals who have little knowledge of questionnaire design.
Measurement errors result from inaccurate responses that stem from poor question
wording or questionnaire design. In [17], the authors identified three main
issues which affect the visual design of web questionnaires: the graphic
layout, the presentation of questions and the number of questions on one page. Partial
non-response was higher when logos were used on the web page. Partial non-
response was not significantly affected by having each question on a different page
if the survey was a short one; however, on average, respondents took 30% more time
to complete the survey. It was also suggested that additional research
was required to determine the ideal number of questions to put on one page. Item non-
response was smaller when logos were used and when questions were placed on
several web pages instead of one single page.
In [18], the authors investigated a simple but interesting factor which they thought
could possibly impact the response rate. An experiment was set up to test the
impact of four different salutations on response rate. Four samples of 800 alumni
from Stanford University were selected. The salutations used were Dear Stanford
Alum, Dear James, James, and Dear Mr James. Although, on average, the
personalised salutations drew more responses than the generic salutations, the
differences were not statistically significant. Thus, the authors suggested that if
personalisation of messages would be too costly in terms of time, effort and
money, the overall response rate might not suffer much with a generic salutation. They
went on to give some further suggestions: the formal personalised salutation (Dear
Mr James) may increase the response rate amongst young and old people, and the
formal personalised salutation may do better with men than with women, who
might respond more favourably to a personalised salutation such as Dear Jane.
Nevertheless, they recommended that these experiments be repeated
on different types of populations, as the small differences obtained in their study
might be due to chance alone.
The researchers in [19] demonstrated how invalid responses can be detected in
web-survey data. Since financial incentives are often given to people to
respond to online surveys, there is the possibility that some people will submit
multiple responses to the same survey. It is also very likely that some respondents
will sign in only to finish the survey questionnaire as quickly as possible, thereby
decreasing the confidence in the survey results. In this study, however, the authors
successfully implemented multiple validation protocols to remove the maximum
possible number of invalid responses. While some automated tools were used, they
believe that a manual review is still essential at the end.
In his 2005 paper [20], Wright discusses in sufficient depth both the advantages
and disadvantages of conducting online survey research. In particular, he discusses
issues such as accessibility to geographically dispersed populations, time and cost. The
main disadvantage he described was the difficulty of obtaining an unbiased sample.
He went on to compare and contrast different types of online questionnaire
authoring software packages and web survey services, describing their strengths and
limitations to a reasonable extent. In a study [21] conducted by
Apathia Inc through their website supersurvey.com, the meta-data
from 199 surveys was analysed. All the surveys were paid ones. Half of the surveys
received at least a 26% response rate. Generally, the larger the invitation list, the lower the
response rate. About 50% of survey responses arrive within one day and nearly all
arrive within two weeks. Survey invitations sent in the morning (before 9 a.m.)
achieve higher response rates and quicker response times. During the course of a
working day, the response rate decreases but the response time increases.
According to the ESOMAR Online Research 2009 report, 20% of surveys were carried
out online in 10 countries, while the average for telephone research was 18% for
these 10 countries [22]. The authors of [23] analysed the factors which could influence response
rates for online surveys. The factors considered were the saliency of the questions
(or of the survey itself), the incentives provided, whether the survey was done in an academic
setting or with a broader audience, the length of the online survey questionnaire, etc.
The effects on the response rate and response quality of the timing of follow-ups, the
types of incentives provided and the presentation of the questionnaire were
further discussed in [24]. However, these researchers felt that their results were not
conclusive and that they should be applied to other types of online surveys with
caution.
In [25], the suitability of online surveys against mail surveys on a national level
was determined. The researchers concentrated more on response quality rather than
on response rate or completion time. They concluded that both types of surveys
have similar response quality. However, they suggested that their findings might
not be relevant for countries with a low Internet penetration.
In their widely acclaimed article [26], the authors demonstrated the importance of
evidence gathered through online surveys in U.S. court cases. In particular, they
outlined a few criticisms of Internet surveys. They explained when the results from
online surveys can be used as evidence and when they are not considered valid.
3. METHODOLOGY
An online survey methodology was chosen over other survey methods for the
following reasons:
1. As a research work, we wanted to test the effectiveness of doing an online
survey, which had never been carried out at UoM.
2. Free online survey tools were available which enable one to set up
questionnaires and to collect and analyse information at practically no cost.
3. In terms of time and money, online surveys prove to be a
very cheap medium. We did not have much time to spend on face-to-face
interviews, and printing questionnaires would have been costly.
4. No sampling was needed since, with an online survey, the whole student
population could be targeted.
3.2 Sampling
For this online survey, no sampling methodology was applied, since we targeted
the whole population of 1660 students taking the Introduction to IT module.
We chose the ATutor [5] e-learning platform to host our online survey. It took
around a day to familiarise ourselves with the system, set it up and test it. Tutors of the
Introduction to IT module were then asked to motivate their students to fill in the online
survey forms. It is to be noted that no rewards were given to students upon completion
of the survey form. Students were sent the instructions, with the link to the website,
mostly via email. Students had to register on the website with their email address to be
able to access the survey form. Only one attempt at the survey was allowed. The survey
took place from the 23rd to the 28th of November 2009. The students were given 5 days
to respond to the survey. An extension of 1 day was allowed, and on day 3 a reminder
was sent to the students via email. A response rate of around 30% was achieved,
with 492 responses. The survey targeted each of the 5 faculties of UoM, namely
Engineering, Law and Management, Science, Humanities and Agriculture, to
determine their needs.
mails showed a much higher response rate than students who were contacted
otherwise. In addition, it was noted that the email reminder on day 3 gave a 6%
boost to the number of respondents.
In terms of actual numbers, the Faculty of Law and Management was the clear
winner with 203 (41.3%) responses. We also got a very significant number of responses
from the Faculty of Science: 134 (27.2%) students responded to the
survey. It was not properly understood why the response rate was so low for
Engineering students. One plausible reason for this huge difference with the
other faculties might be the predominance of males in the Faculty of Engineering.
Indeed, the ratio of males to females for the CSE1010e module for the academic year
2008-2009 was 3:1. As explained below, girls showed more willingness than boys
to fill in the online survey form.
The ratio of girls to boys for the CSE1010e module is 55:45. However, for the
survey, the girls-to-boys ratio was 69:31. This difference is quite
significant, which leads us to believe that girls show a higher response rate for online
surveys.
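As a rough check of how unlikely this gender skew is under the enrolment split, a simple chi-square computation can be sketched. The respondent counts below are approximated from the reported total (492 responses) and percentages (69:31 observed versus 55:45 expected), so they are assumptions rather than exact survey data.

```python
# Rough chi-square check: observed gender split among respondents
# (69:31) versus the split expected from module enrolment (55:45).
# Counts are approximated from the percentages reported above.
total = 492
girls_obs = round(total * 0.69)          # ~339 girls responded
boys_obs = total - girls_obs             # ~153 boys responded
girls_exp = total * 0.55                 # expected girls under 55:45
boys_exp = total * 0.45                  # expected boys under 55:45

chi2 = ((girls_obs - girls_exp) ** 2 / girls_exp
        + (boys_obs - boys_exp) ** 2 / boys_exp)
print(f"chi-square = {chi2:.1f}")        # well above the 3.84 cut-off (df=1, p=0.05)
```

With one degree of freedom, values above 3.84 indicate significance at the 5% level, consistent with the observation above that the difference is quite significant.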
Figures 2 and 3 below show the percentage of respondents for each faculty and for
each day that the survey was available online. Our aim in collecting this data was
to get an idea of the behaviour of respondents with respect to a set deadline of which
they were aware. 20% of respondents filled in the survey form on the first day itself.
Day 4 saw the largest number of responses. This can be explained by the reminder
sent on the previous day, which caused an average rise of 6% in the response rate.
Contrary to our expectations, only 17% of respondents filled in the form on the last
official day of the survey. At the last minute of the first deadline, the survey was
extended by one day and the respondents were made aware of this through another
email reminder. Only a minority of 4% responded after the deadline was extended.
Although we can roughly speak of a downward trend in the number of respondents
as the deadline approached, there are many peaks and troughs in the graph. This
leads us to conclude that there is very little correlation between the number of
respondents filling in a survey form online and the number of days left before the
deadline.
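The weak-correlation claim can be checked numerically with a Pearson coefficient. The per-day counts below are purely hypothetical illustration values (the paper reports only a few daily percentages), so both the helper and its inputs are assumptions.

```python
# Pearson correlation between survey day and number of respondents.
# NOTE: the daily counts are hypothetical illustration values, not
# the actual survey data.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

days = [1, 2, 3, 4, 5, 6]
counts = [98, 70, 85, 110, 84, 20]   # peaks and troughs, as in the graph
print(round(pearson(days, counts), 2))
```

A coefficient near zero (or only weakly negative) would support the conclusion that the number of responses is not strongly tied to the days remaining before the deadline.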
It is worth pointing out that 12% of respondents modified their submissions at a later
time, as shown in Figure 6. The Science respondents showed a greater willingness to
change their survey preferences, while the Law and Management and Engineering
respondents were not keen on modifying their survey forms once they had been
submitted.
[Figure: Number of respondents modifying their survey forms at a later time]
In fact, respondents were not told that they could modify their survey form once it
had been submitted. Thus, we conclude that respondents showed a significant degree
of enthusiasm for modifying their survey preferences at a later time. We believe that
if respondents had been told that they could modify their answers even after
submission, more of them would have done so. This in turn shows the flexibility of
an online survey.
Figure 7 shows the mean time and standard deviation (in minutes) taken to complete
the survey form. As can be seen, a respondent took 16 minutes on average to complete
the survey, with a standard deviation of 11 minutes. It is to be noted that respondents
who took more than 1 hour or less than 1 minute to complete the form were excluded
from the calculation of the mean and standard deviation, as these were considered
outliers.
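The outlier rule described above (discard completion times under 1 minute or over an hour) can be sketched as follows; the times list is hypothetical illustration data, not the actual survey timings.

```python
# Discard completion times under 1 minute or over 60 minutes before
# computing the mean and (population) standard deviation, as done
# above. The times (in minutes) are hypothetical illustration data.
times = [0.4, 5, 12, 16, 18, 25, 31, 75, 240]

valid = [t for t in times if 1 <= t <= 60]
mean = sum(valid) / len(valid)
std = (sum((t - mean) ** 2 for t in valid) / len(valid)) ** 0.5
print(f"kept {len(valid)} of {len(times)}, mean {mean:.1f} min, std {std:.1f} min")
```

Filtering before computing the statistics matters here: a single 4-hour session left open in a browser would otherwise dominate both the mean and the standard deviation.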
Law and Management students were the fastest, while Agriculture students took the
longest time on average. Initially, we believed that 15 minutes might be a reasonable
time to complete the form completely and reliably. A significant number of students
took less than 5 minutes, but many of them did not complete the open-ended
questions. Many students also took more than 30 minutes, possibly because they had
trouble with the open-ended questions, which required them to generate some ideas.
However, it was very encouraging to see that very few respondents submitted
completely blank forms.
Out of 29 questions, only three were MCMAQs. Figure 8 shows the responses to
these questions. For Q4, respondents selected 2.56 choices on average out of 13. For
Q5, the average was 1. This suggests that respondents considered this question more
like an MCQ than an MCMAQ. However, when we look more closely at the detailed
results, we can see that the average number of responses was higher than 1 for all
faculties except Law and Management. It is difficult to explain why Law and
Management respondents selected fewer than 1 choice on average out of 6. In Q6,
where respondents had to select one or more choices out of four, 11% of respondents
did not choose any. Again, it was the Law and Management respondents who left the
greatest number of blanks. It was felt that respondents perhaps had difficulty
understanding this question, which would explain the significant number of blank
answers it received.
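The per-question averages above (e.g. 2.56 selections for Q4) come from counting the ticked options per respondent; a minimal sketch, using hypothetical answer data:

```python
# Average number of options ticked per respondent for an MCMAQ.
# Each inner list holds the option indices one respondent selected;
# an empty list is a blank answer. Hypothetical illustration data.
answers = [[0, 3], [1], [], [2, 4, 5], [1, 3]]

avg = sum(len(a) for a in answers) / len(answers)
blank_pct = 100 * sum(1 for a in answers if not a) / len(answers)
print(f"avg selections: {avg:.2f}, blank: {blank_pct:.0f}%")
```

Note that blank answers count as zero selections, which is how an average below 1 (as seen for Law and Management respondents) can arise.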
4.7 How many respondents chose the option leave blank when attempting the
MCQs?
Overall, 4% of MCQ answers were blank, as shown in Figure 11. In terms of actual
numbers, this amounts to 411 unanswered MCQs out of a total of 10824. Q18, Q19
and Q20 drew the largest number of blank answers. These three questions were about
the practical contents of the IT module. About 8% of students were not keen on
answering these questions although they were relevant to them. This can be explained
by the fact that, although all the content for the practical sessions is available online
to all students, they do not have formal practical sessions at the University. This
explains the unwillingness of some students to comment on the practicals.
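The 4% figure quoted above follows directly from the reported totals (411 blanks out of 10824 MCQ answers):

```python
# Share of blank MCQ answers, using the totals reported above.
blank_mcqs = 411
total_mcqs = 10824

pct = blank_mcqs / total_mcqs * 100
print(f"{pct:.1f}% of MCQ answers were left blank")  # ~3.8%, quoted as 4%
```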
The question which drew the fewest blanks was the gender question. Only 1% of
respondents left it blank; apparently, it was considered very important to answer. Two
other questions, namely "How would you rate your IT knowledge prior to
CSE1010e?" and "Which mode of delivery do you think would be most appropriate
for the Introduction to IT module?", also received a 99% response rate. On average,
the last three questions received fewer blank answers than the middle questions. It
was not clearly understood why 4% of respondents did not choose their department.
Humanities students showed the greatest disposition to reply to MCQs. This result
contrasts with their reluctance to answer open-ended questions, where their response
rate was below average. Indeed, 41% of the MCQs were answered by all Humanities
respondents. This suggests that Humanities respondents were perhaps more aware of
the importance of the results than other respondents, because surveys are an
important tool in many of their studies.
Here we share some of our experience from carrying out the online survey.
Several advantages were noticed, and they are enumerated below:
1. Online surveys are cheaper than mail and phone surveys and have a faster
turn-around time [27].
2. They are flexible. People are more likely to respond to online surveys because
they can complete them at any time.
3. Online surveys are very easy to set up and manage. Many survey sites allow
people without programming skills to create relevant surveys very easily and
rapidly.
4. Online surveys eliminate interviewer bias. Respondents complete the survey by
themselves and there is no need for an interviewer.
5. Once the survey is online, it is a simple step to promote it, either through email
(with a link enclosed), via a link from a website, or through other forms of
advertising. Anyone who has the link can be connected instantly to the survey.
6. Multimedia content such as pictures, audio and video can be included. However,
this should only be done when it is certain that respondents have good broadband
facilities. Otherwise, it can have an adverse effect on the response rate [24], as
respondents are then faced with long download times which reduce their
motivation to complete and submit the questionnaire.
7. Statistical compilation is done in real time. Survey results can be monitored
throughout the duration of the survey. This is helpful for getting an idea of the
response rate and knowing when to send reminders.
8. A respondent can take as much time as needed to fully complete the survey and
can read the questions as many times as required. The respondent may also
modify his or her responses at a later time if the survey allows it.
9. It is surely the best method to collect data if the population to be sampled is an
international one.
Our survey was performed in an academic setting where we had some control over
the respondents, which can explain the commendable response rate. Students can be
easily monitored in an academic environment for the completion of the survey form;
in our case, we had the help of our colleagues in motivating and monitoring the
students. In addition, in a university environment, the population can be considered
IT literate, since every student must take an introductory course in IT, as mentioned
in the Introduction. Hence, it can be inferred that similar response behaviour can be
achieved in other academic environments, assuming that the population is IT literate.
The findings of the study can also be extended to the wider Mauritian population in
the 18-34 age range, expecting similar responses and behaviour towards online
surveys. Based on the Facebook [6] demographic profile mentioned in the
Introduction, nearly 82% of Mauritian Facebook users are in the 18-34 age bracket,
being the digital natives of our digital island, and our academic population can fairly
be described as a subset of these digital natives with much the same behaviour.
and the Internet. Also, they come from various literacy groups, making their response
to Internet surveys very low. Such groups are more comfortable with paper-and-pencil
surveys or traditional face-to-face interviews. We also believe that the findings are
restricted to countries having more or less the same characteristics as Mauritius in
terms of Internet usage, IT literacy and ICT infrastructure. Countries where IT is a
luxury will not benefit from this kind of survey.
6. CONCLUSION
The results of our online survey confirm our belief that online surveys can be
considered an effective, flexible, time- and cost-saving medium for collecting and
analysing information in academic institutions in Mauritius. The survey was
carried out at practically no cost using a free e-learning platform, ATutor [5]. Time
was spent only on setting up the online survey and contacting the students. It took
us around 1 day to set up the survey questionnaire online using ATutor [5] and 1
week to collect the results. Analysis of the results was instantaneous thanks to the
statistical tools provided by the platform. The whole student population taking the
Introduction to IT module was targeted. This was possible since we had the
mailing lists of all the different batches of students.
A commendable response rate of 29.6% was obtained without any incentives such as
gifts being provided. The emotional appeal that this would help to enhance the
course for future students was used as a motivational element for current students
to complete the survey. A reminder proved useful in boosting responses: a 6%
increase was noted. MCQs were properly answered, with only 4% of respondents
choosing the leave-blank option. MCMAQs did confuse the students, since many of
them chose only one answer even though multiple answers could have been selected.
Open-ended questions were not popular among students of the different faculties,
with an average of 18.5% left unanswered.
However, care should be taken when generalising our survey findings to other
sampled populations. Mauritius has a sound IT infrastructure with commendable
IT literacy, especially among the digital natives in the 18-34 age bracket. This
explains the responsiveness and the success of our survey. Also, the survey was
conducted in an academic environment, where students are considered to be versed
in IT, are literate, and where we had a fair degree of control over them. We believe
that online surveys with elderly Mauritians will be rather difficult and that non-
response rates will be rather high, since they are not versed in IT and the use of the
Internet.
Our survey also confirmed the flexibility provided by online surveys. Indeed,
55.5% of respondents completed the survey after working hours (17:30-23:30),
showing that respondents completed the survey at a time convenient to them, in
contrast with conventional survey methods. The mean time taken to complete our
survey was 15.5 minutes, with a standard deviation of 11.2 minutes.
Our work was based at university level, where a students' mailing list was
available, the level of computer literacy was high and the targeted age group was
between 18 and 24 years. Online surveys on a national level in Mauritius may prove
more difficult. Care should be taken when choosing the target audience, since
IT-literate Mauritians are mostly in the 15-34 age group; older people may not find
online surveys suitable. A mailing list of people should be available, and proper
incentives should be provided to get the desired response.
7. ACKNOWLEDGEMENTS
The authors would like to thank the year 2009 CSE 1010e survey sub-committee
for the design of the survey questionnaire and the CSE1010e committee for their
invaluable suggestions.
8. REFERENCES