
_________________________________________________________________________

The LMS moodle: A Usability Evaluation


JAY MELTON
Prefectural University of Kumamoto

As more and more technology finds its way into language courses, software-adoption decisions become more and more complicated. The area of human-computer interaction (HCI) has much to offer those of us in language teaching and research. While conceived specifically to aid in the design process, HCI testing tools such as the DECIDE framework can help language teachers and researchers get an idea of how usable a particular software package is before it is put to use on a wider scale. This small, preliminary study examined the usability of the learning management system (LMS) moodle's registration process and assignment submission module. While users were generally successful in the intermediate tasks, one-half were not able to complete the final task of submitting an assignment. While more study is needed to confirm the causes, it is possible that both a lack of experience with a wide variety of computer-related tasks and the use of the L2 in the interface played a part in the results.

_________________________________________________________________________

INTRODUCTION

Computing technology is finding its way into more and different kinds of classrooms. Courses that have been taught in more traditional ways, such as foreign language courses, are now using various technologies to enhance lessons. Asynchronous technologies such as email have been used for some time now to widen the scope of learning opportunities for language learners (Chapelle, 2001; Gaer, 1999; Kern & Warschauer, 2000; Peyton, 1999). More recently, discussion boards, or bulletin board systems (BBS), and blogs and wikis have been implemented to further expand the possibilities for these students (Field, 2002; Lavin & Tomei, 2006; Ward, 2004). In addition, synchronous technologies, such as chat, are being put to use in language courses (Pellettieri, 2000). Due to both the ubiquity of the Internet and increased availability of network bandwidth, powerful packages known as learning management systems (LMS) are being developed to enhance learning in a variety of environments. LMSs, sometimes called course or content management systems, combine these and other learning technologies to enable hybrid or online learning (Ko & Rossen, 2004; Waterhouse, 2005). Appropriate use of these packages can help to augment more traditional, teacher-centered courses (McArthur, Parker, & Giersch, 2003). Students may communicate with their instructors and each other in learning communities, access
learning material, take quizzes, and submit assignments, all using the power of the Internet (Ko & Rossen; Preece, 2000; Waterhouse). Well-known commercial LMSs are WebCT (2005) and Blackboard (2005). These systems are very powerful, but they require high fees, which may be prohibitive for educators and institutions working with limited budgets. Educators restricted by these fiscal limitations are searching for packages in other areas, such as open source software. One promising LMS is moodle (Dougiamas, 2004), a robust, fully-featured package incorporating not only the technologies discussed above, but many others (e.g., journals, quizzes, assignments, glossaries, surveys, polls, wikis). Due to the nature of open source software, moodle (modular object-oriented dynamic learning environment) is under constant revision and feature enhancement.

It is not enough, however, to pick a package based only on its price or feature list. Educators considering implementing educational technology must carefully evaluate it before putting it to use with a student population (Colace, Santo, & Vento, 2002; Iding, Auernheimer, Crosby, & Klemm, 2002). Research into human-computer interaction (HCI) tells us that a major design element is a technology's usability (Preece, Rogers, & Sharp, 2002; Rozanski & Haake, 2003). Usability pioneer Nielsen (2003) refers to usability as "a quality attribute [his emphasis] that assesses how easy user interfaces are to use" (para. 3). Nevertheless, useful tools for evaluating software have traditionally been the domain of HCI specialists. HCI techniques are used in the designing, planning, and programming of software and hardware to ensure that products not only work as intended, but that they work well, too. Why should those involved in design have all the fun? Those of us working with various educational technologies would like to have access to these tools as well. Indeed, some cross-disciplinary cooperation could do everyone a bit of good. Those involved in the design of tools could benefit from a broader scope of testing, and those using the tools should have better knowledge of what to look for when preparing for their courses. An attempt at the latter was made at an HCI-themed computer assisted language learning conference about the same time this study was conducted (Melton, 2004).

This study is a preliminary one to determine if moodle's registration system and assignment submission module have sufficient levels of usability for the study of English writing by Japanese science graduate students. The number of participants, four, is therefore too small to support any sweeping claims. However, as Nielsen (1994) has written, a clear picture of a software package's usability can be quickly determined with three to five users. Although usability is one of the primary goals of moodle's designers, the design process itself is not of concern to the present work. The author is not involved in the design process of moodle. However, it is hoped that readers will be able to get a better idea of how to
evaluate software and ensure that the software we choose for our students has a high level of usability.

METHODOLOGY

Planning and Design

A first step in the planning stage was to adopt a framework with which to design the usability evaluation. Preece et al. (2002) describe the DECIDE framework, which is suitable for the evaluation under consideration here (p. 348). The six components of the DECIDE framework are:
1. Determine the overall goals that the evaluation addresses.
2. Explore the specific questions to be answered.
3. Choose the evaluation paradigm and techniques to answer the questions.
4. Identify the practical issues that must be addressed, such as selecting participants.
5. Decide how to deal with ethical issues.
6. Evaluate, interpret, and present the data. (p. 348)
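To make the planning concrete, the six decisions for this particular evaluation could be jotted down in a simple structure like the Python sketch below. This is purely an illustration added here, not part of the original methodology; the entries paraphrase choices described in the sections that follow.

```python
# Illustrative only: one way to record the DECIDE decisions for this evaluation.
# The values paraphrase the study's reported choices; the structure itself is
# not an artifact of the study.
decide_plan = {
    "Determine": "Can new users register, join a course, and submit an assignment in moodle?",
    "Explore":   "Is moodle usable enough for L2 users new to the LMS to finish the tasks unaided?",
    "Choose":    "Usability testing: observation notes, video recording, think-aloud, posttest comments",
    "Identify":  "Four graduate students, all Macintosh users, tested in a quiet campus office",
    "Decide":    "Participants informed, free to stop at any time, personal data kept confidential",
    "Evaluate":  "Qualitative analysis: looking for incidents and patterns",
}

for step, decision in decide_plan.items():
    print(f"{step}: {decision}")
```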

This framework served as the basis for the design of the evaluation conducted in this study.

Determine the Goal

Using computer technology in teaching and learning environments, although nothing new, can still be a daunting task, especially when the computers or the software are difficult to learn (Kluge, 2002). It is imperative, then, that the interfaces to which we expose our students be as easy to understand and as simple to use as possible. Moodle's developers appear to have taken account of design guidelines such as those delineated in Krug's (2000) very readable tome on Web usability. Moodle has a simple interface, uses a minimum of words, features rollovers, and often includes simple icons with the words to aid users.

The goal of this evaluation was to determine whether participants in the study could complete the steps to create a simple text file, sign up for an account in moodle, log in to moodle, join a course, and, finally, submit the text file to an online assignment in moodle. Because another preliminary study had shown no significant differences when using English in the quiz interface in moodle, all instructions for the tasks were in English (Melton, 2006).

Explore the Question

If moodle has been designed well, then it should be easy for users to navigate and accomplish basic tasks, even for non-native speakers of English. Ideally, what tasks need to be completed or what choices are possible should be mostly self-explanatory. What cannot be understood should be accessible through readily available help screens. The question to be answered in the study was whether moodle's usability was such that users new to that LMS could complete tasks well enough to submit an assignment.

Choose the Evaluation Paradigm and Techniques

The evaluation paradigm for this study was usability testing (Preece et al., 2002). The participants' performance was measured by determining whether the tasks were completed successfully. Participants were asked to complete a questionnaire (Appendix A), and the evaluation form, including instructions, for the usability test (Appendix B) was written based on their answers. During the test, all the participants' actions and words were recorded on videotape for review after testing. After the testing period, participants were asked to talk about their thoughts on what they were asked to do, and they were encouraged to add any other comments.

All the testing was conducted in an office in the New Faculty of Environmental and Symbiotic Sciences Building at the Prefectural University of Kumamoto. During testing, every effort was made to ensure that conditions, including lighting, temperature, and noise levels, were at an optimum. There were no interruptions during any of the testing sessions. The computer for the testing was an Apple Macintosh G4 running OS X, version 10.3.3. This computer sat on a wide desk, and a paper English-Japanese dictionary (Matsuda, 1999) was placed on the desk in the event participants needed help with the instructions or tasks. The participants were asked to use two programs: Apple Computer's TextEdit, for editing a text file, and Apple Computer's Safari, a web browser used to access the moodle installation. The moodle installation was also on the same G4 machine, which was connected to the university's local area network.

Identify the Practical Issues

Preece et al. (2002) stress that participants must be appropriate users who represent the user population to which the product is targeted (p. 350). Participants for this study were chosen from recently matriculated graduate students at the Prefectural University of Kumamoto's Graduate School of Environmental and Symbiotic Sciences. Four participants were chosen from a population of 23 students for three reasons. First, all had taken an English course with the evaluator three years previously, which
made approaching them with the request easier for both participants and evaluator. Second, all were known to be Macintosh users; using participants unaccustomed to that platform could raise issues unrelated to the usability test. Finally, and most importantly, all four were potential students in an English-language graduate writing course starting in the same academic year; students taking that course would be using the moodle system for much of their coursework. These considerations established them as prime candidates for this study.

The two male and two female participants were Japanese, middle class, and had a mean age of 22.75 years. All were students in the Division of Environmental and Ecological Studies (the two other divisions study Food and Health, and Human Habitat). The data from the questionnaire (Appendix A) show that two students specialized in forest ecology, one in environment and plant pathology, and one in marine biology. Three students had experience with both Macintosh OS 9 and OS X, and the other with OS 9 only. In addition, three of the students had experience with Windows systems. All four noted previous experience with word processing, spreadsheet, browser, presentation, and digital photo software; three noted email software. All remarked that they thought computers were useful, all used them for their research, one used them during free time, and all noted that they were okay with computers. None had used moodle before, but one had used a BBS. It can be concluded from these data that the four participants had experience with computers, but that experience was varied and might have been limited to several specific tasks related to scientific research.

From this information, a set of instructions (Appendix B) was prepared for the usability test. To record the actions of each of the participants, a DV camera was set up on a tripod behind the participants. The point of view was the screen, the keyboard, and the mouse; participants were told about this camera angle and that they themselves would not be recorded.

Decide How to Deal with Ethical Issues

Participants' identity and privacy were protected during this study. They were informed that they were not being tested, but rather that it was a test of the system. Moreover, participants were told that they could terminate the test at any time for any reason. Every attempt was made to keep personal data confidential.

Evaluate, Interpret, and Present the Data

Because the size of the sample (four participants) was too small to use quantitative analysis, a qualitative analysis of the data was conducted (Gay & Airasian, 2003). However, since this was a preliminary study, it is believed that it provides insights into the usability of moodle's registration system,
course enrollment procedure, and assignment submission. Preece et al. (2002) describe two methods for analyzing qualitative data, one of which is appropriate for this study: qualitative analysis for categorization (p. 381). Of the three techniques detailed by Preece et al. for this type of analysis, looking for incidents and patterns was the method used for this study (p. 381). This technique is designed to allow evaluators to look for patterns in the data collected during various testing protocols.

The participants were observed while performing a series of tasks designed to simulate the online submission of a homework assignment on the moodle LMS. The data for each task were collected by the evaluator writing notes on a worksheet mirroring each of the tasks. In addition, a videotape was made of each participant's actions for review to ensure that all actions were recorded. In addition to the note-taking and the video recording, Erikson and Simon's Think-Aloud technique was employed (as cited in Preece et al., 2002, p. 365). In this technique, participants are encouraged to tell evaluators what they are doing and thinking about as they work through the various steps of an evaluation; this can help evaluators get an idea of what is going through participants' minds during the test.

Since this was a small, preliminary study, its reliability is unknown. However, special efforts were made to choose participants with a similar background in both academic and computing skills, and the instructions for the tasks selected for the usability test were consistently articulated to each of the four participants. For validity, it is assumed that the test was an adequate one because it was designed to see if participants could complete the ultimate task: submitting an assignment online. Since the participants were asked merely to complete the tasks for the usability test, it was assumed that there was very little chance for bias to affect the test. Potential biases were ill feelings toward the evaluator or the use of English during the test, but neither of these was detected during the test itself. The scope of the results was assumed to be limited to the kinds of tasks the participants were asked to complete, as well as to the kinds of participants who took part in the study. However, since common tasks were part of the test and the participants might be considered typical science students, the results should have a scope representing that set of tasks and that population. Ecological validity, especially the Hawthorne effect (Preece et al., 2002), may have been a concern because of the setting of the testing and the presence of the evaluator and a video camera. This could not be helped because of the need for a quiet, interruption-free environment.
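As a rough illustration of how "looking for incidents and patterns" might be operationalized, the short Python sketch below tallies coded incidents across observation notes. The incident labels and notes here are hypothetical examples, not the actual codes or data from this study.

```python
from collections import Counter

# Hypothetical observation notes: (participant, coded incident).
# The codes below are invented for illustration only.
observations = [
    ("A", "reread on-screen instructions"),
    ("B", "clicked an unrelated link"),
    ("C", "retyped the URL by hand"),
    ("C", "clicked an unrelated link"),
    ("D", "asked the evaluator for help"),
    ("D", "clicked an unrelated link"),
]

# Count how often each kind of incident occurred; incidents that recur
# across participants point to a pattern worth reporting.
incident_counts = Counter(code for _, code in observations)
for code, count in incident_counts.most_common():
    print(f"{code}: {count}")
```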

The Usability Test

Task One

In order to examine the usability of registering and uploading an assignment in the moodle system, participants were asked to perform three main tasks (Appendix B), each of which had several subtasks. The objectives in Task One were to log in to the target computer (see Figure 1), open a text-editing program called TextEdit, create a simple text file, and save it in a specific area.
Figure 1. Login Screen

This task was designed to simulate writing an assignment that would later be turned in electronically. Since the usability test was for the moodle system, somewhat detailed instructions on how to complete this task were provided, including the directory into which the file should be saved.

Task Two

The second task was to create a new account on the moodle system and write the necessary information in the profile page in order to complete the registration. In order to test the usability of the moodle system, specific instructions, such as which buttons to click on to complete the registration process or how to get to the main moodle page, were not provided in the second task. At the login screen, participants needed to type in a username
and a password, followed by a mouse click on the Login button (see Figure 2).
Figure 2. moodle Login Screen

Instructions for this subtask were kept to a minimum to check moodle's usability; participants were not, for example, told how to complete this subtask. Next the participants needed to complete the registration process by filling out their user profiles. Although there were 19 potential user fields to modify, through text boxes, drop-down lists, or a button with which to load in a picture, only five of the text boxes needed to be filled in: Given name, Surname, Email address, City/town, and a Description (see Figure 3).

Figure 3. New User Profile

To avoid confusion, the participants were provided with specific instructions to fill in only those five pieces of data, the last being a one-sentence description. The next direction was that the button to finish was at the bottom of the page; this direction did not include the label of the button (see Figure 4).
Figure 4. Update Profile
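Purely as an illustration (and not moodle's actual code), the "only five required fields" instruction could be expressed as a small check like the one below; the field names are taken from the profile form described above, while the function and sample values are hypothetical.

```python
# Hypothetical sketch: verify that the five profile fields participants were
# asked to complete are present and non-empty.
REQUIRED_FIELDS = ["Given name", "Surname", "Email address", "City/town", "Description"]

def missing_fields(profile: dict) -> list:
    """Return the required fields that are empty or absent."""
    return [field for field in REQUIRED_FIELDS if not profile.get(field, "").strip()]

# Invented example profile; the empty email shows how the check flags a gap.
example_profile = {
    "Given name": "Hanako",
    "Surname": "Tanaka",
    "Email address": "",
    "City/town": "Kumamoto",
    "Description": "I study forest ecology.",
}

print(missing_fields(example_profile))  # -> ['Email address']
```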

After successfully completing this subtask, the participants were taken to a page giving them a synopsis of their account information. It was from here that they had to navigate to the main moodle page. Either of two links, located at the top-left or the bottom of the page and both labeled moodle, would take participants to the main page (see Figure 5).
Figure 5. Profile Synopsis

The final subtask was to join the course created for the participants and then upload the text file that was created in the first task into the correct area of the moodle course. The instructions for this task were kept to a minimum so as to determine the usability of joining a course and uploading a file in the moodle system.

Task Three

The third task was to submit the text file into a prepared assignment area in the moodle installation. First, from the main moodle screen, participants needed to click on the link to access the appropriate courses (see Figure 6).

Figure 6. Course Categories

The next subtask was to join the course that the participants might be taking in their graduate program. This could be accomplished by clicking on the Graduate Writing link (see Figure 7).
Figure 7. Join Graduate Writing

After clicking on the Graduate Writing link, participants were taken to a screen where a message to confirm joining the course was displayed: "You are about to enroll yourself as a member of this course. Are you sure you wish to do this?" The two possible responses were links labeled "Yes" and "No." A click on the "Yes" link took them to the Graduate Writing course page (see Figure 8), and a click on the "No" link returned them to the main page of the moodle site.

Figure 8. Graduate Writing Course Page

From this page, the instructions were to go to the Introductions assignment. Clicking on that link would take them to the page for that assignment (see Figure 9).
Figure 9. Upload Assignment

Then they were to upload the file that was written and saved in Task One. This was a two-step process which required them to click on the "Choose File" button and then browse to locate the file. These specific instructions were not provided by the moodle system, nor were they provided in the test instructions. The only instruction provided by moodle was: "Submit your assignment using this form:". Completing this step took the participants back to the assignment page, where they had to click on the "Upload this file" button. The final steps were to log out of both moodle and the computer.

Test Administration

At the beginning of the usability test, the instructions were read slowly and carefully, and Japanese was used to clarify any instructions that were not understood by the participants. Participants were allowed to ask questions at any time during this instruction phase. After this phase was completed, and the participants acknowledged that they were ready, the
video camera was started, and the testing began. In order to make the participants feel at ease with the test, words of encouragement, such as "good" or "okay," were given at the completion of some of the tasks if the participants were experiencing difficulty. Participants were asked to make a mark next to each subtask as they were completed. A "+" mark was to indicate that the subtask was easy, a "○" indicated that the task was okay to do, and a "-" mark denoted a subtask that was difficult. At the end of the usability test, each participant was encouraged to make any comments about their experience or feelings during the test. All four were forthcoming with comments when they were finished.

This test was simulated by the evaluator before its administration to the four participants. The test itself was estimated to last between 15 and 25 minutes, with the entire period, including instructions and posttest discussion, estimated at 25 to 40 minutes. One final note about the testing environment: after each test, the applications were reset to their original settings, so that participants would have identical starting points. For example, the save path for TextEdit was changed each time to the root of the English Guest account. That way all the participants had to find the Students folder on their own.

RESULTS

Length of Time

To complete the usability test, the four participants required differing amounts of time based on their success with each subtask. The time required for each usability test can be seen in Table 1.
Table 1. Time Required for Test

Participant   Instruction Time   Usability Test   Posttest Discussion   Total Time
A             7 min.             20 min.          4 min.                31 min.
B             8 min.             15 min.          4 min.                27 min.
C             15 min.            33 min.          3 min.                51 min.
D             12 min.            18 min.          5 min.                35 min.
Average       10.5 min.          21.5 min.        4 min.                36 min.
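For readers who want to check the arithmetic, the short Python sketch below (not part of the original study) recomputes the per-participant totals and the column averages from the per-phase times reported in Table 1.

```python
# Per-phase times from Table 1, in minutes:
# (instruction, usability test, posttest discussion).
times = {
    "A": (7, 20, 4),
    "B": (8, 15, 4),
    "C": (15, 33, 3),
    "D": (12, 18, 5),
}

# Per-participant totals: 31, 27, 51, 35 minutes.
for participant, phases in times.items():
    print(f"Participant {participant}: total = {sum(phases)} min.")

# Column averages across the four participants.
columns = zip(*times.values())
averages = [sum(column) / len(times) for column in columns]
print("Phase averages:", averages)      # -> [10.5, 21.5, 4.0]
print("Average total:", sum(averages))  # -> 36.0
```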

User Ratings and Success

The ratings for each subtask given by the participants were occasionally quite subjective. For example, tasks that were observed to be simple to complete may have been marked with a "○" or even a "-". In addition,
some subtasks that were marked with a + occasionally took more than a short amount of time or required participants to navigate back and forth to find the proper location. The results of these user ratings and whether participants were successful can be seen in Table 2.
Table 2. User Ratings and Success

Each subtask was rated by Participants A-D as + (easy), ○ (okay), or - (difficult):

Task One: Log in; Start TextEdit; Type in information; Save document; Check the folder; Quit TextEdit
Task Two: Safari; moodle URL; Register; Five bits of information; Update profile button; Go to main moodle page
Task Three: Go to Mr. Melton's course; Join the course; Introductions assignment; Upload the file; Log out of moodle; Log out of the computer
Successfully uploaded file: Yes for two participants, No for two

[Per-participant ratings not reproduced.]

Very few of the ratings were deemed easy by the participants. In fact, only the common actions such as opening programs, typing text, saving files, and logging out of programs and the computer were fairly consistently found to be easy to do.

Task One

Although the first task was not a test of the usability of the moodle system, its successful completion was necessary in order for a major portion of the third task to be possible. All of the participants logged into the correct account on the target machine. Although they proceeded
in different ways, none of the participants had trouble starting TextEdit or adding in the three items of information. Saving the document in the correct folder was similarly easy, but participants often took some time to check that the folder was the correct one (as per the instructions). None of the participants had problems quitting TextEdit.

Task Two

As with starting TextEdit, none of the participants had trouble starting up a browser. One did, however, use a browser other than Safari for the test: Participant C used Internet Explorer for the usability test. The only noticeable difference resulting from this change was the page for uploading the text file; the button for choosing the file was on the right side and had a different label, "Browse" (see Figure 10).
Figure 10. Internet Explorer Upload Page

All of the participants correctly went to the Address Bar of their browsers to type in the URL. Two of the participants saved time by noticing the auto fill-in feature of Safari and went directly to the site. The Internet Explorer user had to type the entire URL, and the fourth participant had to retype the URL because of a typo early in the process.

All of the participants were able to easily complete the initial portion of the registration procedure, but completing the second part proved difficult for one. Since the instructions were clear on which five items of information were needed, those did not, for the most part, cause any problems. One participant did input an incomplete address, which had to be corrected in a subsequent update of the profile. Two participants correctly selected the Update profile button and moved to the next subtask. One seemed to have trouble with the nomenclature in the instructions; the words "bottom" and "button" caused some confusion, but this was overcome. The fourth participant did not choose the Update profile button, but rather the moodle link at the bottom on the first try and the Logout link on a subsequent attempt. In both of these cases, the five items of information input on the profile page were lost and had to be rewritten. During this phase, the participant needed to be made aware of the fact that the data had been lost. On the third attempt, the participant appeared to be at a loss on how to proceed, so the evaluator intervened by asking questions about which task to complete next and
which links had already been explored. These questions helped to guide the participant to the Update profile button.

The final subtask in Task Two was to move to the main moodle page; this was accomplished by all four. Two completed it easily; one explored the Personal profile page somewhat and arrived at the main page via the Edit profile page; and the fourth, rather than using one of the links, retyped the main moodle page URL by hand in the Address Bar of the browser.

Task Three

From the main moodle page, all four participants were able to move to the correct course page. Although all were successful, none of the participants were able to easily join the course Graduate Writing. One was able to join the course after approximately 1 1/2 minutes. Another was unsure of where to go from the start but after some time noticed the rollover saying "Click to enter this course"; this process took approximately 2 1/2 minutes. The two other participants used the back button several times to check whether they were in the correct place. One retyped the URL and returned to the main page. This participant also clicked on the moodle logo at the bottom of the main page at one point and was taken to the moodle.org site (Dougiamas, 2004); this window was quickly closed and the participant navigated to the correct area. The entire process took 2 1/4 minutes. The other participant noted during this phase that the course could not be found. However, once the participant took some time to read what was on the screen, it was possible to join the course. This took 4 1/4 minutes to complete.

None of the four participants had much trouble moving to the Introductions assignment. Three went there by way of the Introductions link. The fourth participant first noted not knowing what to do, but then successfully used the Assignments link on the left side of the Graduate Writing course page.

The final main test of the usability of moodle was for participants to upload the file created during Task One. Two successfully uploaded their files to the correct area on their own, one did so after intervention from the evaluator, and the fourth was unsuccessful. The two who were successful were able to complete the subtask in less than a minute. The third participant went first to the Upload this file button, and then to the Choose File button; the file was not uploaded using this sequence. The fourth participant first tried the Upload this file button. After some time, the participant was clearly at a loss for what to do, so the evaluator intervened by explaining the meaning of "browse." This explanation was enough to prompt the participant to click on the Browse button. The file was located and quickly uploaded.

None of the participants had trouble logging out of moodle; all simply quit their browsers to finish their sessions. Only one had trouble logging out of the computer; this participant was prompted on how to proceed.

Participants' Comments

At the end of each session, participants were asked what they thought about the experience, whether using Japanese in the moodle system would have been helpful, and if they had any additional comments. Predictably, the subtasks that were problematic for each participant were considered difficult, and those that were simpler to accomplish were deemed easy to do. Particular mention was made of the process for joining a course. One participant noted that more explanation or some instruction on joining a course would have been helpful; another alluded to the same need for help. Three participants noted that the use of Japanese would have made the entire process easier, one even noting that the problems encountered during the test were due to the use of English in the moodle system. From the evaluator's perspective, all the participants seemed upbeat after the usability test was over, and one even noted that the experience was fun.

DISCUSSION & CONCLUSIONS

There were varying rates of success and amounts of time required to complete the tasks in this usability test. Based on observations, comments from the participants, and information provided through the questionnaires, these results can be attributed partially to the level of computer skills of the participants, partially to the moodle system, and partially to the choice of English as the moodle system language.

An examination of the intermediate tasks reveals that the participants were mostly comfortable with tasks such as starting and quitting programs, writing and saving documents, and entering necessary information. These skills can be attributed to their experience with tools that are already familiar to them. However, when it came to working with the features of a new tool such as the LMS moodle, there were varying levels of success. It is important to note that the variety of strategies employed, such as retyping the URL of the site or clicking on a link unrelated to the specific task, is undoubtedly the result of a lack of experience with Web-based tools. This should be a clear signal that our students need more practice with a broad range of tools in order to ensure their success during their years in school and beyond.

With a 50% success rate on the ultimate goal of submitting the simulated homework assignment, there is obviously a need for improvement. Students will need to be able to accurately and confidently submit assignments using educational (and other) tools such as moodle. Similarly, instructors need to know that their students' work will arrive in a secure and timely manner.
To meet these needs, users of these systems will need to have more than a basic set of skills and knowledge. Although users will need to be taught how to use the technology, that time will need to be balanced, and kept in check, with the time required for course content (Kluge, 2002). We will need to know exactly what our students need to do, and show them, perhaps more so than we think, exactly what it is we want them to do.

Moodle often follows many of the conventions for usability which are defined in Krug (2000): it has a simple interface, uses a minimal number of words, features rollovers providing extra information, and often includes simple icons with the words to aid users. These features should help new users accomplish the basic tasks of registering new accounts, accurately navigating to the proper area of moodle and specific courses, and completing tasks such as submitting assignments for instructor evaluation. The results of the ultimate goal of this preliminary study, to submit an assignment to the LMS moodle, however, were mixed.

It should be noted here that moodle does not feature the standard purple color for visited links; links in moodle remain blue at all times. The lack of a colorized visited-links feature may account for the numerous repeated clicks by participants, during periods of apparent confusion, on links which had already been visited. Unfortunately, Erikson and Simon's Think-Aloud technique (as cited in Preece et al., 2002, p. 365) did not yield this information. The participants varied in their readiness to talk during the usability test. The information gleaned from this technique was not enough to evaluate whether the persistence of the blue links was a factor in navigation.

Krug (2000) notes that users do not want to read information on web pages; they want to quickly scan for what they are looking for and move on. This observation was supported in this study. In the two pages required to join the Graduate Writing course, for example, merely scanning the information on the two pages proved difficult for all the participants. In one case it took several minutes, and much unsuccessful navigation, before a participant took the time to read the information on the confirmation page. In another case it was not until a participant's discovery of the rollover on the Introductions assignment link that the participant was successful. Although Krug (2000) notes that there are potential pitfalls with using rollovers, in this case the rollover was vital to the success of the subtask.

This leads to the issue of language choice for the interface. Since all the participants are required to do research in English, that language was chosen for the moodle system. However, as noted in the comments after each usability testing session, three of the participants stated that the use of Japanese in the moodle system would have made it easier to accomplish the tasks. For the two participants who could not easily identify the Update profile button and the two who had trouble with the Choose File/Browse and Upload this file scenario, information in their native language may
have been enough to help them with those subtasks. This is in contrast to the results of Melton (2006), which showed no significant differences in first-year students' quiz scores using English or Japanese interfaces in moodle. Since there is an absence of research on the topic, it is an area rich with potential for further study.

An essential point to make here is that all users of online tools should be comfortable using the tools. Teachers lacking the skills should get them, through training or through close consultation with colleagues who know how to use them. At the same time we should make sure our students are comfortable with them as well. We should take the time to show them essential features, one at a time. Teaching and learning the skills may take several class meetings, but command of the tools is the key.

ACKNOWLEDGEMENTS

I would first like to thank Dr. Maxine Cohen of Nova Southeastern University for opening up the world of HCI to me. Second, I would like to thank Dr. Paul Hays of Kwansei Gakuin University for the thoughtful and useful comments on later versions of this work. Finally, thank you to the blind referee who helped me focus on some of the issues vital to HCI. Any errors contained within this paper are mine alone.

REFERENCES

Blackboard. (2005). Blackboard. Retrieved December 2, 2005 from http://blackboard.com/

Chapelle, C. A. (2001). Computer applications in second language acquisition. Cambridge: Cambridge.

Colace, F., Santo, M. D., & Vento, M. (2002). Evaluating on-line learning platforms: A case study. Paper presented at the 36th Hawaii International Conference on System Sciences, Big Island, Hawaii.

Dougiamas, M. (2004). moodle (Version 1.2.1). Perth, Australia. Retrieved March 25, 2004 from http://moodle.org/

Field, M. H. (2002). Towards a CALL pedagogy: Students' use and understanding. In P. Lewis (Ed.), The changing face of CALL: A Japanese perspective (pp. 3-17). Lisse, The Netherlands: Swets & Zeitlinger.

Gaer, S. (1999). Classroom practice: An introduction to e-mail and the World Wide Web projects. In J. Egbert & E. Hanson-Smith (Eds.), CALL environments: Research, practice and critical issues (pp. 65-78). Alexandria, VA: Teachers of English to Speakers of Other Languages.

Gay, L. R., & Airasian, P. (2003). Educational research: Competencies for analysis and applications (7th ed.). Upper Saddle River, NJ: Merrill Prentice Hall.

Iding, M. K., Auernheimer, B., Crosby, M. E., & Klemm, E. B. (2002). Guidelines for designing evaluations of web-based instructional materials. Paper presented at the 36th Hawaii International Conference on System Sciences, Big Island, Hawaii.

Kern, R., & Warschauer, M. (2000). Introduction: Theory and practice of network-based language teaching. In M. Warschauer & R. Kern (Eds.), Network-based language teaching: Concepts and practices (pp. 1-19). Cambridge: Cambridge.

Kluge, D. (2002). Tomorrow's CALL: The future in our hands. In P. Lewis (Ed.), The changing face of CALL: A Japanese perspective (pp. 245-267). Lisse, The Netherlands: Swets & Zeitlinger.

Ko, S., & Rossen, S. (2004). Teaching online: A practical guide (2nd ed.). Boston: Houghton Mifflin.

Krug, S. (2000). Don't make me think! A common sense approach to web usability. Indianapolis, IN: New Riders.

Lavin, R., & Tomei, J. (2006). Wikis in EFL: An evaluation. Language Issues, 11/12(1), 35-47.

Matsuda, T. (Ed.). (1999). Kenkyusha's English-Japanese dictionary for the general reader (2nd ed.). Tokyo: Kenkyusha.

McArthur, D., Parker, A., & Giersch, S. (2003). Why plan for e-learning? Strategic issues for institutions and faculty in higher education. Planning for Higher Education, 31(4), 20-28.

Melton, J. (2004). The CMS moodle: A heuristic evaluation. Paper presented at JALTCALL2004, Mito, Japan. Retrieved December 2, 2005, from http://jklmelton.net/2004/jaltcall/

Melton, J. (2006). The effect of a native-language interface vs. a target-language interface on students' performance. In P. Zaphiris & G. Zacharia (Eds.), User-centered computer aided language learning (pp. 23456). Hershey, PA: Idea Group.

Nielsen, J. (1994). Guerrilla HCI: Using discount usability engineering to penetrate the intimidation barrier. Retrieved December 2, 2005 from http://www.useit.com/papers/guerrilla_hci.html

Nielsen, J. (2003). Usability 101: Fundamentals and definitions - what, why, how? Retrieved April 24, 2004 from http://www.useit.com/alertbox/20030825.html

Pellettieri, J. (2000). Negotiation in cyberspace: The role of chatting in the development of grammatical competence. In M. Warschauer & R. Kern (Eds.), Network-based language teaching: Concepts and practices (pp. 59-86). Cambridge: Cambridge.

Peyton, J. K. (1999). Theory and research: Interaction via computers. In J. Egbert & E. Hanson-Smith (Eds.), CALL environments: Research, practice and critical issues (pp. 17-26). Alexandria, VA: Teachers of English to Speakers of Other Languages.
Preece, J. (2000). Online communities: Designing usability, supporting sociability. New York: John Wiley & Sons.

Preece, J., Rogers, Y., & Sharp, H. (2002). Interaction design: Beyond human-computer interaction. Hoboken, NJ: John Wiley & Sons.

Rozanski, E. P., & Haake, A. R. (2003). Curriculum and content: The many facets of HCI. Paper presented at the 4th Conference on Information Technology Curriculum on Information Technology Education, Lafayette, Indiana, USA.

Ward, J. M. (2004). Blog assisted language learning (BALL): Push button publishing for the pupils. TEFL Web Journal, 3(1), 1-16.

Waterhouse, S. (2005). The power of elearning: The essential guide for teaching in the digital age. Boston: Pearson.

WebCT (2005). WebCT. Retrieved December 2, 2005 from http://webct.com/

APPENDIX A
Some Questions about You and Computers

Please answer these questions about you and your use of computers (please use English or Roman letters for all of your answers):

1. What is your name? ____________________________________________________

2. How old are you? _____________________________

3. What do you study here at PUK? _______________________________________

4. What kind of computer system do you usually use (Windows, Mac OS 9, Mac OS X, Linux, other)? ______________________________________________________

5. What other kind(s) of computer system can you use? _______________________

6. What kinds of programs do you use with your computer (check all that you use)?
   Word processor (Word, etc.) _____
   Browser (Internet Explorer, Safari, etc.) _____
   Email program _____
   Presentation (PowerPoint, etc.) _____
   Chat (AIM, Yahoo! Messenger, etc.) _____
   Video _____
   Text editor _____
   ftp _____
   Photo _____
   Scheduler _____
   Spreadsheet (Excel, etc.) _____
   Other (please write some examples) ________________________________________

7. What do you think of using computers (check all that apply to you)?
   They are useful. _____
   They are difficult to use. _____
   I use computers for my research. _____
   I use computers in my free time. _____

8. How would you rate your computer skill (check the one closest to your skill)?
   I am very good at computers. _____
   I am good at computers. _____
   I am just okay with computers. _____
   I have trouble with computers. _____

9. Have you used the moodle system before? Yes _____ No _____

10. Have you used a BBS before? Yes _____ No _____

APPENDIX B
Test of the moodle System

Thank you for offering to help me with my research. This is a test of the moodle system that you may use for your graduate writing course. You are not being tested. You may decide to stop the testing at any time.

The instructions are in English, so that I can explain them to you. You may speak Japanese during the test because I am interested in your honest feedback (comments) during this test, but English is okay, too.

You will be videotaped, so that I can review the test again later. This videotape will be used only for this research, and will not be shown in a public forum. I may use the results of this test in a presentation or research paper, but I will not use your names or any personal information about you.

Instructions: I would like to ask you to complete three tasks. One of them is to create a short text file and save it. The second task is to register on the moodle system. The third is to join a course and upload the text file from Task One into the correct area of moodle.

As you work through each task, please talk out loud and say what you are doing and thinking. That way I will know better how the system works. I may remind you to tell me what you are thinking or doing. As you complete each task, write a mark next to the task (+ was easy to do, ○ was okay to do, - was difficult to do).

Your name (in Roman letters) _______________________________________

Task One

[ ] Log into English Guest.
[ ] Start TextEdit.
[ ] In the TextEdit window that opens, type in the following information:
    your name (please use your own name)
    the date (please write today's date)
    Introduction (just type the word Introduction)
[ ] Save the document, changing the name of the file from Untitled.rtf to your family name.rtf (for example, my file's name would be: melton.rtf).
[ ] Check that the file will be in the Students folder. The path is english/Documents/Students/yourfamilyname.rtf
[ ] Quit TextEdit.

Turn over, please

Task Two

[ ] Start the browser Safari.
[ ] Go to the URL http://estudiante.pu-kumamoto.ac.jp/moodle/
[ ] Register for moodle with a username and a password.
[ ] Complete the registration by writing five more things: your given name, your surname, your school email address, your hometown, and a one-sentence description. You do not need to change anything else.
[ ] The button to finish is at the bottom of the page.
[ ] Next go to the main moodle page.

Task Three

[ ] Join Mr. Melton's courses.
[ ] Join the course that you may take next semester.
[ ] Go to the Introductions assignment.
[ ] Upload the file you wrote before. Use the same path from Task One.
[ ] When you finish, log out from moodle.
[ ] Finally, log out from the computer.

Thank you very much for your help!
