
CALIFORNIA STATE UNIVERSITY

MONTEREY BAY

Admissions Processing Training:


CSU Monterey Bay, Office of Admissions

CAPSTONE Report

Submitted in partial satisfaction of the requirements for the degree of

MASTER OF SCIENCE in

Instructional Science and Technology

Sarah Barnhart

December 10, 2019

Capstone Approvals: (At least one advisor and capstone instructor should approve)

_______________________ ___________________________ _____________


Advisor Name Signature Date

_______________________ ___________________________ _____________


Capstone Instructor Name Signature Date
Table of Contents

Table of Contents
Executive Summary
Introduction
    Problem Description
    Target Audience
    Literature Review
        Industry trends
        Relevant research
Solution Description
    Proposed Solution
    Project Goals
    Learning Theories and Instruction Principles
    Media Components
    Challenges
Methods/Procedures
    Narrative Description
    List of Steps Completed
Resources
Timeline
Formative & Summative Evaluation Plan
    Formative Evaluation Plan
        Usability testing
    Summative Evaluation Plan
        Pre- and post-testing
Final Thoughts
References
Appendix A: Descriptive Statistics
Appendix B: Observation Checklist
Appendix C: Final User Survey Results

Executive Summary
This project serves the position of the Admissions Specialist in the Office of Admissions
at California State University, Monterey Bay. Admissions Specialists are responsible for
processing all prospective student documentation that is submitted to the office. Transcripts are
high priority in terms of processing, since transcript deadlines can alter a student’s admissibility.
High school and college transcripts are viewed by the majority of the Admissions staff for
freshman, transfer, and graduate-level evaluation. Admissibility to CSUMB is determined by the
respective Admissions staff members who read and evaluate each applicant's transcripts. Before
an Admissions staff member can evaluate a transcript, it must first be entered in the system.
Admissions Specialists are responsible for thousands of prospective student transcripts
that enter the Admissions Office on an annual basis. A streamlined process for paper transcripts
is in place to ensure transcripts are not missed in the line of processing. After the mail is opened,
transcripts are labeled and scanned into the document database, called OnBase. The transcripts
are then entered into the student system software, called PeopleSoft Oasis. Oasis tracks all
student accounts from both the student perspective and the internal office perspective. For
example, when a transcript is received and entered in Oasis by the processing staff, the
student receives a message on their dashboard reflecting those changes. Much of this process
is covered in the general Admissions Processing Overview section—lesson one—of this
capstone training. Once the learner is familiar with how to read a transcript—lesson two—and
how to navigate through Oasis—lesson three—the information in the training continues to
describe the system entry process with Posting and Course Entry trainings—lessons four and
five. These two lessons are accompanied by individual simulations for general practice and
assessment purposes. The fourth lesson offers system data entry practice, where the learner
begins with a training on how to post.
The Admissions Specialists are expected to update each student account as transcripts are
received by the Admissions Office; this process is called posting, which was a manual process
for years and is a major lesson in this training. The Admissions Office has since established a
new automated process for posting, which went live in production approximately two weeks
prior to the November 12th capstone deadline. However, despite the new automated process, the
manual training still gives new employees the background knowledge for manual posting.
Manual posting will still be relevant through the transitional stages of the automated process.
Manual posting may eventually become irrelevant entirely, and if this happens, the manual
posting module will be replaced with detailed training on the automated process.
Though unforeseen changes have occurred in the way documents are processed, all
elements of this project are still very important to the overall business practices in the Office of
Admissions. Whether an account is updated manually or via an automated process, the person
making updates is required to know specific aspects of a transcript, such as determining if a
transcript is “final” or “in-progress,” distinguishing district schools, understanding
campus codes and correlating system codes, selecting multiple navigations in Oasis, and
knowing how to enter coursework in a student’s account.
Though it is the responsibility of the Admissions Specialist to make sure accounts are
updated, the Office of Admissions relies on Student Assistant employees to help enter all
information where account access is not limited. With multiple people entering data into the
system, there is an expected rate of human error. Specialists correct errors weekly using data
integrity queries. Office errors are also reported in a biannual Enrollment
Reporting Services Student and Enrollment Reporting Services Applicant report, also known as
ERSS/ERSA. System errors that are missed by the Specialists during the semester are corrected
just prior to census, when all CSUs report their data and statistics to the Chancellor’s Office.
The goal of this project was to decrease human error and increase productivity by
reducing one-on-one individualized trainings. Thus, an asynchronous training on the Oasis
system application for transcript entry was created, developed, and implemented. This
asynchronous elearning includes an overview of Admissions processing, as well as trainings on
how to read transcripts, how to navigate through Oasis, how to post a transcript manually, and
how to enter coursework in Oasis. Originally, the goal in creating an effective training solution
was to reduce the interruption time that Specialists spend answering daily processing questions,
and also to eliminate one-on-one individualized time that Specialists spend training new student
assistant, temporary, and internship employees.
The final project consists of bite-sized elements of training and instruction that have been
compiled into an iLearn LMS host site for internal office training. This training has also proven
useful in introducing job duties to new employees who must wait for Human Resources to
provide their system access to Oasis after they finish their FERPA (system security) trainings.
The new temporary processing employees who have joined our office in the last month have had
the opportunity to work with this program in their downtime while waiting for system access.
Thus, we have already incorporated this training into the new-employee training program, to be
taken once initial system security trainings are complete, and the results have been quite
pleasing.

Introduction
Prior to the development of this project, there was no formal training or supplementary
support that explained how to navigate Oasis, or what each field requires and why.
Understanding the effects of missing information and gaining the knowledge to prevent
negative or time-consuming outcomes was vital in filling the gap. Information is often entered
into the system incorrectly, or fields are missed, due either to a lack of understanding on the
employee's part or to carelessness when working too fast. The neglected fields eventually
circle back to the office in the form of ERSS/ERSA system errors, and correcting them is also a
time-consuming task. This training serves to support preventative measures that can be employed
to lower the Fall and Spring semester ERSS/ERSA errors, which will reduce the biannual
corrections that occur prior to census.

Problem Description
Errors are preventable at the time the information is entered into the system;
however, the current practice, even with the new training, still does not prioritize daily data
integrity checks because of the amount of time they would consume. Data integrity queries are
checked weekly, and one of the main goals of the training is to minimize the number of errors
that appear on those queries, in other words, to reduce human error. Previously, trainees were
given minimal one-on-one training without the use of a job aid, so when student assistants had
questions, they had to walk to a Specialist's cubicle and ask, which was disruptive to all parties
involved. To close this gap, any employee entering information into the system must understand
the correct pieces of information to enter into each field so that errors are minimized. With the
new training, the student's education page is filled out entirely according to the information on
the transcript. The posting and course entry processes are presented clearly, concisely, and
consistently in the training lessons, which address all potential questions.

Target Audience
This training targets all Admissions processing employees, mainly Admissions Specialists
and student assistant employees. Course entry is in the job description of an Admissions
Specialist, so Specialists are the primary audience. However, since
the Specialists have various other duties, student assistants also help with completing this task.
The Admissions Office includes workers with a wide range of professional backgrounds.
Student assistant employees work for our office part-time during fall and spring semesters and
are in the process of obtaining their bachelor’s degrees. However, the Admissions Office also
hires full-time temporary employees to help with processing. For this project in particular, the
target audience is any new full-time or temporary Admissions Specialist, as well as student
assistant employees. This training is intended to help new employees in these
positions gain an understanding of the basic data entry processes.
The Admissions department is one of the busiest offices on campus. Admissions receives
thousands of documents on a semesterly basis. There are hard deadlines that require timely
processing, as well as hours of data entry. During stressful times, the common attitude in the
office seems to be “enter now and fix later.” The workload, by default, often leaves the processing
team getting the information posted first, and then dealing with the errors in bulk later. It takes
additional and unnecessary time identifying errors from the ERSS/ERSA error report because
Specialists are required to go into each student record to fix the error.
Fortunately, the culture in the Admissions office is very open and communicative.
Because everyone is an expert in the work pertaining to their job titles, there is a valued rapport
of relying heavily on each other’s support for internal knowledge, information, or requests. Some
of my colleagues have over 10 years of experience at the university and some are only a few
weeks into their current positions. The student assistants, on the other hand, are almost always
young adults in their early 20s who are working towards their bachelor's degrees. Still, we
are all adult learners, and it has been important not to undermine anyone's current knowledge.
The Admissions office also has a director, as well as managerial positions that help supervise and
support the team in its entirety; their support and approval have been important throughout the
implementation process.
Unfortunately, the Admissions Office remains understaffed as the university grows larger
each year, which means many of the employees are overworked. Many of the workers do what is
necessary in their positions to get their work done and tend to keep their attention on
high-priority tasks. It is a very fast-paced office, and tasks, including training, are placed on the
back burner if there is no immediate upcoming deadline. Originally, I thought getting people on
board with this project would be difficult because of the additional time that would be required.
Yet, with the support of management, carving this time out of their days for testing has been
easier than anticipated. Specialists, in particular, were asked to spend additional time on the
details in the training modules; this was intended to save them one-on-one training time later
down the road.

Literature Review
Staff in higher education institutions often take a back seat when it comes to training,
since the student body tends to grow faster than the budget to hire additional in-office workers.
Several authors discuss resistance to changing office culture around training needs. The common
view is that this resistance stems from a mindset of workload prioritization rather than from
unwillingness to implement new training programs. Upper management is also a significant
factor in facilitating proper training within university departments.
In the article, “University employees' perceptions of organizational culture and training
and development: A qualitative case study,” Fuller (2015) studied participants working in various
departments of a university. In interviews with participants on the topic of office culture, “the
common opinion was that the culture in individualized departments was not the same as that of
the organization as a whole,” which greatly impacts the entire campus community. Training is
excluded from office practices because it is not a priority in comparison to the overarching
workload. Fuller also lists emerging themes among higher education employees that address the
nonexistent nature of training programs, as well as employee involvement and motivation
concerns. The quick fix is often the most time-effective solution under budget constraints.
Industry trends.
Staff in higher education take on the responsibility of conducting their own training with
minimal time and minimal cost. The challenge, according to Taryn Oesch at Training Industry,
Inc., is that “in higher education, there's a wide range of skills, from entry-level positions to
people with multiple doctorate degrees” (Oesch, 2018). Therefore, emphasizing the impact of
training, as well as obtaining metrics and proof of learning, is critical. In addition, “the
hierarchical structure in higher education is often unclear, which makes it difficult to mandate
training” (Oesch, 2018). The desire to implement training programs must come from a place of
managerial support. To change habitual in-office trends, the desire for change must trickle down
from the top.
Some organizations, such as private universities, may have the resources to hire a third
party to suggest structural changes. However, when funding is limited, employees should request
professional development when opportunities arise. In addition, requests for consistent and
concise trainings can work their way up to management if enough parties are involved.
Relevant research.
Though many universities currently offer standardized training programs to their
employees, the extent of what an organization is able to offer largely depends on budgets and
funding. John Fielden states that “every institution should prepare a staff development plan as
part of its human resource strategy,” while in the same regard acknowledging an “expansion in
numbers and reduction in funding” (Fielden, 1998). Unfortunately, this is often the case,
especially since CSUs are state funded. Thus, the question is: how do higher education
institutions continue to thrive while ensuring quality employees are properly trained in the
policies, procedures, and common business practices of their campus?
CSUMB currently uses SumTotal to host all training programs. These trainings are
assigned to all faculty and staff and notifications are sent out to employees when an outstanding
training is required. However, the trainings that are offered mostly consist of standardized
trainings that cover such topics as sexual harassment, system security, and hazardous chemicals.
Though these trainings may be well-intentioned efforts to keep all staff and faculty up to date on
preventative measures against lawsuits and workers' compensation claims, actual implementation
of on-the-job training is few and far between, given the lack of funding for internal office staff
program development. Many employees care deeply about their organizations and campuses.
Therefore, continuing to advance workplace knowledge and skills for employees in higher
education is vital to sustaining a harmonious campus culture. Offering a program that supports an
employee's professional growth builds motivation by allowing employees to see how their work
matters to the campus; this andragogical practice is critical not only to employees but to
organizations as a whole.

Solution Description
Proposed Solution
This complete asynchronous online training tool introduces learners to the basics of
Admissions processing with an overview of processing business practices. Following this
introduction is a lesson that explains how to read high school and college transcripts. The
training teaches learners how to decipher information on a transcript, which is required for data
entry. Next, the user is able to interact with the navigation that the processing team uses when
entering data into a student's account. Additionally, this program is designed to provide a full
explanation of the navigation through the Admissions posting and course entry processes. The
required fields for data entry are emphasized, explained, and reiterated with user practice
opportunities and knowledge checks. Since the implementation of the training in our office, new
employees who are hired to help with hard-copy back-office processing are now required to
complete the training; this practice minimizes errors and decreases time spent correcting future
errors from the data integrity queries and the ERSS/ERSA error reports.

Project Goals
The desired goal when developing this course was to eliminate system errors, as well as
errors that appear on the biannual ERSS/ERSA reports that are received by the Chancellor’s
Office. In addition, the goal has been to reduce disruption to in-office employees who are
working with students' admission; in the past, if an error was caught, the solution would require
a Specialist's intervention, which prevented other Admissions staff from moving forward in that
student's account. So far, with the implementation of this new training, new users have still asked
questions, but the background knowledge is instilled, and we can now build upon the schema
developed through interacting with the program. Another very important goal was to increase
productivity by eliminating time spent pausing a task to ask and/or answer questions between the
student assistants and the Specialists; this disruption time has also decreased. Lastly, the office
needed a training that sets a standard in order to eliminate potential miscommunication and
retain consistency during the learning process. These project goals were achieved with the
following pieces of training:
• Providing an overview of Admission Office processing practices.
• Providing a training on the navigation through the education page in Oasis.
• Providing detailed, descriptive training on how to read high school and college transcripts.
• Providing a complete overview of entry fields in Oasis for posting and course entry processing.
• Meeting with the processing team for subject matter expert opinions and reviews of the design.
• Implementing appropriate interactive adult-centered learning with consistent knowledge checks.
• Conducting a design effectiveness survey with usability testing.
To align the project objectives with the learning objectives, objectives were established
for the learner. Upon completing the course, learners will be able to do the following:
• Identify necessary transcript information to post and enter coursework
• Recognize the differences between district transcripts and all other transcripts
• Navigate a student's account from the education page in Oasis
• Post high school and college transcripts in Oasis
• Apply default settings and enter coursework accurately in Oasis
In addition, the following terminal objectives were established prior to project development, in
support of the overarching objective that employees entering transcript data into the system will
enter all system-required fields accurately (affective domain):
1. Given a video of Admissions processing, employees will be able to identify how
paper transcripts are processed from start to finish.
2. Given a transcript, employees will be able to identify the term, year, school subject,
course number, course name, units, and grade on most transcript formats.
3. Given access to Oasis, employees will be able to identify the appropriate fields for
data entry and enter all appropriate fields correctly without error.
4. Given a reference guide, employees will be able to identify the From and To dates on
a transcript with 100% accuracy.
5. Given a list of district transcripts, employees will be able to distinguish the
differences of information on a district transcript and enter all information in Oasis
with 100% accuracy.
6. Given the course entry navigation in Oasis, employees will be able to apply the
proper default settings to all required fields for course entry.
7. Given a list of possible answers, employees will be able to identify the correct grade
code to apply in the system for the task of entering courses with 100% accuracy.
8. Given a simulation in Oasis, employees will be able to enter all required fields
without error for posting and course entry.

Learning Theories and Instruction Principles


Andragogy, the adult learning theory developed by Malcolm Knowles in the 1970s, was the
foundation for this design. Since this theory is still relevant today, and I work with adult learners,
I have applied its main elements. The main components of Knowles' theory are: need for
knowledge, motivation/willingness, prior experiences, self-direction, and orientation to learning
(Gutierrez, 2018).
The learner has been given clear reasons why they will want to know the intended
topics of instruction. The learner has also been able to see what they will gain from taking the
course. This information is stated early on in the objectives and in micro-bits throughout the
course. Many times, users are informed of how certain elements of the instruction affect other
parts of processing. I would speculate that reducing personal error rates may be reason
enough for an adult learner to want to improve their knowledge base for this task, which also
falls into the willingness category. This is reinforced by means of the quizzes and knowledge
checks throughout the instruction. Additionally, I have considered the cognitive, social, affective,
and conative sources of motivation (Huitt, 2011).
I have considered the adult learners' prior experiences by not over-explaining materials
that the participants may already know; this was built into the design by having the information
in each lesson build upon the next. Thus, the microlearning aspect of the design is effective not
only for new learners but also for learners who are returning to the training and seeking one
or two elements of the instruction; this also avoids undermining anyone's intelligence. Users
can choose which module to begin with, so they are not required to go through the entire lesson
if they already know how to do one of the tasks. The design allows the user to make their own
decisions in some areas of the course. Lastly, implementing task-oriented learning exercises
helps orient the learner to the job.
Furthermore, John Keller's ARCS model of motivational design is followed for
media usage and knowledge retention. Keller explains that attention, relevance, confidence, and
satisfaction are key elements of multimedia instructional design (David, 2014). Thus, the design
incorporates ways to draw the learner's attention and reinforce materials and subject matter by
means of highlighted fields, zoomed-in sections, and arrows. As with andragogy, relevance to
learning must be established early, or there is potential for losing motivation; thus, the
importance of each lesson is stated in iLearn prior to beginning each major lesson. The benefits
of the training are explained to the learner at the forefront in an attempt to establish job
relevance early.
As for confidence, reasonable objectives for obtaining necessary on-the-job skills have
been established. The modules build upon prior knowledge rather than expecting the learner to
complete a task without any initial context. Posting a transcript acts as a scaffold for the course
entry process. Feedback is provided throughout practice assessments, knowledge checks,
and interactive activities, while learners still retain control over their direction. This is to ensure
learners feel a sense of satisfaction by being able to apply the knowledge learned to the job.
Satisfaction is promoted by positive feedback throughout each lesson.

Media Components
The training course, including audio edits, was developed using Adobe Captivate, with
the support of Camtasia for video editing. Access to a test environment in Oasis was granted by
the Admissions department to prevent violations of FERPA (the Family Educational Rights and
Privacy Act) and to protect current student information. Regular access to OnBase, the CSUMB
document database, was necessary for collecting screenshots of high school and college
transcripts. A simulation video was implemented using both media environments: OnBase and
Oasis. The entire course, which includes individual Captivate lessons with SCORM packages, is
hosted in iLearn. Knowledge checks and test questions were developed in iLearn, and all scores
are recorded via this LMS. All lessons have been made as accessible as possible with the
inclusion of closed captioning for all audio elements. Public media borrowed from outside
sources has been credited. Access to the Admissions Processing training will be granted to all
new CSUMB employees involved with processing, as necessary.

Challenges
Though we have incorporated this training into the new-employee training procedures,
establishing a consistent practice of using it may still pose a challenge. The office culture has
developed a preference for shortcuts in training, since the workload is heavy and help and
resources are limited. Implementing this change has required some convincing, and reminders to
break the routine of one-on-one trainings are still necessary for the people involved in
processing. Challenges were also anticipated in the design phase: Adobe Captivate was a new
program to me and required extensive personal training to learn all of the desired capabilities
that Captivate offers.
Concerns also arose from the knowledge that CSUMB was in the process of switching to
a new application, called Salesforce. At this point, that application is on hold. However, an
entirely different project has managed to completely alter Admissions processing business
practices. Initially, I expected this to be a minor concern, with the anticipated go-live date being
far in the future. However, as explained in the executive summary, the process went live about
two weeks prior to the initial capstone due date. The actual changes in the process remove much
of the need for manual posting, which in the long run is a great feature, though it means the
manual training may not be as relevant as it was a few weeks ago. Upon speaking with
management, this lesson in the overall training will remain mandatory, as it still helps with
understanding the way we previously entered transcripts in a student's account. Additionally, the
transition to the automated process is still in limbo, so the Specialists and processing team are
still practicing manual transcript posting for the time being. The processing team will also need
to know how to enter transcripts manually for one-off cases as they come into play.
Lastly, technological equipment has been an important consideration in the development
of this design. Unfortunately, due to budget constraints, the office uses very outdated computers,
and the training application needed to be hosted on a site that would not cause the computers to
slow down. Thus, the solution was to provide all users access to iLearn. The iLearn managers of
the Oasis Training course will, however, need to remember to add and remove user access as
new employees come in and out of the office.

Methods/Procedures
Narrative Description
The design began with the launch of the course entry module. As I received feedback and
ran usability tests, the course was broken down into micro parts, and four additional modules
were developed next. The product came together in the iLearn host site during the development
phase. Once the main elements of the design were accomplished, the quizzes were developed
and added in iLearn. Lastly, the iLearn site was set up for user-friendly viewing and navigation.
Particular people were then added as students. I also asked an administrator to change the
settings so that anyone with the link can access the site, allowing the project to be posted to the
course forum. All new and current student assistants, temporary employees, Admissions
Specialists, and management have been asked to take this training. The evaluation phase
consisted of regular reviews, including summative and formative evaluations, tracking learner
success and knowledge transfer.
The development plan timeline was rather lengthy because the majority of the materials
needed to be created and existing materials needed to be revised. In addition, assistance from
other parties (i.e., interdepartmental help) was needed to create the program successfully. The
majority of the course design is online, though there is also a ten-minute session at the end of
the training to sit with a Specialist and ask any questions that the learner may have. This
one-on-one time is factored into the hour-long capstone training. The Specialists work on a
number of tasks, but upon assessment and review, student assistants spend the most time helping
the Specialists with posting transcripts and entering coursework. Therefore, there has been
careful consideration of this training and the internal needs of the office.
Because there are two dense topics of training, the design outline is strategically created
to consider the order of events in which data must be entered into the system. Posting and course
entry are individualized tasks, but they do require order and initial understanding of the effects of
entering information incorrectly. Thus, a regular review of information and module testing will
continue to occur during evaluation.

List of Steps Completed


1. Captivate and storyboard draft review with Specialists
2. Screenshots were taken from Oasis and OnBase as needed
3. Completed the course entry demo video for the final project in IST 526
4. Received management approval for user testing
5. User testing was conducted; some Admissions staff tested for design accuracy and summative evaluation
6. Additional screenshots were reviewed and edited to secure all personal student data
7. Additional Captivate modules were created upon user feedback
8. iLearn host site was requested and created for the Admissions Office
9. iLearn host site was revised for user-friendly navigation
10. All student assistants tested the design for usability
11. User survey and assessment data were recorded for evaluation
12. Management approval and sign-off obtained

Resources
The costs of the actual design are minimal. However, as professional development for me
as the designer of this training, the Admissions office paid for me to attend an Adobe Captivate
training, which cost a total of $799, plus food and travel expenses. The Director of Admissions
hopes to see a return on investment through the development of this training, with the
expectation that there will be a reduction in time spent on one-on-one training and an increase in
productivity. The following elements were involved in the development and implementation of
this project:
• Adobe Captivate for primary training and storyboarding
• Oasis test environment for student profile simulations
• OnBase for screenshots of transcripts
• Camtasia for video editing and security
• Onsite conference room for design discussion meetings
• Individual computer stations for access to training modules
• Microsoft Word for draft development documents
• Google Drive for design communication
• Time with Admissions employees for interviews and usability testing

Timeline
April 3, 2019: Requested access to the PRJ Oasis test environment for product development
April 9, 2019: Received access to the PRJ Oasis test environment for product development
July 9, 2019: Meeting with management regarding implementation of capstone project
July 11, 2019: User testing with test participant #1
July 15, 2019: User testing with test participants #2 and #3
July 16, 2019: User testing with test participant #4
July 17, 2019: User testing with test participant #5
July 18, 2019: User testing with test participant #6
July 20, 2019: Completed summative evaluation and effectiveness testing on demo module from IST 526
July 23, 2019: Sent effectiveness testing report to management to showcase learning effectiveness
September 12, 2019: iLearn host site request submitted and obtained
September 24, 2019: Final storyboard submission
September 25 – November 3, 2019: Creation of additional Captivate modules
October 20 – November 3, 2019: Completion of iLearn host site/LMS container
October 22, 2019: Project checkpoint with MIST advisor and peers
November 4, 2019: User testing for final draft of capstone project
November 7, 2019: Sent management link to iLearn course for approval
November 12, 2019: Final draft of capstone project submission
November 15, 2019: Management approval and sign-off on use and implementation of all processing training elements
November 15, 2019: Final evaluation of usability testing
November 16, 2019: Final review of formative evaluation quiz data
November 17, 2019: Completed changes to Captivate modules upon management suggestions and usability results
December 10, 2019: Final report submission

Formative & Summative Evaluation Plan


Formative Evaluation Plan
The formative evaluation includes user testing of the Course Entry module. Additional
formative evaluation took place upon completion of the entire course, when feedback from user
surveys was collected and recorded.
Usability testing.
The first usability test was conducted in the Admissions Office with six individual testers
on the course entry module. I scheduled appointments with each tester, sat beside them, and took
notes on my observation checklist (Appendix B) as they trained with the module. My notes
record that certain features were either ignored entirely or skipped over due to confusion with the
navigation. In an effort to collect qualitative and quantitative responses, I asked users to
complete a questionnaire after they finished their training with the first demo module.
The observations yielded important takeaways from the user interactions. Learners
understood how to navigate through the module but had difficulty with some of its features; only
one user stated that there were never any frustrations in using the training. One user stated, “I
sometimes found myself lost on how to click within the training.” Another user had a similar
experience and stated, “Instruction on moving forward to next segment was confusing.” Most
users did state that they liked the visual aids and the different practice opportunities, but changes
to the module were clearly critical.
A request for “brutal honesty” in the user survey was emphasized, and all users
gave quality feedback and suggestions for improvements. Two users suggested offering more
course entry examples. Three users suggested improving the user functions. One user stated a
desire for a “more clear indication of what to select to move forward.” Another user stated, “I
think question/clarification boxes might be helpful for clicking issues.”
A final user survey was presented at the final product user test. Two new temporary
employees and four current student assistant employees all took the training in its final stage of
development. This questionnaire included questions regarding user-friendly applications,
understanding, information retention, and suggested changes and improvements; it also
contained a section for additional commentary (Appendix C).

Summative Evaluation Plan


For my level 1 summative evaluation, I first used the course entry module to establish
learning effectiveness. Upon completion of the entire course development, I had two additional
users test the entire training. These users were new to all Admissions processing, as they were
recently hired temporary employees who had just joined the team that week. The timing was
quite fortunate, and again, the effectiveness evaluation proved to be substantial.
Pre- and post-testing.
I created a pre- and post-test for the course entry module and discovered significant
learning had occurred. Despite some usability challenges, all users scored higher on the post-test
than on the pre-test (Table 1.3). The pre- and post-test scores were obtained manually by
comparing individual user scores from both tests. Since users took the tests in Google Forms,
the results were retrieved by exporting all results from the pre- and post-tests to a Google Sheet.
One point was granted for each correct answer on both the pre-test and the post-test. Once the
scores were totaled manually, they were placed in an Excel file (Table 1.1) and compared for
statistical significance. Only 12 of the 13 questions were graded and included in the learners'
scores. The results were meant to determine whether the lesson had any effect on the user's
understanding of how to enter coursework correctly in Oasis.
Pre-Test Score    Post-Test Score
      6                 10
      5                  8
      2                  8
      4                  8
      4                  7
      4                  7
      4                  6

Table 1.1

Table 1.3: Chart of individual pre- and post-test scores (image not reproduced)
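
To illustrate the scoring step described above, the following sketch totals one point per correct answer per user. It is a hypothetical reconstruction rather than the actual workflow (the real scoring was done by hand in a Google Sheet and an Excel file); the CSV file names, the "user" column, and the answer key are illustrative assumptions.

# Hypothetical sketch of the manual scoring step: one point per correct
# answer, totaled per user from a Google Forms CSV export. File names,
# column names, and the answer key are illustrative assumptions.
import csv

ANSWER_KEY = {"Q1": "B", "Q2": "D", "Q3": "A"}  # 12 graded questions in practice

def total_scores(path: str) -> dict:
    """Return {user: total points} for one exported test."""
    totals = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["user"]] = sum(
                1 for q, correct in ANSWER_KEY.items() if row.get(q) == correct
            )
    return totals

pre_scores = total_scores("pretest_responses.csv")
post_scores = total_scores("posttest_responses.csv")
for user in pre_scores:
    print(user, pre_scores[user], "->", post_scores.get(user))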

The post-test mean value of 7.714 was significantly higher than the pre-test mean value
of 4.143 (Appendix A). To measure results, a paired two-sample t-test for means was run for
dependent samples with 6 degrees of freedom (Table 1.2). The data provide evidence of learning
transfer and demonstrate the effectiveness of the course lesson. Since the hypothesis is
directional, the one-tail values are the relevant values for comparison. The absolute t-statistic of
7.43 is much larger than the one-tail t-critical value of 1.94. In addition, the one-tail p-value of
0.00015 is much smaller than the standard 0.05 alpha level. Therefore, the null hypothesis is
rejected; the results are statistically significant.
t-Test: Paired Two Sample for Means

                                Variable 1 (Pre-Test)   Variable 2 (Post-Test)
Mean                            4.142857143             7.714285714
Variance                        1.476190476             1.571428571
Observations                    7                       7
Pearson Correlation             0.46897905
Hypothesized Mean Difference    0
df                              6
t Stat                         -7.426106572
P(T<=t) one-tail                0.000153396
t Critical one-tail             1.943180281
P(T<=t) two-tail                0.000306792
t Critical two-tail             2.446911851

Table 1.2
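
As a cross-check, the Excel output in Table 1.2 can be reproduced from the raw scores in Table 1.1. The sketch below uses Python and SciPy, which were not part of the original analysis (Excel's Analysis ToolPak was used); it is included only to show that the paired-samples result is straightforward to verify.

# Reproduce the paired t-test in Table 1.2 from the raw scores in Table 1.1.
# Verification sketch only; the original analysis used Excel's
# "t-Test: Paired Two Sample for Means" tool.
from scipy import stats

pre = [6, 5, 2, 4, 4, 4, 4]    # pre-test scores, one entry per user
post = [10, 8, 8, 8, 7, 7, 6]  # post-test scores, paired by row

result = stats.ttest_rel(pre, post)  # dependent (paired) samples, df = 6
print(f"t Stat:           {result.statistic:.6f}")   # -7.426107
print(f"P(T<=t) two-tail: {result.pvalue:.6f}")      # 0.000307
print(f"P(T<=t) one-tail: {result.pvalue / 2:.6f}")  # 0.000153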
In addition, the effect size was calculated to determine whether the results were
practically significant. The experimental pre-test scores before training (M = 4.14, SD = 1.21)
and the observed post-test scores after training (M = 7.71, SD = 1.25) differed significantly
[t(6) = 7.43, p < .05, d = 2.85]. The effect size of 2.85 is far larger than the conventional 0.8
threshold for a large effect; thus, the observed difference between the two tests is practically
significant, and the training did have a significant effect on the learner.
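
For reference, the report does not state which standardizer was used for Cohen's d, so the worked equation below is an assumption using the pooled standard deviation of the two score sets:

$$ d = \frac{M_{\text{post}} - M_{\text{pre}}}{SD_{\text{pooled}}} = \frac{7.714 - 4.143}{\sqrt{(1.476 + 1.571)/2}} = \frac{3.571}{1.234} \approx 2.89 $$

Standardizing by the standard deviation of the paired differences (about 1.27) gives d ≈ 2.81 instead. Either convention lands near the reported 2.85, and all of these values far exceed the 0.8 benchmark for a large effect.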
In addition, a level 2 evaluation was completed based on the declarative and
procedural test results. I tested the end product on two new temporary employees who took the
training in its entirety. Both users passed all module quizzes with 100% and managed to
successfully post a transcript and enter all courses without errors. Both users printed their
certificates of training completion, which can only be retrieved upon successful completion of
each module simulation assessment. Therefore, the users gained an accurate understanding of
the information provided throughout the training.

Final Thoughts
The Specialists will play a key role in the continuing implementation of this training,
since the training is designed for a task that a Specialist is responsible for seeing through to
completion. Specialists are also responsible for managing all back-office student assistants and
will need to make sure any newly onboarded employee receives this training. In addition, all
administrators have been offered training in order to sustain the continual use of the LMS site.
Managers who are assigned as administrators now know how to add users as students to the
course and will need to know how to navigate to iLearn. This project was presented in a meeting
to all Admissions staff and was received quite well, with good comments following the
presentation. Managerial staff, in particular, are excited to have this training in the office and
have since discussed adding additional trainings over time.
Overall, this is something I have wanted to help implement for the Admissions Office for
quite some time. As an Admissions Specialist, I know how time-consuming our individualized
trainings are, and I was seeking an improvement to our old training practices. My goal has been
to find a solution that would help manage time in the office for training tasks, as well as reduce
common human errors. This solution has proven to be both effective and engaging to users and
third-party observers. Management has shown their buy-in and offered their support from the
beginning idea stages through the final product. I have been fortunate enough to create a
long-lasting product that will help support new employees in the area of Admissions processing,
all while working towards earning my degree.
I have discussed further record keeping for this training, and my office has decided
against further pre- and post-testing, though the original summative evaluation is said to have
been useful and appreciated in reviewing the training. As a means to continue growing in the
area of training, we are discussing allocating some of my time to job aid development and to
potentially adding the automated process as a module in this training.
Personally, I will continue to make note of the questions I receive after this training and
look for ways to clarify the answers to all of them in these modules. I also hope to expand the
length of this training to cover more elements of Admissions processing. If time allowed, I
would like to have a full day's worth of training that covers the entire line of processing, because
there are always odd cases that present themselves and more examples that can be shown to help
clarify understanding. For now, this training will be put to good use, and the end product has
proven to serve its purpose well.

References
David, L. (2014, July 23). ARCS model of motivational design theories (Keller). In Learning
Theories. Retrieved from https://www.learning-theories.com/kellers-arcs-model-of-motivational-design.html

Fielden, J. (1998, August 8). Higher education staff development: A continuing mission.
Retrieved November 12, 2019, from http://www.unesco.org/education/educprog/wche/principal/mission.html

Fuller, C. R. (2015). University employees' perceptions of organizational culture and training
and development: A qualitative case study (Order No. 10099974). Available from ProQuest
Dissertations & Theses Global: The Humanities and Social Sciences Collection. (1783602416).
Retrieved November 12, 2019, from https://search.proquest.com/docview/1783602416?accountid=10355

Gutierrez, K. (2018, April 28). Adult learning theories every instructional designer must know.
Retrieved from https://www.shiftelearning.com/blog/adult-learning-theories-instructional-design

Huitt, W. (2011). Motivation to learn: An overview. Educational Psychology Interactive.
Valdosta, GA: Valdosta State University. Retrieved November 12, 2019, from
http://www.edpsycinteractive.org/topics/motivation/motivate.html

Oesch, T. (2018). Leading training in higher education: Leveraging the learning culture for
faculty and staff. Training Industry. Retrieved November 12, 2019, from
https://trainingindustry.com/articles/strategy-alignment-and-planning/leading-training-in-higher-education-leveraging-the-learning-culture-for-faculty-and-staff/

Rothwell, W. J., Benscoter, G. M., King, M., & King, S. B. (2016). Mastering the instructional
design process: A systematic approach (5th ed.). Hoboken, NJ: John Wiley & Sons.

Appendix A: Descriptive Statistics

Statistic             Pre-Test Score   Post-Test Score

Mean                  4.142857143      7.714285714
Standard Error        0.459221465      0.473803541
Median                4                8
Mode                  4                8
Standard Deviation    1.214985793      1.253566341
Sample Variance       1.476190476      1.571428571
Kurtosis              1.778772112      1.492561983
Skewness             -0.366392178      0.739707742
Range                 4                4
Minimum               2                6
Maximum               6                10
Sum                   29               54
Count                 7                7

Appendix B: Observation Checklist



Appendix C: Final User Survey Results


