
Running head: ACP EVALUATION PLAN 1

The Assessment Certificate Program

at DePaul University and Loyola University Chicago

Evaluation Plan for the Assessment Certificate Program

Ryan Crisp, Chelsea Metivier, Ariel Ropp

Loyola University Chicago

12/15/16

Table of Contents

Introduction
Rich Description of the Program
History and Context of the Program
Statement of the Problem
Justification for the Evaluation Plan
Stakeholder Analysis
Logic Model
Assumptions
External Factors
Quantitative Approach
Population and Sampling Frame
Research Design and Method
Participant Recruitment
Survey Instrument
Statistical Analysis
Quantitative Data Presentation
Qualitative Approach
Methodological Strategy: Individual Interviews
Recruitment Strategies
Positionality Statement
Instruments
Data Analysis Procedures
Validity and Ethical Considerations
Qualitative Data Presentation
Timeline and Budget
Limitations
Next Steps
References
Appendices
Appendix A: Logic Model
Appendix B: Pre-test Website Invitation
Appendix C: Pre-test Survey Instrument
Appendix D: Post-test Email Invitation
Appendix E: Post-test Email Invitation Reminder
Appendix F: Post-test Survey Instrument
Appendix G: Interview Invitation Email
Appendix H: Interview Confirmation Email
Appendix I: Interview Invitation Follow-up Email
Appendix J: Consent to Participate in Interview
Appendix K: Interview Protocol
Appendix L: Timeline
Appendix M: Budget

Evaluation Plan for the Assessment Certificate Program

The Assessment Certificate Program (ACP) is a program designed to provide training and

support for Loyola University Chicago and DePaul University faculty and staff charged with

assessing college student learning. This workshop-based program is a collaboration between Loyola's Faculty Center for Ignatian Pedagogy and DePaul's Division of Student Affairs as well as its Office for Teaching, Learning, and Assessment (Program Overview, 2016). Although the ACP does not have an explicitly stated mission, it is guided by the missions of the program's host institutions.

Loyola University Chicago's mission states that the institution works to expand knowledge to serve humanity through learning, justice, and faith (Mission and Identity, 2016). Similarly, DePaul University's mission states that it pursues the preservation, enrichment, and

transmission of knowledge and culture across a broad scope of academic disciplines (Office of

Mission & Values, 2016). Furthermore, DePaul emphasizes its public service responsibility by

encouraging its faculty and staff to use their expertise to contribute towards social good and

developing partnerships with other institutions and agencies when appropriate. As a free

professional development program, the ACP exemplifies both institutions' desire to use their expertise to expand knowledge for the collective good and their willingness to collaborate

with other institutions.

Rich Description of the Program

The ACP began in September of 2014 and has since offered over 60 workshops to nearly

300 individuals registered in the program. The majority of participants are faculty, staff, and

graduate students of DePaul and Loyola, as the program was not initially designed for those

outside of the Loyola and DePaul communities. However, because of professional connections

and in recognition of both institutions' missions to serve the Chicagoland community, the

program staff recently opened the workshops to individuals from Chicago State University, an

institution which suffered financial cuts during the 2015-16 fiscal year.

Although anyone from DePaul, Loyola, and now Chicago State University may attend

ACP workshops, individuals interested in receiving certification must formally register for the

program. It is not necessary to be registered for the ACP to attend the workshops, but the hope is

that participants will want to register and complete the program. Nevertheless, the program is

designed to be accessible professional development for all, not limited to those who can commit the time to complete the program. Completion of the program demonstrates

commitment, exposure, and a basic level of proficiency in student learning assessment practices.

If participants do register for the certification, they are required to attend five workshops

and then complete and present a culminating project demonstrating what they learned from the

programs workshops. Participants from Loyola and DePaul can attend all but one of the

workshops at either institution and may attend the workshops in any order they choose.

Workshop topics include survey and rubric design, qualitative and quantitative assessment

plans, and non-traditional assessment methods, among others. The only mandatory workshop is

the Introduction to Assessment workshop, which must be taken at the participant's home

institution. This workshop introduces the purpose and framework of assessment as well as an

overview of assessment initiatives specific to the home institution. Once a participant has

attended five workshops they must submit a culminating project proposal to the program

planning team (described in detail later), which provides feedback as well as the green light to

create and present the project. Participants receive a certificate of completion after

successfully presenting their final project. The ACP encourages current participants and alumni

to stay engaged with the program by attending additional workshops, proposing new topics for

workshops, or presenting a workshop on a topic of their choice.

History and Context of the Program

The program's conception grew from conversations between Shannon Milligan, the Assessment Coordinator at Loyola's Faculty Center for Ignatian Pedagogy, and Jennifer Sweet, the Associate Director of Assessment at DePaul's Office for Teaching, Learning, and Assessment. Milligan and Sweet recognized the need for assessment training for

faculty and staff, especially as their institutions began to place greater emphasis on assessing

student learning in the classroom as well as extracurricular programming. Together they

envisioned the ACP to address that need.

As expressed in the ACP purpose statement, this program intends to support faculty and

staff charged with assessing student needs in and out of the classroom. Milligan and Sweet

currently work within academic affairs at their respective institutions, but they both began their

careers in student affairs and recognize the value of assessment within that realm as well. However, because they lacked a nuanced understanding of their institutions' student affairs assessment needs, they invited student affairs staff at DePaul and Loyola to support the program. These individuals include Ellen Meents-Decaigny,

Assistant Vice President of Planning, Operations, and Assessment in the Division of Student

Affairs at DePaul, and D. Scott Tharp, Assessment Coordinator in the Division of Student

Affairs at DePaul. Until a few months ago, Michael Beazley, Director of Assessment in the

Division of Student Development at Loyola, also served on the planning team. However,

Beazley recently accepted a new position at Loyola's Rome campus and his position on the team

has yet to be filled. Together, the planning team designs and facilitates almost every workshop,

with a few exceptions for guest speakers. Both institutions also have graduate interns assisting

with the logistics of the program. The facilitation of the program is contained within the departments and divisions of its planning team.

Statement of the Problem

Assessment is crucial to student learning, as it uncovers whether the goals of the

classroom or extracurricular program in question are being met. Many faculty and staff

members at DePaul and Loyola recognize the need for student learning assessment but lack basic

competencies in this area. As the only assessment training program available to faculty and

staff at DePaul and Loyola, the ACP plays a key role in supporting faculty and staff in their

assessment of student learning outcomes. With its flexible structure and lack of fees, the ACP is a convenient and cost-effective way for faculty and staff to increase their knowledge and skills in educational assessment, at least in theory. Although it is apparent that the ACP addresses a

substantial need at its participating universities, it is still unclear how well the program is

achieving its broadly stated goals.

In this evaluation, we intend to measure the overall effectiveness of the ACP at

improving participants' assessment knowledge and skills. We also seek to understand the

effectiveness of specific elements of the workshops and the overall program structure. To

address the first question, we will ask alumni about their assessment self-efficacy and ability to

integrate ACP teachings into their practice. To answer our second evaluation question, we will

concentrate on the effectiveness of the program's design (such as flexibility of programming and

workshop timeline) and the quality of its workshop content. This evaluation plan incorporates

both process assessment and outcomes assessment, as it is concerned with the implementation of

the ACP workshops as well as the learning outcomes of its participants.



Justification for the Evaluation Plan

Launched in September 2014, the ACP is a relatively new program that has yet to

undergo a formal evaluation. Participants are encouraged to suggest workshop topics and to

answer evaluation questions after individual workshops, but the ACP planning group has not

taken a systematic approach to evaluate whether the program is successfully meeting

participants' learning needs. Given the program's relative infancy, now is an excellent time to

conduct a formative evaluation to assess if the program is moving in a positive direction.

According to Fitzpatrick, Sanders, and Worthen (2010), an evaluation is considered to be

formative if "the primary purpose is to provide information for program improvement" (p. 20). In

this case, the evaluation results can be used to recommend changes to improve the ACP before the program becomes entrenched in less effective practices.

Another reason for the importance of this evaluation is the need to increase buy-in from

upper-level administrators, particularly in Loyola's Division of Student Development and

Faculty Center for Ignatian Pedagogy. Since the departure of Michael Beazley from the Division

of Student Development, Shannon Milligan has become the sole representative of Loyola on the

ACP planning team. Milligan coordinates the Loyola side of the program largely on her own,

with little engagement from other Loyola administrators. It is unclear whether Michael

Beazley's position will be filled and how the Division of Student Development might support

this program in the future. In light of these unresolved staffing questions, a formal evaluation

could highlight the necessity of the program to Loyola administrators. If the evaluation results

illustrate that the program is successfully addressing participants' learning goals, these results can be shown to administrators to validate the program's existence and advocate for additional

resources and support.



For this evaluation plan, we propose a mixed method design that incorporates both

quantitative and qualitative components. The quantitative portion will include a pre-test and

post-test with questions focusing on participants' learning outcomes. The qualitative portion will consist of interviews with approximately five participants, asking detailed questions about participants' perceptions of the program content and structure. Since the quantitative and

qualitative components focus on different types of questions, both components are of equal

importance to the overall evaluation plan. The quantitative surveys will be launched before the

qualitative interviews start but may continue even after the interviews end, due to the rolling

admission structure of this program. This mixed methods design is similar to a concurrent nested

approach because it has two methods that address different kinds of questions, and the second

data set is nested in the middle of the first data collection (Creswell, 2009). However, unlike a

traditional nested approach, this plan gives equal importance to both the quantitative and

qualitative components. We believe that our mixed methods design will lead to a better

understanding of the evaluation's two main concerns (i.e., process and outcomes) than either

qualitative or quantitative approaches alone.

Stakeholder Analysis

Identifying, analyzing, and engaging stakeholders is essential to the design and

implementation of a successful evaluation plan (Bryson & Patton, 2010). Although the ACP has

numerous stakeholders, only a few individuals have both high interest in the program and high

power to implement an evaluation. According to Bryson and Patton (2010), stakeholders are "individuals, groups, or organizations that can affect or are affected by an evaluation process or its findings" (p. 31). Stakeholders are then grouped according to their relative power over and interest in the program. Key players, who hold both high power and high interest, include the four administrators who comprise the program planning team:

Shannon Milligan, assessment coordinator for the Faculty Center for Ignatian Pedagogy at

Loyola; Ellen Meents-Decaigny, assistant vice president of planning, operations, and assessment

in the Division of Student Affairs at DePaul; Jennifer Sweet, associate director of the Office for

Teaching, Learning, and Assessment at DePaul; and D. Scott Tharp, assessment coordinator in

the Division of Student Affairs at DePaul. A fifth committee member, Michael Beazley, recently

left his position as director of assessment in the Division of Student Development at Loyola and

has not been replaced. The planning team plus two graduate interns regularly meet to plan the

ACP curriculum and coordinate its operation. As such, their direct input will be crucial to the

creation and execution of the evaluation plan.

Context-setters, those who have substantial power but lower direct interest in the ACP,

are the upper-level administrators to whom the planning team members report (Bryson & Patton,

2010). These stakeholders can influence the resources available to the ACP planning team but

have little interest in the daily operations of the program and therefore will not be engaged in the

evaluation process. At Loyola, Shannon Milligan reports to Carol Scheidenhelm, director of the

Faculty Center for Ignatian Pedagogy, who is committed to the success of the program but

provides little tangible support. Scheidenhelm reports to Vice Provost of Academic and Faculty

Resources David Prasse, who in turn reports to Provost John Pelissero. Additionally, upper-level

administrators in Loyola's Division of Student Development have historically played a role in

providing resources to the ACP, though their involvement has ceased since the departure of

Michael Beazley. At DePaul, Ellen Meents-Decaigny and Scott Tharp report to Vice President

of Student Affairs Eugene Zdziarski. Jennifer Sweet reports to Ruben Parra, director of the

Office of Teaching, Learning, and Assessment, who reports to Caryn Chaden, the Associate

Provost for Student Success and Accreditation.



Besides the key players and context-setters, evaluators must consider engaging

individuals who have high interest in the ACP but low power to influence its implementation.

Program participants generally fall into this category of high interest/low power (Bryson &

Patton, 2010). In this case, the primary subjects of the program are Loyola and DePaul faculty,

staff, and graduate students who are currently enrolled in the ACP, as well as faculty, staff, and

graduate students who attend occasional ACP workshops but are not enrolled in the program.

Current workshop participants, especially those enrolled in the ACP, may be affected by the

results of this evaluation and therefore should be consulted to learn if the program is helping

them achieve their desired learning goals. Likewise, alumni of the ACP may have high interest

in the success of the program, particularly alumni who continue to attend new ACP workshops to

enhance their professional development. Prospective ACP participants, individuals who have expressed interest in the program but are not enrolled, also have relatively high interest but low

power to influence the direction of the program. These individuals are important to consider

when constructing an evaluation plan, but for this evaluation, we will concentrate our

engagement on participants who have already completed the program and who can provide a more comprehensive perspective on the program's strengths and weaknesses in terms of content

and delivery.

The final stakeholders in this analysis are those who fall into the "crowd" category,

having little power or interest in the ACP. Individuals in this category include faculty, staff, and

students at Loyola and DePaul who are not currently in the ACP and have little intention of

joining in the future. To keep the scope of this evaluation manageable, we will not engage these

stakeholders. A complete list of stakeholders is organized in the Power Versus Interest Grid

below (Bryson & Patton, 2010).



Stakeholders

Players (high power/high interest)
o ACP planning team:
  o Shannon Milligan (Loyola)
  o Ellen Meents-Decaigny (DePaul)
  o Jennifer Sweet (DePaul)
  o Scott Tharp (DePaul)

Context Setters (high power/low interest)
o Director of Faculty Center for Ignatian Pedagogy: Carol Scheidenhelm (Loyola)
o Vice Provost of Academic and Faculty Resources: David Prasse (Loyola)
o Provost: John Pelissero (Loyola)
o Vice President of Student Affairs: Eugene Zdziarski (DePaul)
o Director of the Office of Teaching, Learning, and Assessment: Ruben Parra (DePaul)
o Associate Provost for Student Success and Accreditation: Caryn Chaden (DePaul)

Subjects (high interest/low power)
o Current and prospective faculty, staff, and students enrolled in the ACP
o Faculty, staff, and students who attend workshops but are not enrolled in the ACP

The Crowd (low power/low interest)
o Loyola and DePaul faculty, staff, and students who are not in the program
o Loyola and DePaul communities at large

Logic Model

Komives, Dugan, Owen, Wagner, Slack, and Associates (2011) define a logic model as "a technique that clearly articulates each of the program goals and objectives, the activities, events, and projects that will occur to accomplish these objectives and a specific way to measure outcomes associated with each objective" (p. 186). Our team created a logic model (Appendix A)

to outline the successful implementation of the ACP offered jointly by the Faculty Center for

Ignatian Pedagogy at Loyola, the Division of Student Affairs at DePaul, and the Office for

Teaching, Learning, and Assessment at DePaul. Although the skills and knowledge gleaned

through the ACP are applicable in myriad ways, this model focuses on the success of program

alumni in integrating assessment methods into their professional practices, as well as the

effectiveness of the program in delivering desirable content.

Moving from left to right on our model, we first outline the multiple inputs of the ACP,

which have been categorized as follows: personnel, time, finances, and technology. Second, the

model explains how these inputs are transformed into the outputs of activities and participation.

The activities section focuses on the central piece of the program (i.e., the workshops that

comprise the ACP) and considers the components such as marketing, promotion, and

collaborative team meetings that contribute to the effectiveness of program planning and

implementation. The participation section outlines the administrative staff responsible for

managing the various components of the program as well as the faculty and staff members

enrolled in the program. Lastly, the model projects short- and long-term outcomes for the

program under successful conditions, considering all inputs and outputs. The model assumes

that upon completion of the program, ACP alumni will use the knowledge gained in the program

to improve their use of assessment in their current professional positions (short-term outcomes)

while continuing to evolve and consider the changes in the evaluation field as they progress

professionally (long-term outcomes).

The short-term goals focus specifically on the expansion of assessment knowledge of

alumni upon successful completion of the program and the submission of their capstone project.

Through this evaluation, we aim to help the coordinators of the ACP to better understand how

well alumni are able to integrate skills acquired through the program into their practices, and to

home in on any potential inefficiencies in the delivery of program content. The long-term goals

take a more macro view of the application of program knowledge, assuming that assessment

practices will shift over time. Long-term considerations also include the assumption

that ACP alumni will work in multiple environments that will allow them to tailor the lessons of

the program to the needs of multiple different departments and/or educational settings.

Assumptions

In addition to the inputs, outputs, and outcomes of this assessment, it is important to also

consider the beliefs the evaluators have about the ACP prior to conducting the assessment. At

the bottom left of the model, assumptions outline the expectation that the ACP alumni and

prospective participants have a demonstrated interest in expanding their knowledge of best

practices in the field of evaluation and that professional staff have a vested interest in continuing

to expand their knowledge of the topic to ensure continual improvement. The model also

assumes that administrative staff members who manage the ACP are committed to the long-term

sustainment of the program.

External Factors

At the bottom right of the model, external factors consider the environment in which the

ACP exists, the organizational and reporting structure of the managing departments at Loyola

University Chicago and DePaul University, and the academic and personal time demands of both

professional support staff as well as participants. The model recognizes that the ACP hinges on

financial and personnel support from both universities, and that successful evaluation of this

program involves the consideration that these factors interact with and influence programming

decisions.

Quantitative Approach

The following section will summarize the quantitative approach in this evaluation plan:

an outcomes-based evaluation of the Assessment Certificate Program (ACP). The quantitative



approach is the first step in our evaluation plan and provides the groundwork for the more

nuanced qualitative assessment. While the process-based qualitative assessment will focus on

analyzing the success of specific elements of the workshops and overall program structure, the

outcomes-based quantitative assessment will utilize pre-test/post-test surveys to measure the

effectiveness of the ACP at improving participants' assessment knowledge and skills as well as

their application of assessment in their everyday professional practice. Therefore, the

quantitative approach will contribute critical information to evaluate the program's central goal of enhancing participants' assessment of student learning. The following sections outline the

population and sampling frame, research design and method, description of the survey

instrument, data analysis, and presentation of data for this portion of the evaluation.

Population and Sampling Frame

Our target population for this evaluation plan is ACP participants who begin the program

in 2017. These participants include faculty and staff as well as graduate students from Loyola

University Chicago and DePaul University. Since this population has not been previously

sampled, there is no precedent set for determining the likely percentage that will complete the

surveys. Due to the likely small final usable sample size, the limited capacity of the program staff, and the limited resources of the program, this evaluation plan will use non-probability census sampling. This sampling method will be most effective because we will ask every participant who registers in Winter/Spring 2017 to complete the pre-test. As such, every individual in the target population will have a non-zero chance of being selected (Gansemer-Topf & Wohlgemuth, 2009). Only individuals who completed the pre-test will be invited to take

the post-test.

The intended implementation of this evaluation plan is for future program registrants, so

it is impossible to know the size of our survey population. However, between January and

November 2016, 56 individuals registered for the ACP. From that total, 14 are faculty, 19 are

staff, and 22 are graduate students. Recognizing that it is unlikely all registrants would complete

the pre-test survey, and not all registrants will complete the certificate program and/or

subsequently complete the post-test survey, it is likely that there will be a small final usable sample. We recognize that the pre-test/post-test design will likely decrease our final usable sample size, as fewer participants will complete both tests, but we believe the data

collected will be more substantive by allowing us to measure actual change in skills and/or

knowledge from the time participants begin the program to the time they complete it.

Since culminating project workshops are offered two times per year and historically only

two or three people graduate each time, we will send post-test invitations to program alumni

from at least three graduation cycles (Appendix D). This will ensure that we obtain a large

enough usable sample of survey respondents. We recognize that waiting such a long time

between the pre- and post-tests may result in history or maturation effects (i.e., external events and participants' natural development occurring between the first and second measurement)

that could threaten the internal validity of the study, but we believe the data we collect over time

will be richer and potentially more useful. We will begin administering pre-tests in January 2017

and post-tests in November 2017, conducting an initial analysis of data in December 2017.

However, it is unlikely that many participants will start and complete the certificate program

during this period. As such, we will continue to administer pre- and post-test surveys until we

collect a sample size large enough to conduct a truly robust evaluation (N = 30 or higher).
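The time needed to reach that threshold can be sketched with simple accrual arithmetic. The figures below (average graduates per cycle and the share completing both surveys) are illustrative assumptions for the sketch, not program data:

```python
import math

# Illustrative assumptions, not program data:
graduates_per_cycle = 2.5   # roughly two or three graduates per culminating cycle
response_rate = 0.6         # assumed share completing both pre- and post-test
target_n = 30               # desired usable sample size

# Expected usable responses added per graduation cycle
usable_per_cycle = graduates_per_cycle * response_rate

# Number of cycles needed to accumulate the target sample
cycles_needed = math.ceil(target_n / usable_per_cycle)
```

Plugging in different response-rate assumptions shows how sensitive the data-collection horizon is to survey completion, which is one reason the plan ties the pre-test to registration.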

Research Design and Method



To evaluate the outcomes of the Assessment Certificate Program, the evaluators will

conduct a quantitative study with a pre-experimental pre-test/post-test design. In this type of

design, one group of individuals is tested before and after receiving a treatment to gauge whether

the treatment has any effect on participants' outcomes. One of the benefits of using a pre-test/post-test design is that the pre-test creates a baseline of participants' knowledge and skills

prior to the intervention. In addition, a pre-test/post-test format establishes a stability estimate of

reliability because the same instrument is administered to the same group after waiting a period

between administrations (Saunders & Cooper, 2009). This design is appropriate for our

evaluation because we want to understand how the same group of ACP participants changes over

time as a result of the program. In this case, the intervention is the ACP and the treatment group

is ACP participants.
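Once matched pre- and post-test scores exist, the change they capture could be examined with a paired-samples t-test on the difference scores. The sketch below uses only hypothetical data and the Python standard library; the plan's actual statistical analysis is specified separately:

```python
import math
import statistics

def paired_t(pre, post):
    """Paired-samples t statistic for matched pre/post scores.

    Each position in `pre` and `post` is the same participant measured twice.
    Returns the t statistic and its degrees of freedom (n - 1).
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)            # sample standard deviation of differences
    t = mean_d / (sd_d / math.sqrt(n))
    return t, n - 1

# Hypothetical 5-point confidence scores for six matched participants
pre  = [2, 3, 2, 1, 3, 2]
post = [4, 4, 3, 3, 5, 4]
t_stat, df = paired_t(pre, post)
```

A positive t statistic would indicate that post-test confidence exceeds pre-test confidence; with the small samples this plan anticipates, any such result should be interpreted cautiously.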

Since faculty and staff choose whether or not to participate in the program, it is not

feasible to randomly assign them to a treatment group or control group. Although our design

would be stronger if it had a control group against which to compare the treatment group, we lack the

time and resources necessary to include a control group. For example, it may be expensive to

incentivize non-ACP participants who have no personal interest in the program to take time to

complete our pre-tests and post-tests. To mitigate the lack of a control group in our study, we will

ask questions on the post-test that explicitly ask participants to consider how the ACP shaped

their assessment skills, knowledge, and confidence. By focusing participants attention on the

ACP's impact, we can feel more confident that their answers will highlight how the ACP, not other factors, affects participants' outcomes.

Participant Recruitment

To recruit participants for our evaluation plan, a pre-test survey (Appendix C) will be

administered as part of the ACP registration process. When prospective participants go to the

ACP website to register for the program, the bottom of the registration page will include a

paragraph (Appendix B) inviting them to take a short, 10-minute survey designed to help the

program coordinators improve the ACP. The website will explicitly state that the survey is confidential and voluntary and that respondents are free to stop at any time. It will also note that completion of the survey will not affect participants' acceptance into the program. A link will

then take the participant to a Google form where the survey will be located. By asking

participants to complete the pre-test survey when they register, we hope to obtain a larger sample than if we sent the survey to them at a later, less convenient time. We

will keep a list of individuals who complete the pre-test survey to ensure they receive the post-

test survey, should they complete the certificate.

The post-test survey (Appendix F) will be sent via email three months after participants

receive their certificate at a culminating workshop. Only individuals who completed the pre-test

will be invited to take the post-test. We will keep track of who finishes the program by asking

the ACP coordinators to email us a list of new alumni after each culminating workshop; then we

will create a Google Calendar reminder to contact those individuals three months later. The

invitation email for the post-test (Appendix D) will explain the purpose of the survey and include

a link to a Google form where the survey will be located. Alumni who do not respond after two

weeks will receive a second email inviting them to take the post-test (Appendix E). Faculty,

staff, and graduate students are generally busy people, but we believe that many individuals will

still feel invested in the program three months after completion and therefore be willing to

complete the post-test. Having recently completed an assessment program, they may be more

likely than the average person to understand the importance of participating in evaluations like

ours.

Survey Instrument

When participants register for the ACP, they will be presented with an optional 11-item

survey to gauge their confidence level with each aspect of assessment covered in the major

workshops that comprise the program (Appendix C). This pre-test survey will provide data for

the program coordinators at Loyola and DePaul to design individual workshop content in

accordance with the comfort levels of participants while also collecting baseline data to compare

with post-test results. The optional nature of the pre-test will be emphasized, as well as the fact

that non-completion of the survey component will have no impact on registration or completion

of the program.

The majority of questions on the pre-test will be structured using a five-point

Likert scale as follows: 1=Not at All Confident, 2=Not Confident, 3=Somewhat Confident,

4=Confident, 5=Very Confident. The pre-test will also include a question aimed at determining

a participant's prior experience with assessment-related college-level courses. The participant

will have the option to select the number of 3-credit courses from the following ranges: 0, 1-2, 3-

4, 5 or more. This data will inform both the quantitative results and the program coordinators of

participant experience level. Additionally, participants will be asked to state their employment

status (faculty/staff/student) and institutional affiliation (DePaul or Loyola), which will be

included in the statistical analysis.

The second part of our quantitative plan consists of a post-test survey that is emailed to

participants three months after the submission of their final project and completion of the

certificate (Appendix F). The first set of post-test survey questions will be nearly identical to the

pre-test questions to accurately measure the self-reported confidence level for each participant in

each assessment category before and after their experience with the program. A "Not

Applicable" option will be included only on the post-test, as participants may complete

the program without exploring every topic covered in the workshops.

The second question set in the post-test will differ from the pre-test, focusing on

participants' perceptions of program usefulness. A list of five common reasons for

joining the program will be presented (e.g., "to learn the basics of assessing student learning," "to

fine-tune existing assessment skills and knowledge"), and participants will be asked to rate the

helpfulness of the ACP in achieving each of those goals on a Likert scale ranging from

1=Unhelpful to 4=Helpful. We will also include a 0=Not Applicable option in case any of the

goals are irrelevant to them. Next, we will ask a Likert-scale question to assess participants'

knowledge of assessment-related resources available to them on their campus. This question is

designed to assess how well the ACP has helped participants to access assessment-related

resources on campus, which is one of the program's learning outcomes. Participants will be

asked how confident they are that they can find a person or resource on campus to help them

answer a question about assessment, on a scale ranging from 1=Not at All Confident to 4=Very

Confident. In addition, the survey will ask two Likert-scale questions asking participants about

their recent application of ACP knowledge and skills, as well as the likelihood that they will

incorporate knowledge gained in the ACP into their future professional practice. These data will

inform the program coordinators of overall participant satisfaction and program success.

As with the qualitative interview protocol, the wording of all questions on both

surveys will be developed in conjunction with the program's coordinators at each campus. The

language used for each question will not assume participants already have knowledge of

assessment or evaluation nomenclature. In addition, several faculty, staff, and students who

already completed the ACP will be invited to take a pilot test of both the pre- and post-tests to

provide feedback on the wording of survey questions. This feedback will be used to ensure that

all questions are clear and understandable to participants. Since we are not using a preexisting

instrument with established reliability and validity, pilot testing is essential for establishing the

surveys content validity (Creswell, 2009).

Statistical Analysis

For our analysis, descriptive statistics will be utilized to determine the means and

frequency distributions of participants' answers on the pre-test and post-test surveys. We are

primarily interested in the responses of individuals who complete both the pre-test and the post-

test, not just the pre-test. Participants who only complete the pre-test will not be included in the

analysis conducted at the end of 2017, though their pre-test surveys will remain in storage in the

event that they do ultimately take a post-test. There is still some usefulness in the data from the

participants who don't complete the post-test because their responses will help us determine

which assessment topics are most needed by incoming participants and allow the ACP

coordinators to modify workshop content accordingly.
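Although the analysis itself will be run in SPSS, the descriptive step can be illustrated with a brief Python sketch. The data and column names below are hypothetical, not drawn from the actual ACP instrument:

```python
# Hypothetical sketch: descriptive statistics for matched pre/post survey data.
# Column names (item_1_pre, item_1_post) and scores are illustrative only.
import pandas as pd

# One row per participant who completed both surveys; Likert scores 1-5.
matched = pd.DataFrame({
    "item_1_pre":  [2, 3, 1, 4, 2],
    "item_1_post": [4, 4, 3, 5, 4],
})

# Means for each item, and the frequency distribution of one item.
print(matched.mean())
print(matched["item_1_pre"].value_counts().sort_index())
```

The same summary is available in SPSS through its descriptive statistics procedures; the sketch only shows what "means and frequency distributions" refers to here.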

We will conduct a paired samples t-test for each test item that appears on both the pre-test

and post-test. In a paired t-test, there is one independent variable (i.e., the treatment) with two

levels (i.e., pre-test and post-test) and a continuous dependent variable (i.e., the mean score).

This type of statistical test is appropriate for our purposes because we want to compare the same

individuals' scores before and after a treatment. A paired samples t-test will allow us to

determine if there is a statistically significant difference between ACP participants' mean

score on an item before and after participating in the ACP. We hypothesize that participants'

mean post-test score on each question will be significantly higher than the corresponding mean

pre-test score.
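As a minimal sketch of this test (using illustrative scores rather than real ACP data, and Python in place of SPSS):

```python
# Hypothetical sketch of the paired samples t-test described above.
from scipy import stats

# Pre- and post-test scores on one item for the same five participants.
pre  = [2, 3, 1, 4, 2]
post = [4, 4, 3, 5, 4]

# Paired comparison: each participant serves as their own control.
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

A positive t statistic with p below the chosen significance level (e.g., .05) would support the hypothesis that post-test scores exceed pre-test scores on that item.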

In addition, demographic variables such as faculty/staff/student status, prior assessment

training, and university affiliation will be included in the analysis to assess whether the ACP is

more or less effective for different demographics of people. For example, we will conduct one-

way ANOVA tests to compare the mean difference in pre-test/post-test scores of faculty

participants, staff participants, and graduate student participants. This will allow us to determine

if a particular aspect of the ACP is significantly more effective in producing a change in a person

belonging to one group (e.g., faculty) versus another (e.g., staff). An ANOVA is an appropriate

statistical test because there is one nominal independent variable with two or more levels

(faculty, staff, students) and a continuous dependent variable (the pre/post difference score). We will repeat this

process with the prior assessment training variable. First, we will ask participants how many

assessment-related courses they have taken prior to the ACP, using the following ranges: 0, 1-2,

3-4, 5 or more. Each of these ranges represents a general level of prior assessment knowledge:

beginner, novice, intermediate, and advanced. Then, we will conduct one-way ANOVA tests to

compare the mean difference in pre-test/post-test scores of beginner, novice, intermediate, and

advanced participants. Doing so will illustrate which aspects of the ACP are most effective for

participants who enter with different assessment skill levels.
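The group comparison described above can be sketched as follows. The difference scores and group sizes are illustrative assumptions, and the computation stands in for the equivalent SPSS procedure:

```python
# Hypothetical sketch of the one-way ANOVA on pre/post difference scores,
# grouped by employment status. All data are illustrative.
from scipy import stats

# Post-minus-pre difference scores on one item, by group.
faculty  = [1, 2, 1, 2]
staff    = [2, 2, 3, 2]
students = [0, 1, 1, 0]

# Tests whether the mean difference score varies across the three groups.
f_stat, p_value = stats.f_oneway(faculty, staff, students)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```

A significant F statistic would indicate that at least one group changed more than the others, which could then be probed with post-hoc comparisons.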

Quantitative Data Presentation

The initial results of this assessment will be combined with the qualitative data, collated

into a report, and shared with Shannon Milligan at Loyola University Chicago and Jen Sweet,

Ellen Meents-Decaigny, and D. Scott Tharp at DePaul University after one year of the evaluation

process. Pre-test and post-test quantitative findings will be summarized in narrative form in

addition to the raw data. The data will be visually communicated through bar graphs. The report

will also include action steps for program improvement and advancement of the goals of the

ACP. As a supplement to the report, evaluators will share the findings using PowerPoint in a

joint meeting with Loyola and DePaul stakeholders.

Qualitative Approach

As previously stated, our evaluation plan includes both a quantitative and a qualitative

component. The outcomes-based quantitative component is intended to measure the overall effectiveness of the

ACP at improving participants assessment knowledge and skills. The process-based qualitative

component seeks to understand the effectiveness of specific elements of the workshops and the

overall program structure. In this evaluation plan, the quantitative component will be conducted

first. A pre-test will be included as an optional component of the registration process for the

ACP, and post-tests will be conducted three months after participants complete the program.

The qualitative component will occur second, as it will consist of individual interviews with

participants after they complete the certificate program. Despite differences in timing of the

administration of the quantitative and qualitative components, there is no difference in their

relative importance to the study. They are equally important to the completion of the evaluation

plan because they are designed to collect data on different aspects of the ACP.

The purpose of the qualitative assessment plan is to understand how the program's

structure and workshop content influenced participants' completion of the program and

subsequent application (or lack thereof) of knowledge gained through the ACP. The evaluators

will collect this process-based data through interviews with alumni of the program. Each

interview will consist of a series of open-ended questions about the content of the workshops as

well as the program's design. ACP alumni will receive the invitation to participate in the

interviews while completing the quantitative survey, which will be administered three months

after completing the program. Should they accept the invitation, the interview

will be scheduled at their earliest convenience. This time frame is intended to give the ACP alumni

time to utilize some of the knowledge gained through completion of the certificate program.

Methodological Strategy: Individual Interviews

Participants who complete the ACP in late 2017 and early 2018 will be given the

opportunity to participate in a voluntary one-on-one interview to further analyze their experience

with the program and identify specific elements that were beneficial and applicable

to professional practice. In comparison with other methods of qualitative data collection such as

focus groups, we believe the one-on-one nature of the interview will allow for an examination of

the individual experience of the participant and create a comfortable, confidential space for an

open dialogue around program improvement. Participants who choose to complete the

quantitative portion of the evaluation will be encouraged to participate in an interview with a

statement at the end of their quantitative survey indicating our interest in learning more about

their experience with the program.

Given that faculty and staff have busy schedules and program alumni are potentially

spread across at least four campuses, every effort will be made to accommodate the availability

of interview participants. Two members of the evaluation team will participate in each interview

in order to maintain a consistent data gathering process across all interviews. One evaluator will

facilitate the interview and one evaluator will manage the audio recording, if permitted by the

participant. Both evaluators will take notes throughout the process. Roles will rotate with each

interview. We anticipate that each interview will last approximately 45 minutes.



Each interview will begin with a request to record the session in audio format with the

assurance that all recordings will remain confidential and saved to a flash drive that will be

kept in a secure location on campus. If the participant declines, detailed notes taken

by the two evaluators will be used as the sole method of data capture. Once the signed consent

form has been obtained, the first interviewer will begin by thanking the individual for

participating in the process and reiterating the value of their feedback regarding their experience.

The second interviewer will then begin recording the interview if permitted by the participant,

and if not permitted, will begin taking detailed notes. The moderator will then ask the questions

sequentially as outlined in Appendix K, allowing for conversational flow and follow-up or

clarifying questions as needed throughout the process. At the conclusion of the written interview

questions, the moderator will give the participant the opportunity to express any further thoughts

or insights not drawn out by the interview process. Finally, the evaluators will express gratitude

for their participation and formally conclude the session.

Recruitment Strategies

Evaluators will attempt to recruit between five and ten participants for interviews using

the convenience sampling method, as we will be reliant on the availability of program

participants when data is collected. We will initially invite participants who complete the

program in late 2017 but may need to invite participants from later cohorts in order to reach our

minimum goal of five interviews. We are confident that this sampling method will establish both

sufficiency and saturation as outlined by Cooper (2009). Alumni of the

program will be invited to participate in the interviews via email (Appendix G) three months

after they complete the program. If they choose to participate, a confirmation email will be sent

with the details of the interview, including date, time, location, and the names of the individuals

conducting the interview (Appendix H). Compensation for participating in an interview will be a

$5.00 Starbucks gift card. If we are unable to meet the minimum threshold of

five participants, a follow-up email (Appendix I) will be sent emphasizing the importance of this

element of the evaluative process in the continual improvement of the ACP and once again

extending an invitation for an interview at a convenient date, time, and location.

Positionality Statement

The qualitative study will be conducted by all three evaluators. Because we will be

responsible for the facilitation, collection, and analysis of the data, it is important to share our

backgrounds and acknowledge our positionality when completing the evaluation. As the current

Graduate Intern for the Faculty Center for Ignatian Pedagogy (FCIP), Chelsea Metivier will

serve as one of two internal evaluators. In this role, Chelsea is primarily responsible for

coordinating the programmatic and logistical aspects of the ACP. Her understanding of the

program, her access to participants and course content, and her departmental knowledge are

critical to the successful creation and implementation of the evaluation plan.

Furthermore, because she has over one year of experience managing the program, she has name

recognition and, at minimum, a digital relationship with many of the participants. Her access and

previously established relationship with ACP alumni will likely aid the evaluators in establishing

the buy-in necessary for a faculty or staff member to complete the interview.

Ryan Crisp will serve as the second internal evaluator, because he is a participant in the

program and a full-time staff member at Loyola University Chicago. Ryan's participation in the

program provides the evaluation team a ground-level understanding of the experience of a

participant in the program. Furthermore, because he is a full-time staff member at the institution,

Ryan has the institutional credibility and status that will likely be helpful in accessing resources,

such as conference rooms for interviews and technology for recording the interviews.

The evaluation plan will also include an external evaluator, in recognition of the

possibility of bias that internal evaluators may have as a result of their closeness to the program.

Ariel Ropp will serve as the external evaluator because of her expertise in quantitative data

analysis and her previous experience facilitating evaluation plans. Furthermore, as a graduate

assistant at Loyola University Chicago, she has institutional knowledge but lacks previous

exposure or experience with the ACP.

The presence of internal and external evaluators is important to the overall success of the

evaluation. Internal evaluators typically have greater knowledge about the organization,

including insight into the organization's decision-making style. Additionally, they can

provide important insight and perspective on the history of the organization and program.

External evaluators are important because they can bring greater credibility and perceived

objectivity, and generally bring industry-specific expertise (Fitzpatrick, Sanders, & Worthen, 2010). Together,

the evaluators' expertise will ensure a successful implementation of the evaluation plan, while

the variety of their experiences and relative proximity to the program will minimize biases.

Instruments

The evaluators will send an email invitation to all ACP alumni three months after they

graduate from the program. The invitation will explain the purpose of the qualitative evaluation,

topics that will be covered in the interview, and details relating to time, date, and location

(Appendix G). Alumni who respond to the email affirmatively will receive a confirmation email

(Appendix H) with further details on the interview as well as an attached consent form to review

prior to their scheduled interview. The consent form further explains the purpose of the

interview as well as its risks, benefits, compensation, and confidentiality (Appendix J). The

interviewers will review the consent form with participants at the beginning of each interview.

For the actual interviews, the evaluators will carefully follow the interview protocol

stated in Appendix K. The interview protocol includes an introduction and seven open-ended

questions pertaining to participants' experiences in the ACP. Questions include "What are your

thoughts on the structure of the Assessment Certificate Program?" and "As a result of the

program, how did your understanding of best practices in assessment shift? What was your most

salient take-away from the program?" Some questions include probes to help participants

elaborate on their answers. The interview protocol also requires the evaluators to state a short

summary of the participants' answers to ensure that the evaluators have understood everything correctly.

Data Analysis Procedures

Prior to the first interview, the three evaluators will meet to discuss possible thematic

codes they expect to arise in the interviews, such as ACP structure flexibility, workshop topics,

and application of ACP skills/knowledge. These themes will be informed by the initial results of

the quantitative assessment. The evaluators will also be prepared to look for emerging themes

they did not initially anticipate. In addition, they will discuss how each evaluator will take turns

transcribing the interviews and what criteria they will use to ensure consistency of transcription.

For example, the evaluators may decide to leave out "ums" and pauses from the transcript to

save time and space.

For each interview, two evaluators will be present, with one evaluator asking questions

and both evaluators taking notes. These roles will rotate with each interview. After each

interview, the evaluator who asked the interview questions will transcribe the interview

recording and send the transcript to the other two evaluators. Then, all three evaluators will

individually look for thematic codes in the transcript and any personal notes before meeting to

compare codes. The individual coding process involves "organizing...material into chunks or

segments of text before bringing meaning to information" (Creswell, 2009, p. 186). At the first

group meeting, the evaluators will look for commonalities in their codes and create a Google

Document with their shared codes. After the second interview, the evaluators will look for codes

that emerged in the first interview as well as new codes specific to the second interview.

They will then meet to discuss the old and new codes. This process will be repeated for each

subsequent interview. Since the evaluators hope to conduct at least five interviews, they

anticipate having at least five post-interview coding meetings. They will then meet for a final meeting to

discuss all the interviews, cluster similar codes, and develop a list of 5-6 overarching themes that

emerged from the majority of the interviews.
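Once the evaluators agree on a shared code list, tallying how often each code appears across transcripts is straightforward. As a small sketch (the code labels below are hypothetical, not the actual ACP codes):

```python
# Hypothetical sketch: tallying how often each agreed-upon code was applied
# across interview transcripts. Code labels are illustrative only.
from collections import Counter

# Codes applied to each transcript during the group coding meetings.
coded_interviews = [
    ["program_flexibility", "workshop_topics", "applied_skills"],
    ["program_flexibility", "applied_skills"],
    ["workshop_topics", "scheduling_barriers", "applied_skills"],
]

# Flatten the per-transcript code lists and count each code.
frequency = Counter(code for codes in coded_interviews for code in codes)
for code, count in frequency.most_common():
    print(f"{code}: {count}")
```

A tally of this kind would feed directly into the list of codes and their relative frequency planned for the final report.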

Validity and Ethical Considerations

To ensure the results of the qualitative study are valid and ethical, the evaluators will take

several measures to protect participants' rights and reduce personal bias. As previously stated,

participants will be emailed an invitation and consent form prior to the interviews. The

invitation explains the purpose of the interview to give participants an accurate expectation of

what the interview process will be like (Appendix G). The consent form reiterates the purpose of

the interview and also elaborates on the risks, benefits, compensation, voluntariness, and

confidentiality of the interview (Appendix J). Then, at the beginning of each interview, the

evaluators will review the consent form with the participants and give them the opportunity to

ask questions before signing the form. This will allow participants to make an informed decision

and reduce the possibility of coercion. Another way the evaluators will protect participants'

rights is by presenting the evaluation results in aggregate form rather than at the individual level

as much as possible. However, in order to highlight the nuances of individuals' responses, the

evaluators will likely need to refer to some of the participants directly. In this case, the use of

pseudonyms and removal of personally identifiable information will help ensure that

participants' identities remain private.

Besides protecting participants' privacy, the evaluators will take steps to ensure the

evaluation results are accurate. By writing a positionality statement, they have acknowledged

their personal biases up front and made an effort to minimize them. Additionally, the evaluators

will engage in member checking to verify that their notes align with participants' intended

meanings. They will do this by summarizing the interviewees' responses at the end of each

interview and providing a space for the interviewee to clarify statements or contribute additional

comments. As for the coding process, we are confident that having all three evaluators code the

transcripts will reduce bias because each one of us brings a different perspective. Chelsea brings

an insider perspective, Ariel brings an outsider perspective, and Ryan brings a participant

perspective; together, we have a balanced view of the program. Thus, when all three of us agree

on a code, the likelihood of bias is lower and the likelihood of accuracy higher. In these ways,

the evaluation team is ensuring the validity of the research findings.

Qualitative Data Presentation

Qualitative findings will be organized around five or six overarching themes related to

the ACP structure, workshop topics, and application to practice. The evaluators will present the

results in aggregate form as much as possible and use pseudonyms to ensure that participants'

identities remain confidential. Detailed descriptions of themes and sub-themes will be

accompanied by a few direct quotations from participants to highlight key ideas (Creswell,

2009). A list of codes and their relative frequency will also be included in the report. Once the

analysis is complete, the evaluators will compile the qualitative results along with the initial

quantitative results and share the report with Shannon Milligan at Loyola University Chicago

and Jen Sweet, Ellen Meents-Decaigny, and D. Scott Tharp at DePaul University in 2018. The

report will also include recommendations to improve ACP workshop content and structure. The

evaluators will share these findings in a PowerPoint presentation in a meeting with the ACP

coordinators in 2018.

Timeline and Budget

We have devised a Gantt chart to illustrate the timeline for our overall evaluation plan

(Appendix L). As noted in the chart, the quantitative component will begin in January 2017,

while the qualitative component will occur between November 2017 and February 2018. Since

we will be evaluating more than one cohort of participants, we have illustrated the timeline of the

quantitative component implementation for two cohorts, A and B.

We have also included a budget to assist the ACP staff with determining the approximate

cost of evaluation plan implementation. The budget (Appendix M) outlines the projected costs

with the assumption that the implementation process will begin in January 2017. Any evaluation

plans outside this proposal would require additional financial resources. The budget centers

on the costs associated with in-person interviews and available resources. Please note that

the budget is rather small because most costs are currently covered under the

operating budgets of the respective departments at each institution. In addition, the evaluators

have volunteered to conduct the evaluation pro bono. Finally, the SPSS analysis software is

available for free use by staff, students, and faculty of both DePaul University and Loyola

University Chicago.

Limitations

We acknowledge that this evaluation plan is constrained by several methodological

limitations. The quantitative portion is limited by sample size, length of administration, and self-

reported data. Since fewer than 15 people typically graduate from the ACP in a given calendar

year, our sample size may be quite small for the initial round of quantitative surveys. It can be

difficult to detect statistically significant effects with small sample sizes, because small samples

provide limited statistical power. As such, we will continue to administer the

surveys until we reach a large enough sample (N=30) to be confident that our statistical analyses

are robust. Since our surveys will be administered to multiple cohorts over a large span of time,

it will be difficult to control for any changes that happen during that time. Additionally, since

program participants are allowed up to two years to finish their certificate, it is hard to know if

the differences on their pre- and post-tests are the result of the ACP or due to other factors, such

as participants' natural maturation or other external events that happen during their time in

the program (or during the three-month window before the post-test is administered). We

attempted to mitigate this limitation by crafting survey questions that explicitly ask respondents

to think about how their assessment skills and knowledge have changed as a result of

participation in the program. Of course, as with all self-reported data, we are not independently

verifying the things that participants tell us; we have to take their answers at face value.

Another limitation of this study is the limited budget and personnel available to conduct

the evaluation. The program does not have the funds to complete this evaluation project on its

own or to compensate the evaluators for our time, so we have volunteered to conduct the

evaluation on a pro bono basis. All three evaluators will code the qualitative interviews to

reduce the possibility of bias, but this process will be time-consuming. We will need to maintain

diligence and motivation as we interview at least five participants and transcribe and code their

interviews. During the coding process, we will actively hold each other accountable and check

our assumptions, with the goal of establishing validity and inter-rater reliability.

Next Steps

The data collected and analyzed as a result of this evaluation plan will provide the ACP

planning team with valuable information about the program's ability to achieve its broadly stated

goals. Specifically, the evaluation plan intends to measure the overall effectiveness of the

program at improving participants' knowledge and skills as they relate to assessing student

learning. Additionally, the plan is designed to assess the effectiveness of specific

elements of the workshops provided and of the overall program structure. Ideally, the

information will then be used to make any necessary changes to the program content and

structure.

The ACP planning team recognizes the importance of evaluation and wants to begin the

evaluation process as soon as possible. Therefore, this evaluation plan will be presented to the

ACP planning team at the end of December 2016 for their review and approval to begin

collecting data during the spring 2017 semester. As discussed throughout the plan, the relatively

few participants who complete the program each year will certainly impact the duration of the

evaluation process. However, should participation in the program increase over the next few

semesters, the ACP planning team will need to reevaluate their staffing and funding to determine

how to support an increased volume of survey data with their limited resources. Overall, the ACP planning

team looks forward to gaining valuable insight into the program's effectiveness and to better

understand what (if any) changes could be made to make this free professional development

opportunity even more impactful for its participants.



References

Bryson, J. M., & Patton, M. Q. (2010). Analyzing and engaging stakeholders. In J. S. Wholey, H.

P. Hatry, & K. E. Newcomer. (Eds.), Handbook of practical program evaluation (pp. 30-

54). San Francisco, CA: Jossey-Bass.

Cooper, R. M. (2009). Planning for and implementing data collection. In J. H. Schuh &

Associates, Assessment methods for student affairs (pp. 50-75). San Francisco, CA: Jossey-Bass.

Creswell, J. W. (2009). Research design: Qualitative, quantitative, and mixed methods

approaches. Thousand Oaks, CA: Sage.

Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2010). Program evaluation: Alternative

approaches and practical guidelines (4th ed.). Upper Saddle River, NJ: Pearson

Education, Inc.

Gansemer-Topf, A. M., & Wohlgemuth, D. R. (2009). Selecting, sampling, and soliciting

subjects. In J. H. Schuh (Ed.), Assessment methods for student affairs (pp. 77-105). San

Francisco, CA: Jossey-Bass.

Komives, S. R., Dugan, J. P., Owen, J. E., Wagner, W., Slack, C., & Associates. (2011).

Handbook for student leadership development. San Francisco, CA: Jossey-Bass.

Mission and Identity. (n.d.). Retrieved October 2, 2016, from Loyola University Chicago

website, http://www.luc.edu/mission/index.shtml

Office of Mission & Values. (n.d.). Retrieved October 2, 2016, from DePaul University website,

https://offices.depaul.edu/mission-and-values/about/Pages/MissionStatement.aspx

Program Overview. (n.d.). Retrieved October 2, 2016, from Assessment Certificate Program

website, http://acp.depaultla.org/

Appendix A

Logic Model for Assessment Certificate Program



Appendix B

Pre-test Website Invitation

Thank you for registering for the Assessment Certificate Program! Before you complete your
registration, we invite you to take a short survey to help us understand your knowledge of
assessment practices prior to entering this program.

This survey takes less than 10 minutes to complete. Please know that you may skip questions
or stop taking this survey at any time. Your responses are voluntary and will have no impact on
your acceptance into the program. Your responses will remain confidential but not anonymous.

Please click the following link to begin the survey: [Insert link to Google form]

Many thanks,

Assessment Certificate Program Staff



Appendix C

Pre-test Survey Instrument

Thank you for registering for the Assessment Certificate Program and for taking a few moments
to complete this survey. Your answers will assist us in understanding your knowledge of
assessment practices prior to entering this program.

Please know that you can stop taking this survey at any time. Your responses are voluntary and
will have no impact on your participation or completion of the Assessment Certificate Program.
Your responses are confidential, but not anonymous.

If you have any questions, please contact Shannon Milligan at smilligan@luc.edu or Jen Sweet at
jsweet2@depaul.edu.

Thank you again for your time.

Signature (please type your name): _____________________

On a scale of 1 to 5, please rate how confident you are in your ability to…

(1 = Not at All Confident, 2 = Not Confident, 3 = Somewhat Confident, 4 = Confident,
5 = Very Confident)

…develop an assessment plan for a particular assignment or program.
…define direct assessment.
…define indirect assessment.
…explain the cyclical nature of assessment processes.
…determine when you should use quantitative data.
…determine when you should use qualitative data.
…identify types of surveys used to assess student learning.
…write survey questions to assess student learning.

How many 3-credit university-level courses have you taken on assessment-related topics
prior to beginning this program?

(drop-down menu)

0
1-2
3-4
5 or more

Please indicate your primary employment status:

(drop-down menu)

Faculty
Staff
Graduate Student

Please indicate your institutional affiliation:


(drop-down menu)

Loyola University Chicago


DePaul University

Appendix D

Post-test Email Invitation

Dear [ACP ALUMNI FIRST NAME],

Congratulations on completing the Assessment Certificate Program! As an alum who finished the
program a few months ago, you are invited to participate in an evaluation survey that will help us assess how
well the program is meeting its goals. Your responses will provide information that we can use to
improve the program for current and future participants.

This survey takes approximately 10 minutes to complete. Please know that you may skip questions
or stop taking the survey at any time. Your responses are voluntary and will remain confidential but
not anonymous.

Please click the following link to begin the survey: [Insert link to Google form]

Thank you for your participation in the ACP and for your willingness to consider taking this survey.

Sincerely,

Assessment Certificate Program Staff



Appendix E

Post-test Email Invitation Reminder

Dear [ACP ALUMNI FIRST NAME],

We hope this email finds you well. As you may recall, the Assessment Certificate Program staff contacted
you a few weeks ago about taking a short survey on your experience in the ACP. We are interested to
know how your assessment knowledge and skills have changed as a result of participating in this
program. Your responses will provide information that we can use to improve the program for
current and future participants.

This survey takes approximately 10 minutes to complete. Please know that you may skip questions
or stop taking the survey at any time. Your responses are voluntary and will remain confidential but
not anonymous.

Please click the following link to begin the survey: [Insert link to Google form]

Many thanks,

Assessment Certificate Program Staff



Appendix F

Post-test Survey Instrument

Congratulations on completing the Assessment Certificate Program! We'd like to thank you for
taking a few moments to complete this survey. Your answers will assist us in understanding how
well the program is meeting its broadly stated goals.

Please know that this survey is voluntary and you can stop taking it at any time. Your responses
are confidential, but not anonymous.

If you have any questions, please contact Shannon Milligan at smilligan@luc.edu or Jen Sweet at
jsweet2@depaul.edu.

Thank you again for your time.

Signature (please type your name): _____________________

As a result of completing the Assessment Certificate Program, how confident are you in your
ability to…

(0 = Not Applicable, 1 = Not at All Confident, 2 = Not Confident, 3 = Somewhat Confident,
4 = Confident, 5 = Very Confident)

…develop an assessment plan for a particular assignment or program.
…define direct assessment.
…define indirect assessment.
…explain the cyclical nature of assessment processes.
…determine when you should use quantitative data.
…determine when you should use qualitative data.
…identify types of surveys used to assess student learning.
…write survey questions to assess student learning.

How helpful was the Assessment Certificate Program in assisting you to meet the following
goals?

(0 = Not Applicable, 1 = Unhelpful, 2 = Somewhat Unhelpful, 3 = Somewhat Helpful,
4 = Helpful)

Learning the basics of assessing student learning
Fine-tuning existing assessment skills and knowledge
Learning about on-campus resources related to assessment
Creating assessment plans for a program/class
Completing a portion of your professional development plan

If you have a question about assessment, how confident are you that you can find a person or
resource on campus to help answer it? (1 = Not at All Confident, 2 = Somewhat Confident,
3 = Confident, 4 = Very Confident)

1 2 3 4

In the past three months, how significant was the ACP to your everyday practice?
(1=Not at all Significant, 2=Somewhat Significant, 3=Significant, 4=Very Significant)

1 2 3 4

In the next academic year, how likely are you to incorporate the knowledge and skills you gained in the
ACP into your everyday practice? (1=Not at all likely, 2=Somewhat likely, 3=Likely, 4=Very
likely)

1 2 3 4
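As a point of reference for the quantitative analysis, matched pre- and post-test confidence ratings like those collected by the two instruments above are commonly compared with a paired t-test. The sketch below uses invented ratings and Python's standard library purely for illustration; the plan's actual analysis would be run in SPSS on real survey responses.

```python
import math
import statistics

# Hypothetical matched ratings for one confidence item (1-5 Likert scale).
# These values are invented for illustration only.
pre = [2, 3, 2, 1, 3, 2, 4, 2, 3, 2]   # pre-test confidence
post = [4, 4, 3, 3, 5, 4, 4, 3, 4, 4]  # post-test confidence, same respondents

# Paired t-test statistic: t = mean(d) / (sd(d) / sqrt(n)), where d = post - pre.
diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
t = statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))
print(f"t({n - 1}) = {t:.2f}")  # compare against a t table with n - 1 degrees of freedom
```

A statistically significant positive t value would indicate that reported confidence rose between the pre-test and post-test.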

Appendix G

Interview Invitation Email

Dear [ACP ALUMNI FIRST NAME],

As a graduate of the Assessment Certificate Program, you are invited to share your valuable feedback. Loyola's
Faculty Center for Ignatian Pedagogy and DePaul's Office of Teaching and Learning request your
participation in an evaluative interview. This interview is a chance for the ACP staff to gain a
greater understanding of the impact and success of the program. We request your participation
because the ACP staff care deeply about student learning and the student experience, and as such
want to help faculty and staff by improving the ACP. Your participation in the interview will help
staff improve the program structure and workshop content. The interview will cover such topics as:

Program structure
Breadth and depth of workshop content
Perceived competence of facilitators
Usefulness of the culminating project
Examples (if any) of your ability to implement knowledge gained from the ACP
Your overall experience completing the program

This interview will take approximately 45 minutes, and can be scheduled by responding to this
email. In an effort to best accommodate your schedule, please respond to this email with at least
three dates and times and the corresponding location that would work for you. Please note that
interviews can occur at DePaul's Loop and Lincoln Park campuses and at Loyola's Lake Shore and
Water Tower campuses. Once you send preferred meeting dates, times, and locations, you will
receive a confirmation email with additional details and a consent form to review before your
interview.

Thank you for your participation in the ACP and for your willingness to consider this important
opportunity. We hope you will consider our request to share your valuable insight and experience
with us to continually improve the Assessment Certificate Program.

Sincerely,

Assessment Certificate Program Staff



Appendix H

Interview Confirmation Email

Dear [ACP ALUMNI NAME],

Thank you for your willingness to share your valuable experience and insight with us. Per your
request, your interview is scheduled for [DATE] [TIME] at [CAMPUS] in [BUILDING AND
ROOM NUMBER]. You will meet with Chelsea Metivier, Ryan Crisp, and Ariel Ropp. If this time
no longer works for you or you need to change the location of your interview, please contact us as
soon as possible.

You do not need to prepare anything for this interview. However, it may be helpful to think about
your experience within the program as it relates to the topics listed below. The following topics will
be addressed during the interview:

Program structure
Breadth and depth of workshop content
Perceived competence of facilitators
Usefulness of the culminating project
Examples (if any) of your ability to implement knowledge gained from the ACP
Your overall experience completing the program

The interview is expected to take approximately 45 minutes, so please plan accordingly.

Thank you again for your willingness to share your experiences and participate in this interview. We
look forward to speaking with you and hearing your valuable insights into the program on [DATE]
[TIME] at [CAMPUS] in [BUILDING AND ROOM NUMBER].

Sincerely,
Assessment Certificate Program Staff

Appendix I

Interview Invitation Follow-up Email

Dear [ACP ALUMNI FIRST NAME],

We hope this email finds you well. Several weeks ago, the Assessment Certificate Program emailed
you about participating in an interview about your experience in the ACP. This interview is a
chance for the ACP staff to gain a greater understanding of the impact and success of the program.
We request your participation because the ACP staff care deeply about student learning and the
student experience, and want to help faculty and staff by improving the ACP. Your participation in
the interview will help staff improve the program structure and workshop content. The interview
will cover such topics as:

Program structure
Breadth and depth of workshop content
Perceived competence of facilitators
Usefulness of the culminating project
Examples (if any) of your ability to implement knowledge gained from the ACP
Your overall experience completing the program

This interview will take approximately 45 minutes, and can be scheduled by responding to this
email. In an effort to best accommodate your schedule, please respond to this email with at least
three dates and times and the corresponding location that would work for you. Please note that
interviews can occur at DePaul's Loop and Lincoln Park campuses and at Loyola's Lake Shore and
Water Tower campuses. Once you send preferred meeting dates, times, and locations, you will
receive a confirmation email with additional details and a consent form to review before your
interview.

Thank you for your participation in the ACP and for your willingness to consider this important
opportunity. We hope you will consider our request to share your valuable insight and experience
with us to continually improve the Assessment Certificate Program.

Sincerely,

Assessment Certificate Program Staff



Appendix J

Consent to Participate in Interview

Project Title: Assessment Certificate Program Evaluation

Researcher(s): Ryan Crisp, Chelsea Metivier, Ariel Ropp

Introduction:
You are invited to participate in an evaluation of the Assessment Certificate Program (ACP) being
conducted by Ryan Crisp, Chelsea Metivier, and Ariel Ropp on behalf of DePaul University and
Loyola University Chicago. You were asked to participate in an interview because you are an alum
of the program and have valuable knowledge of the program's structure and content. Please read
this form carefully and ask any questions you may have before deciding whether to participate in the
evaluation.

Purpose:
The purpose of this interview is to learn about your experience in the ACP. We are particularly
interested in your impressions of the program's structure, workshop content, and applicability to
practice.

Procedures:
If you agree to participate in the interview, you will be asked to answer questions related to your
experience in the ACP. The interview will be held at the campus of your choice and will last
approximately 45 minutes. Two evaluators will be present for the interview and will take notes
throughout. In addition, we request your permission to make an audio recording of the interview for
note-taking purposes.

Risks/Benefits:
There are minimal foreseeable risks involved in participating in this interview beyond those
experienced in everyday life. While every measure will be taken to ensure participant confidentiality
is maintained, there is a nominal risk of loss of confidentiality. Benefits of participating in this
interview may include reflecting on your experience in the ACP as well as contributing to the ACP's
effort to continually improve future programming.

Compensation:
No monetary compensation will be provided for participating in this interview. However, you will
receive a $5 Starbucks gift card as a thank you for participating.

Confidentiality:
Information gathered from the interview will remain confidential. Your name will not be shared
with others and will not be used in the final report. If we refer to your responses in the final report,
we will give you a pseudonym and remove any personally identifiable information. The audio
recording will be stored in a secure location to which only the researchers have access. The purpose
of the audio recording is to accurately cite any findings in the final written report.

Voluntary Participation:

Participation in this interview is voluntary. If you do not want to be interviewed, you do not have to
participate. Even if you decide to participate, you are free not to answer any question or to withdraw
from participation at any time without penalty.

Contacts and Questions:


If you have questions about this program evaluation, please contact Ryan Crisp at rcrisp@luc.edu,
Chelsea Metivier at cmetivier@luc.edu, or Ariel Ropp at aropp@luc.edu.

Statement of Consent:
Your signature below indicates that you have read the information provided above, have had an
opportunity to ask questions, and agree to participate in this interview. You will be given a copy of
this form to keep for your records.

_______________________________________ __________________
Participant's Signature Date

_______________________________________ __________________
Researcher's Signature Date

Appendix K

Interview Protocol

Thank you for agreeing to let us interview you as part of our evaluation of the Assessment
Certificate Program. Our evaluation as a whole is designed to assess how well the program is
helping faculty, staff, and graduate students improve their knowledge and application of assessment
practices. This interview will be primarily focused on the structure and content of the program, as
well as your application of program content to your professional practice. Please know that you may
skip any question that you do not want to answer. The whole interview should take approximately 45
minutes.

Before we get started, we would like to review the consent form that we emailed you on [date].
[Review consent form, answer questions, and ask for their signature, if they have not already signed
it]. We would also like to ask for your permission to record the audio of this interview. We promise
that this recording will remain confidential and will eventually be destroyed.

Let's get started.

1. Could you tell us about what made you decide to take part in the Assessment Certificate
Program?
a. What did you hope to get out of it?
b. To what extent did the program help you reach those goals?
c. Were workshops offered frequently and conveniently enough for you?
d. If you attended any workshops on a campus other than your home campus, can
you talk about your experience with this facet of the program?
e. What are your thoughts regarding a more linear format that potentially includes a
time limit for program completion?
f. What were the best or most useful parts of the program? What parts do you think
didn't work as well? Please elaborate.
g. What else would you change to improve the structure of the program?

2. What did you think of the workshop facilitators' knowledge and preparedness?


a. Did the facilitators provide examples related to your functional area?

3. Which workshop content did you find to be most applicable and why? Which workshop
content did you find to be least applicable and why?

4. Were there topics that you wish had been covered in the program? What other assessment
topics would you like to see offered in the future?

5. As a result of the program, has your understanding of best practices in assessment shifted?
If so, please explain. What was your most salient take-away from the program?

6. To what extent do you think the culminating project helped you apply what you learned in the
workshops to practice?

7. Please describe a scenario in which you successfully incorporated a lesson or lessons from
the ACP into your professional practice. How did you do this?

Thank you for taking the time to answer our questions -- we truly appreciate your participation in
this evaluation project. Would you mind if we gave you a quick summary of your responses to make
sure that we understood them correctly?

[Briefly summarize participants answers to each question and ask if they want to clarify anything].

Appendix L

Timeline

Appendix M

Budget

Quantitative Survey Implementation
  Pre-test and post-test surveys: $0 (delivered via Google Forms and email at no cost); Total: $0

SPSS Data Analysis
  Statistical software: $0 (software available to LUC and DePaul staff); Total: $0

Qualitative Interview Implementation
  Audio recording device: $0 (evaluators will use their own device or one rented from the university); Quantity: 1; Total: $0
  Starbucks gift cards (compensation to participants): $5.00 each; Quantity: 10; Total: $50.00
  Coffee/tea/water & cookies: $10.00 each; Quantity: 10; Total: $100.00
  Plates, cups, napkins: $2.00 each; Quantity: 10; Total: $20.00
  Consent forms: $0 (university paper and printer used); Quantity: 10; Total: $0
  Room reservations: $0; Quantity: 10; Total: $0

Total Cost: $170.00
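The total above is simply the sum of each line item's per-unit cost multiplied by its quantity. As an illustrative check of that arithmetic (item labels abbreviated from the table):

```python
# Budget line items as (item, cost per unit, quantity), taken from Appendix M.
line_items = [
    ("Pre-test and post-test surveys", 0.00, 1),
    ("SPSS statistical software", 0.00, 1),
    ("Audio recording device", 0.00, 1),
    ("Starbucks gift cards", 5.00, 10),
    ("Coffee/tea/water & cookies", 10.00, 10),
    ("Plates, cups, napkins", 2.00, 10),
    ("Consent forms", 0.00, 10),
    ("Room reservations", 0.00, 10),
]

# Total cost is the sum of cost-per-unit times quantity over all items.
total = sum(cost * qty for _, cost, qty in line_items)
print(f"Total cost: ${total:.2f}")  # Total cost: $170.00
```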
