
Running head: PROGRAM EVALUATION PLAN

Program Evaluation Plan


Julie K. Marsh
College of William and Mary


Program Evaluation Plan


Program evaluation is an essential part of any project, and evaluation should be built in from the very beginning and carried through each step of implementation. Evaluative thinking
includes the planning, implementation, and (re)design of aspects of any program (Frechtling,
2007). Evaluation has many benefits, including finding answers to specific questions and
developing appropriate measures. This paper explores a mentoring program and the evaluation
of the program.
Context for the Program
The Educational Policy, Planning, and Leadership (EPPL) program is one of three
program areas in the School of Education (SOE) at the College of William and Mary. There are
five areas under EPPL, including Curriculum and Educational Technology, Curriculum
Leadership, Gifted Administration, Higher Education Administration, and K-12 Administration.
These five areas include both PhD and EdD programs while the Higher Education area also
offers an MEd program.
The first year of any doctoral program can feel overwhelming to new students as they
learn about the expectations of their new programs. Doctoral work can feel isolating, especially
in the first year when imposter syndrome can take hold early and often. Reflecting on my own
experiences in my first year of doctoral work, I chose to design and implement a mentoring
program focused on first year doctoral students (FYDS) in the EPPL program. I worked with
other students across the five EPPL areas, and together we decided it was important to focus on
FYDS, as opposed to also including the MEd students, because we found FYDS were the only
group without already-established support systems in the SOE. This focus was reinforced after
interviewing faculty members about students' needs.


I asked faculty members to recommend students to sit on a student-run committee. From those recommendations, eight students across all five EPPL areas chose to form a student committee. We co-wrote our vision and mission for the program, created a handbook for peer mentors and peer mentees, and designed a logo. We chose to name the
program the Peer-to-Peer mentoring program, or P2P. The purpose of the program is to provide
support, encouragement, and resources to FYDS in order to help our peers achieve academic,
social, and professional success.
Description of the Program
The logic model seen in Figure 1 and Appendix A follows the CIPP model created by
Stufflebeam (2002). CIPP stands for Context, Inputs, Process, and Product (Stufflebeam, 2002).

Figure 1. Marsh Logic Model for the P2P Program


Context. P2P serves FYDS across the five EPPL areas, and the program works to
connect current doctoral students with new doctoral students. We modeled the program after an existing program in the SOE, specifically the School Psychology and Counseling Education (SPACE) mentoring program. We use P2P to support our doctoral community and help students build
professional networks of support.
Inputs. There are a number of inputs for P2P, including current students' and FYDS' prior experiences and future goals. As a committee, we consider each individual when we match a peer mentor with a peer mentee. We try to match mentors and mentees who will complement one another and support the other's professional growth. It is also important for
current students who serve as peer mentors to feel invested in the program, so we check in with
them regularly throughout each semester to hear how things are working and reassess how we
can change elements of the program that are not working. It also helps to have students representing all five areas of EPPL on the student-run committee in order to reach out to all
participants and connect the five programs. Finally, we chose Dr. Tschannen-Moran as our
faculty advisor to ensure we have a connection with the faculty in case issues arise as well as to
have a person whose research directly connects to mentoring.
Process/Outputs. The P2P committee sends out a call for volunteers to be peer mentors
at the end of the spring semester. The committee also sends out communication to newly
admitted FYDS over the summer along with a link to the program's website, which includes
the vision and mission statements, handbook, resources, and contact information. The FYDS are
invited to participate in a doctoral-specific EPPL orientation run by students at the end of August
before the beginning of the fall semester. This is an opportunity for current doctoral students to
meet FYDS. The orientation takes place on the day before the SOE orientation and the SOE


picnic. All of these events allow FYDS to make connections with current students in order to
later request a specific person as a peer mentor. The committee, led by Dr. Tschannen-Moran,
also hosts a training session for all peer mentors at the beginning of the fall semester.
After the first full week of classes, the committee sends all peer mentors and peer
mentees an electronic form to fill out information about themselves. The information consists of
what EPPL area they are in, whether they are full- or part-time, how they prefer to socialize and
where, how they study, etc. The information on the form is then used by the P2P committee to
match peer mentors and peer mentees. Once the matches are made, each mentor is encouraged
to reach out to his or her peer mentee within a week and then regularly throughout the academic
year. Each pair will establish the terms of their relationship as to whether they will meet in
person and/or digitally. The peer mentors are also encouraged to regularly invite their peer
mentees to events offered through the Graduate Education Association, The William and Mary
Educational Review, and other organizations in the SOE and across campus.
Outcomes. The short-term outcomes focus on community building within EPPL and
providing opportunities for FYDS to have a successful first year in their doctoral programs. The
intermediate outcomes for P2P are to increase a sense of connection within the SOE, to provide opportunities for EPPL students, and to provide a support system that includes relationships with
students, faculty, and alumni. The long-term outcomes connect to the intermediate outcomes in
that hopefully the connection with the SOE continues after students finish their doctoral work
and will provide more connections between the SOE and alumni.
Evaluation Questions
The evaluation questions tied to the logic model can be seen below and are meant to
evaluate the summative aspects of the P2P program, specifically the outcomes.


1. How much buy-in do we need to create among current students, especially when inviting
them to volunteer as mentors?
Rationale: It is necessary to know how to tap into current students' interest levels and their ability to commit to a program.
2. How much faculty support do we need to be successful?
Rationale: It is necessary to create faculty buy-in as well, so we needed to know whether the enthusiasm and support existed in the first place and, if so, how much.
3. To what degree does contacting incoming EPPL students help create excitement and buy-in
for participation?
Rationale: We need to know when contacting EPPL students will be most effective; the SPACE program currently contacts its students after they begin classes, so we thought we would set ourselves up for success by contacting the FYDS earlier and at multiple points.
4. To what degree are expectations being met as set in the orientation and training meetings?
Rationale: It is important to regularly evaluate how the program is working, so the committee assesses the mentor/mentee relationships stemming from the expectation-setting meetings throughout the year by using anonymous surveys and regular conversations between committee members and peer mentors/mentees; this will help show whether expectations are being met and, if not, how to make changes.
5. To what degree are expectations being met successfully?
Rationale: It is important to regularly evaluate how the program is working, so the committee assesses the mentor/mentee relationships throughout the year by using anonymous surveys and regular conversations between committee members and peer mentors/mentees; this will help show whether expectations are being met and, if not, how to make changes.
6. To what degree are mentors/mentees matched successfully stemming from the digital form?
Rationale: The committee needs to see whether the formal vs. organic matching process worked and, if not, how to make changes moving forward.


Potential Audience
The potential audience for the program evaluation of P2P will include the committee
members and our faculty advisor as well as all peer mentors and peer mentees. The results will
be useful to the committee in order to help us assess what is and is not working with the program
and make adjustments to what is not working to make the program more efficient and useful for
future participants. The results will be useful to our faculty advisor to continue to increase
faculty buy-in and encourage other faculty members to suggest their advisees join the program.
Finally, the results will be useful to the peer mentors and peer mentees to continue to increase
buy-in and show them the program is working, hopefully resulting in their suggesting to their
peers to join the program as well.
Investigator
The people responsible for conducting the evaluation will most likely be the P2P
committee members since they are directly in charge of running the program. Most of the
committee members have been with the P2P program since its inception and have institutional
knowledge that will help guide the evaluation process. There is potential for bias among the committee members; however, since the committee is composed of students from all five EPPL areas, they will likely counterbalance one another so as not to favor any particular EPPL group or specific students.
Collection of Data
Major activities. The three major activities involved in evaluating the P2P program are
an anonymous survey, interviews with participants, and observations of the peer mentor/peer
mentee relationships.


Timeline. Ideally, we will be able to evaluate the P2P program during the middle and the
end of each academic year. We have already matched our peer mentors and peer mentees for the
2015-2016 academic year, so the first part of our evaluation will be to send an anonymous
survey by the beginning of the spring semester (late January 2016). Following the survey, we
will interview volunteer participants from the peer mentor group and the peer mentee group as
well as faculty who are interested. We will also conduct observations at SOE events and
individual and group meetings of peer mentors and peer mentees. The interviews and
observations will take place in February and March of 2016. Finally, we will send a second
anonymous survey at the end of the academic year (early May 2016).
Involvement. The people involved in evaluating the program will come directly from the
student-run committee. No one student runs the committee, so I can ask for volunteers from
each EPPL area to help in the evaluation process. This will help balance the workload for all
evaluators as well as increase our reliability through inter- and intra-rater reliability.
Resources. The main resource the P2P evaluation will require is the evaluators' time. It will take time to create the surveys to be sent in January and May 2016, and it will take time to conduct interviews and observations. The surveys will be created using free, electronic forms available through the SOE.
Analysis chart. The Information Collection and Analysis Chart for the evaluation of the
P2P program is located in Appendix B and explained further in the following sections.
Rationale. I chose a survey, interviews, and observations for a variety of reasons. The survey gives participants anonymity and allows them to answer questions as openly as possible. We can also ask questions in the survey, such as those about the matching process, that we would not need to ask in the interviews. The interviews will offer another way to look at the overall program and


how forming and maintaining the mentoring relationships works. Finally, the observations will
give us another view of how the mentoring relationships work in a more organic way.
Data sources. The first data source we will use, based on our timeline, is the survey to be sent in January 2016. We will use the survey qualitatively to better understand the perspectives of our participants and their experiences with the program. Sample questions will
include: 1) Does/did the P2P program run as you expected? Why or why not? 2) What are the
strengths of the P2P program? 3) What areas of the P2P program need improvement? 4) How
can the P2P program further assist current students and FYDS? We will also use the same
survey at the end of the academic year, in May 2016, in order to evaluate changes in
perspectives. We also plan to use interviews and observations qualitatively in order to understand the successes and limitations of the program as well as participants' perspectives and experiences.
Triangulation. A combination of data using surveys, interviews, and observations will
help uncover and confirm meaning that emerges across all participants (Merriam, 2009).
Triangulation will be used across all data collected and analyzed to confirm themes and findings
(Stake, 2006).
Procedures. With regard to propriety, utility, feasibility, and accuracy (PUFA), the P2P evaluators will follow specific standards. First, we will clarify and follow our set timeline to ensure the evaluation process is feasible. We are all students first, so we must make sure the process does not overwhelm the evaluators. Second, we will request a Co-Principal Investigator (Co-PI) from among the faculty for our study prior to applying for Institutional Review Board (IRB) approval. Having a Co-PI will provide a more objective perspective as we move through our evaluation process and study. IRB approval will help ensure we


are being ethical in our treatment of our participants, ensuring propriety standards. Finally, we
will ensure accuracy through valid and reliable measures as well as triangulation of our data.
Analyzing the data. The P2P evaluators will collect data from the surveys and use
narrative analysis to code and interpret participant responses. We will also transcribe and code
all interviews and observations using interview protocols and observation guides approved
through IRB. Once we code all interviews and observations, we will look for patterns and
themes before member checking with our participants.
Conclusion
The evaluation set forth in this plan is feasible. I have suggested a timeline that spans an entire academic semester and uses free, available tools for our
procedures. I will also ask for volunteers from the student-run committee, ensuring the workload
of evaluation is balanced among multiple people. The suggested evaluation also follows
standards for propriety in that we will involve a Co-PI and be approved through IRB to ensure
ethical standards and procedures for all evaluators and participants. The evaluation also ensures accuracy through multiple data sources, inter- and intra-rater reliability, and triangulation across those sources. Finally, the results should provide utility to the P2P program in order to
evaluate what is and is not working with the program and help us make changes for the future.
As part of our IRB protocol, the results of the evaluation will be communicated to all peer mentors, peer mentees, committee members, and our faculty advisor. The evaluators will write a
report based on our findings and share the report via email. Email was chosen because it is free
and efficient, and we will write a report in order to create transparency within our program. It
also will serve as a written record of what needs to be changed and increase accountability for
the program.


We as leaders need to anticipate that the program may not be working as well as we believe or hope. The evaluation of the P2P program may reveal serious limitations, and we may not have the resources to address those limitations as a group of students. If major weaknesses are revealed, we hope to work with the faculty and administration within the SOE to address the problems and meet students' needs since, ultimately, the program exists to serve the students.

References

Frechtling, J. A. (2007). Logic modeling methods in program evaluation (Vol. 5). San Francisco, CA: Jossey-Bass.
Merriam, S. (2009). Qualitative research: A guide to design and implementation. San Francisco,
CA: Jossey-Bass.
Stake, R.E. (2006). Multiple case study analysis. New York, NY: The Guilford Press.
Stufflebeam, D. (2002). The CIPP model for evaluation. In Evaluation models (pp. 279-317). Boston, MA: Kluwer Academic Publishers.

Appendix A

Updated Logic Model for P2P Program

CONTEXT
- EPPL program areas: Curriculum Leadership, Gifted Administration, Curriculum and Educational Technology, Higher Education, K-12 Administration
- First-year doctoral students (FYDS)
- Connect current and new students
- Mirroring the current SPACE mentoring program

INPUTS
- Faculty support (Dr. Tschannen-Moran)
- Student-led committee for organizing the mentor program (students from across the five programs will participate)
- Current students' buy-in for the program (need volunteers for mentor roles)
- Students' personal and professional goals (both mentor and mentee)
- Students' prior experiences
- Students' specific degree programs (both mentor and mentee)

PROCESS/OUTPUTS
- List of incoming EPPL students: contact over the summer and at the beginning of the semester
- Offer orientation through the EPPL program, open to all new EPPL doctoral students, led by current EPPL doctoral students
- Give access to the P2P website with current information
- Match by form responses (e.g., how students like to study, socialize, etc.)
- Encourage mentors/mentees to attend student organizations' events
- Connections with other student organizations' events to encourage mentors/mentees to attend and network

OUTCOMES
Short-term:
- Strengthen the first-year experience for FYDS
- Part-time students will have more connection with the SOE community
Intermediate:
- Increase sense of connection and community in the SOE
- Personal and academic opportunities
- Research collaborations between students, faculty, and alumni
Long-term:
- On-going connection with the SOE
- Future connections with alumni

Appendix B

Figure 2.2 Evaluation Information Collection and Analysis Worksheet

Evaluation Question 1: How much buy-in do we need to create among current students, especially when inviting them to volunteer as mentors?
- Why the question is important: It is necessary to know how to tap into current students' interest levels and their ability to commit to a program.
- Information needed: Anonymous survey; interview questions tied to current students' participation.
- When and how collected: Survey in January 2016; interviews in February/March 2016.
- Data analysis and interpretation: Qualitative analysis of survey responses; coding of interviews, looking for patterns and themes; member checking with participants.

Evaluation Question 2: How much faculty support do we need to be successful?
- Why the question is important: It is necessary to create faculty buy-in as well, so we needed to know whether the enthusiasm and support existed in the first place and, if so, how much.
- Information needed: Anonymous survey; interview questions tied to currently involved faculty.
- When and how collected: Survey in January 2016; interviews in February/March 2016.
- Data analysis and interpretation: Qualitative analysis of survey responses; coding of interviews, looking for patterns and themes; member checking with participants.

Evaluation Question 3: To what degree does contacting incoming EPPL students help create excitement and buy-in for participation?
- Why the question is important: We need to know when contacting EPPL students will be most effective; the SPACE program currently contacts its students after they begin classes, so we thought we would set ourselves up for success by contacting the FYDS earlier and at multiple points.
- Information needed: Interview questions tied to current students' and FYDS' participation.
- When and how collected: Interviews in February/March 2016.
- Data analysis and interpretation: Coding of interviews, looking for patterns and themes; member checking with participants.

Evaluation Question 4: To what degree are expectations being met as set in the orientation and training meetings?
- Why the question is important: It is important to regularly evaluate how the program is working, so the committee intends to assess the mentor/mentee relationships stemming from the expectation-setting meetings throughout the year by using anonymous surveys and regular conversations between committee members and peer mentors/mentees; this will help show whether expectations are being met and, if not, how to make changes.
- Information needed: Anonymous survey; interview questions for both peer mentors and peer mentees; observations of peer mentor/mentee relationships.
- When and how collected: Survey in January 2016; interviews and observations in February/March 2016.
- Data analysis and interpretation: Qualitative analysis of survey responses; coding of interviews and observations, looking for patterns and themes; member checking with participants.

Evaluation Question 5: To what degree are expectations being met successfully?
- Why the question is important: It is important to regularly evaluate how the program is working, so the committee intends to assess the mentor/mentee relationships throughout the year by using anonymous surveys and regular conversations between committee members and peer mentors/mentees; this will help show whether expectations are being met and, if not, how to make changes.
- Information needed: Anonymous survey; interview questions for both peer mentors and peer mentees; observations of peer mentor/mentee relationships.
- When and how collected: Survey in January 2016; interviews and observations in February/March 2016.
- Data analysis and interpretation: Qualitative analysis of survey responses; coding of interviews and observations, looking for patterns and themes; member checking with participants.

Evaluation Question 6: To what degree are mentors/mentees matched successfully stemming from the digital form?
- Why the question is important: The committee needs to see whether the formal vs. organic matching process worked and, if not, how to make changes moving forward.
- Information needed: Anonymous survey; interview questions tied to current students' participation.
- When and how collected: Survey in January 2016; interviews in February/March 2016.
- Data analysis and interpretation: Qualitative analysis of survey responses; coding of interviews, looking for patterns and themes; member checking with participants.

Copyright 2006 by Corwin Press. Reprinted from Evaluating School Programs: An Educator's Guide (3rd ed.), by James R. Sanders and Carolyn D. Sullins. Thousand Oaks, CA: Corwin Press, www.corwinpress.com. Reproduction authorized only for the local school site or nonprofit organization that has purchased this book.