Practice: Evaluate implementation to document what you are doing

Key Action: Document implementation based on your logic model

TOOL: Questions, Methods, and Indicators for Implementation Evaluation

Purpose: The following table can help you and your evaluator determine appropriate evaluation questions, methods for answering those questions, and some good indicators of the extent and quality of implementation.

Instructions: 1. Review the sample evaluation questions and consider the value of the data collection method used to answer these questions.

2. Review the sample indicators and generate ideas for other indicators you might use to document your magnet program's implementation.


Questions, Methods, and Indicators for Implementation Evaluation


EVALUATION RESEARCH QUESTION: To what extent was the magnet program implemented as designed?
METHODS AND VALUE OF EVALUATION: Compare the amount and range of activities done in the magnet program with those prescribed by the program developers. The comparison gives an indication of the fidelity of the implementation to the planned program and the frequency and intensity of the magnet program activities.
SAMPLE INDICATORS:
- Magnet program implementation plans
- School staff interviews
- School-site observations
- Review of lesson plans and curriculum documents
IDEAS FOR YOUR INDICATORS:


EVALUATION RESEARCH QUESTION: To what degree has the magnet program helped reduce minority group isolation?
METHODS AND VALUE OF EVALUATION: Compare student demographics before the magnet program is implemented to those after program implementation for any indication of changes in racial and ethnic composition.
SAMPLE INDICATORS:
- Outreach and recruitment logs (recruitment fairs and parent nights)
- Annual analysis of student application database and class rosters
IDEAS FOR YOUR INDICATORS:


EVALUATION RESEARCH QUESTION: What adaptations, additions, and omissions were made when the magnet program was implemented?
METHODS AND VALUE OF EVALUATION: Record, describe, and count any modifications, since adaptations, additions, and omissions affect analyses of outcome evaluation data.
SAMPLE INDICATORS:
- School-sponsored activities logs
- Stakeholder surveys
- Teacher interviews
IDEAS FOR YOUR INDICATORS:


EVALUATION RESEARCH QUESTION: To what extent was staff trained to implement the magnet curriculum at a professional level?
METHODS AND VALUE OF EVALUATION: Compare the current professional development activities against the standards for optimal training as planned by the program developers. This gives an indication of the potential strength and weakness of the magnet program.
SAMPLE INDICATORS:
- Professional development schedules, activity logs, and feedback forms
- Classroom observation visits
- Staff surveys in the spring
IDEAS FOR YOUR INDICATORS:


EVALUATION RESEARCH QUESTION: To what extent are stakeholders (parents, students, staff, community members) informed and knowledgeable about the magnet program?
METHODS AND VALUE OF EVALUATION: Maintain records of meetings and presentations to stakeholders, as well as their questionnaire responses, in order to discover the range of their knowledge.
SAMPLE INDICATORS:
- Stakeholder surveys
- Interviews/focus groups with stakeholders
- Agendas and logs of events where stakeholders were present
IDEAS FOR YOUR INDICATORS:

Adapted from U.S. Department of Education, Office of Safe and Drug-Free Schools. (2007). Mobilizing for evidence-based character
education (p. 18). Washington, DC: Author. The entire guide can be downloaded at www.ed.gov/programs/charactered/mobilizing.pdf (last
accessed December 10, 2008).
