Evaluation
Sasser, Pulley, & Ritter, with subsequent modifications
11/30/2015
In evaluating programs we're talking about evidence -- not proof. Build a case based on circumstantial
evidence. Consider: do you really need the information you're looking for? What communicates? What
creates a vivid picture?
How can you set up systems for continual monitoring?
How do you decide where to invest money for evaluation?
Why evaluate?
Proving (summative / judge the program)
-- it works
-- it makes a difference
Improving (formative / improve the program)
-- instructor development
-- materials, equipment, facilities
-- revision of content
-- varying strategies
Learning
-- advance organizer
-- reinforce learning
-- focus on learner outcomes
Linking -- training to rest of organization (politics)
-- part of management/strategic direction
-- assure transfer
-- open lines of communication
Kirkpatrick's Levels of Evaluation
Level One = reactions (reaction sheet)
Level Two = learning (knowledge and performance tests, observation, self-reports, simulations, work
sample analyses)
Level Three = behavior on the job (self, peer, and supervisor reports, case studies, surveys, site visits,
observation, work-sample analyses)
Level Four = results
-- productivity measures: output, loans approved, # incentive bonuses
-- quality measures: scrap level, errors, waste, rework, customer complaints
-- performance / behavior: absenteeism, tardiness, requests for transfer, turnover
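As a rough illustration of how such on-the-job and results measures might be tracked as evidence, the sketch below compares a few before/after figures. The measure names and numbers are invented for illustration, not taken from the material above.

    # Sketch (hypothetical data): comparing Level 3/4 measures before and
    # after a training program. Measure names and values are illustrative only.
    baseline = {"error_rate": 0.062, "rework_hours": 120.0, "absenteeism_pct": 4.1}
    post_training = {"error_rate": 0.048, "rework_hours": 95.0, "absenteeism_pct": 3.6}

    for measure, before in baseline.items():
        after = post_training[measure]
        change = after - before
        pct = (change / before) * 100 if before else float("nan")
        print(f"{measure}: {before} -> {after} ({pct:+.1f}%)")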
Shortcomings of Kirkpatrick's model: its definition of HRD covers training only, and it is entirely
outcome-oriented -- it perpetuates trial-and-error learning.
Six-Stage Model for Evaluating HRD (Brinkerhoff)
1. Goal setting
2. Program design
3. Program implementation
4. Immediate outcomes
5. Usage outcomes
6. Impacts and worth
Strengths of model:
identifies flawed operations, not just flawed courses
requires articulation of assumptions about why and how each HRD activity is supposed to work
emphasizes formative evaluation by incorporating it at each stage
Design
-- learning objectives
-- assessment
Development
-- delivery strategies
-- writing content (information/practice)
Evaluation
-- try out / revise
Implementation
-- provide to students

Nadler
Critical Events
1. Identify needs of the organization
2. Specify job performance
3. Identify learner needs
4. Determine objectives
5. Build curriculum
6. Select instructional strategies
7. Obtain instructional resources
8. Conduct training

Phillips
Results-Oriented HRD
1. Conduct a needs analysis and develop tentative objectives
2. Identify purposes of evaluation
3. Establish baseline data
...
13. Implement program
14. Collect data at proper stages
15. Analyze and interpret data
16. Make program adjustments
17. Calculate return on investment
18. Communicate program results
Knowles
Andragogy
Set climate
Establish structure for mutual planning
Diagnose needs (and interests) for learning
-- develop competency model
-- assess performance
-- assess needs
Formulate directions (objectives) for learning
Simplified version
Eyler's Work Tasks by Phase - the original version was designed to be used post-course. I've adapted it to
include the potential for use in pre-course development of an evaluation plan.
1. Course review/analysis
Why evaluate?
Proving (summative / judge the program)
-- it works
-- it makes a difference
Improving (formative / improve the program)
-- instructor development
-- materials, equipment, facilities
-- revision of content
-- varying strategies
Learning
-- advance organizer
-- reinforce learning
-- focus on learner outcomes
Worth evaluating?
Ethical (Wooten)
misrepresentation & collusion
misuse of data
manipulation & coercion
value & goal conflict
technical ineptness
evaluability (Russ-Eft)
CLAM:
Does it have clear, measurable objectives?
Does it have a logical plan to reach them?
Are the activities logical?
What's the management plan and the plan for data use?
2. Strategy definition
Strategic Considerations (Eyler)
need to know
resources
credibility
politics
intrusiveness
instructional reinforcement
identify data that's worth getting / that will tell the story clearly and vividly
The measurements the client first used to identify the problems (presenting
symptoms) will often be the measures used in tracking operational results
(Robinson)
looking for evidence - not proof
intended/unintended results (Eyler)
design - collection & analysis plan
Benefits and drawbacks of design options
Pre-experimental
-- one-shot: totally uncontrolled - no way to tell what influenced results
-- single group, pretest & post-test: effect of a pre-test
-- single group, time series: effect of external factors
Quasi-experimental
-- control group: mortality threat
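To make the drawback column concrete, here is a minimal sketch (with made-up scores) contrasting the raw gain a single-group pretest/post-test design would report with the gain adjusted by a control group, which is what the quasi-experimental design adds.

    # Sketch (made-up scores): single-group pretest/post-test vs. control-group design.
    # Without a control group, external factors are folded into the apparent gain.
    trained_pre, trained_post = [62, 58, 71, 65], [74, 70, 80, 73]
    control_pre, control_post = [60, 63, 69, 64], [64, 66, 72, 67]

    def mean(xs):
        return sum(xs) / len(xs)

    raw_gain = mean(trained_post) - mean(trained_pre)       # pre-experimental view
    control_gain = mean(control_post) - mean(control_pre)   # change from external factors
    adjusted = raw_gain - control_gain                       # gain attributable to the program

    print(f"raw gain: {raw_gain:.1f}, control drift: {control_gain:.1f}, adjusted gain: {adjusted:.1f}")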
3. Instrument development
Fitting Instrumentation to the Purpose
(matrix matching instrument type -- questionnaire, interview, observation, extant data -- against purpose:
needs assessment, implementation, Level 1, Level 2, Level 3, Level 4)
Extant data sources:
-- newsletters
-- financial statements
-- memos
-- absenteeism
-- exit interview data
-- customer complaints
-- training materials
-- work products
-- applications (e.g., Baldrige award)
Content Analysis:
Count (# incidents) vs. Classify (qualitative judgment)
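A minimal sketch of the count vs. classify distinction, using hypothetical complaint text and a crude keyword rule standing in for the qualitative judgment a human coder would apply:

    # Sketch (hypothetical complaints and categories): counting incidents vs.
    # classifying them into categories.
    complaints = [
        "Shipment arrived two weeks late",
        "Invoice total was wrong again",
        "Late delivery and damaged packaging",
    ]

    # Count: how many complaints mention lateness at all.
    late_count = sum("late" in c.lower() for c in complaints)

    # Classify: assign each complaint to a category (here a keyword rule;
    # in practice this is a human, qualitative judgment).
    def classify(text):
        t = text.lower()
        if "late" in t or "delivery" in t:
            return "delivery"
        if "invoice" in t or "wrong" in t:
            return "billing"
        return "other"

    categories = {c: classify(c) for c in complaints}
    print("late mentions:", late_count)
    print(categories)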
Establishing a baseline: use a management panel to estimate if necessary.
4. Implementation review
review implementation plan
identify potential problems in collection and analysis
develop solutions
finalize design
5. Data collection
notify participants of upcoming data collection activities
coordinate data collection (distribution & return of questionnaires, site visits for interviews,
observations, etc.)
6A. Data analysis
ROI = program benefits / program costs
Training investment analysis (Hassett)
1. Determine the information your organization needs
2. Use the simplest and least-expensive method possible for finding the information
you need
3. Perform the analysis as quickly as possible
4. Publish and circulate the results
Sources of Return on Investment (Hassett)
increased income (productivity, sales)
decreased losses (errors, waste)
retention (client and employee)
intangibles
if you don't have costs already assigned to these, use a management panel to assign them
ROI = program benefits / program costs
Use only when data is clear and definitive; arrive at a single number.
Cost / Benefit: Benefits - Costs = $ Saved
Less technical; easier to couch in qualified terms; provides evidence, not proof.
Typical program costs:
Analysis = 20%; Design & development = 70%; Evaluation = 10%
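A minimal sketch putting the ROI ratio, the cost/benefit difference, and the typical cost split together; the benefit and cost figures below are hypothetical.

    # Sketch (hypothetical figures): ROI ratio vs. simple cost/benefit difference,
    # using the handout's formula ROI = program benefits / program costs.
    program_benefits = 180_000.0   # e.g., estimated value of reduced errors and turnover
    program_costs = 120_000.0      # analysis + design & development + evaluation

    roi = program_benefits / program_costs           # use only with clear, definitive data
    net_savings = program_benefits - program_costs   # cost/benefit: easier to qualify as evidence

    # Typical cost split from the notes: analysis 20%, design & development 70%, evaluation 10%
    shares = {"analysis": 0.20, "design & development": 0.70, "evaluation": 0.10}
    cost_breakdown = {k: program_costs * share for k, share in shares.items()}

    print(f"ROI = {roi:.2f}, net savings = ${net_savings:,.0f}")
    print(cost_breakdown)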
6B. Presentation Plan
1) Identify the audience and 2) what they need/want to know:
-- Stockholders
-- Top Mgt
-- Managers
-- Participants
-- HRD
Responsive Evaluation
Identify decision-makers
[Figure: amount of attitude change in the recipient of information vs. degree of dissonance of the
information provided, plotted for a high-credibility source and a low-credibility source]
How do you alter values and beliefs? Create dissonance.