

Choosing Assessment Methods
Gloria Rogers

Copyright 2010 by ABET, Inc.


We will begin promptly at 2:00 EST

Overview
- Where does assessment fit into a Continuous Improvement process?
- What are direct and indirect measures?
- Review of assessment methods
- Making the right choice of methods
- Assessment method truisms
- How does assessment differ as you move from program educational objectives to learning outcomes?

[Diagram: the continuous improvement cycle. Program Mission → Program Educational Objectives → Program (Student) Outcomes → Performance Indicators → Educational Practices/Strategies → Assess/Evaluate Performance → Feedback for Continuous Improvement, with constituents/stakeholders informing each stage. Assessment: collection and analysis of evidence. Evaluation: interpretation of evidence.]
(Assessment for Continuous Improvement, Gloria Rogers, ABET, Inc.)

Program Educational Objective → Student Outcomes → Performance Indicators

Program Educational Objective:
Graduates will be effective life-long learners, including demonstrating professional and ethical responsibilities.

Student Outcomes (students will demonstrate):
- Appreciation for and ability to pursue life-long learning
- Understanding of professional ethical responsibilities

(G. Rogers, ABET, Inc.)

Program Educational Objective → Student Outcome → Performance Indicators

Program Educational Objective:
Graduates will be effective life-long learners, including demonstrating professional and ethical responsibilities.

Student Outcome:
Understand ethical responsibilities

Performance Indicators:
1) Demonstrate knowledge of the professional code of ethics.
2) Evaluate the ethical dimensions of a problem in the discipline.

(G. Rogers, ABET, Inc.)
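The example above is essentially a small hierarchy: one objective, an outcome that makes it observable, and indicators that make it measurable. A minimal sketch of that structure in Python; the field names and the roll-up loop are illustrative assumptions, while the statements themselves come from the slide:

```python
# Hypothetical encoding of the objective -> outcome -> indicator hierarchy;
# only the statements are from the slide, the structure is illustrative.
program_objective = {
    "statement": ("Graduates will be effective life-long learners, including "
                  "demonstrating professional and ethical responsibilities."),
    "outcomes": [
        {
            "statement": "Understand ethical responsibilities",
            "performance_indicators": [
                "Demonstrate knowledge of the professional code of ethics.",
                "Evaluate the ethical dimensions of a problem in the discipline.",
            ],
        },
    ],
}

# The indicator is the measurable unit: methods are chosen per indicator,
# then results are rolled up to report on the outcome and objective.
for outcome in program_objective["outcomes"]:
    for indicator in outcome["performance_indicators"]:
        print(f"{outcome['statement']} -> {indicator}")
```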

Direct Measures
Direct measures provide for the direct examination or observation of student knowledge or skills against measurable learning outcomes.


Indirect Measures
Indirect measures are those that ascertain the opinion or self-report of the extent or value of learning experiences.


Assessment Methods
- Written surveys and questionnaires
- Exit and other interviews
- Standardized exams
- Locally developed exams
- Focus groups
- Archival records
- Portfolios
- Simulations
- Performance appraisal
- External examiner
- Oral exams
- Behavioral observations

Written Surveys and Questionnaires
- Asking individuals to share their perceptions about the program (e.g., their own or others' skills/attitudes/behavior, or program/course qualities and attributes)
  - Most common indirect measure
  - Usually locally developed, but some national surveys allow for comparisons (e.g., National Survey of Student Engagement, Educational Benchmarking, Inc.)


Exit and Other Interviews
- Asking individuals to share their perceptions about the program (e.g., their own skills/attitudes, skills and attitudes of others, or program qualities) in a face-to-face dialog with an interviewer
  - Generally an indirect measure
  - Interviews could be crafted to include elements of direct measures


Standardized Exams
- Subject-specific examinations, generally group-administered, mostly multiple-choice, objective tests, usually purchased from a private vendor
  - Direct measure of student learning
  - Provide the ability to make comparisons with other programs
  - Need to be confident that the exam is relevant to the program for which it is used

Locally Developed Exams
- Objective (true-false, fill-in-the-blank, matching, and multiple-choice questions) and/or subjective (open-ended questions that require students to write) tests designed by program faculty
  - Most common at the classroom level
  - Direct measure of student learning
  - Can be specific to performance indicators for the learning outcomes
  - Can be difficult to get faculty agreement on questions related to outcomes


Focus Groups
- Group discussions conducted by a trained moderator with participants to identify trends/patterns in perceptions
  - Indirect method that can provide valuable information about student perceptions and experiences
  - Can be used to provide insights about student responses on other assessments
  - Results cannot be generalized to the entire cohort


Archival Records
- Biographical, academic, or other file data available from the college or other agencies and institutions
  - Identify data already available (data audit)
  - Build upon data collection efforts that have already occurred
  - Constitutes non-intrusive measurement, not requiring additional time or effort from students or other groups

Portfolios
- Collections of student work that are archived and rated for level of attainment using scoring rubrics. The design of a portfolio depends on how the scoring results are going to be used.
  - Direct measure of student learning
  - Possible to measure more than one learning outcome at a time (e.g., writing and use of technology)
  - Course management systems often support portfolio development


Simulations (Competency-Based Measure)
- A person's abilities are measured in a situation that approximates a real-world setting
  - Direct measure of student learning
  - Need well-defined outcomes with appropriate tasks
  - Can be designed for individuals and groups of students


Performance Appraisals
- Systematic measurement of the demonstration of acquired skills through direct observation
  - Provides a direct measure of students' abilities to apply what has been learned
  - Internships and co-op experiences provide a good setting for data collection
  - Needs a focused data collection process:
    - Raters who are in a position to make judgments
    - A well-constructed instrument for data collection


External Examiner
- Using an expert in the field from outside the program (usually from a similar program at another institution) to conduct, evaluate, or supplement assessment of your students
  - Generally a direct measure of student learning (if examiners assess against specific competencies)
  - Outsiders can see attributes to which insiders have grown accustomed
  - Evaluators may have skills, knowledge, or resources not otherwise available


Oral Exams
- An assessment of student knowledge levels through a face-to-face dialogue between the student and an examiner, usually a faculty member
  - Direct measure of student learning
  - Content and style can be geared to specific learning outcomes and characteristics of the program, curriculum, etc.
  - May not be allowed by institutions that have concerns about pressure on students


Behavioral Observations
- Measuring the frequency, duration, relationships, etc. of student actions, usually in a natural setting with non-interactive methods (e.g., formal or informal observations in a classroom)
  - Direct measure of student behavior
  - Observations are most often made by an individual and can be augmented by audio or videotape
  - Requires experienced observers


Direct                        Indirect
Standardized exams            Written surveys and questionnaires
Locally developed exams       Exit and other interviews
Portfolios                    Archival records
Simulations                   Focus groups
Performance appraisal
External examiner
Oral exams
Behavioral observations


Validity
- Relevance: the assessment option measures the educational outcome as directly as possible
- Accuracy: the option measures the educational outcome as correctly as possible
- Utility: the option provides formative and summative results with clear implications for educational program evaluation and improvement


Bottom Lines
- All assessment options have advantages and disadvantages
- The ideal methods are those that best fit program needs, satisfactory validity, and affordability (time, effort, and money)
- Crucial to use a multi-method/multi-source approach to maximize validity and reduce the bias of any one approach (see the sketch after the triangulation slide below)

TRIANGULATION*

[Diagram: several assessment methods, portfolios among them, converging from different directions on the "truth" about student learning.]

*Joseph Hoey, Savannah College of Art and Design
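To make the multi-method/multi-source idea concrete, here is a minimal sketch of rolling up several measures for one outcome. The method names, the scores, and the 0-1 scale are hypothetical illustrations, not data from the webinar:

```python
# Hypothetical triangulation roll-up: three measures of one learning outcome.
from statistics import mean

# Scores normalized to a 0-1 attainment scale; each method is one line of evidence.
measures = {
    "locally_developed_exam": [0.72, 0.65, 0.80, 0.58],  # direct
    "portfolio_rubric":       [0.70, 0.61, 0.77, 0.66],  # direct
    "senior_exit_survey":     [0.85, 0.90, 0.78, 0.88],  # indirect (self-report)
}

for method, scores in measures.items():
    print(f"{method:>23}: mean attainment = {mean(scores):.2f}")

# The two direct measures agree while the self-report runs higher: the classic
# triangulation pattern, and the reason no single method should stand alone.
```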


Assessment Method Truisms
- There will always be more than one way to measure any learning outcome
- No single method is good for measuring a wide variety of different student abilities
- There is generally an inverse relationship between the quality of measurement methods and their expediency
- It is important to pilot test to see if a method is appropriate for your program


Assessing Program Educational Objectives
- Similar to assessing program learning outcomes, but not the same
- What are some of the differences?
  - Degree of specificity
  - Role of constituents
  - Types of measurements possible
  - Cycles of data collection


Sampling
- For program assessment, sampling is acceptable and even desirable for programs of sufficient size
  - The sample must be representative of all students (a minimal sketch follows below)
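One common way to keep a sample representative is stratified random sampling: draw the same proportion from each subgroup instead of sampling the cohort as a whole. A minimal sketch, assuming a hypothetical cohort of three concentrations and a 25% sampling rate:

```python
# Hypothetical stratified sample of student work for program assessment.
import random

random.seed(42)  # fixed seed so the sample can be reproduced and audited

# Stratify by concentration so every subgroup appears in proportion.
cohort = {
    "mechanical": [f"ME{i:03d}" for i in range(60)],
    "electrical": [f"EE{i:03d}" for i in range(40)],
    "civil":      [f"CE{i:03d}" for i in range(20)],
}

SAMPLE_RATE = 0.25
sample = []
for stratum, students in cohort.items():
    k = max(1, round(len(students) * SAMPLE_RATE))  # at least one per stratum
    sample.extend(random.sample(students, k))
    print(f"{stratum}: sampled {k} of {len(students)}")

print(f"Total: {len(sample)} of {sum(len(s) for s in cohort.values())} students")
```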


Data Collection

Yr 1: Define outcomes / map curriculum
Yr 2: Data collection
Yr 3: Evaluation & design of improvements
Yr 4: Implement improvements & data collection
Yr …: cycle continues
Learning Outcomes: data-collection schedule, 08-09 through 13-14

[Table: each outcome below is assigned to selected academic years for data collection, staggering the assessment load across the six-year cycle.]

- A recognition of ethical and professional responsibilities
- An understanding of how contemporary issues shape and are shaped by mathematics, science & engineering
- An ability to recognize the role of professionals in the global society
- An understanding of diverse cultural and humanistic traditions
- An ability to work effectively in teams
- An ability to communicate effectively in oral, written, graphical and visual forms
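A staggered schedule like this is straightforward to encode and sanity-check. A minimal sketch; the outcome labels and year assignments below are hypothetical, not the marks from the original slide:

```python
# Hypothetical staggered data-collection schedule: outcome -> assessment years.
schedule = {
    "ethical and professional responsibilities": ["08-09", "11-12"],
    "contemporary issues":                       ["09-10", "12-13"],
    "teamwork":                                  ["10-11", "13-14"],
}

# Sanity check: list what is due each year, so no single year carries
# every outcome and each outcome is revisited within the cycle.
for year in ["08-09", "09-10", "10-11", "11-12", "12-13", "13-14"]:
    due = [o for o, yrs in schedule.items() if year in yrs]
    print(f"{year}: {', '.join(due) or 'no outcomes scheduled'}")
```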


Thank you for your participation!
Please fill out the closing evaluation.
All webinars can be ordered online:
www.abet.org/webinar.shtml
