
The Teaching-Learning Cycle:

Using Student Learning Outcome Results


to Improve Teaching and Learning

Workshop Activities &


Resource Materials

Bill Scroggins
November 2004
Table of Contents

Student Learning Outcomes at the Lesson Level............................................................................1


Student Learning Outcomes at the Course Level: From Course Objectives to SLOs.....................2
Primary Trait Analysis: Statements of Grading Criteria..................................................................4
Selecting the Assessment Method: Authentic Assessment and Deep Learning...............................6
Norming or Inter-Rater Reliability: Assuring Consistency of Grading Among Faculty.................8
The Assessment Report: Sharing the Results of Student Learning Outcomes................................8
Program Level Student Learning Outcomes....................................................................................9
Direct and Indirect Measures of Student Learning Outcomes.......................................................10
Identifying Program Competencies: External and Internal Sources.....................................11
Strategies for Direct Assessment of Program SLOs: Mosaic and Capstone Approaches..............11
General Education Student Learning Outcomes............................................................................13
Conclusion.......................................................................................................................14

Appendices

Appendix 1 – Good Practices in Assessing Student Learning Outcomes......................................15
Appendix 2 – Activity 3: Writing Student Learning Outcomes.....................................................22
Appendix 3 – Developing and Applying Rubrics..........................................................................23
Appendix 4 – Examples of Scoring Rubrics..................................................................................28
Appendix 5 – Activities 4 & 5: Building and Using a Grading Rubric.........................................29
Appendix 6 – The Case for Authentic Assessment by Grant Wiggins..........................................30
Appendix 7 – State and National Standards, Academic & Vocational Competencies..................32
Appendix 8 – Assessment Report Examples.................................................................................36
Appendix 9 – Assessment Plan Examples Internet Sites...............................................................40
Appendix 10 – Activity 5: Program SLOs from Competency Statements....................................41
Appendix 11 – Examples of Program Assessment Reports...........................................................42
Appendix 12 – General Education Student Learning Outcomes...................................................44
Appendix 13 – Resources and References for Student Learning Outcomes Assessment.............52

Endnotes.........................................................................................................................................57

URL for this document: http://cai.cc.ca.us/workshops/SLOFocusOnResults.doc

For further information contact: Bill Scroggins


Interim President
Modesto Junior College
scrogginsb@yosemite.cc.ca.us
The Teaching-Learning Cycle
Using Student Learning Outcome Results to Improve Teaching & Learning

Since the Accrediting Commission identified measuring student learning outcomes as the
focus of the latest revision of the WASC standards, many of us have been struggling with what
we are expected to do differently.i Whatever we do to implement Student Learning Outcomes,
this initiative must be seen to add value to the teaching and learning process, value that clearly
outweighs the task of constructing SLOs. Those of us who have taught for years consider that we
already measure student learning. However, I have come to believe that SLOs really do have a
new and useful emphasis that can be best captured by one word, results: collecting them,
sharing them, and using them to improve both learning and the operation of our colleges. This
series of reflections is intended to address getting useful results from the SLO process:
maximizing utility and minimizing futility. (That little "f" really makes a difference, doesn't it?)
Student Learning Outcomes at the Lesson Level

As we teach each lesson and grade the related student assignments, we typically have a clear
concept of the results expected, and we have defined methods for assessing student work and
assigning grades. However, there are several things that we typically don't do that can
potentially improve student learning. While many of us do give students written learning
objectives for each lesson, we usually do not write down criteria for grading nor share them with
students, other than how total points relate to the final grade in the course.

In listening to practitioners of SLOs such as Lisa Brewsterii, a Speech teacher at San Diego
Miramar College, and Janet Fulksiii, a Microbiology teacher at Bakersfield College, it is clear
that SLOs can become a powerful pedagogical tool by:
- sharing grading criteria with students,
- getting students to use these criteria as a way to better understand the material, and
- having students evaluate their own and each other's work.

Example: Results of SLOs at the Lesson Level
Most of us have criteria for grading student learning for the individual objectives of each lesson
we teach, but we may not write these down or share them with students. Here's an example:
Lesson Learning Objective: Describe and draw the four vibrations of carbon dioxide and show
how IR light is absorbed by CO2.
Sample Graded Question: What two types of motion are caused by the absorbance of IR light by
CO2? Draw one of these motions.
Grading Criteria:
Full credit: Student names bending and stretching and draws ball-and-stick models with arrows
up-and-down for bending and side-to-side for stretching.
Deductions: 25% for one name missing; 50% for both; 25% for wrong or missing arrows; 50%
for no drawing.
Results of Grading Student Work: 82% earned full credit; 5% confused arrows; 13% had no drawing.
Action to Improve Learning: The greatest deficiency seems to be drawing, so do in-class drawing exercise.

Activity 1
In small groups by discipline or cluster of related disciplines, discuss how you develop grading criteria.
Do you write down your grading criteria for each assignment?
How consistent are you in applying your grading criteria?
Do you use the results of student assessment to improve your grading criteria?
Do you communicate your grading criteria to students? Before or after the assignment?
Do you encourage students to apply the grading criteria to their own work?
Do you involve students in developing or modifying your grading criteria?
Do you share your grading criteria with other faculty?
Aggregating the feedback from grading student assignments can provide valuable insight into
areas in need of improvement. With all the demands on our time, we may not give adequate
attention to mining this valuable source of information for improvement of the teaching and
learning process. One of the challenges that the new accreditation standards present is creating
an assessment plan that outlines objectives, grading criteria, results of assessing student work,
and how we use those results to improve student learning.
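
One low-tech way to mine this information is simply to tally, for each graded question, how many students fell into each grading-criteria category, so the most common deficiency stands out. A minimal sketch in Python, using made-up outcomes patterned on the CO2 example above:

    # Minimal sketch (hypothetical data): tally graded outcomes for one lesson-level
    # question so the most common deficiency stands out across the class.
    from collections import Counter

    # One recorded outcome per student, taken from the grading criteria categories.
    graded_outcomes = [
        "full credit", "full credit", "no drawing", "full credit",
        "confused arrows", "full credit", "full credit", "no drawing",
    ]

    tally = Counter(graded_outcomes)
    total = len(graded_outcomes)

    for outcome, count in tally.most_common():
        print(f"{outcome}: {count}/{total} students ({count / total:.0%})")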

Of course, part of improving student learning is improving the way we teach. This inevitable
outcome can potentially be threatening to faculty members. However, when these issues have
been raised in workshops with faculty, the result has generally been a serious engagement in
discussions of teaching methods to improve authentic, deep learning. iv It is extremely important
to build environments for discussing the improvement of student learning which are positive and
reinforcing. Several colleges have made explicit commitments to this principle.v (The endnote
references include approaches by Palomar College in California, College of DuPage in Illinois,
and the American Association of Higher Education.)

Activity 2
Read the following resource documents (see Appendix 1) and join in the group discussion on Good Practices for
Assessment of Student Learning Outcomes.
- An Assessment Manifesto by College of DuPage (IL)
- 9 Principles of Good Practice for Assessing Student Learning by AAHE
- Palomar College Statement of Principles on Assessment from Palomar College (CA)
- Closing the Loop: Seven Common (Mis)Perceptions About Outcomes Assessment by Tom Angelo
- Five Myths of Assessment by David Clement, faculty member, Monterey Peninsula College

Student Learning Outcomes at the Course Level: From Course Objectives to SLOs

Beyond the lesson level, we must address results of student learning at the course level.
Moreover, we should do so for all sections of each course, meaning collaboration among the
full- and part-time faculty teaching the course. In stating the desired student learning outcomes,
we have the advantage of agreed-upon student objectives in the course outline.

A great deal of energy has been expended in discussing the difference between a course objective
and a student learning outcome. The difference may be clearer when viewed in the context of
producing assessment results that 1) provide useful feedback to improve the teaching and
learning process and 2) provide useful information to improve college practices. SLOs more
clearly connect with how the instructor will evaluate student work to determine if the objective
has been met. When we write an assignment, we provide a context in which the student will
respond and we evaluate the response based on criteria we use to judge if the student has met the
objective; usually we have at least a mental construct of minimum acceptable performance
standards. These are the two additional pieces that transform an objective into an SLO. Here's
how it might work.

If course objectives have been written well, they will be complete, measurable, and rigorous. In
practice, as faculty look more closely at the criteria and methods to assess these objectives,
changes often result. To operationalize an objective for assessment purposes, that is, to
transform it into a statement of desired student learning outcomes, typically we must address:

1) the stated objectives in terms of acquired knowledge, skill or values (hopefully, the
existing course objectives),

2) the context or conditions under which the student will be expected to apply the
knowledge, skill or values, and

3) the primary traits which will be used in assessing student performance.

Below are some examples of robust course objectives or statements of desired student
learning outcomes. (Note that this difference is largely semantic. Some colleges have chosen to
put SLO statements in course outlines as an enhancement of the objectives, while others have
built statements of desired SLOs into a departmental assessment plan, typically related to
program review.) Whatever vehicle the college uses to operationalize course objectives to SLOs,
it must be done collaboratively among faculty who teach the course.

Examples of Course Objectives Transformed Into Student Learning Outcomes


Course Objective (English): Write well-organized, accurate and significant content.
Statement of Desired SLO:
  Context: Given an in-class writing task based on an assigned reading,
  Objective: demonstrate appropriate and competent writing which
  Traits: states a thesis, supports assertions, maintains unity of thought and purpose, is
  organized, and is technically correct in paragraph composition, sentence structure, grammar,
  spelling, and word use.

Course Objective (Psychology): Analyze behavior following the major accepted theories.
Statement of Desired SLO:
  Context: Given a particular behavior and its context (e.g., playing incessantly with one's hair
  when under pressure in the presence of the opposite sex),
  Objective: describe how the perspectives of behaviorism, humanistic, psychoanalytic, and
  biological psychology would interpret that behavior and what methods might each use to alter
  that behavior.
  Traits: Include theoretical basis, description of causality, and treatment regimen.

Course Objective (Biology): Understand and apply the scientific method.
Statement of Desired SLO:
  Context: Given a hypothesis,
  Objective: design experiments and interpret data according to the scientific method in order to
  evaluate the hypothesis.
  Traits: Include the ability to approach the scientific method in a variety of ways, formulate
  questions, design experiments that answer the questions, and manipulate and evaluate the
  experimental data to reach conclusions.

Course Objective (Film): Compare and contrast the text and film versions of a literary work.
Statement of Desired SLO:
  Context: After viewing an assigned film based on a literary text,
  Objective: write a review of the film.
  Traits: Include an appraisal of the director's selection and effective translation of content from
  the literary text and the dominant tone the director seems to be trying to achieve, supporting
  each statement with detail from the text and film and your personal reaction to the cited scenes.

Activity 3
Perform the Writing Student Learning Outcomes exercise in Appendix 2. Review the first example. Then for the
second course objective, complete the Performance Context, Measurable Objective, and Primary Traits. Finally,
select an objective from a course in your discipline and construct the three-part SLO statement.

Primary Trait Analysis: Statements of Grading Criteria

Primary traits are the characteristics that are evaluated in assessing student work. Identifying
primary traits for a given assignment involves listing those specific components that, taken
together, make up a complete piece of work. They are the collection of things that we as teachers
look for when we grade student work.

Definition of Primary Trait Assessment


Primary trait assessment is a method of explicitly stating the criteria and standards for
evaluation of student performance of an assignment or test. The professor identifies the
traits that will be evaluated, and ranks the student's performance of each trait on a scale of
"most effective" to "least effective" realization of the assignment goals. On this scale, the
level of the student's performance is explicitly ranked so that the student knows how she
is being evaluated. The instructor has created the scale for direct application to the
assignment the student is performing so that if the entire class does poorly on the
assignment, it is clear to the instructor what difficulties the class may share with one
another. This recursive feedback of primary trait assessment can be used to inform
classroom and departmental improvement. vi

While primary traits are the categories into which we can sort competencies when we evaluate
student work, we look for specific levels of performance in each of these areas. For example, an
essay might be rated on development, organization, style, and mechanics. These primary traits
are then rated on some sort of a scale, as simple as A/B/C/D/F or as descriptive as
excellent/superior/satisfactory/poor/unsatisfactory. Occasionally, points are given based on this
scale. The challenge presented by the Student Learning Outcomes process is to write down those
observable student performance characteristics in an explicit way for each of the primary traits
we have identified. This system, known as a grading rubric, can be used to grade student work
collected through all manner of assessment methods.vii
Building a Rubric

Template for a Grading Rubric: Primary Traits and Observable Characteristics
  Traits (rows): Development, Organization, Style, Mechanics
  Performance levels (columns): Excellent, Superior, Satisfactory, Poor, Unsatisfactory

Start with expectations for satisfactory work for each trait, such as Organization in the template above:
- Ideas generally related to one another and to the focus, but may have some unrelated material
- Adequate introduction and conclusion
- Some attempt at transitions
Then stretch up to excellent and down to unsatisfactory.

Rubrics can be applied in total by specifically rating each primary trait (an analytic grading
rubric) or holistically (using the rubric as a guide to determine the overall rating of excellent,
satisfactory, or unsatisfactory, or whatever performance levels have been agreed upon). An
example is given below.

Primary Trait Grading of Math Problem Solvingviii

Understanding
- 3 points: complete understanding of the problem in the problem statement section as well as in the development of the plan and interpretation of the solution
- 2 points: good understanding of the problem in the problem statement section. Some minor point(s) of the problem may be overlooked in the problem statement, the development of the plan, or the interpretation of the solution
- 1 point: minimal understanding of the problem; the problem statement may be unclear to the reader. The plan and/or interpretation of the solution overlooks significant parts of the problem
- 0 points: no understanding of the problem; the problem statement section does not address the problem or may even be missing. The plan and discussion of the solution have nothing to do with the problem

Plan
- 3 points: plan is clearly articulated AND will lead to a correct solution
- 2 points: plan is articulated reasonably well and correct OR may contain a minor flaw based on a correct interpretation of the problem
- 1 point: plan is not clearly presented OR only partially correct based on a correct/partially correct understanding of the problem
- 0 points: no plan OR the plan is completely incorrect

Solution
- 3 points: solution is correct AND clearly labeled OR though the solution is incorrect it is the expected outcome of a slightly flawed plan that is correctly implemented
- 2 points: solution is incorrect due to a minor error in implementation of either a correct or incorrect plan OR solution is not clearly labeled
- 1 point: solution is incorrect due to a significant error in implementation of either a correct or incorrect plan
- 0 points: no solution is given

Presentation
- Higher rating: overall appearance of the paper is neat and easy to read, and all pertinent information can be readily found
- Lower rating: paper is hard to read OR pertinent information is hard to find

Holistic Grading of Math Problem Solvingviii

Analyzed holistically
- 3 points: All of the following characteristics must be present: answer is correct; explanation is clear and complete; explanation includes complete implementation of a mathematically correct plan.
- 2 points: Exactly one of the following characteristics is present: answer is incorrect due to a minor flaw in plan or an algebraic error; explanation lacks clarity; explanation is incomplete.
- 1 point: Exactly two of the characteristics in the 2-point section are present, OR one or more of the following characteristics are present: answer is incorrect due to a major flaw in the plan; explanation lacks clarity or is incomplete but does indicate some correct and relevant reasoning; plan is partially implemented and no solution is provided.
- 0 points: All of the following characteristics must be present: answer is incorrect; explanation, if any, uses irrelevant arguments; no plan for solution is attempted beyond just copying data given in the problem statement.

Grading rubrics can be applied to a wide variety of subjects and used in association with a range
of assessment techniques. (See the endnote on rubrics for references to good practices for using
rubrics and for a range of examples of rubrics at a variety of colleges and across several
disciplines.)
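
To make the analytic/holistic distinction above concrete, here is a minimal sketch (the levels, point values, and ratings are illustrative, not taken from any particular college's rubric) showing how an analytic score records one level per primary trait and totals them, while a holistic score records a single overall level guided by the same descriptors:

    # Minimal sketch (illustrative levels and ratings): analytic vs. holistic use of a rubric.
    points = {"Excellent": 4, "Superior": 3, "Satisfactory": 2, "Poor": 1, "Unsatisfactory": 0}

    # Analytic grading: one rating per primary trait for a single essay, then a total.
    analytic_ratings = {
        "Development": "Superior",
        "Organization": "Satisfactory",
        "Style": "Satisfactory",
        "Mechanics": "Excellent",
    }
    analytic_total = sum(points[level] for level in analytic_ratings.values())
    print(f"Analytic score: {analytic_total} of {len(analytic_ratings) * max(points.values())}")

    # Holistic grading: the rater uses the same descriptors but records one overall level.
    holistic_rating = "Satisfactory"
    print(f"Holistic rating: {holistic_rating} ({points[holistic_rating]} points)")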

Before doing these two activities on rubrics, read Developing and Applying Rubrics by Mary
Allen in Appendix 3. If possible, review some of the sample rubrics listed in Appendix 4.
Activity 4: Building a Rubric
Using the grid in Appendix 5A, select or write an SLO, identify Primary Traits, and then decide
on observables for each assessment level.
Activity 5: Using a Grading Rubric and Norming the Results

Use the English rubric in Appendix 5B to grade the sample student essay in Appendix 5C.
Compare your results with colleagues who graded the same paper. Where were your assessments
different? Can you come to agreement on the overall rating of the paper?
To this point we have discussed stating the desired student learning outcome and developing a
grading rubric. These are the beginning steps that can lead us toward collecting and using the
results of measured student learning outcomes. A road map of a possible SLO Assessment Plan
is shown in the diagram below.

Course Level TLC: Elements of an Assessment Plan

The elements flow in sequence, compiled for each desired SLO:
1. Course Objective
2. Context or Conditions
3. Primary Traits
(Steps 1-3 make up the Statement of Desired SLO.)
4. Observables for Each Performance Level
5. Assessment Method Selected
(Steps 4-5 make up the Grading Rubric.)
6. Norm Among Instructors
7. Evaluate Student Work
8. Compile Results
9. Use Feedback for Improvement
(Steps 6-9 are carried out through Faculty Collaboration, and the results are compiled into the
Assessment Report for each desired SLO.)

Selecting the Assessment Method: Authentic Assessment and Deep Learning

The next logical question is "What assessment method should be used?" There are certainly a
wide variety of methods for determining whether or not a student has demonstrated learning of a
particular objective.

Summary of Tools for Direct Assessment of Student Learningix


Capstone Project/Course: a project or course which, in addition to a full complement of instructional objectives,
also serves as the primary vehicle of student assessment for the course or program.
Criterion-Referenced Tests: a measurement of achievement of specific criteria or skills in terms of absolute levels
of mastery. The focus is on performance of an individual as measured against a standard or criteria rather than
against performance of others who take the same test, as with norm-referenced tests.
Norm-Referenced Test: an objective test that is standardized on a group of individuals whose performance is
evaluated in relation to the performance of others; contrasted with criterion-referenced test.
Portfolio: a collection of student work organized around a specific goal (e.g., a set of standards, benchmarks, or
instructional objectives); it can contain items such as handouts, essays, rough drafts, final copies, artwork, reports,
photographs, graphs, charts, videotapes, audiotapes, notes, anecdotal records, and recommendations and reviews;
each item in the portfolio provides a portion of the evidence needed to show that the goal has been attained.
Performance Assessments: activities in which students are required to demonstrate their level of competence or
knowledge by creating a product or response scored so as to capture not just the "right answer", but also the
reasonableness of the procedure used to carry out the task or solve the problem.
Rating Scales: subjective assessments made on predetermined criteria in the form of a scale. Rating scales include
numerical scales or descriptive scales. Forced choice rating scales require that the rater determine whether an
individual demonstrates more of one trait than another.
Simulation: a competency-based measure whereby pre-operationalized abilities are measured in the most direct,
real-world approach. Simulation is primarily utilized to approximate the results of performance appraisal when,
due to the target competency involved, logistical problems, or cost, direct demonstration of the student skill is
impractical.

Activity 6
Read the article The Case for Authentic Assessment by Grant Wiggins in Appendix 6. Discuss the
assessment methods you use in your classes. What methods do you use? How effective do you find them?

Activity 7
View the film A Private Universex. Discuss the implications for producing and assessing deep learning.

As I have listened to faculty discuss assessment methods (at six statewide California Assessment
Institutes, eight regional RP/CAI workshops, and our own college's summer institute on SLOs), I
have come to several conclusions:

- School-of-Education level discussions of assessment instruments are not well received.
- Faculty are eager to talk about the challenges they experience in assessing students.
- Discussions often turn to great stuff such as authentic assessment and deep learning.
- Most faculty use a rather narrow range of methods, but use them well.
- Faculty will more often try another assessment technique if recommended by a colleague.
- Many faculty use assessments that need just slight enhancement to yield SLO results.

A few specifics on the last point may help:

- One vocational department teaches portfolios in its introductory course, and uses portfolios
when doing faculty career advising, but does not follow through by having students add to the
portfolio as competencies are acquired in subsequent courses. The capstone course in this
department has students build a portfolio as part of preparing to enter the job market, but there
is no connection with the portfolio in the intro class nor is there a grading rubric.
- One department has a clinical component in which students are evaluated using a rating sheet
on their hands-on competencies. The department has complained about needing feedback from
clinical to the theory courses, but has not consistently used the results of the rating sheets for
this purpose. The competencies taught in the theory course are fairly well aligned with those
assessed in clinical but could be improved.
- Faculty in one of the social science departments have worked on departmental standards for
term papers to the point of a primary trait analysis and meet regularly to discuss grading of
term papers but have not filled in the observables to establish a rubric.
- The English department has a grading rubric for written essays, and full- and part-time faculty
have regular norming sessions to improve consistency of grading, but the system has only been
used for two courses, freshman comp and its prerequisite.

Based on these observations, my recommendation is to start with these good things that faculty
are doing, get them engaged in talking about grading (Effective Grading: A Tool for Learning
and Assessment by Barbara Walvoord and Virginia Anderson has been great for this), get faculty
to share assessment strategies with one another, especially across disciplines, and provide the
support for moving these good existing assessment practices to the next level.

Norming or Inter-Rater Reliability: Assuring Consistency of Grading Among Faculty

Whatever method is chosen to assess student learning and apply the agreed-upon grading rubric,
faculty who teach sections of the course should work together to assure that the results of grading
student work are consistent. This process is known as norming or inter-rater reliability and
has been used in a variety of venues including construction of standardized tests, evaluating
placement test writing samples and ranking grant proposals. An explicit process for establishing
inter-rater reliability would be to have evaluators use the grading rubric on a series of student
assignments and then evaluate the extent of agreement using standard statistical measures. (The
kappa statistic, the chi square test, the Pearson correlation coefficient, and percent agreement
have all been used under various circumstances.) Agreement can be improved through discussion
and training. Norming can be performed informally by having regular discussions among faculty
raters, reviewing and debating examples related to the observables in the grading rubric until
consensus is reached.xi
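
For departments that want a quick numerical check before or between norming discussions, percent agreement and Cohen's kappa can be computed from two raters' rubric ratings of the same set of papers. A minimal sketch in Python, with illustrative ratings rather than real data:

    # Minimal sketch (illustrative ratings): percent agreement and Cohen's kappa for
    # two raters applying the same rubric to the same six student papers.
    from collections import Counter

    rater_a = ["Excellent", "Satisfactory", "Poor", "Satisfactory", "Excellent", "Poor"]
    rater_b = ["Excellent", "Satisfactory", "Satisfactory", "Satisfactory", "Excellent", "Poor"]
    n = len(rater_a)

    # Observed agreement: fraction of papers given the same rating by both raters.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Chance agreement, from each rater's marginal rating frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    chance = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(rater_a) | set(rater_b))

    # Cohen's kappa corrects observed agreement for agreement expected by chance.
    kappa = (observed - chance) / (1 - chance)
    print(f"Percent agreement: {observed:.0%}, Cohen's kappa: {kappa:.2f}")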

With the statement of the desired student learning outcome in place, with the grading rubric
established and normed, the results collected can be powerful information for improving student
learning, and may provide the basis for directing college resources in areas to address the
learning gaps identified.

The Assessment Report: Sharing the Results of Student Learning Outcomes

A sensitive aspect of the discussion of Student Learning Outcomes has been how the information
is to be used. Most importantly, the significance of the results relates directly to improving
teaching and learning. Most of that improvement lies with faculty: curriculum design,
pedagogy, learning environment, assessment methods and the like. The rest is in the hands of the
college's support system, providing facilities, equipment, student services and so on to the
instructional program to make those improvements identified by SLO results. To the extent that
we can build an Assessment Report that focuses on the instructional program level (helping
faculty improve student learning and identifying needed college resources), college faculty and
staff will buy into the process. The examples below illustrate a few key points:

- An analysis of the results, by faculty, particularly all program faculty, must accompany the results.
- Results can be listed completely or summarized in narrative form.
- Specific actions to be taken as a consequence of the results should be described.
- Results often contradict our assumptions of how and what students learn.
- Use of SLO results can be effectively centered in the instructional program as the locus of change.
- Simple presentations of results form elegant evidence for accreditation.

The key components of the Assessment Plan are the student learning outcomes statements and
the assessment methods used for each. The plan often includes benchmarks that indicate the
incremental gains expected in the assessment results. The essential features of the Assessment
Report are a summary of the results of student evaluations, an analysis of those findings, and a
summary of the actions taken to improve the student assessment performance. The diagram
below summarizes the elements of an effective Assessment Plan and the resulting Assessment
Report. Examples of Assessment Reports are shown in the Appendix.

Course Assessment Report

Department __________________________________          Term & Year __________________
Course Name and Number ________________________________________________________

Student Learning Outcome Statements and Assessment Method Description (attach rubric):
1. ______________________  Method: [ ] Capstone Project  [ ] Embedded Test Question  [ ] Portfolio  [ ] Performance Assessment  [ ] Rating Scale  [ ] Other
2. ______________________  Method: [ ] Capstone Project  [ ] Embedded Test Question  [ ] Portfolio  [ ] Performance Assessment  [ ] Rating Scale  [ ] Other
3. ______________________  Method: [ ] Capstone Project  [ ] Embedded Test Question  [ ] Portfolio  [ ] Performance Assessment  [ ] Rating Scale  [ ] Other

Assessment Results / Analysis & Actions Taken
1.
2.
3.
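
A department that wants to keep these reports in a consistent electronic form could store one record per SLO combining the plan fields with the report fields. A minimal sketch (the field names and sample entries are illustrative, not a prescribed format):

    # Minimal sketch (illustrative fields and sample entries): one record per SLO,
    # combining Assessment Plan fields with the resulting Assessment Report fields.
    from dataclasses import dataclass

    @dataclass
    class SLOAssessment:
        slo_statement: str      # desired student learning outcome
        assessment_method: str  # e.g., embedded test question, portfolio, capstone
        benchmark: str          # expected result or incremental gain
        results: str            # summary of the results of evaluating student work
        analysis: str           # faculty analysis of the findings
        actions_taken: str      # actions taken to improve student learning

    record = SLOAssessment(
        slo_statement="Given a hypothesis, design experiments and interpret data to evaluate it.",
        assessment_method="Embedded exam question scored with the department rubric",
        benchmark="70% of students score Satisfactory or higher",
        results="61% scored Satisfactory or higher; data interpretation was the weakest trait",
        analysis="Students design sound experiments but struggle to draw conclusions from data",
        actions_taken="Add a data-interpretation workshop before the unit exam",
    )
    print(record.slo_statement)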

Program Level Student Learning Outcomes

The term program here refers to core required courses for occupational programs and lower
division major preparation for transfer programs. Many professional societies have standards or
competencies that can be used as the basis for program level SLOs. (Some examples are
referenced in the endnotes and summarized in the Appendix.) Often, however, these
competencies are in the form of discrete skills rather than more global outcomes that would lend
themselves to summaries of student learning by those who have completed those programs. An
example of aggregating detailed standards into more comprehensive SLO statements is this
sample taken from the American Psychological Association.xii
Example of Aggregation of Specific Program Competencies into a Program Student Learning Outcome

Global Student Learning Outcome: Use critical thinking effectively.

Specific Competencies:
a. Evaluate the quality of information, including differentiating empirical evidence from speculation and the
probable from the improbable.
b. Identify and evaluate the source, context, and credibility of information.
c. Recognize and defend against common fallacies in thinking.
d. Avoid being swayed by appeals to emotion or authority.
e. Evaluate popular media reports of psychological research.

f. Demonstrate an attitude of critical thinking that includes persistence, open-mindedness, tolerance for
ambiguity and intellectual engagement.
g. Make linkages or connections between diverse facts, theories, and observations.
From Undergraduate Psychology Major Learning Goals And Outcomes: A Report, American Psychological Association, March 2002xii

Direct and Indirect Measures of Student Learning Outcomes

One consideration of the method is whether it is a direct or indirect measure of student learning.
Direct assessment includes using criteria that assess or measure student learning directly, such
as writing an essay, giving a speech, solving problems, using a capstone experience, or evaluating a
portfolio of student-created products. Indirect assessment examines student performance or
behavior using criteria which, if accomplished, assume learning has taken place. Examples
include surveys of students and employers, exit interviews of graduates, retention and transfer
studies, and job placement data.xiii

Indirect measures are often thought of as outputs: course completions, degrees, certificates, and
transfers for example. These are the institutional measures of accountability measured by the
California Community Colleges Partnership for Excellence initiative. These measures are often
key indicators of success for a program, as exemplified below.

Example of the Use of Direct and Indirect Measures of Student Learning

From Oklahoma State University: http://www.okstate.edu/assess

Student Outcomes for Geology. Upon degree completion, students will:
- Demonstrate understanding of the basic concepts in eight subject areas: physical geology,
historical geology, mineralogy, petrology, sedimentology/stratigraphy, geomorphology,
paleontology, and structural geology. (Direct)
- Demonstrate technical skills in the collection and analysis of geologic data, critical-thinking
skills, plus written and verbal communication skills. (Direct)
- Apply geologic knowledge and skills to a range of problems faced by business, industry,
government. (Direct)
- Gain employment in the geology profession or advance to graduate studies in geology or an
allied field. (Indirect)

Identifying Program Competencies: External & Internal Sources

(Program: a sequence of courses leading to an educational goal in accord with the mission of the
California Community Colleges: transfer, associate degree (both of which have major and
general education components), certificate, basic skills, or workforce skill upgrades.)

One of the new activities that the accreditation standards require is the construction of
competencies for each of our degree and certificate programs. One way to approach this task is
to begin with the competencies or standards that are used by state or national professional
organizations or licensing/credentialing bodies. These groups span a wide range of disciplines,
both academic and vocational. Some examples:

- The American Welding Society publishes welding codes and standards on which an extensive
AWS curriculum is based. Many community colleges give students AWS certification tests based
on these competencies.
- The California Board of Registered Nursing uses standards of competent performance and
tests nursing applicants for licensure in many nursing fields.
- The American Psychological Association recently published Undergraduate Psychology
Learning Goals and Outcomes that lists both global student learning outcomes and detailed
competencies for both the psych major and liberal studies students.
- The California State Board of Barbering and Cosmetology tests graduates for licensure based
on curriculum standards enacted in Title 16 of the California Code of Regulations.

Links to these and other competencies and standards are found in the Appendix. While an
individual program may not teach to all the outcomes that these groups specify, the lists are an
excellent starting point. Not all programs have industry associations or professional societies
who write standards. Such programs may need to consult local vocational advisory committees
or faculty colleagues at neighboring institutions.

Strategies for Direct Assessment of Program SLOs: Mosaic and Capstone Approaches

The Mosaic Approach. Assessment of program-level student learning outcomes can be


approached by assessing either detailed competencies or more global program learning goals.
(Look again at the example in the table at the top of page 10 for the distinction between a global
SLO statement and its detailed competencies.) Assessing detailed competencies views the
acquiring of knowledge, skills and attitudes as taking place rather like assembling a complex
mosaic from individual colored tiles. It is a more analytical model and provides more targeted
information about student learning. However, the extent of the effort to find authentic
assessments for a large number of mosaic competencies, get agreement among program faculty
on those assessments, construct rubrics, norm on samples of student work, and then collect and
analyze the data may stretch program resources to the breaking point. Furthermore, the
acquisition of small, discrete packets of knowledge may not lead the student to acquire a more
integrated understanding that provides needed applicability to the next step in that student's
career, be it transfer or directly entering the job market. Consequently, more holistic assessments
are often preferred, such as capstone courses or internships.

The Program Audit. Even if an integrated assessment is used at the end of the program, it is
useful to identify where in the curriculum each SLO (or even individual competency) is
acquired. Furthermore, learning most often occurs in cycles: the student will be exposed to a
topic, then later gain competency in that area, and finally master that skill. Doing a program
audit of exactly where SLOs and/or competencies are introduced, reinforced, and mastered in the
program course offerings is a useful exercise. A template for such a program audit is shown
below. Several colleges use such a model to connect individual course learning outcomes
statements with the more global program level learning outcomes statements.

Curriculum Audit Grid: Identifying Specific Competencies in the Program Mosaic

Courses in the program (columns): 201, 202, 205, 207, 251, 260, 313, 314, 320, 425
Outcomes (rows), with each course marked I = Introduced, E = Emphasized, or R = Reinforced:
1. Recognize and articulate approaches to psychology (marked I, E, and R across the course sequence)
2. Independently design valid experiments (marked I, E, and R across the course sequence)
3. Articulate a philosophy of psych/Christian integration (marked I, I, R, R, R, R, R, R, E across the course sequence)

Modeled after Hatfield (1999). I = Introduced, E = Emphasized, R = Reinforced
Example from A Program Guide for Outcomes Assessment by Geneva College, April
2000: http://www.geneva.edu/academics/assessment/oaguide.pdf
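
One way a department might keep such an audit grid current is to record it as a simple mapping and check that every program outcome is introduced, emphasized, and reinforced somewhere in the sequence. A minimal sketch (course numbers and placements are hypothetical, not taken from the Geneva College example):

    # Minimal sketch (hypothetical placements): a curriculum audit grid as a mapping of
    # program outcome -> {course: level}, where level is "I" (Introduced),
    # "E" (Emphasized), or "R" (Reinforced), with a check for coverage gaps.
    audit_grid = {
        "Recognize and articulate approaches to psychology": {"201": "I", "251": "E", "425": "R"},
        "Independently design valid experiments": {"202": "I", "313": "E"},  # never reinforced
    }
    required_levels = {"I", "E", "R"}

    for outcome, placements in audit_grid.items():
        missing = required_levels - set(placements.values())
        if missing:
            print(f"Gap for '{outcome}': no course marked {sorted(missing)}")
        else:
            print(f"'{outcome}' is introduced, emphasized, and reinforced.")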

Activity 8 Synthesizing Global Program SLO Statements from Course Objectives


Appendix 10 presents an exercise in synthesizing course SLO statements into program SLO
statements. Do this exercise in small groups and then compare results between groups.
The Capstone Approach. The above examples focus on the connection between global program
outcomes and the individual competencies students acquire in courses. These individual
competencies may be measured through assessment instruments embedded in those courses, be
they test questions, performance evaluations, or any other tool for which a normed rubric has
been generated. Quite often, program outcomes are determined by a more holistic measure such
as a portfolio or other capstone assignment, a survey, or indirect measures such as success in
transfer or employment. While these program assessment techniques are considerably simpler to
construct and carry out, they may not provide the diagnostic feedback which would enable more
targeted improvements in teaching and learning. The table below gives an example of such a
holistic or capstone program assessment report.

PARKLAND COLLEGE ACADEMIC ASSESSMENT (Excerpts)


DEPARTMENT: Fine and Applied Arts     PROGRAM: Mass Communication
Methods: Pre/Post Tests, Capstone exam/project, Primary Trait Analysis, Course Embedded Test,
Standardized Exams, Professional Certification, Portfolios, Performance Assessment, Other
Indirect Assessment Measures: Transfer/Employment Data, Grad Surveys/Interviews, Employer/Faculty Surveys

1. Intended Outcome (Objective): Students will demonstrate proficiency in employable Mass
Communication skills.
   Assessment Criteria & Methods (Expected Results): Students will demonstrate desired mass
communication competencies as shown by individual portfolios, when assessed by
representatives from industry, as reported on MC Portfolio Evaluation form.
   Actual Results: Written comments from industry representatives indicate that MC student's
portfolio assessment ranked 4 (on a scale of 1 to 5, five being the highest score). Suggestions
were to include more Web site graphics into curriculum.
   Analysis & Action: Desktop Graphics program revised to include more experience in Web site
graphics. Students designed graphics for current MC home page and links.

2. Intended Outcome (Objective): Students will demonstrate learning the basic concepts
necessary to perform satisfactorily in Mass Communications entry-level jobs.
   Assessment Criteria & Methods (Expected Results): When surveyed using Parkland College
Student Occupational Follow-Up Survey, graduates will describe satisfaction with their Mass
Communication knowledge to recall, analyze, evaluate, and utilize basic concepts.
   Actual Results: Feedback from employers and students strongly indicated that Visual Arts
program option had become obsolete; preference is given to graduates with Desktop Publishing skills.
   Analysis & Action: Visual Arts program option shelved.

3. Intended Outcome (Objective): Students in the Mass Communication A.A. program will have
the knowledge to successfully complete a Bachelor's degree in Mass Communication.
   Assessment Criteria & Methods (Expected Results): Four-year institutions will report a 75
percent acceptance rate into Mass Communication programs.
   Actual Results: U of I Coordinator of Transfer Articulation reported that out of 29 applicants
from other schools to Graphics, a Mass Com student was the only admit.
   Analysis & Action: Continue to gather/monitor data. Investigate how many Parkland Graphics
students applied.

Activity 9 Creating Program SLO Statements and Performing a Program Audit


- With a group of faculty from your discipline, write a set of Program SLOs for a degree or
certificate in your area.
- Assemble the outlines of record for the required courses for a degree or certificate in your
discipline.
- List the course objectives for all of these courses, preferably after having revised them to
robust objectives/student learning outcomes as described previously.
- Identify which course objectives match with each Program SLO statement. Present the results
in a table format like that above. You may wish to categorize each course objective by the
extent to which it moves students toward mastery of the Program SLO.
Summary of Program Level SLO Assessment Strategies

Indirect (implies that SLOs are achieved):
- Transfer
- Program completion
- Job placement
- Employer surveys
- Student exit surveys
- Licensure exams

Direct (students assessed on SLOs while in program):
- Capstone strategies: capstone course or project; standardized test (commercial or local,
sampled or comprehensive); internship/clinical workplace evaluation; portfolio (student- or
instructor-generated)
- Mosaic strategies: embedded assessments/program audit

Program Level TLC (a continuing cycle):
1. Implement Program
2. Identify Indirect Measures
3. Identify Direct Measures
4. Collect Assessment Results
5. Disseminate & Reflect on Results
6. Decide on Program Improvements

General Education Student Learning Outcomes

Assessment of learning outcomes in general education can be approached rather like those for
programs in the major. Most colleges write global learning statements and then break those down
into specific competencies that are on the level of course objectives. Several examples are given
in Appendix 12, including an audit grid for general education competencies.

California Community Colleges have three sets of general education patterns to offer to students:
the associate degree pattern set by Title 5, the CSU GE-Breadth pattern, and IGETC. While these
patterns are similar, they have significant differences. The competency statements found in the
source documents for CSU GE-Breadth and IGETC can be a useful starting point for colleges
beginning the process of constructing SLO statements for general education categories.
General Education Patterns Available to Students (Merced College Example)

Merced College AA:
  A. Language & Rationality
  B. Natural Sciences
  C. Humanities
  D. Social & Behavioral Sciences
  E. Lifelong Understanding & Self-Development
  F. History & Government

CSU GE-Breadth:
  A1. Oral Communication
  A2. Written Communication
  A3. Critical Thinking
  B1. Physical Science
  B2. Life Science
  B3. Laboratory Activity
  B4. Mathematics/Quantitative Reasoning
  C1. Arts
  C2. Humanities
  D. Social, Political & Economic Institutions & Behavior; Historical Background
  E. Lifelong Understanding & Self-Development

IGETC:
  1A. English Composition
  1B. Critical Thinking
  1C. Oral Communication
  2. Mathematical Concepts & Quantitative Reasoning
  3A. Arts
  3B. Humanities
  4. Social & Behavioral Sciences
  5A. Physical Science
  5B. Biological Science
  6. Language Other Than English

For more information refer to CSU Executive Order 595 and IGETC Notes 1, 2 and 3

Activity 10 Writing Global SLO Statements with Specific Competencies for Each
- Review the models of general education student learning outcomes in Appendix 12 (assumes
knowledge of CCC GE, CSU GE-Breadth and IGETC patterns).
- For each college GE area, write a global student learning outcome statement.
- For each college GE area, write specific competency SLO statements under each of the global
SLO statements.

Activity 11 Performing a General Education Program Audit

- Assemble the outlines of record for the courses approved in each GE area.
- List the course objectives for all of these courses, preferably after having revised them to
robust objectives/student learning outcomes as described previously.
- Identify which course objectives match with each GE SLO statement. Present the results in a
table format like that discussed previously. You may wish to categorize each course objective
by the extent to which it moves students toward mastery of the AA/AS GE SLO: I =
Introduced, E = Emphasized, or R = Reinforced.

Conclusion

In presenting preliminary findings to be published in an upcoming monograph, Jack Friedlander,
Executive Vice President of Santa Barbara City College, concluded that most colleges around
the country are still at the process level of developing SLOs. Nevertheless, there are many
examples of excellent work on SLOs at colleges around the country, summarized in Appendix
13. These examples should provide colleges which are new to the Student Learning Outcomes
process with the shared experiences of their colleagues so that climbing the learning curve can
be facilitated.
curve can be facilitated. The climate in education today simply will not allow us to expend
valuable time and energy on a process that will not yield useful results. Such results have the

14
potential to allow faculty and others to engage in reflection about the process of teaching and
learning and then use the insights they develop to adjust the teaching-learning-assessment
process to optimize learning to the full extent possible. By having a clear path to those results,
we can move ahead with taking the first few steps. But we need to keep our eye on the goal as
were walking. Remember, utility can quickly become futility by adding a few fs!

Appendix 1 Good Practices in Assessing Student Learning Outcomes

Appendix 1A An Assessment Manifesto by College of DuPage (IL)

This 10-point manifesto is taken from the end section of 500 Tips on Assessment by Sally Brown, Phil Race and
Brenda Smith, published by Kogan Page in the Spring of 1996. We state some values which we believe should
underpin assessment, whatever form it takes and whatever purpose it serves. Our thinking on these values owes a
debt to the work of the Open Learning Foundation Assessment Issues Group, in which we all participated, and to the
values adopted by the UK Staff and Educational Development Association for its Teacher Accreditation Scheme,
and Fellowship Scheme.

1. Assessment should be based on an understanding of how students learn. Assessment should play a
positive role in the learning experiences of students.
2. Assessment should accommodate individual differences in students. A diverse range of assessment
instruments and processes should be employed, so as not to disadvantage any particular individual or
group of learners. Assessment processes and instruments should accommodate and encourage
creativity and originality shown by students.
3. The purposes of assessment need to be clearly explained. Staff, students, and the outside world need
to be able to see why assessment is being used, and the rationale for choosing each individual form of
assessment in its particular context.
4. Assessment needs to be valid. By this, we mean that assessment methods should be chosen which
directly measure that which it is intended to measure, and not just a reflection in a different medium
of the knowledge, skills or competences being assessed.
5. Assessment instruments and processes need to be reliable and consistent. As far as is possible,
subjectivity should be eliminated, and assessment should be carried out in ways where the grades or
scores that students are awarded are independent of the assessor who happens to mark their work.
External examiners and moderators should be active contributors to assessment, rather than observers.
6. All assessment forms should allow students to receive feedback on their learning and their
performance. Assessment should be a developmental activity. There should be no hidden agendas in
assessment, and we should be prepared to justify to students the grades or scores we award them, and
help students to work out how to improve. Even when summative forms of assessment are employed,
students should be provided with feedback on their performance, and information to help them
identify where their strengths and weaknesses are.
7. Assessment should provide staff and students with opportunities to reflect on their practice and their
learning. Assessment instruments and processes should be the subject of continuous evaluation and
adjustment. Monitoring and adjustment of the quality of assessment should be built into quality
control processes in universities and professional bodies.
8. Assessment should be an integral component of course design, and not something bolted on
afterwards. Teaching and learning elements of each course should be designed in the full knowledge
of the sorts of assessment students will encounter, and be designed to help them show the outcomes of
their learning under favorable conditions.
9. The amount of assessment should be appropriate. Students' learning should not be impeded by being
driven by an overload of assessment requirements, nor should the quality of the teaching conducted
by staff be impaired by excessive burdens of assessment tasks.
10. Assessment criteria need to be understandable, explicit and public. Students need to be able to tell
what is expected of them in each form of assessment they encounter. Assessment criteria also need to
be understandable to employers, and others in the outside world.

Appendix 1B Palomar College (CA) Statement of Principles on Assessment

Why do Assessment?

Palomar's Vision Statement projects a future in which "Palomar College judges its work and its
programs and formulates its policies primarily on the basis of learning outcomes and has a
comprehensive program for assessing those outcomes and responding to its findings." We
adopted this strategic goal even before our accrediting body revised its accreditation standards to
"focus on outcomes and accomplishments, embracing a model of accreditation which requires
assessment of resources, processes, and outcomes at the institutional level." Thus our own
commitment to assess student learning at the institutional level precedes, but complements, the
mandates of accreditation. To carry out that commitment, Palomar will develop and continuously
refine and improve an institutional framework for assessing student learning and using the
information gained from such assessment to serve our students better.
What is assessment?
We mean by "assessment" "the systematic collection, analysis, interpretation, and use of
information to understand and improve teaching and learning" (Tom Angelo).
What is assessment for?
At Palomar, we will use assessment primarily to understand, and thereby improve, student
learning. More specifically, assessment can serve the following roles in the institution:
To provide improved feedback, guidance, and mentoring to students so as to help them
better plan and execute their educational programs.
To provide improved feedback about student learning to support faculty in their work.
To help us design and modify programs to better promote learning and student success.
To develop common definitions and benchmarks for important student abilities that will
enable us to act more coherently and effectively to promote student learning.
To help us understand how different groups of students experience the college differently
so as to adapt our courses and programs to the needs and capacities of all students.

To help us understand how our different courses and programs affect students over time so that
we can better coordinate and sequence the students' experience to produce more and deeper
learning.

What is assessment not for?

Different institutions may, of course, use the tools of learning assessment differently. It will help
to clarify the nature of Palomar's commitment to learning assessment to specify some of the
possible purposes of assessment that we will exclude from our approach.
We will not use assessment as an end in itself. Assessment that does not help us to
promote student learning is a waste of time.
We will not use assessment of student learning punitively or as a means of determining
faculty or staff salaries or rewards. The purpose of assessment is to evaluate student
learning, not to reward or punish faculty or staff.
We will not use any single mode of assessment to answer all questions or strictly
determine program decisions.

We will not use assessment in a way that will impinge upon the academic freedom or
professional rights of faculty. Individual faculty members must continue to exercise their
best professional judgment in matters of grading and discipline.
We will not assume that assessment can answer all questions about all students. We need
not directly assess all students in order to learn about the effectiveness of our programs
and policies.
We will not assume that assessment is quantitative. While numerical scales or rubrics
(such as the four-point grading scale) can be useful, their accuracy always depends on the
clear understanding of the concepts behind the numbers. Often the best indicator of
student learning can be expressed better as a narrative or a performance than as a number.
We will not use assessment only to evaluate the end of the students' experience or merely
to be accountable to outside parties. Assessment must be ongoing observation of what we
believe is important.
We will not assume that assessment is only grading.

Who will do assessment?

Palomar's faculty, in consultation with the entire college community, will shape and design
institutional assessment activities and will identify the core knowledge and skills that our
students need to master. The faculty will likewise develop benchmarks by which student progress
can be evaluated. These will be ongoing processes, open to modification and improvement. Not
all assessment need be done in individual classes, and not every faculty member need assess all
of the core learning.

How will we use assessment?

The following guidelines will govern the methodology and approach we will employ at Palomar
to institutional assessment:
We will always seek multiple judgments of student learning rather than a single standard.
We will assess those skills and knowledge that our faculty, in consultation with the entire
college community, judges to be important and valuable.
We will assess the ongoing progress of students throughout their experience at the
college.

Appendix 1C AAHE Nine Principles of Good Practice for Assessing Student Learning

AAHE ASSESSMENT FORUM


9 Principles of Good Practice for Assessing Student Learning
1. The assessment of student learning begins with educational values. Assessment is not
an end in itself but a vehicle for educational improvement. Its effective practice, then,
begins with and enacts a vision of the kinds of learning we most value for students and
strive to help them achieve. Educational values should drive not only what we choose to
assess but also how we do so. Where questions about educational mission and values are
skipped over, assessment threatens to be an exercise in measuring what's easy, rather than
a process of improving what we really care about.
2. Assessment is most effective when it reflects an understanding of learning as
multidimensional, integrated, and revealed in performance over time. Learning is a
complex process. It entails not only what students know but what they can do with what
they know; it involves not only knowledge and abilities but values, attitudes, and habits
of mind that affect both academic success and performance beyond the classroom.
Assessment should reflect these understandings by employing a diverse array of methods,
including those that call for actual performance, using them over time so as to reveal
change, growth, and increasing degrees of integration. Such an approach aims for a more
complete and accurate picture of learning, and therefore firmer bases for improving our
students' educational experience.
3. Assessment works best when the programs it seeks to improve have clear, explicitly
stated purposes. Assessment is a goal-oriented process. It entails comparing educational
performance with educational purposes and expectations -- those derived from the
institution's mission, from faculty intentions in program and course design, and from
knowledge of students' own goals. Where program purposes lack specificity or
agreement, assessment as a process pushes a campus toward clarity about where to aim
and what standards to apply; assessment also prompts attention to where and how
program goals will be taught and learned. Clear, shared, implementable goals are the
cornerstone for assessment that is focused and useful.
4. Assessment requires attention to outcomes but also and equally to the experiences
that lead to those outcomes. Information about outcomes is of high importance; where
students "end up" matters greatly. But to improve outcomes, we need to know about
student experience along the way -- about the curricula, teaching, and kind of student
effort that lead to particular outcomes. Assessment can help us understand which students
learn best under what conditions; with such knowledge comes the capacity to improve the
whole of their learning.
5. Assessment works best when it is ongoing not episodic. Assessment is a process whose
power is cumulative. Though isolated, "one-shot" assessment can be better than none,
improvement is best fostered when assessment entails a linked series of activities
undertaken over time. This may mean tracking the process of individual students, or of
cohorts of students; it may mean collecting the same examples of student performance or
using the same instrument semester after semester. The point is to monitor progress

toward intended goals in a spirit of continuous improvement. Along the way, the
assessment process itself should be evaluated and refined in light of emerging insights.
6. Assessment fosters wider improvement when representatives from across the
educational community are involved. Student learning is a campus-wide responsibility,
and assessment is a way of enacting that responsibility. Thus, while assessment efforts
may start small, the aim over time is to involve people from across the educational
community. Faculty play an especially important role, but assessment's questions can't be
fully addressed without participation by student-affairs educators, librarians,
administrators, and students. Assessment may also involve individuals from beyond the
campus (alumni/ae, trustees, employers) whose experience can enrich the sense of
appropriate aims and standards for learning. Thus understood, assessment is not a task for
small groups of experts but a collaborative activity; its aim is wider, better-informed
attention to student learning by all parties with a stake in its improvement.
7. Assessment makes a difference when it begins with issues of use and illuminates
questions that people really care about. Assessment recognizes the value of
information in the process of improvement. But to be useful, information must be
connected to issues or questions that people really care about. This implies assessment
approaches that produce evidence that relevant parties will find credible, suggestive, and
applicable to decisions that need to be made. It means thinking in advance about how the
information will be used, and by whom. The point of assessment is not to gather data and
return "results"; it is a process that starts with the questions of decision-makers, that
involves them in the gathering and interpreting of data, and that informs and helps guide
continuous improvement.
8. Assessment is most likely to lead to improvement when it is part of a larger set of
conditions that promote change. Assessment alone changes little. Its greatest
contribution comes on campuses where the quality of teaching and learning is visibly
valued and worked at. On such campuses, the push to improve educational performance
is a visible and primary goal of leadership; improving the quality of undergraduate
education is central to the institution's planning, budgeting, and personnel decisions. On
such campuses, information about learning outcomes is seen as an integral part of
decision making, and avidly sought.
9. Through assessment, educators meet responsibilities to students and to the public.
There is a compelling public stake in education. As educators, we have a responsibility to
the publics that support or depend on us to provide information about the ways in which
our students meet goals and expectations. But that responsibility goes beyond the
reporting of such information; our deeper obligation -- to ourselves, our students, and
society -- is to improve. Those to whom educators are accountable have a corresponding
obligation to support such attempts at improvement.

Authors: Alexander W. Astin; Trudy W. Banta; K. Patricia Cross; Elaine El-Khawas; Peter T. Ewell; Pat Hutchings;
Theodore J. Marchese; Kay M. McClenney; Marcia Mentkowski; Margaret A. Miller; E. Thomas Moran; Barbara D.
Wright
This document was developed under the auspices of the AAHE Assessment Forum with support from the Fund for
the Improvement of Postsecondary Education with additional support for publication and dissemination from the
Exxon Education Foundation. Copies may be made without restriction.

Appendix 1D Closing the Loop by Tom Angelo

Seven Common (Mis)Perceptions About Outcomes Assessment


1. We're doing just fine without it. (Assessment is medicine only for the sick.)
2. We're already doing it. (Assessment is just old wine in new bottles.)
3. We're far too busy to do it. (Assessment is an administrivial burden.)
4. The most important things we do can't/shouldn't be measured. (Assessment is too reductive and
quantitative.)
5. We'd need more staff and lots more money to do assessment. (Assessment is too complex and
expensive.)
6. They'll use the results against us. (Assessment is a trick or a Trojan horse.)
7. No one will care about or use what we find out. (Assessment is a waste of time.)

Seven Reasonable Responses to Those (Mis)Perceptions


1. We're doing just fine without it.
Okay, then let's use assessment to find out what works, and to help us document and build on
our successes.
2. We're already doing it.
Okay, then let's audit all the assessments we already do to discover what we know and what we
don't.
3. We're far too busy to do it.
Okay, but since we're already doing it, let's use assessment to see where and how we can save
time and effort.
4. The most important things we do can't/shouldn't be measured.
And not everything measurable should be measured, but let's see if we can agree on how we can
tell when we're succeeding in these most important things.
5. We'd need more staff and lots more money to do assessment.
Since we're unlikely to get more resources, how, what, and where can we piggyback, embed,
and substitute?
6. They'll use the results against us.
They might. So, let's build in strong safeguards against misuse before we agree to assess.
7. No one will care about or use what we find out.
To avoid that, let's agree not to do any assessments without a firm commitment from
stakeholders to use the results.

Seven Transformative Guidelines for Using Assessment to Improve Teaching and Learning
1. Build shared trust. Begin by lowering social and interpersonal barriers to change.
2. Build shared motivation. Collectively determine goals worth working toward and problems
worth solving, and consider the likely costs and benefits.
3. Build a shared language. Develop a collective understanding of new concepts (mental models)
needed for transformation.
4. Design backward and work forward. Design backward from the shared vision and long-term
goals to develop coherent outcomes, strategies, and activities.
5. Think and act systematically. Understand the advantages and limitations of the larger system(s)
within which we operate and seek connections and applications to those larger worlds.
6. Practice what we preach. Use what we have learned about individual and organizational learning
to inform and explain our efforts and strategies.
7. Don't assume, ask. Make the implicit explicit. Use assessment to focus on what matters most.

Appendix 1E Five Myths of Assessment by David Clemens, Monterey Peninsula College

As my campus management moves inexorably toward Learning Outcomes, Outcomes Based Education (OBE), and Assessment rubrics, I realize that there are five major faculty objections, none of which has been adequately addressed. OBE, at one time or another, has appealed to both the Right and the Left. Its genesis was during the first Bush administration, while its current adherents are more likely to be social utopians or professional administrators. The Right saw OBE as a means to accountability, productivity, and particularizing standards. The Left saw OBE as an engine for social change, attitude engineering, and infusing ideology into curriculum. That's why the OBE vocabulary is such a loopy conflation of edu-babble, computer jargon, and therapy-speak. I summarize the five objections below for the benefit of other faculty facing this latest, management-driven, educational fad. The quotations are all assertions made by our Learning Outcomes Task Force.

I. "Assessment rubrics and learning outcomes will not affect teacher evaluation."
Nonsense. No one can make such a guarantee. Assessment schemes and expected outcomes are easily adapted for use on teacher evaluation forms. For example, "How well did the teacher explain your class's learning outcomes?" And "How well did the teacher's learning activities facilitate class and college learning outcomes?" Maybe not this year or next, but learning outcomes are a technocrat's idea of education: flow charts, graph paper, and scores.

II. "Assessment does not intrude on your classroom."
Of course it does, in the most fundamental way. Every competent teacher has goals and grading criteria for his or her classes; many of us have used such schemes as writing Instructional Objectives or setting Cognitive and Affective Domain goals. Where Assessment intrudes is by insisting that all learning is observable and measurable. This may be true in skill development or performance courses (nursing or cello), but it is clearly false in humanities or art courses. There, as one Joseph Conrad character said of those who travel to Africa, "The changes take place inside, you know." How does a student come to realize that Mozart is better than Britney Spears or Michelangelo better than Thomas Kinkade? No chart of the measurable and observable will tell you, yet it surely involves learning.

III. "Learning outcomes do not compromise academic freedom."
Academic freedom is a complex issue, but basically its practice insures that students will be exposed to various, academically legitimate yet contradictory ideas. That is, they will be drawn into the Great Conversation, not simply inoculated with a currently prevailing orthodoxy. Uniformity of input is anathema to academic freedom; uniformity of outcome is inhuman. After over 30 years teaching, I still have no idea what any individual student will get out of a class.

IV. "All students can succeed."
This premise is idealistic but misguided. The only way to insure equal outcomes is to water down standards. All students must have equal opportunity, but each student is a unique and complex individual. The reasons for success or failure cannot be teased apart from the mysteries of personality and talent.

V. "There should be unanimous learning outcomes for the whole college."
Impossible as well as undesirable, and most disturbing when espousing nebulous, therapeutic or value-charged goals. One teacher may prize collaboration while another values self-reliance. One favors Globalism while another favors Globalization. One teacher is Green, another is Libertarian. This is as it should be. You simply can't have a college commitment both to diversity and to unanimity. That's hypocrisy. In college education (as in science), respectful, learned disagreement is an essential part of the process. OBE is also behaviorist, Skinnerian, concerned solely with INPUT and OUTPUT, ignoring what happens in between. Deep learning is private, invisible, and frequently ineffable. Often it is dangerous, upsetting, and unpredictable. You can't put it on the Internet, and you can't turn it into a PowerPoint magic lantern show. What I find is that OBE and Learning Outcomes and Assessment are not about education at all; they are about control. Nothing is more seductive to ideologues and to management than the prospect of creating a meaningless jargon and data storm to justify or conceal whatever they do. Where does it end? As William S. Burroughs said, ". . . control can never be a means to any practical end . . . It can never be a means to anything but more control . . ." (133).

Work Cited
Burroughs, William S. Naked Lunch. New York: Grove Press, 1956. Reissue edition 1992.
David Clemens has taught, part time and full time, at Monterey Peninsula College since 1971. He
has published in New Directions in Teaching, San Francisco Chronicle, Teaching English in the
Two Year Colleges, San Jose Mercury, New Morning, and Informal Logic. He was for ten years a
Contributing Editor of Media and Methods, and he serves on the Science and Academic Board of
the Foundation for Research in Accelerating Change.

Appendix 2 Activity #3: Writing Student Learning Outcomes
Review the first example. Then for the second course objective, complete the Performance Context, Measurable Objective, and Primary Traits.
Finally, select an objective from a course in your discipline and construct the three-part SLO statement.

The worksheet has four columns: Course Objective, Performance Context, Measurable Objective, and Grading Criteria / Primary Traits.

Example (completed):
Course Objective: Match the various types of sheet metal welding methods to the appropriate application.
Performance Context: Given specifications and materials requiring a weld,
Measurable Objective: evaluate the performance needs and match the welding method to the required application.
Grading Criteria / Primary Traits: Welds should have a quality edge joint, meet design specifications, have an evenly positioned weld bead with good penetration, and have the minimum heat-affected zone to maximize strength of the weld.

Second course objective (complete the remaining columns):
Course Objective: Demonstrate and develop correct keyboarding techniques applicable to keyboarding by touch for speed and accuracy.
Performance Context:
Measurable Objective:
Grading Criteria / Primary Traits:

Appendix 3 Developing and Applying Rubrics
Mary Allen, CSU Institute for Teaching & Learning, mallen@calstate.edu

Scoring rubrics are explicit schemes for classifying products or behaviors into categories that
vary along a continuum. They can be used to classify virtually any product or behavior, such as
essays, research reports, portfolios, works of art, recitals, oral presentations, performances, and
group activities. Judgments can be self-assessments by students; or judgments can be made by
others, such as faculty, other students, fieldwork supervisors, and external reviewers. Rubrics can
be used to provide formative feedback to students, to grade students, and/or to assess programs.

There are two major types of scoring rubrics:


Holistic scoring - one global, holistic score for a product or behavior
Analytic rubrics - separate, holistic scoring of specified characteristics of a product or
behavior

Holistic Rubric for Assessing Student Essays


Inadequate: The essay has at least one serious weakness. It may be unfocused, underdeveloped, or rambling. Problems with the use of language seriously interfere with the reader's ability to understand what is being communicated.

Developing Competence: The essay may be somewhat unfocused, underdeveloped, or rambling, but it does have some coherence. Problems with the use of language occasionally interfere with the reader's ability to understand what is being communicated.

Acceptable: The essay is generally focused and contains some development of ideas, but the discussion may be simplistic or repetitive. The language lacks syntactic complexity and may contain occasional grammatical errors, but the reader is able to understand what is being communicated.

Sophisticated: The essay is focused and clearly organized, and it shows depth of development. The language is precise and shows syntactic variety, and ideas are clearly communicated to the reader.

Analytic Rubric for Peer Assessment of Team Project Members


Project Contributions
  Below Expectation: Made few substantive contributions to the team's final product.
  Good: Contributed a "fair share" of substance to the team's final product.
  Exceptional: Contributed considerable substance to the team's final product.

Leadership
  Below Expectation: Rarely or never exercised leadership.
  Good: Accepted a "fair share" of leadership responsibilities.
  Exceptional: Routinely provided excellent leadership.

Collaboration
  Below Expectation: Undermined group discussions or often failed to participate.
  Good: Respected others' opinions and contributed to the group's discussion.
  Exceptional: Respected others' opinions and made major contributions to the group's discussion.

Online Rubrics
For links to online rubrics, go to http://www.calstate.edu/acadaff/sloa/. Many rubrics have
been created for use in K-12 education, and they can be adapted for higher education. It's often
easier to adapt a rubric that has already been created than to start from scratch.

Rubrics have many strengths:


Complex products or behaviors can be examined efficiently.
Developing a rubric helps to precisely define faculty expectations.
Rubrics are criterion-referenced, rather than norm-referenced. Raters ask, "Did the student
meet the criteria for level 5 of the scoring rubric?" rather than "How well did this student do
compared to other students?"
Ratings can be done by students to assess their own work, or they can be done by others, e.g.,
peers, fieldwork supervisors, or faculty.

Rubrics can be useful for grading, as well as assessment.

Analytic Rubric for Grading Oral Presentations


Organization
  Below Expectation (0-2): No apparent organization. Evidence is not used to support assertions.
  Satisfactory (3-5): The presentation has a focus and provides some evidence which supports conclusions.
  Exemplary (6-8): The presentation is carefully organized and provides convincing evidence to support conclusions.

Content
  Below Expectation (0-2): The content is inaccurate or overly general. Listeners are unlikely to learn anything or may be misled.
  Satisfactory (5-7): The content is generally accurate, but incomplete. Listeners may learn some isolated facts, but they are unlikely to gain new insights about the topic.
  Exemplary (10-13): The content is accurate and complete. Listeners are likely to gain new insights about the topic.

Style
  Below Expectation (0-2): The speaker appears anxious and uncomfortable, and reads notes rather than speaks. Listeners are largely ignored.
  Satisfactory (3-6): The speaker is generally relaxed and comfortable, but too often relies on notes. Listeners are sometimes ignored or misunderstood.
  Exemplary (7-9): The speaker is relaxed and comfortable, speaks without undue reliance on notes, and interacts effectively with listeners.

Total Score

Suggestions for Using Rubrics in Courses
1. Hand out the grading rubric with the assignment so students will know your expectations
and how they'll be graded. This should help students master your learning objectives by
guiding their work in appropriate directions.
2. Use a rubric for grading student work and return the rubric with the grading on it. Faculty
save time writing extensive comments; they just circle or highlight relevant segments of the
rubric. Some faculty include room for additional comments on the rubric page, either within
each section or at the end.
3. Develop a rubric with your students for an assignment or group project. Students can then
monitor themselves and their peers using agreed-upon criteria that they helped develop.
Many faculty find that students will create higher standards for themselves than faculty
would impose on them.
4. Have students apply your rubric to some sample products before they create their own.
Faculty report that students are quite accurate when doing this, and this process should help
them evaluate their own products as they are being developed. The ability to evaluate, edit,
and improve draft documents is an important skill.
5. Have students exchange paper drafts and give peer feedback using the rubric, then give
students a few days before the final drafts are turned in to you. You might also require that
they turn in the draft and scored rubric with their final paper.
6. Have students self-assess their products using the grading rubric and hand in the self-
assessment with the product; then faculty and students can compare self- and faculty-
generated evaluations.

Sometimes a generic rubric can be used, and it can be refined as raters become more
experienced or as problems emerge.

Generic Rubric for Assessing Portfolios


Each learning objective (Learning Objective 1, Learning Objective 2, Learning Objective 3, and so on) is rated on the following scale:

  Unacceptable: Evidence that the student has mastered this objective is not provided, unconvincing, or very incomplete.
  Marginal: Evidence that the student has mastered this objective is provided, but it is weak or incomplete.
  Acceptable: Evidence shows that the student has generally attained this objective.
  Exceptional: Evidence demonstrates that the student has mastered this objective at a high level.

Steps for Creating a Rubric
1. Identify what you are assessing, e.g., critical thinking.
2. Identify the characteristics of what you are assessing, e.g., appropriate use of evidence,
recognition of logical fallacies.
3. Describe the best work you could expect using these characteristics. This describes the top category.
4. Describe the worst acceptable product using these characteristics. This describes the lowest
acceptable category.
5. Describe an unacceptable product. This describes the lowest category.
6. Develop descriptions of intermediate-level products and assign them to intermediate categories. You
might decide to develop a scale with five levels (e.g., unacceptable, marginal, acceptable, competent,
outstanding), three levels (e.g., novice, competent, exemplary), or any other set that is meaningful.
7. Ask colleagues who were not involved in the rubric's development to apply it to some
products or behaviors and revise as needed to eliminate ambiguities. (A brief illustration of one
way to capture a draft rubric follows these steps.)
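As a purely illustrative aid, not part of the original workshop materials, the short Python sketch below shows one way a drafted analytic rubric could be captured as a simple data structure so colleagues can review the traits and level descriptions side by side. The trait names and descriptors are invented examples for a hypothetical critical thinking rubric.

from typing import Dict

# A hypothetical three-level analytic rubric, keyed first by trait (characteristic)
# and then by performance level. All names and descriptors are illustrative only.
critical_thinking_rubric: Dict[str, Dict[str, str]] = {
    "Use of evidence": {
        "Exemplary": "Claims are consistently supported by relevant, analyzed evidence.",
        "Competent": "Most claims are supported, though some evidence is thin or unanalyzed.",
        "Novice": "Claims are largely unsupported or rest on irrelevant evidence.",
    },
    "Recognition of logical fallacies": {
        "Exemplary": "Identifies and corrects fallacies in sources and in the writer's own argument.",
        "Competent": "Identifies obvious fallacies but misses subtler ones.",
        "Novice": "Does not distinguish sound reasoning from fallacious reasoning.",
    },
}

# Print the draft so reviewers can check each trait and its level descriptions.
for trait, levels in critical_thinking_rubric.items():
    print(trait)
    for level, descriptor in levels.items():
        print(f"  {level}: {descriptor}")

Keeping the draft in one structured place makes it easier to revise wording after colleagues pilot the rubric on sample products.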

Group Readings
Rubrics can be applied by one person, but group readings can be very effective because they bring faculty
together to analyze and discuss student learning. If data are aggregated as results come in, the group
reading can end with a discussion of what the results mean, who needs to know the results, what
responses might be reasonable (e.g., curricula, pedagogy, or support changes), and how the assessment
process itself could be improved.
Who should be invited to group readings?
Faculty and others (e.g., graduate students, fieldwork supervisors, community professionals),
especially those who control and offer the curriculum and who can make valid, informed
judgments about student learning.

Managing Group Readings


1. If the reliability of the rubric is known to be high, it may be reasonable to have only one reader
analyze each document, but it generally is preferable to use two readers so that inter-rater
reliability can be examined and discrepancies can be identified and resolved.
2. When two readers work independently, the second reader should not be allowed to peek at the first
rater's judgments. Readers often are curious about others' opinions, but no harm is done if the
first rater's scores are hidden until after the second opinions have been recorded.
3. Sometimes, results are monitored as they are turned in, and documents are given to a third reader
when necessary to resolve discrepancies. For example, the facilitator may send any document
that has a scorer difference of more than one point to a third reader who determines which rating
is more accurate (a small sketch of this routing step follows this list).
4. Sometimes readers work in pairs, independently rating each document, then jointly resolving all
disagreements. They may be asked to discuss only the ratings that differ by some amount, such
as at least two units.
5. When two raters disagree, faculty must decide which rating will be used in the analysis, or they
may decide to use both. Whatever the decision, the project report should document how data
were generated.
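As a hedged illustration of the third-reader routing described in item 3 (not part of the original handout), the Python sketch below flags any double-read paper whose two independent ratings differ by more than one point on a four-point rubric. The paper identifiers and scores are invented.

# Invented example data: each paper's score from two independent readers.
first_read = {"paper01": 4, "paper02": 2, "paper03": 3, "paper04": 1}
second_read = {"paper01": 3, "paper02": 4, "paper03": 3, "paper04": 2}

# Flag papers whose two ratings differ by more than one point for a third reader.
needs_third_read = [
    paper
    for paper in first_read
    if abs(first_read[paper] - second_read[paper]) > 1
]

print("Send to a third reader:", needs_third_read)  # ['paper02']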

Scoring Rubric Group Orientation and Calibration

1. Describe the purpose for the review, stressing how it fits into program assessment plans.
Explain that the purpose is to assess the program, not individual students or faculty, and
describe ethical guidelines, including respect for confidentiality and privacy.
2. Describe the nature of the products that will be reviewed, briefly summarizing how they were
obtained.
3. Describe the scoring rubric and its categories. Explain how it was developed.
4. Explain that readers should rate each dimension of an analytic rubric separately, and they should
apply the criteria without concern for how often each category is used.
5. Give each reviewer a copy of several student products that are exemplars of different levels of
performance. Include, if possible, a weak product, an intermediate-level product, a strong product,
and a product that appears to be particularly difficult to judge. Ask each volunteer to independently
apply the rubric to each of these products, and show them how to record their ratings.
6. Once everyone is done, collect everyone's ratings and display them so everyone can see the degree
of agreement. This is often done on a blackboard, with each person in turn
announcing his/her ratings as they are entered on the board. Alternatively, the facilitator
could ask raters to raise their hands when their rating category is announced, making the
extent of agreement very clear to everyone and making it very easy to identify raters
who routinely give unusually high or low ratings.
7. Guide the group in a discussion of their ratings. There will be differences, and this discussion is
important to establish standards. Attempt to reach consensus on the most appropriate rating for each
of the products being examined by inviting people who gave different ratings to explain their
judgments. Usually consensus is possible, but sometimes a split decision is developed, e.g., the
group may agree that a product is a "3-4" split because it has elements of both categories. Expect
more discussion time if you include a hard-to-score example, but be aware that its inclusion will
save everyone grief later because such documents are bound to occur. You might allow the group to
revise the rubric to clarify its use, but avoid allowing the group to drift away from the learning
objective being assessed.
8. Once the group is comfortable with the recording form and the rubric, distribute the products and
begin the data collection.
9. If you accumulate data as they come in and can easily present a summary to the group at the end of
the reading, you might end the meeting with a discussion of four questions:
a. What do the results mean?
b. Who needs to know the results?
c. What are the implications of the results for curriculum, pedagogy, or student support
services?
d. How might the assessment process, itself, be improved?
10. It can be useful to set up a spreadsheet to calculate means, frequencies, and reliability (a small
scripted equivalent is sketched below). Then discuss what the scores mean for your curriculum,
teaching methods, students, etc. Report the inter-rater reliability: the percentage of ratings that
agree within ± 1 point, within ± 2 points, etc.
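The sketch below is illustrative rather than part of the original packet; it shows one way the spreadsheet step in item 10 might be scripted in Python, computing each reader's mean score and the share of double-read papers whose ratings agree exactly and within one point. The rating lists are invented.

# Invented ratings from two readers who scored the same set of papers on a 4-point rubric.
reader_a = [4, 3, 2, 4, 1, 3, 2, 4]
reader_b = [3, 3, 1, 4, 2, 2, 2, 3]

n = len(reader_a)
mean_a = sum(reader_a) / n
mean_b = sum(reader_b) / n

# Simple inter-rater agreement: exact matches and matches within one point.
exact = sum(1 for a, b in zip(reader_a, reader_b) if a == b)
within_one = sum(1 for a, b in zip(reader_a, reader_b) if abs(a - b) <= 1)

print(f"Mean score, reader A: {mean_a:.2f}; reader B: {mean_b:.2f}")
print(f"Exact agreement: {exact / n:.0%}")
print(f"Agreement within 1 point: {within_one / n:.0%}")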

Appendix 4 Examples of Scoring Rubrics

Map Rubric is a scoring tool for the Online Map Creation web site (www.aquarius.geomar.de/omc).

Grading Standards: Written Work for The Living Environment BIOL 111 is a rubric for writing in
Biology (A-F scales with definitions) at Southern Illinois University, Edwardsville
(http://www.siue.edu/~deder/grstand.html)

Student Participation Assessment and Evaluation is a rubric with 4-point scales: frequently,
occasionally, seldom, almost never, used at Southern Illinois University, Edwardsville
(http://www.siue.edu/~deder/partrub.html)

Assessing Modeling Projects in Calculus and Precalculus by C. E. Emenaker of the University of


Cincinnati gives a math project problem with two scoring rubrics
(http://www.maa.org/saum/maanotes49/116.html)

Scientific Report Rubric and Collaboration Rubric developed for the Cabrillo Tidepool Study
(http://edweb.sdsu.edu/triton/tidepoolunit/Rubrics/reportrubric.html and
http://edweb.sdsu.edu/triton/tidepoolunit/Rubrics/collrubric.html)

Rubric For Evaluating Web Sites originally developed by John Pilgrim, Horace Mann Academic Middle
School, San Francisco (http://edtech.sandi.net/rubric/)

Secondary Assessment Tools is a web site with links to several dozen simple Performance Assessment
rubrics (http://www.bcps.org/offices/lis/models/tips/assess_sec.html)

Rubric for E-Zine Pursuit is a rubric for evaluating an electronic, web-based magazine or journal
(http://www.esc20.net/etprojects/formats/webquests/summer99/northside/ezine/rubric.html)

Course Embedded Assessment by Larry Kelley, Executive Director of Institutional Effectiveness &
Planning at University of Louisiana Monroe, lkelley@ulm.edu . This workshop presentation material
describes the process of embedding program assessment in courses, gives outlines of several program
assessment plans, describes the use of rubrics, and gives examples of rubrics for Written
Communication Skills, Oral Communication Skills, Problem Solving Skills, and Basic Information
Technology Skills.

Student Learning Outcomes in the California State University is a web site that gives links to about 50
scoring rubrics (http://www.calstate.edu/AcadAff/SLOA/links/rubrics.shtml). Examples include the
Scoring Guide for the CSU English Placement Test (EPT) and CSU Fresno rubrics on Critical Thinking,
Integrative Science, and Writing.

Appendix 5A Activity #4: Building a Rubric
Select or write an SLO, identify Primary Traits, and then decide on observables for each assessment level

SLO Statement:

Trait Excellent Satisfactory Unsatisfactory Score

Total:

Appendix 5B Scoring Rubric English Department Modesto Junior College
Development
  Excellent (Markedly Exceptional): A comprehensive grasp of the subject matter is demonstrated. Body is developed with original, insightful, and creative support; the paper goes beyond repeating what others have said and contributes something new to our understanding of the topic. Focus is clear, imaginative and fully realized. Demonstrates specific attention to relationship between audience and purpose.
  Superior (Clearly Above Average): A thorough grasp of the subject matter is demonstrated. Focus is clear and thoughtful. Body is generally supported by facts, examples, etc., though support will not be as varied or vivid as in an excellent paper. Demonstrates understanding of audience and purpose, though may occasionally stray from it.
  Satisfactory (Fully Competent): A basic grasp of the subject matter is demonstrated. Focus is generally adequate but may not be immediately clear to all readers. Response to the assignment is generally adequate. Body supported by facts, examples, details, but they are mainly surface oriented and generalized. Demonstrates only some understanding of audience and purpose.
  Poor (Marginally Acceptable): A lack of familiarity with the subject matter is demonstrated. Focus is vague, either too general, too narrow, superficial, or indirect. Body supported by few examples or facts; many examples are unanalyzed. Demonstrates poor understanding of audience and purpose.
  Failing (Unacceptable): A basic lack of understanding of the subject matter is demonstrated. Focus is not evident. Body largely unsupported by relevant facts or examples. Demonstrates no understanding of audience/purpose.

Organization
  Excellent (Markedly Exceptional): Clear, logical, and inventive organization of ideas in relation to one another and to the essay's focus. Highly effective introduction and conclusion. Appropriate and smooth transitions between paragraphs.
  Superior (Clearly Above Average): Clear and logical organization of ideas in relation to one another and to the focus. Appropriate introduction and conclusion. Appropriate and smooth transitions between paragraphs and between most sentences.
  Satisfactory (Fully Competent): Ideas generally related to one another and to the focus, but may have some unrelated material. Adequate introduction and conclusion. Some attempt at transitions.
  Poor (Marginally Acceptable): Unclear ordering of ideas; organization not readily apparent. Underdeveloped or inappropriate introduction and conclusion. Transitions are lacking.
  Failing (Unacceptable): Minimal organization; inappropriate or no paragraphing. Ineffective or missing introduction and conclusion. Minimal or no use of transitions.

Style/Voice
  Excellent (Markedly Exceptional): Engaging and individualized voice appropriate to the audience/purpose. Consistency of tone/voice. Refreshing and revealing word choice. Varied and interesting sentence structure.
  Superior (Clearly Above Average): Voice appropriate to the audience/purpose, though it may be somewhat generic or predictable in places. Consistency of tone/voice. Interesting and varied word choice. Some creative sentence variety.
  Satisfactory (Fully Competent): Voice adequate to audience/purpose, but often predictable. May be slight inconsistencies of tone. Predictable word choice; low range of synonyms employed. Sentences mechanically sound but lacking in variety.
  Poor (Marginally Acceptable): Voice generally hard to characterize because of frequent mechanical problems. Phrasing problems and garbled sentence structure noticeable in several places. Overall lack of control/confidence of writing voice.
  Failing (Unacceptable): Voice/style not possible due to severe mechanical problems.

Mechanics
  Excellent (Markedly Exceptional): Full variety of sentence structures used correctly. Accurate and precise diction and phrasing. Very few grammatical and punctuation errors.
  Superior (Clearly Above Average): Variety of sentence structure used correctly. Accurate diction and phrasing. Infrequent grammatical and mechanical errors that rarely disrupt flow or clarity.
  Satisfactory (Fully Competent): Frequent sentence structure problems. Diction/phrasing often inaccurate. Frequent and varied grammatical, punctuation, and mechanical errors that interfere with clarity.
  Poor (Marginally Acceptable): Sentences often simplistic or incoherent. Frequent misuse of common words and phrases. Many major grammatical, punctuation, and mechanical errors that interfere with some readers' understanding of the text.
  Failing (Unacceptable): Simplistic or incoherent sentences outweigh intelligible sentences. Diction often inaccurate or severely limited vocabulary. Mechanical errors predominate.

Appendix 5C Activity #5: Using a Grading Rubric and Norming the Results
Use the rubric in Appendix 5B to evaluate this sample student writing assignment.
Compare your evaluation with that of a colleague.

English 101: Freshman Composition

Essay #2: Taking a Stand: High School Exit Exams


Due date for peer edit: 10/13 (must be at least 3 pages and typed)
Final draft due 10/15 (Please turn in with rough draft and editing sheet)

Your Task:
Using the articles and editorials provided to you as resources, write a persuasive argument either for or
against the exams.

Paper must:
-Have a clear, underlined thesis explaining which side you support
-Explain/summarize the issue at hand
-Give at least two reasons for your stance
-Explain at least one aspect of the other side's argument
-Use at least two quotes from provided sources

Remember:
-Use MLA format
-Cite all quotes, ideas that are not your own, and statistics
-Include Works Cited list at the end of essay

You may also consider:


-Your own personal experience or the experience of someone you know who is in high school/works at a
high school

English 101

10-13-03

Essay #2: Taking a Stand: High School Exit Exam

High school is stressful considering issue such as: peer pressure, the struggle of passing classes,

and trying to maintain a high Grade Point Average. Most students are desperately trying to keep

themselves a float in this society of raging waters. They feel they cannot handle anything else. For many

of them can hardly carry what they have already. Now students have one more burden to carry and that is

the high school exit exam.

Learning contains many key principles, but the most basic of all is desire. The students have to

have a passion to learn. Many argue this exam hurts the underprivileged such as minorities and low-

income families, but is this true (Burke, Exit Exam B5). The greatest hindrance that keeps most

students from learning is problems of drugs, alcohol, and domestic violence. These problems are found

both in the homes of the rich, the poor and almost of any ethnic background. There isn't any good reason

why a student, that doesn't have any disabilities or language barriers, should have problems learning.

Students that have such problems concerning language barriers and disabilities should be provided

programs that will steadily prepare them for the exit exam. High School students should be made to take

the exit exam to make sure they are progressing, and that they have basic skills to survive life, to get good

jobs or to pursue careers. There shouldn't be a student left behind, not knowing their basic skills of

reading, writing, and math.

During high school, the greatest amount of progression should be made above any other time in

grade school, and this can only be done with the help of our school system. What is done between grades

9th to 12th does matter, for whatever they learn between these grades they will probably carry with them

for the rest of their lives. The teachers should help the students progress by fully explaining what goals

they want the students to meet. They should push the student to think: always getting them involved in

every class project and discussion. There needs to be an interaction between teachers and students. The

class should never look bored and stagnant. There is a great need for open communication between

teachers and students. Students should be able to come to the teacher if they have any trouble with the

assignment or any other issues pertaining to any of their educational needs.

If they are planning on giving an exam; that test high school students' abilities; the schools should

fully prepare the teachers and the students. Teachers should be made to teach all the materials that will be

on the exams year around for the full four years of high school. Students should be tested every year, so

they can see where they need to progress for the upcoming year. This will be helpful to both the teacher

and the student. CNN Student News, center director Jack Jennings said, "You have to provide a system to

help kids succeed…These test are a good idea if they're done right" (Jennings qtd. In "States stick"). We

cannot just drop an exam on students' laps and expect that they take it if we don't fully prepare them. No

part of the exam should be a mystery to them; it should all be review. Students, on the other hand, should

be made accountable for what they learn. They should study often. This exam is supposed to test what

they have learned during these past four years of their lives. If we go about this the right way, this exam

should be like any other test for the student.

This exam should be taken so that the student will have the basic skills to survive life. Everyday,

if we realize it or not, we are surrounded by writing, reading, and mathematics. For example, anytime we

go to the store we use math, whether it is for calculating 30% off of item on sale or giving and receiving

money from the cahier. Another example is the ability to read or write, and its important usage for the

voter in an election. Its importance is beyond our reasoning, for we really have to know what we are

reading, when it has to do with drafting in different laws. Everyday we are surrounded by these

obscurities that call for basic skills, skills that may look non useful, but one-day students will need.

Once students graduate from high school, that's when life really begins. They will most likely use

all they learned in high school, in college and even after that in the work place. All students will need

these basic skills of reading, writing, and math in their jobs and also in whatever career they decide to

pursue. The whole point of the exam is to encourage students to progress, so they won't feel lost and

confused, when they graduate and try to find a job or seek a profession.

The high school exit exam shouldn't even be a debate, if it's just basic material that high school

students should already know. David Cooper, director of secondary education for Modesto City Schools,

said students may take the test up to eight times, and most will eventually pass (qtd. In Herendeen,

Students Cheer A1). Students shouldnt eventually understand the material; they should know the

material (qtd. In Herendeen, Students Cheer: A1). The reason why taking the high school exit exam is

an issue is because they don't already know the basic material, which will be sooner or later in life, be put

before them. We need to go back to the basics, and make sure that math, reading, and writing are being

taught before any other materials. These basics need to be priority, and any other extra curricular subject,

secondary. The only way we can make sure students are being taught, is to test their abilities. We need to

strive together as a people and make sure students are learning. We want students to leave high school

knowing they have progressed, that they have learned something of great value. They should feel

confident when they get out of high school. They should have the ability and opportunity to survive in

life, get a good job, and pursue the career of their dreams. It is our responsibility to make sure they have

their feet planted on solid ground, ready to go out in this world and make a difference.

Works Cited

Burke, Frank. Letter. The Modesto Bee 28 June 2003: B5

Herendeen Susan. Letter. The Modesto Bee 10 July 2003: A1

States stick with high-school exit exam. CNN Student News 20 Aug. 2003. 12. Oct.

2003

<http://www.cnn.com/2003/EDUCATION/08/13/high.school.exams.ap

Appendix 6 The Case for Authentic Assessment by Grant Wiggins
(http://www.ericfacility.net/databases/ERIC_Digests/ed328611.html )
WHAT IS AUTHENTIC ASSESSMENT?
Assessment is authentic when we directly examine student performance on worthy intellectual tasks.
Traditional assessment, by contrast, relies on indirect or proxy 'items'--efficient, simplistic substitutes from which
we think valid inferences can be made about the student's performance at those valued challenges.
Do we want to evaluate student problem-posing and problem-solving in mathematics? experimental research in
science? speaking, listening, and facilitating a discussion? doing document-based historical inquiry? thoroughly
revising a piece of imaginative writing until it "works" for the reader? Then let our assessment be built out of such
exemplary intellectual challenges.
Further comparisons with traditional standardized tests will help to clarify what "authenticity" means when
considering assessment design and use:
Authentic assessments require students to be effective performers with acquired knowledge. Traditional tests
tend to reveal only whether the student can recognize, recall or "plug in" what was learned out of context. This
may be as problematic as inferring driving or teaching ability from written tests alone. (Note, therefore, that the
debate is not "either-or": there may well be virtue in an array of local and state assessment instruments as befits
the purpose of the measurement.)
Authentic assessments present the student with the full array of tasks that mirror the priorities and challenges
found in the best instructional activities: conducting research; writing, revising and discussing papers; providing
an engaging oral analysis of a recent political event; collaborating with others on a debate, etc. Conventional
tests are usually limited to paper-and-pencil, one-answer questions.
Authentic assessments attend to whether the student can craft polished, thorough and justifiable answers,
performances or products. Conventional tests typically only ask the student to select or write correct responses--
irrespective of reasons. (There is rarely an adequate opportunity to plan, revise and substantiate responses on
typical tests, even when there are open-ended questions). As a result,
Authentic assessment achieves validity and reliability by emphasizing and standardizing the appropriate criteria
for scoring such (varied) products; traditional testing standardizes objective "items" and, hence, the (one) right
answer for each.
"Test validity" should depend in part upon whether the test simulates real-world "tests" of ability. Validity on
most multiple-choice tests is determined merely by matching items to the curriculum content (or through
sophisticated correlations with other test results).
Authentic tasks involve "ill-structured" challenges and roles that help students rehearse for the complex
ambiguities of the "game" of adult and professional life. Traditional tests are more like drills, assessing static
and too-often arbitrarily discrete or simplistic elements of those activities.
Beyond these technical considerations the move to reform assessment is based upon the premise that
assessment should primarily support the needs of learners. Thus, secretive tests composed of proxy items and
scores that have no obvious meaning or usefulness undermine teachers' ability to improve instruction and students'
ability to improve their performance. We rehearse for and teach to authentic tests--think of music and military
training--without compromising validity.
The best tests always teach students and teachers alike the kind of work that most matters; they are enabling and
forward-looking, not just reflective of prior teaching. In many colleges and all professional settings the essential
challenges are known in advance--the upcoming report, recital, Board presentation, legal case, book to write, etc.
Traditional tests, by requiring complete secrecy for their validity, make it difficult for teachers and students to
rehearse and gain the confidence that comes from knowing their performance obligations. (A known challenge also
makes it possible to hold all students to higher standards).
WHY DO WE NEED TO INVEST IN THESE LABOR-INTENSIVE FORMS OF ASSESSMENT?
While multiple-choice tests can be valid indicators or predictors of academic performance, too often our tests
mislead students and teachers about the kinds of work that should be mastered. Norms are not standards; items are
not real problems; right answers are not rationales.
What most defenders of traditional tests fail to see is that it is the form, not the content of the test that is harmful
to learning; demonstrations of the technical validity of standardized tests should not be the issue in the assessment
reform debate. Students come to believe that learning is cramming; teachers come to believe that tests are after-the-

fact, imposed nuisances composed of contrived questions--irrelevant to their intent and success. Both parties are led
to believe that right answers matter more than habits of mind and the justification of one's approach and results.
A move toward more authentic tasks and outcomes thus improves teaching and learning: students have greater
clarity about their obligations (and are asked to master more engaging tasks), and teachers can come to believe that
assessment results are both meaningful and useful for improving instruction.
If our aim is merely to monitor performance then conventional testing is probably adequate. If our aim is to
improve performance across the board then the tests must be composed of exemplary tasks, criteria and standards.
WON'T AUTHENTIC ASSESSMENT BE TOO EXPENSIVE AND TIME-CONSUMING?
The costs are deceptive: while the scoring of judgment-based tasks seems expensive when compared to
multiple-choice tests (about $2 per student vs. 1 cent) the gains to teacher professional development, local
assessing, and student learning are many. As states like California and New York have found (with their writing and
hands-on science tests) significant improvements occur locally in the teaching and assessing of writing and science
when teachers become involved and invested in the scoring process.
If costs prove prohibitive, sampling may well be the appropriate response--the strategy employed in California,
Vermont and Connecticut in their new performance and portfolio assessment projects. Whether through a sampling
of many writing genres, where each student gets one prompt only; or through sampling a small number of all
student papers and school-wide portfolios; or through assessing only a small sample of students, valuable
information is gained at a minimum cost. And what have we gained by failing to adequately assess all the capacities
and outcomes we profess to value simply because it is time- consuming, expensive, or labor-intensive? Most other
countries routinely ask students to respond orally and in writing on their major tests--the same countries that
outperform us on international comparisons. Money, time and training are routinely set aside to insure that
assessment is of high quality. They also correctly assume that high standards depend on the quality of day-to-day
local assessment--further offsetting the apparent high cost of training teachers to score student work in regional or
national assessments.
WILL THE PUBLIC HAVE ANY FAITH IN THE OBJECTIVITY AND RELIABILITY OF JUDGMENT-BASED
SCORES?
We forget that numerous state and national testing programs with a high degree of credibility and integrity have
for many years operated using human judges:
the New York Regents exams, parts of which have included essay questions since their inception--and which
are scored locally (while audited by the state);
the Advanced Placement program which uses open-ended questions and tasks, including not only essays on
most tests but the performance-based tests in the Art Portfolio and Foreign Language exams;
state-wide writing assessments in two dozen states where model papers, training of readers, papers read "blind"
and procedures to prevent bias and drift gain adequate reliability;
the National Assessment of Educational Progress (NAEP), the Congressionally-mandated assessment, uses
numerous open-ended test questions and writing prompts (and successfully piloted a hands-on test of science
performance);
newly-mandated performance-based and portfolio-based state-wide testing in Arizona, California, Connecticut,
Kentucky, Maryland, and New York.
Though the scoring of standardized tests is not subject to significant error, the procedure by which items are
chosen, and the manner in which norms or cut-scores are established is often quite subjective--and typically
immune from public scrutiny and oversight.
Genuine accountability does not avoid human judgment. We monitor and improve judgment through training
sessions, model performances used as exemplars, audit and oversight policies as well as through such basic
procedures as having disinterested judges review student work "blind" to the name or experience of the student--as
occurs routinely throughout the professional, athletic and artistic worlds in the judging of performance.
Authentic assessment also has the advantage of providing parents and community members with directly
observable products and understandable evidence concerning their students' performance; the quality of student
work is more discernible to laypersons than when we must rely on translations of talk about stanines and
renorming.
Ultimately, as the researcher Lauren Resnick has put it, "What you assess is what you get; if you don't test it you
won't get it." To improve student performance we must recognize that essential intellectual abilities are falling
through the cracks of conventional testing.
Appendix 7 -- State and National Standards, Academic & Vocational Competencies

American Welding Society
550 NW LeJeune Road, Miami, FL 33126
Phone: (800) 443-9353  Fax: (305) 443-7559
Email: lizett@aws.org  URL: www.aws.org

Welding Codes & Standards (www.aws.org/cgi-bin/shop)
AWS is recognized worldwide for the development of consensus-based American National Standards: over 170 standards, issued as codes, recommended practices, guides and specifications. Certification is offered in seven different welding processes:
Shielded Metal Arc Welding (SMAW)
Gas Metal Arc Welding (GMAW)
Gas Metal Arc Welding - Short Circuit (GMAW-S)
Flux Cored Arc Welding (FCAW)
Gas Tungsten Arc Welding (GTAW)
Submerged Arc Welding (SAW)
Brazing

National Business Education Association
1914 Association Drive, Reston, VA 20191
Phone: 703-860-8300  Fax: 703-620-4483
Email: nbea@nbea.org  URL: www.nbea.org

Business Education Standards (www.nbea.org/curfbes.html)
Using the concepts described in these standards, business teachers introduce students to the basics of personal finance, the decision-making techniques needed to be wise consumers, the economic principles of an increasingly international marketplace, and the processes by which businesses operate. In addition, these standards provide a solid educational foundation for students who want to successfully complete college programs in various business disciplines. Standards are provided in these areas: Accounting, Business Law, Career Development, Communication, Computation, Economics & Personal Finance, Entrepreneurship, Information Technology, International Business, Management, and Marketing.

Board of Registered Nursing
400 R St., Suite 4030, Sacramento, CA 94244
Phone: 916.322.3350  Fax: 916.327.4402
Email: brnappdesk@dca.ca.gov  URL: www.rn.ca.gov

TITLE 16. Professional and Vocational Regulations
1443.5. Standards of Competent Performance (http://www.calnurse.org/cna/np/brn/standard.html)
A registered nurse shall be considered to be competent when he/she consistently demonstrates the ability to transfer scientific knowledge from social, biological and physical sciences in applying the nursing process, as follows:
(1) Formulates a nursing diagnosis through observation of the client's physical condition and behavior, and
through interpretation of information obtained from the client and others, including the health team.
(2) Formulates a care plan, in collaboration with the client, which ensures that direct and indirect nursing
care services provide for the client's safety, comfort, hygiene, and protection, and for disease prevention
and restorative measures.
(3) Performs skills essential to the kind of nursing action to be taken, explains the health treatment to the
client and family and teaches the client and family how to care for the client's health needs.
(4) Delegates tasks to subordinates based on the legal scopes of practice of the subordinates and on the
preparation and capability needed in the tasks to be delegated, and effectively supervises nursing care
being given by subordinates.
(5) Evaluates the effectiveness of the care plan through observation of the client's physical condition and
behavior, signs and symptoms of illness, and reactions to treatment and through communication with the
client and health team members, and modifies the plan as needed.
(6) Acts as the client's advocate, as circumstances require, by initiating action to improve health care or to
change decisions or activities which are against the interests or wishes of the client, and by giving the
client the opportunity to make informed decisions about health care before it is provided.

American Psychological Association
750 First Street, NE, Washington, DC 20002
Phone: 800-374-2721  Fax: 202-336-6123
Email: ppo@apa.org  URL: www.apa.org

Knowledge, Skills, and Values Consistent with the Science and Application of Psychology and with Liberal Arts Education that are Further Developed in Psychology (www.apa.org/ed/pcue/taskforcereport2.pdf)
In this document we provide details for 10 suggested goals and related learning outcomes for the undergraduate psychology major. These Undergraduate Psychology Learning Goals and Outcomes represent what the Task Force considers to be reasonable departmental expectations for the psychology major in United States' institutions of higher education.
Goal 1. Knowledge Base of Psychology
Students will demonstrate familiarity with the major concepts, theoretical perspectives,
empirical findings, and historical trends in psychology.
Goal 2. Research Methods in Psychology
Students will understand and apply basic research methods in psychology, including research design, data
analysis, and interpretation.
Goal 3. Critical Thinking Skills in Psychology
Students will respect and use critical and creative thinking, skeptical inquiry, and, when possible, the
scientific approach to solve problems related to behavior and mental processes.
Goal 4. Application of Psychology
Students will understand and apply psychological principles to personal, social, and organizational issues.

Goal 5. Values in Psychology
Students will be able to weigh evidence, tolerate ambiguity, act ethically, and reflect other values that are
the underpinnings of psychology as a discipline.
Goal 6. Information and Technological Literacy
Students will demonstrate information competence and the ability to use computers and other technology
for many purposes.
Goal 7. Communication Skills
Students will be able to communicate effectively in a variety of formats.
Goal 8. Sociocultural and International Awareness
Students will recognize, understand, and respect the complexity of sociocultural and
international diversity.
Goal 9. Personal Development
Students will develop insight into their own and others' behavior and mental processes and apply effective
strategies for self-management and self-improvement.
Goal 10. Career Planning and Development
Students will emerge from the major with realistic ideas about how to implement their psychological
knowledge, skills, and values in occupational pursuits in a variety of settings.

Association of College and Research Libraries
Email: library@ala.org
URL: www.acrl.org

Information Literacy Standards (www.acrl.org/infolit)
Information literacy is defined and its relationship to technology, higher education, and pedagogy is discussed. Each of the five standards comes with detailed performance indicators.
Standard One. The information literate student determines the nature and extent of the information
needed.
Standard Two. The information literate student accesses needed information effectively and
efficiently.
Standard Three. The information literate student evaluates information and its sources critically
and incorporates selected information into his or her knowledge base and value system.
Standard Four. The information literate student, individually or as a member of a group, uses
information effectively to accomplish a specific purpose.
Standard Five. The information literate student understands many of the economic, legal, and
social issues surrounding the use of information and accesses and uses information
ethically and legally.

A Hierarchy of Postsecondary Outcomes, from Defining and Assessing Learning: Exploring Competency-Based Initiatives, a report of the National Postsecondary Education Cooperative Working Group on Competency-Based Initiatives in Postsecondary Education published by the National Center for Educational Statistics, September 2002 (http://nces.ed.gov/pubs2002/2002159.pdf)

Appendix 8 Assessment Report Example #1xiv

SLO Results: English 371, Literature & the Visual Arts, Raymond Walters College

Course Objective: Compare and contrast the text and film versions of a literary work.

Desired SLO: After viewing an assigned film based on a literary text, write a review of the film. Include an appraisal of the director's selection and effective translation of content from the literary text and the dominant tone the director seems to be trying to achieve, supporting each statement with detail from the text and film and your personal reaction to the cited scenes.

Grading rubric, by trait:
Plot: 4 points = accurate plot review; 3 points = accurate plot review; 2 points = minor inaccuracies of plot; 1 point = glaring plot inaccuracies.
Text Analysis: 4 points = analysis of text beyond literal interpretation; 3 points = analysis of text beyond literal interpretation; 2 points = analysis of text includes literal interpretation; 1 point = literal analysis.
Supporting Statements: 4 points = support with specific details from text/film; 3 points = weak support with specific details from film; 2 points = few specific details as support; 1 point = no specific details as support.
Personal Reactions: 4 points = personal evaluation based on analysis; 3 points = personal evaluation not based on analysis; 2 points = little personal evaluation; 1 point = no personal evaluation.

Number of students scoring at each point level, by film reviewed (4 / 3 / 2 / 1 points):
Film #1: 10 / 7 / 3 / 1
Film #2: 11 / 7 / 3 / 0
Film #3: 10 / 8 / 3 / 1
Film #4: 12 / 5 / 5 / 1
Film #5: 13 / 5 / 3 / 0
Film #6: 6 / 8 / 6 / 1
Film #7: 9 / 7 / 7 / 5
Instructor Analysis: I handed out the trait scale to students on the first day of class, but I am not sure they consulted it; upon my inquiring
whether they had a copy near the end of the course, few students were able to locate it in their notebooks. This taught me that I should refer
to the scale more explicitly in class. I anticipated that it would be easy for students to give an analysis but difficult for them to identify
concrete support for their ideas. However, I discovered that students found it easier to point to specific places in the movies that gave them
ideas than to articulate those ideas. Therefore, I will revise the scale for the next course to reflect the relative challenges of these skills.
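
A small script such as the following (a hypothetical sketch, not part of the Raymond Walters report) shows one way an instructor might tabulate the counts above to see which film assignments produced the weakest reviews:

# Hypothetical sketch: summarize the rubric score counts reported above.
# Each tuple is (number of students scoring 4, 3, 2, 1 points).
counts = {
    "Film #1": (10, 7, 3, 1),
    "Film #2": (11, 7, 3, 0),
    "Film #3": (10, 8, 3, 1),
    "Film #4": (12, 5, 5, 1),
    "Film #5": (13, 5, 3, 0),
    "Film #6": (6, 8, 6, 1),
    "Film #7": (9, 7, 7, 5),
}

for film, (four, three, two, one) in counts.items():
    total = four + three + two + one
    meeting = four + three  # students scoring 3 or 4 points
    print(f"{film}: {meeting}/{total} ({100 * meeting / total:.0f}%) scored 3 or higher")

Run against the table above, Films #6 and #7 stand out (about 67% and 57% of students at 3 points or higher, versus roughly 74 to 86 percent for the other films), the kind of pattern worth a closer look when revising the assignment or the scale.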

Assessment Report Example #2xiv


PARKLAND COLLEGE ACADEMIC ASSESSMENT (Excerpts)
DEPARTMENT: Fine and Applied Arts   PROGRAM: Mass Communication

Methods: Pre/Post Tests, Capstone exam/project, Primary Trait Analysis, Course Embedded Test, Standardized Exams, Professional Certification, Portfolios, Performance Assessment, Other
Indirect Assessment Measures: Transfer/Employment Data, Grad Surveys/Interviews, Employer/Faculty Surveys

Intended Outcome 1 (Objective): Students will demonstrate proficiency in employable Mass Communication skills.
Assessment Criteria & Methods (Expected Results): Students will demonstrate desired mass communication competencies as shown by individual portfolios, when assessed by representatives from industry, as reported on the MC Portfolio Evaluation form.
Actual Results: Written comments from industry representatives indicate that the MC student's portfolio assessment ranked 4 (on a scale of 1 to 5, five being the highest score). Suggestions were to include more Web site graphics in the curriculum.
Analysis & Action: Desktop Graphics program revised to include more experience in Web site graphics. Students designed graphics for the current MC home page and links.

Intended Outcome 2 (Objective): Students will demonstrate learning the basic concepts necessary to perform satisfactorily in Mass Communications entry-level jobs.
Assessment Criteria & Methods (Expected Results): When surveyed using the Parkland College Student Occupational Follow-Up Survey, graduates will describe satisfaction with their Mass Communication knowledge to recall, analyze, evaluate, and utilize basic concepts.
Actual Results: Feedback from employers and students strongly indicated that the Visual Arts program option had become obsolete; preference is given to graduates with Desktop Publishing skills.
Analysis & Action: Visual Arts program option shelved.

Intended Outcome 3 (Objective): Students in the Mass Communication A.A. program will have the knowledge to successfully complete a Bachelor's degree in Mass Communication.
Assessment Criteria & Methods (Expected Results): Four-year institutions will report a 75 percent acceptance rate into Mass Communication programs.
Actual Results: The U of I Coordinator of Transfer Articulation reported that out of 29 applicants from other schools to Graphics, a Mass Com student was the only admit.
Analysis & Action: Continue to gather/monitor data. Investigate how many Parkland Graphics students applied.

Assessment Report Example #3
Mesa Community College Results from Student Learning Outcomes Assessment Spring 2002 and 2003
Communication
Outcome Statements:
1. Write a clear, well-organized paper using documentation and quantitative tools when appropriate.
2. Construct and deliver a clear, well-organized verbal presentation.
Results:
Written: The mean score for the post-group was significantly higher overall and on the scales for content, organization and mechanics/style. When each skill is considered separately, students showed relative strength in stating their own position, addressing the prompt, using appropriate voice and style and sentence structure. Students have consistently rated below the overall average on acknowledging the opposing position, developing each point with appropriate detail and commentary, progressing logically and smoothly, and using transitions and orienting statements.
Oral: Significant differences between beginning students and completing students were shown in the total percentage correct for the assessment overall and for each of the subscales: knowledge about effective interpersonal interchanges, small group interaction and conducting oral presentations.

Numeracy
Outcome Statements:
1. Identify and extract relevant data from given mathematical situations.
2. Select known models or develop appropriate models that organize the data into tables or spreadsheets, graphical representations, symbolic/equation format.
3. Obtain correct mathematical results and state those results with the qualifiers.
4. Use the results.
Results:
The average percent correct was significantly higher for the post-group overall and for outcomes related to identifying and extracting relevant data, using models to organize data, obtaining results, and stating results with qualifiers. Patterns of performance have remained consistent over several years. Use of models is the strongest area and use of results is the weakest area.

Scientific Inquiry
Outcome Statements:
Demonstrate scientific inquiry skills related to:
1. Hypothesis: Distinguish between possible and improbable or impossible reasons for a problem.
2. Prediction: Distinguish between predictions that are logical or not logical based upon a problem presented.
3. Assumption: Recognize justifiable and necessary assumptions based on information presented.
4. Interpretation: Weigh evidence and decide if generalizations or conclusions based upon given data are warranted.
5. Evaluation: Distinguish between probable and improbable causes, possible and impossible reasons, and effective and ineffective action based on information presented.
Results:
There was no significant difference in the average percent correct between groups in the 2002 administration; however, significant differences were noted, overall, in prior years. Students have been most successful in recognizing possible reasons for a problem. Making a conclusion based upon information presented has had the lowest percent correct for the past three years of administration.

Problem Solving/Critical Thinking
Outcome Statements:
1. Identify a problem or argument.
2. Isolate facts related to the problem.
3. Differentiate facts from opinions or emotional responses.
4. Ascertain the author's conclusion.
5. Generate multiple solutions to the problem.
6. Predict consequences.
7. Use evidence or sound reasoning to justify a position.
Results:
The average total score was significantly higher for the post-group (completing), overall and for two subscales: Interpretation and Evaluation of Arguments. The post-group score was at the 45th percentile when compared to a national sample. Average student scores have been consistently highest for the Interpretation and Evaluation of Arguments sections and lowest for Inference.

Arts & Humanities
Outcome Statements:
1. Demonstrate knowledge of human creations.
2. Demonstrate awareness that different contexts and/or worldviews produce different human creations.
3. Demonstrate an understanding and awareness of the impact that a piece (artifact) has on the relationship and perspective of the audience.
4. Demonstrate an ability to evaluate human creations.
Results:
Significant differences were observed overall and in three of four outcome areas: demonstrate an awareness that different contexts and/or world views produce different human creations; an understanding and awareness of the impact that a piece has on the relationship and perspective of the audience; an ability to evaluate human creations.

Information Literacy
Outcome Statements:
1. Given a problem, define specific information needed to solve the problem or answer the question.
2. Locate appropriate and relevant information to match informational needs.
3. Identify and use appropriate print and/or electronic information sources.
4. Evaluate information for currency, relevancy, and reliability.
5. Use information effectively.
Results:
The percent correct was significantly higher for the post-group overall and for three of five outcome areas: evaluating currency and relevance of information, identifying sources, and locating information. Students were most successful in evaluating information for currency and relevance, followed by defining information needed to solve a problem and identifying appropriate sources. Locating information was relatively more difficult. Students were least successful in using information effectively.

Cultural Diversity
Outcome Statements:
1. Identify and explain diverse cultural customs, beliefs, traditions, and lifestyles.
2. Identify and explain major cultural, historical and geographical issues that shape our perceptions.
3. Identify and explain social forces that can effect cultural change.
4. Identify biases, assumptions, and prejudices in multicultural interactions.
5. Identify ideologies, practices, and contributions that persons of diverse backgrounds bring to our multicultural world.
Results:
Students in the completing (post) group had significantly higher scores on direct measures of knowledge and on several diversity and democracy outcomes in both years. Completing students agreed more often that they have an obligation to give back to the community. In the most recent administration completing students rated themselves more highly than beginning students on having a pluralistic orientation, being able to see both sides of an issue and their own knowledge of cultures. Further, they agreed more strongly with statements that support the value of diversity, reflect tolerance for differences related to gender, and indicate that they engage in social action more often.
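
Comparisons like these rest on testing whether completing (post) students score higher than beginning (pre) students. A minimal sketch of such a comparison, assuming SciPy is available and that each group's scores are in simple lists (the numbers below are made-up placeholders, not Mesa data):

# Illustrative pre/post group comparison of the kind summarized above.
# The scores are placeholders, not actual Mesa Community College data.
from statistics import mean
from scipy.stats import ttest_ind  # independent-samples t-test

pre_scores = [62, 55, 70, 48, 66, 59, 73, 51]    # beginning students
post_scores = [78, 69, 81, 64, 75, 72, 88, 70]   # completing students

t_stat, p_value = ttest_ind(post_scores, pre_scores)
print(f"pre mean = {mean(pre_scores):.1f}, post mean = {mean(post_scores):.1f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if t_stat > 0 and p_value < 0.05:
    print("The post-group mean is significantly higher at the .05 level.")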

Parkland College Academic Program Assessment
Program: Computer Information Systems: Microcomputer Support Specialist/ Programming Specialization
Assessment Methods:
Direct Assessment Measures: Pre/Post Tests, Capstone exam/project, Primary Trait Analysis, Course Embedded Test, Standardized Exams, Professional Certification, Portfolios, Performance Assessment, Other
Indirect Assessment Measures: Focus Groups, Grad Surveys/Interviews, Employer/Faculty Surveys

Intended Outcome 1: Graduates from this program will have acquired knowledge and skills needed for entry-level positions in a variety of computer-related fields.

Assessment Criterion 1.a: When surveyed, employers of our interns will rate 80% of the students with an average of 4.5 on a scale of 1-5. The rating will be composed of 14 skill areas, each rated on a scale of 1-5.
Results 1.a.1 (Fall 2000): Two students fell under the 4.5 rating; 80% of the interns received an average score of 4.5 or higher. The weakest area was identified as "Ability to Plan," which received an average score of 4.29.
Analysis and Action 1.a.1 (Fall 2000 data analyzed in Spring 2001): This indirect measure is not providing the results anticipated. The committee proposes making changes to the survey to make it a more valuable assessment tool. In addition, information will be given to the instructors in CIS 297 (CIS Seminar) and CIS 231 (Systems Analysis, Design and Administration) to enhance course content to encourage students to strengthen their "ability to plan." A direct measure to show "ability to plan" will be included in the capstone tests given near the completion of the program. (See 1.c.)
Results 1.a.2 (Spring 2002): Five students took CIS 298: CIS Work Experiences in Spring 2002. Employers for all 5 returned surveys.
Analysis and Action 1.a.2 (Spring 2002): Students did well overall in every area. The lowest marks came in the "ability to plan" area, with ratings of 1 Excellent and 4 Good. Suggestions have been made for providing additional information in CIS 297: Seminar and CIS 231: Systems Analysis, Design and Administration.

Assessment Criterion 1.d: 90% of students will score 80% or higher on a standard capstone test to be administered near their completion of the program.
Results 1.d.1 (Fall 1999): The percentage of students giving the right answers ranged from 13% on the question that the fewest answered correctly to 87% on the question answered correctly by the most students.
Analysis and Action 1.d.1 (Fall 1999 data analyzed in Spring 2000): Faculty met and determined that the pilot instrument needed to be changed to gather more accurate results. Students seemed confused by the questionnaire and we felt the results were not valid enough.

Assessment Criterion 1.e: All students in the introductory level required courses for all CIS programs (101 and 117) will be given a set of five questions to be graded with the final exam. Students completing their final courses in CIS will be given 10 questions.
Results 1.e.1 (Fall 2000): Data was collected and reviewed for CIS 101 and CIS 117. 143 students answered questionnaires in 101 with an average score of 84%; 39 students answered questionnaires in 117 with an average score of 90%.
Results 1.e.2 (Spring 2001): Data was collected at the end of the semester for CIS 101 and CIS 117. 105 students in CIS 101 had an average score of 86%; 41 students in CIS 117 had an average score of 96%.
Analysis and Action 1.e.2 (Spring 2001): Overall scores for CIS 101 improved by 2%. The weakest question in CIS 101 was identified: 25% of students missed the question about how to save files using Save vs. Save As. Instructors were encouraged to spend more time on this topic, and the question was reworded to be easier to read for the next semester's assessment test. Overall scores for CIS 117 improved by 3%.
Results 1.e.3 (Fall 2001): Data was collected from CIS 101 and CIS 117. 118 students in CIS 101 had an average score of 86%; 38 students in CIS 117 had an average score of 98%.
Analysis and Action 1.e.3 (Fall 2001): Overall scores for CIS 101 stayed the same as the previous semester. The rewording of the question about saving indicated that fewer instructors were thoroughly teaching the concept of Save vs. the Save As command; 29% of the students answered the question about saving incorrectly. A memo was sent out to all instructors outlining what students need to learn in CIS 101 pertaining to the Save and Save As commands. Scores for CIS 117 improved by 2%.
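
Criteria such as 1.a and 1.d reduce to checking what share of students clears a threshold. A hedged sketch of that check (the function name and the intern ratings are illustrative, not Parkland data):

# Illustrative check of a threshold-style criterion such as
# "80% of interns will average 4.5 or higher on the employer survey."
def share_meeting(scores, threshold):
    """Return the fraction of scores at or above the threshold."""
    return sum(1 for s in scores if s >= threshold) / len(scores)

# Made-up employer ratings (average of 14 skill areas, 1-5 scale) for ten interns.
intern_averages = [4.8, 4.6, 4.9, 4.3, 4.7, 4.5, 4.9, 4.2, 4.6, 4.8]

share = share_meeting(intern_averages, 4.5)
print(f"{share:.0%} of interns met the 4.5 criterion; "
      f"target of 80% {'met' if share >= 0.80 else 'not met'}")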

Appendix 9 Assessment Plan Examples Internet Sites

North Carolina State University


http://www2.acs.ncsu.edu/UPA/assmt/resource.htm
Contains Comprehensive list of Links to National Assessment Forums, Manuals & Handbooks, Student
Learning, & other Institutions

California State University, Fresno


http://www.csufresno.edu/cetl/assessment/assmnt.html &
http://www.csufresno.edu/cetl/assessment/status.html
Contains Links to Program Assessment Plans

Boise State University


http://www2.boisestate.edu/iassess/outcomes/outcomes.htm
Contains Links to Program Assessment Plans organized by college

Oklahoma State University


http://www.okstate.edu/assess/assessment_plans/assessment_plans.htm
Contains Assessment method examples & Assessment Plan tips & checklist

California State University at Sacramento


http://www.csus.edu/acaf/assmnt.htm
Contains listing of program Assessment Plan links

San Jose State University


http://www.sjsu.edu/ugs/assessment/as-main.html
Contains Program Assessment Plans organized by college & a page containing links to other institutions.

Southeast Missouri State University


http://www2.semo.edu/provost/aspnhtm/busy.htm
Busy Chairperson's Guide to Assessment (Table of Contents)

Southern Illinois University


http://www.siue.edu/~deder/assess/depts.html
Contains Program Assessment Plans

Ohio University Student Learning Outcomes Assessment, 2000-2001


http://www.ohiou.edu/provost/OUTCOMES2000_2001.html

Central Michigan University


http://www.provost.cmich.edu/outcomes/
Student Learning Outcomes by College for each major

Appendix 10 Activity 8 Program SLOs from Competency Statements
Sort the following Business program competencies into three categories and then write a global program
student learning outcome for each of the three categories.

Program Competency Category


A. Analyze management theories and their application within the business environment.
B. Analyze special challenges in operations and human resource management in international
business.
C. Analyze the characteristics, motivations, and behaviors of consumers.
D. Analyze the elements of the marketing mix, their interrelationships, and how they are used
in the marketing process.
E. Analyze the influence of external factors on marketing.
F. Analyze the management functions and their implementation and integration within the
business environment.
G. Analyze the role of marketing research in decision making.
H. Apply communication strategies necessary and appropriate for effective and profitable
international business relations.
I. Apply marketing concepts to international business situations.
Explain the concepts, role, and importance of international finance and risk management.
J. Apply operations management principles and procedures to the design of an operations
plan.
K. Describe the elements, design, and purposes of a marketing plan.
L. Describe the environmental factors that define what is considered ethical business behavior
in a global business environment.
M. Describe the interrelatedness of the social, cultural, political, legal, and economic factors
that shape and impact the international business environment.
N. Describe the role of organized labor and its influence on government and business.
O. Develop personal management skills to function effectively and efficiently in a business
environment.
P. Examine the issues of managing in the global environment.
Q. Explain the role of international business; analyze how it impacts business at all levels
(including the local, state, national, and international levels).
R. Identify forms of business ownership and entrepreneurial opportunities available in
international business.
S. Recognize the customer-oriented nature of marketing and analyze the impact of marketing
activities on the individual, business, and society.
T. Relate balance of trade concepts to the import/export process.
From the National Standards for Business Education 2001 National Business Education Association, 1914 Association Dr., Reston, VA 20191.

Category #1 Title: Program SLO #1:

Category #2 Title: Program SLO #2:

Category #3 Title: Program SLO #3:

Appendix 11 Example of Program Assessment Report
West Virginia State Community and Technical College
http://fozzy.wvsc.edu/ctc/program_assesment/General%20Education%20Audit%20Grid.doc

Program Assessment Report
Program: _______________________________   Term & Year: ______________

The report form is a grid with five columns:
Methods of Assessment: strategies/techniques/instruments for collecting the feedback data that provide evidence of the extent to which objectives are reached.
Type: E = Enter, I = Intermediate, X = Exit, F = Follow-up.
Program Student Learning Outcomes (A1 through A8): check the number of the SLO assessed by the particular assessment method.
Findings/Evaluation/Conclusions: results of analysis and interpretation of the measurement data.
Recommendations for Improvement: recommended actions for improving the program.

Methods of Assessment listed on the form (with Type where indicated):
Graduate Exit Survey (X)
Employer Satisfaction Survey (F)
Exit interviews of graduates (X)
5-Year Graduate Survey (F)
Alumni survey (F)
Advisory Committee feedback (I)
Peer evaluation of teaching
Student evaluations of teachers (I)
Student internship program (I)
Faculty participation programs with industry (summer appointments)
Analysis of enrollment/graduation data
Internal reviews
External reviews (ABET Accreditation)
Peer review
Dropout and Non-Complete Rate
Community Assessment Needs
Licensure/Certification Practice Test
Faces of the Future Surveys
Student GPA

Appendix 12 General Education Student Learning Outcomes
West Virginia State Community and Technical College
http://fozzy.wvsc.edu/ctc/program_assesment/GeneralEducationCoreLearningOutcomes.htm

GENERAL EDUCATION CORE LEARNING OUTCOMES

Graduates will be able to:


Communicate articulately in speech and writing.
Think critically about issues, theory, and application.
Use effective human relationship skills to work in a diverse society.
Function effectively and positively in a team environment.
Use library print and electronic resources for literature research.
Use computational skills to solve problems, manipulate and interpret numerical data, and
communicate data in a logical manner
Employ fundamental principles of science, the scientific method of inquiry, and skills for applying
scientific knowledge to practical situations.
Use computer technology to organize, access, and communicate information.

GENERAL EDUCATION STUDENT LEARNING OUTCOMES


COCONINO COMMUNITY COLLEGE
COMMUNICATION SKILLS
Present ideas developed from diverse sources and points of view with consideration of target audience.
Demonstrate communication process through idea generation, organization, drafting, revision, editing, and presentation.
Participate in and contribute to collaborative groups.
Construct logical, coherent, well-supported arguments.
Employ syntax, usage, grammar, punctuation, terminology, and spelling appropriate to academic discipline and the professional world.
Demonstrate listening / interpretive skills in order to participate in communications and human exchange.
THINKING SKILLS
Use appropriate method of inquiry to identify, formulate, and analyze a current or historical problem/question (may include recognizing
significant components, collecting and synthesizing information, evaluating and selecting solution(s), applying and defending solution(s)).
Translate quantifiable problems into mathematical terms and solve these problems using mathematical or statistical operations.
Interpret graphical representations (such as charts, photos, artifacts) in order to draw appropriate conclusions
Recognize strengths and weaknesses in arguments
Demonstrate observational and experimental skills to use the scientific method to test hypotheses and formulate logical deductions
Understand the uses of theories and models as applied in the area of study
Develop creative thinking skills for application in problem solving
Demonstrate a working knowledge of a technological application in an area of study.
DIVERSITY AND GLOBAL PERSPECTIVE
Recognize the diversity of humanity at the local, regional and global levels
Synthesize information about needs, concerns and contributions of different cultures within society
Identify the influence of cultural and ethnic backgrounds on individual and group attitudes and values.
Link cultural perspectives, practices, and interactions with the societal and physical environment from which they arose.
Explain the importance of cross-cultural influences on physical, cultural and spiritual heritage.
Relate and explain the connections between past and present events and/or issues.
AESTHETIC PERSPECTIVE
Analyze and evaluate literary, visual, or performing arts using discipline-specific approaches and criteria.
Reflect on personal responses to aesthetic experiences.
Incorporate aesthetic reflection into discipline-specific activities.
ETHICAL AND CIVIL VALUES
Identify and assess community needs and the responsibility to balance individual and societal needs
Display responsibility and integrity in one's choices and actions
Integrate knowledge in order to establish an ethical position on an issue and defend it with logical arguments
Develop an appreciation of education and lifelong learning
Understand social values and analyze their implications for the individual, community, society, and world.
Recognize the individual's responsibility to continue the exploration of the changing world and one's role in it.

Assessment of General Education Learning Outcomes
An "Institutional Portfolio" Approach to Assessment of General Education Learning
Outcomes

What Comprises an "Institutional Portfolio"


A collection of student work ("artifacts") produced throughout the curriculum for each of six
major outcomes: Mathematics, Writing, Speaking, Culture and Ethics, Modes of Inquiry,
Problem Solving
Reviewed by faculty teams using holistic scoring criteria (rubrics)
Results are compiled, analyzed, and reported in the aggregate by the Office of Institutional
Research
Results are reported to the Faculty Assessment Committee which, in turn, reports to the
Educational Affairs Committee
Faculty acts on assessment results

Characteristics of the "Institutional Portfolio" Model


The outcomes and scoring teams are multidisciplinary thus "responsibility" rests with the
institution/faculty as a whole, rather than single departments
It is invisible to students, obviating the motivation and other significant problems with
standardized tests
It is minimally intrusive for faculty
It requires no special "sessions," no sacrifice of class time (e.g. for testing), no external incentives for students to
perform well
It is labor intensive and requires significant institutional resources (faculty release time
and/or overload pay, technical support)
It is a dynamic process
It's "messy"

Assessment Plan Logistics


Who Scores: Four-to-six person interdisciplinary faculty teams
How Scored: Individually by team members or as a group
How Many Artifacts: 100 per outcome per year
When Scored: Fall artifacts in spring; spring artifacts in fall
Who Selects Courses: Office of Institutional Research
Who Selects Artifacts: Faculty in each targeted class
Who Collects, Copies, Distributes Artifacts: Office of Institutional Research

For more information contact:


Jeff Seybert, Director, Research, Evaluation, and Instructional Development
Johnson County Community College
12345 College Boulevard
Overland Park, KS 66210-1299
(913) 469-8500 ext. 3442
jseybert@jccc.net

http://www.jccc.net/home/depts/6111/site/assmnt/cogout

Mathematics Outcome
Outcome Statements: Upon receipt of an associate degree from Johnson County Community College, a student should be able
to:
1. Identify relevant data (numerical information in mathematical or other contexts) by
a. extracting appropriate data from a problem containing extraneous data and/or
b. identifying appropriate data in a word problem.
2. Select or develop models (organized representations of numerical information, e.g., equation, table, graph) appropriate
to the problem which represent the data by
a. arranging the data into a table or spreadsheet and/or
b. creating pictorial representations (bar graphs, or pie charts, or rectangular coordinate graphs, etc.) with or
without technological assistance and/or
c. selecting or setting up an equation or formula.
3. Obtain and describe results by
a. obtaining correct mathematical results, with or without technological assistance and
b. ascribing correct units and measures to results.
4. Draw inferences from data by
a. describing a trend indicated in a chart or graph, and making predictions based on that trend and/or
b. describing the important features of data presented in a table or spreadsheet, and making predictions based on
that trend and/or
c. describing the important features of an equation or formula, and making predictions based on those features
and/or
d. making reasonable estimates when given problems involving quantities in any organized or disorganized
form and/or
e. drawing qualitative conclusions about the original situation based on the quantitative results that were
obtained.

The mathematics outcomes consist of four major outcomes, numbered 1 to 4. These major outcomes are each subdivided
into several subpoints labeled by letters. A major outcome is demonstrated when at least one subpoint has been
demonstrated, except for major outcome 3, where subpoint 3.a. must be demonstrated. A subpoint is demonstrated when at
least one instance of the subpoint has occurred, except for subpoints 3.a. (which requires at least 70 percent accuracy of the
items examined) and 3.b. (which requires at least 2 instances involving different measures).

Rubrics: The following rubric will measure the mathematics outcomes:


5 = All four major outcomes are demonstrated by the use of more than one subpoint per major outcome.
4 = All four major outcomes are demonstrated.
3 = Three major outcomes are demonstrated.
2 = Two major outcomes are demonstrated.
1 = Only one major outcome is demonstrated.
0 = No major outcomes are demonstrated.

Standards: At least 75 percent of all JCCC students earning associate degrees should obtain a score of 4 or more on the
mathematics outcomes rubric. At least 95 percent of all JCCC students earning associate degrees should obtain a score of 3
or more on the mathematics outcomes rubric.
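
The scoring rules above are mechanical enough to express as a short routine. A minimal sketch, assuming each artifact is recorded as the set of subpoints the scorer judged it to demonstrate (the data structure and function name are assumptions for illustration; the actual scoring is done by JCCC faculty teams):

# Minimal sketch of the mathematics rubric scoring described above.
# `demonstrated` holds the subpoints a scorer judged the artifact to show,
# e.g. {"1a", "2c", "3a", "4e"}; the 70%-accuracy and two-instance checks
# for subpoints 3a and 3b are assumed to happen before they are marked.
SUBPOINTS = {1: {"1a", "1b"}, 2: {"2a", "2b", "2c"},
             3: {"3a", "3b"}, 4: {"4a", "4b", "4c", "4d", "4e"}}

def math_rubric_score(demonstrated):
    met = {}
    for outcome, subs in SUBPOINTS.items():
        shown = demonstrated & subs
        # Major outcome 3 counts only if subpoint 3a is demonstrated.
        met[outcome] = ("3a" in shown) if outcome == 3 else bool(shown)
    n_met = sum(met.values())
    if n_met == 4:
        multiple = all(len(demonstrated & SUBPOINTS[o]) > 1 for o in SUBPOINTS)
        return 5 if multiple else 4
    return n_met  # 3, 2, 1, or 0 major outcomes demonstrated

print(math_rubric_score({"1a", "1b", "2a", "2b", "3a", "3b", "4a", "4c"}))  # 5
print(math_rubric_score({"1a", "2c", "3a", "4e"}))                          # 4
print(math_rubric_score({"1a", "2c", "4e"}))                                # 3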

Writing Outcome
Outcomes Statement: Upon receipt of an associate degree from Johnson County Community College, a student should be able
to write a clear, well-organized paper using documentation and quantitative tools when appropriate.

Outcome Rubrics:
6 = Essay demonstrates excellent composition skills including a clear and thought-provoking thesis, appropriate and
effective organization, lively and convincing supporting materials, effective diction and sentence skills, and perfect or
near perfect mechanics including spelling and punctuation. The writing perfectly accomplishes the objectives of the
assignment.
5 = Essay contains strong composition skills including a clear and thought-provoking thesis, although development,
diction, and sentence style may suffer minor flaws. Shows careful and acceptable use of mechanics. The writing
effectively accomplishes the goals of the assignment.
4 = Essay contains above average composition skills, including a clear, insightful thesis, although development may
be insufficient in one area and diction and style may not be consistently clear and effective. Shows competence in the
use of mechanics. Accomplishes the goals of the assignment with an overall effective approach.
3 = Essay demonstrates competent composition skills including adequate development and organization, although the
development of ideas may be trite, assumptions may be unsupported in more than one area, the thesis may not be
original, and the diction and syntax may not be clear and effective. Minimally accomplishes the goals of the
assignment.
2 = Composition skills may be flawed in either the clarity of the thesis, the development, or organization. Diction,
syntax, and mechanics may seriously affect clarity. Minimally accomplishes the majority of the goals of the
assignment.
1 = Composition skills may be flawed in two or more areas. Diction, syntax, and mechanics are excessively flawed.
Fails to accomplish the goals of the assignment.

Standards: Ten percent of students who have met the requirements for an associate degree at JCCC will earn 6 (excellent) on
each of the communication rubrics. Thirty percent of students earning an associate degree will score 5 (very good) or 6
(excellent). Eighty percent will earn scores of 4 (satisfactory) or higher and the top 98 percent will earn scores of 3 (minimal
accomplishment of educational goals) or higher. The remaining 2 percent of the associate degree recipients are expected to earn
the score of 2 (unsatisfactory) on the communication rubrics. The score of 1 represents a skill level beneath the expectation of
all associate degree recipients at JCCC. Hence, no associate degree recipients are expected to score at the level of 1 on the
communications rubrics.

Speaking Outcome
Outcome Statement: Upon receipt of an associate degree from Johnson County Community College, a student should be able
to make a clear, well-organized verbal presentation.

Rubrics:
Very good/excellent (5-6) = The communicator presents a message that is exceptionally appropriate for the purpose,
occasion, and audience with a purpose that is exceptionally clear and identifiable. The message is supported using
material that is exceptional in quality and variety. The communicator uses an exceptionally clear and coherent
organizational structure, provides a logical progression within and between ideas, and uses language that is
exceptionally clear, vivid, and appropriate. The communicator makes exceptional use of vocal variety in a
conversational mode; has exceptional articulation, pronunciation, and grammar; and demonstrates physical behaviors
that provide exceptional support for the verbal message.
Satisfactory (3-4) = The communicator presents a message that is appropriate for the purpose, occasion, and audience
with a purpose that is adequately clear and identifiable. The message is supported using material that is appropriate in
quality and variety. The communicator uses a reasonably clear and coherent organizational structure, provides a
logical progression within and between ideas, and uses language that is reasonably clear, vivid, and appropriate. The
communicator makes acceptable use of vocal variety in a conversational mode; has acceptable articulation,
pronunciation, and grammar; and demonstrates physical behaviors that provide adequate support for the verbal
message.
Unsatisfactory (1-2) = The communicator presents a message that is not appropriate for either the purpose, occasion,
or audience or is without a clear and identifiable purpose for the message. The message is supported with material that
is inappropriate in quality and variety. The communicator fails to use a clear and coherent organizational structure,
does not provide a logical progression within and between ideas, and uses unclear or inappropriate language. The
communicator fails to use vocal variety; fails to speak in a conversational mode; fails to use acceptable articulation,
pronunciation, and grammar; or fails to use physical behaviors that provide adequate support for the verbal message.

Standards: Ten percent of students who have met the requirements for an associate degree at JCCC will earn 6 (excellent) on
each of the communication rubrics. Thirty percent of students earning an associate degree will score 5 (very good) or 6
(excellent). Eighty percent will earn scores of 4 (satisfactory) or higher and the top 98 percent will earn scores of 3 (minimal
accomplishment of educational goals) or higher. The remaining 2 percent of the associate degree recipients are expected to earn
the score of 2 (unsatisfactory) on the communication rubrics. The score of 1 represents a skill level beneath the expectation of
all associate degree recipients at JCCC. Hence, no associate degree recipients are expected to score at the level of 1 on the
communications rubrics.

Culture and Ethics Outcome
Outcomes Statements: Upon receipt of an associate degree from Johnson County Community College, a student should be
able to:
1. Demonstrate a fundamental knowledge of world geography.
2. Demonstrate knowledge of the major cultural issues of a person's own culture as well as other cultures.
3. Demonstrate knowledge of major historical events affecting one's culture and other cultures.
4. Demonstrate familiarity with contemporary global issues.
5. Demonstrate an understanding of major ethical concerns.

Rubrics:
Demonstrates knowledge of world geography:
4 = Compares and contrasts geographies and their relationship to their respective cultures.
3 = Analyzes the relationship between geography and culture.
2 = Analyzes the relationship between geography and economy.
1 = Identifies major characteristics of political and natural geography.
Demonstrates knowledge of the major cultural issues of a person's own culture as well as other cultures:
4 = Compares and contrasts cultural issues affecting one's culture and other cultures.
3 = Analyzes major cultural issues.
2 = Identifies major cultural issues in other cultures.
1 = Identifies major cultural issues from one's culture.
Demonstrates knowledge of major historical events affecting one's culture and other cultures:
4 = Compares and contrasts historical events affecting one's culture and other cultures.
3 = Analyzes major historical events.
2 = Identifies major historical events in other cultures.
1 = Identifies major historical events in one's culture.
Demonstrates familiarity with contemporary global issues.
4 = Compares and contrasts the effect of global issues on cultures.
3 = Analyzes contemporary global issues.
2 = Identifies several contemporary global issues.
1 = Identifies a contemporary global issue.
Demonstrates an understanding of major ethical concerns:
4 = Develops a comprehensive, rational argument for an ethical position and describes its implications for personal
and social behavior.
3 = Analyzes an ethical issue, the pro and con positions and its consequences, and the issue's relation to other ethical
issues.
2 = Identifies the ethical dimensions of academic disciplines.
1 = Identifies a general ethical issue.

Standards: The standard of judgment is that 60 percent of the students will score 2 or higher on each outcome.

Modes of Inquiry Outcome

Outcomes Statement: Upon receipt of an associate degree from Johnson County Community College, a student should be able
to demonstrate understanding of the modes of inquiry by identifying an appropriate method of accessing credible information
and data resources; applying the selected method; and organizing results.

Rubrics: Every artifact will be evaluated to determine whether the student has demonstrated the ability to perform each rubric
item. The rubric items have been separated for modes of inquiry and for problem solving, and each artifact will be given scores
for either or both areas, as appropriate. The following rubric will measure the modes of inquiry outcomes:
1. Identifies an appropriate method of accessing credible information and data resources.
2. Applies the selected method.
3. Organizes results.
If an artifact presents evidence that a student demonstrated the ability to perform a rubric, the artifact will be given a plus (+)
score for that rubric.
If an artifact presents evidence that a student did not demonstrate the ability to perform a rubric, the artifact will be given a
minus (-) score for that rubric.

If it appears that the assignment did not present an opportunity for students to perform a rubric, the artifact will be given a zero
(0) score for that rubric. For example, this may be a result of instances where the instructor's assignment defined the
problem or method of gathering information. The subcommittee scorers should concur on those particular rubrics which
receive zeros.
Artifacts scored for Modes of Inquiry must allow the student to perform at least 2 of the 3 rubrics. Only rubrics with plus or
minus scores will be counted. A zero score is not counted and does not impact the outcome standard. It is not
necessary for the subcommittee scorers to concur on rubrics which receive plus or minus scores. The artifacts are
scored as follows:
3 = the student demonstrated the ability to perform all rubrics that the student had the opportunity to perform (3 or 2).
2 = the student was given the opportunity to perform all 3 rubrics and demonstrated the ability to perform 2 of them.
1 = the student demonstrated the ability to perform only one rubric.
0 = the student was unable to demonstrate the ability to perform any of the rubrics.

Standards: At least 80% of the Modes of Inquiry artifacts should receive a score of 3.
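
The plus/minus/zero marking just described can be sketched as a small scoring routine. The following is illustrative only, assuming each artifact's marks for the three rubric items are recorded as "+", "-", or "0":

# Sketch of the Modes of Inquiry artifact scoring described above.
# marks: one of "+", "-", or "0" for each of the three rubric items.
def modes_of_inquiry_score(marks):
    opportunities = [m for m in marks if m != "0"]   # zeros are not counted
    if len(opportunities) < 2:
        raise ValueError("Artifact must allow at least 2 of the 3 rubrics")
    plusses = opportunities.count("+")
    if plusses == len(opportunities):   # demonstrated every rubric available
        return 3
    if len(opportunities) == 3 and plusses == 2:
        return 2
    if plusses == 1:
        return 1
    return 0

print(modes_of_inquiry_score(["+", "+", "0"]))  # 3: both available rubrics shown
print(modes_of_inquiry_score(["+", "+", "-"]))  # 2
print(modes_of_inquiry_score(["+", "-", "0"]))  # 1
print(modes_of_inquiry_score(["-", "-", "-"]))  # 0

The Problem Solving outcome that follows uses the same marking scheme over four rubric items, with the analogous 0 to 4 score scale.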

Problem Solving Outcome


Outcomes Statement: Upon receipt of an associate degree from Johnson County Community College, a student should be able
to demonstrate understanding of solving problems by recognizing the problem; reviewing information about the problem;
developing plausible solutions; and evaluating results.

Rubrics: Every artifact will be evaluated to determine whether the student has demonstrated the ability to perform each rubric
item. The rubric items have been separated for modes of inquiry and for problem solving, and each artifact will be given scores
for either or both areas, as appropriate. The following rubric will measure the problem-solving outcomes:
1. Recognizes the problem.
2. Reviews information about the problem.
3. Develops plausible solutions.
4. Evaluates results.
If an artifact presents evidence that a student demonstrated the ability to perform a rubric, the artifact will be given a plus (+)
score for that rubric. If an artifact presents evidence that a student did not demonstrate the ability to perform a rubric, the
artifact will be given a minus (-) score for that rubric. If it appears that the assignment did not present an opportunity for
students to perform a rubric, the artifact will be given a zero (0) score for that rubric. For example, this may be a result of
instances where the instructor's assignment defined the problem or method of gathering information. The subcommittee
scorers should concur on those particular rubrics which receive zeros.
Artifacts scored for Problem Solving must allow the student to perform at least 3 of the 4 rubrics. Only rubrics with plus or
minus scores will be counted. A zero score is not counted and does not impact the outcome standard. It is not necessary for
the subcommittee scorers to concur on rubrics which receive plus or minus scores. The artifacts are scored as follows:
4 = the student demonstrated the ability to perform all 4 rubrics.
3 = the student demonstrated the ability to perform 3 rubrics.
2 = the student was given the opportunity to perform 3 rubrics and demonstrated the ability to perform 2 of them.
1 = the student was given the opportunity to either perform 4 rubrics and demonstrated the ability to perform 1 or 2 of
them or perform 3 rubrics and demonstrated the ability to perform only 1 rubric.
0 = the student was unable to demonstrate the ability to perform any of the rubrics.

Standards: At least 80% of the Problem Solving artifacts should receive a score of 3.

West Virginia State Community and Technical College
GENERAL EDUCATION CORE-AUDIT GRID
B. GENERAL EDUCATION LEARNING OUTCOMES

[The core-audit grid is a matrix. Its rows are the eight general education learning outcomes (B.1 through B.8) listed at the beginning of this appendix. Its columns are the general education courses, listed in the required sequence for student progression through the program and grouped by area (Written and/or Oral Communications, Mathematics, Natural Science, Social Science, and Information Skills): COLL 101; ENGL 101, 102, 112, 160, and 204; BST 104, 230, and 240; COMM 100; MATH 100, 101, 102, and 121; CHEM 101 and 130; PHYS 103, 110, 120, 170, 191 & 203, and 201 & 203; BIO 101, 102, and 210; HUM 101; SOCL 101; POSC 100 and 101; PSYC 151; HIST 207 and 208; ECON 201 and 202; ET 112; CS 106; and ITEC 101. Each cell codes how the course treats the outcome using the legend below; the cell-level codes are not reproduced here and can be found in the source document linked in Appendix 11.]
I = Introduces   E = Emphasizes   R = Reinforces   A = Applies
Introduces: Student is not familiar with the content or skill. Instruction concentrates on introducing students to the content area or skill.
Emphasizes: Student should have brought basic content or skill to the course. Instruction concentrates on enhancing content, strengthening skill, and adding new content material, building more complex skills based on entrance competency.
Reinforces: Student brings reasonable knowledge/content/skill/competency to the situation as a result of content or skill being taught and/or emphasized at some previous point in their educational career. Instructional activity continues to teach and build upon previous competency and reinforces content or skill competency.
Applies: Student has knowledge/content/skill/competency as a result of content or skill being taught and/or emphasized at some previous point in their educational career. Instructional activity applies a previously taught and/or emphasized content or skill.
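
Stored as data, an audit grid like this can be queried to find where each outcome is introduced, emphasized, reinforced, or applied. The sketch below is illustrative only; the course-to-code assignments are placeholders, not West Virginia State's actual grid entries.

# Illustrative audit-grid structure: outcome -> {course: level}, using the
# I/E/R/A legend above. The course assignments are placeholders only.
audit_grid = {
    "B.1 Communicate articulately in speech and writing": {
        "ENGL 101": "I", "ENGL 102": "E", "COMM 100": "E", "HIST 207": "A",
    },
    "B.6 Use computational skills to solve problems": {
        "MATH 101": "I", "MATH 121": "E", "CHEM 101": "R", "ECON 201": "A",
    },
}

def courses_at_level(grid, outcome, level):
    """List the courses that treat an outcome at a given level (I, E, R, or A)."""
    return [course for course, code in grid[outcome].items() if code == level]

print(courses_at_level(audit_grid,
                       "B.1 Communicate articulately in speech and writing", "E"))
# ['ENGL 102', 'COMM 100']

A representation like this also makes it easy to flag outcomes that no course introduces or applies.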

Appendix 13. Resources and References for Student Learning Outcomes Assessment

Good Practices
An Assessment Manifesto by College of DuPage (IL) is an excellent values statement.
9 Principles of Good Practice for Assessing Student Learning by the American
Association of Higher Education are the foundational principles of assessment.
Palomar College Statement of Principles on Assessment is a succinct two-page summary of
assessment and how it is done at Palomar College (CA).
Closing the Loop -- seven misperceptions of SLOs and responses to each by Tom Angelo.
Five Myths of Assessment by David Clement, Monterey Peninsula College, published in
Inside English (Spring 2003), the newsletter of the English Council for California Two-Year
Colleges (www.ecctyc.org). Expresses concern that SLOs will affect faculty evaluation,
intrude on the classroom, diminish academic freedom, and lead to standards that are watered
down and blandly uniform.
The Case for Authentic Assessment by Grant Wiggins, presented at the California Assessment
Institute. The paper addresses several questions: What is authentic assessment? Why do we
need to invest in these labor-intensive forms of assessment? Won't authentic assessment be
too expensive and time-consuming? Will the public have any faith in the objectivity and
reliability of judgment-based scores?
Is Accreditation Accountable? The Continuing Conversation Between Accreditation and the
Federal Government by the Council for Higher Education Accreditation (2003) provides a
thorough discussion of the tensions between the federal government's call for accountability
for student learning and traditional process-based peer review accreditation methods.

Establishing the Student Learning Outcomes Process


Assessment Plan/Progress Report by Isothermal Community College (NC) explains the SLO
process well.
Developing an Assessment Plan to Learn about Student Learning by Peggy Maki of AAHE
gives a tabular Assessment Guide which covers general steps in setting up a student
learning outcome assessment process.
Methods of Assessment of Student Learning classifies SLO methods as Direct, Indirect and
Outputs.
AssessmentAn Institution-Wide Process to Improve and Support Student Learning by
College of DuPage (IL) is a handbook which lays out the student learning outcomes process
and roles in general terms.
Defining and Assessing Learning: Exploring Competency-Based Initiatives a report of the
National Postsecondary Education Cooperative Working Group on Competency-Based
Initiatives in Postsecondary Education published by the National Center for Educational
Statistics, September 2002. Section 4 on Principles of Strong Practice is particularly useful,
giving twelve principles clustered in four areas: planning for competency-based education
initiatives; selecting assessment methods; creating and ensuring that learning experiences
lead to competencies; and reviewing assessment results to identify changes needed to
strengthen student learning. The report concludes with eight case studies; of particular note
are those of Sinclair Community College (OH) which has a flourishing competency-based
initiative that guarantees competencies of graduates and Hagerstown Community College
(MD) which uses a career transcript listing specific competencies.
Assessment at the Program Level by Trudy H. Bers. Notable features: 1) summarizes ten
approaches to program assessment, 2) discusses challenges to implementation, 3) describes
good practices at six community colleges.
Narratives of Faculty Experiences with Student Learning Outcomes
Using Rubrics by Michelle Christopherson of the Modesto (CA) Junior College English
Department
Course-Based Assessment in English at Riverside Community College by Arend Flick
Course Level Assessment Currently Being Used: Why Turn Towards Them? by Lisa Brewster
of the San Diego Miramar College Speech Department
Does Assessment of Student Learning Outcomes Make a Difference? One Department's
Experience by Jerry Rudmann, Irvine Valley College

Program Assessment
Displaying Sociological Imagination at College of DuPage (IL) gives process and results for
assessing sociology (and shows the need for inter-rater reliability).
Guide to Outcomes Assessment of Student Learning at CSU Fresno is a how to guide.
Undergraduate Program Assessment Plan for Anthropology at CSU Fresno. Methods: pre/post
test, writing rubric, embedded exam questions, student survey.
Outcomes Assessment Plan for Math at CSU San Bernardino. Method: embedded exam
questions.
Outcomes Assessment Status Report for Communications at CSU San Bernardino. Methods:
portfolio, intern job performance.
Outcomes Assessment Status Report for Nursing at CSU San Bernardino. Methods: clinical
supervisor evaluations, exit survey (commercial vendor), embedded full tests (commercial
vendor).
Parkland College Academic Program Assessments: 1) Theatre, Methods: performance
assessment and grad surveys; 2) Accounting, Methods: college-produced end-of-course
exams and performance assessment, grad surveys.
The Geneva College (PA) Program Guide has a good example of a program audit.
North Carolina State's document Data for Program Outcomes Assessment gives
Engineering program competencies and assessment.

General Education Assessment


In its General Education Assessment Pilot Project, Coconino Community College (AZ) wrote
general education learning outcomes and identified which courses covered them. The report
also describes how the college gave CAAP exams in reading and writing, with a SWOT analysis.
In its Assessment Plan/Progress Report, Isothermal Community College (NC) established
student learning outcomes in 1) Communications (reading, writing, speaking, listening), 2)
Information Literacy, 3) Problem Solving, 4) Interpersonal Skills, 5) Quantitative Skills, and
6) Cognitive Skills. (MJC Institute Exercise: Write observables for these SLOs.) Each of the
Isothermal GE skills areas has a rubric with a 1-4 scale (but without observables for each level,
except for Quantitative Skills).
Summary of Two Years of CAAP Assessment at College of DuPage (IL) gives comparisons to
national norms on six tests (writing, reading, math, critical thinking, science reasoning, essay).
Students self-reported on progress in three other general education areas: understanding and
appreciating culture, understanding and appreciating the environment, and developing a system of
personal values.
Benchmarks for Core Skills at Palomar College (CA) gives six general education competency
sets: Communication, Cognition, Information Competency, Social Interaction, Aesthetic
Responsiveness, Personal Development and Responsibility. The document has
Demonstrated Competencies in three categories: Beginner, Developing, and Accomplished.
General Education Core-Audit Grid from the University of West Virginia Community and
Technical College. Each of the college's eight general education skills is identified on a
matrix that lists all courses within the five categories of GE courses, coding the level of
mastery as I for Introduces, E for Emphasizes, R for Reinforces, or A for Applies.
Assessment of General Education Learning Outcomes: An Institutional Portfolio Approach to
Assessment of General Education Learning Outcomes, Johnson County Community
College. This document defines the institutional portfolio, gives the logistics of
implementation, and then lists six GE outcome statements, each with detailed competencies,
rubrics and standards.
Summary of Results from Student Outcomes Assessment - Spring 2002 and 2003, Mesa (AZ)
Community College Office of Research and Planning. Mesa CC uses a student test sampling
approach to SLO assessment. This document details its GE outcome statements in seven
areas and summarizes the testing results.

Writing Measurable Outcomes


The Geneva College (PA) Program Guide has good examples of writing measurable
outcomes.
The Assessment Primer by the FLAG Project stresses deep learning by connecting
Curriculum, Instruction, and Assessment (CIA). It is particularly strong on matching goals with
assessment tools.
Learning Outcomes: Learning Achieved by the End of a Course or Program: Knowledge,
Skills, Attitudes by Shirley Lesch, George Brown Toronto City College. The ABCs of
learning outcomes in nine easy-to-read pages.

Tools of Assessment
Overview
Advantages and Disadvantages of Assessment Techniques by Barbara Wright (8/15/02,
presented at a California Assessment Institute workshop). Covers the pluses and minuses of
portfolios, capstone courses and projects, performance assessments, embedded assessment,
classroom research and assessment, locally developed tests, and commercial standardized tests.
Rubrics: How-To Guides
The Use of Scoring Rubrics for Assessment and Teaching by Mary Allen of CSU's Institute
for Teaching and Learning is a three-page summary of what rubrics are, how to create them,
and how to use them. An example on the assessment of oral presentations is included. She also
has a six-page version entitled Developing and Applying Rubrics, which has considerably
more detail.
Primary Trait Analysis: Anchoring Assessment in the Classroom by Ruth Benander, Janice
Denton, Deborah Page and Charlotte Skinner, Raymond Walters College (OH), from JGE:
The Journal of General Education, Vol. 49, No. 4, 2000.
Rubrics: Examples
Map Rubric is a scoring tool for the Online Map Creation web site
(www.aquarius.geomar.de/omc).
Grading Standards: Written Work for The Living Environment BIOL 111 is a rubric for
writing in Biology (A-F scale with definitions) at Southern Illinois University.
Student Participation Assessment and Evaluation is a rubric with a 4-point scale (frequently,
occasionally, seldom, almost never) used at Southern Illinois University.
Assessing Modeling Projects in Calculus and Precalculus by C. E. Emenaker of the University
of Cincinnati gives a math project problem with two scoring rubrics: analytic and holistic.

Scientific Report Rubric and Collaboration Rubric developed for the Cabrillo Tidepool
Study.
Rubric for Evaluating Web Sites, originally developed by John Pilgrim, Horace Mann
Academic Middle School, San Francisco.
Secondary Assessment Tools is a web site with links to several dozen simple Performance
Assessment rubrics (http://www.bcps.org/offices/lis/models/tips/assess_sec.html)
Student Learning Outcomes in the California State University is a web site that gives links to
about 50 scoring rubrics (http://www.calstate.edu/AcadAff/SLOA/links/rubrics.shtml).
Examples include the Scoring Guide for the CSU English Placement Test (EPT) and CSU
Fresno rubrics on Critical Thinking, Integrative Science, and Writing.
Portfolios
Individual Student Tracking Project gives a brief explanation of what portfolios are and how
to use them. From Palomar College (CA).
Classroom Assessment Techniques
Classroom Assessment: A Manual for Faculty Developers by the National Council for Staff,
Program and Organizational Development (NCSPOD) is a step-by-step manual for putting
on a CATs workshop.
Embedded Assessment
The Journal of Chemical Education produces a Chemical Concepts Inventory, a 22-question,
nationally normed multiple-choice test on basic chemistry concepts.
The Field-tested Learning Assessment Guide (FLAG Project) has good examples (problems,
tests, surveys) in science, math, engineering, and technology (a copy of this list is provided).
Course Embedded Assessment Process developed by Larry Kelley at the University of Louisiana
at Monroe gives a summary of course-embedded assessment, provides examples of ten program
assessment plans, lays out the basics of rubrics, and includes several rubric templates.

Local California Community College Training and Resource Materials


Modesto Junior College (CA) held a training institute in the summer of 2003 for 36 faculty and
staff entitled Measuring Student Learning Outcomes. The institute document includes activities for
writing measurable objectives, writing student learning outcomes starting with existing
course objectives, embedding assessment in a course, the basics of rubric writing, and
constructing a program assessment plan. MJC is also holding a summer training institute
in 2004 with an activity and resource guide entitled Student Learning Outcomes: A Focus
on Results.
Bakersfield College (CA) has assisted the majority of its faculty in writing
student learning outcomes for their courses. Faculty leaders Janet Fulks and Kate Pluta have
put together a resource manual entitled Assessing Student Learning that guides faculty
through the process of writing SLOs, including definitions, criteria, good and bad examples,
and SLOs from their own courses. The manual also covers how to include all three
learning domains: cognitive, psychomotor, and affective. It concludes with the
story of how the college ramped up the SLO process, a summary of achievements to date, a
philosophy statement on SLOs adopted by the Academic Senate, and a listing of web
resources.

State and National Standards on Academic & Vocational Competencies


Welding Codes & Standards (www.aws.org/cgi-bin/shop) AWS is recognized worldwide for
the development of consensus-based American National Standards, with over 170 standards
published as codes, recommended practices, guides, and specifications. Certification is offered
in seven different welding processes.

Business Education Standards (www.nbea.org/curfbes.html) Using the concepts described in
these standards, business teachers introduce students to the basics of personal finance, the
decision-making techniques needed to be wise consumers, the economic principles of an
increasingly international marketplace, and the processes by which businesses operate. In
addition, these standards provide a solid educational foundation for students who want to
successfully complete college programs in various business disciplines.
California Code of Regulations, Title 16, Professional and Vocational Regulations, Section 1443.5,
Standards of Competent Performance (http://www.calnurse.org/cna/np/brn/standard.html).
A registered nurse shall be considered to be competent when he/she consistently
demonstrates the ability to transfer scientific knowledge from social, biological and physical
sciences in applying the nursing process in accord with the six enumerated standards.
Undergraduate Psychology Learning Goals and Outcomes. This document is the work of the
Task Force on Undergraduate Psychology Major Competencies appointed by the American
Psychological Association's Board of Educational Affairs. The report provides details for 10
suggested goals and related learning outcomes for the undergraduate psychology major.
These represent what the Task Force considers to be reasonable departmental expectations
for the psychology major in United States institutions of higher education.
Information Literacy Competency Standards for Higher Education by the Association of
College and Research Libraries (ACRL). Each of the five standards comes with detailed
performance indicators.

Assessment Plan ExamplesInternet Sites


North Carolina State University: http://www2.acs.ncsu.edu/UPA/assmt/resource.htm. Contains a
comprehensive list of links to national assessment forums, manuals & handbooks, student
learning, and other institutions.
California State University, Fresno: http://www.csufresno.edu/cetl/assessment/assmnt.html &
http://www.csufresno.edu/cetl/assessment/status.html . Contains links to program
assessment plans.
Boise State University: http://www2.boisestate.edu/iassess/outcomes/outcomes.htm . Contains
links to program assessment plans organized by college.
Oklahoma State University:
http://www.okstate.edu/assess/assessment_plans/assessment_plans.htm . Contains
assessment method examples and assessment plan tips and checklist.
California State University at Sacramento: http://www.csus.edu/acaf/assmnt.htm . Contains
listing of program assessment plan links.
San Jose State University: http://www.sjsu.edu/ugs/assessment/as-main.html . Contains program
assessment plans organized by college and a page containing links to other institutions.
Southeast Missouri State University: http://www2.semo.edu/provost/aspnhtm/busy.htm . Busy
chairperson's guide to assessment (table of contents).
Southern Illinois University: http://www.siue.edu/~deder/assess/depts.html . Contains program
assessment plans.
Ohio University: http://www.ohiou.edu/provost/OUTCOMES2000_2001.html . Student learning
outcomes assessment, 2000-2001.
Central Michigan University: http://www.provost.cmich.edu/outcomes/ . Student learning
outcomes by college for each major.

Links can be found on the web version: http://cai.cc.ca.us/workshops/SLOFocusOnResults.doc

Endnotes

i Is Accreditation Accountable by the Council for Higher Education Accreditation (2003) provides a thorough
discussion of the tensions between the federal government's call for accountability for student learning and
traditional process-based peer review accreditation methods:
http://www.chea.org/pdf/CHEAmonograph_Oct03.pdf
ii Lisa Brewster's approach is summarized in Course Level Assessment Currently Being Used: Why Turn
Towards Them? (October 2003) presented at the Student Learning Outcomes workshop sponsored by the
RP Group: http://cai.cc.ca.us/SLOworkshops/Strand2/Brewster%20on%20Speech%20SLOs.doc
iii Janet Fulks' SLOs are on the web at http://www2.bc.cc.ca.us/bio16/Student%20Learning%20Outcomes.htm
and her grading rubrics are at http://www2.bc.cc.ca.us/bio16/projects_and_grading.htm
iv For an excellent short article on this topic see The Case for Authentic Assessment by Grant Wiggins at
http://ericae.net/edo/ED328611.htm
v SLO Good Practice Statements:
Palomar College: http://www.palomar.edu/alp/principles.html
College of DuPage: http://www.lgu.ac.uk/deliberations/assessment/manifest.html
American Association of Higher Education: http://www.aahe.org/assessment/principl.htm
vi Primary Trait Analysis definition is from Integrating the Assessment of General Education into the
Classroom -- A Two-Year College Model by Ruth Benander and Janice Denton of Raymond Walters
College and Barbara Walvoord of the University of Notre Dame, presented at the Annual Meeting of the North
Central Accrediting Association in April of 1997:
http://www.rwc.uc.edu/phillips/Assessment/NCApaper.html
See also Effective Grading: A Tool for Learning and Assessment. Walvoord, Barbara E. and Virginia J. Anderson.
San Francisco: Jossey-Bass Publishing, Inc. 1998; and Primary Trait Analysis: Anchoring Assessment in
the Classroom by Benander, Denton, Page and Skinner; Journal of General Education, Vol. 49, No. 4, 2000.
vii Mary Allen at CSU Fresno has written a succinct two pages of advice on the Use of Rubrics that is well
worth reading: http://www.calstate.edu/acadaff/sloa/links/using_rubrics.shtml
For a more detailed commentary on rubrics, see Developing and Applying Rubrics by Ethelynda Harding,
also of CSU Fresno, presented at a chemistry conference in March of 2004:
http://www.csufresno.edu/cetl/Events/Events%2003-04/ChemConf/Rubrics.pdf
viii Assessing Modeling Projects in Calculus and Precalculus: Two Approaches by Charles E. Emenaker,
University of Cincinnati, Raymond Walters College: http://www.maa.org/saum/maanotes49/116.html
ix Summary of direct assessment methods taken from A Glossary of Measurement Terms, ERIC Digest
(http://ericae.net/edo/ed315430.htm); the Temple University Teachers Connection
(www.temple.edu/CETP/temple_teach/); and the NCIIA Assessment Workshop
(www.nciia.org/CD/public/htmldocs/papers/p_and_j.pdf).
x Schneps, M. H. and P. M. Sadler (1987). A Private Universe. Video. Harvard-Smithsonian Center for
Astrophysics, Science Education Department, Science Media Group. Washington, DC: Annenberg/CPB:
Pyramid Film and Video, 18 minutes.
xi For commentary on informal norming sessions on an English writing rubric see Using Rubrics by
Michelle Christopherson of Modesto Junior College: http://cai.cc.ca.us/SLOworkshops/Strand2/Using Rubrics.doc
xii Undergraduate Psychology Major Learning Goals and Outcomes: A Report, American Psychological
Association (March 2002): http://www.apa.org/ed/pcue/taskforcereport2.pdf
xiii Taken from A Handbook on Assessment for Two Year Colleges by Ed Morante of College of the Desert:
http://cai.cc.ca.us/Fall2002Institute/2002/assessmenthandbookfinal.doc
xiv Examples of Course Level Assessment Plans:
Raymond Walters College: http://www.rwc.uc.edu/phillips/Assessment/AcadAssess.html
California State University, Fresno, Anthropology:
http://www.csufresno.edu/cetl/assessment/Programs/Anthropology/AnthroPlan.pdf
