
Assessment

1) EVALUATION - a process of systematic interpretation, analysis, appraisal or judgment of the worth of organized data as a basis for decision-making.

2) ASSESSMENT - a process of gathering and organizing quantitative or qualitative data into an interpretable form to provide a basis for judgment or decision-making.

✓ Traditional Assessment - refers to the use of paper-and-pencil objective tests.

✓ Performance Assessment - is the direct, systematic observation of an actual student performance and
the rating of that performance according to previously established performance criteria.

✓ Authentic Assessment - refers to the use of assessment methods that simulate true-to-life situations.

✓ Alternative Assessment - refers to the use of methods other than paper-and-pencil objective tests.

3) TEST - an instrument designed to measure any characteristic, quality, ability, knowledge or skill. It comprises items in the area it is designed to measure.

✓ Testing - a technique of obtaining information about the presence or absence of certain characteristics or qualities in a learner, needed for evaluation purposes.

4) MEASUREMENT - a process of quantifying the degree to which someone/something possesses a given trait.

Types of Measurement

1) Objective measurements (as in testing) - more stable in the sense that repeated measurements of the
same quantity or quality of interest will produce more or less the same outcome.

2) Subjective measurements (as in perceptions) - certain facets of the quantity or quality of interest cannot be successfully captured by objective procedures but can be captured by subjective methods.

PRINCIPLES OF HIGH QUALITY ASSESSMENT

Principle 1: Clear and Appropriate Learning Targets

Assessment can be made precise, accurate and dependable only if the learning targets are clearly stated in behavioral terms.

a) Cognitive Targets

b) Skills, Competencies and Abilities Targets - skills refer to specific activities or tasks that a student can proficiently do.

c) Products, Outputs and Projects Targets - tangible and concrete evidence of a student's ability.

Principle 2: Appropriate Methods

Categorized Methods of Assessment:

1. Written-Response Instruments - include objective tests (multiple choice, true-false, short answer, etc.), essays, examinations and checklists.

✓ Objective tests are appropriate for assessing the various levels of the hierarchy of educational objectives.

✓ Multiple choice tests can be constructed to test higher order thinking skills.

✓ Essays, when properly planned, can test the student's grasp of higher-level cognitive skills.

2. Product Rating Scale - a teacher is often tasked to rate products (projects, book reports, maps, charts, drawings, essays and creative endeavors of all sorts).

3. Performance Tests - one of the most frequently used measurement instruments is the checklist.

4. Oral Questioning - it is the appropriate assessment method when the objectives are:

a) To assess the student's stock knowledge

b) To determine the student's ability to communicate ideas in coherent verbal sentences.

5. Observation and Self-Reports

An observational tally sheet is a device used by teachers to record the frequency of student behaviors, activities or remarks.

A self-checklist is a list of several characteristics or activities presented to the subjects of a study.

Observation and self-reports are useful supplementary assessment methods when used in conjunction with oral questioning and performance tests.

Assessment Methods

1. Objective Supply - Short Answer, Completion Test

2. Objective Selection - Multiple Choice, Matching Type, True/False

3. Essay - Restricted Response, Extended Response

4. Performance-Based - Presentations, Papers, Projects, Athletics, Demonstrations, Exhibitions, Portfolios

5. Oral Questioning - Oral Examinations, Conferences, Interview

6. Observation - Informal, Formal


7. Self-Report - Attitude Surveys, Sociometric Devices, Questionnaires, Inventories

Modes of Assessment

1) Portfolio

• a process of gathering multiple indicators of student progress to support course goals in a dynamic, on-going and collaborative process

• It measures student growth and development, but it is time-consuming and rating tends to be subjective without rubrics

Examples:

✓ Working portfolio (the student's day-to-day work, which reflects his or her learning)

✓ Show portfolio (a collection of the student's best works)

✓ Documentary portfolio (a combination of a working and show portfolio)

2) Traditional Mode

• The paper-and-pencil test used in assessing knowledge and thinking skills

• Administration is easy, but preparation is time-consuming and it is prone to guessing and cheating

Examples:

✓ Standardized test

✓ Teacher made test

3) Performance

• A mode of assessment that requires actual demonstration of skills or creation of products of learning

• Preparation is easy, but scoring is subjective without rubrics and administration is time-consuming

Examples:

✓ Practical Test

✓ Oral Test

✓ Projects

Principle 3: Balance

A balanced assessment sets targets in all domains of learning (cognitive, affective and psychomotor) or domains of intelligence (verbal-linguistic, logical-mathematical, interpersonal-social, intrapersonal-introspection, physical-natural-world, existential-spiritual).

Principle 4: Validity

Validity is one important criterion of a good assessment instrument or tool, which means it should
measure what it intends to measure.

3 main types of evidence that may be collected:

✓ Content-related evidence of validity - refers to the content and format of the instrument

✓ Construct-related evidence of validity - refers to the nature of the psychological construct or characteristic being measured by the test.

✓ Criterion-related evidence of validity - refers to the relationship between scores obtained using the
instrument and scores obtained using one or more other tests (often called criterion).

Establishing Validity

1) Content validity - is established when the objectives of assessment match the lesson objectives.

2) Face validity - is established by examining the physical appearance of the instrument.

3) Construct validity - a type of validation that determines whether the instrument truly measures the psychological construct or trait it intends to measure.

✓ Convergent validity - is established if the instrument can define another trait similar to the one it is intended to measure.

✓ Divergent validity- is established if an instrument can describe only the intended trait and not the
other traits.

4) Criterion-related validity

✓ A type of validation that refers to the extent to which scores from a test relate to theoretically similar
measures.

Examples:

Classroom reading grades should indicate levels of performance similar to Standardized Reading Test scores.

a) Predictive validity - describes the future performance of an individual by correlating the sets of scores obtained from two measures given at a longer time interval.
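As a sketch of the idea, a predictive validity coefficient is simply the correlation between scores on the earlier measure and scores on the later criterion. The entrance-test scores and final grades below are hypothetical, invented only for illustration.

```python
# Minimal sketch: predictive validity as a Pearson correlation
# between an earlier measure and a later criterion measure.
# All scores are hypothetical.

def pearson_r(x, y):
    """Pearson product-moment correlation between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

entrance_scores = [70, 85, 60, 90, 75, 80]  # earlier measure (hypothetical)
final_grades = [78, 88, 65, 92, 80, 84]     # criterion, measured later

r = pearson_r(entrance_scores, final_grades)
print(f"predictive validity coefficient r = {r:.2f}")
```

A coefficient near 1.0 would mean the earlier measure predicts the criterion well; a coefficient near 0 would mean it has little predictive value.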

Factors that influence Test Validity


1) Patterns of answers

2) Appropriateness of test items

3) Length of the test

4) Directions

5) Reading vocabulary & sentence structures

6) Construction of test items

7) Difficulty of items

8) Arrangement of items

Principle 5: Reliability

Reliability refers to the consistency of the scores obtained: how consistent they are for each individual from one administration of an instrument to another and from one set of items to another.

Methods of Establishing Reliability

1) Equivalent-Form Method - a type of reliability determined by administering two different but equivalent forms of the test (also called parallel or alternate forms) to the same group of students in close succession.

2) Test-retest Method - the same test is administered twice to the same group of subjects at different
times.

3) Parallel/Alternate Forms - is done by evaluating the scores on equivalent forms of a test given with a close time interval between forms.

4) Split-Half Method - administer the test once and score two equivalent halves of the test. To split the test into halves that are equivalent, the usual procedure is to score the even-numbered and the odd-numbered items separately.
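The split-half procedure can be sketched as follows: correlate the odd-item and even-item half scores, then adjust the half-test correlation to full-test length with the Spearman-Brown correction. The response matrix below is hypothetical (rows are students; 1 = correct, 0 = wrong).

```python
# Minimal sketch of the split-half method with Spearman-Brown
# correction. Data are hypothetical.

def pearson_r(x, y):
    """Pearson product-moment correlation between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

responses = [
    [1, 1, 1, 0, 1, 1, 0, 1],
    [1, 0, 1, 1, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 0, 1],
    [1, 1, 1, 1, 1, 1, 1, 1],
    [0, 0, 1, 0, 0, 1, 0, 0],
]

odd_totals = [sum(row[0::2]) for row in responses]   # items 1, 3, 5, 7
even_totals = [sum(row[1::2]) for row in responses]  # items 2, 4, 6, 8

r_half = pearson_r(odd_totals, even_totals)
r_full = (2 * r_half) / (1 + r_half)  # Spearman-Brown correction
print(f"half-test r = {r_half:.2f}, corrected reliability = {r_full:.2f}")
```

The correction is needed because the half-test correlation understates the reliability of the full-length test.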

5) Kuder-Richardson Formula - administer the test once, score the total test, and apply the Kuder-Richardson Formula.

6) Scorer Reliability / Inter-Rater Method - is done by having two or more raters independently score the same set of student responses or products, then checking the consistency of their ratings.
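Inter-rater consistency can be checked with simple agreement statistics. The sketch below computes percent agreement and Cohen's kappa (agreement corrected for chance) for two hypothetical raters marking the same ten essays pass (1) or fail (0).

```python
# Minimal sketch of a scorer/inter-rater reliability check:
# percent agreement and Cohen's kappa. Ratings are hypothetical.

rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
rater_b = [1, 1, 0, 1, 1, 1, 0, 0, 1, 0]

n = len(rater_a)
agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Expected chance agreement, from each rater's marginal proportions
p_a = sum(rater_a) / n
p_b = sum(rater_b) / n
chance = p_a * p_b + (1 - p_a) * (1 - p_b)

kappa = (agreement - chance) / (1 - chance)
print(f"percent agreement = {agreement:.0%}, Cohen's kappa = {kappa:.2f}")
```

Kappa is preferred over raw agreement because two raters marking almost everyone "pass" would agree often by chance alone.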

Purposes of Assessment

1. Assessment for Learning

Includes the three types of assessment done before and during instruction: placement, formative and diagnostic.

a) Placement assessment determines appropriate placement of students both in terms of achievement and aptitude. (done before instruction to determine the needs and ability level of the learners for possible adjustment in the teaching process)

b) Formative assessment guides the teacher in his/her day-to-day teaching activity. (may be done during or after instruction to find out how learners are progressing or to monitor how the learning objectives are attained)

c) Diagnostic assessment determines the gaps in learning or the students' learning difficulties which are not revealed by formative tests or solved by remedial instruction, and hopefully bridges these gaps. (may be done before or during instruction to identify recurring difficulties of learners)

2. Assessment of learning

This is done after instruction, usually referred to as the summative assessment.

✓ Summative assessment tries to determine the extent to which the learning objectives for a course are met and why. (done after instruction to determine what has been learned)

3. Assessment as Learning

This is done for teachers to understand and perform well their role of assessing FOR and OF learning.

✓ It requires teachers to undergo training on how to assess learning and to acquire the competencies needed to perform their work as assessors.
