
Trends in Testing, Assessment and Evaluation
Chapter 4: Test Construction and Administration

Guiding Principles for Evaluation


Evaluation should relate directly to instructional objectives
Each evaluation activity should be designed to promote student growth
The actual activity should be useful practice in itself
Feedback should be usable by the student
Multiple evaluation strategies should be provided to demonstrate mastery of each objective/competency
Students should clearly understand the methods of evaluation for a given test or activity

Questions to Ask Yourself in Designing a Test


What objectives will (or should) I be testing?
What types of items will be included in the test?
How long will the test be, in terms of time and number of items?
How much will each objective be worth, in terms of weighting and number of items?

Tests as Diagnostic Tools


Students demonstrate learning
Instructors gauge their effectiveness and modify teaching strategies or activities
Assignment of letter grades

Different Types of Tests & Learning


Paper & Pencil / WebCT Testing
Limited-choice questions (MC, T/F, Matching)
Open-ended questions (Short Answer, Essay)

Performance Testing
Acquisition of skills that can be demonstrated through action (e.g., music, nursing, etc.)

Planning a Test
First step: Outline learning objectives or major concepts to be covered by the test
The test should be representative of the objectives and material covered
A major student complaint: tests don't fairly cover the material that was supposed to be on them

Planning a Test
Second step: Create a test blueprint
Third step: Create questions based on the blueprint
Match the question type with the appropriate level of learning

Planning a Test
Fourth step: For each check mark on the blueprint, jot down (perhaps on 3x5 cards) 3-4 alternative questions or ideas and item types that get at the same objective
Fifth step: Organize the questions and/or ideas by item type

Planning a Test
Sixth step: Eliminate similar questions
Seventh step: Walk away from the draft for a couple of days
Eighth step: Reread all of the items; try doing this from the standpoint of a student

Planning a Test
Ninth step: Organize the questions logically
Tenth step: Time yourself actually taking the test, then multiply that by about 4, depending on the level of the students (for example, if the test takes you 15 minutes, allow students roughly an hour)
Eleventh step: Analyze the results (item analyses)

Sample test blueprint

Translating Course Objectives/Competencies into Test Items


Syllabus
Specification table: what was taught and the weight of the areas to be tested

Creating a Test Blueprint


Blueprint: this is the test plan, i.e., which questions test which concept
Plot the objectives/competencies against a hierarchy representing levels of cognitive difficulty or depth of processing
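As a rough illustration, a blueprint can be expressed as content-area weights crossed with cognitive-level weights, from which item counts are allocated. The sketch below is hypothetical: the content areas, weights, and 40-item test length are assumptions made only for illustration, not part of the course material.

```python
# Hypothetical sketch of a test blueprint: content areas (rows) weighted against
# Bloom's levels (columns), used to allocate items for a fixed-length test.
# All names and weights below are illustrative assumptions.

content_weights = {          # relative weight of each content area
    "Planning a test": 0.30,
    "Writing items": 0.45,
    "Item analysis": 0.25,
}
level_weights = {            # relative weight of each cognitive level
    "Knowledge": 0.25,
    "Comprehension": 0.25,
    "Application": 0.30,
    "Analysis/Synthesis/Evaluation": 0.20,
}
total_items = 40

for area, aw in content_weights.items():
    # Allocate items cell by cell; rounding means row totals may differ slightly.
    row = {level: round(total_items * aw * lw) for level, lw in level_weights.items()}
    print(f"{area:16s} {row}  (area total: {sum(row.values())})")
print("Planned total items:", total_items)
```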

Thinking Skills
What level of learning corresponds to the course content? Bloom's Taxonomy of Educational Objectives (see handout):
Knowledge
Comprehension
Application
Analysis
Synthesis
Evaluation

Practical Considerations
A representative sample of the course content: not random, but purposeful, based on the blueprint
A representative sample of skill or cognitive levels across the content
Analyze results by level AND content area

Question Arrangement on a Test


Group by question type
Common instructions will save reading time

Limit the number of times students have to change their frame of reference
Patterns on the test must be logical
Arrange from a content standpoint
Keep similar concepts together

Group by difficulty (easy to hard)

Selecting the Right Type of Evaluation


How do you know what type of question to use and when? It depends on the skill you are testing. Evaluation should always match as closely as possible the actual activity you are teaching.
Examples: If teaching speech, evaluate an oral speech. If testing the ability to write in Spanish, give an essay. Reading can be tested with MC or T/F items. You would not use MC to test creative writing.

Question Types versus Cognitive Levels of Learning


Knowledge, Comprehension: Multiple Choice (MC), True/False (TF), Matching, Completion, Short Answer
Application, Analysis: MC, Short Answer, Essay
Synthesis, Evaluation: MC, Short Answer, Problems, Essay, Performance

Constructing the Test


Types of Test Questions:
Multiple-Choice Items
True-False Items
Matching Items
Fill-in-the-Blank Items
Short-Answer Items
Essay Questions

Multiple Choice Items


Advantages:
Extremely versatile: can measure higher-level mental processes (application, analysis, synthesis, and evaluation)
A compromise between a short-answer/essay item and a T/F item
A wide range of content can be sampled by one test

Disadvantages
Difficult to construct plausible alternative responses

Types of Multiple Choice Items


Four Basic Types
Question type
Incomplete-statement type
Right-answer type
Best-answer type

Which Type is Best?


Question type vs. incomplete-statement type
Right-answer type vs. best-answer type

Multiple Choice Items


1. Writing the stem first:
A. Be sure the stem asks a clear question
B. Stems phrased as questions are usually easier to write
C. Stems should not contain a lot of irrelevant information
D. Use an appropriate reading level and terminology
E. Be sure the stem is grammatically correct
F. Avoid negatively stated stems

Multiple Choice Items


2. Writing the correct response:
Use the same terms and reading level as the stem
Avoid too many qualifiers
Assign the correct response a random position in the answer sequence

3. Read the stem and correct response together
4. Generate the distractors/alternative responses

Multiple Choice Items


Other Tips for Constructing MC Items:
Items should have 3-4 alternatives
The stem should present a single, clearly formulated problem
Keep wording simple and understandable; exclude extraneous words from both the stem and the alternatives
Include in the stem any words that are repeated in each response
Avoid 'all of the above' (students can answer it based on partial information)
Avoid 'none of the above'

Multiple Choice Items


Alternative responses/distractors should be plausible and as homogeneous as possible
Response alternatives should not overlap (e.g., two synonymous terms such as arithmetic average and mean)
Avoid double negatives (e.g., 'None of the following are part of the brain except which one?')
Emphasize negative wording
Each item should be independent of other items in the test
Information in the stem of one item should NOT help answer another item

True-False Test Items


Best suited for testing three kinds of information:
Knowledge-level learning
Understanding of misconceptions
Situations where there are only two logical responses

Advantages:
Sample a large amount of learning per unit of student testing time

Disadvantages:
Items tend to be very easy
There is a 50-50 chance of guessing correctly
Tends to be low in reliability

Tips for Constructing True/False Items


Avoid double negatives
Avoid long or complex sentences
Specific determiners (always, never, only, etc.) should be used with caution
Include only one central idea in each statement
Avoid emphasizing the trivial
Exact quantitative language (two, three, four) is better than qualitative language (some, few, many)
Avoid a pattern of answers

Objective Test Item Analyses


Evaluating the Effectiveness of Items.
Why?
Scientific way to improve the quality of tests and test items
Identify poorly written items which mislead students
Identify areas (competencies) of difficulty

Item analyses provide information on:


Item difficulty
Item discrimination
Effectiveness of alternatives in MC tests
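A minimal sketch of how these two indices are commonly computed, assuming a 0/1 (wrong/right) scoring matrix. The response data below are made up, and the upper/lower-group split is one common convention (the upper/lower 27% rule is another):

```python
# Minimal sketch: item difficulty and a simple discrimination index
# from a hypothetical 0/1 scoring matrix (rows = students, columns = items).

responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 1, 0, 1],
]

n_students = len(responses)
n_items = len(responses[0])
totals = [sum(row) for row in responses]

# Rank students by total score; compare top and bottom groups (here, halves).
order = sorted(range(n_students), key=lambda i: totals[i], reverse=True)
group_size = n_students // 2
upper, lower = order[:group_size], order[-group_size:]

for item in range(n_items):
    correct = sum(row[item] for row in responses)
    difficulty = correct / n_students                  # proportion answering correctly
    p_upper = sum(responses[i][item] for i in upper) / group_size
    p_lower = sum(responses[i][item] for i in lower) / group_size
    discrimination = p_upper - p_lower                 # higher positive = better discrimination
    print(f"Item {item + 1}: difficulty={difficulty:.2f}, discrimination={discrimination:+.2f}")
```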

Short-Answer Items
Two types: question and incomplete statement
Advantages:
Easy to construct
Excellent format for measuring who, what, when, and where information
Guessing is minimized
Students must know the material, rather than simply recognize the answer

Disadvantages:
Grading can be time-consuming
More than one answer can be correct

Tips for Constructing Short Answer Items


Better to supply the term and require a definition
For numerical answers, indicate the degree of precision expected and the units in which they are to be expressed
Use direct questions rather than incomplete statements
Try to phrase items so that there is only one possible correct response
When incomplete statements are used, do not use more than one blank within an item

Essay Questions
Types of Essay Questions
Extended Response Question:
Allows a great deal of latitude in how to respond to the question. Example: Discuss essay and multiple-choice type tests.

Restricted Response Question


More specific, easier to score, with improved reliability and validity.
Example: Compare and contrast the relative advantages and disadvantages of essay and multiple-choice tests with respect to reliability, validity, objectivity, and usability.

Essay Items
Advantages:
Measure higher learning levels (synthesis, evaluation) and are easier to construct than objective test items
Students are less likely to answer an essay question correctly by guessing
Require superior study methods
Offer students an opportunity to demonstrate their abilities to:
Organize knowledge
Express opinions
Foster creativity

Essay Items
Disadvantages:
May limit the sampling of material covered
Tends to reduce the validity of the test
Subjective, unreliable nature of scoring, influenced by:
Halo effect (a good or bad student's previous level of performance)
Written expression
Handwriting legibility
Grammatical and spelling errors
Time-consuming to score

Essay Questions
Give students a clear idea of the scope & direction intended for the answer
It might help to start the question with a description of the required behavior (e.g., compare, analyze)

Use an appropriate language level for the students
Construct questions that require students to demonstrate a command of background information, but do not simply repeat that information

Essay Questions
If a question calls for an opinion, be sure that the emphasis is not on the opinion itself but on the way it is presented or argued.
Use a larger number of shorter, more specific questions rather than one or two longer ones, so that more information can be assessed.

Essay Questions
You might give students a pair of sample answers to a question of the type you will give on the test.
Sketch out a rubric (grading scheme) for each question before reading the papers, OR randomly select a few papers to read and build the grading scheme based on those answers.
Give students a writing rubric.

Essay Questions
Detach identifying information and use code numbers instead, to avoid letting personality factors influence you.
After grading all the papers on one item, reread the first few to make sure you maintained consistent standards.
Make clear to students the extent to which factors other than content (e.g., grammar, handwriting) will influence the grade.

Tips for Constructing Essay Questions


Provide reasonable time limits for each question
Include both thinking and writing time

Avoid permitting students a choice of questions


You will not necessarily get a representative sample of student achievement. Only by requiring all students to answer all questions can their achievement be compared.

A definite task should be put forth to the student


Critical words: compare, contrast, analyze, evaluate, etc.

Scoring Essay Items


Write an outline of the key points (use the outline to design a rubric)
Determine how many points are to be assigned to the question as a whole and to the various parts within it
If possible, score the test without knowledge of the student's name (use a face sheet)

Score all of the answers to one question before proceeding to the next question
This helps maintain a consistent standard
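A minimal sketch of how an outline of key points can become a point allocation that is applied the same way to every paper. The criteria, point values, and the score_answer helper below are hypothetical, chosen only to illustrate the idea:

```python
# Hypothetical rubric for one essay question: key points from the outline
# mapped to point values, applied identically to every (anonymized) paper.

rubric = {
    "identifies reliability trade-offs": 4,
    "identifies validity trade-offs": 4,
    "addresses objectivity": 2,
    "addresses usability": 2,
    "organization and clarity": 3,
}
max_score = sum(rubric.values())  # 15 points for this question

def score_answer(points_awarded):
    """points_awarded maps each rubric criterion to the points earned."""
    total = 0
    for criterion, max_points in rubric.items():
        earned = min(points_awarded.get(criterion, 0), max_points)  # cap at the maximum
        total += earned
    return total, max_score

# Example: a paper identified by code number rather than name.
paper_042 = {
    "identifies reliability trade-offs": 3,
    "identifies validity trade-offs": 4,
    "addresses objectivity": 1,
    "organization and clarity": 2,
}
print(score_answer(paper_042))  # -> (10, 15)
```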

Scoring Essay Exams


If possible, score each set of answers within the same time frame
Handwriting, spelling & neatness: consider giving two separate grades?
Mastery of the material
Other factors

Alternative Methods of Assessment


Research/Term Papers
Research Reviews
Reports
Case Studies
Portfolios
Projects
Performances
Peer Evaluation
Mastery
Simulations

Preparing rubric

Item Analysis

Item difficulty
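A common convention for the difficulty index: p = (number of students answering the item correctly) / (number of students attempting it), so higher values mean easier items. Moderate values (roughly 0.3-0.7) tend to discriminate best, while values near 1.0 or 0.0 flag items that are too easy or too hard.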

Cheating
Preventing Cheating
Reduce the pressure (multiple evaluations)
Make reasonable demands (length/content of the exam)
Use alternative seating
Use alternative forms
Be cautious with extra copies

Using Assessment & Evaluation to Improve Student Learning Outcomes


Providing feedback to students
Closing the assessment & evaluation loop
Maximizing student learning
