INSTRUMENT
Instruments fall into two broad categories, researcher-completed and subject-completed, distinguished by whether the researcher or the participant fills in the instrument.
Researcher-completed Instruments:
Rating scales
Interview schedules/guides
Tally sheets
Flowcharts
Performance checklists
Time-and-motion logs
Observation forms

Subject-completed Instruments:
Questionnaires
Self-checklists
Attitude scales
Personality inventories
Achievement/aptitude tests
Projective devices
Sociometric devices
Basic questions fall into two types: open-ended and closed-ended.
Open-ended questions are used when there are a great number of possible answers or when researchers cannot predict all the possible answers.
For example, a question about a student's reason for selecting a particular university would probably be open-ended, and a question about a college major would be open-ended because researchers do not want to list every possible major.
Things to remember when writing open-ended questions:
Write questions clearly and concisely.
Match the wording to the reading level of the target population.
Avoid double negatives.
Avoid double-barreled questions.
Example of a double-barreled question: Are you satisfied with the place and time of the program?
Open-ended questions are useful for exploring a topic in depth; however, they are difficult to (i) respond to and (ii) analyze.
Therefore, limit the number of open-ended questions to the needed minimum.
Closed-ended questions are used when all the possible, relevant responses to a question can be specified and the number of possible responses is limited.
Examples: gender, ethnicity, educational level.
Advantages of closed-ended questions:
Limiting the answer options makes data analysis easier
Ensures the desired information is obtained
Can increase the reliability of the study
Disadvantages:
Other equally important information may not be retrieved
Some respondents may become frustrated with the limited yes/no responses
Difficult to know whether the answers were guessed or actually known
If a question is not answered, it is difficult to know whether the respondent purposely skipped it or missed it inadvertently
Both formats can be combined in the same question:
1. What type of writing assignment do you
typically require in your course?
a. Reports
b. Themes or essays
c. Research papers
d. Take-home tests
e. Other (Please specify) ____________________
DEVELOPING INSTRUMENT
Use the research questions, theoretical framework and literature as a guide in constructing the instrument.
Researchers communicate with respondents through the questions in the instrument, so respondents must really understand what the instrument is getting at. This is called communication validity.
Review the literature in the domain you wish to measure (e.g., "generic skills").
Conceptual Definition - a CD is an element of the scientific research process in which a specific concept is defined as a measurable occurrence. It basically gives you the meaning of the concept.
Construct formation - develop a list of subscales, also known as constructs, that you wish to sample from the domain.
e.g.: Conceptual definition = generic skills
Construct formation = communication skills, problem solving, ethics, adaptability and team working
Operational Definitions - an OD is a process by which the characteristics of a concept can be defined, including identification and classification. An OD is also known as an item.
Example of an OD: (i) I prefer to work in a group rather than individually.
It is advisable to write at least 5 items/statements for each construct in the first phase.
Draw up a blueprint / metadata analysis and give sources for each item to guarantee content validity.
Give the instrument to at least 3 experts (content and language experts) to validate it.
Calculate the level of agreement among the experts.
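The expert-agreement step can be quantified in several ways; one minimal sketch (all ratings below are made up for illustration) is simple percent agreement, the share of items on which every expert gives the same judgment.

```python
# Hypothetical example: percent agreement among three expert reviewers.
# Each expert marks every item as 1 (relevant/clear) or 0 (not).
# ratings[i] holds the three experts' ratings of item i.

def percent_agreement(ratings):
    """Share of items on which all experts gave the same rating."""
    agreed = sum(1 for item in ratings if len(set(item)) == 1)
    return agreed / len(ratings)

ratings = [
    (1, 1, 1),  # item 1: all three experts agree
    (1, 0, 1),  # item 2: one expert disagrees
    (0, 0, 0),  # item 3: all agree
    (1, 1, 1),  # item 4: all agree
]
print(percent_agreement(ratings))  # 3 of 4 items -> 0.75
```

More formal indices (e.g. Cohen's or Fleiss' kappa) additionally correct for chance agreement, but the idea is the same.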
Do a pilot test on the instrument after validation.
You can run many statistical analyses to check the validity and reliability of the items, e.g. exploratory factor analysis, confirmatory factor analysis, principal component analysis, unidimensionality checks, etc.
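As an illustrative sketch of one such analysis, the following runs a principal component analysis by hand with NumPy on simulated Likert responses; the respondent count, item structure and seed are all invented for the example, not taken from the slides.

```python
import numpy as np

# Simulate 50 respondents x 4 Likert items (1-5). Items 1-2 and items
# 3-4 are built from two latent scores, so roughly two components
# should carry most of the variance.
rng = np.random.default_rng(0)
base1 = rng.integers(1, 6, size=(50, 1))
base2 = rng.integers(1, 6, size=(50, 1))
noise = rng.integers(-1, 2, size=(50, 4))
data = np.clip(np.hstack([base1, base1, base2, base2]) + noise, 1, 5)

# PCA: eigendecomposition of the covariance matrix of centered data.
centered = data - data.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals = np.linalg.eigh(cov)[0][::-1]   # eigenvalues, largest first
explained = eigvals / eigvals.sum()      # proportion of variance per component
print(np.round(explained, 2))
```

In practice you would inspect how many components have substantial explained variance (or eigenvalues above 1) before deciding how many subscales the data support.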
Factor analysis can help to check construct validity.
Review whether each item conceptually belongs with its factor (subscale) and remove those that do not.
Check the item fit or factor loading for each item to see whether it belongs to the given construct.
Run reliability tests such as item and person reliability or Cronbach's alpha for each factor/category (subscale) to investigate internal consistency reliability.
Modify and retest the instrument if necessary (alpha < .70).
Check whether the scaling you used is understandable for the respondents.
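Cronbach's alpha is easy to compute directly from the standard formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). Below is a minimal standard-library sketch using hypothetical scores for a five-respondent, four-item subscale.

```python
from statistics import variance  # sample variance (n - 1 denominator)

def cronbach_alpha(scores):
    """Cronbach's alpha for one subscale.

    scores: rows are respondents, columns are the subscale's items.
    """
    k = len(scores[0])                                 # number of items
    item_vars = [variance(col) for col in zip(*scores)]
    total_var = variance([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical 5-point Likert responses, 5 respondents x 4 items.
scores = [
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 4, 5, 5],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
]
alpha = cronbach_alpha(scores)
print(round(alpha, 2))  # 0.94 here; values below .70 suggest revising the subscale
```

Note that alpha is computed per subscale, not for the whole instrument at once, matching the per-factor advice above.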
STRUCTURE OF QUESTIONS
1. Completion, or fill-in - open-ended questions to which respondents must supply their own answers in their own words.
Example: What is the major weakness you have observed in your students' preparation for college?
2. Checklists - present a number of possible answers, and respondents are asked to check all that apply.
Example: What type of teaching aids do you use in your class? Check as many as apply.
1) Chalkboard
2) Overhead projector
3) Computer projector
4) Video tapes
5) Other (please specify) _________________________
3. Scaled items - ask respondents to rate a concept, event, or situation on dimensions such as quantity or intensity, indicating how much, how well or how often.
Example: How would you rate the writing skills of the students you are teaching this semester? (Check one)
1. Very poor
2. Less than adequate
3. Adequate
4. More than adequate
5. Excellent
6. Insufficient information
4. Ranking items - ask respondents to indicate the order of their preference among a number of options. Ranking should not involve more than six options.
Example: Do your students have more difficulty with some types of reading than with others? Please rank the order of difficulty of the following materials, with 1 as the most difficult and 4 as the least difficult.
_____1) Textbooks
_____2) Other reference books
_____3) Journal articles
_____4) Other (Please specify) ______________
5. Likert scales - let subjects indicate their responses to selected statements on a continuum from strongly disagree to strongly agree.
Example : The students who typically enroll in my course
are underprepared in basic math skills. (Circle one)
1. Strongly Disagree
2. Disagree
3. Undecided
4. Agree
5. Strongly Agree
Example of items in a Likert-type scale format (agreement, 4-point scale):
1. Strongly Disagree
2. Disagree
3. Agree
4. Strongly Agree
VALIDITY
Types of Validity
Content validity - the extent to which a measure adequately covers the various dimensions of the concept under study (i.e. physical, psychological and theoretical)
Criterion validity - assesses a measure against another criterion (or indicator) of the same concept
Concurrent validity - uses an already existing and tested tool
Predictive validity - assesses the degree to which a measure can predict some future event of interest
RELIABILITY
Types of Reliability
Test-retest reliability - demonstrates the stability of a measure over time
Internal consistency reliability - most of the items within a rating scale of a concept show a consistency of scoring
Interrater reliability - the extent to which two or more independent researchers are consistent in observing, recording and scoring data (should be 70% or higher agreement)
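Test-retest reliability in particular reduces to the Pearson correlation between two administrations of the same measure; here is a minimal sketch with hypothetical scores (all numbers invented for illustration).

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

# Hypothetical total scores for six respondents who took the same
# instrument twice, e.g. two weeks apart.
time1 = [12, 15, 9, 20, 17, 11]
time2 = [13, 14, 10, 19, 18, 12]
r = pearson_r(time1, time2)
print(round(r, 2))  # 0.98 here: r close to 1 indicates a stable measure
```

The same correlation machinery underlies interrater checks when ratings are continuous; for categorical ratings the percent-agreement figure quoted above (70% or higher) is the usual threshold.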
ADMINISTRATION OF QUESTIONNAIRE
Postal surveys
Face to face interviews
Group administration
Telephone surveys
Direct observation
Computer / Internet
WRITING SURVEY QUESTIONS
1. Questions should be short, simple and direct. A useful rule of thumb is that most items should have fewer than 10 words (one line), and all should be under twenty words.
2. Phrase items so that they can be understood by every respondent. Avoid overly technical words, and do not use slang, abbreviations or acronyms that may not be familiar to all.
3. Phrase items so as to elicit unambiguous answers.
Examples of ambiguous items:
1. Did you vote in the last election?
2. I often go to the library.
4. Avoid bias that may predetermine a respondent's answers.
Example:
1. Have you exercised your Malaysian right and registered to vote?
5. Avoid double-barreled questions.
Example:
1. Do you feel that the university should provide basic skills courses for students and give credit for them?