Bulacan State University: Assessment of Student Learning 1 Administering, Analyzing and Improving The Test


Republic of the Philippines

Bulacan State University


City of Malolos
Tel. No. (044) 919-7800 to 99 Local 1022

ASSESSMENT OF STUDENT LEARNING 1


MODULE 6
ADMINISTERING, ANALYZING AND IMPROVING THE TEST
After writing instructional objectives, constructing a test blueprint, and writing items that match the objectives, what you have produced is more than likely a good test. But having already invested considerable time, it is worth spending a little more to ensure that your efforts are not wasted. Pitfalls should be avoided in test assembly, administration, and scoring. Some of these concerns are presented below.
6.1 When assembling a test:
a. Group all items of the same or similar format together.
b. Arrange items so that item difficulty progresses from easy to hard.
c. Space items to eliminate overcrowding.
d. Keep items and options on the same page.
e. Place contextual material above the items to which it refers.
f. Arrange answers in a random pattern.
g. Decide how students are to record answers.
h. Be sure to include a space for the student’s name.
i. Be sure directions are specific and accurate.
j. Proofread your master copy before reproducing it.
6.2 No matter how careful you are, and even if you have done everything you can to ensure the validity
and reliability of the test, poor copies will make it less valid and reliable. Care must be taken in
reproducing the test to avoid illegible copies that would impair test validity. The following practical
steps can help ensure that the test you constructed does not end up as illegible copies.
a. Know the photocopying machine.
b. Make extra copies.
c. Specify copying instructions.
d. Avoid common pitfalls. Be sure to avoid:
1. fine print.
2. finely detailed maps or drawings.
3. barely legible masters or originals.
e. File original test.
6.3 In administering a test, make an effort to:
a. Induce a positive test-taking attitude.
b. Maximize the achievement nature of the test.
c. Equalize the advantages test-wise students have over non-test-wise students.
d. Avoid surprise tests.
e. Provide special instructions before the tests are actually distributed.
f. Alternate your test distribution procedures.
g. Have students check that they have the entire test.
h. Keep distractions to a minimum.
i. Alert students to the amount of time left toward the end of the test.
j. Clarify test collection procedures before handing out the test.
6.4 In scoring the test, try to:
a. Have the key prepared in advance.
b. Have the key checked for accuracy.
c. Score blindly.
d. Double-check scores, if scored by hand.
e. Record scores before returning the tests.

6.5 ANALYZING THE TEST


As with any other undertaking, you can expect to make errors in test construction. No
test that you construct will be perfect; it will include inappropriate, invalid, or otherwise deficient items.

ITEM ANALYSIS can be used to identify items that are deficient in some way, thus paving the
way to improving or eliminating them, with the result being a better overall test. There are two kinds of
item analysis: quantitative and qualitative.

6.5.1 QUANTITATIVE ITEM ANALYSIS


Quantitative item analysis is a technique that enables us to assess the quality or
utility of an item. It does so by identifying distractors, or response options, that are not doing
what they are supposed to do.
It is also a numerical method for analyzing test items that employs students' response
alternatives, or options, and is ideally suited to examining the usefulness of the multiple-choice
format.

A. Difficulty Index (p) – proportion of students who answered the item correctly.

P = n / N

where: n - number of students selecting the correct answer
       N - total number of students attempting the item

Example:

Options   A    B    C*   D        *Correct option
          3    0    18   9        N = 30

P = 18 / 30 = .60
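The difficulty index computation above can be sketched in Python. This is an illustrative snippet, not part of the module; the function name is our own.

```python
# Difficulty index (p): proportion of students who answered the item
# correctly, p = n / N.
def difficulty_index(responses, key):
    """responses: list of chosen options; key: the correct option."""
    correct = sum(1 for r in responses if r == key)
    return correct / len(responses)

# Counts from the example above: 3 chose A, 18 chose C (the key),
# 9 chose D, for N = 30.
responses = ["A"] * 3 + ["C"] * 18 + ["D"] * 9
p = difficulty_index(responses, key="C")
print(round(p, 2))  # 0.6
```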

B. Discrimination index (D) – measure of the extent to which a test item discriminates or
differentiates between students who do well on the overall test and those who do not do
well on the overall test.
1. Positive discrimination index
2. Negative discrimination index
3. Zero discrimination index

D = (U - L) / n

where: D - discrimination index
       U - no. in upper group who got the item correct
       L - no. in lower group who got the item correct
       n - no. of students in either group (if group sizes are unequal,
           choose the higher number)

To determine each item’s discrimination index (D), complete the following steps.
1. Arrange the papers from highest to lowest score.
2. Separate the papers into an upper group and a lower group based on total test
scores. Do so by including half of your papers in each group.

3. For each item, count the number in the upper group and the number in the lower
group that chose each alternative.
4. Record your information for each item in the following form.

Options A B C* D
Upper 1 0 11 3 n = 15
Lower 2 0 7 6

5. Compute D.

D = (11 - 7) / 15 = .267
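The discrimination index from the steps above can likewise be sketched in Python (an illustrative snippet; the function name is our own):

```python
def discrimination_index(upper_correct, lower_correct, n):
    """D = (U - L) / n, where U and L are the counts of correct answers
    in the upper and lower groups, and n is the size of either group
    (the larger one if the groups are unequal)."""
    return (upper_correct - lower_correct) / n

# From the worked example: 11 correct in the upper group,
# 7 in the lower group, 15 students per group.
D = discrimination_index(upper_correct=11, lower_correct=7, n=15)
print(round(D, 3))  # 0.267
```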

6.5.2 TABLE FOR INTERPRETING p and D:


Range of p     Interpretation           Range of D      Interpretation
.00 - .20      very difficult item      -1.00 - -.60    questionable item
.21 - .40      difficult item           -.59 - -.20     not discriminating
.41 - .60      moderately difficult     -.19 - .20      mod. discriminating
.61 - .80      easy item                .21 - .60       discriminating
.81 and above  very easy                .61 - 1.00      very discriminating

ACTION TABLE
Difficulty Level      Discrimination Level        Action
Difficult             Not discriminating          Improbable; eliminate
Difficult             Moderately discriminating   May need revision; improve
Difficult             Discriminating              Accept
Moderately difficult  Not discriminating          Eliminate
Moderately difficult  Moderately discriminating   Improve
Moderately difficult  Discriminating              Accept
Easy                  Not discriminating          Eliminate
Easy                  Moderately discriminating   Improve
Easy                  Discriminating              Improve
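The interpretation tables above can be sketched as simple threshold lookups in Python. The cutoffs follow the ranges in the tables; the function names are our own.

```python
def interpret_p(p):
    """Map a difficulty index p to the table's interpretation."""
    if p <= 0.20:
        return "very difficult"
    if p <= 0.40:
        return "difficult"
    if p <= 0.60:
        return "moderately difficult"
    if p <= 0.80:
        return "easy"
    return "very easy"

def interpret_D(D):
    """Map a discrimination index D to the table's interpretation."""
    if D <= -0.60:
        return "questionable"
    if D <= -0.20:
        return "not discriminating"
    if D <= 0.20:
        return "moderately discriminating"
    if D <= 0.60:
        return "discriminating"
    return "very discriminating"

# The worked example item: p = .60, D = .267
print(interpret_p(0.60))   # moderately difficult
print(interpret_D(0.267))  # discriminating
```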

6.6 OTHER DECISIONS


Quantitative item analysis not only helps us decide which items to eliminate before a test is
administered again; it also enables us to make other decisions. It can tell us whether an item is
miskeyed, whether guessing occurred in the responses, or whether the item is somewhat
ambiguous. For these decisions, we need only consider the responses of the students in the upper half of
the class.

MISKEYING. When an item is miskeyed, most students who did well on the test will likely select an
option that is a distractor rather than the option that is keyed. Consider the following miskeyed item.

EX: ________ 1. Who was the first astronaut to set foot on the moon?
a. John Glenn
b. Scott Carpenter
c. Neil Armstrong
*d. Alan Shepard
The distribution of the responses of the upper half of the class was as follows:

A B C D*
Upper Half 1 1 9 2

GUESSING. When students in the upper half of the class respond in more or less random fashion,
there is guessing. This is most likely to occur when the item measures content that is (1) not covered
in class or the text, (2) so difficult that even the upper-half students have no idea what the correct
answer is, or (3) so trivial that students are unable to choose from among the options provided. In
such cases, each alternative is about equally attractive to students in the upper half, so their
responses tend to be about equally distributed among the options. The following distribution is an
example suggesting that guessing occurred:
A B *C D
Upper Half 4 3 3 3

AMBIGUITY. Ambiguity is suspected when, among the upper group, one of the distractors is chosen
with about the same frequency as the correct answer. Ambiguity is suggested by the following
distribution:

A B C *D
Upper Half 7 0 1 7
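The three diagnostic patterns above can be sketched as a Python heuristic over the upper-half response counts. This is an illustrative sketch: the "within 1 of equal" thresholds are our own assumptions, not rules stated in the module.

```python
def analyze_upper_half(counts, key):
    """counts: dict mapping each option to the number of upper-half
    students who chose it; key: the keyed (intended) answer.
    Thresholds below are illustrative assumptions."""
    total = sum(counts.values())
    expected = total / len(counts)          # count per option if random
    key_count = counts[key]
    top_distractor = max(c for o, c in counts.items() if o != key)

    # Guessing: every option chosen at roughly the same rate.
    if all(abs(c - expected) <= 1 for c in counts.values()):
        return "possible guessing"
    # Miskeying: a distractor outdraws the keyed answer.
    if top_distractor > key_count:
        return "possibly miskeyed"
    # Ambiguity: a distractor is about as popular as the keyed answer.
    if abs(top_distractor - key_count) <= 1:
        return "possible ambiguity"
    return "no obvious problem"

# The three example distributions from the module:
print(analyze_upper_half({"A": 1, "B": 1, "C": 9, "D": 2}, key="D"))  # possibly miskeyed
print(analyze_upper_half({"A": 4, "B": 3, "C": 3, "D": 3}, key="C"))  # possible guessing
print(analyze_upper_half({"A": 7, "B": 0, "C": 1, "D": 7}, key="D"))  # possible ambiguity
```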

6.7 QUALITATIVE ITEM ANALYSIS

Qualitative item analysis is a non-numerical method for analyzing test items. Rather than
employing student responses, it considers test objectives, content validity, and technical
item quality.

Qualitative item analysis is very useful in assessing completion, essay, and matching-type
test items, and is generally useful for editing poorly written exams.
