CBRC LET Review (Assessment of Learning)


ASSESSMENT OF LEARNING

I. BASIC CONCEPTS

M EASURE – quantify data


E VALUATION – judge data
A SSESSMENT – interpret data
T EST – tool/instrument to gather data
II. PURPOSE OF ASSESSMENT
1. Assessment FOR learning – done before or during instruction
P- Placement Assessment – place students in specific learning groups
D- Diagnostic Assessment- strength and weaknesses
F- Formative Assessment- monitors progress
2. Assessment OF learning
- Done after instruction
- An integral part of the teaching-learning process
- Finds out if the learning objectives were achieved
 Summative Assessment- assigns grades
3. Assessment AS learning
- Self- assessment
 Teacher – to understand and perform well their role in assessment FOR and OF
learning
 Student – to become self-directed in their own learning (projects)
III. MODES OF ASSESSMENT

T raditional Assessment- pen and paper test (board exam)


A uthentic Assessment – real-life situations (e.g., selling graham
balls or peanuts in the classroom)
A lternative Assessment – methods other than paper-and-pen tests
P erformance Assessment – demonstration of products (folk dancing,
cooking in TLE)
P ortfolio Assessment – multiple indicators of student progress

IV. TYPES OF PORTFOLIO


S HOWCASE – best outputs
P ROCESS- metacognitive processes (reflection paper, journal etc.)
E VALUATION – for grading purposes (compilation)
D OCUMENTATION – learning progress (best and worst outputs, drafts,
corrections, final outputs)

V. RUBRICS
A. TYPES OF RUBRICS
1. HOLISTIC RUBRIC- general or overall impression
2. ANALYTIC RUBRIC – specific; describes each criterion separately
B. Other scoring instrument
1. Likert scale
2. Checklist
3. Rating scale
 SCORING BIASES AND ERRORS
- EXTERNAL ERRORS (directions, luck, item ambiguity, heat in the
room, lighting, sample of items, observer differences and bias, test
interruptions and scoring)
- INTERNAL ERRORS (health, mood, motivation, test-taking skills,
anxiety, fatigue, general ability)

VI. PHASES OF MAKING A TEST


PHASE 1: PLANNING STAGE
PHASE 2: TEST CONSTRUCTION/ ITEM WRITING STAGE
PHASE 3: TEST ADMINISTRATION/ TRY OUT STAGE
PHASE 4: EVALUATION STAGE

VII. DO’S AND DON’TS IN TEST CONSTRUCTION


 FIVE GENERAL RULES
1. Thou shall not provide opaque directions to students regarding how to
respond to your assessment (give clear directions/instructions)
2. Thou shall not employ ambiguous statements in your assessment items
(an ambiguous statement invites misinterpretation and incorrect responses)
3. Thou shall not provide students w/ unintentional clues to the correct
responses.
4. Thou shall not employ complex syntax in your assessment items
(avoid unnecessarily complicated sentences)
5. Thou shall not use vocabulary that is more advanced than required
(match the vocabulary level to the students)
VIII. CHARACTERISTICS OF A GOOD TEST
1. VALIDITY – appropriateness of the test; the extent to w/c the test
measures what it intends to measure
a. Face validity – physical appearance of the test
b. Content validity – alignment of the content to the curriculum objectives
c. Criterion-related validity
1. Predictive validity – criterion measured at a longer time interval
2. Concurrent validity – criterion measured at a close time interval
d. Construct validity
- determines whether the assessment is a meaningful measure of an
unobservable trait/characteristic
1. Convergent validity – correlates with measures of similar traits
2. Divergent validity – does not correlate with measures of different traits
Factors to consider in ensuring validity:
* table of specification
* appropriateness of test items
* directions
* reading vocabulary
* avoid identifiable patterns of answers
* proper arrangement of test items (simple to complex)
* sentence structure
* adequate time limits
2. RELIABILITY – consistency of scores
Methods:
a. Test-retest – repetition of the same test
b. Parallel/equivalent forms – two parallel forms of a test given to the same
group of students
c. Split-half – one test that is divided into two equivalent halves
d. Kuder-Richardson – correlating the proportion/percentage of the students
passing and not passing a given item
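The Kuder-Richardson method above can be sketched as a short computation. Below is an illustrative KR-20 implementation (the function name and sample data are hypothetical, not from the reviewer), assuming every item is scored 0 or 1:

```python
from statistics import pvariance

def kr20(item_scores):
    """Kuder-Richardson Formula 20 for dichotomously scored (0/1) items.

    item_scores: one row per student, each row a list of 0/1 item scores.
    """
    n_students = len(item_scores)
    k = len(item_scores[0])                      # number of items
    totals = [sum(row) for row in item_scores]   # total score per student
    # Sum p*q over items: p = proportion passing the item, q = 1 - p
    pq = 0.0
    for i in range(k):
        p = sum(row[i] for row in item_scores) / n_students
        pq += p * (1 - p)
    var_total = pvariance(totals)                # variance of total scores
    return (k / (k - 1)) * (1 - pq / var_total)

# Hypothetical results: 4 students x 4 items
scores = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
]
print(round(kr20(scores), 3))   # 0.667
```

A coefficient near 1 indicates consistent items; values near 0 (or negative) suggest the items do not hang together.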

TEST RELIABILITY ENHANCERS


 Use a sufficient number of items
 Make sure the assessment procedures and scoring are as objective as
possible
 Continue assessment until results are consistent
 Use shorter assessments more frequently rather than fewer long assessments.
3. FAIRNESS – eliminating personal biases and giving all students an equal
opportunity to learn
4. PRACTICALITY AND EFFICIENCY
- teacher familiarity w/ the method
- time required
- complexity of administration
- ease of scoring
- mechanical make up
IX. TYPE OF TEST
A. ADMINISTRATION
1. Individual
2. Group
B. MODE OF RESPONSE
1. Oral
2. Written
3. Performance
C. SCORING
1. Objective
a. Multiple choice
b. True/ false
c. Completion
d. Matching type
e. Short responses
2. Subjective
a. Essay
TYPES OF TEACHER- MADE TEST
1. SELECTED- RESPONSE TEST- test items w/ options
2. CONSTRUCTED- RESPONSE TEST- test items w/o options

POINTERS IN WRITING TRUE AND FALSE TEST


1. Avoid giving clues
2. Avoid using specific determiners (always, never, all, usually and impossible)
3. Avoid using trick questions (misspellings)
4. Keep item length similar
5. Avoid using negative statements & double negatives
6. Include one

3. MULTIPLE-CHOICE ITEMS – students are asked to choose the correct/best
answer out of the choices from a list
QUESTION/PROBLEM – STEM
CHOICES – OPTIONS/ALTERNATIVES
TYPES OF MULTIPLE CHOICE ITEMS
I. Question form
II. Incomplete-statement form
III. Negative stem
IV. Best answer
V. Group options
VI. Contained options
VII. Stimulus material-stem options
POINTERS FOR WRITING MULTIPLE CHOICE
1. Stem should consist of a self-contained question/problem
2. Distractors should be equally plausible and attractive
3. Avoid options that are synonyms
4. Avoid double negatives
5. Options must be written in capital letters
6. Avoid stems that reveal the answer to the next question
7. Avoid complex/awkward wording

4. MATCHING TYPE – students associate an item in one column w/ a choice in the
second column
A. PERFECT MATCHING- a response is the only answer to a premise
B. IMPERFECT MATCHING- response is the answer to more than one
premise

POINTERS FOR WRITING MATCHING TYPE


1. Use homogeneous options and stems
2. Provide more responses than premises
3. Arrange the options alphabetically or numerically
4. Limit the number of items within each list
5. Place the shorter responses in column B
6. Provide complete directions
7. Have both lists appear on the same page

2. Constructed-response test

A. Short-answer items

Pointers for Short answer items


1. Avoid open- ended items
2. Put the blank at the end
3. Use only one blank
4. Blanks for the answers should have equal length
5. Provide sufficient answer space
B. ESSAY TEST – allows greater freedom of response to questions and requires
more writing.

C. X

TYPES:
1. RESTRICTED RESPONSE – limits the form and content of the student’s
response
2. EXTENDED RESPONSE – does not limit the form and content of the
student’s response

D. Sort of Responses Being Emphasized


1. Power test – more time; difficult questions
2. Speed test – limited time; easy questions
E. TEST CONSTRUCTOR
1. Standardized – experts
2. Teacher-made – classroom teachers
F. MODE OF INTERPRETING THE RESULTS
1. Criterion-referenced (focus) – mastery of a specific set of skills
2. Norm-referenced (compare) – comparison to other students of the same
age/grade level
G. OTHER TYPES OF TEST
1. Personality test (EQ test)- emotional/social adjustments
2. Intelligence test(IQ)- measures mental ability
3. Aptitude test- potential for success
4. Achievement test- mastery of skills

X. AFFECTIVE ASSESSMENT
Two overriding categories
1. Records – quantitative measures, attitude scales (e.g., semantic differential
scales)
2. Observations – qualitative measures; assess individual/group behavior; may
be formal/informal.
XI. ITEM ANALYSIS: DIFFICULTY AND DISCRIMINATION INDEX
 DIFFICULTY INDEX – proportion of students who answered the item correctly
 DISCRIMINATION INDEX – how well the item separates the upper group from
the lower group
1. Negative discrimination index – more from the lower group than from the
upper group answered the test item correctly
2. Positive discrimination index – more from the upper group than from the
lower group answered the test item correctly
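The two indices can be made concrete with a small numeric sketch. The helper below is hypothetical (the reviewer does not prescribe a formula); it uses the common approach of comparing the upper- and lower-scoring groups:

```python
def item_analysis(upper, lower):
    """Difficulty and discrimination indices for one test item.

    upper / lower: lists of 0/1 scores on the item for the upper- and
    lower-scoring groups (e.g., top and bottom 27% by total test score).
    """
    p_upper = sum(upper) / len(upper)      # proportion correct, upper group
    p_lower = sum(lower) / len(lower)      # proportion correct, lower group
    difficulty = (p_upper + p_lower) / 2   # overall proportion answering correctly
    discrimination = p_upper - p_lower     # positive: upper group did better
    return difficulty, discrimination

# Hypothetical item: 8 of 10 upper-group and 3 of 10 lower-group students correct
diff, disc = item_analysis([1] * 8 + [0] * 2, [1] * 3 + [0] * 7)
print(round(diff, 2), round(disc, 2))   # difficulty 0.55, discrimination 0.5
```

Here the discrimination index is positive, so the item favors the upper group, as a good item should.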
XII. STATISTICS
A. Measures of Central Tendency
1. Mean-average
2. Median- middlemost data
3. Mode- most frequent data
B. Measures of Variability
1. Range
2. Variance
3. Standard deviation
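All of these measures are available directly in Python's standard `statistics` module; the score list below is made up for illustration:

```python
from statistics import mean, median, mode, pvariance, pstdev

scores = [78, 82, 85, 85, 90, 95]   # hypothetical class scores

print(mean(scores))                 # average of all scores
print(median(scores))               # middlemost value -> 85.0
print(mode(scores))                 # most frequent value -> 85
print(max(scores) - min(scores))    # range -> 17
print(pvariance(scores))            # average squared deviation from the mean
print(pstdev(scores))               # standard deviation: square root of variance
```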
C. Relative positions
1. Percentile- divide the set scores into 100 equal part
2. Quartile- divide the set of scores into four
3. Decile- divide the set of scores into nine
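The standard `statistics.quantiles` function illustrates these divisions: cutting a distribution into n equal parts takes n - 1 cut points (the sample scores are made up):

```python
from statistics import quantiles

scores = [55, 60, 62, 65, 70, 72, 75, 80, 85, 90]   # hypothetical scores

quartile_points = quantiles(scores, n=4)      # 3 cut points -> 4 equal parts
decile_points = quantiles(scores, n=10)       # 9 cut points -> 10 equal parts
percentile_points = quantiles(scores, n=100)  # 99 cut points -> 100 equal parts

print(len(quartile_points), len(decile_points), len(percentile_points))  # 3 9 99
```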
D. Correlation
- Positive correlation – direct relationship
- Negative correlation – inverse relationship
- No correlation – no relationship
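A small worked example makes the sign of the correlation concrete. The `pearson_r` helper and the data below are hypothetical, written only to show direct vs. inverse relationships:

```python
from statistics import mean, pstdev

def pearson_r(xs, ys):
    """Pearson correlation coefficient: +1 direct, -1 inverse, near 0 none."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (pstdev(xs) * pstdev(ys))

hours = [1, 2, 3, 4, 5]          # hypothetical hours of review
scores = [50, 60, 70, 80, 90]    # rises with hours -> direct relationship
errors = [10, 8, 6, 4, 2]        # falls as hours rise -> inverse relationship

print(round(pearson_r(hours, scores), 3))   # 1.0
print(round(pearson_r(hours, errors), 3))   # -1.0
```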
E. Measure of shape
1. Kurtosis
- Leptokurtic
- Mesokurtic
- Platykurtic
2. Skewness
- Positively skewed
Mean > median > mode
- Symmetrical distribution
Mean = median = mode
- Negatively skewed
Mean < median < mode
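The mean-median-mode ordering for a positively skewed set can be checked quickly (the sample data are made up for illustration):

```python
from statistics import mean, median, mode

# Hypothetical positively skewed scores: a few high values pull the mean up
scores = [60, 60, 60, 65, 70, 75, 95]

print(mode(scores))     # 60 (most frequent)
print(median(scores))   # 65 (middlemost)
print(mean(scores))     # about 69.3, so mean > median > mode
```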
XIII. K-12 CURRICULUM
