Notes
Types of Tests
(De Guzman and Adamos, 2015)
Mode of Response:
Oral test
Written test
Performance test
Test Constructor:
Standardized test
Non-standardized test
Interpreting Results:
Tests that yield norm-referenced interpretations
Tests that allow criterion-referenced interpretations
Nature of Response:
Personality test
Achievement test
Intelligence test
Roles of Assessment:
Placement role is used to determine the entry behavior of students. It
is also used to determine students’ performance at the beginning of
instruction.
Formative role is an assessment during the course of instruction rather
than after it is completed. Your on-going observation and monitoring of
students’ learning while you teach informs you what to do next.
Diagnostic role aims to determine the specific learning needs of
students so that those needs can be met through regular or remedial
instruction.
Summative role is used to determine mastery at the end of the course.
It is the process of making an overall assessment or decision about the
program.
Purposes of Assessment:
Assessment for Learning pertains to diagnostic and formative
assessment tasks which are used to determine learning needs and
monitor academic progress of students during a unit or a block of
instruction and guide instruction.
Assessment as Learning employs tasks or activities that provide
students with an opportunity to monitor and further their own learning,
to think about their personal learning habits and how they can adjust
their learning strategies to achieve their goals.
Assessment of Learning is summative and done at the end of a unit,
task, process or period. Unit tests and final projects are typical
examples of summative assessment. Assessment of learning is used
for grading, evaluation, and reporting purposes.
Learning outcomes are statements of what the student is expected to be
able to do as a result of a learning activity (Jenkins and Unwin, 2001).
One of the great advantages of learning outcomes is that they are clear
statements of what the learner is expected to achieve and how he or she is
expected to demonstrate that achievement.
Remembering
Recalling information
Sample Verbs:
Choose, cite, enumerate, identify, label, list, locate, name, outline,
quote, recall, recognize, retrieve, repeat, select, sort, state, underline,
and write
Understanding
Explaining ideas or concepts
Sample Verbs:
Abstract, categorize, clarify, classify, compare, conclude, contrast,
construct, draw, exemplify, explain, extrapolate, generalize, give,
illustrate, infer, interpret, interpolate, instantiate, map, match,
paraphrase, predict, represent, summarize, and translate
Applying
Using information in another familiar situation
Sample Verbs:
Add, apply, calculate, carry out, compute, divide, dramatize, draw,
execute, implement, manipulate, multiply, paint, practice, sequence,
show, subtract, translate, and use
Analyzing (Critical Thinking)
Breaking information into its parts to explore relationships
Sample Verbs:
Analyze, attribute, cohere, compare, contrast, criticize, deconstruct,
determine, differentiate, discriminate, distinguish, find, focus, integrate,
organize, outline, select, separate, scrutinize, and structure
Evaluating (Critical Thinking)
Justifying a decision or course of action
Sample Verbs:
Appraise, argue, assess, choose, check, conclude, coordinate, criticize,
critique, debate, decide, deduce, defend, detect, discriminate, evaluate,
infer, judge, justify, measure, monitor, predict, probe, rank, rate,
recommend, revise, and score
Creating
Putting elements together to generate new ideas or products
Sample Verbs:
Act, assemble, blend, combine, compile, compose, construct, create,
design, develop, devise, formulate, forecast, generate, hypothesize,
imagine, invent, organize, originate, predict, plan, prepare, propose,
produce, and set up
The affective domain includes the manner in which we deal with things
emotionally, such as feelings, values, appreciation, enthusiasms, motivations,
and attitudes.
Receiving (Awareness)
Sample Verbs:
Accept, acknowledge, ask, attend, describe, explain, follow, focus,
listen, locate, observe, realize, receive, recognize, and retain
Responding (React)
Sample Verbs:
Behave, cite, clarify, comply, contribute, cooperate, discuss, examine,
follow, interpret, model, perform, present, question, react, respond,
show, and study
Valuing (Attach worth)
Sample Verbs:
Accept, adapt, argue, balance, challenge, choose, confront, criticize,
debate, differentiate, defend, influence, justify, persuade, prefer,
recognize, refute, seek, and value
Organizing (Conceptualize values)
Sample Verbs:
Adapt, adjust, alter, arrange, build, change, compare, contrast,
customize, develop, formulate, improve, manipulate, modify, practice,
prioritize, reconcile, relate, and revise
Internalizing (Characterization by a value)
Sample Verbs:
Act, authenticate, characterize, defend, display, embody, habituate,
influence, internalize, practice, produce, represent, solve, validate, and
verify
Perception
The ability to use sensory cues to guide motor activities
Sample Verbs:
Choose, describe, detect, differentiate, distinguish, identify, isolate,
relate, and select
Set
Readiness to act
Sample Verbs:
Begin, display, explain, move, proceed, react, show, state, and
volunteer
Guided Response
The early stages in learning a complex skill that includes imitation and
trial and error
Sample Verbs:
Copy, trace, follow, react, reproduce, and respond
Mechanism
The intermediate stage in learning a complex skill, when learned
responses have become habitual
Sample Verbs:
Assemble, calibrate, construct, dismantle, display, fasten, fix, grind,
heat, manipulate, measure, mend, mix, organize, and sketch
Complex Overt Response
The skillful performance of motor acts that involve complex movement
patterns
Sample Verbs:
The key words are the same as for mechanism, but with adverbs or
adjectives indicating that the performance is quicker, better, or
more accurate
Adaptation
Skills are well developed and the individual can modify movement
patterns to fit special requirements
Sample Verbs:
Adapt, alter, change, rearrange, reorganize, revise, and vary
Origination
Creating new movement patterns to fit a particular situation or specific
problem
Sample Verbs:
Arrange, build, combine, compose, construct, create, design, initiate,
make, and originate
It is important that the assessment tasks mirror the learning outcomes,
since as far as the students are concerned, the assessment is the
curriculum: "From our students' point of view, assessment always
defines the actual curriculum" (Ramsden, 2003).
Teacher perspective:
1. Objectives
2. Desired learning outcomes
3. Teaching activities
4. Assessment
Student perspective:
1. Assessment
2. Learning activities
3. Outcomes
The Three Essentials of Alignment
Objective assessments are those that require one and only one correct
answer, with no other answers possible.
Example Tools:
Supply type
Matching type
Alternative-response type
Completion type
Enumeration type
Multiple choice type
Subjective assessments permit more than one possible answer, or answers
of varying quality.
Example Tools:
Restricted essay type
Extended essay type
Sentence completion
Short responses
Problem solving
Oral questioning
Performance-based assessments require students to demonstrate learning
through products or performances.
Example Tools:
Projects
Journals
Science fair
Art show
Exhibits
Portfolio collection
Observation and self-report assessments gather evidence by watching
students or by asking them to report on themselves.
Example Tools:
Teacher observations
Checklist/rating scale
Inventories
Self-report
Reliability means that scores from an instrument are stable and consistent. It
refers to the consistency of scores obtained, how consistent they are for each
individual from one administration of a test to another and from one set of
items to another.
If the test is reliable, we would expect a student who receives a high score the
first time he takes the test to receive a high score the next time he takes the
test. The scores would probably not be identical, but they should be close.
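The consistency described above is usually expressed as a correlation between the two sets of scores. Below is a minimal sketch, using made-up scores for five students, of the Pearson correlation as a test-retest reliability coefficient; a value near 1 indicates stable scores.

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two lists of scores."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# Hypothetical scores of five students on two administrations of one test
first = [85, 78, 92, 60, 74]
second = [88, 75, 90, 63, 70]

# Test-retest reliability: the scores are close but not identical, so r
# is high but below 1
print(f"r = {pearson_r(first, second):.2f}")  # r = 0.96
```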
Types of Reliability
Lecture 5: Validity
Validity is the most important idea to consider when preparing a test.
More than anything else, teachers want the information they obtain
through the use of a test to serve their purposes. In recent years,
validity has been defined as the appropriateness, correctness,
meaningfulness, and usefulness of the specific inferences teachers make
based on the data they collect.
Factual Knowledge
The basic elements that students must know to be acquainted with a
discipline or solve problems in it.
Knowledge of terminology
Knowledge of specific details and elements
Conceptual Knowledge
The interrelationships among the basic elements within a larger
structure that enable them to function together.
Knowledge of classifications and categories
Knowledge of principles and generalizations
Knowledge of theories, models, and structures
Procedural Knowledge
How to do something, methods of inquiry, and criteria for using skills,
algorithms, techniques, and methods.
Knowledge of subject-specific skills and algorithms
Knowledge of subject-specific techniques and methods
Knowledge of criteria for determining when to use appropriate
procedures
Metacognitive Knowledge
Knowledge of cognition in general as well as awareness and knowledge
of one’s own cognition
Strategic knowledge
Knowledge about cognitive tasks, including appropriate contextual and
conditional knowledge.
Self-knowledge
Remember
Retrieving relevant knowledge from long-term memory.
Understand
Determining the meaning of instructional messages, including oral,
written, and graphic communication.
Apply
Carrying out or using a procedure in a given situation.
Analyze
Breaking material into its constituent parts and detecting how the parts
relate to one another and to an overall structure or purpose.
Evaluate
Making judgments based on criteria and standards.
Create
Putting elements together to form a novel, coherent whole or make an
original product.
2. Criterion-Related Evidence of Validity
Criterion-related evidence of validity refers to the relationship
between scores obtained using the test instrument and scores obtained
using one or more other instruments (often called the criterion).
There are two forms of criterion-related validity: predictive and
concurrent. To obtain evidence of predictive validity, teachers allow a
time interval to elapse between administering the instrument and
obtaining the criterion scores.
Test Validity
A measure is valid when it measures what it is supposed to measure. If
a quarterly exam is valid, then its contents should directly measure
the objectives of the curriculum. If a personality scale is composed of
five factors, then each factor should contain items that are highly
correlated with one another. If an entrance exam is valid, it should
predict students' grades after the first semester.
Content Validity
When the items represent the domain being measured
The items are compared with the objectives of the program. The items
need to measure the objectives directly (for achievement tests) or the
definition of the construct (for scales). A reviewer conducts the
checking.
Face Validity
When the test is presented well, free of errors, and administered well.
The test items and layout are reviewed and tried out on a small group
of respondents. A manual for administration can be made as a guide
for the test administrator.
Predictive Validity
A measure should predict a future criterion. An example is an entrance
exam predicting students' grades after the first semester.
A correlation coefficient is obtained where the X variable is used as the
predictor and the Y variable as the criterion.
Construct Validity
The components or factors of the test should contain items that are
strongly correlated.
The Pearson r can be used to correlate the items within each factor.
There is also a technique called factor analysis that determines which
items are highly correlated and thus form a factor.
Concurrent Validity
When two or more measures are present for each examinee that
measure the same characteristic.
The scores on the measures should be correlated.
Fairness
The teacher should avoid bias in their assessment tasks and procedures.
Assessment tasks and procedures should be done systematically.
There are two major forms of assessment bias: offensiveness and unfair
penalization.
Offensiveness occurs if the content of the assessment offends, upsets,
distresses, angers, or otherwise creates negative affect for particular
students or a subgroup of students.
Unfair penalization is bias that disadvantages a student because of
content that makes it more difficult for students from some groups to
perform as compared to students from other groups.
Another type of assessment task bias that has received much attention
recently involves the need to accommodate exceptional children and
children with special needs.
Positive Consequences
On students:
Assessment affects the way students perceive the teacher and gives
them an indication of how much the teacher cares about them and
what they learn.
Undeniably, it is a common classroom scenario that students display a
phobia toward tests.
Students develop fear when evaluation is based on tests alone, so
students should be given the chance to evaluate their own learning.
Students can be expected to express positive feelings and thoughts
toward evaluation if they realize that, through this process, they are
given a chance to learn more. Their motivation is then directed not
toward grades but toward further learning.
On teachers:
If teachers fully internalize assessment and evaluation, they are given
the opportunity to look inward, to introspect, and to examine their own
performance. The results of an evaluation can provide information
about the areas of instruction the teacher needs to improve on.
This is an opportunity for continuous professional development.
Ethics