
Assignment

On
Behaviorism Theory of
B.F. Skinner

SUBMITTED TO: Prof. ANURADHA SHARMA


SUBMITTED BY: SWAGATA BANERJEE
B.ED SPECIAL EDUCATION (LEARNING DISABILITIES)
2ND SEMESTER
DEPARTMENT OF COMMUNITY EDUCATION AND
DISABILITY STUDIES
PANJAB UNIVERSITY
Contents
1. Introduction
2. B.F. Skinner: Operant Conditioning
3. Results
4. Conclusion
5. Critical Evaluation
6. References
INTRODUCTION
A Skinner box, also known as an operant conditioning chamber, is an enclosed
apparatus that contains a bar or key that an animal can press or manipulate in order to
obtain food or water as a type of reinforcement.
The box was developed by B. F. Skinner while he was a graduate student at Harvard
University, and it may have been inspired by Jerzy Konorski's studies. It also contained a
device that recorded each response the animal made, along with the particular schedule
of reinforcement the animal was assigned. It is used to study both operant conditioning
and classical conditioning.
Skinner was inspired to create his operant conditioning chamber as an
extension of the puzzle boxes that Edward Thorndike famously used in his research on
the law of effect. Skinner himself did not refer to this device as a Skinner box, instead
preferring the term "lever box."
An operant conditioning chamber permits experimenters to study behavior
conditioning (training) by teaching a subject animal to perform certain actions (like
pressing a lever) in response to specific stimuli, such as a light or sound signal. When
the subject correctly performs the behavior, the chamber mechanism delivers food or
other reward. In some cases, the mechanism delivers a punishment for incorrect or
missing responses. For instance, to test how operant conditioning works for certain
invertebrates, such as fruit flies, psychologists use a device known as a "heat box".
Essentially, it takes the same form as the Skinner box, but the box is composed of
two sides: one that can undergo a temperature change and one that cannot. As
soon as the invertebrate crosses over to the side that can undergo a temperature change,
the area is heated up. Eventually, the invertebrate is conditioned to stay on the side
that does not undergo a temperature change, to the extent that even when the
temperature is turned to its lowest point, the fruit fly will still refrain from approaching
that area of the heat box. These types of apparatuses allow experimenters to perform
studies in conditioning and training through reward/punishment mechanisms.
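
As a concrete illustration of this reward/punishment mechanism, the short Python sketch below simulates a highly simplified chamber. The class, method, and response names are invented for this example and are not part of Skinner's actual apparatus.

import random

class OperantChamber:
    """A highly simplified model of an operant conditioning chamber."""

    def __init__(self):
        self.rewards = 0
        self.punishments = 0

    def trial(self, response):
        """Deliver a consequence for the subject's response on one trial."""
        if response == "press_lever":
            self.rewards += 1
            return "food pellet"      # reward for the target behavior
        self.punishments += 1
        return "mild heat"            # punishment for any other response

chamber = OperantChamber()
for _ in range(10):
    # The subject initially responds more or less at random; with repeated
    # reinforcement an experimenter would expect lever pressing to dominate.
    response = random.choice(["press_lever", "groom", "explore"])
    print(response, "->", chamber.trial(response))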
B.F. SKINNER: OPERANT CONDITIONING
Skinner is regarded as the father of Operant Conditioning, but his work
was based on Thorndike’s (1898) law of effect. According to this
principle, behavior that is followed by pleasant consequences is likely to
be repeated, and behavior followed by unpleasant consequences is less
likely to be repeated.
Skinner introduced a new term into the Law of Effect: reinforcement.
Behavior which is reinforced tends to be repeated (i.e., strengthened);
behavior which is not reinforced tends to die out, or be extinguished (i.e.,
weakened).

Operant conditioning (also called instrumental conditioning) is a type of associative
learning process through which the strength of a behavior is modified by reinforcement
or punishment. It is also a procedure that is used to bring about such learning.
Although operant and classical conditioning both involve behaviors controlled by
environmental stimuli, they differ in nature. In operant conditioning, stimuli present
when a behavior is rewarded or punished come to control that behavior. For example, a
child may learn to open a box to get the sweets inside, or learn to avoid touching a hot
stove; in operant terms, the box and the stove are "discriminative stimuli". Operant
behavior is said to be "voluntary". The responses are under the control of the organism
and are operants. For example, the child may face a choice between opening the box and
petting a puppy.
In contrast, classical conditioning involves involuntary behavior based on the
pairing of stimuli with biologically significant events. The responses are under the
control of some stimulus because they are reflexes, automatically elicited by the
appropriate stimuli. For example, sight of sweets may cause a child to salivate, or the
sound of a door slam may signal an angry parent, causing a child to tremble. Salivation
and trembling are not operants; they are not reinforced by their consequences, and they
are not voluntarily "chosen".
However, both kinds of learning can affect behavior. Classically conditioned
stimuli—for example, a picture of sweets on a box—might enhance operant
conditioning by encouraging a child to approach and open the box. Research has shown
this to be a beneficial phenomenon in cases where operant behavior is error-prone.

The study of animal learning in the 20th century was dominated by the analysis of these
two sorts of learning, and they are still at the core of behavior analysis. They have also
been applied to the study of social psychology, helping to clarify certain phenomena
such as the false consensus effect.

Positive Reinforcement
Skinner showed how positive reinforcement worked by placing a hungry rat in his
Skinner box. The box contained a lever on its side, and as the rat moved about the box,
it would accidentally knock the lever. As soon as it did so, a food pellet would drop
into a container next to the lever.
After being placed in the box a few times, the rats quickly learned to go straight to the
lever. The consequence of receiving food when they pressed the lever ensured that they
would repeat the action again and again.
Positive reinforcement strengthens a behavior by providing a consequence an
individual finds rewarding. For example, if your teacher gives you £5 each time you
complete your homework (i.e., a reward) you will be more likely to repeat this behavior
in the future, thus strengthening the behavior of completing your homework.
Negative Reinforcement
The removal of an unpleasant reinforcer can also strengthen behavior. This is known as negative
reinforcement because it is the removal of an adverse stimulus which is ‘rewarding’ to the animal or
person. Negative reinforcement strengthens behavior because it stops or removes an unpleasant
experience. For example, if you do not complete your homework, you give your teacher £5. You will
complete your homework to avoid paying £5, thus strengthening the behavior of completing your
homework.
Skinner showed how negative reinforcement worked by placing a rat in his Skinner box and then
subjecting it to an unpleasant electric current which caused it some discomfort. As the rat moved about
the box it would accidentally knock the lever. As soon as it did so, the electric current would be
switched off. After being placed in the box a few times, the rats quickly learned to go straight to the
lever. The consequence of escaping the electric current ensured that they would repeat the action again
and again. In fact, Skinner even taught the rats to avoid the electric current by turning on a light just
before the electric current came on. The rats soon learned to press the lever when the light came on
because they knew that this would stop the electric current from being switched on.
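
The escape and avoidance procedure described above can be sketched as a simple rule: a warning light precedes the current, and a lever press either switches the current off (escape) or prevents it from coming on (avoidance). The Python example below is illustrative only; the parameter names and trial structure are assumptions made for this sketch.

def negative_reinforcement_trial(rat_presses_lever, light_is_on):
    """One trial of the escape/avoidance procedure.

    Returns a description of the consequence the rat experiences."""
    if light_is_on and rat_presses_lever:
        return "current never switched on (avoidance)"
    if rat_presses_lever:
        return "current switched off (escape)"
    return "current stays on (aversive stimulus continues)"

# Early trials: the rat presses only after the current has started (escape).
print(negative_reinforcement_trial(rat_presses_lever=True, light_is_on=False))
# Later trials: the rat presses as soon as the warning light comes on (avoidance).
print(negative_reinforcement_trial(rat_presses_lever=True, light_is_on=True))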

Schedules of Reinforcement
Imagine a rat in a “Skinner box.” In operant conditioning, if no food pellet is delivered immediately
after the lever is pressed, then after several attempts the rat stops pressing the lever (how long would
someone continue to go to work if their employer stopped paying them?). The behavior has been
extinguished. Behaviorists discovered that different patterns (or schedules) of reinforcement had
different effects on the speed of learning and extinction. Ferster and Skinner (1957) devised different
ways of delivering reinforcement and found that this had effects on:
1. The Response Rate - The rate at which the rat pressed the lever (i.e., how hard the rat
worked).
2. The Extinction Rate - The rate at which lever pressing dies out (i.e., how soon the rat gave
up).
Skinner found that the type of reinforcement which produces the slowest rate of extinction (i.e.,
people will go on repeating the behavior for the longest time without reinforcement) is variable-ratio
reinforcement. The type of reinforcement which has the quickest rate of extinction is continuous
reinforcement.
(A) Continuous Reinforcement
An animal/human is positively reinforced every time a specific behavior occurs, e.g., every time the
lever is pressed a pellet is delivered; extinction is then tested by shutting off food delivery.
• Response rate is SLOW
• Extinction rate is FAST

(B) Fixed Ratio Reinforcement

Behavior is reinforced only after it occurs a specified number of times, e.g., one
reinforcement is given after every fifth correct response. For example, a child receives a star for
every five words spelled correctly.
• Response rate is FAST
• Extinction rate is MEDIUM

(C) Fixed Interval Reinforcement

One reinforcement is given after a fixed time interval, providing at least one correct response has
been made. An example is being paid by the hour. Another example would be that every 15 minutes
(half hour, hour, etc.) a pellet is delivered (providing at least one lever press has been made); food
delivery is then shut off to test extinction.
• Response rate is MEDIUM
• Extinction rate is MEDIUM

(D) Variable Ratio Reinforcement

Behavior is reinforced after an unpredictable number of responses, for example gambling or fishing.
• Response rate is FAST
• Extinction rate is SLOW (very hard to extinguish because of unpredictability)

(E) Variable Interval Reinforcement

Providing one correct response has been made, reinforcement is given after an unpredictable amount
of time has passed, e.g., on average every 5 minutes. An example is a self-employed person being
paid at unpredictable times.
• Response rate is FAST
• Extinction rate is SLOW
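
One way to make these five schedules concrete is to write each one as a simple decision rule that says whether a given response is reinforced. The Python sketch below is illustrative only; the function names, ratios, and intervals are arbitrary choices for this example, not values from Ferster and Skinner (1957).

import random

def continuous(response_count):
    """(A) Reinforce every response."""
    return True

def fixed_ratio(response_count, ratio=5):
    """(B) Reinforce every `ratio`-th response, e.g., every 5th correct response."""
    return response_count % ratio == 0

def fixed_interval(seconds_since_last_reward, interval=900):
    """(C) Reinforce the first response once a fixed time (here 15 minutes) has passed."""
    return seconds_since_last_reward >= interval

def variable_ratio(response_count, mean_ratio=5):
    """(D) Reinforce after an unpredictable number of responses (on average 1 in mean_ratio)."""
    return random.random() < 1.0 / mean_ratio

def variable_interval(seconds_since_last_reward, mean_interval=300):
    """(E) Reinforce the first response after an unpredictable delay (on average 5 minutes)."""
    return seconds_since_last_reward >= random.expovariate(1.0 / mean_interval)

# Example: which of 20 consecutive lever presses would be reinforced
# under a fixed-ratio versus a variable-ratio schedule?
for n in range(1, 21):
    print(n, "FR:", fixed_ratio(n), "VR:", variable_ratio(n))

Under the variable-ratio rule the reinforced responses are unpredictable, which is the property associated above with the slowest rate of extinction.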

Behavior Modification

Behavior modification is a set of therapies/techniques based on operant conditioning (Skinner,
1938, 1953). The main principle involves changing the environmental events that are related to a
person's behavior, for example, reinforcing desired behaviors and ignoring or punishing
undesired ones. This is not as simple as it sounds: always reinforcing desired behavior, for
example, is basically bribery. There are different types of positive reinforcement. Primary
reinforcement is when a reward strengthens a behavior by itself. Secondary reinforcement is when
something strengthens a behavior because it leads to a primary reinforcer.

Examples of behavior modification therapy include token economy and behavior shaping.
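
As a rough sketch of how secondary reinforcement underlies a token economy, the Python example below models tokens that are earned for desired behavior and later exchanged for a primary reinforcer. All names and values are invented for illustration, not drawn from a specific therapy protocol.

class TokenEconomy:
    """Toy model of a token economy: tokens are secondary reinforcers that
    gain their value because they can be exchanged for a primary reinforcer."""

    def __init__(self, tokens_per_exchange=5):
        self.tokens = 0
        self.tokens_per_exchange = tokens_per_exchange

    def reinforce(self, desired_behavior_occurred):
        # Desired behavior earns a token; undesired behavior is simply ignored.
        if desired_behavior_occurred:
            self.tokens += 1

    def exchange(self):
        # Enough tokens can be traded for a primary reinforcer.
        if self.tokens >= self.tokens_per_exchange:
            self.tokens -= self.tokens_per_exchange
            return "primary reinforcer (e.g., a preferred snack or activity)"
        return "not enough tokens yet"

economy = TokenEconomy()
for occurred in [True, True, False, True, True, True]:
    economy.reinforce(occurred)
print(economy.exchange())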

RESULTS
Operant conditioning chambers are small environments designed to contain an animal subject.
They are generally structured to block external light and sound in order to prevent distracting
stimuli from interfering with experiments. The box ensures behaviors are conditioned
appropriately and rewards are timed correctly. The purpose of the Skinner box is to analyze
animal behavior by detecting when an animal has performed a desired behavior and then
administering a reward, thus determining how long it takes the animal to learn to perform the
behavior. If the goal of the box is to teach a rat to press a lever, for example, pressing the lever
might cause food to fall out of a chute. The rat will likely only push the lever accidentally at
first, but eventually it will learn that food appears when it does so. The rat will then begin to perform
the behavior independently.
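
A minimal simulation can illustrate the learning curve described above, in which accidental presses gradually become deliberate. The starting probability, learning rate, and trial count in the Python sketch below are arbitrary assumptions, not measured values.

import random

def simulate_lever_learning(trials=50, learning_rate=0.2):
    """Toy model: each reinforced lever press raises the probability
    that the subject presses the lever on the next trial."""
    p_press = 0.05            # at first the lever is pressed only by accident
    history = []
    for _ in range(trials):
        if random.random() < p_press:     # the subject happens to press the lever
            # food is delivered, strengthening the behavior
            p_press = min(1.0, p_press + learning_rate * (1.0 - p_press))
        history.append(round(p_press, 2))
    return history

print(simulate_lever_learning())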

When used to study classical conditioning, the chamber may be structured so that an automatic
animal behavior, such as scratching, is paired with an unconditioned stimulus, such as a flashing
light. When the animal scratches, the light will flash, and the animal will be given a reward.
After several trials, the animal will likely associate the unconditioned stimulus with the reward.
The Skinner box has been used in pharmaceutical research to observe the effects of drugs on
animal behavior.

CONCLUSION
Looking at Skinner's classic studies of pigeon and rat behavior, we can identify some of the major
assumptions of the behaviorist approach.
Psychology should be seen as a science, to be studied in a scientific manner. Skinner's study of
behavior in rats was conducted under carefully controlled laboratory conditions.
Behaviorism is primarily concerned with observable behavior, as opposed to internal events like
thinking and emotion. Note that Skinner did not say that the rats learned to press a lever because they
wanted food. He instead concentrated on describing the easily observed behavior that the rats
acquired.
The major influence on human behavior is learning from our environment. In the Skinner study,
because food followed a particular behavior, the rats learned to repeat that behavior, i.e., operant
conditioning.
There is little difference between the learning that takes place in humans and that in other animals.
Therefore research (e.g., operant conditioning) can be carried out on animals (Rats / Pigeons) as well
as on humans. Skinner proposed that the way humans learn behavior is much the same as the way
the rats learned to press a lever.
So, if your layperson's idea of psychology has always been of people in laboratories wearing white
coats and watching hapless rats try to negotiate mazes in order to get to their dinner, then you are
probably thinking of behavioral psychology.
Behaviorism and its offshoots tend to be among the most scientific of the psychological perspectives.
The emphasis of behavioral psychology is on how we learn to behave in certain ways.
We are all constantly learning new behaviors and how to modify our existing behavior. Behavioral
psychology is the psychological approach that focuses on how this learning takes place.

Critical Evaluation
Operant conditioning can be used to explain a wide variety of behaviors, from the process of learning, to
addiction and language acquisition. It also has practical applications (such as token economy) which can be
used in classrooms, prisons and psychiatric hospitals.
However, operant conditioning fails to take into account the role of inherited and cognitive factors in learning,
and thus is an incomplete explanation of the learning process in humans and animals.
For example, Kohler (1924) found that primates often seem to solve problems in a flash of insight rather than by
trial-and-error learning. Also, social learning theory (Bandura, 1977) suggests that humans can learn
automatically through observation rather than through personal experience.
The use of animal research in operant conditioning studies also raises the issue of extrapolation.
Some psychologists argue that we cannot generalize from studies on animals to humans, as their anatomy and
physiology are different from those of humans, and they cannot think about their experiences and invoke reason,
patience, memory or self-comfort.
REFERENCES
1. https://www.simplypsychology.org/

2. https://www.psychestudy.com/behavioral

3. https://en.wikipedia.org/wiki/Operant_conditioning

4. https://www.verywellmind.com/operant-conditioning-a2-2794863

5. https://www.khanacademy.org/test-prep/mcat/behavior/learning-slug/a/classical-and-operant-conditioning-article
