
1. Who is Burrhus Frederic Skinner? (his background) 5 pts.

 Burrhus Frederic Skinner (1904-1990) was born in Susquehanna, Pennsylvania,
and his ideas remain highly influential in contemporary psychology.
 Skinner majored in English and was inspired by Robert Frost to become a
writer.
 His father did not like the idea, since he wanted Skinner to become a
lawyer.
 Skinner pursued psychology at Harvard and found that the scientific
approach suited him better than his literary pursuits.
 He later taught psychology at several schools and became influential as he
continued to thrive in his scientific endeavors.
 Skinner made major contributions to the study of reinforcement and response,
and he concluded that the effects of reinforcement and punishment are not
symmetrical; that is, reinforcement changes the probability of a response’s
recurring, but punishment does not.

2. What is his Learning Theory and what is it all about? 5 pts.


 Radical Behaviorism - rejects scientific language and interpretations that refer to
mentalistic events (e.g., drive, motivation, and purpose) in order to explain certain
aspects of human and nonhuman behavior.
  Only the observable and measurable aspects of the environment, of an
organism’s behavior, and of the consequences of that behavior are the
critical material for scientific scrutiny.
 Respondent and Operant Behavior
 (1) respondent behavior (unconditioned responses), which is elicited by a
known stimulus (e.g. all reflexes, such as jerking one’s hand when jabbed
with a pin, the constriction of the pupil of the eye when it is exposed to
bright light, and salivation in the presence of food). 
 (2) operant behavior (conditioned responses), which is not elicited by a
known stimulus but is simply emitted by the organism. Most of our daily
activities are operant behaviors. 

3. Compare his different types of behavior and his different types of conditioning. 10
pts
 Respondent behavior is elicited by stimuli and occurs automatically in the
presence of these stimuli just like with reflexes. They are elicited by antecedent
stimuli and are not sensitive to their consequences. On the other hand, Operant
behavior is sensitive to contingencies. It is controlled by many variables,
including the schedule of reinforcement, the response effort required, and the
density of reinforcement available for the operant class of interest.
 Respondent conditioning emphasizes the importance of the stimulus in eliciting
the desired response. It’s a procedure of learning that associates an unconditioned
stimulus that already brings about an unconditioned response with a new neutral
stimulus so that it can elicit the same response. The new stimulus then becomes a
conditioned stimulus and the newly learned behavior is a conditioned response.
In contrast, operant conditioning emphasizes the response: behavior is
controlled by its consequences. Reinforcement and punishment are the
processes of delivering consequences that either increase or decrease the
target behavior.

4. What are the principles of Operant Conditioning? 5 pts


 Two general principles are associated with Operant Conditioning: (1) any
response that is followed by a reinforcing stimulus tends to be repeated; and (2) a
reinforcing stimulus is anything that increases the rate at which an operant
response occurs. Therefore, we can say that a reinforcer is anything that increases
the probability of a response’s recurring.

5. Illustrate what a Skinner box is. 3 pts.


 The Skinner box usually has a grid floor, a light, a lever, and a food cup. It is
arranged so that when the animal depresses the lever, the feeder mechanism is
activated, and a small pellet of food is released into the food cup.

6. What are the steps of conditioning in the lever-pressing response? Give your own
specific example or method. 6 pts
 Deprivation - the experimental animal is put on a deprivation schedule. If food is
to be used as the reinforcer, the animal is deprived of food.
 Example: Practicing the ukulele is hard. I used a tuning app so that I
could easily tune my ukulele, but I did not want to depend on the tuner,
so I deleted the application.

 Magazine Training - after being on a deprivation schedule for a number of days,
the animal is placed into the Skinner box. Using an external hand switch, the
experimenter periodically triggers the feeder mechanism.
 Example: After studying the notes (G, C, E, A) of my concert ukulele, I
learned what it should sound like in tune. I would gradually adjust the
strings until they were in tune, because singing with a mistuned
instrument does not sound good at all. I would do this every time I
thought my ukulele was out of tune.

 Lever Pressing - now the animal can be left in the Skinner box on its own.
Eventually, it will press the lever, which will fire the food magazine, producing a
click that reinforces the bar press and also signals the animal to go to the food
cup, where it is reinforced by food.
 Example: Now I can easily adjust my ukulele’s tuners. Even without the
application, I can tune my instrument by ear. Wherever I go, I am
confident that even when it’s out of tune, I can easily fix it and play
properly.
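The three steps above can be sketched as a toy simulation. This is only an illustration: the function names, probabilities, and increments below are hypothetical, not Skinner's actual parameters (deprivation is assumed, so food works as a reinforcer).

```python
import random

def magazine_training(trials=20):
    """Pair the feeder's click with food until the click becomes a signal."""
    click_value = 0.0
    for _ in range(trials):
        # the experimenter fires the feeder: click, then food
        click_value += 0.1 * (1.0 - click_value)
    return click_value

def lever_pressing(click_value, steps=200, seed=0):
    """Each press fires the magazine: click + food strengthen pressing."""
    random.seed(seed)
    press_prob = 0.05          # low baseline rate of spontaneous pressing
    presses = 0
    for _ in range(steps):
        if random.random() < press_prob:   # the animal happens to press
            presses += 1
            # the click (secondary reinforcer) and food strengthen the press
            press_prob = min(0.95, press_prob + 0.1 * click_value)
    return press_prob, presses

cv = magazine_training()
final_prob, total = lever_pressing(cv)
print(f"click value: {cv:.2f}, press probability: {final_prob:.2f}, presses: {total}")
```

The point of the sketch is the ordering: magazine training must come first, so that by the time the animal presses the lever, the click already has signal value to reinforce the press.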

7. What is Shaping? Give your own specific example or method. 3 pts 


 It is used to produce extremely complex behavior by delivering rewards and
punishments for successive approximations, encouraging the organism to move
closer and closer to the desired behavior each time.
 Example: In training my dog to perform a “sit” trick, I would give him
food if he tried to focus on me and my command and stayed down (even
though he still hadn’t figured out how to actually “sit”; he just stayed
down, not in a proper sitting position). Gradually, I would give my dog
pellets and a cuddle or a pat whenever he attempted to perform “sitting”.
Over time, I would only give him pellets if he properly executed the “sit”
trick. Through this approximation, my dog learned how to sit, and he does
it correctly every time I command him to do so.
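Successive approximation can be sketched as a toy simulation. Here the "behavior" is just a number the learner emits; any attempt closer to the target than the best so far is reinforced, so the criterion tightens with each success. The target, step size, and function name are hypothetical.

```python
import random

def shape(target=10.0, trials=400, seed=1):
    random.seed(seed)
    behavior = 0.0                          # starting point, far from target
    best = abs(behavior - target)           # current criterion to beat
    for _ in range(trials):
        attempt = behavior + random.uniform(-1.0, 1.0)   # random variation
        if abs(attempt - target) < best:    # a closer approximation: reinforce
            behavior = attempt              # reinforced variants persist
            best = abs(attempt - target)    # demand a closer one next time
    return behavior

final = shape()
print(f"behavior after shaping: {final:.2f}")
```

Because only attempts that beat the current best are reinforced, the behavior drifts steadily toward the target, mirroring how the dog's rough "staying down" is gradually refined into a proper sit.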

8. What is Extinction? 1 pt
 Extinction occurs when the reinforcer is removed from an operant conditioning
situation; the response rate then returns to the level it was at before reinforcement
was introduced.

9. What is Spontaneous Recovery? 2 pts


 It occurs when, after extinction and a period of rest, an animal or a subject in an
operant conditioning experiment once again performs the previously reinforced
behavior even though reinforcement is no longer given.

10. What is Superstitious Behavior? 2 pts


 It is the development of strange ritualistic responses to response-independent
reinforcement. For example, a rat might act as if circling around produces
pellets, or as if standing up again and again will produce the pellet. These are
ritualistic behaviors that subjects perform because reinforcement, delivered
independently of their behavior, happened to follow those responses.
11. What is Discriminative operant and when does it happen? 3 pts
  It is an operant response given to one set of circumstances but not to another.
It happens when responding is reinforced only in the presence of a particular
signal, the discriminative stimulus. The arrangement involves a signal that leads
to a response, which in turn leads to reinforcement, symbolized as
SD → R → SR, where SD is the discriminative stimulus, R is the operant
response, and SR is the reinforcing stimulus.
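The SD → R → SR arrangement can be sketched as a toy simulation, with a light as the discriminative stimulus. The probabilities and learning rate below are hypothetical illustrations.

```python
import random

def train(trials=2000, lr=0.05, seed=2):
    """Pressing produces food only when the light (SD) is on."""
    random.seed(seed)
    p_press = {"light_on": 0.5, "light_off": 0.5}   # initial tendencies
    for _ in range(trials):
        state = random.choice(["light_on", "light_off"])
        if random.random() < p_press[state]:        # R: the operant response
            reinforced = (state == "light_on")      # SR only in presence of SD
            delta = lr if reinforced else -lr       # strengthen or extinguish
            p_press[state] = min(0.99, max(0.01, p_press[state] + delta))
    return p_press

p = train()
print(p)   # pressing becomes frequent with the light on, rare with it off
```

After training, responding is under stimulus control: the light on sets the occasion for pressing, and the light off does not.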

12. What are the differences of Primary, Secondary and Generalized Reinforcers? 3 pts
 primary reinforcer - these are reinforcers that do not need to be learned, such as
food, water, oxygen, warmth and sex.  These are all primary drives that we have
for basic survival and if they are deprived in any way, gaining access to these
reinforcers is very motivating.
 secondary reinforcer - is something that needs to be learned through pairings
with primary reinforcers. For example, a click or token that is repeatedly paired
with food becomes reinforcing in its own right.
 generalized reinforcer - is a secondary reinforcer that has been paired with more
than one primary reinforcer. For example, money because it is associated with a
number of primary reinforcers. 

13. What is Chaining? Give your own specific example or method. 3 pts
 One response can bring the organism into contact with stimuli that act as a
discriminative stimulus for another response, which in turn causes it to experience
stimuli that cause a third response, and so on. 
 various elements of a behavioral chain are held together by secondary
reinforcers but the entire chain depends on a primary reinforcer.
 Example: I learned to play the piano three years ago. Chaining helped me
learn the positions of the chords C, D, E, F, G, A, and B. These chords
come in succession, and once I knew where chord C was, I would know
where chords D, E, F, G, A, and finally B would be when I play, each
position cueing the next.

14. What is the difference between Positive and Negative Reinforcement? 2 pts
 Positive reinforcement is the occurrence of an appetitive stimulus following a
response, that acts to increase the frequency of that response while negative
reinforcement is the termination of an aversive stimulus, following a response,
that acts to increase the frequency of that response.

15. What is the difference between Punishment and Negative Reinforcement? 2 pts
 A punishment acts to decrease the frequency of a response by introducing an
aversive stimulus while a negative reinforcement acts to increase the frequency of
such response by removing the said aversive stimulus. 

16. Give 3 alternatives of Punishment. 3 pts


 reinforce behavior incompatible with the undesirable behavior 
 allow the undesirable response to satiate by letting the organism perform it
until it is sick of it
 extinction - withhold the reinforcement that maintains the undesirable behavior

17. What are Schedules of Reinforcement? Give your own specific example or method.
15 pts
 continuous reinforcement schedule (CRF) - every correct response during
acquisition is reinforced. It is used for strengthening newly learned behaviors.
 Example: I would give my dog pellets every time he gets the command
right. 
 fixed interval reinforcement schedule (FI) - the animal is reinforced for a
response made only after a set interval of time. For example, only a response
following a three-minute interval is reinforced.
 Example: I study more often whenever a major exam is approaching.
After the exam, my studying behavior decreases. When another major
exam is scheduled, I increase my frequency of studying again, and the
behavior is strengthened.

 fixed ratio reinforcement schedule (FR) - occurs when every nth response that
the animal makes is reinforced. FR5, for example, means that the animal will be
reinforced at every fifth response. 
 Example: FR2 schedule: For our AnaPhy subject, I would read two (2)
chapters from our PDF material and then take a break.

 variable interval reinforcement schedule (VI) - the animal is reinforced for
responses made at the end of time intervals of variable durations. That is, rather
than having a fixed time interval, as with FI schedules, the animal is reinforced on
the average of, say, every three minutes, but it may be reinforced immediately
after a prior reinforcement.
 Example: VI 10 mins: My Papa usually fetches me from school. My
parents were very keen about us, perhaps to ensure our safety. I turn my
phone off during school hours (HNU-SHS was very strict LOL) and I only
turn it on some time between 5:00 PM and 5:15 PM (on average, after 10
minutes). The “reward” of my answering the phone keeps my Papa calm,
knowing I am safe, and puts his calling behavior on a VI schedule: he
calls every few minutes until I pick up.

 variable ratio reinforcement schedule (VR) - the animal is reinforced, on
average, after every nth response. This eliminates the steplike cumulative
record found with the FR schedule and produces the highest response rate of
the five schedules considered thus far.
 Example: Bohol has a booming call center industry, and sales bonuses
are often awarded to call center employees. They never know how many
calls they need to make to receive their bonus; however, by making more
calls they increase their chances, so they earn more sales and bonuses.
 concurrent schedules and the matching law - Skinner (1950) trained pigeons to
peck two operant keys that were available at the same time but that delivered
reinforcements under different schedules. This procedure is referred to as a
concurrent reinforcement schedule. He reported that the pigeons distributed their
responses according to the schedules of reinforcement associated with each key
and continued to do so during extinction. The matching law summarizes this:
the relative rate of responding on each key matches the relative rate of
reinforcement obtained from it.
 Example: On a weekend, I am making my assignment in PSY 119 while
listening to music and chatting with my friends. 
 concurrent chain reinforcement schedule - an animal’s behavior during the
initial phase of the experiment determines what schedule of reinforcement it
experiences during the second, or terminal, phase. 
 Example: An employee has two options on a particular day: spend time
with his girlfriend or go to work. Going out with his girlfriend is a small
but immediate reinforcement (a day’s leisure), while going to work is a
large delayed reinforcement (no deduction from his salary, and therefore
more money on payday). The employee would choose the former if the
time required is short, but given more time, he would surely choose the
latter.
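The four basic schedules above can be sketched as small functions that decide which responses earn reinforcement. The parameters, names, and the response stream below are hypothetical illustrations.

```python
import random

def fixed_ratio(responses, n=5):
    """FR-n: every nth response is reinforced."""
    return sum(1 for i in range(1, len(responses) + 1) if i % n == 0)

def variable_ratio(responses, mean_n=5, seed=3):
    """VR-n: reinforcement after a variable number of responses (mean ~n)."""
    random.seed(seed)
    rewards, count = 0, 0
    required = random.randint(1, 2 * mean_n - 1)
    for _ in responses:
        count += 1
        if count >= required:
            rewards += 1
            count = 0
            required = random.randint(1, 2 * mean_n - 1)
    return rewards

def fixed_interval(responses, interval=3.0):
    """FI-t: the first response at least t time units after the last
    reinforcement is reinforced."""
    rewards, last = 0, 0.0
    for t in responses:
        if t - last >= interval:
            rewards += 1
            last = t
    return rewards

def variable_interval(responses, mean_t=3.0, seed=4):
    """VI-t: like FI, but the required interval varies around a mean of t."""
    random.seed(seed)
    rewards, last = 0, 0.0
    wait = random.uniform(0.1, 2 * mean_t)
    for t in responses:
        if t - last >= wait:
            rewards += 1
            last = t
            wait = random.uniform(0.1, 2 * mean_t)
    return rewards

times = [0.5 * i for i in range(1, 61)]   # one response every 0.5 time units
fr = fixed_ratio(times)                   # 60 responses -> exactly 12 reinforcers
fi = fixed_interval(times)                # reinforced at t = 3, 6, ..., 30
vr = variable_ratio(times)
vi = variable_interval(times)
print("FR5:", fr, "FI3:", fi, "VR5:", vr, "VI3:", vi)
```

Note the contrast the definitions describe: ratio schedules count responses (so faster responding earns reinforcement sooner), while interval schedules only care about elapsed time, which is why ratio schedules sustain higher response rates.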

18. What are the differences of Fading, Generalization and Discrimination? Give your
own specific example or method. 6 pts
 Fading involves the gradual change of a stimulus while the response stays about
the same.
 Example: I taught my dog the “sit” trick by commanding it loudly (with
my voice) and pushing it down so that it would learn the “sit” command.
I would gradually change my method, commanding it in a soft voice and
then switching completely to hand signals.
 Generalization occurs when an organism makes the same response to different
stimuli. A classically conditioned response to a slightly different signal will
depend on its resemblance to the original.
 Example: I told my two-year-old cousin that what she saw in Carmen was
a cow. Whenever she sees other four-legged animals, she says it’s a cow
when in fact it is not.
 Discrimination occurs when an organism responds differently to two stimuli.
 Example: Our food was eaten by an unidentified animal. At first I thought
it was a cat, but the paw prints on our white table showed otherwise. They
were too big to be a cat’s, so I knew that it was our dog Caesar who ate
our food on the table.
