19 Jun

Introduction/Background to Operant Conditioning

The first behavioral theory emerged in the early 1900s, when Watson reacted against the then-popular method of introspection for studying mental consciousness. Instead, he proposed a science focused on observable and measurable human behavior. Watson paved the path for Skinner’s work on operant conditioning, which began in the 1930s. Skinner became interested in the psychology of learning after reading Watson’s and Pavlov’s writings. He drew from Watson’s belief that only external behavior should be studied and from Pavlov’s work on unlearned (unconditioned) and learned (conditioned) responses. Skinner’s theory also builds on William James’ early theory of functionalism, with its focus on interaction with the environment, and on Thorndike’s theory of connectionism, which pairs stimuli with responses. Skinner formulated his theory from laboratory studies with animals, from which he identified the components of operant conditioning, and he published his findings in 1938 in his book The Behavior of Organisms (Schunk, 2012). He then extended these ideas and applied them to human behavior and learning. A timeline and graphical representation of these theories and their connections to Skinner’s operant conditioning can be found at the end (see Table 1).

Epistemology of Operant Conditioning

Operant conditioning focuses on the prediction and control of behavior through observable behavior that can be measured. It uses behavioral principles to explain learned behavior and centers on the connection between stimulus and response. Reinforcements are used to strengthen response behaviors and can vary with the situation. Extinction is a decline in response strength due to the cessation of reinforcement. Primary and secondary reinforcers are the drivers of behavior, and schedules of reinforcement are applied in different ways to strengthen a behavior. The basic operant method for changing behavior is shaping (differential reinforcement), which strengthens successive approximations to the desired form of behavior. Other operant conditioning techniques are chaining, behavior modification, punishment, and self-regulation (Schunk, 2012). Learning has occurred when the observable behavior has changed. Since the 1930s this theory has developed into the fields of experimental and applied behavior analysis. Skinner is still regarded as the most influential contributor (Schunk, 2012), and the Journal of Applied Behavior Analysis (JABA) is regarded as a key outlet for current research on applying the experimental analysis of behavior to problems of social importance (Wiley Online Library, 1999-2015).

Learning Process

Skinner’s operant conditioning not only explains human behavior but can also be applied to how people learn. It holds that learning is the result of a conditioned response to a stimulus. The process of learning involves the presence of a discriminative stimulus that occasions a behavioral response, which is followed by a reinforcing stimulus. Information is acquired by interacting with the environment, accessing reinforcements, and repeating learned responses. Complex behaviors can be learned in gradual approximations through shaping, and learned behaviors are strengthened through targeted schedules of reinforcement. Generalization of a behavior transfers learning from one environment to the next. Operant conditioning provides a useful explanation for how students learn new behaviors such as reading. In learning to read, reinforcement is first given for repeating sounds when letters are seen on cards, then for blending the letters together to produce sounds. Students are then reinforced for putting these blended letters together to form words. Much practice and repetition is involved in shaping these behaviors. Reading behavior is then generalized from cards and workbooks to books of sequential levels. Reinforcement can be provided in many different forms. It can take the form of social praise from teachers and adults; or, when I was younger, I read books and earned points toward a free pizza from Pizza Hut. Operant conditioning works well in learning environments centered on behavioral objectives, where the task can be taught with direct, sequential instruction and programmed instructional time is given to deliver the instruction and reinforcement.

Within behavioral learning theory there are some slight discrepancies among the different theories. In some ways, Skinner’s operant conditioning challenged earlier theorists’ accounts of how humans respond. One example is comparing Thorndike’s Law of Disuse to Skinner’s extinction. Thorndike proposed that once a response is no longer made to a stimulus, the connection is forgotten. For Skinner, extinction is not the same as forgetting; it is the decline of response strength due to nonreinforcement. Thorndike would counter that when a behavior is no longer performed, the association is eventually forgotten forever (Schunk, 2012). An example that supports Skinner’s explanation over Thorndike’s is that even though we now type on computers, students can still use pencil and paper to write. Another rival to a principle of operant conditioning is Guthrie’s view of generalization. Guthrie proposed that the environment and stimuli need to be exactly the same for transfer of learning to occur. This contrasts with Skinner’s idea that a response may occur to other, different stimuli once a pattern of responding has been established with the initial stimuli (Schunk, 2012). A current-day example that supports Skinner’s view is that students are learning in online environments and no longer need a traditional classroom with a teacher to apply and demonstrate learning behaviors. Skinner’s operant conditioning theory was built upon the work of many early researchers when the field of psychology was still relatively young. Earlier theorists such as Thorndike and Guthrie proposed the foundational ideas that Skinner then developed further. Skinner’s behavioral principles seem to provide more advanced explanations for why behavior sometimes occurs and why it sometimes does not.

Instructional Implications

Reading is a skill that can be shaped from a series of repeated and reinforced behaviors. Operant conditioning principles are well suited to this outcome because the learning objective of reading is an observable behavior and can be taught through sequential learning. Shaping is an effective approach for accomplishing this advanced task. In learning to read, students start with the smaller behaviors of identifying and sounding out letters and sounds. Students can participate in practice sessions and small groups to repeat the behaviors. When they produce the correct response, they are reinforced. If they produce an incorrect response, they are reminded of the correct response and asked to repeat it until the right response is given. In teaching reading, this means the teacher needs to break the behavior of reading down into small, successive steps, listen for correct responses, and provide effective reinforcement. Knowing what specific reinforcement motivates each student is important. The teacher can even have the whole class say a sound together before asking the individual student to produce the correct response on his or her own. To reach mastery, some students may need little practice while others may need consistent, repeated drill. The teacher will also need to take the students’ age into account in identifying what level of reading they need to master. In the early stages it may be important to keep the situation and the prompts the same for the students, as Guthrie would support. However, using Skinner’s principle of generalization, the teacher will be able to assign reading homework that can be done in other environments such as the home and the library. Imagine if we could only read in the same environment, with the same stimuli, where we learned to read!


References

Schunk, D.H. (2012). Learning theories: An educational perspective. Boston: Pearson.

Wiley Online Library. (1999-2015). Journal of Applied Behavior Analysis. Retrieved from http://onlinelibrary.wiley.com/journal/10.1002/%28ISSN%291938-3703/homepage/ProductInformation.html












Table 1

Theoretical Timeline and Representation of Connections

 

Time Period | Theorist | Key Points | Connections | Theoretical Differences
1916 | John B. Watson | Observable and measurable behavior | Focus on behavior and the environment; foundation for Skinner’s work |
1911 | Thorndike | Connectionism: a response to a stimulus strengthens the connection; teaching habits; sequencing of curriculum | Set the foundation for Skinner’s stimulus-response principle | Law of Disuse: the association is forgotten when the response stops
1920 | Pavlov | Classical conditioning | Set the stage for Skinner’s work on conditioned pairings | Focused on biological responses
1930s | Skinner | Operant conditioning: behavioral learning principles | Prediction and control of behavior; setting up learning environments | Focused only on observable behavior; no mental processes
