Generations of psychology students have studied Skinner’s operant conditioning experiments and how they differ from the respondent behavior investigated by Pavlov. In the Pavlovian conditioning situation, a known stimulus is paired with a response under conditions of reinforcement. Because the behavioral response is elicited by a specific observable stimulus, Skinner called this type of response respondent behavior.
Operant behavior occurs without any observable external stimulus. The organism’s response appears to be spontaneous in that it is not related to any known observable stimulus. This does not mean that there is no stimulus eliciting the response, but rather that no stimulus is detected when the response occurs. As far as the experimenters are concerned, there is no stimulus because they have not applied a stimulus and cannot see one.
Another difference between respondent and operant behavior is that operant behavior operates on the organism’s environment whereas respondent behavior does not. The harnessed dog in Pavlov’s laboratory can do nothing but respond (salivate, for example) when the experimenter presents the stimulus. The dog cannot act on its own to secure the stimulus (the food).
The operant behavior of the rat in the Skinner box, however, is instrumental in securing the stimulus (the food). When the rat presses the bar, it receives food, and it does not get any food until it does press the bar and thus operate on the environment. (Skinner disliked the term Skinner box, first used by Hull in 1933. He referred to the equipment as an operant conditioning apparatus. Skinner box has become so popular a label, however, that it is listed in most dictionaries and is accepted usage in psychology.)
Skinner believed that operant behavior is much more representative of everyday learning than respondent behavior is. Because behavior is mostly of the operant type, the most effective approach to a science of behavior is to study the conditioning and extinction of these operant behaviors.
His classic experimental demonstration involved bar pressing in a Skinner box constructed to eliminate extraneous stimuli. In this experiment a rat that had been deprived of food was placed in the apparatus and allowed to explore. In the course of this exploration the rat sooner or later accidentally depressed a lever or bar that activated a mechanism that released a food pellet into a tray. After the rat received a few food pellets, the reinforcers, conditioning was usually rapid. Note that the rat’s behavior (pressing the bar or lever) operated on the environment and was instrumental in securing food. The dependent variable in this experiment is simple and direct: the rate of response.
From this basic experiment Skinner derived his law of acquisition, which states that the strength of an operant behavior is increased when it is followed by the presentation of a reinforcing stimulus. Although practice is important in establishing a high rate of bar pressing, the key variable is reinforcement. Practice by itself will not increase the rate; all it does is provide the opportunity for additional reinforcement to occur.
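The distinction between practice and reinforcement can be illustrated with a toy simulation. This is not Skinner’s method, only a minimal sketch under assumed parameters (the function name, increment, and decay values are all hypothetical): the probability of pressing the bar rises only when a press is followed by a reinforcer, while unreinforced pressing leaves it near its low starting level.

```python
import random

def simulate(trials, reinforced, increment=0.1, decay=0.02, seed=0):
    """Toy model of the law of acquisition (illustrative only).

    A press followed by reinforcement strengthens the operant
    (raises p_press); a press with no reinforcer does not.
    """
    rng = random.Random(seed)
    p_press = 0.05  # low probability of an accidental press
    presses = 0
    for _ in range(trials):
        if rng.random() < p_press:      # the rat happens to press the bar
            presses += 1
            if reinforced:              # food pellet follows the press
                p_press = min(1.0, p_press + increment)
            else:                       # practice alone, no reinforcer
                p_press = max(0.01, p_press - decay)
    return p_press, presses

p_final_reinforced, _ = simulate(500, reinforced=True)
p_final_unreinforced, _ = simulate(500, reinforced=False)
print(p_final_reinforced, p_final_unreinforced)
```

Running both conditions for the same number of trials, the reinforced rat ends with a far higher response probability, mirroring the point above: practice supplies opportunities, but reinforcement does the strengthening.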
Skinner’s law of acquisition differs from the positions of Thorndike and Hull on learning. Skinner did not deal with the pleasure-pain or satisfaction-dissatisfaction consequences of reinforcement, as did Thorndike. Nor did Skinner make any effort to interpret reinforcement in terms of decreasing drives, as did Hull. The systems of Thorndike and Hull are explanatory; Skinner’s is strictly descriptive.
Skinner and his followers conducted a great deal of research on problems of learning, such as the role of punishment in acquiring responses, the effect of various schedules of reinforcement, the extinction of operant responses, secondary reinforcement, and generalization.
They also worked with other animals and with human subjects, using the same basic approach as with the Skinner box. With pigeons the operant behavior involves pecking at a key or a spot; the reinforcer is food. The operant behavior for human subjects involves problem solving, reinforced by verbal approval or by the knowledge of having given the correct answer.
Skinner reported an attempt to use back-rubbing as a reinforcer for his 3-year-old daughter, but the experiment backfired. He was talking to her at bedtime while rubbing her back and decided to test this as a reinforcer. “I waited,” he wrote, “until she lifted her foot slightly and then rubbed briefly. Almost immediately she lifted her foot again, and again I rubbed. Then she laughed. ‘What are you laughing at?’ I said. ‘Every time I raise my foot you rub my back!’” (Skinner, 1987).