What Is a Skinner Box?

Learn about the operant conditioning chamber

Pigeons in a Skinner box
Bettmann Archive / Getty Images

A Skinner box, also known as an operant conditioning chamber, is an enclosed apparatus that contains a bar or key that an animal can press or manipulate to obtain food or water as a type of reinforcement.

Developed by B. F. Skinner, the box also contained a device that recorded each response the animal made, along with the schedule of reinforcement the animal was assigned.

Skinner was inspired to create his operant conditioning chamber as an extension of the puzzle boxes that Edward Thorndike famously used in his research on the law of effect.

Skinner himself did not refer to his device as a Skinner box, instead preferring the term "lever box."

How Is a Skinner Box Used?

So how exactly do psychologists and other researchers utilize a Skinner box when conducting research? The design of Skinner boxes can vary depending upon the type of animal and the experimental variables. The box is a chamber that includes at least one lever, bar, or key that the animal can manipulate.

When the lever is pressed, food, water, or some other type of reinforcement might be dispensed. Other stimuli can also be presented including lights, sounds and images. In some instances, the floor of the chamber may be electrified.

What exactly was the purpose of a Skinner box? Using the device, researchers could carefully study behavior in a very controlled environment. For example, researchers could use the Skinner box to determine which schedule of reinforcement led to the highest rate of response in the study subjects.
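As an illustrative sketch only (not a model of Skinner's actual apparatus), the core loop the chamber implements — the animal presses a lever, a schedule decides whether to dispense reinforcement, and every response is recorded — might look like this. The `SkinnerBox` class and its method names are hypothetical:

```python
class SkinnerBox:
    """Toy model of an operant conditioning chamber (illustrative only).

    Records every lever press and dispenses a reinforcer according to
    whichever schedule function it is given.
    """

    def __init__(self, schedule):
        self.schedule = schedule  # callable deciding when to reinforce
        self.presses = 0          # cumulative response record
        self.reinforcers = 0      # reinforcers dispensed so far

    def press_lever(self):
        """Register one response; return True if it was reinforced."""
        self.presses += 1
        if self.schedule(self.presses):
            self.reinforcers += 1  # e.g. a food pellet is dispensed
            return True
        return False

# Fixed-ratio 5: reinforce every fifth response.
box = SkinnerBox(schedule=lambda n: n % 5 == 0)
rewards = sum(box.press_lever() for _ in range(20))
print(rewards)  # 20 presses on an FR-5 schedule -> 4 reinforcers
```

Swapping in a different schedule function changes nothing else about the chamber, which mirrors why the apparatus was useful: the environment stays constant while one variable, the reinforcement rule, is manipulated.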

Examples of How Skinner Boxes Are Used in Research

For example, imagine that a researcher wants to determine which schedule of reinforcement will lead to the highest response rates. Pigeons are placed in the operant conditioning chambers and receive a food pellet for pecking at a response key. Some pigeons receive a pellet for every response (continuous reinforcement) while others obtain a pellet only after a certain amount of time or number of responses have occurred (partial reinforcement).

In the partial reinforcement schedules, some pigeons receive a pellet after they peck at the key five times; this is known as a fixed-ratio schedule. Pigeons in another group receive reinforcement after a random number of responses, which is known as a variable-ratio schedule. Still other pigeons are given a pellet after a 10-minute period has elapsed, which is called a fixed-interval schedule. In the final group, pigeons are given reinforcement at random intervals of time, which is known as a variable-interval schedule.

Once the data have been obtained from the trials in the Skinner boxes, researchers can then look at the rate of responding and determine which schedule leads to the highest and most consistent level of responses.
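The schedule comparison described above can be sketched as a small simulation. The function names are hypothetical, and the parameter values (a five-response ratio, a 10-minute interval) simply reuse the figures from the example; the interval functions also assume the animal responds continuously, which is a simplification:

```python
import random

random.seed(0)  # make the variable-ratio illustration reproducible

def fixed_ratio(n_responses, ratio=5):
    """FR schedule: reinforce every `ratio`-th response."""
    return n_responses // ratio

def variable_ratio(n_responses, mean_ratio=5):
    """VR schedule: reinforce after a random number of responses."""
    reinforcers = 0
    needed = random.randint(1, 2 * mean_ratio - 1)
    for _ in range(n_responses):
        needed -= 1
        if needed == 0:
            reinforcers += 1
            needed = random.randint(1, 2 * mean_ratio - 1)
    return reinforcers

def fixed_interval(session_minutes, interval=10):
    """FI schedule: reinforce the first response after each interval
    (assumes the animal is always responding)."""
    return session_minutes // interval

print("FR-5, 100 pecks:", fixed_ratio(100), "pellets")       # 20 pellets
print("VR-5, 100 pecks:", variable_ratio(100), "pellets")    # varies by run
print("FI-10, 60 minutes:", fixed_interval(60), "pellets")   # 6 pellets
```

Plotting cumulative responses against reinforcers delivered under each rule is, in essence, what the chamber's recording device let researchers do.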

One important thing to note is that the Skinner box should not be confused with one of Skinner's other inventions, the baby tender. At his wife's request, Skinner created a heated crib with a plexiglass window that was designed to be safer than other cribs available at that time.

Because the baby tender was unfamiliar, some people mistook it for an experimental device, which led some to believe that Skinner's crib was actually a variation of the Skinner box.

At one point, a rumor spread that Skinner had used the crib in experiments with his daughter, leading to her eventual suicide. The Skinner box and the baby tender crib were two different things entirely, and Skinner did not conduct experiments on his daughter or with the crib, nor did his daughter take her own life. 

The Skinner box became an important tool for studying learned behavior and contributed a great deal to our understanding of the effects of reinforcement and punishment.

