What Is a Skinner Box?

Learn about the Operant Conditioning Chamber

Illustration of a Skinner box
Image by Andreas1 / Wikimedia Commons (CC BY-SA 3.0)

A Skinner box, also known as an operant conditioning chamber, is an enclosed apparatus containing a bar or key that an animal can press or manipulate in order to obtain food or water as a type of reinforcement.

Developed by B. F. Skinner, the box also contained a device that recorded each response the animal made, along with the particular schedule of reinforcement the animal had been assigned.

Skinner was inspired to create his operant conditioning chamber as an extension of the puzzle boxes that Edward Thorndike famously used in his research on the law of effect.

Skinner himself did not refer to his device as a Skinner box, instead preferring the term "lever box."

How Is a Skinner Box Used?

The design of Skinner boxes can vary depending upon the type of animal and the experimental variables. The box is a chamber that includes at least one lever, bar, or key that the animal can manipulate.

When the lever is pressed, food, water, or some other type of reinforcement might be dispensed. Other stimuli can also be presented, including lights, sounds, and images. In some instances, the floor of the chamber may be electrified.

What exactly was the purpose of a Skinner box? Using the device, researchers could carefully study behavior in a highly controlled environment. For example, they could use a Skinner box to determine which schedule of reinforcement led to the highest rate of responding in the study subjects.

Examples

Imagine that a researcher wants to determine which schedule of reinforcement will lead to the highest response rates.

Pigeons are placed in the operant conditioning chambers and receive a food pellet for pecking at a response key. Some pigeons receive a pellet for every response (continuous reinforcement) while others obtain a pellet only after a certain amount of time or number of responses have occurred (partial reinforcement).

In the partial reinforcement schedules, some pigeons receive a pellet after they peck at the key five times. This is known as a fixed-ratio schedule. Pigeons in another group receive reinforcement after a random number of responses, which is known as a variable-ratio schedule. Still other pigeons are given a pellet only after a 10-minute period has elapsed. This is called a fixed-interval schedule. In the final group, pigeons are given reinforcement after random intervals of time, which is known as a variable-interval schedule.
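The four schedules described above can be sketched as a small decision function. This is only an illustrative simulation, not anything from Skinner's apparatus; the function name `reinforce` and its parameters are hypothetical, and the variable schedules are approximated with simple probabilities.

```python
import random

def reinforce(schedule, responses, seconds, n=5, t=600, rng=None):
    """Decide whether a peck earns a pellet under a given schedule.

    responses: count of pecks since the last pellet
    seconds:   time elapsed since the last pellet
    n:         ratio requirement (e.g., every 5th peck)
    t:         interval requirement in seconds (e.g., 10 minutes)
    """
    rng = rng or random.Random(0)
    if schedule == "continuous":        # a pellet for every response
        return True
    if schedule == "fixed-ratio":       # every n-th response
        return responses % n == 0
    if schedule == "variable-ratio":    # random number of responses, n on average
        return rng.random() < 1 / n
    if schedule == "fixed-interval":    # first response after t seconds have elapsed
        return seconds >= t
    if schedule == "variable-interval": # random interval, t seconds on average
        return rng.random() < 1 / t
    raise ValueError(f"unknown schedule: {schedule}")
```

For instance, `reinforce("fixed-ratio", responses=5, seconds=0)` dispenses a pellet, while `reinforce("fixed-ratio", responses=3, seconds=0)` does not.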

Once the data have been obtained from the trials in the Skinner boxes, researchers can then look at the rate of responding and determine which schedule leads to the highest and most consistent level of responses.

