What Is a Fixed-Interval Schedule?

[Image: Paychecks are one example of a fixed-interval schedule.]

In operant conditioning, a fixed-interval schedule is a schedule of reinforcement in which the first response is rewarded only after a specified amount of time has elapsed. This schedule produces a high rate of responding near the end of the interval but much slower responding immediately after the reinforcer is delivered.

As you may remember, operant conditioning relies on either reinforcement or punishment to strengthen or weaken a response.

This process of learning involves forming an association between a behavior and the consequences of that behavior. Behaviors that are followed by desirable outcomes are strengthened and therefore more likely to occur again in the future. Actions that are followed by unfavorable outcomes become less likely to occur again in the future.

It was the noted psychologist B.F. Skinner who first described this operant conditioning process. By reinforcing actions, he observed, those actions became stronger. By punishing behaviors, those actions became weaker. In addition to this basic process, he also noted that the schedule on which behaviors were reinforced or punished played a role in how quickly a response was acquired and how strong that response became.

How Does a Fixed-Interval Schedule Work?

In order to better understand how a fixed-interval schedule works, let's begin by taking a closer look at the term itself.

A schedule refers to the rate at which reinforcement is delivered, or how frequently a response is reinforced. An interval refers to a period of time, which suggests that the rate of delivery depends on how much time has elapsed. Finally, fixed suggests that the timing of delivery follows a predictable, unchanging pattern.

For example, imagine that you are training a pigeon to peck at a key. You put the animal on a fixed-interval 30 schedule (FI-30), which means that reinforcement becomes available every 30 seconds. The pigeon can continue to peck the key during that interval, but it will receive a food pellet only for the first peck after the fixed 30-second interval has elapsed.
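To make the contingency concrete, here is a minimal Python sketch of an FI-30 session. It assumes a toy model in which the bird pecks with a fixed probability each second; the function name and parameters are illustrative assumptions, not part of the article or any standard library.

    import random

    def simulate_fixed_interval(interval=30, session_length=300, peck_prob=0.5):
        """Simulate one session on a fixed-interval schedule (toy model).

        interval: seconds that must elapse before a response can be reinforced
        session_length: total seconds in the session
        peck_prob: assumed chance the bird pecks in any given second
        """
        reinforcers = 0
        last_reinforcement = 0
        for t in range(1, session_length + 1):
            pecked = random.random() < peck_prob
            # Only the FIRST response after the interval has elapsed is
            # reinforced; pecks made during the interval earn nothing.
            if pecked and (t - last_reinforcement) >= interval:
                reinforcers += 1
                last_reinforcement = t
        return reinforcers

    print(simulate_fixed_interval())  # at most session_length / interval reinforcers

Note that no matter how rapidly the animal responds, the schedule caps reinforcement at roughly one reinforcer per interval, which is why responding drops right after each reinforcer and climbs again as the next interval draws to a close.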

Characteristics of the Fixed-Interval Schedule

  • Results in a fairly significant post-reinforcement pause in responding
  • Responses tend to increase gradually as the reinforcement time draws closer

Examples of Fixed-Interval Schedules

  • In a Lab Setting: Imagine that you are training a rat to press a lever, but you reinforce only the first response after a ten-minute interval. The rat does not press the lever much during the first five minutes after reinforcement but presses it more and more often as the ten-minute mark draws closer.
  • In the Real World: A weekly paycheck is a good example of a fixed-interval schedule. The employee receives reinforcement every seven days, which may result in a higher response rate as payday approaches.
