What Is a Fixed-Interval Schedule?

Paychecks are one example of a fixed-interval schedule.

In operant conditioning, a fixed-interval schedule is a schedule of reinforcement where the first response is rewarded only after a specified amount of time has elapsed. This schedule produces a high rate of responding near the end of the interval but much slower responding immediately after the reinforcer is delivered.

As you may remember, operant conditioning relies on either reinforcement or punishment to strengthen or weaken a response.

This process of learning involves forming an association between a behavior and the consequences of that behavior. Behaviors that are followed by desirable outcomes become stronger and therefore more likely to occur again in the future. Actions that are followed by unfavorable outcomes become less likely to occur again.

It was the noted psychologist B.F. Skinner who first described this operant conditioning process. By reinforcing actions, he observed, those actions became stronger. By punishing behaviors, those actions became weaker. He also noted that the schedule on which behaviors were reinforced or punished played a role in both how quickly a response was acquired and the strength of that response.

How Does a Fixed-Interval Schedule Work?

In order to better understand how a fixed-interval schedule works, let's begin by taking a closer look at the term itself.

A schedule refers to the rate at which the reinforcement is delivered or how frequently a response is reinforced. An interval refers to a period of time, which suggests that the rate of delivery is dependent upon how much time has elapsed. Finally, fixed suggests that the timing of delivery is set on a predictable, unchanging schedule.

For example, imagine that you are training a pigeon to peck at a key. You put the bird on a fixed-interval 30 schedule (FI-30), which means that reinforcement becomes available every 30 seconds. The pigeon can continue to peck the key during that interval but will only receive a food pellet for the first peck after the fixed 30-second interval has elapsed.
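
To make the contingency concrete, here is a minimal sketch in Python of the FI-30 procedure described above. The response times are made-up random values rather than data from a real experiment; the point is simply that pecks before the interval elapses earn nothing, only the first peck afterward is reinforced, and that delivery restarts the timer.

    import random

    def simulate_fixed_interval(interval=30.0, session_length=300.0):
        """Count reinforcers earned on a fixed-interval (FI) schedule.

        Reinforcement becomes available `interval` seconds after the last
        delivery; only the first response after that point is reinforced.
        """
        reinforcers = 0
        next_available = interval          # reinforcement first becomes available here
        time = 0.0
        while time < session_length:
            # The pigeon pecks at irregular moments (made-up response times).
            time += random.uniform(0.5, 3.0)
            if time >= next_available:
                reinforcers += 1                     # first peck after the interval pays off
                next_available = time + interval     # the 30-second timer restarts
            # Pecks before `next_available` earn nothing.
        return reinforcers

    print(simulate_fixed_interval())       # typically around 9 reinforcers in a 5-minute session

Notice that pecking faster does not earn more food; only the passage of time makes the next reinforcer available.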

Characteristics of the Fixed-Interval Schedule

There are a few characteristics of the fixed-interval schedule that make it distinctive. Some of these can be seen as benefits, while some might be considered drawbacks.

  • Results in a fairly significant post-reinforcement pause in responding
  • Responses tend to increase gradually as the reinforcement time draws closer

The big problem with this type of schedule is that the behavior tends to occur only right before the reinforcement is delivered. If a student knows that there will be an exam every Friday, he might only begin studying on Thursday night. If a child knows she gets her allowance on Sunday as long as her bedroom is clean, she probably won't clean her room until Saturday night. The response rate is fairly predictable: it increases as the reinforcement time approaches and then drops off precipitously immediately after reinforcement is delivered.

Examples of Fixed-Interval Schedules

It can be helpful to look at a few different examples of the fixed-interval schedule in order to better understand how this reinforcement schedule works and what impact it might have on behavior.

Fixed-Interval Schedules in a Lab Setting:

  • Imagine that you are training a rat to press a lever, but you only reinforce the first response after a ten-minute interval. The rat does not press the lever much during the first five minutes after reinforcement but begins to press it more and more often as the ten-minute mark draws closer.
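
As a purely illustrative model, the sketch below reproduces that scalloped pattern by assuming the probability of a lever press grows as the elapsed time approaches the ten-minute mark. The specific probability rule is an assumption chosen for demonstration, not an empirical result.

    import random

    INTERVAL = 600            # a fixed-interval ten-minute (FI-600-second) schedule
    SESSION = 1800            # simulate a 30-minute session, one tick per second

    def press_probability(seconds_since_reinforcer):
        # Made-up rule: pressing is rare right after reinforcement and ramps
        # up as the end of the interval approaches (the classic "scallop").
        return min(1.0, (seconds_since_reinforcer / INTERVAL) ** 2)

    elapsed = 0
    presses_per_minute = [0] * (SESSION // 60)

    for second in range(SESSION):
        if random.random() < press_probability(elapsed):
            presses_per_minute[second // 60] += 1
            if elapsed >= INTERVAL:
                elapsed = 0            # the reinforced press resets the interval timer
                continue
        elapsed += 1

    print(presses_per_minute)          # counts climb toward each ten-minute mark, then drop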

Fixed-Interval Schedules in the Real World:

  • A weekly paycheck is a good example of a fixed-interval schedule. The employee receives reinforcement every seven days, which may result in a higher response rate as payday approaches.
  • Dental exams also take place on a fixed-interval schedule. People who go in for their regular six-month checkup and cleaning often take extra care to clean their teeth right before the exam, yet may not be as diligent on a day-to-day basis during the six months leading up to it.

Final Thoughts

Fixed-interval schedules can be an important tool when teaching new behaviors. Sometimes these schedules occur naturally, while other times they are artificially created and controlled by reward systems. If you are planning to use some sort of reinforcement schedule to teach a behavior, it is important to consider how a fixed-interval schedule might influence the speed of learning as well as the rate of response.
