What Is a Variable-Ratio Schedule?

Slot machines operate on a variable-ratio schedule

In operant conditioning, a variable-ratio schedule is a schedule of reinforcement in which a response is reinforced after an unpredictable number of responses. This schedule creates a steady, high rate of responding. Gambling and lottery games are good examples of rewards based on a variable-ratio schedule.

Schedules of reinforcement play a central role in the operant conditioning process. The frequency with which a behavior is reinforced can help determine how quickly a response is learned as well as how strong the response might be.

Each schedule of reinforcement has its own unique set of characteristics.

Characteristics of the Variable-Ratio Schedule

  • Leads to a high, steady response rate
  • Results in only a brief pause after reinforcement
  • Rewards are provided after an unpredictable number of responses

When identifying different schedules of reinforcement, it can be very helpful to start by looking at the name of the schedule itself. In the case of variable-ratio schedules, the term variable indicates that reinforcement is delivered after an unpredictable number of responses. Ratio indicates that reinforcement depends on the number of responses made (rather than on the passage of time, as in interval schedules). So together, the term means that reinforcement is delivered after a varying number of responses.

It might also be helpful to contrast the variable-ratio schedule of reinforcement with the fixed-ratio schedule of reinforcement. In a fixed-ratio schedule, reinforcement is provided after a set number of responses.

So, for example, on a VR 5 schedule, an animal might receive a reward for every five responses, on average. This means that sometimes the reward will come after three responses, sometimes after seven responses, sometimes after five responses, and so on. The reinforcement will average out to one reward for every five responses, but the actual delivery schedule will remain completely unpredictable.

In a fixed-ratio schedule, on the other hand, the reinforcement schedule might be set at an FR 5. This would mean that for every five responses, a reward is presented. Where the variable-ratio schedule is unpredictable, the fixed-ratio schedule is set at a fixed rate.
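The difference between the two schedules can be made concrete with a small simulation. This is a minimal sketch, not a standard implementation: the function names are illustrative, and it assumes the VR 5 "cost" of each reward is drawn uniformly from 1 to 9 responses, which averages out to 5.

```python
import random

def fixed_ratio_trials(n_rewards, ratio=5):
    """Fixed-ratio (FR 5): every reward costs exactly `ratio` responses."""
    return [ratio] * n_rewards

def variable_ratio_trials(n_rewards, mean_ratio=5, rng=None):
    """Variable-ratio (VR 5): each reward costs an unpredictable number of
    responses, drawn here from 1..9 so the long-run average is `mean_ratio`."""
    rng = rng or random.Random()
    return [rng.randint(1, 2 * mean_ratio - 1) for _ in range(n_rewards)]

rng = random.Random(42)
fr = fixed_ratio_trials(10_000, ratio=5)
vr = variable_ratio_trials(10_000, mean_ratio=5, rng=rng)

print(sum(fr) / len(fr))   # exactly 5.0 -- perfectly predictable
print(sum(vr) / len(vr))   # close to 5, but no single trial is predictable
print(min(vr), max(vr))    # individual rewards arrive after anywhere from 1 to 9 responses
```

Both schedules pay out at the same average rate; only the variable-ratio schedule hides when the next reward will arrive, which is what sustains the high, steady responding described above.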

Examples of Variable-Ratio Schedules in Real Life

  • Slot machines: Players have no way of knowing how many times they have to play before they win. All they know is that eventually a play will win. This is why slot machines are so effective, and players are often reluctant to quit. There is always the possibility that the next coin they put in will be the winning one.
  • Sales bonuses: Call centers often offer random bonuses to employees. Workers never know how many calls they need to make to receive the bonus, but they know that they increase their chances the more calls or sales they make.
