Allen Neuringer
Reed College
Our choices are often repetitions: they are habits, the result of conditioning histories. What we do upon waking, the ways we greet one another, the music we listen to, all can be predicted by both the responder and an external observer. When I think of the value of repetition, I am reminded of when I was a graduate student, sitting in Professor Skinner's office as he explained the large cumulative record attached to the wall behind his desk. It showed each day's time spent working effectively at his desk, where a timer and a desk lamp were automatically activated whenever he sat in his desk chair. Skinner had placed his concentration and writing under the discriminative control of lamp light and context (when at his desk, he wrote and did nothing else), and he conjectured that he accomplished more in a few hours than did his students, who spent many more hours attempting to write, but with interruptions from both others and themselves.
However, repetitive and predictable choices are sometimes counter-productive, and Skinner also advised us about the importance of variation. Of particular interest to me is that, with age (I am now 83 years old), behaviors tend to become increasingly patterned and insensitive to changes in reinforcement contingencies. In Enjoy Old Age, Skinner and Vaughan advised us to deliberately change what we do as well as the environment that surrounds us:
“…make deliberate changes in the ways in which you do things. Try converting self-evident ‘truths’ into their opposites and see what happens…If you are inclined to go in one direction, try going in another. Try, especially, to avoid doing things as you once learned to do them — if only to see what results. The more extravagant the variations, the more valuable…”.
Not only can we vary our ways, we can do so unpredictably. For example, drinking coffee in the morning and decaf tea in the afternoon varies our caffeine intake. But flipping a fair coin, with heads indicating "drink coffee" and tails "drink tea," may result in quite different effects. We don't need coins or any other external aid to choose unpredictably. Operants provide the means. This point is based, in part, on a Skinnerian view of the operant. "Operant" refers both to a class of responses, each of which has the same relationship to the environment, and to an individual member of the class. We say that the rat emitted an operant response (a single press of the lever) and describe the rat as engaged in operant responding (emitting members of the lever-press class). At the level of the class, the operant can be predicted based on knowledge of the organism, its biological and behavioral history, and its current environment, including reinforcement contingencies. At the level of the individual class members, prediction may be more difficult and, in some cases, where instances are "randomly emitted," impossible. Here is how Skinner (The Generic Nature of the Concepts of Stimulus and Response, in Cumulative Record) described this situation:
“Suppose that we are studying the behavior of … a rat in pressing a lever. The number of distinguishable acts on the part of the rat which will give the required movement of the lever is indefinite and very large… They constitute a class, which is … defined by the phrase “pressing the lever.” Now it may be shown that … the rate of (lever-press) responding … maintains itself or changes in lawful ways. But the responses which contribute to this total number-per-unit-time are not identical. They are selected at random from the whole class—that is, by circumstances which are independent of the conditions determining the rate” (italics added).
Let me put this differently. The current environment and our behavioral and biological histories lead to the activation of a particular operant class. (Activation is indicated by emergence of response members of that class.) Within the class are many possible instances and often the probabilities of those instances are skewed, some being more likely (or occurring more frequently) than others. For example, if I asked you to “name a bird,” the likelihood of “robin” is high. The distribution of response probabilities is itself a manipulable aspect of the operant and it can be flattened, i.e., the probabilities of within-class instances can approach equality. If I ask you, “Name an uncommon bird,” you will probably sample from a broader class of possibilities, including “dodo” and “Larry.” When classes are broad and probabilities equal, then response outputs approach the definition of random and become maximally unpredictable, with each member as likely as each of the other members.
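The idea that a flattened distribution of within-class instances is maximally unpredictable can be made concrete with Shannon entropy, which is maximal when all instances are equally likely. The sketch below is illustrative only; the "name a bird" probabilities are hypothetical numbers chosen for the example, not data.

```python
import math
import random

def entropy(probs):
    """Shannon entropy in bits: higher means a less predictable response."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical response probabilities for "name a bird": skewed toward "robin".
skewed = {"robin": 0.5, "sparrow": 0.3, "eagle": 0.15, "dodo": 0.05}
# The same class with the distribution flattened, as when asked to
# "name an uncommon bird":
flat = {name: 0.25 for name in skewed}

print(entropy(skewed.values()))  # about 1.65 bits
print(entropy(flat.values()))    # 2.0 bits, the maximum for four options

# Sampling one response instance from the flattened class:
names, weights = zip(*flat.items())
print(random.choices(names, weights=weights)[0])
```

With equal probabilities, each of the four responses carries the same surprise, so an observer can do no better than chance in predicting the next instance.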
Variability is controlled by a number of environmental and biological influences. Two broad classes of influences are events that depend upon, or are controlled by, an organism’s responses and events that are independent of responses. Examples of response-independent influences include withholding expected reinforcement (as in extinction), administering certain drugs, and proximity to positive or aversive occurrences, each of which may influence behavioral variations.
A second source of control is reinforcement that is contingent on variability. Much like other attributes (response topography, duration, force, and speed), the variability of operants is a reinforceable dimension. Thus, for example, in rats, one can reinforce varying presses to left and right levers: reinforcement is provided only for sequences that have not been emitted frequently or recently. In another example, varied play interactions with toys were reinforced. In human participants, the reinforcement of variable responding can generate response sequences that cannot be distinguished, according to statistical tests, from the output of a random generator. These are just a few examples of the wealth of data that documents the operant nature of variability and its control by contingent reinforcement. For an overview of "operant variability" evidence, see: Neuringer, A., & Jensen, G. (2012). Comparative Cognition & Behavior Reviews, 7, 55-84; and for alternative interpretations see: Barba, L. (2012). The Behavior Analyst, 35(2), 213-227; and Nergaard, S. K., & Holth, P. (2020). Perspectives on Behavior Science, 43(3), 579-603.
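The two-lever procedure described above is often arranged as a lag-type contingency: a sequence is reinforced only if it differs from the sequences emitted on recent trials. The sketch below, a simplification under assumed parameters (four presses per trial, a lag-5 memory window, and random rather than shaped responding), shows only the contingency check, not a model of the animal.

```python
import random
from collections import deque

def meets_lag(sequence, recent):
    """Lag contingency: the sequence earns reinforcement only if it
    differs from every sequence held in the recent-memory window."""
    return sequence not in recent

# Simulate 100 trials under a hypothetical lag-5 contingency.
recent = deque(maxlen=5)   # holds the last 5 emitted sequences
reinforcers = 0
for trial in range(100):
    # Four presses across Left/Right levers; chosen at random here,
    # whereas a real subject's choices are shaped by the contingency.
    seq = tuple(random.choice("LR") for _ in range(4))
    if meets_lag(seq, recent):
        reinforcers += 1
    recent.append(seq)
print(f"{reinforcers} of 100 trials met the lag-5 criterion")
```

Because there are 16 possible four-press sequences and only 5 are remembered, a truly random responder satisfies this criterion on most trials; repetitive responders do much worse.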
Sometimes it is adaptive to behave in a predictable way, e.g., press the brake pedal of a car when the traffic light turns red. Other times it is adaptive to behave in a way that others (and, in some cases, you yourself) cannot predict, e.g., when playing ping-pong. But often most functional is a mixture of repeating and varying, of behaving predictably and unpredictably. In the laboratory, the mixture is sometimes studied under the rubric of "multiple schedules of reinforcement." One example: when key-lights are blue, pigeons are reinforced for varying response sequences across Left and Right keys. Sometimes, however, the key-lights turn red, and under those conditions the birds must repeat a single sequence. Blue, vary; red, repeat. The birds learn to respond appropriately. A different type of mixing is seen under "concurrent schedules of reinforcement," where rats or pigeons are given a choice between varying and repeating. As commonly done, each of the choices is intermittently reinforced, with the reinforcement frequencies manipulated, i.e., sometimes varying is more likely to gain reinforcement, sometimes reinforcement is equal between the two options, and sometimes repeating is more advantageous. The general result is that the animals mix varying and repeating and, just as with simple operants, these choices to vary or repeat are controlled by the relative frequencies of reinforcement.
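The blue/red multiple-schedule contingency can be stated compactly in code. This is a minimal sketch under assumed details (four-press sequences, a hypothetical target sequence for the red component, and a simple list of recently emitted sequences for the blue component); it expresses only which responses would be reinforced in each component.

```python
# Hypothetical required sequence in the red ("repeat") component.
TARGET = ("L", "R", "L", "L")

def reinforced(key_color, sequence, recent):
    """Multiple-schedule contingency: blue signals 'vary' (the sequence
    must differ from recent ones), red signals 'repeat' (the sequence
    must match the single target)."""
    if key_color == "blue":
        return sequence not in recent
    return sequence == TARGET

# Example trials:
recent = [("L", "L", "L", "L")]
print(reinforced("blue", ("L", "R", "R", "L"), recent))  # True: novel sequence
print(reinforced("blue", ("L", "L", "L", "L"), recent))  # False: repetition
print(reinforced("red", TARGET, recent))                 # True: matches target
```

The discriminative stimulus (key color) selects which criterion applies, so the same response class is reinforced for variability in one component and for stereotypy in the other.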
Most evenings after dinner, I mix variations and repetitions. I sit at a Yamaha electric piano and improvise, sometimes starting with an arbitrary (or random) sequence of notes, sometimes with a song from the past. I generally cannot predict the outcome of the improvisation but find the undertaking satisfying, and my behavior has been maintained over the years. Similar intermixing can be applied in other venues: social interactions, love-making, competition, problem solving, art, and music. Perhaps most importantly, varying our attempts, repeating what succeeds, and intermixing the two may help us as we confront the challenges of our day.