Neuropsychopharmacology: The Fifth Generation of Progress
Adaptive Processes Regulating Tolerance to Behavioral Effects of Drugs
Alice M. Young and Andrew J. Goudie
This chapter surveys major adaptive learning processes involved in development, expression, and maintenance of tolerance to behavioral effects of drugs. A prevailing view of drug tolerance is that sustained or repeated administration of a drug initiates multiple adaptive processes that reduce the acute effects of an initial drug dose and increase the dose or concentration needed to produce effects of a given magnitude. Higher intensities of drug treatment may also decrease the maximal effects that can be produced by any dose. Adaptations may occur by biochemical, cellular, and behavioral processes triggered by the initial effects of a drug. The specific adaptations recruited by repeated drug treatment are determined by the type of drug administered chronically; its dose, frequency, and duration of administration; and the functional demands placed on the biological system. Tolerance to behavioral effects of drugs in vivo is also determined by an individual's experiences during chronic treatment, that is, by learning processes. Unfortunately, there has been little interaction among researchers working on different levels of tolerance, and a major challenge for future research in this area is to link behavioral adaptations to those occurring at biochemical and cellular levels (see also Behavioral Techniques in Preclinical Neuropsychopharmacology Research, Intracellular Messenger Pathways as Mediators of Neural Plasticity, Neuroendocrine Interactions, and Animal Models of Drug Addiction).
Tolerance is an acquired decrease in sensitivity to a particular effect of a drug that results from exposure to that or a related drug. A distinction is often made between acute tolerance, decreases in sensitivity that develop during a single drug exposure, and chronic tolerance, decreases in sensitivity that develop from repeated exposure. A further distinction is made among tolerance to the specific drug administered, tolerance to related drugs (selective or homologous cross-tolerance), and tolerance to drugs from unrelated drug classes (heterologous cross-tolerance). The definition of tolerance carries no implications about specific processes. Potential processes are commonly divided into two groups: (a) dispositional or pharmacokinetic processes, which reduce the concentration of a drug or its duration of action in a target system; and (b) pharmacodynamic or functional processes, which reduce the sensitivity of drug-sensitive systems to a given drug concentration. Dispositional tolerance describes changes in sensitivity that arise from changes in absorption, distribution, metabolism, or excretion that diminish concentrations of a drug at effector sites; these processes are surveyed elsewhere (e.g., 27). In contrast, pharmacodynamic tolerance describes changes in sensitivity that result from adaptive changes in drug-sensitive systems that diminish the initial effects of a drug.
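This operational distinction can be illustrated with a simple dose-effect model. The sketch below is ours, not drawn from any study cited here, and all parameter values are hypothetical: a Hill-type function shows that dispositional tolerance (a reduced effective concentration at effector sites) and pharmacodynamic tolerance (reduced sensitivity of the target system) both shift the dose-effect curve to the right, which is why dose-effect data alone cannot separate the two.

```python
import numpy as np

def effect(dose, emax=100.0, ed50=10.0, n=1.5, bioavail=1.0):
    """Sigmoid (Hill) dose-effect function; `bioavail` scales the
    concentration that actually reaches effector sites."""
    c = bioavail * dose
    return emax * c**n / (c**n + ed50**n)

doses = np.array([1, 3, 10, 30, 100], dtype=float)

naive         = effect(doses)                 # pre-treatment curve
dispositional = effect(doses, bioavail=0.33)  # e.g., induced metabolism
decremental   = effect(doses, ed50=30.0)      # reduced receptor sensitivity

for d, e0, e1, e2 in zip(doses, naive, dispositional, decremental):
    print(f"dose {d:5.0f}: naive {e0:5.1f}  dispositional {e1:5.1f}  "
          f"pharmacodynamic {e2:5.1f}")
# Both forms of tolerance shift the curve rightward (a higher dose is needed
# for a given effect); at the level of the dose-effect curve alone they can
# look alike, which is why independent kinetic measurements are needed to
# separate them.
```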
Two general models have been advanced to account for pharmacodynamic tolerance. Although multiple mechanisms are involved in the development of tolerance (reviewed in refs. 4, 12, 19, 22, 38, 51, 56), they can be grouped into two logical classes (4, 32). One class of models postulates that tolerance results from a reduction in the drug signal or stimulus, as a result of adaptations that reduce activity of a drug at its receptors. Such a reduction would of course occur in dispositional tolerance but may also result from adaptive pharmacodynamic processes, such as receptor down-regulation or desensitization, that reduce the intensity of the drug-induced biological signal. These could include changes in affinity, coupling, distribution, or number of target receptors. A second class of models postulates that signal intensity is constant (i.e., that the interaction between a drug and its receptors is unchanged), and that initial acute effects of the drug are opposed or counteracted by homeostatic changes in biochemical, cellular, and effector systems that mediate primary drug effects or in systems changed by these effects. For individual drugs or drug classes, these might include changes in postreceptor signal transduction; intraneuronal signaling pathways; neuronal architecture, connections, or sensitivity to transmitters; neurotransmitter synthesis or distribution; or functioning of effector systems themselves. Behavioral accounts of tolerance generally are of this type, postulating that conditioning of coordinated homeostatic behavioral responses reduces the initial response to a constant biological signal.
Different models make different predictions about the types of adaptive processes that are recruited by chronic drug treatment and about activity of drug-sensitive systems after treatment ends. Littleton and Little (32) have referred to the two basic models of tolerance as decremental and oppositional, respectively. Decremental pharmacodynamic models focus on processes that change the number or properties of drug-sensitive receptor populations and make no predictions about changes in the functioning of other posttransductional processes. In this scheme, termination of drug treatment would be followed by a delayed recovery of initial sensitivity. Oppositional models postulate that continued drug treatment recruits processes that oppose the initial acute effects of a drug or of receptor alterations. When drug treatment ends, these processes may operate unopposed for some period, resulting in appearance of withdrawal signs, followed by readaptation to the drug-free state. For example, increased activity in an intraneuronal signaling pathway normally inhibited by a drug could produce both tolerance (because a greater fractional inhibition of activity, and thus a higher drug dose, would be needed to produce inhibition equivalent to that produced initially in nontreated tissue) and withdrawal after treatment ends (because removal of the drug would be followed for a time by activity above basal levels in the drug-sensitive pathway). A recent extension of these models (22) distinguishes between adaptations occurring within a drug-sensitive neuronal system and those occurring between systems. In the case of within-system adaptations, repeated drug treatment would elicit opposing activity in the system through which a drug induces its primary pharmacological actions. For example, Nestler and Duman review evidence indicating that, in the locus coeruleus, opioid tolerance involves up-regulation of cyclic adenosine monophosphate (cAMP) systems that oppose an initial sustained opioid-induced inhibition of adenylyl cyclase. In the case of between-system adaptations, initial changes in primary drug-sensitive neurons would trigger adaptations in biochemical, cellular, or behavioral pathways different from those directly involved in the initial pharmacological actions of a drug. Such between-system adaptations may be implicated, for example, in the apparently pervasive role of N-methyl-D-aspartate receptors in modulating tolerance to opiates and other drugs (e.g., 55, 56).
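The oppositional logic can be made concrete with a toy homeostatic model. In the sketch below, which is illustrative only (the inhibition constant, adaptation rate, and dose are hypothetical, and the model is loosely patterned on the cAMP example rather than on any measured system), a drug chronically inhibits a pathway whose basal drive up-regulates toward a setpoint; the same adaptation yields both tolerance during treatment and rebound activity on withdrawal.

```python
def fractional_inhibition(dose, k=10.0):
    return dose / (dose + k)           # simple occupancy-style inhibition

setpoint, drive, rate = 1.0, 1.0, 0.2  # homeostatic target, basal drive, adaptation rate
dose = 20.0
log = []
# Chronic treatment: the pathway's drive up-regulates whenever net activity
# falls below the setpoint (an opposing adaptation recruited between doses).
for day in range(30):
    activity = drive * (1.0 - fractional_inhibition(dose))
    drive += rate * (setpoint - activity)
    log.append(activity)
print(f"net activity, day 1: {log[0]:.2f}  day 30: {log[-1]:.2f}  (tolerance)")

# Abrupt withdrawal: remove the drug while drive is still elevated.
activity_off = drive
print(f"activity on withdrawal: {activity_off:.2f} vs setpoint {setpoint:.2f}  (rebound)")
```

Note how one adaptive variable (the up-regulated drive) accounts for both phenomena, exactly the dual prediction that distinguishes oppositional from decremental models.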
For many drugs, emerging accounts of tolerance incorporate features of both logical classes, albeit often from studies conducted at different levels of analysis. Hypotheses about the mechanisms involved in drug tolerance can be usefully organized in terms of general principles of biological regulation at behavioral, cellular, and biochemical levels. Integration of information across studies requires parametric studies to link changes occurring at one level with those studied at another level. Although such studies are only beginning to emerge, several features of descriptive studies of tolerance to behavioral effects of a drug can improve their usefulness as bases for inferences about mechanisms. Among these are systematic variations of pharmacological and behavioral parameters, concurrent or matched measures of multiple endpoints, and independent verification of potential mechanisms. At the most basic level, useful studies assess the occurrence and magnitude of any decreases in the potency and/or maximal effects of a repeatedly administered drug. Unfortunately, studies of tolerance to behavioral effects of drugs frequently assess changes in effects of only one or two doses of a drug. Although this approach can demonstrate loss of an initial drug effect, it cannot assess either the magnitude of decreased sensitivity or the ability of a tolerant system to express the initial effect. Information of the latter sort is critical for full characterization of adaptive processes underlying diminished drug effects. Additional useful information is provided by explorations of how the potency and/or maximal effects of a drug change as a function of a variety of pharmacological and behavioral factors. Useful pharmacological information is provided by manipulations of the dose, frequency, and duration of repeated drug administrations; studies of whether adaptive changes are reversed after discontinuation of treatment; and studies of the pharmacological selectivity of tolerance. As will be reviewed more fully below, useful behavioral information is provided by manipulation of the learning conditions encountered during repeated treatment.
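The methodological point can be shown with a curve-fitting sketch. The data below are simulated and purely illustrative (the pre- and post-treatment parameters are invented), but they show how fitting full dose-effect curves separates a potency shift (higher ED50) from a reduced maximal effect (lower Emax), a distinction a single test dose cannot make.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(dose, emax, ed50):
    return emax * dose / (dose + ed50)

doses = np.array([1, 3, 10, 30, 100, 300], dtype=float)
rng = np.random.default_rng(0)

# Hypothetical pre- and post-chronic-treatment observations (simulated).
pre  = hill(doses, emax=100, ed50=10) + rng.normal(0, 3, doses.size)
post = hill(doses, emax=70,  ed50=60) + rng.normal(0, 3, doses.size)

(pre_emax, pre_ed50), _   = curve_fit(hill, doses, pre,  p0=[100, 10])
(post_emax, post_ed50), _ = curve_fit(hill, doses, post, p0=[100, 10])
print(f"pre : Emax {pre_emax:5.1f}, ED50 {pre_ed50:5.1f}")
print(f"post: Emax {post_emax:5.1f}, ED50 {post_ed50:5.1f}")
# Testing only one dose (say, 30) would show a smaller effect after chronic
# treatment but could not attribute it to decreased potency, decreased
# maximal effect, or both.
```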
INFLUENCES OF LEARNING PROCESSES ON TOLERANCE TO BEHAVIORAL EFFECTS OF DRUGS
It is well established that development of tolerance to complex behavioral effects of drugs can depend on an individual's experiences in the drugged state. Such contingent tolerance is illustrated by a study by Chen (3), in which one group of rats received three doses of 1.2 g/kg ethanol before running a circular maze every fourth day, and a second group received the same doses of ethanol after running the maze. In the critical test of tolerance, all rats received their fourth dose of ethanol before running the maze. All rats displayed tolerance to sedative effects of ethanol, but only those rats that had always received ethanol before running the maze displayed tolerance to its error-enhancing effects. Because the groups differed only in whether rats had previously run the maze while intoxicated, such selective tolerance probably arose from intoxicated practice.
Several learning processes appear to play key roles in the acquisition, expression, or retention of tolerance to behavioral effects of drugs from numerous pharmacological classes (11, 12, 64). A number of terms have been used to describe behavioral influences, including associative tolerance, behavioral tolerance, behaviorally augmented tolerance, conditioned tolerance, contingent tolerance, environment-dependent tolerance, and learned tolerance. Although these multiple terms highlight situations in which different learning processes may operate to regulate tolerance, they also impede recognition of situations in which common processes operate. As with biochemical and cellular influences on tolerance, the learning processes that are likely to operate in a particular situation are those that normally govern the behaviors affected by a drug. In the case of complex learned behaviors, these learning processes are involved in their acquisition, maintenance, or adaptability. To date, the most important processes appear to be classical and instrumental conditioning. In the sections that follow, we review how these learning processes influence tolerance to behavioral effects of drugs (see also Behavioral Techniques in Preclinical Neuropsychopharmacology Research, Animal Models of Drug Addiction, and Animal Models of Psychiatric Disorders).
INFLUENCES OF CLASSICAL CONDITIONING AND HABITUATION ON TOLERANCE
The classical conditioning account of tolerance derives largely from empirical studies initiated by Siegel in 1975 (44), although it has been recognized for many years that classical conditioning processes may modulate drug effects (52, 63). Additionally, Solomon (50) independently developed a general opponent process theory of motivation that accounts for much of the data that support the classical conditioning theory of tolerance.
Procedures for the study of tolerance frequently involve repeated drug administrations and repeated testing. Such procedures are essentially classical conditioning trials. The drug acts as an unconditional stimulus (UCS) that elicits unconditional responses (UCRs). Almost inevitably, the drug UCS will be consistently paired with distinctive environmental stimuli (e.g., handling procedures or the environment in which drug is administered). After repeated pairings, such environmental stimuli may become conditional (drug-predictive) stimuli (CSs) that when presented alone elicit, as conditional responses (CRs), drug effects previously seen only as UCRs. Under such circumstances, one might expect conditioning of drug-like CRs. Siegel noted, however, that in many studies drug UCSs have been reported to elicit drug-opponent or compensatory CRs. Siegel proposed that tolerance develops when a drug UCS elicits two opposing responses in drug-experienced subjects: a constant UCR and a compensatory CR that grows with repeated treatments. Over repeated treatments (i.e., reiterated conditioning trials), the sum of the constant UCR and the increasing compensatory CR should progressively reduce the net drug effect, resulting in tolerance. This theoretical account of tolerance is logically similar to biochemical and cellular accounts of pharmacodynamic tolerance that emphasize recruitment of adaptive homeostatic processes.
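In outline, the summation account reduces to a simple update rule. The sketch below uses hypothetical values for the learning rate and response units; it shows the net drug effect declining across trials as a compensatory CR grows toward the constant UCR.

```python
trials, alpha, ucr = 12, 0.35, 10.0    # hypothetical learning rate and UCR size
cr = 0.0
for t in range(1, trials + 1):
    net = ucr - cr                     # observed effect = constant UCR minus CR
    print(f"trial {t:2d}: compensatory CR {cr:4.1f}  net drug effect {net:4.1f}")
    cr += alpha * (ucr - cr)           # CR grows asymptotically toward the UCR
```

The declining `net` column is the behavioral signature of tolerance, even though the UCR itself (the pharmacological signal) never changes in this scheme.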
The key postulate of the conditioning theory of tolerance is the compensatory CR, which counteracts the initial effects of the drug. In contrast, a drug-like CR should produce sensitization, that is, a decrease in the dose of a drug required for effect and/or an increase in the maximal effect obtained. Both drug-like and drug-compensatory CRs have been observed, however (45), raising questions about behavioral and pharmacological requirements for conditioning of adaptive responses. Stewart and colleagues have suggested that many drug effects are biphasic (e.g., an initial drug-induced hypothermia followed by hyperthermia). It is thus possible that development of compensatory CRs may involve conditioning of the secondary responses. As conditioning proceeds, the CS may elicit these opponent responses coincidentally with the initial drug effect, diminishing the effects of a constant drug dose (see ref. 8 for a detailed discussion of this issue).
If compensatory CRs mediate tolerance, a number of predictions follow from classical conditioning theory. The most obvious is that tolerance should be situation-specific; that is, it should be greater in the presence of CSs eliciting compensatory CRs than in the absence of such CSs. Tolerance to various effects of numerous drugs can show situation specificity in laboratory animals and humans (see ref. 45 for review). In a typical experimental demonstration of situation specificity, subjects are treated with drug in one environment and vehicle in a different environment. The environments differ in their lighting, sound, smell, and so on. After repeated pairings, drug effects are tested in both environments. Typically, more tolerance is seen in the drug-predictive environment than in the vehicle-predictive environment, presumably because the compensatory CR is elicited only in the former environment.
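The design maps onto a simple associative model. The sketch below is a Rescorla-Wagner-style update with hypothetical values, a schematic of the two-environment design rather than a fitted model: compensatory associative strength accrues only to the drug-paired context, so the same test dose produces a smaller net effect there.

```python
alpha, ucr = 0.3, 10.0
v = {"drug_room": 0.0, "vehicle_room": 0.0}   # associative strength per context

# Conditioning phase: drug always given in one room, vehicle in the other.
for _ in range(15):
    v["drug_room"]    += alpha * (ucr - v["drug_room"])     # CS paired with drug UCS
    v["vehicle_room"] += alpha * (0.0 - v["vehicle_room"])  # CS paired with no drug

# Test phase: the same dose in both rooms; the compensatory CR is elicited
# only by the drug-paired CS, yielding situation-specific tolerance.
for room, cr in v.items():
    print(f"{room:12s}: compensatory CR {cr:4.1f} -> net drug effect {ucr - cr:4.1f}")
```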
Situation specificity of tolerance has been studied most extensively for antinociceptive effects of μ-opiates, although animal studies provide evidence for considerable pharmacological and behavioral generality of the phenomenon (10, 45). Situation-specific tolerance has been reported with pentobarbital, ethanol, haloperidol, nicotine, midazolam, and scopolamine. The response systems supporting situation-specific tolerance include those involved in regulation of temperature, pain, drinking, motor activity, and catalepsy. Situation specificity has also been reported in human opiate addicts, who show tolerance to self-administered heroin but not to heroin administered by the experimenter, suggesting that drug-taking rituals become CSs that induce tolerance (7). Perhaps the most striking example of the dynamic manner in which environmental CSs modulate sensitivity to drug effects is provided by situation-specific tolerance to the lethal effects of heroin. Rats that receive a large dose of heroin in a drug-predictive environment suffer lower mortality than rats tested in an environment not previously paired with heroin (47). Siegel (45) suggested that so-called accidental opiate overdose deaths, which occur when addicts take a dose they have previously tolerated, are due to some critical change in the stimuli surrounding drug administration, such that the compensatory CR is not recruited and the full drug UCS is reinstated with dramatic adverse effects.
A further prediction from conditioning theory is that presentations of drug-predictive CSs in the absence of the drug UCS should leave the compensatory CR unopposed and thus detectable as a drug-opponent response. Furthermore, because withdrawal signs are typically drug-opposite, such compensatory CRs may resemble withdrawal signs. Thus, if subjects show situation-specific tolerance, they may also show situation-specific withdrawal. Studies do suggest that situation-specific withdrawal can be elicited by cues repeatedly associated with drug administration (9, 20, 24). Studies to identify biochemical or cellular correlates of situation-specific withdrawal should provide very interesting information about the pharmacological specificity of such changes.
Conditioning theory makes a number of other predictions about tolerance. Tolerance should be extinguished by repeated presentations of a drug-predictive CS in the absence of a drug UCS. This prediction has been validated. Morphine antinociception can be reinstated in previously tolerant animals by repeated exposure to a hot-plate test and placebo injections (44). The case for conditioning theory is strengthened further by demonstrations that tolerance displays other critical features of classical conditioning, including partial reinforcement, latent inhibition, sensory preconditioning, overshadowing, blocking, and external inhibition (see ref. 11 for a review). The phenomenon of conditional inhibition is particularly vivid. According to conditioning theory, if a CS is presented so that it predicts the absence of the UCS, it should become a conditional inhibitor that will retard subsequent conditioning. That is, if CS/UCS pairings are subsequently introduced, the rate of CR acquisition will be lower than in naive subjects. Siegel et al. (46) demonstrated conditional inhibition of tolerance to morphine-induced antinociception. Rats were kept in darkened cages for most of the day, so that a light CS could be presented for 1 hr/day. The conditional inhibition group was given explicitly unpaired morphine exposures, such that the light CS was presented 4 hr before the morphine UCS. Various control groups were exposed to the light CS alone, morphine alone, or no treatment. Subsequently, all rats received repeated tests on a hot plate in the presence of morphine and the light. Tolerance developed more slowly in the conditional inhibition group than in all other groups, including the morphine-alone group, which was drug-experienced at the start of hot-plate testing. Thus, exposure to morphine in a conditional inhibition procedure retards acquisition of tolerance to antinociceptive effects of morphine, making a strong case for the influence of classical conditioning processes.
The classical conditioning account of tolerance is supported by an impressive body of data. The theory is not without problems, however. Most notably, attempts to demonstrate the compensatory CR, the central theoretical idea of the theory, have met with mixed success. A number of studies have reported marked situation-specific tolerance without a clear compensatory CR (e.g., refs. 21, 34, 43). The presence of situation-specific tolerance in such studies implies that the absence of a compensatory CR was not due merely to low salience of the experimental CS. In other studies, however, evidence exists for compensatory CRs (e.g., refs. 23, 28). Relatively recent studies in humans have also reported compensatory CRs in alcoholics (35) and opiate addicts (7). The compensatory CR remains, therefore, an intriguing enigma in empirical studies.
The significance attributed to failures to detect compensatory CRs varies among researchers. Negative results may be of less significance than positive ones, as it is possible to explain the absence of the compensatory CR in a post-hoc fashion (37). There are a number of reasons why compensatory CRs may be difficult to detect (45). First, they may be difficult to detect in specific assays. For example, hyperalgesia may be detectable with a hot-plate but not with a tail-flick assay, due to the very low response latency in the latter assay (23). Second, compensatory CRs may be difficult to detect after placebo treatment because, in the absence of the drug UCR (which the CR is designed to counteract), normal homeostatic responses may be recruited to counteract the developing compensatory CR. Finally, drug onset cues themselves may become an essential feature of the CS complex. Because drug concentration rises gradually after administration, a low dose reliably precedes the full dose; low-dose onset cues may thus be a critical part of the overall compound CS, and tests for the CR after placebo, which lacks these interoceptive cues, may fail.
Siegel (45) and others (33) also suggest that the compensatory CR mediating tolerance may be unobservable under certain circumstances. For example, King et al. (21) found situation-specific tolerance to sedative effects of midazolam, but only weak evidence for conditional hyperactivity. They suggested that the CR mediating tolerance was conditional adrenocorticotropic hormone (ACTH) secretion, which antagonized midazolam's effects. Similarly, Melchior and Tabakoff (36) suggested that situation-specific ethanol tolerance may be mediated by conditional metabolism, because brain ethanol levels in a drug-predictive environment were lower than those in a non-drug-predictive environment. There may well be a strong case for extending the theory beyond CRs that are readily observable to promote studies of the biochemical and cellular bases of adaptive compensatory CRs. Extending the theory to unobserved CRs may weaken it, however, because it cannot be readily refuted unless the nature of the compensatory CR is specified explicitly. For example, if situation-specific tolerance is observed in the absence of an observable compensatory CR, it is possible to explain away the absence of the CR by assuming that tolerance was induced by a hypothetical unobservable compensatory CR. Such a theory is empirically irrefutable. There are obvious dangers in inferring that tolerance is mediated by an unobservable compensatory CR in the absence of any evidence that such a CR was present!
Failures to detect compensatory CRs reliably have prompted accounts of classical conditioning of tolerance that do not invoke compensatory CRs. The most influential of these is Baker and Tiffany's (1) habituation theory, which is based upon Wagner's (59) model of habituation to exteroceptive stimuli. There is a formal similarity between habituation to environmental stimuli, in which repeated exposures to a novel stimulus produce progressively smaller behavioral responses, and tolerance, in which repeated exposures to a drug stimulus produce smaller pharmacological responses. Baker and Tiffany's (1) theory assumes that the extent to which a drug stimulus elicits a response depends upon the extent to which the stimulus surprises the subject, which in turn depends upon the degree to which the stimulus is primed in short-term memory (STM). If a drug stimulus is not primed, its presentation surprises the subject and elicits a response. A drug stimulus can be primed in two ways: either by recent presentations of the drug or by recent presentations of drug-predictive stimuli. In both conditions the drug will elicit little or no response (tolerance will be observed) because the subject is not surprised by the drug. This psychological account explains the evidence implicating classical conditioning processes in tolerance by assuming that habituation also involves conditioning processes by which drug-predictive stimuli prime STM and thus induce tolerance.
In contrast to compensatory CR theory, Baker and Tiffany's (1) theory assumes that there are two kinds of tolerance: classically conditioned or associative tolerance, which results from priming by drug-predictive stimuli and is thus situation-specific, and nonassociative tolerance, which results from priming by recent drug presentations and is situation-independent. Habituation theory makes predictions about conditions under which these two types of tolerance will be observed. It predicts that infrequent administrations of high doses will lead to situation-specific (associative) tolerance, whereas frequent administrations of the same doses will produce situation-independent (nonassociative) tolerance that will, in turn, disrupt development of situation-specific tolerance. This prediction follows from the idea that frequent administrations of high doses will continuously "prime" the drug stimulus in STM and therefore block associations between drug stimuli and environmental stimuli. Recent studies support the prediction that situation-independent tolerance retards development of situation-specific tolerance. Tiffany and Maude-Griffin (54), using doses of 30 mg/kg morphine given at different interdose intervals, showed that only situation-specific tolerance is seen at a 96-hr interdose interval, whereas situation-independent tolerance predominates at a 24-hr interdose interval, accompanied by a decline in situation-specific tolerance.
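The priming logic behind these predictions can be sketched computationally. The toy model below is a loose schematic of Wagner's priming idea, not Baker and Tiffany's formal model; the decay constant, learning rate, and units are all hypothetical. A decaying trace of the previous dose and a drug-predictive context CS both prime STM, and associative learning occurs only to the extent that the drug still surprises the subject.

```python
import math

def simulate(interval_hr, doses=15, tau=24.0, alpha=0.3, ucr=10.0):
    """Response is blunted when the drug is primed in STM, either by a
    decaying trace of the last dose (nonassociative priming) or by a
    drug-predictive context CS (associative priming)."""
    v = 0.0                                   # associative strength of context CS
    for _ in range(doses):
        trace = math.exp(-interval_hr / tau)  # residual priming from last dose
        surprise = 1.0 - min(1.0, trace + v)
        v = min(1.0, v + alpha * surprise)    # CS learning only when drug surprises
    same_ctx  = ucr * (1 - min(1.0, math.exp(-interval_hr / tau) + v))
    novel_ctx = ucr * (1 - math.exp(-interval_hr / tau))  # CS absent: trace only
    return same_ctx, novel_ctx

for interval in (24, 96):
    same, novel = simulate(interval)
    print(f"{interval:3d}-hr interval: effect in paired context {same:4.1f}, "
          f"in novel context {novel:4.1f}")
# Short intervals leave the trace high at each dose: tolerance appears even in
# a novel context (situation-independent) and CS learning is blunted. Long
# intervals let the trace decay: tolerance appears only in the paired context.
```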
Similarly, Dafters and Odber (5) found that situation specificity of morphine tolerance is maximal at long interdose intervals, whereas high doses of morphine given at short interdose intervals produce situation-independent tolerance. Tiffany et al. (53) also showed that long interdose intervals produce situation-specific tolerance to antinociceptive effects of morphine, which is retained over a 30-day period. At short interdose intervals, tolerance is situation independent and rapidly lost. Again, conditions (large drug doses at short interdose intervals) that facilitate situation-independent tolerance inhibit situation-specific tolerance, supporting Baker and Tiffany's (1) theory. Such data are predicted uniquely by habituation theory. Compensatory CR theory, which deals only with situation-specific tolerance, makes no predictions about interactions between situation-specific and situation-independent tolerance. However, habituation theory is not without problems. First, unlike compensatory CR theory, habituation theory has nothing to say about the possible relationship between tolerance and dependence. Second, it is difficult to comprehend how habituation theory can account for the finding that procedures that induce situation-specific tolerance may also induce situation-specific withdrawal (see above). Finally, habituation theory cannot account for demonstrations of compensatory CRs other than by assuming that such phenomena are simply artifacts (34). However, as described above, many drugs have biphasic effects with acute effects being followed by rebound phenomena. The notion that it is possible to condition secondary drug-compensatory effects has intrinsic appeal, as conditioning of such responses would prepare the organism for a forthcoming toxin. The time is clearly ripe for an attempt to resolve the controversies between habituation theory and compensatory CR theory. This will not prove easy, however, because, as noted above, recent formulations of compensatory CR theory are vague in their definitions of compensatory CRs.
The links between behavioral habituation or compensatory conditioning and cellular or biochemical processes are largely unexplored. It is possible to condition activity of neurochemical systems (57). There is therefore reason to suppose that the habituation or classical conditioning processes involved in tolerance act at various levels of neuronal organization. For example, Liljequist et al. (31) reported situation-specific tolerance to ethanol's sedative actions accompanied by situation-specific tolerance to ethanol's action in stimulating dopamine metabolism. As noted above, Nestler and Duman review evidence suggesting that tolerance to inhibitory effects of morphine on locus coeruleus neurons involves compensatory up-regulation of the cAMP system, producing responses that oppose the actions of morphine. As such responses are, almost by definition, compensatory responses, it is possible that such intraneuronal responses can be conditioned, although empirical studies have not yet addressed this important issue. Integration of work at different levels of analysis represents a substantial challenge for the future.
Finally, it is worth commenting on these approaches to tolerance in so far as they relate to the question of whether tolerance involves a change in the drug stimulus during tolerance acquisition. Habituation theory assumes that the stimulus that causes tolerance is present at a constant intensity throughout treatment. If this were not the case, priming of STM would be reduced and tolerance would be lost. The conditions most favorable for priming of STM, and induction of tolerance, are drug presentations at high doses and short interdose intervals. However, induction of the drug stimulus presumably requires activation of appropriate receptor populations. It appears likely that these are precisely the conditions that might, for some drugs and treatment regimens, favor receptor down-regulation or desensitization, a decline in the drug stimulus and the resulting appearance of pharmacodynamic tolerance. Thus habituation theory requires that there be a constant drug stimulus during chronic treatment to induce tolerance, whereas some forms of pharmacodynamic tolerance may involve a diminishing drug stimulus. As regards conditioning theory, reduced stimulus strength (reduction in the magnitude of the UCS) should retard conditioning, which is positively related to UCS magnitude. Thus any biochemical or cellular processes that progressively diminish a drug stimulus should retard acquisition of tolerance mediated by habituation or classical conditioning. Clearly, it is a matter of considerable importance to attempt to dissect out differences between changes in sensitivity acquired as a result of conditioning or habituation processes and those involving other adaptive processes (see Animal Models of Drug Addiction, Animal Models of Psychiatric Disorders, and Cocaine).
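This interaction can be illustrated with a small extension of the summation sketch given earlier. In the toy comparison below (all values hypothetical), conditioning tracks the current UCS magnitude, so a UCS that shrinks across treatments, as it might under receptor down-regulation or desensitization, supports much weaker conditioned tolerance than a constant UCS.

```python
trials, alpha, ucr0, decay = 12, 0.35, 10.0, 0.85
for label, decaying in (("constant UCS", False), ("declining UCS", True)):
    cr, ucr = 0.0, ucr0
    for _ in range(trials):
        cr += alpha * (ucr - cr)  # conditioning scales with current UCS size
        if decaying:
            ucr *= decay          # e.g., desensitization shrinks the drug signal
    print(f"{label}: compensatory CR after {trials} trials = {cr:4.1f}")
# Decremental processes that erode the drug stimulus thereby retard
# acquisition of tolerance mediated by conditioning or habituation.
```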
INFLUENCES OF INSTRUMENTAL LEARNING ON TOLERANCE
Many behavioral effects of drugs are changes in instrumental behaviors, behaviors whose topography, temporal pattern, and persistence are shaped by their consequences. Complex instrumental behaviors emerge from undifferentiated behavior through reinforcement of successive changes in form and temporal patterning. When an individual encounters regular scheduling arrangements between instrumental behaviors and their reinforcing or punishing consequences, predictable steady-state performances emerge. Although stable over time, these steady-state performances remain sensitive to changes in the probability or scheduling of their antecedents and consequences. If a drug initially disrupts instrumental performances, the persistence of such disruptions can be modified by their consequences. Current evidence implicates at least two instrumental learning processes, reinforcement and stimulus control, in tolerance to effects on instrumental behaviors of a wide range of drugs, including behavioral stimulants, anxiolytics and sedative-hypnotics, cannabinoids, and opiates (11, 58, 64).
Tolerance to disruptive effects of drugs on instrumental behaviors can depend on performing the behaviors during intoxication or, alternatively, experiencing behavioral disruptions that change the consequences of behavior. The general principles can be illustrated by studies with ethanol. LeBlanc et al. (29, 30) studied rats negotiating a circular maze for food or traversing a moving belt above an electrified grid. Ethanol initially impairs performance in either task, decreasing the frequency of food delivery or increasing contact with electric shock. Rats allowed to execute their task while intoxicated, and therefore to experience the consequences of ethanol-induced impairment, develop tolerance to effects of a daily dose of ethanol faster than rats given the same daily dose but allowed to execute the task while intoxicated only at intervals of several days. Later work by Wenger et al. (60, 61) showed that removing interpolated intoxicated test sessions prevents development of tolerance to ataxic effects of ethanol over at least 3 weeks of exposure. Recent studies by Holloway and colleagues have shown that, under various schedules of food reinforcement, consequences of intoxicated performances can shift the full dose-effect curve for ethanol, so that higher doses are required for both rate-increasing and rate-decreasing effects. Rats given ascending daily doses of ethanol before experimental sessions develop greater tolerance to both effects than rats given the same doses after daily sessions or while testing is suspended (2, 14, 17). Presession ethanol produces greater tolerance than postsession ethanol when administered either daily (2, 14, 17) or at intervals of several days (16). Control experiments showed that such differential tolerance requires both performance while intoxicated and some minimal exposure to ethanol.
Several features suggest that the greater tolerance after presession ethanol arises from instrumental reinforcement processes. The magnitude of tolerance to initial effects of ethanol varies directly with the number of opportunities for intoxicated practice (18, 30) and can be retained for long periods after chronic treatment ends (see refs. 2, 14, 17, 26). Moreover, differential tolerance does not appear to arise from differences in ethanol distribution or metabolism, age-related factors, or sensitivity to drug-induced disruptions in general (14, 16, 25). Tolerance developed through reinforcement processes is, in turn, modulated by the dose and duration of ethanol treatment (26, 62). Interestingly, reinforcement processes also modulate development of within-session tolerance to ethanol (25), suggesting rapid recruitment of adaptive processes.
The strongest evidence that immediate consequences of drug-induced behavior can regulate development of tolerance comes from studies demonstrating differential tolerance to effects that differentially alter how successfully behavior meets local requirements for reinforcement. Schuster et al. (42) introduced what has come to be called the reinforcement density or reinforcement loss hypothesis to account for differential tolerance to stimulant effects of d-amphetamine that differ in their consequences for the likelihood of reinforcement. Schuster et al. studied performances under two schedules of food reinforcement: one in which responses were reinforced only when a prespecified time interval elapsed between successive responses, and a second in which responses during a prespecified interval had no effect and the first response at the end of the interval produced reinforcement. The two schedules alternated several times within short daily sessions. Both schedules generated fairly low rates of responding, and a dose of 1.0 mg/kg d-amphetamine initially increased response rates under both. When d-amphetamine was administered daily, tolerance developed rapidly to rate increases that forfeited reinforcers under the first type of schedule, but not to comparable rate increases that did not forfeit reinforcers under the second type. Because the two reinforcement schedules, and their corresponding tolerant or nontolerant behaviors, alternated within sessions, differential tolerance probably did not arise from dispositional sources. Under other conditions, no tolerance developed to rate-increasing effects of d-amphetamine that improved shock avoidance performances. A common interpretation of such outcomes is that reinforcement processes determine both tolerance to drug effects that produce costly behavioral outcomes and lack of tolerance to effects that improve or do not change the likelihood of reinforcement.
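The reinforcement-loss logic can be sketched with a toy model of the two contingencies. In the simulation below, the response-rate model, schedule values, and adaptation rule are all hypothetical simplifications, not a reconstruction of the original experiment: a drug-induced rate increase forfeits reinforcers only under the pause-requiring schedule, and tolerance accrues in proportion to that loss.

```python
import math

def pause_schedule(r, t=0.5):    # response pays off only if the pause since the
    return r * math.exp(-r * t)  # last response exceeds t min (exponential IRTs)

def interval_schedule(r, t=0.5): # first response after t min pays off
    return 1.0 / (t + 1.0 / r)

baseline_r, drug_gain, k = 2.0, 3.0, 0.5  # responses/min, drug multiplier, adaptation rate
tol = {"pause-requiring": 0.0, "interval": 0.0}

for session in range(15):
    for name, sched in (("pause-requiring", pause_schedule),
                        ("interval", interval_schedule)):
        gain = 1.0 + (drug_gain - 1.0) * (1.0 - tol[name])  # attenuated drug effect
        loss = max(0.0, sched(baseline_r) - sched(baseline_r * gain))
        tol[name] = min(1.0, tol[name] + k * loss)          # loss drives adaptation
print("tolerance after 15 sessions:",
      {name: round(v, 2) for name, v in tol.items()})
# Rate increases cost reinforcers only under the pause-requiring schedule, so
# tolerance develops there and not under the interval schedule, mirroring the
# differential outcome Schuster et al. reported.
```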
Multiple lines of evidence suggest that dynamics of local reinforcement processes modulate development of tolerance to behavioral effects of not only d-amphetamine and other behavioral stimulants, but also anxiolytics and sedative-hypnotics, opiates, cannabinoids, and other drugs (11, 58, 64). The most common, albeit indirect, evidence comes from studies showing that tolerance to drug-induced changes in instrumental behaviors develops when drug is administered chronically before daily experimental sessions, and fails to develop when the same doses are administered postsession. More direct evidence for control of tolerance by drug-induced changes in the interactions of behavior with its consequences comes from demonstrations that tolerance develops to initial effects that decrease reinforcer deliveries, but may not develop, or develops to a lesser degree, to similar initial effects that do not change reinforcer deliveries under other schedules of reinforcement. Such differences are particularly powerful when they occur in the same subjects sequentially within the same experimental session (42, 58, 64), because such sequences minimize the possibility that differential tolerance arises from unsuspected differences in drug disposition. In the case of some drugs, particularly d-amphetamine, control experiments have also ruled out certain other nonlearning processes, such as differential food deprivation or altered body weight set point, as mechanisms for differential tolerance to effects on food-reinforced behaviors (6).
Reinforcement loss is only one behavioral influence on tolerance to drug effects on instrumental behaviors. Tolerance to the effects of drugs on instrumental behaviors can come under the stimulus control of environmental cues, so that tolerance to behavioral effects in one environment may not transfer to similar behavioral effects associated with distinctly different environments or task demands (49). Tolerance is also modulated by strength of stimulus control of behavior (39); amount of work, or effort, required for reinforcement (13); relative number of reinforcement opportunities (18); mental rehearsal of task components (58); and context of reinforcement loss (48). With respect to the latter factor, costly initial effects of a drug may diminish later drug effects only when the drug-induced loss is large relative to total available reinforcers. One challenge for future research will be to delineate how stimulus control and reinforcement loss interact with other controls on instrumental behavior to modulate tolerance development. Another will be to define the boundary conditions, especially the dosing conditions, under which these learning processes modulate tolerance, and to identify factors responsible for reported exceptions (see ref. 64).
Although there is considerable evidence that tolerance to behavioral effects of numerous drugs can be controlled by the dynamics of local reinforcement and/or stimulus control processes, many unanswered questions remain. One concerns how reinforcement processes operate. Reinforcement processes may (a) reestablish instrumental repertoires that have been disrupted by novel effects of a drug, (b) establish new instrumental repertoires that incorporate initially incompatible responses elicited by a drug, or (c) establish new repertoires that incorporate responses that compensate for or oppose initial effects of a drug (37, 64). As an example of the second possibility, Wolgin and Kinney (65) suggested that reinforcement processes shape the stereotyped head movements originally elicited by d-amphetamine into a new instrumental response topography that meets requirements for reinforcement in a milk drinking task. As an example of the third possibility, Holloway and King (15) reported that development of tolerance to initial rate-decreasing effects of ethanol in food-reinforced tasks that favor rapid responding may be accompanied by compensatory rate-increasing effects, as revealed by changes in performances in tasks that favor paced responding. A second question concerns how variations in pharmacological parameters affect the influence of instrumental learning. Of particular importance, few studies have examined how, or whether, the dose, frequency, or duration of drug treatment constrains the operation of instrumental learning processes; whether tolerance modulated by learning processes is retained over long drug-free intervals (2, 42); whether it is accompanied by homologous or heterologous cross-tolerance (14, 40, 41, 66); or whether instrumental learning processes can modulate biochemical or cellular adaptations. With respect to this latter point, it is unclear whether instrumental learning processes operate primarily to modulate effects of a constant drug signal (as might be expected at low doses that do not recruit other homeostatic adaptive processes), or whether they can operate in concert with biochemical or cellular adaptations that reduce the initial behavioral effects of a drug.
Finally, few studies have examined whether differential tolerance to behavioral effects of drugs is accompanied by differential changes in their cellular or biochemical effects. Such studies are becoming technically feasible, and their results may provide challenging information about interactions of adaptive changes at multiple levels of analysis. For example, Sannerud and colleagues (40) compared behavioral and biochemical effects following pre- or postsession administration of chlordiazepoxide. Repeated doses of 18 mg/kg chlordiazepoxide produced tolerance to behavioral effects of chlordiazepoxide only when administered presession. Cross-tolerance developed to midazolam in all rats, but was greater in rats treated with presession chlordiazepoxide. Pre- or postsession treatment produced comparable sensitization to the inverse agonist FG 7142 and did not change sensitivity to various nonbenzodiazepines. Repeated treatment also produced a significant increase in γ-aminobutyric acid (GABA)-stimulated Cl− uptake in both cortical and cerebellar tissue. Differences between rats treated pre- or postsession with chlordiazepoxide appeared only in cerebellar tissue, with a smaller increase in GABA sensitivity following presession treatment, suggesting a complex relation among learning processes, changes in functional states of the GABA/benzodiazepine receptor complex, and changes in sensitivity to behavioral effects of chlordiazepoxide.
CONTRIBUTIONS OF MULTIPLE PROCESSES TO DEVELOPMENT OF TOLERANCE
Behavior is an activity of living organisms, not a passive transmitter of drug effects. The lasting effects of a repeatedly encountered drug are jointly determined by properties of the drug and its specific receptor and effector systems, properties of predrug behavior, and learning processes that govern the expression and plasticity of behavior during drug exposure. We have reviewed some of the behavioral adaptations that may be initiated by repeated drug administration and stressed that these adaptations can be organized in terms of general principles of behavior, specifically classical and instrumental conditioning processes. A key challenge for future research will be to define boundary conditions for operation of specific behavioral adaptations. For instance, recruitment of behavioral adaptations may be critically dependent on chronic dosing regimens. Low doses given infrequently may favor classical or instrumental learning processes, whereas higher doses may favor other homeostatic adaptations or receptor regulation processes that decrease signal intensity. Although some theoretical accounts of tolerance to behavioral effects of drugs argue that oppositional and decremental adaptive processes are mutually exclusive (e.g., refs. 1, 37), it remains an empirical question whether such processes operate concurrently and under what conditions they do so.
Progress in understanding the influences of these behavioral adaptations complements progress in understanding biochemical and cellular adaptations to repeated drug administrations. In our view, a major challenge for the future is to identify the cascades of adaptations involved in tolerance to behavioral effects of major psychoactive drugs. A useful perspective holds that these adaptive processes are triggered by the acute initial effects of a drug and can be organized in terms of general principles of biological regulation at biochemical, cellular, and behavioral levels. To date, however, few studies have linked adaptive changes at a behavioral level with adaptations studied at biochemical or cellular levels. Studies of behavioral factors in drug tolerance have rarely included detailed examinations of changes in drug disposition or kinetics, or of changes in biochemical or cellular effects of the drug of interest. Similarly, studies of biochemical or cellular factors in tolerance have rarely explored how relevant processes are modulated by environmental factors known to modulate changes in sensitivity to behavioral effects in living organisms. Progress in understanding the consequences of repeated drug administration will require iterative parametric work to organize information within and across levels of analysis.
We thank the editors of this volume and the following colleagues for critical comments on a prior version of this chapter: F. C. Colpaert, M. Emmett-Oglesby, F. A. Holloway, C. A. Sannerud, J. B. Smith, S. T. Tiffany, and D. L. Wolgin. Preparation of this chapter was supported in part by U.S. Public Health Service grant DA03796 from the National Institute on Drug Abuse to A. M. Young, who is recipient of NIDA Research Scientist Development Award K02 DA00132.
published 2000