Sept. 16, 1999
Environmental influences on behavior, Learning
The magnitude of environmental influences (e.g., hormones and nutrients) on behavior helps to determine whether a behavioral program is relatively closed or open. Today we will focus on various types of learning. Learning is an expression of the degree to which a behavior is open to modification through environmental influence.
I. Adaptation--the most trivial form of alteration in an animal's behavior as a result of experience; a term taken from the jargon of neurophysiologists that has nothing to do with adaptation in an evolutionary sense: a sensory receptor that is continually stimulated for a prolonged period stops responding. Ex. the slight deafness one experiences when coming out of a loud concert.
In the simplest cases of adaptation, the sensory receptor literally uses up its supply of neurotransmitter, and until the cell manufactures more it can't cause the postsynaptic neuron to fire. Similarly, an animal that repeatedly performs a behavior may cease to do so as a result of muscle fatigue. Adaptation is not considered learning.
II. Types of learning--any change in an individual's behavior that is due to its experience--several types of learning have been defined in the psychological and ethological literature; the divisions between them are sometimes a little artificial but they are still useful for organizing our thinking.
Non-associative learning--the animal modifies its behavior, but not because of any association of stimuli; habituation, dishabituation, and sensitization are described below.
A. Habituation--the reduction or elimination of a behavioral response to a stimulus repeatedly presented without positive or negative reinforcement. The animal simply gets used to the familiar, unreinforced stimulus and learns to ignore it.
Ex. the escape response of fish to a shadow passing overhead diminishes progressively if the stimulus is repeated every few minutes, until the fish cease to react at all; scarecrows deter birds from crops for a short time, but the birds soon habituate to them and they stop working.
B. Dishabituation--the recovery of a habituated response following the presentation of a strong or novel stimulus.
Ex. The sea slug, Aplysia, withdraws its gills if touched on the siphon, the tube through which water is drawn over the gills. If you touch the siphon repeatedly, the slug habituates and gill withdrawal ceases. However, a weak electric shock to the tail causes the response to dishabituate, and the slug again becomes sensitive to being touched on the siphon.
This recovery shows that the original decline in response was due to habituation and not to sensory adaptation or muscle fatigue: if a neuron had run out of neurotransmitter, or a muscle could no longer contract, a shock to the tail could not make them suddenly recover. What actually happened during habituation was a decrease in the amount of neurotransmitter released by the sensory neuron.
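The time course described above can be sketched as a toy simulation. The decay rate and response values below are invented for illustration; they are not measurements from Aplysia:

```python
# Toy model of habituation and dishabituation in a gill-withdrawal-style
# reflex. All numbers are illustrative, not data from Aplysia.

def run_trials(n_touches=10, decay=0.7):
    """Return response strengths over repeated unreinforced touches,
    then after a dishabituating tail shock."""
    strength = 1.0
    responses = []
    # Habituation: each unreinforced touch reduces transmitter release
    for _ in range(n_touches):
        responses.append(round(strength, 3))
        strength *= decay
    # Dishabituation: a strong novel stimulus (tail shock) restores the response
    strength = 1.0
    responses.append(round(strength, 3))
    return responses

print(run_trials())
```

Resetting the strength to full after the shock mirrors the key observation: the response recovers at once, which transmitter depletion or muscle fatigue could not explain.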
C. Sensitization--the enhancement of a response to an unhabituated stimulus by first presenting a different strong or novel stimulus; the animal becomes more alert and responsive to the unfamiliar stimulus. Unlike dishabituation, sensitization does not require prior habituation.
A weak shock to the tail of an Aplysia that has not been habituated causes it to withdraw its gills more rapidly the first time the siphon is touched. It also takes more touches to habituate a slug that has first been shocked than one that hasn't.
The mechanisms of these simple forms of learning in Aplysia have been studied thoroughly by neuroethologists, particularly Eric Kandel and his colleagues. In some intricate research they analyzed the slug's "memory" of familiar stimuli at the level of intracellular biochemical changes. Aplysia is particularly amenable to fine-scale research because it has such a simple nervous system, containing only about 20,000 central nerve cells. Kandel and his colleagues have gone on to elaborate the likely series of steps involved in the gill-withdrawal reflex on the basis of pharmacological and biochemical evidence.
Associative learning or conditioning--two events are associated, the second acting as a reinforcer for the first. If the first event is rewarded or positively reinforced, the behavioral response will become more likely the next time; if it is punished or negatively reinforced, the response becomes less likely.
D. Classical conditioning (also known as stimulus substitution and Pavlovian conditioning)--the two events that are associated are both stimuli. The phenomenon was discovered independently in honeybees by von Frisch and in dogs by Pavlov.
Pavlov's dogs, which drooled whenever they smelled meat powder, were presented with both meat powder and another cue, such as the ringing of a bell. After repeated experience of the two stimuli together, the dogs formed an association between meat powder, which ordinarily releases salivation, and the bell, which ordinarily does not. After the association was formed, when dogs were presented with the bell alone, they salivated in apparent expectation of the meat powder.
In general terms, classical conditioning always involves an unconditioned stimulus (US), which produces an unconditioned response (R) prior to conditioning.
In the training procedure, the US is paired with a new conditioned stimulus (CS) which initially has nothing to do with releasing the response. After repeated pairings of US + CS the CS, in this case the bell tone, produces the response by itself.
That is, first US >> R;
then US + CS >> R;
finally CS >> R.
Thus a natural behavioral response is transferred from the natural stimulus, which is often a key stimulus, to a new stimulus by means of reinforcement. This is why classical conditioning is also called stimulus substitution.
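The gradual transfer of the response to the CS is often described with the Rescorla-Wagner error-correction rule. Here is a minimal sketch; the learning rate and asymptote are chosen arbitrarily for illustration:

```python
# Rescorla-Wagner sketch of classical conditioning: V is the associative
# strength of the CS (bell); lam is the maximum strength the US (meat
# powder) can support. Parameter values are illustrative.

def condition(n_pairings=20, alpha=0.3, lam=1.0):
    V = 0.0                      # bell initially predicts nothing
    history = []
    for _ in range(n_pairings):
        V += alpha * (lam - V)   # update on each US + CS pairing
        history.append(V)
    return history

h = condition()
print(round(h[-1], 3))  # -> 0.999: the CS alone now nearly saturates the response
```

The update shrinks the gap between what the CS predicts and what the US delivers, so associative strength rises steeply at first and then levels off, much like empirical acquisition curves.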
E. Operant conditioning--an entirely new behavioral response becomes associated with a reward or other reinforcer. In nature, an ex. might be tool use.
This was discovered by Thorndike, but it was pursued so enthusiastically by B. F. Skinner that we now associate his name with operant conditioning.
A pigeon or a rat placed in a so-called Skinner box begins by exploring its unfamiliar environment. Sooner or later, the animal accidentally presses a lever it finds there and receives a food pellet.
After repeated experiences, the animal comes to associate the reward with pressing the lever. Thus an accidental response is shaped by reward or punishment; for this reason, operant conditioning is also called trial and error learning.
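The shaping process can be caricatured in the same spirit. In this toy model (all numbers invented for illustration), reward raises the probability of the lever press and punishment lowers it:

```python
# Toy model of operant conditioning: reward strengthens an initially
# accidental response; punishment weakens it. Parameters are illustrative.

def shape(p=0.05, gain=0.3, n_reward=10, n_punish=5):
    history = [p]
    for _ in range(n_reward):
        p += gain * (1.0 - p)   # food pellet: response becomes more likely
        history.append(p)
    for _ in range(n_punish):
        p -= gain * p           # punishment: response becomes less likely
        history.append(p)
    return history

h = shape()
print(round(h[10], 2))  # probability of pressing after 10 rewarded trials
```

In this sketch, ten rewarded presses raise the response probability from 0.05 to about 0.97, and the subsequent punished trials push it back down; a rare accidental act becomes the dominant behavior, or is suppressed, purely through its consequences.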
Conditioning procedures such as these are obviously carried out in the laboratory. However, it is easy to think of many instances where conditioning plays a crucial role under natural circumstances.
It is often suggested that play behavior is a form of natural operant conditioning; the evidence is somewhat equivocal on this point.
It has been suggested that operant conditioning may underlie the origin of tool use in some animals, such as orangutans and other primates that use stones and branches in defense against predators, or chimps that learn to use a stick to extract insects from holes in wood.
F. Insight learning or reasoning--problem solving through perception of configurational relationships, as opposed to trial and error (it may be a form of operant conditioning in which the animal runs the trials in its head rather than performing them overtly, although some people would argue this is not the case)
In Kohler's famous experiments chimpanzees had to pile up boxes to get to food that was out of reach.
Many animals can apparently "learn to learn," that is, generalize concepts between experiments in so-called learning sets: after learning a behavior in one round of conditioning, the animal learns a set of similar behaviors with fewer reinforcements. For example, pigeons and parrots can be taught to select the odd circular object out of a set of square objects and to generalize this concept of "oddness," so that they can also pick out the odd square from a set of circles.
G. Imitation--though behavioral novelties such as tool use and so-called insights may originate by operant conditioning, in many social species, especially primates, other members of the social group can subsequently acquire them by imitation.
In one famous example, a single female Japanese macaque learned to wash potatoes and to separate rice grains from sand by dunking them in water. Other troop members imitated her and these practices spread rapidly.
This phenomenon is not limited to primates--in the days when people in Britain still had milk deliveries left out on the porch, great tits learned to open foil-topped milk bottles, a skill that spread by imitation among conspecifics and even to other species such as the blue tit, with which great tits forage in mixed flocks.
Not all studies of imitation learning are so anecdotal.
Ex. of imitation--Mobbing
Curio (1988) has shown that mobbing behavior can be transmitted by imitation--essentially culturally. Mobbing is the behavior in which small birds band together to attack predators such as birds of prey.
Curio set up a screened apparatus in which two European blackbirds could see each other and each could see stuffed dummies in adjacent compartments, but neither bird could see the other's dummy.
An adult bird, designated the teacher, was shown an owl while a naive young bird, the student, was shown a perfectly innocuous bird that no blackbird had ever seen before--an Australian honeyeater.
The owl stimulated the teacher to display mobbing behavior which, as far as the student could tell, was directed at the honeyeater, and the student was soon frantically mobbing the honeyeater itself.
Next, the teacher and the owl were removed, another naive blackbird was put in, and the honeyeater was placed so that both birds could see it. Watching the former student mob the honeyeater, the second young bird also learned this behavior.
By this means, an intense and wholly irrational hatred of honeyeaters was passed along a chain of six young blackbirds with no diminution of the mobbing response.
Learning by observation and imitation of others results in traditions passed on within social groups by non-genetic transmission; this can justifiably be considered a form of CULTURE. Culture of this sort is surprisingly commonplace in animals. Chivers and Smith (1995) recently showed that fathead minnows can also learn to show a fright response (particularly hiding under shelter) to certain odors simply through exposure to minnows that have been conditioned to show a fright response in the presence of that odor.
III. Learning--predispositions and constraints
So far we have discussed types of learning, but we have not considered the role of natural selection in shaping an organism's capacity for learning. Most ethologists would argue that natural selection can be as important here as in any other area of behavior, but this strongly contradicts a view once held by some psychologists. What factors determine the amount or types of behaviors that can be learned? Are some behaviors more likely to be learned than others? The controversy and studies I will present next demonstrate some of the constraints on learning in organisms.
The psychological learning theorists developed the law of equipotentiality, which states that any two events can be learned associatively with equal ease, in any animal.
This assumption of a "general process" of all learning in all species justified studying rats and pigeons in order to extrapolate to humans, the true focus of interest for many investigators.
In the mid-to-late 1960s, Garcia caused a furor within the field of learning psychology with a series of flavor-aversion experiments.
Rats were fed flavored water and at the same time either injected with a nausea-inducing substance or briefly irradiated, which also causes nausea. They subsequently refused to drink the flavored water.
Rats that were nauseated just once, and as long as 7 hours after drinking, still made the association. The long delay makes perfect adaptive sense for rats, since they must learn not to drink things that are poisonous, and not all poisons are fast-acting. Classical conditioning, in contrast, usually requires a short interval between the associated events and several learning bouts.
In direct contrast to the law of equipotentiality, rats were unable to learn to avoid flavored water if it was associated with an electric shock rather than nausea.
They also failed to learn to associate light or the sound of a buzzer with nausea, though they easily associated the buzzer with a shock.
In a lengthy series of such experiments, rats were able to make some associations, but not others, and to ethologists there was an obvious pattern to be drawn from these results: that the adaptations of rats to their natural environment include predispositions to learn some biologically meaningful associations and constraints against learning meaningless ones. Nausea would be a normal response to drinking something poisonous while an electric shock would not.
IV. In addition to adaptive predispositions in what can be learned, many species exhibit predispositions as to when learning takes place. This is particularly evident in the phenomenon of imprinting--a rapid and persistent form of learning, usually in young animals.
Imprinting was discovered by Heinroth and studied extensively by Lorenz. Young goslings and ducklings rapidly learn to treat as the mother any animal that they see soon after hatching. Lorenz found that they readily imprint on humans.
Ducklings exhibit a stereotyped following response to the imprinted object, and Hess (1959) used a carousel setup to quantify the effectiveness of imprinting on various artificial stimuli. Ducklings can be imprinted on all sorts of completely unnatural stimuli as easily as on realistic dummies of mother ducks. They will enthusiastically follow practically anything that moves, even plastic balls.
Conspicuous motionless objects such as flashing lights and boldly patterned objects are also effective for imprinting. Apparently the duck has very few predispositions regulating what can be learned; the stimulus just has to get its attention, for which either conspicuousness or motion will suffice. However, when such learning can occur is a rigidly programmed feature of the duck's nervous system.
Imprinting as originally defined by Lorenz is characterized by four criteria; the two given below concern its timing and permanence:
(1) it takes place in a highly specific time interval called the critical or sensitive period. The timing of the critical period is species-specific. Imprinting on a parent object usually occurs within 5 to 24 hours after hatching in chicks, but within a much narrower window of 13-16 hours after hatching in ducklings.
In some birds, imprinting on auditory stimuli from the mother begins while the chick is still inside the egg.
Occasionally imprinting can be made to occur after the sensitive period normally ends, if birds are deprived of any suitable stimulus throughout the critical period.
(2) it is largely irreversible by subsequent experience.
With these severe restrictions on timing and subsequent flexibility, imprinting is a form of learning that lies nearer the closed end of the open-closed program continuum. However, the fact that the object on which individuals imprint is flexible indicates the program is open to some environmental influence.