Guthrie, Piaget, Tolman, Hebb


Front Back
stimulus for Guthrie internal change in "receptors"--an organismal event; includes movement-produced stimuli; cues cause stimuli, internal stimuli cause responses
cue for Guthrie external phenomena external cues produce internal stimuli
conditioners Guthrie stimuli paired with some response recognition that there are lots of cues creating lots of stimuli--need connection with response
response for Guthrie muscle twitches response to neural input (stimuli) stimuli all come together to produce muscle twitches reduced response to its most basic element
act for Guthrie collection of responses this is what Skinner/Hull/Pavlov considered a response according to Guthrie, they only studied an artifact of behavior
Guthrie's Single Law of Learning LAW OF CONTIGUITY all learning basically classical conditioning if you do something in a given situation, the next time you are in that situation, you will do it again CS should precede UCS by about 1/2 second
Principle of Recency In terms of Guthrie, the response performed last in the presence of a set of stimuli will be the one performed when that stimulus combination next occurs.
How does Guthrie account for acquisition? all or none approach a muscle twitch either occurs or it doesn't the incremental curve observed by other theorists is a whole series of muscle twitches (an act) stimuli gain full associative strength at first pairing as more twitches become associated with each other and specific stimuli, there is a refinement of responses
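Guthrie's claim above can be made concrete with a small simulation: if each component muscle twitch is associated all-or-none on some pairing trial, the proportion of the whole "act" mastered still traces a gradual, incremental curve. This is only a sketch; the number of twitches and the pairing probability are hypothetical illustrative values, not anything Guthrie specified.

```python
import random

random.seed(42)

N_TWITCHES = 50   # component movements making up one "act" (hypothetical count)
P_PAIRING = 0.2   # chance a given movement is paired with the stimuli on a trial

learned = [False] * N_TWITCHES
curve = []
for trial in range(20):
    for i in range(N_TWITCHES):
        # all-or-none: a movement is either fully associated or not at all
        if not learned[i] and random.random() < P_PAIRING:
            learned[i] = True
    # fraction of the act's components now associated with the situation
    curve.append(sum(learned) / N_TWITCHES)

# The proportion mastered rises gradually across trials even though every
# individual association was formed in a single trial.
print([round(x, 2) for x in curve])
```

The smooth-looking curve is exactly the "artifact" Guthrie attributed to other theorists: many one-trial learnings aggregated into an act.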
For Guthrie, what is the relationship between the number of stimuli present and the occurrence of a response during any given trial? if all cues and stimuli are similar, then responses will be similar--transfer of training transfer of training probably minimal because cue-stimuli-response relationships are very specific if there are dissimilar stimuli, there are incompatible responses associative shifting very influential for Guthrie
What is the role of reinforcement for Guthrie? reinforcement changes cue-stimuli configuration in which animal is imbedded change in stimuli cause animal to preserve the last thing it did--recency principle
Seward experiment two groups of rats lever press; one group immediately removed from box when it responds; tested recency principle--the task should be learned because the stimulus configuration changed; it wasn't....Guthrie not supported
For Guthrie, are operant and classical conditioning any different? they are the same thing ALL learning is contiguous and OC/CC are just reflections of this
How did Guthrie explain motivation? internal drive states exist drives create maintaining stimuli maintaining stimuli last for a long time sequences of behavior associated with maintaining stimuli seem to be interrelated and local
Did Guthrie think behavior was purposive? Guthrie thought behavior was organized and systematic, not just a random series of muscle twitches maintaining stimuli provide framework for intentions intentions are responses conditioned to maintaining stimuli mechanistic, no cognition involved
How does Guthrie explain chaining? movement produced stimuli similar to Watson's kinesthetic feedback after a response is initiated by a set of internal stimuli (bodily movements), the body itself produces the stimulus for the next response, and that response produces the stimulus for the next…
How does Guthrie account for extinction? extinction is not passive there is no weakening of connections it is like counterconditioning or associative inhibition organism learns to do something else that replaces original behavior
How can Guthrie say practice makes perfect? he is proponent of one-trial learning for responses, not for ACTS Skills are made up of many acts—practice allows specific associations between muscular stimuli and responses to be made refinement of responses
Guthrie's views on punishment punishment not necessarily bad, depends on what behavior you make the organism do if punishment doesn't tell the organism what to do, then it won't work well need to enable construction of new behaviors--punishment must elicit incompatible behavior expose organism to stimuli that create incompatible responses
what is a habit for Guthrie? a response that has become associated with a large number of stimuli the more stimuli, the stronger the response
Fowler and Miller Experiment trained rats to traverse runway for food reinforcement control group, one group given shock to front paws as they reached the food cup, one group given shock to hind paws as they reached food cup front paw group ran slower, hind paws facilitated running supportive of Guthrie's notions on punishment
threshold method Guthrie method of breaking habit make small incremental changes that allow for incompatible behavior
fatigue method Guthrie method of breaking habits keep doing behavior until it bores you, then stimuli become associated with avoidance
sidetracking Guthrie habit breaking method avoid stimuli associated with habit
incompatible response method Guthrie habit breaking method introduce stronger stimuli associated with incompatible behavior alongside stimuli associated with habit
How does Guthrie account for spontaneous recovery? extinction occurs with replacement of one behavior for another spontaneous recovery occurs when we place the animal back in the experimental envt, but the animal and the envt change slightly every day in extinction, you slowly knock out the stimuli-response associations a few at a time knocking out certain stimuli leads to fewer responses with spontaneous recovery, some of the stimuli and responses associated with those stimuli still remain meanwhile, the original stimuli are slowly being paired with other responses, which encourages extinction
How did Voek try to formalize Guthrie's theory? 4 postulates, 8 definitions, 8 theorems
Principle of Association Voek law of contiguity, Guthrie's one law of learning response must occur 1/2 sec after cue
Principle of Postremity Voek principle of recency the thing an animal does last is what gets preserved
Principle of Response Probability Voek the probability of a response occurring is a function of the proportion of the stimuli present which are cues for that response when # stimuli increase, # responses increase, and response probability increases similar to prepotency of elements
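Voek's Principle of Response Probability is essentially a proportion, so it can be sketched directly. The stimulus names below are hypothetical examples, not from Voek, and this is just one plausible formalization of "probability = proportion of present stimuli that are cues for the response."

```python
def response_probability(present_stimuli, cues_for_response):
    """Voek's Principle of Response Probability (sketch): the probability
    of a response is the proportion of the currently present stimuli that
    are cues for that response."""
    present = set(present_stimuli)
    if not present:
        return 0.0  # no stimuli present, no basis for the response
    return len(present & set(cues_for_response)) / len(present)

# Hypothetical example: 3 of the 4 stimuli present are cues for the response.
p = response_probability({"light", "tone", "lever", "odor"},
                         {"light", "tone", "lever"})
print(p)  # 0.75
```

Note how this mirrors the card: adding more cue-stimuli to the situation raises the proportion, and with it the response probability, much like prepotency of elements.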
Principle of Dynamic Situations Voek stimulus-response pattern is not static envt and internal stimuli always in a state of flux
Guthrie Horton Experiment Guthrie didn't experiment much because he taught only undergrads His ONE experiment studied some 800 puzzle-box escapes by cats supposedly pulled on string to escape the same way every time cats would by-pass food reward--behavior preserved due to recency and contiguity, not reinforcer criticisms: response drift may occur; Guthrie's reply that behavior simply shifts is a weak rebuttal
Voek's 3 experiments All human subjects 1) wire maze, allowed for correction people were like "oh snap, i shouldn't have done that"---she thought this meant that what was last done was preserved, recency better explained improvement in task 2) punch board maze; had to learn pattern on punch board with no corrections again, people remembered when they slipped up, so recency explains learning with humans, recency is somewhat cognitive 3) eyelid conditioned to a cue--was it incremental or all-or-nothing results showed it was somewhere in between, but she took it as evidence for the all-or-none approach
Criticisms of Guthrie postremity isn't as strong as reinforcement postulate 4 (Dynamic Situations) makes theory hard to disprove too strong an emphasis on substitution
Guthrie on education manipulate envt so desired responses are conditioned to those stimuli practice important, motivation not learning should mimic real situations, no use for formal discipline the use of punishment must cause incompatible behavior accepted Thorndike's theory of transfer of training
How was Piaget's approach different from other theorists of his time? rejected behaviorism outright we are not blank slates, cognition most essential part that drives our species used clinical method--idiosyncratic, often criticized for it considered himself a genetic epistemologist an organicist
schemata Piaget the potential to act in a certain way elements that compose cognitive structure schemata range from overt (behavior) to covert (cognition)
content Piaget the aspects of any particular manifestation of a schema
cognitive structure Piaget number of schemata available to an organism at any given time
Piaget's 5 types of intelligence 1) adaptation: child will alter behavior when interacting with envt 2) equilibrium: child undergoes series of adaptations to reach state of homeostasis 3) intelligence as an instrument of achieving equilibrium 4) mental activity: a cognitive act, not behavioral; collection of operations that continue to develop as child matures 5) competence: optimal level of functioning when child reaches cognitive equilibrium
What are the functions of intelligence according to Piaget? What is the orthogenetic principle? 1) provides organization--ability to integrate structures at physical/cognitive level there is a hierarchy of cognitive systems---integrate info into one cognitive system, many of those systems integrated into other systems orthogenetic principle: during development, we move from a state of global diffuseness to a period of differentiation and hierarchical integration 2) adaptation--tendency for all humans to adapt to their envt occurs through assimilation and accommodation
Fundamental Assumptions that drive Piaget's theory? 1) Human beings are active organisms; we are not blank slates, but come with predispositions children come into the world PREPARED 2) Behavior is interaction of cognitive structures with the world--not S-R bonds, but cognitions! reinforcement history not important; history of cognitive development important 3) development assumption: all development is goal-directed towards a state of equilibrium equilibrium achieved when child adapts to immediate envt through assimilation and accommodation all development directed to an end-point=>formal operations
assimilation the process of responding to the environment in accordance with one’s own cognitive structure; matching between cognitive structure and the physical envt use things you already know to interpret envt, sets some limits on what organism can know allows for modest expansion of what you know without changing any major schemata
accommodation the process by which the cognitive structure is modified in response to envt demands allows for LEARNING
Piaget's 4 ways in which development occurs? 1) maturation: biological aging necessary but insufficient condition 2) experience w/ physical envt: evolves from simple interactions to learning about properties of the world and abstract concepts 3) social experience: interaction is key to development 4) equilibration: develops homeostasis drive to equilibrate is motivational state, you will interact with the world in a way that leads to equilibrium (purposive behavior)
Two things to keep in mind when thinking about Piaget's developmental stages :D 1) theory has unilinear structure; precise order of events stuff in stage 1 is foundation for other stages, not fluid 2) theory is discontinuous when developing in stages 1-4, change is not incremental discrete changes occur, one stage is different from all the others, something new emerges from every stage
Stage 1 child development Sensorimotor period (birth-2 years) most basic behavioral schemata reflexes orientation response, sucking, grasping children will begin to practice responses and understand the outside world at 9 months, object permanence develops
stage 2 preoperational period (2-6 years) emergence of symbolic functioning language, make-believe (cognitive world independent of stimuli) egocentric develop concepts of space and time categorization and formistic thinking develops
stage 3 concrete operations (6-12) independent thought, operational thinking conservation start to develop there is fluidity in thought and logic seriation: order things by magnitude cannot think in abstract terms
stage 4 formal operations (12+ years) reasoned, abstract ideas conservation fully developed proportionality scientific thinking, manipulations child has become sexually mature, hormonal changes working on the brain evolutionarily makes sense because child needs to understand how stuff works before becoming sexually mature and raising a family frontal lobes not fully developed until early 20's
preconceptual thinking part of preoperational phase (2-4 years) rudimentary concept formation, logic is transductive
period of intuitive thought part of preoperational stage (4-6 years) solves problems intuitively rather than by some logic weak conservation skills
How does Piaget explain how learning occurs? information must be presented that can be assimilated, but must be different enough so that there is slight accommodation nature and nurture are equally important--inherited stuff sets broad limits on intelligence, but envt stimulates new intelligence equilibration also inherited maturation of physical structures has psychological correlates
the learning dilemma according to Dollard and Miller, learning can only occur when there is a failure to assimilate, which precipitates accommodation all learning depends on FAILURE
Piaget's problem with behaviorism and GESTALT Behaviorists insist that envt is projected onto the organism, Piaget thought organism projects onto envt but envt plays a role in constructing cognitive structure Gestaltists thought child born with brain organized according to law of Pragnanz, Piaget thought that brain develops as cognitive structures develop
Piaget on education education must be built around each students cognitive structure materials must be known and partially unknown
Criticisms of Piaget no controlled experiments--clinical method only observed his own children, not those from other races, economic situations, etc time frame for some stages proved incorrect
purposive behavior Tolman behavior is meaningful must study molar behavior whole behavior patterns have meaning--meaning is lost when we are too elementistic behavior is goal-directed cognitive element, but we must study it objectively
sign learning/sign-significant learning Tolman basically S-S theory through exploration, organism discovers that some things lead to other things reinforcement not required learning is a process of discovery---insight
emphasizer perceptual element that acts as motivation; an organism’s drive state will determine which aspects of the envt are attended to in the perceptual field
cognitive map organism learns where things are in space and creates a picture of the envt that it uses to get around Once organism develops cognitive map, it can reach a goal from any number of directions—opposition to behaviorists creation of cognitive map akin to Guthrie's views on contiguity contiguous presentation of stimuli create cognitive map
principle of least effort organism will do whatever is shortest and requires the least amount of effort to obtain a goal
confirmation learning does not require reinforcement, but confirmation of a hypothesis about the world when hypothesis is proven correct, animal will behave in a similar fashion in the situation During development of cognitive map, animals develop expectations—hunches about what leads to what (hypotheses)
means-end readiness hypotheses that are consistently confirmed organism can develop a plan to get the goal
vicarious trial and error rats look around in a maze and consider where to go next; instead of behavioral trial and error, problem is cognitively considered until solution is reached
Blodgett experiment 3 groups of rats ran a maze; one was always reinforced with food at the end, one group allowed 1-2 days to explore maze first, one group allowed 1-6 days to explore once food was put at end of maze for last 2 groups, they ran the maze as fast as the constantly reinforced rats demonstrated latent learning
latent learning learning that is not translated into performance; possible to remain dormant for long period of time before behavior manifests learning that is stored until it is needed
latent extinction disconfirmation of hypothesis rather than lack of reinforcement; no incremental decline
spontaneous recovery according to Tolman hypothesis is not completely dropped believed it was an artifact of training
demand Tolman, intervening variable need state that develops after deprivation
appetite Tolman, intervening variable appropriateness of reward with goal
Differentiation Tolman, intervening variable psychological mechanism by which stimuli are distinguished from each other
motor skills Tolman, intervening variable behaviors required for organism to behave, more like acts not muscle twitches
biases Tolman, intervening variable animals have predilection to behave in one way or another position habits
HATE participant variables that interact with independent variable these together produce behavior Heredity Age Training Endocrine
maintenance schedule Tolman, independent variable paired with demand deprivation, what creates the demand
G Tolman IV paired with appetite appropriateness of the goal
S Tolman IV paired with Differentiation the stimuli themselves; what it is in the envt that help to distinguish between stimuli Hull's V--stimulus intensity
R Tolman IV for motor skill collection of motor responses required to execute a behavior
P Tolman IV for Biases pattern of behavior organism engages in
Σ(OBO) Tolman IV for hypotheses practice makes perfect principle the number of trials an organism has experienced basis for behavior ratio the main independent variable--directly affects the dependent variable
means-object provides cues and stimuli through contiguity stimuli out there that form the cognitive map
goal-object the goal that you use the cognitive map to get
Tolman, Ritchie, and Kalish experiment animals trained to run a simple maze with a light and food reinforcer at the end then placed in sunburst maze with middle lane blocked according to S-R theorists, rats would run the lanes closest to the middle Tolman found they ran the lane that ran towards the light---cognitive map
Tolman experiment with cable car two areas--one w/ vertical stripes, one w/ horizontal stripes rats passively carried around in little cable car over maze when it passes over vertical stripes--SHOCK when animal runs maze actively by itself, avoids vertical stripe area refutes "you learn what you do" principle of behaviorists
Krechevsky experiment In his book, looked at nature of hypotheses in rats, the learning curve, and the unsolvable problem hypotheses defined operationally--behavior systematic and organized rats in maze required to respond to dark side of maze--showed incremental learning curve however, animals also developed right position habit, animals abandoned this habit and picked up the dark habit behavior more organized than once predicted the unsolvable problem--he completely randomized stimuli in maze, behavior should extinguish according to behaviorists the rats tested many hypotheses even though the maze was unsolvable
switchboard conception mainly held by behaviorists and associationists—sensory events stimulate specific areas of the brain learning causes a change in neural circuitry so that sensory events come to stimulate areas other than where they originally stimulated
mass action the disruption of learning and retention goes up as the amount of cortical damage goes up regardless of the location of the damage
equipotentiality the ability of one portion of the cortex to take over the function of another destroyed section
What things did Hebb notice that led to his theory? 1) The brain does not act as a simple switchboard. If it did, destroying large amounts of brain tissue from the frontal lobes would have been more disruptive 2) Intelligence comes from experience and is not genetically determined 3) Childhood experiences are more important in determining intelligence than adult experiences
restricted environments Hebb restricted environments devoid of stimuli have disabling effects on early learning and development of the nervous system
Von Senden studied adults born with cataracts that were suddenly removed; couldn’t distinguish between triangle and circle—figure-ground perception may be innate but need experience with objects
Riesen raised chimps in the dark; when exposed to light, they eventually LEARNED to see
enriched environments have wide variety of motor and sensory stimuli—probably enhances early development Hebb experiment: two groups of rats, 1 raised in lab, other raised at his home Performance of pet rats superior on maze and detour problems early impoverished envt does not have permanent effect Greater sensory diversity allows for development of more complex and numerous neural circuits and networks Infants born with random neural collections; sensory experience causes these to be more organized and complex---an EMPIRICIST
cell assemblies each envt object we experience stimulates a complex pattern of neurons called a cell assembly Dynamic, not static Can be fired by external stimulation, internal stimulation, or a combo When it fires, we experience the thought of the event the assembly represents-----COOL memories occur in cell assemblies--circuitry of neurons assembly=collection of neurons, memory=reverberation of neurons
Phase sequences integrated cell assemblies; amounts to one current in the stream of thought Fired due to external, internal stimulation or both
Hebb's two kinds of learning 1) building up of cell assemblies and phase sequences--S-R learning 2) adult learning--insight, rearranging phase sequences
Arousal theory Hebb cognitive functioning happens best at a certain level of stimulation for a cue to function as a stimulus, organism must have certain level of arousal in RAS every stimulus has both a cue function and an arousal function
reinforcement in Hebb's theory when arousal is too high, organism will operate on envt to try to reduce it, this decrease is REINFORCING when arousal is too low, increase is REINFORCING reinforcement is either increase or decrease in a drive--Hullian?
Sensory Deprivation Mental functioning and personality deteriorated; can only tolerate sensory deprivation for short periods of time Sensory experience important for maintaining proper neurological function Organisms NEED stimulation experiment with college students paid to do nothing....i wish i had that job.
Hebb's take on FEARRR Hebb observed that chimps show no signs of fear until about 4 months old—no fear of completely familiar or unfamiliar objects—when familiar objects shown in unfamiliar way=FEAR Ruled out conditioned response theory of fear Fear occurs when objects in envt trigger existing cell assemblies or phase sequences but are not subsequently followed by the stimulus events that normally accompany the object
Hebb's two types of memory 1) permanent memory attributed to cell structures already built up 2) transient memory attributed to ongoing cell assembly activity
reverberating neural activity sensory experience sets up neural activity that outlasts the stimulation that causes it—basis for short term memory, but can also be converted into long term memory
consolidation theory Hebb a proponent of this short term memories are eventually consolidated into long term memory
Peterson and Peterson experiment people read a nonsense syllable and then asked to count backwards by threes---retention of syllable decays as a function of time
Duncan experiment rats shocked unless they ran to “safe side” of chamber within 10 secs; shocked either at 20 sec, 40 sec, 60 sec, 15 mins, 1 hour…..14 hours; the more closely the shock follows a learning trial, the more it disrupts memory of the learning experience, after an hour—no effect on memory
Retrograde amnesia loss of memory for events just prior to a traumatic experience—same effect as Duncan’s experiment
Declarative memory higher-order memory, including the fact that you have learned something new
procedural memory memory for complex motor tasks; patients can acquire such skills without declarative awareness of having learned the task role of basal ganglia this and declarative memory are both forms of long term memory
Olds and Milner when electrically stimulated in reinforcement centers (hypothalamus, hippocampus, thalamus, amygdala, septum, lower cortex) animal tends to repeat behavior that preceded the stimulation Also credited with discovering pleasure center of the brain
how does reinforcement by direct stimulation differ from primary reinforcement? No deprivation needed before training Satiation does not occur Takes priority over other drives There is rapid extinction Most schedules of reinforcement do not work; only frequent reinforcement schedules can be used with direct brain stimulation
Nucleus accumbens and role of dopamine if a stimulating electrode causes the nucleus accumbens to release dopamine, brain stimulation will be reinforcing, the opposite is also true Release of dopamine cannot be equated with the effects of eating, drinking, or drug use Dopamine activity mediates anticipatory or motivational aspects of reinforcers rather than the pleasure associated with them
Salamone and Correa Release of dopamine cannot be equated with the effects of eating, drinking, or drug use What all these do have in common is that they stimulate the nucleus accumbens to secrete dopamine
Robinson, Sandstrom, Denenberg, Palmiter experiment used strain of mice deficient in dopamine, but could inject with dopamine precursor three groups trained to turn left or right in T-maze--1 no dopamine precursor, 1 with caffeine, 1 with dopamine precursor no dopamine group needed assistance, caffeine group active but no improvement in performance, dopamine precursor group improved with each day of training in follow-up phase of experiment, all mice treated with dopamine precursor performance of original no-dopa group increased--performed as well as dopa group the caffeine group's performance increased most dramatically shows that learning obviously did occur without dopamine dopamine not necessary for liking rewards or for learning to make associations between rewards and cues dopamine is necessary for reward-related cues to attain motivational significance
how neurons fire Neuron composed of cell body, axon, dendrites Rest in bath of K+, Na+, Ca2+, and Cl- ions Resting potential: keep Na+ out and K+ in, inside of neuron is negatively charged If this polarization is disrupted, the threshold level may be breached—cell can no longer maintain ion segregation, Na+ ions flood in and the inside of the cell becomes positive Cell expends energy to reestablish resting potential—this whole process is known as an action potential, which travels down the axon End terminals respond to action potential by releasing neurotransmitters into synapse Dendrites of neighboring cells respond to neurotransmitters by moving towards or away from their threshold
learning between neurons Learning occurs when action potential of one cell is predictably generated by the action potential of a neighboring cell
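The card above describes the Hebbian synapse: "cells that fire together wire together." A minimal sketch of that rule, with a hypothetical learning rate and activity values chosen only for illustration (not parameters from Hebb), looks like this:

```python
ETA = 0.1  # learning rate (hypothetical illustrative value)

def hebbian_update(w, pre, post):
    """One step of the Hebbian rule: delta_w = eta * pre * post.
    The weight grows only when pre- and postsynaptic activity coincide."""
    return w + ETA * pre * post

w = 0.0
# Repeated co-activation strengthens the connection...
for _ in range(5):
    w = hebbian_update(w, pre=1.0, post=1.0)
# ...while one-sided activity leaves the weight unchanged.
w_unchanged = hebbian_update(w, pre=1.0, post=0.0)

print(w, w_unchanged)  # both ~0.5: only coincident firing grew the weight
```

This is the same idea the later LTP cards describe at the physiological level: a weak input active at the same time as a strong one gets potentiated.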
Kandel studied reflexes in Aplysia mollusk Reflexes habituate with weak and repeated stimulation Kandel’s research demonstrated that critical mediating event in habituation=decrease in neurotransmitters from sensory neuron which serve as signals to motor neurons that trigger reflexes Sensitization involves interneurons that stimulate sensory neurons causing the release of neurotransmitters onto the motor neuron
long term potentiation effects of neurons wired together after initial event Stronger, high-frequency stimulation said to potentiate the effect of the initial weak stimulation, effect lasts for months
LTP for associative learning In associative learning, sending cell with weak influence on receiving cell is active at same time a stronger sending cell is active on same receiving cell—Hebbian synapse LTP thought to be lab-phenomenon only, but rats exploring new envt have activity that starts near perforant path and affects cells near the dentate gyrus—pulses called theta pulses NMDA and non-NMDA receptors; activated non-NMDA receptors trigger NMDA receptors somehow
Long term depression possible mechanism by which neurons that are part of an assembly are eliminated When two sending cells stimulate a single receiving cell, the receiving cell becomes unresponsive to the activity of the sending cells Occurs in cerebellum, hippocampus and parts of cortex
Neuroplasticity the brain’s capacity to reorganize or modify its connections as the result of experience
Experience and dendrite development enriched envt associated with increased brain weight, increased levels of neurotransmitters, etc. Also increased dendrite length and densities Dendrite organization more complex for cognitive functions than motor functions Increased dendrite development not always a good thing—causes addicts to crave more Drugs can block dendritic differentiation associated with envt enrichment
Neurogenesis the birth of new neurons Classical conditioning dependent on neurogenesis In learning process, some old cells must die
relearning after brain injury neurogenesis recruitment of neurons not normally specified for function lost during injury new development of cell assemblies plasticity may be affected by neurotrophins stress reduces plasticity, novel experiences increase
Silent synapse a synaptic connection that might be otherwise nonfunctional but which becomes functional and active during learning
mirror neurons observational learning Mirror neurons reveal one way in which brain encodes a behavior made by another animal, facilitates execution of the same behavior—important implications for imitation and observational learning Mirror system more or less automatic—action mapped from sensory cortex to motor cortex Mirror system also allows for storage of behavior for execution at a later time—these behaviors can be expressed or suppressed Behavior mapped by mirror neurons requires no reinforcement
Chameleon effect we adopt mannerisms, behaviors, vocal intonations from those we interact with without planning it
Sperry experiment two routes for transfer---corpus callosum and optic chiasm (point where info from optic nerve is projected onto opposite side of occipital lobe) In normal cats, transfer complete; in cats trained before and after chiasm cut, transfer still fine; next cut corpus callosum after training, still fine Lastly, cut both and found that info could not transfer—two brains
Neglect syndrome common in right-brain damaged patients but not left-brain, characterized by failure to see or attend to the left visual field or left side of the body
Doreen Kimura Dichotic listening • Send competing sensory signals to right and left side of brain • Showed that stimuli presented to right ears (sent to left hemisphere) processed better
what was wrong with Bogen's view? dichotomous ways in which the world or thought processes are often described reflect the two kinds of hemispheric intelligence Dichotomania is not necessarily correct; it is a matter of degree
Hebb on education emphasis on two types of learning optimal level of arousal expose kid to both left-brain, right-brain tasks
criticisms of Hebb a lot of Pavlovian brain ideas he was sometimes unwilling to change his theory in spite of new evidence—most neural function inhibitory, while Hebb stuck to the idea that it was excitatory