Choice and Matching

  • Concurrent schedules.
  • Consists of the simultaneous presentation of two or more independent schedules, each leading to a reinforcer.
  • The organism is allowed a choice between responding on one schedule versus the other.
  • What happens if, for example, the animal is presented with two VI schedules so reinforcers become available at unpredictable points in time?
  • Matching law

  • The Matching law.
  • The matching law holds that the proportion of responses emitted on a particular schedule matches the proportion of reinforcers obtained on that schedule (e.g., a pigeon on a concurrent VI 30-sec VI 60-sec schedule will emit approximately twice as many responses on the VI 30 as on the VI 60; a formula version follows below).
  • The matching law predicts a consistent relationship between the proportion of reinforcers obtained on a certain alternative and the proportion of responses emitted on that alternative.
  • Example in human behavior: in a group situation (e.g., a classroom), we must choose between directing our conversation to one person or another, each of whom provides a different rate of reinforcement (in the form of comments or acknowledgments). The relative amount of time we spend looking at each person matches the relative frequency of verbal approval delivered by that person.
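  • Stated as a formula (a standard way of writing the matching law, with B1 and B2 the responses emitted on the two alternatives and R1 and R2 the reinforcers obtained from them):

        B1 / (B1 + B2) = R1 / (R1 + R2)

  • E.g., on a concurrent VI 30-sec VI 60-sec schedule, the VI 30 alternative yields about two-thirds of the obtained reinforcers, so the pigeon is predicted to direct about two-thirds of its responses (roughly twice as many) to that alternative.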

  • Deviations from matching.
  • Undermatching: the difference between the proportion of responses on the richer versus the poorer schedule is smaller than the matching law predicts.
  • Undermatching can occur when there is little cost for switching from one schedule to the other (i.e., little or no changeover delay). E.g., if two food patches are extremely close together, undermatching is likely: the animal moves back and forth between them even if one side is generally a much richer area in which to hunt.
  • Overmatching: the difference between the proportion of responses on the richer versus the poorer schedule is larger than the matching law predicts.
  • Overmatching can occur when the cost of moving from one alternative to another is very high; the animal then switches less often and spends more time on the richer alternative than the matching law would predict.
  • Bias from matching: one response alternative attracts a higher proportion of responses than the matching law predicts, regardless of whether that alternative contains the richer or the poorer schedule of reinforcement (e.g., a pigeon emits 10% more responses on a red key than expected simply because it finds red more attractive). A formal version of the matching law that captures these deviations is sketched below.
  • Operant behavior should often be viewed in context.
  • The amount of behavior directed toward an alternative is a function of the amount of reinforcement available on that alternative as well as the amount of reinforcement available on other alternatives.
  • The fact that a child spends little time reading does not necessarily mean that reading is not a reinforcing activity for that child; reading may simply be less reinforcing than the other alternatives currently available.
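  • These deviations are commonly summarized with the generalized matching law (a standard extension of the matching equation above), which adds a sensitivity exponent s and a bias parameter b to the ratio form:

        B1 / B2 = b * (R1 / R2)^s

  • With s = 1 and b = 1 this reduces to strict matching; s < 1 corresponds to undermatching, s > 1 to overmatching, and b ≠ 1 to a bias toward one alternative (e.g., the more attractive red key) that holds regardless of the reinforcement rates.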
  • Melioration.
  • The distribution of behavior in a choice situation shifts toward those alternatives that have higher value regardless of the long-term effect on the overall amount of reinforcement (Herrnstein, 1990).
  • This tendency to move toward the higher valued alternative can sometimes result in a substantial  reduction in the total amount of reinforcement obtained.
  • Three types of problems with melioration:
  • We may distribute more responding toward an alternative than is actually required to obtain all of the reinforcers it has available, so some of that responding is wasted.
  • E.g., a salesperson spends too much time courting clients who are relatively easy sells and too little time courting clients who are difficult sells. Because the rich schedule of reinforcement provided by the easy clients is very attractive, he continues to spend too much time with them and too little time with the difficult clients.
  • E.g., the distribution of study time across courses: students often spend the most time studying for their most enjoyable courses rather than the courses where the extra study is most needed.
  • Overindulgence in a highly reinforcing alternative can often result in long-term habituation to that alternative, thus reducing its value as a reinforcer.
  • Many people fondly remember those times in their lives when they had limited resources and highly valued items could be experienced in only small quantities and truly enjoyed…
  • Melioration is often the result of behavior being too strongly governed by immediate consequences as opposed to delayed consequences.

Self-Control

  • Introduction: Skinner (types of controlling responses).
  • Skinner viewed self-control as an issue involving conflicting outcomes. Managing this conflict involves two types of responses:
  • A controlling response that serves to alter the frequency of a controlled response (e.g., you leave your phone at home so that you will be less distracted at work; by emitting one response, you affect the frequency of another).
  • Several types of controlling responses:
  • Physical restraint: you physically manipulate the environment to prevent the occurrence of some problem behavior (related to stimulus control, e.g., creating an environment arranged specifically for studying).
  • Depriving and satiating: using the motivating operations of deprivation and satiation to alter the extent to which a certain event can act as a reinforcer.
  • Doing something else: engaging in an alternative behavior in order to prevent yourself from engaging in the problem behavior.
  • Self-reinforcement and self-punishment: rewarding or penalizing yourself for meeting or failing to meet a goal; these may be more effective when other people are aware of the arrangement.

  • Self-control as a temporal issue.
  • We have to choose between alternatives that differ in the extent to which the consequences are immediate versus delayed.
  • The lack of self-control arises from the fact that our behavior is more heavily influenced by immediate consequences than by delayed consequences.
  • It is not only about reinforcement. Self-control can also involve choosing between a smaller sooner punisher and a larger later punisher (i.e. visiting the dentist).
  • It is also important that the later consequences are usually less certain than sooner consequences.
  • The value of the delayed consequences is weakened both because they are delayed and because they are less certain.
  • Self-control issues in the real world often involve a rather complex set of contingencies; it is more complicated than just a simple choice between two rewards or punishers.
  • Paradigm used in research: delay of gratification task.
  • The opposite of self-control is impulsiveness.

  • Models:
  • Mischel’s Delay of Gratification Paradigm.
  • Research with marshmallows in children.
  • The extent to which children were able to resist temptation was influenced by:
  • Attention to the reward (e.g., some children covered their eyes with their hands to avoid looking at it).
  • Doing an alternative task, such as singing, while waiting.
  • The manner in which the children thought about the rewards (e.g., children who focused on the rewards themselves and dwelled on how desirable they were generally became impulsive).
  • Follow-up evaluations: children who had devised tactics that enabled them to wait were, at 17 years of age, more cognitively and socially competent (e.g., better able to cope with frustrations…).
  • The Ainslie-Rachlin Model.
  • Preference between a smaller sooner reward (SSR) and a larger later reward (LLR) can shift over time.
  • The value of a reward is a "hyperbolic" function of its delay (a worked numerical sketch follows this list).
  • At an early point in time, when both rewards are still distant, the larger later reward (LLR) is clearly preferred. As time passes and the smaller sooner reward (SSR) becomes imminent, its value increases sharply and comes to outweigh the value of the LLR.
  • Two ways to strengthen preference for the LLR:
  • Changing the shape of the delay function for the larger later reward. Several variables that can affect the shape of a delay function:
  • Innate differences in impulsivity between species.
  • Innate individual differences, with some individuals being more impulsive than others.
  • People become less impulsive as they grow older.
  • People become less impulsive after repeated experience with responding for delayed rewards.
  • Availability of other sources of reinforcement may serve to reduce impulsiveness (self-reinforcement procedures).
  • Setting up an explicit series of precise subgoals: The successful completion of each subgoal provides a salient form of secondary reinforcement that helps maintain progress toward the LLR.
  • Making a commitment response: an action carried out at an early point in time that serves to either eliminate or greatly reduce the value of an upcoming temptation.
  • Behavioral contracting.
  • E.g., give your brother $20 and tell him to keep it if you watch TV in the afternoon instead of studying.
  • Commitment responses can be arranged around either punishers or reinforcers.
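  • A minimal numerical sketch of the preference reversal described above, assuming the commonly used hyperbolic form V = A / (1 + kD), where A is the reward amount and D its delay; the amounts, days, and k value below are hypothetical illustration numbers, not data from the model:

```python
# Hypothetical illustration of the Ainslie-Rachlin preference reversal,
# assuming hyperbolic discounting V = A / (1 + k*D).

def value(amount, delay_days, k=0.5):
    """Present (discounted) value of a reward available after delay_days."""
    return amount / (1 + k * delay_days)

SSR_AMOUNT, SSR_DAY = 4, 10    # smaller sooner reward, available on day 10
LLR_AMOUNT, LLR_DAY = 10, 14   # larger later reward, available on day 14

for day in range(SSR_DAY + 1):  # evaluate preference on days 0 through 10
    v_ssr = value(SSR_AMOUNT, SSR_DAY - day)
    v_llr = value(LLR_AMOUNT, LLR_DAY - day)
    choice = "LLR" if v_llr > v_ssr else "SSR"
    print(f"day {day:2d}: V(SSR)={v_ssr:5.2f}  V(LLR)={v_llr:5.2f}  -> prefer {choice}")
```

  • With these made-up numbers, the larger later reward is preferred from day 0 through day 9, but on day 10, when the SSR is imminent, its value spikes and preference reverses to the SSR, which is exactly the shift in preference the model describes.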
  • The Small-But-Cumulative Effects Model.
  • The most difficult self-control issues require an ongoing series of choices, with any single choice having relatively little effect.
  • Each individual choice on a self-control task has only a small but cumulative effect on our likelihood of obtaining the desired long-term outcome.
  • E.g., one burger by itself is unlikely to make any difference in your quest to become healthy; it is only by repeatedly selecting tofu salads over burgers that you will realize any significant effect on your health.
  • Would you really be tempted to eat a burger if you knew that that particular burger would give you a heart attack in 20 years?
  • Each choice of an SSR versus an LLR has only a small but cumulative effect on the final outcome (a back-of-the-envelope sketch follows this list).
  • Suggestions by the Small-But-Cumulative Effects Model:
  • Self-monitoring: tracking the behavior on an ongoing basis forces us to view it from a more holistic perspective.
  • Having a plan in place to handle occasional lapses.
  • Establishing rules that clearly distinguish between acceptable and unacceptable behaviors.
  • Sometimes total abstinence is what works.
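  • A back-of-the-envelope sketch of the small-but-cumulative idea; the per-choice cost and choice frequency are made-up illustration numbers, not data:

```python
# Hypothetical illustration: a single indulgent choice barely matters,
# but the same small effect summed over many repeated choices is large.

PER_CHOICE_COST = 0.05    # assumed long-term cost of one SSR choice (arbitrary units)
CHOICES_PER_WEEK = 5      # assumed number of such choices faced each week

print("one choice:", PER_CHOICE_COST)
for weeks in (1, 52, 260):  # one week, one year, five years
    total = PER_CHOICE_COST * CHOICES_PER_WEEK * weeks
    print(f"{weeks:3d} weeks of SSR choices: cumulative cost = {total:.2f}")
```

  • The point of the arithmetic is simply that no individual choice ever looks decisive, which is why self-monitoring and clear rules (above) are needed to keep the cumulative total in view.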
  • The Strength Model.
  • Baumeister et al., 1998.
  • Self-regulation is like a muscle that can become temporarily fatigued through intense use.
  • It can also be strengthened through repeated efforts.
  • Ego-depleting tasks significantly reduced blood glucose levels; giving participants a sugary solution to drink counteracted the depletion effect.

Social (observational) learning

  • Introduction to observational learning (1st slide).
  • The behavior of a model is witnessed by an observer and the observer’s behavior is subsequently changed.
  • This type of learning can occur without our even being aware of it.
  • It’s a social process. Being in a social situation can change behavior, even if no one in the group realizes it.
  • People can improve their performance (i.e. sports) simply by watching others perform.
  • Observational learning can be involved in both classical and operant conditioning.

  • Contagious behavior.
  • More-or-less instinctive or reflexive behavior triggered by the occurrence of the same behavior in another individual.
  • E.g., yawning, laughing, fear responses, orienting responses.
  • Stimulus enhancement:
  • The probability of a behavior is changed because an individual’s attention is drawn to a particular item or location by the behavior of another individual. It simply allows the behavior’s triggers to be noticed.
  • Contagious behavior and stimulus enhancement produce only a momentary change in behavior.

  • Observational learning in Classical Conditioning.
  • The stimuli involved are usually emotional in nature.
  • Emotions in other people can act as stimuli that elicit similar emotional responses in the observer. Those emotional responses, once acquired, can also motivate other types of new behavior patterns.
  • Such emotions are called vicarious emotional responses: classically conditioned emotional responses that result from seeing those emotional responses exhibited by others.
  • This is a type of higher-order conditioning (e.g., because fearful looks in others are often associated with frightening events, those looks come to serve as CSs that elicit fear in ourselves).
  • Observational learning in Operant Conditioning.
  • Distinction between acquisition and performance of a behavior:
  • Acquisition: observational learning first requires that the observer pay attention to the behavior of the model. What makes us attend to a model?
  • Consequences of the model’s behavior.
  • Whether the observer receives reinforcement for the behavior of attending to a model.
  • Whether the observer has sufficient skills to benefit from the modeling. The task has to be easy enough for the observer. Modeling works only when observers have the skills necessary to learn the behavior.
  • The personal characteristics of a model can strongly influence the extent to which we attend to that model's behavior (e.g., we attend more closely to models who resemble us).
  • Performance:
  • We are more likely to perform a modeled behavior when we have observed that behavior being reinforced (vicarious reinforcement).
  • We are less likely to perform a modeled behavior when we have observed that behavior being punished (vicarious punishment).
  • We are more (or less) likely to perform a modeled behavior when we ourselves will experience reinforcement (or punishment) for performing that behavior.
  • Imitation.
  • True imitation is a form of observational learning that involves the close duplication of a novel behavior.
  • Children have a strong tendency to imitate the behaviors of those around them.
  • Generalized imitation: the tendency to imitate a new modeled behavior with no specific reinforcement for doing so (e.g., by deliberately reinforcing the imitation of some behaviors, therapists can produce in children a generalized tendency to imitate).
  • This is used by therapists working with children who are developmentally delayed or have autism.
  • Aggression.
  • “Bobo doll studies” (e.g., Bandura, 1965): social learning of aggression.
  • Children who observed a model behaving aggressively toward the Bobo doll and other targets tended to replicate the same behaviors.
  • The children were also influenced by the consequences that the model experienced while behaving aggressively (e.g., the children's aggression was stronger if they had observed the adult's aggression being reinforced).
  • In some cases, children who had observed the aggressive behavior produced far fewer aggressive behaviors when a disapproving adult was present; in the absence of the disapproving adult, their aggression increased.
  • Violent media viewed in childhood is significantly correlated with aggressive and antisocial behavior 10 years later (Eron et al., 1972).
  • Females who watch violent media are more likely to become victims of violence (Ehrensaft et al., 2003).

Rule-governed behavior

  • Rule-Governed behavior.
  • We can indirectly learn a behavior through the use of language.
  • It enhances our ability to interact with one another and to adapt to the world around us.
  • Language allows us to influence each other, and ourselves, through the presentation of rules.

  • Definitions: rule-governed behavior, rules, and instructions.
  • A rule is a verbal description of a contingency:
  • “In a certain setting, if we perform a certain behavior, then a certain consequence will follow.”
  • Whether we follow a rule depends largely on the consequences we have received in the past for following such rules or instructions.
  • Rule-governed behavior: behavior that has been generated through exposure to rules.
  • In its purest form, a rule does not say anything about how we should respond with respect to that contingency.
  • Instruction: a rule that says something about how we should respond.
  • We will use the terms rule and instruction interchangeably.

  • Advantages and disadvantages.
  • Advantages
  • Rules are extremely useful for rapidly establishing appropriate patterns of behavior.
  • E.g., what may require several hours of training with a rat requires only a few seconds of instruction with a verbally proficient human.
  • We can learn a behavior even before we have any direct experience.
  • Disadvantages
  • Rule-governed behavior tends to be less efficient for implicit types of learning, especially procedural (motor) learning.
  • Rule-governed behavior is sometimes insensitive to the actual contingencies of reinforcement operating in a particular setting (e.g., told to "push the button to earn money," people may never slow down enough to realize that such a high rate of responding is unnecessary; Bentall et al., 1985).
  • People who follow rules are sometimes too inflexible (they do not think outside the box).
  • Although rules are often extremely beneficial, we do well to recognize that they have their limitations and often require modification according to the particular circumstances in which we find ourselves.
  • Personal rules (self-regulation).
  • Personal rules (self-instructions): verbal descriptions of contingencies that we present to ourselves to influence our behavior (Ainslie, 1992).
  • Say-do correspondence occurs when there is a close match between what we say we are going to do and what we actually do.
  • Parents play a critical role in the development of this correspondence.
  • Say-do correspondence is a major predictor of procrastination in students (Howell et al., 2006).
  • Not all personal rules are equally effective; they are more effective if they establish a "bright boundary" between acceptable and unacceptable patterns of behavior.
  • I.e. we are more likely to succeed when the rule specifically sets out the conditions under which it has been obeyed or violated.
  • Implementation intentions or personal process rules (Ainslie, 1992): personal rules that indicate the specific process by which a task is to be accomplished, i.e., clear, specific rules about when, where, and how to act.

Preparedness

  • Preparedness in Classical Conditioning.
  • Inherited preparedness to respond to certain types of events:
  • When the NS is a "biologically relevant" stimulus for fear, conditioning can occur even with a backward pairing in which the US is presented first (e.g., feeling a bite and then seeing a spider).
  • Taste aversion conditioning: a food item that has been paired with gastrointestinal illness becomes a conditioned aversive stimulus. Taste aversion involves many of the same processes found in other forms of classical conditioning:
  • Stimulus generalization: food items that taste similar to the aversive item.
  • A conditioned taste aversion can also be extinguished if the aversive food item is repeatedly ingested without further illness.
  • Overshadowing can occur in that we are more likely to develop an aversion to a stronger-tasting food item.
  • Taste aversion conditioning. There are major differences with other types of conditioning:
  • The formation of associations over long delays: taste aversions can develop when food items are consumed several hours before the sickness develops.
  • One-trial conditioning: Strong conditioned taste aversions can usually be achieved with only a single pairing of food with illness, particularly when the food item is novel.
  • Specificity of associations: you are more likely to associate the nausea with a meal than with other events that happened at the same time. This reflects CS-US relevance: an innate tendency to more readily associate certain types of stimuli with each other.
  • Practical applications:
  • Serve highly familiar foods to cancer patients undergoing chemotherapy (familiar foods are less likely to acquire an aversion). The patient can also be served a highly novel, yet trivial, food item just before a chemotherapy session, so that any aversion that forms attaches to that item instead.

  • Preparedness in Operant Conditioning.
  • There is a biological disposition for certain types of operant responses to obtain certain consequences:
  • E.g., rats will more readily learn to press a lever to obtain food pellets than to avoid shock.

  • Operant-Respondent Interactions:
  • Instinctive drift: an instance of classical conditioning in which a genetically based, fixed action pattern gradually emerges and displaces the behavior that is being operantly conditioned.
  • E.g., it was once assumed that an animal could be trained to perform just about any behavior it was physically capable of performing; in fact, a classically conditioned fixed action pattern sometimes interferes with the operant behavior that is being shaped.
  • E.g., we try to teach a pig to deposit a coin in a bank (an operant conditioning procedure). Eventually the pig comes to associate the coin with food, and the coin begins to elicit species-specific feeding behavior (classical conditioning): the pig no longer deposits the coin but instead tosses it in the air and roots at it on the ground. The coin has become a conditioned stimulus (CS) that elicits a conditioned response (CR) in the form of a food-related fixed action pattern, and this classically conditioned response overrides the operantly conditioned one.
  • Sign tracking: an organism approaches a stimulus that signals the presentation of an appetitive event.
  • E.g., a dog approaches a light (the CS) and licks it (a food-related behavior).
  • In this example, the light has become so strongly associated with food that it now acts as a CS that elicits innate food-related behavior patterns.
  • A behavior that starts off as an elicited behavior, controlled by the stimulus that precedes it (classical conditioning) becomes transformed into an operant behavior, controlled mostly by its consequences (operant conditioning).
  • Sign tracking: your girlfriend was a powerful appetitive stimulus, with the result that you now approach stimuli that have been strongly associated with her. This tendency to approach those settings should eventually extinguish.
  • Burns and Domjan (1996)

Adjunctive behavior

  • Adjunctive behavior:
  • Adjunctive behavior is an excessive pattern of behavior that emerges as a by-product (side effect) of an intermittent schedule of reinforcement for some other behavior.
  • I.e. when rats are trained to press a lever for food on an intermittent schedule of reinforcement, they also begin drinking excessive amounts of water (Falk, 1961).
  • Other types of adjunctive behaviors are possible: i.e. aggression.
  • Distinguishing features:
  • Adjunctive behavior usually occurs in the period immediately following consumption of an intermittent reinforcer.
  • Adjunctive behavior is affected by the level of deprivation for the scheduled reinforcer.
  • There is an optimal interval between reinforcers for the development of adjunctive behavior.
  • Adjunctive behavior in humans.
  • Examples of behaviors associated with periods of enforced waiting:
  • Nail biting, talkativeness, snacking, coffee drinking.
  • Drug and alcohol abuse.
  • These behaviors represent a type of displacement activity: an apparently irrelevant activity sometimes displayed by animals when they are confronted by conflict or prevented from attaining a goal.
  • Such activities may help keep you alert and give you "something to do" while waiting.
  • Activity anorexia.
  • When exposed to an intermittent schedule of food reinforcement for lever pressing, rats will run in a wheel for several seconds during the interval between reinforcers (Levitsky & Collier, 1968).
  • When the interval between reinforcers is long (e.g., 23 hours), rats can begin to spend increasing amounts of time running during the intervals; the more they run, the less they eat.
  • Within a week, the rats are running enormous distances (~12 miles) while eating virtually nothing; if the process is allowed to continue, the rats will die.
  • Anorexia in humans is also often accompanied by very high levels of activity.
  • Activity anorexia is more easily induced in adolescent rats than in older rats (and anorexia in humans likewise most often emerges in adolescence).
  • Endorphins may be involved; the condition may in part reflect an addiction to endorphins:
  • Endorphins are released during prolonged fasting (Kaye, 2008).
  • Endorphins from exercise: “runner’s high”.
  • Patient with AN: “I enjoy having this disease and I want it”.
  • Behavioral treatments for anorexia nervosa should also focus on establishing normal patterns of activity.

Material from previous exams

  • Functional relationship and correlation.
  • Changes in the independent variable (environmental events) can lead to changes in the dependent variable (behavior): a cause-and-effect relationship.
  • Appetitive and aversive stimuli.
  • Appetitive stimulus is an event that an organism will seek out.
  • Aversive stimulus is an event that an organism will avoid.
  • We need to be careful because some things can be aversive and appetitive at the same time (the organism still seeks them out). Some people describe their smoking habit as disgusting, but cigarettes remain an appetitive stimulus for them.
  • Bandura’s determinism (example of cognitive behavioral therapy).
  • Bandura's reciprocal determinism (a form of determinism, the assumption that each behavior has a cause): environmental events, observable behavior, and internal events reciprocally influence one another.
  • Cognitive-behavior therapy: psychological disorders are treated by altering both environmental variables and cognitive processes.
  • Generalization.
  • Generalization: tendency for a CR to occur in the presence of a stimulus that is similar to the CS.
  • The more similar the stimulus is to the original CS, the stronger the response.
  • Semantic generalization: Generalization doesn’t occur only when the stimuli are physically similar, it also occurs when the stimuli are similar in meaning (car, truck, automobile …).
  • Class activity: stereotypes.
  • Discrimination.
  • Discrimination: the tendency for a response to be elicited more by one stimulus than another.
  • Such discriminations can be deliberately trained through a procedure known as discrimination training.
  • Brain: Neural discrimination.
  • Class activity: training to discriminate (exposure, experience).
  • Why can discrimination be difficult in social interactions?
  • E.g., a woman who has had a bad experience with an abusive man may come to categorize all men as abusive (overgeneralization).
  • It can be hard to help her discriminate between types of men because of avoidance: she avoids men, so experiences that could disconfirm the overgeneralization never occur.
  • Aversion therapy (phobias).
  • Aversion therapy reduces the attractiveness of a desired event by associating it with an aversive stimulus.
  • The taste of alcohol is paired with painful electric shocks or with nausea.
  • Nausea-based treatments are more effective than shock-based treatments, because there is a biological tendency to associate nausea with substances that we ingest (it’s a biologically relevant response to drinking).
  • Aversion therapy is sometimes carried out with imaginal stimuli rather than real stimuli, a procedure known as covert sensitization.
  • This depends on the client’s ability to visualize images.
  • Introduction to the study of operant conditioning: three components
  • Response
  • Operant behavior is a class of emitted responses that produce certain consequences; a response of this class is called an operant.
  • In contrast to classically conditioned behaviors, which are said to be elicited by stimuli, operant behaviors are said to be emitted by the organism (they are more voluntary).
  • Consequences
  • Consequences can either increase (strengthen) or decrease (weaken) the frequency of the behavior in the future.
  • Reinforcers (SR): strengthen a behavior (the future probability of the behavior increases).
  • Punishers (SP): weaken a behavior (the future probability of the behavior decreases).
  • The behavior (not the person) is reinforced or punished.
  • Reinforcers and punishers are formally defined entirely by their effect on behavior (i.e. some consequences can be reinforcers for a person and punishers for a different person).
  • Extinction: weakening of a behavior through nonreinforcement (the withdrawal of reinforcement). This process is gentler but slower than punishment.
  • Discriminative Stimuli
  • Discriminative stimulus (SD): a signal that indicates that a response will be followed by a reinforcer.
  • Discriminative stimulus for punishment (SDP): a signal that a response will be punished (e.g., a police car is an SDP for speeding, since speeding in its presence is likely to be followed by a fine).
  • Discriminative stimulus for extinction (SΔ): a signal that reinforcement is not available.
  • These stimuli set the occasion for the behavior. They indicate that the consequences are available.
  • These stimuli do not automatically elicit the behavior (as a CS does in classical conditioning); the occurrence of the behavior still depends on its consequences.
  • Classical conditioning: the stimulus elicits the behavior.
  • Operant conditioning: the organism emits the behavior.
  • Three-term contingency: Notice something, do something, get something.
  • Operant vs. classical conditioning.
  • Processes of operant and classical conditioning overlap such that a particular stimulus can simultaneously act as both a discriminative stimulus and a conditioned stimulus (i.e. a tone can serve as an SD for the operant behavior of lever pressing for food and it can also be a CS and elicit salivation).
  • Classical conditioning: Behavior is more involuntary and is a function of what comes before it (stimulus).
  • Operant conditioning: Behavior is more voluntary and is a function of what comes after it (consequences).
  • Avoidance conditioning and obsessive-compulsive disorder.
  • OCD can be analyzed in terms of avoidance conditioning (i.e. Mowrer’s two-process theory).
  • Obsessions are associated with an increase in anxiety, whereas compulsions are associated with a decrease in anxiety (i.e. the feeling of anxiety is a classically conditioned response elicited by contact with the garbage, while showering is an operant response that is negatively reinforced by a reduction in anxiety).
  • Active avoidance: a person with OCD will generally do something in order to reduce anxiety.
  • First effective treatment: Exposure and response prevention (ERP); preventing the avoidance response from occurring.
  • Prolonged exposure to the anxiety-arousing event while not engaging in the compulsive behavior pattern that reduces the anxiety.
  • ERP combines the graduated exposure of systematic desensitization with the prolonged (90 minutes or more) exposure of flooding therapy.
  • Class activity: the four major types of learning we have studied; which ones we share with animals; which are uniquely human and why; and what experiences influence each type of learning (e.g., our own emotional experiences).
