The experimental analysis of behavior is a science that studies the behavior of individuals across a variety of species. A key early scientist was B. F. Skinner, who discovered operant behavior, reinforcers, secondary reinforcers, contingencies of reinforcement, stimulus control, shaping, intermittent schedules, discrimination, and generalization. A central method was the examination of functional relations between environment and behavior,[1] as opposed to the hypothetico-deductive learning theory[2] that had grown up in the comparative psychology of the 1920–1950 period. Skinner's approach was characterized by observation of measurable behavior that could be predicted and controlled. It owed its early success to the effectiveness of Skinner's procedures of operant conditioning, both in the laboratory and in behavior therapy.
Basic learning processes in behavior analysis
Classical (or respondent) conditioning
In classical or respondent conditioning, a neutral stimulus (conditioned stimulus) is delivered just before a reflex-eliciting stimulus (unconditioned stimulus) such as food or pain. This is typically done by pairing the two stimuli, as in Pavlov's experiments with dogs, where a bell was followed by food delivery. After repeated pairings, the conditioned stimulus alone comes to elicit a conditioned response. [3]
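The effect of repeated pairings can be pictured as a loop over trials in which the conditioned stimulus precedes the unconditioned stimulus. The Python sketch below is a toy illustration only: the associative-strength variable, its update rule, and the function name are assumptions made for this example, not a model drawn from the sources cited here.

```python
# Toy illustration of repeated CS-US pairings. The "associative strength"
# counter and its update rule are assumptions made for this sketch, not a
# model described in the article.

def run_pairings(n_trials: int, learning_rate: float = 0.2) -> list[float]:
    """Return the associative strength of the CS after each pairing trial."""
    strength = 0.0
    history = []
    for _ in range(n_trials):
        # Each trial: the CS (bell) is presented, then the US (food) follows.
        # Strength moves a fraction of the way toward its maximum (1.0).
        strength += learning_rate * (1.0 - strength)
        history.append(strength)
    return history

if __name__ == "__main__":
    for trial, s in enumerate(run_pairings(10), start=1):
        print(f"trial {trial:2d}: CS associative strength = {s:.2f}")
```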
Operant conditioning
Operant conditioning (also called "instrumental conditioning") is a learning process in which behavior is sensitive to, or controlled by, its consequences. Specifically, behavior followed by certain consequences becomes more frequent (positive reinforcement), behavior followed by other consequences becomes less frequent (punishment), and behavior that removes, avoids, or postpones still other consequences becomes more frequent (negative reinforcement). For example, in a food-deprived subject, when lever-pressing is followed by food delivery, lever-pressing increases in frequency (positive reinforcement). Likewise, when stepping off a treadmill is followed by delivery of electric shock, stepping off the treadmill becomes less frequent (punishment). And when pausing lever-pressing is followed by shock, lever-pressing is maintained or increased because pressing postpones the shock (negative reinforcement). Many variations and details of this process may be found in the main article.
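The contingencies above can be illustrated with a toy simulation. The sketch below is not a model from the behavior-analytic literature; the probability-update rule, step size, and function name are assumptions made purely for illustration.

```python
import random

# Toy illustration only: the probability-update rule and step size are
# assumptions made for this sketch, not a quantitative model from the
# literature cited in this article.

def session(n_trials: int, consequence: str, step: float = 0.05) -> float:
    """Simulate how the likelihood of lever-pressing shifts with its consequence."""
    p_press = 0.5  # starting probability that the subject presses on a given trial
    for _ in range(n_trials):
        pressed = random.random() < p_press
        if not pressed:
            continue
        if consequence == "positive reinforcement":   # e.g. press is followed by food
            p_press = min(1.0, p_press + step)
        elif consequence == "punishment":              # e.g. press is followed by shock
            p_press = max(0.0, p_press - step)
    return p_press

if __name__ == "__main__":
    print("after positive reinforcement:", round(session(200, "positive reinforcement"), 2))
    print("after punishment:            ", round(session(200, "punishment"), 2))
```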
Experimental tools in behavioral research
Operant conditioning chamber
The most commonly used tool in animal behavioral research is the operant conditioning chamber, also known as a Skinner box. The chamber is an enclosure designed to hold a test animal (often a rodent, pigeon, or primate). The interior of the chamber contains some type of device that serves as a discriminative stimulus, at least one mechanism to measure the subject's behavior as a rate of response (such as a lever or key-peck switch), and a mechanism for the delivery of consequences (such as a food pellet dispenser or a token reinforcer such as an LED light).
Cumulative recorder
Of historical interest is the cumulative recorder, an instrument used to record the responses of subjects graphically. Traditionally, its graphing mechanism has consisted of a rotating drum of paper equipped with a marking needle. The needle would start at the bottom of the page and the drum would turn the roll of paper horizontally. Each subject response would result in the marking needle moving vertically along the paper one tick. This makes the rate of response the slope of the graph. For example, a regular rate of response would cause the needle to move vertically at a regular rate, resulting in a straight diagonal line rising towards the right. An accelerating or decelerating rate of response would lead to a quadratic (or similar) curve. For the most part, cumulative records are no longer graphed using rotating drums, but are recorded electronically instead.
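Because a cumulative record adds one fixed increment per response as time elapses, an electronic version can be reconstructed directly from a list of response timestamps. The following Python sketch is a minimal illustration; the function names and example timestamps are invented for the example.

```python
# Minimal sketch: building a cumulative record from response timestamps.
# The example timestamps are invented; real data would come from the
# recording equipment.

def cumulative_record(timestamps: list[float]) -> list[tuple[float, int]]:
    """Return (time, cumulative responses) points; the local slope is the response rate."""
    return [(t, i) for i, t in enumerate(sorted(timestamps), start=1)]

def mean_rate(timestamps: list[float]) -> float:
    """Overall responses per unit time, i.e. the average slope of the record."""
    ts = sorted(timestamps)
    duration = ts[-1] - ts[0] if len(ts) > 1 else 0.0
    return len(ts) / duration if duration > 0 else 0.0

if __name__ == "__main__":
    presses = [1.2, 2.9, 4.1, 5.0, 6.8, 7.3, 9.9]   # seconds into the session
    for t, n in cumulative_record(presses):
        print(f"{t:5.1f} s -> {n} responses")
    print(f"mean rate: {mean_rate(presses):.2f} responses/s")
```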
Key concepts
Laboratory methods employed in the experimental analysis of behavior are based upon B.F. Skinner's philosophy of radical behaviorism, which is premised upon:
- Everything that organisms do is behavior (including thinking), and
- All behavior is lawful and open to experimental analysis.
- Central to operant conditioning is the use of a three-term contingency (discriminative stimulus, response, reinforcing stimulus) to describe functional relationships in the control of behavior; a brief sketch in code appears after this list.
- Discriminative stimulus (SD) is a cue or stimulus context that sets the occasion for a response. For example, food on a plate sets the occasion for eating.
- Behavior is a response (R), typically controlled by past consequences and by the presence of a discriminative stimulus. It operates on the environment, that is, it changes the environment in some way.
- Consequences can consist of reinforcing stimuli (SR) or punishing stimuli (SP) which follow and modify an operant response. Reinforcing stimuli are often classified as positively (Sr+) or negatively reinforcing (Sr−). Reinforcement may be governed by a schedule of reinforcement, that is, a rule that specifies when or how often a response is reinforced. (See operant conditioning).
- Respondent conditioning relies on stimulus–response relations and is described in terms of the unconditioned stimulus (US), conditioned stimulus (CS), neutral stimulus (NS), unconditioned response (UR), and conditioned response (CR).
- Functional analysis (psychology)
- Data collection
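As a brief illustration of the three-term contingency and of a schedule of reinforcement, the Python sketch below pairs a discriminative stimulus, a response, and a consequence, and implements a simple fixed-ratio schedule. The class and function names are invented for this example; it is a minimal sketch, not laboratory software.

```python
from dataclasses import dataclass

# Illustrative sketch of the three-term contingency and a simple
# fixed-ratio schedule of reinforcement. The names are invented
# for this example.

@dataclass
class ThreeTermContingency:
    discriminative_stimulus: str  # SD: sets the occasion for the response
    response: str                 # R: the operant behavior
    consequence: str              # SR/SP: reinforcing or punishing stimulus

class FixedRatioSchedule:
    """Deliver reinforcement after every `ratio` responses (e.g. FR 5)."""

    def __init__(self, ratio: int):
        self.ratio = ratio
        self._count = 0

    def record_response(self) -> bool:
        """Return True if this response earns reinforcement."""
        self._count += 1
        if self._count >= self.ratio:
            self._count = 0
            return True
        return False

if __name__ == "__main__":
    contingency = ThreeTermContingency("lever light on", "lever press", "food pellet")
    schedule = FixedRatioSchedule(ratio=5)
    reinforced = [i + 1 for i in range(12) if schedule.record_response()]
    print(contingency)
    print("responses that produced food:", reinforced)
```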
Anti-theoretical analysis
The idea that Skinner's position is anti-theoretical is probably inspired by the arguments he put forth in his article Are Theories of Learning Necessary?[4] However, that article did not argue against the use of theory as such, only against certain theories in certain contexts. Skinner argued that many theories did not explain behavior, but simply offered another layer of structure that itself had to be explained in turn. If an organism is said to have a drive, which causes its behavior, what then causes the drive? Skinner argued that many theories had the effect of halting research or generating useless research.
Skinner's work did have a basis in theory, though his theories were different from those that he criticized. Mecca Chiesa notes that Skinner's theories are inductively derived, while those that he attacked were deductively derived.[5] The theories that Skinner opposed often relied on mediating mechanisms and structures—such as a mechanism for memory as a part of the mind—which were not measurable or observable. Skinner's theories form the basis for two of his books: Verbal Behavior, and Science and Human Behavior. These two texts represent considerable theoretical extensions of his basic laboratory work into the realms of political science, linguistics, sociology, and other fields.
Notable figures
- Charles Ferster – pioneered errorless learning, which has since become a commonly used form of discrete trial training (DTT) to teach autistic children, and co-authored Schedules of Reinforcement with B. F. Skinner.
- Richard Herrnstein – developed the matching law, a mathematical model of decision making, and co-authored the controversial The Bell Curve.
- James Holland – co-wrote the highly cited and well-known The Analysis of Behavior with B.F. Skinner.
- Fred S. Keller – creator of the Personalized System of Instruction (PSI).
- Ogden Lindsley – founder of the Precision Teaching approach to teaching.
- Jack Michael – noted verbal behavior and motivating operations theorist and researcher.
- John Anthony (Tony) Nevin – developed the theory of behavioral momentum.
- David Premack – discovered the Premack principle, that more probable behaviors reinforce less probable behaviors, and studied the language capacity of chimpanzees.
- Howard Rachlin – pioneer in self-control research and behavioral economics.
- Murray Sidman – discovered Sidman avoidance; a highly cited author and researcher on punishment, and influential in research on stimulus equivalence.
- Philip Hineline – contributed extensively to research on negative reinforcement (escape/avoidance), molecular/molar accounts of behavioral processes, and the characteristics of interpretive language.
- Allen Neuringer – well known for theoretical work on volition, randomness, self-experimentation, and other areas.
- Peter B. Dews – a principal founder of behavioral pharmacology.[6]
References
- ↑ Chiesa, Mecca (2005). Radical Behaviorism: The Philosophy and the Science.
- ↑ Skinner, B.F. (1950). "Are theories of learning necessary?". Psychol Rev. 57 (4): 193–216.
- ↑ Skinner, B.F. (1984). "The Evolution of Behavior".
- ↑ Skinner, B.F. (July 1950). "Are theories of learning necessary?". Psychol Rev. 57 (4): 193–216. doi:10.1037/h0054367. PMID 15440996.
- ↑ Chiesa, Mecca (2005). Radical Behaviorism: The Philosophy and the Science.
- ↑ Barrett, James E. (Spring 2013). "Peter B. Dews (1922–2012)". Behav. Anal. 36 (1): 179–182. doi:10.1007/BF03392303. PMC 3640885.
External links
- The Journal of the Experimental Analysis of Behavior has been the flagship journal for behavioral research since 1958, published quarterly until 1964 and bimonthly thereafter.
- The Journal of Applied Behavior Analysis explores what is considered to be the more applied areas of the experimental analysis of behavior.
- Behavioural Pharmacology publishes research on the effects of drugs, chemicals, and hormones on schedule-controlled operant behavior, as well as research into "the neurochemical mechanisms underlying behaviour."
- Experimental Analysis of Human Behavior Bulletin is an online journal publishing experimental research focused on human subjects.
- The Analysis of Verbal Behavior – annual journal for publication of verbal behavior research.
- Are Theories of Learning Necessary? B.F. Skinner's seminal 1950 classic in which he attacks the hypothetico-deductive model of research driven by hypothesis testing.
- Behavioural Processes publishes an annual issue on quantitative analysis of behavior and an issue on Comparative Cognition.