Emergence Of Cognitive Psychology

The decline of behaviourism

[Image: Noam Chomsky]

Skinner's Verbal Behavior and Chomsky's review

In 1957 Skinner published a book, Verbal Behavior, which provided a theoretical analysis of how environmental contingencies could explain language acquisition and usage. Two years later, Noam Chomsky published a highly critical review that emphasized "the poverty of the stimulus": the gap between the linguistic environment that children are exposed to and the linguistic ability they acquire. Children acquire words remarkably rapidly throughout their early years, and the ability to construct grammatical sentences appears at about age 4. Once grammar has been acquired, children quickly develop the capacity to produce highly novel sentences, that is, sentences they have never heard before and which therefore cannot have been reinforced. Chomsky argued that children are born with a Language Acquisition Device (LAD) that allows them to infer the grammatical rules of their linguistic environment and that constrains the grammatical constructions that are permissible.

Chomsky had himself published a book on language in 1957, titled Syntactic Structures, in which he outlined his theory of transformational grammar. This espoused the idea that all languages share common properties, embedded in a deep structure that may be obscured by their surface structure. However, it was not until Chomsky published his stinging attack on Skinner's ideas that his own views attracted widespread attention.

Recommended reading: Chomsky, N. (1959). A review of B.F. Skinner's "Verbal Behavior". Language, 35 (1), 26-58.

The Garcia Effect

In the 1950s, John Garcia was investigating the effects of radiation on behaviour. He noticed that rats developed an aversion to food they had consumed before becoming sick from the radiation exposure, even though the sickness occurred several hours after eating. He examined this further by feeding sweetened water to rats, dosing some of them with radiation, and then offering them both tap water and sweetened water a day later. The rats that had been exposed to radiation (and hence became sick) drank less of the sweetened water. These findings were contrary to two basic tenets of behaviourism, according to which conditioning required (1) multiple trials and (2) reinforcement occurring shortly after the behaviour had been exhibited. Nor were the results consistent with classical conditioning: the rats did not develop an aversion to the other stimuli they were exposed to during this period. In short, it appears that rats have an evolved predisposition to avoid foods that may be associated with sickness. This form of conditioned taste aversion is known as the Garcia Effect.

The misbehavior of organisms

Another factor leading to the decline of behaviourism was the publication in 1961 of a paper titled "The misbehavior of organisms". This was written by two of B.F. Skinner's students, Keller and Marian Breland, and described their repeated failures to shape the behaviour of animals using the techniques of operant conditioning. In one such case they tried to train a raccoon to pick up two coins and put them in a metal box. The first part of the training was successful: the raccoon was conditioned to pick up a single coin. However, conditioning the raccoon to put the coin in the box proved problematic; the raccoon kept rubbing the coin against the side of the box. Eventually, the researchers trained the raccoon to deposit the coin in the box, but trying to get the animal to deposit two coins proved impossible: the raccoon simply rubbed the two coins together. The Brelands described this and other examples as "a clear and utter failure of conditioning theory" (p. 683). They reached the following conclusion:

After 14 years of continuous conditioning and observation of thousands of animals, it is our reluctant conclusion that the behavior of any species cannot be adequately understood, predicted, or controlled without knowledge of its instinctive patterns, evolutionary history, and ecological niche (p.684).

Information processing and the computer revolution

In 1936 the mathematician and subsequent WW2 codebreaker Alan Turing published a landmark paper, "On computable numbers, with an application to the Entscheidungsproblem". This showed how a machine capable of being in only a finite number of states could compute any computable number by reading and writing symbols, one at a time, on a tape. The subsequent development of computers from the 1950s onwards led many psychologists to adopt the language of computation in their theorising, referring to information processing, capacity limitations, and so on. In 1950 Turing devised what is now known as the Turing test, proposing that a machine should be declared intelligent if a human questioner was unable to distinguish its responses from those of an unseen human.
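To make the idea concrete, here is a minimal sketch of such a machine in Python. It is our own illustrative construction, not Turing's notation: a small table of states and symbols is enough to carry out a computation, in this case adding one to a binary number.

# A toy Turing machine: a finite set of states plus a transition table
# manipulating symbols on a tape. This example increments a binary number.
def run_turing_machine(tape, transitions, state="start", halt="halt"):
    cells = dict(enumerate(tape))      # sparse tape: position -> symbol
    head = 0
    while state != halt:
        symbol = cells.get(head, "_")  # "_" marks a blank cell
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Transition table for binary increment: scan right to the end of the
# number, then move back left adding one with carry.
INCREMENT = {
    ("start", "0"): ("start", "0", "R"),
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("carry", "_", "L"),
    ("carry", "1"): ("carry", "0", "L"),
    ("carry", "0"): ("halt",  "1", "L"),
    ("carry", "_"): ("halt",  "1", "L"),
}

print(run_turing_machine("1011", INCREMENT))   # prints "1100" (11 + 1 = 12)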

In 1953, the British researcher Colin Cherry published a paper in The Journal of the Acoustical Society of America which described investigations into the cocktail party problem: the ability to track one conversation at a party whilst ignoring others. In several carefully controlled studies, Cherry found that people were limited-capacity processors. For instance, people found it virtually impossible to track just one message when two different messages were mixed together and presented to both ears, rather than one message to each ear. Cherry's research was extended a few years later by Donald Broadbent.

1956 is widely seen as a pivotal year in the cognitive revolution. In that year George Miller published his paper on the "magic number" seven, reviewing a range of studies that had found around seven items to be the approximate limit on the number of items that could be recalled or discriminated. For instance, people have little trouble identifying a musical tone sampled from a previously learned range of three tones. As the overall number of tones increases to four, five, six, or seven, performance remains good, but it drops rapidly once the number of tones exceeds seven. Of course, there is some variation across individuals and across tasks, so the magic number is usually described as seven plus or minus two.

In the same year, a symposium at the Massachusetts Institute of Technology heard Noam Chomsky discuss his theory of language, George Miller talk about his magic number, and Allen Newell and Herbert Simon describe the Logic Theorist, the forerunner of their General Problem Solver computer program. Another 1956 conference, held at Dartmouth College, brought together many of the key figures in Artificial Intelligence and gave that field its name.

The term cognitive psychology became established as the label for these approaches to the study of the mind with the publication in 1967 of Ulric Neisser's textbook of the same name.

Question: Why was the General Problem Solver a significant development in cognitive psychology / cognitive science?

Recommended reading:

Miller, G.A. (1956). The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychological Review, 63, 81-97.

[Image: Herbert Simon (1916-2001)]

The conceptual basis of cognitive psychology

In the section Precursors to Cognitive Psychology we saw that Alan Turing had specified how a symbol-processing device could perform mathematical computations. Later, Allen Newell and Herbert Simon proposed the Physical Symbol System Hypothesis, according to which physical symbol systems are necessary and sufficient for intelligent behaviour; the human mind is one such physical symbol system. Symbol systems apply processes to symbol structures, thereby producing new symbol structures. Newell and Simon investigated their hypothesis by developing computer models (of problem solving, for example) and comparing their performance with that of human participants.
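As a purely illustrative gloss on this idea (this is not one of Newell and Simon's own programs), the Python sketch below treats strings and nested tuples as symbol structures and applies a single process, a modus ponens rule, to derive a new symbol structure from existing ones.

# Symbol structures (strings and tuples) plus a process that transforms
# them into new symbol structures: a one-rule "reasoner" applying modus ponens.
def modus_ponens(knowledge):
    derived = set(knowledge)
    for fact in knowledge:
        if isinstance(fact, tuple) and fact[0] == "implies":
            _, antecedent, consequent = fact
            if antecedent in knowledge:
                derived.add(consequent)    # a new symbol structure is produced
    return derived

facts = {
    "socrates_is_human",
    ("implies", "socrates_is_human", "socrates_is_mortal"),
}
print(modus_ponens(facts))   # now also contains "socrates_is_mortal"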

An alternative to the symbol-processing approach is connectionism. In connectionism, cognition is sub-symbolic: the cognitive architecture is assumed to consist of many interlinked neuron-like units, and representations reside not in any individual unit but in patterns of activation across the units and in the strengths of the connections between them. Whereas symbol processing is said to be sequential, in connectionist approaches many operations occur in parallel (hence the term sometimes used, Parallel Distributed Processing or PDP).
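The contrast can be illustrated with a minimal sketch; the weights and inputs below are arbitrary values chosen for illustration, not a model of any real data. Every output unit is computed in parallel from the whole input pattern, and what the network "knows" is carried by its connection weights rather than by any single unit.

# A single layer of neuron-like units. Each output unit takes a weighted
# sum of all input activations and passes it through a squashing function;
# all of the output units are computed in parallel.
import numpy as np

def layer(inputs, weights, biases):
    net_input = weights @ inputs + biases
    return 1.0 / (1.0 + np.exp(-net_input))   # sigmoid activations in (0, 1)

inputs = np.array([1.0, 0.0, 1.0])             # an input activation pattern
weights = np.array([[ 0.8, -0.3,  0.5],        # 2 output units x 3 input units
                    [-0.6,  0.9,  0.2]])
biases = np.array([-0.2, 0.1])

print(layer(inputs, weights, biases))          # the distributed output pattern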

Question: What are the strengths and weaknesses of the symbolic and connectionist approaches to cognition? (this question requires further reading)

Recommended reading:

McClelland, J.L., Rumelhart, D.E., and Hinton, G.E. (1986). The appeal of Parallel Distributed Processing. In D.E. Rumelhart, J.L. McClelland, and the PDP Research Group (Eds.), Parallel Distributed Processing: Explorations in the microstructure of cognition, Volume 1. Cambridge, MA: MIT Press.

David Marr, in his 1982 book Vision, argued that a complete psychological theory required explanation at three levels. Explanations at the computational level are concerned with what a computational system computes and why. Explanations at the algorithmic level specify how a computation is carried out. Lastly, the hardware or implementational level specifies how representations and processes at the algorithmic level are physically realised.
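A simple way to see the distinction (the example is ours, not Marr's) is that two quite different algorithms can carry out the same computation. The Python sketch below computes the same function, the sum of a list, in two different ways; the implementational level would concern the hardware on which either procedure runs.

# Computational level: WHAT is computed -- the sum of a list of numbers.
# Algorithmic level: HOW -- two different procedures compute the same thing.
def sum_serially(numbers):
    total = 0
    for n in numbers:                  # serial accumulation, one item at a time
        total += n
    return total

def sum_by_halving(numbers):
    if len(numbers) == 1:              # divide and conquer: split, solve, combine
        return numbers[0]
    mid = len(numbers) // 2
    return sum_by_halving(numbers[:mid]) + sum_by_halving(numbers[mid:])

print(sum_serially([3, 1, 4, 1, 5]), sum_by_halving([3, 1, 4, 1, 5]))   # 14 14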

The influence of computation on psychology meant that, in the early days, many people treated the mind as a general-purpose computing device, in which the same processes that operated in (say) vision would also apply in language. However, in the 1980s Jerry Fodor proposed that low-level cognitive processes such as language and vision are modular. A key feature of modules is that they are "encapsulated" from other cognitive processes. This view was developed from a consideration of optical illusions and of Chomsky's ideas on language.

There is disagreement among theorists as to just how modular the mind may be. Fodor takes the view that just a few low-level processes are modular, whereas some evolutionary psychologists argue that the mind is massively modular, with each module having evolved throughout human history in order to solve particular adaptive problems. Leda Cosmides and John Tooby are the best-known proponents of this idea.
