In the 1950s the famous linguist Noam Chomsky challenged this interpretation of language as a more sophisticated form of primate communication by drawing attention to the remarkable alacrity with which children learn to speak. Language flows so readily, a ‘babbling’ stream of feelings, thoughts and opinions filling every nook and cranny of our lives, that it is easy to assume it must be simple, simple enough for children to ‘pick up’ as readily as they pick up measles. Prior to Chomsky, the standard view held that children learned to speak in the same way as blotting paper absorbs ink, by soaking up the words they heard and then reiterating them. Chomsky argued this could not be so, pointing out that the skill with which very young children learn to speak lies far beyond the intellectual competency of their years, for while they must struggle to grasp the elementary principles of mathematics, they acquire language with astonishing ease. An infant starting from a situation not dissimilar to that of an adult in a room of foreigners all jabbering away incomprehensibly, nonetheless:
‘Within a short span of time and with almost no direct instruction will have dissected the language into its minimal separable units of sound and meaning,’ writes linguist Breyne Moskowitz. ‘He will have discovered the rules of recombining sounds into words and recombining the words into meaningful sentences. Ten linguists working full-time for a decade analysing the structure of the English language could not programme a computer with a five-year-old’s ability for language.’
The aptitude of the young mind in mastering the staggering complexity of language presupposed, Chomsky argued, that humans must possess some form of highly specific ‘Language Acquisition Device’ hardwired into their brains that somehow ‘knows’ the meaning of words and the grammatical forms necessary to make sense of them. How, otherwise, can an infant know when its mother says, ‘Look at the cat!’ that she is referring to the furry four-legged creature, and not to its whiskers, or the milk it is drinking? Further, the ‘device’ must not just know what is being referred to, but also the grammatical rules that permit the same ‘idea’ expressed in different ways to have the same meaning (‘John saw Mary’ conveys the same message as ‘Mary was seen by John’), while excluding meaningless variations. Further again, it transpires that children learn language in the same way, whether brought up in New Jersey or New Guinea, and acquire the same grammatical rules of ‘present’, ‘past’ and ‘future’ in the same sequence. This implies that the ‘device’ in turn must be sensitive to a Universal Grammar, common to all languages, which can pick up on the subtlest distinction of meaning.
Now, our primate cousins do not possess this ‘device’, which is why, clever as they are, they remain (in the words of the veteran chimpanzee-watcher Jane Goodall) ‘trapped within themselves’. By contrast, every human society, no matter how ‘primitive’, has a language capable of ‘expressing abstract concepts and complex trains of reasoning’. The million Stone Age inhabitants of the highlands of New Guinea, ‘discovered’ in 1930 after being cut off from the rest of the world for several thousand years, spoke between them eight hundred different languages, each with its complex rules of syntax and grammar.
How then did the faculty of language come to colonise the human brain? ‘There must have been a series of steps leading from no language at all to language as we now find it,’ writes the linguist Steven Pinker, ‘each step small enough to have been produced by random mutation [of genes] and with each intermediate grammar being useful to its possessor.’ It is, of course, possible to imagine how language might have evolved in this way from a simpler form of communication or ‘protolanguage’, starting perhaps with gestures, moving on to simple words or phrases with a single meaning, with the rules for linking words into sentences coming later. Pinker’s intended parallel between the means by which our species acquired language and the infant’s rapid progress from burbling through words to sentences might seem plausible, in the way of all evolutionary explanations, and would indeed be reasonable if language simply ‘facilitated the exchange of information’. But, as Chomsky pointed out so persuasively, language is also an autonomous, independent set of rules and meanings that impose order, make sense of the world ‘out there’. Rules and meanings cannot evolve from the simple to the complex, they just ‘are’. The structure of sentences is either meaningful or meaningless. The naming of an object is either ‘right’ or ‘wrong’. An elephant is an elephant, and not an anteater. Hence Chomsky insisted, against Pinker, that those seeking a scientific explanation for language could, if they so wished, describe it as having evolved ‘so long as they realise that there is no substance for this assertion, that it amounts to nothing more than a belief’. This, of course, is no trivial controversy, for language is so intimately caught up in every aspect of ‘being human’ that to concede that it falls outside the conventional rubric of evolutionary explanation would be to concede that so does man.
The dispute over the evolutionary (or otherwise) origin of language remained irresolvable till the late 1980s, when the first PET scans revealed how the simplest of linguistic tasks involves vast tracts of the brain in a way that defies any simple scientific explanation. Here every mode of language, thinking about words, reading and speaking them, is represented in different parts. The prosaic task of associating the word ‘chair’ with ‘sit’ generates a blizzard of electrical activity across the whole surface of the brain. Further, those scanning investigations revealed how, in the fraction of a second that it takes to speak or hear a word, the brain fragments it into its constituent parts through four distinct modules concerned with spelling, sound (phonology), meaning (semantics) and articulation. These ‘modules’ are in turn then further subdivided ad (virtually) infinitum. The perception of sound, for example discriminating between the consonants ‘P’ and ‘B’, is represented in twenty-two sites scattered throughout the brain. There is something absolutely awe-inspiring in supposing we understand a word like ‘elephant’ only after it has been parsed in this way. And then, to compound it all, the brain must, while ‘parsing’ elephant, simultaneously also comprehend its meaning in its entirety, for the constituent symbols can really only be understood within the context of the whole word.
It is one thing to try to work out how the brain processes a single word (and that is baffling enough), quite another to extrapolate from such findings to try to imagine the same processes as they apply to a sentence, with its structure of ‘subject-verb-object’ and numerous subsidiary clauses. Move into the real world, with its ceaseless conversation, and the problem becomes insuperable. What sort of brain processes, one might ask, must be involved when a group of football fans convening in the pub before a match discuss their team’s prospects for the coming season, drawing on a vast storehouse of knowledge and judgement of the form of previous seasons, the strengths and weaknesses of their players, and assessments of the performance of their rivals? How do they pluck from the storehouse of their memories the right words, or conjure from the rules of syntax and grammar the correct sequence with which emphatically to argue their opinion? How does the electrical firing of the neurons in their brains represent words and capture the nuance of their meanings?