Yesterday, I attended a lecture by Northeastern psychology professor Iris Berent on “How Human Brains Give Rise to Language.” Berent, who works closely with collaborators in a range of fields, has spent her career examining “the uniquely human capacity for language.”
That’s not to say that other animals don’t have meaningful vocalizations, but, she argues, there is something unique about the human capacity for language. Furthermore, this capacity cannot simply be attributed to mechanical differences – that is, human language is not simply a product of the computational power of our brains or the capabilities of our vocal and auditory systems.
Rather, Berent argues, humans have an intrinsic capacity for language. That is, as Steven Pinker describes in The Language Instinct, “language is a human instinct, wired into our brains by evolution like web-spinning in spiders or sonar in bats.”
While this idea may seem surprising, in some ways it is altogether reasonable: humans have specialized organs for seeing, breathing, processing toxins, and more – is it really that much more of a jump to say that the human brain is specialized, that the brain has a specialized biological system for language?
Berent sees this not as an abstract, philosophical question, but rather as one that can be tested empirically.
Specialized biological systems exhibit an invariant, universal structure, Berent explained. There is some variety among human eyes, but fundamentally they are all the same. This logic can be applied to the question of innate language capacity: if language is specialized, we would expect to find invariant principles – what Noam Chomsky called a “universal grammar.”
In searching for a universal grammar, Berent doesn’t expect to find such a thing on a macro scale: there’s no universal rule that a verb can only come after a noun. Rather, a universal grammar would manifest in the syllables that occur – or don’t occur – across the breadth of human language.
To this end, Berent constructs a series of syllables which she expects will be increasingly difficult for human brains to process: bl > bn > bd > lb.
That is, it’s universally easier to say “blog” than to say “lbog,” with “bnog” and “bdog” having intermediate difficulty.
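This hierarchy tracks what phonologists call sonority sequencing: consonants at the start of a syllable should rise in sonority (roughly, acoustic loudness) toward the vowel. The sketch below ranks the four onsets by sonority rise; the numeric values are a common textbook scale, not Berent’s actual model:

```python
# Illustrative sketch of sonority sequencing, the phonological
# principle behind the bl > bn > bd > lb hierarchy. The sonority
# values are a standard textbook scale, assumed for illustration.
SONORITY = {
    "b": 1, "d": 1,   # stops (least sonorous)
    "n": 2,           # nasals
    "l": 3,           # liquids (most sonorous of these)
}

def onset_rise(onset: str) -> int:
    """Sonority rise from the first consonant to the second.
    A larger rise is predicted to be easier to process."""
    first, second = onset
    return SONORITY[second] - SONORITY[first]

onsets = ["bl", "bn", "bd", "lb"]
# Sort from largest sonority rise (easiest) to largest fall (hardest).
ranked = sorted(onsets, key=onset_rise, reverse=True)
print(ranked)  # ['bl', 'bn', 'bd', 'lb']
```

“bl” rises by 2 steps, “bn” by 1, “bd” is flat, and “lb” falls – reproducing exactly the predicted ordering.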
One argument for this is simply the frequency of such constructions – in languages around the world “bl” occurs more frequently than “lb.”
Of course, this by no means proves the existence of an innate, universal grammar, as we cannot account for the socio-historical forces that shaped modern language, nor can we be sure such variance isn’t due to the mechanical limitations of human speech.
Berent’s research, therefore, aims to prove the fundamental universality of such syllables – showing that there is a universal hierarchy of what the human brain prefers to process.
In one experiment, she has Russian speakers – who do use the difficult “lb” construction – read such a syllable out loud. She then asks speakers of languages without that construction (in this case English, Spanish, and Korean) how many syllables the sound contained.
The idea here is that if your brain can’t process “lbif” as a syllable, it will silently “repair” it to the 2-syllable “lebif.”
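The “repair” can be pictured as vowel epenthesis: when an onset is illegal in a listener’s grammar, the brain perceives an extra vowel between the consonants, splitting one syllable into two. A toy sketch – the set of legal onsets and the inserted “e” are illustrative assumptions, not a real phonological grammar:

```python
# Toy model of perceptual "repair" by vowel epenthesis.
# LEGAL_ONSETS and the epenthetic vowel "e" are illustrative
# assumptions standing in for a listener's native grammar.
LEGAL_ONSETS = {"bl", "br", "pl", "pr"}

def repair(word: str) -> str:
    """If a word begins with an illegal two-consonant onset,
    insert 'e' between the consonants (lbif -> lebif)."""
    onset = word[:2]
    if onset not in LEGAL_ONSETS:
        return onset[0] + "e" + onset[1] + word[2:]
    return word

print(repair("lbif"))  # -> 'lebif' (heard as two syllables)
print(repair("blif"))  # -> 'blif' (legal onset, left intact)
```

Asking listeners to count syllables then becomes a behavioral probe: a two-syllable answer for “lbif” is evidence that the repair happened.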
In numerous studies, she found that as listeners went from hearing syllables predicted to be easy to syllables predicted to be hard, they were in fact more likely to “repair” the word. Doing the experiment with fMRI and Transcranial Magnetic Stimulation (TMS) further revealed that people’s brains were indeed working harder to process the predicted-harder syllables.
All this, Berent argues, is evidence that a universal grammar does exist – that today’s languages are more than the result of history, social causes, or mechanical realities. The brain does indeed seem to have some specialized language system.
For myself, I remain skeptical.
As Vyvyan Evans, Professor of Linguistics at Bangor University, writes, “How much sense does it make to call whatever inborn basis for language we might have an ‘instinct’? On reflection, not much. An instinct is an inborn disposition towards certain kinds of adaptive behaviour. Crucially, that behaviour has to emerge without training…Language is different…without exposure to a normal human milieu, a child just won’t pick up a language at all.”
Evans instead points to a simpler explanation for the emergence of language: cooperation.
Language is, after all, the paradigmatic example of co‑operative behaviour: it requires conventions – norms that are agreed within a community – and it can be deployed to co‑ordinate all the additional complex behaviours that the new niche demanded…We see this instinct at work in human infants as they attempt to acquire their mother tongue…They are able to deploy sophisticated intention-recognition abilities from a young age, perhaps as early as nine months old, in order to begin to figure out the communicative purposes of the adults around them. And this is, ultimately, an outcome of our co‑operative minds. Which is not to belittle language: once it came into being, it allowed us to shape the world to our will – for better or for worse. It unleashed humanity’s tremendous powers of invention and transformation.