
What happens in the brain when you learn a language?

Scans and neuroscience are helping scientists understand what happens to the brain when you learn a second language.

Learning a foreign language can increase the size of your brain. This is what Swedish scientists discovered when they used brain scans to monitor what happens when someone learns a second language. The study is part of a growing body of research using brain imaging technologies to better understand the cognitive benefits of language learning. Tools like magnetic resonance imaging (MRI) and electrophysiology, among others, can now tell us not only whether we need knee surgery or have irregularities with our heartbeat, but reveal what is happening in our brains when we hear, understand and produce second languages.
The Swedish MRI study showed that learning a foreign language has a visible effect on the brain. Young adult military recruits with a flair for languages learned Arabic, Russian or Dari intensively, while a control group of medical and cognitive science students also studied hard, but not at languages. MRI scans showed specific parts of the brains of the language students developed in size whereas the brain structures of the control group remained unchanged. Equally interesting was that learners whose brains grew in the hippocampus and areas of the cerebral cortex related to language learning had better language skills than other learners for whom the motor region of the cerebral cortex developed more.
In other words, the areas of the brain that grew were linked to how easy the learners found languages, and brain development varied according to performance. As the researchers noted, while it is not completely clear what changes after three months of intensive language study mean for the long term, brain growth sounds promising.
Looking at functional MRI brain scans can also tell us what parts of the brain are active during a specific learning task. For example, we can see why adult native speakers of a language like Japanese cannot easily hear the difference between the English “r” and “l” sounds (making it difficult for them to distinguish “river” and “liver” for example). Unlike English, Japanese does not distinguish between “r” and “l” as distinct sounds. Instead, a single sound unit (known as a phoneme) represents both sounds.
When presented with English words containing either of these sounds, brain imaging studies show that only a single region of a Japanese speaker’s brain is activated, whereas in English speakers, two different areas of activation show up, one for each unique sound.
For Japanese speakers, learning to hear and produce the differences between the two phonemes in English requires a rewiring of certain elements of the brain’s circuitry. What can be done? How can we learn these distinctions?
Early language studies based on brain research have shown that Japanese speakers can learn to hear and produce the difference between "r" and "l" by using a software program that greatly exaggerates the aspects of each sound that make it different from the other. When the sounds were modified and extended by the software, participants were more easily able to hear the difference between the sounds. In one study, after only three 20-minute sessions (just a single hour's worth), the volunteers learned to successfully distinguish the sounds, even when the sounds were presented as part of normal speech.
This sort of research might eventually lead to advances in the use of technology for second-language learning. For example, using ultrasound machines like the ones used to show expectant parents the features and movements of their babies in the womb, researchers in articulatory phonetics have been able to explain to language learners how to make sounds by showing them visual images of how their tongue, lips, and jaw should move with their airstream mechanisms and the rise and fall of the soft palate to make these sounds.
Ian Wilson, a researcher working in Japan, has produced some early reports of studies of these technologies that are encouraging. Of course, researchers aren’t suggesting that ultrasound equipment be included as part of regular language learning classrooms, but savvy software engineers are beginning to come up with ways to capitalise on this new knowledge by incorporating imaging into cutting edge language learning apps.
Kara Morgan-Short, a professor at the University of Illinois at Chicago, uses electrophysiology to examine the inner workings of the brain. She and her colleagues taught second-language learners to speak an artificial language – a miniature language constructed by linguists to test claims about language learnability in a controlled way.
In their experiment, one group of volunteers learned through explanations of the rules of the language, while a second group learned by being immersed in the language, similar to how we all learn our native languages. While all of their participants learned, it was the immersed learners whose brain processes were most like those of native speakers. Interestingly, up to six months later, when they could not have received any more exposure to the language at home because the language was artificial, these learners still performed well on tests, and their brain processes had become even more native-like.
In a follow-up study, Morgan-Short and her colleagues showed that the learners who demonstrated particular talents at picking up sequences and patterns learned grammar particularly well through immersion. Morgan-Short said: “This brain-based research tells us not only that some adults can learn through immersion, like children, but might enable us to match individual adult learners with the optimal learning contexts for them.”
Brain imaging research may eventually help us tailor language learning methods to our cognitive abilities, telling us whether we learn best from formal instruction that highlights rules, immersing ourselves in the sounds of a language, or perhaps one followed by the other.
However we learn, this recent brain-based research provides good news. We know that people who speak more than one language fluently have better memories and are more cognitively creative and mentally flexible than monolinguals. Canadian studies suggest that Alzheimer’s disease and the onset of dementia are diagnosed later for bilinguals than for monolinguals, meaning that knowing a second language can help us to stay cognitively healthy well into our later years.
Even more encouraging is that bilingual benefits still hold for those of us who do not learn our second languages as children. Edinburgh University researchers point out that “millions of people across the world acquire their second language later in life: in school, university, or work, or through migration or marriage.” Their results, with 853 participants, clearly show that knowing another language is advantageous, regardless of when you learn it.
___________________________________________
http://www.theguardian.com/education/2014/sep/04/what-happens-to-the-brain-language-learning?CMP=fb_gu

Sex and gender in language

"In Germany, a baby's a baby except they call it das Baby."
Germany recently became the first country in Europe to allow children with indeterminate sex organs to be registered on their birth certificate as neither male nor female. Over on Scientaisies, his blog on the French SciLogs network, Didier Nordon asks whether the German language itself may have been a factor in the law being passed there rather than in France (as far as I know, it hasn't been proposed in France).
Indeed, whereas the French language only recognises two genders (féminin and masculin), German has three (Femininum, Maskulinum, and Neutrum). Could this make it easier for German-speakers to talk about (dare I say even conceive of or think about) intersex babies? A French speaker has to pick between talking about le garçon and la fille, and in the third person has to refer to il or elle. There is no "it". (For those of you who learned French by repeating il, elle, on for the third person singular, it's worth knowing that on is closer to "one".)
Germanophones, on the other hand, have er, sie and es. What's more, the German for "the baby" is das Baby. Yes, they nicked it from English. Conveniently, the default for such borrowed words is to use the neuter gender. Exceptions tend to be those borrowed words where the German equivalent is masculine or feminine, such as der Computer, which is the Denglisch word for der Rechner (rechnen means "to count" or "to calculate"). But because there is no original German word for "baby" (!), the English term remains neuter. So in German, there is room to use gender-neutral language for an intersex baby (and, indeed, all babies - see below). Further, the German for "the child" is das Kind (plural: Kinder, as in "kindergarten").
In French, the word for "the baby" is the masculine le bébé. Even a "baby girl" is un bébé fille. Is this horrific sexism on blatant display at the heart of the French language? Are all babies considered to be masculine by default? I'm not so sure. For instance, do native French speakers perceive la table as having particularly feminine qualities? I doubt it. I think that would be pushing the Sapir-Whorf hypothesis a bit too far (The Sapir-Whorf hypothesis loosely says that everything you think and how you perceive the world is intimately tied to the words you know and the structure of your native language.)
Pronouns and patriarchy
That being said, there does exist in French the convention to use the masculine form of the third person plural, ils, for any group of people (or nouns) where at least one male (or masculine noun) is present. Only in the case of an entirely female group do we use elles. This is the patriarchy at work.
Switching back to German, we find that the third person plural is sie, the same as the feminine third person singular. "They" for a group of masculine nouns is also sie. Also, the formal form of the second person ("You") is Sie. So, in German at least, feminine pronouns do turn up elsewhere than for only feminine items or groups. Is there in fact less patriarchy built into the German language? (Is that even a meaningful question? Please be forgiving, I'm exploring...)
Another thought: the German word for "girl" is Mädchen. This is the diminutive form of the "word" (it doesn't exist on its own any more) Mäd, which has the same root as "maid" and also turns up in the slang das Mädel ("the girl"). Diminutives in German all have neuter gender, so it's das Mädchen (Mädel is also a diminutive). Other examples of diminutives include two of my favourite German words: das Kaninchen ("the rabbit") and das Meerschweinchen ("the guinea pig"). But does this mean that girls and pets are thought of as sexless in German? I doubt it.
Conclusions?
We have to be cautious of drawing conclusions about the attitudes of speakers of particular languages based solely on the quirks of grammar. Within a given framework of perception, people can have widely varied viewpoints; these are influenced by far more than just the words they know.
Nonetheless, I think it's important that we stop assigning genders to children as a one-to-one function of genitalia, and I welcome the new German law. This issue does not arise only for intersex babies, where it is particularly "tricky" to know whether an infant is a "boy" or a "girl". Environmental factors play such a strong role in a child's development that what it means to be a boy/man or girl/woman is highly variable, even in the vast majority of cases where sex organs are clearly male or female. Most concepts around the notions of "masculine" and "feminine" are ideas built up in culture and can be almost entirely decoupled from the biology of sex. Our use of language should reflect that distinction.

__________________________________