Switching Functions Among Brain Parts

When your brain encounters sensory stimuli, such as the scent of your morning coffee or the sound of a honking car, that input gets shuttled to the appropriate brain region for analysis. The coffee aroma goes to the olfactory cortex, while sounds are processed in the auditory cortex.

That division of labor suggests that the brain's structure follows a predetermined, genetic blueprint. However, evidence is mounting that brain regions can take over functions they were not genetically destined to perform. In a landmark 1996 study of people blinded early in life, neuroscientists showed that the visual cortex could participate in a nonvisual function -- reading Braille.

Now, a study from MIT neuroscientists shows that in individuals born blind, parts of the visual cortex are recruited for language processing. The finding suggests that the visual cortex can dramatically change its function -- from visual processing to language -- and it also appears to overturn the idea that language processing can only occur in highly specialized brain regions that are genetically programmed for language tasks.

"Your brain is not a prepackaged kind of thing. It doesn't develop along a fixed trajectory, rather, it's a self-building toolkit. The building process is profoundly influenced by the experiences you have during your development," says Marina Bedny, an MIT postdoctoral associate in the Department of Brain and Cognitive Sciences and lead author of the study, which appears in the Proceedings of the National Academy of Sciences the week of Feb. 28.

Flexible connections

For more than a century, neuroscientists have known that two specialized brain regions -- called Broca's area and Wernicke's area -- are necessary to produce and understand language, respectively. Those areas are thought to have intrinsic properties, such as specific internal arrangement of cells and connectivity with other brain regions, which make them uniquely suited to process language.

Other functions -- including vision and hearing -- also have distinct processing centers in the sensory cortices. However, there appears to be some flexibility in assigning brain functions. Previous studies in animals (in the laboratory of Mriganka Sur, MIT professor of brain and cognitive sciences) have shown that sensory brain regions can process information from a different sense if input is rewired to them surgically early in life. For example, connecting the eyes to the auditory cortex can provoke that brain region to process images instead of sounds.

Until now, no such evidence existed for flexibility in language processing. Previous studies of congenitally blind people had shown some activity in the left visual cortex of blind subjects during some verbal tasks, such as reading Braille, but no one had shown that this might indicate full-fledged language processing.

Bedny and her colleagues, including senior author Rebecca Saxe, assistant professor of brain and cognitive sciences, and Alvaro Pascual-Leone, professor of neurology at Harvard Medical School, set out to investigate whether visual brain regions in blind people might be involved in more complex language tasks, such as processing sentence structure and analyzing word meanings.

To do that, the researchers scanned blind subjects (using functional magnetic resonance imaging) as they performed a sentence comprehension task. The researchers hypothesized that if the visual cortex was involved in language processing, those brain areas should show the same sensitivity to linguistic information as classic language areas such as Broca's and Wernicke's areas.

They found that was indeed the case -- visual brain regions were sensitive to sentence structure and word meanings in the same way as classic language regions, Bedny says. "The idea that these brain regions could go from vision to language is just crazy," she says. "It suggests that the intrinsic function of a brain area is constrained only loosely, and that experience can have really a big impact on the function of a piece of brain tissue."

Bedny notes that the research does not refute the idea that the human brain needs Broca's and Wernicke's areas for language. "We haven't shown that every possible part of language can be supported by this part of the brain [the visual cortex]. It just suggests that a part of the brain can participate in language processing without having evolved to do so," she says.

Redistribution

One unanswered question is why the visual cortex would be recruited for language processing, when the language processing areas of blind people already function normally. According to Bedny, it may be the result of a natural redistribution of tasks during brain development.

"As these brain functions are getting parceled out, the visual cortex isn't getting its typical function, which is to do vision. And so it enters this competitive game of who's going to do what. The whole developmental dynamic has changed," she says.

This study, combined with other studies of blind people, suggests that different parts of the visual cortex get divvied up for different functions during development, Bedny says. A subset of (left-brain) visual areas appears to be involved in language, including the left primary visual cortex.

It's possible that this redistribution gives blind people an advantage in language processing. The researchers are planning follow-up work in which they will study whether blind people perform better than sighted people in complex language tasks such as parsing complicated sentences or performing language tests while being distracted.

The researchers are also working to pinpoint more precisely the visual cortex's role in language processing, and they are studying blind children to figure out when during development the visual cortex starts processing language.
Other interesting new research on the brain includes: Autism, Gene Therapy, Cell Phones.

Source: Marina Bedny, Alvaro Pascual-Leone, David Dodell-Feder, Evelina Fedorenko, and Rebecca Saxe. Language processing in the occipital cortex of congenitally blind adults. Proceedings of the National Academy of Sciences, 2011. DOI: 10.1073/pnas.1014818108 (via MIT).
