
Are the language and sound centres of the brain in the same area?


I am an English major currently taking a Psycholinguistics module. One of the things we learned is that speech perception is handled differently from non-speech sound perception. Our brain is trained to break down speech signals more intricately than non-speech information, which is why we are able to distinguish syllable and word boundaries.

Since speech sounds are perceived differently than general sounds, my question is this: do different areas of the brain light up when speech is perceived as opposed to a general sound?

I've spent a few hours Googling this, but I found no answers. The only thing I found is that a particular section of the temporal lobe is responsible for auditory perception.

I would expect that a general sound mainly activates the temporal lobe, while hearing speech would activate the temporal lobe plus the brain areas associated with the words contained in the speech.


Short answer
The primary auditory cortex is mainly involved in relatively simple processing of sound. The language areas involved in producing and understanding speech are associative auditory cortices and are considered to be areas higher up in the hierarchy. They are connected to the primary auditory cortex, but are situated at anatomically different locations in the brain.

Background
The primary auditory cortex is engaged in the low-level processing of sound. For example, it contains a tonotopic map that is used to separately process the various frequencies in the incoming sound (Purves et al., 2001) (Fig. 1).
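To build some intuition for what a tonotopic map does, here is a minimal Python sketch. It is a toy lookup, not a biological simulation: the frequency range and number of channels are illustrative assumptions, and the ordered "channels" merely stand in for neighboring patches of cortex tuned to neighboring frequency bands.

```python
import numpy as np

# Toy tonotopic map: a row of channels, each tuned to a band of a
# logarithmically spaced frequency axis (roughly how the cochlea and
# primary auditory cortex order frequencies). The range and channel
# count are illustrative assumptions, not measured values.
def make_tonotopic_map(f_min=20.0, f_max=20000.0, n_channels=32):
    # Channel edge frequencies, evenly spaced on a log scale.
    return np.logspace(np.log10(f_min), np.log10(f_max), n_channels + 1)

def channel_for(freq_hz, edges):
    """Return the index of the 'patch' whose band contains freq_hz."""
    idx = np.searchsorted(edges, freq_hz, side="right") - 1
    return int(np.clip(idx, 0, len(edges) - 2))

edges = make_tonotopic_map()
for f in (100, 440, 1000, 8000):
    print(f"{f} Hz -> channel {channel_for(f, edges)}")
```

Neighboring frequencies land in neighboring channels, which is the essential property of a tonotopic arrangement.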

The brain's language area involved in the production of speech is called Broca's area (Fig. 1); it is anatomically distinct from the primary auditory cortex.

The part of the brain dealing with understanding speech is Wernicke's area (Fig. 1), also a distinct anatomical structure in the brain.

The angular gyrus has also been associated with specific speech production pathologies (Fig. 1).


Fig. 1. Central auditory system. Source: McGill University

Reference
- Purves D et al., eds. (2001). Neuroscience, 2nd ed. Sunderland, MA: Sinauer Associates.


The Brain and Language: How Our Brains Communicate

I am a neuropsychologist and scientist from Germany. I am studying how our brains grow and develop and how this is related to language development. I also like to spend time in nature, enjoying things like hiking or climbing.

Young Reviewers

Rayleigh

I like gardening. I live on a small farm in Washington State. I have two chickens, two cats, and one dog. I am in the fourth grade and love animals (as you can see). I get a lot of reviewing experience in class. I want to help young people realize that science is fun, interesting, and understandable.

In this article, we will show what our brains do when we listen to someone talking to us. In particular, we will show how the brains of infants and children are tuned to understand language, and how changes in the brain during development serve as preconditions for language learning. Understanding language is a process that involves at least two important brain regions, which need to work together in order to make it happen. This would be impossible without connections that allow these brain regions to exchange information. The nerve fibers that make up these connections develop and change during infancy and childhood and provide a growing underpinning for the ability to understand and use language.

We humans are very social and chatty beings. As soon as we are born, we learn to communicate with our environment. Growing up, we love to talk to our friends, to our families, and to strangers. We exchange our thoughts and feelings and are eager to learn about the thoughts and feelings of others. This includes direct spoken statements in face-to-face interactions, on the telephone, or via Skype, but also written messages via post-it notes, text messages, Facebook, or Twitter. All of these types of communication require language in order to transport a message from one person to the other. Language is used by young children from a very early age on, and their language abilities develop quickly. But how do humans learn to understand language, and what are the first steps in language acquisition? How does language develop from a baby to a child and so on? Are there certain preconditions in our brains that support language? And, most importantly, why is language important? Well, the ability to understand and produce language is a huge advantage for us because it allows us to exchange information very quickly and accurately. It even allows us to pass on this information over centuries, when we write it down and preserve it. The Bible, for instance, contains texts that were written many hundreds of years ago, and we can still read them. When we talk, we can talk about things that are right in front of us, or about things that are far away, things that exist, have existed, or will exist, or even things that never existed in the real world and never will. We can even talk about talking itself, or write articles that try to teach us how this tremendous ability is made possible by our brains, because the brain is the place where our words come from when we speak, and also where they go when someone else talks to us. The ability to use language is one of the most amazing abilities that we have.

When babies are born, they cannot talk or understand words. A baby’s communication is generally basic and non-verbal. Babies are not born with speech or language. This is something they learn from their interactions with others. Within the first year of life, babies say their first words, and they can soon speak full sentences. After only 2–3 years, babies are already quite good at verbal communication and are able to say what they want. This fast progress in language abilities is probably aided by a genetic predisposition for fast language learning.

However, it is interesting to think that a baby has already taken the first steps in terms of language development even before birth [ 1 ]. This sounds impossible when we know that language needs to be learnt and does not happen automatically, unlike breathing or sleeping. But babies are actually born knowing the sound and melody of their mother tongue – and they can already “speak” by following the melodic pattern of the language. Of course, this “speaking” does not involve words, and the sound made by newborn babies is often that of crying. But this crying follows a certain melody. You might think that all babies sound similar when they cry, but when a group of German and French scientists investigated the crying sounds of German and French newborn babies [ 2 ], they actually discovered that they were different! As you can see in Figure 1, French babies show a cry melody with low intensity at the beginning, which then rises. German babies, on the other hand, show a cry melody with high intensity at the beginning, which then falls. These findings become even more interesting when you know that these cry melodies resemble the melodies of the two languages when people speak French or German: German is, like English, a language that stresses words at the beginning, while French stresses words toward the end. To give an example, the German word for daddy is “papa,” with stress on the first syllable: PA-pa. The French word for daddy is also “papa,” but with stress on the last syllable: pa-PA. The surprising thing is that the cry melodies of French and German newborn babies follow these speech stress patterns!

  • Figure 1 - The sound patterns of babies’ cries.
  • The two diagrams show the intensity of the sound as a black curve over time, in a 1.2-s frame. The wider the curves (i.e., the higher the amplitude), the more intense the sound. The upper diagram shows the sound pattern of a typical cry for French newborn babies. The cry’s highest intensity is at the end (rising from left to right). The lower diagram shows the sound pattern of a typical cry for German newborn babies. Here, unlike in the French example, the cry is more intense at the beginning (falling from left to right). These two different melodies of crying are similar to the sounds of the two languages, French and German, which appear to be learnt already before birth.
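If you like computers, you can sketch these two cry melodies yourself. The short Python program below draws a rising, French-style intensity envelope and a falling, German-style one over a 1.2-second window. The curve shapes are made-up approximations for illustration, not real cry recordings.

```python
import numpy as np
import matplotlib.pyplot as plt

# Schematic 1.2-s intensity envelopes (stylized, not real cry data):
# the French pattern rises toward the end, the German pattern falls.
t = np.linspace(0.0, 1.2, 500)            # time in seconds
french = 0.2 + 0.8 * (t / 1.2) ** 2       # low start, rising intensity
german = 0.2 + 0.8 * (1 - t / 1.2) ** 2   # high start, falling intensity

fig, (ax1, ax2) = plt.subplots(2, 1, sharex=True)
ax1.plot(t, french); ax1.set_title("French-style cry: rising intensity")
ax2.plot(t, german); ax2.set_title("German-style cry: falling intensity")
ax2.set_xlabel("time (s)")
for ax in (ax1, ax2):
    ax.set_ylabel("intensity (a.u.)")
plt.tight_layout()
plt.show()
```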

How can this be? How can babies learn the melodies and sounds of their mother tongue even before they are born? The answer is as simple as this: about 3 months before birth, while still in their mother’s womb, babies start to hear. At that time, their ears are developed enough and start working. Usually, it will mostly be the mother’s voice that reaches the baby’s ears inside the womb, but other loud sounds and voices get through as well. Consequently, every day of the last few months before birth, the baby can hear people speaking – this is the first step in language learning! This first step, in other words, is to learn the melody of the language. Later, during the next few months and years after birth, other features of language are added, like the meaning of words or the formation of full sentences.

As we have seen, the development of the baby and the baby’s organs provides important preconditions for speech and language. One such precondition is the development of the hearing system, which allows the baby to hear the sound of language from the womb. But the simultaneous development of the brain is just as important, because it is our brain that provides us with the ability to learn and to develop new skills. And it is from our brain that speech and language originate. Certain parts of the brain are responsible for understanding words and sentences. These brain areas are mainly located in two regions, in the left side of the brain, and are connected by nerves. Together, these brain regions and their connections form a network that provides the hardware for language in the brain. Without this brain network, we would not be able to talk or to understand what’s being said. Figure 2 illustrates this talkative mesh in the brain. The connections within this network are particularly important, because they allow the network nodes to exchange information.

  • Figure 2 - A view of a brain, as seen from the left side.
  • Two brain regions are highlighted in red and orange. These regions are strongly involved in processing speech and language. The blue and green lines illustrate connections that link the two regions with one another and form a network of language areas. There is an upper nerve connection (blue) and a lower nerve connection (green).

Now, let’s keep in mind how important this network is to fully master language. Given that the brain develops during infancy and childhood, we might wonder from what age onward the network of language areas is well enough established to serve as a sufficient precondition for speaking and understanding language. Is it that the network is there from a very early age on, and that the development of language depends only on learning from the input of the environment? Or is this network something that develops over time and provides a growing precondition that enables more and more language functions?

These questions can be answered by investigating the nerve fiber connections in the brain. The nerve fibers are a crucial part of the brain and form the language network. These nerve fibers can be visualized using a technique called magnetic resonance imaging (MRI). MRI is an imaging method that allows us to take pictures of someone’s brain inside their head, like an X-ray, but without any radiation. Instead, the magnetic properties of water (yes, water is slightly magnetic) are measured while the person lies inside a strong magnetic field created by the MR scanner.

Now, using this technique, a comparison between newborn infants and older children (for example 7-year-olds who are already going to school) would show if their brain networks are the same or not. If they are the same, this would mean that there are preconditions for language in the brain from birth onward. If they are different, this would mean that the brain’s preconditions for certain language functions probably are not yet fully established at birth, and that these preconditions grow as the babies get older. The brain networks for newborns and 7-year-olds are depicted in Figure 3.

  • Figure 3 - A view of the brains of newborn infants (left) and 7-year-old children (right).
  • The two important language regions of the brain are highlighted in red and orange (like in Figure 2). The technique of magnetic resonance imaging provides images of the nerve connections between the two language regions. The lower nerve [green (B)] connects these regions of the brain in both newborns and children. But the upper nerve between the language regions [blue (A)] is only observed in children, not yet in infants. However, infants already show a connection to a directly neighboring region [yellow (A)]. This means that the language network in infants is not fully established yet. The important connection by the upper nerve still has to develop. On the other hand, the network shown for children is already very similar to that of adults and shows two network connections, an upper one and a lower one.

As can be seen, for both newborn infants and children going to school, the network of brain connections between the language regions is, in general, established. For newborn infants, a basic connection (green in Figure 3B) within the language network can be used. This is an important precondition from a very early age on. But the important upper connection between the language regions (blue in Figure 3A) cannot be seen in newborns. However, they already possess a second upper connection (yellow in Figure 3A) that does not connect to the red language region directly, but to a region right next to it that helps to develop and increase speech and language abilities. This means that the full language network as it is used for language abilities by older children does not exist in newborn babies yet, but it also means that they already possess a basic network. It is probably true that the full language network is an important precondition in development that allows children to learn and use more advanced language skills.

As we have seen here, the development of the hearing system and the development of the brain language network provide crucial preconditions for infants to be able to develop and improve their language abilities. Although infants already possess an important groundwork for language acquisition, more advanced language learning becomes possible as the brain continues to develop during childhood. Different stages of the language network, as shown in Figure 3, demonstrate that the language network develops over time. The nerve fiber connections in the brain change throughout our lives. During infancy and childhood, they become more and more powerful in their ability to transmit information, and it is only when we reach our teenage years that many of these nerve fibers stop developing. When we get old, they slowly start to decline. For each age for which the networks are illustrated (for example, for newborn babies and for children, as in Figure 3), we only get a snapshot of a continuously changing system. And it is not only maturation and aging that influence these networks. For instance, a therapy that is supposed to cure a disease might also alter the brain. Everything we experience and learn can potentially impact the brain and the brain networks. In other words, with each lesson we learn at school, we change our brains!


Comments

Linda Crampton (author) from British Columbia, Canada on September 22, 2019:

I’m very sorry about your brother-in-law’s situation. I can’t help you, though. I’m a science writer, not a doctor. Even a doctor would have to be familiar with a patient’s specific case in order to answer your questions. I hope you find a physician who is able to answer the questions and help your brother-in-law.

Florence Njoku on September 22, 2019:

I have a brother-in-law who suffered a severe brain injury three years ago and has lost his power of speech since. He is also quadriplegic, although his spinal cord is intact.

I am wondering whether his Broca’s area was damaged as a result of the injury. He understands spoken words and has a strong grasping reflex. I consulted the best neurosurgeon at the time of his injury and was told nothing could be done for him. The hospital he was admitted to diagnosed severe brain damage and was querying brain death. I figure that if it had been brain death, he would not be able to breathe or last weeks without life support.


First and Second Language Learning in the Brain

Human language and its representation in the brain have fascinated scientists for many years. Progress has been made in identifying important language areas in the brain, such as Broca's and Wernicke's areas. Most research has focused on people who speak only one language. However, some researchers have become interested in the brains of people who have learned a second language. Many questions remain unanswered about the similarities and differences between first and second language learning, storage, and usage.

Substantial evidence suggests the existence of a critical learning period for first languages. A critical period for language is defined as the time period during which a person must be exposed to the spoken language in order to best learn the language. In most cases, if a person is not exposed to a language during the critical period, he or she will never be able to speak the language as well as someone who learned language normally. Although the person may be able to learn many vocabulary words, his or her syntax will probably never reach a normal level.

Children who have brain damage are often able to regain their language abilities with practice. Adults, however, who suffer damage to language areas are rarely able to achieve their previous language proficiency. This observation further supports the concept that there might be a difference between learning language in childhood and adulthood. Although it is generally believed that a critical period exists for a first language, it is not known if there is a similar critical period for a second language. For example, if your first language is English, when must you start to learn Spanish, Russian, or Mandarin to be able to learn it completely? What is it about the brain that allows -- and perhaps requires -- us to learn languages at a young age?


Right Brain vs. Left Brain: Language, Dominance, and Their Shared Roles


Broca and Wernicke

Back in the 19th century, European physicians Paul Broca and Carl Wernicke noticed that their patients, who had circumscribed lesions in the left hemisphere, showed a peculiar symptom profile. Broca’s patient was nicknamed Tan because that was one of the very few words that he could say, and he seemed to have little trouble understanding speech but had great difficulty producing it.

By contrast, Wernicke’s patients, whose brain damage was just a little bit below the area where Tan’s was, were able to produce speech, but when they did, it sounded like word salad: gibberish. They had trouble with meaning, whether it was understanding what someone else was saying or making sense when speaking themselves.


Dominant Hemispheres, Dominant Hands

Studying such groups of lesion patients, then, helped neuroscientists discover that certain cognitive functions like language are localized primarily to one side. This assignment of different functions to the two different hemispheres is known as lateralization. When it comes to our senses and actions, in particular, the two sides of the brain do process different sets of information.

The left side of space is represented primarily in the right hemisphere, and vice versa. This means that your right hand is controlled by the left side of your brain and what you see in your left visual field is projected to the back of the right side of your cerebral cortex. It’s this fact that some acting coaches may use to justify a technique designed to make a person more creative. Engage the left side of your body, they say, and you’ll tap into your creative right brain. But there are a few assumptions that such an explanation glosses over, most of which aren’t true.

Just like you have a dominant hand, most likely you also have a dominant hemisphere—or, because you have a dominant hemisphere, you also have a dominant hand. Neuroscientists are still working out the details of how related the lateralization of cognition and handedness might be and how they develop, and the story is getting more complicated by the day.

But for most of us who are right-handed, our left hemisphere is our dominant hemisphere, and the left side of our brain contains most of our language function.

People sometimes point to left-handed individuals, claiming that they’re more likely to be engaged in creative careers than right-handed ones, and this surely must be because they’re dominated by their right hemisphere. But if you’re left-handed, then there’s only about a 20% chance that your right hemisphere is your dominant hemisphere. There’s also a 20% chance that both of your hemispheres contain language function.

As for the rest of the lefties, they’ve still got about a 50%–60% chance that most of their language function is on the left side. So already, as you can see, even for language, arguably the star of lateralization, the story isn’t quite as neat as we would like it to be.

The Brain and Language

Even though many language functions rely on an intact left hemisphere, as Broca and Wernicke noted, the right hemisphere certainly participates in verbal communication. The right side is much better at deciphering prosody and accentuation, while the left is the home of the grammar police and the dictionary.

In both sets of patients, the physicians and their colleagues noticed that some aspects of speech were relatively preserved. Prosody, for example, the music of speech, seemed to be retained. In other words, their speech contained the appropriate ups and downs of conversation; you could hear the emotional intent in the melody of their speech.

But patients with right hemisphere damage can show deficits in prosody: They have trouble distinguishing and expressing modulations in speech. You might say that if there ever was an artistic side to language, surely it would be prosody. But many a great writer might take issue with that stance. So, what else does the right hemisphere do that the left does not and how do we know?

Left and Right Brain Roles

Tracking activity in the brains of healthy people while they’re engaging in different tasks, as we do with neuroimaging, provides another, more modern window into laterality. For example, neuroimaging studies suggest that the two hemispheres might play different roles in emotion processing, with the left hemisphere showing somewhat greater activation for happy or positive emotions and the right hemisphere showing more activity during negative emotional processing.

One particularly interesting insight that neuroimaging has given us is the finding that the white matter tracts, the wiring diagrams of the two hemispheres, are different. The wiring of the right hemisphere has been called more efficient because it has greater connectivity between regions.

Think of a city with a really good subway system, like Manhattan, making it easy to get from one side of the city to another. The left hemisphere, in contrast, seems to be more modular. It’s more like a cluster of little cities operating more independently, like Los Angeles, where every mini-city has its own municipal transport system and the cities themselves are not well connected. Santa Monica and Beverly Hills have different governing bodies, so it’s hard to get from one to the other using public transportation.
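The subway analogy can be made concrete with a toy graph measure. The sketch below is a loose illustration rather than real connectome data: a densely interconnected random network stands in for the well-connected city, three tight clusters joined by a couple of bridges stand in for the mini-cities, and global efficiency (the average inverse shortest-path length between node pairs) quantifies how easily signals travel.

```python
import networkx as nx

# Toy stand-ins, not real brain data: an integrated network (one
# well-connected "city") versus a modular one (dense clusters with
# only a couple of bridges between them).
integrated = nx.gnp_random_graph(n=30, p=0.4, seed=1)

modular = nx.disjoint_union_all([nx.complete_graph(10) for _ in range(3)])
modular.add_edges_from([(9, 10), (19, 20)])  # sparse bridges between clusters

# Global efficiency = mean of 1/shortest-path-length over all node pairs;
# higher means information can travel between regions more easily.
print("integrated network:", round(nx.global_efficiency(integrated), 3))
print("modular network:   ", round(nx.global_efficiency(modular), 3))
```

The integrated network scores higher: with many direct routes, most nodes are only a step or two apart, whereas traffic in the modular network keeps funneling through a few bridges.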

Left and Right Brain Communications

White matter connections in the brain, obtained with MRI tractography. (Image: Xavier Gigandet et al./Public domain)

It’s this wiring difference that might explain why the left hemisphere seems to contain regions that operate somewhat more independently and are more specialized than the regions in the right hemisphere, which are involved in more integrative processes like visuospatial tasks—recognizing faces or arranging a set of blocks to match a pattern, or paying attention to all of space, not just one side of it.

It’s easier to find specialization in the left hemisphere than in the right. Here, people looking for evidence of an association between the right brain and creativity, and between the left brain and logic, point to these wiring differences. If the right brain is more interconnected, does that make it easier for it to generate new ideas, finding new connections between remotely associated concepts?

Not so fast. Neuroimaging research has also shown us just how communicative the two hemispheres are. Unless that connection between them is physically severed, information is zipping across the hemispheres during the vast majority of tasks that we ask our brains to accomplish.

In many regions, signals pass from one hemisphere to the other more quickly than they do within a single hemisphere—that is, some signals from the left and right prefrontal cortex can be exchanged more efficiently than signals from the back to the front of the brain in the same hemisphere.

Common Questions About Right Brain vs. Left Brain

Common theory suggests that the left brain is needed for more logic-based skills such as learning a language and mathematics, while the right brain is needed for creative tasks such as art, as well as connecting to others on an emotional level.

Supposedly, people whose left brain is more dominant tend to be more logical and better at science and mathematics than those who are more right-brain dominant.

A commonly-held belief is that analytical people think more with the left side of their brain while artistic or intuitive people lean more heavily on the right side of their brain for support. In reality, though, the brain sends constant signals back and forth, and everyone is dependent on both sides of the brain.

There has long been a myth that people only use 10% of their brain. In fact, technology such as functional magnetic resonance imaging has revealed activity in the majority of the brain. Furthermore, damage to any part of the brain will impact a person’s cognitive abilities, therefore disproving the myth that we only use a small section of our brain.


Our brain is a filter

Our brain is also active when we discriminate relevant sounds from background noise. Our brain can filter out unwanted noise so that we can focus on what we are listening to. And researchers have found that brain activity is greater in the left half of the brain when we discriminate sounds from noise. In other words, the cocktail party effect occurs in the left side of our brain.

In the same way, our brain turns up the volume when we speak. When it comes to our own speech, there is a network of volume settings in the brain which can amplify the sounds we make.
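As a loose engineering analogy for this kind of filtering (not a model of what neurons actually do), the Python sketch below buries a tone in broadband noise and then applies a band-pass filter around the tone's frequency, improving the signal-to-noise ratio much as selective attention boosts one voice against background chatter. All parameter values are arbitrary choices for the demo.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

rng = np.random.default_rng(0)
fs = 16000                            # sample rate in Hz (arbitrary demo value)
t = np.arange(0, 1.0, 1 / fs)

target = np.sin(2 * np.pi * 440 * t)                    # the sound we "attend to"
mixture = target + rng.normal(scale=1.0, size=t.shape)  # buried in noise

# Band-pass filter around 440 Hz: a crude stand-in for focusing on one
# sound and suppressing the rest of the spectrum.
sos = butter(4, [400, 480], btype="bandpass", fs=fs, output="sos")
filtered = sosfiltfilt(sos, mixture)

def snr_db(sig, ref):
    """Signal-to-noise ratio of sig against the clean reference, in dB."""
    noise = sig - ref
    return 10 * np.log10(np.sum(ref ** 2) / np.sum(noise ** 2))

print(f"SNR before filtering: {snr_db(mixture, target):5.1f} dB")
print(f"SNR after filtering:  {snr_db(filtered, target):5.1f} dB")
```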


Lateralization

Human split-brain studies have helped develop knowledge about language and lateralization. In split-brain studies, the corpus callosum (a bundle of nerve fibers connecting the two brain hemispheres) is cut. These studies have shown that the left and the right brain hemispheres have specific language functions.

Left Hemisphere

Naming objects is one of the language-related functions of the left hemisphere. Objects placed in the right visual field are easily recognized by both normal subjects and split-brain subjects. However, unlike normal subjects, split-brain subjects cannot name objects located in the left visual field. This shows that naming identified objects is a language function carried out by the left hemisphere. Logic, critical thinking, and reasoning are also dominantly processed in the left hemisphere.

Right Hemisphere

People who have undergone split-brain surgery cannot verbally identify objects presented in the left visual field. However, they can identify these objects by means of their sense of touch. Some words can also be comprehended through the right hemisphere. The split-brain studies show that the figurative side and context of language are understood via the right hemisphere. In addition, the emotional expression of language is processed in the right hemisphere. Also, music stimulates the right hemisphere more than spoken words do.

Hemispheric lateralization is important in both the parallel processing and sharing of information. While the left hemisphere is mostly concentrated on the interpretation of information through logic and analysis, it coordinates with the right hemisphere which is focused on the interpretation of experience in its totality via synthesis.

Handedness

In terms of handedness, most people who are right-handed show left hemisphere language dominance. This means that they tend to be better at logical and critical thinking than at creative and expressive thinking. Contrary to common belief, most left-handed people do not show right hemisphere language dominance either; about 70% of left-handed people still demonstrate left hemisphere language dominance.


What areas of the brain relate to language and reading?

The human brain is a complex organ that has many different functions. It controls the body and receives, analyzes, and stores information.

The brain can be divided down the middle lengthwise into a right and a left hemisphere. Most of the areas responsible for speech, language processing, and reading are in the left hemisphere, and for this reason we will focus all of our descriptions and figures on the left side of the brain. Within each hemisphere, we find the following four brain lobes (see Figure 1).

    The frontal lobe is the largest and is responsible for controlling speech, reasoning, planning, regulating emotions, and consciousness.

    The parietal lobe processes sensory information and integrates input from the different senses.

    The occipital lobe, at the back of the brain, is devoted primarily to processing visual information.

    The temporal lobe processes auditory information and is important for memory and for understanding language.

In the 19th century, Paul Broca was exploring areas of the brain used for language and noticed a particular part of the brain that was impaired in a man whose speech became limited after a stroke. This area received more and more attention, and today we know that Broca's area, located here in the frontal lobe, is important for the organization, production, and manipulation of language and speech (Joseph, Noble, & Eden, 2001). Areas of the frontal lobe are also important for silent reading proficiency (Shaywitz et al., 2002).

Wernicke's area, long known to be important in understanding language (Joseph et al., 2001), is located in the temporal lobe. This region, identified by Carl Wernicke at about the same time and using the same methods as Broca, is critical in language processing and reading.

In addition, converging evidence suggests that two other systems, which process language within and between lobes, are important for reading (see Figure 2).

The first is the left parietotemporal system (Area A in Figure 2) that appears to be involved in word analysis – the conscious, effortful decoding of words (Shaywitz et al., 2002). This region is critical in the process of mapping letters and written words onto their sound correspondences – letter sounds and spoken words (Heim & Keil, 2004). This area is also important for comprehending written and spoken language (Joseph et al., 2001).
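To make the idea of mapping letters onto sound correspondences concrete, here is a deliberately simplified Python sketch: a small lookup table from letter patterns to phonemes, applied greedily to a written word. Real English decoding is far less regular than this, and both the table and the example words are illustrative assumptions, but the sketch captures the effortful, piece-by-piece analysis this system is thought to support.

```python
# Toy grapheme-to-phoneme decoding: longest-match lookup in a tiny,
# illustrative table. English is far messier than this.
CORRESPONDENCES = {
    "sh": "ʃ", "ch": "tʃ", "th": "θ", "ee": "iː", "oa": "əʊ",
    "b": "b", "c": "k", "d": "d", "f": "f", "g": "g", "h": "h",
    "i": "ɪ", "l": "l", "m": "m", "n": "n", "o": "ɒ", "p": "p",
    "r": "r", "s": "s", "t": "t", "a": "æ", "e": "e",
}

def decode(word):
    """Greedily map a written word to a phoneme sequence."""
    phonemes, i = [], 0
    while i < len(word):
        # Prefer two-letter patterns ("sh", "ee") over single letters.
        for size in (2, 1):
            chunk = word[i:i + size]
            if chunk in CORRESPONDENCES:
                phonemes.append(CORRESPONDENCES[chunk])
                i += size
                break
        else:
            i += 1  # skip letters the toy table doesn't know
    return " ".join(phonemes)

for w in ("ship", "sheet", "thin"):
    print(w, "->", decode(w))
```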

The second system that is important for reading is the left occipitotemporal area (Area B in Figure 2). This system seems to be involved in automatic, rapid access to whole words and is a critical area for skilled, fluent reading (Shaywitz et al., 2002, 2004).


The History of Language Processing Research

Scientists have been studying the relationship of language and speech for nearly 150 years. In 1861, while Abraham Lincoln was penning his famous inaugural address, French neurologist Pierre Paul Broca was busy discovering the parts of the brain behind Lincoln’s speech — the parts that handle language processing, comprehension, and speech production (along with controlling facial neurons).

What we now know as “Broca’s area” is located in the posterior inferior frontal gyrus. It’s where expressive language takes shape. Broca was the first person to associate the left hemisphere with language, which remains true for most of us today. (This can’t be said about every brain — it’s possible to have a language center on the right side, which is where the language loop lies in the brains of about 30 percent of left-handed people and approximately 10 percent of right-handers.)

Tucked in the back of Broca’s area is the pars triangularis, which is implicated in the semantics of language. When you stop to think about something someone’s said — a line in a poem, a jargon-heavy sentence — this is the part of your brain doing the heavy work. Because Broca studied patients who had various speech deficiencies, he also gave his name to “Broca’s aphasia,” or expressive aphasia, where patients often have right-sided weakness or paralysis of the arm and leg due to lesions to the medial insular cortex. (Another of Broca’s patients was a scientist who, after surgery, was missing Broca’s area. Though the scientist suffered minor language impediments, such as the inability to form complex sentences, his speech eventually recovered — which implied some neuroplasticity in terms of where language processing can take place.)

Ten years after Broca’s discoveries, German neurologist Carl Wernicke found that damage to Broca’s area wasn’t the only place in the brain that could cause a language deficit. In the superior posterior temporal lobe, Wernicke’s area acts as the Broca’s area counterpart, handling receptive language, or language that we hear and process.

The arcuate fasciculus links Broca’s area to Wernicke’s area. If you damage this bundle of nerves you’ll find yourself having some trouble repeating what other people say.

Wernicke was also the first person to create a neurological model of language, mapping out various language processes in the brain — speech-to-comprehension, cognition-to-speech, and writing-to-reading — a model that was updated in 1965 by Norman Geschwind. Much of modern neurology as it relates to language is modeled on the Wernicke-Geschwind model, although the model is somewhat outdated today — it gives a broad overview but contains some inaccuracies, including the idea that language processing happens in sequential order, rather than in various parts of the brain simultaneously, which is what we know today.

In the 1960s, Geschwind discovered that the inferior parietal lobule has something important to do with language processing. Now, thanks to much improved imaging technology, we know there’s another route through which language travels between Broca’s area and Wernicke’s area in the inferior parietal lobule. This region of the brain is all about language acquisition and abstract use of language. This is where we collect and consider spoken and written words — not just understanding their meanings, but also how they sound and work grammatically. This part of the brain helps us classify things using auditory, visual, and sensory stimuli; its late maturation might be why children usually don’t learn to read and write until they’re somewhere around the age of 5.

The fusiform gyrus, located on the underside of the temporal and occipital lobes, also plays an interesting role in language processing in the brain. This area helps you recognize words and classify things within other categories. Damage to this part of the brain can cause difficulty in recognizing words on the page.

Today, we’re constantly learning new things about how language works. For example, we believe that the right brain performs its fair share of language functions, including the ability to comprehend metaphors as well as patterns of intonation and poetic meters.

Whereas we used to believe that people who speak with signs used a different, more visually dependent model of language processing in the brain, we now believe that language happens similarly in verbal and nonverbal ways. As it turns out, the brains of deaf people function much the same way as their hearing counterparts: The same parts of the brain are activated while speaking, whether that’s by using signs or not. This research was presented in an issue of NeuroImage, and Dr. Karen Emmorey, a professor of speech language at San Diego State University, has presented new research at the American Association for the Advancement of Science in San Diego illustrating that the brain reacts to signs that are pantomimes — drinking, for example — in exactly the same way as if the word “drink” were spoken aloud.

“It suggests the brain is organized for language, not for speech,” Emmorey says.


Same Brain Spots Handle Sign Language and Speaking

Language is created in the same areas of the brain, regardless of whether a person speaks English or uses American Sign Language to communicate, new research found. The discovery suggests that something about language is universal and doesn't depend on whether people use their voices or their hands to talk.

Two centers in the brain – Broca's area, which is thought to be related to speech production, and Wernicke's area, which is associated with comprehending speech – have long been associated with verbal communication. But now scientists have found the brain areas might be tied to language, no matter whether it's spoken or signed.

Scientists suspected these areas might be particular to speaking, because they are located spatially near areas that are connected to moving the vocal cords, and to the auditory cortex, which is used to hear sounds. In that case, it stood to reason that deaf people who use American Sign Language (ASL) to communicate should use other brain areas to create language, such as parts located near the visual cortex, used for seeing.

But when researchers tested 29 deaf native ASL signers and 64 hearing native English speakers, they found no difference in the brain. They showed both groups pictures of objects, such as a cup or a parrot, and asked the subjects to either sign or speak the word, while a PET (Positron Emission Tomography) scanner measured changes in blood flow in the brain.

In both groups, Broca's and Wernicke's areas were equally active.
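As a toy illustration of the logic behind such a group comparison (not the study's actual data or analysis pipeline; every number below is made up), one can simulate regional activation values for two groups drawn from the same distribution and test whether their means differ.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(42)

# Simulated regional activation (arbitrary units) drawn from the same
# distribution for both groups: the "no difference" scenario the study
# reported for Broca's and Wernicke's areas.
deaf_signers = rng.normal(loc=1.0, scale=0.3, size=29)      # 29 ASL signers
hearing_speakers = rng.normal(loc=1.0, scale=0.3, size=64)  # 64 speakers

t_stat, p_value = ttest_ind(deaf_signers, hearing_speakers)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A large p-value means the simulated group means are statistically
# indistinguishable, mirroring the "equally active" finding.
```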

"It's the same whether the language is spoken or signed," said Karen Emmorey, a professor of speech language at San Diego State University. Emmorey described the work last week at the annual meeting of the American Association for the Advancement of Science in San Diego, Calif. The research was also detailed in a 2007 issue of the journal Neuroimage.

In a more recent study, which has not yet been published in a scientific journal, the scientists tested whether sign language taps into the same parts of the brain as charades. They wanted to figure out whether the brain regards sign language as more similar to spoken language, or more similar to making pantomime gestures to mimic an action.

The scientists showed both deaf people and hearing people pictures of objects, such as a broom or a bottle of syrup, and asked the subjects to "show how you would use this object." The charade gestures for pouring syrup and for sweeping with a broom are different from the signs for syrup and sweep, so the researchers could be sure the deaf participants were pantomiming and not signing.

Then they asked the deaf subjects to sign the verbs associated with particular objects, such as syrup or broom. The researchers found that the signers activated different parts of their brains when pantomiming versus when signing. Even when the sign is basically indistinguishable from the pantomime – when similar hand gestures are used – the brain treats it like language.

"The brain doesn't make a distinction," Emmorey said. "The fact that many signs are iconic doesn't change the neural underpinnings of language."

And the scans showed that the brain areas signers used when pantomiming were similar to the brain areas hearing participants used when pantomiming – both groups activated the superior parietal cortex, which is associated with grasping, rather than brain areas connected to language.

"It suggests the brain is organized for language, not for speech," Emmorey said.


What areas of the brain relate to language and reading?

The human brain is a complex organ that has many different functions. It controls the body and receives, analyzes, and stores information.

The brain can be divided down the middle lengthwise into a right and a left hemisphere. Most of the areas responsible for speech, language processing, and reading are in the left hemisphere, and for this reason we will focus all of our descriptions and figures on the left side of the brain. Within each hemisphere, we find the following four brain lobes (see Figure 1).

    The frontal lobe is the largest and responsible for controlling speech, reasoning, planning, regulating emotions, and consciousness.

In the 19th century, Paul Broca was exploring areas of the brain used for language and noticed a particular part of the brain that was impaired in a man whose speech became limited after a stroke. This area received more and more attention, and today we know that Broca's area, located here in the frontal lobe, is important for the organization, production, and manipulation of language and speech (Joseph, Noble, & Eden, 2001). Areas of the frontal lobe are also important for silent reading proficiency (Shaywitz et al., 2002).

Wernicke's area, long known to be important in understanding language (Joseph et al., 2001), is located here. This region, identified by Carl Wernicke at about the same time and using the same methods as Broca, is critical in language processing and reading.

In addition, converging evidence suggests that two other systems, which process language within and between lobes, are important for reading (see Figure 2).

The first is the left parietotemporal system (Area A in Figure 2) that appears to be involved in word analysis – the conscious, effortful decoding of words (Shaywitz et al., 2002). This region is critical in the process of mapping letters and written words onto their sound correspondences – letter sounds and spoken words (Heim & Keil, 2004). This area is also important for comprehending written and spoken language (Joseph et al., 2001).

The second system that is important for reading is the left occipitotemporal area (Area B in Figure 2). This system seems to be involved in automatic, rapid access to whole words and is a critical area for skilled, fluent reading (Shaywitz et al., 2002, 2004).


The History of Language Processing Research

Scientists have been studying the relationship of language and speech for nearly 150 years. In 1861, while Abraham Lincoln was penning his famous inauguration address, French neurologist Pierre Paul Broca was busy discovering the parts of the brain behind Lincoln’s speech — the parts that handle language processing, comprehension, and speech production (along with controlling facial neurons).

What we now know as “Broca’s area” is located in the posterior inferior frontal gyrus. It’s where expressive language takes shape. Broca was the first person to associate the left hemisphere with language, which remains true for most of us today. (This can’t be said about every brain — it’s possible to have a language center on the right side, which is where the language loop lies in the brains of about 30 percent of left-handed people and approximately 10 percent of right-handers.)

Tucked in the back of Broca’s area is the Pars triangularis, which is implicated in the semantics of language. When you stop to think about something someone’s said — a line in a poem, a jargon-heavy sentence — this is the part of your brain doing the heavy work. Because Broca studied patients who had various speech deficiencies, he also gave his name to “Broca’s aphasia,” or expressive aphasia, where patients often have right-sided weakness or paralysis of the arm and leg due to lesions to the medial insular cortex. (Another of Broca’s patients was a scientist who, after surgery, was missing Broca’s area. Though the scientist suffered minor language impediments, such as the inability to form complex sentences, his speech eventually recovered — which implied some neuroplacticity in terms of where language processing can take place.)

Ten years after Broca’s discoveries, German neurologist Carl Wernicke found that damage to Broca’s area wasn’t the only place in the brain that could cause a language deficit. In the superior posterior temporal lobe, Wernicke’s area acts as the Broca’s area counterpart, handling receptive language, or language that we hear and process.

The arcuate fasciculus links Broca’s area to Wernicke’s area. If you damage this bundle of nerves you’ll find yourself having some trouble repeating what other people say.

Wernicke was also the first person to create a neurological model of language, mapping out various language processes in the brain — speech-to-comprehension, cognition-to-speech, and writing-to-reading — a model that was updated in 1965 by Norman Geschwind. Much of modern neurology as it relates to language is modeled on the Wernicke-Geschwind model, although the model is somewhat outdated today — it gives a broad overview but contains some inaccuracies, including the idea that language processing happens in sequential order, rather than in various parts of the brain simultaneously, which is what know today.

In the 1960s, Geschwind discovered that the inferior parietal lobule has something important to do with language processing. Now, thanks to much improved imaging technology, we know there’s another route through which language travels between Broca’s area and Wernicke’s area in the inferior parietal lobule. This region of the brain is all about language acquisition and abstract use of language. This is where we collect and consider spoken and written words — not just understanding their meanings, but also how they sound and work grammatically. This part of the brain helps us classify things using auditory, visual, and sensory stimuli its late maturation might be why children usually don’t learn to read and write until they’re somewhere around the age of 5.

The fusiform gyrus is also in the frontal lobe, and also plays an interesting role in language processing in the brain. This area helps you recognize words and classify things within other categories. Damage to this part of the brain can cause difficulty in recognizing words on the page.

Today, we’re constantly learning new things about how language works. For example, we believe that the right brain performs its fair share of language functions, including the ability to comprehend metaphors as well as patterns of intonation and poetic meters.

Whereas we used to believe that people who speak with signs used a different, more visually dependent model of language processing in the brain, we now believe that language happens similarly in verbal and nonverbal ways. As it turns out, the brains of deaf people function much the same way as their hearing counterparts: The same parts of the brain are activated while speaking, whether that’s by using signs or not. This research was presented in an issue of NeuroImage, and Dr. Karen Emmorey, a professor of speech language at San Diego State University, has presented new research at the American Association for the Advancement of Science in San Diego illustrating that the brain reacts to signs that are pantomimes — drinking, for example — in exactly the same way as if the word “drink” were spoken aloud.

“It suggests the brain is organized for language, not for speech,” Emmorey says.


Same Brain Spots Handle Sign Language and Speaking

Language is created in the same areas of the brain, regardless of whether a person speaks English or uses American Sign Language to communicate, new research found. The discovery suggests that something about language is universal and doesn't depend on whether people use their voices or their hands to talk.

Two centers in the brain – Broca's area, which is thought to be related to speech production, and Wernicke's area, which is associated with comprehending speech – have long been associated with verbal communication. But now scientists have found the brain areas might be tied to language, no matter whether it's spoken or signed.

Scientists suspected these areas might be particular to speaking, because they are located spatially near areas that are connected to moving the vocal chords, and to the auditory cortex, which is used to hear sounds. In that case, it stood to reason that deaf people who use American Sign Language (ASL) to communicate should use other brain areas to create language, such as parts located near the visual cortex, used for seeing.

But when researchers tested 29 deaf native ASL signers and 64 hearing native English speakers, they found no difference in the brain. They showed both groups pictures of objects, such as a cup or a parrot, and asked the subjects to either sign or speak the word, while a PET (Positron Emission Tomography) scanner measured changes in blood flow in the brain.

In both groups, Broca's and Wernicke's areas were equally active.

"It's the same whether the language is spoken or signed," said Karen Emmorey, a professor of speech language at San Diego State University. Emmorey described the work last week at the annual meeting of the American Association for the Advancement of Science in San Diego, Calif. The research was also detailed in a 2007 issue of the journal Neuroimage.

In a more recent study, which has not yet been published in a scientific journal, the scientists tested whether sign language taps into the same parts of the brain as charades. They wanted to figure out whether the brain regards sign language as more similar to spoken language, or more similar to making pantomime gestures to mimic an action.

The scientists showed both deaf people and hearing people pictures of objects, such as a broom or a bottle of syrup, and asked the subjects to "show how you would use this object." The charade gestures for pouring syrup and for sweeping with a broom are different from the signs for syrup and sweep, so the researchers could be sure the deaf participants were pantomiming and not signing.

Then they asked the deaf subjects to sign the verbs associated with particular objects, such as syrup or broom. The researchers found that the signers activated different parts of their brains when pantomiming versus when signing. Even when the sign is basically indistinguishable from the pantomime – when similar hand gestures are used – the brain treats it like language.

"The brain doesn't make a distinction," Emmorey said. "The fact that many signs are iconic doesn't change the neural underpinnings of language."

And the scans showed that the brain areas signers used when pantomiming were similar to the brain areas hearing participants used when pantomiming – both groups activated the superior parietal cortex, which is associated with grasping, rather than brain areas connected to language.

"It suggests the brain is organized for language, not for speech," Emmorey said.


Lateralization

Human split-brain studies have helped develop knowledge about language and lateralization. In split-brain studies, the cutting of the corpus callosum (a group of nerve fibers connecting the two brain hemispheres) is cut. These studies have proven that the left and the right brain hemispheres have specific language functions.

Left Hemisphere

Naming objects is one of the language-related functions of the left hemisphere. Objects placed in the right visual field are easily recognized by both normal people and split-brain subjects. However, split-brain subjects cannot identify objects located in the left visual field unlike normal subjects. This proves that the only known language function of the left-hemisphere is to name identified objects. Logic, critical thinking and reasoning are also functions that are dominantly processed in the left hemisphere.

Right Hemisphere

Verbal identification of objects presented in the left visual field cannot be identified by people who have undergone the split-brain surgery. However, they can identify these objects by means of their sense o touch. Some words can also be comprehended through the right hemisphere. The split-brain studies show that the figurative sides and context of language are understood via the right hemisphere. In addition, the emotional expression of language is processed in the right hemisphere. Also, music stimulates the right hemisphere more than spoken words do.

Hemispheric lateralization is important in both the parallel processing and sharing of information. While the left hemisphere is mostly concentrated on the interpretation of information through logic and analysis, it coordinates with the right hemisphere which is focused on the interpretation of experience in its totality via synthesis.

Handedness

In terms of handedness, most people who are right-handed reveal left hemisphere language dominance. This means that they tend to be better in logical and critical thinking than in creative and expressive thinking. Most left-handed people do not show right-handed language dominance, as common belief tells us. Rather, 70% of left-handed people still demonstrate left hemisphere language dominance.


The Brain and Language: How Our Brains Communicate

I am a neuropsychologist and scientist from Germany. I am studying how our brains grow and develop and how this is related to language development. I also like to spend time in nature, enjoying things like hiking or climbing.

Young Reviewers

Rayleigh

I like gardening. I live in a small farm in Washington State. I have two chickens, two cats, and one dog. I am in the fourth grade and love animals (as you can see). I get a lot of reviewing experience in class. I want to help young people to realize that science is fun, interesting, and understandable.

In this article, we will show what our brains do when we listen to someone talking to us. Most particularly, we will show how the brains of infants and children are tuned to understand language, and how changes in the brain during development serve as preconditions for language learning. Understanding language is a process that involves at least two important brain regions, which need to work together in order to make it happen. This would be impossible without connections that allow these brain regions to exchange information. The nerve fibers that make up these connections develop and change during infancy and childhood and provide a growing underpinning for the ability to understand and use language.

We humans are very social and chatty beings. As soon as we are born, we learn to communicate with our environment. Growing up, we love to talk to our friends, to our families, and to strangers. We exchange our thoughts and feelings and are eager to learn about the thoughts and feelings of others. This includes direct spoken statements by face-to-face interactions, on the telephone, or via Skype, but also written messages via post-it notes, text messages, Facebook, or Twitter. All of these types of communication require language in order to transport a message from one person to the other. Language is used by young children from a very early age on, and their language abilities develop quickly. But how do humans learn to understand language, and what are the first steps in language acquisition? How does language develop from a baby to a child and so on? Are there certain preconditions in our brains that support language? And, most importantly, why is language important? Well, the ability to understand and produce language is a huge advantage for us because it allows us to exchange information very quickly and accurately. It even allows us to pass on this information over centuries, when we write it down and preserve it. The bible, for instance, contains texts that were written many hundreds of years ago, and we can still read it. When we talk, we can talk about things that are right in front of us, or about things that are far away, things that exist, have existed or will exist, or even things that never existed in the real world and will never be. We can even talk about talking itself, or write articles that try to teach us how this tremendous ability is made possible by our brains. Because this is the place where our words come from when we speak, and also where they go to when someone else talks to us. The language ability is one of the most amazing abilities that we have.

When babies are born, they cannot talk or understand words. A baby’s communication is generally basic and non-verbal. Babies are not born with speech or language; this is something they learn from their interactions with others. Within the first year of life, babies say their first words, and they can soon speak full sentences. After only 2–3 years, babies are already quite good at verbal communication and are able to say what they want. This fast progress in language abilities is probably aided by genetic conditions that favor fast language learning.

However, it is interesting to think that a baby has already taken the first steps in language development even before birth [ 1 ]. This sounds impossible when we know that language needs to be learnt and does not happen automatically, unlike breathing or sleeping. But babies are actually born knowing the sound and melody of their mother tongue – and they can already “speak” by following the melodic pattern of the language. Of course, this “speaking” does not involve words, and the sound made by newborn babies is often that of crying. But this crying follows a certain melody. You might think that all babies sound similar when they cry, but when a group of German and French scientists investigated the crying sounds of German and French newborn babies [ 2 ], they discovered that the cries were actually different! As you can see in Figure 1, French babies show a cry melody with low intensity at the beginning, which then rises. German babies, on the other hand, show a cry melody with high intensity at the beginning, which then falls. These findings become even more interesting when you know that these cry melodies resemble the melodies of the two languages when people speak French or German: German is, like English, a language that stresses words at the beginning, while French stresses words toward their endings. To give an example, the German word for daddy is “papa,” with the stress on the first syllable: PApa. The French word for daddy is “papa,” with the stress on the last syllable: paPA. The surprising thing is that the cry melodies of French and German newborn babies follow these speech stress patterns!

  • Figure 1 - The sound patterns of babies’ cries.
  • The two diagrams show the intensity of the sound as a black curve over time, within a 1.2-second window. The wider the curve (i.e., the higher the amplitude), the more intense the sound. The upper diagram shows the sound pattern of a typical cry of a French newborn baby: the cry’s highest intensity is at the end (rising from left to right). The lower diagram shows the sound pattern of a typical cry of a German newborn baby: here, unlike in the French example, the cry is more intense at the beginning (falling from left to right). These two different crying melodies are similar to the sounds of the two languages, French and German, which appear to be learnt already before birth.
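To make the rising-versus-falling description concrete, here is a minimal sketch of how one could classify an intensity curve like the ones in Figure 1. This is not the study’s analysis; the synthetic signals, the sample rate, and the helper function loudness_shape are all invented for illustration.

```python
# Classify a toy "cry" as rising or falling by comparing the average loudness
# of its first and second halves. All data here is synthetic, for illustration.
import numpy as np

fs = 4000                      # sample rate in Hz (assumed)
t = np.arange(0, 1.2, 1 / fs)  # a 1.2-second window, as in Figure 1

def loudness_shape(envelope):
    """Return 'rising' or 'falling' for an amplitude envelope."""
    half = envelope.size // 2
    return "rising" if envelope[half:].mean() > envelope[:half].mean() else "falling"

# French-style toy cry: quiet start, loud end. German-style: the reverse.
carrier = np.abs(np.sin(2 * np.pi * 400 * t))         # stand-in for the cry sound
french_cry = np.linspace(0.2, 1.0, t.size) * carrier
german_cry = np.linspace(1.0, 0.2, t.size) * carrier

print("French-style cry:", loudness_shape(french_cry))  # -> rising
print("German-style cry:", loudness_shape(german_cry))  # -> falling
```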

How can this be? How can babies learn the melodies and sounds of their mother tongue even before they are born? The answer is simple: about 3 months before birth, while still in their mother’s womb, babies start to hear. At that time, their ears are developed enough and start working. It is usually the mother’s voice that reaches the baby’s ears inside the womb, but other loud sounds and voices reach them as well. Consequently, every day in the last few months before birth, the baby can hear people speaking – this is the first step in language learning! This first step, in other words, is to learn the melody of the language. Later, during the months and years after birth, other features of language are added, like the meaning of words or the formation of full sentences.

As we have seen, the development of the baby and the baby’s organs provides important preconditions for speech and language. One of these is the development of the hearing system, which allows the baby to hear the sound of language while still in the womb. But the simultaneous development of the brain is just as important, because it is our brain that provides us with the ability to learn and to develop new skills, and it is from our brain that speech and language originate. Certain parts of the brain are responsible for understanding words and sentences. These brain areas are mainly located in two regions in the left side of the brain, and they are connected by nerve fibers. Together, these brain regions and their connections form a network that provides the hardware for language in the brain. Without this brain network, we would not be able to talk or to understand what’s being said. Figure 2 illustrates this “talkative” network in the brain. The connections within this network are particularly important, because they allow the network nodes to exchange information.

  • Figure 2 - A view of a brain, as seen from the left side.
  • Two brain regions are highlighted in red and orange. These regions are strongly involved in processing speech and language. The blue and green lines illustrate connections that link the two regions with one another and form a network of language areas. There is an upper nerve connection (blue) and a lower nerve connection (green).

Now, let’s keep in mind how important this network is for fully mastering language. Given that the brain develops during infancy and childhood, we might wonder from what age onward the network of language areas is well enough established to serve as a sufficient precondition for speaking and understanding language. Is the network there from a very early age on, with the development of language depending only on learning from the input of the environment? Or does this network develop over time, providing a growing precondition that enables more and more language functions?

These questions can be answered by investigating the nerve fiber connections in the brain. The nerve fibers are a crucial part of the brain and form the language network. These nerve fibers can be visualized using a technique called magnetic resonance imaging (MRI). MRI is an imaging method that allows us to take pictures of someone’s brain inside their head, similar to an X-ray but without using any radiation. Instead, the magnetic properties of water (yes, water is slightly magnetic) are exploited while the person lies inside the strong magnetic field created by the MR scanner.

Now, using this technique, a comparison between newborn infants and older children (for example 7-year-olds who are already going to school) would show if their brain networks are the same or not. If they are the same, this would mean that there are preconditions for language in the brain from birth onward. If they are different, this would mean that the brain’s preconditions for certain language functions probably are not yet fully established at birth, and that these preconditions grow as the babies get older. The brain networks for newborns and 7-year-olds are depicted in Figure 3.

  • Figure 3 - A view of the brains of newborn infants (left) and 7-year-old children (right).
  • The two important language regions of the brain are highlighted in red and orange (as in Figure 2). Magnetic resonance imaging provides images of the nerve connections between the two language regions. The lower nerve [green (B)] connects these regions of the brain in both newborns and children. But the upper nerve between the language regions [blue (A)] is only observed in children, not yet in infants. However, infants already show a connection to a directly neighboring region [yellow (A)]. This means that the language network in infants is not fully established yet; the important connection formed by the upper nerve still has to develop. The network shown for children, on the other hand, is already very similar to that of adults and shows two network connections, an upper one and a lower one.

As can be seen, for both newborn infants and school-age children, the network of brain connections between the language regions is, in general, established. Newborn infants can already use a basic connection (green in Figure 3B) within the language network; this is an important precondition from a very early age on. But the important upper connection between the language regions (blue in Figure 3A) cannot be seen in newborns. They do, however, already possess a second upper connection (yellow in Figure 3A) that does not connect to the red language region directly, but to a region right next to it that helps speech and language abilities develop. This means that the full language network, as used by older children, does not exist in newborn babies yet, but it also means that they already possess a basic network. The full language network is probably an important precondition that allows children to learn and use more advanced language skills as they develop.

As we have seen here, the development of the hearing system and the development of the brain’s language network provide crucial preconditions for infants to develop and improve their language abilities. Although infants already possess an important groundwork for language acquisition, more advanced language learning becomes possible as the brain continues to develop during childhood. The different stages of the language network shown in Figure 3 demonstrate that the language network develops over time. The nerve fiber connections in the brain change throughout our lives. During infancy and childhood, they become more and more powerful in their ability to transmit information, and it is only when we reach our teenage years that many of these nerve fibers stop developing. When we get old, they slowly start to decline. For each age for which the networks are illustrated (for example, for newborn babies and for children, as in Figure 3), we only get a snapshot of a continuously changing system. And it is not only maturation and aging that influence these networks. A therapy that is meant to cure a disease, for instance, might also alter the brain. Everything we experience and learn can potentially impact the brain and its networks. In other words, with each lesson we learn at school, we change our brains!


Our brain is a filter

Our brain is also active when we separate relevant sounds from background noise. It can filter out unwanted noise so that we can focus on what we are listening to, and researchers have found that brain activity is greater in the left half of the brain when we pick sounds out of noise. In other words, the so-called cocktail party effect takes place mainly in the left side of our brain.
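As a loose engineering analogy (only an analogy; the brain does not literally run a digital filter), pulling a relevant signal out of background noise can be sketched with a band-pass filter. Everything in this snippet, including the 220 Hz "voice" tone and the 100-400 Hz pass band, is made up for illustration.

```python
# Toy analogy for the brain's noise filtering: keep a "voice-like" frequency
# band and suppress broadband noise. All signals and cutoffs are invented.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 8000                      # sample rate in Hz
t = np.arange(0, 1.0, 1 / fs)  # one second of audio

voice = np.sin(2 * np.pi * 220 * t)                         # pretend speech tone
noise = 0.8 * np.random.default_rng(0).normal(size=t.size)  # background babble

# 4th-order Butterworth band-pass around the band where the "voice" lives.
b, a = butter(N=4, Wn=[100, 400], btype="bandpass", fs=fs)
cleaned = filtfilt(b, a, voice + noise)

# The filtered signal tracks the voice far better than the raw mixture does.
print("mixture vs. voice: ", np.corrcoef(voice, voice + noise)[0, 1])
print("filtered vs. voice:", np.corrcoef(voice, cleaned)[0, 1])
```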

In a similar way, our brain turns up the volume when we speak: when it comes to our own speech, a network of “volume settings” in the brain can amplify the sounds we make.


Right Brain vs. Left Brain: Language, Dominance, and Their Shared Roles


Broca and Wernicke

Back in the 19th century, European physicians Paul Broca and Carl Wernicke noticed that their patients, who had circumscribed lesions in the left hemisphere, showed a peculiar symptom profile. Broca’s patient was nicknamed Tan because that was one of the very few words that he could say, and he seemed to have little trouble understanding speech but had great difficulty producing it.

By contrast, Wernicke’s patients, whose brain damage lay just a little below and behind the area where Tan’s was, were able to produce speech, but when they did, it sounded like word salad—gibberish. They had trouble with speech comprehension, whether it was understanding what someone else was saying or monitoring what they said themselves.


Dominant Hemispheres, Dominant Hands

Studying such groups of lesion patients, then, helped neuroscientists discover that certain cognitive functions like language are localized primarily to one side. This assignment of different functions to the two different hemispheres is known as lateralization. When it comes to our senses and actions, in particular, the two sides of the brain do process different sets of information.

The left side of space is represented primarily in the right hemisphere, and vice versa. This means that your right hand is controlled by the left side of your brain and what you see in your left visual field is projected to the back of the right side of your cerebral cortex. It’s this fact that some acting coaches may use to justify a technique designed to make a person more creative. Engage the left side of your body, they say, and you’ll tap into your creative right brain. But there are a few assumptions that such an explanation glosses over, most of which aren’t true.

Just like you have a dominant hand, most likely you also have a dominant hemisphere—or, because you have a dominant hemisphere, you also have a dominant hand. Neuroscientists are still working out the details of how related the lateralization of cognition and handedness might be and how they develop, and the story is getting more complicated by the day.

But for most of us who are right-handed, our left hemisphere is our dominant hemisphere, and the left side of our brain contains most of our language function.

People sometimes point to left-handed individuals, claiming that they’re more likely to be engaged in creative careers than right-handed ones, and this surely must be because they’re dominated by their right hemisphere. But if you’re left-handed, then there’s only about a 20% chance that your right hemisphere is your dominant hemisphere. There’s also a 20% chance that both of your hemispheres contain language function.

As for the rest of the lefties, they’ve still got about a 50%–60% chance that most of their language function is on the left side. So already, as you can see, even for language, arguably the star of lateralization, the story isn’t quite as neat as we would like it to be.

The Brain and Language

Even though many language functions rely on an intact left hemisphere, as Broca and Wernicke noted, the right hemisphere certainly participates in verbal communication. The right side is much better at deciphering prosody and accentuation, while the left is the home of the grammar police and the dictionary.

For example, in both sets of patients, the physicians and their colleagues noticed that some aspects of speech were relatively preserved. Prosody, for example—the music of speech—seemed to be retained. In other words, their speech contained the appropriate ups and downs of conversation; you could hear the emotional intent in the melody of their speech.

But patients with right hemisphere damage can show deficits in prosody: They have trouble distinguishing and expressing modulations in speech. You might say that if there ever was an artistic side to language, surely it would be prosody. But many a great writer might take issue with that stance. So, what else does the right hemisphere do that the left does not and how do we know?

Left and Right Brain Roles

Tracking activity in the brains of healthy people while they’re engaging in different tasks, as we do with neuroimaging, provides another, more modern window into laterality. For example, neuroimaging studies suggest that the two hemispheres might play different roles in emotion processing, with the left hemisphere showing somewhat greater activation for happy or positive emotions and the right hemisphere showing more activity during negative emotional processing.

One particularly interesting insight that neuroimaging has given us is the finding that the white matter tracts, or wiring diagrams, of the two hemispheres are different. The wiring of the right hemisphere has been called more efficient because it has greater connectivity between regions.

Think of a city with a really good subway system, like Manhattan, making it easy to get from one side of the city to another. The left hemisphere, in contrast, seems to be more modular. It’s more like a cluster of little cities operating more independently, like Los Angeles, where every mini-city has its own municipal transport system and the cities themselves are not well connected. Santa Monica and Beverly Hills have different governing bodies, so it’s hard to get from one to the other using public transportation.
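The subway intuition can be made concrete with graph theory, where it corresponds to global efficiency: the average inverse shortest-path length between all pairs of nodes. The sketch below is a toy comparison, not real connectome data; both graphs and all their parameters are invented for illustration.

```python
# Toy comparison of "integrated" vs. "modular" wiring using global efficiency
# (average inverse shortest-path length). Both graphs are invented stand-ins
# for the right- and left-hemisphere wiring styles described above.
import networkx as nx

# Manhattan-style network: one well-connected community with shortcut links.
integrated = nx.connected_watts_strogatz_graph(n=30, k=6, p=0.3, seed=1)

# Los-Angeles-style network: tight clusters with very few links between them.
modular = nx.planted_partition_graph(l=5, k=6, p_in=0.9, p_out=0.02, seed=1)

# Higher global efficiency means any node can reach any other in fewer hops.
print("integrated:", round(nx.global_efficiency(integrated), 3))
print("modular:   ", round(nx.global_efficiency(modular), 3))
```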

Left and Right Brain Communications

White matter connections in the brain, obtained with MRI tractography. (Image: Xavier Gigandet et al., public domain)

It’s this wiring difference that might explain why the left hemisphere seems to contain regions that operate somewhat more independently and are more specialized than the regions in the right hemisphere, which are involved in more integrative processes like visuospatial tasks—recognizing faces or arranging a set of blocks to match a pattern, or paying attention to all of space, not just one side of it.

It’s easier to find specialization in the left hemisphere than in the right. People looking for evidence of an association between the right brain and creativity, and the left brain and logic, point to these wiring differences: if the right brain is more interconnected, does that make it easier for it to generate new ideas by finding new connections between remotely associated concepts?

Not so fast. Neuroimaging research has also shown us just how communicative the two hemispheres are. Unless that connection between them is physically severed, information is zipping across the hemispheres during the vast majority of tasks that we ask our brains to accomplish.

In many regions, signals pass from one hemisphere to the other more quickly than they do within a single hemisphere—that is, some signals from the left and right prefrontal cortex can be exchanged more efficiently than signals from the back to the front of the brain in the same hemisphere.

Common Questions About Right Brain vs. Left Brain

Common theory suggests that the left brain is needed for more logic-based skills such as learning a language and mathematics, while the right brain is needed for creative tasks such as art, as well as connecting to others on an emotional level.

Supposedly, people whose left brain is more dominant tend to be more logical and better at science and mathematics than those who are more right-brain dominant.

A commonly-held belief is that analytical people think more with the left side of their brain while artistic or intuitive people lean more heavily on the right side of their brain for support. In reality, though, the brain sends constant signals back and forth, and everyone is dependent on both sides of the brain.

There has long been a myth that people only use 10% of their brain. In fact, technology such as functional magnetic resonance imaging has revealed activity in the majority of the brain. Furthermore, damage to any part of the brain will impact a person’s cognitive abilities, therefore disproving the myth that we only use a small section of our brain.


Comments

Linda Crampton (author) from British Columbia, Canada on September 22, 2019:

I'm very sorry about your brother-in-law's situation. I can't help you, though. I'm a science writer, not a doctor. Even a doctor would have to be familiar with a patient's specific case in order to answer your questions. I hope you find a physician who is able to answer the questions and help your brother-in-law.

Florence Njoku on September 22, 2019:

I have a brother-in-law who suffered a severe brain injury three years ago. He has lost his power of speech since then and is quadriplegic as well, although his spinal cord is intact.

I am wondering whether his Broca's area was damaged as a result of the injury. He understands spoken words and has a strong grasping reflex. I consulted the best neurosurgeon at the time of his injury and was told nothing could be done for him. The hospital he was admitted to diagnosed severe brain damage and was querying brain death. I guess that if it were brain death he would be dead and would not be able to breathe or last weeks without life support.


Are the language and sound centres of the brain in the same area?

Human language and its representation in the brain have fascinated scientists for many years. Progress has been made in identifying important language areas in the brain, such as Broca's and Wernicke's areas. Most research has focused on people who speak only one language. However, some researchers have become interested in the brains of people who have learned a second language. Many questions remain unanswered about the similarities and differences between first and second language learning, storage, and usage.

Substantial evidence suggests the existence of a critical learning period for first languages. A critical period for language is defined as the time period during which a person must be exposed to the spoken language in order to best learn the language. In most cases, if a person is not exposed to a language during the critical period, he or she will never be able to speak the language as well as someone who learned language normally. Although the person may be able to learn many vocabulary words, his or her syntax will probably never reach a normal level.

Children who suffer brain damage are often able to regain their language abilities with practice. Adults who suffer damage to language areas, however, are rarely able to recover their previous language proficiency. This observation further supports the idea that there might be a difference between learning language in childhood and in adulthood. Although it is generally believed that a critical period exists for a first language, it is not known whether there is a similar critical period for a second language. For example, if your first language is English, when must you start to learn Spanish, Russian, or Mandarin in order to learn it completely? What is it about the brain that allows us, and perhaps requires us, to learn languages at a young age?



