Benefits of word repetition to infants

Repeat after me! Parents who repeat words to 7-month-olds have toddlers with larger vocabularies

Date:
September 21, 2015
Source:
University of Maryland
Summary:
 New research suggests that young infants benefit from hearing words repeated by their parents. With this knowledge, parents may make conscious communication choices that could pay off in their babies’ toddler years and beyond.

“Parents who repeat words more often to their infants have children with better language skills a year and a half later,” said co-author Rochelle Newman, professor and chair of UMD’s Department of Hearing and Speech Sciences (HESP). “A lot of recent focus has been on simply talking more to your child — but how you talk to your child matters. It isn’t just about the number of words.”

Newman and co-authors HESP Professor Nan Bernstein Ratner and Harvard Associate Professor of Education Meredith L. Rowe tracked maternal child-directed speech to prelinguistic (7-month-old) infants. They measured the infants’ ability to process language at 7 months and, later, the children’s vocabulary outcomes at age 2. They found that the toddlers with stronger language outcomes differed from their peers in two ways: their parents had repeated words more often, and the children had been more tuned in to language as infants and thus better able to process what was being said.

“It takes two to tango,” said Dr. Ratner. “Both the child and the parent play a role in the child’s later language outcomes, and our study is the first to show that.”

The researchers believe their findings will be of immediate use to families. While it is well documented that parents naturally speak more slowly and in a specialized “sing-song” tone to their children, the findings from this study may encourage parents to be more conscious of repeating words to maximize the benefits for language development.

“It is the quality of the input that matters most, not just the quantity,” said Dr. Rowe.

This new study builds on a growing body of research from HESP exploring infant language development. Professor Newman and two of her then-graduate students recently published “Look at the gato! Code-switching in speech to toddlers” in the Journal of Child Language. That study examined the phenomenon of “code-switching,” wherein adults who speak more than one language “mix” those languages when speaking to their children. Parents are often told that this type of language mixing is bad for children, but Professor Newman and her colleagues found that code-switching has no impact on children’s vocabulary development.

“Input and uptake at 7 months predicts toddler vocabulary: the role of child-directed speech and infant processing skills in language development” appears online, in advance of its upcoming publication in the Journal of Child Language.


Story Source:

The above post is reprinted from materials provided by University of Maryland. Note: Materials may be edited for content and length.


Journal Reference:

  1. Rochelle S. Newman, Meredith L. Rowe, Nan Bernstein Ratner. Input and uptake at 7 months predicts toddler vocabulary: the role of child-directed speech and infant processing skills in language development. Journal of Child Language, 2015; 1 DOI: 10.1017/S0305000915000446

Brain Structure of Infants Predicts Language Skills at One Year

Jan. 22, 2013 — Using a brain-imaging technique that examines the entire infant brain, researchers have found that the anatomy of certain brain areas – the hippocampus and cerebellum – can predict children’s language abilities at 1 year of age.

The University of Washington study is the first to associate these brain structures with future language skills. The results are published in the January issue of the journal Brain and Language.

“The brain of the baby holds an infinite number of secrets just waiting to be uncovered, and these discoveries will show us why infants learn languages like sponges, far surpassing our skills as adults,” said co-author Patricia Kuhl, co-director of the UW’s Institute for Learning & Brain Sciences.

Children’s language skills soar after they reach their first birthdays, but little is known about how infants’ early brain development seeds that path. Identifying which brain areas are related to early language learning could provide a first glimpse of development going awry, allowing for treatments to begin earlier.

“Infancy may be the most important phase of postnatal brain development in humans,” said Dilara Deniz Can, lead author and a UW postdoctoral researcher. “Our results showing brain structures linked to later language ability in typically developing infants is a first step toward examining links to brain and behavior in young children with linguistic, psychological and social delays.”

In the study, the researchers used magnetic resonance imaging to measure the brain structure of 19 boys and girls at 7 months of age. The researchers used a measurement called voxel-based morphometry to determine the concentration of gray matter, consisting of nerve cells, and of white matter, which makes up the network of connections throughout the brain.

The study is the first to use this whole-brain imaging technique to predict future language ability in infants. The whole-brain approach freed the researchers from having to select a few brain regions for study ahead of time, regions scientists might have expected to be involved based on adult data.

Five months later, when the children were about 1 year old, they returned to the lab for a language test. This test included measures of the children’s babbling, their recognition of familiar names and words, and their ability to produce different types of sounds.

“At this age, children typically don’t say many words,” Deniz Can said. “So we rely on babbling and the ability to comprehend language as a sign of early language mastery.”

Infants with a greater concentration of gray and white matter in the cerebellum and the hippocampus showed greater language ability at age 1. This is the first study to identify a relationship between language and the cerebellum and hippocampus in infants. Neither brain area is well-known for its role in language: the cerebellum is typically linked to motor learning, while the hippocampus is commonly recognized as a memory processor.
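As a rough, purely illustrative sketch of the kind of brain-behavior analysis described above (not the authors’ actual pipeline), one can correlate a regional gray-matter concentration value from a voxel-based morphometry analysis with a later language score. All variable names and numbers below are invented.

    import numpy as np
    from scipy import stats

    # Invented example data: one value per infant (the study scanned 19 infants).
    # gm_cerebellum: gray-matter concentration in the cerebellum at 7 months,
    # as a voxel-based morphometry analysis might summarize it for one region.
    gm_cerebellum = np.array([0.41, 0.44, 0.39, 0.47, 0.43, 0.45, 0.40, 0.46,
                              0.42, 0.48, 0.38, 0.44, 0.41, 0.45, 0.43, 0.47,
                              0.40, 0.46, 0.42])
    # language_12mo: composite language score for the same infants at 1 year.
    language_12mo = np.array([ 92, 101,  88, 110,  97, 104,  90, 108,
                               95, 112,  86, 100,  93, 103,  96, 109,
                               89, 107,  94])

    # Does the 7-month structural measure predict the 12-month language score?
    r, p = stats.pearsonr(gm_cerebellum, language_12mo)
    print(f"Pearson r = {r:.2f}, p = {p:.4f}")

In a whole-brain analysis, a test of this kind is effectively run at every location in the brain, with appropriate correction for the number of comparisons.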

“Looking at the whole brain produced a surprising result, and scientists live for surprises. It wasn’t the language areas of the infant brain that predicted their future linguistic skills, but instead brain areas linked to motor abilities and memory processing,” Kuhl said. “Infants have to listen and memorize the sound patterns used by the people in their culture, and then coax their own mouths and tongues to make these sounds in order to join the social conversation and get a response from their parents.”

The findings could reflect infants’ abilities to master the motor planning for speech and to develop the memory requirements for keeping the sound patterns in mind.

“The brain uses many general skills to learn language,” Kuhl said. “Knowing which brain regions are linked to this early learning could help identify children with developmental disabilities and provide them with early interventions that will steer them back toward a typical developmental path.”

Todd Richards, a UW professor of radiology, was another co-author. The study was funded by the National Institutes of Health and the Santa Fe Institute Consortium.


 
 

Story Source:

The above story is reprinted from materials provided by University of Washington. The original article was written by Molly McElroy.

Note: Materials may be edited for content and length. For further information, please contact the source cited above.


Journal Reference:

  1. Dilara Deniz Can, Todd Richards, Patricia K. Kuhl. Early gray-matter and white-matter concentration in infancy predict later language skills: A whole brain voxel-based morphometry study. Brain and Language, 2013; 124 (1): 34 DOI: 10.1016/j.bandl.2012.10.007
University of Washington (2013, January 22). Brain structure of infants predicts language skills at one year. ScienceDaily. Retrieved January 27, 2013, from http://www.sciencedaily.com/releases/2013/01/130122142850.htm

Language Learning Begins in Utero, Study Finds; Newborn Memories of Oohs and Ahs Heard in the Womb

Jan. 2, 2013 — Newborns are much more attuned to the sounds of their native language than first thought. In fact, these linguistic whizzes can pick up on distinctive sounds of their mother tongue while in utero, a new study has concluded.


  • Babies only hours old are able to differentiate between sounds from their native language and a foreign language, scientists have discovered. The study indicates that babies begin absorbing language while still in the womb, earlier than previously thought. (Credit: © LanaK / Fotolia)

Research led by Christine Moon, a professor of psychology at Pacific Lutheran University, shows that infants only hours old showed a marked interest in the vowels of a language that was not their mother tongue.

“We have known for over 30 years that we begin learning prenatally about voices by listening to the sound of our mother talking,” Moon said. “This is the first study that shows we learn about the particular speech sounds of our mother’s language before we are born.”

Before the study, the general consensus was that infants learned about the small parts of speech, the vowels and the consonants, postnatally. “This study moves the measurable result of experience with individual speech sounds from six months of age to before birth,” Moon added. The findings will be published in Acta Paediatrica in late December.

For the study, Moon tested newborn infants shortly after birth, while they were still in the hospital, at two locations: Madigan Army Medical Center in Tacoma, Wash., and the Astrid Lindgren Children’s Hospital in Stockholm. Infants heard either Swedish or English vowels, and they could control how many times they heard the vowels by sucking on a pacifier connected to a computer.

Co-authors for the study were Hugo Lagercrantz, a professor at the Karolinska Institute in Sweden and a member of the Nobel Assembly, and Patricia Kuhl, Endowed Chair for the Bezos Family Foundation for Early Childhood Learning and Co-Director of the University of Washington’s Institute for Learning and Brain Sciences.

The study tested newborns on two sets of vowel sounds — 17 native language sounds and 17 foreign language sounds, said Kuhl. The researchers tested the babies’ interest in the vowel sounds based on how long and often they sucked on a pacifier. Half of the infants heard their native language vowels, and the other half heard the foreign vowels. “Each suck will produce a vowel until the infant pauses, and then the new suck will produce the next vowel sound,” said Kuhl.

In both countries, the babies listening to the foreign vowels sucked more than those listening to their native tongue, regardless of how much postnatal experience they had. This indicated to the researchers that the infants had been learning the vowel sounds in utero.
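To make the logic of the pacifier measure concrete, here is a minimal, hypothetical sketch of the group comparison it supports. The numbers are invented and do not come from the study.

    import numpy as np
    from scipy import stats

    # Invented suck counts per newborn over the test session.
    # In the paradigm, each suck plays the next vowel, so more sucking
    # reflects more interest in the sounds being presented.
    heard_native  = np.array([38, 41, 35, 44, 40, 37, 42, 39, 36, 43])
    heard_foreign = np.array([47, 52, 45, 50, 49, 53, 46, 48, 51, 44])

    # The reported pattern: newborns sucked more for the unfamiliar (foreign)
    # vowels, consistent with the native vowels already being familiar from
    # listening in the womb.
    t, p = stats.ttest_ind(heard_foreign, heard_native)
    print(f"t = {t:.2f}, p = {p:.4f}")
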

“These little ones had been listening to their mother’s voice in the womb, and particularly her vowels for ten weeks. The mother has first dibs on influencing the child’s brain,” said Kuhl. “At birth, they are apparently ready for something novel.”

While other studies have focused on prenatal learning of sentences or phrases, this is the first study to show learning of small parts of speech that are not easily recognized by melody, rhythm or loudness. Forty infants were tested in Tacoma and another 40 in Sweden. They ranged in age from 7 to 75 hours after birth.

Vowel sounds were chosen for the study because they are prominent, and the researchers thought they might be noticeable in the mother’s ongoing speech, even against the noisy background sounds of the womb.

The study shows that the newborn has the capacity to learn and remember elementary sounds of their language from their mother during the last 10 weeks of pregnancy (the sensory and brain mechanisms for hearing are intact at 30 weeks of gestational age).

“This is a stunning finding,” said Kuhl. “We thought infants were ‘born learning’ but now we know they learn even earlier. They are not phonetically naïve at birth.”

Prior to studies like this one, it was assumed that newborns were “blank slates,” added Lagercrantz. He said that although infants had already been shown to be attuned to the sounds of their mother tongue, it now appears that this effect begins before birth, which surprised him.

“Previous studies indicate that the fetus seems to remember musical rhythms,” he said. “They now seem to be able to learn language partially.”

Kuhl added that infants are the best learners on the planet and while understanding a child’s brain capacity is important for science, it’s even more important for the children. “We can’t waste early curiosity.”

“The fact that the infants can learn the vowels in utero means they are putting some pretty sophisticated brain centers to work, even before birth,” she said.



Story Source:

The above story is reprinted from materials provided by Pacific Lutheran University. The original article was written by Barbara Clements.

Note: Materials may be edited for content and length. For further information, please contact the source cited above.


Journal Reference:

  1. Christine Moon, Hugo Lagercrantz, Patricia K. Kuhl. Language experienced in utero affects vowel perception after birth: a two-country study. Acta Paediatrica, 2012; DOI: 10.1111/apa.12098
Pacific Lutheran University (2013, January 2). Language learning begins in utero, study finds; Newborn memories of oohs and ahs heard in the womb. ScienceDaily. Retrieved January 3, 2013, from http://www.sciencedaily.com/releases/2013/01/130102083615.htm

Infants Can Use Language to Learn About People’s Intentions

ScienceDaily (July 23, 2012) — Infants are able to detect how speech communicates unobservable intentions, researchers at New York University and McGill University have found in a study that sheds new light on how early in life we can rely on language to acquire knowledge about matters that go beyond first-hand experiences.


Their findings appear in the Proceedings of the National Academy of Sciences (PNAS).

“Much of what we know about the world does not come from our own experiences, so we have to obtain this information indirectly — from books, the news media, and conversation,” explained Athena Vouloumanos, an assistant professor at NYU and one of the study’s co-authors. “Our results show infants can acquire knowledge in much the same way — through language, or, specifically, spoken descriptions of phenomena they haven’t — or that can’t be — directly observed.”

The study’s other co-authors were Kristine Onishi, an associate professor in the Department of Psychology at Canada’s McGill University, and Amanda Pogue, a former research assistant at NYU who is now a graduate student at the University of Waterloo.

Previous scholarship has established that infants seem to understand that speech can be used to categorize and communicate about observable entities such as objects and people. But no study has directly examined whether infants recognize that speech can communicate about unobservable aspects.

In the PNAS study, the researchers sought to determine if one-year-old infants could recognize that speech can communicate about one unobservable phenomenon that is crucial for understanding social interactions: a person’s intentions.

To explore this question, the researchers had adults act out short scenarios for the infants. Some scenes ended predictably (that is, with an ending that is congruent with our understanding of the world) while others ended unpredictably (that is, incongruently).

The researchers employed a commonly used method to measure infants’ detection of incongruent scenes: looking longer at an incongruent scene.

Infants saw an adult actor (the communicator) attempt, but fail, to stack a ring on a funnel because the funnel was just out of reach. Previous research showed that infants would interpret the actor’s failed behavior as signaling the actor’s underlying intention to stack the ring. The experimenters then introduced a second actor (the recipient) who was able to reach all the objects. In the key test scene, the communicator turned to the recipient and uttered either a novel word unknown to infants (“koba”) or coughed.

Although infants always knew the communicator’s intention (through observing her prior failed stacking attempts), the recipient only sometimes had the requisite information to accomplish the communicator’s intended action: specifically, when the communicator vocalized appropriately using speech, but not when she coughed.

If infants understood that speech — but not non-speech — could transfer information about an intention, then when the communicator used speech and the recipient responded by stacking the ring on the funnel, infants should treat this as a congruent outcome. Results confirmed this prediction. The infants looked longer when the recipient performed a different action, such as imitating the communicator’s prior failed movements or stacking the ring somewhere other than on the funnel, suggesting they treated these as incongruent, or surprising, outcomes.

Because coughing doesn’t communicate intentions, infants looked equally long no matter what the recipient’s response was.
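A minimal sketch of how this looking-time logic can be checked, with invented numbers rather than the study’s data: looking should be longer for the unexpected outcome in the speech condition, and roughly equal in the cough condition.

    import numpy as np
    from scipy import stats

    # Invented mean looking times (seconds), one value per infant, with each
    # (hypothetical) group of infants seeing one outcome type.
    speech_expected   = np.array([10.2,  9.8, 11.0, 10.5, 11.4,  9.6])  # recipient stacks the ring
    speech_unexpected = np.array([14.8, 15.3, 13.9, 16.1, 15.0, 14.4])  # recipient does something else
    cough_expected    = np.array([12.1, 11.7, 12.8, 12.3, 11.9, 12.5])
    cough_unexpected  = np.array([12.4, 11.9, 12.2, 12.7, 12.0, 12.6])

    # Speech: longer looking at the unexpected outcome suggests infants treated
    # the spoken word as having conveyed the communicator's intention.
    print(stats.ttest_ind(speech_unexpected, speech_expected))
    # Cough: no difference, because a cough does not communicate an intention.
    print(stats.ttest_ind(cough_unexpected, cough_expected))
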

“As adults, when we hear people speaking, we have the intuition that they’re providing information to one another, even when we don’t understand the language being spoken. And it’s the same for infants,” Onishi said. “Even when they don’t understand the meaning of the specific words they hear, they realize that words — like our nonsense word ‘koba’ — can provide information in a way that coughing cannot.”

“What’s significant about this is it tells us that infants have access to another channel of communication that we previously didn’t know they had,” added Vouloumanos.

“Understanding that speech can communicate about things that are unobservable gives infants a way to learn about the world beyond what they’ve experienced. Infants can use this tool to gain insight into other people, helping them develop into capable social beings.”

The study was supported by grants from the National Science Foundation ADVANCE program and the Social Sciences and Humanities Research Council of Canada.

 

Link:

http://www.nyu.edu/about/news-publications/news/2012/07/23/infants-can-use-language-to-learn-about-peoples-intentions-nyu-mcgill-researchers-find-.html

Citation:

New York University (2012, July 23). Infants can use language to learn about people’s intentions. ScienceDaily. Retrieved July 25, 2012, from http://www.sciencedaily.com/releases/2012/07/120723151030.htm

Infants’ Recognition of Speech More Sophisticated Than Previously Known

ScienceDaily (July 17, 2012) — The ability of infants to recognize speech is more sophisticated than previously known, researchers in New York University’s Department of Psychology have found. Their study, which appears in the journal Developmental Psychology, showed that infants, as early as nine months old, could make distinctions between speech and non-speech sounds in both humans and animals.


“Our results show that infant speech perception is resilient and flexible,” explained Athena Vouloumanos, an assistant professor at NYU and the study’s lead author. “This means that our recognition of speech is more refined at an earlier age than we’d thought.”

It is well-known that adults’ speech perception is fine-tuned — they can detect speech among a range of ambiguous sounds. But much less is known about the capability of infants to make similar assessments. Understanding when these abilities become instilled would shed new light on how early in life we develop the ability to recognize speech.

In order to gauge the aptitude to perceive speech at an early age, the researchers examined the responses of infants, approximately nine months in age, to recorded human and parrot speech and non-speech sounds. Human (an adult female voice) and parrot speech sounds included the words “truck,” “treat,” “dinner,” and “two.” The adult non-speech sounds were whistles and a clearing of the throat, while the parrot non-speech sounds were squawks and chirps. The recorded parrot speech sounds were those of Alex, an African Gray parrot that had the ability to talk and reason and whose behaviors were studied by psychology researcher Irene Pepperberg.

Since infants cannot verbally communicate their recognition of speech, the researchers employed a commonly used method to measure this process: looking longer at what they find either interesting or unusual. Under this method, looking longer at a visual paired with a sound may be interpreted as a reflection of recognition. In this study, sounds were paired with a series of visuals: a checkerboard-like image, adult female faces, and a cup.

The results showed that infants listened longer to human speech than to human non-speech sounds regardless of the visual stimulus, revealing an ability to recognize human speech independent of context.

Their findings on non-human speech were more nuanced. When paired with human-face visuals or human artifacts like cups, the infants listened to parrot speech longer than they did non-speech, such that their preference for parrot speech was similar to their preference for human speech sounds. However, this did not occur in the presence of other visual stimuli. In other words, infants were able to distinguish animal speech from non-speech, but only in some contexts.

“Parrot speech is unlike human speech, so the results show infants have the ability to detect different types of speech, even if they need visual cues to assist in this process,” explained Vouloumanos.

The study’s other co-author was Hanna Gelfand, an undergraduate at NYU’s College of Arts and Science at the time of the study and currently a graduate student in the San Diego State University/University of California, San Diego Joint Doctoral Program in Language and Communicative Disorders.

 

Journal Reference:

  1. Athena Vouloumanos, Hanna M. Gelfand. Infant Perception of Atypical Speech Signals. Developmental Psychology, 2012; DOI: 10.1037/a0029055

 

New York University (2012, July 17). Infants’ recognition of speech more sophisticated than previously known. ScienceDaily. Retrieved July 18, 2012, from http://www.sciencedaily.com/releases/2012/07/120717100050.htm

Our Brains Often Fail to Notice Key Words That Can Change the Whole Meaning of a Sentence

ScienceDaily (July 16, 2012) — Far from processing every word we read or hear, our brains often do not even notice key words that can change the whole meaning of a sentence, according to new research from the Economic and Social Research Council (ESRC).


After a plane crash, where should the survivors be buried?

If you are considering where the most appropriate burial place should be, you are not alone. Scientists have found that around half the people asked this question answer it as if they were being asked about the victims, not the survivors.

Similarly, when asked “Can a man marry his widow’s sister?” most people answer “yes” — effectively answering that it would indeed be possible for a dead man to marry his bereaved wife’s sister.

What makes researchers particularly interested in people’s failure to notice words that don’t actually make sense, so-called semantic illusions, is that these illusions challenge traditional models of language processing, which assume that we build our understanding of a sentence by deeply analysing the meaning of each word in turn.

Instead semantic illusions provide a strong line of evidence that the way we process language is often shallow and incomplete.

Professor Leuthold at the University of Glasgow led a study using electroencephalography (EEG) to explore what happens in our brains when we process sentences containing semantic illusions.

By analysing the patterns of brain activity when volunteers read or listened to sentences containing hard-to-detect semantic anomalies — words that fit the general context even though they do not actually make sense — the researchers found that when a volunteer was tricked by the semantic illusion, their brain had not even noticed the anomalous word.
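As a toy illustration of the general event-related-potential (ERP) comparison such an EEG study relies on (not the authors’ actual analysis), the sketch below averages epochs time-locked to the critical word and compares two conditions in a fixed time window. All data here are random placeholders.

    import numpy as np

    rng = np.random.default_rng(0)
    fs = 250                                  # sampling rate in Hz (assumed)
    times = np.arange(-0.2, 0.8, 1 / fs)      # -200 ms to +800 ms around word onset

    # Invented single-channel epochs (trials x time samples) for two conditions:
    # trials where the anomalous word was detected vs. trials where the reader
    # fell for the semantic illusion and missed it.
    detected = rng.normal(0.0, 2.0, size=(40, times.size))
    missed   = rng.normal(0.0, 2.0, size=(40, times.size))

    # Average across trials to obtain the ERP for each condition, then compare
    # mean amplitude in a window where anomaly-related effects are expected.
    erp_detected = detected.mean(axis=0)
    erp_missed   = missed.mean(axis=0)
    window = (times >= 0.3) & (times <= 0.5)

    print("mean amplitude, detected:", erp_detected[window].mean())
    print("mean amplitude, missed:  ", erp_missed[window].mean())

If the brain registered the anomalous word, the averaged waveforms should differ in that window; if the anomaly slipped through, as in the illusion trials the researchers describe, the two averages look alike.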

Analyses of brain activity also revealed that we are more likely to use this type of shallow processing under conditions of higher cognitive load — that is, when the task we are faced with is more difficult or when we are dealing with more than one task at a time.

The research findings provide a better understanding of the processes involved in language comprehension. But they are also practically useful: according to Professor Leuthold, knowing what is happening in the brain when mistakes occur can help us avoid pitfalls, such as missing critical information in textbooks or legal documents, and communicate more effectively.

There are a number of tricks we can use to make sure we get the correct message across: “We know that we process a word more deeply if it is emphasised in some way. So, for example in a news story, a newsreader can stress important words that may otherwise be missed and these words can be italicised to make sure we notice them when reading,” said Professor Leuthold.

The way we construct sentences can also help reduce misunderstandings, he explained: “It’s a good idea to put important information first because we are more likely to miss unusual words when they are near the end of a sentence. Also, we often use an active sentence construction such as ‘Bob ate the apple’ because we make far more mistakes answering questions about a sentence with a passive construction — for example ‘The apple was eaten by Bob’.”

The study findings also suggest that we should avoid multi-tasking when we are reading or listening to an important message: “For example, talking to someone on the phone while driving on a busy motorway or in town, or doing some homework while listening to the news, might lead to more shallow processing,” said Professor Leuthold.

 

Economic and Social Research Council (ESRC) (2012, July 16). Our brains often fail to notice key words that can change the whole meaning of a sentence. ScienceDaily. Retrieved July 17, 2012, from http://www.sciencedaily.com/releases/2012/07/120716091921.htm