We already know that the sense of touch is a powerful thing, not only for babies but for adults as well. Think about the power of a hug, or someone grabbing your hand to comfort you. It’s a small gesture with a big impact. But a recent study has found that touch can actually help a baby learn words in its native language.
Purdue University researchers have discovered that a caregiver’s touch can help babies find and form words in a continuous stream of speech. Amanda Seidl, associate professor of speech, language and hearing sciences, says,
“We found that infants treat touches as if they are related to what they hear, and these touches could have an impact on their word learning. We think of touch as conveying affection, but our recent research shows that infants can relate touches to their incoming speech signal. Others have looked at the role of touch with respect to babies forming an attachment and physical development. But until now the impact of touch on language learning has not been explored.”
Seidl’s research was supported by the National Science Foundation, and the findings were published in Developmental Science. Seidl says she is particularly interested in the wide array of cues, or sources of information, that infants might put together to learn their language. Learning words can be challenging for infants, as most of the words they hear are presented in a continuous stream of speech rather than as isolated words.
“Parents may pause before saying an infant’s name, but they almost never do so for other words. This research explored whether touches could help infants find where words begin and end in the continuous stream of speech; they need to find words before they can attach real meaning to them. Because names of body parts are often the first words that babies learn, and touching is often involved when caregivers talk about body parts, we speculated that touch could act as a cue to word edges.”
At Purdue’s Infant Speech Lab, 48 English-learning 4-month-old infants were tested in two groups. Each child sat on a parent’s lap, face to face with an experimenter, listening to a pre-recorded stream of speech made up of nonsense words. In the first experiment, each time a nonsense word such as “dobita” was spoken, the researcher touched the baby’s kneecap, about two dozen times in total. The nonsense word “lepoga” was also played approximately 24 times, but the infant’s elbow was touched only once while that word played; the rest of the elbow touches fell on other syllable sequences.
After the listening portion, the children participated in a language preference study. Nearly all of the infants showed that they remembered “dobita,” the word that had been reinforced by “aligned touching,” from the stream of continuous speech. In the second experiment, the same recording of continuous speech with new words was played, but this time the researcher touched their own eyebrow or chin instead of the baby. The infants in that condition showed no memory of any of the words.
“It didn’t matter how much time the infant spent looking at the experimenter’s face; the babies were not able to use these cues in the same way as when their own body was touched. I am interested in whether we can predict babies’ language later on from early measures of speech perception. If we look at speech perception and learning in a 6-month-old, can we predict their language ability at 3 years? If we can find out what kinds of learners young children are, we could target their learning environment to their learning style.”