If you are a fan of wildlife shows, you’ve probably seen those tiny video cameras rigged to animals in the wild that provide a sneak peek into their secret domains. But not all research cams are mounted on creatures with fur, feathers, or fins. One of NIH’s 2014 Early Independence Award winners has developed a baby-friendly, head-mounted camera system (shown above) that captures the world from an infant’s perspective and explores one of our most human, but still imperfectly understood, traits: language.
Elika Bergelson, a young researcher at the University of Rochester in New York, wants to know exactly how and when infants acquire the ability to understand spoken words. Using innovative camera gear and other investigative tools, she hopes to refine current thinking about the natural timeline for language acquisition. Bergelson also hopes her work will pay off in a firmer theoretical foundation to help clinicians assess children with poor verbal skills or with neurodevelopmental conditions that impair information processing, such as autism spectrum disorders.
Already, Bergelson has made progress toward building that firmer foundation. In her doctoral work at the University of Pennsylvania, she and her advisor Daniel Swingley showed that infants begin understanding words about six months after birth. Before that work, many researchers believed that babies were unable to shift their focus from sounds and syllables to the meaning of words until about 12 months of age. Using a laboratory-based system that tracked infants’ eye movements as they were asked to identify common objects on a computer screen, Bergelson and Swingley found that some 6-month-old babies could understand, to a certain degree, about a dozen nouns, such as “apple” or “hair.”
The views, opinions, and positions expressed by these authors and blogs are theirs and do not necessarily represent those of the Bioethics Research Library and Kennedy Institute of Ethics or Georgetown University.