03.02.11

MIT Scientist Captures 90,000 Hours of Video of His Son’s First Words, Graphs It

Cognitive scientist Deb Roy blew the curve for Flip cam-packing proud pops. Since he and his wife brought their son home from the hospital, Roy has captured his every movement and word with a series of fisheye-lens cameras installed in every room. The purpose was to understand how we learn language.


In a talk soon to grab several million views on TED.com, cognitive scientist Deb Roy on Wednesday shared a remarkable experiment that hearkens back to an earlier era of science while using brand-new technology. From the day he and his wife brought their son home five years ago, the family’s every movement and word was captured and tracked with a series of fisheye lenses in every room in their house. The purpose was to understand how we learn language, in context, through the words we hear.

A combination of new software and human transcription called Blitzscribe allowed them to parse 200 terabytes of data to capture the emergence and refinement of specific words in Roy’s son’s vocabulary. (Luckily, the boy was an early talker.) In one 40-second clip, you can hear how “gaga” turned into “water” over the course of six months. In a video clip, below, you can hear and watch the evolution of “ball.”
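To give a rough sense of what that kind of parsing involves (a simplified sketch under assumed inputs, not Roy’s actual Blitzscribe pipeline), counting how often each candidate form of a word shows up per month in timestamped, speaker-labeled transcripts is enough to chart the handoff from “gaga” to “water”:

```python
# Hypothetical sketch, not Roy's Blitzscribe pipeline: given timestamped,
# speaker-labeled transcripts, count how often each candidate form of a word
# ("gaga" vs. "water") appears per month, so the handoff can be plotted.
from collections import Counter
from datetime import date

# Made-up transcript records: (date, speaker, utterance)
transcripts = [
    (date(2007, 1, 12), "child", "gaga"),
    (date(2007, 4, 3), "child", "wawa ball"),
    (date(2007, 6, 20), "child", "water please"),
]

FORMS = {"gaga", "wawa", "water"}  # assumed candidate forms of the same word


def monthly_form_counts(records, forms):
    """Count occurrences of each form per (year, month), child speech only."""
    counts = Counter()
    for day, speaker, utterance in records:
        if speaker != "child":
            continue
        for token in utterance.lower().split():
            if token in forms:
                counts[(day.year, day.month, token)] += 1
    return counts


print(monthly_form_counts(transcripts, FORMS))
```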


Unreal 3-D visualizations allowed his team to zoom through a dollhouse-like model of the house and map the utterance of each word in its context.

In a landscape-like image with peaks and valleys, you can see that the word “water” was uttered most often in the kitchen, while “bye” clustered at the door.
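That wordscape reads, in effect, like a spatial histogram. As a purely illustrative sketch, assuming for illustration that each utterance of “water” comes tagged with an (x, y) floor position from the overhead cameras, binning those positions over the floor plan yields exactly this kind of peaks-and-valleys surface:

```python
# Illustrative sketch of the "wordscape" idea, with made-up positions: bin the
# (x, y) locations where a word was uttered into a grid over the floor plan;
# the resulting 2-D histogram is the peaks-and-valleys surface.
import numpy as np


def wordscape(positions, house_size=(10.0, 10.0), bins=50):
    """Return a 2-D histogram of utterance positions (in meters) over the floor plan."""
    xs = [p[0] for p in positions]
    ys = [p[1] for p in positions]
    grid, _, _ = np.histogram2d(
        xs, ys, bins=bins, range=[[0, house_size[0]], [0, house_size[1]]]
    )
    return grid  # plot as a heatmap or surface to get the landscape image


# Made-up positions for "water," clustered near a hypothetical kitchen at (2, 3)
water_positions = [(2.1, 3.0), (2.3, 2.8), (2.0, 3.2), (7.5, 9.0)]
print(wordscape(water_positions).max())
```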

The video was processed to show “time worms,” below, charting the family’s movement from room to room.

Most moving of all was the precise mapping of tight feedback loops between the child and his caregivers: father, mother, nanny. For example, Roy was able to track the length of every sentence spoken to the child that included a particular word, such as “water.” Right around the time the child started to say the word, at what Roy calls the “word birth,” something remarkable happened.


“Caregiver speech dipped to a minimum and slowly ascended back out in complexity.” In other words, when mom, dad, and the nanny first hear a child speaking a word, they unconsciously stress it by repeating it back to him all by itself or in very short sentences. Then, as he masters the word, the sentences lengthen again. The infant shapes the caregivers’ behavior, the better to learn.
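That dip and rise is, in essence, a curve of caregiver sentence length plotted against time relative to the word birth. A minimal sketch of that measurement, assuming only dated caregiver utterances and a known word-birth date (this is not Roy’s code), could look like this:

```python
# Minimal sketch, not Roy's code: for one target word, take every caregiver
# utterance containing it, measure its length in words, and average by month
# relative to the "word birth" to expose the dip-and-rise curve.
from collections import defaultdict
from datetime import date


def months_between(d0, d1):
    return (d1.year - d0.year) * 12 + (d1.month - d0.month)


def utterance_length_curve(caregiver_utterances, target_word, word_birth):
    """Map months-relative-to-word-birth -> mean length (in words) of caregiver
    utterances that contain the target word."""
    buckets = defaultdict(list)
    for day, text in caregiver_utterances:
        tokens = text.lower().split()
        if target_word in tokens:
            buckets[months_between(word_birth, day)].append(len(tokens))
    return {m: sum(v) / len(v) for m, v in sorted(buckets.items())}


# Made-up data: sentences shorten toward the word birth, then lengthen again
utterances = [
    (date(2007, 3, 1), "do you want some water in your cup"),
    (date(2007, 5, 2), "water"),
    (date(2007, 5, 9), "more water"),
    (date(2007, 8, 1), "let's get you a big glass of water"),
]
print(utterance_length_curve(utterances, "water", word_birth=date(2007, 5, 1)))
```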

Roy is now taking the amazing research capability and team he’s developed and applying it to commerce. He’s on leave from MIT and has founded a VC-backed company called Bluefin Labs that applies these same high-powered analytics to relate, not the speech of a child to that of a father, but events broadcast on TV to conversations taking place in social media, the better to chart “engagement” with the State of the Union Address or Jersey Shore or a car commercial.

“After 15+ years of academia, I want to take some of my ideas out of the lab and into the world,” Roy told Fast Company. “I also feel that the changes in the world of mass and social media provide a perfect environment for these ideas to have real impact (not just commercial, but also social), an opportunity that I feel compelled to seize.”

The methods he’s developed are still being applied to babies; some of his senior graduate students at MIT continue to analyze the data, and he’s designed PlayLamp, a less intrusive recording device currently being used in pilot studies of children at risk of autism.

See also: BBC News, MIT Media Lab


About the author

Anya Kamenetz is the author of Generation Debt (Riverhead, 2006) and DIY U: Edupunks, Edupreneurs, and the Coming Transformation of Higher Education (Chelsea Green, 2010). Her next book, The Test, about standardized testing, will be published by Public Affairs in 2015.
