How do babies learn their first language?
Infants learn their first language at an impressive speed. During the first year of life, even before they start to talk, infants converge on the consonants and vowels of their language and start segmenting continuous speech into words. Such performance is very difficult to achieve for adults learning a second language. Yet infants manage it effortlessly, without explicit supervision, while being immersed in a complex and noisy environment. In addition, infants do not seem to follow a logical order (sounds, then words, then sentences) as adults would; rather, they start learning all of these linguistic levels in parallel.
The aim of this project is to decipher this puzzling learning process by applying a 'reverse engineering' approach, i.e., by constructing an artificial language learner that mimics the learning stages of the infant. We use engineering and applied mathematics techniques (automatic speech recognition, signal processing, semantic web, machine learning) on large corpora of child-adult verbal interactions in several languages. We develop psychologically plausible (unsupervised) and biologically plausible (bio-inspired) algorithms which can discover linguistic categories (words, syllables, phonemes, features). The predictions of these algorithms are then tested in infants or newborns using behavioral techniques (eye tracking) or noninvasive brain imaging (Near-Infrared Spectroscopy, EEG).
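As a concrete illustration of the kind of unsupervised discovery algorithm involved, one classic proposal is that learners segment speech by tracking transitional probabilities between adjacent syllables and positing word boundaries where those probabilities dip. The sketch below is purely illustrative (the toy syllable corpus, function names, and threshold are all invented for this example, not taken from the project):

```python
from collections import defaultdict

def transitional_probabilities(utterances):
    """Estimate P(next syllable | current syllable) from adjacent pairs."""
    pair_counts = defaultdict(int)
    first_counts = defaultdict(int)
    for utt in utterances:
        for a, b in zip(utt, utt[1:]):
            pair_counts[(a, b)] += 1
            first_counts[a] += 1
    return {(a, b): c / first_counts[a] for (a, b), c in pair_counts.items()}

def segment(utterance, tp, threshold=0.8):
    """Posit a word boundary wherever the transitional probability dips below threshold."""
    words, current = [], [utterance[0]]
    for a, b in zip(utterance, utterance[1:]):
        if tp.get((a, b), 0.0) < threshold:
            words.append(current)
            current = []
        current.append(b)
    words.append(current)
    return words

# Toy corpus: unbroken syllable streams built from three hypothetical
# "words" (pa-bi-ku, go-la-tu, da-ro-pi) in varying orders.
corpus = [
    ["pa", "bi", "ku", "go", "la", "tu"],
    ["go", "la", "tu", "pa", "bi", "ku"],
    ["pa", "bi", "ku", "da", "ro", "pi", "go", "la", "tu"],
    ["da", "ro", "pi", "pa", "bi", "ku"],
    ["go", "la", "tu", "da", "ro", "pi"],
]
tp = transitional_probabilities(corpus)
print(segment(["go", "la", "tu", "pa", "bi", "ku"], tp))
# → [['go', 'la', 'tu'], ['pa', 'bi', 'ku']]
```

Within-word transitions (e.g. pa→bi) are fully predictable in this toy corpus, while cross-word transitions (e.g. tu→pa) are not, so thresholding the probability recovers the word boundaries; the actual project algorithms operate on real multilingual child-directed speech rather than toy syllable strings.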