Language is a highly complex system involving many brain processes. Can it be reproduced in artificial agents? At ISTC, building a robot able to understand language and to speak is the goal of the Laboratory of Autonomous Robotics and Artificial Life (LARAL).
The discovery of the mirror neuron system has deeply changed our understanding of the relationship between gesture and speech. The brain turned out to store a vocabulary of motor acts that can be applied to different objects; merely seeing a given object activates the corresponding potential motor acts, even without any physical movement. This evidence has had a major impact on cognitive science and, more recently, on robotics. The main challenge has become modelling the mirror neuron system in artificial agents: can a robot understand language just as we do?
At ISTC, the Laboratory of Autonomous Robotics and Artificial Life (LARAL) is working towards this goal. Within the European project ITALK (Integration and Transfer of Action and Language Knowledge in robots), researchers are attempting to educate a baby humanoid robot called iCub which, at a metre tall, is the same size as a three-year-old toddler and is able to crawl, sit up, feel, see and hear. The iCub robot develops its capabilities in the same way as a child, progressively learning about its own bodily skills and how to interact with the world. Next, the toddlerbot uses what it learns individually and socially from others to bootstrap the acquisition of language, and in turn uses its language abilities to drive its learning of social and manipulative skills.
Since the motor system is a prerequisite for speech in humans, it can be considered a prerequisite for speech in artificial systems as well: mirror neuron mechanisms are therefore reproduced in robots. In iCub this creates a positive feedback cycle between using language and developing other cognitive abilities. Like a child learning by imitating its parents and interacting with the environment around it, the robot will eventually master the basic principles of structured grammar.
By constructing artificial intelligence systems with structural features similar to our own, we may be more likely to create robots that can ape human abilities, while at the same time developing new scientific explanations of the relations between action, language and social skills.
Contact: Stefano Nolfi