About robots, robot development
and those who make it happen


  • Released: 2009
  • Driving in the near future may not be bad at all: Audi, together with the MIT Personal Robotics Group and the Senseable City Lab, has launched the Sociable Car project. AIDA, which stands for Affective Intelligent Driving Agent, is part of this project.

    Robot AIDA is an in-car robot that looks out from the car's dashboard. It can recognize facial expressions, moods, gestures and tone of voice, and it can respond intelligently, with suitable context and tone, based on the gestures and voice it perceives from the driver. AIDA also has a range of expressive behaviors of its own: it can turn its head the way a human does, and it can blink, wink and smile.

    The MIT Personal Robotics Group designed and built AIDA, while the Senseable City Lab is handling the intelligent navigation algorithms. AIDA is meant to foster communication between the driver or passenger and the vehicle, and it can easily memorize the route a car owner takes on a daily basis.

    This companion can also provide real-time updates on traffic, weather, road conditions and even the car's status. Using this data, AIDA can suggest the quickest, least congested routes to the driver. Its purpose is to make driving more effective, safer and more enjoyable. AIDA may well be the face of the future every intelligent driver is looking for.






