EDF Energy, the official electricity supplier to London 2012, has commissioned the team to develop an intuitive algorithm that tracks the sentiment of British tweeters about the Olympics. The result will power the ‘world’s first social media driven light show’, called ‘Energy of the Nation’, on the London Eye (which EDF sponsors).
If the overall sentiment is negative, the London Eye will glow purple; if it is positive, it will shine yellow; and if the Twitter reaction to the Games is neutral, the wheel will emit green rays.
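The colour rule described above is simple to sketch. The function name, score scale, and neutral band below are assumptions for illustration, not EDF's actual algorithm:

```python
# Hypothetical sketch of the sentiment-to-colour rule described above.
# The score range [-1, 1] and the neutral_band threshold are assumptions.

def eye_colour(sentiment_score: float, neutral_band: float = 0.1) -> str:
    """Map an aggregate sentiment score in [-1, 1] to a light-show colour."""
    if sentiment_score > neutral_band:
        return "yellow"   # overall positive reaction to the Games
    if sentiment_score < -neutral_band:
        return "purple"   # overall negative reaction
    return "green"        # broadly neutral reaction

print(eye_colour(0.4))   # yellow
```

The neutral band keeps the wheel from flickering between colours when the aggregate score hovers near zero.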
In 1993, just as the World Wide Web became publicly available, Goldberg was starting his first robotics lab at USC. He and his colleagues decided to create a universal interface, accessed via the web, that could control a robot in the lab. They built a robot that could tend a garden in the center of a planter and mounted a camera on its arm to give feedback to users. Anyone in the world could view the garden, water plants, and plant seeds. The installation spent nine years in the Ars Electronica Museum in Linz, Austria, and was operated by more people than any other robot in history. This first interactive robot project raised questions in the larger online community, which led to other inventions.
The fourth project he outlines here is a robot that learns from human actions in order to complete delicate tasks, such as cutting and suturing in complex surgeries. Goldberg takes us through the development process: his studies of human gesture, his creation of algorithms, and his adaptation of machinery to create a robot that can successfully mimic and execute actions as nuanced as stitching flesh.