Do Robots Dream of Electric Sheep?
Video Link: https://youtu.be/q17o_BK_Vmo
Should robots perceive the world as we do, or would humanity be better served by a new and different perspective? As humans, we have the opportunity to welcome change and to teach robots and AI how to work collaboratively toward a better future.
- Robots can perceive using advanced machine learning algorithms that recognize objects, actions, and emotions from sensor data, with sensors that can see, hear, and smell what very few animals or humans can.
- Insect-like flying robots such as the RoboBee might assist in locating victims in a disaster, sensor-laden ground vehicles can detect hazardous chemicals to keep humans safe, and underwater vehicles equipped with sonar can locate submerged wreckage following a crash.
- Robots learn through supervised learning algorithms, which mimic a classroom setting, or reinforcement learning algorithms, which mimic punishment and reward. By choosing the parameters for these algorithms, scientists define the value systems of robots.
- Robots will continue to be flawed like we are, but as humans, we will have the opportunity to welcome change, acquire a new perspective, and accept robots and artificial intelligence into our daily lives, teaching them how to work collaboratively to better our future.
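The punishment-and-reward idea above can be sketched in a few lines of code. Below is a minimal, purely illustrative Q-learning agent on an invented five-cell corridor (not anything from the talk): reaching the rightmost cell earns a reward, every step carries a small cost, and the agent gradually learns to walk right.

```python
import random

N_STATES = 5          # cells 0..4; cell 4 is the goal
ACTIONS = [-1, +1]    # step left or step right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1  # learning rate, discount, exploration

def train(episodes=500, seed=0):
    rng = random.Random(seed)
    # Q-table: expected long-term reward for each (state, action) pair
    q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
    for _ in range(episodes):
        s = 0
        while s != N_STATES - 1:
            # epsilon-greedy: mostly exploit the best-known action,
            # occasionally explore a random one
            if rng.random() < EPS:
                a = rng.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda act: q[(s, act)])
            s2 = min(max(s + a, 0), N_STATES - 1)
            r = 1.0 if s2 == N_STATES - 1 else -0.01  # reward vs. punishment
            # update toward reward plus discounted future value
            best_next = max(q[(s2, b)] for b in ACTIONS)
            q[(s, a)] += ALPHA * (r + GAMMA * best_next - q[(s, a)])
            s = s2
    return q

q = train()
policy = [max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(N_STATES - 1)]
print(policy)  # the learned policy: step right in every non-goal cell
```

The "value system" here is nothing more than the reward numbers the designer chose: flip the sign of the step cost or the goal reward and the agent learns a very different behavior.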
Live Demo Details
A live on-stage demo used an autonomous drone equipped with a camera and state-of-the-art machine learning algorithms to showcase some of the capabilities of robots today. The demo included implementations of the state-of-the-art 2D pose estimation algorithm OpenPose and approaches we developed for:
- Highly robust glasses recognition under challenging lighting conditions
- Action recognition, which can reliably detect waving, pointing, and other human actions
- Real-time drone control, which uses the information from the camera and machine learning algorithms to follow the person who waved at the drone
- A robust algorithm for continuously tracking the same individual without losing sight of them and erroneously following another person
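To give a flavor of how action recognition on pose keypoints can work, here is a small sketch of a wave detector built on 2D keypoints of the kind OpenPose produces. The heuristic, thresholds, and windowing are illustrative assumptions, not the algorithm used in the demo.

```python
from collections import deque

class WaveDetector:
    """Flags a wave when the wrist is held above the shoulder and
    oscillates horizontally several times within a short frame window."""

    def __init__(self, window=30, min_reversals=3):
        self.xs = deque(maxlen=window)   # recent wrist x-positions (pixels)
        self.min_reversals = min_reversals

    def update(self, wrist, shoulder):
        """wrist, shoulder: (x, y) in image coordinates, y grows downward.
        Call once per frame; returns True when a wave is detected."""
        if wrist[1] < shoulder[1]:       # hand raised above the shoulder
            self.xs.append(wrist[0])
        else:
            self.xs.clear()              # arm lowered: reset the window
        return self._reversals() >= self.min_reversals

    def _reversals(self):
        # count direction changes in the horizontal wrist motion
        xs = list(self.xs)
        dx = [b - a for a, b in zip(xs, xs[1:])]
        moves = [d for d in dx if abs(d) > 2.0]  # ignore sub-pixel jitter
        return sum(1 for a, b in zip(moves, moves[1:]) if a * b < 0)
```

In practice a detector like this would be fed the per-frame wrist and shoulder keypoints for each tracked person, and a wave from any of them could trigger the follow behavior.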
The live demo in an outdoor environment
Additionally, we demonstrated the system in an outdoor environment. This shows the robustness of these algorithms under very different lighting conditions and, beyond the points above, also demonstrates:
- Successfully using the tracking algorithm to follow the user on a bicycle
- Following the contours of the ground as the drone crosses a small stream
- A novel method for determining which direction the user pointed in 3D space using only a single camera on the drone
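Once the pointing arm's joints are known in 3D, finding the indicated spot on the ground is simple geometry. The sketch below only shows that final step, with assumed coordinate conventions; it is not the single-camera estimation method itself, which is the novel part referenced above.

```python
def pointed_ground_target(shoulder, wrist):
    """shoulder, wrist: (x, y, z) positions in meters, z pointing up.
    Extends the shoulder-to-wrist ray from the wrist until it meets the
    ground plane z = 0. Returns the (x, y) ground point, or None if the
    arm is level or raised (the ray never reaches the ground)."""
    sx, sy, sz = shoulder
    wx, wy, wz = wrist
    dz = wz - sz
    if dz >= 0:                      # pointing level or upward
        return None
    t = -wz / dz                     # scale factor so wz + t*dz == 0
    return (wx + t * (wx - sx), wy + t * (wy - sy))

# Example: shoulder at 1.5 m, wrist at 1.2 m, arm extended 0.5 m forward
print(pointed_ground_target((0.0, 0.0, 1.5), (0.5, 0.0, 1.2)))  # (2.5, 0.0)
```

A target point like this can then be handed to the drone's navigation stack as a goal location.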
Additional Links
- Browse the TEDxVerona 2019 gallery here.
- Watch all the TED talk videos from TEDxVerona 2019 here.
- Find the press reviews on TEDxVerona 2019 here.
- Find the tweets hashtagged TEDxVerona here.
LISC Contributors
Taylor Clawson (Email: tsc83@cornell.edu)
Jake Gemerek (Email: jrg362@cornell.edu)
Jane Shin (Email: js3456@cornell.edu)
Chang Liu, Yucheng Chen, Hengye Yang, Zvonimir Stojanovski, and Junyi Dong also contributed to the videos.
We also thank our collaborators Robert Wood (Professor, Harvard University), Harvard Microrobotics Lab, Robert Kester (CEO, Rebellion Photonics), Amedeo Visconti (Ferrari S.p.A), and Guillermo Sapiro (Professor, Duke University) for their videos and research contributions.