An article on our work on neuromorphic control of the RoboBee, in collaboration with Rob Wood’s group at Harvard, has appeared in Tech Briefs. The article can be found here: https://www.techbriefs.com/component/content/article/1198-tb/news/news/28211-float-like-a-robot-think-like-a-bee
Our work on neuromorphic control of the RoboBee, in collaboration with Rob Wood’s group at Harvard, has been featured on the news site of the National Academy of Engineering’s Frontiers of Engineering. The article, titled “Brain-mimicking Neuromorphic Computer Chips for RoboBee,” can be found on their news site here: https://www.naefrontiers.org/Media/News.aspx
Our work on biologically inspired control of the RoboBee, in collaboration with the Harvard Microrobotics Lab, has been featured by the Cornell Chronicle and The Register. The articles describe our efforts to create control architectures that are both adaptive and energy-efficient using neuromorphic systems. Click below to see the original articles:
Professor Ferrari, along with several colleagues, discusses the future of engineering. Watch the video here.
This paper develops a new indirect method for distributed optimal control (DOC) that is applicable to optimal planning for very-large-scale robotic (VLSR) systems in complex environments. Read the full article here.
“A robot that can perform a task better and more accurately is valuable indeed. But what if a group of robots could work together to accomplish goals and tasks better than they ever could individually? A team of researchers recently put their minds to just that concept.” Read the full article here.
The paper “Probabilistic inference under time pressure leads to a cortical-to-subcortical shift in decision evidence integration,” by Hanna Oh-Descher of Duke University in collaboration with Silvia Ferrari, has been published in NeuroImage, Vol. 162, pp. 138–150, November 2017. Read the full article here.
“Silvia Ferrari, Sibley School of Mechanical and Aerospace Engineering, with Robert J. Wood (Harvard University), is working toward a future where autonomous, small-scale robots would have similar capabilities, sensing and responding to their environments and maneuvering without human commands. These robots would be particularly invaluable for surveillance or reconnaissance missions in dangerous or remote environments.”
Read the full article here.
Drawing on recent developments ranging from computer vision to decentralized estimation and control, this project will develop a deep-learning Bayesian-optimization framework that hinges on sparse features for mobile cooperative scene perception. The methods developed in this project will be tested using real video data from Cornell’s campus as well as virtual data generated with a realistic game engine.
Cornell Chronicle: Researchers link robots to surveillance teams
Cornell Research: Collaborative Robotic Surveillance