{"id":4130,"date":"2015-05-10T05:51:41","date_gmt":"2015-05-10T05:51:41","guid":{"rendered":"http:\/\/revoscience.com\/en\/?p=4130"},"modified":"2015-05-10T06:44:36","modified_gmt":"2015-05-10T06:44:36","slug":"mit-engineers-hand-cognitive-control-to-underwater-robots","status":"publish","type":"post","link":"https:\/\/www.revoscience.com\/en\/mit-engineers-hand-cognitive-control-to-underwater-robots\/","title":{"rendered":"MIT engineers hand \u201ccognitive\u201d control to underwater robots"},"content":{"rendered":"<p style=\"color: rgb(34, 34, 34); text-align: justify;\"><em><strong>With MIT-developed algorithms, robots plan underwater missions autonomously.<\/strong><\/em><\/p>\n<figure id=\"attachment_4131\" aria-describedby=\"caption-attachment-4131\" style=\"width: 639px\" class=\"wp-caption aligncenter\"><a href=\"http:\/\/revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Ocean-Robotics-1.jpg\" target=\"_blank\" rel=\"noopener\"><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-4131\" src=\"http:\/\/revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Ocean-Robotics-1.jpg\" alt=\"Researchers watch underwater footage taken by various AUVs exploring Australia&#039;s Scott Reef. 
Courtesy of the researchers\" width=\"639\" height=\"426\" title=\"\" srcset=\"https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Ocean-Robotics-1.jpg 639w, https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Ocean-Robotics-1-300x200.jpg 300w\" sizes=\"auto, (max-width: 639px) 100vw, 639px\" \/><\/a><figcaption id=\"caption-attachment-4131\" class=\"wp-caption-text\">Researchers watch underwater footage taken by various AUVs exploring Australia&#8217;s Scott Reef.<br \/>Courtesy of the researchers<\/figcaption><\/figure>\n<p style=\"color: rgb(34, 34, 34); text-align: justify;\">For the last decade, scientists have deployed increasingly capable underwater robots to map and monitor pockets of the ocean to track the health of fisheries, and survey marine habitats and species. In general, such robots are effective at carrying out low-level tasks, specifically assigned to them by human engineers \u2014 a tedious and time-consuming process for the engineers.<\/p>\n<p style=\"color: rgb(34, 34, 34); text-align: justify;\">When deploying autonomous underwater vehicles (AUVs), much of an engineer\u2019s time is spent writing scripts, or low-level commands, in order to direct a robot to carry out a mission plan. Now a new programming approach developed by MIT engineers gives robots more \u201ccognitive\u201d capabilities, enabling humans to specify high-level goals, while a robot performs high-level decision-making to figure out how to achieve these goals.<\/p>\n<p style=\"color: rgb(34, 34, 34); text-align: justify;\">For example, an engineer may give a robot a list of goal locations to explore, along with any time constraints, as well as physical directions, such as staying a certain distance above the seafloor. Using the system devised by the MIT team, the robot can then plan out a mission, choosing which locations to explore, in what order, within a given timeframe. 
If an unforeseen event prevents the robot from completing a task, it can choose to drop that task, or reconfigure the hardware to recover from a failure, on the fly.<\/p>\n<p style=\"color: rgb(34, 34, 34); text-align: justify;\">In March, the team tested the autonomous mission-planning system during a research cruise off the western coast of Australia. Over three weeks, the MIT engineers, along with groups from Woods Hole Oceanographic Institution, the Australian Center for Field Robotics, the University of Rhode Island, and elsewhere, tested several classes of AUVs, and their ability to work cooperatively to map the ocean environment.<\/p>\n<p style=\"color: rgb(34, 34, 34); text-align: justify;\">The MIT researchers tested their system on an autonomous underwater glider, and demonstrated that the robot was able to operate safely among a number of other autonomous vehicles, while receiving higher-level commands. The glider, using the system, was able to adapt its mission plan to avoid getting in the way of other vehicles, while still achieving its most important scientific objectives. If another vehicle was taking longer than expected to explore a particular area, the glider, using the MIT system, would reshuffle its priorities, and choose to stay in its current location longer, in order to avoid potential collisions.<\/p>\n<p style=\"color: rgb(34, 34, 34); text-align: justify;\">\u201cWe wanted to show that these vehicles could plan their own missions, and execute, adapt, and re-plan them alone, without human support,\u201d says Brian Williams, a professor of aeronautics and astronautics at MIT, and principal developer of the mission-planning system. 
\u201cWith this system, we were showing we could safely zigzag all the way around the reef, like an obstacle course.\u201d<\/p>\n<p style=\"color: rgb(34, 34, 34); text-align: justify;\">Williams and his colleagues will present the mission-planning system in June at the International Conference on Automated Planning and Scheduling, in Israel.<\/p>\n<p style=\"color: rgb(34, 34, 34); text-align: justify;\"><strong>All systems go<\/strong><\/p>\n<p style=\"color: rgb(34, 34, 34); text-align: justify;\">When developing the autonomous mission-planning system, Williams\u2019 group took inspiration from the \u201cStar Trek\u201d franchise and the top-down command center of the fictional starship Enterprise, after which Williams modeled and named the system.<\/p>\n<p style=\"color: rgb(34, 34, 34); text-align: justify;\">Just as a hierarchical crew runs the fictional starship, Williams\u2019 Enterprise system incorporates levels of decision-makers. For instance, one component of the system acts as a \u201ccaptain,\u201d making higher-level decisions to plan out the overall mission, deciding where and when to explore. Another component functions as a \u201cnavigator,\u201d planning out a route to meet mission goals. The last component works as a \u201cdoctor,\u201d or \u201cengineer,\u201d diagnosing and repairing problems autonomously.<\/p>\n<p style=\"color: rgb(34, 34, 34); text-align: justify;\">\u201cWe can give the system choices, like, \u2018Go to either this or that science location and map it out,\u2019 or \u2018Communicate via an acoustic modem, or a satellite link,\u2019\u201d Williams explains. \u201cWhat the system does is, it makes those choices, but makes sure it satisfies all the timing constraints and doesn\u2019t collide with anything along the way. 
So it has the ability to adapt to its environment.\u201d<\/p>\n<p style=\"color: rgb(34, 34, 34); text-align: justify;\"><strong>Autonomy in the sea<\/strong><\/p>\n<p style=\"color: rgb(34, 34, 34); text-align: justify;\">The system is similar to one that Williams developed for NASA following the loss of the Mars Observer, a spacecraft that, days before its scheduled insertion into Mars\u2019 orbit in 1993, lost contact with NASA.<\/p>\n<p style=\"color: rgb(34, 34, 34); text-align: justify;\">\u201cThere were human operators on Earth who were experts in diagnosis and repair, and were ready to save the spacecraft, but couldn\u2019t communicate with it,\u201d Williams recalls. \u201cSubsequently, NASA realized they needed systems that could reason at the cognitive level like engineers, but that were onboard the spacecraft.\u201d<\/p>\n<p style=\"color: rgb(34, 34, 34); text-align: justify;\">Williams, who at the time was working at NASA\u2019s Ames Research Center, was tasked with developing an autonomous system that would enable spacecraft to diagnose and repair problems without human assistance. The system was successfully tested on NASA\u2019s Deep Space 1 probe, which performed an asteroid flyby in 1999.<\/p>\n<p style=\"color: rgb(34, 34, 34); text-align: justify;\">\u201cThat was the first chance to demonstrate goal-directed autonomy in deep space,\u201d Williams says. \u201cThis was a chance to do the same thing under the sea.\u201d<\/p>\n<p style=\"color: rgb(34, 34, 34); text-align: justify;\">By giving robots control of higher-level decision-making, Williams says such a system would free engineers to think about overall strategy, while AUVs determine for themselves a specific mission plan. Such a system could also reduce the size of the operational team needed on research cruises. And, most significantly from a scientific standpoint, an autonomous planning system could enable robots to explore places that otherwise would not be traversable. 
For instance, with an autonomous system, robots may not have to be in continuous contact with engineers, freeing the vehicles to explore more remote recesses of the sea.<\/p>\n<p style=\"color: rgb(34, 34, 34); text-align: justify;\">\u201cIf you look at the ocean right now, we can use Earth-orbiting satellites, but they don\u2019t penetrate much below the surface,\u201d Williams says. \u201cYou could send sea vessels which send one autonomous vehicle, but that doesn\u2019t show you a lot. This technology can offer a whole new way to observe the ocean, which is exciting.\u201d<\/p>\n<p style=\"color: rgb(34, 34, 34); text-align: justify;\">This research was funded in part by the Schmidt Ocean Institute. The underlying technology was supported in part by Boeing Co., the Keck Institute for Space Studies, the Defense Advanced Research Projects Agency, and NASA.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>With MIT-developed algorithms, robots plan underwater missions autonomously. For the last decade, scientists have deployed increasingly capable underwater robots to map and monitor pockets of the ocean to track the health of fisheries, and survey marine habitats and species. 
In general, such robots are effective at carrying out low-level tasks, specifically assigned to them by [&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":4131,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[17],"tags":[],"class_list":["post-4130","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-research"],"featured_image_urls":{"full":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Ocean-Robotics-1.jpg",639,426,false],"thumbnail":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Ocean-Robotics-1-150x150.jpg",150,150,true],"medium":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Ocean-Robotics-1-300x200.jpg",300,200,true],"medium_large":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Ocean-Robotics-1.jpg",639,426,false],"large":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Ocean-Robotics-1.jpg",639,426,false],"1536x1536":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Ocean-Robotics-1.jpg",639,426,false],"2048x2048":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Ocean-Robotics-1.jpg",639,426,false],"ultp_layout_landscape_large":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Ocean-Robotics-1.jpg",639,426,false],"ultp_layout_landscape":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Ocean-Robotics-1.jpg",639,426,false],"ultp_layout_portrait":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Ocean-Robotics-1.jpg",600,400,false],"ultp_layout_square":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Ocean-Robotics-1.jpg",600,400,false],"newspaper-x-single-post":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Ocean-Robotics-1.jpg",639,426,false],
"newspaper-x-recent-post-big":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Ocean-Robotics-1.jpg",540,360,false],"newspaper-x-recent-post-list-image":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Ocean-Robotics-1.jpg",95,63,false],"web-stories-poster-portrait":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Ocean-Robotics-1.jpg",639,426,false],"web-stories-publisher-logo":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Ocean-Robotics-1.jpg",96,64,false],"web-stories-thumbnail":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Ocean-Robotics-1.jpg",150,100,false]},"author_info":{"info":["RevoScience"]},"category_info":"<a href=\"https:\/\/www.revoscience.com\/en\/category\/news\/research\/\" rel=\"category tag\">Research<\/a>","tag_info":"Research","comment_count":"0","_links":{"self":[{"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/posts\/4130","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/comments?post=4130"}],"version-history":[{"count":0,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/posts\/4130\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/media\/4131"}],"wp:attachment":[{"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/media?parent=4130"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/categories?post=4130"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/tags?post=4130"}],
"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}