{"id":4468,"date":"2015-05-29T05:33:21","date_gmt":"2015-05-29T05:33:21","guid":{"rendered":"http:\/\/revoscience.com\/en\/?p=4468"},"modified":"2015-05-29T05:33:21","modified_gmt":"2015-05-29T05:33:21","slug":"mit-cheetah-robot-lands-the-running-jump","status":"publish","type":"post","link":"https:\/\/www.revoscience.com\/en\/mit-cheetah-robot-lands-the-running-jump\/","title":{"rendered":"MIT cheetah robot lands the running jump"},"content":{"rendered":"<p style=\"text-align: justify;\"><span style=\"color: #000000;\"><em><strong style=\"color: #222222;\">Robot sees, clears hurdles while bounding at 5 mph.<\/strong><\/em><\/span><\/p>\n<figure id=\"attachment_4470\" aria-describedby=\"caption-attachment-4470\" style=\"width: 639px\" class=\"wp-caption alignnone\"><a href=\"http:\/\/revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Cheetah-01.jpg\" target=\"_blank\" rel=\"noopener\"><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-4470 size-full\" src=\"http:\/\/revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Cheetah-01.jpg\" alt=\"MIT-Cheetah-01\" width=\"639\" height=\"426\" title=\"\" srcset=\"https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Cheetah-01.jpg 639w, https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Cheetah-01-300x200.jpg 300w\" sizes=\"auto, (max-width: 639px) 100vw, 639px\" \/><\/a><figcaption id=\"caption-attachment-4470\" class=\"wp-caption-text\">MIT Biomimetic Robotics Laboratory members pose with the MIT cheetah robot in Killian Court. (Top row, from left) Deborah Ajilo, Negin Abdolrahim Poorheravi, John Patrick Mayo, Justin Cheung, Sangbae Kim, Shinsuk Park, Kathryn L. Evans, and Matt Angle. (Bottom row, from left) Will Bosworth, Joao Luiz Almeida Souza Ramos, Sehyuk Yim, Albert Wang, Meng Yee Chuah, and Hae Won Park. 
Photo: Jose-Luis Olivares\/MIT<\/figcaption><\/figure>\n<p style=\"text-align: justify;\"><span style=\"font-weight: normal; color: #000000;\"><strong>CAMBRIDGE, MA<\/strong> &#8212; In a leap for robot development, the MIT researchers who built a robotic cheetah have now trained it to see and jump over hurdles as it runs \u2014 making this the first four-legged robot to run and jump over obstacles autonomously.<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">\u00a0<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">To get a running jump, the robot plans out its path, much like a human runner: As it detects an approaching obstacle, it estimates that object\u2019s height and distance. The robot gauges the best position from which to jump, and adjusts its stride to land just short of the obstacle, before exerting enough force to push up and over. Based on the obstacle\u2019s height, the robot then applies a certain amount of force to land safely, before resuming its initial pace.<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">\u00a0<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">In experiments on a treadmill and an indoor track, the cheetah robot successfully cleared obstacles up to 18 inches tall \u2014 more than half of the robot\u2019s own height \u2014 while maintaining an average running speed of 5 miles per hour.<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">\u00a0<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">\u201cA running jump is a truly dynamic behavior,\u201d says Sangbae Kim, an assistant professor of mechanical engineering at MIT. 
\u201cYou have to manage balance and energy, and be able to handle impact after landing. Our robot is specifically designed for those highly dynamic behaviors.\u201d<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">\u00a0<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">Kim and his colleagues \u2014 including research scientist Hae won Park and postdoc Patrick Wensing \u2014 will demonstrate their cheetah\u2019s running jump at the DARPA Robotics Challenge in June, and will present a paper detailing the autonomous system in July at the conference Robotics: Science and Systems.<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">\u00a0<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"color: #000000;\"><strong style=\"color: #222222;\">See, run, jump<\/strong><\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">\u00a0<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"color: #000000;\"><span style=\"font-weight: normal;\">Last September, the group demonstrated that the robotic cheetah was able to\u00a0<\/span><a style=\"font-weight: normal; color: #1155cc;\" href=\"http:\/\/mit.pr-optout.com\/Tracking.aspx?Data=HHL%3d8.%3c2%3f9-%3eLCE9%3b4%3b8%3f%26SDG%3c90%3a.&amp;RE=MC&amp;RI=4334046&amp;Preview=False&amp;DistributionActionID=26507&amp;Action=Follow+Link\" target=\"_blank\" rel=\"noopener\"><span style=\"color: #000000;\">run untethered<\/span><\/a><span style=\"font-weight: normal;\">\u00a0\u2014 a feat that Kim notes the robot performed \u201cblind,\u201d without the use of cameras or other vision systems.<\/span><\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">\u00a0<\/span><br style=\"font-weight: 
normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">Now, the robot can \u201csee,\u201d with the use of onboard LIDAR \u2014 a visual system that uses reflections from a laser to map terrain. The team developed a three-part algorithm to plan out the robot\u2019s path, based on LIDAR data. Both the vision and path-planning systems are onboard the robot, giving it complete autonomous control.<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">\u00a0<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">The algorithm\u2019s first component enables the robot to detect an obstacle and estimate its size and distance. The researchers devised a formula to simplify a visual scene, representing the ground as a straight line, and any obstacles as deviations from that line. With this formula, the robot can estimate an obstacle\u2019s height and distance from itself.<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">\u00a0<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">Once the robot has detected an obstacle, the second component of the algorithm kicks in, allowing the robot to adjust its approach while nearing the obstacle. 
Based on the obstacle\u2019s distance, the algorithm predicts the best position from which to jump in order to safely clear it, then backtracks from there to space out the robot\u2019s remaining strides, speeding up or slowing down in order to reach the optimal jumping-off point.<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">\u00a0<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">This \u201capproach adjustment algorithm\u201d runs on the fly, optimizing the robot\u2019s stride with every step. The optimization process takes about 100 milliseconds to complete \u2014 about half the time of a single stride.<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">\u00a0<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">When the robot reaches the jumping-off point, the third component of the algorithm takes over to determine its jumping trajectory. Based on an obstacle\u2019s height, and the robot\u2019s speed, the researchers came up with a formula to determine the amount of force the robot\u2019s electric motors should exert to safely launch the robot over the obstacle. 
The formula essentially cranks up the force applied in the robot\u2019s normal bounding gait, which Kim notes is essentially \u201csequential executions of small jumps.\u201d<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">\u00a0<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"color: #000000;\"><strong style=\"color: #222222;\">Optimal is best, feasible is better<\/strong><\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">\u00a0<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">Interestingly, Kim says the algorithm does not provide an optimal jumping control, but rather, only a feasible one.<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">\u00a0<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">\u201cIf you want to optimize for, say, energy efficiency, you would want the robot to barely clear the obstacle \u2014 but that\u2019s dangerous, and finding a truly optimal solution would take a lot of computing time,\u201d Kim says. \u201cIn running, we don\u2019t want to spend a lot of time to find a better solution. We just want one that\u2019s feasible.\u201d<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">\u00a0<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">Sometimes, that means the robot may jump much higher than it needs to \u2014 and that\u2019s OK, according to Kim: \u201cWe\u2019re too obsessed with optimal solutions. 
This is one example where you just have to be good enough, because you\u2019re running, and have to make a decision very quickly.\u201d<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">\u00a0<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">The team tested the MIT cheetah\u2019s jumping ability first on a treadmill, then on a track. On the treadmill, the robot ran tethered in place, as researchers placed obstacles of varying heights on the belt. As the treadmill itself was only about 4 meters long, the robot, running in the middle, only had 1 meter in which to detect the obstacle and plan out its jump. After multiple runs, the robot successfully cleared about 70 percent of the hurdles.<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">\u00a0<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">In comparison, tests on an indoor track proved much easier, as the robot had more space and time in which to see, approach, and clear obstacles. In these runs, the robot successfully cleared about 90 percent of obstacles.<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">\u00a0<\/span><br style=\"font-weight: normal; color: #222222;\" \/><span style=\"font-weight: normal; color: #000000;\">Kim is now working on getting the MIT cheetah to jump over hurdles while running on softer terrain, like a grassy field.<\/span><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Robot sees, clears hurdles while bounding at 5 mph. 
CAMBRIDGE, MA &#8212; In a leap for robot development, the MIT researchers who built a robotic cheetah have now trained it to see and jump over hurdles as it runs \u2014 making this the first four-legged robot to run and jump over obstacles autonomously.\u00a0To get a [&hellip;]<\/p>\n","protected":false},"author":6,"featured_media":4470,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[14,28],"tags":[],"class_list":["post-4468","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-innovation","category-techbiz"],"featured_image_urls":{"full":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Cheetah-01.jpg",639,426,false],"thumbnail":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Cheetah-01-150x150.jpg",150,150,true],"medium":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Cheetah-01-300x200.jpg",300,200,true],"medium_large":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Cheetah-01.jpg",639,426,false],"large":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Cheetah-01.jpg",639,426,false],"1536x1536":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Cheetah-01.jpg",639,426,false],"2048x2048":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Cheetah-01.jpg",639,426,false],"ultp_layout_landscape_large":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Cheetah-01.jpg",639,426,false],"ultp_layout_landscape":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Cheetah-01.jpg",639,426,false],"ultp_layout_portrait":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Cheetah-01.jpg",600,400,false],"ultp_layout_square":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Cheetah-01.jpg",600,400,false],"newspaper-x-sin
gle-post":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Cheetah-01.jpg",639,426,false],"newspaper-x-recent-post-big":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Cheetah-01.jpg",540,360,false],"newspaper-x-recent-post-list-image":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Cheetah-01.jpg",95,63,false],"web-stories-poster-portrait":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Cheetah-01.jpg",639,426,false],"web-stories-publisher-logo":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Cheetah-01.jpg",96,64,false],"web-stories-thumbnail":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/05\/MIT-Cheetah-01.jpg",150,100,false]},"author_info":{"info":["Amrita Tuladhar"]},"category_info":"<a href=\"https:\/\/www.revoscience.com\/en\/category\/innovation\/\" rel=\"category tag\">Innovation<\/a> <a href=\"https:\/\/www.revoscience.com\/en\/category\/techbiz\/\" rel=\"category 
tag\">Tech<\/a>","tag_info":"Tech","comment_count":"0","_links":{"self":[{"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/posts\/4468","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/users\/6"}],"replies":[{"embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/comments?post=4468"}],"version-history":[{"count":0,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/posts\/4468\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/media\/4470"}],"wp:attachment":[{"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/media?parent=4468"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/categories?post=4468"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/tags?post=4468"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}