This is a valid RSS feed.
This feed is valid, but interoperability with the widest range of feed readers could be improved by implementing the following recommendations.
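A quick, independent way to reproduce a basic check of the points above is to parse the feed programmatically. This is a minimal sketch, not part of the validator's output, using the third-party Python library feedparser, which tolerates imperfect feeds but reports problems; the URL is the feed's own self-link:

import feedparser

# Parse the feed. feedparser sets d.bozo when the XML is not well-formed
# or otherwise suspect, and records the reason in d.bozo_exception.
d = feedparser.parse("https://spectrum.ieee.org/feeds/topic/robotics.rss")
if d.bozo:
    print("parser warning:", d.bozo_exception)

print(d.feed.title)               # expected: IEEE Spectrum
print(len(d.entries), "items")
for entry in d.entries[:3]:
    print("-", entry.title, "|", entry.published)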
[Recommendation messages and their source snippets omitted; most pointed at line 2, column 0 of the feed, with occurrence counts ranging from 12 to 402, and a few pointed at positions inside the embedded article markup.]
<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:media="http://search.yahoo.com/mrss/"><channel><title>IEEE Spectrum</title><link>https://spectrum.ieee.org/</link><description>IEEE Spectrum</description><atom:link href="https://spectrum.ieee.org/feeds/topic/robotics.rss" rel="self"></atom:link><language>en-us</language><lastBuildDate>Fri, 20 Sep 2024 15:30:04 -0000</lastBuildDate><image><url>https://spectrum.ieee.org/media-library/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8yNjg4NDUyMC9vcmlnaW4ucG5nIiwiZXhwaXJlc19hdCI6MTc2MzA3MTQzOX0.SxRBIud_XE2YWQFaIJD9BPB1w-3JsFhiRkJIIe9Yq-g/image.png?width=210</url><link>https://spectrum.ieee.org/</link><title>IEEE Spectrum</title></image><item><title>Video Friday: Zipline Delivers</title><link>https://spectrum.ieee.org/video-friday-zipline-drones</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-small-drone-descends-out-of-the-body-of-a-larger-drone-which-is-attached-underneath-a-charging-dock.png?id=53650884&width=1200&height=800&coordinates=150%2C0%2C150%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://icra40.ieee.org/">ICRA@40</a>: 23–26 September 2024, ROTTERDAM, NETHERLANDS</h5><h5><a href="https://iros2024-abudhabi.org/">IROS 2024</a>: 14–18 October 2024, ABU DHABI, UAE</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="ny2rjint1k0">Zipline has (finally) posted some real live footage of <a href="https://spectrum.ieee.org/delivery-drone-zipline-design" target="_blank">its new Platform 2 drone</a>, and while it’s just as weird looking as before, it seems to actually work really well.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="be48f346c963d367370f021ec36717cc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/nY2RjINT1K0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.flyzipline.com/">Zipline</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="erxs98c_suc">I appreciate <a href="https://studios.disneyresearch.com/" target="_blank">Disney Research</a>’s insistence on always eventually asking, “okay, but can we get this to work on a real robot in the real world?”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="02828957756c9a0015f2097ce213f00a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/eRXS98c_Suc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://la.disneyresearch.com/wp-content/uploads/RobotMDM_2.pdf">Paper from ETH Zurich and Disney Research [PDF]</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="wgthz30kklk"><em>In this video, we showcase our humanoid robot, Nadia, being remotely controlled for boxing training using a simple VR motion capture setup. A remote user takes charge of Nadia’s movements, demonstrating the power of our advanced teleoperation system. 
Watch as Nadia performs precise boxing moves, highlighting the potential for humanoid robots in dynamic, real-world tasks.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3127b613394126525697faa29726b631" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wgthZ30kkLk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ihmc.us/nadia-humanoid/">IHMC</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="0t-rnul2yxa"><em>Guide dogs are expensive to train and maintain—if available at all. Because of these limiting factors, relatively few blind people use them. Computer science assistant professor Donghyun Kim and Ph.D. candidate Hochul Hwang are hoping to change that with the help of UMass database analyst Gail Gunn and her guide dog, Brawny.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="22d3b9573bcb2e0e60209043f0637239" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0t-RNUl2YXA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.umass.edu/news/article/optimize-guide-dog-robots-first-listen-visually-impaired">University of Massachusetts, Amherst</a> ]</p><p>Thanks, Julia!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="xu4dtf5ydpc"><em>The current paradigm for motion planning generates solutions from scratch for every new problem, which consumes significant amounts of time and computational resources. Our approach builds a large number of complex scenes in simulation, collects expert data from a motion planner, then distills it into a reactive generalist policy. 
We then combine this with lightweight optimization to obtain a safe path for real-world deployment.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="123a434e30d30049f5061179f2729515" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xU4DTF5YDpc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://mihdalal.github.io/neuralmotionplanner/">Neural MP</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="uxfpkj_tjje">A nice mix of <a data-linked-post="2650254129" href="https://spectrum.ieee.org/nao-robot-does-star-wars" target="_blank">NAO</a> and AI for embodied teaching.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f6a00ed66d2004cb4fbe53b867f05ae9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/uXFpKJ_TjjE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.aldebaran.com/en/nao">Aldebaran</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="q6tuhgjamr8"><em>When retail and logistics giant Otto Group set out to strengthen its operational efficiency and safety, it turned to robotics and automation. The Otto Group has become the first company in Europe to deploy the mobile case handling robot Stretch, which unloads floor-loaded trailers and containers. </em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="740ac12bb4ab9c3b1ca9b9b1a2f7aa97" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/q6TuHGJamR8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/case-studies/stretch-enhances-logistics-maintenance-at-otto-group/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="85jpmfk9ony"><em>From groceries to last-minute treats, <a href="https://wing.com/" target="_blank">Wing</a> is here to make sure deliveries arrive quickly and safely. Our latest aircraft design features a larger, more standardized box and can carry a higher payload, improvements that came directly from customer and partner feedback. 
</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="555816c97318cf871fbebb1eb96443e2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/85JpmFK9onY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.linkedin.com/pulse/customer-demand-wings-aircraft-library-design-approach-shapes-new-v0vnc/">Wing</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="7qcqmwj_zne">It’s the jacket that gets me.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ceb5c2cb55160178b9590245e27f5039" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7QCqmwJ_ZNE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.devanthro.com/">Devanthro</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="vaqq-fzqcao"><em>In this video, we introduce Rotograb, a robotic hand that merges the dexterity of human hands with the strength and efficiency of industrial grippers. Rotograb features a new rotating thumb mechanism, allowing for precision in-hand manipulation and power grasps while being adaptable. The robotic hand was developed by students during “Real World Robotics”, a master’s course by the Soft Robotics Lab at ETH Zurich.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="af1639a50c95c85b0262f0f15de0b3f1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vaQQ-fzqcao?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://rwr.ethz.ch/">ETH Zurich</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="cxhwqfmfhxm"><em>A small scene where Rémi, our distinguished professor, is teaching chess to the person remotely operating Reachy! The grippers allow for easy and precise handling of chess pieces, even the small ones! The robot shown in this video is the Beta version of Reachy 2, our new robot coming very soon!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5f64c30348be165038a154959d23c759" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/cxHwqfMFHXM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pollen-robotics.com/">Pollen</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ckhqxjtvkqq"><em>Enhancing the adaptability and versatility of unmanned micro aerial vehicles (MAVs) is crucial for expanding their application range. 
In this article, we present a bimodal reconfigurable robot capable of operating in both regular quadcopter flight mode and a unique revolving flight mode, which allows independent control of the vehicle’s position and roll-pitch attitude.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1eb252a6c82bbe8d159cd47f97153db4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ckHqXJTVKqQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ris.bme.cityu.edu.hk/">City University Hong Kong</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="hnnr70n8gzo"><em>The Parallel Continuum Manipulator (PACOMA) is an advanced robotic system designed to replace traditional robotic arms in space missions, such as exploration, in-orbit servicing, and docking. Its design emphasizes robustness against misalignments and impacts, high precision and payload capacity, and sufficient mechanical damping for stable, controlled movements.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4ddc4cd72a9ae69fff037fb939d286c5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/hnNr70n8gZo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotik.dfki-bremen.de/de/forschung/projekte/continuumpkm">DFKI Robotics Innovation Center</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="wjdycw5vpbm">Even the FPV pros from <a data-linked-post="2650271004" href="https://spectrum.ieee.org/drone-flies-over-world-tallest-building" target="_blank">Team BlackSheep</a> do, very occasionally, crash.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ba83e0e47ccb65bd43466ed0dc729081" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wJDYCW5vPBM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.team-blacksheep.com/">Team BlackSheep</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="xcy2-edq-x0">This is a one-hour uninterrupted video of a robot cleaning bathrooms in real time. 
I’m not sure if it’s practical, but I am sure that it’s impressive, honestly.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="65e02613ea27a41c6437bbca107b8338" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/XCY2-eDQ-X0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://getsomatic.com/">Somatic</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 20 Sep 2024 15:30:04 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-zipline-drones</guid><category>Video friday</category><category>Zipline</category><category>Disney research</category><category>Somatic</category><category>Nadia</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-small-drone-descends-out-of-the-body-of-a-larger-drone-which-is-attached-underneath-a-charging-dock.png?id=53650884&width=980"></media:content></item><item><title>ICRA@40 Conference Celebrates 40 Years of IEEE Robotics</title><link>https://spectrum.ieee.org/icra40-conference</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/image.jpg?id=53633920&width=1200&height=800&coordinates=0%2C98%2C0%2C99"/><br/><br/><p>Four decades after the first IEEE International Conference on Robotics and Automation (ICRA) in Atlanta, robotics is bigger than ever. Next week in Rotterdam is the <a href="https://icra40.ieee.org/" rel="noopener noreferrer" target="_blank"><u>IEEE ICRA@40 conference</u></a>, “a celebration of 40 years of pioneering research and technological advancements in robotics and automation.” There’s an ICRA every year, of course. Arguably the largest robotics research conference in the world, the 2024 edition was held in Yokohama, Japan back in May.</p><p>ICRA@40 is <em>not </em>just a second ICRA conference in 2024. Next week’s conference is a single track that promises “a journey through the evolution of robotics and automation,” through four days of short keynotes from prominent roboticists from across the entire field. <a href="https://icra40.ieee.org/icra-2024/distinguished-speakers/" rel="noopener noreferrer" target="_blank"><u>You can see for yourself, the speaker list is nuts</u></a>. There are also debates and panels tackling big ideas, like: “What progress has been made in different areas of robotics and automation over the past decades, and what key challenges remain?” Personally, I’d say “lots” and “most of them,” but that’s probably why I’m not going to be up on stage.</p><p>There will also be interactive research presentations, live demos, an expo, and more—<a href="https://icra40.ieee.org/icra-2024/program/" rel="noopener noreferrer" target="_blank"><u>the conference schedule is online now</u></a>, and <a href="https://ras.papercept.net/conferences/conferences/ICRAX24/program/ICRAX24_ProgramAtAGlanceWeb.html" rel="noopener noreferrer" target="_blank"><u>the abstracts are online as well</u></a>. I’ll be there to cover it all, but if you can make it in person, it’ll be worth it.</p><hr/><p>Forty years ago is a long time, but it’s not <em>that</em> long, so just for fun, I had a look at the <a href="https://ieeexplore.ieee.org/xpl/conhome/8150/proceeding" target="_blank">proceedings of ICRA 1984</a> which are available on <a href="https://ieeexplore.ieee.org/Xplore/home.jsp" target="_blank">IEEE Xplore</a>, if you’re curious. Here’s an excerpt of the forward from the organizers, which included folks from International Business Machines and Bell Labs:</p><blockquote><em>The proceedings of the first IEEE Computer Society International Conference on Robotics contains papers covering practically all aspects of robotics. The response to our call for papers has been overwhelming, and the number of papers submitted by authors outside the United States indicates the strong international interest in robotics.</em><br/><em>The Conference program includes papers on: computer vision; touch and other local sensing; manipulator kinematics, dynamics, control and simulation; robot programming languages, operating systems, representation, planning, man-machine interfaces; multiple and mobile robot systems.</em><br/><em>The technical level of the Conference is high with papers being presented by leading researchers in robotics. 
We believe that this conference, the first of a series to be sponsored by the IEEE, will provide a forum for the dissemination of fundamental research results in this fast developing field.</em></blockquote><p>Technically, this was “ICR,” not “ICRA,” and it was put on by the IEEE Computer Society’s Technical Committee on Robotics, since there was no IEEE Robotics and Automation Society at that time; <a href="https://ethw.org/IEEE_Robotics_&_Automation_Society_History" rel="noopener noreferrer" target="_blank"><u>RAS didn’t get off the ground until 1987</u></a>.</p><p>1984 ICR(A) had two tracks and featured about 75 papers presented over three days. Looking through the proceedings, you’ll find lots of familiar names: <a href="https://ieeexplore.ieee.org/author/37279023100" rel="noopener noreferrer" target="_blank">Harry Asada</a>, <a href="https://ieeexplore.ieee.org/author/37298488400" rel="noopener noreferrer" target="_blank">Ruzena Bajcsy</a>, <a href="https://ieeexplore.ieee.org/author/37355483800" rel="noopener noreferrer" target="_blank">Ken Salisbury</a>, <a href="https://ieeexplore.ieee.org/author/37280269300" rel="noopener noreferrer" target="_blank">Paolo Dario</a>, <a href="https://ieeexplore.ieee.org/author/37273994200" target="_blank">Matt Mason</a>, <a href="https://ieeexplore.ieee.org/author/37279174500" target="_blank">Toshio Fukuda</a>, <a href="https://ieeexplore.ieee.org/author/37295062900" target="_blank">Ron Fearing</a>, and <a href="https://ieeexplore.ieee.org/author/37355708600" target="_blank">Marc Raibert</a>. Many of these folks will be at ICRA@40, so if you see them, make sure to thank them for helping to start it all, because 40 years of robotics is definitely something to celebrate.</p>]]></description><pubDate>Wed, 18 Sep 2024 11:30:03 +0000</pubDate><guid>https://spectrum.ieee.org/icra40-conference</guid><category>Icra</category><category>Icra 2024</category><category>Conferences</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/image.jpg?id=53633920&width=980"></media:content></item><item><title>One AI Model to Rule All Robots</title><link>https://spectrum.ieee.org/machine-learning-and-robotics</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-collage-of-five-video-stills-showing-different-robot-embodiments-in-action-including-arms-quadruped-and-a-mobile-base-robo.jpg?id=53636239&width=1200&height=800&coordinates=56%2C0%2C56%2C0"/><br/><br/><p>The software used to control a <a data-linked-post="2661984163" href="https://spectrum.ieee.org/robocup-robots-guide" target="_blank">robot</a> is normally highly adapted to its specific physical set up. But now researchers have created a single general-purpose robotic control policy that can operate robotic arms, wheeled robots, quadrupeds, and even drones.</p><p> One of the biggest challenges when it comes to applying <a data-linked-post="2658442416" href="https://spectrum.ieee.org/number-representation" target="_blank">machine learning</a> to robotics is the paucity of data. While <a data-linked-post="2657095345" href="https://spectrum.ieee.org/synthetic-data-computer-vision" target="_blank">computer vision</a> and <a data-linked-post="2650279156" href="https://spectrum.ieee.org/natural-language-processing-dates-back-to-kabbalist-mystics" target="_blank">natural language processing</a> can piggyback off the vast quantities of image and text data found on the Internet, collecting robot data is costly and time-consuming.</p><p> To get around this, there have been <a href="https://spectrum.ieee.org/global-robotic-brain" target="_blank">growing efforts to pool data</a> collected by different groups on different kinds of robots, including the <a href="https://robotics-transformer-x.github.io/" rel="noopener noreferrer" target="_blank">Open X-Embodiment</a> and <a href="https://droid-dataset.github.io/" rel="noopener noreferrer" target="_blank">DROID</a> datasets. The hope is that training on diverse robotics data will lead to “positive transfer,” which refers to when skills learned from training on one task help to boost performance on another. </p><p> The problem is that robots often have very different embodiments—a term used to describe their physical layout and suite of sensors and actuators—so the data they collect can vary significantly. For instance, a robotic arm might be static, have a complex arrangement of joints and fingers, and collect video from a camera on its wrist. In contrast, a quadruped robot is regularly on the move and relies on force feedback from its legs to maneuver. The kinds of tasks and actions these machines are trained to carry out are also diverse: The arm may pick and place objects, while the quadruped needs keen navigation.</p><p> That makes training a single AI model on these large collections of data challenging, says <a href="https://homerwalke.com/" rel="noopener noreferrer" target="_blank">Homer Walke</a>, a Ph.D. student at the University of California, Berkeley. So far, most attempts have either focused on data from a narrower selection of similar robots or researchers have manually tweaked data to make observations from different robots more similar. 
But in <a href="https://arxiv.org/abs/2408.11812" target="_blank">research</a> to be presented at the <a href="https://www.corl.org/" target="_blank">Conference on Robot Learning (CoRL)</a> in Munich in November, they unveiled a new model called <a href="https://crossformer-model.github.io/" rel="noopener noreferrer" target="_blank">CrossFormer</a> that can train on data from a diverse set of robots and control them just as well as specialized control policies.</p><p> “We want to be able to train on all of this data to get the most capable robot,” says Walke. “The main advance in this paper is working out what kind of architecture works the best for accommodating all these varying inputs and outputs.”</p><h2>How to control diverse robots with the same AI model</h2><p> The team used the same model architecture that powers large language model, known as a <a data-linked-post="2657108902" href="https://spectrum.ieee.org/nvidias-next-gpu-shows-that-transformers-are-transforming-ai" target="_blank">transformer</a>. In many ways, the challenge the researchers were trying to solve is not dissimilar to that facing a <a data-linked-post="2664307185" href="https://spectrum.ieee.org/weird-ai" target="_blank">chatbot</a>, says Walke. In language modeling, the AI has to to pick out similar patterns in sentences with different lengths and word orders. Robot data can also be arranged in a sequence much like a written sentence, but depending on the particular embodiment, observations and actions vary in length and order too.</p><p> “Words might appear in different locations in a sentence, but they still mean the same thing,” says Walke. “In our task, an observation image might appear in different locations in the sequence, but it’s still fundamentally an image and we still want to treat it like an image.”</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="2d53883c023c90ee12af94ea37e391cc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/m8YDtaroWRc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
<small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://crossformer-model.github.io/" rel="noopener noreferrer" target="_blank">UC Berkeley/Carnegie Mellon University</a></small>
</p><p> Most machine learning approaches work through a sequence one element at a time, but transformers can process the entire stream of data at once. This allows them to analyze the relationship between different elements and makes them better at handling sequences that are not standardized, much like the diverse data found in large robotics datasets.</p><p> Walke and his colleagues aren’t the first to train transformers on large-scale robotics data. But previous approaches have either trained solely on data from robotic arms with broadly similar embodiments or manually converted input data to a common format to make it easier to process. In contrast, CrossFormer can process images from cameras positioned above a robot, at head height, or on a robotic arm’s wrist, as well as joint position data from both quadrupeds and robotic arms, without any tweaks.</p><p> The result is a single control policy that can operate single robotic arms, pairs of robotic arms, quadrupeds, and wheeled robots on tasks as varied as picking and placing objects, cutting sushi, and obstacle avoidance. Crucially, it matched the performance of specialized models tailored for each robot and outperformed previous approaches trained on diverse robotic data. The team even tested whether the model could control an embodiment not included in the dataset—a small quadcopter. While they simplified things by making the drone fly at a fixed altitude, CrossFormer still outperformed the previous best method.</p><p> “That was definitely pretty cool,” says <a href="https://www.linkedin.com/in/riadoshi" rel="noopener noreferrer" target="_blank">Ria Doshi</a>, an undergraduate student at Berkeley. “I think that as we scale up our policy to be able to train on even larger sets of diverse data, it’ll become easier to see this kind of zero-shot transfer onto robots that have been completely unseen in the training.”</p><h2>The limitations of one AI model for all robots</h2><p> The team admits there’s still work to do, however. The model is too big for any of the robots’ embedded chips and instead has to be run from a server. Even then, processing times are only just fast enough to support real-time operation, and Walke admits that could break down if they scale up the model. “When you pack so much data into a model it has to be very big and that means running it for real-time control becomes difficult.”</p><p>One potential workaround would be to use an approach called distillation, says <a href="https://www.oiermees.com/" target="_blank">Oier Mees</a>, a postdoctoral researcher at Berkeley and part of the CrossFormer team. This essentially involves training a smaller model to mimic the larger model, and, if successful, can result in similar performance for a much smaller computational budget.</p><p>But more important than the computing resource problem is that the team failed to see any positive transfer in their experiments, as CrossFormer simply matched previous performance rather than exceeding it. Walke thinks progress in computer vision and natural language processing suggests that training on more data could be the key.</p><p> Others say it might not be that simple. <a href="https://web.stanford.edu/~bohg/" rel="noopener noreferrer" target="_blank">Jeannette Bohg</a>, a professor of robotics at Stanford University, says the ability to train on such a diverse dataset is a significant contribution. 
But she wonders whether part of the reason why the researchers didn’t see positive transfer is their insistence on not aligning the input data. Previous research that trained on robots with similar observation and action data has shown evidence of such cross-overs. “By getting rid of this alignment, they may have also gotten rid of this significant positive transfer that we’ve seen in other work,” Bohg says.</p><p> It’s also not clear if the approach will boost performance on tasks specific to particular embodiments or robotic applications, says <a href="https://www.inf.ed.ac.uk/people/staff/Ram_Ramamoorthy.html" rel="noopener noreferrer" target="_blank">Ram Ramamoorthy</a>, a robotics professor at Edinburgh University. The work is a promising step towards helping robots capture concepts common to most robots, like “avoid this obstacle,” he says. But it may be less useful for tackling control problems specific to a particular robot, such as how to knead dough or navigate a forest, which are often the hardest to solve.</p>]]></description><pubDate>Fri, 13 Sep 2024 17:58:17 +0000</pubDate><guid>https://spectrum.ieee.org/machine-learning-and-robotics</guid><category>Robotics</category><category>Artificial intelligence</category><category>Machine learning</category><category>Embodied intelligence</category><category>Robotic arm</category><category>Quadruped robots</category><category>Quadcopters</category><dc:creator>Edd Gent</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-collage-of-five-video-stills-showing-different-robot-embodiments-in-action-including-arms-quadruped-and-a-mobile-base-robo.jpg?id=53636239&width=980"></media:content></item><item><title>Driving Middle East’s Innovation in Robotics and Future of Automation</title><link>https://spectrum.ieee.org/iros2024-abu-dhabi</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-photo-of-two-women-looking-at-a-robotic-arm.jpg?id=53624597&width=1200&height=800&coordinates=468%2C0%2C0%2C0"/><br/><br/><p><em>This is a sponsored article brought to you by <a href="https://www.ku.ac.ae/" target="_blank">Khalifa University of Science and Technology</a>.</em></p><p>
Abu Dhabi-based Khalifa University of Science and Technology in the United Arab Emirates (UAE) will be hosting the 36th edition of the IEEE/RSJ International Conference on Intelligent Robots and Systems
<a href="https://iros2024-abudhabi.org/program-overview" rel="noopener noreferrer" target="_blank">(IROS 2024</a>) to highlight the Middle East and North Africa (MENA) region’s rapidly advancing capabilities in the robotics and intelligent transport systems.
</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-left" data-rm-resized-container="25%" style="float: left;">
<a href="https://iros2024-abudhabi.org/" target="_blank"></a><a class="shortcode-media-lightbox__toggle shortcode-media-controls__button material-icons" style="background: gray;" title="Select for lightbox">aspect_ratio</a><img alt="Logo for IROS 2024 robotics conference, featuring a line drawing of electrical devices and the words IROS 24 and Abu Dhabi." class="rm-shortcode rm-resized-image" data-rm-shortcode-id="0c4e18743f4f1d3c4852348921be5f7c" data-rm-shortcode-name="rebelmouse-image" id="4c34c" loading="lazy" src="https://spectrum.ieee.org/media-library/logo-for-iros-2024-robotics-conference-featuring-a-line-drawing-of-electrical-devices-and-the-words-iros-24-and-abu-dhabi.jpg?id=53624896&width=980" style="max-width: 100%"/>
</p><p>
Themed “Robotics for Sustainable Development,” IROS 2024 will be held from 14–18 October 2024 at the Abu Dhabi National Exhibition Center (ADNEC) in the UAE’s capital city. It will offer a platform for universities and research institutions to display their research and innovation activities and initiatives in robotics, gathering researchers, academics, leading companies, and industry professionals from around the globe.
<br/>
</p><p>
A total of 13 forums, nine global-level
<a href="https://iros2024-abudhabi.org/competitions" target="_blank">competitions and challenges</a> covering various aspects of robotics and AI, an IROS Expo, and an exclusive Career Fair will also be part of IROS 2024. The challenges and competitions will focus on the physical or athletic intelligence of robots, remote robot navigation, robot manipulation, underwater robotics, and perception and sensing.
</p><p>
Delegates will represent sectors including manufacturing, healthcare, logistics, agriculture, defense, security, and mining, with 60 percent of the talent pool having over six years of experience in robotics. Major components of the conference will be the poster sessions, keynotes, panel discussions by researchers and scientists, and networking events.
</p><p class="shortcode-media shortcode-media-rebelmouse-image">
<img alt="A photo of two people in front of a red robot." class="rm-shortcode" data-rm-shortcode-id="8d6e827cdb93a608ac28dc4c666cca21" data-rm-shortcode-name="rebelmouse-image" id="9786e" loading="lazy" src="https://spectrum.ieee.org/media-library/a-photo-of-two-people-in-front-of-a-red-robot.jpg?id=53624909&width=980"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">Khalifa University will be hosting IROS 2024 to highlight the Middle East and North Africa (MENA) region’s rapidly advancing capabilities in the robotics and intelligent transport systems.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Khalifa University</small></p><p>
Abu Dhabi ranks first out of 329 global cities on online database Numbeo’s 2024 list of the world’s safest cities, a title the emirate has held for eight consecutive years since 2017, reflecting its ongoing efforts to ensure a good quality of life for citizens and residents.
</p><p>
With a multicultural community, Abu Dhabi is home to people from more than 200 nationalities and draws a large number of tourists to some of the top art galleries in the city such as Louvre Abu Dhabi and the Guggenheim Abu Dhabi, as well as other destinations such as Ferrari World Abu Dhabi and Warner Bros. World Abu Dhabi.
</p><p>
The UAE, and Abu Dhabi in particular, has increasingly become a center for creative skill sets, human capital, and advanced technologies, attracting several international and regional events such as the global COP28 UAE climate summit, in which more than 160 countries participated.
</p><p>
Abu Dhabi city itself has hosted a number of association conventions such as the 34th International Nursing Research Congress and is set to host the UNCTAD World Investment Forum, the 13th World Trade Organization (WTO) Ministerial Conference (MC13), the 12th World Environment Education Congress in 2024, and the IUCN World Conservation Congress in 2025.
</p><p class="shortcode-media shortcode-media-rebelmouse-image">
<img alt="A photo of a man looking at a sensor." class="rm-shortcode" data-rm-shortcode-id="2c350a0b959a3545417adfaa47922c4c" data-rm-shortcode-name="rebelmouse-image" id="00d49" loading="lazy" src="https://spectrum.ieee.org/media-library/a-photo-of-a-man-looking-at-a-sensor.jpg?id=53624901&width=980"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">Khalifa University’s Center for Robotics and Autonomous Systems (KU-CARS) includes a vibrant multidisciplinary environment for conducting robotics and autonomous vehicle-related research and innovation.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Khalifa University</small></p><p>
Dr. Jorge Dias, IROS 2024 General Chair, said: “Khalifa University is delighted to bring the Intelligent Robots and Systems 2024 to Abu Dhabi in the UAE and highlight the innovations in line with the theme Robotics for Sustainable Development. As the region’s rapidly advancing capabilities in robotics and intelligent transport systems gain momentum, this event serves as a platform to incubate ideas, exchange knowledge, foster collaboration, and showcase our research and innovation activities. By hosting IROS 2024, Khalifa University aims to reaffirm the UAE’s status as a global innovation hub and destination for all industry stakeholders to collaborate on cutting-edge research and explore opportunities for growth within the UAE’s innovation ecosystem.”
</p><p class="pull-quote">“This event serves as a platform to incubate ideas, exchange knowledge, foster collaboration, and showcase our research and innovation activities” <strong>—Dr. Jorge Dias, IROS 2024 General Chair</strong></p><p>
Dr. Dias added: “The organizing committee of IROS 2024 has received over 4,000 submissions representing 60 countries, with China leading with 1,029 papers, followed by the U.S. (777), Germany (302), and Japan (253), as well as the U.K. and South Korea (173 each). The UAE, with a total of 68 papers, comes atop the Arab region.”
</p><p>
Driving innovation at Khalifa University is the Center for Robotics and Autonomous Systems (KU-CARS), which brings together around 50 researchers and state-of-the-art laboratory facilities in a vibrant multidisciplinary environment for conducting robotics and autonomous vehicle-related research and innovation.
</p><p>
IROS 2024 is sponsored by IEEE Robotics and Automation Society, Abu Dhabi Convention and Exhibition Bureau, the Robotics Society of Japan (RSJ), the Society of Instrument and Control Engineers (SICE), the New Technology Foundation, and the IEEE Industrial Electronics Society (IES).
</p><p>
More information at
<a href="https://iros2024-abudhabi.org/" target="_blank"><u>https://iros2024-abudhabi.org/</u></a>
</p>]]></description><pubDate>Fri, 13 Sep 2024 16:29:09 +0000</pubDate><guid>https://spectrum.ieee.org/iros2024-abu-dhabi</guid><category>Abu dhabi</category><category>Autonomous systems</category><category>Innovation</category><category>Robotics</category><category>Automation</category><dc:creator>Khalifa University</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-photo-of-two-women-looking-at-a-robotic-arm.jpg?id=53624597&width=980"></media:content></item><item><title>Video Friday: Jumping Robot Leg, Walking Robot Table</title><link>https://spectrum.ieee.org/video-friday-robot-legs</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/two-men-watch-as-a-tethered-robotic-leg-jumps-around-a-test-area.jpg?id=53623104&width=1200&height=800&coordinates=0%2C0%2C0%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://icra40.ieee.org/">ICRA@40</a>: 23–26 September 2024, ROTTERDAM, NETHERLANDS</h5><h5><a href="https://iros2024-abudhabi.org/">IROS 2024</a>: 14–18 October 2024, ABU DHABI, UAE</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="qaftt2lfdl8"><em>Researchers at the Max Planck Institute for Intelligent Systems and ETH Zurich have developed a robotic leg with artificial muscles. Inspired by living creatures, it jumps across different terrains in an agile and energy-efficient manner.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="dff5f00de87c9710e02baeb2cdd24d9a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/qafTt2LFDl8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.nature.com/articles/s41467-024-51568-3"><em>Nature</em></a> ] via [ <a href="https://is.mpg.de/news/artificial-muscles-propel-a-robotic-leg-to-walk-and-jump">MPI</a> ]</p><p>Thanks, Toshi!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="m9kc0l1ziaa"><em>ETH Zurich researchers have now developed a fast robotic printing process for earth-based materials that does not require cement. In what is known as “impact printing,” a robot shoots material from above, gradually building a wall. On impact, the parts bond together, and very minimal additives are required. 
</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="90f3791602ba8e6b6301eaed0a8c2f10" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/M9kc0L1zIaA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://gramaziokohler.arch.ethz.ch/web/e/forschung/451.html">ETH Zurich</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="mi6do-x67cm">How could you not be excited to see <a href="https://spectrum.ieee.org/flying-humanoid-robot" target="_blank">this happen for real</a>?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6d9d2e111215eb210548058682b62edb" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/mi6Do-x67CM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2309.12784">arXiv paper</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="2bvu5fj80xm">Can we all agree that sanding, grinding, deburring, and polishing tasks are really best done by robots, for the most part?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7163f6b81ee719ab0f0174d3342501eb" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2BVU5FJ80xM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.cohesiverobotics.com/products/smart-finishing-robotic-workcell">Cohesive Robotics</a> ]</p><p>Thanks, David!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="tqdzxn_k5nu"><em><a href="https://www.youtube.com/watch?v=UUOo8N9_iH0" target="_blank">Using doors</a> is a longstanding challenge in robotics and is of significant practical interest in giving robots greater access to human-centric spaces. The task is challenging due to the need for online adaptation to varying door properties and precise control in manipulating the door panel and navigating through the confined doorway. To address this, we propose a learning-based controller for a legged manipulator to open and traverse through doors.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="40579acf09c16696086f0814c60ecab9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/tQDZXN_k5NU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2409.04882">arXiv paper</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ck_r-v-m3ug"><em>Isaac is the first robot assistant that’s built for the home. 
And we’re shipping it in fall of 2025.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="34e6b1f75b1644ce2d1c83b1e7ae92b5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ck_r-v-M3ug?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Fall of 2025 is a long enough time from now that I’m not even going to speculate about it.</p><p>[ <a href="https://www.weaverobots.com/">Weave Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="vgnwuppoy9a"><em>By patterning liquid metal paste onto a soft sheet of silicone or acrylic foam tape, we developed <a href="https://spectrum.ieee.org/stretchable-electronics" target="_blank">stretchable versions of conventional rigid circuits</a> (like Arduinos). Our soft circuits can be stretched to over 300% strain (over 4x their length) and are integrated into active soft robots.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="82701174b27f8354210ffa3099f599d0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/VgNwUPpOY9A?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.science.org/doi/10.1126/scirobotics.adn6844"><em>Science Robotics</em></a> ] via [ <a href="https://www.eng.yale.edu/faboratory/">Yale</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="eh6w39ie5bs"><em><a href="https://robotsguide.com/robots/curiosity" target="_blank">NASA’s Curiosity rover</a> is exploring a scientifically exciting area on Mars, but communicating with the mission team on Earth has recently been a challenge due to both the current season and the surrounding terrain. 
In this Mars Report, Curiosity engineer Reidar Larsen takes you inside the uplink room where the team talks to the rover.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="18df5cb56c8763011500ff976f335b1e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/eH6W39iE5bs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://science.nasa.gov/mission/msl-curiosity/">NASA</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="xkdy4ywxfjm">I love this and want to burn it with fire.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="00ca6db9d679860b7e6706e55f0815fe" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xKDY4yWxfJM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.decarpentier.nl/carpentopod">Carpentopod</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="t3rdwdkza9c"><em>Very often, people ask us what Reachy 2 is capable of, which is why we’re showing you the manipulation possibilities (through teleoperation) of our technology. The robot shown in this video is the Beta version of Reachy 2, our new robot coming very soon!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="91151f172acdba992d0ebc36b8627a5d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/T3RdwDkZA9c?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pollen-robotics.com/">Pollen Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="b3ra3zjtfjw"><em>The Scalable Autonomous Robots (ScalAR) Lab is an interdisciplinary lab focused on fundamental research problems in robotics that lie at the intersection of robotics, nonlinear dynamical systems theory, and uncertainty.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e6227bd15052c498b6928373de2181d5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/b3ra3ZJtFJw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://scalar.seas.upenn.edu/">ScalAR Lab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="yq28yycz05q"><em>Astorino is a 6-axis educational robot created for practical and affordable teaching of robotics in schools and beyond. It has been created with 3D printing, so it allows for experimentation and the possible addition of parts. 
With its design and programming, it replicates the actions of #KawasakiRobotics industrial robots, giving students the necessary skills for future work.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6ea770336f849992e9fe0c56de9489dd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/yq28yYCZ05Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://astorino.com.pl/en/">Astorino</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="v5wfdmy2fp4">I guess fish-fillet-shaping robots need to exist because otherwise customers will freak out if all their fish fillets are not identical, or something?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6b27f4e90e195dcbc2a8328f1c2eec77" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/V5WfdMY2fP4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.flexiv.com/products">Flexiv</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="buj6e065hi0"><em>Watch the second episode of the ExoMars Rosalind Franklin rover mission—Europe’s ambitious exploration journey to search for past and present signs of life on Mars. The rover will dig, collect, and investigate the chemical composition of material collected by a drill. Rosalind Franklin will be the first rover to reach a depth of up to two meters below the surface, acquiring samples that have been protected from surface radiation and extreme temperatures.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="214c43e8b47a0d0a613d7944a0fcf8d5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/BUj6e065Hi0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.esa.int/Science_Exploration/Human_and_Robotic_Exploration/Exploration/ExoMars">ESA</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 13 Sep 2024 15:35:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robot-legs</guid><category>Artificial muscles</category><category>Mars exploration</category><category>Robotics videos</category><category>Stretchable circuits</category><category>Video friday</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/two-men-watch-as-a-tethered-robotic-leg-jumps-around-a-test-area.jpg?id=53623104&width=980"></media:content></item><item><title>Video Friday: HAND to Take on Robotic Hands</title><link>https://spectrum.ieee.org/video-friday-robotic-hands</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-robotic-and-human-hand-reach-toward-one-another-in-front-of-a-blue-background.jpg?id=53596105&width=1200&height=800&coordinates=0%2C2%2C0%2C2"/><br/><br/><p>
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/>
</p><h5><a href="https://icra40.ieee.org/">ICRA@40</a>: 23–26 September 2024, ROTTERDAM, NETHERLANDS</h5><h5><a href="https://iros2024-abudhabi.org/">IROS 2024</a>: 14–18 October 2024, ABU DHABI, UAE</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>
Enjoy today’s videos!
</p><div class="horizontal-rule">
</div><div style="page-break-after: always">
<span style="display:none"> </span>
</div><blockquote class="rm-anchors" id="1lf36ptwzvw">
<em>The National Science Foundation Human AugmentatioN via Dexterity Engineering Research Center (HAND ERC) was announced in August 2024. Funded for up to 10 years and $52 million, the HAND ERC is led by Northwestern University, with core members Texas A&M, Florida A&M, Carnegie Mellon, and MIT, and support from Wisconsin-Madison, Syracuse, and an innovation ecosystem consisting of companies, national labs, and civic and advocacy organizations. HAND will develop versatile, easy-to-use dexterous robot end effectors (hands).</em>
</blockquote><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="a05d97e59e5c3154aee0f615385bcadd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/1lf36ptWZvw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
</p><p>
[ <a href="https://hand-erc.org/">HAND</a> ]
</p><div class="horizontal-rule">
</div><p class="rm-anchors" id="qz7gwaubn7i">
The Environmental Robotics Lab at ETH Zurich, in partnership with <a href="https://en.wilderness-international.org/" target="_blank">Wilderness International</a> (and some help from <a href="https://www.dji.com/" target="_blank">DJI</a> and <a href="https://www.audi.com/en.html" target="_blank">Audi</a>), is using drones to sample DNA from the tops of trees in the Peruvian rainforest. Somehow, the treetops are where 60 to 90 percent of biodiversity is found, and these drones can help researchers determine what the heck is going on up there.
</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="442f6123b5c3372725529651530fcf87" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/qz7gWauBn7I?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
</p><p>
[ <a href="https://erl.ethz.ch/">ERL</a> ]
</p><p>
Thanks, Steffen!
</p><div class="horizontal-rule">
</div><p class="rm-anchors" id="burluuxv9ge">
1X introduces NEO Beta, “the pre-production build of our home humanoid.”
</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="9ba7be9452be876874d279acf332a1df" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/bUrLuUxv9gE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
</p><blockquote>
<em>“Our priority is safety,” said Bernt Børnich, CEO at 1X. “Safety is the cornerstone that allows us to confidently introduce NEO Beta into homes, where it will gather essential feedback and demonstrate its capabilities in real-world settings. This year, we are deploying a limited number of NEO units in selected homes for research and development purposes. Doing so means we are taking another step toward achieving our mission.”</em>
</blockquote><p>
[ <a href="https://www.1x.tech/discover/announcement-1x-unveils-neo-beta-a-humanoid-robot-for-the-home">1X</a> ]
</p><div class="horizontal-rule">
</div><p class="rm-anchors" id="08j82rek5bw">
We love MangDang’s fun and affordable approach to robotics with Mini Pupper. The next generation of the little legged robot has just launched on <a href="https://www.kickstarter.com/projects/mdrobotkits/md-robot-kits-open-source-support-your-genai-creativity/" target="_blank">Kickstarter</a>, featuring new and updated robots that make it easy to explore embodied AI.
</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="445865c54141810318fabb248258c9c3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/08J82rEK5bw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
</p><p>
The Kickstarter is already fully funded after just a day or two, but there are still plenty of robots up for grabs.
</p><p>
[ <a href="https://www.kickstarter.com/projects/mdrobotkits/md-robot-kits-open-source-support-your-genai-creativity/">Kickstarter</a> ]
</p><div class="horizontal-rule">
</div><p class="rm-anchors" id="5qnpch34m2m">
Quadrupeds in space can use their legs to reorient themselves. Or, if you throw one off a roof, it can learn to land on its feet.
</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="2007095fb88416ccae3b871e2e25b789" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5qNPCH34M2M?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
</p><p>
To be presented at <a href="https://www.corl.org/" target="_blank">CoRL 2024</a>.
</p><p>
[ <a href="https://www.autonomousrobotslab.com/">ARL</a> ]
</p><div class="horizontal-rule">
</div><p class="rm-anchors" id="ahsqacsae2u">
HEBI Robotics, which apparently was once headquartered inside a Pittsburgh public bus, has imbued a table with actuators and a mind of its own.
</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="04a90d7c7a68abea33fe6c1f2d900494" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ahSQAcSAE2U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
</p><p>
[ <a href="https://www.hebirobotics.com/">HEBI Robotics</a> ]
</p><div class="horizontal-rule">
</div><p class="rm-anchors" id="fsxc_kbi7ey">
Carcinization is a concept in evolutionary biology where crustaceans that aren’t crabs keep evolving crab-like forms. So why not do the same thing with robots? Crab robots solve all problems!
</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="20c4886b631831155469f2b1a78d49cd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FSxC_kbI7EY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
</p><p>
[ <a href="https://news.kaist.ac.kr/newsen/html/news/?mode=V&mng_no=34950">KAIST</a> ]
</p><div class="horizontal-rule">
</div><p class="rm-anchors" id="hubwiuuz-e4">
Waymo is smart, but also humans are really, really dumb sometimes.
</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="6df5aff50de28ff9d5a2f108ede42de4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/hubWIuuz-e4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
</p><p>
[ <a href="https://waymo.com/">Waymo</a> ]
</p><div class="horizontal-rule">
</div><blockquote class="rm-anchors" id="tslev6rd2-g">
<em>The Robotics Department of the University of Michigan created an interactive community art project. The group that led the creation believed that while roboticists typically take on critical and impactful problems in transportation, medicine, mobility, logistics, and manufacturing, there are many opportunities to find play and amusement. The final piece is a grid of art boxes, produced by different members of our robotics community, which offer an eight-inch-square view into their own work with robotics.</em>
</blockquote><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="483784af7b60fc8ccdb27a590a4808a3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/TsLeV6RD2-g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
</p><p>
[ <a href="https://robotics.umich.edu/">Michigan Robotics</a> ]
</p><div class="horizontal-rule">
</div><p class="rm-anchors" id="0i3c8ed2dva">
I appreciate that UBTECH’s humanoid is doing an actual job, but why would you use a <a href="https://spectrum.ieee.org/humanoid-robots" target="_blank">humanoid</a> for this?
</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="5f4ce3d70c049e2fa26553d82bed340b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0I3c8ed2dvA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
</p><p>
[ <a href="https://www.ubtrobot.com/en/humanoid/products/WalkerS">UBTECH</a> ]
</p><div class="horizontal-rule">
</div><p class="rm-anchors" id="teastcjjd0w">
I’m sure most actuators go through some form of life-cycle testing. But if you really want to test an electric motor, put it into a BattleBot and see what happens.
</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="01e11b2a94d0f6219b4369b429e5b20a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/tEAsTCJjd0w?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
</p><p>
[ <a href="https://hardcorerobotics.com/">Hardcore Robotics</a> ]
</p><div class="horizontal-rule">
</div><p class="rm-anchors" id="_6njdqjzjra">
Yes, but have you tried fighting a BattleBot?
</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="6b38956c47f8ad7562d337fa731b0e3a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/_6NjDqjZJRA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
</p><p>
[ <a href="https://global.agilex.ai/products/piper">AgileX</a> ]
</p><div class="horizontal-rule">
</div><blockquote class="rm-anchors" id="bo1-u9dzl-m">
<em>In this video, we present collaborative aerial grasping and transportation using multiple quadrotors with cable-suspended payloads. Grasping using a suspended gripper requires accurate tracking of the electromagnet to ensure a successful grasp while switching between different slack and taut modes. In this work, we grasp the payload using a hybrid control approach that switches between quadrotor position control and payload position control based on cable slackness. Finally, we use two quadrotors with suspended electromagnet systems to collaboratively grasp and pick up a larger payload for transportation.</em>
</blockquote><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="0674dfb784fe0036b7bbc3285c93532c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Bo1-U9dzL-M?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
</p><p>
[ <a href="https://hybrid-robotics.berkeley.edu/">Hybrid Robotics</a> ]
</p><div class="horizontal-rule">
</div><p class="rm-anchors" id="gctwnmlhpaw">
I had not realized that the floretizing of broccoli was so violent.
</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="217c68d7d32896b8584037c75dbea7c4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GctwnmlHPaw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
</p><p>
[ <a href="https://www.oxipitalai.com/">Oxipital</a> ]
</p><div class="horizontal-rule">
</div><blockquote class="rm-anchors" id="gndvtlvqgpo">
<em>While the RoboCup was held over a month ago, we still wanted to make a small summary of our results, the most memorable moments, and of course an homage to everyone who is involved with the B-Human team: the team members, the sponsors, and the fans at home. Thank you so much for making B-Human the team it is!</em>
</blockquote><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="d8dab021de1aeaa7089fd915a76b7996" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GndvTlVqgPo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
</p><p>
[ <a href="https://b-human.de/index.html">B-Human</a> ]
</p><div class="horizontal-rule">
</div>]]></description><pubDate>Fri, 06 Sep 2024 15:53:53 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robotic-hands</guid><category>Robotics</category><category>Video friday</category><category>Waymo</category><category>Robocup</category><category>Bionic hand</category><category>Drones</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-robotic-and-human-hand-reach-toward-one-another-in-front-of-a-blue-background.jpg?id=53596105&width=980"></media:content></item><item><title>Unitree Demos New $16k Robot</title><link>https://spectrum.ieee.org/unitree-g1</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://www.youtube.com/embed/B2pmDShvGOY?rel=0" width="100%"></iframe><br/><p>At ICRA 2024, <em>Spectrum</em> editor <a href="https://spectrum.ieee.org/u/evan-ackerman" target="_blank">Evan Ackerman</a> sat down with <a href="https://www.unitree.com/" target="_blank">Unitree</a> founder and CEO Xingxing Wang and Tony Yang, VP of Business Development, to talk about the company’s newest humanoid, the <a href="https://www.unitree.com/g1/" target="_blank">G1 model</a>. </p><p>Smaller, more flexible, and elegant, the G1 robot is designed for general use in service and industry, and is one of the cheapest—if not <em>the</em> cheapest—humanoid around.</p>]]></description><pubDate>Fri, 30 Aug 2024 17:01:06 +0000</pubDate><guid>https://spectrum.ieee.org/unitree-g1</guid><category>Humanoid robots</category><category>Unitree</category><category>Robotics</category><dc:creator>IEEE Spectrum</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/ieee-spectrum.jpg?id=53562309&width=980"></media:content></item><item><title>Video Friday: Robots Solving Table Tennis</title><link>https://spectrum.ieee.org/video-friday-robots-table-tennis</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-series-of-pictures-showing-a-human-playing-a-table-tennis-game-with-a-white-robotic-arm.png?id=53572059&width=1200&height=800&coordinates=0%2C0%2C922%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://icra40.ieee.org/">ICRA@40</a>: 23–26 September 2024, ROTTERDAM, NETHERLANDS</h5><h5><a href="https://iros2024-abudhabi.org/">IROS 2024</a>: 14–18 October 2024, ABU DHABI, UAE</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="eqql-jqxtoe">Imbuing robots with “human-level performance” in anything is an enormous challenge, but it’s worth it when you see a robot with the skill to interact with a human on a (nearly) human level. Google DeepMind has managed to achieve amateur human-level competence at table tennis, which is much harder than it looks, even for humans. Pannag Sanketi, a tech-lead manager in the robotics team at DeepMind, shared some interesting insights about performing the research. But first, video!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9d5caf668751e84d8c060c3c5b24da44" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/EqQl-JQxToE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Some behind the scenes detail from Pannag:</p><ul><li><em>The robot had not seen any participants before. So we knew we had a cool agent, but we had no idea how it was going to fare in a full match with real humans. To witness it outmaneuver even some of the most advanced players was such a delightful moment for team! </em></li><li><em>All the participants had a lot of fun playing against the robot, irrespective of who won the match. And all of them wanted to play more. Some of them said it will be great to have the robot as a playing partner. From the videos, you can even see how much fun the user study hosts sitting there (who are not authors on the paper) are having watching the games!</em></li><li><em>Barney, who is a professional coach, was an advisor on the project, and our chief evaluator of robot’s skills the way he evaluates his students. He also got surprised by how the robot is always able to learn from the last few weeks’ sessions.</em></li><li><em>We invested a lot in remote and automated 24x7 operations. So not the setup in this video, but there are other cells that we can run 24x7 with a ball thrower.</em></li><li><em>We even tried robot-vs-robot, i.e. 2 robots playing against each other! 
:) The line between collaboration and competition becomes very interesting when they try to learn by playing with each other.</em></li></ul><p>[ <a href="">DeepMind</a> ]</p><p>Thanks, Heni!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="hf4m7tooqfe">Yoink.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7c36901152462367a05baeb3f0a71f08" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/HF4M7TooqfE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://web.mit.edu/sparklab/">MIT</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="wge1rrttoxm">Considering how their stability and recovery is often tested, teaching robot dogs to be shy of humans is an excellent idea.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c5bf3078fc6601cec5ae5a4dc40dfb5e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wGE1RrTtoXM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">Deep Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="7gullkafr7a">Yes, quadruped robots need tow truck hooks.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b8881f50a2cb1df640c76cc66ef5576a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7gullKaFr7A?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2403.19862">Paper</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="rip9zn79zgm">Earthworm-inspired robots require novel actuators, and Ayato Kanada at Kyushu University has come up with a neat one.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="43932972eb75f060a8a94c5fe03eb473" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/rIp9zN79ZGM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ieeexplore.ieee.org/document/10638030">Paper</a> ]</p><p>Thanks, Ayato!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="s75humv1yew"><em>Meet the AstroAnt! This miniaturized swarm robot can ride atop a lunar rover and collect data related to its health, including surface temperatures and damage from micrometeoroid impacts. 
In the summer of 2024, with support from our collaborator Castrol, the Media Lab’s Space Exploration Initiative tested AstroAnt in the Canary Islands, where the volcanic landscape resembles the lunar surface.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bd49574ea424509a9cb32c0edc055b72" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/S75HUMv1yew?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.media.mit.edu/posts/castrol-moon/">MIT</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="h1sew56zcwo">Kengoro has a new forearm that mimics the human radioulnar joint giving it an even more natural badminton swing.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f63909e2ddba33fa9d8a28edc1eae656" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/h1sEw56zCwo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="http://www.jsk.t.u-tokyo.ac.jp/">JSK Lab</a> ]</p><p>Thanks, Kento!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="mkyatpjmlnu"><em>Gromit’s concern that Wallace is becoming too dependent on his inventions proves justified, when Wallace invents a “smart” gnome that seems to develop a mind of its own. When it emerges that a vengeful figure from the past might be masterminding things, it falls to Gromit to battle sinister forces and save his master… or Wallace may never be able to invent again!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c3073bd785c72fb78fb3da7d77ad01f0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/mkyAtPjMLNU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.wallaceandgromit.com/">Wallace and Gromit</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ksn8xyjyjgw"><em>ASTORINO is a modern 6-axis robot based on 3D printing technology. Programmable in AS-language, it facilitates the preparation of classes with ready-made teaching materials, is easy both to use and to repair, and gives the opportunity to learn and make mistakes without fear of breaking it.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7731fdfc6459f2d44e0468c7a6076fca" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/KSn8xYjYjgw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://kawasakirobotics.com/eu-africa/products-robots/astorino/">Kawasaki</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="rbplwdnwkdw"><em>Engineers at NASA’s Jet Propulsion Laboratory are testing a prototype of IceNode, a robot designed to access one of the most difficult-to-reach places on Earth. 
The team envisions a fleet of these autonomous robots deploying into unmapped underwater cavities beneath Antarctic ice shelves. There, they’d measure how fast the ice is melting — data that’s crucial to helping scientists accurately project how much global sea levels will rise.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="dba0aaf9ff3a06aac7df2850f03ed07f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/rBplwDNwKDw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.nasa.gov/earth/climate-change/nasa-jpl-developing-underwater-robots-to-venture-deep-below-polar-ice/">IceNode</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="imytkvb5wr4"><em>Los Alamos National Laboratory, in a consortium with four other National Laboratories, is leading the charge in finding the best practices to find orphaned wells. These abandoned wells can leak methane gas into the atmosphere and possibly leak liquid into the ground water.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d031444c2549470cd0197b5b72b979f0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/IMyTKvB5wR4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://discover.lanl.gov/publications/national-security-science/2023-winter/looking-for-whats-lost/">LANL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="jwtwwuzb6cg">Looks like Fourier has been working on something new, although this is still at the point of “looks like” rather than something real.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="87c85035102e317caa8313ff38caf417" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/jWTWWuzB6Cg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://fourierintelligence.com/">Fourier</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="sggp1ttcjz8"><em>Bio-Inspired Robot Hands: Altus Dexterity is a collaboration between researchers and professionals from Carnegie Mellon University, UPMC, the University of Illinois and the University of Houston.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="26a5ad062bd296a8f16387ed58b59a39" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/SGgP1TtcJz8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sites.google.com/view/altusdexterity/">Altus Dexterity</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="xa-jbnzeagm"><em>PiPER is a lightweight robotic arm with six integrated joint motors for smooth, precise control. 
Weighing just 4.2kg, it easily handles a 1.5kg payload and is made from durable yet lightweight materials for versatile use across various environments. Available for just $2,499 USD.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7456a7ed68844a63868a8a7d61bbfe10" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Xa-jbNzeAGM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://global.agilex.ai/products/piper">AgileX</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="nu5enr8nxqu"><em>At 104 years old, Lilabel has seen over a century of automotive transformation, from sharing a single car with her family in the 1920s to experiencing her first ride in a robotaxi.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="63eb01e41acabf41cb5a43d9ccb05e7f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/NU5enr8nxQU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://zoox.com/journal/104-year-old-takes-a-ride-in-a-zoox-robotaxi">Zoox</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ciorgyczogk">Traditionally, blind juggling robots use plates that are slightly concave to help them with ball control, but it’s also possible to make a blind juggler the hard way. Which, honestly, is much more impressive.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="de88556c288de6deed89f0cb0755323e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/CiorGYCZOgk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://pdj.zulipchat.com/#narrow/stream/403504-FAQ/topic/What.20makes.20Jugglebot.20Special.3F">Jugglebot</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 30 Aug 2024 16:26:29 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robots-table-tennis</guid><category>Robotics</category><category>Tennis</category><category>Quadruped robots</category><category>Humanoid robot</category><category>Nasa</category><category>Google</category><category>Video friday</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-series-of-pictures-showing-a-human-playing-a-table-tennis-game-with-a-white-robotic-arm.png?id=53572059&width=980"></media:content></item><item><title>Robot Metalsmiths Are Resurrecting Toroidal Tanks for NASA</title><link>https://spectrum.ieee.org/metal-forming-robot</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/robots-pair-up-on-each-side-of-a-sheet-of-metal-to-form-a-segment-of-a-toroidal-tank.gif?id=53554469&width=1200&height=800&coordinates=62%2C0%2C63%2C0"/><br/><br/><p>
In the 1960s and 1970s, NASA spent a lot of time <a href="https://ntrs.nasa.gov/search?q=toroidal%20tank%20spacecraft&page=%7B%22size%22:100,%22from%22:0%7D" rel="noopener noreferrer" target="_blank"><u>thinking about</u></a> whether toroidal (donut-shaped) fuel tanks were the way to go with its spacecraft. Toroidal tanks have a bunch of potential advantages over conventional spherical fuel tanks. For example, you can fit <a href="https://ntrs.nasa.gov/api/citations/20040084002/downloads/20040084002.pdf" rel="noopener noreferrer" target="_blank"><u>nearly 40%</u></a> more volume within a toroidal tank than if you were using multiple spherical tanks within the same space. And perhaps most interestingly, you can shove stuff (like the back of an engine) through the middle of a toroidal tank, which could lead to some substantial efficiency gains if the tanks could also handle structural loads.
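</p><p>
To make the volume claim concrete, here is a toy comparison (an illustrative sketch with invented tank dimensions, not NASA’s analysis): a torus of ring radius R and tube radius r holds V = 2π²Rr², versus a ring of radius-r spherical tanks filling the same annular envelope.
</p><pre><code>import math

# Toy comparison with invented dimensions; not the NASA calculation.

def torus_volume(R, r):
    """V = 2 * pi^2 * R * r^2, where R is the ring radius and r the tube radius."""
    return 2 * math.pi**2 * R * r**2

def sphere_ring_volume(R, r):
    """As many radius-r spheres as fit with centers spaced 2r around the same ring."""
    n = math.floor(math.pi * R / r)  # circumference 2*pi*R divided by spacing 2r
    return n * (4 / 3) * math.pi * r**3

R, r = 2.0, 0.5  # meters; hypothetical tank dimensions
gain = torus_volume(R, r) / sphere_ring_volume(R, r) - 1
print(f"torus holds about {gain:.0%} more")  # roughly 57% in this idealized case
</code></pre><p>
Real designs, with tank walls, plumbing, and packaging constraints, fare worse than this idealized picture; the figure NASA cites for practical layouts is the nearly 40 percent mentioned above.
</p><p>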
</p><p>
Because of their relatively complex shape, toroidal tanks are much more difficult to make than spherical tanks. Even though these tanks can perform better, NASA simply doesn’t have the expertise to manufacture them anymore, since each one has to be hand-built by highly skilled humans. But a company called <a href="https://machinalabs.ai/" target="_blank">Machina Labs</a> thinks that they can do this with robots instead. And their vision is to completely change how we make things out of metal.
</p><hr/><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="597633e7e9ee4cd3da950597581ed0d0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/JgCuESdUVIw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
</p><p>
The fundamental problem that Machina Labs is trying to solve is that if you want to build parts out of metal efficiently at scale, it’s a slow process. Large metal parts need their own custom dies, which are very expensive one-offs that are about as inflexible as it’s possible to get, and then entire factories are built around these parts. It’s a huge investment, which means that it doesn’t matter if you find some new geometry or technique or material or market, because you have to justify that enormous up-front cost by making as much of the original thing as you possibly can, stifling the potential for rapid and flexible innovation.<br/>
</p><p>
On the other end of the spectrum you have the similarly slow and expensive process of making metal parts one at a time by hand. A few hundred years ago, this was the <em>only</em> way of making metal parts: skilled metalworkers using hand tools for months to make things like armor and weapons. The nice thing about an expert metalworker is that they can use their skills and experience to make anything at all, which is where Machina Labs’ vision comes from, explains CEO <a href="https://www.linkedin.com/in/edward-mehr/" target="_blank"><u>Edward Mehr</u></a>, who co-founded Machina Labs after spending time at SpaceX followed by leading the 3D printing team at <a href="https://spectrum.ieee.org/3d-printed-rocket-relativity-spacex" target="_self"><u>Relativity Space</u></a>.
</p><p>
“Craftsmen can pick up different tools and apply them creatively to metal to do all kinds of different things. One day they can pick up a hammer and form a shield out of a sheet of metal,” says Mehr. “Next, they pick up the same hammer, and create a sword out of a metal rod. They’re very flexible.”
</p><p>
The technique that a human metalworker uses to shape metal is called forging, which preserves the grain flow of the metal as it’s worked. Parts that are cast, stamped, or milled (the usual ways of automating metal part production) are simply not as strong or as durable as forged parts, which can be an important differentiator for (say) things that have to go into space. But more on that in a bit.
</p><p>
The problem with human metalworkers is that the throughput is bad—humans are slow, and highly skilled humans in particular don’t scale well. For Mehr and Machina Labs, this is where the robots come in.
</p><p>
“We want to automate and scale using a platform called the ‘robotic craftsman.’ Our core enablers are robots that give us the kinematics of a human craftsman, and artificial intelligence that gives us control over the process,” Mehr says. “The concept is that we can do any process that a human craftsman can do, and actually some that humans can’t do because we can apply more force with better accuracy.”
</p><p>
This flexibility that robot metalworkers offer also enables the crafting of bespoke parts that would be impractical to make in any other way. These include toroidal (donut-shaped) fuel tanks that NASA has had its eye on <a href="https://ntrs.nasa.gov/api/citations/19730023034/downloads/19730023034.pdf" rel="noopener noreferrer" target="_blank"><u>for the last half century or so</u></a>.
</p><p class="shortcode-media shortcode-media-rebelmouse-image">
<img alt="Two people stand in a warehouse with a huge silver donut-shaped tank in front of them." class="rm-shortcode" data-rm-shortcode-id="2c010ef78faccaa24ebdda1328770557" data-rm-shortcode-name="rebelmouse-image" id="ecbe1" loading="lazy" src="https://spectrum.ieee.org/media-library/two-people-stand-in-a-warehouse-with-a-huge-silver-donut-shaped-tank-in-front-of-them.jpg?id=53554453&width=980"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">Machina Labs’ CEO Edward Mehr (on right) stands behind a 15 foot toroidal fuel tank.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Machina Labs</small>
</p><p>
“The main challenge of these tanks is that the geometry is complex,” Mehr says. “Sixty years ago, NASA was bump-forming them with very skilled craftspeople, but a lot of them aren’t around anymore.” Mehr explains that the only other way to get that geometry is with dies, but for NASA, getting a die made for a fuel tank that’s necessarily been customized for one single spacecraft would be pretty much impossible to justify. “So one of the main reasons we’re not using toroidal tanks is because it’s just hard to make them.”
</p><p>
Machina Labs is now making toroidal tanks for NASA. For the moment, the robots are just doing the shaping, which is the tough part. Humans then weld the pieces together. But there’s no reason why the robots couldn’t do the entire process end-to-end and even more efficiently. Currently, they’re doing it the “human” way based on existing plans from NASA. “In the future,” Mehr tells us, “we can actually form these tanks in one or two pieces. That’s the next area that we’re exploring with NASA—how can we do things differently now that we don’t need to design around human ergonomics?”
</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="237592864bde164dac7d462ef1efc702" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7_VJnbjYG2s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
<small class="image-media media-caption" placeholder="Add Photo Caption...">Machina Labs’ ‘robotic craftsmen’ work in pairs to shape sheet metal, with one robot on each side of the sheet. The robots align their tools slightly offset from each other with the metal between them such that as the robots move across the sheet, it bends between the tools.</small>
<small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Machina Labs</small>
</p><p>
The video above shows Machina’s robots working on a tank that’s 4.572 m (15 feet) in diameter, likely destined for the Moon. “The main application is for lunar landers,” says Mehr. “The toroidal tanks bring the center of gravity of the vehicle lower than what you would have with spherical or pill-shaped tanks.”
</p><p>
Training these robots to work metal like this is done primarily through physics-based simulations that Machina developed in-house (existing software being too slow), followed by human-guided iterations based on the resulting real-world data. The way that metal moves under pressure can be simulated pretty well, and although there’s certainly still a sim-to-real gap (simulating how the robot’s tool adheres to the surface of the material is particularly tricky), the robots are collecting so much empirical data that Machina is making substantial progress towards full autonomy, and even finding ways to improve the process.
</p><p class="shortcode-media shortcode-media-rebelmouse-image">
<img alt="A hand holds a silvery piece of sheet metal that has been forged into a series of symmetrical waves." class="rm-shortcode" data-rm-shortcode-id="fcb1907321dc82d1e390a8931f053cf3" data-rm-shortcode-name="rebelmouse-image" id="b4f5c" loading="lazy" src="https://spectrum.ieee.org/media-library/a-hand-holds-a-silvery-piece-of-sheet-metal-that-has-been-forged-into-a-series-of-symmetrical-waves.jpg?id=53566745&width=980"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">An example of the kind of complex metal parts that Machina’s robots are able to make.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Machina Labs</small>
</p><p>
Ultimately, Machina wants to use robots to produce all kinds of metal parts. On the commercial side, they’re exploring things like <a href="https://www.instagram.com/p/C-Ldt7iS3dk/" target="_blank">car body panels</a>, offering the option to change how your car looks in geometry rather than just color. The requirement for a couple of beefy robots to make this work means that roboforming is unlikely to become as pervasive as 3D printing, but the broader concept is the same: making physical objects a software problem rather than a hardware problem to enable customization at scale.
</p>]]></description><pubDate>Thu, 29 Aug 2024 13:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/metal-forming-robot</guid><category>Lunar landers</category><category>Nasa</category><category>Spacecraft</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/robots-pair-up-on-each-side-of-a-sheet-of-metal-to-form-a-segment-of-a-toroidal-tank.gif?id=53554469&width=980"></media:content></item><item><title>Video Friday: Disney Robot Dance</title><link>https://spectrum.ieee.org/video-friday-disney-robot-dance</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/an-animated-gif-showing-a-small-black-humanoid-robot-doing-some-dance-moves.gif?id=53509949&width=1200&height=800&coordinates=62%2C0%2C63%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://icra40.ieee.org/">ICRA@40</a>: 23–26 September 2024, ROTTERDAM, NETHERLANDS</h5><h5><a href="https://iros2024-abudhabi.org/">IROS 2024</a>: 14–18 October 2024, ABU DHABI, UAE</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="q2i7u0tjljs">I think it’s time for us all to admit that some of the most interesting bipedal and humanoid research is being done by Disney.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ad855aa1f4eb7a3947dec6340482caf4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Q2I7u0tjlJs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://la.disneyresearch.com/wp-content/uploads/VMP_paper.pdf">Research Paper</a> from <a href="https://spectrum.ieee.org/tag/eth-zurich" target="_blank">ETH Zurich</a> and <a data-linked-post="2668135204" href="https://spectrum.ieee.org/disney-robot-2668135204" target="_blank">Disney Research</a>]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="funfr7v7kfq"><em>Over the past few months, Unitree G1 robot has been upgraded into a mass production version, with stronger performance, ultimate appearance, and being more in line with mass production requirements.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a515e386a1a71e1f5c18b323b88d183c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FuNFr7V7KFQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/g1/">Unitree</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="dvbyl0bsj30">This robot is from Kinisi Robotics, which was founded by Brennand Pierce, who also founded <a data-linked-post="2650252107" href="https://spectrum.ieee.org/attack-of-the-robot-bears" target="_blank">Bear Robotics</a>. 
You can’t really tell from this video, but check out the website because the reach this robot has is bonkers.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c158fa2f2bee7f61a330c30f088cd6d3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dVByl0Bsj30?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Kinisi Robotics is on a mission to democratize access to advanced robotics with our latest innovation—a low-cost, dual-arm robot designed for warehouses, factories, and supermarkets. What sets our robot apart is its integration of LLM technology, enabling it to learn from demonstrations and perform complex tasks with minimal setup. Leveraging Brennand’s extensive experience in scaling robotic solutions, we’re able to produce this robot for under $20k, making it a game-changer in the industry.</em></blockquote><p>[ <a href="https://kinisi.tech/product">Kinisi Robotics</a> ]</p><p>Thanks Bren!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="pkghe_ccuwy">Finally, something that <a data-linked-post="2667789605" href="https://spectrum.ieee.org/atlas-humanoid-robot-ceo-interview" target="_blank">Atlas</a> does that I am also physically capable of doing. In theory.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="12fc8188c8dcf0f63b80192a9483d0d6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/PKgHe_CcUWY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Okay, never mind. 
I don’t have those hips.</p><p>[ <a href="https://bostondynamics.com/atlas/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="v6pjnexavba"><em>Researchers in the Department of Mechanical Engineering at Carnegie Mellon University have created the first legged robot of its size to run, turn, push loads, and climb miniature stairs.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a6ec4a11c08da81cbd0575b2f1df3d15" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/V6PjNExaVbA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>They say it can “run,” but I’m skeptical that there’s a flight phase unless someone sneezes nearby.</p><p>[ <a href="https://engineering.cmu.edu/news-events/news/2024/08/19-picotaur.html">Carnegie Mellon University</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="tgxk20uhc80">The lights are cool and all, but it’s the pulsing soft skin that’s squigging me out.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="358d2b7f255ab0011d9a228f91bd6881" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/tGXk20UHc80?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.liebertpub.com/doi/10.1089/rorep.2024.0035">Paper</a>, <em>Robotics Reports</em> Vol.2 ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="u4dzjzpx678">Roofing is a difficult and dangerous enough job that it would be great if robots could take it over. 
It’ll be a challenge though.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="483b9300e7018ee3a56b9d766d38ad62" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/U4DZJZpX678?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.renovaterobotics.com/">Renovate Robotics</a> ] via [ <a href="https://techcrunch.com/2024/08/20/watch-this-robot-quickly-install-roof-shingles/">TechCrunch</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="c4mhs5bvkpw">Kento Kawaharazuka from <a href="http://www.jsk.t.u-tokyo.ac.jp/information.html" target="_blank">JSK Robotics Laboratory</a> at the University of Tokyo wrote in to share this paper, just accepted at RA-L, which (among other things) shows a robot using its flexible hands to identify objects through random finger motion.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cc9122933d66d1fb1ebdd6d869c855f8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/c4mhS5BvkPw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2407.08050">Paper</a> accepted by <em>IEEE Robotics and Automation Letters</em> ]</p><p>Thanks Kento!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="itharcgwp3a">It’s one thing to make robots that are reliable, and it’s another to make robots that are reliable and repairable by the end user. 
I don’t think iRobot gets enough credit for this.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3fc076c2976a7fbf3db7d978a11ad7c1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/IThARCgwP3A?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://homesupport.irobot.com/s/">iRobot</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="srktnepf9s0">I like competitions where they say, “just relax and forget about the competition and show us what you can do.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e95b57defa262662f556a7fbf0ca38b3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/sRkTnEPF9S0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.mbzirc.com/">MBZIRC</a> Maritime Grand Challenge ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="zxkuri7hetg">I kid you not, this used to be my job.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8149354407cc897c2862d48efbec7837" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ZXkURI7heTg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ucl.ac.uk/robotics/research-projects/2022/oct/robohike-autonomous-quadrupedal-robot-navigation-and-hiking-challenging">RoboHike</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 23 Aug 2024 17:30:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-disney-robot-dance</guid><category>Video friday</category><category>Boston dynamics</category><category>Disney</category><category>Unitree</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/an-animated-gif-showing-a-small-black-humanoid-robot-doing-some-dance-moves.gif?id=53509949&width=980"></media:content></item><item><title>Meet Boardwalk Robotics’ Addition to the Humanoid Workforce</title><link>https://spectrum.ieee.org/boardwalk-robotics-alex-humanoid</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-black-humanoid-robot-uses-a-sanding-tool-on-a-piece-of-black-carbon-fiber.jpg?id=53489912&width=3267&height=2624&coordinates=669%2C0%2C0%2C0"/><br/><br/><p>
<a href="https://boardwalkrobotics.com/" target="_blank">Boardwalk Robotics</a> is announcing its entry into the increasingly crowded commercial humanoid(ish) space with Alex, a “workforce transformation” humanoid upper torso designed to work in manufacturing, logistics, and maintenance.
</p><p>
Before we get into Alex, let me take just a minute here to straighten out how Boardwalk Robotics is related to <a href="https://robots.ihmc.us/" target="_blank">IHMC</a>, the Institute for Human and Machine Cognition in Pensacola, Fla. IHMC is, I think it’s fair to say, somewhat legendary when it comes to bipedal robotics—its DARPA Robotics Challenge team <a href="https://www.ihmc.us/ihmcs-running-man-captures-2nd-at-darpa-robotics-competition/" rel="noopener noreferrer" target="_blank"><u>took second place in the final event</u></a> (using a Boston Dynamics DRC Atlas), and when NASA needed someone to teach the agency’s <a data-linked-post="2650274165" href="https://spectrum.ieee.org/new-r5-valkyrie-robots" target="_blank">Valkyrie humanoid</a> to walk better, they sent it to IHMC.
</p><p>
Boardwalk, which was founded in 2017, has been a commercial partner with IHMC when it comes to the actual building of robots. The most visible example of this to date has been IHMC’s <a href="https://spectrum.ieee.org/ihmc-developing-new-gymnastinspired-humanoid-robot" target="_blank">Nadia humanoid</a>, a research platform that Boardwalk collaborated on and built. There’s obviously a lot of crossover between IHMC and Boardwalk in terms of institutional knowledge and experience, but Alex is a commercial robot developed entirely in-house by Boardwalk.
</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="eeabbfbd7ba4015cec78e8f1fa0fb0d5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/tD3FNZUG2kQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
</p><p>
“We’ve used Nadia to learn a lot in the realm of dynamic locomotion research, and we’re taking all that and sticking it into a manipulation platform that’s ready for commercial work,” says <a href="https://www.linkedin.com/in/brandon-shrewsbury/" rel="noopener noreferrer" target="_blank"><u>Brandon Shrewsbury</u></a>, Boardwalk Robotics’ CTO. “With Alex, we’re focusing on the manipulation side first, getting that well established. And then picking the mobility to match the task.”
</p><p>
The first thing you’ll notice about Alex is that it doesn’t have legs, at least for now. Boardwalk’s theory is that for a humanoid to be practical and cost effective in the near term, legs aren’t necessary, and that there are many tasks that offer a good return on investment where a stationary pedestal or a glorified autonomous mobile robotic base would be totally fine.
</p><p>
“There are going to be some problem sets that require legs, but there are many problem sets that don’t,” says <a href="https://www.linkedin.com/in/rjgriffin42/" target="_blank">Robert Griffin</a>, a technical advisor at Boardwalk. “And there aren’t very many problem sets that don’t require halfway decent manipulation capabilities. So if we can design the manipulation well from the beginning, then we won’t have to depend on legs for making a robot that’s functionally useful.”</p><p>It certainly helps that Boardwalk isn’t at all <em><em>worried</em></em> about developing legs: “Every time we bring up a new humanoid, it’s something like twice as fast as the previous time,” Griffin says. This will be the eighth humanoid that IHMC has been involved in bringing up—I’d tell you more about all eight of those humanoids, but some of them are so secret that even <em>I</em> don’t know anything about them.<strong> </strong>Legs are definitely on the road map, but they’re not done yet, and IHMC will have a hand in their development to speed things along: It turns out that already having access to a functional (top of the line, really) locomotion stack is a big head start.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
<img alt="An annotated image showing a black humanoid robot along with statistics including 19 degrees of freedom and 10kg payload." class="rm-shortcode" data-rm-shortcode-id="b70dd78178a425b19a5265d67a3ae3ff" data-rm-shortcode-name="rebelmouse-image" id="2b0a1" loading="lazy" src="https://spectrum.ieee.org/media-library/an-annotated-image-showing-a-black-humanoid-robot-along-with-statistics-including-19-degrees-of-freedom-and-10kg-payload.png?id=53489789&width=980"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">Alex’s actuators are all designed in-house, and the next version will feature new grippers that allow for quicker tool changes.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Boardwalk Robotics</small>
</p><p>
While the humanoid space is wide open right now and competition isn’t really an issue, looking ahead, Boardwalk sees safety as one of its primary differentiators since it’s not starting out with legs, says Shrewsbury. “For a full humanoid, there’s no way to make that completely safe. If it falls, it’s going to face-plant.” By keeping Alex on a stable base, it can work closer to humans and potentially move its arms much faster while also preserving a dynamic safety zone.<br/>
</p><p class="shortcode-media shortcode-media-rebelmouse-image">
<img alt="An abstract image showing the back of a humanoid robot looking into bright lights." class="rm-shortcode" data-rm-shortcode-id="319cc3e1f1b134cf223ba17b1dfe3bc9" data-rm-shortcode-name="rebelmouse-image" id="f3322" loading="lazy" src="https://spectrum.ieee.org/media-library/an-abstract-image-showing-the-back-of-a-humanoid-robot-looking-into-bright-lights.jpg?id=53491948&width=980"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">Alex is available for researchers to purchase immediately.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Boardwalk Robotics</small>
</p><p>
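Shrewsbury’s “dynamic safety zone” is the part of that pitch that robot safety practice already has a vocabulary for: speed-and-separation monitoring, where the arm’s commanded speed shrinks as a person gets closer and drops to zero inside a protective minimum distance. Boardwalk hasn’t published how Alex does this, so purely as an illustration of the concept, here is a minimal sketch in the spirit of ISO/TS 15066, with every number invented for the example:
</p>
<pre><code>
# Illustrative speed-and-separation monitor (not Boardwalk's code).
# All timing and speed constants below are made-up example values.

def protective_distance(human_speed, robot_speed,
                        reaction_time=0.1, stop_time=0.3, margin=0.2):
    """Distance (m) the robot must keep so it can stop before contact."""
    human_travel = human_speed * (reaction_time + stop_time)
    robot_travel = robot_speed * reaction_time + 0.5 * robot_speed * stop_time
    return human_travel + robot_travel + margin

def allowed_speed(distance, human_speed=1.6, v_max=2.0):
    """Largest commanded speed whose protective distance still fits."""
    lo, hi = 0.0, v_max
    for _ in range(40):  # bisection; protective_distance grows with speed
        mid = 0.5 * (lo + hi)
        if protective_distance(human_speed, mid) <= distance:
            lo = mid
        else:
            hi = mid
    return lo

for d in (0.5, 1.0, 2.0):
    print(f"person at {d:.1f} m -> cap arm speed at {allowed_speed(d):.2f} m/s")
</code></pre>
<p>
The practical upshot, and the reason a pedestal helps: a robot that cannot fall over only has to slow its arms down around people, so the region where it must crawl or stop can be much smaller than for a free-walking biped.
</p><p>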
Despite its upbringing in research, Alex is not intended to be a research robot. You <em><em>can</em></em> buy it for research purposes, if you want, but Boardwalk will be selling Alex as a commercial robot. At the moment, Boardwalk is conducting pilot programs with Alex<strong> </strong>where they’re working in partnership with select customers, with the eventual goal of transitioning to a service model. The first few sectors that Boardwalk is targeting include logistics (because, of course) and food processing, although as Boardwalk CEO <a href="https://www.linkedin.com/in/mijomo/" target="_blank">Michael Morin</a> tells us, one of the very first pilots is (appropriately enough) in aviation. </p><p>Morin, who helped to commercialize <a href="https://robotsguide.com/robots/wam" rel="noopener noreferrer" target="_blank"><u>Barrett Technologies’ WAM Arm</u></a> before spending some time at <a href="https://www.vicarioussurgical.com/" rel="noopener noreferrer" target="_blank"><u>Vicarious Surgical</u></a> as that company went public, joined Boardwalk to help them turn good engineering into a good product, which is arguably the hardest part of making useful robots (besides all the other hardest parts). “A lot of these companies are just learning about humanoids for the first time,” says Morin. “That makes the customer journey longer. But we’re putting in the effort to educate them on how this could be implemented in their world.”</p><p>
If you want an Alex of your very own, Boardwalk is currently selecting commercial partners for a few more pilots. And for researchers, the robot is available right now.
</p>]]></description><pubDate>Thu, 22 Aug 2024 12:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/boardwalk-robotics-alex-humanoid</guid><category>Ihmc</category><category>Boardwalk robotics</category><category>Humanoid robots</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-black-humanoid-robot-uses-a-sanding-tool-on-a-piece-of-black-carbon-fiber.jpg?id=53489912&width=980"></media:content></item><item><title>Video Friday: Silly Robot Dog Jump</title><link>https://spectrum.ieee.org/video-friday-robot-dog-jump</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-black-and-white-robotic-dog-jumps-up-and-down-a-flight-of-stairs.gif?id=53377935&width=1200&height=800&coordinates=62%2C0%2C63%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://icra40.ieee.org/">ICRA@40</a>: 23–26 September 2024, ROTTERDAM, NETHERLANDS</h5><h5><a href="https://iros2024-abudhabi.org/">IROS 2024</a>: 14–18 October 2024, ABU DHABI, UAE</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="9auzj9ep8cm">The title of this video is “Silly Robot Dog Jump” and that’s probably more than you need to know.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="97aebc76715a8fd09c349260b69fc161" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/9AuZj9EP8CM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">Deep Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="hl7crzs7qce">It’ll be great when robots are reliably autonomous, but until they get there, collaborative capabilities are a must.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5d839791ef47708098a622b3aee99d4e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/HL7crZS7QcE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.robust.ai/">Robust AI</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="rc2odkz5yma">I am so INCREDIBLY EXCITED for this.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="626529a483e335c3b7d478a041c4e529" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/rC2OdKZ5ymA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ami.iit.it/aerial-humanoid-robotics">IIT Instituto Italiano di Tecnologia</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="11iz8x27js4"><em>In this 3 minutes long one-take video, the LimX Dynamics CL-1 takes on the challenge of continuous heavy objects loading among shelves in a simulated warehouse, showcasing the advantages of the general-purpose form factor of humanoid robots.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9412adb435a1df4d5ed7bdabe3d00ca8" 
style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/11Iz8x27jS4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.limxdynamics.com/en/humanoid-robot">LimX Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="wennwkfldj0"><em>Birds, bats and many insects can tuck their wings against their bodies when at rest and deploy them to power flight. Whereas birds and bats use well-developed pectoral and wing muscles, how insects control their wing deployment and retraction remains unclear because this varies among insect species. Here we demonstrate that rhinoceros beetles can effortlessly deploy their hindwings without necessitating muscular activity. We validated the hypothesis using a flapping microrobot that passively deployed its wings for stable, controlled flight and retracted them neatly upon landing, demonstrating a simple, yet effective, approach to the design of insect-like flying micromachines.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2ca11c97466b4bafbce5676009e87180" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wEnnWkfLdj0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.nature.com/articles/s41586-024-07755-9.epdf?sharing_token=Sh8VnLrLXig1jCyPx6N5oNRgN0jAjWel9jnR3ZoTv0P2qv166J8YLuuFHN-dd7pwwcJEAOVJIs7CR9erh2b9RTOSr2jmzLG44YVoyMeqg0TVOTbpjblRT8ZPHGe7N0_kG7eLTLvEdExrG5TRTpsJ7LDJELOI6dvJag4LDPtA0DY%3D"><em>Nature</em></a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="w6oldjgttjm"><em>Agility Robotics’ CTO, Pras Velagapudi, talks about data collection, and specifically about the different kinds we collect from our real-world robot deployments and generally what that data is used for.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f159e732821144321a2b2b24f708d0c4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/W6oldJgTTJM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://agilityrobotics.com/">Agility Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="wwcnpgufjho">Robots that try really hard but are bad at things are utterly charming.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="59de64b9545b898995757315dc583834" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/WwCNPGUFjho?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="http://www.jsk.t.u-tokyo.ac.jp/research/spined_robots/index.html">University of Tokyo JSK Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="nayi8scelh0">The <a href="https://triagechallenge.darpa.mil/about" target="_blank">DARPA Triage Challenge</a> unsurprisingly has a bunch of robots in 
it.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e7024dd420b2717bfd3f6781a7742dc8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/nAyI8ScELh0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://triagechallenge.darpa.mil/">DARPA</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="shta_jnytvg">The Cobalt security robot has been around for a while, but I have to say, the design really holds up—it’s a good looking robot.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="902aad46d635d4a873de1ac4bd9a2e2c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/shTa_jnytvg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.cobaltai.com/">Cobalt AI</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="icafzjuguau">All robots that enter elevators should be programmed to gently sway back and forth to the elevator music. Even if there’s no elevator music.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="22fdd09d229027caf581120fd8b29c62" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/icaFZJugUAU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://getsomatic.com/">Somatic</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="tbw7qpdgstk"><em>ABB Robotics and the Texas Children’s Hospital have developed a groundbreaking lab automation solution using ABB’s YuMi® cobot to transfer fruit flies (Drosophila melanogaster) used in the study for developing new drugs for neurological conditions such as Alzheimer’s, Huntington’s and Parkinson’s.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="87445146cb784b3eb2189d8bd289f843" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/tbW7QpDGStk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://new.abb.com/news/detail/117812/prsrl-abb-and-texas-childrens-hospital-create-breakthrough-automation-to-advance-neurological-research">ABB</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="e-8czcdcyd0"><em>Extend Robotics are building embodied AI enabling highly flexible automation for real-world physical tasks. 
The system features intuitive immersive interface enabling tele-operation, supervision and training AI models.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c4d5fe2e5ec1c5bfe3b2964e1d41d4e2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/e-8CZCdcyD0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.extendrobotics.com/">Extend Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ae0ipglvm30">The recorded livestream of RSS 2024 is now online, in case you missed anything.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="32dd84d487dada8539a08194669f5bd2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/AE0iPGlVM30?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://roboticsconference.org/program/overview/">RSS 2024</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 16 Aug 2024 16:45:04 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robot-dog-jump</guid><category>Agility robotics</category><category>Deep robotics</category><category>Robust ai</category><category>Video friday</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/a-black-and-white-robotic-dog-jumps-up-and-down-a-flight-of-stairs.gif?id=53377935&width=980"></media:content></item><item><title>Video Friday: The Secrets of Shadow Robot’s New Hand</title><link>https://spectrum.ieee.org/video-friday-shadow-robot-hand</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-silver-and-black-three-fingered-robot-hand-gently-squeezes-a-yellow-balloon.gif?id=53141557&width=1200&height=800&coordinates=62%2C0%2C63%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://icra40.ieee.org/">ICRA@40</a>: 23–26 September 2024, ROTTERDAM, NETHERLANDS</h5><h5><a href="https://iros2024-abudhabi.org/">IROS 2024</a>: 14–18 October 2024, ABU DHABI, UAE</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="fjsfli3svvw">At <a href="https://2024.ieee-icra.org/" target="_blank">ICRA 2024</a>, in Tokyo last May, we sat down with the director of <a href="https://www.shadowrobot.com/" target="_blank">Shadow Robot</a>, Rich Walker, to talk about the journey toward developing its newest model. Designed for reinforcement learning, <a href="https://spectrum.ieee.org/robot-hand-shadow-robot-company" target="_blank">the hand</a> is extremely rugged, has three fingers that act like thumbs, and has fingertips that are highly sensitive to touch.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5555146f75cb98294415a90e62cabddc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/fJsFLI3svVw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://spectrum.ieee.org/robot-hand-shadow-robot-company"><em>IEEE Spectrum</em></a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ac0f2jt-ijq"><em>Food Angel is a food delivery robot to help with the problems of food insecurity and homelessness. Utilizing autonomous wheeled robots for this application may seem to be a good approach, especially with a number of successful commercial robotic delivery services. 
However, besides technical considerations such as range, payload, operation time, autonomy, etc., there are a number of important aspects that still need to be investigated, such as how the general public and the receiving end may feel about using robots for such applications, or human-robot interaction issues such as how to communicate the intent of the robot to the homeless.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d798d1a2521b55486e67ee296427596c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/AC0f2Jt-ijQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.romela.org/">RoMeLa</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="kk-h9sgxgri"><em>The UKRI FLF team RoboHike of UCL Computer Science of the Robot Perception and Learning lab with Forestry England demonstrate the ANYmal robot to help preserve the cultural heritage of an historic mine in the Forest of Dean, Gloucestershire, UK.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2126c75402c1641c10022c13e73e293c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Kk-H9sGXGrI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>This clip is from a reboot of the British TV show “Time Team.” If you’re not already a fan of “Time Team,” let me just say that it is one of the greatest retro reality TV shows ever made, where actual archaeologists wander around the United Kingdom and dig stuff up. If they can find anything. Which they often can’t. And also it has Tony Robinson (from “Blackadder”), who runs everywhere for some reason. Go to <a href="https://www.youtube.com/TimeTeamClassics" target="_blank">Time Team Classics</a> on YouTube for 70+ archived episodes.</p><p>[ <a href="https://rpl-as-ucl.github.io/">UCL RPL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="mcbgec-kubm"><em>UBTECH humanoid robot Walker S Lite is working in Zeekr’s intelligent factory to complete handling tasks at the loading workstation for 21 consecutive days, and assist its employees with logistics work.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6693372e4a367fb87b6aa86afe44809e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/MCbGeC-kuBM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ubtrobot.com/humanoid/products/Walker">UBTECH</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ja0vjm72zdw"><em>Current visual navigation systems often treat the environment as static, lacking the ability to adaptively interact with obstacles. This limitation leads to navigation failure when encountering unavoidable obstructions. 
In response, we introduce IN-Sight, a novel approach to self-supervised path planning, enabling more effective navigation strategies through interaction with obstacles.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5896c2bde4864b8c79a27b0ce6265add" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ja0Vjm72ZDw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2408.00343">ETH Zurich paper / IROS 2024</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="0tcoei2g-dy">When working on autonomous cars, sometimes it’s best to start small.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="edb131e9b74702b14482e2084df38e24" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0TcoeI2g-dY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://penntoday.upenn.edu/news/penn-engineering-racing-cars-f1tenth">University of Pennsylvania</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="oyzndskpvuu"><em>MIT MechE researchers introduce an approach called SimPLE (Simulation to Pick Localize and placE), a method of precise kitting, or pick and place, in which a robot learns to pick, regrasp, and place objects using the object’s computer-aided design (CAD) model, and all without any prior experience or encounters with the specific objects.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ae4654ba7901bb421584d90e1f46fc67" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/oYznDSkPVUU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://mcube.mit.edu/research/simPLE.html">MIT</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="hukss1xa2lq"><em>Staff, students (and quadruped robots!) from UCL Computer Science wish the Great Britain athletes the best of luck this summer in the Olympic Games & Paralympics.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="35cfc77bf1ed0322ae32b91ad74bf438" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/hUKSS1xA2lQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ucl.ac.uk/robotics/">UCL Robotics Institute</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="6jd9downw9a">Walking in tall grass can be hard for robots, because they can’t see the ground that they’re actually stepping on. 
Here’s a technique to solve that, published in <em><a href="https://ieeexplore.ieee.org/document/10265206" target="_blank">Robotics and Automation Letters</a></em> last year.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e485a6300ec2f70ff72abaf852eb1acc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/6jD9DownW9A?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sites.google.com/leggedrobotics.com/semantic-pointcloud-filter">ETH Zurich Robotic Systems Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="epd9jpdlalu">There is no such thing as excess batter on a corn dog, and there is also no such thing as a defective donut. And apparently, making Kool-Aid drink pouches is harder than it looks. </p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="715ab6d553df011790c78bec89120888" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/epD9jPdlalU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.oxipitalai.com/">Oxipital AI</a> ]<br/></p><div class="horizontal-rule"></div><p class="rm-anchors" id="pnjr2f_xhoo">Unitree has open-sourced its software to teleoperate humanoids in VR for training-data collection.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8b2f9f17f06c2505557c4dc39c99000e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pNjr2f_XHoo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://github.com/unitreerobotics/avp_teleoperate">Unitree / GitHub</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="varztjeidaq">Nothing more satisfying than seeing point-cloud segments wiggle themselves into place, and CSIRO’s Wildcat SLAM does this better than anyone.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7c43568dd1a5f4f836a1aff60dc59ef0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vaRZTjeIdaQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/pdf/2205.12595"><em>IEEE Transactions on Robotics</em></a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="y1lg4ywutoo"><em>A lecture by Mentee Robotics CEO Lior Wolf, on Mentee’s AI approach.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="93c6709b5f475daed07c79a2bcb666fd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/y1LG4YwUtoo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.menteebot.com/">Mentee Robotics</a> ]</p><div 
class="horizontal-rule"></div>]]></description><pubDate>Fri, 09 Aug 2024 15:30:02 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-shadow-robot-hand</guid><category>Autonomous cars</category><category>Autonomous robots</category><category>Robot videos</category><category>Shadow robot</category><category>Video friday</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/a-silver-and-black-three-fingered-robot-hand-gently-squeezes-a-yellow-balloon.gif?id=53141557&width=980"></media:content></item><item><title>Figure 02 Robot Is a Sleeker, Smarter Humanoid</title><link>https://spectrum.ieee.org/figure-new-humanoid-robot</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-photo-of-the-torso-of-a-grey-and-black-humanoid-robot-with-a-shiny-black-face-plate-looking-at-its-hand.jpg?id=53107004&width=1200&height=800&coordinates=450%2C0%2C450%2C0"/><br/><br/><p>Today, <a href="https://spectrum.ieee.org/figure-humanoid-robot" target="_blank">Figure</a> is introducing the newest, slimmest, shiniest, and least creatively named next generation of its <a href="https://spectrum.ieee.org/humanoid-robots" target="_blank">humanoid robot</a>: Figure 02. According to the press release, Figure 02 is the result of “a ground-up hardware and software redesign” and is “the highest performing humanoid robot,” which may even be true for some arbitrary value of “performing.” Also notable is that Figure has been actively testing robots with BMW at a manufacturing plant in Spartanburg, S.C., where the new humanoid has been performing “data collection and use case training.”</p><p>The rest of the press release is pretty much, “Hey, check out our new robot!” And you’ll get all of the content in the release by watching the videos. What you <em><em>won’t</em></em> get from the videos is any additional info about the robot. But we sent along some questions to Figure about these videos, and have a few answers from <a href="https://www.linkedin.com/in/mr-michael-rose/" target="_blank">Michael Rose</a>, director of controls, and <a href="https://www.linkedin.com/in/vadimchernyak/" target="_blank">Vadim Chernyak</a>, director of hardware.</p><hr/><p>First, the trailer:</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="c48692efbc65bd12448fc94b8d8b65b3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FZbY9sReu1k?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
</p><p><strong>How many parts does Figure 02 have, and is this all of them?</strong></p><p><strong>Figure: </strong>A couple hundred unique parts and a couple thousand parts total. No, this is not all of them.</p><p><strong>Does Figure 02 make little Figure logos with every step?</strong></p><p><strong>Figure: </strong>If the surface is soft enough, yes.</p><p><strong>Swappable legs! Was that hard to do, or easier to do because you only have to make one leg?</strong></p><p><strong>Figure: </strong>We chose to make swappable legs to help with manufacturing.</p><p><strong>Is the battery pack swappable too?</strong></p><p><strong>Figure: </strong>Our battery is swappable, but it is not a quick swap procedure.</p><p><strong>What’s that squishy-looking stuff on the back of Figure 02’s knees and in its elbow joints?</strong></p><p><strong>Figure: </strong>These are soft stops which limit the range of motion in a controlled way and prevent robot pinch points.</p><p><strong>Where’d you hide that thumb motor?</strong></p><p><strong>Figure: </strong>The thumb is now fully contained in the hand.</p><p><strong>Tell me about the “skin” on the neck!</strong></p><p><strong>Figure: </strong>The skin is a soft fabric which is able to keep a clean, seamless look even as the robot moves its head.</p><div class="horizontal-rule"></div><p>And here’s the reveal video:</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="846c1da5e4338257891a1dd6697c081c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0SRVJaOg9Co?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
</p><p><strong>When Figure 02’s head turns, its body turns too, and its arms move. Is that necessary, or aesthetic?</strong></p><p><strong>Figure: </strong>Aesthetic.</p><p><strong>The upper torso and shoulders seem very narrow compared to other humanoids. Why is that?</strong> </p><p><strong>Figure: </strong>We find it essential to package the robot to be of similar proportions to a human. This allows us to complete our target use cases and fit into our environment more easily.</p><p><strong>What can you tell me about Figure 02’s walking gait? </strong></p><p><strong>Figure: </strong>The robot is using a model predictive controller to determine footstep locations and forces required to maintain balance and follow the desired robot trajectory.</p><p><strong>How much runtime do you get from 2.25 kilowatt-hours doing the kinds of tasks that we see in the video?</strong></p><p><strong>Figure: </strong>We are targeting a 5-hour run time for our product.</p><div class="horizontal-rule"><br/></div><p class="shortcode-media shortcode-media-rebelmouse-image">
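</p><p>
That gait answer is terse, so a word on what it means: a footstep-planning model predictive controller repeatedly solves, over a short look-ahead horizon, for the foot placements that keep a simplified model of the robot’s center of mass balanced while following the commanded path, applies only the first step, and re-solves. Figure hasn’t shared any specifics beyond the quote above, so the sketch below is a generic textbook-style illustration on a 1-D linear inverted pendulum, not Figure’s controller; the model, horizon, and weights are all assumptions for the example:
</p>
<pre><code>
# Toy 1-D footstep MPC on a linear inverted pendulum (illustrative only).
import numpy as np

G, Z0 = 9.81, 0.9           # gravity; assumed center-of-mass height (m)
OMEGA2 = G / Z0             # LIP constant omega^2
DT, N = 0.1, 20             # control period (s); prediction horizon

# Euler-discretized LIP: xdd = omega^2 * (x - p), state s = [x, xdot],
# input p = foot (pressure point) position.
A = np.array([[1.0, DT], [OMEGA2 * DT, 1.0]])
B = np.array([0.0, -OMEGA2 * DT])
C = np.array([1.0, 0.0])

def plan_footsteps(s0, x_ref, smooth=1e-2):
    """Least-squares MPC: track x_ref while keeping footsteps smooth."""
    Phi = np.zeros((N, 2))    # free response of CoM position
    Gamma = np.zeros((N, N))  # effect of each future footstep
    Ak = np.eye(2)
    for k in range(N):
        Ak = A @ Ak           # A^(k+1)
        Phi[k] = C @ Ak
        resp = B.copy()
        for j in range(k, -1, -1):
            Gamma[k, j] = C @ resp
            resp = A @ resp
    D = np.diff(np.eye(N), axis=0)               # footstep-motion penalty
    M = np.vstack([Gamma, np.sqrt(smooth) * D])
    y = np.concatenate([x_ref - Phi @ s0, np.zeros(N - 1)])
    p, *_ = np.linalg.lstsq(M, y, rcond=None)
    return p

s = np.array([0.0, 0.0])                          # CoM starts at rest
for step in range(30):                            # receding horizon loop
    t = step * DT
    x_ref = 0.3 * (t + DT * np.arange(1, N + 1))  # walk forward at 0.3 m/s
    p0 = plan_footsteps(s, x_ref)[0]              # apply first footstep only
    s = A @ s + B * p0
    print(f"t={t:4.1f}s  foot at {p0:+.3f} m  CoM at {s[0]:+.3f} m")
</code></pre>
<p>
One more number worth extracting from the answers above: a 2.25-kilowatt-hour battery with a 5-hour runtime target implies an average draw of roughly 450 watts for the whole robot.
</p><p class="shortcode-media shortcode-media-rebelmouse-image">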
<img alt="A photo a grey and black humanoid robot with a shiny black face plate standing in front of a white wall." class="rm-shortcode" data-rm-shortcode-id="40f638c685f23bc7a4ea515a422ce502" data-rm-shortcode-name="rebelmouse-image" id="7fc98" loading="lazy" src="https://spectrum.ieee.org/media-library/a-photo-a-grey-and-black-humanoid-robot-with-a-shiny-black-face-plate-standing-in-front-of-a-white-wall.jpg?id=53107672&width=980"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">Slick, but also a little sinister?</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Figure</small></p><p>This thing looks <em><em>slick</em></em>. I’d say that it’s maybe a little too far on the sinister side for a robot intended to work around humans, but the industrial design is badass and the packaging is excellent, with the vast majority of the wiring now integrated within the robot’s skins and flexible materials covering joints that are typically left bare. Figure, if you remember, <a href="https://spectrum.ieee.org/figure-robot-video" target="_self"><u>raised a US $675 million Series B that valued the company at $2.6 billion</u></a>, and somehow the look of this robot seems appropriate to that.</p><p>I do still have some questions about Figure 02, such as where the interesting foot design came from and whether a 16-degree-of-freedom hand is really worth it in the near term. It’s also worth mentioning that Figure seems to have a fair number of Figure 02 robots running around—at least five units at its California headquarters, plus potentially a couple of more at the BMW Spartanburg manufacturing facility. </p><p>I also want to highlight this boilerplate at the end of the release: “our humanoid is designed to perform human-like tasks within the workforce and in the home.” We are very, very far away from a humanoid robot in the home, but I appreciate that it’s still an explicit goal that Figure is trying to achieve. Because I want one.</p>]]></description><pubDate>Tue, 06 Aug 2024 13:06:44 +0000</pubDate><guid>https://spectrum.ieee.org/figure-new-humanoid-robot</guid><category>Humanoid robot</category><category>Figure</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-photo-of-the-torso-of-a-grey-and-black-humanoid-robot-with-a-shiny-black-face-plate-looking-at-its-hand.jpg?id=53107004&width=980"></media:content></item><item><title>Rodney Brooks’s Three Laws of Robotics</title><link>https://spectrum.ieee.org/rodney-brooks-three-laws-robotics</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-red-robot-with-its-arms-around-a-a-man-with-a-blue-shirt-against-a-yellow-background.jpg?id=53100051&width=1200&height=800&coordinates=0%2C0%2C0%2C209"/><br/><br/><p>
<em><a href="https://rodneybrooks.com/" rel="noopener noreferrer" target="_blank">Rodney Brooks</a> is the <a href="https://people.csail.mit.edu/brooks/" target="_blank">Panasonic Professor of Robotics (emeritus) at MIT</a>, where he was director of the AI Lab and then <a href="https://www.csail.mit.edu/" target="_blank">CSAIL</a>. He has been cofounder of <a href="https://www.irobot.com/" target="_blank">iRobot</a>, <a href="https://spectrum.ieee.org/robotics/industrial-robots/rethink-robotics-baxter-robot-factory-worker" target="_self">Rethink Robotics</a>, and <a href="https://www.robust.ai/" target="_blank">Robust AI</a></em><em><u>, </u>where he is currently CTO. This article is shared with permission <a href="https://rodneybrooks.com/rodney-brooks-three-laws-of-robotics/" target="_blank">from his blog</a>.</em>
</p><p>
<span></span>Here are some of the things I’ve learned about robotics after working in the field for almost five decades. In honor of Isaac Asimov and Arthur C. Clarke, my two boyhood go-to science fiction writers, I’m calling them my three laws of robotics.<br/>
</p><ol>
<li><strong>The visual appearance of a robot makes a promise about what it can do and how smart it is. It needs to deliver or slightly overdeliver on that promise or it will not be accepted.</strong></li>
<li><strong>When robots and people coexist in the same spaces, the robots must not take away from people’s agency, particularly when the robots are failing, as inevitably they will at times.</strong></li>
<li><strong>Technologies for robots need 10+ years of steady improvement beyond lab demos of the target tasks to mature to low cost and to have their limitations characterized well enough that they can deliver 99.9 percent of the time. Every 10 more years gets another 9 in reliability.</strong></li>
</ol><p>
Below I explain each of these laws in more detail. In a related post, I offer my <a href="https://rodneybrooks.com/rodney-brooks-three-laws-of-artificial-intelligence/" target="_blank">three laws of artificial intelligence</a>.
</p><p>
Note that these laws are written from the point of view of making robots work in the real world, where people pay for them, and where people want return on their investment. This is very different from demonstrating robots or robot technologies in the laboratory.
</p><p>
In the lab there is a phalanx of graduate students eager to demonstrate the latest idea on which they have worked very hard. Their interest is in showing that a technique or technology that they have developed is plausible and promising. They will do everything in their power to nurse the robot through the demonstration to make that point, and they will eagerly explain everything about what they have developed and what could come next.
</p><p>
In the real world there is just the customer, or the employee or relative of the customer. The robot has to work with no external intervention from the people who designed and built it. It needs to be a good experience for the people around it or there will not be more sales to those, and perhaps other, customers.
</p><p>
So these laws are not about what might, or could, be done. They are about real robots deployed in the real world. The laws are not about research demonstrations. They are about robots in everyday life.
</p><h2>The Promise Given By Appearance</h2><p>
My various companies have produced all sorts of robots and sold them at scale. A lot of thought goes into the visual appearance of the robot when it is designed, as that tells the buyer or user what to expect from it.
</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-left" data-rm-resized-container="25%" style="float: left;">
<img alt="overhead view a black circle robot with buttons on a white tiled floor" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="ee4bc95025216173e5ae2fefcc89e5d7" data-rm-shortcode-name="rebelmouse-image" id="42223" loading="lazy" src="https://spectrum.ieee.org/media-library/overhead-view-a-black-circle-robot-with-buttons-on-a-white-tiled-floor.jpg?id=53087131&width=980" style="max-width: 100%"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">The iRobot Roomba was carefully designed to meld looks with function.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">iStock</small></p><p>
The Roomba, from iRobot, looks like a flat disk. It cleans floors. The disk shape was so that it could turn in place without hitting anything it wasn’t already hitting. The low profile of the disk was so that it could get under the toe kicks in kitchens and clean the floor that is overhung just a little by kitchen cabinets. It does not look like it can go up and down stairs or even a single step up or step down in a house, and it cannot. It has a handle, which makes it look like it can be picked up by a person, and it can be. Unlike the fictional Rosey the Robot, it does not look like it could clean windows, and it cannot. It cleans floors, and that is it.</p><p>
The Packbot, the remotely operable military robot, also from iRobot, looks very different indeed. It has tracked wheels, like a miniature tank, and that appearance promises anyone who looks at it that it can go over rough terrain, and is not going to be stopped by steps or rocks or drops in terrain. When the Fukushima disaster happened, in 2011, Packbots were able to operate in the reactor buildings that had been smashed and wrecked by the tsunami, open door handles under remote control, drive up rubble-covered staircases, and get their cameras pointed at analog pressure and temperature gauges so that workers trying to safely secure the nuclear plant had some data about what was happening in highly radioactive areas of the plant.
</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-left" data-rm-resized-container="25%" style="float: left;">
<img alt="a rectangle robot with treads and wheels and an arm in front tries to grab an object on the ground" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="6d669c5db30a7180671d6cbbab447e31" data-rm-shortcode-name="rebelmouse-image" id="291a4" loading="lazy" src="https://spectrum.ieee.org/media-library/a-rectangle-robot-with-treads-and-wheels-and-an-arm-in-front-tries-to-grab-an-object-on-the-ground.jpg?id=53100428&width=980" style="max-width: 100%"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">An iRobot PackBot picks up a demonstration object at the Joint Robotics Repair Detachment at Victory Base Complex in Baghdad.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Alamy</small></p><p>
The point of this first law of robotics is to warn against making a robot appear more than it actually is. Perhaps that will get funding for your company, leading investors to believe that in time the robot will be able to do all the things its physical appearance suggests it might be able to do. But it is going to disappoint customers when it cannot do the sorts of things that something with that physical appearance looks like it can do. Glamming up a robot risks overpromising what the robot as a product can actually do. That risks disappointing customers. And disappointed customers are not going to be advocates for your product/robot, nor be repeat buyers.</p><h2>Preserving People’s Agency</h2><p>
The worst thing a robot can do in the workplace, as far as acceptance by people goes, is to make their jobs or lives harder by not letting them do what they need to do.
</p><p>
Robots that work in hospitals taking dirty sheets or dishes from a patient floor to where they are to be cleaned are meant to make the lives of the nurses easier. But often they do exactly the opposite. If the robots are not aware of what is happening and do not get out of the way when there is an emergency, they will probably end up blocking some lifesaving work by the nurses—e.g., pushing a gurney with a critically ill patient on it to where they need to be for immediate treatment. That does not endear such a robot to the hospital staff. It has interfered with their main job function, a function of which the staff is proud and one that motivates them to do such work.
</p><p>
A lesser, but still unacceptable, behavior of robots in hospitals is to have them wait directly in front of elevator doors, front and center, blocking the way for people. It makes it harder for people to do something they need to do all the time in that environment—enter and exit elevators.
</p><p>
Those of us who live in San Francisco or Austin, Texas, have had firsthand views of robots annoying people daily for the last few years. The robots in question have been autonomous vehicles, driving around the city with no human occupant. I see these robots every single time I leave my house, whether on foot or by car.
</p><p>
Some of the vehicles were notorious for blocking intersections, and there was absolutely nothing that other drivers, pedestrians, or police could do. We just had to wait until some remote operator hidden deep inside the company that deployed them decided to pay attention to the stuck vehicle and get it out of people’s way. Worse, they would wander into the scene of a fire where there were fire trucks and firefighters and actual buildings on fire, get confused, and just stop, sometimes on top of the fire hoses.
</p><p>
There was no way for the firefighters to move the vehicles, nor communicate with them. This is in contrast to an automobile driven by a human driver. Firefighters can use their normal social interactions to communicate with a driver, and use their privileged position in society as frontline responders to apply social pressure on a human driver to cooperate with them. Not so with the autonomous vehicles.
</p><p>
The autonomous vehicles took agency from people going about their regular business on the streets, but worse, took away agency from firefighters whose role is to protect other humans. Deployed robots that do not respect people and what they need to do will not get respect from people, and the robots will end up undeployed.
</p><h2>Robust Robots That Work Every Time<br/></h2><p>
Making robots that work reliably in the real world is hard. In fact, making anything that works physically in the real world, and is reliable, is very hard.
</p><p>
For a customer to be happy with a robot, it must appear to work every time it tries a task; otherwise, it will frustrate the user to the point that they will question whether it makes their life better or not.
</p><p>
But what does <em><em>appear</em></em> mean here? It means that the user can have the assumption that it is going to work, as their default understanding of what will happen in the world.
</p><p>
The tricky part is that robots interact with the real physical world.
</p><p>
Software programs interact with a well-understood abstracted machine, so they tend not to fail in a manner where the instructions in them do not get executed in a consistent way by the hardware on which they are running. Those same programs may also interact with the physical world, be it a human being, a network connection, or an input device like a mouse. It is then that the programs might fail, as the instructions in them are based on assumptions about the real world that are not met.
</p><p>
Robots are subject to forces in the real world, subject to the exact position of objects relative to them, and subject to interacting with humans who are very variable in their behavior. There are no teams of graduate students or junior engineers eager to make the robot succeed on the 8,354th attempt to do the same thing that has worked so many times before. Getting software that adequately adapts to the uncertain changes in the world in that particular instance and that particular instant of time is where the real challenge arises in robotics.
</p><p>
Great-looking videos are just not the same thing as working for a customer every time. Most of what we see in the news about robots is lab demonstrations. There is no data on how general the solution is, nor how many takes it took to get the video that is shown. Even worse, sometimes the videos are tele-operated or sped up many times over.
</p><p>
I have rarely seen a new technology that is less than ten years out from a lab demo make it into a deployed robot. It takes time to see how well the method works, and to characterize it well enough that it is unlikely to fail in a deployed robot that is working by itself in the real world. Even then there will be failures, and it takes many more years of shaking out the problem areas and building it into the robot product in a defensive way so that the failure does not happen again.
</p><p>
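Stated as arithmetic, the third law compounds: the first decade past the lab demo buys three nines of task reliability, and every further decade appends another. A toy calculation makes the compounding concrete:
</p>
<pre><code>
# Toy arithmetic for the third law's "another 9 every 10 years" rule.

def implied_reliability(years_past_demo: float) -> float:
    """Fraction of attempts that succeed, per the rule of thumb."""
    if years_past_demo < 10:
        raise ValueError("the first decade is table stakes")
    nines = 3 + (years_past_demo - 10) / 10
    return 1.0 - 10.0 ** (-nines)

for years in (10, 20, 30):
    r = implied_reliability(years)
    print(f"{years} years out: {r:.5%} of attempts succeed")
# 10 years out: 99.90000% (one failure per 1,000 attempts)
# 20 years out: 99.99000% (one per 10,000)
# 30 years out: 99.99900% (one per 100,000)
</code></pre>
<p>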
Most robots require kill buttons or e-stops on them so that a human can shut them down. If a customer ever feels the need to hit that button, then the people who have built and sold the robot have failed. They have not made it operate well enough that the robot never gets into a state where things are going that wrong.
</p>]]></description><pubDate>Tue, 06 Aug 2024 10:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/rodney-brooks-three-laws-robotics</guid><category>Rodney brooks</category><category>Robotics</category><dc:creator>Rodney Brooks</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-red-robot-with-its-arms-around-a-a-man-with-a-blue-shirt-against-a-yellow-background.jpg?id=53100051&width=980"></media:content></item><item><title>Video Friday: UC Berkeley’s Little Humanoid</title><link>https://spectrum.ieee.org/video-friday-berkeley-little-humanoid</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-compilation-of-shots-of-a-small-two-legged-robot-torso-falling-over-in-different-environments.gif?id=53076870&width=1200&height=800&coordinates=62%2C0%2C63%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://icra40.ieee.org/">ICRA@40</a>: 23–26 September 2024, ROTTERDAM, NETHERLANDS</h5><h5><a href="https://iros2024-abudhabi.org/">IROS 2024</a>: 14–18 October 2024, ABU DHABI, UNITED ARAB EMIRATES</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="8pr1he-wmhw"><em>We introduce Berkeley Humanoid, a reliable and low-cost mid-scale humanoid research platform for learning-based control. Our lightweight, in-house-built robot is designed specifically for learning algorithms with low simulation complexity, anthropomorphic motion, and high reliability against falls. Capable of omnidirectional locomotion and withstanding large perturbations with a compact setup, our system aims for scalable, sim-to-real deployment of learning-based humanoid systems.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="44b5d5278b482dc831992ed45b20f1f8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8pR1HE-wMHw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://berkeley-humanoid.com/">Berkeley Humanoid</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="xvzcvrct-2y"><em>This article presents Ray, a new type of audio-animatronic robot head. All the mechanical structure of the robot is built in one step by 3-D printing... This simple, lightweight structure and the separate tendon-based actuation system underneath allow for smooth, fast motions of the robot. 
We also develop an audio-driven motion generation module that automatically synthesizes natural and rhythmic motions of the head and mouth based on the given audio.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b3eaaf3a2dccd5007ce02be1b40180b3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xVZCVRct-2Y?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ieeexplore.ieee.org/document/10551622">Paper</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="w-l90bhfdfo"><em>CSAIL researchers introduce a novel approach allowing robots to be trained in simulations of scanned home environments, paving the way for customized household automation accessible to anyone.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="da6cbba32610c3bbe5c8e7488817c9c4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/w-L90BhfDFo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://news.mit.edu/2024/precision-home-robotics-real-sim-real-0731">MIT News</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="4lryrrr4zpu">Okay, sign me up for this.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="94e47f01fe589cedc5a0594e4aef63d8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/4lryrrR4zpU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">Deep Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ge7vdgr_j1g"><em>NEURA Robotics is among the first joining the early access NVIDIA Humanoid Robot Developer Program.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="54c3361f39be667981f578f5b9fd76ed" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GE7VDgR_J1g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>This could be great, but there’s an awful lot of jump cuts in that video.</p><p>[ <a href="https://neura-robotics.com/">Neura</a> ] via [ <a href="https://nvidianews.nvidia.com/news/nvidia-accelerates-worldwide-humanoid-robotics-development">NVIDIA</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="iabddpujgly">I like that Unitree’s tagline in the video description here is “Let’s have fun together.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c00428962e8b81468902b985247b5202" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/iaBDDpuJglY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" 
width="100%"></iframe></span></p><p>Is that “please don’t do dumb stuff with our robots” at the end of the video new...?</p><p>[ <a href="https://www.unitree.com/">Unitree</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="jrd_de_kr1c"><em>NVIDIA CEO Jensen Huang presented a major breakthrough on Project GR00T with WIRED’s Lauren Goode at SIGGRAPH 2024. In a two-minute demonstration video, NVIDIA explained a systematic approach they discovered to scale up robot data, addressing one of the most challenging issues in robotics.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0af0f3ba850d22d0a3a238f11bdaa21e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/JRd_De_KR1c?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://nvidianews.nvidia.com/news/nvidia-accelerates-worldwide-humanoid-robotics-development" target="_blank">Nvidia</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="-nmgtxonwua"><em>In this research, we investigated the innovative use of a manipulator as a tail in quadruped robots to augment their physical capabilities. Previous studies have primarily focused on enhancing various abilities by attaching robotic tails that function solely as tails on quadruped robots. While these tails improve the performance of the robots, they come with several disadvantages, such as increased overall weight and higher costs. To mitigate these limitations, we propose the use of a 6-DoF manipulator as a tail, allowing it to serve both as a tail and as a manipulator.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2065922425d3255f5e7ba1b895afbefa" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/-nMGtxonwUA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2407.10420">Paper</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="fzjdxd1pgpi"><em>In this end-to-end demo, we showcase how MenteeBot transforms the shopping experience for individuals, particularly those using wheelchairs. Through discussions with a global retailer, MenteeBot has been designed to act as the ultimate shopping companion, offering a seamless, natural experience.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="fc665df087dc0d63106e068e3662bf85" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/fzjdXD1pGpI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://menteebot.com/blog/#shopping-companion-2024">Menteebot</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="rk_1whdlkb4"><em>Nature Fresh Farms, based in Leamington, Ontario, is one of North America’s largest greenhouse farms growing high-quality organics, berries, peppers, tomatoes, and cucumbers. 
In 2022, Nature Fresh partnered with Four Growers, a FANUC Authorized System Integrator, to develop a robotic system equipped with AI to harvest tomatoes in the greenhouse environment.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="fa080433250270e6b3c0a16935030463" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Rk_1WHdlkb4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.fanucamerica.com/case-studies/robotic-automation-ai-harvests-tomatoes">FANUC</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="082whnjlunq">Contrary to what you may have been led to believe by several previous Video Fridays, WVUIRL’s open source rover is quite functional, most of the time.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3cb10203ae3eaf8fb9df7e899a8c32ea" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/082wHnJlunQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://urc.orgs.wvu.edu/open-source-documentation">WVUIRL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="wvrxenvlv0i"><em>Honeybee Robotics, a Blue Origin company, is developing Lunar Utility Navigation with Advanced Remote Sensing and Autonomous Beaming for Energy Redistribution, also known as LUNARSABER. In July 2024, Honeybee Robotics captured LUNARSABER’s capabilities during a demonstration of a scaled prototype.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2b6c0e6fd6b4818e6242d15873671023" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wvrxEnvLv0I?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.honeybeerobotics.com/">Honeybee Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="kvedwjpd6lw">Bunker Mini is a compact tracked mobile robot specifically designed to tackle demanding off-road terrains.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c365325ee279e56c426987fc0724a99d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/kvedwJPd6lw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://global.agilex.ai/products/bunker-mini">AgileX</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="cqx_f1t-bec"><em>In this video we present results of our lab from the latest field deployments conducted in the scope of the Digiforest EU project, in Stein am Rhein, Switzerland. Digiforest brings together various partners working on aerial and legged robots, autonomous harvesters, and forestry decision-makers. 
The goal of the project is to enable autonomous robot navigation, exploration, and mapping, both below and above the canopy, to create a data pipeline that can support and enhance foresters’ decision-making systems.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="31998f40d0f1c05b31ac24208f6e9728" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/CQx_F1t-bEc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.autonomousrobotslab.com/">ARL</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 02 Aug 2024 16:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-berkeley-little-humanoid</guid><category>Video friday</category><category>Robotics</category><category>Humanoid robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/a-compilation-of-shots-of-a-small-two-legged-robot-torso-falling-over-in-different-environments.gif?id=53076870&width=980"></media:content></item><item><title>Will This Flying Camera Finally Take Off?</title><link>https://spectrum.ieee.org/hoverair-x1</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-flat-white-hover-drone-over-a-person-s-outstretched-hand.jpg?id=53017438&width=1200&height=800&coordinates=0%2C88%2C0%2C88"/><br/><br/><p>Ten years. Two countries. Multiple redesigns. Some US $80 million invested. And, finally, <a href="https://zerozerorobotics.com/" target="_blank">Zero Zero Robotics</a> has a product it says is ready for consumers, not just robotics hobbyists—the <a href="https://hoverair.com/" target="_blank">HoverAir X1</a>. The company has sold several hundred thousand flying cameras since the HoverAir X1 started shipping last year. It hasn’t gotten the millions of units into consumer hands—or flying above them—that its founders would like to see, but it’s a start.<br/></p><p>“It’s been like a 10-year-long Ph.D. project,” says Zero Zero founder and CEO Meng Qiu Wang. “The thesis topic hasn’t changed. In 2014 I looked at my cell phone and thought that if I could throw away the parts I don’t need—like the screen—and add some sensors, I could build a tiny robot.”</p><p>I first spoke to Wang <a href="https://spectrum.ieee.org/camera-drone-could-be-a-robot-command-center" target="_self">in early 2016</a>, when Zero Zero came out of stealth with its version of a flying camera—at $600. Wang had been working on the project for two years. He started the project in Silicon Valley, where he and cofounder Tony Zhang were finishing up Ph.D.s in computer science at Stanford University. Then the two decamped for China, where development costs are far less.</p><p>Flying cameras were a hot topic at the time; startup <a href="https://spectrum.ieee.org/is-lily-a-drone-with-a-camera" target="_self">Lily Robotics</a> demonstrated a $500 flying camera in mid-2015 (and was later <a href="https://www.forbes.com/sites/aarontilley/2017/01/13/lawsuit-killed-lily-robotics-drones/" target="_blank">charged with fraud</a> for faking its demo video), and in March of 2016 drone-maker <a href="https://www.dji.com/" rel="noopener noreferrer" target="_blank">DJI</a> introduced a drone with autonomous flying and tracking capabilities that turned it into much the same type of flying camera that Wang envisioned, albeit at the high price of $1400.</p><p>Wang aimed to make his flying camera cheaper and easier to use than these competitors by relying on image processing for navigation—no altimeter, no GPS. In this approach, which has changed little since the first design, one camera looks at the ground and algorithms follow the camera’s motion to navigate. Another camera looks out ahead, using facial and body recognition to track a single subject.</p><p>The current version, at $349, does what Wang had envisioned, which is, he told me, “to turn the camera into a cameraman.” But, he points out, the hardware and software, and particularly the user interface, changed a lot. The size and weight have been cut in half; it’s just 125 grams. This version uses a different and more powerful chipset, and the controls are on board; while you can select modes from a smart phone app, you don’t have to.</p><p>I can verify that it is cute (about the size of a paperback book), lightweight, and extremely easy to use. I’ve never flown a standard drone without help or crashing but had no problem sending the HoverAir up to follow me down the street and then land on my hand.</p><p>It isn’t perfect. It can’t fly over water—the movement of the water confuses the algorithms that judge speed through video images of the ground. 
And it only tracks people; though many would like it to track their pets, Wang says animals behave erratically, diving into bushes or other places the camera can’t follow. Since the autonomous navigation algorithms rely on the person being filmed to avoid objects, and the drone simply follows that person’s path, such dives tend to cause it to crash.</p><p>Since we last spoke eight years ago, Wang has been through the highs and lows of the startup rollercoaster, turning to contract engineering for a while to keep his company alive. He’s become philosophical about much of the experience.</p><p>Here’s what he had to say.</p><p><strong>We last spoke in 2016. Tell me how you’ve changed.</strong><br/></p><p><strong>Meng Qiu Wang: </strong>When I got out of Stanford in 2014 and started the company with Tony [Zhang], I was eager and hungry and hasty and I thought I was ready. But retrospectively, I wasn’t ready to start a company. I was chasing fame and money, and excitement.</p><p>Now I’m 42, I have a daughter—everything seems more meaningful now. I’m not a Buddhist, but I have a lot of Zen in my philosophy now.</p><p>I was trying so hard to flip the page to see the next chapter of my life, but now I realize, there is no next chapter, flipping the page itself is life.</p><p><strong>You were moving really fast in 2016 and 2017. What happened during that time?</strong></p><p><strong>Wang: </strong>After coming out of stealth, we ramped up from 60 to 140 people planning to take this product into mass production. We got a crazy amount of media attention—covered by 2,200 media outlets. We went to CES, and it seemed like we collected every trophy there was.</p><p>And then Apple came to us, inviting us to retail at all the Apple stores. This was a big deal; I think we were the first third-party robotic product to do live demos in Apple stores. We produced about 50,000 units, bringing in about $15 million in revenue in six months.</p><p>Then a giant company made us a generous offer and we took it. But it didn’t work out. It was certainly a lesson learned for us. I can’t say more about that, but at this point if I walk down the street and I see a box of pizza, I would not try to open it; there really is no free lunch.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-left" data-rm-resized-container="25%" style="float: left;">
<img alt="a black caged drone with fans and a black box in the middle" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="c1a02584face3321e3a97fc8d51fd54e" data-rm-shortcode-name="rebelmouse-image" id="c1998" loading="lazy" src="https://spectrum.ieee.org/media-library/a-black-caged-drone-with-fans-and-a-black-box-in-the-middle.jpg?id=53017581&width=980" style="max-width: 100%"/>
<small class="image-media media-caption" data-gramm="false" data-lt-tmp-id="lt-525969" placeholder="Add Photo Caption..." spellcheck="false" style="max-width: 100%;">This early version of the Hover flying camera generated a lot of initial excitement, but never fully took off.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..." style="max-width: 100%;">Zero Zero Robotics</small></p><p><strong>How did you survive after that deal fell apart?</strong></p><p><strong>Wang:</strong> We went from 150 to about 50 people and turned to contract engineering. We worked with toy drone companies, with some industrial product companies. We built computer vision systems for larger drones. We did almost four years of contract work.</p><p><strong>But you kept working on flying cameras and launched a Kickstarter campaign in 2018. What happened to that product?</strong></p><p><strong>Wang:</strong> It didn’t go well. The technology wasn’t really there. We filled some orders and refunded ones that we couldn’t fill because we couldn’t get the remote controller to work.</p><p>We really didn’t have enough resources to create a new product for a new product category, a flying camera, to educate the market.</p><p>So we decided to build a more conventional drone—our V-Coptr, a V-shaped bi-copter with only two propellers—to compete against DJI. We didn’t know how hard it would be. We worked on it for four years. Key engineers left out of total dismay, they lost faith, they lost hope.</p><p>We came so close to going bankrupt so many times—at least six times in 10 years I thought I wasn’t going to be able to make payroll for the next month, but each time I got super lucky with something random happening. I never missed paying one dime—not because of my abilities, just because of luck.</p><p>We still have a relatively healthy chunk of the team, though. And this summer my first ever software engineer is coming back. The people are the biggest wealth that we’ve collected over the years. The people who are still with us are not here for money or for success. We just realized along the way that we enjoy working with each other on impossible problems.</p><p><strong>When we talked in 2016, you envisioned the flying camera as the first in a long line of personal robotics products. Is that still your goal?</strong></p><p><strong>Wang:</strong> In terms of short-term strategy, we are focusing 100 percent on the flying camera. I think about other things, but I’m not going to say I have an AI hardware company, though we do use AI. After 10 years I’ve given up on talking about that.</p><p><strong>Do you still think there’s a big market for a flying camera?</strong></p><p><strong>Wang:</strong> I think flying cameras have the potential to become the second home robot [the first being the robotic vacuum] that can enter tens of millions of homes.</p>]]></description><pubDate>Wed, 31 Jul 2024 12:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/hoverair-x1</guid><category>Drones</category><category>Entrepreneur</category><category>Startups</category><category>Zero zero robotics</category><category>Flying camera</category><dc:creator>Tekla S. Perry</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-flat-white-hover-drone-over-a-person-s-outstretched-hand.jpg?id=53017438&width=980"></media:content></item><item><title>A Robot Dentist Might Be a Good Idea, Actually</title><link>https://spectrum.ieee.org/robot-dentist</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-man-wearing-red-googles-and-a-blue-dental-bib-lies-prone-with-metallic-arms-near-his-face.jpg?id=53005021&width=1200&height=800&coordinates=0%2C1%2C0%2C2"/><br/><br/><p>
I’ll be honest: when I first got this pitch for an autonomous robot dentist, I was like: “Okay, I’m going to talk to these folks and then write an article, because there’s no possible way for this thing to be anything but horrific.” Then they sent me some video that was, in fact, horrific, in the way that only watching a high-speed drill remove most of a tooth can be.
</p><p>
But fundamentally this has very little to do with robotics, because getting your teeth drilled just sucks no matter what. So the real question we should be asking is this: How can we make a dental procedure as quick and safe as possible, to minimize that inherent horrific-ness? And the answer, surprisingly, may be this robot from a startup called <a href="https://www.perceptive.io/" target="_blank">Perceptive</a>.
</p><p>
Perceptive is today announcing two new technologies that I very much hope will make future dental experiences better for everyone. While it’s easy to focus on the robot here (because, well, it’s a robot), the reason the robot can do what it does (which we’ll get to in a minute) is because of a new imaging system. The handheld imager, which is designed to operate inside of your mouth, uses <a href="https://spectrum.ieee.org/tag/optical-coherence-tomography" target="_blank">optical coherence tomography (OCT)</a> to generate a 3D image of the inside of your teeth, and even all the way down below the gum line and into the bone. This is vastly better than the 2D or 3D x-rays that dentists typically use, both in resolution and positional accuracy. <br/>
</p><p class="shortcode-media shortcode-media-rebelmouse-image">
<img alt="A hand in a blue medical glove holds a black wand-like device with a circuit board visible." class="rm-shortcode" data-rm-shortcode-id="a2b0652f97ea75afbf5e51d020eeb0e4" data-rm-shortcode-name="rebelmouse-image" id="03b3b" loading="lazy" src="https://spectrum.ieee.org/media-library/a-hand-in-a-blue-medical-glove-holds-a-black-wand-like-device-with-a-circuit-board-visible.jpg?id=53005092&width=980"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">Perceptive’s handheld optical coherence tomography imager scans for tooth decay.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Perceptive</small>
</p><p>
X-rays, it turns out, are actually really bad at detecting cavities; <a href="https://www.linkedin.com/in/christopher-ciriello/" target="_blank">Perceptive CEO Chris Ciriello</a> tells us that the accuracy is on the order of 30 percent when it comes to figuring out the location and extent of tooth decay. In practice, this isn’t as much of a problem as it seems like it should be, because the dentist will just start drilling into your tooth and keep going until they find everything. But obviously this won’t work for a robot, where you need all of the data beforehand. That’s where the OCT comes in. You can think of OCT as similar to an ultrasound, in that it uses reflected energy to build up an image, but OCT uses light instead of sound for much higher resolution.
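</p><p>
To get a feel for why OCT resolves so much finer detail than an x-ray, its axial resolution follows from the light source’s center wavelength and bandwidth; the standard Gaussian-source formula is computed below. The specific numbers are illustrative assumptions, not Perceptive’s published specs.
</p><pre><code>
# Back-of-the-envelope axial resolution for OCT with a Gaussian source:
#   delta_z = (2 ln 2 / pi) * lambda0^2 / delta_lambda
# The wavelengths below are illustrative, not Perceptive's specs.
import math

lambda0 = 1300e-9        # assumed center wavelength, meters
delta_lambda = 100e-9    # assumed source bandwidth, meters

delta_z = (2 * math.log(2) / math.pi) * lambda0 ** 2 / delta_lambda
print(f"axial resolution: {delta_z * 1e6:.1f} micrometers")  # about 7.5
</code></pre><p>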
</p><p class="shortcode-media shortcode-media-rebelmouse-image">
<img alt="A short video shows outlines of teeth in progressively less detail, but highlights some portions in blood red." class="rm-shortcode" data-rm-shortcode-id="da1433d70ca2d712acc7995555f1dd1b" data-rm-shortcode-name="rebelmouse-image" id="6914d" loading="lazy" src="https://spectrum.ieee.org/media-library/a-short-video-shows-outlines-of-teeth-in-progressively-less-detail-but-highlights-some-portions-in-blood-red.gif?id=53005412&width=980"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">Perceptive’s imager can create detailed 3D maps of the insides of teeth.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Perceptive</small>
</p><p>
The reason OCT has not been used for teeth before is because with conventional OCT, the exposure time required to get a detailed image is several seconds, and if you move during the exposure, the image will blur. Perceptive is instead using a structure from motion approach (which will be familiar to many robotics folks), where they’re relying on a much shorter exposure time resulting in far fewer data points, but then moving the scanner and collecting more data to gradually build up a complete 3D image. According to Ciriello, this approach can localize pathology within about 20 micrometers with over 90 percent accuracy, and it’s easy for a dentist to do since they just have to move the tool around your tooth in different orientations until the scan completes.
</p><p>
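In code, the accumulation step of such an approach looks conceptually like the sketch below: each short exposure contributes a sparse batch of 3D points in the scanner’s frame, and a pose estimate for that exposure maps the batch into a common tooth frame. This is a schematic illustration, not Perceptive’s pipeline; the pose inputs are assumed to come from the scanner’s own motion estimation.
</p><pre><code>
# Schematic of building one 3D model from many short, sparse exposures.
# Each exposure carries an estimated scanner pose (R, t); this is an
# illustrative sketch, not Perceptive's actual reconstruction code.
import numpy as np

def merge_exposures(exposures):
    """exposures: list of (R, t, points), where R is a 3x3 rotation,
    t is a 3-vector, and points is an Nx3 array in the scanner frame.
    Returns a single Mx3 point cloud in the common tooth frame."""
    model = []
    for R, t, points in exposures:
        # Rigidly transform this exposure into the tooth frame.
        model.append(points @ R.T + t)
    return np.vstack(model)
</code></pre><p>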
Again, this is not just about collecting data so that a robot can get to work on your tooth. It’s about better imaging technology that helps your dentist identify and treat issues you might be having. “We think this is a fundamental step change,” Ciriello says. “We’re giving dentists the tools to find problems better.”
</p><p class="shortcode-media shortcode-media-rebelmouse-image">
<img alt="A silvery robotic arm with a small drill at the end." class="rm-shortcode" data-rm-shortcode-id="1e80d2a540270031fcdbd2c431c21855" data-rm-shortcode-name="rebelmouse-image" id="a2c26" loading="lazy" src="https://spectrum.ieee.org/media-library/a-silvery-robotic-arm-with-a-small-drill-at-the-end.jpg?id=53005188&width=980"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">The robot is mechanically coupled to your mouth for movement compensation.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Perceptive</small>
</p><p>
Ciriello was a practicing dentist in a small mountain town in British Columbia, Canada. People in such communities can have a difficult time getting access to care. “There aren’t too many dentists who want to work in rural communities,” he says. “Sometimes it can take months to get treatment, and if you’re in pain, that’s really not good. I realized that what I had to do was build a piece of technology that could increase the productivity of dentists.”
</p><p>
Perceptive’s robot is designed to take a dental procedure that typically requires several hours and multiple visits, and complete it in minutes in a single visit. The entry point for the robot is crown installation, where the top part of a tooth is replaced with an artificial cap (the crown). This is an incredibly common procedure, and it usually happens over two visits. First, the dentist will remove the top of the tooth with a drill. Next, they take a mold of the tooth so that a crown can be custom fit to it. Then they put a temporary crown on and send you home while they mail the mold off to get your crown made. A couple of weeks later, the permanent crown arrives, you go back to the dentist, and they remove the temporary one and cement the permanent one on.
</p><p>
With Perceptive’s system, it instead goes like this: on a previous visit where the dentist has identified that you need a crown in the first place, you’d have gotten a scan of your tooth with the OCT imager. Based on that data, the robot will have planned a drilling path, and then the crown could be made before you even arrive for the drilling to start, which is only possible because the precise geometry is known in advance. You arrive for the procedure, the robot does the actual drilling in maybe five minutes or so, and the perfectly fitting permanent crown is cemented into place and you’re done.
</p><p class="shortcode-media shortcode-media-rebelmouse-image">
<img alt="A silvery robotic arm with a small drill at the end. The arm is mounted on a metal cart with a display screen." class="rm-shortcode" data-rm-shortcode-id="376deafa0e9edb20f89a26deb27a5c93" data-rm-shortcode-name="rebelmouse-image" id="93fc5" loading="lazy" src="https://spectrum.ieee.org/media-library/a-silvery-robotic-arm-with-a-small-drill-at-the-end-the-arm-is-mounted-on-a-metal-cart-with-a-display-screen.jpg?id=53005257&width=980"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">The robot is still in the prototype phase but could be available within a few years.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Perceptive</small>
</p><p>
Obviously, safety is a huge concern here, because you’ve got a robot arm with a high-speed drill literally working inside of your skull. Perceptive is well aware of this.
</p><p>
The most important thing to understand about the Perceptive robot is that it’s physically attached to you as it works. You put something called a bite block in your mouth and bite down on it, which both keeps your mouth open and keeps your jaw from getting tired. The robot’s end effector is physically attached to that block through a series of actuated linkages, such that any motions of your head are instantaneously replicated by the end of the drill, even if the drill is moving. Essentially, your skull is serving as the robot’s base, and your tooth and the drill are in the same reference frame. Purely mechanical coupling means there’s no vision system or encoders or software required: it’s a direct physical connection so that motion compensation is instantaneous. As a patient, you’re free to relax and move your head somewhat during the procedure, because it makes no difference to the robot.
</p><p>
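A toy calculation shows why this works: apply the same rigid motion (the patient’s head movement) to both the tooth and the drill tip, and the distance between them is unchanged. The positions below are made up purely for illustration.
</p><pre><code>
# Why mechanical coupling works: a rigid head motion applied to both the
# tooth and the drill tip leaves their separation unchanged. Toy numbers.
import numpy as np

tooth = np.array([0.03, 0.0, 0.0])     # tooth position in skull frame, m
drill = np.array([0.03, 0.0, 0.002])   # drill tip, 2 mm above the tooth

def head_motion(p, theta, t):
    """Rotate p by theta about the z-axis, then translate by t."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    return R @ p + t

shift = np.array([0.01, -0.005, 0.002])  # the patient shifts their head
before = np.linalg.norm(drill - tooth)
after = np.linalg.norm(head_motion(drill, 0.2, shift) -
                       head_motion(tooth, 0.2, shift))
print(before, after)  # identical: the drill stays 2 mm from the tooth
</code></pre><p>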
Human dentists do have some strategies for not stabbing you with a drill if you move during a procedure, like putting their fingers on your teeth and then supporting the drill on them. But this robot should be safer and more accurate than that method, because of the rigid connection leading to only a few tens of micrometers of error, even on a moving patient. It’ll move a little bit slower than a dentist would, but because it’s only drilling exactly where it needs to, it can complete the procedure faster overall, says Ciriello.
</p><p>
There’s also a physical counterbalance system within the arm, a nice touch that makes the arm effectively weightless. (It’s somewhat similar to the PR2 arm, for you OG robotics folks.) And the final safety measure keeps the dentist in the loop: a foot pedal must remain pressed, or the robot will stop moving and turn off the drill.
</p><p>
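For the curious, the counterbalance idea reduces to a static torque balance about the arm’s joint: size the counterweight so that its moment cancels the arm’s. The masses and lever arms below are invented for illustration, not the robot’s actual parameters.
</p><pre><code>
# Static gravity compensation: choose a counterweight whose moment about
# the joint cancels the arm's. All values are illustrative assumptions.
ARM_MASS_KG = 4.0    # assumed arm mass
ARM_COM_M = 0.25     # assumed distance from joint to arm's center of mass
CW_LEVER_M = 0.10    # assumed counterweight lever arm

# Balance condition: m_cw * r_cw = m_arm * r_arm
counterweight_kg = ARM_MASS_KG * ARM_COM_M / CW_LEVER_M
print(f"counterweight: {counterweight_kg:.1f} kg")  # 10.0 kg
</code></pre><p>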
Ciriello claims that not only can the robot work faster, it will also produce better results. Most restorations like fillings or crowns last about five years, because the dentist either removed too much material from the tooth and weakened it, or removed too little material and didn’t completely solve the underlying problem. Perceptive’s robot can be far more exact. Ciriello says that the robot can cut geometry that’s “not humanly possible,” fitting restorations onto teeth with the precision of custom-machined parts, which is pretty much exactly what they are.
</p><p class="shortcode-media shortcode-media-rebelmouse-image">
<img alt="A short video shows a d dental drill working on a tooth in a person's mouth." class="rm-shortcode" data-rm-shortcode-id="ec4300dcd0d94303904bb501e1cb58e9" data-rm-shortcode-name="rebelmouse-image" id="4edba" loading="lazy" src="https://spectrum.ieee.org/media-library/a-short-video-shows-a-d-dental-drill-working-on-a-tooth-in-a-person-s-mouth.gif?id=53005587&width=980"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">Perceptive has successfully used its robot on real human patients, as shown in this sped-up footage. In reality the robot moves slightly slower than a human dentist.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Perceptive</small>
</p><p>
While it’s easy to focus on the technical advantages of Perceptive’s system, dentist <a href="https://www.linkedin.com/in/painlessdrz/" rel="noopener noreferrer" target="_blank"><u>Ed Zuckerberg</u></a> (who’s an investor in Perceptive) points out that it’s not just about speed or accuracy; it’s also about making patients feel better. “Patients think about the precision of the robot, versus the human nature of their dentist,” Zuckerberg says. It gives them confidence to see that their dentist is using technology in their work, especially in ways that can address common phobias. “If it can enhance the patient experience or make the experience more comfortable for phobic patients, that automatically checks the box for me.”
</p><p>
There is currently one other dental robot on the market. Called <a href="https://www.neocis.com/" rel="noopener noreferrer" target="_blank"><u>Yomi</u></a>, it offers assistance with one very specific procedure: placing dental implants. Yomi is not autonomous; instead, it provides guidance to make sure that the dentist drills to the correct depth and angle.
</p><p>
While Perceptive has successfully tested its first-generation system on humans, it’s not yet ready for commercialization. The next step will likely be what’s called a pivotal clinical trial with the FDA, and if that goes well, Ciriello estimates that the robot could be available to the public in “several years.” Perceptive has raised US $30 million in funding so far, and here’s hoping that’s enough to get the company across the finish line.
</p>]]></description><pubDate>Tue, 30 Jul 2024 12:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/robot-dentist</guid><category>Medical robots</category><category>Optical coherence tomography</category><category>Robot dentist</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-man-wearing-red-googles-and-a-blue-dental-bib-lies-prone-with-metallic-arms-near-his-face.jpg?id=53005021&width=980"></media:content></item><item><title>Video Friday: Robot Baby With a Jet Pack</title><link>https://spectrum.ieee.org/video-friday-robot-baby-jetpack</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-robot-with-jet-engines-with-blue-flames-coming-out-attached-to-its-arms-and-back-stands-in-a-safety-frame-on-a-rooftop-at-nigh.gif?id=52974821&width=1200&height=800&coordinates=62%2C0%2C63%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://icra40.ieee.org/">ICRA@40</a>: 23–26 September 2024, ROTTERDAM, NETHERLANDS</h5><h5><a href="https://iros2024-abudhabi.org/">IROS 2024</a>: 14–18 October 2024, ABU DHABI, UAE</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="wkhqym57usw">If the Italian Institute of Technology’s iRonCub3 looks this cool while <em>learning</em> to fly, just imagine how cool it will look when it actually takes off!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d6dae7256db9afa12f8a7317404c4962" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wKhqym57USw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Hovering is in the works, but this is a really hard problem, which you can read more about in Daniele Pucci’s post on LinkedIn.</p><p>[ <a href="https://www.linkedin.com/posts/daniele-pucci-78428420_flying-jetpowered-humanoidrobotics-activity-7221876503725166595-nWSL?utm_source=share&utm_medium=member_desktop">LinkedIn</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="8j_wit-rd74"><em>Stanford Engineering and the Toyota Research Institute achieve the world’s first autonomous tandem drift. Leveraging the latest AI technology, Stanford Engineering and TRI are working to make driving safer for all. By automating a driving style used in motorsports called drifting—in which a driver deliberately spins the rear wheels to break traction—the teams have unlocked new possibilities for future safety systems.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4a3a13a9bc7a4bfee74f2c445698288d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8J_WiT-RD74?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://toyotaresearch.medium.com/stanford-engineering-and-toyota-research-institute-achieve-worlds-first-autonomous-tandem-drift-131fcb9a76a9">TRI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="dx7y0bchwmw"><em>Researchers at the Istituto Italiano di Tecnologia (Italian Institute of Technology) have demonstrated that under specific conditions, humans can treat robots as coauthors of the results of their actions. 
The condition that enables this phenomenon is a robot that behaves in a social, humanlike manner. Engaging in eye contact and participating in a common emotional experience, such as watching a movie, are key.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="60307c73020e7095cf8efbc264655a7b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DX7y0bChWmw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.science.org/doi/10.1126/scirobotics.adj3665">Science Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="hjueskudnds">If Aibo is not quite catlike enough for you, here you go.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c923a7cf3666cd8c8b9815fe1804893d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/hJUesKudNDs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.makuake.com/project/maicat/">Maicat</a> ] via [ <a href="https://robotstart.info/2024/07/24/cat-robot-maicat-on-sale-japan.html">RobotStart</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="69svc-43oqg">I’ve never been more excited for a sim-to-real gap to be bridged.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bcb4c13c6097ae2ac5f1fb6809ba752c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/69SVc-43Oqg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sites.usc.edu/quann/">USC Viterbi</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="m-utklmylhk">I’m sorry, but this looks exactly like a quadrotor sitting on a test stand.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="51c5a294715c623d6e6295b2c3111ce7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/M-utKlMYlHk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>The 12-pound Quad-Biplane combines four rotors and two wings without any control surfaces. The aircraft takes off like a conventional quadcopter and transitions to a more-efficient horizontal cruise flight, similar to that of a biplane. This combines the simplicity of a quadrotor design, providing vertical flight capability, with the cruise efficiency of a fixed-wing aircraft. 
The rotors are responsible for aircraft control both in vertical and forward cruise flight regimes.</em></blockquote><p>[ <a href="https://avfl.engr.tamu.edu/">AVFL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="okj-newfeci">Tensegrity robots are so weird, and I so want them to be useful.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a09f13b1b2e8b94f78ceb2164b0a153c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/OkJ-nEWFEcI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="http://www-robot.mes.titech.ac.jp/index_e.html">Suzumori Endo Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="xqqxnbgr3ii">Top-performing robots need all the help they can get.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d1993d56b4ee8850163cce456795cd04" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xqqxnbGR3II?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://b-human.de/index.html">Team B-Human</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="-ca3xqe6z_4">And now: a beetle nearly hit by an autonomous robot.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="61aec8c749093473535f7c49aea3aa54" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/-Ca3xqe6z_4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://yugu.faculty.wvu.edu/">WVUIRL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="tmgk0ata5hk"><em>Humans possess a remarkable ability to react to unpredictable perturbations through immediate mechanical responses, which harness the visco-elastic properties of muscles to maintain balance. 
Inspired by this behavior, we propose a novel design of a robotic leg utilizing fiber-jammed structures as passive compliant mechanisms to achieve variable joint stiffness and damping.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a6f722be407faca7426d51b4544886d4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/TMGk0ATA5Hk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2308.01758">Paper</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="wjb4l4gly8c">I don’t know what this piece of furniture is, but your cats will love it.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9bded74939eec40eb92d0903446ae5bf" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/WJB4L4gLY8c?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://new.abb.com/news/detail/117938/cstmr-robot-wound-veneer-a-sustainable-building-material-for-the-future">ABB</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="sd0p5mmaykm"><em>This video shows a dexterous avatar humanoid robot with VR teleoperation, hand tracking, and speech recognition to achieve highly dexterous mobile manipulation. Extend Robotics is developing a dexterous remote-operation interface to enable data collection for embodied AI and humanoid robots.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="eace905cc56c5c3645894e2bd582f71e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/sD0p5mmAYKM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.extendrobotics.com/">Extend Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="diautbgrxki"><em>I never really thought about this, but wind turbine blades are hollow inside and need to be inspected sometimes, which is really one of those jobs where you’d much rather have a robot do it.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a7b14059d0c2df39a96be46cde0e920d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dIAUTbgRXkI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.flyability.com/">Flyability</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="zqypazevbia"><em>Here’s a full, uncut drone-delivery mission, including a package pickup from our AutoLoader—a simple, nonpowered mechanical device that allows retail partners to utilize drone delivery with existing curbside-pickup workflows.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c7ba3c7bd5f25392d1b00056834e5b90" 
style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zQYPaZeVbIA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://blog.wing.com/2024/01/customer-demand-and-wings-aircraft.html">Wing</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="687hxi0_5li">Daniel Simu and his acrobatic robot competed in “America’s Got Talent,” and even though his robot did a very robot thing by breaking itself immediately beforehand, the performance went really well.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c46e071cf5be504cace4699b280a190a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/687HXI0_5lI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://acrobot.nl/">Acrobot</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="k5_r0r8tiby">A tour of the Creative Robotics Mini Exhibition at the Creative Computing Institute, University of the Arts London.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="41739cdf34dec916c9f5872c4d828466" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/k5_R0R8TiBY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.arts.ac.uk/subjects/creative-computing/postgraduate/msc-creative-robotics">UAL</a> ]</p><p>Thanks, Hooman!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="dycujjms3uo"><em>Zoox CEO Aicha Evans and cofounder and chief technology officer Jesse Levinson hosted a LinkedIn Live last week to reflect on the past decade of building Zoox and their predictions for the next 10 years of the autonomous-vehicle industry.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="62682d4df022a6c98247bd82b5ca608e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DYcujjMs3Uo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://zoox.com/">Zoox</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 26 Jul 2024 16:59:31 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robot-baby-jetpack</guid><category>Robotics</category><category>Video friday</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/a-robot-with-jet-engines-with-blue-flames-coming-out-attached-to-its-arms-and-back-stands-in-a-safety-frame-on-a-rooftop-at-nigh.gif?id=52974821&width=980"></media:content></item><item><title>Elephant Robotics’ Mercury Humanoid Robot Empowers Embodied AI Research</title><link>https://spectrum.ieee.org/elephant-robotics-mercury</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/two-grouped-photos-showing-a-winking-robot-standing-in-a-room-and-another-smiling-robot-in-a-kitchen-area.jpg?id=52857053&width=1200&height=800&coordinates=150%2C0%2C150%2C0"/><br/><br/><p><em>
This is a sponsored article brought to you by <a href="https://www.elephantrobotics.com/en/" target="_blank">Elephant Robotics</a>.</em></p><p><a href="https://www.elephantrobotics.com/en/" rel="noopener noreferrer" target="_blank">Elephant Robotics</a> has spent years on research and development in pursuit of its mission of bringing robots to millions of homes and its vision of “Enjoy Robots World”. From the collaborative industrial P-series and C-series robots, on the drawing board since the company’s founding in 2016, to the lightweight desktop 6 DOF collaborative robot <a href="https://shop.elephantrobotics.com/collections/mycobot-280" rel="noopener noreferrer" target="_blank">myCobot 280</a> in 2020, to the dual-armed, semi-humanoid robot <a href="https://shop.elephantrobotics.com/collections/mybuddy/products/mybuddy-280" rel="noopener noreferrer" target="_blank">myBuddy</a> in 2022, Elephant Robotics has launched three to five robots per year. This year’s full-body humanoid robot, the <a href="https://www.elephantrobotics.com/en/mercury-humanoid-robot/" rel="noopener noreferrer" target="_blank">Mercury series</a>, promises to reshape the landscape of non-human workers, bringing intelligent robots like Mercury into research and education and even everyday home environments.</p><h2>A Commitment to Practical Robotics</h2><p>
<a href="https://shop.elephantrobotics.com/products/mercury-humanoid-robot-series?_pos=1&_psq=mercury&_ss=e&_v=1.0&variant=47556966875448" rel="noopener noreferrer" target="_blank">Elephant Robotics</a> proudly introduces the Mercury Series, a suite of humanoid robots that not only push the boundaries of innovation but also embody a deep commitment to practical applications. Designed with the future of robotics in mind, the Mercury Series is poised to become the go-to choice for researchers and industry professionals seeking reliable, scalable, and robust solutions.
</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="20a19f4fa2a474864196ce49dc15f94c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Ru24sDmK8yI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
<small class="image-media media-caption" placeholder="Add Photo Caption..."><u><br/></u></small>
<small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Elephant Robotics</small></p><h2>The Genesis of Mercury Series: Bridging Vision With Practicality</h2><p>
From the outset, the Mercury Series has been envisioned as more than just a collection of advanced prototypes. It is a testament to Elephant Robotics’ dedication to creating humanoid robots that are not only groundbreaking in their capabilities but also practical for mass production and consistent, reliable use in real-world applications.
</p><h2>Mercury X1: Wheeled Humanoid Robot</h2><p>
<a href="https://www.elephantrobotics.com/en/mercury-x1-en/" rel="noopener noreferrer" target="_blank">The Mercury X1</a> is a versatile wheeled humanoid robot that combines advanced functionalities with mobility. Equipped with dual NVIDIA Jetson controllers, lidar, ultrasonic sensors, and an 8-hour battery life, the X1 is perfect for a wide range of applications, from exploratory studies to commercial tasks requiring mobility and adaptability.
</p><h2>Mercury B1: Dual-Arm Semi-Humanoid Robot</h2><p>
<a href="https://www.elephantrobotics.com/en/mercury-b1-en/" rel="noopener noreferrer" target="_blank">The Mercury B1</a> is a semi-humanoid robot tailored for sophisticated research. It features 17 degrees of freedom, dual robotic arms, a 9-inch touchscreen, a NVIDIA Xavier control chip, and an integrated 3D camera. The B1 excels in machine vision and VR-assisted teleoperation, and its AI voice interaction and LLM integration mark significant advancements in human-robot communication.
</p><p>
These two advanced models exemplify Elephant Robotics’ commitment to practical robotics. The wheeled humanoid robot Mercury X1 integrates advanced technology with a state-of-the-art mobile platform, ensuring not only versatility but also the feasibility of large-scale production and deployment.
</p><h2>Embracing the Power of Reliable Embodied AI</h2><p>
The Mercury Series is engineered as the ideal hardware platform for embodied AI research, providing robust support for sophisticated AI algorithms and real-world applications. Elephant Robotics demonstrates its commitment to innovation through the Mercury Series’ compatibility with NVIDIA’s Isaac Sim, a state-of-the-art simulation platform that facilitates sim2real learning, bridging the gap between virtual environments and physical robot interaction.
</p><p>
The Mercury Series is perfectly suited for the study and experimentation of mainstream large language models in embodied AI. Its advanced capabilities allow seamless integration with the latest AI research. This provides a reliable and scalable platform for exploring the frontiers of machine learning and robotics.
</p><p>
Furthermore, the Mercury Series is complemented by the <a href="https://shop.elephantrobotics.com/collections/myarm-mc/products/myarm-c650" target="_blank">myArm C650</a>, a teleoperation robotic arm that enables rapid acquisition of physical data. This feature supports secondary learning and adaptation, allowing for immediate feedback and iterative improvements in real-time. These features, combined with the Mercury Series’ reliability and practicality, make it the preferred hardware platform for researchers and institutions looking to advance the field of embodied AI.
</p><p>
The Mercury Series is supported by a rich software ecosystem, compatible with major programming languages, and integrates seamlessly with industry-standard simulation software. This comprehensive development environment is enhanced by a range of auxiliary hardware, all designed with mass production practicality in mind.
</p><p class="shortcode-media shortcode-media-rebelmouse-image">
<img alt="A set of images showing a robot in a variety of situations." class="rm-shortcode" data-rm-shortcode-id="0f090fc8addd438aa868f01e910606e3" data-rm-shortcode-name="rebelmouse-image" id="654a8" loading="lazy" src="https://spectrum.ieee.org/media-library/a-set-of-images-showing-a-robot-in-a-variety-of-situations.jpg?id=52857217&width=980"/>
<small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Elephant Robotics</small></p><h2>Drive to Innovate: Mass Production and Global Benchmarks</h2><p>
The “Power Spring” harmonic drive modules, a hallmark of Elephant Robotics’ commitment to innovation for mass production, have been meticulously engineered to offer an unparalleled torque-to-weight ratio. These components are a testament to the company’s foresight in addressing the practicalities of large-scale manufacturing. The incorporation of carbon fiber in the design of these modules not only optimizes agility and power but also ensures that the robots are well-prepared for the rigors of the production line and real-world applications. The Mercury Series, with its spirit of innovation, is making a significant global impact, setting a new benchmark for what practical robotics can achieve.
</p><p>
Elephant Robotics is consistently delivering mass-produced robots to a range of renowned institutions and industry leaders, thereby redefining the industry standards for reliability and scalability. The company’s dedication to providing more than mere prototypes is evident in the active role its robots play in various sectors, transforming industries that are in search of dependable and efficient robotic solutions.
</p><h2>Conclusion: The Mercury Series—A Beacon for the Future of Practical Robotics</h2><p>
The Mercury Series represents more than a product; it is a beacon for the future of practical robotics. <a href="https://shop.elephantrobotics.com/" rel="noopener noreferrer" target="_blank">Elephant Robotics’</a> dedication to affordability, accessibility, and technological advancement ensures that the Mercury Series is not just a research tool but a platform for real-world impact.
</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="175347c43067d1bd4f7f1530d6dbcb91" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/gKJXL0IXeUs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
<small class="image-media media-caption" placeholder="Add Photo Caption...">Mercury Usecases | Explore the Capabilities of the Wheeled Humanoid Robot and Discover Its Precision</small>
<small class="image-media media-photo-credit" placeholder="Add Photo Credit...">
<a href="https://youtu.be/gKJXL0IXeUs" target="_blank">youtu.be</a>
</small>
</p><p>
<strong>Elephant Robotics:</strong> <a href="https://www.elephantrobotics.com/en/" rel="noopener noreferrer" target="_blank">https://www.elephantrobotics.com/en/</a>
</p><p>
<strong>Mercury Robot Series: </strong><u><a href="https://www.elephantrobotics.com/en/mercury-humanoid-robot/" rel="noopener noreferrer" target="_blank">https://www.elephantrobotics.com/en/mercury-humanoid-robot/</a></u>
</p>]]></description><pubDate>Tue, 23 Jul 2024 22:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/elephant-robotics-mercury</guid><category>Elephant robotics</category><category>Ai</category><category>Humanoid robots</category><category>Mercury series</category><dc:creator>Elephant Robotics</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/two-grouped-photos-showing-a-winking-robot-standing-in-a-room-and-another-smiling-robot-in-a-kitchen-area.jpg?id=52857053&width=980"></media:content></item><item><title>iRobot’s Autowash Dock Is (Almost) Automated Floor Care</title><link>https://spectrum.ieee.org/irobot-roomba-combo-10-max</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-round-black-vacuuming-robot-sits-in-front-of-a-large-black-docking-station-that-is-partially-transparent-to-show-clean-and-dir.jpg?id=52955844&width=2048&height=1609&coordinates=0%2C225%2C0%2C214"/><br/><br/><p>The dream of robotic floor care has always been for it to be hands-off and mind-off. That is, for a robot to live in your house that will keep your floors clean without you having to really do anything or even think about it. When it comes to robot vacuuming, that’s been more or less solved thanks to self-emptying robots that transfer debris into docking stations, which iRobot pioneered <a href="https://spectrum.ieee.org/irobot-develops-self-emptying-roomba" target="_self"><u>with the Roomba i7+ in 2018</u></a>. By 2022, iRobot’s <a href="https://spectrum.ieee.org/irobot-roomba-combo-j7-vacuum" target="_self"><u>Combo j7+</u></a> added an intelligent mopping pad to the mix, which definitely made for cleaner floors but was also a step backwards in the sense that you had to remember to toss the pad into your washing machine and fill the robot’s clean water reservoir every time. The <a href="https://www.irobot.com/en_US/roomba-combo-j9plus-auto-fill-robot-vacuum-and-mop/C975020.html" rel="noopener noreferrer" target="_blank"><u>Combo j9+</u></a> stuffed a clean water reservoir into the dock itself, which could top off the robot with water by itself for a month.</p><p>With the new Roomba Combo 10 Max, announced today, iRobot has cut out (some of) that annoying process thanks to a massive new docking station that self-empties vacuum debris, empties dirty mop water, refills clean mop water, and then washes and dries the mopping pad, completely autonomously.</p><hr/><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="79293f109fc36e515ef368eac12d7d0b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Yfx4331nQjg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
<small class="image-media media-photo-credit" placeholder="Add Photo Credit...">iRobot</small>
</p><p>The Roomba part of this is a mildly upgraded j7+, and most of what’s new on the hardware side here is in the “multifunction AutoWash Dock.” This new dock is a beast: It empties the robot of all of the dirt and debris picked up by the vacuum, refills the Roomba’s clean water tank from a reservoir, and then starts up a wet scrubby system down under the bottom of the dock. The Roomba deploys its dirty mopping pad onto that system, and then drives back and forth while the scrubby system cleans the pad. All the dirty water from this process gets sucked back up into a dedicated reservoir inside the dock, and the pad gets blow-dried while the scrubby system runs a self-cleaning cycle.</p><p class="shortcode-media shortcode-media-rebelmouse-image image-crop-custom">
<img alt="A round black vacuuming robot sits inside of a large black docking station that is partially transparent to show clean and dirty water tanks inside." class="rm-shortcode" data-rm-shortcode-id="612af9c9ffcfd1562fa3fac3814a9d70" data-rm-shortcode-name="rebelmouse-image" id="debee" loading="lazy" src="https://spectrum.ieee.org/media-library/a-round-black-vacuuming-robot-sits-inside-of-a-large-black-docking-station-that-is-partially-transparent-to-show-clean-and-dirty.jpg?id=52955848&width=5120&height=3452&quality=85&coordinates=0%2C922%2C0%2C746"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">The dock removes debris from the vacuum, refills it with clean water, and then uses water to wash the mopping pad.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">iRobot</small></p><p>This means that as a user, you’ve only got to worry about three things: dumping out the dirty water tank every week (if you use the robot for mopping most days), filling the clean water tank every week, and then changing out the debris every two months. That is not a lot of hands-on time for having consistently clean floors.</p><p>The other thing to keep in mind about all of these robots is that they do need relatively frequent human care if you want them to be happy and successful. That means flipping them over and getting into their guts to clean out the bearings and all that stuff. iRobot makes this very easy to do, and it’s a necessary part of robot ownership, so the dream of having a robot that you can actually forget <em><em>completely</em></em> is probably not achievable.</p><p>The consequence for this convenience is a real chonker of a dock. The dock is basically furniture, and to the company’s credit, iRobot designed it so that the top surface is useable as a shelf—Access to the guts of the dock are from the front, not the top. This is fine, but it’s also kind of crazy just how much these docks have expanded, especially once you factor in the front ramp that the robot drives up, which sticks out even farther. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
<img alt="A round black robot on a wooden floor approaches a dirty carpet and uses a metal arm to lift a wet mopping pad onto its back." class="rm-shortcode" data-rm-shortcode-id="9bb751380913b961c87b04b3529c21dc" data-rm-shortcode-name="rebelmouse-image" id="b228f" loading="lazy" src="https://spectrum.ieee.org/media-library/a-round-black-robot-on-a-wooden-floor-approaches-a-dirty-carpet-and-uses-a-metal-arm-to-lift-a-wet-mopping-pad-onto-its-back.jpg?id=52955845&width=980"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">The Roomba will detect carpet and lift its mopping pad up to prevent drips.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">iRobot</small></p><p>We asked iRobot director of project management <a href="https://www.linkedin.com/in/warren-fernandez/" rel="noopener noreferrer" target="_blank"><u>Warren Fernandez</u></a> about whether docks are just going to keep on getting bigger forever until we’re all just living in giant robot docks, to which he said: “Are you going to continue to see some large capable multifunction docks out there in the market? Yeah, I absolutely think you will—but when does big become too big?” Fernandez says that there are likely opportunities to reduce dock size going forward through packaging efficiencies or dual-purpose components, but that there’s another option, too: Distributed docks. “If a robot has dry capabilities and wet capabilities, do those have to coexist inside the same chassis? What if they were separate?” says Fernandez.</p><p>We should mention that iRobot is not the first in the robotic floor care robot space to have a self-cleaning mop, and it’s also not the first to think about distributed docks, although as Fernandez explains, this is a more common approach in Asia where you can also take advantage of home plumbing integration. “It’s a major trend in China, and starting to pop up a little bit in Europe, but not really in North America yet. How amazing could it be if you had a dock that, in a very easy manner, was able to tap right into plumbing lines for water supply and sewage disposal?”</p><p>According to Fernandez, this tends to be much easier to do in China, both because the labor cost for plumbing work is far lower than in the United States and Europe, and also because it’s fairly common for apartments in China to have accessible floor drains. “We don’t really yet see it in a major way at a global level,” Fernandez tells us. “But that doesn’t mean it’s not coming.”</p><p class="shortcode-media shortcode-media-rebelmouse-image image-crop-custom">
<img alt="A round black robot on a wooden floor approaches a dirty carpet and uses a metal arm to lift a wet mopping pad onto its back." class="rm-shortcode" data-rm-shortcode-id="627d844ae29c2fc4529e91b185994e61" data-rm-shortcode-name="rebelmouse-image" id="003ba" loading="lazy" src="https://spectrum.ieee.org/media-library/a-round-black-robot-on-a-wooden-floor-approaches-a-dirty-carpet-and-uses-a-metal-arm-to-lift-a-wet-mopping-pad-onto-its-back.jpg?id=52955851&width=1367&height=1091&quality=85&coordinates=0%2C629%2C0%2C328"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">The robot autonomously switches mopping mode on and off for different floor surfaces.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">iRobot</small></p><p>We should also mention the Roomba Combo 10 Max, which includes some software updates:</p><ul><li>The front-facing camera and specialized bin sensors can identify dirtier areas eight times as effectively as before.</li><li>The Roomba can identify specific rooms and prioritize the order they’re cleaned in, depending on how dirty they get.</li><li>A new cleaning behavior called “Smart Scrub” adds a back-and-forth scrubbing motion for floors that need extra oomph.</li></ul><p>And here’s what I feel like the new software <em><em>should</em></em> do, but doesn’t:</p><ul><li>Use the front-facing camera and bin sensors to identify dirtier areas and then autonomously develop a schedule to more frequently clean those areas.</li><li>Activate Smart Scrub when the camera and bin sensors recognize an especially dirty floor.</li></ul><p>I say “should do” because the robot appears to be collecting the data that it needs to do these things but it doesn’t do them yet. New features (especially new features that involve autonomy) take time to develop and deploy, but imagine a robot that makes much more nuanced decisions about where and when to clean based on very detailed real-time data and environmental understanding that iRobot has already implemented. </p><p>I also appreciate that even as iRobot is emphasizing autonomy and leveraging data to start making more decisions for the user, the company is also making sure that the user has as much control as possible through the app. For example, you can set the robot to mop your floor without vacuuming first, even though if you do that, all you’re going to end up with a much dirtier mop. Doesn’t make a heck of a lot of sense, but if that’s what you want, iRobot has empowered you to do it.</p><p class="shortcode-media shortcode-media-rebelmouse-image image-crop-custom">
<img alt="A round black vacuuming robot sits inside of a large black docking station that is opened to show clean and dirty water tanks inside." class="rm-shortcode" data-rm-shortcode-id="47b27289f88402a94a85551f575502dc" data-rm-shortcode-name="rebelmouse-image" id="f456a" loading="lazy" src="https://spectrum.ieee.org/media-library/a-round-black-vacuuming-robot-sits-inside-of-a-large-black-docking-station-that-is-opened-to-show-clean-and-dirty-water-tanks-in.jpg?id=52955852&width=2048&height=1407&quality=85&coordinates=0%2C401%2C0%2C240"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">The dock opens from the front for access to the clean- and dirty-water storage and the dirt bag.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">iRobot</small></p><p>The Roomba Combo 10 Max will be launching in August for US $1,400. That’s expensive, but it’s also how iRobot does things: A new Roomba with new tech always gets flagship status and premium cost. Sooner or later it’ll be affordable enough that the rest of us will be able to afford it, too.</p>]]></description><pubDate>Tue, 23 Jul 2024 11:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/irobot-roomba-combo-10-max</guid><category>Irobot</category><category>Roomba</category><category>Home robots</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-round-black-vacuuming-robot-sits-in-front-of-a-large-black-docking-station-that-is-partially-transparent-to-show-clean-and-dir.jpg?id=52955844&width=980"></media:content></item><item><title>Video Friday: Robot Crash-Perches, Hugs Tree</title><link>https://spectrum.ieee.org/video-friday-bioinspired-robot</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/three-images-showing-a-bat-hugging-a-tree-trunk-an-owl-hugging-a-tree-trunk-and-a-black-robotic-airplane-hugging-a-tree-trunk.png?id=52930125&width=1200&height=800&coordinates=150%2C0%2C150%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://icra40.ieee.org/">ICRA@40</a>: 23–26 September 2024, ROTTERDAM, NETHERLANDS</h5><h5><a href="https://iros2024-abudhabi.org/">IROS 2024</a>: 14–18 October 2024, ABU DHABI, UAE</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="sgyivja3dzg"><em>Perching with winged Unmanned Aerial Vehicles has often been solved by means of complex control or intricate appendages. Here, we present a method that relies on passive wing morphing for crash-landing on trees and other types of vertical poles. Inspired by the adaptability of animals’ and bats’ limbs in gripping and holding onto trees, we design dual-purpose wings that enable both aerial gliding and perching on poles.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="af2fc1eabd69019ce92efbd3a25a9fb5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/SGyivJa3DZg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.nature.com/articles/s44172-024-00241-0"><em>Nature Communications Engineering</em></a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="r1jfvn76kai">Pretty impressive to have low enough latency in controlling your robot’s hardware that it can play ping pong, although it makes it impossible to tell whether the robot or the human is the one that’s actually bad at the game.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4fedcb825f47a67a4fd51ac1d2c2f0aa" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/R1JfVN76kAI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ihmc.us/nadia-humanoid/">IHMC</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="e6vukpg3jxu">How to be a good robot when boarding an elevator.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6e009489f01df1fa602e49b690b7b11f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/E6VUkPG3jXU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a 
href="https://1784.navercorp.com/en/">NAVER</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="j0qh9gu9tko"><em>Have you ever wondered how insects are able to go so far beyond their home and still find their way? The answer to this question is not only relevant to biology but also to making the AI for tiny, autonomous robots. We felt inspired by biological findings on how ants visually recognize their environment and combine it with counting their steps in order to get safely back home.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a4c7b78ab2a256b65196dcccbc2813c2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/J0qh9gu9Tko?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.science.org/doi/10.1126/scirobotics.adk0310"><em>Science Robotics</em></a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="q1_cwyqzt88"><em>Team RoMeLa Practice with ARTEMIS humanoid robots, featuring Tsinghua Hephaestus (Booster Alpha). Fully autonomous humanoid robot soccer match with the official goal of beating the human WorldCup Champions by the year 2050.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cb7acd663808dfe6b77a14fe92a652be" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Q1_cwYQZT88?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.romela.org/">RoMeLa</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="cjnuoxq2axm"><em>Triangle is the most stable shape, right?</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7648ac350c1c14b9f4d0665c6c0d322d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/CJNuoxQ2AxM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://yugu.faculty.wvu.edu/">WVU IRL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="w-l90bhfdfo"><em>We propose RialTo, a new system for robustifying real-world imitation learning policies via reinforcement learning in “digital twin” simulation environments constructed on the fly from small amounts of real-world data.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="da6cbba32610c3bbe5c8e7488817c9c4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/w-L90BhfDFo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://real-to-sim-to-real.github.io/RialTo/">MIT CSAIL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="trtdkuxmlxo">There is absolutely no reason to watch this entire video, but Moley Robotics is still working on that robotic kitchen of theirs.</p><p class="shortcode-media 
shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="17437d7cb98db3ce85d359f39d8e8db3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/TrTDkuXmlxo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>I will once again point out that the hardest part of cooking (for me, anyway) is the prep and the cleanup, and this robot still needs you to do all that.</p><p>[ <a href="https://www.moley.com/">Moley</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="v2vc8pk31n4"><em>B-Human has so far won 10 titles at the RoboCup SPL tournament. Can we make it 11 this year? Our RoboCup starts off with a banger game against HTWK Robots form Leipzig!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f5f8b8a3e52f1f470c4f765ab5792d91" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/v2VC8Pk31n4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://b-human.de/index.html">Team B-Human</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="4p7axxplk-w"><em>AMBIDEX is a dual-armed robot with an innovative mechanism developed for safe coexistence with humans. Based on an innovative cable structure, it is designed to be both strong and stable.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ceea5604c04693daadf67b96f01f7c84" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/4P7AXxPlk-w?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.naverlabs.com/en/ambidex">NAVER</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="tzm846ojxsa"><em>As NASA’s Perseverance rover prepares to ascend to the rim of Jezero Crater, its team is investigating a rock unlike any that they’ve seen so far on Mars. 
Deputy project scientist Katie Stack Morgan explains why this rock, found in an ancient channel that funneled water into the crater, could be among the oldest that Perseverance has investigated—or the youngest.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9f1794f9f718b7076451f35b11a38777" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/TZm846OJxSA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://science.nasa.gov/mission/mars-2020-perseverance/">NASA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="u9_q6c5yigo"><em>We present a novel approach for enhancing human-robot collaboration using physical interactions for real-time error correction of large language model (LLM) parameterized commands.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9f4a55c2d8a8fdc7ff5667c8767106e4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/U9_q6C5YIgo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.grasp.upenn.edu/research-groups/figueroa-robotics-lab/">Figueroa Robotics Lab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="xim0mxvqygu"><em>Husky Observer was recently used to autonomously inspect solar panels at a large solar panel farm. As part of its mission, the robot navigated rows of solar panels, stopping to inspect areas with its integrated thermal camera. Images were taken by the robot and enhanced to detect potential “hot spots” in the panels.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="250d84f1fffa73c17e614a110d19cfd0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xIM0MXvQyGU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://clearpathrobotics.com/husky-observer/">Clearpath Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="npetybdwwz8">Most of the time, robotic workcells contain just one robot, so it’s cool to see a pair of them collaborating on tasks.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="21bd81d68c59c8991b913f31fbed0d96" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/npetYBdwwz8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://leverage-robotics.com/en/">Leverage Robotics</a> ]</p><p>Thanks, Roman!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="if9immyioh0"><em>Meet Hydrus, the autonomous underwater drone revolutionising underwater data collection by eliminating the barriers to its entry. 
Hydrus ensures that even users with limited resources can execute precise and regular subsea missions to meet their data requirements.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="264e00b842e459e30010648999f0d897" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/If9immYioh0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.advancednavigation.com/robotics/micro-auv/hydrus/">Advanced Navigation</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="7_lw7u-nk6q">Those adorable Disney robots have finally made their way into a paper.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5d81357a39aba2e6cb10b8c4a3e1163d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7_LW7u-nk6Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://roboticsconference.org/program/papers/103/">RSS 2024</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 19 Jul 2024 19:45:02 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-bioinspired-robot</guid><category>Autonomous robots</category><category>Collaborative robots</category><category>Disney robots</category><category>Perseverance rover</category><category>Robotics</category><category>Video friday</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/three-images-showing-a-bat-hugging-a-tree-trunk-an-owl-hugging-a-tree-trunk-and-a-black-robotic-airplane-hugging-a-tree-trunk.png?id=52930125&width=980"></media:content></item><item><title>Robot Dog Cleans Up Beaches With Foot-Mounted Vacuums</title><link>https://spectrum.ieee.org/robot-dog-vacuum</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-black-robot-dog-with-a-white-backpack-with-tubes-coming-out-of-it-running-down-its-legs-to-its-feet-stands-on-a-pebbly-beach-w.jpg?id=52824948&width=1200&height=800&coordinates=150%2C0%2C150%2C0"/><br/><br/><p>
Cigarette butts are the second most common undisposed-of litter on Earth—of the six trillion-ish cigarettes inhaled every year, <a href="https://pubmed.ncbi.nlm.nih.gov/30782533/" target="_blank">it’s estimated</a> that over 4 trillion of the butts are just tossed onto the ground, each one leaching over 700 different toxic chemicals into the environment. Let’s not focus on the fact that all those toxic chemicals are <em><em>also</em></em> going into people’s lungs, and instead talk about the ecosystem damage that they can do and also just the general grossness of having bits of sucked-on trash everywhere. Ew.
</p><p>
Preventing those cigarette butts from winding up on the ground in the first place would be the best option, but it would require a pretty big shift in human behavior. Operating under the assumption that humans changing their behavior is a nonstarter, roboticists from the <a href="https://dls.iit.it/" target="_blank">Dynamic Legged Systems</a> unit at the Italian Institute of Technology (IIT), in Genoa, have instead designed a novel platform for cigarette-butt cleanup in the form of a quadrupedal robot with vacuums attached to its feet.
</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="1221ce054f5104cd582d1a9d3d1789c9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/O8BqvAe-moI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
<small class="image-media media-photo-credit" placeholder="Add Photo Credit...">IIT</small>
</p><p>
There are, of course, far more efficient ways of at least partially automating the cleanup of litter with machines. The challenge is that most of that automation relies on mobility systems with wheels, which won’t work on the many beautiful beaches (and many beautiful flights of stairs) of Genoa. In places like these, it still falls to humans to do the hard work, which is less than ideal.
</p><p>
This robot, developed in <a href="https://spectrum.ieee.org/tag/claudio-semini" target="_blank">Claudio Semini’s lab at IIT</a>, is called VERO (Vacuum-cleaner Equipped RObot). It’s based around an AlienGo from Unitree, with a commercial vacuum mounted on its back. Hoses run from the vacuum down each leg to its foot, with a custom 3D-printed nozzle that puts as much suction near the ground as possible without tripping the robot up. While the vacuum is novel, the real contribution here is how the robot autonomously locates things on the ground and then plans how to interact with those things using its feet.
</p><p>
First, an operator designates an area for VERO to clean, after which the robot operates by itself. After calculating a path that covers the entire area, the robot uses its onboard cameras and a neural network to detect cigarette butts. This is trickier than it sounds, because there may be a lot of cigarette butts on the ground, and they all probably look pretty much the same, so the system has to filter out the potential duplicates. The next step is footstep planning: VERO has to put the vacuum side of one of its feet right next to each cigarette butt while calculating a safe, stable pose for the rest of its body. Since this whole process can take place on sand or stairs or other uneven surfaces, VERO has to prioritize not falling over before it decides how to do the collection. The final collecting maneuver is fine-tuned using an extra Intel RealSense depth camera mounted on the robot’s chin.
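</p><p>
Here is a minimal sketch of that detect, deduplicate, and plan loop. To be clear, this is not IIT’s code: the coordinates, the deduplication radius, and the three-feet-of-support stability rule are invented stand-ins for the real perception and whole-body planning.
</p><pre><code>
# An illustrative sketch of the detect/deduplicate/plan loop; not IIT's code.
# Coordinates, the dedup radius, and the stability rule are invented stand-ins.
import math

def deduplicate(detections, radius=0.05):
    """Merge detections closer than `radius` meters; they are likely the same butt."""
    kept = []
    for x, y in detections:
        if all(math.hypot(x - kx, y - ky) > radius for kx, ky in kept):
            kept.append((x, y))
    return kept

def is_stable(support_feet):
    """Toy stand-in for the real whole-body check: keep three feet supporting."""
    return len(support_feet) >= 3

def plan_cleanup(detections, feet):
    """Assign the nearest foot to each unique butt, keeping the others as support."""
    plan = []
    for target in deduplicate(detections):
        foot = min(feet, key=lambda name: math.hypot(feet[name][0] - target[0],
                                                     feet[name][1] - target[1]))
        support = [pos for name, pos in feet.items() if name != foot]
        if is_stable(support):
            plan.append((foot, target))  # final placement refined by the chin camera
    return plan

feet = {"FL": (0.3, 0.2), "FR": (0.3, -0.2), "RL": (-0.3, 0.2), "RR": (-0.3, -0.2)}
butts = [(0.5, 0.1), (0.51, 0.11), (-0.4, 0.0)]  # first two are duplicate detections
print(plan_cleanup(butts, feet))  # [('FL', (0.5, 0.1)), ('RL', (-0.4, 0.0))]
</code></pre><p>
In the real system, of course, the stability check is a whole-body optimization rather than a foot count, and the targets come from the neural network detector instead of a hard-coded list.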
</p><p class="shortcode-media shortcode-media-rebelmouse-image">
<img alt="A collage of six photos of a quadruped robot navigating different environments." class="rm-shortcode" data-rm-shortcode-id="a558e68c65459dc391adc6aa6e78231d" data-rm-shortcode-name="rebelmouse-image" id="eb956" loading="lazy" src="https://spectrum.ieee.org/media-library/a-collage-of-six-photos-of-a-quadruped-robot-navigating-different-environments.png?id=52820248&width=980"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">VERO has been tested successfully in six different scenarios that challenge both its locomotion and detection capabilities.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">IIT</small>
</p><p>
Initial testing with the robot in a variety of different environments showed that it could successfully collect just under 90 percent of cigarette butts, which I bet is better than I could do, and I’m also much more likely to get fed up with the whole process. The robot is not very quick at the task, but unlike me it will never get fed up as long as it’s got energy in its battery, so speed is somewhat less important.
</p><p>
As far as the authors of this paper are aware (and I assume they’ve done their research), this is “the first time that the legs of a legged robot are <em>concurrently</em> utilized for locomotion and for a different task.” This is distinct from other robots that can (for example) open doors with their feet, because those robots stop using the feet as feet for a while and instead use them as manipulators.
</p><p>
So, this is about a lot more than cigarette butts, and the researchers suggest a variety of other potential use cases, including spraying weeds in crop fields, inspecting cracks in infrastructure, and placing nails and rivets during construction.
</p><p>
Some use cases include potentially doing multiple things at the same time, like planting different kinds of seeds, using different surface sensors, or driving both nails and rivets. And since quadrupeds have four feet, they could potentially host four completely different tools, and the software that the researchers developed for VERO can be slightly modified to put whatever foot you want on whatever spot you need.
</p><p><em><em><a href="https://onlinelibrary.wiley.com/doi/abs/10.1002/rob.22350" target="_blank">VERO: A Vacuum‐Cleaner‐Equipped Quadruped Robot for Efficient Litter Removal</a></em></em>, by Lorenzo Amatucci, Giulio Turrisi, Angelo Bratta, Victor Barasuol, and Claudio Semini from IIT, was published in the <em>Journal of Field Robotics</em>.</p><p><em>
This article appears in the September 2024 print issue as “Robot Dog Vacuums With Its Feet.”</em></p>]]></description><pubDate>Thu, 18 Jul 2024 14:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/robot-dog-vacuum</guid><category>Quadruped robots</category><category>Legged robots</category><category>Robotics</category><category>Italy</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-black-robot-dog-with-a-white-backpack-with-tubes-coming-out-of-it-running-down-its-legs-to-its-feet-stands-on-a-pebbly-beach-w.jpg?id=52824948&width=980"></media:content></item><item><title>The Smallest, Lightest Solar-Powered Drone Takes Flight</title><link>https://spectrum.ieee.org/smallest-drone</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-silvery-round-drone-with-wings-and-a-small-power-system-sits-in-the-palm-of-a-hand.jpg?id=52820114&width=1200&height=800&coordinates=0%2C52%2C0%2C52"/><br/><br/><p>Scientists in China have built what they claim to be the smallest and lightest solar-powered aerial vehicle. It’s small enough to sit in the palm of a person’s hand, weighs less than a U.S. nickel, and can fly indefinitely while the sun shines on it.<br/></p><p>Micro aerial vehicles (MAVs) are <a href="https://spectrum.ieee.org/robobee-robot-precision-control" target="_blank">insect- and bird-size aircraft</a> that <a href="https://spectrum.ieee.org/nothing-can-keep-this-drone-down" target="_blank">might prove useful for reconnaissance</a> and other possible applications. However, a major problem that MAVs currently face is their limited flight times, usually about 30 minutes. Ultralight MAVs—those weighing less than 10 grams—can often only stay aloft for less than 10 minutes.</p><p>One potential way to keep MAVs flying longer is to power them with a consistent source of energy such as sunlight. Now, in a new study, researchers have developed what they say is the first solar-powered MAV capable of sustained flight.</p><p>The new ultralight MAV, CoulombFly, is just 4.21g with a wingspan of 20 centimeters. That’s about 10 times as small as and roughly 600 times as light as the previous smallest sunlight-powered aircraft, a <u><a href="https://onlinelibrary.wiley.com/doi/abs/10.1002/pip.3169" target="_blank">quadcopter</a></u> that’s 2 meters wide and weighs 2.6 kilograms.</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="af7200142a01f529dc54d28c3065cfe7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/LBoee1l4OXo?rel=0&start=1" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
<small class="image-media media-caption" placeholder="Add Photo Caption...">Sunlight powered flight test</small>
<small class="image-media media-photo-credit" placeholder="Add Photo Credit...">
<a href="https://www.youtube.com/watch?v=LBoee1l4OXo&t=1s" target="_blank">Nature</a>
</small>
</p><p>“My ultimate goal is to make a super tiny flying vehicle, about the size and weight of a mosquito, with a wingspan under 1 centimeter,” says Mingjing Qi, a professor of energy and power engineering at Beihang University in Beijing. Qi and the scientists who built CoulombFly developed a prototype of such an aircraft, measuring 8 millimeters wide and 9 milligrams in mass, “but it can’t fly on its own power yet. I believe that with the ongoing development of microcircuit technology, we can make this happen.”</p><p>Previous sunlight-powered aerial vehicles typically rely on <u><a href="https://spectrum.ieee.org/200-years-ago-faraday-invented-the-electric-motor" target="_self">electromagnetic motors</a></u>, which use electromagnets to generate motion. However, the smaller a solar-powered aircraft gets, the less surface area it has with which to collect sunlight, reducing the amount of energy it can generate. In addition, the efficiency of electromagnetic motors decreases sharply as vehicles shrink in size. Smaller electromagnetic motors experience comparatively greater friction than larger ones, as well as greater energy losses due to electrical resistance from their components. This results in low lift-to-power efficiencies, Qi and his colleagues explain.</p><p>CoulombFly instead employs an electrostatic motor, which produces motion using electrostatic fields. Electrostatic motors are generally used as sensors in microelectromechanical systems (<u><a href="https://spectrum.ieee.org/nano-machine-shape-shifting" target="_self">MEMS</a></u>), not for aerial propulsion. Nevertheless, with a mass of only 1.52 grams, the electrostatic motor the scientists used has a lift-to-power efficiency two to three times that of other MAV motors.</p><p>The electrostatic motor has two nested rings. The inner ring is a spinning rotor with 64 slats, each made of a carbon fiber sheet covered with aluminum foil. It resembles a wooden fence curved into a circle, with gaps between the fence’s posts. The outer ring is equipped with eight alternating pairs of positive and negative electrode plates, each also made of a carbon fiber sheet bonded to aluminum foil. Each plate’s edge also has a brush made of aluminum that touches the inner ring’s slats.</p><p>Above CoulombFly’s electrostatic motor is a propeller 20 cm wide and connected to the rotor. Below the motor are two high-power-density thin-film gallium arsenide solar cells, each 4 by 6 cm in size, with a mass of 0.48 g and an energy conversion efficiency of more than 30 percent.</p><p>Sunlight electrically charges CoulombFly’s outer ring, and its 16 plates generate electric fields. The brushes on the outer ring’s plates touch the inner ring, electrically charging the rotor slats. The electric fields of the outer ring’s plates exert force on the charged rotor slats, making the inner ring and the propeller spin.</p><p>In tests under natural sunlight conditions—about 920 watts of light per square meter—CoulombFly successfully took off within one second and sustained flight for an hour without any deterioration in performance. Potential applications for sunlight-powered MAVs may include long-distance and long-duration aerial reconnaissance, the researchers say.</p>
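<p>
Those numbers allow a quick back-of-the-envelope power budget. The sketch below is our arithmetic, not the paper’s: it simply multiplies the reported sunlight intensity by the cells’ total area and conversion efficiency.
</p><pre><code>
# Back-of-the-envelope power budget using the figures quoted above.
# This arithmetic is ours, not the paper's.
SUNLIGHT_W_PER_M2 = 920.0    # reported natural-sunlight test condition
CELL_AREA_M2 = 0.04 * 0.06   # one 4-by-6-cm thin-film gallium arsenide cell
NUM_CELLS = 2
EFFICIENCY = 0.30            # "more than 30 percent" conversion efficiency

incident = SUNLIGHT_W_PER_M2 * CELL_AREA_M2 * NUM_CELLS
electrical = incident * EFFICIENCY
print(f"~{incident:.1f} W of sunlight on the cells, ~{electrical:.1f} W electrical")
# -> ~4.4 W incident, ~1.3 W electrical
</code></pre><p>
So something on the order of 1.3 watts of electrical power is available to spin the propeller, which is why the motor’s lift-to-power efficiency matters so much at this scale.
</p><p class="shortcode-media shortcode-media-youtube">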
<span class="rm-shortcode" data-rm-shortcode-id="7014ac2e38148fa99d02dd4eccccad3e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/-sQR0lG4OLA?rel=0&start=9" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
<small class="image-media media-caption" placeholder="Add Photo Caption...">Long term test for hovering operation</small>
<small class="image-media media-photo-credit" placeholder="Add Photo Credit...">
<a href="https://www.youtube.com/watch?v=-sQR0lG4OLA&t=9s" target="_blank">Nature</a>
</small>
</p><p>CoulombFly’s propulsion system can generate up to 5.8 g of lift. This means it could support an extra payload of roughly 1.59 g (the 5.8 g of lift minus the aircraft’s 4.21-gram mass), which is “sufficient to accommodate the smallest available sensors, controllers, cameras and so on” to support future autonomous operations, Qi says. “Right now, there’s still a lot of room to improve things like motors, propellers, and circuits, so we think we can get the extra payload up to 4 grams in the future. If we need even more payload, we could switch to quadcopters or fixed-wing designs, which can carry up to 30 grams.”</p><p>Qi adds that “it should be possible for the vehicle to carry a tiny lithium-ion battery.” That means it could store energy from its solar panels and fly even when the sun is not out, potentially enabling 24-hour operations.</p><p>In the future, “we plan to use this propulsion system in different types of flying vehicles, like fixed-wing and rotorcraft,” Qi says.</p><p>The scientists detailed <u><a href="https://www.nature.com/articles/s41586-024-07609-4" rel="noopener noreferrer" target="_blank">their findings</a></u> online 17 July in the journal <em>Nature</em>.</p><p><em>This article appears in the September 2024 print issue as “Tiny Solar-Powered Drone Flies Forever*.”</em></p>]]></description><pubDate>Wed, 17 Jul 2024 15:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/smallest-drone</guid><category>Micro aerial vehicles</category><category>Solar power</category><category>Drones</category><category>Uav</category><category>Micro air vehicles</category><dc:creator>Charles Q. Choi</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-silvery-round-drone-with-wings-and-a-small-power-system-sits-in-the-palm-of-a-hand.jpg?id=52820114&width=980"></media:content></item><item><title>Soft Robot Can Amputate and Reattach Its Own Legs</title><link>https://spectrum.ieee.org/soft-modular-robot</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-photo-of-a-hand-sized-three-legged-soft-robot-with-tubes-and-wires-coming-out-of-it-crawling-away-from-a-rock-leaving-its-fou.png?id=52559148&width=1470&height=1080&coordinates=450%2C0%2C0%2C0"/><br/><br/><p>Among the many things that humans cannot do (without some fairly substantial modification) is shifting our body morphology around on demand. It sounds a little extreme to be talking about things like self-amputation, and it
<em><em>is</em></em> a little extreme, but it’s also not at all uncommon for other animals to do—lizards can disconnect their tails to escape a predator, for example. And it works in the other direction, too, with animals like ants adding to their morphology by connecting to each other to traverse gaps that a single ant couldn’t cross alone.<br/></p><p>
In a new paper, roboticists from
<a href="https://www.eng.yale.edu/faboratory/" rel="noopener noreferrer" target="_blank"><u>The Faboratory at Yale University</u></a> have given a soft robot the ability to detach and reattach pieces of itself, editing its body morphology when necessary. It’s a little freaky to watch, but it kind of makes me wish I could do the same thing.
</p><hr/><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="1810cbb2a95bdc058a86ae7f9c900e96" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/qPd9x9-bALo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
<small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Faboratory at Yale</small>
</p><p>
These are fairly standard soft-bodied silicone robots that use asymmetrically stiff air chambers that inflate and deflate (using a tethered pump and valves) to generate a walking or crawling motion. What’s new here are the joints, which rely on a new material called a bicontinuous thermoplastic foam (BTF) to form a supportive structure for a sticky polymer that’s solid at room temperature but can be easily melted.
</p><p>
The BTF acts like a sponge to prevent the polymer from running out all over the place when it melts, and means that you can pull two BTF surfaces apart by melting the joint, and stick them together again by reversing the procedure. The process takes about 10 minutes and the resulting joint is quite strong. It’s also good for a couple of hundred detach/re-attach cycles before degrading. It even stands up to dirt and water reasonably well.
</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="847ee4e9e09f65f772b74715ea9a5ca7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/kFgKRabL7Rc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
<small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Faboratory at Yale</small>
</p><p>
This kind of thing has been done before with mechanical connections and magnets and other things like that—getting robots to attach to and detach from other robots is a foundational technique for modular robotics, after all. But these systems are inherently rigid, which is bad for soft robots, whose whole thing is about
<em><em>not</em></em> being rigid. It’s all very preliminary, of course, because there are plenty of rigid things attached to these robots with tubes and wires and stuff. And there’s no autonomy or payloads here either. That’s not the point, though—the point is the joint, which (as the researchers point out) is “the first instantiation of a fully soft reversible joint” resulting in the “potential for soft artificial systems [that can] shape change via mass addition and subtraction.”
</p><p>“<a href="https://onlinelibrary.wiley.com/doi/abs/10.1002/adma.202400241" rel="noopener noreferrer" target="_blank"><u>Self-Amputating and Interfusing Machines</u></a>,” by Bilige Yang, Amir Mohammadi Nasab, Stephanie J. Woodman, Eugene Thomas, Liana G. Tilton, Michael Levin, and Rebecca Kramer-Bottiglio from Yale, was published in May in <em><em>Advanced Materials.</em></em>
</p>]]></description><pubDate>Sat, 13 Jul 2024 12:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/soft-modular-robot</guid><category>Soft robotics</category><category>Robotics</category><category>Yale</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-photo-of-a-hand-sized-three-legged-soft-robot-with-tubes-and-wires-coming-out-of-it-crawling-away-from-a-rock-leaving-its-fou.png?id=52559148&width=980"></media:content></item><item><title>Video Friday: Unitree Talks Robots</title><link>https://spectrum.ieee.org/video-friday-unitree-talks-robots</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/two-chinese-men-sit-next-to-a-small-silver-humanoid-robot-with-a-larger-black-humanoid-robot-in-the-background-in-a-booth-at-a-c.png?id=52675892&width=1200&height=800&coordinates=0%2C0%2C300%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><h5><a href="https://icra40.ieee.org/">ICRA@40</a>: 23–26 September 2024, ROTTERDAM, NETHERLANDS</h5><h5><a href="https://iros2024-abudhabi.org/">IROS 2024</a>: 14–18 October 2024, ABU DHABI, UAE</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="b2pmdshvgoy"><em>At ICRA 2024, Spectrum editor Evan Ackerman sat down with Unitree Founder and CEO Xingxing Wang and Tony Yang, VP of Business Development, to talk about the company’s newest humanoid, the G1 model.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="40ba2851eb8b488a9553e27e8f6c4d40" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/B2pmDShvGOY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/g1/">Unitree</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="byloosji-t8">SACRIFICE YOUR BODY FOR THE ROBOT</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ea89766662388b5fa0ed403c9a1f9de5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/bylOoSJI-t8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://yugu.faculty.wvu.edu/">WVUIRL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="kp7291n9jg4"><em>From navigating uneven terrain outside the lab to pure vision perception, GR-1 continues to push the boundaries of what’s possible.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ce80aa3762c4dfeef004c836f21a913d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/KP7291N9jg4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://fourierintelligence.com/gr1/">Fourier</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="g5egnbtqcwa"><em>Aerial manipulation has gained interest for completing high-altitude tasks that are challenging for human workers, such as 
contact inspection and defect detection. This letter addresses a more general and dynamic task: simultaneously tracking time-varying contact force and motion trajectories on tangential surfaces. We demonstrate the approach on an aerial calligraphy task using a novel sponge pen design as the end-effector.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f38be05694a9a00589d180ebdd928252" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/g5egNbtQCwA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://xiaofeng-guo.github.io/flying-calligrapher/">CMU</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="hyfcgppjjnk"><em>LimX Dynamics Biped Robot P1 was kicked and hit: Faced with random impacts in a crowd, P1 with its new design once again showcased exceptional stability as a mobility platform.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e0234aa5dc2a9fedb2214bde3ced1654" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/HYFCGPPjJnk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://medium.com/@limxdynamics/robotic-rigorous-testing-the-rationale-behind-our-kicks-18b1d96b6e01">LimX Dynamics</a> ]</p><p>Thanks, Ou Yan!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="xh7v-6uwfqc">This is from ICRA 2018, but it holds up pretty well in the novelty department.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d8ccd55dbdc95bb7bca660d8fb4c043d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Xh7v-6uWfQc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.inrol.snu.ac.kr/">SNU INRoL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="onv6e7kssy4">I think someone needs to crank the humor setting up on this one.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3638216d991463ada61835bc77d50d65" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Onv6E7kssY4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en/index/product3.html">Deep Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="fcyv4b-dh68"><em>The paper summarizes the work at the Micro Air Vehicle Laboratory on end-to-end neural control of quadcopters. A major challenge in bringing these controllers to life is the “reality gap” between the real platform and the training environment. 
To address this, we combine online identification of the reality gap with pre-trained corrections through a deep neural controller, which is orders of magnitude more efficient than traditional computation of the optimal solution.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="dbe9bcfe3d5c9ebf6bf3c3fcb9bb5cba" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FCYV4B-DH68?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://mavlab.tudelft.nl/">MAVLab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="9liei__uchm">This is a dedicated Track Actuator from HEBI Robotics. Why they didn’t just call it a “tracktuator” is beyond me.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="08e8a475e2e249a3638ad634d5dc6d07" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/9Liei__UChM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.hebirobotics.com/actuators#Track%20Actuator">HEBI Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="cwzlhmswl5e"><em>Menteebot can navigate complex environments by combining a 3D model of the world with a dynamic obstacle map. On the first day in a new location, Menteebot generates the 3D model by following a person who shows the robot around.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2a6dbcd662db6c617117ba0da34326b5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/cWZLHmswL5E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.menteebot.com/">Mentee Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="8hwoaikmqwy">Here’s that drone with a 68kg payload and 70km range you’ve always wanted.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="18cd744218ae9fc0d357bc86928c1695" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8HwoaiKMQWY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.malloyaeronautics.com/t150.html">Malloy</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="zlha-rwbbyu"><em>AMBIDEX is a dual-armed robot with an innovative mechanism developed for safe coexistence with humans. 
Based on an innovative cable structure, it is designed to be both strong and stable.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d5cc4c4d8d2b872ad217b1d06c8e3357" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zLhA-RWBBYU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.naverlabs.com/en/ambidex">NAVER Labs</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="i7qon82reic"><em>As quadrotors take on an increasingly diverse range of roles, researchers often need to develop new hardware platforms tailored for specific tasks, introducing significant engineering overhead. In this article, we introduce the UniQuad series, a unified and versatile quadrotor hardware platform series that offers high flexibility to adapt to a wide range of common tasks, excellent customizability for advanced demands, and easy maintenance in case of crashes.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="89f245b4214e4dd037a4d1f719ec8506" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/I7qoN82rEIc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://hkust-aerial-robotics.github.io/UniQuad/">HKUST</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="spkzgmo5lr4"><em>The video demonstrates the field testing of a 43 kg (95 lb) amphibious cycloidal propeller unmanned underwater vehicle (Cyclo-UUV) developed at the Advanced Vertical Flight Laboratory, Texas A&M University. The vehicle utilizes a combination of cycloidal propellers (or cyclo-propellers), screw propellers, and tank treads for operations on land and underwater.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="21136ac8dd2f8da163c218d2c74b02cd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/sPKZGMO5lR4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://avfl.engr.tamu.edu/projects/amphibious-uuv/">TAMU</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="zhch2w1_8t0"><em>The “pill” (the package hook) on Wing’s delivery drones is a crucial component to our aircraft! 
Did you know our package hook is designed to be aerodynamic and has stable flight characteristics, even at 65 mph?</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cbb02d6d0d38a05e94a12ed5a5551c1c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ZHCh2w1_8T0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://wing.com/">Wing</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="rtj0gp7twly">Happy 50th to robotics at ABB!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9753a65d2c460c4bb63357ed3dec63fc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/RTJ0gP7TwLY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://global.abb/group/en/about/history">ABB</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="lvmdzrxve_q">This JHU Center for Functional Anatomy & Evolution Seminar is by Chen Li, on Terradynamics of Animals & Robots in Complex Terrain.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="faa7a645891e8df891f0f72163e73270" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/LVmdZRxvE_Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://li.me.jhu.edu/">JHU</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 12 Jul 2024 16:21:11 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-unitree-talks-robots</guid><category>Video friday</category><category>Unitree</category><category>Fourier intelligence</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/two-chinese-men-sit-next-to-a-small-silver-humanoid-robot-with-a-larger-black-humanoid-robot-in-the-background-in-a-booth-at-a-c.png?id=52675892&width=980"></media:content></item><item><title>Food Service Robots Just Need the Right Ingredients</title><link>https://spectrum.ieee.org/chef-robotics-food-robots</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/as-trays-of-food-move-along-a-conveyor-belt-overhead-robot-arms-scoop-up-different-food-items-to-add-them-to-the-trays.png?id=52671148&width=1200&height=800&coordinates=267%2C0%2C267%2C0"/><br/><br/><p>Food prep is one of those problems that seems like it <em><em>should</em></em> be solvable by robots. It’s a predictable, repetitive, basic manipulation task in a semi-structured environment—seems ideal, right? And obviously there’s a huge need, because human labor is expensive and getting harder and harder to find in these contexts. There are currently <a href="https://www.chefrobotics.ai/" target="_blank">over a million unfilled jobs in the food industry</a> in the United States, and even with jobs that are filled, the annual turnover rate is 150 percent (meaning a lot of workers don’t even last a year).<strong></strong></p><p>Food prep seems like a great opportunity for robots, which is why <a href="https://www.chefrobotics.ai/" target="_blank"><u>Chef Robotics</u></a> and a handful of other robotics companies tackled it a couple years ago by bringing robots to fast casual restaurants like Chipotle or Sweetgreen, where you get served a custom-ish meal from a selection of ingredients at a counter.<strong></strong></p><p>But this didn’t really work out, for a couple of reasons. First, doing things that are mostly effortless for humans are inevitably extremely difficult for robots. And second, humans actually do a lot of useful things in a restaurant context besides just putting food onto plates, and the robots weren’t up for all of those things.</p><p>Still, Chef Robotics founder and CEO <a href="https://www.linkedin.com/in/rajatbhageria/" target="_blank">Rajat Bhageria</a> wasn’t ready to let this opportunity go. “The food market is arguably the biggest market that’s tractable for AI today,” he told <em><em>IEEE Spectrum</em></em>. And with a bit of a pivot away from the complicated mess of fast casual restaurants, Chef Robotics has still managed to prepare over 20 million meals thanks to autonomous robot arms deployed all over North America. Without knowing it, you may even have eaten such a meal.<br/></p><p class="pull-quote">“The hard thing is, can you pick fast? Can you pick consistently? Can you pick the right portion size without spilling? And can you pick without making it look like the food was picked by a machine?” <strong>—Rajat Bhageria, Chef Robotics</strong></p><p><span></span>When we spoke with Bhageria, he explained that there are three basic tasks involved in prepared food production: prep (tasks like chopping ingredients), the actual cooking process, and then assembly (or plating). Of these tasks, prep scales pretty well with industrial automation in that you can usually order pre-chopped or mixed ingredients, and cooking also scales well since you can cook more with only a minimal increase in effort just by using a bigger pot or pan or oven. What <em>doesn’t</em> scale well is the assembly, especially when any kind of flexibility or variety is required. You can clearly see this in action at any fast casual restaurant, where a couple of people are in the kitchen cooking up massive amounts of food while each customer gets served one at a time.</p><p>So with that bottleneck identified, let’s throw some robots at the problem, right? 
And that’s exactly what Chef Robotics did, explains Bhageria: “we went to our customers, who said that their biggest pain point was labor, and the most labor is in assembly, so we said, we can help you solve this.”</p><p>Chef Robotics started with fast casual restaurants. They weren’t the first to try this—many other robotics companies had attempted this before, with decidedly mixed results. “We actually had some good success in the early days selling to fast casual chains,” Bhageria says, “but then we had some technical obstacles. Essentially, if we want to have a human-equivalent system so that we can charge a human-equivalent service fee for our robot, we need to be able to do every ingredient. You’re either a full human equivalent, or our customers told us it wouldn’t be useful.”</p><p>Part of the challenge is that training robots to perform all of the different manipulations required for different assembly tasks requires different kinds of real-world data. That data simply doesn’t exist—or, if it does, any company that has it knows what it’s worth and isn’t sharing. You can’t easily simulate this kind of data, because food can be gross and difficult to handle, whether it’s gloopy or gloppy or squishy or slimy or unpredictably deformable in some other way, and you really need physical experience to train a useful manipulation model.</p><p>Setting fast casual restaurants aside for a moment, what about food prep situations where things are as predictable as possible, like mass-produced meals? We’re talking about food like frozen dinners that have a handful of discrete ingredients packed into trays at factory scale. Frozen meal production relies on automation rather than robotics because the scale is such that the cost of dedicated equipment can be justified.</p><p>There’s a middle ground, though, where robots have found (some) opportunity: When you need to produce a high volume of the same meal, but that meal changes regularly. For example, think of any kind of pre-packaged meal that’s made in bulk, just not at frozen-food scale. It’s an opportunity for automation in a structured environment—but with enough variety that fixed automation isn’t cost-effective. Suddenly, robots and their tiny bit of flexible automation have a chance to be a practical solution.</p><p class="shortcode-media shortcode-media-youtube">
<span class="rm-shortcode" data-rm-shortcode-id="fa3f9208eb17dcc5971b40593bdab8fd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/VzCIpfO6peQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
</p><p>“We saw these long assembly lines, where humans were scooping food out of big tubs and onto individual trays,” Bhageria says. “They do a lot of different meals on these lines; it’s going to change over and they’re going to do different meals throughout the week. But at any given moment, each person is doing one ingredient, and maybe on a weekly basis, that person would do six ingredients. This was really compelling for us because six ingredients is something we can bootstrap in a lab. We can get something good enough and if we can get something good enough, then we can ship a robot, and if we can ship a robot to production, then we will get real world training data.”<br/></p><p>Chef Robotics has been deploying robot modules that they can slot into existing food assembly lines in place of humans without any retrofitting necessary. The modules consist of six-degree-of-freedom arms wearing swanky IP67 washable suits. To handle different kinds of food, the robots can be equipped with a variety of different utensils (and their accompanying manipulation software strategies). Sensing includes a few depth cameras, as well as a weight-sensing platform for the food tray to ensure consistent amounts of food are picked. And while arms with six degrees of freedom may be overkill for now, eventually the hope is that they’ll be able to handle more complex food like asparagus, where you need to do a little bit more than just scoop.</p><p>While Chef Robotics seems to have a viable business here, Bhageria tells us that he keeps coming back to that vision of robots being useful in fast casual restaurants, and eventually, robots making us food in our homes. Making that happen will require time, experience, technical expertise, and an astonishing amount of real-world training data, which is the real value behind those 20 million robot-prepared meals (and counting). The more robots the company deploys, the more data they collect, which will allow them to train their food manipulation models to handle a wider variety of ingredients to open up even more deployments. Their robots, <a href="https://www.chefrobotics.ai/post/lifting-the-veil-on-chef-and-the-future-of-embodied-ai-in-the-food-industry" target="_blank">Chef’s website says</a>, “essentially act as data ingestion engines to improve our AI models.”</p><p>The next step is likely <a href="https://cloudkitchens.com/blog/ultimate-guide-to-ghost-kitchens/" target="_blank">ghost kitchens</a>, where the environment is still somewhat controlled and human interaction isn’t necessary, followed by deployments in commercial kitchens more broadly. But even that won’t be enough for Bhageria, who wants robots that can take over from all of the drudgery in food service: “I’m really excited about this vision,” he says. 
“How do we deploy hundreds of millions of robots all over the world that allow humans to do what humans do best?”</p>]]></description><pubDate>Thu, 11 Jul 2024 18:51:23 +0000</pubDate><guid>https://spectrum.ieee.org/chef-robotics-food-robots</guid><category>Chef robotics</category><category>Food robots</category><category>Robotic manipulation</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/as-trays-of-food-move-along-a-conveyor-belt-overhead-robot-arms-scoop-up-different-food-items-to-add-them-to-the-trays.png?id=52671148&width=980"></media:content></item><item><title>Sea Drones in the Russia-Ukraine War Inspire New Tactics</title><link>https://spectrum.ieee.org/sea-drone</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-man-in-a-camouflage-military-uniform-sits-on-a-naval-drone-at-waters-edge.jpg?id=52557078&width=1200&height=800&coordinates=26%2C0%2C100%2C0"/><br/><br/><p>
<strong>Against all odds,</strong> Ukraine is still standing almost two and a half years after Russia’s massive 2022 invasion. Of course, hundreds of billions of dollars in Western support as well as Russian errors have helped immensely, but it would be a mistake to overlook Ukraine’s creative use of new technologies, particularly drones. While uncrewed aerial vehicles have grabbed most of the attention, it is naval drones that could be the key to bringing Russian president Vladimir Putin to the negotiating table.
</p><p>
These naval-drone operations in the Black Sea against Russian warships and other targets have been so successful that they are prompting, in London, Paris, Washington, and elsewhere, fundamental reevaluations of how drones will affect future naval operations. In August 2023, for example, the Pentagon launched the billion-dollar
<a href="https://www.diu.mil/replicator" rel="noopener noreferrer" target="_blank">Replicator</a> initiative to field air and naval drones (also called sea drones) on a massive scale. It’s widely believed that such drones could be used to help <a href="https://www.navalnews.com/naval-news/2024/06/breaking-down-the-u-s-navys-hellscape-in-detail/" rel="noopener noreferrer" target="_blank">counter</a> a Chinese invasion of Taiwan.
</p><p>
And yet Ukraine’s naval-drone initiative grew out of necessity, not grand strategy. Early in the war, Russia’s Black Sea Fleet launched cruise missiles into Ukraine and blockaded Odesa, effectively shutting down Ukraine’s exports of grain, metals, and manufactured goods. The missile strikes terrorized Ukrainian citizens and shut down the power grid, but Russia’s blockade was arguably more consequential, devastating Ukraine’s economy and creating food shortages from North Africa to the Middle East.
</p><p>
With its navy seized or sunk during the war’s opening days, Ukraine had few options to regain access to the sea. So Kyiv’s troops got creative.
<a href="https://uk.wikipedia.org/wiki/%D0%9B%D1%83%D0%BA%D0%B0%D1%88%D0%B5%D0%B2%D0%B8%D1%87_%D0%86%D0%B2%D0%B0%D0%BD_%D0%92%D0%BE%D0%BB%D0%BE%D0%B4%D0%B8%D0%BC%D0%B8%D1%80%D0%BE%D0%B2%D0%B8%D1%87" rel="noopener noreferrer" target="_blank">Lukashevich Ivan Volodymyrovych</a>, a brigadier general in the <a href="https://greydynamics.com/ukrainian-sbu-protectors-of-the-homeland/" rel="noopener noreferrer" target="_blank">Security Service of Ukraine</a>, the country’s counterintelligence agency, proposed building a series of fast, uncrewed attack boats. In the summer of 2022, the service, which is known by the acronym SBU, began with a few prototype drones. These quickly led to a pair of naval drones that, when used with commercial satellite imagery, off-the-shelf uncrewed aircraft, and Starlink terminals, gave Ukrainian operators the means to sink or disable a<a href="https://www.newsweek.com/russia-black-sea-fleet-ukraine-crimea-tsiklon-corvette-1902339" rel="noopener noreferrer" target="_blank"> third</a> of Russia’s Black Sea Fleet, including the flagship <a href="https://en.wikipedia.org/wiki/Russian_cruiser_Moskva" rel="noopener noreferrer" target="_blank"><em><em>Moskva</em></em></a> and <a href="https://www.newsweek.com/russia-moving-black-sea-ships-ukraine-strikes-crimea-tsiklon-1903944" rel="noopener noreferrer" target="_blank">most</a> of the fleet’s cruise-missile-equipped warships.
</p><p>
To protect their remaining vessels, Russian commanders relocated the Black Sea Fleet to Novorossiysk, 300 kilometers east of Crimea. This move sheltered the ships from Ukrainian drones and missiles, but it also put them too far away to threaten Ukrainian shipping or defend the Crimean Peninsula. Kyiv has exploited the opening by restoring trade routes and mounting sustained airborne and naval drone strikes against Russian bases on Crimea and the Kerch Strait Bridge connecting the peninsula with Russia.
</p><h2>How Maguras and Sea Babies Hunt and Attack</h2><p>
The first Ukrainian drone boats were cobbled together with parts from jet skis, motorboats, and off-the-shelf electronics. But within months, manufacturers working for the Ukraine defense ministry and SBU fielded several designs that proved their worth in combat, most notably the
<a href="https://www.kyivpost.com/analysis/29068" rel="noopener noreferrer" target="_blank">Magura V5</a> and the <a href="https://www.kyivpost.com/post/25792" rel="noopener noreferrer" target="_blank">Sea Baby</a>.
</p><p>
Carrying a 300-kilogram warhead, on par with that of a heavyweight
<a href="https://www.navy.mil/Resources/Fact-Files/Display-FactFiles/Article/2167907/mk-48-heavyweight-torpedo/" rel="noopener noreferrer" target="_blank">torpedo</a>, the Magura V5 is a hunter-killer antiship drone designed to work in swarms that confuse and overwhelm a ship’s defenses. Equipped with Starlink terminals, which connect to SpaceX’s Starlink satellites, and GPS, a group of about three to five Maguras likely moves autonomously to a location near the potential target. From there, operators can wait until conditions are right and then attack the target from multiple angles using remote control and video feeds from the vehicles.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
<img alt="A man in a black wetsuit and brown bucket hat stands in shallow water next to a gray naval drone. " class="rm-shortcode" data-rm-shortcode-id="c1be3812e64548fcf19c652ab366ca21" data-rm-shortcode-name="rebelmouse-image" id="07bbd" loading="lazy" src="https://spectrum.ieee.org/media-library/a-man-in-a-black-wetsuit-and-brown-bucket-hat-stands-in-shallow-water-next-to-a-gray-naval-drone.jpg?id=52557150&width=980"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">A Ukrainian Magura V5 hunter-killer sea drone was demonstrated at an undisclosed location in Ukraine on 13 April 2024. The domed pod toward the bow, which can rotate from side to side, contains a thermal camera used for guidance and targeting.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Valentyn Origrenko/Reuters/Redux</small></p><p>Larger than a Magura, the Sea Baby is a multipurpose vehicle that can carry about 800 kg of explosives, which is close to twice the payload of a Tomahawk cruise missile. A Sea Baby was used in 2023 to inflict substantial damage to the Kerch Strait Bridge. A more recent version <a href="https://www.newsweek.com/ukraine-sea-baby-naval-drones-grad-multiple-rocket-launchers-russia-1903381" target="_blank">carries</a> a rocket launcher that Ukraine troops plan to use against Russian forces along the Dnipro River, which flows through eastern Ukraine and has often formed the frontline in that part of the country. Like a Magura, a Sea Baby is likely remotely controlled using Starlink and GPS. In addition to attack, it’s also equipped for surveillance and logistics.</p><p>Russia reduced the threat to its ships by moving them out of the region, but fixed targets like the Kerch Strait Bridge remain vulnerable to Ukrainian sea drones. To try to protect these structures from drone onslaughts, Russian commanders are taking a “kitchen sink” approach, <a href="https://www.twz.com/russia-sinks-line-of-its-own-ships-to-protect-kerch-bridge" target="_blank">submerging</a> hulks around bridge supports, fielding more <a href="https://www.bbc.com/news/world-europe-68528761" rel="noopener noreferrer" target="_blank">guns</a> to shoot at incoming uncrewed vessels, and jamming GPS and <a href="https://www.nytimes.com/2024/05/24/technology/ukraine-russia-starlink.html" rel="noopener noreferrer" target="_blank">Starlink</a> around the Kerch Strait.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
<img alt="Two men wearing balaclavas operate suitcase-style terminals for remote control of sea drones. " class="rm-shortcode" data-rm-shortcode-id="142cdcfc704ae039012189256c0d20a4" data-rm-shortcode-name="rebelmouse-image" id="63dd6" loading="lazy" src="https://spectrum.ieee.org/media-library/two-men-wearing-balaclavas-operate-suitcase-style-terminals-for-remote-control-of-sea-drones.jpg?id=52557111&width=980"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">Ukrainian service members demonstrated the portable, ruggedized consoles used to remotely guide the Magura V5 naval drones in April 2024.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Valentyn Origrenko/Reuters/Redux</small>
</p><p>While the war remains largely stalemated in the country’s north, Ukraine’s naval drones could yet force Russia into negotiations. The Crimean Peninsula was Moscow’s biggest prize from its decade-long assault on Ukraine. If the Kerch Bridge is severed and the Black Sea Fleet pushed back into Russian ports, Putin may need to end the fighting to regain control over Crimea.<br/></p><h2>Why the U.S. Navy Embraced the Swarm</h2><p>
Ukraine’s small, low-cost sea drones are offering a compelling view of future tactics and capabilities. But recent experiences elsewhere in the world are highlighting the limitations of drones for some crucial tasks. For example, for protecting shipping from piracy or stopping trafficking and illegal fishing, drones are less useful.
</p><p>
Before the Ukraine war, efforts by the U.S. Department of Defense to field surface sea drones focused mostly on large vehicles. In 2015, the Defense Advanced Research Projects Agency started, and the U.S. Navy later continued, a project that built
<a href="https://news.usni.org/2021/04/08/navy-takes-delivery-of-sea-hawk-unmanned-vessel" rel="noopener noreferrer" target="_blank">two uncrewed surface vessels</a>, called <em><em>Sea Hunter</em></em> and <em><em>Sea Hawk</em></em>. These were 130-tonne sea drones capable of roaming the oceans for up to 70 days while carrying payloads of thousands of pounds each. The point was to demonstrate the ability to detect, follow, and destroy submarines. The Navy and the Pentagon’s secretive Strategic Capabilities Office <a href="https://www.naval-technology.com/projects/ghost-fleet-overlord-unmanned-surface-vessels-usa/" rel="noopener noreferrer" target="_blank">followed</a> with the Ghost Fleet Overlord uncrewed vessel programs, which produced four larger prototypes designed to carry shipping-container-size payloads of missiles, sensors, or electronic countermeasures.
</p><p>
The U.S. Navy’s newly created Uncrewed Surface Vessel Division 1 (
<a href="https://seapowermagazine.org/navy-establishes-unmanned-surface-vessel-division-one/" rel="noopener noreferrer" target="_blank">USVDIV-1</a>) completed a <a href="https://www.cpf.navy.mil/Newsroom/News/Article/3569198/unmanned-surface-vessel-division-arrives-in-sydney/" rel="noopener noreferrer" target="_blank">deployment</a> across the Pacific Ocean last year with four medium and large sea drones: <em><em>Sea Hunter</em></em> and <em><em>Sea Hawk </em></em>and two Overlord vessels, <em><em>Ranger</em></em> and <em><em>Mariner.</em></em> The five-month deployment from Port Hueneme, Calif., took the vessels to Hawaii, Japan, and Australia, where they joined in annual exercises conducted by U.S. and allied navies. The U.S. Navy continues to <a href="https://www.defensenews.com/naval/2022/08/08/us-navy-injects-first-of-kind-unmanned-experiments-into-multinational-exercise/" rel="noopener noreferrer" target="_blank">assess</a> its drone fleet through sea trials lasting from several days to a few months.
</p><p class="shortcode-media shortcode-media-rebelmouse-image">
<img alt="A battleship-gray trimaran ship cruises near a wooded shoreline." class="rm-shortcode" data-rm-shortcode-id="1e49797fdd2d924f71289a1090da26ff" data-rm-shortcode-name="rebelmouse-image" id="c14eb" loading="lazy" src="https://spectrum.ieee.org/media-library/a-battleship-gray-trimaran-ship-cruises-near-a-wooded-shoreline.jpg?id=52557117&width=980"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">The <i>Sea Hawk</i> is a U.S. Navy trimaran drone vessel designed to find, pursue, and attack submarines. The 130-tonne ship, photographed here in October of 2023 in Sydney Harbor, was built to operate autonomously on missions of up to 70 days, but it can also accommodate human observers on board. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Ensign Pierson Hawkins/U.S. Navy</small>
</p><p>
In contrast with Ukraine’s small sea drones, which are usually remotely controlled and operate outside shipping lanes, the U.S. Navy’s much larger uncrewed vessels have to follow the nautical rules of the road. To navigate autonomously, these big ships rely on robust onboard sensors, processing for computer vision and target-motion analysis, and automation based on predictable forms of artificial intelligence, such as expert- or agent-based algorithms rather than deep learning.
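</p><p>
To make that distinction concrete, here is a minimal, illustrative sketch (in Python) of the kind of rule-based logic such expert systems encode: in this case, a drastically simplified version of the COLREGS collision-avoidance rules. The thresholds and rule set are hypothetical simplifications for illustration, not any vessel’s actual autonomy software.
</p><pre><code>
# Illustrative only: a drastically simplified, rule-based COLREGS decision
# in the expert-system style described above; not any real vessel's code.

def colregs_action(relative_bearing_deg: float, range_nm: float, closing: bool) -> str:
    """Pick a maneuver for one radar contact.

    relative_bearing_deg: contact bearing relative to own bow, 0-360.
    range_nm: distance to the contact in nautical miles.
    closing: True if the range is decreasing.
    """
    if not closing or range_nm > 3.0:
        return "stand on"  # no collision risk yet
    if relative_bearing_deg < 6 or relative_bearing_deg > 354:
        return "alter course to starboard"  # head-on situation (Rule 14)
    if relative_bearing_deg <= 112.5:
        return "give way: slow and pass astern"  # contact to starboard (Rule 15)
    if relative_bearing_deg >= 247.5:
        return "stand on, monitor"  # contact to port: the other vessel gives way
    return "keep clear"  # overtaking sector (Rule 13)

print(colregs_action(45.0, 2.0, closing=True))  # give way: slow and pass astern
</code></pre><p>
The appeal of this style for large, rule-bound ships is traceability: every maneuver can be audited back to a specific rule, which is much harder to guarantee with a deep-learning policy.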
</p><p>
But thanks to the success of the Ukrainian drones, the focus and energy in sea drones are rapidly moving to the smaller end of the scale. The U.S. Navy initially envisioned platforms like
<em>Sea Hunter</em> conducting missions in submarine tracking, electronic deception, or clandestine surveillance far out at sea. And large drones will still be needed for such missions. However, with the right tactics and support, a group of small sea drones can conduct similar missions as well as other vital tasks.
</p><p>
For example, though they are constrained in speed, maneuverability, and power generation, solar- or sail-powered drones can stay out for months with little human intervention. The earliest of these are wave gliders like the Liquid Robotics (a Boeing company)
<a href="https://www.liquid-robotics.com/markets/defense-security/" rel="noopener noreferrer" target="_blank"> SHARC</a>, which has been conducting undersea and surface surveillance for the U.S. Navy for more than a decade. Newer designs like the Saildrone <a href="https://www.saildrone.com/tag/voyager" rel="noopener noreferrer" target="_blank">Voyager</a> and Ocius <a href="https://ocius.com.au/usv/" rel="noopener noreferrer" target="_blank">Blue Bottle</a> incorporate motors and additional solar or diesel power to haul payloads such as radars, jammers, decoys, or active sonars. The Ocean Aero <a href="https://www.oceanaero.com/the-triton" rel="noopener noreferrer" target="_blank">Triton</a> takes this model one step further: It can submerge, to conduct clandestine surveillance or a surprise attack, or to avoid detection.
</p><p class="shortcode-media shortcode-media-rebelmouse-image">
<img alt="A pair of photographs shows an oblong, gray-and-black sea vessel cruising underwater and also sailing on the surface. " class="rm-shortcode" data-rm-shortcode-id="80e9baa671b33a4dee632bfe1fcd146a" data-rm-shortcode-name="rebelmouse-image" id="d8de8" loading="lazy" src="https://spectrum.ieee.org/media-library/a-pair-of-photographs-shows-an-oblong-gray-and-black-sea-vessel-cruising-underwater-and-also-sailing-on-the-surface.jpg?id=52557125&width=980"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">The Triton, from Ocean Aero in Gulfport, Miss., is billed as the world’s only autonomous sea drone capable of both cruising underwater and sailing on the surface. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Ocean Aero</small></p><p>
Ukraine’s success in the Black Sea has also unleashed a flurry of new small antiship attack drones. USVDIV-1 will use the
<a href="https://www.mapcorp.com/technologies-main/#marine" rel="noopener noreferrer" target="_blank">GARC</a> from <a href="https://www.mapcorp.com/" rel="noopener noreferrer" target="_blank">Maritime Applied Physics Corp.</a> to develop tactics. The Pentagon’s Defense Innovation Unit has also begun <a href="https://www.thedefensepost.com/2024/04/10/texas-marine-drones-china/" rel="noopener noreferrer" target="_blank">purchasing</a> drones for the China-focused Replicator initiative. Among the likely craft being evaluated are fast-attack sea drones from Austin, Texas–based <a href="https://www.saronic.com/" rel="noopener noreferrer" target="_blank">Saronic</a>.
</p><p>
Behind the soaring interest in small and inexpensive sea drones is the
<a href="https://www.hudson.org/defense-strategy/unalone-unafraid-plan-integrating-uncrewed-other-emerging-technologies-us-military-bryan-clark-dan-patt" rel="noopener noreferrer" target="_blank">changing value proposition</a> for naval drones. As recently as four years ago, military planners were focused on using them to replace crewed ships in “dull, dirty, and dangerous” jobs. But now, the thinking goes, sea drones can provide scale, adaptability, and resilience across each link in the “kill chain” that extends from detecting a target to hitting it with a weapon.
</p><p>
Today, to attack a ship, most navies generally have one preferred sensor (such as a radar system), one launcher, and one missile. But what these planners are now coming to appreciate is that a fleet of crewed surface ships with a collection of a dozen or two naval drones would offer multiple paths to both find that ship and attack it. These craft would also be less vulnerable, because of their dispersion.
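</p><p>
A back-of-the-envelope calculation shows why those extra paths matter. If each path, meaning each sensor-to-weapon combination, succeeds independently, the chance that at least one gets through rises quickly with numbers, even when every individual path is weak. The probabilities below are invented purely for illustration:
</p><pre><code>
# Hypothetical numbers only: why many weak, independent paths can rival
# one strong one across a kill chain.

def p_at_least_one(success_probs):
    """Probability that at least one independent path succeeds."""
    p_all_fail = 1.0
    for p in success_probs:
        p_all_fail *= 1.0 - p
    return 1.0 - p_all_fail

print(round(p_at_least_one([0.7]), 3))        # one good path: 0.7
print(round(p_at_least_one([0.25] * 12), 3))  # twelve weak paths: ~0.968
</code></pre><p>
Real engagements are never statistically independent, of course, but the underlying intuition, that dispersion buys both redundancy and survivability, is what is pulling planners toward swarms.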
</p><h2>Defending Taiwan by Surrounding It With a “Hellscape”</h2><p>
U.S. efforts to protect Taiwan may soon reflect this new value proposition. Many
<a href="https://www.defenseone.com/policy/2021/07/it-failed-miserably-after-wargaming-loss-joint-chiefs-are-overhauling-how-us-military-will-fight/184050/" rel="noopener noreferrer" target="_blank">classified</a> and <a href="https://www.csis.org/analysis/first-battle-next-war-wargaming-chinese-invasion-taiwan" rel="noopener noreferrer" target="_blank">unclassified</a> war games suggest Taiwan and its allies could successfully defend the island—but at costs high enough to potentially dissuade a U.S. president from intervening on Taiwan’s behalf. With U.S. defense budgets capped by law and procurement constrained by rising personnel and maintenance costs, substantially growing or improving today’s U.S. military for this specific purpose is unrealistic. Instead, commanders are looking for creative solutions to slow or stop a Chinese invasion without losing most U.S. forces in the process.
</p><p>
Naval drones look like a good—and maybe the best—
<a href="https://www.hudson.org/defense-strategy/hedging-bets-rethinking-force-design-post-dominance-era-bryan-clark-dan-patt" rel="noopener noreferrer" target="_blank">solution</a>. The Taiwan Strait is only 160 kilometers (100 miles) wide, and Taiwan’s coastline offers only a few areas where large numbers of troops could come ashore. U.S. naval attack drones positioned on the likely routes could disrupt or possibly even halt a Chinese invasion, much as Ukrainian sea drones have denied Russia access to the western Black Sea and, for that matter, Houthi-controlled drones have sporadically closed off large parts of the Red Sea in the Middle East.
</p><p class="pull-quote">Rather than killer robots seeking out and destroying targets, the drones defending Taiwan would be passively waiting for Chinese forces to illegally enter a protected zone, within which they could be attacked.</p><p>
The new U.S. Indo-Pacific Command leader, Admiral
<a href="https://www.navy.mil/Leadership/Flag-Officer-Biographies/Search/Article/2236378/admiral-samuel-paparo/" rel="noopener noreferrer" target="_blank">Sam Paparo</a>, wants to apply this approach to defending Taiwan in a scenario he calls “<a href="https://www.defenseone.com/technology/2023/08/hellscape-dod-launches-massive-drone-swarm-program-counter-china/389797/" rel="noopener noreferrer" target="_blank">Hellscape</a>.” In it, U.S. surface and undersea drones would likely be based near Taiwan, perhaps in the Philippines or Japan. When the potential for an invasion rises, the drones would move themselves or be carried by larger uncrewed or crewed ships to the western coast of Taiwan to wait.
</p><p>
Sea drones are well-suited to this role, thanks in part to the evolution of naval technologies and tactics over the past half century. Until World War II, submarines were the most lethal threat to ships. But since the Cold War, long-range subsonic, supersonic, and now hypersonic antiship missiles have commanded navy leaders’ attention. They’ve spent decades devising ways to protect their ships against such antiship missiles.
</p><p>
Much less effort has gone into defending against torpedoes, mines—or sea drones. A dozen or more missiles might be needed to ensure that just one reaches a targeted ship, and even then, the
<a href="https://apnews.com/article/yemen-red-sea-ship-attack-c47710540383198ba1acb41b07f14751" rel="noopener noreferrer" target="_blank">damage</a> may not be catastrophic. But a single surface or undersea drone could easily evade detection and explode at a ship’s waterline to sink it, because in this case, water pressure does most of the work.
</p><p>
The level of autonomy available in most sea drones today is more than enough to attack ships in the Taiwan Strait. Details of U.S. military plans are classified, but a recent Hudson Institute
<a href="https://www.hudson.org/defense-strategy/hedging-bets-rethinking-force-design-post-dominance-era-bryan-clark-dan-patt" rel="noopener noreferrer" target="_blank">report</a> that I wrote with Dan Patt, proposes a possible approach. In it, a drone flotilla, consisting of about three dozen hunter-killer surface drones, two dozen uncrewed surface vessels carrying aerial drones, and three dozen autonomous undersea drones, would take up designated positions in a “kill box” adjacent to one of Taiwan’s western beaches if a Chinese invasion fleet had begun massing on the opposite side of the strait. Even if they were based in Japan or the Philippines, the drones could reach Taiwan within a day. Upon receiving a signal from operators remotely using Starlink or locally using a line-of-sight radio, the drones would act as a mobile minefield, attacking troop transports and their escorts inside Taiwan’s territorial waters. Widely available electro-optical and infrared sensors, coupled to recognition <a href="https://ieeexplore.ieee.org/document/9987188" rel="noopener noreferrer" target="_blank">algorithms</a>, would direct the drones to targets.
</p><p>
Although communications with operators onshore would likely be jammed, the drones could coordinate their actions locally using line-of-sight Internet Protocol–based networks like
<a href="https://silvustechnologies.com/" rel="noopener noreferrer" target="_blank">Silvus</a> or <a href="https://www.collinsaerospace.com/what-we-do/industries/military-and-defense/communications/tactical-data-links/tactical-targeting-network-technology" rel="noopener noreferrer" target="_blank">TTNT</a>. For example, surface vessels could launch aerial drones that would attack the pilot houses and radars of ships, while surface and undersea drones strike ships at the waterline. The drones could also coordinate to ensure they do not all strike the same target and to prioritize the largest targets first. These kinds of simple collaborations are routine in today’s drones.
</p><p>
Treating drones like mines reduces the complexity needed in their control systems and helps them comply with Pentagon
<a href="https://www.defense.gov/News/News-Stories/Article/Article/3278065/dod-updates-autonomy-in-weapons-system-directive/#:~:text=DOD%20requires%20extensive%20testing%2C%20reviews,do%20not%20meet%20specific%20exemptions." rel="noopener noreferrer" target="_blank"> rules</a> for autonomous weapons. Rather than killer robots seeking out and destroying targets, the drones defending Taiwan would be passively waiting for Chinese forces to illegally enter a protected zone, within which they could be attacked.
</p><p>
Like Russia’s Black Sea Fleet, the Chinese navy will develop countermeasures to sea drones, such as employing decoy ships, attacking drones from the air, or using minesweepers to move them away from the invasion fleet. To stay ahead, operators will need to continue innovating tactics and behaviors through frequent exercises and experiments, like those
<a href="https://www.navy.mil/Press-Office/News-Stories/Article/3781958/surfor-establishes-unmanned-surface-vessel-squadron-usvron-three/" rel="noopener noreferrer" target="_blank">underway</a> at U.S. Navy Unmanned Surface Vessel Squadron Three. (Like the USVDIV-1, it is a unit under the U.S. Navy’s <a href="https://insidedefense.com/insider/navy-stand-new-unmanned-surface-vessel-squadron" rel="noopener noreferrer" target="_blank">Surface Development Squadron One</a>.) Lessons from such exercises would be incorporated into the defending drones as part of their programming before a mission.
</p><p>
The emergence of sea drones heralds a new era in naval warfare. After decades of focusing on increasingly lethal antiship missiles, navies now have to defend against capable and widely proliferating threats on, above, and below the water. And while sea drone swarms may be mainly a concern for coastal areas, these choke points are critical to the global economy and most nations’ security. For U.S. and allied fleets, especially, naval drones are a classic combination of threat
<em><em>and</em></em> opportunity. As the Hellscape concept suggests, uncrewed vessels may be a solution to some of the most challenging and sweeping of modern naval scenarios for the Pentagon and its allies—and their adversaries. <span class="ieee-end-mark"></span>
</p><p><em>This article was updated on 10 July 2024. An earlier version stated that sea drones from Saronic Technologies are being purchased by the U.S. Department of Defense’s Defense Innovation Unit. This could not be publicly confirmed.</em></p>]]></description><pubDate>Wed, 10 Jul 2024 12:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/sea-drone</guid><category>Naval drones</category><category>Sea drones</category><category>Uncrewed surface vehicles</category><category>Autonomous surface vehicles</category><dc:creator>Bryan Clark</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-man-in-a-camouflage-military-uniform-sits-on-a-naval-drone-at-waters-edge.jpg?id=52557078&width=980"></media:content></item><item><title>Video Friday: Humanoids Building BMWs</title><link>https://spectrum.ieee.org/video-friday-humanoids-building-bmws</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-silvery-humanoid-robot-picks-up-car-parts-from-a-fixture-in-a-factory.png?id=52546736&width=1200&height=800&coordinates=150%2C0%2C150%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><h5><a href="https://icra40.ieee.org/">ICRA@40</a>: 23–26 September 2024, ROTTERDAM, NETHERLANDS</h5><h5><a href="https://iros2024-abudhabi.org/">IROS 2024</a>: 14–18 October 2024, ABU DHABI, UAE</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="k1trbi0baau">Figure is making progress toward a humanoid robot that can do something useful, but keep in mind that the “full use case” here is not one continuous shot.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c8db6c81c8499e776e5d41aaff04a190" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/K1TrbI0BaaU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.figure.ai/">Figure</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ijmykhr1lyw">Can this robot survive a 1-meter drop? Spoiler alert: it cannot.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f8c39b97670d3d14fa6b7cdfe60584fd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/IjMyKHr1lyw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://yugu.faculty.wvu.edu/">WVUIRL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="b_i2k7mzekg">One of those things that’s a lot harder for robots than it probably looks.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="96a700df89fa62b0334122796933c94f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/B_I2k7MZEKg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>This is a demo of hammering a nail. The instantaneous rebound force from the hammer is absorbed through a combination of the elasticity of the rubber material securing the hammer, the deflection in torque sensors and harmonic gears, back-drivability, and impedance control. 
This allows the nail to be driven with a certain amount of force.</em></blockquote><p>[ <a href="https://robotics.tokyo/">Tokyo Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="qetylccejtw"><em>Although bin packing has been a key benchmark task for robotic manipulation, the community has mainly focused on the placement of rigid rectilinear objects within the container. We address this by presenting a soft robotic hand that combines vision, motor-based proprioception, and soft tactile sensors to identify, sort, and pack a stream of unknown objects.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2beb6be6aaeee0228d93d9edb21e6258" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/qetYLCcejTw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.youtube.com/watch?v=qetYLCcejTw">MIT CSAIL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ujdk3yd2ghy"><em>Status Update: Extending traditional visual servo and compliant control by integrating the latest reinforcement and imitation learning control methodologies, UBTECH gradually trains the embodied intelligence-based “cerebellum” of its humanoid robot Walker S for diverse industrial manipulation tasks.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7009b146ff17c4fc5b6008f65f772b5c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ujdK3yd2gHY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ubtrobot.com/humanoid/products/Walker">UBTECH</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="nuosqungasw">If you’re gonna ask a robot to stack bread, better make it flat.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e3016171edff0edede3527dbdb59bd88" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/nUOsQungAsw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.fanucamerica.com/products/robots/series/dr-3ib-series-delta-robots/dr-3ib-8l">FANUC</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="orie28shevc">Cassie has to be one of the most distinctive sounding legged robots there is.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="46e84ab372a73f5ddf558217ecb31e71" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ORie28sHEvc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2403.02486">Paper</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="0tlsys8a4aa">Twice the robots are by definition twice as capable, right...?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" 
data-rm-shortcode-id="9bc13d7facf89a8aa0357a624fb28a1d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0tLSYs8A4AA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pollen-robotics.com/">Pollen Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="lld3bfs-qms"><em>The Robotic Systems Lab participated in the Advanced Industrial Robotic Applications (AIRA) Challenge at the ACHEMA 2024 process industry trade show, where teams demonstrated their teleoperated robotic solutions for industrial inspection tasks. We competed with the ALMA legged manipulator robot, teleoperated using a second robot arm in a leader-follower configuration, placing us in third place for the competition.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b2dbac224a474e3a7c36513e8b913ae9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/LLD3BFS-qms?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://rsl.ethz.ch/">ETHZ RSL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="zuntf0kmtze">This is apparently “peak demand” in a single market for Wing delivery drones.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1b6a198d2839aaff8090c2de344836e8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Zuntf0KmtzE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://wing.com/">Wing</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="fkdnu50nx-8"><em>Using a new type of surgical intervention and neuroprosthetic interface, MIT researchers, in collaboration with colleagues from Brigham and Women’s Hospital, have shown that a natural walking gait is achievable using a prosthetic leg fully driven by the body’s own nervous system. The surgical amputation procedure reconnects muscles in the residual limb, which allows patients to receive “proprioceptive” feedback about where their prosthetic limb is in space.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="47c34c0243154907cdeb6b30c0b85c48" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/fKdnu50Nx-8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://news.mit.edu/2024/prosthesis-helps-people-with-amputation-walk-naturally-0701">MIT</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="lpodb4c5cim"><em>Coal mining in Forest of Dean (UK) is such a difficult and challenging job. Going into the mine as human is sometimes almost impossible. 
We did it with our robot while inspecting the mine with our partners (Forestry England) and the local miners!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="54a784d4fb098847bec2b47efa55f30c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/LPoDb4C5cIM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://rpl-as-ucl.github.io/">UCL RPL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="b72gej00gtq">Chill.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cc03677ec95c9ac1462ba5fb7c4877ff" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/b72geJ00gTQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://new.abb.com/products/robotics/robots/collaborative-robots/yumi/dual-arm">ABB</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ui6iklhh-pq"><em>Would you tango with a robot? Inviting us into the fascinating world of dancing machines, robot choreographer Catie Cuan highlights why teaching robots to move with grace, intention and emotion is essential to creating AI-powered machines we will want to welcome into our daily lives.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5d848c3b8c41bd801e3b4d6ca532e376" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UI6IKlHh-pQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ted.com/talks/catie_cuan_next_up_for_ai_dancing_robots?rss=172BB350-0205">TED</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 05 Jul 2024 19:51:56 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-humanoids-building-bmws</guid><category>Video friday</category><category>Humanoid robots</category><category>Figure</category><category>Dancing robots</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-silvery-humanoid-robot-picks-up-car-parts-from-a-fixture-in-a-factory.png?id=52546736&width=980"></media:content></item><item><title>Persona AI Brings Calm Experience to the Hectic Humanoid Industry</title><link>https://spectrum.ieee.org/persona-ai-radford-pratt</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-shadowy-rendering-of-a-humanoid-robot-standing-in-the-darkness.png?id=52520431&width=1200&height=800&coordinates=0%2C101%2C0%2C101"/><br/><br/><p>
It may at times seem like there are as many humanoid robotics companies out there as the industry could possibly sustain, but the potential for useful and reliable and affordable humanoids is so huge that there’s plenty of room for any company that can actually get them to work. Joining the <a href="https://spectrum.ieee.org/humanoid-robots" target="_blank">dozen or so companies</a> already on this quest is
<a href="https://personainc.ai/" target="_blank">Persona AI</a>, founded last month by <a href="https://www.linkedin.com/in/nicolaus-radford/" target="_blank">Nic Radford</a> and <a href="https://www.linkedin.com/in/jerry-pratt/" target="_blank">Jerry Pratt</a>, two people who know better than just about anyone what it takes to make a successful robotics company, although they also know enough to be wary of getting into commercial humanoids.</p><h3></h3><br/><p>
Persona AI may not be the first humanoid robotics startup, but its founders have some serious experience in the space:
</p><p>
<strong>Nic Radford</strong> led the team that developed NASA’s <a href="https://spectrum.ieee.org/meet-valkyrie-nasas-superhero-robot" target="_blank">Valkyrie humanoid robot</a> before founding Houston Mechatronics (now Nauticus Robotics), which introduced a <a href="https://spectrum.ieee.org/meet-aquanaut-the-underwater-transformer" target="_blank">transforming underwater robot</a> in 2019. He also founded Jacobi Motors, which is commercializing variable flux electric motors.
</p><p>
<strong>Jerry Pratt</strong> worked on walking robots for 20 years at the Institute for Human and Machine Cognition (<a href="https://robots.ihmc.us/" target="_blank">IHMC</a>) in Pensacola, Florida. He co-founded <a href="https://boardwalkrobotics.com/" target="_blank">Boardwalk Robotics</a> in 2017, and has spent the last two years as CTO of <a href="https://spectrum.ieee.org/figure-robot-video" target="_blank">multi-billion-dollar humanoid startup Figure</a>.
</p><p>
“It took me a long time to warm up to this idea,” Nic Radford tells us. “After I left Nauticus in January, I didn’t want anything to do with humanoids, especially underwater humanoids, and I didn’t even want to hear the word ‘robot.’ But things are changing so quickly, and I got excited and called Jerry and I’m like, this is actually very possible.” Jerry Pratt, who recently left Figure due primarily to the
<a href="https://en.wikipedia.org/wiki/Two-body_problem_(career)" rel="noopener noreferrer" target="_blank"><u>two-body problem</u></a>, seems to be coming from a similar place: “There’s a lot of bashing your head against the wall in robotics, and persistence is so important. Nic and I have both gone through pessimism phases with our robots over the years. We’re a bit more optimistic about the commercial aspects now, but we want to be pragmatic and realistic about things too.”
</p><p>
Behind all of the recent humanoid hype lies the very, very difficult problem of making a highly technical piece of hardware and software compete effectively with humans in the labor market. But that’s also a very, very big opportunity—big enough that Persona doesn’t have to be the first company in this space, or the best funded, or the highest profile. They simply have to succeed, but of course sustainable commercial success with any robot (and bipedal robots in particular) is anything but simple. Step one will be building a founding team across two locations: Houston and Pensacola, Fla. But Radford says that the response so far to just a couple of
<a href="https://www.linkedin.com/company/persona-humanoids-at-work/" target="_blank">LinkedIn posts</a> about Persona has been “tremendous.” And with a substantial seed investment in the works, Persona will have more than just a vision to attract top talent.
</p><p>
For more details about Persona, we spoke with Persona AI co-founders Nic Radford and Jerry Pratt.
</p><p>
<strong>Why start this company, why now, and why you?</strong>
</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-right" data-rm-resized-container="25%" style="float: right;">
<img alt="" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="96390cf5671fa10555398542fbc22991" data-rm-shortcode-name="rebelmouse-image" id="90434" loading="lazy" src="https://spectrum.ieee.org/media-library/nic-radford.png?id=52515077&width=980" style="max-width: 100%"/>
<small class="image-media media-caption" placeholder="Add Photo Caption..." style="max-width: 100%;">Nic Radford</small>
</p><p>
<strong>Nic Radford: </strong>The idea for this started a long time ago. Jerry and I have been working together off and on for quite a while, being in this field and sharing a love for the potential of humanoids while at the same time being frustrated by where humanoids are at. As far back as probably 2008, we were thinking about starting a humanoids company, but for one reason or another the viability just wasn’t there. We were both recently searching for our next venture and we couldn’t imagine sitting this out completely, so we’re finally going to explore it, although we know better than anyone that robots are really hard. They’re not that hard to build, but they’re hard to make useful and make money with, and the challenge for us is whether we can build a viable business with Persona: can we build a business that uses robots and makes money? That’s our singular focus. We’re pretty sure that this is likely the best time in history to execute on that potential.
</p><p>
<strong>Jerry Pratt: </strong>I’ve been interested in commercializing humanoids for quite a while—thinking about it, and giving it a go here and there, but until recently it has always been the wrong time from both a commercial point of view and a technological readiness point of view. You can think back to the DARPA Robotics Challenge days when we had to wait about 20 seconds to get a good lidar scan and process it, which made it really challenging to do things autonomously. But we’ve gotten much, much better at perception, and now, we can get a whole perception pipeline to run at the framerate of our sensors. That’s probably the main enabling technology that’s happened over the last 10 years.
</p><p>
From the commercial point of view, now that we’re showing that this stuff’s feasible, there’s been a lot more pull from the industry side. It’s like we’re at the next stage of the Industrial Revolution, where the harder problems that went unroboticized from the ’60s until now finally can be. And so, there are really good opportunities in a lot of different use cases.
</p><p>
<strong>A bunch of companies have started within the last few years, and several started even earlier than that. Are you concerned that you’re too late?</strong>
</p><p>
<strong>Radford:</strong> The concern is that we’re still too early! There might only be one Figure out there that raises a billion dollars, but I don’t think that’s going to be the case. There’s going to be multiple winners here, and if the market is as large as people claim it is, you could see quite a diversification of classes of commercial humanoid robots.
</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-right" data-rm-resized-container="25%" style="float: right;">
<img alt="" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="2068a0ff840814f5b57e8900041a465f" data-rm-shortcode-name="rebelmouse-image" id="922cd" loading="lazy" src="https://spectrum.ieee.org/media-library/jerry-pratt.png?id=52515080&width=980" style="max-width: 100%"/>
<small class="image-media media-caption" placeholder="Add Photo Caption..." style="max-width: 100%;">Jerry Pratt</small>
</p><p>
<strong>Pratt:</strong> We definitely have some catching up to do, but we should be able to do that pretty quickly, and I’d say most people really aren’t that far from the starting line at this point. There’s still a lot to do, but all the technology is here now—we know what it takes to put together a really good team and to build robots. We’re also going to do what we can to increase speed, like by starting with a surrogate robot from someone else to get the autonomy team going while building our own robot in parallel.
</p><p>
<strong>Radford:</strong> I also believe that our capital structure is a big deal. We’re taking an anti-stealth approach, and we want to bring everyone along with us as our company grows and give out a significant chunk of the company to early joiners. It was an anxiety of ours that we would be perceived as a me-too and that nobody was going to care, but it’s been the exact opposite, with a compelling response from both investors and early potential team members.
</p><p>
<strong>So your approach here is not to look at all of these other humanoid robotics companies and try to do something they’re not, but instead to pursue similar goals in a similar way in a market where there’s room for all?</strong>
</p><p>
<strong>Pratt:</strong> All robotics companies, and AI companies in general, are standing on the shoulders of giants. These are the thousands of robotics and AI researchers that have been collectively bashing their heads against the myriad problems for decades—some of the first humanoids were walking at
<a href="https://www.youtube.com/watch?v=OMWU3KcSizY" target="_blank">Waseda University in the late 1960s</a>. While there are some secret sauces that we might bring to the table, it is really the combined efforts of the research community that now enables commercialization.
</p><p>
So if you’re at a point where you need something new to be invented in order to get to applications, then you’re in trouble, because with invention you never know how long it’s going to take. What is available here and now, the technology that’s been developed by various communities over the last 50-plus years, gives us all we need for the first three applications that are widely mentioned: warehousing, manufacturing, and logistics. The big question is, what’s the fourth application? And the fifth and the sixth? And if you can start detecting those and planning for them, you can get a leg up on everybody else.
</p><p>
The difficulty is in the execution and integration. It’s a ten-thousand—no, that’s probably too small—a hundred-thousand-piece puzzle where you gotta get each piece right, and occasionally you lose some pieces on the floor that you just can’t find. So you need a broad team that has expertise in like 30 different disciplines to try to solve the challenge of an end-to-end labor solution with humanoid robots.
</p><p>
<strong>Radford:</strong> The idea is like one percent of starting a company. The rest of it, and why companies fail, is in the execution. Things like not understanding the market and the product-market fit, or not understanding how to run the company and the dimensions of the actual business. I believe we’re different because with our backgrounds and our experience we bring a very strong view on execution, and that is our focus on day one. There’s enough interest in the VC community that we can fund this company with a singular focus on commercializing humanoids for a couple of different verticals.
</p><p>
But listen, we’ve got some novel ideas in actuation and other tricks up our sleeve that might be very compelling for this, but we don’t want to emphasize that aspect. I don’t think Persona’s ultimate success comes just from the tech component. I think it comes mostly from ‘do we understand the customer, the market needs, the business model, and can we avoid the mistakes of the past?’
</p><p>
<strong>How is that going to change things about the way that you run Persona?</strong>
</p><p>
<strong>Radford:</strong> I started a company [Houston Mechatronics] with a bunch of research engineers. They don’t make the best product managers. More broadly, if you’re staffing all your disciplines with roboticists and engineers, you’ll learn that it may not be the most efficient way to bring something to market. Yes, we need those skills. They are essential. But there’s so many other aspects of a business that get overlooked when you’re fundamentally a research lab trying to commercialize a robot. I’ve been there, I’ve done that, and I’m not interested in making that mistake again.
</p><p>
<strong>Pratt:</strong> It’s important to get a really good product team that’s working with a customer from day one to have customer needs drive all the engineering. The other approach is ‘build it and they will come’ but then maybe you don’t build the right thing. Of course, we want to build multi-purpose robots, and we’re steering clear of saying ‘general purpose’ at this point. We don’t want to overfit to any one application, but if we can get to a dozen use cases, two or three per customer site, then we’ve got something.
</p><p>
<strong>There still seems to be a couple of unsolved technical challenges with humanoids, including hands, batteries, and safety. How will Persona tackle those things?</strong>
</p><p>
<strong>Pratt:</strong> Hands are such a hard thing—getting a hand that has the required degrees of freedom and is robust enough that if you accidentally hit it against your table, you’re not just going to break all your fingers. But we’ve seen robotic hand companies popping up now that are showing videos of hitting their hands with a hammer, so I’m hopeful.
</p><p>
Getting one to two hours of battery life is relatively achievable. Pushing up towards five hours is super hard. But batteries can now be charged in 20 minutes or so, as long as you’re going from 20 percent to 80 percent. So we’re going to need a cadence where robots are swapping in and out and charging as they go. And batteries will keep getting better.
</p><p>
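A quick back-of-envelope check of that cadence: the minimal sketch below uses our own illustrative numbers rather than anything Persona has specified (a two-hour battery cycled between 20 and 80 percent, with a 20-minute fast charge), and it works out to roughly 1.3 robots per continuously staffed workstation.
</p><pre>
# Hypothetical fleet-sizing arithmetic based on the charging figures
# quoted above. All constants are illustrative assumptions, not
# Persona AI specifications.

BATTERY_LIFE_H = 2.0            # runtime on a full charge (hours)
USABLE_FRACTION = 0.8 - 0.2     # only cycling the 20%-to-80% band
CHARGE_TIME_H = 20 / 60         # fast-charge duration (hours)

work_per_cycle_h = BATTERY_LIFE_H * USABLE_FRACTION  # 1.2 h of work per swap cycle
cycle_h = work_per_cycle_h + CHARGE_TIME_H           # ~1.53 h total per cycle

duty_cycle = work_per_cycle_h / cycle_h              # fraction of time on shift, ~78%
robots_per_station = 1 / duty_cycle                  # ~1.28 robots per station

print(f"Duty cycle: {duty_cycle:.0%}")
print(f"Robots per continuously staffed station: {robots_per_station:.2f}")
</pre><p>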
<strong>Radford:</strong> We do have a focus on safety. It was paramount at NASA, and when we were working on Robonaut, it led to a lot of morphological considerations with padding. In fact, the first concepts and images we have of our robot illustrate extensive padding, but we have to do that carefully, because at the end of the day it’s mass and it’s inertia.
</p><p>
<strong>What does the near future look like for you?</strong>
</p><p>
<strong>Pratt:</strong> Building the team is really important—getting those first 10 to 20 people over the next few months. Then we’ll want to get some hardware and get going really quickly, maybe buying a couple of robot arms or something to get our behavior and learning pipelines going while in parallel starting our own robot design. From our experience, after getting a good team together and starting from a clean sheet, a new robot takes about a year to design and build. And then during that period we’ll be securing a customer or two or three.
</p><p>
<strong>Radford:</strong> We’re also working hard on some very high-profile partnerships that could influence our early thinking dramatically. Like Jerry said earlier, it’s a massive 100,000-piece puzzle, and we’re working on the fundamentals: the people, the cash, and the customers.
</p>]]></description><pubDate>Sun, 30 Jun 2024 13:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/persona-ai-radford-pratt</guid><category>Walking robots</category><category>Humanoid robots</category><category>Robotics startup</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-shadowy-rendering-of-a-humanoid-robot-standing-in-the-darkness.png?id=52520431&width=980"></media:content></item></channel></rss>