Congratulations!

This is a valid RSS feed.

Recommendations

This feed is valid, but interoperability with the widest range of feed readers could be improved by implementing the following recommendations.
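As a quick interoperability check, the feed can be exercised the way a typical aggregator would consume it. The sketch below uses the Python feedparser library against the source URL listed under "Source" further down; the library choice and the printed fields are illustrative assumptions rather than part of the validator's output, but every element it reads (title, link, pubDate, dc:creator, category) appears in the feed source reproduced below.

    # Minimal sketch (assumes "pip install feedparser"): parse the feed as a
    # typical reader would and print a few fields from each item.
    import feedparser

    FEED_URL = "http://feeds.feedburner.com/IeeeSpectrumRoboticsChannel"

    d = feedparser.parse(FEED_URL)

    print("well-formedness problem:", bool(d.bozo))  # set if the parser hit an error
    print("channel title:", d.feed.get("title"))     # <channel><title>

    for entry in d.entries[:3]:
        # pubDate -> published, dc:creator -> author, <category> -> tags
        print(entry.get("published"), "|", entry.get("title"))
        print("  link:  ", entry.get("link"))
        print("  author:", entry.get("author"))
        print("  tags:  ", [t.get("term") for t in entry.get("tags", [])])

A reader with Media RSS support would also pick up the media:content image attached to each item (feedparser surfaces it as entry.media_content), and the CDATA description is exposed as the entry summary.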

Source: http://feeds.feedburner.com/IeeeSpectrumRoboticsChannel

  1. <?xml version="1.0" encoding="utf-8"?>
  2. <rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:media="http://search.yahoo.com/mrss/"><channel><title>IEEE Spectrum</title><link>https://spectrum.ieee.org/</link><description>IEEE Spectrum</description><atom:link href="https://spectrum.ieee.org/feeds/topic/robotics.rss" rel="self"></atom:link><language>en-us</language><lastBuildDate>Tue, 28 Oct 2025 15:17:43 -0000</lastBuildDate><image><url>https://spectrum.ieee.org/media-library/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8yNjg4NDUyMC9vcmlnaW4ucG5nIiwiZXhwaXJlc19hdCI6MTc2MzA3MTQzOX0.SxRBIud_XE2YWQFaIJD9BPB1w-3JsFhiRkJIIe9Yq-g/image.png?width=210</url><link>https://spectrum.ieee.org/</link><title>IEEE Spectrum</title></image><item><title>Free-Floating Robots Find Ocean’s Carbon Storage Is Struggling</title><link>https://spectrum.ieee.org/ocean-robots-mbari-bgc-argo</link><description><![CDATA[
  3. <img src="https://spectrum.ieee.org/media-library/submarine-periscope-partially-submerged-in-calm-water.jpg?id=61944636&width=1245&height=700&coordinates=0%2C321%2C0%2C322"/><br/><br/><p>The surface ocean is a busy place, with ships crossing, storms churning, and satellites monitoring everything from above. But below the top 1,000 meters, a hidden fleet of robotic devices is listening for signs of stress inside the planet’s largest life-support system. </p><p>According to new research <a href="https://www.mbari.org/news/marine-heatwaves-have-hidden-impacts-on-ocean-food-webs-and-carbon-cycling/" rel="noopener noreferrer" target="_blank">published in <em>Nature Communications</em></a>, marine heatwaves are interfering with the ocean’s ability to transport carbon from surface waters into the deep, where it can be stored long-term. That science depends entirely on autonomous “biogeochemical” profiling floats that drift and dive through the ocean, collecting data in near-real time as part of the U.S.-led <a href="https://www.go-bgc.org/" rel="noopener noreferrer" target="_blank">Global Ocean Biogeochemical (GO-BGC) Array</a>, headed by the Monterey Bay Aquarium Research Institute (MBARI) in California. </p><p>These cylindrical, pressure-resistant devices are encased in aluminum and packed with bio-optics, a GPS/Iridium antenna, and lithium or hybrid batteries. They monitor key biological, physical, and chemical properties—hence their biogeochemical name—<a href="https://www.frontiersin.org/journals/marine-science/articles/10.3389/fmars.2024.1358042/full" rel="noopener noreferrer" target="_blank">including</a> oxygen, pH, nitrate, suspended particles, chlorophyll, and temperature, conductivity, and depth. MBARI has deployed <a href="https://www3.mbari.org/gobgc/tables/GOBGC_float_performance.html" rel="noopener noreferrer" target="_blank">more than 330</a> robots worldwide with advanced biogeochemical sensors, joining a larger fleet of 4,000-plus <a href="https://globalocean.noaa.gov/resource/science-on-a-sphere-dataset-argo-floats-by-country/" target="_blank">Argo</a> floats across an international network that started <a href="https://goosocean.org/news/celebrating-25-years-of-argo-a-pillar-of-the-global-ocean-observing-system/" rel="noopener noreferrer" target="_blank">26 years ago</a>.</p><p class="ieee-inbody-related">RELATED: <a href="https://spectrum.ieee.org/ocean-engineering-robots-climate" target="_blank">40,000 Robots Roam the Oceans, Climate in Their Crosshairs</a></p><p>“I describe them as measuring the metabolism of the ocean,” says MBARI Senior Scientist <a href="https://www.mbari.org/person/ken-johnson/" rel="noopener noreferrer" target="_blank">Ken Johnson</a>, who co-authored <a href="https://www.nature.com/articles/s41467-025-63605-w" rel="noopener noreferrer" target="_blank">the new study</a> and serves as lead principal investigator for the GO-BGC program. “If you aren’t feeling well and you go to the hospital, they don’t immediately throw you in for an MRI. They take your vital signs, and that’s what these floats do.” </p><h2>Ocean Carbon Cycle Tracking</h2><p>Understanding <a data-linked-post="2650255631" href="https://spectrum.ieee.org/capturing-climate-change" target="_blank">how far carbon-rich particles sink</a> is central to tracking the ocean’s carbon cycle, its metabolic engine. 
<a href="https://spectrum.ieee.org/ocean-engineering-robots-climate" target="_blank">BGC-Argo floats</a> can detect oxygen levels deep in the ocean, helping scientists pinpoint where and how bacteria are breaking down sinking organic matter. In the Gulf of Alaska, carbon rarely makes it very deep before returning to the atmosphere. But in the <a href="https://en.wikipedia.org/wiki/Southern_Ocean" target="_blank">Southern Ocean</a>, it spreads far deeper, making that region a more powerful carbon sink.</p><p>It has been historically nearly impossible to monitor the full depth of the ocean’s carbon-transport processes continuously. Satellite sensors are largely <a href="https://www.frontiersin.org/journals/remote-sensing/articles/10.3389/frsen.2024.1495958/full" rel="noopener noreferrer" target="_blank">limited</a> to the surface and upper sunlit layer of the ocean, and cannot directly observe deeper water columns. High-precision ship-based surveys provide detailed data but are restricted by schedules, weather, and cost.</p><p>“The key is having these chemical and biological sensors running in the background, telling you how one year is different from the next,” Johnson says. “You just can’t understand how the ocean responds to multiple heatwaves by going out on a ship for a couple of weeks. But the floats will do it all year round, even on Christmas and Thanksgiving, and in the winter when the weather is terrible and no one wants to be out there. The longest cruise I’ve been on was 58 days. That was long enough. People don’t want to spend years on a ship.”</p><p>Although MBARI’s robots can capture a wider range of data than satellites and ships, they aren’t meant to be replacements. Johnson says there’s synergy between the three: “Satellites only see a few things, but the floats see more, and then the ship sees even more. When you put them all together, each thing makes the other better and gives you more understanding.” </p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Graphic of underwater sensor data cycle and transmission process via satellite." class="rm-shortcode" data-rm-shortcode-id="f1ee931b9c20b6328905073df2f4884b" data-rm-shortcode-name="rebelmouse-image" id="1d507" loading="lazy" src="https://spectrum.ieee.org/media-library/graphic-of-underwater-sensor-data-cycle-and-transmission-process-via-satellite.jpg?id=61944710&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">BGC-Argo robots drift below the ocean surface to collect data and periodically ascend to transmit it via satellite.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Kim Fulton-Bennett/MBARI</small></p><h2>How Do Biogeochemical Robots Work? </h2><p>In a typical cycle, BGC-Argo floats drop to roughly 1,000 meters and drift for 10 days, following a specific water mass. Each float has a central processor that synchronizes readings from the onboard sensors. A buoyancy pump expands and contracts an external oil bladder, letting it dive to 2,000 meters before rising again and collecting continuous measurements on the way up. </p><p>When its antenna reaches the surface, the float transmits its data through the Iridium satellite network, and immediately sinks again. Data is posted publicly within a day as part of international agreements allowing entry into other countries’ economic zones. 
</p><p>Although the floats are generally autonomous for their pre-programmed data-collection missions, researchers can remotely adjust certain parameters, like <a href="https://argo.ucsd.edu/how-do-floats-work/argo-cycle-timing-variables/" target="_blank">cycle timing</a>, via satellite. This control can be useful for targeted coverage during hurricanes or volcanic eruptions, Johnson says.</p><p><a href="https://today.ucsd.edu/story/nsf-grants-53-million-to-create-a-global-fleet-of-robotic-floats-to-monitor-ocean-health" rel="noopener noreferrer" target="_blank">Funded by a US $53 million National Science Foundation grant awarded in 2020</a>, MBARI developed and calibrated the floats’ key BGC sensors, including the SeaFET Ocean pH technology now used worldwide. The University of Washington built the floats in <a href="https://www.teledynemarine.com/products/product-line/profiling-floats" rel="noopener noreferrer" target="_blank">partnership with Teledyne Webb Research</a>, contributing part kits and fabrication. Before any update goes live, Johnson says the University of Washington team runs simulations on accelerated timescales, stress-testing floats to identify potential failure modes. </p><p>Each float has a lifetime of about 250 vertical dive-drift-rise profiles, lasting up to seven years. “We lose about 5 percent every year for isolated reasons like corrosion or connection problems. Sometimes they get run down by ships when they’re at the surface, or they get stuck at the bottom,” Johnson says. </p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Map showing locations of Argo floats worldwide, color-coded by program." class="rm-shortcode" data-rm-shortcode-id="e217f53bfe7c76e654cb8fdcbc257eaa" data-rm-shortcode-name="rebelmouse-image" id="2276c" loading="lazy" src="https://spectrum.ieee.org/media-library/map-showing-locations-of-argo-floats-worldwide-color-coded-by-program.jpg?id=61944761&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">BGC-Argo floating robots have been deployed all over the global ocean to monitor its health.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Ken Johnson/GO-BGC Project</small></p><h2>What the Robots Reveal</h2><p>MBARI’s new study in <em>Nature Communications </em>used the floats to observe the aftermath of a massive North Pacific marine heatwave in the Gulf of Alaska from 2013 to 2015 (called “<a href="https://en.wikipedia.org/wiki/The_Blob_(Pacific_Ocean)" target="_blank">The Blob</a>”) and its 2019 to 2020 successor. The researchers paired float readings with seasonal data from ship-based surveys tracking plankton pigments and environmental DNA from seawater samples collected by Fisheries and Oceans Canada’s <a href="https://www.dfo-mpo.gc.ca/science/data-donnees/line-p/index-eng.html" target="_blank">Line P program</a>. </p><p>Plankton lifecycles are critical to how Earth stores carbon dioxide (CO<sub>2</sub>). When plankton grow at the surface and die, or are eaten by other plankton or fish, the resulting organic material falls through the water column as tiny particles or fecal pellets. “One of the big questions in carbon-climate science is how deep does the carbon from plankton travel?” Johnson says. “If that carbon only goes 100 meters, it’ll get remineralized by the bacteria, turned back into CO<sub>2</sub>, and it’ll just mix back into the atmosphere. It doesn’t really sequester that CO<sub>2</sub> away. 
But if the material sinks 2 kilometers, it’s gone out of contact with the atmosphere for hundreds of years.” </p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="1a81a9b7fd8dc787549e793d4e2a9b1d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/jth1Wrohffc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> </p><p>Johnson adds, “To me, the takeaway is that these heatwaves cause changes in ecosystem structure—in the plankton and how they operate—and these shifts in carbon export and how the ocean sequesters carbon are changing the services the ocean provides to us in ways we hadn’t really appreciated. The ocean gives us seafood, it absorbs about 95 percent of the anthropogenic heat in the atmosphere, it stores a bunch of CO<sub>2</sub>. We can now see that its ability to continue providing those services isn’t a given. It can be altered by a heatwave.” </p><p>MBARI’s team is <a href="https://www.mbari.org/news/new-ai-approach-sharpens-picture-of-carbon-export-in-the-southern-ocean/" target="_blank">applying machine learning techniques</a> to extract new biogeochemical insights. In an August study in <em><a href="https://agupubs.onlinelibrary.wiley.com/doi/10.1029/2024GB008371" target="_blank">Global Biogeochemical Cycles</a></em>, they used a neural network on BGC-Argo float data to show that nitrate production has been rising throughout the Southern Ocean for more than two decades. That region is central for carbon uptake and regulating global nutrient distribution.</p><p>The program’s future isn’t guaranteed without additional support. The $53 million NSF grant that built the U.S. BGC-Argo fleet expires this year, and Johnson says no continuation funding has been secured yet. </p>]]></description><pubDate>Tue, 28 Oct 2025 15:12:14 +0000</pubDate><guid>https://spectrum.ieee.org/ocean-robots-mbari-bgc-argo</guid><category>Carbon sequestration</category><category>Climate change</category><category>Climate tech</category><category>Oceanography</category><category>Robots</category><category>Underwater robots</category><dc:creator>Shannon Cuthrell</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/submarine-periscope-partially-submerged-in-calm-water.jpg?id=61944636&amp;width=980"></media:content></item><item><title>Video Friday: Unitree’s Human-Size Humanoid Robot</title><link>https://spectrum.ieee.org/video-friday-human-size-robot</link><description><![CDATA[
  4. <img src="https://spectrum.ieee.org/media-library/stylish-humanoid-robot-walking-in-marble-hallway-dressed-in-brown-hooded-outfit-also-human-walking-next-to-humanoid-dressed-in.png?id=61899164&width=1245&height=700&coordinates=0%2C178%2C0%2C179"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://roscon.ros.org/2025/">ROSCon 2025</a>: 27–29 October 2025, SINGAPORE</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="eudbifkmh-m"><em>Welcome to this world—standing 180 cm tall and weighing 70 kg. The H2 bionic humanoid—born to serve everyone safely and friendly.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cda5bfafb1697b2d631112ed6bfda776" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/eUdBIFkMh-M?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Starting at US$29,900 plus tax and shipping.</p><p>[ <a href="https://www.unitree.com/H2/">Unitree</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="kfoeqnazwys">The title of this one, “Eagle Stole our FPV Drone,” pretty much sums it up.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cf237f6c57dd27b661a2a8a14c8d731a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/kFOeQNazWys?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.team-blacksheep.com/">Team BlackSheep</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="s1fy0wiug_k"><em>Historically, small robots couldn’t have arms because the necessary motors made them too heavy. We addressed this challenge by replacing multiple motors with a single motor and miniature electrostatic clutches. 
This innovation allowed us to create a high-DOF, lightweight arm for small robots, which can even hitch onto a drone.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="fb7c5660d72255cd4b25d337b6e2eacf" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/S1fy0WiUg_k?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://advanced.onlinelibrary.wiley.com/doi/10.1002/aisy.202500625">Seoul National University</a> ]</p><p>Thanks, Kyu-Jin!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="zlwxrbqic78">Just FYI, any robot that sounds like a tasty baked good is guaranteed favorable coverage on Video Friday.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="dd53e476b509d89926ee763372a2d099" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zlwxrBqic78?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://cleorobotics.com/">Cleo Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="x9zf3fq2ccs"><em>Oli now pulls off a smooth, coordinated whole-body sequence from lying down to getting back up. Standing 165 cm tall and powered by 31 degrees of freedom, Oli continues to demonstrate natural and fluid motion.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1f39ac24203f782c4dec9679c0157310" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/x9ZF3fq2Ccs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.limxdynamics.com/en/oli">LimX Dynamics</a> ]</p><p>Thanks, Jinyan!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="d4rswuynrz8">Friend o’ the blog <a data-linked-post="2650278258" href="https://spectrum.ieee.org/robotic-dreams-robotic-realities" target="_blank">Bram Vanderborght</a> tours the exhibit floor at IROS 2025 in Hanghzou, China.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="adb689328811cd68cfaffec66b38dad9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/D4RSwuYnRz8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.iros25.org/">IROS 2025</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="l-yliih8e8w"><em>In a fireside chat with Professor Sam Madden, Tye Brady, Chief Technologist at <a data-linked-post="2671890507" href="https://spectrum.ieee.org/amazon-robotics-vulcan-warehouse-picking" target="_blank">Amazon Robotics</a>, will discuss the trajectory of robotics and how <a data-linked-post="2667013016" href="https://spectrum.ieee.org/what-is-generative-ai" target="_blank">generative AI</a> plays a role in robotics innovation.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span 
class="rm-shortcode" data-rm-shortcode-id="14c1e6508222c23a8ee248465d0b7f9d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/L-Yliih8e8w?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://genai.mit.edu/mit-generative-ai-symposium/">MIT Generative AI Impact Consortium</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="fnlceacjdue">Prof. Dimitrios Kanoulas gave an invited talk at the Workshop on The Art of Robustness: Surviving Failures in Robotics at IROS 2025.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="93132153ca5e8799ad82ddebf70acf82" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FNlceacjduE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://art-of-robustness-iros2025.github.io/">IROS 2025</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="wz1qzzy4iaw">This University of Pennsylvania GRASP talk is by Suraj Nair from Physical Intelligence, on “Scaling Robot Learning with Vision-Language-Action Models.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="229327e24d4780e4700b411bfc29dd33" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wz1QzZY4Iaw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>The last several years have witnessed tremendous progress in the capabilities of AI systems, driven largely by foundation models that scale expressive architectures with diverse data sources. While the impact of this technology on vision and language understanding is abundantly clear, its use in robotics remains in its infancy. Scaling robot learning still presents numerous open challenges—from selecting the right data to scale, to developing algorithms that can effectively fit this data for closed-loop operation in the physical world. At Physical Intelligence, we aim to tackle these questions. 
This talk will present our recent work on building vision-language-action models, covering topics such as architecture design, data scaling, and open research directions.</em></blockquote><p>[ <a href="https://www.grasp.upenn.edu/events/fall-2025-grasp-sfi-suraj-nair/">University of Pennsylvania GRASP Laboratory</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 24 Oct 2025 18:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-human-size-robot</guid><category>Video friday</category><category>Robotics</category><category>Unitree</category><category>Humanoid robots</category><category>Drones</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/stylish-humanoid-robot-walking-in-marble-hallway-dressed-in-brown-hooded-outfit-also-human-walking-next-to-humanoid-dressed-in.png?id=61899164&amp;width=980"></media:content></item><item><title>Video Friday: Multimodal Humanoid Walks, Flies, Drives</title><link>https://spectrum.ieee.org/video-friday-multimodal-robot</link><description><![CDATA[
  5. <img src="https://spectrum.ieee.org/media-library/caltech-humanoid-robot-bent-over-at-the-bipedal-attachment-point-with-wheeled-drone-on-its-back-exits-building.png?id=61767773&width=1245&height=700&coordinates=0%2C101%2C0%2C101"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="f8dwbwcvz0c"><em>Caltech’s <a data-linked-post="2650277975" href="https://spectrum.ieee.org/caltech-building-agile-humanoid-robot-by-combining-legs-with-thrusters" target="_blank">Center for Autonomous Systems and Technologies</a> (CAST) and the Technology Innovation Institute in Abu Dhabi, UAE, recently conducted a demonstration of X1, a multirobot system developed as part of a three-year collaboration between the two institutes. During the demo, M4, a multimodal robot developed by CAST, launches in drone mode from a humanoid robot’s back. It lands and converts into driving mode and then back again, as needed. The demonstration underscored the kind of progress that is possible when engineers from multiple institutions at the forefront of autonomous systems and technologies truly collaborate.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="187bd44c287f1358622dadd4160d7f71" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/F8DwBWCVZ0c?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://cast.caltech.edu/news/caltech-and-technology-innovation-institute-demo-multirobot-response-team">Caltech Center for Autonomous Systems and Technologies</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="nm_zhzp8nqa"><em>Spot performs dynamic whole-body manipulation using a combination of reinforcement learning and sampling-based control. Behavior shown in the video is fully autonomous, including the dynamic selection of contacts on the arm, legs, and body, and coordination between the manipulation and locomotion processes.  The tire weighs 15 kilograms (33 pounds), making its mass and inertial energy significant compared to the weight of the robot.  
An external motion-capture system was used to simplify perception, and an external computer linked by Wi-Fi performed the intensive computational operations.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0b7539ac3f9c0bdcf86918a96a85ac0a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/nM_ZHzp8nQA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p><a data-linked-post="2652903997" href="https://spectrum.ieee.org/boston-dynamics-spot-robot-arm" target="_blank">Spot’s arm</a> is stronger than I thought. Also, the arm-foot collaboration is pretty wild.</p><p>[ <a href="https://rai-inst.com/resources/blog/combining-sampling-and-learning-for-dynamic-whole-body-manipulation/">Robotics and AI Institute</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="eu5mymavctm"><em>Figure 03 represents an unprecedented advancement in taking humanoid robots from experimental prototypes to deployable, scalable products. By uniting advanced perception and tactile intelligence with home-safe design and mass-manufacturing readiness, Figure has built a platform capable of learning, adapting, and working across both domestic and commercial settings. Designed for Helix, the home, and the world at scale, Figure 03 establishes the foundation for true general-purpose robotics, one capable of transforming how people live and work.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cd2ac38fb023a61b0da0b44ce273a88c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Eu5mYMavctM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>The kid and the dog in those clips make me very, very nervous.</p><p>[ <a href="https://www.figure.ai/news/introducing-figure-03">Figure</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="6jbfpu-0pyw"><em>Researchers have invented a new superagile robot that can cleverly change shape thanks to amorphous characteristics akin to the popular Marvel antihero Venom. 
Researchers used a special material called electro-morphing gel (e-MG) which allows the robots to show shape-shifting functions, allowing them to bend, stretch, and move in ways that were previously difficult or impossible, through manipulation of electric fields from ultralightweight electrodes.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="babff98421e893b4cf7b53e3ced238b4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/6jbFPu-0Pyw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.bristol.ac.uk/news/2025/october/soft-robotics-breakthrough.html">University of Bristol</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="uvtp0aprdzy">This is very preliminary of course, but I love the idea of quadrupedal robots physically assisting each other to surmount obstacles like this.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6a4f1b4e80d38cf5af6c25f3c012ec39" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UvtP0aPrdzY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://rpl-as-ucl.github.io/">Robot Perception and Learning Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="o5gphcrjx98">Have we reached peak dynamic humanoid yet?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="470de2453e44326494fc797f1a9b206d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/O5GphCrjx98?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/g1">Unitree</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="txyc9b1oflu"><em>Dynamic manipulation, such as robots tossing or throwing objects, has recently gained attention as a novel paradigm to speed up logistic operations. However, the focus has predominantly been on the object’s landing location, irrespective of its final orientation. In this work, we present a method enabling a robot to accurately “throw-flip” objects to a desired landing pose (position and orientation).</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4e21c8165953d5dac1190c6b6afe309f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/txYc9b1oflU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.epfl.ch/labs/lasa/">LASA</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="mqgbegxnctu">I don’t care all that much about “industry-oriented” quadrupeds. 
I do care very much about “ridable” quadrupeds.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a720f9efe62c51b69ab2fad02b7278f5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/MqGBEGXnctU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.magiclab.top/en">MagicLab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="y4wbmr1hlx0">I am not yet at the point where I would trust any humanoid around priceless ancient relics. Any humanoid, not just the robotic ones.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="69d8480e7a8976602d14a69baa6ee3bb" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/y4wBmR1hLx0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.limxdynamics.com/en/oli">LimX</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="wqfjcuu2ici">This Carnegie Mellon University RI Seminar, “A Manipulation Journey,” is presented by Matt Mason, professor emeritus at CMU. </p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b0e189236024641be80a5f9c9aac2a34" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/WQFJcuU2ICI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>The talk will revisit my career in manipulation research, focusing on projects that might offer some useful lessons for others. We will start with my beginnings at the MIT AI Lab and my MS thesis, which is still my most-cited work, then continue with my arrival at CMU, a discussion with Allen Newell, an exercise to envision a coherent research program, and how that led to a second and third childhood. 
The talk will conclude with some discussion of lessons learned.</em></blockquote><p>[ <a href="https://www.ri.cmu.edu/event/a-manipulation-journey/">Carnegie Mellon University Robotics Institute</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="dtgsw3hjtzi"><em>Christian Hubicki highlights and explains the past year of humanoid robotics research and news.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a8fbca3ee0499a0d72a9cb30bca32845" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dtGsw3hJtZI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.optimalroboticslab.com/">Florida State University</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="qoet-5gbbcm">More excellent robotics discussions from <a data-linked-post="2669217747" href="https://spectrum.ieee.org/icra40-conference" target="_blank">ICRA@40</a>.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5c7fdcbd59586c274424e6475de8b884" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/qOET-5GbBcM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="146c49fc3d2601786bfa83cba90d51cf" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5e6QAMUQCsg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="6ffa7c4e081863330f1266bcf20026ae" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pnNK-PwTM9k?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://icra40.ieee.org/">ICRA@40</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 17 Oct 2025 16:30:05 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-multimodal-robot</guid><category>Robotics</category><category>Video friday</category><category>Humanoid robots</category><category>Multimodal robots</category><category>Quadruped robots</category><category>Manipulation</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/caltech-humanoid-robot-bent-over-at-the-bipedal-attachment-point-with-wheeled-drone-on-its-back-exits-building.png?id=61767773&amp;width=980"></media:content></item><item><title>How Roomba Got Its Vacuum</title><link>https://spectrum.ieee.org/irobot-roomba-history</link><description><![CDATA[
  6. <img src="https://spectrum.ieee.org/media-library/roomba-vacuum-signed-by-irobot-team-showing-control-buttons-on-wooden-floor.jpg?id=61715049&width=1245&height=700&coordinates=0%2C274%2C0%2C274"/><br/><br/><div class="intro-text"><div class="ieee-sidebar-large"><p>Adapted from <a href="https://dancingwithroomba.com/" rel="noopener noreferrer" target="_blank"><span><em>Dancing With Roomba</em></span></a>, written by Joe Jones, who was iRobot’s first full-time employee and the original designer of the Roomba robot vacuum.</p></div><p><em>After developing a prototype robot that was effective at cleaning both hard floors and carpets using a relatively simple </em><a href="https://en.wikipedia.org/wiki/Carpet_sweeper" target="_blank"><em><em>carpet-sweeping mechanism</em></em></a><em>, iRobot vice president Winston Tao and the iRobot marketing team have organized a focus group so that Roomba’s engineers can witness the reaction of potential first customers.</em></p></div><p class="drop-caps"><strong>One pleasant midsummer day</strong> in 2001, Roomba’s engineers, Winston Tao, and several other iRobot folk rendezvoused at an unremarkable, multistory office building on the Cambridge side of the Charles River, across from Boston. We assembled in a narrow room. A long table occupied the room’s center. Snacks and sodas were set out along the back wall; the lighting was subdued. The dominant feature of this cramped chamber was a big one-way mirror occupying almost the entire front wall. Sitting at the table, one could see through the looking glass into a wonderland of market research on the other side. In that much larger, brightly lit room were comfortable chairs, an easel with a large pad of paper, and our hired facilitator. Although this was a familiar trope I’d seen a hundred times on TV, actually lurking in an observation room like this felt a touch surreal.</p><h2>iRobot vs Focus Group</h2><p>We’d paid maybe US $10,000 for the privilege of setting up some focus groups—probably the most the company had ever spent on a market research event. But we needed to know how potential customers would react to our Roomba prototype when they saw one in the (plastic) flesh cleaning the floor at their feet. At the appointed hour, our facilitator welcomed eight to 10 bona fide ordinary people as they filed into the large room and sat in the chairs. Our mind-child was about to receive its first critical judgment from strangers.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"><a href="https://www.routledge.com/Dancing-with-Roomba-Cracking-the-Robot-Riddle-and-Building-an-Icon/Jones/p/book/9781032890616"></a><a class="shortcode-media-lightbox__toggle shortcode-media-controls__button material-icons" style="background: gray;" title="Select for lightbox">aspect_ratio</a><img alt="Book cover: Roomba with signatures and red dance shoes on wooden floor." 
class="rm-shortcode" data-rm-shortcode-id="e46182afb80a0087b4101620b788b005" data-rm-shortcode-name="rebelmouse-image" id="aa4ad" loading="lazy" src="https://spectrum.ieee.org/media-library/book-cover-roomba-with-signatures-and-red-dance-shoes-on-wooden-floor.jpg?id=61715094&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">This article was adapted from the author’s new book, <a href="https://www.routledge.com/Dancing-with-Roomba-Cracking-the-Robot-Riddle-and-Building-an-Icon/Jones/p/book/9781032890616" target="_blank">Dancing with Roomba: Cracking the Robot Riddle and Building an Icon</a>  (Routledge 2025).</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Joe Jones</small></p><p><span>The facilitator prepared participants by encouraging them to state their honest views and not to be swayed by the comments of others. “You are the world’s expert in your own opinion,” she told them.</span></p><p>At first the facilitator described Roomba without showing the group any photos or the device itself. She was met with skepticism that such a thing would actually work. Then she demonstrated one of the prototypes we had prepared for the event. As participants watched Roomba go about its business on both carpets and hard floors, their doubts ebbed. Even those who stated that they would never purchase such a device couldn’t help being intrigued. As the group discussion proceeded, soccer moms (representing “early mass-market adopters”) emerged as the most interested. They saw Roomba as a time-saver. This surprised and pleased us, as we’d expected the much smaller market of gadget geeks would be the first to fall in love.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Transparent blue Roomba vacuum on wooden floor near a doorway." class="rm-shortcode" data-rm-shortcode-id="e9aa4054690029ada17e3e695b95d088" data-rm-shortcode-name="rebelmouse-image" id="2503f" loading="lazy" src="https://spectrum.ieee.org/media-library/transparent-blue-roomba-vacuum-on-wooden-floor-near-a-doorway.jpg?id=61715051&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">iRobot built about 20 of its third major Roomba prototype, the T100, all with 3D-printed shells.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Joe Jones</small></p><p>But we could take neither interest nor love to the bank. We needed to know how much customers would pay. Our facilitator eased into that part of the gathering’s proceedings. She did not inquire directly but rather asked, “If you saw this product in a store, what would you expect the price to be?”</p><p><span>The focus group’s responses were all over the map. Some people mentioned a price close to the $200 we intended to charge. A few folks we regarded as saints-in-training expected an even higher number. But most were lower. One woman said she’d expect Roomba to be priced at $25. Later when asked what she thought a replacement battery might cost, she said, “$50.” That hurt. For this lady, attaching our robot to a battery devalued the battery.</span></p><h2>Floor Cleaner or Robot?</h2><p>Throughout the proceedings our facilitator had been careful to leave a couple of things unmentioned. 
First, she never referred to Roomba as a robot, calling it instead an “automatic floor cleaner.” Three separate groups, comprising an aggregate of around two dozen people, gave their opinions that day. Of these, only two individuals spontaneously applied the term “robot” to Roomba.</p><p>The second unmentioned characteristic was the nature of Roomba’s cleaning mechanism. That is, the facilitator had revealed no details about how it worked. Participants had seen the demo, they observed Roomba cleaning effectively, they had given their opinion about the price. They’d all assumed that a vacuum was at work, several used that term to refer to the robot. But now the facilitator told them, “Roomba is a carpet sweeper, not a vacuum.” Then she asked again what they would expect to pay. On average, focus-group members from all three groups cut their estimates in half. Participants who had previously said $200 now said $100.</p><p>The focus group’s brutal revaluation exploded our world. The enabling innovation that made the energy budget work, that made Roomba technically and economically feasible, was cleaning with a carpet sweeper rather than a vacuum. People had seen that the carpet-sweeper-Roomba really did work. Yet they chose to trust conventional wisdom about vacuums versus carpet sweepers rather than their own apparently lying eyes. If we were forced to cut the robot’s price in half, we would lose money on every unit sold, and there would be no Roomba.</p><p>At the end of the evening, before any member of our stunned team could stagger out the door, Winston said simply, “Roomba has to have a vacuum.” A shotgun wedding was in the offing for bot and vac.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt='Round white robot with cartoon puppy sticker, named "Scamp," on a desk.' class="rm-shortcode" data-rm-shortcode-id="6c9f28c57c67149e1cc654b33a9d18b4" data-rm-shortcode-name="rebelmouse-image" id="4d0a6" loading="lazy" src="https://spectrum.ieee.org/media-library/round-white-robot-with-cartoon-puppy-sticker-named-scamp-on-a-desk.jpg?id=61715054&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">Scamp, the earliest Roomba prototype, was built in 1999.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Joe Jones</small></p><p><span>The next day at work we gathered to discuss the focus group’s revelation. A half-hearted attempt or two to deny reality quickly faded—electrical engineer Chris Casey saw to that—and we accepted what we needed to do. But changing things now would be a huge challenge in multiple ways. We were deep into development, closer to launch than kickoff. All the electrical power our battery could supply was already spoken for. None was available for a new system that would likely be more power hungry than all the robot’s other systems combined. And where could we put a vacuum? All the space in the robot was also fully assigned. Our mandate to clean under furniture and between chair legs wouldn’t let us make the robot any bigger.</span></p><h2>Making Roomba a Vacuum</h2><p>One escape hatch beckoned, but no one was eager to leap through it. Chris articulated what we were all thinking. “We could build a vestigial vacuum,” he said. That is, we could design a tiny, pico-power vacuum—one that consumes almost no power and does almost nothing—strap it on the robot, and call it done. Perversely, that seemed reasonable. The robot already cleaned the floor well; our cleaning tests proved it. 
Customers, however, didn’t know that. They were all steeped in the dogma of vacuum supremacy. Reeducating the masses wasn’t possible—we didn’t have the funds. But if we could assert on the box that Roomba had a vacuum, then everyone would be satisfied. We could charge the price that makes our economics work. Customers would deem that cost reasonable and wouldn’t have to unlearn their vacuum bias.</p><p>But it felt wrong. If we must add a new system to the robot, we wanted it—like all the other systems—to earn its keep honestly, to do something useful. Through further discussion and calculation, we concluded that we could afford to devote about 10 percent of the robot’s 30-watt power budget to a vacuum. Conventional manual vacuums typically gorged themselves on 1,200 watts of power, but if we could develop a system that provided useful cleaning while consuming only 3 W (0.25 percent of 1,200) then we would feel good about adding it to the robot. It just didn’t seem very likely.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Black and red Roomba vacuum on a gray carpet next to a plaid multicolored blanket." class="rm-shortcode" data-rm-shortcode-id="37a05aa9ece9ef666d0833dc8feaa75b" data-rm-shortcode-name="rebelmouse-image" id="35719" loading="lazy" src="https://spectrum.ieee.org/media-library/black-and-red-roomba-vacuum-on-a-gray-carpet-next-to-a-plaid-multicolored-blanket.jpg?id=61715075&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">iRobot built two identical second-generation Roomba prototypes, named Kipper and Tipper, one of which is shown here.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Joe Jones</small></p><p><span>I sometimes find that solving a problem is largely a matter of staring at the problem’s source. Gaze long and intently enough at something and, Waldo-like, the solution may reveal itself. So I took one of the team’s manual vacuums and stared at it. What exactly made it use as much power as it did? I knew the answer was partly marketing rather than reality. There was no simple, objective way to compare cleaning efficacy between vacuums. Lacking a results-based method, shoppers looked at inputs. For example, a vacuum with a 10-ampere motor sounds as though it should clean better than a vacuum with a 6-amp motor. But the bigger number might only mean that the manufacturer with the 10-amp claim was using a less-efficient motor—the 6-amp (720-W) motor might clean just as well.</span></p><p>But even when you corrected for the amperage arms race, a vacuum was still a power glutton. Staring at the vacuum cleaner, I began to see why. The vacuum fixed in my gaze that day used the standard configuration: a cylindrical beater brush occupied the center of a wide air inlet. A motor, attached by a belt, spun the brush. Another motor, deeper in the machine, drove a centrifugal blower that drew air in through the inlet. To keep dirt particles kicked up by the beater brush entrained in the airstream, the air needed to move fast. The combination of a wide inlet and high velocity meant that every second the vacuum motor had to gulp a huge volume of air.</p><p>Accelerating all that air took considerable power—the physics was inescapable. If we wanted a vacuum that sipped power rather than guzzled it, we had to move a much smaller volume of air per second. We could accomplish that—without reducing air velocity—if, instead of a wide inlet, we used a narrow one. 
To match the manual vacuum’s air velocity using only a 3-W motor, I computed that we would need a narrow opening indeed: only a millimeter or two.</p><p>That instantly disqualified Roomba from using the standard vacuum configuration—we could not put our bristle brush in the middle of the air inlet. That would require an inlet maybe 20 times too wide. We’d have to find another arrangement.</p><h2>A Micro Vacuum that Doesn’t Suck</h2><p>To test the narrow-inlet idea I turned to my favorite prototyping materials: cardboard and packing tape. Using these, I mocked up my idea. The inlet for my test vacuum was as long as Roomba’s brush but only about 2 millimeters wide. To provide suction I repurposed the blower from a defunct heat gun. Then I applied my jury-rigged contraption to crushed Cheerios and a variety of other dirt stand-ins. My novel vacuum was surprisingly effective at picking up small debris from a hard surface. Using an anemometer to measure the speed of the air rushing through my narrow inlet showed that it was, as desired, as fast as the airstream in a standard vacuum cleaner.</p><p class="ieee-inbody-related">RELATED: <a href="https://spectrum.ieee.org/south-pole-roombas" target="_self">Roombas at the End of the World</a></p><p>The next step was to somehow shoehorn our microvacuum into Roomba. To form the narrow inlet we used two parallel vanes of rubber. Small rubber bumps protruding from one vane spanned the inlet, preventing the vanes from collapsing together when vacuum was applied. We placed the air inlet parallel to and just behind the brush. The only plausible space for the vacuum <a href="https://en.wikipedia.org/wiki/Impeller" target="_blank">impeller</a>, motor, and filter (needed to separate the dirt from the flowing air) was to take over a corner of the dust cup. Drawing on his now well-honed skills of packing big things into tiny spaces where they had no business fitting, mechanical engineer Eliot Mack managed somehow to accomplish this. But we did get help from an outside consultant to design the intricate shape the impeller needed to move air efficiently.</p><p>In general, regular vacuums perform better on carpet than on hard floors. But Roomba inverted that relationship. Our vacuum operated like a squeegee, pulling dirt from tile, linoleum, and wooden floors. But it was less effective on other surfaces. The sweeper mechanism did the heavy lifting when cleaning carpet.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Silver and gray Roomba robotic vacuum on a hardwood floor." class="rm-shortcode" data-rm-shortcode-id="62663496ca8834aa17c67256efeac12e" data-rm-shortcode-name="rebelmouse-image" id="9e12a" loading="lazy" src="https://spectrum.ieee.org/media-library/silver-and-gray-roomba-robotic-vacuum-on-a-hardwood-floor.jpg?id=61715088&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">iRobot released its first production version of the Roomba in September 2002.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Joe Jones</small></p><p><span>Despite the team’s reluctance to add a vacuum and despite the unit’s low power, the vacuum genuinely improved Roomba’s cleaning ability. We could demonstrate this convincingly. First, we disabled Roomba’s new vacuum by disconnecting the power and then cleaned a hard floor relying only on the carpet-sweeper mechanism. If we then walked across the floor barefoot, we would feel a certain amount of grit underfoot. 
If we repeated the exercise with vacuum power on, the floor was pristine. Bare feet would detect no grit whatsoever.</span></p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Seven people pose in front of shelves displaying awards and a gold iRobot Roomba; casual attire." class="rm-shortcode" data-rm-shortcode-id="b1ce9f56ea50d5ec4cb7ee0968f760ab" data-rm-shortcode-name="rebelmouse-image" id="a8f77" loading="lazy" src="https://spectrum.ieee.org/media-library/seven-people-pose-in-front-of-shelves-displaying-awards-and-a-gold-irobot-roomba-casual-attire.jpg?id=61715091&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">The Roomba contributors present on the occasion of the 500,000th Roomba include Steve Hickey, Eliot Mack [front row], Paul Sandin, Chris Casey, Phil Mass, Joe Jones, and Jeff Ostaszewski [back row].</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Joe Jones</small></p><p><span>Years later I learned that the focus group had a back story no one mentioned at the time. While the Roomba team had swallowed the carpet-sweeper concept hook, line, and sinker, Winston had not. He was uneasy with the notion that customers would be cleaning-mechanism agnostic—thinking instead that they simply wouldn’t believe our robot would clean their floors if it didn’t have a vacuum. He found at least indirect support for that position when he scoured marketing data from our earlier collaboration with SC Johnson.</span></p><p>But Winston, well-attuned to the engineering psyche, knew he couldn’t just declare, “Roomba has to have a vacuum.” We’d have pushed back, probably saying something like, “What your business-school-addled brain doesn’t appreciate is that it’s the carpet sweeper that makes the whole concept work!” Winston had to show us. That was a key purpose of the focus group, to demonstrate to the Roomba team that we had made a deal-breaking omission. <span class="ieee-end-mark"></span></p><p><em><em>Dancing With Roomba</em></em> <a href="https://www.routledge.com/Dancing-with-Roomba-Cracking-the-Robot-Riddle-and-Building-an-Icon/Jones/p/book/9781032890616" target="_blank">is now available for preorder</a>.</p>]]></description><pubDate>Wed, 15 Oct 2025 12:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/irobot-roomba-history</guid><category>Roomba</category><category>Consumer robots</category><category>Robotics</category><dc:creator>Joe Jones</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/roomba-vacuum-signed-by-irobot-team-showing-control-buttons-on-wooden-floor.jpg?id=61715049&amp;width=980"></media:content></item><item><title>Video Friday: Non-Humanoid Hands for Humanoid Robots</title><link>https://spectrum.ieee.org/video-friday-robotic-hands-2674168909</link><description><![CDATA[
  7. <img src="https://spectrum.ieee.org/media-library/boston-dynamics-humanoid-robot-giving-two-thumbs-up-in-a-well-lit-industrial-setting.png?id=61730113&width=1245&height=700&coordinates=0%2C0%2C0%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="https://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="gs4roqndtbk">There are two things that I really appreciate about this video on <a data-linked-post="2652903997" href="https://spectrum.ieee.org/boston-dynamics-spot-robot-arm" target="_blank">grippers from Boston Dynamics</a>. First, building a gripper while keeping in mind that the robot will inevitably fall onto it, because I’m seeing lots of very delicate-looking five-fingered hands on humanoids, and I’m very skeptical of their ruggedness. And second, understanding that not only is a five-fingered hand very likely unnecessary for the vast majority of tasks, but also robot hands don’t have to be constrained by a human hand’s range of motion.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="732ee2c774adcd4b3104032aca09daf2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/gS4rOqNDTBk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/blog/ask-a-roboticist-meet-karl/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ilg3x8zri2i">Yes, okay, it’s a fancy-looking robot, but I’m still stuck on what useful, practical things can it reliably and cost-effectively and safely DO?</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="d4148342441134aaf276d8d1a0725d8d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Eu5mYMavctM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-caption" placeholder="Add Photo Caption...">- YouTube</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://youtu.be/Eu5mYMavctM" target="_blank">youtu.be</a></small></p><p>[ <a href="https://www.figure.ai/">Figure</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="52ach_fnquu"><em>Life on Earth has evolved in constant relation to gravity, yet we rarely consider how deeply it shapes living systems until we imagine a place without it. In MycoGravity, pink oyster mushrooms grow inside a custom-built bioreactor mounted on a KUKA robotic arm. 
Inspired by NASA’s random positioning machines, the robot’s programmed movement simulates altered gravity. Over time, sculptural mushrooms emerge, shaped by their environment without a stable gravitational direction.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8dd27d04f64e68496595859c6e86f6bf" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/52acH_fnQUU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ars.electronica.art/panic/en/view/mycogravity-21938ddb450c816fbb66fbc1c6075a84/">MycoGravity</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="rgokumdirnu"><em>A new technological advancement gives robotic systems a natural sense of touch without extra skins or sensors. With advanced force sensing and deep learning, this robot can feel where you touch, recognize symbols, and even use virtual buttons—paving the way for more natural and flexible human-robot interaction.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6ffd51a4f6badac6752dfd7cad51580e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/rgoKUmdIRnU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.science.org/doi/abs/10.1126/scirobotics.adn4008">Science Robotics</a> ]</p><p>Thanks, Maged!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ubdeyzapvpi">The creator of <a data-linked-post="2655205032" href="https://spectrum.ieee.org/robot-video-2655205032" target="_blank">Mini Pupper</a> introduces <a href="https://www.youtube.com/watch?v=ZdFnK2Lw7oQ" target="_blank">HeySanta</a>, which can be yours for under $60.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="54c520373fa197fde17ce97aeb852d17" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UBdeyZApVpI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p><span>[</span><a href="https://www.kickstarter.com/projects/mdrobotkits/heysanta-generative-ai-powered-santa-claus" target="_blank"> Kickstarter campaign</a><span>]</span></p><div class="horizontal-rule"></div><p class="rm-anchors" id="e0cizgktn4m">I think humanoid robotics companies are starting to realize that they’re going to need to differentiate themselves somehow.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="980cafe42b1ec901060e35338d8d6567" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/e0cIZgkTn4M?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">DEEP Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="evcu8pk5xwc"><em>Drone swarm performances—synchronized, expressive aerial displays set to music—have emerged as a captivating 
application of modern robotics. Yet designing smooth, safe choreographies remains a complex task requiring expert knowledge. We present SwarmGPT, a language-based choreographer that leverages the reasoning power of large language models (LLMs) to streamline drone performance design.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2826febe5fd3de62edb2ef527d6290ca" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/EvCU8Pk5xwc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://utiasdsl.github.io/swarm_GPT/">SwarmGPT</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="opr3mpybgo4"><em>Dr. Mark Draelos, assistant professor of robotics and ophthalmology, received the National Institutes of Health (NIH) Director’s New Innovator Award for a project that seeks to improve how delicate microsurgeries are conducted by scaling up tissue to a size where surgeons could “walk across the retina” in virtual reality and operate on tissue as if “raking leaves.”</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4124ec201e543bacc74ffd26032a85b8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/OPR3Mpybgo4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotics.umich.edu/news/2025/draelos-nih-award/">University of Michigan</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="iuh8fwxonwa"><em>The intricate mechanisms of the most sophisticated laboratory on Mars are revealed in Episode 4 of the ExoMars Rosalind Franklin series, called “</em>Sample Processing.”</blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="123c2a395ce4c5ce32a8b3fe7c3d181d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/iuh8FWxoNwA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.esa.int/Science_Exploration/Human_and_Robotic_Exploration/Exploration/ExoMars/ExoMars_rover">European Space Agency</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="jcbhspk0yik">There’s currently a marketplace for used industrial robots, and it makes me wonder what’s next. Used humanoids, anyone?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="fbee8316e01aee8552850d9c3ba2c4d0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/jcbHSPk0YIk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://my.kuka.com/s/category/used-robots/0ZG1i000000TOeiGAG?language=en_US">Kuka</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="m8ivrpkaszo"><em>On October 2, 2025, the 10th “Can We Build Baymax?” Workshop Part 10: What Can We Build Today? 
& BYOB (Bring Your Own Baymax) was held in Seoul, Korea. To celebrate the 10th anniversary, Baymax delivered a special message from his character designer, Jin Kim.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f4d0c14ffe7f9bd7f47abf486a8f303b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/M8IVRPKaszo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://baymax.org/">Baymax</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="btp4ljvx6eg">I am only sharing this to declare that iRobot has gone off the deep end with their product names: Meet the “Roomba® Max 705 Combo Robot + AutoWash™ Dock.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="87989bceb538a48eb2fc62e2d998381d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Btp4ljVx6Eg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.irobot.com/en_US/roomba-max-705-robots.html">iRobot</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="veutck1xyzi"><em>Daniel Piedrahita, Navigation Team Lead, presents on his team’s recent work rebuilding Digit’s navigation stack, including a significant upgrade to footstep path planning.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1abf5358935eb93a5cba9afad6f5f1d7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/VeutCk1xYzI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.agilityrobotics.com/content/digits-next-steps">Agility Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="dehcus0nwsi">A bunch of videos from <a data-linked-post="2669217747" href="https://spectrum.ieee.org/icra40-conference" target="_blank">ICRA@40</a> have just been posted, and here are a few of my favorites.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1fccf31c28ede118912bd57c27220761" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DEHcUs0nwSI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="0f5c5aa8a25fd2520d67e3e7b6f6bf36" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/EZiol10pvvY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="90c6573aff1096acb8059a31b41aed09" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" 
scrolling="no" src="https://www.youtube.com/embed/Q5KvEge0km0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://icra40.ieee.org/">ICRA@40</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 10 Oct 2025 16:00:04 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robotic-hands-2674168909</guid><category>Video friday</category><category>Boston dynamics</category><category>Atlas</category><category>Robot grippers</category><category>Humanoid robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/boston-dynamics-humanoid-robot-giving-two-thumbs-up-in-a-well-lit-industrial-setting.png?id=61730113&amp;width=980"></media:content></item><item><title>Video Friday: Drone Easily Lands on Speeding Vehicle</title><link>https://spectrum.ieee.org/video-friday-speedy-drone-landing</link><description><![CDATA[
  8. <img src="https://spectrum.ieee.org/media-library/a-drone-flips-rapidly-and-lands-on-top-of-a-fast-moving-car-captured-in-multiple-positions-against-a-blurred-background.png?id=61690208&width=1245&height=700&coordinates=0%2C0%2C0%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="https://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="ttuvr1ogag0"><em>We demonstrate a new landing system that lets drones safely land on moving vehicles at speeds up to 110 kilometers per hour. By combining lightweight shock absorbers with reverse thrust, our approach drastically expands the landing envelope, making it far more robust to wind, timing, and vehicle motion. This breakthrough opens the door to reliable high-speed drone landings in real-world conditions.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="61a5988b813f6c5c8b48a662b56b7295" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/tTUVr1Ogag0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.createk.co/">Createk Design Lab</a> ]</p><p>Thanks, Alexis!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="w4v2jlnr_gg"><em>This video presents an academic parody inspired by KAIST’s humanoid robot moonwalk. While KAIST demonstrated the iconic move with robot legs, we humorously reproduced it using the Tesollo DG-5F robot hand. A playful experiment to show that not only humanoid robots but also robotic fingers can “dance.” </em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6bba12a220306c42036b69e3867f8531" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/w4V2JLnR_Gg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ieng.hanyang.ac.kr/en/department-of-robot-engineering1">Hangyang University</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="xxjog_c-fam"><em>Twenty years ago, <a data-linked-post="2650272298" href="https://spectrum.ieee.org/universal-robots" target="_blank">Universal Robots</a> built the first <a data-linked-post="2650279041" href="https://spectrum.ieee.org/universal-robots-introduces-its-strongest-robotic-arm-yet" target="_blank">collaborative robot</a>. You turned it into something bigger. Our cobot was never just technology. In your hands, it became something more: a teammate, a problem-solver, a spark for change. From factories to labs, from classrooms to warehouses. 
That’s the story of the past 20 years. That’s what we celebrate today.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c6a4b19423003f46af4ebb96ee5314a0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/XxjOg_C-fAM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.universal-robots.com/2025/celebrating-20-years-of-cobots/">Universal Robots</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="j7fah6mhbdi"><em>The assistive robot Maya, newly developed at DLR, is designed to enable people with severe physical disabilities to lead more independent lives. The new robotic arm is built for seamless wheelchair integration, with optimized kinematics for stowing, ground-level access, and compatibility with standing functions. </em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6b2fc4e6c72f4e716ac36fce5edda2b8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/j7FAh6mHBdI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dlr.de/en/latest/news/2025/dlr-showcases-robotics-highlights-at-automatica-2025/dlr-assistence-robot-maya">DLR</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="rw9_fgfxwzs"><em>Contoro and HARCO Lab have launched an open-source initiative, ROS-MCP-Server, which connects AI models (for example, Claude, GPT, Gemini) with robots using a robot operating system and the Model Context Protocol. This software enables AI to communicate with multiple ROS nodes in the language of robots. 
We believe it will allow robots to perform tasks previously impossible due to limited intelligence, help robotics engineers program robots more efficiently, and enable nonexperts to interact with robots without deep robotics knowledge.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0812421fc9d0f507060f859382c89c35" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/RW9_FgfxWzs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://github.com/robotmcp/ros-mcp-server">GitHub</a> ]</p><p>Thanks, Mok!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="-xdsryj1fdw">Here’s a quick look at the <a href="https://www.corl.org/" target="_blank">Conference on Robotic Learning</a> (CoRL) exhibit hall, thanks to PNDbotics.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="14eb94bde7302adfbf4d902a850f11ce" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/-xdsRyJ1Fdw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://pndbotics.com/">PNDbotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="_jb3urj5llk">Old and busted: sim to real. New hotness: real to sim!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2f4cb709caad5b26deb93a0e249a29d0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/_JB3urj5LLk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2505.12428">Paper</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ndct3xwxl8o">Any humanoid video with tennis balls should be obligated to show said humanoid failing to walk over them.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8c29f9da53613b4f733e16f65eddd2aa" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/NDCt3xwXl8o?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.limxdynamics.com/en">LimX</a> ]</p><p>Thanks, Jinyan!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="4i6vrsd5mdg">The correct answer to the question “Can you beat a robot arm at tic-tac-toe?” should be “No. 
No, you cannot.” And you can’t beat a human, either, if they know what they’re doing.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cf0dcdb26ad375aafed3e8cb7f96aec6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/4I6VRsd5MDg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://global.agilex.ai/products/piper">AgileX</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="zi_kttzgha0"><em>It was an honor to host the team from Microsoft AI as part of their larger educational collaboration with the University of Texas at Austin. During their time here, they shared this wonderful video of our lab facilities.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="49c291c4ccc04671f31d0f76daca90d5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zi_kttZgHa0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Moody lighting is second only to random primary-colored lighting when it comes to making a lab look science-y.</p><p>[ <a href="https://sites.utexas.edu/hcrl/">The University of Texas at Austin HCRL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="9vgxlqw9xxu"><em>Robots aren’t just sci-fi anymore. They’re evolving fast. AI is teaching them how to adapt, learn, and even respond to open-ended questions with advanced intelligence. Aaron Saunders, chief technology officer of Boston Dynamics, explains how this leap is transforming everything, from simple controls to full-motion capabilities. While there are some challenges related to safety and reliability, AI is significantly helping robots become valuable partners at home and on the job.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8dee870b10dc3e35b3ebe0bd27ae0721" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/9vGXLQW9xxU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ibm.com/think/topics/automation">IBM</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 03 Oct 2025 16:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-speedy-drone-landing</guid><category>Robotics</category><category>Video friday</category><category>Drones</category><category>Robot hand</category><category>Humanoid robots</category><category>Robot ai</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-drone-flips-rapidly-and-lands-on-top-of-a-fast-moving-car-captured-in-multiple-positions-against-a-blurred-background.png?id=61690208&amp;width=980"></media:content></item><item><title>Why the World Needs a Flying Robot Baby</title><link>https://spectrum.ieee.org/ironcub-jet-powered-flying-robot</link><description><![CDATA[
  9. <img src="https://spectrum.ieee.org/media-library/advanced-robot-torso-with-mechanical-arms-and-visible-cables.jpg?id=61668516&width=1245&height=700&coordinates=0%2C112%2C0%2C113"/><br/><br/><p>One of the robotics projects that I’ve been most excited about for years now is <a href="https://ami.iit.it/aerial-humanoid-robotics" rel="noopener noreferrer" target="_blank">iRonCub</a>, from Daniele Pucci’s <a href="https://ami.iit.it/" target="_blank">Artificial and Mechanical Intelligence Lab</a> at the Italian Institute of Technology (IIT) in Genoa, Italy. <a href="https://spectrum.ieee.org/jet-powered-icub-could-be-the-first-flying-humanoid-robot" target="_blank">Since 2017</a>, Pucci has been developing a jet-propulsion system that will enable an <a href="https://robotsguide.com/robots/icub" target="_blank">iCub</a> robot (originally designed in 2004 to be the approximate shape and size of a 5-year-old child) to fly like Iron Man.</p><p>Over the summer, after nearly 10 years of development, <a href="https://opentalk.iit.it/en/iit-demonstrates-that-a-humanoid-robot-can-fly/" rel="noopener noreferrer" target="_blank">iRonCub3 achieved lift-off and stable flight for the first time</a>, with its four jet engines lifting it 50 centimeters off the ground for several seconds. The long-term vision is for iRonCub (or a robot like it) to operate as a disaster response platform, Pucci tells us. In an emergency situation like a flood or a fire, iRonCub could quickly get to a location without worrying about obstacles, and then on landing, start walking for energy efficiency while using its hands and arms to move debris and open doors. “We believe in contributing to something unique in the future,” says Pucci. “We have to explore new things, and this is wild territory at the scientific level.”</p><p>Obviously, this concept for iRonCub and the practical experimentation attached to it is really cool. But coolness in and of itself is usually not enough of a reason to build a robot, especially a robot that’s a (presumably rather expensive) multi-year project involving a bunch of robotics students, so let’s get into a little more detail about why a flying robot baby is actually something that the world needs.</p><hr/> <p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="54af67eb6158ed3474900ae20d8312e4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/t1bNHoT4D5Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-caption" placeholder="Add Photo Caption...">In an emergency situation like a flood or a fire, iRonCub could quickly get to a location without worrying about obstacles, and then on landing, start walking for energy efficiency while using its hands and arms to move debris and open doors.</small> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">IIT</small> </p><p><span>Getting a humanoid robot to do this sort of thing is quite a challenge. Together, the jet turbines mounted to iRonCub’s back and arms can generate over 1000 N of thrust, but because it takes time for the engines to spool up or down, control has to come from the robot itself as it moves its arm-engines to maintain stability. 
</span></p><p><span>“What is not visible from the video,” Pucci tells us, “is that the exhaust gas from the turbines is at 800 °C and almost supersonic speed. We have to understand how to generate trajectories in order to avoid the fact that the cones of emission gases were impacting the robot.” </span></p><p><span>Even if the exhaust doesn’t end up melting the robot, there are still aerodynamic forces involved that have until this point really not been a consideration for humanoid robots at all—in June, Pucci’s group </span><a href="https://www.nature.com/articles/s44172-025-00447-w" target="_blank">published a paper in <em>Nature Engineering Communications</em></a><span>, offering a “comprehensive approach to model and control aerodynamic forces [for humanoid robots] using classical and learning techniques.”</span></p><p class="pull-quote"><span>“The exhaust gas from the turbines is at 800 °C and almost supersonic speed.” <strong>—Daniele Pucci, IIT</strong> </span></p><p>Whether or not you’re on board with Pucci’s future vision for iRonCub as a disaster-response platform, derivatives of current research can be immediately applied beyond flying humanoid robots. The algorithms for thrust estimation can be used with other flying platforms that rely on directed thrust, like eVTOL aircraft. Aerodynamic compensation is relevant for humanoid robots even if they’re not airborne, if we expect them to be able to function when it’s windy outside.</p><p>More surprising, Pucci describes a recent collaboration with an industrial company developing a new pneumatic gripper. “At a certain point, we had to do force estimation for controlling the gripper, and we realized that the dynamics looked really similar to those of the jet turbines, and so we were able to use the same tools for gripper control. That was an ‘ah-ha’ moment for us: first you do something crazy, but then you build the tools and methods, and then you can actually use those tools in an industrial scenario. That’s how to drive innovation.”</p><h2>What’s Next for iRonCub: Attracting Talent and Future Enhancements</h2><p>There’s one more important reason to be doing this, he says: “It’s really cool.” In practice, a really cool flagship project like iRonCub not only attracts talent to Pucci’s lab, but also keeps students and researchers passionate and engaged. I saw this firsthand when I visited IIT last year, where I got a similar vibe to watching the <a href="https://spectrum.ieee.org/darpa-robotics-challenge-amazing-moments-lessons-learned-whats-next" target="_blank">DARPA Robotics Challenge</a> and <a href="https://spectrum.ieee.org/collections/darpa-subterranean-challenge/" target="_blank">DARPA SubT</a>—when people know they’re working on something <em><em>really cool</em></em>, there’s this tangible, pervasive, and immersive buzzing excitement that comes through. It’s projects like iRonCub that can get students to love robotics.</p><p><br/></p><p>In the near future, a new jetpack with an added degree of freedom will make yaw control of iRonCub easier, and Pucci would also like to add wings for more efficient long-distance flight. But the logistics of testing the robot are getting more complicated—there’s only so far that the team can go with their current test stand (which is on the roof of their building), and future progress will likely require coordinating with the Genoa airport. </p><p>It’s not going to be easy, but as Pucci makes clear, “This is not a joke. It’s something that we believe in. 
And that feeling of doing something exceptional, or possibly historical, something that’s going to be remembered—that’s something that’s kept us motivated. And we’re just getting started.”</p>]]></description><pubDate>Tue, 30 Sep 2025 12:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/ironcub-jet-powered-flying-robot</guid><category>Robotics</category><category>Flying robots</category><category>Iit</category><category>Humanoid robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/advanced-robot-torso-with-mechanical-arms-and-visible-cables.jpg?id=61668516&amp;width=980"></media:content></item><item><title>Video Friday: Gemini Robotics Improves Motor Skills</title><link>https://spectrum.ieee.org/video-friday-google-gemini-robotics</link><description><![CDATA[
  10. <img src="https://spectrum.ieee.org/media-library/robotic-arms-manipulate-objects-and-grapes-humanoid-robot-by-apptronik.webp?id=61659549&width=1245&height=700&coordinates=0%2C0%2C0%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="https://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="uobzwjpb6xm"><em><a href="https://spectrum.ieee.org/gemini-robotics" target="_blank">Gemini Robotics</a> 1.5 is our most capable vision-language-action (VLA) model, which turns visual information and instructions into motor commands for a robot to perform a task. This model thinks before taking action and shows its process, helping robots assess and complete complex tasks more transparently. It also learns across embodiments, <a href="https://spectrum.ieee.org/deepmind-table-tennis-robots" target="_blank">accelerating skill learning</a>.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8fe3adec0fe17749d1cd7cd24a0e6688" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UObzWjPb6XM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://deepmind.google/discover/blog/gemini-robotics-15-brings-ai-agents-into-the-physical-world/">Google DeepMind</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="vb73rrl3hdk"><em>A simple “force pull” gesture brings Carter straight into her hand. 
This is a fantastic example of how an intuitive interaction can transform complex technology into an extension of our intent.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4938e05d7220672799193c3f1a1cd818" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vb73rRL3hDk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.robust.ai/">Robust.ai</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="owbnt87cmki">I can’t help it, I feel bad for this poor little robot.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e59425ad29f83dd2742e5b9795236a3e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/OwBnT87CMkI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://dreamflex.github.io/">Urban Robotics Laboratory, KAIST</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="5ydlz54unyy">Hey look, no legs!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8a298e7493020798f796034a5265f9c2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5YdLZ54uNyY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.kinisi.com/">Kinisi Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="p4wszmj6knw"><em>Researchers at the University of Michigan and Shanghai Jiao Tong University have developed a soft robot that can crawl along a flat path and climb up vertical surfaces using its unique origami structure. 
The robot can move with an accuracy typically seen only in rigid robots.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8cd995c652f18b447c06ed8a9e8f6667" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/P4WSZMJ6Knw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotics.umich.edu/news/2025/sparc-climbing-origami-robot/">University of Michigan Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="bpslmx_v38e"><em>Unitree G1 has learned the “antigravity” mode: Stability is greatly improved under any action sequence, and even if it falls, it can quickly get back up.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="58b471bc84634ee6f8ce01ec7fe51136" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/bPSLMX_V38E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/g1/">Unitree</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="iadq_1ulirq"><em>Kepler Robotics has commenced mass production of the K2 Bumblebee, the world’s first commercially available humanoid robot powered by Tesla’s hybrid architecture.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="699d6364d6f80241c5ac0bb0a1b491d2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/IAdq_1ULIRQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.gotokepler.com/">Kepler Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="g20geadn6gm"><em>Reinforcement learning (RL)-based legged locomotion controllers often require meticulous reward tuning to track velocities or goal positions while preserving smooth motion on various terrains. Motion imitation methods via RL using demonstration data reduce reward engineering but fail to generalize to novel environments. We address this by proposing a hierarchical RL framework in which a low-level policy is first pretrained to imitate animal motions on flat ground, thereby establishing motion priors. 
Real-world experiments with an ANYmal-D quadruped robot confirm our policy’s capability to generalize animal-like locomotion skills to complex terrains, demonstrating smooth and efficient locomotion and local navigation performance amid challenging terrains with obstacles.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="fd15ae48fa56e2fab61de9858b2fcb36" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/G20geAdN6GM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://anymalprior.github.io/">ETHZ RSL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="zmuw_mn2odg">I think we have entered the “differentiation through novelty” phase of robot vacuums.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b6642c6ef4b5942c139a696a032f3f4a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ZMUW_mN2ODg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://us.roborock.com/products/roborock-qrevo-curv-x">Roborock</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="4ev0w_tbuw8"><em>In this work, we present Kinethreads: a new full-body haptic exosuit design built around string-based motor-pulley mechanisms, which keeps our suit lightweight (less than 5 kilograms), soft and flexible and quick-to-wear (in less than 30 seconds), comparatively low-cost (about US $400), and yet capable of rendering expressive, distributed, and forceful (up to 120 newtons) effects.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7d1308c14aa36c8fc07c351c385f58b1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/4ev0w_tbuw8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://programs.sigchi.org/uist/2025/program/content/206935">ACM Symposium on User Interface and Software Technology</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="djtfw4v9a_c"><em>In this episode of the IBM AI in Action podcast, Aaron Saunders, chief technology officer of Boston Dynamics, delves into the transformative potential of AI-powered robotics, highlighting how robots are becoming safer, more cost-effective and widely accessible through robotics as a service (RaaS). </em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0397ef7c7c46f05e423de105753bcd11" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DjtFW4v9A_c?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ibm.com/think/podcasts/ai-in-action">IBM</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="rounxcn3irc"><em>This Carnegie Mellon RI Seminar is by Michael T. 
Tolley from the University of California, San Diego, on biologically inspired soft robotics.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0f1a9da463747c9ab27145a9ffda8d79" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ROuNXCn3irc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Robotics has the potential to address many of today’s pressing problems in fields ranging from health care to manufacturing to disaster relief. However, the traditional approaches used on the factory floor do not perform well in unstructured environments. The key to solving many of these challenges is to explore new, nontraditional designs. Fortunately, nature surrounds us with examples of novel ways to navigate and interact with the real world. Dr. Tolley’s Bioinspired Robotics and Design Lab seeks to borrow the key principles of operation from biological systems and apply them to robotic design.</em></blockquote><p>[ <a href="https://www.ri.cmu.edu/event/biologically-inspired-soft-robotics/">Carnegie Mellon University Robotics Institute</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 26 Sep 2025 15:30:02 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-google-gemini-robotics</guid><category>Video friday</category><category>Robotics</category><category>Google deepmind</category><category>Robot ai</category><category>Humanoid robots</category><category>Quadruped robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/robotic-arms-manipulate-objects-and-grapes-humanoid-robot-by-apptronik.webp?id=61659549&amp;width=980"></media:content></item><item><title>Exploit Allows for Takeover of Fleets of Unitree Robots</title><link>https://spectrum.ieee.org/unitree-robot-exploit</link><description><![CDATA[
  11. <img src="https://spectrum.ieee.org/media-library/the-unitree-humanoid-robot-with-sleek-design-dual-arm-and-sensor-equipped-head.jpg?id=61646485&width=1245&height=700&coordinates=0%2C454%2C0%2C454"/><br/><br/><p><span>A critical vulnerability in the <a href="https://www.bluetooth.com/learn-about-bluetooth/tech-overview/" target="_blank">Bluetooth Low Energy</a> (BLE) Wi-Fi configuration interface used by several different </span><a href="https://www.unitree.com/" target="_blank">Unitree robots</a><span> can result in a root-level takeover by an attacker, security researchers </span><a href="https://github.com/Bin4ry/UniPwn" target="_blank">disclosed on 20 September</a><span>. The exploit impacts Unitree’s Go2 and B2 quadrupeds and G1 and H1 humanoids. Because the vulnerability is wireless, and the resulting access to the affected platform is complete, the vulnerability becomes wormable, </span><a href="https://github.com/Bin4ry/UniPwn?tab=readme-ov-file#the-wormable-threat" target="_blank">say the researchers</a>, meaning “a<span>n infected robot can simply scan for other Unitree robots in BLE range and automatically compromise them, creating a robot botnet that spreads without user intervention.”</span></p><p>Initially discovered by security researchers Andreas Makris and Kevin Finisterre, <a href="https://github.com/Bin4ry/UniPwn" target="_blank">UniPwn</a> takes advantage of several security lapses that are still present in the firmware of <a href="https://spectrum.ieee.org/tag/unitree" target="_blank">Unitree</a> robots as of 20 September 2025. As far as <em>IEEE Spectrum</em> is aware, this is the first major public exploit of a commercial humanoid platform.</p><h2>Unitree Robots’ BLE Security Flaw Exposed</h2><p>Like many robots, Unitree’s robots use an initial BLE connection to make it easier for a user to set up a Wi-Fi network connection. The BLE packets that the robot accepts are encrypted, but those encryption keys are hardcoded and were <a href="https://x.com/Bin4ryDigit/status/1950566849072005304" target="_blank">published on X (formerly Twitter)</a> by Makris in July. Although the robot does validate the contents of the BLE packets to make sure that the user is authenticated, the researchers say that all it takes to become an authenticated user is to encrypt the string “unitree” with the hardcoded keys and the robot will let someone in. From there, an attacker can inject arbitrary code masquerading as the Wi-Fi SSID and password, and when the robot attempts to connect to Wi-Fi, it will execute that code without any validation and with root privileges.</p><p>“A simple attack might be just to reboot the robot, which we published as a proof of concept,” explains Makris. “But an attacker could do much more sophisticated things: It would be possible to have a trojan implanted into your robot’s startup routine to exfiltrate data while disabling the ability to install new firmware without the user knowing. And as the vulnerability uses BLE, the robots can easily infect each other, and from there the attacker might have access to an army of robots.”</p><p>Makris and Finisterre first contacted Unitree in May in an attempt to responsibly disclose this vulnerability. After some back and forth with little progress, Unitree stopped responding to the researchers in July, and the decision was made to make the vulnerability public. 
“We have had some bad experiences communicating with them,” Makris tells us, citing an <a href="https://takeonme.org/cves/cve-2025-2894/" target="_blank">earlier backdoor vulnerability</a> he discovered with the Unitree Go1. “So we need to ask ourselves—are they introducing vulnerabilities like this on purpose, or is it sloppy development? Both answers are equally bad.”</p><p>Unitree has not responded to a request for comment from <em>IEEE Spectrum</em> as of press time. On 29 September, Unitree posted a <a href="https://www.linkedin.com/posts/unitreerobotics_statement-to-our-respected-unitree-users-activity-7378441101927436288-lLdA/" target="_blank">statement</a> on LinkedIn addressing the security concerns: “We have become aware that some users have discovered security vulnerabilities and network-related issues while using our robots,” the company wrote. “We immediately began addressing these concerns and have now completed the majority of the fixes. These updates will be rolled out to you in the near future.”</p><p>“Unitree, as other manufacturers do, has simply ignored prior security disclosures and repeated outreach attempts,” says Víctor Mayoral-Vilches, the founder of robotics cybersecurity company <a href="https://aliasrobotics.com/" target="_blank">Alias Robotics</a>. “This is not the right way to cooperate with security researchers.” Mayoral-Vilches was not involved in publishing the UniPwn exploit, but he has found <a href="https://www.linkedin.com/posts/vmayoral_cybersecurity-ai-humanoid-robots-as-attack-activity-7374789262157901824-nE0I/" target="_blank">other security issues</a> with Unitree robots, including <a href="https://arxiv.org/pdf/2509.14139" target="_blank">undisclosed streaming of telemetry data to servers in China</a> which could potentially include audio, visual, and spatial data.</p><p>Mayoral-Vilches explains that security researchers are focusing on Unitree primarily because the robots are available and affordable. This makes them not just more accessible for the researchers, but also more relevant, since Unitree’s robots are already being deployed by users around the world who are likely not aware of the security risks. For example, Makris is concerned that the <a href="https://www.nottinghamshire.police.uk/news/nottinghamshire/news/news/2025/august/officers-testing-revolutionary-robot-dog/" target="_blank">Nottinghamshire police in the United Kingdom have begun testing a Unitree Go2</a>, which can be exploited by UniPwn. “We tried contacting them and would have disclosed the vulnerability upfront to them before going public, but they ignored us. What would happen if an attacker implanted themselves into one of these police dogs?”</p><h2>How to Secure Unitree Robots</h2><p>In the short term, Mayoral-Vilches suggests that people using Unitree robots can protect themselves by connecting the robots to only isolated Wi-Fi networks and disabling their Bluetooth connectivity. “You need to hack the robot to secure it for real,” he says. “This is not uncommon and why security research in robotics is so important.”</p><p>Both Mayoral-Vilches and Makris believe that fundamentally it’s up to Unitree to make their robots secure in the long term, and that the company needs to be much more responsive to users and security researchers. 
But Makris says: “There will never be a 100 percent secure system.”</p><p>Mayoral-Vilches agrees: “Robots are very complex systems, with wide attack surfaces to protect, and a state-of-the-art humanoid exemplifies that complexity.”</p><p>Unitree, of course, is not the only company offering complex state-of-the-art quadrupeds and humanoids, and it seems likely (if not inevitable) that similar exploits will be discovered in other platforms. The <a href="https://spectrum.ieee.org/navigating-the-dual-use-dilemma" target="_blank">potential consequences</a> here can’t be overstated—the idea that robots can be taken over and used for nefarious purposes is already a science-fiction trope, but the impact of a high-profile robot hack on the reputation of the commercial robotics industry is unclear. Robotics companies are barely talking about security in public, despite how damaging even the <em>perception</em> of an unsecured robot might be. A robot that is not under control has the potential to be a real physical danger.</p><p>For the <a href="https://2025humanoids.org/" target="_blank">IEEE Humanoids Conference</a> in Seoul from 30 September to 2 October, Mayoral-Vilches has organized <a href="https://aliasrobotics.com/cs4r_3.php" target="_blank">a workshop on Cybersecurity for Humanoids</a>, where he will present a brief (coauthored with Makris and Finisterre) titled <a href="https://arxiv.org/abs/2509.14139v2" rel="noopener noreferrer" target="_blank">Humanoid Robots as Attack Vectors</a>. Despite the title, their intent is not to overhype the problem but instead to encourage roboticists (and robotics companies) to take security seriously, and not treat it as an afterthought. As Mayoral-Vilches points out, “Robots are only safe if secure.”</p><p><em>Story updated 29 Sep 2025 with statement released by Unitree.</em></p>]]></description><pubDate>Thu, 25 Sep 2025 13:36:52 +0000</pubDate><guid>https://spectrum.ieee.org/unitree-robot-exploit</guid><category>Unitree</category><category>Humanoid robots</category><category>Quadruped robots</category><category>Robotics security</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/the-unitree-humanoid-robot-with-sleek-design-dual-arm-and-sensor-equipped-head.jpg?id=61646485&amp;width=980"></media:content></item><item><title>Robot Navigates Tough Terrain With New 3D Mapping Technique</title><link>https://spectrum.ieee.org/robot-navigation-3d-mapping</link><description><![CDATA[
  12. <img src="https://spectrum.ieee.org/media-library/a-four-legged-robot-successfully-crawling-under-a-bench-and-walking-over-a-sidewalk-curb-each-image-is-paired-with-a-visualizat.jpg?id=61634593&width=1245&height=700&coordinates=0%2C120%2C0%2C120"/><br/><br/><p><em>This article is part of our exclusive <a href="https://spectrum.ieee.org/collections/journal-watch/" target="_self">IEEE Journal Watch series</a> in partnership with IEEE Xplore.</em></p><p>Four-legged robots are making great strides in their ability to safely traverse complex terrains. In a recent advance, researchers in Hong Kong have developed a novel mapping model for <a data-linked-post="2658569185" href="https://spectrum.ieee.org/quadrupedal-robot-shins-turns-biped" target="_blank">quadrupedal robots</a> that allows them to autonomously crawl under and leap over significant obstacles in order to arrive at their desired endpoints.</p><p>The researchers describe the robot, which uses multilayered mapping to understand its environment, in a <a href="https://ieeexplore.ieee.org/document/11112615" target="_blank">study</a> published 4 August in <a href="https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=7083369" target="_blank"><em>IEEE Robotics and Automation Letters</em></a>. </p><p>Agile two- and four-legged animals can adapt to diverse terrains. Robots that can traverse similarly complex environments are appealing because they could complete missions that would be dangerous for humans to do, like monitoring and assessing unstable rubble sites after an earthquake. But ensuring that robots can effectively map out complex environments, as well as handle the various types of obstacles (leaping across gaps or scaling high objects, for example), is challenging. </p><h2>Advanced Terrain Mapping for Robots</h2><p><a href="https://arclab.hku.hk/PengLu.html" target="_blank">Peng Lu</a>, an assistant professor at the University of Hong Kong, along with his postdoctoral student Yeke Chen and other team members, sought to create a robot capable of overcoming these hurdles. To help their robot perceive its surroundings in detail, they developed a model that creates a multilayer elevation map from the robot’s sensor data. The map can capture the characteristics of a wide range of terrains using lidar data.</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="f6da7adb5421c3c42fbf739cafac3a3e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/eHDxTpNYHFg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-caption" placeholder="Add Photo Caption...">- YouTube</small> <small class="image-media media-photo-credit" placeholder="Add Photo Credit..."> <a href="https://youtu.be/eHDxTpNYHFg" target="_blank">youtu.be</a> </small> </p><p>The team used simulations to train the robot to recognize different terrains it may encounter in the real world. This includes very challenging terrains to navigate, such as gaps that it must jump across or overhanging obstacles that it must crawl under. If the robot has missing sensor data, it can compensate to some extent with estimations based on its training data. 
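</p><p>To make the idea concrete, here is a minimal sketch of what a multilayer elevation map might look like as a data structure. It is illustrative only: the grid size, resolution, layer names, and update rule are assumptions made for the example, not details taken from the paper.</p><pre><code>
import numpy as np

class MultilayerElevationMap:
    """Toy multilayer elevation map: each grid cell stores several height layers."""

    LAYERS = ("ground", "ceiling")  # assumed layer names, chosen for illustration

    def __init__(self, size=200, resolution=0.05):
        self.resolution = resolution  # meters per cell (assumed value)
        self.data = {name: np.full((size, size), np.nan) for name in self.LAYERS}

    def update(self, x, y, z, is_overhang=False):
        """Fuse a single lidar return at (x, y, z) into its grid cell."""
        i, j = int(x / self.resolution), int(y / self.resolution)
        layer = "ceiling" if is_overhang else "ground"
        old = self.data[layer][i, j]
        # naive running average; a real system would filter noise and handle occlusion
        self.data[layer][i, j] = z if np.isnan(old) else 0.5 * (old + z)

    def clearance(self, i, j):
        """Free vertical space in a cell, used to pick crawl, walk, or jump."""
        return self.data["ceiling"][i, j] - self.data["ground"][i, j]
</code></pre><p>The reason for keeping separate ground and ceiling layers is that a single-height map cannot represent overhangs at all; the clearance between the layers is what tells a planner whether crawling is even an option.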
</p><p>“Through learning different skills in simulation and knowledge distillation, the robot is able to switch among different skills to traverse through different obstacles,” says Lu.</p><p>In their study, the researchers tested their mapping technique using a <span><a href="https://shop.unitree.com/en-ca/collections/frontpage/products/unitreeyushutechnologydog-artificial-intelligence-companion-bionic-companion-intelligent-robot-go1-quadruped-robot-dog?utm_term=unitree+go1&utm_campaign=&utm_source=adwords&utm_medium=ppc&hsa_acc=8764137937&hsa_cam=18819230575&hsa_grp=141021336417&hsa_ad=632944124562&hsa_src=g&hsa_tgt=kwd-1370783216629&hsa_kw=unitree+go1&hsa_mt=p&hsa_net=adwords&hsa_ver=3&gad_source=1&gad_campaignid=18819230575&gbraid=0AAAAABa3bGuZ6YLrnsAQIEXU-O3SGhtt-&gclid=CjwKCAjwisnGBhAXEiwA0zEOR3zox9X49nbSSUp-7x5O93-5xWYgvr84B2xue7C1TQeS3rNrh3b5OxoC9Y0QAvD_BwE" target="_blank">Unitree Go1</a></span> robot in a series of indoor and outdoor experiments, where it had to autonomously crawl, climb, or jump to overcome obstacles. </p><p>“The results show that the multilayer elevation map can effectively represent various complex terrains, which allow a robot to easily understand the environment,” says Lu, noting the robot also succeeded in autonomously switching between modes—crawling, jumping, and climbing—as needed. </p><p>He adds that the robot inadvertently has path-planning abilities, even though it was not programmed to have them. For example, when the robot encounters obstacles that are too high to pass, it moves around the obstacle, and therefore finds a path through the environment on its own through trial and error. </p><p>Lu notes that while a key strength of this robot is its ability to navigate diverse and difficult terrain, it can rely only on the data that it has already been trained on, and cannot learn directly from real-world data.</p><p>Lu says his team may commercialize the robot for inspection scenarios, such as construction sites, and plans on using real-world data to further enhance the robot’s ability to cope with any type of terrain.</p>]]></description><pubDate>Wed, 24 Sep 2025 14:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/robot-navigation-3d-mapping</guid><category>Autonomous robots</category><category>Climbing robots</category><category>Journal watch</category><category>Lidar</category><dc:creator>Michelle Hampson</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-four-legged-robot-successfully-crawling-under-a-bench-and-walking-over-a-sidewalk-curb-each-image-is-paired-with-a-visualizat.jpg?id=61634593&amp;width=980"></media:content></item><item><title>Video Friday: A Billion Dollars for Humanoid Robots</title><link>https://spectrum.ieee.org/video-friday-billion-humanoid-robots</link><description><![CDATA[
  13. <img src="https://spectrum.ieee.org/media-library/nine-glossy-humanoid-robots-walking-in-sync-inside-a-spacious-industrial-room.png?id=61621100&width=1245&height=700&coordinates=0%2C0%2C0%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://actuate.foxglove.dev/">ACTUATE 2025</a>: 23–24 September 2025, SAN FRANCISCO</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="http://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="c-nmhem7iya">A billion dollars is a lot of money. And this is actual money, not just a valuation. But <a data-linked-post="2668901591" href="https://spectrum.ieee.org/figure-new-humanoid-robot" target="_blank">Figure</a> already had a lot of money. So what are they going to be able to do now that they weren’t already doing, I wonder?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4d8502ce5186d2705a7882d43fe3ab46" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/c-NmheM7iyA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.figure.ai/news/series-c">Figure</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="knf-uqb9k50"><em>Robots often succeed in simulation but fail in reality. With PACE, we introduce a systematic approach to sim-to-real transfer.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="be798615031e1066a524c5f505e37ee6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/kNf-uQb9k50?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/pdf/2509.06342">Paper</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="gvnkmsfaq3q"><em>Anthropomorphic robotic hands are essential for robots to learn from humans and operate in human environments. While most designs loosely mimic human hand kinematics and structure, achieving the dexterity and emergent behaviors present in human hands, anthropomorphic design must extend to also match passive compliant properties while simultaneously strictly having kinematic matching. 
We present ADAPT-Teleop, a system combining a robotic hand with human-matched kinematics, skin, and passive dynamics, along with a robotic arm for intuitive teleoperation.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3606bbb871767f6712a61d613be8d14f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/gVNkmsFAQ3Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.nature.com/articles/s44182-025-00034-3">Paper</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="7yynevey4w8"><em>This robot can walk without any electronic components in its body, because the power is transmitted through wires from motors concentrated outside of its body. Also, this robot’s front and rear legs are optimally coupled and can walk with just four wires.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e29524fe1ff4790d1acff1d5ed9f53f4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7yyNevEY4W8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="http://www.jsk.t.u-tokyo.ac.jp/">JSK Lab</a> ]</p><p>Thanks, Takahiro!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="akdagvctgck"><em>Five teams of Los Alamos engineers competed to build the ultimate hole-digging robot dog in a recent engineering sprint. In just days, teams programmed their robot dogs to dig, designing custom “paws” from materials like sheet metal, foam, and 3D-printed polymers. The paws mimicked animal digging behaviors—from paddles and snowshoes to dew claws—and helped the robots avoid sinking into a 30-gallon soil bucket. Teams raced to see whose dog could dig the biggest hole and dig under a fence the fastest.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2029c0d4609508135ca20dee5abd01c7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/AkDaGvctGCk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.lanl.gov/media/publications/national-security-science/peanut-butter-potting-soil-and-prototype-engineering">Los Alamos</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="6pwi4-ycv8m"><em>This work presents UniPilot, a compact hardware-software autonomy payload that can be integrated across diverse robot embodiments to enable resilient autonomous operation in GPS-denied environments. The system integrates a multimodal sensing suite including lidar, radar, vision, and inertial sensing for robust operation in conditions where unimodal approaches may fail. 
A large number of experiments are conducted across diverse environments and on a variety of robot platforms to validate the mapping, planning, and safe navigation capabilities enabled by the payload.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3c13955bf3f3f0379f8648dbc2af7b0c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/6pwI4-YCV8M?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.autonomousrobotslab.com/">NTNU</a> ]</p><p>Thanks, Kostas!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ytwo7lldn4c"><em>KAIST Humanoid v0.5. Developed at the DRCD Lab, KAIST, with a control policy trained via reinforcement learning.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ae6a815f2600aad733f4334e7bf427b9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ytWO7lldN4c?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://dynamicrobot.kaist.ac.kr/">KAIST</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="f1aaxaskcso">I just like the determined little hops.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5dea6c650decc4cb4722fcb16487c566" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/F1AaxaSkCSo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://global.agilex.ai/">AgileX</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ovigdxft4le">I’m always a little bit suspicious of robotics labs that are exceptionally clean and organized.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="807ab0f11e3ee3bb6eb4ea1f1c3dbbc0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/OvIGDxft4lE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="http://www.pndbotics.com/">PNDbotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="g7rid4yb1tg">Er, has PAL Robotics ever actually seen a kangaroo...?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="40bb97eb68951ab2cf32595229209047" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/G7riD4yb1tg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://pal-robotics.com/robot/kangaroo/">PAL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ug-5bn-5nja">See <a data-linked-post="2662502813" href="https://spectrum.ieee.org/video-friday-spot-tripping" target="_blank">Spots</a> push. 
Push, Spots, push.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b981f42cf3e82ff65d0ae891016201d2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UG-5BN-5NjA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://hrilab.tufts.edu/">Tufts</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="-kusnexvj2g"><em>Training humanoid robots to hike could accelerate development of embodied AI for tasks like autonomous search and rescue, ecological monitoring in unexplored places, and more, say University of Michigan researchers who developed an AI model that equips humanoids to hit the trails.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="63568b335acd55f879c838722a171bc1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/-KUSNeXvj2g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ai.engin.umich.edu/stories/simulated-humanoid-robots-learn-to-hike-rugged-terrain-autonomously">Michigan</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="b94_w7jir-8">I am dangerously close to no longer being impressed by breakdancing humanoid robots.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d9e4bd320ddfbec127374717853b7082" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/b94_W7JiR-8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.fftai.com/">Fourier</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="7htev2udsae">This, though, would impress me.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2f06b441b4657ae0dac3b58eef872ba9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7htEV2UDsAE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://team.inria.fr/hucebot/">Inria</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="2sya8elzmes">In this interview, Clone’s co-founder and CEO Dhanush Radhakrishnan discusses the company’s path to creating the synthetic humans straight out of science fiction.</p><p>(If YouTube brilliantly attempts to auto-dub this for you, switch the audio track to original [which YouTube thinks is Polish] and the video will still be in English.)</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2af28ac84dc5e5924f314e5ecd1f1940" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2SYA8ELZmEs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://clonerobotics.com/">Clone</a> ]</p><div 
class="horizontal-rule"></div><blockquote class="rm-anchors" id="jyflldzulu4"><em>This documentary takes you behind the scenes of the HMND 01 Alpha release: the breakthroughs, the failures, and the late nights of building the U.K.’s first industrial humanoid robot. </em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2c43cb390cb755d0e0f2f6621d47010e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/jyflLDzUlu4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://thehumanoid.ai/">Humanoid</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="pgbzetzx424"><em>What is the role of ethical considerations in the development and deployment of robotic and automation technologies, and what are the responsibilities of researchers to ensure that these technologies advance in ways that are transparent, fair, and aligned with the broader well-being of society?</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ea6a9e5cce37d63d68dc35f58029fe20" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pgBzETzX424?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://icra40.ieee.org/icra-2024/program/debates-and-panels/">ICRA@40</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="srpx2hkap98">This UPenn GRASP SFI lecture is from Tairan He at Nvidia on “Scalable Sim-to-Real Learning for General-Purpose Humanoid Skills.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0d8abdfe6c31b1afb6e3368449dd09b0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/sRpx2hkap98?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Humanoids represent the most versatile robotic platform, capable of walking, manipulating, and collaborating with people in human-centered environments. Yet despite recent advances, building humanoids that can operate reliably in the real world remains a fundamental challenge. Progress has been hindered by difficulties in whole-body control, robust perceptive reasoning, and bridging the sim-to-real gap. 
In this talk, I will discuss how scalable simulation and learning can systematically overcome these barriers.</em></blockquote><p>[ <a href="https://www.grasp.upenn.edu/events/fall-2025-grasp-sfi-tairan-he/">UPenn</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 19 Sep 2025 15:30:05 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-billion-humanoid-robots</guid><category>Robots</category><category>Video friday</category><category>Humanoid robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/nine-glossy-humanoid-robots-walking-in-sync-inside-a-spacious-industrial-room.png?id=61621100&amp;width=980"></media:content></item><item><title>Video Friday: A Soft Robot Companion</title><link>https://spectrum.ieee.org/video-friday-soft-robot-companion</link><description><![CDATA[
  14. <img src="https://spectrum.ieee.org/media-library/a-person-gently-touches-a-humanoid-robot-s-head-in-a-brightly-lit-room-with-colorful-posters.png?id=61594149&width=1245&height=700&coordinates=0%2C0%2C0%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://actuate.foxglove.dev/">ACTUATE 2025</a>: 23–24 September 2025, SAN FRANCISCO</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="https://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="bkzxjzlekru"><em>Fourier’s first Care-bot GR-3. This full-size “care bot” is designed as an interactive companion. Its soft-touch outer shell and multimodal emotional interaction system bring the concept of “warm tech companionship” to life.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="768d0fc535e35c9f7aac70048df886c2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/bkzXjzlEKrU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>I like that it’s <a href="https://spectrum.ieee.org/blossom-a-creative-handmade-approach-to-social-robotics-from-cornell-and-google" target="_blank">soft to the touch</a>, although I’m not sure that encouraging touch is safe. Reminds me a little bit of <a href="https://spectrum.ieee.org/nasa-jsc-unveils-valkyrie-drc-robot" target="_blank">Valkyrie</a>, where NASA put a lot of thought into the soft aspects of the robot.</p><p>[ <a href="https://www.fftai.com/products-gr3">Fourier</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="mweqy6dfzjm">TAKE MY MONEY</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a3250b7c6ca10e8036e064391036c260" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/mwEqY6DFzjM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>This 112-gram micro air vehicle (MAV) features foldable propeller arms that can lock into a compact rectangular profile comparable to the size of a smartphone. The vehicle can be launched by simply throwing it in the air, at which point the arms will unfold and autonomously stabilize to a hovering state. 
Multiple flight tests demonstrated the capability of the feedback controller to stabilize the MAV based on different initial conditions, including tumbling rates of up to 2,500 degrees per second.</em></blockquote><p>[ <a href="https://avfl.engr.tamu.edu/">AVFL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="aluhg2wm2ca"><em>The U.S. Naval Research Laboratory (NRL), in collaboration with NASA, is advancing space robotics by deploying reinforcement-learning algorithms onto <a href="https://robotsguide.com/robots/astrobee" target="_blank">Astrobee</a>, a free-flying robotic assistant on board the International Space Station. This video highlights how NRL researchers are leveraging artificial intelligence to enable robots to learn, adapt, and perform tasks autonomously. By integrating reinforcement learning, Astrobee can improve maneuverability and optimize energy use.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="35143f03ef586b1486f8a7fd72880ed0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ALUhG2Wm2CA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.nrl.navy.mil/Media/News/Article/4297593/reinforcement-learning-is-making-a-buzz-in-space/">NRL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="sbusoape43k">Every day I’m scuttlin.’</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d84bff9db01e5eff19f4e10994e34396" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/sbUSOAPe43k?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://groundcontrolrobotics.com/">Ground Control Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="uxhdhj1adr4"><em>Trust is built. Every part of our robot Proxie—from wheels to eyes—is designed with trust in mind. Cobot CEO Brad Porter explains the intent behind its design.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="550db0bcbd986d55bcb2e3d62764e5d4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/uXhDHj1aDr4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.co.bot/our-cobot">Cobot</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="-wp-kg7lcwk">Phase 1: Build lots of small quadruped robots. Phase 2: ? 
Phase 3: Profit!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8e80c7c0d0b8fde292d19064bffcc03d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/-WP-Kg7LcWk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">DEEP Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="1zttbdsslu4"><em>LAPP USA partnered with Corvus Robotics to solve a long-standing supply-chain challenge: labor-intensive, error-prone inventory counting.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ac92ed142b260f17a19f35eef9823afc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/1ZTtBDsslu4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.corvus-robotics.com/">Corvus</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="tn9zfda4sse">I’m pretty sure that 95 percent of all science consists of moving small amounts of liquid from one container to another.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0471e4f8ecff1ba048b044969d115224" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/tN9zFDa4ssE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.flexiv.com/">Flexiv</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="c2vvjzks6-q"><a href="https://spectrum.ieee.org/tag/raffaello-d-andrea" target="_blank">Raffaello D’Andrea</a>, interviewed at ICRA 2025.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="02705a05e54b1ba3a57f9d0dff0961c4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/c2VVJZkS6-Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.verity.net/">Verity</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="qyc11yl5bxc">Tessa Lau, interviewed at ICRA 2025.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0bb680f058771328a6021c53d2e5fd4b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/qyC11yl5bXc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dustyrobotics.com/">Dusty Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="6zs05ybplty"><em>Ever wanted to look inside the mind behind a cutting-edge humanoid robot? In this special episode, we have Dr. 
Aaron Zhang, the product manager at LimX Dynamics, for an exclusive deep dive into the LimX Oli.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5f0bb1f8cbb3fd8d7492527f64d577d1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/6zs05YBPlTY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.limxdynamics.com/en">LimX Dynamics</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 12 Sep 2025 17:04:10 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-soft-robot-companion</guid><category>Video friday</category><category>Robotics</category><category>Humanoid robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-person-gently-touches-a-humanoid-robot-s-head-in-a-brightly-lit-room-with-colorful-posters.png?id=61594149&amp;width=980"></media:content></item><item><title>Reality Is Ruining the Humanoid Robot Hype</title><link>https://spectrum.ieee.org/humanoid-robot-scaling</link><description><![CDATA[
  15. <img src="https://spectrum.ieee.org/media-library/robots-with-shovels-stand-in-a-container-ready-for-deployment-on-a-pallet.jpg?id=61564898&width=1245&height=700&coordinates=100%2C0%2C100%2C0"/><br/><br/><p><strong>Over the next several</strong> years, humanoid robots will change the nature of work. Or at least, that’s what humanoid robotics companies have been consistently promising, enabling them to raise <a href="https://www.linkedin.com/posts/andrakeay_agility-robotics-a-humanoid-robot-maker-activity-7312580510977769472-mifD/" rel="noopener noreferrer" target="_blank">hundreds</a> of <a href="https://apptronik.com/news-collection/apptronik-raises-350-million-in-series-a-funding" rel="noopener noreferrer" target="_blank">millions</a> of <a href="https://finance.yahoo.com/news/figure-ai-shakes-silicon-valley-140122922.html" rel="noopener noreferrer" target="_blank">dollars</a> at valuations that run into the billions.</p><p>Delivering on these promises will require a lot of robots. Agility Robotics expects to ship “<a href="https://www.bloomberg.com/news/videos/2025-03-04/2025-the-year-for-humanoid-robots-agility-robotics-ceo-video" rel="noopener noreferrer" target="_blank">hundreds</a>” of its Digit robots in 2025 and has a factory in Oregon capable of building <a href="https://spectrum.ieee.org/agility-humanoid-robotics-factory" target="_self">over 10,000</a> robots per year. Tesla <a href="https://www.wsj.com/business/autos/musk-tells-tesla-workers-dont-sell-your-shares-98691278" rel="noopener noreferrer" target="_blank">is planning </a>to produce 5,000 of its Optimus robots in 2025, and at least 50,000 in 2026. Figure believes “<a href="https://www.forbes.com/sites/johnkoetsier/2025/01/30/figure-plans-to-ship-100000-humanoid-robots-over-next-4-years/" rel="noopener noreferrer" target="_blank">there is a path to 100,000 robots</a>” by 2029. And these are just three of the largest companies in an increasingly crowded space.</p><div class="ieee-sidebar-small"><p>This article is part of <a href="https://spectrum.ieee.org/special-reports/scale/" target="_blank">The Scale Issue</a>.</p></div><p>Amplifying this message are many financial analysts: <a href="https://institute.bankofamerica.com/content/dam/transformation/humanoid-robots.pdf" rel="noopener noreferrer" target="_blank">Bank of America Global Research</a>, for example, predicts that global humanoid robot shipments will reach 18,000 units in 2025. And <a href="https://www.morganstanley.com/insights/articles/humanoid-robot-market-5-trillion-by-2050" rel="noopener noreferrer" target="_blank">Morgan Stanley Research estimates</a> that by 2050 there could be over 1 billion humanoid robots, part of a US $5 trillion market.</p><p>But as of now, the market for humanoid robots is almost entirely hypothetical. Even the most successful companies in this space have deployed only a small handful of robots in carefully controlled pilot projects. And future projections seem to be based on an extraordinarily broad interpretation of jobs that a capable, efficient, and safe humanoid robot—which does not currently exist—might conceivably be able to do. Can the current reality connect with the promised scale?</p><h2>What Will It Take to Scale Humanoid Robots?</h2><p>Physically building tens of thousands, or even hundreds of thousands, of humanoid robots is certainly possible in the near term. 
In 2023, <a href="https://ifr.org/ifr-press-releases/news/record-of-4-million-robots-working-in-factories-worldwide" rel="noopener noreferrer" target="_blank">on the order of 500,000 industrial robots were installed worldwide</a>. Under the basic assumption that a humanoid robot is approximately equivalent to four industrial arms in terms of components, existing supply chains should be able to support even the most optimistic near-term projections for humanoid manufacturing.</p><p>But simply building the robots is arguably the easiest part of scaling humanoids, says <a href="https://www.linkedin.com/in/meloneewise/" rel="noopener noreferrer" target="_blank">Melonee Wise</a>, who served as chief product officer at Agility Robotics until this month. “The bigger problem is demand—I don’t think anyone has found an application for humanoids that would require several thousand robots per facility.” Large deployments, Wise explains, are the most realistic way for a robotics company to scale its business, since onboarding any new client can take weeks or months. An alternative approach to deploying several thousand robots to do a single job is to deploy several hundred robots that can each do 10 jobs, which seems to be what most of the humanoid industry is betting on in the medium to long term.</p><p>While there’s a belief across much of the humanoid robotics industry that rapid progress in AI must somehow translate into rapid progress toward multipurpose robots, it’s <a href="https://spectrum.ieee.org/solve-robotics" target="_self">not</a> <a href="https://www.youtube.com/watch?v=PfvctjoMPk8" rel="noopener noreferrer" target="_blank">clear</a> how, when, or if that will happen. “I think what a lot of people are hoping for is they’re going to AI their way out of this,” says Wise. “But the reality of the situation is that currently AI is not robust enough to meet the requirements of the market.”</p><h2>Bringing Humanoid Robots to Market</h2><p><span>Market requirements for humanoid robots include a slew of extremely dull, extremely critical things like battery life, reliability, and safety. Of these, battery life is the most straightforward—for a robot to usefully do a job, it can’t spend most of its time charging. The next version of Agility’s Digit robot, which can handle payloads of up to 16 kilograms, includes a bulky “backpack” containing a battery with a charging ratio of 10 to 1: The robot can run for 90 minutes, and fully recharge in 9 minutes. Slimmer humanoid robots from other companies must necessarily be making compromises to maintain their svelte form factors.</span></p><p>In operation, Digit will probably spend a few minutes charging after running for 30 minutes. That’s because 60 minutes of Digit’s runtime is essentially a reserve in case something happens in its workspace that requires it to temporarily pause, a not-infrequent occurrence in the logistics and manufacturing environments that Agility is targeting. Without a 60-minute reserve, the robot would be much more likely to run out of power mid-task and need to be manually recharged. Consider what that might look like with even a modest deployment of several hundred robots weighing over a hundred kilograms each. “No one wants to deal with that,” comments Wise.</p><p>Potential customers for humanoid robots are very concerned with downtime. Over the course of a month, a factory operating at 99 percent reliability will see approximately 5 hours of downtime. 
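</p><p>The arithmetic behind those reliability figures is simple enough to sketch. The example below assumes a facility that runs around the clock for roughly 21 days a month, about 500 operating hours; that operating window is an assumption made for illustration, not a figure from Agility or from Wise.</p><pre><code>
# Back-of-the-envelope downtime math (illustrative assumptions only).
def downtime_hours(availability, operating_hours=500.0):
    """Expected downtime over one period at a given availability."""
    return (1.0 - availability) * operating_hours

print(downtime_hours(0.99))    # ~5 hours per month at 99 percent
print(downtime_hours(0.9999))  # ~0.05 hours (about 3 minutes) at 99.99 percent
</code></pre><p>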
Wise says that any downtime that stops something like a production line can cost tens of thousands of dollars per minute, which is why many industrial customers expect a couple more 9s of reliability: 99.99 percent. Wise says that Agility has demonstrated this level of reliability in some specific applications, but not in the context of multipurpose or general-purpose functionality.</p><h2>Humanoid Robot Safety</h2><p><span>A humanoid robot in an industrial environment must meet general </span><a href="https://blog.ansi.org/ansi/ansi-b11-standards-safety-of-machinery/" target="_blank">safety</a><span> </span><a href="https://osha.europa.eu/en/legislation/directives/directive-2006-42-ec-of-the-european-parliament-and-of-the-council" target="_blank">requirements</a><span> for industrial machines. In the past, robotic systems like autonomous vehicles and drones have benefited from immature regulatory environments to scale quickly. But Wise says that approach can’t work for humanoids, because the industry is already heavily regulated—the robot is simply considered another piece of machinery.</span></p><p>There are also more specific<a href="https://webstore.ansi.org/standards/ria/ansiriar15082020" target="_blank"> safety standards</a> currently under development for humanoid robots, explains Matt Powers, associate director of autonomy R&D at Boston Dynamics. He notes that his company is helping develop an <a href="https://www.iso.org/standard/91469.html" target="_blank">International Organization for Standardization (ISO) safety standard for dynamically balancing legged robots</a>. “We’re very happy that the top players in the field, like Agility and Figure, are joining us in developing a way to explain why we believe that the systems that we’re deploying are safe,” Powers says.</p><p>These standards are necessary because the traditional safety approach of cutting power may not be a good option for a dynamically balancing system. Doing so will cause a humanoid robot to fall over, potentially making the situation even worse. There is no simple solution to this problem, and the initial approach that Boston Dynamics expects to take with its Atlas robot is to keep the robot out of situations where simply powering it off might not be the best option. “We’re going to start with relatively low-risk deployments, and then expand as we build confidence in our safety systems,” Powers says. “I think a methodical approach is really going to be the winner here.”</p><p>In practice, low risk means keeping humanoid robots away from people. But humanoids that are restricted by what jobs they can safely do and where they can safely move are going to have more trouble finding tasks that provide value.</p><h2>Are Humanoids the Answer?</h2><p><span>The issues of demand, battery life, reliability, and safety all need to be solved before humanoid robots can scale. But a more fundamental question to ask is whether a bipedal robot is actually worth the trouble.</span></p><p>Dynamic balancing with legs would theoretically enable these robots to navigate complex environments like a human. Yet demo videos show these humanoid robots as either mostly stationary or repetitively moving short distances over flat floors. The promise is that what we’re seeing now is just the first step toward humanlike mobility. 
But in the short to medium term, there are much more reliable, efficient, and cost-effective platforms that can take over in these situations: robots with arms, but with wheels instead of legs.</p><p>Safe and reliable humanoid robots have the potential to revolutionize the labor market at some point in the future. But potential is just that, and despite the humanoid enthusiasm, we have to be realistic about what it will take to turn potential into reality. <span class="ieee-end-mark"></span></p><p><em>This article appears in the October 2025 print issue as “Why Humanoid Robots Aren’t Scaling.”</em></p>]]></description><pubDate>Thu, 11 Sep 2025 13:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/humanoid-robot-scaling</guid><category>Humanoid robots</category><category>Scale issue</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/robots-with-shovels-stand-in-a-container-ready-for-deployment-on-a-pallet.jpg?id=61564898&amp;width=980"></media:content></item><item><title>How Robotics Is Powering the Future of Innovation</title><link>https://content.knowledgehub.wiley.com/from-concept-to-reality-how-robotics-is-transforming-our-world/</link><description><![CDATA[
  16. <img src="https://spectrum.ieee.org/media-library/heilind-molex-logo-bold-blue-and-red-text-with-a-yellow-line-under-heilind.png?id=61582070&width=980"/><br/><br/><p>The future of robotics is being shaped by powerful technologies like AI, edge computing, and high-speed connectivity, driving smarter, more responsive machines across industries. Robots are no longer confined to static environments—they are evolving to interact dynamically with humans and their surroundings.</p><p>This eBook explores the impact of robotics in diverse fields, from home automation and medical technology to automotive, data centers, and industrial applications. It highlights challenges like power efficiency, miniaturization, and ruggedization, while showcasing Molex’s innovative solutions tailored for each domain.</p><p>Additionally, the eBook covers:</p><ul><li>Ruggedized connectors for harsh industrial settings</li><li>Advanced power management for home robots</li><li>Miniaturized systems for precision medical robotics</li><li>5G/6G-enabled autonomous vehicles</li><li>High-speed data solutions for cloud infrastructure</li></ul><div><span><a href="https://content.knowledgehub.wiley.com/from-concept-to-reality-how-robotics-is-transforming-our-world/" target="_blank">Download this free whitepaper now!</a></span></div>]]></description><pubDate>Thu, 11 Sep 2025 10:00:02 +0000</pubDate><guid>https://content.knowledgehub.wiley.com/from-concept-to-reality-how-robotics-is-transforming-our-world/</guid><category>Type:whitepaper</category><category>Innovation</category><category>Robotics</category><category>Edge computing</category><category>Artificial intelligence</category><dc:creator>Heilind Electronics</dc:creator><media:content medium="image" type="image/png" url="https://assets.rbl.ms/61582070/origin.png"></media:content></item><item><title>Large Behavior Models Are Helping Atlas Get to Work</title><link>https://spectrum.ieee.org/boston-dynamics-atlas-scott-kuindersma</link><description><![CDATA[
  17. <img src="https://spectrum.ieee.org/media-library/boston-dynamics-robot-operating-autonomously-near-stacked-robotic-components-on-a-shelf.png?id=61540463&width=1245&height=700&coordinates=0%2C122%2C0%2C122"/><br/><br/><p><a href="https://spectrum.ieee.org/tag/boston-dynamics" target="_blank">Boston Dynamics</a> can be forgiven, I think, for the relative lack of acrobatic prowess displayed by the <a href="https://spectrum.ieee.org/atlas-humanoid-robot" target="_blank">new version of Atlas</a> in (<a href="https://www.youtube.com/watch?v=I44_zbEwz_w" rel="noopener noreferrer" target="_blank">most of</a>) its latest videos. In fact, if you look at <a href="https://www.youtube.com/watch?v=F_7IPm7f1vI" rel="noopener noreferrer" target="_blank">this Atlas video</a> from late last year and compare it to <a href="https://www.youtube.com/watch?v=HYwekersccY" rel="noopener noreferrer" target="_blank">Atlas’s most recent video</a>, it’s doing what looks to be more or less the same logistics-y stuff—all of which is far less visually exciting than backflips. </p><p>But I would argue that the relatively dull tasks Atlas is working on now, moving car parts and totes and whatnot, are just as impressive. Making a humanoid that can consistently and economically and safely do useful things over the long term could very well be the hardest problem in robotics right now, and Boston Dynamics is taking it seriously. </p><p>Last October, <a href="https://spectrum.ieee.org/boston-dynamics-toyota-research" target="_self">Boston Dynamics announced a partnership with Toyota Research Institute</a> with the goal of general-purpose-izing Atlas. We’re now starting to see the results of that partnership, and Boston Dynamics’ vice president of robotics research, <a href="https://www.linkedin.com/in/scott-kuindersma-06a38152/" target="_blank">Scott Kuindersma</a>, takes us through the progress they’ve made.</p><h2>Building AI Generalist Robots</h2><p>While the context of this work is “building AI generalist robots,” I’m not sure that anyone really knows what a “generalist robot” would actually look like, or how we’ll even know when someone has achieved it. Humans are generalists, sort of—we can potentially do a lot of things, and we’re fairly adaptable and flexible in many situations, but we still require training for most tasks. I bring this up just to try and contextualize expectations, because I think a successful humanoid robot doesn’t have to actually be a generalist, but instead just has to be capable of doing several different kinds of tasks, and to be adaptable and flexible in the context of those tasks. And that’s already difficult enough.</p><p>The approach that the two companies are taking is to leverage large behavior models (LBMs), which combine more general world knowledge with specific task knowledge to help Atlas with that adaptability and flexibility thing. As Boston Dynamics points out in <a href="https://bostondynamics.com/blog/large-behavior-models-atlas-find-new-footing/" rel="noopener noreferrer" target="_blank">a recent blog post</a>, “the field is steadily accumulating evidence that policies trained on a large corpus of diverse task data can generalize and recover better than specialist policies that are trained to solve one or a small number of tasks.” Essentially, the goal is to develop a foundational policy that covers things like movement and manipulation, and then add more specific training (provided by humans) on top of that for specific tasks. 
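</p><p>In skeleton form, training this kind of policy looks like ordinary supervised learning on demonstration data. The sketch below is a generic illustration of language-conditioned behavior cloning; the network sizes, encoders, and data fields are assumptions made for the example and do not describe Boston Dynamics’ or Toyota Research Institute’s actual models.</p><pre><code>
import torch
import torch.nn as nn

class BehaviorCloningPolicy(nn.Module):
    """Toy language-conditioned policy: observation + instruction features to actions."""
    def __init__(self, obs_dim=512, lang_dim=256, act_dim=30):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim + lang_dim, 1024), nn.ReLU(),
            nn.Linear(1024, 1024), nn.ReLU(),
            nn.Linear(1024, act_dim),  # act_dim stands in for a teleop command vector
        )

    def forward(self, obs_feat, lang_feat):
        return self.net(torch.cat([obs_feat, lang_feat], dim=-1))

# One gradient step of behavior cloning: match the operator's recorded commands.
policy = BehaviorCloningPolicy()
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-4)

obs_feat = torch.randn(64, 512)      # stand-in for encoded camera images + proprioception
lang_feat = torch.randn(64, 256)     # stand-in for an encoded instruction, e.g. 'place the part in the bin'
expert_action = torch.randn(64, 30)  # teleop commands recorded during demonstration

loss = nn.functional.mse_loss(policy(obs_feat, lang_feat), expert_action)
optimizer.zero_grad()
loss.backward()
optimizer.step()
</code></pre><p>A large behavior model is, roughly, this recipe at scale: the same imitation objective applied across many tasks, many demonstrations, and in some cases many robot embodiments, with the hope that the single resulting policy generalizes better than a per-task specialist. 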
This video below shows how that’s going so far.</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="872cd3062fae98c99846ace67ee03822" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/HYwekersccY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://www.youtube.com/watch?v=HYwekersccY" rel="noopener noreferrer" target="_blank">Boston Dynamics/YouTube</a></small></p><p><span>What the video doesn’t show is the training system that Boston Dynamics uses to teach Atlas to do these tasks. The training is essentially imitation learning: an operator wearing a motion-tracking system teleoperates Atlas through motion and manipulation tasks. There’s a one-to-one mapping between the operator and the robot, making it fairly intuitive, although as anyone who has tried to teleoperate a robot with a surfeit of degrees of freedom can attest, it takes some practice to do it well. </span></p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Robot and VR user interact in a lab workspace." class="rm-shortcode" data-rm-shortcode-id="b7017320bd3641e671dfcb01d6703a04" data-rm-shortcode-name="rebelmouse-image" id="d9f54" loading="lazy" src="https://spectrum.ieee.org/media-library/robot-and-vr-user-interact-in-a-lab-workspace.png?id=61540461&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">A motion-tracking system provides high-quality task training data for Atlas.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Boston Dynamics</small></p><p><span>This interface provides very high-quality demonstration data for Atlas, but it’s not the easiest to scale—just one of the challenges of deploying a </span><span>multipurpose (different than generalist!) humanoid.</span></p><p>For more about what’s going on behind the scenes in this video and Boston Dynamics’ strategy with Atlas, <em>IEEE Spectrum</em> spoke with Kuindersma.</p><p class="rm-anchors" id="top">Scott Kuindersma on:</p><ul><li><a href="#new">What’s new from Boston Dynamics and Toyota Research Institute</a></li><li><a href="#lbm">The role of large behavior models</a></li><li><a href="#unique">Learning through human imitation</a></li><li><a href="#limit">The potential limitations of imitating humans</a></li><li><a href="#data">The importance of high-quality data</a></li><li><a href="#next">The future for Atlas</a></li></ul><p class="rm-anchors" id="new"><strong>In <a href="https://www.youtube.com/watch?v=F_7IPm7f1vI" target="_blank">a video from last October</a>, just as your partnership with Toyota Research Institute was beginning, Atlas was shown moving parts around and performing whole-body manipulation. What’s the key difference between that demonstration and what we’re seeing in the new video? </strong></p><p><strong>Scott Kuindersma: </strong>The big difference is how we programmed the behavior. The previous system was a more traditional robotics stack involving a combination of model-based controllers, planners, and machine-learning models for perception, all architected together to do end-to-end manipulation. Programming a new task on that system generally required roboticists or system integrators to touch code and tell the robot what to do. 
</p><p>For this new video, we replaced most of that system with a single neural network that was trained on demonstration data. This is much more flexible because there’s no task-specific programming or other open-ended creative engineering required. Basically, if you can teleoperate the robot to do a task, you can train the network to reproduce that behavior. This approach is more flexible and scalable because it allows people without advanced degrees in robotics to “program” the robot.</p><p><a href="#top">Back to top</a></p><p class="rm-anchors" id="lbm"><strong>We’re talking about a large behavior model (LBM) here, right? What would you call the kind of learning that this model does?</strong></p><p><strong>Kuindersma: </strong>It is a kind of imitation learning. We collect many teleoperation demonstrations and train a neural network to reproduce the input-output behaviors in the data. The inputs are things like raw robot camera images, natural language descriptions of the task, and proprioception, and the outputs are the same teleop commands sent by the human interface.</p><p>What makes it a large behavior model is that we collect data from many different tasks and, in some cases, many different robot embodiments, using all of that as training data for the robot to end up with a single policy that knows how to do many things. The idea is that by training the network on a much wider variety of data and tasks and robots, its ability to generalize will be better. As a field, we are still in the early days of gathering evidence that this is actually the case (our [Toyota Research Institute] collaborators are <a href="https://toyotaresearchinstitute.github.io/lbm1/" target="_blank">among those leading the charge</a>), but we expect it is true based on the empirical trends we see in robotics and other AI domains.</p><p><strong>So the idea with the behavior model is that it will be more generalizable, more adaptable, or require less training because it will have a baseline understanding of how things work?</strong></p><p><strong>Kuindersma: </strong>Exactly, that’s the idea. At a certain scale, once the model has seen enough through its training data, it should have some ability to take what it’s learned from one set of tasks and apply those learnings to new tasks. One of the things that makes these models flexible is that they are conditioned on language. We collect teleop demonstrations and then post-annotate that data with language, having humans or language models describing in English what is happening. The network then learns to associate these language prompts with the robot’s behaviors. Then, you can tell the model what to do in English, and it has a chance of actually doing it. At a certain scale, we hope it won’t take hundreds of demonstrations for the robot to do a task—maybe only a couple—and maybe way in the future, you might be able to just tell the robot what to do in English, and it will know how to do it, even if the task requires dexterity beyond simple object pick-and-place.</p><p><a href="#top">Back to top</a></p><p class="rm-anchors" id="unique"><strong>There are a lot of robot videos out there of robots doing stuff that might look similar to what we’re seeing here. Can you tell me how what Boston Dynamics and Toyota Research Institute are doing is unique?</strong></p><p><strong>Kuindersma: </strong>Many groups are using AI tools for robot demos, but there are some differences in our strategic approach. 
From our perspective, it’s crucial for the robot to perform the full breadth of humanoid manipulation tasks. That means if you use a data-driven approach, you need to somehow funnel those embodied experiences into the dataset you’re using to train the model. We spent a lot of time building a highly expressive teleop interface for Atlas, which allows operators to move the robot around quickly, take steps, balance on one foot, reach the floor and high shelves, throw and catch things, and so on.</p><p>The ability to directly mirror a human body in real time is vital for Atlas to act like a real humanoid laborer. If you’re just standing in front of a table and moving things around, sure, you can do that with a humanoid, but you can do it with much cheaper and simpler robots, too. If you instead want to, say, bend down and pick up something from between your legs, you have to make careful adjustments to the entire body while doing manipulation. The tasks we’ve been working on with Atlas over the last couple of months have been focused more on collecting this type of data, and we’re committed to making these AI models extremely performant so the motions are smooth, fast, beautiful, and fully cover what humanoids can do.</p><p><strong>Is it a constraint that you’re using imitation learning, given that Atlas is built to move in ways that humans can’t? How do you expand the operating envelope with this kind of training? </strong></p><p><strong>Kuindersma: </strong>That’s a great question. There are a few ways to think about it:</p><ul><li>Atlas can certainly do things like continuous joint rotation that people can’t. While those capabilities might offer efficiency benefits, I would argue that if Atlas <em>only</em> behaved exactly like a competent human, that would be amazing, and we would be very happy with that.</li><li>We could extend our teleop interface to make available types of motions the robot can do but a person can’t. The downside is this would probably make teleoperation less intuitive, requiring a more highly trained expert, which reduces scalability.</li><li>We may be able to co-train our large behavior models with data sources that are not just teleoperation-based. For example, in simulation, you could use rollouts from reinforcement learning policies or programmatic planners as augmented demonstrations that include these high-range-of-motion capabilities. The LBM can then learn to leverage that in conjunction with teleop demonstrations. This is not just a hypothetical; we’ve actually found that co-training with simulation data has improved performance in the real robot, which is quite promising.</li></ul><p><strong>Can you tell me what Atlas was directed to do in the video? Is it primarily trying to mirror its human-based training, or does it have some capacity to make decisions?</strong> </p><p><strong>Kuindersma:</strong> In this case, Atlas is responding primarily to visual and language cues to perform the task. At our current scale and with the model’s training, there’s a limited ability to completely innovate behaviors. However, you can see a lot of variety and responsiveness in the details of the motion, such as where specific parts are in the bin or where the bin itself is. 
As long as those experiences are reflected somewhere in the training data, the robot uses its real-time sensor observations to produce the right type of response.</p><p><strong>So, if the bin was too far away for the robot to reach, without specific training, would it move itself to the bin?</strong> </p><p><strong>Kuindersma: </strong>We haven’t done that experiment, but if the bin was too far away, I think it might take a step forward because we varied the initial conditions of the bin when we collected data, which sometimes required the operator to walk the robot to the bin. So there is a good chance that it would step forward, but there is also a small chance that it might try to reach and not succeed. It can be hard to make confident predictions about model behavior without running experiments, which is one of the fun features of working with models like this.</p><p><a href="#top">Back to top</a></p><p class="rm-anchors" id="limit"><strong>It’s interesting how a large behavior model, which provides world knowledge and flexibility, interacts with this instance of imitation learning, where the robot tries to mimic specific human actions. How much flexibility can the system take on when it’s operating based on human imitation?</strong></p><p><strong>Kuindersma:</strong> It’s primarily a question of scale. A large behavior model is essentially imitation learning at scale, similar to a large language model. The hypothesis with large behavior models is that as they scale, generalization capabilities improve, allowing them to handle more real-world corner cases and require less training data for new tasks. Currently, the generalization of these models is limited, but we’re addressing that by gathering more data not only through teleoperating robots but also by exploring other scaling bets like non-teleop human demonstrations and sim/synthetic data. These other sources might have more of an “embodiment gap” to the robot, but the model’s ability to assimilate and translate between data sources could lead to better generalization.</p><p><strong>How much skill or experience does it take to effectively train Atlas through teleoperation?</strong> </p><p><strong>Kuindersma: </strong>We’ve had people on day tours jump in and do some teleop, moving the robot and picking things up. This ease of entry is thanks to our teams building a really nice interface: The user wears a VR headset, where they’re looking at a re-projection of the robot’s stereo RGB cameras, which are aligned to provide a 3D sense of vision, and there are built-in visual augmentations like desired hand locations and what the robot is actually doing to give people situational awareness.</p><p>So novice users can do things fairly easily; they’re probably not generating the highest-quality motions for training policies. To generate high-quality data, and to do that consistently over a period of several hours, it typically takes a couple of weeks of onboarding. We usually start with manipulation tasks and then progress to tasks involving repositioning the entire robot. It’s not trivial, but it’s doable. The people doing it now are not roboticists; we have a team of “robot teachers” who are hired for this, and they’re awesome. It gives us a lot of hope for scaling up the operation as we build more robots.</p><p><a href="#top">Back to top</a></p><p class="rm-anchors" id="data"><strong>How is what you’re doing different from other companies that might lean much harder on scaling through simulation? 
Are you focusing more on how humans do things?</strong></p><p><strong>Kuindersma: </strong>Many groups are doing similar things, with differences in technical approach, platform, and data strategy. You can characterize the strategies people are taking by thinking about a “data pyramid,” where the top of the pyramid is the highest-quality, hardest-to-get data, which is typically teleoperation on the robot you’re working with. The middle of the pyramid might be egocentric data collected on people (e.g., by wearing sensorized gloves), simulation data, or other synthetic world models. And the bottom of the pyramid is data from YouTube or the rest of the Internet. </p><p>Different groups allocate finite resources to different distributions of these data sources. For us, we believe it’s really important to have as large a baseline of actual on-robot data (at the top of the pyramid) as possible. Simulation and synthetic data are almost certainly part of the puzzle, and we’re investing resources there, but we’re taking a somewhat balanced data strategy rather than throwing all of our eggs in one basket.</p><p><strong>Ideally you want the top of the pyramid to be as big as possible, right? </strong></p><p><strong>Kuindersma:</strong> Ideally, yes. But you won’t get to the scale you need by just doing that. You need the whole pyramid, but having as much high-quality data at the top as possible only helps.</p><p><strong>But it’s not like you can just have a super-large bottom to the pyramid and not need the top?</strong></p><p><strong>Kuindersma: </strong>I don’t think so. I believe there needs to be enough high-quality data for these models to effectively translate into the specific embodiment that they are executing on. There needs to be enough of that “top” data for the translation to happen, but no one knows the exact distribution, like whether you need 5 percent real robot data and 95 percent simulation, or some other ratio.</p><p><a href="#top">Back to top</a></p><p class="rm-anchors" id="next"><strong>Is that a box of “<a href="https://punyo.tech/" target="_blank">Puny-os</a>” on the shelf in the video?</strong></p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-right rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: right;"> <img alt="Robot handling a box beside a Boston Dynamics robot dog on a shelf." class="rm-shortcode" data-rm-shortcode-id="7de5bad987fe85e869dd762e07c2b7a9" data-rm-shortcode-name="rebelmouse-image" id="b4601" loading="lazy" src="https://spectrum.ieee.org/media-library/robot-handling-a-box-beside-a-boston-dynamics-robot-dog-on-a-shelf.png?id=61540466&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Part of this self-balancing robot.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Boston Dynamics</small></p><p><strong>Kuindersma: </strong>Yeah! Alex Alspach from [Toyota Research Institute] brought it in to put in the background as an easter egg. </p><p><strong>What’s next for Atlas?</strong></p><p><strong>Kuindersma: </strong>We’re really focused on maximizing the performance of manipulation behaviors. I think one of the things that we’re uniquely positioned to do well is reaching the full behavioral envelope of humanoids, including mobile bimanual manipulation, repetitive tasks, and strength, and getting the robot to move smoothly and dynamically using these models. 
We’re also developing repeatable processes to climb the robustness curve for these policies—we think reinforcement learning may play a key role in achieving this. </p><p>We’re also looking at other types of scaling bets around these systems. Yes, it’s going to be very important that we have a lot of high-quality on-robot on-task data that we’re using as part of training these models. But we also think there are real opportunities in being able to leverage other data sources, whether that’s observing or instrumenting human workers or scaling up synthetic and simulation data, and understanding how those things can mix together to improve the performance of our models.</p><p><a href="#top">Back to top</a></p><p><em>This article appears in the November 2025 print issue as “Atlas Is on Its Best Behavior (Model.)”</em></p>]]></description><pubDate>Sun, 07 Sep 2025 13:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/boston-dynamics-atlas-scott-kuindersma</guid><category>Boston dynamics</category><category>Humanoid robots</category><category>Toyota research institute</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/boston-dynamics-robot-operating-autonomously-near-stacked-robotic-components-on-a-shelf.png?id=61540463&amp;width=980"></media:content></item><item><title>Video Friday: Robot Vacuum Climbs Stairs</title><link>https://spectrum.ieee.org/video-friday-eufy-robot-vacuum</link><description><![CDATA[
  18. <img src="https://spectrum.ieee.org/media-library/robotic-vacuum-from-eufy-on-a-dimly-lit-staircase-in-black-surroundings.png?id=61558584&width=1245&height=700&coordinates=0%2C8%2C0%2C8"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://actuate.foxglove.dev/">ACTUATE 2025</a>: 23–24 September 2025, SAN FRANCISCO</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="http://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="pcunho0cy9q">This is ridiculous, and I love it.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1d8dc2352eddb99aac790509c3e1140f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/PcunhO0cy9Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.eufy.com/">Eufy</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="6epzte9ctzy"><em>At ICRA 2024, We met Paul Nadan to learn about how his LORIS robot climbs up walls by sticking itself to rocks.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="864971ef74d0a648d4a29547ec606c26" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/6EPzTe9cTzY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://paulnadan.com/">CMU</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="8gfuuzdn4q8">If a <a data-linked-post="2666662286" href="https://spectrum.ieee.org/humanoid-robots" target="_blank">humanoid robot</a> is going to load my dishwasher, I expect it to do so optimally, not all haphazardly like a puny human.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a103243a17e5da4c2bd987c599c074dd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8gfuUzDn4Q8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.figure.ai/">Figure</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="tofpkw6d3ge"><em>Humanoid robots have recently achieved impressive progress in locomotion and whole-body control, yet they remain constrained in tasks that demand rapid interaction with dynamic environments through manipulation. 
Table tennis exemplifies such a challenge: With ball speeds exceeding 5 m/s, players must perceive, predict, and act within sub-second reaction times, requiring both agility and precision. To address this, we present a hierarchical framework for humanoid table tennis that integrates a model-based planner for ball trajectory prediction and racket target planning with a reinforcement learning–based whole-body controller.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="716557784fc438288eb1a94f2867ab6e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/tOfPKW6D3gE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://hybrid-robotics.berkeley.edu/">Hybrid Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="8ebcsdnsgsg"><em>Despite their promise, today’s <a data-linked-post="2650274945" href="https://spectrum.ieee.org/eu-project-developing-symbiotic-robotplant-biohybrids" target="_blank">biohybrid robots</a> typically underperform their fully synthetic counterparts and their potential as predicted from a reductionist assessment of constituents. Many systems represent enticing proofs of concept with limited practical applicability.  Most remain confined to controlled laboratory settings and lack feasibility in complex real-world environments. Developing biohybrid robots is currently a painstaking, bespoke process, and the resulting systems are routinely inadequately characterized. Complex, intertwined relationships between component, interface, and system performance are poorly understood, and methodologies to guide informed design of biohybrid systems are lacking. The HyBRIDS ARC opportunity seeks ideas to address the question: How can synthetic and biological components be integrated to enable biohybrid platforms that outperform traditional robotic systems?</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6c654c545d057ef0b659d7a5c0863c91" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8eBcSDnSgsg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.darpa.mil/research/programs/hybrids">DARPA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="vw7z5_sm7xa"><em>Robotic systems will play a key role in future lunar missions, and a great deal of research is currently being conducted in this area. One such project is SAMLER-KI (Semi-Autonomous Micro Rover for Lunar Exploration Using Artificial Intelligence), a collaboration between the German Research Center for Artificial Intelligence (DFKI) and the University of Applied Sciences Aachen (FH Aachen), Germany. The project focuses on the conceptual design of a semi-autonomous micro rover that is capable of surviving lunar nights while remaining within the size class of a micro rover. 
During development, conditions on the moon such as dust exposure, radiation, and the vacuum of space are taken into account, along with the 14-Earth-day duration of a lunar night.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c0e7cd40ad1ccd5b5d48a4a47340c445" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/VW7Z5_sm7xA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotik.dfki-bremen.de/en/research/projects/samler-ki">DFKI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="8qw5l_yovz8"><em>ARMstrong Dex is a human-scale dual-arm hydraulic robot developed by the Korea Atomic Energy Research Institute (KAERI) for disaster response applications. It is capable of lifting its own body through vertical pull-ups and manipulating objects over 50 kilograms, demonstrating strength beyond human capabilities. In this test, ARMstrong Dex used a handheld saw to cut through a thick 40×90 millimeter wood beam. Sawing is a physically demanding task involving repetitive force application, fine trajectory control, and real-time coordination.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2e43050095b9b311b771a310bc8478a9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8QW5L_yOVZ8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.kaeri.re.kr/eng/">KAERI</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ox9003uok9s">This robot stole my “OMG I HAVE JUICE” face.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="59e62bac81e6d2ba46238e522854c327" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/oX9003UOK9s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pudurobotics.com/en/products/flashbot-arm">Pudu Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="wjbxhdskx4g">The best way of dodging a punch to the face is to just have a big hole where your face should be.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d8c438cc6c93cc41484711af6dc141cc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/WjBXhdSkx4g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>I do wish they wouldn’t call it a combat robot, though.</p><p>[ <a href="https://www.unitree.com/">Unitree</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="p_tfzznhaqs">It really might be fun to have a DRC-style event for quadrupeds.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2eeb9989f52a0cc7da56df744bfaffd8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" 
lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/P_tfzznhAqs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">DEEP Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="zvnq0qirvp8"><em>CMU researchers are developing new technology to enable robots to physically interact with people who are not able to care for themselves. These breakthroughs are being deployed in the real world, making it possible for individuals with neurological diseases, stroke, multiple sclerosis, ALS, and dementia to be able to eat, clean, and get dressed fully on their own.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ef18f525ec90f73de852f4045d1fef52" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ZVNQ0qIrvP8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://rchi-lab.github.io/">CMU</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="acvscpemw4y"><em>Caracol’s additive manufacturing platforms use KUKA robotic arms to produce large-scale industrial parts with precision and flexibility. This video outlines how Caracol integrates multi-axis robotics, modular extruders, and proprietary software to support production in sectors like aerospace, marine, automotive, and architecture.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4255eb367f3abee372e7e4daeea9691a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/AcvscpeMw4Y?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.kuka.com/it-it/settori/banca-dati-di-soluzioni/2025/09/caracol-case-study">KUKA</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="chyxgtjvn0i">There were a couple of robots at ICRA 2025, as you might expect.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ccc460cb8b8b8133b6e395a3954eb4fd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ChYXgTjVn0I?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://2026.ieee-icra.org/">ICRA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="mow0mvvlpc0"><em>On June 6, 1990, following the conclusion of Voyager’s planetary explorations, mission representatives held a news conference at NASA’s Jet Propulsion Laboratory in Southern California to summarize key findings and answer questions from the media. In the briefing, Voyager’s longtime project scientist Ed Stone, along with renowned science communicator Carl Sagan, also revealed the mission’s  “Solar System Family Portrait,” a mosaic comprising images of six of the solar system’s eight planets. 
Sagan was a member of the Voyager imaging team and instrumental in capturing these images and bringing them to the public.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ce23b7d63acaaa33f595b2d7eb8111f5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/MoW0MVVLPc0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Carl Sagan, man. Carl Sagan. Blue Dot unveil was right around 57:00, if you missed it.</p><p>[ <a href="https://www.jpl.nasa.gov/news/vintage-nasa-see-voyagers-1990-solar-system-family-portrait-debut/">JPL</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 05 Sep 2025 16:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-eufy-robot-vacuum</guid><category>Video friday</category><category>Robotics</category><category>Robots</category><category>Vacuum robots</category><category>Humanoid robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/robotic-vacuum-from-eufy-on-a-dimly-lit-staircase-in-black-surroundings.png?id=61558584&amp;width=980"></media:content></item><item><title>Do People Really Want Humanoid Robots in Their Homes?</title><link>https://spectrum.ieee.org/home-humanoid-robots-survey</link><description><![CDATA[
  19. <img src="https://spectrum.ieee.org/media-library/three-by-three-grid-showing-various-humanoid-robots-performing-household-tasks-such-as-watering-plants-vacuuming-and-loading-l.jpg?id=61534540&width=1245&height=700&coordinates=0%2C428%2C0%2C429"/><br/><br/><p>I’ve been teaching robotics at the University of Washington for more than a decade. Every class begins with “robotics news of the week.” For years, humanoid robots appeared only occasionally—usually in the form of viral clips of Boston Dynamics’ Atlas doing parkour or RoboCup humanoid league bloopers that served more as comic relief than serious news.</p><p>But over the past few years, things have shifted. Each week brings another humanoid demo, each flashier than the last, as companies race to showcase new capabilities. And behind those slick videos lies a flood of venture capital. Humanoid robotics has become a billion-dollar frenzy.</p><p>The scale of investment is astonishing. Just a year ago, <a href="https://www.cnbc.com/2024/02/29/robot-startup-figure-valued-at-2point6-billion-by-bezos-amazon-nvidia.html" rel="noopener noreferrer" target="_blank">Figure AI’s $2.6 billion valuation</a> seemed extraordinary—until its latest funding round catapulted it to <a href="https://techfundingnews.com/figure-ai-to-grab-1-5b-funding-at-39-5b-valuation-eyes-to-produce-100000-robots-what-about-competition/" rel="noopener noreferrer" target="_blank">$39.5 billion</a>. Investors large and small are rushing in, and tech giants like Microsoft, Amazon, OpenAI, and Nvidia are scrambling to get a foothold for fear of missing out. Tesla is <a href="https://fortune.com/2025/03/23/tesla-billion-gone-astray-questions-controls/" rel="noopener noreferrer" target="_blank">pouring resources into its Optimus robot</a>, while China has committed more than <a href="https://www.cnn.com/2025/03/25/tech/china-robots-market-competitiveness-intl-hnk/index.html" rel="noopener noreferrer" target="_blank">$10 billion in government funding</a> to drive down costs and seize market dominance. Goldman Sachs now projects the global humanoid market could reach <a href="https://www.goldmansachs.com/insights/articles/the-global-market-for-robots-could-reach-38-billion-by-2035" rel="noopener noreferrer" target="_blank">$38 billion by 2035</a>.</p><p>This surge of interest reflects a long-standing dream in robotics: If machines could match human form and function, they could simply step into human jobs without requiring us to change our environments. If humanoids could do everything people can, then in theory they could replace workers on the factory floor or in warehouse aisles. It’s no surprise, then, that many humanoid companies are targeting what they believe are sectors with labor shortages and undesirable jobs—<a href="https://bostondynamics.com/webinars/why-humanoids-are-the-future-of-manufacturing/" rel="noopener noreferrer" target="_blank">manufacturing</a>, <a href="https://www.agilityrobotics.com/industries/third-party-logistics" rel="noopener noreferrer" target="_blank">logistics</a>, <a href="https://www.agilityrobotics.com/industries/distribution" rel="noopener noreferrer" target="_blank">distribution</a>, <a href="https://apptronik.com/industries/retail" rel="noopener noreferrer" target="_blank">retail</a>—as near-term markets.</p><h2>Factories first, homes next?</h2><p><span></span><span>A subset of humanoid companies see homes as the next frontier. 
</span><a href="https://www.figure.ai/master-plan" target="_blank">Figure AI claims</a><span> humanoids will revolutionize “assisting individuals in the home” and “caring for the elderly.” Its marketing materials show robots handing an apple to a human, making coffee, putting away groceries and dishes, pouring drinks, and watering plants. </span><a href="https://www.youtube.com/live/6v6dbxPlsXs?t=1256s" target="_blank">Tesla’s Optimus</a>,<span> similarly branded as an “autonomous assistant, humanoid friend,” is shown folding clothes, cracking eggs, unloading groceries, receiving packages, and even playing family games. The </span><a href="https://www.1x.tech/neo" target="_blank">Neo humanoid by 1X Technologies</a><span> appears targeted solely at in-home use, with the company declaring that “1X bets on the home” and is “building a world where we do more of what we love, while our humanoid companions handle the rest.” Neo is depicted vacuuming, serving tea, wiping windows and tables, and carrying laundry and grocery bags.</span></p><p>All these glossy marketing videos struck a personal chord with me. I have always dreamed of robots in homes—and I know <a href="https://spectrum.ieee.org/when-will-we-have-robots-to-help-with-household-chores" target="_self">I am not alone.</a> Like many roboticists of my generation, my earliest memories of robots trace back to Rosie the Robot from <em><a href="https://en.wikipedia.org/wiki/List_of_The_Jetsons_characters" target="_blank">The Jetsons</a>.</em> I dedicated my career to getting assistive robots into homes. In 2014, my students and I placed a <a href="https://robotsguide.com/robots/pr2" target="_blank">PR2 robot</a> in a home in Arizona, where it failed miserably at most tasks—though we learned a great deal in the process. Later, I was part of more successful in-home deployments of a <a href="https://spectrum.ieee.org/hello-robots-stretch-mobile-manipulator" target="_self">Stretch robot</a> and an <a href="https://ieeexplore.ieee.org/abstract/document/10974182" target="_blank">assistive feeding robot</a>. I even found myself enjoying housework because it gave me a chance to <a href="https://ieeexplore.ieee.org/document/6483517" target="_blank">analyze the tasks it entailed</a> with an eye toward someday automating them. For years, I promoted my work under a personal motto: “I want robots to do all the chores by the time I retire,” often joking that I might never retire. </p><p>Yet when billion-dollar companies began chasing the same dream, I found myself reacting with unease. I had always imagined that home robots would be more like Rosie—robotic and cartoonish—and my own research moved further and further away from the human form because non-humanoid robots were more practical and preferred by users. I struggled to picture a humanoid in my own house—or any of the homes where I had deployed robots. And after years of human-centered research in robotics, I could not imagine users welcoming humanoids into their homes without hesitation. Still, I assumed someone must want them. Surely some fraction of those billions had gone into market research and customer insight. And I wanted to know what they knew.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Six multi-use robots: Figure, Optimus, Neo, PR2, Fetch, Stretch, showcasing both humanoid and non-humanoid designs." 
class="rm-shortcode" data-rm-shortcode-id="263bef2f9c3715174d14e4b079529a91" data-rm-shortcode-name="rebelmouse-image" id="98a50" loading="lazy" src="https://spectrum.ieee.org/media-library/six-multi-use-robots-figure-optimus-neo-pr2-fetch-stretch-showcasing-both-humanoid-and-non-humanoid-designs.jpg?id=61540703&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">[Left] Three real-world humanoids shown to participants in our study (Figure, Optimus, Neo). [Right] Examples of three general-purpose robots with few or no humanlike features (PR2, Fetch, Stretch).</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Maya Cakmak</small></p><h2>What people actually think</h2><p>To find out, my students and I set out to better understand what the public thinks about humanoid robots in the home. We surveyed 76 participants from the U.S. and the U.K., asking whether they considered humanoids in the home acceptable, which designs they preferred, and why. We also presented them with imagined scenarios where either a humanoid or a special-purpose robot assisted an older adult with tasks like eating, dressing, or vacuuming, and asked which they would choose. The results are detailed in our paper “<a href="https://ieeexplore.ieee.org/xpl/conhome/1000636/all-proceedings" target="_blank">Attitudes Towards Humanoid Robots for In-Home Assistance</a>,” presented this week at the <a href="https://www.ro-man2025.org/" target="_blank">IEEE International Conference on Robot and Human Interactive Communication</a> (RO-MAN).</p><p>Our survey showed that people generally prefer special-purpose robots over humanoids. They see special-purpose robots as safer, more private, and ultimately more comfortable to have in their homes and around loved ones. So, while humanoid companies (and their investors) dream of a single humanoid capable of doing it all, our survey participants seem to be more on board with a toolbox of smaller, specialized machines for most tasks: a Roomba for cleaning, a medication dispenser for pills, a stairlift for stairs. </p><p>Nevertheless, most survey participants considered humanoids in the home acceptable. Some even preferred humanoids for certain tasks, especially when the special-purpose alternative was more speculative—like a dressing assistant robot. When shown images of Neo, Figure 02, and Optimus performing household tasks, they agreed the robots looked useful and well-suited for homes. Many said they would feel comfortable having one in their own home—or in the home of a loved one. Of course, we had framed the scenarios optimistically: Participants were told to assume the robots had passed extensive safety testing, were approved by regulators, and would be covered by insurance—assumptions that may be decades away from reality. And we can safely assume that finding humanoids “acceptable” doesn’t mean people actually want them—or that they’d be willing to pay for one. </p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Robots (both humanoid and special purpose) assist elderly with eating, dressing, and doing chores in various household tasks in this AI generated figure." 
class="rm-shortcode" data-rm-shortcode-id="3f0d38898d0e41d2baef35f2307e7fb7" data-rm-shortcode-name="rebelmouse-image" id="76017" loading="lazy" src="https://spectrum.ieee.org/media-library/robots-both-humanoid-and-special-purpose-assist-elderly-with-eating-dressing-and-doing-chores-in-various-household-tasks-in.png?id=61540477&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">AI-generated images of humanoid and special-purpose robots across eight tasks used in our questionnaires.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Cakmak et al, 2025</small></p><h2>Are home humanoids safe?</h2><p>Unsurprisingly, the task context impacted whether people were open to humanoids in the home. Participants balked at imaginary scenarios involving safety-critical assistance—such as being carried down a staircase—responding with visceral rejections like “absolutely not in a million years.” Whereas for tasks that require little interaction—such as folding laundry—most were willing to imagine a humanoid lending a hand.</p><p>Even with our reassurances about safety, people readily imagined hazards: Humanoids could trip, stumble, or tip over; they might glitch, run out of battery, or malfunction. The idea of a robot handling hot surfaces or sharp objects was also mentioned by multiple participants as a serious concern.</p><p>Privacy was another major concern. Participants worried about camera data being sent to the cloud or robots being remotely controlled by strangers. Several pointed out the security risks—any Internet-connected device, they noted, could be hacked.</p><p>Even participants who saw clear benefits often described a lingering unease. Several described the robots as “creepy” or “unsettling,” and a few explicitly mentioned the <a href="https://spectrum.ieee.org/what-is-the-uncanny-valley" target="_self">uncanny valley effect</a>, pointing in particular to the black face masks common on this new generation of humanoids. One participant described the masks as creating an “eerie sensation, the idea that something might be watching you.” I felt a similar conflict watching a video of Neo (1X) <a href="https://www.1x.tech/neo" target="_blank">sitting on a couch after finishing its chores</a>—a scene that was meant to be comforting but instead left me unsettled.</p><p>A common reason participants preferred special-purpose robots was space. Humanoids were described as “bulky” and “unnecessary,” while specialized robots were seen as “less intrusive” and “more discreet.” These comments reminded me of user research conducted in Japan by the Toyota Research Institute, which led to a <a href="https://spectrum.ieee.org/toyota-research-ceiling-mounted-home-robot" target="_self">ceiling-mounted robot design</a> after finding that limited floor space was a major barrier to adoption. The same thought struck me at home when I showed an in-home humanoid video to my 9-year-old and asked if we should get one. He replied: “But we don’t have an extra bed.” His answer nailed the point: If your home doesn’t have room for another human, it probably doesn’t have room for a humanoid.</p><h2>Very big ifs</h2><p>In the end, the study didn’t fully answer my question about what these companies know that I don’t. Participants said they would accept humanoids—if they were safe, worked reliably, and didn’t cost more than the alternatives. Those are very big ifs.</p><p>And of course, our study asked people to use their imaginations. 
Looking at a picture is not the same as sharing your living room with a six-foot metal figure that moves—in reality, their reactions might be very different. Likewise, picturing yourself someday needing help with eating, dressing, or walking is very different from already relying on that help every day. Perhaps for those already living with these needs, the immediacy of their situation would make the promise of humanoids more compelling.</p><p>To probe further, I asked the same question at the <a href="https://caregivingrobots.github.io/" target="_blank">HRI 2025 Physical Caregiving Robots Workshop</a> to a panel of six people with motor limitations who are experienced users of assistive robots. Not one of them wanted a humanoid. Their concerns ranged from “it’s creepy” to “it has to be 100 percent safe because I cannot escape it.” One panelist summed it up perfectly: “Trying to make assistive robots with humanoids would be like trying to make autonomous cars by putting humanoids in the driver’s seat and asking them to drive like a human.” After all, it was obvious to investors that the better path to autonomous vehicles was to modify or redesign vehicles for autonomy, rather than replicate human drivers. So why are they convinced that replicating humans is the right solution for the home?</p><h2>What’s the alternative?</h2><p>Special-purpose robots may be preferable to humanoids, but building a dedicated machine for every possible task is unrealistic. Homes involve a long tail of chores, and general-purpose robots could indeed provide enormous value. However, the humanoid form is likely overkill, since much simpler designs—such as wheeled robots with basic pinch grippers—can already accomplish a great deal and are far more attainable. And people will likely accept modest <a href="https://www.wired.com/story/optimize-your-home-for-robots/" target="_blank">changes to their homes</a> to expand what these robots can do, just as <a href="https://link.springer.com/chapter/10.1007/978-3-540-74853-3_9" target="_blank">Roomba owners move furniture</a> to let their vacuums work. After all, our homes have already transformed around new technologies—cars, appliances, televisions—so why not for robots, if they prove just as valuable?</p><p>But beyond the unnecessary complexity, a more important issue about the humanoid form may be that users find it less desirable than simpler alternatives. Research has long shown that highly <a href="https://link.springer.com/chapter/10.1007/978-3-030-49788-0_19" target="_blank">humanlike robots can trigger negative emotional responses</a>, and our study suggests that is true of the latest generation of humanoids. Simpler designs with more <a href="https://pmc.ncbi.nlm.nih.gov/articles/PMC8297987/" target="_blank">cartoonlike features are more likely to be accepted</a> as companions. We may even want home robots with no humanlike features at all, so they can be viewed as tools rather than social agents. I believe those who would benefit most from in-home robots—including the <a href="https://www.who.int/news-room/fact-sheets/detail/ageing-and-health" target="_blank">rapidly growing population of older adults</a>—would prefer robots that empower them to do things for themselves, rather than ones that attempt to replace human caregivers. Yet humanoid companies are openly pursuing the latter.</p><p>Only time will tell whether humanoid companies can deliver on their promises—and whether people, myself included, will welcome them into their homes. 
I hope our findings encourage these companies to dig deeper and share their insights about in-home humanoid customers. I’d also like to see more capital directed toward alternative robot designs for the home. In the meantime, my students and I can’t wait to get our hands on one of these humanoids—purely in the name of science—bring it to older adults in our communities, and hear their unfiltered reactions. I can already imagine someone saying, “It better not sit in my recliner when I’m not looking,” or, “If it’s going to live here, it better pay rent.”</p>]]></description><pubDate>Wed, 03 Sep 2025 12:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/home-humanoid-robots-survey</guid><category>Humanoid robots</category><category>Guest article</category><category>Home robots</category><category>Robotics</category><dc:creator>Maya Cakmak</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/three-by-three-grid-showing-various-humanoid-robots-performing-household-tasks-such-as-watering-plants-vacuuming-and-loading-l.jpg?id=61534540&amp;width=980"></media:content></item><item><title>Connecting Africa’s Next Generation of Engineers</title><link>https://spectrum.ieee.org/africa-s-next-generation-engineers</link><description><![CDATA[
  20. <img src="https://spectrum.ieee.org/media-library/two-people-posing-with-a-yellow-robot-dog-at-a-tech-event.png?id=61525300&width=1245&height=700&coordinates=0%2C125%2C0%2C125"/><br/><br/><p>I get a lot of email from people asking to contribute to <em><em>IEEE Spectrum</em></em>. Usually, they want to write an article for us. But one bold query I received in January 2024 went much further: An undergraduate engineering student named <a href="https://www.linkedin.com/in/oluwatosin-kolade/?originalSubdomain=ng" rel="noopener noreferrer" target="_blank">Oluwatosin Kolade</a>, from Obafemi Awolowo University, in Ilé-Ifẹ̀, Nigeria, volunteered to be our robotics editor. </p><p>Kolade—Tosin to his friends—had been the newsletter editor for his IEEE student branch, but he’d never published an article professionally. His earnestness and enthusiasm were endearing. I explained that we already have a <a href="https://spectrum.ieee.org/u/evan-ackerman" target="_self">robotics editor</a>, but I’d be glad to work with him on writing, editing, and ultimately publishing an article. </p><p> Back in 2003, I had met plenty of engineering students when I traveled to <a href="https://spectrum.ieee.org/surf-africa" target="_self">Nigeria to report </a>on the SAT-3/WASC cable, the first undersea fiber-optic cable to land in West Africa. I remember seeing students gathering around obsolete PCs at Internet cafés connected to the world via a satellite dish powered by a generator. I challenged Tosin to tell <em><em>Spectrum</em></em> readers what it’s like for engineering students today. The result is “<a href="https://spectrum.ieee.org/stem-education-in-africa" target="_blank">Lessons from a Janky Drone</a>.”</p><p>I decided to complement Tosin’s piece with the perspective of a more established engineer in sub-Saharan Africa. I reached out to <a href="https://spectrum.ieee.org/u/g-pascal-zachary" target="_self">G. Pascal Zachary</a>, who has covered engineering education in Africa for us, and Zachary introduced me to <a href="https://ibaino.net/" rel="noopener noreferrer" target="_blank">Engineer Bainomugisha</a>, a computer science professor at Makerere University, in Kampala, Uganda. 
In “<a data-linked-post="2673897395" href="https://spectrum.ieee.org/africa-engineering-hardware" target="_blank">Learning More With Less</a>,” Bainomugisha draws out the things that were common to his and Tosin’s experience and suggests ways to make the hardware necessary for engineering education more accessible.</p><p>In fact, the region’s decades-long struggle to develop its engineering talent hinges on access to the three things we focus on in this issue: reliable electricity, ubiquitous broadband, and educational resources for young engineers.</p><p class="pull-quote">“During my weekly video calls with Tosin...the connection was pretty good—except when it wasn’t.”</p><p><span>Zachary’s article in this issue, “</span><a data-linked-post="2673881191" href="https://spectrum.ieee.org/electricity-access-sub-saharan-africa" target="_blank">What It Will Really Take to Electrify All of Africa</a><span>,”</span><span> tackles the first topic, with a focus on an ambitious initiative to bring electricity to an additional 300 million people by 2030.</span></p><p> Contributing editor <a href="https://spectrum.ieee.org/u/lucas-laursen" target="_self">Lucas Laursen</a>’s article, “<a data-linked-post="2673856838" href="https://spectrum.ieee.org/broadband-internet-in-nigeria" target="_blank">In Nigeria, Why Isn’t Broadband Everywhere?</a>” investigates the slow rollout of fiber-optic connectivity in the two decades since my first visit. As he learned when he traveled to Nigeria earlier this year, the country now has eight undersea cables delivering 380 terabits of capacity, yet less than half of the population has broadband access. </p><p>I got a sense of Nigeria’s bandwidth issues during my weekly video calls with Tosin to discuss his article. The connection was pretty good, except when it wasn’t. Still, I reminded myself, two decades ago such calls would have been nearly impossible. </p><p>Through those weekly chats, we established a professional connection, which made it that much more meaningful when I got to meet Tosin in person this past May at the <a href="https://2025.ieee-icra.org/" target="_blank">IEEE ICRA robotics conference</a>, in Atlanta. Tosin, a RAS member, was attending thanks to a scholarship from the <a href="https://www.ieee-ras.org/" target="_blank">IEEE Robotics and Automation Society</a> to support his participation in robotics standards activities at the conference. Like a kid in a candy shop, he kibitzed with fellow scholarship winners, attended talks, checked out robots, and met the engineers who built them. </p><p>As Tosin embarks on the next leg in his career journey, he is supported by the IEEE community, which not only recognizes his promise but gives him access to a network of professionals who can help him and his cohort realize their potential.</p>]]></description><pubDate>Mon, 01 Sep 2025 10:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/africa-s-next-generation-engineers</guid><category>Engineering education</category><category>Makerspaces</category><category>Higher education</category><category>3d printers</category><category>Arduino</category><dc:creator>Harry Goldstein</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/two-people-posing-with-a-yellow-robot-dog-at-a-tech-event.png?id=61525300&amp;width=980"></media:content></item><item><title>Video Friday: Spot’s Got Talent</title><link>https://spectrum.ieee.org/video-friday-synchronized-dancing-robots</link><description><![CDATA[
  21. <img src="https://spectrum.ieee.org/media-library/four-legged-robot-dog-doing-three-backflips-in-industrial-setting-with-equipment-and-yellow-stairs.gif?id=61532173&width=1245&height=700&coordinates=0%2C0%2C0%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://clawar.org/clawar2025/">CLAWAR 2025</a>: 5–7 September 2025, SHENZHEN, CHINA</h5><h5><a href="https://actuate.foxglove.dev/">ACTUATE 2025</a>: 23–24 September 2025, SAN FRANCISCO</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="https://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="o_iuqgxrrae"><em>Boston Dynamics is back and their dancing robot dogs are bigger, better, and bolder than ever! Watch as they bring a “dead” robot to life and unleash a never-before-seen synchronized dance routine to “Good Vibrations.”</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6b3bc23cd0d30d622765a4103c587403" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/o_iUqGxRRAE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>And what’s much more interesting, here’s a discussion of how they made it work:</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="691bac971231bd16e539ba9599b30f45" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/LMPxtcEgtds?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/blog/spot-takes-the-stage/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="4cxc0qjm82k">I don’t especially care whether a <a data-linked-post="2658800413" href="https://spectrum.ieee.org/humanoid-robot-falling" target="_blank">robot falls over</a>. 
I care whether it gets itself back up again.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a0cac494be632a4b0e255bbbb4a8d0e4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/4cxC0qjm82k?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.limxdynamics.com/en">LimX Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="bxdlxf7bnqq"><em>The robot autonomously connects multiple wires to the environment using small flying anchors—drones equipped with anchoring mechanisms at the wire tips. Guided by an onboard RGB-D camera for control and environmental recognition, the system enables wire attachment in unprepared environments and supports simultaneous multiwire connections, expanding the operational range of wire-driven robots.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7e08d4624013d480add8301acc352ca5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/BXdlXf7BNQQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://shin0805.github.io/flying-anchor/">JSK Robotics Laboratory</a> ] at [ <a href="http://www.jsk.t.u-tokyo.ac.jp/" target="_blank">University of Tokyo</a> ]</p><p>Thanks, Shintaro!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="kgmwidtcyo0">For a robot that barely has a face, this is some pretty good emoting.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7f32bd7db00f18487156251fb2fea84a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/kGMWiDTCyo0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pollen-robotics.com/reachy-mini/">Pollen</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="rs_mtkviazy"><em>Learning skills from human motions offers a promising path toward generalizable policies for whole-body humanoid control, yet two key cornerstones are missing: (1) a scalable, high-quality motion-tracking framework that faithfully transforms kinematic references into robust, extremely dynamic motions on real hardware, and (2) a distillation approach that can effectively learn these motion primitives and compose them to solve downstream tasks. 
We address these gaps with BeyondMimic, a real-world framework to learn from human motions for versatile and naturalistic humanoid control via guided diffusion.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="29bf0b0ac3b910749943d2eefd1300cd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/RS_MtKVIAzY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://hybrid-robotics.berkeley.edu/">Hybrid Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="_akfhkcne0s"><em>Introducing our open-source metal-made bipedal robot MEVITA. All components can be procured through e-commerce, and the robot is built with a minimal number of parts. All hardware, software, and learning environments are released as open source.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c9c9fa2b40b749672512083b73f91b37" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/_akfHkCne0s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://haraduka.github.io/mevita-hardware/">MEVITA</a> ]</p><p>Thanks, Kento!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="wubhxe-mpaq">I’ve always thought that being able to rent robots (or exoskeletons) to help you move furniture or otherwise carry stuff would be very useful.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cac8863f98e697737af68225ad1ad4ed" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/WuBHxe-MPaQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">DEEP Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="od0qvdwgvyo"><em>A new study explains how tiny water bugs use fanlike propellers to zip across streams at speeds up to 120 body lengths per second. The researchers then created a similar fan structure and used it to propel and maneuver an insect-size robot. The discovery offers new possibilities for designing small machines that could operate during floods or other challenging situations.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3e7605bb8e079ccfc126cef88d95c9c2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/oD0qvdwGvyo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://coe.gatech.edu/news/2025/08/tiny-fans-feet-water-bugs-could-lead-energy-efficient-mini-robots">Georgia Tech</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="gugwb6wxcfo"><em>Dynamic locomotion of legged robots is a critical yet challenging topic in expanding the operational range of mobile robots. 
To achieve generalized legged locomotion on diverse terrains while preserving the robustness of learning-based controllers, this paper proposes an attention-based map encoding conditioned on robot proprioception, which is trained as part of the end-to-end controller using reinforcement learning. We show that the network learns to focus on steppable areas for future footholds when the robot dynamically navigates diverse and challenging terrains.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3cf4a6ff1c4cda069475872f91248850" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GUgwB6WxcFo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2506.09588">Paper</a> ] from [ <a href="https://rsl.ethz.ch/" target="_blank">ETH Zurich</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="oeuh89bwrl4"><em>In the fifth installment of our Moonshot Podcast Deep Dive video interview series, X’s Captain of Moonshots <a data-linked-post="2650275007" href="https://spectrum.ieee.org/astro-teller-captain-of-moonshots-at-x" target="_blank">Astro Teller</a> sits down with Google DeepMind’s chief scientist, Jeff Dean, for a conversation about the origin of Jeff’s pioneering work scaling neural networks. They discuss the first time AI captured Jeff’s imagination, the earliest Google Brain framework, the team’s stratospheric advancements in image recognition and speech-to-text, how AI is evolving, and more.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3e0b8f6d91c9ba85f7c86b72baf74921" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/OEuh89BWRL4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.youtube.com/@XTheMoonshotFactory/podcasts">Moonshot Podcast</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 29 Aug 2025 16:30:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-synchronized-dancing-robots</guid><category>Video friday</category><category>Dancing robots</category><category>Humanoid robots</category><category>Robotics</category><category>Quadruped robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/four-legged-robot-dog-doing-three-backflips-in-industrial-setting-with-equipment-and-yellow-stairs.gif?id=61532173&amp;width=980"></media:content></item><item><title>Video Friday: Inaugural World Humanoid Robot Games Held</title><link>https://spectrum.ieee.org/world-humanoid-robot-games</link><description><![CDATA[
  22. <img src="https://spectrum.ieee.org/media-library/robots-racing-on-a-blue-track-while-spectators-watch-in-a-stadium.png?id=61501726&width=1245&height=700&coordinates=0%2C43%2C0%2C44"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://www.ro-man2025.org/">RO-MAN 2025</a>: 25–29 August 2025, EINDHOVEN, THE NETHERLANDS</h5><h5><a href="https://clawar.org/clawar2025/">CLAWAR 2025</a>: 5–7 September 2025, SHENZHEN, CHINA</h5><h5><a href="https://actuate.foxglove.dev/">ACTUATE 2025</a>: 23–24 September 2025, SAN FRANCISCO</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="https://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="-xc8cs47lcc"><em>The First World Humanoid Robot Games Conclude Successfully! Unitree Strikes Four Golds (1500m, 400m, 100m Obstacle, 4×100m Relay).</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a032c6342bdd306eb7b990a515bf0972" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/-Xc8cs47LCc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/">Unitree</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="j-cofmqd-ss"><em>Steady! PNDbotics Adam has become the only full-size humanoid robot athlete to successfully finish the 100m Obstacle Race at the World Humanoid Robot Games!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8063a2a4bd12fde3329069353544d401" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/J-COfmQD-Ss?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://pndbotics.com/">PNDbotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="_vdo5ynys_w"><em>Introducing Field Foundation Models (FFMs) from FieldAI—a new class of “physics-first” foundation models built specifically for embodied intelligence. Unlike conventional vision or language models retrofitted for robotics, FFMs are designed from the ground up to grapple with uncertainty, risk, and the physical constraints of the real world. 
This enables safe and reliable robot behaviors when managing scenarios that they have not been trained on, navigating dynamic, unstructured environments without prior maps, GPS, or predefined paths.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b4991f44490462d8e919fd753762a617" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/_vDo5YnYs_w?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.fieldai.com/">Field AI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="0iqzsmvcqis"><em>Multiply Labs, leveraging Universal Robots’ collaborative robots, has developed a groundbreaking robotic cluster that is fundamentally transforming the manufacturing of life-saving cell and gene therapies. The Multiply Labs solution drives a staggering 74% cost reduction and enables up to 100x more patient doses per square foot of cleanroom.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e31134ffe39ceb67e5bcaadbcd7feb25" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0iqZsmvCqis?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.universal-robots.com/case-stories/multiply-labs/">Universal Robots</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="gmyo-cum1ka"><em>In this video, we put Vulcan V3, the world’s first ambidextrous humanoid robotic hand capable of performing the full American Sign Language (ASL) alphabet, to the ultimate test—side by side with a real human!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c8c9fc776aab009f79d8d9ff5c74df0f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GmYO-Cum1KA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://hackaday.io/project/203847-ambidextrous-23-direct-drive-humanoid-robotic-hand">Hackaday</a> ]</p><p>Thanks, Kelvin!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="_og1egust-i">More robots need to have this form factor.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a4cdca5959880eccc1dc8c193fcc0349" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/_og1egUst-I?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://engineering.tamu.edu/news/2025/08/from-sea-to-space-this-robot-is-on-a-roll.html">Texas A&M University</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="pksef2rtqzy">Robotic vacuums are so pervasive now that it’s easy to forget how much of an icon the <a data-linked-post="2650267122" href="https://spectrum.ieee.org/video-friday-an-arm-for-your-partybot-and-irobot-turns-10" target="_blank">iRobot Roomba</a> has been.</p><p 
class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="805e910a12019237e6add751163fc622" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/PKSEF2RtqZY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.irobot.com/">iRobot</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ujqjg1xk8a8">This is quite possibly the largest <a data-linked-post="2650254250" href="https://spectrum.ieee.org/dlr-super-robust-robot-hand" target="_blank">robotic hand</a> I’ve ever seen.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6617f3142cdf5f53e820ab090176fb10" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ujQJG1xk8A8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://moonshot-cafe-project.org/en/">CAFE Project</a> ] via [ <a href="https://built.itmedia.co.jp/bt/articles/2508/18/news082.html">BUILT</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="rz_xbnbituq"><em>Modular robots built by Dartmouth researchers are finding their feet outdoors. Engineered to assemble into structures that best suit the task at hand, the robots are pieced together from cube-shaped robotic blocks that combine rigid rods and soft, stretchy strings whose tension can be adjusted to deform the blocks and control their shape.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5b45024fd761ce33bd7efc99dd7ac3a6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Rz_xBnbituQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://home.dartmouth.edu/news/2025/08/multipurpose-robots-take-shape">Dartmouth</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ptxede_xbro"><em>Our quadruped robot X30 has completed extreme-environment missions in Hoh Xil—supporting patrol teams, carrying vital supplies and protecting fragile ecosystems.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c86d59dfac00af07eda77f6e89557a03" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pTxEdE_Xbro?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">DEEP Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="u_7nt4eq1ns"><em>We propose a base-shaped robot named “koboshi” that moves everyday objects. This koboshi has a spherical surface in contact with the floor, and by moving a weight inside using built-in motors, it can rock up and down, and side to side. By placing everyday items on this koboshi, users can impart new movement to otherwise static objects. The koboshi is equipped with sensors to measure its posture, enabling interaction with users. 
Additionally, it has communication capabilities, allowing multiple units to communicate with each other.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e2ed91840821eb78809192d271435bbe" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/u_7nt4eQ1ns?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/html/2508.13509v1">Paper</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="bdw9f7aihas"><em>Bi-LAT is the world’s first Vision-Language-Action (VLA) model that integrates bilateral control into imitation learning, enabling robots to adjust force levels based on natural language instructions.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e3ad454ddbbc8d41754470ae30b8def9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/bdw9F7AIHas?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://mertcookimg.github.io/bi-lat/">Bi-LAT</a> ] to be presented at [ <a href="https://www.ro-man2025.org/" target="_blank">IEEE RO-MAN 2025</a> ]</p><p>Thanks, Masato!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="pmcorujdxkq">Look at this jaunty little guy!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="fe73f006b2c8ea1062f763a30e6d2cbe" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pmcOrUjDXKQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Although, they very obviously cut the video right before it smashes face-first into furniture more than once.</p><p>[ <a href="https://gepetto.github.io/BoltLocomotion/">Paper</a> ] to be presented at [ <a href="https://2025humanoids.org/" target="_blank">2025 IEEE-RAS International Conference on Humanoid Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="kqeqdcuyo50"><em>This research has been conducted at the Human Centered Robotics Lab at UT Austin. The video shows our latest experimental bipedal robot, dubbed Mercury, which has passive feet. 
This means that there are no actuated ankles, unlike humans, forcing Mercury to gain balance by dynamically stepping.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="40c1c4a35c1cac4390cc192af6e5ab86" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/kqEqDCuYO50?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sites.utexas.edu/hcrl/">University of Texas at Austin Human Centered Robotics Lab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="yyldielqgic"><em>We put two RIVR delivery robots to work with an autonomous vehicle—showing how Physical AI can handle the full last mile, from warehouse to consumers’ doorsteps.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0d0e0b38d47b76f0ac8e5ed5ba246f39" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/YyLDIelqgic?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.rivr.ai/">Rivr</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="3fh94e2vpmq"><em>The KR TITAN ultra is a high-performance industrial robot weighing 4.6 tonnes and capable of handling payloads up to 1.5 tonnes.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="be237d0fd193d9bbfcaceafd7a282b4c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/3fh94e2vPMQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.kuka.com/kr-titan-ultra?sc_camp=E45C2ED3B08848A6B2E310E0E28BB294">Kuka</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="gpx3-qnrhl4"><em>CMU MechE’s Ding Zhao and Ph.D. student Yaru Niu describe LocoMan, a robotic assistant they have been developing.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1f370f0fc2719d8f2e31b0909a9cd8df" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/gPx3-QnrHl4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://safeai-lab.github.io/">Carnegie Mellon University</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="dkbgvo80c_u"><em>Twenty-two years ago, Silicon Valley executive Henry Evans had a massive stroke that left him mute and paralyzed from the neck down. But that didn’t prevent him from becoming a leading advocate of adaptive robotic tech to help disabled people—or from writing country songs, one letter at a time. 
Correspondent John Blackstone talks with Evans about his upbeat attitude and unlikely pursuits.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bcfb5420c867159696934a1fe77b4aa5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DKbGvO80C_U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.cbsnews.com/video/a-robotics-activists-remarkable-crusade/">CBS News</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 22 Aug 2025 15:30:04 +0000</pubDate><guid>https://spectrum.ieee.org/world-humanoid-robot-games</guid><category>Video friday</category><category>Robotics</category><category>Humanoid robots</category><category>Robot games</category><category>Robot ai</category><category>Quadruped robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/robots-racing-on-a-blue-track-while-spectators-watch-in-a-stadium.png?id=61501726&amp;width=980"></media:content></item><item><title>What I Learned From a Janky Drone</title><link>https://spectrum.ieee.org/stem-education-in-africa</link><description><![CDATA[
  23. <img src="https://spectrum.ieee.org/media-library/four-young-black-men-at-desks-in-a-lecture-hall.png?id=61482540&width=1245&height=700&coordinates=0%2C183%2C0%2C184"/><br/><br/><p><strong>The package containing the</strong> ArduCopter 2.8 board finally arrived from China, bearing the weight of our anticipation. I remember picking it up, the cardboard box weathered slightly from its journey. As I tore through the layers of tape, it felt like unwrapping a long-awaited gift. But as I lifted the ArduCopter 2.8 board out of the box, my heart sank. The board, which was to be the cornerstone of our project, looked worn out and old, with visible scuffs and bent pins. This was just one of a cascade of setbacks my team would face.</p><p>It all started when I was assigned a project in machine design at <a href="https://oauife.edu.ng/" rel="noopener noreferrer" target="_blank">Obafemi Awolowo University</a> (OAU), located in the heart of Ilé-Ifẹ̀, an ancient Yoruba city in Osun State, in southwest Nigeria, where I am a mechanical engineering student entering my final year of a five-year program. OAU is one of Nigeria’s oldest and most prestigious universities, known for its beautiful campus and architecture. Some people I know refer to it as the “Stanford of Nigeria” because of the significant number of brilliant startups it has spun off. Despite its reputation, though, OAU—like every other federally owned institution in Nigeria—is underfunded and <a href="https://punchng.com/our-education-not-bargaining-chip-oau-students-lament-lecturers-strike/" rel="noopener noreferrer" target="_blank">plagued by faculty strikes</a>, leading to interruptions in academics. The lack of funding means students must pay for their undergraduate projects themselves, making the success of any project heavily dependent on the students’ financial capabilities.</p><h3>The Student & the Professor</h3><br/><p><strong>Two perspectives on engineering education in Africa</strong></p><p><em>Johnson I. Ejimanya is a one-man pony express. Walking the exhaust-fogged streets of Owerri, Nigeria, Ejimanya, the engineering dean of the Federal University of Technology, Owerri, carries with him a department’s worth of communications, some handwritten, others on disk. He’s delivering them to a man with a PC and an Internet connection who converts the missives into e-mails and downloads the responses. To Ejimanya, broadband means lugging a big bundle of printed e-mails back with him to the university, which despite being one of the country’s largest and most prestigious engineering schools, has no reliable means of connecting to the Internet.</em></p><p>I met Ejimanya when I visited Nigeria in 2003 to report on how the SAT-3/WASC, the first undersea fiber-optic cable to connect West Africa to the world, was being used. (The passage above is from my February 2004 <em>IEEE</em> <em>Spectrum</em> article “<a href="https://spectrum.ieee.org/surf-africa" target="_self">Surf Africa</a>.”) Beyond the lack of computers and Internet access, I saw labs filled with obsolete technology from the 1960s. If students needed a computer or to get online, they went to an Internet cafe, their out-of-pocket costs a burden on them and their families.</p><p>So is the situation any better 20-plus years on? The short answer is yes. 
But as computer science professor <a href="https://ibaino.net/" target="_blank">Engineer Bainomugisha</a> and IEEE student member <a href="https://www.linkedin.com/in/oluwatosin-kolade/?originalSubdomain=ng" rel="noopener noreferrer" target="_blank">Oluwatosin Kolade</a> attest in the following pages, there’s still a long way to go.</p><p>Both men are engineers but at different stages of their academic journey: Bainomugisha went to college in the early 2000s and is now a computer science professor at <a href="https://mak.ac.ug/" rel="noopener noreferrer" target="_blank">Makerere University</a> in Kampala, Uganda. Kolade is in his final semester as a mechanical engineering student at <a href="https://oauife.edu.ng/" rel="noopener noreferrer" target="_blank">Obafemi Awolowo University</a> in Ilé-Ifẹ̀, Nigeria. They describe the challenges they face and what they see as the path forward for a continent brimming with aspiring engineers but woefully short on the resources necessary for a robust education.</p><p>—Harry Goldstein</p><p><a href="https://www.researchgate.net/lab/Oluwaseun-K-Ajayi-Lab" rel="noopener noreferrer" target="_blank">Dr. Oluwaseun K. Ajayi</a>, an expert in computer-aided design (CAD), machine design, and mechanisms, gave us the freedom to choose our final project. I proposed a research project based on a paper titled “<em>Advance Simulation Method for Wheel-Terrain Interactions of Space Rovers: A Case Study on the UAE Rashid Rover</em>” by <a href="https://arxiv.org/search/cs?searchtype=author&query=Abubakar,+A" rel="noopener noreferrer" target="_blank">Ahmad Abubakar</a> and coauthors<em><em>.</em></em> But due to the computational resources required, it was rejected. Dr. Ajayi instead proposed that my fellow students and I build a surveillance drone, as it aligned with his own research. Dr. Ajayi, a passionate and driven researcher, was motivated by the potential real-world applications of our project. His constant push for progress, while sometimes overwhelming, was rooted in his desire to see us produce meaningful work.</p><p>As my team finished scoping out the preliminary concepts of the drone in CAD designs, we were ready to contribute money toward implementing our idea. We conducted a cost analysis and decided to use a third-party vendor to help us order our components from China. We went this route due to shipping and customs issues we’d previously experienced. Taking the third-party route was supposed to solve the problem. Little did we suspect what was coming.</p><p>By the time we finalized our cost analysis and started to gather funds, the price of the components we needed had skyrocketed due to a sudden economic crisis and depreciation of the Nigerian naira by 35 percent against the U.S. dollar at the end of January 2024. This was the genesis of our problem.</p><p class="ieee-inbody-related">Related: <a href="https://spectrum.ieee.org/africa-engineering-hardware" target="_blank">Learning More With Less</a></p><p>Initially, we were a group of 12, but due to the high cost per person, Dr. Ajayi asked another group, led by <a href="https://www.linkedin.com/in/nanaweitonbrasuoware/" rel="noopener noreferrer" target="_blank">Tonbra Suoware</a>, to merge with mine. Tonbra’s team had been planning <a href="https://spectrum.ieee.org/african-robotics-network" target="_blank">a robotic arm project</a> until Dr. 
Ajayi merged our teams and instructed us to work on the drone, with the aim of exhibiting it at the <a href="https://central.nasrda.gov.ng/" rel="noopener noreferrer" target="_blank">National Space Research and Development Agency</a>, in Abuja, Nigeria. The merger increased our group to 25 members, which helped with the individual financial burden but also meant that not everyone would actively participate in the project. Many just contributed their share of the money.</p><p>Tonbra and I drove the project forward.</p><h2>Supply Chain Challenges in African Engineering Education</h2><p>With Dr. Ajayi’s consent, my teammates and I scrapped the “surveillance” part of the drone project and raised the money for developing just the drone, totaling approximately 350,000 naira (about US $249). We had to cut down costs, which meant straying from the original specifications of some of the components, like the flight controller, battery, and power-distribution board. Otherwise, the cost would have been unbearable.</p><p>We were set to order the components from China on 5 February 2024. Unfortunately, it was a long holiday in China, we were told, so we wouldn’t get the components until March. This led to tense discussions with Dr. Ajayi, despite having briefed him about the situation. Why the pressure? Our school semester ends in March, and having components arrive in March would mean that the project would be long overdue by the time we finished it. At the same time, we students had a compulsory academic-industrial training at the end of the semester.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Young Black man in plaid shirt sitting on a chair in front of a white board and a black board" class="rm-shortcode" data-rm-shortcode-id="662b14ac6096514656856080049e84b5" data-rm-shortcode-name="rebelmouse-image" id="d6b2d" loading="lazy" src="https://spectrum.ieee.org/media-library/young-black-man-in-plaid-shirt-sitting-on-a-chair-in-front-of-a-white-board-and-a-black-board.png?id=61482574&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Oluwatosin Kolade, a mechanical engineering student at Nigeria’s Obafemi Awolowo University, says the drone project taught him the value of failure.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Andrew Esiebo</small></p><p>But what choice did we have? We couldn’t back down from the project—that would have cost us our grade.</p><p>We got most of our components by mid-March, and immediately started working on the drone. We had the frame 3D-printed at a cost of 50 naira (approximately US $0.03) per gram for a 570-gram frame, for a total cost of 28,500 naira (roughly US $18).</p><p><span>Next, we turned to building the power-distribution system for the electrical components. Initially, we’d planned to use a power-distribution board to evenly distribute power from the battery to the speed controllers and the rotors. However, the board we originally ordered was no longer available. Forced to improvise, we used a </span><a href="https://verotl.com/circuitboards/veroboards" target="_blank">Veroboard </a><span>instead. We connected the battery in parallel with the speed controllers to ensure that each rotor received equal power.
This improvisation did mean additional costs, as we had to rent soldering irons, hand drills, hot glue, cables, a digital multimeter, and other tools from an electronics hub in downtown Ilé-Ifẹ̀.</span></p><p>Everything was going smoothly until it was time to configure the flight controller—the ArduCopter 2.8 board—with the assistance of a software program called <a href="https://ardupilot.org/planner/" target="_blank">Mission Planner</a>. We toiled daily, combing through YouTube videos, online forums, Stack Exchange, and other resources for guidance, all to no avail. We even downgraded the Mission Planner software a couple of times, only to discover that the board we’d waited for so patiently was obsolete. It was truly heartbreaking, but we couldn’t order another one because we didn’t have time to wait for it to arrive. Plus, getting another flight controller would’ve cost an additional sum—240,000 naira (about US $150) for a <a href="https://www.hawks-work.com/products/pixhawk-2-4-8-flight-control-open-source-px4-autopilot" target="_blank">Pixhawk 2.4.8 flight controller</a>—which we didn’t have.</p><p>We knew our drone would be half-baked without the flight controller. Still, given our semester-ending time constraint, we decided to proceed with the configuration of the transmitter and receiver. We made the final connections and tested the components without the flight controller. To ensure that the transmitter could control all four rotors simultaneously, we tested each rotor individually with each transmitter channel. The goal was to assign a single channel on the transmitter that would activate and synchronize all four rotors, allowing them to spin in unison during flight. This was crucial, because without proper synchronization, the drone would not be able to maintain stable flight.</p><p class="pull-quote">“This experience taught me invaluable lessons about resilience, teamwork, and the harsh realities of engineering projects done by students in Nigeria.”</p><p>After the final configuration and component testing, we set out to test our drone in its final form. But a few minutes in, our battery failed. With the battery dead, the project had effectively failed, and we were incredibly disappointed.</p><p>When we finally submitted our project to Dr. Ajayi, the deadline had passed. He told us to charge the battery so he could see the drone come alive, even though it couldn’t fly. But circumstances didn’t allow us to order a battery charger, and we were at a loss as to where to get help with the flight controller and battery. There are no tech hubs available for such things in Ilé-Ifẹ̀. We told Dr. Ajayi we couldn’t do as he’d asked and explained the situation to him. He finally allowed us to submit our work, and all team members received course credit.</p><h2>Resourcefulness Is Not a Substitute for Funding</h2><p>This experience taught me invaluable lessons about resilience, teamwork, and the harsh realities of engineering projects done by students in Nigeria. It showed me that while technical knowledge is crucial, the ability to adapt and improvise when faced with unforeseen challenges is just as important. I also learned that failure, though disheartening, is not an ending but a stepping stone toward growth and improvement.</p><p>In my school, the demands on mechanical engineering students are exceptionally high. For instance, in a single semester, I was sometimes assigned up to four different major projects, each from a different professor.
Alongside the drone project, I worked on two other substantial projects for other courses. The reality is that a student’s ability to score well in these projects is often heavily dependent on financial resources. We are constantly burdened with the costs of running numerous projects. The country’s ongoing economic challenges, including currency devaluation and inflation, only exacerbate this burden.</p><p>In essence, when the world, including graduate-school-admission committees and industry recruiters, evaluates transcripts from Nigerian engineering graduates, it’s crucial to recognize that grades may not fully reflect a student’s capabilities in a given course. They can also reflect financial constraints, difficulties in sourcing equipment and materials, and the broader economic environment. This understanding must inform how transcripts are interpreted, as they tell a story not just of academic performance but also of perseverance in the face of significant challenges.</p><p>As I advance in my education, I plan to apply these lessons to future projects, knowing that perseverance and resourcefulness will be key to overcoming obstacles. The failed drone project has also given me a realistic glimpse into the working world, where unexpected setbacks and budget constraints are common. It has prepared me to approach my career with both a practical mindset and an understanding that success often comes from how well you manage difficulties, not just how well you execute plans. <span class="ieee-end-mark"></span></p>]]></description><pubDate>Wed, 20 Aug 2025 13:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/stem-education-in-africa</guid><category>Higher education</category><category>Engineering education</category><category>Undergraduate education</category><category>Makerspaces</category><category>3d printers</category><category>Arduino</category><dc:creator>Oluwatosin Kolade</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/four-young-black-men-at-desks-in-a-lecture-hall.png?id=61482540&amp;width=980"></media:content></item><item><title>Smart Glasses Help Train General-Purpose Robots</title><link>https://spectrum.ieee.org/smart-glasses-robot-training</link><description><![CDATA[
  24. <img src="https://spectrum.ieee.org/media-library/conceptual-collage-of-a-robotic-arm-reaching-down-between-two-circular-photos-of-experiments-meant-to-resemble-eyeglasses.jpg?id=61469620&width=1245&height=700&coordinates=0%2C118%2C0%2C119"/><br/><br/><p>General-purpose robots are hard to train. The dream is to have a <a href="https://spectrum.ieee.org/ai-robots" target="_blank">robot like the Jetsons’ Rosie</a> that can<span> </span><span>perform a range of</span><span> household </span><span>tasks, like tidying up or folding laundry. But for that to happen, the robot needs to learn from a </span><a href="https://spectrum.ieee.org/global-robotic-brain" target="_blank">large amount of data</a><span> that matches real-world conditions—and that data can be difficult to collect. Currently, most training data is collected from multiple static cameras that have to be carefully set up to gather useful information. But what if bots could learn from the everyday interactions we already have with the physical world? </span></p><p>That’s a question that the <a href="https://www.lerrelpinto.com/group" target="_blank">General-purpose Robotics and AI Lab</a> at New York University, led by Assistant Professor <a href="https://www.lerrelpinto.com/#publications" target="_blank">Lerrel Pinto</a>, hopes to answer with <a href="https://egozero-robot.github.io/" target="_blank">EgoZero</a>, a smart-glasses system that aids robot learning by collecting data with a souped-up version of <a href="https://spectrum.ieee.org/meta-ar-glasses-expense" target="_blank">Meta’s glasses</a>.</p><p>In a <a href="https://arxiv.org/abs/2505.20290" target="_blank">recent preprint</a>, which serves as a proof of concept for the approach, the researchers trained a robot to complete seven manipulation tasks, such as picking up a piece of bread and placing it on a nearby plate. For each task, they collected 20 minutes of data from humans performing these tasks while recording their actions with glasses from Meta’s <a href="https://www.projectaria.com/" target="_blank">Project Aria</a>. (These sensor-laden glasses are used exclusively for research purposes.) When the system was then deployed on a robot to complete these tasks autonomously, it achieved a 70 percent success rate. </p><h2>The Advantage of Egocentric Data</h2><p>The “ego” part of EgoZero refers to the “egocentric” nature of the data, meaning that it is collected from the perspective of the person performing a task. “The camera sort of moves with you,” like how our eyes move with us, says <a href="https://raunaqb.com/" target="_blank">Raunaq Bhirangi</a>, a postdoctoral researcher at the NYU lab. </p><p>This has two main advantages: First, the setup is more portable than external cameras. Second, the glasses are more likely to capture the information needed because wearers will make sure they—and thus the camera—can see what’s needed to perform a task. “For instance, say I had something hooked under a table and I want to unhook it. I would bend down, look at that hook and then unhook it, as opposed to a third-person camera, which is not active,” says Bhirangi. “With this egocentric perspective, you get that information baked into your data for free.”</p><p>The second half of EgoZero’s name refers to the fact that the system is trained without any robot data, which can be costly and difficult to collect; human data alone is enough for the robot to learn a new task.
This is enabled by a framework developed by Pinto’s lab that tracks points in space, rather than full images. When training robots on image-based data, “the mismatch is too large between what human hands look like and what robot arms look like,” says Bhirangi. This framework instead tracks points on the hand, which are mapped onto points on the robot. </p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="EgoZero localizes object points via triangulation over the camera trajectory, and computes action points via Aria MPS hand pose and a hand estimation model." class="rm-shortcode" data-rm-shortcode-id="31a1551426b63f5c0788d4cbde80aa11" data-rm-shortcode-name="rebelmouse-image" id="d916e" loading="lazy" src="https://spectrum.ieee.org/media-library/egozero-localizes-object-points-via-triangulation-over-the-camera-trajectory-and-computes-action-points-via-aria-mps-hand-pose.jpg?id=61469639&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">The EgoZero system takes data from humans wearing smart glasses and turns it into usable 3D-navigation data for robots to do general manipulation tasks.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://egozero-robot.github.io/" target="_blank">Vincent Liu, Ademi Adeniji, Haotian Zhan, et al.</a></small></p><p>Reducing the image to points in 3D space means the model can track movement the same way, regardless of the specific robotic appendage. “As long as the robot points move relative to the object in the same way that the human points move, we’re good,” says Bhirangi.</p><p>All of this leads to a generalizable model that would otherwise require a lot of diverse robot data to train. If the robot is trained on data of picking up one piece of bread—say, a deli roll—it can generalize that information to pick up a piece of ciabatta in a new environment. </p><h2>A Scalable Solution</h2><p>In addition to EgoZero, the research group is working on several projects to help make general-purpose robots a reality, including open-source robot designs, flexible <a href="https://arxiv.org/abs/2409.08276" target="_blank">touch sensors</a>, and additional methods of collecting real-world training data. </p><p>For example, as an alternative to EgoZero, the researchers have also designed a setup with a 3D-printed handheld gripper that more closely resembles most robot “hands.” A smartphone attached to the gripper captures video with the same point-space method that’s used in EgoZero. By having people collect data without bringing a robot into their homes, the team provides two approaches that could make collecting training data more scalable.</p><p>That scalability is ultimately the researchers’ goal. Large language models can harness the entire Internet, but there is no Internet equivalent for the physical world.
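</p><p>To make the point-based framework described above concrete, here is a minimal, illustrative sketch in Python of how a demonstrated fingertip trajectory could be replayed relative to an object point in the robot’s scene. This is not EgoZero’s actual code; the function name, the array-based trajectory format, and the static-object assumption are illustrative only.</p><pre><code>import numpy as np

# Illustrative sketch (not the EgoZero implementation): express the
# demonstrated hand motion relative to a tracked object point, then replay
# that same relative motion against the object's location in the robot's
# scene, so the gripper point moves relative to the object the way the
# fingertip did.

def retarget_points(hand_pts, demo_object_pt, scene_object_pt):
    """hand_pts: (T, 3) fingertip positions from the human demonstration.
    demo_object_pt: (3,) tracked object point during the demonstration.
    scene_object_pt: (3,) the corresponding object point in the robot's scene.
    Returns (T, 3) gripper-point targets whose motion relative to the object
    matches the fingertip's motion relative to the object."""
    rel = np.asarray(hand_pts, dtype=float) - np.asarray(demo_object_pt, dtype=float)
    return np.asarray(scene_object_pt, dtype=float) + rel

if __name__ == "__main__":
    # Toy example: the demo fingertip approaches a deli roll at (0.50, 0.00, 0.10);
    # the robot sees a ciabatta at (0.30, 0.25, 0.10) in its own workspace.
    demo_bread = [0.50, 0.00, 0.10]
    fingertip_traj = [[0.20, 0.10, 0.30], [0.35, 0.05, 0.20], [0.48, 0.01, 0.12]]
    gripper_targets = retarget_points(fingertip_traj, demo_bread, [0.30, 0.25, 0.10])
    print(gripper_targets)  # way-points a low-level controller would track
</code></pre><p>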
Tapping into everyday interactions with smart glasses could help fill that gap.</p>]]></description><pubDate>Tue, 19 Aug 2025 14:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/smart-glasses-robot-training</guid><category>Training data</category><category>Smart glasses</category><category>Robotics</category><category>Meta</category><dc:creator>Gwendolyn Rak</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/conceptual-collage-of-a-robotic-arm-reaching-down-between-two-circular-photos-of-experiments-meant-to-resemble-eyeglasses.jpg?id=61469620&amp;width=980"></media:content></item><item><title>Video Friday: SCUTTLE</title><link>https://spectrum.ieee.org/video-friday-scuttle-robot</link><description><![CDATA[
  25. <img src="https://spectrum.ieee.org/media-library/black-robotic-snake-navigates-rocky-terrain-in-bright-sunlight.jpg?id=61467306&width=1245&height=700&coordinates=0%2C179%2C0%2C180"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://www.ro-man2025.org/">RO-MAN 2025</a>: 25–29 August 2025, EINDHOVEN, THE NETHERLANDS</h5><h5><a href="https://clawar.org/clawar2025/">CLAWAR 2025</a>: 5–7 September 2025, SHENZHEN, CHINA</h5><h5><a href="https://actuate.foxglove.dev/">ACTUATE 2025</a>: 23–24 September 2025, SAN FRANCISCO</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="http://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="uch3yx-mjta"><em>Check out our latest innovations on SCUTTLE, advancing multilegged mobility anywhere.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a0aae165324d5bb802ad7b69289e1a17" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/uCh3Yx-MjtA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://groundcontrolrobotics.com/">GCR</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="hoornv3la0k">That laundry-folding robot we’ve been working on for 15 years is still not here.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6913d1c039c12817ac9b44f8cf2a843e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/HOoRnv3lA0k?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Honestly I think <a data-linked-post="2668901591" href="https://spectrum.ieee.org/figure-new-humanoid-robot" target="_blank">Figure</a> could learn a few tricks from vintage UC Berkeley PR2, though.</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="5359d4bdf7e1c7fd0b5dace2d9fe6eb3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/gy5g33S0Gzo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-caption" placeholder="Add Photo Caption...">- YouTube</small> </p><p>[ <a href="https://www.figure.ai/">Figure</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="yznay63wcdw">Tensegrity robots are so cool, but so hard—it’s good to 
see progress.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bb4273eaa4a6ac51826ebce722ef0f8d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/YznAy63wcdw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotics.umich.edu/news/2025/advanced-actuator-tensegrity-robot/">Michigan Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="dy6ysug9f00">We should find out next week how quick this is.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="39f525f7e35533b9604c65c4cfa35c60" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dY6YSUG9F00?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/">Unitree</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="v6w_dtkwvtc"><em>We introduce a methodology for task-specific design optimization of multirotor Micro Aerial Vehicles. By leveraging reinforcement learning, Bayesian optimization, and covariance matrix adaptation evolution strategy, we optimize aerial robot designs guided only by their closed-loop performance in a considered task. Our approach systematically explores the design space of motor pose configurations while ensuring manufacturability constraints and minimal aerodynamic interference. Results demonstrate that optimized designs achieve superior performance compared to conventional multirotor configurations in agile waypoint navigation tasks, including against fully actuated designs from the literature. We build and test one of the optimized designs in the real world to validate the sim2real transferability of our approach.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b08e05f9d393dff055a3825f45335718" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/V6w_DTKWvtc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.autonomousrobotslab.com/">ARL</a> ]</p><p>Thanks, Kostas!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="5gpphya6tkc">I guess legs are required for this inspection application because of the stairs right at the beginning? 
But sometimes, that’s how the world is.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4614b3061ab10a13ec6c023f7f79f6b8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5GPphya6tkc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">DEEP Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="lu-coco8xzy"><em>The Institute of Robotics and Mechatronics at DLR has a long tradition in developing multifingered hands, creating novel mechatronic concepts as well as autonomous grasping and manipulation capabilities. The range of hands spans from Rotex, a first two-fingered gripper for space applications, to the highly anthropomorphic Awiwi Hand and variable stiffness end effectors. This video summarizes the developments of DLR in this field over the past 30 years, starting with the Rotex experiment in 1993.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9b730c1c696e64c9a91c8e4a809a1554" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/lu-CoCO8xZY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dlr.de/en/rm">DLR RM</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="sgf0nkx8t9a"><em>The quest for agile quadrupedal robots is limited by handcrafted reward design in reinforcement learning. While animal motion capture provides 3D references, its cost prohibits scaling. We address this with a novel video-based framework. 
The proposed framework significantly advances robotic locomotion capabilities.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7c7d4ff76c677d6b936f4ce3a4bd4067" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/SGf0Nkx8t9A?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arclab.hku.hk/">Arc Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="la1dh0smkkm">Serious question: Why don’t <a data-linked-post="2666662286" href="https://spectrum.ieee.org/humanoid-robots" target="_blank">humanoid robots</a> sit down more often?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ae30d15a26a701f11b24b3ec2ddfe804" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/la1dh0SmkkM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.engineai.com.cn/">EngineAI</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="mcqixha8ykg">And now, this.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9beea64e285ec85616820247546e13e4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/MCqIxHA8YKg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.limxdynamics.com/en">LimX Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="0kb6fz2kjt8"><em>NASA researchers are currently using wind tunnel and flight tests to gather data on an electric vertical takeoff and landing (eVTOL) scaled-down small aircraft that resembles an air taxi that aircraft manufacturers can use for their own designs. By using a smaller version of a full-sized aircraft called the RAVEN Subscale Wind Tunnel and Flight Test (RAVEN SWFT) vehicle, NASA is able to conduct its tests in a fast and cost-effective manner. 
</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="696c6da4358a4efdd5289afd0142bcc6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0KB6FZ2kjT8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.nasa.gov/aeronautics/air-taxi-flight-controls/">NASA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="8otwr-s6qus"><em>This video details the advances in orbital manipulation made by DLR’s Robotic and Mechatronics Center over the past 30 years, paving the way for the development of robotic technology for space sustainability.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="54f8939f53a513bd6f528681ca5eae2e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8OtwR-S6QUs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dlr.de/en/rm">DLR RM</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="bmfpvcu16sq"><em>This summer, a team of robots explored a simulated Martian landscape in Germany, remotely guided by an astronaut aboard the International Space Station. This marked the fourth and final session of the Surface Avatar experiment, a collaboration between ESA and the German Aerospace Center (‪DLR) to develop how astronauts can control robotic teams to perform complex tasks on the Moon and Mars.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5c6b999b15ede0cf1ccd7e925d754de8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/BMFPVCu16SQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://blogs.esa.int/exploration/human-minds-robotic-hands/">ESA</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 15 Aug 2025 16:00:04 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-scuttle-robot</guid><category>Video friday</category><category>Robotics</category><category>Crawler</category><category>Humanoid robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/black-robotic-snake-navigates-rocky-terrain-in-bright-sunlight.jpg?id=61467306&amp;width=980"></media:content></item><item><title>Designing for Functional Safety: A Developer's Introduction</title><link>https://events.bizzabo.com/749990?utm_source=Wiley&amp;utm_medium=Spectrum</link><description><![CDATA[
  26. <img src="https://spectrum.ieee.org/media-library/image.png?id=61453640&width=980"/><br/><br/><p>Welcome to your essential guide to <span>functional safety, tailored specifically for product developers. In a world where technology is increasingly integrated into every aspect of our lives—from industrial robots to autonomous vehicles—the potential for harm from product malfunctions makes functional safety not just important, but critical. </span></p><p>This webinar cuts through the complexity to provide a clear understanding of what functional safety truly entails and why it’s critical for product success. We’ll start by defining functional safety not by its often-confusing official terms, but as a structured methodology for managing risk through defined engineering processes, essential product design requirements, and probabilistic analysis. The “north star” goals? To ensure your <span>product not only works reliably but, if it does fail, it does so in a safe and predictable manner.</span></p><p>We’ll dive into two fundamental concepts: the <span>Safety Lifecycle, a detailed engineering process focused on design quality to minimize systematic failures, and Probabilistic, Performance-Based Design using reliability metrics to minimize random hardware failures. You’ll learn about IEC 61508, the foundational standard for functional safety, and how numerous industry-specific standards derive from it.</span></p><p>The webinar will walk you through the Engineering Design phases: analyzing hazards and required <span>risk reduction, realizing optimal designs, and ensuring safe operation. We’ll demystify the Performance Concept and the critical Safety Integrity Level (SIL), explaining its definition, criteria (systematic capability, architectural constraints, PFD), and how it relates to industry-specific priorities.</span></p><p>Discover key Design Verification techniques like DFMEA/DDMA and FMEDA, emphasizing how these tools help identify and address problems early in development. We’ll detail the <span>FMEDA technique showing how design decisions directly impact predictions like safe and dangerous failure rates, diagnostic coverage, and useful life. Finally, we’ll cover Functional Safety Certification, explaining its purpose, process, and what adjustments to your development process can set you up for success.</span></p><p><span><span><a href="https://events.bizzabo.com/749990?utm_source=Wiley&utm_medium=Spectrum" target="_blank">Register now for this free webinar!</a></span></span></p>]]></description><pubDate>Fri, 15 Aug 2025 11:52:07 +0000</pubDate><guid>https://events.bizzabo.com/749990?utm_source=Wiley&amp;utm_medium=Spectrum</guid><category>Functional safety</category><category>Risk management</category><category>Technology integration</category><category>Type:webinar</category><dc:creator>exida</dc:creator><media:content medium="image" type="image/png" url="https://assets.rbl.ms/61453640/origin.png"></media:content></item><item><title>Bug-size Bots Get More Nimble With Flexible Actuators</title><link>https://spectrum.ieee.org/soft-robot-actuators-bugs</link><description><![CDATA[
  27. <img src="https://spectrum.ieee.org/media-library/a-cartoon-spider-stands-on-a-sandy-surface-next-to-a-small-bug-sized-flexible-robot.jpg?id=61443734&width=1245&height=700&coordinates=0%2C310%2C0%2C311"/><br/><br/><p>Small, autonomous robots that can access cramped environments could help with future search-and-rescue operations and inspecting infrastructure details that are difficult to access by people or larger bots. However, the conventional, rigid motors that many robots rely on are difficult to miniaturize to these scales, because they easily break when made smaller or can no longer overcome friction forces.</p><p>Now, researchers have developed a muscle-inspired elasto-electromagnetic system to build insect-size <a data-linked-post="2660256392" href="https://spectrum.ieee.org/soft-robotics" target="_blank">“soft” robots</a> made of flexible materials. “It became clear that existing soft robotic systems at this scale still lack actuation mechanisms that are both efficient and autonomous,” says <a href="https://en.westlake.edu.cn/faculty/hanqing-jiang.html" target="_blank">Hanqing Jiang</a>, a professor of mechanical engineering at Westlake University in Hangzhou, China. Instead, they “often require harsh stimuli such as high voltage, strong external fields, or intense light that hinder their real-world deployment.”</p><p>Muscles function much like actuators: body parts move through the contraction and relaxation of muscle fibers. When connected to the rest of the body, the brain and other electrical systems in the body allow animals to make a range of movements, including movement patterns that generate disproportionately large forces relative to their body mass.</p><h2>Muscle-Inspired Actuator Technology</h2><p>The new actuator is made of a flexible silicone polymer called polydimethylsiloxane, a <a data-linked-post="2657538742" href="https://spectrum.ieee.org/the-men-who-made-the-magnet-that-made-the-modern-world" target="_blank">neodymium magnet</a>, and an electrical coil intertwined with soft magnetic iron spheres. The researchers fabricated the actuators using a 2D molding process that can manufacture them at millimeter, centimeter, and decimeter scales. It is also scalable for larger, more powerful soft devices. <span>“We shifted focus from material response to structural design in soft materials and combined it with static magnetic forces to create a novel actuation mechanism,” says Jiang. The researchers published their work in <em><a href="https://www.nature.com/articles/s41467-025-62182-2#Abs1" target="_blank">Nature Communications</a>.</em></span></p><p>The new actuator is able to contract like a muscle using a balance between elastic and magnetic forces. To make the actuator contract, <span>an electrical current is applied to the coil, creating a Lorentz force between the electrical coil and the neodymium magnet. The actuator then deforms as the iron spheres respond to the increased force, and that deformation can be used to move the robot itself.</span> The flexible polymer ensures that the system can both deform and recover back to its original state when the current is no longer applied.</p><p>The system tested by the researchers achieved an output force of 210 newtons per kilogram, operates at a low voltage below 4 volts, and is powered by onboard batteries. It can also undergo large deformations, up to a 60 percent contraction ratio. 
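</p><p>As a rough, back-of-the-envelope illustration of how those reported figures combine (a sketch for orientation only, not a calculation from the paper), the specific force and contraction ratio translate into an output force and stroke for a given actuator; the actuator mass and resting length below are hypothetical example values:</p><pre><code class="language-python"># Illustrative sketch only. The specific force (210 N/kg) and 60 percent
# contraction ratio are the figures reported in the article; the actuator
# mass and resting length are hypothetical example values, not measurements.
SPECIFIC_FORCE_N_PER_KG = 210.0   # reported output force per unit mass
CONTRACTION_RATIO = 0.60          # reported maximum contraction

actuator_mass_kg = 0.002          # 2 g -- hypothetical example
resting_length_mm = 10.0          # hypothetical example

output_force_n = SPECIFIC_FORCE_N_PER_KG * actuator_mass_kg   # ~0.42 N
stroke_mm = CONTRACTION_RATIO * resting_length_mm             # ~6 mm

print(f"Estimated output force: {output_force_n:.2f} N")
print(f"Estimated stroke: {stroke_mm:.1f} mm")
</code></pre><p>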
The researchers made it more energy efficient by not requiring continuous power to maintain a stable state when the actuator isn’t moving—a technique similar to how mollusks stay in place using their catch muscles, which can maintain high tension over long periods of time by latching together thick and thin muscle filaments to conserve energy.</p><h2>Autonomous Insect-Size Soft Robots</h2><p>The researchers used the actuators to develop a series of insect-size soft robots that could exhibit autonomous adaptive crawling, swimming, and jumping movements in a range of environments.</p><p>One such series of bug-size bots was a group of compact soft inchworm crawlers, just 16-by-10-by-10 millimeters in size and weighing only 1.8 grams. The robots were equipped with a translational joint, a 3.7-volt (30-milliampere-hour) lithium-ion battery, and an integrated control circuit. This setup enabled the robots to crawl using sequential contractions and relaxation—much like a caterpillar. <span>Despite its small size, the crawler exhibited an output force of 0.41 N, which is 8 to 45 times that of existing insect-scale soft crawler robots. </span></p><p><span>This output force enabled the robot to traverse difficult-to-navigate terrain—including soil, rough stone, PVC, glass, wood, and inclines between 5 and 15 degrees—while keeping a consistent speed. </span><span>The bug bots were also found to be very resilient to impacts and falling. They suffered no damage and continued to work even after a 30-meter drop off the side of a building.</span></p><p><span>The researchers also developed 14-by-20-by-19-mm legged crawlers, weighing 1.9 g with an output force of 0.48 N, that crawled like inchworms. These used rotational elasto-electromagnetic joints to move the legs backward and forward. The researchers also built a</span><span> 19-by-19-by-11-mm swimming robot that weighed 2.2 g with an output force of 0.43 N.</span></p><p><span>Alongside testing how the bots move on different surfaces, the researchers built a number of obstacle courses for them to navigate while performing sensing operations. The inchworm bot was put into an obstacle course featuring narrow and complex paths and used a humidity sensor to detect sources of moisture. The swimming bots were tested in both the lab and a river. A course was built in the lab, where the swimmer had to perform chemical sensing operations in a narrow chamber using an integrated miniature ethanol gas detector.</span></p><p><span>Jiang says the researchers are now looking at developing sensor-rich robotic swarms capable of distributed detection, decision-making, and collective behavior. 
“By coordinating many small robots, we aim to create systems that can cover wide areas, adapt to dynamic environments, and respond more intelligently to complex tasks.”</span></p><p>Jiang says they’re also looking into flying and other swimming movements enabled by the elasto-electromagnetic system, <span>including a jellyfish-like soft robot for deep-sea exploration and marine research.</span></p>]]></description><pubDate>Tue, 12 Aug 2025 12:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/soft-robot-actuators-bugs</guid><category>Actuators</category><category>Robotics</category><category>Soft robots</category><category>Insect robots</category><dc:creator>Liam Critchley</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-cartoon-spider-stands-on-a-sandy-surface-next-to-a-small-bug-sized-flexible-robot.jpg?id=61443734&amp;width=980"></media:content></item><item><title>Video Friday: Unitree’s A2 Quadruped Goes Exploring</title><link>https://spectrum.ieee.org/video-friday-exploration-robots</link><description><![CDATA[
  28. <img src="https://spectrum.ieee.org/media-library/robotic-dog-running-in-an-illuminated-arched-hallway-at-night-and-smashing-through-a-pane-of-glass.gif?id=61441393&width=1245&height=700&coordinates=0%2C31%2C0%2C32"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://www.whrgoc.com/">World Humanoid Robot Games</a>: 15–17 August 2025, BEIJING</h5><h5><a href="https://www.ro-man2025.org/">RO-MAN 2025</a>: 25–29 August 2025, EINDHOVEN, THE NETHERLANDS</h5><h5><a href="https://clawar.org/clawar2025/">CLAWAR 2025</a>: 5–7 September 2025, SHENZHEN, CHINA</h5><h5><a href="https://actuate.foxglove.dev/">ACTUATE 2025</a>: 23–24 September 2025, SAN FRANCISCO</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="https://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="ve9usu7zplu"><em>The A2 sets a new standard in quadruped robots, balancing endurance, strength, speed, and perception.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0ca7f368fe4e99ec59083a72363fa988" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ve9USu7zpLU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>The A2 weighs 37 kg (81.6 lbs) unloaded. Fully loaded with a 25 kg (55 lb) payload, it can continuously walk for 3 hours or approximately 12.5 km. Unloaded, it can continuously walk for 5 hours or approximately 20 km. Hot-swappable dual batteries enable seamless battery swap and continuous runtime for any mission.</em></blockquote><p>[ <a href="https://www.unitree.com/A2">Unitree</a> ]</p><p>Thanks, William!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="bnyxqwc9qfs"><em>ABB is working with Cosmic Buildings to reshape how communities rebuild and transform construction after disaster. 
In response to the 2025 Southern California wildfires, Cosmic Buildings are deploying mobile robotic microfactories to build modular homes on-site—cutting construction time by 70% and costs by 30%.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="dfaf34433a46424936827c5bc7becaee" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/BnYXQwC9QFs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://new.abb.com/news/detail/128070/abb-and-cosmic-use-ai-powered-robots-to-rebuild-homes-in-los-angeles-area">ABB</a> ]</p><p>Thanks, Caitlin!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="icxuq4tf4fy">How many slightly awkward engineers can your <a data-linked-post="2666662286" href="https://spectrum.ieee.org/humanoid-robots" target="_blank">humanoid robot</a> pull?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9f18c2671a3b7299ac2359b4fb765c87" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/IcXuQ4tF4FY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.magiclab.top/">MagicLab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="u4d6v3ohsoi">The physical robot hand does some nifty stuff at about 1 minute in.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0e3acbdf8660304575fa0dc3d16fba39" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/u4d6v3ohsOI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://srl.ethz.ch/">ETH Zurich Soft Robotics Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="rfspnvjdacq">Biologists, you can all go home now.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3acdbef0397f4c122fe0d2ae33c0d51e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/rFSpNVJDAcQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://global.agilex.ai/products/piper">AgileX</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="zx0qg1gxrji">The <a data-linked-post="2661714397" href="https://spectrum.ieee.org/robocup-robot-soccer" target="_blank">World Humanoid Robot Games</a> start next week in Beijing, and of course Tech United Eindhoven are there.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5560b9f92962f0654e3b0d996b4cdd26" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ZX0Qg1GXRjI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://techunited.nl/?page_id=2135&lang=en">Tech United</a> ]</p><div 
class="horizontal-rule"></div><blockquote class="rm-anchors" id="bxfheeecjfy"><em>Our USX-1 Defiant is a new kind of <a data-linked-post="2673356350" href="https://spectrum.ieee.org/video-friday-robot-metabolism" target="_blank">autonomous maritime platform</a>, with the potential to transform the way we design and build ships. As the team prepares Defiant for an extended at-sea demonstration, program manager Greg Avicola shares the foundational thinking behind the breakthrough vessel.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="17451c23a36e7c4972e3dc77d8252775" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/BxFhEEeCjFY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.darpa.mil/research/programs/no-manning-required-ship">DARPA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="_dnhdeqmf-4"><em>After loss, how do you translate grief into creation? Meditation Upon Death is Paul Kirby’s most personal and profound painting—a journey through love, loss, and the mystery of the afterlife. Inspired by a conversation with a Native American shaman and years of artistic exploration, Paul fuses technology and traditional art to capture the spirit’s passage beyond. With 5,796 brushstrokes, a custom-built robotic painting system, and a vision shaped by memory and devotion, this is the most important painting he has ever made.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="fb1c23e12d63fa341e08156aaf11836f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/_DNhdeqMf-4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://thefusepathway.com/studio/robotic-art/">Dulcinea</a> ]</p><p>Thanks, Alexandra!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="oz0lizn3umk"><em>In the fourth installment of our Moonshot Podcast Deep Dive video interview series, X’s Captain of Moonshots Astro Teller sits down with Andrew Ng, the founder of Google Brain and DeepLearning.AI, for a conversation about the history of neural network research and how Andrew’s pioneering ideas led to some of the biggest breakthroughs in modern-day AI.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3332c471e624c98c00bbdc5c80fca61e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Oz0LizN3uMk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://x.company/moonshotpodcast/">Moonshot Podcast</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 08 Aug 2025 15:30:04 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-exploration-robots</guid><category>Video friday</category><category>Robotics</category><category>Quadruped robots</category><category>Factory robots</category><category>Humanoid robots</category><category>Dexterity</category><dc:creator>Evan Ackerman</dc:creator><media:content 
medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/robotic-dog-running-in-an-illuminated-arched-hallway-at-night-and-smashing-through-a-pane-of-glass.gif?id=61441393&amp;width=980"></media:content></item><item><title>Video Friday: Dance With CHILD</title><link>https://spectrum.ieee.org/video-friday-child-humanoid-robot</link><description><![CDATA[
  29. <img src="https://spectrum.ieee.org/media-library/man-controls-humanoid-robot-with-arm-gestures-wearing-harnessed-remote-control-device.png?id=61417799&width=1245&height=700&coordinates=0%2C256%2C0%2C257"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://www.ro-man2025.org/">RO-MAN 2025</a>: 25–29 August 2025, EINDHOVEN, NETHERLANDS</h5><h5><a href="https://clawar.org/clawar2025/">CLAWAR 2025</a>: 5–7 September 2025, SHENZHEN, CHINA</h5><h5><a href="https://actuate.foxglove.dev/">ACTUATE 2025</a>: 23–24 September 2025, SAN FRANCISCO</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="https://www.iros25.org/" target="_blank">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="66_oit-mic0"><em>Many parents naturally teach motions to their child while using a baby carrier. In this setting, the parent’s range of motion fully encompasses the child’s, making it intuitive to scale down motions in a puppeteering manner. This inspired UIUC KIMLAB to build CHILD: Controller for Humanoid Imitation and Live Demonstration.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a2d9d8f8030812b823d7d0f055374f1b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/66_OIT-mic0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>The role of <a data-linked-post="2650233002" href="https://spectrum.ieee.org/video-friday-sarcos-guardian-xt-teleoperated-dexterous-robot" target="_blank">teleoperation</a> has grown increasingly important with the rising interest in collecting physical data in the era of Physical/Embodied AI. We demonstrate the capabilities of CHILD through loco-manipulation and full-body control experiments using the Unitree G1 and other PAPRAS dual-arm systems. 
To promote accessibility and reproducibility, we open-source the hardware design.</em></blockquote><p>[ <a href="https://uiuckimlab.github.io/CHILD-pages/">KIMLAB</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="v1q4su54iho">This costs less than US $6,000.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0ada267e6748cd47a1a5d4892fd16d80" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/v1Q4Su54iho?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/R1">Unitree</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="wlftez-qf1e">If I wasn’t sold on one of these little Reachy Minis before. I definitely am now.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="af1e51a7a4da99ad62c0ae498b46eb85" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wLftEz-QF1E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pollen-robotics.com/reachy-mini/">Pollen</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="5ttcqrra4um"><em>In this study, we propose a falconry-like interaction system in which a <a data-linked-post="2650280287" href="https://spectrum.ieee.org/high-performance-ornithopter-drone" target="_blank">flapping-wing drone</a> performs autonomous palm-landing motion on a human hand. To achieve a safe approach toward humans, our motion planning method considers both physical and psychological factors.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="eb745b1bb23fb9132d47cad87279ffd9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5TtCqRrA4UM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>I should point out that palm landings are not falconry-like at all, and that if you’re doing falconry right, the bird should be landing on your wrist instead. I have other hobbies besides robots, you know!</p><p>[ <a href="https://arxiv.org/pdf/2507.17144">Paper</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="7r8ad4o_im4">I’m not sure that augmented reality is good for all that much, but I do like this use case of interactive robot help.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="23f7a7ccbf0527830eb46bca6625bcd6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7R8Ad4o_IM4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://mertcookimg.github.io/mrhad/">MRHaD</a> ]</p><p>Thanks, Masato!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ms5mn2zhp4e"><em>LimX Dynamics officially launched its general-purpose full-size humanoid robot LimX Oli. It’s currently available only in Mainland China. 
A global version is coming soon.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="20eb02f2d67ab771f07870f4ab764125" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Ms5Mn2zHp4E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Standing at 165 cm and equipped with 31 active degrees of freedom (excluding end-effectors), LimX Oli adopts a general-purpose humanoid configuration with modular hardware-software architecture and is supported by a development tool chain. It is built to advance embodied AI development from algorithm research to real-world deployment.</em></blockquote><p>[ <a href="https://www.limxdynamics.com/en">LimX Dynamics</a> ]</p><p>Thanks, Jinyan!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="mzvo2sfq6su"><em>Meet Treadward – the newest robot from HEBI Robotics, purpose-built for rugged terrain, inspection missions, and real-world fieldwork. Treadward combines high mobility with extreme durability, making it ideal for challenging environments like waterlogged infrastructure, disaster zones, and construction sites. With a compact footprint and treaded base, it can climb over debris, traverse uneven ground, and carry substantial payloads.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="87b3b4cb069291f1eb4eebb4a473216d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/MZvO2Sfq6sU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.hebirobotics.com/robotic-mobile-platforms">HEBI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="7rymfdfhnge"><em>PNDbotics made a stunning debut at the 2025 World Artificial Intelligence Conference (WAIC) with the first-ever joint appearance of its full-sized humanoid robot Adam and its intelligent data-collection counterpart Adam-U.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b231bad3f1ac13e40bf1f811ab5eb0fb" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7RymFDfhNgE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://pndbotics.com/">PNDbotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="gi7ftpr-gve"><em>This paper presents the design, development, and validation of a fully autonomous dual-arm aerial robot capable of mapping, localizing, planning, and grasping parcels in an intra-logistics scenario. 
The aerial robot is intended to operate in a scenario comprising several supply points, delivery points, parcels with tags, and obstacles, generating the mission plan from the voice commands given by the user.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ba057e1be857ac37476bd5badde66959" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/gI7FTPr-gVE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://grvc.us.es/">GRVC</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="pbu2cwvqixs"><em>We left the room. They took over. No humans. No instructions. Just robots...moving, coordinating, showing off. It almost felt like…they were staging something.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="28275873cd7ab5600b2c4b0767f2bde4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pbU2cWVqixs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://global.agilex.ai/">AgileX</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="seiukshccic"><em>TRI’s internship program offers a unique opportunity to work closely with our researchers on technologies to improve the quality of life for individuals and society. Here’s a glimpse into that experience from some of our 2025 interns!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6f3efbec36e37ad97011ed4b1fb39077" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/SEIukShccic?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.tri.global/careers">TRI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="mjhesgakx1w"><em>In the third installment of our Moonshot Podcast Deep Dive video interview series, X’s Captain of Moonshots Astro Teller sits down with Dr. 
Catie Cuan, robot choreographer and former artist in residence at Everyday Robots, for a conversation about how dance can be used to build beautiful and useful robots that people want to be around.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0d587bff8c111a84b5438ea0b767c296" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/MJhesGakX1w?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.youtube.com/playlist?list=PL7og_3Jqea4U6VgjOfaCGnqp6AiuVfgrU">Moonshot Podcast</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 01 Aug 2025 17:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-child-humanoid-robot</guid><category>Aerial robot</category><category>Humanoid robot</category><category>Robotics events</category><category>Video friday</category><category>Robotics</category><category>Humanoid robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/man-controls-humanoid-robot-with-arm-gestures-wearing-harnessed-remote-control-device.png?id=61417799&amp;width=980"></media:content></item><item><title>Robots That Learn to Fear Like Humans Survive Better</title><link>https://spectrum.ieee.org/robot-risk-assessment-fear</link><description><![CDATA[
  30. <img src="https://spectrum.ieee.org/media-library/silhouette-of-a-human-head-with-circuitry-pattern-extending-from-the-brain.jpg?id=61268049&width=1245&height=700&coordinates=0%2C412%2C0%2C413"/><br/><br/><p>
  31. <em>This article is part of our exclusive <a href="https://spectrum.ieee.org/collections/journal-watch/" target="_blank">IEEE Journal Watch series</a> in partnership with IEEE Xplore.</em>
  32. </p><p><span>Imagine walking downtown when you hear a loud bang coming from the construction site across the street—you may have the impulse to freeze or even duck down. This type of quick, instinctual reaction is one of the most basic but important evolutionary processes we have to protect ourselves and survive in unfamiliar settings.</span></p><p><span>Now, researchers are beginning to explore how a similar, fast-reacting thought process can be translated into robots. The idea is to <a data-linked-post="2650274303" href="https://spectrum.ieee.org/how-to-build-a-moral-robot" target="_blank">program robots to make decisions</a> the same way that humans do, based on our innate emotional responses to unknown stimuli—and in particular our fear response. The </span><a href="https://ieeexplore.ieee.org/document/11054284" target="_blank"><span>results</span></a><span>, published 27 June in </span><a href="https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=7083369" target="_blank"><span><em>IEEE Robotics and Automation Letters</em></span></a><span>, show that the approach can significantly enhance robots’ ability to assess risk and avoid dangerous situations.</span></p> <p><a href="https://www.polito.it/personale?p=039906" target="_blank">Alessandro Rizzo</a>, <span>an associate professor in automation engineering and robotics at the <a href="https://www.polito.it/" target="_blank">Polytechnic University of Turin</a> in Italy, led the study. He notes that robots currently face many challenges in adapting to dynamic environments while enacting self-preserving strategies. This is in large part because their control systems are often designed to accomplish very specific tasks. “As a result, robots may struggle to operate effectively in complex and changing conditions,” Rizzo says.</span></p><h2>How the Human Brain Responds to Risk</h2><p>Humans, on the other hand, are able to respond to many different and unique stimuli that we encounter. It’s theorized that our brains have two different ways to calculate, assess, and respond to risk in these scenarios. </p><p>The first involves a very innate response where we detect external stimuli (for example, a loud bang from a construction site) and our brains make very quick, emotional decisions (such as to freeze or duck). In a way, our brains are swiftly responding to raw data in these scenarios, rather than taking the time to more thoroughly process it. </p><p>According to a theory on how our brains work, called the dual-pathway hypothesis, this reaction is elicited by the “<a href="https://www.numberanalytics.com/blog/amygdala-emotional-regulation-mental-health" target="_blank">low road</a>,” neural circuitry responsible for emotions, driven by the amygdala. But when our brains instead use experience and more articulated reasoning involving our prefrontal cortex, this is the second, “high road” pathway to respond to stimuli.</p><p>Rizzo and a doctoral candidate in his lab, <a href="https://www.polito.it/en/staff?p=andrea.usai" target="_blank">Andrea Usai</a>, were curious to see how these two different approaches for confronting risky situations would play out in robots that have to navigate unfamiliar environments. They began by designing a control system for robots that emulates a fear response via the low road.</p><p>“We focused on fear, as it is one of the most studied emotions in neuroscience and, in our view, the one with the greatest potential for robotics,” says Usai. 
“Fear is closely related to self-preservation and rapid responses to danger, both of which are critical for adaptive behavior.”</p><h2>Reinforcement Learning in Robotics</h2><p>To emulate the fear response in their robot, the researchers designed a controller based on reinforcement learning, which helps the robot dynamically adjust its priorities and constraints in real time based on raw data of its surroundings. These results inform the behavior of a second algorithm called a nonlinear model predictive controller, which sets a corresponding motor pattern for the robot’s locomotion. </p><p>Through simulations, Rizzo and Usai tested how their robot navigates unfamiliar environments, comparing it to other robot control systems without the fear element. The simulations involved different scenarios, with various dangerous and nondangerous obstacles, which are either static or moving around the simulated environment. </p><p>The results show that the robot with the low-road programming was able to navigate a smoother and safer path toward its goal compared to conventional robot designs. For example, in one of the scenarios with hazards dynamically moving around, the low-road robot navigated around dangerous objects with a wider berth of about 3.1 meters, whereas the other two conventional robots tested in this study came within a harrowing 0.3 and 0.8 meters of dangerous objects. </p><p>Usai says there are many different scenarios where this low-road approach to robotics could be useful, including cases of object manipulation, surveillance, and rescue operations, where robots must deal with hazardous conditions and may need to adopt more cautious behavior. </p><p>But as Usai notes, the low-road approach is very reactive in nature and is better suited for very quick decisions that are needed in the short term. Therefore, the research team is working on a control design that mimics the high road that, while complementing the low road, could help robots make more rational, long-term decisions. </p><p>The researchers are considering doing this using <a data-linked-post="2666621738" href="https://spectrum.ieee.org/how-ai-can-personalize-education" target="_blank">multimodal large language models</a>, like ChatGPT. As Rizzo explains, “These models could help simulate some of the core functions of the human prefrontal cortex, such as decision-making, strategic planning, and context evaluation, allowing us to emulate more cognitively driven responses in robots.” </p><p>“Looking ahead, it would also be interesting trying to extend the architecture to incorporate multiple emotions,” Rizzo adds, “enabling a richer and more nuanced form of adaptive behavior in robotic systems.”</p><p><em>This article appears in the October 2025 print issue as “Robots Do Better When They Can Fear.”</em></p>]]></description><pubDate>Sat, 26 Jul 2025 13:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/robot-risk-assessment-fear</guid><category>Robotics</category><category>Risk assessment</category><category>Surveillance</category><category>Journal watch</category><dc:creator>Michelle Hampson</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/silhouette-of-a-human-head-with-circuitry-pattern-extending-from-the-brain.jpg?id=61268049&amp;width=980"></media:content></item><item><title>Video Friday: Skyfall Takes on Mars With Swarm Helicopter Concept</title><link>https://spectrum.ieee.org/video-friday-skyfall-mars-helicopter</link><description><![CDATA[
  33. <img src="https://spectrum.ieee.org/media-library/artist-s-concept-of-a-drone-deployment-system-on-mars-6-propellers-connected-by-latticed-scaffolding-and-a-protective-shell-abo.png?id=61322340&width=1245&height=700&coordinates=0%2C0%2C0%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://www.ro-man2025.org/">RO-MAN 2025</a>: 25–29 August 2025, EINDHOVEN, NETHERLANDS</h5><h5><a href="https://clawar.org/clawar2025/">CLAWAR 2025</a>: 5–7 September 2025, SHENZHEN, CHINA</h5><h5><a href="https://actuate.foxglove.dev/">ACTUATE 2025</a>: 23–24 September 2025, SAN FRANCISCO</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="https://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="gqaupq3_xrs"><em>AeroVironment revealed Skyfall—a potential future mission concept for next-generation Mars Helicopters developed with NASA’s Jet Propulsion Laboratory (JPL) to help pave the way for human landing on Mars through autonomous aerial exploration. <br/><br/>The concept is heavily focused on rapidly delivering an affordable, technically mature solution for expanded Mars exploration that would be ready for launch by 2028. Skyfall is designed to deploy six scout helicopters on Mars, where they would explore many of the sites selected by NASA and industry as top candidate landing sites for America’s first Martian astronauts. While exploring the region, each helicopter can operate independently, beaming high-resolution surface imaging and subsurface radar data back to Earth for analysis, helping ensure crewed vehicles make safe landings at areas with maximum amounts of water, ice, and other resources.<br/><br/></em><em>The concept would be the first to use the “Skyfall Maneuver”—an innovative entry, descent, and landing technique whereby the six rotorcraft deploy from their entry capsule during its descent through the Martian atmosphere. 
By flying the helicopters down to the Mars surface under their own power, Skyfall would eliminate the necessity for a landing platform–traditionally one of the most expensive, complex, and risky elements of any Mars mission.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3eecc27ea351b4023bfcb11dca679a50" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GqAuPq3_XRs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.avinc.com/resources/press-releases/view/av-reveals-skyfall-future-concept-next-gen-mars-helicopters-for-exploration-and-human-landing-preparation">AeroVironment</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="g6ychxkn5nk">By far the best part of videos like these is watching the expressions on the faces of the <a data-linked-post="2650275991" href="https://spectrum.ieee.org/students-race-driverless-cars-in-germany-in-formula-student-competition" target="_blank">students</a> when their robots succeed at something.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="01c5b4fc9378281f6acaf253235b4033" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/g6YchXkN5Nk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://railab.kaist.ac.kr/">RaiLab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="siuwjwkjdgs">This is just a rendering of course, but the real thing should be showing up on 6 August.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8ab996f34e2be8506e2779ee52a20dcd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/siuwJWkjDgs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.fftai.com/">Fourier</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="v_uakh6swyc"><em>Top performer in its class! Less than two weeks after its last release, MagicLab unveils another breakthrough — MagicDog-W, the wheeled quadruped robot. 
Cyber-flex, dominate all terrains!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="766663aff3271ed07e42bb046c5c1c14" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/V_uakh6swYc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.magiclab.top/dog">MagicLab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="zahmhge_f-m"><em>Inspired by the octopus’s remarkable ability to wrap and grip with precision, this study introduces a vacuum-driven, origami-inspired <a data-linked-post="2650272770" href="https://spectrum.ieee.org/soft-actuators-go-from-squishy-to-stiff-and-back-again" target="_blank">soft actuator</a> that mimics such versatility through self-folding design and high bending angles. Its crease-free, 3D-printable structure enables compact, modular robotics with enhanced grasping force—ideal for handling objects of various shapes and sizes using octopus-like suction synergy.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="79e130248d6b4288ad9a3427e54c957a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ZAHmhGE_f-M?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ieeexplore.ieee.org/document/11079234">Paper</a> ] via [ <a href="https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=8860" target="_blank">IEEE Transactions on Robots</a> ]</p><p>Thanks, Bram!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="k5x8potn0ii">Is it a plane? Is it a helicopter? 
Yes.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3dba77c2eb51d033a9c65a69d9bdbe57" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/K5X8Potn0iI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ris.bme.cityu.edu.hk/">Robotics and Intelligent Systems Laboratory, City University of Hong Kong</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="vhnc3y_ykks">You don’t need wrist rotation as long as you have the right <a data-linked-post="2667175149" href="https://spectrum.ieee.org/flexiv-gecko-gripper" target="_blank">gripper</a>.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="49ea0eeb83b1023f844b0e9fc17428fa" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vHNc3Y_YKks?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.nature.com/articles/s42256-025-01039-1">Nature Machine Intelligence</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="dmpa5mqlqws">ICRA 2026 will be in Vienna next June!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="345e2b957d0777d1ef0e5ae00b9cdf86" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dMPA5MQlQws?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://2026.ieee-icra.org/">ICRA 2026</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="fnv2bwecs9u">Boing, boing, boing!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c036f1c61e9567a273ad3299b53e0847" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Fnv2BweCs9U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ris.bme.cityu.edu.hk/">Robotics and Intelligent Systems Laboratory, City University of Hong Kong</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="muu3bqo9rki"><em>ROBOTERA Unveils L7: Next-Generation Full-Size Bipedal Humanoid Robot with Powerful Mobility and Dexterous Manipulation!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="50323e65d5091db34b02096f030384de" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/muu3Bqo9RkI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.robotera.com/en/">ROBOTERA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="tnryo2uasws"><em>Meet UBTECH New-Gen of Industrial Humanoid Robot—Walker S2 makes multiple industry-leading breakthroughs! 
Walker S2 is the world’s first humanoid robot to achieve 3-minute autonomous battery swapping and 24/7 continuous operation.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4b40cbf9fc7b06c4434cb971868eb5be" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/TNryO2uasws?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ubtrobot.com/en/">UBTECH</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="xfjjoe8to5a"><em>ARMstrong Dex is a human-scale dual-arm hydraulic robot developed by the Korea Atomic Energy Research Institute (KAERI) for disaster response. It can perform vertical pull-ups and manipulate loads over 50 kilograms, demonstrating strength beyond human capabilities. However, disaster environments also require agility and fast, precise movement. This test evaluated ARMstrong Dex’s ability to throw a 500-milliliter water bottle (0.5 kg) into a target container. The experiment assessed high-speed coordination, trajectory control, and endpoint accuracy, which are key attributes for operating in dynamic rescue scenarios.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2793c53ff1951bfe0e34092eac131981" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/XfjjOE8TO5A?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://kaeri.ust.ac.kr/">KAERI</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="mrgeroghkhy">This is not a humanoid robot, it’s a data-acquisition platform.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a360087cbb510c147995340c59151532" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/MRgErOGhKHY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://pndbotics.com/">PNDbotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="r_ioywadb9s">Neat feature on this drone to shift the battery back and forth to compensate for movement of the arm.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ab3ae4bae96a4fa9752ef0dabb021e01" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/R_ioYwADb9s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.mdpi.com/2504-446X/9/8/516">Paper</a> ] via [ <a href="https://www.mdpi.com/journal/drones" target="_blank">Drones journal</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="fsyhcolqery"><em>As residential buildings become taller and more advanced, the demand for seamless and secure in-building delivery continues to grow. 
In high-end apartments and modern senior living facilities where couriers cannot access upper floors, robots like FlashBot Max are becoming essential. In this featured elderly care residence, FlashBot Max completes 80 to 100 deliveries daily, seamlessly navigating elevators, notifying residents upon arrival, and returning to its charging station after each delivery.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a72d2a87a0d94e8788022c075095da92" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/fsyhcoLqeRY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pudurobotics.com/en">Pudu Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="r3u9kidbam4">“How to Shake Trees With Aerial Manipulators.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="529cb4aabb607a6273e945f8678d9595" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/R3U9KidBAM4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://grvc.us.es/">GRVC</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="o72wi-txvlk"><em>We see a future where seeing a cobot in a hospital delivering supplies feels as normal as seeing a tractor in a field. Watch our CEO Brad Porter share what robots moving in the world should feel like.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b1600247076aff89e9020d7fbbc6ac8c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/o72wi-txvlk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.co.bot/">Cobot</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="jdps5thwy8u"><em>Introducing the Engineered Arts UI for robot Roles, it’s now simple to set up a robot to behave exactly the way you want it to. We give a quick overview of customization for languages, personality, knowledge, and abilities. All of this is done with no code. Just simple LLM prompts, drop-down list selections and some switches to enable the features you need.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4cd998525df324cf4923429a901818b7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/jDPs5thwY8U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://engineeredarts.com/">Engineered Arts</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="8s9tjrz01fo"><em>Unlike most quadrupeds, CARA doesn’t use any gears or pulleys. Instead, her joints are driven by rope through capstan drives. Capstan drives offer several advantages: zero backlash, high torque transparency, low inertia, low cost, and quiet operation. 
These qualities make them an ideal speed reducer for robotics.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6dcc8b380ec3dcc15b9096a812d35d56" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8s9TjRz01fo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.aaedmusa.com/projects/cara">CARA</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 25 Jul 2025 16:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-skyfall-mars-helicopter</guid><category>Video friday</category><category>Robotics</category><category>Mars helicopter</category><category>Drones</category><category>Humanoid robots</category><category>Quadruped robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/artist-s-concept-of-a-drone-deployment-system-on-mars-6-propellers-connected-by-latticed-scaffolding-and-a-protective-shell-abo.png?id=61322340&amp;width=980"></media:content></item><item><title>DeepMind’s Quest for Self-Improving Table Tennis Agents</title><link>https://spectrum.ieee.org/deepmind-table-tennis-robots</link><description><![CDATA[
  34. <img src="https://spectrum.ieee.org/media-library/robots-playing-ping-pong-on-an-automated-table-in-a-tech-lab-setting.gif?id=61214827&width=1245&height=700&coordinates=0%2C0%2C0%2C0"/><br/><br/><p>Hardly a day goes by without impressive new robotic platforms emerging from academic labs and commercial startups worldwide. <a href="https://spectrum.ieee.org/humanoid-robots" target="_blank">Humanoid robots</a> in particular look increasingly capable of assisting us in factories and eventually in homes and hospitals. Yet, for these machines to be truly useful, they need sophisticated “brains” to control their robotic bodies. Traditionally, programming robots involves experts spending countless hours meticulously scripting complex behaviors and exhaustively tuning parameters, such as controller gains or motion-planning weights, to achieve desired performance. While machine learning (ML) techniques have promise, robots that need to learn new complex behaviors still require substantial human oversight and reengineering. At <a href="https://deepmind.google/" target="_blank">Google DeepMind</a>, we asked ourselves: How do we enable robots to learn and adapt more holistically and continuously, reducing the bottleneck of expert intervention for every significant improvement or new skill?</p><p>This question has been a driving force behind our robotics research. We are exploring paradigms where two robotic agents playing against each other can achieve a greater degree of autonomous self-improvement, moving beyond systems that are merely preprogrammed with fixed or narrowly adaptive ML models toward agents that can learn a broad range of skills on the job. Building on our previous work in ML with systems like <a href="https://spectrum.ieee.org/why-alphago-is-not-ai" target="_blank">AlphaGo</a> and <a href="https://spectrum.ieee.org/alphafold-proves-that-ai-can-crack-fundamental-scientific-problems" target="_blank">AlphaFold</a>, we turned our attention to the demanding sport of <a href="https://sites.google.com/view/competitive-robot-table-tennis/home" rel="noopener noreferrer" target="_blank">table tennis as a testbed</a>.</p><p>We chose table tennis precisely because it encapsulates many of the hardest challenges in robotics within a constrained, yet highly dynamic, environment. Table tennis requires a robot to master a confluence of difficult skills: Beyond just perception, it demands exceptionally precise control to intercept the ball at the correct angle and velocity and involves strategic decision-making to outmaneuver an opponent. These elements make it an ideal domain for developing and evaluating robust learning algorithms that can handle real-time interaction, complex physics, high-level reasoning and the need for adaptive strategies<span>—</span>capabilities that are directly transferable to applications like manufacturing and even potentially unstructured home settings.</p><h3>The Self-Improvement Challenge</h3><p>Standard machine learning approaches often fall short when it comes to enabling continuous, autonomous learning. Imitation learning, where a robot learns by mimicking an expert, typically requires us to provide vast numbers of human demonstrations for every skill or variation; this reliance on expert data collection becomes a significant bottleneck if we want the robot to continually learn new tasks or refine its performance over time. 
Similarly, reinforcement learning, which trains agents through trial-and-error guided by rewards or punishments, often necessitates that human designers meticulously engineer complex mathematical reward functions to precisely capture desired behaviors for multifaceted tasks, and then adapt them as the robot needs to improve or learn new skills, limiting scalability. In essence, both of these well-established methods traditionally involve substantial human involvement, especially if the goal is for the robot to continually self-improve beyond its initial programming. Therefore, we posed a direct challenge to our team: Can robots learn and enhance their skills with minimal or no human intervention during the learning-and-improvement loop?</p><h3>Learning Through Competition: Robot vs. Robot</h3><p><span>One innovative approach we explored mirrors the strategy used for AlphaGo: Have agents learn by competing against themselves. We experimented with having two robot arms play table tennis against each other, an</span><span> idea that is simple yet powerful. As one robot discovers a better strategy, its opponent is forced to adapt and improve, creating a cycle of escalating skill levels.</span></p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="a2d38ab45aa562a36fb1853a741828b6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/b9_OytzkWv8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">                DeepMind        </small> </p><p>To enable the extensive training needed for these paradigms, we engineered a fully autonomous table-tennis environment. This setup allowed for continuous operation, featuring automated ball collection as well as remote monitoring and control, allowing us to run experiments for extended periods without direct involvement. As a first step, we successfully trained a robot agent (replicated on both the robots independently) using reinforcement learning in simulation to play cooperative rallies. We fine-tuned the agent for a few hours in the real-world robot-versus-robot setup, resulting in a policy capable of holding long rallies. We then switched to tackling the competitive robot-versus-robot play.</p><p>Out of the box, the cooperative agent didn’t work well in competitive play. This was expected, because in cooperative play, rallies would settle into a narrow zone, limiting the distribution of balls the agent can hit back. Our hypothesis was that if we continued training with competitive play, this distribution would slowly expand as we rewarded each robot for beating its opponent. While promising, training systems through competitive self-play in the real world presented significant hurdles. The increase in distribution turned out to be rather drastic given the constraints of the limited model size. 
Essentially, it was hard for the model to learn to deal with the new shots effectively without forgetting old shots, and we quickly hit a local minimum in training where, after a short rally, one robot would hit an easy winner, and the second robot was not able to return it.</p><p>While robot-on-robot competitive play has remained a tough nut to crack, our team also investigated <a href="https://arxiv.org/abs/2408.03906" target="_blank">how the robot could play against humans competitively</a>. In the early stages of training, humans did a better job of keeping the ball in play, thus increasing the distribution of shots that the robot could learn from. We still had to develop a policy architecture consisting of low-level controllers with their detailed skill descriptors and a high-level controller that chooses the low-level skills, along with techniques for enabling a zero-shot sim-to-real approach to allow our system to adapt to unseen opponents in real time. In a user study, while the robot lost all of its matches against the most advanced players, it won all of its matches against beginners and about half of its matches against intermediate players, demonstrating solidly amateur human-level performance. Equipped with these innovations, plus a better starting point than cooperative play, we are in a great position to go back to robot-versus-robot competitive training and continue scaling rapidly.</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="76be4312887661c93d0c360c138a1daa" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/EqQl-JQxToE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">DeepMind</small></p><h3>The AI Coach: VLMs Enter the Game</h3><p>A second intriguing idea we investigated leverages the power of <a href="https://spectrum.ieee.org/gemini-robotics" target="_blank">vision language models (VLMs)</a>, like Gemini. Could a VLM act as a coach, observing a robot player and providing guidance for improvement?</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Profile icon spins, transforming into colorful Gemini logo with sparkling star accent." class="rm-shortcode" data-rm-shortcode-id="b4bb1190d5d7ae71e04e64d200d0d04f" data-rm-shortcode-name="rebelmouse-image" id="975c2" loading="lazy" src="https://spectrum.ieee.org/media-library/profile-icon-spins-transforming-into-colorful-gemini-logo-with-sparkling-star-accent.gif?id=61227959&width=980"/> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">DeepMind</small></p><p>An important insight of this project is that VLMs can be leveraged for <em>explainable</em> robot policy search. Based on this insight, we developed the <a href="https://sites.google.com/asu.edu/sas-llm/" target="_blank">SAS Prompt</a> (summarize, analyze, synthesize), a single prompt that enables iterative learning and adaptation of robot behavior by leveraging the VLM’s ability to retrieve, reason, and optimize to synthesize new behavior. Our approach can be regarded as an early example of a new family of explainable policy-search methods that are entirely implemented within an LLM. Also, there is no reward function—the VLM infers the reward directly from the observations given in the task description. 
The VLM can thus become a coach that constantly analyzes the performance of the student and provides suggestions for how to get better.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="AI robot practicing ping pong with specific ball placements on a blue table." class="rm-shortcode" data-rm-shortcode-id="9fa625e55d5dcd43f19a1da3e758c81e" data-rm-shortcode-name="rebelmouse-image" id="c26ae" loading="lazy" src="https://spectrum.ieee.org/media-library/ai-robot-practicing-ping-pong-with-specific-ball-placements-on-a-blue-table.gif?id=61227979&width=980"/> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">DeepMind</small></p><h3>Toward Truly Learned Robotics: An Optimistic Outlook</h3><p>Moving beyond the limitations of traditional programming and ML techniques is essential for the future of robotics. Methods enabling autonomous self-improvement, like those we are developing, reduce the reliance on painstaking human effort. Our table-tennis projects explore pathways toward robots that can acquire and refine complex skills more autonomously. While significant challenges persist—stabilizing robot-versus-robot learning and scaling VLM-based coaching are formidable tasks—these approaches offer a unique opportunity. We are optimistic that continued research in this direction will lead to more capable, adaptable machines that can learn the diverse skills needed to operate effectively and safely in our unstructured world. The journey is complex, but the potential payoff of truly intelligent and helpful robotic partners makes it worth pursuing.</p><div class="horizontal-rule"></div><p><span><em>The authors express their deepest appreciation to the Google DeepMind Robotics team and in particular David B. D’Ambrosio, Saminda Abeyruwan, Laura Graesser, Atil Iscen, Alex Bewley, and Krista Reymann for their invaluable contributions to the development and refinement of this work.</em></span></p>]]></description><pubDate>Mon, 21 Jul 2025 15:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/deepmind-table-tennis-robots</guid><category>Google deepmind</category><category>Machine learning</category><category>Table tennis</category><category>Robotics</category><dc:creator>Pannag Sanketi</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/robots-playing-ping-pong-on-an-automated-table-in-a-tech-lab-setting.gif?id=61214827&amp;width=980"></media:content></item></channel></rss>
