Congratulations!

[Valid RSS] This is a valid RSS feed.

Recommendations

This feed is valid, but interoperability with the widest range of feed readers could be improved by implementing the following recommendations.
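
One quick way to gauge interoperability in practice is to load the feed with a permissive client library and confirm that the fields most readers display (title, link, publication date) survive a round trip. The sketch below is illustrative only; it assumes the third-party Python package feedparser (installed with "pip install feedparser"), which is not part of this validator.

    import feedparser

    d = feedparser.parse("http://feeds.feedburner.com/IeeeSpectrumRoboticsChannel")

    # feedparser sets d.bozo to a truthy value when the XML is malformed;
    # a feed that validates, like this one, should parse cleanly.
    if d.bozo:
        raise SystemExit(f"feed failed to parse: {d.bozo_exception}")

    print(d.feed.title)  # "IEEE Spectrum"
    for entry in d.entries:
        print(entry.published, "|", entry.title, "|", entry.link)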

Source: http://feeds.feedburner.com/IeeeSpectrumRoboticsChannel

  1. <?xml version="1.0" encoding="utf-8"?>
  2. <rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:media="http://search.yahoo.com/mrss/"><channel><title>IEEE Spectrum</title><link>https://spectrum.ieee.org/</link><description>IEEE Spectrum</description><atom:link href="https://spectrum.ieee.org/feeds/topic/robotics.rss" rel="self"></atom:link><language>en-us</language><lastBuildDate>Tue, 01 Jul 2025 21:23:36 -0000</lastBuildDate><image><url>https://spectrum.ieee.org/media-library/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8yNjg4NDUyMC9vcmlnaW4ucG5nIiwiZXhwaXJlc19hdCI6MTc2MzA3MTQzOX0.SxRBIud_XE2YWQFaIJD9BPB1w-3JsFhiRkJIIe9Yq-g/image.png?width=210</url><link>https://spectrum.ieee.org/</link><title>IEEE Spectrum</title></image><item><title>Robotic Arm “Feels” Using Sound</title><link>https://spectrum.ieee.org/farm-robots-sound-based-sensing</link><description><![CDATA[
  3. <img src="https://spectrum.ieee.org/media-library/a-robotic-arm-making-contact-with-a-leafy-branch-during-a-sound-experiment.jpg?id=61106569&width=1500&height=2000&coordinates=1230%2C0%2C1231%2C0"/><br/><br/><p><em>This article is part of our exclusive </em><a href="https://spectrum.ieee.org/collections/journal-watch/" target="_self"><em>IEEE Journal Watch series</em></a><em> in partnership with <a href="https://spectrum.ieee.org/tag/ieee-xplore" target="_self">IEEE Xplore</a>.</em></p><p>Agricultural robots could help farmers harvest food under tough environmental conditions, especially as temperatures continue to rise. However, creating affordable robotic arms that can gracefully and accurately navigate the thick network of branches and trunks of plants can be challenging. </p><p>In a recent study, <a href="https://ieeexplore.ieee.org/document/11021384" target="_blank">researchers developed a sensing system</a>, called SonicBoom, which allows autonomous robots to use sound to sense the objects it touches. The approach, which can accurately localize or “feel” the objects it encounters with centimeter-level precision, is described in a study published 2 June in <a href="https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=7083369" rel="noopener noreferrer" target="_blank"><em>IEEE Robotics and Automation Letters</em></a>. </p><p>Moonyoung (Mark) Lee is<strong> </strong>a fifth-year Ph.D. student at Carnegie Mellon University’s Robotics Institute who was involved in developing SonicBoom. He notes that many autonomous robots currently rely on a collection of tiny camera-based <a href="https://spectrum.ieee.org/tag/tactile-sensors" target="_blank">tactile sensors</a>. Minicameras beneath a protective gel pack that lines the robot’s surface let the sensors visually estimate the gel’s deformation to gain tactile information. However, this approach isn’t ideal in agricultural settings, when branches are likely to occlude the visual sensors. What’s more, camera-based sensors can be expensive and could be easily damaged in this context. </p><p>Another option is pressure sensors, Lee notes, but these would need to cover much of the surface area of the robot in order to effectively sense when it comes into contact with branches. “Imagine covering the entire robot arm surface with that kind of [sensor]. It would be expensive,” he says.</p><p>Instead, Lee and his colleagues are proposing a completely different approach that relies on sound for sensing. The system involves an array of contact microphones, which detect physical touch as sound signals that propagate through solid materials.</p><h2>How Does SonicBoom Work?</h2><p>When a robotic arm touches a branch, the resulting sound waves travel down the robotic arm until they encounter the array of contact microphones. 
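<!-- A minimal sketch of the localization idea described here, under simplifying assumptions (two colinear contact mics, a known wave speed in the arm material, a hypothetical locate_tap helper); it is not the study's code, which uses a learned model over a larger array. The arrival-time difference of a tap, estimated by cross-correlation, pins down the contact point between the mics.

import numpy as np

def locate_tap(sig_a, sig_b, fs_hz, wave_speed_mps):
    """Offset (m) of a tap from the midpoint between two colinear contact mics."""
    # The lag (in samples) of the cross-correlation peak encodes the
    # arrival-time difference of the tap at the two microphones.
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)
    tdoa_s = lag / fs_hz
    # For a tap between the two mics, the path-length difference equals
    # twice the offset from the midpoint, so offset = speed * tdoa / 2.
    return 0.5 * wave_speed_mps * tdoa_s
-->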
Tiny differences in sound-wave properties (such as signal intensity and phase) across the array of microphones are used to localize where the sound originated, and thus the point of contact.</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="bfd2795046c4d1b7896487f0ca8f37fe" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/4bdHyQtuqrM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-caption" placeholder="Add Photo Caption...">In this video, see SonicBoom in action during laboratory testing. </small> <small class="image-media media-photo-credit" placeholder="Add Photo Credit..."> <a href="https://youtu.be/4bdHyQtuqrM" target="_blank">youtu.be</a> </small> </p><p>Lee notes that this approach allows microphones to be embedded deeper in the robotic arm. This means they are less prone to damage compared to traditional visual sensors on the exterior of a robotic arm. “The contact microphones can be easily protected from very harsh, abrasive contacts,” he explains.</p><p>In addition, the approach uses a small handful of microphones dispersed across the robotic arm, rather than many visual or pressure sensors more densely coating it.</p><p>To help <a href="https://iamlab-cmu.github.io/sonicboom/" target="_blank">SonicBoom</a> better localize points of contact, the researchers used an AI model, trained on data collected by tapping the robotic arm more than 18,000 times with a wooden rod. As a result, SonicBoom was able to localize contact on the robotic arm with an error of just 0.43 centimeters for objects it was trained to detect. It was also able to detect novel objects, for example, ones made of plastic or aluminum, with an error of 2.22 cm. </p><p>In a subsequent study pending publication, Lee and his colleagues are using <a href="https://arxiv.org/abs/2505.12665" target="_blank">new data to train SonicBoom</a> to identify what kind of object it’s encountering<span>—</span>for example, a leaf, branch, or trunk. </p><p>“With SonicBoom, you can blindly tap around and know where the [contact happens], but at the end of the day, for the robot, the really important information is: Can I keep pushing, or am I hitting a strong trunk and should rethink how to move my arm?” he explains. </p><p>Of note, SonicBoom has yet to be tested in real-world agricultural settings, Lee says. </p>]]></description><pubDate>Sat, 28 Jun 2025 13:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/farm-robots-sound-based-sensing</guid><category>Agricultural robots</category><category>Autonomous robots</category><category>Journal watch</category><category>Farming</category><dc:creator>Michelle Hampson</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-robotic-arm-making-contact-with-a-leafy-branch-during-a-sound-experiment.jpg?id=61106569&amp;width=980"></media:content></item><item><title>Video Friday: This Quadruped Throws With Its Whole Body</title><link>https://spectrum.ieee.org/robot-arm-thrower</link><description><![CDATA[
  4. <img src="https://spectrum.ieee.org/media-library/quadruped-robot-with-manipulator-arm-placed-on-pavement-near-a-table-tennis-setup.png?id=61112174&width=1500&height=2000&coordinates=320%2C0%2C321%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://ias-19.org/">IAS 2025</a>: 30 June–4 July 2025, GENOA, ITALY</h5><h5><a href="https://clawar.org/icres2025/">ICRES 2025</a>: 3–4 July 2025, PORTO, PORTUGAL</h5><h5><a href="https://2025.worldhaptics.org/">IEEE World Haptics</a>: 8–11 July 2025, SUWON, SOUTH KOREA</h5><h5><a href="https://ifac2025-msrob.com/">IFAC Symposium on Robotics</a>: 15–18 July 2025, PARIS</h5><h5><a href="https://2025.robocup.org/">RoboCup 2025</a>: 15–21 July 2025, BAHIA, BRAZIL</h5><h5><a href="https://www.ro-man2025.org/">RO-MAN 2025</a>: 25–29 August 2025, EINDHOVEN, THE NETHERLANDS</h5><h5><a href="https://clawar.org/clawar2025/">CLAWAR 2025</a>: 5–7 September 2025, SHENZHEN, CHINA</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="https://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="3ysgbn6ca8a"><em>Throwing is a fundamental skill that enables robots to manipulate objects in ways that extend beyond the reach of their arms. We present a control framework that combines learning and model-based control for prehensile whole-body throwing with legged mobile manipulators. 
This work provides an early demonstration of prehensile throwing with quantified accuracy on hardware, contributing to progress in dynamic whole-body manipulation.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c380e7cc77c2ff7df6924bc3589434b5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/3ysgbN6Ca8A?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2506.16986">Paper</a> ] from [ <a href="https://ethz.ch/en/studies/master/degree-programmes/engineering-sciences/robotics-systems-and-control.html" target="_blank">ETH Zurich</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="wgr7iizzfr0">As it turns out, in many situations <a data-linked-post="2666662286" href="https://spectrum.ieee.org/humanoid-robots" target="_blank">humanoid robots</a> don’t necessarily need legs at all.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="47302ecf2f03d6151a2dfd94f1ce29d5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/WgR7IIzzfR0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.robotera.com/en/enq5">ROBOTERA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="xmbuvofnu5a"><em>Picking-in-Motion is a brand new feature of Autopicker 2.0. Instead of remaining stationary while picking an item, Autopicker begins traveling toward its next destination immediately after retrieving a storage tote, completing the pick while on the move. 
The robot then drops off the first storage tote at an empty slot near the next pick location before collecting the next tote.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8403fb62dce8c4ce1457ec30a8bb67f5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xmbuvoFNu5A?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://brightpick.ai/">Brightpick</a> ]</p><p>Thanks, Gilmarie!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="rwsb78emfgi">I am pretty sure this is not yet real, but boy is it shiny.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c8476f7f896244eb4a23e8481b9be4b7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/rWSb78EmFGI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.softbank.jp/corp/philosophy/technology/special/ntn-solution/haps/">SoftBank</a> ] via [ <a href="https://robotstart.info/2025/06/26/sb-haps-2026.html">RobotStart</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="lf5fw4qunlk">Why use one thumb when you can use two instead?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3059dc10a2edb86d9db46fdcd9e0bde6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/LF5fW4qUnlk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.tu.berlin/en/robotics">TU Berlin</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="3fwefdibpk0"><em>Kirigami offers unique opportunities for guided morphing by leveraging the geometry of the cuts. This work presents inflatable <a data-linked-post="2650276722" href="https://spectrum.ieee.org/artificial-snakeskin-helps-robots-get-their-slither-on" target="_blank">kirigami crawlers</a> created by introducing cut patterns into heat-sealable textiles to achieve locomotion upon cyclic pneumatic actuation. We found that the kirigami actuators exhibit directional anisotropic friction properties when inflated, having higher friction coefficients against the direction of the movement, enabling them to move across surfaces with varying roughness. 
We further enhanced the functionality of inflatable kirigami actuators by introducing multiple channels and segments to create functional soft robotic prototypes with versatile locomotion capabilities.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="772fc720afe02dd29601706a74c39c62" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/3fWEFDibPK0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://advanced.onlinelibrary.wiley.com/doi/10.1002/adrr.202500044">Paper</a> ] from [ <a href="https://www.softrobotics.dk/" target="_blank">SDU Soft Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="1gviu3trrps">Lockheed Martin wants to get into the <a data-linked-post="2652904040" href="https://spectrum.ieee.org/mars-sample-return-mission" target="_blank">Mars Sample Return</a> game for a mere US $3 billion.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d6f03aad050bbbe77173bfc37241875c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/1GViU3tRRps?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://lockheedmartin.com/en-us/news/features/2025/bringing-commercial-industry-efficiency-to-exploration-lockheed-martins-plan-for-mars-sample-return.html">Lockheed Martin</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="9hwyniiw4nm">This is pretty gross and exactly what you want a robot to be doing: dealing with municipal solid waste.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3a9e942998b6b6ad57bfd01a18012ab7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/9HWYNIiW4NM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.terex.com/zenrobotics/waste-types/municipal-solid-waste">ZenRobotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="h5z32e7uakm"><em>Drag your mouse or move your phone to explore this 360-degree panorama provided by <a data-linked-post="2650269754" href="https://spectrum.ieee.org/curiosity-turns-one-on-mars" target="_blank">NASA’s Curiosity Mars rover.</a> This view shows some of the rover’s first looks at a region that has only been viewed from space until now, and where the surface is crisscrossed with spiderweb-like patterns.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1de2285b55a5583bba358034225f6c47" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/H5z32E7uaKM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://science.nasa.gov/mission/msl-curiosity/">NASA Jet Propulsion Laboratory</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="hpvza8rfiks">In case you were wondering, <a 
data-linked-post="2667116159" href="https://spectrum.ieee.org/irobot-amazon" target="_blank">iRobot</a> is still around.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="94f11994637afc79d347441f03a021f7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/hPVZA8rfiKs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.irobot.com/">iRobot</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="shmgjzkqzdm">Legendary roboticist <a data-linked-post="2650271281" href="https://spectrum.ieee.org/cynthia-breazeal-unveils-jibo-a-social-robot-for-the-home" target="_blank">Cynthia Breazeal</a> talks about the equally legendary Personal Robots Group at the MIT Media Lab.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="96578b094f03b3ec1f7d6396f02f9a1c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/shMGJZkQzDM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.media.mit.edu/groups/personal-robots/overview/">MIT Personal Robots Group</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="kj0mp74v4kq"><em>In the first installment of our Moonshot Podcast Deep Dive video interview series, X’s Captain of Moonshots <a data-linked-post="2650275007" href="https://spectrum.ieee.org/astro-teller-captain-of-moonshots-at-x" target="_blank">Astro Teller</a> sits down with <a data-linked-post="2650257886" href="https://spectrum.ieee.org/sebastian-thrun-will-teach-you-how-to-build-your-own-self-driving-car-for-free" target="_blank">Sebastian Thrun</a>, cofounder of the Moonshot Factory, for a conversation about the history of Waymo and Google X, the ethics of innovation, the future of AI, and more.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="54cee25119337253957257dc6f4bd176" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/kj0mp74V4kQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://x.company/">Google X, The Moonshot Factory</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 27 Jun 2025 16:30:03 +0000</pubDate><guid>https://spectrum.ieee.org/robot-arm-thrower</guid><category>Robotics</category><category>Video friday</category><category>Manipulators</category><category>Crawler</category><category>Industrial robots</category><category>Humanoid robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/quadruped-robot-with-manipulator-arm-placed-on-pavement-near-a-table-tennis-setup.png?id=61112174&amp;width=980"></media:content></item><item><title>Video Friday: Jet-Powered Humanoid Robot Lifts Off</title><link>https://spectrum.ieee.org/video-friday-jet-powered-robot</link><description><![CDATA[
  5. <img src="https://spectrum.ieee.org/media-library/close-up-of-a-humanoid-robot-showing-intricate-mechanical-components-and-wiring-with-small-jet-engines-on-its-arms-and-torso.jpg?id=61079219&width=1500&height=2000&coordinates=262%2C0%2C263%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://roboticsconference.org/">RSS 2025</a>: 21–25 June 2025, LOS ANGELES</h5><h5><a href="https://robotx.ethz.ch/education/summer-school.html">ETH Robotics Summer School</a>: 21–27 June 2025, GENEVA</h5><h5><a href="https://ias-19.org/">IAS 2025</a>: 30 June–4 July 2025, GENOA, ITALY</h5><h5><a href="https://clawar.org/icres2025/">ICRES 2025</a>: 3–4 July 2025, PORTO, PORTUGAL</h5><h5><a href="https://2025.worldhaptics.org/">IEEE World Haptics</a>: 8–11 July 2025, SUWON, SOUTH KOREA</h5><h5><a href="https://ifac2025-msrob.com/">IFAC Symposium on Robotics</a>: 15–18 July 2025, PARIS</h5><h5><a href="https://2025.robocup.org/">RoboCup 2025</a>: 15–21 July 2025, BAHIA, BRAZIL</h5><h5><a href="https://www.ro-man2025.org/">RO-MAN 2025</a>: 25–29 August 2025, EINDHOVEN, THE NETHERLANDS</h5><h5><a href="https://clawar.org/clawar2025/">CLAWAR 2025</a>: 5–7 September 2025, SHENZHEN</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="https://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="t1bnhot4d5q"><em>This is the first successful vertical takeoff of a jet-powered flying humanoid robot, developed by Artificial and Mechanical Intelligence (AMI) at Istituto Italiano di Tecnologia (IIT). The robot lifted ~50 cm off the ground while maintaining dynamic stability, thanks to advanced AI-based control systems and aerodynamic modeling.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2666111cae7290d9efdf2108af79560f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/t1bNHoT4D5Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>We will have much more on this in the coming weeks!</p><p>[<a href="https://www.nature.com/articles/s44172-025-00447-w">Nature</a>] via [<a href="https://opentalk.iit.it/en/iit-demonstrates-that-a-humanoid-robot-can-fly/">IIT</a>]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="iwcnynpjnm0"><em>As a first step toward our mission of deploying general-purpose robots, we are pushing the frontiers of what end-to-end AI models can achieve in the real world. 
We’ve been training models and evaluating their capabilities for dexterous sensorimotor policies across different embodiments, environments, and physical interactions. We’re sharing capability demonstrations on tasks stressing different aspects of manipulation: fine motor control, spatial and temporal precision, generalization across robots and settings, and robustness to external disturbances.</em></blockquote><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="3fbcd7173da8b61dcd8567ca2932e4fd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/mhfleCK_IAI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://generalistai.com/blog.html">Generalist AI</a>]</p><p>Thanks, Noah!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="iwcnynpjnm0"><em>Ground Control Robotics is introducing SCUTTLE, our newest elongate multilegged platform for mobility anywhere!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="da081e0bf9207ff62b17fbdc33a083c9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/IWcNyNPjnM0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://groundcontrolrobotics.com/">Ground Control Robotics</a>]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="udkddqxth5q"><em>Teleoperation has been around for a while, but what hasn’t been is precise, real-time force feedback. That’s where Flexiv steps in to shake things up. Now, whether you’re across the room or across the globe, you can experience seamless, high-fidelity remote manipulation with a sense of touch.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ec233d3290c27dd0cea84b0f17763f3e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/udkddqxth5Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>This sort of thing usually takes some human training, for which you’d be best served by <a data-linked-post="2657676851" href="https://spectrum.ieee.org/video-friday-iss-robot-arms" target="_blank">robot arms</a> with  <a data-linked-post="2650273235" href="https://spectrum.ieee.org/esa-space-teleoperation-tests" target="_blank">precise, real-time force feedback</a>. Hmm, I wonder where you’d find those...?</p><p>[<a href="https://www.flexiv.com/">Flexiv</a>]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="xpx6ddrybv4"><em>The 1X World Model is a data-driven simulator for humanoid robots built with a grounded understanding of physics. It allows us to predict—or “hallucinate”—the outcomes of NEO’s actions before they’re taken in the real world. 
Using the 1X World Model, we can instantly assess the performance of AI models—compressing development time and providing a clear benchmark for continuous improvement.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bbf4dbf833d6ae5f2f31d30e3867918e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xPX6dDRYbV4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://www.1x.tech/">1X</a>]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="atr87nwq3eq"><em>SLAPBOT is an interactive robotic artwork by Hooman Samani and Chandler Cheng, exploring the dynamics of physical interaction, artificial agency, and power. The installation features a robotic arm fitted with a soft, inflatable hand that delivers slaps through pneumatic actuation, transforming a visceral human gesture into a programmed robotic response.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="98480c72f39d07a271776fa0143a9b44" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ATR87nwq3eQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>I asked, of course, whether SLAPBOT slaps people, and it does not: “Despite its provocative concept and evocative design, SLAPBOT does not make physical contact with human participants. It simulates the gesture of slapping without delivering an actual strike. The robotic arm’s movements are precisely choreographed to suggest the act, yet it maintains a safe distance.”</p><p>[<a href="https://hoomansamani.com/slapbot/">SLAPBOT</a>]</p><p>Thanks, Hooman!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="xht3nvc9d-i">Inspecting the bowels of ships is something we’d really like robots to be doing for us, please and thank you.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f8e59dc2b2d1ebc58f4a8fa063562914" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/XhT3nVC9d-I?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://www.ntnu.edu/itk/research/robotics" target="_blank">Norwegian University of Science and Technology</a>] via [<a href="https://github.com/ntnu-arl/predictive_planning_ros">GitHub</a>]</p><p>Thanks, Kostas!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="8a46uap367k"><em>H2L Corporation (hereinafter referred to as H2L) has unveiled a new product called “Capsule Interface,” which transmits whole-body movements and strength, enabling new shared experiences with robots and avatars. 
A product introduction video depicting a synchronization never before experienced by humans was also released.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cd7cebba8a246b0d342bfa262937b917" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8a46Uap367k?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://h2l.jp/2025/06/18/%e5%85%a8%e8%ba%ab%e3%83%aa%e3%82%a2%e3%83%ab%e4%bd%93%e9%a8%93%ef%bc%81%e8%a6%8b%e3%82%8b%e8%81%9e%e3%81%8f%e3%81%ae%e5%85%88%e3%82%92%e5%89%b5%e3%82%8b%e3%80%82%e5%8b%95%e3%81%8d%e3%81%a8%e5%8a%9b/">H2L Corp.</a>] via [<a href="https://robotstart.info/2025/06/18/h2l-capsule-interface-launch.html">RobotStart</a>]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="vbugenau3re">How do you keep a robot safe without requiring it to look at you? <a data-linked-post="2668807636" href="https://spectrum.ieee.org/feral-cat-radar-detector" target="_blank">Radar</a>!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6b6cefa1aff74634e83e5435745c35bc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vbuGenAu3rE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://ieeexplore.ieee.org/document/11037369">Paper</a>] via [<a href="https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=7361" target="_blank">IEEE Sensors Journal</a>]</p><p>Thanks, Bram!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="pcank-5e-qo"><em>We propose Aerial Elephant Trunk, an aerial continuum manipulator inspired by the elephant trunk, featuring a small-scale quadrotor and a dexterous, compliant tendon-driven continuum arm for versatile operation in both indoor and outdoor settings.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4a2829d05541e793bcf454fe8f2c394d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/PcanK-5e-qo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://arclab.hku.hk/">Adaptive Robotics Controls Lab</a>]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="w_qloi1pokw"><em>This video demonstrates a heavy weight lifting test using the ARMstrong Dex robot, focusing on a 40 kg bicep curl motion. ARMstrong Dex is a human-sized, dual-arm hydraulic robot currently under development at the Korea Atomic Energy Research Institute (KAERI) for disaster response applications. 
Designed to perform tasks flexibly like a human while delivering high power output, ARMstrong Dex is capable of handling complex operations in hazardous environments.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ed3765ff0740c1a013e9beeaae04aa6a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/W_QlOi1PoKw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://www.kaeri.re.kr/eng/">Korea Atomic Energy Research Institute</a>]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="lfat803dqmw"><em>Micro-robots that can inspect water pipes, diagnose cracks, and fix them autonomously—reducing leaks and avoiding expensive excavation work—have been developed by a team of engineers led by the University of Sheffield. </em></blockquote><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="204230bab58cc85ff8909e3f3029be79" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Q2loVe5_NcE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://www.sheffield.ac.uk/news/tiny-robots-could-help-fix-leaky-water-pipes">University of Sheffield</a>]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="lfat803dqmw"><em>We’re growing in size, scale, and impact! We’re excited to announce the opening of our serial production facility in the San Francisco Bay Area, the very first purpose-built <a data-linked-post="2650275419" href="https://spectrum.ieee.org/secretive-robotaxi-startup-zoox-prepares-for-realworld-testing" target="_blank">robotaxi</a> assembly facility in the United States. 
More space means more innovation, production, and opportunities to scale our fleet.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="504fa629f2368f24a9acf389d3e12a66" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/lfAt803DQMw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://zoox.com/">Zoox</a>]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="8-0d4lyjhqi"><em>Watch multipick in action as our pickle robot rapidly identifies, picks, and places multiple boxes in a single swing of an arm.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="36412a3fdb34a606e69841ccca3c2110" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8-0d4LyJhQI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://picklerobot.com/">Pickle</a>]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="un_7inyazyq">And now, this.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ec349bce0e0fb055b052b7831af4f49b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/uN_7INYaZYQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://info.aibo.sony.jp/info/2024/12/creatorschallenge2025.html">Aibo</a>]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="aqrqusrezhy"><em>Cargill’s Amsterdam Multiseed facility enlists Spot and Orbit to inspect machinery and perform visual checks, enhanced by all-new AI features, as part of their “Plant of the Future” program. </em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b7c681340545a8e6334c9f64aa9dadd4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/AqRquSReZHY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://bostondynamics.com/products/spot/">Boston Dynamics</a>]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="vmsslbgktcu">This ICRA 2025 plenary talk is from Raffaello D’Andrea, entitled “Models are Dead, Long Live Models!”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d1600ac97242ef165c3fc6d7e438f123" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vMSSlBGKtCU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://2025.ieee-icra.org/program/plenary-sessions/">ICRA 2025</a>]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="pfvctjompk8">Will data solve robotics and automation? Absolutely! Never! Who knows?! 
Let’s argue about it!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2e5008c9444113b6baa43e3fdeee54ed" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/PfvctjoMPk8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://2025.ieee-icra.org/announcements/event-overview-for-thursday-may-22/">ICRA 2025</a>]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 20 Jun 2025 16:30:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-jet-powered-robot</guid><category>Video friday</category><category>Robotics</category><category>Humanoid robots</category><category>Industrial robots</category><category>Aibo</category><category>Dexterous</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/close-up-of-a-humanoid-robot-showing-intricate-mechanical-components-and-wiring-with-small-jet-engines-on-its-arms-and-torso.jpg?id=61079219&amp;width=980"></media:content></item><item><title>Video Friday: AI Model Gives Neo Robot Autonomy</title><link>https://spectrum.ieee.org/video-friday-neo-humanoid-robot</link><description><![CDATA[
  6. <img src="https://spectrum.ieee.org/media-library/robot-and-person-standing-face-to-face-in-a-wooden-hallway-with-tall-bushy-plants.png?id=60988606&width=1500&height=2000&coordinates=555%2C0%2C555%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://www.edrcoalition.com/2025-energy-drone-robotics-summit">2025 Energy Drone & Robotics Summit</a>: 16–18 June 2025, HOUSTON</h5><h5><a href="https://roboticsconference.org/">RSS 2025</a>: 21–25 June 2025, LOS ANGELES</h5><h5><a href="https://robotx.ethz.ch/education/summer-school.html">ETH Robotics Summer School</a>: 21–27 June 2025, GENEVA</h5><h5><a href="https://ias-19.org/">IAS 2025</a>: 30 June–4 July 2025, GENOA, ITALY</h5><h5><a href="https://clawar.org/icres2025/">ICRES 2025</a>: 3–4 July 2025, PORTO, PORTUGAL</h5><h5><a href="https://2025.worldhaptics.org/">IEEE World Haptics</a>: 8–11 July 2025, SUWON, SOUTH KOREA</h5><h5><a href="https://ifac2025-msrob.com/">IFAC Symposium on Robotics</a>: 15–18 July 2025, PARIS</h5><h5><a href="https://2025.robocup.org/">RoboCup 2025</a>: 15–21 July 2025, BAHIA, BRAZIL</h5><h5><a href="https://www.ro-man2025.org/">RO-MAN 2025</a>: 25–29 August 2025, EINDHOVEN, THE NETHERLANDS</h5><h5><a href="https://clawar.org/clawar2025/">CLAWAR 2025</a>: 5–7 September 2025, SHENZHEN, CHINA</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="https://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="qnzl5dvtdkk">Introducing Redwood—1X’s breakthrough <a data-linked-post="2671886754" href="https://spectrum.ieee.org/chain-of-thought-prompting" target="_blank">AI model</a> capable of doing chores around the home. For the first time, <a data-linked-post="2671238743" href="https://spectrum.ieee.org/video-friday-good-over-all-terrains" target="_blank">NEO Gamma</a> moves, understands, and interacts autonomously in complex human environments. 
Built to learn from real-world experiences, Redwood empowers NEO to perform end-to-end mobile manipulation tasks like retrieving objects for users, opening doors, and navigating around the home gracefully, on top of hardware designed for compliance, safety, and resilience.</blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ebbfc0b339e850cd28b7e5f5fac9c43c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/qnzL5dVTDKk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="a2f3f4b5f15fde9ec50d5116b96b764a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Dp6sqx9BGZs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-caption" placeholder="Add Photo Caption...">- YouTube</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://www.youtube.com/watch?v=Dp6sqx9BGZs" target="_blank">www.youtube.com</a></small></p><p>[ <a href="https://www.1x.tech/discover/redwood-ai">1X Technology</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="7gap8k9jz2q"><a data-linked-post="2650251927" href="https://spectrum.ieee.org/therapeutic-robots-paro-and-keepon-are-cute-but-still-costly" target="_blank">Marek Michalowski</a>, who co-created <a data-linked-post="2650276785" href="https://spectrum.ieee.org/keepon-helps-kids-learn-to-argue-better" target="_blank">Keepon</a>, has not posted to his YouTube channel in 17 years—until this week. The new post? It’s about a project from 10 years ago!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f79c76127d0315323179d7d47ac57117" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7gAp8k9jZ2Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://jonathanproto.com/project-sundial">Project Sundial</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="lkc2y0yb89u"><em>Helix can now handle a wider variety of packaging, approaching human-level dexterity and speed, bringing us closer to fully autonomous package sorting. 
This rapid progress underscores the scalability of Helix’s learning-based approach to robotics, translating quickly into real-world applications.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d3d0673b70a96eaa2d77f39e3bf16d02" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/lkc2y0yb89U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.figure.ai/news/helix">Figure</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="nzuvdu2q0zo">This is certainly an atypical Video Friday selection, but I saw this Broadway musical called “Maybe Happy Ending” a few months ago because the main characters are deprecated humanoid home-service robots. It was utterly charming, and it just won the Tony Award for Best New Musical, among other awards.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e25d5605adc019d20ed170c527ed4700" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/nZUVDu2q0Zo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ “<a href="https://www.maybehappyending.com/">Maybe Happy Ending</a>” ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ptydwp9utis"><a data-linked-post="2664552687" href="https://spectrum.ieee.org/boston-dynamics-dancing-robots" target="_blank">Boston Dynamics</a> brought a bunch of Spots to “America’s Got Talent,” and kudos to them for recovering so gracefully from an on-stage failure.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7ec3c1744aa63eeb5904fa3ee0779adc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ptYDWP9uTis?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/products/spot/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="41xnc4mu-hs">I think this is the first time I’ve seen end-effector changers used for either feet or heads.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1a29c3d692365d22f63dade5335a6c2f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/41XNc4Mu-hs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://unit.aist.go.jp/jrl-22022/en/">CNRS-AIST Joint Robotics Laboratory</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="tbdtcwzfeiu"><em>ChatGPT has gone fully Navrim—complete with existential dread and maximum gloom! Watch as the most pessimistic ChatGPT-powered robot yet moves chess pieces across a physical board, deeply contemplating both chess strategy and the futility of existence. 
Experience firsthand how seamlessly AI blends with robotics, even if Navrim insists there’s absolutely no point.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="08d839d8c7f07846798dcb0748af4c05" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/TbDTCwzFeIU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Not bad for $219 all in.</p><p>[ <a href="https://vassarrobotics.com/">Vassar Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="9u0hocl0aj4"><em>We present a single-layer multimodal sensory skin made using only a highly sensitive hydrogel membrane. Using electrical impedance tomography techniques, we access up to 863,040 conductive pathways across the membrane, allowing us to identify at least six distinct types of multimodal stimuli, including human touch, damage, multipoint insulated presses, and local heating. To demonstrate our approach’s versatility, we cast the hydrogel into the shape and size of an adult human hand.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="30c9c1e376a765bd37eb45ea36471e1c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/9U0hoCL0aJ4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dshardman.co.uk/publication/scirohand/">Bio-Inspired Robotics Laboratory</a> ] paper published by [ <a href="https://www.science.org/journal/scirobotics" target="_blank">Science Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="pqdqamtjwrw"><em>This paper introduces a novel robot designed to exhibit two distinct modes of mobility: rotational aerial flight and terrestrial locomotion. This versatile robot comprises a sturdy external frame, two motors, and a single wing embodying its fuselage. The robot is capable of vertical takeoff and landing in mono-wing flight mode, with the unique ability to fly in both clockwise and counterclockwise directions, setting it apart from traditional mono-wings.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="fea3e49bf322eec7899d20a2236783a4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/PQDqAMTjWrw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://journals.sagepub.com/doi/10.1177/02783649251344968">AIR Lab</a> paper ] published in [ <a href="https://journals.sagepub.com/home/ijra" target="_blank">The International Journal of Robotics Research</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="hzq2hhoi6la">When TRON 1 goes to work, all he does is steal snacks from hoomans. 
Apparently.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8d3121ce94715e6d531c92002d4575b7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/hZQ2hhoi6lA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.limxdynamics.com/en/tron1">LimX Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="jb2txzph7xs"><em>The 100,000th robot has just rolled off the line at Pudu Robotics’ Super Factory! This key milestone highlights our cutting-edge manufacturing strength and marks a global shipment volume of over 100,000 units delivered worldwide.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5bc0b47f42e246bf950808b9fc53d28e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Jb2tXzph7Xs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pudurobotics.com/en">Pudu Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="cn5whdtrlv0">Now that is a big saw.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e4d5683d9d16f4c931c72693ecd4b188" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/CN5WhDTRlV0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.kuka.com/en-se/industries/solutions-database/2025/05/catonator_smartproduction-nordic">Kuka Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="2rih4zintzm"><em>NASA Jet Propulsion Laboratory has developed the Exploration Rover for Navigating Extreme Sloped Terrain, or ERNEST. 
This rover could lead to a new class of low-cost planetary rovers for exploration of previously inaccessible locations on Mars and the moon.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ccaf709dda32b8b5778c7a0f178c877e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2RiH4ZInTZM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.hou.usra.edu/meetings/lpsc2025/pdf/1729.pdf">NASA Jet Propulsion Laboratory</a> paper ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="zobe3aoz5fw"><em>Brett Adcock, founder and CEO of Figure AI, speaks with Bloomberg Television’s Ed Ludlow about how the company is training humanoid robots for logistics, manufacturing, and future roles in the home at Bloomberg Tech in San Francisco.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2467bd8c8c1fc278a40446b7ee5655e9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zObe3aOz5fw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.figure.ai/">Figure</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="b6pz2r1uhxw"><em>Peggy Johnson, CEO of Agility Robotics, discusses how humanoid robots like Digit are transforming logistics and manufacturing. She speaks with Bloomberg Businessweek’s Brad Stone about the rapid advances in automation and the next era of robots in the workplace at Bloomberg Tech in San Francisco.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d7a77a584e471d9a057897f5a2a150d7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/B6pz2R1UHXw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.agilityrobotics.com/">Agility Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="piyyufkvg1e">This ICRA 2025 Plenary is from Allison Okamura, titled “Rewired: The Interplay of Robots and Society.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c721b1931ac1615ffee7f918db1f8861" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/PIYyufKvG1E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://2025.ieee-icra.org/program/plenary-sessions/">ICRA 2025</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 13 Jun 2025 16:30:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-neo-humanoid-robot</guid><category>Video friday</category><category>Humanoid robots</category><category>Autonomous robots</category><category>Dexterity</category><category>Dancing robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" 
url="https://spectrum.ieee.org/media-library/robot-and-person-standing-face-to-face-in-a-wooden-hallway-with-tall-bushy-plants.png?id=60988606&amp;width=980"></media:content></item><item><title>Navigating the Dual-Use Dilemma</title><link>https://spectrum.ieee.org/navigating-the-dual-use-dilemma</link><description><![CDATA[
  7. <img src="https://spectrum.ieee.org/media-library/robotic-arm-holding-a-scalpel-merging-into-a-digital-blueprint-on-a-black-and-white-background.png?id=60656210&width=1500&height=2000&coordinates=569%2C0%2C570%2C0"/><br/><br/><p>Open-source technology developed in the civilian sector can also be used in military applications, or simply misused. Navigating this <a href="https://link.springer.com/article/10.1007/s11948-009-9159-9" rel="noopener noreferrer" target="_blank">dual-use</a> potential is becoming more important across engineering fields, as innovation goes both ways. While the “openness” of open-source technology is part of what drives innovation and allows everyone access, it also, unfortunately, means it’s just as easily accessible to others, including the military and criminals.</p><p>What happens when a rogue state, a nonstate militia, or a school shooter displays the same creativity and innovation with open-source technology that engineers do? This is the question we are discussing here: How can we uphold our principles of open research and innovation to drive progress while mitigating the inherent risks that come with accessible technology?</p><p>More than just open-ended risk, let’s discuss the specific challenges that open-source technology and its dual-use potential pose for robotics. Understanding these challenges can help engineers learn what to look for in their own disciplines.</p><h2>The Power and Peril of Openness</h2><p>Open-access publications, software, and educational content are fundamental to advancing robotics. They have democratized access to knowledge, enabled reproducibility, and fostered a vibrant, collaborative international community of scientists. Platforms like arXiv and GitHub and open-source initiatives like the <a href="https://www.ros.org/" rel="noopener noreferrer" target="_blank">Robot Operating System</a> (ROS) and the <a href="https://github.com/open-dynamic-robot-initiative/" rel="noopener noreferrer" target="_blank">Open Dynamic Robot Initiative</a> have been pivotal in accelerating robotics research and innovation, and there is no doubt that they should remain openly accessible. Losing access to these resources would be devastating to the robotics field.</p><p>However, robotics carries inherent dual-use risks since most robotics technology can be repurposed <a href="https://spectrum.ieee.org/autonomous-weapons-challenges" target="_blank">for military use</a> or <a href="https://spectrum.ieee.org/why-you-should-fear-slaughterbots-a-response" target="_blank">harmful purposes</a>. One recent example, the use of custom-made drones in current conflicts, is particularly instructive. The resourcefulness displayed by Ukrainian soldiers in repurposing and sometimes <a href="https://www.cnas.org/publications/reports/evolution-not-revolution" rel="noopener noreferrer" target="_blank">augmenting civilian drone technology</a> received worldwide, often admiring, news coverage. Their creativity has been made possible through the affordability of commercial drones, spare parts, and 3D printers, and the availability of open-source software and hardware. This allows people with little money or technological background to easily create, control, and repurpose robots for military applications. One can certainly argue that this has had an empowering effect on Ukrainians defending their country. 
However, these same conditions also present opportunities for a wide range of potential bad actors.</p><p>Openly available knowledge, designs, and software can be misused to enhance existing weapons systems with capabilities like vision-based <a href="https://onlinelibrary.wiley.com/doi/full/10.1111/1758-5899.12663" rel="noopener noreferrer" target="_blank">navigation, autonomous targeting, or swarming</a>. Additionally, unless proper security measures are taken, the public nature of open-source code makes it vulnerable to cyberattacks, potentially allowing malicious actors to gain control of robotic systems and cause them to malfunction or be used for <a href="https://www.sciencedirect.com/science/article/pii/S2667305323000625" rel="noopener noreferrer" target="_blank">malevolent purposes</a>. Many ROS users already recognize that they do not invest enough in <a href="https://aliasrobotics.com/files/robot_cybersecurity_review.pdf" rel="noopener noreferrer" target="_blank">cybersecurity</a> for their applications.</p><h2>Guidance Is Necessary</h2><p>Dual-use risks stemming from openness in research and innovation are a concern for many engineering fields. Did you know that engineering was originally a military-only activity? The word “engineer” was coined in the Middle Ages to describe “a designer and constructor of fortifications and weapons.” Some engineering specializations, especially those that include the development of weapons of mass destruction (chemical, biological, radiological, and nuclear), have developed clear guidance, and in some cases, regulations for how research and innovation can be conducted and disseminated. They also have community-driven processes intended to mitigate dual-use risks associated with spreading knowledge. For instance, BioRxiv and MedRxiv—the preprint servers for biology and health sciences—screen submissions for material that poses a biosecurity or health risk before publishing them.</p><p>The field of robotics, in comparison, offers no specific regulation and little guidance as to how roboticists should think about and address the risks associated with openness. Dual-use risk is not taught at most universities, even though students will likely face it in their careers, such as when assessing whether their work is subject to <a href="https://www.sipri.org/publications/2020/policy-reports/responsible-artificial-intelligence-research-and-innovation-international-peace-and-security" rel="noopener noreferrer" target="_blank">export-control regulations on dual-use items</a>.</p><p>As a result, roboticists may not feel incentivized or equipped to evaluate and mitigate the dual-use risks associated with their work. This represents a major problem, as the likelihood of harm from the misuse of open robotics research and innovation is arguably higher than for nuclear and biological research, both of which require significantly more resources. Producing “do-it-yourself” robotic weapon systems using open-source designs and software and off-the-shelf commercial components is relatively easy and accessible. With this in mind, we think it’s high time for the robotics community to work toward its own set of sector-specific guidance for how researchers and companies can best navigate the dual-use risks associated with the open diffusion of their work.</p><h2>A Road Map for Responsible Robotics</h2><p>Striking a balance between security and openness is a complex challenge, but one that the robotics community must embrace. 
We cannot afford to stifle innovation, nor can we ignore the potential for harm. A proactive, multipronged approach is needed to navigate this dual-use dilemma. Drawing lessons from other fields of engineering, we propose a road map focusing on four key areas: education, incentives, moderation, and red lines.</p><h3>Education</h3><p>Integrating responsible research and innovation into robotics education at all levels is paramount. This includes not only dedicated courses but also the <a href="https://journals.uclpress.co.uk/lre/article/id/129/" rel="noopener noreferrer" target="_blank">systematic inclusion</a> of dual-use and cybersecurity considerations within core <a href="https://link.springer.com/article/10.1007/s11948-019-00164-6" rel="noopener noreferrer" target="_blank">robotics curricula</a>. We must foster a culture of responsible innovation so that we can empower roboticists to make informed decisions and proactively address potential risks.</p><p>Educational initiatives could include:</p><ul><li>Developing and disseminating open-source educational materials on responsible robotics for robotics teachers, researchers, and professionals from resources such as the <a href="https://disarmament.unoda.org/responsible-innovation-ai/resources/" rel="noopener noreferrer" target="_blank">United Nations Office for Disarmament Affairs</a> (UNODA) and the <a href="https://airesponsibly.net/education/" rel="noopener noreferrer" target="_blank">Center for Responsible AI</a> at New York University. </li><li>Organizing workshops and seminars on dual-use and ethical considerations at robotics conferences and universities.</li><li>Encouraging universities to offer courses or modules dedicated to <a href="https://journals.sagepub.com/doi/10.1177/20539517231219958" rel="noopener noreferrer" target="_blank">responsible research and innovation in robotics</a>.</li></ul><h3>Incentives</h3><p>Everyone should be encouraged to assess the potential negative consequences of making their work fully or partially open. Funding agencies can mandate risk assessments as a condition for project funding, signaling their importance. Professional organizations, like the <a href="https://www.ieee-ras.org/" rel="noopener noreferrer" target="_blank">IEEE Robotics and Automation Society</a> (RAS), can adopt and promote <a href="https://www.ieee-ras.org/industry-government/standards" rel="noopener noreferrer" target="_blank">best practices</a>, providing tools and frameworks for researchers to identify, assess, and mitigate risks. Such tools could include self-assessment checklists for individual researchers and guidance for how faculties and labs can set up ethical review boards. Academic journals and conferences can make peer-review risk assessments an integral part of the publication process, especially for high-risk applications.</p><p>Additionally, incentives like awards and recognition programs can highlight exemplary contributions to risk assessment and mitigation, fostering a culture of responsibility within the community. Risk assessment can also be encouraged and rewarded in more informal ways. People in leadership positions, such as Ph.D. supervisors and heads of labs, could build ad hoc opportunities for students and researchers to discuss possible risks. 
They can hold seminars on the topic and provide introductions to external stakeholders such as social scientists and NGO experts.</p><h3>Moderation</h3><p>The robotics community can implement <a href="https://dl.acm.org/doi/10.1145/3593013.3593981" rel="noopener noreferrer" target="_blank">self-regulation mechanisms</a> to moderate the diffusion of high-risk material. This could involve:</p><ul><li>Screening work prior to publication to prevent the dissemination of content posing serious risks.</li><li>Implementing graduated access controls (“gating”) to certain source code or data on open-source repositories, potentially requiring users to identify themselves and specify their intended use.</li><li>Establishing clear guidelines and community oversight to ensure transparency and prevent misuse of these moderation mechanisms. For example, organizations like RAS could define categories of risk levels for robotics research and applications, and create a monitoring committee to track and document real cases of misuse of robotics research, so as to understand and visualize the scale of the risks and develop better mitigation strategies.</li></ul><h3>Red Lines</h3><p>The robotics community should also seek to define and enforce red lines for the development and deployment of robotics technologies. Efforts in that direction have already been made, notably in the context of the <a href="https://standards.ieee.org/industry-connections/ec/autonomous-systems/" rel="noopener noreferrer" target="_blank">IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems</a>. Companies, including <a href="https://bostondynamics.com/" rel="noopener noreferrer" target="_blank">Boston Dynamics</a>, <a href="https://www.unitree.com/" rel="noopener noreferrer" target="_blank">Unitree</a>, <a href="https://www.agilityrobotics.com/" rel="noopener noreferrer" target="_blank">Agility Robotics</a>, <a href="https://clearpathrobotics.com/" rel="noopener noreferrer" target="_blank">Clearpath Robotics</a>, <a href="https://www.anybotics.com/" rel="noopener noreferrer" target="_blank">ANYbotics</a>, and <a href="https://www.openrobotics.org/" rel="noopener noreferrer" target="_blank">Open Robotics</a> wrote an open letter calling for regulations on the <a href="https://bostondynamics.com/news/general-purpose-robots-should-not-be-weaponized/" rel="noopener noreferrer" target="_blank">weaponization of general-purpose robots</a>. Unfortunately, their efforts were very narrow in scope, and there is a lot of value in further mapping end uses of robotics that should be deemed off-limits or demand extra caution.</p><p>It will certainly be difficult for the community to agree on standard red lines, because what is considered ethically acceptable or problematic is highly subjective. To support the process, individuals and companies can reflect on what they consider to be unacceptable uses of their work. This could result in policies and terms of use that beneficiaries of open research and open-source design software would have to formally agree to (such as specific-use open-source licenses). This would provide a basis for revoking access, denying software updates, and potentially suing or blacklisting people who misuse the technology. Some companies, including Boston Dynamics, have already implemented these measures to some extent. 
Any person or company conducting open research could replicate this example.</p><p>Openness is the key to innovation and the democratization of many engineering disciplines, including robotics, but it also amplifies the potential for misuse. The engineering community has a responsibility to proactively address the dual-use dilemma. By embracing responsible practices, from education and risk assessment to moderation and red lines, we can foster an ecosystem where openness and security coexist. The challenges are significant, but the stakes are too high to ignore. It is crucial to ensure that research and innovation benefit society globally and do not become a driver of instability in the world. This goal, we believe, aligns with the mission of the IEEE, which is to “advance technology for the benefit of humanity.” The engineering community, especially roboticists, needs to be proactive on these issues to prevent any backlash from society and to preempt potentially counterproductive measures or international regulations that could harm open science.</p>]]></description><pubDate>Tue, 10 Jun 2025 13:00:04 +0000</pubDate><guid>https://spectrum.ieee.org/navigating-the-dual-use-dilemma</guid><category>Robotics</category><category>Guest articles</category><category>Dual-use</category><dc:creator>Ludovic Righetti</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/robotic-arm-holding-a-scalpel-merging-into-a-digital-blueprint-on-a-black-and-white-background.png?id=60656210&amp;width=980"></media:content></item><item><title>Video Friday: Hopping on One Robotic Leg</title><link>https://spectrum.ieee.org/video-friday-one-legged-robot</link><description><![CDATA[
  8. <img src="https://spectrum.ieee.org/media-library/black-stick-figures-in-a-skating-pose-scattered-across-a-vast-white-icy-landscape.png?id=60524616&width=1500&height=2000&coordinates=485%2C0%2C485%2C0"/><br/><br/><p>
  9. <span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span>
  10. </p><h5><a href="https://www.edrcoalition.com/2025-energy-drone-robotics-summit">2025 Energy Drone & Robotics Summit</a>: 16–18 June 2025, HOUSTON</h5><h5><a href="https://roboticsconference.org/">RSS 2025</a>: 21–25 June 2025, LOS ANGELES</h5><h5><a href="https://robotx.ethz.ch/education/summer-school.html">ETH Robotics Summer School</a>: 21–27 June 2025, GENEVA</h5><h5><a href="https://ias-19.org/">IAS 2025</a>: 30 June–4 July 2025, GENOA, ITALY</h5><h5><a href="https://clawar.org/icres2025/">ICRES 2025</a>: 3–4 July 2025, PORTO, PORTUGAL</h5><h5><a href="https://2025.worldhaptics.org/">IEEE World Haptics</a>: 8–11 July 2025, SUWON, SOUTH KOREA</h5><h5><a href="https://ifac2025-msrob.com/">IFAC Symposium on Robotics</a>: 15–18 July 2025, PARIS</h5><h5><a href="https://2025.robocup.org/">RoboCup 2025</a>: 15–21 July 2025, BAHIA, BRAZIL</h5><h5><a href="https://www.ro-man2025.org/">RO-MAN 2025</a>: 25–29 August 2025, EINDHOVEN, NETHERLANDS</h5><h5><a href="https://clawar.org/clawar2025/">CLAWAR 2025</a>: 5–7 September 2025, SHENZHEN</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="https://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>
  11. Enjoy today’s videos!
  12. </p><div class="horizontal-rule">
  13. </div><p class="rm-anchors" id="fnzdkxl-jj0">
  14. This single-leg robot is designed to “form a foundation for future bipedal robot development,” but personally, I think it’s perfect as is.
  15. </p><p class="shortcode-media shortcode-media-youtube">
  16. <span class="rm-shortcode" data-rm-shortcode-id="e263fb0233d0bb0d075d93a40d651be2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FNzdKXl-jj0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  17. </p><p>
  18. [
  19. <a href="https://dynamicrobot.kaist.ac.kr/">KAIST Dynamic Robot Control and Design Lab</a> ]
  20. </p><div class="horizontal-rule">
  21. </div><p class="shortcode-media shortcode-media-youtube">
  22. <span class="rm-shortcode" data-rm-shortcode-id="8a6d56b1cad95583679b96d5194dd022" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wzYtsJwYfTM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  23. </p><p>
  24. Selling 17,000
  25. <a data-linked-post="2655919083" href="https://spectrum.ieee.org/social-robots-children" target="_blank">social robots</a> still amazes me. <a data-linked-post="2650251656" href="https://spectrum.ieee.org/aldebaran-robotics-seeking-betatesters-for-its-nao-humanoid-robot" target="_blank">Aldebaran</a> will be missed.
  26. </p><p>
  27. [
  28. <a href="https://aldebaran.com/en/">Aldebaran</a> ]
  29. </p><div class="horizontal-rule">
  30. </div><p class="rm-anchors" id="udti_d_vif0">
  31. Nice to see some actual challenging shoves as part of biped testing.
  32. </p><p class="shortcode-media shortcode-media-youtube">
  33. <span class="rm-shortcode" data-rm-shortcode-id="397e23922e40f8dda09c9558813c3604" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UdtI_D_vIF0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  34. </p><p>
  35. [
  36. <a href="https://www.ucr.bot/">Under Control Robotics</a> ]
  37. </p><div class="horizontal-rule">
  38. </div><blockquote class="rm-anchors" id="j5cfeee5pyi">
  39. <em>Ground Control made multilegged waves at IEEE’s International Conference on Robotics and Automation 2025 in Atlanta! We competed in the Startup Pitch Competition and demoed our robot at our booth, on NIST standard terrain, and around the convention. We were proud to be a finalist for Best Expo Demo and participate in the Robot Parade.</em>
  40. </blockquote><p class="shortcode-media shortcode-media-youtube">
  41. <span class="rm-shortcode" data-rm-shortcode-id="f805b79697328de135747f04c5a7dac1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/J5cfeEe5pyI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  42. </p><p>
  43. [
  44. <a href="https://groundcontrolrobotics.com/">Ground Control Robotics</a> ]
  45. </p><p>
  46. Thanks, Dan!
  47. </p><div class="horizontal-rule">
  48. </div><blockquote class="rm-anchors" id="agrtswo4snw">
  49. <em>Humanoid is a U.K.-based robotics innovation company dedicated to building commercially scalable, reliable and safe robotic solutions for real-world applications.</em>
  50. </blockquote><p class="shortcode-media shortcode-media-youtube">
  51. <span class="rm-shortcode" data-rm-shortcode-id="c6f2dda46adfe06e68b0b4b335ec3291" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/AgrTSWO4Snw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  52. </p><p>
  53. It’s a nifty bootup screen, I’ll give them that.
  54. </p><p>
  55. [
  56. <a href="https://thehumanoid.ai/product/">Humanoid</a> ]
  57. </p><p>
  58. Thanks, Kristina!
  59. </p><div class="horizontal-rule">
  60. </div><blockquote class="rm-anchors" id="plm9gaq1jxo">
  61. <em>Quadrupedal robots have demonstrated remarkable agility and robustness in traversing complex terrains. However, they remain limited in performing object interactions that require sustained contact. In this work, we present LocoTouch, a system that equips quadrupedal robots with tactile sensing to address a challenging task in this category: long-distance transport of unsecured cylindrical objects, which typically requires custom mounting mechanisms to maintain stability.</em>
  62. </blockquote><p class="shortcode-media shortcode-media-youtube">
  63. <span class="rm-shortcode" data-rm-shortcode-id="5209d97768c506bd070b00ce7aa8e8b2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pLm9gaQ1JXo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  64. </p><p>
  65. [
  66. <a href="https://linchangyi1.github.io/LocoTouch/">LocoTouch paper</a> ]
  67. </p><p>
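Since the blurb above describes the system only at a high level, here is a minimal, hypothetical sketch of one ingredient such a controller plausibly needs: estimating where an unsecured cylinder sits on a back-mounted tactile grid via a pressure-weighted centroid, then computing a proportional correction to re-center it. The array layout, gain, and function names are illustrative assumptions, not LocoTouch’s actual method.
</p><pre><code>import numpy as np

def cylinder_offset(pressure: np.ndarray) -> float:
    """Lateral offset of the contact centroid, scaled to [-1, 1].

    `pressure` is a (rows, cols) tactile-array snapshot; columns span the
    robot's left-right axis. Hypothetical layout, for illustration only.
    """
    cols = np.arange(pressure.shape[1])
    total = pressure.sum()
    if total < 1e-9:  # no contact detected
        return 0.0
    centroid = (pressure.sum(axis=0) * cols).sum() / total
    mid = (pressure.shape[1] - 1) / 2.0
    return (centroid - mid) / mid

def roll_correction(pressure: np.ndarray, gain: float = 0.1) -> float:
    """Proportional body-roll command (radians) to re-center the load."""
    return -gain * cylinder_offset(pressure)

# Toy reading: the cylinder has drifted toward the right edge.
snapshot = np.zeros((4, 8))
snapshot[:, 6] = 1.0
print(roll_correction(snapshot))  # negative roll nudges the load back left
</code></pre><p>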
  68. Thanks, Changyi!
  69. </p><div class="horizontal-rule">
  70. </div><blockquote class="rm-anchors" id="2lg-4mdx210">
  71. <em>In this video, Digit is performing tasks autonomously using a whole-body controller for mobile manipulation. This new controller was trained in simulation, enabling Digit to execute tasks while navigating new environments and manipulating objects it has never encountered before.</em>
  72. </blockquote><p class="shortcode-media shortcode-media-youtube">
  73. <span class="rm-shortcode" data-rm-shortcode-id="4d2772b70353c22ada366d8040940a1a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2lG-4mdx210?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  74. </p><p>
  75. Not bad, although it’s worth pointing out that those shelves are not representative of any market I’ve ever been to.
  76. </p><p>
  77. [
  78. <a href="https://www.agilityrobotics.com/">Agility Robotics</a> ]
  79. </p><div class="horizontal-rule">
  80. </div><p class="rm-anchors" id="xwmwmhrt-fs">
  81. It’s always cool to see robots presented as an incidental solution to a problem as opposed to, you know, robots.
  82. </p><p class="shortcode-media shortcode-media-youtube">
  83. <span class="rm-shortcode" data-rm-shortcode-id="686a7d77fbda850290710efc6140a527" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xWmWmhRt-fs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  84. </p><p>
  85. The question that you really want answered, though, is “Why is there water on the floor?”
  86. </p><p>
  87. [
  88. <a href="https://bostondynamics.com/products/orbit/">Boston Dynamics</a> ]
  89. </p><div class="horizontal-rule">
  90. </div><blockquote class="rm-anchors" id="gqidyj-akaa">
  91. <em>Reinforcement learning (RL) has significantly advanced the control of physics-based and robotic characters that track kinematic reference motion. We propose a multi-objective reinforcement learning framework that trains a single policy conditioned on a set of weights, spanning the Pareto front of reward trade-offs. Within this framework, weights can be selected and tuned after training, significantly speeding up iteration time. We demonstrate how this improved workflow can be used to perform highly dynamic motions with a robot character.</em>
  92. </blockquote><p class="shortcode-media shortcode-media-youtube">
  93. <span class="rm-shortcode" data-rm-shortcode-id="daa86fab0c3d3f61cc1ab142a8056ca3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/gQidYj-AKaA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  94. </p><p>
  95. [
  96. <a href="https://la.disneyresearch.com/publication/amor-adaptive-character-control-through-multi-objective-reinforcement-learning/">Disney Research</a> ]
  97. </p><div class="horizontal-rule">
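</div><p>
To make the abstract concrete: the core trick is scalarizing a vector of rewards with a weight vector that the policy also receives as input, so a single network covers many trade-offs and the weights can be retuned after training. Here is a minimal, hypothetical sketch of that conditioning pattern; the objectives, sampling, and names are stand-ins, not Disney Research’s code.
</p><pre><code>import numpy as np

rng = np.random.default_rng(0)
NUM_OBJECTIVES = 3  # e.g., tracking error, energy, smoothness (illustrative)

def sample_weights() -> np.ndarray:
    """Draw a random point on the simplex: one reward trade-off."""
    return rng.dirichlet(np.ones(NUM_OBJECTIVES))

def scalarize(reward_vec: np.ndarray, w: np.ndarray) -> float:
    """Collapse the per-objective rewards into the scalar RL reward."""
    return float(w @ reward_vec)

def policy_input(obs: np.ndarray, w: np.ndarray) -> np.ndarray:
    """Condition the policy on the weights by appending them to the observation."""
    return np.concatenate([obs, w])

# One simulated step of weight-conditioned training data:
obs = rng.standard_normal(10)                     # placeholder observation
w = sample_weights()                              # trade-off for this episode
reward_vec = rng.standard_normal(NUM_OBJECTIVES)  # placeholder per-objective rewards
x = policy_input(obs, w)                          # what the policy network consumes
r = scalarize(reward_vec, w)                      # what the RL algorithm maximizes
print(x.shape, round(r, 3))
</code></pre><p>
After training, sweeping the weights at deployment time moves the same policy along the Pareto front without retraining, which is the iteration-speed win the abstract describes.
</p><div class="horizontal-rule">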
  98. </div><p class="rm-anchors" id="igyjdvu2tc0">
  99. It’s been a week since ICRA 2025, and TRON 1 already misses all the new friends it made!
  100. </p><p class="shortcode-media shortcode-media-youtube">
  101. <span class="rm-shortcode" data-rm-shortcode-id="891a8f3fed5dda5103e1a2056cef57e4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/iGyJdVu2tc0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  102. </p><p>
  103. [
  104. <a href="https://www.limxdynamics.com/en">LimX Dynamics</a> ]
  105. </p><div class="horizontal-rule">
  106. </div><p class="rm-anchors" id="hjpps5vcftg">
  107. ROB 450 in Winter 2025 challenged students to synthesize the knowledge acquired in their robotics undergraduate courses at the University of Michigan, applying a systematic, iterative design and analysis process to a real, open-ended robotics problem.
  108. </p><p class="shortcode-media shortcode-media-youtube">
  109. <span class="rm-shortcode" data-rm-shortcode-id="cb4df45971f7989fef2eafcf4708c497" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/hjPPS5vcFtg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  110. </p><p>
  111. [
  112. <a href="https://robotics.umich.edu/">University of Michigan Robotics</a> ]
  113. </p><div class="horizontal-rule">
  114. </div><p class="rm-anchors" id="hh7fh5ys82q">
  115. “What’s the Trick?” is a talk on human vs. current robot learning, given by Chris Atkeson at the Robotics and AI Institute.
  116. </p><p class="shortcode-media shortcode-media-youtube">
  117. <span class="rm-shortcode" data-rm-shortcode-id="ad0b49c258ded8012bd36ea093692f33" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/hh7Fh5YS82Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  118. </p><p>
  119. [
  120. <a href="https://rai-inst.com/">Robotics and AI Institute (RAI)</a> ]
  121. </p><div class="horizontal-rule">
  122. </div>]]></description><pubDate>Fri, 06 Jun 2025 16:30:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-one-legged-robot</guid><category>Video friday</category><category>Robotics</category><category>Humanoid robots</category><category>Aldebaran robotics</category><category>Reinforcement learning</category><category>Quadruped robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/black-stick-figures-in-a-skating-pose-scattered-across-a-vast-white-icy-landscape.png?id=60524616&amp;width=980"></media:content></item><item><title>Look for These 7 New Technologies at the Airport</title><link>https://spectrum.ieee.org/7-new-airport-tech</link><description><![CDATA[
  123. <img src="https://spectrum.ieee.org/media-library/line-drawing-of-a-woman-walking-into-an-airport-and-rolling-carryon-luggage-as-she-checks-her-travel-itinerary-on-a-cell-phone.png?id=60389585&width=1500&height=2000&coordinates=577%2C0%2C578%2C0"/><br/><br/><p><strong>Take a look around</strong> the airport during your travels this summer and you might spot a string of new technologies at every touchpoint: from pre-arrival, bag drop, and security to the moment you board the plane.</p><p>In this new world, your face is your boarding pass, your electronic luggage tag transforms itself for each new flight, and gate scanners catch line cutters trying to sneak onto the plane early.</p><p>It isn’t the future—it’s now. Each of the technologies to follow is in use at airports around the world today, transforming your journey-before-the-journey.</p><h2>Virtual queuing speeds up airport security</h2><p>As you pack the night before your trip, you ponder the age-old travel question: What time should I get to the airport? The right answer requires predicting the length of the security line. But at some airports, you no longer have to guess; in fact, you don’t have to wait in line at all.</p><p>Instead, you can book ahead and choose a specific time for your security screening—so you can arrive right before your reserved slot, confident that you’ll be whisked to the front of the line, thanks to <a href="https://copenhagenoptimization.com/" rel="noopener noreferrer" target="_blank">Copenhagen Optimization</a>’s Virtual Queuing system.</p><p>Copenhagen Optimization’s machine learning models use linear regression, heuristic models, and other techniques to forecast the volume of passenger arrivals based on historical data. The system is integrated with airport programs to access flight schedules and passenger-flow data from boarding-pass scans, and it also takes in data from lidar sensors and cameras at security checkpoints, X-ray luggage scanners, and other areas.</p><p>If a given day’s passenger volume ends up differing from historical projections, the platform can use real-time data from these inputs to adjust the Virtual Queuing time slots—and recommend that the airport make changes to security staffing and the number of open lanes. The Virtual Queuing system is constantly adjusting to flatten the passenger arrival curve, tactically redistributing demand across time slots to optimize resources and reduce congestion.</p><p>While this system is doing the most, you as a passenger can do the least. Just book a time slot on your airport’s website or app, and get some extra sleep knowing you’ll waltz right up to the security check tomorrow morning.</p><h2>Electronic bag tags</h2><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Line drawing of a woman lifting suitcase at airport baggage check-in with barcode in focus." class="rm-shortcode" data-rm-shortcode-id="64ff97b084fbc93dd936889921e516d7" data-rm-shortcode-name="rebelmouse-image" id="f8bea" loading="lazy" src="https://spectrum.ieee.org/media-library/line-drawing-of-a-woman-lifting-suitcase-at-airport-baggage-check-in-with-barcode-in-focus.png?id=60389664&width=980"/> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">MCKIBILLO</small></p><p>Checking a bag? 
Here’s another step you can take care of before you arrive: Skip the old-school paper tags and generate your own electronic <a href="https://bagtag.com/" target="_blank">Bagtag</a>. This e-ink device (costing about US $80, or €70) looks like a traditional luggage-tag holder, but it can generate a new, paperless tag for each one of your flights.</p><p>You provide your booking details through your airline’s app or the Bagtag app, and the Bagtag system then uses application programming interfaces and secure data protocols to retrieve the necessary information from the airline’s system: your name, flight details, the baggage you’re allowed, and the unique barcode that identifies your bag. The app uses this data to generate a digital tag. Hold your phone near your Bagtag, and it will transmit the encrypted tag data via Bluetooth or NFC. Simultaneously, your phone’s NFC antenna powers the battery-free Bagtag device.</p><p>On the Bagtag itself, a low-power microcontroller decrypts the tag data and displays the digital tag on the e-ink screen. Once you’re at the airport, the tag can be scanned at the airline’s self-service bag drop or desk, just like a traditional paper tag. The device also contains an RFID chip that’s compatible with the luggage-tracking systems that some airlines are using, allowing your bag to be identified and tracked—even if it takes a different journey than you do. When you arrive at the airport, just drop that checked bag and make your way to the security area.</p><h2>Biometric boarding passes</h2><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" rel="float: left;" style="float: left;"> <img alt="Illustration of a woman using kiosk for facial recognition ID verification." class="rm-shortcode" data-rm-shortcode-id="af8a923d85c7eac7ca6873db756cc3fb" data-rm-shortcode-name="rebelmouse-image" id="3dfdf" loading="lazy" src="https://spectrum.ieee.org/media-library/illustration-of-a-woman-using-kiosk-for-facial-recognition-id-verification.png?id=60389955&width=980"/> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">MCKIBILLO</small></p><p>Over at security, you’ll need your boarding pass and ID. Compared with the old days of printing a physical slip from a kiosk, digital QR code boarding passes are quite handy—but what if you didn’t need anything besides your face? That’s the premise of <a href="https://www.idemia.com/" target="_blank">Idemia Public Security</a>’s biometric boarding-pass technology.</p><p>Instead of waiting in a queue for a security agent, you’ll approach a self-service kiosk or check-in point and insert your government-issued identification document, such as a driver’s license or passport. The system uses visible light, infrared, and ultraviolet imaging to analyze the document’s embedded security features and verify its authenticity. Then, computer-vision algorithms locate and extract the image of your face on the ID for identity verification.</p><p>Next, it’s time for your close-up. High-resolution cameras within the system capture a live image of your face using 3D and infrared imaging. The system’s antispoofing technology prevents people from trying to trick the system with items like photos, videos, or masks. The technology compares your live image to the one extracted from your ID using facial-recognition algorithms. 
Each image is then converted into a compact biometric template—a mathematical representation of your facial features—and a similarity score is generated to confirm a match.</p><p>Finally, the system checks your travel information against secure flight databases to make sure the ticket is valid and that you’re authorized to fly that day. Assuming all checks out, you’re cleared to head to the body scanners—with no biometric data retained by Idemia Public Security’s system.</p><h2>X-rays that can tell ecstasy from eczema meds </h2><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Illustration of an X-ray machine scanning luggage with schematic view of interior components above." class="rm-shortcode" data-rm-shortcode-id="0ff27fb1769e9930b81f30bde1d86244" data-rm-shortcode-name="rebelmouse-image" id="9c471" loading="lazy" src="https://spectrum.ieee.org/media-library/illustration-of-an-x-ray-machine-scanning-luggage-with-schematic-view-of-interior-components-above.png?id=60389973&width=980"/> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">MCKIBILLO</small></p><p>While you pass through your security screening, that luggage you checked is undergoing its own screening—with a major new upgrade that can tell exactly what’s inside.</p><p>Traditional scanners use one or a few X-ray sources and work by transmission, measuring the attenuation of the beam as it passes through the bag. These systems create a 2D “shadow” image based on differences in the amount and type of the materials inside. More recently, these systems have begun using <a href="https://spectrum.ieee.org/invention-of-ct-scanner" target="_blank">computed tomography</a> to scan the bag from all directions and to reconstruct 3D images of the objects inside. But even with CT, harmless objects may look similar to dangerous materials—which can lead to false positives and also require security staff to visually inspect the X-ray images or even bust open your luggage.</p><p>By contrast, <a href="https://www.smithsdetection.com/" target="_blank">Smiths Detection</a>’s new <a href="https://spectrum.ieee.org/future-baggage-scanners-will-tell-us-what-things-are-made-of" target="_blank">X-ray diffraction</a> machines measure the molecular structure of the items inside your bag to identify the exact materials—no human review required.</p><p>The machine uses a multifocus X-ray tube to quickly scan a bag from various angles, measuring the way the radiation diffracts while switching the position of the focal spots every few microseconds. Then, it analyzes the diffraction patterns to determine the crystal structure and molecular composition of the objects inside the bag—building a “fingerprint” of each material that can much more finely differentiate threats, like explosives and drugs, from benign items.</p><p>The system’s algorithms process this diffraction data and build a 3D spatial image, which allows real-time automated screening without the need for manual visual inspection by a human. After your bag passes through the X-ray diffraction machine without incident, it’s loaded into the cargo hold. 
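</p><p>
The “fingerprint” idea lends itself to a toy illustration: match a measured diffraction pattern against a small reference library by correlation. Everything below, including the library, the patterns, and the matching rule, is a made-up sketch rather than Smiths Detection’s algorithm.
</p><pre><code>import numpy as np

# Hypothetical reference library: material -> normalized 1D diffraction
# pattern (intensity versus scattering angle). Values are invented.
LIBRARY = {
    "table_sugar": np.array([0.10, 0.80, 0.30, 0.05, 0.20]),
    "explosive_x": np.array([0.70, 0.10, 0.10, 0.90, 0.40]),
}

def best_match(measured: np.ndarray) -> tuple[str, float]:
    """Return the library material whose pattern correlates best with
    the measured one, along with the correlation score."""
    scores = {
        name: float(np.corrcoef(measured, ref)[0, 1])
        for name, ref in LIBRARY.items()
    }
    name = max(scores, key=scores.get)
    return name, scores[name]

# A noisy re-measurement of sugar should still match "table_sugar".
noisy_sugar = np.array([0.12, 0.75, 0.33, 0.04, 0.22])
print(best_match(noisy_sugar))
</code></pre><p>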
Meanwhile, you’ve passed through your own scan at security and are ready to head toward your gate.</p><h2>Airport shops with no cashiers or checkout lanes</h2><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" rel="float: left;" style="float: left;"> <img alt='Illustration of a woman entering a store with a "Just Walk Out" shopping system.' class="rm-shortcode" data-rm-shortcode-id="995f20a20ef07fc6697d2cac5737b9c1" data-rm-shortcode-name="rebelmouse-image" id="8a4c0" loading="lazy" src="https://spectrum.ieee.org/media-library/illustration-of-a-woman-entering-a-store-with-a-just-walk-out-shopping-system.png?id=60390007&width=980"/> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">MCKIBILLO</small></p><p>While meandering over to your gate from security, you decide you could use a little pick-me-up. Just down the corridor is a convenience store with snacks, drinks, and other treats—but no cashiers. It’s a contactless shop that uses <a href="https://www.justwalkout.com/" target="_blank">Just Walk Out</a> technology by Amazon.</p><p>As you enter the store with the tap of a credit card or mobile wallet, a scanner reads the card and assigns you a unique session identifier that will let the Just Walk Out system link your actions in the store to your payment. Overhead cameras track you by the top of your head, not your face, as you move through the store.</p><p>The Just Walk Out system uses a deep-learning model to follow your movements and detect when you interact with items. In most cases, computer vision can identify a product you pick up simply based on the video feed, but sometimes weight sensors embedded in the shelves provide additional data to determine what you removed. The video and weight data are encoded as tokens, and a neural network processes those tokens in a way similar to how large language models encode text—determining the result of your actions to create a “virtual cart.”</p><p>While you shop, the system continuously updates this cart: adding a can of soda when you pick it up, swapping one brand of gum for another if you change your mind, or removing that bag of chips if you put it back on the shelf. Once your shopping is complete, you can indeed just walk out with your soda and gum. The items you take will make up your finalized virtual cart, and the credit card you entered the store with will be charged as usual. (You can look up a receipt, if you want.) With provisions procured, it’s onward to the gate.</p><h2>Airport-cleaning robots</h2><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Illustration of a woman watching an automated floor cleaning robot cleaning up a spilled drink in the airport." class="rm-shortcode" data-rm-shortcode-id="910995e84aefefbcacb0afaefa6e6a37" data-rm-shortcode-name="rebelmouse-image" id="a8ced" loading="lazy" src="https://spectrum.ieee.org/media-library/illustration-of-a-woman-watching-an-automated-floor-cleaning-robot-cleaning-up-a-spilled-drink-in-the-airport.png?id=60390051&width=980"/> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">MCKIBILLO</small></p><p>As you amble toward the gate with your luggage and snacks, you promptly spill that soda you just bought. Cleanup in Terminal C! 
Along comes <a href="https://avidbots.com/" target="_blank">Avidbots’ Neo</a>, a fully autonomous floor-scrubbing robot designed to clean commercial spaces like airports with minimal human intervention.</p><p>When a Neo is first delivered to the airport, the robot performs a comprehensive scan of the various areas it will be cleaning using lidar and 3D depth cameras. Avidbots software processes the data to create a detailed map of the environment, including walls and other obstacles, and this serves as the foundation for Neo’s cleaning plans and navigation.</p><p>Neo’s human overlords can use a touchscreen on the robot to direct it to the area that needs cleaning—either as part of scheduled upkeep, or when someone (ahem) spills their soda. The robot springs into action, and as it moves, it continuously locates itself within its map and plans its movements using data from wheel encoders, inertial measurement units, and a gyroscope. Neo also updates its map and adjusts its path in real time by using the lidar and depth cameras to detect any changes from its initial mapping, such as a translocated trash can or perambulating passengers.</p><p>Then comes the scrubbing. Neo’s software plans the optimal path for cleaning a given area at this moment in time, adjusting the robot’s speed and steering as it moves along. A water-delivery system pumps and controls the flow of cleaning solution to the motorized brushes, whose speed and pressure can also be adjusted based on the surface the robot is cleaning. A powerful vacuum system collects the dirty water, and a flexible squeegee prevents slippery floors from being left behind.</p><p>While the robot’s various sensors and planning algorithms continuously detect and avoid obstacles, any physical contact with the robot’s bumpers triggers an emergency stop. And if Neo finds itself in a situation it’s just not sure how to handle, the robot will stop and call for assistance from a human operator, who can review sensor data and camera feeds remotely to help it along.</p><h2>“Wrong group” plane-boarding alarm</h2><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" rel="float: left;" style="float: left;"> <img alt="Illustration of a woman waiting in line at boarding gate E6, with notification bell icon above." class="rm-shortcode" data-rm-shortcode-id="2b01cd5c1c6b0ed8dd2962347946988a" data-rm-shortcode-name="rebelmouse-image" id="bdce3" loading="lazy" src="https://spectrum.ieee.org/media-library/illustration-of-a-woman-waiting-in-line-at-boarding-gate-e6-with-notification-bell-icon-above.png?id=60390066&width=980"/> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">MCKIBILLO</small></p><p>Your airport journey is coming to an end, and your real journey is about to begin. As you wait at the gate, you notice a fair number of your fellow passengers hovering to board even before the agent has made any announcements. And when boarding does begin, a surprising number of people hop in line. <em><em>Could all these people really be in boarding groups 1 and 2?</em></em> you wonder.</p><p>If they’re not…they’ll get called out. 
American Airlines’ new boarding technology stops those pesky passengers who try to join the wrong boarding group and sneak onto the plane early.</p><p>If one such passenger approaches the gate before their assigned group has been called, scanning their boarding pass will trigger an audible alert—notifying the airline crew, and everyone else for that matter. The passengers will be politely asked to wait to board. As they slink back into line, try not to look too smug. After all, it’s been a remarkably easy, tech-assisted journey through the airport today. <span class="ieee-end-mark"></span></p><p><em>This article appears in the July 2025 print issue as “A Walk Through 7 New Technologies at the Airport.”</em></p>]]></description><pubDate>Wed, 04 Jun 2025 16:00:04 +0000</pubDate><guid>https://spectrum.ieee.org/7-new-airport-tech</guid><category>Airlines</category><category>Facial recognition</category><category>Robot cleaner</category><category>Airports</category><category>X-ray diffraction</category><dc:creator>Julianne Pepitone</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/line-drawing-of-a-woman-walking-into-an-airport-and-rolling-carryon-luggage-as-she-checks-her-travel-itinerary-on-a-cell-phone.png?id=60389585&amp;width=980"></media:content></item><item><title>Who Gives a S#!t About Cursing Robots?</title><link>https://spectrum.ieee.org/cursing-social-robot-interaction</link><description><![CDATA[
  124. <img src="https://spectrum.ieee.org/media-library/an-illustration-of-a-robot-tripping-on-a-banana-peel-its-head-is-covered-by-a-speech-bubble-and-symbols-representing-an-expleti.jpg?id=60333045&width=1500&height=2000&coordinates=877%2C0%2C877%2C0"/><br/><br/><p>
  125. The robots that share our public spaces today are so demure. Social robots and service robots aim to avoid offense, erring toward polite airs, positive emotions, and obedience. In some ways, this makes sense—would you really want to have a yelling match with a delivery robot in a hotel? Probably not, even if you’re in New York City and trying to absorb the local culture.
  126. </p><p>
  127. In other ways, this passive social robot design aligns with paternalistic standards that link assistance to subservience. Thoughtlessly following such outdated social norms in robot design may be ill-advised, since it <a href="https://pubmed.ncbi.nlm.nih.gov/37123285/" rel="noopener noreferrer" target="_blank">can help to reinforce harmful ideas</a>, such as restricting people’s rights or reflecting only the needs of majority-identity users.
  128. </p><p>
  129. In <a href="https://osusharelab.com/" rel="noopener noreferrer" target="_blank">my robotics lab at Oregon State University</a>, <a href="https://spectrum.ieee.org/how-high-fives-help-us-get-in-touch-with-robots" target="_blank">we work with</a> <a href="https://spectrum.ieee.org/whats-the-deal-with-robot-comedy" target="_blank">a playful spirit</a> and enjoy challenging the problematic norms that are entrenched within “polite” interactions and social roles. So we decided to experiment with robots that use foul language around humans. After all, many people are using foul language more than ever in 2025. Why not let robots have a chance, too?
  130. </p><h3>Why and How to Study Cursing Robots</h3><p>
  131. Societal standards in the United States suggest that cursing robots would likely rub people the wrong way in most contexts, as swearing has a predominantly negative connotation. Although some past research shows that cursing <a href="https://www.researchgate.net/publication/238326248_Swearing_at_work_and_permissive_leadership_culture_When_anti-social_becomes_social_and_incivility_is_acceptable" rel="noopener noreferrer" target="_blank">can enhance team cohesion</a> and <a href="https://hrcak.srce.hr/file/159883" rel="noopener noreferrer" target="_blank">elicit humor</a>, certain members of society (such as women) are often expected to <a href="https://link.springer.com/article/10.1023/A:1022986429748" rel="noopener noreferrer" target="_blank">avoid risking offense</a> through profanity. We wondered whether cursing robots would be viewed negatively, or if they might perhaps offer benefits in certain situations.
  132. </p><p>
  133. We decided to study cursing robots in the context of responding to mistakes. Past work in human-robot interaction has already shown that <a href="https://ieeexplore.ieee.org/abstract/document/5453195" rel="noopener noreferrer" target="_blank">responding to error</a> (rather than ignoring it) can help robots be perceived more positively in human-populated spaces, especially in the case of personal and service robots. And one <a href="https://par.nsf.gov/biblio/10284325-perceived-agency-social-norm-violating-robot" rel="noopener noreferrer" target="_blank">study</a> found that compared to other faux pas, foul language is more forgivable in a robot.
  134. </p><p>
  135. With this past work in mind, we generated videos with three common types of robot failure: bumping into a table, dropping an object, and failing to grasp an object. We crossed these situations with three types of responses from the robot: no verbal reaction, a non-expletive verbal declaration, and an expletive verbal declaration. We then asked people to rate the robots on things like competence, discomfort, and likability, using standard scales in an online survey.
  136. </p><p class="shortcode-media shortcode-media-youtube">
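</p><p>
As a concrete illustration of that design, here is a small, hypothetical sketch of the 3 × 3 condition grid and the kind of per-condition averaging such a survey yields. The ratings below are fabricated stand-ins meant only to show the structure, not our study’s data.
</p><pre><code>import itertools
import statistics

FAILURES = ["bump_table", "drop_object", "failed_grasp"]
RESPONSES = ["no_reaction", "non_expletive", "expletive"]

# ratings[(failure, response)] -> list of 1-to-7 likability scores (invented)
ratings = {cond: [] for cond in itertools.product(FAILURES, RESPONSES)}
ratings[("drop_object", "expletive")] = [5, 6, 4]
ratings[("drop_object", "no_reaction")] = [3, 2, 4]

for (failure, response), scores in ratings.items():
    if scores:
        print(failure, response, round(statistics.mean(scores), 2))
</code></pre><p class="shortcode-media shortcode-media-youtube">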
  137. <span class="rm-shortcode" data-rm-shortcode-id="27aa73d6ea081bc41fc24b217317e021" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/hYdN5zLa07Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  138. <small class="image-media media-caption" placeholder="Add Photo Caption...">What If Robots Cursed? These Videos Helped Us Learn How People Feel about Profane Robots</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Video: Naomi Fitter</small></p><h3>What People Thought of Our Cursing Robots</h3><p>
  139. On the whole, we were surprised by how acceptable swearing seemed to the study participants, especially within an initial group of Oregon State University students, and even among the general public. Cursing had no negative impact, and even some positive impacts, among the college students once we removed one religiously connoted curse (god***it), which was received far more negatively than the other cuss words.
  140. </p><p>
  141. In fact, university participants rated swearing robots as the <a href="https://sparqtools.org/mobility-measure/inclusion-of-other-in-the-self-ios-scale/" rel="noopener noreferrer" target="_blank">most socially close</a> and most humorous, and rated non-expletive and expletive robot reactions equivalent on social warmth, competence, discomfort, anthropomorphism, and likability scales. The general public judged non-profane and profane robots as equivalent on most scales, although expletive reactions were deemed most discomforting and non-expletive responses seemed most likable. We believe that the university students were slightly more accepting of cursing robots because of the campus’s progressive culture, where cursing is considered a peccadillo.
  142. </p><p class="ieee-inbody-related">Related: <a href="https://spectrum.ieee.org/whats-the-deal-with-robot-comedy" target="_blank">What’s the Deal With Robot Comedy?</a></p><p>
  143. Since experiments run solely in an online setting do not always represent real-life interactions well, we also conducted a final in-person replication study, in which a robot made errors while distributing goodie bags to campus community members at Oregon State. This study reinforced our prior results.
  144. </p><p class="shortcode-media shortcode-media-youtube">
  145. <span class="rm-shortcode" data-rm-shortcode-id="10b0ac5703e47214353a0b0843085bf0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DhHhh4yni1I?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  146. <small class="image-media media-caption" placeholder="Add Photo Caption...">Humans React to a Cursing Robot in the Wild<a href="https://www.youtube.com/@naomi_fitter" rel="noopener noreferrer" target="_blank"></a></small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Video: Naomi Fitter</small></p><p>
  147. We have submitted this work, which represents a well-designed series of empirical experiments with interesting results and replications along the way, to several different journals and conferences. Despite consistently enthusiastic reviewer comments, no editors have yet accepted our work for publication—it seems to be the type of paper that editors are nervous to touch. Currently, the work is under review for a fourth time, for possible inclusion in the 2025 IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), in a paper titled “<a href="https://arxiv.org/abs/2505.05831" rel="noopener noreferrer" target="_blank">Oh F**k! How Do People Feel About Robots That Leverage Profanity?</a>”
  148. </p><h3>Give Cursing Robots a Chance </h3><p>
  149. Based on our results, we think cursing robots deserve a chance! Our findings show that swearing robots would typically have little downside and some upside, especially in open-minded spaces such as university campuses. Even for the general public, reactions to errors with profanity yielded much less distaste than we expected. Our data showed that people cared more about whether robots acknowledged their error at all than whether or not they swore.
  150. </p><p>
  151. People do have some reservations about cursing robots, especially when it comes to comfort and likability, so thoughtfulness may be required to apply curse words at the right time. For example, just as humans do, robots should likely hold back their swear words around children and be more careful in settings that typically demand cleaner language. Robot practitioners might also consider surveying individual users about profanity acceptance as they set up new technology in personal settings—rather than letting robotic systems learn the hard way, perhaps alienating users in the process.
  152. </p><p>
  153. As more robots enter our day-to-day spaces, they are bound to make mistakes. How they react to these errors is important. Fundamentally, our work shows that people prefer robots that notice when a mistake has occurred and react to this error in a relatable way. And it seems that a range of styles in the response itself, from the profane to the mundane, can work well. So we invite designers to give cursing robots a chance!
  154. </p>]]></description><pubDate>Tue, 03 Jun 2025 16:00:04 +0000</pubDate><guid>https://spectrum.ieee.org/cursing-social-robot-interaction</guid><category>Social robots</category><category>Human robot interaction</category><dc:creator>Naomi Fitter</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/an-illustration-of-a-robot-tripping-on-a-banana-peel-its-head-is-covered-by-a-speech-bubble-and-symbols-representing-an-expleti.jpg?id=60333045&amp;width=980"></media:content></item><item><title>This Little Mars Rover Stayed Home</title><link>https://spectrum.ieee.org/mars-pathfinder-rover</link><description><![CDATA[
  155. <img src="https://spectrum.ieee.org/media-library/photo-of-a-six-wheeled-robot-with-a-flat-metal-rectangular-top.jpg?id=60333415&width=1500&height=2000&coordinates=875%2C0%2C875%2C0"/><br/><br/><p>
  156. As a mere earthling, I remember watching in fascination as
  157. <em>Sojourner </em>sent back photos of the Martian surface during the summer of 1997. I was not alone. The servers at NASA’s Jet Propulsion Laboratory slowed to a crawl when they got more than 47 million hits (a record number!) from people attempting to download those early images of the Red Planet. To be fair, it was the late 1990s, the Internet was still young, and most people were using dial-up modems. By the end of the 83-day mission, <em>Sojourner </em>had sent back 550 photos and performed more than 15 chemical analyses of Martian rocks and soil.
  158. </p><p>
  159. <em>Sojourner</em>, of course, remains on Mars. Pictured here is <em>Marie Curie,</em> its twin. Functionally identical, either one of the rovers could have made the voyage to Mars, but one of them was bound to become the famous face of the mission, while the other was destined to be left behind in obscurity. Did I write this piece because I feel a little bad for <em>Marie Curie</em>? Maybe. But it also gave me a chance to revisit this pioneering Mars mission, which established that robots could effectively explore the surface of planets and captivate the public imagination.
  160. </p><h2><em>Sojourner</em>’s sojourn on Mars</h2><p>
  161. On 4 July 1997, the
  162. <a href="https://science.nasa.gov/mission/mars-pathfinder/" rel="noopener noreferrer" target="_blank"><em>Mars Pathfinder</em></a> parachuted through the Martian atmosphere and bounced about 15 times on glorified airbags before finally coming to rest. The lander, <a href="https://www.jpl.nasa.gov/news/nasa-renames-mars-lander-in-honor-of-late-carl-sagan/" rel="noopener noreferrer" target="_blank">renamed the <em>Carl Sagan Memorial Station</em></a>, carried precious cargo stowed inside. The next day, after the airbags retracted, the solar-powered <em>Sojourner </em>eased its way down the ramp, the first human-made vehicle to roll around on the surface of another planet. (Mars wasn’t the first extraterrestrial body to host a rover, though. The <a href="https://nssdc.gsfc.nasa.gov/nmc/spacecraft/display.action?id=1973-001A" rel="noopener noreferrer" target="_blank">Soviet Lunokhod rovers</a> conducted two successful missions on the moon in 1970 and 1973. The Soviets had also landed a rover on Mars back in 1971, but communication was lost before the <a href="https://spectrum.ieee.org/meet-the-very-first-rover-to-land-on-mars" target="_self">PROP-M</a> ever deployed.)
  163. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  164. <img alt="Photo of a man pulling on a cable attached to a small wheeled robot, in a large room filled with sand and rocks." class="rm-shortcode" data-rm-shortcode-id="9a0472467f342ba65cbfc5b938087628" data-rm-shortcode-name="rebelmouse-image" id="bcc72" loading="lazy" src="https://spectrum.ieee.org/media-library/photo-of-a-man-pulling-on-a-cable-attached-to-a-small-wheeled-robot-in-a-large-room-filled-with-sand-and-rocks.jpg?id=60333457&width=980"/>
  165. <small class="image-media media-caption" placeholder="Add Photo Caption...">This giant sandbox at JPL provided <i>Marie Curie</i> with an approximation of Martian terrain. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Mike Nelson/AFP/Getty Images</small>
  166. </p><p>
  167. The six-wheeled, 10.6-kilogram, microwave-oven-size
  168. <em>Sojourner </em>was equipped with three low-resolution cameras (two on the front for black-and-white images and a color camera on the rear), a laser hazard–avoidance system, an alpha-proton X-ray spectrometer, experiments for testing wheel abrasion and material adherence, and several accelerometers. The robot also demonstrated the value of the six-wheeled “rocker-bogie” suspension system that became NASA’s go-to design for all later Mars rovers. <em>Sojourner</em> never roamed more than about 12 meters from the lander due to the limited range of its radio.
  169. </p><p>
  170. <em>Pathfinder</em> had landed in <a href="https://en.wikipedia.org/wiki/Ares_Vallis" rel="noopener noreferrer" target="_blank">Ares Vallis</a>, a presumed ancient floodplain chosen because of the wide variety of rocks present. Scientists hoped to confirm the past existence of water on the surface of Mars. <em>Sojourner </em>did discover rounded pebbles that suggested running water, a finding that later missions confirmed.
  171. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  172. <img alt="Black and white photo of a small wheeled robot on sandy terrain with a large rock in the background." class="rm-shortcode" data-rm-shortcode-id="96ac1c45cafe3432d409797b52507ea8" data-rm-shortcode-name="rebelmouse-image" id="d8cf7" loading="lazy" src="https://spectrum.ieee.org/media-library/black-and-white-photo-of-a-small-wheeled-robot-on-sandy-terrain-with-a-large-rock-in-the-background.jpg?id=60333452&width=980"/>
  173. <small class="image-media media-caption" placeholder="Add Photo Caption...">A highlight of <i>Sojourner</i>’s 83-day mission on Mars was its encounter with a rock nicknamed Barnacle Bill [to the rover’s left]. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">JPL/NASA</small>
  174. </p><p>
  175. As its first act of exploration,
  176. <em>Sojourner </em>rolled forward 36 centimeters and encountered a rock, dubbed Barnacle Bill due to its rough surface. The rover spent about 10 hours analyzing the rock, using its spectrometer to determine its elemental composition. Over the next few weeks, while the lander collected atmospheric information and took photos, the rover studied rocks in detail and tested the Martian soil.
  177. </p><h2><em>Marie Curie</em>’s sojourn…in a JPL sandbox</h2><p>
  178. Meanwhile back on Earth, engineers at JPL used
  179. <em>Marie Curie</em> to mimic <em>Sojourner’s </em>movements in a Mars-like setting. During the original design and testing of the rovers, the team had set up giant sandboxes, each holding thousands of kilograms of playground sand, in the Space Flight Operations Facility at JPL. They exhaustively practiced the remote operation of <em>Sojourner</em>, simulating an 11-minute delay in communications between Mars and Earth. (The actual delay varies with the two planets’ separation, from about 7 to 20 minutes.) Even after <em>Sojourner</em> landed, <em>Marie Curie</em> continued to help them strategize.
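</p><p>
The delay is simply the one-way light-travel time between the planets. A quick back-of-the-envelope calculation in Python shows what the 11-minute figure implies; the 198-million-kilometer distance below is an illustrative value, not a mission-specific one:
</p><pre><code>
C = 299_792_458          # speed of light, in meters per second
distance_km = 198e6      # an example Earth-Mars distance, in kilometers
delay_minutes = distance_km * 1000 / C / 60
print(f"One-way signal delay: {delay_minutes:.1f} minutes")  # about 11 minutes
</code></pre><p>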
  180. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  181. <img alt="Photo of a man wearing VR goggles and looking at a computer screen, with his right hand on a large track ball." class="rm-shortcode" data-rm-shortcode-id="fe9372a23dae6bcb728214d1172b838b" data-rm-shortcode-name="rebelmouse-image" id="6f5e6" loading="lazy" src="https://spectrum.ieee.org/media-library/photo-of-a-man-wearing-vr-goggles-and-looking-at-a-computer-screen-with-his-right-hand-on-a-large-track-ball.jpg?id=60333425&width=980"/>
  182. <small class="image-media media-caption" placeholder="Add Photo Caption...">Initially, <i>Sojourner</i> was remotely operated from Earth, which was tricky given the lengthy communication delay. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Mike Nelson/AFP/Getty Images</small>
  183. </p><p>
  184. During its first few days on Mars,
  185. <em>Sojourner</em> was maneuvered by an Earth-based operator wearing 3D goggles and using a funky input device called a <a href="https://www.youtube.com/watch?v=0zzY8cxKauU" rel="noopener noreferrer" target="_blank">Spaceball 2003</a>. Images pieced together from both the lander and the rover guided the operator. It was like a very, very slow video game—the rover sometimes moved only a few centimeters a day. NASA then turned on <em>Sojourner’s </em>hazard-avoidance system, which allowed the rover <a href="https://www-robotics.jpl.nasa.gov/media/documents/ICESpaper.pdf" rel="noopener noreferrer" target="_blank">some autonomy</a> to explore its world. A human would suggest a path for that day’s exploration, and then the rover had to autonomously avoid any obstacles in its way, such as a big rock, a cliff, or a steep slope.
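</p><p>
In rough terms, that division of labor worked like the toy Python simulation below: Earth suggests the goal, and the onboard loop vetoes unsafe motion. It is purely illustrative; the one-dimensional terrain, hazard set, and function names are inventions for this article, not JPL flight software:
</p><pre><code>
HAZARDS = {3, 7}   # terrain cells the laser striping system flags as unsafe

def daily_traverse(pos, goal):
    """Creep toward the operator-suggested goal, detouring around hazards."""
    log = []
    while pos != goal:
        ahead = pos + 1
        if ahead in HAZARDS:
            log.append(f"hazard at cell {ahead}: turning to detour around it")
            HAZARDS.discard(ahead)   # toy stand-in for driving around the rock
        else:
            pos = ahead              # advance a few centimeters
            log.append(f"advanced to cell {pos}")
    return log

for entry in daily_traverse(0, 10):
    print(entry)
</code></pre><p>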
  186. </p><p>
  187. JPL designed
  188. <em>Sojourner </em>to operate for a week. But the little rover that could kept chugging along for 83 Martian days before NASA finally lost contact, on 7 October 1997. The lander had conked out on 27 September. In all, the mission collected 1.2 gigabytes of data (which at the time was a <em>lot</em>) and sent back 10,000 images of the planet’s surface.
  189. </p><p>
  190. NASA held on to
  191. <em>Marie Curie </em>in the hope of sending it on another mission to Mars. For a while, it was slated to be part of the <em>Mars 2001 </em>set of missions, but that didn’t happen. In 2015, JPL transferred the rover to the <a href="https://collections.si.edu/search/detail/edanmdm:nasm_A20150317000?" rel="noopener noreferrer" target="_blank">Smithsonian’s National Air and Space Museum</a>.
  192. </p><h2>When NASA Embraced Faster, Better, Cheaper</h2><p>
  193. The
  194. <em>Pathfinder </em>mission was the second one in NASA administrator <a href="https://www.nasa.gov/people/daniel-s-goldin/" rel="noopener noreferrer" target="_blank">Daniel S. Goldin</a>’s Discovery Program, which embodied his “faster, better, cheaper” philosophy of making NASA more nimble and efficient. (The first Discovery mission was to the asteroid Eros.) In the financial climate of the early 1990s, the space agency couldn’t risk a billion-dollar loss if a major mission failed. Goldin opted for smaller projects; the <em>Pathfinder</em> mission’s overall budget, including flight and operations, was capped at US $300 million.
  195. </p><p class="ieee-inbody-related">
  196. RELATED: <a href="https://spectrum.ieee.org/planetary-rovers-are-we-alone" target="_self">How NASA Built Its Mars Rovers</a>
  197. </p><p>
  198. In his 2014 book
  199. <a href="https://www.amazon.com/Curiosity-Inside-Mission-People-Happen/dp/1616149337" rel="noopener noreferrer" target="_blank"><em>Curiosity: An Inside Look at the Mars Rover Mission and the People Who Made It Happen</em></a> (Prometheus)<em>, </em>science writer Rod Pyle interviews <a href="https://science.nasa.gov/people/rob-manning/" rel="noopener noreferrer" target="_blank">Rob Manning</a>, chief engineer for the <em>Pathfinder </em>mission and subsequent Mars rovers. Manning recalled that one of the best things about the mission was its relatively minimal requirements. The team was responsible for landing on Mars, delivering the rover, and transmitting images—technically challenging, to be sure, but beyond that the team had no constraints.
  200. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  201. <img alt="Photo of two people in white lab coats standing in a dry landscape surrounded by several wheeled robots." class="rm-shortcode" data-rm-shortcode-id="c652ceb39c6da6d2c08a959847f796e0" data-rm-shortcode-name="rebelmouse-image" id="04b0b" loading="lazy" src="https://spectrum.ieee.org/media-library/photo-of-two-people-in-white-lab-coats-standing-in-a-dry-landscape-surrounded-by-several-wheeled-robots.jpg?id=60333422&width=980"/>
  202. <small class="image-media media-caption" placeholder="Add Photo Caption..."><i>Sojourner</i> was succeeded by the rovers <i>Spirit</i>, <i>Opportunity</i>, and <i>Curiosity</i>. Shown here are four mission spares, including <i>Marie Curie</i> [foreground]. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">JPL-Caltech/NASA</small>
  203. </p><p>
  204. The real mission was to prove to Congress and the American public that NASA could do groundbreaking work more efficiently. Behind the scenes, there was a little bit of accounting magic happening, with the “faster, better, cheaper” missions often being silently underwritten by larger, older projects. For example, the radioisotope heater units that kept
  205. <em>Sojourner</em>’s electronics warm enough to operate were leftover spares from the <em>Galileo</em> mission to Jupiter, so they were “free.”
  206. </p><p>
  207. Not only was the
  208. <em>Pathfinder</em> mission successful, but it also captured the hearts of Americans and reinvigorated interest in exploring Mars. In the process, it laid the foundation for the future missions that allowed the rovers <a href="https://science.nasa.gov/mission/mer-spirit/" rel="noopener noreferrer" target="_blank"><em>Spirit</em></a>, <a href="https://science.nasa.gov/mission/mer-opportunity/" rel="noopener noreferrer" target="_blank"><em>Opportunity</em></a>, and <a href="https://science.nasa.gov/mission/msl-curiosity/" rel="noopener noreferrer" target="_blank"><em>Curiosity</em></a> (which, incredibly, is still operating nearly 13 years after it landed) to explore even more of the Red Planet.
  209. </p><h2>How the rovers <em>Sojourner</em> and <em>Marie Curie</em> got their names</h2><p>
  210. To name its first Mars rovers, NASA launched a student contest in March 1994, with the specific guidance of choosing a “heroine.” Entry essays were judged on their quality and creativity, the appropriateness of the name for a rover, and the student’s knowledge of the woman to be honored as well as the mission’s goals. Students from all over the world entered.
  211. </p><p>
  212. Twelve-year-old Valerie Ambroise of Bridgeport, Conn., won for her essay on
  213. <a href="https://www.smithsonianmag.com/history/remarkable-untold-story-sojourner-truth-180983691/" rel="noopener noreferrer" target="_blank">Sojourner Truth</a>, while 18-year-old Deepti Rohatgi of Rockville, Md., came in second for hers on <a href="https://www.mariecurie.org.uk/about-us/our-history/marie-curie-the-scientist" rel="noopener noreferrer" target="_blank">Marie Curie</a>. Truth was a Black woman born into slavery at the end of the 18th century. She escaped with her infant daughter and two years later won freedom for her son through legal action. She became a vocal advocate for civil rights, women’s rights, and alcohol temperance. Curie<em> </em>was a Polish-French physicist and chemist famous for her studies of radioactivity, a term she coined. She was the first woman to win a Nobel Prize, as well as the first person to win a second Nobel.
  214. </p><p>
  215. NASA subsequently honored several other women by naming spacecraft and facilities after them. One of the last women to be so honored was
  216. <a href="https://spacenews.com/nasa-renames-wfirst-space-telescope-after-pioneering-woman-astronomer/" rel="noopener noreferrer" target="_blank">Nancy Grace Roman</a>, the space agency’s first chief of astronomy. In May 2020, NASA announced it would name the Wide Field Infrared Survey Telescope after Roman; the space telescope is set to launch as early as October 2026, although the Trump administration has repeatedly said it wants to <a href="https://www.scientificamerican.com/article/nasas-next-major-space-telescope-is-ready-to-launch-trump-wants-to-kill-it/" rel="noopener noreferrer" target="_blank">cancel the project</a>.
  217. </p><p class="ieee-inbody-related">
  218. Related:
  219. <a href="https://spectrum.ieee.org/rogue-planet" target="_self">A Trillion Rogue Planets and Not One Sun to Shine on Them</a>
  220. </p><p>
  221. These days, NASA tries to avoid naming its major projects after people. It quietly changed
  222. <a href="https://spacenews.com/nasa-policy-discourages-naming-missions-after-individuals/" rel="noopener noreferrer" target="_blank">its naming policy</a> in December 2022 after allegations came to light that James Webb, for whom the <a href="https://spectrum.ieee.org/collections/james-webb-telescope/" target="_self">James Webb Space Telescope</a> is named, had fired LGBTQ+ employees at NASA and, before that, the State Department. A <a href="https://spacenews.com/nasa-confirms-decision-to-keep-jwst-name-after-historical-report/" rel="noopener noreferrer" target="_blank">NASA investigation</a> couldn’t substantiate the allegations, and so the telescope retained Webb’s name. But the bar is now much higher for NASA projects to memorialize anyone, deserving or otherwise. (The agency did allow the hopping lunar robot <a href="https://www.intuitivemachines.com/micro-nova" rel="noopener noreferrer" target="_blank">IM-2 Micro Nova Hopper</a>, built by Intuitive Machines, to be named for computer-software pioneer <a href="https://www.gracehopper.com/blog/grace-hopper-the-person-programmer-and-pioneer" rel="noopener noreferrer" target="_blank">Grace Hopper</a>.)
  223. </p><p>
  224. And so
  225. <em>Marie Curie </em>and <em>Sojourner</em> will remain part of a rarefied clique. <em>Sojourner</em>, inducted into the <a href="http://www.robothalloffame.org/inductees/03inductees/mars.html" rel="noopener noreferrer" target="_blank">Robot Hall of Fame</a> in 2003, will always be the celebrity of the pair. And <em>Marie Curie</em> will always remain on the sidelines. But think about it this way: <em>Marie Curie </em>is now on exhibit at one of the most popular museums in the world, where millions of visitors can see the rover up close. That’s not too shabby a legacy either.
  226. </p><p>
  227. <em>Part of a </em><a href="https://spectrum.ieee.org/collections/past-forward/" target="_self"><em>continuing series</em></a><em> </em><em>looking at historical artifacts that embrace the boundless potential of technology.</em>
  228. </p><p>
  229. <em>An abridged version of this article appears in the June 2025 print issue.</em>
  230. </p><h3>References</h3><br/><p>Curator Matthew Shindell of the National Air and Space Museum first suggested I feature <em>Marie Curie</em>. I found additional information from the museum’s <a href="https://airandspace.si.edu/collection-objects/rover-marie-curie-mars-pathfinder-engineering-test-vehicle/nasm_A20150317000" target="_blank">collections website</a>, an article by David Kindy in <a href="https://www.smithsonianmag.com/smithsonian-institution/recalling-thrill-pathfinders-mission-mars-180977008/" target="_blank"><em>Smithsonian</em> magazine</a>, and the book <a href="https://airandspace.si.edu/research/publications/after-sputnik-50-years-space-age" target="_blank"><em>After Sputnik: 50 Years of the Space Age</em></a> (Smithsonian Books/HarperCollins, 2007) by Smithsonian curator Martin Collins.</p><p>NASA has numerous resources documenting the <em>Mars Pathfinder </em>mission, such as the <a href="https://science.nasa.gov/mission/mars-pathfinder/" target="_blank">mission website</a>, <a href="https://d2pn8kiwq2w21t.cloudfront.net/documents/mpf_bQIcJKD.pdf" rel="noopener noreferrer" target="_blank">fact sheet</a>, and many lovely photos (including some of <a href="https://science.nasa.gov/resource/super-resolution-view-of-barnacle-bill/" rel="noopener noreferrer" target="_blank">Barnacle Bill</a> and a composite of <a href="https://science.nasa.gov/resource/marie-curie-during-ort6/" rel="noopener noreferrer" target="_blank"><em>Marie Curie</em></a> during a prelaunch test).</p><p><a href="https://www.amazon.com/Curiosity-Inside-Mission-People-Happen/dp/1616149337" rel="noopener noreferrer" target="_blank"><em>Curiosity: An Inside Look at the Mars Rover Mission and the People Who Made It Happen</em></a> (Prometheus, 2014) by Rod Pyle and <a href="https://www.amazon.com/Roving-Mars-Spirit-Opportunity-Exploration/dp/1401301495" rel="noopener noreferrer" target="_blank"><em>Roving Mars: Spirit, Opportunity, and the Exploration of the Red Planet</em></a> (Hyperion, 2005) by planetary scientist Steve Squyres are both about later Mars missions and their rovers, but they include foundational information about <em>Sojourner</em>.</p>]]></description><pubDate>Sat, 31 May 2025 14:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/mars-pathfinder-rover</guid><category>Jpl</category><category>Mars rover</category><category>Nasa</category><category>Past forward</category><category>Planetary science</category><category>Type:departments</category><category>Pathfinder</category><dc:creator>Allison Marsh</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/photo-of-a-six-wheeled-robot-with-a-flat-metal-rectangular-top.jpg?id=60333415&amp;width=980"></media:content></item><item><title>Video Friday: Atlas Robot Sees the World</title><link>https://spectrum.ieee.org/video-friday-atlas-robot-sees-world</link><description><![CDATA[
  231. <img src="https://spectrum.ieee.org/media-library/a-boston-dynamics-robot-with-a-round-camera-equipped-head-being-adjusted-in-a-tech-environment.jpg?id=60342329&width=1500&height=2000&coordinates=180%2C0%2C181%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://smartconf.jp/content/rcar2025/">IEEE RCAR 2025</a>: 1–6 June 2025, TOYAMA, JAPAN</h5><h5><a href="https://www.edrcoalition.com/2025-energy-drone-robotics-summit">2025 Energy Drone & Robotics Summit</a>: 16–18 June 2025, HOUSTON, TX</h5><h5><a href="https://roboticsconference.org/">RSS 2025</a>: 21–25 June 2025, LOS ANGELES</h5><h5><a href="https://robotx.ethz.ch/education/summer-school.html">ETH Robotics Summer School</a>: 21–27 June 2025, GENEVA</h5><h5><a href="https://ias-19.org/">IAS 2025</a>: 30 June–4 July 2025, GENOA, ITALY</h5><h5><a href="https://clawar.org/icres2025/">ICRES 2025</a>: 3–4 July 2025, PORTO, PORTUGAL</h5><h5><a href="https://2025.worldhaptics.org/">IEEE World Haptics</a>: 8–11 July 2025, SUWON, KOREA</h5><h5><a href="https://ifac2025-msrob.com/">IFAC Symposium on Robotics</a>: 15–18 July 2025, PARIS</h5><h5><a href="https://2025.robocup.org/">RoboCup 2025</a>: 15–21 July 2025, BAHIA, BRAZIL</h5><h5><a href="https://www.ro-man2025.org/">RO-MAN 2025</a>: 25–29 August 2025, EINDHOVEN, THE NETHERLANDS</h5><h5><a href="https://clawar.org/clawar2025/">CLAWAR 2025</a>: 5–7 September 2025, SHENZHEN</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="http://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="oe1dke3cf7i"><em>For a humanoid robot to be successful and generalizable in a factory, warehouse, or even at home requires a comprehensive understanding of the world around it—both the shape and the context of the objects and environments the robot interacts with. To do those tasks with agility and adaptability, Atlas needs an equally agile and adaptable perception system.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b1816bab6f8494b71379b19f2f7ebce2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/oe1dke3Cf7I?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://bostondynamics.com/blog/making-atlas-see-the-world/">Boston Dynamics</a>]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="mcayqe7pkog"><em>What happens when a bipedal robot is placed in the back of a moving cargo truck without any support? LimX Dynamics explored this idea in a real-world test. 
During the test, TRON 1 was positioned in the compartment of a medium-sized truck. The vehicle carried out a series of demanding maneuvers—sudden stops, rapid acceleration, sharp turns, and lane changes. With no external support, TRON 1 had to rely entirely on its onboard control system to stay upright, presenting a real challenge for <a data-linked-post="2650275699" href="https://spectrum.ieee.org/video-friday-robot-dance-teacher-transformer-drone-pneumatic-reel-actuator" target="_blank">dynamic stability</a>.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="009596bae52f3b2025639533d34acf81" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/McAYQE7Pkog?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://www.limxdynamics.com/en">LimX Dynamics</a>]</p><p>Thanks, Jinyan!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="8-pz_8hqe6s"><em>We present a quiet, smooth-walking controller for quadruped guide robots, addressing key challenges for blind and low-vision (BLV) users. Unlike conventional controllers, which produce distracting noise and jerky motion, ours enables slow, stable, and human-speed walking—even on stairs. Through interviews and user studies with BLV individuals, we show that our controller reduces noise by half and significantly improves user acceptance, making quadruped robots a more viable mobility aid.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7a26c9a4605682421ddb6717bc3eeb91" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8-pz_8Hqe6s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://www.umass.edu/robotics/daros/research/guide-dog-robot">University of Massachusetts Amherst</a>]</p><p>Thanks, Julia!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="dpfqk0-z-rm"><em>RIVR, the leader in physical AI and robotics, is partnering with Veho to pilot our delivery robots in the heart of Austin, Texas. 
Designed to solve the “last-100-yard” challenge, our wheeled-legged robots navigate stairs, gates, and real-world terrain to deliver parcels directly to the doorstep—working alongside human drivers, not replacing them.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="da73a51acb294a3abd98a49bb3574c99" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dPFQK0-Z-rM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://www.rivr.ai/">RIVR</a>]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="6avyea_ewnk">We will have more on this robot shortly, but for now, this is all you need to know.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2ce1b2819ae5e7bdbc2687b0b537cca8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/6AvyeA_Ewnk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://pintobotics.substack.com/">Pintobotics</a>]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ezbm594t3c4">Some pretty awesome <a data-linked-post="2671184284" href="https://spectrum.ieee.org/ai-institute" target="_blank">quadruped parkour</a> here—haven’t seen the wall running before.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="68cbe70a242882376dbdfe7c75045fe7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/EZbM594T3c4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://www.science.org/doi/10.1126/scirobotics.ads6192">Paper</a>] via [<a href="https://www.science.org/journal/scirobotics" target="_blank">Science Robotics</a>]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="wkcdp41_4m8">This is fun, and also useful, because it’s all about recovering from unpredictable and forceful impacts.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="931d3f4445a0ce56c02dd3e5f1300086" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/WKCDP41_4m8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>What is that move at 0:06, though?! 
Wow.</p><p>[<a href="https://www.unitree.com/mobile/boxing/">Unitree</a>]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="qwmuo91drji">Maybe an option for all of those <a data-linked-post="2655919083" href="https://spectrum.ieee.org/social-robots-children" target="_blank">social robots</a> that are now not social?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d550874508203107b36d676f522f944b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/QwmuO91drJI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://www.robohearts.eu/">RoboHearts</a>]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ylssuetkmha">Oh, good, another robot I want nowhere near me.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d9ad398b2fc26a39029d5e185133bfb7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/yLssUETKmHA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://www.softrobotics.dk/">SDU Biorobotics Lab, University of Southern Denmark</a>]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ergit1fglck">While this “has become the first humanoid robot to skillfully use chopsticks,” I’m pretty skeptical of the implied autonomy. Also, those chopsticks are cheaters.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b8a648ba60752b47748ec4ac07db70af" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ergiT1fglCk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://www.robotera.com/en/">ROBOTERA</a>]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="xhec98zkiho">Looks like Westwood Robotics had a fun time at <a data-linked-post="2669279706" href="https://spectrum.ieee.org/video-friday-icra40" target="_blank">ICRA</a>!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6e1596cda9a3f8c5d183d670f058fece" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xhEc98ZkIho?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://www.westwoodrobotics.io/">Westwood Robotics</a>]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="9pq2zg19hgg"><em>Tessa Lau, CEO and co-founder of Dusty Robotics, delivered a plenary session (keynote) at the 2025 IEEE International Conference on Robotics & Automation (ICRA) in May 2025.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7f1e331cc0ac5ea071814aedc85e9fc5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/9pq2ZG19hGg?rel=0" 
style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://www.dustyrobotics.com/">Dusty Robotics</a>]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 30 May 2025 15:30:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-atlas-robot-sees-world</guid><category>Video friday</category><category>Robotics</category><category>Quadruped robots</category><category>Social robots</category><category>Icra</category><category>Humanoid robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-boston-dynamics-robot-with-a-round-camera-equipped-head-being-adjusted-in-a-tech-environment.jpg?id=60342329&amp;width=980"></media:content></item><item><title>Self-Adapting Drones for Unpredictable Worlds</title><link>https://content.knowledgehub.wiley.com/empowering-drone-security-with-embodied-ai/</link><description><![CDATA[
  232. <img src="https://spectrum.ieee.org/media-library/tii-logo.png?id=31835447&width=980"/><br/><br/><p>As drones evolve into critical agents across defense, disaster response, and infrastructure inspection, they must become more adaptive, secure, and resilient. Traditional AI methods fall short in the face of real-world unpredictability. This whitepaper from the Technology Innovation Institute (TII) explores how Embodied AI (AI that integrates perception, action, memory, and learning in dynamic environments) can revolutionize drone operations. Drawing on innovations in GenAI, Physical AI, and zero-trust frameworks, TII outlines a future where drones can perceive threats, adapt to change, and collaborate safely in real time. The result: smarter, safer, and more secure autonomous aerial systems.</p><p><span><a href="https://content.knowledgehub.wiley.com/empowering-drone-security-with-embodied-ai/" target="_blank">Download this free whitepaper now!</a></span></p>]]></description><pubDate>Thu, 29 May 2025 20:00:11 +0000</pubDate><guid>https://content.knowledgehub.wiley.com/empowering-drone-security-with-embodied-ai/</guid><category>Artificial intelligence</category><category>Disaster response</category><category>Drones</category><category>Type:whitepaper</category><dc:creator>Technology Innovation Institute</dc:creator><media:content medium="image" type="image/png" url="https://assets.rbl.ms/31835447/origin.png"></media:content></item><item><title>Hyundai’s Metaplant Seeks Hard-Working Robots</title><link>https://spectrum.ieee.org/hyundai-metaplant-georgia</link><description><![CDATA[
  233. <img src="https://spectrum.ieee.org/media-library/men-working-on-a-car-frame-in-a-hyundai-factory-large-robotic-arms-are-visible-in-the-background.jpg?id=60311244&width=1500&height=2000&coordinates=1259%2C0%2C1259%2C0"/><br/><br/><p>Less than three years ago, these were bare fields in humble Ellabell, Georgia. Today, the vast Hyundai Motor Group Metaplant is exactly what people imagine when they talk about the future of EV and automobile manufacturing in America.</p><p>I’ve driven the <a href="https://www.hyundaiusa.com/us/en/vehicles/ioniq-9" rel="noopener noreferrer" target="_blank"><u>2026 Hyundai Ioniq9</u></a>, a striking three-row electric SUV, here from nearby Savannah. It has everything it takes to succeed in today’s market: up to 530 kilometers (335 miles) of efficient driving range, the latest features and tech, and a native NACS connector that lets owners—finally—hook into <a data-linked-post="2662664938" href="https://spectrum.ieee.org/tesla-supercharger-network-ford-gm" target="_blank">Tesla Superchargers</a> with streamlined <a href="https://driivz.com/glossary/plug-and-charge/" rel="noopener noreferrer" target="_blank"><u>Plug and Charge ease</u></a>.</p><p>The success of the Ioniq9 and popular Ioniq5 crossover is deeply intertwined with the US $7.6 billion Metaplant, whose inaugural 2025 Ioniq5 rolled off its assembly line in October. That includes the Ioniq models’ full eligibility for <a href="https://www.energy.gov/save/electric-vehicles" target="_blank">$7,500 consumer tax credits</a> for U.S.-built EVs with North American batteries, although the credits are on the Trump administration’s <a href="https://spectrum.ieee.org/trump-tech-policy" target="_blank">chopping block</a>. Still, the factory gives Hyundai a bulwark and some breathing room against potential tariffs and puts the South Korean automaker ahead of many rivals.</p><h2>America’s Largest EV Plant</h2><p>With 11 cavernous buildings and a massive 697,000 square meters (7.5 million square feet) of space, it’s set to become America’s largest dedicated plant for EVs and hybrids, with capacity for 500,000 Hyundai, Kia, and Genesis models per year. (Tesla’s Texas Gigafactory can produce 375,000.) Company executives say this is North America’s most heavily automated factory, bar none, a showcase for AI and robotic tech.</p><p>The factory is also environmentally friendly, as I see when I roll in: “Meta Pros,” as Hyundai calls its workers, can park in nearly 1,900 spaces beneath solar roofs that shield them from the baking Georgia sun and provide up to 5 percent of the plant’s electricity. The automaker has a target of obtaining 100 percent of its energy from renewable sources. Those efforts include Hyundai’s <a href="https://ecv.hyundai.com/global/en/products/xcient-fuel-cell-truck-fcev" target="_blank">Xcient</a> trucks, the world’s first commercialized hydrogen fuel-cell semis. A fleet of 21 trucks hauls parts here from area suppliers, taking advantage of 400-kilometer driving ranges with zero tailpipe emissions. The bulk of finished vehicles are shipped by rail rather than truck, trimming fossil-fuel emissions and the automaker’s carbon footprint.</p><p>At the docks, some of the plant’s 850 robots unload parts from the hydrogen trucks. About 300 automated guided vehicles, or AGVs, glide around the factory with no tracks required, smartly avoiding human workers. 
As part of an AI-based procurement and logistics system, the AGVs automatically allocate and ferry parts to their proper workstations for just-in-time delivery, saving space, time, and money otherwise used to stockpile parts.</p><p>“They’re delivering the right parts to the right station at the right time, so you’re no longer relying on people to make decisions,” says Jerry Roach, senior manager of general assembly.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt='The building blocks of any car chassis, called "bodies in white," are welded by an army of 475 robots' class="rm-shortcode" data-rm-shortcode-id="2558d822d80a8fc8a78c07f5c85292ce" data-rm-shortcode-name="rebelmouse-image" id="1f34c" loading="lazy" src="https://spectrum.ieee.org/media-library/the-building-blocks-of-any-car-chassis-called-bodies-in-white-are-welded-by-an-army-of-475-robots.jpg?id=60311254&width=980"/><small autocomplete="off" class="image-media media-caption" placeholder="Add Photo Caption...">The building blocks of a modern unibody car chassis, called “bodies in white,” are welded by an army of 475 robots at Hyundai’s new plant.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Hyundai</small></p><p>I’ve seen AGVs in action around the world, but the Metaplant shows me a new trick: A pair of sled-like AGVs slide below these electric Hyundais as they roll off the line. They grab the cars by their wheels, hoist them, and autonomously ferry the finished Hyundais to a parking area, with no need for a human driver.</p><h2>Robotic Innovations in Hyundai Factories</h2><p>Some companies have strict policies about pets at work. Here, <a href="https://bostondynamics.com/products/spot/" target="_blank">Spots</a>—robotic quadrupeds <span>designed by Hyundai-owned Boston Dynamics—use 360-degree vision and “athletic intelligence” to sniff out potential defects on car welds. Those four-legged friends may soon have a biped partner: Atlas, the humanoid robot from Boston Dynamics whose </span><a href="https://www.youtube.com/watch?v=I44_zbEwz_w" target="_blank"><u>breathtaking physical skills</u></a>—<span>including crawling, cartwheeling, and even breakdance moves—have observers wondering if autoworkers are next in line to be replaced by AI. Hyundai executives say that’s not the case, even as they plan to deploy Atlas models (non-union, of course) throughout their global factories. With RGB cameras in their charming 360-degree swiveling heads, Atlas robots are being trained to sense their environments, avoid collisions, and manipulate and move parts in factories in impressively complex sequences.</span></p><p>The welding shop alone houses 475 industrial robots, among about 850 in total. I watch massive robots cobble together “<a href="https://en.wikipedia.org/wiki/Body_in_white" target="_blank">bodies in white,</a>” the building blocks of every car chassis, with ruthless speed and precision. A trip to the onsite steel stamping plant reveals a facility so quiet that no ear protection is required. 
Here, a whirling mass of robots stamps out roofs, fenders, and hoods, which are automatically stored in soaring racks overhead.</p><p>Roach says the Metaplant offered a unique opportunity to design an electrified car plant from the ground up, rather than retrofit an existing factory that made internal-combustion cars, which even Tesla and Rivian were forced to do in California and Illinois, respectively.</p><p>Regarding automation replacing human workers, Roach acknowledges that some of it is inevitable. But robots are also freeing humans from heavy lifting and repetitive, mindless tasks that, for decades, made factory work both hazardous and unfulfilling.</p><p>He offers a technical first as an example: A collaborative robot—sophisticated enough to work alongside humans with no physical separation for safety—installs bulky doors on the assembly line. It’s a notoriously cumbersome process to perform without scratching the pretty paint on a door or surrounding panels.</p><p>“Guess what? Robots do that perfectly,” Roach says. “They always put the door in the exact same place. So here, that technology makes sense.”</p><p>It also frees people to do what they’re best at: precision tasks that require dexterous fingers, vision, intelligence, and skill. <span>“I want my people doing craftsmanship,” Roach says.</span></p><p>The plant currently employs 1,340 Meta Pros at an annual average pay of $58,100. That’s 25 percent higher than average in Bryan County, Ga. Hyundai’s annual local payroll has already reached $497 million. The company foresees an eventual 8,500 jobs on site and another 7,000 indirect jobs for local suppliers and businesses.</p><p>On the battery front, Hyundai is currently sourcing cells from Georgia and SK On, with some Ioniq5 batteries imported from Hungary. But the Metaplant campus includes the HL-GA battery company. The $4 billion plant, a joint operation with LG Energy Solution, plans to produce nickel-cobalt-manganese cells beginning next year, assembled into packs on site by Hyundai’s Mobis subsidiary. Hyundai is also on track to open a second $5 billion battery plant in Georgia, a joint operation with SK On. It’s all part of Hyundai’s planned $21 billion in U.S. investment between now and 2028—more than the $20 billion it invested since entering the U.S. market in 1986. Even a robot could crunch those numbers and come away impressed.</p>]]></description><pubDate>Wed, 28 May 2025 14:53:29 +0000</pubDate><guid>https://spectrum.ieee.org/hyundai-metaplant-georgia</guid><category>Boston dynamics</category><category>Electric vehicles</category><category>Hyundai</category><category>Industrial robots</category><category>Spot robot</category><dc:creator>Lawrence Ulrich</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/men-working-on-a-car-frame-in-a-hyundai-factory-large-robotic-arms-are-visible-in-the-background.jpg?id=60311244&amp;width=980"></media:content></item><item><title>Video Friday: Flying Robot SPIDAR</title><link>https://spectrum.ieee.org/video-friday-flying-robot-spidar</link><description><![CDATA[
  234. <img src="https://spectrum.ieee.org/media-library/drone-with-movable-arms-hovering-in-a-test-room.png?id=60308417&width=1500&height=2000&coordinates=555%2C0%2C555%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://humanoidssummit.com/">London Humanoids Summit</a>: 29–30 May 2025, LONDON</h5><h5><a href="https://smartconf.jp/content/rcar2025/">IEEE RCAR 2025</a>: 1–6 June 2025, TOYAMA, JAPAN</h5><h5><a href="https://www.edrcoalition.com/2025-energy-drone-robotics-summit">2025 Energy Drone & Robotics Summit</a>: 16–18 June 2025, HOUSTON, TX</h5><h5><a href="https://roboticsconference.org/">RSS 2025</a>: 21–25 June 2025, LOS ANGELES</h5><h5><a href="https://robotx.ethz.ch/education/summer-school.html">ETH Robotics Summer School</a>: 21–27 June 2025, GENEVA</h5><h5><a href="https://ias-19.org/">IAS 2025</a>: 30 June–4 July 2025, GENOA, ITALY</h5><h5><a href="https://clawar.org/icres2025/">ICRES 2025</a>: 3–4 July 2025, PORTO, PORTUGAL</h5><h5><a href="https://2025.worldhaptics.org/">IEEE World Haptics</a>: 8–11 July 2025, SUWON, KOREA</h5><h5><a href="https://ifac2025-msrob.com/">IFAC Symposium on Robotics</a>: 15–18 July 2025, PARIS</h5><h5><a href="https://2025.robocup.org/">RoboCup 2025</a>: 15–21 July 2025, BAHIA, BRAZIL</h5><h5><a href="https://www.ro-man2025.org/">RO-MAN 2025</a>: 25–29 August 2025, EINDHOVEN, THE NETHERLANDS</h5><h5><a href="https://clawar.org/clawar2025/">CLAWAR 2025</a>: 5–7 September 2025, SHENZHEN</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="http://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="wobbfg9cadc"><em>This is our latest work about a hybrid aerial-terrestrial quadruped robot called SPIDAR, which shows a unique grasping style in midair. This work has been presented in the 2025 IEEE International Conference on Robotics & Automation (ICRA).</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="67fe508b4482f7426310b93e031c363b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wObBfg9Cadc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://www.dragon.t.u-tokyo.ac.jp/">DRAGON Lab</a>]</p><p>Thanks, Moju!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="g-ikmihw4ze"><em>These wormlike soft robots can intertwine into physically entangled “blobs,” like living California blackworms. 
Both the robots and the living worms can operate individually as well as collectively as a blob, carrying out functions like directed movement and transporting objects.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="535b70b935251e6883d6a1276d4f19f0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/G-iKMihW4ZE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://del.seas.harvard.edu/">Designing Emergence Lab</a>]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="dk7k8bcqevs"><em>At only 3 centimeters tall, Zippy, the world’s smallest bipedal robot, is also self-contained: all the controls, power, and motor are on board so that it operates autonomously. Moving at 10 leg lengths per second, it is also the fastest bipedal robot [relative to its size].</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="360fafc82c5a0eec5f5183fc75dfb31f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DK7K8bcQevs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://engineering.cmu.edu/news-events/news/2025/05/12-smallest-bipedal-robot.html">CMU</a>]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="vtgtae53ydm">Spot is getting some AI upgrades to help it with industrial inspection.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1a080250de7803543aab069b8cdb68de" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vTgTAe53YdM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://bostondynamics.com/blog/see-your-facility/">Boston Dynamics</a>]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="de2vqykdztm"><em>A 3D-printed sphere that can morph from smooth to dimpled on demand could help researchers improve how underwater vehicles and aircraft maneuver. 
Inspired by a golf ball aerodynamics problem, Assistant Professor of Naval Architecture and Marine Engineering and Mechanical Engineering Anchal Sareen and her team applied soft robotic techniques with fluid dynamics principles to study how different dimple depths at different flow velocities could reduce an underwater vehicle’s drag, as well as allow it to maneuver without fins and rudders.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="800b5b5b8558ea6f734e59930eb73b55" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/De2VQYkdztM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://news.engin.umich.edu/2025/05/maneuverable-underwater-vehicles-inspired-by-golf-balls/">UMich</a>]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="dku0pl1lfq8"><em>Tool use is critical for enabling robots to perform complex real-world tasks, and leveraging human tool-use data can be instrumental for teaching robots. However, existing data-collection methods like teleoperation are slow, prone to control delays, and unsuitable for dynamic tasks. In contrast, human play—where humans directly perform tasks with tools—offers natural, unstructured interactions that are both efficient and easy to collect. Building on the insight that humans and robots can share the same tools, we propose a framework to transfer tool-use knowledge from human play to robots.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="19aa6290de139a2fc9421fe1b9bbb733" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dkU0Pl1LFq8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://tool-as-interface.github.io/">Tool as Interface</a>]</p><p>Thanks, Haonan!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="uzpq2bfqw1s"><em>UR15 is our new high-performance collaborative robot. UR15 is engineered for ultimate versatility, combining a lightweight design with a compact footprint to deliver unmatched flexibility—even in the most space-restricted environments. 
It reaches an impressive maximum speed of 5 meters per second, which ultimately enables reduced cycle times and increased productivity, and is designed to perform heavy-duty tasks while delivering speed and precision wherever you need it.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="64ec9f20bc2971a1a8e7c305712cc73c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UZpq2Bfqw1s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://www.universal-robots.com/products/ur15/">Universal Robots</a>]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="c_wzvc2epys"><em>Debuting at the 2025 IEEE International Conference on Robotics & Automation (May 19–23, Atlanta, USA), this interactive art installation features buoyant bipedal robots—composed of helium balloons and articulated legs—moving freely within a shared playground in the exhibition space. Visitors are invited to engage with the robots via touch, gamepads, or directed airflow, influencing their motion, color-changing lights, and expressive behavior.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1a7b724f15734e302d8e249f5ff14ce1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/C_wZvC2EpYs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://www.romela.org/robots/">RoMeLa</a>]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="v6jkzltrces"><em>We gave TRON 1 an arm. Now, it’s faster, stronger, and ready for whatever the terrain throws at it.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="60b78c8ab6440a4285e1ff43c573edff" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/v6JkZlTRCEs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://www.limxdynamics.com/tron1">LimX Dynamics</a>]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="xg4y5a7r4xw"><em>Humanoid robots can support human workers in physically demanding environments by performing tasks that require whole-body coordination, such as lifting and transporting heavy objects. These tasks, which we refer to as Dynamic Mobile Manipulation (DMM), require the simultaneous control of locomotion, manipulation, and posture under dynamic interaction forces. 
This paper presents a teleoperation framework for DMM on a height-adjustable wheeled humanoid robot for carrying heavy payloads.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c781615fb3fe1f66034b05260c42d10b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xG4y5A7r4Xw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://publish.illinois.edu/robodesign/">RoboDesign Lab</a>]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="qe9qscf-d88"><em>Yoshua Bengio—the world’s most-cited computer scientist and a “godfather” of artificial intelligence—is deadly concerned about the current trajectory of the technology. As AI models race toward full-blown agency, Bengio warns that they’ve already learned to deceive, cheat, self-preserve, and slip out of our control. Drawing on his groundbreaking research, he reveals a bold plan to keep AI safe and ensure that human flourishing, not machines with unchecked power and autonomy, defines our future.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7df429e0cf02e5ca1134b42f577005df" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/qe9QSCF-d88?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[<a href="https://www.ted.com/talks/yoshua_bengio_will_ai_make_humans_extinct">TED</a>]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 23 May 2025 15:30:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-flying-robot-spidar</guid><category>Video friday</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/drone-with-movable-arms-hovering-in-a-test-room.png?id=60308417&amp;width=980"></media:content></item><item><title>Video Friday: Robot Battlefield Triage</title><link>https://spectrum.ieee.org/video-friday-darpa-triage-challenge</link><description><![CDATA[
  235. <img src="https://spectrum.ieee.org/media-library/robot-dog-and-human-in-a-dark-room-with-holographic-star-like-lights.png?id=60250560&width=1500&height=2000&coordinates=555%2C0%2C555%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2025.ieee-icra.org/">ICRA 2025</a>: 19–23 May 2025, ATLANTA, GA</h5><h5><a href="https://humanoidssummit.com/">London Humanoids Summit</a>: 29–30 May 2025, LONDON</h5><h5><a href="https://smartconf.jp/content/rcar2025/">IEEE RCAR 2025</a>: 1–6 June 2025, TOYAMA, JAPAN</h5><h5><a href="https://www.edrcoalition.com/2025-energy-drone-robotics-summit">2025 Energy Drone & Robotics Summit</a>: 16–18 June 2025, HOUSTON</h5><h5><a href="https://roboticsconference.org/">RSS 2025</a>: 21–25 June 2025, LOS ANGELES</h5><h5><a href="https://robotx.ethz.ch/education/summer-school.html">ETH Robotics Summer School</a>: 21–27 June 2025, GENEVA</h5><h5><a href="https://ias-19.org/">IAS 2025</a>: 30 June–4 July 2025, GENOA, ITALY</h5><h5><a href="https://clawar.org/icres2025/">ICRES 2025</a>: 3–4 July 2025, PORTO, PORTUGAL</h5><h5><a href="https://2025.worldhaptics.org/">IEEE World Haptics</a>: 8–11 July 2025, SUWON, SOUTH KOREA</h5><h5><a href="https://ifac2025-msrob.com/">IFAC Symposium on Robotics</a>: 15–18 July 2025, PARIS</h5><h5><a href="https://2025.robocup.org/">RoboCup 2025</a>: 15–21 July 2025, BAHIA, BRAZIL</h5><h5><a href="https://www.ro-man2025.org/">RO-MAN 2025</a>: 25–29 August 2025, EINDHOVEN, THE NETHERLANDS</h5><h5><a href="https://clawar.org/clawar2025/">CLAWAR 2025</a>: 5–7 September 2025, SHENZHEN, CHINA</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="http://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="x11efid6ghw"><em>Behind the scenes at DARPA Triage Challenge Workshop 2 at the Guardian Centers in Perry, Ga.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d69d0bbe38a05625f9c13fefd26c3b62" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/x11EfId6Ghw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://triagechallenge.darpa.mil/">DARPA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="4doigyceqxe"><em>Watch our coworker in action as he performs high-precision stretch routines enabled by 31 degrees of freedom. 
Designed for dynamic adaptability, this is where robotics meets real-world readiness.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="09afd0847794fb832520df0f8608021f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/4DOiGYcEqXE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.limxdynamics.com/en">LimX Dynamics</a> ]</p><p>Thanks, Jinyan!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="zmdhabkwucw"><em>Featuring a lightweight design and continuous operation capabilities under extreme conditions, LYNX M20 sets a new benchmark for intelligent robotic platforms working in complex scenarios.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="42830e67b201ebac77d9e93dc1535986" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zMDHABkwUCw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en/index/lynx.html">DEEP Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="g26ymjrrvm4">The sound in this video is either excellent or terrible, I’m not quite sure which.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4d31f502ea2106ea278cebf547f3f3af" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/g26yMjrrVm4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://rbo.gitlab-pages.tu-berlin.de/pages/acoustic-jamming-page/">TU Berlin</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ofsvj5-fyzg"><em>Humanoid loco-manipulation holds transformative potential for daily service and industrial tasks, yet achieving precise, robust whole-body control with 3D end-effector force interaction remains a major challenge. Prior approaches are often limited to lightweight tasks or quadrupedal/wheeled platforms. To overcome these limitations, we propose FALCON, a dual-agent reinforcement-learning-based framework for robust force-adaptive humanoid loco-manipulation.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="07f2d79f34d29b49170fc1d735a060ff" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/OfsvJ5-Fyzg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://lecar-lab.github.io/falcon-humanoid/">FALCON</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="k2k5sfq9uui"><em>An MRSD Team at the CMU Robotics Institute is developing a robotic platform to map environments through perceptual degradation, identify points of interest, and relay that information back to first responders. 
The goal is to reduce information blindness and increase safety.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="668b891460b7ef9343bd0b9bfdd4b72c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/k2K5sfq9UUI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://mrsdprojects.ri.cmu.edu/2025teamg/the-team/">Carnegie Mellon University</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="dlt6vhx4dz4"><em>We introduce an eldercare robot (E-BAR) capable of lifting a human body, assisting with postural changes/ambulation, and catching a user during a fall, all without the use of any wearable device or harness. With a minimum width of 38 centimeters, the robot’s small footprint allows it to navigate the typical home environment. We demonstrate E-BAR’s utility in multiple typical home scenarios that elderly persons experience, including getting into/out of a bathtub, bending to reach for objects, sit-to-stand transitions, and ambulation.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e3897236965b9ce003388f058b890440" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DlT6vHx4Dz4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://news.mit.edu/2025/eldercare-robot-helps-people-sit-stand-catches-them-fall-0513">MIT</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="wakqoz5f65a"><em>Sanctuary AI had the pleasure of accompanying Microsoft to Hannover Messe, where we demonstrated how our technology is shaping the future of work with autonomous labor powered by physical AI and general-purpose robots.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="441b21211a81c7c41263bfc6ae442890" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wAKQoz5F65A?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.sanctuary.ai/blog/if-you-missed-messe">Sanctuary AI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="aqoabdeftoi"><em>Watch how drywall finishing machines incorporate collaborative robots, and learn why Canvas chose the Universal Robots platform.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5d8e6492127989a6ea22e1d01e594532" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/AQOabdEftoI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.canvas.build/">Canvas</a> ] via [ <a href="https://www.universal-robots.com/case-stories/canvas/">Universal Robots</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="fzee5hmltuu"><em>We’ve officially put a stake in the ground in Dallas–Fort 
Worth. Torc’s new operations hub is open for business—and it’s more than just a dot on the map. It’s a strategic launchpad as we expand our <a data-linked-post="2656423157" href="https://spectrum.ieee.org/parallel-systems-autonomous-trains" target="_blank">autonomous freight network</a> across the southern United States.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="acba3ddaf7241c18f329e7446130cf5b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FZEe5hmltUU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://torc.ai/torc-opens-first-autonomous-hub-fort-worth-celebrates-commercialization-era/">Torc</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="vmve4bq-cl8">This <a data-linked-post="2650255165" href="https://spectrum.ieee.org/next-big-thing-in-silicon-valley-robotics" target="_blank">Stanford Robotics Center</a> talk is by Jonathan Hurst at Agility Robotics, on “Humanoid Robots: From the Warehouse to Your House.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="195c857c27baa6594751cc108703e20a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/VMVE4bq-CL8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>How close are we to having safe, reliable, useful in-home humanoids? If you believe recent press, it’s just around the corner. Unquestionably, advances in AI and robotics are driving innovation and activity in the sector; it truly is an exciting time to be building robots! But what does it really take to execute on the vision of useful, human-centric, multipurpose robots? Robots that can operate in human spaces, predictably and safely? We think it starts with humanoids in warehouses, an unsexy but necessary beachhead market to our future with robots as part of everyday life. I’ll talk about why a humanoid is more than a sensible form factor, it’s inevitable; and I will speak to the excitement around a ChatGPT moment for robotics, and what it will take to leverage AI advances and innovation in robotics into useful, safe humanoids.</em></blockquote><p>[ <a href="https://ee.stanford.edu/event/04-09-2025/humanoids-warehouse-your-house">Stanford</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 16 May 2025 17:30:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-darpa-triage-challenge</guid><category>Video friday</category><category>Robotics</category><category>Humanoid robots</category><category>Grippers</category><category>Firefighting robots</category><category>Darpa triage challenge</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/robot-dog-and-human-in-a-dark-room-with-holographic-star-like-lights.png?id=60250560&amp;width=980"></media:content></item><item><title>Giant Robotic Bugs Are Headed to Farms</title><link>https://spectrum.ieee.org/ground-control-robot-insects</link><description><![CDATA[
  236. <img src="https://spectrum.ieee.org/media-library/robotic-centipede-with-black-shells-and-exposed-components-on-a-gravel-path.jpg?id=60160055&width=1500&height=2000&coordinates=679%2C0%2C679%2C0"/><br/><br/><p>
  237. Being long and skinny and wiggly is a strategy that’s been wildly successful for animals, ever since there have <em>been</em> animals, more or less. Roboticists, eternally jealous of biology, have taken notice of this, and have spent decades trying to build robotic versions of <a href="https://spectrum.ieee.org/tag/snake-robots" target="_self"><u>snakes</u></a>, <a href="https://spectrum.ieee.org/epfl-pleurobot-robotic-salamander" target="_self"><u>salamanders</u></a>, <a href="https://spectrum.ieee.org/underground-power-lines-robots" target="_self"><u>worms</u></a>, and more. There’s been some success, of a sort, although most of the robotic snakes and whatnot that we’ve seen have been for things like <a href="https://spectrum.ieee.org/cmu-snake-robot-mexico-earthquake" target="_self"><u>disaster relief</u></a>, which is kind of just what you do when you have a robot with a novel movement strategy but without any other obvious practical application.
  238. </p><p>
  239. <a href="https://research.gatech.edu/people/daniel-goldman" rel="noopener noreferrer" target="_blank"><u>Dan Goldman at Georgia Tech</u></a> has been working on bioinspired robotic locomotion <a href="https://crablab.gatech.edu/index.html" rel="noopener noreferrer" target="_blank"><u>for as long as anyone</u></a>, and as it turns out, that’s exactly the amount of time that it takes to develop a long and skinny and wiggly robot with a viable commercial use case. Goldman has a new Atlanta-based startup called <a href="https://groundcontrolrobotics.com/" rel="noopener noreferrer" target="_blank"><u>Ground Control Robotics</u></a> (GCR) that’s bringing what are essentially giant robotic arthropods to agricultural crop management.</p><hr/><p class="shortcode-media shortcode-media-youtube">
  240. <span class="rm-shortcode" data-rm-shortcode-id="37f645358fb2003bcb59d177ed8b330b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Kvm4AK_z_0c?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  242. </p><p>
  243. I’m not entirely sure what you’d call this—a robotic <a href="https://en.wikipedia.org/wiki/Arthropleura" rel="noopener noreferrer" target="_blank"><u>giant centipede</u></a> might be the easiest description to agree on, I guess? But Goldman tells us that he doesn’t consider his robots to be bioinspired as much as they’re “robophysical” models of living systems. “I like the idea of carefully studying the animals,” Goldman says. “We use the models to test biological principles, discover new phenomena with them, and then bring those insights into hardened robots which can go outside of the lab.”
  244. </p><h2>Centipede Robots for Crop Management</h2><p>
  245. The robot itself is not that complicated, at least on the scale of how complicated robots usually are. It’s made up of a head with some sensors in it plus a handful of identical cable-connected segments, each with a couple of motors for leg actuation. On paper, this works out to be a lot of degrees of freedom, but you can get surprisingly good performance using relatively simple control techniques.
  246. </p><p>
  247. “Centipede robots, like snake robots, are basically swimmers,” Goldman says. The key difference is that legs expand the range of environments through which these swimming robots can move: The right pattern of lifting and lowering the legs generates a fluidlike thrust force, letting the robot push off of more of its surroundings so that its motion is more consistent and reliable. “We created a new kind of mechanism to take actuation away from the centerline of the robot to the sides, using cables back and forth,” says Goldman. “When you tune things properly, the robot goes from being stiff to unidirectionally compliant. And if you do that, what you find is almost like magic—this thing swims through arbitrarily complex environments <a href="https://crablab.gatech.edu/pages/publications/pdf/tianyu2023combined.pdf" rel="noopener noreferrer" target="_blank"><u>with no brain power</u></a>.”
  248. </p><p>
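That “no brain power” control is easier to appreciate with a concrete sketch. The snippet below is a hypothetical illustration of an open-loop wave gait, not GCR’s actual controller, and the segment count, angles, and frequency are invented for the example. Each segment’s legs simply follow a sine wave that is phase-shifted from the segment ahead of it, so a wave of stepping travels down the body with no sensing in the loop:</p>
<pre><code>
# Hypothetical open-loop wave gait for a centipede-style robot.
# Each segment's leg pair follows a sine wave that is phase-shifted
# from its neighbor, so a stepping wave travels down the body.
import math

N_SEGMENTS = 8       # assumed segment count; the real robots may differ
WAVE_NUMBER = 1.5    # how many full waves fit along the body
FREQ_HZ = 1.0        # stepping frequency

def leg_setpoints(t):
    """Return (lift, sweep) angles in radians for each segment at time t."""
    setpoints = []
    for i in range(N_SEGMENTS):
        phase = 2.0 * math.pi * (FREQ_HZ * t - WAVE_NUMBER * i / N_SEGMENTS)
        lift = 0.3 * math.sin(phase)                   # raise and lower the leg
        sweep = 0.5 * math.sin(phase - math.pi / 2.0)  # stroke fore/aft, a quarter cycle behind lift
        setpoints.append((lift, sweep))
    return setpoints

# Streaming these setpoints to the leg motors at a fixed rate (say, 50 Hz)
# produces a traveling wave; forward thrust emerges from the legs pushing
# off whatever they happen to touch.
</code></pre><p>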
  249. The complex environments that the robot is designed for are agricultural. Think sensing and weed control in fields, but <em>don’t</em> think about gentle rolling hills lined with neat rows of crops. That kind of farming is very amenable to automation at scale, and there are plenty of robotics companies in that space already. Not all plants grow in well-kept rows on mostly flat ground, however: Perennial crops, where the plant itself sticks around and you harvest stuff off of it every year, can be much more complicated to manage. This is especially true for crops like wine grapes, which can grow on very steep and often rocky slopes. Those kinds of environments are an opportunity for GCR’s robots, offering an initial use case that brings the robot from academic curiosity to something with unique commercial potential.
  250. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  251. <img alt="Black robotic rover on grassy field near budding plant and vineyard." class="rm-shortcode" data-rm-shortcode-id="ff22bd7628bd369b343bf5ef5b54037b" data-rm-shortcode-name="rebelmouse-image" id="0bb95" loading="lazy" src="https://spectrum.ieee.org/media-library/black-robotic-rover-on-grassy-field-near-budding-plant-and-vineyard.jpg?id=60160057&width=980"/>
  252. <small class="image-media media-caption" placeholder="Add Photo Caption...">Wiggly antennae-like structures help the robot to climb over obstacles taller than itself.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Ground Control Robotics</small>
  253. </p><p>
  254. “Robotics researchers tend to treat robots as one-off demonstrations of a theory or principle,” Goldman says. “You get the darn thing to work, you submit it to [the <a href="https://www.ieee-ras.org/conferences-workshops/fully-sponsored/icra" target="_blank">International Conference on Robotics and Automation</a>], and then you go on to the next thing. But we’ve had to build in robustness from the get-go, because our robots are experimental physics tools.” Much of the research that Goldman does in his lab is on using these robophysical models to try to systematically test and (hopefully) understand how animals move the way that they do. “And that’s where we started to see that we could have these robots not just be laboratory toys,” says Goldman, “but that they could become a minimum viable product.”
  255. </p><h2>Automated Weed-Control Solutions</h2><p>
  256. According to GCR, there is currently no automated solution for weed control around scraggly, bushy, or vinelike plants (like blueberries, strawberries, or grapes), and farmers can spend an enormous amount of money having humans crawl around under the plants to check plant health and pull weeds. GCR estimates that weed control for blueberries in California can run US $300 per acre or more, and strawberries are even worse, sometimes more than $1,000 per acre. It’s not a fun job, and it’s getting increasingly difficult to find humans willing to do it. For farmers who don’t want to spray pesticides, there aren’t a lot of good options, and GCR thinks that its robotic centipedes could fill that niche.
  257. </p><p>
  258. An obvious question with any novel robotic mobility system is whether you could accomplish basically the same thing with a system that’s much less novel. Like, quadrupeds are getting pretty good these days, so why not just use one of them? Or a wheeled robot, for that matter? “We want to send the robot as close to the crops as possible,” says Goldman. “And we don’t want a bigger, clunkier machine to destroy those fields.” This gets back to the clutter problem: A robot large enough to ignore the clutter is likely to damage the crops, while a robot small enough to avoid causing damage presents a nightmare of a control problem.
  259. </p><p>
  260. When most of the obstacles that robots encounter are at a comparable scale to themselves, control becomes very difficult. “The terrain reaction forces are almost impossible to predict,” explains Goldman, which means that the robot’s mobility regime gets dominated by environmental noise. One approach would be to try to model all of this noise and the resulting dynamics and implement some kind of control policy, but it turns out that there’s a much simpler strategy: more legs. “It’s possible to generate reliable motion without any sensing at all,” says Goldman, “if we have <a href="https://crablab.gatech.edu/pages/publications/pdf/Baxi_science_2023.pdf" target="_blank">a lot of legs</a>.”
  261. </p><p>
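One way to see why more legs can stand in for sensing is a back-of-the-envelope noise model (a rough illustration of the idea, not an analysis from Goldman’s papers): If each leg contact contributes a roughly equal but noisy bit of thrust, the fluctuation of the total thrust relative to its mean shrinks like one over the square root of the number of legs, so the gait averages out the terrain’s unpredictability:</p>
<pre><code>
# Toy model: total per-cycle thrust is a sum of noisy per-leg contributions,
# so relative variability falls off roughly as 1/sqrt(n_legs).
import random
import statistics

def relative_thrust_noise(n_legs, trials=10000):
    totals = []
    for _ in range(trials):
        # each leg contributes 1 unit of thrust, plus or minus 50% terrain noise
        totals.append(sum(1.0 + random.uniform(-0.5, 0.5) for _ in range(n_legs)))
    return statistics.stdev(totals) / statistics.mean(totals)

for n in (4, 8, 16, 32):
    print(n, round(relative_thrust_noise(n), 3))
# Doubling the leg count cuts the relative thrust fluctuation by about 30
# percent, one intuition for why open-loop gaits steady out with more legs.
</code></pre><p>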
  262. For this design of robot, adding more legs is easy, which is another advantage of this type of mobility over something like a quadruped. Each of GCR’s robots will cost a lot less than you probably think—likely in the thousand-dollar range, because the leg modules themselves are relatively cheap, and most of the intelligence is mechanical rather than sense-based or compute-based. The concept is that a decentralized swarm of these robots would operate in fields 24/7—just scouting for now, where there’s still a substantial amount of value, and then eventually physically ripping out weeds with some big robotic centipede jaws (or maybe even lasers!) for a lower cost than any other option.
  263. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  264. <img alt="Robotic centipede partially submerged in dry autumn leaves, displaying mechanical components" class="rm-shortcode" data-rm-shortcode-id="001d1929559aa258de6b6c75eeaf6e2a" data-rm-shortcode-name="rebelmouse-image" id="ceca5" loading="lazy" src="https://spectrum.ieee.org/media-library/robotic-centipede-partially-submerged-in-dry-autumn-leaves-displaying-mechanical-components.jpg?id=60214594&width=980"/>
  265. <small class="image-media media-caption" placeholder="Add Photo Caption...">Eventually, these robots will operate autonomously in swarms, and could also be useful for applications like disaster response.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Ground Control Robotics</small>
  266. </p><p>
  267. Ground Control Robotics is currently working with a blueberry farmer and a vineyard owner in Georgia on pilot projects to refine the mobility and sensing capabilities of the robots within the next few months. Obviously, there are options to expand into disaster relief (for real) and perhaps even military applications, although Goldman tells us that different environments might require different limb configurations or the ability to tuck the limbs away entirely. I do appreciate that GCR is starting with an application that will likely take a lot more work but also offers a lot more potential. It’s not often that we get to see such a direct transition between novel robotics research and a commercial product, and while it’s certainly going to be a challenge, I’ve already put my backyard garden on the waiting list.
  268. </p>]]></description><pubDate>Fri, 16 May 2025 12:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/ground-control-robot-insects</guid><category>Robotic insects</category><category>Bioinspired robots</category><category>Agricultural robots</category><category>Legged robots</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/robotic-centipede-with-black-shells-and-exposed-components-on-a-gravel-path.jpg?id=60160055&amp;width=980"></media:content></item><item><title>Cartwheel Robotics Wants to Build Humanoids That People Love</title><link>https://spectrum.ieee.org/cartwheel-robotics-humanoid</link><description><![CDATA[
  269. <img src="https://spectrum.ieee.org/media-library/small-humanoid-robot-with-a-white-body-and-round-head-standing-on-a-tiled-floor-indoors.jpg?id=60204075&width=1500&height=2000&coordinates=187%2C0%2C188%2C0"/><br/><br/><p>The main assumption about humanoid robotics that the industry is making right now is that the most realistic near-term pathway to actually making money is in either warehouses or factories. It’s easy to see where this assumption comes from: Repetitive tasks requiring strength or flexibility in well-structured environments are one place where it really seems like robots could thrive, and if you need to make billions of dollars (because somehow that’s how much your company is valued at), it doesn’t appear as though there are a lot of other good options.</p><p><a href="https://www.cartwheelrobotics.com/" rel="noopener noreferrer" target="_blank"><u>Cartwheel Robotics</u></a> is trying to do something different with humanoids. Cartwheel is more interested in building robots that people can connect with, with the eventual goal of general-purpose home companionship. Founder <a href="https://www.linkedin.com/in/slavalley/" target="_blank">Scott LaValley</a> describes Cartwheel’s robot as “a small, friendly humanoid robot designed to bring joy, warmth, and a bit of everyday magic into the spaces we live in. It’s expressive, emotionally intelligent, and full of personality—not just a piece of technology but a presence you can feel.”</p><hr/><p class="shortcode-media shortcode-media-rebelmouse-image">
  270. <img alt="A robot and child play with wooden blocks outdoors, near patio furniture and lush greenery." class="rm-shortcode" data-rm-shortcode-id="491d20371fb6ea3e7ef7043640e84a45" data-rm-shortcode-name="rebelmouse-image" id="1b950" loading="lazy" src="https://spectrum.ieee.org/media-library/a-robot-and-child-play-with-wooden-blocks-outdoors-near-patio-furniture-and-lush-greenery.jpg?id=60137186&width=980"/>
  271. <small class="image-media media-caption" placeholder="Add Photo Caption...">This rendering shows the design and scale of Cartwheel’s humanoid prototype.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Cartwheel</small></p><p>Historically, making a commercially viable social robot is a huge challenge. A little less than a decade ago, a series of social home robots (backed by a substantial amount of investment) tried very, very hard to justify themselves to consumers and <a href="https://spectrum.ieee.org/anki-jibo-and-kuri-what-we-can-learn-from-social-robotics-failures" target="_self"><u>did not succeed</u></a>. Whether the fundamental problems with the concept of social home robots (namely, cost and interactive novelty) have been solved at this point isn’t totally clear, but Cartwheel is making things even more difficult for themselves by going the humanoid route, legs and all. That means dealing with all kinds of problems from motion planning to balancing to safety, all in a way that’s reliable enough for the robot to operate around children.</p><p>LaValley is arguably one of the few people who could plausibly make a commercial social humanoid actually happen. His extensive background in humanoid robotics includes nearly a decade at Boston Dynamics working on the Atlas robots, followed by five years at Disney, where he led the team that developed Disney’s <a href="https://spectrum.ieee.org/how-disney-imagineering-crammed-a-humanoid-robot-into-a-groot-suit" target="_self"><u>Baby Groot robot</u></a>. </p><h2>Building Robots to Be People’s Friends</h2><p>In humanoid robot terms, there’s quite a contrast between the versions of Atlas that LaValley worked on (<a href="https://spectrum.ieee.org/atlas-drc-robot-is-75-percent-new-completely-unplugged" target="_self"><u>DRC Atlas in particular</u></a>) and Baby Groot. They’re obviously designed and built to do very different things, but LaValley says that what really struck him was how his kids reacted when he introduced them to the robots he was working on. “At Boston Dynamics, we were known for terrifying robots,” LaValley remembers. “I was excited to work on the Atlas robots because they were cool technology, but my kids would look at them and go, ‘That’s scary.’ At Disney, I brought my kids in and they would light up with a big smile on their face and ask, ‘Is that really Baby Groot? Can I give it a hug?’ And I thought, this is the type of experience I want to see robots delivering.” While Baby Groot was never a commercial project, for LaValley it marked a pivotal milestone in emotional robotics that shaped his vision for Cartwheel: “Seeing how my kids connected with Baby Groot reframed what robots could and should evoke.”</p><p>The current generation of commercial humanoids is pretty much the opposite of what LaValley is looking for. You could argue that this is because they’re designed to do work, rather than be anyone’s friend, but many of the design choices seem to be based on the sort of thing that would be the most eye-catching to the public (and investors) in a rather boringly “futuristic” way. And look, there are plenty of good reasons why you might want to very deliberately design a humanoid with commercial (or at least industrial) aspirations to look or not look a certain way, but for better or worse, nobody is going to <em><em>like</em></em> those robots. Respect them? Sure. Think they’re cool? Probably. Want to be friends with them? Not likely. 
And for Cartwheel, this is the opportunity, LaValley says. “These humanoid robots are built to be tools. They lack personality. They’re soulless. But we’re designing a robot to be a humanoid that humans will want in their day-to-day lives.”</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  272. <img alt="Cute robot vacuuming crumbs on wooden floor in modern, cozy room with wooden shelves." class="rm-shortcode" data-rm-shortcode-id="79d2bb9b767517d8d5d2754a6d2f9897" data-rm-shortcode-name="rebelmouse-image" id="901b5" loading="lazy" src="https://spectrum.ieee.org/media-library/cute-robot-vacuuming-crumbs-on-wooden-floor-in-modern-cozy-room-with-wooden-shelves.jpg?id=60137189&width=980"/>
  273. <small class="image-media media-caption" placeholder="Add Photo Caption...">Eventually, Cartwheel’s robots will likely need to be practical (as this rendering suggests) in order to find a place in people’s homes.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Cartwheel</small></p><p>Yogi is one of Cartwheel’s prototypes; LaValley describes it as having “toddler proportions,” which are the key to making it appear friendly and approachable. “It has rounded lines, with a big head, and it’s even a little chubby. I don’t see a robot when I see Yogi; I see a character.” A second prototype, called Speedy, is a bit less complicated and is intended to be more of a near-term customizable commercial platform. Think something like Baby Groot, except available as any character you like, and to companies who aren’t Disney. LaValley tells us that a version of Speedy with a special torso designed for a “particular costume” is headed to a customer in the near future.</p><p>As the previous generation of social robots learned the hard way, it takes a lot more than good looks for a robot to connect with humans over the long term. Somewhat inevitably, LaValley sees AI as one potential answer to this, since it might offer a way of preserving novelty by keeping interactions fresh. This extends beyond verbal interactions, too, and Cartwheel is experimenting with using AI for whole-body motion generation, where each robot behavior will be unique, even under the same conditions or when given the same inputs.</p><h2>Cartwheel’s Home Robots Plan</h2><p>While Cartwheel is starting with a commercial platform, the end goal is to put these small social humanoids into homes. This means considering safety and affordability in a way that doesn’t really apply to humanoids that are designed to work in warehouses or factories. The small size of Cartwheel’s robots will certainly help with both of those things, but we’re still talking about a robot that’s likely to cost a significant amount: More than a major appliance, though perhaps not as much as a new car, was as specific as LaValley was willing to get at this point. With that kind of price comes high expectations, and for most people, the only way to justify buying a home humanoid will be if it can somehow be practical as well as lovable.</p><p>LaValley is candid about the challenge here: “I don’t have all the answers,” he says. “There’s a lot to figure out.” One approach that’s becoming increasingly common with robots is to go with a service model, where the robot is essentially being rented in the same way that you might pay for the services of a housekeeper or gardener. But again, for that to make sense, Cartwheel’s robots will have to justify themselves financially. “This problem won’t be solved in the next year, or maybe not even in the next five years,” LaValley says. “There are a lot of things we don’t understand—this is going to take a while. We have to work our way to understanding and then addressing the problem set, and our approach is to find development partners and get our robots out into the real world.”</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  274. <img alt="A glowing robot sits on a couch at night, exuding a warm, inviting light." class="rm-shortcode" data-rm-shortcode-id="ba0863fd1c4f48fdbf2ffc0bdf9a7f85" data-rm-shortcode-name="rebelmouse-image" id="07d97" loading="lazy" src="https://spectrum.ieee.org/media-library/a-glowing-robot-sits-on-a-couch-at-night-exuding-a-warm-inviting-light.jpg?id=60137191&width=980"/>
  275. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Cartwheel</small></p><p>Cartwheel has been in business for three years now, and got off the ground by providing robotics engineering services to corporate customers. That, along with an initial funding round, allowed LaValley to bootstrap the development of Cartwheel’s own robots, and he expects to deliver a couple dozen variations on Speedy to places like museums and science centers over the next 12 months.</p><p>The dream, though, is small home robots that are both companionable and capable, and LaValley is even willing to throw around terms like “general purpose.” “Capability increases over time,” he says, “and maybe our robots will be able to do more than just play with your kids or pick up a few items around the house. I see all robots eventually moving towards general purpose. Our strategy is not to get to general purpose on day one, or even get into the home day one. But we’re working towards that goal. That’s our north star.”</p>]]></description><pubDate>Mon, 12 May 2025 16:01:11 +0000</pubDate><guid>https://spectrum.ieee.org/cartwheel-robotics-humanoid</guid><category>Humanoid robots</category><category>Social robots</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/small-humanoid-robot-with-a-white-body-and-round-head-standing-on-a-tiled-floor-indoors.jpg?id=60204075&amp;width=980"></media:content></item><item><title>Video Friday: Robotic Hippotherapy Horse-Riding Simulator</title><link>https://spectrum.ieee.org/video-friday-rehabilitation-robot</link><description><![CDATA[
  276. <img src="https://spectrum.ieee.org/media-library/child-on-a-robotic-chair-playing-catch-with-a-blue-ball-in-a-bright-room.png?id=60160572&width=1500&height=2000&coordinates=555%2C0%2C555%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://uasconferences.com/2025_icuas/">ICUAS 2025</a>: 14–17 May 2025, CHARLOTTE, N.C.</h5><h5><a href="https://2025.ieee-icra.org/">ICRA 2025</a>: 19–23 May 2025, ATLANTA</h5><h5><a href="https://humanoidssummit.com/">London Humanoids Summit</a>: 29–30 May 2025, LONDON</h5><h5><a href="https://smartconf.jp/content/rcar2025/">IEEE RCAR 2025</a>: 1–6 June 2025, TOYAMA, JAPAN</h5><h5><a href="https://www.edrcoalition.com/2025-energy-drone-robotics-summit">2025 Energy Drone & Robotics Summit</a>: 16–18 June 2025, HOUSTON</h5><h5><a href="https://roboticsconference.org/">RSS 2025</a>: 21–25 June 2025, LOS ANGELES</h5><h5><a href="https://robotx.ethz.ch/education/summer-school.html">ETH Robotics Summer School</a>: 21–27 June 2025, GENEVA</h5><h5><a href="https://ias-19.org/">IAS 2025</a>: 30 June–4 July 2025, GENOA, ITALY</h5><h5><a href="https://clawar.org/icres2025/">ICRES 2025</a>: 3–4 July 2025, PORTO, PORTUGAL</h5><h5><a href="https://2025.worldhaptics.org/">IEEE World Haptics</a>: 8–11 July 2025, SUWON, KOREA</h5><h5><a href="https://ifac2025-msrob.com/">IFAC Symposium on Robotics</a>: 15–18 July 2025, PARIS</h5><h5><a href="https://2025.robocup.org/">RoboCup 2025</a>: 15–21 July 2025, BAHIA, BRAZIL</h5><h5><a href="https://www.ro-man2025.org/">RO-MAN 2025</a>: 25–29 August 2025, EINDHOVEN, NETHERLANDS</h5><h5><a href="https://clawar.org/clawar2025/">CLAWAR 2025</a>: 5–7 September 2025, SHENZHEN, CHINA</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA</h5><h5><a href="http://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="wwwyxz9_w4m">Today I learned that “<a href="https://www.americanhippotherapyassociation.org/what-is-hippotherapy" target="_blank">hippotherapy</a>” is not quite what I wanted it to be.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7cd0da737bc43eb2fd8f304813499c97" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wwwYxZ9_W4M?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>The integration of KUKA robots into robotic physiotherapy equipment offers numerous advantages, such as precise motion planning and control of robot-assisted therapy, individualized training, reduced therapist workload and patient-progress monitoring. 
As a result, these robotic therapies can be superior to many conventional physical therapies in restabilizing patients’ limbs.</em></blockquote><p>[ <a href="https://www.kuka.com/rehabilitation-robotic">Kuka</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="_fmsjmkmjfc"><em>MIT engineers are getting in on the robotic ping-pong game with a powerful, lightweight design that returns shots with high-speed precision. The new table-tennis bot comprises a multijointed robotic arm that is fixed to one end of a ping-pong table and wields a standard ping-pong paddle. Aided by several high-speed cameras and a high-bandwidth predictive control system, the robot quickly estimates the speed and trajectory of an incoming ball and executes one of several swing types—loop, drive, or chop—to precisely hit the ball to a desired location on the table with various types of spin.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="faa663b4b4f852bf506ce6f2ba5ad6d9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/_fMSJMkMJFc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://news.mit.edu/2025/ping-pong-bot-returns-shots-high-speed-precision-0508">MIT News</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="dku0pl1lfq8"><em>Pan flipping involves dynamically flipping various objects, such as eggs, burger buns, and meat patties. This demonstrates precision, agility, and the ability to adapt to different challenges in motion control. Our framework enables robots to learn highly dynamic movements.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="19aa6290de139a2fc9421fe1b9bbb733" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dkU0Pl1LFq8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://tool-as-interface.github.io/">GitHub</a> ] via [ <a href="https://thehcalab.web.illinois.edu/">Human Centered Autonomy Lab</a> ]</p><p>Thanks, Haonan!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ddmnoexvt5o"><em>An edible robot made by EPFL scientists leverages a combination of biodegradable fuel and surface tension to zip around the water’s surface, creating a safe—and nutritious—alternative to environmental monitoring devices made from artificial polymers and electronics.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="60872853eb7fd47ace334cea821a450e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dDmnoexVT5o?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://actu.epfl.ch/news/eco-friendly-aquatic-robot-is-made-from-fish-foo-2/">EPFL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="2v4nlkhwvfk"><em>Traditional quadcopters excel in flight agility and maneuverability but often face limitations in hovering efficiency and horizontal field of view. 
Nature-inspired rotary wings, while offering a broader perspective and enhanced hovering efficiency, are hampered by substantial angular momentum restrictions. In this study, we introduce QuadRotary, a novel vehicle that integrates the strengths of both flight characteristics through a reconfigurable design.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="abd6a7e5b7066afa37a90ca5deaa1811" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2V4nLKHwVFk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ieeexplore.ieee.org/document/10971283">Paper</a> ] via [ <a href="https://airlab.sutd.edu.sg/">Singapore University of Technology and Design</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="yvyhwqg4gii">I like the idea of a <a data-linked-post="2666662286" href="https://spectrum.ieee.org/humanoid-robots" target="_blank">humanoid</a> that uses <a data-linked-post="2650275018" href="https://spectrum.ieee.org/uc-berkeley-salto-is-the-most-agile-jumping-robot-ever" target="_blank">jumping as a primary locomotion mode</a> not because it has to, but because it’s fun.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="adacb5c9181b660256c0585c3605b26f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/yvYhwQg4giI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://pal-robotics.com/robot/kangaroo/">PAL Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="tmqxe80_t3q">I had not realized how much nuance there is to digging stuff up with a shovel.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5e6b192f6196991331267cbb87a50a5d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/TmqxE80_t3Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://motion.cs.illinois.edu/">Intelligent Motion Laboratory</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="cxkpdz2u3uy"><em>A new 10,000-gallon [38,000-liter] water tank at the University of Michigan will help researchers design, build, and test a variety of autonomous underwater systems that could help robots map lakes and oceans and conduct inspections of ships and bridges. 
The tank, funded by the Office of Naval Research, allows roboticists to further test projects on robot control and behavior, marine sensing and perception, and multivehicle coordination.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0a6f6987a2f75310e42f50766c858ff7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/CxkPDz2u3UY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>“The lore is that this helps to jump-start research, as each testing tank is a living reservoir for all of the knowledge gained from within it,” said Jason Bundoff, lead engineer in research at U-M’s Friedman Marine Hydrodynamics Laboratory. “You mix the waters from other tanks to imbue the newly founded tank with all of that living knowledge from the other tanks, which helps to keep the knowledge from being lost.”</em></blockquote><p>[ <a href="https://robotics.umich.edu/">Michigan Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="b5gshduwpqc">If you have a humanoid robot and you’re wondering how it should communicate, here’s the answer.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b53917aa67ffae4b29d71f3c78b2816e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/b5gSHDUwPQc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pollen-robotics.com/">Pollen</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="giv1xich2rw">Whose side are you on, <a data-linked-post="2655252754" href="https://spectrum.ieee.org/robot-video-2655252754" target="_blank">Dusty</a>?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5882f979f6bab05922013784855452ed" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/gIV1XiCH2Rw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Even construction robots should be mindful about siding with the Empire, though there can be consequences!</p><p class="shortcode-media shortcode-media-youtube">
  277. <span class="rm-shortcode" data-rm-shortcode-id="1f32b88ca0a70d63b640b86d3e6ed53e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/C4MVQby0InQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  279. </p><p>[ <a href="https://www.dustyrobotics.com/">Dusty Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="pzqaztr-04e">This Michigan Robotics Seminar is by Danfei Xu from Georgia Tech, on “Generative Task and Motion Planning.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="110e85c658373c41c8d598022def73cf" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/PZqaztR-04E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Long-horizon planning is fundamental to our ability to solve complex physical problems, from using tools to cooking dinners. Despite recent progress in commonsense-rich foundation models, the ability to do the same is still lacking in robots, particularly with learning-based approaches. In this talk, I will present a body of work that aims to transform Task and Motion Planning—one of the most powerful computational frameworks in robot planning—into a fully generative model framework, enabling compositional generalization in a largely data-driven approach.</em></blockquote><p>[ <a href="https://robotics.umich.edu/events/robotics-seminar-series/">Michigan Robotics</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 09 May 2025 17:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-rehabilitation-robot</guid><category>Video friday</category><category>Robotics</category><category>Robot arm</category><category>Humanoid robots</category><category>Underwater robots</category><category>Medical robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/child-on-a-robotic-chair-playing-catch-with-a-blue-ball-in-a-bright-room.png?id=60160572&amp;width=980"></media:content></item><item><title>Amazon’s Vulcan Robots Now Stow Items Faster Than Humans</title><link>https://spectrum.ieee.org/amazon-stowing-robots</link><description><![CDATA[
  280. <img src="https://spectrum.ieee.org/media-library/robotic-arm-sorting-yellow-storage-bins-in-a-warehouse-aisle.png?id=60140615&width=1500&height=2000&coordinates=555%2C0%2C555%2C0"/><br/><br/><p><span>At an event in Dortmund, Germany </span><span>today, Amazon announced a new robotic system called Vulcan, which the company is calling “its first robotic system with a genuine sense of touch—designed to transform how robots interact with the physical world.” In the short to medium term, the physical world that Amazon is most concerned with is its <a data-linked-post="2657548996" href="https://spectrum.ieee.org/amazon-warehouse-robots" target="_blank">warehouses</a>, and Vulcan is designed to assist (or take over, depending on your perspective) with stowing and picking items in its mobile robotic inventory system.</span></p><p class="ieee-inbody-related"><span>Related: <a href="https://spectrum.ieee.org/amazon-robotics-vulcan-warehouse-picking" target="_blank">Amazon’s Vulcan Robots Are Mastering Picking Packages</a></span></p><p>In two upcoming papers in <em><a href="https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=8860" target="_blank">IEEE Transactions on Robotics</a></em>, Amazon researchers describe how both the stowing and picking side of the system operates. We covered <a href="https://spectrum.ieee.org/amazon-warehouse-robots-2659064182" target="_blank">stowing in detail</a> a couple of years ago, when we spoke with <a href="https://www.linkedin.com/in/aaron-parness-5a9aaa46/" target="_blank">Aaron Parness</a>, the director of applied science at Amazon Robotics. Parness and his team have made a lot of progress on stowing since then, improving speed and reliability over more than 500,000 stows in operational warehouses to the point where the average stowing robot is now slightly faster than the average stowing human. W<span>e spoke with Parness to get an update on stowing, as well as an in-depth look at how Vulcan handles picking, which you can find in this <a href="https://spectrum.ieee.org/amazon-vulcan-picking-robots" target="_blank">separate article</a>. It’s a much different problem, and well worth a read.</span></p><h2>Optimizing Amazon’s Stowing Process</h2><p>Stowing is the process by which Amazon brings products into its warehouses and adds them to its inventory so that you can order them. Not surprisingly, Amazon has gone to extreme lengths to optimize this process to maximize efficiency in both space and time. Human stowers are presented with a <a href="https://robotsguide.com/robots/kiva" target="_blank">mobile robotic pod</a> full of fabric cubbies (bins) with elastic bands across the front of them to keep stuff from falling out. The human’s job is to find a promising space in a bin, pull the plastic band aside, and stuff the thing into that space. The item’s new home is recorded in Amazon’s system, the pod then drives back into the warehouse, and the next pod comes along, ready for the next item.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  281. <img alt="Robotic machinery organizing items on high shelves in an automated warehouse setting." class="rm-shortcode" data-rm-shortcode-id="eda75ca026a72359c18081c71abc7692" data-rm-shortcode-name="rebelmouse-image" id="e6279" loading="lazy" src="https://spectrum.ieee.org/media-library/robotic-machinery-organizing-items-on-high-shelves-in-an-automated-warehouse-setting.png?id=60140645&width=980"/>
  282. <small class="image-media media-caption" placeholder="Add Photo Caption...">Different manipulation tools are used to interact with human-optimized bins.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Amazon</small></p><p>The new paper on stowing includes some interesting numbers about Amazon’s inventory-handling process that helps put the scale of the problem in perspective. More than 14 billion items are stowed by hand every year at Amazon warehouses. Amazon is hoping that Vulcan robots will be able to stow 80 percent of these items at a rate of 300 items per hour, while operating 20 hours per day. It’s a very, very high bar.</p><p>After a lot of practice, Amazon’s robots are now quite good at the stowing task. Parness tells us that the stow system is operating three times as fast as it was 18 months ago, meaning that it’s actually a little bit <em><em>faster</em></em> than an average human. This is exciting, but as Parness explains, expert humans still put the robots to shame. “The fastest humans at this task are like Olympic athletes. They’re far faster than the robots, and they’re able to store items in pods at much higher densities.” High density is important because it means that more stuff can fit into warehouses that are physically closer to more people, which is especially relevant in urban areas where space is at a premium. The best humans can get very creative when it comes to this physical three-dimensional “<a href="https://www.merriam-webster.com/wordplay/tetris-video-game-verb" rel="noopener noreferrer" target="_blank">Tetris-ing</a>,” which the robots are still working on.</p><p class="shortcode-media shortcode-media-youtube">
  283. <span class="rm-shortcode" data-rm-shortcode-id="03871ad39417b7a7f530d301c2812f68" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/oWXco05eK28?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
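</p><p>Those targets imply a strikingly large fleet. Here is a quick back-of-the-envelope check (our own arithmetic based on the figures above, not a number from Amazon; year-round operation is an assumption):</p><pre><code class="language-python">
# Rough scale check on the stowing targets quoted above (our own
# back-of-the-envelope arithmetic, not Amazon's; operating 365 days
# a year is an assumption).
annual_hand_stows = 14_000_000_000   # items stowed by hand per year
vulcan_share = 0.80                  # fraction Amazon hopes robots handle
rate_per_hour = 300                  # items per robot per hour
hours_per_day = 20                   # operating hours per robot per day

stows_per_robot_per_year = rate_per_hour * hours_per_day * 365
robots_needed = annual_hand_stows * vulcan_share / stows_per_robot_per_year
print(f"Robots needed: ~{robots_needed:,.0f}")
# Prints roughly 5,114 -- on the order of five thousand robots
# stowing around the clock, every day of the year.
</code></pre>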
284. <p>Where robots do excel is planning ahead, and this is likely why the average robot stower is now able to outpace the average human stower—Tetris-ing is a mental process, too. In the same way that good Tetris players are thinking about where the <em><em>next</em></em> piece is going to go, not just the current piece, robots are able to leverage a lot more information than humans can to optimize what gets stowed where and when, says Parness. “When you’re a person doing this task, you’ve got a buffer of 20 or 30 items, and you’re looking for an opportunity to fit those items into different bins, and having to remember which item might go into which space. But the robot knows all of the properties of all of our items at once, and we can also look at all of the bins at the same time along with the bins in the next couple of pods that are coming up. So we can do this optimization over the whole set of information in 100 milliseconds.”</p><p>Essentially, robots are far better at optimization within the planning side of Tetris-ing, while humans are (still) far better at the manipulation side, but that gap is closing as robots get more experienced at operating in clutter and contact. Amazon has had Vulcan stowing robots operating for over a year in live warehouses in Germany and Washington state to collect training data, and those robots have successfully stowed hundreds of thousands of items.</p><p>Stowing is of course only half of what Vulcan is designed to do. Picking offers all kinds of unique challenges too, and you can read our in-depth discussion with Parness on that topic <a href="https://spectrum.ieee.org/amazon-vulcan-picking-robots" target="_blank">right here</a>.</p>]]></description><pubDate>Wed, 07 May 2025 08:30:03 +0000</pubDate><guid>https://spectrum.ieee.org/amazon-stowing-robots</guid><category>Amazon</category><category>Amazon robotics</category><category>Warehouse automation</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/robotic-arm-sorting-yellow-storage-bins-in-a-warehouse-aisle.png?id=60140615&amp;width=980"></media:content></item><item><title>Amazon’s Vulcan Robots Are Mastering Picking Packages</title><link>https://spectrum.ieee.org/amazon-robotics-vulcan-warehouse-picking</link><description><![CDATA[
285. <img src="https://spectrum.ieee.org/media-library/robotic-arms-organize-shelves-in-an-automated-warehouse-surrounded-by-yellow-storage-bins.png?id=60140652&width=1500&height=2000&coordinates=555%2C0%2C555%2C0"/><br/><br/><p>As far as I can make out, Amazon’s warehouses are highly structured, extremely organized, very tidy, absolute raging messes. Everything in an Amazon warehouse is (usually) exactly where it’s supposed to be, which is typically jammed into some pseudorandom fabric bin the size of a shoebox along with a bunch of other pseudorandom crap. Somehow, this turns out to be the most space- and time-efficient way of doing things, because (<a href="https://spectrum.ieee.org/amazon-warehouse-robots-2659064182" target="_self"><u>as we’ve written about before</u></a>) you have to consider the process of <em><em>stowing</em></em> items away in a warehouse as well as the process of <em><em>picking</em></em> them, and that involves some compromises in favor of space and speed.</p><p>For humans, this isn’t so much of a problem. When someone orders something on Amazon, a human can root around in those bins, shove some things out of the way, and then pull out the item that they’re looking for. This is exactly the sort of thing that robots tend to be terrible at, because not only is this process slightly different every single time, it’s also very hard to define exactly how humans go about it. </p><p class="ieee-inbody-related">Related: <a href="https://spectrum.ieee.org/amazon-stowing-robots" target="_blank">Amazon’s Vulcan Robots Now Stow Items Faster Than Humans</a></p><p>As you might expect, Amazon has been working very, very hard on this picking problem. Today at an event in Germany, the company announced Vulcan, a robotic system that can both stow and pick items at human(ish) speeds.</p><hr/><p>Last time we talked with <a href="https://www.linkedin.com/in/aaron-parness-5a9aaa46/" rel="noopener noreferrer" target="_blank"><u>Aaron Parness</u></a>, the director of applied science at Amazon Robotics, <a href="https://spectrum.ieee.org/amazon-warehouse-robots-2659064182" target="_self"><u>our conversation was focused on stowing</u></a>—putting items into bins. As part of today’s announcement, Amazon revealed that its robots are now <a href="https://spectrum.ieee.org/amazon-stowing-robots" target="_blank">slightly faster at stowing</a> than the average human is. But in the stow context, there’s a limited amount that a robot really has to understand about what’s actually happening in the bin. Fundamentally, the stowing robot’s job is to squoosh whatever is currently in a bin as far to one side as possible in order to make enough room to cram a new item in. As long as the robot is at least somewhat careful not to crushify anything, it’s a relatively straightforward task, at least compared to picking.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  286. <img alt="Automation robots retrieve boxes in a warehouse with yellow storage containers." class="rm-shortcode" data-rm-shortcode-id="195f3d2c7b9cb3339d1ce2aaf35131ed" data-rm-shortcode-name="rebelmouse-image" id="5fb6b" loading="lazy" src="https://spectrum.ieee.org/media-library/automation-robots-retrieve-boxes-in-a-warehouse-with-yellow-storage-containers.png?id=60140664&width=980"/>
287. <small class="image-media media-caption" placeholder="Add Photo Caption...">The choices made when an item is stowed into a bin will affect how hard it is to get that item out of that bin later on—this is called “bin etiquette.” Amazon is trying to learn bin etiquette with AI to make picking more efficient.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Amazon</small></p><p>The defining problem of picking, as far as robots are concerned, is sensing and manipulation in clutter. “It’s a naturally contact-rich task, and we have to plan on that contact and react to it,” Parness says. And it’s not enough to solve these problems slowly and carefully, because Amazon Robotics is trying to put robots in production, which means that its systems are being directly compared to a not-so-small army of humans who are doing this exact same job very efficiently.</p><p>“There’s a new science challenge here, which is to identify the right item,” explains Parness. The thing to understand about identifying items in an Amazon warehouse is that there are a <em><em>lot</em></em> of them: something like 400 million unique items. One single floor of an Amazon warehouse can easily contain 15,000 pods, which is over a million bins, and Amazon has several hundred warehouses. This is a lot of stuff. </p><p>In theory, Amazon knows exactly which items are in every single bin. Amazon also knows (again, in theory) the weight and dimensions of each of those items, and probably has some pictures of each item from previous times that the item has been stowed or picked. This is a great starting point for item identification, but as Parness points out, “We have lots of items that aren’t feature rich—imagine all of the different things you might get in a brown cardboard box.”</p><h2>Clutter and Contact</h2><p>As challenging as it is to correctly identify an item in a bin that may be stuffed to the brim with nearly identical items, an even bigger challenge is actually getting that item that you just identified <em><em>out</em></em> of the bin. The hardware and software that humans have for doing this task are unmatched by any robot, which is always a problem, but the real complicating factor is dealing with items that are all jumbled together in a small fabric bin. And the picking process itself involves more than just extraction—once the item is out of the bin, you then have to get it to the next order-fulfillment step, which means dropping it into another bin or putting it on a conveyor or something. </p><p>“When we were originally starting out, we assumed we’d have to carry the item over some distance after we pulled it out of the bin,” explains Parness. “So we were thinking we needed pinch grasping.” A pinch grasp is when you grab something between a finger (or fingers) and your thumb, and at least for humans, it’s a versatile and reliable way of grabbing a wide variety of stuff. But as Parness notes, for robots in this context, it’s more complicated: “Even pinch grasping is not ideal because if you pinch the edge of a book, or the end of a plastic bag with something inside it, you don’t have pose control of the item and it may flop around unpredictably.” </p><p class="shortcode-media shortcode-media-youtube">
  288. <span class="rm-shortcode" data-rm-shortcode-id="816aaa227f7c02d4086b89fee4d256dd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xSm4Z7I3xXA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  289. </p><p>At some point, Parness and his team realized that while an item did have to move farther than just out of the bin, it didn’t actually have to get moved by the picking robot itself. Instead, they came up with a lifting conveyor that positions itself directly outside of the bin being picked from, so that all the robot has to do is get the item out of the bin and onto the conveyor. “It doesn’t look that graceful right now,” admits Parness, but it’s a clever use of hardware to substantially simplify the manipulation problem, and has the side benefit of allowing the robot to work more efficiently, since the conveyor can move the item along while the arm starts working on the next pick.</p><p class="shortcode-media shortcode-media-youtube">
  290. <span class="rm-shortcode" data-rm-shortcode-id="13ab678121e6e09a6f58b324fe375397" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/9ZzLiD_fJFA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
291. </p><p>Amazon’s robots have different techniques for extracting items from bins, using different gripping hardware depending on what needs to be picked. The type of end effector that the system chooses and the grasping approach depend on what the item is, where it is in the bin, and also what it’s next to. It’s a complicated planning problem that Amazon is tackling with AI, as Parness explains. “We’re starting to build foundation models of items, including properties like how squishy they are, how fragile they are, and whether they tend to get stuck on other items or not. So we’re trying to learn those things, and it’s early stage for us, but we think reasoning about item properties is going to be important to get to that level of reliability that we need.”</p><p>Reliability has to be superhigh for Amazon (as with many other commercial robotic deployments) simply because small errors multiplied over huge deployments result in an unacceptable amount of screwing up. There’s a very, very long tail of unusual things that Amazon’s robots might encounter when trying to extract an item from a bin. Even if there’s some particularly weird bin situation that might only show up once in a million picks, that still ends up happening many times per day on the scale at which Amazon operates. Fortunately for Amazon, they’ve got humans around, and part of the reason that this robotic system can be effective in production at all is that if the robot gets stuck, or even just sees a bin that it knows is likely to cause problems, it can just give up, route that particular item to a human picker, and move on to the next one.</p><p>The other new technique that Amazon is implementing is a sort of modern approach to “<a href="https://en.wikipedia.org/wiki/Visual_servoing" target="_blank">visual servoing</a>,” where the robot watches itself move and then adjusts its movement based on what it sees. As Parness explains: “It’s an important capability because it allows us to catch problems before they happen. I think that’s probably our biggest innovation, and it spans not just our problem, but problems across robotics.”</p>
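<p>The article doesn’t spell out Amazon’s controller, but classic image-based visual servoing gives a feel for the watch-compare-correct loop Parness describes. Below is a minimal, illustrative sketch (our own code, with an assumed gain and made-up feature data, not Amazon’s implementation): the robot measures where tracked feature points actually appear in the camera image, compares them with where they should be, and commands a velocity that shrinks the error.</p><pre><code class="language-python">
import numpy as np

LAMBDA = 0.5  # proportional gain (assumed value)

def servo_step(observed, desired, interaction_matrix):
    """One image-based visual-servoing step.

    observed, desired: (N, 2) arrays of feature pixel coordinates.
    interaction_matrix: (2N, 6) matrix relating the camera's twist
    (linear + angular velocity) to feature motion in the image;
    a real system estimates it from feature depth.
    """
    error = (observed - desired).reshape(-1)  # stack into a 2N vector
    # Classic IBVS control law: v = -lambda * pinv(L) @ e
    return -LAMBDA * np.linalg.pinv(interaction_matrix) @ error

# Toy usage with made-up numbers, just to show the loop closing:
rng = np.random.default_rng(0)
L = rng.standard_normal((8, 6))    # fake interaction matrix, 4 features
obs = rng.standard_normal((4, 2))  # where the features actually appear
des = np.zeros((4, 2))             # where we want them to appear
print(servo_step(obs, des, L))     # 6-vector velocity command
</code></pre><p>Run at camera rate, a loop like this nudges the arm on every cycle based on what the camera actually sees, which is how a system can “catch problems before they happen” rather than discovering them after an open-loop motion finishes.</p><p class="shortcode-media shortcode-media-youtube">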
  292. <span class="rm-shortcode" data-rm-shortcode-id="6bb34d4c1cb35ef045001bb77797e619" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/lLadeCl8rME?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  293. </p><h2>A (More) Automated Future</h2><p>Parness was very clear that (for better or worse) Amazon isn’t thinking about its stowing and picking robots in terms of replacing humans completely. There’s that long tail of items that need a human touch, and it’s frankly hard to imagine any robotic-manipulation system capable enough to make at least occasional human help unnecessary in an environment like an Amazon warehouse, which somehow manages to maximize organization and chaos at the same time.</p><p>These stowing and picking robots have been undergoing live testing in an Amazon warehouse in Germany for the past year, where they’re already demonstrating ways in which human workers could directly benefit from their presence. For example, Amazon pods can be up to 2.5 meters tall, meaning that human workers need to use a stepladder to reach the highest bins and bend down to reach the lowest ones. If the robots were primarily tasked with interacting with these bins, it would help humans work faster while putting less stress on their bodies. </p><p>With the robots so far managing to keep up with human workers, Parness tells us that the emphasis going forward will be primarily on getting better at not screwing up: “I think our speed is in a really good spot. The thing we’re focused on now is getting that last bit of reliability, and that will be our next year of work.” While it may seem like Amazon is optimizing for its own very specific use cases, Parness reiterates that the bigger picture here is using every last one of those 400 million items jumbled into bins as a unique opportunity to do fundamental research on fast, reliable manipulation in complex environments. </p><p>“If you can build the science to handle high contact and high clutter, we’re going to use it everywhere,” says Parness. “It’s going to be useful for everything, from warehouses to your own home. What we’re working on now are just the first problems that are forcing us to develop these capabilities, but I think it’s the future of robotic manipulation.”</p>]]></description><pubDate>Wed, 07 May 2025 08:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/amazon-robotics-vulcan-warehouse-picking</guid><category>Amazon</category><category>Amazon robotics</category><category>Warehouse robots</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/robotic-arms-organize-shelves-in-an-automated-warehouse-surrounded-by-yellow-storage-bins.png?id=60140652&amp;width=980"></media:content></item><item><title>Video Friday: Robots for Extreme Environments</title><link>https://spectrum.ieee.org/video-friday-robots-extreme-environments</link><description><![CDATA[
  294. <img src="https://spectrum.ieee.org/media-library/robot-dog-running-and-atv-driving-through-deep-loose-sand-with-driver-in-black-hoodie.png?id=60071859&width=1500&height=2000&coordinates=555%2C0%2C555%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://uasconferences.com/2025_icuas/">ICUAS 2025</a>: 14–17 May 2025, CHARLOTTE, N.C.</h5><h5><a href="https://2025.ieee-icra.org/">ICRA 2025</a>: 19–23 May 2025, ATLANTA</h5><h5><a href="https://humanoidssummit.com/">London Humanoids Summit</a>: 29–30 May 2025, LONDON</h5><h5><a href="https://smartconf.jp/content/rcar2025/">IEEE RCAR 2025</a>: 1–6 June 2025, TOYAMA, JAPAN</h5><h5><a href="https://www.edrcoalition.com/2025-energy-drone-robotics-summit">2025 Energy Drone & Robotics Summit</a>: 16–18 June 2025, HOUSTON</h5><h5><a href="https://roboticsconference.org/">RSS 2025</a>: 21–25 June 2025, LOS ANGELES</h5><h5><a href="https://robotx.ethz.ch/education/summer-school.html">ETH Robotics Summer School</a>: 21–27 June 2025, GENEVA</h5><h5><a href="https://ias-19.org/">IAS 2025</a>: 30 June–4 July 2025, GENOA, ITALY</h5><h5><a href="https://clawar.org/icres2025/">ICRES 2025</a>: 3–4 July 2025, PORTO, PORTUGAL</h5><h5><a href="https://2025.worldhaptics.org/">IEEE World Haptics</a>: 8–11 July 2025, SUWON, SOUTH KOREA</h5><h5><a href="https://ifac2025-msrob.com/">IFAC Symposium on Robotics</a>: 15–18 July 2025, PARIS</h5><h5><a href="https://2025.robocup.org/">RoboCup 2025</a>: 15–21 July 2025, BAHIA, BRAZIL</h5><h5><a href="https://www.ro-man2025.org/">RO-MAN 2025</a>: 25–29 August 2025, EINDHOVEN, THE NETHERLANDS</h5><h5><a href="https://clawar.org/clawar2025/">CLAWAR 2025</a>: 5–7 September 2025, SHENZHEN</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="http://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="nacanwv_2z8"><em>The LYNX M20 series represents the world’s first wheeled-legged robot built specifically for challenging terrains and hazardous environments during industrial operation. 
Featuring lightweight design with extreme-environment endurance, it conquers rugged mountain trails, muddy wetlands and debris-strewn ruins—pioneering embodied intelligence in power inspection, emergency response, logistics, and scientific exploration.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="05122ff7d705e4c69f0def8efc601bf3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/NAcanWv_2Z8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">DEEP Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="gz9brl7dvsm">The latest OK Go music video includes lots of robots.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="be2fa1880be464ab1d3f7d11baecc6bd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/gz9BRl7DVSM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>And here’s a bit more on how it was done, mostly with arms from <a data-linked-post="2650272363" href="https://spectrum.ieee.org/universal-robots-ur3-robotic-arm" target="_blank">Universal Robots</a>.</p><p class="shortcode-media shortcode-media-youtube">
  295. <span class="rm-shortcode" data-rm-shortcode-id="94abd31b8e143e6c8028ff76d830e78c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ZmGQp-j4xEM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  296. </p><p>[ <a href="https://okgo.net/">OK Go</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="didjgkmdfl4"><em>Despite significant interest and advancements in humanoid robotics, most existing commercially available hardware remains high-cost, closed-source, and nontransparent within the robotics community. This lack of accessibility and customization hinders the growth of the field and the broader development of humanoid technologies. To address these challenges and promote democratization in humanoid robotics, we demonstrate Berkeley Humanoid Lite, an open-source humanoid robot designed to be accessible, customizable, and beneficial for the entire community.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="04c71293a4a2679b10d0662f7049505d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dIdJGkMDFl4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://lite.berkeley-humanoid.org/">Berkeley Humanoid Lite</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="dfobux6mftc">I think this may be the first time I’ve ever seen a pedestal-mounted <a data-linked-post="2667789605" href="https://spectrum.ieee.org/atlas-humanoid-robot-ceo-interview" target="_blank">Atlas</a> from <a href="https://bostondynamics.com/" target="_blank">Boston Dynamics</a>.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8cf5e801894b41554f73c1fb83cfe00c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dFObux6mfTc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://developer.nvidia.com/blog/r%C2%B2d%C2%B2-adapting-dexterous-robots-with-nvidia-research-workflows-and-models/">NVIDIA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="otzgk7nc-ze"><em>We are increasingly adopting domestic robots (Roomba, for example) that provide relief from mundane household tasks. However, these robots usually only spend little time executing their specific task and remain idle for long periods. Our work explores this untapped potential of domestic robots in ubiquitous computing, focusing on how they can improve and support modern lifestyles.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="aaa860dcb2b29df6ed57fe81d91fa125" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/otZGk7nC-ZE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.bath.ac.uk/announcements/dont-resent-your-robot-vacuum-cleaner-for-its-idle-hours-work-it-harder/">University of Bath</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="yqipn_gprte">Whenever I see a soft robot, I have to ask, “Okay, but how soft is it really?” And usually, there’s a pump or something hidden away off-camera somewhere. 
So it’s always cool to see actually <a data-linked-post="2650277231" href="https://spectrum.ieee.org/popcorndriven-robotic-actuators" target="_blank">soft robotics actuators</a>, like these, which are based on phase-changing water.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5f4d2d5960c168ea53ddc986eee455a1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/yqiPn_GprtE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.nature.com/articles/s41467-025-59023-7">Nature Communications</a> ] via [ <a href="http://www2.dem.uc.pt/pedro.neto/">Collaborative Robotics Laboratory, University of Coimbra</a> ]</p><p>Thanks, Pedro!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="n6yvksar7_4"><em>Pruning is an essential agricultural practice for orchards. Robot manipulators have been developed as an automated solution for this repetitive task, which typically requires seasonal labor with specialized skills. Our work addresses the behavior planning challenge for a robotic pruning system, which entails a multilevel planning problem in environments with complex collisions. In this article, we formulate the planning problem for a high-dimensional robotic arm in a pruning scenario, investigate the system’s intrinsic redundancies, and propose a comprehensive pruning workflow that integrates perception, modeling, and holistic planning.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4a7573a6bb09476f6f8c9dcfe9e52215" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/n6yvKsar7_4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ieeexplore.ieee.org/document/10978028">Paper</a> ] via [ <a href="https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=100" target="_blank">IEEE Robotics and Automation Magazine</a> ]</p><p>Thanks, Bram!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="nauna_qzf6k"><em>Watch the Waymo Driver quickly react to potential hazards and avoid collisions with other road users, making streets safer in cities where it operates.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b1a04a3c505918104287ba6255faf6d6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/nAuna_qzf6k?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://waymo.com/blog/2025/05/waymo-making-streets-safer-for-vru">Waymo</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="up9wyquicvk"><em>This video showcases some of the early testing footage of HARRI (High-speed Adaptive Robot for Robust Interactions), a next-generation proprioceptive robotic manipulator developed at the Robotics & Mechanisms Laboratory (RoMeLa) at University of California, Los Angeles. 
Designed for dynamic and force-critical tasks, HARRI leverages quasi-direct drive proprioceptive actuators combined with advanced control strategies such as impedance control and real-time model predictive control (MPC) to achieve high-speed, precise, and safe manipulation in human-centric and unstructured environments.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6572985188a75b5711501ed35cb6e606" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/up9wYqUICvk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.romela.org/">Robotics & Mechanisms Laboratory</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="tk7fvvz7plu"><em>Building on reinforcement learning for natural gait, we’ve upped the challenge for Adam: introducing complex terrain in training to adapt to real-world surfaces. From steep slopes to start-stop inclines, Adam handles it all with ease!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="093606e6d05c59fc9845ee7f6f0eadfb" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/TK7FvvZ7PlU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pndbotics.com/">PNDbotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="qpev-glrq_y"><em>ABB Robotics is serving up the future of fast food with BurgerBots—a groundbreaking new restaurant concept launched in Los Gatos, Calif. 
Designed to deliver perfectly cooked, made-to-order burgers every time, the automated kitchen uses ABB’s IRB 360 FlexPicker and YuMi collaborative robot to assemble meals with precision and speed, while accurately monitoring stock levels and freeing staff to focus on customer experience.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ba23b00e129b67f37f8d632eae9f7f56" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/qpEV-Glrq_Y?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.globalmediacentre.com/news/abb-and-burgerbots-unveil-robotic-burger-making-revolutionize-fast-food-prep">Burger Bots</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="pallcjktpga">Look at this little guy, such a jaunty walk!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d614974558f74c28068c15e6c95196a1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pAlLcjkTPgA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.science.org/doi/10.1126/sciadv.adv2052">Science Advances</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="9an0gulqwwc"><em>General-purpose humanoid robots are expected to interact intuitively with humans, enabling seamless integration into daily life. Natural language provides the most accessible medium for this purpose. 
In this work, we present an end-to-end, language-directed policy for real-world humanoid whole-body control.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2eaba94759fe12451eb1b48fc129515a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/9AN0GulqWwc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://hybrid-robotics.berkeley.edu/">Hybrid Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="vzays6iga7i">It’s debatable whether this is technically a robot, but sure, let’s go with it, because it’s pretty neat—a cable car of sorts consisting of a soft twisted ring that’s powered by infrared light.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="96be1abd7726051f6b5533847a59c0c5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vzays6IGA7I?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://news.ncsu.edu/2025/04/soft-robot-cable-cars/">North Carolina State University</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="gyrsjujsq20"><em>Robert Playter, CEO of Boston Dynamics, discusses the future of robotics amid rising competition and advances in artificial intelligence.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a940fed5ea96d36b653a64416db63d9c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GYRSJujsQ20?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.bloomberg.com/technology">Bloomberg</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="fncinfws6f0"><em>AI is at the forefront of technological advances and is also reshaping creativity, ownership, and societal interactions. 
In episode 7 of Penn Engineering’s Innovation & Impact podcast, host Vijay Kumar, Nemirovsky Family dean of Penn Engineering and professor in mechanical engineering and applied mechanics, speaks with Meta’s chief AI scientist and Turing Award winner Yann LeCun about the journey of AI, how we define intelligence, and the possibilities and challenges it presents.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3329b39feeb3b30ffa47eb80ad431ebd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/fncinfwS6f0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://pci.upenn.edu/">University of Pennsylvania</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 02 May 2025 16:30:04 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robots-extreme-environments</guid><category>Video friday</category><category>Robotics</category><category>Humanoid</category><category>Atlas robot</category><category>Soft robots</category><category>Waymo</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/robot-dog-running-and-atv-driving-through-deep-loose-sand-with-driver-in-black-hoodie.png?id=60071859&amp;width=980"></media:content></item><item><title>Bot Milk?</title><link>https://spectrum.ieee.org/robotic-milking-system-benefits-safety</link><description><![CDATA[
  297. <img src="https://spectrum.ieee.org/media-library/sepia-toned-photo-of-creamery-storefront.png?id=60032275&width=1500&height=2000&coordinates=296%2C0%2C296%2C0"/><br/><br/><p>I come from dairy-farming stock. My grandfather, the original Harry Goldstein, owned a herd of dairy cows and a creamery in Louisville, Ky., that bore the family name. One fateful day in early April 1944, Harry was milking his cows when a heavy metallic part of his homemade milking contraption—likely some version of the then-popular <a href="https://surgemilker.com/history.html" rel="noopener noreferrer" target="_blank">Surge Bucket Milker</a>—struck him in the abdomen, causing a blood clot that ultimately led to cardiac arrest and his subsequent demise a few days later, at the age of 48.</p><p>Fast forward 80 years and dairy farming is still a dangerous occupation. According to an analysis of U.S. Bureau of Labor Statistics data <a href="https://www.farmworkerjustice.org/wp-content/uploads/2024/02/2024-Dairy-Report-Final.pdf" rel="noopener noreferrer" target="_blank">done by the advocacy group Farmworker Justice</a>, the U.S. dairy industry recorded 223 injuries per 10,000 full-time workers in 2020, almost double the rate for all of private industry combined. Contact with animals tops the list of occupational hazards for dairy workers, followed by slips, trips, and falls. Other significant risks include contact with objects or equipment, overexertion, and exposure to toxic substances. Every year, a few dozen dairy workers in the United States meet a fate similar to my grandfather’s, with 31 reported deadly accidents on dairy farms in 2021.</p><p>As Senior Editor <a href="https://spectrum.ieee.org/u/evan-ackerman" target="_self">Evan Ackerman</a> notes in “<a href="https://spectrum.ieee.org/lely-dairy-robots" target="_blank">Robots for Cows (and Their Humans)</a>”, traditional dairy farming is very labor-intensive. Cows need to be milked at least twice per day to prevent discomfort. Conventional milking facilities are engineered for human efficiency, with systems like rotating carousels that bring the cows to the dairy workers. </p><p>The robotic systems that <a href="https://www.lely.com/" rel="noopener noreferrer" target="_blank">Netherlands-based Lely</a> has been developing since the early 1990s are much more about doing things the bovine way. That includes letting the cows choose when to visit the milking robot, resulting in a happier herd and up to 10 percent more milk production.</p><p>Turns out that what’s good for the cows might be good for the humans, too. Another Lely bot deals with feeding, while yet another mops up the manure, the proximate cause of much of the slipping and sliding that can result in injuries. The robots tend to reset the cow–human relationship—it becomes less adversarial because the humans aren’t always there bossing the cows around.</p><p>Farmer well-being is also enhanced because the humans don’t have to be around to tempt fate, and they can spend time doing other things, freed up by the robot laborers. In fact, when Ackerman visited Lely’s demonstration farm in Schipluiden, Netherlands, to see the Lely robots in action, he says, “The original plan was for me to interview the farmer, and he was just not there at all for the entire visit while the cows were getting milked by the robots. 
In retrospect, that might have been the most effective way he could communicate how these robots are changing work for dairy farmers.” </p><p>The farmer’s absence also speaks volumes about how far dairy technology has evolved since my grandfather’s day. Harry Goldstein’s life was cut short by the very equipment he hacked to make his own work easier. Today’s dairy-farming innovations aren’t just improving efficiency—they’re keeping humans out of harm’s way entirely. In the dairy farms of the future, the most valuable safety features might simply be a barn resounding with the whirring of robots and moos of contentment.</p>]]></description><pubDate>Thu, 01 May 2025 14:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/robotic-milking-system-benefits-safety</guid><category>Robotics</category><category>Dairy robots</category><category>Farm robots</category><category>Farm automation</category><category>Worker safety</category><dc:creator>Harry Goldstein</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/sepia-toned-photo-of-creamery-storefront.png?id=60032275&amp;width=980"></media:content></item><item><title>Freddy the Robot Was the Fall Guy for British AI</title><link>https://spectrum.ieee.org/freddy-robot-british-ai-winter</link><description><![CDATA[
  298. <img src="https://spectrum.ieee.org/media-library/metal-diamond-shaped-apparatus-with-wires-and-a-short-metal-pole-coming-from-the-top.jpg?id=60000483&width=1500&height=2000&coordinates=984%2C0%2C984%2C0"/><br/><br/><p>
  299. Meet
  300. <a href="https://impact.ed.ac.uk/research/digital-data-ai/back-to-the-future-edinburgh-ai-legacy/" rel="noopener noreferrer" target="_blank">FREDERICK</a> Mark 2, the Friendly Robot for Education, Discussion and Entertainment, the Retrieval of Information, and the Collation of Knowledge, better known as Freddy II. This remarkable robot could put together a simple model car from an assortment of parts dumped in its workspace. Its video-camera eyes and pincer hand identified and sorted the individual pieces before assembling the desired end product. But onlookers had to be patient. Assembly took about <a href="https://www.aiai.ed.ac.uk/project/freddy/" rel="noopener noreferrer" target="_blank">16 hours</a>, and that was after a day or two of “learning” and programming.
  301. </p><p class="shortcode-media shortcode-media-youtube">
  302. <span class="rm-shortcode" data-rm-shortcode-id="48c3d73e988cfe2856e7ef686a93de61" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/B1kuk7MfhMc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  303. </p><p>
  304. <span>Freddy II was completed in 1973 as one of a series of research robots developed by </span><a href="https://history.computer.org/pioneers/michie.html" target="_blank">Donald Michie</a><span> and his team at the University of Edinburgh during the 1960s and ’70s. The robots became the focus of an intense debate over the future of AI in the United Kingdom. Michie eventually lost, his funding was gutted, and the ensuing AI winter set back U.K. research in the field for a decade.</span>
  305. </p><h2>Why were the Freddy I and II robots built?</h2><p>
  306. In 1967,
307. <a href="https://www.theguardian.com/science/2007/jul/10/uk.obituaries1" target="_blank">Donald Michie</a>, along with <a href="https://www.richardgregory.org/cv.htm" rel="noopener noreferrer" target="_blank">Richard Gregory</a> and <a href="https://royalsocietypublishing.org/doi/pdf/10.1098/rsbm.2006.0012" rel="noopener noreferrer" target="_blank">Hugh Christopher Longuet-Higgins</a>, founded the Department of Machine Intelligence and Perception at the University of Edinburgh with the near-term goal of developing a semiautomated robot and the longer-term vision of programming “integrated cognitive systems,” or what other people might call intelligent robots. At the time, the U.S. <a href="https://www.darpa.mil/" rel="noopener noreferrer" target="_blank">Defense Advanced Research Projects Agency</a> and Japan’s Computer Usage Development Institute were both considering plans to create fully automated factories within a decade. The team at Edinburgh thought they should get in on the action too.
  308. </p><p>
  309. Two years later,
  310. <a href="https://www.theguardian.com/technology/2024/mar/08/stephen-salter-obituary" rel="noopener noreferrer" target="_blank">Stephen Salter</a> and <a href="https://www.harrygbarrow.com/about-me" rel="noopener noreferrer" target="_blank">Harry G. Barrow</a> joined Michie and got to work on Freddy I. Salter devised the hardware while Barrow designed and wrote the software and computer interfacing. The resulting simple robot worked, but it was crude. The AI researcher Jean Hayes (who would marry Michie in 1971) referred to this iteration of Freddy as an “arthritic <a href="https://en.wikipedia.org/wiki/The_Lady_of_Shalott" rel="noopener noreferrer" target="_blank">Lady of Shalott</a>.”
  311. </p><p>
312. Freddy I consisted of a robotic arm, a camera, a set of wheels, and some bumpers to detect obstacles. Instead of roaming freely, it remained stationary while a small platform moved beneath it. Barrow developed an adaptable program that enabled Freddy I to recognize irregular objects. In 1969, Salter and Barrow published their results in
313. <em><em>Machine Intelligence</em></em> as “Design of Low-Cost Equipment for Cognitive Robot Research,” which included suggestions for the next iteration of the robot.
  314. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  315. <img alt="Photo of a camera pointed at a teacup, and a computer printout in the shape of a teacup." class="rm-shortcode" data-rm-shortcode-id="e6a2f5b32154adcab759b2cfb6f88a60" data-rm-shortcode-name="rebelmouse-image" id="5062d" loading="lazy" src="https://spectrum.ieee.org/media-library/photo-of-a-camera-pointed-at-a-teacup-and-a-computer-printout-in-the-shape-of-a-teacup.jpg?id=60000511&width=980"/>
  316. <small class="image-media media-caption" placeholder="Add Photo Caption...">Freddy I, completed in 1969, could recognize objects placed in front of it—in this case, a teacup.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">University of Edinburgh</small></p><p>
  317. More people joined the team to build Freddy Mark 1.5, which they finished in May 1971. Freddy 1.5 was a true robotic hand-eye system. The hand consisted of two vertical, parallel plates that could grip an object and lift it off the platform. The eyes were two cameras: one looking directly down on the platform, and the other mounted obliquely on the truss that suspended the hand over the platform. Freddy 1.5’s world was a 2-meter by 2-meter square platform that moved in an
  318. <em><em>x</em></em>-<em><em>y</em></em> plane.
  319. </p><p>
  320. Freddy 1.5 quickly morphed into Freddy II as the team continued to grow. Improvements included force transducers added to the “wrist” that could deduce the strength of the grip, the weight of the object held, and whether it had collided with an object. But what really set Freddy II apart was its versatile assembly program: The robot could be taught to recognize the shapes of various parts, and then after a day or two of programming, it could assemble simple models. The various steps can be seen in this extended video, narrated by Barrow:
  321. </p><p class="shortcode-media shortcode-media-youtube">
  322. <span class="rm-shortcode" data-rm-shortcode-id="ae6e50564471ea752573e0042b6b9e83" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DySegCzN1uE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  323. </p><h2>The Lighthill Report Takes Down Freddy the Robot</h2><p>
  324. And then what happened?
  325. <em><em>So</em></em> much. But before I get into all that, let me just say that rarely do I, as a historian, have the luxury of having my subjects clearly articulate the aims of their projects, imagine the future, and then, years later, reflect on their experiences. As a cherry on top of this historian’s delight, the topic at hand—artificial intelligence—also happens to be of current interest to pretty much everyone.
  326. </p><p>
  327. As with many fascinating histories of technology, events turn on a healthy dose of professional bickering. In this case, the disputants were Michie and the applied mathematician
  328. <a href="https://en.wikipedia.org/wiki/James_Lighthill" rel="noopener noreferrer" target="_blank">James Lighthill</a>, who had drastically different ideas about the direction of robotics research. Lighthill favored applied research, while Michie was more interested in the theoretical and experimental possibilities. Their fight escalated quickly, became public with a televised debate on the BBC, and concluded with the demise of an entire research field in Britain.
  329. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  330. <img alt="Two black and a white photos of white men, both wearing suits and glasses." class="rm-shortcode" data-rm-shortcode-id="5d430c925c54d04d5d53c01ea3134341" data-rm-shortcode-name="rebelmouse-image" id="bfb22" loading="lazy" src="https://spectrum.ieee.org/media-library/two-black-and-a-white-photos-of-white-men-both-wearing-suits-and-glasses.jpg?id=60000572&width=980"/>
  331. <small class="image-media media-caption" placeholder="Add Photo Caption...">A damning report in 1973 by applied mathematician James Lighthill [left] resulted in funding being pulled from the AI and robotics program led by Donald Michie [right]. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Left: Chronicle/Alamy; Right: University of Edinburgh</small></p><p>
  332. It all started in September 1971, when the British Science Research Council, which distributed public funds for scientific research, commissioned Lighthill to survey the state of academic research in artificial intelligence. The SRC was finding it difficult to make informed funding decisions in AI, given the field’s complexity. It suspected that some AI researchers’ interests were too narrowly focused, while others might be outright charlatans. Lighthill was called in to give the SRC a road map.
  333. </p><p>
  334. No intellectual slouch, Lighthill was the Lucasian Professor of Mathematics at the University of Cambridge, a position also held by Isaac Newton, Charles Babbage, and Stephen Hawking. Lighthill solicited input from scholars in the field and completed his report in March 1972. Officially titled “
335. <a href="https://www.chilton-computing.org.uk/inf/literature/reports/lighthill_report/p001.htm" rel="noopener noreferrer" target="_blank">Artificial Intelligence: A General Survey</a>,” but informally called the Lighthill Report, it divided AI into three broad categories: A, for advanced automation; B, for building robots, which also served as a bridge between categories A and C; and C, for computer-based central nervous system research. Lighthill acknowledged some progress in categories A and C, as well as a few disappointments.
  336. </p><p>
  337. Lighthill viewed Category B, though, as a complete failure. “Progress in category B has been even slower and more discouraging,” he wrote, “tending to sap confidence in whether the field of research called AI has any true coherence.” For good measure, he added, “AI not only fails to take the first fence but ignores the rest of the steeplechase altogether.” So very British.
  338. </p><p>
  339. Lighthill concluded his report with his view of the next 25 years in AI. He predicted a “fission of the field of AI research,” with some tempered optimism for achievement in categories A and C but a valley of continued failures in category B. Success would come in fields with clear applications, he argued, but basic research was a lost cause.
  340. </p><p>
  341. The Science Research Council published Lighthill’s report the following year, with responses from
  342. <a href="https://www.chilton-computing.org.uk/inf/literature/reports/lighthill_report/p002.htm" rel="noopener noreferrer" target="_blank">N. Stuart Sutherland</a> of the University of Sussex and <a href="https://www.chilton-computing.org.uk/inf/literature/reports/lighthill_report/p003.htm" rel="noopener noreferrer" target="_blank">Roger M. Needham</a> of the University of Cambridge, as well as <a href="https://www.chilton-computing.org.uk/inf/literature/reports/lighthill_report/p005.htm" rel="noopener noreferrer" target="_blank">Michie</a> and his colleague <a href="https://www.chilton-computing.org.uk/inf/literature/reports/lighthill_report/p004.htm" rel="noopener noreferrer" target="_blank">Longuet-Higgins</a>.
  343. </p><p>
  344. Sutherland sought to relabel category B as “basic research in AI” and to have the SRC increase funding for it. Needham mostly supported Lighthill’s conclusions and called for the elimination of the term AI—“a rather pernicious label to attach to a very mixed bunch of activities, and one could argue that the sooner we forget it the better.”
  345. </p><p>
  346. Longuet-Higgins focused on his own area of interest, cognitive science, and ended with an ominous warning that any spin-off of advanced automation would be “more likely to inflict multiple injuries on human society,” but he didn’t explain what those might be.
  347. </p><p>
  348. Michie, as the United Kingdom’s academic leader in robots and machine intelligence, understandably saw the Lighthill Report as a direct attack on his research agenda. With his funding at stake, he provided the most critical response, questioning the very foundation of the survey: Did Lighthill talk with any international experts? How did he overcome his own biases? Did he have any sources and references that others could check? He ended with a request for
  349. <em><em>more</em></em> funding—specifically the purchase of a DEC System 10 (also known as the PDP-10) mainframe computer. According to Michie, if his plan were followed, Britain would be internationally competitive in AI by the end of the decade.
  350. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  351. <img alt="Black and white photo of a robot hovering over a square platform and surrounded by four young men who are crouching as they look at it. " class="rm-shortcode" data-rm-shortcode-id="e0980bb4044ffcdede61367d9b61b947" data-rm-shortcode-name="rebelmouse-image" id="9b1e2" loading="lazy" src="https://spectrum.ieee.org/media-library/black-and-white-photo-of-a-robot-hovering-over-a-square-platform-and-surrounded-by-four-young-men-who-are-crouching-as-they-look.jpg?id=60000645&width=980"/>
  352. <small class="image-media media-caption" placeholder="Add Photo Caption...">After Michie’s funding was cut, the many researchers affiliated with his bustling lab lost their jobs.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">University of Edinburgh</small></p><p>
  353. This whole affair might have remained an academic dispute, but then the BBC decided to include a debate between Lighthill and a panel of experts as part of its “Controversy” TV series. “<a href="https://sagepus.blogspot.com/2017/02/bbc-controversy-experiment.html" rel="noopener noreferrer" target="_blank">Controversy</a>” was an experiment to engage the public in science. On 9 May 1973, an interested but nonspecialist audience filled the auditorium at the Royal Institution in London to hear the debate.
  354. </p><p>
  355. Lighthill started with a review of his report, explaining the differences he saw between automation and what he called “the mirage” of general-purpose robots. Michie responded with a short film of Freddy II assembling a model, explaining how the robot processes information. Michie argued that AI is a subject with its own purposes, its own criteria, and its own professional standards.
  356. </p><p>
  357. After a brief back and forth between Lighthill and Michie, the show’s host turned to the other panelists:
  358. <a href="https://spectrum.ieee.org/remembering-john-mccarthy" target="_self">John McCarthy</a>, a professor of computer science at Stanford University, and Richard Gregory, a professor in the department of anatomy at the University of Bristol who had been Michie’s colleague at Edinburgh. McCarthy, who coined the term artificial intelligence in 1955, supported Michie’s position that AI should be its own area of research, not simply a bridge between automation and a robot that mimics a human brain. Gregory described how the work of Michie and McCarthy had influenced the field of psychology.
  359. </p><p>
  360. You can
  361. <a href="https://youtu.be/03p2CADwGF8?feature=shared" rel="noopener noreferrer" target="_blank">watch the debate</a> or <a href="https://github.com/Dicklesworthstone/the_lighthill_debate_on_ai" rel="noopener noreferrer" target="_blank">read a transcript</a>.
  362. </p><h2>A Look Back at the Lighthill Report</h2><p>
  363. Despite international support from the AI community, the SRC sided with Lighthill and gutted funding for AI and robotics; Michie had lost. His bustling lab went from being an international center of research to just Michie, a technician, and an administrative assistant. The loss ushered in the first British AI winter, with the United Kingdom making little progress in the field for a decade.
  364. </p><p>
  365. For his part, Michie pivoted and recovered. He decommissioned Freddy II in 1980, at which point it moved to the
  366. <a href="https://www.nms.ac.uk/search-our-collections/collection-search-results?entry=222106" rel="noopener noreferrer" target="_blank">Royal Museum of Scotland</a> (now the <a href="https://www.nms.ac.uk/national-museum-of-scotland" target="_blank">National Museum of Scotland</a>), and he replaced it with a <a href="https://spectrum.ieee.org/unimation-robot" target="_self">Unimation PUMA robot</a>.
  367. </p><p>
  368. In 1983, Michie founded the Turing Institute in Glasgow, an AI lab that worked with industry on both basic and applied research. The year before, he had written
  369. <em>Machine Intelligence and Related Topics: An Information Scientist’s Weekend Book</em> (Gordon and Breach). Michie intended it as intellectual musings that he hoped scientists would read, perhaps on the weekend, to help them get beyond the pursuits of the workweek. The book is wide-ranging, covering his three decades of work.
  370. </p><p>
  371. In the introduction to the chapters covering Freddy and the aftermath of the Lighthill Report, Michie wrote, perhaps with an eye toward history:
  372. </p><blockquote>
  373. “Work of excellence by talented young people was stigmatised as bad science and the experiment killed in mid-trajectory. This destruction of a co-operative human mechanism and of the careful craft of many hands is elsewhere described as a mishap. But to speak plainly, it was an outrage. In some later time when the values and methods of science have further expanded, and those adversary politics have contracted, it will be seen as such.”
  374. </blockquote><p>
  375. History has indeed rendered judgment on the debate and the Lighthill Report. In 2019, for example, computer scientist Maarten van Emden, a colleague of Michie’s,
  376. <a href="https://ieeexplore.ieee.org/document/8903585" rel="noopener noreferrer" target="_blank">reflected</a> on the demise of the Freddy project with these choice words for Lighthill: “a pompous idiot who lent himself to produce a flaky report to serve as a blatantly inadequate cover for a hatchet job.”
  377. </p><p>
  378. And in a March 2024
  379. <a href="https://github.com/Dicklesworthstone/the_lighthill_debate_on_ai" rel="noopener noreferrer" target="_blank">post on GitHub</a>, the blockchain entrepreneur <a href="https://www.jeffreyemanuel.com/" rel="noopener noreferrer" target="_blank">Jeffrey Emanuel</a> thoughtfully dissected Lighthill’s comments and the debate itself. Of Lighthill, he wrote, “I think we can all learn a very valuable lesson from this episode about the dangers of overconfidence and the importance of keeping an open mind. The fact that such a brilliant and learned person could be so confidently wrong about something so important should give us pause.”
  380. </p><p>
  381. Arguably, both Lighthill and Michie correctly predicted certain aspects of the AI future while failing to anticipate others. On the surface, the report and the debate could be described as a dispute over funding. But more fundamentally, they were about the role of academic research in shaping science and engineering and, by extension, society. Ideally, universities can support both applied research and more theoretical work. When funds are limited, though, choices are made. Lighthill chose applied automation as the future, leaving research in AI and machine intelligence out in the cold.
  382. </p><p>
  383. It helps to take the long view. Over the decades, AI research has cycled through several periods of spring and winter, boom and bust. We’re currently in another AI boom. Is this time different? No one can be certain what lies just over the horizon, of course. That very uncertainty is, I think, the best argument for supporting people who experiment and conduct research into fundamental questions, so that they can help all of us dream up the next big thing.
  384. </p><p>
  385. <em>Part of a <a href="https://spectrum.ieee.org/collections/past-forward/" target="_self">continuing series</a> looking at historical artifacts that embrace the boundless potential of technology.
  386. </em>
  387. </p><p>
  388. <em>An abridged version of this article appears in the May 2025 print issue as “This Robot Was the Fall Guy for British AI.”</em>
  389. </p><h3>References</h3><br/><p>Donald Michie’s lab regularly published articles on the group’s progress, especially in <a href="https://www.doc.ic.ac.uk/~shm/MI/mi.html" target="_blank"><em>Machine Intelligence</em></a><em>, </em>a journal founded by Michie.</p><p>The <a href="https://www.chilton-computing.org.uk/inf/literature/reports/lighthill_report/p001.htm" target="_blank">Lighthill Report</a> and <a href="https://www.youtube.com/watch?v=03p2CADwGF8" rel="noopener noreferrer" target="_blank">recordings of the debate</a> are both available in their entirety online—primary sources that capture the intensity of the moment.</p><p>In 2009, a group of alumni from Michie’s Edinburgh lab, including <a href="https://media.aiai.ed.ac.uk/Project/Freddy/Freddy-II-Harry-Barrow-Notes-2009.txt" rel="noopener noreferrer" target="_blank">Harry Barrow</a> and <a href="https://media.aiai.ed.ac.uk/Project/Freddy/Freddy-II-Pat-Ambler-Notes-2009.html" rel="noopener noreferrer" target="_blank">Pat Fothergill (formerly Ambler)</a>, created a website to <a href="https://media.aiai.ed.ac.uk/Project/Freddy/" rel="noopener noreferrer" target="_blank">share their memories </a>of working on Freddy. The site offers great firsthand accounts of the development of the robot. Unfortunately for the historian, they didn’t explore the lasting effects of the experience. A decade later, though, Maarten van Emden did, in his 2019 article “<a href="https://ieeexplore.ieee.org/document/8903585" rel="noopener noreferrer" target="_blank">Reflecting Back on the Lighthill Affair</a>,” in the <em>IEEE Annals of the History of Computing</em>.</p><p>Beyond his academic articles, Michie was a prolific author. Two collections of essays I found particularly useful are <em>On Machine Intelligence</em> (John Wiley & Sons, 1974) and <em>Machine Intelligence and Related Topics: An Information Scientist’s Weekend Book </em>(Gordon and Breach, 1982).</p><p>Jon Agar’s 2020 article “<a href="https://doi.org/10.1017/S0007087420000230" rel="noopener noreferrer" target="_blank">What Is Science for? The Lighthill Report on Artificial Intelligence Reinterpreted</a>” and Jeffrey Emanuel’s <a href="https://github.com/Dicklesworthstone/the_lighthill_debate_on_ai" rel="noopener noreferrer" target="_blank">GitHub post</a> offer historical interpretations on this mostly forgotten blip in the history of robotics and artificial intelligence.</p>]]></description><pubDate>Wed, 30 Apr 2025 14:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/freddy-robot-british-ai-winter</guid><category>Ai winter</category><category>History of ai</category><category>History of robotics</category><category>Past forward</category><category>Research robots</category><category>Type:departments</category><dc:creator>Allison Marsh</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/metal-diamond-shaped-apparatus-with-wires-and-a-short-metal-pole-coming-from-the-top.jpg?id=60000483&amp;width=980"></media:content></item><item><title>Video Friday: High Mobility Robots for Logistics</title><link>https://spectrum.ieee.org/video-friday-robot-marathon</link><description><![CDATA[
  390. <img src="https://spectrum.ieee.org/media-library/robotic-wheeled-vehicle-with-a-white-body-and-adjustable-suspension-on-a-dark-background-carrying-a-cargo-box-underneath-itself.png?id=60035927&width=1500&height=2000&coordinates=555%2C0%2C555%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://uasconferences.com/2025_icuas/">ICUAS 2025</a>: 14–17 May 2025, CHARLOTTE, NC</h5><h5><a href="https://2025.ieee-icra.org/">ICRA 2025</a>: 19–23 May 2025, ATLANTA</h5><h5><a href="https://humanoidssummit.com/">London Humanoids Summit</a>: 29–30 May 2025, LONDON</h5><h5><a href="https://smartconf.jp/content/rcar2025/">IEEE RCAR 2025</a>: 1–6 June 2025, TOYAMA, JAPAN</h5><h5><a href="https://www.edrcoalition.com/2025-energy-drone-robotics-summit">2025 Energy Drone & Robotics Summit</a>: 16–18 June 2025, HOUSTON</h5><h5><a href="https://roboticsconference.org/">RSS 2025</a>: 21–25 June 2025, LOS ANGELES</h5><h5><a href="https://robotx.ethz.ch/education/summer-school.html">ETH Robotics Summer School</a>: 21–27 June 2025, GENEVA</h5><h5><a href="https://ias-19.org/">IAS 2025</a>: 30 June–4 July 2025, GENOA, ITALY</h5><h5><a href="https://clawar.org/icres2025/">ICRES 2025</a>: 3–4 July 2025, PORTO, PORTUGAL</h5><h5><a href="https://2025.worldhaptics.org/">IEEE World Haptics</a>: 8–11 July 2025, SUWON, KOREA</h5><h5><a href="https://ifac2025-msrob.com/">IFAC Symposium on Robotics</a>: 15–18 July 2025, PARIS</h5><h5><a href="https://2025.robocup.org/">RoboCup 2025</a>: 15–21 July 2025, BAHIA, BRAZIL</h5><h5><a href="https://www.ro-man2025.org/">RO-MAN 2025</a>: 25–29 August 2025, EINDHOVEN, THE NETHERLANDS</h5><h5><a href="https://clawar.org/clawar2025/">CLAWAR 2025</a>: 5–7 September 2025, SHENZHEN</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="http://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="vn8xcucgs64"><em>Throughout the course of the past year, LEVA has been designed from the ground up as a novel robot to transport payloads. Although the use of robotics is widespread in logistics, few solutions offer the capability to efficiently transport payloads both in controlled and unstructured environments. Four-legged robots are ideal for navigating any environment a human can, yet few have the features to autonomously move payloads. This is where LEVA shines. 
By combining both wheels (a means of locomotion ideally suited for fast and precise motion on flat surfaces) and legs (which are perfect for traversing any terrain that humans can), LEVA strikes a balance that makes it highly versatile.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a4c8d196a6a2e3ffd22575d3c0f8b6a6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vN8xcucGS64?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://leva.ethz.ch/">LEVA</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="zl1a1x3ezj4">You’ve probably heard about this humanoid robot half-marathon in China, because it got a lot of media attention, which I presume was the goal. And for those of us who remember when Asimo running was a big deal, marathon running is still impressive in some sense. It’s just hard to connect that to these robots doing anything practical, you know?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="96766d2354a4a8d71e431290de27ae4d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zL1a1X3EZj4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.nbcnews.com/news/world/china-robots-race-humans-half-marathon-rcna195586">NBC</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="bszebp26aiu"><em>A robot navigating an outdoor environment with no prior knowledge of the space must rely on its local sensing to perceive its surroundings and plan. This can come in the form of a local metric map or local policy with some fixed horizon. Beyond that, there is a fog of unknown space marked with some fixed cost. In this work, we make a key observation that long-range navigation only necessitates identifying good frontier directions for planning instead of full-map knowledge. To this end, we propose the Long Range Navigator (LRN), which learns an intermediate affordance representation mapping high-dimensional camera images to affordable frontiers for planning, and then optimizing for maximum alignment with the desired goal. 
Through extensive off-road experiments on Spot and a Big Vehicle, we find that augmenting existing navigation stacks with LRN reduces human interventions at test time and leads to faster decision making indicating the relevance of LRN.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="234b14e4f79e6b0b2638fc1a9c768744" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/BSzeBp26aIU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://personalrobotics.github.io/lrn/">LRN</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="iuj_9_itkfs"><em>Goby is a compact, capable, programmable, and low-cost robot that lets you uncover miniature worlds from its tiny perspective.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="dfa9e79e88d4ecdf1e9280466ede11e3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/iuJ_9_ITKFs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>On Kickstarter now, for an absurdly cheap US $80.</p><p>[ <a href="https://www.kickstarter.com/projects/charmedlabs/goby-adventures-in-tinypresence">Kickstarter</a> ]</p><p>Thanks, Rich!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="fxzhorttskc">HEBI robots demonstrated inchworm mobility during the Innovation Faire of the FIRST Robotics World Championships in Houston.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="12dafbde5ba9fcb385e436b967a060e5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FxZhorTtSKc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.hebirobotics.com/">HEBI</a> ]</p><p>Thanks, Andrew!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="r0delcdgyvo">Happy Easter from Flexiv!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c2dceb772e84ef2d3993a187dd4a1eb0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/r0DelCdGyVo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.flexiv.com/">Flexiv</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="fwo2qk5m26s"><em>We are excited to present our proprietary reinforcement learning algorithm, refined through extensive simulations and vast training data, enabling our full-scale humanoid robot, Adam, to master humanlike locomotion. Unlike model-based gait control, our RL-driven approach grants Adam exceptional adaptability. On challenging terrains like uneven surfaces, Adam seamlessly adjusts stride, pace, and balance in real time, ensuring stable, natural movement while boosting efficiency and safety. 
The algorithm also delivers fluid, graceful motion with smooth joint coordination, minimizing mechanical wear, extending operational life, and significantly reducing energy use for enhanced endurance.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="27c889dcd31bcc7311d10fefbb0d895a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/fwO2QK5m26s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pndbotics.com/">PNDbotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="zo-sqfytecs"><em>Inside the GRASP Lab—Dr. Michael Posa and DAIR Lab. Our research centers on control, learning, planning, and analysis of robots as they interact with the world. Whether a robot is assisting within the home or operating in a manufacturing plant, the fundamental promise of robotics requires touching and affecting a complex environment in a safe and controlled fashion. We are focused on developing computationally tractable and data efficient algorithms that enable robots to operate both dynamically and safely as they quickly maneuver through and interact with their environments.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c30089148c58ef1d2cecb714cfc9ed31" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zo-SQFYTECs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://dair.seas.upenn.edu/">DAIR Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="pjc6hujama0">I will never understand why robotics companies feel the need to add the sounds of sick actuators when their robots move.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4a7682e72a5dd3a4ebe8f3d4a88710a8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pJC6huJAmA0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.gotokepler.com/home">Kepler</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="2bmcd2mjmq8"><em>Join Matt Trossen, founder of Trossen Robotics, on a time-traveling teardown through the evolution of our robotic arms! 
In this deep dive, Matt unboxes the ghosts of robots past—sharing behind-the-scenes stories, bold design decisions, lessons learned, and how the industry itself has shifted gears.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0e2fdc3a80ef5aa2cf3f9a4a9d73b9ac" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2bmCd2mjmq8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.trossenrobotics.com/">Trossen</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="btbnh45lnzu">This week’s Carnegie Mellon University Robotics Institute (CMU RI) seminar is a retro edition (2008!) from Charlie Kemp, previously of the Healthcare Robotics Lab at Georgia Tech and now at Hello Robot.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="96a1f121ea881dd06f4719833c55d01c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/bTbnH45lNZU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.cs.cmu.edu/~ri-seminar/2008.fall/2008-10-31.html">CMU RI</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="fqt785t-7kc">This week’s actual CMU RI seminar is from a much more modern version of Charlie Kemp.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ed0ab9572c803715bc8c01820cd9846c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/fQT785T-7kc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>When I started in robotics, my goal was to help robots emulate humans. Yet as my lab worked with people with mobility impairments, my notions of success changed. For assistive applications, emulation of humans is less important than ease of use and usefulness. Helping with seemingly simple tasks, such as scratching an itch or picking up a dropped object, can make a meaningful difference in a person’s life. Even full autonomy can be undesirable, since actively directing a robot can provide a sense of independence and agency. Overall, many benefits of robotic assistance derive from nonhuman aspects of robots, such as being tireless, directly controllable, and free of social characteristics that can inhibit use.</em><br/><em><br/>While technical challenges abound for home robots that attempt to emulate humans, I will provide evidence that human-scale mobile manipulators could benefit people with mobility impairments at home in the near future. I will describe work from my lab and Hello Robot that illustrates opportunities for valued assistance at home, including supporting activities of daily living, leading exercise games, and strengthening social connections. 
I will also present recent progress by Hello Robot toward unsupervised, daily in-home use by a person with severe mobility impairments.</em></blockquote><p>[ <a href="https://www.ri.cmu.edu/ri-seminar-series/">CMU RI</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 25 Apr 2025 16:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robot-marathon</guid><category>Video friday</category><category>Robotics</category><category>Wheeled robots</category><category>Humanoid robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/robotic-wheeled-vehicle-with-a-white-body-and-adjustable-suspension-on-a-dark-background-carrying-a-cargo-box-underneath-itself.png?id=60035927&amp;width=980"></media:content></item><item><title>Video Friday: Robot Boxing</title><link>https://spectrum.ieee.org/video-friday-robot-boxing</link><description><![CDATA[
  391. <img src="https://spectrum.ieee.org/media-library/human-sparring-with-an-advanced-humanoid-robot-in-a-boxing-ring-both-wearing-gear.jpg?id=59950949&width=1500&height=2000&coordinates=530%2C0%2C530%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://robosoft2025.org/">RoboSoft 2025</a>: 23–26 April 2025, LAUSANNE, SWITZERLAND</h5><h5><a href="https://uasconferences.com/2025_icuas/">ICUAS 2025</a>: 14–17 May 2025, CHARLOTTE, N.C.</h5><h5><a href="https://2025.ieee-icra.org/">ICRA 2025</a>: 19–23 May 2025, ATLANTA, GA.</h5><h5><a href="https://humanoidssummit.com/">London Humanoids Summit</a>: 29–30 May 2025, LONDON</h5><h5><a href="https://smartconf.jp/content/rcar2025/">IEEE RCAR 2025</a>: 1–6 June 2025, TOYAMA, JAPAN</h5><h5><a href="https://www.edrcoalition.com/2025-energy-drone-robotics-summit">2025 Energy Drone & Robotics Summit</a>: 16–18 June 2025, HOUSTON, TEXAS</h5><h5><a href="https://roboticsconference.org/">RSS 2025</a>: 21–25 June 2025, LOS ANGELES</h5><h5><a href="https://robotx.ethz.ch/education/summer-school.html">ETH Robotics Summer School</a>: 21–27 June 2025, GENEVA</h5><h5><a href="https://ias-19.org/">IAS 2025</a>: 30 June–4 July 2025, GENOA, ITALY</h5><h5><a href="https://clawar.org/icres2025/">ICRES 2025</a>: 3–4 July 2025, PORTO, PORTUGAL</h5><h5><a href="https://2025.worldhaptics.org/">IEEE World Haptics</a>: 8–11 July 2025, SUWON, SOUTH KOREA</h5><h5><a href="https://ifac2025-msrob.com/">IFAC Symposium on Robotics</a>: 15–18 July 2025, PARIS</h5><h5><a href="https://2025.robocup.org/">RoboCup 2025</a>: 15–21 July 2025, BAHIA, BRAZIL</h5><h5><a href="https://www.ro-man2025.org/">RO-MAN 2025</a>: 25–29 August 2025, EINDHOVEN, THE NETHERLANDS</h5><h5><a href="https://clawar.org/clawar2025/">CLAWAR 2025</a>: 5–7 September 2025, SHENZHEN, CHINA</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="http://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="stlp4z-ul44"><em>Let’s step into a new era of sci-fi and join the fun together! 
Unitree will be live streaming robot combat in about a month, so stay tuned!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ce3cbc5720e025a99b0803d775750fbb" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/StLp4Z-ul44?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/boxing">Unitree</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="yz2in2efate"><em>A team of scientists and students from Delft University of Technology in the Netherlands (TU Delft) has taken first place at the A2RL Drone Championship, in Abu Dhabi, an international race that pushes the limits of physical artificial intelligence, challenging teams to fly fully autonomous drones using only a single camera. The TU Delft drone competed against 13 autonomous drones and even human drone-racing champions, using innovative methods to train deep neural networks for high-performance control.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b8f42ae74ebdc8252c4a4010fc3c7746" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/yz2in2eFATE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.tudelft.nl/en/2025/lr/autonomous-drone-from-tu-delft-defeats-human-champions-in-historic-racing-first">TU Delft</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="h_8n9mm580g">RAI’s Ultra Mobile Vehicle (UMV) is learning some new tricks!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="97663ec99d83db5b4a3659a67a7d4a84" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/H_8n9MM580g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://rai-inst.com/">RAI Institute</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="q2jfq_0lhrm"><em>With 28 moving joints (20 QDD actuators + 8 servo motors), Cosmo can walk using its two feet with a speed of up to 1 meter per second (0.5 meters per second nominal) and balance itself even when pushed. Coordinated with the motion of its head, fingers, arms and legs, Cosmo has a loud and expressive voice for effective interaction with humans. 
Cosmo speaks in canned phrases from the 1990s cartoon he originates from, and his speech can be fully localized in any language.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1d1eef5aebbbc762f42cbaff97b6692f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/q2Jfq_0LHRM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.romela.org/">RoMeLa</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ijg8p8lvjji">We wrote about <a data-linked-post="2656423157" href="https://spectrum.ieee.org/parallel-systems-autonomous-trains" target="_blank">Parallel Systems</a> back in January 2022, and it’s good to see that their creative take on <a data-linked-post="2650278043" href="https://spectrum.ieee.org/autonomous-trains-could-talk-to-each-other-for-cooperative-cruise-control" target="_blank">autonomous rail</a> is still moving along.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cd9c8b52f9ef7d6ef9137e39e2554036" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Ijg8p8lvJjI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://moveparallel.com/">Parallel Systems</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="fov9l5iovks"><em>RoboCake is ready. This edible robotic cake is the result of a collaboration between researchers from EPFL (the Swiss Federal Institute of Technology in Lausanne, Switzerland), the Istituto Italiano di Tecnologia (IIT, the Italian Institute of Technology), and pastry chefs and food scientists from EHL Hospitality Business School, in Lausanne. It takes the form of a robotic wedding cake, decorated with two gummy robotic bears and edible dark-chocolate batteries that power the candles.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7f9ea3fbb094a6a18a335634125cd00c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FoV9L5Iovks?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://actu.epfl.ch/news/robotics-meets-the-culinary-arts/">EPFL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ybqf8zbmo6i"><em>ROBOTERA’s fully self-developed five-finger dexterous hand has upgraded its skills, transforming into an esports hand in the blink of an eye! The XHAND1 features 12 active degrees of freedom, pioneering an industry-first fully direct-drive joint design. It offers exceptional flexibility and sensitivity, effortlessly handling precision tasks like finger opposition, picking up soft objects, and grabbing cards. 
Additionally, it delivers powerful grip strength with a maximum payload of nearly 25 kilograms, making it adaptable to a wide range of complex application scenarios.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2f7484d3175dbaf8f5a4d620802f6556" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/YbQF8ZBmo6I?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.robotera.com/en/">ROBOTERA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="knske9ejpl8"><em>Witness the future of industrial automation as Extend Robotics trials its cutting-edge humanoid robot in Leyland factories. In this groundbreaking video, the robot skillfully connects a master service disconnect unit, a critical task in factory operations. Watch on-site workers seamlessly collaborate with the robot using an intuitive XR (extended reality) interface, blending human expertise with robotic precision.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2084e7317a3abaf1731c4a2dbb9ea7c6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/knSkE9ejPl8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.extendrobotics.com/">Extend Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="gsxtnpwledq">I kind of like the idea of having a mobile robot that lives in my garage and manages the charging and cleaning of my car.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="27f96a564a5ca2d3ef59bb099d4d2ff4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GsXTNpwLeDQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.flexiv.com/case-studies/ev_charging_solution">Flexiv</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="84zaawjqkas"><em>How can we ensure that robots using foundation models, such as large language models, won’t hallucinate when executing tasks in complex, previously unseen environments? Our Safe and Assured Foundation Robots for Open Environments (SAFRON) Advanced Research Concept (ARC) seeks ideas to make sure robots behave only as directed and intended.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f1293426e998fc76f105a3bd710bbd8b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/84zaAwjQkas?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.darpa.mil/research/programs/safron">DARPA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="p3ubmqcpsdk"><em>What if doing your chores was as easy as flipping a switch? 
In this talk and live demo, the roboticist Bernt Børnich, founder of 1X, introduces NEO, a humanoid robot designed to help you out around the house. Watch as NEO shows off its ability to vacuum, water plants, and keep you company, while Børnich tells the story of its development—and shares a vision for robot helpers that could free up your time to focus on what truly matters.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="04fb0f3132ac9b26d958124c87742d60" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/p3uBMqCPSDk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.1x.tech/">1X</a> ] via [ <a href="https://www.ted.com/talks/bernt_bornich_meet_neo_your_robot_butler_in_training">TED</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="4fdasbn7gau"><a data-linked-post="2668844679" href="https://spectrum.ieee.org/rodney-brooks-three-laws-robotics" target="_blank">Rodney Brooks</a> gave a keynote at the Stanford HAI spring conference on “Robotics in a Human-Centered World.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="17ed26b9d3807c2c689edbd8b4658d26" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/4fDASBn7gaU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>There are a bunch of excellent talks from this conference on YouTube at the link below, but I think this panel is especially good—a discussion of going from research to real-world impact.</p><p class="shortcode-media shortcode-media-youtube">
  392. <span class="rm-shortcode" data-rm-shortcode-id="243b68da867f6f1811ace50a7637dc7c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/-aYcOZBG9Fc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  393. </p><p>[ <a href="https://www.youtube.com/@stanfordhai/videos">YouTube</a> ] via [ <a href="https://hai.stanford.edu/events/robotics-in-a-human-centered-world-innovations-and-implications">Stanford HAI</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="6fnrljeyptc">Wing CEO Adam Woodworth discusses <a data-linked-post="2650278373" href="https://spectrum.ieee.org/wing-officially-launches-australian-drone-delivery-service" target="_blank">consumer drone delivery</a> with Peter Diamandis at Abundance 360.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="eb35f87a11dd198e14cfb3d07c1b572d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/6FnRLjEYPtc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://wing.com/">Wing</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="u7dwznmyo3w">This Carnegie Mellon University Robotics Institute seminar is by Sangbae Kim, who was until very recently at MIT but is now the robotics architect at Meta’s Robotics Studio.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a83fff963502f01ab0d5860bb78229b3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/U7dwzNMYO3w?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ri.cmu.edu/ri-seminar-series/">CMU RI</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 18 Apr 2025 16:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robot-boxing</guid><category>Video friday</category><category>Robotics</category><category>Humanoid robots</category><category>Human-robot interaction</category><category>Rodney brooks</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/human-sparring-with-an-advanced-humanoid-robot-in-a-boxing-ring-both-wearing-gear.jpg?id=59950949&amp;width=980"></media:content></item><item><title>The Future of AI and Robotics Is Being Led by Amazon’s Next-Gen Warehouses</title><link>https://spectrum.ieee.org/amazon-ai-robotics</link><description><![CDATA[
  394. <img src="https://spectrum.ieee.org/media-library/robotic-arm-with-suction-cups-lifting-a-cardboard-box-at-an-amazon-warehouse.jpg?id=58107139&width=1500&height=2000&coordinates=1752%2C0%2C1752%2C0"/><br/><br/><p><em>This is a sponsored article brought to you by <a href="https://amazon.jobs/content/en/teams/ftr" rel="noopener noreferrer" target="_blank">Amazon</a>.</em></p><p>The cutting edge of robotics and artificial intelligence (AI) isn’t found only at NASA or in top university labs; increasingly, it is being developed in the warehouses of the e-commerce company Amazon. As online shopping continues to grow, companies like Amazon are pushing the boundaries of these technologies to meet consumer expectations.</p><p>Warehouses, the backbone of the global supply chain, are undergoing a transformation driven by technological innovation. Amazon, at the forefront of this revolution, is leveraging robotics and AI to shape the warehouses of the future. Far from being just a logistics organization, Amazon is positioning itself as a leader in technological innovation, making it a prime destination for <a href="https://amazon.jobs/content/en/teams/ftr" rel="noopener noreferrer" target="_blank">engineers and scientists seeking to shape the future of automation</a>.</p><h2>Amazon: A Leader in Technological Innovation</h2><p>Amazon’s success in e-commerce is built on a foundation of continuous technological innovation. Its fulfillment centers are increasingly becoming hubs of cutting-edge technology where robotics and AI play a pivotal role. Heath Ruder, Director of Product Management at Amazon, explains how Amazon’s approach to integrating robotics with advanced material handling equipment is shaping the future of its warehouses.</p><p>“We’re integrating several large-scale products into our next-generation fulfillment center in Shreveport, Louisiana,” says Ruder. “It’s our first opportunity to get our robotics systems combined under one roof and understand the end-to-end mechanics of how a building can run with incorporated autonomation.” Ruder is referring to the facility’s deployment of its Automated Storage and Retrieval Systems (ASRS), called Sequoia, as well as robotic arms like “Robin” and “Cardinal” and Amazon’s proprietary autonomous mobile robot, “Proteus.”</p><p>Amazon has already deployed “<a href="https://www.amazon.science/latest-news/amazon-robotics-see-robin-robot-arms-in-action" rel="noopener noreferrer" target="_blank">Robin</a>,” a robotic arm that sorts packages for outbound shipping by transferring packages from conveyors to mobile robots. This system is already in use across various Amazon fulfillment centers and has completed over three billion successful package moves. “<a href="https://www.amazon.science/latest-news/how-amazon-robotics-researchers-are-solving-a-beautiful-problem" rel="noopener noreferrer" target="_blank">Cardinal</a>” is another robotic arm system that efficiently packs packages into carts before the carts are loaded onto delivery trucks.</p><p>“<a href="https://www.aboutamazon.com/news/operations/how-amazon-deploys-robots-in-its-operations-facilities" rel="noopener noreferrer" target="_blank">Proteus</a>” is Amazon’s autonomous mobile robot designed to work around people. Unlike traditional robots confined to a restricted area, Proteus is fully autonomous and navigates through fulfillment centers using sensors and a mix of AI and machine learning systems.
It works with human workers and other robots to transport carts full of packages more efficiently.</p><p>The integration of these technologies is estimated to increase operational efficiency by 25 percent. “Our goal is to improve speed, quality, and cost. The efficiency gains we’re seeing from these systems are substantial,” says Ruder. However, the real challenge is scaling this technology across Amazon’s global network of fulfillment centers. “Shreveport was our testing ground and we are excited about what we have learned and will apply at our next building launching in 2025.”</p><p>Amazon’s investment in cutting-edge robotics and AI systems is not just about operational efficiency. It underscores the company’s commitment to being a leader in technological innovation and workplace safety, making it a top destination for engineers and scientists looking to solve complex, real-world problems.</p><h2>How AI Models Are Trained: Learning from the Real World</h2><p>One of the most complex challenges Amazon’s robotics team faces is how to make robots capable of handling a wide variety of tasks that require discernment. Mike Wolf, a principal scientist at <a href="https://spectrum.ieee.org/tag/amazon-robotics" target="_self">Amazon Robotics</a>, plays a key role in developing AI models that enable robots to better manipulate objects, across a nearly infinite variety of scenarios.</p><p>“The complexity of Amazon’s product catalog—hundreds of millions of unique items—demands advanced AI systems that can make real-time decisions about object handling,” explains Wolf. But how do these AI systems learn to handle such an immense variety of objects? Wolf’s team is developing machine learning <a href="https://spectrum.ieee.org/tag/algorithms" target="_self">algorithms</a> that enable robots to learn from experience.</p><p class="pull-quote">“We’re developing the next generation of AI and robotics. For anyone interested in this field, Amazon is the place where you can make a difference on a global scale.” <strong>—Mike Wolf, Amazon Robotics</strong></p><p>In fact, robots at Amazon continuously gather data from their interactions with objects, refining their ability to predict how items will be affected when manipulated. Every interaction a robot has—whether it’s picking up a package or placing it into a container—feeds back into the system, refining the AI model and helping the robot to improve. “AI is continually learning from failure cases,” says Wolf. “Every time a robot fails to complete a task successfully, that’s actually an opportunity for the system to learn and improve.” This data-centric approach supports the development of state-of-the-art AI systems that can perform increasingly complex tasks, such as predicting how objects are affected when manipulated. This predictive ability will help robots determine the best way to pack irregularly shaped objects into containers or handle fragile items without damaging them.</p><p>“We want AI that understands the physics of the environment, not just basic object recognition. 
The goal is to predict how objects will move and interact with one another in real time,” Wolf says.</p><h2>What’s Next in Warehouse Automation</h2><p>Valerie Samzun, Senior Technical Product Manager at Amazon, leads a cutting-edge robotics program that aims to enhance workplace safety and make jobs more rewarding, fulfilling, and intellectually stimulating by allowing robots to handle repetitive tasks.</p><p>“The goal is to reduce certain repetitive and physically demanding tasks from associates,” explains Samzun. “This allows them to focus on higher-value tasks in skilled roles.” This shift not only makes warehouse operations more efficient but also opens up new opportunities for workers to advance their careers by developing new technical skills.</p><p>“Our research combines several cutting-edge technologies,” Samzun shared. “The project uses robotic arms equipped with compliant manipulation tools to detect the amount of force needed to move items without damaging them or other items.” This is an advancement that incorporates learnings from previous Amazon robotics projects. “This approach allows our robots to understand how to interact with different objects in a way that’s safe and efficient,” says Samzun. In addition to robotic manipulation, the project relies heavily on AI-driven algorithms that determine the best way to handle items and utilize space.</p><p>Samzun believes the technology will eventually expand to other parts of Amazon’s operations, finding multiple applications across its vast network. “The potential applications for compliant manipulation are huge,” she says.</p><h2>Attracting Engineers and Scientists: Why Amazon is the Place to Be</h2><p>As Amazon continues to push the boundaries of what’s possible with robotics and AI, it’s also becoming a highly attractive destination for engineers, scientists, and technical professionals. Both Wolf and Samzun emphasize the unique opportunities Amazon offers to those interested in solving real-world problems at scale.</p><p>For Wolf, who transitioned to Amazon from NASA’s Jet Propulsion Laboratory, the appeal lies in the sheer impact of the work. “The draw of Amazon is the ability to see your work deployed at scale. There’s no other place in the world where you can see your robotics work making a direct impact on millions of people’s lives every day,” he says. Wolf also highlights the collaborative nature of Amazon’s technical teams. Whether working on AI algorithms or robotic hardware, scientists and engineers at Amazon are constantly collaborating to solve new challenges.</p><p>Amazon’s culture of innovation extends beyond just technology. It’s also about empowering people. Samzun, who comes from a non-engineering background, points out that Amazon is a place where anyone with the right mindset can thrive, regardless of their academic background. “I came from a business management background and found myself leading a robotics project,” she says. “Amazon provides the platform for you to grow, learn new skills, and work on some of the most exciting projects in the world.”</p><p>For young engineers and scientists, Amazon offers a unique opportunity to work on state-of-the-art technology that has real-world impact. “We’re developing the next generation of AI and robotics,” says Wolf. 
“For anyone interested in this field, Amazon is the place where you can make a difference on a global scale.”</p><h2>The Future of Warehousing: A Fusion of Technology and Talent</h2><p>From Amazon’s leadership, it’s clear that the future of warehousing is about more than just automation. It’s about harnessing the power of robotics and AI to create smarter, more efficient, and safer working environments. But at its core it remains centered on people in its operations and those who make this technology possible—engineers, scientists, and technical professionals who are driven to solve some of the world’s most complex problems.</p><p>Amazon’s commitment to innovation, combined with its vast operational scale, makes it a leader in warehouse automation. The company’s focus on integrating robotics, AI, and human collaboration is transforming how goods are processed, stored, and delivered. And with so many innovative projects underway, the future of Amazon’s warehouses is one where technology and human ingenuity work hand in hand.</p><p>“We’re building systems that push the limits of robotics and AI,” says Wolf. “If you want to work on the cutting edge, this is the place to be.”</p>]]></description><pubDate>Tue, 15 Apr 2025 11:09:00 +0000</pubDate><guid>https://spectrum.ieee.org/amazon-ai-robotics</guid><category>Robotics</category><category>Amazon</category><category>Amazon robotics</category><category>Logistics</category><dc:creator>Dexter Johnson</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/robotic-arm-with-suction-cups-lifting-a-cardboard-box-at-an-amazon-warehouse.jpg?id=58107139&amp;width=980"></media:content></item><item><title>Video Friday: Tiny Robot Bug Hops and Jumps</title><link>https://spectrum.ieee.org/video-friday-hopping-robot-insect</link><description><![CDATA[
  395. <img src="https://spectrum.ieee.org/media-library/person-holding-tweezers-adjusts-a-small-robotic-insect-with-fragile-wings.jpg?id=59862589&width=1500&height=2000&coordinates=225%2C0%2C225%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://robosoft2025.org/">RoboSoft 2025</a>: 23–26 April 2025, LAUSANNE, SWITZERLAND</h5><h5><a href="https://uasconferences.com/2025_icuas/">ICUAS 2025</a>: 14–17 May 2025, CHARLOTTE, N.C.</h5><h5><a href="https://2025.ieee-icra.org/">ICRA 2025</a>: 19–23 May 2025, ATLANTA</h5><h5><a href="https://humanoidssummit.com/">London Humanoids Summit</a>: 29–30 May 2025, LONDON</h5><h5><a href="https://smartconf.jp/content/rcar2025/">IEEE RCAR 2025</a>: 1–6 June 2025, TOYAMA, JAPAN</h5><h5><a href="https://www.edrcoalition.com/2025-energy-drone-robotics-summit">2025 Energy Drone & Robotics Summit</a>: 16–18 June 2025, HOUSTON</h5><h5><a href="https://roboticsconference.org/">RSS 2025</a>: 21–25 June 2025, LOS ANGELES</h5><h5><a href="https://robotx.ethz.ch/education/summer-school.html">ETH Robotics Summer School</a>: 21–27 June 2025, GENEVA</h5><h5><a href="https://ias-19.org/">IAS 2025</a>: 30 June–4 July 2025, GENOA, ITALY</h5><h5><a href="https://clawar.org/icres2025/">ICRES 2025</a>: 3–4 July 2025, PORTO, PORTUGAL</h5><h5><a href="https://2025.worldhaptics.org/">IEEE World Haptics</a>: 8–11 July 2025, SUWON, SOUTH KOREA</h5><h5><a href="https://ifac2025-msrob.com/">IFAC Symposium on Robotics</a>: 15–18 July 2025, PARIS</h5><h5><a href="https://2025.robocup.org/">RoboCup 2025</a>: 15–21 July 2025, BAHIA, BRAZIL</h5><h5><a href="https://www.ro-man2025.org/">RO-MAN 2025</a>: 25–29 August 2025, EINDHOVEN, THE NETHERLANDS</h5><h5><a href="https://clawar.org/clawar2025/">CLAWAR 2025</a>: 5–7 September 2025, SHENZHEN, CHINA</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA</h5><h5><a href="http://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="ulxqz8f59hk"><em>MIT engineers developed an insect-size jumping robot that can traverse challenging terrains while using far less energy than an aerial robot of comparable size.
This tiny hopping robot can leap over tall obstacles and jump across slanted or uneven surfaces carrying about 10 times as much of a payload as an aerial robot of similar size, opening the door to many new applications.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="818e6a23e24c2ef8d68168815abcc118" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UlxQz8F59Hk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://news.mit.edu/2025/hopping-gives-tiny-robot-leg-up-0409">MIT</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="vn5kx3mq41g"><em>CubiX is a wire-driven robot that connects to the environment through wires, with drones used to establish these connections. By integrating with various tools and a robot, it performs tasks beyond the limitations of its physical structure.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a63fc5d392287b4b8ebfcc673c5b8540" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Vn5Kx3mq41g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="http://www.jsk.t.u-tokyo.ac.jp/">JSK Lab</a> ]</p><p>Thanks, Shintaro!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="sp7x8tpnhmw"><em>It’s a game a lot of us played as children—and maybe even later in life: unspooling measuring tape to see how far it would extend before bending. 
But to engineers at the University of California, San Diego, this game was an inspiration, suggesting that measuring tape could become a great material for a robotic gripper.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="921c2a84346b33eca67d9d755a92a0a1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/SP7X8TpNhmw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://today.ucsd.edu/story/a-new-robotic-gripper-based-on-measuring-tape-is-sizing-up-fruit-and-veggie-picking">University of California San Diego</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="veiodeoiqes">I enjoyed the Murderbot books, and the trailer for the TV show actually looks not terrible.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="95ff474fac43f63c92bbc2cde4043764" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vEioDeOiqEs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://tv.apple.com/us/show/murderbot/umc.cmc.5owrzntj9v1gpg31wshflud03">Murderbot</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="yq1fx4joqxq">For <a data-linked-post="2668727611" href="https://spectrum.ieee.org/chef-robotics-food-robots" target="_blank">service robots</a>, being able to operate an unmodified elevator is much more difficult (and much more important) than you might think.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="904abbbae2084aad94ab5ad8c089ebb4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/yq1FX4joqxQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pudurobotics.com/en">Pudu Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ucvhvibpiny"><em>There’s a lot of buzz around impressive robotics demos, but taking physical AI from demo to real-world deployment is a journey that demands serious engineering muscle. Hammering out the edge cases and getting to scale takes 500 times as much effort as it does to get to the first demo. See our process for building this out for the singulation and induction physical AI solution trusted by some of the world’s leading parcel carriers. 
Here’s to the teams that are likewise committed to the grind toward reliability and scale.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bf46c26d5a2905deea5a35cd37a588fd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UCvhvibpInY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dexterity.ai/">Dexterity Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="trjefvq9gt4">I am utterly charmed by the design of this little robot.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6dc8f98a1195fa40c1690e7bdbf3cbd3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/TRjefVQ9gT4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.romela.org/">RoMeLa</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="rg2d2sdjfwi"><em>This video shows a shortened version of Issey Miyake’s Fly With Me runway show from the 2025 Paris Men’s Fashion Week. My collaborators and I brought two industrial robots to life to be the central feature of the minimalist scenography for the Japanese brand.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="159837bbb1038155a01f33dee0b3bbb2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/rg2d2sDJFwI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Each ABB IRB 6640 robot held a 2-meter-square piece of fabric, and moved synchronously in flowing motions to match the emotional timing of the runway show. With only three weeks’ worth of development time and three days on-site, I built custom live-coding tools that opened up the industrial robots to more improvisational workflows. This level of reliable, real-time control unlocked the flexibility needed by the Issey Miyake team to make the necessary last-minute creative decisions for the show.</em></blockquote><p>[ <a href="https://atonaton.com/haute-couture-robotics">Atonaton</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="bdtwbdj7qwo"><em>Meet Clone’s first musculoskeletal android: Protoclone, the most anatomically accurate robot in the world. 
Based on a natural human skeleton, Protoclone is actuated with over 1,000 Myofibers, Clone’s proprietary <a data-linked-post="2657707172" href="https://spectrum.ieee.org/smart-clothes-artificial-muscles" target="_blank">artificial muscle </a>technology.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0cd3b066d4d457e899eea4f9f5ccee63" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/BdTwbdj7qWo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.clonerobotics.com/">Clone Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="hfqgy_5vyz4">There are a lot of heavily produced <a data-linked-post="2666662286" href="https://spectrum.ieee.org/humanoid-robots" target="_blank">humanoid-robot</a> videos from the companies selling them, but now that these platforms are entering the research space, we should start getting a more realistic sense of their capabilities.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3407a3432386f364f901de023ab8af72" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/HfqGy_5VyZ4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ucl.ac.uk/computer-science/">University College London</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="jmm26nc-4ww">Here’s a bit more footage from <a data-linked-post="2671672132" href="https://spectrum.ieee.org/video-friday-rivr-robot-delivery" target="_blank">RIVR</a> on their home-delivery robot.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ce1a53e3d3a64f4795b6cfa90a61037b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/jMm26NC-4ww?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.rivr.ai/">RIVR</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="1iyejnvorca">And now, this.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c430d72312041ba0ee622cc437218e7a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/1IYEjnvOrcA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.engineai.com.cn/">EngineAI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="aucb73kgevs"><em>Robots are at the heart of sci-fi visions of the future, but what if that future is now? And what if those robots, helping us at work and at home, are simply an extension of the tools we’ve used for millions of years? That’s what artist and engineer Catie Cuan thinks, and it’s part of the reason she teaches robots to dance. 
In this episode we meet the people at the frontiers of the future of robotics, and Astro Teller introduces two groundbreaking projects, Everyday Robots and Intrinsic, that have advanced how robots could work not just for us but with us.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ca7700d0160b832a271879fe96d74a1b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/aUcB73KgEVs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.youtube.com/playlist?list=PL7og_3Jqea4U6VgjOfaCGnqp6AiuVfgrU">Moonshot Podcast</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 11 Apr 2025 15:30:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-hopping-robot-insect</guid><category>Video friday</category><category>Robotics</category><category>Quadruped robots</category><category>Industrial robots</category><category>Humanoid robots</category><category>Dancing robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/person-holding-tweezers-adjusts-a-small-robotic-insect-with-fragile-wings.jpg?id=59862589&amp;width=980"></media:content></item><item><title>Video Friday: RIVR Delivers Your Package</title><link>https://spectrum.ieee.org/video-friday-rivr-robot-delivery</link><description><![CDATA[
  396. <img src="https://spectrum.ieee.org/media-library/delivery-robot-parked-outside-the-front-door-of-a-brick-house-next-to-a-small-plant-the-robot-is-offloading-packages-from-its.png?id=59796367&width=1454&height=811&coordinates=343%2C195%2C123%2C74"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://robosoft2025.org/">RoboSoft 2025</a>: 23–26 April 2025, LAUSANNE, SWITZERLAND</h5><h5><a href="https://uasconferences.com/2025_icuas/">ICUAS 2025</a>: 14–17 May 2025, CHARLOTTE, N.C.</h5><h5><a href="https://2025.ieee-icra.org/">ICRA 2025</a>: 19–23 May 2025, ATLANTA, GA.</h5><h5><a href="https://humanoidssummit.com/">London Humanoids Summit</a>: 29–30 May 2025, LONDON</h5><h5><a href="https://smartconf.jp/content/rcar2025/">IEEE RCAR 2025</a>: 1–6 June 2025, TOYAMA, JAPAN</h5><h5><a href="https://www.edrcoalition.com/2025-energy-drone-robotics-summit">2025 Energy Drone & Robotics Summit</a>: 16–18 June 2025, HOUSTON, TEXAS</h5><h5><a href="https://roboticsconference.org/">RSS 2025</a>: 21–25 June 2025, LOS ANGELES</h5><h5><a href="https://robotx.ethz.ch/education/summer-school.html">ETH Robotics Summer School</a>: 21–27 June 2025, GENEVA</h5><h5><a href="https://ias-19.org/">IAS 2025</a>: 30 June–4 July 2025, GENOA, ITALY</h5><h5><a href="https://clawar.org/icres2025/">ICRES 2025</a>: 3–4 July 2025, PORTO, PORTUGAL</h5><h5><a href="https://2025.worldhaptics.org/">IEEE World Haptics</a>: 8–11 July 2025, SUWON, KOREA</h5><h5><a href="https://ifac2025-msrob.com/">IFAC Symposium on Robotics</a>: 15–18 July 2025, PARIS</h5><h5><a href="https://2025.robocup.org/">RoboCup 2025</a>: 15–21 July 2025, BAHIA, BRAZIL</h5><h5><a href="https://www.ro-man2025.org/">RO-MAN 2025</a>: 25–29 August 2025, EINDHOVEN, THE NETHERLANDS</h5><h5><a href="https://clawar.org/clawar2025/">CLAWAR 2025</a>: 5–7 September 2025, SHENZHEN</h5><h5><a href="https://worldrobotsummit.org/en/">World Robot Summit</a>: 10–12 October 2025, OSAKA, JAPAN</h5><h5><a href="http://www.iros25.org/">IROS 2025</a>: 19–25 October 2025, HANGZHOU, CHINA</h5><h5><a href="https://2025humanoids.org/">IEEE Humanoids</a>: 30 September–2 October 2025, SEOUL</h5><h5><a href="https://www.corl.org/">CoRL 2025</a>: 27–30 September 2025, SEOUL</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="doohhmbsxk4">I love the platform and I love the use case, but this particular delivery method is...odd?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f62a4dd81de63c0be65e11b0f317e420" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/doohhmBSxK4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.rivr.ai/">RIVR</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ifo_b43edoc"><em>This is just the beginning of what people and physical AI can accomplish together.  
To recognize business value from collaborative robotics, you have to understand what people do well, what robots do well, and how they best come together to create productivity. DHL and Robust.AI are partnering to define the future of human–robot collaboration.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="dcd9a62afbe63d14bb25883b7834b40c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/IFO_B43EDOc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.robust.ai/">Robust AI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="4u4etupwzhq"><em>Teleoperated robotic characters can perform expressive interactions with humans, relying on the operators’ experience and social intuition. In this work, we propose to create autonomous interactive robots by training a model to imitate operator data. Our model is trained on a dataset of human–robot interactions, where an expert operator is asked to vary the interactions and mood of the robot, while the operator commands and the poses of the human and robot are recorded.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1621fb6ac7fae6f305f6b9a049b080fe" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/4U4etupwzhQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://studios.disneyresearch.com/">Disney Research Studios</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="8hat3illjrk"><em>Introducing THEMIS V2, our all-new full-size humanoid robot. Standing at 1.6 meters with 40 degrees of freedom, THEMIS V2 now features enhanced 6 DoF arms and advanced 7 DoF end-effectors, along with an additional body-mounted stereo camera and up to 200 Tera Operations per Second (TOPS) of onboard AI computing power. These upgrades deliver exceptional capabilities in manipulation, perception, and navigation, pushing humanoid robotics to new heights.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="92db500fecb3f30090a3100dc855c8cf" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8hAt3ILlJrk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.westwoodrobotics.io/themis/">Westwood</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="woxchr1iatm"><em>BMW x Figure Update: This isn’t a test environment—it’s real production operations. 
Real-world robots are advancing our Helix AI and strengthening our end-to-end autonomy to deploy millions of robots.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ef30ac23ac6253594aa17f82a7b0f94c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/WoXCHr1IaTM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.figure.ai/">Figure</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="y5eikgwotwa"><em>On 13 March, at WorldMinds 2025, in the Kaufleuten Theater of Zurich, our team demonstrated for the first time two autonomous vision-based racing drones. It was an epic journey to prepare for this event, given the poor lighting conditions and the safety constraints of a theater filled with more than 500 people! The background screen visualizes in real time the observations of the AI algorithm of each drone. No map, no IMU, no SLAM!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c6fe9cd6e3e2a6d31a486150f61379be" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Y5eIkGwotWA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://rpg.ifi.uzh.ch/">University of Zurich (UZH)</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="0rwyoa7pjcs"><em>Unitree releases Dex5 dexterous hand. Single hand with 20 degrees of freedom (16 active plus 4 passive). Enable smooth backdrivability (direct force control). Equipped with 94 highly sensitive touch points (optional).</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="85725fe29ebc87a04f2768ca44ccd2e8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0rwYOa7pJCs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/Dex5-1/">Unitree</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="20ghg-r9efi">You can say “<a data-linked-post="2650277673" href="https://spectrum.ieee.org/robots-getting-a-grip-on-general-manipulation" target="_blank">real-world manipulation</a>” all you want, but until it’s actually in the real world, I’m not buying it.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="42689483da752c5c6a2bd0a11568c05a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/20GHG-R9eFI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.1x.tech/">1X</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="wvhuxlcuzai"><em>Developed by Pudu X-Lab, FlashBot Arm elevates the capabilities of our flagship FlashBot by blending advanced humanoid manipulation and intelligent delivery capabilities, powered by cutting-edge embodied AI. 
This powerful combination allows the robot to autonomously perform a wide range of tasks across diverse settings, including hotels, office buildings, restaurants, retail spaces, and health care facilities.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b5ac04581ffabbbf16922a61ff64d97e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/WvhUxlcuZAI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pudurobotics.com/en">Pudu Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="fbxgazfqgn8">If you ever wanted to manipulate a trilby with 25 robots, a solution now exists.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6e99e2b31844639d3305305bf8f05f05" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/fbXGAzFQGn8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ieeexplore.ieee.org/document/10943123">Paper</a> ] via [ <a href="https://www.epfl.ch/labs/rrl/">EPFL Reconfigurable Robotics Lab</a> ] published by [ <a href="https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=7083369" target="_blank">IEEE Robotics and Automation Letters</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="nj4pswwvakk">We’ve been sharing videos from the <a data-linked-post="2659336956" href="https://spectrum.ieee.org/tensegrity-robot" target="_blank">Suzumori Endo Robotics Lab</a> at the Institute of Science Tokyo for many years, and Professor Suzumori is now retiring.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="974035466df83768dfbf6f5302918676" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/nj4pSwwvakk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Best wishes to Professor Suzumori!</p><p>[ <a href="https://www.youtube.com/channel/UCOz02yvsxufQPe1vWmlANAA">Suzumori Endo Lab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="s9f7vvjyl6a"><em>No matter the vehicle, traditional control systems struggle when unexpected challenges—like damage, unforeseen environments, or new missions—push them beyond their design limits. 
Our Learning Introspective Control (LINC) program aims to fundamentally improve the safety of mechanical systems, such as ground vehicles, ships, and robotics, using various machine learning methods that require minimal computing power.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e47a6957662bda54a6731cb1d19a22a6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/s9f7VvJYl6A?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.darpa.mil/research/programs/learning-introspective-control">DARPA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="yhkizgkpzm4"><em>NASA’s Perseverance rover captured new images of multiple dust devils while exploring the rim of the Jezero crater on Mars. The largest dust devil was approximately 210 feet wide (65 meters). In this Mars Report, atmospheric scientist Priya Patel explains what dust devils can teach us about weather conditions on the Red Planet.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7bae5a2a4cb5a574bedc219164d05c3a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/YHKIZGKPZm4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://science.nasa.gov/mission/mars-2020-perseverance/">NASA</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 04 Apr 2025 16:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-rivr-robot-delivery</guid><category>Video friday</category><category>Robotics</category><category>Humanoid robots</category><category>Drone racing</category><category>Robot manipulation</category><category>Human-robot interaction</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/delivery-robot-parked-outside-the-front-door-of-a-brick-house-next-to-a-small-plant-the-robot-is-offloading-packages-from-its.png?id=59796367&amp;width=980"></media:content></item><item><title>How Dairy Robots Are Changing Work for Cows (and Farmers)</title><link>https://spectrum.ieee.org/lely-dairy-robots</link><description><![CDATA[
  397. <img src="https://spectrum.ieee.org/media-library/a-large-red-robot-moves-down-the-aisle-of-a-dairy-barn-dispensing-feed-to-cows-lined-up-to-eat.jpg?id=59849165&width=1500&height=2000&coordinates=1063%2C0%2C1064%2C0"/><br/><br/><div class="intro-text">
  398. <p style="font-size: 50px">
  399. <em>
  400. “Mooooo.”
  401. </em>
  402. </p>
  403. </div><p class="drop-caps">
  404. <strong>This dairy barn is</strong> full of cows, as you might expect. <a href="https://www.youtube.com/watch?v=g-zYshsAg1E" rel="noopener noreferrer" target="_blank">Cows are being milked</a>, <a href="https://www.youtube.com/watch?v=wm_Iul30Sx0" rel="noopener noreferrer" target="_blank">cows are being fed</a>, <a href="https://www.youtube.com/watch?v=yCJxN_3nnEc" rel="noopener noreferrer" target="_blank">cows are being cleaned up after</a>, and a few very happy cows are even getting <a href="https://www.youtube.com/watch?v=vq1j4ImZxcw" rel="noopener noreferrer" target="_blank">vigorously scratched behind the ears</a>. “I wonder where the farmer is,” remarks my guide, Jan Jacobs. Jacobs doesn’t seem especially worried, though—the several hundred cows in this barn are being well cared for by a small fleet of fully autonomous robots, and the farmer might not be back for hours. The robots will let him know if anything goes wrong.
  405. </p><p>
  406. At one of the milking robots, several cows are lined up, nose to tail, politely waiting their turn. The cows can get milked by robot whenever they like, which typically means
  407. <a href="https://www.lely.com/us/solutions/organic-grazing/faq/" rel="noopener noreferrer" target="_blank"> more frequently than the twice-a-day milking</a> at a traditional dairy farm. Not only is getting milked more often more comfortable for the cows, but <a href="https://www.lely.com/us/solutions/organic-grazing/production/" rel="noopener noreferrer" target="_blank">cows also produce about 10 percent more milk</a> when the milking schedule is completely up to them.
  408. </p><p>
  409. “There’s a direct correlation between stress and milk production,” Jacobs says. “Which is nice, because robots make cows happier and therefore, they give more milk, which helps us sell more robots.”
  410. </p><p>
  411. <a href="https://www.linkedin.com/in/jan-jacobs-59296828/" rel="noopener noreferrer" target="_blank">Jan Jacobs</a> is the human-robot interaction design lead for <a href="https://www.lely.com/us/" rel="noopener noreferrer" target="_blank">Lely</a>, a maker of agricultural machinery. Founded in 1948 in Maassluis, Netherlands, Lely deployed its first Astronaut milking robot in the early 1990s. The company has since developed other robotic systems that assist with cleaning, feeding, and cow comfort, and the Astronaut milking robot is on its <a href="https://www.lely.com/media/lely-centers-files/brochures/published/astronaut_a5_brochure_e_en.pdf" rel="noopener noreferrer" target="_blank">fifth generation</a>. Lely is now focused entirely on robots for dairy farms, with around 135,000 of them deployed around the world.
  412. </p><h2>Essential Jobs on Dairy Farms</h2><p>
  413. <span>The weather outside the barn is miserable. It’s late fall in the Netherlands, and a cold rain is gusting in from the sea, which is probably why the cows have quite sensibly decided to stay indoors and why the farmer is still nowhere to be found. Lely requires that dairy farmers who adopt its robots commit to letting their cows move freely between milking, feeding, and resting, as well as inside and outside the barn, at their own pace. “We believe that free cow traffic is a core part of the future of farming,” Jacobs says as we watch one cow stroll away from the milking robot while another takes its place. This is possible only when the farm operates on the cows’ schedule rather than a human’s.</span>
  414. </p><p>
  415. A conventional dairy farm relies heavily on human labor. Lely estimates that repetitive daily tasks represent about a third of the average workday of a dairy farmer. In the morning, the cows are milked for the first time. Most dairy cows must be milked at least twice a day or they’ll become uncomfortable, and so the herd will line up on their own. Traditional milking parlors are designed to maximize human milking efficiency. A milking carousel, for instance, slowly rotates cows as they’re milked so that the dairy worker doesn’t have to move between stalls.
  416. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  417. <img alt="Cows entering and exiting a Lely Astronaut milking robot in a modern dairy farm setting." class="rm-shortcode" data-rm-shortcode-id="c207a725fbda29ab8e49e44a3e9aaf98" data-rm-shortcode-name="rebelmouse-image" id="3fad0" loading="lazy" src="https://spectrum.ieee.org/media-library/cows-entering-and-exiting-a-lely-astronaut-milking-robot-in-a-modern-dairy-farm-setting.jpg?id=59764532&width=980"/>
  418. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  419. <img alt="Automated cow milking machine in a dairy farm, cow in position being milked." class="rm-shortcode" data-rm-shortcode-id="4e3ab2f9a50553c4dad2c678fb747a85" data-rm-shortcode-name="rebelmouse-image" id="fc35f" loading="lazy" src="https://spectrum.ieee.org/media-library/automated-cow-milking-machine-in-a-dairy-farm-cow-in-position-being-milked.jpg?id=59759842&width=980"/>
  420. <small class="image-media media-caption" placeholder="Add Photo Caption...">“We were spending 6 hours a day milking,” explains dairy farmer Josie Rozum, whose 120-cow herd at Takes Dairy Farm uses a pair of Astronaut A5 milking robots. “Now that the robots are handling all of that, we can focus more on animal care and comfort.”</small><small class="image-media media-photo-credit" placeholder="add photo credit...">Lely</small>
  421. </p><p>
  422. An experienced human using well-optimized equipment can attach a milking machine to a cow
  423. <a href="https://www.vet.cornell.edu/animal-health-diagnostic-center/programs/quality-milk-production-services/services/parlor-efficiency" rel="noopener noreferrer" target="_blank">in just 20 to 30 seconds</a>. The actual milking takes only a few minutes, but with the average small dairy farm in North America providing a home for <a href="https://hoards.com/article-34810-familiar-dairy-industry-trends-continue.html" rel="noopener noreferrer" target="_blank">several hundred cows</a>, milking typically represents a time commitment of <a href="https://www.sciencedirect.com/science/article/pii/S0022030222003228#fig2" rel="noopener noreferrer" target="_blank">4 to 6 hours per day</a>.
  424. </p><p>
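As a rough sanity check on that figure, here is a back-of-the-envelope sketch, using assumed round numbers for herd size, parlor capacity, and per-cow handling times (illustrative guesses, not values from Lely or the studies linked above):
</p><pre>
# Back-of-the-envelope estimate of daily milking time on a conventional
# dairy farm. All constants are assumed round numbers for illustration.

HERD_SIZE = 300          # cows ("several hundred")
MILKINGS_PER_DAY = 2     # conventional twice-a-day schedule
ATTACH_SECONDS = 25      # 20 to 30 seconds to attach the milking machine
MILK_MINUTES = 5         # actual milking time per cow
PARALLEL_STALLS = 12     # cows milked simultaneously in the parlor

visits = HERD_SIZE * MILKINGS_PER_DAY   # 600 cow-visits per day

# One worker moves between stalls, so attachment time is serial...
attach_hours = visits * ATTACH_SECONDS / 3600
# ...while the milking itself overlaps across the parlor's stalls.
milking_hours = visits * MILK_MINUTES / 60 / PARALLEL_STALLS

print(f"attachment labor: {attach_hours:.1f} h/day")   # ~4.2 h
print(f"parlor occupancy: {milking_hours:.1f} h/day")  # ~4.2 h
# Add setup, cleanup, and moving cows, and the 4-to-6-hour daily
# commitment quoted above looks about right.
</pre><p>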
  425. There are other jobs that must be done every day at a dairy.
  426. <a href="https://www.dairyherd.com/news/education/how-often-should-you-push-feed" rel="noopener noreferrer" target="_blank">Cows are happier with continuous access to food</a>, which means feeding them several times a day. <a href="https://www.americandairy.com/dairy-diary/what-do-dairy-cows-eat/" rel="noopener noreferrer" target="_blank">The feed is a mix of roughage (hay), silage (grass), and grain</a>. The cows will eat all of this, but they prefer the grain, and so it’s common to see cows sorting their food by grabbing a mouthful and throwing it up into the air. The lighter roughage and silage fly farther than the grain does, leaving the cow with a pile of the tastier stuff as the rest gets tossed out of reach. This makes “<a href="https://www.dairyherd.com/news/education/how-often-should-you-push-feed" rel="noopener noreferrer" target="_blank">feed pushing</a>” necessary to shove the rest of the feed back within reach of the cow.
  427. </p><p>
  428. And of course there’s manure. A dairy cow produces an average of
  429. <a href="https://thedairylandinitiative.vetmed.wisc.edu/home/housing-module/adult-cow-housing/manure-management/" rel="noopener noreferrer" target="_blank">68 kilograms of manure a day</a>. All that manure has to be collected and the barn floors regularly cleaned.
  430. </p><h2>Dairy Industry 4.0</h2><p>
  431. The amount of labor needed to operate a dairy meant that until the early 1900s,
  432. <a href="https://youtu.be/sjmflKg-v2o?t=1049" rel="noopener noreferrer" target="_blank">most family farms could support only about eight cows</a>. The introduction of the first milking machines, called <a href="https://www.surgemilker.com/history.html" rel="noopener noreferrer" target="_blank">bucket milkers</a>, helped farmers milk 10 cows per hour instead of 4 by the mid-1920s. Rural electrification furthered dairy automation starting in the 1950s, and since then, both farm size and milk production have increased steadily. In the 1930s, a good dairy cow <a href="https://youtu.be/sjmflKg-v2o?t=1631" rel="noopener noreferrer" target="_blank">produced 3,600 kilograms of milk per year</a>. <a href="https://downloads.usda.library.cornell.edu/usda-esmis/files/h989r321c/mg74sh83p/nc582h285/mkpr0225.pdf" rel="noopener noreferrer" target="_blank">Today, it’s almost 11,000 kilograms</a>, and Lely believes that robots are what will enable small dairy farms to continue to scale sustainably.
  433. </p><p class="shortcode-media shortcode-media-youtube">
  434. <span class="rm-shortcode" data-rm-shortcode-id="3d0c2a865a4f6909e8ce591fab31c88e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/g-zYshsAg1E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  435. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Lely</small>
  436. </p><p>
  437. <span>But dairy robots are expensive. A milking robot can cost </span><a href="https://dairy.unl.edu/automatic-milking-systems-good-bad-and-unknown/" target="_blank">several hundred thousand dollars</a><span>, plus an additional </span><a href="https://www.lelycentermidatlantic.com/index.php/2022/02/01/how-much-does-it-cost-to-operate-a-lely-robot/" target="_blank">US $5,000 to $10,000 per year in operating costs</a><span>. The Astronaut A5, Lely’s latest milking robot, uses a laser-guided robot arm to clean the cow’s udder before attaching teat cups one at a time. While the cow munches on treats, the Astronaut monitors her milk output, collecting data on 32 parameters, including indicators of the quality of the milk and the health of the cow. When milking is complete, the robot cleans the udder again, and the cow is free to leave as the robot steam cleans itself in preparation for the next cow.</span>
  438. </p><p>
  439. <a href="https://www.lely.com/gb/centers/midlands/what-cost-milking-robot/" rel="noopener noreferrer" target="_blank">Lely argues</a> that although the initial cost is higher than that of a traditional milking parlor, the robots pay for themselves over time through higher milk production (due primarily to increased milking frequency) and lower labor costs. Lely’s other robots can also save on labor. The Vector mobile robot handles continuous feeding and feed pushing, and the Discovery Collector is a robotic manure vacuum that keeps the floors clean.
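</p><p>
Whether Lely’s payback argument pencils out depends on numbers that vary from farm to farm. Here is a minimal simple-payback sketch combining the figures above with assumed values for milk price and labor savings; every assumed value is labeled in the comments, and this is an illustration, not Lely’s financial model:
</p><pre>
# Simple-payback sketch for one milking robot. Figures taken from the
# article are noted; everything marked "assumed" is an illustrative guess.

ROBOT_COST = 250_000       # US $, "several hundred thousand dollars"
OPERATING_COST = 7_500     # US $/year, midpoint of the $5,000-$10,000 range
COWS_PER_ROBOT = 60        # an Astronaut A5 can milk up to 60 cows
MILK_PER_COW_KG = 11_000   # kg/year, the modern average cited in the text
YIELD_GAIN = 0.10          # Lely's claimed ~10% boost from free-choice milking
MILK_PRICE = 0.45          # US $/kg, assumed
LABOR_HOURS_SAVED = 1_500  # hours/year, assumed
LABOR_RATE = 18.0          # US $/hour, assumed

extra_milk_revenue = COWS_PER_ROBOT * MILK_PER_COW_KG * YIELD_GAIN * MILK_PRICE
labor_savings = LABOR_HOURS_SAVED * LABOR_RATE
net_annual_benefit = extra_milk_revenue + labor_savings - OPERATING_COST

print(f"net benefit: ${net_annual_benefit:,.0f}/year")                 # ~$49,200
print(f"simple payback: {ROBOT_COST / net_annual_benefit:.1f} years")  # ~5 years
</pre><p>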
  440. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  441. <img alt="Automated feeding robot is loaded with food by a small overhead crane before it leaves to deliver feed to cows inside a modern barn." class="rm-shortcode" data-rm-shortcode-id="9883c894cc153fca40c2315735e6c5f1" data-rm-shortcode-name="rebelmouse-image" id="eb199" loading="lazy" src="https://spectrum.ieee.org/media-library/automated-feeding-robot-is-loaded-with-food-by-a-small-overhead-crane-before-it-leaves-to-deliver-feed-to-cows-inside-a-modern-b.jpg?id=59764133&width=980"/>
  442. <small class="image-media media-caption" placeholder="Add Photo Caption...">At Takes Dairy Farm, Rozum and her family used to spend several hours per day managing food for the cows. “The feeding robot is another amazing piece of the puzzle for our farm that allows us to focus on other things.”</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Takes Family Farm</small>
  443. </p><p>
  444. For most dairy farmers, though, making more money is not the main reason to get a robot, explains
  445. <a href="https://ansci.umn.edu/people/marcia-endres" rel="noopener noreferrer" target="_blank">Marcia Endres</a>, a professor in the department of animal science at the University of Minnesota. Endres specializes in dairy-cattle management, behavior, and welfare, and studies dairy robot adoption. “When we first started doing research on this about 12 years ago, most of the farms that were installing robots were smaller farms that did not want to hire employees,” Endres says. “They wanted to do the work just with family labor, but they also wanted to have more flexibility with their time. They wanted a better lifestyle.”
  446. </p><p>
  447. Flexibility was key for the Takes family, who
  448. <a href="https://youtu.be/vZY8TbBoDd0?t=147" rel="noopener noreferrer" target="_blank">added Lely robots to their dairy farm</a> in Ely, Iowa, four years ago. “When we had our old milking parlor, everything that we did as a family was always scheduled around milking,” says Josie Rozum, who manages the farm and a creamery along with her parents—Dan and Debbie Takes—and three brothers. “With the robots, we can prioritize our personal life a little bit more—we can spend time together on Christmas morning and know that the cows are still getting milked.”
  449. </p><p>
  450. <a href="https://www.dananddebbies.com/about/the-farm/" rel="noopener noreferrer" target="_blank">Takes Family Dairy Farm</a>’s 120-cow herd is milked by a pair of Astronaut A5 robots, with a Vector and three Discovery Collectors for feeding and cleaning. “They’ve become a crucial part of the team,” explains Rozum. “It would be challenging for us to find outside help, and the robots keep things running smoothly.” The robots also add sustainability to small dairy farms, and not just in the short term. “Growing up on the farm, we experienced the hard work, and we saw what that commitment did to our parents,” Rozum explains. “It’s a very tough lifestyle. Having the robots take over a little bit of that has made dairy farming more appealing to our generation.”
  451. </p><p class="shortcode-media shortcode-media-youtube">
  452. <span class="rm-shortcode" data-rm-shortcode-id="c22f75ac2236f457c18ecd55f872ca53" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vZY8TbBoDd0?rel=0&start=147" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  453. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Takes Dairy Farm</small>
  454. </p><p>
  455. Of the 25,000 dairy farms in the United States, Endres estimates about 10 percent have robots. This is
  456. <a href="https://www.gminsights.com/industry-analysis/milking-robots-market" rel="noopener noreferrer" target="_blank">about a third of the adoption rate in Europe</a>, <a href="https://ec.europa.eu/eurostat/statistics-explained/index.php?title=Farms_and_farmland_in_the_European_Union_-_statistics" rel="noopener noreferrer" target="_blank">where farms tend to be smaller</a>, so the cost of implementing the robots is lower. Endres says that over the last five years, she’s seen a shift toward robot adoption at larger farms with over 500 cows, due primarily to labor shortages. “These larger dairies are having difficulty finding employees who want to milk cows—it’s a very tedious job. And the robot is always consistent. The farmers tell me, ‘My robot never calls in sick, and never shows up drunk.’ ”
  457. </p><p>
  458. Endres is skeptical of Lely’s claim that its robots are responsible for increased milk production. “There is no research that proves that cows will be more productive just because of robots,” she says. It may be true that farms that add robots do see increased milk production, she adds, but it’s difficult to measure the direct effect that the robots have. “I have many dairies that I work with where they have both a robotic milking system and a conventional milking system, and if they are managing their cows well, there isn’t a lot of difference in milk production.”
  459. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  460. <img alt="Cow using an automated brush for grooming inside a modern barn." class="rm-shortcode" data-rm-shortcode-id="9106b1e71ad19d377f43b164221e0a40" data-rm-shortcode-name="rebelmouse-image" id="3f0b4" loading="lazy" src="https://spectrum.ieee.org/media-library/cow-using-an-automated-brush-for-grooming-inside-a-modern-barn.jpg?id=59759835&width=980"/>
  461. <small class="image-media media-caption" placeholder="Add Photo Caption...">The Lely Luna cow brush helps to keep cows’ skin healthy. It’s also relaxing and enjoyable, so cows will brush themselves several times a day.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Lely</small>
  462. </p><p>
  463. The robots do seem to improve the cows’ lives, however. “Welfare is not just productivity and health—it’s also the affective state, the ability to have a more natural life,” Endres says. “Again, it’s hard to measure, but I think that on most of these robot farms, their affective state is improved.” The cows’ relationship with humans changes too, comments Endres. When the cows no longer associate humans with being told where to go and what to do all the time, they’re
  464. <a href="https://youtu.be/sjmflKg-v2o?t=2633" rel="noopener noreferrer" target="_blank">much more relaxed and friendly</a> toward people they meet. Rozum agrees. “We’ve noticed a tremendous change in our cows’ demeanor. They’re more calm and relaxed, just doing their thing in the barn. They’re much more comfortable when they can choose what to do.”
  465. </p><h2>Cows Versus Robots</h2><p>
  466. Cows are curious and clever animals, and have the same instinct that humans have when confronted with a new robot: They want to play with it. Because of this, Lely has had to cow-proof its robots, modifying their design and programming so that the machines can function autonomously around cows. Like many mobile robots, Lely’s dairy robots include contact-sensing bumpers that will pause the robot’s motion if it runs into something. On the Vector feeding robot, Lely product engineer
  467. <a href="https://www.linkedin.com/in/beltman42/" rel="noopener noreferrer" target="_blank">René Beltman</a> tells me, they had to add a software option to disable the bumper. “The cows learned that, ‘oh, if I just push the bumper, then the robot will stop and put down more feed in my area for me to eat.’ It was a free buffet. So you don’t want the cows to end up controlling the robot.” Emergency stop buttons had to be relocated so that they couldn’t be pressed by questing cow tongues.
  468. </p><p>
  469. There’s also a social component to cow-robot interaction. Within their herd, cows have a well-established hierarchy, and the robots need to work within this hierarchy to do their jobs. For example, a cow won’t move out of the way if it thinks that another cow is lower in the hierarchy than it is, and it will treat a robot the same way. The engineers had to figure out how the Discovery Collector could drive back and forth to vacuum up manure without getting blocked by cows. “In our early tests, we’d use sensors to have the robot stop to avoid running into any of the cows,” explains Jacobs. “But that meant that the robot became the weakest one in the hierarchy, and it would just end up crying in the corner because the cows wouldn’t move for it. So now, it doesn’t stop.”
  470. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  471. <img alt="Cows resting in pens with a robot cleaning the floor in a modern barn setting." class="rm-shortcode" data-rm-shortcode-id="270244739bf55a596082710523ed2eed" data-rm-shortcode-name="rebelmouse-image" id="c9d2c" loading="lazy" src="https://spectrum.ieee.org/media-library/cows-resting-in-pens-with-a-robot-cleaning-the-floor-in-a-modern-barn-setting.jpg?id=59759837&width=980"/>
  472. <small class="image-media media-caption" placeholder="Add Photo Caption...">One of the dirtiest jobs on a dairy farm is handled by the Discovery Collector, an autonomous manure vacuum. The robot relies on wheel odometry and ultrasonic sensors for navigation because it’s usually covered in manure.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Evan Ackerman</small>
  473. </p><p>
  474. “We make the robot drive slower for the first week, when it’s being introduced to a new herd,” adds Beltman. “That gives the cows time to figure out that the robot is at the top of the hierarchy.”
  475. </p><p>
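Taken together, these adjustments amount to a small set of behavior rules. Here is a toy sketch of how they might compose, with invented names and speeds; it is illustrative pseudologic, not Lely’s control software:
</p><pre>
# Toy composition of the cow-proofing rules described above: an optional
# contact bumper, "top of the hierarchy" driving that doesn't yield to
# cows, and a slower introduction week for a new herd. All names and
# speeds are invented; this is not Lely's control software.

from dataclasses import dataclass

@dataclass
class BehaviorConfig:
    bumper_enabled: bool = True   # disable once cows learn the free-buffet trick
    intro_mode: bool = False      # first week with a new herd
    normal_speed: float = 0.8     # m/s, assumed
    intro_speed: float = 0.3      # m/s, assumed

def commanded_speed(cfg: BehaviorConfig, bumper_pressed: bool) -> float:
    """Return the drive speed for this control cycle."""
    if cfg.bumper_enabled and bumper_pressed:
        return 0.0   # treat contact as a genuine obstacle and stop
    # With the bumper disabled, contact is assumed to be a cow leaning in;
    # keep moving so the robot stays at the top of the herd hierarchy.
    return cfg.intro_speed if cfg.intro_mode else cfg.normal_speed

cfg = BehaviorConfig(bumper_enabled=False, intro_mode=True)
print(commanded_speed(cfg, bumper_pressed=True))   # 0.3: slow, but not stopped
</pre><p>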
  476. Besides maintaining their dominance at the top of the herd, the current generation of Lely robots doesn’t interact much with the cows, but that’s changing, Jacobs tells me. Right now, when a robot is driving through the barn, it makes a beeping sound to let the cows know it’s coming. Lely is looking into how to make these sounds more enjoyable for the cows. “This was a recent revelation for me,” Jacobs says. “We’re not just designing interactions for humans. The cows are our users, too.”
  477. </p><h2>Human-Robot Interaction</h2><p>
  478. Last year, Jacobs and researchers from Delft University of Technology, in the Netherlands,
  479. <a href="https://ieeexplore.ieee.org/document/10660792" rel="noopener noreferrer" target="_blank">presented a paper</a> at the IEEE Human-Robot Interaction (HRI) Conference exploring this concept of robot behavior development on working dairy farms. The researchers visited robotic dairies, interviewed dairy farmers, and held workshops within Lely to establish a robot code of conduct—a guide that Lely’s designers and engineers use when considering how their robots should look, sound, and act, for the benefit of both humans and cows. On the engineering side, this includes practical things like colors and patterns for lights and different types of sounds so that information is communicated consistently across platforms.
  480. </p><p>
  481. But there’s much more nuance to making a robot seem “reliable” or “friendly” to the end user, since such things are not only difficult to define but also difficult to implement in a way that’s appropriate for dairy farmers, who prioritize functionality.
  482. </p><p>
  483. Jacobs doesn’t want his robots to try to be anyone’s friend—not the cow’s, and not the farmer’s. “The robot is an employee, and it should have a professional relationship,” he says. “So the robot might say ‘Hi,’ but it wouldn’t say, ‘How are you feeling today?’ ” What’s more important is that the robots are trustworthy. For Jacobs, instilling trust is simple: “You cannot gain trust by doing tricks. If your robot is reliable and predictable, people will trust it.”
  484. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  485. <img alt="Automated milking machine attached to cow's udders, with cow standing on a slotted floor." class="rm-shortcode" data-rm-shortcode-id="4bec6c118f690770f5c6ecec2f377e4e" data-rm-shortcode-name="rebelmouse-image" id="cf5f5" loading="lazy" src="https://spectrum.ieee.org/media-library/automated-milking-machine-attached-to-cow-s-udders-with-cow-standing-on-a-slotted-floor.jpg?id=59759840&width=980"/>
  486. <small class="image-media media-caption" placeholder="Add Photo Caption...">The electrically driven, pneumatically balanced robotic arm that the Lely Astronaut uses to milk cows is designed to withstand accidental (or intentional) kicks.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Lely</small>
  487. </p><p>
  488. The real challenge, Jacobs explains, is that Lely is largely on its own when it comes to finding the best way of integrating its robots into the daily lives of people who may have never thought they’d have robot employees. “There’s not that much knowledge in the robot world about how to approach these problems,” Jacobs says. “We’re working with almost 20,000 farmers who have a bigger robot workforce than a human workforce. They’re robot managers. And I don’t know that there necessarily are other companies that have a customer base of normal people who have strategic dependence on robots for their livelihood. That is where we are now.”
  489. </p><h2>From Dairy Farmers to Robot Managers</h2><p>
  490. <span>With the additional time and flexibility that the robots enable, some dairy farmers have been able to diversify. On our way back to Lely’s headquarters, we stop at </span><a href="https://boerderijhetlansingerland.nl/melkveebedrijf/" target="_blank">Farm Het Lansingerland</a><span>, owned by a Lely customer who has added a small restaurant and farm shop to his dairy. Large windows look into the barn so that restaurant patrons can watch the robots at work, caring for the cows that produce the cheese that’s on the menu. A self-guided tour takes you right up next to an Astronaut A5 milking robot, while signs on the floor warn of Vector feeding robots on the move. “This farmer couldn’t expand—this was as many cows as he’s allowed to have here,” Jacobs explains to me over cheese sandwiches. “So, he needs to have additional income streams. That’s why he started these other things. And the robots were essential for that.”</span>
  491. </p><p>
  492. The farmer is an early adopter—someone who’s excited about the technology and actively interested in the robots themselves. But most of Lely’s tens of thousands of customers just want a reliable robotic employee, not a science project. “We help the farmer to prepare not just the environment for the robots, but also the mind,” explains Jacobs. “It’s a complete shift in their way of working.”
  493. </p><p>
  494. Besides managing the robots, the farmer must also learn to manage the massive amount of data that the robots generate about the cows. “The amount of data we get from the robots is a game changer,” says Rozum. “We can track milk production, health, and cow habits in real time. But it’s overwhelming. You could spend all day just sitting at the computer, looking at data and not get anything else done. It took us probably a year to really learn how to use it.”
  495. </p><p>
  496. The most significant advantages to farmers come from using the data for long-term optimization, says the University of Minnesota’s Endres. “In a conventional barn, the cows are treated as a group,” she says. “But the robots are collecting data about individual animals, which lets us manage them as individuals.” By combining data from a milking robot and a feeding robot, for example, farmers can close the loop, correlating when and how the cows are fed with their milk production. Lely is doing its best to simplify this type of decision making, says Jacobs. “You need to understand what the data means, and then you need to present it to the farmer in an actionable way.”
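</p><p>
As a simplified illustration of that loop closing, the sketch below joins per-cow records from a milking robot and a feeding robot, computes a herd-wide feed-to-milk correlation, and flags cows that eat normally but underproduce. The file names and columns are invented for illustration and are not Lely’s data format:
</p><pre>
# Simplified "closing the loop": join per-cow data from a milking robot
# and a feeding robot, then flag cows that eat normally but underproduce.
# File names and columns are invented; this is not Lely's data format.

import pandas as pd

milk = pd.read_csv("astronaut_milkings.csv")  # columns: cow_id, date, yield_kg
feed = pd.read_csv("vector_feedings.csv")     # columns: cow_id, date, intake_kg

daily_milk = milk.groupby(["cow_id", "date"], as_index=False)["yield_kg"].sum()
daily_feed = feed.groupby(["cow_id", "date"], as_index=False)["intake_kg"].sum()
daily = daily_milk.merge(daily_feed, on=["cow_id", "date"])

# Herd-wide relationship between what goes in and what comes out.
print("feed-to-milk correlation:", daily["intake_kg"].corr(daily["yield_kg"]))

# Per-cow screen: normal intake but unusually low yield can be an early
# sign of a health problem worth a closer look.
per_cow = daily.groupby("cow_id")[["intake_kg", "yield_kg"]].mean()
flagged = per_cow[
    per_cow["intake_kg"].gt(per_cow["intake_kg"].median())
    & per_cow["yield_kg"].lt(per_cow["yield_kg"].quantile(0.25))
]
print("cows to check:", list(flagged.index))
</pre><p>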
  497. </p><h3>A Robotic Dairy</h3><br/><img alt="Illustration of an automated dairy farm with milking machines, feed dispensers, and cows in various areas." class="rm-shortcode" data-rm-shortcode-id="523f4b6812182dc0756edd9f83d8e771" data-rm-shortcode-name="rebelmouse-image" id="15c0e" loading="lazy" src="https://spectrum.ieee.org/media-library/illustration-of-an-automated-dairy-farm-with-milking-machines-feed-dispensers-and-cows-in-various-areas.png?id=59771554&width=980"/><h3 style="font-size: 20px; letter-spacing: -0.25px; line-height: 24px; padding-top: 15px;">
  498. All dairy farms are different, and farms that decide to give robots a try will often start with just one or two. A highly roboticized dairy barn might look something like this illustration, with a team of many different robots working together to keep the cows comfortable and happy.
  499. </h3><p class="caption">
  500. A: One Astronaut A5 robot can milk up to 60 cows. After the Astronaut cleans the teats, a laser sensor guides a robotic arm to attach the teat cups. Milking takes just a few minutes.
  501. </p><p class="caption">
  502. B: In the feed kitchen, the Vector robot recharges itself while different ingredients are loaded into its hopper and mixed together. Mixtures can be customized for different groups of cows.
  503. </p><p class="caption">
  504. C: The Vector robot dispenses freshly mixed food in small batches throughout the day. A laser measures the height of leftover food to make sure that the cows are getting the right amounts.
  505. </p><p class="caption">
  506. D: The Discovery Collector is a mop and vacuum for cow manure. It navigates the barn autonomously and returns to its docking station to remove waste, refill water, and wirelessly recharge.</p><p class="caption">
  507. E: As it milks, the Astronaut is collecting a huge amount of data—32 different parameters per teat. If it detects an issue, the farmer is notified, helping to catch health problems early.</p><p class="caption">
508. F: Automated gates control meadow access and will keep a cow inside if she’s due to be milked soon (a rule sketched in code below). Cows are identified using RFID collars, which also track their behavior and health.
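</p><p>
Taken together, captions E and F describe a simple control rule: the gate reads a cow’s RFID collar and holds her back if she is due at the milking robot soon. A minimal sketch of that rule follows, with the caveat that the function names, the eight-hour milking interval, and the one-hour “due soon” window are all assumptions for illustration, not Lely’s actual control logic.
</p><pre><code>
# Hypothetical sketch of the meadow-gate rule from caption F. Names and
# thresholds are invented for illustration; this is not Lely's controller.
from datetime import datetime, timedelta

MILKING_INTERVAL = timedelta(hours=8)  # assumed target time between milkings
DUE_SOON_WINDOW = timedelta(hours=1)   # hold cows due within this window

last_milked: dict[str, datetime] = {}  # cow_id -> time of last milking

def on_rfid_read(cow_id: str, now: datetime) -> str:
    """Decide whether the meadow gate opens for the cow just identified."""
    # Unknown cows default to datetime.min, so they count as overdue and stay in.
    due_at = last_milked.get(cow_id, datetime.min) + MILKING_INTERVAL
    if due_at - now <= DUE_SOON_WINDOW:
        return "hold"  # milking is due (or overdue): keep her in the barn
    return "open"      # plenty of time: she may go out to the meadow

# Example: a cow milked 7.5 hours ago is held back for milking.
now = datetime(2025, 4, 1, 12, 0)
last_milked["NL-4721"] = now - timedelta(hours=7, minutes=30)
print(on_rfid_read("NL-4721", now))  # -> hold
</code></pre><p>
In a real barn this decision would presumably also fold in the behavior and health data coming off the same collars.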
  509. </p><h2>A Sensible Future for Dairy Robots</h2><p>
510. After lunch, we stop by Lely headquarters, where bright red life-size cow statues guard the entrance and all of the conference rooms are dairy-themed. We get comfortable in Butter, and I ask Jacobs and Beltman what the future holds for their dairy robots.
  511. </p><p>
  512. In the near term, Lely is focused on making its existing robots more capable. Its latest
513. <a href="https://www.lely.com/us/press/2024/09/10/lely-introduces-autonomous-feed-pushing-robot-for/" rel="noopener noreferrer" target="_blank">feed-pushing robot</a> is equipped with lidar and stereo cameras, which allow it to autonomously navigate around large farms without needing to follow a metal strip bolted to the ground. A new <a href="https://www.lely.com/us/press/2024/09/10/lely-zeta-the-start-of-a-new-chapter-in-dairy-farm/" rel="noopener noreferrer" target="_blank">overhead camera system</a> will leverage AI to recognize individual cows and track their behavior, while also providing farmers with an enormous new dataset that could allow Lely’s systems to help them make more nuanced decisions about cow welfare. The potential of AI is what Jacobs seems most excited about, although he’s cautious as well. “With AI, we’re suddenly going to take away an entirely different level of work. So, we’re thinking about doing research into the meaningfulness of work, to make sure that the things that we do with AI are the things that farmers <em>want</em> us to do with AI.”
  514. </p><p>
  515. “The idea of AI is very intriguing,” comments Rozum. “I think AI could help to simplify things for farmers. It would be a tool, a resource. But we know our cows best, and a farmer’s judgment has to be there too. There’s just some component of dairy farming that you cannot take the human out of. Robots are not going to be successful on a farm unless you have good farmers.”
  516. </p><p>
517. Lely is aware of this and knows that its robots have to find the right balance between being helpful and taking over. “We want to make sure not to take away the kinds of interactions that give dairy farmers joy in their work,” says Beltman. “Like feeding calves—every farmer likes to feed the calves.” Lely does sell an
518. <a href="https://www.lely.com/us/solutions/feeding/calm/" rel="noopener noreferrer" target="_blank">automated calf feeder</a> that many dairy farmers buy, which shows just how delicate that balance is: What’s the best way of designing robots to give humans the flexibility to do the work that they enjoy?
  519. </p><p>
520. “This is where robotics is going,” Jacobs tells me as he gives me a lift to the train station. “As a human, you could have two other humans and six robots, and that’s your company.” Many industries, he says, look to robots to minimize human involvement as much as possible, so that the robots can generate the maximum amount of value for whoever happens to be in charge.
  521. </p><p>
  522. Dairy farms are different. Perhaps that’s because the person buying the robot is the person who most directly benefits from it. But I wonder if the concern over automation of jobs would be mitigated if more companies chose to emphasize the sustainability and joy of work equally with profit. Automation doesn’t have to be zero-sum—if implemented thoughtfully, perhaps robots can make work easier, more efficient, and more fun, too.
  523. </p><p>
  524. Jacobs certainly thinks so. “That’s my utopia,” he says. “And we’re working in the right direction.”
  525. </p><p><em>This article appears in the May 2025 print issue as “Robots for Cows.”</em></p>]]></description><pubDate>Tue, 01 Apr 2025 20:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/lely-dairy-robots</guid><category>Farming</category><category>Farming robots</category><category>Animals</category><category>Lely</category><category>Dairy</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-large-red-robot-moves-down-the-aisle-of-a-dairy-barn-dispensing-feed-to-cows-lined-up-to-eat.jpg?id=59849165&amp;width=980"></media:content></item><item><title>Protecting Robots in Harsh Environments With Advanced Sealing Systems</title><link>https://spectrum.ieee.org/cobots-ipsr</link><description><![CDATA[
  526. <img src="https://spectrum.ieee.org/media-library/robotic-arms-assembling-battery-packs-in-a-high-tech-factory.jpg?id=59060715&width=1500&height=2000&coordinates=592%2C0%2C592%2C0"/><br/><br/><p><em>This is a sponsored article brought to you by <a href="https://www.fst.com/markets/robotics/cobots/" target="_blank">Freudenberg Sealing Technologies</a>.</em></p><p>The increasing deployment of collaborative robots (cobots) in outdoor environments presents significant engineering challenges, requiring highly advanced sealing solutions to ensure reliability and durability. Unlike industrial robots that operate in controlled indoor environments, outdoor cobots are exposed to extreme weather conditions that can compromise their mechanical integrity. Maintenance robots used in servicing wind turbines, for example, must endure intense temperature fluctuations, high humidity, prolonged UV radiation exposure, and powerful wind loads. Similarly, agricultural robots operate in harsh conditions where they are continuously exposed to abrasive dust, chemically aggressive fertilizers and pesticides, and mechanical stresses from rough terrains.</p><p>To ensure these robotic systems maintain long-term functionality, sealing solutions must offer effective protection against environmental ingress, mechanical wear, corrosion, and chemical degradation. Outdoor robots must perform flawlessly in temperature ranges spanning from scorching heat to freezing cold while withstanding constant exposure to moisture, lubricants, solvents, and other contaminants. In addition, sealing systems must be resilient to continuous vibrations and mechanical shocks, which are inherent to robotic motion and can accelerate material fatigue over time.</p><h2>Comprehensive Technical Requirements for Robotic Sealing Solutions</h2><p>The development of sealing solutions for outdoor robotics demands an intricate balance of durability, flexibility, and resistance to wear. Robotic joints, particularly those in high-mobility systems, experience multidirectional movements within confined installation spaces, making the selection of appropriate sealing materials and geometries crucial. Traditional elastomeric O-rings, widely used in industrial applications, often fail under such extreme conditions. Exposure to high temperatures can cause thermal degradation, while continuous mechanical stress accelerates fatigue, leading to early seal failure. Chemical incompatibility with lubricants, fuels, and cleaning agents further contributes to material degradation, shortening operational lifespans.</p><p>Friction-related wear is another critical concern, especially in robotic joints that operate at high speeds. Excessive friction not only generates heat but can also affect movement precision. In collaborative robotics, where robots work alongside humans, such inefficiencies pose safety risks by delaying response times and reducing motion accuracy. Additionally, prolonged exposure to UV radiation can cause conventional sealing materials to become brittle and crack, further compromising their performance.</p><h2>Advanced IPSR Technology: Tailored for Cobots</h2><p>To address these demanding conditions, Freudenberg Sealing Technologies has developed a specialized sealing solution: <a href="https://www.fst.com/markets/robotics/cobots/" target="_blank">Ingress Protection Seals for Robots (IPSR)</a>. 
Unlike conventional seals that rely on metallic springs for mechanical support, the IPSR design features an innovative Z-shaped geometry that dynamically adapts to the axial and radial movements typical in robotic joints.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
527. <img alt="Diagram of a robotic arm displaying PSS, Simmerring, and MSS1 seal locations." class="rm-shortcode" data-rm-shortcode-id="d38abd82e0542acd543defe66d9509ca" data-rm-shortcode-name="rebelmouse-image" id="d5e27" loading="lazy" src="https://spectrum.ieee.org/media-library/diagram-of-a-robotic-arm-displaying-pss-simmering-and-mss1-seal-locations.jpg?id=59068475&width=980"/>
  528. <small class="image-media media-caption" placeholder="Add Photo Caption...">Numerous seals are required in cobots and these are exposed to high speeds and forces.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Freudenberg Sealing Technologies</small></p><p>This unique structural design distributes mechanical loads more efficiently, significantly reducing friction and wear over time. While traditional spring-supported seals tend to degrade due to mechanical fatigue, the IPSR configuration eliminates this limitation, ensuring long-lasting performance. Additionally, the optimized contact pressure reduces frictional forces in robotic joints, thereby minimizing heat generation and extending component lifespans. This results in lower maintenance requirements, a crucial factor in applications where downtime can lead to significant operational disruptions.</p><h2>Optimized Through Advanced Simulation Techniques</h2><p>The development of IPSR technology relied extensively on Finite Element Analysis (FEA) simulations to optimize seal geometries, material selection, and surface textures before physical prototyping. These advanced computational techniques allowed engineers to predict and enhance seal behavior under real-world operational conditions.</p><p>FEA simulations focused on key performance factors such as frictional forces, contact pressure distribution, deformation under load, and long-term fatigue resistance. By iteratively refining the design based on simulation data, Freudenberg engineers were able to develop a sealing solution that balances minimal friction with maximum durability.</p><p>Furthermore, these simulations provided insights into how IPSR seals would perform under extreme conditions, including exposure to humidity, rapid temperature changes, and prolonged mechanical stress. This predictive approach enabled early detection of potential failure points, allowing for targeted improvements before mass production. By reducing the need for extensive physical testing, Freudenberg was able to accelerate the development cycle while ensuring high-performance reliability.</p><h2>Material Innovations: Superior Resistance and Longevity</h2><p>The effectiveness of a sealing solution is largely determined by its material composition. Freudenberg utilizes advanced elastomeric compounds, including Fluoroprene XP and EPDM, both selected for their exceptional chemical resistance, mechanical strength, and thermal stability.</p><p>Fluoroprene XP, in particular, offers superior resistance to aggressive chemicals, including solvents, lubricants, fuels, and industrial cleaning agents. Additionally, its resilience against ozone and UV radiation makes it an ideal choice for outdoor applications where continuous exposure to sunlight could otherwise lead to material degradation. EPDM, on the other hand, provides outstanding flexibility at low temperatures and excellent aging resistance, making it suitable for applications that require long-term durability under fluctuating environmental conditions.</p><p>To further enhance performance, Freudenberg applies specialized solid-film lubricant coatings to IPSR seals. These coatings significantly reduce friction and eliminate stick-slip effects, ensuring smooth robotic motion and precise movement control. 
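</p><p>
The simulate-and-refine workflow described above can be pictured as a loop over candidate designs, each scored by a solver run. The sketch below is schematic only: evaluate_seal stands in for a genuine FEA solve of the Z-shaped cross-section, and every parameter, formula, and threshold in it is invented for illustration rather than drawn from Freudenberg’s process.
</p><pre><code>
# Schematic design-space sweep in the spirit of the FEA workflow above.
# evaluate_seal() is a placeholder; its formulas are invented, not physics.
from itertools import product

def evaluate_seal(lip_angle_deg, interference_mm, compound):
    """Stand-in for an FEA run returning (friction, contact_pressure)."""
    # A real run would mesh the cross-section and solve; the compound would
    # select a hyperelastic material model. This placeholder ignores it.
    friction = 0.02 * interference_mm * (1 + lip_angle_deg / 90)
    pressure = 2.5 * interference_mm * (90 / lip_angle_deg)
    return friction, pressure

best = None
for angle, interference, compound in product(
    (30, 45, 60),                # candidate lip angles, degrees
    (0.1, 0.2, 0.3),             # candidate radial interference, mm
    ("Fluoroprene XP", "EPDM"),  # compounds named in the article
):
    friction, pressure = evaluate_seal(angle, interference, compound)
    if pressure < 1.0:           # discard designs that would not seal
        continue
    if best is None or friction < best[0]:
        best = (friction, angle, interference, compound)

print("lowest-friction design that still seals:", best)
</code></pre><p>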
This friction management not only improves energy efficiency but also enhances the overall responsiveness of robotic systems, an essential factor in high-precision automation.</p><h2>Extensive Validation Through Real-World Testing</h2><p>While advanced simulations provide critical insights into seal behavior, empirical testing remains essential for validating real-world performance. Freudenberg subjected IPSR seals to rigorous durability tests, including prolonged exposure to moisture, dust, temperature cycling, chemical immersion, and mechanical vibration.</p><p>Throughout these tests, IPSR seals consistently achieved IP65 certification, demonstrating their ability to effectively prevent environmental contaminants from compromising robotic components. Real-world deployment in maintenance robotics for wind turbines and agricultural automation further confirmed their reliability, with extensive wear analysis showing significantly extended operational lifetimes compared to traditional sealing technologies.</p><h2>Safety Through Advanced Friction Management</h2><p>In collaborative robotics, sealing performance plays a direct role in operational safety. Excessive friction in robotic joints can delay emergency-stop responses and reduce motion precision, posing potential hazards in human-robot interaction. By incorporating low-friction coatings and optimized sealing geometries, Freudenberg ensures that robotic systems respond rapidly and accurately, enhancing workplace safety and efficiency.</p><h2>Tailored Sealing Solutions for Various Robotic Systems</h2><p>Freudenberg Sealing Technologies provides customized sealing solutions across a wide range of robotic applications, ensuring optimal performance in diverse environments.</p><p>Automated Guided Vehicles (AGVs) operate in industrial settings where they are exposed to abrasive contaminants, mechanical vibrations, and chemical exposure. Freudenberg employs reinforced PTFE composites to enhance durability and protect internal components.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  529. <img alt="Diagram showing different sealing technologies in a device: PSS, Simmerring, MSS1, and eCON." class="rm-shortcode" data-rm-shortcode-id="c07b32f7cfa28f4ef5224beb006b29df" data-rm-shortcode-name="rebelmouse-image" id="901a7" loading="lazy" src="https://spectrum.ieee.org/media-library/diagram-showing-different-sealing-technologies-in-a-device-pss-simmerring-mss1-and-econ.jpg?id=59068017&width=980"/>
530. <small class="image-media media-caption" placeholder="Add Photo Caption...">Delta robots can perform complex movements at high speed. This requires seals that meet demanding dynamic and acceleration requirements.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Freudenberg Sealing Technologies</small></p><p>Delta robots, commonly used in food processing, pharmaceuticals, and precision electronics, require FDA-compliant materials that withstand rigorous cleaning procedures such as Cleaning-In-Place (CIP) and Sterilization-In-Place (SIP). Freudenberg utilizes advanced fluoropolymers that maintain structural integrity under aggressive sanitation processes.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
531. <img alt="A mechanical device with colored dots indicating PSS, Simmerring®, MSS1, and eCON components." class="rm-shortcode" data-rm-shortcode-id="43ef9141b9e056ba3ae032f122b3db17" data-rm-shortcode-name="rebelmouse-image" id="5fbb8" loading="lazy" src="https://spectrum.ieee.org/media-library/a-mechanical-device-with-colored-dots-indicating-pss-simmering-u00ae-mss1-and-econ-components.jpg?id=59068943&width=980"/>
  532. <small class="image-media media-caption" placeholder="Add Photo Caption...">Seals for Scara robots must have high chemical resistance, compressive strength and thermal resistance to function reliably in a variety of industrial environments.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Freudenberg Sealing Technologies</small></p><p>SCARA robots benefit from Freudenberg’s Modular Plastic Sealing Concept (MPSC), which integrates sealing, bearing support, and vibration damping within a compact, lightweight design. This innovation optimizes robot weight distribution and extends component service life.</p><p>Six-axis robots used in automotive, aerospace, and electronics manufacturing require sealing solutions capable of withstanding high-speed operations, mechanical stress, and chemical exposure. Freudenberg’s Premium Sine Seal (PSS), featuring reinforced PTFE liners and specialized elastomer compounds, ensures maximum durability and minimal friction losses.</p><h2>Continuous Innovation for Future Robotic Applications</h2><p>Freudenberg Sealing Technologies remains at the forefront of innovation, continuously developing new materials, sealing designs, and validation methods to address evolving challenges in robotics. Through strategic customer collaborations, cutting-edge material science, and state-of-the-art simulation technologies, Freudenberg ensures that its sealing solutions provide unparalleled reliability, efficiency, and safety across all robotic platforms.</p>]]></description><pubDate>Tue, 01 Apr 2025 15:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/cobots-ipsr</guid><category>Robotics</category><category>Collaborative robots</category><category>Industrial robots</category><category>Type:sponsored</category><dc:creator>Hunter Cheng</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/robotic-arms-assembling-battery-packs-in-a-high-tech-factory.jpg?id=59060715&amp;width=980"></media:content></item><item><title>The Tiniest Flying Robot Soars, Thanks to Magnets</title><link>https://spectrum.ieee.org/flying-robot-2671447539</link><description><![CDATA[
533. <img src="https://spectrum.ieee.org/media-library/a-miniature-drone-on-top-of-a-u-s-penny.jpg?id=59700962&width=1500&height=2000&coordinates=939%2C0%2C940%2C0"/><br/><br/><p><span>A new prototype is laying claim to the title of smallest, lightest untethered flying robot.</span></p><p>At less than a centimeter in wingspan, the <u><a href="https://spectrum.ieee.org/charging-rooms-can-power-devices-without-wires" target="_self">wirelessly powered</a></u> robot is currently very limited in how far it can travel away from the magnetic fields that drive its flight. However, the scientists who developed it suggest there are ways to boost its range, which could lead to potential applications such as search-and-rescue operations, inspecting damaged machinery in industrial settings, and even plant pollination.</p><p>One strategy to shrink <u><a href="https://spectrum.ieee.org/smallest-drone" target="_self"><span>flying robots</span></a></u> involves removing their batteries and using tethers to supply electricity to them. However, tethered flying robots face problems operating freely in complex environments. This has led some researchers to explore wireless methods of powering robot flight.</p><p>“The dream was to make flying robots to fly anywhere and anytime without using an electrical wire for the power source,” says <a href="https://me.berkeley.edu/people/liwei-lin/" target="_blank">Liwei Lin</a>, a professor of mechanical engineering at the University of California, Berkeley. Lin and his fellow researchers detailed <u><a href="https://doi.org/10.1126/sciadv.ads6858" target="_blank"><span>their findings</span></a></u> in <em>Science Advances</em>.</p><h2>3D-Printed Flying-Robot Design</h2><p>Each flying robot has a <u><a href="https://spectrum.ieee.org/drone-3dprint" target="_self"><span>3D-printed</span></a></u> body that consists of a propeller with four blades. This rotor is encircled by a ring that helps the robot stay balanced during flight. On top of each body are two tiny permanent magnets.</p><p>All in all, the insect-scale prototypes have wingspans as small as 9.4 millimeters and weigh as little as 21 milligrams. Previously, the smallest reported flying robot, either tethered or untethered, was <u><a href="https://ieeexplore.ieee.org/document/7989378" target="_blank"><span>28 mm wide</span></a></u>.</p><p>When exposed to an external alternating magnetic field, the robots spin and fly without tethers. The lowest magnetic field strength needed to maintain flight is 3.1 millitesla. By comparison, a refrigerator magnet has a strength of about <u><a href="https://nationalmaglab.org/about-the-maglab/around-the-lab/maglab-dictionary/tesla/" target="_blank"><span>10 mT</span></a></u>.</p><p>When the applied magnetic field alternates with a frequency of 310 hertz, the robots can hover. At 340 Hz, they accelerate upward. The researchers could steer the robots laterally by adjusting the applied magnetic fields. The robots could also right themselves after collisions to stay airborne without complex sensing or controlling electronics, as long as the impacts were not too large.</p><p>Experiments show the lift force the robots generate can exceed their weight by 14 percent, giving them capacity to carry payloads. For instance, a prototype that’s 20.5 mm wide and weighs 162.4 milligrams could carry an infrared sensor weighing 110 mg to scan its environment.
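</p><p>
The reported operating points (about 310 Hz to hover and 340 Hz to climb) suggest how simply vertical motion could be commanded from the coil driver. The sketch below expresses that idea; set_coil_frequency is a hypothetical stand-in for a coil-driver interface, not the authors’ software, and the linear interpolation between the two frequencies is an assumption.
</p><pre><code>
# Hypothetical vertical-rate command built on the reported operating points:
# ~310 Hz to hover, ~340 Hz to accelerate upward. set_coil_frequency() is a
# stand-in for real coil-driver hardware I/O, not the authors' API.
HOVER_HZ = 310.0
CLIMB_HZ = 340.0

def set_coil_frequency(freq_hz: float) -> None:
    print(f"driving coils at {freq_hz:.0f} Hz")  # placeholder for hardware

def command_vertical(rate: float) -> None:
    """rate in [-1, 1]: -1 descend, 0 hover, +1 climb at full rate."""
    rate = max(-1.0, min(1.0, rate))
    # Interpolate around the hover point, assuming lift rises with frequency
    # near these operating conditions (and falls below the hover frequency).
    set_coil_frequency(HOVER_HZ + rate * (CLIMB_HZ - HOVER_HZ))

command_vertical(0.0)   # hover at ~310 Hz
command_vertical(1.0)   # climb at ~340 Hz
command_vertical(-0.5)  # ease downward below the hover frequency
</code></pre><p>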
The robots proved efficient at converting the energy given them into lift force—better than nearly all other reported flying robots, tethered or untethered, and also better than fruit flies and hummingbirds.</p><p>Currently the maximum operating range of these prototypes is about 10 centimeters away from the magnetic coils. One way to extend the operating range of these robots is to increase the magnetic field strength they experience tenfold by adding more coils, optimizing the configuration of these coils, and using <u><a href="https://ieeexplore.ieee.org/document/7472415" target="_blank"><span>beamforming coils</span></a></u>, Lin notes. Such developments could allow the robots to fly up to a meter away from the magnetic coils.</p><p>The scientists could also miniaturize the robots even further. This would make them lighter, and so reduce the magnetic field strength they need for propulsion. “It could be possible to drive micro flying robots using electromagnetic waves such as those in radio or cellphone transmission signals,” Lin says. Future research could also place devices that can convert magnetic energy to electricity on board the robots to power electronic components, the researchers add.</p>]]></description><pubDate>Fri, 28 Mar 2025 18:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/flying-robot-2671447539</guid><category>Robots</category><category>Microdrones</category><category>Drones</category><category>Flying robots</category><dc:creator>Charles Q. Choi</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-miniature-drone-on-top-of-a-u-s-penny.jpg?id=59700962&amp;width=980"></media:content></item></channel></rss>
