Congratulations!

[Valid RSS] This is a valid RSS feed.

Recommendations

This feed is valid, but interoperability with the widest range of feed readers could be improved by implementing the following recommendations.

Source: https://feeds.feedburner.com/IeeeSpectrumRobotics?format=xml
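Since the feed validates as RSS 2.0, standard reader libraries should be able to consume it without special handling. As an illustrative sketch only (not part of the validator's output, and assuming Python with the third-party feedparser package installed), the source listed below could be fetched and iterated like this:

```python
# Illustrative sketch: fetch and parse the IEEE Spectrum Robotics feed.
# Assumes the third-party "feedparser" package (pip install feedparser).
import feedparser

FEED_URL = "https://feeds.feedburner.com/IeeeSpectrumRobotics?format=xml"

feed = feedparser.parse(FEED_URL)

# Channel-level metadata maps to the <channel> element shown in the source below.
print(feed.feed.title)   # e.g. "IEEE Spectrum"
print(feed.feed.link)    # e.g. "https://spectrum.ieee.org/"

# Each <item> element becomes an entry with title, link, and publication date.
for entry in feed.entries:
    print(entry.title, entry.link, entry.get("published", ""))
```

Namespaced fields in the source (dc:creator, media:content) are typically exposed as well; feedparser, for instance, surfaces dc:creator as an entry's author attribute.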

  1. <?xml version="1.0" encoding="utf-8"?>
  2. <rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:media="http://search.yahoo.com/mrss/"><channel><title>IEEE Spectrum</title><link>https://spectrum.ieee.org/</link><description>IEEE Spectrum</description><atom:link href="https://spectrum.ieee.org/feeds/topic/robotics.rss" rel="self"></atom:link><language>en-us</language><lastBuildDate>Fri, 26 Jul 2024 20:24:43 -0000</lastBuildDate><image><url>https://spectrum.ieee.org/media-library/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8yNjg4NDUyMC9vcmlnaW4ucG5nIiwiZXhwaXJlc19hdCI6MTc2MzA3MTQzOX0.SxRBIud_XE2YWQFaIJD9BPB1w-3JsFhiRkJIIe9Yq-g/image.png?width=210</url><link>https://spectrum.ieee.org/</link><title>IEEE Spectrum</title></image><item><title>Video Friday: Robot Baby With a Jet Pack</title><link>https://spectrum.ieee.org/video-friday-robot-baby-jetpack</link><description><![CDATA[
  3. <img src="https://spectrum.ieee.org/media-library/a-robot-with-jet-engines-with-blue-flames-coming-out-attached-to-its-arms-and-back-stands-in-a-safety-frame-on-a-rooftop-at-nigh.gif?id=52974821&width=1200&height=800&coordinates=62%2C0%2C63%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://icra40.ieee.org/">ICRA@40</a>: 23–26 September 2024, ROTTERDAM, NETHERLANDS</h5><h5><a href="https://iros2024-abudhabi.org/">IROS 2024</a>: 14–18 October 2024, ABU DHABI, UAE</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="wkhqym57usw">If the Italian Institute of Technology’s iRonCub3 looks this cool while <em>learning</em> to fly, just imagine how cool it will look when it actually takes off!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d6dae7256db9afa12f8a7317404c4962" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wKhqym57USw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Hovering is in the works, but this is a really hard problem, which you can read more about in Daniele Pucci’s post on LinkedIn.</p><p>[ <a href="https://www.linkedin.com/posts/daniele-pucci-78428420_flying-jetpowered-humanoidrobotics-activity-7221876503725166595-nWSL?utm_source=share&utm_medium=member_desktop">LinkedIn</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="8j_wit-rd74"><em>Stanford Engineering and the Toyota Research Institute achieve the world’s first autonomous tandem drift. Leveraging the latest AI technology, Stanford Engineering and TRI are working to make driving safer for all. By automating a driving style used in motorsports called drifting—in which a driver deliberately spins the rear wheels to break traction—the teams have unlocked new possibilities for future safety systems.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4a3a13a9bc7a4bfee74f2c445698288d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8J_WiT-RD74?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://toyotaresearch.medium.com/stanford-engineering-and-toyota-research-institute-achieve-worlds-first-autonomous-tandem-drift-131fcb9a76a9">TRI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="dx7y0bchwmw"><em>Researchers at the Istituto Italiano di Tecnologia (Italian Institute of Technology) have demonstrated that under specific conditions, humans can treat robots as coauthors of the results of their actions. 
The condition that enables this phenomenon is a robot that behaves in a social, humanlike manner. Engaging in eye contact and participating in a common emotional experience, such as watching a movie, are key.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="60307c73020e7095cf8efbc264655a7b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DX7y0bChWmw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.science.org/doi/10.1126/scirobotics.adj3665">Science Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="hjueskudnds">If Aibo is not quite catlike enough for you, here you go.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c923a7cf3666cd8c8b9815fe1804893d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/hJUesKudNDs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.makuake.com/project/maicat/">Maicat</a> ] via [ <a href="https://robotstart.info/2024/07/24/cat-robot-maicat-on-sale-japan.html">RobotStart</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="69svc-43oqg">I’ve never been more excited for a sim-to-real gap to be bridged.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bcb4c13c6097ae2ac5f1fb6809ba752c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/69SVc-43Oqg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sites.usc.edu/quann/">USC Viterbi</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="m-utklmylhk">I’m sorry, but this looks exactly like a quadrotor sitting on a test stand.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="51c5a294715c623d6e6295b2c3111ce7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/M-utKlMYlHk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>The 12-pound Quad-Biplane combines four rotors and two wings without any control surfaces. The aircraft takes off like a conventional quadcopter and transitions to a more-efficient horizontal cruise flight, similar to that of a biplane. This combines the simplicity of a quadrotor design, providing vertical flight capability, with the cruise efficiency of a fixed-wing aircraft. 
The rotors are responsible for aircraft control both in vertical and forward cruise flight regimes.</em></blockquote><p>[ <a href="https://avfl.engr.tamu.edu/">AVFL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="okj-newfeci">Tensegrity robots are so weird, and I so want them to be useful.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a09f13b1b2e8b94f78ceb2164b0a153c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/OkJ-nEWFEcI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="http://www-robot.mes.titech.ac.jp/index_e.html">Suzumori Endo Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="xqqxnbgr3ii">Top-performing robots need all the help they can get.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d1993d56b4ee8850163cce456795cd04" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xqqxnbGR3II?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://b-human.de/index.html">Team B-Human</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="-ca3xqe6z_4">And now: a beetle nearly hit by an autonomous robot.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="61aec8c749093473535f7c49aea3aa54" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/-Ca3xqe6z_4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://yugu.faculty.wvu.edu/">WVUIRL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="tmgk0ata5hk"><em>Humans possess a remarkable ability to react to unpredictable perturbations through immediate mechanical responses, which harness the visco-elastic properties of muscles to maintain balance. 
Inspired by this behavior, we propose a novel design of a robotic leg utilizing fiber-jammed structures as passive compliant mechanisms to achieve variable joint stiffness and damping.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a6f722be407faca7426d51b4544886d4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/TMGk0ATA5Hk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2308.01758">Paper</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="wjb4l4gly8c">I don’t know what this piece of furniture is, but your cats will love it.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9bded74939eec40eb92d0903446ae5bf" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/WJB4L4gLY8c?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://new.abb.com/news/detail/117938/cstmr-robot-wound-veneer-a-sustainable-building-material-for-the-future">ABB</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="sd0p5mmaykm"><em>This video shows a dexterous avatar humanoid robot with VR teleoperation, hand tracking, and speech recognition to achieve highly dexterous mobile manipulation. Extend Robotics is developing a dexterous remote-operation interface to enable data collection for embodied AI and humanoid robots.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="eace905cc56c5c3645894e2bd582f71e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/sD0p5mmAYKM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.extendrobotics.com/">Extend Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="diautbgrxki"><em>I never really thought about this, but wind turbine blades are hollow inside and need to be inspected sometimes, which is really one of those jobs where you’d much rather have a robot do it.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a7b14059d0c2df39a96be46cde0e920d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dIAUTbgRXkI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.flyability.com/">Flyability</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="zqypazevbia"><em>Here’s a full, uncut drone-delivery mission, including a package pickup from our AutoLoader—a simple, nonpowered mechanical device that allows retail partners to utilize drone delivery with existing curbside-pickup workflows.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c7ba3c7bd5f25392d1b00056834e5b90" 
style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zQYPaZeVbIA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://blog.wing.com/2024/01/customer-demand-and-wings-aircraft.html">Wing</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="687hxi0_5li">Daniel Simu and his acrobatic robot competed in “America’s Got Talent,” and even though his robot did a very robot thing by breaking itself immediately beforehand, the performance went really well.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c46e071cf5be504cace4699b280a190a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/687HXI0_5lI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://acrobot.nl/">Acrobot</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="k5_r0r8tiby">A tour of the Creative Robotics Mini Exhibition at the Creative Computing Institute, University of the Arts London.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="41739cdf34dec916c9f5872c4d828466" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/k5_R0R8TiBY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.arts.ac.uk/subjects/creative-computing/postgraduate/msc-creative-robotics">UAL</a> ]</p><p>Thanks, Hooman!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="dycujjms3uo"><em>Zoox CEO Aicha Evans and cofounder and chief technology officer Jesse Levinson hosted a LinkedIn Live last week to reflect on the past decade of building Zoox and their predictions for the next 10 years of the autonomous-vehicle industry.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="62682d4df022a6c98247bd82b5ca608e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DYcujjMs3Uo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://zoox.com/">Zoox</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 26 Jul 2024 16:59:31 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robot-baby-jetpack</guid><category>Robotics</category><category>Video friday</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/a-robot-with-jet-engines-with-blue-flames-coming-out-attached-to-its-arms-and-back-stands-in-a-safety-frame-on-a-rooftop-at-nigh.gif?id=52974821&amp;width=980"></media:content></item><item><title>Elephant Robotics’ Mercury Humanoid Robot Empowers Embodied AI Research</title><link>https://spectrum.ieee.org/elephant-robotics-mercury</link><description><![CDATA[
  4. <img src="https://spectrum.ieee.org/media-library/two-grouped-photos-showing-a-winking-robot-standing-in-a-room-and-another-smiling-robot-in-a-kitchen-area.jpg?id=52857053&width=1200&height=800&coordinates=150%2C0%2C150%2C0"/><br/><br/><p><em>
  5. This is a sponsored article brought to you by <a href="https://www.elephantrobotics.com/en/" target="_blank">Elephant Robotics</a>.</em></p><p><a href="https://www.elephantrobotics.com/en/" rel="noopener noreferrer" target="_blank">Elephant Robotics</a> has gone through years of research and development to accelerate its mission of bringing robots to millions of homes and a vision of “Enjoy Robots World”. From the collaborative industrial robots P-series and C-series, which have been on the drawing board since its establishment in 2016, to the lightweight desktop 6 DOF collaborative robot <a href="https://shop.elephantrobotics.com/collections/mycobot-280" rel="noopener noreferrer" target="_blank">myCobot 280</a> in 2020, to the dual-armed, semi-humanoid robot <a href="https://shop.elephantrobotics.com/collections/mybuddy/products/mybuddy-280" rel="noopener noreferrer" target="_blank">myBuddy</a>, which was launched in 2022, Elephant Robotics is launching 3-5 robots per year, and this year’s full-body humanoid robot, the <a href="https://www.elephantrobotics.com/en/mercury-humanoid-robot/" rel="noopener noreferrer" target="_blank">Mercury series</a>, promises to reshape the landscape of non-human workers, introducing intelligent robots like Mercury into research and education and even everyday home environments.</p><h2>A Commitment to Practical Robotics</h2><p>
  6. <a href="https://shop.elephantrobotics.com/products/mercury-humanoid-robot-series?_pos=1&_psq=mercury&_ss=e&_v=1.0&variant=47556966875448" rel="noopener noreferrer" target="_blank">Elephant Robotics</a> proudly introduces the Mercury Series, a suite of humanoid robots that not only push the boundaries of innovation but also embody a deep commitment to practical applications. Designed with the future of robotics in mind, the Mercury Series is poised to become the go-to choice for researchers and industry professionals seeking reliable, scalable, and robust solutions.
  7. </p><p class="shortcode-media shortcode-media-youtube">
  8. <span class="rm-shortcode" data-rm-shortcode-id="20a19f4fa2a474864196ce49dc15f94c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Ru24sDmK8yI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  9. <small class="image-media media-caption" placeholder="Add Photo Caption..."><u><br/></u></small>
  10. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Elephant Robotics</small></p><h2>The Genesis of Mercury Series: Bridging Vision With Practicality</h2><p>
  11. From the outset, the Mercury Series has been envisioned as more than just a collection of advanced prototypes. It is a testament to Elephant Robotics’ dedication to creating humanoid robots that are not only groundbreaking in their capabilities but also practical for mass production and consistent, reliable use in real-world applications.
  12. </p><h2>Mercury X1: Wheeled Humanoid Robot</h2><p>
  13. <a href="https://www.elephantrobotics.com/en/mercury-x1-en/" rel="noopener noreferrer" target="_blank">The Mercury X1</a> is a versatile wheeled humanoid robot that combines advanced functionalities with mobility. Equipped with dual NVIDIA Jetson controllers, lidar, ultrasonic sensors, and an 8-hour battery life, the X1 is perfect for a wide range of applications, from exploratory studies to commercial tasks requiring mobility and adaptability.
  14. </p><h2>Mercury B1: Dual-Arm Semi-Humanoid Robot</h2><p>
  15. <a href="https://www.elephantrobotics.com/en/mercury-b1-en/" rel="noopener noreferrer" target="_blank">The Mercury B1</a> is a semi-humanoid robot tailored for sophisticated research. It features 17 degrees of freedom, dual robotic arms, a 9-inch touchscreen, a NVIDIA Xavier control chip, and an integrated 3D camera. The B1 excels in machine vision and VR-assisted teleoperation, and its AI voice interaction and LLM integration mark significant advancements in human-robot communication.
  16. </p><p>
  17. These two advanced models exemplify Elephant Robotics’ commitment to practical robotics. The wheeled humanoid robot Mercury X1 integrates advanced technology with a state-of-the-art mobile platform, ensuring not only versatility but also the feasibility of large-scale production and deployment.
  18. </p><h2>Embracing the Power of Reliable Embodied AI</h2><p>
  19. The Mercury Series is engineered as the ideal hardware platform for embodied AI research, providing robust support for sophisticated AI algorithms and real-world applications. Elephant Robotics demonstrates its commitment to innovation through the Mercury Series’ compatibility with NVIDIA’s Isaac Sim, a state-of-the-art simulation platform that facilitates sim2real learning, bridging the gap between virtual environments and physical robot interaction.
  20. </p><p>
  21. The Mercury Series is perfectly suited for the study and experimentation of mainstream large language models in embodied AI. Its advanced capabilities allow seamless integration with the latest AI research. This provides a reliable and scalable platform for exploring the frontiers of machine learning and robotics.
  22. </p><p>
  23. Furthermore, the Mercury Series is complemented by the <a href="https://shop.elephantrobotics.com/collections/myarm-mc/products/myarm-c650" target="_blank">myArm C650</a>, a teleoperation robotic arm that enables rapid acquisition of physical data. This feature supports secondary learning and adaptation, allowing for immediate feedback and iterative improvements in real-time. These features, combined with the Mercury Series’ reliability and practicality, make it the preferred hardware platform for researchers and institutions looking to advance the field of embodied AI.
  24. </p><p>
  25. The Mercury Series is supported by a rich software ecosystem, compatible with major programming languages, and integrates seamlessly with industry-standard simulation software. This comprehensive development environment is enhanced by a range of auxiliary hardware, all designed with mass production practicality in mind.
  26. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  27. <img alt="A set of images showing a robot in a variety of situations." class="rm-shortcode" data-rm-shortcode-id="0f090fc8addd438aa868f01e910606e3" data-rm-shortcode-name="rebelmouse-image" id="654a8" loading="lazy" src="https://spectrum.ieee.org/media-library/a-set-of-images-showing-a-robot-in-a-variety-of-situations.jpg?id=52857217&width=980"/>
  28. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Elephant Robotics</small></p><h2>Drive to Innovate: Mass Production and Global Benchmarks</h2><p>
  29. The “Power Spring” harmonic drive modules, a hallmark of Elephant Robotics’ commitment to innovation for mass production, have been meticulously engineered to offer an unparalleled torque-to-weight ratio. These components are a testament to the company’s foresight in addressing the practicalities of large-scale manufacturing. The incorporation of carbon fiber in the design of these modules not only optimizes agility and power but also ensures that the robots are well-prepared for the rigors of the production line and real-world applications. The Mercury Series, with its spirit of innovation, is making a significant global impact, setting a new benchmark for what practical robotics can achieve.
  30. </p><p>
  31. Elephant Robotics is consistently delivering mass-produced robots to a range of renowned institutions and industry leaders, thereby redefining the industry standards for reliability and scalability. The company’s dedication to providing more than mere prototypes is evident in the active role its robots play in various sectors, transforming industries that are in search of dependable and efficient robotic solutions.
  32. </p><h2>Conclusion: The Mercury Series—A Beacon for the Future of Practical Robotics</h2><p>
  33. The Mercury Series represents more than a product; it is a beacon for the future of practical robotics. <a href="https://shop.elephantrobotics.com/" rel="noopener noreferrer" target="_blank">Elephant Robotics’</a> dedication to affordability, accessibility, and technological advancement ensures that the Mercury Series is not just a research tool but a platform for real-world impact.
  34. </p><p class="shortcode-media shortcode-media-youtube">
  35. <span class="rm-shortcode" data-rm-shortcode-id="175347c43067d1bd4f7f1530d6dbcb91" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/gKJXL0IXeUs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  36. <small class="image-media media-caption" placeholder="Add Photo Caption...">Mercury Usecases | Explore the Capabilities of the Wheeled Humanoid Robot and Discover Its Precision</small>
  37. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">
  38. <a href="https://youtu.be/gKJXL0IXeUs" target="_blank">youtu.be</a>
  39. </small>
  40. </p><p>
  41. <strong>Elephant Robotics:</strong> <a href="https://www.elephantrobotics.com/en/" rel="noopener noreferrer" target="_blank">https://www.elephantrobotics.com/en/</a>
  42. </p><p>
  43. <strong>Mercury Robot Series: </strong><u><a href="https://www.elephantrobotics.com/en/mercury-humanoid-robot/" rel="noopener noreferrer" target="_blank">https://www.elephantrobotics.com/en/mercury-humanoid-robot/</a></u>
  44. </p>]]></description><pubDate>Tue, 23 Jul 2024 22:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/elephant-robotics-mercury</guid><category>Elephant robotics</category><category>Ai</category><category>Humanoid robots</category><category>Mercury series</category><dc:creator>Elephant Robotics</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/two-grouped-photos-showing-a-winking-robot-standing-in-a-room-and-another-smiling-robot-in-a-kitchen-area.jpg?id=52857053&amp;width=980"></media:content></item><item><title>iRobot’s Autowash Dock Is (Almost) Automated Floor Care</title><link>https://spectrum.ieee.org/irobot-roomba-combo-10-max</link><description><![CDATA[
  45. <img src="https://spectrum.ieee.org/media-library/a-round-black-vacuuming-robot-sits-in-front-of-a-large-black-docking-station-that-is-partially-transparent-to-show-clean-and-dir.jpg?id=52955844&width=2048&height=1609&coordinates=0%2C225%2C0%2C214"/><br/><br/><p>The dream of robotic floor care has always been for it to be hands-off and mind-off. That is, for a robot to live in your house that will keep your floors clean without you having to really do anything or even think about it. When it comes to robot vacuuming, that’s been more or less solved thanks to self-emptying robots that transfer debris into docking stations, which iRobot pioneered <a href="https://spectrum.ieee.org/irobot-develops-self-emptying-roomba" target="_self"><u>with the Roomba i7+ in 2018</u></a>. By 2022, iRobot’s <a href="https://spectrum.ieee.org/irobot-roomba-combo-j7-vacuum" target="_self"><u>Combo j7+</u></a> added an intelligent mopping pad to the mix, which definitely made for cleaner floors but was also a step backwards in the sense that you had to remember to toss the pad into your washing machine and fill the robot’s clean water reservoir every time. The <a href="https://www.irobot.com/en_US/roomba-combo-j9plus-auto-fill-robot-vacuum-and-mop/C975020.html" rel="noopener noreferrer" target="_blank"><u>Combo j9+</u></a> stuffed a clean water reservoir into the dock itself, which could top off the robot with water by itself for a month.</p><p>With the new Roomba Combo 10 Max, announced today, iRobot has cut out (some of) that annoying process thanks to a massive new docking station that self-empties vacuum debris, empties dirty mop water, refills clean mop water, and then washes and dries the mopping pad, completely autonomously.</p><hr/><p class="shortcode-media shortcode-media-youtube">
  46. <span class="rm-shortcode" data-rm-shortcode-id="79293f109fc36e515ef368eac12d7d0b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Yfx4331nQjg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  47. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">iRobot</small>
  48. </p><p>The Roomba part of this is a mildly upgraded j7+, and most of what’s new on the hardware side here is in the “multifunction AutoWash Dock.” This new dock is a beast: It empties the robot of all of the dirt and debris picked up by the vacuum, refills the Roomba’s clean water tank from a reservoir, and then starts up a wet scrubby system down under the bottom of the dock. The Roomba deploys its dirty mopping pad onto that system, and then drives back and forth while the scrubby system cleans the pad. All the dirty water from this process gets sucked back up into a dedicated reservoir inside the dock, and the pad gets blow-dried while the scrubby system runs a self-cleaning cycle.</p><p class="shortcode-media shortcode-media-rebelmouse-image image-crop-custom">
  49. <img alt="A round black vacuuming robot sits inside of a large black docking station that is partially transparent to show clean and dirty water tanks inside." class="rm-shortcode" data-rm-shortcode-id="612af9c9ffcfd1562fa3fac3814a9d70" data-rm-shortcode-name="rebelmouse-image" id="debee" loading="lazy" src="https://spectrum.ieee.org/media-library/a-round-black-vacuuming-robot-sits-inside-of-a-large-black-docking-station-that-is-partially-transparent-to-show-clean-and-dirty.jpg?id=52955848&width=5120&height=3452&quality=85&coordinates=0%2C922%2C0%2C746"/>
  50. <small class="image-media media-caption" placeholder="Add Photo Caption...">The dock removes debris from the vacuum, refills it with clean water, and then uses water to wash the mopping pad.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">iRobot</small></p><p>This means that as a user, you’ve only got to worry about three things: dumping out the dirty water tank every week (if you use the robot for mopping most days), filling the clean water tank every week, and then changing out the debris every two months. That is not a lot of hands-on time for having consistently clean floors.</p><p>The other thing to keep in mind about all of these robots is that they do need relatively frequent human care if you want them to be happy and successful. That means flipping them over and getting into their guts to clean out the bearings and all that stuff. iRobot makes this very easy to do, and it’s a necessary part of robot ownership, so the dream of having a robot that you can actually forget <em><em>completely</em></em> is probably not achievable.</p><p>The consequence for this convenience is a real chonker of a dock. The dock is basically furniture, and to the company’s credit, iRobot designed it so that the top surface is useable as a shelf—Access to the guts of the dock are from the front, not the top. This is fine, but it’s also kind of crazy just how much these docks have expanded, especially once you factor in the front ramp that the robot drives up, which sticks out even farther. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  51. <img alt="A round black robot on a wooden floor approaches a dirty carpet and uses a metal arm to lift a wet mopping pad onto its back." class="rm-shortcode" data-rm-shortcode-id="9bb751380913b961c87b04b3529c21dc" data-rm-shortcode-name="rebelmouse-image" id="b228f" loading="lazy" src="https://spectrum.ieee.org/media-library/a-round-black-robot-on-a-wooden-floor-approaches-a-dirty-carpet-and-uses-a-metal-arm-to-lift-a-wet-mopping-pad-onto-its-back.jpg?id=52955845&width=980"/>
  52. <small class="image-media media-caption" placeholder="Add Photo Caption...">The Roomba will detect carpet and lift its mopping pad up to prevent drips.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">iRobot</small></p><p>We asked iRobot director of project management <a href="https://www.linkedin.com/in/warren-fernandez/" rel="noopener noreferrer" target="_blank"><u>Warren Fernandez</u></a> about whether docks are just going to keep on getting bigger forever until we’re all just living in giant robot docks, to which he said: “Are you going to continue to see some large capable multifunction docks out there in the market? Yeah, I absolutely think you will—but when does big become too big?” Fernandez says that there are likely opportunities to reduce dock size going forward through packaging efficiencies or dual-purpose components, but that there’s another option, too: Distributed docks. “If a robot has dry capabilities and wet capabilities, do those have to coexist inside the same chassis? What if they were separate?” says Fernandez.</p><p>We should mention that iRobot is not the first in the robotic floor care robot space to have a self-cleaning mop, and it’s also not the first to think about distributed docks, although as Fernandez explains, this is a more common approach in Asia where you can also take advantage of home plumbing integration. “It’s a major trend in China, and starting to pop up a little bit in Europe, but not really in North America yet. How amazing could it be if you had a dock that, in a very easy manner, was able to tap right into plumbing lines for water supply and sewage disposal?”</p><p>According to Fernandez, this tends to be much easier to do in China, both because the labor cost for plumbing work is far lower than in the United States and Europe, and also because it’s fairly common for apartments in China to have accessible floor drains. “We don’t really yet see it in a major way at a global level,” Fernandez tells us. “But that doesn’t mean it’s not coming.”</p><p class="shortcode-media shortcode-media-rebelmouse-image image-crop-custom">
  53. <img alt="A round black robot on a wooden floor approaches a dirty carpet and uses a metal arm to lift a wet mopping pad onto its back." class="rm-shortcode" data-rm-shortcode-id="627d844ae29c2fc4529e91b185994e61" data-rm-shortcode-name="rebelmouse-image" id="003ba" loading="lazy" src="https://spectrum.ieee.org/media-library/a-round-black-robot-on-a-wooden-floor-approaches-a-dirty-carpet-and-uses-a-metal-arm-to-lift-a-wet-mopping-pad-onto-its-back.jpg?id=52955851&width=1367&height=1091&quality=85&coordinates=0%2C629%2C0%2C328"/>
  54. <small class="image-media media-caption" placeholder="Add Photo Caption...">The robot autonomously switches mopping mode on and off for different floor surfaces.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">iRobot</small></p><p>We should also mention the Roomba Combo 10 Max, which includes some software updates:</p><ul><li>The front-facing camera and specialized bin sensors can identify dirtier areas eight times as effectively as before.</li><li>The Roomba can identify specific rooms and prioritize the order they’re cleaned in, depending on how dirty they get.</li><li>A new cleaning behavior called “Smart Scrub” adds a back-and-forth scrubbing motion for floors that need extra oomph.</li></ul><p>And here’s what I feel like the new software <em><em>should</em></em> do, but doesn’t:</p><ul><li>Use the front-facing camera and bin sensors to identify dirtier areas and then autonomously develop a schedule to more frequently clean those areas.</li><li>Activate Smart Scrub when the camera and bin sensors recognize an especially dirty floor.</li></ul><p>I say “should do” because the robot appears to be collecting the data that it needs to do these things but it doesn’t do them yet. New features (especially new features that involve autonomy) take time to develop and deploy, but imagine a robot that makes much more nuanced decisions about where and when to clean based on very detailed real-time data and environmental understanding that iRobot has already implemented. </p><p>I also appreciate that even as iRobot is emphasizing autonomy and leveraging data to start making more decisions for the user, the company is also making sure that the user has as much control as possible through the app. For example, you can set the robot to mop your floor without vacuuming first, even though if you do that, all you’re going to end up with a much dirtier mop. Doesn’t make a heck of a lot of sense, but if that’s what you want, iRobot has empowered you to do it.</p><p class="shortcode-media shortcode-media-rebelmouse-image image-crop-custom">
  55. <img alt="A round black vacuuming robot sits inside of a large black docking station that is opened to show clean and dirty water tanks inside." class="rm-shortcode" data-rm-shortcode-id="47b27289f88402a94a85551f575502dc" data-rm-shortcode-name="rebelmouse-image" id="f456a" loading="lazy" src="https://spectrum.ieee.org/media-library/a-round-black-vacuuming-robot-sits-inside-of-a-large-black-docking-station-that-is-opened-to-show-clean-and-dirty-water-tanks-in.jpg?id=52955852&width=2048&height=1407&quality=85&coordinates=0%2C401%2C0%2C240"/>
  56. <small class="image-media media-caption" placeholder="Add Photo Caption...">The dock opens from the front for access to the clean- and dirty-water storage and the dirt bag.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">iRobot</small></p><p>The Roomba Combo 10 Max will be launching in August for US $1,400. That’s expensive, but it’s also how iRobot does things: A new Roomba with new tech always gets flagship status and premium cost. Sooner or later it’ll be affordable enough that the rest of us will be able to afford it, too.</p>]]></description><pubDate>Tue, 23 Jul 2024 11:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/irobot-roomba-combo-10-max</guid><category>Irobot</category><category>Roomba</category><category>Home robots</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-round-black-vacuuming-robot-sits-in-front-of-a-large-black-docking-station-that-is-partially-transparent-to-show-clean-and-dir.jpg?id=52955844&amp;width=980"></media:content></item><item><title>Video Friday: Robot Crash-Perches, Hugs Tree</title><link>https://spectrum.ieee.org/video-friday-bioinspired-robot</link><description><![CDATA[
  57. <img src="https://spectrum.ieee.org/media-library/three-images-showing-a-bat-hugging-a-tree-trunk-an-owl-hugging-a-tree-trunk-and-a-black-robotic-airplane-hugging-a-tree-trunk.png?id=52930125&width=1200&height=800&coordinates=150%2C0%2C150%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://icra40.ieee.org/">ICRA@40</a>: 23–26 September 2024, ROTTERDAM, NETHERLANDS</h5><h5><a href="https://iros2024-abudhabi.org/">IROS 2024</a>: 14–18 October 2024, ABU DHABI, UAE</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="sgyivja3dzg"><em>Perching with winged Unmanned Aerial Vehicles has often been solved by means of complex control or intricate appendages. Here, we present a method that relies on passive wing morphing for crash-landing on trees and other types of vertical poles. Inspired by the adaptability of animals’ and bats’ limbs in gripping and holding onto trees, we design dual-purpose wings that enable both aerial gliding and perching on poles.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="af2fc1eabd69019ce92efbd3a25a9fb5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/SGyivJa3DZg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.nature.com/articles/s44172-024-00241-0"><em>Nature Communications Engineering</em></a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="r1jfvn76kai">Pretty impressive to have low enough latency in controlling your robot’s hardware that it can play ping pong, although it makes it impossible to tell whether the robot or the human is the one that’s actually bad at the game.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4fedcb825f47a67a4fd51ac1d2c2f0aa" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/R1JfVN76kAI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ihmc.us/nadia-humanoid/">IHMC</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="e6vukpg3jxu">How to be a good robot when boarding an elevator.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6e009489f01df1fa602e49b690b7b11f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/E6VUkPG3jXU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a 
href="https://1784.navercorp.com/en/">NAVER</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="j0qh9gu9tko"><em>Have you ever wondered how insects are able to go so far beyond their home and still find their way? The answer to this question is not only relevant to biology but also to making the AI for tiny, autonomous robots.  We felt inspired by biological findings on how ants visually recognize their environment and combine it with counting their steps in order to get safely back home.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a4c7b78ab2a256b65196dcccbc2813c2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/J0qh9gu9Tko?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.science.org/doi/10.1126/scirobotics.adk0310"><em>Science Robotics</em></a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="q1_cwyqzt88"><em>Team RoMeLa Practice with ARTEMIS humanoid robots, featuring Tsinghua Hephaestus (Booster Alpha). Fully autonomous humanoid robot soccer match with the official goal of beating the human WorldCup Champions by the year 2050.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cb7acd663808dfe6b77a14fe92a652be" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Q1_cwYQZT88?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.romela.org/">RoMeLa</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="cjnuoxq2axm"><em>Triangle is the most stable shape, right?</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7648ac350c1c14b9f4d0665c6c0d322d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/CJNuoxQ2AxM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://yugu.faculty.wvu.edu/">WVU IRL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="w-l90bhfdfo"><em>We propose RialTo, a new system for robustifying real-world imitation learning policies via reinforcement learning in “digital twin” simulation environments constructed on the fly from small amounts of real-world data.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="da6cbba32610c3bbe5c8e7488817c9c4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/w-L90BhfDFo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://real-to-sim-to-real.github.io/RialTo/">MIT CSAIL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="trtdkuxmlxo">There is absolutely no reason to watch this entire video, but Moley Robotics is still working on that robotic kitchen of theirs.</p><p class="shortcode-media 
shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="17437d7cb98db3ce85d359f39d8e8db3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/TrTDkuXmlxo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>I will once again point out that the hardest part of cooking (for me, anyway) is the prep and the cleanup, and this robot still needs you to do all that.</p><p>[ <a href="https://www.moley.com/">Moley</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="v2vc8pk31n4"><em>B-Human has so far won 10 titles at the RoboCup SPL tournament. Can we make it 11 this year? Our RoboCup starts off with a banger game against HTWK Robots form Leipzig!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f5f8b8a3e52f1f470c4f765ab5792d91" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/v2VC8Pk31n4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://b-human.de/index.html">Team B-Human</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="4p7axxplk-w"><em>AMBIDEX is a dual-armed robot with an innovative mechanism developed for safe coexistence with humans. Based on an innovative cable structure, it is designed to be both strong and stable.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ceea5604c04693daadf67b96f01f7c84" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/4P7AXxPlk-w?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.naverlabs.com/en/ambidex">NAVER</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="tzm846ojxsa"><em>As NASA’s Perseverance rover prepares to ascend to the rim of Jezero Crater, its team is investigating a rock unlike any that they’ve seen so far on Mars. 
Deputy project scientist Katie Stack Morgan explains why this rock, found in an ancient channel that funneled water into the crater, could be among the oldest that Perseverance has investigated—or the youngest.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9f1794f9f718b7076451f35b11a38777" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/TZm846OJxSA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://science.nasa.gov/mission/mars-2020-perseverance/">NASA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="u9_q6c5yigo"><em>We present a novel approach for enhancing human-robot collaboration using physical interactions for real-time error correction of large language model (LLM) parameterized commands.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9f4a55c2d8a8fdc7ff5667c8767106e4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/U9_q6C5YIgo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.grasp.upenn.edu/research-groups/figueroa-robotics-lab/">Figueroa Robotics Lab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="xim0mxvqygu"><em>Husky Observer was recently used to autonomously inspect solar panels at a large solar panel farm.  As part of its mission, the robot navigated rows of solar panels, stopping to inspect areas with its integrated thermal camera.  Images were taken by the robot and enhanced to detect potential “hot spots” in the panels.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="250d84f1fffa73c17e614a110d19cfd0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xIM0MXvQyGU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://clearpathrobotics.com/husky-observer/">Clearpath Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="npetybdwwz8">Most of the time, robotic workcells contain just one robot, so it’s cool to see a pair of them collaborating on tasks.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="21bd81d68c59c8991b913f31fbed0d96" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/npetYBdwwz8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://leverage-robotics.com/en/">Leverage Robotics</a> ]</p><p>Thanks, Roman!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="if9immyioh0"><em>Meet Hydrus, the autonomous underwater drone revolutionising underwater data collection by eliminating the barriers to its entry. 
Hydrus ensures that even users with limited resources can execute precise and regular subsea missions to meet their data requirements.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="264e00b842e459e30010648999f0d897" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/If9immYioh0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.advancednavigation.com/robotics/micro-auv/hydrus/">Advanced Navigation</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="7_lw7u-nk6q">Those adorable Disney robots have finally made their way into a paper.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5d81357a39aba2e6cb10b8c4a3e1163d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7_LW7u-nk6Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://roboticsconference.org/program/papers/103/">RSS 2024</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 19 Jul 2024 19:45:02 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-bioinspired-robot</guid><category>Autonomous robots</category><category>Collaborative robots</category><category>Disney robots</category><category>Perseverance rover</category><category>Robotics</category><category>Video friday</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/three-images-showing-a-bat-hugging-a-tree-trunk-an-owl-hugging-a-tree-trunk-and-a-black-robotic-airplane-hugging-a-tree-trunk.png?id=52930125&amp;width=980"></media:content></item><item><title>Robot Dog Cleans Up Beaches With Foot-Mounted Vacuums</title><link>https://spectrum.ieee.org/robot-dog-vacuum</link><description><![CDATA[
  58. <img src="https://spectrum.ieee.org/media-library/a-black-robot-dog-with-a-white-backpack-with-tubes-coming-out-of-it-running-down-its-legs-to-its-feet-stands-on-a-pebbly-beach-w.jpg?id=52824948&width=1200&height=800&coordinates=150%2C0%2C150%2C0"/><br/><br/><p>Cigarette butts are the second most common undisposed-of litter on Earth—of the six trillion-ish cigarettes inhaled every year, <a href="https://pubmed.ncbi.nlm.nih.gov/30782533/" target="_blank">it’s estimated</a> that over 4 trillion of the butts are just tossed onto the ground, each one leeching over 700 different toxic chemicals into the environment. Let’s not focus on the fact that all those toxic chemicals are <em><em>also</em></em> going into people’s lungs, and instead talk about the ecosystem damage that they can do and also just the general grossness of having bits of sucked-on trash everywhere. Ew.</p><p>Preventing those cigarette butts from winding up on the ground in the first place would be the best option, but it would require a pretty big shift in human behavior. Operating under the assumption that humans changing their behavior is a nonstarter, roboticists from the <a href="https://dls.iit.it/" target="_blank">Dynamic Legged Systems</a> unit at the Italian Institute of Technology (IIT), in Genoa, have instead designed a novel platform for cigarette-butt cleanup in the form of a quadrupedal robot with vacuums attached to its feet.</p><p class="shortcode-media shortcode-media-youtube">
  59. <span class="rm-shortcode" data-rm-shortcode-id="1221ce054f5104cd582d1a9d3d1789c9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/O8BqvAe-moI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  60. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">IIT</small>
  61. </p><p>There are, of course, far more efficient ways of at least partially automating the cleanup of litter with machines. The challenge is that most of that automation relies on mobility systems with wheels, which won’t work on the many beautiful beaches (and many beautiful flights of stairs) of Genoa. In places like these, it still falls to humans to do the hard work, which is less than ideal.</p><p>This robot, developed in <a href="https://spectrum.ieee.org/tag/claudio-semini" target="_blank">Claudio Semini’s lab at IIT</a>, is called VERO (Vacuum-cleaner Equipped RObot). It’s based around an AlienGo from Unitree, with a commercial vacuum mounted on its back. Hoses go from the vacuum down each leg to the foot, with a custom 3D-printed nozzle that puts as much suction near the ground as possible without tripping the robot up. While the vacuum is novel, the real contribution here is how the robot autonomously locates things on the ground and then plans how to interact with those things using its feet.</p><p>First, an operator designates an area for VERO to clean, after which the robot operates by itself. After calculating an exploration path that covers the entire area, the robot uses its onboard cameras and a neural network to detect cigarette butts. This is trickier than it sounds, because there may be a lot of cigarette butts on the ground, and they all probably look pretty much the same, so the system has to filter out all of the potential duplicates. Then VERO plans its foot placements: it has to put the vacuum side of one of its feet right next to each cigarette butt while calculating a safe, stable pose for the rest of its body. Since this whole process can take place on sand or stairs or other uneven surfaces, VERO has to prioritize not falling over before it decides how to do the collection. The final collecting maneuver is fine-tuned using an extra Intel RealSense depth camera mounted on the robot’s chin. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  62. <img alt="A collage of six photos of a quadruped robot navigating different environments." class="rm-shortcode" data-rm-shortcode-id="a558e68c65459dc391adc6aa6e78231d" data-rm-shortcode-name="rebelmouse-image" id="eb956" loading="lazy" src="https://spectrum.ieee.org/media-library/a-collage-of-six-photos-of-a-quadruped-robot-navigating-different-environments.png?id=52820248&width=980"/>
  63. <small class="image-media media-caption" placeholder="Add Photo Caption...">VERO has been tested successfully in six different scenarios that challenge both its locomotion and detection capabilities.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">IIT</small></p><p>Initial testing with the robot in a variety of different environments showed that it could successfully collect just under 90 percent of cigarette butts, which I bet is better than I could do, and I’m also much more likely to get fed up with the whole process. The robot is not very quick at the task, but unlike me it will never get fed up as long as it’s got energy in its battery, so speed is somewhat less important.</p><p>As far as the authors of this paper are aware (and I assume they’ve done their research), this is “the first time that the legs of a legged robot are <em>concurrently</em> utilized for locomotion and for a different task.” This is distinct from other robots that can (for example) open doors with their feet, because those robots stop using the feet as feet for a while and instead use them as manipulators.</p><p>So, this is about a lot more than cigarette butts, and the researchers suggest a variety of other potential use cases, including spraying weeds in crop fields, inspecting cracks in infrastructure, and placing nails and rivets during construction.</p><p>Some use cases include potentially doing multiple things at the same time, like planting different kinds of seeds, using different surface sensors, or driving both nails and rivets. And since quadrupeds have four feet, they could potentially host four completely different tools, and the software that the researchers developed for VERO can be slightly modified to put whatever foot you want on whatever spot you need.</p><em><em><a href="https://onlinelibrary.wiley.com/doi/abs/10.1002/rob.22350" target="_blank">VERO: A Vacuum‐Cleaner‐Equipped Quadruped Robot for Efficient Litter Removal</a></em></em>, by Lorenzo Amatucci, Giulio Turrisi, Angelo Bratta, Victor Barasuol, and Claudio Semini from IIT, was published in the <em>Journal of Field Robotics</em>.]]></description><pubDate>Thu, 18 Jul 2024 14:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/robot-dog-vacuum</guid><category>Quadruped robots</category><category>Legged robots</category><category>Robotics</category><category>Italy</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-black-robot-dog-with-a-white-backpack-with-tubes-coming-out-of-it-running-down-its-legs-to-its-feet-stands-on-a-pebbly-beach-w.jpg?id=52824948&amp;width=980"></media:content></item><item><title>The Smallest, Lightest Solar-Powered Drone Takes Flight</title><link>https://spectrum.ieee.org/smallest-drone</link><description><![CDATA[
  64. <img src="https://spectrum.ieee.org/media-library/a-silvery-round-drone-with-wings-and-a-small-power-system-sits-in-the-palm-of-a-hand.jpg?id=52820114&width=1200&height=800&coordinates=0%2C52%2C0%2C52"/><br/><br/><p>Scientists in China have built what they claim to be the smallest and lightest solar-powered aerial vehicle. It’s small enough to sit in the palm of a person’s hand, weighs less than a U.S. nickel, and can fly indefinitely while the sun shines on it.<br/></p><p>Micro aerial vehicles (MAVs) are <a href="https://spectrum.ieee.org/robobee-robot-precision-control" target="_blank">insect- and bird-size aircraft</a> that <a href="https://spectrum.ieee.org/nothing-can-keep-this-drone-down" target="_blank">might prove useful for reconnaissance</a> and other possible applications. However, a major problem that MAVs currently face is their limited flight times, usually about 30 minutes. Ultralight MAVs—those weighing less than 10 grams—can often only stay aloft for less than 10 minutes.</p><p>One potential way to keep MAVs flying longer is to power them with a consistent source of energy such as sunlight. Now, in a new study, researchers have developed what they say is the first solar-powered MAV capable of sustained flight.</p><p>The new ultralight MAV, CoulombFly, weighs just 4.21 grams and has a wingspan of 20 centimeters. That’s about one-tenth the size, and roughly one six-hundredth the weight, of the previous smallest sunlight-powered aircraft, a <u><a href="https://onlinelibrary.wiley.com/doi/abs/10.1002/pip.3169" target="_blank">quadcopter</a></u> that’s 2 meters wide and weighs 2.6 kilograms.</p><p class="shortcode-media shortcode-media-youtube">
  65. <span class="rm-shortcode" data-rm-shortcode-id="af7200142a01f529dc54d28c3065cfe7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/LBoee1l4OXo?rel=0&start=1" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  66. <small class="image-media media-caption" placeholder="Add Photo Caption...">Sunlight powered flight test</small>
  67. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">
  68. <a href="https://www.youtube.com/watch?v=LBoee1l4OXo&t=1s" target="_blank">Nature</a>
  69. </small>
  70. </p><p>“My ultimate goal is to make a super tiny flying vehicle, about the size and weight of a mosquito, with a wingspan under 1 centimeter,” says Mingjing Qi, a professor of energy and power engineering at Beihang University in Beijing. Qi and the scientists who built CoulombFly developed a prototype of such an aircraft, measuring 8 millimeters wide and 9 milligrams in mass, “but it can’t fly on its own power yet. I believe that with the ongoing development of microcircuit technology, we can make this happen.”</p><p>Previous sunlight-powered aerial vehicles typically rely on <u><a href="https://spectrum.ieee.org/200-years-ago-faraday-invented-the-electric-motor" target="_self">electromagnetic motors</a></u>, which use electromagnets to generate motion. However, the smaller a solar-powered aircraft gets, the less surface area it has with which to collect sunlight, reducing the amount of energy it can generate. In addition, the efficiency of electromagnetic motors decreases sharply as vehicles shrink in size. Smaller electromagnetic motors experience comparatively greater friction than larger ones, as well as greater energy losses due to electrical resistance from their components. This results in low lift-to-power efficiencies, Qi and his colleagues explain.</p><p>CoulombFly instead employs an electrostatic motor, which produces motion using electrostatic fields. Electrostatic motors are generally used as sensors in microelectromechanical systems (<u><a href="https://spectrum.ieee.org/nano-machine-shape-shifting" target="_self">MEMS</a></u>), not for aerial propulsion. Nevertheless, with a mass of only 1.52 grams, the electrostatic motor the scientists used has a lift-to-power efficiency two to three times that of other MAV motors.</p><p>The electrostatic motor has two nested rings. The inner ring is a spinning rotor that possesses 64 slats, each made of a carbon fiber sheet covered with aluminum foil. It resembles a wooden fence curved into a circle, with gaps between the fence’s posts. The outer ring is equipped with eight alternating pairs of positive and negative electrode plates, which are each also made of a carbon fiber sheet bonded to aluminum foil. Each plate’s edge also possesses a brush made of aluminum that touches the inner ring’s slats.</p><p>Above CoulombFly’s electrostatic motor is a 20-cm-wide propeller connected to the rotor. Below the motor are two high-power-density thin-film gallium arsenide solar cells, each 4 by 6 cm in size, with a mass of 0.48 g and an energy conversion efficiency of more than 30 percent.</p><p>Sunlight electrically charges CoulombFly’s outer ring, and its 16 plates generate electric fields. The brushes on the outer ring’s plates touch the inner ring, electrically charging the rotor slats. The electric fields of the outer ring’s plates exert force on the charged rotor slats, making the inner ring and the propeller spin.</p><p>In tests under natural sunlight conditions—about 920 watts of light per square meter—CoulombFly successfully took off within one second and sustained flight for an hour without any deterioration in performance. Potential applications for sunlight-powered MAVs may include long-distance and long-duration aerial reconnaissance, the researchers say.</p><p class="shortcode-media shortcode-media-youtube">
  71. <span class="rm-shortcode" data-rm-shortcode-id="7014ac2e38148fa99d02dd4eccccad3e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/-sQR0lG4OLA?rel=0&start=9" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  72. <small class="image-media media-caption" placeholder="Add Photo Caption...">Long term test for hovering operation</small>
  73. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">
  74. <a href="https://www.youtube.com/watch?v=-sQR0lG4OLA&t=9s" target="_blank">Nature</a>
  75. </small>
  76. </p><p>CoulombFly’s propulsion system can generate up to 5.8 g of lift. This means it could support an extra payload of roughly 1.59 g, which is “sufficient to accommodate the smallest available sensors, controllers, cameras and so on” to support future autonomous operations, Qi says. “Right now, there’s still a lot of room to improve things like motors, propellers, and circuits, so we think we can get the extra payload up to 4 grams in the future. If we need even more payload, we could switch to quadcopters or fixed-wing designs, which can carry up to 30 grams.”</p><p>Qi adds, “it should be possible for the vehicle to carry a tiny lithium-ion battery.” That means it could store energy from its solar panels and fly even when the sun is not out, potentially enabling 24-hour operations.</p><p>In the future, “we plan to use this propulsion system in different types of flying vehicles, like fixed-wing and rotorcraft,” Qi says.</p><p>The scientists detailed <u><a href="https://www.nature.com/articles/s41586-024-07609-4" rel="noopener noreferrer" target="_blank">their findings</a></u> online 17 July in the journal <em>Nature</em>.</p>]]></description><pubDate>Wed, 17 Jul 2024 15:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/smallest-drone</guid><category>Micro aerial vehicles</category><category>Solar power</category><category>Drones</category><category>Uav</category><category>Micro air vehicles</category><dc:creator>Charles Q. Choi</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-silvery-round-drone-with-wings-and-a-small-power-system-sits-in-the-palm-of-a-hand.jpg?id=52820114&amp;width=980"></media:content></item><item><title>Soft Robot Can Amputate and Reattach Its Own Legs</title><link>https://spectrum.ieee.org/soft-modular-robot</link><description><![CDATA[
  77. <img src="https://spectrum.ieee.org/media-library/a-photo-of-a-hand-sized-three-legged-soft-robot-with-tubes-and-wires-coming-out-of-it-crawling-away-from-a-rock-leaving-its-fou.png?id=52559148&width=1470&height=1080&coordinates=450%2C0%2C0%2C0"/><br/><br/><p>Among the many things that humans cannot do (without some fairly substantial modification) is shifting our body morphology around on demand. It sounds a little extreme to be talking about things like self-amputation, and it
  78. <em><em>is</em></em> a little extreme, but it’s also not at all uncommon for other animals to do—lizards can disconnect their tails to escape a predator, for example. And it works in the other direction, too, with animals like ants adding to their morphology by connecting to each other to traverse gaps that a single ant couldn’t cross alone.<br/></p><p>
  79. In a new paper, roboticists from
  80. <a href="https://www.eng.yale.edu/faboratory/" rel="noopener noreferrer" target="_blank"><u>The Faboratory at Yale University</u></a> have given a soft robot the ability to detach and reattach pieces of itself, editing its body morphology when necessary. It’s a little freaky to watch, but it kind of makes me wish I could do the same thing.
  81. </p><hr/><p class="shortcode-media shortcode-media-youtube">
  82. <span class="rm-shortcode" data-rm-shortcode-id="1810cbb2a95bdc058a86ae7f9c900e96" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/qPd9x9-bALo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  83. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Faboratory at Yale</small>
  84. </p><p>
  85. These are fairly standard soft-bodied silicone robots that use asymmetrically stiff air chambers that inflate and deflate (using a tethered pump and valves) to generate a walking or crawling motion. What’s new here are the joints, which rely on a new material called a bicontinuous thermoplastic foam (BTF) to form a supportive structure for a sticky polymer that’s solid at room temperature but can be easily melted.
  86. </p><p>
  87. The BTF acts like a sponge to prevent the polymer from running out all over the place when it melts, which means that you can pull two BTF surfaces apart by melting the joint, and stick them together again by reversing the procedure. The process takes about 10 minutes and the resulting joint is quite strong. It’s also good for a couple of hundred detach/re-attach cycles before degrading. It even stands up to dirt and water reasonably well.
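</p><p>To put those numbers in perspective, here is a rough, hypothetical back-of-envelope sketch in Python (not from the paper) that uses only the figures quoted above, roughly 10 minutes per detach-and-reattach cycle and a joint lifetime on the order of a couple hundred cycles, to estimate how much reconfiguration a single joint could support.</p><pre><code>
# Hypothetical back-of-envelope using only the figures quoted above;
# the constants and function names are illustrative, not from the paper.
MINUTES_PER_CYCLE = 10      # ~10 minutes to melt, separate, and re-fuse one joint
LIFETIME_CYCLES = 200       # "a couple of hundred" detach/re-attach cycles before degrading

def reconfiguration_budget(cycles_per_mission: int) -> tuple[int, float]:
    """Return (missions before the joint wears out, hours of joint work per mission)."""
    missions = LIFETIME_CYCLES // cycles_per_mission
    hours_per_mission = cycles_per_mission * MINUTES_PER_CYCLE / 60
    return missions, hours_per_mission

for cycles in (1, 4, 10):
    missions, hours = reconfiguration_budget(cycles)
    print(f"{cycles} reconfigurations per mission: ~{missions} missions, {hours:.1f} h of joint work each")
</code></pre><p>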
  88. </p><p class="shortcode-media shortcode-media-youtube">
  89. <span class="rm-shortcode" data-rm-shortcode-id="847ee4e9e09f65f772b74715ea9a5ca7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/kFgKRabL7Rc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  90. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Faboratory at Yale</small>
  91. </p><p>
  92. This kind of thing has been done before with mechanical connections and magnets and other things like that—getting robots to attach to and detach from other robots is a foundational technique for modular robotics, after all. But these systems are inherently rigid, which is bad for soft robots, whose whole thing is about
  93. <em><em>not</em></em> being rigid. It’s all very preliminary, of course, because there are plenty of rigid things attached to these robots with tubes and wires and stuff. And there’s no autonomy or payloads here either. That’s not the point, though—the point is the joint, which (as the researchers point out) is “the first instantiation of a fully soft reversible joint” resulting in the “potential for soft artificial systems [that can] shape change via mass addition and subtraction.”<strong></strong>
  94. </p><p>“<a href="https://onlinelibrary.wiley.com/doi/abs/10.1002/adma.202400241" rel="noopener noreferrer" target="_blank"><u>Self-Amputating and Interfusing Machines</u></a>,” by Bilige Yang, Amir Mohammadi Nasab, Stephanie J. Woodman, Eugene Thomas, Liana G. Tilton, Michael Levin, and Rebecca Kramer-Bottiglio from Yale, was published in May in <em><em>Advanced Materials.</em></em>
  97. </p>]]></description><pubDate>Sat, 13 Jul 2024 12:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/soft-modular-robot</guid><category>Soft robotics</category><category>Robotics</category><category>Yale</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-photo-of-a-hand-sized-three-legged-soft-robot-with-tubes-and-wires-coming-out-of-it-crawling-away-from-a-rock-leaving-its-fou.png?id=52559148&amp;width=980"></media:content></item><item><title>Video Friday: Unitree Talks Robots</title><link>https://spectrum.ieee.org/video-friday-unitree-talks-robots</link><description><![CDATA[
  98. <img src="https://spectrum.ieee.org/media-library/two-chinese-men-sit-next-to-a-small-silver-humanoid-robot-with-a-larger-black-humanoid-robot-in-the-background-in-a-booth-at-a-c.png?id=52675892&width=1200&height=800&coordinates=0%2C0%2C300%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><h5><a href="https://icra40.ieee.org/">ICRA@40</a>: 23–26 September 2024, ROTTERDAM, NETHERLANDS</h5><h5><a href="https://iros2024-abudhabi.org/">IROS 2024</a>: 14–18 October 2024, ABU DHABI, UAE</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="b2pmdshvgoy"><em>At ICRA 2024, Spectrum editor Evan Ackerman sat down with Unitree Founder and CEO Xingxing Wang and Tony Yang, VP of Business Development, to talk about the company’s newest humanoid, the G1 model.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="40ba2851eb8b488a9553e27e8f6c4d40" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/B2pmDShvGOY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/g1/">Unitree</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="byloosji-t8">SACRIFICE YOUR BODY FOR THE ROBOT</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ea89766662388b5fa0ed403c9a1f9de5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/bylOoSJI-t8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://yugu.faculty.wvu.edu/">WVUIRL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="kp7291n9jg4"><em>From navigating uneven terrain outside the lab to pure vision perception, GR-1 continues to push the boundaries of what’s possible.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ce80aa3762c4dfeef004c836f21a913d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/KP7291N9jg4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://fourierintelligence.com/gr1/">Fourier</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="g5egnbtqcwa"><em>Aerial manipulation has gained interest for completing high-altitude tasks that are challenging for human workers, such as 
contact inspection and defect detection. This letter addresses a more general and dynamic task: simultaneously tracking time-varying contact force and motion trajectories on tangential surfaces. We demonstrate the approach on an aerial calligraphy task using a novel sponge pen design as the end-effector.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f38be05694a9a00589d180ebdd928252" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/g5egNbtQCwA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://xiaofeng-guo.github.io/flying-calligrapher/">CMU</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="hyfcgppjjnk"><em>LimX Dynamics Biped Robot P1 was kicked and hit: Faced with random impacts in a crowd, P1 with its new design once again showcased exceptional stability as a mobility platform.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e0234aa5dc2a9fedb2214bde3ced1654" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/HYFCGPPjJnk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://medium.com/@limxdynamics/robotic-rigorous-testing-the-rationale-behind-our-kicks-18b1d96b6e01">LimX Dynamics</a> ]</p><p>Thanks, Ou Yan!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="xh7v-6uwfqc">This is from ICRA 2018, but it holds up pretty well in the novelty department.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d8ccd55dbdc95bb7bca660d8fb4c043d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Xh7v-6uWfQc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.inrol.snu.ac.kr/">SNU INRoL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="onv6e7kssy4">I think someone needs to crank the humor setting up on this one.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3638216d991463ada61835bc77d50d65" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Onv6E7kssY4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en/index/product3.html">Deep Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="fcyv4b-dh68"><em>The paper summarizes the work at the Micro Air Vehicle Laboratory on end-to-end neural control of quadcopters. A major challenge in bringing these controllers to life is the “reality gap” between the real platform and the training environment. 
To address this, we combine online identification of the reality gap with pre-trained corrections through a deep neural controller, which is orders of magnitude more efficient than traditional computation of the optimal solution.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="dbe9bcfe3d5c9ebf6bf3c3fcb9bb5cba" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FCYV4B-DH68?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://mavlab.tudelft.nl/">MAVLab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="9liei__uchm">This is a dedicated Track Actuator from HEBI Robotics. Why they didn’t just call it a “tracktuator” is beyond me.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="08e8a475e2e249a3638ad634d5dc6d07" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/9Liei__UChM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.hebirobotics.com/actuators#Track%20Actuator">HEBI Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="cwzlhmswl5e"><em>Menteebot can navigate complex environments by combining a 3D model of the world with a dynamic obstacle map. On the first day in a new location, Menteebot generates the 3D model by following a person who shows the robot around.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2a6dbcd662db6c617117ba0da34326b5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/cWZLHmswL5E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.menteebot.com/">Mentee Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="8hwoaikmqwy">Here’s that drone with a 68kg payload and 70km range you’ve always wanted.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="18cd744218ae9fc0d357bc86928c1695" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8HwoaiKMQWY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.malloyaeronautics.com/t150.html">Malloy</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="zlha-rwbbyu"><em>AMBIDEX is a dual-armed robot with an innovative mechanism developed for safe coexistence with humans. 
Based on an innovative cable structure, it is designed to be both strong and stable.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d5cc4c4d8d2b872ad217b1d06c8e3357" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zLhA-RWBBYU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.naverlabs.com/en/ambidex">NAVER Labs</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="i7qon82reic"><em>As quadrotors take on an increasingly diverse range of roles, researchers often need to develop new hardware platforms tailored for specific tasks, introducing significant engineering overhead. In this article, we introduce the UniQuad series, a unified and versatile quadrotor hardware platform series that offers high flexibility to adapt to a wide range of common tasks, excellent customizability for advanced demands, and easy maintenance in case of crashes.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="89f245b4214e4dd037a4d1f719ec8506" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/I7qoN82rEIc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://hkust-aerial-robotics.github.io/UniQuad/">HKUST</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="spkzgmo5lr4"><em>The video demonstrates the field testing of a 43 kg (95 lb) amphibious cycloidal propeller unmanned underwater vehicle (Cyclo-UUV) developed at the Advanced Vertical Flight Laboratory, Texas A&M University. The vehicle utilizes a combination of cycloidal propellers (or cyclo-propellers), screw propellers, and tank treads for operations on land and underwater.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="21136ac8dd2f8da163c218d2c74b02cd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/sPKZGMO5lR4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://avfl.engr.tamu.edu/projects/amphibious-uuv/">TAMU</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="zhch2w1_8t0"><em>The “pill” (the package hook) on Wing’s delivery drones is a crucial component to our aircraft! 
Did you know our package hook is designed to be aerodynamic and has stable flight characteristics, even at 65 mph?</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cbb02d6d0d38a05e94a12ed5a5551c1c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ZHCh2w1_8T0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://wing.com/">Wing</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="rtj0gp7twly">Happy 50th to robotics at ABB!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9753a65d2c460c4bb63357ed3dec63fc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/RTJ0gP7TwLY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://global.abb/group/en/about/history">ABB</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="lvmdzrxve_q">This JHU Center for Functional Anatomy & Evolution Seminar is by Chen Li, on Terradynamics of Animals & Robots in Complex Terrain.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="faa7a645891e8df891f0f72163e73270" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/LVmdZRxvE_Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://li.me.jhu.edu/">JHU</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 12 Jul 2024 16:21:11 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-unitree-talks-robots</guid><category>Video friday</category><category>Unitree</category><category>Fourier intelligence</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/two-chinese-men-sit-next-to-a-small-silver-humanoid-robot-with-a-larger-black-humanoid-robot-in-the-background-in-a-booth-at-a-c.png?id=52675892&amp;width=980"></media:content></item><item><title>Food Service Robots Just Need the Right Ingredients</title><link>https://spectrum.ieee.org/chef-robotics-food-robots</link><description><![CDATA[
  99. <img src="https://spectrum.ieee.org/media-library/as-trays-of-food-move-along-a-conveyor-belt-overhead-robot-arms-scoop-up-different-food-items-to-add-them-to-the-trays.png?id=52671148&width=1200&height=800&coordinates=267%2C0%2C267%2C0"/><br/><br/><p>Food prep is one of those problems that seems like it <em><em>should</em></em> be solvable by robots. It’s a predictable, repetitive, basic manipulation task in a semi-structured environment—seems ideal, right? And obviously there’s a huge need, because human labor is expensive and getting harder and harder to find in these contexts. There are currently <a href="https://www.chefrobotics.ai/" target="_blank">over a million unfilled jobs in the food industry</a> in the United States, and even with jobs that are filled, the annual turnover rate is 150 percent (meaning a lot of workers don’t even last a year).</p><p>Food prep seems like a great opportunity for robots, which is why <a href="https://www.chefrobotics.ai/" target="_blank"><u>Chef Robotics</u></a> and a handful of other robotics companies tackled it a couple years ago by bringing robots to fast casual restaurants like Chipotle or Sweetgreen, where you get served a custom-ish meal from a selection of ingredients at a counter.</p><p>But this didn’t really work out, for a couple of reasons. First, doing things that are mostly effortless for humans is inevitably extremely difficult for robots. And second, humans actually do a lot of useful things in a restaurant context besides just putting food onto plates, and the robots weren’t up for all of those things.</p><p>Still, Chef Robotics founder and CEO <a href="https://www.linkedin.com/in/rajatbhageria/" target="_blank">Rajat Bhageria</a> wasn’t ready to let this opportunity go. “The food market is arguably the biggest market that’s tractable for AI today,” he told <em><em>IEEE Spectrum</em></em>. And with a bit of a pivot away from the complicated mess of fast casual restaurants, Chef Robotics has still managed to prepare over 20 million meals thanks to autonomous robot arms deployed all over North America. Without knowing it, you may even have eaten such a meal.<br/></p><p class="pull-quote">“The hard thing is, can you pick fast? Can you pick consistently? Can you pick the right portion size without spilling? And can you pick without making it look like the food was picked by a machine?” <strong>—Rajat Bhageria, Chef Robotics</strong></p><p>When we spoke with Bhageria, he explained that there are three basic tasks involved in prepared food production: prep (tasks like chopping ingredients), the actual cooking process, and then assembly (or plating). Of these tasks, prep scales pretty well with industrial automation in that you can usually order pre-chopped or mixed ingredients, and cooking also scales well since you can cook more with only a minimal increase in effort just by using a bigger pot or pan or oven. What <em>doesn’t</em> scale well is the assembly, especially when any kind of flexibility or variety is required. You can clearly see this in action at any fast casual restaurant, where a couple of people are in the kitchen cooking up massive amounts of food while each customer gets served one at a time.</p><p>So with that bottleneck identified, let’s throw some robots at the problem, right? 
And that’s exactly what Chef Robotics did, explains Bhageria: “we went to our customers, who said that their biggest pain point was labor, and the most labor is in assembly, so we said, we can help you solve this.”</p><p>Chef Robotics started with fast casual restaurants. They weren’t the first to try this—many other robotics companies had attempted this before, with decidedly mixed results. “We actually had some good success in the early days selling to fast casual chains,” Bhageria says, “but then we had some technical obstacles. Essentially, if we want to have a human-equivalent system so that we can charge a human-equivalent service fee for our robot, we need to be able to do every ingredient. You’re either a full human equivalent, or our customers told us it wouldn’t be useful.”</p><p>Part of the challenge is that training robots to perform all of the different manipulations required for different assembly tasks requires different kinds of real world data. That data simply doesn’t exist—or, if it does, any company that has it knows what it’s worth and isn’t sharing. You can’t easily simulate this kind of data, because food can be gross and difficult to handle, whether it’s gloopy or gloppy or squishy or slimy or unpredictably deformable in some other way, and you really need physical experience to train a useful manipulation model.</p><p>Setting fast casual restaurants aside for a moment, what about food prep situations where things are as predictable as possible, like mass-produced meals? We’re talking about food like frozen dinners that have a handful of discrete ingredients packed into trays at factory scale. Frozen meal production relies on automation rather than robotics because the scale is such that the cost of dedicated equipment can be justified.</p><p>There’s a middle ground, though, where robots have found (some) opportunity: When you need to produce a high volume of the same meal, but that meal changes regularly. For example, think of any kind of pre-packaged meal that’s made in bulk, just not at frozen-food scale. It’s an opportunity for automation in a structured environment—but with enough variety that dedicated automation isn’t cost effective. Suddenly, robots and their tiny bit of flexible automation have a chance to be a practical solution.</p><p class="shortcode-media shortcode-media-youtube">
  100. <span class="rm-shortcode" data-rm-shortcode-id="fa3f9208eb17dcc5971b40593bdab8fd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/VzCIpfO6peQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  101. </p><p>“We saw these long assembly lines, where humans were scooping food out of big tubs and onto individual trays,” Bhageria says. “They do a lot of different meals on these lines; it’s going to change over and they’re going to do different meals throughout the week. But at any given moment, each person is doing one ingredient, and maybe on a weekly basis, that person would do six ingredients. This was really compelling for us because six ingredients is something we can bootstrap in a lab. We can get something good enough and if we can get something good enough, then we can ship a robot, and if we can ship a robot to production, then we will get real world training data.”<br/></p><p>Chef Robotics has been deploying robot modules that they can slot into existing food assembly lines in place of humans without any retrofitting necessary. The modules consist of six-degree-of-freedom arms wearing swanky IP67 washable suits. To handle different kinds of food, the robots can be equipped with a variety of different utensils (and their accompanying manipulation software strategies). Sensing includes a few depth cameras, as well as a weight-sensing platform for the food tray to ensure consistent amounts of food are picked. And while arms with six degrees of freedom may be overkill for now, eventually the hope is that they’ll be able to handle more complex food like asparagus, where you need to do a little bit more than just scoop.</p><p>While Chef Robotics seems to have a viable business here, Bhageria tells us that he keeps coming back to that vision of robots being useful in fast casual restaurants, and eventually, robots making us food in our homes. Making that happen will require time, experience, technical expertise, and an astonishing amount of real-world training data, which is the real value behind those 20 million robot-prepared meals (and counting). The more robots the company deploys, the more data they collect, which will allow them to train their food manipulation models to handle a wider variety of ingredients to open up even more deployments. Their robots, <a href="https://www.chefrobotics.ai/post/lifting-the-veil-on-chef-and-the-future-of-embodied-ai-in-the-food-industry" target="_blank">Chef’s website says</a>, “essentially act as data ingestion engines to improve our AI models.”</p><p>The next step is likely <a href="https://cloudkitchens.com/blog/ultimate-guide-to-ghost-kitchens/" target="_blank">ghost kitchens</a>, where the environment is still somewhat controlled and human interaction isn’t necessary, followed by deployments in commercial kitchens more broadly. But even that won’t be enough for Bhageria, who wants robots that can take over from all of the drudgery in food service: “I’m really excited about this vision,” he says. 
“How do we deploy hundreds of millions of robots all over the world that allow humans to do what humans do best?”</p>]]></description><pubDate>Thu, 11 Jul 2024 18:51:23 +0000</pubDate><guid>https://spectrum.ieee.org/chef-robotics-food-robots</guid><category>Chef robotics</category><category>Food robots</category><category>Robotic manipulation</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/as-trays-of-food-move-along-a-conveyor-belt-overhead-robot-arms-scoop-up-different-food-items-to-add-them-to-the-trays.png?id=52671148&amp;width=980"></media:content></item><item><title>Sea Drones in the Russia-Ukraine War Inspire New Tactics</title><link>https://spectrum.ieee.org/sea-drone</link><description><![CDATA[
  102. <img src="https://spectrum.ieee.org/media-library/a-man-in-a-camouflage-military-uniform-sits-on-a-naval-drone-at-waters-edge.jpg?id=52557078&width=1200&height=800&coordinates=26%2C0%2C100%2C0"/><br/><br/><p>
  103. <strong>Against all odds,</strong> Ukraine is still standing almost two and a half years after Russia’s massive 2022 invasion. Of course, hundreds of billions of dollars in Western support as well as Russian errors have helped immensely, but it would be a mistake to overlook Ukraine’s creative use of new technologies, particularly drones. While uncrewed aerial vehicles have grabbed most of the attention, it is naval drones that could be the key to bringing Russian president Vladimir Putin to the negotiating table.
  104. </p><p>
  105. These naval-drone operations in the Black Sea against Russian warships and other targets have been so successful that they are prompting, in London, Paris, Washington, and elsewhere, fundamental reevaluations of how drones will affect future naval operations. In August 2023, for example, the Pentagon launched the billion-dollar
  106. <a href="https://www.diu.mil/replicator" rel="noopener noreferrer" target="_blank">Replicator</a> initiative to field air and naval drones (also called sea drones) on a massive scale. It’s widely believed that such drones could be used to help <a href="https://www.navalnews.com/naval-news/2024/06/breaking-down-the-u-s-navys-hellscape-in-detail/" rel="noopener noreferrer" target="_blank">counter</a> a Chinese invasion of Taiwan.
  107. </p><p>
  108. And yet Ukraine’s naval-drone initiative grew out of necessity, not grand strategy. Early in the war, Russia’s Black Sea Fleet launched cruise missiles into Ukraine and blockaded Odesa, effectively shutting down Ukraine’s exports of grain, metals, and manufactured goods. The missile strikes terrorized Ukrainian citizens and shut down the power grid, but Russia’s blockade was arguably more consequential, devastating Ukraine’s economy and creating food shortages from North Africa to the Middle East.
  109. </p><p>
  110. With its navy seized or sunk during the war’s opening days, Ukraine had few options to regain access to the sea. So Kyiv’s troops got creative.
  111. <a href="https://uk.wikipedia.org/wiki/%D0%9B%D1%83%D0%BA%D0%B0%D1%88%D0%B5%D0%B2%D0%B8%D1%87_%D0%86%D0%B2%D0%B0%D0%BD_%D0%92%D0%BE%D0%BB%D0%BE%D0%B4%D0%B8%D0%BC%D0%B8%D1%80%D0%BE%D0%B2%D0%B8%D1%87" rel="noopener noreferrer" target="_blank">Lukashevich Ivan Volodymyrovych</a>, a brigadier general in the <a href="https://greydynamics.com/ukrainian-sbu-protectors-of-the-homeland/" rel="noopener noreferrer" target="_blank">Security Service of Ukraine</a>, the country’s counterintelligence agency, proposed building a series of fast, uncrewed attack boats. In the summer of 2022, the service, which is known by the acronym SBU, began with a few prototype drones. These quickly led to a pair of naval drones that, when used with commercial satellite imagery, off-the-shelf uncrewed aircraft, and Starlink terminals, gave Ukrainian operators the means to sink or disable a<a href="https://www.newsweek.com/russia-black-sea-fleet-ukraine-crimea-tsiklon-corvette-1902339" rel="noopener noreferrer" target="_blank"> third</a> of Russia’s Black Sea Fleet, including the flagship <a href="https://en.wikipedia.org/wiki/Russian_cruiser_Moskva" rel="noopener noreferrer" target="_blank"><em><em>Moskva</em></em></a> and <a href="https://www.newsweek.com/russia-moving-black-sea-ships-ukraine-strikes-crimea-tsiklon-1903944" rel="noopener noreferrer" target="_blank">most</a> of the fleet’s cruise-missile-equipped warships.
  112. </p><p>
  113. To protect their remaining vessels, Russian commanders relocated the Black Sea Fleet to Novorossiysk, 300 kilometers east of Crimea. This move sheltered the ships from Ukrainian drones and missiles, but it also put them too far away to threaten Ukrainian shipping or defend the Crimean Peninsula. Kyiv has exploited the opening by restoring trade routes and mounting sustained airborne and naval drone strikes against Russian bases on Crimea and the Kerch Strait Bridge connecting the peninsula with Russia.
  114. </p><h2>How Maguras and Sea Babies Hunt and Attack</h2><p>
  115. The first Ukrainian drone boats were cobbled together with parts from jet skis, motorboats, and off-the-shelf electronics. But within months, manufacturers working for the Ukraine defense ministry and SBU fielded several designs that proved their worth in combat, most notably the
  116. <a href="https://www.kyivpost.com/analysis/29068" rel="noopener noreferrer" target="_blank">Magura V5</a> and the <a href="https://www.kyivpost.com/post/25792" rel="noopener noreferrer" target="_blank">Sea Baby</a>.
  117. </p><p>
  118. Carrying a 300-kilogram warhead, on par with that of a heavyweight
  119. <a href="https://www.navy.mil/Resources/Fact-Files/Display-FactFiles/Article/2167907/mk-48-heavyweight-torpedo/" rel="noopener noreferrer" target="_blank">torpedo</a>, the Magura V5 is a hunter-killer antiship drone designed to work in swarms that confuse and overwhelm a ship’s defenses. Equipped with Starlink terminals, which connect to SpaceX’s Starlink satellites, and GPS, a group of about three to five Maguras likely moves autonomously to a location near the potential target. From there, operators can wait until conditions are right and then attack the target from multiple angles using remote control and video feeds from the vehicles.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  120. <img alt="A man in a black wetsuit and brown bucket hat stands in shallow water next to a gray naval drone. " class="rm-shortcode" data-rm-shortcode-id="c1be3812e64548fcf19c652ab366ca21" data-rm-shortcode-name="rebelmouse-image" id="07bbd" loading="lazy" src="https://spectrum.ieee.org/media-library/a-man-in-a-black-wetsuit-and-brown-bucket-hat-stands-in-shallow-water-next-to-a-gray-naval-drone.jpg?id=52557150&width=980"/>
  121. <small class="image-media media-caption" placeholder="Add Photo Caption...">A Ukrainian Magura V5 hunter-killer sea drone was demonstrated at an undisclosed location in Ukraine on 13 April 2024. The domed pod toward the bow, which can rotate from side to side, contains a thermal camera used for guidance and targeting.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Valentyn Origrenko/Reuters/Redux</small></p><p>Larger than a Magura, the Sea Baby is a multipurpose vehicle that can carry about 800 kg of explosives, which is close to twice the payload of a Tomahawk cruise missile. A Sea Baby was used in 2023 to inflict substantial damage to the Kerch Strait Bridge. A more recent version <a href="https://www.newsweek.com/ukraine-sea-baby-naval-drones-grad-multiple-rocket-launchers-russia-1903381" target="_blank">carries</a> a rocket launcher that Ukrainian troops plan to use against Russian forces along the Dnipro River, which flows through eastern Ukraine and has often formed the frontline in that part of the country. Like a Magura, a Sea Baby is likely remotely controlled using Starlink and GPS. In addition to attack, it’s also equipped for surveillance and logistics.</p><p>Russia reduced the threat to its ships by moving them out of the region, but fixed targets like the Kerch Strait Bridge remain vulnerable to Ukrainian sea drones. To try to protect these structures from drone onslaughts, Russian commanders are taking a “kitchen sink” approach, <a href="https://www.twz.com/russia-sinks-line-of-its-own-ships-to-protect-kerch-bridge" target="_blank">submerging</a> hulks around bridge supports, fielding more <a href="https://www.bbc.com/news/world-europe-68528761" rel="noopener noreferrer" target="_blank">guns</a> to shoot at incoming uncrewed vessels, and jamming GPS and <a href="https://www.nytimes.com/2024/05/24/technology/ukraine-russia-starlink.html" rel="noopener noreferrer" target="_blank">Starlink</a> around the Kerch Strait.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  122. <img alt="Two men wearing balaclavas operate suitcase-style terminals for remote control of sea drones. " class="rm-shortcode" data-rm-shortcode-id="142cdcfc704ae039012189256c0d20a4" data-rm-shortcode-name="rebelmouse-image" id="63dd6" loading="lazy" src="https://spectrum.ieee.org/media-library/two-men-wearing-balaclavas-operate-suitcase-style-terminals-for-remote-control-of-sea-drones.jpg?id=52557111&width=980"/>
  123. <small class="image-media media-caption" placeholder="Add Photo Caption...">Ukrainian service members demonstrated the portable, ruggedized consoles used to remotely guide the Magura V5 naval drones in April 2024.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Valentyn Origrenko/Reuters/Redux</small>
  124. </p><p>While the war remains largely stalemated in the country’s north, Ukraine’s naval drones could yet force Russia into negotiations. The Crimean Peninsula was Moscow’s biggest prize from its decade-long assault on Ukraine. If the Kerch Bridge is severed and the Black Sea Fleet pushed back into Russian ports, Putin may need to end the fighting to regain control over Crimea.<br/></p><h2>Why the U.S. Navy Embraced the Swarm</h2><p>
  125. Ukraine’s small, low-cost sea drones are offering a compelling view of future tactics and capabilities. But recent experiences elsewhere in the world are highlighting the limitations of drones for some crucial tasks. For example, for protecting shipping from piracy or stopping trafficking and illegal fishing, drones are less useful.
  126. </p><p>
  127. Before the Ukraine war, efforts by the U.S. Department of Defense to field surface sea drones focused mostly on large vehicles. In 2015, the Defense Advanced Research Projects Agency started, and the U.S. Navy later continued, a project that built
  128. <a href="https://news.usni.org/2021/04/08/navy-takes-delivery-of-sea-hawk-unmanned-vessel" rel="noopener noreferrer" target="_blank">two uncrewed surface vessels</a>, called <em><em>Sea Hunter</em></em> and <em><em>Sea Hawk</em></em>. These were 130-tonne sea drones capable of roaming the oceans for up to 70 days while carrying payloads of thousands of pounds each. The point was to demonstrate the ability to detect, follow, and destroy submarines. The Navy and the Pentagon’s secretive Strategic Capabilities Office <a href="https://www.naval-technology.com/projects/ghost-fleet-overlord-unmanned-surface-vessels-usa/" rel="noopener noreferrer" target="_blank">followed</a> with the Ghost Fleet Overlord uncrewed vessel programs, which produced four larger prototypes designed to carry shipping-container-size payloads of missiles, sensors, or electronic countermeasures.
  129. </p><p>
  130. The U.S. Navy’s newly created Uncrewed Surface Vessel Division 1 (
  131. <a href="https://seapowermagazine.org/navy-establishes-unmanned-surface-vessel-division-one/" rel="noopener noreferrer" target="_blank">USVDIV-1</a>) completed a <a href="https://www.cpf.navy.mil/Newsroom/News/Article/3569198/unmanned-surface-vessel-division-arrives-in-sydney/" rel="noopener noreferrer" target="_blank">deployment</a> across the Pacific Ocean last year with four medium and large sea drones: <em><em>Sea Hunter</em></em> and <em><em>Sea Hawk </em></em>and two Overlord vessels, <em><em>Ranger</em></em> and <em><em>Mariner.</em></em> The five-month deployment from Port Hueneme, Calif., took the vessels to Hawaii, Japan, and Australia, where they joined in annual exercises conducted by U.S. and allied navies. The U.S. Navy continues to <a href="https://www.defensenews.com/naval/2022/08/08/us-navy-injects-first-of-kind-unmanned-experiments-into-multinational-exercise/" rel="noopener noreferrer" target="_blank">assess</a> its drone fleet through sea trials lasting from several days to a few months.
  132. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  133. <img alt="A battleship-gray trimaran ship cruises near a wooded shoreline." class="rm-shortcode" data-rm-shortcode-id="1e49797fdd2d924f71289a1090da26ff" data-rm-shortcode-name="rebelmouse-image" id="c14eb" loading="lazy" src="https://spectrum.ieee.org/media-library/a-battleship-gray-trimaran-ship-cruises-near-a-wooded-shoreline.jpg?id=52557117&width=980"/>
  134. <small class="image-media media-caption" placeholder="Add Photo Caption...">The <i>Sea Hawk</i> is a U.S. Navy trimaran drone vessel designed to find, pursue, and attack submarines. The 130-tonne ship, photographed here in October of 2023 in Sydney Harbor, was built to operate autonomously on missions of up to 70 days, but it can also accommodate human observers on board. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Ensign Pierson Hawkins/U.S. Navy</small>
  135. </p><p>
  136. In contrast with Ukraine’s small sea drones, which are usually remotely controlled and operate outside shipping lanes, the U.S. Navy’s much larger uncrewed vessels have to follow the nautical rules of the road. To navigate autonomously, these big ships rely on robust onboard sensors, processing for computer vision and target-motion analysis, and automation based on predictable forms of artificial intelligence, such as expert- or agent-based algorithms rather than deep learning.
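To make the distinction concrete, here is a minimal sketch, written in Python, of the kind of fixed, rule-based maneuver logic such vessels favor. It is illustrative only: the Navy’s actual software is not public, and every name and threshold below is an assumption.</p><pre>
# Illustrative rule-based (expert-system-style) give-way check for a large USV.
# Not the Navy's software; names and thresholds are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class Contact:
    bearing_deg: float        # relative bearing of the other vessel, 0 = dead ahead
    range_m: float            # distance to the other vessel, in meters
    closing_speed_mps: float  # positive if the range is decreasing

def maneuver_decision(c: Contact, min_range_m: float = 500.0, horizon_s: float = 600.0) -> str:
    """Pick a maneuver from a fixed rule table (predictable and auditable; no learning)."""
    if c.closing_speed_mps <= 0:
        return "stand_on"                 # the contact is opening; hold course and speed
    time_to_close_s = c.range_m / c.closing_speed_mps
    if time_to_close_s > horizon_s:
        return "stand_on"                 # no conflict inside the planning horizon
    if c.range_m > min_range_m and 0 < c.bearing_deg < 112.5:
        return "alter_course_starboard"   # crossing contact to starboard: give way early
    return "slow_and_turn_away"           # close-range fallback: open the range

# Example: a contact 2 km off the starboard bow, closing at 5 m/s.
print(maneuver_decision(Contact(bearing_deg=40.0, range_m=2000.0, closing_speed_mps=5.0)))
</pre><p>Because every branch is explicit, the vessel’s behavior can be reviewed and certified line by line, which is much harder to do for a deep-learning policy.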
  137. </p><p>
  138. But thanks to the success of the Ukrainian drones, the focus and energy in sea drones are rapidly moving to the smaller end of the scale. The U.S. Navy initially envisioned platforms like
  139. <em>Sea Hunter</em> conducting missions in submarine tracking, electronic deception, or clandestine surveillance far out at sea. And large drones will still be needed for such missions. However, with the right tactics and support, a group of small sea drones can conduct similar missions as well as other vital tasks.
  140. </p><p>
  141. For example, though they are constrained in speed, maneuverability, and power generation, solar- or sail-powered drones can stay out for months with little human intervention. The earliest of these are wave gliders like the Liquid Robotics (a Boeing company)
  142. <a href="https://www.liquid-robotics.com/markets/defense-security/" rel="noopener noreferrer" target="_blank"> SHARC</a>, which has been conducting undersea and surface surveillance for the U.S. Navy for more than a decade. Newer designs like the Saildrone <a href="https://www.saildrone.com/tag/voyager" rel="noopener noreferrer" target="_blank">Voyager</a> and Ocius <a href="https://ocius.com.au/usv/" rel="noopener noreferrer" target="_blank">Blue Bottle</a> incorporate motors and additional solar or diesel power to haul payloads such as radars, jammers, decoys, or active sonars. The Ocean Aero <a href="https://www.oceanaero.com/the-triton" rel="noopener noreferrer" target="_blank">Triton</a> takes this model one step further: It can submerge, to conduct clandestine surveillance or a surprise attack, or to avoid detection.
  143. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  144. <img alt="A pair of photographs shows an oblong, gray-and-black sea vessel cruising underwater and also sailing on the surface. " class="rm-shortcode" data-rm-shortcode-id="80e9baa671b33a4dee632bfe1fcd146a" data-rm-shortcode-name="rebelmouse-image" id="d8de8" loading="lazy" src="https://spectrum.ieee.org/media-library/a-pair-of-photographs-shows-an-oblong-gray-and-black-sea-vessel-cruising-underwater-and-also-sailing-on-the-surface.jpg?id=52557125&width=980"/>
  145. <small class="image-media media-caption" placeholder="Add Photo Caption...">The Triton, from Ocean Aero in Gulfport, Miss., is billed as the world’s only autonomous sea drone capable of both cruising underwater and sailing on the surface. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Ocean Aero</small></p><p>
  146. Ukraine’s success in the Black Sea has also unleashed a flurry of new small antiship attack drones. USVDIV-1 will use the
  147. <a href="https://www.mapcorp.com/technologies-main/#marine" rel="noopener noreferrer" target="_blank">GARC</a> from <a href="https://www.mapcorp.com/" rel="noopener noreferrer" target="_blank">Maritime Applied Physics Corp.</a> to develop tactics. The Pentagon’s Defense Innovation Unit has also begun <a href="https://www.thedefensepost.com/2024/04/10/texas-marine-drones-china/" rel="noopener noreferrer" target="_blank">purchasing</a> drones for the China-focused Replicator initiative. Among the likely craft being evaluated are fast-attack sea drones from Austin, Texas–based <a href="https://www.saronic.com/" rel="noopener noreferrer" target="_blank">Saronic</a>.
  148. </p><p>
  149. Behind the soaring interest in small and inexpensive sea drones is the
  150. <a href="https://www.hudson.org/defense-strategy/unalone-unafraid-plan-integrating-uncrewed-other-emerging-technologies-us-military-bryan-clark-dan-patt" rel="noopener noreferrer" target="_blank">changing value proposition</a> for naval drones. As recently as four years ago, military planners were focused on using them to replace crewed ships in “dull, dirty, and dangerous” jobs. But now, the thinking goes, sea drones can provide scale, adaptability, and resilience across each link in the “kill chain” that extends from detecting a target to hitting it with a weapon.
  151. </p><p>
  152. Today, to attack a ship, most navies rely on one preferred sensor (such as a radar system), one launcher, and one missile. What planners are now coming to appreciate is that a fleet of crewed surface ships working with a dozen or two naval drones would offer multiple paths to both find that ship and attack it. Because they are dispersed, these craft would also be less vulnerable.
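A rough, hypothetical illustration of why multiple paths matter: if any single sensor-to-weapon path got through, say, half the time, then three independent paths would succeed roughly 88 percent of the time (1 − 0.5 × 0.5 × 0.5 = 0.875). The figures are invented, but they show how quickly redundant paths compound.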
  153. </p><h2>Defending Taiwan by Surrounding It With a “Hellscape”</h2><p>
  154. U.S. efforts to protect Taiwan may soon reflect this new value proposition. Many
  155. <a href="https://www.defenseone.com/policy/2021/07/it-failed-miserably-after-wargaming-loss-joint-chiefs-are-overhauling-how-us-military-will-fight/184050/" rel="noopener noreferrer" target="_blank">classified</a> and <a href="https://www.csis.org/analysis/first-battle-next-war-wargaming-chinese-invasion-taiwan" rel="noopener noreferrer" target="_blank">unclassified</a> war games suggest Taiwan and its allies could successfully defend the island—but at costs high enough to potentially dissuade a U.S. president from intervening on Taiwan’s behalf. With U.S. defense budgets capped by law and procurement constrained by rising personnel and maintenance costs, substantially growing or improving today’s U.S. military for this specific purpose is unrealistic. Instead, commanders are looking for creative solutions to slow or stop a Chinese invasion without losing most U.S. forces in the process.
  156. </p><p>
  157. Naval drones look like a good—and maybe the best—
  158. <a href="https://www.hudson.org/defense-strategy/hedging-bets-rethinking-force-design-post-dominance-era-bryan-clark-dan-patt" rel="noopener noreferrer" target="_blank">solution</a>. The Taiwan Strait is only 160 kilometers (100 miles) wide, and Taiwan’s coastline offers only a few areas where large numbers of troops could come ashore. U.S. naval attack drones positioned on the likely routes could disrupt or possibly even halt a Chinese invasion, much as Ukrainian sea drones have denied Russia access to the western Black Sea and, for that matter, Houthi-controlled drones have sporadically closed off large parts of the Red Sea in the Middle East.
  159. </p><p class="pull-quote">Rather than killer robots seeking out and destroying targets, the drones defending Taiwan would be passively waiting for Chinese forces to illegally enter a protected zone, within which they could be attacked.</p><p>
  160. The new U.S. Indo-Pacific Command leader, Admiral
  161. <a href="https://www.navy.mil/Leadership/Flag-Officer-Biographies/Search/Article/2236378/admiral-samuel-paparo/" rel="noopener noreferrer" target="_blank">Sam Paparo</a>, wants to apply this approach to defending Taiwan in a scenario he calls “<a href="https://www.defenseone.com/technology/2023/08/hellscape-dod-launches-massive-drone-swarm-program-counter-china/389797/" rel="noopener noreferrer" target="_blank">Hellscape</a>.” In it, U.S. surface and undersea drones would likely be based near Taiwan, perhaps in the Philippines or Japan. When the potential for an invasion rises, the drones would move themselves or be carried by larger uncrewed or crewed ships to the western coast of Taiwan to wait.
  162. </p><p>
  163. Sea drones are well-suited to this role, thanks in part to the evolution of naval technologies and tactics over the past half century. Until World War II, submarines were the most lethal threat to ships. But since the Cold War, long-range subsonic, supersonic, and now hypersonic antiship missiles have commanded navy leaders’ attention. They’ve spent decades devising ways to protect their ships against such missiles.
  164. </p><p>
  165. Much less effort has gone into defending against torpedoes, mines—or sea drones. A dozen or more missiles might be needed to ensure that just one reaches a targeted ship, and even then, the
  166. <a href="https://apnews.com/article/yemen-red-sea-ship-attack-c47710540383198ba1acb41b07f14751" rel="noopener noreferrer" target="_blank">damage</a> may not be catastrophic. But a single surface or undersea drone could easily evade detection and explode at a ship’s waterline to sink it, because in this case, water pressure does most of the work.
  167. </p><p>
  168. The level of autonomy available in most sea drones today is more than enough to attack ships in the Taiwan Strait. Details of U.S. military plans are classified, but a recent Hudson Institute
  169. <a href="https://www.hudson.org/defense-strategy/hedging-bets-rethinking-force-design-post-dominance-era-bryan-clark-dan-patt" rel="noopener noreferrer" target="_blank">report</a> that I wrote with Dan Patt, proposes a possible approach. In it, a drone flotilla, consisting of about three dozen hunter-killer surface drones, two dozen uncrewed surface vessels carrying aerial drones, and three dozen autonomous undersea drones, would take up designated positions in a “kill box” adjacent to one of Taiwan’s western beaches if a Chinese invasion fleet had begun massing on the opposite side of the strait. Even if they were based in Japan or the Philippines, the drones could reach Taiwan within a day. Upon receiving a signal from operators remotely using Starlink or locally using a line-of-sight radio, the drones would act as a mobile minefield, attacking troop transports and their escorts inside Taiwan’s territorial waters. Widely available electro-optical and infrared sensors, coupled to recognition <a href="https://ieeexplore.ieee.org/document/9987188" rel="noopener noreferrer" target="_blank">algorithms</a>, would direct the drones to targets.
  170. </p><p>
  171. Although communications with operators onshore would likely be jammed, the drones could coordinate their actions locally using line-of-sight Internet Protocol–based networks like
  172. <a href="https://silvustechnologies.com/" rel="noopener noreferrer" target="_blank">Silvus</a> or <a href="https://www.collinsaerospace.com/what-we-do/industries/military-and-defense/communications/tactical-data-links/tactical-targeting-network-technology" rel="noopener noreferrer" target="_blank">TTNT</a>. For example, surface vessels could launch aerial drones that would attack the pilot houses and radars of ships, while surface and undersea drones strike ships at the waterline. The drones could also coordinate to ensure they do not all strike the same target and to prioritize the largest targets first. These kinds of simple collaborations are routine in today’s drones.
  173. </p><p>
  174. Treating drones like mines reduces the complexity needed in their control systems and helps them comply with Pentagon
  175. <a href="https://www.defense.gov/News/News-Stories/Article/Article/3278065/dod-updates-autonomy-in-weapons-system-directive/#:~:text=DOD%20requires%20extensive%20testing%2C%20reviews,do%20not%20meet%20specific%20exemptions." rel="noopener noreferrer" target="_blank"> rules</a> for autonomous weapons. Rather than killer robots seeking out and destroying targets, the drones defending Taiwan would be passively waiting for Chinese forces to illegally enter a protected zone, within which they could be attacked.
  176. </p><p>
  177. Like Russia’s Black Sea Fleet, the Chinese navy will develop countermeasures to sea drones, such as employing decoy ships, attacking drones from the air, or using minesweepers to move them away from the invasion fleet. To stay ahead, operators will need to continue innovating tactics and behaviors through frequent exercises and experiments, like those
  178. <a href="https://www.navy.mil/Press-Office/News-Stories/Article/3781958/surfor-establishes-unmanned-surface-vessel-squadron-usvron-three/" rel="noopener noreferrer" target="_blank">underway</a> at U.S. Navy Unmanned Surface Vessel Squadron Three. (Like the USVDIV-1, it is a unit under the U.S. Navy’s <a href="https://insidedefense.com/insider/navy-stand-new-unmanned-surface-vessel-squadron" rel="noopener noreferrer" target="_blank">Surface Development Squadron One</a>.) Lessons from such exercises would be incorporated into the defending drones as part of their programming before a mission.
  179. </p><p>
  180. The emergence of sea drones heralds a new era in naval warfare. After decades of focusing on increasingly lethal antiship missiles, navies now have to defend against capable and widely proliferating threats on, above, and below the water. And while sea drone swarms may be mainly a concern in coastal areas and choke points, these waters are critical to the global economy and most nations’ security. For U.S. and allied fleets, especially, naval drones are a classic combination of threat
  181. <em>and</em> opportunity. As the Hellscape concept suggests, uncrewed vessels may be a solution to some of the most challenging and sweeping of modern naval scenarios for the Pentagon and its allies—and their adversaries. <span class="ieee-end-mark"></span>
  182. </p><p><em>This article was updated on 10 July 2024. An earlier version stated that sea drones from Saronic Technologies are being purchased by the U.S. Department of Defense’s Defense Innovation Unit. This could not be publicly confirmed.</em></p>]]></description><pubDate>Wed, 10 Jul 2024 12:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/sea-drone</guid><category>Naval drones</category><category>Sea drones</category><category>Uncrewed surface vehicles</category><category>Autonomous surface vehicles</category><dc:creator>Bryan Clark</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-man-in-a-camouflage-military-uniform-sits-on-a-naval-drone-at-waters-edge.jpg?id=52557078&amp;width=980"></media:content></item><item><title>Video Friday: Humanoids Building BMWs</title><link>https://spectrum.ieee.org/video-friday-humanoids-building-bmws</link><description><![CDATA[
  183. <img src="https://spectrum.ieee.org/media-library/a-silvery-humanoid-robot-picks-up-car-parts-from-a-fixture-in-a-factory.png?id=52546736&width=1200&height=800&coordinates=150%2C0%2C150%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><h5><a href="https://icra40.ieee.org/">ICRA@40</a>: 23–26 September 2024, ROTTERDAM, NETHERLANDS</h5><h5><a href="https://iros2024-abudhabi.org/">IROS 2024</a>: 14–18 October 2024, ABU DHABI, UAE</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="k1trbi0baau">Figure is making progress toward a humanoid robot that can do something useful, but keep in mind that the “full use case” here is not one continuous shot.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c8db6c81c8499e776e5d41aaff04a190" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/K1TrbI0BaaU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.figure.ai/">Figure</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ijmykhr1lyw">Can this robot survive a 1-meter drop? Spoiler alert: it cannot.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f8c39b97670d3d14fa6b7cdfe60584fd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/IjMyKHr1lyw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://yugu.faculty.wvu.edu/">WVUIRL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="b_i2k7mzekg">One of those things that’s a lot harder for robots than it probably looks.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="96a700df89fa62b0334122796933c94f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/B_I2k7MZEKg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>This is a demo of hammering a nail. The instantaneous rebound force from the hammer is absorbed through a combination of the elasticity of the rubber material securing the hammer, the deflection in torque sensors and harmonic gears, back-drivability, and impedance control. 
This allows the nail to be driven with a certain amount of force.</em></blockquote><p>[ <a href="https://robotics.tokyo/">Tokyo Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="qetylccejtw"><em>Although bin packing has been a key benchmark task for robotic manipulation, the community has mainly focused on the placement of rigid rectilinear objects within the container. We address this by presenting a soft robotic hand that combines vision, motor-based proprioception, and soft tactile sensors to identify, sort, and pack a stream of unknown objects.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2beb6be6aaeee0228d93d9edb21e6258" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/qetYLCcejTw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.youtube.com/watch?v=qetYLCcejTw">MIT CSAIL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ujdk3yd2ghy"><em>Status Update: Extending traditional visual servo and compliant control by integrating the latest reinforcement and imitation learning control methodologies, UBTECH gradually trains the embodied intelligence-based “cerebellum” of its humanoid robot Walker S for diverse industrial manipulation tasks.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7009b146ff17c4fc5b6008f65f772b5c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ujdK3yd2gHY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ubtrobot.com/humanoid/products/Walker">UBTECH</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="nuosqungasw">If you’re gonna ask a robot to stack bread, better make it flat.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e3016171edff0edede3527dbdb59bd88" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/nUOsQungAsw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.fanucamerica.com/products/robots/series/dr-3ib-series-delta-robots/dr-3ib-8l">FANUC</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="orie28shevc">Cassie has to be one of the most distinctive sounding legged robots there is.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="46e84ab372a73f5ddf558217ecb31e71" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ORie28sHEvc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2403.02486">Paper</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="0tlsys8a4aa">Twice the robots are by definition twice as capable, right...?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" 
data-rm-shortcode-id="9bc13d7facf89a8aa0357a624fb28a1d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0tLSYs8A4AA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pollen-robotics.com/">Pollen Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="lld3bfs-qms"><em>The Robotic Systems Lab participated in the Advanced Industrial Robotic Applications (AIRA) Challenge at the ACHEMA 2024 process industry trade show, where teams demonstrated their teleoperated robotic solutions for industrial inspection tasks. We competed with the ALMA legged manipulator robot, teleoperated using a second robot arm in a leader-follower configuration, placing us in third place for the competition.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b2dbac224a474e3a7c36513e8b913ae9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/LLD3BFS-qms?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://rsl.ethz.ch/">ETHZ RSL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="zuntf0kmtze">This is apparently “peak demand” in a single market for Wing delivery drones.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1b6a198d2839aaff8090c2de344836e8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Zuntf0KmtzE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://wing.com/">Wing</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="fkdnu50nx-8"><em>Using a new type of surgical intervention and neuroprosthetic interface, MIT researchers, in collaboration with colleagues from Brigham and Women’s Hospital, have shown that a natural walking gait is achievable using a prosthetic leg fully driven by the body’s own nervous system. The surgical amputation procedure reconnects muscles in the residual limb, which allows patients to receive “proprioceptive” feedback about where their prosthetic limb is in space.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="47c34c0243154907cdeb6b30c0b85c48" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/fKdnu50Nx-8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://news.mit.edu/2024/prosthesis-helps-people-with-amputation-walk-naturally-0701">MIT</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="lpodb4c5cim"><em>Coal mining in Forest of Dean (UK) is such a difficult and challenging job. Going into the mine as human is sometimes almost impossible. 
We did it with our robot while inspecting the mine with our partners (Forestry England) and the local miners!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="54a784d4fb098847bec2b47efa55f30c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/LPoDb4C5cIM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://rpl-as-ucl.github.io/">UCL RPL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="b72gej00gtq">Chill.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cc03677ec95c9ac1462ba5fb7c4877ff" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/b72geJ00gTQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://new.abb.com/products/robotics/robots/collaborative-robots/yumi/dual-arm">ABB</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ui6iklhh-pq"><em>Would you tango with a robot? Inviting us into the fascinating world of dancing machines, robot choreographer Catie Cuan highlights why teaching robots to move with grace, intention and emotion is essential to creating AI-powered machines we will want to welcome into our daily lives.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5d848c3b8c41bd801e3b4d6ca532e376" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UI6IKlHh-pQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ted.com/talks/catie_cuan_next_up_for_ai_dancing_robots?rss=172BB350-0205">TED</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 05 Jul 2024 19:51:56 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-humanoids-building-bmws</guid><category>Video friday</category><category>Humanoid robots</category><category>Figure</category><category>Dancing robots</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-silvery-humanoid-robot-picks-up-car-parts-from-a-fixture-in-a-factory.png?id=52546736&amp;width=980"></media:content></item><item><title>Persona AI Brings Calm Experience to the Hectic Humanoid Industry</title><link>https://spectrum.ieee.org/persona-ai-radford-pratt</link><description><![CDATA[
  184. <img src="https://spectrum.ieee.org/media-library/a-shadowy-rendering-of-a-humanoid-robot-standing-in-the-darkness.png?id=52520431&width=1200&height=800&coordinates=0%2C101%2C0%2C101"/><br/><br/><p>
  185. It may at times seem like there are as many humanoid robotics companies out there as the industry could possibly sustain, but the potential for useful and reliable and affordable humanoids is so huge that there’s plenty of room for any company that can actually get them to work. Joining the <a href="https://spectrum.ieee.org/humanoid-robots" target="_blank">dozen or so companies</a> already on this quest is
  186. <a href="https://personainc.ai/" target="_blank">Persona AI</a>, founded last month by <a href="https://www.linkedin.com/in/nicolaus-radford/" target="_blank">Nic Radford</a> and <a href="https://www.linkedin.com/in/jerry-pratt/" target="_blank">Jerry Pratt</a>, two people who know better than just about anyone what it takes to make a successful robotics company, although they also know enough to be wary of getting into commercial humanoids.</p><h3></h3><br/><p>
  187. Persona AI may not be the first humanoid robotics startup, but its founders have some serious experience in the space:
  188. </p><p>
  189. <strong>Nic Radford</strong> led the team that developed NASA’s <a href="https://spectrum.ieee.org/meet-valkyrie-nasas-superhero-robot" target="_blank">Valkyrie humanoid robot</a>, before founding Houston Mechatronics (now Nauticus Robotics), which introduced a <a href="https://spectrum.ieee.org/meet-aquanaut-the-underwater-transformer" target="_blank">transforming underwater robot</a> in 2019. He also founded Jacobi Motors, which is commercializing variable flux electric motors.
  190. </p><p>
  191. <strong>Jerry Pratt</strong> worked on walking robots for 20 years at the Institute for Human and Machine Cognition (<a href="https://robots.ihmc.us/" target="_blank">IHMC</a>) in Pensacola, Florida. He co-founded <a href="https://boardwalkrobotics.com/" target="_blank">Boardwalk Robotics</a> in 2017, and has spent the last two years as CTO of <a href="https://spectrum.ieee.org/figure-robot-video" target="_blank">multi-billion-dollar humanoid startup Figure</a>.
  192. </p><p>
  193. “It took me a long time to warm up to this idea,” Nic Radford tells us. “After I left Nauticus in January, I didn’t want anything to do with humanoids, especially underwater humanoids, and I didn’t even want to hear the word ‘robot.’ But things are changing so quickly, and I got excited and called Jerry and I’m like, this is actually very possible.” Jerry Pratt, who recently left Figure due primarily to the
  194. <a href="https://en.wikipedia.org/wiki/Two-body_problem_(career)" rel="noopener noreferrer" target="_blank"><u>two-body problem</u></a>, seems to be coming from a similar place: “There’s a lot of bashing your head against the wall in robotics, and persistence is so important. Nic and I have both gone through pessimism phases with our robots over the years. We’re a bit more optimistic about the commercial aspects now, but we want to be pragmatic and realistic about things too.”
  195. </p><p>
  196. Behind all of the recent humanoid hype lies the very, very difficult problem of making a highly technical piece of hardware and software compete effectively with humans in the labor market. But that’s also a very, very big opportunity—big enough that Persona doesn’t have to be the first company in this space, or the best funded, or the highest profile. They simply have to succeed, but of course sustainable commercial success with any robot (and bipedal robots in particular) is anything but simple. Step one will be building a founding team across two locations: Houston and Pensacola, Fla. But Radford says that the response so far to just a couple of
  197. <a href="https://www.linkedin.com/company/persona-humanoids-at-work/" target="_blank">LinkedIn posts</a> about Persona has been “tremendous.” And with a substantial seed investment in the works, Persona will have more than just a vision to attract top talent.
  198. </p><p>
  199. For more details about Persona, we spoke with Persona AI co-founders Nic Radford and Jerry Pratt.
  200. </p><p>
  201. <strong>Why start this company, why now, and why you?</strong>
  202. </p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-right" data-rm-resized-container="25%" style="float: right;">
  203. <img alt="" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="96390cf5671fa10555398542fbc22991" data-rm-shortcode-name="rebelmouse-image" id="90434" loading="lazy" src="https://spectrum.ieee.org/media-library/nic-radford.png?id=52515077&width=980" style="max-width: 100%"/>
  204. <small class="image-media media-caption" placeholder="Add Photo Caption..." style="max-width: 100%;">Nic Radford</small>
  205. </p><p>
  206. <strong>Nic Radford: </strong>The idea for this started a long time ago. Jerry and I have been working together off and on for quite a while, being in this field and sharing a love for what the humanoid potential is while at the same time being frustrated by where humanoids are at. As far back as probably 2008, we were thinking about starting a humanoids company, but for one reason or another the viability just wasn’t there. We were both recently searching for our next venture and we couldn’t imagine sitting this out completely, so we’re finally going to explore it, although we know better than anyone that robots are really hard. They’re not that hard to build; but they’re hard to make useful and make money with, and the challenge for us is whether we can build a viable business with Persona: can we build a business that uses robots and makes money? That’s our singular focus. We’re pretty sure that this is likely the best time in history to execute on that potential.
  207. </p><p>
  208. <strong>Jerry Pratt: </strong>I’ve been interested in commercializing humanoids for quite a while—thinking about it, and giving it a go here and there, but until recently it has always been the wrong time from both a commercial point of view and a technological readiness point of view. You can think back to the DARPA Robotics Challenge days when we had to wait about 20 seconds to get a good lidar scan and process it, which made it really challenging to do things autonomously. But we’ve gotten much, much better at perception, and now, we can get a whole perception pipeline to run at the framerate of our sensors. That’s probably the main enabling technology that’s happened over the last 10 years.
  209. </p><p>
  210. From the commercial point of view, now that we’re showing that this stuff’s feasible, there’s been a lot more pull from the industry side. It’s like we’re at the next stage of the Industrial Revolution, where the harder problems that weren’t roboticized from the 60s until now can now be. And so, there’s really good opportunities in a lot of different use cases.
  211. </p><p>
  212. <strong>A bunch of companies have started within the last few years, and several were even earlier than that. Are you concerned that you’re too late?</strong>
  213. </p><p>
  214. <strong>Radford:</strong> The concern is that we’re still too early! There might only be one Figure out there that raises a billion dollars, but I don’t think that’s going to be the case. There’s going to be multiple winners here, and if the market is as large as people claim it is, you could see quite a diversification of classes of commercial humanoid robots.
  215. </p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-right" data-rm-resized-container="25%" style="float: right;">
  216. <img alt="" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="2068a0ff840814f5b57e8900041a465f" data-rm-shortcode-name="rebelmouse-image" id="922cd" loading="lazy" src="https://spectrum.ieee.org/media-library/jerry-pratt.png?id=52515080&width=980" style="max-width: 100%"/>
  217. <small class="image-media media-caption" placeholder="Add Photo Caption..." style="max-width: 100%;">Jerry Pratt</small>
  218. </p><p>
  219. <strong>Pratt:</strong> We definitely have some catching up to do but we should be able to do that pretty quickly, and I’d say most people really aren’t that far from the starting line at this point. There’s still a lot to do, but all the technology is here now—we know what it takes to put together a really good team and to build robots. We’re also going to do what we can to increase speed, like by starting with a surrogate robot from someone else to get the autonomy team going while building our own robot in parallel.
  220. </p><p>
  221. <strong>Radford:</strong> I also believe that our capital structure is a big deal. We’re taking an anti-stealth approach, and we want to bring everyone along with us as our company grows and give out a significant chunk of the company to early joiners. It was an anxiety of ours that we would be perceived as a me-too and that nobody was going to care, but it’s been the exact opposite with a compelling response from both investors and early potential team members.
  222. </p><p>
  223. <strong>So your approach here is not to look at all of these other humanoid robotics companies and try and do something they’re not, but instead to pursue similar goals in a similar way in a market where there’s room for all?</strong>
  224. </p><p>
  225. <strong>Pratt:</strong> All robotics companies, and AI companies in general, are standing on the shoulders of giants. These are the thousands of robotics and AI researchers that have been collectively bashing their heads against the myriad problems for decades—some of the first humanoids were walking at
  226. <a href="https://www.youtube.com/watch?v=OMWU3KcSizY" target="_blank">Waseda University in the late 1960s</a>. While there are some secret sauces that we might bring to the table, it is really the combined efforts of the research community that now enables commercialization.
  227. </p><p>
  228. So if you’re at a point where you need something new to be invented in order to get to applications, then you’re in trouble, because with invention you never know how long it’s going to take. What is available today and now, the technology that’s been developed by various communities over the last 50+ years—we all have what we need for the first three applications that are widely mentioned: warehousing, manufacturing, and logistics. The big question is, what’s the fourth application? And the fifth and the sixth? And if you can start detecting those and planning for them, you can get a leg up on everybody else.
  229. </p><p>
  230. The difficulty is in the execution and integration. It’s a ten thousand—no, that’s probably too small—it’s a hundred thousand piece puzzle where you gotta get each piece right, and occasionally you lose some pieces on the floor that you just can’t find. So you need a broad team that has expertise in like 30 different disciplines to try to solve the challenge of an end-to-end labor solution with humanoid robots.
  231. </p><p>
  232. <strong>Radford:</strong> The idea is like one percent of starting a company. The rest of it, and why companies fail, is in the execution. Things like, not understanding the market and the product-market fit, or not understanding how to run the company, the dimensions of the actual business. I believe we’re different because with our backgrounds and our experience we bring a very strong view on execution, and that is our focus on day one. There’s enough interest in the VC community that we can fund this company with a singular focus on commercializing humanoids for a couple different verticals.
  233. </p><p>
  234. But listen, we got some novel ideas in actuation and other tricks up our sleeve that might be very compelling for this, but we don’t want to emphasize that aspect. I don’t think Persona’s ultimate success comes just from the tech component. I think it comes mostly from ‘do we understand the customer, the market needs, the business model, and can we avoid the mistakes of the past?’
  235. </p><p>
  236. <strong>How is that going to change things about the way that you run Persona?</strong>
  237. </p><p>
  238. <strong>Radford:</strong> I started a company [Houston Mechatronics] with a bunch of research engineers. They don’t make the best product managers. More broadly, if you’re staffing all your disciplines with roboticists and engineers, you’ll learn that it may not be the most efficient way to bring something to market. Yes, we need those skills. They are essential. But there’s so many other aspects of a business that get overlooked when you’re fundamentally a research lab trying to commercialize a robot. I’ve been there, I’ve done that, and I’m not interested in making that mistake again.
  239. </p><p>
  240. <strong>Pratt:</strong> It’s important to get a really good product team that’s working with a customer from day one to have customer needs drive all the engineering. The other approach is ‘build it and they will come’ but then maybe you don’t build the right thing. Of course, we want to build multi-purpose robots, and we’re steering clear of saying ‘general purpose’ at this point. We don’t want to overfit to any one application, but if we can get to a dozen use cases, two or three per customer site, then we’ve got something.
  241. </p><p>
  242. <strong>There still seems to be a couple of unsolved technical challenges with humanoids, including hands, batteries, and safety. How will Persona tackle those things?</strong>
  243. </p><p>
  244. <strong>Pratt:</strong> Hands are such a hard thing—getting a hand that has the required degrees of freedom and is robust enough that if you accidentally hit it against your table, you’re not just going to break all your fingers. But we’ve seen robotic hand companies popping up now that are showing videos of hitting their hands with a hammer, so I’m hopeful.
  245. </p><p>
  246. Getting one to two hours of battery life is relatively achievable. Pushing up towards five hours is super hard. But batteries can now be charged in 20 minutes or so, as long as you’re going from 20 percent to 80 percent. So we’re going to need a cadence where robots are swapping in and out and charging as they go. And batteries will keep getting better.
  247. </p><p>
  248. <strong>Radford:</strong> We do have a focus on safety. It was paramount at NASA, and when we were working on Robonaut, it led to a lot of morphological considerations with padding. In fact, the first concepts and images we have of our robot illustrate extensive padding, but we have to do that carefully, because at the end of the day it’s mass and it’s inertia.
  249. </p><p>
  250. <strong>What does the near future look like for you?</strong>
  251. </p><p>
  252. <strong>Pratt:</strong> Building the team is really important—getting those first 10 to 20 people over the next few months. Then we’ll want to get some hardware and get going really quickly, maybe buying a couple of robot arms or something to get our behavior and learning pipelines going while in parallel starting our own robot design. From our experience, after getting a good team together and starting from a clean sheet, a new robot takes about a year to design and build. And then during that period we’ll be securing a customer or two or three.
  253. </p><p>
  254. <strong>Radford:</strong> We’re also working hard on some very high profile partnerships that could influence our early thinking dramatically. Like Jerry said earlier, it’s a massive 100,000 piece puzzle, and we’re working on the fundamentals: the people, the cash, and the customers.
  255. </p>]]></description><pubDate>Sun, 30 Jun 2024 13:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/persona-ai-radford-pratt</guid><category>Walking robots</category><category>Humanoid robots</category><category>Robotics startup</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-shadowy-rendering-of-a-humanoid-robot-standing-in-the-darkness.png?id=52520431&amp;width=980"></media:content></item><item><title>Why Not Give Robots Foot-Eyes?</title><link>https://spectrum.ieee.org/robot-camera-feet</link><description><![CDATA[
  256. <img src="https://spectrum.ieee.org/media-library/image.jpg?id=52515157&width=1200&height=800&coordinates=0%2C171%2C0%2C171"/><br/><br/><p><p>
  257. <em>This article is part of our exclusive <a href="https://spectrum.ieee.org/collections/journal-watch/" rel="noopener noreferrer" target="_self">IEEE Journal Watch series</a> in partnership with IEEE Xplore.</em>
  258. </p></p><p>One of the (many) great things about robots is that they don’t have to be constrained by how their biological counterparts do things. If you have a particular problem your robot needs to solve, you can get creative with extra sensors: many quadrupeds have side cameras and butt cameras for obstacle avoidance, and humanoids sometimes have chest cameras and knee cameras to help with navigation along with wrist cameras for manipulation. But how far can you take this? I have no idea, but it seems like we haven’t gotten to the end of things yet because now there’s <a href="https://ieeexplore.ieee.org/document/10543120" target="_blank">a quadruped with cameras on the bottom of its feet</a>.</p><hr/><p>Sensorized feet is not a new idea; it’s pretty common for quadrupedal robots to have some kind of foot-mounted force sensor to detect ground contact. Putting an actual camera down there is fairly novel, though, because it’s not at all obvious how you’d go about doing it. And the way that roboticists from the <a href="https://en.wikipedia.org/wiki/Southern_University_of_Science_and_Technology" target="_blank">Southern University of Science and Technology</a> in Shenzhen went about doing it is, indeed, not at all obvious.</p><p class="shortcode-media shortcode-media-youtube">
  259. <span class="rm-shortcode" data-rm-shortcode-id="240f938fb26acf4ee10129979ebb1cc4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GInCNivRO6s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  260. </p><p>Go1’s snazzy feetsies have soles made of transparent acrylic, with slightly flexible plastic structure supporting a 60 millimeter gap up to each camera (640x480 at 120 frames per second) with a quartet of LEDs to provide illumination. While it’s complicated looking, at 120 grams, it doesn’t weigh all that much, and costs only about $50 per foot ($42 of which is the camera). The whole thing is sealed to keep out dirt and water.</p><p>So why bother with all of this (presumably somewhat fragile) complexity? As we ask quadruped robots to do more useful things in more challenging environments, having more information about what exactly they’re stepping on and how their feet are interacting with the ground is going to be super helpful. Robots that rely only on <a href="https://eng.libretexts.org/Bookshelves/Mechanical_Engineering/Introduction_to_Autonomous_Robots_(Correll)/05%3A_Sensors/5.02%3A_Proprioception_of_Robot_Kinematics_and_Internal_forces" target="_blank">proprioceptive sensing</a> (sensing self-movement) are great and all, but when you start trying to move over complex surfaces like sand, it can be really helpful to have vision that explicitly shows how your robot is interacting with the surface that it’s stepping on. Preliminary results showed that Foot Vision enabled the <a data-linked-post="2653906652" href="https://spectrum.ieee.org/unitrees-go1-robot-dog-looks-pretty-great-costs-just-usd-2700" target="_blank">Go1</a> using it to perceive the flow of sand or soil around its foot as it takes a step, which can be used to estimate slippage, the bane of ground-contacting robots. </p><p>The researchers acknowledge that their hardware could use a bit of robustifying, and they also want to try adding some tread patterns around the circumference of the foot, since that plexiglass window is pretty slippery. The overall idea is to make Foot Vision as useful as the much more common gripper-integrated vision systems for robotic manipulation, helping legged robots make better decisions about how to get where they need to go.</p><em><em><a href="https://ieeexplore.ieee.org/document/10543120" target="_blank">Foot Vision: A Vision-Based Multi-Functional Sensorized Foot for Quadruped Robots</a></em></em>, by Guowei Shi, Chen Yao, Xin Liu, Yuntian Zhao, Zheng Zhu, and Zhenzhong Jia from Southern University of Science and Technology in Shenzhen, is accepted to the July 2024 issue of <a href="https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=7083369" rel="noopener noreferrer" target="_blank"><u><em><em>IEEE Robotics and Automation Letters</em></em></u></a><p>.</p>]]></description><pubDate>Fri, 28 Jun 2024 16:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/robot-camera-feet</guid><category>Cameras</category><category>Journal watch</category><category>Legged robots</category><category>Navigation</category><category>Obstacle avoidance</category><category>Robotics</category><category>Sensors</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/image.jpg?id=52515157&amp;width=980"></media:content></item><item><title>Video Friday: Humanoids Get a Job</title><link>https://spectrum.ieee.org/video-friday-humanoids-get-a-job</link><description><![CDATA[
  261. <img src="https://spectrum.ieee.org/media-library/image.jpg?id=52514897&width=1200&height=800&coordinates=0%2C3%2C0%2C3"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><h5><a href="https://icra40.ieee.org/">ICRA@40</a>: 23–26 September 2024, ROTTERDAM, NETHERLANDS</h5><h5><a href="https://iros2024-abudhabi.org/">IROS 2024</a>: 14–18 October 2024, ABU DHABI, UAE</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="ow9l0cuau_0">Agility has been working with GXO for a bit now, but the big news here (and it IS big news) is that Agility’s Digit robots at GXO now represent the first formal commercial deployment of humanoid robots.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e1c3486e18a71d6c6908a7264a5e8465" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/oW9L0CuAu_0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://agilityrobotics.com/content/gxo-signs-industry-first-multi-year-agreement-with-agility-robotics">GXO</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="2qacz8iuh2a">GXO can’t seem to get enough humanoids, because they’re also starting some R&D with <a href="https://apptronik.com/" target="_blank">Apptronik</a>.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4ea482a3ca6577fa50c91f0525c975c4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2QAcz8IUh2A?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://gxo.com/news_article/gxo-announces-multi-phase-r-and-d-initiative-with-apptronik/">GXO</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="6r4zxpjjdx8"><em>In this paper, we introduce a full-stack system for humanoids to learn motion and autonomous skills from human data. Through shadowing, human operators can teleoperate humanoids to collect whole-body data for learning different tasks in the real world. 
Using the data collected, we then perform supervised behavior cloning to train skill policies using egocentric vision, allowing humanoids to complete different tasks autonomously by imitating human skills.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7518e2a1add091699d5aa4ec86a76eb9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/6r4ZxpJjdx8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>THAT FACE.</p><p>[ <a href="https://humanoid-ai.github.io/">HumanPlus</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="z33q7mszdnu">Yeah these robots are impressive but it’s the sound effects that make it.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0f5aa634cf069a2d38a3542e7b51a8be" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/z33q7MSzdnU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">Deep Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="bgka32tlvxm"><em>Meet CARMEN, short for Cognitively Assistive Robot for Motivation and Neurorehabilitation–a small, tabletop robot designed to help people with mild cognitive impairment (MCI) learn skills to improve memory, attention, and executive functioning at home.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2d5c4981fa1497c654825e30c4e53a0d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/bGKA32TlVXM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://github.com/UCSD-RHC-Lab/CARMEN">CARMEN</a> ] via [ <a href="https://today.ucsd.edu/story/meet-carmen-a-robot-that-helps-people-with-mild-cognitive-impairment">UCSD</a> ]</p><p>Thanks, Ioana!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="70oqf2ninj0">The caption of this video is, “it did not work...”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bd041adbff4d6cb6a58098c4d1ac8121" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/70OqF2NiNJ0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>You had one job, e-stop person! ONE JOB!</p><p>[ <a href="https://yugu.faculty.wvu.edu/">WVUIRL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="m6aesxyw7do"><em>This is a demo of cutting wood with a saw. When using position control for this task, precise measurement of the cutting amount is necessary. 
However, by using impedance control, this requirement is eliminated, allowing for successful cutting with only rough commands.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9e2a7da7d13ed349de6af0876055d568" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/M6AEsXYw7do?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotics.tokyo/">Tokyo Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="2sjqcbayksw">This is mesmerizing.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b21b32270667ca6a57e4382af2ffcdb8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2sJQCBaYKsw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://decmbc.github.io/">Oregon State</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="jyd1rlrqrwm">Quadrupeds are really starting to look like the new hotness in bipedal locomotion.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="be40b2d24a69d7143afdccd2bbe723ac" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/JYD1RlrQRWM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://eps.leeds.ac.uk/mechanical-engineering/staff/1720/dr-chengxu-zhou">University of Leeds</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="gitxhbte1oe">I still think this is a great way of charging a robot. 
Make sure and watch until the end to see the detach trick.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b0936c18b97fd790200b2ce2465ad2dc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GITxhbte1oE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.youtube.com/@ObiJerome">YouTube</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="pdu2l__gwu0">The Oasa R1, now on Kickstarter for $1,200, is the world’s first robotic lawn mower that uses one of them old timey reely things for cutting.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c71bc3a84574cb617494702b05e1e1fb" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/PdU2L__Gwu0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.kickstarter.com/projects/1428942922/oasa-r1-the-premier-robotic-reel-mower-with-auto-mapping">Kickstarter</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="f9wbdwqk2ci">ICRA next year is in Atlanta!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="89b34358469187b4f8782c7df2007973" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/f9WbdWqK2cI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://2025.ieee-icra.org/">ICRA 2025</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="c0lustcsmg0">Our Skunk Works team developed a modified version of the SR-71 Blackbird, titled the M-21, which carried an uncrewed reconnaissance drone called the D-21. 
The D-21 was designed to capture intelligence, release its camera, then self-destruct!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c3d2d3f29f682f7f9b708e040a4e5a89" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/c0LUstcSMg0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.lockheedmartin.com/en-us/news/features/2023/80-years-of-skunk-works-innovation.html#1960">Lockheed Martin</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="6riextoxxku"><em>The RPD 35 is a robotic powerhouse that surveys, distributes, and drives wide-flange solar piles up to 19 feet in length.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="07181497dac3caf8306ee3c60685a3cd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/6RIExTOxXkU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.builtrobotics.com/">Built Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="r_tnfc0vpqo"><em>Field AI’s brain technology is enabling robots to autonomously explore oil and gas facilities, navigating throughout the site and inspecting equipment for anomalies and hazardous conditions.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f723b35ef92eb852d61a25d8cd300c62" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/R_tNFC0VPqo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://fieldai.com/">Field AI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="difucrvrdd4"><em>Husky Observer was recently deployed at a busy automotive rail yard to carry out various autonomous inspection tasks including measuring train car positions and RFID data collection from the offloaded train inventory.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cd488b6aca57d8b799f3558423a3294e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DifUCrVrdD4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://clearpathrobotics.com/husky-observer/">Clearpath</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="l8mfwhshx0q">If you’re going to try to land a robot on the Moon, it’s useful to have a little bit of the Moon somewhere to practice on.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e882d243314931e52fe99298ce31d698" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/l8mfWhshX0Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" 
width="100%"></iframe></span></p><p>[ <a href="https://www.astrobotic.com/astrobotic-unveils-terrestrial-moonscape-for-payload-testing/">Astrobotic</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ikbprj-akrs"><em>Would you swallow a micro-robot? In a gutsy demo, physician Vivek Kumbhari navigates Pillbot, a wireless, disposable robot swallowed onstage by engineer Alex Luebke, modeling how this technology can swiftly provide direct visualization of internal organs. Learn more about how micro-robots could move us past the age of invasive endoscopies and open up doors to more comfortable, affordable medical imaging.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a2f65b2cf8bcebffa8927048fbd71715" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/iKBPrJ-AKRs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ted.com/talks/alex_luebke_vivek_kumbhari_how_you_could_see_inside_your_body_with_a_micro_robot?rss=172BB350-0205">TED</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="uezthu4nhrs"><em>How will AI improve our lives in the years to come? From its inception six decades ago to its recent exponential growth, futurist Ray Kurzweil highlights AI’s transformative impact on various fields and explains his prediction for the singularity: the point at which human intelligence merges with machine intelligence.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5d7d7e3bb220b1da56080bcec962a41e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/uEztHu4NHrs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ted.com/talks/ray_kurzweil_the_last_6_decades_of_ai_and_what_comes_next">TED</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 28 Jun 2024 15:03:16 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-humanoids-get-a-job</guid><category>Agility robotics</category><category>Apptronik</category><category>Humanoid robots</category><category>Quadruped robots</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/image.jpg?id=52514897&amp;width=980"></media:content></item><item><title>Video Friday: Morphy Drone</title><link>https://spectrum.ieee.org/video-friday-morphy-drone</link><description><![CDATA[
  262. <img src="https://spectrum.ieee.org/media-library/image.gif?id=52485146&width=1200&height=800&coordinates=62%2C0%2C63%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><h5><a href="https://icra40.ieee.org/">ICRA@40</a>: 23–26 September 2024, ROTTERDAM, NETHERLANDS</h5><h5><a href="https://iros2024-abudhabi.org/">IROS 2024</a>: 14–18 October 2024, ABU DHABI, UAE</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="c6l7vklbc9k"><em>We present Morphy, a novel compliant and morphologically aware flying robot that integrates sensorized flexible joints in its arms, thus enabling resilient collisions at high speeds and the ability to squeeze through openings narrower than its nominal dimensions.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="932636379e5391370626ccfcb0486ca6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/C6l7Vklbc9k?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Morphy represents a new class of soft flying robots that can facilitate unprecedented resilience through innovations both in the “body” and “brain.” The novel soft body can, in turn, enable new avenues for autonomy. Collisions that previously had to be avoided have now become acceptable risks, while areas that are untraversable for a certain robot size can now be negotiated through self-squeezing. These novel bodily interactions with the environment can give rise to new types of embodied intelligence.</em></blockquote><p>[ <a href="https://www.autonomousrobotslab.com/">ARL</a> ]</p><p>Thanks, Kostas!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="8clybtfhkaw"><em>Segments of daily training for robots driven by reinforcement learning. Multiple tests done in advance for friendly service humans. The training includes some extreme tests. 
Please do not imitate!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2e4fd1ce9387c9c8915c1b54fd43c90d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8ClYBtfhkaw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/">Unitree</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="1s3gvkeo5gk">Sphero is not only still around, it’s making new STEM robots!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5faa26ae2c22efa5b9c5ca04b36152a7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/1s3gVKEo5Gk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sphero.com/products/sphero-bolt-plus">Sphero</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="qgkzagpmsba">Googly eyes mitigate all robot failures.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c71c84453fc7309b5fe41f1d935257d6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/qgKzAGPmsBA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://yugu.faculty.wvu.edu/home">WVUIRL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="wuzd1gyuikm">Here I am, without the ability or equipment (or desire) required to iron anything that I own, and Flexiv’s got robots out there ironing fancy leather car seats. </p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f83553f2ad42cbcc00e32059d6dd0e70" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wuzD1gYuIKM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://flexiv.prowly.com/271395-flexiv-smooths-the-way-rizon-robot-revolutionizes-car-seat-production">Flexiv</a> ]</p><p>Thanks, Noah!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="qj7ypt35qls"><em>We unveiled a significant leap forward in perception technology for our humanoid robot GR-1. 
The newly adapted pure-vision solution integrates bird’s-eye view, transformer models, and an occupancy network for precise and efficient environmental perception.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="aee7a78afa3893a1aab2c08583b7ffb3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/qJ7ypt35Qls?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://fourierintelligence.com/2024/06/19/fourier-unveils-gr-1s-breakthrough-in-vision-technology-for-enhanced-environmental-perception/">Fourier</a> ]</p><p>Thanks, Serin!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="hbyia3wbd2k"><em>LimX Dynamics’ humanoid robot CL-1 was launched in December 2023. It climbed stairs based on real-time terrain perception, two steps per stair. Four months later, in April 2024, the second demo video showcased CL-1 in the same scenario. It had advanced to climb the same stair, one step per stair.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="930dde1150322264a1daaabde7bdc319" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/hbYia3Wbd2k?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.limxdynamics.com/en">LimX Dynamics</a> ]</p><p>Thanks, Ou Yan!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="zslboxqxtsi"><em>New research from the University of Massachusetts Amherst shows that programming robots to create their own teams and voluntarily wait for their teammates results in faster task completion, with the potential to improve manufacturing, agriculture, and warehouse automation.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="466dabd166ed52e3de406b7d5b7d163f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zslbOXQXtSI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://hcrlab.gitlab.io/project/lvws/">HCRL</a> ] via [ <a href="https://www.umass.edu/news/article/umass-amherst-researchers-create-new-method-orchestrating-successful-collaboration">UMass Amherst</a> ]</p><p>Thanks, Julia!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="vetl-fqbyik"><em>LASDRA (Large-size Aerial Skeleton with Distributed Rotor Actuation system (ICRA18) is a scalable and modular aerial robot. 
It can assume a very slender, long, and dexterous form factor and is very lightweight.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="567ebd43adc4b424b3dfa54183e71325" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/VETL-fqbYik?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.inrol.snu.ac.kr/">SNU INRoL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="8nzopzuadjy"><em>We propose augmenting initially passive structures built from simple repeated cells, with novel active units to enable dynamic, shape-changing, and robotic applications. Inspired by metamaterials that can employ mechanisms, we build a framework that allows users to configure cells of this passive structure to allow it to perform complex tasks.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e59b6b1cdcd15f0da4723c87d150e6f7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8nzoPZUADJY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://interactive-structures.org/publications.html">CMU</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="welghdurdm4"><em>Testing autonomous exploration at the Exyn Office using Spot from Boston Dynamics. In this demo, Spot autonomously explores our flight space while on the hunt for one of our engineers.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9bea27f58cd566dfb6114b2c99a20c0b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/weLgHdurDM4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.exyn.com/">Exyn</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ut6v9lsup0y"><em>Meet Heavy Picker, the strongest robot in bulky-waste sorting and an absolute pro at lifting and sorting waste. With skills that would make a concert pianist jealous and a work ethic that never needs coffee breaks, Heavy Picker was on the lookout for new challenges.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6b96811fe312011895d24bc1d739bfe9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UT6v9lSUp0Y?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.terex.com/zenrobotics">Zen Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="_bn9ek6_f8u"><em>AI is the biggest and most consequential business, financial, legal, technological, and cultural story of our time. 
In this panel, you will hear from the underrepresented community of women scientists who have been leading the AI revolution—from the beginning to now.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0e288781d49a28e9b4311aecc93c1e9c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/_bn9Ek6_F8U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://hai.stanford.edu/">Stanford HAI</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 21 Jun 2024 18:33:48 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-morphy-drone</guid><category>Video friday</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/image.gif?id=52485146&amp;width=980"></media:content></item><item><title>Here’s the Most Buglike Robot Bug Yet</title><link>https://spectrum.ieee.org/flying-robot-bug</link><description><![CDATA[
  263. <img src="https://spectrum.ieee.org/media-library/a-video-shows-a-small-winged-robot-fly-down-and-land-on-a-conference-table-crawl-around-and-take-off-again.gif?id=52448644&width=1200&height=800&coordinates=52%2C0%2C53%2C0"/><br/><br/><p>Insects have long been an inspiration for robots. The insect world is full of things that are tiny, fully autonomous, highly mobile, energy efficient, multimodal, self-repairing, and I could go on and on but you get the idea—<a href="https://spectrum.ieee.org/tag/insect-robots" target="_blank">insects are both an inspiration</a> and a source of frustration to roboticists because it’s so hard to get robots to have anywhere close to insect capability. </p><p>We’re definitely making progress, though. In a paper published last month in <em><a href="https://www.ieee-ras.org/publications/ra-l" target="_blank">IEEE Robotics and Automation Letters</a></em>, roboticists from Shanghai Jiao Tong University demonstrated the most buglike robotic bug I think I’ve ever seen.</p><hr/><p class="shortcode-media shortcode-media-youtube">
  264. <span class="rm-shortcode" data-rm-shortcode-id="0cf5ff180de78f84d81e36eddfeb420e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Dech-vsAmUY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  265. <small class="image-media media-caption" placeholder="Add Photo Caption...">A Multi-Modal Tailless Flapping-Wing Robot</small>
  266. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">
  267. <a href="https://www.youtube.com/watch?v=Dech-vsAmUY" target="_blank">www.youtube.com</a>
  268. </small>
  269. </p><p>Okay so it may not <em>look</em> the most buglike, but it can do many very buggy bug things, including crawling, taking off horizontally, flying around (with six degrees of freedom control), hovering, landing, and self-righting if necessary. JT-fly weighs about 35 grams and has a wingspan of 33 centimeters, using four wings at once to fly at up to 5 meters per second and six legs to scurry at 0.3 m/s. Its 380 milliampere-hour battery powers it for an actually somewhat useful 8-ish minutes of flying and about 60 minutes of crawling. </p><p>While that amount of endurance may not sound like a lot, robots like these aren’t necessarily intended to be moving continuously. Rather, they move a little bit, find a nice safe perch, and then do some sensing or whatever until you ask them to move to a new spot. Ideally, most of that movement would be crawling, but having the option to fly makes JT-fly exponentially more useful.</p><p>Or, potentially more useful, because obviously this is still very much a research project. It does seem like there’s a bunch more optimization that could be done here. For example, JT-fly uses completely separate systems for flying and crawling, with two motors powering the legs and two additional motors powering the wings—plus two wing servos for control. There’s currently a limited amount of onboard autonomy, with an inertial measurement unit, barometer, and wireless communication, but otherwise not much in the way of useful payload. </p><p class="pull-quote">Insects are both an inspiration and a source of frustration to roboticists because it’s so hard to get robots to have anywhere close to insect capability.</p><p>It won’t surprise you to learn that the researchers have disaster-relief applications in mind for this robot, suggesting that “after natural disasters such as earthquakes and mudslides, roads and buildings will be severely damaged, and in these scenarios, JT-fly can rely on its flight ability to quickly deploy into the mission area.” One day, robots like these will actually be deployed for disaster relief, and although that day is not today, we’re just a little bit closer than we were before.</p><u><a href="https://ieeexplore.ieee.org/document/10490117" rel="noopener noreferrer" target="_blank">“A Multi-Modal Tailless Flapping-Wing Robot Capable of Flying, Crawling, Self-Righting and Horizontal Takeoff,”</a></u> by Chaofeng Wu, Yiming Xiao, Jiaxin Zhao, Jiawang Mou, Feng Cui, and Wu Liu from Shanghai Jiao Tong University, is published in the May issue of <em>IEEE Robotics and Automation Letters</em>.]]></description><pubDate>Wed, 19 Jun 2024 11:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/flying-robot-bug</guid><category>Drones</category><category>Insect robots</category><category>Natural disasters</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/a-video-shows-a-small-winged-robot-fly-down-and-land-on-a-conference-table-crawl-around-and-take-off-again.gif?id=52448644&amp;width=980"></media:content></item><item><title>Video Friday: Drone vs. Flying Canoe</title><link>https://spectrum.ieee.org/video-friday-drone-vs-flying-canoe</link><description><![CDATA[
  270. <img src="https://spectrum.ieee.org/media-library/image.png?id=52453952&width=1311&height=808&coordinates=609%2C272%2C0%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><h5><a href="https://icra40.ieee.org/">ICRA@40</a>: 23–26 September 2024, ROTTERDAM, NETHERLANDS</h5><h5><a href="https://iros2024-abudhabi.org/">IROS 2024</a>: 14–18 October 2024, ABU DHABI, UAE</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="e-ns-mxmjvu">There’s a Canadian legend about a flying canoe, because of course there is. The legend involves drunkenness, a party with some ladies, swearing, and a pact with the devil, because of course it does. Fortunately for the drone in this video, it needs none of that to successfully land on this (nearly) flying canoe, just some high-friction shock absorbing legs and judicious application of reverse thrust.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f75a6f6fd0cec6c024135e86467dcd4a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/E-ns-MxMJvU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.createk.co/">Createk</a> ]</p><p>Thanks, Alexis!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="qqqv2pfmhmo"><em>This paper summarizes an autonomous driving project by musculoskeletal humanoids. The musculoskeletal humanoid, which mimics the human body in detail, has redundant sensors and a flexible body structure. We reconsider the developed hardware and software of the musculoskeletal humanoid Musashi in the context of autonomous driving. The respective components of autonomous driving are conducted using the benefits of the hardware and software. 
Finally, Musashi succeeded in the pedal and steering wheel operations with recognition.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="283cd086e71070823cd352d64288f66f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/qQqv2pFMhmo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2406.05573">Paper</a> ] via [ <a href="http://www.jsk.t.u-tokyo.ac.jp/">JSK Lab</a> ]</p><p>Thanks, Kento!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="i78suprdkku">Robust AI has been kinda quiet for the last little while, but their Carter robot continues to improve.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="03facff91eb592cf436b971d8c8675f7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/I78suprDKkU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.robust.ai/" target="_blank">Robust AI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="fpiyv7civ6i"><em>One of the key arguments for building robots that have similar form factors to human beings is that we can leverage the massive human data for training. In this paper, we introduce a full-stack system for humanoids to learn motion and autonomous skills from human data. We demonstrate the system on our customized 33-degrees-of-freedom 180 centimeter humanoid, autonomously completing tasks such as wearing a shoe to stand up and walk, unloading objects from warehouse racks, folding a sweatshirt, rearranging objects, typing, and greeting another robot with 60-100 percent success rates using up to 40 demonstrations.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a49e91c1b93f779a2f7c58458fb598b3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FPiyv7CIV6I?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://humanoid-ai.github.io/">HumanPlus</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ofgxzhv0gmk"><em>We present OmniH2O (Omni Human-to-Humanoid), a learning-based system for whole-body humanoid teleoperation and autonomy. Using kinematic pose as a universal control interface, OmniH2O enables various ways for a human to control a full-sized humanoid with dexterous hands, including using real-time teleoperation through VR headset, verbal instruction, and RGB camera. 
OmniH2O also enables full autonomy by learning from teleoperated demonstrations or integrating with frontier models such as GPT-4.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="14a6f18851cbce9c7bb001e054cd786d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ofgxZHv0GMk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://omni.human2humanoid.com/">OmniH2O</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="xgga6bo8qec">A collaboration between Boxbot, Agility Robotics, and Robust.AI at Playground Global. Make sure and watch until the end to hear the roboticists in the background react when the demo works in a very roboticist way.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ea50dd828881e95dabc5d2e83ae6880c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Xgga6bO8qEc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>::clap clap clap:: yaaaaayyyyy....</p><p>[ <a href="https://www.robust.ai/">Robust AI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="a6zg8de4lys"><em>The use of drones and robotic devices threatens civilian and military actors in conflict areas. We started trials with robots to see how we can adapt our HEAT (Hostile Environment Awareness Training) courses to this new reality.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1ccc9f86a2c8b6ea75da52261e17256b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/A6Zg8DE4Lys?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.centreforsafety.org/">CSD</a> ]</p><p>Thanks, Ebe!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="l18x-qbxqpi"><em>How to make humanoids do versatile parkour jumping, clapping dance, cliff traversal, and box pick-and-move with a unified RL framework? 
We introduce WoCoCo: Whole-body humanoid Control with sequential Contacts</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7e8d51309ad7671e9bdfcd5f03f813ca" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/L18X-QbXqPI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://lecar-lab.github.io/wococo/">WoCoCo</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="fgoxfknvm2q">A selection of excellent demos from the Learning Systems and Robotics Lab at TUM and the University of Toronto.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="fd28a48551d9630a7e2a7509f27597fc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FgoXFknvM2Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dynsyslab.org/research/">Learning Systems and Robotics Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="mxwxdw7ehlo">Harvest Automation, one of the OG autonomous mobile robot companies, hasn’t updated their website since like 2016, but some videos just showed up on YouTube this week.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b397e678da3e333f5e8184f3c7691e23" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/MXwxdw7EHLo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.public.harvestai.com/">Harvest Automation</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="o6cdmxfkadm"><em>Northrop Grumman has been pioneering capabilities in the undersea domain for more than 50 years. Now, we are creating a new class of uncrewed underwater vehicles (UUV) with Manta Ray. 
Taking its name from the massive “winged” fish, Manta Ray will operate long-duration, long-range missions in ocean environments where humans can’t go.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="597703571b8d73ecb736817c54d537dc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/o6cDmXFkAdM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.northropgrumman.com/what-we-do/sea/manta-ray">Northrop Grumman</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="foyzilg4qh0">Akara Robotics’ autonomous robotic UV disinfection demo.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9e4620cf5da3c2ed7208436063822bd3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/fOYZiLg4Qh0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.akara.ai/">Akara Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="rlo4sfk37w4"><em>Scientists have computationally predicted hundreds of thousands of novel materials that could be promising for new technologies—but testing to see whether any of those materials can be made in reality is a slow process. Enter A-Lab, which uses robots guided by artificial intelligence to speed up the process.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0c26e283777610120a3786676eb24a53" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/RLO4sfK37w4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://newscenter.lbl.gov/2023/04/17/meet-the-autonomous-lab-of-the-future/">A-Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="oymf-yab2d0">We wrote about this research from CMU <a href="https://spectrum.ieee.org/video-friday-agile-but-safe" target="_blank">a while back</a>, but here’s a quite nice video.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c3f0e9c9e905ec41dc6b19861937d3f5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/oyMf-yaB2d0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ri.cmu.edu/collision-free-high-speed-robots/">CMU RI</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ku8wambwdc8">Aw yiss pick and place robots.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d8c206ab2c9488a8145c95ca1e619e75" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/KU8WaMbWDc8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a 
href="https://www.fanucamerica.com/">Fanuc</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="16ey4o0bpxg"><em>Axel Moore describes his lab’s work in orthopedic biomechanics to relieve joint pain with robotic assistance.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="35b46889dc32fa559a5e0a67a4f82879" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/16Ey4O0bPxg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.axelcmoore.com/">CMU</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="tpz2ppugtty"><em>The field of humanoid robots has grown in recent years with several companies and research laboratories developing new humanoid systems. However, the number of running robots did not noticeably rise. Despite the need for fast locomotion to quickly serve given tasks, which require traversing complex terrain by running and jumping over obstacles. To provide an overview of the design of humanoid robots with bioinspired mechanisms, this paper introduces the fundamental functions of the human running gait.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="68f4e82278dcd54077353215d768cbee" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/tpZ2PpUGTTY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ieeexplore.ieee.org/document/10375238">Paper</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 14 Jun 2024 17:12:13 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-drone-vs-flying-canoe</guid><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/image.png?id=52453952&amp;width=980"></media:content></item><item><title>The Mythical Non-Roboticist</title><link>https://spectrum.ieee.org/the-mythical-non-roboticist</link><description><![CDATA[
  271. <img src="https://spectrum.ieee.org/media-library/illustration-of-three-people-with-unicorn-heads-working-on-a-giant-robot.jpg?id=52208531&width=1200&height=800&coordinates=62%2C0%2C63%2C0"/><br/><br/><p>
  272. <em>The original version of this post by Benjie Holson was published on Substack <a href="https://generalrobots.substack.com/p/the-mythical-non-roboticist" target="_blank">here</a>, and includes </em><em>Benjie’s original comics as <a href="https://generalrobots.substack.com/" target="_blank">part of his series on robots and startups</a>.</em>
  273. </p><p>
  274. I worked on this idea for months before I decided it was a mistake. The second time I heard someone mention it, I thought, “That’s strange, these two groups had the same idea. Maybe I should tell them it didn’t work for us.” The third and fourth time I rolled my eyes and ignored it. The fifth time I heard about a group struggling with this mistake, I decided it was worth a blog post all on its own. I call this idea “The Mythical Non-Roboticist.”<br/>
  275. </p><hr/><h2>The Mistake</h2><p>
  276. The idea goes something like this: Programming robots is hard. And there are some people with really arcane skills and PhDs who are really expensive and seem to be required for some reason. Wouldn’t it be nice if we could do robotics without them?
  277. <a href="#1"><sup>1</sup></a> What if everyone could do robotics? That would be great, right? We should make a software framework so that non-roboticists can program robots.
  278. </p><p>
  279. This idea is so close to a correct idea that it’s hard to tell why it doesn’t work out. On the surface, it’s not
  280. <em>wrong</em>: All else being equal, it would be good if programming robots were more accessible. The problem is that we don’t have a good recipe for making working robots. So we don’t know how to make that recipe easier to follow. In order to make things simple, people end up removing things that folks might need, because no one knows for sure what’s absolutely required. It’s like saying you want to invent an invisibility cloak and want to be able to make it from materials you can buy from Home Depot. Sure, that would be nice, but if you invented an invisibility cloak that required some mercury and neodymium to manufacture, would you toss the recipe?
  281. </p><p>
  282. In robotics, this mistake is based on a very true and very real observation: Programming robots
  283. <em>is</em> super hard. Famously hard. It would be super great if programming robots was easier. The issue is this: Programming robots has two different kinds of hard parts.
  284. </p><h2>Robots are hard because the world is complicated</h2><p class="shortcode-media shortcode-media-rebelmouse-image">
  285. <img alt="Illustration of a robot foot stepping down towards a banana peel." class="rm-shortcode" data-rm-shortcode-id="2bbe2d6fffcd94831aabc656e3e81b84" data-rm-shortcode-name="rebelmouse-image" id="831cc" loading="lazy" src="https://spectrum.ieee.org/media-library/illustration-of-a-robot-photo-stepping-down-towards-a-banana-peel.jpg?id=52208546&width=980"/>
  286. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Moor Studio/Getty Images</small>
  287. </p><p>
  288. The first kind of hard part is that robots deal with the real world, imperfectly sensed and imperfectly actuated. Global mutable state is bad programming style because it’s really hard to deal with, but to robot software the entire physical world is global mutable state, and you only get to unreliably observe it and hope your actions approximate what you wanted to achieve. Getting robotics to work at all is often at the very limit of what a person can reason about, and requires the flexibility to employ whatever heuristic might work for your special problem. This is the
  289. <em> intrinsic </em>complexity of the problem: Robots live in complex worlds, and for every working solution there are millions of solutions that don’t work, and finding the right one is hard, and often very dependent on the task, robot, sensors, and environment.
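</p><p>
As a toy illustration of that “unreliably observe, imperfectly act” loop (a hypothetical sketch in plain Python, not any particular robot’s code), the controller below never sees the true state of the world and never gets exactly the motion it asked for:
</p><pre><code>
import random

true_position = 0.0   # the world's actual state, hidden from the robot
target = 1.0          # where we want the robot to end up

for _ in range(20):
    # Sensing is noisy: the robot only ever gets an estimate of the world.
    measured = true_position + random.gauss(0.0, 0.05)
    # It acts on that estimate with a simple proportional command.
    command = 0.5 * (target - measured)
    # Actuation is imperfect too: the world applies a scaled, noisy version.
    true_position += command * random.uniform(0.8, 1.2)

print(f"final position: {true_position:.3f} (wanted {target})")
</code></pre><p>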
  290. </p><p>
  291. Folks look at that challenge, see that it is super hard, and decide that, sure, maybe some fancy roboticist could solve it in one particular scenario, but what about “normal” people? “We should make this possible for non-roboticists” they say. I call these users “Mythical Non-Roboticists” because once they are programming a robot, I feel they
  292. <em>become</em> roboticists. Isn’t anyone programming a robot for a purpose a roboticist? Stop gatekeeping, people.
  293. </p><h2>Don’t design for amorphous groups</h2><p>
  294. I also call them “mythical” because usually the “non-roboticist” implied is a vague, amorphous group. Don’t design for amorphous groups. If you can’t name three real people (that you have talked to) that your API is for, then you are designing for an amorphous group and only amorphous people will like your API.
  295. </p><p>
  296. And with this hazy group of users in mind (and seeing how difficult everything is), folks think, “Surely we could make this easier for everyone else by papering over these things with simple APIs?”
  297. </p><p>
  298. No. No you can’t. Stop it.
  299. </p><p>
  300. You can’t paper over intrinsic complexity with simple APIs because
  301. <strong>if your APIs are simple they can’t cover the complexity of the problem</strong>. You will inevitably end up with a beautiful-looking API, with calls like “grasp_object” and “approach_person,” which demo nicely in a hackathon kickoff but last about 15 minutes of someone actually trying to get some work done. It will turn out that, for their particular application, “grasp_object()” makes 3 or 4 wrong assumptions about “grasp” <em>and</em> “object” and doesn’t work for them at all.
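</p><p>
A minimal sketch of how that plays out (hypothetical function and parameter names, not any real framework’s API): the demo-friendly call hides exactly the assumptions a real application has to override, which is where all those “missing” knobs and arguments come from.
</p><pre><code>
# The hackathon-demo version: looks lovely, but silently assumes a rigid
# object, a top-down approach, a two-finger gripper, and blocking execution.
def grasp_object(object_id):
    ...

# The version that survives contact with a real application: the same
# operation, with the assumptions surfaced as overridable defaults.
def grasp_object(
    object_id,
    approach_direction="top_down",  # side grasps for mugs on a shelf, etc.
    grip_force_newtons=20.0,        # deformable objects need much less
    pregrasp_offset_m=0.10,         # clearance before closing the gripper
    timeout_s=30.0,
    blocking=True,                  # or return a handle for async monitoring
):
    ...
</code></pre><p>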
  302. </p><h2>Your users are just as smart as you</h2><p>
  303. This is made worse by the pervasive assumption that these people are less savvy (read: less intelligent) than the creators of this magical framework.
  304. <a href="#2"><sup>2</sup></a> That feeling of superiority will cause the designers to cling desperately to their beautiful, simple “grasp_object()”s and resist adding the knobs and arguments needed to cover more use cases and allow the users to customize what they get.
  305. </p><p>
  306. Ironically, this foists a bunch of complexity onto the poor users of the API, who have to come up with clever workarounds to get it to work at all.
  307. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  308. <img alt="Illustration of a human and robot hand fitting puzzle pieces together in front of a brain" class="rm-shortcode" data-rm-shortcode-id="6bb28ed080b7a6ac68893ee2d530c6da" data-rm-shortcode-name="rebelmouse-image" id="7d7e9" loading="lazy" src="https://spectrum.ieee.org/media-library/illustration-of-a-human-and-robot-hand-fitting-puzzle-pieces-together-in-front-of-a-brain.jpg?id=52208580&width=980"/>
  309. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Moor Studio/Getty Images</small>
  310. </p><p>
  311. The sad, salty, bitter icing on this cake-of-frustration is that, even if done really well, the goal of this kind of framework would be to expand the group of people who can do the work. And to achieve that, it would sacrifice some performance you can only get by super-specializing your solution to your problem. If we lived in a world where expert roboticists could program robots that worked really well, but there was so much demand for robots that there just wasn’t enough time for those folks to do all the programming, this would be a great solution.
  312. <a href="#3"><sup>3</sup></a>
  313. </p><p>
  314. The obvious truth is that (outside of really constrained environments like manufacturing cells) even the very best collection of real bona fide, card-carrying roboticists working at the best of their ability struggle to get close to a level of performance that makes the robots commercially viable, even with long timelines and mountains of funding.
  315. <a href="#4"><sup>4</sup></a> We don’t have <em>any</em> headroom to sacrifice power and effectiveness for ease.
  316. </p><h2>What problem are we solving?</h2><p>
  317. So should we give up making it easier? Is robotic development available only to a small group of elites with fancy PhDs?
  318. <a href="#5"><sup>5</sup></a> No to both! I have worked with tons of undergrad interns who have been completely able to do robotics.<a href="#6"><sup>6</sup></a> I myself am mostly self-taught in robot programming.<a href="#7"><sup>7</sup></a> While there is a lot of intrinsic complexity in making robots work, I don’t think there is any more than, say, video game development.
  319. </p><p>
  320. In robotics, like in all things, experience helps, some things are teachable, and as you master many areas you can see things start to connect together. These skills are not magical or unique to robotics. We are not as special as we like to think we are.
  321. </p><p>
  322. But what about making programming robots easier? Remember way back at the beginning of the post when I said that there were two different kinds of hard parts? One is the intrinsic complexity of the problem, and that one will be hard no matter what.
  323. <a href="#8"><sup>8</sup></a> But the second is the incidental complexity, or as I like to call it, the stupid <a href="https://en.wikipedia.org/wiki/BS#:~:text=Bullshit%2C%20a%20phrase%20denoting%20something%20worthless" target="_blank">BS</a> complexity.
  324. </p><h2>Stupid BS Complexity</h2><p>
  325. Robots are asynchronous, distributed, real-time systems with weird hardware. All of that will be hard to configure for stupid BS reasons. Those drivers need to work in the weird flavor of Linux you want for hard real-time for your controls, and getting that all set up will be hard for stupid BS reasons. You are abusing Wi-Fi so you can roam seamlessly without interruption, but Linux’s Wi-Fi will not want to do that. Your log files are huge and you have to upload them somewhere so they don’t fill up your robot. You’ll need to integrate with some cloud something or other and deal with its stupid BS.
  326. <a href="#9"><sup>9</sup></a>
  327. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  328. <img alt="An illustration of a robot whose head has exploded off " class="rm-shortcode" data-rm-shortcode-id="5d8145dc2ed005a011a7b8cd4d21efe1" data-rm-shortcode-name="rebelmouse-image" id="d818f" loading="lazy" src="https://spectrum.ieee.org/media-library/an-illustration-of-a-robot-whose-head-has-exploded-off.jpg?id=52208747&width=980"/>
  329. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Moor Studio/Getty Images</small>
  330. </p><p>
  331. There is a ton of crap to deal with before you even get to the complexity of dealing with 3D rotation, moving reference frames, time synchronization, and messaging protocols. Those things have intrinsic complexity (you have to think about when something was observed and how to reason about it as other things have moved) and stupid BS complexity (There’s a weird bug because someone multiplied two transform matrices in the wrong order, and now you’re getting an error message that deep in some protocol a quaternion is not normalized. WTF does that mean?)
  332. <a href="#10"><sup>10</sup></a>
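</p><p>
For the curious, here is roughly what both flavors look like in one tiny sketch (plain Python, no robotics framework assumed): composing rotations in the wrong order gives you a genuinely different rotation, and repeated floating-point composition slowly denormalizes a quaternion until something downstream complains.
</p><pre><code>
import math

def quat_multiply(a, b):
    # Hamilton product of two quaternions given as (w, x, y, z) tuples.
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw * bw - ax * bx - ay * by - az * bz,
        aw * bx + ax * bw + ay * bz - az * by,
        aw * by - ax * bz + ay * bw + az * bx,
        aw * bz + ax * by - ay * bx + az * bw,
    )

def normalize(q):
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)

# 90-degree rotations about Z and about X, as unit quaternions.
rot_z = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
rot_x = (math.cos(math.pi / 4), math.sin(math.pi / 4), 0.0, 0.0)

# Order matters: these are two different rotations (intrinsic complexity).
print(quat_multiply(rot_z, rot_x))
print(quat_multiply(rot_x, rot_z))

# Floating-point error accumulates with repeated composition, so the norm
# drifts away from 1.0; renormalize before handing it to anything that
# checks (the stupid BS part is finding out via a cryptic error later).
q = rot_z
for _ in range(10000):
    q = quat_multiply(q, rot_x)
print(math.sqrt(sum(c * c for c in q)))
print(normalize(q))
</code></pre><p>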
  333. </p><p>
  334. One of the biggest challenges of robot programming is wading through the sea of stupid BS you need to wrangle in order to
  335. <em>start</em> working on your interesting and challenging robotics problem.
  336. </p><p>
  337. So a simple heuristic to make good APIs is:
  338. </p><div class="horizontal-rule">
  339. </div><p>
  340. <em>Design your APIs for someone as smart as you, but less tolerant of stupid BS.</em>
  341. </p><div class="horizontal-rule">
  342. </div><p>
  343. That feels universal enough that I’m tempted to call it
  344. <strong>Holson’s Law of Tolerable API Design</strong>.
  345. </p><p>
  346. When you are using tools you’ve made, you know them well enough to know the rough edges and how to avoid them.
  347. </p><p>
  348. But rough edges are things that have to be held in a programmer’s memory while they are using your system. If you insist on making a robotics framework
  349. <a href="#11"><sup>11</sup></a>, you should strive to make it as powerful as you can with the least amount of stupid BS. Eradicate incidental complexity everywhere you can. You want to make APIs that have maximum flexibility but good defaults. I like python’s default-argument syntax for this because it means you can write APIs that can be used like:
  350. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  351. <img alt="A screenshot of code" class="rm-shortcode" data-rm-shortcode-id="49db54aa7c3e3983bd6e628e2ca3cf3a" data-rm-shortcode-name="rebelmouse-image" id="e18db" loading="lazy" src="https://spectrum.ieee.org/media-library/a-screenshot-of-code.jpg?id=52208798&width=980"/>
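</p><p>
In the same spirit as the screenshot above, a small sketch of the idea with hypothetical names: the one-argument call stays simple, and every assumption is an overridable keyword with a sensible default.
</p><pre><code>
def move_to_pose(
    target_pose,
    frame="base",             # which reference frame the pose is expressed in
    max_velocity=0.25,        # m/s; conservative default, raise it when safe
    collision_checking=True,  # disable only inside carefully controlled cells
    timeout_s=60.0,
):
    """Hypothetical motion call: easy defaults, every assumption overridable."""
    print(f"moving to {target_pose} in frame {frame!r} at up to {max_velocity} m/s "
          f"(collision_checking={collision_checking}, timeout={timeout_s}s)")

pick_pose = (0.4, 0.0, 0.2)   # a stand-in pose, just for the example

# The easy thing is simple...
move_to_pose(pick_pose)

# ...and the complex thing is still possible, through the same call.
move_to_pose(pick_pose, frame="camera", max_velocity=0.05, timeout_s=10.0)
</code></pre><p>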
  352. </p><p>
  353. It is possible to have easy things be simple
  354. <em>and</em> allow complex things. And please, please, please don’t make condescending APIs. Thanks!
  355. </p><p class="rm-anchors" id="1">
  356. 1. Ironically it is very often the expensive arcane-knowledge-having PhDs who are proposing this.
  357. </p><p class="rm-anchors" id="2">
  358. 2. Why is it always a
  359. <a href="https://generalrobots.substack.com/p/tech-debt#footnote-anchor-3-118842509" target="_blank"> framework</a>?
  360. </p><p class="rm-anchors" id="3">
  361. 3. The exception that might prove the rule is things like traditional manufacturing-cell automation. That is a place where the solutions exist, but the limit to expansion is setup cost. I’m not an expert in this domain, but I’d worry that physical installation and safety compliance might still dwarf the software programming cost.
  362. </p><p class="rm-anchors" id="4">
  363. 4. As I well know from personal experience.
  364. </p><p class="rm-anchors" id="5">
  365. 5. Or non-fancy PhDs for that matter?
  366. </p><p class="rm-anchors" id="6">
  367. 6. I suspect that many bright high schoolers would also be able to do the work, though, as Google tends not to hire them, I don’t have good examples.
  368. </p><p class="rm-anchors" id="7">
  369. 7. My schooling was in Mechanical Engineering and I never got a PhD, though my ME classwork did include some programming fundamentals.
  370. </p><p class="rm-anchors" id="8">
  371. 8. Unless we create effective general-purpose AI. It feels weird that I have to add that caveat, but the possibility that it’s actually coming for robotics in my lifetime feels much more real than it did two years ago.
  372. </p><p class="rm-anchors" id="9">
  373. 9. And if you are unlucky, its API was designed by someone who thought they were smarter than their customers.
  374. </p><p class="rm-anchors" id="10">
  375. 10. This particular flavor of BS complexity is why I wrote
  376. <a href="https://github.com/robobenjie/posetree" target="_blank">posetree.py</a>. If you do robotics, you should check it out.
  377. </p><p class="rm-anchors" id="11">
  378. 11. Which, judging by the trail of dead robot-framework-companies, is a fraught thing to do.
  379. </p>]]></description><pubDate>Sun, 09 Jun 2024 13:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/the-mythical-non-roboticist</guid><category>Robotics</category><category>Api</category><category>Robot software</category><dc:creator>Benjie Holson</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/illustration-of-three-people-with-unicorn-heads-working-on-a-giant-robot.jpg?id=52208531&amp;width=980"></media:content></item><item><title>Video Friday: 1X Robots Tidy Up</title><link>https://spectrum.ieee.org/video-friday-1x-robots-tidy-up</link><description><![CDATA[
  380. <img src="https://spectrum.ieee.org/media-library/image.png?id=52427464&width=1200&height=800&coordinates=344%2C0%2C345%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><h5><a href="https://icra40.ieee.org/">ICRA@40</a>: 23–26 September 2024, ROTTERDAM, NETHERLANDS</h5><h5><a href="https://iros2024-abudhabi.org/">IROS 2024</a>: 14–18 October 2024, ABU DHABI, UNITED ARAB EMIRATES</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="bzn9o37frmq"><em>In this video, you see the start of 1X’s development of an advanced AI system that chains simple tasks into complex actions using voice commands, allowing seamless multi-robot control and remote operation. By starting with single-task models, we ensure smooth transitions to more powerful unified models, ultimately aiming to automate high-level actions using AI.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cef378335f3b21a7d912c9c4b8b6dea6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/bzn9O37fRMQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>This video does not contain teleoperation, computer graphics, cuts, video speedups, or scripted trajectory playback. It’s all controlled via neural networks.</em></blockquote><p>[ <a href="https://www.1x.tech/discover/ai-update-voice-commands-chaining-tasks">1X</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="nx2yo3twzys"><em>As the old adage goes, one cannot claim to be a true man without a visit to the Great Wall of China. XBot-L, a full-sized humanoid robot developed by Robot Era, recently acquitted itself well in a walk along sections of the Great Wall.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4f2f6aed6e8bcee6775c08c3fef54a51" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/nx2YO3twZYs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.robotera.com/">Robot Era</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ohlbp19uig0"><em>The paper presents a novel rotary wing platform, that is capable of folding and expanding its wings during flight. Our source of inspiration came from birds’ ability to fold their wings to navigate through small spaces and dive. 
The design of the rotorcraft is based on the monocopter platform, which is inspired by the flight of Samara seeds.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ac6d700a42bc95da06751f258ad64c53" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/oHlbp19UiG0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://airlab.sutd.edu.sg/">AirLab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="bdzxe0gnc7u"><em>We present a variable stiffness robotic skin (VSRS), a concept that integrates stiffness-changing capabilities, sensing, and actuation into a single, thin modular robot design. Reconfiguring, reconnecting, and reshaping VSRSs allows them to achieve new functions both on and in the absence of a host body.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d1121f533c87d4cac2477ffb67638ac0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/bDzxe0GNC7U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.eng.yale.edu/faboratory/">Yale Faboratory</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="x3cutqhxk-c"><em>Heimdall is a new rover design for the 2024 University Rover Challenge (URC). This video shows highlights of Heimdall’s trip during the four missions at URC 2024. </em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="61383241a5825fcc1e68c6a0607bc930" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/x3cuTqHxk-c?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Heimdall features a split body design with whegs (wheel legs), and a drill for sub-surface sample collection. It also has the ability to manipulate a variety of objects, collect surface samples, and perform onboard spectrometry and chemical tests.</em></blockquote><p>[ <a href="https://urc.orgs.wvu.edu/">WVU</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="dinbi7g6wbu">I think this may be the first time I’ve seen an autonomous robot using a train? 
This one is delivering lunch boxes!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="188950c53904b813a94bbd7b47c57ab9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DiNbI7G6wbU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.jsme.or.jp/career-mech/juniornews-m24utsunomiya/">JSME</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="jjrakwexh-a"><em>The AI system used identifies and separates red apples from green apples, after which a robotic arm picks up the red apples identified with a qb SoftHand Industry and gently places them in a basket.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d219c836c7f76e0623e2614b78ae95a7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/jjrAkWexh-A?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>My favorite part is the magnetic apple stem system.</p><p>[ <a href="https://qbrobotics.com/product/qb-softhand-research/">QB Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="oh_bt9byn6i"><em>DexNex (v0, June 2024) is an anthropomorphic teleoperation testbed for dexterous manipulation at the Center for Robotics and Biosystems at Northwestern University. DexNex recreates human upper-limb functionality through a near 1-to-1 mapping between Operator movements and Avatar actions.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b2e86f40fe5dc0fbfd6a35394895d61d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/OH_BT9byn6I?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Motion of the Operator’s arms, hands, fingers, and head are fed forward to the Avatar, while fingertip pressures, finger forces, and camera images are fed back to the Operator. DexNex aims to minimize the latency of each subsystem to provide a seamless, immersive, and responsive user experience. 
Future research includes gaining a better understanding of the criticality of haptic and vision feedback for different manipulation tasks; providing arm-level grounded force feedback; and using machine learning to transfer dexterous skills from the human to the robot.</em></blockquote><p>[ <a href="https://sites.northwestern.edu/hapticsgroup/robot-dexterity-3/">Northwestern</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="euvoh-wa-la">Sometimes the best path isn’t the smoothest or straightest surface, it’s the path that’s actually meant to be a path.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="378b098db11eba9fdca15b64d3a1c906" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/EUVoH-wA-lA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://railab.kaist.ac.kr/">RaiLab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="yze4zohgn0c"><em>Fulfilling a school requirement by working in a Romanian locomotive factory one week each month, Daniela Rus learned to operate “machines that help us make things.” Appreciation for the practical side of math and science stuck with Daniela, who is now Director of the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL).</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3396df31f056ff281703e2a8f596a210" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Yze4zOhGN0c?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://danielarus.csail.mit.edu/">MIT</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="pxe0twct-ka"><em>For AI to achieve its full potential, non-experts need to be let into the development process, says Rumman Chowdhury, CEO and cofounder of Humane Intelligence. 
She tells the story of farmers fighting for the right to repair their own AI-powered tractors (which some manufacturers actually made illegal), proposing everyone should have the ability to report issues, patch updates or even retrain AI technologies for their specific uses.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c31b55d878b2958b8c87c76077a562b0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/PxE0TWcT-kA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ted.com/talks/rumman_chowdhury_your_right_to_repair_ai_systems">TED</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 07 Jun 2024 16:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-1x-robots-tidy-up</guid><category>Artificial intelligence</category><category>Haptic feedback</category><category>Humanoid robot</category><category>Robotics</category><category>Video friday</category><category>Path planning</category><category>Mars rovers</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/image.png?id=52427464&amp;width=980"></media:content></item><item><title>Video Friday: Multitasking</title><link>https://spectrum.ieee.org/video-friday-multitasking</link><description><![CDATA[
  381. <img src="https://spectrum.ieee.org/media-library/image.png?id=52361248&width=1200&height=800&coordinates=150%2C0%2C150%2C0"/><br/><br/><p>
  382. Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/>
  383. </p><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>
  384. Enjoy today’s videos!
  385. </p><div class="horizontal-rule">
  386. </div><div style="page-break-after: always">
  387. <span style="display:none"> </span>
  388. </div><p class="rm-anchors" id="j4tj1fz-qoa">
  389. Do you have trouble multitasking? Cyborgize yourself through muscle stimulation to automate repetitive physical tasks while you focus on something else.
  390. </p><p class="shortcode-media shortcode-media-youtube">
  391. <span class="rm-shortcode" data-rm-shortcode-id="3f730b293cc15e0f678da712f58eb25d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/J4tJ1FZ-QoA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  392. </p><p>
  393. [ <a href="https://lab.plopes.org/#SplitBody">SplitBody</a> ]
  394. </p><div class="horizontal-rule">
  395. </div><p>
  396. By combining a 5,000 frame-per-second (FPS) event camera with a 20-FPS RGB camera, roboticists from the University of Zurich have developed a much more effective vision system that keeps autonomous cars from crashing into stuff, as described in the current issue of <i>Nature</i>.
  397. </p><p class="shortcode-media shortcode-media-youtube">
  398. <span class="rm-shortcode" data-rm-shortcode-id="08ed05a05e4a7dccbfe00750d11c5036" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dwzGhMQCc4Y?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  399. </p><p>
  400. [ <a href="https://www.nature.com/articles/s41586-024-07409-w">Nature</a> ]
  401. </p><div class="horizontal-rule">
  402. </div><blockquote class="rm-anchors" id="59qgzzsd1tk">
  403. <em>Mitsubishi Electric has been awarded the GUINNESS WORLD RECORDS title for the fastest robot to solve a puzzle cube. The robot’s time of 0.305 second beat the previous record of 0.38 second, for which it received a GUINNESS WORLD RECORDS certificate on 21 May 2024.</em>
  404. </blockquote><p class="shortcode-media shortcode-media-youtube">
  405. <span class="rm-shortcode" data-rm-shortcode-id="03eabb8ef0077a2ea5fa381c745a319e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/59qgzzSD1tk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  406. </p><p>
  407. [ <a href="https://www.mitsubishielectric.com/news/2024/0523.html">Mitsubishi</a> ]
  408. </p><div class="horizontal-rule">
  409. </div><p class="rm-anchors" id="p_nsv3a8uwm">
  410. Sony’s AIBO is celebrating its 25th anniversary, which seems like a long time, and it is. But back then, the original AIBO could check your email for you. Email! In 1999!
  411. </p><p class="shortcode-media shortcode-media-youtube">
  412. <span class="rm-shortcode" data-rm-shortcode-id="c183018a668dc7b1708ef866e9756324" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/p_nSv3A8UWM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  413. </p><p>
  414. I miss Hotmail.
  415. </p><p>
  416. [ <a href="https://us.aibo.com/">AIBO</a> ]
  417. </p><div class="horizontal-rule">
  418. </div><p class="rm-anchors" id="ol-2g5xa5zu">
  419. SchniPoSa: schnitzel with french fries and a salad.
  420. </p><p class="shortcode-media shortcode-media-youtube">
  421. <span class="rm-shortcode" data-rm-shortcode-id="95bf77a1d653c5a0a20e1bb5ea01dd9f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Ol-2G5xa5ZU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  422. </p><p>
  423. [ <a href="https://www.dino-robotics.com/">Dino Robotics</a> ]
  424. </p><div class="horizontal-rule">
  425. </div><p class="rm-anchors" id="716p6zg7nlc">
  426. Cloth-folding is still a really hard problem for robots, but progress was made at ICRA!
  427. </p><p class="shortcode-media shortcode-media-youtube">
  428. <span class="rm-shortcode" data-rm-shortcode-id="113e101880bb6bb7e9b79e4e0e496208" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/716P6zG7NLc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  429. </p><p>
  430. [ <a href="https://airo.ugent.be/cloth_competition/">ICRA Cloth Competition</a> ]
  431. </p><p>
  432. Thanks, Francis!
  433. </p><div class="horizontal-rule">
  434. </div><blockquote class="rm-anchors" id="5s9dzqv5qac">
  435. <em>MIT CSAIL researchers enhance robotic precision with sophisticated tactile sensors in the palm and agile fingers, setting the stage for improvements in human-robot interaction and prosthetic technology.</em>
  436. </blockquote><p class="shortcode-media shortcode-media-youtube">
  437. <span class="rm-shortcode" data-rm-shortcode-id="4b6779b6184c074c1c243be1ae59c977" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5S9dZQv5qAc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  438. </p><p>
  439. [ <a href="https://news.mit.edu/2024/robotic-palm-mimics-human-touch-0520">MIT</a> ]
  440. </p><div class="horizontal-rule">
  441. </div><blockquote class="rm-anchors" id="wmgurhz2tps">
  442. <em>We present a novel adversarial attack method designed to identify failure cases in any type of locomotion controller, including state-of-the-art reinforcement-learning-based controllers. Our approach reveals the vulnerabilities of black-box neural network controllers, providing valuable insights that can be leveraged to enhance robustness through retraining.</em>
  443. </blockquote><p class="shortcode-media shortcode-media-youtube">
  444. <span class="rm-shortcode" data-rm-shortcode-id="c81227a11d434378b9ede552d046fc6c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/WMgUrhZ2Tps?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  445. </p><p>
  446. [ <a href="https://fanshi14.github.io/me/rss24.html">Fan Shi</a> ]
  447. </p><div class="horizontal-rule">
  448. </div><blockquote class="rm-anchors" id="xhg6gp68o_w">
  449. <em>In this work, we investigate a novel integrated flexible OLED display technology used as a robotic skin-interface to improve robot-to-human communication in a real industrial setting at Volkswagen or a collaborative human-robot interaction task in motor assembly. The interface was implemented in a workcell and validated qualitatively with a small group of operators (n=9) and quantitatively with a large group (n=42). The validation results showed that using flexible OLED technology could improve the operators’ attitude toward the robot; increase their intention to use the robot; enhance their perceived enjoyment, social influence, and trust; and reduce their anxiety.</em>
  450. </blockquote><p class="shortcode-media shortcode-media-youtube">
  451. <span class="rm-shortcode" data-rm-shortcode-id="a7fccc2d1f2bb8a9e1f9eaa84fe5250d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xhG6Gp68O_w?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  452. </p><p>
  453. [ <a href="https://link.springer.com/article/10.1007/s41315-024-00343-0">Paper</a> ]
  454. </p><p>
  455. Thanks, Bram!
  456. </p><div class="horizontal-rule">
  457. </div><blockquote class="rm-anchors" id="pmwdonno_mc">
  458. <em>We introduce InflatableBots, shape-changing inflatable robots for large-scale encountered-type haptics in VR. Unlike traditional inflatable shape displays, which are immobile and limited in interaction areas, our approach combines mobile robots with fan-based inflatable structures. This enables safe, scalable, and deployable haptic interactions on a large scale.</em>
  459. </blockquote><p class="shortcode-media shortcode-media-youtube">
  460. <span class="rm-shortcode" data-rm-shortcode-id="570c7e08b221f00543a8198e52aaadde" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/PmwdonNO_Mc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  461. </p><p>
  462. [ <a href="https://ryosuzuki.org/inflatablebots/">InflatableBots</a> ]
  463. </p><div class="horizontal-rule">
  464. </div><blockquote class="rm-anchors" id="ssxpx_lwyvc">
  465. <em>We present a bioinspired passive dynamic foot in which the claws are actuated solely by the impact energy. Our gripper simultaneously resolves the issue of smooth absorption of the impact energy and fast closure of the claws by linking the motion of an ankle linkage and the claws through soft tendons.</em>
  466. </blockquote><p class="shortcode-media shortcode-media-youtube">
  467. <span class="rm-shortcode" data-rm-shortcode-id="aceb07c3550036b253dfa85614078d41" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/SSxPX_lwyVc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  468. </p><p>
  469. [ <a href="https://ieeexplore.ieee.org/abstract/document/10328663">Paper</a> ]
  470. </p><div class="horizontal-rule">
  471. </div><blockquote class="rm-anchors" id="o1waeime24a">
  472. <em>In this video, a 3-UPU exoskeleton robot for a wrist joint is designed and controlled to perform wrist extension, flexion, radial-deviation, and ulnar-deviation motions in stroke-affected patients. This is the first time a 3-UPU robot has been used effectively for any kind of task.</em>
  473. </blockquote><p class="shortcode-media shortcode-media-youtube">
  474. <span class="rm-shortcode" data-rm-shortcode-id="833e34c4e76d2df8441400a64d2c85ac" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/o1WaEime24A?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  475. </p><p>“UPU” stands for “universal-prismatic-universal” and refers to the actuators—the prismatic joints between two universal joints.
  476. </p><p>
  477. [ <a href="http://www.ir.bas.bg/index_en.html">BAS</a> ]
  478. </p><p>
  479. Thanks, Tony!
  480. </p><div class="horizontal-rule">
  481. </div><blockquote class="rm-anchors" id="ufl45we2pfy">
  482. <em>BRUCE Got Spot-ted at ICRA2024.</em>
  483. </blockquote><p class="shortcode-media shortcode-media-youtube">
  484. <span class="rm-shortcode" data-rm-shortcode-id="3f19a8800b806f2c30cb66fea8e54160" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UFL45wE2pFY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  485. </p><p>
  486. [ <a href="https://www.westwoodrobotics.io/bruce/">Westwood Robotics</a> ]
  487. </p><div class="horizontal-rule">
  488. </div><p class="rm-anchors" id="d6l2n2agpqy">
  489. Parachutes: maybe not as good of an idea for drones as you might think.
  490. </p><p class="shortcode-media shortcode-media-youtube">
  491. <span class="rm-shortcode" data-rm-shortcode-id="cd8bb7e247a8f996c5d05acb68fdf028" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/d6L2N2agpqY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  492. </p><p>
  493. [ <a href="https://wing.com/">Wing</a> ]
  494. </p><div class="horizontal-rule">
  495. </div><blockquote class="rm-anchors" id="fdgw3983cvc">
  496. <em>In this paper, we propose a system for the artist-directed authoring of stylized bipedal walking gaits, tailored for execution on robotic characters.  To demonstrate the utility of our approach, we animate gaits for a custom, free-walking robotic character, and show, with two additional in-simulation examples, how our procedural animation technique generalizes to bipeds with different degrees of freedom, proportions, and mass distributions.</em>
  497. </blockquote><p class="shortcode-media shortcode-media-youtube">
  498. <span class="rm-shortcode" data-rm-shortcode-id="b29d8f9a7b403f99d5cbc646ecda464d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FDgW3983Cvc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  499. </p><p>
  500. [ <a href="https://studios.disneyresearch.com/research/">Disney Research</a> ]
  501. </p><div class="horizontal-rule">
  502. </div><blockquote class="rm-anchors" id="78dn6b0dnka">
  503. <em>The European drone project Labyrinth aims to keep new and conventional air traffic separate, especially in busy airspaces such as those expected in urban areas. The project provides a new drone-traffic service and illustrates its potential to improve the safety and efficiency of civil land, air, and sea transport, as well as emergency and rescue operations.</em>
  504. </blockquote><p class="shortcode-media shortcode-media-youtube">
  505. <span class="rm-shortcode" data-rm-shortcode-id="5fa1f2ea29faaabc37cb6a715f04a5a3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/78dN6B0DnKA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  506. </p><p>
  507. [ <a href="https://www.dlr.de/en/fl/research-transfer/projects/labyrinth">DLR</a> ]
  508. </p><div class="horizontal-rule">
  509. </div><p class="rm-anchors" id="kguth-plq00">
  510. This Carnegie Mellon University Robotics Institute seminar, by Kim Baraka at Vrije Universiteit Amsterdam, is on the topic “Why We Should Build Robot Apprentices and Why We Shouldn’t Do It Alone.”
  511. </p><p class="shortcode-media shortcode-media-youtube">
  512. <span class="rm-shortcode" data-rm-shortcode-id="b1e4c11f24d182ae79fa870c799a59c2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/kguTh-PlQ00?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  513. </p><blockquote>
  514. <em>For robots to be able to truly integrate human-populated, dynamic, and unpredictable environments, they will have to have strong adaptive capabilities. In this talk, I argue that these adaptive capabilities should leverage interaction with end users, who know how (they want) a robot to act in that environment. I will present an overview of my past and ongoing work on the topic of human-interactive robot learning, a growing interdisciplinary subfield that embraces rich, bidirectional interaction to shape robot learning. I will discuss contributions on the algorithmic, interface, and interaction design fronts, showcasing several collaborations with animal behaviorists/trainers, dancers, puppeteers, and medical practitioners.</em>
  515. </blockquote><p>
  516. [ <a href="https://www.ri.cmu.edu/event/why-we-should-build-robot-apprentices-and-why-we-shouldnt-do-it-alone/">CMU RI</a> ]
  517. </p><div class="horizontal-rule">
  518. </div>]]></description><pubDate>Fri, 31 May 2024 16:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-multitasking</guid><category>Event cameras</category><category>Robotics</category><category>Video friday</category><category>Tactile sensors</category><category>Quadruped robots</category><category>Aibo</category><category>Robot arms</category><category>Drones</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/image.png?id=52361248&amp;width=980"></media:content></item><item><title>Will Scaling Solve Robotics?</title><link>https://spectrum.ieee.org/solve-robotics</link><description><![CDATA[
  519. <img src="https://spectrum.ieee.org/media-library/a-set-of-24-images-show-different-types-of-robot-arms-performing-manipulation-tasks-with-objects-like-toys-cups-and-towels.jpg?id=52322881&width=1200&height=800&coordinates=126%2C0%2C127%2C0"/><br/><br/><p>
  520. <em>This post was originally published on the author’s <a href="https://nishanthjkumar.com/Will-Scaling-Solve-Robotics-Perspectives-from-CoRL-2023/" target="_blank">personal blog</a>.</em>
  521. </p><p>
  522. Last year’s
  523. <a href="https://www.corl2023.org/" rel="noopener noreferrer" target="_blank">Conference on Robot Learning (CoRL)</a> was the biggest CoRL yet, with over 900 attendees, 11 workshops, and almost 200 accepted papers. While there were <em>a lot</em> of cool new ideas (see <a href="https://seungchan-kim.github.io/notes/CoRL_2023_Note.pdf" rel="noopener noreferrer" target="_blank">this great set of notes</a> for an overview of technical content), one particular debate seemed to be front and center: Is training a large neural network on a very large dataset a feasible way to solve robotics?<a href="#ref1" rel="noopener noreferrer"><sup>1</sup></a>
  524. </p><p>
  525. Of course, some version of this question has been on researchers’ minds for a few years now. However, in the aftermath of the unprecedented success of
  526. <a href="https://chat.openai.com/" rel="noopener noreferrer" target="_blank">ChatGPT</a> and other large-scale “<a href="https://arxiv.org/abs/2108.07258" rel="noopener noreferrer" target="_blank">foundation models</a>” on tasks that were thought to be unsolvable just a few years ago, the question was especially topical at this year’s CoRL. Developing a general-purpose robot, one that can competently and robustly execute a wide variety of tasks of interest in <em>any</em> home or office environment that humans can, has been perhaps the holy grail of robotics since the inception of the field. And given the recent progress of foundation models, it seems possible that scaling existing network architectures by training them on very large datasets might actually be the key to that grail.
  527. </p><p>
  528. Given how timely and significant this debate seems to be, I thought it might be useful to write a post centered around it. My main goal here is to try to present the different sides of the argument as I heard them, without bias towards any side. Almost all the content is taken directly from talks I attended or conversations I had with fellow attendees. My hope is that this serves to deepen people’s understanding around the debate, and maybe even inspire future research ideas and directions.
  529. </p><hr/><p>
  530. I want to start by presenting the main arguments I heard in favor of scaling as a solution to robotics.
  531. </p><h2>Why Scaling Might Work<a href="https://nishanthjkumar.com/Will-Scaling-Solve-Robotics-Perspectives-from-CoRL-2023/#why-scaling-might-work" rel="noopener noreferrer" target="_blank" title="Permalink"><span></span></a></h2><ul>
  532. <li><strong>It worked for Computer Vision (CV) and Natural Language Processing (NLP), so why not robotics?</strong> This was perhaps the most common argument I heard, and the one that seemed to excite most people given recent models like <a href="https://openai.com/research/gpt-4v-system-card" target="_blank">GPT4-V</a> and <a href="https://segment-anything.com/" target="_blank">SAM</a>. The point here is that training a large model on an extremely large corpus of data has recently led to astounding progress on problems thought to be intractable just 3 to 4 years ago. Moreover, doing this has led to a number of <em>emergent</em> capabilities, where trained models are able to perform well at a number of tasks they weren’t explicitly trained for. Importantly, the fundamental method here of training a large model on a very large amount of data is <em>general</em> and not somehow unique to CV or NLP. Thus, there seems to be no reason why we shouldn’t observe the same incredible performance on robotics tasks.
  533. <ul>
  534. <li><strong>We’re already starting to see some evidence that this might work well</strong>: <a href="https://ai.stanford.edu/~cbfinn/" rel="noopener noreferrer" target="_blank">Chelsea Finn</a>, <a href="https://vincent.vanhoucke.com/" rel="noopener noreferrer" target="_blank">Vincent Vanhoucke</a>, and several others pointed to the recent <a href="https://deepmind.google/discover/blog/scaling-up-learning-across-many-different-robot-types/" rel="noopener noreferrer" target="_blank">RT-X</a> and <a href="https://deepmind.google/discover/blog/rt-2-new-model-translates-vision-and-language-into-action/" rel="noopener noreferrer" target="_blank">RT-2</a> papers from Google DeepMind as evidence that training a single model on large amounts of robotics data yields promising generalization capabilities. <a href="https://groups.csail.mit.edu/locomotion/russt.html" rel="noopener noreferrer" target="_blank">Russ Tedrake</a> of Toyota Research Institute (TRI) and MIT pointed to the recent <a href="https://arxiv.org/abs/2303.04137" rel="noopener noreferrer" target="_blank">Diffusion Policies</a> paper as showing a similar surprising capability. <a href="https://people.eecs.berkeley.edu/~svlevine/" rel="noopener noreferrer" target="_blank">Sergey Levine</a> of UC Berkeley highlighted <a href="https://general-navigation-models.github.io/" rel="noopener noreferrer" target="_blank">recent efforts and successes from his group</a> in building and deploying a robot-agnostic foundation model for navigation. All of these works are somewhat preliminary in that they train a relatively small model with a paltry amount of data compared to something like GPT4-V, but they certainly do seem to point to the fact that scaling up these models and datasets could yield impressive results in robotics.</li>
  535. </ul>
  536. </li>
  537. <li><strong>Progress in data, compute, and foundation models are waves that we should ride</strong>: This argument is closely related to the above one, but distinct enough that I think it deserves to be discussed separately. The main idea here comes from <a href="http://www.incompleteideas.net/IncIdeas/BitterLesson.html" rel="noopener noreferrer" target="_blank">Rich Sutton’s influential essay</a>: The history of AI research has shown that relatively simple algorithms that scale well with data always outperform more complex/clever algorithms that do not. A nice analogy from Karol Hausman’s early career keynote is that improvements to data and compute are like a wave that is bound to happen given the progress and adoption of technology. Whether we like it or not, there will be more data and better compute. As AI researchers, we can either choose to ride this wave, or we can ignore it. Riding this wave means recognizing all the progress that’s happened because of large data and large models, and then developing algorithms, tools, datasets, etc. to take advantage of this progress. It also means leveraging large pre-trained models from vision and language that currently exist or will exist for robotics tasks.</li>
  538. <li><strong>Robotics tasks of interest lie on a relatively simple manifold, and training a large model will help us find it</strong>: This was something rather interesting that Russ Tedrake pointed out during a debate in the <a href="https://corl2023deployable.github.io/" rel="noopener noreferrer" target="_blank">workshop on robustly deploying learning-based solutions</a>. The <a href="https://en.wikipedia.org/wiki/Manifold_hypothesis" rel="noopener noreferrer" target="_blank">manifold hypothesis</a> as applied to robotics roughly states that, while the space of possible tasks we could conceive of having a robot do is impossibly large and complex, the tasks that <em>actually</em> occur practically in our world lie on some much lower-dimensional and simpler manifold of this space. By training a single model on large amounts of data, we might be able to discover this manifold. If we believe that such a manifold exists for robotics—which certainly seems intuitive—then this line of thinking would suggest that robotics is not somehow <em>different</em> from CV or NLP in any fundamental way. The same recipe that worked for CV and NLP should be able to discover the manifold for robotics and yield a shockingly competent generalist robot. Even if this doesn’t exactly happen, Tedrake points out that attempting to train a large model for general robotics tasks could teach us important things about the manifold of robotics tasks, and perhaps we can leverage this understanding to solve robotics.</li>
  539. <li><strong>Large models are the best approach we have to get at “commonsense” capabilities, which pervade all of robotics</strong>: Another thing Russ Tedrake pointed out is that “common sense” pervades almost every robotics task of interest. Consider the task of having a mobile manipulation robot place a mug onto a table. Even if we ignore the challenging problems of finding and localizing the mug, there are a surprising number of subtleties to this problem. What if the table is cluttered and the robot has to move other objects out of the way? What if the mug accidentally falls on the floor and the robot has to pick it up again, re-orient it, and place it on the table? And what if the mug has something in it, so it’s important it’s never overturned? These “edge cases” are actually much more common than it might seem, and often are the difference between success and failure for a task. Moreover, these seem to require some sort of ‘common sense’ reasoning to deal with. Several people argued that large models trained on a large amount of data are the best way we know of to yield some aspects of this ‘common sense’ capability. Thus, they might be the best way we know of to solve general robotics tasks.</li>
  540. </ul><p>
  541. As you might imagine, there were a number of arguments against scaling as a practical solution to robotics. Interestingly, almost no one directly disputes that this approach
  542. <em>could</em> work in theory. Instead, most arguments fall into one of two buckets: (1) arguing that this approach is simply <em>impractical</em>, and (2) arguing that even if it does kind of work, it won’t really “solve” robotics.
  543. </p><div class="horizontal-rule">
  544. </div><h2>Why Scaling Might Not Work<a href="https://nishanthjkumar.com/Will-Scaling-Solve-Robotics-Perspectives-from-CoRL-2023/#why-scaling-might-not-work" rel="noopener noreferrer" target="_blank" title="Permalink"><span></span></a></h2><h3>It’s impractical</h3><ul>
  545. <li><strong>We currently just <em>don’t have</em> much robotics data, and there’s no clear way we’ll get it</strong>: This is the elephant in pretty much every large-scale robot learning room. The Internet is chock-full of data for CV and NLP, but not at all for robotics. <a href="https://robotics-transformer-x.github.io/#:~:text=We%20introduce%20the%20Open%20X,bi%2Dmanual%20robots%20and%20quadrupeds." target="_blank">Recent efforts to collect very large datasets</a> have required tremendous amounts of time, money, and cooperation, yet have yielded a very small fraction of the amount of vision and text data on the Internet. CV and NLP got so much data because they had an incredible “data flywheel”: tens of millions of people connecting to and using the Internet. Unfortunately for robotics, there seems to be no reason why people would upload a bunch of sensory input and corresponding action pairs. Collecting a very large robotics dataset seems quite hard, and given that we know that a lot of important “emergent” properties only showed up in vision and language models at scale, the inability to get a large dataset could render this scaling approach hopeless.</li>
  546. <li><strong>Robots have different embodiments</strong>: Another challenge with collecting a very large robotics dataset is that robots come in a large variety of different shapes, sizes, and form factors. The output control actions that are sent to a <a href="https://bostondynamics.com/products/spot/" target="_blank">Boston Dynamics Spot robot</a> are very different to those sent to a <a href="https://www.kuka.com/en-us/products/robotics-systems/industrial-robots/lbr-iiwa" rel="noopener noreferrer" target="_blank">KUKA iiwa arm</a>. Even if we ignore the problem of finding some kind of common output space for a large trained model, the variety in robot embodiments means we’ll probably have to collect data from each robot type, and that makes the above data-collection problem even harder.</li>
  547. <li><strong>There is extremely large variance in the environments we want robots to operate in</strong>: For a robot to really be “general purpose,” it must be able to operate in any practical environment a human might want to put it in. This means operating in <em>any</em> possible home, factory, or office building it might find itself in. Collecting a dataset that has even just one example of every possible building seems impractical. Of course, the hope is that we would only need to collect data in a small fraction of these, and the rest will be handled by generalization. However, we don’t <em>know</em> how much data will be required for this generalization capability to kick in, and it very well could also be impractically large.</li>
  548. <li><strong>Training a model on such a large robotics dataset might be too expensive/energy-intensive</strong>: It’s no secret that training large foundation models is expensive, both in terms of money and in energy consumption. GPT-4V—OpenAI’s biggest foundation model at the time of this writing—reportedly cost <a href="https://www.wired.com/story/openai-ceo-sam-altman-the-age-of-giant-ai-models-is-already-over/" rel="noopener noreferrer" target="_blank">over US $100 million</a> and <a href="https://www.google.com/search?q=gpt4+training+and+energy+cost&oq=gpt4+training+and+energy+cost&gs_lcrp=EgZjaHJvbWUyBggAEEUYOTIJCAEQIRgKGKABMgkIAhAhGAoYoAEyCQgDECEYChigAdIBCDQ5NThqMGo3qAIAsAIA&sourceid=chrome&ie=UTF-8" rel="noopener noreferrer" target="_blank">50 million KWh</a> of electricity to train. This is well beyond the budget and resources that any academic lab can currently spare, so a larger robotics foundation model would need to be trained by a company or a government of some kind. Additionally, depending on how large both the dataset and model itself for such an endeavor are, the costs may balloon by another order-of-magnitude or more, which might make it completely infeasible.</li>
  549. </ul><h3>Even if it works as well as in CV/NLP, it won’t solve robotics<a href="https://nishanthjkumar.com/Will-Scaling-Solve-Robotics-Perspectives-from-CoRL-2023/#even-if-it-works-as-well-as-in-cvnlp-it-wont-solve-robotics" rel="noopener noreferrer" target="_blank" title="Permalink"><span></span></a></h3><ul>
  550. <li><strong>The 99.X problem and long tails</strong>: Vincent Vanhoucke of Google Robotics started a talk with a provocative assertion: Most—if not all—robot learning approaches cannot be deployed for any practical task. The reason? Real-world industrial and home applications typically require 99.X percent or higher accuracy and reliability. What exactly that means varies by application, but it’s safe to say that robot learning algorithms aren’t there yet. Most results presented in academic papers top out at 80 percent success rate. While that might seem quite close to the 99.X percent threshold, people trying to actually deploy these algorithms have found that it isn’t so: getting higher success rates requires asymptotically more effort as we get closer to 100 percent. That means going from 85 to 90 percent might require just as much—if not more—effort than going from 40 to 80 percent. Vincent asserted in his talk that getting up to 99.X percent is a fundamentally different beast than getting even up to 80 percent, one that might require a whole host of new techniques beyond just scaling.
  551. <ul>
  552. <li><strong>Existing big models don’t get to 99.X percent even in CV and NLP</strong>: As impressive and capable as current large models like GPT-4V and DETIC are, even they don’t achieve 99.X percent or higher success rate on previously-unseen tasks. Current robotics models are very far from this level of performance, and I think it’s safe to say that the entire robot learning community would be thrilled to have a general model that does as well on robotics tasks as GPT-4V does on NLP tasks. However, even if we had something like this, it wouldn’t be at 99.X percent, and it’s not clear that it’s possible to get there by scaling either.</li>
  553. </ul>
  554. </li>
  555. <li><strong>Self-driving car companies have tried this approach, and it doesn’t fully work (yet)</strong>: This is closely related to the above point, but important and subtle enough that I think it deserves to stand on its own. A number of self-driving car companies—most notably <a href="https://www.tesla.com/" target="_blank">Tesla</a> and <a href="https://wayve.ai/" target="_blank">Wayve</a>—have tried training such an end-to-end big model on large amounts of data to achieve <a href="https://www.sae.org/blog/sae-j3016-update" rel="noopener noreferrer" target="_blank">Level 5 autonomy</a>. Not only do these companies have the engineering resources and money to train such models, but they also have the data. Tesla in particular has a fleet of over 100,000 cars deployed in the real world that it is constantly collecting and then annotating data from. These cars are being teleoperated by experts, making the data ideal for large-scale supervised learning. And despite all this, <a href="https://www.tesla.com/support/autopilot" rel="noopener noreferrer" target="_blank">Tesla has so far been unable to produce a Level 5 autonomous driving system</a>. That’s not to say their approach doesn’t work at all. It competently handles a large number of situations—especially highway driving—and serves as a useful Level 2 (i.e., driver assist) system. However, it’s far from 99.X percent performance. Moreover, <a href="https://electrek.co/2022/12/14/tesla-full-self-driving-data-awful-challenge-elon-musk-prove-otherwise/" rel="noopener noreferrer" target="_blank">data seems to suggest that Tesla’s approach is faring far worse than Waymo or Cruise</a>, which both use much more modular systems. While it isn’t inconceivable that Tesla’s approach could end up catching up and surpassing its competitors performance in a year or so, the fact that it hasn’t worked yet should serve as evidence perhaps that the 99.X percent problem is hard to overcome for a large-scale ML approach. Moreover, given that self-driving is a special case of general robotics, Tesla’s case should give us reason to doubt the large-scale model approach as a full solution to robotics, especially in the medium term.</li>
  556. <li><strong>Many robotics tasks of interest are quite long-horizon</strong>: Accomplishing any task requires taking a number of correct actions in sequence. Consider the relatively simple problem of making a cup of tea given an electric kettle, water, a box of tea bags, and a mug. Success requires pouring the water into the kettle, turning it on, then pouring the hot water into the mug, and placing a tea bag inside it. If we want to solve this with a model trained to output motor torque commands given pixels as input, we’ll need to send torque commands to all 7 motors of a 7-degree-of-freedom arm at around 40 Hz. Let’s suppose that this tea-making task requires 5 minutes. That requires 7 * 40 * 60 * 5 = 84,000 correct torque commands. This is all just for a stationary robot arm; things get much more complicated if the robot is mobile, or has more than one arm. It is well-known that error tends to compound over longer horizons for most tasks. This is one reason why—despite their ability to produce long sequences of text—even LLMs cannot yet produce completely coherent novels or long stories: small deviations from a true prediction over time tend to add up and yield extremely large deviations over long horizons. Given that most, if not all, robotics tasks of interest require sending at least thousands, if not hundreds of thousands, of torques in just the right order, even a fairly well-performing model might really struggle to fully solve these robotics tasks (see the back-of-the-envelope sketch just after this list).</li>
  557. </ul><p>
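To make the compounding-error intuition concrete, here is a crude back-of-the-envelope sketch (illustrative numbers only, not from any CoRL talk): if each of the ~84,000 torque commands from the tea-making example had to “go well” independently with probability p, the chance of the whole five-minute task succeeding collapses quickly. Real errors are correlated and often recoverable, so treat this purely as an intuition pump.
</p><pre><code>
# Back-of-the-envelope: probability an entire long-horizon task succeeds
# if each command independently "goes well" with probability p (a crude model).
n_commands = 7 * 40 * 60 * 5          # 7 motors x 40 Hz x 5 minutes = 84,000
for p in (0.999, 0.9999, 0.99999):
    print(f"p = {p}: task success ~ {p ** n_commands:.2e}")
# p = 0.999:   ~3e-37  (hopeless)
# p = 0.9999:  ~2e-04  (still hopeless)
# p = 0.99999: ~4e-01  (fails more often than it succeeds)
</code></pre><p>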
  558. Okay, now that we’ve sketched out all the main points on both sides of the debate, I want to spend some time diving into a few related points. Many of these are responses to the above points on the ‘against’ side, and some of them are proposals for directions to explore to help overcome the issues raised.
  559. </p><div class="horizontal-rule">
  560. </div><h2>Miscellaneous Related Arguments</h2><h3>We can probably deploy learning-based approaches robustly</h3><p>
  561. One point that gets brought up a lot against learning-based approaches is the lack of theoretical guarantees. At the time of this writing, we know very little about neural network theory: we don’t really know why they learn well, and more importantly, we don’t have any guarantees on what values they will output in different situations. On the other hand, most classical control and planning approaches that are widely used in robotics have various theoretical guarantees built-in. These are generally quite useful when certifying that systems are safe.
  562. </p><p>
  563. However, there seemed to be general consensus amongst a number of CoRL speakers that this point is perhaps given more significance than it should. Sergey Levine pointed out that most of the guarantees from controls aren’t really that useful for a number of real-world tasks we’re interested in. As he put it: “self-driving car companies aren’t worried about controlling the car to drive in a straight line, but rather about a situation in which someone paints a sky onto the back of a truck and drives in front of the car,” thereby confusing the perception system. Moreover,
  564. <a href="https://www.linkedin.com/in/scott-kuindersma-06a38152/" target="_blank">Scott Kuindersma</a> of Boston Dynamics talked about how they’re <a href="https://www.youtube.com/watch?v=Qlv77vBH4i0&list=PLtF7v_W_CG5oG_lhI9tA1g4dPJKBOWDsA&index=4" rel="noopener noreferrer" target="_blank">deploying RL-based controllers</a> on their robots in production, and are able to get the confidence and guarantees they need via rigorous simulation and real-world testing. Overall, I got the sense that while people feel that guarantees are important, and encouraged researchers to keep trying to study them, they don’t think that the lack of guarantees for learning-based systems means that they <em>cannot</em> be deployed robustly.
  565. </p><h3>What if we strive to deploy Human-in-the-Loop systems?</h3><p>
  566. In one of the organized debates,
  567. <a href="https://homes.cs.washington.edu/~todorov/" rel="noopener noreferrer" target="_blank">Emo Todorov</a> pointed out that existing successful ML systems, like <a href="https://openai.com/blog/openai-codex" rel="noopener noreferrer" target="_blank">Codex</a> and ChatGPT, work well only because a human interacts with and sanitizes their output. Consider the case of coding with Codex: it isn’t intended to directly produce runnable, bug-free code, but rather to act as an intelligent autocomplete for programmers, thereby making the overall human-machine team more productive than either alone. In this way, these models don’t have to achieve the 99.X percent performance threshold, because a human can help correct any issues during deployment. As Emo put it: “humans are forgiving, physics is not.”</p><p>
  568. Chelsea Finn responded to this by largely agreeing with Emo. She strongly agreed that all successfully-deployed and useful ML systems have humans in the loop, and so this is likely the setting that deployed robot learning systems will need to operate in as well. Of course, having a human operate in the loop with a robot isn’t as straightforward as in other domains, since having a human and robot inhabit the same space introduces potential safety hazards. However, it’s a useful setting to think about, especially if it can help address issues brought on by the 99.X percent problem.
  569. </p><h3>Maybe we don’t need to collect that much real-world data for scaling</h3><p>
  570. A number of people at the conference were thinking about creative ways to overcome the real-world data bottleneck without actually collecting more real world data. Quite a few of these people argued that fast, realistic simulators could be vital here, and there were a number of works that explored creative ways to train robot policies in simulation and then transfer them to the real world. Another set of people argued that we can leverage existing vision, language, and video data and then just ‘sprinkle in’ some robotics data. Google’s recent
  571. <a href="https://deepmind.google/discover/blog/rt-2-new-model-translates-vision-and-language-into-action/" rel="noopener noreferrer" target="_blank">RT-2 model</a> showed how taking a large model trained on internet-scale vision and language data, and then fine-tuning it on a much smaller set of robotics data, can produce impressive performance on robotics tasks. Perhaps through a combination of simulation and pretraining on general vision and language data, we won’t actually have to collect too much real-world robotics data to get scaling to work well for robotics tasks.
  572. </p><h3>Maybe combining classical and learning-based approaches can give us the best of both worlds</h3><p>
  573. As with any debate, there were quite a few people advocating the middle path. Scott Kuindersma of Boston Dynamics titled one of his talks “Let’s all just be friends: model-based control helps learning (and vice versa)”. Throughout his talk and the subsequent debates, he made clear his strong belief that in the short to medium term, the best path towards reliable real-world systems involves combining learning with classical approaches. In her keynote speech for the conference,
  574. <a href="https://www.diligentrobots.com/andrea-thomaz" target="_blank">Andrea Thomaz</a> talked about how such a hybrid system—using learning for perception and a few skills, and classical SLAM and path-planning for the rest—is what powers a real-world robot that’s deployed in tens of hospital systems in Texas (and growing!). <a href="https://openreview.net/forum?id=QNPuJZyhFE" target="_blank">Several</a> <a href="https://openreview.net/forum?id=HtJE9ly5dT" target="_blank">papers</a> <a href="https://openreview.net/forum?id=9_8LF30mOC" target="_blank">explored</a> how classical controls and planning, together with learning-based approaches can enable much more capability than any system on its own. Overall, most people seemed to argue that this ‘middle path’ is extremely promising, especially in the short to medium term, but perhaps in the long-term either pure learning or an entirely different set of approaches might be best.
  575. </p><div class="horizontal-rule">
  576. </div><h2>What Can/Should We Take Away From All This?</h2><p>
  577. If you’ve read this far, chances are that you’re interested in some set of takeaways/conclusions. Perhaps you’re thinking “this is all very interesting, but what does all this mean for what we as a community should do? What research problems should I try to tackle?” Fortunately for you, a number of interesting suggestions emerged that seemed to have some consensus behind them.
  578. </p><h3>We should pursue the direction of trying to just scale up learning with very large datasets</h3><p>
  579. Despite the various arguments against scaling solving robotics outright, most people seem to agree that scaling in robot learning is a promising direction to be investigated. Even if it doesn’t fully solve robotics, it could lead to a significant amount of progress on a number of hard problems we’ve been stuck on for a while. Additionally, as Russ Tedrake pointed out, pursuing this direction carefully could yield useful insights about the general robotics problem, as well as current learning algorithms and why they work so well.
  580. </p><h3>We should <em>also</em> pursue other existing directions</h3><p>
  581. Even the most vocal proponents of the scaling approach were clear that they don’t think
  582. <em>everyone</em> should be working on this. It’s likely a bad idea for the entire robot learning community to put all of its eggs in the same basket, especially given all the reasons to believe scaling won’t fully solve robotics. Classical robotics techniques have gotten us quite far, and led to many successful and reliable deployments: pushing forward on them or integrating them with learning techniques might be the right way forward, especially in the short to medium term.
  583. </p><h3>We should focus more on real-world mobile manipulation and easy-to-use systems</h3><p>
  584. Vincent Vanhoucke observed that most papers at CoRL this year were limited to tabletop manipulation settings. While there are plenty of hard tabletop problems, things generally get a lot more complicated when the robot—and consequently its camera view—moves. Vincent speculated that it’s easy for the community to fall into a local minimum where we make a lot of progress that’s
  585. <em>specific</em> to the tabletop setting and therefore not generalizable. A similar thing could happen if we work predominantly in simulation. Avoiding these local minima by working on real-world mobile manipulation seems like a good idea.
  586. </p><p>
  587. Separately, Sergey Levine observed that a big reason why LLMs have seen so much excitement and adoption is that they’re extremely easy to use, especially for non-experts. One doesn’t have to know about the details of training an LLM, or perform any difficult setup, to prompt and use these models for their own tasks. Most robot learning approaches are currently far from this. They often require significant knowledge of their inner workings to use, and involve a significant amount of setup. Perhaps thinking more about how to make robot learning systems easier to use and more widely applicable could help improve the adoption, and potentially the scalability, of these approaches.
  588. </p><h3>We should be more forthright about things that don’t work</h3><p>
  589. There seemed to be a broadly-held complaint that many robot learning approaches don’t adequately report negative results, and this leads to a lot of unnecessary repeated effort. Additionally, perhaps patterns might emerge from consistent failures of things that we expect to work but don’t actually work well, and this could yield novel insight into learning algorithms. There is currently no good incentive for researchers to report such negative results in papers, but most people seemed to be in favor of designing one.
  590. </p><h3>We should try to do something totally new</h3><p>
  591. There were a few people who pointed out that all current approaches—be they learning-based or classical—are unsatisfying in a number of ways. There seem to be a number of drawbacks with each of them, and it’s very conceivable that there is a completely different set of approaches that ultimately solves robotics. Given this, it seems useful to try to think outside the box. After all, every one of the current approaches that’s part of the debate was only made possible because the few researchers who introduced them dared to think against the popular grain of their times.
  592. </p><p>
  593. <strong>Acknowledgements</strong>: Huge thanks to <a href="https://web.mit.edu/tslvr/www/" target="_blank">Tom Silver</a> and <a href="https://people.csail.mit.edu/lpk/" target="_blank">Leslie Kaelbling</a> for providing helpful comments, suggestions, and encouragement on a previous draft of this post.
  594. </p><p class="rm-anchors" id="ref1">
  595. <br/>
  596. <sup>1</sup> In fact, this was the topic of <a href="https://www.youtube.com/watch?v=pGjzxdD2Sa4&list=PLtF7v_W_CG5oG_lhI9tA1g4dPJKBOWDsA&index=14" target="_blank">a popular debate</a> hosted at a workshop on the first day; many of the points in this post were inspired by the conversation during that debate.
  597. </p>]]></description><pubDate>Tue, 28 May 2024 10:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/solve-robotics</guid><category>Machine learning</category><category>Mobile manipulation</category><category>Natural language processing</category><category>Neural networks</category><category>Robotics</category><dc:creator>Nishanth J. Kumar</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-set-of-24-images-show-different-types-of-robot-arms-performing-manipulation-tasks-with-objects-like-toys-cups-and-towels.jpg?id=52322881&amp;width=980"></media:content></item><item><title>Video Friday: A Starbucks With 100 Robots</title><link>https://spectrum.ieee.org/video-friday-starbucks-robots</link><description><![CDATA[
  598. <img src="https://spectrum.ieee.org/media-library/a-starbucks-worker-places-coffee-cups-on-a-mobile-robot-part-of-a-fleet-of-delivery-robots-operating-in-a-modern-office-buildin.gif?id=52317753&width=1200&height=800&coordinates=77%2C0%2C78%2C0"/><br/><br/><p>
  599. Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please
  600. <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/>
  601. </p><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>
  602. Enjoy today’s videos!
  603. </p><div class="horizontal-rule">
  604. </div><div style="page-break-after: always">
  605. <span style="display:none"> </span>
  606. </div><blockquote class="rm-anchors" id="yyqviaheyno">
  607. <em>NAVER 1784 is the world’s largest robotics testbed. The Starbucks on the second floor of 1784 is the world’s most unique Starbucks, with more than 100 service robots called “Rookie” delivering Starbucks drinks to meeting rooms and private seats, and various experiments with a dual-arm robot.</em>
  608. </blockquote><p class="shortcode-media shortcode-media-youtube">
  609. <span class="rm-shortcode" data-rm-shortcode-id="cf2da99309ba50e470f23cbfb09e4c57" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/yYqviahEyno?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  610. </p><p>
  611. [
  612. <a href="https://1784.navercorp.com/">Naver</a> ]
  613. </p><div class="horizontal-rule">
  614. </div><p class="rm-anchors" id="xgimhttw1mu">
  615. If you’re gonna take a robot dog with you on a hike, the least it could do is carry your backpack for you.
  616. </p><p class="shortcode-media shortcode-media-youtube">
  617. <span class="rm-shortcode" data-rm-shortcode-id="f880c214374d9ed180ce973ae549c37c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xgIMHTTW1MU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  618. </p><p>
  619. [
  620. <a href="https://www.deeprobotics.cn/en">Deep Robotics</a> ]
  621. </p><div class="horizontal-rule">
  622. </div><p class="rm-anchors" id="aepechiik9s">
  623. Obligatory reminder that phrases like “no teleoperation” without any additional context can mean many different things.
  624. </p><p class="shortcode-media shortcode-media-youtube">
  625. <span class="rm-shortcode" data-rm-shortcode-id="243b36b4f08f145149e0bab918f1b90c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/AePEcHIIk9s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  626. </p><p>
  627. [
  628. <a href="https://astribot.com/index-en.html">Astribot</a> ]
  629. </p><div class="horizontal-rule">
  630. </div><blockquote class="rm-anchors" id="czbmxdm1_tk">
  631. <em>This video is presented at the ICRA 2024 conference and summarizes recent results of our Learning AI for Dextrous Manipulation Lab. It demonstrates  how our learning AI methods allowed for breakthroughs in dextrous manipulation with the  mobile humanoid robot DLR Agile Justin. Although the core of the mechatronic hardware is almost 20 years old, only the advent of learning AI methods enabled a level of dexterity, flexibility and autonomy coming close to human capabilities.</em>
  632. </blockquote><p class="shortcode-media shortcode-media-youtube">
  633. <span class="rm-shortcode" data-rm-shortcode-id="15cb5037943c7d861f96486c7ef75ca7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/CZBMXDM1_Tk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  634. </p><p>
  635. [
  636. <a href="https://aidx-lab.github.io/">TUM</a> ]
  637. </p><p>
  638. Thanks Berthold!
  639. </p><div class="horizontal-rule">
  640. </div><p class="rm-anchors" id="ry1z_edby10">
  641. Hands of blue? Not a good look.
  642. </p><p class="shortcode-media shortcode-media-youtube">
  643. <span class="rm-shortcode" data-rm-shortcode-id="585ee3ef015ffdd38675c4f84b2b6369" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/rY1Z_eDBy10?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  644. </p><p>
  645. [
  646. <a href="https://synapticrobotics.com/">Synaptic</a> ]
  647. </p><div class="horizontal-rule">
  648. </div><p class="rm-anchors" id="rgkzs57_6nk">
  649. With all the humanoid stuff going on, there really should be more emphasis on intentional contact—humans lean and balance on things all the time, and robots should too!
  650. </p><p class="shortcode-media shortcode-media-youtube">
  651. <span class="rm-shortcode" data-rm-shortcode-id="c6df3ce89b9956f9ada064631458f14b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/RGkZS57_6Nk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  652. </p><p>
  653. [
  654. <a href="https://hucebot.github.io/seiko_controller_website/">Inria</a> ]
  655. </p><div class="horizontal-rule">
  656. </div><blockquote class="rm-anchors" id="p1g5kk2qd4e">
  657. <em>LimX Dynamics W1 is now more than a wheeled quadruped. By evolving into a biped robot, W1 maneuvers slickly on two legs in different ways: non-stop 360° rotation, upright free gliding, slick maneuvering, random collision and self-recovery, and step walking.</em>
  658. </blockquote><p class="shortcode-media shortcode-media-youtube">
  659. <span class="rm-shortcode" data-rm-shortcode-id="799d37e617002addadcda679fc773fa3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/p1g5kk2qD4E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  660. </p><p>
  661. [
  662. <a href="https://www.limxdynamics.com/en">LimX Dynamics</a> ]
  663. </p><div class="horizontal-rule">
  664. </div><blockquote class="rm-anchors" id="nmjrwyhvk6a">
  665. <em>Animal brains use less data and energy compared to current deep neural networks running on Graphics Processing Units (GPUs). This makes it hard to develop tiny autonomous drones, which are too small and light for heavy hardware and big batteries. Recently, the emergence of neuromorphic processors that mimic how brains function has made it possible for researchers from Delft University of Technology to develop a drone that uses neuromorphic vision and control for autonomous flight.</em>
  666. </blockquote><p class="shortcode-media shortcode-media-youtube">
  667. <span class="rm-shortcode" data-rm-shortcode-id="d1e4b8e2228fe08dd50542353ca02854" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/NmjRwyHVk6A?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  668. </p><p>
  669. [
  670. <a href="https://www.science.org/doi/10.1126/scirobotics.adi0591">Science</a> ]
  671. </p><div class="horizontal-rule">
  672. </div><blockquote class="rm-anchors" id="y8ntmz7vgmu">
  673. <em>In the beginning of the universe, all was darkness — until the first organisms developed sight, which ushered in an explosion of life, learning and progress. AI pioneer Fei-Fei Li says a similar moment is about to happen for computers and robots. She shows how machines are gaining “spatial intelligence” — the ability to process visual data, make predictions and act upon those predictions — and shares how this could enable AI to interact with humans in the real world.</em>
  674. </blockquote><p class="shortcode-media shortcode-media-youtube">
  675. <span class="rm-shortcode" data-rm-shortcode-id="ffb38ac8db9bc6ed2bd4c7188d4e07ea" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/y8NtMZ7VGmU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  676. </p><p>
  677. [
  678. <a href="https://www.ted.com/talks/fei_fei_li_with_spatial_intelligence_ai_will_understand_the_real_world">TED</a> ]
  679. </p><div class="horizontal-rule">
  680. </div>]]></description><pubDate>Fri, 24 May 2024 17:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-starbucks-robots</guid><category>Service robots</category><category>Ted talk</category><category>Humanoid robots</category><category>Quadruped robots</category><category>Delivery robots</category><category>Drones</category><category>Video friday</category><dc:creator>Erico Guizzo</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/a-starbucks-worker-places-coffee-cups-on-a-mobile-robot-part-of-a-fleet-of-delivery-robots-operating-in-a-modern-office-buildin.gif?id=52317753&amp;width=980"></media:content></item><item><title>Video Friday: Robots With Knives</title><link>https://spectrum.ieee.org/video-friday-robots-with-knives</link><description><![CDATA[
  681. <img src="https://spectrum.ieee.org/media-library/a-robotic-hand-uses-a-kitchen-knife-to-slice-a-cucumber.jpg?id=52263792&width=1200&height=800&coordinates=70%2C0%2C71%2C0"/><br/><br/><p>
  682. Greetings from the
  683. <a href="https://2024.ieee-icra.org/" rel="noopener noreferrer" target="_blank">IEEE International Conference on Robotics and Automation (ICRA)</a> in Yokohama, Japan! We hope you’ve been enjoying our short videos on <a href="https://www.tiktok.com/@ieeespectrum/video/7369224609025068331" rel="noopener noreferrer" target="_blank">TikTok</a>, <a href="https://www.youtube.com/shorts/U_I8qLmIcuk" rel="noopener noreferrer" target="_blank">YouTube</a>, and <a href="https://www.instagram.com/ieeespectrum/" rel="noopener noreferrer" target="_blank">Instagram</a>. They are just a preview of our in-depth ICRA coverage, and over the next several weeks we’ll have lots of articles and videos for you. In today’s edition of Video Friday, we bring you a dozen of the most interesting projects presented at the conference.
  684. </p>
  685. <p>
  686. Enjoy today’s videos, and stay tuned for more ICRA posts!
  687. </p>
  688. <hr/>
  689. <p>
  690. Upcoming robotics events for the next few months:
  691. <br/>
  692. </p>
  693. <h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5>
  694. <h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5>
  695. <h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH, SWITZERLAND</h5>
  696. <p>
  697. Please
  698. <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday" target="_blank">send us your events</a> for inclusion.
  699. </p>
  700. <div class="horizontal-rule">
  701. </div>
  702. <div style="page-break-after: always">
  703. </div>
  704. <p class="rm-anchors" id="lyzpaa4fjeo">
  705. The following two videos are part of the “
  706. <a href="https://sites.google.com/view/icra2024cookingrobotics/" target="_blank">Cooking Robotics: Perception and Motion Planning</a>” workshop, which explored “the new frontiers of ‘robots in cooking,’ addressing various scientific research questions, including hardware considerations, key challenges in multimodal perception, motion planning and control, experimental methodologies, and benchmarking approaches.” The workshop featured robots handling food items like cookies, burgers, and cereal, and the two robots seen in the videos below used knives to slice cucumbers and cakes. You can watch all workshop videos <a href="https://www.youtube.com/@ICRA2024CookingRoboticsWS/videos" target="_blank">here</a>.
  707. </p>
  708. <p>
  709. “SliceIt!: Simulation-Based Reinforcement Learning for Compliant Robotic Food Slicing,” by Cristian C. Beltran-Hernandez, Nicolas Erbetti, and Masashi Hamaya from OMRON SINIC X Corporation, Tokyo, Japan.
  710. </p>
  711. <p class="shortcode-media shortcode-media-youtube">
  712. <span class="rm-shortcode" data-rm-shortcode-id="662a7ab0d312f6b9484cb0a033976d55" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/LyzPAA4Fjeo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  713. </p>
  714. <blockquote>
  715. <em>Cooking robots can enhance the home experience by reducing the burden of daily chores. However, these robots must perform their tasks dexterously and safely in shared human environments, especially when handling dangerous tools such as kitchen knives. This study focuses on enabling a robot to autonomously and safely learn food-cutting tasks. More specifically, our goal is to enable a collaborative robot or industrial robot arm to perform food-slicing tasks by adapting to varying material properties using compliance control. Our approach involves using Reinforcement Learning (RL) to train a robot to compliantly manipulate a knife, by reducing the contact forces exerted by the food items and by the cutting board. However, training the robot in the real world can be inefficient, and dangerous, and result in a lot of food waste. Therefore, we proposed SliceIt!, a framework for safely and efficiently learning robot food-slicing tasks in simulation. Following a real2sim2real approach, our framework consists of collecting a few real food slicing data, calibrating our dual simulation environment (a high-fidelity cutting simulator and a robotic simulator), learning compliant control policies on the calibrated simulation environment, and finally, deploying the policies on the real robot.</em>
  716. </blockquote>
  717. <div class="horizontal-rule">
  718. </div>
  719. <p class="rm-anchors" id="vi_ttalpiyk">
  720. “Cafe Robot: Integrated AI Skillset Based on Large Language Models,” by Jad Tarifi, Nima Asgharbeygi, Shuhei Takamatsu, and Masataka Goto from Integral AI in Tokyo, Japan, and Mountain View, Calif., USA.
  721. </p>
  722. <p class="shortcode-media shortcode-media-youtube">
  723. <span class="rm-shortcode" data-rm-shortcode-id="ed9df16089faacb22352b4dc5037e0a2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vi_ttALpIyk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  724. </p>
  725. <blockquote>
  726. <em>The cafe robot engages in natural language interaction to receive orders and subsequently prepares coffee and cakes. Each action involved in making these items is executed using AI skills developed by Integral, including Integral Liquid Pouring, Integral Powder Scooping, and Integral Cutting. The dialogue for making coffee, as well as the coordination of each action based on the dialogue, is facilitated by the Integral Task Planner.</em>
  727. </blockquote>
  728. <div class="horizontal-rule">
  729. </div>
  730. <p class="rm-anchors" id="c-uekd6vtiq">
  731. “Autonomous Overhead Powerline Recharging for Uninterrupted Drone Operations,” by Viet Duong Hoang, Frederik Falk Nyboe, Nicolaj Haarhøj Malle, and Emad Ebeid from University of Southern Denmark, Odense, Denmark.
  732. </p>
  733. <p class="shortcode-media shortcode-media-youtube">
  734. <span class="rm-shortcode" data-rm-shortcode-id="f7f4241f948f7cea983204b6687b06e7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/C-uekD6VTIQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  735. </p>
  736. <blockquote>
  737. <em>We present a fully autonomous self-recharging drone system capable of long-duration sustained operations near powerlines. The drone is equipped with a robust onboard perception and navigation system that enables it to locate powerlines and approach them for landing. A passively actuated gripping mechanism grasps the powerline cable during landing after which a control circuit regulates the magnetic field inside a split-core current transformer to provide sufficient holding force as well as battery recharging. The system is evaluated in an active outdoor three-phase powerline environment. We demonstrate multiple contiguous hours of fully autonomous uninterrupted drone operations composed of several cycles of flying, landing, recharging, and takeoff, validating the capability of extended, essentially unlimited, operational endurance.</em>
  738. </blockquote>
  739. <div class="horizontal-rule">
  740. </div>
  741. <p class="rm-anchors" id="6huiz7iwngc">
  742. “Learning Quadrupedal Locomotion With Impaired Joints Using Random Joint Masking,” by Mincheol Kim, Ukcheol Shin, and Jung-Yup Kim from Seoul National University of Science and Technology, Seoul, South Korea, and Robotics Institute, Carnegie Mellon University, Pittsburgh, Pa., USA.
  743. </p>
  744. <p class="shortcode-media shortcode-media-youtube">
  745. <span class="rm-shortcode" data-rm-shortcode-id="446f529b517643452701152d97c43310" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/6huiZ7IwNGc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  746. </p>
  747. <blockquote>
  748. <em>Quadrupedal robots have played a crucial role in various environments, from structured environments to complex harsh terrains, thanks to their agile locomotion ability. However, these robots can easily lose their locomotion functionality if damaged by external accidents or internal malfunctions. In this paper, we propose a novel deep reinforcement learning framework to enable a quadrupedal robot to walk with impaired joints. The proposed framework consists of three components: 1) a random joint masking strategy for simulating impaired joint scenarios, 2) a joint state estimator to predict an implicit status of current joint condition based on past observation history, and 3) progressive curriculum learning to allow a single network to conduct both normal gait and various joint-impaired gaits. We verify that our framework enables the Unitree’s Go1 robot to walk under various impaired joint conditions in real world indoor and outdoor environments.</em>
  749. </blockquote>
  750. <div class="horizontal-rule">
  751. </div>
  752. <p class="rm-anchors" id="g4dlszegf_k">
  753. “Synthesizing Robust Walking Gaits via Discrete-Time Barrier Functions With Application to Multi-Contact Exoskeleton Locomotion,” by Maegan Tucker, Kejun Li, and Aaron D. Ames from Georgia Institute of Technology, Atlanta, Ga., and California Institute of Technology, Pasadena, Calif., USA.
  754. </p>
  755. <p class="shortcode-media shortcode-media-youtube">
  756. <span class="rm-shortcode" data-rm-shortcode-id="8a00c66fa57d7def45c9addc794b1b80" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/6aXsBKMxDH0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  757. </p>
  758. <blockquote>
  759. <em>Successfully achieving bipedal locomotion remains challenging due to real-world factors such as model uncertainty, random disturbances, and imperfect state estimation. In this work, we propose a novel metric for locomotive robustness – the estimated size of the hybrid forward invariant set associated with the step-to-step dynamics. Here, the forward invariant set can be loosely interpreted as the region of attraction for the discrete-time dynamics. We illustrate the use of this metric towards synthesizing nominal walking gaits using a simulation in-the-loop learning approach. Further, we leverage discrete time barrier functions and a sampling-based approach to approximate sets that are maximally forward invariant. Lastly, we experimentally demonstrate that this approach results in successful locomotion for both flat-foot walking and multicontact walking on the Atalante lower-body exoskeleton.</em>
  760. </blockquote>
  761. <div class="horizontal-rule">
  762. </div>
  763. <p class="rm-anchors" id="wed3lbvopq4">
  764. “Supernumerary Robotic Limbs to Support Post-Fall Recoveries for Astronauts,” by Erik Ballesteros, Sang-Yoep Lee, Kalind C. Carpenter, and H. Harry Asada from MIT, Cambridge, Mass., USA, and Jet Propulsion Laboratory, California Institute of Technology, Pasadena, Calif., USA.
  765. </p>
  766. <p class="shortcode-media shortcode-media-youtube">
  767. <span class="rm-shortcode" data-rm-shortcode-id="8a121890169aa60a0959bffb1af64e81" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wED3lBVopq4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  768. </p>
  769. <blockquote>
  770. <em>This paper proposes the utilization of Supernumerary Robotic Limbs (SuperLimbs) for augmenting astronauts during an Extra-Vehicular Activity (EVA) in a partial-gravity environment. We investigate the effectiveness of SuperLimbs in assisting astronauts to their feet following a fall. Based on preliminary observations from a pilot human study, we categorized post-fall recoveries into a sequence of statically stable poses called “waypoints”. The paths between the waypoints can be modeled with a simplified kinetic motion applied about a specific point on the body. Following the characterization of post-fall recoveries, we designed a task-space impedance control with high damping and low stiffness, where the SuperLimbs provide an astronaut with assistance in post-fall recovery while keeping the human in-the-loop scheme. In order to validate this control scheme, a full-scale wearable analog space suit was constructed and tested with a SuperLimbs prototype. Results from the experimentation found that without assistance, astronauts would impulsively exert themselves to perform a post-fall recovery, which resulted in high energy consumption and instabilities maintaining an upright posture, concurring with prior NASA studies. When the SuperLimbs provided assistance, the astronaut’s energy consumption and deviation in their tracking as they performed a post-fall recovery was reduced considerably.</em>
  771. </blockquote>
  772. <div class="horizontal-rule">
  773. </div>
  774. <p class="rm-anchors" id="vhhcsg38nvw">
  775. “ArrayBot: Reinforcement Learning for Generalizable Distributed Manipulation through Touch,” by Zhengrong Xue, Han Zhang, Jingwen Cheng, Zhengmao He, Yuanchen Ju, Changyi Lin, Gu Zhang, and Huazhe Xu from Tsinghua Embodied AI Lab, IIIS, Tsinghua University; Shanghai Qi Zhi Institute; Shanghai AI Lab; and Shanghai Jiao Tong University, Shanghai, China.
  776. </p>
  777. <p class="shortcode-media shortcode-media-youtube">
  778. <span class="rm-shortcode" data-rm-shortcode-id="257b8c4c82596402620de77e94fa8450" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vhhCsG38Nvw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  779. </p>
  780. <blockquote>
  781. <em>We present ArrayBot, a distributed manipulation system consisting of a 16 × 16 array of vertically sliding pillars integrated with tactile sensors. Functionally, ArrayBot is designed to simultaneously support, perceive, and manipulate the tabletop objects. Towards generalizable distributed manipulation, we leverage reinforcement learning (RL) algorithms for the automatic discovery of control policies. In the face of the massively redundant actions, we propose to reshape the action space by considering the spatially local action patch and the low-frequency actions in the frequency domain. With this reshaped action space, we train RL agents that can relocate diverse objects through tactile observations only. Intriguingly, we find that the discovered policy can not only generalize to unseen object shapes in the simulator but also have the ability to transfer to the physical robot without any sim-to-real fine tuning. Leveraging the deployed policy, we derive more real world manipulation skills on ArrayBot to further illustrate the distinctive merits of our proposed system.</em>
  782. </blockquote>
  783. <div class="horizontal-rule">
  784. </div>
  785. <p class="rm-anchors" id="jgxqpivkt04">
  786. “SKT-Hang: Hanging Everyday Objects via Object-Agnostic Semantic Keypoint Trajectory Generation,” by Chia-Liang Kuo, Yu-Wei Chao, and Yi-Ting Chen from National Yang Ming Chiao Tung University, in Taipei and Hsinchu, Taiwan, and NVIDIA.
  787. </p>
  788. <p class="shortcode-media shortcode-media-youtube">
  789. <span class="rm-shortcode" data-rm-shortcode-id="9d3a27598172d7941435b6cfe7d7818b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/JgxQpIvKT04?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  790. </p>
  791. <blockquote>
  792. <em>We study the problem of hanging a wide range of grasped objects on diverse supporting items. Hanging objects is a ubiquitous task that is encountered in numerous aspects of our everyday lives. However, both the objects and supporting items can exhibit substantial variations in their shapes and structures, bringing two challenging issues: (1) determining the task-relevant geometric structures across different objects and supporting items, and (2) identifying a robust action sequence to accommodate the shape variations of supporting items. To this end, we propose Semantic Keypoint Trajectory (SKT), an object agnostic representation that is highly versatile and applicable to various everyday objects. We also propose Shape-conditioned Trajectory Deformation Network (SCTDN), a model that learns to generate SKT by deforming a template trajectory based on the task-relevant geometric structure features of the supporting items. We conduct extensive experiments and demonstrate substantial improvements in our framework over existing robot hanging methods in the success rate and inference time. Finally, our simulation-trained framework shows promising hanging results in the real world.</em>
  793. </blockquote>
  794. <div class="horizontal-rule">
  795. </div>
  796. <p class="rm-anchors" id="ur7ud077ass">
  797. “TEXterity: Tactile Extrinsic deXterity,” by Antonia Bronars, Sangwoon Kim, Parag Patre, and Alberto Rodriguez from MIT and Magna International Inc.
  798. </p>
  799. <p class="shortcode-media shortcode-media-youtube">
  800. <span class="rm-shortcode" data-rm-shortcode-id="760fe2edeb58d41b5f82b1dd58480444" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ur7UD077ass?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  801. </p>
  802. <blockquote>
  803. <em>We introduce a novel approach that combines tactile estimation and control for in-hand object manipulation. By integrating measurements from robot kinematics and an image based tactile sensor, our framework estimates and tracks object pose while simultaneously generating motion plans in a receding horizon fashion to control the pose of a grasped object. This approach consists of a discrete pose estimator that tracks the most likely sequence of object poses in a coarsely discretized grid, and a continuous pose estimator-controller to refine the pose estimate and accurately manipulate the pose of the grasped object. Our method is tested on diverse objects and configurations, achieving desired manipulation objectives and outperforming single-shot methods in estimation accuracy. The proposed approach holds potential for tasks requiring precise manipulation and limited intrinsic in-hand dexterity under visual occlusion, laying the foundation for closed loop behavior in applications such as regrasping, insertion, and tool use.</em>
  804. </blockquote>
  805. <div class="horizontal-rule">
  806. </div>
  807. <p class="rm-anchors" id="2u7jxs1ax5o">
  808. “Out of Sight, Still in Mind: Reasoning and Planning about Unobserved Objects With Video Tracking Enabled Memory Models,” by Yixuan Huang, Jialin Yuan, Chanho Kim, Pupul Pradhan, Bryan Chen, Li Fuxin, and Tucker Hermans from University of Utah, Salt Lake City, Utah, Oregon State University, Corvallis, Ore., and NVIDIA, Seattle, Wash., USA.
  809. </p>
  810. <p class="shortcode-media shortcode-media-youtube">
  811. <span class="rm-shortcode" data-rm-shortcode-id="94741790d3f75be748908749c30af979" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2u7JxS1Ax5o?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  812. </p>
  813. <blockquote>
  814. <em>Robots need to have a memory of previously observed, but currently occluded objects to work reliably in realistic environments. We investigate the problem of encoding object-oriented memory into a multi-object manipulation reasoning and planning framework. We propose DOOM and LOOM, which leverage transformer relational dynamics to encode the history of trajectories given partial-view point clouds and an object discovery and tracking engine. Our approaches can perform multiple challenging tasks including reasoning with occluded objects, novel objects appearance, and object reappearance. Throughout our extensive simulation and real world experiments, we find that our approaches perform well in terms of different numbers of objects and different numbers</em>
  815. </blockquote>
  816. <div class="horizontal-rule">
  817. </div>
  818. <p class="rm-anchors" id="4ha7e2-nbuu">
  819. “Open Sourse Underwater Robot: Easys,” by Michikuni Eguchi, Koki Kato, Tatsuya Oshima, and Shunya Hara from University of Tsukuba and Osaka University, Japan.
  820. </p>
  821. <p class="shortcode-media shortcode-media-youtube">
  822. <span class="rm-shortcode" data-rm-shortcode-id="d08aad5767461210bb84747472ddee40" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/4HA7E2-nBuU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  823. </p>
  824. <div class="horizontal-rule">
  825. </div>
  826. <p class="rm-anchors" id="nbts0mg2ixc">
  827. “Sensorized Soft Skin for Dexterous Robotic Hands,” by Jana Egli, Benedek Forrai, Thomas Buchner, Jiangtao Su, Xiaodong Chen, and Robert K. Katzschmann from ETH Zurich, Switzerland, and Nanyang Technological University, Singapore.
  828. </p>
  829. <p class="shortcode-media shortcode-media-youtube">
  830. <span class="rm-shortcode" data-rm-shortcode-id="b09abc29e49f92c14e22daef746ad40e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wDtXIPDn0Rk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  831. </p>
  832. <blockquote>
  833. <em>Conventional industrial robots often use two fingered grippers or suction cups to manipulate objects or interact with the world. Because of their simplified design, they are unable to reproduce the dexterity of human hands when manipulating a wide range of objects. While the control of humanoid hands evolved greatly, hardware platforms still lack capabilities, particularly in tactile sensing and providing soft contact surfaces. In this work, we present a method that equips the skeleton of a tendon-driven humanoid hand with a soft and sensorized tactile skin. Multi-material 3D printing allows us to iteratively approach a cast skin design which preserves the robot’s dexterity in terms of range of motion and speed. We demonstrate that a soft skin enables firmer grasps and piezoresistive sensor integration enhances the hand’s tactile sensing capabilities.</em>
  834. </blockquote>
  835. <div class="horizontal-rule">
  836. </div>]]></description><pubDate>Fri, 17 May 2024 10:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robots-with-knives</guid><category>Robotics</category><category>Video friday</category><category>Robot videos</category><category>Robotics research</category><category>Cooking robots</category><category>Icra</category><dc:creator>Erico Guizzo</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-robotic-hand-uses-a-kitchen-knife-to-slice-a-cucumber.jpg?id=52263792&amp;width=980"></media:content></item><item><title>Disney's Robots Use Rockets to Stick the Landing</title><link>https://spectrum.ieee.org/disney-robot-2668135204</link><description><![CDATA[
  837. <img src="https://spectrum.ieee.org/media-library/three-people-in-hard-hats-stand-back-from-a-white-plume-coming-downwards-from-a-raised-object.jpg?id=52179304&width=1200&height=800&coordinates=0%2C0%2C211%2C0"/><br/><br/><p>It’s hard to think of a more dramatic way to make an entrance than falling from the sky. While it certainly happens often enough on the silver screen, whether or not it can be done in real life is a tantalizing challenge for our entertainment robotics team at Disney Research.<br/></p><hr/><p>Falling is tricky for two reasons. The first and most obvious is what Douglas Adams referred to as “the sudden stop at the end.” Every second of free fall means another 9.8 m/s of velocity, and that can quickly add up to an extremely difficult energy dissipation problem. The other tricky thing about falling, especially for terrestrial animals like us, is that our normal methods for controlling our orientation disappear. We are used to relying on contact forces between our body and the environment to control which way we’re pointing. In the air, there’s nothing to push on except the air itself!</p><p>Finding a solution to these problems is a big, open-ended challenge. In the clip below, you can see one approach we’ve taken to start chipping away at it.</p><p class="shortcode-media shortcode-media-youtube">
  838. <span class="rm-shortcode" data-rm-shortcode-id="ca223296f8fb78891e2c86d8dd8191b0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/q_4Pi78gvEo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  839. </p><p>The video shows a small, stick-like robot with an array of four ducted fans attached to its top. The robot has a piston-like foot that absorbs the impact of a small fall, and then the ducted fans keep the robot standing by counteracting any tilting motion using aerodynamic thrust.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-left" data-rm-resized-container="25%" style="float: left;">
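<p>To make the balancing idea concrete, here is a minimal planar sketch in Python of fans counteracting tilt with a simple PD law. All of the numbers (mass, inertia, lever arm, fan force, gains) are assumptions chosen for illustration, not parameters of the robot shown here.</p><pre>
# Toy sketch of fan-based tilt stabilization: the fans apply an aerodynamic
# torque that pushes the body back toward upright. All values are assumed,
# not measurements of the Disney robot.
import math

mass, com_height = 1.0, 0.3   # kg, m (assumed)
inertia = 0.12                # kg*m^2 about the foot (assumed)
arm = 0.5                     # m, lever arm from the foot to the fan array (assumed)
max_fan_force = 4.0           # N, differential force the fans can apply (assumed)
kp, kd = 8.0, 1.2             # PD gains on tilt angle and tilt rate

def fan_force_command(tilt, tilt_rate):
    """Differential fan force that pushes the body back toward upright."""
    force = -(kp * tilt + kd * tilt_rate) / arm
    return max(-max_fan_force, min(max_fan_force, force))

# Simulate recovery from a 0.2 rad push with simple Euler integration.
dt, tilt, tilt_rate = 0.005, 0.2, 0.0
for _ in range(400):                                             # 2 seconds simulated
    gravity_torque = mass * 9.8 * com_height * math.sin(tilt)    # destabilizing
    control_torque = fan_force_command(tilt, tilt_rate) * arm    # stabilizing
    tilt_rate += (gravity_torque + control_torque) / inertia * dt
    tilt += tilt_rate * dt
print(f"tilt after 2 s: {tilt:.4f} rad")
</pre>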
  840. <img alt="Two people outdoors holding a tall silver object." class="rm-shortcode rm-resized-image" data-rm-shortcode-id="dfd7719be747ff81792f70c9ce37955a" data-rm-shortcode-name="rebelmouse-image" id="d8995" loading="lazy" src="https://spectrum.ieee.org/media-library/two-people-outdoors-holding-a-tall-silver-object.jpg?id=52179317&width=980" style="max-width: 100%"/>
  841. <small class="image-media media-caption" placeholder="Add Photo Caption..." style="max-width: 100%;">Raphael Pilon [left] and Marcela de los Rios evaluate the performance of the monopod balancing robot.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..." style="max-width: 100%;">Disney Research</small></p><p>The standing portion demonstrates that pushing on the air isn’t only useful during freefall. Conventional walking and hopping robots depend on ground contact forces to maintain the required orientation. These forces can ramp up quickly because of the stiffness of the system, necessitating high bandwidth control strategies. Aerodynamic forces are relatively soft, but even so, they were sufficient to keep our robots standing. And since these forces can also be applied during the flight phase of running or hopping, this approach might lead to robots that run before they walk. The thing that defines a running gait is the existence of a “flight phase” - a time when none of the feet are in contact with the ground. A running robot with aerodynamic control authority could potentially use a gait with a long flight phase. This would shift the burden of the control effort to mid-flight, simplifying the leg design and possibly making rapid bipedal motion more tractable than a moderate pace.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-left" data-rm-resized-container="25%" style="float: left;">
  842. <img alt="A man with silvery beard and mustache wearing safety googles and headphones sits in front of a mechanism on the floor." class="rm-shortcode rm-resized-image" data-rm-shortcode-id="614c86be77bcccf4996da78d513a9d10" data-rm-shortcode-name="rebelmouse-image" id="e974e" loading="lazy" src="https://spectrum.ieee.org/media-library/a-man-with-silvery-beard-and-mustache-wearing-safety-googles-and-headphones-sits-in-front-of-a-mechanism-on-the-floor.jpg?id=52181702&width=980" style="max-width: 100%"/>
  843. <small class="image-media media-caption" placeholder="Add Photo Caption..." style="max-width: 100%;">Richard Landon uses a test rig to evaluate the thrust profile of a ducted fan.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..." style="max-width: 100%;">Disney Research</small></p><p>In the next video, a slightly larger robot tackles a much more dramatic fall, from 65’ in the air. This simple machine has two piston-like feet and a similar array of ducted fans on top. The fans not only stabilize the robot upon landing, they also help keep it oriented properly as it falls. Inside each foot is a plug of single-use compressible foam. Crushing the foam on impact provides a nice, constant force profile, which maximizes the amount of energy dissipated per inch of contraction.</p><p class="shortcode-media shortcode-media-youtube">
  844. <span class="rm-shortcode" data-rm-shortcode-id="41df3913eaf7f0bfbfd26bd85bad7389" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/O4-OE5lMFQo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
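<p>As a rough back-of-the-envelope check on why this is hard, the sketch below estimates the impact speed and kinetic energy for a 65-foot drop, and the constant force a crushable foam would need to supply to absorb all of it within the piston stroke. The robot mass and crush stroke are assumed values for illustration, not figures from Disney Research.</p><pre>
# Back-of-the-envelope energy-dissipation check. All numbers are illustrative
# assumptions: a 10 kg machine dropped from 65 feet with 10 cm of crush stroke.
import math

g = 9.8                      # gravitational acceleration, m/s^2
height_m = 65 * 0.3048       # 65 feet in meters (~19.8 m)
mass_kg = 10.0               # assumed robot mass

impact_speed = math.sqrt(2 * g * height_m)           # ~19.7 m/s at touchdown
kinetic_energy = 0.5 * mass_kg * impact_speed ** 2   # ~1,940 J to dissipate

# A constant-force crushable foam dissipates energy = force * stroke.
stroke_m = 0.10                                       # assumed crush stroke
required_force = kinetic_energy / stroke_m            # ~19,400 N
peak_decel_g = required_force / (mass_kg * g)         # ~200 g, still a hard hit

print(f"impact speed: {impact_speed:.1f} m/s")
print(f"energy to dissipate: {kinetic_energy:.0f} J")
print(f"constant foam force over {stroke_m*100:.0f} cm: {required_force:.0f} N "
      f"(~{peak_decel_g:.0f} g)")
</pre><p>Because the energy absorbed is force times stroke, a flat force profile dissipates the most energy for a given peak load, which is what makes the single-use foam plug attractive here.</p>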
  845. </p><p>In the case of this little robot, the mechanical energy dissipation in the pistons is less than the total energy needed to be dissipated from the fall, so the rest of the mechanism takes a pretty hard hit. The size of the robot is an advantage in this case, because scaling laws mean that the strength-to-weight ratio is in its favor. </p><p>The strength of a component is a function of its cross-sectional area, while the weight of a component is a function of its volume. Area is proportional to length squared, while volume is proportional to length cubed. This means that as an object gets smaller, its weight becomes relatively small. This is why a toddler can be half the height of an adult but only a fraction of that adult’s weight, and why ants and spiders can run around on long, spindly legs. Our tiny robots take advantage of this, but we can’t stop there if we want to represent some of our bigger characters.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
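<p>The square-cube argument can be made concrete with a few lines of Python. The baseline values below are arbitrary illustrative numbers; the point is how the strength-to-weight ratio improves as the characteristic length shrinks.</p><pre>
# Square-cube law sketch: scale a robot's characteristic length by a factor s.
# Strength scales with cross-sectional area (s^2); weight scales with volume (s^3),
# so strength-to-weight improves as 1/s when the robot shrinks.
baseline_strength = 1.0   # arbitrary units, proportional to area
baseline_weight = 1.0     # arbitrary units, proportional to volume

for s in (1.0, 0.5, 0.25, 0.1):
    strength = baseline_strength * s ** 2
    weight = baseline_weight * s ** 3
    print(f"scale {s:>4}: strength {strength:.3f}, weight {weight:.4f}, "
          f"strength/weight {strength / weight:.1f}x")
</pre>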
  846. <img alt="Two people kneel on the floor working on a silver contraption." class="rm-shortcode" data-rm-shortcode-id="5ade6e78b56e5246e2732fb428026a76" data-rm-shortcode-name="rebelmouse-image" id="685cc" loading="lazy" src="https://spectrum.ieee.org/media-library/two-people-kneel-on-the-floor-working-on-a-silver-contraption.jpg?id=52179349&width=980"/>
  847. <small class="image-media media-caption" placeholder="Add Photo Caption...">Louis Lambie and Michael Lynch assemble an early ducted fan test platform. The platform was mounted on guidewires and was used for lifting capacity tests.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Disney Research</small></p><p>In most aerial robotics applications, control is provided by a system that is capable of supporting the entire weight of the robot. In our case, being able to hover isn’t a necessity. The clip below shows an investigation into how much thrust is needed to control the orientation of a fairly large, heavy robot. The robot is supported on a gimbal, allowing it to spin freely. At the extremities are mounted arrays of ducted fans. The fans don’t have enough force to keep the frame in the air, but they do have a lot of control authority over the orientation.</p><p class="shortcode-media shortcode-media-youtube">
  848. <span class="rm-shortcode" data-rm-shortcode-id="c0db29b873f9d7ca592687c0ccd6f398" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/yMTg7jLJIYs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  849. </p><p>Complicated robots are less likely to survive unscathed when subjected to the extremely high accelerations of a direct ground impact, as you can see in this early test that didn’t quite go according to plan.</p><p class="shortcode-media shortcode-media-youtube">
  850. <span class="rm-shortcode" data-rm-shortcode-id="1aa3d16b2394ef3c9660e3ac8e3fbec2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vujnpCQGKCY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  851. </p><p>In this last video, we use a combination of the previous techniques and add one more capability – a dramatic mid-air stop. Ducted fans are part of this solution, but the high-speed deceleration is principally accomplished by a large water rocket. Then the mechanical legs only have to handle the last ten feet of dropping acceleration.</p><p class="shortcode-media shortcode-media-youtube">
  852. <span class="rm-shortcode" data-rm-shortcode-id="1aa54c4c91110f3aa7f2422740c95f97" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/mw_XpMe6-1w?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  853. </p><p>Whether it’s using water or rocket fuel, the principle underlying a rocket is the same – mass is ejected from the rocket at high speed, producing a reaction force in the opposite direction via Newton’s third law. The higher the flow rate and the denser the fluid, the more force is produced. To get a high flow rate and a quick response time, we needed a wide nozzle that went from closed to open cleanly in a matter of milliseconds. We designed a system using a piece of copper foil and a custom punch mechanism that accomplished just that.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
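<p>For intuition about why a wide nozzle and a dense fluid help, here is an idealized, lossless estimate of water-rocket thrust. The two-inch nozzle width comes from the photo caption below; the tank pressure is an assumed value, so the numbers are illustrative only.</p><pre>
# Idealized water-rocket thrust: thrust = mass_flow_rate * exit_velocity.
# The nozzle width is from the article; the pressure is an assumption.
import math

water_density = 1000.0              # kg/m^3
nozzle_diameter = 2 * 0.0254        # 2 inches in meters
nozzle_area = math.pi * (nozzle_diameter / 2) ** 2

gauge_pressure = 300e3              # Pa (~45 psi), assumed tank pressure

# Bernoulli (ideal, lossless): exit velocity from the pressure difference.
exit_velocity = math.sqrt(2 * gauge_pressure / water_density)
mass_flow_rate = water_density * nozzle_area * exit_velocity
thrust = mass_flow_rate * exit_velocity   # reaction force, Newton's third law

print(f"exit velocity: {exit_velocity:.1f} m/s")
print(f"mass flow rate: {mass_flow_rate:.1f} kg/s")
print(f"thrust: {thrust:.0f} N (~{thrust / 9.8:.0f} kgf)")
</pre><p>In this idealized model the thrust works out to twice the gauge pressure times the nozzle area, so doubling either the pressure or the nozzle area roughly doubles the force.</p>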
  854. <img alt="Two photos show someone on a ladder manipulating a small tank on the left, and on the right a black cylindar with a clear tube out of the bottom and splashing water coming up from the ground." class="rm-shortcode" data-rm-shortcode-id="bbb174639d629ba2462a371ff24d8220" data-rm-shortcode-name="rebelmouse-image" id="2846a" loading="lazy" src="https://spectrum.ieee.org/media-library/two-photos-show-someone-on-a-ladder-manipulating-a-small-tank-on-the-left-and-on-the-right-a-black-cylindar-with-a-clear-tube-o.jpg?id=52179367&width=980"/>
  855. <small class="image-media media-caption" placeholder="Add Photo Caption...">Grant Imahara pressurizes a test tank to evaluate an early valve prototype [left]. The water rocket in action - note the laminar, two-inch-wide flow as it passes through the specially designed nozzle</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Disney Research</small></p><p>Once the water rocket has brought the robot to a mid-air stop, the ducted fans are able to hold it in a stable hover about ten feet above the deck. When they cut out, the robot falls again and the legs absorb the impact. In the video, the robot has a couple of loose tethers attached as a testing precaution, but they don’t provide any support, power, or guidance.</p><p>“It might not be so obvious as to what this can be directly used for today, but these rough proof-of-concept experiments show that we might be able to work within real-world physics to do the high falls our characters do on the big screen, and someday actually stick the landing,” explains Tony Dohi, the project lead. <br/></p><p>There are still a large number of problems for future projects to address. Most characters have legs that bend on hinges rather than compress like pistons, and don’t wear a belt made of ducted fans. Beyond issues of packaging and form, making sure that the robot lands exactly where it intends to land has interesting implications for perception and control. Regardless, we think we can confirm that this kind of entrance has–if you’ll excuse the pun–quite the impact.</p>]]></description><pubDate>Sun, 12 May 2024 13:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/disney-robot-2668135204</guid><category>Disney research</category><category>Rockets</category><category>Robotics</category><dc:creator>Morgan Pope</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/three-people-in-hard-hats-stand-back-from-a-white-plume-coming-downwards-from-a-raised-object.jpg?id=52179304&amp;width=980"></media:content></item><item><title>Video Friday: Robot Bees</title><link>https://spectrum.ieee.org/video-friday-robot-bees</link><description><![CDATA[
  856. <img src="https://spectrum.ieee.org/media-library/image.jpg?id=52214929&width=1200&height=800&coordinates=300%2C0%2C300%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><h5><a href="https://icsr2024.dk/">ICSR 2024</a>: 23–26 October 2024, ODENSE, DENMARK</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="uzrf9y9ewqm">Festo has robot bees!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cd84fa42044d3c82b5f5232d59d3f4fa" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UzRf9y9EWqM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>It’s a very clever design, but the size makes me terrified of whatever the bees are that Festo seems to be familiar with.</p><p>[ <a href="https://www.festo.com/bionicbee">Festo</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="eqfgdbfykvs">Boing, boing, boing!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="eb4f4d7beaa1f1f8fddd5e60129874af" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/EQFgdbFYKvs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sites.usc.edu/quann/">USC</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="03egqwiy0de">Why the heck would you take the trouble to program a robot to make sweet potato chips and then not scarf them down yourself?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e848aa4cdd40fcc38b725ffc70ec8aaf" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/03egqwIy0DE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dino-robotics.com/">Dino Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="0chapn2kowa"><em>Mobile robots can transport payloads far greater than their mass through vehicle traction. However, off-road terrain features substantial variation in height, grade, and friction, which can cause traction to degrade or fail catastrophically. 
This paper presents a system that utilizes a vehicle-mounted, multipurpose manipulator to physically adapt the robot with unique anchors suitable for a particular terrain for autonomous payload transport.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f3b07a9bbc38fac66002aff02b8b0656" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0chaPN2KOwA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sites.gatech.edu/dart-lab/">DART Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="dhvjlvkqs2c">Turns out that working on a collaborative task with a robot can make humans less efficient, because we tend to overestimate the robot’s capabilities.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5fd905ac763729eaa1d423633eadc8ad" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DHVjlVKqs2c?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://programs.sigchi.org/chi/2024/program/content/151006">CHI 2024</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="pxknatjtfs8">Wing posts a video with the title “What Do Wing’s Drones Sound Like” but only includes a brief snippet—though nothing without background room noise—revealing to curious viewers and listeners exactly what Wing’s drones sound like. </p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c605ddeca11bb076d2d32500e31abef2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/PXKNaTjTfS8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Because, look, a couple seconds of muted audio underneath a voiceover is in fact not really answering the question. 
</p><p>[ <a href="https://wing.com/">Wing</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="nbts0mg2ixc"><em>This first instance of ROB 450 in Winter 2024 challenged students to synthesize the knowledge acquired through their Robotics undergraduate courses at the University of Michigan to use a systematic and iterative design and analysis process and apply it to solving a real, open-ended Robotics problem.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0543a064808a222c547b074311fec949" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/NbtS0mG2iXc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotics.umich.edu/">Michigan Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="g4dlszegf_k">This Microsoft Future Leaders in Robotics and AI Seminar is from <a href="https://catiecuan.com/" target="_blank">Catie Cuan</a> at Stanford, on “Choreorobotics: Teaching Robots How to Dance With Humans.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3da6e5e37aea62afed00f6daccefd8d7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/g4dLsZegF_k?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>As robots transition from industrial and research settings into everyday environments, robots must be able to (1) learn from humans while benefiting from the full range of the humans’ knowledge and (2) learn to interact with humans in safe, intuitive, and social ways. I will present a series of compelling robot behaviors, where human perception and interaction are foregrounded in a variety of tasks.</em></blockquote><p>[ <a href="https://robotics.umd.edu/">UMD</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 10 May 2024 16:26:56 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robot-bees</guid><category>Biomemetic robots</category><category>Mobile robots</category><category>Robotic arm</category><category>Robotics</category><category>Video friday</category><category>Stanford</category><category>Drone delivery</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/image.jpg?id=52214929&amp;width=980"></media:content></item><item><title>The New Shadow Hand Can Take a Beating</title><link>https://spectrum.ieee.org/robot-hand-shadow-robot-company</link><description><![CDATA[
  857. <img src="https://spectrum.ieee.org/media-library/image.png?id=52214870&width=4599&height=3474&coordinates=0%2C0%2C1352%2C493"/><br/><br/><p>For years, <a href="https://www.shadowrobot.com/" target="_blank">Shadow Robot Company</a>’s Shadow Hand has arguably been the gold standard for robotic manipulation. Beautiful and expensive, it is able to mimic the form factor and functionality of human hands, which has made it ideal for complex tasks. I’ve personally experienced how amazing it is to use Shadow Hands in a teleoperation context, and it’s hard to imagine anything better.</p><p>The problem with <a href="https://robotsguide.com/robots/shadow" target="_blank">the original Shadow hand</a> was (and still is) fragility. In a research environment, this has been fine, except that research is changing: Roboticists no longer carefully program manipulation tasks by, uh, hand. Now it’s all about machine learning, in which you need robotic hands to massively fail over and over again until they build up enough data to understand how to succeed. </p><p class="pull-quote">“We’ve aimed for robustness and performance over anthropomorphism and human size and shape.” <strong>—Rich Walker, Shadow Robot Company</strong></p><p>Doing this with a Shadow Hand was just not realistic, which Google DeepMind understood five years ago when it asked Shadow Robot to build it a new hand with hardware that could handle the kind of training environments that now typify manipulation research. So Shadow Robot spent the last half-decade-ish working on <a href="https://www.shadowrobot.com/new-shadow-hand/" target="_blank">a new, three-fingered Shadow Hand</a>, which the company unveiled today. The company is calling it, appropriately enough, “the new Shadow Hand.”</p><hr/><p class="shortcode-media shortcode-media-youtube">
  858. <span class="rm-shortcode" data-rm-shortcode-id="4ea7877f1366bb130b2b564d29df9645" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/upi6c9dmJpM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  859. </p><p>As you can see, this thing is an absolute beast. Shadow Robot says that the new hand is “robust against a significant amount of misuse, including aggressive force demands, abrasion and impacts.” Part of the point, though, is that what robot-hand <em><em>designers</em></em> might call “misuse,” robot-manipulation <em><em>researchers</em></em> might very well call “progress,” and the hand is designed to stand up to manipulation research that pushes the envelope of what robotic hardware and software are physically capable of.</p><p>Shadow Robot understands that despite its best engineering efforts, this new hand will still occasionally break (because it’s a robot and that’s what robots do), so the company designed it to be modular and easy to repair. Each finger is its own self-contained unit that can be easily swapped out, with five Maxon motors in the base of the finger driving the four finger joints through cables in a design that eliminates <a href="https://en.wikipedia.org/wiki/Backlash_(engineering)" target="_blank">backlash</a>. The cables themselves will need replacement from time to time, but it’s much easier to do this on the new Shadow Hand than it was on the original. Shadow Robot says that you can swap out an entire New Hand’s worth of cables in the same time it would take you to replace a <em><em>single</em></em> cable on the old hand.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  860. <img alt="" class="rm-shortcode" data-rm-shortcode-id="e3e90873a58fbf7f5ed8f4f37a03c703" data-rm-shortcode-name="rebelmouse-image" id="d5945" loading="lazy" src="https://spectrum.ieee.org/media-library/image.jpg?id=52214869&width=980"/>
  861. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Shadow Robot</small></p><p>The new Shadow Hand itself is somewhat larger than a typical human hand, and heavier too: Each modular finger unit weighs 1.2 kilograms, and the entire three-fingered hand is just over 4 kg. The fingers have humanlike <a href="https://en.wikipedia.org/wiki/Kinematics" target="_blank">kinematics</a>, and each joint can move up to 180 degrees per second with the capability of exerting at least 8 newtons of force at each fingertip. Both force control and position control are available, and the entire hand runs <a href="https://www.ros.org/" target="_blank">Robot Operating System</a>, the <a href="https://www.openrobotics.org/" target="_blank">Open Source Robotics Foundation</a>’s collection of open-source software libraries and tools.</p><p class="shortcode-media shortcode-media-youtube">
  862. <span class="rm-shortcode" data-rm-shortcode-id="fb7d9a3e22cab9be1133c65c8cf482cf" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Wcf9mbgoqKc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  863. </p><p>One of the coolest new features of this hand is the tactile sensing. Shadow Robot has decided to take the optical route with fingertip sensors, <a href="https://www.gelsight.com/" target="_blank">GelSight</a>-style. Each fingertip is covered in soft, squishy gel with thousands of embedded particles. Cameras in the fingers behind the gel track each of those particles, and when the fingertip touches something, the particles move. Based on that movement, the fingertips can very accurately detect the magnitude and direction of even very small forces. And there are even more sensors on the insides of the fingers too, with embedded <a href="https://en.wikipedia.org/wiki/Hall_effect_sensor" target="_blank">Hall effect sensors</a> to help provide feedback during grasping and manipulation tasks.<br/></p><p class="shortcode-media shortcode-media-rebelmouse-image">
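<p>Conceptually, the readout is simple: track how the embedded particles shift between camera frames, then map the aggregate shift to a force estimate through a calibration. The toy Python sketch below illustrates that idea with a handful of made-up particle displacements and an assumed calibration constant; an actual GelSight-style sensor relies on much more careful calibration or a learned model rather than a single average.</p><pre>
# Toy GelSight-style readout: estimate contact force from the displacements
# of gel-embedded particles tracked by the fingertip camera. The displacement
# values and the calibration constant are illustrative assumptions.
import math

# (dx, dy) displacements of tracked particles between two frames, in pixels
displacements = [(1.2, 0.4), (1.0, 0.5), (1.3, 0.3), (0.9, 0.6)]

K = 0.8  # assumed calibration constant: newtons per pixel of mean shift

mean_dx = sum(d[0] for d in displacements) / len(displacements)
mean_dy = sum(d[1] for d in displacements) / len(displacements)

force = K * math.hypot(mean_dx, mean_dy)              # shear magnitude, N
angle = math.degrees(math.atan2(mean_dy, mean_dx))    # shear direction, degrees

print(f"estimated shear: {force:.2f} N at {angle:.0f} degrees")
</pre><p>A pressing (normal) force produces a different signature, with the particle pattern spreading rather than shifting uniformly, which is one reason real systems use richer models than a single averaged vector.</p>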
  864. <img alt="" class="rm-shortcode" data-rm-shortcode-id="efe03ac13996ca1ae0b84466c009efd0" data-rm-shortcode-name="rebelmouse-image" id="cc798" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=52214895&width=980"/>
  865. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Shadow Robot</small></p><p>The most striking difference here is how completely different a robotic-manipulation philosophy this new hand represents for Shadow Robot. “We’ve aimed for robustness and performance over anthropomorphism and human size and shape,” says Rich Walker, director of Shadow Robot Company. “There’s a very definite design choice there to get something that really behaves much more like an optimized manipulator rather than a humanlike hand.” </p><p>Walker explains that Shadow Robot sees two different approaches to manipulation within the robotics community right now: There’s imitation learning, where a human does a task and then a robot tries to do the task the same way, and then there’s reinforcement learning, where a robot tries to figure out how to do the task by itself. “Obviously, this hand was built from the ground up to make reinforcement learning easy.”</p><p>The hand was also built from the ground up to be rugged and repairable, which had a significant effect on the form factor. To make the fingers modular, they have to be chunky, and trying to cram five of them onto one hand was just not practical. But because of this modularity, Shadow Robot could make you a five-fingered hand if you really wanted one. Or a two-fingered hand. Or (and this is the company’s suggestion, not mine) “a giant spider.” Really, though, it’s probably not useful to get stuck on the form factor. Instead, focus more on what the hand can do. In fact, Shadow Robot tells me that the best way to think about the hand in the context of agility is as having three <em><em>thumbs</em></em>, not three fingers, but Walker says that “if we describe it as that, people get confused.”</p><p>There’s still definitely a place for the original anthropomorphic Shadow Hand, and Shadow Robot has no plans to discontinue it. “It’s clear that for some people anthropomorphism is a deal breaker, they have to have it,” Walker says. “But for a lot of people, the idea that they could have something which is really robust and dexterous and can gather lots of data, that’s exciting enough to be worth saying okay, what can we do with this? We’re very interested to find out what happens.”</p><p>The <a href="https://www.shadowrobot.com/new-shadow-hand/" target="_blank">Shadow New Hand</a> is available now, starting at about US $74,000 depending on configuration.</p>]]></description><pubDate>Fri, 10 May 2024 14:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/robot-hand-shadow-robot-company</guid><category>Anthropomorphism</category><category>Google deepmind</category><category>Modular design</category><category>Robotic manipulation</category><category>Robotics</category><category>Shadow hand</category><category>Shadow robot</category><category>Tactile sensing</category><category>Robot hand</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/image.png?id=52214870&amp;width=980"></media:content></item><item><title>Video Friday: Loco-Manipulation</title><link>https://spectrum.ieee.org/video-friday-locoman</link><description><![CDATA[
  866. <img src="https://spectrum.ieee.org/media-library/image.png?id=52165069&width=1200&height=800&coordinates=316%2C0%2C316%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="d-glirpihie"><em>In this work, we present LocoMan, a dexterous quadrupedal robot with a novel morphology to perform versatile manipulation in diverse constrained environments. By equipping a Unitree Go1 robot with two low-cost and lightweight modular 3-DoF loco-manipulators on its front calves, LocoMan leverages the combined mobility and functionality of the legs and grippers for complex manipulation tasks that require precise 6D positioning of the end effector in a wide workspace.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d54a294a40d24a800c5a6f4198f2b44c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/d-GLiRPiHIE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://linchangyi1.github.io/LocoMan/">CMU</a> ]</p><p>Thanks, Changyi!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="8cmpr3j5gby"><em>Object manipulation has been extensively studied in the context of fixed-base and mobile manipulators. However, the overactuated locomotion modality employed by snake robots allows for a unique blend of object manipulation through locomotion, referred to as loco-manipulation. In this paper, we present an optimization approach to solving the loco-manipulation problem based on nonimpulsive implicit-contact path planning for our snake robot COBRA.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b348de7e428a1722b24d6d60598afb7e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8cmpR3J5gbY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://siliconsynapse.sites.northeastern.edu/">Silicon Synapse Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="mg4ppkcyjig">Okay, but where that costume has eyes is not where Spot has eyes, so the Spot in the costume can’t see, right? 
And now I’m skeptical of the authenticity of the mutual snoot-boop.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c14bd4d72d7aa47ebd1375add7b938f1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/MG4PPkCyJig?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/products/spot/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="jtxhcvwoohm">Here’s some video of Field AI’s robots operating in relatively complex and unstructured environments without prior maps. Make sure to read our <a href="https://spectrum.ieee.org/autonomy-unstructured-field-ai" target="_blank">article from this week</a> for details!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="16a5cc1d82d3d840b12ef9308e769ab8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/JTxhcVwooHM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p class="shortcode-media shortcode-media-youtube">
  867. <span class="rm-shortcode" data-rm-shortcode-id="8542d6dca1581be0c3cc8d56254ddfc2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/j7bwTUSq9dg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  868. </p><p>[ <a href="https://fieldai.com/">Field AI</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="zradse5peeg">Is it just me, or is it kind of wild that researchers are now publishing papers comparing their humanoid controller to the “manufacturer’s” humanoid controller? It’s like humanoids are a commodity now or something.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1dc37a110424295ff1089a75cab35616" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ZraDSE5Peeg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://b-vm.github.io/Robust-SaW/">OSU</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="zlhmgfsgoqq">I, too, am packing armor for ICRA.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a548d14798b3e46667794e9cce68626c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ZlHmgFsGOqQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pollen-robotics.com/">Pollen Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="heddvc-1dny"><em>Honey Badger 4.0 is our latest robotic platform, created specifically for traversing hostile environments and difficult terrains. Equipped with multiple cameras and sensors, it will make sure no defect is omitted during inspection.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4e977961b4c06b9d0e137b517b064751" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/HEddVC-1DnY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.mabrobotics.pl/robot">MAB Robotics</a> ]</p><p>Thanks, Jakub!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="eer4pdkw2s4"><em>Have an automation task that calls for the precision and torque of an industrial robot arm…but you need something that is more rugged or a nonconventional form factor?  Meet the HEBI Robotics H-Series Actuator! 
With 9x the torque of our X-Series and seamless compatibility with the HEBI ecosystem for robot development, the H-Series opens a new world of possibilities for robots.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="88b4da69ebea06b0287338d700cbafb0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/EER4pdKw2s4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.hebirobotics.com/actuators#h-series">HEBI</a> ]</p><p>Thanks, Dave!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="dcn0n4_1ntm">This is how all spills happen at my house too: super passive-aggressively.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="85cb4926f41d5ad5db7d4f2ce0c0fc6a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DcN0N4_1ntM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.1x.tech/">1X</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="kx7mnvrr_ns"><em>EPFL’s team, led by Ph.D. student Milad Shafiee along with coauthors Guillaume Bellegarda and BioRobotics Lab head Auke Ijspeert, have trained a four-legged robot using deep-reinforcement learning to navigate challenging terrain, achieving a milestone in both robotics and biology.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="fcbb6c3bd1a527294978c5506eaae120" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/kX7MNvrR_ns?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://actu.epfl.ch/news/trotting-robots-reveal-emergence-of-animal-gait-tr/">EPFL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="j40zg1j1mca"><em>At Agility, we make robots that are made for work. Our robot Digit works alongside us in spaces designed for people. 
Digit handles the tedious and repetitive tasks meant for a machine, allowing companies and their people to focus on the work that requires the human element.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b1e72aa104b3c072053577b133919cfb" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/j40ZG1j1mCA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://agilityrobotics.com/">Agility</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="qclgqyrbtme"><em>With a wealth of incredible figures and outstanding facts, here’s Jan Jonsson, ABB Robotics veteran,  sharing his knowledge and passion for some of our robots and controllers from the past.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3544fd410870cd33b5479a342d36c2dd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/QCLGqyrBTME?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://global.abb/group/en/about/history">ABB</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="sbsnrz65udk">I have it on good authority that getting robots to mow a lawn (like, any lawn) is much harder than it looks, but Electric Sheep has built a business around it.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="46c84bab858c165cdfcc54ffb04739a8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/SBSnRz65UDk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sheeprobotics.ai/">Electric Sheep</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="jlexuorei9w"><em>The AI Index, currently in its seventh year, tracks, collates, distills, and visualizes data relating to artificial intelligence. The Index provides unbiased, rigorously vetted, and globally sourced data for policymakers, researchers, journalists, executives, and the general public to develop a deeper understanding of the complex field of AI. Led by a steering committee of influential AI thought leaders, the Index is the world’s most comprehensive report on trends in AI. 
In this seminar, HAI Research Manager Nestor Maslej offers highlights from the 2024 report, explaining trends related to research and development, technical performance, technical AI ethics, the economy, education, policy and governance, diversity, and public opinion.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="41c9ff6b83a4f2de7a2157b9ec40de12" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/JLExuoRei9w?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://hai.stanford.edu/research/ai-index-report">Stanford HAI</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="oazrbyclnaa">This week’s CMU Robotics Institute seminar, from Dieter Fox at Nvidia and the University of Washington, is “Where’s RobotGPT?”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cd3a0a2a340d195c30b49ff35c351c5f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/OAZrBYCLnaA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>In this talk, I will discuss approaches to generating large datasets for training robot-manipulation capabilities, with a focus on the role simulation can play in this context. I will show some of our prior work, where we demonstrated robust sim-to-real transfer of manipulation skills trained in simulation, and then present a path toward generating large-scale demonstration sets that could help train robust, open-world robot-manipulation models.</em></blockquote><p>[ <a href="https://www.ri.cmu.edu/event/ri-seminar-with-dieter-fox/">CMU</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 03 May 2024 16:22:31 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-locoman</guid><category>Deep reinforcement learning</category><category>Digit robot</category><category>Humanoid robots</category><category>Lawnmowers</category><category>Quadruped robots</category><category>Robotics</category><category>Video friday</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/image.png?id=52165069&amp;width=980"></media:content></item><item><title>How Field AI Is Conquering Unstructured Autonomy</title><link>https://spectrum.ieee.org/autonomy-unstructured-field-ai</link><description><![CDATA[
  869. <img src="https://spectrum.ieee.org/media-library/human-like-robot-driver-with-a-logo-on-its-chest-saying-field-ai-sits-behind-the-wheel-of-an-open-dune-buggy-like-vehicle.jpg?id=52093492&width=1200&height=800&coordinates=0%2C0%2C0%2C1"/><br/><br/><p>One of the biggest challenges for robotics right now is practical autonomous operation in unstructured environments. That is, doing useful stuff in places your robot hasn’t been before and where things may not be as familiar as your robot might like. Robots thrive on predictability, which has put some irksome restrictions on where and how they can be successfully deployed.</p><p>But over the past few years, this has started to change, thanks in large part to a couple of pivotal robotics challenges put on by DARPA. <a href="https://spectrum.ieee.org/collections/darpa-subterranean-challenge/" target="_blank">The DARPA Subterranean Challenge</a> ran from 2018 to 2021, putting mobile robots through a series of unstructured underground environments. And the currently ongoing <a href="https://spectrum.ieee.org/darpa-robot-racer" target="_blank">DARPA RACER</a> program tasks autonomous vehicles with navigating long distances off-road. Some extremely impressive technology has been developed through these programs, but there’s always a gap between this cutting-edge research and any real-world applications.</p><p>Now, a bunch of the folks involved in these challenges, including experienced roboticists from NASA, DARPA, Google DeepMind, Amazon, and Cruise (to name just a few places) are applying everything that they’ve learned to enable real-world practical autonomy for mobile robots at a startup called <a href="https://fieldai.com/" rel="noopener noreferrer" target="_blank"><u>Field AI</u></a>.</p><hr/><p>Field AI was cofounded by Ali Agha, who previously was a group leader for NASA JPL’s Aerial Mobility Group as well as JPL’s Perception Systems Group. While at JPL, Agha led Team CoSTAR, which<a href="https://spectrum.ieee.org/nasa-jpl-team-costar-darpa-subt-urban-circuit-systems-track" target="_self"><u> won the DARPA Subterranean Challenge Urban Circuit</u></a>. Agha has also been the principal investigator for <a href="https://spectrum.ieee.org/darpa-robot-racer" target="_self"><u>DARPA RACER</u></a>, first with JPL, and now continuing with Field AI. “Field AI is not just a startup,” Agha tells us. “It’s a culmination of decades of experience in AI and its deployment in the field.”</p><p class="pull-quote">Unstructured environments are where things are constantly changing, which can play havoc with robots that rely on static maps.</p><p>The “field” part in Field AI is what makes Agha’s startup unique. Robots running Field AI’s software are able to handle unstructured, unmapped environments without reliance on prior models, GPS, or human intervention. Obviously, this kind of capability was (and is) of interest to NASA and JPL, which send robots to places where there are no maps, GPS doesn’t exist, and direct human intervention is impossible. </p><p>But DARPA SubT demonstrated that similar environments can be found on Earth, too. For instance, mines, natural caves, and the urban underground are all extremely challenging for robots (and even for humans) to navigate. 
And those are just the most extreme examples: robots that need to operate inside buildings or out in the wilderness have similar challenges understanding where they are, where they’re going, and how to navigate the environment around them.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  870. <img alt="driverless dune buggy-type vehicle with waving American flag drives through a blurred landscape of sand and scrub brush" class="rm-shortcode" data-rm-shortcode-id="eeec6a4f68cf560fd766510e5231475c" data-rm-shortcode-name="rebelmouse-image" id="f1e2f" loading="lazy" src="https://spectrum.ieee.org/media-library/driverless-dune-buggy-type-vehicle-with-waving-american-flag-drives-through-a-blurred-landscape-of-sand-and-scrub-brush.png?id=52127240&width=980"/>
  871. <small class="image-media media-caption" data-gramm="false" data-lt-tmp-id="lt-741183" placeholder="Add Photo Caption..." spellcheck="false">An autonomous vehicle drives across kilometers of desert with no prior map, no GPS, and no road.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Field AI</small></p><p>Despite the difficulty that robots have operating in the field, this is an enormous opportunity that Field AI hopes to address. Robots have already proven their worth in inspection contexts, typically where you either need to make sure that nothing is going wrong across a large industrial site, or for tracking construction progress inside a partially completed building. There’s a lot of value here because the consequences of something getting messed up are expensive or dangerous or both, but the tasks are repetitive and sometimes risky and generally don’t require all that much human insight or creativity. </p><h3>Uncharted Territory as Home Base</h3><p>Where Field AI differs from other robotics companies offering these services, as Agha explains, is that his company wants to do these tasks without first having a map that tells the robot where to go. In other words, there’s no lengthy setup process, and no human supervision, and the robot can adapt to changing and new environments. Really, this is what full autonomy is all about: going anywhere, anytime, without human interaction. “Our customers don’t need to train anything,” Agha says, laying out the company’s vision. “They don’t need to have precise maps. They press a single button, and the robot just discovers every corner of the environment.” This capability is where the DARPA SubT heritage comes in. During the competition, DARPA basically said, “here’s the door into the course. We’re not going to tell you anything about what’s back there or even how big it is. Just go explore the whole thing and bring us back the info we’ve asked for.” Agha’s Team CoSTAR did exactly that during the competition, and Field AI is commercializing this capability.</p><p class="pull-quote">“With our robots, our aim is for you to just deploy it, with no training time needed. And then we can just leave the robots.” <strong>—Ali Agha, Field AI</strong></p><p>The other tricky thing about these unstructured environments, especially construction environments, is that things are constantly changing, which can play havoc with robots that rely on static maps. “We’re one of the few, if not the only company that can leave robots for days on continuously changing construction sites with minimal supervision,” Agha tells us. “These sites are very complex—every day there are new items, new challenges, and unexpected events. Construction materials on the ground, scaffolds, forklifts, and heavy machinery moving all over the place, nothing you can predict.” </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  872. <img alt="" class="rm-shortcode" data-rm-shortcode-id="814820b370823b4c8d293994362f74cf" data-rm-shortcode-name="rebelmouse-image" id="8c50c" loading="lazy" src="https://spectrum.ieee.org/media-library/image.jpg?id=52093494&width=980"/>
  873. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Field AI</small></p><p>Field AI’s approach to this problem is to emphasize environmental understanding over mapping. Agha says that essentially, Field AI is working towards creating “field foundation models” (FFMs) of the physical world, using sensor data as an input. You can think of FFMs as being similar to the foundation models of language, music, and art that other AI companies have created over the past several years, where ingesting a large amount of data from the Internet enables some level of functionality in a domain without requiring specific training for each new situation. Consequently, Field AI’s robots can understand <em><em>how </em></em>to move in the world, rather than just <em><em>where</em></em> to move. “We look at AI quite differently from what’s mainstream,” Agha explains. “We do very heavy probabilistic modeling.” Much more technical detail would get into Field AI’s IP, says Agha, but the point is that real-time world modeling becomes a by-product of Field AI’s robots operating in the world rather than a prerequisite for that operation. This makes the robots fast, efficient, and resilient.</p><p>Developing field-foundation models that robots can use to reliably go almost anywhere requires a lot of real-world data, which Field AI has been collecting at industrial and construction sites around the world for the past year. To be clear, they’re collecting the data as part of their commercial operations—these are paying customers that Field AI has already. “In these job sites, it can traditionally take weeks to go around a site and map where every single target of interest that you need to inspect is,” explains Agha. “But with our robots, our aim is for you to just deploy it, with no training time needed. And then we can just leave the robots. This level of autonomy really unlocks a lot of use cases that our customers weren’t even considering, because they thought it was years away.” And the use cases aren’t just about construction or inspection or other areas where we’re already seeing autonomous robotic systems, Agha says. “These technologies hold immense potential.” </p><p>There’s obviously demand for this level of autonomy, but Agha says that the other piece of the puzzle that will enable Field AI to leverage a trillion dollar market is the fact that they can do what they do with virtually any platform. Fundamentally, Field AI is a software company—they make sensor payloads that integrate with their autonomy software, but even those payloads are adjustable, ranging from something appropriate for an autonomous vehicle to something that a drone can handle. </p><p>Heck, if you decide that you need an autonomous humanoid for some weird reason, Field AI can do that too. While the versatility here is important, according to Agha, what’s even more important is that it means you can focus on platforms that are more affordable, and still expect the same level of autonomous performance, within the constraints of each robot’s design, of course. With control over the full software stack, integrating mobility with high-level planning, decision making, and mission execution, Agha says that the potential to take advantage of relatively inexpensive robots is what’s going to make the biggest difference toward Field AI’s commercial success.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  874. <img alt="Group shot in a company parking lot of ten men and 12 robots" class="rm-shortcode" data-rm-shortcode-id="7257ff8da545e998427ea45757c3fe47" data-rm-shortcode-name="rebelmouse-image" id="786c2" loading="lazy" src="https://spectrum.ieee.org/media-library/group-shot-in-a-company-parking-lot-of-ten-men-and-12-robots.jpg?id=52093499&width=980"/>
  875. <small class="image-media media-caption" placeholder="Add Photo Caption...">Same brain, lots of different robots: the Field AI team’s foundation models can be used on robots big, small, expensive, and somewhat less expensive.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Field AI</small></p><p>Field AI is already expanding its capabilities, building on some of its recent experience with DARPA RACER by working on deploying robots to inspect pipelines for tens of kilometers and to transport materials across solar farms. With revenue coming in and a substantial chunk of funding, Field AI has even attracted interest from <a href="https://www.gatesnotes.com/Robotics" target="_blank"><u>Bill Gates</u></a>. Field AI’s participation in RACER is ongoing, under a sort of subsidiary company for federal projects called Offroad Autonomy, and in the meantime its commercial side is targeting expansion to “hundreds” of sites on every platform it can think of, including humanoids.</p>]]></description><pubDate>Tue, 30 Apr 2024 14:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/autonomy-unstructured-field-ai</guid><category>Autonomous vehicles</category><category>Bill gates</category><category>Darpa subterranean challenge</category><category>Field ai</category><category>Foundation models</category><category>Humanoids</category><category>Mobile robots</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/human-like-robot-driver-with-a-logo-on-its-chest-saying-field-ai-sits-behind-the-wheel-of-an-open-dune-buggy-like-vehicle.jpg?id=52093492&amp;width=980"></media:content></item><item><title>Will Human Soldiers Ever Trust Their Robot Comrades?</title><link>https://spectrum.ieee.org/military-robots</link><description><![CDATA[
  876. <img src="https://spectrum.ieee.org/media-library/four-men-wearing-army-fatigues-and-carrying-weapons-and-tools-look-on-as-a-small-robot-with-treads-rolls-along-the-dusty-ground.jpg?id=52013412&width=1200&height=800&coordinates=0%2C0%2C126%2C0"/><br/><br/><div class="intro-text">
  877. <em><strong>Editor’s note: </strong>This article is adapted from the author’s book <a href="https://www.ucpress.edu/book/9780520402171/war-virtually" target="_blank">War Virtually: The Quest to Automate Conflict, Militarize Data, and Predict the Future</a> (University of California Press, published in paperback April 2024).</em>
  878. </div><p class="drop-caps">
  879. The blistering late-afternoon wind ripped across
  880. <a href="https://militarybases.com/overseas/iraq/camp-taji/" target="_blank"><u>Camp Taji</u></a>, a sprawling U.S. military base just north of Baghdad. In a desolate corner of the outpost, where the feared Iraqi Republican Guard had once manufactured mustard gas, nerve agents, and other chemical weapons, a group of American soldiers and Marines were solemnly gathered around an open grave, dripping sweat in the 114-degree heat. They were paying their final respects to Boomer, a fallen comrade who had been an indispensable part of their team for years. Just days earlier, he had been blown apart by a roadside bomb.
  881. </p><p>
  882. As a bugle mournfully sounded the last few notes of “Taps,” a soldier raised his rifle and fired a long series of volleys—a 21-gun salute. The troops, which included members of an elite army unit specializing in
  883. <a href="https://en.wikipedia.org/wiki/Bomb_disposal" target="_blank"><u>explosive ordnance disposal</u></a> (EOD), had decorated Boomer posthumously with a Bronze Star and a Purple Heart. With the help of human operators, the diminutive remote-controlled robot had protected American military personnel from harm by finding and disarming hidden explosives.
  884. </p><p>
  885. Boomer was a Multi-function Agile Remote-Controlled robot, or
  886. <a href="https://www.defense.gov/Multimedia/Photos/igphoto/2001082886/" target="_blank"><u>MARCbot</u></a>, manufactured by a Silicon Valley company called <a href="https://www.exponent.com/" target="_blank"><u>Exponent</u></a>. Weighing in at just over 30 pounds, MARCbots look like a cross between a Hollywood camera dolly and an oversized Tonka truck. Despite their toylike appearance, the devices often leave a lasting impression on those who work with them. In an <a href="https://www.reddit.com/r/Military/comments/1mn6y1/soldiers_are_developing_relationships_with_their/" target="_blank"><u>online discussion</u></a> about EOD support robots, one soldier wrote, “Those little bastards can develop a personality, and they save so many lives.” An infantryman responded by admitting, “We liked those EOD robots. I can’t blame you for giving your guy a proper burial, he helped keep a lot of people safe and did a job that most people wouldn’t want to do.”
  887. </p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-left" data-rm-resized-container="25%" style="float: left;">
  888. <img alt="Two men work with a rugged box containing the controller for the small four-wheeled vehicle in front of them. The vehicle has a video camera mounted on a jointed arm." class="rm-shortcode rm-resized-image" data-rm-shortcode-id="90f54b8dd83a7f8b0f9142ca2f64079f" data-rm-shortcode-name="rebelmouse-image" id="d081d" loading="lazy" src="https://spectrum.ieee.org/media-library/two-men-work-with-a-rugged-box-containing-the-controller-for-the-small-four-wheeled-vehicle-in-front-of-them-the-vehicle-has-a.jpg?id=52013436&width=980" style="max-width: 100%"/>
  889. <small class="image-media media-caption" placeholder="Add Photo Caption..." style="max-width: 100%;">A Navy unit used a remote-controlled vehicle with a mounted video camera in 2009 to investigate suspicious areas in southern Afghanistan.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..." style="max-width: 100%;">Mass Communication Specialist 2nd Class Patrick W. Mullen III/U.S. Navy </small>
  890. </p><p>
  891. But while some EOD teams established warm emotional bonds with their robots, others loathed the machines, especially when they malfunctioned. Take, for example, this case described by a Marine who served in Iraq:
  892. </p><blockquote>
  893. My team once had a robot that was obnoxious. It would frequently accelerate for no reason, steer whichever way it wanted, stop, etc. This often resulted in this stupid thing driving itself into a ditch right next to a suspected IED. So of course then we had to call EOD [personnel] out and waste their time and ours all because of this stupid little robot. Every time it beached itself next to a bomb, which was at least two or three times a week, we had to do this. Then one day we saw yet another IED. We drove him straight over the pressure plate, and blew the stupid little sh*thead of a robot to pieces. All in all a good day.
  894. </blockquote><p>
  895. Some battle-hardened warriors treat remote-controlled devices like brave, loyal, intelligent pets, while others describe them as clumsy, stubborn clods. Either way, observers have interpreted these accounts as unsettling glimpses of a future in which men and women ascribe personalities to artificially intelligent war machines.
  896. </p><p class="pull-quote">
  897. Some battle-hardened warriors treat remote-controlled devices like brave, loyal, intelligent pets, while others describe them as clumsy, stubborn clods.
  898. </p><p>
  899. From this perspective, what makes robot funerals unnerving is the idea of an emotional slippery slope. If soldiers are bonding with clunky pieces of remote-controlled hardware, what are the prospects of humans forming emotional attachments with machines once they’re more autonomous in nature, nuanced in behavior, and anthropoid in form? And a more troubling question arises: On the battlefield, will
  900. <em><em>Homo sapiens </em></em>be capable of dehumanizing members of its own species (as it has for centuries), even as it simultaneously humanizes the robots sent to kill them?
  901. </p><p>
  902. As I’ll explain, the Pentagon has a vision of a warfighting force in which humans and robots work together in tight collaborative units. But to achieve that vision, it has called in reinforcements: “trust engineers” who are diligently helping the Department of Defense (DOD) find ways of rewiring human attitudes toward machines. You could say that they want more soldiers to play “Taps” for their robot helpers and fewer to delight in blowing them up.
  903. </p><h2>The Pentagon’s Push for Robotics</h2><p>
  904. For the better part of a decade, several influential Pentagon officials have relentlessly promoted robotic technologies,
  905. <a href="https://apps.dtic.mil/sti/pdfs/AD1059546.pdf" target="_blank"><u>promising a future</u></a> in which “humans will form integrated teams with nearly fully autonomous unmanned systems, capable of carrying out operations in contested environments.”
  906. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  907. <img alt="Several soldiers wearing helmets and ear protectors pull upright a tall grey drone. " class="rm-shortcode" data-rm-shortcode-id="040b0a8056f7364697f466be166f2c76" data-rm-shortcode-name="rebelmouse-image" id="792af" loading="lazy" src="https://spectrum.ieee.org/media-library/several-soldiers-wearing-helmets-and-ear-protectors-pull-upright-a-tall-grey-drone.jpg?id=52013466&width=980"/>
  908. <small class="image-media media-caption" placeholder="Add Photo Caption...">Soldiers test a vertical take-off-and-landing drone at Fort Campbell, Ky., in 2020.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">U.S. Army </small>
  909. </p><p>
  910. As
  911. <em><em>The</em></em> <em><em>New York Times </em></em><a href="https://www.nytimes.com/2016/10/26/us/pentagon-artificial-intelligence-terminator.html" target="_blank"><u>reported in 2016</u></a>: “Almost unnoticed outside defense circles, the Pentagon has put artificial intelligence at the center of its strategy to maintain the United States’ position as the world’s dominant military power.” The U.S. government is spending staggering sums to advance these technologies: For fiscal year 2019, the U.S. Congress was projected to provide the DOD with US $9.6 billion to <a href="https://www.auvsi.org/sites/default/files/DoD%20FY19%20Budget%20Report_FINAL%20DRAFT_WITH%20SENATE%20NDAA.pdf" target="_blank"><u>fund uncrewed and robotic systems</u></a>—significantly more than the annual budget of the entire National Science Foundation.
  912. </p><p>
  913. Arguments supporting the expansion of autonomous systems are consistent and predictable: The machines will keep our troops safe because they can perform dull, dirty, dangerous tasks; they will result in fewer civilian casualties, since robots will be able to identify enemies with greater precision than humans can; they will be cost-effective and efficient, allowing more to get done with less; and the devices will allow us to stay ahead of China, which, according to some experts, will soon surpass America’s technological capabilities.
  914. </p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-left" data-rm-resized-container="25%" style="float: left;">
  915. <img alt="A headshot shows a smiling man in a dark suit with his arms crossed." class="rm-shortcode rm-resized-image" data-rm-shortcode-id="f6e621db6d16c8aa7ff1d739be02ed2f" data-rm-shortcode-name="rebelmouse-image" id="391f4" loading="lazy" src="https://spectrum.ieee.org/media-library/a-headshot-shows-a-smiling-man-in-a-dark-suit-with-his-arms-crossed-u00a0.jpg?id=52013561&width=980" style="max-width: 100%"/>
  916. <small class="image-media media-caption" placeholder="Add Photo Caption..." style="max-width: 100%;">Former U.S. deputy defense secretary Robert O. Work has argued for more automation within the military. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..." style="max-width: 100%;">Center for a New American Security </small>
  917. </p><p>
918. Among the most outspoken advocates of a roboticized military is
  919. <a href="https://www.defense.gov/About/Biographies/Biography/Article/602787/robert-o-work/" target="_blank"><u>Robert O. Work</u></a>, who was nominated by President Barack Obama in 2014 to serve as deputy defense secretary. <a href="https://www.defense.gov/News/News-Stories/Article/Article/628154/work-human-machine-teaming-represents-defense-technology-future/" target="_blank"><u>Speaking at a 2015 defense forum</u></a>, Work—a barrel-chested retired Marine Corps colonel with the slight hint of a drawl—described a future in which “human-machine collaboration” would win wars using big-data analytics. He used the example of Lockheed Martin’s newest stealth fighter to illustrate his point: “The F-35 is not a fighter plane, it is a flying sensor computer that sucks in an enormous amount of data, correlates it, analyzes it, and displays it to the pilot on his helmet.”
  920. </p><p>
  921. The beginning of Work’s speech was measured and technical, but by the end it was full of swagger. To drive home his point, he described a ground combat scenario. “I’m telling you right now,” Work told the rapt audience, “10 years from now if the first person through a breach isn’t a friggin’ robot, shame on us.”
  922. </p><p>
  923. “The debate within the military is no longer about whether to build autonomous weapons but how much independence to give them,” said a
  924. <a href="https://www.nytimes.com/2016/10/26/us/pentagon-artificial-intelligence-terminator.html" target="_blank"><u>2016 </u><u><em><em>New York Times </em></em></u><u>article</u></a>. The rhetoric surrounding robotic and autonomous weapon systems is remarkably similar to that of Silicon Valley, where charismatic CEOs, technology gurus, and sycophantic pundits have relentlessly hyped artificial intelligence.
  925. </p><p>
  926. For example, in 2016, the
  927. <a href="https://dsb.cto.mil/" target="_blank"><u>Defense Science Board</u></a>—a group of appointed civilian scientists tasked with giving advice to the DOD on technical matters—released a report titled “<a href="https://apps.dtic.mil/sti/pdfs/AD1017790.pdf" target="_blank"><u>Summer Study on Autonomy</u></a>.” Significantly, the report wasn’t written to weigh the pros and cons of autonomous battlefield technologies; instead, the group assumed that such systems will inevitably be deployed. Among other things, the report included “focused recommendations to improve the future adoption and use of autonomous systems [and] example projects intended to demonstrate the range of benefits of autonomy<em> </em>for the warfighter.”
  928. </p><h2>What Exactly Is a Robot Soldier?</h2><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-left" data-rm-resized-container="25%" style="float: left;">
  929. <img alt="A red book cover shows the crosshairs of a target surrounded by images of robots and drones." class="rm-shortcode rm-resized-image" data-rm-shortcode-id="adc9550f238ab46441e1d5d4e80e00b3" data-rm-shortcode-name="rebelmouse-image" id="b047a" loading="lazy" src="https://spectrum.ieee.org/media-library/a-red-book-cover-shows-the-crosshairs-of-a-target-surrounded-by-images-of-robots-and-drones.jpg?id=52013571&width=980" style="max-width: 100%"/>
  930. <small class="image-media media-caption" placeholder="Add Photo Caption..." style="max-width: 100%;">The author’s book, <a href="https://www.ucpress.edu/book/9780520402171/war-virtually" rel="noopener noreferrer" target="_blank"><i>War Virtually</i></a>, is a critical look at how the U.S. military is weaponizing technology and data.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..." style="max-width: 100%;">University of California Press </small>
  931. </p><p>
  932. Early in the 20th century, military and intelligence agencies began developing robotic systems, which were mostly devices remotely operated by human controllers. But microchips, portable computers, the Internet, smartphones, and other developments have supercharged the pace of innovation. So, too, has the ready availability of colossal amounts of data from electronic sources and sensors of all kinds. The
  933. <a href="https://www.ft.com/content/442de9aa-e7a0-11e8-8a85-04b8afea6ea3" target="_blank"><u><em><em>Financial Times </em></em></u><u>reports</u></a>: “The advance of artificial intelligence brings with it the prospect of robot-soldiers battling alongside humans—and one day eclipsing them altogether.” These transformations aren’t inevitable, but they may become a self-fulfilling prophecy.
  934. </p><p>
  935. All of this raises the question: What exactly is a “robot-soldier”? Is it a remote-controlled, armor-clad box on wheels, entirely reliant on explicit, continuous human commands for direction? Is it a device that can be activated and left to operate semiautonomously, with a limited degree of human oversight or intervention? Is it a droid capable of selecting targets (using facial-recognition software or other forms of artificial intelligence) and initiating attacks without human involvement? There are hundreds, if not thousands, of possible technological configurations lying between remote control and full autonomy—and these differences affect ideas about who bears responsibility for a robot’s actions.
  936. </p><p>
  937. The U.S. military’s experimental and actual robotic and autonomous systems include a vast array of artifacts that rely on either remote control or artificial intelligence: aerial drones; ground vehicles of all kinds; sleek warships and submarines; automated missiles; and robots of various shapes and sizes—bipedal androids, quadrupedal gadgets that trot like dogs or mules, insectile swarming machines, and streamlined aquatic devices resembling fish, mollusks, or crustaceans, to name a few.
  938. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  939. <img alt="A four-legged black and grey robot moves in the foreground, while in the background a number of uniformed people watch its actions, " class="rm-shortcode" data-rm-shortcode-id="34aab039b2ab342391434de4a56b2c9c" data-rm-shortcode-name="rebelmouse-image" id="6b5a7" loading="lazy" src="https://spectrum.ieee.org/media-library/a-four-legged-black-and-grey-robot-moves-in-the-foreground-while-in-the-background-a-number-of-uniformed-people-watch-its-actio.jpg?id=52013577&width=980"/>
  940. <small class="image-media media-caption" placeholder="Add Photo Caption...">Members of a U.S. Air Force squadron test out an agile and rugged quadruped robot from Ghost Robotics in 2023.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Airman First Class Isaiah Pedrazzini/U.S. Air Force </small>
  941. </p><p>
  942. The transitions projected by military planners suggest that servicemen and servicewomen are in the midst of a three-phase evolutionary process, which begins with remote-controlled robots, in which humans are “in the loop,” then proceeds to semiautonomous and supervised autonomous systems, in which humans are “on the loop,” and then concludes with the adoption of fully autonomous systems, in which humans are “out of the loop.” At the moment, much of the debate in military circles has to do with the degree to which automated systems should allow—or require—human intervention.
  943. </p><p class="pull-quote">
  944. “Ten years from now if the first person through a breach isn’t a friggin’ robot, shame on us.” <strong>—Robert O. Work</strong>
  945. </p><p>
946. In recent years, much of the hype has centered on that second stage: semiautonomous and supervised autonomous systems that DOD officials refer to as “human-machine teaming.” This idea suddenly appeared in Pentagon publications and official statements after the summer of 2015. The timing probably wasn’t accidental: global news outlets were then focusing attention on a public backlash against lethal autonomous weapon systems. The
  947. <a href="https://www.stopkillerrobots.org/" target="_blank"><u>Campaign to Stop Killer Robots</u></a> was launched in April 2013 as a coalition of nonprofit and civil society organizations, including the <a href="https://www.icrac.net/" target="_blank"><u>International Committee for Robot Arms Control</u></a>, <a href="https://www.amnestyusa.org/" target="_blank"><u>Amnesty International</u></a>, and <a href="https://www.hrw.org/" target="_blank"><u>Human Rights Watch</u></a>. In July 2015, the campaign released an <a href="https://futureoflife.org/open-letter/open-letter-autonomous-weapons-ai-robotics/" target="_blank"><u>open letter</u></a> warning of a robotic arms race and calling for a ban on the technologies. Cosigners included world-renowned physicist Stephen Hawking, Tesla founder Elon Musk, Apple cofounder Steve Wozniak, and thousands more.
  948. </p><p>
  949. In November 2015, Work gave a high-profile speech on the importance of human-machine teaming, perhaps hoping to defuse the growing criticism of “killer robots.”
  950. <a href="https://breakingdefense.com/2015/11/centaur-army-bob-work-robotics-the-third-offset-strategy/" target="_blank"><u>According to one accoun</u></a>t, Work’s vision was one in which “computers will fly the missiles, aim the lasers, jam the signals, read the sensors, and pull all the data together over a network, putting it into an intuitive interface humans can read, understand, and use to command the mission”—but humans would still be in the mix, “using the machine to make the human make better decisions.” From this point forward, the military branches accelerated their drive toward human-machine teaming.
  951. </p><h2>The Doubt in the Machine</h2><p>
  952. But there was a problem. Military experts loved the idea, touting it as a win-win:
  953. <a href="https://www.cnas.org/people/paul-scharre" target="_blank"><u>Paul Scharre</u></a>, in his book <a href="https://www.amazon.com/Army-None-Autonomous-Weapons-Future/dp/0393608980" target="_blank"><u><em><em>Army of None</em></em></u></a><u><em><em>: Autonomous Weapons and the Future of War</em></em></u><em><em>, </em></em>claimed that “we don’t need to give up the benefits of human judgment to get the advantages of automation, we can have our cake and eat it too.” However, personnel on the ground expressed—and continue to express—deep misgivings about the side effects of the Pentagon’s newest war machines.
  954. </p><p>
  955. The difficulty, it seems, is humans’ lack of trust. The engineering challenges of creating robotic weapon systems are relatively straightforward, but the social and psychological challenges of convincing humans to place their faith in the machines are bewilderingly complex. In high-stakes, high-pressure situations like military combat, human confidence in autonomous systems can quickly vanish. The Pentagon’s
  956. <a href="https://dsiac.org/journals/" target="_blank"><u><em><em>Defense Systems Information Analysis Center Journal</em></em></u></a><em> </em>noted that although the prospects for combined human-machine teams are promising, <a href="https://dsiac.org/articles/warfighter-trust-in-autonomy/" target="_blank"><u>humans will need assurances</u></a>:
  957. </p><blockquote>
  958. [T]he battlefield is fluid, dynamic, and dangerous. As a result, warfighter demands become exceedingly complex, especially since the potential costs of failure are unacceptable. The prospect of lethal autonomy adds even greater complexity to the problem [in that] warfighters will have no prior experience with similar systems. Developers will be forced to build trust almost from scratch.
  959. </blockquote><p>
  960. In a
  961. <a href="https://cimsec.org/trusting-autonomous-systems-its-more-than-technology/" target="_blank"><u>2015 article</u></a>, <a href="https://www.linkedin.com/in/greg-smith-594b1011/" target="_blank"><u>U.S. Navy Commander Greg Smith</u></a> provided a candid assessment of aviators’ distrust in aerial drones. After describing how drones are often intentionally separated from crewed aircraft, Smith noted that operators sometimes lose communication with their drones and may inadvertently bring them perilously close to crewed airplanes, which “raises the hair on the back of an aviator’s neck.” He concluded:
  962. </p><blockquote>
  963. [I]n 2010, one task force commander grounded his manned aircraft at a remote operating location until he was assured that the local control tower and UAV [unmanned aerial vehicle] operators located halfway around the world would improve procedural compliance. Anecdotes like these abound…. After nearly a decade of sharing the skies with UAVs, most naval aviators no longer believe that UAVs are trying to kill them, but one should not confuse this sentiment with trusting the platform, technology, or [drone] operators.
  964. </blockquote><p class="shortcode-media shortcode-media-rebelmouse-image">
  965. <img alt="Two men look at a variety of screens in a dark room. Bottom: A large gray unmanned aircraft sits in a hangar. " class="rm-shortcode" data-rm-shortcode-id="3209a4ccb89c4ef6000731df15cf1756" data-rm-shortcode-name="rebelmouse-image" id="68773" loading="lazy" src="https://spectrum.ieee.org/media-library/two-men-look-at-a-variety-of-screens-in-a-dark-room-bottom-a-large-gray-unmanned-aircraft-sits-in-a-hangar.jpg?id=52013837&width=980"/>
  966. <small class="image-media media-caption" placeholder="Add Photo Caption...">U.S. Marines [top] prepare to launch and operate a MQ-9A Reaper drone in 2021. The Reaper [bottom] is designed for both high-altitude surveillance and destroying targets.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Top: Lance Cpl. Gabrielle Sanders/U.S. Marine Corps; Bottom: 1st Lt. John Coppola/U.S. Marine Corps </small>
  967. </p><p>
  968. Yet Pentagon leaders place an almost superstitious trust
969. in those systems, and seem firmly convinced that a lack of human confidence in autonomous systems can be overcome with engineered solutions. In a <a href="https://www.c4isrnet.com/opinion/2018/08/06/the-social-science-of-soldier-machine-teaming/" target="_blank"><u>commentary</u></a>, Courtney Soboleski, a data scientist employed by the military contractor <a href="https://www.boozallen.com/" target="_blank"><u>Booz Allen Hamilton</u></a>, makes the case for mobilizing social science as a tool for overcoming soldiers’ lack of trust in robotic systems.
  970. </p><blockquote>
  971. The problem with adding a machine into military teaming arrangements is not doctrinal or numeric…it is psychological. It is rethinking the instinctual threshold required for trust to exist between the soldier and machine.… The real hurdle lies in surpassing the individual psychological and sociological barriers to assumption of risk presented by algorithmic warfare. To do so requires a rewiring of military culture across several mental and emotional domains.… AI [artificial intelligence] trainers should partner with traditional military subject matter experts to develop the psychological feelings of safety not inherently tangible in new technology. Through this exchange, soldiers will develop the same instinctual trust natural to the human-human war-fighting paradigm with machines.
  972. </blockquote><h2>The Military’s Trust Engineers Go to Work</h2><p>
973. Soon, the wary warfighter will likely be subjected to new forms of training that focus on building trust between robots and humans. Already, robots are being programmed to communicate in more human ways with their users for the explicit purpose of increasing trust. And projects are currently underway to help military robots report their deficiencies to humans in given situations, and to alter their functionality according to the machine’s perception of the user’s emotional state.
  974. </p><p>
  975. At the DEVCOM
  976. <a href="https://www.army.mil/arl" target="_blank"><u>Army Research Laboratory</u></a>, military psychologists have spent more than a decade on human experiments related to trust in machines. Among the most prolific is <a href="https://www.linkedin.com/in/jessie-chen-83434212/" target="_blank"><u>Jessie Chen</u></a>, who joined the lab in 2003. Chen lives and breathes robotics—specifically “agent teaming” research, a field that examines how robots can be integrated into groups with humans. Her experiments test how humans’ lack of trust in robotic and autonomous systems can be overcome—or at least minimized.
  977. </p><p>
  978. For example, in
  979. <a href="https://www.eurekalert.org/news-releases/540762" target="_blank"><u>one set of tests</u></a>, Chen and her colleagues deployed a small ground robot called an Autonomous Squad Member that interacted and communicated with infantrymen. The researchers varied “situation-awareness-based agent transparency”—that is, the robot’s self-reported information about its plans, motivations, and predicted outcomes—and found that human trust in the robot increased when the autonomous “agent” was more transparent or honest about its intentions.
  980. </p><p>
  981. The Army isn’t the only branch of the armed services researching human trust in robots. The
  982. <a href="https://www.afrl.af.mil/" target="_blank"><u>U.S. Air Force Research Laboratory</u></a> recently had an entire group dedicated to the subject: the <a href="https://www.scientificamerican.com/article/the-air-force-wants-you-to-trust-robots-should-you/" target="_blank"><u>Human Trust and Interaction Branch</u></a>, part of the lab’s <a href="https://www.afrl.af.mil/711HPW/" target="_blank"><u>711th Human Performance Wing</u></a><u>,</u> located at Wright-Patterson Air Force Base, in Ohio.
  983. </p><p>
  984. In 2015, the Air Force began
  985. <a href="https://govtribe.com/opportunity/federal-contract-opportunity/trust-in-autonomy-for-human-machine-teaming-baaafrlrqkh20150008" target="_blank"><u>soliciting proposals</u></a> for “research on how to harness the socio-emotional elements of interpersonal team/trust dynamics and inject them into human-robot teams.” <a href="https://www.linkedin.com/in/mark-draper-b983838/" target="_blank"><u>Mark Draper</u></a>, a principal engineering research psychologist at the Air Force lab, is <a href="https://www.dvidshub.net/news/250385/creating-synthetic-teammates" target="_blank"><u>optimistic about the prospects</u></a> of human-machine teaming: “As autonomy becomes more trusted, as it becomes more capable, then the Airmen can start off-loading more decision-making capability on the autonomy, and autonomy can exercise increasingly important levels of decision-making.”
  986. </p><p>
  987. Air Force researchers are attempting to dissect the determinants of human trust. In one project, they
  988. <a href="https://www.sciencedirect.com/science/article/abs/pii/S0092656618300618" target="_blank"><u>examined the relationship</u></a> between a person’s personality profile (measured using the so-called <a href="https://www.simplypsychology.org/big-five-personality.html" target="_blank"><u>Big Five personality traits</u></a>: openness, conscientiousness, extraversion, agreeableness, neuroticism) and his or her tendency to trust. In another experiment, entitled “<a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6423898/" target="_blank"><u>Trusting Robocop</u></a><u>: Gender-Based Effects on Trust of an Autonomous Robot</u>,” Air Force scientists compared male and female research subjects’ levels of trust by showing them a video depicting a guard robot. The robot was armed with a Taser, interacted with people, and eventually used the Taser on one. Researchers designed the scenario to create uncertainty about whether the robot or the humans were to blame. By surveying research subjects, the scientists found that women reported higher levels of trust in “Robocop” than men.
  989. </p><p>
  990. The issue of trust in autonomous systems has even led the Air Force’s chief scientist to
  991. <a href="https://www.amazon.com/Autonomous-Horizons-Greg-L-Zacharias/dp/1077547854" target="_blank"><u>suggest ideas</u></a> for increasing human confidence in the machines, ranging from better android manners to robots that look more like people, under the principle that
  992. </p><blockquote>
  993. good HFE [human factors engineering] design should help support ease of interaction between humans and AS [autonomous systems]. For example, better “etiquette” often equates to better performance, causing a more seamless interaction. This occurs, for example, when an AS avoids interrupting its human teammate during a high workload situation or cues the human that it is about to interrupt—activities that, surprisingly, can improve performance independent of the actual reliability of the system. To an extent, anthropomorphism can also improve human-AS interaction, since people often trust agents endowed with more humanlike features…[but] anthropomorphism can also induce overtrust.
  994. </blockquote><div class="intro-text">
  995. <em><strong></strong></em>
  996. </div><p>
997. It’s impossible to know the degree to which the trust engineers will succeed in achieving their objectives. For decades, military trainers have prepared newly enlisted men and women to kill other people. If specialists have developed simple psychological techniques to overcome the soldier’s deeply ingrained aversion to destroying human life, is it possible that someday the warfighter might also be persuaded to unquestioningly place his or her trust in robots?
  998. <span class="ieee-end-mark"></span>
  999. </p>]]></description><pubDate>Sat, 27 Apr 2024 15:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/military-robots</guid><category>Military robots</category><category>Autonomous weapons</category><category>Trust</category><dc:creator>Roberto J. González</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/four-men-wearing-army-fatigues-and-carrying-weapons-and-tools-look-on-as-a-small-robot-with-treads-rolls-along-the-dusty-ground.jpg?id=52013412&amp;width=980"></media:content></item><item><title>Video Friday: RACER Heavy</title><link>https://spectrum.ieee.org/video-friday-racer-heavy</link><description><![CDATA[
  1000. <img src="https://spectrum.ieee.org/media-library/image.jpg?id=52109387&width=3845&height=3013&coordinates=1132%2C432%2C1071%2C579"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><h5><a href="https://cybathlon.ethz.ch/en/events/edition/cybathlon-2024">Cybathlon 2024</a>: 25–27 October 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="5t3-exhc5zs"><em>DARPA’s Robotic Autonomy in Complex Environments with Resiliency (RACER) program recently  conducted its fourth experiment (E4) to assess the performance of off-road unmanned vehicles. These tests, conducted in Texas in late 2023, were the first time the program tested its new vehicle, the RACER Heavy Platform (RHP). The video shows autonomous route following for mobility testing and demonstration, including sensor point cloud visualizations.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f4969a65948340180e7580ed53d451e1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5t3-eXHC5Zs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>The 12-ton RHP is significantly larger than the 2-ton RACER Fleet Vehicles (RFVs) already in use in the program. Using the algorithms on a very different platform helps RACER toward its goal of platform agnostic autonomy of combat-scale vehicles in complex, mission-relevant off-road environments that are significantly more unpredictable than on-road conditions.</em></blockquote><p>[ <a href="https://www.darpa.mil/news-events/2024-04-23">DARPA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="vjxqg2_85v0"><em>In our new </em><u>Science Robotics</u><em> <a href="https://www.science.org/doi/10.1126/scirobotics.adi9641" target="_blank">paper</a>, we introduce an autonomous navigation system developed for our wheeled-legged quadrupeds,  designed for fast and efficient navigation within large urban environments. 
Driven by neural network policies, our simple, unified control system enables smooth gait transitions, smart navigation planning, and highly responsive obstacle avoidance in populated urban environments.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b949d200aef151486b1fec093e6f6449" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vJXQG2_85V0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://junja94.github.io/learning_robust_autonomous_navigation_and_locomotion_for_wheeled_legged_robots/">Github</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="-hizp4uqvug"><em>Generation 7 of “Phoenix” robots include improved human-like range of motion. Improvements in uptime, visual perception, and tactile sensing increase the capability of the system to perform complex tasks over longer periods. Design iteration significantly decreases build time. The speed at which new tasks can be automated has increased 50x, marking a major inflection point in task automation speed.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="06bf90d6b4b20fa594bafdd2ad6511a9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/-HizP4UQvug?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sanctuary.ai/resources/news/sanctuary-ai-unveils-the-next-generation-of-ai-robotics/">Sanctuary AI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="pcpca5wbw5y"><em>We’re proud to celebrate our one millionth commercial delivery—that’s a million deliveries of lifesaving blood, critical vaccines, last-minute groceries, and so much more. But the best part? 
This is just the beginning.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="58af119afcc289be6c689f9f3b9893e4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pCPCa5WBW5Y?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.flyzipline.com/newsroom/stories/articles/why-a-million-deliveries-is-only-the-beginning">Zipline</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="uufjfxpeixg">Work those hips!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f0cfaa75da54e34ea58c76884c09077b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/uuFJfXpEIXg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.romela.org/">RoMeLa</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="9yfiedyk4em">This thing is kind of terrifying, and I’m fascinated by it.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="789156287e2107953ee9cc1aa7f43e85" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/9yfiEdyk4EM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://avfl.engr.tamu.edu/">AVFL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="6y4fpxx6axq"><em>We propose a novel humanoid TWIMP, which combines a human mimetic musculoskeletal upper limb with a two-wheel inverted pendulum. 
By combining the benefit of a musculoskeletal humanoid, which can achieve soft contact with the external environment, and the benefit of a two-wheel inverted pendulum with a small footprint and high mobility, we can easily investigate learning control systems in environments with contact and sudden impact.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="838445b9e207dca6d26e204bc7c98eef" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/6Y4FpXx6axQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>From Humanoids 2018.</p><p>[ <a href="https://arxiv.org/abs/2404.14080">Paper</a> ] via [ <a href="http://www.jsk.t.u-tokyo.ac.jp/">JSK Lab</a> ]</p><p>Thanks, Kento!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="impkqh37pdm">Ballbots are uniquely capable of pushing wheelchairs—arguably better than legged platforms, because they can move in any direction without having to reposition themselves.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="896e395ed2ec8f8f86c08d0d286e63e6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/IMpKqh37pDM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/html/2404.13206v1">Paper</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="zz2fp1y5z2e"><em>Charge Robotics is building robots that automate the most labor-intensive parts of solar construction. Solar has rapidly become the cheapest form of power generation in many regions. Demand has skyrocketed, and now the primary barrier to getting it installed is labor logistics and bandwidth. 
Our robots remove the labor bottleneck, allowing construction companies to meet the rising demand for solar, and enabling the world to switch to renewables faster.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e788d5961284f0aecfc9f74ead68ed9d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ZZ2fP1Y5Z2E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://chargerobotics.com/">Charge Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="3l6z3x5vscq">Robots doing precision assembly is cool and all, but those vibratory bowl sorters seem like magic.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5dca61606953cfab46d513ef25e8005a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/3l6Z3x5VScQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.linkedin.com/posts/fanuc-america-corporation_get-it-done-with-multi-axis-rotary-assembly-activity-7188150554034950144-swY8/">FANUC</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="r-boz0qtehm"><em>The QUT CGRAS project’s robot prototype captures images of baby corals, destined for the Great Barrier Reef, monitoring and counting them in grow tanks. The team uses state-of-the-art AI algorithms to automatically detect and count these coral babies and track their growth over time – saving human counting time and money.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b8e974d4299511be59932318a445dd42" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/r-bOZ0qteHM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.qut.edu.au/news?id=194074">QUT</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="l-4qzhzzwna"><em>We are conducting research to develop Unmanned Aerial Systems to aid in wildfire monitoring. 
The hazardous, dynamic, and visually degraded environment of wildfire gives rise to many unsolved fundamental research challenges.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1e916f9a3246d7644af12452ffffd572" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/l-4QzhzZWNA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.cs.cmu.edu/news/2024/wildfire-drones">CMU</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="izxogwos03e">Here’s a little more video of that robot elevator, but I’m wondering why it’s so slow—clamp those bots in there and rocket that elevator up and down!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b2a4a47a095369de2a4e759b78dce09e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/IzxOgwOs03E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://1784.navercorp.com/en/">NAVER</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="qrwy5pap2ja"><em>In March 2024, Northwestern University’s Center for Robotics and Biosystems demonstrated the Omnid mobile collaborative robots (mocobots) at MARS, a conference in Ojai, California on Machine learning, Automation, Robotics, and Space, hosted by Jeff Bezos. The “swarm” of mocobots is designed to collaborate with humans, allowing a human to easily manipulate large, heavy, or awkward payloads. In this case, the mocobots cancel the effect of gravity, so the human can easily manipulate the mock airplane wing in six degrees of freedom. 
In general, human-cobot systems combine the best of human capabilities with the best of robot capabilities.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="50da450a79a735d0e2ca896560617da8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/qRwy5Pap2jA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.mccormick.northwestern.edu/news/articles/2024/04/northwestern-robotics-team-demonstrates-mocobots-to-tech-leaders/">Northwestern</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="lsajhsihyp4">There’s something so soothing about watching a lithium battery get wrecked and burn for 8 minutes.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d5dc358a1f90a160cb46fed36b13ab53" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/lSAjHsIHyP4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://hardcorerobotics.com/robots.html">Hardcore Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="rxzsjx3jnr8"><em>EELS, or Exobiology Extant Life Surveyor, is a versatile, snake-like robot designed for exploration of previously inaccessible terrain. This talk on EELS was presented at the 2024 Amazon MARS conference.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="95830c869c137407db0f9a8804ea1a25" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/rxzsJX3jNR8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.jpl.nasa.gov/robotics-at-jpl/eels">JPL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="qoczyrxl0aq"><em>The convergence of AI and robotics will unlock a wonderful new world of possibilities in everyday life, says robotics and AI pioneer Daniela Rus. 
Diving into the way machines think, she reveals how “liquid networks”—a revolutionary class of AI that mimics the neural processes of simple organisms—could help intelligent machines process information more efficiently and give rise to “physical intelligence” that will enable AI to operate beyond digital confines and engage dynamically in the real world.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="26ecdc2404eab4e7b34dda11feb3675e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/QOCZYRXL0AQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ted.com/talks/daniela_rus_how_ai_will_step_off_the_screen_and_into_the_real_world">TED</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 26 Apr 2024 15:23:34 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-racer-heavy</guid><category>Video friday</category><category>Robotics</category><category>Quadruped robots</category><category>Humanoid robots</category><category>Drone delivery</category><category>Uav</category><category>Robotic arm</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/image.jpg?id=52109387&amp;width=980"></media:content></item></channel></rss>
