Congratulations!

[Valid RSS] This is a valid RSS feed.

Recommendations

This feed is valid, but interoperability with the widest range of feed readers could be improved by implementing the following recommendations.
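As a quick check of your own, the feed can also be fetched and parsed programmatically. The following is a minimal sketch, assuming Python 3 with the third-party feedparser package installed (pip install feedparser); the URL is the one listed under Source below.

    import feedparser

    FEED_URL = "https://feeds.feedburner.com/IeeeSpectrumRobotics?format=xml"

    # Fetch and parse the feed; feedparser tolerates most real-world RSS quirks.
    d = feedparser.parse(FEED_URL)

    # bozo is set when the XML is not well formed; a valid feed should parse cleanly.
    if d.bozo:
        print("Feed failed to parse:", d.bozo_exception)
    else:
        print("Channel:", d.feed.title)  # "IEEE Spectrum", per the <title> element below
        for entry in d.entries:
            # Each <item> exposes its title, link, and publication date (when present).
            print(entry.title, "|", entry.link, "|", entry.get("published", "n/a"))

The channel and item fields read out here correspond directly to the <title>, <link>, and <pubDate> elements visible in the source listing that follows.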

Source: https://feeds.feedburner.com/IeeeSpectrumRobotics?format=xml

  1. <?xml version="1.0" encoding="utf-8"?>
  2. <rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:media="http://search.yahoo.com/mrss/"><channel><title>IEEE Spectrum</title><link>https://spectrum.ieee.org/</link><description>IEEE Spectrum</description><atom:link href="https://spectrum.ieee.org/feeds/topic/robotics.rss" rel="self"></atom:link><language>en-us</language><lastBuildDate>Sat, 17 Feb 2024 17:05:05 -0000</lastBuildDate><image><url>https://spectrum.ieee.org/media-library/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8yNjg4NDUyMC9vcmlnaW4ucG5nIiwiZXhwaXJlc19hdCI6MTc2MzA3MTQzOX0.SxRBIud_XE2YWQFaIJD9BPB1w-3JsFhiRkJIIe9Yq-g/image.png?width=210</url><link>https://spectrum.ieee.org/</link><title>IEEE Spectrum</title></image><item><title>Video Friday: Acrobot Error</title><link>https://spectrum.ieee.org/video-friday-acrobot-error</link><description><![CDATA[
  3. <img src="https://spectrum.ieee.org/media-library/image.gif?id=51482626&width=1245&height=700&coordinates=0%2C333%2C0%2C333"/><br/><br/><p>
  4. Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/>
  5. </p><h5><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 2 February 2024, ZURICH</h5><h5><a href="https://humanrobotinteraction.org/2024/">HRI 2024</a>: 11–15 March 2024, BOULDER, COLO.</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><p>
  6. Enjoy today’s videos!
  7. </p><div class="horizontal-rule">
  8. </div><div style="page-break-after: always">
  9. <span style="display:none"> </span>
  10. </div><p class="rm-anchors" id="gt6olyyd3n4">
  11. Just like a real human, Acrobot will sometimes kick you in the face.
  12. </p><p class="shortcode-media shortcode-media-youtube">
  13. <span class="rm-shortcode" data-rm-shortcode-id="3334739d564ceaedf148f129005e8a0b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/gT6oLYYd3n4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  14. </p><p>
  15. [ <a href="https://danielsimu.com/acrobotics/">Acrobotics</a> ]
  16. </p><p>
  17. Thanks, Elizabeth!
  18. </p><div class="horizontal-rule">
  19. </div><p class="rm-anchors" id="21f7iof9bms">
  20. You had me at “wormlike, limbless robots.”
  21. </p><p class="shortcode-media shortcode-media-youtube">
  22. <span class="rm-shortcode" data-rm-shortcode-id="04804c773eebe3ee6e0c340dd60f03e7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/21F7IOF9BMs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  23. </p><p>
  24. [ <a href="https://ty-wang.github.io/MechIntelligence/">GitHub</a> ] via [ <a href="https://theconversation.com/we-designed-wormlike-limbless-robots-that-navigate-obstacle-courses-they-could-be-used-for-search-and-rescue-one-day-220828">Georgia Tech</a> ]
  25. </p><div class="horizontal-rule">
  26. </div><blockquote class="rm-anchors" id="raprq2lyeze">
  27. <em>Filmed in July 2017, this video shows us using Atlas to put out a “fire” on our loading dock. This uses a combination of teleoperation and autonomous behaviors through a single, remote computer. Robot built by Boston Dynamics for the DARPA Robotics Challenge in 2013. Software by IHMC Robotics.</em>
  28. </blockquote><p class="shortcode-media shortcode-media-youtube">
  29. <span class="rm-shortcode" data-rm-shortcode-id="f8c7f495f9b7554dce027b26086c7489" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Raprq2LyEZE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  30. </p><p>
  31. I would say that in the middle of a rainstorm is probably the best time to start a fire that you expect to be extinguished by a robot.
  32. </p><p>
  33. [ <a href="https://robots.ihmc.us/">IHMC</a> ]
  34. </p><div class="horizontal-rule">
  35. </div><blockquote class="rm-anchors" id="mwn5kjwenas">
  36. <em>We’re hard at work, but Atlas still has time for a dance break.</em>
  37. </blockquote><p class="shortcode-media shortcode-media-youtube">
  38. <span class="rm-shortcode" data-rm-shortcode-id="5749d903e772bcfae0b530b6af31be9d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/mWn5KjWeNas?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  39. </p><p>
  40. [ <a href="https://bostondynamics.com/atlas/">Boston Dynamics</a> ]
  41. </p><div class="horizontal-rule">
  42. </div><p class="rm-anchors" id="0rrvoakwdgo">
  43. This is pretty cool: BruBotics is testing its self-healing robotics gripper technology on commercial grippers from Festo.
  44. </p><p class="shortcode-media shortcode-media-youtube">
  45. <span class="rm-shortcode" data-rm-shortcode-id="d8a949e5d9970c60ca13540515389cf8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0RrVoakWdgo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  46. </p><p>
  47. [ <a href="https://ieeexplore.ieee.org/document/10423895">Paper</a> ] via [ <a href="https://www.brubotics.eu/">BruBotics</a> ]
  48. </p><p>
  49. Thanks, Bram!
  50. </p><div class="horizontal-rule">
  51. </div><p class="rm-anchors" id="srzxqe-hlc0">
  52. You should read <a href="https://spectrum.ieee.org/hello-robot-stretch-3" target="_blank">our in-depth article on Stretch 3</a>, so if you haven’t yet, consider this just a teaser.
  53. </p><p class="shortcode-media shortcode-media-youtube">
  54. <span class="rm-shortcode" data-rm-shortcode-id="d85e7e2468a6a24490a25b99b6217cc7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/srzXqE-hlc0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  55. </p><p>
  56. [ <a href="https://hello-robot.com/">Hello Robot</a> ]
  57. </p><div class="horizontal-rule">
  58. </div><blockquote class="rm-anchors" id="0_cjqzmzis4">
  59. <em>Inspired by caregiving experts, we proposed a bimanual interactive robotic dressing assistance scheme, which is unprecedented in previous research. In the scheme, an interactive robot joins hands with the human thus supporting/guiding the human in the dressing process, while the dressing robot performs the dressing task. This work represents a paradigm shift of thinking of the dressing assistance task from one-robot-to-one-arm to two-robot-to-one-arm.</em>
  60. </blockquote><p class="shortcode-media shortcode-media-youtube">
  61. <span class="rm-shortcode" data-rm-shortcode-id="07384f315b1bd00d9ef9652fdb1626cc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0_cJqZmZIS4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  62. </p><p>
  63. [ <a href="https://sites.google.com/view/bimanualassitdressing/home">Project</a> ]
  64. </p><p>
  65. Thanks, Jihong!
  66. </p><div class="horizontal-rule">
  67. </div><p class="rm-anchors" id="tne9nvodu0s">
  68. Tony Punnoose Valayil from the Bulgarian Academy of Sciences Institute of Robotics wrote in to share some very low-cost hand-rehabilitation robots for home use.
  69. </p><p class="shortcode-media shortcode-media-youtube">
  70. <span class="rm-shortcode" data-rm-shortcode-id="1b7158e26042e082bc1c7a30d2c76489" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Tne9NvoDU0s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  71. </p><blockquote>
  72. In this video, we present a robot-assisted rehabilitation of the wrist joint which can aid in restoring the strength that has been lost across the upper limb due to stroke. This robot is very cost-effective and can be used for home rehabilitation.
  73. </blockquote><p class="shortcode-media shortcode-media-youtube">
  74. <span class="rm-shortcode" data-rm-shortcode-id="449f853db93fd4f1b48345d2f029d44e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Flyx2x1G-cM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  75. </p><blockquote>
  76. In this video, we present an exoskeleton robot which can be used at home for rehabilitating the index and middle fingers of stroke-affected patients. This robot is built at a cost of 50 euros for patients who are not financially independent to get better treatment.</blockquote><p>
  77. [ <a href="http://www.ir.bas.bg/index_en.html">BAS</a> ]
  78. </p><div class="horizontal-rule">
  79. </div><p class="rm-anchors" id="0djr8fetpq0">
  80. Some very impressive work here from the Norwegian University of Science and Technology (<a href="https://en.wikipedia.org/wiki/Norwegian_University_of_Science_and_Technology" target="_blank">NTNU</a>), showing a drone tracking its position using radar and lidar-based odometry in some nightmare (for robots) environments, including a long tunnel that looks the same everywhere and a hallway full of smoke.
  81. </p><p class="shortcode-media shortcode-media-youtube">
  82. <span class="rm-shortcode" data-rm-shortcode-id="bc5a0356ca298331d7172e68486a35e3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0dJr8fETpQ0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  83. </p><p>
  84. [ <a href="https://arxiv.org/abs/2310.16658">Paper</a> ] via [ <a href="https://github.com/ntnu-arl/lidar_degeneracy_datasets">GitHub</a> ]
  85. </p><div class="horizontal-rule">
  86. </div><p class="rm-anchors" id="5u-bszrx-t4">
  87. I’m sorry, but people should really know better than to make videos like this for social robot crowdfunding by now.
  88. </p><p class="shortcode-media shortcode-media-youtube">
  89. <span class="rm-shortcode" data-rm-shortcode-id="9decd93a25ef88830252bedd935e4278" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5U-bSZRx-t4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  90. </p><p>
  91. It’s on Kickstarter for about $300, and the fact that it’s been funded so quickly tells me that people have already forgotten about the social robotpocalypse.
  92. </p><p>
  93. [ <a href="https://www.kickstarter.com/projects/doly/doly-more-than-a-robot">Kickstarter</a> ]
  94. </p><div class="horizontal-rule">
  95. </div><blockquote class="rm-anchors" id="vvpgsd9jsw0">
  96. <em>Introducing Orbit, your portal for managing asset-intensive facilities through real-time and predictive intelligence. Orbit brings a whole new suite of fleet management capabilities and will unify your ecosystem of Boston Dynamics robots, starting with Spot.</em>
  97. </blockquote><p class="shortcode-media shortcode-media-youtube">
  98. <span class="rm-shortcode" data-rm-shortcode-id="78ca765204cebb9f5913cacd8dd8f0e7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/VVpgsd9Jsw0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  99. </p><p>
  100. [ <a href="https://bostondynamics.com/blog/robot-fleet-management-lifts-off-with-spot/">Boston Dynamics</a> ]
  101. </p><div class="horizontal-rule">
  102. </div>]]></description><pubDate>Fri, 16 Feb 2024 15:31:44 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-acrobot-error</guid><category>Atlas</category><category>Boston dynamics</category><category>Darpa robotics challenge</category><category>Georgia tech</category><category>Hello robot</category><category>Kickstarter</category><category>Robotics</category><category>Video friday</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/image.gif?id=51482626&amp;width=980"></media:content></item><item><title>Stretch 3 Brings Us Closer to Realistic Home Robots</title><link>https://spectrum.ieee.org/hello-robot-stretch-3</link><description><![CDATA[
  103. <img src="https://spectrum.ieee.org/media-library/a-picture-of-a-tall-skinny-robot-with-a-two-wheeled-base-and-a-single-arm-with-a-flexible-wrist-on-the-end.png?id=51473667&width=3001&height=2307&coordinates=417%2C0%2C422%2C93"/><br/><br/><p>A lot has happened in robotics over the last year. Everyone is wondering how AI will transform robotics, and everyone else is wondering whether humanoids are going to blow it or not, and the rest of us are busy trying not to get completely run over as things shake out however they’re going to shake out. </p><p>Meanwhile, over at <a href="https://spectrum.ieee.org/hello-robots-stretch-mobile-manipulator" target="_blank">Hello Robot</a>, they’ve been focused on making their Stretch robot do useful things while also being affordable and reliable and affordable and expandable and affordable and community-friendly and affordable. Which are some really hard and important problems that can sometimes get overwhelmed by flashier things.</p><p>Today, <a href="https://hello-robot.com/stretch-3-product" rel="noopener noreferrer" target="_blank">Hello Robot is announcing Stretch 3</a>, which provides a suite of upgrades to what they (quite accurately) call “the world’s only lightweight, capable, developer-friendly mobile manipulator.” And impressively, they’ve managed to do it without forgetting about that whole “affordable” part.</p><hr/><p class="shortcode-media shortcode-media-youtube">
  104. <span class="rm-shortcode" data-rm-shortcode-id="314bde5772f929cd623891843c2b8c9c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/haauI2x2U1E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  105. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Hello Robot</small></p><p>Stretch 3 looks about the same as the previous versions, but there are important upgrades that are worth highlighting. The most impactful: Stretch 3 now comes with the dexterous wrist kit that used to be an add-on, and it now also includes an <a href="https://www.intelrealsense.com/depth-camera-d405/" target="_blank">Intel Realsense D405 camera</a> mounted right behind the gripper, which is a huge help for both autonomy and remote teleoperation—a useful new feature shipping with Stretch 3 that’s based on research out of <a href="https://hcrlab.cs.washington.edu/" target="_blank">Maya Cakmak’s lab</a> at the University of Washington, in Seattle. This is an example of turning innovation from the community of Stretch users into product features, a product-development approach that seems to be working well for Hello Robot.<strong></strong></p><p>“We’ve really been learning from our community,” says Hello Robot cofounder and CEO Aaron Edsinger. “In the past year, we’ve seen a real uptick in publications, and it feels like we’re getting to this critical-mass moment with Stretch. So with Stretch 3, it’s about implementing features that our community has been asking us for.”<strong></strong></p><p>“When we launched, we didn’t have a dexterous wrist at the end as standard, because we were trying to start with truly the minimum viable product,” says Hello Robot cofounder and CTO Charlie Kemp. “And what we found is that almost every order was adding the dexterous wrist, and by actually having it come in standard, we’ve been able to devote more attention to it and make it a much more robust and capable system.”</p><p>Kemp says that having Stretch do everything right out of the box (with Hello Robot support) makes a big difference for their research customers. “Making it easier for people to try things—we’ve learned to really value that, because the more steps that people have to go through to experience it, the less likely they are to build on it.” In a research context, this is important because what you’re really talking about is time: The more time people spend just trying to make the robot function, the less time they’ll spend getting the robot to do useful things.</p><p class="shortcode-media shortcode-media-youtube">
  106. <span class="rm-shortcode" data-rm-shortcode-id="048a340e86bbefb9a6ea9581c8db4f9d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Rxru8t1x1hg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  107. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Hello Robot</small></p><p>At this point, you may be thinking of Stretch as a research platform. Or you may be thinking of Stretch as a robot for people with disabilities, if you read our <a href="https://spectrum.ieee.org/stretch-assistive-robot" target="_self">November 2023 cover story about Stretch and Henry and Jane Evans</a>. And the robot is definitely both of those things. But Hello Robot stresses that these specific markets are not their end goal—they see Stretch as a generalist mobile manipulator with a future in the home, as suggested by this Stretch 3 promo video:</p><p class="shortcode-media shortcode-media-youtube">
  108. <span class="rm-shortcode" data-rm-shortcode-id="0a6194e086c360103ffb0e1e58aad9ff" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Ni4p8axgqHM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  109. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Hello Robot</small></p><p>Dishes, laundry, bubble cannons: All of these are critical to the functionality of any normal household. “Stretch is an inclusive robot,” says Kemp. “It’s not just for older adults or people with disabilities. We want a robot that can be beneficial for everyone. Our vision, and what we believe will really happen, whether it’s us or someone else, is that there is going to be a versatile, general-purpose home robot. Right now, clearly, our market is not yet consumers in the home. But that’s where we want to go.”</p><p>Robots in the home have been promised for decades, and with the notable exception of the <a data-linked-post="2665973683" href="https://spectrum.ieee.org/south-pole-roombas" target="_blank">Roomba</a>, there has not been a lot of success. The idea of a robot that could handle dishes or laundry is tempting, but is it near-term or medium-term realistic? Edsinger, who has been at this whole robots thing for <a href="https://www.linkedin.com/in/aaron-edsinger/" rel="noopener noreferrer" target="_blank">a very long time</a>, is an optimist about this, and about the role that Stretch will play. “There are so many places where you can see the progress happening—in sensing, in manipulation,” Edsinger says. “I can imagine those things coming together now in a way that I could not have 5 to 10 years ago, when it seemed so incredibly hard.”</p><p class="pull-quote">“We’re very pragmatic about what is possible. And I think that we do believe that things are changing faster than we anticipated—10 years ago, I had a pretty clear linear path in mind for robotics, but it’s hard to really imagine where we’ll be in terms of robot capabilities 10 years from now.” <strong>—Aaron Edsinger, Hello Robot</strong></p><p>I’d say that it’s <em>still</em> incredibly hard, but Edsinger is right that a lot of the pieces do seem to be coming together. Arguably, the hardware is the biggest challenge here, because working in a home puts heavy constraints on what kind of hardware you’re able to use. You’re not likely to see a humanoid in a home anytime soon, because they’d actually be dangerous, and even a quadruped is likely to be more trouble than it’s worth in a home environment. Hello Robot is conscious of this, and that’s been one of the main drivers of the design of Stretch.</p><p>“I think the portability of Stretch is really worth highlighting because there’s just so much value in that which is maybe not obvious,” Edsinger tells us. Being able to just pick up and move a mobile manipulator is not normal. Stretch’s weight (24.5 kilograms) is almost trivial to work with, in sharp contrast with virtually every other mobile robot with an arm: Stretch fits into places that humans fit into, and manages to have a similar workspace as well, and its bottom-heavy design makes it safe for humans to be around. It can’t climb stairs, but it can be carried upstairs, which is a bigger deal than it may seem. It’ll fit in the back of a car, too. Stretch is built to explore the world—not just some facsimile of the world in a research lab.</p><p>“<a href="https://dobb-e.com/" rel="noopener noreferrer" target="_blank">NYU students</a> have been taking Stretch into tens of homes around New York,” says Edsinger. “They carried one up a four-story walk-up. This enables real in-home data collection. 
And this is where home robots will start to happen—when you can have hundreds of these out there in homes collecting data for machine learning.”</p><p>“That’s where the opportunity is,” adds Kemp. “It’s that engagement with the world about where to apply the technology beneficially. And if you’re in a lab, you’re not going to find it.”</p><p>We’ve seen some compelling examples of this recently, with <a href="https://mobile-aloha.github.io/" rel="noopener noreferrer" target="_blank">Mobile ALOHA</a>. These are robots learning to be autonomous by having humans teleoperate them through common household skills. But the system isn’t particularly portable, and <a href="https://mobile-aloha.github.io/" rel="noopener noreferrer" target="_blank">it costs nearly US $32,000 in parts alone</a>. Don’t get me wrong: I love the research. It’s just going to be difficult to scale, and <a href="https://spectrum.ieee.org/global-robotic-brain" target="_self">in order to collect enough data to effectively tackle the world, scale is critical</a>. Stretch is much easier to scale, because you can just straight up buy one.</p><p>Or two! You may have noticed that some of the Stretch 3 videos have two robots in them, collaborating with each other. This is not yet autonomous, but with two robots, a single human (or a pair of humans) can teleoperate them as if they were effectively a single two-armed robot:</p><p class="shortcode-media shortcode-media-youtube">
  110. <span class="rm-shortcode" data-rm-shortcode-id="f2c94e0759331e3ec652133f40b5f5c2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/QtG8nJ78x2M?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  111. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Hello Robot</small></p><p>Essentially, what you’ve got here is a two-armed robot that (very intentionally) has nothing to do with humanoids. As Kemp explains: “We’re trying to help our community and the world see that there is a different path from the human model. We humans tend to think of the preexisting solution: People have two arms, so we think, well, I’m going to need to have two arms on my robot or it’s going to have all these issues.” Kemp points out that robots like Stretch have shown that really quite a lot of things can be done with only one arm, but two arms can still be helpful for a substantial subset of common tasks. “The challenge for us, which I had just never been able to find a solution for, was how you get two arms into a portable, compact, affordable lightweight mobile manipulator. You can’t!”</p><p>But with two Stretches, you have not only two arms but also two shoulders that you can put wherever you want. Washing a dish? You’ll probably want two arms close together for collaborative manipulation. Making a bed? Put the two arms far apart to handle both sides of a sheet at once. It’s a sort of distributed on-demand bimanual manipulation, which certainly adds a little bit of complexity but also solves a bunch of problems when it comes to practical in-home manipulation. Oh—and if those teleop tools look like modified kitchen tongs, that’s because <a href="https://github.com/hello-robot/stretch_dex_teleop" target="_blank">they’re modified kitchen tongs</a>.</p><p>Of course, buying two Stretch robots is twice as expensive as buying a single Stretch robot, and even though Stretch 3’s cost of just under $25,000 is very inexpensive for a mobile manipulator and very affordable in a research or education context, we’re still pretty far from something that most people would be able to afford for themselves. Hello Robot says that producing robots at scale is the answer here, which I’m sure is true, but it can be a difficult thing for a small company to achieve. </p><p>Moving slowly toward scale is at least partly intentional, Kemp tells us. “We’re still in the process of discovering Stretch’s true form—what the robot really should be. If we tried to scale to make lots and lots of robots at a much lower cost before we fundamentally understood what the needs and challenges were going to be, I think it would be a mistake. And there are many gravestones out there for various home-robotics companies, some of which I truly loved. We don’t want to become one of those.”</p><p>This is not to say that Hello Robot isn’t actively trying to make Stretch more affordable, and Edsinger suggests that the next iteration of the robot will be more focused on that. But—and this is super important—Kemp tells us that Stretch has been, is, and will continue to be sustainable for Hello Robot: “We actually charge what we should be charging to be able to have a sustainable business.” In other words, Hello Robot is not relying on some nebulous scale-defined future to transition into a business model that can develop, sell, and support robots. They can do that right now while keeping the lights on. “Our sales have enough margin to make our business work,” says Kemp. 
“That’s part of our discipline.”</p><a href="https://hello-robot.com/stretch-3-product" rel="noopener noreferrer" target="_blank">Stretch 3 is available now for $24,950</a>, which is just about the same as the cost of Stretch 2 with the optional add-ons included. There are lots and lots of other new features that we couldn’t squeeze into this article, including FCC certification, a more durable arm, and off-board GPU support. You’ll find a handy list of all the upgrades <a href="https://hello-robot.com/stretch-3-whats-new" rel="noopener noreferrer" target="_blank">here</a>.]]></description><pubDate>Thu, 15 Feb 2024 17:28:45 +0000</pubDate><guid>https://spectrum.ieee.org/hello-robot-stretch-3</guid><category>Hello robot</category><category>Home robots</category><category>Mobile manipulator</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-picture-of-a-tall-skinny-robot-with-a-two-wheeled-base-and-a-single-arm-with-a-flexible-wrist-on-the-end.png?id=51473667&amp;width=980"></media:content></item><item><title>What It’s Like to Eat a Robot</title><link>https://spectrum.ieee.org/edible-robots-2667244222</link><description><![CDATA[
  112. <img src="https://spectrum.ieee.org/media-library/a-woman-puts-a-cup-holding-a-wiggly-beige-cylinder-into-her-mouth-and-bites-off-the-top-of-the-cylinder.gif?id=51445183&width=1245&height=700&coordinates=0%2C9%2C0%2C9"/><br/><br/><p><em>Odorigui</em> is a type of Japanese cuisine in which people consume live seafood while it’s still moving, making movement part of the experience. You may have some feelings about this (I definitely do), but from a research perspective, getting into what those feelings are and what they mean isn’t really practical. To do so in a controlled way would be both morally and technically complicated, which is why <a href="https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0296697" rel="noopener noreferrer" target="_blank">Japanese researchers have started developing robots that can be eaten as they move</a>, wriggling around in your mouth as you chomp down on them. Welcome to HERI: Human-Edible Robot Interaction.</p><hr/><p class="shortcode-media shortcode-media-youtube">
  113. <span class="rm-shortcode" data-rm-shortcode-id="0bd98a907f7c1e664360a4b6c98c1f72" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/OoAszrv5vy4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  114. </p><p>The happy little robot that got its head ripped off by a hungry human (who, we have to say, was exceptionally polite about it) is made primarily of gelatin, along with sugar and apple juice for taste. After all the ingredients were mixed, it was poured into a mold and refrigerated for 12 hours to set, with the resulting texture ending up like a chewy gummy candy. The mold incorporated a couple of air chambers into the structure of the robot, which were hooked up to pneumatics that got the robot to wiggle back and forth.</p><p>Sixteen students at Osaka University got the chance to eat one of these wiggly little robots. The process was to put your mouth around the robot, let the robot move around in there for 10 seconds for the full experience, and then bite it off, chew, and swallow. Japanese people were chosen partly because this research was done in Japan, but also because, according to the paper, “of the cultural influences on the use of onomatopoeic terms.” In Japanese, there are terms that are useful in communicating specific kinds of textures that can’t easily be quantified. </p><p>The participants were  asked a series of questions about their experience, including some heavy ones:</p><ul><li>Did you think what you just ate had animateness?</li><li>Did you feel an emotion in what you just ate?</li><li>Did you think what you just ate had intelligence?</li><li>Did you feel guilty about what you just ate?</li></ul><p>Oof.</p><p>Compared with a control group of students who ate the robot when it was <em>not</em> moving, the students who ate the <em>moving</em> robot were more likely to interpret it as having a “munya-munya” or “mumbly” texture, showing that movement can influence the eating experience. Analysis of question responses showed that the moving robot also caused people to perceive it as emotive and intelligent, and caused more feelings of guilt when it was consumed. The paper summarizes it pretty well: “In the stationary condition, participants perceived the robot as ‘food,’ whereas in the movement condition, they perceived it as a ‘creature.’”</p><p>The good news here is that since these robots are more like living things than nonrobots, they could potentially stand in for live critters eaten in a research context, say the researchers: “The utilization of edible robots in this study enabled us to examine the effects of subtle movement variations in human eating behavior under controlled conditions, a task that would be challenging to accomplish with real organisms.” There’s still more work to do to make the robots more like specific living things, but that’s the plan going forward:</p><blockquote>Our proposed edible robot design does not specifically mimic any particular biological form. To address these limitations, we will focus on the field by designing edible robots that imitate forms relevant to ongoing discussions on food shortages and cultural delicacies. 
Specifically, in future studies, we will emulate creatures consumed in contexts such as insect-based diets, which are being considered as a solution to food scarcity issues, and traditional Japanese dishes like “Odorigui” or “Ikizukuri (live fish sashimi).” These imitations are expected to provide deep insights into the psychological and cognitive responses elicited when consuming moving robots, merging technology with necessities and culinary traditions.</blockquote><p>“Exploring the Eating Experience of a Pneumatically Driven Edible Robot: Perception, Taste, and Texture,” by Yoshihiro NakataI, Midori Ban, Ren Yamaki, Kazuya Horibe, Hideyuki Takahashi, and Hiroshi Ishiguro from the University of Electro-Communications and Osaka University, was published in <a href="https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0296697" rel="noopener noreferrer" target="_blank"><em>PLOS One</em></a>.</p>]]></description><pubDate>Tue, 13 Feb 2024 14:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/edible-robots-2667244222</guid><category>Edible robots</category><category>Hiroshi ishiguro</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/a-woman-puts-a-cup-holding-a-wiggly-beige-cylinder-into-her-mouth-and-bites-off-the-top-of-the-cylinder.gif?id=51445183&amp;width=980"></media:content></item><item><title>Everything You Wanted to Know About 1X’s Latest Video</title><link>https://spectrum.ieee.org/1x-robotics-video</link><description><![CDATA[
  115. <img src="https://spectrum.ieee.org/media-library/two-humanoid-robots-stand-at-a-workbenches-with-two-humanoid-robots-in-the-background.png?id=51444298&width=1245&height=700&coordinates=0%2C0%2C0%2C0"/><br/><br/><p>Just last month, Oslo-based 1X (formerly Halodi Robotics) announced a massive US $100 million Series B, and clearly it has been putting the work in. <a href="https://www.youtube.com/watch?v=iHXuU3nTXfQ" target="_blank">A new video posted last week</a> shows a [insert <a href="https://en.wikipedia.org/wiki/Collective_noun" target="_blank">collective noun</a> for humanoid robots here] of EVE android-ish mobile manipulators doing <a href="https://www.1x.tech/discover/all-neural-networks-all-autonomous-all-1x-speed" target="_blank">a wide variety of tasks leveraging end-to-end neural networks</a> (pixels to actions). And best of all, the video seems to be more or less an honest one: a single take, at (appropriately) 1X speed, and full autonomy. But we still had questions! And 1X has answers.</p><hr/><p class="shortcode-media shortcode-media-youtube">
  116. <span class="rm-shortcode" data-rm-shortcode-id="54b7d5496c08caeceabdced673f123ed" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/iHXuU3nTXfQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  117. </p><p>If, like me, you had some very important questions after watching this video, including whether that plant is actually dead and the fate of the weighted companion cube, you’ll want to read this Q&A with <a href="https://twitter.com/ericjang11" target="_blank">Eric Jang</a>, vice president of artificial intelligence at 1X.</p><p><strong>How many takes did it take to get this take?</strong></p><p><strong>Eric Jang: </strong>About 10 takes that lasted more than a minute; this was our first time doing a video like this, so it was more about learning how to coordinate the film crew and set up the shoot to look impressive.</p><p><strong>Did you train your robots specifically on floppy things and transparent things?</strong></p><p><strong>Jang: </strong>Nope! We train our neural network to pick up all kinds of objects—both rigid and deformable and transparent things. Because we train manipulation end-to-end from pixels, picking up deformables and transparent objects is much easier than a classical grasping pipeline, where you have to figure out the exact geometry of what you are trying to grasp.</p><p><strong>What keeps your robots from doing these tasks faster?</strong></p><p><strong>Jang: </strong>Our robots learn from demonstrations, so they go at exactly the same speed the human teleoperators demonstrate the task at. If we gathered demonstrations where we move faster, so would the robots.</p><p><strong>How many <a href="https://theportalwiki.com/wiki/Weighted_Companion_Cube" rel="noopener noreferrer" target="_blank">weighted companion cubes</a> were harmed in the making of this video?</strong></p><p><strong>Jang: </strong>At 1X, weighted companion cubes do not have rights.</p><p><strong>That’s a very cool method for charging, but it seems a lot more complicated than some kind of drive-on interface directly with the base. Why use manipulation instead?</strong></p><p><strong>Jang: </strong>You’re right that this isn’t the simplest way to charge the robot, but if we are going to succeed at our mission to build generally capable and reliable robots that can manipulate all kinds of objects, our neural nets have to be able to do this task at the very least. Plus, it reduces costs quite a bit and simplifies the system!</p><p><strong>What animal is that blue plush supposed to be?</strong></p><p><strong>Jang: </strong>It’s an obese shark, I think.</p><p><strong>How many different robots are in this video?</strong></p><p><strong>Jang: </strong>17? And more that are stationary.</p><p><strong>How do you tell the robots apart?</strong></p><p><strong>Jang: </strong>They have little numbers printed on the base.</p><p><strong>Is that plant dead?</strong></p><p><strong>Jang: </strong>Yes, we put it there because no CGI/3D-rendered video would ever go through the trouble of adding a dead plant.</p><p><strong>What sort of existential crisis is the robot at the window having?</strong></p><p> <strong>Jang: </strong>It was <a href="https://twitter.com/ericjang11/status/1756138725363662948" target="_blank">supposed to be</a> opening and closing the window repeatedly (good for testing statistical significance).</p><p><strong>If one of the robots was actually a human in a helmet and a suit holding grippers and standing on a mobile base, would I be able to tell?</strong></p><p><strong>Jang: </strong>I was super flattered by this comment on the Youtube video:</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  118. <img alt="" class="rm-shortcode" data-rm-shortcode-id="4c4d3251d9be31a68149078248a7e2b9" data-rm-shortcode-name="rebelmouse-image" id="0d42f" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=51444288&width=980"/>
  119. </p><p>But if you look at the area where the upper arm tapers at the shoulder, it’s too thin for a human to fit inside while still having such broad shoulders:</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  120. <img alt="" class="rm-shortcode" data-rm-shortcode-id="e86ba19a40913983d7eb196d36c75b9a" data-rm-shortcode-name="rebelmouse-image" id="35356" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=51444290&width=980"/>
  121. </p><p><strong>Why are your robots so happy all the time? Are you planning to do more complex HRI (human-robot interaction) stuff with their faces?</strong></p><p><strong>Jang: </strong>Yes, more complex HRI stuff is in the pipeline! </p><p><strong>Are your robots able to autonomously collaborate with each other?</strong></p><p><strong>Jang: </strong>Stay tuned! </p><p><strong>Is the <a href="https://mathworld.wolfram.com/SkewPolyomino.html" target="_blank">skew tetromino</a><a href="https://en.wikipedia.org/wiki/Tetromino" target="_blank"></a> the most difficult <a href="https://en.wikipedia.org/wiki/Tetromino" target="_blank">tetromino</a> for robotic manipulation?</strong></p><p><strong>Jang: </strong>Good catch! Yes, the green one is the worst of them all because there are many valid ways to pinch it with the gripper and lift it up. In robotic learning, if there are multiple ways to pick something up, it can actually confuse the machine learning model. Kind of like asking a car to turn left and right at the same time to avoid a tree. </p><p><strong>Everyone else’s robots are making coffee. Can your robots make coffee?</strong></p><p><strong>Jang: </strong><a href="https://x.com/ericjang11/status/1755666730326823168?s=20" target="_blank">Yep!</a> We were planning to throw in some coffee making on this video as an Easter egg, but the coffee machine broke right before the film shoot and it turns out it’s impossible to get a Keurig K-Slim in Norway via next-day shipping.</p><div class="horizontal-rule"></div><p>1X is currently hiring both AI researchers (specialties include imitation learning, reinforcement learning, and large-scale training) and android operators (!) which actually sounds like a super fun and interesting job. <a href="https://www.1x.tech/discover/all-neural-networks-all-autonomous-all-1x-speed" target="_blank">More here</a>.</p>]]></description><pubDate>Mon, 12 Feb 2024 21:07:28 +0000</pubDate><guid>https://spectrum.ieee.org/1x-robotics-video</guid><category>1x</category><category>Humanoid robots</category><category>Neural networks</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/two-humanoid-robots-stand-at-a-workbenches-with-two-humanoid-robots-in-the-background.png?id=51444298&amp;width=980"></media:content></item><item><title>Disney’s Newest Robot Demonstrates Collaborative Cuteness</title><link>https://spectrum.ieee.org/disney-robot-2666681104</link><description><![CDATA[
  122. <img src="https://spectrum.ieee.org/media-library/the-duke-weaselton-animatronic-character-under-development-at-disney-imagineering.gif?id=51434452&width=1245&height=700&coordinates=0%2C0%2C0%2C1"/><br/><br/><p>
  123. <em>This is a guest post. The views expressed here are solely those of the author and do not represent positions of </em><a href="https://spectrum.ieee.org/" target="_self">IEEE Spectrum</a><em> or the IEEE.</em>
  124. </p><p>
  125. If Disney’s history of storytelling has taught us anything, it’s to never underestimate the power of a great sidekick. Even though sidekicks aren’t the stars of the show, they provide life and energy and move the story along in important ways. It’s hard to imagine Aladdin without the Genie, or Peter Pan without Tinker Bell.
  126. </p><p>
  127. In robotics, however, solo acts proliferate. Even when multiple robots are used, they usually act in parallel. One key reason for this is that most robots are designed in ways that make direct collaboration with other robots difficult. Stiff, strong robots are more repeatable and easier to control, but those designs have very little forgiveness for the imperfections and mismatches that are inherent in coming into contact with another robot.
  128. </p><p>
  129. Having robots work together–especially if they have complementary skill sets–can open up some exciting opportunities, particularly in the entertainment robotics space. At Walt Disney Imagineering, our research and development teams have been working on this idea of collaboration between robots, and we were able to show off the result of one such collaboration in Shanghai this week, when a little furry character interrupted the opening moments for the first-ever <a href="https://disneyparks.disney.go.com/blog/2023/10/zootopia-opening-dec-20-2023-at-shanghai-disney-resort/" target="_blank"><em>Zootopia</em> land</a>.
  130. </p><hr/><p>
  131. Our newest robotic character, <a href="https://zootopia.fandom.com/wiki/Duke_Weaselton" target="_blank">Duke Weaselton</a>, rolled onstage at the Shanghai Disney Resort for the first time last December, pushing a purple kiosk and blasting pop music. As seen in the video below, the audience got a kick out of watching him hop up on top of the kiosk and try to negotiate with the Chairman of Disney Experiences, Josh D’Amaro, for a new job. And of course, some new perks. After a few moments of wheeling and dealing, Duke gets gently escorted offstage by team members Richard Landon and Louis Lambie.</p><p class="shortcode-media shortcode-media-youtube">
  132. <span class="rm-shortcode" data-rm-shortcode-id="b603991209cf91df2cb34814e3fd80cc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ZRJFYlldkBw?rel=0&start=242" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  133. </p><p>
  134. What might not be obvious at first is that the moment you just saw was enabled not by one robot, but by two. Duke Weaselton is the star of the show, but his dynamic motion wouldn’t be possible without the kiosk, which is its own independent, actuated robot. While these two robots are very different, by working together as one system, they’re able to do things that neither could do alone.
  135. </p><p>
  136. The character and the kiosk bring two very different kinds of motion together, and create something more than the sum of their parts in the process. The character is an expressive, bipedal robot with an exaggerated, animated motion style. It looks fantastic, but it’s not optimized for robust, reliable locomotion. The kiosk, meanwhile, is a simple wheeled system that behaves in a highly predictable way. While that’s great for reliability, it means that by itself it’s not likely to surprise you. But when we combine these two robots, we get the best of both worlds. The character robot can bring a zany, unrestrained energy and excitement as it bounces up, over, and alongside the kiosk, while the kiosk itself ensures that both robots reliably get to wherever they are going.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  137. <img alt="" class="rm-shortcode" data-rm-shortcode-id="9cd8a3308caea5b0672353e0b170241f" data-rm-shortcode-name="rebelmouse-image" id="a0478" loading="lazy" src="https://spectrum.ieee.org/media-library/harout-jarchafjian-sophie-bowe-tony-dohi-bill-west-marcela-de-los-rios-bob-michel-and-u00a0morgan-pope.jpg?id=51435623&width=980"/>
  138. <small class="image-media media-caption" placeholder="Add Photo Caption...">Harout Jarchafjian, Sophie Bowe, Tony Dohi, Bill West, Marcela de los Rios, Bob Michel, and Morgan Pope.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Morgan Pope</small></p><p>
  139. The collaboration between the two robots is enabled by designing them to be robust and flexible, with motions that can tolerate a large amount of uncertainty while still delivering a compelling show. This is a direct result of lessons learned from an earlier robot, <a href="https://spectrum.ieee.org/disney-robot-indestructibles" target="_self">one that tumbled across the stage at SXSW earlier this year</a>. Our basic insight is that a small, lightweight robot can be surprisingly tough, and that this toughness enables new levels of creative freedom in the design and execution of a show.
  140. </p><p>
  141. This level of robustness also makes collaboration between robots easier. Because the character robot is tough and because there is some flexibility built into its motors and joints, small errors in placement and pose don’t create big problems like they might for a more conventional robot. The character can lean on the motorized kiosk to create the illusion that it is pushing it across the stage. The kiosk then uses a winch to hoist the character onto a platform, where electromagnets help stabilize its feet. Essentially, the kiosk is compensating for the fact that Duke himself can’t climb, and might be a little wobbly without having his feet secured. The overall result is a free-ranging bipedal robot that moves in a way that feels natural and engaging, but that doesn’t require especially complicated controls or highly precise mechanical design. Here’s a behind-the-scenes look at our development of these systems:
  142. </p><p class="shortcode-media shortcode-media-youtube">
  143. <span class="rm-shortcode" data-rm-shortcode-id="71aa08adf71577d7c9f027d0c887cbaa" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/VFNruidzlZw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  144. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Disney Imagineering</small>
  145. </p><p>To program Duke’s motions, our team uses an animation pipeline that was originally developed for <a href="https://spectrum.ieee.org/disney-robot-indestructibles" target="_blank">the SXSW demo</a>, where a designer can pose the robot by hand to create new motions. We have since developed an interface which can also take motions from conventional animation software tools. Motions can then be adjusted to adapt to the real physical constraints of the robots, and that information can be sent back to the animation tool. As animations are developed, it’s critical to retain a tight synchronization between the kiosk and the character. The system is designed so that the motion of both robots is always coordinated, while simultaneously supporting the ability to flexibly animate individual robots–or individual parts of the robot, like the mouth and eyes.<br/></p><p>
  146. Over the past nine months, we explored a few different approaches to collaborative locomotion. The GIFs below show some early attempts at riding a tricycle, skateboarding, and pushing a crate. In each case, the idea is for a robotic character to eventually collaborate with another robotic system that helps bring that character’s motions to life in a stable and repeatable way.
  147. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  148. <img alt="" class="rm-shortcode" data-rm-shortcode-id="2979554aba931ef962f0085e75497d5f" data-rm-shortcode-name="rebelmouse-image" id="f964d" loading="lazy" src="https://spectrum.ieee.org/media-library/disney-hopes-that-their-judy-hopps-robot-will-soon-u00a0be-able-to-use-the-help-of-u00a0a-robotic-tricycle-crate-or-skateboard.gif?id=50873889&width=980"/>
  149. <small class="image-media media-caption" placeholder="Add Photo Caption...">Disney hopes that their Judy Hopps robot will soon be able to use the help of a robotic tricycle, crate, or skateboard to enable new forms of locomotion.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Morgan Pope</small>
  150. </p><p>
  151. This demo with Duke Weaselton and his kiosk is just the beginning, says Principal R&D Imagineer Tony Dohi, who leads the project for us. “Ultimately, what we showed today is an important step towards a bigger vision. This project is laying the groundwork for robots that can interact with each other in surprising and emotionally satisfying ways. Today it’s a character and a kiosk, but moving forward we want to have multiple characters that can engage with each other and with our guests.”
  152. </p><p>
  153. Walt Disney Imagineering R&D is exploring a multi-pronged development strategy for our robotic characters. Engaging character demonstrations like Duke Weaselton focus on quickly prototyping complete experiences using immediately accessible techniques. In parallel, our research group is developing new technologies and capabilities that become the building blocks for both elevating existing experiences and designing and delivering completely new shows. The robotics team led by Moritz Bächer shared one such building block–embodied in a highly expressive and stylized robotic walking character–<a href="https://spectrum.ieee.org/disney-robot" target="_self">at IROS in October</a>. The capabilities demonstrated there can eventually be used to help robots like Duke Weaselton perform more flexibly, more reliably, and more spectacularly.
  154. </p><p>
  155. “Authentic character demonstrations are useful because they help inform what tools are the most valuable for us to develop,” explains Bächer. “In the end our goal is to create tools that enable our teams to produce and deliver these shows rapidly and efficiently.” This ties back to the fundamental technical idea behind the Duke Weaselton show moment–collaboration is key!
  156. </p>]]></description><pubDate>Sun, 11 Feb 2024 14:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/disney-robot-2666681104</guid><category>Disney robots</category><category>Animatronics</category><category>Robotics</category><dc:creator>Morgan Pope</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/the-duke-weaselton-animatronic-character-under-development-at-disney-imagineering.gif?id=51434452&amp;width=980"></media:content></item><item><title>Video Friday: Monocycle Robot With Legs</title><link>https://spectrum.ieee.org/video-friday-monocycle-robot-with-legs</link><description><![CDATA[
  157. <img src="https://spectrum.ieee.org/media-library/image.png?id=51425645&width=1473&height=913&coordinates=200%2C0%2C247%2C167"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 02 February 2024, ZURICH</h5><h5><a href="https://humanrobotinteraction.org/2024/">HRI 2024</a>: 11–15 March 2024, BOULDER, COLO.</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="dx3dmeredyk"><em>In this video, we present Ringbot, a novel leg-wheel transformer robot incorporating a monocycle mechanism with legs. Ringbot aims to provide versatile mobility by replacing the driver and driving components of a conventional monocycle vehicle with legs mounted on compact driving modules inside the wheel.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b89bcefe6d85a52f0ab912732b8923cf" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DX3DMEreDyk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ieeexplore.ieee.org/document/10423226">Paper</a> ] via [ <a href="https://publish.illinois.edu/kimlab2020/">KIMLAB</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="phxqfe-rteo">Making money with robots has always been a struggle, but I think ALOHA 2 has figured it out.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1d1e85e1e77d03fe5bd075f8aa7d69ac" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/PHXQFE-Rteo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Seriously, though, that is some impressive manipulation capability. I don’t know what that freakish panda thing is, but getting a contact lens from the package onto its bizarre eyeball was some wild dexterity.</p><p>[ <a href="https://aloha-2.github.io/">ALOHA 2</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="utmufoc7r_s"><em>Highlights from testing our new arms built by Boardwalk Robotics. Installed in October of 2023, these new arms are not just for boxing and provide much greater speed and power. 
This matches the mobility and manipulation goals we have for Nadia!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="598a543427e332452c1d38ca04e22e49" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/uTmUfOc7r_s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>The least dramatic but possibly most important bit of that video is when Nadia uses her arms to help her balance against a wall, which is one of those things that humans do all the time without thinking about it. And we always appreciate being shown things that don’t go perfectly alongside things that do. The bit at the end there was Nadia not quite managing to do lateral arm raises. I can relate; that’s my reaction when I lift weights, too.</p><p>[ <a href="https://robots.ihmc.us/nadia">IHMC</a> ]</p><p>Thanks, Robert!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="yr835-6bato">The recent progress in commercial humanoids is just exhausting.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8105e56cec4a108b91373cb9ad849f75" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/yR835-6BaTo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/b2/">Unitree</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="z0aotzdx25y"><em>We present an avatar system designed to facilitate the embodiment of humanoid robots by human operators, validated through iCub3, a humanoid developed at the Istituto Italiano di Tecnologia.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="00f7703902c405a5d0fe80812042fbe1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/z0aoTzdx25Y?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.science.org/doi/10.1126/scirobotics.adh3834">Science Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="nwqozp40ows"><em>Have you ever seen a robot skiing?! Ascento robot enjoying a day in the ski slopes of Davos.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="745e9617a90167b61aa1bb2cdf2212ef" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/nWqOzP40Ows?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ascento.ai/">Ascento</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="sfkm-rxiqzg"><em>Can’t trip Atlas up! 
Our humanoid robot gets ready for real work combining strength, perception, and mobility.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5306d9d741d9d97cfc83ba1fdf955db5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/SFKM-Rxiqzg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Notable that Boston Dynamics is now saying that Atlas “gets ready for real work.” Wonder how much to read into that?</p><p>[ <a href="https://bostondynamics.com/atlas/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="jmqftc2yq1s">You deserve to be free from endless chores! YOU! DESERVE! CHORE! FREEDOM!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="62c1560a8103f1eac805aaccdc6efff6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/JmQftC2yQ1s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Pretty sure this is teleoperated, so someone is still doing the chores, sadly.</p><p>[ <a href="https://www.youtube.com/@MagicLab_Offical">MagicLab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="bmobzroztck"><em>Multimodal UAVs (unmanned aerial vehicles) are rarely capable of more than two modalities—that is, flying and walking or flying and perching. However, being able to fly, perch, and walk could further improve their usefulness by expanding their operating envelope. For instance, an aerial robot could fly a long distance, perch in a high place to survey the surroundings, then walk to avoid obstacles that could potentially inhibit flight. Birds are capable of these three tasks, and so offer a practical example of how a robot might be developed to do the same.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="aec685517da4cbfed1d02146c4201db0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/BmobZRoZtCk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ieeexplore.ieee.org/document/10324391">Paper</a> ] via [ <a href="https://www.epfl.ch/labs/lis/">EPFL LIS</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="blsleii2uem"><em>Nissan announces the concept model of “Iruyo,” a robot that supports babysitting while driving. Ilyo relieves the anxiety of the mother, father, and baby in the driver’s seat. We support safe and secure driving for parents and children. Nissan and Akachan Honpo are working on a project to make life better with cars and babies. 
Iruyo was born out of the voices of mothers and fathers who said, “I can’t hold my baby while driving alone.”</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="13ab4982649a5411ed5871c3b59c271d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/bLslEII2ueM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.nissan.co.jp/SP/INTELLIGENTPUPPET/">Nissan</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="cbcpjzicj2w"><em>Building 937 houses the coolest robots at CERN. This is where the action happens to build and program robots that can tackle the unconventional challenges presented by the laboratory’s unique facilities. Recently, a new type of robot called CERNquadbot has entered CERN’s robot pool and successfully completed its first radiation protection test in the North Area. </em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="116b255d6f0fee3f2cc9260ce93ae864" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/cbcpJZicJ2w?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://home.web.cern.ch/news/news/engineering/introducing-cerns-robodog">CERN</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="nplbxelxnga">Congrats to Starship, the OG robotic delivery service, on their US $90 million raise.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2e3276d6738dcac9e1ce7495f8af2718" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/NplBXelxNGA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.starship.xyz/">Starship</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="yhszyywftoo"><em>By blending 2D images with foundation models to build 3D feature fields, a new MIT method helps robots understand and manipulate nearby objects with open-ended language prompts.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="860964abb3ee4edf944057822a1a61ac" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/YhszyYWfTOo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://f3rm.github.io/">GitHub</a> ] via [ <a href="https://news.mit.edu/2023/using-language-give-robots-better-grasp-open-ended-world-1102">MIT</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="mkaitiazqek">This is one of those things that’s far more difficult than it might look.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="71bb9acad01ed1cce661b21350ec80d7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" 
src="https://www.youtube.com/embed/MkaITIAZQEk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://roam.me.columbia.edu/">ROAM Lab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="2l9pzycde-c"><em>Our current care system does not scale, and our populations are aging fast. Robodies are multipliers for care staff, allowing them to work together with local helpers to provide protection and assistance around the clock while maintaining personal contact with people in the community.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="658b41c4b3a3712c9a9153d29c50bdc8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2l9pZYcDe-c?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="">DEVANTHRO</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="hjeol4jskci">It’s the world’s smallest humanoid robot, until someone comes out with slightly smaller servos!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f53b7f350401302049bbd9a1ae4abb78" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/hJeOL4jSkCI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.guinnessworldrecords.com/news/2024/2/worlds-smallest-humanoid-robot-built-by-hong-kong-students-764314">Guinness </a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="m_ehmqxdqd8"><em>Deep Robotics wishes you a happy year of the dragon!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7b978e4ac30c5f99c772827a156d5ed2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/m_ehMQxDqd8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">Deep Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="znksmrz0o6q"><em>SEAS  researchers are helping develop resilient and autonomous deep-space and extraterrestrial habitations by developing technologies to let autonomous robots repair or replace damaged components in a habitat. The research is part of the Resilient ExtraTerrestrial Habitats institute (RETHi), led by Purdue University in partnership with SEAS, the University of Connecticut, and the University of Texas at San Antonio. 
Its goal is to “design and operate resilient deep-space habitats that can adapt, absorb, and rapidly recover from expected and unexpected disruptions.”</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e46ad24e401e10b9a8119af734b3291b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zNKsMrz0o6Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://seas.harvard.edu/news/2024/01/robots-help-human-habitation-space">Harvard</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="cxj72nxlrzq"><em>Find out how a bold vision became a success story! The DLR Institute of Robotics and Mechatronics has been researching robotic arms since the 1990s, originally for use in space. It was a long and ambitious journey before these lightweight robotic arms could be used on Earth and finally in operating theaters, a journey that required concentrated robotics expertise, interdisciplinary cooperation, and ultimately a successful technology transfer.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="dc5edfbd8a6416547c6ffe32a7e67d15" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/cXj72nxlrzQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.miroinnovationlab.de/">DLR MIRO</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="nngf37vdmxo"><em>Robotics is changing the world, driven by focused teams of diverse experts. Willow Garage operated with the mantra “Impact first, return on capital second” and through ROS and the PR2 had enormous impact. Autonomous mobile robots are finally being accepted in the service industry, and Savioke (now Relay Robotics) was created to drive that impact. This talk will trace the evolution of Relay robots and their deployment in hotels, hospitals, and other service industries, starting with roots at Willow Garage. 
As robotics technology is poised for the next round of advances, how do we create and maintain the organizations that continue to drive progress?</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="69d129b696e752e699e6a16b649b7e03" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/nngF37VdMxo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotics.northwestern.edu/">Northwestern</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 09 Feb 2024 17:41:33 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-monocycle-robot-with-legs</guid><category>Boston dynamics</category><category>Cern</category><category>Harvard</category><category>Humanoid robots</category><category>Mit</category><category>Robotics</category><category>Science robotics</category><category>Unitree</category><category>Video friday</category><category>Willow garage</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/image.png?id=51425645&amp;width=980"></media:content></item><item><title>Tiny Quadrotor Learns to Fly in 18 Seconds</title><link>https://spectrum.ieee.org/drone-quadrotor</link><description><![CDATA[
  158. <img src="https://spectrum.ieee.org/media-library/a-short-gif-of-simulated-x-shaped-devices-scrambling-around-until-they-gradually-settle-down-and-stay-in-place.gif?id=51413016&width=1245&height=700&coordinates=16%2C0%2C16%2C0"/><br/><br/><p>
  159. It’s kind of astonishing how quadrotors have scaled over the past decade. Like, we’re now at the point where they’re verging on disposable, at least from a commercial or research perspective—for a bit over US $200, you can buy a little <a href="https://www.bitcraze.io/products/crazyflie-2-1/" rel="noopener noreferrer" target="_blank">27-gram, completely open-source drone</a>, and all you have to do is teach it to fly. That’s where things do get a bit more challenging, though, because teaching drones to fly is not a straightforward process. Thanks to good simulation and techniques like reinforcement learning, it’s much easier to imbue drones with autonomy than it used to be. But it’s not typically a fast process, and it can be finicky to make a smooth transition from simulation to reality.
  160. </p><p>
  161. New York University’s <a href="https://wp.nyu.edu/arpl/" target="_blank">Agile Robotics and Perception Lab</a>, in collaboration with the <a href="https://www.tii.ae/" target="_blank">Technology Innovation Institute</a> (TII), has managed to streamline the process of getting basic autonomy to work on drones, and streamline it by a lot: The lab’s system is able to train a drone in simulation from nothing up to stable and controllable flying in 18 seconds flat on a MacBook Pro. And it actually takes longer to compile and flash the firmware onto the drone itself than it does for the entire training process.
  162. </p><hr/><p class="shortcode-media shortcode-media-youtube">
  163. <span class="rm-shortcode" data-rm-shortcode-id="1fdc3045b585634aaa452d87bd2e574e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/NRD43ZA1D-4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  164. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">ARPL NYU</small></p><p>
  165. So not only is the drone able to keep a stable hover while rejecting pokes and nudges and wind, but it’s also able to fly specific trajectories. Not bad for 18 seconds, right?
  166. </p><p>
  167. One of the things that typically slows down training times is the need to keep refining exactly what you’re training for, without refining it so much that you’re only training your system to fly in your specific simulation rather than the real world. The strategy used here is what the researchers call a curriculum (you can also think of it as a sort of lesson plan) to adjust the reward function used to train the system through reinforcement learning. The curriculum starts off forgiving and gradually increases the penalties to emphasize robustness and reliability. This is all about efficiency: doing only the training that you need to do, in the way that it needs to be done, to get the results you want, and no more.
  168. </p><p>
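To give a sense of what a reward-function curriculum can look like, here is a minimal, hypothetical Python sketch. The function names, penalty weights, and schedule below are illustrative assumptions and do not come from the researchers’ code; the point is simply that the penalty terms start out forgiving and ramp up as training progresses.</p><pre><code>import numpy as np

def penalty_scale(step, total_steps, start=0.1, end=1.0):
    """Linearly ramp penalty weights from forgiving (start) to strict (end)."""
    frac = min(step / float(total_steps), 1.0)
    return start + frac * (end - start)

def reward(pos_error, ang_vel, action, step, total_steps):
    """Toy hover reward: an alive bonus minus curriculum-scaled penalties.
    All terms here are made-up stand-ins, not the actual training reward."""
    scale = penalty_scale(step, total_steps)
    r = 1.0                                       # bonus for staying airborne
    r -= scale * 2.0 * np.linalg.norm(pos_error)  # drift from the hover target
    r -= scale * 0.5 * np.linalg.norm(ang_vel)    # wobble
    r -= scale * 0.1 * np.linalg.norm(action)     # control effort
    return r
</code></pre><p>Early in training, a policy that merely stays airborne scores well; by the end, the same wobbly behavior is penalized heavily, which nudges the policy toward the kind of robust hovering shown in the video without wasting training time on strictness it isn’t ready for.</p><p>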
  169. There are other, more straightforward, tricks that optimize this technique for speed as well. The deep-reinforcement learning algorithms are particularly efficient, and leverage the hardware acceleration that comes along with Apple’s M-series processors. The simulator efficiency multiplies the benefits of the curriculum-driven sample efficiency of the reinforcement-learning pipeline, leading to that wicked-fast training time.
  170. </p><p>
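As a rough illustration of why simulator throughput matters so much, here is a toy, hypothetical sketch of batched simulation in Python. This is not the researchers’ simulator, and the point-mass “dynamics” are a stand-in chosen only to show the pattern of advancing many environments with a single vectorized update.</p><pre><code>import numpy as np

NUM_ENVS = 4096                 # simulated drones stepped in parallel
DT = 0.01                       # integration time step, in seconds
GRAVITY = np.array([0.0, 0.0, -9.81])

def step(pos, vel, thrust_accel):
    """Advance every environment by one time step (all arrays are NUM_ENVS by 3)."""
    vel = vel + (thrust_accel + GRAVITY) * DT   # batched velocity update
    pos = pos + vel * DT                        # batched position update
    return pos, vel

pos = np.zeros((NUM_ENVS, 3))
vel = np.zeros((NUM_ENVS, 3))
actions = np.random.uniform(8.0, 12.0, size=(NUM_ENVS, 3))  # commanded accelerations
pos, vel = step(pos, vel, actions)   # one call yields NUM_ENVS transitions
</code></pre><p>Each call advances thousands of simulated drones at once, so the reinforcement-learning pipeline collects thousands of transitions per step; combined with the sample efficiency of the curriculum, that is what compresses training into seconds of wall-clock time.</p><p>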
  171. This approach isn’t limited to simple tiny drones—it’ll work on pretty much any drone, including bigger and more expensive ones, or even a drone that you yourself build from scratch.
  172. </p><p class="shortcode-media shortcode-media-youtube">
  173. <span class="rm-shortcode" data-rm-shortcode-id="b0bf3841f3b3075aad8bafd08d5c9bb1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/EYtPG_HmBd8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  174. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Jonas Eschmann</small></p><p>
  175. We’re told that it took minutes rather than seconds to train a policy for the drone in the video above, although the researchers expect that 18 seconds is achievable even for a more complex drone like this in the near future. And it’s all <a href="https://github.com/arplaboratory/learning-to-fly" rel="noopener noreferrer" target="_blank">open source</a>, so you can, in fact, build a drone and teach it to fly with this system. But if you wait a little bit, it’s only going to get better: The researchers tell us that they’re working on integrating with the PX4 open source drone autopilot. Longer term, the idea is to have a single policy that can adapt to different environmental conditions, as well as different vehicle configurations, meaning that this could work on all kinds of flying robots rather than just quadrotors.
  176. </p><p>
  177. Everything you need to run this yourself <a href="https://github.com/arplaboratory/learning-to-fly" target="_blank">is available on GitHub</a>, and the paper is on ArXiv <a href="https://arxiv.org/abs/2311.13081" target="_blank">here</a>.</p>]]></description><pubDate>Thu, 08 Feb 2024 17:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/drone-quadrotor</guid><category>Autonomy</category><category>Drones</category><category>Flying robots</category><category>Quadrotors</category><category>Reinforcement learning</category><category>Robotics</category><category>Simulation</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/a-short-gif-of-simulated-x-shaped-devices-scrambling-around-until-they-gradually-settle-down-and-stay-in-place.gif?id=51413016&amp;width=980"></media:content></item><item><title>Robotic Tongue Licks Gecko Gripper Clean</title><link>https://spectrum.ieee.org/flexiv-gecko-gripper</link><description><![CDATA[
  178. <img src="https://spectrum.ieee.org/media-library/an-image-of-a-reel-to-reel-of-yellow-tape-with-a-blue-cylinder-on-a-robotic-arm-approaching-it-there-is-an-inset-of-a-gecko-in.jpg?id=51391567&width=1245&height=700&coordinates=0%2C119%2C0%2C119"/><br/><br/><p>About a decade ago, there was a lot of excitement in the robotics world around <a href="https://people.eecs.berkeley.edu/%7Eronf/Gecko/gecko-facts.html" target="_blank">gecko-inspired directional adhesives</a>, which are materials that stick without being sticky using the same van der Waals forces that allow geckos to scamper around on vertical panes of glass. They were used extensively in different sorts of climbing robots, some of them <a href="https://www.youtube.com/watch?v=e4ntbQ6isIk" rel="noopener noreferrer" target="_blank">quite lovely</a>. Gecko adhesives are uniquely able to stick to very smooth things where your only other option might be suction, which requires all kinds of extra infrastructure to work.</p><p>We haven’t seen gecko adhesives around as much of late, for a couple of reasons. First, the ability to only stick to smooth surfaces (which is what gecko adhesives are best at) is a bit of a limitation for mobile robots. And second, <a href="https://spectrum.ieee.org/gecko-adhesives-moving-from-robot-feet-to-your-walls" target="_blank">the gap between research and useful application</a> is wide and deep and full of crocodiles. I’m talking about the mean kind of crocodiles, not the cuddly kind. But <a href="https://www.flexiv.com/en/" target="_blank">Flexiv Robotics</a> has made gecko adhesives practical for robotic grasping in a commercial environment, thanks in part to a sort of robotic tongue that licks the gecko tape clean.</p><hr/><p>If you zoom way, <em>way</em> in on a gecko’s foot, you’ll see that each toe is covered in millions of hairlike nanostructures called setae. Each seta branches out at the end into hundreds of more hairs with flat bits at the end called spatulas. The result of this complex arrangement of setae and spatulas is that gecko toes have a ridiculous amount of surface area, meaning that they can leverage the extremely weak van der Waals forces between molecules to stick themselves to perfectly flat and smooth surfaces. This technique works exceptionally well: Geckos can hang from glass by a single toe, and a fully adhered gecko can hold something like 140 kilograms (which, unfortunately, seems to be an extrapolation rather than an experimental result). And luckily for the gecko, the structure of the spatulas makes the adhesion directional, so that when its toes are no longer being loaded, they can be easily peeled off of whatever they’re attached to.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  179. <img alt="A set of black and white images of gecko toes at different magnifications." class="rm-shortcode" data-rm-shortcode-id="60f9a053f4ab5676de967c35a7e41243" data-rm-shortcode-name="rebelmouse-image" id="d0737" loading="lazy" src="https://spectrum.ieee.org/media-library/a-set-of-black-and-white-images-of-gecko-toes-at-different-magnifcations.jpg?id=51386070&width=980"/>
  180. <small class="image-media media-caption" placeholder="Add Photo Caption...">Natural gecko adhesive structure, along with a synthetic adhesive (f).</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://royalsocietypublishing.org/doi/10.1098/rsta.2007.2173" target="_blank">Gecko adhesion: evolutionary nanotechnology, by Kellar Autumn and Nick Gravish</a></small></p><p>Since geckos don’t “stick” to things in the sense that we typically use the word “sticky,” a better way of characterizing what geckos can do is as “dry adhesion,” as opposed to something that involves some sort of glue. You can also think about gecko toes as just being very, very high friction, and it’s this perspective that is particularly interesting in the context of robotic grippers.</p><p class="shortcode-media shortcode-media-youtube">
  181. <span class="rm-shortcode" data-rm-shortcode-id="b71cb9d9b823bfea19ccf1a4bdc3692b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Jp0lOm8-oLM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  182. </p><p>This is <a href="https://www.flexiv.com/product/grav" target="_blank">Flexiv’s “Grav Enhanced” gripper</a>, which uses a combination of pinch grasping and high friction gecko adhesive to lift heavy and delicate objects without having to squeeze them. When you think about a traditional robotic grasping system trying to lift something like a water balloon, you have to squeeze that balloon until the friction between the side of the gripper and the side of the balloon overcomes the weight of the balloon itself. The higher the friction, the lower the squeeze required, and although a water balloon might be an extreme example, maximizing gripper friction can make a huge difference when it comes to fragile or deformable objects.</p><p>There are a couple of problems with dry adhesive, however. The tiny structures that make the adhesive adhere can be prone to damage, and the fact that dry adhesive will stick to just about anything it can make good contact with means that it’ll rapidly accumulate dirt outside of a carefully controlled environment. In research contexts, these problems aren’t all that significant, but for a commercial system, you can’t have something that requires constant attention.</p><p>Flexiv says that the microstructure material that makes up its gecko adhesive was able to sustain 2 million gripping cycles without any visible degradation in performance, suggesting that as long as you use the stuff within the tolerances that it’s designed for, it should keep on adhering to things indefinitely—although trying to lift too much weight will tear the microstructures, ruining the adhesive properties after just a few cycles. And to keep the adhesive from getting clogged up with debris, Flexiv came up with this clever little cleaning station that acts like a little robotic tongue of sorts:</p><p class="shortcode-media shortcode-media-youtube">
  183. <span class="rm-shortcode" data-rm-shortcode-id="aa9c23f5182428255303e4b05f777f03" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/PIz353Bv_OE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  184. </p><p>Interestingly, geckos themselves don’t seem to use their own tongues to clean their toes. They lick their eyeballs on the regular, like all normal humans do, but gecko toes appear to be self-cleaning, which is a pretty neat trick. It’s certainly possible to make <a href="https://people.eecs.berkeley.edu/~ronf/Gecko/selfcleaning.html" rel="noopener noreferrer" target="_blank">self-cleaning synthetic gecko adhesive</a>, but Flexiv tells us that “due to technical and practical limitations, replicating this process in our own gecko adhesive material is not possible. Essentially, we replicate the microstructure of a gecko’s footpad, but not its self-cleaning process.” This likely goes back to that whole thing about what works in a research context versus what works in a commercial context, and Flexiv needs its gecko adhesive to handle all those millions of cycles.</p>Flexiv says that it was made aware of the need for a system like this when one of its clients started using the gripper for the extra-dirty task of <a href="https://www.youtube.com/watch?v=58b0i1MFZkE" rel="noopener noreferrer" target="_blank">sorting trash from recycling</a>, and that the solution was inspired by a lint roller. And I have to say, I appreciate the simplicity of the system that Flexiv came up with to solve the problem directly and efficiently. Maybe one day, it will be able to replicate a real gecko’s natural self-cleaning toes with a durable and affordable artificial dry adhesive, but until that happens, an artificial tongue does the trick.]]></description><pubDate>Tue, 06 Feb 2024 18:01:27 +0000</pubDate><guid>https://spectrum.ieee.org/flexiv-gecko-gripper</guid><category>Flexiv</category><category>Gecko feet</category><category>Robotic grippers</category><category>Robotics</category><category>Adhesives</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/an-image-of-a-reel-to-reel-of-yellow-tape-with-a-blue-cylinder-on-a-robotic-arm-approaching-it-there-is-an-inset-of-a-gecko-in.jpg?id=51391567&amp;width=980"></media:content></item><item><title>Video Friday: Agile but Safe</title><link>https://spectrum.ieee.org/video-friday-agile-but-safe</link><description><![CDATA[
  185. <img src="https://spectrum.ieee.org/media-library/image.gif?id=51282224&width=1245&height=700&coordinates=0%2C0%2C0%2C0"/><br/><br/><p>
  186. Video Friday is your weekly selection of awesome robotics videos, collected by your friends at
  187. <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/>
  188. </p><h5><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 2 February 2024, ZURICH</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><p>
  189. Enjoy today’s videos!
  190. </p><div class="horizontal-rule">
  191. </div><div style="page-break-after: always">
  192. <span style="display:none"> </span>
  193. </div><p class="rm-anchors" id="elwwpn5ihja">
  194. Is “scamperiest” a word? If not, it should be, because this is the scamperiest robot I’ve ever seen.
  195. <span></span>
  196. </p><p class="shortcode-media shortcode-media-youtube">
  197. <span class="rm-shortcode" data-rm-shortcode-id="e739c2a1bbb8e79be5424f398554c6a4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/elWwPn5IhjA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  198. </p><p>
  199. [
  200. <a href="https://agile-but-safe.github.io/">ABS</a> ]
  201. </p><div class="horizontal-rule">
  202. </div><blockquote class="rm-anchors" id="ajr7tcjzbmo">
  203. <em>GITAI is pleased to announce that its 1.5-meter-long autonomous dual robotic arm system (S2) has successfully arrived at the International Space Station (ISS) aboard the SpaceX Falcon 9 rocket (NG-20) to conduct an external demonstration of in-space servicing, assembly, and manufacturing (ISAM) while onboard the ISS. The success of the S2 tech demo will be a major milestone for GITAI, confirming the feasibility of this technology as a fully operational system in space.</em>
  204. </blockquote><p class="shortcode-media shortcode-media-youtube">
  205. <span class="rm-shortcode" data-rm-shortcode-id="76fe5cb900af96436aeac56ce3f0e37e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ajr7tcjZbMo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  206. </p><p>
  207. [
  208. <a href="https://gitai.tech/">GITAI</a> ]
  209. </p><div class="horizontal-rule">
  210. </div><blockquote class="rm-anchors" id="sqendbet75g">
  211. <em>This work presents a comprehensive study on using deep reinforcement learning (RL) to create dynamic locomotion controllers for bipedal robots. Going beyond focusing on a single locomotion skill, we develop a general control solution that can be used for a range of dynamic bipedal skills, from periodic walking and running to aperiodic jumping and standing.</em>
  212. </blockquote><p class="shortcode-media shortcode-media-youtube">
  213. <span class="rm-shortcode" data-rm-shortcode-id="8ac450dae0547f69cbffcbb186be77fd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/sQEnDbET75g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  214. </p><p>
  215. And if you want to get exhausted on behalf of a robot, the full 400-meter dash is below.
  216. </p><p class="shortcode-media shortcode-media-youtube">
  217. <span class="rm-shortcode" data-rm-shortcode-id="dcffab2841362d65e8f2169b7d2537ce" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wzQtRaXjvAk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  218. </p><p>
  219. [
  220. <a href="https://hybrid-robotics.berkeley.edu/">Hybrid Robotics</a> ]
  221. </p><div class="horizontal-rule">
  222. </div><blockquote class="rm-anchors" id="w_y3rth_wug">
  223. <em>NASA’s Ingenuity Mars Helicopter pushed aerodynamic limits during the final months of its mission, setting new records for speed, distance, and altitude. Hear from Ingenuity chief engineer Travis Brown on how the data the team collected could eventually be used in future rotorcraft designs.</em>
  224. </blockquote><p class="shortcode-media shortcode-media-youtube">
  225. <span class="rm-shortcode" data-rm-shortcode-id="d9638332cbe95c7732ad4355bb3d66b2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/w_Y3rtH_WUg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  226. </p><p>
  227. [
  228. <a href="https://mars.nasa.gov/technology/helicopter/">NASA</a> ]
  229. </p><div class="horizontal-rule">
  230. </div><p class="rm-anchors" id="jtzeophj5yq">
  231. BigDog: 15 years of solving mobility problems its own way.
  232. </p><p class="shortcode-media shortcode-media-youtube">
  233. <span class="rm-shortcode" data-rm-shortcode-id="aad07b77acdb00dfa7274d72fa6a7f55" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/JtzeOPHJ5yQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  234. </p><p>
  235. [
  236. <a href="https://bostondynamics.com/legacy/">Boston Dynamics</a> ]
  237. </p><div class="horizontal-rule">
  238. </div><blockquote class="rm-anchors" id="znksmrz0o6q">
  239. <em>[Harvard School of Engineering and Applied Sciences] researchers are helping develop resilient and autonomous deep space and extraterrestrial habitations by developing technologies to let autonomous robots repair or replace damaged components in a habitat. The research is part of the Resilient ExtraTerrestrial Habitats institute (RETHi) led by Purdue University, in partnership with [Harvard] SEAS, the University of Connecticut and the University of Texas at San Antonio. Its goal is to “design and operate resilient deep space habitats that can adapt, absorb and rapidly recover from expected and unexpected disruptions.” </em>
  240. </blockquote><p class="shortcode-media shortcode-media-youtube">
  241. <span class="rm-shortcode" data-rm-shortcode-id="e46ad24e401e10b9a8119af734b3291b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zNKsMrz0o6Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  242. </p><p>
  243. [
  244. <a href="https://seas.harvard.edu/news/2024/01/robots-help-human-habitation-space">Harvard SEAS</a> ]
  245. </p><div class="horizontal-rule">
  246. </div><blockquote class="rm-anchors" id="_sf32a_zcui">
  247. <em>Researchers from Huazhong University of Science and Technology (HUST) in a recent T-RO paper describe and construct a novel variable stiffness spherical joint motor that enables dexterous motion and joint compliance in omni-directions.</em>
  248. </blockquote><p class="shortcode-media shortcode-media-youtube">
  249. <span class="rm-shortcode" data-rm-shortcode-id="a1e73f41659bf8e163f88ef6a6e509df" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/_Sf32A_ZcuI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  250. </p><p>
  251. [
  252. <a href="https://ieeexplore.ieee.org/document/10328708">Paper</a> ]
  253. </p><p>
  254. Thanks, Ram!
  255. </p><div class="horizontal-rule">
  256. </div><p class="rm-anchors" id="kl_431yd2o4">
  257. We are told that this new robot from HEBI is called “Mark Suckerberg” and that they’ve got a pretty cool application in mind for it, to be revealed later this year.
  258. </p><p class="shortcode-media shortcode-media-youtube">
  259. <span class="rm-shortcode" data-rm-shortcode-id="c8619bb975174096c84441fd826ab60e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/KL_431YD2O4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  260. </p><p>
  261. [
  262. <a href="https://www.hebirobotics.com/">HEBI Robotics</a> ]
  263. </p><p>
  264. Thanks, Dave!
  265. </p><div class="horizontal-rule">
  266. </div><blockquote class="rm-anchors" id="ccer3cuu1jq">
  267. <em>Dive into the first edition of our new Real-World-Robotics class at ETH Zürich! Our students embarked on an incredible journey, creating their human-like robotic hands from scratch. In just three months, the teams designed, built, and programmed their tendon-driven robotic hands, mastering dexterous manipulation with reinforcement learning! The result? A spectacular display of innovation and skill during our grand final.</em>
  268. </blockquote><p class="shortcode-media shortcode-media-youtube">
  269. <span class="rm-shortcode" data-rm-shortcode-id="e825199ea300d65a1067a70ac7db2194" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/CCer3cUU1JQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  270. </p><p>
  271. [
  272. <a href="https://srl.ethz.ch/">SRL ETHZ</a> ]</p><div class="horizontal-rule">
  273. </div><p>
  274. Carnegie Mellon researchers have built a system with a robotic arm atop a RangerMini 2.0 robotic cart from AgileX Robotics to make what they’re calling a platform for “intelligent movement and processing.”
  275. </p><p class="shortcode-media shortcode-media-youtube">
  276. <span class="rm-shortcode" data-rm-shortcode-id="ed59d7a437aa0f7ce478e156c807c9bf" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/49WKbQO90bA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  277. </p><p>
  278. [ <a href="https://www.cs.cmu.edu/~dpathak/" target="_blank">CMU</a> ] via [
  279. <a href="https://www.agilex.ai/">AgileX</a> ]
  280. </p><div class="horizontal-rule">
  281. </div><blockquote class="rm-anchors" id="brsfnsl6dr8">
  282. <em>Picassnake is our custom-made robot that paints pictures from music. Picassnake consists of an arm and a head, embedded in a plush snake doll. The robot is connected to a laptop for control and music processing, which can be fed through a microphone or an MP3 file. To open the media source, an operator can use the graphical user interface or place a text QR code in front of a webcam. Once the media source is opened, Picassnake generates unique strokes based on the music and translates the strokes to physical movement to paint them on canvas.</em>
  283. </blockquote><p class="shortcode-media shortcode-media-youtube">
  284. <span class="rm-shortcode" data-rm-shortcode-id="d82044ecad4106441e2bf88d6eefa593" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/BRsfNSL6DR8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  285. </p><p>
  286. [
  287. <a href="https://hci.cs.umanitoba.ca/projects-and-research/details/picassnake-the-painting-robot">Picassnake</a> ]
  288. </p><div class="horizontal-rule">
  289. </div><blockquote class="rm-anchors" id="lkz6jkqpmec">
  290. <em>In April 2021, NASA’s Ingenuity Mars Helicopter became the first spacecraft to achieve powered, controlled flight on another world. With 72 successful flights, Ingenuity has far surpassed its originally planned technology demonstration of up to five flights. On Jan. 18, Ingenuity flew for the final time on the Red Planet. Join Tiffany Morgan, NASA’s Mars Exploration Program Deputy Director, and Teddy Tzanetos, Ingenuity Project Manager, as they discuss these historic flights and what they could mean for future extraterrestrial aerial exploration.</em>
  291. </blockquote><p class="shortcode-media shortcode-media-youtube">
  292. <span class="rm-shortcode" data-rm-shortcode-id="b11949450a044ff6c571fc87ab21157e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/lkZ6jkqPMEc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  293. </p><p>
  294. [
  295. <a href="https://mars.nasa.gov/technology/helicopter/">NASA</a> ]
  296. </p><div class="horizontal-rule">
  297. </div>]]></description><pubDate>Fri, 02 Feb 2024 16:58:51 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-agile-but-safe</guid><category>Video friday</category><category>Robotics</category><category>Quadruped robots</category><category>Bipedal robots</category><category>Boston dynamics</category><category>Nasa</category><category>Jpl</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/image.gif?id=51282224&amp;width=980"></media:content></item><item><title>Infrared Sensors Can Now Peer Around Corners</title><link>https://spectrum.ieee.org/non-line-of-sight-infrared</link><description><![CDATA[
  298. <img src="https://spectrum.ieee.org/media-library/a-bubble-with-text-and-a-wall-forming-a-t-with-a-small-camera-at-the-bottom.jpg?id=51211632&width=1245&height=700&coordinates=0%2C112%2C0%2C113"/><br/><br/><p>Just because an object is around a corner doesn’t mean it has to be hidden. <a href="https://spectrum.ieee.org/seeing-around-corner-lasers-speckle" target="_blank">Non-line-of-sight imaging</a> can peek around corners and spot those objects, but it has so far been limited to a narrow band of frequencies. Now, a new sensor can help extend this technique from working with visible light to infrared. This advance could help make <a data-linked-post="2662384970" href="https://spectrum.ieee.org/autonomous-vehicle-data-collection" target="_blank">autonomous vehicles</a> safer, among other potential applications.<br/></p><p>Non-line-of-sight imaging relies on the faint signals of light beams that have reflected off surfaces in order to reconstruct images. The ability to see around corners may prove useful for <a href="https://spectrum.ieee.org/image-neural" target="_self">machine vision</a>—for instance, helping autonomous vehicles foresee hidden dangers to better predict how to respond to them, says <a href="https://nanophotonics.tju.edu.cn/index/PEOPLE.htm" target="_blank">Xiaolong Hu</a>, the senior author of the study and a professor at Tianjin University in Tianjin, China. It may also improve <a href="https://spectrum.ieee.org/robotic-tank-is-designed-to-crawl-through-your-intestine" target="_blank">endoscopes</a> that help doctors peer inside the body. <strong></strong></p><p>The light that non-line-of-sight imaging depends on is typically very dim, and until now, the detectors that were efficient and sensitive enough for non-line-of-sight imaging could only detect either visible or near-infrared light. Moving to longer wavelengths might have several advantages, such as dealing with less interference from sunshine, and the possibility of using lasers that are safe around eyes, Hu says.<strong></strong></p><p>Now Hu and his colleagues have for the first time performed non-line-of-sight imaging using 1,560- and 1,997-nanometer infrared wavelengths. “This extension in spectrum paves the way for more practical applications,” Hu says.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
  299. <img alt="black and white images and red and black images of a sculpture made of the letters T, J, and U, and of a wooden pose reference mannequin" class="rm-shortcode" data-rm-shortcode-id="7b89e3d5ee03688b1e2dbea8f4795a1d" data-rm-shortcode-name="rebelmouse-image" id="8258a" loading="lazy" src="https://spectrum.ieee.org/media-library/black-and-white-images-and-red-and-black-images-of-a-sculpture-made-of-the-letters-t-j-and-u-and-of-a-wooden-pose-reference-m.jpg?id=51248373&width=980"/>
  300. <small class="image-media media-caption" data-gramm="false" data-lt-tmp-id="lt-127139" placeholder="Add Photo Caption..." spellcheck="false">Researchers at Tanjin University in China imaged several objects with a non-line-of-sight infrared camera, before [middle column] and after [right column] using their reconstructive algorithm.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Tianjin University</small></p><p>In the new study, the researchers experimented with <a href="https://pubs.acs.org/doi/10.1021/acsphotonics.1c00730" target="_blank">superconducting nanowire single-photon detectors</a>. In each device, a 40-nanometer-wide niobium titanium nitride wire was cooled to about 2 kelvins (about –271 °C), rendering the wire superconductive. A single photon could disrupt this fragile state, generating electrical pulses that enabled the efficient detection of individual photons.</p><p>The scientists contorted the nanowire in each device into a fractal pattern that took on similar shapes at various magnifications. This let the sensor detect photons of all <a href="https://spectrum.ieee.org/metalens" target="_self">polarizations</a>, boosting its efficiency.</p><p>The new detector was up to nearly three times as efficient as other single-photon detectors at sensing near- and mid-infrared light. This let the researchers perform non-line-of-sight imaging, achieving a spatial resolution of roughly 1.3 to 1.5 centimeters.</p><p>In addition to an algorithm that reconstructed non-line-of-sight images based off <a href="https://www.nature.com/articles/nature25489" rel="noopener noreferrer" target="_blank">multiple scattered light rays</a>, the scientists developed a new algorithm that helped remove noise from their data. When each pixel during the scanning process was given 5 milliseconds to collect photons, the new de-noising algorithm reduced the root mean square error—a measure of its deviation from a perfect image—of reconstructed images by about eightfold.</p><p>The researchers now plan to arrange multiple sensors into larger arrays to boost efficiency, reduce scanning time, and extend the distance over which imaging can take place, Hu says. They would also like to test their device in daylight conditions, he adds.</p><p>The scientists detailed <u><a href="http://dx.doi.org/10.1364/OE.497802" target="_blank">their findings</a></u> 30 November in the journal <em>Optics Express</em>.</p>]]></description><pubDate>Thu, 01 Feb 2024 17:22:49 +0000</pubDate><guid>https://spectrum.ieee.org/non-line-of-sight-infrared</guid><category>Nanowires</category><category>Machine vision</category><category>Imaging</category><category>Cameras</category><category>Infrared</category><dc:creator>Charles Q. Choi</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-bubble-with-text-and-a-wall-forming-a-t-with-a-small-camera-at-the-bottom.jpg?id=51211632&amp;width=980"></media:content></item><item><title>Amazon’s Acquisition of iRobot Falls Through</title><link>https://spectrum.ieee.org/irobot-amazon</link><description><![CDATA[
  301. <img src="https://spectrum.ieee.org/media-library/circle-robot-vacuum-against-a-blue-background-with-irobot-logo-underneath.jpg?id=51214323&width=1245&height=700&coordinates=0%2C281%2C0%2C282"/><br/><br/><p>Citing “no path to regulatory approval in the European Union,” Amazon and iRobot have <a href="https://www.prnewswire.com/news-releases/amazon-and-irobot-agree-to-terminate-pending-acquisition-302046311.html" rel="noopener noreferrer" target="_blank">announced the termination</a> of <a href="https://spectrum.ieee.org/amazon-irobot-acquisition" target="_blank">an acquisition deal</a> first announced in August of 2022 that would have made iRobot a part of Amazon and valued the robotics company at US $1.4 billion.</p><hr/><p>The European Commission <a href="https://ec.europa.eu/commission/presscorner/detail/en/STATEMENT_24_521" rel="noopener noreferrer" target="_blank">released a statement today</a> that explained some of its concerns, which to be fair, seem like reasonable things to be concerned about:</p><blockquote>Our in-depth investigation preliminarily showed that the acquisition of iRobot would have enabled Amazon to foreclose iRobot’s rivals by restricting or degrading access to the Amazon Stores.… We also preliminarily found that Amazon would have had the incentive to foreclose iRobot’s rivals because it would have been economically profitable to do so. All such foreclosure strategies could have restricted competition in the market for robot vacuum cleaners, leading to higher prices, lower quality, and less innovation for consumers.</blockquote><p>Amazon, for its part, characterizes this as “undue and disproportionate regulatory hurdles.” Whoever you believe is correct, the protracted strangulation of this acquisition deal has not been great for iRobot, and its termination is potentially disastrous—Amazon will have to pay iRobot a $94 million termination fee, which is basically nothing for it, and meanwhile iRobot is already laying off 350 people, or 31 percent of its head count. </p><p>From one of iRobot’s <a href="https://www.prnewswire.com/news-releases/irobot-announces-operational-restructuring-plan-to-position-company-for-the-future-302046345.html" rel="noopener noreferrer" target="_blank">press releases</a>:</p><blockquote>“iRobot is an innovation pioneer with a clear vision to make consumer robots a reality,” said Colin Angle, Founder of iRobot. “The termination of the agreement with Amazon is disappointing, but iRobot now turns toward the future with a focus and commitment to continue building thoughtful robots and intelligent home innovations that make life better, and that our customers around the world love.” </blockquote><p>The reason that I don’t feel much better after reading that statement is that Colin Angle has already stepped down as chairman and CEO of iRobot. Angle was one of the founders of iRobot (along with <a data-linked-post="2650279736" href="https://spectrum.ieee.org/what-is-a-robot-rodney-brooks-sonnet" target="_blank">Rodney Brooks</a> and Helen Greiner) and has stuck with the company for its entire 30+ year existence, until just now. So, that’s not great. 
Also, I’m honestly not sure <em>how</em> iRobot is going to create much in the way of home innovations since the press release states that the company is “pausing all work related to non-floor care innovations, including air purification, robotic lawn mowing and education,” while also “reducing R&D expense by approximately $20 million year-over-year.”</p><p><a href="https://spectrum.ieee.org/irobot-terra-robotic-lawnmower" target="_self">iRobot’s lawn mower</a> has been paused for a while now, so it’s not a huge surprise that nothing will move forward there, but a pause on the education robots like <a href="https://spectrum.ieee.org/irobot-create-3" target="_self">Create</a> and <a href="https://spectrum.ieee.org/irobot-new-education-robot-root" target="_self">Root</a> is a real blow to the robotics community. And even if iRobot is focusing on floor-care innovations, I’m not sure how much innovation will be possible with a slashed R&D budget amidst huge layoffs.</p><p>Sigh.</p><p>On LinkedIn, Colin Angle <a href="https://www.linkedin.com/feed/update/urn:li:activity:7157729527916318720/" target="_blank">wrote a little bit</a> about what he called “the magic of iRobot”:</p><blockquote>iRobot built the first micro rovers and changed space exploration forever. iRobot built the first practical robots that left the research lab and went on combat missions to defuse bombs, saving 1000’s of lives. iRobot’s robots crucially enabled the cold shutdown of the reactors at Fukushima, found the underwater pools of oil in the aftermath of the deep horizon oil rig disaster in the Gulf of Mexico. And pioneered an industry with Roomba, fulfilling the unfulfilled promise of over 50 years for practical robots in the home.<br/> <br/>
  302. Why?<br/><br/>
  303. As I think about all the events surrounding those actions, there is a common thread. We believed we could. And we decided to try with a spirit of pragmatic optimism. Building robots means knowing failure. It does not treat blind hope kindly. Robots are too complex. Robots are too expensive. Robots are too challenging for hope alone to have the slightest chance of success. But combining the belief that a problem can be solved with a commitment to the work to solve it enabled us to change the world.</blockquote><p>And that’s what I personally find so worrying about all of this. iRobot has a treasured history of innovation which is full of successes and failures and really weird stuff, and it’s hard to see how that will be able to effectively continue. Here are a couple of my favorite weird iRobot things, including a PackBot that flies (for a little bit) and a morphing blobular robot:</p><p class="shortcode-media shortcode-media-youtube">
  304. <span class="rm-shortcode" data-rm-shortcode-id="9b24f973101914d87141d5f36ff47f8e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/hwFZpya_1m0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  305. </p><p class="shortcode-media shortcode-media-youtube">
  306. <span class="rm-shortcode" data-rm-shortcode-id="59c964942b99c84ba0db641b84cbaf1b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/SbqHERKdlK8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  307. </p><p>I suppose it’s worth pointing out that the weirdest stuff (like in the videos above) is all over a decade old, and you can reasonably ask whether iRobot was that kind of company anymore even before this whole Amazon thing happened. The answer is probably not, since the company has chosen to focus almost exclusively on floor-care robots. But even there we’ve seen consistent innovation in hardware and software that pretty much every floor-care robot company seems to then pick up on about a year later. This is not to say that other floor-care robots can’t innovate, but it’s undeniable that iRobot has been a driving force behind that industry. Will that continue? I really hope so. </p>]]></description><pubDate>Mon, 29 Jan 2024 21:54:39 +0000</pubDate><guid>https://spectrum.ieee.org/irobot-amazon</guid><category>Amazon</category><category>Consumer robots</category><category>European union</category><category>Irobot</category><category>Robotics</category><category>Roomba</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/circle-robot-vacuum-against-a-blue-background-with-irobot-logo-underneath.jpg?id=51214323&amp;width=980"></media:content></item><item><title>Video Friday: Medusai</title><link>https://spectrum.ieee.org/video-friday-medusai</link><description><![CDATA[
  308. <img src="https://spectrum.ieee.org/media-library/image.png?id=51196963&width=1245&height=700&coordinates=0%2C0%2C0%2C0"/><br/><br/><p><span style="background-color: initial;">Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span style="background-color: initial;"> for inclusion.</span><br/></p><h5><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 2 February 2024, ZURICH</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote><em>Made from beautifully fabricated steel and eight mobile arms, medusai can play percussion and strings with human musicians, dance with human dancers, and move in time to multiple human observers. It uses AI-driven computer vision to know what human observers are doing and responds accordingly through snake gestures, music, and light.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2dea5682a8c05c2e5bd75e51560f59a1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ZbpLjxuHeOY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>If this seems a little bit unsettling, that’s intentional! The project was designed to explore the concepts of trust and risk in the context of robots, and of using technology to influence emotion.</p><p class="shortcode-media shortcode-media-youtube">
  309. <span class="rm-shortcode" data-rm-shortcode-id="f5a30baa34ff48be7b8383a0e0be6dc0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/bmBexuV2pw4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  310. </p><p>[ <a href="https://www.medus.ai/">medusai</a> ] via [ <a href="https://gtcmt.gatech.edu/feature/medusai">Georgia Tech</a> ]</p><p>Thanks, Gil!</p><div class="horizontal-rule"></div><blockquote><em>On 19 April 2021, NASA’s Ingenuity Mars Helicopter made history when it completed the first powered, controlled flight on the Red Planet. It flew for the last time on 18 January 2024.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5360a1cf3df67c2ace0509538cc63be1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/qMbHE_VXI-8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.jpl.nasa.gov/news/after-three-years-on-mars-nasas-ingenuity-helicopter-mission-ends">NASA JPL</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Teleoperation plays a crucial role in enabling robot operations in challenging environments, yet existing limitations in effectiveness and accuracy necessitate the development of innovative strategies for improving teleoperated tasks. The work illustrated in this video introduces a novel approach that utilizes mixed reality and assistive autonomy to enhance the efficiency and precision of humanoid robot teleoperation.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b4b34894fe26f27afc0d1e544234ca5c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/oN-FD6YnF2c?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Sometimes all it takes is one good punch, and then you can just collapse.</p><p>[ <a href="https://ieeexplore.ieee.org/abstract/document/10380694">Paper</a> ] via [ <a href="https://robots.ihmc.us/">IHMC</a> ]</p><p>Thanks, Robert!</p><div class="horizontal-rule"></div><blockquote><em>The new Dusty Robotics FieldPrinter 2 enhances on-site performance and productivity through its compact design and extended capabilities. Building upon the success of the first-generation FieldPrinter, which has printed over 91 million square feet of layout, the FieldPrint Platform incorporates lessons learned from years of experience in the field to deliver an optimized experience for all trades on site.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="023c88f41a6724143d01efb20cd52c82" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xIb_yiOHjWY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dustyrobotics.com/">Dusty Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Quadrupedal robots have emerged as a cutting-edge platform for assisting humans, finding applications in tasks related to inspection and exploration in remote areas. Nevertheless, their floating base structure renders them susceptible to failure in cluttered environments, where manual recovery by a human operator may not always be feasible. 
In this study, we propose a robust all-terrain recovery policy to facilitate rapid and secure recovery in cluttered environments.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ec994964e305e425572cbdd40e12b3a7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/NiadpC05M6s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sites.google.com/view/dreamriser">DreamRiser </a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="k2u7wwemldu">The work that Henry Evans is doing with Stretch (along with Hello Robot and University of Washington PhD student Vinitha Ranganeni) will be presented at HRI 2024 this spring. </p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0a22bd7ed1289a7cb7d2069e95a7f70e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/K2U7wwEMLDU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://hcrlab.cs.washington.edu/">UW HCRL</a> ]</p><p>Thanks, Stefan!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ok4dhssene4">I like to imagine that these are just excerpts from one very long walk that Digit took around San Francisco.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="52177cfdb5f147afa84b4fcb07ceb0f4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ok4DHssENE4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://hybrid-robotics.berkeley.edu/">Hybrid Robotics Lab</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Boxing, drumming, stacking boxes, and various other practices...those are the daily teleoperation testing of our humanoid robot. Collaborating with engineers, our humanoid robots collect real-world data from teleoperation for learning to iterate control algorithms. </em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2abc41109cf691eb0f7b213899dd6209" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2dmjzMv-y-M?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.limxdynamics.com/en">LimX Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote><em>The OpenDR project aims to develop a versatile and open tool kit for fundamental robot functions, using deep learning to enhance their understanding and decision-making abilities. The primary objective is to make robots more intelligent, particularly in critical areas like health care, agriculture, and production. 
In the health care setting, the TIAGo robot is deployed to offer assistance and support within a health care facility.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5c6c8714578d0aa6c7df993fc980ab3f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Z9y8FXwqxbM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://opendr.eu/">OpenDR</a> ] via [ <a href="https://pal-robotics.com/">PAL Robotics</a> ]</p><div class="horizontal-rule"></div><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7b69c618db437c6056ff2addfef55aaf" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/fQ64i9dzj5s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.arches-projekt.de/projekt-arches/">ARCHES</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="uletbu3dufu">Christoph Bartneck gives a talk entitled “Social robots: The end of the beginning or the beginning of the end?”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ab20b2e3fca0379c6ca94894b7f4a3c0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UleTbu3DufU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.bartneck.de/">Christoph Bartneck</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Professor Michael Jordan offers his provocative thoughts on the blending of AI and economics and takes us on a tour of Trieste, a beautiful and grand city in northern Italy.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4d11e959d5117a9bd64859d940f823e9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/3lXphIYfoBM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://people.eecs.berkeley.edu/~jordan/">Berkeley</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 26 Jan 2024 21:11:45 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-medusai</guid><category>Digit</category><category>Quadrupedal robots</category><category>Robotics</category><category>Video friday</category><category>Mixed reality</category><category>Nasa</category><category>Mars helicopter</category><category>Dancing robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/image.png?id=51196963&amp;width=980"></media:content></item><item><title>Blade Strike on Landing Ends Mars Helicopter’s Epic Journey</title><link>https://spectrum.ieee.org/mars-helicopter-ingenuity-end-mission</link><description><![CDATA[
  311. <img src="https://spectrum.ieee.org/media-library/a-photo-from-ingenuitys-navigation-camera-shows-the-shadow-of-one-of-its-rotors-with-significant-damage-to-the-tip.jpg?id=51190273&width=1245&height=700&coordinates=0%2C376%2C0%2C377"/><br/><br/><p>
  312. The Ingenuity Mars Helicopter made its 72nd and final flight on 18 January. “While the helicopter remains upright and in communication with ground controllers,” NASA’s Jet Propulsion Lab said
  313. <a href="https://www.jpl.nasa.gov/news/after-three-years-on-mars-nasas-ingenuity-helicopter-mission-ends" rel="noopener noreferrer" target="_blank">in a press release this afternoon</a>, “imagery of its Jan. 18 flight sent to Earth this week indicates one or more of its rotor blades sustained damage during landing, and it is no longer capable of flight.” That’s what you’re seeing in <a href="https://photojournal.jpl.nasa.gov/catalog/PIA26243" rel="noopener noreferrer" target="_blank">the picture above</a>: the shadow of a broken tip of one of the helicopter’s four 2-foot-long carbon-fiber rotor blades. NASA is assuming that at least one blade struck the Martian surface during a “rough landing,” and this is not the kind of damage that will allow the helicopter to get back into the air. Ingenuity’s mission is over.</p><hr/><p><br/></p><p class="shortcode-media shortcode-media-rebelmouse-image">
  314. <img alt="" class="rm-shortcode" data-rm-shortcode-id="a2535f824bfbf7fb54c089d5b9166ac3" data-rm-shortcode-name="rebelmouse-image" id="2d567" loading="lazy" src="https://spectrum.ieee.org/media-library/the-perseverance-rover-took-this-picture-of-ingenuity-on-on-2-u00a0august-u00a02023-just-before-flight-54.jpg?id=51190282&width=980"/>
315. <small class="image-media media-caption" placeholder="Add Photo Caption...">The Perseverance rover took this picture of Ingenuity on 2 August 2023, just before flight 54.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">NASA/JPL-Caltech/ASU/MSSS</small></p><p>
  316. NASA held a press conference earlier this evening to give as much information as they could about exactly what happened to Ingenuity, and what comes next. First, here’s a summary
  317. <a href="https://www.nasa.gov/news-release/after-three-years-on-mars-nasas-ingenuity-helicopter-mission-ends/" rel="noopener noreferrer" target="_blank">from the press release</a>:
  318. </p><blockquote>
  319. Ingenuity’s team planned for the helicopter to make a short vertical flight on Jan. 18 to determine its location after executing an emergency landing on its previous flight. Data shows that, as planned, the helicopter achieved a maximum altitude of 40 feet (12 meters) and hovered for 4.5 seconds before starting its descent at a velocity of 3.3 feet per second (1 meter per second).
  320. <br/>
  321. <br/>
  322. However, about 3 feet (1 meter) above the surface, Ingenuity lost contact with the rover, which serves as a communications relay for the rotorcraft. The following day, communications were reestablished and more information about the flight was relayed to ground controllers at NASA JPL. Imagery revealing damage to the rotor blade arrived several days later. The cause of the communications dropout and the helicopter’s orientation at time of touchdown are still being investigated.
  323. </blockquote><p>
  324. While NASA doesn’t know for sure what happened, they do have some ideas based on the cause of the emergency landing during the previous flight, Flight 71. “[This location] is some of the hardest terrain we’ve ever had to navigate over,” said
  325. <a href="https://spectrum.ieee.org/mars-helicopter-ingenuity-50" target="_self">Teddy Tzanetos, Ingenuity project manager at NASA JPL</a>, during the NASA press conference. “It’s very featureless—bland, sandy terrain. And that’s why we believe that during Flight 71, we had an emergency landing. She was flying over the surface and was realizing that there weren’t too many rocks to look at or features to navigate from, and that’s why Ingenuity called an emergency landing on her own.”<br/>
  326. </p><p>
  327. <a href="https://spectrum.ieee.org/nasa-designed-perseverance-helicopter-rover-fly-autonomously-mars" target="_self">Ingenuity uses a downward-pointing VGA camera running at 30 hertz for monocular feature tracking</a>, and compares the apparent motion of distinct features between frames to determine its motion over the ground. This optical flow technique is used for drones (and other robots) on Earth too, and it’s very reliable, as long as you have enough features to track. Where it starts to go wrong is when your camera is looking at things that are featureless, which is why consumer drones will sometimes warn you about unexpected behavior when flying over water, and why robotics labs often have bizarre carpets and wallpaper—the more features, the better. On Mars, Ingenuity has been reliably navigating by looking for distinctive features like rocks, but flying over a featureless expanse of sand caused serious problems, as <a href="https://spectrum.ieee.org/ingenuity-how-to-fly-a-helicopter-on-mars" target="_self">Ingenuity’s Chief Pilot Emeritus Håvard Grip</a> explained to us during today’s press conference:
  328. </p><blockquote>
  329. The way a system like this works is by looking at the consensus of [the features] it sees, and then throwing out the things that don’t really agree with the consensus. The danger is when you run out of features, when you don’t have very many features to navigate on, and you’re not really able to establish what that consensus is, and you end up tracking the wrong kinds of features, and that’s when things can get off track.
  330. </blockquote><p class="shortcode-media shortcode-media-rebelmouse-image">
  331. <img alt="" class="rm-shortcode" data-rm-shortcode-id="dd3d90a1f7d9670fc145ed2b0cf7df0b" data-rm-shortcode-name="rebelmouse-image" id="5b2fb" loading="lazy" src="https://spectrum.ieee.org/media-library/this-view-from-ingenuity-u2019s-navigation-camera-during-flight-70-on-22-u00a0december-u00a0shows-areas-of-nearly-featureless.jpg?id=51190280&width=980"/>
  332. <small class="image-media media-caption" placeholder="Add Photo Caption...">This view from Ingenuity’s navigation camera during flight 70 (on 22 December) shows areas of nearly featureless terrain that would cause problems during flights 71 and 72.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">NASA/JPL-Caltech</small></p><p>
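For a rough sense of how this kind of navigation works, here is a minimal, illustrative sketch of consensus-based feature tracking between two consecutive downward-facing camera frames. To be clear, this is not Ingenuity’s actual flight software: it uses OpenCV’s corner detector and pyramidal Lucas-Kanade tracker, the altitude, focal length, frame rate, and thresholds are placeholder values, and the simple median-based consensus step merely stands in for the outlier rejection that Grip describes above.
</p><pre><code>
# Illustrative sketch only: estimate velocity over the ground from two
# consecutive grayscale camera frames by tracking features and keeping only
# the flow vectors that agree with the consensus (median). Altitude, focal
# length, frame rate, and thresholds are placeholder values.
import numpy as np
import cv2

def estimate_ground_velocity(prev_frame, curr_frame,
                             altitude_m=10.0, focal_px=300.0, dt=1.0 / 30.0):
    # Find distinctive features (corners) in the previous frame.
    prev_pts = cv2.goodFeaturesToTrack(prev_frame, maxCorners=200,
                                       qualityLevel=0.01, minDistance=8)
    if prev_pts is None or len(prev_pts) < 10:
        return None  # featureless terrain: nothing to navigate on

    # Track those features into the current frame (pyramidal Lucas-Kanade).
    curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_frame, curr_frame,
                                                      prev_pts, None)
    good = status.ravel() == 1
    flow = (curr_pts[good] - prev_pts[good]).reshape(-1, 2)
    if len(flow) < 10:
        return None

    # Consensus step: discard flow vectors that disagree with the median, so
    # a handful of wrongly tracked features cannot corrupt the estimate.
    median = np.median(flow, axis=0)
    inliers = flow[np.linalg.norm(flow - median, axis=1) < 3.0]
    if len(inliers) < 10:
        return None  # no reliable consensus

    # Convert mean pixel motion per frame to meters per second over the ground
    # with a pinhole-camera approximation: ground motion = pixels * altitude / focal.
    return inliers.mean(axis=0) * altitude_m / (focal_px * dt)
</code></pre><p>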
  333. After the Flight 71 emergency landing, the team decided to try a “pop-up” flight next: It was supposed to be about 30 seconds in the air, just straight up to 12 meters and then straight down as a check-out of the helicopter’s systems. As Ingenuity was descending, just before landing, there was a loss of communications with the helicopter. “We have reason to believe that it was facing the same featureless sandy terrain challenges [as in the previous flight],” said Tzanetos. “And because of the navigation challenges, we had a rotor strike with the surface that would have resulted in a power brownout, which caused the communications loss.” Grip describes what he thinks happened in more detail:
  334. </p><blockquote>
  335. Some of this is speculation because of the sparse telemetry that we have, but what we see in the telemetry is that coming down towards the last part of the flight, on the sand, when we’re closing in on the ground, the helicopter relatively quickly starts to think that it’s moving horizontally away from the landing target. It’s likely that it made an aggressive maneuver to try to correct that right upon landing. And that would have accounted for a sideways motion and tilt of the helicopter that could have led to either striking the blade to the ground and then losing power, or making a maneuver that was aggressive enough to lose power before touching down and striking the blade. We don’t know those details yet. We may never know. But we’re trying as hard as we can with the data that we have to figure out those details.
  336. </blockquote><p>
  337. When the Ingenuity team tried reestablishing contact with the helicopter the next
  338. <a href="https://mars.nasa.gov/mars2020/mission/status/365/a-sol-in-the-life-of-a-rover/" rel="noopener noreferrer" target="_blank">sol</a>, “she was right there where we expected her to be,” Tzanetos said. “Solar panel currents were looking good, which indicated that she was upright.” In fact, everything was “green across the board.” That is, until the team started looking through the images from Ingenuity’s navigation camera, and spotted the shadow of the damaged lower blade. Even if that’s the only damage to Ingenuity, the whole rotor system is now both unbalanced and producing substantially less lift, and further flights will be impossible.
  339. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  340. <img alt="" class="rm-shortcode" data-rm-shortcode-id="a3ec91d69ff634dc14045df0d5dfcf09" data-rm-shortcode-name="rebelmouse-image" id="6e577" loading="lazy" src="https://spectrum.ieee.org/media-library/a-closeup-of-the-shadow-of-the-damaged-blade-tip.png?id=51190277&width=980"/>
  341. <small class="image-media media-caption" placeholder="Add Photo Caption...">A closeup of the shadow of the damaged blade tip.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">NASA/JPL-Caltech </small>
  342. </p><blockquote>
  343. There’s always that piece in the back of your head that’s getting ready every downlink—today could be the last day, today could be the last day. So there was an initial moment, obviously, of sadness, seeing that photo come down and pop on-screen, which gives us certainty of what occurred. But that’s very quickly replaced with happiness and pride and a feeling of celebration for what we’ve pulled off. Um, it’s really remarkable the journey that she’s been on and worth celebrating every single one of those sols. Around 9 pm tonight Pacific time will mark 1,000
  344. <a href="https://mars.nasa.gov/mars2020/mission/status/365/a-sol-in-the-life-of-a-rover/" rel="noopener noreferrer" target="_blank">sols</a> that Ingenuity has been on the surface since her deployment from the Perseverance rover. So she picked a very fitting time to come to the end of her mission. —Teddy Tzanetos
  345. </blockquote><p>
346. The Ingenuity team is guessing that there’s damage to more than one of the helicopter’s blades; the blades spin fast enough that if one hit the surface, others likely did too. The plan is to attempt to slowly spin the blades to bring the others into view and collect more information. It sounds unlikely that NASA will divert the Perseverance rover to give Ingenuity a closer look. While continuing on its science mission, the rover will come within 200 to 300 meters of Ingenuity and will try to take some pictures, but that’s likely too far away for a good-quality image.
  347. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  348. <img alt="" class="rm-shortcode" data-rm-shortcode-id="f244e73578f7186821a3188a74ad6b86" data-rm-shortcode-name="rebelmouse-image" id="85a9d" loading="lazy" src="https://spectrum.ieee.org/media-library/perseverance-watches-ingenuity-take-off-on-flight-47-on-14-u00a0march-2023.gif?id=51190293&width=980"/>
  349. <small class="image-media media-caption" placeholder="Add Photo Caption...">Perseverance watches Ingenuity take off on flight 47 on 14 March 2023.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">NASA/JPL-Caltech/ASU/MSSS</small></p><p>
  350. As a tech demo, Ingenuity’s entire reason for existence was to push the boundaries of what’s possible. And as Grip explains, even in its last flight, the little helicopter was doing exactly that, going above and beyond and trying newer and riskier things until it got as far as it possibly could:
  351. </p><blockquote>
  352. Overall, the way that Ingenuity has navigated using features of terrain has been incredibly successful. We didn’t design this system to handle this kind of terrain, but nonetheless it’s sort of been invincible until this moment where we flew in this completely bland terrain where you just have nothing to really hold on to. So there are some lessons in that for us: We now know that that particular kind of terrain can be a trap for a system like this. Backing up when encountering this featureless terrain is a functionality that a future helicopter could be equipped with. And then there are solutions like having a higher-resolution camera, which would have likely helped mitigate this situation. But it’s all part of this tech demo, where we equipped this helicopter to do at most five flights in a pre-scouted area and it’s gone on to do so much more than that. And we just worked it all the way up to the line, and then just tipped it right over the line to where it couldn’t handle it anymore.
  353. </blockquote><p>
  354. Arguably, Ingenuity’s most important contribution has been showing that it’s not just possible <a href="https://spectrum.ieee.org/mars-perseverance" target="_self">but practical and valuable</a> to have rotorcraft on Mars. “I don’t think we’d be talking about<a href="https://spectrum.ieee.org/nasa-mars-sample-return" target="_self"> sample recovery helicopters</a> if Ingenuity didn’t fly, period, and if it hadn’t survived for as long as it has,” <a href="https://spectrum.ieee.org/mars-helicopter-ingenuity-50" target="_blank">Teddy Tzanetos told us after Ingenuity’s 50th flight</a>. And it’s not just the sample return mission: JPL is also developing <a href="https://spectrum.ieee.org/the-next-mars-helicopter" target="_self">a much larger Mars Science Helicopter</a>, which will owe its existence to Ingenuity’s success.
  355. </p><p>
  356. Nearly three years on Mars; 128 minutes and 11 miles of flight in the Martian skies. “I look forward to the day that one of our astronauts brings home Ingenuity and we can all visit it in the Smithsonian,” said
  357. <a href="https://www.jpl.nasa.gov/who-we-are/executive-council/laurie-leshin-director-of-jpl" rel="noopener noreferrer" target="_blank">Director of JPL Laurie Leshin</a> at the end of today’s press conference.
  358. </p><p>
  359. I’ll be first in line.
  360. </p><p class="shortcode-media shortcode-media-youtube">
  361. <span class="rm-shortcode" data-rm-shortcode-id="d82923507d92dbd1701fd8493287ad94" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/raOA2MX-XLQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  362. </p><p>
  363. We’ve written extensively about Ingenuity, including in-depth interviews with both helicopter and rover team members, and they’re well worth rereading today. Thanks, Ingenuity. You did well.
  364. </p><p>
  365. <br/>
  366. </p><h5>
  367. <a href="https://spectrum.ieee.org/mars-helicopter-ingenuity-50" target="_self">What Flight 50 Means for the Ingenuity Mars Helicopter</a> </h5><p><em>
  368. Team lead Teddy Tzanetos on the helicopter’s milestone aerial mission</em></p><p>
  369. <br/>
  370. </p><h5>
  371. <a href="https://spectrum.ieee.org/mars-perseverance" target="_self">Mars Helicopter Is Much More Than a Tech Demo</a>
  372. </h5><p><em>
373. A Mars rover driver explains just how much of a difference the little helicopter scout is making to Mars exploration</em>
  374. </p><p>
  375. <br/>
  376. </p><h5>
  377. <a href="https://spectrum.ieee.org/ingenuity-how-to-fly-a-helicopter-on-mars" target="_self">Ingenuity’s Chief Pilot Explains How to Fly a Helicopter on Mars </a>
  378. </h5><p><em>
379. Simulation is the secret to flying a helicopter on Mars</em>
  380. </p><p>
  381. <br/>
  382. </p><h5>
  383. <a href="https://spectrum.ieee.org/nasa-designed-perseverance-helicopter-rover-fly-autonomously-mars" target="_self">How NASA Designed a Helicopter That Could Fly Autonomously on Mars</a>
  384. </h5><p><em>
385. The Perseverance rover’s Mars Helicopter (Ingenuity) will take off, navigate, and land on Mars without human intervention</em>
  386. </p>]]></description><pubDate>Fri, 26 Jan 2024 02:08:41 +0000</pubDate><guid>https://spectrum.ieee.org/mars-helicopter-ingenuity-end-mission</guid><category>Ingenuity</category><category>Mars helicopter</category><category>Mars</category><category>Space robots</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-photo-from-ingenuitys-navigation-camera-shows-the-shadow-of-one-of-its-rotors-with-significant-damage-to-the-tip.jpg?id=51190273&amp;width=980"></media:content></item><item><title>That Awesome Robot Demo Could Have a Human in the Loop</title><link>https://spectrum.ieee.org/robot-teleoperation-autonomy</link><description><![CDATA[
  387. <img src="https://spectrum.ieee.org/media-library/a-man-sits-in-a-chair-wearing-vr-googles-and-attached-to-two-robotic-arms-near-him-a-robot-with-two-arms-show-s-the-man-s-face.png?id=51152136&width=1245&height=700&coordinates=0%2C221%2C0%2C222"/><br/><br/><p>
  388. Over the past few weeks, we’ve seen a couple of high-profile videos of robotic systems doing really impressive things. And I mean, that’s what we’re all here for, right? Being impressed by the awesomeness of robots! But sometimes the awesomeness of robots is more complicated than what you see in a video making the rounds on social media—any robot has a lot of things going on behind the scenes to make it successful, but if you can’t tell what those things are, what you see at first glance might be deceiving you.
  389. </p><hr/><p>
  390. Earlier this month, a group of researchers from Stanford’s IRIS Lab introduced Mobile ALOHA, which (if you read the YouTube video description) is described as “a low-cost and whole-body teleoperation system for data collection”:
  391. </p><p class="shortcode-media shortcode-media-youtube">
  392. <span class="rm-shortcode" data-rm-shortcode-id="19f666b1f65caa81761575dc71136416" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/HaaZ8ss-HP4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  393. </p><p>
  394. And just last week, Elon Musk posted a video of Tesla’s Optimus robot folding a shirt:
  395. </p><blockquote class="rm-embed twitter-tweet" data-conversation="none" data-partner="rebelmouse" data-twitter-tweet-id="1746964887949934958">
  396. <div style="margin:1em 0"></div> —  (@)
  397.        <a href="https://twitter.com/elonmusk/status/1746964887949934958"></a>
  398. </blockquote>
  399. <script async="" charset="utf-8" src="https://platform.twitter.com/widgets.js"></script><p>
  400. Most people who watch these videos without poking around in the descriptions or comments will likely <em>not</em> assume that these robots were being entirely controlled by experienced humans, because why would they? Even for roboticists, it can be tricky to know for sure whether the robot they’re watching has a human in the loop somewhere. This is a problem that’s not unique to the folks behind either of the videos above; it’s a communication issue that the entire robotics community struggles with. But as robots (and robot videos) become more mainstream, it’s important that we get better at it.<br/>
  401. </p><h2>Why use teleoperation?</h2><p>
  402. Humans are way, way, way, way, <em>way</em> better than robots at almost everything. We’re fragile and expensive, which is why so many people are trying to get robots to do stuff instead, but with a very few exceptions involving speed and precision, humans are the gold standard and are likely to remain so for the foreseeable future. So, if you need a robot to do something complicated or something finicky or something that might require some innovation or creativity, the best solution is to put a human in control.
  403. </p><h2>What about autonomy, though?</h2><p>
  404. Having one-to-one human teleoperation of a robot is a great way of getting things done, but it’s not scalable, and aside from some very specific circumstances, the whole point of robots is to do stuff autonomously at scale so that humans don’t have to. One approach to autonomy is to learn as much as you can from human teleoperation: Many robotics companies are betting that they’ll be able to use humans to gradually train their robotic systems, transitioning from full teleoperation to partial teleoperation to supervisory control to full autonomy. <a href="https://sanctuary.ai/" target="_blank">Sanctuary AI</a> is a great example of this: They’ve been teleoperating their humanoid robots <a href="https://www.youtube.com/watch?v=SvlHlf1weYw" rel="noopener noreferrer" target="_blank">through all kinds of tasks</a>, collecting training data as a foundation for later autonomy.
  405. </p><h2>What’s wrong with teleoperation, then?
  406. </h2><p>
407. Nothing! Teleoperation is great. But when people see a robot doing something and it <em>looks </em>autonomous but it’s <em>actually</em> teleoperated, that’s a problem, because it’s a misrepresentation of the state of the technology. Not only do people end up with the wrong idea of how your robot functions and what it’s really capable of, but whenever those people see <em>other</em> robots doing similar tasks autonomously, their frame of reference will be completely wrong, minimizing what may otherwise be a significant contribution to the field by other robotics folks. To be clear, I don’t (usually) think that the roboticists making these videos have any intention of misleading people, but that is unfortunately what often ends up happening.
  408. </p><h2>What can we do about this problem?</h2><p>
  409. Last year, I wrote <a href="https://www.ieee-ras.org/publications/how-to-make-a-good-robot-video" rel="noopener noreferrer" target="_blank">an article for the IEEE Robotics & Automation Society (RAS)</a> with some tips for making a good robot video, which includes arguably the most important thing: <strong>context</strong>. This covers teleoperation, along with other common things that can cause robot videos to mislead an unfamiliar audience. Here’s an excerpt from the RAS article:
  410. </p><p>
  411. <em>It’s critical to provide accurate context for videos of robots. It’s not always clear (especially to nonroboticists) what a robot may be doing or not doing on its own, and your video should be as explicit as possible about any assistance that your system is getting. For example, your video should identify:</em>
  412. </p><ul>
  413. <li><em>If the video has been sped up or slowed down</em></li>
  414. <li><em>If the video makes multiple experiments look like one continuous experiment</em></li>
  415. <li><em>If external power, compute, or localization is being used</em></li>
  416. <li><em>How the robot is being controlled (e.g., human in the loop, human supervised, scripted actions, partial autonomy, full autonomy)</em></li>
  417. </ul><p>
  418. <em>These things should be made explicit on the video itself, not in the video description or in captions. Clearly communicating the limitations of your work is the responsible thing to do, and not doing this is detrimental to the robotics community.</em>
  419. </p><p>
  420. I want to emphasize that <strong>context should be made explicit on the video itself</strong>. That is, when you edit the video together, add captions or callouts or something that describes the context <strong>on top of the actual footage</strong>. Don’t put it in the description or in the subtitles or in a link, because when videos get popular online, they may be viewed and shared and remixed without any of that stuff being readily available.
  421. </p><h2>So how can I tell if a robot is being teleoperated?</h2><p>
  422. If you run across a video of a robot doing some kind of amazing manipulation task and aren’t sure whether it’s autonomous or not, here are some questions to ask that might help you figure it out.
  423. </p><ul>
  424. <li><strong>Can you identify an operator? </strong>In both of the videos we mentioned above, if you look very closely, you can tell that there’s a human operator, whether it’s a pair of legs or a wayward hand in a force-sensing glove. This may be the first thing to look for, because sometimes an operator is very obvious, but at the same time, not seeing an operator isn’t particularly meaningful because it’s easy for them to be out of frame. </li>
  425. </ul><ul>
  426. <li><strong>Is there any more information? </strong>The second thing to check is whether the video says anywhere what’s actually going on. Does the video have a description? Is there a link to a project page or paper? Are there credits at the end of the video? What account is publishing the video? Even if you can narrow down the institution or company or lab, you might be able to get a sense of whether they’re working on autonomy or teleoperation.</li>
  427. </ul><ul>
  428. <li><strong>What kind of task is it? </strong>You’re most likely to see teleoperation in tasks that would be especially difficult for a robot to do autonomously. At the moment, that’s predominantly manipulation tasks that aren’t well structured—for example, getting multiple objects to interact with each other, handling things that are difficult to model (like fabrics), or extended multistep tasks. If you see a robot doing this stuff quickly and well, it’s worth questioning whether it’s autonomous.</li>
  429. </ul><ul>
  430. <li><strong>Is the robot just too good? </strong>I always start asking more questions when a robot demo strikes me as just too impressive. But when does impressive become <em>too </em>impressive? Personally, I think a robot demonstrating human-level performance at just about any complex task is too impressive. Some autonomous robots definitely have reached that benchmark, but not many, and the circumstances of them doing so are usually atypical. Furthermore, it takes a lot of work to reach humanlike performance with an autonomous system, so there’s usually some warning in the form of previous work. If you see an impressive demo that comes out of nowhere, showcasing an autonomous capability without any recent precedents, that’s probably too impressive. Remember that it can be tricky with a video because you have no idea whether you’re watching the first take or the 500th, and that itself is a good thing to be aware of—even if it turns out that a demo <em>is</em> fully autonomous, there are many other ways of obfuscating how successful the system actually is.</li>
  431. </ul><ul>
  432. <li><strong>Is it too fast? </strong>Autonomous robots are well known for being very fast and precise, but only in the context of structured tasks. For complex manipulation tasks, robots need to sense their environment, decide what to do next, and then plan how to move. This takes time. If you see an extended task that consists of multiple parts but the system never stops moving, that suggests it’s not fully autonomous. </li>
  433. </ul><ul>
  434. <li><strong>Does it move like a human?</strong> Robots like to move optimally. Humans might also like to move optimally, but we’re bad at it. Autonomous robots tend to move smoothly and fluidly, while teleoperated robots often display small movements that don’t make sense in the context of the task, but are very humanlike in nature. For example, finger motions that are unrelated to gripping, or returning an arm to a natural rest position for no particular reason, or being just a little bit sloppy in general. If the motions seem humanlike, that’s usually a sign of a human in the loop rather than a robot that’s just so good at doing a task that it <em>looks</em> human.</li>
  435. </ul><div class="horizontal-rule">
  436. </div><p>
437. None of these points make it impossible for an autonomous robot demo to come out of nowhere and blow everyone away. Improbable, perhaps, but not impossible. And the rare moments when that actually happens are part of what makes robotics so exciting. That’s why it’s so important to understand what’s going on when you see a robot doing something amazing, though—knowing how it’s done, and all of the work that went into it, can only make it more impressive.
  438. </p><p>
  439. <em>This article was inspired by <a href="https://www.linkedin.com/in/petercorke/" target="_blank">Peter Corke</a>’s LinkedIn post, <a href="https://www.linkedin.com/pulse/whats-all-deceptive-teleoperation-demos-peter-corke-asmyc/" target="_blank">What’s with all these deceptive teleoperation demos?</a> And extra thanks to Peter for his feedback on an early draft of this article.</em>
  440. </p>]]></description><pubDate>Tue, 23 Jan 2024 21:00:08 +0000</pubDate><guid>https://spectrum.ieee.org/robot-teleoperation-autonomy</guid><category>Autonomous robots</category><category>Teleoperated robots</category><category>Tesla</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-man-sits-in-a-chair-wearing-vr-googles-and-attached-to-two-robotic-arms-near-him-a-robot-with-two-arms-show-s-the-man-s-face.png?id=51152136&amp;width=980"></media:content></item><item><title>Liquid RAM Flexes for Wearables, Robots, Implants</title><link>https://spectrum.ieee.org/flexible-electronics-flexram</link><description><![CDATA[
  441. <img src="https://spectrum.ieee.org/media-library/a-gold-reflective-surface-against-a-green-background.jpg?id=51129513&width=1245&height=700&coordinates=0%2C0%2C0%2C0"/><br/><br/><p>While organic thin-film transistors built on flexible plastic have been around long enough for people to start discussing <a href="https://spectrum.ieee.org/plotting-a-moores-law-for-flexible-electronics" target="_self">a Moore’s Law for bendable ICs</a>, memory devices for these flexible electronics have been <a href="https://spectrum.ieee.org/search/?q=flexible+memory" target="_self">a bit more elusive</a>. Now researchers from <a href="https://www.tsinghua.edu.cn/en/" target="_blank">Tsinghua University</a>, in Beijing, have developed a fully flexible resistive random-access-memory device, dubbed FlexRAM, that offers another approach: a liquid one.</p><p>In research described in the journal <a href="https://onlinelibrary.wiley.com/doi/10.1002/adma.202309182" rel="noopener noreferrer" target="_blank"><em>Advanced Materials</em></a>, the researchers have used a gallium-based liquid metal to achieve FlexRAM’s data writing-and-reading process. In an example of biomimicry, the gallium-based liquid metal (GLM) droplets undergo oxidation and reduction mechanisms while in a solution environment that mimics the <a href="https://www.britannica.com/science/nervous-system/The-neuronal-membrane" target="_blank">hyperpolarization and depolarization of neurons</a>.</p><p class="pull-quote">“This breakthrough fundamentally changes traditional notions of flexible memory, offering a theoretical foundation and technical path for future soft intelligent robots, brain-machine interface systems, and wearable/implantable electronic devices.”<br/><strong>—Jing Liu, Tsinghua University</strong></p><p>These positive and negative bias voltages define the writing of information “1” and “0,” respectively. When a low voltage is applied, the liquid metal is oxidized, corresponding to the high-resistance state of “1.” By reversing the voltage polarity, it returns the metal to its initial low-resistance state of “0.” This reversible switching process allows for the storage and erasure of data.</p><p>To showcase the reading and writing capabilities of FlexRAM, the researchers integrated it into a software-and-hardware setup. Through computer commands, they encoded a string of letters and numbers, represented in the form of 0s and 1s, onto an array of eight FlexRAM storage units, equivalent to 1 byte of data information. The digital signal from the computer underwent conversion into an analog signal using pulse-width modulation to precisely control the oxidation and reduction of the liquid metal.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-left" data-rm-resized-container="25%" style="float: left;">
  442. <img alt="" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="8cee0a999752f204df5f287e66c6b0b5" data-rm-shortcode-name="rebelmouse-image" id="c02fa" loading="lazy" src="https://spectrum.ieee.org/media-library/these-photographs-show-the-oxidation-and-reduction-state-of-the-gallium-based-liquid-metal-at-the-heart-of-flexram.jpg?id=51129529&width=980" style="max-width: 100%"/>
  443. <small class="image-media media-caption" data-gramm="false" data-lt-tmp-id="lt-172587" placeholder="Add Photo Caption..." spellcheck="false">These photographs show the oxidation and reduction state of the gallium-based liquid metal at the heart of FlexRAM.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Jing Liu/Tsinghua University</small></p><p>The present prototype is a volatile memory, according to <a href="https://www.med.tsinghua.edu.cn/en/info/1358/1963.htm" rel="noopener noreferrer" target="_blank">Jing Liu</a>, a professor at the department of biomedical engineering at Tsinghua University. But Liu contends that the memory principle allows for the development of the device into different forms of memory.</p><p>This contention is supported by the unusual phenomenon that the data stored in FlexRAM persists even when the power is switched off. In a low- or no-oxygen environment, FlexRAM can retain its data for up to 43,200 seconds (12 hours). It also exhibits repeatable use, maintaining stable performance for over 3,500 cycles of operation.</p><p>“This breakthrough fundamentally changes traditional notions of flexible memory, offering a theoretical foundation and technical path for future soft intelligent robots, brain-machine interface systems, and wearable/implantable electronic devices,” said Liu.</p><p>The GLM droplets are encapsulated in <a href="https://plastics-rubber.basf.com/global/en/performance_polymers/products/ecoflex.html" target="_blank">Ecoflex</a>, a stretchable biopolymer. Using a 3D printer, the researchers printed Ecoflex molds and injected gallium-based liquid-metal droplets and a solution of polyvinyl acetate hydrogel separately into the cavities in the mold. The hydrogel not only prevents solution leakage but also enhances the mechanical properties of the device, increasing its resistance ratio.</p><p class="pull-quote">“FlexRAM could be incorporated into entire liquid-based computing systems, functioning as a logic device.”<br/><strong>—Jing Liu, Tsinghua University</strong></p><p>In the present prototype, an array of eight FlexRAM units can store 1 byte of information.<br/></p><p>At this conceptual demonstration stage, millimeter-scale resolution molding is sufficient for the demonstration of its working principle, Liu notes.</p><p>“The conceivable size scale for these FlexRAM devices can range widely,” said Liu. “For example, the size for each of the droplet memory elements can be from millimeter to nanoscale droplets. Interestingly, as revealed by the present study, the smaller the droplet size, the more sensitive the memory response.”</p><p>This groundbreaking work paves the way for the realization of brainlike circuits, aligning with concepts proposed by researchers such as <a href="https://spectrum.ieee.org/soggy-computing-liquid-devices-might-match-the-brains-efficiency" target="_self">Stuart Parkin at IBM</a> over a decade ago. 
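</p><p>To make the byte-level write scheme described earlier a bit more concrete, here is a minimal, purely hypothetical sketch of how a host controller might map one byte onto an array of eight FlexRAM droplet cells. Nothing here comes from the paper: write_pulse() stands in for whatever pulse-width-modulated driver actually shapes the analog voltage, and the 0.6-volt amplitude is an arbitrary placeholder; only the polarity convention (a positive, oxidizing pulse writes “1”; a negative, reducing pulse writes “0”) follows the description above.</p><pre><code>
# Hypothetical sketch: one byte spread across eight FlexRAM cells, one bit per droplet.
# write_pulse() is a stand-in for a real PWM/DAC driver; the voltage is a placeholder.

def write_pulse(cell: int, volts: float) -> None:
    # A real driver would shape this as a pulse-width-modulated analog signal.
    print(f"cell {cell}: apply {volts:+.1f} V")

def write_byte(value: int, amplitude: float = 0.6) -> None:
    # Most-significant bit goes to cell 0. A positive pulse oxidizes the droplet
    # (high-resistance "1"); a negative pulse reduces it back to low-resistance "0".
    for cell in range(8):
        bit = (value >> (7 - cell)) & 1
        write_pulse(cell, +amplitude if bit else -amplitude)

def read_byte(resistance_is_high) -> int:
    # resistance_is_high(cell) would report whether a cell reads back as "1".
    value = 0
    for cell in range(8):
        value = (value << 1) | (1 if resistance_is_high(cell) else 0)
    return value

write_byte(ord("A"))  # 0b01000001: positive pulses on cells 1 and 7, negative on the rest
</code></pre><p>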
“FlexRAM could be incorporated into entire liquid-based computing systems, functioning as a logic device,” Liu envisions.<br/></p><p>As researchers and engineers continue to address challenges and refine the technology, the potential applications of FlexRAM in soft robotics, brain-machine interface systems, and wearable/implantable electronic devices could be significant.</p>]]></description><pubDate>Sun, 21 Jan 2024 18:56:49 +0000</pubDate><guid>https://spectrum.ieee.org/flexible-electronics-flexram</guid><category>Flexible electronics</category><category>Liquid metal</category><category>Memory devices</category><category>Random access memory</category><category>Soft robotics</category><category>Volatile memory</category><dc:creator>Dexter Johnson</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-gold-reflective-surface-against-a-green-background.jpg?id=51129513&amp;width=980"></media:content></item><item><title>Video Friday: Swiss-Mile</title><link>https://spectrum.ieee.org/video-friday-swiss-mile</link><description><![CDATA[
  444. <img src="https://spectrum.ieee.org/media-library/a-silver-four-legged-robot-with-wheels-for-feet-drives-down-a-flight-of-concrete-stairs-outside.png?id=51121645&width=1325&height=830&coordinates=277%2C151%2C318%2C99"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 2 February 2024, ZURICH</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p>You may not be familiar with Swiss-Mile, but you’d almost certainly recognize its robot: it’s the <a data-linked-post="2666409159" href="https://spectrum.ieee.org/quadruped-robot-wheels" target="_blank">ANYmal</a> with wheels on its feet that can do all kinds of amazing things. Swiss-Mile has just announced a seed round to commercialize these capabilities across quadrupedal platforms, including Unitree’s, which means it’s even affordable-ish!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="54259339f6cb034ddd06a1564bf5eaf5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/KmXulwvCsZA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>It’s always so cool to see impressive robotics research move toward commercialization, and I’ve already started saving up for one of these of my own.</p><p>[ <a href="https://www.swiss-mile.com/">Swiss-Mile</a> ]</p><p>Thanks Marko!</p><div class="horizontal-rule"></div><blockquote><em>This video presents the capabilities of PAL Robotics’ TALOS robot as it demonstrates agile and robust walking using Model Predictive Control (MPC) references sent to a Whole-Body Inverse Dynamics (WBID) controller developed in collaboration with Dynamograde. The footage shows TALOS navigating various challenging terrains, including stairs and slopes, while handling unexpected disturbances and additional weight.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ea83d90dbe2f435aff45dc3b6c729cad" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/qVzHe8FG984?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://pal-robotics.com/robots/talos/">PAL Robotics</a> ]</p><p>Thanks Lorna!</p><div class="horizontal-rule"></div><p>Do you want to create a spectacular bimanual manipulation demo? All it takes is this teleoperation system and a carefully cropped camera shot! 
This is based on the Mobile ALOHA system from Stanford that we featured in Video Friday last week.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4b2a34f5649ea486c86c9fcb260de102" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/mNgqXZ44W9Y?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://global.agilex.ai/">AgileX</a> ]</p><div class="horizontal-rule"></div><p>Wing is still trying to make the drone-delivery thing work, and it’s got a new, bigger drone to deliver even more stuff at once.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="693d22df87db69fe1680c08b4a5bc205" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GU1bNw4Z6to?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://blog.wing.com/2024/01/customer-demand-and-wings-aircraft.html">Wing</a> ]</p><div class="horizontal-rule"></div><p>A lot of robotics research claims to be about search and rescue and disaster relief, but it really looks like RSL’s ANYmal can actually pull it off.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8f392eaadb60a4c76d4fc6fb272f2924" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/_R29DrAx0Xs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>And here’s even more impressive video, along with some detail about how the system works.</p><p class="shortcode-media shortcode-media-youtube">
  445. <span class="rm-shortcode" data-rm-shortcode-id="f5ab439384e6902dbc9a3085b4f16142" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/nipH-yl8lR0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  446. </p><p>[ <a href="https://arxiv.org/abs/2309.15462">Paper</a> ]</p><div class="horizontal-rule"></div><p>This might be the most appropriate soundtrack for a robot video that I’ve ever heard.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="29a2dda83d337763ffd031cf682141b2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zuhZ-cGFYPs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Snakes have long captivated robotics researchers due to their effective locomotion, flexible body structure, and ability to adapt their skin friction to different terrains. While extensive research has delved into serpentine locomotion, there remains a gap in exploring rectilinear locomotion as a robotic solution for navigating through narrow spaces. In this study, we describe the fundamental principles of rectilinear locomotion and apply them to design a soft crawling robot using origami modules constructed from laminated fabrics.</em></blockquote><p>[ <a href="https://www.softrobotics.dk/">SDU</a> ]</p><div class="horizontal-rule"></div><p>We wrote about Fotokite’s innovative tethered drone seven or eight years ago, and it’s good to see the company is still doing solid work.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b9d42c15c98c03c25585f73a13c56b90" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/P3zDnxsVkZw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>I do miss the consumer version, though.</p><p>[ <a href="https://fotokite.com/">Fotokite</a> ]</p><div class="horizontal-rule"></div><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7fffc2792213510db7daca9e22aadbb3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/e1uqA-1st_U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="http://jdp.co.uk/">JDP</a> ] via [ <a href="https://petapixel.com/2024/01/16/mantis-shrimp-pulls-no-punches-in-robot-spy-crab-face-off/">Petapixel</a> ]</p><div class="horizontal-rule"></div><blockquote><em>This is SHIVAA the strawberry picking robot of DFKI Robotics Innovation Center. The system is being developed in the RoLand (Robotic Systems in Agriculture) project, coordinated by the #RoboticsInnovationCenter (RIC) of the DFKI Bremen. Within the project we design and develop a semi-autonomous, mobile system that is capable of harvesting strawberries independent of human interaction. 
</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="51f203c9af3abfebf0c5867badf85fa4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/E_yl3-wgckM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotik.dfki-bremen.de/en/research/projects/roland">DFKI</a> ]</p><div class="horizontal-rule"></div><blockquote><em>On December 6, 2023, Demarcus Edwards talked to Robotics students as a speaker in the Undergraduate Robotics Pathways & Careers Speaker Series, which aims to answer the question: “What can I do with a robotics degree?”</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="43c95b0ba646afad4c8df366fe343009" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/RAq4tBBXxO8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotics.umich.edu/academics/undergraduate/robotics-pathways-speaker-series/">Michigan Robotics</a> ]</p><div class="horizontal-rule"></div><p>This movie, <em>Loss of Sensation</em>, was released in Russia in 1935. It seems to be the movie that really, really irritated Karel Čapek, because they made his “robots” into mechanical beings instead of biological ones.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7b068cb43c69d1788bf37ee1ac5b79c8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/6GiBhKbYBcU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.imdb.com/title/tt0240539/">IMDB</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 19 Jan 2024 19:44:22 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-swiss-mile</guid><category>Anymal</category><category>Drone delivery</category><category>Pal robotics</category><category>Quadruped robots</category><category>Robotics</category><category>Swiss-mile</category><category>Unitree</category><category>Video friday</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-silver-four-legged-robot-with-wheels-for-feet-drives-down-a-flight-of-concrete-stairs-outside.png?id=51121645&amp;width=980"></media:content></item><item><title>The Man Who Coined the Word “Robot” Defends Himself</title><link>https://spectrum.ieee.org/karel-capek-robots</link><description><![CDATA[
  447. <img src="https://spectrum.ieee.org/media-library/a-black-and-white-photograph-of-a-man-overlaying-an-orange-tinted-robot-in-the-background-with-rur-printed-on-its-chest.jpg?id=51097752&width=1245&height=700&coordinates=0%2C84%2C0%2C333"/><br/><br/><p>You’re familiar with Karel Čapek, right? If not, you should be—he’s the guy who (along with his brother Josef) invented the word “robot.” Čapek introduced robots to the world in 1921, when his play “<em>R.U.R.</em>” (subtitled “Rossum’s Universal Robots”) was first performed in Prague. It was performed in New York City the next year, and by the year after that, it had been translated into 30 languages. Translated, that is, except for the word “robot” itself, which originally described artificial humans but within a decade of its introduction came to mean things that were mechanical and electronic in nature.</p><p>Čapek, it turns out, was a little miffed that his “robots” had been so hijacked, and in 1935, he wrote a column in the <a href="https://www.lidovky.cz/" rel="noopener noreferrer" target="_blank"><em>Lidové noviny</em></a> “defending” his vision of what robots should be, while also resigning himself to what they had become. A new translation of this column is included as an afterword in a new English translation of <em>R.U.R. </em> that is accompanied by 20 essays exploring robotics, philosophy, politics, and AI in the context of the play, and it makes for fascinating reading. </p><hr/><p><em><a href="https://mitpress.mit.edu/9780262544504/ir-u-r-iand-the-vision-of-artificial-life/" target="_blank">R.U.R. and the Vision of Artificial Life</a></em> is edited by <a href="https://droplets.vscht.cz/people/cejkova" target="_blank">Jitka Čejková</a>, a professor at the <a href="https://chobotix.cz/" target="_blank">Chemical Robotics Laboratory</a> at the University of Chemistry and Technology Prague, whose research interests arguably make her one of the most qualified people to write about Čapek’s perspective on robots. “The chemical robots in the form of microparticles that we designed and investigated, and that had properties similar to living cells, were much closer to Čapek’s original ideas than any other robots today,” Čejková explains in the book’s introduction. These microparticles can exhibit surprisingly complex autonomous behaviors under specific situations, like solving simple mazes:</p><p class="shortcode-media shortcode-media-youtube">
  448. <span class="rm-shortcode" data-rm-shortcode-id="4fc6340ae9013d41849de7917fae8fb6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/P5uKRqJIeSs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  449. </p><p>“I started to call these droplets liquid robots,” says Čejková. “Just as Rossum’s robots were artificial human beings that only looked like humans and could imitate only certain characteristics and behaviors of humans, so liquid robots, as artificial cells, only partially imitate the behavior of their living counterparts.”</p><p>What is or is not called a robot is an ongoing debate that most roboticists seem to try to avoid, but personally, I appreciate the idea that very broadly, a robot is something that seems alive but isn’t—something with independent embodied intelligence. Perhaps the requirement that a robot is mechanical and electronic <em>is</em> too strict, although as Čapek himself realized 100 years ago, what defines a robot has escaped from the control of anyone, even its creator. Here then is his column from 1935, excerpted from <em>R.U.R. and the Vision of Artificial Life</em>, released just today:  </p><h3>“THE AUTHOR OF THE ROBOTS DEFENDS HIMSELF”</h3><h3>
  450. By Karel Čapek</h3><h5>Published in <a href="https://www.lidovky.cz/" rel="noopener noreferrer" target="_blank">Lidové noviny</a>, June 9, 1935</h5><p>I know it is a sign of ingratitude on the part of the author, if he raises both hands against a certain popularity that has befallen something which is called his spiritual brainchild; for that matter, he is aware that by doing so he can no longer change a thing. The author was silent a goodly time and kept his own counsel, while the notion that robots have limbs of metal and innards of wire and cogwheels (or the like) has become current; he has learned, without any great pleasure, that genuine steel robots have started to appear, robots that move in various directions, tell the time, and even fly airplanes; but when he recently read that, in Moscow, they have shot a major film, in which the world is trampled underfoot by mechanical robots, driven by electromagnetic waves, he developed a strong urge to protest, at least in the name of his own robots. For his robots were not mechanisms. They were not made of sheet metal and cogwheels. They were not a celebration of mechanical engineering. If the author was thinking of any of the marvels of the human spirit during their creation, it was not of technology, but of science. With outright horror, he refuses any responsibility for the thought that machines could take the place of people, or that anything like life, love, or rebellion could ever awaken in their cogwheels. He would regard this somber vision as an unforgivable overvaluation of mechanics or as a severe insult to life.</p><p>The author of the robots appeals to the fact that he must know the most about it: and therefore he pronounces that his robots were created quite differently—that is, by a chemical path. The author was thinking about modern chemistry, which in various emulsions (or whatever they are called) has located substances and forms that in some ways behave like living matter. He was thinking about biological chemistry, which is constantly discovering new chemical agents that have a direct regulatory influence on living matter; about chemistry, which is finding—and to some extent already building—those various enzymes, hormones, and vitamins that give living matter its ability to grow and multiply and arrange all the other necessities of life. Perhaps, as a scientific layman, he might develop an urge to attribute this patient ingenious scholarly tinkering with the ability to one day produce, by artificial means, a living cell in the test tube; but for many reasons, amongst which also belonged a respect for life, he could not resolve to deal so frivolously with this mystery. That is why he created a new kind of matter by chemical synthesis, one which simply behaves a lot like the living; it is an organic substance, different from that from which living cells are made; it is something like another alternative to life, a material substrate in which life could have evolved if it had not, from the beginning, taken a different path. We do not have to suppose that all the different possibilities of creation have been exhausted on our planet. 
The author of the robots would regard it as an act of scientific bad taste if he had brought something to life with brass cogwheels or created life in the test tube; the way he imagined it, he created only a new foundation for life, which began to behave like living matter, and which could therefore have become a vehicle of life—but a life which remains an unimaginable and incomprehensible mystery. This life will reach its fulfillment only when (with the aid of considerable inaccuracy and mysticism) the robots acquire souls. From which it is evident that the author did not invent his robots with the technological hubris of a mechanical engineer, but with the metaphysical humility of a spiritualist.</p><p>Well then, the author cannot be blamed for what might be called the worldwide humbug over the robots. The author did not intend to furnish the world with plate metal dummies stuffed with cogwheels, photocells, and other mechanical gizmos. It appears, however, that the modern world is not interested in his scientific robots and has replaced them with technological ones; and these are, as is apparent, the true flesh-of-our-flesh of our age. The world needed mechanical robots, for it believes in machines more than it believes in life; it is fascinated more by the marvels of technology than by the miracle of life. For which reason, the author who wanted—through his insurgent robots, striving for a soul—to protest against the mechanical superstition of our times, must in the end claim something which nobody can deny him: the honor that he was defeated.</p><p><em>Excerpted from </em>R.U.R. and the Vision of Artificial Life<em>, by Karel Čapek, edited by Jitka Čejková. Published by The MIT Press. Copyright © 2024 MIT. All rights reserved.</em></p>]]></description><pubDate>Tue, 16 Jan 2024 21:42:40 +0000</pubDate><guid>https://spectrum.ieee.org/karel-capek-robots</guid><category>Books</category><category>Chemical synthesis</category><category>Karel capek</category><category>Liquid robots</category><category>Robotics</category><category>Mechanical engineering</category><category>Chemical engineering</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-black-and-white-photograph-of-a-man-overlaying-an-orange-tinted-robot-in-the-background-with-rur-printed-on-its-chest.jpg?id=51097752&amp;width=980"></media:content></item><item><title>Video Friday: Robot, Make Me Coffee</title><link>https://spectrum.ieee.org/video-friday-robot-make-me-coffee</link><description><![CDATA[
  451. <img src="https://spectrum.ieee.org/media-library/a-photograph-of-a-silvery-humanoid-robot-placing-a-pod-into-a-keurig-machine.png?id=51064069&width=1245&height=700&coordinates=0%2C0%2C0%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 2 February 2024, ZURICH</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="q5mko7idsok">Figure’s robot is watching videos of humans making coffee, and then making coffee on its own.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4085d61aa4d7b7dc95e49406607b3211" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Q5MKo7Idsok?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>While this is certainly impressive, just be aware that it’s not at all clear from the video exactly how impressive it is. </p><p>[ <a href="https://www.figure.ai/">Figure</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="">It’s really the shoes that get me with Westwood’s THEMIS robot.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="54e30c5ee59756a2d61ce3e18febc32d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/qNbCu5Nc2Ik?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>THEMIS can also deliver a package just as well as a human can, if not better!</p><p class="shortcode-media shortcode-media-youtube">
  452. <span class="rm-shortcode" data-rm-shortcode-id="dfb001f72b17ae943c348c0d77fed1b9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/p-Rhy8fl2lk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  453. </p><p>And I appreciate the inclusion of all of these outtakes, too:</p><p class="shortcode-media shortcode-media-youtube">
  454. <span class="rm-shortcode" data-rm-shortcode-id="ebcf97fc771295a3b2de96be81dcd2eb" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zYhRJPO6GDk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  455. </p><p>[ <a href="https://www.westwoodrobotics.io/">Westwood Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Kepler Exploration Robot recently unveiled its latest innovation, the Kepler Forerunner series of general-purpose humanoid robots. This advanced humanoid stands at a height of 178cm (5’10”), weighs 85kg (187 lbs.), and boasts an intelligent and dexterous hand with 12 degrees of freedom. The entire body has up to 40 degrees of freedom, enabling functionalities such as navigating complex terrains, intelligent obstacle avoidance, flexible manipulation of hands, powerful lifting and carrying of heavy loads, hand-eye coordination, and intelligent interactive communication.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2bf8296ad38c207efc65b2813f208e6e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/A5vshTgDbKE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.gotokepler.com/home">Kepler Exploration</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Introducing the new Ballie, your true AI companion. With more advanced intelligence, Ballie can come right to you and project visuals on your walls. It can also help you interact with other connected devices or take care of hassles.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="905e3f17f45ca884101744195af29c62" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/YBfSX3QiqDM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://news.samsung.com/global/video-ces-2024-a-day-in-the-life-with-ballie-an-ai-companion-robot-for-the-home">Samsung</a> ]</p><div class="horizontal-rule"></div><p>There is a thing called Drone Soccer that got some exposure at CES this week, but apparently it’s been around for several years, and originated in South Korea. Inspired by <a href="https://en.wikipedia.org/wiki/Quidditch" target="_blank">Quiddich</a>, targeted at STEM students.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cb106d261cf352cdd6993a9f75959b01" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5Pzx6kv_1MM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dronesoccer.us/">Drone Soccer</a> ]</p><div class="horizontal-rule"></div><p>Every so often, JPL dumps a bunch of raw footage onto YouTube. 
This time, there’s <a data-linked-post="2652903805" href="https://spectrum.ieee.org/nasa-mars-rover-perseverance-landing" target="_blank">Perseverance</a>’s view of Ingenuity taking off, a test of the EELS robot, and an unusual sample tube drop test.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b7bc8693444b0e211a1294f202f65f18" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Z3pzmytXZvs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p class="shortcode-media shortcode-media-youtube">
  456. <span class="rm-shortcode" data-rm-shortcode-id="dff4929a981d382c8f06b16cec5c98cb" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/V5weg-xzedI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  457. </p><p class="shortcode-media shortcode-media-youtube">
  458. <span class="rm-shortcode" data-rm-shortcode-id="5c0d088c544948715fc41c76d04270d7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/MwoqL321FaU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  459. </p><p>[ <a href="https://www.jpl.nasa.gov/">JPL</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Our first months delivering to Walmart customers have made one thing clear: Demand for drone delivery is real. On the heels of our Dallas-wide FAA approvals, today we announced that millions of new DFW-area customers will have access to drone delivery in 2024!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="79d0bc4f9ffce7245866ea6e90523e73" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/d51c9CpSNqw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://blog.wing.com/2024/01/wing-and-walmart-expand-service.html">Wing</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Dave Burke works with Biomechatronics researcher Michael Fernandez to test a prosthesis with neural control, by cutting a sheet of paper with scissors. This is the first time in 30 years that Dave has performed this task with his missing hand.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="822819b94747e030aacd21a18233098e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/EX3zAY2sEzE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.media.mit.edu/posts/connected-mind-body-landing-1/">MIT</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Meet DJI’s first delivery drone—FlyCart 30. 
Overcome traditional transport challenges and start a new era of dynamic aerial delivery with large payload capacity, long operation range, high reliability, and intelligent features.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f3c2ce152228ea097b6d946a865a7ab4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Hhp11I-vGHA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dji.com/delivery?utm_source=youtube-own&utm_medium=social&utm_campaign=socialdaily&utm_term=Global&utm_content=ProductPage">DJI</a> ]</p><div class="horizontal-rule"></div><blockquote><em>The Waymo Driver autonomously operating both a passenger vehicle and class 8 truck safely in various freeway scenarios, including on-ramps and off-ramps, lane merges, and sharing the road with others.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1ff5d5a933d0cac54ae64c9830d22278" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/tgX7yzyfQ6E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://waymo.com/blog/2024/01/from-surface-streets-to-freeways-safely.html">Waymo</a> ]</p><div class="horizontal-rule"></div><blockquote><em>In this paper, we present DiffuseBot, a physics-augmented diffusion model that generates soft robot morphologies capable of excelling in a wide spectrum of tasks. 
DiffuseBot bridges the gap between virtually generated content and physical utility by (i) augmenting the diffusion process with a physical dynamical simulation which provides a certificate of performance, and ii) introducing a co-design procedure that jointly optimizes physical design and control by leveraging information about physical sensitivities from differentiable simulation.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="495ffb394bca75a3d2829fff395aa147" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/LSzasdvD3Ss?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://openreview.net/pdf?id=1zo4iioUEs">Paper</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 12 Jan 2024 18:15:53 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robot-make-me-coffee</guid><category>Video friday</category><category>Jpl</category><category>Figure</category><category>Samsung</category><category>Robotics</category><category>Humanoid robots</category><category>Drone delivery</category><category>Dji</category><category>Nasa</category><category>Mars rovers</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-photograph-of-a-silvery-humanoid-robot-placing-a-pod-into-a-keurig-machine.png?id=51064069&amp;width=980"></media:content></item><item><title>The Global Project to Make a General Robotic Brain</title><link>https://spectrum.ieee.org/global-robotic-brain</link><description><![CDATA[
  460. <img src="https://spectrum.ieee.org/media-library/a-silver-robot-with-one-arm-lifts-a-dinosaur-from-a-table-full-of-a-clutter-of-random-objects.jpg?id=50891347&width=1245&height=700&coordinates=0%2C0%2C0%2C0"/><br/><br/><p>
  461. <strong>The generative AI revolution</strong> embodied in tools like <a href="https://chat.openai.com/" rel="noopener noreferrer" target="_blank">ChatGPT</a>, <a href="https://www.midjourney.com/" rel="noopener noreferrer" target="_blank">Midjourney</a>, and many others is at its core based on a simple formula: Take a very large neural network, train it on a huge dataset scraped from the Web, and then use it to fulfill a broad range of user requests. Large language models (<a href="https://spectrum.ieee.org/tag/llms" target="_self">LLM</a>s) can answer questions, write code, and spout poetry, while image-generating systems can create convincing cave paintings or contemporary art.
  462. </p><p>
  463. So why haven’t these amazing AI capabilities translated into the kinds of helpful and broadly useful robots we’ve seen in science fiction? Where are the robots that can clean off the table, fold your laundry, and make you breakfast?
  464. </p><p>
  465. Unfortunately, the highly successful generative AI formula—big models trained on lots of Internet-sourced data—doesn’t easily carry over into robotics, because the Internet is not full of robotic-interaction data in the same way that it’s full of text and images. Robots need robot data to learn from, and this data is typically created slowly and tediously by researchers in laboratory environments for very specific tasks. Despite tremendous progress on robot-learning algorithms, without abundant data we still can’t enable robots to perform real-world tasks (like making breakfast) outside the lab. The most impressive results typically only work in a single laboratory, on a single robot, and often involve only a handful of behaviors.
  466. </p><p>
  467. If the abilities of each robot are limited by the time and effort it takes to manually teach it to perform a new task, what if we were to pool together the experiences of many robots, so a new robot could learn from all of them at once? We decided to give it a try. In 2023, our labs at Google and the University of California, Berkeley came together with 32 other robotics laboratories in North America, Europe, and Asia to undertake the
  468. <a href="https://robotics-transformer-x.github.io/" rel="noopener noreferrer" target="_blank">RT-X project</a>, with the goal of assembling data, resources, and code to make general-purpose robots a reality.
  469. </p><p>
  470. Here is what we learned from the first phase of this effort.
  471. </p><h2>How to create a generalist robot</h2><p>
  472. Humans are far better at this kind of learning. Our brains can, with a little practice, handle what are essentially changes to our body plan, which happen when we pick up a tool, ride a bicycle, or get in a car. That is, our “embodiment” changes, but our brains adapt. RT-X is aiming for something similar in robots: to enable a single deep neural network to control many different types of robots, a capability called cross-embodiment. The question is whether a deep neural network trained on data from a sufficiently large number of different robots can learn to “drive” all of them—even robots with very different appearances, physical properties, and capabilities. If so, this approach could potentially unlock the power of large datasets for robotic learning.
  473. </p><p>
  474. The scale of this project is very large because it has to be. The RT-X dataset currently contains nearly a million robotic trials for 22 types of robots, including many of the most commonly used robotic arms on the market. The robots in this dataset perform a huge range of behaviors, including picking and placing objects, assembly, and specialized tasks like cable routing. In total, there are about 500 different skills and interactions with thousands of different objects. It’s the largest open-source dataset of real robotic actions in existence.
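For readers who want to explore the data, here is a minimal sketch of what loading a few episodes from an RLDS-style robot dataset can look like with TensorFlow Datasets. The directory path below is a placeholder, not an official identifier, and the exact observation and action field names vary from robot to robot in the collection, so treat this as an assumption-laden illustration rather than the project's actual tooling.

import tensorflow_datasets as tfds

# Hypothetical dataset location; the real release is split into per-robot
# datasets with their own names and schemas.
builder = tfds.builder_from_directory("gs://example-bucket/rt_x_subset/0.1.0")
episodes = builder.as_dataset(split="train[:5]")  # a handful of demonstrations

for episode in episodes:
    # RLDS stores each trial as an episode holding a nested dataset of steps.
    for step in episode["steps"]:
        observation = step["observation"]  # typically includes a camera image
        action = step["action"]            # the commanded robot motion
        # ... (observation, action) pairs feed the learning pipeline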
  475. </p><p>
  476. Surprisingly, we found that our multirobot data could be used with relatively simple machine-learning methods, provided that we follow the recipe of using large neural-network models with large datasets. Leveraging the same kinds of models used in current LLMs like ChatGPT, we were able to train robot-control algorithms that do not require any special features for cross-embodiment. Much like a person can drive a car or ride a bicycle using the same brain, a model trained on the RT-X dataset can simply recognize what kind of robot it’s controlling from what it sees in the robot’s own camera observations. If the robot’s camera sees a
  477. <a href="https://www.universal-robots.com/products/ur10-robot/" rel="noopener noreferrer" target="_blank">UR10 industrial arm</a>, the model sends commands appropriate to a UR10. If the model instead sees a low-cost <a href="https://www.trossenrobotics.com/widowxrobotarm" rel="noopener noreferrer" target="_blank">WidowX hobbyist arm</a>, the model moves it accordingly.
  478. </p><p>
  479. To test the capabilities of our model, five of the laboratories involved in the RT-X collaboration each tested it in a head-to-head comparison against the best control system they had developed independently for their own robot. Each lab’s test involved the tasks it was using for its own research, which included things like picking up and moving objects, opening doors, and routing cables through clips. Remarkably, the single unified model provided improved performance over each laboratory’s own best method, succeeding at the tasks about 50 percent more often on average.
  480. </p><p>
  481. While this result might seem surprising, we found that the RT-X controller could leverage the diverse experiences of other robots to improve robustness in different settings. Even within the same laboratory, every time a robot attempts a task, it finds itself in a slightly different situation, and so drawing on the experiences of other robots in other situations helped the RT-X controller handle natural variability and edge cases. Here are a few examples of the range of these tasks:
  482. </p><h3></h3><br/><img alt="" class="rm-shortcode" data-rm-shortcode-id="00e876b7c84501e7c4c62bbcb263e762" data-rm-shortcode-name="rebelmouse-image" id="208c0" loading="lazy" src="https://spectrum.ieee.org/media-library/image.gif?id=51012338&width=980"/><h3></h3><br/><img alt="" class="rm-shortcode" data-rm-shortcode-id="65e3d7d7fdc36cf6568f50cd8cb740d5" data-rm-shortcode-name="rebelmouse-image" id="ae8a5" loading="lazy" src="https://spectrum.ieee.org/media-library/image.gif?id=51012379&width=980"/><h3></h3><br/><img alt="" class="rm-shortcode" data-rm-shortcode-id="eaa74473d68b631be56f65e342ad6f16" data-rm-shortcode-name="rebelmouse-image" id="d3985" loading="lazy" src="https://spectrum.ieee.org/media-library/image.gif?id=51012395&width=980"/><h2>Building robots that can reason</h2><p>
  483. Encouraged by our success with combining data from many robot types, we next sought to investigate how such data can be incorporated into a system with more in-depth reasoning capabilities. Complex semantic reasoning is hard to learn from robot data alone. While the robot data can provide a range of
  484. <em>physical</em> capabilities, more complex tasks like “Move apple between can and orange” also require understanding the semantic relationships between objects in an image, basic common sense, and other symbolic knowledge that is not directly related to the robot’s physical capabilities.
  485. </p><p>
  486. So we decided to add another massive source of data to the mix: Internet-scale image and text data. We used an existing large vision-language model that is already proficient at many tasks that require some understanding of the connection between natural language and images. The model is similar to the ones available to the public such as ChatGPT or
  487. <a href="https://bard.google.com/chat" rel="noopener noreferrer" target="_blank">Bard</a>. These models are trained to output text in response to prompts containing images, allowing them to solve problems such as visual question-answering, captioning, and other open-ended visual understanding tasks. We discovered that such models can be adapted to robotic control simply by training them to also output robot actions in response to prompts framed as robotic commands (such as “Put the banana on the plate”). We applied this approach to the robotics data from the RT-X collaboration.
  488. </p><p class="shortcode-media shortcode-media-rebelmouse-image">
  489. <img alt="An illustration of a map and robot tasks shown on the right.  " class="rm-shortcode" data-rm-shortcode-id="cef4567b55fe04bd320640ddcdb2779f" data-rm-shortcode-name="rebelmouse-image" id="d785a" loading="lazy" src="https://spectrum.ieee.org/media-library/an-illustration-of-a-map-and-robot-tasks-shown-on-the-right.png?id=51027917&width=980"/>
  490. <small class="image-media media-caption" placeholder="Add Photo Caption...">The RT-X model uses images or text descriptions of specific robot arms doing different tasks to output a series of discrete actions that will allow any robot arm to do those tasks. By collecting data from many robots doing many tasks from robotics labs around the world, we are building an open-source dataset that can be used to teach robots to be generally useful.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Chris Philpot</small></p><p>
  491. To evaluate the combination of Internet-acquired smarts and multirobot data, we tested our RT-X model with Google’s mobile manipulator robot. We gave it our hardest generalization benchmark tests. The robot had to recognize objects and successfully manipulate them, and it also had to respond to complex text commands by making logical inferences that required integrating information from both text and images. The latter is one of the things that make humans such good generalists. Could we give our robots at least a hint of such capabilities?</p><p>We conducted two sets of evaluations. As a baseline, we used a model that excluded all of the generalized multirobot RT-X data that didn’t involve Google’s robot. Google’s robot-specific dataset is in fact the largest part of the RT-X dataset, with over 100,000 demonstrations, so the question of whether all the other multirobot data would actually help in this case was very much open. Then we tried again with all that multirobot data included.</p><p>In one of the most difficult evaluation scenarios, the Google robot needed to accomplish a task that involved reasoning about spatial relations (“Move apple between can and orange”); in another task it had to solve rudimentary math problems (“Place an object on top of a paper with the solution to ‘2+3’”). These challenges were meant to test the crucial capabilities of reasoning and drawing conclusions.</p><p>In this case, the reasoning capabilities (such as the meaning of “between” and “on top of”) came from the Web-scale data included in the training of the vision-language model, while the ability to ground the reasoning outputs in robotic behaviors—commands that actually moved the robot arm in the right direction—came from training on cross-embodiment robot data from RT-X. An example of an evaluation where we asked the robot to perform a task not included in its training data is shown in the video below. </p><p class="shortcode-media shortcode-media-youtube">
  492. <span class="rm-shortcode" data-rm-shortcode-id="d363c625358f758679672ff941842fc7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/qSARoad-F-k?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  493. <small class="image-media media-caption" placeholder="Add Photo Caption...">Even without specific training, this Google research robot is able to follow the instruction “move apple between can and orange.” This capability is enabled by RT-X, a large robotic manipulation dataset and the first step towards a general robotic brain.</small>
  494. </p><p>While these tasks are rudimentary for humans, they present a major challenge for general-purpose robots. Without robotic demonstration data that clearly illustrates concepts like “between,” “near,” and “on top of,” even a system trained on data from many different robots would not be able to figure out what these commands mean. By integrating Web-scale knowledge from the vision-language model, our complete system was able to solve such tasks, deriving the semantic concepts (in this case, spatial relations) from Internet-scale training, and the physical behaviors (picking up and moving objects) from multirobot RT-X data. To our surprise, we found that the inclusion of the multirobot data improved the Google robot’s ability to generalize to such tasks by a factor of three. This result suggests that not only was the multirobot RT-X data useful for acquiring a variety of physical skills, it could also help to better connect such skills to the semantic and symbolic knowledge in vision-language models. These connections give the robot a degree of common sense, which could one day enable robots to understand the meaning of complex and nuanced user commands like “Bring me my breakfast” while carrying out the actions to make it happen.<br/></p><h2>The next steps for RT-X</h2><p>
  495. The RT-X project shows what is possible when the robot-learning community acts together. Because of this cross-institutional effort, we were able to put together a diverse robotic dataset and carry out comprehensive multirobot evaluations that wouldn’t be possible at any single institution. Since the robotics community can’t rely on scraping the Internet for training data, we need to create that data ourselves. We hope that more researchers will contribute their data to the
  496. <a href="https://robotics-transformer-x.github.io/" target="_blank">RT-X database</a> and join this collaborative effort. We also hope to provide tools, models, and infrastructure to support cross-embodiment research. We plan to go beyond sharing data across labs, and we hope that RT-X will grow into a collaborative effort to develop data standards, reusable models, and new techniques and algorithms.
  497. </p><p>
  498. Our early results hint at how large cross-embodiment robotics models could transform the field. Much as large language models have mastered a wide range of language-based tasks, in the future we might use the same foundation model as the basis for many real-world robotic tasks. Perhaps new robotic skills could be enabled by fine-tuning or even prompting a pretrained foundation model. In a similar way to how you can prompt ChatGPT to tell a story without first training it on that particular story, you could ask a robot to write “Happy Birthday” on a cake without having to tell it how to use a piping bag or what handwritten text looks like. Of course, much more research is needed for these models to take on that kind of general capability, as our experiments have focused on single arms with two-finger grippers doing simple manipulation tasks.
  499. </p><p>
  500. As more labs engage in cross-embodiment research, we hope to further push the frontier on what is possible with a single neural network that can control many robots. These advances might include adding diverse simulated data from generated environments, handling robots with different numbers of arms or fingers, using different sensor suites (such as depth cameras and tactile sensing), and even combining manipulation and locomotion behaviors. RT-X has opened the door for such work, but the most exciting technical developments are still ahead.
  501. </p><p>
  502. This is just the beginning. We hope that with this first step, we can together create the future of robotics: where general robotic brains can power any robot, benefiting from data shared by all robots around the world.
  503. <span class="ieee-end-mark"></span>
  504. </p><p><em>This article appears in the February 2024 print issue as “The Global Project to Make a General Robotic Brain.”</em></p>]]></description><pubDate>Tue, 09 Jan 2024 16:05:50 +0000</pubDate><guid>https://spectrum.ieee.org/global-robotic-brain</guid><category>Deep learning</category><category>Robotic learning</category><category>General purpose robots</category><category>Robotics</category><dc:creator>Karol Hausman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-silver-robot-with-one-arm-lifts-a-dinosaur-from-a-table-full-of-a-clutter-of-random-objects.jpg?id=50891347&amp;width=980"></media:content></item><item><title>Video Friday: 3-Course Cantonese Meal</title><link>https://spectrum.ieee.org/video-friday-3-course-cantonese-meal</link><description><![CDATA[
  505. <img src="https://spectrum.ieee.org/media-library/image.png?id=51011521&width=2081&height=1371&coordinates=0%2C0%2C479%2C69"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 02 February 2024, ZURICH</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><h5><a href="https://2024.robocup.org/">RoboCup 2024</a>: 17–22 July 2024, EINDHOVEN, NETHERLANDS</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p>One approach to robot autonomy is to learn from human demonstration, which can be very effective as long as you have enough high-quality data to work with. Mobile ALOHA is a low-cost and whole-body teleoperation system for data collection from Stanford’s IRIS Lab, and under the control of an experienced human, it can do pretty much everything we’ve ever fantasized about home robots doing for us.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="681bb0b51699d33a4f46a8d4266640cd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/mnLVbwxSdNM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p class="shortcode-media shortcode-media-youtube">
  506. <span class="rm-shortcode" data-rm-shortcode-id="b550601d51737b51bae72da69db43f9e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/HaaZ8ss-HP4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  507. </p><p>[ <a href="https://mobile-aloha.github.io/">Stanford</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Researchers at the John A. Paulson School of Engineering and Applied Sciences (SEAS)and Boston University’s Sargent College of Health & Rehabilitation Sciences used a soft, wearable robot to help a person living with Parkinson’s walk without freezing. The robotic garment, worn around the hips and thighs, gives a gentle push to the hips as the leg swings, helping the patient achieve a longer stride. The research demonstrates the potential of soft robotics to treat a potentially dangerous symptom of Parkinson’s disease and could allow people living with the disease to regain their mobility and independence.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5889e2f4ea6fe52ca658090e4862fc4a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pAWrypkeDXM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://seas.harvard.edu/news/2024/01/soft-robotic-wearable-device-improves-walking-individual-parkinsons-disease">Harvard SEAS</a> ]</p><div class="horizontal-rule"></div><p>Happy 2024 from SkyMul!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1cc72f5318343db8059bcfa3b1f4c353" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/EWo_J9vRHh4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://skymul.com/">SkyMul</a> ]</p><p>Thanks, Eohan!</p><div class="horizontal-rule"></div><blockquote><em>As the holiday season approaches, we at Kawasaki Robotics (USA), Inc. wanted to take a moment to express our warmest wishes to you. May your holidays be filled with joy, love, and peace, and may the New Year bring you prosperity, success, and happiness. From our team to yours, we wish you a very happy holiday season and a wonderful New Year ahead.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b32dca9af8f38fcadb9360c1493a9fd4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2RFhWaNoYRY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://kawasakirobotics.com/">Kawasaki Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Aurora Flight Sciences is working on a new X-plane for the Defense Advanced Research Projects Agency (DARPA)’s Control of Revolutionary Aircraft with Novel Effectors (CRANE) program. 
X-65 is purpose-designed for testing and demonstrating the benefits of active flow control (AFC) at tactically relevant scale and flight conditions.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="72926133f570c447703634deae50c1f8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/G_jhcqGjUww?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.aurora.aero/2024/01/03/aurora-begins-building-full-scale-active-flow-control-x-plane/">Aurora</a> ]</p><div class="horizontal-rule"></div><p>Well, this is the craziest piece of immersive robotic teleop hardware I’ve ever seen.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a2e4f5c6b7414b1524724e636b32f930" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/itPlHw_GnLg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.jinki.jp/news/2023122801">Jinkisha</a> ]</p><div class="horizontal-rule"></div><p>Looks like Moley Robotics is still working on the least practical robotic kitchen ever.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="023960bbbbfed85af1ea90f0831b276f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/WzeGT3m5oMU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.moley.com/">Moley</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 05 Jan 2024 19:49:23 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-3-course-cantonese-meal</guid><category>Moleyrobotics</category><category>Skymul</category><category>Stanford</category><category>Video friday</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/image.png?id=51011521&amp;width=980"></media:content></item><item><title>The Future We Saw Coming Is Now</title><link>https://spectrum.ieee.org/technology-forecast-2024</link><description><![CDATA[
  508. <img src="https://spectrum.ieee.org/media-library/a-photo-of-a-smiling-man-in-glasses-and-a-collared-shirt.png?id=50902322&width=1245&height=700&coordinates=0%2C125%2C0%2C873"/><br/><br/><p>
  509. As <em>IEEE Spectrum</em> editors, we pride ourselves on spotting promising technologies and following them from the research phase through development and ultimately deployment. In every January issue, we focus on the technologies that are now poised to achieve significant milestones in the new year.
  510. </p><p>
  511. This issue was curated by Senior Editor <a href="https://spectrum.ieee.org/u/samuel-k-moore" target="_self">Samuel K. Moore</a>, our in-house expert on semiconductors. So it’s no surprise that he included a story on Intel’s plan to roll out two momentous chip technologies in the next few months.
  512. </p><p>
  513. For “<a href="https://spectrum.ieee.org/intel-20a" target="_blank">Intel Hopes to Leapfrog Its Competitors</a>,” Moore directed our editorial intern, <a href="https://spectrum.ieee.org/u/gwendolynrak" target="_self">Gwendolyn Rak</a>, to report on the risk the chip giant is taking by introducing two technologies at once. We began tracking the first technology, <a href="https://spectrum.ieee.org/nanosheets-ibms-path-to-5nanometer-transistors" target="_self">nanosheet transistors</a>, in 2017. By the time we gave all the details in a <a href="https://spectrum.ieee.org/the-nanosheet-transistor-is-the-next-and-maybe-last-step-in-moores-law" target="_self">2019 feature article</a>, it was clear that this device was destined to be the successor to the <a href="https://spectrum.ieee.org/the-amazing-vanishing-transistor-act" target="_self">FinFET</a>. Moore <a href="https://spectrum.ieee.org/arm-shows-backside-power-delivery-as-path-to-further-moores-law" target="_self">first spotted </a>the second technology, <a href="https://spectrum.ieee.org/next-gen-chips-will-be-powered-from-below" target="_self">back-side power delivery</a>, at the IEEE International Electron Devices Meeting in 2019. Less than two years later, Intel publicly <a href="https://spectrum.ieee.org/intel-says-its-manufacturing-tech-will-lead-the-world-by-2025" target="_self">committed</a> to incorporating the tech in 2024.
  514. </p><p>
  515. Speaking of commitment, the U.S. military’s Defense Advanced Research Projects Agency has played an enormous part in bankrolling some of the fundamental advances that appear in these pages. Many of our readers will be familiar with the robots that Senior Editor Evan Ackerman <a href="https://spectrum.ieee.org/search/?q=darpa+humanoid+challenge" target="_self">covered during DARPA’s humanoid challenge</a> almost 10 years ago. Those robots were essentially research projects, but as Ackerman reports in “<a href="https://spectrum.ieee.org/humanoid-robots" target="_blank">Year of the Humanoid</a>,” a few companies will start up pilot projects in 2024 to see if this generation of humanoids is ready to roll up its metaphorical sleeves and get down to business.
  516. </p><p>
  517. More recently, fully homomorphic encryption (FHE) has burst onto the scene. Moore, who’s been covering the Cambrian explosion in chip architectures for AI and other alternative computing modalities since the mid-2010s, notes that, as with the robotics challenge, DARPA was the initial driver.
  518. </p><p>
  519. “You’d expect the three companies DARPA funded to come up with a chip, though there was no guarantee they’d commercialize it,” says Moore, who wrote “<a href="https://spectrum.ieee.org/homomorphic-encryption" target="_blank">Chips to Compute With Encrypted Data Are Coming</a>.” “But what you wouldn’t expect is three <em>more</em> startups, independently of DARPA, to come out with their own FHE chips at the same time.”
  520. </p><p>
  521. Senior Editor Tekla S. Perry’s story about phosphorescent OLEDs, “<a href="https://spectrum.ieee.org/blue-pholed" target="_blank">A Behind-the-Screens Change for OLED</a>,” is actually a deep cut for us. One of the first feature articles Moore edited at <em>Spectrum</em> way back in 2000 was <a href="https://spectrum.ieee.org/the-dawn-of-organic-electronics" target="_self">Stephen Forrest’s article on organic electronics</a>. His lab developed the first phosphorescent OLED materials, which are hugely more efficient than the fluorescent ones. Forrest’s research led to the founding of Universal Display Corp., which, after more than two decades, is finally about to commercialize the last of its trio of phosphorescent colors—blue.
  522. </p><p>
  523. Then there’s our cover story about deepfakes and their potential impact on dozens of national elections later this year. We’ve been tracking the rise of deepfakes since mid-2018, when we ran a story about AI researchers betting on whether or not a <a href="https://spectrum.ieee.org/experts-bet-on-first-deepfakes-political-scandal" target="_self">deepfake</a> video about a political candidate would receive more than 2 million views during the U.S. midterm elections that year. As Senior Editor Eliza Strickland reports in “<a href="https://spectrum.ieee.org/deepfakes-election" target="_blank">This Election Year, Look for Content Credentials</a>,” several companies and industry groups are working hard to ensure that deepfakes don’t take down democracy.
  524. </p><p>
  525. Best wishes for a healthy and prosperous new year, and enjoy this year’s technology forecast. It’s been years in the making.
  526. </p><p>This article appears in the January 2024 print issue.</p><p><em>This post was corrected on 2 January. Stephen Forrest was involved in the creation of Universal Display, but he was not a cofounder</em>.</p>]]></description><pubDate>Tue, 02 Jan 2024 16:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/technology-forecast-2024</guid><category>Technology forecast</category><category>Pilot projects</category><category>Robots</category><category>Fhe</category><category>Deepfakes</category><category>Pholeds</category><category>Intel</category><category>Semiconductors</category><category>Ai</category><dc:creator>Harry Goldstein</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-photo-of-a-smiling-man-in-glasses-and-a-collared-shirt.png?id=50902322&amp;width=980"></media:content></item><item><title>11 Intriguing Engineering Milestones to Look for in 2024</title><link>https://spectrum.ieee.org/technology-in-2024</link><description><![CDATA[
  527. <img src="https://spectrum.ieee.org/media-library/an-illustration-of-multiple-icons-including-a-robot-and-a-satellite.png?id=50708266&width=1245&height=700&coordinates=0%2C80%2C0%2C80"/><br/><br/><p class="ieee-editors-note">
  528. <em>This story is part of our <a href="https://spectrum.ieee.org/special-reports/top-tech-2024/" rel="noopener noreferrer" target="_blank">Top Tech 2024</a> special report.</em>
  529. </p><h3>Journey to the Center of the Earth</h3><br/><img alt="" class="rm-shortcode" data-rm-shortcode-id="090f60aacae23989d2b7a9bceccd09a1" data-rm-shortcode-name="rebelmouse-image" id="91196" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=50706132&width=980"/><h3></h3><br/><p>To unlock the terawatt potential of geothermal energy, MIT startup <a href="https://www.quaise.energy/" rel="noopener noreferrer" target="_blank">Quaise Energy</a> is testing a deep-drilling rig in 2024 that will use high-power <a href="https://spectrum.ieee.org/altarock-energy-melts-rock-with-millimeter-waves-for-geothermal-wells" target="_self">millimeter waves</a> to melt a column of rock down as far as 10 to 20 kilometers. Its “deeper, hotter, and faster” strategy will start with old oil-and-gas drilling structures and extend them by blasting radiation from a gyrotron to vaporize the hard rock beneath. At these depths, Earth reaches 500 <sup>°</sup>C. Accessing this superhot geothermal energy could be a key part of achieving net zero emission goals by 2050, according to Quaise executives.</p><h3></h3><br/><div style="width: 75%; margin-left: auto; margin-right: auto;">
  530. <div class="horizontal-rule">
  531. </div>
  532. </div><h3>“Batteries Included” Induction Ovens</h3><br/><img alt="" class="rm-shortcode" data-rm-shortcode-id="c537ad3f0fb11d15fd8179df91f0040b" data-rm-shortcode-name="rebelmouse-image" id="71b60" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=50706087&width=980"/><h3></h3><br/><p>Now we’re cooking with gas—but soon, we may be cooking with induction. A growing number of consumers are switching to induction-based stoves and ovens to address environmental concerns and health risks associated with gas ranges. But while these new appliances are more energy efficient, most models require modified electrical outlets and cost hundreds of dollars to install. That’s why startups like <a href="https://www.channingcopper.com/pages/about" rel="noopener noreferrer" target="_blank">Channing Street Copper</a> and <a href="https://www.impulselabs.com/" rel="noopener noreferrer" target="_blank">Impulse Labs</a> are working to make induction ovens easier to install by adding built-in batteries that supplement regular wall-socket power. Channing Street Copper plans to roll out its battery-boosted Charlie appliance in <a href="https://www.energy.gov/sites/default/files/2023-05/bto-peer-2023-copper-street.pdf" rel="noopener noreferrer" target="_blank">early 2024</a>.</p><h3></h3><br/><div style="width: 75%; margin-left: auto; margin-right: auto;">
  533. <div class="horizontal-rule">
  534. </div>
  535. </div><h3>Triage Tech to the Rescue</h3><br/><img alt="" class="rm-shortcode" data-rm-shortcode-id="ddf46644590772bfebb47ee4ffc82b15" data-rm-shortcode-name="rebelmouse-image" id="4be5a" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=50742368&width=980"/><h3></h3><br/><p>In the second half of 2024, the U.S. <a href="https://www.darpa.mil/" rel="noopener noreferrer" target="_blank">Defense Advanced Research Projects Agency</a> will begin the first round of its <a href="https://triagechallenge.darpa.mil/" rel="noopener noreferrer" target="_blank">Triage Challenge</a>, a competition to develop sensors and algorithms to support triage efforts during mass-casualty incidents. According to a DARPA <a href="https://www.youtube.com/watch?v=f-nSqKTWlHA" rel="noopener noreferrer" target="_blank">video presentation</a> from last February, the agency is seeking new ways to help medics at two stages of treatment: During primary triage, those most in need of care will be identified with sensors from afar. Then, when the patients are stable, medics can decide the best treatment regimens based on data gleaned from noninvasive sensors. The three rounds will continue through 2026, with prizes totaling US $7 million.</p><h3></h3><br/><div style="width: 75%; margin-left: auto; margin-right: auto;">
  536. <div class="horizontal-rule">
  537. </div>
  538. </div><h3>Killer Drones Deployed From the Skies</h3><br/><img alt="" class="rm-shortcode" data-rm-shortcode-id="f56628494ac6de7916c7c13a4e5c3ae6" data-rm-shortcode-name="rebelmouse-image" id="cdfdc" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=50742310&width=980"/><h3></h3><br/><p>A new class of missile-firing drones will take to the skies in 2024. Like a three-layer aerial nesting doll, the missile-stuffed drone is itself released from the belly of a bomber while in flight. The uncrewed aircraft was developed by energy and defense company <a href="https://www.ga.com/" rel="noopener noreferrer" target="_blank">General Atomics</a> as part of the <a href="https://www.darpa.mil/program/longshot" rel="noopener noreferrer" target="_blank">Defense Advanced Research Projects Agency’s LongShot program</a> and will be flight-tested this year to prove its feasibility in air-based combat. Its goal is to extend the range and effectiveness of both air-to-air missiles and the current class of fighter jets while new aircraft are introduced.</p><h3></h3><br/><div style="width: 75%; margin-left: auto; margin-right: auto;">
  539. <div class="horizontal-rule">
  540. </div>
  541. </div><h3>Visible’s Anti-Activity Tracker</h3><br/><img alt="" class="rm-shortcode" data-rm-shortcode-id="1c84a95252690d88e5ae658fae52ff40" data-rm-shortcode-name="rebelmouse-image" id="c8777" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=50742306&width=980"/><h3></h3><br/><p>Long COVID and chronic fatigue often go unseen by others. But it’s important that people with these invisible illnesses understand how different activities affect their symptoms so they can properly pace their days. That’s why one man with long COVID, Harry Leeming, decided to create <a href="https://www.makevisible.com/" rel="noopener noreferrer" target="_blank">Visible</a>, an app that helps users monitor activity and avoid overexertion. This year, according to Leeming, Visible will launch a premium version of the app that uses a specialized heart-rate monitor. While most wearables are meant for workouts, Leeming says, these armband monitors are optimized for lower heart rates to help people with both long COVID and fatigue. The app will also collect data from consenting users to help research these conditions.</p><h3></h3><br/><div style="width: 75%; margin-left: auto; margin-right: auto;">
  542. <div class="horizontal-rule">
  543. </div>
  544. </div><h3>Amazon Launches New Internet Service—Literally</h3><br/><img alt="" class="rm-shortcode" data-rm-shortcode-id="91aad20ad75ff8dbc4c158b6b22f2ec8" data-rm-shortcode-name="rebelmouse-image" id="de16d" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=50742297&width=980"/><h3></h3><br/><p>Amazon expects to begin providing Internet service from space with <a href="https://www.aboutamazon.com/what-we-do/devices-services/project-kuiper" rel="noopener noreferrer" target="_blank">Project Kuiper</a> by the end of 2024. The US $10 billion project aims to expand reliable broadband internet access to rural areas around the globe by launching a constellation of more than 3,000 satellites into low Earth orbit. While the project will take years to complete in full, Amazon is set to start beta testing with customers later this year. If successful, Kuiper <a href="https://spectrum.ieee.org/amazons-project-kuiper-is-more-than-the-companys-response-to-spacex" target="_self">could be integrated</a> into the suite of Amazon Web Services. SpaceX’s <a href="https://www.starlink.com/" rel="noopener noreferrer" target="_blank">Starlink</a>, meanwhile, has been active since 2019 and already has 5,000 satellites in orbit.</p><h3></h3><br/><div style="width: 75%; margin-left: auto; margin-right: auto;">
  545. <div class="horizontal-rule">
  546. </div>
  547. </div><h3>Solar-Powered Test Drive</h3><br/><img alt="" class="rm-shortcode" data-rm-shortcode-id="bc379ac8939adc20ff1ef346d3b61b0d" data-rm-shortcode-name="rebelmouse-image" id="6009e" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=50742295&width=980"/><h3></h3><br/><p>The next car you buy might be powered by the sun. <a href="https://spectrum.ieee.org/the-aptera-2e-is-the-wildestlooking-electric-car-youll-ever-see" target="_self">Long awaited</a> by potential customers and crowdfunders, <a href="https://spectrum.ieee.org/solar-powered-cars" target="_self">solar electric vehicles</a> (SEVs) made by the startup <a href="https://aptera.us/" rel="noopener noreferrer" target="_blank">Aptera Motors</a> are set to hit the road in 2024, the company says. Like the cooler cousin of an SUV, these three-wheeled SEVs feature a sleek, aerodynamic design to cut down on drag. The latest version of the vehicle combines plug-in capability with solar panels that cover its roof, allowing for a 1,600-kilometer range on a single charge and up to 65 km a day from solar power. Aptera says it aims to begin early production in 2024, with the first 2,000 vehicles set to be delivered to investors. </p><h3></h3><br/><div style="width: 75%; margin-left: auto; margin-right: auto;">
  548. <div class="horizontal-rule">
  549. </div>
  550. </div><h3>Zero Trust, Two-Thirds Confidence</h3><br/><img alt="" class="rm-shortcode" data-rm-shortcode-id="8199e6c65b14e97b4aaca5e053577577" data-rm-shortcode-name="rebelmouse-image" id="c908f" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=50742180&width=980"/><h3></h3><br/><p>“Trust but verify” is now a proverb of the past in cybersecurity policy in the United States. By the end of the 2024 fiscal year, in September, all U.S. government agencies will be <a href="https://www.whitehouse.gov/wp-content/uploads/2022/01/M-22-09.pdf" rel="noopener noreferrer" target="_blank">required to switch</a> to a Zero Trust security architecture. All users must validate their identity and devices—even when they’re already connected to government networks and VPNs. This is achieved with methods like multifactor authentication and other access controls. About two-thirds of security professionals employed by federal agencies are confident that their department will hit the cybersecurity deadline, <a href="https://swimlane.com/resources/reports/security-automation-federal-agencies/" rel="noopener noreferrer" target="_blank">according to a 2023 report</a>.</p><h3></h3><br/><div style="width: 75%; margin-left: auto; margin-right: auto;">
  551. <div class="horizontal-rule">
  552. </div>
  553. </div><h3>First Light for Vera Rubin</h3><br/><img alt="" class="rm-shortcode" data-rm-shortcode-id="72d7f8e59b853361f470f2080e5713be" data-rm-shortcode-name="rebelmouse-image" id="a8281" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=50742082&width=980"/><h3></h3><br/><p><a href="https://rubinobservatory.org/" rel="noopener noreferrer" target="_blank">Vera C. Rubin Observatory</a>, home to the <a href="https://spectrum.ieee.org/the-world-s-largest-camera-is-nearly-complete" target="_self">largest digital camera</a> ever constructed, is expected to open its eye to the sky for the first time in late 2024. The observatory features an 8.4-meter wide-field telescope that will scan the Southern Hemisphere’s skies over the course of a decade-long project. Equipped with a 3,200-megapixel camera, the telescope will photograph an area the size of 40 full moons with each exposure from its perch atop a Chilean mountain. That means it can capture the entire visible sky every three to four nights. When operational, the Rubin Observatory will help astronomers inventory the solar system, map the Milky Way, and shed light on dark matter and dark energy.</p><h3></h3><br/><div style="width: 75%; margin-left: auto; margin-right: auto;">
  554. <div class="horizontal-rule">
  555. </div>
  556. </div><h3>Hailing Air Taxis at the Olympics</h3><br/><img alt="" class="rm-shortcode" data-rm-shortcode-id="84fddec66e3d8799a613e24ae4309dd5" data-rm-shortcode-name="rebelmouse-image" id="830c2" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=50742058&width=980"/><h3></h3><br/><p>At this year’s summer Olympic Games in Paris, attendees may be able to take an electric vertical-take-off-and-landing vehicle, or eVTOL, to get around the city. <a href="https://www.volocopter.com/en" rel="noopener noreferrer" target="_blank">Volocopter</a>, in Bruchsal, Germany, hopes to make an air taxi service available to sports enthusiasts and tourists during the competition. Though the company is still awaiting certification from the European Union Aviation Safety Agency, Volocopter plans to offer three routes between various parts of the city, as well as two round-trip routes for tourists. Volocopter’s air taxis could make Paris the first European city to offer eVTOL services.</p><h3></h3><br/><div style="width: 75%; margin-left: auto; margin-right: auto;">
  557. <div class="horizontal-rule">
  558. </div>
  559. </div><h3>Faster Than a Speeding Bullet</h3><br/><img alt="" class="rm-shortcode" data-rm-shortcode-id="f95f9b5c84a8263d85344e5b4175ca0d" data-rm-shortcode-name="rebelmouse-image" id="44109" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=50741847&width=980"/><h3></h3><br/><p><a href="https://boomsupersonic.com/" rel="noopener noreferrer" target="_blank">Boom </a>Technology is developing an airliner, called Overture, that flies faster than the speed of sound. The U.S. company says it’s set to finish construction of its North Carolina “superfactory” in 2024. Each year Boom plans to manufacture as many as 33 of the aircraft, which the company claims will be the world’s fastest airliner. Overture is designed to be capable of flying twice as fast as today’s commercial planes, and Boom says it expects the plane to be powered by sustainable aviation fuel, made without petroleum. The company says it already has orders in place from commercial airlines and is aiming for first flight by 2027.</p>]]></description><pubDate>Mon, 01 Jan 2024 16:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/technology-in-2024</guid><category>Killer robots</category><category>Satellite internet</category><category>Supersonic</category><category>Geothermal</category><dc:creator>Gwendolyn Rak</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/an-illustration-of-multiple-icons-including-a-robot-and-a-satellite.png?id=50708266&amp;width=980"></media:content></item><item><title>Humanoid Robots Are Getting to Work</title><link>https://spectrum.ieee.org/humanoid-robots</link><description><![CDATA[
  560. <img src="https://spectrum.ieee.org/media-library/a-photo-of-a-robot-holding-a-container-in-front-of-rows-of-containers.jpg?id=50868927&width=1245&height=700&coordinates=0%2C645%2C0%2C645"/><br/><br/><p>
  561. <strong>Ten years ago,</strong> at the <a href="https://spectrum.ieee.org/darpa-robotics-challenge-trials-what-we-learned-on-day-1" target="_self">DARPA Robotics Challenge (DRC) Trial event</a> near Miami, I watched the most advanced humanoid robots ever built struggle their way through a scenario inspired by the <a href="https://spectrum.ieee.org/24-hours-at-fukushima" target="_self">Fukushima nuclear disaster</a>. A team of experienced engineers controlled each robot, and overhead safety tethers kept them from falling over. The robots had to demonstrate mobility, sensing, and manipulation—which, with painful slowness, they did.
  562. </p><p>
  563. These robots were clearly research projects, but DARPA has a history of catalyzing technology with a long-term view. The DARPA Grand and Urban Challenges for autonomous vehicles, in 2005 and 2007, formed the foundation for today’s autonomous taxis. So, after DRC ended in 2015 with several of the robots successfully completing the entire final scenario, the obvious question was: When would humanoid robots make the transition from research project to a commercial product?
  564. </p><div class="ieee-sidebar-small">
  565. <p>
  566. This article is part of our special report
  567. <a href="https://spectrum.ieee.org/special-reports/top-tech-2024" target="_blank">Top Tech 2024</a>.
  568. </p>
  569. </div><p>
  570. The answer seems to be 2024, when a handful of well-funded companies will be deploying their robots in commercial pilot projects to figure out whether humanoids are really ready to get to work.
  571. </p><p>
  572. One of the robots that
  573. <a href="https://www.youtube.com/watch?v=SkepUwb08EE" rel="noopener noreferrer" target="_blank">made an appearance at the DRC Finals in 2015 was called ATRIAS</a>, developed by Jonathan Hurst at the <a href="https://mime.engineering.oregonstate.edu/research/drl/" rel="noopener noreferrer" target="_blank">Oregon State University Dynamic Robotics Laboratory</a>. In 2015, Hurst cofounded <a href="https://agilityrobotics.com/" rel="noopener noreferrer" target="_blank">Agility Robotics</a> to turn ATRIAS into <a href="https://spectrum.ieee.org/building-robots-that-can-go-where-we-go" target="_self">a human-centric, multipurpose, and practical robot called</a> Digit. Approximately the same size as a human, <a href="https://robotsguide.com/robots/digit" rel="noopener noreferrer" target="_blank">Digit</a> stands 1.75 meters tall (about 5 feet, 8 inches), weighs 65 kilograms (about 140 pounds), and can lift 16 kg (about 35 pounds). Agility is now preparing to produce a commercial version of Digit at massive scale, and the company sees its first opportunity in the logistics industry, where it will start doing some of the jobs where humans are essentially acting like robots already.
  574. </p><h2>Are humanoid robots useful?</h2><p>
  575. “We spent a long time working with potential customers to find a use case where our technology can provide real value, while also being scalable and profitable,” Hurst says. “For us, right now, that use case is moving e-commerce totes.” Totes are standardized containers that warehouses use to store and transport items. As items enter or leave the warehouse, empty totes need to be continuously moved from place to place. It’s a vital job, and even in highly automated warehouses, much of that job is done by humans.
  576. </p><p>
  577. Agility says that in the United States, there are currently several million people working at tote-handling tasks, and
  578. <a href="https://www.theguardian.com/technology/2022/jun/22/amazon-workers-shortage-leaked-memo-warehouse" rel="noopener noreferrer" target="_blank">logistics companies</a> are having trouble keeping positions filled, because in some markets there are simply not enough workers available. Furthermore, the work tends to be dull, repetitive, and stressful on the body. “The people doing these jobs are basically doing robotic jobs,” says Hurst, and Agility argues that these people would be much better off doing work that’s more suited to their strengths. “What we’re going to have is a shifting of the human workforce into a more supervisory role,” explains Damion Shelton, Agility Robotics’ CEO. “We’re trying to build something that works with people,” Hurst adds. “We want humans for their judgment, creativity, and decision-making, using our robots as tools to do their jobs faster and more efficiently.”
  579. </p><p>
  580. For Digit to be an effective warehouse tool, it has to be capable, reliable, safe, and financially sustainable for both Agility and its customers. Agility is confident that all of this is possible, citing Digit’s potential relative to the cost and performance of human workers. “What we’re encouraging people to think about,” says Shelton, “is how much they could be saving per hour by being able to allocate their human capital elsewhere in the building.” Shelton estimates that a typical large logistics company spends at least US $30 per employee-hour for labor, including benefits and overhead. The employee, of course, receives much less than that.
  581. </p><div class="flourish-embed flourish-cards" data-src="visualisation/16205583?607871">
  582. <script src="https://public.flourish.studio/resources/embed.js"> </script>
  583. </div><p>
  584. Agility is not yet ready to provide pricing information for Digit, but we’re told that it will cost less than $250,000 per unit. Even at that price, if Digit is able to achieve Agility’s goal of at least 20,000 working hours (five years of two shifts of work per day), that brings the hourly rate of the robot to $12.50. A service contract would likely add a few dollars per hour to that. “You compare that against human labor doing the same task,” Shelton says, “and as long as it’s apples to apples in terms of the rate that the robot is working versus the rate that the human is working, you can decide whether it makes more sense to have the person or the robot.”
  585. </p>
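<p>To make that comparison concrete, here is a minimal back-of-the-envelope sketch in Python using the figures above; the service-contract cost per hour is an assumed placeholder for “a few dollars,” not a number from Agility:</p>
<pre>
# Rough hourly-cost comparison between Digit and human labor, using the
# figures quoted in this article. The service cost is an assumed placeholder.
robot_price_usd = 250_000        # stated upper bound on Digit's unit price
working_hours = 20_000           # Agility's goal: ~5 years of 2 shifts per day
service_per_hour_usd = 3.0       # assumption for "a few dollars per hour"
human_cost_per_hour_usd = 30.0   # typical fully loaded labor cost cited above

robot_per_hour = robot_price_usd / working_hours + service_per_hour_usd
print(f"Robot: ${robot_per_hour:.2f}/hour")  # Robot: $15.50/hour
print(f"Human: ${human_cost_per_hour_usd:.2f}/hour")  # Human: $30.00/hour
</pre>
<p>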
  587. Agility’s robot won’t be able to match the general capability of a human, but that’s not the company’s goal. “Digit won’t be doing everything that a person can do,” says Hurst. “It’ll just be doing that one process-automated task,” like moving empty totes. In these tasks, Digit is able to keep up with (and in fact slightly exceed) the speed of the average human worker, when you consider that the robot doesn’t have to accommodate the needs of a frail human body.
  588. </p><h2>Amazon’s experiments with warehouse robots</h2><p>
  589. The first company to put Digit to the test is Amazon. In 2022, Amazon invested in Agility as part of its
  590. <a href="https://www.aboutamazon.com/news/innovation-at-amazon/introducing-the-1-billion-amazon-industrial-innovation-fund" rel="noopener noreferrer" target="_blank">Industrial Innovation Fund</a>, and late last year <a href="https://www.aboutamazon.com/news/operations/amazon-introduces-new-robotics-solutions" rel="noopener noreferrer" target="_blank">Amazon started testing Digit at its robotics research and development site near Seattle, Wash.</a> Digit will not be lonely at Amazon—the company currently has more than 750,000 robots deployed across its warehouses, including legacy systems that operate in closed-off areas as well as more modern robots that have the necessary autonomy to work more collaboratively with people. These newer robots include autonomous mobile robotic bases like <a href="https://robotsguide.com/robots/proteus" rel="noopener noreferrer" target="_blank">Proteus</a>, which can move carts around warehouses, as well as stationary robot arms like Sparrow and Cardinal, which can handle inventory or customer orders in structured environments. But a robot with legs will be something new.
  591. </p><p>
  592. “What’s interesting about Digit is because of its bipedal nature, it can fit in spaces a little bit differently,” says Emily Vetterick, director of engineering at
  593. <a href="https://amazon.jobs/en/landing_pages/amazon-global-robotics" rel="noopener noreferrer" target="_blank">Amazon Global Robotics</a>, who is overseeing Digit’s testing. “We’re excited to be at this point with Digit where we can start testing it, because we’re going to learn where the technology makes sense.”
  594. </p><p>
  595. Where two legs make sense has been an ongoing question in robotics for decades. Obviously, in a world designed primarily for humans, a robot with a humanoid form factor would be ideal. But balancing dynamically on two legs is still difficult for robots, especially when those robots are carrying heavy objects and are expected to work at a human pace for tens of thousands of hours. When is it worthwhile to use a bipedal robot instead of something simpler?
  596. </p><p class="pull-quote">
  597. “The people doing these jobs are basically doing robotic jobs.”<strong>—Jonathan Hurst, Agility Robotics</strong>
  598. </p><p>
  599. “The use case for Digit that I’m really excited about is empty tote recycling,” Vetterick says. “We already automate this task in a lot of our warehouses with a conveyor, a very traditional automation solution, and we wouldn’t want a robot in a place where a conveyor works. But a conveyor has a specific footprint, and it’s conducive to certain types of spaces. When we start to get away from those spaces, that’s where robots start to have a functional need to exist.”
  600. </p><p>
  601. The need for a robot doesn’t always translate into the need for a robot with legs, however, and a company like Amazon has the resources to build its warehouses to support whatever form of robotics or automation it needs. Its newer warehouses are indeed built that way, with flat floors, wide aisles, and other environmental considerations that are particularly friendly to robots with wheels.
  602. </p><p>
  603. “The building types that we’re thinking about [for Digit] aren’t our new-generation buildings. They’re older-generation buildings, where we can’t put in traditional automation solutions because there just isn’t the space for them,” says Vetterick. She describes the organized chaos of some of these older buildings as including narrower aisles with roof supports in the middle of them, and areas where pallets, cardboard, electrical cord covers, and ergonomics mats create uneven floors. “Our buildings are easy for people to navigate,” Vetterick continues. “But even small obstructions become barriers that a wheeled robot might struggle with, and where a walking robot might not.” Fundamentally, that’s the advantage bipedal robots offer relative to other form factors: They can quickly and easily fit into spaces and workflows designed for humans. Or at least, that’s the goal.
  604. </p><p>
  605. Vetterick emphasizes that the Seattle R&D site deployment is only a very small initial test of Digit’s capabilities. Having the robot move totes from a shelf to a conveyor across a flat, empty floor is not reflective of the use case that Amazon ultimately would like to explore. Amazon is not even sure that Digit will turn out to be the best tool for this particular job, and for a company so focused on efficiency, only the best solution to a specific problem will find a permanent home as part of its workflow. “Amazon isn’t interested in a general-purpose robot,” Vetterick explains. “We are always focused on what problem we’re trying to solve. I wouldn’t want to suggest that Digit is the only way to solve this type of problem. It’s one potential way that we’re interested in experimenting with.”
  606. </p><p>
  607. The idea of a general-purpose humanoid robot that can assist people with whatever tasks they may need is certainly appealing, but as Amazon makes clear, the first step for companies like Agility is to find enough value performing a single task (or perhaps a few different tasks) to achieve sustainable growth. Agility believes that Digit will be able to scale its business by solving Amazon’s empty tote-recycling problem, and the company is confident enough that it’s preparing to open a
  608. <a href="https://spectrum.ieee.org/agility-humanoid-robotics-factory" target="_self">factory</a> in Salem, Ore. At peak production, the plant will be capable of manufacturing 10,000 Digit robots per year.
  609. </p><h2>A menagerie of humanoids</h2><p>
  610. Agility is not alone in its goal to commercially deploy bipedal robots in 2024. At least seven other companies are also working toward this goal, with hundreds of millions of dollars of funding backing them.
  611. <a href="https://www.1x.tech/androids" rel="noopener noreferrer" target="_blank">1X</a>, <a href="https://apptronik.com/apollo" rel="noopener noreferrer" target="_blank">Apptronik</a>, <a href="https://www.figure.ai/" rel="noopener noreferrer" target="_blank">Figure</a>, <a href="https://sanctuary.ai/product/" rel="noopener noreferrer" target="_blank">Sanctuary</a>, <a href="https://www.tesla.com/AI" rel="noopener noreferrer" target="_blank">Tesla</a>, and <a href="https://www.unitree.com/h1/" rel="noopener noreferrer" target="_blank">Unitree</a> all have commercial humanoid robot prototypes.
  612. </p><p>
  613. Despite an influx of money and talent into commercial humanoid robot development over the past two years, there have been no recent fundamental technological breakthroughs that will substantially aid these robots’ development. Sensors and computers are capable enough, but actuators remain complex and expensive, and batteries struggle to power bipedal robots for the length of a work shift.
  614. </p><p>
  615. There are other challenges as well, including creating a robot that’s manufacturable with a resilient supply chain and developing the service infrastructure to support a commercial deployment at scale. The biggest challenge by far is software. It’s not enough to simply build a robot that can do a job—that robot has to do the job with the kind of safety, reliability, and efficiency that will make it desirable as more than an experiment.
  616. </p><p>
  617. There’s no question that Agility Robotics and the other companies developing commercial humanoids have impressive technology, a compelling narrative, and an enormous amount of potential. Whether that potential will translate into humanoid robots in the workplace now rests with companies like Amazon, who seem cautiously optimistic. It would be a fundamental shift in how repetitive labor is done. And now, all the robots have to do is deliver.
  618. <span class="ieee-end-mark"></span>
  619. </p><p><em>This article appears in the January 2024 print issue as “Year of the Humanoid.”</em></p>]]></description><pubDate>Sat, 30 Dec 2023 16:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/humanoid-robots</guid><category>Humanoid robots</category><category>Robots</category><category>Darpa robotics challenge</category><category>Agility robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-photo-of-a-robot-holding-a-container-in-front-of-rows-of-containers.jpg?id=50868927&amp;width=980"></media:content></item><item><title>Video Friday: More Happy Holidays!</title><link>https://spectrum.ieee.org/video-friday-more-happy-holidays</link><description><![CDATA[
  620. <img src="https://spectrum.ieee.org/media-library/image.png?id=50953498&width=1245&height=700&coordinates=0%2C0%2C0%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 02 February 2024, ZURICH, SWITZERLAND</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote><em>Wishing you and your loved ones merry Christmas, happy holidays, and a happy New Year from everyone at the Autonomous Systems Lab at ETH Zürich!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b77a05be58fa66a7f1b271d15bd5ed86" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/jNTQXBAgG4g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://asl.ethz.ch/">ASL</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Merry Christmas and sustainable 2024 from VUB-imec Brubotics & Fysc!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f8f21097fb74aef8e8c2b4e575e5d5ec" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/tEgP_6lJWuU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.brubotics.eu/">BruBotics</a> ]</p><p>Thanks, Bram!</p><div class="horizontal-rule"></div><blockquote><em>Embark on MOMO (Mobile Object Manipulation Operator)’s thrilling quest to ignite joy and excitement! Watch as MOMO skillfully places the tree topper, ensuring that every KIMLAB member’s holiday season is filled with happiness and brightness. Happy Holidays!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="74f921f8aee2bbf915544a4b03fc4de0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FENfp1-bct8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://publish.illinois.edu/kimlab2020/">KIMLAB</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Merry Christmas from AgileX Robotics and our little wheeled bipedal robot, T-Rex! As we step into 2024, may the joy of the season accompany you throughout the year. 
Here’s to a festive holiday filled with warmth, laughter, and innovative adventures!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ed6fb0bfa334c043384459cb3dee5fd8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/IbYBE2p6uwM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://global.agilex.ai/">AgileX Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote><em>To celebrate this amazing year, we’d like to share a special holiday video showcasing our most requested demo! We hope it brings you a smile as bright as the lights of the season.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="79574789b66187ccd6d2db4d765489d6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Ip7gfJ1tGG8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.flexiv.com/en/">Flexiv</a> ]</p><div class="horizontal-rule"></div><blockquote><em>The Robotnik team is still working to make even smarter, more autonomous and more efficient mobile robotics solutions available to you in 2024. Merry Christmas!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="443132ed025602b0fc601ca782c7b2e3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Yhluwos4D_8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotnik.eu/products/">Robotnik</a> ]</p><div class="horizontal-rule"></div><p>Season’s Greetings from ABB Robotics!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="29b01f3253db0a81d87756d61a33c936" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/JaCmAF9b9OY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://new.abb.com/products/robotics">ABB</a> ]</p><div class="horizontal-rule"></div><p>If you were at ICRA you got a sneak peak at this, but here’s a lovely Spot tango from the AI Institute.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b430774c4336673ae1293cbce156745f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/hSP3V9yrSPY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://theaiinstitute.com/">The Institute</a> ]</p><div class="horizontal-rule"></div><blockquote><em>CL-1 is one of the few humanoid robots around the world that achieves dynamic stair climbing based on real-time terrain perception, mainly thanks to LimX Dynamics’ advanced motion control and AI algorithms, along with proprietary high-performing actuators and hardware 
system.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6f9d90bc8efd57362ab3e8b79696a674" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/sihIDeJ4Hmk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.limxdynamics.com/en">LimX Dynamics</a> ]</p><div class="horizontal-rule"></div><p>We wrote about Parallel Systems a couple years ago, and here’s a brief update.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b2815a7e6e68591970fb6bd3c0c77183" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/WK1I_atG72U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://moveparallel.com/">Parallel Systems</a> ]</p><div class="horizontal-rule"></div><blockquote><em>After 1,000 Martian days of exploration, NASA’s Perseverance rover is studying rocks that show several eras in the history of a river delta billions of years old. Scientists are investigating this region of Mars, known as Jezero Crater, to see if they can find evidence of ancient life recorded in the rocks. Perseverance project scientist Ken Farley provides a guided tour of a richly detailed panorama of the rover’s location in November 2023, taken by the Mastcam-Z instrument.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7394a9056de0a12291edff61e4bd9652" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/CIaHiGbFybQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://mars.nasa.gov/mars2020/">NASA</a> ]</p><div class="horizontal-rule"></div><p>It’s been many, many years since we’ve seen a new steampunk robot from I-Wei Huang, but it was worth the wait!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="61cfecbe78fda1c9c46b9990edc08271" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/yPqfccZK2_M?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="http://www.crabfu.com/">CrabFu</a> ]</p><div class="horizontal-rule"></div><p>Ok apparently this is a loop of Digit standing in front of a fireplace for 10 hours, rather than a very impressive demonstration of battery life.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1975658635d3e19c47df805154bdaa1e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/nyWBIBhxgas?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://agilityrobotics.com/">Agility</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 29 Dec 2023 17:04:33 
+0000</pubDate><guid>https://spectrum.ieee.org/video-friday-more-happy-holidays</guid><category>Boston dynamics</category><category>Humanoid robots</category><category>Icra</category><category>Nasa</category><category>Quadruped robots</category><category>Robotic arm</category><category>Robotics</category><category>Video friday</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/image.png?id=50953498&amp;width=980"></media:content></item><item><title>Top Robotics Stories of 2023</title><link>https://spectrum.ieee.org/top-robotics-stories-2023</link><description><![CDATA[
  621. <img src="https://spectrum.ieee.org/media-library/image.jpg?id=50903802&width=1245&height=700&coordinates=0%2C82%2C0%2C82"/><br/><br/><p>2023 was the best year ever for robotics. I say this <em>every</em> year, but every year it’s true, because the robotics field seems to be always poised on the edge of changing absolutely everything. Is 2024 going to be even better? Will it be the year where humanoids, or AI, or something else makes our lives amazing? Maybe! Who knows! But either way, it’ll be exciting, and we’ll be here as it happens.</p><p>As we look forward to 2024, here’s a look back at some of our most popular stories of 2023. I hope you enjoyed reading them as much as I enjoyed writing them!</p><hr/><h4><a href="https://spectrum.ieee.org/south-pole-roombas" target="_self">Roombas at the End of the World</a></h4><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-35 rm-float-left" data-rm-resized-container="35%" style="float: left;">
  622. <img alt="" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="36729c2fc56aeece4d61a3f4a77dc9a2" data-rm-shortcode-name="rebelmouse-image" id="bf83e" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=50903788&width=980" style="max-width: 100%"/>
  623. </p><p>My favorite story to report and write in 2023 was this tale of the bizarre existence of the Roombas that live and work at the Amundsen–Scott South Pole Station. A single picture that I spotted while casually browsing through <a href="https://brr.fyi/" rel="noopener noreferrer" target="_blank">the blog of a South Pole infrastructure engineer</a> took me down a crazy rabbit hole of Antarctic hijinks, where a small number of humans relied on some robot vacuums to help keep themselves sane in the most isolated place on Earth.</p><div class="horizontal-rule"><br/></div><h4><a href="https://spectrum.ieee.org/stretch-assistive-robot" target="_self">This Robot Could Be the Key to Empowering People With Disabilities</a></h4><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-float-left rm-resized-container-35" data-rm-resized-container="35%" style="float: left;">
  624. <img alt="" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="5497a41baec3135ee2be77937871e9cb" data-rm-shortcode-name="rebelmouse-image" id="bb9ef" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=50903808&width=980" style="max-width: 100%"/>
  625. </p><p>This story about Henry and Jane Evans, Willow Garage, and Hello Robot beautifully tied together something like a decade and a half of my history as a robotics journalist. I got to know the folks at Willow early on in my career, and saw the PR2 doing some amazing work, including with Henry and Jane. But the PR2 was never going to be a practical robot for anyone, and it took the talent and passion of Hello Robot to design the hardware and software to make PR2’s promises into something real-world useful. </p><div class="horizontal-rule"><br/></div><h4><a href="https://spectrum.ieee.org/mars-helicopter-ingenuity-50" target="_self">What Flight 50 Means for the Ingenuity Mars Helicopter</a><br/></h4><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-float-left rm-resized-container-35" data-rm-resized-container="35%" style="float: left;">
  626. <img alt="" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="76ef378793e659693c9c752986325384" data-rm-shortcode-name="rebelmouse-image" id="4c3b3" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=50903813&width=980" style="max-width: 100%"/>
  627. </p><p>The <a href="https://mars.nasa.gov/technology/helicopter/" target="_blank">Ingenuity Mars Helicopter</a> is currently looking forward to its 70th flight, which is astonishing for a technology demo that was only really expected to fly five times. Arguably, this little helicopter is one of the most extreme autonomous systems that humans have ever built. I’ve written a bunch about Ginny over the last few years and talked to several different members of her team. But Flight 50 was a special milestone, and in this interview, Ingenuity team lead Teddy Tzanetos talks about why.</p><div class="horizontal-rule"><br/></div><h4><a href="https://spectrum.ieee.org/falling-robots" target="_self">It’s Totally Fine for Humanoid Robots to Fall Down</a></h4><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-float-left rm-resized-container-35" data-rm-resized-container="35%" style="float: left;">
  628. <img alt="" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="791ca65887d77dc5fdf7374c4b0625a1" data-rm-shortcode-name="rebelmouse-image" id="8623b" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=50903815&width=980" style="max-width: 100%"/>
  629. </p><p>We’re going to be seeing a lot more robots walking around over the next year, and that also means we’re going to be seeing a lot more robots <em>failing</em> to walk around in one way or another. Videos of robots falling tend to go crazy on social media, but most of the people who see them won’t have any idea about the underlying context. In this article, two of the companies with the most experience building humanoids (<a data-linked-post="2664552687" href="https://spectrum.ieee.org/boston-dynamics-dancing-robots" target="_blank">Boston Dynamics</a> and <a data-linked-post="2659620206" href="https://spectrum.ieee.org/agility-robotics-digit" target="_blank">Agility Robotics</a>) explain why robots falling down is actually not a big deal at all.</p><div class="horizontal-rule"><br/></div><h4><a href="https://spectrum.ieee.org/warehouse-robots" target="_self">Watch This Giant Chopstick Robot Handle Boxes With Ease</a><br/></h4><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-float-left rm-resized-container-35" data-rm-resized-container="35%" style="float: left;">
  630. <img alt="" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="38be5b1d74fedc6e6d4426b222c3e769" data-rm-shortcode-name="rebelmouse-image" id="df99c" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=50903816&width=980" style="max-width: 100%"/>
  631. </p><p>It’s not often that a robot is able to combine a truly novel design with an immediately practical commercial application, but Dextrous Robotics was able to do it with their giant chopstick box manipulator, and our readers certainly appreciated it. Boxes are a compelling near-term application for robots, but the go-to manipulation technique that we see over and over again is suction. Dextrous’ approach is totally unique, and it seems like a gimmick—until you see it in action.</p><div class="horizontal-rule"><br/></div><h4><a href="https://spectrum.ieee.org/ai-drone-racing" target="_self">Superhuman Speed: How Autonomous Drones Beat the Best Human Racers</a></h4><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-float-left rm-resized-container-35" data-rm-resized-container="35%" style="float: left;">
  632. <img alt="" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="d0a4edac0d2ac1ea09894952f407bb87" data-rm-shortcode-name="rebelmouse-image" id="8f22f" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=50903825&width=980" style="max-width: 100%"/>
  633. </p><p>Humans can do some pretty incredible things, and watching a human demonstrating some skill that they are the actual best in the world at is fascinating—especially if they’re at risk of losing that top spot to a robot. This race between world champion drone racers and autonomous drones from the University of Zurich took place in 2022, but I had to wait for the underlying research to be published before I was allowed to tell the whole story.</p><div class="horizontal-rule"><br/></div><h4><a href="https://spectrum.ieee.org/disney-robot" target="_self">How Disney Packed Big Emotion Into a Little Robot</a></h4><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-float-left rm-resized-container-35" data-rm-resized-container="35%" style="float: left;">
  634. <img alt="" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="219b5d6d156ebd8169b50529a34cf42f" data-rm-shortcode-name="rebelmouse-image" id="91ff2" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=50903827&width=980" style="max-width: 100%"/>
  635. </p><p>It’s refreshing to write about Disney’s robots, because somewhat uniquely, they’re designed for the primary purpose of bringing humans joy. Disney goes about this methodically, though, and we’re always excited to be able to share the research underlying everything that they do. These are some of my favorite stories to tell, where there’s a super cool robot that just gets cooler when the people behind it explain how it does what it does.</p><div class="horizontal-rule"><br/></div><h4><a href="https://spectrum.ieee.org/amazon-warehouse-robots-2659064182" target="_self">Stowing Is a “Beautiful Problem” That Amazon Is Solving With Robots</a></h4><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-float-left rm-resized-container-35" data-rm-resized-container="35%" style="float: left;">
  636. <img alt="" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="81030cf3a029e4f729c02c5a8c48166f" data-rm-shortcode-name="rebelmouse-image" id="9182b" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=50903828&width=980" style="max-width: 100%"/>
  637. </p><p>Robotics is full of problems that are hard, and one of those problems is stowing—the process of packing items into bins in a warehouse. Stowing is the opposite of picking, which is something that warehouse robots are getting pretty good at, but stowing is also much more difficult. “For me, it’s hard, but it’s not too hard,” Amazon senior manager of applied sciences Aaron Parness told us for this article. “It’s on the cutting edge of what’s feasible for robots, and it’s crazy fun to work on.”</p><div class="horizontal-rule"><br/></div><h4><a href="https://spectrum.ieee.org/xprize-robot-avatar" target="_self">Your Robotic Avatar Is Almost Ready</a></h4><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-float-left rm-resized-container-35" data-rm-resized-container="35%" style="float: left;">
  638. <img alt="" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="98fdc2dbfff75048e7653ec4b08eb238" data-rm-shortcode-name="rebelmouse-image" id="3700f" loading="lazy" src="https://spectrum.ieee.org/media-library/image.png?id=50903830&width=980" style="max-width: 100%"/>
  639. </p><p>As much as we love robots, humans are still much, much better at a lot of stuff. One thing that humans are particularly bad at, though, is being physically present in far away places. The Avatar XPrize competition combined human brains with robot embodiments, and the results were honestly much better than expected. With the right hardware and the right interface, the competition showed that humans and robots can make a fantastic team.</p><div class="horizontal-rule"><br/></div><h4><a href="https://spectrum.ieee.org/tag/humanoid-robots" target="_blank">All of the Humanoid Robots</a><br/></h4><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-float-left rm-resized-container-35" data-rm-resized-container="35%" style="float: left;">
  640. <img alt="" class="rm-shortcode rm-resized-image" data-rm-shortcode-id="330f96e861cd4314ee53602f29ab69e1" data-rm-shortcode-name="rebelmouse-image" id="35f34" loading="lazy" src="https://spectrum.ieee.org/media-library/image.jpg?id=50903807&width=980" style="max-width: 100%"/>
  641. </p><p>And finally, our top robotics coverage area for 2023 was humanoid robots. Or, more specifically, humanoid robots that are (supposedly) poised to enter the labor force. The last time we had this much humanoid robot coverage was probably in 2015 surrounding the DARPA Robotics Challenge Finals, and it’s not like there’s been a gradual increase or anything—humanoids just went absolutely bonkers in 2023, with something like a dozen companies developing human-scale bipedal robots with near-term commercial aspirations.<br/></p><div class="horizontal-rule"></div>]]></description><pubDate>Thu, 28 Dec 2023 14:00:04 +0000</pubDate><guid>https://spectrum.ieee.org/top-robotics-stories-2023</guid><category>Robotics</category><category>Roomba</category><category>Pr2</category><category>Ingenuity</category><category>Mars rovers</category><category>Humanoid robots</category><category>Drones</category><category>Avatars</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/image.jpg?id=50903802&amp;width=980"></media:content></item><item><title>Video Friday: Happy Holidays!</title><link>https://spectrum.ieee.org/video-friday-happy-holidays-2023</link><description><![CDATA[
  642. <img src="https://spectrum.ieee.org/media-library/a-yellow-legged-robot-wearing-antlers-is-passed-out-on-a-couch-in-an-office-next-to-santa-claus.png?id=50903752&width=1245&height=700&coordinates=0%2C0%2C0%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 02 February 2024, ZURICH, SWITZERLAND</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote><em>“Sport ist Mord,” as Germans would say. Santa was very ambitious to get fit for Christmas. Unfortunately, he had a minor accident while hiking on Karlsruhe’s Mount Klotz, so Christmas might be cancelled this year. Will our team of robotic reindeer, medics, and physiotherapists find a solution for Santa? We hope all will get their Christmas presents on time! The FZI wishes you a Merry Christmas and a Happy New Year! </em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a49d66f0358bb42b109cabf69b029f01" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/f_g8Dy2f86s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.fzi.de/en/experience/house-of-living-labs/fzi-living-lab-service-robotics/">FZI</a> ]</p><p>Thanks, Arne!</p><div class="horizontal-rule"></div><blockquote><em>This holiday season, Santa’s on a mission! Join the festive fun as our beloved Santa teams up with his trusty robot sidekick to spread cheer and put a stop to any researchers who might be up to no good with innocent robots from the Robotic Systems Lab. Let’s find out if there are still some kind-hearted researchers out there!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="50b6c9d4f13cf76800f84601cdaebd61" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/1pyC3k0U9z4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://rsl.ethz.ch/">RSL</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Why do the baubles disappear from the Christmas tree in the University? And what role do our Naos play in this? Let yourselves be surprised and get into the spirit of a Merry Christmas! 
Humanoids Bonn wishes everyone a beautiful Christmastime!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b40b515f23927e2814b4986c9de98365" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/nR4Gzckasmk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.hrl.uni-bonn.de/">Humanoids Bonn</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Happy Holidays from the PAL Robotics team. The beginning of 2024 marks the 20th anniversary of our company, and we are excited for all the things to come.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7892c6cbdf6f4f0697302c92f0cf5de3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/j7oY3CaTodc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://pal-robotics.com/">PAL Robotics</a> ]</p><p>Thanks, Rugile!</p><div class="horizontal-rule"></div><blockquote><em>As we bid farewell to this year and welcome the prospects of 2024, United Robtics Group is delighted to share our season’s greetings with you through this special video. In this time of festivity and hope, we extend our warmest wishes for a joyful holiday season and a prosperous New Year to everyone around the world.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e1da3e5d2f8618c3377df35ab5afa15f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/l_STdU9w_LU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://unitedroboticsgroup.us/">United Robotics Group</a> ]</p><p>Thanks, Nicolas!</p><div class="horizontal-rule"></div><blockquote><em>Even devices from the laboratories of electrical engineers, roboticists and computer scientists can be turned into musical instruments. 
In their Christmas video, researchers from CTU’s Faculty of Electrical Engineering demonstrate this in an unconventional experiment by “tuning” their technology to the Christmas tune in the English carol We Wish You a Merry Christmas.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="dc97b87adcacef512c2fe0552d5ed615" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/QGhB6Kg9r9g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://fel.cvut.cz/en/what-s-on/news/32935-christmas-video-of-the-faculty-of-electrical-engineering-ctu-for-2023-is-based-on-the-equation-people-technology-music">CTU</a> ]</p><p>Thanks, Jiri!</p><div class="horizontal-rule"></div><blockquote><em>Season’s greetings from euRobotics!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="10fc9a90e9febe25a3f7f853b5998119" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/us9yHy8KNJQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://eu-robotics.net/">euRobotics</a> ]</p><div class="horizontal-rule"></div><blockquote><em>In this video, we give you an exclusive behind-the-scenes look at how our KUKA Austria team works hard to bring you the ultimate mulled cider experience. From conception to implementation, the LBR iisy Cobot, which combines the precision of a robot with the passion of a human team, provides support. A process that not only delights the taste buds, but also redefines the boundaries of robotics.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2d9302af338b8ad628181229057e6a01" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/bdpBjJlGDog?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.kuka.com/en-us">Kuka</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Santa Claus is getting ready to come to town, with a little bit of help from his friends at Flexiv! 
He made a list, checked it twice, and fingers crossed, you’ve been nice!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f417f42f426a293b57be3f26a6536269" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Es0icddraGk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.flexiv.com/en/">Flexiv</a> ]</p><div class="horizontal-rule"></div><p>Happy Holidays from Yaskawa!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b4ea1834d738cba9a08a35b31f10c98c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7SHV8-oYiBM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.motoman.com/en-us">Yaskawa</a> ]</p><div class="horizontal-rule"></div><blockquote><em>We have created an AI robot named CyberRunner whose task is to learn how to play the popular and widely accessible labyrinth marble game. The labyrinth is a game of physical skill whose goal is to steer a marble from a given start point to the end point. In doing so, the player must prevent the ball from falling into any of the holes that are present on the labyrinth board. CyberRunner applies recent advances in model-based reinforcement learning to the physical world and exploits its ability to make informed decisions about potentially successful behaviors by planning real-world decisions and actions into the future. The learning on the real-world labyrinth is conducted in 6.06 hours, comprising 1.2 million time steps at a control rate of 55 samples per second. The AI robot outperforms the previously fastest recorded time, achieved by an extremely skilled human player, by over 6 percent.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="02ede607c2dfb4301635a8071d2f145b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zQMKfuWZRdA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>In case you’re wondering (like I was), shortcutting across the maze to skip parts of the track is, in fact, cheating. The system (like most humans) did discover shortcuts and had to be explicitly directed not to take them.</p><p>[ <a href="https://www.cyberrunner.ai/">CyberRunner</a> ]</p><p>Thanks, Markus!</p><div class="horizontal-rule"></div><p>Grain Weevil, one of the more interesting single-purpose robots I’ve ever seen, had a busy 2023.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="efbbe80c1206b376356723e0087b2690" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/4OUSGJbQXmg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.grainweevil.com/">Grain Weevil</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Working in a greenhouse is both strenuous and time-consuming. 
The picking robot from ETH spin-off Floating Robotics takes on particularly repetitive tasks, thereby alleviating the strain on human pickers. It is currently undergoing testing at Beerstecher AG in Hinwil.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9441f645eda4e44ffb1ccafffca86413" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/M_W38gdWgiI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ethz.ch/en/news-and-events/eth-news/news/2023/12/a-picking-robot-for-the-greenhouse.html">ETHZ</a> ]</p><div class="horizontal-rule"></div><blockquote><em>The compilation showcases final project demos from the master course ‘Introduction to Soft Robotics,’ offered by SDU Soft Robotics during Autumn 2023 at the University of Southern Denmark, Odense Campus. This year’s project theme focused on soft locomotion. Each team was tasked with designing a soft robot capable of navigating a path comprising flat and inclined surfaces, obstacles, and rough terrains.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2c4cc0322132079decd75db749df9b4e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xO3deSSuH9s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://odin.sdu.dk/sitecore/index.php?a=searchfagbesk&bbcourseid=T550068101-1-E23&lang=en">SDU</a> ]</p><div class="horizontal-rule"></div><blockquote><em>In 2023, we were honored to have numerous clients place their trust in us, deploying our quadruped robot in a variety of settings. We take pride in our commitment to alleviate our clients’ challenges, a mission that has been at the heart of DEEPRobotics since the beginning. We’ve selected a few symbolic cases to share with you, and we hope you find them as fascinating as we do. Enjoy! </em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c34bdcdf923bf9413d39e67648e97724" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/1lHOVpJuSt0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://deeprobotics.cn/en">Deep Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Check out precision layout through the “eyes” of the Dusty FieldPrinter. 
More than just lines on the ground, the FieldPrinter seamlessly syncs the digital model to the jobsite floor with unparalleled accuracy and detail.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c3f9cc5cfcb17f0423ae4a2dd73769a5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xRclqXDv8hM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dustyrobotics.com/">Dusty Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Why do data centers of the future need to be state-of-the-art, and why do we need to apply so many technologies to them? There are engineers tackling this very question with robotics, autonomous driving, and AI technologies. In this video, they explain the reason behind developing the robots and autonomous shuttles of the data center GAK Sejong.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="07b40b9c28dccacd11f53bf58d0b13a9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/E1k661x1o_g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.naverlabs.com/en/storyDetail/280">Naver Labs</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Failure is just a necessary stepping stone towards success. Follow Team RoMeLa’s journey with our humanoid robot ARTEMIS! Humanoid Locomotion Competition, IEEE Humanoids Conference 2023.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f158be84973e0289bc369a9c4375fa5d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Aj79CMou3to?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.romela.org/">RoMeLa</a> ]</p><div class="horizontal-rule"></div><blockquote><em>A fascinating history of the KR FAMULUS, the world’s first industrial robot with an electric motor, which went into service in Augsburg half a century ago! Starting with the vision of creating robots that would make work and life easier for people, the foundations were laid for today’s robot revolution. 
Richard Schwarz, one of the KUKA pioneers, gives a first-hand account of how, driven by passion and the German engineering spirit, they developed the KR Famulus, shaping the technology shift from cumbersome hydraulic robots to innovative, clean electric motor robots.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5bb4b43ed468493b42a7180d0e58aa44" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/6zo8uXhGQag?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.kuka.com/homeofrobotik">Kuka</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 22 Dec 2023 17:38:10 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-happy-holidays-2023</guid><category>Video friday</category><category>Robotics</category><category>Quadruped robots</category><category>Robotic arm</category><category>Kuka robotics</category><category>Eth zurich</category><category>Romela</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-yellow-legged-robot-wearing-antlers-is-passed-out-on-a-couch-in-an-office-next-to-santa-claus.png?id=50903752&amp;width=980"></media:content></item><item><title>35 Years Ago, Researchers Used Brain Waves to Control a Robot</title><link>https://spectrum.ieee.org/brain-waves-control-a-robot</link><description><![CDATA[
  643. <img src="https://spectrum.ieee.org/media-library/an-illustration-of-a-robot-and-a-representation-of-a-brain-wave.jpg?id=50838542&width=1245&height=700&coordinates=0%2C54%2C0%2C55"/><br/><br/><p>Using the brain to directly control an object was long the stuff of science fiction, and in 1988 the vision became a reality.</p><p>IEEE Life Senior Member <a href="https://www.linkedin.com/in/stevo-bozinovski-08a48155" rel="noopener noreferrer" target="_blank">Stevo Bozinovski</a> and Members <a href="https://www.linkedin.com/in/mihailsestakov/?originalSubdomain=au" rel="noopener noreferrer" target="_blank">Mihail Sestakov</a> and Dr. Liljana Bozinovska used a student volunteer’s electroencephalogram (EEG) brain signals to move a robot along a closed-circuit track. Bozinovski was an electrical engineering and computer science professor at <a href="https://www.ukim.edu.mk/en_index.php" rel="noopener noreferrer" target="_blank">Saints Cyril and Methodius University</a>, in Skopje, North Macedonia (formally the Republic of Macedonia). Sestakov was a doctoral student at the school while Bozinovska, a physician, taught in the university’s medical school. Their achievement has paved the way for EEG-controlled drones, wheelchairs, and exoskeletons.</p><p>IEEE commemorated their work with an IEEE Milestone during a ceremony at the university on 10 October.</p><p>“The accomplishment is not only meaningful locally,” <a href="https://mk.linkedin.com/in/vladimir-atanasovski-02b0527" rel="noopener noreferrer" target="_blank">Vladimir Atanasovski</a> said at the dedication ceremony. “It exhibits benefits for the entire humanity.” Atanasovski is dean of the university’s electrical engineering and information technologies school. </p><p>“It was at this very school, 35 years ago, where a relationship between two previously distant areas [robotics and EEG signals] was formed,” he added. “This remarkable work showed that science fiction can become a reality.</p><p>“Controlling a robot using human brain signals for the first time advanced both electrical and computer engineering and science, led to worldwide research on brain-computer interfaces, and opened an explicit communication channel between robots and humans.”</p><h2>Using engineering to demonstrate psychokinesis</h2><p>Bozinovski, Sestakov, and Bozinovska built a system to send commands to a robot based on EEG signal processing. The method is noninvasive; all you have to do is place electrodes on a volunteer’s scalp.</p><p>The three researchers used an Elehobby (now called <a href="https://www.elekit.co.jp/en/about/" rel="noopener noreferrer" target="_blank">Elekit</a>) <a href="https://www.worthpoint.com/worthopedia/brand-vintage-movit-913-line-tracer-534205157" rel="noopener noreferrer" target="_blank">Movit Line Tracer II</a> robot they purchased at the Akihabara market in Tokyo. The robot had two plastic discs that sat on top of each other and held the electronic components between them. Its two wheels were controlled with a start/stop mechanical switch. 
</p><p class="pull-quote">“Engineers are the driving force in every country, contributing to the welfare and progress of societies.”<strong>—Stevo Pendarovski</strong></p><p><span></span>The robot, powered by batteries, drove on a track drawn on a flat surface, according to the <a href="https://ieeemilestones.ethw.org/Milestone-Proposal:First_Robotic_Control_from_Human_Brain_Signals,_1988" target="_blank">Milestone proposal entry</a> on the <a href="https://ethw.org/Main_Page" target="_blank">Engineering Technology and History Wiki</a>.</p><p>But the researchers still didn’t know how they were going to translate the brain signals into commands. Bozinovska suggested using the EEG’s prominent range frequency of 8 to 13 hertz—known as the alpha rhythm. It’s a pattern of electrical activity in the part of the brain that processes visual information. The frequency increases when a person is relaxed and not processing visual information. </p><p>Bozinovska’s theory was that the alpha rhythm would command the Line Tracer. People attempting to control the robot would achieve relaxation by closing their eyes. To stop the robot from moving, they would open their eyes.</p><p>Bozinovski, Sestakov, and Bozinovska designed an experiment to test her theory. </p><h2>Moving a robot using brain signals</h2><p class="shortcode-media shortcode-media-rebelmouse-image">
  644. <img alt="Black and white photo of table with a race track on it with two people sitting behind it" class="rm-shortcode" data-rm-shortcode-id="8eda2191515ab85602eebd4b505c0a6a" data-rm-shortcode-name="rebelmouse-image" id="9780c" loading="lazy" src="https://spectrum.ieee.org/media-library/black-and-white-photo-of-table-with-a-race-track-on-it-with-two-people-sitting-behind-it.jpg?id=50838663&width=980"/>
  645. <small class="image-media media-caption" placeholder="Add Photo Caption...">Two floors were built to conduct the experiment. The first floor had a table on which sat a rectangular closed-circuit track with markings where the robot would be commanded to stop.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Saints Cyril and Mehtodius University/IEEE</small></p><p>Bozinovski and Sestakov built a two-floor “robot arena,” as Bozinovski described it in the Milestone entry, in the school’s Laboratory of Intelligent Machines and Bioinformation Systems to conduct the experiment. On the first floor there was a table on which a closed-circuit track was drawn. Optical sensors sat on the track, marking where the robot would be commanded to stop. Next to the table were an <a href="https://spectrum.ieee.org/how-the-ibm-pc-won-then-lost-the-personal-computer-market" target="_self">IBM XT personal computer</a>, analog-to-digital and digital-to-analog converters, and a differential biomedical amplifier, which collected the student volunteer’s brain signal.</p><p>The second floor housed power amplifiers. Wires from the amplifiers attached to the robot hung from the top floor, well out of the way of the robot’s movement. The student volunteer sat on the bottom floor next to the table.</p><p>Electrodes were placed at the volunteer’s midline parietal cortex—the part of the brain that completes visual scene processing. More were placed behind the right ear and on the forehead.</p><p>To move the robot, the student relaxed, with eyes closed. The differential biomedical amplifier recorded the EEG signals and inputted them into the computer with the help of the A/D converter set at a 300 Hz sampling rate. Machine learning and recognition software created by Bozinovski and Sestakov translated the signals into a <em>go</em> command. The computer then sent a 5-volt logic pulse through the D/A converter—which the transistor amplifier magnified and sent to the robot. The volunteer was able to stop the Line Tracer by opening their eyes.</p><p>Bozinovski, Sestakov, and Bozinovska presented their findings at the 1988<a href="https://www.embs.org/event/ieee-embs-international-conference-on-data-science-and-engineering-in-healthcare-medicine-biology/" target="_blank"> IEEE International Engineering in Medicine and Biology Conference</a>.</p><h2>North Macedonia’s president on the importance of engineers</h2><p>“Engineers are the driving force in every country, contributing to the welfare and progress of societies,” <a href="https://pretsedatel.mk/en/biography-of-the-president/" rel="noopener noreferrer" target="_blank">Stevo Pendarovski</a>, president of North Macedonia, said at the dedication ceremony.</p><p>“Let this genuinely exceptional event of great importance for Macedonian engineering in particular, but also for Macedonian science and society as a whole, be an inspiration for all students, professors, and future engineers,” Pendarovski said, “to create and contribute to building a modern and technologically advanced world.”</p><p>IEEE President Saifur Rahman also attended the ceremony.</p><p>A plaque recognizing the technology is displayed outside the Saints Cyril and Methodius University electrical engineering faculty building. The Laboratory of Intelligent Machines and Bioinformation Systems, where the Milestone was achieved, is in the building. 
The plaque reads:</p><p><em>In 1988, in the Laboratory of Intelligent Machines and Bioinformation Systems, human brain signals controlled the movement of a physical object (a robot) for the first time worldwide. This linked electroencephalogram (EEG) signals collected from a brain with robotics research, opening a new channel for communication between humans and machines. EEG-controlled devices (wheelchairs, exoskeletons, etc.) have benefited numerous users and expanded technology’s role in modern society.</em></p><p>Administered by the IEEE History Center and<a href="https://www.ieeefoundation.org/donate_history" rel="noopener noreferrer" target="_blank"> supported by donors</a>, the Milestone program recognizes outstanding technical developments around the world.</p>The<a href="https://ethw.org/IEEE_North_Macedonia_Section_History" rel="noopener noreferrer" target="_blank"> IEEE North Macedonia Section</a> (formerly the IEEE Republic of Macedonia Section) sponsored the nomination.]]></description><pubDate>Sat, 16 Dec 2023 19:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/brain-waves-control-a-robot</guid><category>Ieee history</category><category>Ieee milestone</category><category>Robotics</category><category>Type:ti</category><dc:creator>Joanna Goodrich</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/an-illustration-of-a-robot-and-a-representation-of-a-brain-wave.jpg?id=50838542&amp;width=980"></media:content></item><item><title>Video Friday: The Simplest Walking Robot</title><link>https://spectrum.ieee.org/walking-robot-video-friday</link><description><![CDATA[
  646. <img src="https://spectrum.ieee.org/media-library/an-animated-gif-showing-a-four-inch-tall-robot-that-consists-only-of-two-legs-walking-jauntily-across-a-conference-floor.gif?id=50816780&width=1245&height=700&coordinates=0%2C31%2C0%2C32"/><br/><br/><p>
  647. Video Friday is your weekly selection of awesome robotics videos, collected by your friends at
  648. <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/>
  649. </p><h5><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 02 February 2024, ZURICH</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><p>
  650. Enjoy today’s videos!
  651. </p><div class="horizontal-rule">
  652. </div><div style="page-break-after: always">
  653. <span style="display:none"> </span>
  654. </div><p class="rm-anchors" id="rljmxp9vyty">
  655. We wrote about an earlier version of this absurdly simple walking robot a few years ago. That version had two motors, but this version walks fully controllably with just a single motor! We’re told that the robot’s name is Mugatu, because for a while, it wasn’t an ambiturner.
  656. </p><p class="shortcode-media shortcode-media-youtube">
  657. <span class="rm-shortcode" data-rm-shortcode-id="02aa7328a945729a7cce8e4d10b6430d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/RLjmxp9vyTY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  658. </p><p>
  659. This was just presented at the IEEE Humanoids Conference in Austin. And here’s a second video with more technical detail on how the robot works:
  660. </p><p class="shortcode-media shortcode-media-youtube">
  661. <span class="rm-shortcode" data-rm-shortcode-id="a4ea6a3507093043bb682064368e5e23" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5EwBtk0PADw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  662. </p><p>
  663. [
  664. <a href="https://engineering.cmu.edu/news-events/news/2023/12/11-big-picture-small-robot.html">CMU</a> ]
  665. </p><div class="horizontal-rule">
  666. </div><p>
Happy Holidays from Boston Dynamics!
  668. </p><p class="shortcode-media shortcode-media-youtube">
  669. <span class="rm-shortcode" data-rm-shortcode-id="01d75c7e9b6718dc255a9574beed5ad8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Cdc5iVqjNVs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  670. </p><p>
  671. Side note—has anyone built a robot that can flawlessly gift-wrap arbitrary objects yet?
  672. </p><p>
  673. [
  674. <a href="https://bostondynamics.com/">Boston Dynamics</a> ]
  675. </p><div class="horizontal-rule">
  676. </div><blockquote class="rm-anchors" id="cnkm0aecxya">
  677. <em>Is there a world where Digit can leverage a large language model (LLM) to expand its capabilities and better adapt to our world?  We had the same question.  Our innovation team developed this interactive demo to show how LLMs could make our robots more versatile and faster to deploy. The demo enables people to talk to Digit in natural language and ask it to do tasks, giving a glimpse at the future.</em>
  678. </blockquote><p class="shortcode-media shortcode-media-youtube">
  679. <span class="rm-shortcode" data-rm-shortcode-id="f04d422345885133443bafcae3eca717" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/CnkM0AecxYA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  680. </p><p>
  681. [
  682. <a href="https://agilityrobotics.com/">Agility Robotics</a> ]
  683. </p><p>
  684. Thanks, Tim!
  685. </p><div class="horizontal-rule">
  686. </div><blockquote>
<em>In 2028, ESA will launch its most ambitious exploration mission to search for past and present signs of life on Mars. ESA’s Rosalind Franklin rover has unique scientific potential to search for evidence of past life on Mars thanks to its drill and scientific instruments. It will be the first rover to reach a depth of up to two metres below the surface, acquiring samples that have been protected from surface radiation and extreme temperatures. The drill will retrieve soils from ancient parts of Mars and analyse them in situ with its onboard laboratory.</em>
  688. </blockquote><p class="shortcode-media shortcode-media-youtube">
  689. <span class="rm-shortcode" data-rm-shortcode-id="8229cfd4fb964cdf6fd7e78108e043de" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/aj7xNrbfCKw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  690. </p><p>
  691. [
  692. <a href="https://www.esa.int/Science_Exploration/Human_and_Robotic_Exploration/Exploration/ExoMars">ESA</a> ]
  693. </p><div class="horizontal-rule">
  694. </div><blockquote>
  695. <em>With ChatGPT celebrating the anniversary of its launch a year ago, we thought this would be a good time to sit down with roboticist Hod Lipson and ask him what he thinks about all the changes with the rapid evolution of AI, how they’ve enabled the creation of ChatGPT, and what all this may mean for our future.</em>
  696. </blockquote><p class="shortcode-media shortcode-media-youtube">
  697. <span class="rm-shortcode" data-rm-shortcode-id="1a13346acf1bb9b5bc48231b978df8e3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/3rIlz9UREm8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  698. </p><p>
  699. [
  700. <a href="https://www.engineering.columbia.edu/faculty/hod-lipson">Columbia Engineering</a> ]
  701. </p><div class="horizontal-rule">
  702. </div><blockquote>
  703. <em>We propose a technique that simultaneously solves for optimal design and control parameters for a robotic character whose design is parameterized with configurable joints. At the technical core of our technique is an efficient solution strategy that uses dynamic programming to solve for optimal state, control, and design parameters, together with a strategy to remove redundant constraints that commonly exist in general robot assemblies with kinematic loops.</em>
  704. </blockquote><p class="shortcode-media shortcode-media-youtube">
  705. <span class="rm-shortcode" data-rm-shortcode-id="0991b40f85b2d35d366fe321f31997e0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/_7cd7XyhXCU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  706. </p><p>
  707. [
  708. <a href="https://la.disneyresearch.com/publication/optimal-robotic-kinematics/">Disney Research</a> ]
  709. </p><div class="horizontal-rule">
  710. </div><p>
  711. And now, this.
  712. </p><p class="shortcode-media shortcode-media-youtube">
  713. <span class="rm-shortcode" data-rm-shortcode-id="aeacd6675183a723aacec81662592cf0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/rRANR1kZBNo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  714. </p><p>
  715. [
  716. <a href="https://www.babyclappy.com/">Baby Clappy</a> ] via [ <a href="https://www.youtube.com/@KazumichiMoriyama">Kazumichi Moriyama</a> ]
  717. </p><div class="horizontal-rule">
  718. </div><blockquote>
  719. <em>Humanoid robots that can autonomously operate in diverse environments have the potential to help address labor shortages in factories, assist elderly at homes, and colonize new planets. While classical controllers for humanoid robots have shown impressive results in a number of settings, they are challenging to generalize and adapt to new environments. Here, we present a fully learning-based approach for humanoid locomotion.</em>
  720. </blockquote><p class="shortcode-media shortcode-media-youtube">
  721. <span class="rm-shortcode" data-rm-shortcode-id="2b4b171d457f42d8ebbe744c44283dff" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/eFoBfFhwo18?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  722. </p><p>
  723. [
  724. <a href="https://hybrid-robotics.berkeley.edu/">Hybrid Robotics Lab</a> ]
  725. </p><div class="horizontal-rule">
  726. </div><blockquote>
  727. <em>At the University of Michigan, graduate students in robotics all take ROB 550: Robotic Systems Laboratory. For the Fall 2023 class, the final project asked students to create a robot capable of lifting and stacking small pallets. Students designed and built the lift mechanisms from scratch, with a wide variety of solutions being implemented.</em>
  728. </blockquote><p class="shortcode-media shortcode-media-youtube">
  729. <span class="rm-shortcode" data-rm-shortcode-id="c63a6c7c8bf0ea54786647c56b326b91" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0iuTI2xilDQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  730. </p><p>
  731. [
  732. <a href="https://robotics.umich.edu/">Michigan Robotics</a> ]
  733. </p><div class="horizontal-rule">
  734. </div><blockquote>
  735. <em>In-hand object reorientation is necessary for performing many dexterous manipulation tasks, such as tool use in less structured environments that remain beyond the reach of current robots. We present a general object reorientation controller that uses readings from a single commodity depth camera to dynamically reorient complex and new object shapes by any rotation in real-time, with the median reorientation time being close to seven seconds.</em>
  736. </blockquote><p class="shortcode-media shortcode-media-youtube">
  737. <span class="rm-shortcode" data-rm-shortcode-id="cd10fe193e0a822b03635ed6db550c67" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5HoLBaAhK-s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  738. </p><p>
  739. [
  740. <a href="https://taochenshh.github.io/projects/visual-dexterity">Visual Dexterity</a> ]
  741. </p><div class="horizontal-rule">
  742. </div><p>
If you weren’t at IEEE Humanoids this week, you missed out on meeting me in person, so shame on you. But you can see all the lightning talks from the Can We Build Baymax workshop right here.
  744. </p><p class="shortcode-media shortcode-media-youtube">
  745. <span class="rm-shortcode" data-rm-shortcode-id="8ce6293fad99fd211c72dcfb43d12bbe" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/y5PWEMTbWME?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  746. </p><p>
  747. [
  748. <a href="https://baymax.org/workshop/2023/">CWBB</a> ] via [ <a href="https://publish.illinois.edu/kimlab2020/">KIMLAB</a> ]
  749. </p><div class="horizontal-rule">
  750. </div><blockquote>
  751. <em>The U.S. National Science Foundation’s Graduate Research Fellowship Program (GRFP) has helped ensure the quality, vitality and diversity of the scientific and engineering workforce by recognizing and supporting outstanding graduate students since 1952. Kyle Johnson, a doctoral student at the University of Washington, joins us to talk about his work with robotics, his GRFP experience and how he inspires the next generation.</em>
  752. </blockquote><p class="shortcode-media shortcode-media-youtube">
  753. <span class="rm-shortcode" data-rm-shortcode-id="218ed18274ac7b405d822658a82f5e49" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/iqoU8_Hsr8M?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  754. </p><p>
  755. [
  756. <a href="https://www.youtube.com/@NSFScience">NSF</a> ]
  757. </p><div class="horizontal-rule">
  758. </div>]]></description><pubDate>Fri, 15 Dec 2023 16:30:02 +0000</pubDate><guid>https://spectrum.ieee.org/walking-robot-video-friday</guid><category>Cmu</category><category>Boston dynamics</category><category>Agility robotics</category><category>Esa</category><category>Robotics</category><category>Quadruped robots</category><category>Humanoid robots</category><category>Video friday</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/an-animated-gif-showing-a-four-inch-tall-robot-that-consists-only-of-two-legs-walking-jauntily-across-a-conference-floor.gif?id=50816780&amp;width=980"></media:content></item><item><title>This Robotic Pack Mule Can Carry Your Gear (and You)</title><link>https://spectrum.ieee.org/quadruped-robot-dog</link><description><![CDATA[
  759. <img src="https://spectrum.ieee.org/media-library/an-animated-gif-showing-a-human-riding-on-top-of-a-small-quadrupedal-robot-on-a-grassy-field.gif?id=50789945&width=1245&height=700&coordinates=0%2C14%2C0%2C14"/><br/><br/><p><p>
  760. <em>This article is part of our exclusive <a href="https://spectrum.ieee.org/collections/journal-watch/" rel="noopener noreferrer" target="_self">IEEE Journal Watch series</a> in partnership with IEEE Xplore.</em>
  761. </p></p><p>
  762. The useful niche that quadrupedal robots seem to have found for themselves, at least for the moment, is infrastructure inspection. They’ve had a mild to moderate amount of success monitoring industrial sites, tracking construction progress, and things of that nature. Which is great! But when you look at what humans have historically relied on quadrupeds for, there’s a little bit of situational awareness (in the form of security), but the majority of what these animals have done for us is manual labor.
  763. </p><p>
  764. <a href="https://ieeexplore.ieee.org/document/10246325" rel="noopener noreferrer" target="_blank">In a paper published last month in <em>IEEE Robotics and Automation Letters</em></a>, roboticists from the <a href="https://rsl.ethz.ch/" target="_blank">Robotic Systems Lab at ETH Zurich</a> are aiming to address the fact that “legged robots are still too weak, slow, inefficient, or fragile to take over tasks that involve heavy payloads.” Their new robot that is none of these things is Barry, which can efficiently carry up to 90 kilograms so that you don’t have to.
  765. </p><hr/><p>
  766. If you go back far enough, a bunch of the initial funding for quadrupedal robots that enabled the commercial platforms that are available today was tied into the idea of robotic pack animals. <a href="https://www.darpa.mil/about-us/timeline/legged-squad-support-system" target="_blank">Boston Dynamics’ BigDog and LS3</a> were explicitly designed to haul heavy loads (up to 200 kg) across rough terrain for the U.S. Military. This kind of application may be obvious, but the hardware requirements are challenging. Boston Dynamics’ large quadrupeds were all driven by hydraulics, which depended on the power density of gasoline to function, and ultimately they were too complex and noisy for the military to adopt. The current generation of quadruped robots, like <a data-linked-post="2661156031" href="https://spectrum.ieee.org/video-friday-spot-levels-up" target="_blank">Spot</a> and <a data-linked-post="2666409159" href="https://spectrum.ieee.org/quadruped-robot-wheels" target="_blank">ANYmal</a>, have a payload of between 10 and 15 kg.
  767. </p><p>
Barry manages to carry nearly half the payload of LS3 in a much smaller, more efficient, and quieter form factor. It’s essentially a customized ANYmal, using unique high-efficiency electric actuators rather than hydraulics. The robot itself weighs 48 kg and can handle unmodeled 90 kg payloads, meaning that Barry doesn’t have to know the size, weight, or mass distribution of what it’s carrying. That’s a key capability, as the paper’s first author Giorgio Valsecchi explains: “When we use a wheelbarrow, we don’t have to change any settings on it, regardless of what we load it with—any manual adjustment is a bottleneck in usability. Why should a ‘smart’ robot be any different?” This is what makes Barry’s payload capacity useful in the real world, and it also means that if you want to, you can even ride it.
  769. </p><p class="shortcode-media shortcode-media-youtube">
  770. <span class="rm-shortcode" data-rm-shortcode-id="e400d0df9ebd28d312b38e4c596b0698" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DkhbepqrOf0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  771. <small class="image-media media-caption" placeholder="Add Photo Caption...">Barry: A High-Payload and Agile Quadruped Robot</small>
  772. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">
  773. <a href="https://youtu.be/DkhbepqrOf0" target="_blank">youtu.be</a>
  774. </small>
  775. </p><p>Barry’s heroic payload is enabled by its custom actuators. While the standard approach for developing powered robotic joints involves choosing the smallest motor capable of producing the required peak power, Barry focuses on motor efficiency instead. “It turns out that the ideal solution is to have the biggest possible motor,” Valsecchi says. “It is a bit counterintuitive, but bigger motors are more efficient, they consume less energy when performing the same task. This results in a robot with more payload capabilities and a lower cost of transport.” Barry is actually quite efficient: with a cost of transport of just 0.7, it can operate with a payload for over two hours and travel nearly 10 km.<br/></p><p>
  776. The commercial potential for a robot like Barry is obvious, and Valsecchi is already thinking about several use cases: “carrying raw materials on construction sites to prevent injuries and increase productivity, carrying equipment in search and rescue operations to free up rescuers from excessive loads… The same technology could be used to design a walking wheelchair, and we actually got some requests for this specific use case. Once we started showing the robot with a big box on top, people realized a lot of things could be done.”
  777. </p><p>
  778. At the moment, Barry doesn’t yet have much in the way of perception, so giving the robot the ability to intelligently navigate around obstacles and over complex terrain is one of the things that the researchers will be working on next. They’re also starting to think about potential commercial applications, and it certainly seems like there’s a market for a robot like this—heck, I’d buy one.
  779. </p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-right" data-rm-resized-container="25%" style="float: right;">
  780. <img alt="A photograph of a brown and white stuffed dog in a display case." class="rm-shortcode rm-resized-image" data-rm-shortcode-id="a7f6051f76086e033ad01317fdb6a8b1" data-rm-shortcode-name="rebelmouse-image" id="af16a" loading="lazy" src="https://spectrum.ieee.org/media-library/a-photograph-of-a-brown-and-white-stuffed-dog-in-a-display-case.jpg?id=50789746&width=980" style="max-width: 100%"/>
  781. <small class="image-media media-caption" placeholder="Add Photo Caption...">The preserved 200 year old body of the original Barry.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Photo via Wikipedia by PraktikantinNMBE and reproduced under CC BY-SA 4.0.</small>
  782. </p><p>
  783. Barry, by the way, is named after a legendary St. Bernard who saved the lives of more than 40 people in the Swiss Alps in the early 1800s, including by carrying them to safety on his back. “Being able to ride the robot was one of our ambitions,” Valsecchi tells us. “When we managed to accomplish that I thought we did well enough to tribute the original Barry by using his name, to convey our vision of what robots could become.” Barry the dog died in 1814 (apparently stabbed by someone he was trying to rescue who thought he was a wolf), but his preserved body is on display at the Natural History Museum in Bern.</p><p><em><a href="https://ieeexplore.ieee.org/document/10246325" target="_blank">Barry: A High-Payload and Agile Quadruped Robot</a></em>, by Giorgio Valsecchi, Nikita Rudin, Lennart Nachtigall, Konrad Mayer, Fabian Tischhauser, and Marco Hutter from ETH Zurich, is published in <em>IEEE Robotics and Automation Letters</em>.</p>]]></description><pubDate>Fri, 15 Dec 2023 14:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/quadruped-robot-dog</guid><category>Anymal</category><category>Eth zurich</category><category>Journal watch</category><category>Quadruped robots</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/an-animated-gif-showing-a-human-riding-on-top-of-a-small-quadrupedal-robot-on-a-grassy-field.gif?id=50789945&amp;width=980"></media:content></item><item><title>Video Friday: Floppybot</title><link>https://spectrum.ieee.org/video-friday-floppybot</link><description><![CDATA[
  784. <img src="https://spectrum.ieee.org/media-library/an-animated-gif-showing-a-small-flat-blue-and-black-rectangular-sheet-slowly-and-repeatedly-flopping-forwards.gif?id=50721364&width=1245&height=700&coordinates=75%2C0%2C76%2C0"/><br/><br/><p>Your weekly selection of awesome robot videos</p><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.</p><h5><a href="https://2023.ieee-humanoids.org/">Humanoids 2023</a>: 12–14 December 2023, AUSTIN, TEX.</h5><h5><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 02 February 2024, ZURICH, SWITZERLAND</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p>This magnetically actuated soft robot is perhaps barely a robot by most definitions, but I can’t stop watching it flop around.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ff55ebea79252433a22e3634b356476b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Wgh_HNJ2T0c?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>In this work, Ahmad Rafsanjani, Ahmet F. Demirörs, and co‐workers from SDU (DK) and ETH (CH) introduce kirigami into a soft magnetic sheet to achieve bidirectional crawling under rotating magnetic fields. Experimentally characterized crawling and deformation profiles, combined with numerical simulations, reveal programmable motion through changes in cut shape, magnet orientation, and translational motion. 
This work offers a simple approach toward untethered soft robots.</em></blockquote><p>[ <a href="https://onlinelibrary.wiley.com/doi/10.1002/advs.202301895">Paper</a> ] via [ <a href="https://www.softrobotics.dk/">SDU</a> ]</p><p>Thanks, Ahmad!</p><div class="horizontal-rule"></div><p>Winner of the earliest holiday video is the LARSEN team at Inria!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="16313e12594c7519be4f56776b03bde4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/jqfBa_PIS9s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.inria.fr/en/larsen">Inria</a> ]</p><p>Thanks, Serena!</p><div class="horizontal-rule"></div><p>Even though this is just a rendering, I really appreciate Apptronik being like, “we’re into the humanoid thing, but sometimes you just don’t need legs.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3c8be8b4915634c8fd4ab61ffee41966" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Vd7I40iBQkI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://apptronik.com/">Apptronik</a> ]</p><div class="horizontal-rule"></div><p>We’re not allowed to discuss unmentionables here at <em>IEEE Spectrum</em>, so I can only tell you that Digit has started working in a warehouse handling, uh, things.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bd42e555f26250f9dea2a75e22b049e6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/NgYo-Wd0E_U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://agilityrobotics.com/news/2023/gxo-conducting-industry-leading-pilot-of-human-centric-robot">Agility</a> ]</p><div class="horizontal-rule"></div><p>Unitree’s sub-$90k H1 Humanoid suffering some abuse in a non-PR video.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="03fd9bc114b497bf06c00691e27a4416" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/tw5PzIlAg3E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://pc.watch.impress.co.jp/docs/news/1551834.html">Impress</a> ]</p><div class="horizontal-rule"></div><p>Unlike me, ANYmal can perform 24/7 in all weather.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e58880caab094004e500c1fcfebe5c06" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/1gu8tllMc2o?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.anybotics.com/">ANYbotics</a> ]</p><div class="horizontal-rule"></div><p>Most of the world will need to turn on 
subtitles for this, but it’s cool to see how industrial robots can be used to make art.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8648fe1bed17fda1af7a0a2c3a184bfd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/j2O5dijUUbU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.kuka.com/en-de/products/process-technologies/milling">Kuka</a> ]</p><div class="horizontal-rule"></div><p>I was only 12 when this episode of <em>Scientific American Frontiers</em> aired, but I totally remember Alan Alda meeting Flakey!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="19892997204c86f3963f505f87f672ad" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/h7eDWHLHIno?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>And here’s the segment, it’s pretty great.</p><p class="shortcode-media shortcode-media-youtube">
  785. <span class="rm-shortcode" data-rm-shortcode-id="14aa678a45759ded5ecb5527d62a9366" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/RU5X-AjG5_I?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  786. </p><p>[ <a href="https://www.sri.com/ics/cyber-formal-methods/karen-myers-when-i-introduced-flakey-the-robot-to-alan-alda/">SRI</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Agility CEO Damion Shelton talks about the hierarchy of robot control and draws similarities to the process of riding a horse.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1bd80168e1ad9c3712265b975386b325" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/guvzug2tuSk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://agilityrobotics.com/news/2023/gxo-conducting-industry-leading-pilot-of-human-centric-robot">Agility</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Seeking to instill students with real-life workforce skills through hands-on learning, teachers at Central High School in Louisville, Ky., incorporated Spot into their curriculum. For students at CHS, a magnet school for Jefferson County Public Schools district, getting experience with an industrial robot has sparked a passion for engineering and robotics, kickstarted advancement into university engineering programs, and built lifelong career skills. See how students learn to operate Spot, program new behaviors for the robot, and inspire their peers with the school’s “emotional support robot” and unofficial mascot.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e31c961a88b6a52d9a9428222d2dc6b0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2sZphMFJ-x8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/case-studies/spot-inspires-learners-central-high-school/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 08 Dec 2023 22:23:17 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-floppybot</guid><category>Video friday</category><category>Apptronik</category><category>Agility robotics</category><category>Unitree</category><category>Boston dynamics</category><category>Robotics</category><category>Humanoid robots</category><category>Quadruped robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/an-animated-gif-showing-a-small-flat-blue-and-black-rectangular-sheet-slowly-and-repeatedly-flopping-forwards.gif?id=50721364&amp;width=980"></media:content></item></channel></rss>

If you would like to create a banner that links to this page (i.e. this validation result), do the following:

  1. Download the "valid RSS" banner.

  2. Upload the image to your own server. (This step is important. Please do not link directly to the image on this server.)

  3. Add this HTML to your page (change the image src attribute if necessary):
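     A minimal example of such a banner link follows. This is a sketch only: the image filename valid-rss.png and the /images/ path are placeholder assumptions — point the src attribute at wherever you uploaded the banner in step 2. The href reuses the validation URL given below.

        <!-- Sketch only: banner filename and path are assumptions; adjust to match your own server. -->
        <a href="http://www.feedvalidator.org/check.cgi?url=https%3A//feeds.feedburner.com/IeeeSpectrumRobotics%3Fformat%3Dxml">
          <img src="/images/valid-rss.png" alt="[Valid RSS]" />
        </a>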

If you would like to create a text link instead, here is the URL you can use:

http://www.feedvalidator.org/check.cgi?url=https%3A//feeds.feedburner.com/IeeeSpectrumRobotics%3Fformat%3Dxml

Copyright © 2002–2009 Sam Ruby, Mark Pilgrim, Joseph Walton, and Phil Ringnalda