Congratulations!

[Valid RSS] This is a valid RSS feed.

Recommendations

This feed is valid, but interoperability with the widest range of feed readers could be improved by implementing the following recommendations.

Source: https://feeds.feedburner.com/IeeeSpectrumRobotics?format=xml

  1. <?xml version="1.0" encoding="utf-8"?>
  2. <rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:media="http://search.yahoo.com/mrss/"><channel><title>IEEE Spectrum</title><link>https://spectrum.ieee.org/</link><description>IEEE Spectrum</description><atom:link href="https://spectrum.ieee.org/feeds/topic/robotics.rss" rel="self"></atom:link><language>en-us</language><lastBuildDate>Fri, 08 Dec 2023 22:23:17 -0000</lastBuildDate><item><title>Video Friday: Floppybot</title><link>https://spectrum.ieee.org/video-friday-floppybot</link><description><![CDATA[
  3. <img src="https://spectrum.ieee.org/media-library/an-animated-gif-showing-a-small-flat-blue-and-black-rectangular-sheet-slowly-and-repeatedly-flopping-forwards.gif?id=50721364&width=1200&height=800&coordinates=128%2C0%2C129%2C0"/><br/><br/><p>Your weekly selection of awesome robot videos</p><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.</p><h5><a href="https://2023.ieee-humanoids.org/">Humanoids 2023</a>: 12–14 December 2023, AUSTIN, TEX.</h5><h5><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 02 February 2024, ZURICH, SWITZERLAND</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><h5><a href="https://2024.ieee-icra.org/">ICRA 2024</a>: 13–17 May 2024, YOKOHAMA, JAPAN</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p>This magnetically actuated soft robot is perhaps barely a robot by most definitions, but I can’t stop watching it flop around.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ff55ebea79252433a22e3634b356476b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Wgh_HNJ2T0c?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>In this work, Ahmad Rafsanjani, Ahmet F. Demirörs, and co‐workers from SDU (DK) and ETH (CH) introduce kirigami into a soft magnetic sheet to achieve bidirectional crawling under rotating magnetic fields. Experimentally characterized crawling and deformation profiles, combined with numerical simulations, reveal programmable motion through changes in cut shape, magnet orientation, and translational motion. This work offers a simple approach toward untethered soft robots.</em></blockquote><p>[ <a href="https://onlinelibrary.wiley.com/doi/10.1002/advs.202301895">Paper</a> ] via [ <a href="https://www.softrobotics.dk/">SDU</a> ]</p><p>Thanks, Ahmad!</p><div class="horizontal-rule"></div><p>Winner of the earliest holiday video is the LARSEN team at Inria!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="16313e12594c7519be4f56776b03bde4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/jqfBa_PIS9s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.inria.fr/en/larsen">Inria</a> ]</p><p>Thanks, Serena!</p><div class="horizontal-rule"></div><p>Even though this is just a rendering, I really appreciate Apptronik being like, “we’re into the humanoid thing, but sometimes you just don’t need legs.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3c8be8b4915634c8fd4ab61ffee41966" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Vd7I40iBQkI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://apptronik.com/">Apptronik</a> ]</p><div class="horizontal-rule"></div><p>We’re not allowed to discuss unmentionables here at <em>IEEE Spectrum</em>, so I can only tell you that Digit has started working in a warehouse handling, uh, things.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bd42e555f26250f9dea2a75e22b049e6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/NgYo-Wd0E_U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://agilityrobotics.com/news/2023/gxo-conducting-industry-leading-pilot-of-human-centric-robot">Agility</a> ]</p><div class="horizontal-rule"></div><p>Unitree’s sub-$90k H1 Humanoid suffering some abuse in a non-PR video.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="03fd9bc114b497bf06c00691e27a4416" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/tw5PzIlAg3E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://pc.watch.impress.co.jp/docs/news/1551834.html">Impress</a> ]</p><div class="horizontal-rule"></div><p>Unlike me, ANYmal can perform 24/7 in all weather.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e58880caab094004e500c1fcfebe5c06" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/1gu8tllMc2o?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.anybotics.com/">ANYbotics</a> ]</p><div class="horizontal-rule"></div><p>Most of the world will need to turn on subtitles for this, but it’s cool to see how industrial robots can be used to make art.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8648fe1bed17fda1af7a0a2c3a184bfd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/j2O5dijUUbU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.kuka.com/en-de/products/process-technologies/milling">Kuka</a> ]</p><div class="horizontal-rule"></div><p>I was only 12 when this episode of <em>Scientific American Frontiers</em> aired, but I totally remember Alan Alda meeting Flakey!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="19892997204c86f3963f505f87f672ad" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/h7eDWHLHIno?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>And here’s the segment, it’s pretty great.</p><p class="shortcode-media shortcode-media-youtube">
  4. <span class="rm-shortcode" data-rm-shortcode-id="14aa678a45759ded5ecb5527d62a9366" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/RU5X-AjG5_I?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  5. </p><p>[ <a href="https://www.sri.com/ics/cyber-formal-methods/karen-myers-when-i-introduced-flakey-the-robot-to-alan-alda/">SRI</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Agility CEO Damion Shelton talks about the hierarchy of robot control and draws similarities to the process of riding a horse.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1bd80168e1ad9c3712265b975386b325" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/guvzug2tuSk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://agilityrobotics.com/news/2023/gxo-conducting-industry-leading-pilot-of-human-centric-robot">Agility</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Seeking to instill students with real-life workforce skills through hands-on learning, teachers at Central High School in Louisville, Ky., incorporated Spot into their curriculum. For students at CHS, a magnet school for Jefferson County Public Schools district, getting experience with an industrial robot has sparked a passion for engineering and robotics, kickstarted advancement into university engineering programs, and built lifelong career skills. See how students learn to operate Spot, program new behaviors for the robot, and inspire their peers with the school’s “emotional support robot” and unofficial mascot.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e31c961a88b6a52d9a9428222d2dc6b0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2sZphMFJ-x8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/case-studies/spot-inspires-learners-central-high-school/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 08 Dec 2023 22:23:17 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-floppybot</guid><category>Video friday</category><category>Apptronik</category><category>Agility robotics</category><category>Unitree</category><category>Boston dynamics</category><category>Robotics</category><category>Humanoid robots</category><category>Quadruped robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/an-animated-gif-showing-a-small-flat-blue-and-black-rectangular-sheet-slowly-and-repeatedly-flopping-forwards.gif?id=50721364&amp;width=980"></media:content></item><item><title>Self-Driven “Vine” Seeks Light and Heat</title><link>https://spectrum.ieee.org/biomemetics-vine</link><description><![CDATA[
  6. <img src="https://spectrum.ieee.org/media-library/two-long-thin-robotic-objects-on-a-surface-each-turned-towards-a-light-on-the-side.jpg?id=50714100&width=1200&height=800&coordinates=136%2C0%2C136%2C0"/><br/><br/><p><em>This article is part of our exclusive </em><a href="https://spectrum.ieee.org/collections/journal-watch/" target="_self"><em>IEEE Journal Watch series</em></a><em> in partnership with IEEE Xplore.</em></p><p>Thanks to eons of evolution, vines have the ability to seek out light sources, growing in the direction that will optimize their chances of absorbing sunlight and thriving. Now, researchers have succeeded in creating a vine-inspired crawling bot that can achieve similar feats, seeking out and moving towards light and heat sources. It’s described in a <a href="https://ieeexplore.ieee.org/document/10305272" rel="noopener noreferrer" target="_blank">study</a> published last month in <a href="https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=7083369" rel="noopener noreferrer" target="_blank"><em>IEEE Robotics and Automation Letters</em></a>. </p><p>Shivani Deglurkar, a Ph.D. candidate in the department of Mechanical and Aerospace Engineering at the University of California, San Diego, helped co-design these automated “vines.” Because of its light- and heat-seeking abilities, the system doesn’t require a complex centralized controller. Instead, the “vines” automatically move towards a desired target. “[Also], if some of the vines or roots are damaged or removed, the others remain fully functional,” she notes. </p><p>While the tech is still in its infancy, Deglurkar says she envisions it helping in different applications related to solar tracking, or perhaps even in detecting and fighting smoldering fires. </p><p class="shortcode-media shortcode-media-youtube">
  7. <span class="rm-shortcode" data-rm-shortcode-id="36f24d9896acf5a87de8cf869e34b45e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Y1GZqAWq6a4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  8. <small class="image-media media-caption" placeholder="Add Photo Caption...">It uses a novel actuator that contracts in the presence of light, causing it to gravitate towards the source. </small>
  9. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Shivani Deglurkar et al. </small>
  10. </p><p>To help the device automatically gravitate towards heat and light, Deglurkar’s team developed a novel actuator. It uses a photo absorber in low-boiling-point fluid, which is contained in many small, individual pouches along the sides of the vine’s body. They called this novel actuator a Photothermal Phase-change Series Actuator (PPSA). </p><p>When exposed to light, the PPSAs absorb light, heat up, inflate with vapor, and contract. As the PPSAs are pressurized, they elongate, by unfurling material from inside its tip. “At the same time, the PPSAs on the side exposed to light contract, shortening that portion of the robot, and steering it toward the [light or heat] source,” explains Deglurkar.</p><p>Her team then tested the system, placing it at different distances from an infrared light source, and confirmed that it will gravitate towards the source at short distances. Its ability to do so depends on the light intensity, whereby stronger light sources allow the device to bend more towards the heat source.</p><p>Full turning of the vine by the PPSAs takes about 90 seconds. Strikingly, the device was even able to navigate around obstacles thanks to its inherent need to seek out light and heat sources. </p><p><a href="https://me.ucsb.edu/people/xiao-charlie" target="_blank">Charles Xiao</a>, a Ph. D. candidate in the department of Mechanical Engineering at the University of California, Santa Barbara, helped co-design the vine. He says he was surprised to see its responsiveness in even very low lighting. “Sunlight is about 1000 W/m<sup>2</sup>, and our robot has been shown to work at a fraction of solar intensity,” he explains, noting that a lot of comparable systems require illumination greater than that of one Sun.</p><p>Xiao says that the main strength of the automated vine is its simplicity and low cost to make. But more work is needed before it can hit the market—or makes its debut fighting fires. “It is slow to respond to light and heat signals and not yet designed for high temperature applications,” explains Xiao. </p><p>Therefore future prototypes would need better performance at high temperatures and ability to sense fires in order to be deployed in a real-world environment. Moving forward, Deglurkar says her team’s next steps include designing the actuators to be more selective to the wavelengths emitted by a fire, and developing actuators with a faster response time.</p>]]></description><pubDate>Fri, 08 Dec 2023 16:10:52 +0000</pubDate><guid>https://spectrum.ieee.org/biomemetics-vine</guid><category>Journal watch</category><category>Light absorbing</category><category>Locomotion</category><category>Bioinspired robots</category><category>Biomimetics</category><dc:creator>Michelle Hampson</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/two-long-thin-robotic-objects-on-a-surface-each-turned-towards-a-light-on-the-side.jpg?id=50714100&amp;width=980"></media:content></item><item><title>Drones Deliver Defibrillators Faster Than Ambulances</title><link>https://spectrum.ieee.org/drone-defibrillator</link><description><![CDATA[
  11. <img src="https://spectrum.ieee.org/media-library/a-yellow-drone-carrying-a-red-aed-case.jpg?id=50679166&width=1200&height=800&coordinates=44%2C0%2C44%2C0"/><br/><br/><p style="">Every minute counts when someone suffers a cardiac arrest. New research suggests that <a href="https://spectrum.ieee.org/autonomous-drones" target="_blank">drones</a> equipped with equipment to automatically restart someone’s heart could help get life-saving care to people much faster.</p><p style=""> If your heart stops beating outside of a hospital, your chance of survival is typically <a href="https://www.resuscitationjournal.com/article/S0300-9572(21)00060-5/pdf" target="_blank">less than 10 percent</a>. One thing that can boost the prospect of pulling through is an <a data-linked-post="2650250930" href="https://spectrum.ieee.org/idiotproofing-the-defibrillator" target="_blank">automated external defibrillator</a> (AED)—a device that can automatically diagnose dangerous heart rhythms and deliver an electric shock to get the heart pumping properly again.</p><p style="">AEDs are designed to be easy to use and provide step-by-step voice instructions, making it possible for untrained bystanders to deliver treatment before an ambulance arrives. But<strong> </strong>even though AEDs are often installed in <a href="https://www.aedbrands.com/blog/where-should-aeds-be-located/" target="_blank">public spaces</a> such as shopping malls and airports, the majority of cardiac arrests outside of hospitals actually occur in homes.</p><p style="">A team of Swedish researchers decided to use <a data-linked-post="2650271792" href="https://spectrum.ieee.org/defibrillator-drone-another-good-drone-idea" target="_blank">drones to deliver AEDs</a> directly to patients. Over the course of an 11-month trial in the suburbs of Gothenburg, the team showed they could get the devices to the scene of a medical emergency before an ambulance 67 percent of the time. Generally the AED arrived more than three minutes earlier, giving bystanders time to attach the device before paramedics reached the patient. In one case, this saved a patient’s life.<strong></strong></p><p style=""> “The results are really promising because we show that it’s possible to beat the ambulance services by several minutes in a majority of cases,” says <a href="https://staff.ki.se/people/andreas-claesson" rel="noopener noreferrer" target="_blank">Andreas Claesson</a>, an associate professor at the <a href="https://ki.se/en" target="_blank">Karolinska Institute</a> in Solna who led the research. “If you look at cardiac arrest, each minute that passes without treatment survival decreases by about 10 percent. So a time benefit of three minutes, as in this study, could potentially increase survival.”</p><p style=""> The project was a collaboration with Gothenburg-based drone operator <a href="https://everdrone.com/" rel="noopener noreferrer" target="_blank">Everdone</a> and covered 194.3 square kilometers of semi-urban areas around the city, with a total population of roughly 200,000. Throughout the study period, the company operated five <a data-linked-post="2652903581" href="https://spectrum.ieee.org/review-djis-new-fpv-drone-is-effortless-exhilarating-fun" target="_blank">DJI drones</a> that could be dispatched from hangars at five different locations around the city. The drones could <a href="https://spectrum.ieee.org/ai-drone-racing" target="_blank">autonomously</a> fly to the scene of an emergency under the watch of a single safety supervisor. Each drone carried an AED in a basket that could be winched down from an altitude of 30 meters.</p><p style=""> When the local emergency response center received a call about a suspected cardiac arrest or ongoing CPR, one of the drones was dispatched immediately. Once the drone reached the location, it lowered the AED to the ground. If the emergency dispatcher deemed it appropriate and safe, the person who had called in the cardiac arrest was directed to retrieve the device.</p><p class="shortcode-media shortcode-media-rebelmouse-image" style="">
  12. <img alt="Hands reach down to pickup an AED off the ground" class="rm-shortcode" data-rm-shortcode-id="d53a9d65cd48db14a240f362197811d3" data-rm-shortcode-name="rebelmouse-image" id="40ece" loading="lazy" src="https://spectrum.ieee.org/media-library/hands-reach-down-to-pickup-an-aed-off-the-ground.jpg?id=50679230&width=980"/>
  13. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Everdrone</small></p><p style=""> Drones weren’t dispatched for every emergency call, because they weren’t allowed to operate in rain and strong winds, in no-fly zones, or when calls came from high-rise buildings. But in <a href="https://www.thelancet.com/journals/landig/article/PIIS2589-7500(23)00161-9/fulltext#%20" target="_blank">a paper</a> in the December edition of <em>The Lancet D</em><em>igital Health</em>, the research team reported that of the 55 cases where both a drone and an ambulance reached the scene of the emergency, the drone got there first 37 times, with a median lead time of 3 minutes and 14 seconds.</p><p style=""> Only 18 of those emergency calls actually turned out to be cardiac arrests, but in six of those cases the caller managed to apply the AED. In two cases the device recommended applying a shock, with one of the patients surviving thanks to the intervention. The number of cases is too few to make any claims about the clinical effectiveness of the approach, says Claesson, but he says the results clearly show that drones are an effective way to improve emergency response times.</p><p style=""> “Three minutes is quite substantial,” says <a href="https://chan.mie.utoronto.ca/" rel="noopener noreferrer" target="_blank">Timothy Chan</a>, a professor of mechanical and industrial engineering at the University of Toronto, who has investigated the effectiveness of<strong> </strong>drone-delivered AEDs. “Given that in most parts of the world emergency response times are fairly static over time, it would be a huge win if we could achieve and sustain a big reduction like this in widespread practice.”</p><p style=""> The approach won’t work everywhere, admits Claesson. In rural areas, the technology would likely lead to even bigger reductions in response time, but lower population density means the cases would be too few to justify the investment. And in big cities, ambulance response times are already relatively rapid and high rise buildings would make drone operation challenging.</p><p style=""> But in the kind of semi-urban areas where the trial was conducted, Claesson thinks the technology is very promising. Each drone system costs roughly US $125,000 a year to run and can cover an area with roughly 30,000 to 40,000 inhabitants, which he says is already fairly cost-effective. But what will make the idea even more compelling is when the drones are able to respond to a wider range of emergencies.</p><p style=""> That could involve <a data-linked-post="2650278428" href="https://spectrum.ieee.org/in-the-air-with-ziplines-medical-delivery-drones" target="_blank">delivering medical supplies</a> for other time-sensitive medical emergencies like drug overdoses, allergic reactions or severe bleeding, he says. Drones equipped with cameras could also rapidly relay video of car accidents or fires to dispatchers, enabling them to tailor the emergency response based on the nature and severity of the incident.</p><p style=""> The biggest challenge when it comes to delivering medical support such as AEDs by drone, says Claesson, is the reliance on untrained bystanders.“It’s a really stressful event for them,” he says. “Most often it’s a relative and most often they don’t know CPR and they might not know how an AED works.”</p><p style=""> One promising future direction could be to combine drone-delivered AEDs with existing smartphone apps that are used to quickly alert volunteers trained in first aid to nearby medical emergencies. “In Sweden, in 40 percent of cases they arrive before an ambulance,” says Claesson. “We could just send a push notification to the app saying a drone will deliver an AED in two minutes, make your way to the site.”</p>]]></description><pubDate>Thu, 07 Dec 2023 15:00:54 +0000</pubDate><guid>https://spectrum.ieee.org/drone-defibrillator</guid><category>Drones</category><category>Aed</category><category>Medicine</category><category>Emergency response</category><category>Defibrillator</category><dc:creator>Edd Gent</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-yellow-drone-carrying-a-red-aed-case.jpg?id=50679166&amp;width=980"></media:content></item><item><title>ANYmal’s Wheel-Hand-Leg-Arms Open Doors Playfully</title><link>https://spectrum.ieee.org/quadruped-robot-wheels</link><description><![CDATA[
  14. <img src="https://spectrum.ieee.org/media-library/a-large-red-quadrupedal-robot-with-wheels-at-the-end-of-its-limbs-balances-on-two-legs-as-it-opens-a-door-and-throws-a-package-i.gif?id=50604361&width=1200&height=800&coordinates=62%2C0%2C63%2C0"/><br/><br/><p>The tricked out version of the <a href="https://robotsguide.com/robots/anymal" target="_blank">ANYmal</a> quadruped, as customized by Zürich-based <a href="https://spectrum.ieee.org/delivery-robot-anymal" target="_self">Swiss-Mile</a>, just keeps getting better and better. Starting with a commercial quadruped, adding powered wheels made the robot fast and efficient, while still allowing it to handle curbs and stairs. A few years ago, the robot <a href="https://spectrum.ieee.org/delivery-robot-anymal" target="_self">learned how to stand up</a>, which is an efficient way of moving and made the robot much more pleasant to hug, but more importantly, it unlocked the potential for the robot to start doing manipulation with its wheel-hand-leg-arms. </p><p>Doing any sort of practical manipulation with ANYmal is complicated, because its limbs were designed to be legs, not arms. But at the <a href="https://rsl.ethz.ch/" target="_blank">Robotic Systems Lab at ETH Zurich</a>, they’ve managed to teach this robot to use its limbs to open doors, and even to grasp a package off of a table and toss it into a box.</p><p class="pull-quote">When it makes a mistake in the real world, the robot has already learned the skills to recover.</p><hr/><p class="shortcode-media shortcode-media-youtube">
  15. <span class="rm-shortcode" data-rm-shortcode-id="93c144fb11193bed054fca76b9047153" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Qob2k_ldLuw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  16. </p><p>The ETHZ researchers got the robot to reliably perform these complex behaviors using a kind of reinforcement learning called ‘<a href="https://towardsdatascience.com/curiosity-driven-learning-made-easy-part-i-d3e5a2263359" target="_blank">curiosity driven’ learning</a>. In simulation, the robot is given a goal that it needs to achieve—in this case, the robot is rewarded for achieving the goal of passing through a doorway, or for getting a package into a box. These are very high-level goals (also called “<a data-linked-post="2650276740" href="https://spectrum.ieee.org/openai-releases-algorithm-that-helps-robots-learn-from-hindsight" target="_blank">sparse rewards</a>”), and the robot doesn’t get any encouragement along the way. Instead, it has to figure out how to complete the entire task from scratch.</p><p class="pull-quote">The next step is to endow the robot with a sense of contact-based surprise.</p><p>Given an impractical amount of simulation time, the robot would likely figure out how to do these tasks on its own. But to give it a useful starting point, the researchers introduced the concept of curiosity, which encourages the robot to play with goal-related objects. “In the context of this work, ‘curiosity’ refers to a natural desire or motivation for our robot to explore and learn about its environment,” says author <a href="https://www.markobjelonic.com/" target="_blank">Marko Bjelonic</a>, “Allowing it to discover solutions for tasks without needing engineers to explicitly specify what to do.” For the door-opening task, the robot is instructed to be curious about the position of the door handle, while for the package-grasping task, the robot is told to be curious about the motion and location of the package. Leveraging this curiosity to find ways of playing around and changing those parameters helps the robot achieve its goals, without the researchers having to provide any other kind of input.</p><p>The behaviors that the robot comes up with through this process are reliable, and they’re also diverse, which is one of the benefits of using sparse rewards. “The learning process is sensitive to small changes in the training environment,” explains Bjelonic. “This sensitivity allows the agent to explore various solutions and trajectories, potentially leading to more innovative task completion in complex, dynamic scenarios.” For example, with the door opening task, the robot discovered how to open it with either one of its end-effectors, or both at the same time, which makes it better at actually completing the task in the real world. The package manipulation is even more interesting, because the robot sometimes dropped the package in training, but it autonomously learned how to pick it up again. So, when it makes a mistake in the real world, the robot has already learned the skills to recover.</p><p>There’s still a bit of research-y cheating going on here, since the robot is relying on the visual code-based <a href="https://april.eecs.umich.edu/software/apriltag" rel="noopener noreferrer" target="_blank">AprilTags</a> system to tell it where relevant things (like door handles) are in the real world. But that’s a fairly minor shortcut, since direct detection of things like doors and packages is a fairly well understood problem. Bjelonic says that the next step is to endow the robot with a sense of contact-based surprise, in order to encourage exploration, which is a little bit gentler than what we see here. </p><p>Remember, too, that while this is definitely a research paper, Swiss-Mile is a company that wants to get this robot out into the world doing useful stuff. So, unlike most pure research that we cover, there’s a slightly better chance here for this ANYmal to wheel-hand-leg-arm its way into some practical application.</p>]]></description><pubDate>Sat, 02 Dec 2023 17:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/quadruped-robot-wheels</guid><category>Quadruped robots</category><category>Anymal</category><category>Swiss-mile</category><category>Eth zurich</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/a-large-red-quadrupedal-robot-with-wheels-at-the-end-of-its-limbs-balances-on-two-legs-as-it-opens-a-door-and-throws-a-package-i.gif?id=50604361&amp;width=980"></media:content></item><item><title>Video Friday: Tap Finger, Move Mountain</title><link>https://spectrum.ieee.org/video-friday-tap-finger-move-mountain</link><description><![CDATA[
  17. <img src="https://spectrum.ieee.org/media-library/an-excavator-with-no-human-inside-and-sensors-on-its-roof-picks-up-a-large-rock-and-places-it-on-a-partially-completed-rock-wall.png?id=50604854&width=1200&height=800&coordinates=150%2C0%2C150%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://2023.ieee-humanoids.org/">Humanoids 2023</a>: 12–14 December 2023, AUSTIN, TEXAS</h5><h5><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 2 February 2024, ZURICH</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p>This is such an excellent use for autonomous robots: difficult, precise work that benefits from having access to lots of data. Push a button, stand back, and let the robot completely reshape your landscape.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c325e1a010b2a883938eaa71e5d54bc8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/TxpE5yryCTU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://gravisrobotics.com/">Gravis Robotics</a> ]</p><div class="horizontal-rule"></div><p>Universal Robots introduced the UR30 at IREX, in Tokyo, which can lift 30 kilograms—not the 63.5 kg that it says on the tire. That’s the weight of the UR30 itself.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="43789a8e3aadaad2d9253391ad0a2ef9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/VtBtTT8PsNY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Available for preorder now.</p><p>[ <a href="https://www.universal-robots.com/products/ur30-robot/">Universal Robots</a> ]</p><div class="horizontal-rule"></div><p>IREX is taking place in Japan right now, and here’s a demo of Kaleido, a humanoid robot from Kawasaki.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="894a188414120595a4159f0f53137893" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/_h66xSbIEdU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://kawasakirobotics.com/asia-oceania/blog/story_22/">Kawasaki</a> ] via [ <a href="https://www.youtube.com/@shuchannel01/videos">YouTube</a> ]</p><div class="horizontal-rule"></div><p>The Unitree H1 is a full-size humanoid for under US $90,000 (!).</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="283d1a14ec1abe16fdc4060f8fa9263a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/q8JMX6PGRoI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/h1/">Unitree</a> ]</p><div class="horizontal-rule"></div><p>This is extremely impressive but freaks me out a little to watch, and I’m not entirely sure why.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="fe78cc3f4e978638bae929d9d9470255" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/cCtpNDl4IeU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://taochenshh.github.io/projects/visual-dexterity">MIT CSAIL</a> ]</p><div class="horizontal-rule"></div><p>If you look in the background of this video, there’s a person wearing an exoskeleton controlling the robot in the foreground. This is an ideal system for imitation learning, and the robot is then able to perform a similar task autonomously.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="638af8727e7655120c1060c2b24175b3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/RsoI0W8SPPA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://haraduka.github.io/jaxon-tablis-imitation/">Github</a> ]</p><p>Thanks, Kento!</p><div class="horizontal-rule"></div><blockquote><em>The video shows highlights from the RoboCup 2023 Humanoid AdultSize competition in Bordeaux, France. The winning team NimbRo is based in the Autonomous Intelligent Systems lab of University of Bonn, Germany.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cd3ae2fcebecc6496a08b9613407fb27" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/hKLC0Vz1GmM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://nimbro.net/Humanoid/">NimbRo</a> ]</p><div class="horizontal-rule"></div><blockquote><em>This video describes an approach to generate complex, multicontact motion trajectories using user guidance provided through Virtual Reality. User input is useful to reduce the search space through defined key frame. We show these results on the humanoid robot, Valkyrie, from NASA Johnson Space Center, in both simulation and on hardware.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9df5e0fd3c42632cd1603b1930803c5d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vi4zcLSKknk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2303.08232">Paper</a> ] via [ <a href="https://robots.ihmc.us/">IHMC</a> ]</p><div class="horizontal-rule"></div><p>For the foreseeable future, this is likely going to be necessary for most robots doing semi-structured tasks like trailer unloading: human in (or on) the loop supervision.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="29cbe57083c166bba536ae790cfc1538" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/X6umUz8Ia8Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Of course, one human can supervise many robots at once, so as long as most of the robots are autonomous most of the time, it’s all good.</p><p>[ <a href="https://www.contoro.com/">Contoro</a> ]</p><div class="horizontal-rule"></div><blockquote><em>The Danish medical technology start-up ROPCA ApS has launched its first medical product, the arthritis robot “ARTHUR”, which is already being used in the first hospitals. It is based on the lightweight robot LBR Med and supports the early diagnosis of rheumatoid arthritis using robot-assisted ultrasound. This ultrasound robot enables autonomous examination and can thus counteract the shortage of specialists in medicine. This enables earlier treatment, which is essential for a good therapeutic outcome.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d03959ddc42d4471df7d14207322fdcd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/KoxZAwUWTsk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ropca.com/">ROPCA</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Since 2020, KIMLAB has dedicated efforts to craft an affordable humanoid robot tailored for educational needs, boasting vital features like an ROS-enabled processor and multimodal sensory capabilities. By incorporating a commercially available product, we seamlessly integrated an SBC (Orange PI Lite 2), a camera, and an IMU to create a cost-effective humanoid robot, priced at less than $700 in total.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f0cd85f3c3d4c4aca602cbfd7a174abe" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/rO76ji2_sfU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://publish.illinois.edu/kimlab2020/">KIMLAB</a> ]</p><div class="horizontal-rule"></div><blockquote><em>As the newest product launched by WEILAN, the 6th generation AlphaDog, namely BabyAlpha, is defined as a new family member of the artificial intelligence era. Designed for domestic scenarios, it was born for the purpose of providing joyful companionship. Not only do they possess autonomous emotions and distinct personalities, but they also excel in various skills such as singing and dancing, FaceTime calling, English communication, and sports.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="441a07c20bc895c0a82d762de6e36c3a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/o04Sn3uhaDw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.weilan.com/en/alphadogc.html">Weilan</a> ] via [ <a href="https://www.youtube.com/watch?v=o04Sn3uhaDw">ModernExpress</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 01 Dec 2023 18:55:53 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-tap-finger-move-mountain</guid><category>Universal robots</category><category>Video friday</category><category>Kawasaki</category><category>Unitree</category><category>Robotics</category><category>Robotic arm</category><category>Humanoid robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/an-excavator-with-no-human-inside-and-sensors-on-its-roof-picks-up-a-large-rock-and-places-it-on-a-partially-completed-rock-wall.png?id=50604854&amp;width=980"></media:content></item><item><title>Video Friday: Punch-Out</title><link>https://spectrum.ieee.org/video-friday-punch-out</link><description><![CDATA[
  18. <img src="https://spectrum.ieee.org/media-library/a-photo-showing-a-humanoid-robot-wearing-boxing-gloves-facing-a-human-wearing-hand-pads.png?id=50552353&width=1200&height=800&coordinates=400%2C0%2C0%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://2023.ieee-humanoids.org/">Humanoids 2023</a>: 12–14 December 2023, AUSTIN, TEX.</h5><h5><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 02 February 2024, ZURICH, SWITZERLAND</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p>Do you find yourself wondering why the world needs bipedal humanoid robots? Allow IHMC and Boardwalk Robotics to answer that question with this video.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="32de133de258b15100136c2a05852823" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/e20zbSuIR7o?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robots.ihmc.us/">IHMC</a> ]</p><p>Thanks, Robert!</p><div class="horizontal-rule"></div><blockquote><em>As NASA’s Ingenuity Helicopter made its 59th flight on Mars–achieving its second highest altitude while taking pictures of this flight–the Perseverance Mars rover was watching. See two perspectives of this 142-second flight that reached an altitude of 20 meters (66 feet). This flight took place on 16 Sept. 2023. </em><em>In this side-by-side video, you’ll see the perspective from Perseverance on the left, which was captured by the rover’s Mastcam-Z imager from about 55 m (180 ft.) away. On the right, you’ll see the perspective from Ingenuity, which was taken by its downward-pointing Navigation Camera (Navcam). During Flight 59, Ingenuity hovered at different altitudes to check Martian wind patterns. The highest altitude achieved in this flight was 20 m. At the time, that was a record for the helicopter.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d2b4548eb74b5e513198bf934c113c6a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/V5ac3jktME4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://mars.nasa.gov/technology/helicopter/">JPL</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Cassie Blue showcases its ability to navigate a moving walkway, a common yet challenging scenario in human environments. Cassie Blue can walk on to and off of a 1.2 meter-per-second moving treadmill and reject disturbances caused by a tugging gantry and sub-optimal approach angle caused by operator error. The key to Cassie Blue’s success is a new controller featuring a novel combination of virtual constraint-based control and a model predictive controller applied on the often-neglected ankle motor. This technology paves the way for robots to adapt and function in dynamic, real-world settings.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c106c149f7b09148e1985f295e059446" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/utANK8jTwuI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2307.02448">Paper</a> ] via [ <a href="https://robotics.umich.edu/">Michigan Robotics</a> ]</p><p>Thanks, Wami!</p><div class="horizontal-rule"></div><blockquote><em>In this study, we propose a parallel wire-driven leg structure, which has one DoF of linear motion and two DoFs of rotation and is controlled by six wires, as a structure that can achieve both continuous jumping and high jumping. The proposed structure can simultaneously achieve high controllability on each DoF, long acceleration distance and high power required for jumping. In order to verify the jumping performance of the parallel wire-driven leg structure, we have developed a parallel wire-driven monopedal robot, RAMIEL. RAMIEL is equipped with quasi-direct drive, high power wire winding mechanisms and a lightweight leg, and can achieve a maximum jumping height of 1.6 m and a maximum of seven continuous jumps.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a3f8c9496561911eefb89da9ef48a143" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dPmIMdITTwM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://tenrobo18.github.io/ramiel-iros2022/">RAMIEL</a> ]</p><p>Thanks, Temma!</p><div class="horizontal-rule"></div><p><em>PAL Robotics’ Kangaroo has been designed to be lightweight and powerful, enabling the robot to perform agile maneuvers. To achieve this, we analysed existing bipedal platforms and developed a novel leg design based on ball screw linear actuators with closed kinematic chains. However, the team faced a challenge as there are only a few control algorithms and libraries that support closed kinematic chains. To overcome this challenge, we defined the full model of the robot and implemented a library that computes all the transformations using virtual constraints; and defined a simple model and integrated all the non-linear transformations in the ROS transmissions. Here, using those approaches we demonstrate Kangaroo walking using a ZMP-based (Zero moment point) position control algorithm.<br/></em></p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8b013cb9e5b04c5b132a0bbbe5a612b7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/TU9q6j8KJGU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://pal-robotics.com/robots/kangaroo/">PAL Robotics</a> ]<br/></p><p>Thanks, Lorna!</p><div class="horizontal-rule"></div><blockquote><em>SLOT is a small soft-bodied crawling robot with electromagnetic legs and passive body adaptation. The robot, driven by neural central pattern generator (CPG)-based control, can successfully crawl on a variety of metal terrains, including a flat surface, step, slope, confined space, and an inner (concave surface) and outer (convex surface) pipe in both horizontal and vertical directions. It can be also steered to navigate through a cluttered environment with obstacles. This small soft robot has the potential to be employed as a robotic system for inner and outer pipe inspection and confined space exploration in the oil and gas industry.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bd826a8358c415b303bbf204baee9e9b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7inSvLkLSwc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://vistec.ist/faculty-member/poramate">VISTEC</a> ]</p><p>Thanks, Poramate!</p><div class="horizontal-rule"></div><blockquote><em>It isn’t easy for a robot to find its way out of a maze. Picture these machines trying to traverse a kid’s playroom to reach the kitchen, with miscellaneous toys scattered across the floor and furniture blocking some potential paths. This messy labyrinth requires the robot to calculate the most optimal journey to its destination, without crashing into any obstacles. What is the bot to do? MIT CSAIL researchers’ “Graphs of Convex Sets (GCS) Trajectory Optimization” algorithm presents a scalable, collision-free motion planning system for these robotic navigational needs.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a11e43bf7d88da9763545479e7bebad3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/4zvVnUv3ZYw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.csail.mit.edu/news/new-optimization-framework-robot-motion-planning">MIT CSAIL</a> ]</p><div class="horizontal-rule"></div><blockquote><em>As the field of human-robot collaboration continues to grow and autonomous general-purpose service robots become more prevalent, robots need to obtain situational awareness and handle tasks with a limited field of view and workspace. Addressing these challenges, KIMLAB and Prof. Yong Jae Lee at the University of Wisconsin-Madison utilize the game of chess as a testbed, employing a general-purpose robotic arm.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7473cd70b8edd718a28d86092a1836a5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GsLxrXSdOqA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://publish.illinois.edu/kimlab2020/">KIMLAB</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Humanoid robots have the potential of becoming general purpose robots augmenting the human workforce in industries. However, they must match the agility and versatility of humans. In this paper, we perform experimental investigations on the dynamic walking capabilities of a series-parallel hybrid humanoid named RH5. We demonstrate that it is possible to walk up to speeds of 0.43 m/s with a position-controlled robot without full state feedback, which makes it one of the fastest walking humanoids with similar size and actuation modalities.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="edb59a7d7790c662e43ccb90024eed34" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/39GL2vPedGY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotik.dfki-bremen.de/en/research/robot-systems/rh5">DFKI</a> ]</p><div class="horizontal-rule"></div><p>Avocado drone. That is all.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="091b916a9e1a4dc59749205ff1cb6161" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xWKMRphs6fo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ieeexplore.ieee.org/document/10323196">Paper</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Autonomous robots must navigate reliably in unknown environments even under compromised exteroceptive perception, or perception failures. Such failures often occur when harsh environments lead to degraded sensing, or when the perception algorithm misinterprets the scene due to limited generalization. In this paper, we model perception failures as invisible obstacles and pits, and train a reinforcement learning (RL) based local navigation policy to guide our legged robot.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d55471bb11ddec7c7e79b151836928a3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GbTbUdCrDdI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sites.google.com/leggedrobotics.com/resilient-navigation">Resilient Navigation</a> ]</p><div class="horizontal-rule"></div><blockquote><em>X20 Long Range Remote Hazard Detection Test. We remote the robot dog from a straight line distance of one kilometer, and it successfully tested the density of gases. The purpose of the test is to provide solution for firefighters to use the robot to detect harmful gases first before putting themselves in danger.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3e12723b44df1451d0fa256167053025" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Q2A8UsNtAMc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en/index/product.html">Deep Robotics</a> ]</p><div class="horizontal-rule"></div><p>This CMU RI Seminar is by Robert Ambrose from Texas A&M, on “Robots at the Johnson Space Center and Future Plans.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="73042a74e3d49030d7f037d0269fd5bd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/JgxrAX5P0Y8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>The seminar will review a series of robotic systems built at the Johnson Space Center over the last 20 years. These will include wearable robots (exoskeletons, powered gloves and jetpacks), manipulation systems (ISS cranes down to human scale) and lunar mobility systems (human surface mobility and robotic rovers). As all robotics presentations should, this will include some fun videos.</em></blockquote><p>[ <a href="https://www.ri.cmu.edu/event/robots-at-the-johnson-space-center-and-future-plans/">CMU RI</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 24 Nov 2023 19:03:01 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-punch-out</guid><category>Ihmc</category><category>Video friday</category><category>Nasa</category><category>Pal robotics</category><category>Robotics</category><category>Humanoid robots</category><category>Walking robots</category><category>Dynamic walking</category><category>Ingenuity</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-photo-showing-a-humanoid-robot-wearing-boxing-gloves-facing-a-human-wearing-hand-pads.png?id=50552353&amp;width=980"></media:content></item><item><title>Imagineer Morgan Pope Uses Electromagnetism to Spark Emotions</title><link>https://spectrum.ieee.org/disney-roboticist-morgan-pope</link><description><![CDATA[
  19. <img src="https://spectrum.ieee.org/media-library/a-white-man-standing-alongside-a-human-looking-robot-seated-in-a-red-chair.png?id=50468106&width=1200&height=800&coordinates=0%2C480%2C0%2C481"/><br/><br/><p style="">Most people probably think of robots as cold and calculating, but for <a href="https://www.linkedin.com/in/morganthomaspope/" rel="noopener noreferrer" target="_blank">Morgan Pope</a> they can be a tool for generating emotions.</p><p style="">As a research scientist at <a href="https://www.disneyresearch.com/" rel="noopener noreferrer" target="_blank">Disney Research</a> in Glendale, Calif., Pope designs robots for the entertainment giant’s theme parks. But working as an Imagineer, as Disney’s researchers are known, requires both in-depth knowledge of the latest technologies and an instinctive sense of “magic.”</p><h3>Morgan Pope</h3><br/><p><strong></strong><strong>Employer: </strong></p><p>Disney Research, Glendale, Calif.</p>
  20. <p><strong>Title: </strong></p><p>Imagineer</p>
  21. <p><strong>Education:</strong> </p><p>Bachelor’s degree in engineering, Harvard; master’s and Ph.D. degrees in mechanical engineering, Stanford</p><p>“We have a very different mission compared to conventional roboticists,” he says. “We’re trying to use electromagnetism to create emotions.”</p><p>Robots have a long history at Disney. Since 1965, an animatronic of U.S. president Abraham Lincoln has been a fixture at<a href="https://disneyland.disney.go.com/" rel="noopener noreferrer" target="_blank"> Disneyland</a> in Anaheim, Calif. But until recently, most of the robots on display have been firmly bolted to the floor, Pope says, and that has limited the stories they can tell.</p><p>Pope takes advantage of recent breakthroughs in robotics to create robots that can jump, flip, and tumble. He helped build the mechanical superhero Spider-Man, a stunt-double animatronic, or <a href="https://robotsguide.com/robots/stuntronics" rel="noopener noreferrer" target="_blank">stuntronic</a>, that makes death-defying leaps off buildings and over the heads of audiences at Disneyland’s <a href="https://disneyland.disney.go.com/destinations/disney-california-adventure/avengers-campus/?ef_id=CjwKCAjwgsqoBhBNEiwAwe5w02eIiNoGUfG6mD7O6b96vX6eOJArdqmCOoo6nfwS-MIuxppjgsyVvxoC540QAvD_BwE:G:s&s_kwcid=AL!5054!3!567155566052!e!!g!!avengers%20campus&CMP=KNC-FY24_DLR_TRA_DOM_LACN_SCP_ORI_Marvel_EXACT%7CG%7C5232059.DL.AM.01.05%7CM1P1YOD%7CBR%7C567155566052&keyword_id=kwd-809287821531%7Cdc%7Cavengers%20campus%7C567155566052%7Ce%7C5054:3%7C&gclid=CjwKCAjwgsqoBhBNEiwAwe5w02eIiNoGUfG6mD7O6b96vX6eOJArdqmCOoo6nfwS-MIuxppjgsyVvxoC540QAvD_BwE" rel="noopener noreferrer" target="_blank">Avengers Campus</a>. Today Pope is busy designing a rollerblading cartoon character whose clumsiness is designed to tug at your heartstrings. </p><p style="">“We have all these amazing characters that do highly dynamic, engaging, fun things,” he says. “If we can bring these characters to life in ways that currently aren’t possible, that can give people powerful emotional experiences.”</p><h2 style="">A specialty in robot mobility</h2><p>Growing up, Pope was a bookworm. He loved science fiction and popular science magazines and gravitated toward topics like astronomy and quantum mechanics. In college, he discovered his passion for building things. He enrolled in engineering at<a href="https://www.harvard.edu/" rel="noopener noreferrer" target="_blank"> Harvard</a>, and during the summer before his senior year, he secured an internship at the university’s <a href="https://www.micro.seas.harvard.edu/" rel="noopener noreferrer" target="_blank">Microrobotics Laboratory</a>. </p><p style="">That experience stuck with him, and after graduating in 2011 Pope decided to pursue a master’s degree in mechanical engineering at <a href="https://www.stanford.edu/" rel="noopener noreferrer" target="_blank">Stanford</a>. He earned his master’s in 2013 and then continued at Stanford, earning a Ph.D. in the same field in 2016. At the university’s <a href="http://bdml.stanford.edu/" rel="noopener noreferrer" target="_blank">Biomimetics and Dexterous Manipulation Laboratory</a>, he specialized in robot mobility. He led the design of the Stanford Climbing and Aerial Maneuvering Platform (<a href="https://spectrum.ieee.org/stanfords-flying-perching-scamp-can-climb-up-walls" target="_self">SCAMP</a>), a small robot that could fly, land on walls, and then climb them.</p><p style="">He had nearly finished his Ph.D. when he met with a friend who had worked at <a href="https://studios.disneyresearch.com/researchlab/disney-research-pittsburgh/" rel="noopener noreferrer" target="_blank">Disney Research in Pittsburgh</a>. When Pope heard about the Imagineers and what they do, it immediately struck him as a great way to apply his skills. Entertainment applications for robotics sounded like a lot of fun, he says, and it was also a relatively unexplored field and therefore ripe for new innovations. That same year, Pope secured a job as a postdoctoral research associate at Disney. </p><p class="pull-quote" style="">“If we can bring these characters to life in ways that currently aren’t possible, that can give people powerful emotional experiences.”</p><p style=""><span></span>Three years later he became a full-time research scientist there, which took some adjustment. As an academic researcher, he spent a lot of time scrounging around for funding, Pope says, and when grants came through, the projects could take years to complete. “The output was also primarily intellectual—you had to prove the basic idea worked, write a research paper, and move on.”.</p><p>Grant writing is less of a concern for a Disney Imagineer, Pope says, but there is more pressure to deliver results quickly. Also, the kinds of problems Imagineers must solve are different from those of most roboticists. The robots are deployed in amusement parks, often in close proximity to guests, so they are held to much higher safety standards than is usual for most robots. There’s also the pressure to ensure that the robots perform reliably and predictably for multiple shows a day. And, while conventional robotics is typically focused on completing a specific task, Pope says his goal is to bring characters to life. That means concentrating on the way the robots look, move, and behave as well as the specific actions they take. </p><p style="">“It’s not what it does, it’s how it does it,” he explains. “It has to do it in a way that makes you feel like this is a real character, a real, live being.”</p><h2>Bringing Spider-Man to life</h2><p style="">Lifelike action was crucial for the first project that Pope worked on at Disney. The goal was to create a robotic stunt double capable of performing complex aerial acrobatics for the <a href="https://disneyland.disney.go.com/attractions/disney-california-adventure/web-slingers-spider-man-adventure/" target="_blank">Amazing Spider-Man</a> show at Disneyland, which launched in 2021. The show features human performers, but one of the stunts involves Spider-Man backflipping 20 meters into the air, which is too dangerous for even the most skilled acrobat.<br/></p><p style="">To convince the audience they were really watching Spider-Man, the researchers had to create a seamless transition between the acrobat and the robot, Pope says. His role was to work out the complex physics that would generate various somersaulting stunts while the robot was in midair. “It was super rewarding to play around with one of the greatest superhero characters of all time,” he says.</p><p class="shortcode-media shortcode-media-rebelmouse-image" style="">
  22. <img alt="A white man bending down to shake the hand of a child-like robot wearing a white hat and white vest." class="rm-shortcode" data-rm-shortcode-id="6006ed3eed49d6270efa6815aa82bd80" data-rm-shortcode-name="rebelmouse-image" id="70f32" loading="lazy" src="https://spectrum.ieee.org/media-library/a-white-man-bending-down-to-shake-the-hand-of-a-child-like-robot-wearing-a-white-hat-and-white-vest.png?id=50468284&width=980"/>
  23. <small class="image-media media-caption" placeholder="Add Photo Caption...">Morgan Pope shows off Disney’s new Indestructible robot, which can rollerblade, somersault, and perform other feats. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Walt Disney Imagineering</small></p><h2 style="">A robot on rollerblades</h2><p>Projects aren’t always so clear-cut, he admits, and they involve a lot of experimentation. In the early phases, small teams knock out quick and simple prototypes until they hit on something that works. </p><p>“You build something and then step back and think, ‘What about this is making me feel something, what about it is connecting with me?’” Pope says.</p><p>The project he’s currently working on involves a lot of this kind of exploration. For example, his team wanted to create robots that run, but the researchers quickly realized that the machines would fall down a lot. So they built a robot that could tolerate a tumble and get up again. In the end, they found that watching the robot pick itself up was what generated the most compelling emotional response. </p><p>“You relate to the robot struggling, because we’ve all been flat on our backs and had to get up,” he observes. </p><p style="">The team eventually scrapped the running concept and instead put its <a href="https://spectrum.ieee.org/disney-robot-indestructibles" target="_self">robot on a pair of Rollerblades</a>. Many people know the awkwardness of trying to skate for the first time, and that makes the robot’s clumsiness all the more relatable. When the researchers <a href="https://www.youtube.com/watch?v=lPqzLE4KjhI&t=260s" target="_blank">debuted a prototype </a>at this year’s South by Southwest in Austin, Texas, the audience’s warm reaction made it clear that they’d made an immediate emotional connection, Pope recalls.</p><h2 style="">A job for a generalist</h2><p>But building robots for Disney is about more than just intuition and emotional intelligence. It also requires skills in electronics, mechanical design, and programming. </p><p>“You need to understand how different systems work, so if you need to dive into any of them, you can go deep and also pull them all together,” Pope says.</p><p>That’s why his team is always on the lookout for generalists. One of the two most important tips he gives to students, he says, is to familiarize themselves with as many disciplines as possible. </p><p>His other suggestion is to build something. It’s the best way to figure out the kind of engineering that excites you the most, he adds. And learning to create stuff just for the joy of it is the surest path to a great career. </p><p style="">“Try to build things that make you happy,” Pope says. “Chase the things that bring you joy. Chase the things that are delightful.”</p>]]></description><pubDate>Tue, 21 Nov 2023 16:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/disney-roboticist-morgan-pope</guid><category>Disney</category><category>Careers</category><category>Robotics</category><category>Type:departments</category><dc:creator>Edd Gent</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-white-man-standing-alongside-a-human-looking-robot-seated-in-a-red-chair.png?id=50468106&amp;width=980"></media:content></item><item><title>Video Friday: GR-1</title><link>https://spectrum.ieee.org/video-friday-gr-1</link><description><![CDATA[
  24. <img src="https://spectrum.ieee.org/media-library/humanoid-robot-pictured-in-foreground-with-arms-wide-open-two-other-humanoids-behind-it.jpg?id=50507050&width=1200&height=800&coordinates=181%2C0%2C182%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://2023.ieee-humanoids.org/">Humanoids 2023</a>: 12–14 December 2023, AUSTIN, TEXAS</h5><h5><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 2 February 2024, ZURICH, SWITZERLAND</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p>Fourier Intelligence has just announced the mass production of their GR-1 humanoid, and they’ve got at least a dozen of them.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b4835aa39b6d91b92802e419d6a31b4f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/BvFxD-8AhJA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robots.fourierintelligence.com/">Fourier Intelligence</a> ]</p><p>Thanks, Ni Tao!</p><div class="horizontal-rule"></div><blockquote><em>This collaborative work between researchers from the University of Southern Denmark and VISTEC introduces a biomorphic soft robotic skin for a hexapod robot platform, featuring a central pattern generator–based neural controller for generating respiratory-like motions on the skin. The design enables visuo-haptic nonverbal communication between humans and robots and improves the robot’s aesthetics by enhancing its biomorphic qualities.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="eb4c18fe7b98dec15b873775ec73afed" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ioDlsNjLZ2I?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ieeexplore.ieee.org/document/10309420">Paper</a> ]</p><p>Thanks, Mads!</p><div class="horizontal-rule"></div><blockquote><em>According to data from 2010, around 1.8 million people in the United States can’t eat on their own. Yet training a robot to feed people presents an array of challenges for researchers. A team led by researchers at the University of Washington created a set of 11 actions a robotic arm can make to pick up nearly any food attainable by fork. In tests with this set of actions, the robot picked up the foods more than 80 percent of the time, which is the user-specified benchmark for in-home use. The small set of actions allows the system to learn to pick up new foods during one meal.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9f5e237322532cc4c15c6d24e61ffd80" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/6j2ymtDI8LI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.washington.edu/news/2023/11/16/robot-assisted-feeding-meal-accessibility/">UW</a> ]</p><p>Thanks, Stefan!</p><div class="horizontal-rule"></div><p>If you watch enough robot videos, you get to know when a robot is being pushed in a way that’s easy to recover from, and when it’s actually being challenged. The end of this video shows IHMC’s Nadia getting pushed sideways against its planted foot, which necessitates a crossover step recovery.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="43f6cca87fd3f03e7ad726b7088becb0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/aM-qb1yd5mU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/pdf/2307.11968.pdf">Paper</a> ] via [ <a href="https://robots.ihmc.us/nadia">IHMC</a> ]</p><p>Thanks, Robert!</p><div class="horizontal-rule"></div><p>Ayato Kanada, an assistant professor at Kyushu University, wants to build woodpecker-inspired Doc Ock tentacles. And when you’re a professor, you can just do that. </p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e3678c371c425bf34e2e6b4b4fe6ba87" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/oNT-9RBxN8s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Also, woodpeckers are weird.</p><p>[ <a href="https://sites.google.com/view/ayato-kanada-en/home">Ayato Kanada</a> ]</p><p>Thanks, Ayato!</p><div class="horizontal-rule"></div><blockquote><em>Explore Tevel’s joint robotic fruit-harvesting pilot program with Kubota in this video, filmed during the 2023 apple harvest season in the Mazzoni Group’s orchards in Ferrara, Italy. Watch as our autonomous fruit-picking systems operate with precision, skillfully harvesting various apples in the idyllic Italian orchards.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0ef15f5894a952853dada70ddd072b79" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DPjTmyT4a8w?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.tevel-tech.com/">Tevel</a> ]</p><div class="horizontal-rule"></div><p>Understanding what’s an obstacle and what’s only obstacle-ish has always been tricky for robots, but Spot is making some progress here.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="10d095de79546ceb896f2cfe425231fd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pV7GxAFYuto?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://xiaoyi-cai.github.io/evora/">EVORA</a> ]</p><div class="horizontal-rule"></div><blockquote><em>We tried to play Street Fighter 6 by teleoperating Reachy! Well, it didn’t go as planned, as Antoine won. But it was a pretty epic fight!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="73e7def716f904470c46ae1963306c71" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/-uGTTjLuU68?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pollen-robotics.com/">Pollen Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote><em>The key assets of a data center are the servers. While most of them are active in the server room, idle and new assets are stored in the IT warehouse. Focusing mainly on this IT warehouse, SeRo automates the inbound and outbound management of the data center’s assets.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2650ab3a56c041f5fa011df04989e706" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/cLzzHE5eJt4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.naverlabs.com/en/storyDetail/278">Naver Labs</a> ]</p><div class="horizontal-rule"></div><p>Humans can be so mean.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e0b4bd5d336985f1301c6dba4f849b79" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Qw-GBXj0n_4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.flexiv.com/en/technology/robot">Flexiv</a> ]</p><div class="horizontal-rule"></div><p>Interesting HRI with the flashing light on Spot here.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="32593d73b1324b7c7ef4aa3b79ca5914" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/XWP4scq95Eg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/products/spot/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><p>Flying in circles with a big tank of gas really seems like a better job for a robot pilot than for a human one.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="48d8361bc952d774a697f443bf450838" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/jK8eOZpxte0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.boeing.com/defense/mq25/">Boeing</a> ]</p><div class="horizontal-rule"></div><blockquote><em>On 2 November 2023, at an event hosted by the Swiss Association of Aeronautical Sciences at ETH, Professor Davide Scaramuzza presented a comprehensive overview of our latest advancements in autonomous drone technology aimed at achieving human-level performance.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bbe7c807f4c11b9606cedf37ececce55" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vVataTRomsg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://rpg.ifi.uzh.ch/">UZH RPG</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 17 Nov 2023 23:04:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-gr-1</guid><category>Fourier intelligence</category><category>Nadia</category><category>University of washington</category><category>Video friday</category><category>Robotics</category><category>Humanoid robots</category><category>Quadruped robots</category><category>Drones</category><category>Robotic arm</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/humanoid-robot-pictured-in-foreground-with-arms-wide-open-two-other-humanoids-behind-it.jpg?id=50507050&amp;width=980"></media:content></item><item><title>Robot Hand With Working Tendons Printed in One Go</title><link>https://spectrum.ieee.org/3d-printed-robot-hand</link><description><![CDATA[
  25. <img src="https://spectrum.ieee.org/media-library/a-photo-of-two-plastic-robot-hands-with-knobbly-joints-holding-a-pen-and-a-water-bottle.jpg?id=50473016&width=1200&height=800&coordinates=3%2C0%2C4%2C0"/><br/><br/><p>A skeletal <a data-linked-post="2650279155" href="https://spectrum.ieee.org/openai-demonstrates-sim2real-by-with-onehanded-rubiks-cube-solving" target="_blank">robotic hand</a> with working ligaments and tendons can now be <a data-linked-post="2650277941" href="https://spectrum.ieee.org/3d-printing-is-100-times-faster-with-a-powerpoint-projector" target="_blank">3D-printed</a> in one run. The creepy accomplishment was made possible by a new approach to additive manufacturing that can print both rigid and elastic materials at the same time in high resolution.</p><p> The new work is the result of a collaboration between researchers at <a href="https://ethz.ch/en.html" rel="noopener noreferrer" target="_blank">ETH Zurich</a> in Switzerland and a <a href="https://www.mit.edu/" rel="noopener noreferrer" target="_blank">Massachusetts Institute of Technology</a> spin-out called <a href="https://inkbit3d.com/" rel="noopener noreferrer" target="_blank">Inkbit</a>, based in Medford, Mass. The group has devised a new 3D inkjet-printing technique capable of using a wider range of materials than previous devices.</p><p> In a new <a href="http://dx.doi.org/10.1038/s41586-023-06684-3" rel="noopener noreferrer" target="_blank">paper in <em>Nature</em></a>, the group has shown for the first time that the technology can be used to print complex moving devices made of multiple materials in a single print job. These include a bio-inspired robotic hand, a six-legged robot with a grabber, and a pump modeled on the heart.</p><p> “What was really exciting for us is that this technology, for the first time, allowed us to print complete functional systems that work right off the print bed,” says <a href="https://srl.ethz.ch/the-group/people/thomas-buchner.html" rel="noopener noreferrer" target="_blank">Thomas Buchner</a>, a Ph.D. student at ETH Zurich and first author of the paper describing the work.</p><p style="">The new technique operates on principles similar to those of the kind of inkjet printer you might find in an office. Instead of colored inks, though, the printer sprays out resins that harden when exposed to ultraviolet (UV) light, and rather than just printing a single sheet, it builds up 3D objects layer by layer. It’s also capable of printing at extremely high resolution, with voxels—the 3D equivalent of pixels–just a few micrometers across.</p><p class="shortcode-media shortcode-media-youtube">
  26. <span class="rm-shortcode" data-rm-shortcode-id="ce52ed20d7400fd5d9dbb5e4f12af0e5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/eEpn77jNvxQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  27. <small class="image-media media-caption" placeholder="Add Photo Caption...">3D Printed Robot Hand Has Working Tendons</small>
  28. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">
  29. <a href="https://youtu.be/eEpn77jNvxQ" target="_blank">youtu.be</a>
  30. </small>
  31. </p><p> 3D inkjet printers aren’t new, but the palette of materials they can use has typically been limited. That’s because each layer inevitably has imperfections, and the standard approach to dealing with this has been to scrape them off or roll them flat. This means that soft or slow-curing materials cannot be used as they will get smeared or squashed.</p><p> Inkbit has been working on a workaround to this problem for a number of years. The company has built a printer featuring a platform that moves up and down beneath multiple inkjet units, a UV-curing unit, and a scanning unit. After a layer has been deposited and cured, the scanner creates a depth map of the print surface, which is then compared against the 3D model to work out how to adjust the rate of deposition from the inkjet units to even out any irregularities. Areas that received too much resin on the previous layer receive less on the next, and vice versa.</p><p> This means the printer doesn’t require any contact with the materials once they’ve been deposited, says <a href="https://srl.ethz.ch/the-group/prof-robert-katzschmann.html" rel="noopener noreferrer" target="_blank">Robert Katzschmann</a>, a robotics professor at ETH Zurich who led the research. “That leads to all kinds of benefits, because now you can use chemistries that take longer to polymerize, that take longer to harden out, and that opens up a whole new space of much more useful materials.”</p><p><cite class="pull-quote">“We can actually now create a structure or a robot in one shot. It might require maybe adding a motor here or there, but the actual complexity of the structure is all there.” <br/><strong>—Robert Katzschmann, ETH Zurich<br/></strong></cite></p><p>  Previously, Inkbit had been using a scanning approach that could capture images of areas only 2 centimeters across at a time. This process had to be repeated multiple times before all the images were stitched together and analyzed, which significantly slowed down fabrication times. The new technique uses a much faster laser scanning system—the device can now print 660 times as fast as before. In addition, the team has now demonstrated that they can print with elastic polymers called thiol-enes. These materials cure slowly, but they’re much springier and more durable than acrylates, the rubberlike materials that are normally used in commercial 3D inkjet printers.</p><p> To demonstrate the potential of the new 3D printing process, the researchers printed a robotic hand. The device features rigid bones modeled on MRI scans of human hands and elastic tendons that can be connected to servos to curl the fingers in toward the palm. Each fingertip also features a thin membrane with a small cavity behind, which is connected to a long tube printed into the structure of the finger. When the finger touches something, the cavity is compressed, causing the pressure inside the tube to rise. This is picked up by a pressure sensor at the end of the tube, and this signal is used to tell the fingers to stop curling once a certain pressure has been reached.</p><p>The researchers used the hand to grip a variety of objects, including a pen and a water bottle and to touch its thumb to each of its fingertips. Critically, all of the functional parts of the robotic hand, apart from the servos and the pressure sensors, were produced in a single printing job. “What we see as novel about our work is that we can actually now create a structure or a robot in one shot,” says Katzschmann. “It might require maybe adding a motor here or there, but the actual complexity of the structure is all there.”</p><p> The researchers also created a pneumatically powered six-legged robot with a gripper that was able to walk back and forth and pick up a box of Tic-Tacs, and a pump modeled on the human heart, featuring one-way valves and internal pressure sensors, that was capable of pumping 2.3 liters of fluid a minute.</p><p>  Future work will look to further expand the number of materials that the printer can use, says Katzschmann. They are restricted to materials that can be cured using UV light and that aren’t too viscous to work in an inkjet printer. But these could include things like hard epoxies, hydrogels suitable for tissue engineering, or even conductive polymers that could make it possible to print electronic circuits into devices.</p>]]></description><pubDate>Wed, 15 Nov 2023 16:56:55 +0000</pubDate><guid>https://spectrum.ieee.org/3d-printed-robot-hand</guid><category>3d printing</category><category>Additive manufacturing</category><category>Inkjet printing</category><category>Materials</category><category>Robot hand</category><category>Robotics</category><dc:creator>Edd Gent</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-photo-of-two-plastic-robot-hands-with-knobbly-joints-holding-a-pen-and-a-water-bottle.jpg?id=50473016&amp;width=980"></media:content></item><item><title>Video Friday: Beyond the Limit</title><link>https://spectrum.ieee.org/video-friday-beyond-the-limit</link><description><![CDATA[
  32. <img src="https://spectrum.ieee.org/media-library/a-picture-of-a-silver-robot-dog-climbing-a-set-of-concrete-outdoor-stairs-with-logs-rolling-down-them.png?id=50439766&width=1200&height=800&coordinates=150%2C0%2C150%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://ssrr2023.org/">IEEE SSRR 2023</a>: 13–15 November 2023, FUKUSHIMA, JAPAN</h5><h5><a href="https://2023.ieee-humanoids.org/">Humanoids 2023</a>: 12–14 December 2023, AUSTIN, TEXAS</h5><h5><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 2 February 2024, ZURICH</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote><em>Unitree B2: beyond the limit. Maximum speed of 6m/s, sustained load of 40kg and sustained walking endurance of 5h. The comprehensive performance is two to three times that of existing quadruped robots worldwide! Adaptable to all terrains, large load, long-lasting endurance, and super athletic performance! Evolve, evolve, and evolve again!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5910320152f1e0e4ec91904187b1dda5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/-0n_MFLKD3M?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/en/b2/">Unitree</a> ]</p><div class="horizontal-rule"></div><blockquote><em>This shape-changing robot just got a lot smaller. In a new study, engineers at the University of Colorado Boulder debuted mCLARI, a 2-centimeter-long modular robot that can passively change its shape to squeeze through narrow gaps in multiple directions. It weighs less than a gram but can support over three times its body weight as an additional payload.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c38abaac0a26fdb43ce3ddbdad03eb4b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/KbMi6ezXf-Y?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.colorado.edu/lab/jayaram/research/mclari">CU Boulder</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Researchers at CMU used fossil evidence to engineer a soft robotic replica of pleurocystitids, a marine organism that existed nearly 450 million years ago and is believed to be one of the first echinoderms capable of movement using a muscular stem.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5f55acaaf5499d28c94df3d8d2eae4b7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/KMz26Q6Vh-g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://engineering.cmu.edu/news-events/news/2023/11/06-paleobionics.html">CMU</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Stretch has moved over a million customer boxes in under a year, improving predictability and preventing injuries. But how did we get there? Discover how we put our expertise in robotics research to use designing, testing, and deploying a warehouse robot. Starting from the technological building blocks of Atlas, Stretch has the mobility, power, and intelligence to automate the industry’s toughest challenges.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="250e2f4e7d15ef17c15faa50a242a603" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8WZoVJIV9V0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/products/stretch/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote><em>What do the robots do on Halloween after everyone leaves? Join the Ingenuity Labs robots on their trick or treating adventure. Happy Halloween!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6d3a10374f9377312d8addc4be18013d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8NdBNtQmzZg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://offroad.engineering.queensu.ca/">Queens University</a>, Canada ]</p><p>Thanks Josh!</p><div class="horizontal-rule"></div><blockquote><em>FreeLander is a versatile, modular legged-robot hardware platform with adaptive bio-inspired neural control. The robot platform can be used to construct different bio-inspired legged robots. Each module of the platform consists of two legs designed to function as a two-legged robot, which is able to walk on a metal pipe using electromagnetic feet. Multiple modules can be combined to obtain six-legged and eight-legged robots to walk on difficult terrains, such as rough terrain, slopes, random stepfield, gravel, grass, and even in-pipe.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5e416fdf3643ab494da5dbe819459528" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/qr5hahtGYdc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://manoonpong.com/">VISTEC</a> ]</p><p>Thanks Poramate!</p><div class="horizontal-rule"></div><p>Energy Robotics hopes you had a Happy Halloween!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ace83c6b2839636f127daed0dfd97894" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/XMnJ2u1TfyA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.energy-robotics.com/">Energy Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote><em>This work presents a camera model for refractive media such as water and its application in underwater visual-inertial odometry. The model is self-calibrating in real time and is free of known correspondences or calibration targets.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0a8ba98aec4860b657f90ab89f10847d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/OtLE6sEH4IU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.autonomousrobotslab.com/">ARL</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Humans naturally exploit haptic feedback during contact-rich tasks like loading a dishwasher or stocking a bookshelf. Current robotic systems focus on avoiding unexpected contact, often relying on strategically placed environment sensors. In this paper we train a contact-exploiting manipulation policy in simulation for the contact-rich household task of loading plates into a slotted holder, which transfers without any fine-tuning to the real robot.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bd80fc8dae16bba97dcc18e8197ad6c8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/CS7uP_pW77U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sites.google.com/view/compliant-object-insertion">Paper</a> ]</p><p>Thanks Samarth!</p><div class="horizontal-rule"></div><blockquote><em>Presented herewith is another PAPRAS (Plug-And-Play Robotic Arm System) add-on system engineered to augment the functionalities of the quadrupedal robot, Boston Dynamics Spot. The system adeptly integrates two PAPRAS units onto the Spot, drawing inspiration from the mythological creature Orthrus—a two-headed dog in Greek mythology.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="236726e6921a52b5f3d83ba5681e8432" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Yn3IxwAzN6o?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://publish.illinois.edu/kimlab2020/">KIMLAB</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Marwa Eldiwiny is a Ph.D. student and early stages researcher (ESR) at the Vrije Universiteit Brussel whose current research focus is on modelling and simulating self-healing soft materials for industrial applications. Her master’s thesis was “UAV anti-stealth technology for safe operation.” She worked as a research engineer at Inria Lille Nord Europe, research scholar at Tartu Institute of Technology, and a lecturer with the Mechatronics and Industrial Robotics Programme at Minia University, Egypt. Eldiwiny hosts the IEEE RAS Soft Robotics Podcast where researchers from both academia and industry discuss recent developments in the soft robotics research field.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ab655bcf7f9effa8d68de88bdc32d514" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/BXl__8K2gyw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="http://www.smartitn.eu/smart-project-esrs/">SMART ITN</a> ]</p><div class="horizontal-rule"></div><blockquote><em>3 labs. Different robotic solutions of the future. Meet CSAIL’s machine friends.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="df9eacac45b21c2d1a07c205a6b7fe91" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/l5o_edsg_nU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.csail.mit.edu/">MIT CSAIL</a> ]</p><div class="horizontal-rule"></div><p>This University of Pennsylvania GRASP SFI Seminar is by E Farrell Helbling at Cornell, on autonomy for insect-scale robots.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a8da7aa6065c2787190ddf1281841f6c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/XbFaPdXiUHI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Countless science fiction works have set our expectations for small, mobile, autonomous robots for use in a broad range of applications. The ability to move through highly dynamic and complex environments can expand capabilities in search-and-rescue operations and safety-inspection tasks. These robots can also form a diverse collective to provide more flexibility than a multifunctional robot. I will present my work on the analysis of control and power requirements for this vehicle, as well as results on the integration of onboard sensors. I also will discuss recent results that culminate nearly two decades of effort to create a power autonomous insect-scale vehicle. Lastly, I will outline how this design strategy can be readily applied to other micro and bioinspired autonomous robots.</em></blockquote><p>[ <a href="https://www.grasp.upenn.edu/events/fall-2023-grasp-sfi-margaret-coad/">UPenn</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 10 Nov 2023 22:03:41 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-beyond-the-limit</guid><category>Video friday</category><category>Quadruped robots</category><category>Unitree</category><category>Stretch</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-picture-of-a-silver-robot-dog-climbing-a-set-of-concrete-outdoor-stairs-with-logs-rolling-down-them.png?id=50439766&amp;width=980"></media:content></item><item><title>Watch This Giant Chopstick Robot Handle Boxes With Ease</title><link>https://spectrum.ieee.org/warehouse-robots</link><description><![CDATA[
  33. <img src="https://spectrum.ieee.org/media-library/two-men-stand-next-to-a-black-mechanical-robotic-frame-taller-than-they-are-which-big-chopstick-grippers-connected-to-movable-t.jpg?id=50393306&width=1200&height=800&coordinates=0%2C0%2C0%2C497"/><br/><br/><p style="">Although robots are already in warehouses, shuffling small items between bins for shipping or storage, they have yet to take over the job of lugging big, heavy things. And that’s just where they could be of the most use, because lugging is hard for people to do.</p><p style="">Several companies are working on the problem, and there’s likely to be plenty of room for all of them, because the opportunity is enormous. There are a lot of trailers out there that need to be unloaded. Arguably the <em>most</em> interesting approach comes from <a href="https://www.dextrousrobotics.com/" rel="noopener noreferrer" target="_blank">Dextrous Robotics</a>, which has a robot that moves boxes around with a giant pair of chopsticks.</p><hr/><p style="">We first wrote about Dextrous Robotics <a href="https://spectrum.ieee.org/dexterous-robotics-develops-chopstick-manipulation-for-boxes" target="_self">in 2021</a>, when they were working on a proof of concept using Franka Panda robotic arms. Since then, the concept has been proved successfully, and Dextrous has scaled up to a much larger robot that can handle hundreds of heavy boxes per hour with its chopstick manipulators.</p><p class="shortcode-media shortcode-media-youtube" style="">
  34. <span class="rm-shortcode" data-rm-shortcode-id="2bec340386a21c649da07c98ca0959c1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ZtXuustzyqo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  35. </p><p style="">“The chopstick type of approach is very robust,” Dextrous CEO <a href="https://edrumwri.github.io/" target="_blank">Evan Drumwright</a> tells us. “We can carry heavy payloads and small items with very precise manipulation. Independently posable chopsticks permit grasping a nearly limitless variety of objects with a straightforward mechanical design. It’s a real simplification of the grasping problem.”</p><p style="">The video above shows the robot moving about 150 boxes per hour in a scenario that simulates unloading a packed trailer, but the system is capable of operating much faster. The demonstration was done without any path optimization. In an uncluttered environment, Dextrous has been able to operate the system at 900 boxes per hour, about twice as fast as the 300 to 500 boxes per hour that a person can handle.</p><p style="">Of course, the heavier the box, the harder it is for a person to maintain that pace. And once a box gets heavier than about 20 kilograms, it takes two people to move it. At that point, labor becomes far less efficient. On paper, the hardware of Dextrous’s robot is capable of handling 40 kg boxes at an acceleration of up to 3 <em>g</em>s, and up to 65 kg at a lower acceleration. That would equate to 2,000 boxes per hour. True, this is just a theoretical maximum, but it’s what Dextrous is working toward.</p><p style="">If the only problem was to move heavy boxes quickly, robots would have solved it long ago. However, before you can move the box you first have to pick it up, and that complicates matters. Other robotics companies use suction to pick things up. Dextrous alone favors giant chopsticks. </p><p style="">Suction does have the advantage of being somewhat easier to handle on the perception and planning side: Find a flat surface, stick to it, and there you go. That approach assumes you can find a flat surface, but the well-ordered stacks of boxes seen in most demo videos aren’t necessarily what you’ll get in a warehouse. Suction has other problems: It typically has a payload limit of 20 kg or so, it doesn’t work very well with odd-size boxes, and it has trouble operating in temperatures below 10 °C. Suction systems also pull in a lot of dirt, which can cause mechanical problems.</p><p style="">A suction system typically attaches to just one surface, and that limits how fast it can move without losing its grip or tearing open a box. The Dextrous chopsticks can support a box on two sides. But making full use of this capability adds difficulty to the perception and planning side.</p><p style="">“Just getting to this point has been hardcore,” Drumwright says. “We’ve had to get to a level of precision in the perception system and the manipulation to be able to understand what we’re picking with high confidence. Our initial engineering hurdle has been very, very high.”</p><p style="">Manipulating rigid objects with rigid manipulators like chopsticks has taken Dextrous several years to perfect. “Figuring out how to get a robot to perceive and understand its environment, figure out the best item to pick, and then manipulating that item and doing all that in a reasonable length of time—that is really, really hard,” Drumwright tells us. “I’m not going to say we’ve solved that 100 percent, but it’s working very well. We still have plenty of stuff left to do, but the proof of concept of actually getting a robot that does contact-based manipulation to pick variably sized objects out of an unconstrained environment in a reasonable time period? We’ve solved that.”</p><p style="">Here’s another video showing a sustained box-handling sequence; if you watch carefully, you’ll notice all kinds of precise little motions as the robot uses its manipulators to slightly reposition boxes to give it the best grasp:</p><p class="shortcode-media shortcode-media-youtube" style="">
  36. <span class="rm-shortcode" data-rm-shortcode-id="1d4b4de7d35670283318c90b8e69a9c9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/mi7kO0QTTRs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  37. </p><p style="">All of those motions makes the robot look almost like it’s being teleoperated, but Drumwright assures me that it’s completely autonomous. It turns out that teleoperation doesn’t work very well in this context. “We looked at doing teleop, and we actually could not do it. We found that our controllers are so precise that we could not actually make the system behave better through teleop than it did autonomously.” As to <em>how</em> the robot decides what to do what it does, “I can’t tell you exactly where these behaviors came from,” Drumwright says, “Let’s just call it AI. But these are all autonomous manipulation behaviors, and the robot is able to utilize this diverse set of skills to figure out how to pick every single box.”</p><p style="">You may have noticed that the boxes in the videos are pretty beat up. That’s because the robot has been practicing with those boxes for months, but Dextrous is mindful of the fact that care is necessary, says Drumwright. “One of the things that we were worried about from the very beginning was, how do we do this in a gentle way? But our newest version of the robot has the sensitivity to be very gentle with the boxes.”</p><p style="">I asked Drumwright what would be the most difficult object for his robot to pick up. I suggested a bowling ball (heavy, slippery, spherical). “Challenging, but by no means impossible,” was his response, citing research from <a href="https://goodrobot.ai/" target="_blank">Siddhartha Srinivasa</a> at the University of Washington showing that <a href="https://goodcherrybot.github.io/" rel="noopener noreferrer" target="_blank">a robot with chopsticks can learn to do dynamic fine manipulation of spherical objects</a>. Dextrous isn’t above cheating slightly, though, by adding a thin coating of hard rubber to the chopsticks’ end effectors to add just a tiny bit of compliance—not enough to mess with planning or control, but enough to make grasping some tricky objects a little easier.</p><p class="shortcode-media shortcode-media-youtube" style="">
  38. <span class="rm-shortcode" data-rm-shortcode-id="a98597c2119567981f99b1fe609dc2fc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pXgjySxI1qQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  39. </p><p style="">By a year ago, Dextrous had shown that it could move boxes at high speeds under limited scenarios. For the past year, it has been making sure that the system can handle the full range of scenarios that it’s likely to encounter in warehouses. Up next is combining those two things—cranking the speed back up while still working reliably and autonomously.</p><p style="">“On the manipulation side, the system is fully autonomous,” Drumwright says. “We currently have humans involved in driving the robot into the container and then joysticking it forward once it’s picked all that it can reach, but we’re making that fully autonomous, too.” And the robot has so far been quite reliable, requiring little more than lubrication.<br/></p><p style="">According to Drumwright, the biggest challenge on the business side at this point is simply manufacturing enough robots, since the company builds the hardware in-house. The remaining question is how long it will take to make the transition from experiment to product. The company is starting a few commercial pilots, and Drumwright says the thing that’s slowing them down the most is building enough robots to keep up with demand.</p><p style="">“We’ve solved all of the hardest technical problems,” he says. “And now, it’s the business part.”</p>]]></description><pubDate>Tue, 07 Nov 2023 18:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/warehouse-robots</guid><category>Manipulation</category><category>Warehouse robots</category><category>Robotics</category><category>Industrial robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/two-men-stand-next-to-a-black-mechanical-robotic-frame-taller-than-they-are-which-big-chopstick-grippers-connected-to-movable-t.jpg?id=50393306&amp;width=980"></media:content></item><item><title>Robots and the Humans Who Make Them</title><link>https://spectrum.ieee.org/chatbot-podcast-2666060927</link><description><![CDATA[
  40. <img src="https://spectrum.ieee.org/media-library/a-smiling-man-kneels-with-his-arm-around-a-squat-robot-which-has-two-flipper-legs-a-boxy-body-and-glowing-eyes-on-a-flat-head.jpg?id=50300836&width=1200&height=800&coordinates=0%2C0%2C0%2C352"/><br/><br/><p>When<em> IEEE Spectrum</em> editors are putting together an issue of the magazine, a story on the website, or an episode of a podcast, we try to facilitate dialogue about technologies, their development, and their implications for society and the planet. We feature expert voices to articulate technical challenges and describe the engineering solutions they’ve devised to meet them. </p><p>So when Senior Editor Evan Ackerman cooked up a concept for a robotics podcast, he leaned hard into that idea. Ackerman, the world’s premier robotics journalist, talks with roboticists every day, and recording those conversations to turn those interviews into a podcast is usually a relatively straightforward process. But Ackerman wanted to try something a little bit different: bringing two roboticists together and just getting out of the way.</p><p style="">“The way the Chatbot podcast works is that we invite a couple of robotics experts to talk with each other about a topic they have in common,” Ackerman explains. “They come up with the questions, not us, which results in the kinds of robotics conversations you won’t hear anywhere else—uniquely informative but also surprising and fun.”</p><p>Each episode focuses on a general topic the roboticists have in common, but once they get to chatting, the guests are free to ask each other about whatever interests them. Ackerman is there to make sure they don’t wander too far into the weeds, because we want everyone to be able to enjoy these conversations. “But otherwise, I’ll mostly just be listening,” Ackerman says, “because I’ll be as excited as you are to see how each episode unfolds.”</p><p style="">We think this unique format gives the listener the inside scoop on aspects of robotics that only the roboticists themselves could get each other to reveal. Our first few episodes are already live. They include <a href="https://spectrum.ieee.org/autonomous-drones" target="_self">Skydio CEO Adam Bry and the University of Zurich professor Davide Scaramuzza</a> talking about autonomous drones, <a href="https://spectrum.ieee.org/domestic-robots" target="_self">Labrador Systems CEO Mike Dooley and iRobot chief technology officer Chris Jones</a> on the challenges domestic robots face in unpredictable dwellings, and <a href="https://spectrum.ieee.org/boston-dynamics-dancing-robots" target="_self">choreographer Monica Thomas and Amy LaViers of the Robotics, Automation, and Dance (RAD) Lab</a> discussing how to make Boston Dynamics’ robot dance. </p><p style="">We have plenty more Chatbot episodes in the works, so please subscribe on whatever podcast service you like, <a href="https://spectrum.ieee.org/podcasts/" target="_self">listen and read the transcript on our website</a>, or watch the video versions on the <a href="https://www.youtube.com/playlist?list=PL8Ug41r-ywn-R8rHLer9X5hxLRaXnuOyz" rel="noopener noreferrer" target="_blank"><em>Spectrum</em> YouTube channel</a>. While you’re at it, subscribe to our other biweekly podcast, <a href="https://spectrum.ieee.org/podcasts/fixing-the-future/" target="_blank">Fixing the Future</a>, where we talk with experts and <em>Spectrum</em> editors about sustainable solutions to climate change and other topics of interest. And we’d love to hear what you think about our podcasts: what you like, what you don’t like, and especially who you’d like to hear on future episodes.</p>]]></description><pubDate>Sat, 04 Nov 2023 15:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/chatbot-podcast-2666060927</guid><category>Podcasts</category><category>Robotics</category><dc:creator>Harry Goldstein</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-smiling-man-kneels-with-his-arm-around-a-squat-robot-which-has-two-flipper-legs-a-boxy-body-and-glowing-eyes-on-a-flat-head.jpg?id=50300836&amp;width=980"></media:content></item><item><title>Video Friday: Robots for Humanity</title><link>https://spectrum.ieee.org/video-friday-robots-for-humanity</link><description><![CDATA[
  41. <img src="https://spectrum.ieee.org/media-library/a-photograph-of-a-young-child-in-a-colorful-dress-and-an-older-woman-sitting-on-the-floor-next-to-a-mobile-robot-with-a-screen-o.png?id=50376290&width=1121&height=718&coordinates=121%2C0%2C38%2C2"/><br/><br/><p style="">Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://ssrr2023.org/">IEEE SSRR 2023</a>: 13–15 November 2023, FUKUSHIMA, JAPAN</h5><h5 style=""><a href="https://2023.ieee-humanoids.org/">Humanoids 2023</a>: 12–14 December 2023, AUSTIN, TEXAS</h5><h5 style=""><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 2 February 2024, ZURICH</h5><h5><a href="https://www.eurobot.org/">Eurobot Open 2024</a>: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote><em>An overview of ongoing work by Hello Robot, the University of Illinois Urbana-Champaign, the University of Washington, and Robots for Humanity to empower Henry Evans’s independence through the use of the mobile manipulator Stretch.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="fcdd65bc5bdd105974abd316b54cb266" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GsdRvu-2nZ4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p style="">And of course, you can read more about this project in <a href="https://spectrum.ieee.org/stretch-assistive-robot" target="_blank">this month’s issue of <em>IEEE Spectrum</em> magazine</a>.</p><p style="">[ <a href="https://hello-robot.com/">Hello Robot</a> ]</p><div class="horizontal-rule" style=""></div><blockquote><em>At KIMLAB, we have a unique way of carving Halloween pumpkins! Our MOMO (Mobile Object Manipulation Operator) is equipped with PAPRAS arms featuring prosthetic hands, allowing it to use human tools.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d53e0d4b77ee1cf7ab583073e53e22a2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/A4W6SqH5_Ew?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://publish.illinois.edu/kimlab2020/">KIMLAB</a> ]</p><div class="horizontal-rule"></div><p style="">This new haptic system from Carnegie Mellon University seems actually amazing, although watching the haptic arrays pulse is wigging me out a little bit for some reason.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="df7089628ec802c70e4877b099748e94" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UJXLBqG9E_s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.figlab.com/research/2023/FluidReality">Fluid Reality Group</a> ]</p><div class="horizontal-rule"></div><blockquote><em>We are excited to introduce you to the Dingo 1.5,  the next generation of our popular Dingo platform!  With enhanced hardware and software updates, the Dingo 1.5 is ready to tackle even more challenging tasks with ease.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6a29e38237128a65322e2f2b306d75d3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/MKMav31GWv8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://clearpathrobotics.com/dingo-indoor-mobile-robot/">Clearpath</a> ]</p><div class="horizontal-rule"></div><p>A little bit of a jump scare here from ANYbotics.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="afcbe782cbac3cc19d0f947c19715f24" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0pyfUQG4Yts?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.anybotics.com/">ANYbotics</a> ]</p><div class="horizontal-rule"></div><p>Happy haunting from Boston Dynamics!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d7dd644a3e299024685d62e3bd3ac811" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/KfkDg8KE_JY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/products/spot/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><p style="">I’m guessing this is some sort of testing setup, but it’s low-key terrifying.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="eb4b10a2e1f6e4008c5f4f765acf73a9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pzewNZk-B04?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.flexiv.com/en/">Flexiv</a> ]</p><div class="horizontal-rule" style=""></div><blockquote><em>KUKA has teamed up with Augsburger Puppenkiste to build a mobile show cell in which two robots do the work of the puppeteers.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="aec7d3712305b46925b1be244a2982af" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/KeAUdbaDEvE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.kuka.com/HomeOfRobotik">KUKA</a> ]</p><div class="horizontal-rule"></div><blockquote><em>In this video, we showcase the Advanced Grasping premium software package’s capabilities. We demonstrate how TIAGo collects objects and places them, how the gripper adapts to different shapes, and the TIAGo robot’s perception and manipulation capabilities.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="869171dfff55d84b63d2e06bbe746a2b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/n_dbm0gttP8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://pal-robotics.com/robots/tiago/">PAL Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote><em>HEBI Robotics produces a platform for robot development.  Our long-term vision is to make it easy and practical for any worker, technician, farmer, et cetera, to create robots as needed.   Today the platform is used by researchers around the world, and HEBI is using it to solve challenging automation tasks related to inspections and maintenance.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="25c6bf6e6322dc57f52882ef17bfa32e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/4Y6Pw6sUaBY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.hebirobotics.com/">HEBI Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Folded robots are a rapidly growing field that is revolutionizing how we think about robotics.  Taking inspiration from the ancient art of origami results in thinner, lighter, more flexible autonomous robots.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5c06f4339e5314d50c9bcc0e5198fb2b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/_94RMdIqW9g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p style="">[ <a href="https://www.youtube.com/@NSFScience">NSF</a> ]</p><div class="horizontal-rule"></div><blockquote>Can I Have a Pet T. rex? <em>is a short interdisciplinary portrait documentary featuring the paleontologist and Kod*lab postdoc Aja Mia Carter and the Kod*lab robotics researchers postdoc Wei-Hsi Chen and Ph.D. student J. Diego Caporale. Chen applies the art of origami to make a hopping robot, while Caporale adds a degree of freedom to the spine of a quadruped robot to interrogate ideas about twisting and locomotion. An expert in the evolution of tetrapod spines from 380 million years ago, Carter is still motivated by her childhood dream of a pet T. rex. But how can these robotics researchers get her closer to her vision?</em></blockquote><p class="shortcode-media shortcode-media-youtube" style=""><span class="rm-shortcode" data-rm-shortcode-id="f82bc82877ff3e281c9ef13077e747fe" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/VkKuxDJ0Xtg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://kodlab.seas.upenn.edu/">Kodlab</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 03 Nov 2023 15:16:31 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robots-for-humanity</guid><category>Anybotics</category><category>Boston dynamics</category><category>Hello robot</category><category>Video friday</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-photograph-of-a-young-child-in-a-colorful-dress-and-an-older-woman-sitting-on-the-floor-next-to-a-mobile-robot-with-a-screen-o.png?id=50376290&amp;width=980"></media:content></item><item><title>Video Friday: ChatSpot</title><link>https://spectrum.ieee.org/video-friday-chatspot</link><description><![CDATA[
  42. <img src="https://spectrum.ieee.org/media-library/a-yellow-robot-dog-has-a-black-and-yellow-arm-mounted-on-its-head-where-the-gripper-of-the-arm-is-decorated-with-eyes-and-color.jpg?id=50315578&width=1200&height=800&coordinates=150%2C0%2C150%2C0"/><br/><br/><p style="">Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://ssrr2023.org/">IEEE SSRR 2023</a>: 13–15 November 2023, FUKUSHIMA, JAPAN</h5><h5 style=""><a href="https://2023.ieee-humanoids.org/">Humanoids 2023</a>: 12–14 December 2023, AUSTIN, TEXAS.</h5><h5 style=""><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 02 February 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p>The process of getting Spot to talk with a personality is very cool, but this is also something that should be done very carefully: Spot is a tool, and although it may sound like it thinks and feels, it absolutely doesn’t. Just something to keep in mind as more Spots (and other robots) make it out into the wild.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b114125caedf9305fb06571d19395175" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/djzOBZUFzTw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/blog/robots-that-can-chat/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><p>Shhh. Be vewy, vewy quiet.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a3e0746e9ebad0a9dc36d00b5428b191" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/kxQW5XxxmGk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://aps.arxiv.org/abs/2310.03743">Paper</a> ]</p><div class="horizontal-rule"></div><blockquote><em>This video presents the remarkable capabilities of the TALOS robot as it demonstrates agile and robust walking using Model Predictive Control (MPC) references sent to a Whole-Body Inverse Dynamics (WBID) controller developed in collaboration with Dynamograde.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a873198f685a278e2830d5a635414b07" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/NO4Uu_Rq-EQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://pal-robotics.com/robots/talos/">PAL Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Dr. Hooman Samani from the Creative Robotics Lab at the University of the Arts London writes, “The idea is to show how robots can be beyond traditional use and involve more people in robotics such as artists as we do at our university. So we made this video to show how a co-bot can be used as a DJ and people and robots dance together to the robot DJ in a robot dance party!”</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cddc4f407fc9b602a469d9d07b0e74c0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/cMpHrJPmA78?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.arts.ac.uk/creative-computing-institute">London CCI</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Future robots should perform multiple and various tasks, instead of simple pick-and-place operations. In this video, Dino Robotics demonstrates the functionalities in their software solution: it cooks a steak! Bon Appétit!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ac4274d85d42da03ef070aaf9fdf5581" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/svXBzlUCAPk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dino-robotics.com/?lang=en">Dino Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote><em>This video presents a novel perching and tilting aerial robot for precise and versatile power-tool work on vertical walls. The system was developed as part of the AITHON ETH Zürich Bachelor student focus project and presented at IEEE IROS 2023. It combines a compact integrated perching drone design with a concrete drill’s heavy payload and reaction forces. </em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="09173a3903e0f174e3018c4d69fe7da6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/kGuKkIqlAqg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2310.10548">Paper</a> ]</p><div class="horizontal-rule"></div><p style="">This is what very high precision, very useful robotics looks like.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4e940fd68a23263307693cd53f120b46" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Vjk5mKSo5_U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dustyrobotics.com/">Dusty</a> ]</p><div class="horizontal-rule"></div><p>I never thought I’d write this sentence, but here is some video of a failing robotic mudskipper sex doll.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="85f1ad5917041974e32f4e540a7d2a52" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/QBdKXOLqmkA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pbs.org/wnet/nature/">Nature</a> ]</p><div class="horizontal-rule"></div><p>Good aim on this drone considering that its landing pad is speeding along at 20 knots.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5bfac1ad8940fd9f03a7461983488d79" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/KKEEu1ZUkgU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.avinc.com/uas/jump-20">AeroVironment</a> ]</p><div class="horizontal-rule"></div><p style="">From the people responsible for the giant gundam in Japan comes this very big and very slow rideable quadruped thing.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="779a9a784fb0d8c4f1d200568f7cb64a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/NgFR3IRT4sk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotstart.info/2023/10/26/sansei-new-sr02-jms2023.html">Robotstart</a> ]</p><div class="horizontal-rule"></div><p>RoboCup 2024 will be in Eindhoven in July!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="588c3ccc127f2a25adae2810f37f0260" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Z5vLekUfpS4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://2024.robocup.org/">RoboCup</a> ]</p><div class="horizontal-rule"></div><blockquote><em>A brief look into the 2023 IEEE RAS Summer School on Multi-Robot Systems, which took place in July 2023 in Prague.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4ea8001f8629b8586828edcec143762a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8-ajKIKAlss?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://mrs.felk.cvut.cz/summer-school-2024/">CTU</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Lava caves on Mars and particularly on the moon are not only interesting for exo-geologists and other space scientists, but they also could be used as storage rooms or even habitats for future human settlements. The question is how to access and explore these huge cavities under the lunar surface without risking the lives of astronauts. This is where robots, or rather teams of robots, come into play.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f5b22448022ea84f84ed925a10723c28" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/YNkbi8ta3yw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.corob-x.eu/">DFKI</a> ]</p><div class="horizontal-rule"></div><blockquote><em>The rise of recent Foundation models (and applications e.g. ChatGPT) offer an exciting glimpse into the capabilities of large deep networks trained on Internet-scale data. In this talk, I will briefly discuss some of the lessons we’ve learned while scaling real robot data collection, how we’ve been thinking about Foundation models, and how we might bootstrap off of them (and modularity) to make our robots useful sooner.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a077df6b22feeaecf019f50e0d9728a1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/v3ggjaBQeeo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.grasp.upenn.edu/events/fall-2023-grasp-sfi-andy-zeng/">UPenn</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 27 Oct 2023 18:18:54 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-chatspot</guid><category>Boston dynamics</category><category>Dfki</category><category>Robocup</category><category>Humanoid robots</category><category>Quadruped robots</category><category>Robotics</category><category>Video friday</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-yellow-robot-dog-has-a-black-and-yellow-arm-mounted-on-its-head-where-the-gripper-of-the-arm-is-decorated-with-eyes-and-color.jpg?id=50315578&amp;width=980"></media:content></item><item><title>Roombas at the End of the World</title><link>https://spectrum.ieee.org/south-pole-roombas</link><description><![CDATA[
  43. <img src="https://spectrum.ieee.org/media-library/a-photo-of-two-men-from-the-shoulders-down-standing-and-pointing-nerf-guns-at-a-roomba-on-a-pedestal-between-them.jpg?id=49795812&width=1200&height=800&coordinates=87%2C0%2C88%2C0"/><br/><br/><p style="">
  44. <a href="https://www.nsf.gov/geo/opp/support/southp.jsp" target="_blank">Amundsen–Scott South Pole Station</a> is a permanent scientific research base located at what is arguably the most isolated place on Earth. During the austral summer, the station is home to about 150 scientists and support staff, but during the austral winter, that number shrinks to just 40 or so, and those people are completely isolated from the rest of the world from mid-February until late October. For eight months, the station has to survive on its own, without deliveries of food, fuel, spare parts, or anything else. Only in <a href="https://www.southpolestation.com/trivia/10s/medevac.html" rel="noopener noreferrer" target="_blank">the most serious of medical emergencies</a> will a plane attempt to reach the station in the winter.
  45. </p><p style="">
  46. While the station’s humans rotate seasonally, there are in fact four full-time residents: the South Pole Roombas. First, there was Bert, a Roomba 652, who arrived at the station in 2018 and was for a time the loneliest robot in the world. Since the station has two floors, Bert was joined by Ernie, a Roomba 690, in 2019. A second pair of Roombas, Sam and Frodo, followed soon after.
  47. </p><p style="">
  48. These Roombas are at the South Pole to do what Roombas do: help keep the floors clean. But for the people who call the South Pole home for months on end, it turns out that these little robots have been able to provide some much-needed distraction in a place where things stay more or less the same all of the time, and where pets, plants, and even dirt is explicitly outlawed by the Antarctic Treaty in the name of ecological preservation.
  49. </p><hr/><p style="">
  50. For the last year, an anonymous IT engineer has been <a href="https://brr.fyi/" rel="noopener noreferrer" target="_blank">blogging about his experiences</a>, working first at McMurdo Station (on the Antarctic coast south of New Zealand), and later at Amundsen–Scott South Pole Station, where he’s currently spending the winter as part of the station’s support staff. <a href="https://brr.fyi/" rel="noopener noreferrer" target="_blank">His blog</a> includes mundane yet fascinating accounts of what day-to-day life is like at the South Pole, including <a href="https://brr.fyi/posts/showering-at-the-south-pole" rel="noopener noreferrer" target="_blank">how showering works</a> (four minutes per person per week), <a href="https://brr.fyi/posts/south-pole-electrical-infrastructure" rel="noopener noreferrer" target="_blank">where the electricity comes from</a> (a huge amount of aviation fuel hauled over land from the coast that will power generators), and <a href="https://brr.fyi/posts/the-last-egg" rel="noopener noreferrer" target="_blank">the fate of the last egg for five months</a> (over medium with salt and pepper).</p><p style="">The engineer also devoted an entire post to <a href="https://brr.fyi/posts/south-pole-signage" rel="noopener noreferrer" target="_blank">signage at the South Pole</a>, at the very end of which was this picture, which raised some questions for me:</p><p class="shortcode-media shortcode-media-rebelmouse-image" style="">
  51. <img alt='A close up picture of  the top of a Roomba showing some small eye stickers, a sticker with the name "Ernie," and a sticker that says "it was so cold."' class="rm-shortcode" data-rm-shortcode-id="0a7f8776a0ea79c2233e23a049a980cc" data-rm-shortcode-name="rebelmouse-image" id="c2dd8" loading="lazy" src="https://spectrum.ieee.org/media-library/a-close-up-picture-of-the-top-of-a-roomba-showing-some-small-eye-stickers-a-sticker-with-the-name-ernie-and-a-sticker-that.jpg?id=49795821&width=980"/>
  52. <small class="image-media media-caption" placeholder="Add Photo Caption...">Ernie is a Roomba living at the South Pole.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://brr.fyi/posts/south-pole-signage" target="_blank">brr.fyi</a></small></p><p style="">
  53. Ernie, it turns out, has had a dramatic and occasionally harrowing life at the South Pole station. After Ernie arrived in 2019 to clean one floor of the station, lore began to develop that Ernie and its partner Bert (tasked with cleaning the floor above) were “<a href="https://twitter.com/weblogarithms/status/1474984238097125379" rel="noopener noreferrer" target="_blank">star-crossed lovers, forever separated by the impenetrable barrier of the staircase</a>.” That quote comes from <a href="https://amylowitz.com/" rel="noopener noreferrer" target="_blank">Amy Lowitz</a>, a member of the <a href="https://pole.uchicago.edu/public/Home.html" rel="noopener noreferrer" target="_blank">South Pole Telescope</a> team, who <a href="http://amylowitz.com/SouthPole/" rel="noopener noreferrer" target="_blank">overwintered at the pole in 2016</a> and has spent many summers there. “I think I made that joke every year when a new group of people comes to the pole for the summer,” Lowitz tells <em>I</em><em>EEE Spectrum</em>. “There’s only so many things to talk about, so eventually the Roombas come up in conversation.” Happily for Ernie, Lowitz says that it’s now on the same floor as Bert, with the new Roombas Sam and Frodo teaming up on the floor below.
  54. </p><p style="">
  55. But Ernie’s presumed joy at finally being united with Bert was not to last—in January of 2020, Ernie went missing. The <a href="https://twitter.com/SPTelescope" target="_blank">Twitter account of the South Pole Telescope</a> posted photos pleading for Ernie’s return, and a small memorial appeared at Ernie’s docking station.
  56. </p><p class="shortcode-media shortcode-media-rebelmouse-image" style="">
  57. <img alt="A Twitter post shows two pictures, including a lost poster for the Roomba Ernie, and flowers near a sign for Ernie." class="rm-shortcode" data-rm-shortcode-id="6ce084ff14ba8ee49d0d9cdfc43b4bfb" data-rm-shortcode-name="rebelmouse-image" id="7ce63" loading="lazy" src="https://spectrum.ieee.org/media-library/a-twitter-post-shows-two-pictures-including-a-lost-poster-for-the-roomba-ernie-and-flowers-near-a-sign-for-ernie.jpg?id=49813864&width=980"/>
  58. <small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://twitter.com/SPTelescope" rel="noopener noreferrer" target="_blank">SPT</a></small></p><p style="">Soon, things took a more sinister (amusingly sinister) turn. Kyle Ferguson is a South Pole Telescope team member who was at the station in the summer of 2020 when Ernie went missing, and has vivid memories of the drama that ensued:</p><blockquote>I believe it started with just one poster that went up outside of the galley, with a picture of two people calling themselves the Cookie Monsters posing in balaclavas and standing on a staircase holding Ernie. It said something like, “If you ever want to see Ernie alive again, leave a tray of chocolate chip cookies in such and such location and we will return him safely.” So that was the initial ransom.</blockquote><p class="shortcode-media shortcode-media-rebelmouse-image" style="">
  59. <img alt="A twitter post showing two people with nerf guns and a roomba on ransom notes." class="rm-shortcode" data-rm-shortcode-id="1351a4f49e0b2fffec379d5cf564b60f" data-rm-shortcode-name="rebelmouse-image" id="8a0b2" loading="lazy" src="https://spectrum.ieee.org/media-library/a-twitter-post-showing-two-people-with-nerf-guns-and-a-roomba-on-ransom-notes.jpg?id=49813876&width=980"/>
  60. <small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://twitter.com/SPTelescope" rel="noopener noreferrer" target="_blank">SPT</a></small></p><blockquote>As tends to happen in a community like this, things sort of took off from there—everybody ran with it in their own direction. So, on that wall outside of the galley, there evolved a narrative where people were trying to mount rescue missions, and there were sign-up sheets for that. And there were people saying, “We won’t negotiate with you until you provide proof of life.”<br/><br/>Down the hallway, there was another narrative where people had assumed the worst: that the kidnappers had ended poor Ernie’s life prematurely. So the memorial that had sprung up for Ernie next to one of the water fountains grew. There were fake flowers and Tootsie rolls, and some people put some trash there, just in homage—trash that Ernie would never be able to sweep up. I even ended up writing a parody of the song “5,000 Candles in the Wind” from “Parks and Recreation” for Ernie, and singing it at an open-mic night.</blockquote><p class="shortcode-media shortcode-media-rebelmouse-image" style="">
  61. <img alt="A Twitter post shows two roombas, flowers, and signs." class="rm-shortcode" data-rm-shortcode-id="bf2d93ded71273058144006c142013b6" data-rm-shortcode-name="rebelmouse-image" id="57cf7" loading="lazy" src="https://spectrum.ieee.org/media-library/a-twitter-post-shows-two-roombas-flowers-and-signs.jpg?id=49813889&width=980"/>
  62. <small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://twitter.com/SPTelescope/status/1224166990941761536" target="_blank">SPT</a></small></p><blockquote>But Ernie did come back. Those of us who believed that he had perished (I was one of those) were in the wrong. Someone claimed that the cookies had been delivered and that the kidnappers should give Ernie back, and then there was a poster that went up that said Ernie was found abandoned underneath one of the staircases. He was rescued and revived by the Cookie Monsters. So, the kidnappers sort of got credit for saving him in the end.</blockquote><p style="">Ferguson suspects that Ernie’s “IT WAS SO COLD” sticker was acquired after the robot’s brief trip outside with the kidnappers. Summer temperatures at the south pole average around -28 °C, substantially below the operating temperature of a Roomba, although when we spoke to Ferguson for this article during the South Pole winter, it was closer to -80 °C outside the station, including wind chill.</p><p>The harsh weather and isolation may help explain why Ernie and his Roomba brethren get so much attention from the station residents. “There’s more to do at the South Pole than people think,” Amy Lowitz tells us, “but you’re still pretty much within a half mile radius of the main station, all of the time. So people get a little bored and a little stir crazy, and we look for <a href="https://www.wiffa.aq/en/film/1049" rel="noopener noreferrer" target="_blank">new and strange ways to entertain ourselves</a>. The ransom notes were just some goofy hijinks from some bored people at the South Pole.”</p><p style="">Lowitz also remembers a party where either Bert or Ernie was drafted as a DJ, with a Bluetooth speaker and some fancy lighting. “We had it running around up on a table so that people wouldn’t trip over it,” she recalls. And as recently as this winter, says Kyle Ferguson, a befurred Roomba could be seen on station: “Someone put up a silly ‘lost cat’ poster earlier in the winter, with a picture not even of a cat but of like a raccoon or something. And then someone else took that and decided to run with it, so they had this fake raccoon fur that they put to the top of one of the Roombas and sent it out to wander the hallways.”</p><p class="shortcode-media shortcode-media-rebelmouse-image" style="">
  63. <img alt="A photo of a Roomba that is mostly covered by the skin of a raccoon." class="rm-shortcode" data-rm-shortcode-id="46a03f0ee44e5025841967c05f1c91b6" data-rm-shortcode-name="rebelmouse-image" id="6f678" loading="lazy" src="https://spectrum.ieee.org/media-library/a-photo-of-a-roomba-that-is-mostly-covered-by-the-skin-of-a-raccoon.jpg?id=49795866&width=980"/>
  64. <small class="image-media media-caption" placeholder="Add Photo Caption...">Sam, the “station cat.”</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Kyle Ferguson</small></p><p style="">Covering a Roomba with fur may be getting the robot a little closer to what people at the South Pole are actually missing, suggests Lowitz: “My guess is that at least some Polies [i.e. South Pole residents] are into the Roombas because we’re not allowed to have pets at the South Pole, and when there are these little Roombas running around, it’s sort of close. People do odd things at that altitude [the <a href="https://brr.fyi/posts/pressure-altitude" target="_blank">pressure altitude</a> at the South Pole is nearly 3,500 meters], and when they miss home…a Roomba is just like a cute little thing to personify and pay attention to.”<br/></p><p style="">Ferguson agrees. “We all miss our pets down here. Sometimes we joke about trying to smuggle down a puppy or a kitten even though it’s a huge violation of the Antarctic Treaty. One of the things that I think gives the Roombas some of their charm is how they keep running into walls. If I was to ascribe a personality to them, it would be kind of dumb and aloof, which evokes some of those pet memories—maybe like the time that your dog ate something it shouldn’t have.”<br/></p><p class="shortcode-media shortcode-media-rebelmouse-image" style="">
  65. <img alt="A close up photo of a Roomba on a floor sprinkled with popcorn pieces." class="rm-shortcode" data-rm-shortcode-id="6bbc1b13e9ed2c5960678bcf7ee1a5db" data-rm-shortcode-name="rebelmouse-image" id="3e19c" loading="lazy" src="https://spectrum.ieee.org/media-library/a-close-up-photo-of-a-roomba-on-a-floor-sprinkled-with-popcorn-pieces.jpg?id=49795867&width=980"/>
  66. <small class="image-media media-caption" placeholder="Add Photo Caption...">Ernie is currently living underneath a popcorn machine.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Kyle Ferguson</small></p><p style="">Sadly, we’ve heard that the South Pole Roombas are not at their Roomb-iest right now. They’re not as young as they used to be, and getting spare parts (like new batteries) is only possible during the austral summer and requires a lead time of six months. We’ll be checking in on Bert, Ernie, Sam, and Frodo toward the end of the year once the Amundsen–Scott South Pole Station reopens for the austral summer. But for now, please enjoy the lyrics to Kyle Ferguson’s Ernie-themed “5000 Candles in the Wind” parody, adapted from ”<a href="https://www.youtube.com/watch?v=mjKR-HAUnz4" rel="noopener noreferrer" target="_blank">5,000 Candles in the Wind</a>” from ”<a href="https://www.youtube.com/watch?v=WEv0bl7oodU&t=162s" rel="noopener noreferrer" target="_blank"><em>Parks and Recreation</em></a><em>.”</em></p><p><em><br/></em></p><p style="text-align: center;"><em></em><em>Up in Roomba Heaven, here’s the thing;</em></p><p style="text-align: center;"><em>You trade your wheels for angel’s wings,</em></p><p style="text-align: center;"><em>And once we’ve all said goodbye,</em></p><p style="text-align: center;"><em>You stop running into walls and you learn to fly.</em></p><p style="text-align: center;"><em><br/></em></p><p style="text-align: center;"><em>Bye-bye, Roomba Ernie.</em></p><p style="text-align: center;"><em>You were taken from us too early.</em></p><p style="text-align: center;"><em>Bye-bye, Roomba Ernie.</em></p><p style="text-align: center;"><em>You’re 5,000 candles in the wind.</em></p><p style=""><em><br/></em></p><p style="text-align: center;"><em>Though we all miss you everyday,</em></p><p style="text-align: center;"><em>We know you’re up there cleaning heaven’s waste.</em></p><p style="text-align: center;"><em>Here’s the part that hurts the most:</em></p><p style="text-align: center;"><em>Humans cannot recharge a ghost.</em></p><p style=""><em><br/></em></p><p style="text-align: center;"><em>Bye-bye, Roomba Ernie.</em></p><p style="text-align: center;"><em>You were taken from us too early.</em></p><p style="text-align: center;"><em>Bye-bye, Roomba Ernie.</em></p><p style="text-align: center;"><em>You’re 5,000 candles in the wind.</em></p><p style=""><em><br/></em></p><p style="text-align: center;"><em>EVERYBODY NOW!</em></p><p style="text-align: center;"><em>Bye-bye, Roomba Ernie.</em></p><p style="text-align: center;"><em>You were taken from us too early.</em></p><p style="text-align: center;"><em>Bye-bye, Roomba Ernie.</em></p><p style="text-align: center;"><em>You’re 5,000 candles in the wind.</em></p><p style="text-align: center;"><em>Maybe some day you’ll clean these halls again.</em></p><p style="text-align: center;"><em>And I know I’ll always miss my Roomb-iest friend.</em></p><p style=""><em><br/></em></p><p style="text-align: center;"><em>Spread your wings and fly.</em></p><p style=""><em><br/></em></p><p style=""><em></em>Special thanks to the National Science Foundation, <a href="https://brr.fyi/" target="_blank">brr.fyi</a>, and the Polies that we spoke to for this article. And if you’d like even more South Pole winter shenanigans, there’s an <a href="https://www.wiffa.aq/en" target="_blank">Antarctic Film Festival</a> open to all of the research stations in Antarctica. Kyle Ferguson stars in <em>John Wiff</em>, an action movie that was written, filmed, and produced in just 48 hours, and you can watch it <a href="https://www.wiffa.aq/en/film/1049" target="_blank">here</a> (mildly NSFW for a truly astonishing amount of Nerf gun violence).</p>]]></description><pubDate>Thu, 26 Oct 2023 14:23:19 +0000</pubDate><guid>https://spectrum.ieee.org/south-pole-roombas</guid><category>Antarctica</category><category>Irobot</category><category>Roomba</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-photo-of-two-men-from-the-shoulders-down-standing-and-pointing-nerf-guns-at-a-roomba-on-a-pedestal-between-them.jpg?id=49795812&amp;width=980"></media:content></item><item><title>Exploring Sydney’s Deep Tech Ecosystem</title><link>https://spectrum.ieee.org/sydney-deep-tech</link><description><![CDATA[
  67. <img src="https://spectrum.ieee.org/media-library/aerial-view-of-buildings-with-green-rooftops-in-sydney-australia.jpg?id=49265589&width=1200&height=800&coordinates=120%2C0%2C121%2C0"/><br/><br/><p style=""><em><strong></strong>This sponsored article is brought to you by <a href="https://www.besydney.com.au/change-starts-here-tech-innov/?utm_source=ieee&utm_medium=article&utm_campaign=change-starts-here&utm_content=tech" rel="noopener noreferrer" target="_blank">BESydney</a>.</em></p><p style="">In the dynamic landscape of Australian technology, market advancements are often attributed to consumer-focused products like <a href="https://www.canva.com/" rel="noopener noreferrer" target="_blank">Canva</a> and <a href="https://www.afterpay.com/" rel="noopener noreferrer" target="_blank">Afterpay</a>. Capturing headlines and attention with their renowned success stories, these, along with other global companies like <a href="https://www.atlassian.com/" rel="noopener noreferrer" target="_blank">Atlassian</a>, Facebook, and Apple, have become the face of the tech industry.</p><p style="">The accomplishments of these companies are remarkable. They generate immense wealth for stakeholders and employees and boast a staggering market value. But this high-profile side of the industry is just the tip of the iceberg. Deep tech – characterised by breakthrough scientific innovations – is where hidden impacts take place. Beneath the surface of these tech giants lies a thriving industry dedicated to researching and developing solutions that address large-scale problems, with a profound effect on society.</p><h3></h3><br/><iframe class="rm-shortcode" data-rm-shortcode-id="77a7f732161c320054f59866060fcb8c" frameborder="0" height="480" scrolling="no" src="https://player.vimeo.com/video/866669978" width="100%"></iframe><h3></h3><br/><h2>The power of deep tech</h2><p>The tech industry in Australia is a powerhouse, employing one in 16 Australians and ranking as the country’s third-largest industry. In 2021, it accounted for 8.5 percent of the GDP, an undeniably significant contribution to the nation’s economy.</p><h3></h3><br/><p>For nearly two decades, <a href="https://www.besydney.com.au/change-starts-here-tech-innov/?utm_source=ieee&utm_medium=article&utm_campaign=change-starts-here&utm_content=tech" rel="noopener noreferrer" target="_blank">Sydney has also nurtured a thriving community</a> of resilient problem solvers, quietly pushing the boundaries of scientific discovery. While consumer-focused tech giants often steal the spotlight, it is imperative to recognize the profound impact of deep tech solutions that operate behind the scenes.</p><p>From eco-friendly fabric manufacturing and hydrogen storage to molecular diagnostics and sustainable alternatives to plastics, Sydney’s brightest minds are tackling some of the world’s most pressing challenges.</p><h3></h3><br/><h2>The transformation of deep tech startups</h2><p>Navigating the deep tech landscape is no small feat. These enterprises offer long-term solutions to pressing global challenges – a benefit that cannot be ignored – but deep tech innovations require significant time for research and development, often incubating for years before reaching the market.</p><p>They demand substantial investment and unwavering focus. Finding the right path to commercialization is paramount. Thankfully, incubators are emerging as champions in successfully transforming deep tech startups into thriving businesses.</p><h3></h3><br/><p>“Sydney’s DNA demands a deep-rooted vision, an unwavering belief in problem-solving, and the determination to persevere despite challenges.” <strong>—Sally-Ann Williams, Cicada Innovations</strong></p><h3></h3><br/><p><a href="https://www.cicadainnovations.com/" target="_blank">Cicada Innovations</a> is Australia’s oldest and largest deep tech incubator. It knows better than anyone the extent to which Australia’s deep tech evolution hinges on the power of startups. With over 365 resident companies incubated, over $1.7 billion raised, over $1.4 billion exits, and over 900 patents filed, these dynamic ventures are already spearheading groundbreaking advancements.<br/></p><p>It’s creating intelligent robots and pioneering scaled drone delivery to minimize environmental impacts in transportation. It’s slashing the cost of cancer drugs, offering hope for prolonged lifespans and alleviating suffering. And it’s crafting innovative farming tools to enhance agricultural yields and contribute to global food security.</p><h3></h3><br><img alt="Cicada Innovations chief executive Sally-Ann Williams, wearing a white jacket, smiles at the camera, with a blurry office seen in the background." class="rm-shortcode" data-rm-shortcode-id="687bbddeaf7d5ca13b49830524e01de3" data-rm-shortcode-name="rebelmouse-image" id="5fcd1" loading="lazy" src="https://spectrum.ieee.org/media-library/cicada-innovations-chief-executive-sally-ann-williams-wearing-a-white-jacket-smiles-at-the-camera-with-a-blurry-office-seen-i.jpg?id=49265534&width=980"/><h3></h3><br/><h2>A thriving hub for deep tech innovation</h2><h3></h3><br/><p>With its vibrant ecosystem, Sydney emerges as an ideal hub for unveiling and further developing deep tech innovations. The Australian spirit, shaped by resilience and problem-solving, thrives in this city. Sally-Ann Williams, chief executive of Cicada Innovations, affirms that “Sydney’s DNA demands a deep-rooted vision, an unwavering belief in problem-solving, and the determination to persevere despite challenges.”</p><p>The city offers a supportive community, facilitating connections and access to the talent necessary for entrepreneurs to pursue their dreams. It’s this unique blend of ingredients that fuels the growth of deep tech companies, propelling them toward success.</p><h3></h3><br/><img alt="A woman holds a set of vials while another woman observes, with a medical device seen in the background." class="rm-shortcode" data-rm-shortcode-id="f88e5fcac46996749d6cac25d2c57ec8" data-rm-shortcode-name="rebelmouse-image" id="47b3c" loading="lazy" src="https://spectrum.ieee.org/media-library/a-woman-holds-a-set-of-vials-while-another-woman-observes-with-a-medical-device-seen-in-the-background.jpg?id=49266004&width=980"/><h3></h3><br/><h2>Discover deep tech at Tech Central</h2><p>Deep tech is just one facet of what’s happening at Tech Central. While we shed light on these industry accomplishments and celebrated breakthroughs, it’s crucial to support and foster the growth of a wider industry: one that thrives on resilience, problem-solving, and visionary entrepreneurship.</p><p>Sydney – with its unique blend of community, talent, and resources – stands at the forefront of this transformative revolution, ready to propel tech innovation for the benefit of all.</p><p>For more information on Sydney’s Tech Industry and hosting your next conference in Sydney, visit <a href="https://www.besydney.com.au/change-starts-here-tech-innov/?utm_source=ieee&utm_medium=article&utm_campaign=change-starts-here&utm_content=tech" rel="noopener noreferrer" target="_blank">besydney.com.au</a>.</p><h3>A Closer Look at Deep Tech Innovators</h3><br/><p>
  68. To truly grasp the essence of deep tech, we must explore the stories of individuals and companies that are driving change. Here are a few examples of how deep tech is flourishing at Tech Central:<br/>
  69. </p><h5>Xefco: A sustainable textile revolution</h5><p>
  70. <a href="https://www.xefco.com/" target="_blank">Xefco</a> is a groundbreaking new materials company revolutionizing fabric manufacturing. Its innovative process significantly reduces water usage by up to 90% and eliminates the need for dyes and harsh chemicals. Traditionally, textile mills worldwide have polluted rivers and harmed local communities – Xefco aims to transform the textile industry, benefitting both the environment and economically disadvantaged communities worldwide.
  71. </p><h5>Rux: Empowering the hydrogen economy</h5><p>
  72. Another trailblazing company in Sydney’s deep tech ecosystem,
  73. <a href="https://ruxenergy.com/" rel="noopener noreferrer" target="_blank">Rux Energy</a> is tackling the challenge of hydrogen storage. Hydrogen presents immense potential in the energy transition movement, but efficient and scalable storage solutions are essential for its widespread adoption. Rux is developing new materials and technologies to store hydrogen more effectively, paving the way for a cleaner and more sustainable future.
  74. </p><h5>SpeeDX: Revolutionising molecular diagnostics</h5><p>
  75. Amidst the global pandemic,
  76. <a href="https://plexpcr.com/" rel="noopener noreferrer" target="_blank">SpeeDX</a>, a Sydney-based company, emerged as a key player in molecular diagnostic testing and antimicrobial resistance. SpeeDX aims to address the rising concern of antibiotic overuse by providing personalized recommendations for effective treatment. This groundbreaking technology has far-reaching implications, reducing unnecessary antibiotic usage, minimizing the risk of antimicrobial resistance, and safeguarding public health on a global scale.
  77. </p></br>]]></description><pubDate>Sun, 22 Oct 2023 19:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/sydney-deep-tech</guid><category>Deep tech</category><category>Ai</category><category>Robotics</category><category>Internet of things</category><category>Fintech</category><category>Blockchain</category><category>Virtual reality</category><category>Atlassian</category><category>Canva</category><category>Afterpay</category><category>Innovation</category><category>Startups</category><dc:creator>BESydney</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/aerial-view-of-buildings-with-green-rooftops-in-sydney-australia.jpg?id=49265589&amp;width=980"></media:content></item><item><title>Video Friday: Welcome to Fall</title><link>https://spectrum.ieee.org/video-friday-welcome-to-fall</link><description><![CDATA[
  78. <img src="https://spectrum.ieee.org/media-library/video-loop-of-a-teal-and-gray-humanoid-robot-trying-to-right-itself-in-front-of-a-banner-that-says-made-for-work.gif?id=50013820&width=1200&height=800&coordinates=19%2C0%2C19%2C0"/><br/><br/><p style="">Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5><a href="https://ssrr2023.org/">IEEE SSRR 2023</a>: 13–15 November 2023, FUKUSHIMA, JAPAN</h5><h5 style=""><a href="https://2023.ieee-humanoids.org/">Humanoids 2023</a>: 12–14 December 2023, AUSTIN, TEX.</h5><h5><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 02 February 2024, ZURICH, SWITZERLAND</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote><em>Digit, our human-centric robot, can now self-right and stand back up after it falls. This is footage from our testing lab, where we intentionally disable the perception systems that would normally avoid/adjust to obstacles preventing Digit from falling. For the purposes of this test, we force Digit to fall in a controlled environment to demonstrate our new self-righting and recovering ability.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4e83997690bbbc7544471541034ed374" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wapeP93Q5ng?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://agilityrobotics.com/">Agility</a> ]</p><div class="horizontal-rule"></div><blockquote><em>With our multipick functionality, Stretch is unlocking the next level of automated unloading. Stretch can now move multiple boxes with a single swing of the arm. In typical shipping containers filled with thousands of boxes, the robot is hitting significantly higher rates of productivity.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8b4d59555321056983db2487f17c7c70" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/9J0M6kMjX6c?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/products/stretch/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><p>The moral of this video is to always give your robots a gentle pat on the sensors when they do a good job at a challenging task.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="47f2766f45245e8034ca71e261c30e74" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/AQgYuhYJJ6U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.anybotics.com/robotics/anymal/">ANYbotics</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Since their mass production in the early 2000s, vacuum robots have emerged as highly successful commercial products in the field of home automation. At KIMLAB, we have implemented a mobile manipulator based on a vacuum robot and an add-on mechanism by employing our PAPRAS (Plug-And-Play Robotic Arm System). </em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="223b26d28768a305f5972fbb092dc58c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0kmDHpyc-Oo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ieeexplore.ieee.org/document/10202493">Paper</a> ] via [ <a href="https://publish.illinois.edu/kimlab2020/">KIMLAB</a> ]</p><div class="horizontal-rule"></div><p>Happy 100 Ikeadroneversary to Verity!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0a08ec029edb44e3003d89fc9cf90051" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/6XDgqR61614?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://verity.net/">Verity</a> ]</p><div class="horizontal-rule"></div><p>If you’re wondering what kind of black magic is making this work, the answer is the best kind of black magic: magnets.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3942379c9b88234be6a761d6c38dd982" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/LVxtb-lpRlI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://freeformrobotics.org/wp-content/uploads/2023/08/DISG-TRO.pdf">Paper</a> ] via [ <a href="https://freeformrobotics.org/">Freeform Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Honda is exploring how our all-electric prototype Honda Autonomous Work Vehicle (AWV) could address the challenges of labor shortages, safety and security, and emissions reductions to bring new value to airfield operations. The Honda AWV is designed to boost workforce productivity and support repetitive tasks that allow companies to focus their workforce on value-added activities. First introduced as a concept at CES 2018, the Honda AWV is now advancing toward commercialization.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d20c0346217cb0464c9fb96756b25032" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/lpqeOb_sZwQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://hondanews.com/en-US/honda-corporate/releases/release-7b0d03d783f6d642dbbefc6b2a0c2fa7-honda-prototype-autonomous-work-vehicle-demonstrates-new-value-for-airfield-operations">Honda</a> ]</p><div class="horizontal-rule"></div><blockquote><em>First prototype of a bike tire treated with Self-Healing polymer internally. The result is a puncture-proof inflated tire that does not need the addition of any liquid sealant. The tire is a normal bike tire with an inner tire.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7ac8bc99436aa92e268abaaad80e05b2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/bD0hKhyn91o?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.brubotics.eu/">BruBotics</a> ]</p><div class="horizontal-rule"></div><p style="">The U.S. Navy is working on four-legged friends for sailors, and the ship’s cat is very upset.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b6816805791f61cda00c38c72c2d9246" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5tRbAiuUWjI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.nrl.navy.mil/">USNRL</a> ]</p><div class="horizontal-rule"></div><blockquote><em>The SMART Innovative Training Network is a joint venture between academia and industry, providing scientific and personal development of young researchers in the multidisciplinary fields of soft robotics and smart materials. SMART will realize the technologically and scientifically ambitious breakthroughs to exploit smart, stimuli-responsive material systems for actuation, sensing, and self-healing capabilities for intelligent soft devices.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3de129176aba6426561bfaf38b4069b5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/lcOup-BjSWM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="http://www.smartitn.eu/">SMART ITN</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 20 Oct 2023 20:01:12 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-welcome-to-fall</guid><category>Video friday</category><category>Agility robotics</category><category>Boston dynamics</category><category>Robotics</category><category>Humanoid robots</category><category>Robotic arm</category><category>Quadruped robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/video-loop-of-a-teal-and-gray-humanoid-robot-trying-to-right-itself-in-front-of-a-banner-that-says-made-for-work.gif?id=50013820&amp;width=980"></media:content></item><item><title>What’s Next for Clearpath Robotics?</title><link>https://spectrum.ieee.org/clearpath-robotics-post-acquisition</link><description><![CDATA[
  79. <img src="https://spectrum.ieee.org/media-library/a-photo-of-a-small-yellow-and-black-wheeled-robot-with-sensors-on-top-on-grass-with-rolling-fields-in-the-background.jpg?id=49824546&width=1200&height=800&coordinates=418%2C0%2C418%2C0"/><br/><br/><p style="">Now that <a href="https://www.rockwellautomation.com/en-us/company/news/press-releases/Rockwell-Automation-completes-acquisition-of-autonomous-robotics-leader-Clearpath-Robotics-and-its-industrial-offering-OTTO-Motors.html" rel="noopener noreferrer" target="_blank">Rockwell Automation’s acquisition of Clearpath Robotics and OTTO Motors is complete</a> (at something like US $600 million, <a href="https://www.theglobeandmail.com/business/article-waterloos-clearpath-robotics-sold-to-rockwell-automation-for-us600/" rel="noopener noreferrer" target="_blank">according to one source</a>), it’s more important than ever to get at least <em>some</em> understanding of what the future holds for those iconic yellow-and-black research robots. And it’s not just about their robots, either: Clearpath Robotics was one of the original champions of the <a href="https://www.ros.org/" target="_blank">Robot Operating System (ROS)</a>, and the company has provided a massive amount of support to the <a href="https://www.ros.org/blog/community/" target="_blank">ROS community</a> over the past decade. </p><p style="">At the <a href="https://ieee-iros.org/" target="_blank">IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2023)</a> in Detroit earlier this month, we spoke with Clearpath Robotics cofounder <a href="https://www.linkedin.com/in/rgariepy/" rel="noopener noreferrer" target="_blank">Ryan Gariepy</a> to get a better sense of where things are headed for Clearpath Robotics.</p><hr/><p style=""><strong>Now that you are part of Rockwell, what’s staying the same?</strong></p><p style=""><strong>Ryan Gariepy: </strong>Both Clearpath Robotics and OTTO Motors are still very much in existence. We’re still operating with our own road maps, and Rockwell Automation has a desire to keep these brands around. We plan to keep the iconic Clearpath colors. Basically, we’re going to continue business as usual. As much as I appreciate people’s concern, we do intend to continue building this for the long-term.</p><p class="pull-quote" style="">“We’re now in a world where one of the largest industrial automation companies has decided that robotics is a strategic interest. We think there will be a lot of things that the robotics research community will be excited about.”<br/>—Ryan Gariepy, Clearpath Robotics</p><p style=""><strong>What’s going to be different?</strong></p><p style=""><strong>Gariepy: </strong>We anticipate being able to take larger risks, with more of a long-term view on some of our products and services. Rockwell also has established global scale in sales, deployment, support, supply chain, everything. It’ll really allow us to focus much more on what we’re good at, rather than having to choose between product development and operations.</p><p style="">Rockwell currently does a lot of stuff which is peripheral to the robotics community. They’re a global leader in motion control, in sensing, in safety—these are things that could be of great interest. I think any long-time researcher will remember the days when sensor manufacturers didn’t even support using their sensors on robots, and you had to reverse-engineer those protocols yourself. But we’re now in a world where one of the largest industrial automation companies has decided that robotics is a strategic interest. We think there will be a lot of things that the robotics research community will be excited about.</p><p style=""><strong>What about long-term support for existing Clearpath research robots?</strong></p><p style=""><strong>Gariepy: </strong>If anything, a company like Rockwell gives us more stability rather than less stability. They’re used to supporting their products for far longer than us—the oldest Huskies are coming up on 12 or 13 years old. Rockwell has products that have been on the market for 20 years that they’re still supporting, so they very much respect that. I know that for a lot of researchers, it seems like Clearpath Robotics has been around forever, but we’ve only been around for 14 years. Rockwell has been around for 120 years. </p><p style=""><strong>What about <a href="https://spectrum.ieee.org/tag/turtlebot" target="_blank">TurtleBot</a>?</strong></p><p style=""><strong>Gariepy</strong>: TurtleBot 5 would be a future road map discussion, and that’s more in the hands of Open Robotics than Clearpath Robotics. We do love the TurtleBot, we’re building as many TurtleBots as we possibly can, and we have a long-term agreement with Open Robotics to continue the TurtleBot partnership. That agreement continues.</p><p style=""><strong>How does Rockwell feel about ROS?</strong></p><p style=""><strong>Gariepy: </strong>Rockwell wants to work more with ROS, and has definitely been excited by the leadership that we have with the ROS community. There are a lot of things that we’ve been talking about on how to build on this, but I can’t really get into any details. Honestly, this is because there are so many good ideas we have, that even with this larger company, I don’t have the people to pull everything off right now. </p><p style="">Again, it wasn’t that many years ago when you couldn’t get an API for a manipulator arm so that you could even use it, much less have the manufacturer of that arm support ROS themselves. Things have changed substantially, and now you have a company like Rockwell becoming very excited about the potential in the ROS community.</p><div class="horizontal-rule"></div><p style="">Clearpath Robotics has of course only ever been one part of the ROS community—an important part, certainly, but the continued success of ROS has (we hope) grown beyond what might be going on at any one company. It’s a little worrisome that several other important parts of the ROS community, including <a href="https://spectrum.ieee.org/zebra-technologies-acquire-fetch-robotics" target="_self">Fetch Robotics</a> and <a href="https://spectrum.ieee.org/alphabet-intrinsic-open-robotics-acquisition" target="_self">Open Robotics</a>, have also been acquired relatively recently. So with all this in mind, we’ll be at <a href="https://roscon.ros.org/2023/" target="_blank">ROSCon in New Orleans later this week</a> to try to get a better sense of how the community feels about the future of ROS.</p>]]></description><pubDate>Wed, 18 Oct 2023 14:00:04 +0000</pubDate><guid>https://spectrum.ieee.org/clearpath-robotics-post-acquisition</guid><category>Clearpath robotics</category><category>Iros 2023</category><category>Otto motors</category><category>Ros</category><category>Roscon</category><category>Turtlebot</category><category>Clearpath robotics</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-photo-of-a-small-yellow-and-black-wheeled-robot-with-sensors-on-top-on-grass-with-rolling-fields-in-the-background.jpg?id=49824546&amp;width=980"></media:content></item><item><title>Figure Unveils Its Humanoid Robot Prototype</title><link>https://spectrum.ieee.org/figure-humanoid-robot-2665982283</link><description><![CDATA[
  80. <img src="https://spectrum.ieee.org/media-library/a-photograph-of-a-slim-humanoid-robot-standing-with-reflective-metal-skin-and-a-black-motorcycle-helmet.jpg?id=49822181&width=1200&height=800&coordinates=0%2C137%2C0%2C31"/><br/><br/><p style="">When <a href="https://www.figure.ai/" target="_blank">Figure</a> announced earlier this year that it was working on a general-purpose humanoid robot, our excitement was tempered somewhat by the fact that the company didn’t have much to show besides <a href="https://spectrum.ieee.org/figure-humanoid-robot" target="_blank">renderings of the robot that it hoped to eventually build</a>. Figure had a slick-looking vision, but without anything to back it up (besides a world-class robotics team, of course), it was unclear how fast it would be able to progress.</p><p style="">As it turns out, the company progressed pretty darn fast, and today Figure is unveiling its Figure 01 robot, which has gone from nothing at all to dynamic walking in under a year.</p><hr/><p class="shortcode-media shortcode-media-youtube" style="">
  81. <span class="rm-shortcode" data-rm-shortcode-id="bfa615323eb49312f36b66497e561197" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/jACJruCzUzY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  82. </p><p style="">A couple of things to note about the video, once you tear your eyes away from that shiny metal skin and the enormous battery backpack: First, the robot is walking dynamically without a tether and there are no nervous-looking roboticists within easy grabbing distance. Impressive! Dynamic walking means that there are points during the robot’s gait cycle where abruptly stopping would cause the robot to fall over, since it’s depending on momentum to keep itself in motion. It’s the kind of walking that humans do, and is significantly more difficult than a more traditional “robotic” walk, in which a robot makes sure that its center of mass is always safely positioned above a solid ground contact. Dynamic walking is also where those gentle arm swings come from—they’re helping keep the robot’s motion smooth and balanced, again in a humanlike way.</p><p style="">The second thing that stands out is how skinny (and shiny!) this robot is, especially if you can look past the cable routing. Figure had initially shown renderings of a robot with the form factor of a slim human, but there’s usually a pretty big difference between an initial fancy rendering and real hardware that shows up months later. It now looks like Figure actually has a shot at keeping to that slim design, which has multiple benefits—there are human-robot interaction considerations, where a smaller form factor is likely to be less intimidating, but more importantly, the mass you save by slimming down as much as possible leads to a robot that’s more efficient, cheaper, and safer.</p><p style="">Obviously, there’s a lot more going on here than Figure could squeeze into its press release, so for more details, we spoke with <a href="https://www.linkedin.com/in/jennareher/" rel="noopener noreferrer" target="_blank">Jenna Reher</a>, a senior robotics/AI engineer at Figure, and <a href="https://www.linkedin.com/in/jerry-pratt-4488946/" target="_blank">Jerry Pratt</a>, Figure’s CTO.</p><p style=""><strong>What was the process like for you to teach this robot how to walk? How difficult was it to do that in a year?</strong></p><p style=""><strong>Jenna Reher: </strong>We’ve been really focused on making sure that we’re validating a lot of the hardware as it’s built. With the robot that’s shown in the video, earlier this year it was just the pelvis bolted onto a test fixture. Then we added the spine joints and all the joints connected to that pelvis, and literally built the robot out from that pelvis. We added the legs and had those swinging in the air, and then built up the torso on top of that. At each of those stages, we were making sure to have a good process for validating that those low-level pieces of this overall system were really well tuned in. I think that once you get to something as complex as a whole humanoid, all that validation really saves you a lot of time on the other end, since you have a lot more confidence in the lower level systems as you start working on higher level behaviors like locomotion </p><p style="">We also have a lot of people at the company that have experience on prior legged robotic platforms, so there’s a well of knowledge that we can draw from there. And there’s a large pool of literature that’s been published by people in the locomotion community that roboticists can now pull from. With our locomotion controller, it’s not like we’re trying to reinvent stable locomotion, so being able to implement things that we know already work is a big help.</p><p style=""><strong>Jerry Pratt: </strong>The walking algorithm we’re using has a lot of similarities to the ones that were developed during the DARPA Robotics Challenge. We’re doing a lot of machine learning on the perception side, but we’re not really doing any machine learning for control right now. For the walking algorithm, it’s pretty much robotics controllers 101.</p><p style="">And Jenna mentioned the step-by-step hardware bring-up. While that’s happening, we’re doing a lot of development on the controller in simulation to get to the point where the robot is walking in simulation pretty well, which means that we have a good chance of the controller working on the real robot once it comes online. I think as a company, we’ve done a good job coordinating all the pieces, and a lot of that has come from having people with the experience of having done this several times before.</p><p style=""><strong>More broadly, eight years after the DARPA Robotics Challenge, how hard is it to get a human-size bipedal robot to walk?</strong></p><p style=""><strong>Pratt:</strong> Theoretically, we understand walking pretty well now. There are a lot of different simple models, different toolboxes that are available, and about a dozen different approaches that you can take. A lot of it depends on having good hardware—it can be really difficult if you don’t have that. But for okay-ish walking on flat ground, it’s become easier and easier now with all of the prior work that’s been done.</p><p style="">There are still a lot of challenges for walking naturally, though. We really want to get to the point where our robot looks like a human when it walks. There are some robots that have gotten close, but none that I would say have passed the Turing test for walking, where if you looked at a silhouette of it, you’d think it was a human. Although, there’s not a good business case for doing that, except that it should be more efficient.</p><p style=""><strong>Reher: </strong>Walking is becoming more and more understood, and also accessible to roboticists if you have the hardware for it, but there are still a lot of challenges to be able to walk while doing something useful at the same time—interacting with your environment while moving, manipulating things while moving—these are still challenging problems.</p><p style=""><strong>What are some important things to look for when you see a bipedal robot walk to get a sense of how capable it might be?</strong></p><p style=""><strong>Reher:</strong> I think we as humans have pretty good intuition for judging how well something is locomoting—we’re kind of hardwired to do it. So if you see buzzing oscillations, or a very stiff upper body, those may be indications that a robot’s low-level controls are not quite there. A lot of success in bipedal walking comes down to making sure that a very complex systems-engineering architecture is all playing nice together.</p><p style=""><strong>Pratt:</strong> There have been some recent efforts to come up with performance metrics for walking. Some are kind of obvious, like walking speed. Some are harder to measure, like robustness to disturbances, because it matters what phase of gait the robot is at when it gets pushed—if you push it at just the right time, it’s much harder for it to recover. But I think the person pushing the robot test is a pretty good one. While we haven’t done pushes yet, we probably will in an upcoming video.</p><p style=""><strong>How important is it for your robot to be able to fall safely, and at what point do you start designing for that?</strong></p><p style=""><strong>Pratt:</strong> I think it’s critical to fall safely, to survive a fall, and be able to get back up. People fall—not very often, but they do—and they get back up. And there will be times in almost any application where the robot falls for one reason or another and we’re going to have to just accept that. I often tell people working on humanoids to build in a fall behavior. If the robot’s not falling, make it fall! Because if you’re trying to make the robot so that it can <em>never</em> fall, it’s just too hard of a problem, and it’s going to fall anyway, and then it’ll be dangerous. </p><p style="">I think falling can be done safely. As long as computers are still in control of the hardware, you can do very graceful, judo-style falls. You should be able to detect where people are if you are falling, and fall away from them. So, I think we can make these robots relatively safe. The hardest part of falling, I think, is protecting your hands so they don’t break as you’re falling. But it’s definitely not an insurmountable problem.</p><p class="shortcode-media shortcode-media-rebelmouse-image" style="">
  83. <img alt="A close-up photograph of a the torso and shoulder joint of a humanoid robot with metal skin" class="rm-shortcode" data-rm-shortcode-id="f09c6907d00f7e8199a51ededcc87ef8" data-rm-shortcode-name="rebelmouse-image" id="d5d56" loading="lazy" src="https://spectrum.ieee.org/media-library/a-close-up-photograph-of-a-the-torso-and-shoulder-joint-of-a-humanoid-robot-with-metal-skin.jpg?id=49822208&width=980"/>
  84. <small class="image-media media-caption" placeholder="Add Photo Caption...">Industrial design is a focus of Figure.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Figure</small></p><p style=""><strong>You have a very slim and shiny robot. Did the design require any engineering compromises?</strong></p><p style=""><strong>Pratt: </strong>It’s actually a really healthy collaboration. We’re trying to fit inside a medium-size female body shape, and so the industrial design team will make these really sleek-looking robot silhouettes and say, “Okay, mechanical team, everything needs to fit in there.” And the mechanical team will be like, “We can’t fit that motor in, we need a couple more millimeters.” It’s kind of fun watching the discussions, and sometimes there will be arguments and stuff, but it almost always leads to a better design. Even if it’s simply because it causes us to look at the problem a couple of extra times.</p><p style=""><strong>Reher:</strong> From my perspective, the kind of interaction with the mechanical engineers that led to the robot that we have now has been very beneficial for the controls side. We have a sleeker design with lower-inertia legs, which means that we’re not trying to move a lot of mass around. That ends up helping us down the line for designing control algorithms that we can execute on the hardware.</p><p style=""><strong>Pratt: </strong>That’s right. And keeping the legs slim allows you to do things like crossover steps—you get more range of motion because you don’t have parts of the robot bumping into each other. Self-collisions are something that you always have to worry about with a robot, so if your robot has fewer protruding cables or bumps, it’s pretty important.</p><p style=""><strong>Your CEO <a href="https://www.linkedin.com/posts/brettadcock_figures-custom-actuator-left-vs-off-the-shelf-activity-7117605842744442880-ebwc/" target="_blank">posted a picture of some compact custom actuators</a> that your robot is using. Do you feel like your actuator design (or something else) gives your robot some kind of secret sauce that will help it be successful?</strong></p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-right" data-rm-resized-container="25%" style="float: right;">
  85. <img alt="Two silver motors, side by side, one is about half the size of the other." class="rm-shortcode rm-resized-image" data-rm-shortcode-id="76f9a2da59a7e6793eb27d6d204119e8" data-rm-shortcode-name="rebelmouse-image" id="53c87" loading="lazy" src="https://spectrum.ieee.org/media-library/two-silver-motors-side-by-side-one-is-about-half-the-size-of-the-other.jpg?id=49821668&width=980" style="max-width: 100%"/>
  86. <small class="image-media media-caption" placeholder="Add Photo Caption...">Here’s Figure’s custom actuator [left] vs. an off-the-shelf actuator with equal torque.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Figure</small></p><p style=""><strong>Pratt: </strong>At this point, it’s mostly really amazing engineering and software development and super talented people. About half of our team have worked on humanoids before, and half of our team have worked in some related field. That’s important—things like, making batteries for cars, making electric motors for cars, software and management systems for electric airplanes. There are a few things we’ve learned along the way that we hadn’t learned before. Maybe they’re not super secret things that other people don’t know, but there’s a handful of tricks that we’ve picked up from bashing our heads against some problem over and over. But there’s not a lot of new technology going into the robot, let’s put it that way.</p><p><strong>Are there opportunities in the humanoid robot space for someone to develop a new technology that would significantly change the industry?</strong></p><p style=""><strong>Pratt:</strong> I think getting to whatever it takes to open up new application areas, and do it relatively quickly. We’re interested in things like using large language models to plan general-purpose tasks, but they’re not quite there yet. A lot of the examples that you see are at the research-y stage where they might work until you change up what’s going on—it’s not robust. But if someone cracks that open, that’s a huge advantage.</p><p style="">And then hand designs. If somebody can come up with a super-robust large-degree-of-freedom hand that has force sensing and tactile sensing on it, that would be huge too.</p><div class="horizontal-rule"></div><p class="shortcode-media shortcode-media-rebelmouse-image" style="">
  87. <img alt="A close-up photograph of a slim humanoid robot standing with shiny metal skin and a black motorcycle helmet." class="rm-shortcode" data-rm-shortcode-id="ea4b086d895e38c6dde93ee682c000bd" data-rm-shortcode-name="rebelmouse-image" id="88635" loading="lazy" src="https://spectrum.ieee.org/media-library/a-close-up-photograph-of-a-slim-humanoid-robot-standing-with-shiny-metal-skin-and-a-black-motorcycle-helmet.jpg?id=49822187&width=980"/>
  88. <small class="image-media media-caption" placeholder="Add Photo Caption...">The robot is designed to fit inside a medium-size female body shape.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Figure</small></p><p style="">This is a lot of progress from Figure in a very short time, but they’re certainly not alone in their goal of developing a commercial bipedal robot, and relative to other companies that have had operational hardware for longer, Figure may have some catching up to do. Or it may not—until we start seeing robots doing practical tasks outside of carefully controlled environments, it’s hard to know for sure.</p>]]></description><pubDate>Tue, 17 Oct 2023 14:31:00 +0000</pubDate><guid>https://spectrum.ieee.org/figure-humanoid-robot-2665982283</guid><category>Humanoid robots</category><category>Figure</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-photograph-of-a-slim-humanoid-robot-standing-with-reflective-metal-skin-and-a-black-motorcycle-helmet.jpg?id=49822181&amp;width=980"></media:content></item><item><title>Video Friday: Strandbeest</title><link>https://spectrum.ieee.org/video-friday-strandbeest-2</link><description><![CDATA[
  89. <img src="https://spectrum.ieee.org/media-library/mechanical-sail-powered-land-robot-pictured-on-a-beach-by-the-ocean-with-blue-sky-and-clouds-in-the-background.jpg?id=49474495&width=1200&height=800&coordinates=57%2C0%2C58%2C0"/><br/><br/><p style="">Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5 style=""><a href="https://roscon.ros.org/2023/">ROSCon 2023</a>: 18–20 October 2023, NEW ORLEANS</h5><h5 style=""><a href="https://ssrr2023.org/">IEEE SSRR 2023</a>: 13–15 November 2023, FUKUSHIMA, JAPAN</h5><h5 style=""><a href="https://2023.ieee-humanoids.org/">Humanoids 2023</a>: 12–14 December 2023, AUSTIN, TEXAS</h5><h5 style=""><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 02 February 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p style="">Let’s not concern ourselves with whether this beautiful monstrosity of a Strandbeest is technically a robot or not and instead just enjoy watching it move.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9aea1792f63d4d9c3ef52dfcf2f9fd15" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/iWGlbpEZkNA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Since the beginning of this summer I have been trying to connect several running units (Ordissen) in succession. Animaris Rex is a herd of beach animals whose specimens hold each other as defense against storms. As individuals they would simply blow over, but as a group the chance of surviving a storm would be greater. It is 18 meters long (5 meters longer than the largest Tyrannosaurus Rex found.)</em></blockquote><p>[ <a href="https://www.strandbeest.com/">Strandbeest</a> ]</p><div class="horizontal-rule"></div><p>It’s Slightly Less Big and Significantly Bluer Hero 6!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="fc4c0b732d93814cdbbff3ea4b0fc421" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0KIPnnxMIPw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.fujipress.jp/ijat/au/ijate001700030277/">Paper</a> ]</p><div class="horizontal-rule"></div><blockquote><em>A low-cost robot does extreme parkour including high jumps on obstacles 2x its height, long jumps across gaps 2x its length, handstand on stairs, and running across tilted ramps. We show how a single neural net policy operating directly from a camera image, trained in simulation with large-scale RL, can overcome imprecise sensing and actuation to output highly precise control behavior end-to-end. We show our robot can perform a high jump on obstacles 2x its height, long jump across gaps 2x its length, do a handstand and run across tilted ramps, and generalize to novel obstacle courses with different physical properties.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ac10dec7f7d5f77a1c3eda2980f3e505" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/cuboZYHGiMc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p style="">[ <a href="https://extreme-parkour.github.io/">CMU</a> ]</p><div class="horizontal-rule"></div><p>Human waiters might actually have something to worry about here.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9da2d154e00c9e7b3377991f89897f2e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/EKHTeC4FKm4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dynsyslab.org/vision-news/">LSRL</a> ]</p><div class="horizontal-rule"></div><blockquote><em>While traditional control methods require multiple sensory feedback for the stable and fast locomotion of quadruped robots, our recent work presents a modular neural control architecture that can encode robot dynamics for stable and robust gait generation without sensory feedback. The architecture, integrating a central pattern generator network, a premotor shaping network, and a motor-memory hypernetwork, enables a quadruped robot to walk at different walking frequencies on different terrains, including grass and uneven stone pavement.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8f35c5b3631c77b3d29eccb919658a15" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/KVy47W1N6ts?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.sciencedirect.com/science/article/abs/pii/S0893608023004501">Paper</a> ] via [ <a href="https://manoonpong.com/">NUAA</a> ]</p><p style="">Thanks, Poramate!</p><div class="horizontal-rule"></div><blockquote><em>Visual control enables quadrotors to adaptively navigate using real-time sensory data, bridging perception with action. Yet, challenges persist, including generalization across scenarios, maintaining reliability, and ensuring real-time responsiveness. This paper introduces a perception framework grounded in foundation models for universal object detection and tracking, moving beyond specific training categories.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9b3375df1dd1e4540d6aa66977acc754" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/35sX9C1wUpA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p style="">[ <a href="https://wp.nyu.edu/arpl/">ARPL</a> ]</p><div class="horizontal-rule"></div><blockquote><em>As always, performing a live robot demo is no small feat, but KIMLAB members embraced the challenge! MOMO (Mobile Object Manipulation Operator) stole the Demo Expo at IROS 2023 by charming everyone as it handed out candies and swept the floor with a broom. We also unveiled our armor-controlled robotic backpack.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6a7910e72f0dcdae99f1ae70de5263ef" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7wsVvqkqAPY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://publish.illinois.edu/kimlab2020/">KIMLAB</a> ]</p><div class="horizontal-rule"></div><blockquote><em>X30 quadruped robot, a flagship product designed to meet core industry needs in multiple fields, including inspection of power stations, factories, pipeline corridors, emergency rescue, fire detection, scientific research and more.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5548e3b26a9f9b6ef2f9ff5b2dd922f4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zxGwOEYYFVo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://deeprobotics.cn/en/index/product3.html">DeepRobotics</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Robots operating in close proximity to humans rely heavily on human trust to successfully complete their tasks. But what are the real outcomes when this trust is violated? Self-defense law provides a framework for analyzing tangible failure scenarios that can inform the design of robots and their algorithms.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7b5acc6737c45da7c82174bb8fd2f756" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dce7EnUBWqU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.cmu.edu/me/robomechanicslab/">Robomechanics Lab</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 13 Oct 2023 19:53:50 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-strandbeest-2</guid><category>Video friday</category><category>Strandbeest</category><category>Parkour</category><category>Robotics</category><category>Quadruped robots</category><category>Robotic arm</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/mechanical-sail-powered-land-robot-pictured-on-a-beach-by-the-ocean-with-blue-sky-and-clouds-in-the-background.jpg?id=49474495&amp;width=980"></media:content></item><item><title>This Robot Could Be the Key to Empowering People With Disabilities</title><link>https://spectrum.ieee.org/stretch-assistive-robot</link><description><![CDATA[
  90. <img src="https://spectrum.ieee.org/media-library/a-smiling-man-in-a-wheelchair-connects-eyes-with-a-smiling-woman-in-jeans-and-a-turtleneck-who-bends-over-to-smell-a-rose-that.jpg?id=49270266&width=1200&height=800&coordinates=0%2C0%2C0%2C209"/><br/><br/><p style="">
  91. <strong>In 2010, Henry Evans</strong> <a href="https://www.youtube.com/watch?v=1FwpgAgJG4c" rel="noopener noreferrer" target="_blank">saw a robot on TV</a>. It was a <a href="https://robotsguide.com/robots/pr2" rel="noopener noreferrer" target="_blank">PR2</a>, from the robotics company Willow Garage, and Georgia Tech robotics professor <a href="https://charliekemp.com/" rel="noopener noreferrer" target="_blank">Charlie Kemp</a> was demonstrating how the PR2 was able to locate a person and bring them a bottle of medicine. For most of the people watching that day, the PR2 was little more than a novelty. But for Evans, the robot had the potential to be life changing. “I imagined PR2 as my body surrogate,” Evans says. “I imagined using it as a way to once again manipulate my physical environment after years of just lying in bed.”
  92. </p><p style="">
  93. Eight years earlier, at the age of 40, Henry was working as a CFO in Silicon Valley when he suffered a strokelike attack caused by a birth defect, and overnight, became a nonspeaking person with quadriplegia. “One day I was a 6’4”, 200 Lb. executive,” Evans wrote <a href="http://hevans-hevans.blogspot.com/" rel="noopener noreferrer" target="_blank">on his blog</a> in 2006. “I had always been fiercely independent, probably to a fault. With one stroke I became completely dependent for everything…. Every single thing I want done, I have to ask someone else to do, and depend on them to do it.” Evans is able to move his eyes, head, and neck, and slightly move his left thumb. He can control a computer cursor using head movements and an onscreen keyboard to type at about 15 words per minute, which is how he communicated with <em>IEEE Spectrum</em> for this story.
  94. </p><p class="shortcode-media shortcode-media-youtube" style="">
  95. <span class="rm-shortcode" data-rm-shortcode-id="ea5cf217ea65f6b266fd6862da06dd18" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/SqOxkJSHKns?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  96. <small class="image-media media-caption" placeholder="Add Photo Caption...">Henry Evans shaves with the assistance of a PR2 robot in 2012.</small>
  97. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Georgia Tech</small></p><p style="">
  98. After getting in contact with Kemp at Georgia Tech, and in partnership with Willow Garage, Evans and his wife Jane began collaborating with the roboticists on a project called <a href="http://r4h.org/" rel="noopener noreferrer" target="_blank">Robots for Humanity</a>. The goal was to find ways of extending independence for people with disabilities, helping them and, just as importantly, their caregivers live better and more fulfilling lives. The PR2 was the first of many assistive technologies developed through Robots for Humanity, and Henry was eventually able to use the robot to (among other things) help himself shave and scratch his own itch for the first time in a decade.
  99. </p><p style="">
  100. “Robots are something that was always science fiction for me,” Jane Evans told me. “When I first began this journey with Henry, it never entered my mind that I’d have a robot in my house. But I told Henry, ‘I’m ready to take this adventure with you.’ Everybody needs a purpose in life. Henry lost that purpose when he became trapped in his body, and to see him embrace a new purpose—that gave my husband his life back.”
  101. </p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-left" data-rm-resized-container="25%" style="float: left;">
  102. <img alt="A smiling bespectacled man in a wheelchair is seated next to a robot consisting of a mobile base, a thin vertical pole, and a horizontal arm, whose gripper is repositioning a green blanket on the man\u2019s lap." class="rm-shortcode rm-resized-image" data-rm-shortcode-id="4d43e13967cdfca6e7084b408a3dea43" data-rm-shortcode-name="rebelmouse-image" id="bcc0c" loading="lazy" src="https://spectrum.ieee.org/media-library/a-smiling-bespectacled-man-in-a-wheelchair-is-seated-next-to-a-robot-consisting-of-a-mobile-base-a-thin-vertical-pole-and-a-ho.jpg?id=49270291&width=980" style="max-width: 100%"/>
  103. <small class="image-media media-caption" placeholder="Add Photo Caption...">Even simple tasks like repositioning a blanket require a caregiver, but Henry can use Stretch to move it on his own.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Peter Adams</small>
  104. </p><p>
  105. Henry stresses that an assistive device must not only increase the independence of the disabled person but also make the caregiver’s life easier. “Caregivers are super busy and have no interest in (and often no aptitude for) technology,” he explains. “So if it isn’t dead simple to set up <em>and</em> it doesn’t save them a meaningful amount of time, it very simply won’t get used.”
  106. </p><p>
  107. While the PR2 had a lot of potential, it was too big, too expensive, and too technical for regular real-world use. “It cost $400,000,” Jane recalls. “It weighed 400 pounds. It could destroy our house if it ran into things! But I realized that the PR2 is like the first computers—and if this is what it takes to learn how to help somebody, it’s worth it.”
  108. </p><p>
  109. For Henry and Jane, the PR2 was a research project rather than a helpful tool. It was the same for Kemp at Georgia Tech—a robot as impractical as the PR2 could never have a direct impact outside of a research context. And Kemp had bigger ambitions. “Right from the beginning, we were trying to take our robots out to real homes and interact with real people,” he says. To do that with a PR2 required the assistance of a team of experienced roboticists and a truck with a powered lift gate. Eight years into the Robots for Humanity project, they still didn’t have a robot that was practical enough for people like Henry and Jane to actually use. “I found that incredibly frustrating,” Kemp recalls.
  110. </p><p style="">
  111. In 2016, Kemp started working on the design of a new robot. The robot would leverage years of advances in hardware and computing power to do many of the things that the PR2 could do, but in a way that was simple, safe, and affordable. Kemp found a kindred spirit in <a href="https://www.linkedin.com/in/aaron-edsinger/" rel="noopener noreferrer" target="_blank">Aaron Edsinger</a>, who like Kemp had earned a Ph.D. at MIT under <a href="https://rodneybrooks.com/" target="_blank">Rodney Brooks</a>. Edsinger then cofounded a robotics startup that was <a href="https://spectrum.ieee.org/google-acquisition-seven-robotics-companies" target="_self">acquired by Google in 2013</a>. “I’d become frustrated with the complexity of the robots being built to do manipulation in home environments and around people,” says Edsinger. “[Kemp’s idea] solved a lot of problems in an elegant way.” In 2017, Kemp and Edsinger founded <a href="https://hello-robot.com/" target="_blank">Hello Robot</a> to make their vision real.
  112. </p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-left" data-rm-resized-container="25%" style="float: left;">
  113. <img alt="An animated gif of a robot with a mobile base, a long unmoving vertical piece, with a small camera on top, and a horizontal arm that moves up and down, as well as extending outwards, with a two finger gripper at the end." class="rm-shortcode rm-resized-image" data-rm-shortcode-id="3d91ec4f533b929ae0153c8c962459c8" data-rm-shortcode-name="rebelmouse-image" id="68c32" loading="lazy" src="https://spectrum.ieee.org/media-library/an-animated-gif-of-a-robot-with-a-mobile-base-a-long-unmoving-vertical-piece-with-a-small-camera-on-top-and-a-horizontal-arm.gif?id=49270299&width=980" style="max-width: 100%"/>
  114. <small class="image-media media-caption" placeholder="Add Photo Caption...">Stretch is a relatively small robot that one person can easily move, but it has enough range of motion to reach from the floor to countertop height.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Hello Robot</small>
  115. </p><p style="">
  116. The robot that Kemp and Edsinger designed is called <a href="https://spectrum.ieee.org/hello-robots-stretch-mobile-manipulator" target="_self">Stretch</a>. It’s small and lightweight, easily movable by one person. And with a commercial price of US $20,000, Stretch is a tiny fraction of the cost of a PR2. The lower cost is due to Stretch’s simplicity—it has a single arm, with just enough degrees of freedom to allow it to move up and down and extend and retract, along with a wrist joint that bends back and forth. The gripper on the end of the arm is based on a popular (and inexpensive) assistive grasping tool that Kemp found on Amazon. Sensing is focused on functional requirements, with basic obstacle avoidance for the base along with a depth camera on a pan-and-tilt head at the top of the robot. Stretch is also capable of performing basic tasks autonomously, like grasping objects and moving from room to room.
  117. </p><p style="">
  118. This minimalist approach to mobile manipulation has benefits beyond keeping Stretch affordable. Robots can be difficult to manually control, and each additional joint adds extra complexity. Even for non-disabled users, directing a robot with many different degrees of freedom using a keyboard or a game pad can be tedious, and requires substantial experience to do well. Stretch’s simplicity can make it a more practical tool than robots with more sensors or degrees of freedom, especially for novice users, or for users with impairments that may limit how they’re able to interact with the robot.
  119. </p><p class="shortcode-media shortcode-media-youtube" style="">
  120. <span class="rm-shortcode" data-rm-shortcode-id="53814a9ab795e1a0894380c5827cdb7c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0ozyxSDbEF0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  121. <small class="image-media media-caption" placeholder="Add Photo Caption...">A Stretch robot under Henry Evans’s control helps his wife, Jane, with meal prep and cleanup. </small>
  122. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Vy Nguyen/Hello Robot</small>
  123. </p><p style="">
  124. “The most important thing for Stretch to be doing for a patient is to give meaning to their life,” explains Jane Evans. “That translates into contributing to certain activities that make the house run, so that they don’t feel worthless. Stretch can relieve some of the caregiver burden so that the caregiver can spend more time with the patient.” Henry is acutely aware of this burden, which is why his focus with Stretch is on “mundane, repetitive tasks that otherwise take caregiver time.”
  125. </p><p class="shortcode-media shortcode-media-rebelmouse-image" style="">
  126. <img alt="Group portrait of a smiling woman with short hair and a green outfit, a bespectacled man in a wheelchair, a smiling woman in a black turtleneck, and a tall thin mobile robot with a camera and two finger gripper." class="rm-shortcode" data-rm-shortcode-id="b12923c2b06dfb8517f7f9be10f048ef" data-rm-shortcode-name="rebelmouse-image" id="77ba0" loading="lazy" src="https://spectrum.ieee.org/media-library/group-portrait-of-a-smiling-woman-with-short-hair-and-a-green-outfit-a-bespectacled-man-in-a-wheelchair-a-smiling-woman-in-a-b.jpg?id=49270333&width=980"/>
  127. <small class="image-media media-caption" placeholder="Add Photo Caption...">Vy Nguyen [left] is an occupational therapist at Hello Robot who has been working extensively with both Henry and Jane to develop useful applications for Stretch in their home.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Peter Adams</small>
  128. </p><p style="">
  129. <a href="https://www.linkedin.com/in/v-nguyen-otd-otr-l-80b20818/" rel="noopener noreferrer" target="_blank">Vy Nguyen</a> is an occupational therapist who has been working with Hello Robot to integrate Stretch into a caregiving role. With a <a href="https://hello-robot.com/press-release-nih-sbir-2023" rel="noopener noreferrer" target="_blank">$2.5 million Small Business Innovation Research grant</a> from the National Institutes of Health and in partnership with <a href="https://hfaging.ahs.illinois.edu/" rel="noopener noreferrer" target="_blank">Wendy Rogers</a> at the University of Illinois Urbana-Champaign and <a href="https://homes.cs.washington.edu/~mcakmak/" rel="noopener noreferrer" target="_blank">Maya Cakmak</a> at the University of Washington, Nguyen is helping to find ways that Stretch can be useful in the Evans’s daily lives.
  130. </p><p class="shortcode-media shortcode-media-rebelmouse-image" style="">
  131. <img alt="A smiling man lies in bed. He is looking at a monitor which shows multiple camera views, including one of himself, as a robotic gripper holding a hairbrush scratches his head." class="rm-shortcode" data-rm-shortcode-id="eb371a369c9006eb74766ae09dd8324b" data-rm-shortcode-name="rebelmouse-image" id="1d155" loading="lazy" src="https://spectrum.ieee.org/media-library/a-smiling-man-lies-in-bed-he-is-looking-at-a-monitor-which-shows-multiple-camera-views-including-one-of-himself-as-a-robotic.jpg?id=49270350&width=980"/>
  132. <small class="image-media media-caption" placeholder="Add Photo Caption...">To scratch an itch on his head, Henry uses a hairbrush that has been modified with a soft sleeve to make it easier for the robot to grasp it. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Vy Nguyen/Hello Robot</small>
  133. </p><p style="">
  134. There are many tasks that can be frustrating for the patient to depend on the caregiver for, says Nguyen. Several times an hour, Henry suffers from itches that he cannot scratch, and which he describes as debilitating. Rather than having to ask Jane for help, Henry can instead have Stretch pick up a scratching tool and use the robot to scratch those itches himself. While this may seem like a relatively small thing, it’s hugely meaningful for Henry, improving his quality of life while reducing his reliance on family and caregivers. “Stretch can bridge the gap between the things that Henry did before his stroke and the things he aspires to do now by enabling him to accomplish his everyday activities and personal goals in a different and adaptable way via a robot,” Nguyen explains. “Stretch becomes an extension of Henry himself.”
  135. </p><p style="">
  136. This is a unique property of a mobile robot that makes it especially valuable for people with disabilities: Stretch gives Henry his own agency in the world, which opens up possibilities that go far beyond traditional occupational therapy. “The researchers are very creative and have found several uses for Stretch that I never would have imagined,” Henry notes. Through Stretch, Henry has been able to play poker with his friends without having to rely on a teammate to handle his cards. He can send recipes to a printer, retrieve them, and bring them to Jane in the kitchen as she cooks. He can help Jane deliver meals, clear dishes away for her, and even transport a basket of laundry to the laundry room. Simple tasks like these are perhaps the most meaningful, Jane says. “How do you make that person feel like what they’re contributing is important and worthwhile? I saw Stretch being able to tap into that. That’s huge.”
  137. </p><p class="shortcode-media shortcode-media-rebelmouse-image" style="">
  138. <img alt="A group of people sit around a table, laughing and playing poker. In the foreground, a man in a wheelchair has a large monitor in front of him showing camera views, as he looks at a device affixed to a robot arm that holds five playing cards." class="rm-shortcode" data-rm-shortcode-id="3488b3148ec02c6cf559ee12718f585e" data-rm-shortcode-name="rebelmouse-image" id="8f0de" loading="lazy" src="https://spectrum.ieee.org/media-library/a-group-of-people-sit-around-a-table-laughing-and-playing-poker-in-the-foreground-a-man-in-a-wheelchair-has-a-large-monitor-i.jpg?id=49271762&width=980"/>
  139. <small class="image-media media-caption" placeholder="Add Photo Caption...">Using Stretch to manipulate cards, Henry can play games with friends and family without having to be on a team with someone else.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Vy Nguyen/Hello Robot</small></p><p>
  140. One day, Henry used Stretch to give Jane a rose. Before that, she says, “Every time he would pick flowers for me, I’m thanking Henry along with the caregiver. But when Henry handed me the rose through Stretch, there was no one else to thank but him. And the joy in his face when he handed me that rose was unbelievable.”
  141. </p><p style="">
  142. Henry has also been able to use Stretch to interact with his three-year-old granddaughter, who isn’t quite old enough to understand his disability and previously saw him, says Jane, as something like a piece of furniture. Through Stretch, Henry has been able to play little games of basketball and bowling with his granddaughter, who calls him “Papa Wheelie.” “She knows it’s Henry,” says Nguyen, “and the robot helped her see him as a person who can play with and have fun with her in a very cool way.”
  143. </p><p class="shortcode-media shortcode-media-rebelmouse-image" style="">
  144. <img alt="A smiling man in a wheelchair with a computer on his lap is controlling a nearby mobile robot, which holds a small plastic bowling ball that a small child is also grabbing onto." class="rm-shortcode" data-rm-shortcode-id="e014372424ff3ff1705f9981d48f7604" data-rm-shortcode-name="rebelmouse-image" id="966ae" loading="lazy" src="https://spectrum.ieee.org/media-library/a-smiling-man-in-a-wheelchair-with-a-computer-on-his-lap-is-controlling-a-nearby-mobile-robot-which-holds-a-small-plastic-bowli.png?id=49324661&width=980"/>
  145. <small class="image-media media-caption" placeholder="Add Photo Caption...">Through Stretch, Henry can play games with his granddaughter, like this version of bowling adapted for both small children and robots.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Vy Nguyen/Hello Robot</small></p><p style="">
  146. The person working the hardest to transform Stretch into a practical tool is Henry. That means “pushing the robot to its limits to see all it can do,” he says. While Stretch is physically capable of doing many things (and Henry has extended those capabilities by designing custom accessories for the robot), one of the biggest challenges for the user is finding the right way to tell the robot exactly <em>how</em> to do what you want it to do.</p><p class="shortcode-media shortcode-media-rebelmouse-image" style="">
  147. <img alt="A large monitor shows an interface consisting of multiple views from cameras, simple maps of a house, and a keyboard. A man is seated in front of the screen, with the arm of a robot just visible, holding a kebab on a red flat tool." class="rm-shortcode" data-rm-shortcode-id="b328f792565bfe263fc7c387c970a7cb" data-rm-shortcode-name="rebelmouse-image" id="5da1c" loading="lazy" src="https://spectrum.ieee.org/media-library/a-large-monitor-shows-an-interface-consisting-of-multiple-views-from-cameras-simple-maps-of-a-house-and-a-keyboard-a-man-is-s.png?id=49770933&width=980"/>
  148. <small class="image-media media-caption" placeholder="Add Photo Caption...">The graphical user interface that Henry (in collaboration with the researchers) developed to control Stretch uses multiple camera views and large onscreen buttons to make it easier for Henry to do tasks like feeding himself.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Julian Mehu/Hello Robot</small></p><p style="">
  149. Henry collaborated with the researchers to develop <a href="https://forum.hello-robot.com/t/summer-research-on-in-home-use-by-henry-evans/237/2" rel="noopener noreferrer" target="_blank">his own graphical user interface</a> to make manual control of Stretch easier, with multiple camera views and large onscreen buttons. But Stretch’s potential for partially or fully autonomous operation is ultimately what will make the robot most successful. The robot relies on “a very particular kind of autonomy, called assistive autonomy,” Jane explains. “That is, Henry is in control of the robot, but the robot is making it easier for Henry to do what he wants to do.” Picking up his scratching tool, for example, is tedious and time consuming under manual control, because the robot has to be moved into exactly the right position to grasp the tool. Assistive autonomy gives Henry higher-level control, so that he can direct Stretch to move into the right position on its own. Stretch now has a menu of prerecorded movement subroutines that Henry can choose from. “I can train the robot to perform a series of movements quickly, but I’m still in complete control of what those movements are,” he says.
  150. </p><p style="">
  151. Henry adds that getting the robot’s assistive autonomy to a point where it’s functional and easy to use is the biggest challenge right now. Stretch can autonomously navigate through the house, and the arm and gripper can be controlled reliably as well. But more work needs to be done on providing simple interfaces (like voice control), and on making sure that the robot is easy to turn on and doesn’t shut itself off unexpectedly. It is, after all, still research hardware. Once the challenges with autonomy, interfaces, and reliability are addressed, Henry says, “the conversation will turn to cost issues.”
  152. </p><p class="shortcode-media shortcode-media-youtube" style="">
  153. <span class="rm-shortcode" data-rm-shortcode-id="95578831ddac180a6ab6ff8ccff4ba37" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/61mhppeZryw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  154. <small class="image-media media-caption" placeholder="Add Photo Caption...">Henry Evans uses a Stretch robot to feed himself scrambled eggs.</small>
  155. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Vy Nguyen/Hello Robot</small></p><p>
  156. A $20,000 price tag for a robot is substantial, and the question is whether Stretch can become useful enough to justify its cost for people with cognitive and physical impairments. “We’re going to keep iterating to make Stretch more affordable,” says Hello Robot’s Charlie Kemp. “We want to make robots for the home that can be used by everyone, and we know that affordability is a requirement for most homes.”
  157. </p><p style="">
  158. But even at its current price, if Stretch is able to reduce the need for a human caregiver in some situations, the robot will start to pay for itself. Human care is very expensive—<a href="https://www.genworth.com/aging-and-you/finances/cost-of-care.html" rel="noopener noreferrer" target="_blank">the nationwide average is over $5,000 per month</a> for a home health aide, which is simply unaffordable for many people, and a robot that could reduce the need for human care by a few hours a week would pay for itself within just a few years. And this isn’t taking into account the value of care given by relatives. Even for the Evanses, who do have a hired caregiver, much of Henry’s daily care falls to Jane. This is a common situation for families to find themselves in, and it’s also where Stretch can be especially helpful: by allowing people like Henry to manage more of their own needs without having to rely exclusively on someone else’s help.
  159. </p><p class="shortcode-media shortcode-media-youtube" style="">
  160. <span class="rm-shortcode" data-rm-shortcode-id="7d94eddbde4252df51f9fd4b15af7a05" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/6Zo7EMUxbDI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  161. <small class="image-media media-caption" placeholder="Add Photo Caption...">Henry Evans uses his custom graphical user interface to control the Stretch robot to pick up a towel, place the towel in a laundry basket, and then tow the laundry basket to the laundry room.</small>
  162. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Vy Nguyen/Hello Robot</small></p><p style="">
  163. Stretch does still have some significant limitations. The robot can lift only about 2 kilograms, so it can’t manipulate Henry’s body or limbs, for example. It also has no way of going up and down stairs, is not designed to go outside, and still requires a lot of technical intervention. And no matter how capable Stretch (or robots like Stretch) become, Jane Evans is sure they will never be able to replace human caregivers, nor would she want them to. “It’s the look in the eye from one person to another,” she says. “It’s the words that come out of you, the emotions. The human touch is so important. That understanding, that compassion—a robot cannot replace that.”
  164. </p><p>
  165. Stretch may still be a long way from becoming a consumer product, but there’s certainly interest in it, says Nguyen. “I’ve spoken with other people who have paralysis, and they would like a Stretch to promote their independence and reduce the amount of assistance they frequently ask their caregivers to provide.” Perhaps we should judge an assistive robot’s usefulness not by the tasks it can perform for a patient, but rather on what the robot <em>represents</em> for that patient, and for their family and caregivers. Henry and Jane’s experience shows that even a robot with limited capabilities can have an enormous impact on the user. As robots get more capable, that impact will only increase.
  166. </p><p>
  167. “I definitely see robots like Stretch being in people’s homes,” says Jane. “When, is the question? I don’t feel like it’s eons away. I think we are getting close.” Helpful home robots can’t come soon enough, as Jane reminds us: “We are all going to be there one day, in some way, shape, or form.” Human society is <a href="https://www.who.int/news-room/fact-sheets/detail/ageing-and-health" rel="noopener noreferrer" target="_blank">aging rapidly</a>. Most of us will eventually need some assistance with activities of daily living, and before then, we’ll be assisting our friends and family. Robots have the potential to ease that burden for everyone.
  168. </p><p style="">
  169. And for Henry Evans, Stretch is already making a difference. “They say the last thing to die is hope,” Henry says. “For the severely disabled, for whom miraculous medical breakthroughs don’t seem feasible in our lifetimes, robots are the best hope for significant independence.” <span class="ieee-end-mark"></span>
  170. </p><p style=""><em>This article appears in the November 2023 print issue as “<span style="background-color: initial;">A</span> <span style="background-color: initial;">Robot</span> <span style="background-color: initial;">for</span> <span style="background-color: initial;">Humanity</span>.”</em></p>]]></description><pubDate>Sun, 08 Oct 2023 15:00:04 +0000</pubDate><guid>https://spectrum.ieee.org/stretch-assistive-robot</guid><category>Assistive technology</category><category>Disability technology</category><category>Hello robot</category><category>Mobile manipulators</category><category>Robotics</category><category>Type:cover</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-smiling-man-in-a-wheelchair-connects-eyes-with-a-smiling-woman-in-jeans-and-a-turtleneck-who-bends-over-to-smell-a-rose-that.jpg?id=49270266&amp;width=980"></media:content></item><item><title>How Disney Packed Big Emotion Into a Little Robot</title><link>https://spectrum.ieee.org/disney-robot</link><description><![CDATA[
  171. <img src="https://spectrum.ieee.org/media-library/an-animated-gif-showing-a-small-white-legged-robot-cutely-interacting-with-a-researcher-in-a-robotics-lab.gif?id=49273590&width=1200&height=800&coordinates=62%2C0%2C63%2C0"/><br/><br/><p style="">On Wednesday, at the <a href="https://ieee-iros.org/" rel="noopener noreferrer" target="_blank">2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)</a>, in Detroit, a <a href="https://la.disneyresearch.com/" rel="noopener noreferrer" target="_blank">Disney Research</a> team presented a brand new robotic character during their evening keynote address. The adorable robot packs an enormous amount of expression into its child-size body, from its highly expressive head and two wiggly antennae to its stubby little legs. But what sets this robot apart from other small bipeds is <em>how</em> it walks—it’s full of personality, emoting as it moves in a way that makes it seem uniquely alive.
  172. </p><p style="">
  173. Programming robots to move in emotive ways is something that Disney is an expert in, going as far back as 1971, with its animatronic Hall of Presidents in Disney World. As robots have gotten more advanced and more mobile, though, it’s become challenging for robot designers and robot animators to develop emotive behaviors that both take advantage of and are compatible with robotic hardware under real-world constraints. Disney Research has spent the last year developing a new system that leverages reinforcement learning to turn an animator’s vision into expressive motions that are robust enough to work almost anywhere, whether that’s a stage at IROS or a Disney theme park or a forest in Switzerland.</p><hr/><p class="shortcode-media shortcode-media-youtube" style="">
  174. <span class="rm-shortcode" data-rm-shortcode-id="513438d4355398aafad6fe5b5a2a99e3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/-cfIm06tcfA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  175. <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Disney Research</small>
  176. </p><p style="">
  177. This particular robot was developed by a team led by <a href="https://www.baecher.info/" target="_blank">Moritz Bächer </a>from <a href="https://studios.disneyresearch.com/" rel="noopener noreferrer" target="_blank">Disney Research in Zurich</a>. It’s mostly 3D printed, using modular hardware and actuators that made it quick to design and iterate on, going from concept to what you see in the above video in less than a year. It has a four-degree-of-freedom head (able to look up, down, around, and tilt), as well as five-degree-of-freedom legs with hip joints that allow it to walk while balancing dynamically.
  178. </p><p style="">
  179. “Most roboticists are focused on getting their bipedal robots to reliably walk,” says Disney research scientist <a href="https://spectrum.ieee.org/u/morgan-pope" target="_self">Morgan Pope</a>, who helped present the robot on stage. “At Disney, that might not be enough—our robots may have to strut, prance, sneak, trot, or meander to convey the emotion that we need them to.” Disney has animators who are experts in making characters convey all of those emotions (and more) through movement, as well as roboticists who are experts in building mechanical systems. “What we try to bring to these kinds of robots is born from our history of character animation,” explains <a href="https://www.linkedin.com/in/michael-hopkins-656b70b1/" target="_blank">Michael Hopkins</a>, a principle R&D engineer at Disney. “We have an amazing animator, <a href="https://www.linkedin.com/in/jared-bishop-3b09ba3/" target="_blank">Jared Bishop</a>, embedded in our team, and together, we’re able to leverage his knowledge and our technical expertise to create the best performance we can.”
  180. </p><p class="shortcode-media shortcode-media-rebelmouse-image" style="">
  181. <img alt="Two men stand on a conference presentation stage next to a small white legged robot." class="rm-shortcode" data-rm-shortcode-id="86c4f36abac4963e336bac4db8795084" data-rm-shortcode-name="rebelmouse-image" id="d479f" loading="lazy" src="https://spectrum.ieee.org/media-library/two-men-stand-on-a-conference-presentation-stage-next-to-a-small-white-legged-robot.jpg?id=49273698&width=980"/>
  182. <small class="image-media media-caption" placeholder="Add Photo Caption...">Morgan Pope [left] and Moritz Bächer present the new robot at IROS 2023.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Evan Ackerman</small></p><p style="">
  183. To create an effective robotic character requires the animators and the roboticists to combine their talents, a process that can be time consuming and involves a lot of trial and error to make sure that the robot can convey the animators’ artistic intent without falling over. “In general, animation tools don’t have physics built into them,” explains Bächer. “So that makes it hard for artists to design animations that will work in the real world.”
  184. </p><p style="">
  185. “It’s not just about walking,” adds Pope. “Walking is one of the inputs to the reinforcement-learning system, but the other important input is <em>how</em> it walks.”
  186. </p><p class="shortcode-media shortcode-media-rebelmouse-image rm-resized-container rm-resized-container-25 rm-float-right" data-rm-resized-container="25%" style="float: right;">
  187. <img alt="Photo of small robot and a man." class="rm-shortcode rm-resized-image" data-rm-shortcode-id="c15ab8265e496260a1f026f48c85721b" data-rm-shortcode-name="rebelmouse-image" id="16189" loading="lazy" src="https://spectrum.ieee.org/media-library/photo-of-small-robot-and-a-man.jpg?id=48632157&width=980" style="max-width: 100%"/>
  188. <small class="image-media media-caption" placeholder="Add Photo Caption..." style="max-width: 100%;">Disney’s Morgan Pope helped present the new robotic character at IROS.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..." style="max-width: 100%;">Evan Ackerman</small>
  189. </p><p style="">
  190. To bridge this gap, Disney Research has developed a reinforcement learning-based pipeline that relies on simulation to combine and balance the vision of an animator with robust robotic motions. For the animator, the pipeline essentially takes care of implementing the constraints of the physical world, letting the animator develop highly expressive motions while relying on the system to make those motions real—or get as close as is physically possible for the robot. Disney’s pipeline can train a robot on a new behavior on a single PC, running what amounts to years of training in just a few hours. According to Bächer, this has reduced the time that it takes for Disney to develop a new robotic character from years to just months.
  191. </p><p style="">
  192. A big advantage of reinforcement learning in this context is that the resulting motions can be highly robust. Disney’s system is able to train motions over and over while making slight changes to things like motor performance, mass distribution, and friction between the robot and the ground. The system ensures that whatever the robot encounters in the real world, it will know not just how to handle itself, but how to handle itself <em>while still emoting</em>, which is critical to the robot maintaining its character. “This is a challenge for traditional techniques,” says <a href="https://www.linkedin.com/in/ruben-grandia" target="_blank">Ruben Grandia</a>, an associate research scientist at Disney Research. “Normally, you have to hand-program this transition point. But if you put everything together in one simulation and perturb it while it tries to move and animate, it can determine that point for itself, which has resulted in recovery strategies that we see from this robot that we’d have no idea how to program.”
  193. </p><p style="">
  194. Social robots have existed for decades, and even robots not explicitly designed for social interaction usually have some human-robot interaction features if they’re likely to spend time around people. But human-robot interaction can sometimes be an afterthought for robots that are designed primarily with functionality in mind. With its robots, Disney has shown just how much a robot is able to communicate through <em>character</em> without sacrificing functionality, and this can be useful in robotics more broadly.
  195. </p><p style="">
  196. “In situations where humans and robots are close to each other, conveying emotion and intent can be an important feature,” explains <a href="https://www.linkedin.com/in/georg-wiedebach/" target="_blank">Georg Wiedebach</a>, senior R&D imagineer at Disney. “So I think this can also be valuable in other applications where robots are working next to people.”
  197. </p><p class="shortcode-media shortcode-media-rebelmouse-image" style="">
  198. <img alt="Photo of a small crowd of people surrounding a small robot and taking photos of it." class="rm-shortcode" data-rm-shortcode-id="bb20d8c71128144c765e449ed1149162" data-rm-shortcode-name="rebelmouse-image" id="62091" loading="lazy" src="https://spectrum.ieee.org/media-library/photo-of-a-small-crowd-of-people-surrounding-a-small-robot-and-taking-photos-of-it.jpg?id=48631709&width=980"/>
  199. <small class="image-media media-caption" placeholder="Add Photo Caption...">IROS attendees meet the Disney robot.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Evan Ackerman</small>
  200. </p><p style="">
  201. While it’s easy to focus on this specific robot (look how cute it is!), the researchers emphasize that what’s important here is not the robot, it’s the process. “The idea is that this is a platform that’s hardware agnostic,” says Bächer. “So if we wanted to add more legs, or add arms, or make an entirely new character with a completely different morphology, we can rapidly teach it new behaviors. The off-the-shelf actuators, the 3D-printed components, our adaptable reinforcement-learning framework—these can all be applied to robots that are widely different in how they look and move. This robot is a promising first step on that journey.”
  202. </p><p style="">
  203. The next steps on Disney’s journey involve using this technique to develop more physical robotic characters, and pushing the limits of what’s possible with faster and more dynamic motions. “We want to see what happens when we get to those limits,” says Disney research scientist <a href="https://www.linkedin.com/in/espen-knoop-47b525114/" target="_blank">Espen Knoop</a>, “and learn what we can do at those limits.”
  204. </p><p class="shortcode-media shortcode-media-rebelmouse-image" style="">
  205. <img alt="Photo of five men kneeling around a small robot." class="rm-shortcode" data-rm-shortcode-id="61d9f09a374315635b2c0ac5af9afbd3" data-rm-shortcode-name="rebelmouse-image" id="16976" loading="lazy" src="https://spectrum.ieee.org/media-library/photo-of-five-men-kneeling-around-a-small-robot.jpg?id=48631971&width=980"/>
  206. <small class="image-media media-caption" placeholder="Add Photo Caption...">The Disney Research team that created the new robot are [from left] Moritz Bächer, Georg Wiedebach, Michael Hopkins, Ruben Grandia, and Morgan Pope.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Evan Ackerman</small>
  207. </p><p style="">
  208. As far as this robot goes, the character doesn’t have an official name, and Disney isn’t ready to comment on where we might see it. But based on how it looks and sounds, we have some guesses. And this one little robot is only the beginning—now that they’re so much easier to create, we’re hoping to see many more of these expressive robotic characters from Disney.
  209. </p>]]></description><pubDate>Sat, 07 Oct 2023 00:20:51 +0000</pubDate><guid>https://spectrum.ieee.org/disney-robot</guid><category>Iros 2023</category><category>Entertainment robots</category><category>Robotics</category><category>Disney robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/an-animated-gif-showing-a-small-white-legged-robot-cutely-interacting-with-a-researcher-in-a-robotics-lab.gif?id=49273590&amp;width=980"></media:content></item><item><title>Video Friday: Morphing Adaptive Robots</title><link>https://spectrum.ieee.org/video-friday-morphing-robots</link><description><![CDATA[
  210. <img src="https://spectrum.ieee.org/media-library/image.png?id=49225717&width=1487&height=957&coordinates=370%2C11%2C63%2C112"/><br/><br/><p style="">Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em><a href="https://spectrum.ieee.org/" target="_blank">IEEE Spectrum</a></em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5 style=""><a href="https://roscon.ros.org/2023/">ROSCon 2023</a>: 18–20 October 2023, NEW ORLEANS</h5><h5 style=""><a href="https://ssrr2023.org/">IEEE SSRR 2023</a>: 13–15 November 2023, FUKUSHIMA, JAPAN</h5><h5 style=""><a href="https://2023.ieee-humanoids.org/">Humanoids 2023</a>: 12–14 December 2023, AUSTIN, TEXAS</h5><h5 style=""><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 2 February 2024, ZURICH</h5><p style="">Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote><em>Pulling inspiration from the natural world, we have developed a trio of robots that can morph their bodies and legs as needed to better crawl, shimmy, or swim over difficult terrain. These new robotic systems are designed to mimic the way biological organisms adapt their shape depending on their life cycle or environment and were developed by a team from the Department of Mechanical Engineering. The work is described in a new paper published in </em>Nature Communications<em>, which outlines the three robotic types and their different abilities, including gripping, climbing, and amphibious travel.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6435b65d2c72114bc9b7621b667a8409" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/erpCtfSOnF8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p style="">[ <a href="https://www.nature.com/articles/s41467-023-41708-6"><em>Nature</em></a> ] via [ <a href="https://www.engr.colostate.edu/~zhao/">Adaptive Robotics Lab</a> ]</p><p style="">Thanks, Jianguo!</p><div class="horizontal-rule"></div><blockquote><em>LimX Dynamics has launched its first wheeled quadruped robot: W1. W1 is equipped with perception and motion-control algorithms, and multiple proprietary high-performing actuators. It combines the advantages of legged and wheeled structures all in one, enabling it with powerful real-time terrain perception and all-terrain mobility.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7a80f1d763b864e8800a1dcdc2ce6d8d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/tEYLccxFuns?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Pre-orders should open before the end of this year.</p><p>[ <a href="http://www.limxdynamics.com/en">LimX Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Robots built by engineers at the University of California, San Diego helped achieve a major breakthrough in understanding how insect flight evolved, described in the 4 October 2023 issue of the journal </em>Nature<em>. </em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5fe55263c065d9a92b1ddc18e910cf65" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zG8V1yrYlD8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://today.ucsd.edu/story/these-robots-helped-understand-how-insects-evolved-two-distinct-strategies-of-flight">UCSD</a> ]</p><div class="horizontal-rule"></div><p style="">Dino Robotics’ cobot demonstrates machine tending, featuring 3D environment scanning, bin picking of provided parts without mechanical fixtures, and force-controlled part-feeding.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5ff372168cd9d5e4cb4e59c9700a7ea7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xC0cYlY9Bzs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dino-robotics.com/">Dino Robotics</a> ]</p><div class="horizontal-rule"></div><p style="">This is clever: a chameleon-inspired suction gripper that grabs and retracts without any active sensing at all.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6706e60cfd6ed2ba9d85dd896a8b05b0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5f0qXYcE3nw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p style="">[ <a href="https://ieeexplore.ieee.org/document/10024348"><em>IEEE Robotics and Automation Letters</em></a> ] via [ <a href="https://www.biorobotics.snu.ac.kr/">SNU</a> ]</p><div class="horizontal-rule"></div><p>You should probably not do this with your lidar.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1d239314526847a1161ab580396f1caf" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/3ma0EUJ8WtM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Somehow, I feel like I’m getting called out here...</p><p>[ <a href="https://norlab.ulaval.ca/">Norlab</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Imagine dogs effortlessly managing a wide array of items, their versatile mouths allowing them to grasp and hold easily. Inspired by their natural abilities, we crafted a robotic gripper. Enhancing our passive-jamming lip and incorporating the ingenious structure of a dog’s teeth, our gripper now possesses the dexterity and precision to handle objects in daily life.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4767af52df30b43ab3357973328cd56f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/udHdfz-2tNk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://publish.illinois.edu/kimlab2020/">KIMLAB</a> ]</p><div class="horizontal-rule"></div><blockquote><em>The potential of Martian lava tubes for resource extraction and habitat sheltering highlights the need for robots capable of undertaking the grueling task of their exploration. Driven by this motivation, in this work we introduce a legged robot system optimized for jumping in the low gravity of Mars, designed with leg configurations adaptable to both bipedal and quadrupedal systems.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="30de12843ca5f1c7f79bedb1343586f0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/T82vtegaNPI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.autonomousrobotslab.com/">ARL</a> ]</p><div class="horizontal-rule"></div><p>How self-healing is self-healing? Let’s find out!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="271fc8040efa614eca5d8c6c4d4aa234" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/mNCdf8oCgP4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.brubotics.eu/">BruBotics</a> ]</p><div class="horizontal-rule"></div><p style="">Seems like GITAI is planning to test some of its robots outside of the International Space Station.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="258d6f8aa7fd33878dd95db240282c96" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/aEyb6iKgHPw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://gitai.tech/">GITAI</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 06 Oct 2023 16:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-morphing-robots</guid><category>Video friday</category><category>Quadruped robots</category><category>Bioinspired robots</category><category>Insect robots</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/image.png?id=49225717&amp;width=980"></media:content></item><item><title>Chatbot: A New Robotics Podcast from IEEE Spectrum</title><link>https://spectrum.ieee.org/chatbot-podcast</link><description><![CDATA[
  211. <img src="https://spectrum.ieee.org/media-library/illustration-of-two-robot-faces-as-line-drawings-looking-at-each-other-against-a-blue-to-green-gradient-background.jpg?id=48463185&width=1200&height=800&coordinates=95%2C0%2C96%2C0"/><br/><br/><p style="">
  212. We’re launching a new robotics podcast here at <em>Spectrum</em>! It’s called <a href="/podcasts/chatbot" target="_blank">Chatbot</a>, and it’s something a little different that I’m pretty excited about.
  213. </p><hr/><h3>Subscribe to Chatbot</h3><br/><img alt="Logo of Chatbot podcast with two robot face line drawings looking at each other, with the words IEEE Spectrum at the bottom, against blue to green gradient background" class="rm-shortcode" data-rm-shortcode-id="06168460711724c4b9f7be5bc788abda" data-rm-shortcode-name="rebelmouse-image" id="e61cc" loading="lazy" src="https://spectrum.ieee.org/media-library/logo-of-chatbot-podcast-with-two-robot-face-line-drawings-looking-at-each-other-with-the-words-ieee-spectrum-at-the-bottom-aga.jpg?id=46856262&width=980"/><p>
  214. Subscribe to <a href="https://spectrum.ieee.org/podcasts/chatbot" target="_self">Chatbot</a> on the podcast service you like, or watch the video versions on the <em>Spectrum</em> YouTube channel <a href="https://www.youtube.com/playlist?list=PL8Ug41r-ywn-R8rHLer9X5hxLRaXnuOyz" rel="noopener noreferrer" target="_blank">here</a>.
  215. </p><p>
  216. • <a href="https://open.spotify.com/show/2znNFbgQ0u0Ru26UdOEMRI" rel="noopener noreferrer" target="_blank">Spotify</a> ↗<br/>• <a href="https://podcasts.apple.com/gb/podcast/chatbot/id1695950315" rel="noopener noreferrer" target="_blank">Apple Podcasts</a> ↗<br/>• <a href="https://podcasts.google.com/feed/aHR0cHM6Ly9mZWVkcy50cmFuc2lzdG9yLmZtL2NoYXRib3Q=" rel="noopener noreferrer" target="_blank">Google Podcasts</a> ↗<br/>• <a href="https://music.amazon.com/podcasts/dc3762b3-d14e-4470-a428-f67118cf412b/chatbot" rel="noopener noreferrer" target="_blank">Amazon Music</a> ↗<br/>• <a href="https://pca.st/xeegprdx" rel="noopener noreferrer" target="_blank">Pocket Casts</a> ↗<br/>• <a href="https://podcastaddict.com/podcast/chatbot/4266448" rel="noopener noreferrer" target="_blank">Podcast Addict</a> ↗<br/>• <a href="https://www.deezer.com/us/show/6037397" rel="noopener noreferrer" target="_blank">Deezer</a> ↗<br/>• <a href="https://player.fm/series/chatbot" rel="noopener noreferrer" target="_blank">Player FM</a> ↗<br/>• <a href="https://www.podchaser.com/podcasts/chatbot-5317858" rel="noopener noreferrer" target="_blank">Podchaser</a> ↗<br/>• <a href="https://www.listennotes.com/podcasts/chatbot-ieee-spectrum-rSg-GbDMcx5/" rel="noopener noreferrer" target="_blank">Listen Notes</a> ↗<br/>• <a href="https://www.youtube.com/playlist?list=PL8Ug41r-ywn-R8rHLer9X5hxLRaXnuOyz" rel="noopener noreferrer" target="_blank">YouTube</a> ↗<br/>• <a href="https://feeds.transistor.fm/chatbot" rel="noopener noreferrer" target="_blank">RSS Feed</a> ↗
  217. </p><h3></h3><br><p>
  218. The way the Chatbot podcast works is that we invite a couple of robotics experts to talk with each other about a topic that they have in common. They come up with the questions, not us, which results in the kinds of robotics conversations you won’t hear anywhere else—uniquely informative, but also surprising and fun.
  219. </p><p>
  220. Each episode will focus on a general topic that the robotics experts have in common, but once we get going, our guests are free to ask each other about whatever interests them. I’ll be there to make sure that our guests don’t get too technical, because we want everyone to be able to enjoy these conversations, but otherwise, I’ll mostly just be listening, because I’ll be as excited as you are to see how each episode unfolds.
  221. </p><p>
  222. We’re going to try to keep the episodes pretty short, maybe 20 minutes or so, so they’ll be easy to fit into your day. We did a lousy job on that with the first couple, but we’ll do better going forward. This is all definitely a work in progress for us—I have to learn how to be a good podcast host, and Spectrum is still working out how to do all the editing and stuff to make it look and sound amazing. But we’ll get there!
  223. </p><p>
  224. Our first few episodes are already live, and we’ve got all kinds of ideas for more, so please subscribe on whatever podcast service you like, or watch the video versions on the <a href="https://www.youtube.com/playlist?list=PL8Ug41r-ywn-R8rHLer9X5hxLRaXnuOyz" target="_blank"><em>Spectrum</em> YouTube channel</a>. And we’d love to hear what you think: about what you like, what you don’t like, and especially who you’d like to hear on a future episode.
  225. </p><h3>Episode 1: Making Boston Dynamics' Robots Dance</h3><br><span class="rm-shortcode" data-rm-shortcode-id="fd3f800da3d9b3b917445725f9978e21" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/EpShHKQiKmg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span><p>A new podcast from IEEE Spectrum, where award-winning robotics reporter and editor Evan Ackerman pairs up leading experts to pose the kind of questions no-one can ask. In this first episode, the choreographer of Boston Dynamics’ hit dancing robot videos, Monica Thomas, engages with Amy LaViers, director of the Robotics, Automation, and Dance Lab, about the engineering and philosophy of robots doing the twist.</p><h3>Episode 2: How Labrador and iRobot Create Domestic Robots That Really Help</h3><br><span class="rm-shortcode" data-rm-shortcode-id="103ff9f89411f04a2be08043a61118c5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/QMjEtL9i1zI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span><p>Evan Ackerman pairs Mike Dooley, CEO and co-founder of Labrador Systems, with Chris Jones, CTO of iRobot. With the Roomba, iRobot created the first commercially successful domestic robot, and Labrador Systems is developing the Retriever, a semi-autonomous mobile table intended to help people live more independently. They quiz each about the challenges of bringing robots from the laboratory to the unpredictable environments of people’s homes and making sure manufacturers are meeting actual user needs.</p><h3>Episode 3: Drones That Can Fly Better Than You Can</h3><br><span class="rm-shortcode" data-rm-shortcode-id="127226db7bdb37d6a108e6c77775720e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Go0QMXnlIxs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span><p>Host Evan Ackermans pairs Adam Bry, CEO of Skydio, and Davide Scaramuzza, director of the Robotics and Perception Group at the University of Zurich, to probe the challenges of making drones that come with super-human piloting skills.</p></br></br></br></br>]]></description><pubDate>Wed, 04 Oct 2023 09:57:44 +0000</pubDate><guid>https://spectrum.ieee.org/chatbot-podcast</guid><category>Podcasts</category><category>Chatbot</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/illustration-of-two-robot-faces-as-line-drawings-looking-at-each-other-against-a-blue-to-green-gradient-background.jpg?id=48463185&amp;width=980"></media:content></item><item><title>Drones That Can Fly Better Than You Can</title><link>https://spectrum.ieee.org/autonomous-drones</link><description><![CDATA[
  226. <img src="https://spectrum.ieee.org/media-library/image.jpg?id=36416885&width=980"/><br/><br/><p class="shortcode-media shortcode-media-youtube">
  227. <span class="rm-shortcode" data-rm-shortcode-id="fdf710186153980340a30cc6cf858cfe" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Go0QMXnlIxs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  228. <small class="image-media media-caption" placeholder="Add Photo Caption...">Episode 3: Drones That Can Fly Better Than You Can</small>
  229. <small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://youtu.be/Go0QMXnlIxs" target="_blank"><br/>
  230. </a></small>
  231. </p><p>
  232. <strong>Evan Ackerman: </strong>I’m Evan Ackerman, and welcome to <em>Chatbot</em>, a new podcast from <em>IEEE Spectrum</em> where robotics experts interview each other about things that they find fascinating. On this episode of <em>Chatbot</em>, we’ll be talking with <a href="https://rpg.ifi.uzh.ch/people_scaramuzza.html" rel="noopener noreferrer" target="_blank">Davide Scaramuzza</a> and <a href="https://www.skydio.com/company" rel="noopener noreferrer" target="_blank">Adam Bry</a> about agile autonomous drones. Adam Bry is the CEO of <a href="https://www.skydio.com/" rel="noopener noreferrer" target="_blank">Skydio</a>, a company that makes consumer camera drones with <a href="https://spectrum.ieee.org/skydio-2-review-this-is-the-drone-you-want-to-fly" target="_self">an astonishing amount of skill at autonomous tracking and obstacle avoidance</a>. Foundation for Skydio’s drones can be traced back to <a href="https://spectrum.ieee.org/skydio-camera-drone-autonomous-flying" target="_self">Adam’s work on autonomous agile drones at MIT</a>, and after spending a few years at Google working on <a href="https://wing.com/" rel="noopener noreferrer" target="_blank">Project Wing’s delivery drones</a>, Adam cofounded Skydio in 2014. Skydio is currently on their third generation of consumer drones, and earlier this year, the company brought on three PhD students from Davide’s lab to expand their autonomy team. Davide Scaramuzza directs the <a href="https://rpg.ifi.uzh.ch/" rel="noopener noreferrer" target="_blank">Robotics and Perception group at the University of Zürich</a>. His lab is best known for developing extremely agile drones that can autonomously navigate through complex environments at very high speeds. Faster, it turns out, than even the best human drone racing champions. Davide’s drones rely primarily on computer vision, and <a href="https://spectrum.ieee.org/event-camera-helps-drone-dodge-thrown-objects" target="_self">he’s also been exploring potential drone applications for a special kind of camera called an event camera</a>, which is ideal for fast motion under challenging lighting conditions. So Davide, you’ve been doing drone research for a long time now, like a decade, at least, if not more.
  233. </p><p>
  234. <strong>Davide Scaramuzza:</strong> Since 2009. 15 years.
  235. </p><p>
  236. <strong>Ackerman:</strong> So what still fascinates you about drones after so long?
  237. </p><p>
  238. <strong>Scaramuzza: </strong>So what fascinates me about drones is their freedom. So that was the reason why I decided, back then in 2009, to actually move from ground robots—I was working at the time on self-driving cars—to drones. And actually, the trigger was when <a href="https://robotsguide.com/robots/googlecar" rel="noopener noreferrer" target="_blank">Google announced the self-driving car project</a>, and then for me and many researchers, it was clear that actually many things were now transitioning from academia to industry, and so we had to come up with new ideas and things. And then with my PhD adviser at that time [inaudible] we realized, actually, that drones, especially quadcopters, were just coming out, but they were all remote controlled or they were actually using GPS. And so then we said, “What about flying drones autonomously, but with the onboard cameras?” And this had never been done until then. But what fascinates me about drones is the fact that, actually, they can overcome obstacles on the ground very quickly, and especially, this can be very useful for many applications that matter to us all today, like, first of all, search and rescue, but also other things like inspection of difficult infrastructures like bridges, power [inaudible] oil platforms, and so on.
  239. </p><p>
  240. <strong>Ackerman: </strong>And Adam, your drones are doing some of these things, many of these things. And of course, I am fascinated by drones and by what your drone is able to do, but I’m curious. When you introduce it to people who have maybe never seen it, how do you describe, I guess, almost the magic of what it can do?
  241. </p><p>
  242. <strong>Adam Bry:</strong> So the way that we think about it is pretty simple. Our basic goal is to build in the skills of an expert pilot into the drone itself, which involves a little bit of hardware. It means we need sensors that see everything in every direction and we need a powerful computer on board, but is mostly a software problem. And it becomes quite application-specific. So for consumers, for example, our drones can follow and film moving subjects and avoid obstacles and create this incredibly compelling dynamic footage. And the goal there is really what would happen if you had the world’s best drone pilot flying that thing, trying to film something in an interesting, compelling way. We want to make that available to anybody using one of our products, even if they’re not an expert pilot, and even if they’re not at the controls when it’s flying itself. <a href="https://spectrum.ieee.org/skydio-2-review-this-is-the-drone-you-want-to-fly" target="_self">So you can just put it in your hand, tell it to take off, it’ll turn around and start tracking you, and then you can do whatever else you want to do, and the drone takes care of the rest</a>. In the industrial world, it’s entirely different. So <a href="https://www.skydio.com/distribution-network-inspection" rel="noopener noreferrer" target="_blank">for inspection applications</a>, say, for a bridge, you just tell the drone, “Here’s the structure or scene that I care about,” and then we have a product called 3D Scan that will automatically explore it, build a real-time 3D map, and then use that map to take high-resolution photos of the entire structure.
  243. </p><p>
  244. And to follow on a bit to what Davide was saying, I mean, I think if you sort of abstract away a bit and think about what capability do drones offer, thinking about camera drones, it’s basically you can put an image sensor or, really, any kind of sensor anywhere you want, any time you want, and then the extra thing that we’re bringing in is without needing to have a person there to control it. And I think the combination of all those things together is transformative, and we’re seeing the impact of that in a lot of these applications today, but I think that that really— realizing the full potential is a 10-, 20-year kind of project.
  245. </p><p>
  246. <strong>Ackerman: </strong>It’s interesting when you talk about the way that we can think about the Skydio drone is like having an expert drone pilot to fly this thing, because there’s so much skill involved. And Davide, I know that you’ve been working on very high-performance drones that can maybe challenge even some of these expert pilots in performance. And I’m curious, <a rel="noopener noreferrer" target="_blank"></a>when expert drone pilots come in and see what your drones can do autonomously for the first time, is it scary for them? Are they just excited? How do they react?<a href="#_msocom_1" rel="noopener noreferrer" target="_blank"></a>
  247. </p><p>
  248. <strong>Scaramuzza:</strong> First of all, actually, they say, “Wow.” So they can not believe what they see. But then they get super excited, but at the same time, nervous. So we started working on autonomous drone racing five years ago, but in the first three years, we have been flying very slowly, like three meters per second. So they were really snails. But then in the last two years is when actually we started really pushing the limits, both in control and planning and perception. So these are our most recent drone, by the way. And now we can really fly at the same level of agility as humans. Not yet at the level to beat human, but we are very, very close. So we started the collaboration with <a href="https://marv-fpv.com/" rel="noopener noreferrer" target="_blank">Marvin, who is the Swiss champion</a>, and he’s only— now he’s 16 years old. So last year he was 15 years old. So he’s a boy. And he actually was very mad at the drone. So he was super, super nervous when he saw this. So he didn’t even smile the first time. He was always saying, “I can do better. I can do better.” So actually, his reaction was quite scared. He was scared, actually, by what the drone was capable of doing, but he knew that, basically, we were using the motion capture. Now [inaudible] try to play in a fair comparison with a fair setting where both the autonomous drone and the human-piloted drone are using both onboard perceptions or egocentric vision, then things might end up differently.
  249. </p><p>
  250. Because in fact, actually, our vision-based drone, so flying with onboard vision, was quite slow. But actually now, after one year of pushing, we are at a level, actually, that we can fly a vision-based drone at the level of Marvin, and we are even a bit better than Marvin at the current moment, using only onboard vision. So we can fly— in this arena, the space allows us to go up to 72 kilometers per hour. We reached the 72 kilometers per hour, and we even beat Marvin in three consecutive laps so far. So that’s [inaudible]. But we want to now also compete against other pilots, other world champions, and see what’s going to happen.
  251. </p><p>
  252. <strong>Ackerman:</strong> Okay. That’s super impressive.
  253. </p><p>
  254. <strong>Bry:</strong> Can I jump in and ask a question?
  255. </p><p>
  256. <strong>Ackerman: </strong>Yeah, yeah, yeah.
  257. </p><p>
  258. <strong>Bry: </strong>I’m interested if you— I mean, since you’ve spent a lot of time with the expert pilots, if you learn things from the way that they think and fly, or if you just view them as a benchmark to try to beat, and the algorithms are not so much inspired by what they do.
  259. </p><p>
  260. <strong>Scaramuzza:</strong> So we did all these things. So we did it also in a scientific manner. So first, of course, we interviewed them. We asked any sort of question, what type of features are you actually focusing your attention, and so on, how much is the people around you, the supporters actually influencing you, and the hearing the other opponents actually screaming while they control [inaudible] influencing you. So there is all these psychological effects that, of course, influencing pilots during a competition. But then what we tried to do scientifically is to really understand, first of all, what is the latency of a human pilot. So there have been many studies that have been done for car racing, Formula One, back in the 80s and 90s. So basically, they put eye trackers and tried to understand— they tried to understand, basically, what is the latency between what you see until basically you act on your steering wheel. And so we tried to do the same for human pilots. So we basically installed an eye tracking device on our subjects. So we called 20 subjects from all across Switzerland, some people also from outside Switzerland, with different levels of expertise.
  261. </p><p>
  262. But they were quite good. Okay? We are not talking about median experts, but actually already very good experts. And then we would let them rehearse on the track, and then basically, we were capturing their eye gazes, and then we basically measured the time latency between changes in eye gaze and changes in throttle commands on the joystick. And we measured, and this latency was 220 milliseconds.
  263. </p><p>
  264. <strong>Ackerman: </strong>Wow. That’s high.
  265. </p><p>
  266. <strong>Scaramuzza:</strong> That includes the brain latency and the behavioral latency. So that time to send the control commands, once you process the information, the visual information to the fingers. So—
  267. </p><p>
  268. <strong>Bry:</strong> I think [crosstalk] it might just be worth, for the audience anchoring that, what’s the typical control latency for a digital control loop. It’s— I mean, I think it’s [crosstalk].
  269. </p><p>
  270. <strong>Scaramuzza:</strong> It’s typically in the— it’s typically in the order of— well, from images to control commands, usually 20 milliseconds, although we can also fly with the much higher latencies. It really depends on the speed you want to achieve. But typically, 20 milliseconds. So if you compare 20 milliseconds versus the 220 milliseconds of the human, you can already see that, eventually, the machine should beat the human. Then the other thing that you asked me was, what did we learn from human pilots? So what we learned was— interestingly, we learned that basically they were always pushing the throttle of the joystick at the maximum thrust, but actually, this is—
  271. </p><p>
  272. <strong>Bry:</strong> Because that’s very consistent with optimal control theory.
  273. </p><p>
  274. <strong>Scaramuzza:</strong> Exactly. But what we then realized, and they told us, was that it was interesting for them to observe that actually, for the AI, was better to brake earlier rather than later as the human was actually doing. And we published these results in <a href="https://www.youtube.com/watch?v=ZPI8U1uSJUs" rel="noopener noreferrer" target="_blank">Science Robotics</a> last summer. And we did this actually using an algorithm that computes the time optimal trajectory from the start to the finish through all the gates, and by exploiting the full quadrotor dynamical model. So it’s really using not approximation, not point-mass model, not polynomial trajectories. The full quadrotor model, it takes a lot to compute, let me tell you. It takes like one hour or more, depending on the length of the trajectory, but it does a very good job, to a point that Gabriel Kocher, who works for the Drone Racing League, told us, “Ah, this is very interesting. So I didn’t know, actually, I can push even faster if I start braking before this gate.”
  275. </p><p>
  276. <strong>Bry:</strong> Yeah, it seems like it went the other way around. The optimal control strategy taught the human something.
  277. </p><p>
  278. <strong>Ackerman:</strong> Davide, do you have some questions for Adam?
  279. </p><p>
  280. <strong>Scaramuzza: </strong>Yes. So since you mentioned that basically, one of the scenarios or one of the applications that you are targeting, it is basically cinematography, where basically, you want to take amazing shots at the level of Hollywood, maybe producers, using your autonomous drones. And this is actually very interesting. So what I want to ask you is, in general, so going beyond cinematography, if you look at the performance of autonomous drones in general, it still looks to me that, for generic applications, they are still behind human pilot performance. I’m thinking of beyond cinematography and beyond the racing. I’m thinking of search and rescue operations and many things. So my question to Adam is, do you think that providing a higher level of agility to your platform could potentially unlock new use cases or even extend existing use cases of the Skydio drones?
  281. </p><p>
  282. <strong>Bry:</strong> You’re asking specifically about agility, flight agility, like responsiveness and maneuverability?
  283. </p><p>
  284. <strong>Scaramuzza:</strong> Yes. Yes. Exactly.
  285. </p><p>
  286. <strong>Bry: </strong>I think that it is— I mean, in general, I think that most things with drones have this kind of product property where the more you get better at something, the better it’s going to be for most users, and the more applications will be unlocked. And this is true for a lot of things. It’s true for some things that we even wish it wasn’t true for, like flight time. Like the longer the flight time, the more interesting and cool things people are going to be able to do with it, and there’s kind of no upper limit there. Different use cases, it might taper off, but you’re going to unlock more and more use cases the longer you can fly. I think that agility is one of these parameters where the more, the better, although I will say it’s not the thing that I feel like we’re hitting a ceiling on now in terms of being able to provide value to our users. There are cases within different applications. So for example, search and rescue, being able to fly through a really tight gap or something, where it would be useful. And for capturing cinematic videos, similar story, like being able to fly at high speed through some really challenging course, where I think it would make a difference. So I think that there are areas out there in user groups that we’re currently serving where it would matter, but I don’t think it’s like the— it’s not the thing that I feel like we’re hitting right now in terms of sort of the lowest-hanging fruit to unlock more value for users. Yeah.
  287. </p><p>
  288. <strong>Scaramuzza: </strong><a rel="noopener noreferrer" target="_blank"></a>So you believe, though, that in the long term, actually achieving human-level agility would actually be added value for your drones?<a href="#_msocom_2" rel="noopener noreferrer" target="_blank"></a>
  289. </p><p>
  290. <strong>Bry: </strong>Definitely. Yeah. I mean, one sort of mental model that I think about for the long-term direction of the products is looking at what birds can do. And the agility that birds have and the kinds of maneuvers that that makes them capable of, and being able to land in tricky places, or being able to slip through small gaps, or being able to change direction quickly, that affords them capability that I think is definitely useful to have in drones and would unlock some value. But I think the other really interesting thing is that the autonomy problem spans multiple sort of ranges of hierarchy, and when you get towards the top, there’s human judgment that I think is very— I mean, it’s crucial to a lot of things that people want to do with drones, and it’s very difficult to automate, and I think it’s actually relatively low value to automate. So for example, in a search and rescue mission, a person might have— a search and rescue worker might have very particular context on where somebody is likely to be stuck or maybe be hiding or something that would be very difficult to encode into a drone. They might have some context from a clue that came up earlier in the case or something about the environment or something about the weather.
  291. </p><p>
  292. And so one of the things that we think a lot about in how we build our products—we’re a company. We’re trying to make useful stuff for people, so we have a pretty pragmatic approach on these fronts— is basically— we’re not religiously committed to automating everything. We’re basically trying to automate the things where we can give the best tool to somebody to then apply the judgment that they have as a person and an operator to get done what they want to get done.
  293. </p><p>
  294. <strong>Scaramuzza: </strong>And actually, yeah, now that you mentioned this, I have another question. So I’ve watched many of your previous tech talks and also interacted with you guys at conferences. So what I learned—and correct me if I’m wrong—is that you’re using a lot of deep learning on the perception side, so as part of a 3D construction, semantic understanding. But it seems to me that on the control and planning side, you’re still relying basically on optimal control. And I wanted to ask you, so if this is the case, are you happy there with optimal control? We also know that Boston Dynamics is actually using only optimal control. Actually, they even claim they are not using any deep learning in control and planning. So is this actually also what you experience? And if this is the case, do you believe in the future, actually, you will be using deep learning also in planning and control, and where exactly do you see the benefits of deep learning there?
  295. </p><p>
  296. <strong>Bry:</strong> Yeah, that’s a super interesting question. So what you described at a high level is essentially right. So our perception stack— and we do a lot of different things in perception, but we’re pretty heavily using deep learning throughout, for semantic understanding, for spatial understanding, and then our planning and control stack is based on more conventional kind of optimal control optimization and full-state feedback control techniques, and it generally works pretty well. Having said that, we did— <a href="https://www.skydio.com/blog/deep-neural-pilot-skydio-2" rel="noopener noreferrer" target="_blank">we put out a blog post on this</a>. We did a research project where we basically did end-to-end— pretty close to an end-to-end learning system where we replaced a good chunk of the planning stack with something that was based on machine learning, and we got it to the point where it was good enough for flight demonstrations. And for the amount of work that we put into it, relative to the capability that we got, I think the results were really compelling. And my general outlook on this stuff— I think that the planning and controls is an area where the models, I think, provide a lot of value. Having a structured model based on physics and first principles does provide a lot of value, and it’s admissible to that kind of modeling. You can write down the mass and the inertia and the rotor parameters, and the physics of quadcopters are such that those things tend to be pretty accurate and tend to work pretty well, and by starting with that structure, you can come up with quite a capable system.
  297. </p><p>
  298. Having said that, I think that the— to me, the trajectory of machine learning and deep learning is such that eventually I think it will dominate almost everything, because being able to learn based on data and having these representations that are incredibly flexible and can encode sort of subtle relationships that might exist but wouldn’t fall out of a more conventional physics model, I think is really powerful, and then I also think being able to do more end-to-end stuff where subtle sort of second- or third-order perception impact— or second- or third-order perception or real world, physical world things can then trickle through into planning and control actions, I think is also quite powerful. So generally, that’s the direction I see us going, and we’ve done some research on this. And I think the way you’ll see it going is we’ll use sort of the same optimal control structure we’re using now, but we’ll inject more learning into it, and then eventually, the thing might evolve to the point where it looks more like a deep network in end-to-end.
  299. </p><p>
  300. <strong>Scaramuzza: </strong>Now, earlier you mentioned that you foresee that in the future, drones will be flying more agilely, similar to human pilots, and even in tight spaces. You mentioned passing through a narrow gap or even in a small corridor. So when you navigate in tight spaces, of course, ground effect is very strong. So do you guys then model these aerodynamic effects, ground effect— not just ground effect. Do you try to model all possible aerodynamic effects, especially when you fly close to structures?
  301. </p><p>
  302. <strong>Bry: </strong>It’s an interesting question. So today we don’t model— we estimate the wind. We estimate the local wind velocity—and we’ve actually found that we can do that pretty accurately—around the drone, and then the local wind that we’re estimating gets fed back into the control system to compensate. And so that’s kind of like a catch-all bucket for— you could think about ground effect as like a variation— this is not exactly how it works, obviously, but you could think about it as like a variation in the local wind, and our response times on those, like the ability to estimate wind and then feed it back into control, is pretty quick, although it’s not instantaneous. So if we had like a feed forward model where we knew as we got close to structures, “This is how the wind is likely to vary,” we could probably do slightly better. And I think you’re— what you’re pointing at here, I basically agree with. I think the more that you kind of try to squeeze every drop of performance out of these things you’re flying with maximum agility in very dense environments, the more these things start to matter, and I could see us wanting to do something like that in the future, and that stuff’s fun. I think it’s fun when you sort of hit the limit and then you have to invent better new algorithms and bring more information to bear to get the performance that you want.
  303. </p><p>
  304. On this— perhaps related. You can tell me. So you guys have done a lot of work with event cameras, and I think that you were— this might not be right, but from what I’ve seen, I think you were one of the first, if not the first, to put event cameras on quadcopters. I’d be very interested in— and you’ve probably told these stories a lot, but I still think it’d be interesting to hear. What steered you towards event cameras? How did you find out about them, and what made you decide to invest in research in them?
  305. </p><p>
  306. <strong>Scaramuzza: </strong>[crosstalk] first of all, let me explain <a href="https://spectrum.ieee.org/drone-with-event-camera-takes-first-autonomous-flight" target="_self">what an event camera is</a>. An event camera is a camera that has also pixels, but differently from a standard camera, an event camera only sends information when there is motion. So if there is no motion, then the camera doesn’t stream any information. Now, the camera does this through smart pixels, differently from a standard camera, where every pixel triggers information the same time at equidistant time intervals. In an event camera, the pixels are smart, and they only trigger information whenever a pixel detects motion. Usually, a motion is recorded as a change of intensity. And the stream of events happens asynchronously, and therefore, the byproduct of this is that you don’t get frames, but you only get a stream of information continuously in time with microsecond temporal resolution. So one of the key advantages of event cameras is that, basically, you can actually record phenomena that actually would take expensive high-speed cameras to perceive. But the key difference with a standard camera is that an event camera works in differential mode. And because it works in differential mode, by basically capturing per-pixel intensity differences, it consumes very little power, and it also has no motion blur, because it doesn’t accumulate photons over time.
  307. </p><p>
  308. So I would say that for robotics, what I— because you asked me how did I find out. So what I really, really saw, actually, that was very useful for robotics about event cameras were two particular things. First of all, the very high temporal resolution, because this can be very useful for safety, critical systems. And I’m thinking about drones, but also to avoid collisions in the automotive setting, because now we are also working in automotive settings as well. And also when you have to navigate in low-light environments, where using a standard camera with the high exposure times, you would actually be coping with a lot of motion blur that would actually cause a feature loss and other artifacts, like impossibility to detect objects and so on. So event cameras excel at this. No motion blur and very low latency. Another thing that could be also very interesting for especially lightweight robotics—and I’m thinking of micro drones—would be actually the fact that they consume also very little power. So little power, in fact, just to be on an event camera consumes one milliwatt, on average, because in fact, the power consumption depends on the dynamics of the scene. If nothing moves, then the power consumption is very negligible. If something moves, it is between one milliwatt or maximum 10 milliwatt.
  309. </p><p>
  310. Now, the interesting thing is that if you then couple event cameras with the spiking neuromorphic chips that also consume less than one milliwatt, you can actually mount them on a micro drones, and you can do amazing things, and we started working on it. The problem is that how do you train spiking networks? But that’s another story. Other interesting things where I see potential applications of event cameras are also, for example— now, think about your keyframe features of the Skydio drones. And here what you are doing, guys, is that basically, you are flying the drones around, and then you’re trying to send 3D positions and orientation of where you would like then [inaudible] to fly faster through. But the images have been captured while the drone is still. So basically, you move the drone to a certain position, you orient it in the direction where later you want it to fly, and then you record the position and orientation, and later, the drone will fly agilely through it. But that means that, basically, the drone should be able to relocalize fast with respect to this keyframe. Well, at some point, there are failure modes. We already know it. Failure modes. When the illumination goes down and there is motion blur, and this is actually something where I see, actually, the event camera could be beneficial. And then other things, of course [crosstalk]—
  311. </p><p>
  312. <strong>Ackerman: </strong>Do you agree with that, Adam?
  313. </p><p>
  314. <strong>Bry: </strong>Say again?
  315. </p><p>
  316. <strong>Ackerman: </strong>Do you agree, Adam?
  317. </p><p>
  318. Bry: I guess I’m— and this is why kind of I’m asking the question. I’m very curious about event cameras. When I have kind of the pragmatic hat on of trying to build these systems and make them as useful as possible, I see event cameras as quite complementary to traditional cameras. So it’s hard for me to see a future where, for example, on our products, we would be only using event cameras. But I can certainly imagine a future where, if they were compelling from a size, weight, cost standpoint, we would have them as an additional sensing mode to get a lot of the benefits that Davide is talking about. And I don’t know if that’s a research direction that you guys are thinking about. And in a research context, I think it’s very cool and interesting to see what can you do with just an event camera. I think that the most likely scenario to me is that they would become like a complementary sensor, and there’s probably a lot of interesting things to be done of using standard cameras and event cameras side by side and getting the benefits of both, because I think that the context that you get from a conventional camera that’s just giving you full static images of the scene, combined with an event camera could be quite interesting. You can imagine using the event camera to sharpen and get better fidelity out of the conventional camera, and you could use the event camera for faster response times, but it gives you less of a global picture than the conventional camera. So Davide’s smiling. Maybe I’m— I’m sure he’s thought about all these ideas as well.
  319. </p><p>
  320. <strong>Scaramuzza:</strong> Yeah. We have been working on that exact thing, combining event cameras with standard cameras, now for the past three years. So initially, when we started almost 10 years ago, of course, we only focused on event cameras alone, because it was intellectually very challenging. But the reality is that an event camera—let’s not forget—it’s a differential sensor. So it’s only complementary with standard camera. You will never get the full absolute intensity from out of an event camera. We show that you can actually reproduce the grayscale intensity up to an unknown absolute intensity with very high fidelity, by the way, but it’s only complementary to a standard camera, as you correctly said. So actually, you already mentioned everything we are working on and we have also already published. So for example, you mentioned unblurring blurry frames. This also has already been done, not by my group, but a group of Richard Hartley at the University of Canberra in Australia. And what we also showed in my group last year is that you can also generate super slow motion video by combining an event camera with a standard camera, by basically using the events in the blind time between two frames to interpolate and generate arbitrary frames at any arbitrary time. And so we show that we could actually upsample a low frame rate video by a factor of 50, and this with only consuming one-fortieth of the memory footprint. And this is interesting, because—
  321. </p><p>
  322. <strong>Bry: </strong>Do you think from— this is a curiosity question. From a hardware standpoint, I’m wondering if it’ll go the next— go even a bit further, like if we’ll just start to see image sensors that do both together. I mean, you could certainly imagine just putting the two pieces of silicon right next to each other, or— I don’t know enough about image sensor design, but even at the pixel level, you could have pixel— like just superimposed on the same piece of silicon. You could have event pixels next to standard accumulation pixels and get both sets of data out of one sensor.
  323. </p><p>
  324. <strong>Scaramuzza: </strong>Exactly. So both things have been done. So—
  325. </p><p>
  326. <strong>Bry:</strong> [crosstalk].
  327. </p><p>
  328. <strong>Scaramuzza:</strong> —the latest one I described, we actually installed an event camera side by side with a very high-resolution standard camera. But there is already an event camera called DAVIS that outputs both frames and events between the frames. This has been available already since 2016, but at the very low resolution, and only last year it reached the VGA resolution. That’s why we are combining—
  329. </p><p>
  330. <strong>Bry: </strong>That’s like [crosstalk].
  331. </p><p>
  332. <strong>Scaramuzza: </strong>—an event camera with a high-resolution standard camera, because want to basically see what we could possibly do one day when these event cameras are also available [inaudible] resolution together with a standard camera overlaid on the same pixel array. But there is a good news, because you also asked me another question about cost of this camera. So the price, as you know very well, drops as soon as there is a mass product for it. The good news is that Samsung has now a product called <a href="https://www.samsung.com/se/smartthings/camera/smart-things-vision-gp-u999gteeaea/" rel="noopener noreferrer" target="_blank">SmartThings Vision Sensor</a> that basically is conceived for indoor home monitoring, so to basically detect people falling at home, and this device automatically triggers an emergency call. So this device is using an event camera, and it costs €180, which is much less than the cost of an event camera when you buy it from these companies. It’s around €3,000. So that’s a very good news. Now, if there will be other bigger applications, we can expect that the price would go down a lot, below even $5. That’s what these companies are openly saying. I mean, what I expect, honestly, is that it will follow what we experience with the time-of-flight cameras. I mean, the first time-of-flight cameras cost around $15,000, and then 15 years later, they were below $150. I’m thinking of the first Kinect tool that was time-of-flight and so on. And now we have them in all sorts of smartphones. So it all depends on the market.
  333. </p><p>
  334. <strong>Ackerman:</strong> Maybe one more question from each of you guys, if you’ve got one you’ve been saving for the end.
  335. </p><p>
  336. <strong>Scaramuzza: </strong>Okay. The very last question [inaudible]. Okay. I ask, Adam, and then you tell me if you want to answer or rather not. It’s, of course, about defense. So the question I prepared, I told Evan. So I read in the news that <a href="https://www.skydio.com/blog/skydio-raises-230-million-series-e-funding-round" rel="noopener noreferrer" target="_blank">Skydio donated 300K of equivalent of drones to Ukraine</a>. So my question is, what are your views on military use or dual use of quadcopters, and what is the philosophy of Skydio regarding defense applications of drones? I don’t know if you want to answer.
  337. </p><p>
  338. <strong>Bry:</strong> Yeah, that’s a great question. I’m happy to answer that. So our mission, which we’ve talked about quite publicly, is to make the world more productive, creative, and safe with autonomous flight. And the position that we’ve taken, and which I feel very strongly about, is that working with the militaries of free democracies is very much in alignment and in support of that mission. So going back three or four years, we’ve been working with the US Army. We won the Army’s <a href="https://www.skydio.com/blog/skydio-selected-sole-platform-for-us-army-srr" rel="noopener noreferrer" target="_blank">short-range reconnaissance program</a>, which was essentially a competition to select the official kind of soldier-carried quadcopter for the US Army. And the broader trend there, which I think is really interesting and in line with what we’ve seen in other technology categories, is basically the consumer and civilian technology just raced ahead of the traditional defense systems. The military has been using drones for decades, but their soldier-carried systems were these multi-hundred-thousand-dollar things that are quite clunky, quite difficult to use, not super capable. And our products and other products in the consumer world basically got to the point where they had comparable and, in many cases, superior capability at a fraction of the cost.
  339. </p><p>
  340. And I think— to the credit of the US military and other departments of defense and ministries of defense around the world, I think people realized that and decided that they were better off going with these kind of dual-use systems that were predominantly designed and scaled in civilian markets, but also had defense applicability. And that’s what we’ve done as a company. So it’s essentially our consumer civilian product that’s extended and tweaked in a couple of ways, like the radios, some of the security protocols, to serve defense customers. And I’m super proud of the work that we’re doing in Ukraine. So we’ve donated $300,000 worth of systems. At this point, we’ve sold way, way more than that, and we have hundreds of systems in Ukraine that are being used by Ukrainian defense forces, and I think that’s good important work. The final piece of this that I’ll say is we’ve also decided and we aren’t doing and we won’t put weapons on our drones. So we’re not going to build actual munition systems, which I think is— I don’t think there’s anything ethically wrong with that. Ultimately, militaries need weapons systems, and those have an important role to play, but it’s just not something that we want to do as a company, and is kind of out of step with the dual-use philosophy, which is really how we approach these things.
  341. </p><p>
  342. I have a question that I’m— it’s aligned with some of what we’ve talked about, but I’m very interested in how you think about and focus the research in your lab, now that this stuff is becoming more and more commercialized. There’s companies like us and others that are building real products based on a lot of the algorithms that have come out of academia. And in general, I think it’s an incredibly exciting time where the pace of progress is accelerating, there’s more and more interesting algorithms out there, and it seems like there’s benefits flowing both ways between research labs and between these companies, but I’m very interested in how you’re thinking about that these days.
  343. </p><p>
  344. <strong>Scaramuzza: </strong>Yes. It’s a very interesting question. So first of all, I think of you also as a robotics company. And so what you are demonstrating is what [inaudible] of robotics in navigation and perception can do, and the fact that you can do it on a drone, it means you can also do it on other robots. And that actually is a call for us researchers, because it pushes us to think of new venues where we can actually contribute. Otherwise, it looks like everything has been done. And so what, for example, we have been working on in my lab is trying to— so towards the goal of achieving human-level performance, how do humans do navigate? They don’t do ultimate control and geometric 3D reconstruction. We have a brain that does everything end to end, or at least with the [inaudible] subnetworks. So one thing that we have been playing with has been now deep learning for already now, yeah, six years. But in the last two years, we realized, actually, that you can do a lot with deep networks, and also, they have some advantages compared to the usual traditional autonomy architectures— architecture of autonomous robots. So what is the standard way to control robots, be it flying or ground? You have [inaudible] estimation. They have a perception. So basically, special AI, semantic understanding. Then you have localization, path planning, and control.
  345. </p><p>
  346. Now, all these modules are basically communicating with one another. Of course, you want them to communicate in a smart way, because you want to also try to plan trajectories that facilitate perception, so you have no motion blur while you navigate, and so on. But somehow, they are always conceived by humans. And so what we are trying to understand is whether you can actually replace some of these blocks or even all blocks and up to each point with deep networks, which begs the question, can you even train a policy end to end that takes as input some sort of sensory, like either images or even sensory obstructions, and outputs control commands of some sort of output abstraction, like [inaudible] or like waypoints? And what we found out is that, yes, this can be done. Of course, the problem is that for training these policies, you need a lot of data. And how do you generate this data? You can not fly drones in the real world. So we started working more and more in simulation. So now we are actually training all these things in simulation, even for forests. And thanks to the video game engines like Unity, now you can download a lot of these 3D environments and then deploy your algorithms there that train and teach a drone to fly in just a bunch of hours rather than flying and crashing drones in the real world, which is very costly as well. But the problem is that we need better simulators.
  347. </p><p>
  348. We need better simulators, and I’m not just thinking of for the realism. I think that one is actually somewhat solved. So I think we need the better physics like aerodynamic effects and other non-idealities. These are difficult to model. So we are also working on these kind of things. And then, of course, another big thing would be you would like to have a navigation policy that is able to abstract and generalize to different type of tasks, and possibly, at some point, even tell your drone or robot a high-level description of the task, and the drone or the robot would actually accomplish the task. That would be the dream. I think that the robotics community, we are moving towards that.
  349. </p><p>
  350. <strong>Bry:</strong> Yeah. I agree. I agree, and I’m excited about it.
  351. </p><p>Ackerman: We’ve been talking with Adam Bry from Skydio and Davide Scaramuzza from the University of Zürich about agile autonomous drones, and thanks again to our guests for joining us. For <em>Chatbot</em> and <em>IEEE Spectrum</em>, I’m Evan Ackerman.</p>]]></description><pubDate>Tue, 03 Oct 2023 10:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/autonomous-drones</guid><category>Skydio</category><category>University of zurich</category><category>Drones</category><category>Event cameras</category><category>Simulations</category><category>Type:podcast</category><category>Robots</category><category>Chatbot podcast</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://assets.rbl.ms/36416885/origin.jpg"></media:content></item><item><title>Creating Domestic Robots That Really Help</title><link>https://spectrum.ieee.org/domestic-robots</link><description><![CDATA[
  352. <img src="https://spectrum.ieee.org/media-library/image.jpg?id=36416762&width=980"/><br/><br/><p class="shortcode-media shortcode-media-youtube">
  353. <span class="rm-shortcode" data-rm-shortcode-id="90b50eec7875be630b223030476bf436" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/QMjEtL9i1zI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  354. <small class="image-media media-caption" placeholder="Add Photo Caption...">Episode 2: How Labrador and iRobot Create Domestic Robots That Really Help</small>
  355. <small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://youtu.be/QMjEtL9i1zI" target="_blank"><br/>
  356. </a></small>
  357. </p><p>
  358. <strong>Evan Ackerman: </strong>I’m Evan Ackerman, and welcome to ChatBot, a new podcast from<em> IEEE Spectrum</em> where robotics experts interview each other about things that they find fascinating. On this episode of ChatBot, we’ll be talking with Mike Dooley and Chris Jones about useful robots in the home. <a href="https://labradorsystems.com/about/" target="_blank">Mike Dooley is the CEO and co-founder of Labrador Systems</a>, the startup that’s developing an assistive robot in the form of a sort of semi-autonomous mobile table that can help people move things around their homes. Before founding Labrador, Mike led the development of <a href="https://spectrum.ieee.org/review-evolution-robotics-mint-sweeper" target="_self">Evolution Robotics’ innovative floor-cleaning robots</a>. And when <a href="https://spectrum.ieee.org/irobot-sweeps-up-evolution-robotics-for-74-million" target="_self">Evolution was acquired by iRobot in 2012</a>, Mike became iRobot’s VP of product and business development. Labrador Systems is getting ready to launch its first robot, the <a href="https://spectrum.ieee.org/labrador-systems-robot" target="_self">Labrador Retriever</a>, in 2023. <a href="https://www.linkedin.com/in/cjonesrobotics" target="_blank">Chris Jones is the chief technology officer at iRobot</a>, which is arguably one of the most successful commercial robotics companies of all time. Chris has been at iRobot since 2005, and he spent several years as a senior investigator at iRobot research working on some of <a href="https://spectrum.ieee.org/video-tour-all-of-irobots-coolest-stuff" target="_self">iRobot’s more unusual and experimental projects</a>. iRobot Ventures is one of the investors in Labrador Systems. Chris, you were doing some interesting stuff at iRobot back in the day too, that I think a lot of people may not know how diverse iRobot’s robotics projects were.<br/>
  359. </p><p>
  360. <strong>Chris Jones:</strong> I think iRobot as a company, of course, being around since 1990, has done all sorts of things. Toys, commercial robots, consumer, military, industrial, all sorts of different things. But yeah, myself in particular, I spent the first seven, eight years of my time at iRobot doing a lot of super fun kind of far-out-there research types of projects, a lot of them funded by places like DARPA and working with some great academic collaborators, and of course, a whole crew of colleagues at iRobot. But yeah, <a href="https://spectrum.ieee.org/irobot-developing-inflatable-robot-arms-inflatable-robots" target="_self">some of those</a> were ranged from completely squishy robots to robot arms to robots that could climb mountainsides to robots under the water, all sorts of different fun, useful, but fun, of course, and really challenging, which makes it fun, different types of robot concepts.
  361. </p><p>
  362. <strong>Ackerman:</strong> And those are all getting incorporated to the next generation Roomba, right?
  363. </p><p>
  364. <strong>Jones:</strong> I don’t know that I can comment on—
  365. </p><p>
  366. <strong>Ackerman: </strong>That’s not a no. Yeah. Okay. So Mike, I want to make sure that people who aren’t familiar with Labrador get a good understanding of what you’re working on. So can you describe kind of Labrador’s robot, what it does and why it’s important?
  367. </p><p>
  368. <strong>Mike Dooley:</strong> Yeah. So Labrador, we’re developing <a href="https://spectrum.ieee.org/labrador-systems-robot" target="_self">a robot called the Retriever</a>, and it’s really designed as an extra pair of hands for individuals who have some issue either with pain, a health issue or injury that impacts their daily activities, particularly in the home. And so this is a robot designed to help people live more independently and to augment their abilities and give them some degree of autonomy back where they’re fighting that with the issue that they’re facing. And the robot, I think it’s been— after previewing its CES, it has been called a self-driving shelf. It’s designed to be really a mobile platform that’s about the size of a side table but has the ability to carry things as large as a laundry basket or set the dinner and plates on it, automatically navigates from place to place. It raises up to go up to countertop height when you’re by the kitchen sink and lowers down when you’re by your armchair. And it has the ability to retrieve too. So it’s a cross between robots that are used in warehousing to furniture mixed together to make something that’s comfortable and safe for the environment, but really is really meant to help folks where they have some difficulty moving themselves. This is meant to help them give that some degree of that independence back, as well as extend the impact of it for caregivers.
  369. </p><p>
  370. <strong>Ackerman: </strong>Yeah, I thought that was a fantastic idea when I first saw it at <a href="https://www.ces.tech/" target="_blank">CES</a>, and I’m so glad that you’ve been able to continue working on it. And especially with some support from folks like iRobot, right? Chris, iRobot is an investor in Labrador?
  371. </p><p>
  372. <strong>Jones:</strong> Correct. Through iRobot Ventures, we’re an early investor in Labrador. Of course, where that means, and we continue to be super excited about what they’re doing. I mean, for us, anyone who has great ideas for how robots can help people, in particular, assist people in their home with independent living, etc., I think is something we strongly believe is going to be a great application for robots. And when making investments, I’ll just add, of course, that earliest stage, a lot of it is about the team, right? And so Mike and the rest of his team are super compelling, right? That paired with a vision, that’s something that we believe is a great application for robots. It makes it an easy decision, right, to say there’s someone we’d like to support. So we love seeing their progress.
  373. </p><p>
  374. <strong>Ackerman:</strong> Yeah, me too.
  375. </p><p>
  376. <strong>Dooley: </strong>And we appreciate your support very much. So yeah.
  377. </p><p>
  378. <strong>Ackerman: </strong>All right, so what do you guys want to talk about? Mike, you want to kick things off?
  379. </p><p>
  380. <strong>Dooley: </strong>I can lead off. Yeah, so in full disclosure, at some point in my life, I was-- Chris, what’s the official name for an iRobot employee? I forgot what they came up with. It’s not iRoboteer, is it?
  381. </p><p>
  382. <strong>Jones:</strong> iRoboteer. Yeah.
  383. </p><p>
  384. <strong>Dooley: </strong>Okay, okay. All right, so I was an iRoboteer in my past life and crossed over with Chris for a number of years. And I know they’ve renovated the building a couple times now, but these products you mentioned or the robots you mentioned at the beginning, a lot of them are in display <a href="https://experience.irobot.com/irobot-education-virtual-museum-tours-2023" target="_blank">in a museum</a>. And so I think my first question to Chris was, can you think of one of those, either that you worked on or maybe it didn’t, but you go, “Man, this should have taken off or this should have been this--” or it should have or you wished it would have. It would have been great if one of those that’s in there because there’s a lot, so.
  385. </p><p>
  386. <strong>Jones:</strong> Yes, there are a lot. You’re right. We have a museum, and it has been renovated in the last couple years, Mike, so you should come back and visit and check out the new updated museum. How would I answer that? There are so many things in there. I would say one that I have some sentimentality toward, and I think it holds some really compelling promise, even though at least to date, it hasn’t gone anywhere outside of the museum, Evan, is related to the squishy robots I was talking about. And in my mind, in one of the key challenges in unlocking future value in robots, and in particular, in autonomous robots, for example, in the home, is manipulation, is physical manipulation of the environment in the home. And Mike and Labrador are doing a little bit of this, right, by being able to maneuver and pick up, carry, drop off some things around the home. But the idea of a robot that’s able to physically pick up, grasp objects, pick them up off the floor, off a counter, open and close doors, all of those things is kind of the Holy Grail, right, if you can cost-effectively and robustly do that. In the home, there’s all sorts of great applications for that. <a href="https://spectrum.ieee.org/universal-jamming-gripper" target="_self">And one of those research projects that’s in the museum was actually something called the Jamming Gripper</a>. Mike, I don’t know if you remember seeing that at all, but this takes me back. And Evan, actually, I’m sure there are some IEEE stories and stuff back in the day from this. But this was an idea of a very compliant, it’s a soft manipulator. It’s not a hand. It’s actually very close to imagining a very soft membrane that’s filled with coffee grounds. So imagine a bag of coffee, right? Very soft and compliant.
  387. </p><p>
  388. But vacuum-packed coffee, you pull a vacuum on that bag. It turns rigid in the shape that it was in. It’s like a brick, which is a great concept for thinking about robot manipulation. That’s one idea. We had spent some research time with some folks in academia, had built a huge number of prototypes, and I still feel like there’s something there. There’s a really interesting concept there that can help with that more general purpose manipulation of objects in the home. So Mike, if you want to talk to us about licensing, maybe we can do that for Labrador with all your applications.
  389. </p><p>
  390. <strong>Dooley: </strong>Yeah. Actually, that’s what you should add. It would probably increase your budget dramatically, but you should add live demonstrations to the museum. See if you can have projects to get people to bring some of those back. Because I’m sure I saw it. I never knew it was doing that.
  391. </p><p>
  392. <strong>Jones:</strong> I mean, maybe we can continue this. There might be a little bit of a thread to continue that question into—the first one that came to my mind, Mike, when I was thinking about what to ask. And it’s something I have a lot of admiration or respect for you and how you do your job, which is you’re super good at engaging and listening to users kind of in their context to understand what their problems are. Such that you can best kind of articulate or define or ideate things that could help them address problems that they encounter in their everyday life. And that then allows you kind of as a leader, right, to use that to motivate quick prototype development to get the next level of testing or validation of what if this, right? And those things may or may not involve duct tape, right, involve some very crude things that are trying to elicit kind of that response or feedback from a user in terms of, is this something that would be valuable to you in overcoming some challenges that I’ve observed you having, let’s say, in your home environment? So I’m curious, Mike, how do you think about that process and how that translates into shaping a product design or the identification of an opportunity? I’m curious, maybe what you’ve learned through Labrador. I know you spent a lot of time in people’s homes to do exactly that. So I’m curious, how do you conduct that work? What are you looking for? How does that guide your development process?
  393. </p><p>
  394. <strong>Dooley: </strong>The word that you talk about is customer empathy, is are you feeling their pain? Are you understanding their need, and how are you connecting with it? And my undergrad’s in psychology, so I always was interested in what makes people think the way they do. <a rel="noopener noreferrer" target="_blank"></a>I remember a iRobot study going into a home. And we were in the last day testing with somebody and a busy mom. And we’re testing <a href="https://www.irobot.com/en_US/braava.html" rel="noopener noreferrer" target="_blank">Braava Jet</a>. It’s a little robot that iRobot sells, that it’s really good for places with tight spaces for spraying and scrubbing floors, like kitchens and bathrooms. And the mom said, she almost said it was exhaustion, is that— I said, “What is it?” She says, “Does this do as good of a job as you could do?” And I think most people from iRobot would admit, “No. Can I match what the grease power, all the effort and everything I can put into this?” And she says, “But at least I can set this up, hit a button, and I can go to sleep. And at least it’s getting the job done. It’s doing something, and it gives me my time back.” And when you hear that, people go, “Well, Roomba is just something that cleans for people or whatever.” Like, “No. Roomba gives people their time back.” And once you’re on that channel, then you start thinking about, “Okay, what can we do more with the product that does that, that’s hitting that sort of core thing?” So yeah, and I think having the humbleness to not build a product you want, build it to the need, and then also the humbleness about where you can meet that need and where you can’t. Because robotics is hard, and we can’t make Rosey yet and things like that, so.
  395. </p><p>
  396. <strong>Ackerman:</strong> Mike, I’m curious, did you have to make compromises like that? Is there an example you could give with Labrador?
  397. </p><p>
  398. <strong>Dooley: </strong>Oh, jeez, all the— yeah. I mean, no, Labrador is perfect. No, I mean, we go through that all the time. I think on Labrador, no, we can’t do everything people want. What you’re trying to say, is it— I think there’s different languages of minimum viable product or good enough. There was somebody at Amazon used the term— I’m going to blank on it. It was like wonderful enough or something, or they have a nicer—
  399. </p><p>
  400. <strong>Jones:</strong> Lovable?
  401. </p><p>
  402. <strong>Dooley: </strong>Lovable. Yeah, lovable enough or something. And I think that that’s what you have to remember, is like, so on one hand, you have to be— you have to sort of have this open heart that you want to help people. And the other point, you have to have a really tight wallet because you just can’t spend enough to meet everything that people want. And so just a classic example is, Labrador goes up and down a certain amount of height. And people’s cabinets and someone in a wheelchair, they would love it if we would go up to the upper cabinets above the kitchen sink or other locations. And when you look at that, mechanically we can, but that then creates-- there’s product realities about stability and tilt testing. And so we have to fit those. Chris knows that well with <a href="https://robotsguide.com/robots/ava" rel="noopener noreferrer" target="_blank">Ava</a>, for instance, is how heavy the base is for every inch you raise the mass above a certain amount. And so we have to make a limit. You have to say, “Hey, here’s the envelope. We’re going to do this to this, or we’re going to carry this much because that’s as much as we could deliver with this sort of function.” And then, is that lovable enough? Is that is that rewarding enough to people? And I think that’s the hard [inaudible], is that you have to do these deliveries within constraints. And I think sometimes when I’m talking to folks, they’re either outside robotics or they’re very much on the engineering side and not thinking about the product. They tend to think that you have to do everything. And it’s like that’s not how product development works, is you have to do just the critical first step, because then that makes this a category, and then you can do the next one and the next one. I think it brings to mind— Roomba has gone through an incredible evolution of what its functions were and how it worked and its performance since the very first version and to what Chris and team offer now. But if they tried to do the version today back then, they wouldn’t have been able to achieve it. And others fail because they probably went to the wrong angle. And yeah.
  403. </p><p>
  404. <strong>Jones: </strong>Evan, I think you asked if there are anything that was operating under constraints. I think product development in general, I presume, but certainly, robotics is all about constraints. It’s how do you operate within those? How do you understand where those boundaries are and having to make those calls as to— how are you going to have to— how are you going to decide to constrain your solution, right, to make sure that it’s something that’s feasible for you to do, right? It’s meeting a compelling need. It’s feasible for you to do. You can robustly deliver it. Trying to get that entire equation to work means you do have to reckon with those constraints kind of across the board to find the right solve. Mike, I’m curious. You do your user research, you have that customer empathy, you’ve perhaps worked through some of these surprising challenges that I’m sure you’ve encountered along the way with Labrador. You ultimately get to a point that you’re able to do pilots in homes, right? You’re actually now this— maybe the Duct Tape is gone or it’s at least hidden, right? It’s something that looks and feels more like a product and you’re actually getting into some type of more extended pilot of the product or idea of the product in users’ homes. What are the types of things you’re looking to accomplish with those pilots? Or what have you learned when you go from, “All right, I’ve been watching this user in their home with those challenges. So now I’m actually leaving something in their home without me being there and expecting them to be able to use it”? What’s the benefit or the learnings that you encounter in conducting that type of work?
  405. </p><p>
  406. <strong>Dooley: </strong>Yeah, it’s a weird type of experiment and there’s different schools of thought of how you do stuff. Some people want to go in and research everything to death and be a fly on the wall. And we went through this— I won’t say the source of it. A program we had to go through because of some of the— because of some of the funding that we’re getting from another project. And the quote in the beginning, they put up a slide that I think it’s from Steve Jobs. I’m sure I’m going to butcher it, that people don’t know what they want until I show them or something. I forget what the exact words are. And they were saying, “Yeah, that’s true for Steve Jobs, but for you, you can really talk to the customer and they’re going to tell you what they need.” I don’t believe that.
  407. </p><p>
  408. <strong>Jones:</strong> They need a faster horse, right? They don’t need a car.
  409. </p><p>
  410. <strong>Dooley: </strong>Yeah, exactly.
  411. </p><p>
  412. <strong>Jones:</strong> They’re going to tell you they need a faster horse.
  413. </p><p>
  414. <strong>Dooley:</strong> Yeah, so I’m in the Steve Jobs camp and on that. <a rel="noopener noreferrer" target="_blank"></a>And it’s not because people aren’t intelligent. It’s just that they’re not in that world of knowing what possibilities you’re talking about. So I think there is this sort of soft skill between, okay, listen to their pain point. What is that difficulty of it? You’ve got a hypothesis to say, “Okay, out of everything you said, I think there’s an overlap here. And now I want to find out—” and we did that. We did that in the beginning. We did different ways of explaining the concept, and then the first level we did was just explain it over the phone and see what people thought of it and almost test it neutrally. Say, “Hey, here’s an idea.” And then, “Oh, here’s an idea like Roomba and here’s an idea like Alexa. What do you like or dislike?” Then we would actually build a prototype that was remote-controlled and brought it in their home, and now we finally do the leave-behind. And the whole thing is it’s like how to say it. It’s like you’re sort of releasing it to the world and we get out of the way. The next part is that it’s like letting a kid go and play soccer on their own and you’re not yelling or anything or don’t even watch. You just sort of let it happen. And what you’re trying to do is organically look at how are people— you’ve created this new reality. How are people interacting with it? And what we can see is the robots, they won’t do this in the future, but right now they talk on Slack. So when they send it to the kitchen, I can look up and I can see, “Hey, user one just sent it to the kitchen, and now they’re sending it to their armchair, and they’re probably having an afternoon snack. Oh, they sent it to the laundry room. Now they sent it over to the closet. They’re doing the laundry.” And the thing for us was just watching how fast were people adopting certain things, and then what were they using it for. And the striking thing that was—
  415. </p><p>
  416. <strong>Jones: </strong>That’s interesting.
  417. </p><p>
  418. <strong>Dooley: </strong>Yeah, go ahead.
  419. </p><p>
  420. <strong>Jones:</strong> I was just going to say, I mean, that’s interesting because I think I’m sure it’s very natural to put the product in someone’s home and kind of have a rigid expectation of, “No, no, this is how you use it. No, no, you’re doing it wrong. Let me show you how you use this.” But what you’re saying is it’s almost, yeah, you’re trying your best to solve their need here, but at some point you kind of leave it there, and now you’re also back into that empathy mode. It’s like, “Now with this tool, how do you use it?” and see kind of what happens.
  421. </p><p>
  422. <strong>Dooley:</strong> I think you said it in a really good way, is that you’ve changed this variable in the experiment. You’ve introduced this, and now you go back to just observing, just hearing what they’re— just watching what they’re doing with it, being as in-intrusive as possible, which is like, “We’re not there anymore.” Yeah, the robot’s logging it and we can see it, but it’s just on them. And we’re trying to stay out of the process and see how they engage with it. And that’s sort of like the thing that— we’ve shared it before, but we were just seeing that people were using it 90 to a 100 times a month, especially after the first month. It was like, we were looking at just the steady state. Would this become a habit or routine, and then what were they using it for?
  423. </p><p>
  424. <strong>Jones:</strong> So you’re saying when you see that, you have kind of a data point of one or a small number, but you have such a tangible understanding of the impact that this seems to be having, that you as an entrepreneur, right, that gives you a lot of confidence that may not be visible to whatever people that are outside the walls just trying to look at what you’re doing in the business. They see one data point, which is harder to grapple with, but you, being that close and understanding in that connection between what the product is doing and the needs that that gives you or the team a substantial confidence boost, right, is to, “This is working. We need to scale it. We have to show that this ports to other people in their homes, etc.,” but it gives you that confidence.
  425. </p><p>
  426. <strong>Dooley: </strong>Yeah, and then when we take the robots away, because we only have so many and we rotate them, getting the guilt trip emojis two months later from people, “I miss my robot. When are you going to build a new one?” and all that and stuff. So—
  427. </p><p>
  428. <strong>Jones:</strong> Do people name the robots?
  429. </p><p>
  430. <strong>Dooley:</strong> Yeah. They immediately do that and come up with creative names for it. One was called Rosey, naturally, but others was like— I’m forgetting the name she called it. It was inspired by a science fiction on an artificial AI companion and things. And it was just quite a bit of just different angles of— because she saw this as her assistant. She saw this as sort of this thing. But yeah, so I think that, again, for a robot, what you can see in the design is the classic thing at CES is to make a robot with a face and arms that doesn’t really do anything with those, but it pretends to be humanoid or human-like. And so we went the entire other route with this. And the fact that people then still relate to it that way, it means that-- we’re not trying to be cold or dispassionate. We’re just really interested in, can they get that value? Are they reacting to what the robot is doing, not to what the sort of halo that you sort of dressed it up as for that?
  431. </p><p>
  432. <strong>Jones: </strong><a rel="noopener noreferrer" target="_blank"></a>Yeah, I mean, as you know, like with Roomba or Braava and things like that, it’s the same thing. People project anthropomorphism or project that personality onto them, but that’s not really there, right, in a strong way. So yeah.
  433. </p><p>
  434. <strong>Dooley:</strong> <a rel="noopener noreferrer" target="_blank"></a>Yeah, no, and it’s weird. And it’s something they do with robots in a weird way that they don’t-- people don’t name their dishwasher usually or something. But no, I would have-
  435. </p><p>
  436. <strong>Jones: </strong><a rel="noopener noreferrer" target="_blank"></a>You don’t?
  437. </p><p>
  438. <strong>Dooley:</strong> <a rel="noopener noreferrer" target="_blank"></a>Yeah, [inaudible]. I did for a while. The stove got jealous, and then we had this whole thing when the refrigerator got into it.
  439. </p><p>
  440. <strong>Ackerman:</strong> <a rel="noopener noreferrer" target="_blank"></a>I’ve heard anecdotally that maybe this was true with PackBots. I don’t know if it’s true with Roombas. That people want their robot back. They don’t want you to replace their old robot with a new robot. They want you to fix the old robot and have that same physical robot. It’s that lovely connection.
  441. </p><p>
  442. <strong>Jones:</strong> <a rel="noopener noreferrer" target="_blank"></a>Yeah, certainly, PackBot on kind of the military robot side for bomb disposal and things like that, you would directly get those technicians who had a damaged robot, who they didn’t want a new robot. They wanted this one fixed, right? Because again, they anthropomorphize or there is some type of a bond there. And I think that’s been true with all of the robots, right? It’s something about the mobility, right, that embodies them with some type of a-- people project a personality on it. So they don’t have to be fancy and have arms and faces necessarily for people to project that on them. So that seems to be a common trait for any autonomously mobile platform.
  443. </p><p>
  444. <strong>Ackerman:</strong> Yeah. Mike, it was interesting to hear you say that. You’re being very thoughtful about that, and so I’m wondering if Chris, you can address that a little bit too. I don’t know if they do this anymore, but for a while, robots would speak to you, and I think it was a female voice that they had if they had an issue or something or needed to be cleaned. And that I always found to be an interesting choice because it’s sort of like the company is now giving this robot a human characteristic that’s very explicit. And I’m wondering how much thought went into that, and has that changed over the years about how much you’re willing to encourage people to anthropomorphize?
  445. </p><p>
  446. <strong>Jones: </strong>I mean, it’s a good question. I mean, that’s evolved, I would say, over the years, from not so much to there’s more of kind of a vocalization coming from the robot for certain scenarios. It is an important part. Some users, that is a primary way of interacting. I would say more of that type of feedback these days comes through more of kind of the mobile experience through the app to give both the feedback, additional information, actionable next steps. If you need to empty the dustbin or whatever it is, that that’s just a richer place to put that and a more accepted or common way for that to happen. So I don’t know, I would say that’s the direction things have trended, but I don’t know that that’s— that’s not because I don’t believe that we’re not trying to humanize the robot itself. It’s just more of a practical place where people these days will expect. It’s almost like Mike was saying about the dishwasher and the stove, etc. If everything is trying to talk to you like that or kind of project its own embodiment into your space, it could be overwhelming. So I think it’s easier to connect people at the right place and the right time with the right information, perhaps, if it’s through the mobile experience though.
  447. </p><p>
  448. But it is. That human-robot interaction or that experience design is a nuanced and tricky one. I’m certainly not an expert there myself, but it’s hard to find that right balance, that right mix of, what do you ask or expect of the user versus what do you assume or don’t give them an option? Because you also don’t want to overload them with too much information or too many options or too many questions, right, as you try to operate the product. So sometimes you do have to make assumptions, make defaults, right, that maybe can be changed if there’s really a need to that might require more digging. And Mike, I was curious. That was a question I had for you, was you have a physically, a meaningfully-sized product that’s operating autonomously in someone’s home, right?
  449. </p><p>
  450. <strong>Dooley:</strong> Yes.
  451. </p><p>
  452. <strong>Jones:</strong> Roomba can drive around and will navigate, and it’s a little more expected that we might bump into some things as we’re trying to clean and clean up against walls or furniture and all of that. Then it’s small enough that that isn’t an issue. How do you design for a product of the size that you’re working on, right? What went into kind of human-robot interaction side of that to allow for people who need to use this in their home that are not technologists, but they can take advantage of the— that can take advantage of the great value, right, that you’re trying to deliver for them. But it’s got to be super simple. How did you think about that HRI kind of design?
  453. </p><p>
  454. <strong>Dooley:</strong> There’s a lot wrapped into that. I think the bus stop is the first part of it. What’s the simplest way that they can command in a metaphor? Like everybody can relate to armchair or front door, that sort of thing. And so that idea that the robot just goes to these destinations is super simplifying. People get that. It’s almost now at a nanosecond how fast they get that and that metaphor. So that was one of it. And then you sort of explain the rules of the road of how the robot can go from place to place. It’s got these bus routes, but they’re elastic and that it can go around you if needed. But there’s all these types of interactions. Okay, we figured out what happens when you’re coming down the hall and the robot’s coming down. Let’s say you’re somebody else and they just walk towards each other. And I know in hospitals, the robot’s programmed to go to the side of the corridor. There’s no side in a home. That’s the stuff. So those are things that we still have to iron out, but there’s timeouts and there’s things of—that’s where we’ll be—we’re not doing it yet, but it’d be great to recognize that’s a person, not a closed door or something and respond to it. So right now, we have to tell the users, “Okay, it’ll spin a time to make sure you’re there, but then it’ll give up. And if you really wanted to, you could tell it to go back from your app. You could get out of the way if you want, or you could stop it by doing this.”
  455. </p><p>
  456. And so that’ll get refined as we get to the market, but those interactions, yeah, you’re right. You have this big robot that’s coming down. And one of the surprising things was it’s not just people. One of the women in the pilot had a Border Collie, and their Border Collie’s, by instinct, bred to herd sheep. So it would hear the robot. The robot’s very quiet, but she would command it. It would hear the robot coming down the hall and it would put its paw out to stop it, and that became it’s game. It started herding the robot. And so it’s really this weird thing, this metaphor you’re getting at.
  457. </p><p>
  458. <strong>Jones:</strong> Robots are pretty stubborn. The robot probably just sat there for like five minutes, like, “Come on. Who’s going to blink?”
  459. </p><p>
  460. <strong>Dooley: </strong>Yeah. Yeah. And the AI we’d love to add, we have to catch up with where you guys are at or license some of your vision recognition algorithms because, first, we’re trying to navigate and avoid obstacles. And that’s where all the tech is going into in terms of the design and the tiers of safety that we’re doing. But it’s just like what the user wanted in that case is, if it’s the dog, can you play my voice, say, “Get out” or, “Move,” or whatever, or something, “Go away”? Because she sent me a video of this. It’s like it was happening to her too, is she would send the robot out. The dogs would get all excited, and she’s behind it in her wheelchair. And now the dogs are waiting for her on the other side of the robot, the robot’s wondering what to do, and they’re all in the hall. And so yeah, there’s this sort of complication that gets in there that you have multiple agents going on there.
  461. </p><p>
  462. <strong>Ackerman:</strong> Maybe one more question from each of you guys. Mike, you want to go first?
  463. </p><p>
  464. <strong>Dooley: </strong>I’m trying to think. I have one more. And when you have new engineers start—let’s say they haven’t worked on robots before. They might be experienced. They’re coming out of school or they’re from other industries and they’re coming in. What is some key thing that they learn, or what sort of transformation goes on in their mind when they finally get in the zone of what it means to develop robots? And it’s a really broad question, but there’s sort of a rookie thing.
  465. </p><p>
  466. <strong>Jones: </strong>Yeah. What’s an aha moment that’s common for people new to robotics? And I think this is woven throughout this entire conversation here, which is, macro level, robots are actually hard. They’re difficult to kind of put the entire electromechanical software system together. It’s hard to perceive the world. If a robot’s driving around the home on its own, it needs to have a pretty good understanding of kind of what’s around it. Is something there, is something not there? The richer that understanding can be, the more adaptable or personalized that it can be. But generating that understanding is also hard. They have to be built to deal with all of those unanticipated scenarios that they’re going to encounter when they’re let out into the wild. So it’s that I think it’s surprising to a lot of people how long that long tail of corner cases ends up being that you have to grapple with. If you ignore one of them, it can mean it can end the product, right? It’s a long tail of things. Any one of them ends up, if it rears its head enough for those users, they’ll stop using the product because, “Well, this thing doesn’t work, and this has happened like twice to me now in the year I’ve had it. I’m kind of done with it,” right?
  467. </p><p>
  468. So you really have to grapple with the very long, long tail of corner cases when the technology hits the real world. I think that’s a super surprising one for people who are new to robotics. It’s more than a hardware consumer product company, consumer electronics company. You do need to deal with those challenges of perception, mobility in the home, the chaos of— specifically, you’re talking about more of the home environment, not the more structured environment and the industrial side. And I think that’s something that everyone has to go through that learning curve of understanding the impact that can have.
  469. </p><p>
  470. <strong>Dooley: </strong>Yeah. Of the dogs and cats.
  471. </p><p>
  472. <strong>Jones: </strong>Yeah, I mean, who would have thought cats are going to jump on the thing or Border Collies are going to try to herd it, right? And you have to just-- and you don’t learn those things until you get products out there. And that’s, Mike, what I was asking you about pilots and what do you hope to learn or the experience there. Is you have to take that step if you’re going to start kind of figuring out what those elements are going to start looking like. It’s very hard to do just intellectually or on paper or in the lab. You have to let them out there. So that’s a learning lesson there. Mike, maybe a similar question for you, but--
  473. </p><p>
  474. <strong>Ackerman: </strong>This is the last one, so make it a good one.
  475. </p><p>
  476. <strong>Jones: </strong>Yep. The last one, it better be a good one, huh? It’s a similar question for you, but maybe cut more on address to an entrepreneur in the robotic space. I’m curious, for a robot company to succeed, there’s a lot of, I’ll call them, ecosystem partners, right, that have to be there. Manufacturing, channel, or go-to-market partners, funding, right, to support a capital-intensive development process, and many more. I’m curious, what have you learned or what do people need to going into a robotics development or looking to be a robotics entrepreneur, what do people miss? What have you learned? What have you seen? What are the partners that are the most important? And I’m not asking for, “Oh, iRobot’s an investor. Speak nicely on the financial investor side.” That’s not what I’m after. But what have you learned, that you better not ignore this set of partners because if one of them falls through or it doesn’t work or is ineffective, it’s going to be hard for all the other pieces to come together?
  477. </p><p>
  478. <strong>Dooley: </strong>Yeah, it’s complex. I think just like you said, robots is hard. I think when we got acquired by iRobot and we were having some of the first meetings over— it’s Mike from software. Halloran.
  479. </p><p>
  480. <strong>Ackerman: </strong>This was Evolution Robotics?
  481. </p><p>
  482. <strong>Dooley: </strong>Evolution. Yeah, but Mike Halloran from iRobot, we came to the office at the Evolution’s office, and he just said, “Robots are hard. They’re really hard.” And it’s like, that’s the point we knew there was harmony. We were sort of under this thing. And so for everything what Chris is saying is that all of that is high stakes. And so you sort of have to be-- you have to be good enough on all those fronts of all those partners. And so some of it is critical path technology. Depth cameras, that function is really critical to us, and it’s critical to work well and then cost and scale. And so just being flexible about how we can deal with that and looking at that sort of chain and how do we sort of start at one level and scale it through? So you look at sort of, okay, what are these key enabling technologies that have to work? And that’s one bucket that are there. Then the partnerships on the business side, we’re in a complex ecosystem. I think the other rude awakening when people look at this is like, “Well, yeah, why doesn’t-- as people get older, they have disabilities. That’s what you have-- that’s your insurance funds.” It’s like, “No, it doesn’t.” It doesn’t for a lot of-- unless you have specific types of insurance. We’re partnering with Nationwide. They have long-term care insurance - and that’s why they’re working with us - that pays for these sorts of issues and things. Or Medicaid will get into these issues depending on somebody’s need.
  483. </p><p>
  484. And so I think what we’re trying to understand is—this goes back to that original question about customer empathy—is that how do we adjust what we’re doing? That we have this vision. I want to help people like my mom where she is now and where she was 10 years ago when she was experiencing difficulties with mobility initially. And we have to stage that. We have to get through that progression. And so who are the people that we work with now that solves a pain point that can be something that they have control over that is economically viable to them? And sometimes that means adjusting a bit of what we’re doing, because it’s just this step onto the long path as we do it.
  485. </p><p>
  486. <strong>Ackerman:</strong> Awesome. Well, thank you both again. This was a great conversation.
  487. </p><p>
  488. <strong>Jones: </strong>Yeah, thanks for having us and for hosting, Evan and Mike. Great to talk to you.
  489. </p><p>
  490. Dooley: Nice seeing you again, Chris and Evan. Same. Really enjoyed it.
  491. </p><p><strong>Ackerman: </strong>We’ve been talking with Chris Jones from iRobot and Mike Dooley from Labrador Systems about developing robots for the home. And thanks again to our guests for joining us, for ChatBot and <em>IEEE Spectrum</em>. I’m Evan Ackerman.</p>]]></description><pubDate>Mon, 02 Oct 2023 10:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/domestic-robots</guid><category>Domestic robots</category><category>Robots</category><category>Robotics</category><category>Irobot</category><category>Type:podcast</category><category>Chatbot podcast</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://assets.rbl.ms/36416762/origin.jpg"></media:content></item><item><title>Making Boston Dynamics’ Robots Dance</title><link>https://spectrum.ieee.org/boston-dynamics-dancing-robots</link><description><![CDATA[
  492. <img src="https://spectrum.ieee.org/media-library/image.jpg?id=36416749&width=980"/><br/><br/><p class="shortcode-media shortcode-media-youtube">
  493. <span class="rm-shortcode" data-rm-shortcode-id="95fca008cbdb9e0a04a9eaa894ccb8eb" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/EpShHKQiKmg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  494. <small class="image-media media-caption" placeholder="Add Photo Caption...">Chatbot Episode 1: Making Boston Dynamics’ Robots Dance</small>
  495. <small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://youtu.be/EpShHKQiKmg" target="_blank"><br/>
  496. </a></small>
  497. </p><p style="">
  498. <strong>Evan Ackerman:</strong> I’m Evan Ackerman, and welcome to ChatBot, a robotics podcast from <em>IEEE Spectrum</em>. On this episode of ChatBot, we’ll be talking with Monica Thomas and Amy LaViers about robots and dance. <a href="https://www.madkingthomas.com/" target="_blank">Monica Thomas is a dancer and choreographer</a>. Monica has worked with <a href="https://bostondynamics.com/" target="_blank">Boston Dynamics</a> to choreograph some of their robot videos in which <a href="https://robotsguide.com/robots/atlas2016" target="_blank">Atlas</a>, <a href="https://robotsguide.com/robots/spot" target="_blank">Spot</a>, and even <a href="https://robotsguide.com/robots/handle" target="_blank">Handle</a> dance to songs like Do You Love Me? The <a href="https://www.youtube.com/watch?v=fn3KWM1kuAw" target="_blank">“Do You Love Me?” Video has been viewed 37 million times</a>. And if you haven’t seen it yet, it’s pretty amazing to see how these robots can move. <a href="https://theradlab.xyz/" target="_blank">Amy LaViers is the director of the Robotics, Automation, and Dance Lab</a>, or RAD lab, which she founded in 2013 at the University of Virginia. The RAD Lab is a collective for art making, commercialization, education, outreach, and research at the intersection of dance and robotics, and is now an independent nonprofit in Philadelphia. Amy’s work explores the creative relationships between machines and humans, as expressed through movement. So Monica, can you just tell me-- I think people in the robotics field may not know who you are or why you’re on the podcast at this point, so can you just describe how you initially got involved in Boston Dynamics?</p><p>
  499. <strong>Monica Thomas:</strong> Yeah. So I got involved really casually. I know people who work at Boston Dynamics and <a href="https://spectrum.ieee.org/tag/marc-raibert" target="_self">Marc Raibert</a>, their founder and head. They’d been working on Spot, and they added the arm to Spot. And Marc was kind of like, “I kind of think this could dance.” And they were like, “Do you think this could dance?” And I was like, “It could definitely dance. That definitely could do a lot of dancing.” And so we just started trying to figure out, can it move in a way that feels like dance to people watching it? And the first thing we made was <a href="https://www.youtube.com/watch?v=kHBcVlqpvZ8" target="_blank">Uptown Spot</a>. And it was really just figuring out moves that the robot does kind of already naturally. And that’s when they started developing, I think, <a href="https://dev.bostondynamics.com/docs/concepts/choreography/readme" rel="noopener noreferrer" target="_blank">Choreographer</a>, their tool. But in terms of my thinking, it was just I was watching what the robot did as its normal patterns, like going up, going down, walking this place, different steps, different gates, what is interesting to me, what looks beautiful to me, what looks funny to me, and then imagining what else we could be doing, considering the angles of the joints. And then it just grew from there. <a rel="noopener noreferrer" target="_blank"></a>And so once that one was out, Marc was like, “What about the rest of the robots? Could they dance? Maybe we could do a dance with all of the robots.” And I was like, “We could definitely do a dance with all of the robots. Any shape can dance.” So that’s when we started working on what turned into Do You Love Me? I didn’t really realize what a big deal it was until it came out and it went viral. And I was like, “Oh—” are we allowed to swear, or—?<a href="#_msocom_1" rel="noopener noreferrer" target="_blank"></a>
  500. </p><p>
  501. <strong>Ackerman:</strong> Oh, yeah. Yeah.
  502. </p><p>
  503. <strong>Thomas: </strong>Yeah. So I was like, “[bleep bleep, bleeeep] is this?” I didn’t know how to deal with it. I didn’t know how to think about it. As a performer, the largest audience I performed for in a day was like 700 people, which is a big audience as a live performer. So when you’re hitting millions, it’s just like it doesn’t even make sense anymore, and yeah. So that was pretty mind-boggling. And then also because of kind of how it was introduced and because there is a whole world of choreo-robotics, which I was not really aware of because I was just doing my thing. Then I realized there’s all of this work that’s been happening that I couldn’t reference, didn’t know about, and conversations that were really important in the field that I also was unaware of and then suddenly was a part of. So I think doing work that has more viewership is really—it was a trip and a half—is a trip and a half. I’m still learning about it. Does that answer your question?
  504. </p><p>
  505. <strong>Ackerman: </strong>Yeah. Definitely.
  506. </p><p>
  507. <strong>Thomas:</strong> It’s a long-winded answer, but.
  508. </p><p>
  509. <strong>Ackerman:</strong> And Amy, so you have been working in these two disciplines for a long time, in the disciplines of robotics and in dance. So what made you decide to combine these two things, and why is that important?
  510. </p><p>
  511. <strong>Amy LaViers: </strong>Yeah. Well, both things, I guess in some way, have always been present in my life. I’ve danced since I was three, probably, and my dad and all of his brothers and my grandfathers were engineers. So in some sense, they were always there. And it was really-- I could tell you the date. I sometimes forget what it was, but it was a Thursday, and I was taking classes and dancing and controlling of mechanical systems, and I was realizing this over. I mean, I don’t think I’m combining them. I feel like they already kind of have this intersection that just exists. And I realized-- or I stumbled into that intersection myself, and I found lots of people working in it. And I was-- oh, my interests in both these fields kind of reinforce one another in a way that’s really exciting and interesting. I also happened to be an almost graduating-- I was in last class of my junior year of college, so I was thinking, “What am I going to do with myself?” Right? So it was very happenstance in that way. And again, I mean, I just felt like— it was like I walked into a room where all of a sudden, a lot of things made sense to me, and a lot of interests of mine were both present.
  512. </p><p>
  513. <strong>Ackerman:</strong> And can you summarize, I guess, the importance here? Because I feel like— I’m sure this is something you’ve run into, is that it’s easy for engineers or roboticists just to be— I mean, honestly, a little bit dismissive of this idea that it’s important for robots to have this expressivity. So why is it important?
  514. </p><p>
  515. <strong>LaViers:</strong> That is a great question that if I could summarize what my life is like, it’s me on a computer going like this, trying to figure out the words to answer that succinctly. But one way I might ask it, earlier when we were talking, you mentioned this idea of functional behavior versus expressive behavior, which comes up a lot when we start thinking in this space. And I think one thing that happens-- and my training and background in <a href="https://www.backstage.com/magazine/article/laban-movement-analysis-guide-50428/" rel="noopener noreferrer" target="_blank">Laban Movement Analysis </a>really emphasizes this duality between function and expression as opposed to the either/or. It’s kind of like the mind-body split, the idea that these things are one integrated unit. Function and expression are an integrated unit. And something that is functional is really expressive. Something that is expressive is really functional.
  516. </p><p>
  517. <strong>Ackerman:</strong> It definitely answers the question. And it looks like Monica is resonating with you a little bit, so I’m just going to get out of the way here. Amy, do you want to just start this conversation with Monica?
  518. </p><p>
  519. <strong>LaViers: </strong>Sure. Sure. Monica has already answered, literally, my first question, so I’m already having to shuffle a little bit. But I’m going to rephrase. My first question was, can robots dance? And I love how emphatically and beautifully you answered that with, “Any shape can dance.” I think that’s so beautiful. That was a great answer, and I think it brings up— you can debate, is this dance, or is this not? But there’s also a way to look at any movement through the lens of dance, and that includes factory robots that nobody ever sees.
  520. </p><p>
  521. <strong>Thomas:</strong> It’s exciting. I mean, it’s a really nice way to walk through the world, so I actually recommend it for everyone, just like taking a time and seeing the movement around you as dance. I don’t know if it’s allowing it to be intentional or just to be special, meaningful, something.
  522. </p><p>
  523. <strong>LaViers:</strong> That’s a really big challenge, particularly for an autonomous system. And for any moving system, I think that’s hard, artificial or not. I mean it’s hard for me. My family’s coming into town this weekend. I’m like, “How do I act so that they know I love them?” Right? That’s dramaticized version of real life, right, is, how do I be welcoming to my guests? And that’ll be, how do I move?
  524. </p><p>
  525. <strong>Thomas:</strong> What you’re saying is a reminder of, one of the things that I really enjoy watching robots move is that I’m allowed to project as much as I want to on them without taking away something from them. When you project too much on people, you lose the person, and that’s not really fair. But when you’re projecting on objects, things that are objects but that we personify— or not even personify, that we anthropomorphize or whatever, it is just a projection of us. But it’s acceptable. So nice for it to be acceptable, a place where you get to do that.
  526. </p><p>
  527. <strong>LaViers: </strong>Well, okay. Then can I ask my fourth question even though it’s not my turn? Because that’s just too perfect to what it is, which is just, what did you learn about yourself working with these robots?
  528. </p><p>
  529. <strong>Thomas:</strong> Well, I learned how much I love visually watching movement. I’ve always watched, but I don’t think it was as clear to me how much I like movement. The work that I made was really about context. It was about what’s happening in society, what’s happening in me as a person. But I never got into that school of dance that really spends time just really paying attention to movement or letting movement develop or explore, exploring movement. That wasn’t what I was doing. And with robots, I was like, “Oh, but yeah, I get it better now. I see it more now.” So much in life right now, for me, is not contained, and it doesn’t have answers. And translating movement across species from my body to a robot, that does have answers. It has multiple answers. It’s not like there’s a yes and a no, but you can answer a question. And it’s so nice to answer questions sometimes. I sat with this thing, and here’s something I feel like is an acceptable solution. Wow. That’s a rarity in life. So I love that about working with robots. I mean, also, they’re cool, I think. And it is also— they’re just cool. I mean, that’s true too. It’s also interesting. I guess the last thing that I really loved—and I didn’t have much opportunity to do this or as much as you’d expect because of COVID—is being in space with robots. It’s really interesting, just like being in space with anything that is different than your norm is notable. Being in space with an animal that you’re not used to being with is notable. And there’s just something really cool about being with something very different. And for me, robots are very different and not acclimatized.
  530. </p><p>
  531. <strong>Ackerman: </strong>Okay. Monica, you want to ask a question or two?
  532. </p><p>
  533. <strong>Thomas: </strong>Yeah. I do. The order of my questions is ruined also. I was thinking about the <a href="https://theradlab.xyz/" rel="noopener noreferrer" target="_blank">RAD Lab</a>, and I was wondering if there are guiding principles that you feel are really important in that interdisciplinary work that you’re doing, and also any lessons maybe from the other side that are worth sharing.
  534. </p><p>
  535. <strong>LaViers:</strong> The usual way I describe it and describe my work more broadly is, I think there are a lot of roboticists that hire dancers, and they make robots and those dancers help them. And there are a lot of dancers that they hire engineers, and those engineers build something for them that they use inside of their work. And what I’m interested in, in the little litmus test or challenge I paint for myself and my collaborators is we want to be right in between those two things, right, where we are making something. First of all, we’re treating each other as peers, as technical peers, as artistic peers, as— if the robot moves on stage, I mean, that’s choreography. If the choreographer asks for the robot to move in a certain way, that’s robotics. That’s the inflection point we want to be at. And so that means, for example, in terms of crediting the work, we try to credit the creative contributions. And not just like, “Oh, well, you did 10 percent of the creative contributions.” We really try to treat each other as co-artistic collaborators and co-technical developers. And so artists are on our papers, and engineers are in our programs, to put it in that way. And likewise, that changes the questions we want to ask. We want to make something that pushes robotics just a inch further, a millimeter further. And we want to do something that pushes dance just an inch further, a millimeter further. We would love it if people would ask us, “Is this dance?” We get, “Is this robotics?” Quite a lot. So that makes me feel like we must be doing something interesting in robotics.
  536. </p><p>
  537. And every now and then, I think we do something interesting for dance too, and certainly, many of my collaborators do. And that inflection point, that’s just where I think is interesting. And I think that’s where— that’s the room I stumbled into, is where we’re asking those questions as opposed to just developing a robot and hiring someone to help us do that. I mean, it can be hard in that environment that people feel like their expertise is being given to the other side. And then, where am I an expert? And we’ve heard editors at publication venues say, “Well, this dancer can’t be a co-author,” and we’ve had venues where we’re working on the program and people say, “Well, no, this engineer isn’t a performer,” but I’m like, “But he’s queuing the robot, and if he messes up, then we all mess up.” I mean, that’s vulnerability too. So we have those conversations that are really touchy and a little sensitive and a little— and so how do you create that space where people do you feel safe and comfortable and valued and attributed for their work and that they can make a track record and do this again in another project, in another context and— so, I don’t know, if I’ve learned anything, I mean, I’ve learned that you just have to really talk about attribution all the time. I bring it up every time, and then I bring it up before we even think about writing a paper. And then I bring it up when we make the draft. And first thing I put in the draft is everybody’s name in the order it’s going to appear, with the affiliations and with the—subscripts on that don’t get added at the last minute. And when the editor of a very famous robotics venue says, “This person can’t be a co-author,” that person doesn’t get taken off as a co-author; that person is a co-author, and we figure out another way to make it work. And so I think that’s learning, or that’s just a struggle anyway.
  538. </p><p>
  539. <strong>Ackerman:</strong> Monica, I’m curious if when you saw the Boston Dynamics videos go viral, did you feel like there was much more of a focus on the robots and the mechanical capabilities than there was on the choreography and the dance? And if so, how did that make you feel?
  540. </p><p>
  541. <strong>Thomas:</strong> Yeah. So yes. Right. When dances I’ve made have been reviewed, which I’ve always really appreciated, it has been about the dance. It’s been about the choreography. And actually, kind of going way back to what we were talking about a couple things ago, a lot of the reviews that you get around this are about people, their reactions, right? Because, again, we can project so much onto robots. So I learned a lot about people, how people think about robots. There’s a lot of really overt themes, and then there’s individual nuance. But yeah, it wasn’t really about the dance, and it was in the middle of the pandemic too. So there’s really high isolation. I had no idea how people who cared about dance thought about it for a long time. And then every once in a while, I get one person here or one person there say something. So it’s a totally weird experience. Yes.
  542. </p><p>
  543. The way that I took information about the dance was kind of paying attention to the affective experience, the emotional experience that people had watching this. The dance was— nothing in that dance was— we use the structures of the traditions of dance in it for intentional reason. I chose that because I wasn’t trying to alarm people or show people ways that robots move that totally hit some old part of our brain that makes us absolutely panicked. That wasn’t my interest or the goal of that work. And honestly, at some point, it’d be really interesting to explore what the robots can just do versus what I, as a human, feel comfortable seeing them do. But the emotional response that people got told me a story about what the dance was doing in a backward-- also, what the music’s doing because—let’s be real—that music does— right? We stacked the deck.
  544. </p><p>
  545. <strong>LaViers:</strong> Yeah. And now that brings— I feel like that serves up two of my questions, and I might let you pick which one maybe we go to. I mean, one of my questions, I wrote down some of my favorite moments from the choreography that I thought we could discuss. Another question—and maybe we can do both of these in serie—is a little bit about— I’ll blush even just saying it, and I’m so glad that the people can’t see the blushing. But also, there’s been so much nodding, and I’m noticing that that won’t be in the audio recording. We’re nodding along to each other so much. But the other side—and you can just nod in a way that gives me your—the other question that comes up for that is, yeah, what is the monetary piece of this, and where are the power dynamics inside this? And how do you feel about how that sits now as that video continues to just make its rounds on the internet and establish value for Boston Dynamics?
  546. </p><p>
  547. <strong>Thomas: </strong>I would love to start with the first question. And the second one is super important, and maybe another day for that one.
  548. </p><p>
  549. <strong>Ackerman:</strong> Okay. That’s fair. That’s fair.
  550. </p><p>
  551. <strong>LaViers: </strong>Yep. I like that. I like that. So the first question, so my favorite moments of <a href="https://www.youtube.com/watch?v=fn3KWM1kuAw" rel="noopener noreferrer" target="_blank">the piece that you choreographed to Do You Love Me</a>? For the Boston Dynamics robots, the swinging arms at the beginning, where you don’t fully know where this is going. It looks so casual and so, dare I say it, natural, although it’s completely artificial, right? And the proximal rotation of the legs, I feel like it’s a genius way of getting around no spine. But you really make use of things that look like hip joints or shoulder joints as a way of, to me, accessing a good wriggle or a good juicy moment, and then the Spot space hold, I call it, where the head of the Spot is holding in place and then the robot wiggles around that, dances around that. And then the moment when you see all four complete—these distinct bodies, and it looks like they’re dancing together. And we touched on that earlier—any shape can dance—but making them all dance together I thought was really brilliant and effective in the work. So it’s one of those moments, super interesting, or you have a funny story about, I thought we could talk about it further.
  552. </p><p>
  553. <a rel="noopener noreferrer" target="_blank"></a><strong>Thomas: </strong>I have a funny story about the hip joints. So the initial— well, not the initial, but when they do <a href="https://youtu.be/fn3KWM1kuAw?t=49" rel="noopener noreferrer" target="_blank">the mashed potato</a>, that was the first dance move that we started working on, on Atlas. And for folks who don’t know, the mashed potato is kind of the feet are going in and out; the knees are going in and out. So we ran into a couple of problems, which—and the twist. I guess it’s a combo. Both of them like you to roll your feet on the ground like rub, and that friction was not good for the robots. <a href="https://youtu.be/fn3KWM1kuAw?t=49" rel="noopener noreferrer" target="_blank">So when we first started really moving into the twist</a>, which has this torso twisting— the legs are twisting. The foot should be twisting on the floor. The foot is not twisting on the floor, and the legs were so turned out that the shape of the pelvic region looked like a over-full diaper. So, I mean, it was wiggling, but it made the robot look young. It made the robot look like it was in a diaper that needed to be changed. It did not look like a twist that anybody would want to do near anybody else. And it was really amazing how— I mean, it was just hilarious to see it. And the engineers come in. They’re really seeing the movement and trying to figure out what they need for the movement. And I was like, “Well, it looks like it has a very full diaper.” And they were like, “Oh.” They knew it didn’t quite look right, but it was like—because I think they really don’t project as much as I do, I’m very projective that’s one of the ways that I’ve watched work, or you’re pulling from the work that way, but that’s not what they were looking at. And so yeah, then you change the angles of the legs, how turned in it is and whatever, and it resolved to a degree, I think, fairly successfully. It doesn’t really look like a diaper anymore. But that wasn’t really— and also to get that move right took us over a month.
  554. </p><p>
  555. <strong>Ackerman:</strong> Wow.
  556. </p><p>
  557. <strong>LaViers:</strong> Wow.
  558. </p><p>
  559. <strong>Thomas: </strong>We got much faster after that because it was the first, and we really learned. But it took a month of programming, me coming in, naming specific ways of reshifting it before we got a twist that felt natural if amended because it’s not the same way that--
  560. </p><p>
  561. <strong>LaViers:</strong> Yeah. Well, and it’s fascinating to think about how to get it to look the same. You had to change the way it did the movement, is what I heard you describing there, and I think that’s so fascinating, right? And just how distinct the morphologies between our body and any of these bodies, even the very facile human-ish looking Atlas, that there’s still a lot of really nuanced and fine-grained and human work-intensive labor to go into getting that to look the same as what we all think of as the twist or the mashed potato.
  562. </p><p>
  563. <strong>Thomas:</strong> Right. Right. And it does need to be something that we can project those dances onto, or it doesn’t work, in terms of this dance. It could work in another one. Yeah.
  564. </p><p>
  565. <strong>LaViers:</strong> Right. And you brought that up earlier, too, of trying to work inside of some established forms of dance as opposed to making us all terrified by the strange movement that can happen, which I think is interesting. And I hope one day you get to do that dance too.
  566. </p><p>
  567. <strong>Thomas:</strong> Yeah. No, I totally want to do that dance too.
  568. </p><p>
  569. <strong>Ackerman:</strong> Monica, do you have one last question you want to ask?
  570. </p><p>
  571. <strong>Thomas: </strong>I do. And this is— yeah. I want to ask you, kind of what does embodied or body-based intelligence offer in robotic engineering? So I feel like, you, more than anyone, can speak to that because I don’t do that side.
  572. </p><p>
  573. <strong>LaViers:</strong> Well, I mean, I think it can bring a couple of things. One, it can bring— I mean, the first moment in my career or life that that calls up for me is, I was watching one of my lab mates, when I was a doctoral student, give a talk about a quadruped robot that he was working on, and he was describing the crawling strategy like the gate. And someone said— and I think it was roughly like, “Move the center of gravity inside the polygon of support, and then pick up— the polygon of support formed by three of the legs. And then pick up the fourth leg and move it. Establish a new polygon of support. Move the center of mass into that polygon of support.” And it’s described with these figures. Maybe there’s a center of gravity. It’s like a circle that’s like a checkerboard, and there’s a triangle, and there’s these legs. And someone stands up and is like, “That makes no sense like that. Why would you do that?” And I’m like, “Oh, oh, I know, oh, because that’s one of the ways you can crawl.” I actually didn’t get down on the floor and do it because I was not so outlandish at that point.
  574. </p><p>
  575. But today, in the RAD lab, that would be, “Everyone on all fours, try this strategy out.” Does it feel like a good idea? Are there other ideas that we would use to do this pattern that might be worth exploring here as well? And so truly rolling around on the floor and moving your body and pretending to be a quadruped, which— in my dance classes, it’s a very common thing to practice crawling because we all forget how to crawl. We want to crawl with the cross-lateral pattern and the homo-lateral pattern, and we want to keep our butts down-- or keep the butts up, but we want to have that optionality so that we look like we’re facile, natural crawlers. We train that, right? And so for a quadruped robot talk and discussion, I think there’s a very literal way that an embodied exploration of the idea is a completely legitimate way to do research.
  576. </p><p>
  577. <strong>Ackerman: </strong>Yeah. I mean, Monica, this is what you were saying, too, as you were working with these engineers. Sometimes it sounded like they could tell that something wasn’t quite right, but they didn’t know how to describe it, and they didn’t know how to fix it because they didn’t have that language and experience that both of you have.
  578. </p><p>
  579. <strong>Thomas: </strong>Yeah. Yeah, exactly that.
  580. </p><p>
  581. <strong>Ackerman: </strong>Okay. Well, I just want to ask you each one more really quick question before we end here, which is that, what is your favorite fictional robot and why? I hope this isn’t too difficult, especially since you both work with real robots, but. Amy, you want to go first?
  582. </p><p>
  583. <strong>LaViers:</strong> I mean, I’m going to feel like a party pooper. I don’t like any robots, real or fictional. The fictional ones annoy me because-- the fictional ones annoy me because of the disambiguation issue and WALL-E and Eva are so cute. And I do love cute things, but are those machines, or are those characters? And are we losing sight of that? I mean, my favorite robot to watch move, this one-- I mean, I love the <a href="https://www.youtube.com/watch?v=3g-yrjh58ms" rel="noopener noreferrer" target="_blank">Keepon dancing to Spoon</a>. That is something that if you’re having an off day, you google Keepon dancing to Spoon— Keepon is one word, K-E-E-P-O-N, dancing to Spoon, and you just bop. It’s just a bop. I love it. It’s so simple and so pure and so right.
  584. </p><p>
  585. <strong>Ackerman: </strong>It’s one of my favorite robots of all time, Monica. I don’t know if you’ve seen this, but it’s two little yellow balls like this, and it just goes up and down and rocks back and forth. But it does it so to music. It just does it so well. It’s amazing.
  586. </p><p>
  587. <strong>Thomas:</strong> I will definitely be watching that [crosstalk].
  588. </p><p>
  589. <strong>Ackerman:</strong> Yeah. And I should have expanded the question, and now I will expand it because Monica hasn’t answered yet. Favorite robot, real or fictional?
  590. </p><p>
  591. <strong>Thomas: </strong>So I don’t know if it’s my favorite. This one breaks my heart, and I’m currently having an empathy overdrive issue as a general problem. But there’s a robot installation - and I should know its name, but I don’t— <a href="https://www.youtube.com/watch?v=ZS4Bpr2BgnE" rel="noopener noreferrer" target="_blank">where the robot reaches out, and it grabs the oil that they’ve created it to leak and pulls it towards its body</a>. And it’s been doing this for several years now, but it’s really slowing down now. And I don’t think it even needs the oil. I don’t think it’s a robot that uses oil. It just thinks that it needs to keep it close. And it used to happy dance, and the oil has gotten so dark and the red rust color of, oh, this is so morbid of blood, but it just breaks my heart. So I think I love that robot and also want to save it in the really unhealthy way that we sometimes identify with things that we shouldn’t be thinking about that much.
  592. </p><p>
  593. <strong>Ackerman:</strong> And you both gave amazing answers to that question.
  594. </p><p>
  595. <strong>LaViers:</strong> And the piece is <a href="https://www.youtube.com/watch?v=ZS4Bpr2BgnE" rel="noopener noreferrer" target="_blank">Sun Yuan and Peng Yu’s Can’t Help Myself</a>.
  596. </p><p>
  597. <strong>Ackerman: </strong>That’s right. Yeah.
  598. </p><p>
  599. <strong>LaViers: </strong>And it is so beautiful. I couldn’t remember the artist’s name either, but—you’re right—it’s so beautiful.
  600. </p><p>
  601. <strong>Thomas:</strong> It’s beautiful. The movement is beautiful. It’s beautifully considered as an art piece, and the robot is gorgeous and heartbreaking.
  602. </p><p>
  603. <strong>Ackerman:</strong> Yeah. Those answers were so unexpected, and I love that. So thank you both, and thank you for being on this podcast. This was an amazing conversation. We didn’t have nearly enough time, so we’re going to have to come back to so much.
  604. </p><p>
  605. <strong>LaViers: </strong>Thank you for having me.
  606. </p><p>
  607. <strong>Thomas: </strong>Thank you so much for inviting me. [music]
  608. </p><p><strong>Ackerman: </strong>We’ve been talking with Monica Thomas and Amy LaViers about robots and dance. And thanks again to our guests for joining us for ChatBot and<em> IEEE Spectrum</em>. I’m Evan Ackerman.</p>]]></description><pubDate>Sun, 01 Oct 2023 21:23:34 +0000</pubDate><guid>https://spectrum.ieee.org/boston-dynamics-dancing-robots</guid><category>Dance</category><category>Art and technology</category><category>Robots</category><category>Type:podcast</category><category>Boston dynamics</category><category>Chatbot podcast</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://assets.rbl.ms/36416749/origin.jpg"></media:content></item><item><title>Solar-Powered Microfliers for Swarm-Based Surveying</title><link>https://spectrum.ieee.org/drone-surveying-microfliers</link><description><![CDATA[
  609. <img src="https://spectrum.ieee.org/media-library/gold-colored-leaf-sized-origami-surround-a-microflier.jpg?id=46551641&width=1200&height=800&coordinates=0%2C57%2C0%2C58"/><br/><br/><p style=""><u><a href="https://www.nature.com/articles/s41586-021-03847-y" rel="noopener noreferrer" target="_blank">Microfliers</a></u>, or miniature wireless robots deployed in numbers, are sometimes used today for large-scale surveillance and monitoring purposes, such as in environmental or biological studies. Because of the fliers’ ability to disperse in air, they can spread out to cover large areas after being dropped from a single location, including in places where access is otherwise difficult. Plus, they are smaller, lighter, and cheaper to deploy than multiple drones.</p><p style="">One of the challenges in creating more efficient microfliers has been in reducing power consumption. One way to do so, as researchers from the <a href="https://www.washington.edu/" rel="noopener noreferrer" target="_blank"><u>University of Washington</u></a> (UW) and <a href="https://www.univ-grenoble-alpes.fr/english/" rel="noopener noreferrer" target="_blank"><u>Université Grenoble Alpes</u></a> have demonstrated, is to get rid of the battery. With inspiration from the Japanese art of paper folding, origami, they designed programmable microfliers that can disperse in the wind and change shape using electronic actuation. This is achieved by a solar-powered actuator that can produce up to 200 millinewtons of force in 25 milliseconds.</p><p class="pull-quote" style="">“Think of these little fliers as a sensor platform to measure environmental conditions, like, temperature, light, and other things.”<br/><strong>—Vikram Iyer, University of Washington</strong></p><p style="">“The cool thing about these origami designs is, we’ve created a way for them to change shape in midair, completely battery free,” says <a href="https://homes.cs.washington.edu/~vsiyer/" rel="noopener noreferrer" target="_blank"><u>Vikram Iyer</u></a>, computer scientist and engineer at UW, one of the authors. “It’s a pretty small change in shape, but it creates a very dramatic change in falling behavior…that allows us to get some control over how these things are flying.” </p><p class="shortcode-media shortcode-media-rebelmouse-image" style="">
  610. <img alt="A diagram demonstrating the different shapes of the microflier" class="rm-shortcode" data-rm-shortcode-id="5a1b6f2d4dea1a6160225d664de9de81" data-rm-shortcode-name="rebelmouse-image" id="cee97" loading="lazy" src="https://spectrum.ieee.org/media-library/a-diagram-demonstrating-the-different-shapes-of-the-microflier.png?id=46551869&width=980"/>
  611. <small class="image-media media-caption" data-gramm="false" data-lt-tmp-id="lt-124820" placeholder="Add Photo Caption..." spellcheck="false">Tumbling and stable states: A) The origami microflier here is in its tumbling state and B) postlanding configuration. As it descends, the flier tumbles, with a typical tumbling pattern pictured in C. D) The origami microflier is here in its stable descent state. The fliers’ range of landing locations, E, reveals its dispersal patterns after being released from its parent drone. </small><small class="image-media media-photo-credit" data-gramm="false" data-lt-tmp-id="lt-962488" placeholder="Add Photo Credit..." spellcheck="false">Vicente Arroyos, Kyle Johnson, and Vikram Iyer/University of Washington</small></p><p style="">This research builds on the researchers’ <a href="https://www.nature.com/articles/s41586-021-04363-9" rel="noopener noreferrer" target="_blank"><u>earlier work</u></a> published in 2022, demonstrating sensors that can disperse in air like dandelion seeds. For the current study, “the goal was to deploy hundreds of these sensors and control where they land, to achieve precise deployments,” says coauthor <a href="https://homes.cs.washington.edu/~gshyam/" rel="noopener noreferrer" target="_blank"><u>Shyamnath Gollakota</u></a>, who leads the <a href="https://netlab.cs.washington.edu/" rel="noopener noreferrer" target="_blank"><u>Mobile Intelligence Lab</u></a> at WU. The microfliers, each weighing less than 500 milligrams, can travel almost 100 meters in a light breeze, and wirelessly transmit data about air pressure and temperature via Bluetooth up to a distance of 60 meters. The group’s <a href="https://www.science.org/doi/10.1126/scirobotics.adg4276" rel="noopener noreferrer" target="_blank"><u>findings</u></a> were published in <em>Science Robotics </em>earlier this month.</p><p style="">Discovering the difference in the falling behavior of the two origami states was serendipity, Gollakota says: “When it is flat, it’s almost like a leaf, tumbling [in the] the wind,” he says. “A very slight change from flat to a little bit of a curvature [makes] it fall like a parachute in a very controlled motion.” In their tumbling state, in lateral wind gusts, the microfliers achieve up to three times the dispersal distance as in their stable state, he adds.</p><p class="shortcode-media shortcode-media-rebelmouse-image" style="">
  612. <img alt="close-up image of a yellow folded origami flier with electronic circuits printed across it and stabilizing wires going from edges of the wing to a central rod that stabilizes the craft" class="rm-shortcode" data-rm-shortcode-id="e29ecd9e191808779d6b8fd61d08a95f" data-rm-shortcode-name="rebelmouse-image" id="b6cb6" loading="lazy" src="https://spectrum.ieee.org/media-library/close-up-image-of-a-yellow-folded-origami-flier-with-electronic-circuits-printed-across-it-and-stabilizing-wires-going-from-edge.jpg?id=46551901&width=980"/>
  613. <small class="image-media media-caption" data-gramm="false" data-lt-tmp-id="lt-99135" placeholder="Add Photo Caption..." spellcheck="false">This close-up of the microflier reveals the electronics and circuitry on its top side.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Vicente Arroyos, Kyle Johnson, and Vikram Iyer/University of Washington</small></p><p style="">There have been other origami-based systems in which motors, electrostatic actuators, shape-memory alloys, and electrothermal polymers, for example, have been used, but these did not address the challenges facing the researchers, Gollakota says. One was to find the sweet spot between an actuation mechanism strong enough to not change shape without being triggered, yet lightweight enough to keep power consumption low. Next, it had to produce a rapid transition response while falling to the ground. Finally, it needed to have a lightweight energy storage solution onboard to trigger the transition. </p><p style="">The mechanism, which Gollakota describes as “pretty commonsensical” still took them a year to come up with. There’s a stem in the middle of the origami, comprising a solenoid coil (a coil that acts as a magnet when a current passes through it), and two small magnets. Four hinged carbon-fiber rods attach the stem to the edges of the structure. When a pulse of current is applied to the solenoid coil, it pushes the magnets toward each other, making the structure snap into its alternative shape. </p><p style="">All it requires is a tiny bit of power, just enough to put the magnets within the right distance from each other for the magnetic forces to work, Gollakota says. There is an array of thin, lightweight solar cells to harvest energy, which is stored in a little capacitor. The circuit is fabricated directly on the foldable origami structure, and also includes a microcontroller, timer, Bluetooth receiver, and pressure and temperature sensors. </p><p style="">“We can program these things to trigger the shape change based on any of these things—after a fixed time, when we send it a radio signal, or, at an altitude [or temperature] that this device detects,” Iyer adds. The origami structure is bistable, meaning it does not need any energy to maintain shape once it has transitioned.</p><p style="">The researchers say their design can be extended to incorporate sensors for a variety of environmental monitoring applications. “Think of these little fliers as a sensor platform to measure environmental conditions, like temperature, light, and other things, [and] how they vary throughout the atmosphere,” Iyer says. Or they can deploy sensors on the ground for things like digital agriculture, climate change–related studies, and tracking forest fires.</p><p style="">In their current prototype, the microfliers only shape-change in one direction, but the researchers want to make them transition in both directions, to be able to toggle the two states, and control the trajectory even better. They also imagine a swarm of microfliers communicating with one another, controlling their behavior, and self-organizing how they are falling and dispersing.</p>]]></description><pubDate>Fri, 29 Sep 2023 16:00:52 +0000</pubDate><guid>https://spectrum.ieee.org/drone-surveying-microfliers</guid><category>Drones</category><category>Environmental monitoring</category><category>Origami</category><category>Surveying</category><category>University of washington</category><category>Robotics</category><category>Sensors</category><category>Swarm robots</category><category>Drones</category><dc:creator>Payal Dhar</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/gold-colored-leaf-sized-origami-surround-a-microflier.jpg?id=46551641&amp;width=980"></media:content></item><item><title>YORI: A Hybrid Approach to Robotic Cooking</title><link>https://spectrum.ieee.org/romela-cooking-robot</link><description><![CDATA[
  614. <img src="https://spectrum.ieee.org/media-library/a-still-from-a-video-shows-a-metal-robot-with-arms-surrounded-by-kitchen-equipment.png?id=45355044&width=1200&height=800&coordinates=150%2C0%2C150%2C0"/><br/><br/><p style="">There seems to be two general approaches to <a href="https://spectrum.ieee.org/tag/cooking-robots" rel="noopener noreferrer" target="_blank">cooking automation</a>. There’s the “let’s make a robot that can operate in a human kitchen because everyone has a human kitchen,” which seems like a good idea, except that you then have to build your robot to function in human environments—which is super hard. On the other end of the spectrum, there’s the “let’s make a dedicated automated system because automation is easier than robotics,” which seems like a good idea, except that you then have to be willing to accept compromises in recipes and texture and taste because preparing food in an automated way simply does not yield the same result, as anyone who has ever attempted to Cuisinart their way out of developing some knife skills can tell you.</p><p style="">The <a href="https://www.romela.org/" target="_blank">Robotics and Mechanisms Lab (RoMeLa) at the University of California, Los Angeles</a>, run by <a href="https://www.romela.org/dr-dennis-hong/" rel="noopener noreferrer" target="_blank">Dennis Hong</a>, has been working on a compromise approach that leverages both robot-friendly automation and the kind of human skills that make things taste right. Called Project YORI, which stands for “Yummy Operations Robot Initiative” and is also the Korean word for “cooking,” the system combines a robot-optimized environment with a pair of arms that can operate kitchen tools sort of like a human.</p><hr style=""/><p class="shortcode-media shortcode-media-youtube">
  615. <span class="rm-shortcode" data-rm-shortcode-id="769865f8d69bf9b3bf966ef6d089f688" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8SsgzCbYqc8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  616. </p><p style="">“Instead of trying to mimic how humans cook,” the researchers say, “we approached the problem by thinking how cooking would be accomplished if a robot cooks. Thus the YORI system does not use the typical cooking methods, tools, or utensils which are developed for humans.” In addition to a variety of automated cooking systems, the tools that YORI does use are modified to work with a tool-changing system, which mostly eliminates the problem of grasping something like a knife well enough to precisely and repeatedly exert a substantial amount of force through it, and it also helps keep things structured and accessible.</p><p style="">In terms of cooking methods, the system takes advantage of technology when and where it works better than conventional human cooking techniques. For example, in order to tell whether ingredients are fresh or to determine when food is cooked ideally, YORI “utilizes unique chemical sensors,” which I guess are the robot equivalents of a nose and taste buds and arguably would do a more empirical assessment than some useless recipe metric like “season to taste.”</p><p style="">The advantage of a system like this is versatility. In theory, those added robotic capabilities means it’s not as constrained by recipes you could cram into a system built around automation. At the same time, it’s somewhat practical—or at least, more practical than a robot designed to interact with a lightly modified human kitchen. And it’s actually designed to be practical(ish), in the sense that it’s being developed under a partnership with <a href="https://www.ajudaily.com/view/20190729142405486" rel="noopener noreferrer" target="_blank">Woowa Brothers</a>, the company that runs the leading food-delivery service in South Korea. It’s obviously still a work in progress—you can see a human hand sneaking in there from time to time. But the approach seems interesting, and I hope that RoMeLa keeps making progress on it, because I’m hungry.</p>]]></description><pubDate>Tue, 26 Sep 2023 17:16:31 +0000</pubDate><guid>https://spectrum.ieee.org/romela-cooking-robot</guid><category>Romela</category><category>Cooking robots</category><category>Robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-still-from-a-video-shows-a-metal-robot-with-arms-surrounded-by-kitchen-equipment.png?id=45355044&amp;width=980"></media:content></item><item><title>Video Friday: Robot Dance</title><link>https://spectrum.ieee.org/video-friday-robot-dance</link><description><![CDATA[
  617. <img src="https://spectrum.ieee.org/media-library/humanoid-robot-with-googly-white-eyes-extends-arms-toward-camera-in-a-dance-club-like-blue-and-pink-lit-scene-with-confetti-and.jpg?id=43150832&width=1200&height=800&coordinates=57%2C0%2C58%2C0"/><br/><br/><p style="">Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.<br/></p><h5 style=""><a href="https://ieee-iros.org/">IROS 2023</a>: 1–5 October 2023, DETROIT</h5><h5><a href="https://clawar.org/clawar23/">CLAWAR 2023</a>: 2–4 October 2023, FLORIANOPOLIS, BRAZIL</h5><h5 style=""><a href="https://roscon.ros.org/2023/">ROSCon 2023</a>: 18–20 October 2023, NEW ORLEANS</h5><h5 style=""><a href="https://2023.ieee-humanoids.org/">Humanoids 2023</a>: 12–14 December 2023, AUSTIN, TEXAS</h5><h5 style=""><a href="https://cybathlon.ethz.ch/en/events/challenges/Challenges-2024">Cybathlon Challenges</a>: 02 February 2024, ZURICH</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote><em>Musical dancing is an ubiquitous phenomenon in human society. Providing robots the ability to dance has the potential to make the human/robot coexistence more acceptable. Hence, dancing robots have generated a considerable research interest in the recent years. In this paper, we present a novel formalization of robot dancing as planning and control of optimally timed actions based on beat timings and additional features extracted from the music.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4aa97ad5a87070eee574e15d3c80e7c8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/MA42YUg3e8E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Wow! Okay, all robotics videos definitely need confetti cannons.</p><p>[ <a href="https://github.com/dfki-ric-underactuated-lab/robot_dance_generation">DFKI</a> ]</p><div class="horizontal-rule"></div><p>What an incredibly relaxing robot video this is.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b6cf166a88ca481692ad9d071e90e348" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/NxpNbGzozRk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Except for the tree bit, I mean.</p><p>[ <a href="https://ieeexplore.ieee.org/document/10146448">Paper</a> ] via [ <a href="https://asl.ethz.ch/">ASL</a> ]</p><div class="horizontal-rule"></div><p>Skydio has a fancy new drone, but not for you!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c69a32a50cc25adcb33fb9a2160b5e79" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GRdM-BxlBQg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Skydio X10, a drone designed for first responders, infrastructure operators, and the U.S. and allied militaries around the world. It has the sensors to capture every detail of the data that matters and the AI-powered autonomy to put those sensors wherever they are needed. It packs more capability and versatility in a smaller and easier-to-use package than has ever existed.</em></blockquote><p>[ <a href="https://www.skydio.com/x10">Skydio X10</a> ]</p><div class="horizontal-rule"></div><blockquote><em>An innovative adaptive bipedal robot with bio-inspired multimodal locomotion control can autonomously adapt its body posture to balance on pipes, surmount obstacles of up to 14 centimeters in height (48 percent of its height), and stably move between horizontal and vertical pipe segments. This cutting-edge robotics technology addresses challenges that out-pipe inspection robots have encountered and can enhance out-pipe inspections within the oil and gas industry.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e8e75c30f5cdc438cceeb8b38fdf4113" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/SpwumbieLr8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p style="">[ <a href="https://ieeexplore.ieee.org/document/10190136">Paper</a> ] via [ <a href="https://manoonpong.com/index.html">VISTEC</a> ]</p><p style="">Thanks, Poramate!</p><div class="horizontal-rule" style=""></div><p>I’m not totally sure how you’d control all of these extra arms in a productive way, but I’m sure they’ll figure it out!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="614f023d47bb8fdda6b167f1ea5230dd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/NsQyspbpdVE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://publish.illinois.edu/kimlab2020/">KIMLAB</a> ]</p><div class="horizontal-rule"></div><blockquote><em>The video is one of the tests we tried on the X30 robot dog in the R&D period, to examine the speed of its stair-climbing ability.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9b9867cba225eef09d16b16b346cbaed" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FgDvoASaI2Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://deeprobotics.cn/en/index/buy.html">Deep Robotics</a> ]</p><div class="horizontal-rule"></div><p>They’re calling this the “T-REX” but without a pair of tiny arms. Missed opportunity there.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ec9a1374683b24abd1347aa8c72104fe" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/mQfc8o9RyxU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://global.agilex.ai/">AgileX</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Drag your mouse to look around within this 360-degree panorama captured by NASA’s Curiosity Mars rover. See the steep slopes, layered buttes, and dark rocks surrounding Curiosity while it was parked below Gediz Vallis Ridge, which formed as a result of violent debris flows that were later eroded by wind into a towering formation. This happened about 3 billion years ago, during one of the last wet periods seen on this part of the Red Planet.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="99134f5ae5094f49e6198f14f8e0736e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/sIfLkcFaFQY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://mars.nasa.gov/msl/home/">NASA</a> ]</p><div class="horizontal-rule"></div><p style="">I don’t know why you need to drive out into the woods to drop-test your sensor rack. Though maybe the stunning Canadian backwoods scenery is reason enough.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c99b355069df41df5bec1d419c6058fa" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dJ5m5HDCILo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p style="">[ <a href="https://norlab.ulaval.ca/">NORLab</a> ]</p><div class="horizontal-rule" style=""></div><blockquote><em>Here’s footage of Reachy in the kitchen, opening the fridge’s door and others, cleaning dirt and coffee stains.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e68206ea5a54815a7e372106faaf8035" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/LQ63wUI7ZXY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p style="">If they ever make Reachy’s face symmetrical, I will refuse to include it in any more Video Fridays. O_o</p><p style="">[ <a href="https://www.pollen-robotics.com/">Pollen Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Inertial odometry is an attractive solution to the problem of state estimation for agile quadrotor flight. In this work, we propose a learning-based odometry algorithm that uses an inertial measurement unit (IMU) as the only sensor modality for autonomous drone racing tasks. We show that our inertial odometry algorithm is superior to the state-of-the-art filter-based and optimization-based visual-inertial odometry as well as the state-of-the-art learned-inertial odometry in estimating the pose of an autonomous racing drone.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d311ba38a4aa9b9e893af15598a4c817" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DHQzaDVWXrc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://rpg.ifi.uzh.ch/research_drone_racing.html">UZH RPG</a> ]</p><div class="horizontal-rule"></div><blockquote><em>Robotic Choreographer is the world’s first dance performance-only robot arm born from the concept of performers that are bigger and faster than humans. This robot has a total length of 3 meters, two rotation axes that rotate infinitely, and an arm rotating up to five times for 1 second.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d3cd126b5d878475b729ccb5c77f6a53" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/tCOXOcISRec?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.mplpl.com/project/319/">MPlusPlus</a> ] via [ <a href="https://www.youtube.com/c/kazumichimoriyama">Kazumichi Moriyama</a> ]</p><div class="horizontal-rule"></div><blockquote><em>This video shows the latest development from Extend Robotics, demonstrating the completion of integration of the Mitsubishi Electric Melfa robot. Key demonstrations include 6 degrees-of-freedom (DoF) precision control with real-time inverse kinematics, dual Kinect camera, low-latency streaming and fusion, and high precision control drawing.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ef8054ba528760b854e8cf36b2df25df" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7Eelbc1t0B8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.extendrobotics.com/">Extend Robotics</a> ]</p><div class="horizontal-rule"></div><p>Here’s what’s been going on at the GRASP Lab at UPenn.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bf8dfbf28076356d266ebbd909faf9ef" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/yxvltmQ0jp0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p class="shortcode-media shortcode-media-youtube">
  618. <span class="rm-shortcode" data-rm-shortcode-id="cc76a4814284197448a323a6bab89067" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/63Tmw6DLsss?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span>
  619. </p><p>[ <a href="https://www.grasp.upenn.edu/">GRASP Lab</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 22 Sep 2023 15:38:34 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robot-dance</guid><category>Video friday</category><category>Robotics</category><category>Curiosity mars rover</category><category>Humanoid robots</category><category>Skydio</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/humanoid-robot-with-googly-white-eyes-extends-arms-toward-camera-in-a-dance-club-like-blue-and-pink-lit-scene-with-confetti-and.jpg?id=43150832&amp;width=980"></media:content></item></channel></rss>

If you would like to create a banner that links to this page (i.e. this validation result), do the following:

  1. Download the "valid RSS" banner.

  2. Upload the image to your own server. (This step is important. Please do not link directly to the image on this server.)

  3. Add this HTML to your page (change the image src attribute if necessary):

If you would like to create a text link instead, here is the URL you can use:

http://www.feedvalidator.org/check.cgi?url=https%3A//feeds.feedburner.com/IeeeSpectrumRobotics%3Fformat%3Dxml

Copyright © 2002-9 Sam Ruby, Mark Pilgrim, Joseph Walton, and Phil Ringnalda