Evan Ackerman
@evanackerman.bsky.social
Senior editor at IEEE Spectrum. I hug robots.
spectrum.ieee.org
This Soft Robot Is 100% Edible, Including the Battery https://spectrum.ieee.org/soft-edible-robot
While there are many useful questions to ask when encountering a new robot, “can I eat it?” is generally not one of them. I say “generally” because edible robots are actually a thing, and not just edible in the sense that you can technically swallow them and suffer both the benefits and the consequences, but ingestible, meaning you can take a big bite out of the robot, chew it up, and swallow it. Yum. But so far these ingestible robots have come with a very please-don’t-ingest-this asterisk: the motor and battery, which are definitely toxic and probably don’t taste all that good. The problem has been that soft, ingestible actuators run on gas pressure, requiring pumps and valves to function, neither of which is easy to make without plastic and metal. But in a new paper, researchers from Dario Floreano’s Laboratory of Intelligent Systems at EPFL in Switzerland have demonstrated ingestible versions of both batteries and actuators, resulting in what is, as far as I know, the first entirely ingestible robot capable of controlled actuation.

Let’s start with the battery on this lil’ guy. In a broad sense, a battery is just a system for storing and releasing energy. In this particular robot, the battery is made of gelatin and wax. It stores chemical energy in chambers containing liquid citric acid and baking soda, both of which you can safely eat. The citric acid is kept separate from the baking soda by a membrane, and enough pressure on the chamber containing the acid will puncture that membrane, allowing the acid to slowly drip onto the baking soda. This activates the battery and begins to generate CO2 gas, along with sodium citrate (common in all kinds of foods, from cheese to sour candy) as a byproduct.

The CO2 gas travels through gelatin tubing into the actuator, which is of a fairly common soft robotic design that uses interconnected gas chambers on top of a slightly stiffer base that bends when pressurized.
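The chemistry here is ordinary acid-base stoichiometry, so the available gas is easy to estimate. Here is a back-of-the-envelope Python sketch (my own illustration, not from the paper; the gram quantities are made up) of how much CO2 a given mix of citric acid and baking soda can release:

```python
# Illustrative estimate of CO2 from the citric-acid / baking-soda reaction:
# C6H8O7 + 3 NaHCO3 -> Na3C6H5O7 + 3 H2O + 3 CO2

M_CITRIC = 192.12   # g/mol, citric acid (C6H8O7)
M_BICARB = 84.01    # g/mol, sodium bicarbonate (NaHCO3)
MOLAR_VOL = 24.45   # L/mol, ideal gas at 25 degrees C and 1 atm

def co2_liters(citric_g: float, bicarb_g: float) -> float:
    """CO2 volume in liters; the limiting reagent caps the yield."""
    mol_acid = citric_g / M_CITRIC
    mol_bicarb = bicarb_g / M_BICARB
    mol_co2 = 3 * min(mol_acid, mol_bicarb / 3)  # 3 mol CO2 per mol acid
    return mol_co2 * MOLAR_VOL

print(f"{co2_liters(1.0, 1.0):.2f} L of CO2 from 1 g of each reagent")  # about 0.29 L
```

Even a gram-scale charge of reagents yields a useful fraction of a liter of gas, which is plenty to pressurize a small soft actuator.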
Pressurizing the actuator gets you one single actuation, but to make the actuator wiggle (wiggling being an absolutely necessary skill for any robot), the gas has to be cyclically released. The key to doing this is the other major innovation here: an ingestible valve.

The valve operates on the principle of snap-buckling, which means that it’s happiest in one shape (closed), but if you put it under enough pressure, it rapidly snaps open, then closes again once the pressure is released. The current version of the robot manages about four bending cycles per minute over a period of a couple of minutes before the battery goes dead. And so there you go: a battery, a valve, and an actuator, all ingestible, make for a little wiggly robot, also ingestible.

Great! But why? “A potential use case for our system is to provide nutrition or medication for elusive animals, such as wild boars,” says lead author Bokeon Kwak. “Wild boars are attracted to live moving prey, and in our case, it’s the edible actuator that mimics it.” The concept is that you could infuse something like a swine flu vaccine into the robot. Because it’s cheap to manufacture, safe to deploy, completely biodegradable, and wiggly, it could potentially serve as an effective strategy for targeted mass delivery to the kind of animals that nobody wants to get close to.

And it’s obviously not just wild boars. By tuning the size and motion characteristics of the robot, what triggers it, and its smell and taste, you could target pretty much any animal that finds wiggly things appealing. And that includes humans! Kwak says that if you were to eat this robot, the actuator and valve would taste a little bit sweet, since they contain glycerol, with a texture like gummy candy. The pneumatic battery would be crunchy on the outside and sour on the inside (like a lemon), thanks to the citric acid.
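Stepping back to the valve for a moment: its open/close hysteresis is what turns the battery’s steady gas output into discrete wiggles, and you can capture that behavior with a toy relaxation-oscillator model. All thresholds and flow rates below are invented for illustration, not taken from the paper:

```python
def simulate_valve(t_end_s=60.0, dt=0.01, fill_rate=0.8, vent_rate=4.0,
                   p_open=10.0, p_close=2.0):
    """Count open->close valve cycles over t_end_s seconds.
    Pressures in kPa, rates in kPa/s; all values are illustrative."""
    p, is_open, cycles, t = 0.0, False, 0, 0.0
    while t < t_end_s:
        if is_open:
            p -= vent_rate * dt   # gas vents through the open valve
            if p <= p_close:      # pressure drops far enough: valve snaps shut
                is_open = False
                cycles += 1
        else:
            p += fill_rate * dt   # the battery keeps generating CO2
            if p >= p_open:       # threshold reached: snap-buckling opens it
                is_open = True
        t += dt
    return cycles

print(simulate_valve())  # four cycles in one simulated minute
```

With these made-up rates the model settles at four open-close cycles per minute, the same ballpark as the robot’s reported bending rate; the point is that the gap between the opening and closing pressures is what makes the oscillation happen at all.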
While this work doesn’t focus specifically on taste, the researchers have made other versions of the actuator that were flavored with grenadine. They served these actuators to humans earlier this year, and are working on an “analysis of consumer experience,” which I can only assume is a requirement before announcing a partnership with Haribo.

Eatability, though, is not the primary focus of the robot, says PI Dario Floreano. “If you look at it from the broader perspective of environmental and sustainable robotics, the pneumatic battery and valve system is a key enabling technology, because it’s compatible with all sorts of biodegradable pneumatic robots.” And even if you’re not particularly concerned with all the environmental stuff (which you really should be), in the context of large swarms of robots in the wild it’s critical to focus on simplicity and affordability just to be able to usefully scale.

This is all part of the EU-funded RoboFood project, and Kwak is currently working on other edible robots. For example, the elastic snap-buckling behavior in this robot’s valve is sort of battery-like in that it stores and releases elastic energy, and with some tweaking, Kwak is hoping that edible elastic power sources might be the key to tasty little jumping robots that jump right off the dessert plate and into your mouth.

“Edible Pneumatic Battery for Sustained and Repeated Robot Actuation,” by Bokeon Kwak, Shuhang Zhang, Alexander Keller, Qiukai Qi, Jonathan Rossiter, and Dario Floreano of EPFL, is published in Advanced Science.
November 14, 2025 at 8:24 PM
Video Friday: DARPA Challenge Focuses on Heavy Lift Drones https://spectrum.ieee.org/video-friday-heavy-lift-drones
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2026: 1–5 June 2026, VIENNA

Enjoy today’s videos!

Current multirotor drones provide simplicity, affordability, and ease of operation; however, their primary limitation is their low payload-to-weight ratio, which typically falls at 1:1 or less. The DARPA Lift Challenge aims to shatter the heavy-lift bottleneck, seeking novel drone designs that can carry payloads more than four times their weight, which would revolutionize the way we use drones across all sectors.

[ DARPA ]

Huge milestone achieved! World’s first mass delivery of humanoid robots has been completed! Hundreds of UBTECH Walker S2 robots have been delivered to our partners.

I really hope that’s not how they’re actually shipping their robots.

[ UBTECH ]

There is absolutely no reason to give robots hands if you can just teach them to lasso stuff instead.

[ ArcLab ]

Saddle Creek deployed Carter in its order fulfillment operation for a beauty client. It helps to automate and optimize tote delivery operations between multiple processing and labeling lines and more than 20 designated drop-off points. In this capacity, Carter functions as a flexible, non-integrated “virtual conveyor” that streamlines material flow without requiring fixed infrastructure.

[ Robust.ai ]

This is our latest work on an aerial–ground robot team, the first time a language–vision hierarchy achieves long-horizon navigation and manipulation on a real UAV + quadruped using only 2D cameras. The article is published open-access in Advanced Intelligent Systems.

[ DRAGON Lab ]

Thanks, Moju!

I am pretty sure that you should not use a quadrupedal robot to transport your child. But only pretty sure, not totally certain.
[ DEEP Robotics ]

Building Behavioral Foundation Models (BFMs) for humanoid robots has the potential to unify diverse control tasks under a single, promptable generalist policy. However, existing approaches are either exclusively deployed on simulated humanoid characters or specialized to specific tasks such as tracking. We propose BFM-Zero, a framework that learns an effective shared latent representation that embeds motions, goals, and rewards into a common space, enabling a single policy to be prompted for multiple downstream tasks without retraining.

[ BFM-Zero ]

Welcome to the very, very near future of manual labor.

[ AgileX ]

MOMO (Mobile Object Manipulation Operator) has been one of KIMLAB’s key robots since its development about two years ago and has featured as a main actor in several of our videos. The design and functionalities of MOMO were recently published in IEEE Robotics & Automation Magazine.

[ Paper ] via [ KIMLAB ]

We are excited about our new addition to our robot fleet! As a shared resource for our faculty members, this robot will facilitate multiple research activities within our institute that target significant future funding. Our initial focus for this robot will be on an agricultural application, but we have big plans for the robot in human-robot interaction projects.

[ Ingenuity Labs ]

The nice thing about robots that pick grapes in vineyards is that they don’t just eat the grapes, like I do.

[ Extend Robotics ]

How mobile of a mobile manipulator do you need?

[ Clearpath Robotics ]

Robotics professor Dr. Christian Hubicki talks about the NEO humanoid announcement of October 29th, 2025. While explaining the technical elements and product readiness, he refuses to show any emotion whatsoever.

[ Optimal Robotics Lab ]
November 14, 2025 at 6:30 PM
Video Friday: This Drone Drives and Flies—Seamlessly https://spectrum.ieee.org/video-friday-multimode-drone
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2026: 1–5 June 2026, VIENNA

Enjoy today’s videos!

Unlike existing hybrid designs, Duawlfin eliminates the need for additional actuators or propeller-driven ground propulsion by leveraging only its standard quadrotor motors and introducing a differential drivetrain with one-way bearings. The seamless transitions between aerial and ground modes further underscore the practicality and effectiveness of our approach for applications like urban logistics and indoor navigation.

[ HiPeR Lab ]

I appreciate the softness of NEO’s design, but those fingers look awfully fragile.

[ 1X ]

Imagine reaching into your backpack to find your keys. Your eyes guide your hand to the opening, but once inside, you rely almost entirely on touch to distinguish your keys from your wallet, phone, and other items. This seamless transition between sensory modalities (knowing when to rely on vision versus touch) is something humans do effortlessly but robots struggle with. The challenge isn’t just about having multiple sensors; modern robots are equipped with cameras, tactile sensors, depth sensors, and more. The real problem is how to integrate these different sensory streams, especially when some sensors provide sparse but critical information at key moments. Our solution comes from rethinking how we combine modalities. Instead of forcing all sensors through a single network, we train separate expert policies for each modality and learn how to combine their action predictions at the policy level.

A multi-university collaboration, presented via [ GitHub ]

Thanks, Haonan!

Happy (somewhat late) Halloween from Pollen Robotics!
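The policy-level fusion idea described above can be sketched in a few lines: each modality gets its own small policy that proposes an action, and a gating weight blends the proposals. Everything below (the observation fields, the expert heuristics, the hand-set weights) is an invented stand-in; in the actual work, the combination weights come from a learned router rather than being set by hand.

```python
def vision_expert(obs):
    # Stand-in vision policy: move the hand toward a visually located target.
    tx, ty = obs["target_xy"]
    hx, hy = obs["hand_xy"]
    return (tx - hx, ty - hy)

def touch_expert(obs):
    # Stand-in touch policy: back off along the sensed contact normal.
    nx, ny = obs["contact_normal"]
    f = obs["contact_force"]
    return (-nx * f, -ny * f)

def fused_action(obs, w_touch):
    # A learned router would normally set w_touch, upweighting touch
    # once contact is detected; here it is passed in by hand.
    av = vision_expert(obs)
    at = touch_expert(obs)
    return tuple((1.0 - w_touch) * v + w_touch * t for v, t in zip(av, at))

obs = {"target_xy": (0.5, 0.2), "hand_xy": (0.0, 0.0),
       "contact_normal": (1.0, 0.0), "contact_force": 2.0}
print(fused_action(obs, w_touch=0.0))  # free space: vision dominates
print(fused_action(obs, w_touch=0.9))  # in contact: touch dominates
```

The appeal of combining at the action level, rather than concatenating features, is that a sparse signal like touch can take over completely at the moment of contact instead of being drowned out by the always-on vision stream.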
[ Pollen Robotics ]

In collaboration with our colleagues from Iowa State and the University of Georgia, we have put our pipe-crawling worm robot to the test in the field. See it crawl through corrugated drainage pipes in a stream, and through a smooth section of a subsurface drainage system.

[ Paper ] from [ Smart Microsystems Laboratory, Michigan State University ]

Heterogeneous robot teams operating in realistic settings often must accomplish complex missions requiring collaboration and adaptation to information acquired online. Because robot teams frequently operate in unstructured environments (uncertain, open-world settings without prior maps), subtasks must be grounded in robot capabilities and the physical world. We present SPINE-HT, a framework that addresses these limitations by grounding the reasoning abilities of LLMs in the context of a heterogeneous robot team through a three-stage process. In real-world experiments with a Clearpath Jackal, a Clearpath Husky, a Boston Dynamics Spot, and a high-altitude UAV, our method achieves an 87% success rate in missions requiring reasoning about robot capabilities and refining subtasks with online feedback.

[ SPINE-HT ] from [ GRASP Lab, University of Pennsylvania ]

Astribot keeping itself busy at IROS 2025.

[ Astribot ]

In two papers published in Matter and Advanced Science, a team of scientists from the Physical Intelligence Department at the Max Planck Institute for Intelligent Systems in Stuttgart, Germany, developed control strategies for influencing the motion of self-propelling oil droplets. These oil droplets mimic single-celled microorganisms and can autonomously solve a complex maze by following chemical gradients. However, it is very challenging to integrate external perturbation and use these droplets in robotics. To address these challenges, the team developed magnetic droplets that still possess life-like properties and can be controlled by external magnetic fields.
In their work, the researchers showed that they are able to guide the droplets’ motion and use them in microrobotic applications such as cargo transportation.

[ Max Planck Institute ]

Everyone has fantasized about having an embodied avatar! This full-body teleoperation and full-body data acquisition platform is waiting for you to try it out!

[ Unitree ]

It’s not a humanoid, but right now it safely does useful things and probably doesn’t cost all that much to buy or run.

[ Naver Labs ]

This paper presents a curriculum-based reinforcement learning framework for training precise and high-performance jumping policies for the robot ‘Olympus’. Separate policies are developed for vertical and horizontal jumps, leveraging a simple yet effective strategy. Experimental validation demonstrates horizontal jumps up to 1.25 m with centimeter accuracy and vertical jumps up to 1.0 m. Additionally, we show that with only minor modifications, the proposed method can be used to learn omnidirectional jumping.

[ Paper ] from [ Autonomous Robots Lab, Norwegian University of Science and Technology ]

Heavy payloads are no problem for it: The new KR TITAN ultra moves payloads of up to 1,500 kg, making it the heavy-lifting extreme of the KUKA portfolio.

[ KUKA ]

Good luck getting all of the sand out of that robot. Perhaps a nice oil bath is in order?

[ DEEP Robotics ]

This CMU RI Seminar is from Yuke Zhu of the University of Texas at Austin, on “Toward Generalist Humanoid Robots: Recent Advances, Opportunities, and Challenges.” In an era of rapid AI progress, leveraging accelerated computing and big data has unlocked new possibilities to develop generalist AI models. As AI systems like ChatGPT showcase remarkable performance in the digital realm, we are compelled to ask: Can we achieve similar breakthroughs in the physical world, creating generalist humanoid robots capable of performing everyday tasks?
In this talk, I will outline our data-centric research principles and approaches for building general-purpose robot autonomy in the open world. I will present our recent work leveraging real-world, synthetic, and web data to train foundation models for humanoid robots. Furthermore, I will discuss the opportunities and challenges of building the next generation of intelligent robots.

[ Carnegie Mellon University Robotics Institute ]
November 7, 2025 at 6:30 PM
Video Friday: Happy Robot Halloween!
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2026: 1–5 June 2026, VIENNA

Enjoy today’s videos!

Happy Halloween from UCL!

[ University College London ]

Happy Halloween from KIMLAB!

[ Kinetic Intelligent Machine Lab ]

Happy Halloween from the DRAGON Lab!

[ DRAGON Lab, University of Tokyo ]

Thanks, Moju!

Happy Halloween from Agility Robotics!

[ Agility Robotics ]

Happy Halloween from HEBI Robotics!

[ HEBI Robotics ]

You can now pay 1X $500/mo to collect data in your home. And it’s about what you’d expect:

[ 1X ] via [ WSJ ]

At our test warehouse, we recreate our customers’ inbound operations, from the dock configuration and conveyors to the freight and beyond. Step inside our Stretch testing facility to learn about the latest developments in warehouse automation and explore how we ensure robust, reliable performance in the real world.

[ Boston Dynamics ]

Well, this is just mean. Important, but mean.

[ Istituto Italiano di Tecnologia ]

SpikeATac is a multimodal tactile finger combining a taxelized and highly sensitive dynamic response (PVDF) with a static transduction method (capacitive) for multimodal touch sensing. Named for its ‘spiky’ response, SpikeATac’s multitaxel PVDF film provides fast, sensitive dynamic signals at the very onset and breaking of contact, providing the ability to stop quickly and delicately when grasping fragile, deformable objects.

[ ROAM Lab, Columbia University ]

Effectively integrating diverse sensory representations is crucial for robust robotic manipulation.
However, the typical approach of feature concatenation is often suboptimal: dominant modalities such as vision can overwhelm sparse but critical signals like touch in contact-rich tasks, and monolithic architectures cannot flexibly incorporate new or missing modalities without retraining. Our method factorizes the policy into a set of diffusion models, each specialized for a single representation (e.g., vision or touch), and employs a router network that learns consensus weights to adaptively combine their contributions, enabling incremental integration of new representations.

[ GitHub ]

Thanks, Haonan!

General-purpose robots should possess human-like dexterity and agility to perform tasks with the same versatility as us. A human-like form factor further enables the use of vast datasets of human-hand interactions. However, the primary bottleneck in dexterous manipulation lies not only in software but arguably even more in hardware. We present the open-source ORCA hand, a reliable and anthropomorphic 17-DoF tendon-driven robotic hand with integrated tactile sensors, fully assembled in less than eight hours and built for a material cost below 2,000 CHF.

[ ORCA ]

University of Chicago computer scientist Sarah Sebo is programming robots to give empathetic responses and perform nonverbal social cues, like nodding, to better build trust and rapport with humans. The goal is to develop robots that can improve performance in human-robot teams, such as enhancing learning outcomes for children.

[ University of Chicago ]

DJI has a robot vacuum now, which is fine. As far as I can make out, we’ve reached the point where just about every robot vacuum is (for better or worse) just that: fine.
[ DJI ]

This ICRA 2025 keynote is from Angela Schoellig of the Technical University of Munich, on “Powering Robotics with AI.”

[ ICRA 2025 ]

This Carnegie Mellon University Robotics Institute (CMU RI) Seminar is from Nancy Pollard, on “Bringing Dexterity to Robot Hands in the Real World.” Dexterous manipulation is a grand challenge of robotics, and fine manipulation skills are required for many robotics applications that we envision. In this overview talk, I will discuss my view of some major factors that contribute to dexterity and discuss how we can incorporate them into our robots and systems.

[ CMU RI ]
October 31, 2025 at 3:30 PM
Video Friday: Unitree’s Human-Size Humanoid Robot https://spectrum.ieee.org/video-friday-human-size-robot
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ROSCon 2025: 27–29 October 2025, SINGAPORE

Enjoy today’s videos!

Welcome to this world—standing 180 cm tall and weighing 70 kg. The H2 bionic humanoid—born to serve everyone safely and friendly. Starting at US $29,900 plus tax and shipping.

[ Unitree ]

The title of this one, “Eagle Stole Our FPV Drone,” pretty much sums it up.

[ Team BlackSheep ]

Historically, small robots couldn’t have arms because the necessary motors made them too heavy. We addressed this challenge by replacing multiple motors with a single motor and miniature electrostatic clutches. This innovation allowed us to create a high-DOF, lightweight arm for small robots, which can even hitch onto a drone.

[ Seoul National University ]

Thanks, Kyu-Jin!

Just FYI, any robot that sounds like a tasty baked good is guaranteed favorable coverage on Video Friday.

[ Cleo Robotics ]

Oli now pulls off a smooth, coordinated whole-body sequence from lying down to getting back up. Standing 165 cm tall and powered by 31 degrees of freedom, Oli continues to demonstrate natural and fluid motion.

[ LimX Dynamics ]

Thanks, Jinyan!

Friend o’ the blog Bram Vanderborght tours the exhibit floor at IROS 2025 in Hangzhou, China.

[ IROS 2025 ]

In a fireside chat with Professor Sam Madden, Tye Brady, chief technologist at Amazon Robotics, will discuss the trajectory of robotics and how generative AI plays a role in robotics innovation.

[ MIT Generative AI Impact Consortium ]

Prof. Dimitrios Kanoulas gave an invited talk at the Workshop on The Art of Robustness: Surviving Failures in Robotics at IROS 2025.
[ IROS 2025 ]

This University of Pennsylvania GRASP talk is by Suraj Nair from Physical Intelligence, on “Scaling Robot Learning with Vision-Language-Action Models.” The last several years have witnessed tremendous progress in the capabilities of AI systems, driven largely by foundation models that scale expressive architectures with diverse data sources. While the impact of this technology on vision and language understanding is abundantly clear, its use in robotics remains in its infancy. Scaling robot learning still presents numerous open challenges, from selecting the right data to scale, to developing algorithms that can effectively fit this data for closed-loop operation in the physical world. At Physical Intelligence, we aim to tackle these questions. This talk will present our recent work on building vision-language-action models, covering topics such as architecture design, data scaling, and open research directions.

[ University of Pennsylvania GRASP Laboratory ]
October 24, 2025 at 6:00 PM
Video Friday: Multimodal Humanoid Walks, Flies, Drives https://spectrum.ieee.org/video-friday-multimodal-robot
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

IROS 2025: 19–25 October 2025, HANGZHOU, CHINA

Enjoy today’s videos!

Caltech’s Center for Autonomous Systems and Technologies (CAST) and the Technology Innovation Institute in Abu Dhabi, UAE, recently conducted a demonstration of X1, a multirobot system developed as part of a three-year collaboration between the two institutes. During the demo, M4, a multimodal robot developed by CAST, launches in drone mode from a humanoid robot’s back. It lands and converts into driving mode and then back again, as needed. The demonstration underscored the kind of progress that is possible when engineers from multiple institutions at the forefront of autonomous systems and technologies truly collaborate.

[ Caltech Center for Autonomous Systems and Technologies ]

The Spot robot performs dynamic whole-body manipulation using a combination of reinforcement learning and sampling-based control. The behavior shown in the video is fully autonomous, including the dynamic selection of contacts on the arm, legs, and body, and coordination between the manipulation and locomotion processes. The tire weighs 15 kg (33 lb), making its mass and inertial energy significant compared to the weight of the robot. An external motion-capture system was used to simplify perception, and an external computer linked by Wi-Fi performed the intensive computational operations.

Spot’s arm is stronger than I thought. Also, the arm-foot collaboration is pretty wild.

[ Robotics and AI Institute ]

Figure 03 represents an unprecedented advancement in taking humanoid robots from experimental prototypes to deployable, scalable products.
By uniting advanced perception and tactile intelligence with home-safe design and mass-manufacturing readiness, Figure has built a platform capable of learning, adapting, and working across both domestic and commercial settings. Designed for Helix, the home, and the world at scale, Figure 03 establishes the foundation for true general-purpose robotics, one capable of transforming how people live and work.

The kid and the dog in those clips make me very, very nervous.

[ Figure ]

Researchers have invented a new, super-agile robot that can cleverly change shape thanks to amorphous characteristics akin to those of the popular Marvel antihero Venom. Researchers used a special material called electro-morphing gel (e-MG), which allows the robot to shapeshift, bending, stretching, and moving in ways that were previously difficult or impossible, through manipulation of electric fields from ultralightweight electrodes.

[ University of Bristol ]

This is very preliminary, of course, but I love the idea of quadrupedal robots physically assisting each other to surmount obstacles like this.

[ Robot Perception and Learning Lab ]

Have we reached peak dynamic humanoid yet?

[ Unitree ]

Dynamic manipulation, such as a robot tossing or throwing objects, has recently gained attention as a novel paradigm to speed up logistics operations. However, the focus has predominantly been on the object’s landing location, irrespective of its final orientation. In this work, we present a method enabling a robot to accurately “throw-flip” objects to a desired landing pose (position and orientation).

[ LASA ]

I don’t care all that much about “industry-oriented” quadrupeds. I do care very much about “rideable” quadrupeds.

[ MagicLab ]

I am not yet at the point where I would trust any humanoid around priceless ancient relics. Any humanoid, not just the robotic ones.
[ LimX ]

This CMU RI Seminar is from Matt Mason, professor emeritus at CMU, entitled “A Manipulation Journey.” The talk will revisit my career in manipulation research, focusing on projects that might offer some useful lessons for others. We will start with my beginnings at the MIT AI Lab and my MS thesis, which is still my most cited work, then continue with my arrival at CMU, a discussion with Allen Newell, an exercise to envision a coherent research program, and how that led to a second and third childhood. The talk will conclude with some discussion of lessons learned.

[ Carnegie Mellon University Robotics Institute ]

Dr. Christian Hubicki highlights and explains the past year of humanoid robotics research and news.

[ Florida State University ]

More excellent robotics discussions from ICRA@40.

[ ICRA@40 ]
October 17, 2025 at 4:30 PM
Video Friday: Non-Humanoid Hands for Humanoid Robots https://spectrum.ieee.org/video-friday-robotic-hands-2674168909
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

World Robot Summit: 10–12 October 2025, OSAKA, JAPAN

IROS 2025: 19–25 October 2025, HANGZHOU, CHINA

Enjoy today’s videos!

There are two things that I really appreciate about this video on grippers from Boston Dynamics. First, building a gripper while keeping in mind that the robot will inevitably fall onto it, because I’m seeing lots of very delicate-looking five-fingered hands on humanoids and I’m very skeptical of their ruggedness. And second, understanding that not only is a five-fingered hand very likely unnecessary for the vast majority of tasks, but also that robot hands don’t have to be constrained by a human hand’s range of motion.

[ Boston Dynamics ]

Yes, okay, it’s a fancy-looking robot, but I’m still stuck on what useful, practical things it can reliably, cost-effectively, and safely DO.

[ Figure ]

Life on Earth has evolved in constant relation to gravity, yet we rarely consider how deeply it shapes living systems, until we imagine a place without it. In MycoGravity, pink oyster mushrooms grow inside a custom-built bioreactor mounted on a KUKA robotic arm. Inspired by NASA’s random positioning machines, the robot’s programmed movement simulates altered gravity. Over time, sculptural mushrooms emerge, shaped by their environment without a stable gravitational direction.

[ MycoGravity ]

A new technological advancement gives robotic systems a natural sense of touch without extra skins or sensors. With advanced force sensing and deep learning, this robot can feel where you touch it, recognize symbols, and even use virtual buttons, paving the way for more natural and flexible human-robot interaction.

[ Science Robotics ]

Thanks, Maged!
The creator of Mini Pupper introduces Hey Santa, which can be yours for under $60.

[ Kickstarter campaign ]

I think humanoid robotics companies are starting to realize that they’re going to need to differentiate themselves somehow.

[ DEEP Robotics ]

Drone swarm performances (synchronized, expressive aerial displays set to music) have emerged as a captivating application of modern robotics. Yet designing smooth, safe choreographies remains a complex task requiring expert knowledge. We present SwarmGPT, a language-based choreographer that leverages the reasoning power of large language models (LLMs) to streamline drone performance design.

[ SwarmGPT ]

Dr. Mark Draelos, assistant professor of robotics and ophthalmology, received the National Institutes of Health (NIH) Director’s New Innovator Award for a project that seeks to improve how delicate microsurgeries are conducted by scaling up tissue to a size where surgeons could “walk across the retina” in virtual reality and operate on tissue as if “raking leaves.”

[ University of Michigan ]

The intricate mechanisms of the most sophisticated laboratory on Mars are revealed in Episode 4 of the ExoMars Rosalind Franklin series, called “Sample Processing.”

[ European Space Agency ]

There’s currently a marketplace for used industrial robots, and it makes me wonder what’s next. Used humanoids, anyone?

[ KUKA ]

On October 2, 2025, the 10th “Can We Build Baymax?” workshop, “Part 10: What Can We Build Today? & BYOB (Bring Your Own Baymax),” was held in Seoul, Korea. To celebrate the 10th anniversary, Baymax delivered a special message from his character designer, Jin Kim.

[ Baymax ]

I am only sharing this to declare that iRobot has gone off the deep end with its product names: Meet the “Roomba® Max 705 Combo Robot + AutoWash™ Dock.”

[ iRobot ]

Daniel Piedrahita, navigation team lead, presents his team’s recent work rebuilding Digit’s navigation stack, including a significant upgrade to footstep path planning.
[ Agility Robotics ] A bunch of videos from ICRA@40 have just been posted, and here are a few of my favorites. [ ICRA@40 ]
October 10, 2025 at 4:00 PM
Video Friday: Drone Easily Lands on Speeding Vehicle https://spectrum.ieee.org/video-friday-speedy-drone-landing
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

World Robot Summit: 10–12 October 2025, OSAKA, JAPAN
IROS 2025: 19–25 October 2025, HANGZHOU, CHINA

Enjoy today’s videos!

We demonstrate a new landing system that lets drones safely land on moving vehicles at speeds up to 110 km/h. By combining lightweight shock absorbers with reverse thrust, our approach drastically expands the landing envelope, making it far more robust to wind, timing, and vehicle motion. This breakthrough opens the door to reliable high-speed drone landings in real-world conditions.

[ Createk Design Lab ]

Thanks, Alexis!

This video presents an academic parody inspired by KAIST’s humanoid robot moonwalk. While KAIST demonstrated the iconic move with robot legs, we humorously reproduced it using the Tesollo DG-5F robot hand. A playful experiment to show that not only humanoid robots but also robotic fingers can “dance.”

[ Hanyang University ]

20 years ago, Universal Robots built the first collaborative robot. You turned it into something bigger. Our cobot was never just technology. In your hands, it became something more: a teammate, a problem-solver, a spark for change. From factories to labs, from classrooms to warehouses. That’s the story of the past 20 years. That’s what we celebrate today.

[ Universal Robots ]

The assistive robot Maya, newly developed at DLR, is designed to enable people with severe physical disabilities to lead more independent lives. The new robotic arm is built for seamless wheelchair integration, with optimized kinematics for stowing, ground-level access, and compatibility with standing functions.

[ DLR ]

Contoro and HARCO Lab have launched an open-source initiative, ROS-MCP-Server, which connects AI models (e.g., Claude, GPT, Gemini) with robots using ROS and MCP.
This software enables AI to communicate with multiple ROS nodes in the language of robots. We believe it will allow robots to perform tasks previously impossible due to limited intelligence, help robotics engineers program robots more efficiently, and enable non-experts to interact with robots without deep robotics knowledge.

[ GitHub ]

Thanks, Mok!

Here’s a quick look at the Conference on Robot Learning (CoRL) exhibit hall, thanks to PNDbotics.

[ PNDbotics ]

Old and busted: sim to real. New hotness: real to sim!

[ Paper ]

Any humanoid video with tennis balls should be obligated to show said humanoid failing to walk over them.

[ LimX ]

Thanks, Jinyan!

The correct answer to the question “can you beat a robot arm at Tic-Tac-Toe” should be no, no you cannot. And you can’t beat a human, either, if they know what they’re doing.

[ AgileX ]

It was an honor to host the team from Microsoft AI as part of their larger educational collaboration with The University of Texas at Austin. During their time here, they shared this wonderful video of our lab facilities. Moody lighting is second only to random primary-colored lighting when it comes to making a lab look sciency.

[ The University of Texas at Austin HCRL ]

Robots aren’t just sci-fi anymore. They’re evolving fast. AI is teaching them how to adapt, learn, and even respond to open-ended questions with advanced intelligence. Aaron Saunders, CTO of Boston Dynamics, explains how this leap is transforming everything, from simple controls to full-motion capabilities. While there are some challenges related to safety and reliability, AI is significantly helping robots become valuable partners at home and on the job.

[ IBM ]
October 3, 2025 at 4:01 PM
Why the World Needs a Flying Robot Baby https://spectrum.ieee.org/ironcub-jet-powered-flying-robot
One of the robotics projects that I’ve been most excited about for years now is iRonCub, from Daniele Pucci’s Artificial and Mechanical Intelligence Lab at IIT in Genoa, Italy. Since 2017, Pucci has been developing a jet propulsion system that will enable an iCub robot (originally designed to be the approximate shape and size of a five-year-old child) to fly like Iron Man. Over the summer, after nearly 10 years of development, iRonCub3 achieved liftoff and stable flight for the first time, with its four jet engines lifting it 50 centimeters off the ground for several seconds.

The long-term vision is for iRonCub (or a robot like it) to operate as a disaster response platform, Pucci tells us. In an emergency situation like a flood or a fire, iRonCub could quickly get to a location without worrying about obstacles, and then on landing, start walking for energy efficiency while using its arms and hands to move debris and open doors. “We believe in contributing to something unique in the future,” says Pucci. “We have to explore new things, and this is wild territory at the scientific level.”

Obviously, this concept for iRonCub and the practical experimentation attached to it is really cool. But coolness in and of itself is usually not enough of a reason to build a robot, especially a robot that’s a (presumably rather expensive) multi-year project involving a bunch of robotics students, so let’s get into a little more detail about why a flying robot baby is actually something that the world needs.

Getting a humanoid robot to do this sort of thing is quite a challenge.
Together, the jet turbines mounted to iRonCub’s back and arms can generate over 1,000 newtons of thrust, but because it takes time for the engines to spool up or down, control has to come from the robot itself as it moves its arm-engines to maintain stability. “What is not visible from the video,” Pucci tells us, “is that the exhaust gas from the turbines is at 800 degrees Celsius and almost supersonic speed. We have to understand how to generate trajectories in order to avoid the fact that the cones of emission gasses were impacting the robot.”

Even if the exhaust doesn’t end up melting the robot, there are still aerodynamic forces involved that have until this point really not been a consideration for humanoid robots at all. In June, Pucci’s group published a paper in Nature Communications Engineering, offering a “comprehensive approach to model and control aerodynamic forces [for humanoid robots] using classical and learning techniques.”

Whether or not you’re on board with Pucci’s future vision for iRonCub as a disaster response platform, derivatives of the current research can be applied immediately beyond flying humanoid robots. The algorithms for thrust estimation can be used with other flying platforms that rely on directed thrust, like eVTOL aircraft. Aerodynamic compensation is relevant for humanoid robots even if they’re not airborne, if we expect them to be able to function when it’s windy outside. More surprisingly, Pucci describes a recent collaboration with an industrial company developing a new pneumatic gripper. “At a certain point, we had to do force estimation for controlling the gripper, and we realized that the dynamics looked really similar to those of the jet turbines, and so we were able to use the same tools for gripper control.
That was an ‘ah-ha’ moment for us: First you do something crazy, but then you build the tools and methods, and then you can actually use those tools in an industrial scenario. That’s how to drive innovation.”

What’s Next for iRonCub: Attracting Talent and Future Enhancements

There’s one more important reason to be doing this, he says: “It’s really cool.” In practice, a really cool flagship project like iRonCub not only attracts talent to Pucci’s lab, but also keeps students and researchers passionate and engaged. I saw this firsthand when I visited IIT last year, where I got a similar vibe to watching the DARPA Robotics Challenge and DARPA SubT: when people know they’re working on something really cool, there’s this tangible, pervasive, and immersive buzzing excitement that comes through. It’s projects like iRonCub that can get students to really love robotics.

In the near future, a new jetpack with an added degree of freedom will make yaw control of iRonCub easier, and Pucci would also like to add wings for more efficient long-distance flight. But the logistics of testing the robot are getting more complicated. There’s only so far that the team can go with their current test stand (which is on the roof of their building), and future progress will likely require coordinating with the Genoa airport. It’s not going to be easy, but as Pucci makes clear, “this is not a joke. It’s something that we believe in. And that feeling of doing something exceptional, or possibly historical, something that’s going to be remembered—that’s something that’s kept us motivated. And we’re just getting started.”
September 30, 2025 at 12:01 PM
Video Friday: Gemini Robotics Improves Motor Skills https://spectrum.ieee.org/video-friday-google-gemini-robotics
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

CoRL 2025: 27–30 September 2025, SEOUL
IEEE Humanoids: 30 September–2 October 2025, SEOUL
World Robot Summit: 10–12 October 2025, OSAKA, JAPAN
IROS 2025: 19–25 October 2025, HANGZHOU, CHINA

Enjoy today’s videos!

Gemini Robotics 1.5 is our most capable vision-language-action (VLA) model that turns visual information and instructions into motor commands for a robot to perform a task. This model thinks before taking action and shows its process, helping robots assess and complete complex tasks more transparently. It also learns across embodiments, accelerating skill learning.

[ Google DeepMind ]

A simple “force pull” gesture brings Carter straight into her hand. This is a fantastic example of how an intuitive interaction can transform complex technology into an extension of our intent.

[ Robust.ai ]

I can’t help it, I feel bad for this poor little robot.

[ Urban Robotics Laboratory, KAIST ]

Hey look, no legs!

[ Kinisi Robotics ]

Researchers at the University of Michigan and Shanghai Jiao Tong University have developed a soft robot that can crawl along a flat path and climb up vertical surfaces using its unique origami structure. The robot can move with an accuracy typically seen only in rigid robots.

[ University of Michigan Robotics ]

Unitree G1 has learned the “Anti-Gravity” mode: stability is greatly improved under any action sequence, and even if it falls, it can quickly get back up.

[ Unitree ]

Kepler Robotics has commenced mass production of the K2 Bumblebee, the world’s first commercially available humanoid robot powered by Tesla’s hybrid architecture.
[ Kepler Robotics ]

Reinforcement learning (RL)-based legged locomotion controllers often require meticulous reward tuning to track velocities or goal positions while preserving smooth motion on various terrains. Motion imitation methods via RL using demonstration data reduce reward engineering but fail to generalize to novel environments. We address this by proposing a hierarchical RL framework in which a low-level policy is first pre-trained to imitate animal motions on flat ground, thereby establishing motion priors. Real-world experiments with an ANYmal-D quadruped robot confirm our policy’s capability to generalize animal-like locomotion skills to complex terrains, demonstrating smooth and efficient locomotion and local navigation performance amidst challenging terrains with obstacles.

[ ETHZ RSL ]

I think we have entered the ‘differentiation-through-novelty’ phase of robot vacuums.

[ Roborock ]

In this work, we present Kinethreads: a new full-body haptic exosuit design built around string-based motor-pulley mechanisms, which keeps our suit lightweight.

[ ACM Symposium on User Interface and Software Technology ]

In this episode of the IBM AI in Action podcast, Aaron Saunders, CTO of Boston Dynamics, delves into the transformative potential of AI-powered robotics, highlighting how robots are becoming safer, more cost-effective, and widely accessible through Robotics as a Service (RaaS).

[ IBM ]

This CMU RI Seminar is by Michael T. Tolley from UCSD, on “Biologically Inspired Soft Robotics.” Robotics has the potential to address many of today’s pressing problems in fields ranging from healthcare to manufacturing to disaster relief. However, the traditional approaches used on the factory floor do not perform well in unstructured environments. The key to solving many of these challenges is to explore new, non-traditional designs. Fortunately, nature surrounds us with examples of novel ways to navigate and interact with the real world. Dr. Tolley’s Bioinspired Robotics and Design Lab seeks to borrow the key principles of operation from biological systems and apply them to robotic design.

[ Carnegie Mellon University Robotics Institute ]
September 26, 2025 at 3:31 PM
Exploit Allows for Takeover of Fleets of Unitree Robots https://spectrum.ieee.org/unitree-robot-exploit
A critical vulnerability in the Bluetooth Low Energy (BLE) Wi-Fi configuration interface used by several different Unitree robots can result in a root-level takeover by an attacker, security researchers disclosed on 20 September. The exploit impacts Unitree’s Go2 and B2 quadrupeds and G1 and H1 humanoids. Because the vulnerability is wireless, and the resulting access to the affected platform is complete, the vulnerability becomes wormable, say the researchers, meaning “an infected robot can simply scan for other Unitree robots in BLE range and automatically compromise them, creating a robot botnet that spreads without user intervention.” Initially discovered by security researchers Andreas Makris and Kevin Finisterre, UniPwn takes advantage of several security lapses that are still present in the firmware of Unitree robots as of 20 September 2025. As far as IEEE Spectrum is aware, this is the first major public exploit of a commercial humanoid platform.

Unitree Robots’ BLE Security Flaw Exposed

Like many robots, Unitree’s robots use an initial BLE connection to make it easier for a user to set up a Wi-Fi network connection. The BLE packets that the robot accepts are encrypted, but those encryption keys are hardcoded and were published on X (formerly Twitter) by Makris in July. Although the robot does validate the contents of the BLE packets to make sure that the user is authenticated, the researchers say that all it takes to become an authenticated user is to encrypt the string ‘unitree’ with the hardcoded keys, and the robot will let someone in. From there, an attacker can inject arbitrary code masquerading as the Wi-Fi SSID and password, and when the robot attempts to connect to Wi-Fi, it will execute that code without any validation and with root privileges.

“A simple attack might be just to reboot the robot, which we published as a proof-of-concept,” explains Makris.
“But an attacker could do much more sophisticated things: It would be possible to have a trojan implanted into your robot’s startup routine to exfiltrate data while disabling the ability to install new firmware without the user knowing. And as the vulnerability uses BLE, the robots can easily infect each other, and from there the attacker might have access to an army of robots.”

Makris and Finisterre first contacted Unitree in May in an attempt to responsibly disclose this vulnerability. After some back and forth with little progress, Unitree stopped responding to the researchers in July, and the decision was made to make the vulnerability public. “We have had some bad experiences communicating with them,” Makris tells us, citing an earlier backdoor vulnerability he discovered with the Unitree Go1. “So we need to ask ourselves—are they introducing vulnerabilities like this on purpose, or is it sloppy development? Both answers are equally bad.” Unitree has not responded to a request for comment from IEEE Spectrum as of press time.

“Unitree, as other manufacturers do, has simply ignored prior security disclosures and repeated outreach attempts,” says Víctor Mayoral-Vilches, the founder of robotics cybersecurity company Alias Robotics. “This is not the right way to cooperate with security researchers.” Mayoral-Vilches was not involved in publishing the UniPwn exploit, but he has found other security issues with Unitree robots, including undisclosed streaming of telemetry data to servers in China, which could potentially include audio, visual, and spatial data.

Mayoral-Vilches explains that security researchers are focusing on Unitree primarily because the robots are available and affordable. This makes them not just more accessible for the researchers, but also more relevant, since Unitree’s robots are already being deployed by users around the world who are likely not aware of the security risks.
For example, Makris is concerned that the Nottinghamshire Police in the UK have begun testing a Unitree Go2, which can be exploited by UniPwn. “We tried contacting them and would have disclosed the vulnerability upfront to them before going public, but they ignored us. What would happen if an attacker implanted themselves into one of these police dogs?”

How to Secure Unitree Robots

In the short term, Mayoral-Vilches suggests that people using Unitree robots can protect themselves by only connecting the robots to isolated Wi-Fi networks and disabling their Bluetooth connectivity. “You need to hack the robot to secure it for real,” he says. “This is not uncommon and why security research in robotics is so important.” Both Mayoral-Vilches and Makris believe that fundamentally it’s up to Unitree to make its robots secure in the long term, and that the company needs to be much more responsive to users and security researchers. But Makris says: “There will never be a 100 percent secure system.” Mayoral-Vilches agrees. “Robots are very complex systems, with wide attack surfaces to protect, and a state-of-the-art humanoid exemplifies that complexity.”

Unitree, of course, is not the only company offering complex state-of-the-art quadrupeds and humanoids, and it seems likely (if not inevitable) that similar exploits will be discovered in other platforms. The potential consequences here can’t be overstated: the idea that robots can be taken over and used for nefarious purposes is already a science fiction trope, but the impact of a high-profile robot hack on the reputation of the commercial robotics industry is unclear. Robotics companies are barely talking about security in public, despite how damaging even the perception of an unsecured robot might be. A robot that is not under control has the potential to be a real physical danger.
At the IEEE Humanoids Conference in Seoul from 30 September to 2 October, Mayoral-Vilches has organized a workshop on Cybersecurity for Humanoids , where he will present a brief (co-authored with Makris and Finisterre) titled Humanoid Robots as Attack Vectors . Despite the title, their intent is not to overhype the problem but instead to encourage roboticists (and robotics companies) to take security seriously, and not treat it as an afterthought. As Mayoral-Vilches points out, “robots are only safe if secure.”
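The code-injection half of the exploit described above is a classic command-injection pattern. Here is a hypothetical, simplified sketch of that vulnerability class (not Unitree's actual firmware, whose internals are known only from the researchers' write-up): if setup code interpolates an attacker-supplied SSID directly into a shell command that runs as root, shell metacharacters in the "SSID" become executable commands; quoting untrusted input is the standard mitigation.

```python
import shlex

# Hypothetical illustration of the vulnerability class, NOT Unitree's code:
# a Wi-Fi setup step builds a root shell command from attacker-controlled input.

def build_wifi_cmd_unsafe(ssid: str, password: str) -> str:
    # Vulnerable: an 'SSID' like "HomeWifi; reboot" smuggles a second
    # command into a string the firmware would hand to a root shell.
    return f"nmcli dev wifi connect {ssid} password {password}"

def build_wifi_cmd_safe(ssid: str, password: str) -> str:
    # Mitigation: quote untrusted strings so the shell treats them as data.
    return f"nmcli dev wifi connect {shlex.quote(ssid)} password {shlex.quote(password)}"

payload = "HomeWifi; reboot"  # attacker-chosen network name
print(build_wifi_cmd_unsafe(payload, "pw"))  # "; reboot" would execute as root
print(build_wifi_cmd_safe(payload, "pw"))    # payload stays one quoted argument
```

The fix is trivial at the code level, which is part of what makes the researchers' frustration with the disclosure process understandable.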
September 25, 2025 at 1:37 PM
Video Friday: A Billion Dollars for Humanoid Robots https://spectrum.ieee.org/video-friday-billion-humanoid-robots
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ACTUATE 2025: 23–24 September 2025, SAN FRANCISCO
CoRL 2025: 27–30 September 2025, SEOUL
IEEE Humanoids: 30 September–2 October 2025, SEOUL
World Robot Summit: 10–12 October 2025, OSAKA, JAPAN
IROS 2025: 19–25 October 2025, HANGZHOU, CHINA

Enjoy today’s videos!

A billion dollars is a lot of money. And this is actual money, not just a valuation. But Figure already had a lot of money. So what are they going to be able to do now that they weren’t already doing, I wonder?

[ Figure ]

Robots often succeed in simulation but fail in reality. With PACE, we introduce a systematic approach to sim-to-real transfer.

[ Paper ]

Anthropomorphic robotic hands are essential for robots to learn from humans and operate in human environments. While most designs loosely mimic human hand kinematics and structure, achieving the dexterity and emergent behaviors present in human hands requires anthropomorphic design to extend beyond strict kinematic matching to also match passive compliant properties. We present ADAPT-Teleop, a system combining a robotic hand with human-matched kinematics, skin, and passive dynamics, along with a robotic arm for intuitive teleoperation.

[ Paper ]

This robot can walk without any electronic components in its body, because the power is transmitted through wires from motors concentrated outside of its body. Also, this robot’s front and rear legs are optimally coupled, and it can walk with just four wires.

[ JSK Lab ]

Thanks, Takahiro!

Five teams of Los Alamos engineers competed to build the ultimate hole-digging robot dog in a recent engineering sprint. In just days, teams programmed their robot dogs to dig, designing custom “paws” from materials like sheet metal, foam, and 3D-printed polymers.
The paws mimicked animal digging behaviors, from paddles and snowshoes to dew claws, and helped the robots avoid sinking into a 30-gallon soil bucket. Teams raced to see whose dog could dig the biggest hole and dig under a fence the fastest.

[ Los Alamos ]

This work presents UniPilot, a compact hardware-software autonomy payload that can be integrated across diverse robot embodiments to enable resilient autonomous operation in GPS-denied environments. The system integrates a multi-modal sensing suite including LiDAR, radar, vision, and inertial sensing for robust operation in conditions where uni-modal approaches may fail. A large number of experiments are conducted across diverse environments and on a variety of robot platforms to validate the mapping, planning, and safe navigation capabilities enabled by the payload.

[ NTNU ]

Thanks, Kostas!

KAIST Humanoid v0.5. Developed at the DRCD Lab, KAIST, with a control policy trained via reinforcement learning.

[ KAIST ]

I just like the determined little hops.

[ AgileX ]

I’m always a little bit suspicious of robotics labs that are exceptionally clean and organized.

[ PNDbotics ]

Er, has PAL Robotics ever actually seen a kangaroo...?

[ PAL ]

See Spots push. Push, Spots, push.

[ Tufts ]

Training humanoid robots to hike could accelerate development of embodied AI for tasks like autonomous search and rescue, ecological monitoring in unexplored places, and more, say University of Michigan researchers who developed an AI model that equips humanoids to hit the trails.

[ Michigan ]

I am dangerously close to no longer being impressed by breakdancing humanoid robots.

[ Fourier ]

This, though, would impress me.

[ Inria ]

In this interview, Clone’s co-founder and CEO Dhanush Radhakrishnan discusses the company’s path to creating the synthetic humans straight out of science fiction.
(If YouTube brilliantly attempts to auto-dub this for you, switch the audio track to original (which YouTube thinks is Polish) and the video will still be in English.)

[ Clone ]

This documentary takes you behind the scenes of the HMND 01 Alpha release: the breakthroughs, the failures, and the late nights of building the UK’s first industrial humanoid robot.

[ Humanoid ]

What is the role of ethical considerations in the development and deployment of robotic and automation technologies, and what are the responsibilities of researchers to ensure that these technologies advance in ways that are transparent, fair, and aligned with the broader well-being of society?

[ ICRA@40 ]

This UPenn GRASP SFI lecture is from Tairan He at NVIDIA, on “Scalable Sim-to-Real Learning for General-Purpose Humanoid Skills.” Humanoids represent the most versatile robotic platform, capable of walking, manipulating, and collaborating with people in human-centered environments. Yet, despite recent advances, building humanoids that can operate reliably in the real world remains a fundamental challenge. Progress has been hindered by difficulties in whole-body control, robust perceptive reasoning, and bridging the sim-to-real gap. In this talk, I will discuss how scalable simulation and learning can systematically overcome these barriers.

[ UPenn ]
September 19, 2025 at 3:30 PM
Video Friday: A Soft Robot Companion
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ACTUATE 2025: 23–24 September 2025, SAN FRANCISCO
CoRL 2025: 27–30 September 2025, SEOUL
IEEE Humanoids: 30 September–2 October 2025, SEOUL
World Robot Summit: 10–12 October 2025, OSAKA, JAPAN
IROS 2025: 19–25 October 2025, HANGZHOU, CHINA

Enjoy today’s videos!

Fourier’s first care-bot, GR-3. This full-size “care-bot” is designed for interactive companionship. Its soft-touch outer shell and multimodal emotional interaction system bring the concept of “warm tech companionship” to life.

I like that it’s soft to the touch, although I’m not sure that encouraging touch is safe. Reminds me a little bit of Valkyrie, where NASA put a lot of thought into the soft aspects of the robot.

[ Fourier ]

TAKE MY MONEY

This 112-gram micro air vehicle (MAV) features foldable propeller arms that can lock into a compact rectangular profile comparable to the size of a smartphone. The vehicle can be launched by simply throwing it in the air, at which point the arms unfold and autonomously stabilize to a hovering state. Multiple flight tests demonstrated the capability of the feedback controller to stabilize the MAV from different initial conditions, including tumbling rates of up to 2,500 deg/s.

[ AVFL ]

The U.S. Naval Research Laboratory (NRL), in collaboration with NASA, is advancing space robotics by deploying reinforcement learning algorithms onto Astrobee, a free-flying robotic assistant on board the International Space Station. This video highlights how NRL researchers are leveraging artificial intelligence to enable robots to learn, adapt, and perform tasks autonomously. By integrating reinforcement learning, Astrobee can improve maneuverability and optimize energy use.
[ NRL ]

Every day I’m scuttlin’

[ Ground Control Robotics ]

Trust is built. Every part of our robot Proxie, from wheels to eyes, is designed with trust in mind. Cobot CEO Brad Porter explains the intent behind its design.

[ Cobot ]

Phase 1: Build lots of small quadruped robots. Phase 2: ? Phase 3: Profit!

[ DEEP Robotics ]

LAPP USA partnered with Corvus Robotics to solve a long-standing supply chain challenge: labor-intensive, error-prone inventory counting.

[ Corvus ]

I’m pretty sure that 95 percent of all science consists of moving small amounts of liquid from one container to another.

[ Flexiv ]

Raffaello D’Andrea, interviewed at ICRA 2025.

[ Verity ]

Tessa Lau, interviewed at ICRA 2025.

[ Dusty Robotics ]

Ever wanted to look inside the mind behind a cutting-edge humanoid robot? In this special episode, we have Dr. Aaron, our Product Manager at LimX Dynamics, for an exclusive deep dive into the LimX Oli.

[ LimX Dynamics ]
September 12, 2025 at 5:04 PM
Reality Is Ruining the Humanoid Robot Hype https://spectrum.ieee.org/humanoid-robot-scaling
Over the next several years, humanoid robots will change the nature of work. Or at least, that’s what humanoid robotics companies have been consistently promising, enabling them to raise hundreds of millions of dollars at valuations that run into the billions. Delivering on these promises will require a lot of robots. Agility Robotics expects to ship “hundreds” of its Digit robots in 2025 and has a factory in Oregon capable of building over 10,000 robots per year. Tesla is planning to produce 5,000 of its Optimus robots in 2025, and at least 50,000 in 2026. Figure believes “there is a path to 100,000 robots” by 2029. And these are just three of the largest companies in an increasingly crowded space.

Amplifying this message are many financial analysts: Bank of America Global Research, for example, predicts that global humanoid robot shipments will reach 18,000 units in 2025. And Morgan Stanley Research estimates that by 2050 there could be over 1 billion humanoid robots, part of a US $5 trillion market.

But as of now, the market for humanoid robots is almost entirely hypothetical. Even the most successful companies in this space have deployed only a small handful of robots in carefully controlled pilot projects. And future projections seem to be based on an extraordinarily broad interpretation of jobs that a capable, efficient, and safe humanoid robot—which does not currently exist—might conceivably be able to do. Can the current reality connect with the promised scale?

What Will It Take to Scale Humanoid Robots?

Physically building tens of thousands, or even hundreds of thousands, of humanoid robots is certainly possible in the near term. In 2023, on the order of 500,000 industrial robots were installed worldwide. Under the basic assumption that a humanoid robot is approximately equivalent to four industrial arms in terms of components, existing supply chains should be able to support even the most optimistic near-term projections for humanoid manufacturing.
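A quick sanity check of that supply-chain claim, using the figures quoted above (the four-arms-per-humanoid equivalence is the article's rough estimate, not a precise bill of materials):

```python
# Figures from the article: ~500,000 industrial robots installed in 2023,
# and one humanoid assumed roughly equivalent to 4 industrial arms in components.
arms_installed_per_year = 500_000
arms_per_humanoid = 4

humanoid_equivalents = arms_installed_per_year // arms_per_humanoid
print(humanoid_equivalents)  # 125000
```

That 125,000-humanoids-per-year component capacity comfortably exceeds even Figure's "100,000 robots by 2029" projection, which is why building the robots is the easy part.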
But simply building the robots is arguably the easiest part of scaling humanoids, says Melonee Wise, who served as chief product officer at Agility Robotics until this month. “The bigger problem is demand—I don’t think anyone has found an application for humanoids that would require several thousand robots per facility.” Large deployments, Wise explains, are the most realistic way for a robotics company to scale its business, since onboarding any new client can take weeks or months. An alternative to deploying several thousand robots to do a single job is to deploy several hundred robots that can each do 10 jobs, which seems to be what most of the humanoid industry is betting on in the medium to long term. While there’s a belief across much of the humanoid robotics industry that rapid progress in AI must somehow translate into rapid progress toward multipurpose robots, it’s not clear how, when, or if that will happen. “I think what a lot of people are hoping for is they’re going to AI their way out of this,” says Wise. “But the reality of the situation is that currently AI is not robust enough to meet the requirements of the market.”

Bringing Humanoid Robots to Market

Market requirements for humanoid robots include a slew of extremely dull, extremely critical things like battery life, reliability, and safety. Of these, battery life is the most straightforward: For a robot to usefully do a job, it can’t spend most of its time charging. The next version of Agility’s Digit robot, which can handle payloads of up to 16 kilograms, includes a bulky “backpack” containing a battery with a charging ratio of 10 to 1: The robot can run for 90 minutes and fully recharge in 9 minutes. Slimmer humanoid robots from other companies must necessarily be making compromises to maintain their svelte form factors. In operation, Digit will probably spend a few minutes charging after running for 30 minutes.
That’s because 60 minutes of Digit’s runtime is essentially a reserve in case something happens in its workspace that requires it to temporarily pause, a not-infrequent occurrence in the logistics and manufacturing environments that Agility is targeting. Without a 60-minute reserve, the robot would be much more likely to run out of power mid-task and need to be manually recharged. Consider what that might look like with even a modest deployment of several hundred robots weighing over a hundred kilograms each. “No one wants to deal with that,” comments Wise. Potential customers for humanoid robots are very concerned with downtime. Over the course of a month, a factory operating at 99 percent reliability will see approximately 5 hours of downtime. Wise says that any downtime that stops something like a production line can cost tens of thousands of dollars per minute, which is why many industrial customers expect a couple more 9s of reliability: 99.99 percent. Wise says that Agility has demonstrated this level of reliability in some specific applications, but not in the context of multipurpose or general-purpose functionality. Humanoid Robot Safety A humanoid robot in an industrial environment must meet general safety requirements for industrial machines. In the past, robotic systems like autonomous vehicles and drones have benefited from immature regulatory environments to scale quickly. But Wise says that approach can’t work for humanoids, because the industry is already heavily regulated—the robot is simply considered another piece of machinery. There are also more specific safety standards currently under development for humanoid robots, explains Matt Powers, associate director of autonomy R&D at Boston Dynamics. He notes that his company is helping develop an International Organization for Standardization (ISO) safety standard for dynamically balancing legged robots.
“We’re very happy that the top players in the field, like Agility and Figure, are joining us in developing a way to explain why we believe that the systems that we’re deploying are safe,” Powers says. These standards are necessary because the traditional safety approach of cutting power may not be a good option for a dynamically balancing system. Doing so will cause a humanoid robot to fall over, potentially making the situation even worse. There is no simple solution to this problem, and the initial approach that Boston Dynamics expects to take with its Atlas robot is to keep the robot out of situations where simply powering it off might not be the best option. “We’re going to start with relatively low-risk deployments, and then expand as we build confidence in our safety systems,” Powers says. “I think a methodical approach is really going to be the winner here.” In practice, low risk means keeping humanoid robots away from people. But humanoids that are restricted in what jobs they can safely do and where they can safely move are going to have more trouble finding tasks that provide value. Are Humanoids the Answer? The issues of demand, battery life, reliability, and safety all need to be solved before humanoid robots can scale. But a more fundamental question to ask is whether a bipedal robot is actually worth the trouble. Dynamic balancing with legs would theoretically enable these robots to navigate complex environments like a human. Yet demo videos show these humanoid robots as either mostly stationary or repetitively moving short distances over flat floors. The promise is that what we’re seeing now is just the first step toward humanlike mobility. But in the short to medium term, there are much more reliable, efficient, and cost-effective platforms that can take over in these situations: robots with arms, but with wheels instead of legs. Safe and reliable humanoid robots have the potential to revolutionize the labor market at some point in the future.
But potential is just that, and despite the humanoid enthusiasm, we have to be realistic about what it will take to turn potential into reality. This article appears in the October 2025 print issue as “Why Humanoid Robots Aren’t Scaling.”
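For a sense of scale on the reliability and battery figures quoted above, here is a quick back-of-the-envelope sketch. The 16-hour (two-shift) operating day and 30-day month are my assumptions, chosen because they reproduce the roughly 5-hour downtime figure; the 99 percent and 99.99 percent reliability levels and Digit’s 90-minute runtime and 9-minute charge come from the article.

```python
# Illustrative arithmetic only; the operating schedule is an assumption,
# not a figure from Agility or any analyst.

HOURS_PER_MONTH = 16 * 30  # assumed two-shift operation: 480 operating hours/month

def monthly_downtime(reliability: float) -> float:
    """Expected downtime in hours per month at a given reliability level."""
    return HOURS_PER_MONTH * (1.0 - reliability)

print(f"99%    reliability: {monthly_downtime(0.99):.1f} h/month of downtime")
print(f"99.99% reliability: {monthly_downtime(0.9999):.3f} h/month of downtime")

# Digit's quoted charging ratio: 90 minutes of runtime per 9 minutes of charge.
runtime_min, charge_min = 90, 9
ratio = runtime_min / charge_min  # 10:1
print(f"charge ratio {ratio:.0f}:1 -> a 30-minute stint costs about {30 / ratio:.0f} min of charging")
```

The gap between the two reliability levels is the story: two extra nines shrink expected monthly downtime from hours to minutes.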
spectrum.ieee.org
September 11, 2025 at 2:27 PM
Large Behavior Models Are Helping Atlas Get to Work https://spectrum.ieee.org/boston-dynamics-atlas-scott-kuindersma
Large Behavior Models Are Helping Atlas Get to Work
Boston Dynamics can be forgiven, I think, for the relative lack of acrobatic prowess displayed by the new version of Atlas in (most of) its latest videos. In fact, if you look at this Atlas video from late last year, and compare it to Atlas’ most recent video, it’s doing what looks to be more or less the same logistics-y stuff—all of which is far less visually exciting than backflips. But I would argue that the relatively dull tasks Atlas is working on now, moving car parts and totes and whatnot, are just as impressive. Making a humanoid that can consistently and economically and safely do useful things over the long term could very well be the hardest problem in robotics right now, and Boston Dynamics is taking it seriously. Last October, Boston Dynamics announced a partnership with Toyota Research Institute with the goal of general-purpose-izing Atlas. We’re now starting to see the results of that partnership, and Boston Dynamics’ vice president of robotics research, Scott Kuindersma, takes us through the progress they’ve made. Building AI Generalist Robots While the context of this work is “building AI generalist robots,” I’m not sure that anyone really knows what a “generalist robot” would actually look like, or how we’ll even know when someone has achieved it. Humans are generalists, sort of—we can potentially do a lot of things, and we’re fairly adaptable and flexible in many situations, but we still require training for most tasks. I bring this up just to try and contextualize expectations, because I think a successful humanoid robot doesn’t have to actually be a generalist, but instead just has to be capable of doing several different kinds of tasks, and to be adaptable and flexible in the context of those tasks. And that’s already difficult enough.
The approach that the two companies are taking is to leverage large behavior models (LBMs), which combine more general world knowledge with specific task knowledge to help Atlas with that adaptability and flexibility thing. As Boston Dynamics points out in a recent blog post, “the field is steadily accumulating evidence that policies trained on a large corpus of diverse task data can generalize and recover better than specialist policies that are trained to solve one or a small number of tasks.” Essentially, the goal is to develop a foundational policy that covers things like movement and manipulation, and then add more specific training (provided by humans) on top of that for specific tasks. The video below shows how that’s going so far. What the video doesn’t show is the training system that Boston Dynamics uses to teach Atlas to do these tasks. It’s essentially imitation learning: an operator wearing a motion tracking system teleoperates Atlas through motion and manipulation tasks. There’s a one-to-one mapping between the operator and the robot, making it fairly intuitive, although as anyone who has tried to teleoperate a robot with a surfeit of degrees of freedom can attest, it takes some practice to do it well. A motion tracking system provides high-quality task training data for Atlas. Boston Dynamics This interface provides very high-quality demonstration data for Atlas, but it’s not the easiest to scale—just one of the challenges of deploying a multipurpose (different than generalist!) humanoid. For more about what’s going on behind the scenes in this video and Boston Dynamics’ strategy with Atlas, IEEE Spectrum spoke with Kuindersma. In a video from last October just as your partnership with Toyota Research Institute was beginning, Atlas was shown moving parts around and performing whole-body manipulation. What’s the key difference between that demonstration and what we’re seeing in the new video?
Scott Kuindersma: The big difference is how we programmed the behavior. The previous system was a more traditional robotics stack involving a combination of model-based controllers, planners, and machine learning models for perception all architected together to do end-to-end manipulation. Programming a new task on that system generally required roboticists or system integrators to touch code and tell the robot what to do. For this new video, we replaced most of that system with a single neural network that was trained on demonstration data. This is much more flexible because there’s no task-specific programming or other open-ended creative engineering required. Basically, if you can teleoperate the robot to do a task, you can train the network to reproduce that behavior. This approach is more flexible and scalable because it allows people without advanced degrees in robotics to “program” the robot. We’re talking about a large behavior model (LBM) here, right? What would you call the kind of learning that this model does? Kuindersma: It is a kind of imitation learning. We collect many teleoperation demonstrations and train a neural network to reproduce the input-output behaviors in the data. The inputs are things like raw robot camera images, natural language descriptions of the task, and proprioception, and the outputs are the same teleop commands sent by the human interface. What makes it a large behavior model is that we collect data from many different tasks and, in some cases, many different robot embodiments, using all of that as training data for the robot to end up with a single policy that knows how to do many things. The idea is that by training the network on a much wider variety of data and tasks and robots, its ability to generalize will be better. 
As a field, we are still in the early days of gathering evidence that this is actually the case (our [Toyota Research Institute] collaborators are among those leading the charge), but we expect it is true based on the empirical trends we see in robotics and other AI domains. So the idea with the behavior model is that it will be more generalizable, more adaptable, or require less training because it will have a baseline understanding of how things work? Kuindersma: Exactly, that’s the idea. At a certain scale, once the model has seen enough through its training data, it should have some ability to take what it’s learned from one set of tasks and apply those learnings to new tasks. One of the things that makes these models flexible is that they are conditioned on language. We collect teleop demonstrations and then post-annotate that data with language, having humans or language models describe in English what is happening. The network then learns to associate these language prompts with the robot’s behaviors. Then, you can tell the model what to do in English, and it has a chance of actually doing it. At a certain scale, we hope it won’t take hundreds of demonstrations for the robot to do a task; maybe only a couple, and maybe way in the future, you might be able to just tell the robot what to do in English, and it will know how to do it, even if the task requires dexterity beyond simple object pick-and-place. There are a lot of robot videos out there of robots doing stuff that might look similar to what we’re seeing here. Can you tell me how what Boston Dynamics and Toyota Research Institute are doing is unique? Kuindersma: Many groups are using AI tools for robot demos, but there are some differences in our strategic approach. From our perspective, it’s crucial for the robot to perform the full breadth of humanoid manipulation tasks.
That means, if you use a data-driven approach, you need to somehow funnel those embodied experiences into the dataset you’re using to train the model. We spent a lot of time building a highly expressive teleop interface for Atlas, which allows operators to move the robot around quickly, take steps, balance on one foot, reach the floor and high shelves, throw and catch things, and so on. The ability to directly mirror a human body in real time is vital for Atlas to act like a real humanoid laborer. If you’re just standing in front of a table and moving things around, sure, you can do that with a humanoid, but you can do it with much cheaper and simpler robots, too. If you instead want to, say, bend down and pick up something from between your legs, you have to make careful adjustments to the entire body while doing manipulation. The tasks we’ve been working on with Atlas over the last couple of months have been focused more on collecting this type of data, and we’re committed to making these AI models extremely performant so the motions are smooth, fast, beautiful, and fully cover what humanoids can do. Is it a constraint that you’re using imitation learning, given that Atlas is built to move in ways that humans can’t? How do you expand the operating envelope with this kind of training? Kuindersma: That’s a great question. There are a few ways to think about it: Atlas can certainly do things like continuous joint rotation that people can’t. While those capabilities might offer efficiency benefits, I would argue that if Atlas only behaved exactly like a competent human, that would be amazing, and we would be very happy with that. We could extend our teleop interface to make available types of motions the robot can do but a person can’t. The downside is this would probably make teleoperation less intuitive, requiring a more highly trained expert, which reduces scalability.
We may be able to co-train our large behavior models with data sources that are not just teleoperation-based. For example, in simulation, you could use rollouts from reinforcement learning policies or programmatic planners as augmented demonstrations that include these high-range-of-motion capabilities. The LBM can then learn to leverage that in conjunction with teleop demonstrations. This is not just a hypothetical: we’ve actually found that co-training with simulation data has improved performance in the real robot, which is quite promising. Can you tell me what Atlas was directed to do in the video? Is it primarily trying to mirror its human-based training, or does it have some capacity to make decisions? Kuindersma: In this case, Atlas is responding primarily to visual and language cues to perform the task. At our current scale and with the model’s training, there’s a limited ability to completely innovate behaviors. However, you can see a lot of variety and responsiveness in the details of the motion, such as where specific parts are in the bin or where the bin itself is. As long as those experiences are reflected somewhere in the training data, the robot uses its real-time sensor observations to produce the right type of response. So, if the bin was too far away for the robot to reach, without specific training, would it move itself to the bin? Kuindersma: We haven’t done that experiment, but if the bin was too far away, I think it might take a step forward because we varied the initial conditions of the bin when we collected data, which sometimes required the operator to walk the robot to the bin. So there is a good chance that it would step forward, but there is also a small chance that it might try to reach and not succeed. It can be hard to make confident predictions about model behavior without running experiments, which is one of the fun features of working with models like this.
It’s interesting how a large behavior model, which provides world knowledge and flexibility, interacts with this instance of imitation learning, where the robot tries to mimic specific human actions. How much flexibility can the system take on when it’s operating based on human imitation? Kuindersma: It’s primarily a question of scale. A large behavior model is essentially imitation learning at scale, similar to a large language model. The hypothesis with large behavior models is that as they scale, generalization capabilities improve, allowing them to handle more real-world corner cases and require less training data for new tasks. Currently, the generalization of these models is limited, but we’re addressing that by gathering more data not only through teleoperating robots but also by exploring other scaling bets like non-teleop human demonstrations and sim/synthetic data. These other sources might have more of an “embodiment gap” to the robot, but the model’s ability to assimilate and translate between data sources could lead to better generalization. How much skill or experience does it take to effectively train Atlas through teleoperation? Kuindersma: We’ve had people on day tours jump in and do some teleop, moving the robot and picking things up. This ease of entry is thanks to our teams building a really nice interface: The user wears a VR headset, where they’re looking at a re-projection of the robot’s stereo RGB cameras, which are aligned to provide a 3D sense of vision, and there are built-in visual augmentations like desired hand locations and what the robot is actually doing to give people situational awareness. So while novice users can do things fairly easily, they’re probably not generating the highest quality motions for training policies. To generate high-quality data, and to do that consistently over a period of several hours, it typically takes a couple of weeks of onboarding.
We usually start with manipulation tasks and then progress to tasks involving repositioning the entire robot. It’s not trivial, but it’s doable. The people doing it now are not roboticists; we have a team of ‘robot teachers’ who are hired for this, and they’re awesome. It gives us a lot of hope for scaling up the operation as we build more robots. How is what you’re doing different from other companies that might lean much harder on scaling through simulation? Are you focusing more on how humans do things? Kuindersma: Many groups are doing similar things, with differences in technical approach, platform, and data strategy. You can characterize the strategies people are taking by thinking about a “data pyramid,” where the top of the pyramid is the highest quality, hardest-to-get data, which is typically teleoperation on the robot you’re working with. The middle of the pyramid might be egocentric data collected on people (e.g., by wearing sensorized gloves), simulation data, or other synthetic world models. And the bottom of the pyramid is data from YouTube or the rest of the Internet. Different groups allocate finite resources to different distributions of these data sources. For us, we believe it’s really important to have as large a baseline of actual on-robot data (at the top of the pyramid) as possible. Simulation and synthetic data are almost certainly part of the puzzle, and we’re investing resources there, but we’re taking a somewhat balanced data strategy rather than throwing all of our eggs in one basket. Ideally you want the top of the pyramid to be as big as possible, right? Kuindersma: Ideally, yes. But you won’t get to the scale you need by just doing that. You need the whole pyramid, but having as much high-quality data at the top as possible only helps. But it’s not like you can just have a super large bottom to the pyramid and not need the top? Kuindersma: I don’t think so. 
I believe there needs to be enough high-quality data for these models to effectively translate into the specific embodiment that they are executing on. There needs to be enough of that “top” data for the translation to happen, but no one knows the exact distribution, like whether you need 5 percent real robot data and 95 percent simulation, or some other ratio. Is that a box of ‘Puny-os’ on the shelf in the video? Part of this self-balancing robot. Boston Dynamics Kuindersma: Yeah! Alex Alspach from [Toyota Research Institute] brought it in to put in the background as an easter egg. What’s next for Atlas? Kuindersma: We’re really focused on maximizing the performance of manipulation behaviors. I think one of the things that we’re uniquely positioned to do well is reaching the full behavioral envelope of humanoids, including mobile bimanual manipulation, repetitive tasks, and strength, and getting the robot to move smoothly and dynamically using these models. We’re also developing repeatable processes to climb the robustness curve for these policies—we think reinforcement learning may play a key role in achieving this. We’re also looking at other types of scaling bets around these systems. Yes, it’s going to be very important that we have a lot of high-quality, on-robot, on-task data that we’re using as part of training these models. But we also think there are real opportunities in being able to leverage other data sources, whether that’s observing or instrumenting human workers or scaling up synthetic and simulation data, and understanding how those things can mix together to improve the performance of our models.
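The training recipe Kuindersma describes, many (observation, teleop command) demonstration pairs distilled into a single policy by supervised learning, is behavior cloning at its core. Here is a deliberately tiny, self-contained sketch of that idea; the dimensions, the linear “policy,” and the synthetic “expert” are all invented for illustration and bear no relation to Atlas’s actual models, where the observations would be camera images, language, and proprioception.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the real modalities: flat vectors instead of images,
# language embeddings, and joint states.
OBS_DIM, CMD_DIM, N_DEMOS = 32, 8, 512

# "Demonstrations": observation -> teleop-command pairs. The expert here is
# a random linear map plus a little noise, purely for illustration.
W_expert = rng.normal(size=(OBS_DIM, CMD_DIM))
obs = rng.normal(size=(N_DEMOS, OBS_DIM))
cmds = obs @ W_expert + 0.01 * rng.normal(size=(N_DEMOS, CMD_DIM))

# Behavior cloning reduces to supervised regression from observations to
# commands; a linear least-squares fit stands in for the neural network.
W_policy, *_ = np.linalg.lstsq(obs, cmds, rcond=None)

def policy(o: np.ndarray) -> np.ndarray:
    """Map an observation to a predicted teleop command."""
    return o @ W_policy

# The cloned policy should closely reproduce the expert on held-out states.
test_obs = rng.normal(size=(64, OBS_DIM))
err = np.abs(policy(test_obs) - test_obs @ W_expert).mean()
print(f"mean command error on held-out observations: {err:.4f}")
```

The “large” in large behavior model is what this sketch leaves out: one network trained across many tasks, embodiments, and language annotations rather than one regression per task.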
September 7, 2025 at 1:00 PM
Video Friday: Robot Vacuum Climbs Stairs https://spectrum.ieee.org/video-friday-eufy-robot-vacuum
Video Friday: Robot Vacuum Climbs Stairs
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. ACTUATE 2025 : 23–24 September 2025, SAN FRANCISCO CoRL 2025 : 27–30 September 2025, SEOUL IEEE Humanoids : 30 September–2 October 2025, SEOUL World Robot Summit : 10–12 October 2025, OSAKA, JAPAN IROS 2025 : 19–25 October 2025, HANGZHOU, CHINA Enjoy today’s videos! This is ridiculous and I love it. [ Eufy ] At ICRA 2024, we met Paul Nadan to learn about how his LORIS robot climbs up walls by sticking itself to rocks. [ CMU ] If a humanoid robot is going to load my dishwasher, I expect it to do so optimally, not all haphazardly like a puny human. [ Figure ] Humanoid robots have recently achieved impressive progress in locomotion and whole-body control, yet they remain constrained in tasks that demand rapid interaction with dynamic environments through manipulation. Table tennis exemplifies such a challenge: with ball speeds exceeding 5 m/s, players must perceive, predict, and act within sub-second reaction times, requiring both agility and precision. To address this, we present a hierarchical framework for humanoid table tennis that integrates a model-based planner for ball trajectory prediction and racket target planning with a reinforcement learning–based whole-body controller. [ Hybrid Robotics ] Despite their promise, today’s biohybrid robots typically underperform their fully synthetic counterparts and their potential as predicted from a reductionist assessment of constituents. Many systems represent enticing proofs of concept with limited practical applicability. Most remain confined to controlled laboratory settings and lack feasibility in complex real-world environments. Developing biohybrid robots is currently a painstaking, bespoke process, and the resulting systems are routinely inadequately characterized.
Complex, intertwined relationships between component, interface, and system performance are poorly understood, and methodologies to guide informed design of biohybrid systems are lacking. The HyBRIDS ARC opportunity seeks ideas to address the question: How can synthetic and biological components be integrated to enable biohybrid platforms that outperform traditional robotic systems? [ DARPA ] Robotic systems will play a key role in future lunar missions, and a great deal of research is currently being conducted in this area. One such project is SAMLER-KI (Semi-Autonomous Micro Rover for Lunar Exploration Using Artificial Intelligence), a collaboration between the German Research Center for Artificial Intelligence (DFKI) and the University of Applied Sciences Aachen (FH Aachen), Germany. The project focuses on the conceptual design of a semi-autonomous micro rover that is capable of surviving lunar nights while remaining within the size class of a micro rover. During development, conditions on the Moon such as dust exposure, radiation, and the vacuum of space are taken into account, along with the 14-Earth-day duration of a lunar night. [ DFKI ] ARMstrong Dex is a human-scale dual-arm hydraulic robot developed by the Korea Atomic Energy Research Institute (KAERI) for disaster response applications. It is capable of lifting its own body through vertical pull-ups and manipulating objects over 50 kg, demonstrating strength beyond human capabilities. In this test, ARMstrong Dex used a handheld saw to cut through a thick 40×90 mm wood beam. Sawing is a physically demanding task involving repetitive force application, fine trajectory control, and real-time coordination. [ KAERI ] This robot stole my “OMG I HAVE JUICE” face. [ Pudu Robotics ] The best way of dodging a punch to the face is to just have a big hole where your face should be. I do wish they wouldn’t call it a combat robot, though. [ Unitree ] It really might be fun to have a DRC-style event for quadrupeds.
[ DEEP Robotics ] CMU researchers are developing new technology to enable robots to physically interact with people who are not able to care for themselves. These breakthroughs are being deployed in the real world, making it possible for individuals with neurological diseases, stroke, multiple sclerosis, ALS and dementia to be able to eat, clean and get dressed fully on their own. [ CMU ] Caracol’s additive manufacturing platforms use KUKA robotic arms to produce large-scale industrial parts with precision and flexibility. This video outlines how Caracol integrates multi-axis robotics, modular extruders, and proprietary software to support production in sectors like aerospace, marine, automotive, and architecture. [ KUKA ] There were a couple of robots at ICRA 2025, as you might expect. [ ICRA ] On June 6, 1990, following the conclusion of Voyager’s planetary explorations, mission representatives held a news conference at NASA’s Jet Propulsion Laboratory in Southern California to summarize key findings and answer questions from the media. In the briefing, Voyager’s longtime project scientist Ed Stone, along with renowned science communicator Carl Sagan, also revealed the mission’s “Solar System Family Portrait,” a mosaic comprising images of six of the solar system’s eight planets. Carl Sagan was a member of the Voyager imaging team and instrumental in capturing these images and bringing them to the public. Carl Sagan, man. Carl Sagan. Blue Dot unveil was right around 57:00, if you missed it. [ JPL ]
September 5, 2025 at 4:01 PM
Video Friday: Spot’s Got Talent
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. CLAWAR 2025 : 5–7 September 2025, SHENZHEN, CHINA ACTUATE 2025 : 23–24 September 2025, SAN FRANCISCO CoRL 2025 : 27–30 September 2025, SEOUL IEEE Humanoids : 30 September–2 October 2025, SEOUL World Robot Summit : 10–12 October 2025, OSAKA, JAPAN IROS 2025 : 19–25 October 2025, HANGZHOU, CHINA Enjoy today’s videos! Boston Dynamics is back and their dancing robot dogs are bigger, better, and bolder than ever! Watch as they bring a “dead” robot to life and unleash a never before seen synchronized dance routine to “Good Vibrations.” And much more interestingly, here’s a discussion of how they made it work: [ Boston Dynamics ] I don’t especially care whether a robot falls over. I care whether it gets itself back up again. [ LimX Dynamics ] The robot autonomously connects multiple wires to the environment using small flying anchors—drones equipped with anchoring mechanisms at the wire tips. Guided by an onboard RGB-D camera for control and environmental recognition, the system enables wire attachment in unprepared environments and supports simultaneous multi-wire connections, expanding the operational range of wire-driven robots. [ JSK Robotics Laboratory ] at [ University of Tokyo ] Thanks, Shintaro! For a robot that barely has a face, this is some pretty good emoting. [ Pollen ] Learning skills from human motions offers a promising path toward generalizable policies for whole-body humanoid control, yet two key cornerstones are missing: (1) a scalable, high-quality motion tracking framework that faithfully transforms kinematic references into robust, extremely dynamic motions on real hardware, and (2) a distillation approach that can effectively learn these motion primitives and compose them to solve downstream tasks.
We address these gaps with BeyondMimic, a real-world framework to learn from human motions for versatile and naturalistic humanoid control via guided diffusion. [ Hybrid Robotics ] Introducing our open-source metal-made bipedal robot MEVITA. All components can be procured through e-commerce, and the robot is built with a minimal number of parts. All hardware, software, and learning environments are released as open source. [ MEVITA ] Thanks, Kento! I’ve always thought that being able to rent robots (or exoskeletons) to help you move furniture or otherwise carry stuff would be very useful. [ DEEP Robotics ] A new study explains how tiny water bugs use fan-like propellers to zip across streams at speeds up to 120 body lengths per second. The researchers then created a similar fan structure and used it to propel and maneuver an insect-sized robot. The discovery offers new possibilities for designing small machines that could operate during floods or other challenging situations. [ Georgia Tech ] Dynamic locomotion of legged robots is a critical yet challenging topic in expanding the operational range of mobile robots. To achieve generalized legged locomotion on diverse terrains while preserving the robustness of learning-based controllers, this paper proposes to learn an attention-based map encoding conditioned on robot proprioception, which is trained as part of the end-to-end controller using reinforcement learning. We show that the network learns to focus on steppable areas for future footholds when the robot dynamically navigates diverse and challenging terrains. [ Paper ] from [ ETH Zurich ] In the fifth installment of our Moonshot Podcast Deep Dive video interview series, X’s Captain of Moonshots Astro Teller sits down with Google DeepMind’s Chief Scientist Jeff Dean for a conversation about the origin of Jeff’s pioneering work scaling neural networks. 
They discuss the first time AI captured Jeff’s imagination, the earliest Google Brain framework, the team’s stratospheric advancements in image recognition and speech-to-text, how AI is evolving, and more. [ Moonshot Podcast ]
August 29, 2025 at 4:30 PM
Video Friday: Inaugural World Humanoid Robot Games Held https://spectrum.ieee.org/world-humanoid-robot-games
Video Friday: Inaugural World Humanoid Robot Games Held
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. RO-MAN 2025 : 25–29 August 2025, EINDHOVEN, THE NETHERLANDS CLAWAR 2025 : 5–7 September 2025, SHENZHEN, CHINA ACTUATE 2025 : 23–24 September 2025, SAN FRANCISCO CoRL 2025 : 27–30 September 2025, SEOUL IEEE Humanoids : 30 September–2 October 2025, SEOUL World Robot Summit : 10–12 October 2025, OSAKA, JAPAN IROS 2025 : 19–25 October 2025, HANGZHOU, CHINA Enjoy today’s videos! The First World Humanoid Robot Games Conclude Successfully! Unitree Strikes Four Golds (1500m, 400m, 100m Obstacle, 4×100m Relay). [ Unitree ] Steady! PNDbotics Adam has become the only full-size humanoid robot athlete to successfully finish the 100m Obstacle Race at the World Humanoid Robot Games! [ PNDbotics ] Introducing Field Foundation Models (FFMs) from FieldAI - a new class of “physics-first” foundation models built specifically for embodied intelligence. Unlike conventional vision or language models retrofitted for robotics, FFMs are designed from the ground up to grapple with uncertainty, risk, and the physical constraints of the real world. This enables safe and reliable robot behaviors when managing scenarios that they have not been trained on, navigating dynamic, unstructured environments without prior maps, GPS, or predefined paths. [ Field AI ] Multiply Labs, leveraging Universal Robots’ collaborative robots, has developed a groundbreaking robotic cluster that is fundamentally transforming the manufacturing of life-saving cell and gene therapies. The Multiply Labs solution drives a staggering 74% cost reduction and enables up to 100x more patient doses per square foot of cleanroom. 
[ Universal Robots ] In this video, we put Vulcan V3, the world’s first ambidextrous humanoid robotic hand capable of performing the full American Sign Language (ASL) alphabet, to the ultimate test—side by side with a real human! [ Hackaday ] Thanks, Kelvin! More robots need to have this form factor. [ Texas A&M University ] Robotic vacuums are so pervasive now that it’s easy to forget how much of an icon the iRobot Roomba has been. [ iRobot ] This is quite possibly the largest robotic hand I’ve ever seen. [ CAFE Project ] via [ BUILT ] Modular robots built by Dartmouth researchers are finding their feet outdoors. Engineered to assemble into structures that best suit the task at hand, the robots are pieced together from cube-shaped robotic blocks that combine rigid rods and soft, stretchy strings whose tension can be adjusted to deform the blocks and control their shape. [ Dartmouth ] Our quadruped robot X30 has completed extreme-environment missions in Hoh Xil—supporting patrol teams, carrying vital supplies, and protecting fragile ecosystems. [ DEEP Robotics ] We propose a base-shaped robot named “koboshi” that moves everyday objects. This koboshi has a spherical surface in contact with the floor, and by moving a weight inside using built-in motors, it can rock up and down, and side to side. By placing everyday items on this koboshi, users can impart new movement to otherwise static objects. The koboshi is equipped with sensors to measure its posture, enabling interaction with users. Additionally, it has communication capabilities, allowing multiple units to communicate with each other. [ Paper ] Bi-LAT is the world’s first Vision-Language-Action (VLA) model that integrates bilateral control into imitation learning, enabling robots to adjust force levels based on natural language instructions. [ Bi-LAT ] to be presented at [ IEEE RO-MAN 2025 ] Thanks, Masato! Look at this jaunty little guy!
Although, they very obviously cut the video right before it smashes face first into furniture more than once. [ Paper ] to be presented at [ 2025 IEEE-RAS International Conference on Humanoid Robotics ] This research has been conducted at the Human Centered Robotics Lab at UT Austin. The video shows our latest experimental bipedal robot, dubbed Mercury, which has passive feet. This means that there are no actuated ankles, unlike humans, forcing Mercury to gain balance by dynamically stepping. [ University of Texas at Austin Human Centered Robotics Lab ] We put two RIVR delivery robots to work with an autonomous vehicle — showing how Physical AI can handle the full last mile, from warehouse to consumers’ doorsteps. [ Rivr ] The KR TITAN ultra is a high-performance industrial robot weighing 4.6 tonnes and capable of handling payloads up to 1.5 tonnes. [ Kuka ] CMU MechE’s Ding Zhao and Ph.D. student Yaru Niu describe LocoMan, a robotic assistant they have been developing. [ Carnegie Mellon University ] Twenty-two years ago, Silicon Valley executive Henry Evans had a massive stroke that left him mute and paralyzed from the neck down. But that didn’t prevent him from becoming a leading advocate of adaptive robotic tech to help disabled people – or from writing country songs, one letter at a time. Correspondent John Blackstone talks with Evans about his upbeat attitude and unlikely pursuits. [ CBS News ]
August 22, 2025 at 3:30 PM
Video Friday: SCUTTLE
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. RO-MAN 2025 : 25–29 August 2025, EINDHOVEN, THE NETHERLANDS CLAWAR 2025 : 5–7 September 2025, SHENZHEN, CHINA ACTUATE 2025 : 23–24 September 2025, SAN FRANCISCO CoRL 2025 : 27–30 September 2025, SEOUL IEEE Humanoids : 30 September–2 October 2025, SEOUL World Robot Summit : 10–12 October 2025, OSAKA, JAPAN IROS 2025 : 19–25 October 2025, HANGZHOU, CHINA Enjoy today’s videos! Check out our latest innovations on SCUTTLE, advancing multilegged mobility anywhere. [ GCR ] That laundry-folding robot we’ve been working on for 15 years is still not here yet. Honestly, I think Figure could learn a few tricks from the vintage UC Berkeley PR2, though. [ Figure ] Tensegrity robots are so cool, but so hard—it’s good to see progress. [ Michigan Robotics ] We should find out next week how quick this is. [ Unitree ] We introduce a methodology for task-specific design optimization of multirotor Micro Aerial Vehicles. By leveraging reinforcement learning, Bayesian optimization, and covariance matrix adaptation evolution strategy, we optimize aerial robot designs guided only by their closed-loop performance in a considered task. Our approach systematically explores the design space of motor pose configurations while ensuring manufacturability constraints and minimal aerodynamic interference. Results demonstrate that optimized designs achieve superior performance compared to conventional multirotor configurations in agile waypoint navigation tasks, including against fully actuated designs from the literature. We build and test one of the optimized designs in the real world to validate the sim2real transferability of our approach. [ ARL ] Thanks, Kostas!
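The ARL abstract above couples reinforcement learning with black-box search over motor pose configurations. As a toy illustration of just the black-box search piece, here is a minimal elitist evolution-strategy loop over hypothetical motor tilt angles. The `task_reward` function and the 0.3 rad "sweet spot" are invented stand-ins; in the actual work, the score comes from closed-loop task performance in simulation, not a closed-form function.

```python
import random

def task_reward(tilt_angles):
    """Invented stand-in for closed-loop task performance:
    rewards motor tilts near a hypothetical 0.3 rad sweet spot."""
    return -sum((a - 0.3) ** 2 for a in tilt_angles)

def evolve(n_motors=4, generations=80, pop=20, sigma=0.2, seed=0):
    """Elitist evolution-strategy search over motor tilt angles."""
    rng = random.Random(seed)
    best = [0.0] * n_motors  # start from a flat, conventional layout
    for _ in range(generations):
        # Sample perturbed candidate designs around the incumbent
        cands = [[a + rng.gauss(0, sigma) for a in best] for _ in range(pop)]
        cands.append(best)  # elitism: the reward never regresses
        best = max(cands, key=task_reward)
        sigma *= 0.97  # shrink the search radius over time
    return best

print([round(a, 2) for a in evolve()])
```

A real CMA-ES additionally adapts the full covariance of the sampling distribution rather than a single scalar `sigma`, and manufacturability constraints would be enforced by rejecting or penalizing infeasible samples.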
I guess legs are required for this inspection application because of the stairs right at the beginning? But sometimes, that’s how the world is. [ DEEP Robotics ] The Institute of Robotics and Mechatronics at DLR has a long tradition in developing multi-fingered hands, creating novel mechatronic concepts as well as autonomous grasping and manipulation capabilities. The range of hands spans from Rotex, a first two-fingered gripper for space applications, to the highly anthropomorphic Awiwi Hand and variable stiffness end effectors. This video summarizes the developments of DLR in this field over the past 30 years, starting with the Rotex experiment in 1993. [ DLR RM ] The quest for agile quadrupedal robots is limited by handcrafted reward design in reinforcement learning. While animal motion capture provides 3D references, its cost prohibits scaling. We address this with a novel video-based framework. The proposed framework significantly advances robotic locomotion capabilities. [ Arc Lab ] Serious question: Why don’t humanoid robots sit down more often? [ EngineAI ] And now, this. [ LimX Dynamics ] NASA researchers are currently using wind tunnel and flight tests to gather data on an electric vertical takeoff and landing (eVTOL) scaled-down small aircraft that resembles an air taxi that aircraft manufacturers can use for their own designs. By using a smaller version of a full-sized aircraft called the RAVEN Subscale Wind Tunnel and Flight Test (RAVEN SWFT) vehicle, NASA is able to conduct its tests in a fast and cost-effective manner. [ NASA ] This video details the advances in orbital manipulation made by DLR’s Robotic and Mechatronics Center over the past 30 years, paving the way for the development of robotic technology for space sustainability. [ DLR RM ] This summer, a team of robots explored a simulated Martian landscape in Germany, remotely guided by an astronaut aboard the International Space Station. 
This marked the fourth and final session of the Surface Avatar experiment, a collaboration between ESA and the German Aerospace Center (‪DLR) to develop how astronauts can control robotic teams to perform complex tasks on the Moon and Mars. [ ESA ]
August 15, 2025 at 4:00 PM
Video Friday: Unitree’s A2 Quadruped Goes Exploring https://spectrum.ieee.org/video-friday-exploration-robots
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. World Humanoid Robot Games : 15–17 August 2025, BEIJING RO-MAN 2025 : 25–29 August 2025, EINDHOVEN, THE NETHERLANDS CLAWAR 2025 : 5–7 September 2025, SHENZHEN, CHINA ACTUATE 2025 : 23–24 September 2025, SAN FRANCISCO CoRL 2025 : 27–30 September 2025, SEOUL IEEE Humanoids : 30 September–2 October 2025, SEOUL World Robot Summit : 10–12 October 2025, OSAKA, JAPAN IROS 2025 : 19–25 October 2025, HANGZHOU, CHINA Enjoy today’s videos! The A2 sets a new standard in quadruped robots, balancing endurance, strength, speed, and perception. The A2 weighs 37 kg (81.6 lbs) unloaded. Fully loaded with a 25 kg (55 lbs) payload, it can continuously walk for 3 hours or approximately 12.5 km. Unloaded, it can continuously walk for 5 hours or approximately 20 km. Hot-swappable dual batteries enable seamless battery swap and continuous runtime for any mission. [ Unitree ] Thanks, William! ABB is working with Cosmic Buildings to reshape how communities rebuild and transform construction after disaster. In response to the 2025 Southern California wildfires, Cosmic Buildings are deploying mobile robotic microfactories to build modular homes on-site—cutting construction time by 70% and costs by 30%. [ ABB ] Thanks, Caitlin! How many slightly awkward engineers can your humanoid robot pull? [ MagicLab ] The physical robot hand does some nifty stuff at about 1 minute in. [ ETH Zurich Soft Robotics Lab ] Biologists, you can all go home now. [ AgileX ] The World Humanoid Robot Games starts next week in Beijing, and of course Tech United Eindhoven are there. [ Tech United ] Our USX-1 Defiant is a new kind of autonomous maritime platform , with the potential to transform the way we design and build ships. 
As the team prepares Defiant for an extended at-sea demonstration, program manager Greg Avicola shares the foundational thinking behind the breakthrough vessel. [ DARPA ] After loss, how do you translate grief into creation? Meditation Upon Death is Paul Kirby’s most personal and profound painting—a journey through love, loss, and the mystery of the afterlife. Inspired by a conversation with a Native American shaman and years of artistic exploration, Paul fuses technology and traditional art to capture the spirit’s passage beyond. With 5,796 brushstrokes, a custom-built robotic painting system, and a vision shaped by memory and devotion, this is the most important painting he has ever made. [ Dulcinea ] Thanks, Alexandra! In the fourth installment of our Moonshot Podcast Deep Dive video interview series, X’s Captain of Moonshots Astro Teller sits down with Andrew Ng, the founder of Google Brain and DeepLearning.AI, for a conversation about the history of neural network research and how Andrew’s pioneering ideas led to some of the biggest breakthroughs in modern-day AI. [ Moonshot Podcast ]
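Circling back to the Unitree A2 spec sheet at the top of this post: the quoted endurance figures imply average walking speeds that are easy to sanity-check. A rough calculation from the quoted numbers only, ignoring terrain, stops, and charging:

```python
# Quoted A2 figures: 12.5 km in 3 h with a 25 kg payload; 20 km in 5 h unloaded.
loaded_speed = 12.5 / 3   # km/h, fully loaded
unloaded_speed = 20 / 5   # km/h, unloaded
print(f"loaded:   {loaded_speed:.2f} km/h")
print(f"unloaded: {unloaded_speed:.2f} km/h")
# The implied paces are nearly identical (~4 km/h); per these numbers,
# the 25 kg payload mostly costs runtime, not walking speed.
```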
August 8, 2025 at 3:30 PM
Video Friday: Dance With CHILD
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. RO-MAN 2025 : 25–29 August 2025, EINDHOVEN, THE NETHERLANDS CLAWAR 2025 : 5–7 September 2025, SHENZHEN, CHINA ACTUATE 2025 : 23–24 September 2025, SAN FRANCISCO CoRL 2025 : 27–30 September 2025, SEOUL IEEE Humanoids : 30 September–2 October 2025, SEOUL World Robot Summit : 10–12 October 2025, OSAKA, JAPAN IROS 2025 : 19–25 October 2025, HANGZHOU, CHINA Enjoy today’s videos! Many parents naturally teach motions to their child while using a baby carrier. In this setting, the parent’s range of motion fully encompasses the child’s, making it intuitive to scale down motions in a puppeteering manner. This inspired UIUC KIMLAB to build CHILD: Controller for Humanoid Imitation and Live Demonstration. The role of teleoperation has grown increasingly important with the rising interest in collecting physical data in the era of Physical/Embodied AI. We demonstrate the capabilities of CHILD through loco-manipulation and full-body control experiments using the Unitree G1 and other PAPRAS dual-arm systems. To promote accessibility and reproducibility, we open-source the hardware design. [ KIMLAB ] This costs less than US $6,000. [ Unitree ] If I wasn’t sold on one of these little Reachy Minis before, I definitely am now. [ Pollen ] In this study, we propose a falconry-like interaction system in which a flapping-wing drone performs autonomous palm landing motion on a human hand. To achieve a safe approach toward humans, our motion planning method considers both physical and psychological factors. I should point out that palm landings are not falconry-like at all, and that if you’re doing falconry right, the bird should be landing on your wrist instead. I have other hobbies besides robots, you know! 
[ Paper ] I’m not sure that augmented reality is good for all that much, but I do like this use case of interactive robot help. [ MRHaD ] Thanks, Masato! LimX Dynamics officially launched its general-purpose full-size humanoid robot LimX Oli. It’s currently available only in Mainland China. A global version is coming soon. Standing at 165 cm and equipped with 31 active degrees of freedom (excluding end-effectors), LimX Oli adopts a general-purpose humanoid configuration with modular hardware-software architecture and is supported by a development toolchain. It is built to advance embodied AI development from algorithm research to real-world deployment. [ LimX Dynamics ] Thanks, Jinyan! Meet Treadward – the newest robot from HEBI Robotics, purpose-built for rugged terrain, inspection missions, and real-world fieldwork. Treadward combines high mobility with extreme durability, making it ideal for challenging environments like waterlogged infrastructure, disaster zones, and construction sites. With a compact footprint and treaded base, it can climb over debris, traverse uneven ground, and carry substantial payloads. [ HEBI ] PNDbotics made a stunning debut at the 2025 World Artificial Intelligence Conference (WAIC) with the first-ever joint appearance of its full-sized humanoid robot Adam and its intelligent data-collection counterpart Adam-U. [ PNDbotics ] This paper presents the design, development, and validation of a fully autonomous dual-arm aerial robot capable of mapping, localizing, planning, and grasping parcels in an intra-logistics scenario. The aerial robot is intended to operate in a scenario comprising several supply points, delivery points, parcels with tags, and obstacles, generating the mission plan from the voice commands given by the user. [ GRVC ] We left the room. They took over. No humans. No instructions. Just robots... moving, coordinating, showing off. It almost felt like… they were staging something.
[ AgileX ] TRI’s internship program offers a unique opportunity to work closely with our researchers on technologies to improve the quality of life for individuals and society. Here’s a glimpse into that experience from some of our 2025 interns! [ TRI ] In the third installment of our Moonshot Podcast Deep Dive video interview series, X’s Captain of Moonshots Astro Teller sits down with Dr. Catie Cuan, robot choreographer and former artist in residence at Everyday Robots, for a conversation about how dance can be used to build beautiful and useful robots that people want to be around. [ Moonshot Podcast ]
August 1, 2025 at 5:01 PM
Video Friday: Skyfall Takes on Mars With Swarm Helicopter Concept https://spectrum.ieee.org/video-friday-skyfall-mars-helicopter
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. RO-MAN 2025 : 25–29 August 2025, EINDHOVEN, THE NETHERLANDS CLAWAR 2025 : 5–7 September 2025, SHENZHEN, CHINA ACTUATE 2025 : 23–24 September 2025, SAN FRANCISCO CoRL 2025 : 27–30 September 2025, SEOUL IEEE Humanoids : 30 September–2 October 2025, SEOUL World Robot Summit : 10–12 October 2025, OSAKA, JAPAN IROS 2025 : 19–25 October 2025, HANGZHOU, CHINA Enjoy today’s videos! AeroVironment revealed Skyfall—a potential future mission concept for next-generation Mars Helicopters developed with NASA’s Jet Propulsion Laboratory (JPL) to help pave the way for human landing on Mars through autonomous aerial exploration. The concept is heavily focused on rapidly delivering an affordable, technically mature solution for expanded Mars exploration that would be ready for launch by 2028. Skyfall is designed to deploy six scout helicopters on Mars, where they would explore many of the sites selected by NASA and industry as top candidate landing sites for America’s first Martian astronauts. While exploring the region, each helicopter can operate independently, beaming high-resolution surface imaging and sub-surface radar data back to Earth for analysis, helping ensure crewed vehicles make safe landings at areas with maximum amounts of water, ice, and other resources. The concept would be the first to use the “Skyfall Maneuver”–an innovative entry, descent, and landing technique whereby the six rotorcraft deploy from their entry capsule during its descent through the Martian atmosphere. By flying the helicopters down to the Mars surface under their own power, Skyfall would eliminate the necessity for a landing platform–traditionally one of the most expensive, complex, and risky elements of any Mars mission. 
[ AeroVironment ] By far the best part of videos like these is watching the expressions on the faces of the students when their robot succeeds at something. [ RaiLab ] This is just a rendering, of course, but the real thing should be showing up on August 6. [ Fourier ] Top performer in its class! Less than two weeks after its last release, MagicLab unveils another breakthrough — MagicDog-W, the wheeled quadruped robot. Cyber-flex, dominate all terrains! [ MagicLab ] Inspired by the octopus’s remarkable ability to wrap and grip with precision, this study introduces a vacuum-driven, origami-inspired soft actuator that mimics such versatility through self-folding design and high bending angles. Its crease-free, 3D-printable structure enables compact, modular robotics with enhanced grasping force—ideal for handling objects of various shapes and sizes using octopus-like suction synergy. [ Paper ] via [ IEEE Transactions on Robotics ] Thanks, Bram! Is it a plane? Is it a helicopter? Yes. [ Robotics and Intelligent Systems Laboratory, City University of Hong Kong ] You don’t need wrist rotation as long as you have the right gripper. [ Nature Machine Intelligence ] ICRA 2026 will be in Vienna next June! [ ICRA 2026 ] Boing, boing, boing! [ Robotics and Intelligent Systems Laboratory, City University of Hong Kong ] ROBOTERA Unveils L7: Next-Generation Full-Size Bipedal Humanoid Robot with Powerful Mobility and Dexterous Manipulation! [ ROBOTERA ] Meet UBTECH New-Gen of Industrial Humanoid Robot—Walker S2 makes multiple industry-leading breakthroughs! Walker S2 is the world’s first humanoid robot to achieve 3-minute autonomous battery swapping and 24/7 continuous operation. [ UBTECH ] ARMstrong Dex is a human-scale dual-arm hydraulic robot developed by the Korea Atomic Energy Research Institute (KAERI) for disaster response. It can perform vertical pull-ups and manipulate loads over 50 kg, demonstrating strength beyond human capabilities.
However, disaster environments also require agility and fast, precise movement. This test evaluated ARMstrong Dex’s ability to throw a 500 ml water bottle (0.5 kg) into a target container. The experiment assessed high-speed coordination, trajectory control, and endpoint accuracy, which are key attributes for operating in dynamic rescue scenarios. [ KAERI ] This is not a humanoid robot, it’s a data acquisition platform. [ PNDbotics ] Neat feature on this drone to shift the battery back and forth to compensate for movement of the arm. [ Paper ] via [ Drones journal ] As residential buildings become taller and more advanced, the demand for seamless and secure in-building delivery continues to grow. In high-end apartments and modern senior living facilities where couriers cannot access upper floors, robots like FlashBot Max are becoming essential. In this featured elderly care residence, FlashBot Max completes 80-100 deliveries daily, seamlessly navigating elevators, notifying residents upon arrival, and returning to its charging station after each delivery. [ Pudu Robotics ] “How to Shake Trees With Aerial Manipulators.” [ GRVC ] We see a future where seeing a cobot in a hospital delivering supplies feels as normal as seeing a tractor in a field. Watch our CEO Brad Porter share what robots moving in the world should feel like. [ Cobot ] Introducing the Engineered Arts UI for robot Roles, it’s now simple to set up a robot to behave exactly the way you want it to. We give a quick overview of customization for languages, personality, knowledge and abilities. All of this is done with no code. Just simple LLM prompts, drop down list selections and some switches to enable the features you need. [ Engineered Arts ] Unlike most quadrupeds, CARA doesn’t use any gears or pulleys. Instead, her joints are driven by rope through capstan drives. Capstan drives offer several advantages: zero backlash, high torque transparency, low inertia, low cost, and quiet operation. 
These qualities make them an ideal speed reducer for robotics. [ CARA ]
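CARA's capstan drives, mentioned just above, get their grip from the textbook capstan (belt friction) equation: wrapping a rope a few turns around a drum lets friction amplify holding tension exponentially, so the rope doesn't slip even under high joint torques, while the speed reduction itself comes from the ratio of drum radii. A quick illustration of the generic relation, with hypothetical parameters (not CARA's actual friction coefficient or wrap count):

```python
import math

def tension_ratio(mu, wraps):
    """Capstan equation: T_load / T_hold = exp(mu * theta),
    with theta the total wrap angle in radians."""
    theta = 2 * math.pi * wraps
    return math.exp(mu * theta)

# Hypothetical values: friction coefficient 0.25, three full wraps.
ratio = tension_ratio(0.25, 3)
print(f"1 N of holding tension resists about {ratio:.0f} N of load")
```

The exponential dependence is the point: adding one more wrap multiplies the grip by a constant factor, which is why capstan drives can transmit high torque with thin rope and zero backlash.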
July 25, 2025 at 4:00 PM
Video Friday: Robot Metabolism
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. RO-MAN 2025 : 25–29 August 2025, EINDHOVEN, THE NETHERLANDS CLAWAR 2025 : 5–7 September 2025, SHENZHEN, CHINA ACTUATE 2025 : 23–24 September 2025, SAN FRANCISCO CoRL 2025 : 27–30 September 2025, SEOUL IEEE Humanoids : 30 September–2 October 2025, SEOUL World Robot Summit : 10–12 October 2025, OSAKA, JAPAN IROS 2025 : 19–25 October 2025, HANGZHOU, CHINA Enjoy today’s videos! Columbia University researchers introduce a process that allows machines to “grow” physically by integrating parts from their surroundings or from other robots, demonstrating a step towards self-sustaining robot ecologies. [ Robot Metabolism ] via [ Columbia ] We challenged ourselves to see just how far we could push Digit’s ability to stabilize itself in response to a disturbance. Utilizing state-of-the-art AI technology and robust physical intelligence, Digit can adapt to substantial disruptions, all without the use of visual perception. [ Agility Robotics ] We are presenting the Figure 03 (F.03) battery — a significant advancement in our core humanoid robot technology roadmap. The effort that was put into safety for this battery is impressive. But I would note two things: the battery life is “5 hours of run time at peak performance” without saying what “peak performance” actually means, and 2-kilowatt fast charge still means over an hour to fully charge. [ Figure ] Well, this is a nifty idea. [ UBTECH ] PAPRLE is a plug-and-play robotic limb environment for flexible configuration and control of robotic limbs across applications. With PAPRLE, users can use diverse configurations of leader-follower pairs for teleoperation. In the video, we show several teleoperation examples supported by PAPRLE. [ PAPRLE ] Thanks, Joohyung!
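A back-of-envelope check on the F.03 battery note above: if a 2-kilowatt charger needs more than an hour for a full charge, the pack must store upwards of 2 kWh. Rough math from the quoted figures only; real charging tapers near full, so usable capacity could be somewhat less than the charger-time bound suggests.

```python
charge_power_kw = 2.0   # quoted fast-charge rate
charge_time_h = 1.0     # "over an hour" -> treat 1 h as a lower bound
min_capacity_kwh = charge_power_kw * charge_time_h  # >= 2 kWh stored
runtime_h = 5.0         # quoted run time "at peak performance"
avg_draw_kw = min_capacity_kwh / runtime_h          # implied average draw
print(f"pack capacity: >= {min_capacity_kwh:.1f} kWh")
print(f"average draw:  >= {avg_draw_kw:.1f} kW over {runtime_h:.0f} h")
```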
Always nice to see a robot with a carefully thought out commercial use case in which it can just do robot stuff like a robot. [ Cohesive Robotics ] Thanks, David! We are interested in deploying autonomous legged robots in diverse environments, such as industrial facilities and forests. As part of the DigiForest project, we are working on new systems to autonomously build forest inventories with legged platforms, which we have deployed in the UK, Finland, and Switzerland. [ Oxford ] Thanks, Matias! In this research we introduce a self-healing, biocompatible strain sensor using Galinstan and a Diels-Alder polymer, capable of restoring both mechanical and sensing functions after repeated damage. This highly stretchable and conductive sensor demonstrates strong performance metrics—including 80% mechanical healing efficiency and 105% gauge factor recovery—making it suitable for smart wearable applications. [ Paper ] Thanks, Bram! The “Amazing Hand” from Pollen Robotics costs under $250. [ Pollen ] Welcome to our Unboxing Day! After months of waiting, our humanoid robot has finally arrived at Fraunhofer IPA in Stuttgart. I used to take stretching classes from a woman who could do this backwards in 5.43 seconds . [ Fraunhofer ] At the Changchun stop of the VOYAGEX Music Festival on July 12, PNDbotics’ full-sized humanoid robot Adam took the stage as a keytar player with the famous Chinese musician Hu Yutong’s band. [ PNDbotics ] Material movement is the invisible infrastructure of hospitals, airports, cities–everyday life. We build robots that support the people doing this essential, often overlooked work. Watch our CEO Brad Porter reflect on what inspired Cobot. [ Cobot ] Yes please. [ Pollen ] I think I could get to the point of being okay with this living in my bathroom. [ Paper ] Thanks to its social perception, high expressiveness and out-of-the-box integration, TIAGo Head offers the ultimate human-robot interaction experience. 
[ PAL Robotics ] Sneak peek: Our No Manning Required Ship (NOMARS) Defiant unmanned surface vessel is designed to operate for up to a year at sea without human intervention. In-water testing is preparing it for an extended at-sea demonstration of reliability and endurance. Excellent name for any ship. [ DARPA ] At the 22nd International Conference on Ubiquitous Robots (UR2025), high school student and robotics researcher Ethan Hong was honored as a Special Invited Speaker for the conference banquet and “Robots With Us” panel. In this heartfelt and inspiring talk, Ethan shares the story behind Food Angel — a food delivery robot he designed and built to support people experiencing homelessness in Los Angeles. Motivated by the growing crises of homelessness and food insecurity, Ethan asked a simple but profound question: “Why not use robots to help the unhoused?” [ UR2025 ]
July 18, 2025 at 3:32 PM
Video Friday: Reachy Mini Brings Cute to Open-Source Robotics https://spectrum.ieee.org/video-friday-reachy-mini
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. IFAC Symposium on Robotics : 15–18 July 2025, PARIS RoboCup 2025 : 15–21 July 2025, BAHIA, BRAZIL RO-MAN 2025 : 25–29 August 2025, EINDHOVEN, THE NETHERLANDS CLAWAR 2025 : 5–7 September 2025, SHENZHEN, CHINA ACTUATE 2025 : 23–24 September 2025, SAN FRANCISCO CoRL 2025 : 27–30 September 2025, SEOUL IEEE Humanoids : 30 September–2 October 2025, SEOUL World Robot Summit : 10–12 October 2025, OSAKA, JAPAN IROS 2025 : 19–25 October 2025, HANGZHOU, CHINA Enjoy today’s videos! Reachy Mini is an expressive, open-source robot designed for human-robot interaction, creative coding, and AI experimentation. Fully programmable in Python (and soon JavaScript, Scratch) and priced from $299, it’s your gateway into robotics AI: fun, customizable, and ready to be part of your next coding project. I’m so happy that Pollen and Reachy found a home with Hugging Face, but I hope they understand that they are never, ever allowed to change that robot’s face. O-o [ Reachy Mini ] via [ Hugging Face ] General-purpose robots promise a future where household assistance is ubiquitous and aging in place is supported by reliable, intelligent help. These robots will unlock human potential by enabling people to shape and interact with the physical world in transformative new ways. At the core of this transformation are Large Behavior Models (LBMs) - embodied AI systems that take in robot sensor data and output actions. LBMs are pretrained on large, diverse manipulation datasets and offer the key to realizing robust, general-purpose robotic intelligence. Yet despite their growing popularity, we still know surprisingly little about what today’s LBMs actually offer - and at what cost. 
This uncertainty stems from the difficulty of conducting rigorous, large-scale evaluations in real-world robotics. As a result, progress in algorithm and dataset design is often guided by intuition rather than evidence, hampering progress. Our work aims to change that.

[ Toyota Research Institute ]

Kinisi Robotics is advancing the frontier of physical intelligence by developing AI-driven robotic platforms capable of high-speed, autonomous pick-and-place operations in unstructured environments. This video showcases Kinisi’s latest wheeled-base humanoid performing dexterous bin stacking and item sorting using closed-loop perception and motion planning. The system combines high-bandwidth actuation, multi-arm coordination, and real-time vision to achieve robust manipulation without reliance on fixed infrastructure. By integrating custom hardware with onboard intelligence, Kinisi enables scalable deployment of general-purpose robots in dynamic warehouse settings, pushing toward broader commercial readiness for embodied AI systems.

[ Kinisi Robotics ]

Thanks, Bren!

In this work, we develop a data collection system where human and robot data are collected and unified in a shared space, and propose a modularized cross-embodiment Transformer that is pretrained on human data and fine-tuned on robot data. This enables high data efficiency and effective transfer from human to quadrupedal embodiments, facilitating versatile manipulation skills for unimanual and bimanual, non-prehensile and prehensile, precise tool-use, and long-horizon tasks, such as cat litter scooping!

[ Human2LocoMan ]

Thanks, Yaru!

LEIYN is a quadruped robot equipped with an active waist joint. It achieves the world’s fastest chimney climbing through dynamic motions learned via reinforcement learning.

[ JSK Lab ]

Thanks, Keita!

Quadrupedal robots are really just bipedal robots that haven’t learned to walk on two legs yet.

[ Adaptive Robotic Controls Lab, University of Hong Kong ]

This study introduces a biomimetic self-healing module for tendon-driven legged robots that uses robot motion to activate liquid metal sloshing, which removes surface oxides and significantly enhances healing strength. Validated on a life-sized monopod robot, the module enables repeated squatting after impact damage, marking the first demonstration of active self-healing in high-load robotic applications.

[ University of Tokyo ]

Thanks, Kento!

That whole putting wheels on quadruped robots thing was a great idea that someone had way back when.

[ Pudu Robotics ]

I know nothing about this video except that it’s very satisfying and comes from a YouTube account that hasn’t posted in 6 years.

[ Young-jae Bae YouTube ]

Our AI WORKER now comes in a new Swerve Drive configuration, optimized for logistics environments. With its agile and omnidirectional movement, the swerve-type mobile base can efficiently perform various logistics tasks such as item transport, shelf navigation, and precise positioning in narrow aisles.

Wait, you can have a bimanual humanoid without legs? I am shocked.

[ ROBOTIS ]

I can’t tell whether I need an office assistant, or if I just need snacks.

[ PNDbotics ]

“MagicBot Z1: Atomic kinetic energy, the brave are fearless,” says the MagicBot website. Hard to argue with that!

[ MagicLab ]

We’re excited to announce our new HQ in Palo Alto [CA]. As we grow, consolidating our Sunnyvale [CA] and Moss [Norway] team under one roof will accelerate our speed to ramping production and getting NEO into homes near you.

I’m not entirely sure that moving from Norway to California is an upgrade, honestly.

[ 1X ]

Jim Kernan, Chief Product Officer at Engineered Arts, shares how they’re commercializing humanoid robots—blending AI, expressive design, and real-world applications to build trust and engagement.

[ Humanoids Summit ]

In the second installment of our Moonshot Podcast Deep Dive video interview series, X’s Captain of Moonshots Astro Teller sits down with André Prager, former Chief Engineer at Wing, for a conversation about the early days of Wing and how the team solved some of their toughest engineering challenges to develop simple, lightweight, inexpensive delivery drones that are now being used every day across three continents.

[ Moonshot Podcast ]
July 11, 2025 at 4:00 PM
Video Friday: Cyborg Beetles May Speed Disaster Response One Day https://spectrum.ieee.org/video-friday-cyborg-beetles
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

IEEE World Haptics : 8–11 July 2025, SUWON, SOUTH KOREA
IFAC Symposium on Robotics : 15–18 July 2025, PARIS
RoboCup 2025 : 15–21 July 2025, BAHIA, BRAZIL
RO-MAN 2025 : 25–29 August 2025, EINDHOVEN, THE NETHERLANDS
CLAWAR 2025 : 5–7 September 2025, SHENZHEN, CHINA
ACTUATE 2025 : 23–24 September 2025, SAN FRANCISCO
CoRL 2025 : 27–30 September 2025, SEOUL
IEEE Humanoids : 30 September–2 October 2025, SEOUL
World Robot Summit : 10–12 October 2025, OSAKA, JAPAN
IROS 2025 : 19–25 October 2025, HANGZHOU, CHINA

Enjoy today’s videos!

Common beetles equipped with microchip backpacks could one day be used to help search and rescue crews locate survivors within hours instead of days following disasters such as building and mine collapses. The University of Queensland’s Dr. Thang Vo-Doan and Research Assistant Lachlan Fitzgerald have demonstrated they can remotely guide darkling beetles (Zophobas morio) fitted with the packs via video game controllers.

[ Paper ] via [ University of Queensland ]

Thanks, Thang!

This is our latest work on six-DoF hand-based teleoperation for omnidirectional aerial robots, which shows an intuitive teleoperation system for advanced aerial robots. This work was presented at the 2025 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2025).

[ DRAGON Lab ]

Thanks, Moju!

Pretty sure we’ve seen this LimX humanoid before, and we’re seeing it again right now, but hey, the first reveal is just ahead!

[ LimX Dynamics ]

Thanks, Jinyan!

Soft robot arms use soft materials and structures to mimic the passive compliance of biological arms that bend and extend. Here, we show how relying on patterning structures instead of inherent material properties allows soft robotic arms to remain compliant while continuously transmitting torque to their environment. We demonstrate a soft robotic arm made from a pair of mechanical metamaterials that act as compliant constant-velocity joints.

[ Paper ] via [ Transformative Robotics Lab ]

Selling a platform is really hard, but I hope K-Scale can succeed with their open source humanoid.

[ K-Scale ]

MIT CSAIL researchers combined GenAI and a physics simulation engine to refine robot designs. The result: a machine that out-jumped a robot designed by humans.

[ MIT News ]

ARMstrong Dex is a human-scale dual-arm hydraulic robot under development at the Korea Atomic Energy Research Institute (KAERI) for disaster response applications. Designed with dimensions similar to an adult human, it combines human-equivalent reach and dexterity with force output that exceeds human physical capabilities, enabling it to perform extreme heavy-duty tasks in hazardous environments.

[ Korea Atomic Energy Research Institute ]

This is a demonstration of in-hand object rotation with Torobo Hand. Torobo Hand is modeled in simulation, and a control policy is trained within several hours using large-scale parallel reinforcement learning in Isaac Sim. The trained policy can be executed without any additional training in both a different simulator (MuJoCo) and on the real Torobo Hand.

[ Tokyo Robotics ]

Since 2005, Ekso Bionics has been developing and manufacturing exoskeleton bionic devices that can be strapped on as wearable robots to enhance the strength, mobility, and endurance of soldiers, patients, and workers. These robots have a variety of applications in the medical, military, industrial, and consumer markets, helping rehabilitation patients walk again and workers preserve their strength.

[ Ekso Bionics ]

Sponsored by Raytheon, an RTX business, the 2025 East Coast Autonomous Vehicle Competition was held at XElevate in Northern Virginia. Student engineering teams from five universities took part in a two-semester project to design, develop, integrate, and field two autonomous vehicles that competed to identify, communicate, and deliver a medical kit with the best accuracy and time.

[ RTX ]

This panel is from the Humanoids Summit in London: “Investing in the Humanoids Robotics Ecosystem—a VC Perspective.”

[ Humanoids Summit ]
July 5, 2025 at 3:04 PM