- Video Friday: Happy Robot Holidays by Evan Ackerman on December 19, 2025 at 4:30 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events. Please send us your events for inclusion.

ICRA 2026: 1–5 June 2026, VIENNA

Enjoy today’s videos!

Happy Holidays from FZI Living Lab!
[ FZI ]
Thanks, Georg!

Happy Holidays from Norlab!
I should get a poutine...
[ Norlab ]

Happy Holidays from Fraunhofer IOSB!
[ Fraunhofer ]
Thanks, Janko!

Happy Holidays from HEBI Robotics!
[ HEBI Robotics ]
Thanks, Trevor!

Happy Holidays from the Learning Systems and Robotics Lab!
[ Learning Systems and Robotics Lab ]

Happy Holidays from Toyota Research Institute!
[ Toyota Research Institute ]

Happy Holidays from Clearpath Robotics!
[ Clearpath Robotics ]

Happy AI Holidays from Robotnik!
[ Robotnik ]

Happy AI Holidays from ABB Robotics!
[ ABB Robotics ]

With its unique modular configuration, TRON 2 lets you freely configure dual-arm, bipedal, or wheeled setups to fit your mission.
[ LimX Dynamics ]
Thanks, Jinyan!

I love this robot, but can someone please explain why what happens at 2:00 makes me physically uncomfortable?
[ Paper ]
Thanks, Ayato!

This robot, REWW-ARM, is a remote wire-driven mobile robot that keeps all of the electronics out of the mobile part so that it can operate in harsh environments. A novel transmission mechanism enables efficient, long-distance, electronics-free power transmission, and closed-loop control estimates the state of the distal end from the wires. The robot demonstrated locomotion and manipulation on land and underwater.
[ JSK Lab ]
Thanks, Takahiro!

DEEP Robotics has deployed China’s first robot dog patrol team for forest fire protection in the West Lake area. Powered by embodied AI, these quadruped robots support early detection, patrol, and risk monitoring—using technology to protect nature and strengthen emergency response.
[ DEEP Robotics ]

In this video we show how we trained our robot to fold a towel from start to finish. Folding a towel might seem simple, but for a robot it means solving perception, planning, and dexterous manipulation all at once, especially when dealing with soft, deformable fabric. We walk through how the system sees the towel, identifies key features, and executes each fold autonomously.
[ Kinisi Robotics ]

This may be the first humanoid app store, but it’s far from the first app store for robots. Problem is, for an app store to gain traction, there needs to be a platform out there that people will buy for its core functionality first.
[ Unitree ]

You can tell that this isn’t U.S. government–funded research because it involves a robot fetching drinks.
[ Flexiv ]

This video shows the Perseverance Mars rover’s point of view during a record-breaking drive that occurred on June 19, 2025, the 1,540th Martian day, or sol, of the mission. The rover was traveling northbound and covered 1,350.7 feet (411.7 meters) on that sol, over the course of about 4 hours and 24 minutes. This distance eclipsed its previous single-sol record of 1,140.7 feet (347.7 meters), set on April 3, 2023 (Sol 753).
[ NASA ]

Automation is what’s helped keep lock maker Wilson Bohannan based in America for more than 150 years while all of its competitors relocated overseas. Using two high-speed, high-precision FANUC M-10 series robots, Acme developed a simple but highly sophisticated system that uses innovative end-of-arm tooling to accommodate 18 different styles of padlocks. As a result of Acme’s new system using FANUC robots, Wilson Bohannan’s production rocketed from between 1,500 and 1,800 locks finished per eight-hour shift to more than 5,000.
[ Fanuc ]

In this conversation, Zack Jackowski, general manager and vice president, Atlas, and Alberto Rodriguez, director of robot behavior, sit down to discuss the path to generalist humanoid robots working at scale and how we approach research & development to both push the boundaries of the industry and deliver valuable applications.
[ Boston Dynamics ]
- iRobot’s Cofounder Weighs In on Company’s Bankruptcy by Evan Ackerman on December 16, 2025 at 8:12 pm
On Sunday evening, the legendary robotics company iRobot, manufacturer of the Roomba robotic vacuum, filed for bankruptcy. The company will be handing over all of its assets to its Chinese manufacturing partner, Picea. According to iRobot’s press release, “this agreement represents a critical step toward strengthening iRobot’s financial foundation and positioning the Company for long-term growth and innovation,” which sounds like the sort of thing you put in a press release when you’re trying your best to put a positive spin on really, really bad news.

This whole situation started back in August 2022, when iRobot announced a US $1.7 billion acquisition by Amazon. Amazon’s interest was obvious—some questionable hardware decisions had left the company struggling to enter the home robotics market. And iRobot was at a point where it needed a new strategy to keep ahead of lower-cost (and increasingly innovative) home robots from China.

Some folks were skeptical of this acquisition, and admittedly, I was one of them. My primary worry was that iRobot would get swallowed up and effectively cease to exist, which tends to happen with acquisitions like these, but regulators in the United States had much more pointed concerns: namely, that Amazon would leverage its marketplace power to restrict competition. The European Commission expressed similar objections.

By late January 2024, the deal had fallen through; iRobot laid off a third of its staff, suspended research and development, and CEO and cofounder Colin Angle left the company. Since then, iRobot has seemed resigned to its fate, coasting along on a few lackluster product announcements and not much else, and so Sunday’s announcement of bankruptcy was a surprise to no one—perhaps least of all to Angle.

iRobot’s Bankruptcy and Amazon Deal Collapse

“iRobot’s bankruptcy filing was really just a public-facing outcome of the tragedy that happened a year and a half ago,” Angle told IEEE Spectrum on Monday. “Today sucks, but I’ve already mourned. I mourned when the deal with Amazon got blocked for all the wrong reasons.”

Angle points out that by the early 2020s, iRobot was no longer monopolizing the robot-vacuum market. This was especially true in Europe, where iRobot’s market share was 12 percent and decreasing. But from Angle’s perspective, regulators were more focused on making a point about Big Tech than they were on the actual merits and risks of the merger.

Cofounder Colin Angle says that iRobot’s bankruptcy filing was unsurprising after a failed acquisition by Amazon a year and a half ago. Charles Krupa/AP

“We were roadkilled in a larger agenda,” Angle says. “And this kind of regulation is incredibly destructive to the innovation economy. The whole concept of starting a tech company and having it acquired by a bigger tech company is far and away the most common positive outcome. For that to be taken away is not a good thing.” And for iRobot, it was fatal.

A common criticism of iRobot, even before the attempted Amazon merger, was that the company was simply being out-innovated in the robot-vacuum space, and Angle doesn’t necessarily disagree. “By 2020, China had become the largest market in the world for robot vacuums, and Chinese robotics companies with government support were investing two or three times as much as iRobot was in R&D. We simply didn’t have the capital to move as quickly as we wanted to. In order for iRobot to continue to innovate and lead the industry, we needed to do so as part of a larger entity, and Amazon was very aligned with our vision for the home.”

This situation is not unique to iRobot, and there is significant concern in robotics about how companies can effectively compete against the massive advantage that China has in the production of low-cost hardware. In some sense, what happened to iRobot is an early symptom of what Angle (and others) see as a fundamental problem with robotics in the United States: lack of government support. In China, long-term government support for robotics and embodied AI (in the form of both policy and direct investment) can be found across industry and academia, something that neither the United States nor the European Union has been able to match. “Robotics is in a global competition against some very fearsome competitors,” Angle says. “We have to decide whether we want to support our innovation economy. And if the answer is no, then the innovation economy goes elsewhere.”

The consequence of companies like iRobot losing this competition can be more than just bankruptcy. In iRobot’s case, a Chinese company now owns iRobot’s intellectual property and app infrastructure, which gives it access to data from millions of highly sensorized autonomous mobile robots in homes across the world. I asked Angle whether Roomba owners should be concerned about this. “When I was running the company, we talked a lot about this, and put a lot of effort into privacy and security,” he says. “This was fundamental to Roomba’s design. Now, I can’t speak to what they’ll prioritize.”

While Angle has moved on from iRobot, and has since cofounded a more-mysterious-than-we’d-like company called Familiar Machines and Magic, he still feels strongly that what has happened to iRobot should be a warning to both robotics companies and policymakers. “Make no mistake: China is good at robots. So we need to play this hard. There’s a lot to learn from what we did at iRobot, and a lot of ways to do it better.”

On a personal note, I’m choosing to remember the iRobot that was—not just the company that built a robot vacuum out of nothing and conquered the world with it for nearly two decades, but also the company that built the PackBot to save lives, as well as all of these other crazy robots. I’m not sure there’s ever been a company quite like iRobot, and there may never be again. It will be missed.
- Video Friday: Robot Dog Shows Off Its Muscles by Evan Ackerman on December 12, 2025 at 5:00 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2026: 1–5 June 2026, VIENNA

Enjoy today’s videos!

Suzumori Endo Lab, Science Tokyo has developed a musculoskeletal dog robot using thin McKibben muscles. This robot mimics the flexible “hammock-like” shoulder structure to investigate the biomechanical functions of dog musculoskeletal systems.
[ Suzumori Endo Robotics Laboratory ]

HOLEY SNAILBOT!!!
[ Freeform Robotics ]

We present a system that transforms speech into physical objects using 3D generative AI and discrete robotic assembly. By leveraging natural language, the system makes design and manufacturing more accessible to people without expertise in 3D modeling or robotic programming.
[ MIT ]

Meet the next generation of edge AI. A fully self-contained vision system built for robotics, automation, and real-world intelligence. Watch how OAK 4 brings compute, sensing, and 3D perception together in one device.
[ Luxonis ]
Thanks, Max!

Inspired by vines’ twisty tenacity, engineers at MIT and Stanford University have developed a robotic gripper that can snake around and lift a variety of objects, including a glass vase and a watermelon, offering a gentler approach compared to conventional gripper designs. A larger version of the robo-tendrils can also safely lift a human out of bed.
[ MIT ]

The paper introduces an automatic limb attachment system using soft actuated straps and a magnet-hook latch for wearable robots. It enables fast, secure, and comfortable self-donning across various arm sizes, supporting clinical-level loads and precise pressure control.
[ Paper ]
Thanks, Bram!

Autonomous driving is the ultimate challenge for AI in the physical world. At Waymo, we’re solving it by prioritizing demonstrably safe AI, where safety is central to how we engineer our models and AI ecosystem from the ground up.
[ Waymo ]

Built by Texas A&M engineering students, this AI-powered robotic dog is reimagining how robots operate in disaster zones. Designed to climb through rubble, avoid hazards, and make autonomous decisions in real time, the robot uses a custom multimodal large language model (MLLM) combined with visual memory and voice commands to see, remember, and plan its next move like a first responder.
[ Texas A&M ]

So far, aerial microrobots have only been able to fly slowly along smooth trajectories, far from the swift, agile flight of real insects—until now. MIT researchers have demonstrated aerial microrobots that can fly with speed and agility comparable to their biological counterparts. A collaborative team designed a new AI-based controller for the robotic bug that enabled it to follow gymnastic flight paths, such as executing continuous body flips.
[ MIT ]

In this audio clip generated by data from the SuperCam microphone aboard NASA’s Perseverance, the sound of an electrical discharge can be heard as a Martian dust devil flies over the Mars rover. The recording was collected on Oct. 12, 2024, the 1,296th Martian day, or sol, of Perseverance’s mission on the Red Planet.
[ NASA Jet Propulsion Laboratory ]

In this episode, we open the archives on host Hannah Fry’s visit to our California robotics lab. Filmed earlier this year, Hannah interacts with a new set of robots—those that don’t just see, but think, plan, and do. Watch as the team goes behind the scenes to test the limits of generalization, challenging robots to handle unseen objects autonomously.
[ Google DeepMind ]

This GRASP on Robotics seminar is by Parastoo Abtahi from Princeton University, on “When Robots Disappear – From Haptic Illusions in VR to Object-Oriented Interactions in AR.”

Advances in audiovisual rendering have led to the commercialization of virtual reality (VR); however, haptic technology has not kept up with these advances. While a variety of robotic systems aim to address this gap by simulating the sensation of touch, many hardware limitations make realistic touch interactions in VR challenging. In my research, I explore how, by understanding human perception through the lens of sensorimotor control theory, we can design interactions that not only overcome the current limitations of robotic hardware for VR but also extend our abilities beyond what is possible in the physical world.

In the first part of this talk, I will present my work on redirection illusions that leverage the limits of human perception to improve the perceived performance of encountered-type haptic devices in VR, such as the position accuracy of drones and the resolution of shape displays. In the second part, I will share how we apply these illusory interactions to physical spaces and use augmented reality (AR) to facilitate situated and bidirectional human-robot communication, bridging users’ mental models and robotic representations.
[ University of Pennsylvania GRASP Laboratory ]
- Ghost Robotics’ Arm Brings Manipulation to Military Quadrupeds by Evan Ackerman on December 11, 2025 at 3:00 pm
Ghost Robotics is today announcing a major upgrade for its Vision 60 quadruped: an arm. Ghost, a company that originated at the GRASP Lab at the University of Pennsylvania, specializes in exceptionally rugged quadrupeds, and while many of its customers use its robots for public safety and disaster relief, it also provides robots to the United States military, which has very specific needs when it comes to keeping humans out of danger.

In that context, it’s not unreasonable to assume that Ghost’s robots may sometimes be used to carry weapons, and despite the proliferation of robots in many roles in the Ukraine war, the idea of a legged robot carrying a weapon is not a comfortable one for many people. IEEE Spectrum spoke with Ghost cofounder and current CEO Gavin Kenneally to learn more about the new arm, and to get his perspective on selling robots to the military.

The Vision 60’s new arm has six degrees of freedom. Ghost Robotics

Robots for the Military

Ghost Robotics initially made a name for itself with its very impressive early work with the Minitaur direct-drive quadruped in 2016. The company also made headlines in late 2021, when a now-deleted post on Twitter (now X) went viral because it included a photograph of one of Ghost’s Vision 60 quadrupeds with a rifle mounted on its back.

That picture resulted in a very strong reaction, although as IEEE Spectrum reported at the time, robots with guns affixed to them weren’t new: To mention one early example, the U.S. military had already deployed weapons on mobile robots in Iraq in 2007. And while several legged-robot companies pledged in 2022 not to weaponize their general-purpose robots, the Chinese military in 2024 displayed quadrupeds from Unitree equipped with guns. (Unitree, based in China, was one of the signers of the 2022 pledge.)

The issue of weaponized robots goes far beyond Ghost Robotics, and far beyond robots with legs. We’ve covered both the practical and ethical perspectives on this extensively at IEEE Spectrum, and the intensity of the debates shows that there is no easy answer. But to summarize one important point made by some ethicists, some military experts, and Ghost Robotics itself: Robots are replaceable; humans are not. “Customers use our robots to keep people out of harm’s way,” Ghost CEO Kenneally tells Spectrum.

It’s also worth pointing out that even the companies that signed the pledge not to weaponize their general-purpose robots acknowledge that military robots exist, and are accepting of that, provided that such robots are used under existing legal doctrines and operate within those safeguards—and that what constraints should or should not be imposed on these kinds of robots is best decided by policymakers rather than industry.

This is essentially Ghost Robotics’ position as well, says Kenneally. “We sell our robots to U.S. and allied governments, and as part of that, the robots are used in defense applications where they will sometimes be weaponized. What’s most critical to us is that the decisions about how to use these robots are happening systematically and ethically at the government policy level.”

To some extent, these decisions are already being made within the U.S. government. Department of Defense Directive 3000.09, “Autonomy in Weapon Systems,” lays out the responsibilities and limitations for how autonomous or human-directed robotic weapons systems should be developed and deployed, including requirements for human use-of-force judgments. At least in the U.S., this directive implies that there are rules and accountability for robotic weapons.

Vision 60’s Versatile Arm Capabilities

Ghost sees its Vision 60 quadruped as a system that its trusted customers can use as they see fit, and the manipulator enables many additional capabilities. “The primary purpose of the robot has been as a sensor platform,” Kenneally says, “but sometimes there are doors in the way, or objects that need to be moved, or you might want the robot to take a sample. So the ability to do all of that mobile manipulation has been hugely valuable for our customers.”

As it turns out, arms are good for more than manipulation. “One thing that’s been very interesting is that our customers have been using the arm as a sensor boom, which is something that we hadn’t anticipated,” says Kenneally. Ghost’s robot has plenty of cameras, but they’re mostly at the viewpoint of a moderately sized dog. The new arm offers a more human-like vantage and a way to peek around corners or over things without exposing the whole robot.

Ghost was not particularly interested in building its own arm, and it tried off-the-shelf options to get the manipulation bit working. And it did get the manipulation working; what didn’t work were any of those arms after the 50-kilogram robot rolled over on them. “We wanted to make sure that we could build an arm that could stand up to the same intense rigors of our customers’ operations that the rest of the robot can,” says Kenneally. “Morphologically, we actually consider the arm to be a fifth leg, so that the robot operates as a unified system for whole-body control.”

The rest of the robot is exceptionally rugged, which is what makes it appealing to customers with unique needs, like special forces teams. Enough battery life for more than three hours of walking (or more than 20 hours on standby) isn’t bad, and the Vision 60 is sealed against sand and dust and can survive complete submergence in shallow water. It can operate in extreme temperatures ranging from -40 °C to 55 °C, something that has been a particular challenge for robots. And if you do manage to put it in a situation where it physically breaks one of its legs, it’s easy to swap in a spare in just a few minutes, even out in the field.

The Vision 60 can open doors with high-level direction from a human operator. Ghost Robotics

Quadruped Robot Competition From China

Despite Ghost having quietly sold over a thousand quadrupeds to date, Kenneally is cautious about the near future for legged robots, as is anyone who has seriously considered buying one: It’s impossible to ignore the option of just buying a quadruped from a Chinese company at about a tenth the cost of one from a company based in the U.S. or Europe.

“China has identified legged robotics as a lynchpin technology that they are strategically funding,” Kenneally says. “I think it’s an extremely serious threat in the long term, and we have to take these competitors very seriously despite their current shortcomings.” There is a technological moat, for now, but if the market for legged robots follows the same trajectory as the market for drones did, that moat will shrink drastically over the next few years.

The United States is poised to ban consumer drone sales from Chinese manufacturer DJI, and it banned DJI drone use by federal agencies in 2017. But it may be too late in some sense, as DJI’s global market share is something like 90 percent. Meanwhile, Unitree may have already cornered somewhere around 70 percent of the global market for quadrupeds, despite the recent publication of exploits that allow the robots to send unauthorized data to China.

In the United States in particular, private sector robotics funding is unpredictable at the best of times, and Kenneally argues that to compete with Chinese-subsidized robot makers, American companies like Ghost that produce these robots domestically will need sustained U.S. government support, too. That doesn’t mean the government has to pick which companies will be the winners; rather, it should find a way to support the U.S. robotics industry as a whole, if it still wants to have a meaningful one. “The quadruped industry isn’t a science project anymore,” says Kenneally. “It’s matured, and quadruped robots are going to become extremely important in both commercial and government applications. But it’s only through continued innovation that we’ll be able to stay ahead.”
- Video Friday: Biorobotics Turns Lobster Tails Into Gripper by Evan Ackerman on December 5, 2025 at 5:30 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2026: 1–5 June 2026, VIENNA

Enjoy today’s videos!

EPFL scientists have integrated discarded crustacean shells into robotic devices, leveraging the strength and flexibility of natural materials for robotic applications.
[ EPFL ]

Finally, a good humanoid robot demo!
Although having said that, I never trust video demos where it works really well once, and then just pretty well every other time.
[ LimX Dynamics ]
Thanks, Jinyan!

I understand how these structures work, I really do. But watching something rigid extrude itself from a flexible reel will always seem a little magical.
[ AAAS ]
Thanks, Kyujin!

I’m not sure what “industrial grade” actually means, but I want robots to be “automotive grade,” where they’ll easily operate for six months or a year without any maintenance at all.
[ Pudu Robotics ]
Thanks, Mandy!

When you start to suspect that your robotic EV charging solution costs more than your car.
[ Flexiv ]

Yeah, uh, if the application for this humanoid is actually making robot parts with a hammer and anvil, then I’d be impressed.
[ EngineAI ]

Researchers at Columbia Engineering have designed a robot that can learn a humanlike sense of neatness. The researchers taught the system by showing it millions of examples, not by giving it specific instructions. The result is a model that can look at a cluttered tabletop and rearrange scattered objects in an orderly fashion.
[ Paper ]

Why haven’t we seen this sort of thing in humanoid robotics videos yet?
[ HUCEBOT ]

While I definitely appreciate in-the-field testing, it’s also worth asking to what extent your robot is actually being challenged by the in-the-field field that you’ve chosen.
[ DEEP Robotics ]

Introducing HMND 01 Alpha Bipedal—autonomous, adaptive, designed for real-world impact. Built in five months, walking stably after 48 hours of training.
[ Humanoid ]

Unitree says that “this is to validate the overall reliability of the robot,” but I really have to wonder how useful this kind of reliability validation actually is.
[ Unitree ]

This University of Pennsylvania GRASP on Robotics seminar is by Jie Tan from Google DeepMind, on “Gemini Robotics: Bringing AI into the Physical World.”

Recent advancements in large multimodal models have led to the emergence of remarkable generalist capabilities in digital domains, yet their translation to physical agents such as robots remains a significant challenge. In this talk, I will present Gemini Robotics, an advanced Vision-Language-Action (VLA) generalist model capable of directly controlling robots. Furthermore, I will discuss the challenges, learnings, and future research directions on robot foundation models.
[ University of Pennsylvania GRASP Laboratory ]
- MIT’s AI Robotics Lab Director Is Building People-Centered Robots by Willie D. Jones on December 3, 2025 at 7:00 pm
Daniela Rus has spent her career breaking barriers—scientific, social, and material—in her quest to build machines that amplify rather than replace human capability. She made robotics her life’s work, she says, because she understood it was a way to expand the possibilities of computing while enhancing human capabilities.

“I like to think of robotics as a way to give people superpowers,” Rus says. “Machines can help us reach farther, think faster, and live fuller lives.”

Daniela Rus
Employer: MIT
Job title: Professor of electrical and computer engineering and computer science; director of the MIT Computer Science and Artificial Intelligence Laboratory
Member grade: Fellow
Alma maters: University of Iowa, in Iowa City; Cornell

Her dual missions, she says, are to make technology humane and to make the most of the opportunities afforded by life in the United States. The two goals have fueled her journey from a childhood living under a dictatorship in Romania to the forefront of global robotics research.

Rus, who is director of MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), is the recipient of this year’s IEEE Edison Medal, which recognizes her for “sustained leadership and pioneering contributions in modern robotics.”

An IEEE Fellow, she describes the recognition as a responsibility to further her work and mentor the next generation of roboticists entering the field.

The Edison Medal is the latest in a string of honors she has received. In 2017 she won an Engelberger Robotics Award from the Robotic Industries Association. The following year, she was honored with the Pioneer in Robotics and Automation Award by the IEEE Robotics and Automation Society. The society recognized her again in 2023 with its IEEE Robotics and Automation Technical Field Award.

From Romania to Iowa

Rus was born in Cluj-Napoca, Romania, during the rule of dictator Nicolae Ceausescu. Her early life unfolded in a world defined by scarcity—rationed food, intermittent electricity, and a limited ability to move up or out. But she recalls that, amid the stifling insufficiencies, she was surrounded by an irrepressible warmth and intellectual curiosity—even when she was making locomotive screws in a state-run factory as part of her school’s curriculum.

“Life was hard,” she says, “but we had great teachers and strong communities. As a child, you adapt to whatever is around you.”

Her father, Teodor, was a computer scientist and professor, and her mother, Elena, was a physicist.

In 1982, when she was 19, Rus’s father emigrated to the United States to join the faculty at the University of Iowa, in Iowa City. It was an act of courage and conviction. Within a year, Daniela and her mother joined him there.

“He wanted the freedom to think, to publish, to explore ideas,” Rus says. “And I reaped the benefits of being free from the limitations of our homeland.”

America’s open horizons were intoxicating, she says.

A lecture that changed everything

Rus decided to pursue a degree at her father’s university, where her life changed direction, she says. One afternoon, John Hopcroft—a Turing Award–winning Cornell computer scientist renowned for his work on algorithms and data structures—gave a talk on campus. His message was simple but electrifying, Rus says: Classical computer science had been solved. The next frontier, Hopcroft declared, was computations that interact with the messy physical world.

For Rus, the idea was a revelation.

“It was as if a door had opened,” she says. “I realized the future of computing wasn’t just about logic and code; it was about how machines can perceive, move, and help us in the real world.”

After the lecture, she introduced herself to Hopcroft and told him she wanted to learn from him. Not long after earning her bachelor’s degree in computer science and mathematics in 1985, she applied to get a master’s degree at Cornell, where Hopcroft became her graduate advisor. Rus developed algorithms there for dexterous robotic manipulation—teaching machines to grasp and move objects with precision. She earned her master’s in computer science in 1990, then stayed on at Cornell to work toward a doctorate.

In 1993 she earned her Ph.D. in computer science, then took a position as an assistant professor of computer science at Dartmouth College, in Hanover, N.H. She founded the college’s robotics laboratory and expanded her work into distributed robotics. She developed teams of small robots that cooperated to perform tasks such as ensuring that products in warehouses are correctly gathered to fulfill orders, packaged safely, and routed efficiently to their destinations.

Despite a lack of traditional machine shop facilities for fabrication on the Hanover campus, Rus found a way: She pioneered the use of 3D printing to rapidly prototype and build robots.

In 2003 she left Dartmouth to become a professor in the electrical engineering and computer science department at MIT. The robotics lab she created at Dartmouth moved with her to MIT and became known as the Distributed Robotics Laboratory (DRL). In 2012 she was named director of MIT’s Computer Science and Artificial Intelligence Laboratory, the school’s largest interdisciplinary lab, with 60 research groups, including the DRL. She also continues to serve as the DRL’s principal investigator.

The science of physical intelligence

Rus now leads pioneering research at the intersection of AI and robotics, a field she calls physical intelligence. It’s “a new form of intelligent machine that can understand dynamic environments, cope with unpredictability, and make decisions in real time,” she says.

Her lab builds soft-body robots inspired by nature that can sense, adapt, and learn. They are AI-driven systems that passively handle tasks—such as self-balancing and complex articulation similar to that done by the human hand—because their shape and materials minimize the need for heavy processing.

Such machines, she says, someday will be able to navigate different environments, perform useful functions without external control, and even recover from disturbances to their route planning. Researchers also are exploring ways to make them more energy efficient.

One prototype developed by Rus’s team is designed to retrieve foreign objects from the body, including batteries swallowed by children. The ingestible robots are artfully folded, similar to origami, so they are small enough to be swallowed. Embedded magnetic materials allow doctors to steer the soft robots and control their shape. Upon arriving in the stomach, a soft bot can be programmed to wrap around a foreign object and guide it safely out of the patient’s body.

CSAIL researchers also are working on small robots that can carry a medication and release it at a specific area within the digestive tract, bypassing the stomach acid known to diminish some drugs’ efficacy. Ingestible robots also could patch up internal injuries or ulcers. And because they’re made from digestible materials such as sausage casings and biocompatible polymers, the robots can perform their assigned tasks and then get safely absorbed by the body, she says.

Health care isn’t the only application on the horizon for such AI-driven technologies. Robots with physical intelligence might someday help firefighters locate people trapped in burning buildings, find miners after a cave-in, and provide valuable situational awareness to emergency response teams in the aftermath of natural disasters, Rus says.

“What excites me is the possibility of giving people new powers,” she says. “Machines that can think and move safely in the physical world will let us extend human reach—at work, at home, in medicine … everywhere.”

To make such a vision a reality, she has expanded her technical interests to include several complementary lines of research.

She’s working on self-reconfiguring and modular robots such as MIT’s M-Blocks and NASA’s SuperBots, which can attach, detach, and rearrange themselves to form shapes suited for different actions such as slithering, climbing, and crawling.

With networked robots—including those Amazon uses in its warehouses—thousands of machines can operate as a large adaptive system. The machines communicate continuously to divide tasks, avoid collisions, and optimize package routing.

Rus’s team also is making advances in human-robot interaction, such as reading brainwave activity and interpreting sign language through a smart glove.

To further her plan of putting all the computerized smarts the robots need within their physical bodies instead of in the cloud, she helped found Liquid AI in 2023. The company, based in Cambridge, Mass., develops liquid neural networks, inspired by the simple brains of worms, that can learn and adapt continuously. The word liquid in this case refers to the adaptability, flexibility, and dynamic nature of the team’s model architecture. It can change shape and adapt to new data inputs, and it fits within constraints imposed by the hardware in which it’s contained, she says.

Finding community in IEEE

Rus joined IEEE at one of its robotics conferences when she was a graduate student.

“I think I signed up just to get the student discount,” she says with a laugh. “But IEEE turned out to be the place where my community lived.”

She credits the organization’s conferences, journals, and collaborative spirit with shaping her professional growth.

“The exchange of ideas, the chance to test your thinking against others—it’s invaluable,” she says. “It’s how our field moves forward.”

Rus continues to serve on IEEE panels and committees, mentoring the next generation of roboticists.

“IEEE gave me a platform,” Rus says. “It taught me how to communicate, how to lead, and how to dream bigger.”

Living the American dream

Looking back, Rus sees her story as a testament to unforeseen possibilities.

“When I was growing up in Romania, I couldn’t even imagine living in America,” she says. “Now I’m here, working with brilliant students, building robots that help people, and trying to make a difference. I feel like I’m living the American dream.”

In a nod to a memorable song from the Broadway musical Hamilton, Rus echoes Alexander Hamilton’s determination to make the most of his opportunities, saying, “I don’t ever want to throw away my shot.”
- Video Friday: Disney’s Robotic Olaf Makes His Debut by Evan Ackerman on November 29, 2025 at 4:30 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

SOSV Robotics Matchup: 1–5 December 2025, ONLINE
ICRA 2026: 1–5 June 2026, VIENNA

Enjoy today’s videos!

Step behind the scenes with Walt Disney Imagineering Research & Development and discover how Disney uses robotics, AI, and immersive technology to bring stories to life! From the brand new self-walking Olaf in World of Frozen and BDX Droids to cutting-edge attractions like Millennium Falcon: Smugglers Run, see how magic meets innovation.
[ Disney Experiences ]

We just released a new demonstration of Mentee’s V3 humanoid robots completing a real-world logistics task together. Over an uninterrupted 18-minute run, the robots autonomously move 32 boxes from eight piles to storage racks of different heights. The video shows steady locomotion, dexterous manipulation, and reliable coordination throughout the entire task.
And there’s an uncut 18-minute version of this at the link.
[ MenteeBot ]
Thanks, Yovav!

This video contains graphic depictions of simulated injuries. Viewer discretion is advised.
In this immersive overview, guided by the DARPA Triage Challenge program manager, retired Army Col. Jeremy C. Pamplin, M.D., you’ll experience how teams of innovators, engineers, and DARPA are redefining the future of combat casualty care. Be sure to look all around! Check out competition runs, behind-the-scenes of what it takes to put on a DARPA Challenge, and glimpses into the future of lifesaving care.
Those couple of minutes starting at 6:50 with the human medic and robotic teaming were particularly cool.
[ DARPA ]

You don’t need to build a humanoid robot if you can just make existing humanoids a lot better.
I especially love 0:45 because you know what? Humanoids should spend more time sitting down, for all kinds of reasons. And of course, thank you for falling and getting up again, albeit on some of the squishiest grass on the planet.
[ Flexion ]

“Human-in-the-Loop Gaussian Splatting” wins best paper title of the week.
[ Paper ] via [ IEEE Robotics and Automation Letters in IEEE Xplore ]

Scratch that, “Extremum Seeking Controlled Wiggling for Tactile Insertion” wins best paper title of the week.
[ University of Maryland PRG ]

The battery swapping on this thing is... unfortunate.
[ LimX Dynamics ]

To push the boundaries of robotic capability, researchers in the Department of Mechanical Engineering at Carnegie Mellon University, in collaboration with the University of Washington and Google DeepMind, have developed a new tactile sensing system that enables four-legged robots to carry unsecured, cylindrical objects on their backs. This system, known as LocoTouch, features a network of tactile sensors that spans the robot’s entire back. As an object shifts, the sensors provide real-time feedback on its position, allowing the robot to continuously adjust its posture and movement to keep the object balanced.
[ Carnegie Mellon University ]

This robot is in more need of googly eyes than any other robot I’ve ever seen.
[ Zarrouk Lab ]

DPR Construction has deployed Field AI’s autonomy software on a quadruped robot at the company’s job site in Santa Clara, Calif., to greatly improve its daily surveying and data collection processes. By automating what has traditionally been a very labor-intensive and time-consuming process, Field AI is helping the DPR team operate more efficiently and effectively, while increasing project quality.
[ FieldAI ]

In our second episode of AI in Motion, our host, Waymo AI researcher Vincent Vanhoucke, talks with robotics startup founder Sergey Levine, who left a career in academic research to build better robots for the home and workplace.
[ Waymo ]
- For This Engineer, Taking Deep Dives Is Part of the Job by Edd Gent on November 27, 2025 at 1:00 pm
Early in Levi Unema’s career as an electrical engineer, he was presented with an unusual opportunity. While working on assembly lines at an automotive parts supplier in 2015, he got a surprise call from his high-school science teacher that set him off on an entirely new path: piloting underwater robots to explore the ocean’s deepest abysses.

That call came from Harlan Kredit, a nationally renowned science teacher and board member of a Rhode Island-based nonprofit called the Global Foundation for Ocean Exploration (GFOE). The organization was looking for an electrical engineer to help design, build, and pilot remotely operated vehicles (ROVs) for the U.S. National Oceanic and Atmospheric Administration.

Levi Unema
Employer: Deep Exploration Solutions
Occupation: ROV engineer
Education: Bachelor’s degree in electrical engineering, Michigan Technological University

This was an exciting break for Unema, a Washington state native who had grown up tinkering with electronics and exploring the outdoors. Unema joined the team in early 2016 and has since helped develop and operate deep-sea robots for scientific expeditions around the globe.

The GFOE’s contract with NOAA expired in July, forcing the engineering team to disband. But soon after, Unema teamed up with four former colleagues to start their own ROV consultancy, called Deep Exploration Solutions, to continue the work he’s so passionate about.

“I love the exploration and just seeing new things every day,” he says. “And the engineering challenges that go along with it are really exciting, because there’s a lot of pressure down there and a lot of technical problems to solve.”

Nature and Technology

Unema’s fascination with electronics started early. Growing up in Lynden, Wash., he took apart radios, modified headphones, and hacked together USB chargers from AA batteries. “I’ve always had to know how things work,” he says. He was also a Boy Scout, and much of his youth was spent hiking, camping, and snowboarding.

That love of both technology and nature can be traced back, at least in part, to his parents—his father was a civil engineer, and his mother was a high-school biology teacher. But another major influence growing up was Kredit, the science teacher who went on to recruit him. (Kredit was also a colleague of Unema’s mother.)

Kredit has won numerous awards for his work as an educator, including the Presidential Award for Excellence in Science Teaching in 2004. Like Unema, he shares a love for the outdoors; he is also Yellowstone National Park’s longest-serving park ranger. “He was an excellent science teacher, very inspiring,” says Unema.

When Unema graduated from high school in 2010, he decided to enroll at his father’s alma mater, Michigan Technological University, to study engineering. He was initially unsure what discipline to follow and signed up for the general engineering course, but he quickly settled on electrical engineering.

A summer internship at a steel mill run by the multinational corporation ArcelorMittal introduced Unema to factory automation and assembly lines. After graduating in 2014 he took a job at Gentex Corp. in Zeeland, Mich., where he worked on manufacturing systems and industrial robotics.

Diving Into Underwater Robotics

In late 2015, he got the call from Kredit asking if he’d be interested in working on underwater robots for GFOE. The role involved not just engineering these systems but also piloting them. Taking the plunge was a difficult choice, says Unema, as he’d just been promoted at Gentex. But the promise of travel combined with the novel engineering challenges made it too good an opportunity to turn down.

Building technology that can withstand the crushing pressure at the bottom of the ocean is tough, he says, and you have to make trade-offs between weight, size, and cost. Everything has to be waterproof, and electronics have to be carefully isolated to prevent them from grounding on the ocean floor. Some components are pressure-tolerant, but most must be stored in pressurized titanium flasks, so the components must be extremely small to minimize the size of the metallic housing.

Unema conducts predive checks from the Okeanos Explorer’s control room. Once the ROV is launched, scientists will watch the camera feeds and advise his team where to direct the vehicle. Art Howard

“You’re working very closely with the mechanical engineer to fit the electronics in a really small space,” he says. “The smaller the cylinder is, the cheaper it is, but also the less mass on the vehicle. Every bit of mass means more buoyancy is required, so you want to keep things small, keep things light.”

Communications are another challenge. The ROVs rely on several kilometers of cable containing just three single-mode optical fibers. “All the communication needs to come together and then go up one cable,” Unema says. “And every year new instruments consume more data.”

He works exclusively on ROVs that are custom made for scientific research, which require smoother control and considerably more electronics and instrumentation than the heavier-duty vehicles used by the oil and gas industry. “The science ones are all hand-built, they’re all quirky,” he says.

Unema’s role spans the full life cycle of an ROV’s design, construction, and operation. He primarily spends winters upgrading and maintaining vehicles and summers piloting them on expeditions. At GFOE, he mainly worked on two ROVs for NOAA called Deep Discoverer and Seirios, which operate from the ship Okeanos Explorer. But he has also piloted ROVs for other organizations over the years, including the Schmidt Ocean Institute and the Ocean Exploration Trust.

Unema’s new consultancy, Deep Exploration Solutions, has been given a contract to do the winter maintenance on the NOAA ROVs, and the firm is now on the lookout for more ROV design and upgrade work, as well as piloting jobs.

An Engineer’s Life at Sea

On expeditions, Unema is responsible for driving the robot. He follows instructions from a science team that watches the ROV’s video feed to identify things like corals, sponges, or deepwater creatures that they’d like to investigate in more detail. Sometimes he will also operate hydraulic arms to sample particularly interesting finds.

In general, the missions are aimed at discovering new species and mapping the range of known ones, says Unema. “There’s a lot of the bottom of the ocean where we don’t know anything about it,” he says. “Basically every expedition there’s some new species.”

This involves being at sea for weeks at a time. Unema says that life aboard ships can be challenging—many new crew members get seasick, and you spend almost a month living in close quarters with people you’ve often never met before. But he enjoys the opportunity to meet colleagues from a wide variety of backgrounds who are all deeply enthusiastic about the mission.

“It’s like when you go to scout camp or summer camp,” he says. “You’re all meeting new people. Everyone’s really excited to be there. We don’t know what we’re going to find.”

Unema also relishes the challenge of solving engineering problems with the limited resources available on the ship. “We’re going out to the middle of the Pacific,” he says. “Things break, and you’ve got to fix them with what you have out there.”

If that sounds more exciting than daunting, and you’re interested in working with ROVs, Unema’s main advice is to talk to engineers in the field. It’s a small but friendly community, he says, so just do your research to see what opportunities are available. Some groups, such as the Ocean Exploration Trust, also operate internships for college students to help them get experience in the field.

And Unema says there are very few careers quite like it. “I love it because I get to do all aspects of engineering—from idea to operations,” he says. “To be able to take something I worked on and use it in the field is really rewarding.”

This article appears in the December 2025 print issue as “Levi Unema.”
- Remote Robotics Could Widen Access to Stroke Treatment by Greg Uyeno on November 24, 2025 at 2:15 pm
When treating strokes, every second counts. But for patients in remote areas, it may take hours to receive treatment. The standard treatment for a common type of stroke, caused by large clots interrupting blood flow to the brain, is a procedure called endovascular thrombectomy, or EVT. During the procedure, an experienced surgeon pilots catheters through blood vessels to the blockage, accessed through a major vessel such as the femoral artery in the groin. This is typically aided by X-ray imaging, which shows the position of blood vessels.

“Good outcomes are directly associated with early treatment,” says Cameron Williams, a neurologist at the University of Melbourne and fellow with the Australian Stroke Alliance. In fact, “time is brain” is a common refrain in stroke treatment. While blood flow is stopped, about 2 million neurons die each minute. Over an hour, that adds up to 3.6 years of typical age-related brain cell loss.

But in remote places like Darwin, in the north of Australia, this treatment isn’t available. Instead, it could take 6 hours or more and an expensive aeromedical transfer to get a patient to a medical center, says Williams. There are similar geographical challenges to stroke treatment access all over the world. Sparing a rural patient hours of transfer time to a hospital with an on-site expert could save their life, prevent disability, or preserve years of their quality of life.

That’s why there is particular interest in the possibility of emergency stroke treatment performed remotely with the help of robotics. Machines placed in smaller population centers could connect patients to expert surgeons miles away, and shave hours off the time to treatment. Two companies have recently demonstrated their remote capabilities. In September, doctors in Toronto completed a series of increasingly distant brain angiograms, the X-ray imaging element of an EVT, eventually performing two angiograms between crosstown hospitals using the N1 system from Remedy Robotics. And in October, Sentante equipment facilitated a simulated EVT between a surgeon in Jacksonville, Fla., and a cadaver with artificial blood flow in Dundee, Scotland.

“All those stories connected is not only proof of concept. It’s coming to realization and implementation that robotic and remote interventions can be performed, and soon will be the reality for many centers in rural areas,” says Vitor Pereira, a neurosurgeon at Unity Health who performed the Toronto procedures.

Two Approaches to Remote EVT

One challenge of performing these remote procedures is maintaining strong, fast connections over large distances. “Is there a real life need to do this transatlantically? Probably not,” says Edvardas Satkauskas, CEO of Sentante. “It demonstrates the capabilities. Even this distance is feasible.” Although performing a procedure remotely introduces issues related to latency, the pace of EVT—while urgent—is not reliant on instant reactions, says Satkauskas. Redundant connections should also be an important safeguard against dropped connections. Remedy has taken measures, for instance, to ensure that its robot monitors connection quality and doesn’t make any harmful movements due to a poor connection, says David Bell, the company’s CEO. (A rough sketch of this kind of safeguard appears at the end of this article.)

Though both companies are careful about disclosing details of products and research that are still in development, there are notable differences between their approaches.

“Our device leans heavily on artificial intelligence,” says Bell. Machine learning is incorporated into how the Remedy device manipulates guide wires and creates an informational overlay atop X-ray images for remote physicians, who can control the robot with a laptop and software interface. The long-term goal is for a surgeon to be able to log on to Remedy software at short notice from a central location and interact with Remedy robots in multiple hospitals as needed.

In contrast, Sentante uses a control console meant to look and feel like the catheters and guide wires that surgeons are accustomed to manipulating in manual EVT, including force feedback that mimics the resistance they would feel in person. “It’s very intuitive to use this,” says Ricardo Hanel, a neurosurgeon with Baptist Health in Jacksonville, who was on the piloting end of the Sentante demonstration. The naturalistic feel in the transatlantic procedure came with a reported latency of around 120 milliseconds. Hanel is also on Sentante’s medical advisory board.

Sentante has not yet implemented AI-assisted movements of its robot, though a plan is in place to capture as much training data as possible, both from images and force measurements. “As we joke, we had to build a sophisticated piece of hardware to become a software company,” says CEO Satkauskas.

The Path to Clinical Use

Hanel expressed optimism that any control system would be easily learned by surgeons. “I think the main limitation for robotics is that you are still dependent on bedside interventionists,” says Ahmet Gunkan, an interventional radiologist at the University of Arizona, who has written about robots and endovascular interventions. Depending on the system, these bedside assistants might be responsible for a variety of tasks related to preparing and communicating with the patient, sterilizing and preparing equipment, loading step-specific parts, and repositioning X-ray or robotic equipment. Both CEOs note that while proper training will be essential, there are ways to reduce the burden on health care providers at the patient site.

In the case of remote operations, “it was important to us that the robot could do the entire thing,” says Bell. Remedy’s system has been designed to handle as much of the procedure as possible, and to streamline moments when bedside human interaction is necessary. For example, even since the older version used in Toronto, changes have been made to maintain a clean line of communication between bedside and remote clinicians, facilitated by the Remedy system, says Bell.

A team at St. Michael’s Hospital in Toronto performs the world’s first robot-assisted neurovascular procedure done remotely over a network, on 28 August 2025. Katie Cooper and Kevin Van Paassen/Unity Health Toronto

Though remote EVT is a high priority, systems capable of the procedure may first be approved for other endovascular procedures performed locally. The hope is that precision robotics leads to better patient outcomes, whether the surgeon is in the next room or the next county. Remedy has a clinical trial planned in 2026 for on-premise neurointerventions, and it has partnered with the Australian Stroke Alliance to distribute its N1 system and conduct a future clinical trial for remote procedures. Eventually the robot could be used to treat as many as 30 different conditions, says Bell.

Satkauskas views Sentante’s equipment as a flexible platform for endovascular procedures throughout the body, which could help keep bedside clinicians familiar with the device. The system may go to market in the EU next year for peripheral vascular interventions, which restore blood flow to the limbs, and it has a breakthrough device designation from the U.S. FDA for remote stroke treatment.

There are other players in the space. For example, an early telerobotic effort from a company called Corindus is still ongoing after the company’s acquisition by Siemens in 2019. And Pereira notes that Xcath has also demonstrated a long-distance simulated EVT and looks to perform local robotic EVT with live patients soon.

“I think it’s an exciting time to be a neurointerventionalist,” says Hanel.
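The connection-quality safeguard described above (pausing motion when the link degrades) can be illustrated with a short, hypothetical sketch. The Python below is not Remedy’s or Sentante’s software; the class, thresholds, and method names are assumptions chosen only to show the general idea of a teleoperation watchdog that gates motion commands on link health.

```python
import time

# Illustrative sketch only: a watchdog that holds a teleoperated robot still
# whenever the link to the remote console looks unhealthy. All names and
# thresholds here are assumptions for illustration, not vendor details.

MAX_LATENCY_S = 0.25       # assumed tolerable round-trip latency
HEARTBEAT_TIMEOUT_S = 0.5  # assumed limit before the link counts as dropped


class LinkWatchdog:
    """Gate motion commands on connection health."""

    def __init__(self, max_latency_s=MAX_LATENCY_S, heartbeat_timeout_s=HEARTBEAT_TIMEOUT_S):
        self.max_latency_s = max_latency_s
        self.heartbeat_timeout_s = heartbeat_timeout_s
        self.last_heartbeat = time.monotonic()
        self.latency_s = 0.0

    def on_heartbeat(self, round_trip_latency_s):
        # Record each heartbeat/command packet arriving from the remote console.
        self.last_heartbeat = time.monotonic()
        self.latency_s = round_trip_latency_s

    def link_ok(self):
        heartbeat_fresh = (time.monotonic() - self.last_heartbeat) < self.heartbeat_timeout_s
        return heartbeat_fresh and self.latency_s < self.max_latency_s

    def filter_command(self, commanded_advance_mm):
        # Pass the surgeon's advance command through only when the link is
        # healthy; otherwise command zero motion and hold position.
        return commanded_advance_mm if self.link_ok() else 0.0


if __name__ == "__main__":
    watchdog = LinkWatchdog()
    watchdog.on_heartbeat(round_trip_latency_s=0.12)  # roughly the 120 ms reported above
    print(watchdog.filter_command(1.0))               # healthy link: command passes through
    watchdog.latency_s = 0.4                          # simulate a degraded link
    print(watchdog.filter_command(1.0))               # unhealthy link: robot holds still
```

In a real system this logic would sit alongside redundant network paths and clinical safeguards; the point of the sketch is only that a simple, continuously evaluated health check can stand between the remote operator’s commands and the robot’s actuators.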
- Video Friday: Watch Robots Throw, Catch, and Hit a Baseball by Evan Ackerman on November 21, 2025 at 4:20 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.SOSV Robotics Matchup: 1–5 December 2025, ONLINEICRA 2026: 1–5 June 2026, VIENNAEnjoy today’s videos! Researchers at the RAI Institute have built a low-impedance platform to study dynamic robot manipulation. In this demo, robots play a game of catch and participate in batting practice, both with each other and with skilled humans. The robots are capable of throwing 70mph [112 kph], approaching the speed of a strong high school pitcher. The robots can catch and bat at short distances (23 feet [7 m]) requiring quick reaction times to catch balls thrown at up to 41 mph [66kph] and hit balls pitched at up to 30 mph [48kph].That’s a nice touch with the custom “RAI” baseball gloves, but what I really want to know is how long a pair of robots can keep themselves entertained.[ RAI Institute ]This week’s best bacronym winner is GIRAF: Greatly Increased Reach AnyMAL Function. And if that arm looks like magic, that’s because it is, although with some careful pausing of the video you’ll be able to see how it works.[ Stanford BDML ]DARPA concluded the second year of the DARPA Triage Challenge on October 4, awarding top marks to DART and MSAI in Systems and Data competitions, respectively. The three-year prize competition aims to revolutionize medical triage in mass casualty incidents where medical resources are limited.[ DARPA ]We propose a robot agnostic reward function that balances the achievement of a desired end pose with impact minimization and the protection of critical robot parts during reinforcement learning. To make the policy robust to a broad range of initial falling conditions and to enable the specification of an arbitrary and unseen end pose at inference time, we introduce a simulation-based sampling strategy of initial and end poses. Through simulated and real-world experiments, our work demonstrates that even bipedal robots can perform controlled, soft falls.[ Moritz Baecher ]Oh look, more humanoid acrobatics.My prediction: once humanoid companies run out of mocapped dance moves, we’ll start seeing some freaky stuff that leverages the degrees of freedom that robots have and humans do not. You heard it here first, folks.[ MagicLab ]I challenge the next company that makes a “lights-out” video to just cut to just a totally black screen with a little “Successful Picks” counter in the corner that just goes up and up and up.[ Brightpick ]Thanks, Gilmarie!The terrain stuff is cool and all but can we just talk about the trailer instead?[ LimX Dynamics ]Presumably very picky German birblets are getting custom nesting boxes manufactured with excessively high precision by robots.[ TUM ]All those UBTECH Walker S2 robots weren’t fake, it turns out.[ UBTECH ]This is more automation than what we’d really be thinking of as robotics at this point, but I could still watch it all day.[ Motoman ]Brad Porter (Cobot) and Alfred Lin (Sequoia Capital) discuss the future of robotics, AI, and automation at the Human[X] Conference, moderated by CNBC’s Kate Rooney. They explore why collaborative robots are accelerating now, how AI is transforming physical systems, the role of humanoids, labor market shifts, and the investment trends shaping the next decade of robotics.[ Cobot ]Humanoid robots have long captured our imagination. 
Interest has skyrocketed along with the perception that robots are getting closer to taking on a wide range of labor-intensive tasks. In this discussion, we reflect on what we’ve learned by observing factory floors, and why we’ve grown convinced that chasing generalization in manipulation—both in hardware and behavior—isn’t just interesting, but necessary. We’ll discuss AI research threads we’re exploring at Boston Dynamics to push this mission forward, and highlight opportunities our field should collectively invest more in to turn the humanoid vision, and the reinvention of manufacturing, into a practical, economically viable product.[ Boston Dynamics ]On November 12, 2025, Tom Williams presented “Degrees of Freedom: On Robotics and Social Justice” as part of the Michigan Robotics Seminar Series.[ Michigan Robotics ]Ask the OSRF Board of Directors anything! Or really, listen to other people ask them anything. [ ROSCon ]
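A quick back-of-the-envelope note on the RAI Institute demo above: at the quoted speeds and distance, the robots have well under half a second to react. Here is the arithmetic as a tiny sketch, ignoring drag and the arc of the throw, so the numbers are only rough estimates.
```python
# Rough flight-time estimates for the catch-and-bat figures quoted above.
# Ignores air drag and the arc of the throw; purely illustrative.
MPH_TO_MS = 0.44704   # meters per second per mile per hour
FEET_TO_M = 0.3048    # meters per foot

distance_m = 23 * FEET_TO_M        # ~7.0 m between thrower and catcher
catch_speed = 41 * MPH_TO_MS       # ~18.3 m/s incoming throw
pitch_speed = 30 * MPH_TO_MS       # ~13.4 m/s batting-practice pitch

print(f"time to react and catch: {distance_m / catch_speed:.2f} s")  # ~0.38 s
print(f"time to react and hit:   {distance_m / pitch_speed:.2f} s")  # ~0.52 s
```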
- Why Is Everyone’s Robot Folding Clothes? by Chris Paxton on November 19, 2025 at 4:00 pm
It seems like every week there’s a new video of a robot folding clothes. We’ve had some fantastic demonstrations, like this semi-autonomous video from Weave Robotics on X.It’s awesome stuff, but Weave is far from the only company producing these kinds of videos. Figure 02 is folding clothes. Figure 03 is folding clothes. Physical Intelligence launched their flagship vision-language-action model, pi0, with an amazing video of a robot folding clothes after unloading a laundry machine. You can see robots folding clothes live at robotics expos. Even before all this, Google showed clothes folding in their work, ALOHA unleashed. 7X Tech is even planning to sell robots to fold clothes!And besides folding actual clothes, there are other clothes-folding-like tasks, like Dyna’s napkin folding—which leads to what is probably my top robot video of the year, demonstrating 18 hours of continuous napkin folding. So why are all of these robotic manipulation companies suddenly into folding? Reason 1: We basically couldn’t do this beforeThere’s work going back over a decade that shows some amount of robotic clothes folding. But these demonstrations were extremely brittle, extremely slow, and not even remotely production-ready. Previous solutions existed (even learning-based solutions!) but they relied on precise camera calibration, or on carefully hand-designed features, meaning that these clothes-folding demos generally worked only on one robot, in one environment, and may have only ever worked a single time—just enough for the recording of a demo video or paper submission. With a little bit of help from a creatively patterned shirt, PR2 was folding things back in 2014.Bosch/IEEETake a look at this example of UC Berkeley’s PR2 folding laundry from 2014. This robot is, in fact, using a neural network policy. But that policy is very small and brittle; it picks and places objects against the same green background, moves very slowly, and can’t handle a wide range of shirts. Making this work in practice would require larger models, pretrained on web-scale data, and better, more general techniques for imitation learning.And so 10 years later, with the appropriate demonstration data, many different startups and research labs have been able to implement clothes-folding demos; it’s something we have seen from numerous hobbyists and startups, using broadly similar tools (like LeRobot from HuggingFace), without intense specialization.Reason 2: It looks great and people want it!Many of us who work in robotics have this “north star” of a robot butler that can do all the chores we don’t want to do. Mention clothes folding, and many, many people will chime in about how they don’t ever want to fold clothes again and are ready to part with basically any amount of money to make that happen.This is important for the companies involved as well. Companies like Figure and 1x have been raising large amounts of money predicated on the idea that they will be able to automate many different jobs, but increasingly these companies seem to want to start in the home. Dyna Robotics can fold an indefinite number of napkins indefinitely.Dyna RoboticsAnd that’s part of the magic of these demos. While they’re slow and imperfect, everyone can start to envision how this technology becomes the thing that we all want: a robot that can exist in our house and mitigate all those everyday annoyances that take up our time.Reason 3: It avoids what robots are still bad atThese robot behaviors are produced by models trained via imitation learning. 
Modern imitation-learning methods like Diffusion Policy use techniques inspired by generative AI to produce complex, dexterous robot trajectories, based on examples of expert human behavior that’s been provided to them—and they often need many, many trajectories. The work ALOHA Unleashed by Google is a great example, needing about 6,000 demonstrations to learn how to, for example, tie a pair of shoelaces. For each of these demonstrations, a human piloted a pair of robot arms while performing the task; all of this data was then used to train a policy.We need to keep in mind what’s hard about these demonstrations. Human demonstrations are never perfect, nor are they perfectly consistent; for example, two human demonstrators will never grab the exact same part of an object with submillimeter precision. That’s potentially a problem if you want to screw a cover in place on top of a machine you’re building, but it’s not a problem at all for folding clothes, which is fairly forgiving. This has two knock-on effects:It’s easier to collect the demonstrations you need for folding clothes, as you don’t need to throw out every training demonstration that’s a millimeter out of spec.You can use cheaper, less repeatable hardware to accomplish the same task, which is useful if you suddenly need a fleet of robots collecting thousands of demos, or if you’re a small team with limited funding!For similar reasons, it’s great that with cloth folding, you can fix your cameras in just the right position. When learning a new skill, you need training examples with “coverage” of the space of environments you expect to see at deployment time. So the more control you have, the more efficient the learning process will be—the less data you’ll need, and the easier it will be to get a flashy demo. Keep this in mind when you see a robot folding things on a plain tabletop or with an extremely clean background; that’s not just nice framing, it helps out the robot a lot!And since we’ve committed to collecting a ton of data—dozens of hours—to get this task working well, mistakes will be made. It’s very useful, then, if it’s easy to reset the task, i.e., restore it to a state from which you can try the task again. If something goes wrong folding clothes, it’s fine. Simply pick the cloth up, drop it, and it’s ready for you to start over. This wouldn’t work if, say, you were stacking glasses to put away in a cupboard, since if you knock over the stack or drop one on the floor, you’re in trouble.Clothes folding also avoids making forceful contact with the environment. Once you’re exerting a lot of pressure, things can break, the task can become non-resettable, and demonstrations are often much harder to collect because forces aren’t as easily observable to the policy. And every piece of variation (like the amount of force you’re exerting) will end up requiring more data so the model has “coverage” of the space it’s expected to operate in.What to Look Forward toWhile we’re seeing a lot of clothes-folding demos now, I still feel, broadly, quite impressed with many of them. As mentioned above, Dyna was one of my favorite demos this year, mostly because longer-running robot policies have been so rare until now. But they were able to demonstrate zero-shot folding (meaning folding without additional training data) at a couple of different conferences, including Actuate in San Francisco and the Conference on Robot Learning (CoRL) in Seoul. 
This is impressive and actually very rare in robotics, even now.In the future, we should hope to see robots that can handle more challenging and dynamic interactions with their environments: moving more quickly, moving heavier objects, and climbing or otherwise handling adverse terrain while performing manipulation tasks.But for now, remember that modern learning methods will come with their own strengths and weaknesses. It seems that, while not easy, clothes folding is the kind of task that’s just really well suited for what our models can do right now. So expect to see a lot more of it.
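For readers who want to see the shape of the method this article keeps gesturing at, here is a minimal behavior-cloning sketch. It is not the code behind any of the demos mentioned above (Diffusion Policy and pi0 are far more sophisticated), and the dimensions and random tensors are placeholders standing in for real teleoperated demonstrations.
```python
# Minimal behavior-cloning sketch: fit a policy to (observation, action) pairs
# collected from teleoperated demonstrations. Real systems condition on camera
# images and use diffusion- or transformer-based action heads; here, random
# tensors stand in for demonstration data so the script runs on its own.
import torch
import torch.nn as nn

OBS_DIM, ACT_DIM = 32, 7                  # placeholder state and action sizes
demo_obs = torch.randn(5000, OBS_DIM)     # stand-in for recorded observations
demo_act = torch.randn(5000, ACT_DIM)     # stand-in for the teleoperator's actions

policy = nn.Sequential(
    nn.Linear(OBS_DIM, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, ACT_DIM),
)
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)

for epoch in range(10):
    for i in range(0, len(demo_obs), 256):
        obs, act = demo_obs[i:i + 256], demo_act[i:i + 256]
        loss = nn.functional.mse_loss(policy(obs), act)  # imitate the demonstrator
        opt.zero_grad()
        loss.backward()
        opt.step()
```
Everything the policy knows comes from those demonstration tensors, which is exactly why camera placement, easy resets, and forgiving tasks like towel folding matter so much to the teams collecting the data.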
- Students Compete—and Cooperate—in FIRST Global Robotics Challenge by Kohava Mendelsohn on November 15, 2025 at 2:00 pm
Aspiring engineers from 191 countries gathered in Panama City in October to compete in the FIRST Global Robotics Challenge. The annual contest aims to foster problem-solving and cooperation, and to inspire the next generation of engineers through three challenges based on a different theme every year. Teams of students aged 14 to 18 from around the world compete in the three-day event, remotely operating their robots to complete the challenges. This year’s topic was “Eco-equilibrium,” emphasizing the importance of preserving ecosystems and protecting vulnerable species.Turning Robotics Into a Sport Each team competed in a series of ranking matches at the event. Each match lasted 2 minutes and 30 seconds and consisted of several simultaneous goals. First, students guided their robots in gathering “biodiversity units” (multicolored balls) and delivering them to their humans. Next, the robots removed “barriers” (larger, gray balls) from containers and disposed of them in a set area. Then team members threw the biodiversity units into the now-cleared containers to score points. At the end of the match, each robot was tasked with climbing a 1.5-meter rope. The team with the most points won the match.To promote collaboration, each match had two groups, each consisting of three individual teams and their robots, competing for victory. Each team controlled its own robot but had to work with the other robots in the group to complete the tasks. If all six robots managed to climb the rope at the end of the match, each team’s score was multiplied by 1.5.The top 24 teams were split into six “alliances” of four individual teams each to compete in the playoffs. The highest-scoring alliance was crowned the winner. This year’s winning teams were Cameroon, Mexico, Panama, and Venezuela. Each student received a gold medal.It may have been hard to tell it was a competition at first glance. When all six robots successfully climbed the rope at the end of the match, students across teams were hugging each other, clapping, and cheering. “It’s not about winning, it’s not about losing, it’s about learning from others,” says Clyde Snyders, a member of the South Africa team. His sentiment was echoed throughout the event. Making It Into the CompetitionBefore the main event, countries all over the world run qualifying events where thousands of students show off their robotics skills for a chance to make it to the final competition. Each country chooses its team differently. Some pick the top-scoring team to compete, while others pick students from different teams to create a new one.Even after qualifying, for some students, physically getting to the competition isn’t straightforward. This year, Team Jamaica faced challenges after Hurricane Melissa struck the country on 28 October, one day before the competition began. It was the strongest storm ever to hit Jamaica, killing 32 people and causing billions of dollars in infrastructure damage. Because of the damage, the Jamaican team faced repeatedly cancelled flights and other travel delays. They almost didn’t make it, but FIRST Global organizers covered the costs of their travel. The students arrived on the second day, just in time to participate in enough matches to avoid being disqualified. Team Jamaica arrived late due to Hurricane Melissa, but they remained positive. Kohava Mendelsohn“We are so happy to be here,” says Joelle Wright, the team captain.
“To be able to engage in new activities, to compete, and to be able to showcase our hard work.” Team Jamaica won a bronze medal.Working Together to Fix and Improve RobotsThroughout the competition, it was a regular occurrence to see students from different teams huddled together, debugging problems, sharing tips, and learning together. Students were constantly fixing their robots and adding new features at the event’s robot hospital. There, teams could request spare parts, get help from volunteers, and access the tools they needed. Volunteering in the robot hospital is demanding, but rewarding, says Janet Kapito, an electrical engineer and the operations manager at Robotics Foundation Malawi in Blantyre. She participated in the FIRST Global Challenge when she was a student. “[The volunteers] get to see different perspectives and understand how people think differently,” she says. It’s rewarding to watch students solve problems on their own, she adds. The hospital was home to many high-stress situations, especially on the first day of the competition. The Ecuadorian team’s robot was delayed in transit. So, using the robot hospital’s parts, the students built a new robot to compete with. Tanzanian team members were hard at work repairing their robot, which was having issues with the mechanism that allowed it to climb up the rope. Collaboration played a key role in the hospital. When the South African team’s robot was having mechanical problems, the students weren’t fixing it alone—several teams, including Venezuela, Slovenia, and India, came to help. “It was truly inspirational, and such a great effort in bringing teams from over 190 countries to come and collaborate,” says Joseph Wei, director of IEEE Region 6, who was in attendance at the event.The Importance of Mentoring Future EngineersBehind every team were mentors and coaches who provided students with guidance and experience. Many of them were past participants who are invested in teaching the next generation of engineers. But the robots are designed and built by the students, says Rob Haake, a mentor for Team United States. He tried to stay as hands-off as possible in the engineering of the robot, he says, “so if you asked me to turn on the robot, I don’t even know how to do it.” Haake is the COO of window and door manufacturing company Weiland Inc., in Norfolk, Neb. His passion is to teach kids the skills they need to build things. It’s important to teach students how to think critically and solve problems while also developing technical skills, he says, because those students are the future tech leaders. One major issue he sees is the lack of team mentors. If you’re an engineer, he says, “the best way to help [FIRST Global] grow is to call your local schools to ask if they have a robotics team, and if not, how you can help create one.“The answer may be a monetary donation or, more importantly, your time,” he says. The students you mentor may one day represent their country at a FIRST Robotics Challenge.
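For readers curious how the ranking-match bookkeeping described earlier shakes out, here is a toy scorer. The per-item point values are invented placeholders (FIRST Global publishes the official ones); only the 1.5x multiplier for all six robots climbing comes from the rules described above.
```python
# Toy scorer for a FIRST Global-style ranking match. Point values are
# illustrative placeholders; the 1.5x all-robots-climbed multiplier is the
# cooperative bonus described in the article.
def team_score(units_scored: int, barriers_removed: int, climbed: bool,
               all_six_climbed: bool) -> float:
    score = units_scored * 2 + barriers_removed * 3 + (10 if climbed else 0)
    if all_six_climbed:              # every robot in the match made it up the rope
        score *= 1.5
    return score

# A team that scored 12 biodiversity units, cleared 4 barriers, and climbed,
# in a match where all six robots climbed: (24 + 12 + 10) * 1.5 = 69.0
print(team_score(12, 4, True, True))
```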
- This Soft Robot Is 100% Edible, Including the Battery by Evan Ackerman on November 14, 2025 at 8:24 pm
While there are many useful questions to ask when encountering a new robot, “can I eat it” is generally not one of them. I say ‘generally,’ because edible robots are actually a thing—and not just edible in the sense that you can technically swallow them and suffer both the benefits and consequences, but ingestible, where you can take a big bite out of the robot, chew it up, and swallow it.Yum.But so far these ingestible robots have included a very please-don’t-ingest-this asterisk: the motor and battery, which are definitely toxic and probably don’t taste all that good. The problem has been that soft, ingestible actuators run on gas pressure, requiring pumps and valves to function, neither of which is easy to make without plastic and metal. But in a new paper, researchers from Dario Floreano’s Laboratory of Intelligent Systems at EPFL in Switzerland have demonstrated ingestible versions of both batteries and actuators, resulting in what is, as far as I know, the first entirely ingestible robot capable of controlled actuation. EPFL Let’s start with the battery on this lil’ guy. In a broad sense, a battery is just a system for storing and releasing energy. In the case of this particular robot, the battery is made of gelatin and wax. It stores chemical energy in chambers containing liquid citric acid and baking soda, both of which you can safely eat. The citric acid is kept separate from the baking soda by a membrane, and enough pressure on the chamber containing the acid will puncture that membrane, allowing the acid to slowly drip onto the baking soda. This activates the battery and begins to generate CO2 gas, along with sodium citrate (common in all kinds of foods, from cheese to sour candy) as a byproduct. EPFLThe CO2 gas travels through gelatin tubing into the actuator, which is of a fairly common soft robotic design that uses interconnected gas chambers on top of a slightly stiffer base that bends when pressurized. Pressurizing the actuator gets you one single actuation, but to make the actuator wiggle (wiggling being an absolutely necessary skill for any robot), the gas has to be cyclically released. The key to doing this is the other major innovation here: an ingestible valve. EPFLThe valve operates based on the principle of snap-buckling, which means that it’s happiest in one shape (closed), but if you put it under enough pressure, it rapidly snaps open and then closed again once the pressure is released. The current version of the robot operates at about four bending cycles per minute over a period of a couple of minutes before the battery goes dead.And so there you go: a battery, a valve, and an actuator, all ingestible, make for a little wiggly robot, also ingestible. Great! But why? “A potential use case for our system is to provide nutrition or medication for elusive animals, such as wild boars,” says lead author Bokeon Kwak. “Wild boars are attracted to live moving prey, and in our case, it’s the edible actuator that mimics it.” The concept is that you could infuse something like a swine flu vaccine into the robot. Because it’s cheap to manufacture, safe to deploy, completely biodegradable, and wiggly, it could potentially serve as an effective strategy for targeted mass delivery to the kind of animals that nobody wants to get close to. And it’s obviously not just wild boars—by tuning the size and motion characteristics of the robot, what triggers it, and its smell and taste, you could target pretty much any animal that finds wiggly things appealing.
And that includes humans!Kwak says that if you were to eat this robot, the actuator and valve would taste a little bit sweet, since they have glycerol in them, with a texture like gummy candy. The pneumatic battery would be crunchy on the outside and sour on the inside (like a lemon) thanks to the citric acid. While this work doesn’t focus specifically on taste, the researchers have made other versions of the actuator that were flavored with grenadine. They served these actuators out to humans earlier this year, and are working on an ‘analysis of consumer experience’ which I can only assume is a requirement before announcing a partnership with Haribo. Eatability, though, is not the primary focus of the robot, says PI Dario Floreano. “If you look at it from the broader perspective of environmental and sustainable robotics, the pneumatic battery and valve system is a key enabling technology, because it’s compatible with all sorts of biodegradable pneumatic robots.” And even if you’re not particularly concerned with all the environmental stuff, which you really should be, in the context of large swarms of robots in the wild it’s critical to focus on simplicity and affordability just to be able to usefully scale.This is all part of the EU-funded RoboFood project, and Kwak is currently working on other edible robots. For example, the elastic snap-buckling behavior in this robot’s valve is sort of battery-like in that it’s storing and releasing elastic energy, and with some tweaking, Kwak is hoping that edible elastic power sources might be the key for tasty little jumping robots that jump right off the dessert plate and into your mouth.Edible Pneumatic Battery for Sustained and Repeated Robot Actuation, by Bokeon Kwak, Shuhang Zhang, Alexander Keller, Qiukai Qi, Jonathan Rossiter, and Dario Floreano from EPFL, is published in Advanced Science.
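For the curious, here is a toy relaxation-oscillator model of how a one-shot gas source plus a snap-buckling valve turns into repeated wiggles. Every number below (gas-generation rate, snap-open and re-close pressures) is invented for illustration; the real thresholds come from the gelatin valve's geometry, which the paper characterizes.
```python
# Toy model of the ingestible actuator's cycle: the citric-acid/baking-soda
# battery steadily generates CO2, pressure builds, and the snap-buckling valve
# vents once a threshold is crossed, giving repeated bending cycles.
# All parameters are invented placeholders, not values from the paper.
GEN_RATE = 0.6      # pressure units added per second by CO2 generation
VENT_RATE = 6.0     # pressure units released per second while the valve is open
P_SNAP_OPEN = 10.0  # pressure at which the valve snaps open
P_SNAP_CLOSE = 2.0  # pressure at which it snaps shut again
DT = 0.01           # simulation step, seconds

pressure, valve_open, cycles, t = 0.0, False, 0, 0.0
while t < 120.0:                      # simulate two minutes of operation
    pressure += GEN_RATE * DT
    if valve_open:
        pressure -= VENT_RATE * DT
        if pressure <= P_SNAP_CLOSE:
            valve_open = False        # valve relaxes back to its closed shape
    elif pressure >= P_SNAP_OPEN:
        valve_open = True             # snap-buckling: rapid transition to open
        cycles += 1
    t += DT

print(f"bending cycles in two minutes: {cycles}")  # a few per minute, roughly
```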
- Video Friday: DARPA Challenge Focuses on Heavy-Lift Drones by Evan Ackerman on November 14, 2025 at 6:30 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.ICRA 2026: 1–5 June 2026, VIENNAEnjoy today’s videos! Current multirotor drones provide simplicity, affordability, and ease of operation; however, their primary limitation is their low payload-to-weight ratio, which typically falls at 1:1 or less. The DARPA Lift Challenge aims to shatter the heavy lift bottleneck, seeking novel drone designs that can carry payloads more than four times their weight, which would revolutionize the way we use drones across all sectors.[ DARPA ]Huge milestone achieved! World’s first mass delivery of humanoid robots has completed! Hundreds of UBTECH Walker S2 have been delivered to our partners.I really hope that’s not how they’re actually shipping their robots.[ UBTECH ]There is absolutely no reason to give robots hands if you can just teach them to lasso stuff instead.[ ArcLab ]Saddle Creek deployed Carter in its order fulfillment operation for a beauty client. It helps to automate and optimize tote delivery operations between multiple processing and labeling lines and more than 20 designated drop-off points. In this capacity, Carter functions as a flexible, non-integrated “virtual conveyor” that streamlines material flow without requiring fixed infrastructure.[ Robust.ai ]This is our latest work on an aerial–ground robot team, the first time a language–vision hierarchy achieves long-horizon navigation and manipulation on the real UAV + quadruped using only 2D cameras. The article is published open-access in Advanced Intelligent Systems.[ DRAGON Lab ]Thanks, Moju!I am pretty sure that you should not use a quadrupedal robot to transport your child. But only pretty sure, not totally certain.[ DEEP Robotics ]Building Behavioral Foundation Models (BFMs) for humanoid robots has the potential to unify diverse control tasks under a single, promptable generalist policy. However, existing approaches are either exclusively deployed on simulated humanoid characters, or specialized to specific tasks such as tracking. We propose BFM-Zero, a framework that learns an effective shared latent representation that embeds motions, goals, and rewards into a common space, enabling a single policy to be prompted for multiple downstream tasks without retraining.[ BFM-Zero ]Welcome to the very, very near future of manual labor.[ AgileX ]MOMO (Mobile Object Manipulation Operator) has been one of KIMLAB’s key robots since its development about two years ago and has featured as a main actor in several of our videos. The design and functionalities of MOMO were recently published in IEEE Robotics & Automation Magazine.[ Paper ] via [ KIMLAB ]We are excited about our new addition to our robot fleet! As a shared resource for our faculty members, this robot will facilitate multiple research activities within our institute that target significant future funding. Our initial focus for this robot will be on an agricultural application but we have big plans for the robot in human-robot interaction projects.[ Ingenuity Labs ]The nice thing about robots that pick grapes in vineyards is that they don’t just eat the grapes, like I do.[ Extend Robotics ]How mobile of a mobile manipulator do you need?[ Clearpath Robotics ]Robotics professor Dr. Christian Hubicki talks about the NEO humanoid announcement on October 29, 2025. 
While explaining the technical elements and product readiness, he refuses to show any emotion whatsoever.[ Optimal Robotics Lab ]
- Video Friday: This Drone Drives and Flies—Seamlessly by Evan Ackerman on November 7, 2025 at 6:30 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.ICRA 2026: 1–5 June 2026, VIENNAEnjoy today’s videos! Unlike existing hybrid designs, Duawlfin eliminates the need for additional actuators or propeller-driven ground propulsion by leveraging only its standard quadrotor motors and introducing a differential drivetrain with one-way bearings. The seamless transitions between aerial and ground modes further underscore the practicality and effectiveness of our approach for applications like urban logistics and indoor navigation.[ HiPeR Lab ]I appreciate the softness of NEO’s design, but those fingers look awfully fragile.[ 1X ]Imagine reaching into your backpack to find your keys. Your eyes guide your hand to the opening, but once inside, you rely almost entirely on touch to distinguish your keys from your wallet, phone, and other items. This seamless transition between sensory modalities (knowing when to rely on vision versus touch) is something humans do effortlessly but robots struggle with. The challenge isn’t just about having multiple sensors. Modern robots are equipped with cameras, tactile sensors, depth sensors, and more. The real problem is how to integrate these different sensory streams, especially when some sensors provide sparse but critical information at key moments. Our solution comes from rethinking how we combine modalities. Instead of forcing all sensors through a single network, we train separate expert policies for each modality and learn how to combine their action predictions at the policy level.Multi-university Collaboration presented via [ GitHub ]Thanks, Haonan!Happy (somewhat late) Halloween from Pollen Robotics![ Pollen Robotics ]In collaboration with our colleagues from Iowa State and University of Georgia, we have put our pipe-crawling worm robot to test in the field. See it crawl through corrugated drainage pipes in a stream and a smooth section of a subsurface drainage system.[ Paper ] from [ Smart Microsystems Laboratory, Michigan State University ]Heterogeneous robot teams operating in realistic settings often must accomplish complex missions requiring collaboration and adaptation to information acquired online. Because robot teams frequently operate in unstructured environments—uncertain, open-world settings without prior maps—subtasks must be grounded in robot capabilities and the physical world. We present SPINE-HT, a framework that addresses these limitations by grounding the reasoning abilities of LLMs in the context of a heterogeneous robot team through a three-stage process. In real-world experiments with a Clearpath Jackal, a Clearpath Husky, a Boston Dynamics Spot, and a high-altitude UAV, our method achieves an 87 percent success rate in missions requiring reasoning about robot capabilities and refining subtasks with online feedback.[ SPINE-HT ] from [ GRASP Lab, University of Pennsylvania ]Astribot keeping itself busy at IROS 2025.[ Astribot ]In two papers published in Matter and Advanced Science, a team of scientists from the Physical Intelligence Department at the Max Planck Institute for Intelligent Systems in Stuttgart, Germany, developed control strategies for influencing the motion of self-propelling oil droplets. These oil droplets mimic single-celled microorganisms and can autonomously solve a complex maze by following chemical gradients. 
However, it is very challenging to integrate external perturbation and use these droplets in robotics. To address these challenges, the team developed magnetic droplets that still possess lifelike properties and can be controlled by external magnetic fields. In their work, the researchers showed that they are able to guide the droplet’s motion and use them in microrobotic applications such as cargo transportation.[ Max Planck Institute ]Everyone has fantasized about having an embodied avatar! Full-body teleoperation and full-body data acquisition platform is waiting for you to try it out![ Unitree ]It’s not a humanoid, but it right now safely does useful things and probably doesn’t cost all that much to buy or run.[ Naver Labs ]This paper presents a curriculum-based reinforcement learning framework for training precise and high-performance jumping policies for the robot Olympus. Separate policies are developed for vertical and horizontal jumps, leveraging a simple yet effective strategy. Experimental validation demonstrates horizontal jumps up to 1.25 m with centimeter accuracy and vertical jumps up to 1.0 m. Additionally, we show that with only minor modifications, the proposed method can be used to learn omnidirectional jumping.[ Paper ] from [ Autonomous Robots Lab, Norwegian University of Science and Technology ]Heavy payloads are no problem for it: The new KR TITAN ultra moves payloads of up to 1,500 kg, making the heavy lifting extreme in the KUKA portfolio.[ Kuka ]Good luck getting all of the sand out of that robot. Perhaps a nice oil bath is in order?[ DEEP Robotics ]This CMU RI Seminar is from Yuke Zhu at the University of Texas at Austin, on “Toward Generalist Humanoid Robots: Recent Advances, Opportunities, and Challenges.”In an era of rapid AI progress, leveraging accelerated computing and big data has unlocked new possibilities to develop generalist AI models. As AI systems like ChatGPT showcase remarkable performance in the digital realm, we are compelled to ask: Can we achieve similar breakthroughs in the physical world—to create generalist humanoid robots capable of performing everyday tasks? In this talk, I will outline our data-centric research principles and approaches for building general-purpose robot autonomy in the open world. I will present our recent work leveraging real-world, synthetic, and web data to train foundation models for humanoid robots. Furthermore, I will discuss the opportunities and challenges of building the next generation of intelligent robots.[ Carnegie Mellon University Robotics Institute ]
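A side note on the vision-versus-touch item above: the authors describe training separate expert policies per modality and combining their action predictions at the policy level. As a generic illustration of that idea (not their released code, and with made-up dimensions), a small gating network can weight each expert's proposed action based on the current observations.
```python
# Generic sketch of policy-level fusion: each sensing modality gets its own
# expert policy, and a learned gate mixes their action proposals. Dimensions
# and architectures are placeholders, not the authors' implementation.
import torch
import torch.nn as nn

VISION_DIM, TOUCH_DIM, ACT_DIM = 64, 16, 7   # placeholder feature/action sizes

vision_expert = nn.Sequential(nn.Linear(VISION_DIM, 128), nn.ReLU(),
                              nn.Linear(128, ACT_DIM))
touch_expert = nn.Sequential(nn.Linear(TOUCH_DIM, 128), nn.ReLU(),
                             nn.Linear(128, ACT_DIM))
gate = nn.Sequential(nn.Linear(VISION_DIM + TOUCH_DIM, 64), nn.ReLU(),
                     nn.Linear(64, 2), nn.Softmax(dim=-1))

def fused_action(vision_feat: torch.Tensor, touch_feat: torch.Tensor) -> torch.Tensor:
    """Weight each expert's action by how much the gate trusts its modality."""
    w = gate(torch.cat([vision_feat, touch_feat], dim=-1))          # (batch, 2)
    proposals = torch.stack([vision_expert(vision_feat),
                             touch_expert(touch_feat)], dim=1)      # (batch, 2, ACT_DIM)
    return (w.unsqueeze(-1) * proposals).sum(dim=1)                 # (batch, ACT_DIM)

# Deep inside the backpack, a trained gate should learn to lean on the touch expert.
print(fused_action(torch.randn(1, VISION_DIM), torch.randn(1, TOUCH_DIM)).shape)
```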
- Inside Hyundai’s Massive Metaplant by Lawrence Ulrich on November 5, 2025 at 2:00 pm
When I traveled to Ellabell, Ga., in May to report on Hyundai Motor Group’s hyperefficient Metaplant—a US $12.6 billion boost to U.S.-based manufacturing of EVs and batteries—the company’s timing appeared solid. At this temple of leading-edge factory tech, Ioniq 5 and Ioniq 9 SUVs marched along surgically spotless assembly lines, giving the South Korean automaker a defensible bulwark against the Trump administration’s tariffs and onshoring fervor.But dark clouds were already gathering. Consumer adoption of EVs had started slowing. The U.S. federal government’s $7,500 clean-car tax credit, which had helped hundreds of thousands of people make the leap to EVs, was being phased out. Held securely on a yellow jig, a three-row Ioniq 9 SUV glides from station to station in the assembly hall. A view from below shows its generous, 110.3-kilowatt-hour battery pack, which, as in most EVs, sits below the floor of the car. The pack, which is shielded to prevent or limit damage in a collision, is part of an advanced 800-volt architecture for ultrafast DC charging. Christopher Payne/EstoNear the Savannah-area factory, I drove a smartly designed Ioniq 9, a three-row SUV tailored to the United States’ plus-size tastes. I also saw a battery plant taking shape: a $4.3 billion joint venture between Hyundai and LG Energy Solution, on track to produce lithium-ion cells for Hyundai, Kia, and Genesis models in 2026. That facility is one of 11 low-roofed buildings that encompass 697,000 square meters (70 hectares), their pale green walls designed to blend into the Georgia countryside. Backed by $2.1 billion in state subsidies, the Metaplant is the largest public development project in Georgia’s history. Covering 70 hectares, it is the centerpiece of Hyundai’s $12.6 billion total investment in the state, including the battery factory built with LG Energy Solution that ICE and other agents raided in September. Christopher Payne/EstoThat battery plant made headlines in September, when U.S. Immigration and Customs Enforcement (ICE) agents staged a workplace raid that led to more than 300 South Korean workers being detained and deported.The episode highlighted the transnational cooperation—and tensions—inherent in importing a leading-edge manufacturing operation, a duality that might be familiar to anyone old enough to recall Japan’s game-changing entry into the U.S. automobile market in the 1970s and ’80s. The Metaplant is the largest publicly backed project in Georgia’s history. Its creation was accelerated by the Biden administration’s pro-EV policies, and it was also the centerpiece of Republican Gov. Brian Kemp’s bid to make his state “the electric mobility capital of the country.” Now, it was suddenly the latest flashpoint in an ongoing culture-and-trade war.Automakers roll with the punches because they have no choice An automated guided vehicle (AGV) prepares to pick up a rack of windshields from an automated trailer unloader, for “just in time” delivery to an assembly line where Ioniq 5 EVs are being built. There is no human intervention from the time parts arrive at the Metaplant’s loading docks to their installation. Christopher Payne/Esto Robots perform myriad tasks, yet human hands are still best for precision work. Jerry Roach, the Metaplant’s assembly manager, says, “I want my people doing craftsmanship. 
I want to pay people well for the things humans do well, and take away the stuff that’s tedious and boring.” Christopher Payne/EstoAs with other EV makers facing hurricane-force headwinds, including the U.S. rollback of pollution and fuel-economy rules, Hyundai has chosen to forge ahead with its long-laid plans. Company executives call the Metaplant North America’s most automated car factory and the most advanced full-scale factory among Hyundai Motor Co.’s 12 global manufacturing facilities. It rivals or surpasses Japan’s most advanced plants, such as the best operated by Toyota. Compared with the near-Dickensian Detroit auto factory that I toiled at in the 1980s, the stunning facility is a veritable MOMA: a modern museum of manufacturing art.To have any chance of one-upping China, car factories elsewhere must become hyperefficient, which includes enlisting armies of AI-controlled robots—robots that can potentially work 24/7 and never ask for a raise or a lunch break.The factory may eventually employ 8,500 people directly, and 7,000 satellite workers, for an annual capacity of 500,000 cars—more than Tesla’s Texas Gigafactory but less than Tesla’s Shanghai plant. This past summer, just 1,340 humans were sufficient to send a constant stream of two Ioniq models down these gleaming assembly lines. The “Meta Pros” working on those lines were earning on average $58,100 a year, which is 35 percent higher than the average in Bryan County, Ga.Clearly the days of Ford’s River Rouge complex, which employed more than 100,000 in the 1930s, are gone. As in many new factories, you’ll see surprisingly few people beyond the assembly line itself. During my visit, I spotted less than two dozen in a cavernous welding hall, where 475 robots were piecing together car chassis in a whirling, metallic dance. A steel stamping plant was so quiet that no ear protection was required, even as robots stamped out roofs and other body panels, and then stowed them in overhead racks.Outside, human workers parked their cars beneath solar roofs that generate up to 5 percent of the plant’s electricity. Meanwhile, a fleet of 21 hydrogen fuel-cell trucks, from the Hyundai-owned Xcient, carries parts from suppliers, emitting zero tailpipe emissions. The automaker’s goal is to obtain 100 percent of the Metaplant’s energy from renewables by 2030. An Ioniq 9 body-in-white, the basic steel skeleton of an automobile, leaves the “main buck” section of the body build line. This line is where the vehicle’s floor and sides meet to form a recognizable car. The line adapts to changing production mixes to meet customer orders, with built-in flexibility to assemble future models.Christopher Payne/Esto Sparks fly as welding robots piece together the Ioniq 9’s “body-in-white,” the industry term for the basic steel skeleton of a car, prior to the addition of subassemblies such as the suspension, power train, body trim, and interior. The Metaplant’s welding shop houses about 500 industrial robots.Christopher Payne/Esto Robotic welders have revolutionized car manufacturing, joining the parts of an auto body with levels of speed, precision, and safety that humans can’t match. Such advantages reduce labor costs and scrapped materials. Hyundai is also now experimenting with humanoid robots to perform welding tasks.Christopher Payne/Esto “Body-complete” robots mount front doors onto Ioniq 5s, using machine vision and laser-measurement systems to ensure an exact fit of movable panels on each body. 
The robots also install mounting bolts to exact torque specifications, all validated to ensure their work meets safety and quality standards.Christopher Payne/EstoSmart, silent robots unload trucksWhen those trucks roll into docks at the Metaplant, some of the factory’s 850 robots promptly unload their parts. About 300 automated guided vehicles, or AGVs, glide silently across the factory floor with no tracks required, trained to smartly stop for humans. An AGV rolls beneath a finished Hyundai, squeezes the wheels in its robotic arms, then swiftly hoists and ferries the car where it needs to go. A companion AGV further down the line executes the exact same moves. I’ve never seen so many robotic sleds like these, or a tag team move with more efficiency and grace. Within an AI-based procurement-and-logistics system, the AGVs allocate and deliver parts to workstations for “just in time” delivery, avoiding the wasted time, space, and money that come with stockpiling components. An automated guided vehicle ferries dashboards for the Hyundai Ioniq 9 SUV, including each dashboard’s pair of 30-centimeter display screens. AGVs are programmed to navigate the factory, using cameras and sensors to slow or stop to avoid collisions, and emit spoken warnings to human workers in their path.Christopher Payne/Esto“They’re delivering the right parts to the right station at the right time, so you’re no longer relying on people to make those decisions,” says Jerry Roach, senior manager of general assembly at the Metaplant.Roach prefers that his skilled humans focus on craftsmanship, doing jobs with tactile precision that only human hands and vision can accomplish. The idea is to free people from those elements of factory work that are physically taxing, unfulfilling, and, well, robotic, so workers can use their brains and take pride in their specialized skills. Left: Adjustable-height carriers elevate an Ioniq 5 for easy access to the central fasteners and plugs that will position suspension components and the high-voltage battery, prior to the “marriage” between the upper and lower sections of the vehicle. Those carriers provide flexibility for automated functions and manual operations by the human workers at the plant (whom Hyundai calls Meta Pros). Right: On the final assembly line, an Ioniq 9’s “top hat”—including body panels—is married to the lower “skateboard” structure, which includes the electric motors, battery, and suspension. A finished car then undergoes various tests, including a water bath to check for leaks and a quick road test outdoors. Christopher Payne/EstoRobots, Roach says, are best tasked with heavy lifting and repetitive tasks, or those that demand digitized speed and accuracy. One example is a “collaborative” robot, sophisticated enough to work safely in close proximity to people, despite its mammoth strength. For the first time at a Hyundai factory, such a robot is installing bulky, heavy doors on the assembly line—a notoriously tricky task to perform without scratching the glossy paint or damaging surrounding panels. Hyundai is proud of its collaborative robots, including one that can precisely install a heavy door, a tricky task for humans to perform without damaging the panels. Those robots require advanced control systems so that they can work alongside human workers without needing to be fenced off or otherwise isolated.Christopher Payne/Esto“Guess what? Robots do that perfectly, always putting the door in the exact same place,” Roach says.
“So here, that technology makes sense.”Man’s best friend, or its mechanical counterparts, stroll the factory floor: Spot, the robotic quadrupeds from Hyundai-owned Boston Dynamics, use camera vision, sensors, and what Boston Dynamics calls “athletic intelligence” to sniff out potential welding defects. Spot, the robot dog designed by Hyundai-owned Boston Dynamics, inspects body welds on an Ioniq 5 for defects. Equipped with a sensor suite, the quadruped bot can recharge autonomously, dynamically work around fixed or moving obstacles, and get back on its feet if it falls. Christopher Payne/EstoThose four-legged bots may soon have a biped master: Atlas, the humanoid robot, also from Boston Dynamics. The humanoid’s physical dexterity is uncanny, with a 360-degree swiveling head that allows it to walk forward and backward without turning its body. One look at these Atlases crawling, cartwheeling, or breakdancing during testing and you might reasonably conclude they’re a potential Terminator of jobs. Hyundai executives insist that’s not the case, even as they plan to put Atlases to work in their global factories. Boston Dynamics is training these robots to sense their environments and manipulate and move parts in complex sequences. At this backup station, high-voltage battery fasteners can be installed in an Ioniq 5. The station ensures that the assembly line keeps running even if an automated production system requires servicing. Christopher Payne/EstoFrom nearby Interstate 16, Georgia drivers can see freshly painted Ioniq 5s and 9s moving along a conveyor on a windowed bridge—an intentional glimpse of what’s happening inside. They can also see their tax dollars at work, after $2.1 billion in state subsidies. Hyundai is already building a second battery plant in Georgia, and a steel plant in Louisiana, part of an expanded pledge of $21 billion in U.S. investment through 2028. After their frames are fully welded, Ioniq 5s move along a conveyor [in the background] to an environmentally friendly paint shop. From there, the cars will travel along an elevated bridge, visible from nearby Interstate 16 in Ellabell, Ga., toward final assembly.Christopher Payne/Esto An Ioniq 5 arrives at its final inspection station. Immediately after, a human driver gets to drive the pristine car for the first time, on a test track just outside the factory. The first Ioniq 5 rolled off the Metaplant line on 3 October 2024, with the larger Ioniq 9 kicking off production in March 2025. Christopher Payne/EstoIn a suddenly inhospitable climate for EVs, there’s nothing automatic about building and selling the cars. But Hyundai and other automakers will keep trying. They don’t have any other choice.This article appears in the December 2025 print issue as “Inside Hyundai’s Massive Metaplant.”
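For readers wondering what the AGVs' "just in time" dispatching amounts to in software, here is a deliberately simplified sketch: assign each line-side parts request to the nearest idle vehicle. It is a generic illustration, not Hyundai's logistics system, and the part names, stations, and coordinates are invented.
```python
# Toy "just in time" dispatcher: send each parts request to the nearest idle
# AGV. Purely illustrative; stations, parts, and coordinates are invented.
import math

idle_agvs = {"AGV-1": (0, 0), "AGV-2": (40, 10), "AGV-3": (15, 30)}  # (x, y) positions
requests = [
    {"part": "windshield rack", "station": (38, 12)},
    {"part": "dashboard", "station": (2, 28)},
]

def dispatch(agvs: dict, jobs: list) -> list:
    assignments = []
    available = dict(agvs)
    for job in jobs:
        # pick the closest idle vehicle for this delivery
        name = min(available, key=lambda a: math.dist(available[a], job["station"]))
        assignments.append((name, job["part"]))
        del available[name]           # that AGV is now busy
    return assignments

for agv, part in dispatch(idle_agvs, requests):
    print(f"{agv} -> deliver {part}")
```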
- Robotic Fish Zips Through Water With Flexible Electromagnetic Fin by Michelle Hampson on November 4, 2025 at 4:00 pm
This article is part of our exclusive IEEE Journal Watch series in partnership with IEEE Xplore.Fish are able to dart quickly through the water and turn on a dime with a flick of their tail. Researchers have been trying to achieve similar results with aquatic robots. In fact, one group in China has made progress using a flexible electromagnetic fin that propels an underwater robot at 405 millimeters—or 1.66 body lengths—per second. The team’s robotic swimmer can also make turns over just a 0.86 body-length radius.Fanghao Zhou, an assistant professor at the State Key Laboratory of Ocean Sensing at Zhejiang University in Zhejiang, China, helped guide the research. Zhou notes that fish are agile, efficient, and adaptive—and robotically mimicking these qualities is a challenge.“Traditional robotic fins powered by motors can generate strong thrust, but they’re often bulky and rigid,” he says. “Soft actuators, on the other hand, are flexible but usually too weak to be practical. Our goal was to combine the best of both fields—a compact actuator that’s powerful yet flexible, like real muscle.”Making a New Kind of FinSo the research team designed a flexible electromagnetic fin with an elastic joint that swishes back and forth with little friction. It’s built with two small coils and spherical magnets. When alternating current flows through the coils, it creates an oscillating magnetic field that makes the fin flap back and forth, much like a fish’s tail. When the magnetic field isn’t oscillating, the fin returns to a neutral position at rest. In their study, the researchers tested their bionic fin in a pool. Zhe Wang, a Ph.D. student in Zhou’s lab, emphasizes that the team not only successfully piloted the bionic fin in water, but they also built a mathematical model connecting electrical input to hydrodynamic thrust output. “That means we can predict how the fin will behave underwater just from the input current, which is rare in soft robotics,” he says. A new robotic fish design reveals different swimming behaviors at different fin oscillation speeds. Zhe Wang et al. In their experiments, the researchers used a high-speed camera and precision force sensor to measure the trajectory of the fin and the thrust it generated—achieving a peak thrust of 0.493 newtons, despite the fin weighing just 17 grams. Zhou notes the robotic system is small, lightweight, and powerful, and it will also be easy to scale into multi-fin systems. However, he adds that the current design consumes a lot of energy. “The electromagnetic coils draw a lot of current, so the swimming duration is relatively short,” he explains. “We are exploring ways to reduce energy loss, for example, [by] optimizing coil geometry, using energy recovery circuits, and applying smart control strategies that don’t require continuous excitation.”The researchers anticipate this robotic system could have a range of applications, including perhaps in underwater exploration, ecological monitoring, and inspection—such as safely interacting with coral reefs and marine life.“Our next step is to study multi-fin coordinated motion, enabling the robot to perform more flexible and lifelike swimming behaviors,” Wang says. “We are also exploring ways to improve energy efficiency, extend operation time, and further miniaturize the system for small autonomous underwater platforms.”The researchers’ bionic fin is described in a study published 4 September in IEEE Robotics and Automation Letters.
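The paper's model maps coil current to hydrodynamic thrust; as a much cruder stand-in that shows why an alternating current makes the fin flap, here is a driven torsional-oscillator sketch. The torque constant, joint stiffness, damping, and inertia values are invented placeholders, not the fin's measured parameters.
```python
# Toy model of an electromagnetically driven fin: alternating coil current puts
# an oscillating torque on the magnets, while the elastic joint acts as a
# spring-damper pulling the fin back to neutral. All parameters are invented
# placeholders, not values identified in the paper.
import math

KT = 0.002       # N*m per ampere (assumed coil torque constant)
K_SPRING = 0.01  # N*m per radian (assumed joint stiffness)
DAMPING = 5e-4   # N*m*s per radian (assumed hydrodynamic + material damping)
INERTIA = 1e-5   # kg*m^2 (assumed fin moment of inertia)
FREQ = 5.0       # Hz, drive frequency
I_AMP = 1.0      # A, current amplitude
DT = 1e-4        # s, integration step

theta, omega, peak, t = 0.0, 0.0, 0.0, 0.0
while t < 2.0:                                    # simulate two seconds
    current = I_AMP * math.sin(2 * math.pi * FREQ * t)
    torque = KT * current - K_SPRING * theta - DAMPING * omega
    omega += (torque / INERTIA) * DT              # explicit Euler integration
    theta += omega * DT
    peak = max(peak, abs(theta))
    t += DT

print(f"peak flap angle: {math.degrees(peak):.1f} degrees")
```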
- This Professor’s Open-Source Robots Make STEM More Inclusive by Novid Parsi on November 4, 2025 at 3:00 pm
As an electrical engineering student in the 1980s and ’90s, Carlotta Berry had two experiences that helped shape her future as an educator.First, while she studied robots, she wasn’t allowed to interact with them. “The robots were too expensive, so the undergrads did not get to touch them,” Berry recalls. “I said to myself, I’m going to teach engineering someday, but in a way that the students will get to touch and program the robot.”This led Berry to work toward overcoming the economic exclusivity of robotics. But her second formative undergrad experience involved a different type of exclusion: Berry was one of only a few engineering students who were female or Black. “It sometimes could be a lonely experience,” says Berry. “Representation does matter.”Now, Berry is a professor in the electrical and computer engineering department at Rose-Hulman Institute of Technology, where her students learn about human-robot interactions and mobile robotics by using actual robots. Berry works on her first open-source modular 3D-printed robot, the LilyBot, with Rose-Hulman engineering students Murari Srinivasan (left) and Josiah McGee (right). Bryan Cantwell/Rose-Hulman Institute of TechnologyShe also works to support people of color in engineering. Almost three decades after she graduated, Berry realized little progress had been made when she heard Black women grad students describe feeling isolated and marginalized during an online engineering conference in 2020. “This was exactly how I felt 30 years ago,” says Berry, noting that today only about 8 percent of electronics engineers are women and about 5 percent are Black. “It was time for something to change.”Berry’s Path to TeachingAs a child in Nashville, Berry excelled at school—especially math—and thought she’d become a math teacher. But in high school, a mentor suggested that Berry consider engineering, given her strong grades in both math and science. “I didn’t really know what an engineer was,” she recalls. “I didn’t know anyone who was an engineer.” After learning about the profession at a library, Berry decided to study both engineering and math in college. In 1993, Berry earned a bachelor’s in electrical engineering at the Georgia Institute of Technology as part of a dual degree program with Spelman College, where she earned a bachelor’s in mathematics in 1992.After her bachelor’s degrees, Berry worked as a control engineer for Ford Motor Company, where she programmed assembly-line industrial robots, but she found herself yearning to answer her true calling as an educator. So, she returned to academia and got a master’s in electrical engineering and control systems at Wayne State University in 1996. Saddled with student loan debt, however, Berry then accepted a position as a control engineer for Detroit Edison. “I really enjoyed the work but once again realized I was not doing what I was meant to do,” she says. After a year at Detroit Edison, she left in pursuit of her Ph.D. in electrical and computer engineering, which she earned at Vanderbilt University in 2003. As a grad student, Berry taught at a technical school—and at last found herself on the right career path: “I always wanted to be an educator,” she says. A Turn Toward OutreachBerry traces her community-outreach work to two more pivotal moments in her career: In 2018, she became a full professor at Rose-Hulman, and in 2020, she became an endowed chair in the electrical and computer engineering department. 
Berry says her tenure and position at Rose-Hulman enabled her to pursue work that brings her research, teaching, and service interests together. Berry hopes to support women of color in STEM through public events. Here, she sits with students Liz Francois and Janae Gillus, both members of Rose-Hulman’s chapter of the National Society of Black Engineers.Griffin Museum of Science and Industry“As a full professor, I don’t have to worry that someone might consider the [outreach] work I do not as important as my technical robotics work,” she says. “When I provide education for students and for the community, that’s also part of my research and service.” For Berry, research and service are not separate but intertwined subject areas: Her research involves designing open-source, low-cost mobile robots to promote more inclusive robotics education.Since 2020, Berry has helped transform how electrical and computer engineering is taught and perceived. She has been teaching hands-on, interactive robotics not only to her students at Rose-Hulman but also to kids and adults across the nation. Berry has been taking her robots, as she says, “to the streets.”Berry demonstrates and discusses her open-source, 3D-printed wheeled robots at schools, libraries, museums, and other community venues. Her audiences range from kids just a few years old to adult educators who learn about robotics from Berry so they can teach the subject to their own students. To spread the word about robotics and STEM, Berry also has become active on social media, overcoming her innate introversion because, she explains, “visibility matters.”With any audience, Berry is always “very approachable and very engaging,” says Nicki Manion, a program manager for Rose-Hulman’s educational outreach who collaborates with Berry on professional development workshops for teachers.“I have to go where people are,” Berry says. “I get robots in front of people who are historically marginalized and would normally not have access to these technologies.”This past summer, for example, Berry shared her robots with children from about 3 to 10 years old at the dozens of branches of the Indianapolis Public Library. To understand the three main pillars of robotics—sense, plan, act—the kids learned how the robots use a sonar, microphone, and speaker in order to see, hear, and talk. Notably, at the end of each presentation, the kids got to play and interact with the robots.Last year, as part of an IEEE Education Society Initiative, Berry brought her robots to the streets globally. After grad students in countries such as Costa Rica, Niger, and Uganda received parts in the mail, Berry showed them the basics of building and programming robots.Online Community and WritingBerry hasn’t set out on her pedagogic journey all on her own, she says. In 2020, she cofounded Black in Engineering and Black in Robotics—part of the Black in X network comprising more than 80 organizations that support the work of Black professionals in STEM. For Berry, it’s no coincidence that Black in X emerged early in the pandemic. 
“There were a lot of bad things about the pandemic, but because we were all home and on social media, we were able to connect and find each other and form these organizations that, five years later, are still going,” she says.Her professional turning point toward more community-oriented service has led to several accolades, she says: “That was when I started to earn these awards I had never been considered for before.” In 2023, the IEEE Robotics and Automation Society awarded Berry the prestigious Undergraduate Teaching Award for her contributions to multidisciplinary robotics education and leadership in diversifying STEM. She has also been recognized by the Society of Women Engineers and AnitaB.org. Children’s books like the series Berry wrote help get kids interested in STEM.Rebellion LITOn top of her outreach and community work, Berry finds time to write children’s books—work that also has its roots in the pandemic. During that time, Berry woke up from a dream and remembered only the title of a children’s book she knew she had to write: There’s a Robot in My Closet. The book spawned a series, which features kid protagonists learning how to program robots and developing their problem-solving skills. (Berry also writes STEM-centered romance novels for adults under the pseudonym Carlotta Ardell. The heroine of her book Elevated Inferno, Berry says, struggles with the expectation to flawlessly juggle work and life—an expectation that falls more heavily on women, she finds.) While balancing her many personal and professional interests, Berry says, she maintains a clear-eyed pursuit of her professional mission: helping people of diverse backgrounds “see themselves as not just consumers of technology but creators of technology.”
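The sense-plan-act framing Berry teaches maps neatly onto a few lines of code. Here is a generic sketch of the idea (not the firmware on her open-source robots), with the sonar, planner, and speaker stubbed out.
```python
# Generic sense-plan-act loop of the kind used to teach the "three pillars"
# mentioned above. Hardware calls are stubbed out; this is not the code
# running on Berry's open-source robots.
import random
import time

def sense() -> float:
    """Pretend sonar reading: distance to the nearest obstacle, in centimeters."""
    return random.uniform(5, 100)       # stub standing in for a real sensor

def plan(distance_cm: float) -> str:
    """Decide what to do based on what was sensed."""
    return "turn away" if distance_cm < 20 else "drive forward"

def act(command: str) -> None:
    """Pretend motor and speaker output."""
    print(f"robot action: {command}")

for _ in range(5):                      # a real robot would loop continuously
    act(plan(sense()))
    time.sleep(0.1)
```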
- A Challenge to Roboticists: My Humanoid Olympics by Benjie Holson on November 4, 2025 at 1:00 pm
I was a little disappointed by China’s World Humanoid Robot Games.1 As fun as real-life Rock ‘Em Sock ‘Em Robots is, what people really care about is robots doing their chores. This is why robot laundry-folding videos are so popular: We didn’t know how to do that even a few years ago. And it is certainly something that people want! But as this article so nicely articulates, basic laundry folding is in a sweet spot given the techniques we have now. It might feel like, if our AI techniques can fold laundry, maybe they can do anything—but that isn’t true, and we’re going to have to invent new techniques to be really general-purpose and useful.With that in mind, I am issuing a challenge to roboticists: Here are my Humanoid Olympic events. Each event will require us to push the state-of-the-art and unlock new capabilities in robotic manipulation. I will update my Substack as folks achieve these milestones and will mail actual, real-life medals to the winners.Current State-of-the-ArtIn order to talk about why each of these challenges pushes the state-of-the-art in robotic manipulation, let’s first talk about what’s working now. What I’m seeing working is learning from demonstration. Generally, folks are using puppeteering interfaces. Most common seems to be two copies of the robot so that a human can grab and move one of them while the other follows, or a virtual reality headset with controllers for hand tracking. They then record some 10- to 30-second activity hundreds of times over. From that data, a neural network is trained to mimic those examples. This technique has unlocked tasks that have steps that are somewhat chaotic (like pulling a corner of a towel to get it to lie flat) or have a high state space (like how a towel can be bunched up in many different ways).When thinking about this method of training robots to do things, it should be clear what some of the limitations are. Each of these has exceptions, but together they form a general trend.No force feedback at the wrists.2 The robot can only ever perform as well as the human teleoperating it, but we don’t yet have good standardized ways of getting high-resolution force information to the human teleoperator.Limited finger control.3 It’s hard for the teleoperator (and for a foundation model) to see and control all of a robot’s fingers with much more finesse than just opening and closing them.No sense of touch.4 Human hands are absolutely packed full of sensors. Getting anywhere near that kind of sensing out of robot hands in a way that’s usable by a human teleoperator is not currently possible.Medium precision.5 Based on videos I’ve seen, I think we’ve got about 1-3 centimeter precision for tasks.Now, on to the events!Event 1: DoorsEvent 2: LaundryEvent 3: ToolsEvent 4: Fingertip ManipulationEvent 5: Wet ManipulationEvent 1: DoorsThings like doors are tricky because of the asymmetric forces: You need to grasp and twist the handle or knob quite hard, but if you pull hard outside of the arc of the door, you tend to slip your grasp. Also, moving through a door requires whole-body manipulation, which is more than I’ve seen from anyone yet.Bronze Medal: Entering a round-knob push door Benjie HolsonI think this is very close to state-of-the-art (or maybe it has happened and I didn’t see it). I expect this medal to be claimed by December.Silver Medal: Entering a lever-handle self-closing push door
Now, on to the events!

Event 1: Doors
Event 2: Laundry
Event 3: Tools
Event 4: Fingertip Manipulation
Event 5: Wet Manipulation

Event 1: Doors

Things like doors are tricky because of the asymmetric forces: You need to grasp and twist the handle or knob quite hard, but if you pull hard outside of the arc of the door, you tend to slip your grasp. Also, moving through a door requires whole-body manipulation, which is more than I’ve seen from anyone yet.

Bronze Medal: Entering a round-knob push door [Video: Benjie Holson]
I think this is very close to state-of-the-art (or maybe it has happened and I didn’t see it). I expect this medal to be claimed by December.

Silver Medal: Entering a lever-handle self-closing push door [Video: Benjie Holson]
Adding self-closing makes this significantly more challenging because of the force involved, though the lever handle is arguably easier (I just don’t see many round-knob self-closing doors).6

Gold Medal: Entering a lever-handle self-closing pull door [Video: Benjie Holson]
The boss fight of doors.7 You need to either use a second limb to block the door from re-closing, or go through the door fast enough to use dynamics.

Event 2: Laundry

We’re just getting started on laundry.

Bronze Medal: Fold an inside-out T-shirt [Video: Benjie Holson]
This is probably doable using the techniques we have now, but it’s a longer-horizon task and might require some tricky two-handed actions to pull the shirt through to right-side-out.8

Silver Medal: Turn a sock inside-out [Video: Benjie Holson]
I think both the hand insertion and the action of pinching the inside of the sock are interesting new challenges.

Gold Medal: Hang a men’s dress shirt [Video: Benjie Holson]
The size medium shirt starts unbuttoned with one sleeve inside-out. It must end up on the hanger correctly with the sleeve fixed and at least one button buttoned. I think this one is 3 to 10 years out, both because buttons are really hard and because getting a strong, dexterous hand small enough to fit into a sleeve is going to be hard.

Event 3: Tools

Humans are creatures of technology, and as useful as our hands are, we mostly use them to hold and manipulate tools. This challenge is about building the strength and dexterity to use basic tools.

Bronze Medal: Window cleaner and paper towels [Video: Benjie Holson]
The window-cleaning fluid bottle is super-forgiving in terms of how you grasp it, but you do need to independently articulate a finger, and the finger has to be pretty strong to get fluid to spray out.9

Silver Medal: Peanut butter sandwiches [Video: Benjie Holson]
The challenge here is to pick up a knife and then adjust the grasp to be strong and stable enough to scoop and spread the peanut butter. Humans use a “strong tool grasp” for all kinds of activities, but it’s very challenging for robot grippers.10

Gold Medal: Use a key [Video: Benjie Holson]
A key ring with at least two keys and a keychain is dropped into the robot’s waiting palm or gripper. Without putting the keys down, get the correct key aligned, inserted, and turned in a lock. This requires very challenging in-hand manipulation, along with high-precision forceful interaction.

Event 4: Fingertip Manipulation

We humans do all kinds of in-hand manipulation, using the structure of our hands to manipulate things that we are holding.

Bronze Medal: Roll matched socks [Video: Benjie Holson]
Requires dexterity and precision, but not very much force.

Silver Medal: Use a dog poop bag [Video: Benjie Holson]
When I use a dog bag, I have to do a slide-between-the-fingertips action to separate the opening of the bag, which is a tricky, forceful interaction as well as a motion that I’m not even sure most robot hands are capable of. Also tricky is tearing off a single bag rather than pulling a big long spool out of the holder, if you choose to use one.11

Gold Medal: Peel an orange [Video: Benjie Holson]
Done without external tools. This is super tricky: high-force yet high-precision fingertip actions.

Event 5: Wet Manipulation

If you sit down and write out what you might want a robot to do for you, a lot of tasks end up being kind of wet. Robots usually don’t like being wet, but we’ll have to change that if we want them to clean for us. And wet things can be difficult to grasp and use.

Bronze Medal: Wipe a countertop with a sponge [Video: Benjie Holson]
Mildly damp, but with the exciting risk of getting the whole hand in the water if you aren’t careful. Probably requires at least splash-resistant hands (or a whole bunch of spares).

Silver Medal: Clean peanut butter off your manipulator [Video: Benjie Holson]
This one naturally follows the sandwich one. Water everywhere. Seems like an important skill to have after a few hours collecting training data on the dog-poop task.

Gold Medal: Use a sponge to wash grease off a pan in a sink [Video: Benjie Holson]
Water, soap, grease, and an unpleasant task no one wants to do.

Terms and Conditions

To be eligible to win, a general-purpose manipulator robot running autonomously must demonstrate successful task completion in a real-time video with no cuts. You are allowed a maximum of 10x the time I took to do each task (a 4-second task can take your robot up to 40 seconds). I reserve the right to be arbitrary in deciding whether something is in the spirit of the challenge. The first robot to achieve each milestone wins the prize!

To claim your medallion, email bmholson+olympics@gmail.com with an address for me to ship it to. If you give me a photo of your robot wearing a medal, I will be tickled pink. I will also accept future challengers that are at least 25 percent faster than the current winner. Some medals have already been claimed; you can see the winning videos here. Good luck, and may the odds be ever in your favor.

Thanks to Jeff Bingham for advice, fact-checking, and cool robot videos. And thanks to my patient wife for spending an hour filming me doing silly things in a silly costume.

Notes

1. As far as I can tell, kickboxing was just the Unitree mini-humanoid robot, and everyone had the same code running, so… I guess it won?
2. TRI has some pretty cool stuff with force control using a big training rig.
3. Tesla’s Optimus has 22 degrees of freedom using cable drives (because you can’t fit those motors in a hand). In 2008 I worked on this robot, which also had 22 degrees of freedom, and controlling it was crazy hard (as was keeping all the cables correctly tensioned). The other hand was a big two-finger gripper, which I ended up using for most teleop tasks.
4. Meta has been working with some in-finger vision systems, which seem cool.
5. This is likely more a teleoperation precision limitation than a model limitation. Here is a video of Generalist Robotics doing sub-centimeter precision tasks. I love that hockey sticks have become the traditional “mess with a robot” tool, even for ridiculous things like this.
6. Yes, I did wear this at my workplace in order to get this video. You’re welcome.
7. I have programmed (not trained) a general-purpose mobile manipulator to pass through a self-closing pull door, but it took over 4 minutes (disqualified for taking too long) and required a special doorstop. Also, the video isn’t public (also disqualified). Also, it’s really tacky to put up a competition and award yourself gold before it even starts.
8. T-shirt starts fully inside-out in a wad. Finishes tolerably folded, right-side-out.
9. You must spray three good spritzes on the window, and wipe them up with paper towels so there are no ugly streaks. Paper towels start on the paper towel roll, not pre-torn and pre-wadded.
10. Peanut butter jar starts and ends closed. Sandwich should be cut in half. (Triangle or rectangular cuts are both acceptable, though your 3-year-old might disagree.)
11. Mock poo allowed. Bag starts on the roll but can be in a standard dog-bag holder, held by the robot.

This post originally appeared on General Robots, Benjie Holson’s Substack about making a general-purpose robot company.
- Spider-Inspired Microbots Could Replace Invasive Gut Diagnostics by Tereza Pultarova on November 2, 2025 at 2:00 pm
Deadly intestinal cancers are on the rise, and the best chance to beat them is with an early diagnosis. But current techniques used to inspect the digestive tract are highly invasive, scaring many patients away. Some researchers hope that soft, magnetically controlled robots the size of a vitamin capsule could replace these diagnostic methods in a few short years.

A team led by Qingsong Xu, a professor of electromechanical engineering at the University of Macau, in China, recently unveiled a micro-robot prototype inspired by the locomotion of an African spider that cartwheels across the desert dunes of Namibia instead of crawling. The robot, made of a rubber-like magnetic material, has been tested in animal stomachs, colons, and small intestines. The researchers said it successfully navigated the “complex environment” of the digestive tract, full of mucus, sharp turns, and obstacles as high as 8 centimeters.

Today’s procedures use endoscopes, flexible tubes fitted with a camera that doctors insert into the patient’s digestive tract through the mouth or rectum. The procedure requires sedation because of the discomfort it causes, and improper manipulation of the endoscope can cause serious injuries, including bowel perforation. Some patients might delay the procedure out of fear, which could have catastrophic consequences, as cancer might spread. Other diseases, such as stomach ulcers and Crohn’s disease, are also diagnosed with endoscopy.

“Traditional endoscopes cause a lot of discomfort and cannot easily access complex deeper regions inside the body,” Xu told IEEE Spectrum. “The purpose of the soft magnetic robot is to provide a minimally invasive, controllable, and highly flexible alternative.”

How the Robot Moves

Soft magnetic robots, like the one developed by Xu, offer a more palatable alternative to endoscopy. The robot, the size of a large vitamin capsule, could be swallowed with relative ease and pass through the stomach and the entire length of the small and large intestines, propelled by an externally applied magnetic field. The robotic mini-spider could perform detailed inspections of the complicated terrain without bothering the patient too much. At the end of its journey, it passes out of the body just like processed food.

Other teams have experimented with various types of robot locomotion, including crawling, jumping, and swimming, the researchers said in a paper describing the new robot, published in the International Journal of Extreme Manufacturing last month. Those earlier designs, however, had limitations when traversing such complex environments as the digestive tract.

[Figure: A series of images shows how the robot moves through the stomach; the rightmost column also demonstrates targeted drug delivery. Credit: Ruomeng Xu, Xianli Wang, et al.]

“We went for a design inspired by the golden wheel spider, because it provided superior obstacle-crossing ability and energy efficiency compared to other locomotion models,” Xu said. “By mimicking this type of locomotion in the patient, the robot can navigate in the mucus, in the folded and even vertically inclined surfaces, with remarkable stability.”

The golden wheel spider is a small arachnid, about 2 centimeters wide, that escapes danger by curling its legs around its body and rolling down the sloping desert dunes. The robot developed by the Macau team uses the same strategy, but instead of being propelled by gravity, it rolls through the digestive tract thanks to an external magnetic force that interacts with tiny magnets in the robot’s legs. To control the robot with precision, the researchers created a dexterous robotic arm fitted with a powerful rotating magnet that sits next to the patient during the procedure.
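[Editor’s aside: The paper’s actual controller isn’t reproduced here, but as a rough, hypothetical sketch of how externally actuated rolling might be commanded: a desired capsule rolling speed maps to a spin rate for the external magnet, while the magnet-holding arm is nudged to stay over the capsule. The capsule radius, slip factor, and function names below are illustrative assumptions, not values from the study.]

```python
# Hypothetical control sketch for magnet-driven rolling (illustrative assumptions only;
# not the published controller). The capsule rolls roughly in sync with the external
# rotating magnet, so a commanded linear speed maps to a magnet spin rate.

CAPSULE_RADIUS_M = 0.005          # assumed capsule radius of about 5 mm

def magnet_spin_rate(desired_speed_m_s: float, slip_factor: float = 0.8) -> float:
    """Angular rate (rad/s) for the external magnet so the capsule rolls at the
    desired speed, de-rated by an empirically tuned slip factor for mucus."""
    return desired_speed_m_s / (CAPSULE_RADIUS_M * slip_factor)

def arm_step(arm_xy, capsule_xy, gain=0.5):
    """Move the magnet-holding arm a fraction of the way toward the point directly
    over the capsule (capsule position assumed to come from external imaging)."""
    return tuple(a + gain * (c - a) for a, c in zip(arm_xy, capsule_xy))

# One control tick: roll at 2 mm/s while keeping the magnet over the capsule.
omega = magnet_spin_rate(0.002)                               # ~0.5 rad/s
arm_pos = arm_step(arm_xy=(0.10, 0.00), capsule_xy=(0.12, 0.01))
print(f"spin magnet at {omega:.2f} rad/s, move arm to {arm_pos}")
```

In practice such a loop would also need a way to localize the capsule inside the body, for example from imaging; the sketch only shows the shape of the computation.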
The Future of Gut Diagnostics

The researchers plan to perform further experiments with live animals and, if all goes well, move to clinical trials with humans. Xu hopes the soft spider robots could help doctors examine patients’ insides in as little as five years. “The medical community increasingly recognizes the potential of soft magnetic robots to revolutionize endoscopic procedures by minimizing patient discomfort and increasing precision,” said Xu. “There is a lot of interest in the medical world.”

In the not-so-distant future, advances in micro-robotics may enable targeted drug delivery for treating ulcers or tumors, and the tiny robots could also be used in a range of minimally invasive interventions and examinations. The field has grown rapidly in the past few years, and many research groups are exploring magnetically controlled robots, although none have yet reached clinical practice.

For example, a team from North Carolina State University recently presented another such robot, also made from a flexible magnetic material. Instead of cartwheeling, the North Carolina robot crawls through the digestive tract like a caterpillar; external magnetic forces induce contractions in its 3D-printed, origami-style structure. A paper published in the journal Advanced Functional Materials described experiments in which the robot delivered mock treatment to a mock stomach ulcer. Xiaomeng Fang, an assistant professor in materials engineering at North Carolina State University and lead author of the paper, told IEEE Spectrum the work has garnered a lot of interest. “These robots are soft and they can be controlled remotely,” she said. “They can also change their shape, which makes them very interesting for treatment of internal diseases.”