- Overcoming Core Engineering Barriers in Humanoid Robotics Development by Murata Manufacturing Co. on March 19, 2026 at 10:00 am
A technical examination of the sensing, motion control, power, and thermal challenges facing humanoid robotics engineers — with component-level design strategies for real-world deployment.

What Attendees Will Learn

- Why motion control remains the hardest unsolved problem — Explore the modelling complexity, real-time feedback requirements, and sensor fusion demands of maintaining stable bipedal locomotion across dynamic environments.
- How sensing architectures enable perception and safety — Understand the role of inertial measurement units, force/torque feedback, and tactile sensing in achieving reliable human-robot interaction and collision avoidance.
- What power and thermal constraints mean for system design — Examine the trade-offs in battery chemistry selection (LFP vs. NCA), DC/DC converter topologies, and thermal protection strategies that determine operational endurance.
- How the industry is transitioning from prototype to mass production — Learn about the shift toward modular architectures, cost-driven component selection, and supply chain readiness projected for the late 2020s.

Download this free whitepaper now!
- Video Friday: These Robots Were Born to Run by Evan Ackerman on March 13, 2026 at 4:00 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2026: 1–5 June 2026, VIENNA

Enjoy today’s videos!

All legged robots deployed “in the wild” to date were given a body plan that was predefined by human designers and could not be redefined in situ. The manual and permanent nature of this process has resulted in very few species of agile terrestrial robots beyond familiar four-limbed forms. Here, we introduce highly athletic modular building blocks and show how they enable the automatic design and rapid assembly of novel agile robots that can “hit the ground running” in unstructured outdoor environments.

[ Northwestern University Center for Robotics and Biosystems ] [ Paper ] via [ Gizmodo ]

If you were going to develop the ideal urban delivery robot more or less from scratch, it would be this.

[ RIVR ]

Don’t get me wrong, there are some clever things going on here, but I’m still having a lot of trouble seeing where the unique, sustainable value is for a humanoid robot performing these sorts of tasks.

[ Figure ]

One of those things that you don’t really think about as a human, but is actually pretty important.

[ Paper ] via [ ETH Zurich ]

We propose TRIP-Bag (Teleoperation, Recording, Intelligence in a Portable Bag), a portable, puppeteer-style teleoperation system fully contained within a commercial suitcase, as a practical solution for collecting high-fidelity manipulation data across varied settings.

[ KIMLAB ]

We propose an open-vocabulary semantic exploration system that enables robots to maintain consistent maps and efficiently locate (unseen) objects in semi-static real-world environments using LLM-guided reasoning.

[ TUM ]

That’s it folks, we have no need for real pandas anymore—if we ever did in the first place. Be honest, what has a panda done for you lately?

[ MagicLab ]

RoboGuard is a general-purpose guardrail for ensuring the safety of LLM-enabled robots. RoboGuard is configured offline with high-level safety rules and a robot description, reasons about how these safety rules are best applied in the robot’s context, then synthesizes a plan that maximally follows user preferences while ensuring safety.

[ RoboGuard ]

In this demonstration, a small team responds to a (simulated) radiation contamination leak at a real nuclear reactor facility. The team deploys their reconfigurable robot to accompany them through the facility. As the station is suddenly plunged into darkness, the robot’s camera is hot-swapped to thermal so that it can continue on. Upon reaching the approximate location of the contamination, the team installs a Compton gamma-ray camera and pan-tilt illuminating device. The robot autonomously steps forward, locates the radiation source, and points it out with the illuminator.

[ Paper ]

On March 6th, 2025, the Robomechanics Lab at CMU was flooded with 4 feet of black water (i.e., mixed with sewage). We lost most of the robots in the lab, and as a tribute my students put together this “In Memoriam” video. It includes some previously unreleased robots and video clips.

[ Carnegie Mellon University Robomechanics Lab ]

There haven’t been a lot of successful education robots, but here’s one of them.

[ Sphero ]

The opening keynote from the 2025 Silicon Valley Humanoids Summit: “Insights Into Disney’s Robotic Character Platform,” by Moritz Baecher, Director, Zurich Lab, Disney Research.

[ Humanoids Summit ]
- Video Friday: A Robot Hand With Artificial Muscles and Tendons by Evan Ackerman on March 6, 2026 at 4:00 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2026: 1–5 June 2026, VIENNA

Enjoy today’s videos!

The functional replication and actuation of complex structures inspired by nature is a longstanding goal for humanity. Creating such complex structures combining soft and rigid features and actuating them with artificial muscles would further our understanding of natural kinematic structures. We printed a biomimetic hand in a single print process composed of a rigid skeleton, soft joint capsules, tendons, and printed touch sensors.

[ Paper ] via [ SRL ]

Two Boston Dynamics product managers talk about their favorite classic BD robots, and then I talk about mine. And this is Boston Dynamics’ LittleDog, doing legged locomotion research 16 or so years ago in what I’m pretty sure is Katie Byl’s lab at UCSB.

[ Boston Dynamics ]

This is our latest work on a trajectory planning method for floating-base articulated robots, enabling global path search in complex and cluttered environments.

[ DRAGON Lab ]

Thanks, Moju!

OmniPlanner is a unified solution for exploration and inspection-path planning (as well as target reach) across aerial, ground, and underwater robots. It has been verified through extensive simulations and a multitude of field tests, including in underground mines, ballast water tanks, forests, university buildings, and submarine bunkers.

[ NTNU ]

Thanks, Kostas!

In the ARISE project, the FZI Research Center for Information Technology and its international partners ETH Zurich, University of Zurich, University of Bern, and University of Basel took a major step toward future lunar missions by testing cooperative autonomous multirobot teams under outdoor conditions.

[ FZI ]

Welcome to the future, where there are no other humans.

[ Zhejiang Humanoid ]

This is our latest work on robotic fish, and it’s also the first underwater robot from DRAGON Lab.

[ DRAGON Lab ]

Thanks, Moju!

Watch this one simple trick to make humanoid robots cheaper and safer!

[ Zhejiang Humanoid ]

Gugusse and the Automaton is an 1897 French film by Georges Méliès featuring a humanoid robot in a depiction that’s nearly as realistic as some of the humanoid promo videos we’ve seen lately.

[ Library of Congress ] via [ Gizmodo ]

At Agility, we create automated solutions for the hardest work. We’re incredibly proud of how far we’ve come, and can’t wait to show you what’s next.

[ Agility ]

Kamel Saidi, robotics program manager at the National Institute of Standards and Technology (NIST), on how performance standards can pave the way for humanoid adoption.

[ Humanoids Summit ]

Anca Dragan is no stranger to Waymo. She worked with us for six years while also at UC Berkeley and now at Google DeepMind. Her focus on making AI safer helped Waymo as it launched commercially. In this final episode of our season, Anca describes how her work enables AI agents to work fluently with people, based on human goals and values.

[ Waymo Podcast ]

This UPenn GRASP SFI Seminar is by Junyao Shi: “Unlocking Generalist Robots with Human Data and Foundation Models.” Building general-purpose robots remains fundamentally constrained by data scarcity and labor-intensive engineering. Unlike vision and language, robotics lacks large, diverse datasets that span tasks, environments, and embodiments, thus limiting both scalability and generalization. This talk explores how human data and foundation models trained at scale can help overcome these bottlenecks.

[ UPenn ]
- What Military Drones Can Teach Self-Driving Cars by Missy Cummings on March 2, 2026 at 12:00 pm
Self-driving cars often struggle with situations that are commonplace for human drivers. When confronted with construction zones, school buses, power outages, or misbehaving pedestrians, these vehicles often behave unpredictably, leading to crashes or freezing events, causing significant disruption to local traffic and possibly blocking first responders from doing their jobs. Because self-driving cars cannot successfully handle such routine problems, self-driving companies use human babysitters to remotely supervise them and intervene when necessary.

This idea—humans supervising autonomous vehicles from a distance—is not new. The U.S. military has been doing it since the 1980s with unmanned aerial vehicles (UAVs). In those early years, the military experienced numerous accidents due to poorly designed control stations, lack of training, and communication delays.

As a Navy fighter pilot in the 1990s, I was one of the first researchers to examine how to improve UAV remote-supervision interfaces. The thousands of hours I and others have spent working on and observing these systems generated a deep body of knowledge about how to safely manage remote operations. With recent revelations that U.S. commercial self-driving car remote operations are handled by operators in the Philippines, it is clear that self-driving companies have not learned the hard-earned military lessons that would promote safer use of self-driving cars today.

While stationed in the Western Pacific during the Gulf War, I spent a significant amount of time in air operations centers, learning how military strikes were planned, implemented, and then replanned when the original plan inevitably fell apart. After obtaining my PhD, I leveraged this experience to begin research on the remote control of UAVs for all three branches of the U.S. military.
Sitting shoulder-to-shoulder in tiny trailers with operators flying UAVs in local exercises or from 4,000 miles away, my job was to learn about the pain points for the remote operators, as well as to identify possible improvements, as they executed supervisory control over UAVs that might be flying halfway around the world.

Supervisory control refers to situations where humans monitor and support autonomous systems, stepping in when needed. For self-driving cars, this oversight can take several forms. The first is teleoperation, where a human remotely controls the car’s speed and steering from afar. Operators sit at a console with a steering wheel and pedals, similar to a racing simulator. Because this method relies on real-time control, it is extremely sensitive to communication delays.

The second form of supervisory control is remote assistance. Instead of driving the car in real time, a human gives higher-level guidance. For example, an operator might click a path on a map (called laying “breadcrumbs”) to show the car where to go, or interpret information the AI cannot understand, such as hand signals from a construction worker. This method tolerates more delay than teleoperation but is still time-sensitive.

Five Lessons From Military Drone Operations

Over 35 years of UAV operations, the military consistently encountered five major challenges, which provide valuable lessons for self-driving cars.

Latency

Latency—delays in sending and receiving information due to distance or poor network quality—is the single most important challenge for remote vehicle control. Humans also have their own built-in delay: neuromuscular lag. Even under perfect conditions, people cannot reliably respond to new information in less than 200–500 milliseconds. In remote operations, where communication lag already exists, this makes real-time control even more difficult.

In early drone operations, U.S. Air Force pilots in Las Vegas (the primary U.S. UAV operations center) attempted to take off and land drones in the Middle East using teleoperation. With at least a two-second delay between command and response, the accident rate was 16 times that of fighter jets conducting the same missions. The military switched to local line-of-sight operators and eventually to fully automated takeoffs and landings. When I interviewed the pilots of these UAVs, they all stressed how difficult it was to control the aircraft with significant time lag.

Self-driving car companies typically rely on cellphone networks to deliver commands. These networks are unreliable in cities and prone to delays. This is one reason many companies prefer remote assistance instead of full teleoperation. But even remote assistance can go wrong. In one incident, a Waymo operator instructed a car to turn left when a traffic light appeared yellow in the remote video feed—but the network latency meant that the light had already turned red in the real world. After moving its remote operations center from the U.S. to the Philippines, Waymo’s latency increased even further. It is imperative that control not be so remote, both to resolve the latency issue and to increase oversight against security vulnerabilities.

Workstation Design

Poor interface design has caused many drone accidents. The military learned the hard way that confusing controls, difficult-to-read displays, and unclear autonomy modes can have disastrous consequences. Depending on the specific UAV platform, the FAA attributed between 20 and 100 percent of Army and Air Force UAV crashes caused by human error through 2004 to poor interface design.

UAV crashes (1986–2004) caused by human factors problems, including poor interface and procedure design. The interface and procedure categories do not sum to 100 percent because both factors could be present in an accident.

| Platform | Human Factors | Interface Design | Procedure Design |
| --- | --- | --- | --- |
| Army Hunter | 47% | 20% | 20% |
| Army Shadow | 21% | 80% | 40% |
| Air Force Predator | 67% | 38% | 75% |
| Air Force Global Hawk | 33% | 100% | 0% |

Many UAV crashes have been caused by poorly designed human control systems. In one case, buttons were placed on the controllers such that it was relatively easy to accidentally shut off the engine instead of firing a missile, and remote operators did exactly that. The self-driving industry shows hints of comparable issues. Some autonomous shuttles use off-the-shelf gaming controllers, which—while inexpensive—were never designed for vehicle control. The off-label use of such controllers can lead to mode confusion, which was a factor in a recent shuttle crash. Significant human-in-the-loop testing is needed to avoid such problems, not only prior to system deployment but also after major software upgrades.

Operator Workload

Drone missions typically include long periods of surveillance and information gathering, occasionally ending with a missile strike. These missions can sometimes last for days; for example, while the military waits for a person of interest to emerge from a building. As a result, remote operators experience extreme swings in workload: sometimes overwhelming intensity, sometimes crushing boredom. Both conditions can lead to errors.

When operators teleoperate drones, workload is high and fatigue can quickly set in. But when onboard autonomy handles most of the work, operators can become bored, complacent, and less alert. This pattern is well documented in UAV research.

Self-driving car operators are likely experiencing similar issues for tasks ranging from interpreting confusing signs to helping cars escape dead ends.
In simple scenarios, operators may be bored; in emergencies—like driving into a flood zone or responding during a citywide power outage—they can become quickly overwhelmed.

The military has tried for years to have one person supervise many drones at once, because it is far more cost-effective. However, cognitive switching costs (regaining awareness of a situation after switching control between drones) result in workload spikes and high stress. That, coupled with increasingly complex interfaces and communication delays, has made this extremely difficult.

Self-driving car companies likely face the same roadblocks. They will need to model operator workloads and be able to reliably predict what staffing should be and how many vehicles a single person can effectively supervise, especially during emergency operations. If every self-driving car turns out to need a dedicated human paying close attention, such operations would no longer be cost-effective.

Training

Early drone programs lacked formal training requirements, with training programs designed by pilots, for pilots. Unfortunately, supervising a drone is more akin to air traffic control than to actually flying an aircraft, so the military often placed drone operators in critical roles with inadequate preparation. This caused many accidents. Only years later did the military conduct a proper analysis of the knowledge, skills, and abilities needed for safe remote operations, and change its training programs accordingly.

Self-driving companies do not publicly share their training standards, and no regulations currently govern the qualifications of remote operators. On-road safety depends heavily on these operators, yet very little is known about how they are selected or taught. Commercial aviation dispatchers, whose role is very similar to that of self-driving remote operators, are required to have formal training overseen by the FAA; we should hold commercial self-driving companies to similar standards.

Contingency Planning

Aviation has strong protocols for emergencies, including predefined procedures for lost communication, backup ground control stations, and highly reliable onboard behaviors when autonomy fails. In the military, drones may fly themselves to safe areas or land autonomously if contact is lost. Systems are designed with cybersecurity threats—like GPS spoofing—in mind.

Self-driving cars appear far less prepared. The 2025 San Francisco power outage left Waymo vehicles frozen in traffic lanes, blocking first responders and creating hazards. These vehicles are supposed to perform “minimum-risk maneuvers,” such as pulling to the side—but many of them didn’t. This suggests gaps in contingency planning and basic fail-safe design.

The history of military drone operations offers crucial lessons for the self-driving car industry. Decades of experience show that remote supervision demands extremely low latency, carefully designed control stations, manageable operator workload, rigorous and well-designed training programs, and strong contingency planning.

Self-driving companies appear to be repeating many of the early mistakes made in drone programs. Remote operations are treated as a support feature rather than a mission-critical safety system. But as long as AI struggles with uncertainty, which will be the case for the foreseeable future, remote human supervision will remain essential. The military learned these lessons through painful trial and error, yet the self-driving community appears to be ignoring them. The self-driving industry has the chance—and the responsibility—to learn from our mistakes in combat settings before it harms road users everywhere.
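The latency figures discussed in this article invite a quick sanity check. The sketch below is a hedged, illustrative calculation (the speed, reaction time, and link delay are assumed example values, not measurements from any deployed system) of how far a vehicle travels "blind" while a remote operator's reaction and the network round trip play out:

```python
# Illustrative back-of-envelope estimate: distance a vehicle covers before a
# remote operator's command can take effect. All numbers below are assumed
# example values, not data from any real teleoperation deployment.

def blind_travel_distance_m(speed_kmh: float,
                            reaction_s: float,
                            round_trip_latency_s: float) -> float:
    """Distance covered while the operator perceives and reacts, and the
    command traverses the network back to the vehicle."""
    speed_ms = speed_kmh / 3.6  # convert km/h to m/s
    return speed_ms * (reaction_s + round_trip_latency_s)

# Urban speed of 40 km/h, a 0.5 s human reaction (upper end of the 200-500 ms
# neuromuscular lag cited above), and a 2 s link delay like the one reported
# for early UAV teleoperation:
d = blind_travel_distance_m(40, 0.5, 2.0)
print(f"{d:.1f} m traveled before the command takes effect")
```

Under these assumptions the car covers roughly 28 meters before any command takes effect, which helps explain why teleoperation degrades so sharply with latency and why remote assistance is preferred.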
- Video Friday: Robot Dogs Haul Produce From the Field by Evan Ackerman on February 27, 2026 at 6:00 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2026: 1–5 June 2026, VIENNA

Enjoy today’s videos!

Our Lynx M20 robots help transport harvested crops in mountainous farmland—tackling the rural “last mile” logistics challenge.

[ Deep Robotics ]

Once again, I would point out that now that we are reaching peak humanoid robots doing humanoid things, we are inevitably about to see humanoid robots doing nonhumanoid things.

[ Unitree ]

In a study, a team of researchers from the Max Planck Institute for Intelligent Systems, the University of Michigan, and Cornell University show that groups of magnetic microrobots can generate fluidic forces strong enough to rotate objects in different directions without touching them. These microrobot swarms can turn gear systems, rotate objects much larger than the robots themselves, assemble structures on their own, and even pull in or push away many small objects.

[ Science ] via [ Max Planck Institute ]

Bipedal—or two-legged—autonomous robots can be quite agile. This makes them useful for performing tasks on uneven terrain, such as carrying equipment through outdoor environments or performing maintenance on an oceangoing ship. However, unstable or unpredictable conditions also increase the possibility of a robot wipeout. Until now, there’s been a significant lack of research into how a robot recovers when its direction shifts—for example, a robot losing balance when a truck makes a quick turn. The team aims to fix this research gap.

[ Georgia Tech ]

Robotics is about controlling energy, motion, and uncertainty in the real world.

[ Carnegie Mellon University ]

Delicious dinner cooked by our robot Robody. We’ve asked our investors to speak about why they’re along for the ride.

[ Devanthro ]

Tilt-rotor aerial robots enable omnidirectional maneuvering through thrust vectoring, but introduce significant control challenges due to the strong coupling between joint and rotor dynamics. This work investigates reinforcement learning for omnidirectional aerial motion control on overactuated tiltable quadrotors that prioritizes robustness and agility.

[ Dragon Lab ]

At the [Carnegie Mellon University] Robotic Innovation Center’s 75,000-gallon water tank, members of the TartanAUV student group worked to further develop their autonomous underwater vehicle (AUV), called Osprey. The team, which takes part in the annual RoboSub competition sponsored by the U.S. Office of Naval Research, is composed primarily of undergraduate engineering and robotics students.

[ Carnegie Mellon University ]

Sure seems like the only person who would want a robot dog is a person who does not in fact want a dog. Compact size, industrial capability. Maximum torque of 90 N·m, over 4 hours of no-load runtime, IP54 rainproof design. With a 15-kg payload, range exceeds 13 km. Open secondary development, empowering industry applications.

[ Unitree ]

If your robot video includes tasty baked goods, it will be included in Video Friday.

[ QB Robotics ]

Astorino is a 6-axis educational robot created for practical and affordable teaching of robotics in schools and beyond. It has been created with 3D printing, so it allows for experimentation and the possible addition of parts. With its design and programming, it replicates the actions of industrial robots, giving students the necessary skills for future work.

[ Astorino by Kawasaki ]

We need more autonomous driving datasets that accurately reflect how sucky driving can be a lot of the time.

[ ASRL ]

This Carnegie Mellon University Robotics Institute Seminar is by CMU’s own Victoria Webster-Wood, on “Robots as Models for Biology and Biology as Materials for Robots.” In the last century, it was common to envision robots as shining metal structures with rigid and halting motion. This imagery is in contrast to the fluid and organic motion of living organisms that inhabit our natural world. The adaptability, complex control, and advanced learning capabilities observed in animals are not yet fully understood, and therefore have not been fully captured by current robotic systems. Furthermore, many of the mechanical properties and control capabilities seen in animals have yet to be achieved in robotic platforms. In this talk, I will share an interdisciplinary research vision for robots as models for neuroscience and biology as materials for robots.

[ CMU RI ]
- Perseverance Smashes Autonomous Driving Record on Mars by Michelle Hampson on February 25, 2026 at 3:00 pm
This article is part of our exclusive IEEE Journal Watch series in partnership with IEEE Xplore.

In past missions to Mars, like those of the Curiosity and Opportunity rovers, the robots relied mostly on human instructions from millions of miles away in order to safely navigate the Martian landscape. The Perseverance rover, on the other hand, has zipped across the alien, boulder-ridden land almost completely autonomously, smashing previous records for autonomous driving on Mars. Whereas the Curiosity rover completed about 6.2 percent of its travels autonomously, Perseverance had completed about 90 percent of its travels autonomously as of its 1,312th Martian day since landing (28 October 2024).

Perseverance was able to accomplish such a feat—using remarkably little computing power—thanks to its specially designed autonomous driving algorithm, Enhanced Autonomous Navigation, or ENav. The full details of ENav’s inner workings and how well it has performed on Mars are described in a study published in IEEE Transactions on Field Robotics in November 2025.

There are some advantages, but also some serious challenges, when it comes to autonomous navigation on Mars. On the plus side, almost nothing on the planet moves. Rocks and gravel slopes—while formidable obstacles—remain stationary, offering rovers consistency and predictability in their calculations and pathfinding. On the other hand, Mars is in large part uncharted terrain. “This enormous uncertainty is the major challenge,” says Masahiro “Hiro” Ono, supervisor of the Robotic Surface Mobility Group at NASA’s Jet Propulsion Laboratory, who helped develop ENav.

Creating a Highly Autonomous Rover

While some images from the space-borne Mars Reconnaissance Orbiter exist, these are usually not high enough resolution for ground-based navigation by a rover. In December, NASA engineers performed the first test of a navigation technique that uses a model based on Anthropic’s AI to analyze MRO images and generate waypoints—the coordinates used to guide the rover’s path—for more complete automation.

RELATED: NASA Let AI Drive the Perseverance Rover

But for the majority of today’s navigation, Perseverance must rely on images the rover itself takes, analyze these to assess thousands of different paths, and choose the right route that won’t end in its own demise. The kicker? It must do so with the equivalent computing capacity of an iMac G3, an Apple computer sold in the late 1990s.

The rover’s processor must undergo radiation hardening, a process that makes it resilient to the extreme levels of solar radiation and cosmic rays experienced on Mars. Although other radiation-hardened CPUs with more computing power were available at the time of Perseverance’s development, the one used has proven reliable in the harsh conditions of outer space. By reusing hardware from previous missions—the same CPU was used in Curiosity—NASA can reduce costs while minimizing risk.

Given its limited computing resources, the ENav algorithm was strategically designed to do the heaviest computing only when driving on challenging terrain. It works by analyzing images of its surroundings and assessing about 1,700 possible paths forward, typically within 6 meters of the rover’s current position. Assessing factors such as travel time and terrain roughness, it ranks the possible paths. Finally, it runs a computationally heavy collision-checking algorithm, called ACE (approximate clearance estimation), on only a handful of the top-ranked paths.

As of October 2024, Perseverance has driven more than 30 kilometers (18.65 miles) and collected 24 samples of rock and regolith. Source: JPL-Caltech/ASU/MSSS/NASA

Exploring the Red Planet with ENav

Perseverance landed on Mars on 18 February 2021.
In their study, Ono and his colleagues describe how the rover was initially deployed with strong human navigation oversight during its first 64 Martian days on the Red Planet, but then went on to predominantly use ENav to travel to one of the major exploration targets: the delta formed by an ancient river that once flowed into Jezero Crater billions of years ago. Scientists believe it could be a prime spot for finding evidence of past alien life, if life ever existed on Mars.

After a brief exploration of an area southwest of its landing site, Perseverance jetted counterclockwise around sand dunes toward the ancient river delta at a crisp pace, averaging 201 meters per Martian day. (It’s too cold for the rover to travel at night.) Over the course of just 24 Martian days of driving, the rover traveled about 5 kilometers into the foothills of the delta. Ninety-five percent of all driving that month was performed in the autonomous driving mode, resulting in an unprecedented amount of autonomous driving on Mars.

Past rovers, such as Curiosity, had to stop and “think” about their paths before moving forward. “That was the main speed bump for Curiosity, why it was so slow to drive autonomously,” Ono explains. In contrast, Perseverance is able to think and drive at the same time. “Sometimes [Perseverance] has to stop and think, particularly when it cannot figure out a safe path quickly. But most of the time, particularly on easy terrains, it can keep driving without stopping,” Ono says. “That made its autonomous driving an order of magnitude faster.”

Opportunity held the previous record for autonomous driving on Mars, traveling 109 meters in a single Martian day. But on 3 April 2023, Perseverance set a new record by driving 331.74 meters autonomously (and 347.69 meters in total) in a single Martian day.

Ono says that fine-tuning the ENav algorithm took a lot of work, but he is happy with its performance. He also emphasizes that efforts to continue advancing autonomous navigation are critical if humans want to continue exploring even deeper into space, where communication between Earth and rovers or other spacecraft will become increasingly difficult.

“The automation of the space systems is unstoppable direction that we have to go if we want to explore deeper in space,” Ono says. “This is the direction that we must go to push the boundaries and frontiers of space exploration.”

This article was updated on 27 February to clarify NASA’s reasoning for selecting the CPU used in the Perseverance rover.
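The rank-then-check strategy the article attributes to ENav (score roughly 1,700 candidate paths cheaply, then run the expensive ACE clearance check on only a handful of top-ranked ones) can be sketched in a few lines. This is a hypothetical toy, not NASA's code: the `Path` fields, cost weights, and clearance test are all stand-ins, showing only the control flow that keeps the heavy computation off most candidates.

```python
# Toy sketch of a rank-then-check path planner, in the spirit of the ENav
# description above. All names, weights, and thresholds are illustrative
# assumptions, not NASA's implementation.
from dataclasses import dataclass
import random


@dataclass
class Path:
    travel_time: float  # cheap-to-estimate cost terms
    roughness: float


def cheap_cost(p: Path) -> float:
    # Inexpensive heuristic ranking; the weight on roughness is an assumption.
    return p.travel_time + 2.0 * p.roughness


def expensive_clearance_ok(p: Path) -> bool:
    # Stand-in for a costly collision/clearance check like ACE.
    return p.roughness < 0.8


def plan(paths, top_k=10):
    # Rank every candidate by the cheap cost, then spend the expensive
    # check on only the top_k best-ranked paths.
    ranked = sorted(paths, key=cheap_cost)
    for p in ranked[:top_k]:
        if expensive_clearance_ok(p):
            return p
    return None  # no safe path found quickly; stop and "think" longer


random.seed(0)
candidates = [Path(random.uniform(5, 20), random.uniform(0, 1))
              for _ in range(1700)]
best = plan(candidates)
print(best)
```

The saving is in the expensive step: instead of 1,700 clearance checks per planning cycle, at most `top_k` are run, which is what lets a modest processor keep planning while the vehicle moves.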
- Video Friday: Humanoid Robots Celebrate Spring by Evan Ackerman on February 20, 2026 at 6:00 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.ICRA 2026: 1–5 June 2026, VIENNAEnjoy today’s videos! So humanoid robots are nearing peak human performance. I would point out, though, that this is likely very far from peak robot performance, which has yet to be effectively exploited, because it requires more than just copying humans.[ Unitree ]“The Street Dance of China”: Turning lightness into gravity, and rhythm into impact.This is a head-on collision between metal and beats. This Chinese New Year, watch PNDbotics Adam bring the heat with a difference.[ PNDbotics ]You had me at robot pandas.[ MagicLab ]NASA’s Perseverance rover can now precisely determine its own location on Mars without waiting for human help from Earth. This is possible thanks to a new technology called Mars global localization. This technology rapidly compares panoramic images from the rover’s navigation cameras with onboard orbital terrain maps. It’s done with an algorithm that runs on the rover’s helicopter base station processor, which was originally used to communicate with the Ingenuity Mars helicopter. In a few minutes, the algorithm can pinpoint Perseverance’s position to within about 10 inches (25 centimeters). The technology will help the rover drive farther autonomously and keep exploring. [ NASA Jet Propulsion Laboratory ]Legs? 
Where we’re going, we don’t need legs!

[ Paper ]

This is a bit of a tangent from robotics, but it gets a pass because of the cute jumping-spider footage.

[ Berkeley Lab ]

Corvus One for Cold Chain is engineered to live and operate permanently in freezer environments, down to –20 °F, while maintaining full-flight and barcode-scanning performance.

I am sure there is an excellent reason for putting a cold-storage facility in the Mojave Desert.

[ Corvus Robotics ]

The video documents the current progress in the picking rate of the Shiva robot when picking strawberries. It first shows the previous status, then the further development, and finally the field test.

[ DFKI ]

Data powers an organization’s digital transformation, and ST Engineering MRAS is leveraging Spot to get a full view of critical equipment and a facility. Working autonomously, Spot collects information about machine health—and now, thanks to an integration of the Leica BLK ARC for reality capture, detailed and accurate point-cloud data for its digital twin.

[ Boston Dynamics ]

The title of this video is “Get out and have fun!” Is that pretty much what humanoid robots are good for right now?

[ Engine AI ]

Astorino is a modern six-axis robot based on 3D-printing technology. Programmable in the AS language, the robot facilitates the preparation of classes with ready-made teaching materials, is easy both to use and to repair, and gives students the opportunity to learn and make mistakes without fear of breaking it.

[ Kawasaki ]

Can I get this in my living room?

[ Yaskawa ]

What does it mean to build a humanoid robot in seven months, and the next one in just five? This documentary takes you behind the scenes at Humanoid, a U.K.-based AI and robotics company building reliable, safe, and helpful humanoid robots.
You’ll hear directly from our engineering, hardware, product, and other teams as they share their perspectives on the journey of turning physical AI into reality.

[ Humanoid ]

This IROS 2025 keynote is from Tim Chung—now at Microsoft—on catalyzing the future of human, robot, and AI agent teams in the physical world.

The convergence of technologies—from foundation AI models to diverse sensors and actuators to ubiquitous connectivity—is transforming the nature of interactions in the physical and digital world. People have accelerated their collaborative connections and productivity through digital and immersive technologies, no longer limited by geography, language, or access. Humans have also leveraged and interacted with AI in many different forms, with the advent of hyperscale AI models (that is, large language models) forever changing, at an ever-astonishing pace, the nature of human–AI teams, realized in this era of the AI “copilot.” Similarly, robotics and automation technologies now afford greater opportunities to work with and near humans, allowing increasingly collaborative physical robots to dramatically impact real-world activities. The compounding effect of enabling all three capabilities, each complementary to the others in valuable ways, is why we envision the triad of human–robot–AI teams revolutionizing the future of society, the economy, and technology.

[ IROS 2025 ]

This GRASP SFI talk is by Chris Paxton at Agility Robotics: “How Close Are We to Generalist Humanoid Robots?”

With billions of dollars of funding pouring into robotics, general-purpose humanoid robots seem closer than ever. And certainly it feels like the pace of robotics is faster than ever, with multiple companies beginning large-scale deployments of humanoid robots.
In this talk, I’ll go over the challenges still facing scaling robot learning, looking at insights from a year of discussions with researchers all over the world.

[ University of Pennsylvania GRASP Laboratory ]

This week’s Carnegie Mellon University Robotics Institute Seminar is from Jitendra Malik at the University of California, Berkeley: “Robot Learning, With Inspiration From Child Development.”

For intelligent robots to become ubiquitous, we need to “solve” locomotion, navigation, and manipulation at sufficient reliability in widely varying environments. In locomotion, we now have demonstrations of humanoid walking in a variety of challenging environments. In navigation, we pursued the task of “Go to Any Thing”: a robot, on entering a newly rented Airbnb, should be able to find objects such as TV sets or potted plants. RL in simulation and sim-to-real have been workhorse technologies for us, assisted by a few technical innovations. I will sketch promising directions for future work.

[ Carnegie Mellon University Robotics Institute ]
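The map-matching idea behind the Mars global localization item above can be sketched as plain template matching: slide a ground-observed terrain patch over an orbital map and keep the offset with the highest normalized cross-correlation. This is a toy sketch, not JPL's algorithm; the function name, map sizes, and random terrain are illustrative assumptions.

```python
import numpy as np

def locate_patch(orbital_map: np.ndarray, patch: np.ndarray):
    """Return (row, col) where `patch` best matches inside `orbital_map`,
    scored by normalized cross-correlation (brute force, for clarity)."""
    ph, pw = patch.shape
    H, W = orbital_map.shape
    p = (patch - patch.mean()) / (patch.std() + 1e-12)
    best, best_rc = -np.inf, (0, 0)
    for r in range(H - ph + 1):
        for c in range(W - pw + 1):
            win = orbital_map[r:r + ph, c:c + pw]
            w = (win - win.mean()) / (win.std() + 1e-12)
            score = float((p * w).mean())  # NCC in [-1, 1]
            if score > best:
                best, best_rc = score, (r, c)
    return best_rc

# Demo: embed a known patch at (12, 7) in synthetic terrain and recover it.
rng = np.random.default_rng(0)
terrain = rng.standard_normal((40, 40))
patch = terrain[12:20, 7:15].copy()
print(locate_patch(terrain, patch))  # → (12, 7)
```

A real system would match against elevation or appearance features robust to lighting and viewpoint, but the search-and-score structure is the same.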
- Tech Is Taking Over Olympic Curling by Elie Dolgin on February 18, 2026 at 3:00 pm
At this year’s Winter Olympics in Italy, the controversy began with a fingertip.

A disputed double-touch—whether a curler had brushed a moving stone twice—sparked protests, profanity-laced exchanges, and heated debate about sportsmanship. In a game that prides itself on mutual trust and the idea of competition as a shared test of skill, even the suggestion of impropriety can ripple far beyond a single end.

But if a double-touch can shake the sport, what happens when the controversy isn’t about a fingertip but an algorithm?

That’s the question shadowing the rise of analytics driven by machine learning and a new breed of AI-powered robots that can throw stones, read the ice, and calculate strategy with machine precision.

RELATED: Milan-Cortina Winter Olympics Debut Next-Generation Sport Smarts

Some of these robots, such as “Curly,” have already toppled elite human opponents in head-to-head competitions. Others, engineered either to replicate the biomechanics of human shot delivery or to fire stones consistently with repeatable speed and rotation, are transforming the sport by dissecting technique and strategy with a level of rigor no coach with a stopwatch could match.

Seen here in action, the two-part robot system named Curly made its debut in 2018 ahead of that year’s Paralympic Winter Games in Pyeongchang. TUBerlinTV/YouTube

“The amount of innovation I’m seeing is just tremendous,” says Glenn Paulley, a retired computer scientist who now runs Throwing Rocks Consulting Services, where he coaches curlers and advises teams on analytics.

Fueled by investments from governments and sporting bodies around the world, the pursuit of a competitive edge has escalated into a data-driven push for marginal gains ahead of each Olympic cycle.
“They’re trying like crazy to elevate their national team programs,” Paulley says, “and they’re doing it in every way possible.” By the time medals are handed out in Cortina d’Ampezzo this weekend, the imprint of this full-throttle tech offensive could be etched into every sheet of ice.

Yet, as algorithms begin suggesting shots, the contours of fair play blur. Regulators and coaches alike are grappling with where to draw the line. And as top curlers lean more into AI and robotic systems, some fear the loss of something fundamental: the quiet, hard-earned feel for ice that separates veterans from novices.

“It’s a big debate!” says Emily Zacharias, a former elite curler from Manitoba who captured gold representing Canada at the 2020 World Junior Curling Championships.

Three decades ago, Garry Kasparov sat across from IBM’s Deep Blue and discovered that even the most cerebral of games could be unsettled by silicon. Curling, long called “chess on ice,” may now be entering its own version of that reckoning.

Can New Tech Comply With the “Spirit of Curling”?

Curling has been at this kind of crossroads before. A decade back, the sweeping-fabric controversy known as “Broomgate” triggered accusations of technological doping, a dispute that tore at the heart of the sport’s ethos of trust and bonhomie.

The World Curling Federation responded by clamping down on brush materials, but AI now poses a broader challenge. It is not just a better broom but a decision engine, capable of shifting authority from a player’s judgment in the “house” to a model running in the cloud.
The six-legged “hexapod” curling robot is displayed at the World Robot Conference 2022 in Beijing, where that year’s Olympic Games were also held. Anna Ratkoglo/Sputnik/AP

It’s a prospect that unsettles some athletes and ethicists, who worry about what gets lost as optimization tightens its grip on a sport long governed by the so-called Spirit of Curling, an unwritten code of integrity, fairness, and respect.

“We’re at a point now where just about everything that we used to hold up as uniquely human is now being eroded by technology—and we feel a loss,” says Jason Millar, who runs the Canadian Robotics and AI Ethical Design Lab at the University of Ottawa.

“The AI doesn’t care,” he adds. “There’s no ‘spirit’ there.”

Building Rock-Solid Curling Robots

The Curly robot first made waves in 2018 when, ahead of that year’s Paralympic Winter Games in Pyeongchang, engineers at Korea University, in Seoul, unveiled the AI-powered device—or, rather, two coordinated devices, a pair of “skip” and “thrower” units, designed to read the ice and deliver stones.

Driven by a physics-based simulator and an adaptive deep-reinforcement-learning framework, the robot didn’t simply replay preprogrammed shots. It learned from its own misses, updated its aim based on the gaps between intended and actual stone positions, and factored in the cumulative wear of pebbled ice as a match unfolded.

That capacity was put to the test in a series of mini-games against top-ranked Korean athletes. As reported in the journal Science Robotics, Curly started slow, dropping the opening match as it calibrated to the live ice. But it then went on to win the next three contests, demonstrating what its creators called “human-level performance” under real-world conditions.

The next Winter Olympics—the Beijing 2022 Games—then brought a more agile machine: a “hexapod” curling robot built to walk, align, and throw like a human curler.
With six legs, the hexapod robot can act more like a human curler when launching the stone, putting a new spin on curling-robot tech. FlyingDumplings/YouTube

With its six-legged gait for stable traction and flexibility on the ice, the robot could pivot at the “hack,” the rubber foothold curlers use to launch their delivery. From there, the hexapod set its angle, kicked off, and glided on a skateboard-like undercarriage before releasing the stone, imparting competition-level spin.

Equipped with lidar and cameras, the robot scanned the sheet to map stone positions and fed those data into software that calculated collision paths and solved for the precise release parameters needed to execute a chosen strategy.

Curling Bots Leave Broom for Improvement

For all the technical prowess of Curly and the hexapod, one stubborn constraint remains: No robot can sweep—at least not yet.

There are no Roomba-like machines flanking the stone, frantically brushing to extend its travel or hold its line. Once released, the robot’s shot is fate, untouched by the vigorous, broom-flailing choreography that so often determines whether a stone bites the button or drifts wide.

“These robots are leaving out a huge chunk of potential that humans are bringing to the game,” says Steven Passmore, a human-movement scientist at the University of Manitoba in Winnipeg who, together with Zacharias, coauthored a comprehensive review of the scientific literature on curling.

At the time of their data cutoff, in 2021, they found nearly two dozen published studies about robotics, AI, and emerging tech in the sport. But as Zacharias points out, the most sophisticated tools shaping elite play often never appear in academic journals, developed behind closed doors and closely guarded as competitive secrets.

For her part, Zacharias—who competed at four Canadian women’s curling championships between 2021 and 2024—says she never once practiced against a robot.
But she has trained with a rock launcher, a mechanized delivery system that fires stones at precisely calibrated speeds and rotations, over and over.

By standardizing the throw, the device allows athletes to isolate how different sweeping techniques, brush-head fabrics, or ice temperatures alter a stone’s path, explains Paulley. “It means you can run repeated experiments in order to test the impact of different variables,” he says. “And in curling, there are a lot of variables.”

Cutting-Edge Tech Helps Athletes Train

In Japan, all these technologies and more are being explored in a government-backed initiative called Curling of the Future.

The program brings together university engineers, sporting agencies, and elite athletes to prototype delivery robots and sweep-assist machines, along with AI strategy engines, instrumented “smart stones,” and rock-launcher systems for controlled training. “The core objective is elite performance: improving decision-making and the quality of training so that Japan can strengthen its competitiveness in international competition,” says Yoshinari Takegawa, an information scientist at Future University Hakodate who is co-leading the project.

Dylan Rusnak, a kinesiology student at Red Deer Polytechnic, contributed to the project by developing a VR system for curling. Rusnak wears a Meta Quest headset [left] while demonstrating the system, which shows athletes immersive views of the rink [right]. Red Deer Polytechnic

The technology push isn’t confined to Olympic play either. At the Paralympics next month, the Canadian national wheelchair curling squad will come primed with training sessions inside a full virtual replica of the Cortina Curling Olympic Stadium, courtesy of a VR system developed by mechanical engineer Jennifer Dornstauder and her students at Red Deer Polytechnic in Alberta.
The setup drops athletes into an immersive curling rink via a Meta Quest headset, where they can look down and see virtual renderings of their legs, wheelchair, throwing stick, stones, and the ice surface beneath them.

According to Mick Lizmore, head coach of Canada’s National Wheelchair Curling Program, his team has used the VR system to visualize the venue where they will be competing and for group tactical training, even when they can’t meet in person. Beyond sharpening elite preparation, Dornstauder says, the same tool should help expand access to wheelchair curling for people with disabilities who face mobility challenges or limited ice availability.

“VR is just this amazing tool that is almost designed for getting around these barriers,” she says.

Will Tech Change Curling?

Many of the technologies entering curling are benign—tools for analysis, accessibility, and incremental refinement rather than wholesale disruption. A rock launcher standardizes practice. A VR headset extends rehearsal beyond the rink. A strategy engine offers probabilities, not ultimatums.

Taken together, however, they reveal how thoroughly digital systems are seeping into every layer of the sport.

AI-powered sparring machines tuned to mimic a rival team’s tendencies, and thus capable of playing out fully simulated preparatory matches, remain a fantasy. National curling programs operate on tight budgets, limiting how far and how fast innovation can go. And even well-funded federations must balance software and robotics against coaching, travel, and ice time.

Rock launchers provide a consistent throw to help athletes practice sweeping. Sean Maw/University of Saskatchewan

Yet as money continues to flow into high-performance curling, those possibilities draw closer.

“It’s probably just a matter of time,” says Sean Maw, a sports engineer at the University of Saskatchewan who has built rock launchers and studies the complexities of curling.
For now, the stones still leave human hands—hands capable of brilliance, instinct, and the occasional double-touch—and the final call still rests with the skip in the house. But the algorithms are edging closer to the button.
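The “release parameters” these robots solve for can be illustrated with the simplest possible model: a stone decelerating under constant friction stops after a distance d = v₀²/(2μg), so the required release speed is v₀ = √(2μgd). This is a toy sketch; real curling friction varies with speed (which is part of what makes stones curl), and the friction coefficient below is an assumed round number, not a measured value.

```python
import math

def release_speed(distance_m: float, mu: float = 0.008, g: float = 9.81) -> float:
    """Release speed (m/s) for a stone to stop after `distance_m` metres,
    assuming constant sliding friction mu (a deliberate simplification)."""
    return math.sqrt(2 * mu * g * distance_m)

# A draw roughly 28 m from release, with an assumed ice friction
# coefficient of 0.008:
v0 = release_speed(28.0)
print(f"{v0:.2f} m/s")  # → 2.10 m/s
```

The flat spot in this model is exactly where sweeping lives: brushing momentarily lowers μ ahead of the stone, extending d for a given v₀, which is why the article notes that no-sweep robots leave a big chunk of the game on the table.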
- Video Friday: Robot Collective Stays Alive Even When Parts Die by Evan Ackerman on February 13, 2026 at 4:30 pm
Enjoy today’s videos!

No system is immune to failure. The compromise between reducing failures and improving adaptability is a recurring problem in robotics. Modular robots exemplify this trade-off, because the number of modules dictates both the possible functions and the odds of failure. We reverse this trend, improving reliability with an increased number of modules by exploiting redundant resources and sharing them locally.

[ Science ] via [ RRL ]

Now that the Atlas enterprise platform is getting to work, the research version gets one last run in the sun. Our engineers made one final push to test the limits of full-body control and mobility, with help from the RAI Institute.

[ RAI ] via [ Boston Dynamics ]

Announcing Isaac 0: the laundry-folding robot we’re shipping to homes, starting in February 2026 in the Bay Area.

[ Weave Robotics ]

In a paper published in Science, researchers at the Max Planck Institute for Intelligent Systems, the Humboldt University of Berlin, and the University of Stuttgart have discovered that the secret to the elephant’s amazing sense of touch is in its unusual whiskers. The interdisciplinary team analyzed elephant-trunk whiskers using advanced microscopy methods that revealed a form of material intelligence more sophisticated than the well-studied whiskers of rats and mice. This research has the potential to inspire new physically intelligent robotic sensing approaches that resemble the unusual whiskers covering the elephant trunk.

[ MPI ]

Got an interest in autonomous mobile robots, ROS2, and a mere US $150 lying around?
Try this.

[ Maker's Pet ]

Thanks, Ilia!

We’re giving humanoid robots swords now.

[ Robotera ]

A system developed by researchers at the University of Waterloo lets people collaborate with groups of robots to create works of art inspired by music.

[ Waterloo ]

FastUMI Pro is a multimodal, model-agnostic data acquisition system designed to power a truly end-to-end closed loop for embodied intelligence, transforming real-world data into genuine robotic capability.

[ Lumos Robotics ]

We usually take fingernails for granted, but they’re vital for fine-motor control and feeling textures. Our students have been doing some great work looking into the mechanics behind this.

[ Paper ]

This is a 550-lb. all-electric coaxial unmanned rotorcraft developed by Texas A&M University’s Advanced Vertical Flight Laboratory and Harmony Aeronautics as a technology demonstrator for our quiet-rotor technology. The payload capacity is 200 lb. (gross weight: 750 lb.). The noise level measured was around 74 dBA in hover at 50 feet, making this probably the quietest rotorcraft at this scale.

[ Harmony Aeronautics ]

Harvard scientists have created an advanced 3D-printing method for developing soft robotics. This technique, called rotational multimaterial 3D printing, enables the fabrication of complex shapes and tubular structures with dissolvable internal channels. This innovation could someday accelerate the production of components for surgical robotics and assistive devices, advancing medical technology.

[ Harvard ]

The Lynx M20 wheel-legged robot steps onto the ice and snow, taking on challenges inspired by four winter-sports scenarios. Who says robots can’t enjoy winter sports?

[ Deep Robotics ]

NGL, right now I find this more satisfying to watch than a humanoid doing just about anything.

[ Fanuc ]

At Mentee Robotics, we design and build humanoid robots from the ground up with one goal: reliable, scalable deployment in real-world industrial environments.
Our robots are powered by deep vertical integration across hardware, embedded software, and AI, all developed in-house to close the Sim2Real gap and enable continuous, around-the-clock operation.

[ Mentee Robotics ]

You don’t need to watch this whole video, but the idea of little submarines that hitch rides on bigger boats and recharge themselves is kind of cool.

[ Lockheed Martin ]

Learn about the work of Dr. Roland Siegwart, Dr. Anibal Ollero, Dr. Dario Floreano, and Dr. Margarita Chli on flying robots and some of the challenges they are still trying to tackle, in this video created from their presentations at ICRA@40, the 40th-anniversary celebration of the IEEE International Conference on Robotics and Automation.

[ ICRA@40 ]
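The reliability-through-redundancy claim in the modular-robot item above is easy to make concrete: if a task needs k working modules and each module is independently up with probability p, spare modules raise the chance the collective survives. The numbers below are illustrative assumptions, not figures from the paper.

```python
from math import comb

def k_of_n_reliability(k: int, n: int, p: float) -> float:
    """Probability that at least k of n independent modules, each working
    with probability p, are functional (binomial tail sum)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

p = 0.95    # assumed per-module reliability
need = 4    # modules the task actually requires (assumed)
for spares in (0, 1, 2):
    r = k_of_n_reliability(need, need + spares, p)
    print(f"{spares} spare(s): system reliability {r:.4f}")
```

With these toy numbers, two spares push system reliability from about 0.81 to above 0.99, which is the sense in which more modules can mean fewer mission failures rather than more.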
- Video Friday: Autonomous Robots Learn By Doing in This Factory by Evan Ackerman on February 6, 2026 at 5:00 pm
Enjoy today’s videos!

To train the next generation of autonomous robots, scientists at Toyota Research Institute are working with Toyota Manufacturing to deploy them on the factory floor.

[ Toyota Research Institute ]

Thanks, Erin!

This is just one story (of many) about how we tried, failed, and learned how to improve our drone delivery system.

Okay, but like you didn’t show the really cool bit...?

[ Zipline ]

We’re introducing KinetIQ, an AI framework developed by Humanoid for end-to-end orchestration of humanoid robot fleets. KinetIQ coordinates wheeled and bipedal robots within a single system, managing both fleet-level operations and individual robot behavior across multiple environments. The framework operates across four cognitive layers, from task allocation and workflow optimization to task execution based on Vision-Language-Action models and whole-body control trained with reinforcement learning, and is shown here running across our wheeled industrial robots and bipedal R&D platform.

[ Humanoid ]

What if a robot gets damaged during operation? Can it still perform its mission without immediate repair? Inspired by the self-embodied resilience strategies of stick insects, we developed a decentralized adaptive resilient neural control system (DARCON). This system allows legged robots to autonomously adapt to limb loss, ensuring mission success despite mechanical failure. This innovative approach leads to a future of truly resilient, self-recovering robotics.

[ VISTEC ]

Thanks, Poramate!

This animation shows Perseverance’s point of view during a drive of 807 feet (246 meters) along the rim of Jezero Crater on 10 December 2025, the 1,709th Martian day, or sol, of the mission.
Captured over 2 hours and 35 minutes, 53 navigation-camera (Navcam) image pairs were combined with rover data on orientation, wheel speed, and steering angle, as well as data from Perseverance’s inertial measurement unit, and placed into a 3D virtual environment. The result is this reconstruction, with virtual frames inserted about every 4 inches (0.1 meters) of drive progress.

[ NASA Jet Propulsion Lab ]

−47.4 °C, 130,000 steps, 89.75° E, 47.21° N… On the extremely cold snowfields of Altay, the birthplace of human skiing, Unitree’s humanoid robot G1 left behind a unique set of marks.

[ Unitree ]

Representing and understanding 3D environments in a structured manner is crucial for autonomous agents to navigate and reason about their surroundings. In this work, we propose an enhanced hierarchical 3D scene graph that integrates open-vocabulary features across multiple abstraction levels and supports object-relational reasoning. Our approach leverages a vision-language model (VLM) to infer semantic relationships. Notably, we introduce a task-reasoning module that combines large language models and a VLM to interpret the scene graph’s semantic and relational information, enabling agents to reason about tasks and interact with their environment more intelligently. We validate our method by deploying it on a quadruped robot in multiple environments and tasks, highlighting its ability to reason about them.

[ Norwegian University of Science & Technology, Autonomous Robots Lab ]

Thanks, Kostas!

We present HoLoArm, a quadrotor with compliant arms inspired by the nodus structure of dragonfly wings.
This design provides natural flexibility and resilience while preserving flight stability, which is further reinforced by the integration of a reinforcement-learning control policy that enhances both recovery and hovering performance.

[ HO Lab via IEEE Robotics and Automation Letters ]

In this work, we present SkyDreamer, to the best of our knowledge the first end-to-end vision-based autonomous-drone-racing policy that maps directly from pixel-level representations to motor commands.

[ MAVLab ]

This video showcases AI Worker, equipped with five-finger hands, performing dexterous object manipulation across diverse environments. Through teleoperation, the robot demonstrates precise, humanlike hand control in a variety of manipulation tasks.

[ Robotis ]

Autonomous following, 45-degree slope climbing, and reliable payload transport in extreme winter conditions, built to support operations where environments push the limits.

[ DEEP Robotics ]

Living architectures, from plants to beehives, adapt continuously to their environments through self-organization. In this work, we introduce the concept of architectural swarms: systems that integrate swarm robotics into modular architectural façades. The Swarm Garden exemplifies how architectural swarms can transform the built environment, enabling “living-like” architecture for functional and creative applications.

[ SSR Lab via Science Robotics ]

Here are a couple of IROS 2025 keynotes, featuring Bram Vanderborght and Kyu-Jin Cho.

[ IROS 2025 ]
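The Perseverance drive-animation numbers above can be sanity-checked with quick arithmetic: 246 meters in 2 hours 35 minutes, with a virtual frame interpolated every 0.1 meters from the 53 Navcam pairs.

```python
# Back-of-envelope check on the drive-animation figures quoted above.
distance_m = 246.0
duration_s = 2 * 3600 + 35 * 60   # 2 h 35 min
frame_spacing_m = 0.1
navcam_pairs = 53

avg_speed_cm_per_s = 100 * distance_m / duration_s
virtual_frames = round(distance_m / frame_spacing_m)

print(f"average speed ≈ {avg_speed_cm_per_s:.1f} cm/s")   # ≈ 2.6 cm/s
print(f"≈ {virtual_frames} virtual frames interpolated from {navcam_pairs} Navcam pairs")
```

So the reconstruction fills in roughly 2,460 viewpoints from only 53 real image pairs, which is why the rover’s orientation, wheel-speed, steering, and IMU data are needed to place the virtual frames.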
- Ode to Very Small Devices by Paul Jones on January 30, 2026 at 7:02 pm
As fairies for the Irish or leeks for Welsh,
it’s the secret lives of small hidden machines,
their junctures, and networks that inspire me:
Mystic hidden functionaries that make
our made world live, brave little servo motors,
whose couplers, whose eccentric fire-filled
sensors are encased in bakelite with brass
screws, who stare with red eyes, who gauge moisture,
who notice tiny motions and respond,
whose cooling fans call out in white-noise
registers like older folk singers–I can
almost hear their earlier songs, their strong voices
now yelps, their thumps, their throbs, their hum, their chant–,
they click, they whir, they are sent spinning
inside like teen girls giggling over boy bands.
Most of all: ones waiting silently, concealing
the surprise of their purpose, tasks not yet known,
their true natures found only in connections.
Those that listen, those that speak,
those that control cool and heat,
those that open doors, those that lock
all the things that we’ve forgot,
those that hide, those that disclose
those embedded in our clothes
those in our ears, those in our hearts
those that bring together, those a part
of divisions, those like birds,
like parrots that complete our words,
those like fish, those that entrap,
those that free, those that freely flap
in fierce winds, those that replace
what we have lost, those that see
at night, in fog, in brightness, in fear,
those that show what we hold dear,
those that tempt, those that repel,
those that buy and those that sell,
those that keep us alive, those that
don’t, won’t, couldn’t and cannot.
Parts of one mind, not mine, blunt orchestra
of information, bundles of feelers
reaching out to touch us, teach us, guide us
to form better futures better understood.
May your sounds, your chimes, your silence calm us.
May your tender tendrils touch what we seek.
Small parts becoming one being intertwined,
a world in itself, remind us to be kind.
- Video Friday: Multitasking Robots Smoothly Do the Things Together by Evan Ackerman on January 30, 2026 at 6:30 pm
Enjoy this week’s videos!

Westwood Robotics is proud to announce a major update: THEMIS Gen2.5, the world’s first commercial full-size humanoid robot capable of manipulation on the move!

Now that you mention it, the bit at the end where the robot picks up a can while walking? I haven’t seen a lot of that.

[ Westwood Robotics ]

Last year, Helix showed that a single neural network could control a humanoid’s upper body from pixels. Today, Helix 02 extends that control to the entire robot—walking, manipulating, and balancing as one continuous system.

Why, yes, I am a normal human, and this is very similar to the default state of my kitchen.

[ Figure ]

Harry Goldstein, our editor in chief, went to meet Sprout from Fauna Robotics. He was skeptical at first, but Sprout won him over with its robotic charm.

[ Fauna Robotics ]

Kimberly Elenberg is showing how the data collected by robotic responders can save lives in mass-casualty events.

[ Carnegie Mellon University ]

The educational robotics market is tough, but you’ve got to hand it to Sphero—going strong since 2011, which is pretty incredible.

[ Sphero ]

If you want to fly in crazy conditions, you have to flight-test in those conditions. Here’s how and why we do it!

[ Zipline ]

I want to be impressed more by the idea of 3D-printing skin and skeleton at the same time, but come on, animals have been doing that for literally hundreds of millions of years without even trying.

[ JSK Lab, University of Tokyo ]

If there is a market for small bipedal robots that can both ski and be dinosaurs, LimX has it covered.

[ LimX ]

How do you remotely control robots that change shape?
We introduce a method for user-guided control of modular robots using reconfigurable joint-space joysticks (JoJo) and real-time optimization. We demonstrate this system on two different robots, Mori3 and Roombots. The video shows examples of these robots performing object manipulation, locomotion, human assistance, and reconfiguration, controlled by our system.

[ EPFL Reconfigurable Robotics Lab ] via [ Nature Communications ]

Quadrotor Biplane Tailsitter (QBiT) UAVs at four different sizes (4, 12, 25, and 50 lbs) developed at Texas A&M University. QBiT combines the mechanical simplicity of a quadrotor drone with the cruise efficiency of a fixed-wing aircraft.

[ Texas A&M University ]

There’s a new DARPA challenge for “novel drone designs that can carry payloads more than four times their weight, which would revolutionize the way we use drones across all sectors.”

[ DARPA ]

Here are a couple of plenary and keynote talks from IROS 2025, from Marco Hutter and Karinne Ramirez Amaro.

[ IROS 2025 ]
- Video Friday: Humans and Robots Team Up in Battlefield Triage by Evan Ackerman on January 23, 2026 at 5:00 pm
Enjoy today’s videos!

One of my favorite parts of robotics is watching research collide with non-roboticists in the real (or real-ish) world.

[ DARPA ]

Spot will put out fires for you. Eventually. If it feels like it.

[ Mechatronic and Robotic Systems Laboratory ]

All those robots rising out of their crates is not sinister at all.

[ LimX ]

The Lynx M20 quadruped robot recently completed an extreme cold-weather field test in Yakeshi, Hulunbuir, operating reliably in temperatures as low as –30 °C.

[ DEEP Robotics ]

This is a teaser video for KIMLAB’s new teleoperation robot. For now, we invite you to enjoy the calm atmosphere, with students walking, gathering, and chatting across the UIUC Main Quad—along with its scenery and ambient sounds, without any technical details. More details will be shared soon. Enjoy the moment.

The most incredible part of this video is that they have publicly available power in the middle of their quad.

[ KIMLAB ]

For the eleventy-billionth time: Just because you can do a task with a humanoid robot doesn’t mean you should do a task with a humanoid robot.

[ UBTECH ]

I am less interested in this autonomous urban delivery robot and more interested in whatever that docking station is at the beginning that loads the box into it.

[ KAIST ]

Okay, so figuring out where Spot’s face is just got a lot more complicated.

[ Boston Dynamics ]

An undergraduate team at HKU’s Tam Wing Fan Innovation Wing developed CLIO, an embodied tour-guide robot, in just months.
Built on LimX Dynamics TRON 1, it uses LLMs for tour planning, computer vision for visitor recognition, and a laser pointer/expressive display for engaging tours.[ CLIO ]The future of work is doing work so that robots can then do the same work, except less well.[ AgileX ]
- Video Friday: Bipedal Robot Stops Itself From Falling, by Evan Ackerman on January 16, 2026 at 6:30 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2026: 1–5 June 2026, VIENNA

Enjoy today’s videos!

This is one of the best things I have ever seen.

[ Kinetic Intelligent Machine LAB ]

After years of aggressive testing and pushing the envelope with U.S. Army and Marine Corps partners, the Robotic Autonomy in Complex Environments with Resiliency (RACER) program approaches its conclusion. But the impact of RACER will reverberate far beyond the program’s official end date, leaving a legacy of robust autonomous capabilities ready to transform military operations and inspire a new wave of private-sector investment.

[ DARPA ]

Best-looking humanoid yet.

[ Kawasaki ]

COSA (Cognitive OS of Agents) is a physical-world-native Agentic OS that unifies high-level cognition with whole-body motion control, enabling humanoid robots to think while acting in real environments. Powered by COSA, Oli becomes the first humanoid agent with both advanced loco-manipulation and high-level autonomous cognition.

[ LimX Dynamics ]

Thanks, Jinyan!

The 1X World Model’s latest update is a paradigm shift in robot learning: NEO now uses a physics-grounded video model (World Model) to turn any voice or text prompt into fully autonomous action, even for completely novel tasks and objects NEO has never seen before. By leveraging internet-scale video data fine-tuned on real robot experience, NEO can visualize future actions, predict outcomes, and execute them with humanlike understanding–all without prior examples. This marks the critical first step in NEO being able to collect data on its own to master new tasks all by itself.

[ 1X ]

I’m impressed by the human who was mocapped for this.

[ PNDbotics ]

We introduce the GuideData Dataset, a collection of qualitative data, focusing on the interactions between guide dog trainers, blind and low-vision (BLV) individuals, and their guide dogs. The dataset captures a variety of real-world scenarios, including navigating sidewalks, climbing stairs, crossing streets, and avoiding obstacles. By providing this comprehensive dataset, the project aims to advance research in areas such as assistive technologies, robotics, and human-robot interaction, ultimately improving the mobility and safety of visually impaired people.

[ DARoS Lab ]

Fourier’s desktop Care-Bot prototype is gaining much attention at CES 2026! Even though it’s still in the prototype stage, we couldn’t wait to share these adorable and fun interaction features with you.

[ Fourier ]

Volcanic gas measurements are critical for understanding eruptive activity. However, harsh terrain, hazardous conditions, and logistical constraints make near-surface data collection extremely challenging. In this work, we present an autonomous legged robotic system for volcanic gas monitoring, validated through real-world deployments on Mount Etna. The system combines a quadruped robot equipped with a quadrupole mass spectrometer and a modular autonomy stack, enabling long-distance missions in rough volcanic terrain.

[ ETH Zurich RSL ]

Humanoid and Siemens successfully completed a POC testing humanoid robots in industrial logistics. This is the first step in the broader partnership between the companies. The POC focused on a tote-to-conveyor destacking task within Siemens’s logistics process. HMND 01 autonomously picked, transported, and placed totes in a live production environment during a two-week on-site deployment at the Siemens Electronics Factory in Erlangen.

[ Humanoid ]

Four Growers, a category leader in intelligent ag-tech platforms, developed the GR-200 robotic harvesting platform, powered by FANUC’s LR Mate robot. The system combines AI-driven vision and motion planning to identify and harvest ripe tomatoes with quick precision.

[ FANUC ]

Columbia Engineers built a robot that, for the first time, is able to learn facial lip motions for tasks such as speech and singing. In a new study published in Science Robotics, the researchers demonstrate how their robot used its abilities to articulate words in a variety of languages, and even sing a song out of its AI-generated debut album, “hello world_.” The robot acquired this ability through observational learning rather than via rules. It first learned how to use its 26 facial motors by watching its own reflection in the mirror before learning to imitate human lip motion by watching hours of YouTube videos.

[ Columbia ]

Roborock has some odd ideas about what lawns are like.

[ Roborock ]

DEEP Robotics’ quadruped robots demonstrate coordinated multi-module operations under unified command, tackling complex and dynamic firefighting scenarios with agility and precision.

[ DEEP Robotics ]

Unlike statically stable wheeled platforms, humanoids are dynamically stable, requiring continuous active control to maintain balance and prevent falls. This inherent instability presents a critical challenge for functional safety, particularly in collaborative settings. This presentation will introduce Synapticon’s POSITRON platform, a comprehensive solution engineered to address these safety-critical demands. We will explore how its integrated hardware and software enable robust, certifiable safety functions that meet the highest industrial standards, providing key insights into making the next generation of humanoid robots safe for real-world deployment.

[ Synapticon ]

The University of California, Berkeley, is world-famous for its AI developments, and one big name behind them is Ken Goldberg. Longtime professor and lifelong artist, Ken is all about deep learning while staying true to “good old-fashioned engineering.” Hear Ken talk about his approach to vision and touch for robotic surgeries and how robots will evolve across the board.

[ Waymo ]
- Video Friday: Robots Are Everywhere at CES 2026, by Evan Ackerman on January 9, 2026 at 6:00 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2026: 1–5 June 2026, VIENNA

Enjoy today’s videos!

We’re excited to announce the product version of our Atlas® robot. This enterprise-grade humanoid robot offers impressive strength and range of motion, precise manipulation, and intelligent adaptability—designed to power the new industrial revolution.

[ Boston Dynamics ]

I appreciate the creativity and technical innovation here, but realistically, if you’ve got more than one floor in your house? Just get a second robot. That single-step sunken living room though....

[ Roborock ]

Wow, SwitchBot’s CES 2026 video shows almost as many robots in their fantasy home as I have in my real home.

[ SwitchBot ]

What is happening in robotics right now that I can derive more satisfaction from watching robotic process automation than I can from watching yet another humanoid video?

[ ABB ]

Yes, this is definitely a robot I want in close proximity to my life.

[ Unitree ]

The video below demonstrates a MenteeBot learning, through mentoring, how to replace a battery in another MenteeBot. No teleoperation is used.

[ Mentee Robotics ]

Personally, I think we should encourage humanoid robots to fall much more often, just so we can see whether they can get up again.

[ Agility Robotics ]

Achieving long-horizon, reliable clothing manipulation in the real world remains one of the most challenging problems in robotics. This live test demonstrates a strong step forward in embodied intelligence, vision-language-action systems, and real-world robotic autonomy.

[ HKU MMLab ]

Millions of people around the world need assistance with feeding. Robotic feeding systems offer the potential to enhance autonomy and quality of life for individuals with impairments and reduce caregiver workload. However, their widespread adoption has been limited by technical challenges such as estimating bite timing, the appropriate moment for the robot to transfer food to a user’s mouth. In this work, we introduce WAFFLE: Wearable Approach For Feeding with LEarned Bite Timing, a system that accurately predicts bite timing by leveraging wearable sensor data to be highly reactive to natural user cues such as head movements, chewing, and talking.

[ CMU RCHI ]

Humanoid robots are now available as platforms, which is a great way of sidestepping the whole practicality question.

[ PNDbotics ]

We’re introducing Spatially Enhanced Recurrent Units (SRUs)—a simple yet powerful modification that enables robots to build implicit spatial memories for navigation. Published in the International Journal of Robotics Research (IJRR), this work demonstrates up to +105 percent improvement over baseline approaches, with robots successfully navigating 70+ meters in the real world using only a single forward-facing camera.

[ ETHZ RSL ]

Looking forward to the DARPA Triage Challenge this fall!

[ DARPA ]

Here are a couple of good interviews from the Humanoids Summit 2025.

[ Humanoids Summit ]
- Video Friday: Watch Scuttle Evolve, by Evan Ackerman on January 2, 2026 at 6:00 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2026: 1–5 June 2026, VIENNA

Enjoy today’s videos!

I always love seeing robots progress from research projects to commercial products.

[ Ground Control Robotics ]

Well, this has to be one of the most “watch a robot do this task entirely through the magic of jump cuts” videos I’ve ever seen.

[ UBTECH ]

Very satisfying sound on this one.

[ Pudu Robotics ]

Welcome to the AgileX Robotics Data Collection Facility, where real robots build the foundation for universal embodied intelligence. Our core mission? Enable large-scale data sharing and reuse across dual-arm teleoperation robots of diverse morphologies, breaking down data silos that slow down AI progress.

[ AgileX ]

I’m not sure how much thought was put into this, but giving a service robot an explicit cat face could be a good way of moderating expectations on its behavior and interactivity.

[ Pudu Robotics ]

UBTECH says they have built 1,000 of their Walker S2 humanoid robots, over 500 of which are “delivered & working.” I would very much like to know what “working” means in this context.

[ UBTECH ]

Every story has its beginning, and ours started in 2023—a year defined by the unknown. Let technology return to passion; let trials catalyze evolution. Embracing growth, embarking on a new journey. We’ll see you at the next stop.

Please, please hire someone to do some HRI (human-robot interaction) design.

[ PNDbotics ]
- Tech to Track in 2026, by Harry Goldstein on January 1, 2026 at 3:00 pm
Every September as we plan our January tech forecast issue, IEEE Spectrum’s editors survey their beats and seek out promising projects that could solve seemingly intractable problems or transform entire industries.

Often these projects fly under the radar of the popular technology press, which these days seems more interested in the personalities driving Big Tech companies than in the technology itself. We go our own way here, getting out into the field to bring you news of the hidden gems that genuinely—as the IEEE motto goes—advance technology for the benefit of humanity.

A look back at the last 20 years of January issues reveals that while we’ve certainly covered our share of huge tech projects, like the James Webb Space Telescope, many of the stories touch on subjects most people would have otherwise missed.

Last January, Senior Associate Editor Emily Waltz reported on startups that are piloting ocean-based carbon capture. This issue, she’s back with another CO2-centric story, this time focused on grid-scale storage, which is poised to blow up—literally. Waltz traveled to Sardinia to check out Milan-based Energy Dome’s “bubble battery,” which can store up to 200 megawatt-hours by compressing and decompressing pure carbon dioxide inside an inflatable dome.

This kind of modular, easy-to-deploy energy storage could be especially useful for AI data centers, says Senior Editor Samuel K. Moore, who curated this issue and wrote about gravity energy storage back in January 2021. “When we think about energy storage, our minds usually go to grid-scale batteries,” Moore says. “Yet these bubbles, which are in many ways more capable than batteries, will be sprouting up all over the place, often in association with computing infrastructure.”

For his story in this issue, Moore dove into the competition between two startups that are developing radio-based cables to replace conventional copper cables and fiber optics in data centers. These radio systems can connect processors 10 to 20 meters apart using a third of the power of optical-fiber cables and at a third of the cost. The next step is to integrate the radio connections directly with GPUs, to ease cooling burdens and help data centers and the AI models running on them continue to scale up.

Big bubbles could help with grid-scale storage; tiny bubbles can liquefy cancer tumors, as Greg Uyeno found when reporting on HistoSonics’ ultrasound treatment. Feared for its aggressive nature and extremely low survival rate, pancreatic cancer kills almost half a million people per year worldwide. HistoSonics uses noninvasive, focused ultrasound to create cavitation bubbles that destroy tumors without dangerously heating surrounding tissue. This year, the company is concluding kidney trials as well as launching pancreatic cancer trials.

Over the last two decades, Spectrum has regularly covered the rise of drones. In 2018, for instance, we reported that the startup Zipline would deploy autonomous drones to deliver blood and medical supplies in rural Rwanda. Today, Zipline has a market cap of about US $4 billion and operates in several African countries, Japan, and the United States, having completed almost 2 million drone deliveries. In this issue, journalist Robb Mandelbaum takes us inside the Wildfire XPrize competition, aimed at providing another life-saving service: dousing wildfires before they grow out of control. Zipline succeeded because it could make deliveries to remote locations much faster than land vehicles. This year’s XPrize teams plan to detect and suppress fires faster than conventional firefighting methods.

In addition to these emerging technologies, we’ve packed this issue with a dozen others, including Porsche’s wireless home charger for EVs, the world’s first electric air taxi service, neutral-atom quantum computers, interoperable mesh networks, and robotic baseball umpires. Let’s see which of this year’s picks make it to the big leagues.
- Teams of Robots Compete to Save Lives on the Battlefield, by Evan Ackerman on December 31, 2025 at 1:00 pm
Last September, the Defense Advanced Research Projects Agency (DARPA) unleashed teams of robots on simulated mass-casualty scenarios, including an airplane crash and a night ambush. The robots’ job was to find victims and estimate the severity of their injuries, with the goal of helping human medics get to the people who need them the most.

Kimberly Elenberg is a principal project scientist with the Auton Lab of Carnegie Mellon University’s Robotics Institute. Before joining CMU, Elenberg spent 28 years as an army and U.S. Public Health Service nurse, which included 19 deployments and serving as the principal strategist for incident response at the Pentagon.

The final event of the DARPA Triage Challenge will take place in November, and Team Chiron from Carnegie Mellon University will be competing, using a squad of quadruped robots and drones. The team is led by Elenberg, whose career took her from combat surgical teams to incident response strategy at the Pentagon.

Why do we need robots for triage?

Kimberly Elenberg: We simply do not have enough responders for mass-casualty incidents. The drones and ground robots that we’re developing can give us the perspective that we need to identify where people are, assess who’s most at risk, and figure out how responders can get to them most efficiently.

When could you have used robots like these?

Elenberg: On the way to one of the challenge events, there was a four-car accident on a back road. For me on my own, that was a mass-casualty event. I could hear some people yelling and see others walking around, and so I was able to reason that those people could breathe and move.

In the fourth car, I had to crawl inside to reach a gentleman who was slumped over with an occluded airway. I was able to lift his head until I could hear him breathing. I could see that he was hemorrhaging and feel that he was going into shock because his skin was cold. A robot couldn’t have gotten inside of the car to make those assessments.

This challenge involves enabling robots to remotely collect this data—can they detect heart rate from changes in skin color or hear breathing from a distance? If I’d had these capabilities, it would have helped me identify the person at greatest risk and gotten to them first.

How do you design tech for triage?

Elenberg: The system has to be simple. For example, I can’t have a device that’s going to force a medic to take their hands away from their patient. What we came up with is a vest-mounted Android phone that flips down at chest height to display a map that has the GPS location of all of the casualties on it and their triage priority as colored dots, autonomously populated from the team of robots.

Are the robots living up to the hype?

Elenberg: From my time in service, I know the only way to understand true capability is to build it, test it, and break it. With this challenge, I’m learning through end-to-end systems integration—sensing, communications, autonomy, and field testing in real environments. This is art and science coming together, and while the technology still has limitations, the pace of progress is extraordinary.

What would be a win for you?

Elenberg: I already feel like we’ve won. Showing responders exactly where casualties are and estimating who needs attention most—that’s a huge step forward for disaster medicine. The next milestone is recognizing specific injury patterns and the likely life-saving interventions needed, but that will come.

This article appears in the January 2026 print issue as “Kimberly Elenberg.”
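Elenberg describes a vest display that plots each casualty’s GPS position as a colored dot and orders them by triage priority. A minimal sketch of that data flow in Python, using the conventional START triage color scheme; the field names, categories, and sort order here are illustrative assumptions, not Team Chiron’s actual software:

```python
from dataclasses import dataclass

# Conventional START triage categories and their display colors.
# These names and the priority ordering are illustrative only.
TRIAGE_COLORS = {
    "immediate": "red",     # highest priority
    "delayed": "yellow",
    "minor": "green",
    "expectant": "black",
}
TRIAGE_PRIORITY = {"immediate": 0, "delayed": 1, "minor": 2, "expectant": 3}

@dataclass
class Casualty:
    lat: float
    lon: float
    category: str  # one of the TRIAGE_COLORS keys

def map_markers(casualties):
    """Return (lat, lon, color) tuples sorted by triage priority,
    i.e. the order a medic would visit the dots on the display."""
    ordered = sorted(casualties, key=lambda c: TRIAGE_PRIORITY[c.category])
    return [(c.lat, c.lon, TRIAGE_COLORS[c.category]) for c in ordered]

markers = map_markers([
    Casualty(40.4433, -79.9436, "minor"),
    Casualty(40.4441, -79.9450, "immediate"),
])
print(markers[0])  # the "immediate" casualty sorts first, colored red
```

The point of the sorted-marker representation is that the medic’s device only needs a thin, declarative feed from the robot team; all perception stays on the robots.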
- First Air Taxi Service to Launch in Dubai in 2026, by Elan Head on December 28, 2025 at 1:00 pm
Summary

- Joby Aviation is realizing Uber’s original “Elevate” dream, moving electric vertical take-off and landing (eVTOL) aircraft from science fiction toward commercial reality.
- By 2026, Joby aims to inaugurate the world’s first integrated air taxi network—in Dubai—leveraging aggressive local infrastructure investment to bypass Western bureaucratic hurdles.
- The plan includes “vertiports” at strategic hubs like Dubai International Airport, creating the essential physical and digital ecosystem required for reliable point-to-point urban flight.
- While facing a cautious FAA in the U.S., Joby will use its Dubai operations to bridge the gap between experimental testing and full-scale passenger operations.

Ten years ago, ride-sharing giant Uber embraced a sci-fi future in which clean, quiet electric aircraft would shuttle passengers around crowded cities. Uber’s well-funded Elevate initiative, which included a white paper and three high-profile annual summits, effectively launched the electric vertical take-off and landing (eVTOL) industry, promising investors, regulators, and the general public that these futuristic flying taxis were “closer than you think.”

At the time, California-based Joby Aviation was still in stealth mode. But behind the scenes, this pioneering eVTOL developer—which has received more than US $3 billion in total funding, including around $900 million from Toyota—was playing a major role in shaping Uber’s vision. It later stepped in to keep that vision alive, acquiring the Elevate program in 2020 after Uber CEO Dara Khosrowshahi decided to axe it.

Now, Joby, which was founded in 2009 and has become the dominant eVTOL startup, says it is finally on the verge of making “urban air mobility” a reality. It plans to conduct its first passenger flights in 2026 in Dubai, United Arab Emirates.

This article is part of our special report Top Tech 2026.

“Dubai continues to be our global launchpad for commercial service, and our progress here is a testament to the UAE’s visionary approach to advanced air mobility,” says Anthony Khoury, Joby’s UAE general manager, in an email interview. “Dubai is on track to be the first city in the world to offer a fully integrated, premium air taxi network, and we are sprinting toward that target.”

Joby Struck a Six-Year Exclusive Deal with Dubai

The company first announced its UAE plans at the World Governments Summit in Dubai in February 2024, striking a deal with Dubai’s Roads and Transport Authority (RTA) that gives it an exclusive right to operate air taxis there for six years from the launch of commercial operations.

Joby also signed an agreement with U.K.-based Skyports to design, build, and operate four “vertiport” sites in Dubai—places for the eVTOL aircraft to load and unload passengers and charge their batteries. The first vertiport will be near Dubai International Airport, with additional ones planned for Dubai Mall, the Atlantis the Royal resort, and American University in Dubai.

Joby won’t be the first eVTOL developer to carry passengers. That distinction goes to China’s EHang, which is already conducting limited sightseeing and demonstration flights with its two-seat, autonomous electric multicopters. (Joby’s aircraft are piloted.) If Joby pulls off its goal, however, it will be the first to routinely fly passengers from point to point over urban traffic, in keeping with Uber Elevate’s original vision. Its exclusive agreement in Dubai will help fortify its lead in the global race to commercialize electric air taxis, which includes a handful of other Western eVTOL developers, plus a growing number of Chinese players. Besides its Dubai deal, Joby also has a partnership with Delta to start an airport shuttle service in the United States.

The Joby S4 electric vertical takeoff and landing (eVTOL) aircraft has six electric motors, each weighing 28 kilograms and capable of a peak output of 236 kilowatts. Joby Aviation

Operating a reliable air taxi service is a demanding proposition that will require Joby’s aircraft, charging infrastructure, and scheduling software to perform safely and reliably day in and day out. Since every new and complex technology has teething problems, Joby envisions fairly limited initial operations in 2026.

“We will transition from test flights to more complex proving runs and eventually nonpaying passenger flights out of the completed vertiports, ensuring a seamless passenger experience ahead of full commercial launch,” says Khoury. He adds that Joby is currently working with Skyports to ready its initial vertiports and with government agencies in Dubai and the UAE to receive the necessary approvals for its operations.

“Dubai’s approach is deeper and more comprehensive than what you see in many of the headlines,” says Clint Harper, an aviation infrastructure and policy advisor who recently participated in an advanced air mobility workshop with Dubai’s RTA. “In our workshop,” he says, “the RTA staff had fantastic questions and concerns regarding safety, security, and system-level integration. Everyone recognized and appreciated strong government support and wanted to deliver the right system solution, not just a one-off demo. I was thoroughly impressed and inspired.”

Initial Air Operations Will Precede an Airworthiness Certificate

Notably, all of this groundwork is taking place in advance of Joby receiving an initial type certificate for its aircraft from the U.S. Federal Aviation Administration. In the United States (and elsewhere), a type certificate is typically a prerequisite for conducting commercial operations with paying passengers. Joby claims it’s making good progress toward FAA certification, but how quickly (or slowly) that process moves is largely out of its hands. In recent years, the FAA has been taking longer to certify even conventional airplanes and helicopters, which the industry blames on staffing shortages at the agency and more cautious decision-making in the wake of the Boeing 737 Max crisis.

This perception that certification delays have more to do with bureaucracy than safety may be why Dubai is willing to approve some early operations by Joby in advance of FAA type certification. Interestingly, the United States is now following the UAE’s example. In September, the FAA and U.S. Department of Transportation began soliciting proposals for an eVTOL Integration Pilot Program (eIPP), which will select at least five projects to demonstrate eVTOL operations in the national airspace starting as early as summer 2026.

The FAA has stated that the eIPP won’t allow eVTOL developers to bypass certification requirements or carry paying passengers. However, it will enable them to undertake additional testing and demonstration flights as a stepping-stone to commercial operations. Joby says it’s planning to take part in the eIPP, meaning its air taxis could also be flying over U.S. cities in 2026—even if the only person on board is the pilot.
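As a rough sanity check on the S4 figures quoted above (six motors, each 28 kg with a 236 kW peak), the aggregate peak power and per-motor specific power fall out directly; this is back-of-the-envelope arithmetic on the article’s numbers, not data from Joby:

```python
# S4 motor figures as quoted in the article above.
n_motors = 6
mass_kg = 28.0     # per motor
peak_kw = 236.0    # per motor, peak output

total_peak_kw = n_motors * peak_kw   # aggregate peak across all six motors
specific_power = peak_kw / mass_kg   # peak kW per kg of motor mass

print(total_peak_kw)              # 1416.0 kW aggregate peak
print(round(specific_power, 1))   # ~8.4 kW/kg per motor
```

That roughly 8 kW/kg figure is the kind of motor specific power that distinguishes eVTOL propulsion from conventional industrial motors.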
- Video Friday: Holiday Robot Helpers Send Season’s Greetings, by Evan Ackerman on December 26, 2025 at 6:30 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2026: 1–5 June 2026, VIENNA

Enjoy today’s videos!

Happy Holidays from Boston Dynamics!

I would pay any amount of money for that lamp.

[ Boston Dynamics ]

What if evolution wasn’t carbon-based—but metal instead? This short film explores an alternative, iron-based evolution through robots, simulation, and real-world machines. Inspired by biological evolution, this Christmas lab film imagines a world where machines evolve instead of organisms.

[ ETH Zurich Robotics System Lab ]

Happy Holidays from FieldAI!

[ FieldAI ]

Happy Holidays from the Institute of Robotics and Machine Intelligence at Poznan University of Technology!

[ Poznan University of Technology IRMI ]

Happy Holidays from BruBotics!

[ AugmentX ]

Thanks, Bram!

[ Humanoid ]

Check out how SCUTTLE tackles the dull, dirty, and dangerous tasks of the pest control industry.

[ Ground Control Robotics ]

Happy Holidays from LimX Dynamics!

[ LimX Dynamics ]

Happy (actually maybe not AI?) Holidays from Kawasaki Robotics!

[ Kawasaki Robotics ]

Happy Holidays from AgileX Robotics!

[ AgileX Robotics ]

Big news: Badminton just got a new training partner. Our humanoid robot can rally with a human in continuous exchanges, combining fast returns with stable movement. Peak return speed reaches 19.1 m/s.

[ Phybot ]

Well, here’s one way of deploying a legged robot.

[ Kepler ]

Today, we present the world’s first demo video of a full-size robot taking on the challenging Charleston dance.

[ PNDbotics ]

The DR02 humanoid robot from DEEP Robotics showcases remarkable versatility and agility. From the graceful flow of Tai Chi to the energetic moves of street dance, DR02 combines precision, strength, and artistry with ease!

[ Deep Robotics ]

Decreasing the Cost of Morphing in Adaptive Morphogenetic Robots: By using kirigami laminar jamming flippers, the Jamming Amphibious Robotic Turtle (JART) can quickly morph its limbs to adapt to changing terrain. This pneumatic layer jamming technology enables multi-environment locomotion on land and water by changing the robot’s flipper shape and stiffness to decrease the cost of transport.

[ Paper ]

Super Odometry is a resilient sensor-fusion framework that delivers accurate, real-time state estimation in challenging environments by integrating external and inertial sensing. For decades, SLAM has depended on external sensors like cameras and lidar. We argue it’s time to reverse this hierarchy: True robustness begins from within. By placing inertial sensing at the core of state estimation, robots gain an inner sense of motion. We believe in systems that not only see, but also feel, learn, and adapt.

[ AirLab ]
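Super Odometry’s pitch of putting inertial sensing at the core of state estimation can be illustrated with a toy one-dimensional complementary filter: dead-reckon from the IMU at high rate, then softly correct the accumulated drift whenever an external fix (camera or lidar pose) arrives. This is only a sketch of the general idea, not the AirLab team’s actual algorithm:

```python
# Toy 1-D IMU-centric fusion: high-rate inertial prediction with
# low-rate external correction (a complementary filter).
def imu_predict(pos, vel, accel, dt):
    """Dead-reckon position/velocity from one accelerometer sample
    (semi-implicit Euler: update velocity, then position)."""
    vel += accel * dt
    pos += vel * dt
    return pos, vel

def external_correct(pos, measured_pos, gain=0.2):
    """Blend in an external position fix. A gain < 1 keeps the IMU
    estimate dominant and rejects jitter in the measurement."""
    return pos + gain * (measured_pos - pos)

pos, vel = 0.0, 0.0
for _ in range(100):  # 1 s of IMU samples at 100 Hz, constant 1 m/s^2
    pos, vel = imu_predict(pos, vel, accel=1.0, dt=0.01)
pos = external_correct(pos, measured_pos=0.51)  # one camera/lidar fix
print(round(pos, 3))
```

The inertial loop runs every sample; the external sensor only nudges the state when it has something to say, which is the hierarchy reversal the blurb describes.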
A digital outlet specializing in robotics, artificial intelligence, automation, and Industry 4.0 in Latin America.
Coverage across Argentina, Uruguay, Brazil, Chile, and the wider region, in Spanish, Portuguese, and English.