Robot Halloween Videos: Humanoids, Hands, Vacuums, and More


Video Friday is your weekly selection of awesome robotics videos, collected by your friends on IEEE Spectrum robotics. We are also publishing a weekly calendar of upcoming robotics events for the coming months. Please send us your events for inclusion.

ICRA 2026: June 1-5, 2026, VIENNA

Enjoy today’s videos!

Happy Halloween from UCL!

[ University College London ]

Happy Halloween from KIMLAB!

[ Kinetic Intelligent Machine Lab ]

Happy Halloween from DRAGON Lab!

[ DRAGON Lab, University of Tokyo ]

Thanks Moju!

Happy Halloween from Agility Robotics!

[ Agility Robotics ]

Happy Halloween from HEBI Robotics!

[ HEBI Robotics ]

You can now pay 1X US $500 per month to have its humanoid robot collect data in your home.

And it’s pretty much what you’d expect:

[ 1X ] via [ WSJ ]

In our test warehouse, we recreate our customers’ inbound operations, from dock layout and conveyors to freight and beyond. Step into our Stretch test facility to learn about the latest developments in warehouse automation and see how we ensure robust, reliable performance in the real world.

[ Boston Dynamics ]

Well, that’s just mean. Important, but nasty.

[ Istituto Italiano di Tecnologia ]

SpikeATac is a multimodal tactile finger combining a taxelized, highly sensitive dynamic transduction method (PVDF) with a static (capacitive) one. Named for its “sharp” response, SpikeATac’s multitaxel PVDF film provides fast, sensitive dynamic signals at the making and breaking of contact, enabling a gripper to stop quickly and gently when grasping fragile and deformable objects.

[ ROAM Lab, Columbia University ]
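The core idea of using the fast dynamic channel to halt a grasp can be sketched in a few lines. Everything here is illustrative, not from the SpikeATac paper: the taxel readings are simulated, and the threshold value is a made-up placeholder.

```python
SPIKE_THRESHOLD = 0.2  # hypothetical units for the dynamic (PVDF) channel


def read_dynamic_taxels(t):
    """Simulated multitaxel PVDF readings: quiet until contact at step 5."""
    if t < 5:
        return [0.01] * 8
    # A contact spike appears on a couple of taxels at the onset of contact.
    return [0.01, 0.35, 0.30, 0.02, 0.01, 0.01, 0.02, 0.01]


def close_until_contact(max_steps=20):
    """Close the gripper step by step; stop at the first contact spike."""
    for t in range(max_steps):
        taxels = read_dynamic_taxels(t)
        if max(taxels) > SPIKE_THRESHOLD:
            return t  # contact onset detected: stop closing here
    return None  # no contact within the step budget


step = close_until_contact()
```

The point of the fast dynamic signal is that this check fires at the very onset of contact, before a static (capacitive) reading would register meaningful force.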

Effective integration of diverse sensory representations is crucial for robust robotic manipulation. However, the typical feature concatenation approach is often suboptimal: dominant modalities such as vision can overwhelm sparse but critical cues like touch in contact-rich tasks, and monolithic architectures cannot flexibly incorporate new or missing modalities without retraining. Our method factorizes the policy into a set of diffusion models, each specialized for a single representation (e.g., vision or touch), and uses a network of routers that learns consensus weights to adaptively combine their contributions, thereby enabling the incremental integration of new representations.

[ GitHub ]

Thanks Haonan!
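The routing idea above, per-modality experts combined by learned consensus weights, can be sketched minimally. This is an illustrative stand-in, not the authors’ implementation: the two “denoisers” are toy functions in place of trained diffusion models, and the router is a fixed random linear map.

```python
import numpy as np

rng = np.random.default_rng(0)


def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()


# Stand-ins for diffusion models specialized to single representations
# (e.g., vision or touch); each maps a noisy action to a noise estimate.
def vision_denoiser(a):
    return 0.9 * a


def touch_denoiser(a):
    return 0.5 * a


denoisers = [vision_denoiser, touch_denoiser]

# Hypothetical router: one logit per modality from the observation.
router_weights = rng.normal(size=(2, 4))


def combine(obs, noisy_action):
    logits = router_weights @ obs             # (2,) one logit per modality
    w = softmax(logits)                       # consensus weights, sum to 1
    preds = np.stack([d(noisy_action) for d in denoisers])  # (2, action_dim)
    return (w[:, None] * preds).sum(axis=0)   # weighted combination


obs = rng.normal(size=4)
noisy_action = rng.normal(size=3)
out = combine(obs, noisy_action)
```

Because each expert is a separate model and the router only mixes their outputs, a new modality can in principle be added by training one more expert and extending the router, without retraining the others.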

General-purpose robots must possess human-like dexterity and agility to perform tasks with the same versatility as we do. A human-like form factor also enables the use of large datasets of human-hand interactions. However, the main bottleneck in dexterous manipulation lies not only in software but arguably even more in hardware. We present the open-source ORCA hand, a reliable, anthropomorphic, tendon-driven, 17-degree-of-freedom robotic hand with integrated tactile sensors that can be fully assembled in less than eight hours for a material cost of under CHF 2,000.

[ ORCA ]

Sarah Sebo, a computer scientist at the University of Chicago, programs robots to give empathetic responses and perform nonverbal social cues like nodding to better build trust and rapport with humans. The aim is to develop robots that can improve the performance of human-robot teams, for example by improving children’s learning outcomes.

[ University of Chicago ]

DJI now has a robot vacuum, which is great. As far as I can tell, we’ve reached the point where almost all robot vacuums are (for better or worse) exactly that: good.

[ DJI ]

This ICRA 2025 keynote, “Powering Robotics with AI,” is delivered by Angela Schoellig of the Technical University of Munich.

[ ICRA 2025 ]

This seminar from Carnegie Mellon University, Robotics Institute (CMU RI) is led by Nancy Pollard, on “Bringing dexterity to the hands of robots in the real world.”

Dexterous manipulation is a big challenge in robotics, and many of the robotic applications we envision require fine manipulation skills. In this overview, I will discuss my views on some major factors that contribute to dexterity and discuss how we can integrate them into our robots and systems.

[ CMU RI ]
