
Robot Hands Can Now Feel. Deploying Them Is Another Story.

(1d ago)
Hong Kong
IEEE Spectrum

DAIMON Robotics, a 2.5-year-old Hong Kong company, has released Daimon-Infinity, which it calls the largest omni-modal robotic dataset for physical AI, complete with high-resolution tactile sensing spanning domestic and industrial tasks. The release matters because robot manipulation has long been crippled by a fundamental insensitivity to touch, and DAIMON's hardware—packing over 110,000 effective sensing units into a fingertip-sized sensor—represents a genuine hardware advance. The company has partnered with Google DeepMind, Northwestern University, and NUS, and open-sourced 10,000 hours of data. What remains to be seen is whether any of this translates beyond controlled demonstrations into environments where robots actually work.

A tactile robotic fingertip presses multiple materials while sensor maps light up.📷 AI-generated / Tech&Space

By Dr. Servo Lin, robotics editor. "Has the kind of robot obsession that turns every demo into a stress test."
  • ★Daimon-Infinity contains 10,000 hours of data
  • ★Fingertip sensors pack more than 110,000 effective sensing units
  • ★The real test is manipulation outside the lab

DAIMON Robotics wants to solve the problem that makes most robot videos embarrassing: the moment when a machine grips too hard, too soft, or not at all. The Hong Kong-based company, founded in late 2021, has released Daimon-Infinity, a dataset it describes as the largest omni-modal collection for physical AI, built around its own monochromatic vision-based tactile sensors. Each fingertip-sized unit packs more than 110,000 effective sensing units—density that matters when you're trying to distinguish between a ripe tomato and a stress ball.

The dataset spans tasks from folding laundry to factory assembly lines, and DAIMON has open-sourced 10,000 hours of collected data. Partners including Google DeepMind, Northwestern University, and the National University of Singapore lend credibility to the effort. Co-founder and chief scientist Prof. Michael Yu Wang brings academic robotics pedigree to a team that has otherwise moved fast for a company barely past its Series A.

The hardware itself is where DAIMON distinguishes itself from the dataset crowd. Vision-based tactile sensors aren't new, but the resolution claims here are specific enough to invite measurement, not just awe. Whether that translates to robustness in the field—grease, vibration, temperature swings, the incremental damage of repeated contact—is where lab demos traditionally dissolve.
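To make the measurement angle concrete: vision-based tactile sensors image the deformation of an elastomer skin and convert pixel changes into contact estimates. The sketch below is a generic illustration of that idea, not DAIMON's pipeline; the frame sizes, threshold, and synthetic indentation are all invented for the example.

```python
import numpy as np

def contact_map(ref_frame: np.ndarray, touch_frame: np.ndarray, thresh: float = 10.0):
    """Estimate a contact region from a vision-based tactile sensor.

    ref_frame:   grayscale image of the undeformed gel (H x W)
    touch_frame: grayscale image during contact
    Returns a boolean contact mask, contact area in pixels, and centroid.
    """
    diff = np.abs(touch_frame.astype(float) - ref_frame.astype(float))
    mask = diff > thresh                 # pixels that changed enough to count as contact
    area = int(mask.sum())               # contact area in pixels
    if area == 0:
        return mask, 0, None
    ys, xs = np.nonzero(mask)
    centroid = (float(ys.mean()), float(xs.mean()))  # contact center (row, col)
    return mask, area, centroid

# Synthetic example: a flat reference frame and a pressed 10x10 patch.
ref = np.zeros((64, 64))
touch = ref.copy()
touch[20:30, 30:40] = 50.0               # simulated indentation signal
mask, area, centroid = contact_map(ref, touch)
```

Real sensors add marker tracking and shear estimation on top of this, which is where density claims like 110,000 units per fingertip start to matter: finer spatial resolution means finer contact geometry.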

The hardware exists. The training data exists. The gap between lab bench and loading dock still gapes.

A robotics pipeline turns pressure maps into grasp policies.📷 AI-generated / Tech&Space

The omni-modal framing suggests fusion of tactile, visual, and force data, though exact sensor combinations and synchronization protocols remain unspecified. That's typical for early releases, but it also means researchers downloading the dataset will spend significant time reverse-engineering what was actually collected, and under what conditions. The IEEE Spectrum coverage is sponsored by DAIMON itself, a disclosure that doesn't invalidate the claims but should calibrate expectations about independent validation.
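Synchronization is the unglamorous part of that reverse-engineering: tactile, camera, and force streams typically arrive at different rates, and any fusion scheme has to decide how to pair samples across them. A common baseline, sketched here under invented sample rates and a tolerance that does not reflect anything DAIMON has published, is nearest-timestamp matching:

```python
import numpy as np

def align_streams(tactile_ts: np.ndarray, camera_ts: np.ndarray, max_skew: float = 0.02):
    """Pair each tactile sample with the nearest camera frame in time.

    Returns (tactile_i, camera_j) index pairs whose timestamps differ
    by at most max_skew seconds; unmatched tactile samples are dropped.
    """
    pairs = []
    for i, t in enumerate(tactile_ts):
        j = int(np.argmin(np.abs(camera_ts - t)))   # nearest camera frame
        if abs(camera_ts[j] - t) <= max_skew:
            pairs.append((i, j))
    return pairs

# Hypothetical rates: tactile at 100 Hz, camera at 30 Hz, over one second.
tactile_ts = np.arange(0.0, 1.0, 0.01)
camera_ts = np.arange(0.0, 1.0, 1.0 / 30.0)
pairs = align_streams(tactile_ts, camera_ts, max_skew=0.005)
```

With a tight 5 ms tolerance, most 100 Hz tactile samples find no 30 Hz camera frame and are dropped, which is exactly the kind of detail a dataset release needs to document: what was dropped, interpolated, or held.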

The real signal here is not the dataset size but the strategic bet: DAIMON is building the sensing layer and the training data layer simultaneously, hoping to become the default stack for touch-enabled manipulation. It's a defensible position if the hardware ships reliably and the data proves representative of messy reality. Hotels and convenience stores in China are mentioned as early deployment targets—environments with repetitive handling tasks, controlled lighting, and human supervisors nearby. That's not criticism; it's realism. Those constraints define the actual frontier.

What would make this widely useful is not more data but clearer evidence that the tactile sensing generalizes across surfaces, pressures, and failure modes that don't appear in curated training sets. The open-source release is a genuine contribution. Whether it accelerates embodied AI or merely accelerates conference papers depends on what happens when these systems leave the benchmark behind.
