GenDexHand: Generative Simulation for Dexterous Hands
Reading time: 2 minutes
...
📝 Original Info
- Title: GenDexHand: Generative Simulation for Dexterous Hands
- ArXiv ID: 2511.01791
- Date: 2025-11-03
- Authors: Not listed in the provided metadata (check the original PDF or the arXiv page if needed)
📝 Abstract
Data scarcity remains a fundamental bottleneck for embodied intelligence. Existing approaches use large language models (LLMs) to automate gripper-based simulation generation, but they transfer poorly to dexterous manipulation, which demands more specialized environment design. Meanwhile, dexterous manipulation tasks are inherently more difficult due to their higher degrees of freedom. Massively generating feasible and trainable dexterous hand tasks remains an open challenge. To this end, we present GenDexHand, a generative simulation pipeline that autonomously produces diverse robotic tasks and environments for dexterous manipulation. GenDexHand introduces a closed-loop refinement process that adjusts object placements and scales based on vision-language model (VLM) feedback, substantially improving the average quality of generated environments. Each task is further decomposed into sub-tasks to enable sequential reinforcement learning, reducing training time and increasing success rates. Our work provides a viable path toward scalable training of diverse dexterous hand behaviors in embodied intelligence by offering a simulation-based solution to synthetic data generation. Our website: https://winniechen2002.github.io/GenDexHand/.
💡 Deep Analysis
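The abstract describes a closed-loop refinement step in which a VLM critiques a generated scene and the pipeline adjusts object placements and scales until the environment is judged feasible. The sketch below illustrates that general control flow only; the class and function names (`Environment`, `generate_environment`, `vlm_feedback`, `refine_environment`) are assumptions for illustration and do not reflect the paper's actual implementation or any real API.

```python
# Hypothetical sketch of a GenDexHand-style closed-loop refinement step.
# All names below are illustrative assumptions, not the paper's code.
import random
from dataclasses import dataclass, field


@dataclass
class SceneObject:
    name: str
    position: tuple  # (x, y, z) in meters
    scale: float


@dataclass
class Environment:
    task: str
    objects: list = field(default_factory=list)


def generate_environment(task: str) -> Environment:
    """Stand-in for the LLM-based task and scene proposal step."""
    return Environment(
        task=task,
        objects=[
            SceneObject("bottle", position=(0.30, 0.00, 0.05), scale=1.0),
            SceneObject("cap", position=(0.30, 0.00, 0.15), scale=1.0),
        ],
    )


def vlm_feedback(env: Environment) -> dict:
    """Stand-in for rendering the scene and querying a VLM for critique.

    A real pipeline would render images of `env` and ask a vision-language
    model whether objects are reachable, non-overlapping, and plausibly
    scaled for a dexterous hand; here we fake a score and suggested edits.
    """
    score = random.uniform(0.0, 1.0)
    edits = {"bottle": {"scale": 0.9}} if score < 0.8 else {}
    return {"score": score, "edits": edits}


def refine_environment(env: Environment, max_iters: int = 5,
                       threshold: float = 0.8) -> Environment:
    """Closed loop: apply VLM-suggested placement/scale edits until the
    scene quality score passes a threshold or iterations run out."""
    for _ in range(max_iters):
        feedback = vlm_feedback(env)
        if feedback["score"] >= threshold:
            break
        for obj in env.objects:
            edit = feedback["edits"].get(obj.name, {})
            obj.scale *= edit.get("scale", 1.0)
            if "position" in edit:
                obj.position = edit["position"]
    return env


if __name__ == "__main__":
    env = refine_environment(generate_environment("open the bottle cap"))
    print(env)
```

In the paper's pipeline, the accepted environment would then be decomposed into sub-tasks that are trained sequentially with reinforcement learning; that stage is not sketched here.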
📄 Full Content
Reference
This content is AI-processed based on open access ArXiv data.