Poster
DART: Articulated Hand Model with Diverse Accessories and Rich Textures
Daiheng Gao · Yuliang Xiu · Kailin Li · Lixin Yang · Feng Wang · Peng Zhang · Bang Zhang · Cewu Lu · Ping Tan

Tue Nov 29 02:00 PM -- 04:00 PM (PST) @ Hall J #1025

The hand, the bearer of human productivity and intelligence, is receiving increasing attention due to the recent surge of interest in digital twins. Among hand morphable models, MANO has been widely used in the vision and graphics communities. However, MANO disregards textures and accessories, which largely limits its power to synthesize photorealistic hand data. In this paper, we extend MANO with Diverse Accessories and Rich Textures, namely DART. DART is composed of 50 daily 3D accessories that vary in appearance and shape, and 325 hand-crafted 2D texture maps covering different kinds of blemishes and make-ups. A Unity GUI is also provided to generate synthetic hand data with user-defined settings, e.g., pose, camera, background, lighting, textures, and accessories. Finally, we release DARTset, which contains large-scale (800K), high-fidelity synthetic hand images paired with perfectly aligned 3D labels. Experiments demonstrate its superiority in diversity. As a complement to existing hand datasets, DARTset boosts generalization in both hand pose estimation and mesh recovery tasks. Raw ingredients (textures, accessories), the Unity GUI, source code, and DARTset are publicly available at dart2022.github.io.
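For context on the underlying hand model, below is a minimal sketch (not part of the paper or its released code) of posing a MANO hand with the publicly available smplx Python package; DART-style texture maps and accessories would then be applied to the resulting mesh in a renderer. The model path and zero-initialized parameters are illustrative assumptions.

    import torch
    import smplx

    # Assumed: a local "models" directory containing the MANO model files.
    mano = smplx.create(
        model_path="models",
        model_type="mano",
        is_rhand=True,
        use_pca=False,  # use the full 45-D axis-angle hand pose
    )

    betas = torch.zeros(1, 10)         # shape parameters
    global_orient = torch.zeros(1, 3)  # wrist rotation (axis-angle)
    hand_pose = torch.zeros(1, 45)     # 15 joints x 3 axis-angle params

    output = mano(betas=betas, global_orient=global_orient, hand_pose=hand_pose)
    vertices = output.vertices[0]      # (778, 3) hand mesh vertices
    faces = mano.faces                 # triangle indices, usable for rendering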

Author Information

Daiheng Gao (Alibaba XR Lab)
Yuliang Xiu (Max Planck Institute for Intelligent Systems)
Kailin Li (Shanghai Jiao Tong University)
Lixin Yang (Shanghai Jiao Tong University)
Feng Wang (Alibaba Group)
Peng Zhang (University of Science and Technology of China)
Bang Zhang (Alibaba Group)
Cewu Lu (Shanghai Jiao Tong University)
Ping Tan (Simon Fraser University)
