Deep generative models have shown success in generating 3D shapes across a variety of representations. In this work, we propose the Neural Volumetric Mesh Generator (NVMG), which generates novel, high-quality volumetric meshes. Unlike previous 3D generative models that target point clouds, voxels, or implicit surfaces, volumetric meshes are a ready-to-use representation in industry, carrying detail on both the surface and the interior. Generating such highly structured data therefore poses a significant challenge. To tackle this problem, we first use a diffusion-based generative model to produce voxelized shapes with realistic shape and topology information. From the voxelized shape, we can directly obtain a tetrahedral mesh as a template. We then use a voxel-conditional neural network to predict the surface conditioned on the voxels, and progressively project the tetrahedral mesh onto the predicted surface under regularization. As the experiments show, our pipeline generates high-quality, artifact-free volumetric and surface meshes without any post-processing.
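The abstract does not spell out how a tetrahedral template is obtained from the voxelized shape. Below is a minimal sketch of one common way to realize that step, assuming unit-sized voxels and a standard six-tetrahedra split of each occupied cube around its main diagonal; the function name `voxels_to_tet_mesh` and the NumPy-based interface are our own illustration, not the authors' implementation.

```python
import itertools
import numpy as np

def voxels_to_tet_mesh(occ):
    """Convert a boolean occupancy grid into a tetrahedral template mesh.

    Each occupied voxel (a unit cube) is split into 6 tetrahedra sharing
    the cube's main diagonal, so the tet volumes sum to the voxel volume.
    Returns (vertices, tets): a float array (V, 3) and an int array (T, 4).
    """
    # Corner offsets of a unit cube; the corner index encodes (x, y, z)
    # bits, i.e. index = 4*dx + 2*dy + dz.
    corners = list(itertools.product([0, 1], repeat=3))
    # Six tetrahedra around the diagonal from corner 0 to corner 7,
    # one per axis permutation of the lattice path between them.
    CUBE_TETS = [(0, 4, 6, 7), (0, 4, 5, 7), (0, 2, 6, 7),
                 (0, 2, 3, 7), (0, 1, 5, 7), (0, 1, 3, 7)]
    vert_index = {}            # grid coordinate -> vertex id
    verts, tets = [], []
    for x, y, z in zip(*np.nonzero(occ)):
        ids = []
        for dx, dy, dz in corners:
            key = (x + dx, y + dy, z + dz)
            if key not in vert_index:      # share vertices between voxels
                vert_index[key] = len(verts)
                verts.append(key)
            ids.append(vert_index[key])
        for t in CUBE_TETS:
            tets.append([ids[i] for i in t])
    return np.array(verts, dtype=float), np.array(tets, dtype=int)

# Example: two adjacent occupied voxels share four vertices.
occ = np.zeros((2, 2, 2), dtype=bool)
occ[0, 0, 0] = occ[1, 0, 0] = True
verts, tets = voxels_to_tet_mesh(occ)  # 12 vertices, 12 tetrahedra
```

Sharing vertices across neighboring voxels is what makes the result a single conforming volumetric mesh rather than a soup of disconnected cubes, which matters once the template is deformed toward a predicted surface.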
Author Information
Yan Zheng (University of Texas at Austin)
Lemeng Wu (The University of Texas at Austin)
Xingchao Liu (University of Texas, Austin)
Zhen Chen (University of Texas, Austin)
Qiang Liu (Dartmouth College)
Qixing Huang (The University of Texas at Austin)
More from the Same Authors
- 2021 Spotlight: Profiling Pareto Front With Multi-Objective Stein Variational Gradient Descent
  Xingchao Liu · Xin Tong · Qiang Liu
- 2022: BOME! Bilevel Optimization Made Easy: A Simple First-Order Approach
  Mao Ye · Bo Liu · Stephen Wright · Peter Stone · Qiang Liu
- 2022: Diffusion-based Molecule Generation with Informative Prior Bridges
  Chengyue Gong · Lemeng Wu · Xingchao Liu · Mao Ye · Qiang Liu
- 2022: HotProtein: A Novel Framework for Protein Thermostability Prediction and Editing
  Tianlong Chen · Chengyue Gong · Daniel Diaz · Xuxi Chen · Jordan Wells · Qiang Liu · Zhangyang Wang · Andrew Ellington · Alex Dimakis · Adam Klivans
- 2022: First hitting diffusion models
  Mao Ye · Lemeng Wu · Qiang Liu
- 2022: Flow Straight and Fast: Learning to Generate and Transfer Data with Rectified Flow
  Xingchao Liu · Chengyue Gong · Qiang Liu
- 2022: Let us Build Bridges: Understanding and Extending Diffusion Generative Models
  Xingchao Liu · Lemeng Wu · Mao Ye · Qiang Liu
- 2022 Poster: First Hitting Diffusion Models for Generating Manifold, Graph and Categorical Data
  Mao Ye · Lemeng Wu · Qiang Liu
- 2022 Poster: Sampling in Constrained Domains with Orthogonal-Space Variational Gradient Descent
  Ruqi Zhang · Qiang Liu · Xin Tong
- 2022 Poster: BOME! Bilevel Optimization Made Easy: A Simple First-Order Approach
  Bo Liu · Mao Ye · Stephen Wright · Peter Stone · Qiang Liu
- 2022 Poster: Diffusion-based Molecule Generation with Informative Prior Bridges
  Lemeng Wu · Chengyue Gong · Xingchao Liu · Mao Ye · Qiang Liu
- 2021 Poster: Conflict-Averse Gradient Descent for Multi-task Learning
  Bo Liu · Xingchao Liu · Xiaojie Jin · Peter Stone · Qiang Liu
- 2021 Poster: Sampling with Trusthworthy Constraints: A Variational Gradient Framework
  Xingchao Liu · Xin Tong · Qiang Liu
- 2021 Poster: Automatic and Harmless Regularization with Constrained and Lexicographic Optimization: A Dynamic Barrier Approach
  Chengyue Gong · Xingchao Liu · Qiang Liu
- 2021 Poster: argmax centroid
  Chengyue Gong · Mao Ye · Qiang Liu
- 2021 Poster: Profiling Pareto Front With Multi-Objective Stein Variational Gradient Descent
  Xingchao Liu · Xin Tong · Qiang Liu
- 2020 Poster: Implicit Regularization and Convergence for Weight Normalization
  Xiaoxia Wu · Edgar Dobriban · Tongzheng Ren · Shanshan Wu · Zhiyuan Li · Suriya Gunasekar · Rachel Ward · Qiang Liu
- 2020 Poster: Firefly Neural Architecture Descent: a General Approach for Growing Neural Networks
  Lemeng Wu · Bo Liu · Peter Stone · Qiang Liu
- 2020 Poster: Dense Correspondences between Human Bodies via Learning Transformation Synchronization on Graphs
  Xiangru Huang · Haitao Yang · Etienne Vouga · Qixing Huang
- 2020 Poster: Greedy Optimization Provably Wins the Lottery: Logarithmic Number of Winning Tickets is Enough
  Mao Ye · Lemeng Wu · Qiang Liu
- 2019 Poster: Splitting Steepest Descent for Growing Neural Architectures
  Lemeng Wu · Dilin Wang · Qiang Liu
- 2019 Spotlight: Splitting Steepest Descent for Growing Neural Architectures
  Lemeng Wu · Dilin Wang · Qiang Liu
- 2019 Poster: A Condition Number for Joint Optimization of Cycle-Consistent Networks
  Leonidas Guibas · Qixing Huang · Zhenxiao Liang
- 2019 Spotlight: A Condition Number for Joint Optimization of Cycle-Consistent Networks
  Leonidas Guibas · Qixing Huang · Zhenxiao Liang