

Poster in Workshop: New Frontiers in Graph Learning (GLFrontiers)

Bridging the Gap: Towards Flexible, Efficient, and Effective Tensor Product Networks

Nanxiang Wang · Chen Lin · Michael Bronstein · Philip Torr

Keywords: [ efficient GNN ] [ geometric GNN ] [ scalarization ] [ pruning ] [ tensor product ] [ graph neural network ] [ equivariant ]


Abstract:

Geometric graph neural networks have demonstrated exceptional performance in modelling geometric data, a common requirement in natural science research. These models rely heavily on equivariant operations, including scalarization and the Clebsch-Gordan tensor product. However, tensor-product-based architectures face substantial computational challenges as the representation order increases, significantly limiting their versatility. Moreover, the interpretability of interactions between steerable components remains elusive. In contrast, scalarization methods rely on inexpensive invariant scalar functions, yet can still outperform certain tensor-product-based models. To bridge the gap between these approaches, we introduce a conceptual framework that emphasizes the potential flexibility of tensor product networks. To provide guidance for efficient framework design and gain deeper insights into steerable components, we conduct a preliminary investigation by pruning tensor product interactions. This approach enables us to directly assess the redundancy and significance of steerable components, paving the way for future design possibilities.
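As a concrete illustration of what pruning tensor product interactions can look like, the sketch below enumerates the symmetry-allowed Clebsch-Gordan paths between two sets of irreducible representations and rebuilds the tensor product with only a subset of them. It uses the e3nn library; the choice of irreps, the kept-path criterion (retaining only scalar-output paths), and all variable names are illustrative assumptions, not the authors' actual method.

```python
# A minimal sketch of pruning individual Clebsch-Gordan tensor-product paths,
# written with the e3nn library. The irreps, the kept-path selection, and the
# pruning criterion below are illustrative assumptions, not the paper's method.
import torch
from e3nn import o3

irreps_in1 = o3.Irreps("2x0e + 1x1o")   # node features: two scalars + a vector
irreps_in2 = o3.Irreps("1x0e + 1x1o")   # e.g. spherical harmonics of edge directions
irreps_out = o3.Irreps("2x0e + 1x1o")

# Enumerate every symmetry-allowed path (i_in1, i_in2, i_out).
full = o3.FullyConnectedTensorProduct(irreps_in1, irreps_in2, irreps_out)
print(f"{len(full.instructions)} allowed paths")

# Keep only a subset of paths (here: those producing scalar, 0e, outputs),
# emulating the pruning of potentially redundant steerable interactions.
kept = [
    (ins.i_in1, ins.i_in2, ins.i_out, "uvw", True)
    for ins in full.instructions
    if irreps_out[ins.i_out].ir.l == 0
]
pruned = o3.TensorProduct(irreps_in1, irreps_in2, irreps_out, kept)

x1 = irreps_in1.randn(16, -1)  # batch of 16 feature vectors
x2 = irreps_in2.randn(16, -1)
out = pruned(x1, x2)           # still equivariant, but with fewer active paths
print(out.shape)
```

Each path in an e3nn tensor product carries its own learnable weights and its own share of the computation, so dropping a path directly reduces both parameter count and FLOPs while leaving the remaining operation equivariant, which is what makes path-level pruning a natural probe for the redundancy of steerable components.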
