Optical computing has emerged as a promising technology for next-generation, efficient artificial intelligence (AI) due to its ultra-high speed and efficiency. Electromagnetic field simulation is critical to the design, optimization, and validation of photonic devices and circuits. However, costly numerical simulation significantly hinders scalability and turn-around time in the photonic circuit design loop. Recently, physics-informed neural networks were proposed to predict the optical field solution of a single instance of a partial differential equation (PDE) with predefined parameters. Their complicated PDE formulation and lack of an efficient parametrization mechanism limit their flexibility and generalization in practical simulation scenarios. In this work, for the first time, a physics-agnostic neural operator-based framework, dubbed NeurOLight, is proposed to learn a family of frequency-domain Maxwell PDEs for ultra-fast parametric photonic device simulation. Specifically, we discretize different devices into a unified domain, represent parametric PDEs with a compact wave prior, and encode the incident light via masked source modeling. We design our model with parameter-efficient cross-shaped NeurOLight blocks and adopt superposition-based augmentation for data-efficient learning. With these synergistic approaches, NeurOLight demonstrates a simulation speed two orders of magnitude faster than numerical solvers and outperforms prior NN-based models with ~54% lower prediction error using ~44% fewer parameters.
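The superposition-based augmentation mentioned in the abstract follows from the linearity of the frequency-domain Maxwell equations in the source term: for a fixed device (fixed permittivity distribution), any linear combination of source/field solution pairs is itself a valid solution pair. Below is a minimal sketch of how such an augmentation step could look; the function name superposition_augment, the tensor shapes, and the choice of complex Gaussian mixing coefficients are illustrative assumptions, not the paper's implementation.

```python
import torch


def superposition_augment(sources: torch.Tensor, fields: torch.Tensor, num_mix: int = 2):
    """Mix solved source/field pairs that share the same device permittivity.

    Because the frequency-domain Maxwell equations are linear in the source term,
    a random linear combination of source/field pairs from one device is itself a
    valid solution pair for that device. Shapes are assumed to be (batch, H, W)
    complex tensors, with the whole batch coming from the same device.
    """
    batch = sources.shape[0]
    # Random complex mixing coefficients (an assumed, illustrative distribution).
    coeffs = torch.randn(batch, num_mix, dtype=torch.cfloat)
    # For each new sample, pick num_mix existing samples to superpose.
    idx = torch.randint(0, batch, (batch, num_mix))
    # Combine sources and fields with the same coefficients so each output pair
    # remains a consistent Maxwell solution for the shared device.
    mixed_src = torch.einsum("bm,bmhw->bhw", coeffs, sources[idx])
    mixed_fld = torch.einsum("bm,bmhw->bhw", coeffs, fields[idx])
    return mixed_src, mixed_fld
```

In practice, the mixed pairs would simply be appended to the training batch, enlarging the effective dataset without running any additional numerical simulation.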
Author Information
Jiaqi Gu (The University of Texas at Austin)
Zhengqi Gao (Massachusetts Institute of Technology)
Chenghao Feng (The University of Texas at Austin)
Hanqing Zhu (The University of Texas at Austin)
Ray Chen (The University of Texas at Austin)
Duane Boning (Massachusetts Institute of Technology)
David Pan (The University of Texas at Austin)
Related Events (a corresponding poster, oral, or spotlight)
- 2022 Poster: NeurOLight: A Physics-Agnostic Neural Operator Enabling Parametric Photonic Device Simulation
  Thu. Dec 1 through Fri. Dec 2, Hall J #201
More from the Same Authors
- 2022: An Adversarial Active Sampling-based Data Augmentation Framework for Manufacturable Chip Design
  Mingjie Liu · Haoyu Yang · David Pan · Brucek Khailany · Mark Ren
- 2022: HEAT: Hardware-Efficient Automatic Tensor Decomposition for Transformer Compression
  Jiaqi Gu · Ben Keller · Jean Kossaifi · Anima Anandkumar · Brucek Khailany · David Pan
- 2023 Poster: Pre-RMSNorm and Pre-CRMSNorm Transformers: Equivalent and Efficient Pre-LN Transformers
  Zixuan Jiang · Jiaqi Gu · Hanqing Zhu · David Pan
- 2023 Poster: Nominality Score Conditioned Time Series Anomaly Detection by Point/Sequential Reconstruction
  Chih-Yu Lai · Fan-Keng Sun · Zhengqi Gao · Jeffrey H Lang · Duane Boning
- 2021 Poster: Adjusting for Autocorrelated Errors in Neural Networks for Time Series
  Fan-Keng Sun · Chris Lang · Duane Boning
- 2021 Poster: L2ight: Enabling On-Chip Learning for Optical Neural Networks via Efficient in-situ Subspace Optimization
  Jiaqi Gu · Hanqing Zhu · Chenghao Feng · Zixuan Jiang · Ray Chen · David Pan
- 2020 Poster: Robust Deep Reinforcement Learning against Adversarial Perturbations on State Observations
  Huan Zhang · Hongge Chen · Chaowei Xiao · Bo Li · Mingyan Liu · Duane Boning · Cho-Jui Hsieh
- 2020 Spotlight: Robust Deep Reinforcement Learning against Adversarial Perturbations on State Observations
  Huan Zhang · Hongge Chen · Chaowei Xiao · Bo Li · Mingyan Liu · Duane Boning · Cho-Jui Hsieh
- 2020 Poster: Multi-Stage Influence Function
  Hongge Chen · Si Si · Yang Li · Ciprian Chelba · Sanjiv Kumar · Duane Boning · Cho-Jui Hsieh
- 2019 Poster: Robustness Verification of Tree-based Models
  Hongge Chen · Huan Zhang · Si Si · Yang Li · Duane Boning · Cho-Jui Hsieh