Summary
Implement the Latent Neural Operator, which uses Physics-Cross-Attention (PhCA) to learn operators in a compressed latent space.
Reference
- Wang et al., "Latent Neural Operator for Solving Forward and Inverse PDE Problems," NeurIPS 2024. arXiv:2406.03923
Description
The Latent Neural Operator transforms input function representations from geometric space into a compressed latent space via Physics-Cross-Attention (PhCA), processes them with latent-space operator layers, and then maps the result back to geometric space. Compared to operators acting in the full geometric space, this reduces GPU memory usage by 50% and speeds up training 1.8x while reaching the highest precision on multiple benchmarks.
Key components:
- Physics-Cross-Attention (PhCA) encoder/decoder between geometric and latent spaces
- Latent-space operator layers (can use FNO or attention)
- Applicable to both forward and inverse problems
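The encode/process/decode pipeline above can be sketched with plain NumPy. This is a minimal illustration under assumptions, not the paper's implementation: `cross_attention` stands in for PhCA, `latent_queries` plays the role of the learnable latent queries that compress N geometric tokens into M latent tokens (M << N), and the latent operator layers are elided. All names and dimensions are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys, values):
    # Single-head scaled dot-product cross-attention.
    # queries: (M, d); keys, values: (N, d) -> output: (M, d)
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)      # (M, N)
    return softmax(scores, axis=-1) @ values    # (M, d)

rng = np.random.default_rng(0)
N, M, d = 1024, 64, 32   # N input points, M latent tokens (M << N), width d

tokens = rng.normal(size=(N, d))          # embedded input function samples
latent_queries = rng.normal(size=(M, d))  # learnable latent queries (encoder side)

# Encode: compress N geometric-space tokens into M latent tokens.
z = cross_attention(latent_queries, tokens, tokens)   # (M, d)

# ... latent-space operator layers would transform z here ...

# Decode: attend from embeddings of output query coordinates back to the latent tokens.
out_queries = rng.normal(size=(N, d))
u = cross_attention(out_queries, z, z)    # (N, d)

print(z.shape, u.shape)  # (64, 32) (1024, 32)
```

Because both encoder and decoder are cross-attention with freely chosen queries, the output locations need not coincide with the input sample locations, which is what makes the same mechanism usable for forward and inverse problems.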