Summary
Implement GNOT, a general-purpose transformer-based neural operator with heterogeneous cross-attention and linear complexity.
Reference
- Hao et al., "GNOT: A General Neural Operator Transformer for Operator Learning," ICML 2023.
Description
GNOT uses heterogeneous normalized cross-attention to handle multiple types of input conditions (initial conditions, boundary conditions, forcing terms, PDE coefficients) within a single architecture. Its attention mechanism has linear rather than quadratic complexity in the number of points, making it scalable to large grids. It is designed as a general-purpose operator learning framework.
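To illustrate how linear complexity can be achieved, here is a minimal NumPy sketch of normalized cross-attention in the style of linear attention: queries and keys are normalized separately (rather than jointly via a full softmax over the N×M score matrix), so the key-value aggregate can be computed once and reused for every query. This is a simplified illustration, not GNOT's exact formulation; the function name and shapes are assumptions for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def linear_cross_attention(q, k, v):
    """Normalized cross-attention with linear cost in sequence length.

    q: (N, d)  query-point features
    k: (M, d)  condition (e.g. boundary/forcing) features
    v: (M, dv) condition values

    Normalizing q and k separately avoids the (N, M) score matrix:
    k.T @ v is aggregated once, then applied to every query.
    """
    q = softmax(q, axis=-1)   # each query's weights over the d features sum to 1
    k = softmax(k, axis=0)    # each feature's weights over the M points sum to 1
    kv = k.T @ v              # (d, dv): cost O(M), independent of N
    return q @ kv             # (N, dv): cost O(N), total O(N + M)
```

Because both normalizations produce convex weights, each output row is a convex combination of the value rows, so a constant value field is reproduced exactly; the quadratic softmax-attention path would need the full (N, M) score matrix to get the same kind of result.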