Summary
Implement OFormer, an attention-based operator learning framework using self-attention and cross-attention with minimal assumptions on input/output sampling.
Reference
- Li et al., "Transformer for Partial Differential Equations' Operator Learning," 2022. arXiv:2205.13671
Description
OFormer uses self-attention to encode the input function, cross-attention to query at arbitrary output locations, and point-wise MLPs for nonlinear feature extraction. It makes few assumptions on the sampling pattern or query locations, providing flexibility for irregularly sampled data.
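The encoder/decoder split described above can be sketched in a few lines. This is a minimal, untrained illustration with random weights standing in for learned parameters, not the reference implementation: it embeds (coordinate, value) pairs of the sampled input function, runs one self-attention layer over them, then cross-attends from arbitrary query coordinates and maps the result through a point-wise MLP. All weight names and dimensions here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention(q, k, v):
    # Scaled dot-product attention: (n_q, d) x (n_k, d) -> (n_q, d)
    return softmax(q @ k.T / np.sqrt(q.shape[-1])) @ v

def mlp(x, w1, w2):
    # Point-wise two-layer MLP, applied to each token independently
    return np.maximum(x @ w1, 0.0) @ w2

d = 16  # latent width (illustrative)

def init(shape):
    return rng.standard_normal(shape) / np.sqrt(shape[0])

# Random weights stand in for learned parameters.
W_in = init((2, d))                                    # embeds (coord, value) pairs
W_qc = init((1, d))                                    # embeds query coordinates
Wq, Wk, Wv = init((d, d)), init((d, d)), init((d, d))  # encoder self-attention
Cq, Ck, Cv = init((d, d)), init((d, d)), init((d, d))  # decoder cross-attention
W1, W2 = init((d, d)), init((d, 1))                    # point-wise output MLP

def oformer_sketch(in_coords, in_values, query_coords):
    # Encoder: self-attention over the (irregularly) sampled input function
    tokens = np.stack([in_coords, in_values], axis=-1) @ W_in   # (n_in, d)
    z = tokens + attention(tokens @ Wq, tokens @ Wk, tokens @ Wv)
    # Decoder: cross-attention from arbitrary query locations into the encoding
    q = query_coords[:, None] @ W_qc                            # (n_q, d)
    u = q + attention(q @ Cq, z @ Ck, z @ Cv)
    return mlp(u, W1, W2)[:, 0]                                 # (n_q,)

# Input sampled on an irregular 1D grid; queries at unrelated locations.
x_in = np.sort(rng.uniform(0.0, 1.0, size=40))
u_in = np.sin(2 * np.pi * x_in)
x_q = rng.uniform(0.0, 1.0, size=7)
out = oformer_sketch(x_in, u_in, x_q)
print(out.shape)  # (7,)
```

Note that nothing in the sketch ties the query coordinates to the input sampling pattern, which is the flexibility the description refers to: the same encoding can be queried at any set of output locations.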