Implement OFormer (Operator Transformer) #119

@ChrisRackauckas-Claude

Summary

Implement OFormer, an attention-based operator learning framework that combines self-attention and cross-attention while making minimal assumptions about how the input and output functions are sampled.

Reference

  • Li et al., "Transformer for Partial Differential Equations' Operator Learning," 2022. arXiv:2205.13671

Description

OFormer uses self-attention to encode the sampled input function, cross-attention to decode predictions at arbitrary query locations, and point-wise MLPs for nonlinear feature extraction. Because it makes few assumptions about the sampling pattern or the query locations, it handles irregularly sampled data naturally.
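The encode/query structure described above can be sketched as follows. This is a minimal, untrained NumPy illustration of the idea (single-head attention, random weights, no learned projections or normalization); all variable names and the toy sine input are assumptions for illustration, not the paper's or this repository's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # scaled dot-product attention: (n_q, d) x (n_k, d) -> (n_q, d_v)
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1) @ v

rng = np.random.default_rng(0)
d = 16                       # latent width (hypothetical choice)
n_in, n_out = 32, 50         # input samples vs. arbitrary query points

# Encode: lift (coordinate, function value) pairs of the input function
# into latent tokens; the sampling grid need not be regular.
x_coords = np.sort(rng.uniform(size=(n_in, 1)), axis=0)
u_vals = np.sin(2 * np.pi * x_coords)              # toy sampled input function
tokens = np.concatenate([x_coords, u_vals], axis=-1) @ rng.normal(size=(2, d))

# Self-attention over the input tokens (with a residual connection).
z = tokens + attention(tokens, tokens, tokens)

# Cross-attention: queries are built purely from output coordinates,
# which can lie anywhere in the domain.
y_coords = rng.uniform(size=(n_out, 1))
q = y_coords @ rng.normal(size=(1, d))
out_latent = attention(q, z, z)

# Point-wise MLP head maps each query's latent features to an output value.
W1, W2 = rng.normal(size=(d, d)), rng.normal(size=(d, 1))
pred = np.maximum(out_latent @ W1, 0.0) @ W2       # shape (n_out, 1)
print(pred.shape)
```

Note how the output resolution (`n_out`) is decoupled from the input sampling (`n_in`): the cross-attention decoder can be queried at any set of locations, which is the flexibility the issue description refers to.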
