
Implement GNOT (General Neural Operator Transformer) #117

@ChrisRackauckas-Claude

Summary

Implement GNOT, a general-purpose transformer-based neural operator with heterogeneous cross-attention and linear complexity.

Reference

  • Hao et al., "GNOT: A General Neural Operator Transformer for Operator Learning," ICML 2023.

Description

GNOT uses heterogeneous normalized cross-attention to handle multiple types of input conditions (initial conditions, boundary conditions, forcing terms, PDE coefficients) within a single architecture. Its attention mechanism achieves linear complexity in the number of grid points, making it scalable to large grids. It is designed as a general-purpose operator-learning framework rather than one specialized to a particular PDE family.
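The linear-complexity idea can be sketched as follows: instead of forming the full Nq×Nk attention matrix, queries and keys are normalized separately and K^T V is computed first, so the cost is linear in the number of points. This is an illustrative NumPy simplification under stated assumptions, not the implementation this issue asks for — the function names are hypothetical, and the paper's learned projections, multi-head structure, and mixture-of-experts gating over condition types are omitted (here multiple condition sources are simply averaged).

```python
import numpy as np

def _softmax(x, axis):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def linear_cross_attention(q, k, v):
    """Normalized cross-attention with linear cost in sequence length.

    q: (Nq, d) query features (e.g. output grid points)
    k, v: (Nk, d) key/value features from one input condition
    """
    q_n = _softmax(q, axis=-1)   # each query normalized over features
    k_n = _softmax(k, axis=0)    # keys normalized over the sequence axis
    kv = k_n.T @ v               # (d, d) computed first: O(Nk * d^2)
    return q_n @ kv              # (Nq, d): O(Nq * d^2), never O(Nq * Nk)

def heterogeneous_cross_attention(q, conditions):
    """Attend to several input conditions (ICs, BCs, forcing terms, ...)
    and average the results — a stand-in for GNOT's learned gating."""
    out = np.zeros_like(q)
    for k, v in conditions:
        out += linear_cross_attention(q, k, v)
    return out / len(conditions)
```

Because `kv` is only d×d, each condition source of any length contributes a constant-size summary, which is what makes attending to many heterogeneous inputs on large grids tractable.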
