Implement Latent Neural Operator #120

@ChrisRackauckas-Claude

Summary

Implement the Latent Neural Operator, which uses Physics-Cross-Attention (PhCA) to learn operators in a compressed latent space.

Reference

  • Wang et al., "Latent Neural Operator for Solving Forward and Inverse PDE Problems," NeurIPS 2024. arXiv:2406.03923

Description

The Latent Neural Operator transforms input function representations from geometric space to a compressed latent space via Physics-Cross-Attention (PhCA), processes them with latent-space operator layers, then maps back to geometric space. The paper reports roughly 50% lower GPU memory usage and a 1.8x training speedup compared to operators acting in the full geometric space, while attaining the highest accuracy on several benchmarks.

Key components:

  • Physics-Cross-Attention (PhCA) encoder/decoder between geometric and latent spaces
  • Latent-space operator layers (can use FNO or attention)
  • Applicable to both forward and inverse problems
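A minimal sketch of the architecture described above, using PyTorch. This is an illustrative implementation for the issue, not the paper's reference code: all class names, layer sizes, and the choice of standard multi-head cross-attention for PhCA and Transformer encoder layers for the latent operator are assumptions made for the sketch.

```python
# Hypothetical sketch of a Latent Neural Operator. Names and hyperparameters
# are illustrative, not taken from the reference implementation.
import torch
import torch.nn as nn


class PhCA(nn.Module):
    """Physics-Cross-Attention sketch: queries attend to a context set.

    Used twice: geometric -> latent (latent tokens query the input points)
    and latent -> geometric (output points query the latent tokens).
    """

    def __init__(self, dim, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, query, context):
        out, _ = self.attn(query, context, context)
        return self.norm(query + out)  # residual + norm


class LatentNeuralOperator(nn.Module):
    def __init__(self, in_dim, dim=64, latent_tokens=32, depth=2):
        super().__init__()
        self.embed = nn.Linear(in_dim, dim)
        # Learnable latent queries defining the compressed latent space.
        self.latent = nn.Parameter(torch.randn(latent_tokens, dim) * 0.02)
        self.encode = PhCA(dim)  # geometric -> latent
        # Operator layers act only on the (small) latent token set; the
        # paper notes these could also be FNO-style layers.
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(dim, 4, dim * 2, batch_first=True)
            for _ in range(depth)
        )
        self.decode = PhCA(dim)  # latent -> geometric
        self.head = nn.Linear(dim, 1)

    def forward(self, x):
        # x: (batch, n_points, in_dim) = point coordinates + input values
        h = self.embed(x)
        z = self.encode(self.latent.expand(x.size(0), -1, -1), h)
        for layer in self.layers:
            z = layer(z)  # operator acts in the compressed latent space
        u = self.decode(h, z)  # read out the solution at geometric points
        return self.head(u)


model = LatentNeuralOperator(in_dim=3)
x = torch.randn(2, 100, 3)  # 2 samples, 100 points, features (x, y, u0)
y = model(x)
print(y.shape)  # torch.Size([2, 100, 1])
```

Because attention cost in the latent layers scales with the number of latent tokens rather than the number of mesh points, the compression ratio (here 100 points down to 32 tokens) is what drives the memory and speed gains claimed above.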
