Releases: ai2cm/ace
v2026.4.0
Release date: April 9, 2026
What's Changed
A subset of changes are listed here, see full changelog for more detail: v2026.1.1...v2026.4.0
⚠️ Breaking Changes
`fme.ace` and `fme.coupled` training configs: Training-only fields (`loss`, `optimize_last_step_only`, `n_ensemble`, `parameter_init`, `train_n_forward_steps`) have been removed from `StepperConfig` and must now be set under a new top-level `stepper_training: TrainStepperConfig` field. Existing training configs will need to be updated. (#862)
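A minimal sketch of the migration, assuming a YAML training config; the field names come from the release note, but the nesting shown here is illustrative, not the exact schema:

```yaml
# Before (no longer accepted): training-only fields lived on the stepper config.
# stepper:
#   loss: ...
#   n_ensemble: ...

# After (#862): the same fields move under a top-level stepper_training block.
stepper_training:
  loss: ...                    # your existing loss settings, moved as-is
  optimize_last_step_only: ...
  n_ensemble: ...
  parameter_init: ...
  train_n_forward_steps: ...
```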
New Config Options
- `metrics_log_dir` on `LoggingConfig`: Log W&B scalar metrics to a local JSONL file on disk in addition to W&B. (#992)
- Configurable inference step logging: Control which inference steps are logged to W&B. (#883)
- `ValidationConfig` on `InferenceEvaluatorConfig` (`fme.ace`): Optionally run a validation pass before inference and log metrics to step 0 of the W&B run. (#878)
- `LRTuningConfig` on `TrainConfig` (`fme.ace`, `fme.coupled`, `fme.diffusion`): Automatically tune the learning rate at configurable epochs by running short isolated comparison trials between the current and a candidate LR; no restarts required. (#930)
- `prescribed_prognostic_names` on `SingleModuleStepperConfig`: Override named prognostic variables with ground-truth values at each inference/eval timestep. Intended to be set via `stepper_override` in eval configs. (#810)
- Optional left/two-tailed PDF metrics for downscaling training. (#994)
- `LossVsNoiseAggregator` for downscaling: Tracks loss as a function of noise level during diffusion training. (#1025)
- Configurable training noise distribution. (#874)
Deprecations
- `sea_ice_thickness_name` on `SeaIceFractionConfig` (ocean corrector): Deprecated in favor of the more general `zero_where_ice_free_names` list, which supports correcting multiple outputs. (#843)
- `CascadePredictor` (downscaling): Deprecated and removed. (#970)
- Topography pathway on downscaling `DataLoaderConfig`/`PairedDataLoaderConfig`: Deprecated; use `StaticInputs` instead. (#926)
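A hedged sketch of the `zero_where_ice_free_names` migration; the field names come from the release note, while the variable names listed are illustrative:

```yaml
# Old (deprecated, #843): only a single output could be corrected.
# sea_ice_thickness_name: sea_ice_thickness

# New: a list, so multiple outputs can be zeroed where ice-free.
zero_where_ice_free_names:
  - sea_ice_thickness   # illustrative variable name
  - sea_ice_volume      # illustrative second output
```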
Notable Behavioral Fixes
HiRO-ACE Release
This release is the official milestone for our team's transition to fully open development, and includes the latest updates for HiRO-ACE as described in our paper. HiRO-ACE is a two-stage emulation framework for generating 3 km resolution precipitation outputs: a stochastic climate emulator (ACE2S) generates 100 km climate simulations, and a downscaling model (HiRO) generates 3 km precipitation outputs.
See the docs for a quickstart on installation and use, and our Hugging Face repo for the models and some sample data to run on.
Open Development
Previously, the ACE repo solely held updates related to papers and releases, while most of the development happened behind the scenes in a separate repository. This made it harder for external collaborators to contribute and for users to track development progress. We hope that moving all of our development here brings us closer to users and makes it easier to solicit feedback, issues, and contributions from outside our group.
Updates
- Don't upload big maps by @AnnaKwa in #707
- Add `StaticInputs` class by @AnnaKwa in #713
- Add hiro ckpt train config by @AnnaKwa in #721
- Provide backwards compatibility for list-type BatchLabels by @mcgibbon in #722
- Serialize static inputs with downscaling model by @AnnaKwa in #727
- Beaker CI test via gantry by @brianhenn in #723
- Remove filter repo tools by @brianhenn in #729
- Add training configs for ACE2S used in HiRO-ACE manuscript by @Arcomano1234 in #710
- Pass model static inputs to dataset build calls at generation by @AnnaKwa in #728
- Call optimizer autocast in stepper predict generator by @mcgibbon in #733
- Ensure topography is on device in downscaling inference by @AnnaKwa in #731
- Coupled stepper config removes deprecated `crps_training` key by @elynnwu in #734
- Samudra bugfix: Use circular padding for longitude axis by @elynnwu in #735
- Add additional diagnostics of the OHC budget by @jpdunc23 in #737
- Prevent backpropagation anomalies in energy corrector by @spencerkclark in #724
- Fix bug causing step sampler to be ignored by @mcgibbon in #742
- Add a contributing guideline by @oliverwm1 in #730
- Increase timeout of NCCL collective operations to 20 minutes by @jpdunc23 in #746
- Add docs page for downscaling inference by @AnnaKwa in #743
- Vendorize Apache 2.0 Nvidia Downscaling Code by @frodre in #748
- Enforce lat bounds (-88 deg, 88 deg) by @AnnaKwa in #740
- Bump version v2026.1.1 for HiRO-ACE release by @frodre in #751
Full Changelog: v2026.1.0...v2026.1.1
v2026.1.0
Release marking the switch to open development for the ai2cm team.
What's Changed
- Docs CI update by @brianhenn in #711
- Bump version to 2026.1.0 by @brianhenn in #715
Full Changelog: https://github.com/ai2cm/ace/commits/v2026.1.0
2025.11.0
Release date: November 7, 2025
Full Changelog: 2025.10.0...2025.11.0
What's Changed
We updated the versions of the `fme` dependencies `torch-harmonics` (0.7.4 --> 0.8.0) and `imageio` (<2.27.0 --> >2.28.1) based on user feedback.
2025.10.0
Release date: October 16, 2025
Full Changelog: 2025.7.0...2025.10.0
What's Changed
This release includes the capability to run coupled models (such as those emulating the atmosphere, ocean, and sea ice!) via entrypoints in `fme.coupled`. We have provided documentation for running inference using coupled model weights.
The deprecated legacy training configuration format (`SingleModuleStepperConfig`) has been removed in this release. However, breaking changes have been avoided and backwards compatibility has been maintained with existing saved models for most cases.
2025.7.0
What's Changed
This release includes major internal refactors and improved documentation. The previous training configuration format has been deprecated and will be removed in a future release. However, breaking changes have been avoided and backwards compatibility has been maintained with existing saved models for most cases.
Version updates:
- Python 3.11 and torch 2.7.1
Internal refactors:
- The `fme` package has been moved one level up (i.e., from the legacy `fme/fme/...` layout to `fme/ace/` and `fme/core/` instead).
Increased modularity for ML emulation:
- Training configuration is now based around a more flexible `StepperConfig`; the legacy `SingleModuleStepperConfig` is deprecated and will be removed in a future release.
- The stepper config now supports the modular `step` framework, allowing composable steps for ML emulation.
Experimental features:
- Samudra, a global ocean emulator developed by M2LInES, is now fully integrated into Ai2's full model framework. An example production workflow for training and running Samudra is currently under development and will be included in the upcoming release.
Documentation
- Added an improved `quickstart.rst` focused around the models saved in our Hugging Face collection.
Full Changelog: 2024.12.0...2025.7.0
2024.12.0
What's Changed
This release contains many internal changes to the ACE code. However, all configuration options accessible via the entrypoints of the `fme` package (i.e., `fme.ace.train`, `fme.ace.inference`, and `fme.ace.evaluator`) have had no breaking changes.
The following lists are not exhaustive; they highlight changes that may be relevant to users.
Bug fixes:
- resolved a transient bug that sometimes occurred in `XarrayDataset` when trying to read the image shape from a scalar field
- when using `n_repeats` greater than 1, `XarrayDataset` now correctly increments the values in the returned `time` arrays
New features:
- ACE works on Apple Silicon! Set the environment variable `FME_USE_MPS=1` to use the PyTorch MPS backend. Make sure to have the latest version of PyTorch installed. This gives about a 5x speedup over running on CPU (tested on a MacBook Pro M3 Max).
- add perturbations to sea surface temperature during inference (see `ForcingDataLoaderConfig.perturbations`)
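`FME_USE_MPS=1` is an ordinary environment variable; a minimal sketch of the pattern, assuming only the flag name from the release note (the actual `fme` code may read it differently):

```python
import os

# Set the flag, as `FME_USE_MPS=1 <command>` would on the command line.
os.environ["FME_USE_MPS"] = "1"

# Hedged sketch of how such a flag is typically consumed; the real fme
# logic may differ.
use_mps = os.environ.get("FME_USE_MPS", "0") == "1"
print(use_mps)  # True
```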
Refactors:
- deduplicated some inference code by using generics; the `fme.ace.inference` and `fme.ace.evaluator` entrypoints now share more code
Full Changelog: 2024.9.0...2024.12.0
2024.9.0
What's Changed
- Update README to link to zenodo repo with checkpoint by @oliverwm1 in #3
- New public release of FME code by @oliverwm1 in #5
- Fix instruction for installing from GitHub by @oliverwm1 in #7
- Add readthedocs config by @mcgibbon in #6
- Add docs badge and link by @oliverwm1 in #8
- Add link to zenodo archive with checkpoint by @oliverwm1 in #9
- Add link to E3SMv2-trained paper and checkpoint by @oliverwm1 in #12
- Add link to published EAMv2 paper in JGR-ML by @jpdunc23 in #16
- Add missing init files by @oliverwm1 in #17
- Update for PyPI release by @frodre in #20
New Contributors
- @oliverwm1 made their first contribution in #3
- @mcgibbon made their first contribution in #6
- @jpdunc23 made their first contribution in #16
- @frodre made their first contribution in #20
Full Changelog: 2023.12.0...2024.9.0
2023.12.0
Inference code for model described in https://arxiv.org/abs/2310.02074