
Perpetual Pre-training of G and RaGAN #106

Merged
simon-donike merged 2 commits into ESAOpenSR:ESRGAN from CristianCerasuoloo:ESRGAN on Feb 18, 2026

Conversation

@CristianCerasuoloo

Pull Request

Description

This PR adds two new features:

  1. Optionally pre-train the generator G with content loss for an indefinite number of steps by setting the g_pretrain_steps configuration value to -1 (see the first sketch after this list)
  2. Implement the Relativistic Average GAN (RaGAN) discriminator loss from the original ESRGAN paper, optionally enabled by setting the config key Training/Losses/relativistic_average_d to True (see the second sketch after this list)
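
To make the first option concrete, here is a minimal Python sketch of the phase switch, assuming a plain global step counter; only the `g_pretrain_steps` key and its `-1` sentinel come from this PR, while the function name and signature are hypothetical illustrations:

```python
def use_content_loss_only(global_step: int, g_pretrain_steps: int) -> bool:
    """Return True while the generator G is still in its pre-training phase.

    g_pretrain_steps == -1 -> perpetual pre-training of G with content loss
    g_pretrain_steps >= 0  -> switch to adversarial training once reached
    """
    if g_pretrain_steps == -1:
        return True  # -1 means: never leave the content-loss-only phase
    return global_step < g_pretrain_steps
```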
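
For the second option, here is a minimal PyTorch sketch of the relativistic average GAN losses as defined in the original ESRGAN paper (building on Jolicoeur-Martineau's RaGAN); the function names and tensor arguments are hypothetical and not this repository's actual API:

```python
import torch
import torch.nn.functional as F

def ragan_d_loss(real_logits: torch.Tensor, fake_logits: torch.Tensor) -> torch.Tensor:
    # D_Ra(x_r, x_f) = sigmoid(C(x_r) - E[C(x_f)]): reals should look more
    # realistic than the average fake, and fakes less than the average real.
    real_rel = real_logits - fake_logits.mean()  # C(x_r) - E[C(x_f)]
    fake_rel = fake_logits - real_logits.mean()  # C(x_f) - E[C(x_r)]
    loss_real = F.binary_cross_entropy_with_logits(real_rel, torch.ones_like(real_rel))
    loss_fake = F.binary_cross_entropy_with_logits(fake_rel, torch.zeros_like(fake_rel))
    return (loss_real + loss_fake) / 2

def ragan_g_loss(real_logits: torch.Tensor, fake_logits: torch.Tensor) -> torch.Tensor:
    # Symmetric generator objective: push fakes above the average real and
    # reals below the average fake, so gradients flow from both terms.
    real_rel = real_logits - fake_logits.mean()
    fake_rel = fake_logits - real_logits.mean()
    loss_real = F.binary_cross_entropy_with_logits(real_rel, torch.zeros_like(real_rel))
    loss_fake = F.binary_cross_entropy_with_logits(fake_rel, torch.ones_like(fake_rel))
    return (loss_real + loss_fake) / 2
```

Unlike the standard GAN loss, the relativistic average discriminator estimates how much more realistic a real image is than the average fake one, which is what the Training/Losses/relativistic_average_d flag toggles.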

Type of change

  • New feature (non-breaking change which adds functionality)

How Has This Been Tested?

Verified with multiple training runs.

Checklist:

  • [x] I have performed a self-review of my own code
  • [ ] I have commented my code, particularly in hard-to-understand areas
  • [ ] I have made corresponding changes to the documentation
  • [x] My changes generate no new warnings
  • [ ] I have added tests that prove that my feature works
  • [ ] New and existing unit tests pass locally with my changes
  • [x] Any dependent changes have been merged and published in downstream modules

@simon-donike
Member

Super cool Cristian, thanks for contributing 💪

@simon-donike mentioned this pull request Feb 18, 2026