why is an optimizer used in something that is not a neural network? #44

@molo32

Description

Can someone explain the intuition behind applying an optimizer (gradient descent, Adam) to the latent code?

The optimizer searches for the image in the latent space: the latent code is updated instead of the weights of a neural network.

Why does this work, given that it is not the weights of a neural network being updated?

How does the optimizer find the latent code that represents the input image for the generator?
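To illustrate the idea being asked about: gradient descent does not care *which* variables it updates, only that the loss is differentiable with respect to them. In latent-space inversion the generator's weights are frozen and the latent code `z` is treated as the trainable parameter; the gradient of a reconstruction loss with respect to `z` tells the optimizer how to move `z` so that `G(z)` gets closer to the target image. Here is a minimal pure-Python sketch with a toy linear "generator" `G(z) = W @ z` standing in for a real network (all names and values are illustrative, not from this repository):

```python
# Toy latent-code inversion: W (the "generator weights") is frozen;
# only the latent code z receives gradient updates.

def matvec(W, z):
    """Apply the toy generator G(z) = W @ z."""
    return [sum(w * v for w, v in zip(row, z)) for row in W]

def loss(W, z, x):
    """Squared reconstruction error ||G(z) - x||^2."""
    return sum((yi - xi) ** 2 for yi, xi in zip(matvec(W, z), x))

def grad_z(W, z, x):
    """dL/dz = 2 * W^T (W z - x). Note: the gradient flows to z, not to W."""
    r = [yi - xi for yi, xi in zip(matvec(W, z), x)]
    return [2 * sum(W[i][j] * r[i] for i in range(len(W)))
            for j in range(len(z))]

W = [[2.0, 0.0], [1.0, 3.0]]   # frozen "generator weights"
x = [4.0, 11.0]                # target "image" we want to reconstruct
z = [0.0, 0.0]                 # latent code, initialized arbitrarily

lr = 0.02
for _ in range(500):
    g = grad_z(W, z, x)
    z = [zi - lr * gi for zi, gi in zip(z, g)]

# z converges toward the code whose G(z) reproduces x
print(z, loss(W, z, x))
```

The optimizer never "knows" the right latent code in advance; it only follows the loss gradient downhill from a random (or mean) initialization, exactly as it would for network weights. In practice frameworks like PyTorch express this by creating the latent tensor with `requires_grad=True` and passing it, rather than the model parameters, to the optimizer.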
