heat.optim

This is the heat.optim submodule.

It contains data-parallel-specific optimizers and learning rate schedulers, and also re-exports all of the optimizers and learning rate schedulers from the torch namespace.

Submodules

Package Contents

__getattr__(name)

Returns the torch learning rate scheduler with the given name.

__getattr__(name)

When an attribute is requested from the heat.optim module, this will first attempt to return the Heat optimizer of that name; if no such Heat optimizer exists, it falls back to the torch optimizer of the same name.
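This fallback lookup can be sketched with PEP 562 module-level `__getattr__`. The sketch below is illustrative only, not Heat's actual source: the dictionaries `heat_optimizers` and `torch_optimizers` are stand-ins for the real Heat and torch optimizer namespaces.

```python
# Hypothetical sketch of the "try Heat first, then fall back to torch"
# attribute lookup described above, using PEP 562 module __getattr__.
import types

# Stand-ins for the real namespaces (illustrative, not Heat's actual API).
heat_optimizers = {"DataParallelOptimizer": "heat-specific wrapper"}
torch_optimizers = {"SGD": "torch.optim.SGD stand-in"}

optim = types.ModuleType("optim_demo")

def _module_getattr(name):
    # First try the Heat-specific optimizers ...
    if name in heat_optimizers:
        return heat_optimizers[name]
    # ... then fall back to the torch namespace.
    try:
        return torch_optimizers[name]
    except KeyError:
        raise AttributeError(
            f"module 'optim_demo' has no attribute {name!r}"
        ) from None

# PEP 562: a __getattr__ in a module's namespace is called whenever
# normal attribute lookup on that module fails.
optim.__getattr__ = _module_getattr

print(optim.DataParallelOptimizer)  # Heat-specific name wins
print(optim.SGD)                    # falls back to the torch namespace
```

In practice this means user code can write `heat.optim.SGD(...)` and transparently receive the torch optimizer when Heat does not define one of its own.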