LRScheduler

class LRScheduler(optimizer, current_epoch=-1)[source]

Base class for all learning rate schedulers.

Parameters:
  • optimizer (Optimizer) – wrapped optimizer.

  • current_epoch (int) – the index of the current epoch. Default: -1.

get_lr()[source]

Computes the current learning rate for the scheduler. Subclasses are expected to override this method with their own schedule, as in the sketch below.
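A minimal sketch of a custom scheduler built on this base class. It is illustrative only: the import path, and the base_lrs and current_epoch attributes assumed to be maintained by the base class (one base learning rate per parameter group), are assumptions rather than guarantees of this API, and StepDecay is a hypothetical name:

    from megengine.optimizer import LRScheduler  # import path assumed for this sketch

    class StepDecay(LRScheduler):
        """Hypothetical scheduler: scale each base learning rate by
        gamma once every step_size epochs."""

        def __init__(self, optimizer, step_size, gamma=0.1, current_epoch=-1):
            # Set our own attributes before delegating to the base class,
            # since the base class may already call get_lr() during __init__.
            self.step_size = step_size
            self.gamma = gamma
            super().__init__(optimizer, current_epoch)

        def get_lr(self):
            # base_lrs and current_epoch are assumed to be set by the base
            # class; return one learning rate per parameter group.
            epoch = max(self.current_epoch, 0)
            factor = self.gamma ** (epoch // self.step_size)
            return [base_lr * factor for base_lr in self.base_lrs]

In a typical training loop such a scheduler would be advanced once per epoch (for example via a step()-style method); that method is not documented in this section, so it is left out of the sketch.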

load_state_dict(state_dict)[source]

Loads the scheduler's state.

Parameters:
  • state_dict (dict) – scheduler state.

state_dict()[source]

Returns the state of the scheduler as a dict. It contains an entry for every variable in self.__dict__ which is not the optimizer.
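Taken together, state_dict() and load_state_dict() let a scheduler be checkpointed and restored. A minimal sketch, reusing the hypothetical StepDecay subclass and the optimizer from the sketch above, with pickle chosen here purely as an illustrative persistence format:

    import pickle

    # Save: per the description above, the returned dict holds every
    # attribute of the scheduler except the wrapped optimizer.
    with open("scheduler.pkl", "wb") as f:
        pickle.dump(scheduler.state_dict(), f)

    # Restore: rebuild the scheduler around the (re-created) optimizer,
    # then overwrite its freshly initialised state with the saved entries.
    scheduler = StepDecay(optimizer, step_size=30, gamma=0.1)
    with open("scheduler.pkl", "rb") as f:
        scheduler.load_state_dict(pickle.load(f))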