Adagrad

class Adagrad(params, lr=0.01, lr_decay=0.0, eps=1e-10, weight_decay=0.0)[source]

Implements the Adagrad algorithm proposed in “Adaptive Subgradient Methods for Online Learning and Stochastic Optimization”.
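
Adagrad accumulates the square of each parameter's gradient over time and divides every step by the square root of that accumulator, so frequently-updated parameters receive smaller steps. Below is a minimal NumPy sketch of one step, assuming the common formulation (weight decay folded into the gradient, learning rate decayed as lr / (1 + (step - 1) * lr_decay)); the function name adagrad_step and the state_sum buffer are illustrative, not part of this API:

    import numpy as np

    def adagrad_step(param, grad, state_sum, step, lr=0.01, lr_decay=0.0,
                     eps=1e-10, weight_decay=0.0):
        """One Adagrad update for a single parameter array (illustrative sketch)."""
        if weight_decay != 0.0:
            grad = grad + weight_decay * param            # L2 penalty folded into the gradient
        clr = lr / (1.0 + (step - 1) * lr_decay)          # lr_decay shrinks the step over time
        state_sum += grad ** 2                            # running sum of squared gradients
        param -= clr * grad / (np.sqrt(state_sum) + eps)  # eps guards against division by zero
        return param, state_sum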

Parameters
  • params (Union[Iterable[Parameter], dict]) – an iterable of parameters to optimize, or a dict defining groups of parameters.

  • lr (float) – learning rate: the coefficient that scales the update (delta) before it is applied to the parameters. Default: 0.01.

  • lr_decay (float) – learning rate decay. Default: 0.0.

  • eps (float) – term added to the denominator to improve numerical stability. Default: 1e-10.

  • weight_decay (float) – weight decay (L2 penalty). Default: 0.0.

Returns

An instance of the Adagrad optimizer.
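
For orientation, a hypothetical usage sketch, assuming the optimizer exposes the usual zero_grad()/step() training-loop interface; model, loss_fn, and data_loader are placeholders rather than part of this API:

    optimizer = Adagrad(model.parameters(), lr=0.01, weight_decay=1e-4)

    for inputs, targets in data_loader:
        optimizer.zero_grad()                   # clear gradients from the previous step
        loss = loss_fn(model(inputs), targets)
        loss.backward()                         # populate parameter gradients
        optimizer.step()                        # apply the Adagrad update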