pygrad.optims

Module providing gradient-descent-based optimization methods.

Classes

Adam(model_parameters[, beta1, beta2, eps, lr])

Adam optimizer (adaptive moment estimation).
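
To make the roles of beta1, beta2, eps and lr concrete, here is a minimal NumPy sketch of one Adam update step. It is illustrative only, not pygrad's implementation; the function name, the in-place array updates, and the default hyperparameter values are common choices assumed here, not taken from pygrad.

    import numpy as np

    def adam_step(params, grads, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
        # params, grads, m, v: lists of same-shaped NumPy arrays; t: step count, starting at 1.
        for i, (p, g) in enumerate(zip(params, grads)):
            m[i] = beta1 * m[i] + (1 - beta1) * g        # first moment: running mean of gradients
            v[i] = beta2 * v[i] + (1 - beta2) * g * g    # second moment: running mean of squared gradients
            m_hat = m[i] / (1 - beta1 ** t)              # bias correction for zero-initialised moments
            v_hat = v[i] / (1 - beta2 ** t)
            p -= lr * m_hat / (np.sqrt(v_hat) + eps)     # update the parameter in place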

RMSProp(model_parameters[, beta, lr])

RMSProp optimizer.
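
The sketch below shows one RMSProp step under the same assumptions (NumPy arrays updated in place). The eps term is added here for numerical stability and, like the default values, is an assumption rather than part of the documented signature.

    import numpy as np

    def rmsprop_step(params, grads, sq_avg, lr=1e-2, beta=0.9, eps=1e-8):
        # sq_avg: running average of squared gradients, one zero-initialised array per parameter.
        for i, (p, g) in enumerate(zip(params, grads)):
            sq_avg[i] = beta * sq_avg[i] + (1 - beta) * g * g
            p -= lr * g / (np.sqrt(sq_avg[i]) + eps)     # per-coordinate adaptive step size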

SGD(model_parameters[, lr])

Vanilla stochastic gradient descent.
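
For reference, the vanilla update reduces to a single line per parameter. The sketch assumes parameters and gradients are NumPy arrays modified in place; it is not pygrad's code.

    def sgd_step(params, grads, lr=1e-2):
        # Move each parameter a small step against its gradient.
        for p, g in zip(params, grads):
            p -= lr * g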

SGD_Momentum(model_parameters[, beta, lr])

Stochastic gradient descent with momentum.
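
A minimal sketch of the momentum variant, again assuming NumPy arrays and an externally stored velocity buffer. The convention v = beta * v + g is used here; pygrad may instead use v = beta * v + (1 - beta) * g, which only rescales the effective learning rate.

    import numpy as np

    def sgd_momentum_step(params, grads, velocity, lr=1e-2, beta=0.9):
        # velocity: one zero-initialised array per parameter, carried across steps.
        for i, (p, g) in enumerate(zip(params, grads)):
            velocity[i] = beta * velocity[i] + g         # exponentially decaying sum of past gradients
            p -= lr * velocity[i]

Accumulating a velocity smooths the descent direction: components of the gradient that flip sign between steps largely cancel, while components that point the same way step after step build up, which damps oscillations and speeds progress along consistent directions.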