Class used to compute the gradient for a loss function, given a single data point.
Class used to solve an optimization problem using Gradient Descent.
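A minimal sketch of what such a gradient-descent solver does: iterate over the data, query a pluggable gradient function, and take steps with a decaying learning rate. All names here (`gradient_descent`, `sq_grad`) are illustrative, not the actual API.

```python
def gradient_descent(grad, w, data, step_size=0.1, num_iters=100):
    # grad(w, x, y) returns the gradient of the loss at w for one point.
    for i in range(1, num_iters + 1):
        lr = step_size / i ** 0.5  # decaying learning rate
        for x, y in data:
            w = [wj - lr * gj for wj, gj in zip(w, grad(w, x, y))]
    return w

# Example: fit y = 2x with a squared-error gradient.
def sq_grad(w, x, y):
    return [(w[0] * x - y) * x]

w = gradient_descent(sq_grad, [0.0], [(1.0, 2.0), (2.0, 4.0)])
```

The decoupling matters: the same loop works for hinge, logistic, or least-squares losses by swapping the gradient function.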
Compute gradient and loss for a hinge loss function.
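A sketch of a hinge-loss gradient for labels in {-1, +1}: the loss is max(0, 1 - y·⟨w, x⟩), so the gradient is -y·x when the margin is below 1 and zero otherwise. The function name is illustrative.

```python
def hinge_gradient(w, x, y):
    # Hinge loss: max(0, 1 - y * <w, x>), labels y in {-1, +1}.
    margin = y * sum(wj * xj for wj, xj in zip(w, x))
    loss = max(0.0, 1.0 - margin)
    # Subgradient: -y * x inside the margin, zero outside it.
    grad = [-y * xj if margin < 1.0 else 0.0 for xj in x]
    return grad, loss
```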
Updater that adjusts learning rate and performs L1 regularization.
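L1 regularization is commonly applied as a soft-thresholding (proximal) step after the plain gradient step: shrink each weight toward zero by lr·regParam and clip at zero. A sketch under that assumption, with illustrative names:

```python
def l1_update(w, grad, step_size, iter_num, reg_param):
    # Gradient step followed by soft-thresholding, the proximal
    # operator of the L1 penalty reg_param * ||w||_1.
    lr = step_size / iter_num ** 0.5
    shrink = lr * reg_param
    out = []
    for wj, gj in zip(w, grad):
        wj = wj - lr * gj
        out.append(max(abs(wj) - shrink, 0.0) * (1.0 if wj >= 0 else -1.0))
    return out
```

The clip at zero is what produces sparse weight vectors, the usual reason to choose L1 over L2.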
Compute gradient and loss for a logistic loss function.
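A sketch of a binary logistic-loss gradient for labels in {0, 1}: predict p = sigmoid(⟨w, x⟩), return the cross-entropy loss and its gradient (p - y)·x. Names are illustrative.

```python
import math

def logistic_gradient(w, x, y):
    # Binary logistic loss with labels y in {0, 1}.
    z = sum(wj * xj for wj, xj in zip(w, x))
    p = 1.0 / (1.0 + math.exp(-z))          # sigmoid prediction
    grad = [(p - y) * xj for xj in x]       # d(loss)/dw
    loss = -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return grad, loss
```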
A simple updater that adaptively decays the learning rate by the inverse square root of the number of iterations, with no regularization.
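The decay rule above can be sketched in a few lines; the effective learning rate at iteration t is stepSize / sqrt(t), and the weights take a plain gradient step. Names are illustrative.

```python
def simple_update(w, grad, step_size, iter_num):
    # Learning rate decays as step_size / sqrt(iteration);
    # no regularization is applied.
    lr = step_size / iter_num ** 0.5
    return [wj - lr * gj for wj, gj in zip(w, grad)]
```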
Compute gradient and loss for a least-squares loss function.
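A sketch of a least-squares gradient for a single point: with loss (1/2)(⟨w, x⟩ - y)², the gradient is (⟨w, x⟩ - y)·x. The function name is illustrative.

```python
def least_squares_gradient(w, x, y):
    # Squared-error loss: (1/2) * (<w, x> - y)^2
    diff = sum(wj * xj for wj, xj in zip(w, x)) - y
    return [diff * xj for xj in x], 0.5 * diff * diff
```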
Updater that adjusts the learning rate and performs L2 regularization.
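An L2-regularized update is commonly implemented as weight decay: the penalty (regParam/2)·‖w‖² contributes regParam·w to the gradient, which shrinks the weights multiplicatively before the gradient step. A sketch under that assumption, with illustrative names:

```python
def l2_update(w, grad, step_size, iter_num, reg_param):
    # Gradient step on loss + (reg_param / 2) * ||w||^2,
    # i.e. weight decay followed by the plain gradient step.
    lr = step_size / iter_num ** 0.5
    return [wj * (1.0 - lr * reg_param) - lr * gj for wj, gj in zip(w, grad)]
```

Unlike the L1 updater's soft-thresholding, this shrinks weights toward zero but never sets them exactly to zero.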
Class used to update weights used in Gradient Descent.