pub struct AdaGrad { /* private fields */ }

Adaptive Gradient Descent

The adaptive gradient descent algorithm (Duchi et al. 2010). AdaGrad keeps a running sum of squared gradients for each parameter and scales each update by the inverse square root of that sum, so parameters that have received large gradients take progressively smaller steps.

Implementations

Constructs a new AdaGrad algorithm from a step size, an adaptive scaling constant, and a maximum iteration count (in that order; see the example below).

Examples
use rusty_machine::learning::optim::grad_desc::AdaGrad;

// Create a new AdaGrad algorithm with step size 0.5
// and adaptive scaling constant 1.0
let gd = AdaGrad::new(0.5, 1.0, 100);

Trait Implementations

impl Debug for AdaGrad
    Formats the value using the given formatter.

impl Default for AdaGrad
    Returns the "default value" for a type.

impl OptimAlgorithm for AdaGrad
    Return the optimized parameter using gradient optimization.

Auto Trait Implementations

Blanket Implementations

impl<T: 'static> Any for T
    Gets the TypeId of self.

impl<T> Borrow<T> for T
    Immutably borrows from an owned value.

impl<T> BorrowMut<T> for T
    Mutably borrows from an owned value.

impl<T> From<T> for T
    Returns the argument unchanged.

impl<T, U> Into<U> for T where U: From<T>
    Calls U::from(self). That is, this conversion is whatever the implementation of From<T> for U chooses to do.

impl<T, U> TryFrom<U> for T where U: Into<T>
    type Error = Infallible — the type returned in the event of a conversion error.
    try_from performs the conversion.

impl<T, U> TryInto<U> for T where U: TryFrom<T>
    type Error = U::Error — the type returned in the event of a conversion error.
    try_into performs the conversion.