Neural Network module

Contains an implementation of a simple feed-forward neural network.

Usage

use rusty_machine::learning::nnet::{NeuralNet, BCECriterion};
use rusty_machine::learning::toolkit::regularization::Regularization;
use rusty_machine::learning::toolkit::activ_fn::Sigmoid;
use rusty_machine::learning::optim::grad_desc::StochasticGD;
use rusty_machine::linalg::Matrix;
use rusty_machine::learning::SupModel;

let inputs = Matrix::new(5,3, vec![1.,1.,1.,2.,2.,2.,3.,3.,3.,
                                   4.,4.,4.,5.,5.,5.]);
let targets = Matrix::new(5,3, vec![1.,0.,0.,0.,1.,0.,0.,0.,1.,
                                    0.,0.,1.,0.,0.,1.]);

// Set the layer sizes - from input to output
let layers = &[3,5,11,7,3];

// Choose the BCE criterion with L2 regularization (`lambda=0.1`).
let criterion = BCECriterion::new(Regularization::L2(0.1));

// We will create a multilayer perceptron and just use the default stochastic gradient descent.
let mut model = NeuralNet::mlp(layers, criterion, StochasticGD::default(), Sigmoid);

// Train the model!
model.train(&inputs, &targets).unwrap();

let test_inputs = Matrix::new(2,3, vec![1.5,1.5,1.5,5.1,5.1,5.1]);

// And predict new output from the test inputs
let outputs = model.predict(&test_inputs).unwrap();

Neural networks are specified via a criterion, similar to Torch. The criterion specifies a cost function and any regularization.

You can define your own criterion by implementing the Criterion trait with a concrete CostFunc.
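For instance, a custom criterion could pair the toolkit's MeanSqError cost with L1 regularization. The sketch below is illustrative rather than definitive: it assumes the Criterion trait's only required item is the associated Cost type and that regularization is added by overriding a provided regularization method, as in rusty-machine 0.5; the L1MSECriterion name and the 0.05 penalty are made up for the example, so check the trait documentation for your version.

use rusty_machine::learning::nnet::Criterion;
use rusty_machine::learning::toolkit::cost_fn::MeanSqError;
use rusty_machine::learning::toolkit::regularization::Regularization;

// Illustrative criterion: mean squared error cost with an L1 penalty.
struct L1MSECriterion;

impl Criterion for L1MSECriterion {
    // The concrete cost function backing this criterion.
    type Cost = MeanSqError;

    // Override the default (no regularization) with an L1 penalty.
    // Assumes the trait provides `regularization` with this signature.
    fn regularization(&self) -> Regularization<f64> {
        Regularization::L1(0.05)
    }
}

The custom criterion can then be passed to NeuralNet::mlp in place of BCECriterion in the usage example above.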

Modules

net_layer - Neural Network Layers

Structs

BCECriterion - The binary cross entropy criterion.
BaseNeuralNet - Base Neural Network struct
MSECriterion - The mean squared error criterion.
NeuralNet - Neural Network Model

Traits

Criterion - Criterion for Neural Networks