Deep network
Layer::Layer Class Reference (abstract)

#include <layer.hpp>

Inheritance diagram for Layer::Layer (derived classes: Layer::ActivationLayer< A >, Layer::ConvolutionLayer, Layer::FullConnectedLayer, Layer::Pooling).

Public Member Functions

 Layer (size_t n, size_t m)
 
 ~Layer ()
 
size_t get_input_size () const
 
size_t get_output_size () const
 
Vector get_output () const
 
virtual Vector feed_forward (Vector x)=0
 
virtual void init_nabla ()=0
 
virtual Vector back_propagation (Vector e)=0
 
virtual void update (Real eta)=0
 

Public Attributes

string name
 
size_t n
 
size_t m
 
Vector x
 
Vector y
 
Vector d
 

Detailed Description

An abstract class representing a layer of a deep network. Such a layer can be seen as a map

\[ \begin{array}{rcl} f:\mathbb{R}^n&\to&\mathbb{R}^m\\ x&\mapsto&y \end{array} \]

taking an input vector $x$ of size $n$ to an output vector $y$ of size $m$.
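The interface can be illustrated with a minimal concrete subclass. The sketch below is a simplified mirror of the class, not the library's code: it assumes `Vector` is `std::vector<double>` and `Real` is `double` (the actual typedefs in layer.hpp may differ), and `ScaleLayer` is a hypothetical layer that multiplies its input element-wise by a single learnable weight `w`.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <string>
#include <vector>

// Stand-ins for the library's numeric types (assumption: the actual
// typedefs in layer.hpp may differ).
using Vector = std::vector<double>;
using Real = double;

// Simplified mirror of the Layer::Layer interface documented above.
struct Layer {
    std::string name;   // layer name, used for debugging
    size_t n, m;        // input / output sizes
    Vector x, y, d;     // input, output, and input-delta vectors

    Layer(size_t n_, size_t m_) : n(n_), m(m_), y(m_), d(n_) {}
    virtual ~Layer() {}

    size_t get_input_size() const { return n; }
    size_t get_output_size() const { return m; }
    Vector get_output() const { return y; }

    virtual Vector feed_forward(Vector in) = 0;
    virtual void init_nabla() = 0;
    virtual Vector back_propagation(Vector e) = 0;
    virtual void update(Real eta) = 0;
};

// Hypothetical concrete subclass (not part of the library): scales its
// input element-wise by one learnable weight w, so f(x) = w * x.
struct ScaleLayer : Layer {
    Real w = 2.0;       // learnable parameter
    Real nabla_w = 0.0; // accumulated gradient dL/dw

    explicit ScaleLayer(size_t n_) : Layer(n_, n_) {}

    Vector feed_forward(Vector in) override {
        x = in;                                  // keep input for backprop
        for (size_t i = 0; i < m; ++i) y[i] = w * x[i];
        return y;
    }
    void init_nabla() override { nabla_w = 0.0; }
    Vector back_propagation(Vector e) override {
        for (size_t i = 0; i < n; ++i) {
            nabla_w += e[i] * x[i];              // gradient w.r.t. w
            d[i] = w * e[i];                     // delta for previous layer
        }
        return d;
    }
    void update(Real eta) override { w -= eta * nabla_w; }
};
```

Note how `feed_forward` stores the input in `x` so that `back_propagation` can later use it to accumulate the nabla value, matching the contract described in the member function documentation below.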

Constructor & Destructor Documentation

◆ Layer()

Layer::Layer::Layer ( size_t  n,
size_t  m 
)
inline

◆ ~Layer()

Layer::Layer::~Layer ( )
inline

Member Function Documentation

◆ back_propagation()

virtual Vector Layer::Layer::back_propagation ( Vector  e)
pure virtual

Apply the back-propagation algorithm to the output delta vector e. Uses the input vector stored in x during feed_forward. Return a reference to the computed (and stored) input delta vector d. The nabla vectors must be computed here.

Implemented in Layer::ActivationLayer< A >, Layer::ConvolutionLayer, Layer::FullConnectedLayer, and Layer::Pooling.

◆ feed_forward()

virtual Vector Layer::Layer::feed_forward ( Vector  x)
pure virtual

Apply the layer to the input vector x. The member vectors x and y must be updated accordingly. Return a reference to y.

Implemented in Layer::ActivationLayer< A >, Layer::ConvolutionLayer, Layer::FullConnectedLayer, and Layer::Pooling.

◆ get_input_size()

size_t Layer::Layer::get_input_size ( ) const
inline

Return the input size.

◆ get_output()

Vector Layer::Layer::get_output ( ) const
inline

Return a reference to the computed output vector.

◆ get_output_size()

size_t Layer::Layer::get_output_size ( ) const
inline

Return the output size.

◆ init_nabla()

virtual void Layer::Layer::init_nabla ( )
pure virtual

Initialize the nabla (gradient) vectors, which are used during gradient descent.

Implemented in Layer::ActivationLayer< A >, Layer::ConvolutionLayer, Layer::FullConnectedLayer, and Layer::Pooling.

◆ update()

virtual void Layer::Layer::update ( Real  eta)
pure virtual

Update the layer parameters using the gradient-descent algorithm with learning rate eta.

Implemented in Layer::ActivationLayer< A >, Layer::ConvolutionLayer, Layer::FullConnectedLayer, and Layer::Pooling.
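Together, the four pure virtual functions define one gradient-descent step per layer. The sketch below shows the call order implied by this documentation: init_nabla, a forward pass, a backward pass in reverse layer order, then update. It reuses the same simplified stand-ins as before (`Vector` as `std::vector<double>`, `Real` as `double`, a hypothetical `ScaleLayer`); the squared-error output delta `out - target` and the `train_step` helper are assumptions for illustration, not library API.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <memory>
#include <vector>

// Stand-ins for the library's types (assumption; see layer.hpp).
using Vector = std::vector<double>;
using Real = double;

// Minimal mirror of the abstract interface.
struct Layer {
    virtual ~Layer() {}
    virtual Vector feed_forward(Vector x) = 0;
    virtual void init_nabla() = 0;
    virtual Vector back_propagation(Vector e) = 0;
    virtual void update(Real eta) = 0;
};

// Hypothetical one-parameter layer: y = w * x, element-wise.
struct ScaleLayer : Layer {
    Real w, nabla_w = 0.0;
    Vector x_saved;
    explicit ScaleLayer(Real w0) : w(w0) {}
    Vector feed_forward(Vector x) override {
        x_saved = x;
        for (auto& v : x) v *= w;
        return x;
    }
    void init_nabla() override { nabla_w = 0.0; }
    Vector back_propagation(Vector e) override {
        for (size_t i = 0; i < e.size(); ++i) {
            nabla_w += e[i] * x_saved[i];  // gradient w.r.t. w
            e[i] *= w;                     // delta for the previous layer
        }
        return e;
    }
    void update(Real eta) override { w -= eta * nabla_w; }
};

// One gradient-descent step over a stack of layers, in the call order
// the documentation implies: init_nabla, forward pass, backward pass in
// reverse layer order, then update with learning rate eta.
void train_step(std::vector<std::unique_ptr<Layer>>& net,
                Vector input, const Vector& target, Real eta) {
    for (auto& l : net) l->init_nabla();
    Vector out = std::move(input);
    for (auto& l : net) out = l->feed_forward(out);
    Vector e(out.size());                  // output delta of squared error
    for (size_t i = 0; i < e.size(); ++i) e[i] = out[i] - target[i];
    for (auto it = net.rbegin(); it != net.rend(); ++it)
        e = (*it)->back_propagation(e);
    for (auto& l : net) l->update(eta);
}
```

The reverse iteration matters: each layer's back_propagation returns the input delta that becomes the output delta of the layer before it.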

Member Data Documentation

◆ d

Vector Layer::Layer::d

Input delta vector computed by the back-propagation algorithm. Owned by the layer.

◆ m

size_t Layer::Layer::m

Size of the output vector.

◆ n

size_t Layer::Layer::n

Size of the input vector.

◆ name

string Layer::Layer::name

The name of the layer. Used for debugging.

◆ x

Vector Layer::Layer::x

A reference to the input vector.

◆ y

Vector Layer::Layer::y

Computed output vector. Owned by the layer.


The documentation for this class was generated from the following file:

layer.hpp