Simd::Neural is a C++ framework for running and training Convolutional Neural Networks.
| Enumerations |
|---|
| enum Type { Identity, Tanh, Sigmoid, Relu, LeakyRelu, Softmax } |
| enum Type { Input, Convolutional, MaxPooling, FullyConnected, Dropout } |
| enum Method { Fast, Check, Train } |
| enum InitType { Xavier } |
| enum LossType { Mse, CrossEntropyMulticlass } |
| enum UpdateType { AdaptiveGradient } |
Describes the types of activation functions. An activation type is used when creating a Layer in a Network.
Describes the types of network layers.
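The activation and layer types above are combined when a network is assembled layer by layer. Below is a minimal sketch of such an assembly, loosely modeled on the Simd library examples; the constructor arguments and signatures here are assumptions for illustration and may not match the actual Simd::Neural headers.

```cpp
#include "Simd/SimdNeural.hpp"

using namespace Simd::Neural;

int main()
{
    // A small network: convolution -> max pooling -> two fully connected
    // layers, each parameterized by one of the activation types above.
    // Constructor arguments are illustrative assumptions, not exact signatures.
    Network net;
    net.Add(new ConvolutionalLayer(Function::Relu, Size(16, 16), 1, 12, 5));
    net.Add(new MaxPoolingLayer(Function::Relu, Size(12, 12), 12, 2));
    net.Add(new FullyConnectedLayer(Function::Relu, 12 * 6 * 6, 96));
    net.Add(new FullyConnectedLayer(Function::Sigmoid, 96, 10));
    return 0;
}
```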
Describes the method of forward propagation in a network layer.
| Enumerator | Description |
|---|---|
| Fast | The fastest method. It is incompatible with the training process. |
| Check | Control checking during the training process. |
| Train | Forward propagation during the training process. |
Describes the method used to initialize the weights of the neural network.
| Enumerator | Description |
|---|---|
| Xavier | Uses fan-in and fan-out for scaling. Xavier Glorot, Yoshua Bengio, "Understanding the difficulty of training deep feedforward neural networks", Proc. AISTATS, May 2010, vol. 9, pp. 249-256. |
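The "fan-in and fan-out" scaling can be illustrated with a short sketch. This is not the library's implementation, only a minimal illustration of Glorot-style uniform initialization; the function name is hypothetical.

```cpp
#include <cmath>
#include <random>
#include <vector>

// Minimal illustration of Xavier (Glorot) uniform initialization:
// weights are drawn from U(-a, a) with a = sqrt(6 / (fanIn + fanOut)),
// so the variance of activations and gradients stays roughly constant
// across layers. Names here are hypothetical, not part of Simd::Neural.
std::vector<float> XavierInit(size_t fanIn, size_t fanOut)
{
    const float a = std::sqrt(6.0f / float(fanIn + fanOut));
    std::mt19937 gen(42);
    std::uniform_real_distribution<float> dist(-a, a);
    std::vector<float> weights(fanIn * fanOut);
    for (float & w : weights)
        w = dist(gen);
    return weights;
}
```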
Describes the loss function.
| Enumerator | Description |
|---|---|
| Mse | Mean Squared Error loss function for regression. |
| CrossEntropyMulticlass | Cross-entropy multiclass loss function for multi-class classification. |
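For reference, here is a small sketch of what these two loss functions compute, assuming a predicted vector y and a target vector t of equal length. This is illustrative code, not the library's implementation.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Mean Squared Error: average of squared differences between
// prediction and target. Suitable for regression targets.
float Mse(const std::vector<float> & y, const std::vector<float> & t)
{
    assert(y.size() == t.size());
    float sum = 0.0f;
    for (size_t i = 0; i < y.size(); ++i)
        sum += (y[i] - t[i]) * (y[i] - t[i]);
    return sum / float(y.size());
}

// Multiclass cross-entropy: -sum(t[i] * log(y[i])), where y is expected
// to be a probability distribution (e.g. the output of a Softmax layer)
// and t is typically a one-hot encoding of the correct class.
float CrossEntropyMulticlass(const std::vector<float> & y, const std::vector<float> & t)
{
    assert(y.size() == t.size());
    const float eps = 1e-7f; // guards against log(0)
    float sum = 0.0f;
    for (size_t i = 0; i < y.size(); ++i)
        sum += t[i] * std::log(y[i] + eps);
    return -sum;
}
```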
Describes the method used to update the weights.
| Enumerator | Description |
|---|---|
| AdaptiveGradient | Adaptive gradient method. J. Duchi, E. Hazan and Y. Singer, "Adaptive subgradient methods for online learning and stochastic optimization", The Journal of Machine Learning Research, pp. 2121-2159, 2011. Note: see SimdNeuralAdaptiveGradientUpdate. |
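The adaptive gradient (AdaGrad) rule accumulates the square of each weight's gradient and scales the learning rate by the inverse square root of that accumulation, so frequently updated weights receive smaller steps. Below is a minimal sketch of that update with hypothetical parameter names; it is not the library's routine (the library exposes SimdNeuralAdaptiveGradientUpdate, whose exact signature and semantics are documented separately).

```cpp
#include <cmath>
#include <cstddef>

// Illustrative AdaGrad update for a single weight vector.
// delta    - gradient of the loss accumulated over the current batch
// size     - number of weights
// batch    - number of samples the gradient was accumulated over
// alpha    - base learning rate
// epsilon  - small constant preventing division by zero
// gradient - running sum of squared (batch-averaged) gradients
// weight   - weights to be updated in place
void AdaptiveGradientUpdate(const float * delta, size_t size, size_t batch,
    float alpha, float epsilon, float * gradient, float * weight)
{
    for (size_t i = 0; i < size; ++i)
    {
        float d = delta[i] / float(batch);  // average gradient over the batch
        gradient[i] += d * d;               // accumulate squared gradient
        weight[i] -= alpha * d / std::sqrt(gradient[i] + epsilon);
    }
}
```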