10 Deep learning
Deep learning refers to neural networks with many layers (typically more than three) that learn representations directly from data.
- RNN (recurrent neural network) is an architecture for sequential data (i.e. data ordered into sequences), such as time series of stock prices or temperature readings.
- NLP (natural language processing) is the field concerned with programming computers to process and learn from large amounts of textual data.
- Batch normalization is a technique for training very deep neural networks that standardizes the inputs to a layer for each mini-batch. This stabilizes the learning process and dramatically reduces the number of training epochs required. It also makes the model less sensitive to hyperparameter tuning: higher learning rates become acceptable (which speeds up training) and weight initialization becomes less critical.
- A perceptron is the simplest NN: a single neuron that performs two functions. The first computes the weighted sum of all the inputs; the second applies an activation function to that sum.
- Autoencoders are networks that learn to compress inputs and reconstruct them as outputs with the minimum possible error. They can be used for anomaly detection, since anomalous inputs reconstruct poorly.
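The perceptron's two functions (weighted sum, then activation) can be sketched in a few lines. The weights and bias below are hand-picked assumptions chosen to implement a logical AND, just for illustration:

```python
def perceptron(inputs, weights, bias):
    # First function: weighted sum of all inputs plus a bias term
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Second function: a step activation function
    return 1 if total > 0 else 0

# Hypothetical hand-picked parameters that realize an AND gate
weights = [1.0, 1.0]
bias = -1.5

for a in (0, 1):
    for b in (0, 1):
        print((a, b), "->", perceptron([a, b], weights, bias))
```

In practice the weights and bias would be learned from data rather than set by hand.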
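The batch-normalization bullet above can be made concrete with a minimal sketch of the core operation for one activation: standardize across the mini-batch, then apply a learnable scale (`gamma`) and shift (`beta`). Real implementations also track running statistics for inference; that is omitted here:

```python
import math

def batch_norm(batch, gamma=1.0, beta=0.0, eps=1e-5):
    # Standardize one activation's values across the mini-batch,
    # then scale by gamma and shift by beta (both learnable in practice).
    mean = sum(batch) / len(batch)
    var = sum((x - mean) ** 2 for x in batch) / len(batch)
    return [gamma * (x - mean) / math.sqrt(var + eps) + beta for x in batch]

# After normalization the mini-batch has (approximately) zero mean, unit variance
normalized = batch_norm([2.0, 4.0, 6.0, 8.0])
print(normalized)
```

The `eps` term guards against division by zero when the batch variance is tiny.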
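The recurrence that lets an RNN handle sequential data can be shown with a toy one-dimensional cell: each step mixes the current input with the previous hidden state. The weights here are arbitrary assumed values, not trained:

```python
import math

def rnn_step(x_t, h_prev, w_x, w_h, b):
    # One recurrent step: the new hidden state depends on the current
    # input AND the previous hidden state, which carries sequence memory.
    return math.tanh(w_x * x_t + w_h * h_prev + b)

# Unroll the cell over a short sequence (e.g. temperature readings)
h = 0.0
for x in [0.5, -0.1, 0.8]:
    h = rnn_step(x, h, w_x=1.0, w_h=0.5, b=0.0)
print(h)
```

Because `h` is fed back in at every step, the final state summarizes the whole sequence, which is what makes RNNs suited to time series.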
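A minimal sketch of autoencoder-based anomaly detection, using a tiny linear autoencoder with tied weights. The weight vector is assumed already trained so that the one-dimensional latent direction matches the normal data pattern (points along the diagonal y = x); points off that pattern reconstruct poorly:

```python
import math

# Assumed pre-trained tied weights: latent direction along the diagonal y = x
w = (1 / math.sqrt(2), 1 / math.sqrt(2))

def reconstruct(p):
    # Encode the 2-D point to a single latent value, then decode it back
    z = w[0] * p[0] + w[1] * p[1]
    return (w[0] * z, w[1] * z)

def reconstruction_error(p):
    # Anomaly score: distance between the input and its reconstruction
    return math.dist(p, reconstruct(p))

normal_error = reconstruction_error((2.0, 2.0))    # on the learned pattern
anomaly_error = reconstruction_error((2.0, -2.0))  # far from the pattern
print(normal_error, anomaly_error)
```

Flagging inputs whose reconstruction error exceeds a threshold is the usual way autoencoders are applied to anomaly detection.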