Batch Normalization
Batch Normalization is a technique for making artificial neural network training faster and more stable by normalizing a layer's inputs over each training mini-batch: each feature is standardized to zero mean and unit variance using the batch statistics, and then rescaled and shifted by learnable parameters. This normalization reduces the effect of internal covariate shift, allowing training with higher learning rates and less sensitivity to initialization.
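As a concrete illustration, here is a minimal NumPy sketch of the batch-normalization forward pass during training. The function name `batch_norm_forward`, the epsilon value, and the toy data are illustrative assumptions, not part of the original article.

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Normalize each feature of a mini-batch, then scale and shift.

    x     : (N, D) mini-batch of N examples with D features
    gamma : (D,) learnable scale parameter
    beta  : (D,) learnable shift parameter
    eps   : small constant for numerical stability (assumed value)
    """
    mean = x.mean(axis=0)                      # per-feature mean over the batch
    var = x.var(axis=0)                        # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)    # standardize: zero mean, unit variance
    return gamma * x_hat + beta                # learnable rescale and shift

# Usage on a random, shifted and scaled mini-batch (hypothetical data)
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(64, 4))
out = batch_norm_forward(x, gamma=np.ones(4), beta=np.zeros(4))
print(out.mean(axis=0).round(3))   # approximately 0 per feature
print(out.std(axis=0).round(3))    # approximately 1 per feature
```

At inference time, a real implementation would use running averages of the batch mean and variance gathered during training rather than per-batch statistics; that bookkeeping is omitted here to keep the sketch short.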