
Does BatchNormalization count as a layer in a network?

Published on 2020-12-05 23:46:41

Is a BatchNormalization layer considered a layer in a neural network? For example, when we say ResNet50 has 50 layers, does that mean that some of those layers may be BatchNormalization layers?

When building models in Keras I treated it as an extra, similar to a Dropout layer or an Activation layer. But BatchNormalization has trainable parameters, so... I am confused.
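
For reference, a minimal sketch (assuming TensorFlow/Keras and a hypothetical 64-feature input) of what I mean by "trainable parameters":

    import tensorflow as tf

    # BatchNormalization carries learnable weights (gamma, beta) plus
    # non-trainable moving statistics, unlike Dropout or Activation.
    bn = tf.keras.layers.BatchNormalization()
    bn.build(input_shape=(None, 64))
    print([w.name for w in bn.trainable_weights])      # gamma, beta
    print([w.name for w in bn.non_trainable_weights])  # moving_mean, moving_variance

    drop = tf.keras.layers.Dropout(0.5)
    drop.build(input_shape=(None, 64))
    print(len(drop.trainable_weights))                 # 0 -- nothing to learn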

Asked by Liubove
Answered by Rika on 2020-12-07 13:26:13

In the deep learning literature, an X-layer network refers to the number of learnable layers that give the network its representational capacity.
Activation layers, normalization layers (such as LRN or BatchNorm), and downsampling layers (such as MaxPooling) are not counted.

Layers that are responsible for the representational capacity of the network, such as convolutional, recurrent, and fully connected (FC) layers, are the ones that are counted.
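
A rough sketch (assuming tf.keras.applications is available) of how that counting plays out for ResNet50 in Keras: tallying the layer classes shows many BatchNormalization and Activation layers, but only the Conv2D and Dense layers are the ones the name refers to.

    import collections
    import tensorflow as tf

    # Build ResNet50 without downloading pretrained weights.
    model = tf.keras.applications.ResNet50(weights=None)

    # Tally every layer class in the Keras implementation.
    counts = collections.Counter(type(layer).__name__ for layer in model.layers)
    print(counts)  # Conv2D, BatchNormalization, Activation, Dense, ...

    # Count only the learnable "representational" layers.
    learnable = [l for l in model.layers
                 if isinstance(l, (tf.keras.layers.Conv2D, tf.keras.layers.Dense))]
    print(len(learnable))  # close to 50; projection-shortcut convolutions add a few extras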