niftynet.layer.gn module

class GNLayer(group_size=32, regularizer=None, eps=1e-05, name='group_norm')[source]

Bases: niftynet.layer.base_layer.TrainableLayer

Group normalisation layer, with a trainable mean offset ‘beta’ and scale ‘gamma’. ‘beta’ is initialised to 0.0 and ‘gamma’ is initialised to 1.0. This class assumes ‘beta’ and ‘gamma’ share the same type of regulariser.

Reimplementation of Wu and He, Group Normalization, arXiv:1803.08494 (2018)
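The normalisation described by Wu and He can be sketched in plain NumPy: channels are split into groups, each group is normalised over the spatial dimensions and its own channels, and the result is scaled and shifted by ‘gamma’ and ‘beta’. This is an illustrative sketch of the technique, not the NiftyNet implementation itself; the `num_groups` parameter name and NHWC layout are assumptions for the example.

```python
import numpy as np

def group_norm(x, num_groups=2, eps=1e-5, gamma=None, beta=None):
    """Group normalisation of an NHWC tensor (illustrative NumPy sketch)."""
    n, h, w, c = x.shape
    assert c % num_groups == 0, "channels must divide evenly into groups"
    # Split the channel axis into (groups, channels-per-group).
    g = x.reshape(n, h, w, num_groups, c // num_groups)
    # Mean and variance per sample and per group, over spatial dims
    # and the channels inside the group.
    mean = g.mean(axis=(1, 2, 4), keepdims=True)
    var = g.var(axis=(1, 2, 4), keepdims=True)
    g = (g - mean) / np.sqrt(var + eps)
    x = g.reshape(n, h, w, c)
    # Trainable affine parameters: beta initialised to 0, gamma to 1.
    gamma = np.ones(c) if gamma is None else gamma
    beta = np.zeros(c) if beta is None else beta
    return x * gamma + beta
```

With the default ‘beta’ and ‘gamma’, each group of the output has zero mean and unit variance, independent of the batch size; this is what makes group normalisation attractive for the small batches common in medical imaging.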

layer_op(inputs)[source]