niftynet.layer.activation module¶
class ActiLayer(func, regularizer=None, name='activation')[source]¶

Bases: niftynet.layer.base_layer.TrainableLayer

Applies an element-wise non-linear activation function. The 'prelu' activation uses trainable parameters, which are initialised to zeros. A dropout function is also supported.
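The PReLU behaviour described above can be sketched as follows. This is an illustrative NumPy version, not NiftyNet's TensorFlow implementation: the `alpha` parameter stands in for the layer's trainable variable, and initialising it to zeros makes the function behave exactly like a plain ReLU until training updates it.

```python
import numpy as np

def prelu(x, alpha):
    # Element-wise PReLU: identity for positive inputs,
    # alpha-scaled linear response for negative inputs.
    return np.maximum(x, 0.0) + alpha * np.minimum(x, 0.0)

x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])

# alpha is trainable in the real layer; initialised to zeros,
# so the initial output matches ReLU.
alpha = np.zeros_like(x)
print(prelu(x, alpha))  # -> [0. 0. 0. 1. 2.]
```

With a non-zero `alpha`, negative inputs pass through scaled by `alpha` instead of being clipped, which is what makes the slope learnable.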