niftynet.layer.activation module
class niftynet.layer.activation.ActiLayer(func, regularizer=None, name='activation')

    Bases: niftynet.layer.base_layer.TrainableLayer

    Apply an element-wise non-linear activation function. 'prelu' uses trainable parameters, which are initialised to zeros. The dropout function is also supported.
    layer_op(input_tensor, keep_prob=None)
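The behaviour described above can be sketched in NumPy: the layer looks up an element-wise operation by the func string and, when keep_prob is given, applies dropout. This is a hypothetical stand-in for the TensorFlow ops the real layer dispatches to; the ACTIVATIONS table and apply_activation helper below are illustrative assumptions, not the library's API.

```python
import numpy as np

# Hypothetical sketch: the real ActiLayer maps `func` strings to TensorFlow ops.
ACTIVATIONS = {
    'relu': lambda x: np.maximum(x, 0.0),
    'tanh': np.tanh,
    'sigmoid': lambda x: 1.0 / (1.0 + np.exp(-x)),
}

def apply_activation(input_tensor, func, keep_prob=None):
    """Apply the named element-wise activation; optionally apply dropout."""
    out = ACTIVATIONS[func](input_tensor)
    if keep_prob is not None:
        # Inverted dropout: zero elements with probability (1 - keep_prob),
        # rescale the survivors so the expected value is unchanged.
        mask = np.random.rand(*out.shape) < keep_prob
        out = np.where(mask, out / keep_prob, 0.0)
    return out

x = np.array([-1.0, 0.5, 2.0])
print(apply_activation(x, 'relu'))
```

With keep_prob=1.0 the dropout branch keeps every element, so the output equals the plain activation.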
niftynet.layer.activation.leakyRelu(x, name)
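A minimal NumPy sketch of leaky ReLU: positive inputs pass through unchanged, negative inputs are scaled by a small slope. The slope value 0.02 below is an assumption for illustration, not necessarily the constant this module uses.

```python
import numpy as np

def leaky_relu(x, alpha=0.02):
    """Leaky ReLU: identity for x > 0, small linear slope for x <= 0.
    The slope `alpha` here is an assumed illustrative value."""
    return np.where(x > 0, x, alpha * x)

print(leaky_relu(np.array([-2.0, 3.0])))
```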
niftynet.layer.activation.prelu(f_in, channelwise_params)
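PReLU generalises leaky ReLU by making the negative-side slope a trainable per-channel parameter; with the zero initialisation noted above, the layer starts out behaving like a plain ReLU. A NumPy sketch (the broadcasting layout below, parameters along the last axis, is an assumption):

```python
import numpy as np

def prelu_np(f_in, channelwise_params):
    """PReLU: positive part unchanged, negative part scaled per channel.
    `channelwise_params` broadcasts over the trailing (channel) axis."""
    pos = np.maximum(f_in, 0.0)
    neg = channelwise_params * np.minimum(f_in, 0.0)
    return pos + neg

f_in = np.array([[-1.0, -2.0], [3.0, 4.0]])
params = np.array([0.1, 0.25])  # one learned slope per channel
print(prelu_np(f_in, params))
```

With params of all zeros, the negative term vanishes and the result equals relu(f_in).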
niftynet.layer.activation.selu(x, name)
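SELU is the scaled exponential linear unit of Klambauer et al. (2017): scale * x for x > 0 and scale * alpha * (exp(x) - 1) otherwise, with fixed constants chosen so that activations self-normalise. A NumPy sketch using the standard published constants:

```python
import numpy as np

# Fixed constants from "Self-Normalizing Neural Networks" (Klambauer et al., 2017).
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def selu_np(x):
    """SELU: scale * x for x > 0, scale * alpha * (exp(x) - 1) for x <= 0."""
    return SCALE * np.where(x > 0, x, ALPHA * (np.exp(x) - 1.0))
```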