niftynet.layer.activation module

prelu(f_in, channelwise_params)[source]
selu(x, name)[source]
leaky_relu(x, name)[source]
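
The parametric ReLU is the only one of these activations with trainable parameters. A minimal sketch of the computation it performs, assuming a TensorFlow backend (function and variable names here are illustrative, not the library's exact implementation):

import tensorflow as tf

def prelu_sketch(f_in, channelwise_params):
    # PReLU: identity for positive inputs, a learned per-channel
    # slope for negative inputs.
    pos = tf.nn.relu(f_in)
    # (x - |x|) * 0.5 equals min(0, x), scaled by the trainable slope.
    neg = channelwise_params * (f_in - tf.abs(f_in)) * 0.5
    return pos + neg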
class ActiLayer(func, regularizer=None, name='activation')[source]

Bases: niftynet.layer.base_layer.TrainableLayer

Applies an element-wise non-linear activation function. ‘prelu’ uses trainable parameters, which are initialised to zeros. Dropout is also supported (via the keep_prob argument of layer_op).

layer_op(input_tensor, keep_prob=None)[source]
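
A hedged usage sketch, assuming the layer is invoked via its __call__ like other NiftyNet TrainableLayer subclasses; the func values, input shape, and keep_prob value are illustrative assumptions:

import tensorflow as tf
from niftynet.layer.activation import ActiLayer

# Element-wise PReLU activation with trainable, zero-initialised parameters.
prelu_layer = ActiLayer(func='prelu', name='prelu_act')
features = tf.ones(shape=(2, 16, 16, 16, 8))  # illustrative 3-D feature map
activated = prelu_layer(features)

# Dropout through the same layer interface (keep_prob passed to layer_op).
dropout_layer = ActiLayer(func='dropout', name='dropout_act')
dropped = dropout_layer(features, keep_prob=0.5)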