niftynet.layer.convolution module

default_w_initializer()[source]
Returns the default initializer for convolution kernel weights (used when w_initializer is None).
default_b_initializer()[source]
Returns the default initializer for bias terms (used when b_initializer is None).
class ConvLayer(n_output_chns, kernel_size=3, stride=1, dilation=1, padding='SAME', with_bias=False, w_initializer=None, w_regularizer=None, b_initializer=None, b_regularizer=None, padding_constant=0, name='conv')[source]

Bases: niftynet.layer.base_layer.TrainableLayer

This class defines a simple convolution with an optional bias term. Please consider ConvolutionalLayer if batch normalization and activation are also required.

__init__(n_output_chns, kernel_size=3, stride=1, dilation=1, padding='SAME', with_bias=False, w_initializer=None, w_regularizer=None, b_initializer=None, b_regularizer=None, padding_constant=0, name='conv')[source]
Parameters: padding_constant – the constant value used to fill the padded regions of the input (see also tf.pad)

layer_op(input_tensor)[source]
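
A minimal usage sketch, assuming TensorFlow 1.x graph mode and an illustrative 5-D volumetric input of shape [batch, x, y, z, channels]; the parameter values are hypothetical:

import tensorflow as tf
from niftynet.layer.convolution import ConvLayer

# a 3x3x3 convolution producing 16 output channels, with a bias term
conv_op = ConvLayer(n_output_chns=16,
                    kernel_size=3,
                    stride=1,
                    with_bias=True,
                    name='conv_16')

# hypothetical 5-D input: [batch, x, y, z, channels]
input_tensor = tf.placeholder(tf.float32, shape=(1, 32, 32, 32, 1))

# calling the layer invokes layer_op; with 'SAME' padding and stride 1
# the spatial dimensions are preserved, giving shape (1, 32, 32, 32, 16)
output_tensor = conv_op(input_tensor)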
class ConvolutionalLayer(n_output_chns, kernel_size=3, stride=1, dilation=1, padding='SAME', with_bias=False, feature_normalization='batch', group_size=-1, acti_func=None, preactivation=False, w_initializer=None, w_regularizer=None, b_initializer=None, b_regularizer=None, moving_decay=0.9, eps=1e-05, padding_constant=0, name='conv')[source]

Bases: niftynet.layer.base_layer.TrainableLayer

This class defines a composite layer with optional components:

convolution -> feature_normalization (default batch norm) -> activation -> dropout

The b_initializer and b_regularizer are applied to the ConvLayer. The w_initializer and w_regularizer are applied to the ConvLayer, the feature normalization layer, and the activation layer (for 'prelu').

__init__(n_output_chns, kernel_size=3, stride=1, dilation=1, padding='SAME', with_bias=False, feature_normalization='batch', group_size=-1, acti_func=None, preactivation=False, w_initializer=None, w_regularizer=None, b_initializer=None, b_regularizer=None, moving_decay=0.9, eps=1e-05, padding_constant=0, name='conv')[source]
Parameters: padding_constant – the constant value used to fill padded regions when padding with the 'CONSTANT' mode (see also tf.pad)
layer_op(input_tensor, is_training=None, keep_prob=None)[source]
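
A minimal sketch of the composite pipeline (convolution -> batch norm -> ReLU -> dropout), again assuming TensorFlow 1.x graph mode and an illustrative 5-D input; is_training selects the batch-norm statistics, and passing a keep_prob below 1.0 enables the dropout component:

import tensorflow as tf
from niftynet.layer.convolution import ConvolutionalLayer

# convolution -> batch normalization -> ReLU activation -> dropout
conv_bn_relu = ConvolutionalLayer(n_output_chns=32,
                                  kernel_size=3,
                                  feature_normalization='batch',
                                  acti_func='relu',
                                  name='conv_bn_relu')

# hypothetical 5-D input: [batch, x, y, z, channels]
input_tensor = tf.placeholder(tf.float32, shape=(1, 32, 32, 32, 1))

# is_training=True uses local batch statistics and updates the moving
# averages; keep_prob < 1.0 activates the dropout step of the pipeline
output_tensor = conv_bn_relu(input_tensor,
                             is_training=True,
                             keep_prob=0.9)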