niftynet.layer.fully_connected module

default_w_initializer()
default_b_initializer()
class FCLayer(n_output_chns, with_bias=True, w_initializer=None, w_regularizer=None, b_initializer=None, b_regularizer=None, name='fc')

Bases: niftynet.layer.base_layer.TrainableLayer

This class defines a simple fully connected layer with an optional bias term. Consider FullyConnectedLayer instead if batch normalisation and an activation function are also needed.

layer_op(input_tensor)
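
A minimal usage sketch (assuming TensorFlow 1.x graph mode, which NiftyNet targets; the input shape and layer name are illustrative). Calling a layer instance invokes layer_op within the layer's variable scope:

    import tensorflow as tf
    from niftynet.layer.fully_connected import FCLayer

    # A batch of 8 feature vectors with 64 channels (illustrative shape).
    features = tf.placeholder(tf.float32, shape=(8, 64))

    # Instantiating the layer only configures it; calling the instance
    # builds the variables and applies the matmul (+ optional bias).
    fc = FCLayer(n_output_chns=10, with_bias=True, name='fc_demo')
    output = fc(features)  # shape: (8, 10)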
class FullyConnectedLayer(n_output_chns, with_bias=True, feature_normalization='batch', group_size=-1, acti_func=None, w_initializer=None, w_regularizer=None, b_initializer=None, b_regularizer=None, moving_decay=0.9, eps=1e-05, name='fc')

Bases: niftynet.layer.base_layer.TrainableLayer

This class defines a composite layer with optional components:

fully connected layer -> batch_norm -> activation -> dropout

The b_initializer and b_regularizer are applied to the FCLayer only. The w_initializer and w_regularizer are applied to the FCLayer, the batch normalisation layer, and the activation layer (for 'prelu').
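
For instance, a hedged sketch of how a single w_regularizer is shared across the sub-layers (the regularizer choice and scale are illustrative):

    import tensorflow as tf
    from niftynet.layer.fully_connected import FullyConnectedLayer

    # The same w_regularizer reaches the FC weights, the batch-norm
    # parameters, and the trainable 'prelu' parameter; b_regularizer
    # would only touch the FC bias.
    layer = FullyConnectedLayer(
        n_output_chns=10,
        acti_func='prelu',
        w_regularizer=tf.contrib.layers.l2_regularizer(0.5),
        name='fc_prelu')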

layer_op(input_tensor, is_training=None, keep_prob=None)
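
A sketch of the full composite pipeline in use (again assuming TensorFlow 1.x graph mode; shapes, names, and hyper-parameters are illustrative):

    import tensorflow as tf
    from niftynet.layer.fully_connected import FullyConnectedLayer

    features = tf.placeholder(tf.float32, shape=(8, 64))

    fc_bn_relu = FullyConnectedLayer(
        n_output_chns=10,
        feature_normalization='batch',  # enables the batch_norm step
        acti_func='relu',               # enables the activation step
        name='fc_bn_relu')

    # is_training selects batch-norm behaviour (batch statistics during
    # training vs moving averages at inference); keep_prob < 1.0 enables
    # the optional dropout step.
    output = fc_bn_relu(features, is_training=True, keep_prob=0.8)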