niftynet.layer.fully_connected module¶
class niftynet.layer.fully_connected.FCLayer(n_output_nodes, with_bias=True, w_initializer=None, w_regularizer=None, b_initializer=None, b_regularizer=None, name='fc')¶

Bases: niftynet.layer.base_layer.TrainableLayer

This class defines a simple fully connected layer with an optional bias term. Consider FullyConnectedLayer instead if batch normalisation and an activation function are also needed.

layer_op(input_tensor)¶
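The underlying computation can be illustrated with a minimal NumPy sketch (the function name `fc_layer` and the shapes are illustrative, not NiftyNet's API; the real layer creates TensorFlow variables via `w_initializer`/`b_initializer`): each batch element is flattened to a vector, projected with a weight matrix, and optionally shifted by a bias.

```python
import numpy as np

def fc_layer(input_tensor, weights, bias=None):
    """Sketch of a fully connected layer: flatten each batch
    element, then apply y = x @ W (+ b)."""
    batch_size = input_tensor.shape[0]
    flat = input_tensor.reshape(batch_size, -1)  # (batch, n_input_nodes)
    output = flat @ weights                      # (batch, n_output_nodes)
    if bias is not None:
        output = output + bias                   # broadcast over the batch
    return output

# Example: a batch of 2 inputs with a 3x4 layout, mapped to 5 output nodes.
rng = np.random.default_rng(0)
x = rng.normal(size=(2, 3, 4))
w = rng.normal(size=(12, 5))
b = np.zeros(5)
y = fc_layer(x, w, b)
print(y.shape)  # (2, 5)
```

Here `n_output_nodes` corresponds to the second dimension of `weights`; the input's trailing dimensions are flattened automatically.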
class niftynet.layer.fully_connected.FullyConnectedLayer(n_output_nodes, with_bias=True, with_bn=True, acti_func=None, w_initializer=None, w_regularizer=None, b_initializer=None, b_regularizer=None, moving_decay=0.9, eps=1e-05, name='fc')¶

Bases: niftynet.layer.base_layer.TrainableLayer

This class defines a composite layer with optional components:

fully connected layer -> batch_norm -> activation -> dropout

The b_initializer and b_regularizer are applied to the FCLayer. The w_initializer and w_regularizer are applied to the FCLayer, the batch normalisation layer, and the activation layer (for ‘prelu’).

layer_op(input_tensor, is_training=None, keep_prob=None)¶
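The composite pipeline can be sketched in NumPy as follows. This is an assumption-laden illustration, not NiftyNet's implementation: ReLU stands in for the configurable `acti_func`, batch statistics are computed per call (the real layer tracks moving averages with `moving_decay`), and inverted dropout is applied only when `keep_prob` is given, mirroring the `layer_op` parameters.

```python
import numpy as np

def composite_fc(x, weights, bias, gamma, beta,
                 is_training=True, keep_prob=None,
                 eps=1e-5, rng=None):
    """Illustrative sketch of the composite pipeline:
    fully connected -> batch norm -> activation -> dropout."""
    # Fully connected: flatten and project.
    h = x.reshape(x.shape[0], -1) @ weights + bias
    # Batch normalisation over the batch axis (training-time statistics;
    # the real layer also maintains moving averages via moving_decay).
    mean = h.mean(axis=0)
    var = h.var(axis=0)
    h = gamma * (h - mean) / np.sqrt(var + eps) + beta
    # Activation (ReLU stands in for the configurable acti_func).
    h = np.maximum(h, 0.0)
    # Inverted dropout, applied only when keep_prob is given.
    if is_training and keep_prob is not None:
        rng = rng or np.random.default_rng()
        mask = rng.random(h.shape) < keep_prob
        h = h * mask / keep_prob
    return h

# Example: batch of 4 vectors of length 6, projected to 3 output nodes.
rng = np.random.default_rng(1)
x = rng.normal(size=(4, 6))
w = rng.normal(size=(6, 3))
out = composite_fc(x, w, np.zeros(3), np.ones(3), np.zeros(3),
                   is_training=True, keep_prob=0.8, rng=rng)
print(out.shape)  # (4, 3)
```

At inference time (`is_training=False`) dropout is skipped, which matches the usual convention for `keep_prob` in the signature above.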
niftynet.layer.fully_connected.default_b_initializer()¶

niftynet.layer.fully_connected.default_w_initializer()¶
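The actual functions return TensorFlow initializer objects; the sketch below only illustrates the convention they typically follow, and the exact scaling rule is an assumption: weights drawn from a fan-in-scaled truncated normal (He-style, stddev = sqrt(2 / fan_in), values beyond two standard deviations redrawn), biases initialised to zero.

```python
import numpy as np

def default_w_initializer(shape, rng=None):
    """Assumed convention: He-style truncated normal,
    stddev = sqrt(2 / fan_in), clipped by redrawing."""
    rng = rng or np.random.default_rng()
    fan_in = int(np.prod(shape[:-1]))
    stddev = np.sqrt(2.0 / fan_in)
    w = rng.normal(0.0, stddev, size=shape)
    # Redraw values outside two standard deviations (truncated normal).
    out_of_range = np.abs(w) > 2 * stddev
    while out_of_range.any():
        w[out_of_range] = rng.normal(0.0, stddev, size=out_of_range.sum())
        out_of_range = np.abs(w) > 2 * stddev
    return w

def default_b_initializer(shape):
    """Biases conventionally start at zero."""
    return np.zeros(shape)

w = default_w_initializer((100, 50), rng=np.random.default_rng(0))
b = default_b_initializer(5)
print(w.shape, b.sum())  # (100, 50) 0.0
```

Zero-initialised biases are safe because the weight matrix already breaks symmetry; the fan-in scaling keeps activation variance roughly constant through ReLU layers.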