niftynet.layer.base_layer module

class Invertible[source]

Bases: object

Interface for invertible data transformations: operations that can be undone via inverse_op.

inverse_op(*args, **kwargs)[source]
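
A minimal sketch of pairing Invertible with Layer, so that inverse_op undoes layer_op (the class name ShiftLayer and its offset parameter are illustrative, not part of NiftyNet):

from niftynet.layer.base_layer import Invertible, Layer

class ShiftLayer(Layer, Invertible):
    """Illustrative invertible operation: layer_op adds a constant
    offset, inverse_op removes it."""

    def __init__(self, offset=1.0, name='shift'):
        super(ShiftLayer, self).__init__(name=name)
        self._offset = offset

    def layer_op(self, x):
        return x + self._offset

    def inverse_op(self, x):
        return x - self._offset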
class Layer(name='untitled_op')[source]

Bases: object

layer_op(*args, **kwargs)[source]
layer_scope()[source]
to_string()[source]
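
A minimal sketch of a Layer subclass (the class name PlusOneLayer is illustrative); it assumes the usual NiftyNet pattern in which calling the instance delegates to layer_op inside the layer's variable scope:

import tensorflow as tf

from niftynet.layer.base_layer import Layer

class PlusOneLayer(Layer):
    """Illustrative stateless layer: adds one to the input tensor."""

    def __init__(self, name='plus_one'):
        super(PlusOneLayer, self).__init__(name=name)

    def layer_op(self, input_tensor):
        # the layer's computation; no trainable variables here
        return input_tensor + 1.0

plus_one = PlusOneLayer()
# calling the instance is assumed to run layer_op within layer_scope()
output = plus_one(tf.constant([1.0, 2.0]))
print(plus_one.to_string())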
class TrainableLayer(name='trainable_op')[source]

Bases: niftynet.layer.base_layer.Layer

Extends the Layer class with trainable parameters, adding initializers and regularizers.

trainable_variables()[source]
restore_from_checkpoint(checkpoint_name, scope=None)[source]
regularizer_loss()[source]
num_trainable_params()[source]
to_string()[source]
initializers
regularizers
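
A minimal sketch of a TrainableLayer subclass (the class name ScaleLayer and its single scalar weight are illustrative), assuming TensorFlow 1.x variable creation inside layer_op and the initializers/regularizers dictionaries being set in the constructor:

import tensorflow as tf

from niftynet.layer.base_layer import TrainableLayer

class ScaleLayer(TrainableLayer):
    """Illustrative trainable layer: multiplies the input by a learned scalar."""

    def __init__(self, w_initializer=None, w_regularizer=None, name='scale'):
        super(ScaleLayer, self).__init__(name=name)
        # initializers/regularizers are plain dicts consumed in layer_op
        self.initializers = {'w': w_initializer or tf.constant_initializer(1.0)}
        self.regularizers = {'w': w_regularizer}

    def layer_op(self, input_tensor):
        weight = tf.get_variable(
            'w', shape=(),
            initializer=self.initializers['w'],
            regularizer=self.regularizers['w'])
        return input_tensor * weight

Once such a layer has been connected to a graph, trainable_variables(), num_trainable_params() and regularizer_loss() can be used to inspect the variables created under its scope.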
class DataDependentLayer(name='data_dependent_op')[source]

Bases: niftynet.layer.base_layer.Layer

Some layers require a one-pass training over the training set to determine their internal models. This abstract class provides interfaces for training these internal models and querying their status.

is_ready()[source]
train(*args, **kwargs)[source]
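
A minimal sketch of a data-dependent layer (MeanNormalisationLayer is a hypothetical name, not a NiftyNet class) whose internal model is a mean estimated from one pass over the training data:

import numpy as np

from niftynet.layer.base_layer import DataDependentLayer

class MeanNormalisationLayer(DataDependentLayer):
    """Illustrative data-dependent layer: subtracts a mean estimated
    from a pass over the training set."""

    def __init__(self, name='mean_normalisation'):
        super(MeanNormalisationLayer, self).__init__(name=name)
        self._mean = None

    def is_ready(self):
        # the internal model here is the estimated mean
        return self._mean is not None

    def train(self, image_list):
        # one pass over the training images to fit the internal model
        self._mean = np.mean([np.mean(img) for img in image_list])

    def layer_op(self, image):
        assert self.is_ready(), 'call train() before applying the layer'
        return image - self._mean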
class RandomisedLayer(name='randomised_op')[source]

Bases: niftynet.layer.base_layer.Layer

Layers of this type require a randomisation process, randomly changing some of the layer's states on the fly.

randomise(*args, **kwargs)[source]
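
A minimal sketch of a randomised layer (RandomFlipLayer is illustrative); randomise() re-samples the layer's state, typically once per training sample, before layer_op is applied:

import numpy as np

from niftynet.layer.base_layer import RandomisedLayer

class RandomFlipLayer(RandomisedLayer):
    """Illustrative randomised layer: flips an array along axis 0
    with probability 0.5."""

    def __init__(self, name='random_flip'):
        super(RandomFlipLayer, self).__init__(name=name)
        self._do_flip = False

    def randomise(self):
        # re-sample the layer's state on the fly
        self._do_flip = np.random.rand() < 0.5

    def layer_op(self, image):
        return np.flip(image, axis=0) if self._do_flip else image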
class LayerFromCallable(layer_op, name='from_callable_op')[source]

Bases: niftynet.layer.base_layer.Layer

Module wrapping a function provided by the user. Analogous to snt.Module in Sonnet.

layer_op(*args, **kwargs)[source]
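
A minimal usage sketch, wrapping tf.nn.relu so it can be passed around wherever a Layer instance is expected (assuming the wrapped instance is callable like any other Layer):

import tensorflow as tf

from niftynet.layer.base_layer import LayerFromCallable

relu_layer = LayerFromCallable(tf.nn.relu, name='relu')
output = relu_layer(tf.constant([-1.0, 2.0]))  # delegates to tf.nn.relu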