niftynet.engine.application_optimiser module

To customise the optimiser, including adding new optimisation methods, learning rate decay schedules, or other optional parameters of the optimiser:

create a newclass.py that defines a class NewOptimiser implementing get_instance(), then set the corresponding config parameter in the config file, or from the command line specify --optimiser newclass.NewOptimiser (a minimal sketch is shown below).
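
For illustration, here is a minimal sketch of such a newclass.py, assuming TensorFlow 1.x (which NiftyNet builds on). The class name NewOptimiser, the exponential decay schedule, and the momentum value are placeholder choices and not part of the NiftyNet API; only the get_instance(learning_rate) static method is required by the interface documented below.

    # newclass.py -- hypothetical custom optimiser factory
    import tensorflow as tf


    class NewOptimiser(object):
        """Momentum optimiser with an exponentially decaying learning rate."""

        @staticmethod
        def get_instance(learning_rate):
            # wrap the configured learning rate in a decay schedule
            decayed_lr = tf.train.exponential_decay(
                learning_rate,
                global_step=tf.train.get_or_create_global_step(),
                decay_steps=1000,
                decay_rate=0.96,
                staircase=True)
            # create an instance of the optimiser
            return tf.train.MomentumOptimizer(
                learning_rate=decayed_lr, momentum=0.9)

With newclass.py on the Python path, specifying --optimiser newclass.NewOptimiser lets the engine call NewOptimiser.get_instance() to obtain the TensorFlow optimiser used for training.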

class Adam[source]

Bases: object

Adam optimiser with default hyperparameters.

static get_instance(learning_rate)[source]

Create an instance of the optimiser with the given learning rate.
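
A brief usage sketch of this factory interface, again assuming TensorFlow 1.x; the toy variable and loss below exist only to make the snippet self-contained, and the returned object is presumably a tf.train.AdamOptimizer.

    import tensorflow as tf

    from niftynet.engine.application_optimiser import Adam

    # a toy scalar loss so the snippet runs on its own
    w = tf.Variable(1.0)
    loss = tf.square(w - 3.0)

    optimiser = Adam.get_instance(learning_rate=1e-4)
    train_op = optimiser.minimize(loss)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        sess.run(train_op)  # one optimisation step on the toy loss

The remaining classes below expose exactly the same get_instance(learning_rate) factory, so they can be swapped in without changing the training code.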

class Adagrad[source]

Bases: object

Adagrad optimiser with default hyperparameters.

static get_instance(learning_rate)[source]

Create an instance of the optimiser with the given learning rate.

class Momentum[source]

Bases: object

Momentum optimiser with default hyperparameters.

static get_instance(learning_rate)[source]

Create an instance of the optimiser with the given learning rate.

class NesterovMomentum[source]

Bases: object

Nesterov Momentum optimiser with default hyperparameters.

static get_instance(learning_rate)[source]

Create an instance of the optimiser with the given learning rate.
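
The rendered interface does not show how Momentum and NesterovMomentum differ internally. A plausible reading, sketched below as an assumption rather than as the actual NiftyNet source, is that both wrap tf.train.MomentumOptimizer and the Nesterov variant simply sets use_nesterov=True; the momentum value of 0.9 is likewise assumed.

    import tensorflow as tf


    # assumed wrappers, for illustration only
    class Momentum(object):
        @staticmethod
        def get_instance(learning_rate):
            return tf.train.MomentumOptimizer(
                learning_rate=learning_rate, momentum=0.9)


    class NesterovMomentum(object):
        @staticmethod
        def get_instance(learning_rate):
            return tf.train.MomentumOptimizer(
                learning_rate=learning_rate, momentum=0.9, use_nesterov=True)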

class RMSProp[source]

Bases: object

RMSProp optimiser with default hyperparameters.

static get_instance(learning_rate)[source]

Create an instance of the optimiser with the given learning rate.

class GradientDescent[source]

Bases: object

Gradient Descent optimiser with default hyperparameters.

static get_instance(learning_rate)[source]

Create an instance of the optimiser with the given learning rate.
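
Because every class here shares the same get_instance(learning_rate) signature, the --optimiser option can resolve either a built-in class or a user-supplied one by name. The helper below is a simplified sketch of that dynamic lookup, not the actual NiftyNet loader.

    import importlib


    def optimiser_class(name):
        """Resolve 'module.ClassName' (e.g. 'newclass.NewOptimiser' or
        'niftynet.engine.application_optimiser.Adam') to the class object."""
        module_name, class_name = name.rsplit('.', 1)
        return getattr(importlib.import_module(module_name), class_name)


    # any class found this way exposes get_instance(learning_rate)
    adam_cls = optimiser_class('niftynet.engine.application_optimiser.Adam')
    optimiser = adam_cls.get_instance(learning_rate=1e-4)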