niftynet.engine.application_optimiser module

To customise the optimiser, including adding new optimisation methods, a learning rate decay schedule, or other optional parameters of the optimiser:

create a newclass.py containing a class NewOptimiser that implements get_instance(), then set the corresponding parameter in the config file, or from the command line specify --optimiser newclass.NewOptimiser
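For example, a minimal sketch of such a newclass.py, assuming the TensorFlow 1.x tf.train API used by the built-in optimisers below; the RMSProp choice and its hyperparameter values are illustrative, not NiftyNet defaults:

    # newclass.py -- a hypothetical custom optimiser for NiftyNet
    import tensorflow as tf


    class NewOptimiser(object):
        """RMSProp optimiser exposed through the get_instance() interface."""

        @staticmethod
        def get_instance(learning_rate):
            # The engine supplies the configured learning rate; return any
            # TensorFlow optimizer instance.
            return tf.train.RMSPropOptimizer(learning_rate=learning_rate,
                                             decay=0.9,     # illustrative value
                                             momentum=0.0)  # illustrative value

The class is then selected with --optimiser newclass.NewOptimiser.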

class niftynet.engine.application_optimiser.Adagrad

Bases: object

Adagrad optimiser with default hyperparameters

static get_instance(learning_rate)

Create an instance of the optimiser with the given learning rate
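The built-in classes are thin wrappers around the corresponding TensorFlow optimisers. A plausible sketch of this one, assuming it forwards directly to TensorFlow 1.x's tf.train.AdagradOptimizer (the classes below would follow the same pattern):

    import tensorflow as tf


    class Adagrad(object):
        """Adagrad optimiser with default hyperparameters."""

        @staticmethod
        def get_instance(learning_rate):
            # create an instance of the optimiser
            return tf.train.AdagradOptimizer(learning_rate=learning_rate)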

class niftynet.engine.application_optimiser.Adam

Bases: object

Adam optimiser with default hyperparameters

static get_instance(learning_rate)

Create an instance of the optimiser with the given learning rate
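To illustrate how the returned object is used, a toy TF 1.x training loop; the scalar loss and session-style usage are stand-ins, not NiftyNet's engine code:

    import tensorflow as tf

    from niftynet.engine.application_optimiser import Adam

    # A stand-in scalar loss; in NiftyNet the application builds the real one.
    w = tf.Variable(3.0)
    loss = tf.square(w - 1.0)

    optimiser = Adam.get_instance(learning_rate=0.001)
    train_op = optimiser.minimize(loss)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for _ in range(100):
            sess.run(train_op)  # each step reduces the toy loss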

class niftynet.engine.application_optimiser.GradientDescent

Bases: object

Gradient Descent optimiser with default hyperparameters

static get_instance(learning_rate)

Create an instance of the optimiser with the given learning rate
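Because TF 1.x optimisers accept a tensor-valued learning rate, a decay schedule (mentioned at the top of this page) can plausibly be passed straight through get_instance(); the schedule values and stand-in loss below are illustrative:

    import tensorflow as tf

    from niftynet.engine.application_optimiser import GradientDescent

    global_step = tf.train.get_or_create_global_step()

    # lr = 0.1 * 0.96 ** (step / 1000); all values illustrative
    decayed_lr = tf.train.exponential_decay(learning_rate=0.1,
                                            global_step=global_step,
                                            decay_steps=1000,
                                            decay_rate=0.96)

    w = tf.Variable(3.0)
    loss = tf.square(w - 1.0)  # stand-in loss

    optimiser = GradientDescent.get_instance(decayed_lr)
    # Passing global_step to minimize() advances the schedule each step.
    train_op = optimiser.minimize(loss, global_step=global_step)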

class niftynet.engine.application_optimiser.Momentum

Bases: object

Momentum optimiser with default hyperparameters

static get_instance(learning_rate)

Create an instance of the optimiser with the given learning rate
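Since get_instance() only receives a learning rate, the momentum term itself must be fixed inside the class. A plausible sketch, with momentum=0.9 as an assumed default rather than a value confirmed from the source:

    import tensorflow as tf


    class Momentum(object):
        """Momentum optimiser with default hyperparameters."""

        @staticmethod
        def get_instance(learning_rate):
            # momentum=0.9 is an assumed default, not confirmed from the source
            return tf.train.MomentumOptimizer(learning_rate=learning_rate,
                                              momentum=0.9)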