niftynet.layer.loss_classification_multi module

Loss functions for multi-class classification

class LossFunction(n_class, n_rater, loss_type='CrossEntropy', loss_func_params=None, name='loss_function')[source]

Bases: niftynet.layer.base_layer.Layer

make_callable_loss_func(type_str)[source]
layer_op(pred_ave=None, pred_multi=None, ground_truth=None, weight_batch=None, var_scope=None)[source]

Compute the losses in a multi-rater setting.

Parameters:
  • pred_ave – average of the predictions over the different raters
  • pred_multi – prediction for each individual rater
  • ground_truth – ground truth classification for each individual rater
  • weight_batch –
  • var_scope –
Returns:
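A minimal NumPy sketch of the quantities this layer operates on, assuming a batch of observations scored by several raters (shapes and the variable names `n_obs`, `n_rater`, `n_class` are illustrative, not part of the NiftyNet API):

```python
import numpy as np

# Hypothetical shapes: 4 observations, 3 raters, 2 classes.
n_obs, n_rater, n_class = 4, 3, 2

rng = np.random.default_rng(0)
# pred_multi: one set of class scores per rater and observation
pred_multi = rng.random((n_obs, n_rater, n_class))
# pred_ave: average of the predictions over the different raters
pred_ave = pred_multi.mean(axis=1)
# ground_truth: one categorical label per rater and observation
ground_truth = rng.integers(0, n_class, size=(n_obs, n_rater))

print(pred_ave.shape)  # (4, 2)
```

In the actual layer these arrays are TensorFlow tensors and the loss chosen by `loss_type` is applied to them.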

labels_to_one_hot(ground_truth, num_classes=1)[source]

Converts ground truth labels to one-hot, sparse tensors. Used extensively in segmentation losses.

Parameters:
  • ground_truth – ground truth categorical labels (rank N)
  • num_classes – A scalar defining the depth of the one hot dimension (see depth of tf.one_hot)
Returns:

one-hot sparse tf tensor (rank N+1; new axis appended at the end) and the output shape
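The conversion can be sketched in NumPy (the real function builds a sparse TensorFlow tensor via `tf.SparseTensor`; this dense analogue only illustrates the rank-N to rank-N+1 shape behaviour):

```python
import numpy as np

def labels_to_one_hot_np(ground_truth, num_classes=1):
    """Dense NumPy analogue: append a one-hot axis at the end."""
    gt = np.asarray(ground_truth, dtype=np.int64)
    one_hot = np.eye(num_classes, dtype=np.float32)[gt]  # rank N -> rank N+1
    return one_hot, one_hot.shape

one_hot, output_shape = labels_to_one_hot_np([[0, 2], [1, 1]], num_classes=3)
print(output_shape)  # (2, 2, 3)
```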

loss_confusion_matrix(ground_truth, pred_multi, num_classes=2, nrater=6)[source]

Creates a loss comparing the pairwise inter-rater confusion matrices built from the ground truth with those built from the predictions.

Parameters:
  • ground_truth – multi-rater classification
  • pred_multi – multi-rater prediction (one prediction per class for each rater and each observation; a softmax is applied during the loss calculation)
  • num_classes – number of classes
  • nrater – number of raters
Returns:

integration over the absolute differences between the confusion matrices, divided by the number of raters
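An illustrative NumPy sketch of this idea (not the NiftyNet implementation; the pairwise construction of the confusion matrices is an assumption based on the description above):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def confusion_matrix_loss_np(ground_truth, pred_multi, num_classes=2, nrater=6):
    """Sum of absolute differences between rater-vs-rater confusion
    matrices of the ground truth and of the softmaxed predictions,
    divided by the number of raters (illustrative sketch)."""
    # ground_truth: (n_obs, nrater) categorical labels
    # pred_multi:   (n_obs, nrater, num_classes) raw scores
    gt_onehot = np.eye(num_classes)[np.asarray(ground_truth)]  # (n_obs, nrater, C)
    probs = softmax(np.asarray(pred_multi, dtype=float))       # (n_obs, nrater, C)
    loss = 0.0
    for i in range(nrater):
        for j in range(i + 1, nrater):
            cm_gt = gt_onehot[:, i].T @ gt_onehot[:, j]  # (C, C) co-occurrence
            cm_pr = probs[:, i].T @ probs[:, j]
            loss += np.abs(cm_gt - cm_pr).sum()
    return loss / nrater
```

When the predictions are sharply peaked on the ground-truth classes, the two sets of confusion matrices coincide and the loss approaches zero.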

variability(pred_multi, num_classes=2, nrater=2)[source]
loss_variability(ground_truth, pred_multi, weight_map=None)[source]
rmse_consistency(pred_ave, pred_multi, weight_map=None)[source]
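A consistency term of this kind can be sketched as the RMSE between the averaged prediction and each rater's prediction (a plausible reading of `rmse_consistency`, not the verified NiftyNet implementation; the optional `weight_map` is omitted):

```python
import numpy as np

def rmse_consistency_np(pred_ave, pred_multi):
    """Illustrative sketch: root-mean-square deviation of each rater's
    prediction from the rater-averaged prediction."""
    # pred_multi: (n_obs, nrater, C); pred_ave: (n_obs, C)
    diff = pred_multi - pred_ave[:, None, :]
    return np.sqrt(np.mean(diff ** 2))

pred_multi = np.array([[[0.2, 0.8], [0.4, 0.6]]])
pred_ave = pred_multi.mean(axis=1)  # [[0.3, 0.7]]
print(rmse_consistency_np(pred_ave, pred_multi))  # ≈ 0.1
```

A lower value indicates that the individual raters' predictions agree more closely with their average.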