NiftyNet is a TensorFlow-based open-source convolutional neural network platform for research in medical image analysis and image-guided therapy. NiftyNet’s modular structure is designed for sharing networks and pre-trained models. NiftyNet is developed by a consortium of research organisations (BMEIS – School of Biomedical Engineering and Imaging Sciences, King’s College London; WEISS – Wellcome EPSRC Centre for Interventional and Surgical Sciences, UCL; CMIC – Centre for Medical Image Computing, UCL; HIG – High-dimensional Imaging Group, UCL), with BMEIS acting as the consortium lead.

Using NiftyNet’s modular structure you can:

  • Get started with established pre-trained networks using built-in tools
  • Adapt existing networks to your imaging data
  • Quickly build new solutions to your own image analysis problems

The code is available on GitHub, or you can get started quickly with the released versions on the Python Package Index (PyPI). You can also check out the release notes.


This section shows you how to run a segmentation application with the net_segment command, using a model with trained weights and image data downloaded from the NiftyNet model zoo with net_download.

With NiftyNet installed from PyPI:

net_download dense_vnet_abdominal_ct_model_zoo
net_segment inference -c ~/niftynet/extensions/dense_vnet_abdominal_ct/config.ini

With NiftyNet source code cloned at ./NiftyNet/:

# go to the source code directory
cd NiftyNet/
python net_download.py dense_vnet_abdominal_ct_model_zoo
python net_segment.py inference -c ~/niftynet/extensions/dense_vnet_abdominal_ct/config.ini
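The config.ini file referenced above follows NiftyNet’s INI configuration format, with system-level sections and an application-specific section. A minimal sketch of what such a file can look like (the section names follow NiftyNet’s conventions, but the key values below are illustrative assumptions, not a copy of the downloaded file):

```ini
# Illustrative NiftyNet configuration sketch; values are assumptions.
[ct]
# An input image source section: where to find the images to segment.
path_to_search = ./data/dense_vnet_abdominal_ct
filename_contains = CT

[SYSTEM]
model_dir = ./models/dense_vnet_abdominal_ct

[NETWORK]
# Network to instantiate, by name, from the network directory.
name = dense_vnet
volume_padding_size = 0

[INFERENCE]
spatial_window_size = (144, 144, 144)
output_interp_order = 0

[SEGMENTATION]
image = ct
num_classes = 9
```

The downloaded config.ini under ~/niftynet/extensions/dense_vnet_abdominal_ct/ is the authoritative version for this model; the sketch above only illustrates the overall layout.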

The segmentation output of this example application should be located at


Applications and models

More applications and models are available in the NiftyNet model zoo and the network directory.

Configuration specifications

For detailed specifications of NiftyNet commands and configurations, check out our Configuration docs.

Extending NiftyNet applications

To learn more about developing NiftyNet applications, see the Extending application and Developing new networks sections.

Contributing to NiftyNet

Contributors are always welcome! For more information, please visit the Contributor guide section.

All how-to guides are listed in the following section.


This project is grateful for the support from the Wellcome Trust, the Engineering and Physical Sciences Research Council (EPSRC), the National Institute for Health Research (NIHR), the Department of Health (DoH), King’s College London (KCL), University College London (UCL), the Science and Engineering South Consortium (SES), the STFC Rutherford-Appleton Laboratory, and NVIDIA.

If you use NiftyNet in your work, please cite Gibson and Li, et al. 2018:

E. Gibson*, W. Li*, C. Sudre, L. Fidon, D. I. Shakir, G. Wang, Z. Eaton-Rosen, R. Gray, T. Doel, Y. Hu, T. Whyntie, P. Nachev, M. Modat, D. C. Barratt, S. Ourselin, M. J. Cardoso† and T. Vercauteren. NiftyNet: a deep-learning platform for medical imaging. Computer Methods and Programs in Biomedicine (2018).

BibTeX entry:

@article{niftynet2018,
  author = "Eli Gibson and Wenqi Li and Carole Sudre and Lucas Fidon and
            Dzhoshkun I. Shakir and Guotai Wang and Zach Eaton-Rosen and
            Robert Gray and Tom Doel and Yipeng Hu and Tom Whyntie and
            Parashkev Nachev and Marc Modat and Dean C. Barratt and
            Sébastien Ourselin and M. Jorge Cardoso and Tom Vercauteren",
  title = "NiftyNet: a deep-learning platform for medical imaging",
  journal = "Computer Methods and Programs in Biomedicine",
  year = "2018",
  issn = "0169-2607",
  doi = "10.1016/j.cmpb.2018.01.025",
  url = "https://www.sciencedirect.com/science/article/pii/S0169260717311823",
}

The NiftyNet platform originated in software developed for Li, et al. 2017:

Li W., Wang G., Fidon L., Ourselin S., Cardoso M.J., Vercauteren T. (2017) On the Compactness, Efficiency, and Representation of 3D Convolutional Networks: Brain Parcellation as a Pretext Task. In: Niethammer M. et al. (eds) Information Processing in Medical Imaging. IPMI 2017. Lecture Notes in Computer Science, vol 10265. Springer, Cham. DOI: 10.1007/978-3-319-59050-9_28