Paired prostate MR-ultrasound registration

Note: Please read the DeepReg Demo Disclaimer.

Source Code

This demo uses DeepReg to re-implement the algorithms described in Weakly-supervised convolutional neural networks for multimodal image registration. A standalone demo was hosted at https://github.com/yipenghu/label-reg.

Author

DeepReg Development Team

Application

Registering preoperative MR images to intraoperative transrectal ultrasound images has been an active research area for more than a decade. This multimodal image registration task assists a number of ultrasound-guided interventions and surgical procedures, such as targeted biopsy and focal therapy for prostate cancer patients. One of the key challenges in this registration task is the lack of robust and effective similarity measures between the two image types. This demo implements a weakly-supervised learning approach that learns voxel-level correspondence between intensity patterns across the two modalities, driven by expert-defined anatomical landmarks, such as the prostate gland segmentation.
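The label-driven idea above can be sketched as follows. This is an illustrative sketch, not the DeepReg implementation: the network predicts a deformation that warps the moving (MR) gland segmentation, and training minimises a soft Dice loss against the fixed (ultrasound) segmentation, so no intensity-based similarity between the modalities is needed.

```python
import numpy as np


def soft_dice(fixed_label: np.ndarray, warped_label: np.ndarray,
              eps: float = 1e-6) -> float:
    """Soft Dice coefficient between two (possibly fuzzy) 3D label volumes."""
    intersection = np.sum(fixed_label * warped_label)
    denom = np.sum(fixed_label) + np.sum(warped_label)
    return 2.0 * intersection / (denom + eps)


def label_loss(fixed_label: np.ndarray, warped_label: np.ndarray) -> float:
    """Weak-supervision training loss: 1 - Dice, zero for perfect overlap."""
    return 1.0 - soft_dice(fixed_label, warped_label)
```

With identical labels the loss is close to 0, and with disjoint labels it is close to 1; in training this term is typically combined with a deformation regulariser.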

Instructions

  • Install DeepReg;

  • Change current directory to the root directory of DeepReg project;

  • Run the demo_data.py script to download 10 folds of paired 3D MR and ultrasound images and the pre-trained model.

python demos/paired_mrus_prostate/demo_data.py
  • Call deepreg_train from the command line. The following example uses two GPUs and launches the first of the ten cross-validation runs (each training on nine of the ten folds), as specified in the dataset config (paired_mrus_prostate_dataset0.yaml) and the train config (paired_mrus_prostate_train.yaml), which can be given as separate yaml files;

deepreg_train --gpu "1, 2" --config_path demos/paired_mrus_prostate/paired_mrus_prostate_dataset0.yaml demos/paired_mrus_prostate/paired_mrus_prostate_train.yaml --log_dir paired_mrus_prostate
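The cross-validation scheme described above can be sketched in a few lines. This is an assumed illustration of the fold arithmetic, not DeepReg code: ten runs, each holding out one of the ten folds for testing and training on the remaining nine.

```python
def cv_runs(n_folds: int = 10) -> list:
    """Enumerate leave-one-fold-out runs: one held-out test fold per run."""
    runs = []
    for test_fold in range(n_folds):
        train_folds = [f for f in range(n_folds) if f != test_fold]
        runs.append({"train": train_folds, "test": [test_fold]})
    return runs
```

Run 0 (the one launched by the command above, under this assumed numbering) would train on folds 1-9 and test on fold 0; each run corresponds to a different dataset yaml file.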
  • Call deepreg_predict from the command line to test the saved ckpt file on the data partitions specified in the config file, a copy of which will be saved in the [log_dir]. The following example runs a pre-trained model on CPU. If --log_dir is not specified, the results will be saved in timestamp-named directories created under /logs.

deepreg_predict --gpu "" --config_path demos/paired_mrus_prostate/paired_mrus_prostate_dataset0.yaml demos/paired_mrus_prostate/paired_mrus_prostate_train.yaml --ckpt_path demos/paired_mrus_prostate/dataset/pre-trained/weights-epoch500.ckpt --mode test

Pre-trained Model

A pre-trained model is downloaded by demo_data.py and unzipped into the dataset folder under the demo folder. This pre-trained model is used by default with deepreg_predict. To use a user-trained model instead, set --ckpt_path to the location where the ckpt files were saved, in this case (by the deepreg_train call above) /logs/paired_mrus_prostate/.

Data

This is a demo without real clinical data due to regulatory restrictions. The MR and ultrasound images used are simulated dummy images.
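For context, dummy paired volumes like those used here can be simulated in a few lines. This is a minimal sketch with assumed shapes and plain numpy (the demo ships its own pre-generated data, so nothing below is required to run it).

```python
import numpy as np

rng = np.random.default_rng(0)


def make_dummy_pair(shape=(32, 32, 32)):
    """Return a (moving MR, fixed ultrasound) pair of random float32 volumes."""
    mr = rng.random(shape).astype(np.float32)
    us = rng.random(shape).astype(np.float32)
    return mr, us


mr, us = make_dummy_pair()
```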

Tested DeepReg version

Last commit at which demo was tested: 7ec0f5157a81cd5e60cadd61bd617b433039d0e6

Contact

Please raise an issue.