Paired prostate MR-ultrasound registration

Note: Please read the DeepReg Demo Disclaimer.

Source Code

This demo uses DeepReg to re-implement the algorithms described in Weakly-supervised convolutional neural networks for multimodal image registration. A standalone demo was hosted at https://github.com/yipenghu/label-reg.

Author

DeepReg Development Team

Application

Registering preoperative MR images to intraoperative transrectal ultrasound images has been an active research area for more than a decade. This multimodal image registration task assists a number of ultrasound-guided interventions and surgical procedures, such as targeted biopsy and focal therapy for prostate cancer patients. One of the key challenges in this task is the lack of robust and effective similarity measures between the two image types. This demo implements a weakly-supervised learning approach that learns voxel correspondence between the intensity patterns of the two modalities, driven by expert-defined anatomical landmarks, such as the prostate gland segmentation.
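
The core idea can be illustrated with a label-driven similarity: rather than comparing MR and ultrasound intensities directly, the network is rewarded when the warped moving (MR) gland segmentation overlaps the fixed (ultrasound) gland segmentation. Below is a minimal NumPy sketch of such a soft Dice overlap; the function and smoothing constant are illustrative and are not DeepReg's actual loss implementation.

    import numpy as np

    def soft_dice(warped_moving_label, fixed_label, eps=1e-6):
        """Soft Dice overlap between a warped moving label and a fixed label.

        Inputs are 3D arrays of gland 'probabilities' in [0, 1]; a
        weakly-supervised training loss would typically be 1 - dice.
        """
        intersection = np.sum(warped_moving_label * fixed_label)
        union = np.sum(warped_moving_label) + np.sum(fixed_label)
        return (2.0 * intersection + eps) / (union + eps)

    # Toy example: two partially overlapping binary gland masks.
    a = np.zeros((8, 8, 8)); a[2:6, 2:6, 2:6] = 1.0
    b = np.zeros((8, 8, 8)); b[3:7, 3:7, 3:7] = 1.0
    print(1.0 - soft_dice(a, b))  # the label-driven loss value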

Data

This is a demo without real clinical data due to regulatory restrictions. The MR and ultrasound images used are simulated dummy images.
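
For orientation, DeepReg's paired-image loaders read moving/fixed images and labels from parallel directories with matching file names. The snippet below writes dummy paired volumes in that style using nibabel; the directory names, path, and volume shape are assumptions to be checked against what demo_data.py actually produces.

    import os
    import numpy as np
    import nibabel as nib

    # Assumed paired-data layout -- verify against the output of demo_data.py.
    root = "demos/paired_mrus_prostate/dataset/train"  # hypothetical path
    rng = np.random.default_rng(0)
    for folder in ["moving_images", "fixed_images", "moving_labels", "fixed_labels"]:
        out_dir = os.path.join(root, folder)
        os.makedirs(out_dir, exist_ok=True)
        if folder.endswith("images"):
            vol = rng.random((64, 64, 48)).astype(np.float32)  # dummy intensities
        else:
            vol = np.zeros((64, 64, 48), dtype=np.float32)
            vol[16:48, 16:48, 12:36] = 1.0  # dummy gland segmentation
        # Matching file names across the four folders pair the samples up.
        nib.save(nib.Nifti1Image(vol, affine=np.eye(4)),
                 os.path.join(out_dir, "case000000.nii.gz"))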

Instructions

Installation

Please install DeepReg following the instructions, then change the current directory to the root directory of the DeepReg project, i.e. DeepReg/.
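
As a quick sanity check (a minimal sketch, assuming a standard pip-based install), DeepReg should then be importable from the repository root:

    # Confirms the deepreg package is importable after installation.
    import deepreg
    print("DeepReg imported from:", deepreg.__file__)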

Download data

Please execute the following command to download and pre-process the data, and to download the pre-trained model.

python demos/paired_mrus_prostate/demo_data.py
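
If the download appears to succeed but a later step fails, it can help to confirm that the expected artefacts exist. The paths below are assumptions about where demo_data.py places its output; adjust them to whatever the script reports on the console.

    import os

    # Hypothetical output locations -- verify against demo_data.py's console output.
    expected = [
        "demos/paired_mrus_prostate/dataset",             # downloaded/pre-processed data
        "demos/paired_mrus_prostate/dataset/pretrained",  # pre-trained model
    ]
    for path in expected:
        print("ok     " if os.path.exists(path) else "MISSING", path)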

Launch demo training

Please execute the following command to launch a demo training (the first of the ten runs of a 9-fold cross-validation). The training logs and model checkpoints will be saved under demos/paired_mrus_prostate/logs_train.

python demos/paired_mrus_prostate/demo_train.py

Here the training is launched on the GPU of index 0 with a limited number of steps and a reduced data size. Please add the --no-test flag to use the original training configuration, for example:

python demos/paired_mrus_prostate/demo_train.py --no-test
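
The --no-test switch is typically implemented as a simple argparse toggle that selects between the reduced demo settings and the full configuration. The following is a hypothetical sketch of such flag handling, not the actual source of demo_train.py:

    import argparse

    # Hypothetical sketch of the demo's flag handling, for illustration only.
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--no-test",
        dest="test",
        action="store_false",
        help="run the original (full) training configuration",
    )
    parser.set_defaults(test=True)  # default: reduced steps/size for a quick demo
    args = parser.parse_args()

    # A reduced run might cap the number of epochs; the full run defers to the
    # training configuration file. The numbers here are illustrative.
    max_epochs = 2 if args.test else None
    print("reduced demo run" if args.test else "full training run", max_epochs)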

Predict

Please execute the following command to run prediction with the pre-trained model. The prediction logs and visualization results will be saved under demos/paired_mrus_prostate/logs_predict. See the CLI documentation for more details about the prediction output.

python demos/paired_mrus_prostate/demo_predict.py

Optionally, a user-trained model can be used by changing the ckpt_path variable inside demo_predict.py. Note that the path should end with .ckpt; checkpoints are saved under logs_train, as mentioned above.
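
To point demo_predict.py at a self-trained model, one option is to pick the newest checkpoint programmatically. This generic sketch assumes TensorFlow-style checkpoint files (a .ckpt.index file next to .ckpt.data shards) under logs_train; the resulting path ends with .ckpt as required.

    import glob
    import os

    # Find the most recently written checkpoint under logs_train (assumed layout).
    candidates = glob.glob(
        "demos/paired_mrus_prostate/logs_train/**/*.ckpt.index", recursive=True
    )
    latest = max(candidates, key=os.path.getmtime)  # raises ValueError if none found
    ckpt_path = latest[: -len(".index")]  # strip ".index" so the path ends with ".ckpt"
    print(ckpt_path)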

Contact

Please raise an issue for any questions.