Preparing the metrics for evaluation of SynthRAD2023 Grand Challenge
Explore the docs »

View Demo · Report Bug · Request Feature

Table of Contents

  • Goal
  • Getting Started
      • Installation
      • Prerequisites
  • Usage
  • Roadmap
  • Contributing
  • License
  • Contact

Goal

Assess the quality of the synthetic computed tomography (sCT) images against the ground-truth CT. N.B. At the moment, only the image similarity metrics are usable; the dose metrics will be updated when the validation phase opens.

Getting Started

To get a local copy up and running, follow these simple steps.

Installation

  1. Clone the repo
git clone https://github.com/SynthRAD2023/metrics.git

or

git clone git@github.com:SynthRAD2023/metrics.git

Prerequisites

  • numpy

Install the prerequisites with:

pip install -r requirements.txt

Usage

The metrics are computed in two files, image_metrics.py and dose_metrics.py. These compute, respectively:

  • The image similarity between the ground-truth CT and the synthetic CT. These metrics include the mean squared error (MSE), the peak signal-to-noise ratio (PSNR), and the structural similarity (SSIM); a minimal sketch of such computations follows below.
  • The metrics comparing the dose delivered on the ground-truth CT and on the synthetic CT. These metrics include the mean absolute error (MAE) of the dose, a dose-volume histogram (DVH) metric, and the gamma pass rate.
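
For reference, here is a minimal sketch of how such masked image metrics could be computed with numpy and scikit-image. The function names, the use of a body mask, and the assumed HU data range are illustrative assumptions, not the challenge's official implementation (see image_metrics.py for that).

    # Minimal sketch of masked image-similarity metrics, assuming the CT, sCT,
    # and mask are already loaded as numpy arrays. Function names, the mask
    # handling, and the HU data range are assumptions, not the official code.
    import numpy as np
    from skimage.metrics import structural_similarity

    def masked_mse(gt, pred, mask):
        """Mean squared error over the voxels inside the mask."""
        diff = (gt.astype(np.float64) - pred.astype(np.float64))[mask > 0]
        return float(np.mean(diff ** 2))

    def masked_psnr(gt, pred, mask, data_range=3000.0):
        """PSNR in dB; data_range is an assumed HU window."""
        return float(10.0 * np.log10(data_range ** 2 / masked_mse(gt, pred, mask)))

    def volume_ssim(gt, pred, data_range=3000.0):
        """Structural similarity over the whole volume via scikit-image."""
        return float(structural_similarity(gt.astype(np.float64),
                                           pred.astype(np.float64),
                                           data_range=data_range))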

Function Descriptions

In general, any function can be used in the following way.

a(input, output)

description:
compute the metric a (e.g., mse, psnr, ssim) between input and output

arguments:
input: The numpy array of the ground-truth image
output: The numpy array of the predicted image
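
As a concrete illustration of this pattern, the sketch below reads two .mha volumes and evaluates a single metric. SimpleITK is only an assumed choice for the file I/O, and the function a is a placeholder that mirrors the generic signature above.

    # Illustration of the generic a(input, output) call pattern. SimpleITK is
    # an assumed choice for reading .mha files; the metric body is a placeholder.
    import numpy as np
    import SimpleITK as sitk

    def a(input, output):
        """Stand-in for any metric with the a(input, output) signature (here: MSE)."""
        diff = input.astype(np.float64) - output.astype(np.float64)
        return float(np.mean(diff ** 2))

    ground_truth = sitk.GetArrayFromImage(sitk.ReadImage("path/to/ground_truth.mha"))
    prediction = sitk.GetArrayFromImage(sitk.ReadImage("path/to/prediction.mha"))
    print(a(ground_truth, prediction))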

All metrics for a patient can be computed at once with the score_patient method, which loads the data and returns every metric:

Image metrics

    from image_metrics import ImageMetrics  # assuming ImageMetrics lives in image_metrics.py

    metrics = ImageMetrics()
    ground_truth_path = "path/to/ground_truth.mha"
    predicted_path = "path/to/prediction.mha"
    mask_path = "path/to/mask.mha"
    print(metrics.score_patient(ground_truth_path, predicted_path, mask_path))

Dose metrics

    from dose_metrics import DoseMetrics  # assuming DoseMetrics lives in dose_metrics.py

    dose_path = "path/to/treatment_plans"
    predicted_path = "path/to/prediction.mha"
    patient_id = "1BA000"

    metrics = DoseMetrics(dose_path)
    print(metrics.score_patient(patient_id, predicted_path))
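
For orientation, the hypothetical helpers below show how two of these dose comparisons, the mean absolute dose error and a single DVH point (e.g. D95), could be computed from dose arrays inside a structure mask. The names and the percentile formulation are assumptions, and a gamma analysis, which typically relies on a dedicated library, is omitted here.

    # Hypothetical helpers for two of the dose comparisons: mean absolute dose
    # error and a DVH point (D95). Illustrative only, not dose_metrics.py.
    import numpy as np

    def dose_mae(dose_ct, dose_sct, mask):
        """Mean absolute dose difference (Gy) over the voxels inside the mask."""
        return float(np.mean(np.abs(dose_ct - dose_sct)[mask > 0]))

    def dvh_dx(dose, mask, x=95.0):
        """Dx: minimum dose received by the hottest x% of the structure (e.g. D95)."""
        return float(np.percentile(dose[mask > 0], 100.0 - x))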

Roadmap

See the open issues for a list of proposed features (and known issues).

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

License

Distributed under the GNU General Public License v3.0. See LICENSE for more information.

Contact

Matteo Maspero - @matteomasperonl - m.maspero@umcutrecht.nl

Project Link: https://github.com/SynthRAD2023/metrics
