Commit 14d221a (1 parent: e117c63)

Version 1.0.5 changes:

1. Update PortPy.AI with easy-to-run notebooks
2. Update PortPy visualization.py
3. Update VMAT SCP dose prediction
4. Add random leaf position initialization feature to VMAT SCP

36 files changed

Lines changed: 1759 additions & 305 deletions

.gitignore

Lines changed: 3 additions & 0 deletions

@@ -51,3 +51,6 @@ coverage.xml
 # Sphinx documentation
 docs/_build/
 /data/
+/ai_data/
+/checkpoints/
+/results/

README.md

Lines changed: 1 addition & 1 deletion

@@ -10,7 +10,7 @@
 </h2>

-![Version](https://img.shields.io/static/v1?label=latest&message=v1.0.4.6&color=darkgreen)
+![Version](https://img.shields.io/static/v1?label=latest&message=v1.0.5&color=darkgreen)
 [![Total Downloads](https://static.pepy.tech/personalized-badge/portpy?period=total&units=international_system&left_color=grey&right_color=blue&left_text=total%20downloads)](https://pepy.tech/project/portpy?&left_text=totalusers)
 [![Monthly Downloads](https://static.pepy.tech/badge/portpy/month)](https://pepy.tech/project/portpy)

examples/dose_mimicking.ipynb

Lines changed: 908 additions & 0 deletions (large diff not rendered by default)

examples/imrt_dose_prediction.ipynb

Lines changed: 146 additions & 72 deletions (large diff not rendered by default)
Lines changed: 116 additions & 26 deletions

@@ -1,42 +1,129 @@
-import os
+#!/usr/bin/env python
+# coding: utf-8
+
+# # <center>Pipeline for Deep Learning Based Dose Prediction and Fluence Optimization</center>
+#
+# This example showcases the core functionalities of the portpy.ai module, which is designed for deep learning-based dose prediction in radiotherapy treatment planning. The main functionalities demonstrated are:
+#
+# 1. **Data Access and Preprocessing**: The portpy.ai module allows easy access to curated PortPy data, specifically tailored for training deep learning models for dose prediction. It also includes preprocessing steps to prepare the data for use in the models.
+#
+# 2. **Model Training and Testing**: With the portpy.ai module, users can efficiently train and test their deep learning models for dose prediction, using the prepared data to create and fine-tune the models for optimal performance.
+#
+# 3. **Patient-Specific Dose Prediction**: The module enables personalized dose prediction for the patients available in the PortPy database, letting users estimate the dose distribution for individual patients to aid treatment planning. PortPy uses a simple UNET model with MAE loss to create the dose prediction; users can implement their own customized models for better accuracy.
+#
+# 4. **Fluence Optimization with Predicted Dose**: The predicted dose distribution can be used to optimize the fluence of radiation beams during treatment planning. The optimized fluence can be imported back into Eclipse for final dose calculation. Please refer to [imrt_tps_import.ipynb](https://github.com/PortPy-Project/PortPy/blob/master/examples/imrt_tps_import.ipynb) for more details.
+#
+# 5. **Plan Visualization**: The module offers tools for visualizing treatment plans based on the predicted dose distribution.
+#
+# Overall, the portpy.ai module provides a comprehensive solution for dose prediction and treatment planning in radiotherapy, leveraging deep learning and data-driven methodologies to enhance precision and patient-specific care.
+#
+# **Note: At present, the PortPy dataset contains only 10 lung patients. However, treatment planning using the dose prediction pipeline has been demonstrated successfully with the PortPy AI module. By the end of August, an additional 100 lung patients will be added to the PortPy dataset, significantly enhancing the training data and thereby improving the performance of the dose prediction model.**
+
 import portpy.photon as pp
 import SimpleITK as sitk
 import os
 import cvxpy as cp
 import numpy as np
 import matplotlib.pyplot as plt
 from portpy.ai.preprocess.predict_using_model import predict_using_model
+from portpy.ai.preprocess.data_preprocess import data_preprocess
+from portpy.ai.train import train
+from portpy.ai.test import test
+
+# ### 1. Preprocess, train and test using the PortPy AI module
+
+in_dir = r'../data'      # directory where the raw PortPy data is located
+out_dir = r'../ai_data'  # directory where processed data is stored for training and testing the model
+
+# preprocess portpy data
+data_preprocess(in_dir, out_dir)
+
+# **Note**: split the data into train and test folders in the output directory before running further code,
+# e.g. out_dir\train\Lung_Patient_2, out_dir\test\Lung_Patient_9
+
+# ### 2. Training and testing the model
+#
+# Train the model. You can change the training parameters as shown below.
+
+# Provide only the arguments you want to override
+train_options = {
+    "dataroot": "../../ai_data",
+    "checkpoints_dir": "../../checkpoints",
+    "netG": "unet_128",
+    "name": "portpy_test_3",
+    "model": "doseprediction3d",
+    "direction": "AtoB",
+    "lambda_L1": 1,
+    "dataset_mode": "dosepred3d",
+    "norm": "batch",
+    "batch_size": 1,
+    "pool_size": 0,
+    "display_port": 8097,
+    "lr": 0.0002,
+    "input_nc": 8,
+    "output_nc": 1,
+    "display_freq": 10,
+    "print_freq": 1,
+    "gpu_ids": [0]  # converted to a list since multiple GPUs may be supported
+}
+
+train(train_options)  # run training directly in the Jupyter notebook
+
+# Uncomment and run the line below if you want to run the train script from the CLI instead:
+#!python ../portpy/ai/train.py --dataroot ../ai_data --netG unet_128 --name portpy_test_3 --model doseprediction3d --direction AtoB --lambda_L1 1 --dataset_mode dosepred3d --norm batch --batch_size 1 --pool_size 0 --display_port 8097 --lr 0.0002 --input_nc 8 --output_nc 1 --display_freq 10 --print_freq 1 --gpu_ids 0
+
+# Test the model
+test_options = {
+    "dataroot": "../ai_data",
+    "netG": "unet_128",
+    "checkpoints_dir": "../../checkpoints",
+    "results_dir": "../../results",
+    "name": "portpy_test_3",
+    "phase": "test",
+    "mode": "eval",
+    "eval": True,  # boolean flag
+    "model": "doseprediction3d",
+    "input_nc": 8,
+    "output_nc": 1,
+    "direction": "AtoB",
+    "dataset_mode": "dosepred3d",
+    "norm": "batch"
+}
+test(test_options)
+# !python ../portpy/ai/test.py --dataroot ../ai_data --netG unet_128 --checkpoints_dir ../checkpoints --results_dir ../results --name portpy_test_2 --phase test --mode eval --eval --model doseprediction3d --input_nc 8 --output_nc 1 --direction Ato
+
+# ### 3. Create dose prediction for the PortPy patient
+#
+# After training the model, users can use their customized model to create a dose prediction for any PortPy patient. The script below preprocesses and predicts for the patient.
+# Users who do not want to train and test can directly preprocess and predict using the AI model.
+patient_id = 'Lung_Patient_4'
+model_name = 'portpy_test_3'
+pred_dose = predict_using_model(patient_id=patient_id, in_dir=in_dir, out_dir=out_dir, model_name=model_name, checkpoints_dir='../../checkpoints', results_dir='../../results')
 
-# change directory to portpy ai module to preprocess the portpy data, train and test the model
-os.chdir('../../portpy/ai')
-
-# # preprocess portpy data
-in_dir = r'../../data'
-out_dir = r'../../ai_data'
-
-os.system('python ./preprocess/data_preprocess.py --in_dir ../../data --out_dir ../../ai_data')
-
-# Train preprocess data
-os.system('python train.py --dataroot ../../ai_data --netG unet_128 --name portpy_test_2 --model doseprediction3d --direction AtoB --lambda_L1 1 --dataset_mode dosepred3d --norm batch --batch_size 1 --pool_size 0 --display_port 8097 --lr 0.0002 --input_nc 8 --output_nc 1 --display_freq 10 --print_freq 1 --gpu_ids 0')
-
-# Test preprocess data
-os.system('python test.py --dataroot ../../ai_data --netG unet_128 --name portpy_test_2 --phase test --mode eval --model doseprediction3d --input_nc 8 --output_nc 1 --direction AtoB --dataset_mode dosepred3d --norm batch')
 
-# predicted dose back to portpy
-patient_id = 'Lung_Patient_4'
-model_name = 'portpy_test_2'
-pred_dose = predict_using_model(patient_id=patient_id, in_dir=in_dir, model_name=model_name)
+# ### 4. Fluence optimization using predicted dose
 
 # load portpy data
 data = pp.DataExplorer(data_dir=in_dir)
 data.patient_id = patient_id
 # Load ct and structure set for the above patient using CT and Structures class
 ct = pp.CT(data)
-ct_arr = ct.ct_dict['ct_hu_3d'][0]
 structs = pp.Structures(data)
 
 beams = pp.Beams(data)
 
+# load clinical criteria from the config files for the plan to be optimized
+protocol_name = 'Lung_2Gy_30Fx'
+clinical_criteria = pp.ClinicalCriteria(data, protocol_name=protocol_name)
+
 # create rinds based upon rind definition in optimization params
 protocol_name = 'Lung_2Gy_30Fx'
 opt_params = data.load_config_opt_params(protocol_name=protocol_name)
@@ -45,13 +132,10 @@
 # load influence matrix based upon beams and structure set
 inf_matrix = pp.InfluenceMatrix(ct=ct, structs=structs, beams=beams)
 
-# load clinical criteria from the config files for which plan to be optimized
-clinical_criteria = pp.ClinicalCriteria(data, protocol_name=protocol_name)
-
 pred_dose_1d = inf_matrix.dose_3d_to_1d(dose_3d=pred_dose)
 
 # create a plan using ct, structures, beams and influence matrix. Clinical criteria is optional
-my_plan = pp.Plan(ct, structs, beams, inf_matrix, clinical_criteria)
+my_plan = pp.Plan(ct=ct, structs=structs, beams=beams, inf_matrix=inf_matrix, clinical_criteria=clinical_criteria)
 
 # create cvxpy problem using the clinical criteria and optimization parameters
 opt = pp.Optimization(my_plan, opt_params=opt_params)
@@ -75,10 +159,12 @@
 opt.obj += [(smoothness_X_weight * (1 / num_cols) * cp.sum_squares(Qx @ x) +
              smoothness_Y_weight * (1 / num_rows) * cp.sum_squares(Qy @ x))]
 
-sol = opt.solve(solver='MOSEK', verbose=True)
-# sol = opt.solve(solver='MOSEK', verbose=True)
+sol = opt.solve(solver='MOSEK', verbose=False)
 sol = {'optimal_intensity': x.value, 'inf_matrix': inf_matrix}
 
+# ### 5. Plan visualization
+
 # plot fluence 3d and 2d for the 1st beam
 pp.Visualization.plot_fluence_3d(sol=sol, beam_id=my_plan.beams.get_all_beam_ids()[0])
 
@@ -89,3 +175,7 @@
              norm_flag=True)
 ax.set_title('- Optimized .. Predicted')
 plt.show()
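The note in the diff above asks the user to split the preprocessed patient folders into `train` and `test` subfolders of the output directory by hand (e.g. `out_dir\train\Lung_Patient_2`, `out_dir\test\Lung_Patient_9`). A minimal sketch of how that manual step could be automated; the helper name `split_train_test` is hypothetical, not part of PortPy:

```python
import pathlib
import shutil
import tempfile

def split_train_test(out_dir, test_ids):
    """Move each preprocessed patient folder under out_dir into
    out_dir/train or out_dir/test, depending on membership in test_ids."""
    out = pathlib.Path(out_dir)
    (out / "train").mkdir(exist_ok=True)
    (out / "test").mkdir(exist_ok=True)
    # snapshot the patient folders first, so moving them does not disturb iteration
    patients = [p for p in out.iterdir()
                if p.is_dir() and p.name not in ("train", "test")]
    for patient in patients:
        dest = "test" if patient.name in test_ids else "train"
        shutil.move(str(patient), str(out / dest / patient.name))

# usage on a throwaway directory (patient names follow the note above)
with tempfile.TemporaryDirectory() as d:
    for name in ("Lung_Patient_2", "Lung_Patient_9"):
        pathlib.Path(d, name).mkdir()
    split_train_test(d, test_ids={"Lung_Patient_9"})
    print(sorted(p.name for p in pathlib.Path(d, "train").iterdir()))  # ['Lung_Patient_2']
```

Any patient folder not named in `test_ids` lands in `train`, so the split stays exhaustive.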

examples/python_files/vmat_scp_dose_prediction.py

Lines changed: 62 additions & 16 deletions

@@ -1,31 +1,77 @@
 import os
 
-import numpy as np
-
 import portpy.photon as pp
+import SimpleITK as sitk
 import os
+import cvxpy as cp
+import numpy as np
 import matplotlib.pyplot as plt
 from portpy.ai.preprocess.predict_using_model import predict_using_model
-
-# change directory to portpy ai module to preprocess the portpy data, train and test the model
-os.chdir('../../portpy/ai')
+from portpy.ai.preprocess.data_preprocess import data_preprocess
+from portpy.ai.train import train
+from portpy.ai.test import test
 
 # # preprocess portpy data
 in_dir = r'../../data'
 out_dir = r'../../ai_data'
 
-# os.system('python ./preprocess/data_preprocess.py --in_dir ../../data --out_dir ../../ai_data')
-
-# Train preprocess data
-# os.system('python train.py --dataroot ../../ai_data --netG unet_128 --name portpy_test_2 --model doseprediction3d --direction AtoB --lambda_L1 1 --dataset_mode dosepred3d --norm batch --batch_size 1 --pool_size 0 --display_port 8097 --lr 0.0002 --input_nc 8 --output_nc 1 --display_freq 10 --print_freq 1 --gpu_ids 0')
-
-# Test preprocess data
-# os.system('python test.py --dataroot ../../ai_data --netG unet_128 --name portpy_test_2 --phase test --mode eval --model doseprediction3d --input_nc 8 --output_nc 1 --direction AtoB --dataset_mode dosepred3d --norm batch')
+# ### 2. Training and testing the model
+#
+# Train the model. You can change the training parameters as shown below.
+
+# Provide only the arguments you want to override
+train_options = {
+    "dataroot": "../../ai_data",
+    "checkpoints_dir": "../../checkpoints",
+    "netG": "unet_128",
+    "name": "portpy_test_3",
+    "model": "doseprediction3d",
+    "direction": "AtoB",
+    "lambda_L1": 1,
+    "dataset_mode": "dosepred3d",
+    "norm": "batch",
+    "batch_size": 1,
+    "pool_size": 0,
+    "display_port": 8097,
+    "lr": 0.0002,
+    "input_nc": 8,
+    "output_nc": 1,
+    "display_freq": 10,
+    "print_freq": 1,
+    "gpu_ids": [0]  # converted to a list since multiple GPUs may be supported
+}
+
+train(train_options)  # run training directly
+
+# Uncomment and run the line below if you want to run the train script from the CLI instead:
+#!python ../portpy/ai/train.py --dataroot ../ai_data --netG unet_128 --name portpy_test_3 --model doseprediction3d --direction AtoB --lambda_L1 1 --dataset_mode dosepred3d --norm batch --batch_size 1 --pool_size 0 --display_port 8097 --lr 0.0002 --input_nc 8 --output_nc 1 --display_freq 10 --print_freq 1 --gpu_ids 0
+
+# Test the model
+test_options = {
+    "dataroot": "../../ai_data",
+    "netG": "unet_128",
+    "checkpoints_dir": "../../checkpoints",
+    "results_dir": "../../results",
+    "name": "portpy_test_3",
+    "phase": "test",
+    "mode": "eval",
+    "eval": True,  # boolean flag
+    "model": "doseprediction3d",
+    "input_nc": 8,
+    "output_nc": 1,
+    "direction": "AtoB",
+    "dataset_mode": "dosepred3d",
+    "norm": "batch"
+}
+test(test_options)
+# !python ../portpy/ai/test.py --dataroot ../ai_data --netG unet_128 --checkpoints_dir ../checkpoints --results_dir ../results --name portpy_test_2 --phase test --mode eval --eval --model doseprediction3d --input_nc 8 --output_nc 1 --direction Ato
 
 # predicted dose back to portpy
 patient_id = 'Lung_Patient_6'
-model_name = 'portpy_test_2'
-pred_dose = predict_using_model(patient_id=patient_id, in_dir=in_dir, model_name=model_name)
+model_name = 'portpy_test_3'
+pred_dose = predict_using_model(patient_id=patient_id, in_dir=in_dir, out_dir=out_dir, model_name=model_name, checkpoints_dir='../../checkpoints', results_dir='../../results')
 
 # load portpy data
 data = pp.DataExplorer(data_dir=in_dir)
@@ -49,8 +95,8 @@
 
 pred_dose_1d = inf_matrix.dose_3d_to_1d(dose_3d=pred_dose)
 
-arcs_dict = {'arcs': [{'arc_id': "01", "control_point_ids": beam_ids[0:int(len(beam_ids) / 2)]},
-                      {'arc_id': "02", "control_point_ids": beam_ids[int(len(beam_ids) / 2):]}]}
+arcs_dict = {'arcs': [{'arc_id': "01", "beam_ids": beam_ids[0:int(len(beam_ids) / 2)]},
+                      {'arc_id': "02", "beam_ids": beam_ids[int(len(beam_ids) / 2):]}]}
 # Create arcs object using arcs dictionary and influence matrix
 arcs = pp.Arcs(arcs_dict=arcs_dict, inf_matrix=inf_matrix)
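The `arcs_dict` change above renames the key `control_point_ids` to `beam_ids` while keeping the same even two-way split of beam ids across the arcs. A standalone sketch of that split in plain Python, independent of PortPy (the helper name `make_arcs_dict` is hypothetical):

```python
def make_arcs_dict(beam_ids):
    """Split a list of beam ids evenly into two arcs, mirroring the
    arcs_dict layout used in the diff above."""
    half = int(len(beam_ids) / 2)  # first arc gets the smaller half when the count is odd
    return {"arcs": [{"arc_id": "01", "beam_ids": beam_ids[:half]},
                     {"arc_id": "02", "beam_ids": beam_ids[half:]}]}

arcs_dict = make_arcs_dict([10, 20, 30, 40, 50])
print([a["beam_ids"] for a in arcs_dict["arcs"]])  # [[10, 20], [30, 40, 50]]
```

The same dictionary shape is used in all three files touched by this rename, so a single helper like this would keep them consistent.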

examples/python_files/vmat_scp_tutorial.py

Lines changed: 2 additions & 2 deletions

@@ -63,8 +63,8 @@ def vmat_scp_tutorial():
     """
     # Assign discrete beam/control_point_ids to arcs and create arcs dictionary.
     # Below is an example of creating 2 arcs. Users can create single or multiple arcs.
-    arcs_dict = {'arcs': [{'arc_id': "01", "control_point_ids": beam_ids[0:int(len(beam_ids) / 2)]},
-                          {'arc_id': "02", "control_point_ids": beam_ids[int(len(beam_ids) / 2):]}]}
+    arcs_dict = {'arcs': [{'arc_id': "01", "beam_ids": beam_ids[0:int(len(beam_ids) / 2)]},
+                          {'arc_id': "02", "beam_ids": beam_ids[int(len(beam_ids) / 2):]}]}
     # Create arcs object using arcs dictionary and influence matrix
     arcs = pp.Arcs(arcs_dict=arcs_dict, inf_matrix=inf_matrix)

examples/python_files/vmat_tps_import.py

Lines changed: 2 additions & 2 deletions

@@ -66,8 +66,8 @@
 
 # Assign discrete beam/control_point_ids to arcs and create arcs dictionary.
 # Below is an example of creating 2 arcs. Users can create single or multiple arcs.
-arcs_dict = {'arcs': [{'arc_id': "01", "control_point_ids": beam_ids[0:int(len(beam_ids) / 2)]},
-             {'arc_id': "02", "control_point_ids": beam_ids[int(len(beam_ids) / 2):]}]}
+arcs_dict = {'arcs': [{'arc_id': "01", "beam_ids": beam_ids[0:int(len(beam_ids) / 2)]},
+             {'arc_id': "02", "beam_ids": beam_ids[int(len(beam_ids) / 2):]}]}
 # Create an object of Arcs class
 arcs = pp.Arcs(arcs_dict=arcs_dict, inf_matrix=inf_matrix)
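Both dose-prediction scripts in this commit configure training through a `train_options` dictionary under the comment "Provide only the arguments you want to override", implying the remaining options fall back to defaults. A minimal sketch of that override pattern; the default values shown are hypothetical illustrations, not PortPy's actual defaults:

```python
def merge_options(defaults, overrides):
    """Return a new options dict: start from defaults, apply user overrides on top."""
    merged = dict(defaults)   # copy so the caller's defaults are not mutated
    merged.update(overrides)  # overrides win on key collisions
    return merged

# hypothetical defaults for illustration only
defaults = {"netG": "unet_128", "lr": 0.0002, "batch_size": 1, "norm": "batch"}
opts = merge_options(defaults, {"lr": 0.001, "batch_size": 2})
print(opts)  # {'netG': 'unet_128', 'lr': 0.001, 'batch_size': 2, 'norm': 'batch'}
```

Keeping the user-facing dict small this way means a script only records the settings that differ from the defaults, which is what makes the `train_options` blocks above readable.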
