A portal-based system for quality assurance of radiotherapy treatment
plans using Grid-enabled High Performance Computing clusters
Ian C. Smith1
CR Baker2, V Panettieri3, C Addison1, AE Nahum3
1 Computing Services Dept, University of Liverpool; 2 Directorate of Medical Imaging and Radiotherapy, University of Liverpool; 3 Physics Department, Clatterbridge Centre for Oncology
Outline
Introduction to radiotherapy treatment planning
University of Liverpool Grid Computing Server (GCS)
GCS tools
Command line job submission using the GCS
UL-GRID Portal
Results
Future directions
Rationale
Routine radiotherapy treatment planning is constrained by lack of
sufficiently powerful computing resources
Monte Carlo (MC) based codes can provide accurate absorbed
dose calculations but are computationally demanding (single
simulation can take 3 weeks on a desktop machine)
Fortunately MC methods are inherently parallel – can run on HPC
resources and (for some codes) HTC resources
So far we have looked at running simulations on local and centrally
funded HPC clusters in a user-friendly manner
Starting to look at using Condor pools
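The "inherently parallel" point can be made concrete with a toy Monte Carlo example (a pi estimate, not a transport code): give each of M independent jobs its own random seed, run them on separate nodes with no communication, and combine the partial results afterwards. This is a sketch of the pattern, not any of the codes discussed here.

```python
import random
import statistics

def run_job(seed, n_histories):
    """One independent Monte Carlo 'job': estimate pi from n_histories samples."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_histories)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / n_histories

# Each job gets a distinct seed, so the jobs are statistically independent
# and could run on separate cluster nodes (HPC) or Condor slots (HTC).
partials = [run_job(seed, 10_000) for seed in range(8)]

# Post-processing: average the partial estimates and report the spread.
estimate = statistics.mean(partials)
stderr = statistics.stdev(partials) / len(partials) ** 0.5
print(f"pi ~ {estimate:.4f} +/- {stderr:.4f}")
```

Because the jobs share nothing, the wall-clock time falls roughly as 1/M while the statistical uncertainty behaves as if all histories ran in one simulation.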
Radiotherapy codes
Two MC codes have been investigated to date:
MCNPX (beta v2.7a)
general purpose transport code, tracks nearly all particles at nearly all
energies (https://mcnpx.lanl.gov/).
parallel (MPI-based) code, only runs on clusters
self-contained – no need for pre- and post-processing steps
PENELOPE
general purpose MC code implemented as a set of FORTRAN routines
coupled electron-photon transport from 50 eV to 1 GeV in arbitrary
materials and complex geometries[1].
serial implementation, will run on clusters and Condor pools
needs pre- and post- processing to set up input files and combine partial
results
Starting to look at EGSnrc / BEAMnrc / DOSXYZnrc
[1] Salvat F, Fernández-Varea JM, Sempau J. PENELOPE, a code system for
Monte Carlo simulation of electron and photon transport. France: OECD Nuclear
Energy Agency, Issy-les-Moulineaux; 2008. ISBN 9264023011. Available in pdf
format at: http://www.nea.fr.
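PENELOPE's pre-processing step, creating N input files that differ only in their random seeds, can be sketched as follows. This illustrates the clonEasy-style cloning idea only; the SEED1/SEED2 token, keyword names, and file layout here are hypothetical, not clonEasy's actual format.

```python
import pathlib
import random
import tempfile

def clone_inputs(template_text, n_jobs, seed_token="SEED1 SEED2"):
    """Write n_jobs copies of a PENELOPE-style input deck, each with its own
    random-seed pair, so the resulting serial runs are statistically
    independent.  (Illustrative only, not clonEasy's real file format.)"""
    rng = random.Random(12345)                 # fixed seed: reproducible cloning
    outdir = pathlib.Path(tempfile.mkdtemp())  # scratch dir for the clones
    paths = []
    for i in range(1, n_jobs + 1):
        seeds = f"{rng.randrange(1, 2**31)} {rng.randrange(1, 2**31)}"
        path = outdir / f"penmain_{i}.in"
        path.write_text(template_text.replace(seed_token, seeds))
        paths.append(path)
    return paths

template = "TITLE  test run\nRSEED  SEED1 SEED2\nNSIMSH 1e6\n"
files = clone_inputs(template, 4)
```

Each cloned file can then be submitted as an independent serial job on a cluster or Condor pool.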
[Figure: Simulation of an electron treatment, from the treatment head to the patient (taken from Cygler et al). Courtesy of Prof. A. Nahum (CCO)]
Grid Computing Server / UL-GRID Portal
Grid Computing Server / UL-GRID software stack
Grid Computing Server tools
single sign on to resources via MyProxy, use ulg-get-proxy (proxies
automatically renewed)
job management is very similar to local batch systems such as SGE: ulg-qsub, ulg-qstat, ulg-qdel etc.
support for submitting large numbers of jobs, file staging and pre- and
post- processing
job submission process is the same for all compute clusters (local or
external)
utility tools1 provide simple Grid based extensions to standard UNIX
commands: ulg-cp, ulg-ls, ulg-rm, ulg-mkdir etc.
status commands available e.g. ulg-status, ulg-rqstat
1 based on GROWL scripts from STFC Daresbury
PENELOPE (serial code) workflows
Phase-space file calculation:
create random seeds for N input files using clonEasy[1]
compute individual phase-space files (HPC cluster)
combine N individual phase-space files
Patient treatment simulation (repeated for each patient):
create random seeds for N input files using clonEasy[1]
stage-in phase-space file (only if necessary)
compute partial treatment simulation results (HPC cluster)
combine partial treatment simulation results using clonEasy[1]
Both workflows are driven from the Portal
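The final combine step merges the partial results into one dose estimate. A minimal sketch of the statistics involved, assuming equal numbers of histories per partial run (the function name and data layout are our own illustration, not clonEasy's code):

```python
import math

def combine_partials(results):
    """Combine M partial tallies, each a (dose, 1-sigma uncertainty) pair from
    the same number of histories.  Equal weighting; since the runs used
    independent seeds, the uncertainty of the mean is
    sigma = sqrt(sum sigma_i^2) / M."""
    m = len(results)
    dose = sum(d for d, _ in results) / m
    sigma = math.sqrt(sum(s * s for _, s in results)) / m
    return dose, sigma

# Four independent partial runs of equal length:
dose, sigma = combine_partials([(2.00, 0.04), (2.02, 0.04), (1.98, 0.04), (2.00, 0.04)])
```

With four equal partial runs the combined uncertainty is half that of a single run, illustrating the usual 1/sqrt(N) behaviour of Monte Carlo tallies.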
[1] Badal A and Sempau J 2006 A package of Linux scripts for the parallelization
of Monte Carlo simulations Comput.Phys. Commun. 175 440–50
GCS job description files for PENELOPE (1)
#
# create phase space file (PSF)
#
job_type = remote_script
host = ulgbc2
total_jobs = 16
name = penelopeLPO
pre_processor = /opt1/ulgrid/apps/penelope/seed_input_files
pre_processor_arguments = penmain_acc6_LPO35_.in 16
indexed_input_files = penmain_acc6_LPO35_.in
input_files = spectrum_pE_6_LPO35.geo, 6MW_2.mat
executable = /usr/local/bin/run_penmain
arguments = penmain_acc6_LPO35_INDEX.in penmain_LPO35_INDEX.out
log = mylogfile
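In the file above, total_jobs = 16 together with the INDEX token in indexed_input_files and arguments suggests that the GCS substitutes each job's index into the file names, so the 16 jobs read distinct inputs. A minimal sketch of that substitution (our own illustration; the GCS's actual numbering, e.g. 0- vs 1-based, is not shown in the source, and 1-based is assumed here):

```python
def expand_indexed(template, index, token="INDEX"):
    """Replace a GCS-style INDEX token with the job's index number.
    Assumes 1-based numbering, which is a guess, not documented behaviour."""
    return template.replace(token, str(index))

# Expand the arguments line from the job description for all 16 jobs:
args = [expand_indexed("penmain_acc6_LPO35_INDEX.in penmain_LPO35_INDEX.out", i)
        for i in range(1, 17)]
```

Each element of args would then be passed to one instance of run_penmain, paired with the matching seeded input file from the pre-processing step.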
GCS job description files for PENELOPE (2)
#
# perform patient simulation using previously calculated phase space file (PSF)
#
job_type = remote_script
host = ulgbc2
name = penelope
total_jobs = 10
pre_processor = /opt1/ulgrid/apps/penelope/seed_input_files
pre_processor_arguments = penmain.in 10
staged_input_files = PSF_test.psf
input_remote_stage_dir = staging
input_files = water_phantom.geo, water.mat
indexed_input_files = penmain.in
executable = /usr/local/bin/run_penmain
arguments = penmainINDEX.in penmainINDEX.out ics
log = penelope.log
Condor job files for PENELOPE
# job description file
grid_resource = gt2 ulgbc2.liv.ac.uk/jobmanager-condorg
universe = grid
executable = /usr/local/bin/run_penmain
arguments = penmain_acc6_LPO35_$(PROCESS).in penmain_LPO35_$(PROCESS).out ics_test
+ulg_job_name = penelopeLPO
log = log
transfer_input_files = spectrum_pE_6_LPO35.geo, 6MW_2.mat, penmain_acc6_LPO35_$(PROCESS).in
transfer_files = always
transfer_executable = FALSE
GlobusRSL = (count=1)(job_type=remote_script) \
(input_working_directory=/condor_data/smithic/penelope/big_test/create_psf) \
(job_name=penelopeLPO)
notification = never
queue 16
# DAG file
JOB pre_process dummy1.sub
JOB staging penelopeLPO35.sub
SCRIPT PRE pre_process /opt1/ulgrid/apps/penelope/seed_input_files
PARENT pre_process CHILD staging
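The DAG file guarantees that the pre_process node (which seeds the input files) completes before the staging node runs. The ordering implied by PARENT/CHILD lines can be sketched with a topological sort (a simplified parser for illustration, handling one parent and one child per line, not DAGMan's full grammar):

```python
from graphlib import TopologicalSorter  # Python 3.9+

def dag_order(dag_text):
    """Return a valid execution order for the JOB / PARENT...CHILD lines of a
    DAGMan-style file (simplified: one parent and one child per PARENT line)."""
    ts = TopologicalSorter()
    for line in dag_text.splitlines():
        parts = line.split()
        if parts[:1] == ["JOB"]:
            ts.add(parts[1])                       # node with no predecessors yet
        elif parts[:1] == ["PARENT"]:
            child = parts[parts.index("CHILD") + 1]
            ts.add(child, parts[1])                # child depends on parent
    return list(ts.static_order())

order = dag_order("""JOB pre_process dummy1.sub
JOB staging penelopeLPO35.sub
PARENT pre_process CHILD staging""")
```

For this two-node DAG the only valid order is pre_process before staging, which is exactly what DAGMan enforces at submission time.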
GCS job submission and monitoring
smithic(ulgp4)create_psf$ ulg-qsub penelopeLPO35
Grid job submitted successfully, Job ID is 125042
smithic(ulgp4)create_psf$ ulg-qstat
Job ID    Job Name     Owner     State  Cores  Host
------    --------     -----     -----  -----  ----
125015.0  penelopeLPO  smithic   pr     1      ulgbc2.liv.ac.uk
125034.0  penelope     vpanetti  r      1      ulgbc2.liv.ac.uk
125035.0  penelope     vpanetti  w      1      ulgbc2.liv.ac.uk
125038.0  penelope     smithic   si     1      ulgbc2.liv.ac.uk
125042.0  penelopeLPO  smithic   qw     1      ulgbc2.liv.ac.uk
125043.0  mcnpx3       colinb    r      64     ulgbc4.liv.ac.uk
125044.0  mcnpx3       colinb    r      64     lancs2.nw-grid.ac.uk
125044.0  gamess_test  bonarlaw  r      32     ngs.rl.ac.uk
Lung treatment simulated with PENELOPE and penVOX
7 fields
PSF calculation: 1.5 days (14 cores), approximately 1.5 million particles
Patient calculation: 1.5 days for all 7 fields (single core); statistical uncertainty 1% (1 sigma)
Proton absorbed dose in water using MCNPX
2.5 cm diameter beam, full energy (~60 MeV at patient, ~3.2 cm range in water)
500 million histories
0.5 x 0.5 x 5 mm voxels
50 keV proton cut-off
<1% statistical uncertainty in absorbed dose in the high-dose region (1 sigma)
[Figure: depth-dose curves showing the Bragg peak at full energy and at half-modulation]
Future Directions
Provide support for BEAM[1] and DOSXYZnrc[3] (based on the EGSnrc
MC code [2])
Utilise Liverpool Windows Condor Pool for running PENELOPE
jobs
Compare with other implementations e.g. RT-Grid.
References:
[1] D. W. Rogers, B. Faddegon, G. X. Ding, C. M. Ma, J. Wei, and T. Mackie, "BEAM: A Monte Carlo code to
simulate radiotherapy treatment units," Med. Phys. 22, 503–524 (1995).
[2] I. Kawrakow and D. W. O. Rogers. The EGSnrc Code System: Monte Carlo simulation of electron and photon
transport. Technical Report PIRS-701 (4th printing), National Research Council of Canada, Ottawa, Canada, 2003.
[3] Walters B, Kawrakow I and Rogers D W O 2007 DOSXYZnrc Users Manual Report PIRS 794 (Ottawa: National
Research Council of Canada)