
Grid and cluster computing for radiotherapy: Monte Carlo simulations bring much-needed accuracy to benefit cancer patients.

Each year, over 11 million people are diagnosed with cancer, and seven million die of the disease. * Radiation therapy, surgery, or a combination of the two can cure about a third of these cases. (1) Computer simulation of radiation transport is very useful in radiotherapy, both in research and in clinical routine, particularly when:

* accuracy from existing calculation methods is unsatisfactory

* physical experiments are impractical, unethical or impossible.

Today, accurate dose calculation is in greater demand than ever, particularly with the wide introduction of complex treatment modalities such as conformal and intensity-modulated radiotherapy (CFRT and IMRT). (2) To calculate radiation dose, one might attempt to solve the coupled integro-differential equations describing the electromagnetic shower of particles. However, without imposing severe approximations, such analytical treatment is prohibitively complicated. The conventional alternative is to employ pre-computed energy deposition kernels. (3) Still, accuracy is limited, since certain conditions must be assumed unchanged between kernel generation and kernel application.

Additionally, some experiments are impractical (e.g., knocking down walls or adjusting ceiling heights in radiation shielding room design) (4,5) or unethical (e.g., exposing patients to trial-and-error radiation beams) (6) to carry out. And some quantities are impossible to measure, such as the origin of a detected particle--what interactions has it been through, and what were its ancestor particles?

One numerical method of radiation transport simulation is known as Monte Carlo, so named for its use of random numbers during a computer simulation. (7) To achieve the highest accuracy, one must resort to this method, which works by faithfully simulating the stochastic physical reality behind the birth, life and death of each radiation particle. For difficult experiments, computer simulations offer tremendous flexibility and power. Using a virtual 'phantom' replica of the patient's anatomy, built from X-ray computed tomography (CT) data (and sometimes other imaging modalities), the most beneficial beam arrangement can be configured for a specific patient.

A collaborative effort

Velindre Cancer Centre serves the population of South-East Wales, about 1.5 million, of whom approximately 3,000 new patients require radiotherapy each year. Every center has to balance the need to treat each patient in a timely manner against the desire to provide the best possible treatment quality and accuracy. More sophisticated treatment regimes inevitably involve greater time and effort to plan and deliver.

The desire to develop state-of-the-art dose calculation methods in the shortest possible time has driven the collaboration between the clinically based radiotherapy physicists at Velindre Cancer Centre and the computer scientists at the Welsh e-Science Centre (WeSC). Based in the School of Computer Science at Cardiff University, WeSC is one of eight regional centers established in 2001 by the UK Research Councils' e-Science program to support and promote a new approach to large-scale science, carried out through distributed global collaborations enabled by the Internet. Typically, such collaborative scientific enterprises require desktop access to very large data collections, very large-scale computing resources, and high-performance visualization.

Seeking a clinically valuable solution

Monte Carlo is indeed a good solution. But the computer run-time required can be prohibitively long, since the confidence level of Monte Carlo results depends on statistical sampling. To obtain statistically meaningful results, a great many radiation histories must be simulated, and a computer processes successive histories serially, one after another. On a standalone computer, it can take several weeks to compute a treatment dose within acceptable statistical uncertainty for a cancer patient, which is obviously not clinically practical.
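The reason run-times grow so quickly is that the statistical uncertainty of a Monte Carlo estimate falls only as the square root of the number of histories: halving the uncertainty costs four times the computation. A toy sketch in Python (not EGSnrc; the exponential 'deposition' model is purely illustrative) demonstrates the scaling:

```python
import math
import random

def estimate_dose(n_histories, seed):
    """Toy Monte Carlo: average the energy 'deposited' by n_histories
    random particle histories (an exponential model, purely illustrative)."""
    rng = random.Random(seed)
    deposits = [rng.expovariate(1.0) for _ in range(n_histories)]
    mean = sum(deposits) / n_histories
    # Standard error of the mean falls as sigma / sqrt(N).
    var = sum((d - mean) ** 2 for d in deposits) / (n_histories - 1)
    return mean, math.sqrt(var / n_histories)

# Quadrupling the history count only halves the statistical uncertainty.
_, err_10k = estimate_dose(10_000, seed=1)
_, err_40k = estimate_dose(40_000, seed=2)
```

The same square-root law holds for a real treatment-plan simulation, which is why weeks of serial computation are needed for clinically acceptable uncertainty.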

Clinically useful solutions require a turnaround time of, at most, a few hours. Possible ways to increase simulation efficiency (i.e., to reduce the run-time required to achieve a given statistical uncertainty) include:

* variance reduction techniques, which introduce preferential sampling or non-sampling during simulation (8,9)

* denoising, a post-simulation process that smoothes fluctuating data (10,11)

* grid and cluster computing, which run different radiation histories simultaneously on several computers (12)

* quantum computing, which is not yet a practical proposition. (13)

Grid and cluster computing

Of the possible solutions above, grid and cluster computing has the advantage of biasing neither the physics nor the statistics, and the discussion that follows describes a solution for speeding up Monte Carlo simulations. Grid computing involves using services on a large number of networked resources, an infrastructure to provide secure remote access to these services, and tools to control a computation across the different services. Available computing resources include:

* The UK National Grid Service (NGS), the core production-level grid under the UK e-Science program, providing over 200 dedicated dual 3.06 GHz Intel Xeon nodes and other machines spread over seven UK sites.

* Non-dedicated Condor pools of 30 desktop computers at Velindre Cancer Centre, as well as over 800 desktop computers at Cardiff University, to tap into office desktops while idle.

* A dedicated Beowulf cluster jointly owned by Velindre Cancer Centre and the University of Surrey, which has been in use since 1999. (14)

A Monte Carlo simulation run on a grid or cluster differs from that run on a standalone processor, in that the simulation is split into many smaller sub-tasks, each executed on a different processor. The input files for each small simulation are identical to each other and to the original simulation in every aspect except one number--the initial random number seed. Different seeds ensure non-correlation between the multiple small simulations, so that their output may be statistically meaningful when combined.
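A minimal sketch of this split-and-combine scheme, in Python with a toy deposition model standing in for the real EGSnrc sub-jobs (the job count and seed scheme here are illustrative assumptions):

```python
import random

def run_sub_simulation(n_histories, seed):
    """Stand-in for one sub-job: identical in every respect except the seed.
    Returns raw tallies (sum, sum of squares, count) so jobs can be pooled."""
    rng = random.Random(seed)
    total = total_sq = 0.0
    for _ in range(n_histories):
        d = rng.expovariate(1.0)  # toy 'energy deposit' per history
        total += d
        total_sq += d * d
    return total, total_sq, n_histories

def combine(partials):
    """Pool per-job tallies into one overall mean and standard error."""
    total = sum(p[0] for p in partials)
    total_sq = sum(p[1] for p in partials)
    n = sum(p[2] for p in partials)
    mean = total / n
    var = (total_sq / n - mean * mean) * n / (n - 1)
    return mean, (var / n) ** 0.5

# Eight 'processors', each given a distinct random number seed.
partials = [run_sub_simulation(5_000, seed=job) for job in range(1, 9)]
mean, err = combine(partials)
```

Because each job returns raw tallies rather than a finished average, the pooled result has the same statistical meaning as one long serial run, but finishes in roughly one-eighth of the wall-clock time.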

The EGSnrc-based Monte Carlo codes, (15) which have been used extensively by our group, offer two choices of random number generator: RANLUX and RANMAR. (16,17) RANLUX has a period of over 10^165 and offers four luxury levels (excluding level zero, which is known to cause problems and should therefore be avoided). (15) The higher the luxury level, the longer the simulation run-time. RANLUX luxury level 1 is used routinely in our computations and has been reported to be adequate for most purposes. (15)

The EGSnrc family includes BEAMnrc (for medical linear accelerators) and DOSXYZnrc (for 3-D phantom lattices). (18,19) The multi-platform capabilities of the recent version, EGSnrcMP, allow simulations to run on Linux, Irix, Windows 2000 and Windows XP operating systems, among others. Nimrod and Condor are used as resource brokers, which decide to which site and processor to send individual simulation jobs. No pre-installation of EGSnrc on the execution site is necessary; a lightweight file system (i.e., small in terms of disk space compared to a full installation) is transferred to the execution site at the beginning of each job. Using conditional statements in shell and Perl scripts, the appropriate pre-compiled executable is selected based on the local operating system of the execution site.
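The executable-selection step can be sketched as a simple dispatch on the operating system name. This Python version only illustrates the logic: the actual scripts were shell and Perl, and the binary names below are hypothetical.

```python
import platform

# Hypothetical binary names; the real system shipped pre-compiled EGSnrc
# executables and picked one with shell/Perl conditionals at the execution site.
EXECUTABLES = {
    "Linux": "dosxyznrc_linux",
    "Windows": "dosxyznrc_win32.exe",
    "IRIX": "dosxyznrc_irix",
}

def pick_executable(system=None):
    """Return the pre-compiled executable matching the local operating system."""
    system = system or platform.system()
    if system not in EXECUTABLES:
        raise RuntimeError("no pre-compiled binary for " + system)
    return EXECUTABLES[system]
```

Failing loudly on an unknown platform is preferable to a silent default, since a grid job landing on an unsupported node should be rescheduled rather than produce no output.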

Greatly improved simulation efficiency

Using the NGS, and in competition with other users' computations, the simulation of the second example described below took less than three hours to complete, a 50-fold increase in simulation efficiency compared with a simulation on a single 2.66 GHz Intel Pentium 4 processor.

The three simulation examples presented here involve:

* verification of patient dosage calculation (clinical pre-treatment verification)

* verification of radiotherapy delivery (clinical on-treatment verification)

* scientific investigation of physically immeasurable quantities (research).

Pre-treatment and on-treatment verifications are vital in radiotherapy. Adverse events such as 100% overdoses, wrong data transfers and even deaths have been reported in the U.S. Food and Drug Administration (FDA) database. Radiotherapy planning and delivery are potentially error-prone processes because:

* they involve expertise from different departments (e.g., physicists, clinicians, technicians, radiographers)

* a sizeable amount of data must be transferred and modified from one workstation to another

* depending on the algorithm used, different dose calculations are of varying accuracy

* the electronic and mechanical performance of the linear accelerator, while highly reliable, is not completely error-free

* alignment of the patient's internal organs with respect to the beam is not always exact.

To avoid such happenings, a stringent verification framework both before and during treatment must be in place.

Figure 1 shows pre-treatment verification of a dose calculation for a head-and-neck IMRT treatment. The treatment planning system (TPS) used locally, as is the case elsewhere, calculates dose based on pre-calculated kernels, which are less accurate than Monte Carlo. Such verification of calculation accuracy against Monte Carlo results is therefore vital as we move towards high-precision radiotherapy.

[FIGURE 1 OMITTED]

Figure 2 shows dose verification during treatment in a controlled phantom study, where a transmission radiograph was acquired during beam delivery. The 20x20x20 cm perspex phantom had a 4 cm-thick, 5x5 cm air cavity at its center. A similar geometry was then simulated using Monte Carlo, producing a virtual radiograph. After appropriate calibrations, the radiographs can be interpreted as dose maps. (21) Excellent agreement between measurement and simulation on the graph shows successful dose prediction by Monte Carlo simulations. When quantified, the dose difference between simulation and measurement is less than two percent in most areas; where it exceeds two percent, the positional difference is less than two millimeters. This '2 percent or 2 mm' agreement criterion is widely used in radiotherapy dose comparisons, and the 'or 2 mm' condition accounts for positioning uncertainty in areas of high dose gradients.
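A simplified, one-dimensional sketch of the '2 percent or 2 mm' test (using discrete sample points rather than the interpolated distance-to-agreement of a full gamma analysis) might look like this:

```python
def passes_2pct_or_2mm(measured, simulated, positions,
                       dose_tol=0.02, dist_tol=2.0):
    """For each measured point: pass if the local dose difference is within
    2 percent of the simulated dose, OR a simulated point with matching dose
    lies within 2 mm (allowing for positional uncertainty in steep gradients).
    positions are in mm; the local-dose check is the xj == x case of the any()."""
    results = []
    for m, x in zip(measured, positions):
        ok = any(
            abs(x - xj) <= dist_tol
            and abs(m - sj) <= dose_tol * max(abs(sj), 1e-9)
            for sj, xj in zip(simulated, positions)
        )
        results.append(ok)
    return results
```

For example, a steep dose edge shifted by 1 mm passes (the 'or 2 mm' branch), while a 50% dose error in a flat region fails.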

[FIGURE 2 OMITTED]

Some quantities cannot be measured using physical instruments, but can be estimated using computer simulations. Results from an EGSnrc simulation show the count of each interaction type in every layer of an amorphous silicon (a-Si) radiation detector (Figure 3), which is typically used during treatment delivery. The Monte Carlo simulation gives us a glimpse of what happened when the a-Si detector was exposed to radiation: event counts for Compton scattering, photoelectric absorption, pair production, bremsstrahlung, annihilation, Moller and Bhabha scattering, and fluorescence. The physical instrument of the a-Si detector itself would only measure the energy deposited in the terbium-activated gadolinium oxysulphide (Gd2O2S:Tb) layer; no knowledge of the radiation history (i.e., the breakdown of interaction and particle type) could be gained.

[FIGURE 3 OMITTED]

Figure 4 shows the share of energy deposition by signal- and non-signal-contributing particles, where a particle contributes to the signal by depositing energy in the Gd2O2S:Tb layer. Such knowledge about the radiation history can neither be measured nor calculated on the back of an envelope (i.e., using cross-section lookup tables), since each layer is of varying thickness, comprising compounds of different elements in varying proportions, exposed to a stochastic radiation beam containing generations of secondary particles.

Conclusion

The collaboration between Velindre Cancer Centre and WeSC has already been mutually beneficial in matching supply and demand for major computing power to meet a pressing clinical need. (22) These collaborative efforts have recently been further boosted by £400,000 of research funding for the next three years from the UK Engineering and Physical Sciences Research Council, which will allow the development of a more streamlined system to bring grid and cluster computing to the wider UK radiotherapy community as a national grid resource in the future.

References

(1.) R. Souhami and J. Tobias. Cancer and its management. Blackwells, Oxford, 1986.

(2.) S. Webb. The Physics of Conformal Radiotherapy. IOP Publishing, Bristol, 1997.

(3.) A. Ahnesjo and M. M. Aspradakis. Dose calculations for external photon beams in radiotherapy. Physics in Medicine and Biology, 44:R99-155, 1999.

(4.) IPEM Report No. 75: The Design of Radiotherapy Treatment Room Facilities. Institute of Physics and Engineering in Medicine, York, 1997.

(5.) P.W. Chin. Neutron contamination in a radiotherapy maze. Master's thesis, University of Surrey, 1999.

(6.) R. Jeraj, P. Keall, and J. V. Siebers. The effect of dose calculation accuracy on inverse treatment planning. Physics in Medicine and Biology, 47:391-407, 2002.

(7.) A. E. Nahum. Monte Carlo Transport of Electrons and Photons, chapter Overview of Photon and Electron Monte Carlo. Plenum Press, 1988.

(8.) J. Briesmeister. MCNP--A general-purpose Monte Carlo code for neutron and photon transport, Version 3A, LANL Report LA-7396-M. Los Alamos National Laboratory, Los Alamos, 1986.

(9.) I. Kawrakow, D. W. O. Rogers, and B. R. B. Walters. Large efficiency improvements in BEAMnrc using directional bremsstrahlung splitting. Medical Physics, 31:2883-2898, 2004.

(10.) I. Kawrakow. On the de-noising of Monte Carlo calculated dose distributions. Physics in Medicine and Biology, 47:3087-3103, 2002.

(11.) B. Miao, R. Jeraj, S. Bao, and T. R. Mackie. Adaptive anisotropic diffusion filtering of Monte Carlo dose distributions. Physics in Medicine and Biology, 48:2767-81, 2003.

(12.) I. Foster and C. Kesselman, editors. The Grid: Blueprint for a New Computing Infrastructure. Elsevier, 2003.

(13.) M. Peplow. Quantum computing gets a step closer. Nature, 2004.

(14.) P. Love, E. Spezi, D. G. Lewis, C. W. Smith, E. Morton, and D. Munro. Parallel processing of radiotherapy Monte Carlo simulations on a remote Beowulf cluster. In W. Schlegel and T. Bortfeld, editors, Proc. 13th Int. Conf. on the Use of Computers in Radiation Therapy, Heidelberg, 2000. Springer.

(15.) I. Kawrakow and D. W. O. Rogers. The EGSnrc Code System, NRCC Report PIRS-707. NRCC, Ottawa, 2002.

(16.) M. Lüscher. A portable high-quality random number generator for lattice field theory simulations. Computer Phys. Commun., 1994.

(17.) G. Marsaglia and A. Zaman. A new class of random number generators. Annals of Applied Probability, 1, 1991.

(18.) D. W. O. Rogers, C-M Ma, B. Walters, G. X. Ding, D. Sheikh-Bagheri, and G. Zhang. BEAMnrc Users Manual, NRCC Report PIRS-0509A (rev G). NRCC, Ottawa, 2001.

(19.) B. R. B. Walters and D. W. O. Rogers. DOSXYZnrc Users Manual, NRCC Report PIRS-794. NRCC, Ottawa, 2002.

(20.) P. W. Chin, D. G. Lewis, and J. P. Giddy. A Monte Carlo solution for external beam photon radiotherapy verification. In The Monte Carlo Method: Versatility Unbounded In A Dynamic Computing World, Chattanooga, 2005. American Nuclear Society.

(21.) P. W. Chin. Monte Carlo portal dosimetry. Ph.D. thesis, University of Wales, 2005.

(22.) P. W. Chin, D. G. Lewis, and J. P. Giddy. Implementation of BEAMnrc Monte Carlo Simulations on the Grid. In Proc. 14th Int. Conf. on the Use of Computers in Radiation Therapy, Seoul, 2004. Jeong.

Dr. P. W. Chin is a Research Associate of Cardiff University. Dr. D.G. Lewis is the head of the Department of Medical Physics at the Velindre Cancer Centre. J.P. Giddy is the Grid Technologies Coordinator at the Welsh e-Science Centre, Cardiff University. They may be contacted at sceditor@scimag.com.
Grid Computing and Radiotherapy Resources

Beowulf.org www.beowulf.org

Center for Devices and Radiological Health www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfMAUDE/search.cfm
Condor High Throughput Computing www.cs.wisc.edu/condor
National Grid Service (NGS) www.ngs.ac.uk
Nimrod www.csse.monash.edu.au/nimrod
Welsh e-Science Centre www.wesc.ac.uk
* World Health Organization (WHO) www.who.int
COPYRIGHT 2006 Advantage Business Media
No portion of this article can be reproduced without the express written permission from the copyright holder.

Article Details
Title Annotation: FOCUS--HIGH PERFORMANCE COMPUTING
Author: Chin, P.W.; Lewis, D.G.; Giddy, J.P.
Publication: Scientific Computing
Date: Mar 1, 2006