Lobectomy is a common and effective procedure for treating early-stage lung cancers. However, for patients with compromised pulmonary function (e.g., COPD), lobectomy can lead to major postoperative pulmonary complications. A technique for quantitatively predicting postoperative pulmonary function is needed to assist surgeons in assessing a candidate's suitability for lobectomy. We present a framework for quantitatively predicting postoperative lung physiology and function using a combination of lung biomechanical modeling and machine learning strategies. A set of 10 patients undergoing lobectomy was used for this purpose. The image input consists of pre- and post-operative breath-hold CTs. An automated lobe segmentation algorithm and lobectomy simulation framework was developed using a constrained generative adversarial network (cGAN) approach. Using the segmented lobes, a patient-specific, GPU-based linear elastic biomechanical and airflow model and surgery simulation was then assembled that quantitatively predicted the lung deformation during the forced expiration maneuver. The lobe in question was then removed by simulating a volume reduction and computing the elastic stress on the surrounding residual lobes and the chest wall. Using the deformed lung anatomy representing the post-operative lung geometry, the forced expiratory volume in 1 second (FEV1, the amount of air exhaled by a patient in 1 second starting from maximum inhalation) and the forced vital capacity (FVC, the total amount of air forcibly exhaled from maximum inhalation) were then modeled. Our results demonstrated that the proposed approach quantitatively predicted the postoperative lobe-wise lung function in terms of FEV1 and FEV1/FVC.
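The FEV1 and FVC definitions above can be illustrated with a minimal sketch. This is not the authors' GPU-based biomechanical model; the mono-exponential emptying curve, its time constant, and the vital-capacity value below are purely illustrative assumptions used to show how the two spirometry quantities are read off a simulated forced-expiration volume curve:

```python
import numpy as np

def fev1_fvc(time_s, volume_l):
    """Extract FEV1 and FVC from an exhaled-volume curve.

    time_s   : times since the start of forced expiration (s)
    volume_l : cumulative exhaled volume at each time (L)
    """
    fvc = volume_l[-1] - volume_l[0]                        # total exhaled volume
    fev1 = np.interp(1.0, time_s, volume_l) - volume_l[0]   # volume exhaled by t = 1 s
    return fev1, fvc

# Illustrative mono-exponential lung-emptying curve (assumed, not patient data)
t = np.linspace(0.0, 6.0, 601)
tau = 0.7            # assumed emptying time constant (s)
vc = 3.5             # assumed vital capacity (L)
v = vc * (1.0 - np.exp(-t / tau))

fev1, fvc = fev1_fvc(t, v)
ratio = fev1 / fvc   # the FEV1/FVC ratio reported in the study
```

In the paper's framework, the volume curve would instead come from the deformed post-operative lung geometry produced by the biomechanical simulation, with the resected lobe's contribution removed.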
Adaptive radiotherapy is an effective procedure for the treatment of cancer, in which the daily anatomical changes in the patient are quantified and the dose delivered to the tumor is adapted accordingly. Deformable image registration (DIR) inaccuracies, together with delays in retrieving on-board cone-beam CT (CBCT) image datasets from the treatment system and registering them with the planning kilovoltage CT (kVCT), have restricted the adaptive workflow to a small number of patients. In this paper, we present an approach for improving DIR accuracy using a machine learning approach coupled with biomechanically guided validation. For a given set of 11 planning prostate kVCT datasets and their segmented contours, we first assembled a biomechanical model to generate synthetic abdominal motions, bladder volume changes, and physiological regression. For each of the synthetic CT datasets, we then injected noise and artifacts into the images using a novel procedure in order to closely mimic CBCT datasets. We then used the simulated CBCT images to train neural networks that predicted the noise- and artifact-removed CT images. For this purpose, we employed a constrained generative adversarial network (cGAN), which consisted of two deep neural networks: a generator and a discriminator. The generator produced the artifact-removed CT images, while the discriminator computed their accuracy. The DIR results were finally validated using the model-generated landmarks. Results showed that the artifact-removed CT matched the planning CT closely. Comparisons were performed using image similarity metrics, and a normalized cross-correlation of >0.95 was obtained from the cGAN-based image enhancement. In addition, when DIR was performed, the landmarks matched within 1.1 +/- 0.5 mm. This demonstrates that adversarial DNN-based CBCT enhancement improves DIR accuracy and thereby bolsters the adaptive radiotherapy workflow.
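The normalized cross-correlation metric used above to compare the artifact-removed CT against the planning CT can be sketched as follows. This is a generic zero-mean NCC implementation, not the authors' code; the synthetic "CT slice" and noise level are assumptions used only to exercise the function:

```python
import numpy as np

def normalized_cross_correlation(a, b):
    """Zero-mean normalized cross-correlation between two images.

    Returns a value in [-1, 1]; values near 1 indicate that the two
    images agree up to a linear intensity scaling.
    """
    a = a.astype(np.float64) - a.mean()
    b = b.astype(np.float64) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom)

# Illustrative check on a synthetic image pair (assumed data):
rng = np.random.default_rng(0)
ct = rng.normal(size=(64, 64))          # stand-in for a planning CT slice
enhanced = ct + 0.1 * rng.normal(size=(64, 64))  # stand-in for the cGAN output
ncc = normalized_cross_correlation(ct, enhanced)
```

In the study, an NCC above 0.95 between the enhanced CBCT and the planning kVCT was taken as evidence that the artifact removal preserved the underlying anatomy.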