A coaxial projective imaging (CPI) module acquires surgical scene images at the local surgical site, transfers them wirelessly to the remote site, and projects instructive annotations onto the surgical field. At the remote site, the surgical scene images are displayed, and the instructive annotations from a surgical specialist are wirelessly transferred back to the local site to guide the intervention performed by a less experienced surgeon. The CPI module achieves seamless imaging of the surgical field and accurate projection of the instructive annotations through a coaxial optical path design that couples the imaging arm with the projection arm and a color correction algorithm that recovers the true color of the surgical scene. Our benchtop study of tele-guided intervention verifies that the proposed system achieves a positional accuracy better than 1 mm at working distances ranging from 300 to 500 mm. Our in vivo study of cricothyrotomy in a rabbit model demonstrates the concept of tele-mentored surgical navigation. To our knowledge, this is the first report of tele-guided surgery based on CPI. The proposed technique can potentially be used for surgical training and for tele-mentored surgery in resource-limited settings.
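As an illustration of the color correction step described above, a linear color-correction matrix (CCM) could be fit from paired patch measurements of a reference chart; the Python sketch below is a minimal example under that assumption, not the CPI module's actual algorithm, and all array names are illustrative.

```python
# Minimal sketch: fit a 3x3 linear color-correction matrix (CCM) from paired
# color measurements, e.g. patches of a reference chart captured by the CPI
# camera versus their known ground-truth sRGB values. Array names are
# illustrative; the actual correction used by the CPI module may differ.
import numpy as np

def fit_ccm(measured_rgb: np.ndarray, reference_rgb: np.ndarray) -> np.ndarray:
    """Least-squares fit of a 3x3 matrix M such that measured @ M ~= reference.

    measured_rgb, reference_rgb: (N, 3) arrays of linear RGB patch means.
    """
    M, _, _, _ = np.linalg.lstsq(measured_rgb, reference_rgb, rcond=None)
    return M  # shape (3, 3)

def apply_ccm(image: np.ndarray, M: np.ndarray) -> np.ndarray:
    """Apply the correction to an (H, W, 3) linear RGB image and clip to [0, 1]."""
    corrected = image.reshape(-1, 3) @ M
    return np.clip(corrected, 0.0, 1.0).reshape(image.shape)
```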
We propose a handheld projective imaging device for orthotopic projection of near-infrared fluorescence images onto the target biological tissue at visible wavelengths, without any additional visual aid. The device integrates a laser diode light source module, a camera module, a projector, an ultrasonic distance sensor, a Raspberry Pi single-board computer, and a battery module in a rugged handheld unit. It is calibrated at the detected working distance for seamless coregistration between fluorescence emission and projective imaging at the target tissue site. The proposed device achieves a projection resolution better than 314 μm and a planar projection bias of less than 1 mm over a projection field of view of 58 × 108 mm² at a working distance of 27 cm. Technical feasibility of projective imaging is verified in an ex vivo chicken breast tissue model using indocyanine green as a fluorescence agent. Clinical utility for image-guided surgery is demonstrated in a clinical trial in which sentinel lymph nodes in breast cancer patients are identified and resected under the guidance of projective imaging. Our ex vivo and in vivo experiments suggest the clinical utility of deploying the proposed device for image-guided surgical interventions in resource-limited settings.
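The distance-dependent calibration could, for example, take the form of pre-calibrated camera-to-projector homographies interpolated at the sensed working distance; the sketch below illustrates this idea with placeholder calibration data and an assumed interpolation scheme, and is not the device firmware.

```python
# Illustrative sketch (not the device firmware): pre-calibrated camera-to-projector
# homographies at a few working distances are interpolated to the distance reported
# by the ultrasonic sensor, then used to warp the fluorescence image into projector
# coordinates. Variable names, distances, and the interpolation scheme are assumptions.
import cv2
import numpy as np

# Hypothetical calibration table: working distance (mm) -> 3x3 homography.
CALIB = {250: np.eye(3), 270: np.eye(3), 300: np.eye(3)}  # placeholder matrices

def homography_at(distance_mm: float) -> np.ndarray:
    """Linearly interpolate the homography between the two nearest calibrated distances."""
    ds = sorted(CALIB)
    d = float(np.clip(distance_mm, ds[0], ds[-1]))
    lo = max(x for x in ds if x <= d)
    hi = min(x for x in ds if x >= d)
    if lo == hi:
        return CALIB[lo]
    t = (d - lo) / (hi - lo)
    return (1 - t) * CALIB[lo] + t * CALIB[hi]

def project_fluorescence(fluo_img: np.ndarray, distance_mm: float,
                         proj_size=(1280, 720)) -> np.ndarray:
    """Warp the camera-frame fluorescence image into the projector frame."""
    H = homography_at(distance_mm)
    return cv2.warpPerspective(fluo_img, H, proj_size)
```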
Sentinel lymph node biopsy is an important step in early-stage breast tumor resection surgery; its result determines whether axillary lymph node dissection (ALND) will be performed afterwards. For locating sentinel lymph nodes, indocyanine green (ICG) has been widely used with a near-infrared (NIR) camera to image its fluorescence. However, surgeons must watch a screen beside the operating table to see the fluorescence while their hands operate on the surgical site. We developed a navigation system that projects the invisible fluorescence back onto the surgical site as visible light in real time. The system introduces a coaxial optical design to guarantee projection accuracy. Phantom experiments are conducted to assess the projection resolution and accuracy of the system. Animal experiments with three mice show good system performance and preclinical feasibility. Furthermore, the system is tested in a clinical trial of ninety breast cancer patients in three hospitals in China. ICG and methylene blue (MB) are injected subcutaneously and separately into the areola at 3-4 points to provide both fluorescent and visible contrast for comparison. The navigation process is compared with a commercial NIR imaging system. The results show a 100% detection rate of sentinel lymph nodes and good consistency with both methylene blue and the commercial imaging device. The experiments demonstrate good clinical feasibility of the coaxial projection system.
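One way the projection accuracy assessed in the phantom experiments could be quantified is to compare centroids of projected fiducials with those of printed ground-truth fiducials in the same photograph; the following sketch assumes bright, well-separated markers and an illustrative millimetre-per-pixel factor.

```python
# A rough sketch of how planar projection bias on a phantom could be quantified:
# compare centroids of projected fiducials with centroids of printed ground-truth
# fiducials and convert the offsets to millimetres. Thresholds and the
# mm-per-pixel factor are illustrative values, not the study's actual protocol.
import cv2
import numpy as np

def centroids(gray: np.ndarray, thresh: int = 200) -> np.ndarray:
    """Return (N, 2) centroids of bright blobs in a grayscale image."""
    _, binary = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    n, _, _, cents = cv2.connectedComponentsWithStats(binary.astype(np.uint8))
    return cents[1:]  # drop the background component

def projection_bias_mm(projected_gray, reference_gray, mm_per_px=0.1):
    """Mean distance between projected and nearest reference fiducial centroids."""
    p, r = centroids(projected_gray), centroids(reference_gray)
    # Nearest-neighbour matching; adequate for sparse, well-separated markers.
    errors = [np.min(np.linalg.norm(r - pt, axis=1)) for pt in p]
    return float(np.mean(errors)) * mm_per_px
```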
We propose a portable fluorescence microscopic imaging system (PFMS) for intraoperative display of biliary structures and prevention of iatrogenic injuries during cholecystectomy. The system consists of a light source module, a camera module, and a Raspberry Pi computer with an LCD. Indocyanine green (ICG) is used as a fluorescent contrast agent for experimental validation of the system. Fluorescence intensities of ICG aqueous solutions at different concentration levels are acquired by our PFMS and compared with those of a commercial Xenogen IVIS system. We study the fluorescence detection depth by superposing different thicknesses of chicken breast on an ICG-loaded agar phantom. We verify the technical feasibility of identifying potential iatrogenic injury in cholecystectomy using a rat model in vivo. The proposed PFMS is portable, inexpensive, and suitable for deployment in resource-limited settings.
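The concentration-dependent fluorescence comparison could be scripted, for instance, by extracting mean intensities from a fixed region of interest across an assumed ICG dilution series, as in the sketch below; the file names, ROI, and concentration values are hypothetical.

```python
# Sketch (hypothetical file names, ROI, and dilution series): extract mean
# fluorescence intensity from a fixed region of interest for each ICG
# concentration, as one might do when comparing the PFMS against an IVIS readout.
import numpy as np
import cv2

def roi_mean(path: str, roi=(100, 100, 200, 200)) -> float:
    """Mean pixel value inside roi = (x, y, w, h) of a grayscale fluorescence frame."""
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    x, y, w, h = roi
    return float(img[y:y + h, x:x + w].mean())

concentrations_uM = np.array([0.5, 1, 2, 5, 10])                   # assumed series
intensities = np.array([roi_mean(f"icg_{c}uM.png") for c in concentrations_uM])
slope, intercept = np.polyfit(concentrations_uM, intensities, 1)    # linearity check
```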
Current flow-based blood counting devices require expensive and centralized medical infrastructure and are not appropriate for field use. In this paper we report a method to count red blood cells, white blood cells, and platelets with a low-cost, fully automated blood counting system. The approach uses a compact, custom-built microscope with a large field of view to record bright-field and fluorescence images of samples that are diluted with a single, stable reagent mixture and counted using automatic algorithms. Sample collection is performed manually using a spring-loaded lancet and volume-metering capillary tubes. The capillaries are then dropped into a tube of pre-measured reagents and gently shaken for 10-30 seconds. The sample is loaded into a measurement chamber and placed on a custom 3D-printed platform. Sample translation and focusing are fully automated, and the user has only to press a button for the measurement and analysis to commence. The cost of the system is minimized through the use of custom-designed motorized components. We performed a series of comparative experiments by trained and untrained users on blood from adults and children. We compare the performance of our system, as operated by trained and untrained users, to the clinical gold standard using Bland-Altman analysis, demonstrating good agreement of our system with the clinical standard. The system’s low cost, complete automation, and good field performance indicate that it can be successfully translated for use in low-resource settings where central hematology laboratories are not accessible.
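A minimal Bland-Altman sketch of the kind used for such a method comparison is shown below; the paired count arrays are assumed inputs, and the plotting details are arbitrary choices rather than the study's exact analysis script.

```python
# Minimal Bland-Altman sketch for comparing device counts with the clinical
# gold standard. device / reference are assumed (N,) arrays of paired counts
# for the same samples (e.g. WBC in 10^3 cells/uL).
import numpy as np
import matplotlib.pyplot as plt

def bland_altman(device: np.ndarray, reference: np.ndarray):
    mean = (device + reference) / 2.0
    diff = device - reference
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)            # 95% limits of agreement
    plt.scatter(mean, diff, s=12)
    for y in (bias, bias - loa, bias + loa):
        plt.axhline(y, linestyle="--")
    plt.xlabel("Mean of methods")
    plt.ylabel("Difference (device - reference)")
    plt.show()
    return bias, (bias - loa, bias + loa)
```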
Surgical resection remains the primary curative intervention for cancer treatment. However, residual tumor after resection is very common, leading to disease recurrence and the need for re-resection. We develop a surgical Google Glass navigation system that combines near-infrared fluorescence imaging and ultrasonography for intraoperative detection of tumor sites and assessment of surgical resection boundaries, as well as for guiding sentinel lymph node (SLN) mapping and biopsy. The system consists of a monochrome CCD camera, a computer, a Google Glass wearable headset, an ultrasound machine, and an array of LED light sources. All of the above components, except the Google Glass, are connected to the host computer by a USB or HDMI port. A wireless connection is established between the Glass and the host computer for image acquisition and data transport. A control program written in C++ calls OpenCV functions for image calibration, processing, and display. The technical feasibility of the system is tested both in tumor-simulating phantoms and in a human subject. When the system is used for simulated phantom resection tasks, tumor boundaries invisible to the naked eye can be clearly visualized with the surgical Google Glass navigation system. The system has also been used under an IRB-approved protocol in a single patient during SLN mapping and biopsy at the First Affiliated Hospital of Anhui Medical University, demonstrating the ability to successfully localize and resect all apparent SLNs. In summary, our tumor-simulating phantom and human subject studies have demonstrated the technical feasibility of using the proposed goggle navigation system during cancer surgery.
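For illustration only (the authors' control program is written in C++), one step of such a pipeline, pseudo-coloring the NIR fluorescence frame and blending it onto the color scene image before display, might look like the following Python/OpenCV sketch; the blending weight, colormap, and noise threshold are arbitrary choices.

```python
# Illustrative overlay step only (not the authors' C++ program): pseudo-color the
# NIR fluorescence frame and alpha-blend it onto the color scene image before
# streaming to the headset. Inputs are assumed to be uint8 images.
import cv2
import numpy as np

def overlay_fluorescence(color_bgr: np.ndarray, nir_gray: np.ndarray,
                         alpha: float = 0.4) -> np.ndarray:
    nir_resized = cv2.resize(nir_gray, (color_bgr.shape[1], color_bgr.shape[0]))
    pseudo = cv2.applyColorMap(nir_resized, cv2.COLORMAP_JET)
    mask = (nir_resized > 30)[..., None]          # suppress background noise
    blended = cv2.addWeighted(color_bgr, 1 - alpha, pseudo, alpha, 0)
    return np.where(mask, blended, color_bgr)     # overlay only where signal exists
```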
Vulvar lichen sclerosus (VLS) is a chronic inflammatory mucocutaneous disease of the vulvar skin that often goes undetected for years. The underlying causes are associated with a decrease in VEGF, which reduces blood oxygenation of the vulva, and with structural changes in the collagen fibrils, which can lead to scarring of the affected area. However, few methods are available for quantitative detection of VLS; clinicians' examinations are subjective and may lead to misdiagnosis. Spectroscopy is a potentially effective method for noninvasive detection of VLS. In this paper, we developed a polarized hyperspectral imaging system for quantitative assessment of VLS. The system uses a hyperspectral camera to collect reflectance images of the entire vulva under Xenon lamp illumination, with and without a polarizer in front of the fiber. One image (Ipar) is acquired with the AOTF oriented parallel to the polarization of the illumination, and the other image (Iper) is acquired with the AOTF oriented perpendicular to it. This paper compares polarized images of VLS in a pilot clinical study. The reflectance data collected under Xenon lamp illumination without a polarizer are calibrated and the hyperspectral signals are extracted. An IRB-approved clinical trial was carried out to evaluate the clinical utility for VLS detection. Our pilot study has demonstrated the technical potential of using this polarized hyperspectral imaging system for in vivo detection of vulvar lichen sclerosus.
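From the two polarized acquisitions, a contrast such as the degree of linear polarization could be computed after white-reference calibration; the sketch below shows one plausible formulation and is not necessarily the metric used in this study.

```python
# Sketch of one polarization contrast computable from the two acquisitions:
# degree of linear polarization (DoLP) from the parallel (Ipar) and perpendicular
# (Iper) images after flat-field calibration. White-reference arrays are assumed.
import numpy as np

def degree_of_linear_polarization(i_par: np.ndarray, i_per: np.ndarray,
                                  white_par: np.ndarray, white_per: np.ndarray,
                                  eps: float = 1e-6) -> np.ndarray:
    """DoLP = (Ipar - Iper) / (Ipar + Iper) after normalising by a white reference."""
    r_par = i_par / (white_par + eps)
    r_per = i_per / (white_per + eps)
    return (r_par - r_per) / (r_par + r_per + eps)
```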
KEYWORDS: Cameras, 3D image processing, Hyperspectral imaging, Tissues, 3D image reconstruction, Imaging systems, 3D modeling, 3D acquisition, Calibration, Blood
Accurate and in vivo characterization of the structural, functional, and molecular characteristics of biological tissue will facilitate quantitative diagnosis, therapeutic guidance, and outcome assessment in many clinical applications, such as wound healing, cancer surgery, and organ transplantation. We introduced and tested a multiview hyperspectral imaging technique for noninvasive topographic imaging of cutaneous wound oxygenation. The technique integrated a multiview module and a hyperspectral module in a single portable unit. Four plane mirrors were bonded together to form a multiview reflective mirror set with a rectangular cross section. The mirror set was placed between a hyperspectral camera and the target biological tissue. For a single image acquisition task, a hyperspectral data cube with five views was obtained. The five-view hyperspectral image consisted of a main objective image and four reflective images. Three-dimensional (3-D) topography of the scene was achieved by correlating the matching pixels between the objective image and the reflective images. 3-D mapping of tissue oxygenation was achieved using a hyperspectral oxygenation algorithm. The multiview hyperspectral imaging technique was validated in a wound model, a tissue-simulating blood phantom, and in vivo biological tissue. The experimental results demonstrated the technical feasibility of using multiview hyperspectral imaging for 3-D topography of tissue functional properties.
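One common form a hyperspectral oxygenation algorithm can take is modified Beer-Lambert unmixing of oxy- and deoxyhemoglobin; the sketch below illustrates that generic approach with placeholder wavelengths and extinction coefficients, not the authors' exact algorithm.

```python
# Minimal sketch of a generic hyperspectral oxygenation approach (modified
# Beer-Lambert unmixing). The wavelength set and extinction-coefficient table
# are placeholders for illustration only.
import numpy as np

WAVELENGTHS = np.array([530, 545, 560, 575, 590])       # assumed bands (nm)
EPSILON = np.array([[39036.0, 39036.0],                 # [HbO2, Hb] per band,
                    [52276.0, 50104.0],                 # illustrative values only
                    [32613.0, 53236.0],
                    [57562.0, 55540.0],
                    [14550.0, 26630.0]])

def oxygen_saturation(reflectance: np.ndarray) -> np.ndarray:
    """reflectance: (H, W, len(WAVELENGTHS)) calibrated reflectance cube -> StO2 map."""
    absorbance = -np.log(np.clip(reflectance, 1e-6, None))      # (H, W, L)
    A = absorbance.reshape(-1, len(WAVELENGTHS)).T               # (L, N)
    conc, _, _, _ = np.linalg.lstsq(EPSILON, A, rcond=None)      # (2, N)
    hbo2, hb = np.clip(conc, 0, None)
    sto2 = hbo2 / (hbo2 + hb + 1e-9)
    return sto2.reshape(reflectance.shape[:2])
```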
Obtaining three-dimensional (3D) information of biological tissue is important in many medical applications. This paper presents two methods for reconstructing the 3D topography of biological tissue: multiview imaging and structured light illumination. For each method, the working principle is introduced, followed by experimental validation on a diabetic foot model. To compare the performance characteristics of these two imaging methods, a coordinate measuring machine (CMM) is used as a standard control. The wound surface topography of the diabetic foot model is measured by the multiview imaging and structured light illumination methods, respectively, and compared with the CMM measurements. The comparison results show that the structured light illumination method is a promising technique for 3D topographic imaging of biological tissue.
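The comparison against the CMM reference could be quantified, for example, as the nearest-neighbour RMS deviation between a reconstructed point cloud and the CMM point set, assuming the two are already registered in a common frame; a sketch under that assumption follows.

```python
# Rough sketch of a surface comparison against the CMM reference: nearest-neighbour
# RMS deviation between a reconstructed point cloud and the CMM point set.
# Assumes both point arrays are already in the same coordinate frame.
import numpy as np
from scipy.spatial import cKDTree

def rms_surface_error(reconstructed_pts: np.ndarray, cmm_pts: np.ndarray) -> float:
    """reconstructed_pts: (N, 3), cmm_pts: (M, 3), both in millimetres."""
    tree = cKDTree(cmm_pts)
    dists, _ = tree.query(reconstructed_pts)      # distance to nearest CMM point
    return float(np.sqrt(np.mean(dists ** 2)))
```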
Accurate and in vivo characterization of the structural, functional, and molecular characteristics of biological tissue will facilitate quantitative diagnosis, therapeutic guidance, and outcome assessment in many clinical applications, such as wound healing, cancer surgery, and organ transplantation. However, many clinical imaging systems have limitations and fail to provide noninvasive, real-time, and quantitative assessment of biological tissue in an operating room. To overcome these limitations, we developed and tested a multiview hyperspectral imaging system that integrates the multiview and hyperspectral imaging techniques in a single portable unit. Four plane mirrors were bonded together to form a multiview reflective mirror set with a rectangular cross section. The multiview reflective mirror set was placed between a hyperspectral camera and the measured biological tissue. For a single image acquisition task, a hyperspectral data cube with five views was obtained. The five-view hyperspectral image consisted of a main objective image and four reflective images. Three-dimensional topography of the scene was achieved by correlating the matching pixels between the objective image and the reflective images. Three-dimensional mapping of tissue oxygenation was achieved using a hyperspectral oxygenation algorithm. The multiview hyperspectral imaging technique is currently under quantitative validation in a wound model, a tissue-simulating blood phantom, and an in vivo biological tissue model. The preliminary results have demonstrated the technical feasibility of using multiview hyperspectral imaging for three-dimensional topography of tissue functional properties.
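The pixel-correspondence step between the main view and a mirror view could, for instance, use normalised cross-correlation template matching, as sketched below; the patch size and search strategy are assumptions, and the disparity-to-height conversion (which depends on the mirror geometry) is omitted.

```python
# Sketch of a possible pixel-correspondence step: a patch around a pixel in the
# main view is located in the corresponding mirror view by normalised
# cross-correlation. Inputs are assumed to be grayscale uint8 images.
import cv2
import numpy as np

def match_patch(main_view: np.ndarray, mirror_view: np.ndarray,
                x: int, y: int, half: int = 7):
    """Return the (x, y) centre of the best-matching main-view patch in the mirror view."""
    patch = main_view[y - half:y + half + 1, x - half:x + half + 1]
    score = cv2.matchTemplate(mirror_view, patch, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(score)
    return max_loc[0] + half, max_loc[1] + half
```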
The wound healing process involves the reparative phases of inflammation, proliferation, and remodeling. Interrupting any of these phases may result in chronically unhealed wounds, amputation, or even patient death. Quantitative assessment of wound tissue ischemia, perfusion, and inflammation provides critical information for appropriate detection, staging, and treatment of chronic wounds. However, no method is available for noninvasive, simultaneous, and quantitative imaging of these tissue parameters. We integrated hyperspectral, laser speckle, and thermographic imaging modalities into a single setup for multimodal assessment of tissue oxygenation, perfusion, and inflammation characteristics. Advanced algorithms were developed for accurate reconstruction of wound oxygenation and appropriate co-registration between the different imaging modalities. The multimodal wound imaging system is being validated in an ongoing clinical trial approved by the OSU IRB, in which a 3 mm diameter wound was introduced on a healthy subject’s lower extremity and the healing process was serially monitored by the multimodal imaging setup. Our experiments demonstrated the clinical usability of multimodal wound imaging.
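The co-registration between modalities could, for example, be approximated by an affine alignment such as OpenCV's ECC maximisation; the sketch below is a stand-in for the authors' registration algorithm and assumes roughly overlapping grayscale frames from the two modalities.

```python
# Illustrative co-registration step, assuming an affine model is adequate between
# modalities (e.g. a thermographic frame aligned onto a hyperspectral band image).
# ECC maximisation is used here as a stand-in for the authors' algorithm.
import cv2
import numpy as np

def coregister(fixed_gray: np.ndarray, moving_gray: np.ndarray) -> np.ndarray:
    warp = np.eye(2, 3, dtype=np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 200, 1e-6)
    _, warp = cv2.findTransformECC(fixed_gray.astype(np.float32),
                                   moving_gray.astype(np.float32),
                                   warp, cv2.MOTION_AFFINE, criteria)
    return cv2.warpAffine(moving_gray, warp,
                          (fixed_gray.shape[1], fixed_gray.shape[0]))
```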