KEYWORDS: Distortion, Video, Video acceleration, Endoscopy, Visualization, Error analysis, 3D video streaming, Volume rendering, 3D image processing, Image processing
Optical lens systems suffer from nonlinear geometrical distortion. Optical imaging applications such as image-enhanced endoscopy and image-based bronchoscope tracking require correction of this distortion for accurate localization, tracking, registration, and measurement of image features. Real-time capability is desirable for interactive systems and live video. The use of a texture-mapping graphics accelerator (standard hardware on current motherboard chipsets and add-in video graphics cards) to perform distortion correction is proposed. Mesh generation for image tessellation, an error analysis, and performance results are presented. Distortion correction using commodity graphics hardware is shown to be substantially faster than using the main processor and can be performed at video frame rates (faster than 30 frames per second), and the polar-based method of mesh generation proposed here is shown to be more accurate than a conventional grid-based approach. Using graphics hardware to perform distortion correction is not only fast and accurate but also efficient, as it frees the main processor for other tasks, an important consideration in some real-time applications.
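As a minimal sketch of the mesh-generation step described above (assuming a single-coefficient radial model and illustrative node counts, not the calibrated values from the paper), the following Python builds a polar node layout around the distortion centre and pairs each corrected node position with its texture coordinate in the distorted frame. Only these static arrays need to be handed to the graphics accelerator, which then texture-maps every incoming video frame through the fixed mesh.

import numpy as np

# Hypothetical polar mesh for GPU-based distortion correction. The radial
# model r_d = r_u * (1 + k1 * r_u**2) and its coefficient k1 are placeholders.
def polar_mesh_nodes(width, height, cx, cy, n_rings=16, n_spokes=32, k1=-2.0e-7):
    """Return (vertex, texture) node sets: vertex positions in the corrected
    output image and the matching coordinates in the distorted source frame."""
    r_max = np.hypot(max(cx, width - cx), max(cy, height - cy))
    radii = np.linspace(0.0, r_max, n_rings + 1)
    angles = np.linspace(0.0, 2.0 * np.pi, n_spokes, endpoint=False)
    vertices, tex_coords = [], []
    for r_u in radii:
        for a in angles:
            vertices.append((cx + r_u * np.cos(a), cy + r_u * np.sin(a)))
            r_d = r_u * (1.0 + k1 * r_u ** 2)        # forward radial model
            tex_coords.append((cx + r_d * np.cos(a), cy + r_d * np.sin(a)))
    return np.asarray(vertices), np.asarray(tex_coords)

# The node sets are computed once; per frame only the distorted video image
# changes, so the accelerator can render corrected frames at video rates.
vertices, tex_coords = polar_mesh_nodes(720, 480, cx=360.0, cy=240.0)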
Electromagnetic tracking systems (EMTSs) are widely used in clinical applications. Many reports have evaluated their static behavior and examined errors caused by metallic objects. Although some publications address the dynamic behavior of EMTSs, the measurement protocols are either difficult to reproduce with respect to the movement path or require considerable technical effort. Because dynamic behavior is of major interest for clinical applications, we established a simple but effective measurement protocol that is easy to repeat at other laboratories. We built a simple pendulum on which the sensor of our EMTS (Aurora, NDI, CA) could be mounted. The pendulum was mounted on a special bearing to guarantee that the pendulum path is planar; this assumption was tested before starting the measurements. All relevant parameters defining the pendulum motion, such as the rotation center and length, were determined by static measurement with satisfactory accuracy. Position and orientation data were then gathered over a period of 8 seconds and timestamps were recorded. Data analysis provided a positioning error and an overall error combining both position and orientation. All errors were calculated by means of the well-known equations describing pendulum motion. Additionally, latency (the elapsed time from input motion until the immediate consequences of that input are available) was calculated for different velocities using well-known equations for mechanical pendulums. We repeated the measurements with different metal objects (rods made of stainless steel types 303 and 416) placed between the field generator and the pendulum.
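One way such a latency figure can be derived from pendulum mechanics (a sketch under assumed values; the paper's exact equations are not reproduced here) is to take the sensor speed from energy conservation and divide the observed spatial lag behind the ideal trajectory by that speed:

import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def pendulum_speed(length_m, theta0_rad, theta_rad):
    """Tangential sensor speed of an ideal pendulum from energy conservation."""
    return np.sqrt(2.0 * G * length_m * (np.cos(theta_rad) - np.cos(theta0_rad)))

def latency_from_lag(spatial_lag_m, length_m, theta0_rad, theta_rad=0.0):
    """Latency = spatial lag along the path / instantaneous sensor speed."""
    return spatial_lag_m / pendulum_speed(length_m, theta0_rad, theta_rad)

# Illustrative numbers only: 5 mm lag at the vertical position of a 0.5 m
# pendulum released from 30 degrees.
print(latency_from_lag(0.005, 0.5, np.radians(30.0)) * 1e3, "ms")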
We found a root mean square error (eRMS) of 1.02 mm for the distance of the sensor positions from the fitted plane (maximum error emax = 2.31 mm, minimum error emin = -2.36 mm). The eRMS of the positional error amounted to 1.32 mm, while the overall error was 3.24 mm. The latency at a pendulum angle of 0° (vertical) was 7.8 ms.
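A minimal sketch of the planarity analysis behind these numbers, assuming the plane is fitted to the recorded sensor positions by a least-squares (SVD) fit, is:

import numpy as np

def plane_fit_errors(points):
    """points: (N, 3) array of tracked sensor positions in mm; returns the RMS,
    maximum, and minimum signed distances of the points to the fitted plane."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)   # smallest singular vector = plane normal
    signed_dist = (points - centroid) @ vt[-1]    # signed distance to the plane, mm
    return np.sqrt(np.mean(signed_dist ** 2)), signed_dist.max(), signed_dist.min()

# Example with synthetic positions standing in for 8 s of recorded samples:
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(-100, 100, 400),
                       rng.uniform(-100, 100, 400),
                       rng.normal(0.0, 1.0, 400)])  # ~1 mm out-of-plane noise
e_rms, e_max, e_min = plane_fit_errors(pts)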
The major aim of this work was to define a protocol for the evaluation of electromagnetic tracking systems (EMTSs). Using this protocol we compared two commercial EMTSs: the Ascension microBIRD (B) and the NDI Aurora (A). To enable reproducibility and comparability of the assessments, a machined base plate was designed in which a 50 mm grid of holes is precision drilled for position measurements. A circle of 32 equispaced holes in the center enables the assessment of rotation. A small mount that fits into pairs of grid holes on the base plate is used to hold the sensor in a defined and rigid way. Relative positional/orientational errors are found by subtracting the known distances/rotations between the machined locations from the differences of the mean observed positions/rotations. To measure the influence of metallic objects, we inserted rods (made of SST 303, SST 416, aluminum, and bronze) into the sensitive volume between sensor and emitter. Additionally, the dynamic behavior was tested using an optical sensor mounted on a spacer at a distance of 150 mm from the EMTS sensor. We found a relative positional error of 0.96 mm +/- 0.68 mm (range -0.06 mm to 2.23 mm) for system A and 1.14 mm +/- 0.78 mm (range -3.72 mm to 1.57 mm) for system B at a given distance of 50 mm. The positional jitter amounted to 0.14 mm (A) / 0.20 mm (B). The relative rotation error was found to be 1.81 degrees (A) / 0.63 degrees (B). For the dynamic behavior we calculated an error of 1.63 mm (A) / 1.93 mm (B). The most relevant distortion caused by metallic objects results from SST 416; the maximum error, 4.2 mm (A) / 41.9 mm (B), occurs when the rod is close to the sensor (20 mm).
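The grid-based error metrics can be expressed compactly; the sketch below (an assumed formulation, not vendor or author code) computes jitter as the RMS spread of repeated measurements about a hole's mean position, and the relative positional error as the measured hole-to-hole distance minus the machined 50 mm spacing.

import numpy as np

def jitter_rms(samples):
    """samples: (N, 3) repeated measurements at one grid hole, in mm."""
    return np.sqrt(np.mean(np.sum((samples - samples.mean(axis=0)) ** 2, axis=1)))

def relative_position_error(mean_pos_a, mean_pos_b, nominal_mm=50.0):
    """Measured distance between two hole means minus the machined distance."""
    return np.linalg.norm(np.asarray(mean_pos_a) - np.asarray(mean_pos_b)) - nominal_mm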
Optical lens systems suffer from non-linear radial distortion. Applications such as computer vision and medical imaging require distortion compensation for the accurate location, registration, and measurement of image features. While in many applications distortion correction may be applied offline, a real-time capability is desirable for systems that interact with the environment or with a user in real time. Constructing a triangle mesh and applying distortion compensation to the mesh nodes yields a pair of static node coordinate sets, which a texture-mapping graphics accelerator can use along with the dynamic distorted image to render high-quality distortion-corrected images at video frame rates. Mesh generation, an error analysis, and performance results are presented. The polar-based method proposed in this paper is shown to be both more accurate than a conventional grid-based approach and faster than the traditional method of using the CPU to transform each pixel individually.
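For contrast with the mesh-based approach, the traditional CPU baseline transforms every output pixel through the radial model on each frame. A sketch of that baseline follows, using a placeholder single-coefficient model and nearest-neighbour resampling as assumptions.

import numpy as np

def cpu_undistort(frame, cx, cy, k1=-2.0e-7):
    """Per-pixel CPU remapping of a distorted frame (the baseline the
    mesh/texture-mapping approach outperforms)."""
    h, w = frame.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w].astype(np.float64)
    dx, dy = xx - cx, yy - cy
    r_u = np.hypot(dx, dy)
    scale = 1.0 + k1 * r_u ** 2                      # forward radial model, per pixel
    xs = np.clip(np.rint(cx + dx * scale), 0, w - 1).astype(int)
    ys = np.clip(np.rint(cy + dy * scale), 0, h - 1).astype(int)
    return frame[ys, xs]                             # nearest-neighbour resampling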
Combining endoscopic video with overlaid reconstructed 3D images allows surgeons to see past opaque surfaces and provides 2D and 3D planning and navigation information. Ensuring surgical quality necessitates accurate calibration and accuracy testing of such systems, as errors from sources such as uncompensated lens distortion and the tracking system cause image misregistration. A method for the calibration of an image-enhanced endoscopy system is presented. Constant-radius linear compensation is proposed as an alternative to computationally expensive nonlinear radial lens-distortion compensation. The results of a system accuracy test using these methods are discussed and the linear lens-distortion compensation method is shown to be an effective compromise between computational speed and accuracy.
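One plausible reading of constant-radius linear compensation (an assumption for illustration; the paper's exact formulation may differ) is to evaluate the nonlinear radial correction once, at a single representative radius, and apply that constant scale factor to all overlay points rather than recomputing the polynomial per point:

import numpy as np

def nonlinear_scale(r, k1=-2.0e-7):
    """Full radial polynomial, evaluated per point (the expensive path)."""
    return 1.0 + k1 * r ** 2

def constant_radius_linear(points, cx, cy, r0, k1=-2.0e-7):
    """Apply the scale factor belonging to the fixed radius r0 to every point."""
    s = nonlinear_scale(r0, k1)
    pts = np.asarray(points, dtype=float)
    return (pts - (cx, cy)) * s + (cx, cy)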
KEYWORDS: Ultrasonography, Calibration, 3D acquisition, 3D image processing, Surgery, Visualization, 3D displays, Liver, Data acquisition, Optical tracking
The goal of the Image Guidance Laboratories (IGL) is to provide the highest quality, real-time visualization of 3D ultrasound images in a manner that is most acceptable and suited for adoption by surgeons. To this end, IGL has developed an optically tracked, freehand, 3D volume-rendering ultrasound system for image-guided surgery. Other systems temporally separate frame acquisition from volume construction and display; the data must be stored before being loaded into the volume construction engine for visualization. By incorporating novel methods to reduce the computational expense associated with frame insertion, volume maintenance, and 2D texture-based rendering, the IGL system is able to simultaneously acquire and display 3D ultrasound data. The work presented here focuses on methods unique to achieving near real-time 3D visualization using 2D ultrasound images and discusses the potential of this system to address clinical situations such as liver resection, tumor ablation, and breast biopsy.
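A hedged sketch of the core frame-insertion step in such a freehand system (not the IGL implementation; the pose transform and voxel spacing are assumed inputs) transforms each pixel of a tracked 2D B-scan into volume coordinates and writes it to the nearest voxel:

import numpy as np

def insert_frame(volume, frame, pixel_to_volume, voxel_size_mm):
    """volume: (Z, Y, X) uint8 array; frame: (rows, cols) uint8 B-scan;
    pixel_to_volume: 4x4 homogeneous transform (tracker pose combined with the
    probe calibration) mapping pixel (col, row) to millimetres in volume space."""
    rows, cols = frame.shape
    cc, rr = np.meshgrid(np.arange(cols), np.arange(rows))
    pix = np.stack([cc.ravel(), rr.ravel(),
                    np.zeros(cc.size), np.ones(cc.size)])      # homogeneous pixel coords
    xyz_mm = (pixel_to_volume @ pix)[:3]                       # positions in mm
    idx = np.rint(xyz_mm / voxel_size_mm).astype(int)          # nearest voxel (x, y, z)
    inside = np.all((idx >= 0) & (idx < np.array(volume.shape)[::-1, None]), axis=0)
    x, y, z = idx[:, inside]
    volume[z, y, x] = frame.ravel()[inside]                    # write pixel intensities
    return volume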