Foveated endoscopic lens
12 March 2012
Abstract
We present a foveated miniature endoscopic lens implemented by amplifying the optical distortion of the lens. The resulting system provides a high-resolution region in the central field of view and low resolution in the outer fields, such that a standard imaging fiber bundle can provide both the high resolution needed to determine tissue health and the wide field of view needed to determine the location within the inspected organ. Our proof-of-concept device achieves 7 to 8 μm resolution in the fovea and an overall field of view of 4.6 mm. Example images and videos show the foveated lens' capabilities.

1. Introduction

The human eye’s separation of the field of view into two regions—one of high and one of low resolution, the fovea centralis and the periphery—has inspired a number of attempts to replicate the effect in digital imaging systems.1–6 The advantage is the ability to see over a wide field of view (FOV) while maintaining high-resolution capability in the central region. A foveated system thus has the potential to solve the main difficulty that has prevented the translation of endomicroscopy to direct clinical use: current endomicroscopes lack the field of view needed to give clinicians the proper context for the high-resolution measurements needed to detect lesions. Pathologists, for example, routinely scan entire tissue sections at low resolution and then focus down on areas of interest with high magnification and resolution. The high-resolution images allow one to determine the health and morphology of individual cells, but this information is only useful when one knows the cells’ location within the overall tissue.

The first demonstrated foveated imaging system of which we are aware is that of Spiegel et al.,7 in which the authors describe the design of an “artificial retina”—a custom-designed sensor array with spatially variant response. Building on this work, a number of other research teams attempted alternative approaches for customized on-chip foveated sensor arrays.8–10 McCarley et al.4 take a different direction and develop an on-chip binning method that allows one to transmit regions of the image at the full sampling rate of the detector array while binning pixels outside those regions into larger “super-pixels” for lower resolution.

The first attempt to perform foveation with optical components appears to be that of Suematsu et al.,11 in which the authors use a door peephole lens together with distortion-correction software to obtain foveated images. The authors later designed a higher-performance lens to achieve similar distortion characteristics over a larger image.12 This elegant approach to foveation was taken further by Wakamiya et al.,13 in which the authors design a wide-angle camera lens adapted to a standard CCD array format.

An alternative approach to foveation is to use a spatially tunable phase element to correct lens aberrations in a localized region of the field of view. Martinez et al.,1 for example, use a tunable transmissive liquid crystal spatial light modulator (SLM) to correct for aberrations across a ±45 deg FOV. The same basic method is also possible using a reflective SLM.14 A pragmatic approach is taken by Ude et al.,15 who use a pair of cameras viewing the same scene—one high-resolution narrow-FOV camera and one low-resolution wide-FOV camera—to obtain foveation. Since this can produce parallax problems when viewing nearby objects, Hua and Liu5 adapt the method to use a beamsplitter to separate the objective lens optical path into two paths of different magnification, each of which is sampled by its own detector array.

For endoscopic imaging, many of these prior implementations are problematic. The SLM approach requires that the modulator be an integral part of the objective lens, so that an SLM’s large size becomes an obstacle. The on-chip method of McCarley et al. provides no benefits to endoscopic imaging, where bandwidth constraints are generally minimal due to the low pixel counts typically transmitted by imaging fiber bundles. The dual detector method of Hua and Liu, while compatible with endoscopic imaging if implemented with a pair of imaging fiber bundles, does not substantially ease the optical design problem of achieving simultaneously a high-resolution central field and a wide total field of view. Our own approach follows that of Suematsu12 and Wakamiya13 by using optical distortion within a customized objective lens, but designed for endoscopy rather than camera use. The resulting foveated endoscopic objective lens attempts to capture all of the benefits of foveation (high-resolution central field, wide total field, and easier optical design) at the cost of requiring highly aspheric optical elements.

Foveation also has the potential to overcome the sampling limitation of current imaging fiber bundles. If high resolution is required everywhere within the field of view, the limited number of fibers (bundles with up to 100,000 fibers are available) means that the bundle rather than the lens restricts the overall FOV. With foveated imaging we can dedicate a fraction of the fibers to the fovea and use the remaining fibers for wide-field imaging.

2. Lens Design

The goal of the foveated endoscopic lens design is to achieve 5 μm resolution in the central field of view (<0.5 mm), smoothly transitioning to 50 μm resolution at the edge of the field. In order to allow the lens to access remote areas of the body, the design keeps the optical elements under 3 mm in diameter. The design starts with a three-element system in which the front and rear elements are diamond-turned plastic aspheres while the middle element is a commercial off-the-shelf spherical glass lens, loosely following a previous design.16 The layout of this objective and its optical performance are shown in Fig. 1, while the optical prescription is given in Table 1. In the final design, the lens maps a ±2.7 mm field of view onto a ±0.7 mm image. The central field has a magnification M of 0.5, falling to M = 0.04 at the edge of the field, and the design is diffraction limited throughout the entire FOV. The design is telecentric in image space in order to provide high coupling efficiency into an imaging fiber bundle.

Fig. 1

(a) The optical layout of the foveated lens, (b) spotsize diagrams, and (c) MTF curves. The spotsize diagrams include a circle showing the 2.6 μm radius Airy disk for the on-axis NA. The MTF plot uses a maximum spatial frequency corresponding to the optical cut-off frequency of the on-axis NA.


Table 1

The optical prescription for the prototype foveated lens, designed for a wavelength of 633 nm. Surfaces 3 and 4 comprise a commercial glass lens (Qioptiq 311330000). The surface sag z is represented by the equation z = (1/R)r² + α₄r⁴ + α₆r⁶, where r is the transverse coordinate, R the radius of curvature (the second column below), and α₄ and α₆ are aspheric coefficients. All dimensioned units are in millimeters.

Surf  Radius   Thickness  Glass   Diam    α4       α6
0     ∞        2.16               5.400
1     5.118    1.08       PMMA    2.033   0.0730   0.0492
2     ∞        5.40               1.951
3     20.880   2.00       N-BK7   1.445
4     3.550    3.60               2.022
5     6.977    2.00       PMMA    2.668   0.0249   0.0075
6     4.716    3.60               3.080   0.0366   0.0015
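As a quick consistency check, the sag equation from Table 1 can be evaluated directly. Below is a minimal sketch in Python that uses the sag convention exactly as printed in the table caption; the function name and the choice of evaluating surface 1 at its semi-diameter are ours:

```python
def sag(r, R, a4=0.0, a6=0.0):
    """Surface sag z(r) = (1/R) r^2 + a4 r^4 + a6 r^6 (Table 1 convention).

    r is the transverse coordinate and R the radius of curvature,
    both in millimeters; a4 and a6 are the aspheric coefficients.
    """
    return (1.0 / R) * r**2 + a4 * r**4 + a6 * r**6

# Surface 1 of the prototype (R = 5.118, alpha4 = 0.0730, alpha6 = 0.0492),
# evaluated at the semi-diameter r = 2.033/2 mm:
z = sag(2.033 / 2, R=5.118, a4=0.0730, a6=0.0492)
```

The aspheric terms contribute roughly as much sag as the base curvature at the edge of the clear aperture, which is consistent with the strongly aspheric surfaces the design calls for.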

The numerical aperture of the lens varies with field angle as indicated in Table 2, determined by numerical ray tracing at each field position (interested readers can consult the Zemax macro NA_calcs.zpl at the authors’ website17). Because the lens operates at large field angles (up to ±41 deg at the edge of the field), the standard paraxial formula for object-space NA is split in two in order to account for the oblique angle: for a system with low distortion, the tangential and sagittal NA values at a given field point are related to one another by approximately a factor of cos²(θ), where θ is the chief ray angle in object space.18,19 For the semimajor r_tan and semiminor r_sag axes of the Airy ellipse, one still uses the familiar equation r = 0.61λ/NA, substituting NA_tan or NA_sag for the numerical aperture. The depth of field (DOF) is calculated from the ray tracing model and is defined as the distance between the positive and negative longitudinal offsets at which the rms defocus spotsize exceeds the diffraction-limited spotsize. The outer fields, due to their small NA, have an extremely long depth of focus. Due to the telecentric geometry, on the other hand, the image-side NA is approximately constant across the field, at NA = 0.145. Note that the reduction in NA with field position does not cause a large falloff of light level at the edges of the raw (distorted) image because of the corresponding decrease in magnification there.
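Substituting NA_tan or NA_sag into r = 0.61λ/NA gives the corresponding Airy-ellipse semiaxes. A minimal sketch of the arithmetic (function and variable names are ours; the 633 nm design wavelength is from Table 1):

```python
WAVELENGTH_UM = 0.633  # design wavelength (633 nm)

def airy_radius_um(na):
    """First-zero Airy radius r = 0.61 * lambda / NA, in micrometers."""
    return 0.61 * WAVELENGTH_UM / na

# Image side: the telecentric design holds NA ~ 0.145 across the field,
# giving an Airy radius close to the ~2.6 um circle shown in Fig. 1.
r_image = airy_radius_um(0.145)

# Object side: substitute the tangential and sagittal NA values at a
# field point (here the 2.7 mm edge field from Table 2) to obtain the
# two semiaxes of the Airy ellipse.
r_from_na_tan = airy_radius_um(0.040)
r_from_na_sag = airy_radius_um(0.006)
```

Because the sagittal NA collapses toward the field edge, the Airy ellipse there is strongly elongated, which is why the outer fields trade resolution for the long depth of focus noted above.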

Table 2

The design parameters for the prototype foveated lens. In a given column below, a pair of numbers separated by a slash indicates the value for tangential/sagittal fields. For the on-axis field, the two values are identical. DOF is the depth of field and M the magnification.

Field (mm)  NA           Airy radius (μm)  DOF (mm)  M
0           0.075        2.6               0.32      0.51
1           0.067/0.055  2.8/3.5           0.52      0.38
2           0.051/0.020  3.8/9.7           3.46      0.12
2.7         0.040/0.006  4.8/31.1          11.00     0.04

The first and last surfaces in the system both create large negative distortion. Since the front element has negative power and the rear elements both have positive power, the design is roughly anti-symmetric about the stop, so that the odd aberrations (i.e., distortion and coma) sum together. The first surface is concave toward the object—a common configuration in microscope objectives that allows for the reduction of spherical aberration and field curvature.20–22 The lens was designed for monochromatic 633 nm light but remains diffraction-limited over a bandwidth of about ±50 nm around the design wavelength. That is, at ±50 nm, the rms polychromatic geometric spotsize is approximately the same as the Airy disk radius.

While we have indicated that the lens design is diffraction-limited across the entire FOV [as indicated by Fig. 1(b)], the model MTF curves [Fig. 1(c)] show substantial departure from the behavior of a conventional diffraction-limited lens. The off-axis curves at low and medium spatial frequencies fall well below the MTF curves for the on-axis field, but this is due almost entirely to the reduction in NA with increasing field position rather than to lens aberrations. In addition, the large off-axis angles in image space cause the pupil to become elliptical, so that the tangential and sagittal MTF curves show different behavior.

Figures 2(a) and 2(b) show the field curvature and distortion of the design, indicating that the distortion reaches 50% at the maximum field point. Figures 2(c) and 2(d) show the effect of the distortion on a simulated object. The resulting image is sampled densely in the central region and sparsely at the edges.

Fig. 2

(a) Field curvature and (b) distortion in the design. The two diagrams in the bottom row show a ray tracing simulation of an image acquired with the foveated objective: (c) the object in view, (d) its image. (Note that the image has been inverted for easy comparison with the object.)


3. Lens Tolerancing, Manufacture, and Testing

Tolerance analysis of the design requires a small modification to standard procedures because rays near the edge of the field can experience extreme deviations from their design locations. These extreme rays completely dominate the resulting merit function despite the poor system resolution there, and so need to be treated separately from rays at the other field points. System tolerancing was therefore performed using the inner set of fields alone (i.e., field heights of 0, 1, and 2 mm). From the analysis, one finds that lens decenter is the dominant source of error in the manufactured system, requiring that element and surface decenters be held to less than 40 μm.

The lens design consists of two plastic elements and one commercial glass lens (Qioptiq 311330000). The two plastic lenses [the first and third elements shown in Fig. 1(a)] are cut from 38 mm diameter disks of PMMA (acrylic) on a 4-axis Nanotech 250UPL precision lathe using single-point diamond turning. Only the central 3 mm of each plastic disk contains the optical surface; the remainder of the disks was painted blue to prevent stray light.

In order to mount and align the assembled system, we designed three mounts and an aperture stop using SolidWorks CAD software and manufactured these with a ProJet SD3000 3D printer. The resulting system is shown in Fig. 3.

Fig. 3

The experimental setup for testing the prototype foveated lens. At left is an Olympus UMPlanFl 5× microscope objective used to relay and magnify the foveated lens’ image onto a detector array. At right are the (white) lens disc mounts, with the optical surfaces comprising only the central 3 mm of each disc; the remainder of each disc has been painted blue to prevent stray light.


Because the back working distance of the foveated lens is only 3.6 mm, one cannot place a commonly available detector array directly at the image plane to detect the signal. Instead, for testing the system, we constructed an optical relay using an Olympus UMPlanFl 5× microscope objective together with a 100 mm tube lens (Thorlabs LA1509-A) to re-image the signal onto a detector array (QImaging Retiga 2000R). The input NA and field of view of the relay (0.15 and 3 × 4 mm, respectively) closely match the image-space NA and image size of the foveated lens. All images were taken using a 660 nm LED source (Thorlabs M660L2) for illumination.

4. Calibration and Experimental Results

For practical use, the displayed images from the system must be unwarped in order to remove the distortion. As a first step toward correcting the distortion present in the images, we created a set of calibration targets comprising a field of dots laid out in a Cartesian grid pattern, printed onto transparency film using a standard desktop laser printer.* Figure 4 shows an ideal target, the corresponding image through the prototype lens, and the distortion-corrected image after unwarping. Note that the square-shaped appearance of the circular dots is a result of resolution degradation by the printer.

Fig. 4

(a) The ideal calibration target, (b) its image through the prototype lens, and (c) the distortion-corrected image. The target consists of a Cartesian grid of dots spaced 0.628 mm apart, designed to overfill the field of view. Note that the corrected image (c) contains about 2.4 times as many pixels (1181×1166) as the measured image (752×752).


Using the measured images, we can locate the dot coordinates on the detector array and compare them to the known dot coordinates at the object plane. One can then use least squares analysis to estimate the mapping parameters for performing image correction.23–28 Since the mapping of object to image is in principle a purely radial function, the mapping parameters are the image coordinates of the optical axis (x_c, y_c) together with the a_i coefficients used to express the radial form of the mapping from image plane (x, y) to object plane (x′, y′):25

r′ = a₁r + a₂r² + a₃r³ + ⋯,

where

x′ = x_c + r′ cos θ,  y′ = y_c + r′ sin θ,
r = [(x − x_c)² + (y − y_c)²]^(1/2),  θ = arctan[(y − y_c)/(x − x_c)].
However, in our prototype lens this approach encountered difficulties, which may be due to non-radial distortion caused by element misalignment (such as decentering). As a result, we used a piecewise polynomial mapping,29 the results of which are shown in Fig. 4. The spacings between dots in the corrected image follow closely the Cartesian grid of the object. Additionally, the corrected images contain 2.4 times the number of pixels (1181×1166) as the measured images (752×752) as a result of the outer regions of the FOV being stretched to the sampling rate defined by the central field.
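Because r′ is linear in the a_i coefficients, the radial model can be estimated by ordinary linear least squares from matched dot positions. The following sketch illustrates that step on synthetic data (the function names and coefficient values are ours; as noted above, the actual prototype required a piecewise polynomial mapping instead):

```python
import numpy as np

def fit_radial_map(r_img, r_obj, degree=3):
    """Fit r' = a1*r + a2*r^2 + ... + a_n*r^n by linear least squares."""
    # Design matrix with columns r, r^2, ..., r^degree (no constant term).
    A = np.stack([r_img**k for k in range(1, degree + 1)], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, r_obj, rcond=None)
    return coeffs  # [a1, a2, ..., a_n]

def apply_radial_map(x, y, xc, yc, coeffs):
    """Map image coordinates (x, y) through the radial model about (xc, yc)."""
    r = np.hypot(x - xc, y - yc)
    theta = np.arctan2(y - yc, x - xc)
    rp = sum(a * r**(k + 1) for k, a in enumerate(coeffs))
    return xc + rp * np.cos(theta), yc + rp * np.sin(theta)

# Synthetic check: radii distorted by a known cubic should be recovered.
rng = np.random.default_rng(0)
r = rng.uniform(0.05, 1.0, 200)
a_true = [1.0, 0.0, 0.45]  # hypothetical coefficients
r_obj = a_true[0] * r + a_true[1] * r**2 + a_true[2] * r**3
a_fit = fit_radial_map(r, r_obj)
```

The fit is linear even though the mapping is a nonlinear function of r, which is what makes the least-squares estimate straightforward once the dot centroids have been matched.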

Using the prototype lens and distortion correction algorithm, we measured a 1951 USAF resolution target in the lens’ central field of view (see Fig. 5). (With the exception of Fig. 8, all images were taken with the object placed at the 1.9 mm working distance.) Looking at the closeup of the highest-resolution groups, we can see that the bars are resolved down to element 1 or 2 of group 7, corresponding to 128 or 144 line pairs per mm, i.e., a resolution of 7 to 8 μm. This is worse than the 5 μm design goal and appears to result from damage to one of our lenses (surface 6) during system assembly and alignment, as well as from difficulty in reaching the required mechanical tolerances with the SD3000 3D printer used to manufacture our mounts.
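The quoted line-pair frequencies follow from the standard 1951 USAF target formula, with resolution here taken as the full line-pair period. A small sketch of the conversion (function names are ours):

```python
def usaf_freq_lp_per_mm(group, element):
    """Spatial frequency of a 1951 USAF element: 2**(group + (element - 1)/6)."""
    return 2.0 ** (group + (element - 1) / 6.0)

def line_pair_period_um(group, element):
    """Full line-pair period (one bright + one dark bar) in micrometers."""
    return 1000.0 / usaf_freq_lp_per_mm(group, element)

# Group 7, elements 1 and 2 give 128 and ~144 lp/mm,
# i.e. line-pair periods of roughly 7.8 and 7.0 um.
```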

Fig. 5

A closeup of a 1951 USAF resolution target imaged through the prototype lens, showing (a) groups 4 to 7, and (b) groups 6 and 7 on the target (shown by the red box outline in the full image), indicating that element 1 or 2 of group 7 is just resolved (i.e., a resolution of 7 to 8 μm). The fields of view of the two images are 1.19 mm and 238 μm, respectively.


While Fig. 5 shows the resolution in the fovea, Video 1 (Fig. 6) shows the wide field of view achieved by the system. In this example we translate the resolution target across the field of view, showing the raw measured image and the distortion-corrected image side by side as the object moves. While the distortion correction algorithm has difficulty achieving full correction at the edges of the FOV, one can see that the corrected images show little of the hemispherical warping present in the raw measurements.

Fig. 6

An image sequence showing the resolution target as it translates across the field of view, in raw data (left) and after distortion correction (right). (Video 1, 5.4 MB MPG) (URL: http://dx.doi.org/10.1117/1.JBO.17.2.021104.1)


Figure 7 shows a hematoxylin and eosin (H&E) stained slice of mouse esophagus tissue imaged through the prototype lens after distortion correction has been applied. The fovea allows one to see individual nuclei, while the whole system allows mapping of a wide field of view. The overall field of view shown in the images (Figs. 6 and 7) is 4.6 mm in diameter. This is smaller than the designed 5.4 mm due to reflections at the edges of the field. (These are especially apparent in the raw data of Fig. 6.) When cutting small lenses into larger disk substrates, the edge of each lens acquires a fillet due to the finite radius of curvature of the cutting tool. The curvature of the fillet differs from that of the lens itself, producing reflections at the edge of the field. In order to remove these reflections, we truncated the field to a 4.6 mm diameter.

Fig. 7

(Top) An image of a hematoxylin and eosin (H&E) stained slice of mouse esophagus, with diameter at the object of 4.6 mm. In the foveal image (bottom, 1.1 mm diameter at the object) one can see individual nuclei (dark spots). The annotated tissue regions are: (1) epithelium, (2) muscle/cartilage, (3) thyroid, and (4) salivary gland. Note that the 660 nm illumination results in poor contrast for the tissue’s eosinophilic structures.


In order to illustrate the lens’ long depth of focus, Fig. 8 shows a video of an experiment in which a user waves a hand 350 mm in front of the lens’ nominal object plane. Both the user’s hand and the raw image on the computer screen are in view simultaneously, showing that, except in the fovea, the shape of the hand remains visible even at this distance.

Fig. 8

An illustration of the long depth of field of the lens. While a hand is waved about 350 mm in front of the foveated lens’ nominal object plane, the raw image data (shown on the computer screen in the image) shows that the shape of the hand is resolved everywhere except in the high-resolution central field. (Video 2, 1.1 MB MPG) (URL: http://dx.doi.org/10.1117/1.JBO.17.2.021104.2)


5. Conclusion

Endoscopes with wide fields of view are commonly used in clinical settings to view organs inside the body. Endomicroscopes, on the other hand, despite their proven ability to diagnose tissue diseases, have seen little direct application in the clinic due to practical difficulties with their use: although they enable viewing of cellular morphology, they cannot provide a view of the large-scale tissue structure needed to give context for morphology-based diagnosis. Bringing these two techniques together in a single instrument has proven difficult, but we have shown that distortion-induced foveation can provide a high-resolution fovea together with a wide field of view, marrying the strengths of endoscopy and endomicroscopy. Our prototype foveated lens shows that this can be done within the small system diameters needed for agile endoscopic probing within the body.

Another advantage shown by our prototype lens is the long depth of field of the periphery. Not only does this make the probe more practical for viewing the contorted surface geometry of organs inside the body, it also allows users to scan whole organs from a distance and then move in closer to regions of interest to obtain morphological measurements.

The current proof-of-concept lens design has an on-axis object-space NA that is too low for performing endomicroscopy. In future work, we plan to modify the design, increasing the number of lens elements to achieve NA = 0.25 on axis. We are also developing new manufacturing methods for these miniature lenses to allow the tighter tolerances required by high-NA optics.

Acknowledgments

This work was supported in part by the National Institutes of Health under grants R01-EB007594 and R01-CA124319.

Notes

*Printing onto transparencies rather than paper produces much higher resolution. The resulting targets are not as high quality as commercially available elements, but their flexibility allows one to use any desired pattern.

References

1. T. Martinez, D. V. Wick, and S. R. Restaino, “Foveated, wide field-of-view imaging system using a liquid crystal spatial light modulator,” Opt. Express 8(10), 555–560 (2001). http://dx.doi.org/10.1364/OE.8.000555

2. D. V. Wick et al., “Foveated imaging demonstration,” Opt. Express 10(1), 60–65 (2002).

3. B. E. Bagwell, W. C. Sweatt, and D. V. Wick, “Adaptive optical zoom sensor,” (2005).

4. P. L. McCarley, M. A. Massie, and J. P. Curzan, “Foveating infrared imaging sensors,” Proc. SPIE 6660, 666002 (2007). http://dx.doi.org/10.1117/12.740036

5. H. Hua and S. Liu, “Dual-sensor foveated imaging system,” Appl. Opt. 47(3), 317–327 (2008). http://dx.doi.org/10.1364/AO.47.000317

6. G. Curatu and J. E. Harvey, “Analysis and design of wide-angle foveated optical systems based on transmissive liquid crystal spatial light modulators,” Opt. Eng. 48(4), 043001 (2009). http://dx.doi.org/10.1117/1.3122006

7. J. van der Spiegel et al., “A foveated retina-like sensor using CCD technology,” in Analog VLSI Implementation of Neural Systems, pp. 189–212, Kluwer Academic, Norwell, MA (1989).

8. R. Wodnicki, G. W. Roberts, and M. D. Levine, “A foveated image sensor in standard CMOS technology,” in Proceedings of the IEEE Custom Integrated Circuits Conference, pp. 357–360 (1995).

9. S. Xia, R. Sridhar, P. Scott, and C. Bandera, “An all CMOS foveal image sensor chip,” in Proceedings of the Eleventh Annual IEEE International ASIC Conference, pp. 409–413 (1998).

10. R. Etienne-Cummings et al., “A foveated silicon retina for two-dimensional tracking,” IEEE Trans. Circ. Syst. II 47(6), 504–517 (2000). http://dx.doi.org/10.1109/82.847066

11. Y. Suematsu and T. Hayase, “An advanced image system with fovea,” in 16th Annual Conference of IEEE Industrial Electronics Society (IECON ’90), pp. 581–585 (1990).

12. Y. Suematsu, H. Yamada, and T. Ueda, “A wide angle vision sensor with fovea—design of distortion lens and the simulated images,” in Proceedings of the International Conference on Industrial Electronics, Control, and Instrumentation (IECON), pp. 1770–1773 (1993).

13. K. Wakamiya et al., “New foveated wide angle lens with high resolving power and without brightness loss in the periphery,” Proc. SPIE 6051, 605107 (2005). http://dx.doi.org/10.1117/12.650967

14. G. Curatu et al., “Wide field-of-view imaging system using a liquid crystal spatial light modulator,” Proc. SPIE 5874, 587408 (2005). http://dx.doi.org/10.1117/12.619257

15. A. Ude, C. Gaskett, and G. Cheng, “Foveated vision systems with two cameras per eye,” in Proceedings of the IEEE International Conference on Robotics and Automation, pp. 3457–3462 (2006).

16. J. D. Rogers, T. T. Tkaczyk, and M. R. Descour, “Foveated endoscope objective design to combine high resolution with wide field of view,” Proc. SPIE 7558, 755808 (2010). http://dx.doi.org/10.1117/12.842778

18. M. V. R. K. Murty, “On the theoretical limit of resolution,” J. Opt. Soc. Am. 47(7), 667–668 (1957). http://dx.doi.org/10.1364/JOSA.47.000667

19. C. J. R. Sheppard and Z. Hegedus, “Resolution for off-axis illumination,” J. Opt. Soc. Am. A 15(3), 622–624 (1998). http://dx.doi.org/10.1364/JOSAA.15.000622

20. C. G. Wynne, “Flat-field microscope objective,” J. Sci. Instr. 38, 92–94 (1961). http://dx.doi.org/10.1088/0950-7671/38/3/307

21. H. C. Claussen, “Microscope objectives with plano-correction,” Appl. Opt. 3(9), 993–1003 (1964). http://dx.doi.org/10.1364/AO.3.000993

22. A. Miks and J. Novak, “Analysis and synthesis of planachromats,” Appl. Opt. 49(17), 3403–3410 (2010). http://dx.doi.org/10.1364/AO.49.003403

23. W. E. Smith, N. Vakil, and S. A. Maislin, “Correction of distortion in endoscope images,” IEEE Trans. Med. Imag. 11(1), 117–122 (1992). http://dx.doi.org/10.1109/42.126918

24. J. Weng, P. Cohen, and M. Herniou, “Camera calibration with distortion models and accuracy evaluation,” IEEE Trans. Pattern Anal. Mach. Intell. 14(10), 965–980 (1992). http://dx.doi.org/10.1109/34.159901

25. H. Haneishi, Y. Yagihashi, and Y. Miyake, “A new method for distortion correction of electronic endoscope images,” IEEE Trans. Med. Imag. 14(3), 548–555 (1995). http://dx.doi.org/10.1109/42.414620

26. K. V. Asari, S. Kumar, and D. Radhakrishnan, “A new approach for nonlinear distortion correction in endoscopic images based on least squares estimation,” IEEE Trans. Med. Imag. 18(4), 345–354 (1999). http://dx.doi.org/10.1109/42.768843

27. H. Farid and A. C. Popescu, “Blind removal of lens distortion,” J. Opt. Soc. Am. A 18(9), 2072–2078 (2001). http://dx.doi.org/10.1364/JOSAA.18.002072

28. V. K. Asari, “Non-linear spatial warping of endoscopic images: an architectural perspective for real time applications,” Microprocess. Microsy. 26, 161–171 (2002). http://dx.doi.org/10.1016/S0141-9331(02)00010-8

29. A. Goshtasby, “Image registration by local approximation methods,” Image Vis. Comput. 6(4), 255–261 (1988). http://dx.doi.org/10.1016/0262-8856(88)90016-9
© 2012 Society of Photo-Optical Instrumentation Engineers (SPIE)
Nathan Hagen and Tomasz S. Tkaczyk, “Foveated endoscopic lens,” Journal of Biomedical Optics 17(2), 021104 (12 March 2012). https://doi.org/10.1117/1.JBO.17.2.021104