Fluorescence lifetime imaging microscopy (FLIM) allows the investigation of the physicochemical environment of fluorochromes and protein-protein interaction mapping by Förster resonance energy transfer (FRET) in living cells. However, simpler and cheaper solutions are required before this powerful analytical technique finds broader application in the life sciences. Wide-field frequency-domain FLIM represents a solution whose application is currently limited by the need for multichannel-plate image intensifiers. We recently showed the feasibility of using a charge-coupled device/complementary metal-oxide semiconductor (CCD/CMOS) hybrid lock-in imager, originally developed for 3-D vision, as an add-on device for lifetime measurements on existing wide-field microscopes. In the present work, the performance of the setup is validated by comparison with well-established wide-field frequency-domain FLIM measurements. Furthermore, we combine the lock-in imager with solid-state light sources. This results in a simple, inexpensive, and compact FLIM system, operating at video rate and capable of single-shot acquisition by virtue of the unique parallel retrieval of two phase-dependent images. This novel FLIM setup is used for cellular and FRET imaging, and for high-throughput and fast imaging applications. The all-solid-state design bridges the technological gap that limits the use of FLIM in areas such as drug discovery and medical diagnostics.
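For context, frequency-domain FLIM recovers lifetimes from the measured phase delay and demodulation of the emission via the standard relations τ_φ = tan(Δφ)/ω and τ_m = √(1/m² − 1)/ω, with ω = 2πf. The sketch below applies these relations to a single measurement; the 40 MHz modulation frequency and the example phase/modulation values are illustrative assumptions, not values from the study.

```python
import math

def lifetimes_from_frequency_domain(phase_shift_rad, modulation_ratio, f_mod_hz):
    """Estimate fluorescence lifetimes from a frequency-domain FLIM measurement.

    phase_shift_rad  : measured phase delay of the emission vs. the excitation
    modulation_ratio : demodulation m = (AC/DC)_emission / (AC/DC)_excitation
    f_mod_hz         : excitation modulation frequency in Hz
    """
    omega = 2.0 * math.pi * f_mod_hz
    tau_phase = math.tan(phase_shift_rad) / omega                  # phase lifetime
    tau_mod = math.sqrt(1.0 / modulation_ratio**2 - 1.0) / omega   # modulation lifetime
    return tau_phase, tau_mod

# Example: 40 MHz modulation, 45 degree phase shift, m = 0.7 (illustrative values)
print(lifetimes_from_frequency_domain(math.radians(45), 0.7, 40e6))
```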
Optical time-of-flight (TOF) distance measurements can be performed using so-called smart lock-in pixels. By sampling the optical signal 2, 4, or n times in each pixel synchronously with the modulation frequency, the phase between the emitted and reflected signal is extracted and the object's distance is determined. The high integration level of such lock-in pixels enables the real-time acquisition of the three-dimensional environment without using any moving mechanical components. A novel design of the 2-tap lock-in pixel in a 0.6 µm semiconductor technology is presented. The pixel was implemented on a sensor with QCIF resolution. The optimized pixel design allows for high-speed operation of the device, resulting in nearly optimum demodulation performance and precise distance measurements that are almost exclusively limited by photon shot noise. In-pixel background-light suppression allows the sensor to be operated in an outdoor environment with sunlight incidence. The highly complex pixel functionality of the sensor was successfully demonstrated in the new SwissRanger SR3000 3D-TOF camera design. Distance resolutions in the millimeter range have been achieved while the camera operates at frame rates of more than 20 Hz.
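As an illustration of the lock-in principle described above, the sketch below shows one common way to recover phase, amplitude, and offset from four samples taken 90° apart and to convert the phase into a distance. Tap ordering and sign conventions differ between sensors, so this is a generic sketch rather than the SR3000's exact processing chain.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def demodulate_4tap(a0, a1, a2, a3, f_mod_hz):
    """Recover phase, amplitude, offset, and distance from four samples taken
    90 degrees apart within one modulation period (one common convention)."""
    phase = math.atan2(a3 - a1, a0 - a2) % (2.0 * math.pi)
    amplitude = 0.5 * math.hypot(a3 - a1, a0 - a2)
    offset = 0.25 * (a0 + a1 + a2 + a3)
    # distance within the non-ambiguity range c / (2 * f_mod)
    distance = C * phase / (4.0 * math.pi * f_mod_hz)
    return phase, amplitude, offset, distance
```

A 2-tap pixel delivers two of these samples per exposure, so two exposures shifted by 90° provide the full set of four values.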
In recent years, pervasive computing has become an important topic in the automobile industry. Besides well-known driver assistance systems such as ABS, ASR, and ESP, several small tools that support driving activities have been developed. The most important reason for integrating new technologies is to increase the safety of passengers as well as other road users. The Centre Suisse d'Electronique et de Microtechnique SA (CSEM) Zurich presented the CMOS/CCD real-time range-imaging technology, a measurement principle with a wide field of applications in automobiles. The measuring system is based on the time-of-flight measurement principle using actively modulated radiation. The radiation is emitted by the camera's illumination system, reflected by objects in the field of view, and finally imaged onto the CMOS/CCD sensor by the optics. From the acquired radiation, the phase delay and hence the target distance are derived within each individual pixel. From these distance measurements, three-dimensional coordinates can then be calculated. The imaging sensor acquires its environment data at high frame rates and is therefore suitable for real-time applications. The basis for decisions that contribute to increased safety is thus available. In this contribution, the operational principle of the sensor technology is first outlined. Further, some implementations of the technology are presented. At the laboratories of the Institute of Geodesy and Photogrammetry (IGP) at ETH Zurich, an implementation of the above-mentioned measurement principle, the SwissRanger, was investigated in detail. Special attention was focused on the characteristics of this sensor and its calibration. Finally, sample applications within the automobile are introduced.
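The step from per-pixel distances to three-dimensional coordinates mentioned above can be sketched with a simple pinhole camera model. The intrinsic parameters (fx, fy, cx, cy) are placeholders for an actual calibration such as the one performed at the IGP; the sketch also assumes the sensor reports radial distance along each pixel ray.

```python
import numpy as np

def range_to_xyz(range_img, fx, fy, cx, cy):
    """Convert a per-pixel radial distance image into Cartesian XYZ coordinates
    using a pinhole model (fx, fy, cx, cy are placeholder calibration values)."""
    h, w = range_img.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel column and row indices
    # unit direction vector of each pixel ray in the camera frame
    dirs = np.stack(((u - cx) / fx, (v - cy) / fy, np.ones((h, w))), axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    # scale each ray by the measured radial distance -> (h, w, 3) array of X, Y, Z
    return dirs * range_img[..., None]
```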
KEYWORDS: 3D-TOF imaging, Sensors, 3D image processing, Time of flight imaging, Cameras, Sun, Modulation, Solid state electronics, Image sensors, Time metrology
The time-of-flight (TOF) principle is a well-known principle for acquiring a scene in all three dimensions. The advantages of knowing the third dimension are obvious for many kinds of applications. The distance information within the scene renders automatic systems more robust and much less complex, or even enables completely new solutions. A solid-state image sensor containing 124 × 160 pixels and the corresponding 3D camera, the so-called SwissRanger camera, have already been presented in detail in [1]. It has been shown that the SwissRanger camera achieves depth resolutions in the sub-centimeter range, corresponding to a measured time resolution of a few tens of picoseconds with respect to the speed of light (c ≈ 3×10^8 m/s).
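The quoted time resolution follows directly from the round-trip geometry: a depth error Δd corresponds to a timing error Δt = 2Δd/c, so 5 mm maps to roughly 33 ps, as the short check below shows.

```python
C = 3e8                     # speed of light, m/s
delta_d = 5e-3              # 5 mm depth resolution (sub-centimeter)
delta_t = 2 * delta_d / C   # equivalent round-trip timing resolution
print(f"{delta_t * 1e12:.0f} ps")  # ~33 ps
```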
However, one main drawback of these so-called lock-in TOF pixels is their limited capacity to handle background illumination. Keeping in mind that in outdoor applications the optical power on the sensor originating from background illumination (e.g., sunlight) may be up to a few hundred times higher than the power of the modulated illumination, the sensor requires new pixel structures that eliminate, or at least reduce, the current restrictions imposed by background illumination.
Based on a 0.6 µm CMOS/CCD technology, four new pixel architectures that suppress background illumination and/or improve the ratio of modulated signal to background signal at the pixel output were developed and are presented in this paper. The theoretical principle of operation and the expected performance are described in detail, together with a sketch of the implementation of the different pixel designs at the silicon level. Furthermore, test results obtained in a laboratory environment are reported. The sensor structures are characterized in a high background-light environment, with illumination levels up to sunlight conditions. The distance linearity over a range of a few meters is measured under these lighting conditions. At the same time, the distance resolution is plotted as a function of the target distance, the integration time, and the background illumination power. This in-depth evaluation leads to a comparison of the various background-suppression approaches; it also includes a comparison with the traditional pixel structure in order to highlight the benefits of the new approaches.
The paper concludes with parameter estimates that provide an outlook toward building a sensor with high lateral resolution based on the most promising pixel architecture.
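To make the effect of background light concrete, the sketch below evaluates a commonly used shot-noise approximation for the range standard deviation, σ_L ≈ (c / (4√2 π f_mod)) · √(A + B) / (c_d A). The prefactor and the example electron counts are illustrative assumptions, not values from the paper, but the trend shows why suppressing the background charge B at the pixel level pays off.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def shot_noise_range_std(n_signal_e, n_background_e, f_mod_hz, demod_contrast=0.5):
    """Approximate shot-noise-limited range standard deviation (a simplified
    model; exact prefactors differ between derivations).

    n_signal_e     : modulated signal electrons per pixel
    n_background_e : offset electrons per pixel (background light plus DC part)
    demod_contrast : demodulation contrast of the pixel (0..1)
    """
    a = demod_contrast * n_signal_e
    return (C / (4.0 * math.sqrt(2.0) * math.pi * f_mod_hz)) * \
           math.sqrt(n_signal_e + n_background_e) / a

# Example: 20 MHz modulation, 10k signal electrons, background swept upward
for n_bg in (0.0, 1e4, 1e5, 1e6):
    print(f"background {n_bg:>9.0f} e-: sigma = {shot_noise_range_std(1e4, n_bg, 20e6):.3f} m")
```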
A new miniaturised 256-pixel silicon line sensor, which allows for the acquisition of depth-resolved images in real time, is presented. It reliably and simultaneously delivers intensity data as well as distance information on the objects in the scene. The depth measurement is based on the time-of-flight (TOF) principle. The device allows the simultaneous measurement of the phase, offset, and amplitude of a radio-frequency modulated light field that is emitted by the system and reflected back by the camera surroundings, without requiring any mechanical scanning parts. The 3D line sensor will be used on a mobile robot platform to substitute for the laser range scanners traditionally used for navigation in dynamic and/or unknown environments.
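In a navigation context such as the robot platform mentioned above, each line measurement can be turned into a planar scan much like the one a laser range finder delivers. The field of view used below is an assumed, illustrative value rather than a specification of the sensor.

```python
import math

def line_scan_to_points(distances_m, fov_deg=60.0):
    """Convert one 256-pixel TOF line measurement into 2D points in the sensor
    frame, assuming the pixels span a horizontal field of view of `fov_deg`
    degrees (an illustrative value, not a device specification)."""
    n = len(distances_m)
    points = []
    for i, d in enumerate(distances_m):
        angle = math.radians((i / (n - 1) - 0.5) * fov_deg)  # ray angle of pixel i
        points.append((d * math.sin(angle), d * math.cos(angle)))  # (x, y) in meters
    return points
```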
KEYWORDS: Demodulation, Modulation, Signal detection, Distance measurement, Electrons, Sensors, Camera shutters, Signal processing, 3D image processing, Diffusion
A new pixel structure for the demodulation of intensity-modulated light waves is presented. The integration of such pixels in line and area array sensors finds application in time-of-flight three-dimensional imaging. In 3D range imaging, an illumination module sends a modulated optical signal to a target, where it is reflected back to the sensor. The phase shift of the reflected signal with respect to the emitted signal is proportional to the distance to the corresponding point on the target. The detection and demodulation of the signal are performed by a new pixel structure named the drift-field pixel. The sampling process is based on the fast separation of photogenerated charge by lateral electric fields below a highly resistive transparent poly-Si photogate. Because the dominant charge-transfer mechanism is drift, instead of diffusion as in conventional CCD pixels, much higher modulation frequencies of up to 1 GHz are possible and, as a consequence, a much higher ultimate distance accuracy. First measurements performed with a prototype array of 3 × 3 pixels in a 0.8 µm technology confirm the suitability of the pixels for applications in the field of 3D imaging. Depth accuracies in the sub-centimeter range have already been achieved.
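A back-of-envelope comparison makes the drift-versus-diffusion argument tangible: with the Einstein relation D = μ·kT/q, the transit times t_drift = L/(μE) and t_diff = L²/(2D) can be compared for an assumed gate length and lateral potential drop. The numbers below are illustrative, not device parameters from the paper.

```python
# Rough comparison of charge-transfer time by drift vs. by thermal diffusion
# across a photogate of length L (illustrative numbers only).
MU_N = 0.135    # electron mobility in Si, m^2/(V*s) (approximate)
KT_Q = 0.026    # thermal voltage at room temperature, V
L = 5e-6        # assumed lateral transport length under the gate, m
V_DROP = 1.0    # assumed lateral potential drop across the gate, V

E = V_DROP / L                 # lateral electric field, V/m
t_drift = L / (MU_N * E)       # transit time with drift
D = MU_N * KT_Q                # Einstein relation: diffusion constant, m^2/s
t_diff = L**2 / (2.0 * D)      # transit time with pure thermal diffusion

print(f"drift: {t_drift * 1e9:.2f} ns, diffusion: {t_diff * 1e9:.2f} ns")
```

With these assumptions the drift transfer completes well within half a period at 1 GHz modulation, whereas purely diffusive transfer would not.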
A new miniaturized camera system that is capable of three-dimensional imaging in real time is presented. The compact imaging device is able to entirely capture its environment in all three spatial dimensions. It reliably and simultaneously delivers intensity data as well as range information on the objects and persons in the scene. The depth measurement is based on the time-of-flight (TOF) principle. A custom solid-state image sensor allows the parallel measurement of the phase, offset, and amplitude of a radio-frequency (RF) modulated light field that is emitted by the system and reflected back by the camera surroundings, without requiring any mechanical scanning parts. In this paper, the theoretical background of the implemented TOF principle is presented, together with the technological requirements and detailed practical implementation issues of such a distance measuring system. Furthermore, a schematic overview of the complete 3D camera system is provided. The experimental test results are presented and discussed. The present camera system achieves sub-centimeter depth resolution for a wide range of operating conditions. A miniaturized version of such a 3D solid-state camera, the SwissRanger 2, is presented as an example, illustrating the possibility of manufacturing compact, robust, and cost-effective ranging camera products for 3D imaging in real time.
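One practical consequence of the continuous-wave TOF principle used here is the non-ambiguity range: the measured phase wraps after one modulation period, limiting unambiguous distances to c/(2·f_mod). The snippet below evaluates this for a 20 MHz modulation frequency, an illustrative value rather than the SwissRanger 2 specification.

```python
C = 299_792_458.0  # speed of light, m/s

def non_ambiguity_range(f_mod_hz):
    """Maximum unambiguous distance of a continuous-wave TOF system: the phase
    wraps after one modulation period, i.e. at c / (2 * f_mod)."""
    return C / (2.0 * f_mod_hz)

print(non_ambiguity_range(20e6))  # ~7.5 m for a 20 MHz modulation frequency
```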