Near-eye displays, i.e. displays positioned in close proximity to the observer's eye, are a technology of growing significance in industrial and defense applications, e.g. for augmented reality and digital night vision. Fraunhofer IOSB has recently developed a specialized measurement setup for assessing the display capabilities of such devices as part of the optoelectronic imaging chain, with the primary focus on the Modulation Transfer Function (MTF).
The setup consists of an imaging system with a high-resolution CMOS camera and a motorized positioning system. It is intended to run different measurement procedures semi-automatically, performing the desired measurements at specified points on the display.
This paper presents our extended work on near-eye display imaging quality assessment following the initial publication. Using a commercial virtual reality headset as a sample display, we further refined the previously described MTF measurement procedures, one based on bar pattern images and the other on a slanted-edge image. The refinements include improvements to the processing of the camera images as well as to the extraction of contrast measurements. Furthermore, we implemented an additional, line-image-based method for determining the device's MTF.
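The slanted-edge technique follows a well-known general scheme: the edge spread function (ESF) is differentiated to obtain the line spread function (LSF), and the normalized Fourier magnitude of the LSF yields the MTF. The following is a minimal sketch of that scheme only, not our actual processing chain; the synthetic Gaussian-blurred edge is purely illustrative.

```python
import math
import numpy as np

def mtf_from_edge(esf):
    """Estimate the MTF from a (supersampled) edge spread function.

    Differentiating the ESF yields the line spread function (LSF);
    the magnitude of its Fourier transform, normalized to the DC
    value, is the MTF estimate.
    """
    lsf = np.diff(esf)                   # ESF -> LSF
    lsf = lsf * np.hanning(lsf.size)     # window to suppress spectral leakage
    spectrum = np.abs(np.fft.rfft(lsf))
    return spectrum / spectrum[0]        # normalize so that MTF(0) = 1

# Illustrative input: an ideal step edge blurred by a Gaussian
x = np.linspace(-5, 5, 512)
esf = np.array([0.5 * (1 + math.erf(xi)) for xi in x])
mtf = mtf_from_edge(esf)
```

In practice the ESF would first be built by projecting pixels onto the edge normal and binning them at sub-pixel spacing, which is what makes the slanted (rather than vertical) edge useful.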
The impact of the refinements is examined and the results of the different methods are discussed, with the goal of identifying the most suitable measurement procedures for our setup and of highlighting the individual merits of the different methods.

In the first part, we outline the experimental setup of our testbed. It allows infrared imaging of real scenes to be mimicked in a controlled laboratory environment. We describe the process of dynamic infrared scene generation as well as the physical limitations of our scene projection setup.
The second part discusses ongoing and future applications. This testbed extends our standard laboratory measurements for thermal imagers by an image-based performance analysis method. Scene-based methods are necessary to investigate and assess advanced digital signal processing (ADSP) algorithms, which are becoming an integral part of thermal imagers. We use this testbed to draw inferences about unknown proprietary ADSP algorithms by choosing suitable test scenes.
Furthermore, we investigate the influence of dazzling on thermal imagers by coupling infrared laser radiation into the projected scene. These studies allow us to evaluate the potential and hazards of infrared dazzling and to characterize the associated effects. In a future step, we want to transfer our knowledge of VIS/NIR laser protection to the infrared regime.
The experimental setup for the investigation of laser-induced damage is described in detail. Both pulsed and continuous-wave (CW) lasers are used as sources. The laser-induced damage threshold (LIDT) is determined with the single-shot method, increasing the pulse energy from pulse to pulse or, in the case of CW lasers, increasing the laser power.
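The logic of such a single-shot scan is straightforward: one shot per energy step, stopping once damage is observed, which brackets the threshold between the highest non-damaging and the lowest damaging energy. A schematic sketch of that logic follows; the damage check is a mock placeholder, and each shot is assumed to hit a pristine site.

```python
def find_lidt(fire_and_check, energies):
    """Single-shot LIDT scan.

    fire_and_check(e) -> True if a single pulse (or CW exposure) of
    energy/power e produces visible damage; each call is assumed to
    address a fresh, undamaged site.

    Returns (highest energy without damage, lowest energy with damage);
    either entry is None if the scan did not reach that side.
    """
    last_safe = None
    for e in sorted(energies):
        if fire_and_check(e):
            return last_safe, e      # threshold bracketed
        last_safe = e
    return last_safe, None           # no damage observed in the scan

# Mock damage model for illustration (assumed threshold: 5.0 units)
bracket = find_lidt(lambda e: e >= 5.0, [1, 2, 4, 6, 8])
# bracket == (4, 6)
```

The real measurement replaces the mock check with an inspection of the sensor image after each shot; finer energy steps narrow the bracket accordingly.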
Furthermore, we investigate the morphology of laser-induced damage patterns and the dependence of the number of destroyed device elements on the laser pulse energy or laser power. In addition to the destruction of single pixels, we observe aftereffects such as persistent dead pixel columns or rows in the sensor image.
To address this dilemma, we propose to construct a camera that mimics the perception of laser dazzle by the human eye as closely as possible. This human-eye camera consists of a hardware and a software component, which together perform the various functions of the eye. The hardware controls the eye movement (saccadic viewing), the adaptation of the iris (irradiance control), and the projection of the image onto a sensor. The software receives the image captured by the sensor and models the receptor density and the retinal neural operations, providing feedback to the hardware. The images processed by this virtual retina are intended for evaluating the degree of dazzle as it would be perceived by a human observer.
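As one illustration of the software side, the receptor-density step could be approximated by blurring the sensor image more strongly with increasing eccentricity from the fixation point. The sketch below is our own illustrative assumption (a crude box-blur pyramid with periodic boundaries), not the actual virtual-retina processing.

```python
import numpy as np

def foveate(image, fx, fy):
    """Crude eccentricity-dependent blur as a stand-in for the falling
    receptor density away from the fovea: pixels far from the fixation
    point (fx, fy) are taken from progressively more blurred copies.

    Hypothetical model for illustration only. np.roll wraps at the
    image borders (periodic-boundary simplification).
    """
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    ecc = np.hypot(xs - fx, ys - fy) / np.hypot(w, h)  # 0 at fixation

    levels = [image.astype(float)]          # level 0: unblurred
    blurred = image.astype(float)
    for _ in range(4):
        # separable 3-tap box blur, applied repeatedly per level
        blurred = (np.roll(blurred, 1, 0) + blurred + np.roll(blurred, -1, 0)) / 3
        blurred = (np.roll(blurred, 1, 1) + blurred + np.roll(blurred, -1, 1)) / 3
        levels.append(blurred.copy())

    # map eccentricity to a blur level and pick per-pixel
    idx = np.clip((ecc * 2 * len(levels)).astype(int), 0, len(levels) - 1)
    return np.stack(levels)[idx, ys, xs]

# Usage on a synthetic ramp image, fixating the center
img = np.arange(256, dtype=float).reshape(16, 16)
out = foveate(img, fx=8, fy=8)
```

At the fixation point the output equals the input; a real model would additionally account for the actual cone/rod density profile and the retinal neural operations mentioned above.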