To ensure safe mission completion, Army aviators must be prepared to execute appropriate emergency procedures (EPs) in a range of situations. Augmented and virtual reality (AR/VR) technologies provide novel opportunities to enhance EP performance by presenting EPs in powerful simulations of the operational environment. USAARL is currently developing a VR EP research simulator to support systematic human performance research, development, test, and evaluation (RDT&E) programs that investigate, and thereby enhance, the effective and safe execution of EPs. Factors to be investigated include workload, flight maneuver, displays, multisensory stimuli, and aircrew coordination, as well as psychophysiological and operational stressors known to potentially impact EP execution. The USAARL EP simulator is being developed and operates within the Unity Real-Time Development Platform, currently instantiated with HTC Vive Pro hardware. To maximize immersion, the user controls the simulation with virtual reality gloves. Pupil Labs hardware and software have been integrated with the HTC Vive Pro to record, archive, and analyze synchronous, time-stamped oculometric data. Engine fire and single-engine failure are the emergencies implemented in the EP simulator to date. One mode of operation permits the user to passively experience the EP with no required input; other modes require the participant to execute the EP with predetermined cueing stimuli ranging from substantial step-by-step cueing to no cueing. Future work will incorporate additional EPs into standard maneuvers with defined physiological stressors.
Normal human pupil diameter (PD) ranges from about 2 to 8 millimeters (mm), depending on a number of factors. While the inverse relation between PD and luminance is well known, less well known and less dramatic is the dependence of PD on mental workload (MWL). The PD response to MWL states is typically of small amplitude and high frequency, reflecting differential sympathetic and parasympathetic innervation. The literature shows that, with contemporary objective eye-tracking technology, PD behavior may prove to be a practical indication of psychophysiological status, including attention and MWL. Despite the pupil's limited response repertoire (i.e., diameter), it is capable of potentially encoding a spectrum of information about operator state; the challenge is disambiguating PD response characteristics. A commercial, off-the-shelf eye tracker recorded the PD of volunteers performing the N-back task, a standard MWL task involving auditory attention and short-term memory, at three levels of difficulty (easy, moderate, and difficult). The measurements were made at three levels of average luminance selected to assess PD when it is small, medium, and large. Each of these luminance levels fluctuated around its mean in a sinusoidal fashion (with ±25% modulation) at one of four frequencies: 1.0 hertz (Hz), 0.2 Hz, 0.1 Hz, and 0.0 Hz, the last being the control condition of 0% light modulation. Multivariate analysis revealed that the effects of MWL on PD were detectable across all combinations of luminance and modulation frequency. The next step is to determine whether pupil diameter can predict MWL.
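The luminance protocol described above can be sketched as a simple function. This is a hedged illustration, not the authors' stimulus code: the function name, the nominal mean luminance of 100 cd/m², and the sine phase convention are assumptions for the example; only the ±25% modulation depth, the four frequencies, and the 0 Hz (0% modulation) control condition come from the abstract.

```python
import math

def luminance(t_seconds, mean_luminance, freq_hz, modulation=0.25):
    """Instantaneous luminance of one stimulus condition: the mean
    level fluctuating sinusoidally with +/-25% amplitude at the given
    frequency. A frequency of 0.0 Hz is the unmodulated control
    condition (0% modulation), so the mean is returned unchanged."""
    if freq_hz == 0.0:
        return mean_luminance
    return mean_luminance * (
        1.0 + modulation * math.sin(2.0 * math.pi * freq_hz * t_seconds)
    )

# Peak and trough of the 1.0 Hz condition around an assumed
# nominal mean of 100 cd/m^2:
peak = luminance(0.25, 100.0, 1.0)    # quarter period -> mean + 25%
trough = luminance(0.75, 100.0, 1.0)  # three-quarter period -> mean - 25%
control = luminance(0.5, 100.0, 0.0)  # 0 Hz control stays at the mean
```

The same function covers all twelve luminance-by-frequency combinations (three mean levels crossed with four frequencies) simply by changing its arguments.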
The U.S. Army Aeromedical Research Laboratory has transformed its NUH-60 Blackhawk simulator into a degraded visual environment (DVE) test bed capable of assessing integrated cueing technologies and their impact on flight performance. It is a unique simulator with the Lift Simulator Modernization Program database and is equipped with an enhanced brownout/whiteout model that replicates typical DVE conditions. The full-visual, full-motion simulator has six degrees of freedom and environmental temperature control. It consists of a simulator compartment containing a cockpit, an instructor/operator station, and an observer station, and is equipped with eight Dell XIG visual image generator systems that simulate natural helicopter environments for day, dusk, night, and night vision goggle (NVG) conditions with blowing sand or snow. The visual scene databases are created using satellite imagery of real-world locations, and new sensor imaging capabilities produce realistic visuals that allow testing of DVE countermeasures. The simulator is equipped with USAARL's Tactile Situation Awareness System (TSAS), which stimulates the pilot through belt-worn and seat-cushion "tactors" that vibrate to convey, through the sense of touch, specific aircraft flight parameters such as drift, direction, and altitude. In addition, a glass cockpit façade allows UH-60 Mike model functionality. The simulator is now being used as part of the U.S. Army Research, Development and Engineering Command's DVE mitigation program.