A low-cost vehicle test-bed system was developed to iteratively test, refine and demonstrate navigation algorithms before
attempting to transfer the algorithms to more advanced rover prototypes. The platform used here was a modified radio
controlled (RC) car. A microcontroller board and onboard laptop computer allow for either autonomous or remote
operation via a computer workstation. The sensors onboard the vehicle represent the types currently used on NASA-JPL
rover prototypes. For dead-reckoning navigation, optical wheel encoders, a single-axis gyroscope, and a two-axis accelerometer were used. An ultrasound ranger is available to calculate distance as a substitute for the stereo vision
systems presently used on rovers. The prototype also carries a small laptop computer with a USB camera and wireless
transmitter to send real-time video to an off-board computer. A real-time user interface was implemented that combines an automatic image feature selector, tracking parameter controls, a streaming video viewer, and user-generated or autonomous driving commands. Using the test-bed, real-time landmark tracking was demonstrated by autonomously
driving the vehicle through the JPL Mars yard. The algorithms tracked rocks as waypoints, generating coordinates used to calculate relative motion and visually servo to science targets. A limitation of the current system is its serial computation: each additional landmark is tracked in sequence. However, because each landmark is tracked independently, transferring the algorithms to appropriate parallel hardware would allow targets to be added without significantly diminishing system speed.
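A minimal sketch of the dead-reckoning update described above, assuming a planar pose and illustrative encoder and wheel constants (the actual test-bed values are not given here); heading comes from the single-axis gyroscope and traveled distance from the wheel encoders:

```python
import numpy as np

# Illustrative constants; the real test-bed values are not stated in the text.
TICKS_PER_REV = 512      # encoder ticks per wheel revolution (assumed)
WHEEL_RADIUS_M = 0.06    # wheel radius in meters (assumed)

def dead_reckon_step(pose, encoder_ticks, gyro_rate_rad_s, dt):
    """Advance a planar pose (x, y, heading) by one sensor sample."""
    x, y, theta = pose
    # Distance rolled during the interval, from the optical wheel encoders.
    distance = 2.0 * np.pi * WHEEL_RADIUS_M * (encoder_ticks / TICKS_PER_REV)
    # Heading change during the interval, from the single-axis gyroscope.
    theta_new = theta + gyro_rate_rad_s * dt
    # Integrate position along the average heading over the interval.
    theta_mid = 0.5 * (theta + theta_new)
    return np.array([x + distance * np.cos(theta_mid),
                     y + distance * np.sin(theta_mid),
                     theta_new])
```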
Real-time automatic adaptive tracking for target recognition provided autonomous control of a scale-model electric truck. The two-wheel-drive truck was modified into an autonomous rover test-bed for vision-based guidance and
navigation. Methods were implemented to monitor tracking error and ensure a safe, accurate arrival at the intended
science target. Some methods are situation independent, relying only on the confidence error of the target recognition
algorithm. Other methods take advantage of the scenario of combined motion and tracking to filter out anomalies. In
either case, only a single calibrated camera was needed for position estimation. Results from real-time autonomous
driving tests on the JPL simulated Mars yard are presented. Recognition error was often situation dependent. For the
rover case, the background is in motion and can be characterized to provide visual cues on rover travel, such as rate,
pitch, roll, and distance to objects of interest or hazards. Objects in the scene may be used as landmarks, or waypoints,
for such estimations. As objects are approached, their scale increases and their orientation may change. In addition,
particularly on rough terrain, these orientation and scale changes may be unpredictable. Feature extraction combined
with the neural network algorithm was successful in providing visual odometry in the simulated Mars environment.
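As a rough illustration of how a single calibrated camera can yield position estimates from landmark scale, the pinhole-model sketch below relates apparent size to range; the focal length and landmark size are assumed inputs, not values from these tests:

```python
def range_from_apparent_size(focal_length_px, landmark_width_m, width_px):
    """Pinhole-model range estimate to a landmark of (assumed) known width
    from its apparent width in pixels in the current frame."""
    return focal_length_px * landmark_width_m / width_px

# As the rover approaches, width_px grows and the range estimate shrinks;
# differencing estimates across frames gives a rough rate of travel.
```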
In real-world pattern recognition applications, multiple correlation filters can be synthesized to recognize a broad range of object classes, viewing angles, scale changes, and background clutter. Composite filters are used to reduce the number of filters needed for a particular target recognition task. Conventionally, the correlation peak is thresholded to determine whether a target is present. Due to the complexity of the objects and the unpredictability of the environment, false positive or false negative identifications often occur. In this paper we present the use of a radial basis function neural network (RBFNN) as a post-processor that assists the optical correlator in identifying objects and rejecting false alarms. Image plane features near the correlation peaks are extracted and fed to the neural network for analysis. The approach is capable of handling a large number of object variations and filter sets. Preliminary experimental results are presented and the performance is analyzed.
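A minimal sketch of that post-processing idea, assuming simple hand-picked peak-neighborhood features and a small Gaussian RBF layer; the actual feature set and network configuration used in the paper are not specified here:

```python
import numpy as np

def extract_peak_features(corr_plane, win=8):
    """Crop a window around the strongest correlation peak and form a small
    feature vector: peak value, local mean, and peak-to-sidelobe ratio."""
    iy, ix = np.unravel_index(np.argmax(corr_plane), corr_plane.shape)
    patch = corr_plane[max(iy - win, 0):iy + win + 1,
                       max(ix - win, 0):ix + win + 1]
    peak = patch.max()
    sidelobe = np.delete(patch.ravel(), patch.argmax())   # exclude the peak pixel
    psr = (peak - sidelobe.mean()) / (sidelobe.std() + 1e-9)
    return np.array([peak, patch.mean(), psr])

def rbf_score(features, centers, widths, weights):
    """Minimal RBF network forward pass: Gaussian hidden units followed by a
    linear output; a high score accepts the peak, a low score rejects it."""
    dists = np.linalg.norm(centers - features, axis=1)
    hidden = np.exp(-(dists ** 2) / (2.0 * widths ** 2))
    return hidden @ weights
```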
JPL is developing an Advanced Autonomous Target Recognition (AATR) technology to significantly reduce the broad-area search workload for imagery analysts. One of the algorithms to be delivered, as part of the JPL ATR Development and Evaluation (JADE) project, is the OT-MACH-based ATR algorithm software package for the grayscale optical correlator (GOC). In this paper we describe the basic features and functions of the software package as currently implemented. Automation of filter synthesis and testing for the GOC, particularly the automation of OT-MACH parameter optimization, is discussed.
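A sketch of what automated OT-MACH synthesis and parameter search can look like, using the commonly published frequency-domain form H = m* / (alpha*C + beta*D + gamma*S) and a brute-force grid over (alpha, beta, gamma); the grid range, scoring function, and white-noise model below are assumptions, not details of the delivered software package:

```python
import numpy as np
from itertools import product

def otmach_filter(train_imgs, alpha, beta, gamma, noise_psd=1.0):
    """Synthesize an OT-MACH filter in the frequency domain from training chips."""
    X = np.stack([np.fft.fft2(img) for img in train_imgs])  # training spectra
    m = X.mean(axis=0)                                       # mean spectrum
    D = (np.abs(X) ** 2).mean(axis=0)                        # average power spectrum
    S = (np.abs(X - m) ** 2).mean(axis=0)                    # spectral variance
    return np.conj(m) / (alpha * noise_psd + beta * D + gamma * S)

def optimize_parameters(train_imgs, test_imgs, score_fn, steps=5):
    """Brute-force search over (alpha, beta, gamma), keeping the set that
    maximizes a user-supplied score such as mean peak-to-sidelobe ratio."""
    best, best_score = None, -np.inf
    for a, b, g in product(np.linspace(0.1, 1.0, steps), repeat=3):
        H = otmach_filter(train_imgs, a, b, g)
        score = np.mean([score_fn(img, H) for img in test_imgs])
        if score > best_score:
            best, best_score = (a, b, g), score
    return best, best_score
```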
JPL is developing a high-resolution (512 pixel x 512 pixel), high-speed (1000 frames/sec), compact Automatic Target Recognition (ATR) processor for onboard target detection, identification, and tracking. This ATR processor consists of a compact Grayscale Optical Correlator (GOC) for parallel wide-area target-of-interest (TOI) detection and a hardware-based self-learning neural network (NN) for target identification and adaptive monitoring. The processor can be tailored to meet specific system requirements for many ATR applications. Development includes software simulation of key components, including the GOC and the NN. Both simulation tools are discussed and demonstrated.
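As a digital stand-in for the GOC simulation step, correlation can be emulated by multiplying the scene spectrum with a pre-synthesized frequency-domain filter and transforming back, mimicking the 4-f optical path; this is a sketch, not the actual simulation tool:

```python
import numpy as np

def goc_correlate(scene, filter_freq):
    """Emulate one grayscale optical correlation digitally: scene spectrum
    times frequency-domain filter, inverse transformed to the correlation
    plane; a strong peak flags a target-of-interest at that location."""
    scene_freq = np.fft.fft2(scene, s=filter_freq.shape)
    corr = np.fft.ifft2(scene_freq * filter_freq)
    return np.abs(np.fft.fftshift(corr))
```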
JPL has developed a compact portable 512 x 512 Grayscale Optical Correlator [1-4] by integrating a pair of 512 x 512 Ferroelectric Liquid Crystal Spatial Light Modulators (FLCSLMs), a red diode laser, Fourier optics, and a CMOS photodetector array. The system is designed to operate at a maximum speed of 1000 frames per second. An FPGA card was custom programmed to perform peak-detection post-processing to accommodate the system throughput rate. Custom mechanical mounting brackets were designed to miniaturize the optics head of the GOC into a 6” x 3.5” x 2” volume. The device driver hardware and software are installed in a customized PC. The GOC system’s portability has been demonstrated by shipping it to various locations for target recognition testing.
JPL is developing a portable 512 x 512 Grayscale Optical Correlator (GOC) system for target data mining and identification applications. This GOC system will utilize a pair of 512 x 512 Ferroelectric Liquid Crystal Spatial Light Modulators (FLCSLMs) to achieve a 1000 frames/sec data throughput. Primary system design issues include: optics design to achieve a compact system volume with fine-tuning capability, and a photodetector array with onboard post-processing for peak detection and target identification. These issues and the corresponding solutions are discussed.
An Optical Processing for the Mining and Identification of Targets (OPMIT) system is being proposed to significantly reduce the broad-area search workload for NIMA imagery analysts. Central to the system is a Grayscale Optical Correlator (GOC) developed by JPL in recent years. In this paper we discuss preliminary development of an important system component, the filter management module, which is critical to the success of GOC operation. The emphasis is on streamlining the OT-MACH filter synthesis and testing procedure for effective and efficient filter design while maintaining filter performance.
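One way the testing half of such a filter management module could be streamlined is a scoring loop over labeled test chips; the threshold and the detection/false-alarm metrics below are illustrative assumptions, not the module's actual criteria:

```python
def evaluate_filter(filter_freq, positives, negatives, correlate, threshold):
    """Score one candidate filter: detection rate on target chips and
    false-alarm rate on clutter chips. `correlate` is any routine (e.g. an
    FFT-based stand-in for the GOC) returning a correlation plane."""
    hits = sum(correlate(img, filter_freq).max() >= threshold for img in positives)
    false_alarms = sum(correlate(img, filter_freq).max() >= threshold for img in negatives)
    return hits / len(positives), false_alarms / len(negatives)
```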
At the Optical Computing Lab at the Jet Propulsion Laboratory (JPL), a binary holographic data storage system was designed and tested with methods of recording and retrieving the binary information. Levels of error correction were introduced to the system, including pixel averaging, thresholding, and parity checks. Errors were artificially introduced into the binary holographic data storage system and were monitored as a function of the defect area fraction, which showed a strong influence on data integrity. Average area fractions exceeding one quarter of the bit area caused unrecoverable errors. Efficient use of the available data density was discussed.
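A sketch of the retrieval-side error handling described above, assuming square bit cells, a page-median threshold, and one even-parity bit at the end of each row; the actual cell geometry and parity layout are not given here:

```python
import numpy as np

def decode_page(page, cell=4):
    """Decode a retrieved hologram page: average the pixels inside each bit
    cell, threshold at the page median, then flag rows whose trailing
    even-parity bit disagrees with the data bits (likely a defect)."""
    h, w = page.shape
    cells = page.reshape(h // cell, cell, w // cell, cell).mean(axis=(1, 3))
    bits = (cells > np.median(cells)).astype(int)
    data = bits[:, :-1]                       # data bits; last column is parity
    bad_rows = [r for r in range(bits.shape[0])
                if data[r].sum() % 2 != bits[r, -1]]
    return data, bad_rows
```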
Feature detectors have been considered for the role of supplying additional information to a neural network tracker. The feature detector focuses on areas of the image with significant information. Put simply, if a picture says a thousand words, the feature detectors are looking for the key phrases (keypoints). These keypoints are rotationally invariant and may be matched across frames. Application of these advanced feature detectors to the neural network tracking system at JPL has promising potential. As part of an ongoing program, an advanced feature detector was tested for augmentation of a neural-network-based tracker. The advanced feature detector extended tracking periods in test sequences including aircraft tracking, rover tracking, and a simulated Martian landing. Future directions of research are also discussed.
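A minimal sketch of the cross-frame keypoint matching such a detector enables, using OpenCV's ORB as a stand-in for the detector actually evaluated (which is not named above); frames are assumed to be 8-bit grayscale arrays:

```python
import cv2

def match_keypoints(prev_frame, curr_frame, max_matches=50):
    """Detect rotation-invariant keypoints in two consecutive frames and
    return the strongest cross-frame matches, the kind of correspondence
    that can re-anchor a tracker when the target's appearance drifts."""
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(prev_frame, None)
    kp2, des2 = orb.detectAndCompute(curr_frame, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    return kp1, kp2, matches[:max_matches]
```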
The precision of a radial basis function (RBF) neural network based tracking method has been assessed against real targets. Intensity profile feature extraction was used to build a model in real time, evolving with the target. Precision was assessed against traditional frame-by-frame measurements from the recorded data set. The results show the potential limit for the technique and reveal intricacies associated with empirical data that are not necessarily observed in simulations.
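A rough sketch of intensity-profile feature extraction over a target window, assuming the features are simply normalized row and column means; the paper's exact feature layout is not specified here:

```python
import numpy as np

def intensity_profile_features(window):
    """Collapse a target window into row and column intensity profiles and
    normalize them, giving a compact feature vector that can evolve with
    the target as the model is rebuilt each frame."""
    profile = np.concatenate([window.mean(axis=1), window.mean(axis=0)])
    return (profile - profile.mean()) / (profile.std() + 1e-9)
```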
JPL has recently developed, for the first time, a compact (2” x 2” x 1”) Grayscale Optical Correlator (GOC) using a pair of 512 x 512 Ferroelectric Liquid Crystal Spatial Light Modulators. In this paper, we will discuss recent progress in the design and packaging technology to achieve a rugged portable GOC module to enable the real-time onboard applications of this miniature GOC. Several automatic target recognition applications will also be presented.
An innovative compact holographic memory system will be presented. This system utilizes a new electro-optic (E-O) beam steering technology to achieve high-speed, high-density holographic data storage.