An effective integrated structural health monitoring system must include a method of sensing and a process of damage identification that are optimized to work together. The result is a system that provides an automated, quantified assessment of the damage present in a structure. Two candidates for such a symbiosis of sensing and damage identification are impedance-based measurement and statistical process control. The impedance-based structural health monitoring method uses a high-frequency signal to excite a structure through a bonded piezoelectric patch and measures the impedance response of the excited structure across a frequency spectrum. Under structural damage such as a threaded connection loosening or a crack developing, the structure will begin to show a change in impedance. Once measured, a damage-sensitive feature extracted from this impedance change can be statistically classified into different damage cases by statistical process control. This paper addresses impedance measurements from experimental structures and a subsequent statistical method for quantitatively determining when the impedance signature of a structure has changed significantly enough to warrant the classification of “damaged”. Simple features and hypothesis-testing algorithms are explored in an effort to create real-time solutions and reduce the complexity of damage identification for future use in low-resource integrated structural health monitoring systems.
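The pairing of an impedance-derived damage feature with statistical process control can be sketched as follows. Everything here is an illustrative assumption rather than data from the experiments: the root-mean-square-deviation (RMSD) feature is one common choice of damage-sensitive feature, the Lorentzian "impedance signature", the frequency band, and the noise levels are toy values, and the control limits are simple Shewhart-style three-sigma bounds.

```python
import numpy as np

def rmsd_feature(baseline_imp, test_imp):
    """RMSD between a test impedance signature and the baseline —
    a common damage-sensitive feature (assumption: real part of
    impedance sampled on a shared frequency grid)."""
    return np.sqrt(np.sum((test_imp - baseline_imp) ** 2) /
                   np.sum(baseline_imp ** 2))

def control_limits(features, k=3.0):
    """Shewhart-style control limits from baseline feature samples."""
    mu, sigma = np.mean(features), np.std(features, ddof=1)
    return mu - k * sigma, mu + k * sigma

# Synthetic demo: noisy baseline signatures, then a "damaged" signature
# whose resonance has shifted slightly (toy model of a loosened joint).
rng = np.random.default_rng(0)
freq = np.linspace(40e3, 50e3, 400)                     # band in Hz (toy)
baseline = 100.0 / (1.0 + ((freq - 45e3) / 2e3) ** 2)   # toy resonance peak

feats = [rmsd_feature(baseline, baseline + rng.normal(0, 0.5, freq.size))
         for _ in range(30)]
lcl, ucl = control_limits(feats)

damaged = 100.0 / (1.0 + ((freq - 45.3e3) / 2e3) ** 2)  # shifted resonance
f_damaged = rmsd_feature(baseline, damaged + rng.normal(0, 0.5, freq.size))
print(f_damaged > ucl)  # shifted signature should fall outside the limits
```

A feature falling outside the control limits is flagged as out of statistical control, i.e., a candidate damage case.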
At Los Alamos National Laboratory (LANL), various algorithms for structural health monitoring problems have been explored over the last 5 to 6 years. The original DIAMOND (Damage Identification And MOdal aNalysis of Data) software was developed as a package of modal analysis tools with some frequency-domain damage identification algorithms included. Since the conception of DIAMOND, the Structural Health Monitoring (SHM) paradigm at LANL has been cast in the framework of statistical pattern recognition, promoting data-driven damage detection approaches. To reflect this shift and to allow user-friendly analysis of data, a new piece of software, DIAMOND II, is under development. The Graphical User Interface (GUI) of the DIAMOND II software is based on the idea of GLASS (Graphical Linking and Assembly of Syntax Structure) technology, which is currently being implemented at LANL. GLASS is a Java-based GUI that allows drag-and-drop construction of algorithms from various categories of existing functions. On the platform of the underlying GLASS technology, DIAMOND II is simply a module specifically targeting damage identification applications. Users can assemble various routines, building their own algorithms or benchmark-testing different damage identification approaches, without writing a single line of code.
The first and most important objective of any damage identification algorithm is to ascertain with confidence whether damage is present. Many methods have been proposed for damage detection based on ideas of novelty detection founded in pattern recognition and multivariate statistics. The philosophy of novelty detection is simple. Features are first extracted from a baseline of the system to be monitored, and subsequent data are then compared to see if the new features are outliers that depart significantly from the rest of the population. In damage diagnosis problems, the assumption is that outliers are generated by a damaged condition of the monitored system. This damage classification necessitates the establishment of a decision boundary. Choosing this threshold value is often based on the assumption that the parent distribution of the data is Gaussian. While novelty detection focuses attention on the outliers or extreme values of the data, i.e., the points in the tails of the distribution, threshold selection under the normality assumption is governed by the central population of the data. This normality assumption can therefore impose misleading behavior on the damage classification and is likely to lead the damage diagnosis astray. In this paper, extreme value statistics is integrated with novelty detection to specifically model the tails of the distribution of interest. Finally, the proposed technique is demonstrated on simulated numerical data and on time series data measured from an eight degree-of-freedom spring-mass system.
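The contrast between a Gaussian-based threshold and an extreme-value-based one can be illustrated numerically. This is a minimal sketch, not the paper's procedure: the heavy-tailed Student-t baseline feature, the block size, and the exceedance levels are all assumptions chosen only to show why a normality-based threshold misjudges the tails, while a Gumbel (type-I extreme value) fit to block maxima models them directly.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Baseline feature with heavier-than-Gaussian tails (assumption:
# Student-t with 4 degrees of freedom, purely for illustration)
baseline = stats.t.rvs(df=4, size=50000, random_state=rng)

# Gaussian-based threshold with a nominal 0.1% false-positive rate
mu, sigma = baseline.mean(), baseline.std(ddof=1)
thr_gauss = stats.norm.ppf(0.999, loc=mu, scale=sigma)

# Realized false-positive rate: well above the nominal 0.1%, because
# the central population, not the tail, sets the Gaussian threshold
fp_gauss = np.mean(baseline > thr_gauss)

# EVS alternative: fit a Gumbel model to block maxima, then place the
# threshold at the 90th percentile of the fitted maximum distribution
maxima = baseline.reshape(500, 100).max(axis=1)
loc, scale = stats.gumbel_r.fit(maxima)
thr_evs = stats.gumbel_r.ppf(0.9, loc=loc, scale=scale)

print(fp_gauss)              # realized rate under the Gaussian threshold
print(thr_evs > thr_gauss)   # tail-aware threshold sits further out
```

For this heavy-tailed feature the Gaussian threshold admits several times more false alarms than its nominal level, while the extreme-value threshold is set directly from the observed tail behavior.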
KEYWORDS: Statistical modeling, Statistical analysis, Autoregressive models, Statistical inference, Data modeling, Error analysis, Time series analysis, Aluminum, Damage detection, Digital signal processing
In this application of the statistical pattern recognition paradigm, a prediction model of a chosen feature is developed from the time-domain response of a baseline structure. After the model is developed, subsequent feature sets are tested against the model to determine if a change in the feature has occurred. In the proposed statistical inference for damage identification there are two basic hypotheses: (1) the model can predict the feature, in which case the structure is undamaged, or (2) the model cannot accurately predict the feature, suggesting that the structure is damaged. The Sequential Probability Ratio Test (SPRT) provides a statistical method that quickly arrives at a decision between these two hypotheses and is applicable to continuous monitoring. In the original formulation of the SPRT algorithm, the feature is assumed to be Gaussian and the thresholds are set accordingly. It is likely, however, that the feature used for damage identification is sensitive to the tails of the distribution, and that the tails are not necessarily governed by Gaussian characteristics. By modeling the tails with the technique of Extreme Value Statistics, the hypothesis decision thresholds for the SPRT algorithm may be set without invoking the normality assumption. The SPRT algorithm is used to decide whether the test structure is undamaged or damaged, and which joint is exhibiting the change.
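The two-hypothesis decision described above can be sketched with Wald's classical SPRT in its original Gaussian form (the means, variance, and error rates below are illustrative assumptions, not values from the experiments):

```python
import numpy as np

def sprt(samples, mu0, mu1, sigma, alpha=0.01, beta=0.01):
    """Wald's Sequential Probability Ratio Test between
    H0: x ~ N(mu0, sigma^2)  (undamaged feature)  and
    H1: x ~ N(mu1, sigma^2)  (damaged feature).
    Returns ('H0' | 'H1' | 'undecided', samples consumed)."""
    a = np.log(beta / (1 - alpha))      # lower boundary: accept H0
    b = np.log((1 - beta) / alpha)      # upper boundary: accept H1
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # log-likelihood-ratio increment for Gaussians with common sigma
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr <= a:
            return "H0", n
        if llr >= b:
            return "H1", n
    return "undecided", len(samples)

rng = np.random.default_rng(2)
# Feature stream from an undamaged structure (assumed mu0=0, sigma=1) ...
d0, n0 = sprt(rng.normal(0.0, 1.0, 500), mu0=0.0, mu1=1.0, sigma=1.0)
# ... and after a simulated damage-induced mean shift to mu1=1
d1, n1 = sprt(rng.normal(1.0, 1.0, 500), mu0=0.0, mu1=1.0, sigma=1.0)
print(d0, n0, d1, n1)  # typically decides H0 / H1 within a few samples
```

The appeal for continuous monitoring is that the test consumes only as many samples as the evidence requires; the paper's extension replaces the Gaussian likelihoods and boundaries with tail models from Extreme Value Statistics.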
A new design of fiber optic chemical sensor is proposed and demonstrated as a working pH sensor. The sensor is based on evanescent coupling between a side-polished single-mode optical fiber and a single-mode planar overlay waveguide. The stringent tolerances placed on the planar waveguide thickness are met by depositing the overlay, one molecular layer at a time, by Langmuir-Blodgett (LB) deposition. The advantage of being able to design the optical properties of the organic dye material is demonstrated by comparing two different overlay materials and their responses over different wavelength ranges. Finally, a Kramers-Kronig based model, relating the absorption spectrum of the overlay material to its material dispersion, is shown to predict the sensor response.
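The Kramers-Kronig relation underlying the final model connects an absorption spectrum to the accompanying dispersion. A minimal numerical sketch, not the paper's model: a toy Lorentzian absorption line stands in for the dye overlay's spectrum, and a discrete Hilbert transform recovers the dispersive (real) part, which for this line shape has a known analytic form to compare against. All values and sign conventions are illustrative assumptions.

```python
import numpy as np
from scipy.signal import hilbert

# Toy Lorentzian absorption line, chi''(w), on a detuning axis (a.u.)
w = np.linspace(-50.0, 50.0, 20001)
gamma = 1.0
chi_imag = gamma / (w**2 + gamma**2)

# Kramers-Kronig via discrete Hilbert transform; in scipy's convention
# the dispersive part is minus the imaginary part of the analytic signal
chi_real_kk = -np.imag(hilbert(chi_imag))

# Analytic dispersion for the same Lorentzian, for comparison
chi_real_exact = -w / (w**2 + gamma**2)

# Agreement is good away from the grid edges (FFT periodization error)
mid = slice(5000, 15001)
err = np.max(np.abs(chi_real_kk[mid] - chi_real_exact[mid]))
print(err)
```

The same idea, applied to a measured overlay absorption spectrum rather than a toy line, yields the material dispersion that drives the fiber-to-overlay coupling wavelength.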