Paper
30 June 1994 Evolving multivariate mixture density estimates for classification
Donald E. Waagen, M. D. Parsons, John R. McDonnell, Jeffrey D. Argast
Abstract
Finite mixture models estimate probability density functions as a weighted combination of component density functions. This work investigates a combined stochastic and deterministic optimization approach of a generalized kernel function for multivariate mixture density estimation. Mixture models are selected and optimized by combining the optimization characteristics of a multi-agent stochastic optimization algorithm, based on evolutionary programming, with those of the EM algorithm. A classification problem is approached by optimizing a mixture density estimate for each class. Rissanen's minimum description length criterion provides the selection mechanism for evaluating mixture models. A comparison of each class's posterior probability (Bayes' rule) provides the classification decision procedure. A 2-D, two-class classification problem is posed, and the classification performance of the optimal mixture models is compared with that of a kernel estimator whose bandwidth is optimized using the technique of least-squares cross-validation.
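The classification scheme the abstract describes, fitting a mixture density per class and comparing class posteriors, can be sketched in a minimal form. The sketch below is not the authors' method: it uses plain EM with spherical Gaussian components and equal class priors, and it omits the evolutionary-programming search, the generalized kernel, and the MDL model-selection step. All function names and the synthetic data are illustrative assumptions.

```python
import numpy as np

def fit_mixture(X, k=2, iters=50, seed=0):
    """Fit a k-component spherical Gaussian mixture to X via EM (a simplified
    stand-in for the paper's EM refinement step)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    means = X[rng.choice(n, k, replace=False)]   # init means at random samples
    var = np.full(k, X.var())                    # one shared-scale variance per component
    w = np.full(k, 1.0 / k)                      # mixing weights
    for _ in range(iters):
        # E-step: responsibilities r[n, k] via log-densities (numerically stabilized)
        d2 = ((X[:, None, :] - means[None]) ** 2).sum(-1)
        logp = np.log(w) - 0.5 * d * np.log(2 * np.pi * var) - d2 / (2 * var)
        logp -= logp.max(1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(1, keepdims=True)
        # M-step: re-estimate weights, means, variances from responsibilities
        nk = r.sum(0)
        w = nk / n
        means = (r.T @ X) / nk[:, None]
        d2 = ((X[:, None, :] - means[None]) ** 2).sum(-1)
        var = (r * d2).sum(0) / (d * nk) + 1e-9
    return w, means, var

def log_density(X, params):
    """Log mixture density log p(x) for each row of X."""
    w, means, var = params
    d = X.shape[1]
    d2 = ((X[:, None, :] - means[None]) ** 2).sum(-1)
    logp = np.log(w) - 0.5 * d * np.log(2 * np.pi * var) - d2 / (2 * var)
    m = logp.max(1, keepdims=True)
    return (m + np.log(np.exp(logp - m).sum(1, keepdims=True))).ravel()

# A synthetic 2-D, two-class problem (illustrative, not the paper's data):
# class 0 is bimodal, class 1 unimodal.
rng = np.random.default_rng(1)
X0 = np.vstack([rng.normal([0, 0], 0.5, (100, 2)),
                rng.normal([3, 3], 0.5, (100, 2))])
X1 = rng.normal([0, 3], 0.5, (200, 2))
p0, p1 = fit_mixture(X0), fit_mixture(X1)

# Bayes-rule decision with equal priors: pick the class whose
# estimated class-conditional density is larger at the test point.
test = np.array([[3.0, 3.0], [0.0, 3.0]])
pred = (log_density(test, p1) > log_density(test, p0)).astype(int)
```

With equal priors the posterior comparison reduces to comparing the class-conditional densities, so `pred` assigns each test point to the class whose fitted mixture gives it higher likelihood.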
© (1994) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Donald E. Waagen, M. D. Parsons, John R. McDonnell, and Jeffrey D. Argast "Evolving multivariate mixture density estimates for classification", Proc. SPIE 2304, Neural and Stochastic Methods in Image and Signal Processing III, (30 June 1994); https://doi.org/10.1117/12.179226
CITATIONS
Cited by 1 scholarly publication.
KEYWORDS
Data modeling
Expectation maximization algorithms
Optimization (mathematics)
Stochastic processes
Statistical analysis
Statistical modeling
Systems modeling
