Paper
17 May 2022
Empirical analysis of Ridge, Lasso, and Dropout regularizations
Yiming Wei
Proceedings Volume 12259, 2nd International Conference on Applied Mathematics, Modelling, and Intelligent Computing (CAMMIC 2022); 122590H (2022) https://doi.org/10.1117/12.2639002
Event: 2nd International Conference on Applied Mathematics, Modelling, and Intelligent Computing, 2022, Kunming, China
Abstract
This paper analyzes the performance of Lasso, Ridge, and Dropout regularization on synthetic and real datasets covering both classification and regression problems. On the MNIST dataset, the models produced relatively satisfactory results, but regularization made little difference. On the synthetic datasets, the mean squared error (MSE) grows with greater signal-to-noise ratio (SNR) as well as with the dimensionality of the data. Comparing the three regularizations shows that Dropout performs better as the SNR decreases, which may be related to the nature of Dropout. However, the trained model produced mediocre results on the real datasets, and the differences after applying regularization were insignificant, which could be attributed to the depth of the training model being insufficient for the given data.
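The abstract compares three standard regularization techniques on synthetic regression data with a controlled signal-to-noise ratio. The paper's exact experimental setup is not given here, so the following is a minimal illustrative sketch, assuming a simple linear-regression setting with hypothetical dimensions and noise levels: Ridge via its closed-form solution, Lasso via coordinate descent with soft-thresholding, and an inverted-dropout mask of the kind applied to neural-network activations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic setup (not the paper's exact datasets):
# y = X @ w_true + noise, with noise scaled to a chosen SNR.
n, d = 200, 20
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
signal = X @ w_true
snr = 5.0
noise = rng.standard_normal(n) * (np.std(signal) / snr)
y = signal + noise

def ridge(X, y, lam):
    """Closed-form Ridge (L2) solution: (X^T X + lam I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def lasso(X, y, lam, iters=200):
    """Lasso (L1) via cyclic coordinate descent with soft-thresholding."""
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(iters):
        for j in range(d):
            r = y - X @ w + X[:, j] * w[j]   # partial residual excluding feature j
            rho = X[:, j] @ r
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return w

def dropout_mask(shape, p, rng):
    """Inverted dropout: zero units with prob p, rescale survivors by 1/(1-p)."""
    return (rng.random(shape) >= p) / (1.0 - p)

mse = lambda w: np.mean((X @ w - y) ** 2)
print("ridge MSE:", mse(ridge(X, y, 1.0)))
print("lasso MSE:", mse(lasso(X, y, 1.0)))
```

Lowering `snr` in this sketch increases the noise variance, which is the regime where the abstract reports Dropout gaining an edge; a large Lasso penalty drives some coefficients exactly to zero, while Ridge only shrinks them.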
© (2022) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Yiming Wei "Empirical analysis of Ridge, Lasso, and Dropout regularizations", Proc. SPIE 12259, 2nd International Conference on Applied Mathematics, Modelling, and Intelligent Computing (CAMMIC 2022), 122590H (17 May 2022); https://doi.org/10.1117/12.2639002
KEYWORDS
Signal to noise ratio, Data modeling, Neural networks, Performance modeling, Neurons, Signal attenuation, Lawrencium
