We validate a simple method for determining the confidence intervals on fitted parameters derived from modeling optical reflectance spectroscopy measurements, using synthetic datasets. The method estimates the parameter confidence intervals as the square roots of the diagonal elements of the covariance matrix, obtained by multiplying the inverse of the second derivative matrix of χ² with respect to its free parameters by χ²_min/ν, with ν the number of degrees of freedom. We show that this method yields correct confidence intervals as long as the model used to describe the data is correct. Imperfections in the fitting model introduce a bias in the fitted parameters that greatly exceeds the estimated confidence intervals. We investigate various methods to identify and subsequently minimize the bias in the fitted parameters associated with incorrect modeling.
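The covariance-matrix estimate described above can be sketched in a few lines of NumPy. This is a minimal illustration on a hypothetical straight-line fit to synthetic data, not the reflectance model used in the paper: the Hessian of χ² is evaluated numerically at the best-fit point, inverted (with the conventional factor of two, since cov ≈ 2 H⁻¹ for χ² defined with 1/σ² weights), and scaled by χ²_min/ν.

```python
import numpy as np

# Synthetic dataset for a hypothetical linear model y = a + b*x
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
sigma = 0.5                                   # known measurement noise
y = 1.0 + 2.0 * x + rng.normal(0.0, sigma, x.size)

def chi2(p):
    """Chi-square of the linear model with parameters p = (a, b)."""
    a, b = p
    return np.sum(((y - (a + b * x)) / sigma) ** 2)

# Best-fit parameters (exact linear least squares for this toy model)
A = np.column_stack([np.ones_like(x), x])
p_fit, *_ = np.linalg.lstsq(A / sigma, y / sigma, rcond=None)

# Numerical Hessian of chi^2 at the minimum (central differences)
h = 1e-5
n = p_fit.size
H = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        pp = p_fit.copy(); pp[i] += h; pp[j] += h
        pm = p_fit.copy(); pm[i] += h; pm[j] -= h
        mp = p_fit.copy(); mp[i] -= h; mp[j] += h
        mm = p_fit.copy(); mm[i] -= h; mm[j] -= h
        H[i, j] = (chi2(pp) - chi2(pm) - chi2(mp) + chi2(mm)) / (4 * h * h)

nu = x.size - n                               # degrees of freedom
cov = 2.0 * np.linalg.inv(H) * chi2(p_fit) / nu
ci = np.sqrt(np.diag(cov))                    # 1-sigma confidence intervals
print("fit:", p_fit, "ci:", ci)
```

Because the simulated noise matches the assumed σ, χ²_min/ν is close to one here and the scaled intervals agree with the analytic least-squares uncertainties; with a deliberately wrong model, the fitted parameters would shift by far more than these intervals, which is the bias the abstract describes.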