Classification aims to relate a categorical variable to a set of interconnected predictors. Linear discriminant analysis (LDA) provides a rule for classifying populations and allocating future observations among groups that have already been identified. Under the assumptions of normality and homoscedasticity, LDA yields the optimal discriminant rule for two or more groups. However, the parameters of LDA, the mean vector and covariance matrix, are strongly affected by outliers, whereas robust estimators resist them. This paper explores two robust methods, the Minimum Covariance Determinant (MCD) estimator and the Minimum Regularized Covariance Determinant (MRCD) estimator, in the context of discriminant analysis on real data. The MCD technique estimates location and scatter from the subset of a given size whose sample covariance matrix has the smallest determinant. Its fundamental limitation is that it does not give a reliable result when the number of features exceeds the subset size. The MRCD estimator is therefore employed, and the efficiency of the resulting rules is assessed by computing the Apparent Error Rate (AER). This paper also reviews the existing theory and methods of robust linear discriminant analysis (RLDA).
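The MCD-based discriminant rule described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses scikit-learn's `MinCovDet` estimator on simulated two-group data (the data, contamination scheme, and pooling of the two robust scatter matrices are all assumptions made for the example), builds the linear discriminant rule with equal priors, and computes the Apparent Error Rate as the in-sample misclassification proportion.

```python
import numpy as np
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(0)

# Two simulated groups with a few outliers (toy data, not from the paper)
n, p = 100, 3
X1 = rng.normal(0.0, 1.0, size=(n, p))
X2 = rng.normal(2.0, 1.0, size=(n, p))
X1[:5] += 10.0  # contaminate group 1 with outliers
X = np.vstack([X1, X2])
y = np.array([0] * n + [1] * n)

# Robust location/scatter per group via MCD; pool the scatter
# matrices under the homoscedasticity assumption
mcd1 = MinCovDet(random_state=0).fit(X1)
mcd2 = MinCovDet(random_state=0).fit(X2)
m1, m2 = mcd1.location_, mcd2.location_
S = (mcd1.covariance_ + mcd2.covariance_) / 2.0
S_inv = np.linalg.inv(S)

# Linear discriminant rule (equal priors): assign x to group 1 if
# (m1 - m2)' S^{-1} x > (m1 - m2)' S^{-1} (m1 + m2) / 2
w = S_inv @ (m1 - m2)
c = w @ (m1 + m2) / 2.0
pred = np.where(X @ w > c, 0, 1)

# Apparent Error Rate: proportion of training points misclassified
aer = np.mean(pred != y)
print(f"AER = {aer:.3f}")
```

Because the outliers barely perturb the MCD location and scatter estimates, the discriminant direction stays close to the one fitted on clean data; the classical sample mean and covariance would instead be dragged toward the contamination.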