Authors: Oyegoke, O. A.; Oyeyemi, G. M.; Adeleke, M. O.; Kolawole, R. O.
Date accessioned: 2023-07-27
Date available: 2023-07-27
Date issued: 2020
Journal: FULafia Journal of Science and Technology
ISSN: 2449-0954
URI: https://uilspace.unilorin.edu.ng/handle/20.500.12484/11645

Abstract: Multicollinearity is a serious problem in regression analysis. Ordinary least squares (OLS) regression may produce highly variable estimates of the regression coefficients in the presence of multicollinearity. The Least Absolute Shrinkage and Selection Operator (LASSO), Ridge Regression (RR), and Partial Least Squares (PLS) are well-established methods that reduce the variability of the estimates by shrinking the coefficients and, at the same time, produce interpretable models by shrinking some coefficients. The performances of the LASSO, Ridge Regression, PLS, and OLS estimators were evaluated using the Akaike Information Criterion (AIC) and Bayesian Information Criterion (BIC) in the presence of multicollinearity via Monte Carlo simulation. Simulations were carried out for sample sizes n = 10, 50, 100, 150 and multicollinearity levels of Mild (0.1–0.3), Low (0.4–0.6), and High (0.7–0.9). OLS gave poor parameter estimates and produced wrong inferences, the LASSO estimator performed best overall, and PLS was most efficient when the number of variables exceeded the sample size.

Language: en
Keywords: Multicollinearity; Least Absolute Shrinkage and Selection Operator; Ridge Regression; Partial Least Squares; Estimators
Title: REGULARIZATION TECHNIQUES IN MULTIPLE LINEAR REGRESSION IN THE PRESENCE OF MULTICOLLINEARITY
Type: Article
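
As an illustration of the kind of comparison the abstract describes (not the authors' code), the following minimal Python sketch simulates correlated predictors at a single multicollinearity level, fits OLS, Ridge, LASSO, and PLS, and scores each fit with Gaussian AIC/BIC computed from the residual sum of squares. The true coefficients, noise level, correlation (rho = 0.8), penalty strengths, and number of PLS components are illustrative assumptions.

# Sketch of a Monte Carlo comparison of OLS, Ridge, LASSO and PLS under
# multicollinearity, scored by AIC/BIC. Settings below are assumptions,
# not the study's exact design.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

def simulate(n, p=5, rho=0.8, sigma=1.0):
    """Draw X with pairwise correlation rho and y = X @ beta + noise."""
    cov = np.full((p, p), rho) + (1 - rho) * np.eye(p)
    X = rng.multivariate_normal(np.zeros(p), cov, size=n)
    beta = np.ones(p)                      # assumed true coefficients
    y = X @ beta + sigma * rng.standard_normal(n)
    return X, y

def aic_bic(y, y_hat, k):
    """Gaussian AIC and BIC from the residual sum of squares with k parameters."""
    n = len(y)
    rss = np.sum((y - y_hat) ** 2)
    loglik = -0.5 * n * (np.log(2 * np.pi * rss / n) + 1)
    return 2 * k - 2 * loglik, k * np.log(n) - 2 * loglik

models = {
    "OLS": LinearRegression(),
    "Ridge": Ridge(alpha=1.0),             # assumed penalty strength
    "LASSO": Lasso(alpha=0.1),             # assumed penalty strength
    "PLS": PLSRegression(n_components=2),  # assumed number of components
}

for n in (10, 50, 100, 150):               # sample sizes used in the study
    X, y = simulate(n, rho=0.8)            # "High" multicollinearity level
    for name, model in models.items():
        y_hat = model.fit(X, y).predict(X).ravel()
        k = X.shape[1] + 1                 # coefficients + intercept (a simplification)
        aic, bic = aic_bic(y, y_hat, k)
        print(f"n={n:3d}  {name:5s}  AIC={aic:7.2f}  BIC={bic:7.2f}")

A full replication would repeat the simulation many times per setting, vary rho across the Mild, Low, and High ranges, and tune the Ridge/LASSO penalties and PLS components rather than fixing them as above.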