Browsing by Author "Kolawole, R. O."
Now showing 1 - 2 of 2
Item
Comparison of Some Spike-and-Slab Priors for Bayesian Variable Selection in Multiple Linear Regression (Akamai University, U.S.A., 2019) Oyeyemi, G. M.; Olanrewaju, Y. A.; Kolawole, R. O.
Variable selection has been an essential challenge in building a multiple regression model. Excluding influential covariates, or including covariates with zero effect, will affect both the estimation precision and the predictive accuracy of the model. The "spike-and-slab prior" is an increasingly popular variable selection approach in the Bayesian framework, which aids both variable selection and the estimation of regression parameters. In this research, the performance of MCMC implementations of several versions of the spike-and-slab prior for variable selection in normal linear regression models was investigated with regard to posterior inclusion probability, using simulated data under different settings (independent and correlated covariates, different variance scales, and varying sample sizes). Evidence from the simulation study revealed that the selected priors perform similarly under the independent and correlated setups, but the standard errors of the coefficient estimates are higher for correlated covariates than for independent covariates. The mean estimates of the coefficients approach the true coefficient values as the sample size increases under all priors considered, and the posterior inclusion probability depends on the variance of the slab component.

Item
REGULARIZATION TECHNIQUES IN MULTIPLE LINEAR REGRESSION IN THE PRESENCE OF MULTICOLLINEARITY (Faculty of Physical Sciences, Federal University of Lafia, Nigeria, 2020) Oyegoke, O. A.; Oyeyemi, G. M.; Adeleke, M. O.; Kolawole, R. O.
Multicollinearity has been a serious problem in regression analysis. Ordinary least squares (OLS) regression may produce highly variable estimates of the regression coefficients in the presence of multicollinearity.
The Least Absolute Shrinkage and Selection Operator (LASSO), Ridge Regression (RR), and Partial Least Squares (PLS) are well-established methods that reduce the variability of the estimates by shrinking the coefficients, and at the same time produce interpretable models by shrinking some coefficients. The performances of the LASSO, Ridge Regression, PLS, and OLS estimators were evaluated using the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC) in the presence of multicollinearity via Monte Carlo simulation. The simulations were carried out for different sample sizes, n = 10, 50, 100, 150, and for three levels of multicollinearity: mild (0.1 – 0.3), low (0.4 – 0.6), and high (0.7 – 0.9). OLS gave poor parameter estimates and produced incorrect inferences; the LASSO estimator performed best overall, while PLS is most efficient when the number of variables exceeds the sample size.
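As a concrete illustration of the shrinkage idea described in the second abstract, the sketch below fits ridge regression in closed form, beta = (X'X + lambda*I)^(-1) X'y, on two nearly collinear predictors. It is a minimal pure-Python sketch, not the authors' simulation code; the data and function name are hypothetical, and p is fixed at 2 so the 2x2 inverse can be written out by hand.

```python
# Hedged sketch (hypothetical data): ridge regression with p = 2,
# beta = (X'X + lam*I)^(-1) X'y.  With lam = 0 this reduces to OLS,
# whose coefficients are unstable when the predictors are collinear;
# a small positive lam shrinks and stabilizes them.

def ridge_2d(X, y, lam):
    """X: list of [x1, x2] rows; y: list of responses; lam: ridge penalty."""
    # Build X'X and X'y by hand for the 2-predictor case.
    s11 = sum(r[0] * r[0] for r in X)
    s12 = sum(r[0] * r[1] for r in X)
    s22 = sum(r[1] * r[1] for r in X)
    t1 = sum(r[0] * yi for r, yi in zip(X, y))
    t2 = sum(r[1] * yi for r, yi in zip(X, y))
    # Invert the 2x2 matrix X'X + lam*I.
    a, b, c, d = s11 + lam, s12, s12, s22 + lam
    det = a * d - b * c
    return [(d * t1 - b * t2) / det, (a * t2 - c * t1) / det]

# Nearly collinear predictors (x2 tracks x1), response roughly y = 2*x1.
X = [[1.0, 1.01], [2.0, 1.98], [3.0, 3.02], [4.0, 3.99]]
y = [2.1, 3.9, 6.2, 7.8]

ols = ridge_2d(X, y, 0.0)    # lam = 0: OLS; large, offsetting coefficients
ridge = ridge_2d(X, y, 1.0)  # lam = 1: shrunken, stable coefficients
```

The OLS fit splits the effect between the two collinear columns in a large, unstable way (the individual coefficients are far from 2 even though their sum is close to it), while the ridge penalty pulls both coefficients toward a small, stable solution. This is the "high variability of OLS under multicollinearity" that the abstract's simulation study quantifies with AIC and BIC.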