This thesis presents new linear regression estimators that perform simultaneous model parameter estimation and hyperparameter tuning within a Bayesian framework, including settings where the regression coefficients are sparse or the predictors have a natural grouping structure. We examine three shrinkage estimators: the Bayesian horseshoe, Bayesian ridge, and Bayesian Lasso. We also introduce an adaptive estimator that performs well across varying sparsity levels and signal-to-noise ratios, making it a reasonable default estimator when no prior knowledge about the sparsity level of the problem is available.