School: University of the People
Course: MATH 1281
Subject: Statistics
Date: Sep 26, 2023
Uploaded by wallycamara2014 on coursehero.com

Similarities between adjusted R-squared and R-squared
R-squared and adjusted R-squared are both statistical metrics used to assess a multiple regression model's goodness of fit, and both are important and produce valuable results.
Differences between adjusted R-squared and R-squared
R-squared is a statistical indicator of the percentage of the variance in the dependent variable that is accounted for by the independent variable or variables in a regression model. For example, a value of 0.7 means the independent variables explain 70% of the variation in the target variable. R-squared always lies in the range of 0 to 1, and a value of 1 indicates that the model explains all of the variation in the dependent variable.
Adjusted R-squared is a variant of R-squared that takes the number of predictors in the model into account and penalizes unnecessary variables, offering a more accurate assessment of the model's goodness of fit, particularly when there are several predictors. If adding a new independent variable does not raise R-squared enough to justify the extra predictor, the value of adjusted R-squared will actually fall. Consequently, if we include a purely random independent variable in our model, the difference between the R-squared and adjusted R-squared values becomes apparent (Bhandari, 2023).
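To make the definition concrete, R-squared can be computed as one minus the ratio of the residual sum of squares to the total sum of squares. The sketch below uses made-up observed values and predictions (hypothetical numbers, not from the source) to illustrate the calculation:

```python
import numpy as np

# Hypothetical data: observed values and a model's predictions
y = np.array([3.0, 5.0, 7.0, 9.0, 11.0])      # observed values of the dependent variable
y_hat = np.array([3.2, 4.8, 7.1, 8.9, 11.0])  # model predictions

ss_res = np.sum((y - y_hat) ** 2)        # residual sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)     # total sum of squares
r_squared = 1 - ss_res / ss_tot
print(round(r_squared, 4))               # close to 1: predictions track the data well
```

Here the predictions sit very close to the observed values, so R-squared is near 1; a model that predicted nothing better than the mean of y would score near 0.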
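The penalty for extra predictors can be seen directly in the standard adjustment formula, adjusted R² = 1 − (1 − R²)(n − 1)/(n − k − 1), where n is the sample size and k the number of predictors. A minimal sketch (the sample sizes and predictor counts are illustrative, not from the source):

```python
def adjusted_r_squared(r_squared, n, k):
    """Adjusted R-squared: penalizes the number of predictors (k)
    relative to the sample size (n)."""
    return 1 - (1 - r_squared) * (n - 1) / (n - k - 1)

# Same raw R-squared, but more predictors -> lower adjusted R-squared
print(round(adjusted_r_squared(0.70, n=50, k=3), 4))   # 0.6804
print(round(adjusted_r_squared(0.70, n=50, k=10), 4))  # 0.6231
```

With R-squared held fixed at 0.70, moving from 3 to 10 predictors lowers the adjusted value, which is exactly the penalty for model complexity described above.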
Which one is expected to be higher?
R-squared will typically be higher than adjusted R-squared, because R-squared credits every independent variable in the model with explaining variance in the dependent variable: the more variables we add, the more variability the model appears to explain. Adjusted R-squared, by contrast, penalizes complexity by taking the number of predictors in the model into account.
Which one measures more accurately?
In my opinion, adjusted R-squared is the better fit when assessing the robustness of a linear regression model. The reason is that it provides a more accurate measurement of the model's predictive capacity by accounting for the number of predictors in the model (Bhandari, 2023).
For instance
Consider a model that forecasts home values based on a variety of variables, such as the location where the house was built, the size of each room, the number of rooms, the age of the home, etc. As we continue to include more features (such as whether the house is painted, the number of windows and doors, the furniture, etc.), the model becomes more complex. The R-squared value may increase, suggesting a better fit, but it is possible that these factors make no real difference to what the house costs. In that case, the adjusted R-squared value would fall, flagging the extra variables as insignificant additions to the model.
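The house-price scenario can be simulated with ordinary least squares. In this sketch the data are synthetic (a hypothetical price driven only by size), and an irrelevant random predictor is added to show that R-squared can only go up when a column is added, while adjusted R-squared applies a penalty:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 30
size = rng.uniform(50, 200, n)                  # house size (hypothetical, in m^2)
price = 1000 * size + rng.normal(0, 5000, n)    # price depends only on size
noise = rng.normal(size=n)                      # irrelevant random "feature"

def fit_r2(predictors, y):
    """Fit OLS and return (R-squared, adjusted R-squared)."""
    X = np.column_stack([np.ones(len(y)), *predictors])  # add intercept column
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    k = X.shape[1] - 1                                   # number of predictors
    adj = 1 - (1 - r2) * (len(y) - 1) / (len(y) - k - 1)
    return r2, adj

r2_a, adj_a = fit_r2([size], price)         # size only
r2_b, adj_b = fit_r2([size, noise], price)  # size + random noise
print(r2_b >= r2_a)  # True: R-squared never decreases when a predictor is added
print(adj_a, adj_b)  # adjusted R-squared discounts the useless predictor
```

The first comparison is guaranteed by least squares; whether adjusted R-squared actually drops depends on how little the random column happens to explain, which is exactly the diagnostic the text describes.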
Reference:
Bhandari, A. (2023, July 28). Key difference between R-squared and adjusted R-squared for regression analysis. Analytics Vidhya. https://www.analyticsvidhya.com/blog/2020/07/difference-between-r-squared-and-adjusted-r-squared/#h-adjusted-r-squared-statistic
