What Is the Residual Sum of Squares (RSS)?
The residual sum of squares (RSS) is a statistical technique used to measure the amount of variance in a data set that is not explained by a regression model itself. In other words, it measures the variance in the residuals, or error term.
Linear regression is a statistical method that helps determine the strength of the relationship between a dependent variable and one or more other factors, known as independent or explanatory variables.
- The residual sum of squares (RSS) measures the level of variance in the error term, or residuals, of a regression model.
- The smaller the residual sum of squares, the better your model fits your data; the greater the residual sum of squares, the poorer your model fits your data.
- A value of zero means your model is a perfect fit.
- Statistical models are used by investors and portfolio managers to track an investment’s price and use that data to predict future movements.
- The RSS is used by financial analysts in order to estimate the validity of their econometric models.
Understanding the Residual Sum of Squares
In general terms, the sum of squares is a statistical technique used in regression analysis to determine the dispersion of data points. In a regression analysis, the goal is to determine how well a data series can be fitted to a function that might help to explain how the data series was generated. The sum of squares is used as a mathematical way to find the function that best fits (varies least from) the data.
The RSS measures the amount of error remaining between the regression function and the data set after the model has been run. A smaller RSS figure represents a regression function that is well-fit to the data.
The RSS, also known as the sum of squared residuals, essentially determines how well a regression model explains or represents the data in the model.
How to Calculate the Residual Sum of Squares
RSS = Σᵢ₌₁ⁿ (yᵢ − f(xᵢ))²

where:

- yᵢ = the ith value of the variable to be predicted
- f(xᵢ) = the predicted value of yᵢ
- n = the number of observations (the upper limit of summation)
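The formula above can be sketched in a few lines of Python. The observed and predicted values below are made up purely for illustration:

```python
# Residual sum of squares: the sum of squared gaps between each observed
# value y_i and the model's prediction f(x_i).
# These numbers are illustrative, not real data.
observed = [3.1, 4.9, 7.2, 9.0, 11.1]   # y_i
predicted = [3.0, 5.0, 7.0, 9.0, 11.0]  # f(x_i)

rss = sum((y - f) ** 2 for y, f in zip(observed, predicted))
print(round(rss, 2))  # 0.07
```

The closer the predictions track the observations, the smaller this sum becomes; it reaches zero only when every prediction matches its observation exactly.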
Residual Sum of Squares (RSS) vs. Residual Standard Error (RSE)
The residual standard error (RSE) is another statistical term used to describe the difference in standard deviations of observed values versus predicted values as shown by points in a regression analysis. It is a goodness-of-fit measure that can be used to analyze how well a set of data points fit with the actual model.
RSE is computed by dividing the RSS by the number of observations in the sample less 2, and then taking the square root: RSE = √[RSS / (n − 2)]
Minimizing RSS For Optimal Fit
In the realm of regression analysis, minimizing the residual sum of squares is crucial for achieving the best possible fit of a model to the data. Among the different techniques to make this happen, one of the most fundamental and widely used approaches is least squares regression.
Least squares regression is a method that aims to find the line or curve that minimizes the sum of the squared differences. These differences will be between the observed values and the values predicted by the model. In essence, the least squares regression seeks to strike a balance where the model captures the underlying trend of the data while still minimizing the discrepancies between what’s been observed and what’s been predicted.
The process of minimizing RSS through least squares regression involves iteratively adjusting the parameters of the model. This is usually done until the optimal fit is achieved. For a simple linear regression model, this typically entails finding the slope and intercept of the line that best fits the data. In more complex scenarios, the process becomes more intricate but follows many of the same principles.
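For simple linear regression, the RSS-minimizing slope and intercept have a well-known closed-form solution, which can be sketched as follows (the data points are made up for illustration):

```python
# Ordinary least squares for a simple linear model y = slope * x + intercept,
# using the closed-form solution that minimizes the RSS.
# The data are illustrative, not real.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Slope = covariance of x and y divided by variance of x.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

# The RSS of this line is lower than that of any other straight line.
rss = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
print(round(slope, 3), round(intercept, 3), round(rss, 3))
```

Any other choice of slope and intercept for these points would produce a larger RSS, which is exactly what "best fit" means in the least squares sense.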
Limitations of RSS
RSS has some limitations. First, RSS gives equal weight to all residuals. This means that outliers can disproportionately influence the RSS and distort the estimated coefficients. Another downside is that RSS relies on several assumptions. If assumptions such as linearity, independence of errors, or homoscedasticity are violated, RSS may lead to biased estimates and incorrect inferences.
While RSS is useful for evaluating the fit of a single model, comparing the fit across multiple models using RSS alone can be tough. This is because RSS depends on the number of parameters in the model. It isn’t really meant to compare models with a different number of parameters.
Last, while RSS is easy to compute and interpret, it provides limited insight into the underlying structure of the data. In cases where understanding the relationship between predictors and the response variable is important, there may be better metrics to use. In some ways, RSS can act somewhat like a black box where the relationships aren’t entirely known; only the end value is of most importance.
Financial markets have increasingly become more quantitatively driven; as such, in search of an edge, many investors are using advanced statistical techniques to aid in their decisions. Big data, machine learning, and artificial intelligence applications further necessitate the use of statistical properties to guide contemporary investment strategies. The residual sum of squares—or RSS statistics—is one of many statistical properties enjoying a renaissance.
Statistical models are used by investors and portfolio managers to track an investment’s price and use that data to predict future movements. The study—called regression analysis—might involve analyzing the relationship in price movements between a commodity and the stocks of companies engaged in producing the commodity.
Finding the residual sum of squares by hand can be difficult and time-consuming. Because it involves a lot of subtracting, squaring, and summing, the calculations can be prone to errors. For this reason, you may decide to use software, such as Excel, to do the calculations.
Any model might have variances between the predicted values and actual results. Although the variances might be explained by the regression analysis, the RSS represents the variances or errors that are not explained.
Since a sufficiently complex regression function can be made to closely fit virtually any data set, further study is necessary to determine whether the regression function is, in fact, useful in explaining the variance of the dataset.
Typically, however, a smaller or lower value for the RSS is ideal in any model since it means there’s less variation in the data set. In other words, the lower the sum of squared residuals, the better the regression model is at explaining the data.
Example of the Residual Sum of Squares
For a simple (but lengthy) demonstration of the RSS calculation, consider the well-known correlation between a country's consumer spending and its GDP. The following chart reflects the published values of consumer spending and gross domestic product for the 27 member states of the European Union. Note that this information may have changed slightly since it was published, but the example of residual sum of squares remains valid.
[Chart: Consumer Spending vs. GDP for EU Member States]
Consumer spending and GDP have a strong positive correlation, and it is possible to predict a country’s GDP based on consumer spending (CS). Using the formula for a best fit line, this relationship can be approximated as:
GDP = 1.3232 x CS + 10447
The units for both GDP and Consumer Spending are in millions of U.S. dollars.
This formula is highly accurate for most purposes, but it is not perfect, due to the individual variations in each country’s economy. The following chart compares the projected GDP of each country, based on the formula above, and the actual GDP as recorded by the World Bank.
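The projection and residual-square calculation can be sketched in Python. The consumer spending and actual GDP figures below are hypothetical placeholders, not the real World Bank values:

```python
# Projected GDP from the article's trendline: GDP = 1.3232 * CS + 10447,
# with both figures in millions of USD.
# The CS and actual-GDP numbers below are hypothetical, for illustration only.
countries = {
    # name: (consumer_spending, actual_gdp)
    "Country A": (200_000, 280_000),
    "Country B": (50_000, 75_000),
}

for name, (cs, actual) in countries.items():
    projected = 1.3232 * cs + 10447
    residual_sq = (projected - actual) ** 2
    print(name, round(projected), round(residual_sq))
```

Summing the residual squares across all countries gives the RSS for the trendline; the best fit line is the one for which that sum is smallest.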
[Table: Projected and Actual GDP Figures for EU Member States, and Residual Squares. Columns: Consumer Spending, Most Recent Value (Millions); GDP, Most Recent Value (Millions); Projected GDP (Based on Trendline); Residual Square (Projected − Actual)²]
The column on the right indicates the residual squares: the squared difference between each projected value and its actual value. The numbers appear large, but their sum is actually lower than the RSS for any other possible trendline. If a different line had a lower RSS for these data points, that line would be the best fit line.
Is the Residual Sum of Squares the Same as R-Squared?
The residual sum of squares (RSS) is the absolute amount of unexplained variation, whereas R-squared expresses the variation explained by the model as a proportion of the total variation.
Is RSS the Same as the Sum of Squared Estimate of Errors (SSE)?
The residual sum of squares (RSS) is also known as the sum of squared estimate of errors (SSE).
What Is the Difference Between the Residual Sum of Squares and Total Sum of Squares?
The total sum of squares (TSS) measures how much variation there is in the observed data, while the residual sum of squares measures the variation in the error between the observed data and modeled values. In statistics, the values for the residual sum of squares and the total sum of squares (TSS) are oftentimes compared to each other.
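The comparison between RSS and TSS is what produces R-squared, via the identity R² = 1 − RSS/TSS. A minimal sketch, using made-up values:

```python
# TSS measures total variation of the observed data around its mean;
# RSS measures the variation the model leaves unexplained.
# R^2 = 1 - RSS / TSS. The data here are illustrative only.
observed = [2.0, 4.0, 6.0, 8.0]
predicted = [2.2, 3.8, 6.1, 7.9]

mean_y = sum(observed) / len(observed)
tss = sum((y - mean_y) ** 2 for y in observed)
rss = sum((y - f) ** 2 for y, f in zip(observed, predicted))

r_squared = 1 - rss / tss
print(round(tss, 2), round(rss, 2), round(r_squared, 4))
```

When RSS is small relative to TSS, R-squared approaches 1, meaning the model explains nearly all of the observed variation.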
Can a Residual Sum of Squares Be Zero?
The residual sum of squares can be zero. The smaller the residual sum of squares, the better your model fits your data; the greater the residual sum of squares, the poorer your model fits your data. A value of zero means your model is a perfect fit.
The Bottom Line
Residual sum of squares quantifies the discrepancy between observed data points and the predictions made by a regression model, calculated as the sum of the squared residuals. Minimizing RSS is a fundamental objective in regression analysis, as it represents the degree to which the model accurately captures the variability in the data.