Our objective is to provide an overview of minimum sum of absolute errors (MSAE) regression, also known as least absolute deviations regression. Although proposed fifty years before the concept of least squares regression, MSAE regression did not receive much attention until the second half of the last century. During this period, several very effective and efficient algorithms were proposed and studied for computing the MSAE estimates of the unknown parameters of the multiple linear regression model. Today, a number of good computer programs are available in the open literature and in publicly available computer packages. Efficient algorithms and computer programs for the selection of models with fewer variables are also available.
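To make the estimation problem concrete, the sketch below shows one standard way to compute MSAE estimates: recast the problem as a linear program by splitting each residual into nonnegative parts. This is an illustrative formulation, not any particular published algorithm from the survey; the function name and the use of SciPy's LP solver are our own choices.

```python
import numpy as np
from scipy.optimize import linprog

def msae_fit(X, y):
    """Fit MSAE (least absolute deviations) regression via linear programming.

    Minimizes sum_i |y_i - x_i' beta| by writing each residual as
    u_i - v_i with u_i, v_i >= 0 and minimizing sum_i (u_i + v_i).
    """
    n, p = X.shape
    # Decision vector: [beta (p, unbounded), u (n, >= 0), v (n, >= 0)]
    c = np.concatenate([np.zeros(p), np.ones(n), np.ones(n)])
    # Equality constraints: X beta + u - v = y
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

# Five points on the line y = x, except one gross outlier at x = 4.
X = np.column_stack([np.ones(5), np.arange(5.0)])
y = np.array([0.0, 1.0, 2.0, 3.0, 100.0])
beta = msae_fit(X, y)  # close to [0, 1]: the outlier is ignored
```

The example also previews the robustness property discussed below: the fitted MSAE line passes through the four uncontaminated points, whereas a least squares fit would be pulled strongly toward the outlier.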
The asymptotic and small sample properties of the MSAE estimators have been studied. Based on these results, formulae for confidence intervals and procedures for testing hypotheses have been developed.
Some of these results have appeared in survey articles in the literature. However, since their publication, a number of new results have appeared in the literature that make the MSAE regression procedure more attractive. For example, an R²-like measure for MSAE regression is now available. We now understand how special characteristics unique to MSAE regression explain its robustness to certain types of outliers. And unlike least squares regression, variations in the values of the response and predictor variables within definable limits do not change the fitted MSAE regression. We highlight these features of MSAE regression in this paper.
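As a hedged illustration of what an R²-like measure for MSAE regression can look like, the sketch below uses a commonly cited analogue of least squares R²: sums of absolute errors replace sums of squared errors, and the sample median replaces the mean as the null (intercept-only) fit. This is our own assumed form for illustration, not necessarily the specific measure referenced above.

```python
import numpy as np

def msae_r2(y, y_hat):
    """An assumed R^2-like measure for MSAE regression.

    Analogue of least squares R^2 with absolute errors in place of
    squared errors and the median as the null fit:
        1 - SAE(fitted) / SAE(median)
    """
    sae_fit = np.sum(np.abs(y - y_hat))          # absolute error of the fit
    sae_null = np.sum(np.abs(y - np.median(y)))  # absolute error of median-only fit
    return 1.0 - sae_fit / sae_null

# A fit that tracks the data closely scores near 1.
y = np.array([1.0, 2.0, 3.0, 5.0])
y_hat = np.array([1.1, 2.0, 2.9, 5.0])
r2 = msae_r2(y, y_hat)  # close to 0.96
```

Like least squares R², this quantity equals 1 for a perfect fit and 0 for a fit no better than the constant (median) model.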