Chapter 5: Model Building and Diagnostics in Regression

https://doi.org/10.1142/9789811200410_0005
      Abstract:

As mentioned in Section 2.3, the regression of y on x1, …, xk, namely E(y | x1, …, xk), represents the ‘best’ approximation (in the sense of smallest mean squared error) of a random variable y as a function of other random variables x1, …, xk. The particular linear model (y, Xβ, σ²I) presents us with a simple setup for studying such a regression function and the associated approximation error. Specifically, the model for a single response variable y becomes

$$E(y \mid x_1, \ldots, x_k) = \beta_0 + \beta_1 x_1 + \cdots + \beta_k x_k, \qquad \operatorname{Var}(y \mid x_1, \ldots, x_k) = \sigma^2,$$
and the observations on y are regarded as uncorrelated random variables (see (1.7)). While Chapter 3 discussed how the parameters β and σ² can be estimated, Chapter 4 considered various inferential issues such as testing of hypotheses, construction of confidence intervals and regions, prediction, and construction of tolerance and prediction intervals, under the additional assumptions that the conditional distribution of y given x1, …, xk is normal and that these observations are independent. This streamlining of inference procedures makes the linear model a very popular choice for studying regression problems…
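
To make the setup concrete, the sketch below (not part of the chapter; the variable names and simulated data are purely illustrative) fits the model (y, Xβ, σ²I) by ordinary least squares and recovers the usual estimates of β and σ² that Chapter 3 refers to.

```python
import numpy as np

# Illustrative sketch: E(y | x_1, ..., x_k) = b0 + b1*x1 + ... + bk*xk,
# Var(y | x_1, ..., x_k) = sigma^2, with uncorrelated observations.
rng = np.random.default_rng(0)
n, k = 100, 3
X = rng.normal(size=(n, k))
beta_true = np.array([1.0, 2.0, -1.0, 0.5])      # intercept plus k slopes (made up)
sigma = 1.5
X_design = np.column_stack([np.ones(n), X])      # prepend the intercept column
y = X_design @ beta_true + rng.normal(scale=sigma, size=n)

# Least-squares estimate of beta and the usual unbiased estimate of sigma^2
beta_hat, *_ = np.linalg.lstsq(X_design, y, rcond=None)
residuals = y - X_design @ beta_hat
sigma2_hat = residuals @ residuals / (n - k - 1)  # divide by n - (k + 1) degrees of freedom

print("beta_hat:", beta_hat)
print("sigma^2 estimate:", sigma2_hat)
```

The residuals computed here are also the starting point for the diagnostic checks that this chapter develops.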