Edgeworth expansion and bootstrap approximation for M-estimators of linear regression parameters with increasing dimensions
General Material Designation
[Thesis]
First Statement of Responsibility
M. A. Tiro
Subsequent Statement of Responsibility
S. N. D. Lahiri, H. A.
PUBLICATION, DISTRIBUTION, ETC.
Name of Publisher, Distributor, etc.
Iowa State University
Date of Publication, Distribution, etc.
1991
PHYSICAL DESCRIPTION
Specific Material Designation and Extent of Item
92
DISSERTATION (THESIS) NOTE
Dissertation or thesis details and type of degree
Ph.D.
Body granting the degree
Iowa State University
Text preceding or following the note
1991
SUMMARY OR ABSTRACT
Text of Note
In this study, we consider two different inference problems in linear regression models. The first problem deals with the model $y_j = x_j'\beta + \epsilon_j$, $j = 1, 2, \ldots, n$, where $y_1, y_2, \ldots, y_n$ are observations; $\epsilon_1, \epsilon_2, \ldots, \epsilon_n$ are independent and identically distributed random variables with a common distribution function; $x_1, x_2, \ldots, x_n$ are known, nonrandom $p$-vectors; and $\beta$ is a $p \times 1$ vector of parameters. Edgeworth expansions for the standardized as well as studentized linear combinations of the least squares estimator are obtained without assuming normal errors. The number of parameters $p$ is allowed to increase with $n$, essentially under the condition $p = O(n^{1/2 - \delta})$ as $n \to \infty$ for any $\delta > 0$. These results extend those of Qumsiyeh (1986). It is also shown that the bootstrap method is second-order correct for the studentized statistics, improving the results of Bickel and Freedman (1983). The second model is a $p$-population model given by $Y_i = X_i\beta_i + \epsilon_i$, $i = 1, 2, \ldots, p$, where $Y_i$ is the $n_i \times 1$ vector of observations from the $i$th population; $\epsilon_i$ is an $n_i \times 1$ random vector; $\beta_i$ is a $k \times 1$ vector of parameters; and $X_i$ is an $n_i \times k$ known, nonrandom matrix. Here $k$ is a fixed positive integer and $n_i \geq 1$ denotes the $i$th sample size, $i = 1, 2, \ldots, p$. This is an extension of Ringland (1980) to a general regression model. Edgeworth expansions and bootstrap approximations for the M-estimator of the linear regression parameters corresponding to a score function $\psi$ are obtained under regularity conditions on $\psi$ and on the error distribution function. This extends the results of Lahiri (1990) from fixed to increasing dimensionality.
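As an illustration of the general technique studied in the first part (not code from the thesis itself), the following hedged sketch shows how the residual bootstrap generates replicates of a studentized linear combination $c'\hat\beta$ of the least squares estimator; the function name, the choice of $B$, and the centring of residuals are assumptions of this sketch.

```python
# Illustrative sketch of the residual bootstrap for a studentized linear
# combination c'beta_hat in the model y_j = x_j' beta + eps_j.
# Hypothetical helper; not taken from the thesis.
import numpy as np

rng = np.random.default_rng(0)

def residual_bootstrap_t(X, y, c, B=999, rng=rng):
    """Return B bootstrap replicates of the studentized statistic for c'beta."""
    n, p = X.shape
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta_hat
    resid = resid - resid.mean()                 # centre the residuals
    XtX_inv = np.linalg.inv(X.T @ X)
    quad = c @ XtX_inv @ c                       # c'(X'X)^{-1}c
    t_stats = np.empty(B)
    for b in range(B):
        # resample centred residuals and refit on the bootstrap sample
        y_star = X @ beta_hat + rng.choice(resid, size=n, replace=True)
        beta_star, *_ = np.linalg.lstsq(X, y_star, rcond=None)
        r_star = y_star - X @ beta_star
        se_star = np.sqrt(r_star @ r_star / (n - p) * quad)
        t_stats[b] = (c @ (beta_star - beta_hat)) / se_star
    return t_stats
```

The empirical quantiles of these replicates approximate the sampling distribution of the studentized statistic; second-order correctness means this approximation improves on the normal approximation by capturing the leading Edgeworth correction term.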
Results of this part remain valid under the condition $p^3/N \to 0$ as $N \to \infty$, where $N$ is the total number of observations.