
Xl = arsd ; E(V) = 0. · 2016-12-16 · ; Xl = arsd ; %1= qxp PX4 4XQ b--Y) yP pxy Note that selecting 2 tQ be a C0hHl-l~ basis Qf I rank decomposition of We define so that E(U) =



where Τ is q × r, X₁ is p × q and X₂ is r × n.

Note that selecting Z to be a column basis of the orthocomplement of the column space of X₁ gives a rank decomposition of I − X₁(X₁′X₁)⁻¹X₁′. We define

U = (X₁′X₁)⁻¹X₁′Y   and   V = Z′Y,

so that

E(U) = ΤX₂ and E(V) = 0.
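Assuming the standard growth curve model Y = X₁ΤX₂ + E with the transformations U and V as in the text, the construction can be sketched in Python with NumPy; all variable names and dimensions below are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

p, q, r, n = 4, 2, 2, 30           # Y is p x n, X1 is p x q, X2 is r x n
X1 = rng.standard_normal((p, q))   # within-individual design
X2 = rng.standard_normal((r, n))   # across-individual design
Tau = rng.standard_normal((q, r))  # parameter matrix
Y = X1 @ Tau @ X2 + 0.1 * rng.standard_normal((p, n))

# U carries the mean structure: E(U) = Tau X2
U = np.linalg.solve(X1.T @ X1, X1.T @ Y)

# Z spans the orthocomplement of the column space of X1, so E(V) = 0
Q, _ = np.linalg.qr(X1, mode='complete')
Z = Q[:, q:]                       # p x (p - q) basis of the orthocomplement
V = Z.T @ Y

# Z'X1 = 0, hence E(V) = Z'X1 Tau X2 = 0
print(np.allclose(Z.T @ X1, 0))   # True
```

Since E(V) = 0 regardless of Τ, the rows of V act as covariables that carry information about the error covariance only.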


(2.3)

i = 1, 2, …, p,

It can be shown that the covariance adjusted estimator


covariables, respectively. However, there are plenty of intermediate estimators utilizing a subset of covariates. Some of these intermediate estimators may be the best among the covariance adjusted estimators when Σ is unknown and replaced with its estimate.

If we use only the jth covariable to improve uᵢ, then we obtain the covariance adjusted estimator

where ρⱼ is the multiple correlation coefficient between the jth covariable and the main variables.

In order to examine the effect of the jth covariable on the estimate of τᵢ, we test the following hypotheses

H₀ⱼ: ρⱼ = 0 against H₁ⱼ: ρⱼ ≠ 0, j = 1, 2, …

It is well known that

S = Y[Iₙ − X₂′(X₂X₂′)⁻¹X₂]Y′ ~ W_p(n − r, Σ),

that is, S follows a Wishart distribution with n − r degrees of freedom, and consequently an estimate of Σ is S/(n − r). The sample multiple correlation
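Assuming the standard distribution theory for the sample multiple correlation (under H₀ⱼ the statistic (n − r − p)/p · R²ⱼ/(1 − R²ⱼ) follows an F(p, n − r − p) distribution), the test can be sketched from the residual cross-products matrix S; the function name and the partitioning of S into main-variable and covariable blocks are illustrative:

```python
import numpy as np

def multiple_corr_F(S, main_idx, j, n_minus_r):
    """Partial F statistic for H0j: rho_j = 0, computed from the
    residual cross-products matrix S (Wishart, n - r degrees of
    freedom).  main_idx indexes the main variables, j the covariable."""
    S_mm = S[np.ix_(main_idx, main_idx)]   # main-variable block of S
    s_mj = S[main_idx, j]                  # cross products with covariable j
    p = len(main_idx)
    # squared sample multiple correlation of covariable j with the main variables
    R2 = s_mj @ np.linalg.solve(S_mm, s_mj) / S[j, j]
    df2 = n_minus_r - p
    # compare with the upper critical value of F(p, df2)
    return (R2 / (1.0 - R2)) * df2 / p
```

The returned statistic would be compared with the upper 5% point of F(p, n − r − p), matching the 5% level used in the covariable selection later in the paper.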


so that

This is a standard multiple regression model. Each column of Vᵢ corresponds to one covariable. We may now use standard procedures of variable selection in linear regression. Note that under the procedure described above the covariables to be selected need not be the same for all main variables.

In summary: (a) For each i, perform the selection of covariables by using the usual methods of variable selection in linear regression. Note that the term involving X₂ is a given fixed part of the model and we only select the columns of Vᵢ.

(b) Based on the variables selected in (a), we obtain the estimates of the regression coefficients by using ordinary least squares regression.
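Step (a) can be sketched as a forward selection with a partial F-to-enter rule, standing in for whichever standard selection routine is used; the function names and the threshold argument are illustrative, not from the paper:

```python
import numpy as np

def rss(X, y):
    """Residual sum of squares of the OLS fit of y on the columns of X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return r @ r

def select_covariables(u, X_fixed, V, f_crit):
    """Forward selection of columns of V in the regression of u on the
    fixed part X_fixed plus selected covariables.  A column enters while
    its partial F statistic exceeds f_crit (e.g. the 5% point)."""
    n = len(u)
    selected = []
    while True:
        base = np.hstack([X_fixed] + [V[:, [k]] for k in selected])
        rss0 = rss(base, u)
        best, best_F = None, f_crit
        for k in range(V.shape[1]):
            if k in selected:
                continue
            rss1 = rss(np.hstack([base, V[:, [k]]]), u)
            df2 = n - base.shape[1] - 1
            F = (rss0 - rss1) / (rss1 / df2)  # partial F for adding column k
            if F > best_F:
                best, best_F = k, F
        if best is None:
            return selected
        selected.append(best)
```

Step (b) is then a single OLS fit of uᵢ on the fixed part together with the selected columns of Vᵢ.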

The procedure was tested using data on 208 bulls born in 1966 [6]. The testing period of the bulls covered the interval 30–365 days of age, and the bulls were weighed every 30 days. The data consist of two breeds: 168 Ayrshire bulls

A number of models for the body weight

w = τ₀ + τ₁t + τ₂t², (1)

where t is time, indicated a satisfactory fit to the data. In model Eq. (1) the within-individual design matrix is

X₁′ = [ 1    1    …  1
        30   60   …  365
        30²  60²  …  365² ]

and in the logarithmic model it is

X₁′ = [ 1       1       …  1
        ln(30)  ln(60)  …  ln(365) ].

For the 168 Ayrshire and 40 Finncattle bulls the across-individual design
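The design matrices can be assembled directly; the exact age grid is an assumption (30-day steps whose displayed columns end at 365), as is the breed-indicator form of the across-individual design:

```python
import numpy as np

# Ages at weighing (days); an assumption -- the displayed columns run 30, 60, ..., 365.
t = np.append(np.arange(30, 331, 30), 365)

# Within-individual design for the quadratic model (1): rows 1, t, t^2.
X1_poly = np.vstack([np.ones_like(t, dtype=float), t, t**2])

# Within-individual design for the logarithmic model: rows 1, ln(t).
X1_log = np.vstack([np.ones_like(t, dtype=float), np.log(t)])

# Across-individual design: 168 Ayrshire followed by 40 Finncattle bulls,
# one indicator row per breed.
X2 = np.vstack([np.concatenate([np.ones(168), np.zeros(40)]),
                np.concatenate([np.zeros(168), np.ones(40)])])
```

Each column of X₂ picks out the breed mean curve for one bull, so Y = X₁ΤX₂ + E fits one polynomial (or logarithmic) curve per breed.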


j     1        2        3        4        5        6        7        8
b̂ⱼ   0.5406   0.5547   0.2352   0.3996   0.3227   0.3390   (?)      0.1470
Fⱼ    7.3890   7.8(?)   1.0384   3.3685   2.08     2.3151   (?)      0.393


S.-G. Htrilg et al. / Linear Algebra and its Applications 289 (1999) 333–342

τ̂_FR =

   193.3432    182.7105
   (?)         (?)
   −0.000(?)   −0.0014
   −57.5645    −55.0341

and SE(τ̂_FR) =

   1.9047     2.0(?)
   0.0410     0.0(?)
   0.000(?)   (?)
   2.5678     5.0688

Utilizing all eight covariates V yields the ML estimates

τ̂_ML =

   180.8659    172.4279
   1.8398      1.9895
   −0.0009     −0.0013
   −53.6056    −51.4197

and SE(τ̂_ML) =

   8.9289     16.7769
   0.0424     0.0797
   (?)        (?)
   (?)        (?)

An examination of the standard errors of τ̂_FR shows a slight decrease in comparison to the full ML method. On this basis we may prefer τ̂_FR instead of the ML-estimates for the bulls data. The method of Fujikoshi and Rao [3] gives the smallest standard errors for the parameter estimates of the Ayrshire bulls (left column of τ̂_FR). The computations for this example were carried out using the IML programming language and the PROC REG procedure of SAS (see [10], pp. 236–240).

In the following simulation study we investigate the performance of the proposed covariable selection method for simulated sets of data. Mutually independent random vectors were generated from the normal distribution

xᵢ ~ N[μ, Σ(ρ)], i = 1, …, m,

where the mean μ = X₁Τ is computed with

X₁′ = [ 1  1  1  1  1
        1  2  3  4  5 ]   and   Τ = ( 10, 2 )′.

The covariance matrix Σ(ρ) takes the stationary AR(1)-structure, σᵢⱼ = σ²ρ^|i−j|, used often in the analysis of growth curve data. We take the autocorrelation parameter ρ = 0, 0.3, 0.6 and 0.9, with the same number of repetitions for each ρ. The F-test in covariable selection is carried out at the 5% level. It appears in
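The data-generating step of the simulation can be sketched as follows; the parameter values Τ = (10, 2)′ are read from a partly garbled display, and the repetition count of 100 is an assumption:

```python
import numpy as np

def ar1_cov(p, rho, sigma2=1.0):
    """Stationary AR(1) covariance matrix: sigma^2 * rho^|i-j|."""
    idx = np.arange(p)
    return sigma2 * rho ** np.abs(idx[:, None] - idx[None, :])

rng = np.random.default_rng(1)
X1 = np.column_stack([np.ones(5), np.arange(1, 6)])  # rows of X1' are (1,...,1) and (1,...,5)
tau = np.array([10.0, 2.0])                          # assumed values from the garbled display
mu = X1 @ tau                                        # mean vector (12, 14, 16, 18, 20)

for rho in (0.0, 0.3, 0.6, 0.9):
    Sigma = ar1_cov(5, rho)
    # 100 repetitions per rho; each row is one simulated individual,
    # to which the covariable selection of the previous section is applied
    sample = rng.multivariate_normal(mu, Sigma, size=100)
```

The same covariable selection routine would then be run on each simulated sample, and its performance summarized across the repetitions.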


(τ̂ᵢ − Τ)′(τ̂ᵢ − Τ)


[1] R.F. Potthoff, S.N. Roy, A generalized multivariate analysis of variance model useful especially for growth curve problems, Biometrika 51 (1964) 313–326.

[2] J.E. Grizzle, D.M. Allen, Analysis of growth and dose response curves, Biometrics 25 (1969) 357–381.

[3] Y. Fujikoshi, C.R. Rao, Selection of covariables in the growth curve model, Biometrika 78 (1991) 779–785.

[4] C.G. Khatri, A note on a MANOVA model applied to problems in growth curves, Ann. Inst. Statist. Math. 18 (1966) 75–86.

[5] E.P. Liski, A growth curve analysis for bulls tested at station, Biometrical J. 29 (1987) 331–343.

[6] E.P. Liski, Detecting influential measurements in a growth curve model, Biometrics 47 (1991) 659–668.

[7] T. Nummi, Estimation in a random effects growth curve model, J. Appl. Statist. 24 (2) (1997) 157–168.

[8] C.R. Rao, Least squares theory using an estimated dispersion matrix and its application to measurement of signals, in: L. LeCam, J. Neyman (Eds.), Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability, vol. 1, University of California Press, Berkeley, 1967, pp. 355–372.

[9] R.J. Muirhead, Aspects of Multivariate Statistical Theory, Wiley, New York, 1982.

[10] R. Khattree, D.N. Naik, Applied Multivariate Statistics with SAS Software, SAS Institute Inc., Cary, NC, USA, 1995.