An approximate degrees of freedom test is suggested for hypotheses of the kind \(H_0: C\Theta_1 M = C\Theta_2 M\) in two independent multivariate linear models \(Y_i = X_i \Theta_i + E_i\), \(i = 1, 2\), under the assumption of error matrix variate normality and heteroscedasticity. It is shown for specific vector choices of th…
Combining Independent Tests in Multivariate Linear Models
By L. P. Zhou; T. Mathew
- Publisher
- Elsevier Science
- Year
- 1994
- Language
- English
- File size
- 457 KB
- Volume
- 51
- Category
- Article
- ISSN
- 0047-259X
Synopsis
A class of independent multivariate linear models is considered, having a common parameter matrix \(\Theta\) in their means, but having different covariance matrices. For testing \(H_0: \Theta = 0\), some test procedures are derived which combine the information from the different models. In the context of the interblock analysis of block designs, simulated powers of some combined tests are reported in the bivariate case, for combining the intra- and interblock tests based on Wilks's \(\Lambda\) criterion. The numerical results indicate that the combined tests offer substantial improvement in power compared to the Wilks's \(\Lambda\) test based only on intrablock information. © 1994 Academic Press, Inc.
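The synopsis describes combining evidence from independent models, each of which admits its own test of \(H_0: \Theta = 0\). The paper derives its own combined procedures; as a rough illustration of the general idea only, the sketch below computes a Wilks's \(\Lambda\) test (with Bartlett's chi-square approximation) separately in two independent models and then pools the two p-values by Fisher's method, a standard generic combining rule that is an assumption here, not the paper's construction. The function names and simulated data are likewise illustrative.

```python
import numpy as np
from scipy import stats

def wilks_lambda_pvalue(Y, X):
    """Approximate p-value for H0: B = 0 in Y = X B + E, via Wilks's Lambda
    with Bartlett's chi-square approximation (a textbook approximation)."""
    n, p = Y.shape
    q = X.shape[1]
    B_hat = np.linalg.lstsq(X, Y, rcond=None)[0]
    E = Y - X @ B_hat
    error_sscp = E.T @ E          # error sums of squares and cross-products
    total_sscp = Y.T @ Y          # error + hypothesis SSCP under H0: B = 0
    lam = np.linalg.det(error_sscp) / np.linalg.det(total_sscp)
    chi2 = -(n - q - (p - q + 1) / 2.0) * np.log(lam)  # Bartlett correction
    return stats.chi2.sf(chi2, df=p * q)

def fisher_combined_pvalue(pvals):
    """Fisher's method: -2 * sum(log p_i) ~ chi-square with 2k df under H0."""
    stat = -2.0 * np.log(np.asarray(pvals)).sum()
    return stats.chi2.sf(stat, df=2 * len(pvals))

# Two independent bivariate models, both simulated with H0 true.
rng = np.random.default_rng(0)
n, p, q = 40, 2, 3
X1, X2 = rng.normal(size=(n, q)), rng.normal(size=(n, q))
Y1, Y2 = rng.normal(size=(n, p)), rng.normal(size=(n, p))
p1 = wilks_lambda_pvalue(Y1, X1)
p2 = wilks_lambda_pvalue(Y2, X2)
p_combined = fisher_combined_pvalue([p1, p2])
```

Fisher's rule treats the two component tests symmetrically; the paper's combined tests instead exploit the structure of the common parameter matrix \(\Theta\), which is why they can outperform a test based on one model's information alone.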
SIMILAR VOLUMES
The null hypothesis that the error vectors in a multivariate linear model are independent is tested against the alternative hypothesis that they are dependent in some specified manner. This dependence is assumed to be due to common random components or autocorrelation over time. The testing problem
Consider the multivariate linear model for the random matrix \(Y_{n \times p} \sim MN(XB, V \otimes \Sigma)\), where B is the parameter matrix, X is a model matrix, not necessarily of full rank, and \(V \otimes \Sigma\) is an \(np \times np\) positive-definite dispersion matrix. This paper presents sufficient conditions on the positive-definite matrix V…
A test of the independence of two sets of variables is developed to have high power against a special family of dependence. In this each set of variables has the structure of a single factor model and the dependence is solely via the correlation \(\rho\) between the underlying latent variables. This is a m…
Abstract: Multivariate phenotypes are frequently encountered in genome-wide association studies (GWAS). Such phenotypes contain more information than univariate phenotypes, but how best to exploit that information to increase the chance of detecting genetic variants with pleiotropic effects is not alw…