Carleton University
Technical Report TR-178
July 1990
Determining Stochastic Dependence for Normally Distributed Vectors Using the Chi-squared Metric
Abstract
A fundamental problem in information theory involves computing and estimating the probability density function associated with a set of random variables. In estimating this density function, one can either assume that the form of the density function is known and that only the parameters characterizing the distribution need to be estimated, or assume that no information about the density function is available. This problem has been extensively studied when the random variables are independent. For dependent discrete random variables, the problem of capturing the dependence between variables has been studied in [2]. The analogous problem for normally distributed continuous random variables has been addressed in [3]. In both of these instances, the determination of the best dependence tree hinges on the well-known Expected Mutual Information Measure (EMIM) metric. Valiveti and Oommen studied the suitability of a chi-squared based metric, in lieu of the EMIM metric, for the discrete variable case [10]. In this paper, we study the use of the chi-squared metric for determining dependence trees for normally distributed random vectors. We show that for such vectors, the chi-squared metric yields the optimal tree, and that this tree is identical to the one obtained using the EMIM metric. The computational gain obtained by using the chi-squared metric is discussed.
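For orientation, the following is a minimal illustrative sketch (not taken from the report) of the dependence-tree construction referred to above. It assumes jointly Gaussian variables, for which the pairwise mutual information reduces to I(X_i; X_j) = -(1/2) ln(1 - rho_ij^2), and it builds the Chow-Liu style dependence tree of [2] as the maximum-weight spanning tree over those pairwise weights; the correlation matrix used is hypothetical. The report's contribution is that a chi-squared based edge weight recovers the same optimal tree for normal vectors.

    import numpy as np

    def gaussian_mutual_information(corr):
        """Pairwise mutual information for jointly Gaussian variables:
        I(X_i; X_j) = -0.5 * ln(1 - rho_ij^2)."""
        rho2 = np.clip(corr ** 2, 0.0, 1.0 - 1e-12)
        mi = -0.5 * np.log(1.0 - rho2)
        np.fill_diagonal(mi, 0.0)
        return mi

    def maximum_spanning_tree(weights):
        """Prim's algorithm on a dense symmetric weight matrix;
        returns the edges (i, j) of the maximum-weight spanning tree."""
        n = weights.shape[0]
        in_tree = [0]
        edges = []
        while len(in_tree) < n:
            best = None
            for i in in_tree:
                for j in range(n):
                    if j not in in_tree and (
                        best is None or weights[i, j] > weights[best[0], best[1]]
                    ):
                        best = (i, j)
            edges.append(best)
            in_tree.append(best[1])
        return edges

    if __name__ == "__main__":
        # Hypothetical 4-variable correlation matrix, for illustration only.
        corr = np.array([[1.0, 0.8, 0.3, 0.1],
                         [0.8, 1.0, 0.5, 0.2],
                         [0.3, 0.5, 1.0, 0.6],
                         [0.1, 0.2, 0.6, 1.0]])
        mi = gaussian_mutual_information(corr)
        print("Dependence tree edges:", maximum_spanning_tree(mi))

Since the spanning-tree step depends only on the ordering of the edge weights, any metric that orders variable pairs the same way as the EMIM yields the same tree; this is the sense in which the chi-squared metric is compared in the paper.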