Simulating a vector that has a known variance matrix
Category: Science > Math
Asked by: mathdumbie-ga
List Price: $25.00
28 Oct 2006 21:13 PDT
Expires: 27 Nov 2006 20:13 PST
Question ID: 777962
How would you simulate a vector of random variables that has a known mean vector and a known variance-covariance matrix?
Re: Simulating a vector that has a known variance matrix
Answered By: hedgie-ga on 29 Oct 2006 07:34 PST
Hi,

An easy way is this:

1) Take n independent normal variables, each with mean 0 and variance 1, where n is the dimension of your space, and arrange them into a vector v.

2) Apply the linear transformation X_i = sum_j L_ij * v_j.

3) L is a matrix you get by diagonalising your given variance-covariance matrix Sigma: write Sigma = Q D Q^T with D diagonal, and take L = Q D^(1/2), so that L L^T = Sigma.

4) X_i = X_i + B_i will shift the means as needed.

re 3)
http://ltcconline.net/greenl/courses/203/MatrixOnVectors/diagonalization.htm
http://en.wikipedia.org/wiki/Diagonalizable_matrix

re 1)
http://en.wikipedia.org/wiki/Multivariate_normal_distribution
http://www-math.mit.edu/~panchenk/class443/lectures/lecture4.pdf
https://netfiles.uiuc.edu/jimarden/www/Classes/STAT571/chapter3.pdf

Hope this will work for you

Hedgie
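The steps above can be sketched in a few lines of NumPy. This is a minimal illustration, not part of the original answer; the mean vector and covariance matrix below are made-up example values, and the eigendecomposition route (numpy.linalg.eigh) stands in for the "diagonalise" step:

```python
import numpy as np

rng = np.random.default_rng(0)

# Example (made-up) target mean vector B and variance-covariance matrix Sigma
mu = np.array([1.0, -2.0, 0.5])
Sigma = np.array([[4.0, 1.0, 0.5],
                  [1.0, 3.0, 0.2],
                  [0.5, 0.2, 2.0]])

# Step 3: diagonalise Sigma = Q D Q^T, then L = Q D^(1/2) so that L L^T = Sigma
d, Q = np.linalg.eigh(Sigma)
L = Q @ np.diag(np.sqrt(d))

# Steps 1-2: n independent standard normals per draw, transformed by L
n_samples = 200_000
v = rng.standard_normal((n_samples, 3))

# Step 4: shift by the mean vector
X = v @ L.T + mu

# Sanity check: empirical mean and covariance should approximate mu and Sigma
print(X.mean(axis=0))
print(np.cov(X, rowvar=False))
```

A Cholesky factor (numpy.linalg.cholesky) works just as well as L here, since any L with L L^T = Sigma gives the right covariance.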
mathdumbie-ga rated this answer:
Complete and concise answer. Perfect.