Subject: A-conjugate vectors and linearly independent vectors
Category: Science > Math
Asked by: goodwasabi-ga
List Price: $3.00
Posted: 15 Apr 2004 07:41 PDT
Expires: 23 Apr 2004 00:37 PDT
Question ID: 330651
If I have a set of A-conjugate vectors (given two vectors a and b, they are said to be A-conjugate if aAb = 0, where A is an N x N matrix), does it follow that they are all linearly independent?
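A quick numerical illustration of the definition, as a sketch in Python with numpy (the matrix A and the vectors a, b below are made-up values chosen only to show the check, not part of the question):

import numpy as np

# Illustrative 2x2 example: a and b are A-conjugate when a^T A b == 0.
A = np.array([[2.0, 0.0],
              [0.0, 1.0]])
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
print(a @ A @ b)   # prints 0.0, so a and b are A-conjugate for this A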
There is no answer at this time. |
Subject: Re: A-conjugate vectors and linearly independent vectors
From: mathtalk-ga on 16 Apr 2004 13:50 PDT
The simplest way for this to fail would be: xAx = 0 for some nonzero vector x. For example, take

A = [ 1   0 ]
    [ 0  -1 ]

and x = (1,1), so that xAx = 1 - 1 = 0. Clearly some special properties of A are needed to make the statement true.

Note that if A = I, then saying a set of nonzero vectors is A-conjugate amounts to saying they are orthogonal (and hence linear independence does follow).

Given your knowledge of the conjugate gradient algorithm, would you care to guess what conditions on A are sufficient for your purpose?

regards, mathtalk-ga
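To make the counterexample and the contrast concrete, here is a small numerical sketch (assuming Python with numpy; the symmetric positive-definite matrix A_spd and the direction vectors are illustrative choices, not taken from the thread):

import numpy as np

# Counterexample: an indefinite A for which a nonzero x is conjugate to itself.
A = np.array([[1.0, 0.0],
              [0.0, -1.0]])
x = np.array([1.0, 1.0])
print(x @ A @ x)   # 0.0, even though x is nonzero

# Contrast: with a symmetric positive-definite A (the conjugate-gradient setting),
# nonzero pairwise A-conjugate vectors are linearly independent.
A_spd = np.array([[4.0, 1.0],
                  [1.0, 3.0]])
p0 = np.array([1.0, 0.0])
v = np.array([0.0, 1.0])
# One Gram-Schmidt-style step to make p1 A-conjugate to p0
p1 = v - (v @ A_spd @ p0) / (p0 @ A_spd @ p0) * p0
print(p0 @ A_spd @ p1)                                   # 0.0: A-conjugate
print(np.linalg.matrix_rank(np.column_stack([p0, p1])))  # 2: linearly independent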
Subject: Re: A-conjugate vectors and linearly independent vectors
From: mathtalk-ga on 16 Apr 2004 20:18 PDT
Technically we need to take x' to mean the transpose of x, so that if x is a row vector, then xAx' = 0 is a scalar (the result of the matrix product of the row x, the matrix A, and the column x').

-- mathtalk-ga
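Written out for the earlier 2x2 example, using this transpose convention (a worked check, added for illustration):

x A x' = sum over i,j of x_i * A_ij * x_j
       = (1)(1)(1) + (1)(0)(1) + (1)(0)(1) + (1)(-1)(1)
       = 1 - 1
       = 0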