
View Full Version : Linear algebra, matrices, and you


TBag
04-21-2006, 03:21 PM
So yes, here is a math problem I'm rather curious about. In this situation, I'm guessing that A is just an n x n identity matrix; would that be correct? If so, how would one go about proving it?

http://img231.imageshack.us/img231/4026/mathq8ab.jpg

Also, for the true or false questions I got
1) True
2) False
3) I'm pretty sure this one is true: if C = A^-1 and B = C^-1, then A = B. I'm not sure, though, so I'll work one out and then edit my answer in.
4) True

edit - I did number 3 with the matrix
1 2
3 4
And it proved the statement true. Is there any instance where it wouldn't?
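A quick numpy sketch of number 3 as read above (my own code, not part of the thread; the variable names are mine): take any invertible A, let C = A^-1 and B = C^-1, and check that B comes back equal to A, not just for the 2 x 2 example [1 2 : 3 4].

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))      # a random 3x3 matrix (invertible with prob. 1)
C = np.linalg.inv(A)                 # C = A^-1
B = np.linalg.inv(C)                 # B = C^-1

# B recovers A up to floating-point rounding, because inverses are unique.
print(np.allclose(A, B))             # True
```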

brandofo
04-21-2006, 03:49 PM
[ QUOTE ]
So yes, here is a math problem that I need to finish by Monday. Please do it for me.

[/ QUOTE ]
FYP

TBag
04-21-2006, 03:55 PM
Dude, it's a bonus problem.

thedustbustr
04-21-2006, 04:08 PM
[ QUOTE ]
I'm guessing that A is just an identity matrix of n x n

[/ QUOTE ]
No need to guess; this is given. Rereading the question would probably be a good way to get started.

TBag
04-21-2006, 04:25 PM
But that's not necessarily true. For a 2 x 2 matrix, the possible matrices I see working in addition to the identity matrix are

Err, have to edit these, I made a mistake:

-1 0
0 -1

0 1
1 0

0 -1
-1 0

For example, if your matrix is [0 -1 : -1 0], multiplying it by its transpose gives

0(0) + (-1)(-1)    0(-1) + (-1)(0)
(-1)(0) + 0(-1)    (-1)(-1) + 0(0)

which is [1 0 : 0 1], the 2 x 2 identity matrix. I dunno if I'm doing something wrong, but I don't think we can prove it's definitely the identity matrix.
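For what it's worth, a small numpy check (mine, not the poster's) that each of the three candidate matrices above really does give the identity when multiplied by its transpose:

```python
import numpy as np

# The three 2 x 2 candidates from the post.
candidates = [
    np.array([[-1, 0], [0, -1]]),
    np.array([[0, 1], [1, 0]]),
    np.array([[0, -1], [-1, 0]]),
]
for M in candidates:
    print(np.array_equal(M @ M.T, np.eye(2)))   # True for each
```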

jason1990
04-21-2006, 07:05 PM
If you regard all n-vectors as n x 1 matrices, then the dot product of two vectors, x and y, is just the matrix product, x^T y. This fact might be helpful.
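A tiny numpy illustration of this hint (the concrete values are mine): treating vectors as n x 1 matrices, the ordinary dot product and the matrix product x^T y agree.

```python
import numpy as np

x = np.array([[1.0], [2.0], [3.0]])   # treat vectors as 3 x 1 matrices
y = np.array([[4.0], [5.0], [6.0]])

dot = float(x.ravel() @ y.ravel())    # ordinary dot product: 32.0
matprod = (x.T @ y)[0, 0]             # x^T y is a 1 x 1 matrix; take its entry
print(dot == matprod)                 # True
```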

TBag
04-21-2006, 08:14 PM
So what about this:

since we're doing dot products, and the matrix A is always a diagonal matrix of 1's or -1's, either running from top left to bottom right or from bottom left to top right,
i.e.
[±1 0 0 : 0 ±1 0 : 0 0 ±1],
[0 0 ±1 : 0 ±1 0 : ±1 0 0], etc.,

We can assume one of two things is happening.

1)
Ax = (A11 * x1, A22 * x2, ..., Ann * xn)
and the same goes for y, replacing the x,

or, if the ones line up from bottom left to top right,
2)
Ax = (A1n * xn, A2(n-1) * x(n-1), ..., An1 * x1)
and the same goes for y,

so regardless of which path we take, in Ax (dot) Ay we end up with (A entry)^2 (x1)(y1) + ... + (A entry)^2 (xn)(yn), which is the same as the regular x (dot) y, because any entry of A squared is 1.
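Here is a numpy check (my own) of that special case, using a 3 x 3 anti-diagonal matrix of -1's. Note it only covers the diagonal/anti-diagonal matrices assumed above, not every possible A:

```python
import numpy as np

A = np.array([[0, 0, -1],
              [0, -1, 0],
              [-1, 0, 0]])            # -1's from bottom left to top right
x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, -5.0, 6.0])

# Each entry of Ax is +/- a permuted entry of x, and the signs square to 1,
# so the dot product is preserved.
print(np.isclose((A @ x) @ (A @ y), x @ y))   # True
```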

cliff
04-21-2006, 09:09 PM
A is an orthogonal matrix, which means A A^T = I = A^T A. For instance,
A = 1/sqrt(2) * [1 1
1 -1]
In fact there are an infinite number of such matrices (in the complex field at least), and the set of all such matrices is closed under multiplication. They show up a lot in the theory of matrices.
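A quick numpy check (mine) of this example and of the closure claim, pairing the Hadamard-type matrix with a second orthogonal matrix of my own choosing:

```python
import numpy as np

A = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
print(np.allclose(A @ A.T, np.eye(2)))        # True: A is orthogonal, not diagonal

B = np.array([[0.0, 1.0], [1.0, 0.0]])        # another orthogonal matrix
P = A @ B
print(np.allclose(P @ P.T, np.eye(2)))        # True: the product is orthogonal too
```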

cliff
04-21-2006, 09:14 PM
A does not have to be diagonal (see my example of a Hadamard matrix in the above post). What you want here is that
Ax . By = (B^T A x) . y for any matrices A and B, by the definition of the inner product.
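A numerical sketch (mine) of this identity with random matrices and vectors:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
x = rng.standard_normal(3)
y = rng.standard_normal(3)

lhs = (A @ x) @ (B @ y)               # Ax . By
rhs = (B.T @ A @ x) @ y               # (B^T A x) . y
print(np.isclose(lhs, rhs))           # True for any A, B
```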

jason1990
04-21-2006, 11:33 PM
As cliff mentioned, A does not have to be diagonal. Frankly, I think you should stop trying to dig into the entries of the matrix A. Think more abstractly. Are you aware of the formula (AB)^T = (B^T)(A^T)?
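A quick random check (my own) of that transpose formula, using rectangular matrices so the shapes make the order B^T A^T clear:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 4))       # (AB)^T is 4 x 2, forcing the order B^T A^T
print(np.allclose((A @ B).T, B.T @ A.T))   # True
```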

dirk
04-25-2006, 07:56 PM
[ QUOTE ]
If you regard all n-vectors as n x 1 matrices, then the dot product of two vectors, x and y, is just the matrix product, x^T y. This fact might be helpful.

[/ QUOTE ]

if you can't get it from this...

TheHusky
04-25-2006, 09:50 PM
In case no one has answered yet:

1) True: x.x is the length of the vector squared.

2) False: set all entries in C = 0; A and B can then be any two matrices.

3) True: A = AI = A(CB) = (AC)B = IB = B

4) True: I assume by square you mean no. of rows = no. of columns, if so this is really obvious

Bonus:

(Ax) . (Ay) = (Ax)^T (Ay) = x^T A^T A y = x^T I y = x^T y = x . y
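To round it off, a numpy sketch (mine, not part of the thread) of the bonus chain using a random orthogonal A built from a QR decomposition:

```python
import numpy as np

rng = np.random.default_rng(3)
# The Q factor of a QR decomposition is an orthogonal matrix.
A, _ = np.linalg.qr(rng.standard_normal((4, 4)))
x = rng.standard_normal(4)
y = rng.standard_normal(4)

print(np.allclose(A.T @ A, np.eye(4)))            # True: A^T A = I
print(np.isclose((A @ x) @ (A @ y), x @ y))       # True: Ax . Ay = x . y
```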