Avoiding inverse when X = A - inv(B)*Y*inv(B)

05-26-2009, 09:15 AM
Like the title says, I have a situation where I have to calculate something like
X = A - inv(B) * Y * inv(B),
where A, B and Y are known p*p matrices (p can be small or very large, depending on the situation). Is it possible to do some kind of manipulation of the equation so that I could calculate X without inverting the matrix B?
There is also the equation
X = X + A' * inv(B) * A,

and I'm not sure whether that one can be calculated without inverting B either.

So, to restate my question: is it possible to calculate both equations without inverting B? If one is possible but the other isn't, should I just take that approach and use it for both equations, or should I still use some other method for the other one where possible?

All matrices are quite dense, and the equations have to be evaluated hundreds or thousands of times.

Thank you.

05-27-2009, 02:30 PM
Well, technically you could avoid the inverse, but in this case I don't think you want to.

Your equation is equivalent to
B(X - A)B = -Y,
i.e. BXB = BAB - Y. Write this as
BP = Q
where P = XB and Q = BAB - Y. You can compute Q directly from A, B, and Y. BP = Q can be solved for P via Gaussian elimination (the p right-hand sides are the p columns of Q; the p solution vectors are the columns of P). P = XB can then be solved for X by transposing it into B'X' = P' and using Gaussian elimination again. By the time you have done all that, your computational effort is going to be just as high as it would have been to invert B.
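A minimal NumPy sketch of this two-solve route (the derivation is repeated in the comments; `np.linalg.solve` plays the role of Gaussian elimination with multiple right-hand sides):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 5
A = rng.standard_normal((p, p))
Y = rng.standard_normal((p, p))
M = rng.standard_normal((p, p))
B = M @ M.T + p * np.eye(p)       # invertible and reasonably conditioned

# B X B = B A B - Y, so with P = X B and Q = B A B - Y:
Q = B @ A @ B - Y
P = np.linalg.solve(B, Q)         # solve B P = Q   (P = X B)
X = np.linalg.solve(B.T, P.T).T   # solve B' X' = P'

# reference result via explicit inversion
Bi = np.linalg.inv(B)
print(np.allclose(X, A - Bi @ Y @ Bi))  # True
```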

Unfortunately, a trick like caching an LU decomposition isn't going to buy you much here: the factorisation only pays off if B is constant, and the right-hand side Q changes whenever Y or A changes, so the solves have to be redone for every problem regardless.

By the way, you don't say what, if anything, is held constant across the problems you intend to solve. If B is constant and only Y and A change, inverting (or factoring) B once and repeatedly evaluating your equation isn't going to be too bad.
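For the constant-B case, one hedged sketch: factor B once with SciPy's `lu_factor` and reuse the factors for every (A, Y) pair, so each evaluation costs only triangular solves rather than a fresh O(p^3) factorisation:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

rng = np.random.default_rng(1)
p = 4
M = rng.standard_normal((p, p))
B = M @ M.T + p * np.eye(p)    # fixed across all problems

lu = lu_factor(B)              # O(p^3) factorisation, done once

def evaluate(A, Y):
    """X = A - inv(B) @ Y @ inv(B), reusing the LU factors of B."""
    C = lu_solve(lu, Y)                # B C = Y    => C = inv(B) Y
    D = lu_solve(lu, C.T, trans=1).T   # B' D' = C' => D = C inv(B)
    return A - D

A = rng.standard_normal((p, p))
Y = rng.standard_normal((p, p))
Bi = np.linalg.inv(B)
print(np.allclose(evaluate(A, Y), A - Bi @ Y @ Bi))  # True
```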

05-31-2009, 02:22 AM
Thanks. I actually solved the problem this way:

Let C = inv(B)*Y <=> BC = Y; solve for C.
Let D = C*inv(B) <=> DB = C <=> B'D' = C'; solve for D'.
Then X = A - D.

All matrices change over time, so I have to do this (or the inversion) again on every recursion. I actually made an error in my first post: p is usually about 1-10, possibly more, but usually something like that. B is positive definite, so I can use a Cholesky factorisation and then solve the systems with multiple right-hand sides.

Is this much slower or faster than just inverting B? One inversion vs. solving three systems of linear equations (there's that other equation too), plus some transposing. I would think it's at least numerically more stable than inverting?
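A minimal SciPy sketch of this scheme, assuming B is symmetric positive definite as described: one Cholesky factorisation per recursion, then triangular solves cover both equations.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(2)
p = 6
A = rng.standard_normal((p, p))
Y = rng.standard_normal((p, p))
M = rng.standard_normal((p, p))
B = M @ M.T + p * np.eye(p)     # symmetric positive definite

cf = cho_factor(B)              # one Cholesky factorisation of B
C = cho_solve(cf, Y)            # B C = Y    => C = inv(B) Y
D = cho_solve(cf, C.T).T        # B D' = C'  => D = C inv(B)  (using B' = B)
X = A - D                       # first equation: A - inv(B) Y inv(B)

X = X + A.T @ cho_solve(cf, A)  # second equation: X + A' inv(B) A

Bi = np.linalg.inv(B)
expected = A - Bi @ Y @ Bi + A.T @ Bi @ A
print(np.allclose(X, expected))  # True
```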
