Solving linear equations
I'm about to start writing a small program that solves n equations in n variables. I assume the easiest way is standard Gaussian elimination: find an equation that contains the first variable, swap it into the first position, zero out all other instances of that variable in the remaining equations, then do the same thing down the line with the second variable and second equation, and so on.
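The scheme described above can be sketched roughly like this (a minimal Python sketch, not a robust solver; names like `forward_eliminate` are my own, and the exact `!= 0` pivot test would want a tolerance with real floating-point data). Note that when a column has no usable pivot it skips to the next column rather than stopping, which matters for the question that follows:

```python
def forward_eliminate(A, b):
    """Forward elimination on a system A x = b.
    A is a list of row-lists, b the list of right-hand sides;
    both are modified in place and returned."""
    n_rows, n_cols = len(A), len(A[0])
    pivot_row = 0
    for col in range(n_cols):
        # Find a row at or below pivot_row with a nonzero entry in this column.
        swap = next((r for r in range(pivot_row, n_rows) if A[r][col] != 0), None)
        if swap is None:
            continue  # no pivot in this column; move on to the next variable
        # Swap the pivot equation up into position.
        A[pivot_row], A[swap] = A[swap], A[pivot_row]
        b[pivot_row], b[swap] = b[swap], b[pivot_row]
        # Zero out this variable in every equation below the pivot.
        for r in range(pivot_row + 1, n_rows):
            factor = A[r][col] / A[pivot_row][col]
            A[r] = [a - factor * p for a, p in zip(A[r], A[pivot_row])]
            b[r] -= factor * b[pivot_row]
        pivot_row += 1
    return A, b
```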
Now, some advice I got a while back was: if you're looking for, say, the second variable but it no longer appears anywhere, because the earlier additions and subtractions zeroed it out everywhere, then you can stop and claim there are infinitely many solutions, since that variable could take any value and you'll eventually end up with an equation like 0 = 0. But that isn't necessarily true, is it? My brain is too fried at the moment to come up with an example, but isn't it still possible that, if you continue on with the rest of the variables, you could end up with an equation like 7 = 0, meaning there are no solutions at all?
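Edit: managed to work out a small example after all. With these made-up numbers, every row's left-hand side is a multiple of the first row's, so after eliminating x there is no y (or z) column left, yet the system is inconsistent:

```python
# Three equations in x, y, z:
#   x +  y +  z = 1
#  2x + 2y + 2z = 2
#  3x + 3y + 3z = 5
A = [[1.0, 1.0, 1.0],
     [2.0, 2.0, 2.0],
     [3.0, 3.0, 3.0]]
b = [1.0, 2.0, 5.0]

# Eliminate x from rows 1 and 2 using row 0.
for r in (1, 2):
    factor = A[r][0] / A[0][0]
    A[r] = [a - factor * p for a, p in zip(A[r], A[0])]
    b[r] -= factor * b[0]

print(A[1], b[1])  # [0.0, 0.0, 0.0] 0.0  -> reads as "0 = 0"
print(A[2], b[2])  # [0.0, 0.0, 0.0] 2.0  -> reads as "0 = 2": no solution
```

So stopping at the 0 = 0 row and declaring infinitely many solutions would miss the contradictory 0 = 2 row sitting right below it.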
I ask this because the algorithm I've got right now only worries about the first variable in the first row, the second variable in the second row, and so on, and I'm wondering whether I need to change it to accommodate this possibility, which will be a pain.
Errr... that didn't quite help. I know what it means if you end up with 0 = 7 (no solutions) or 0 = 0 (infinitely many solutions); I just wasn't sure whether it's possible to get both. Say you have three equations in three unknowns: is it possible, after doing all the matrix arithmetic, to end up with one row reading 0 = 0 and another reading 0 = 7?
The problem is that the infinite case is quick and easy to check for. I just wasn't sure whether my program needs to keep going to see whether there's eventually also an invalid equation, meaning no solutions instead, which will take changing my algorithm a bit.
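For what it's worth, one way to get the check without restructuring the column-by-column loop much: finish the elimination (skipping any column that has no pivot left), then make a single pass over the reduced rows. Any row whose coefficients are all zero but whose right-hand side isn't means no solution; failing that, fewer pivot rows than variables means infinitely many. A sketch, with a hypothetical `classify` helper and a small `eps` tolerance for float comparisons:

```python
def classify(A, b, eps=1e-12):
    """Classify a system that has already been forward-eliminated.
    Returns 'none', 'infinite', or 'unique'."""
    n_vars = len(A[0])
    pivots = 0
    for row, rhs in zip(A, b):
        if all(abs(a) < eps for a in row):
            if abs(rhs) > eps:
                return "none"      # 0 = c with c != 0: inconsistent row
        else:
            pivots += 1            # row still constrains some variable
    return "unique" if pivots == n_vars else "infinite"
```

The key point is that the inconsistency test runs over every row before "infinite" is ever returned, so a 0 = 7 row hiding below a 0 = 0 row is always caught first.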