     3x + 2y - 5z      =  3
    -2x -  y + 3z +  w =  0
     -x +  y      + 6w = 11
      x +  y - 2z +  w =  3

We can perform any of these operations on the system:

- (1) Interchange two equations;
- (2) Multiply each term of an equation by a nonzero constant;
- (3) Replace an equation by adding to it a multiple of another equation.

To use the *Gauss-Jordan* technique,
a variant of Gaussian elimination,
choose an equation with a coefficient of 1 in the first column.
(It may be necessary to first create one,
by dividing each term of one of the equations by its coefficient of x,
or by adding a multiple of one of the equations to another to get the 1.)
This equation is called the *pivot*,
and it should be moved to the top position.
Use it to eliminate the x term in the other equations.

Repeat this procedure for each of the columns. The solution given below illustrates Gauss-Jordan elimination.

     3x + 2y - 5z      =  3
     -x +  y      + 6w = 11
    -2x -  y + 3z +  w =  0
      x +  y - 2z +  w =  3
              ~>
      x +  y - 2z +  w =  3
     -x +  y      + 6w = 11
    -2x -  y + 3z +  w =  0
     3x + 2y - 5z      =  3
              ~>
      x +  y - 2z +  w =  3
          2y - 2z + 7w = 14
           y -  z + 3w =  6
          -y +  z - 3w = -6
              ~>
      x +  y - 2z +  w =  3
           y -  z + 3w =  6
          2y - 2z + 7w = 14
          -y +  z - 3w = -6
              ~>
      x      -  z - 2w = -3
           y -  z + 3w =  6
                     w =  2
              ~>
      x      -  z      =  1
           y -  z      =  0
                     w =  2

This gives us the final solution: x = z + 1, y = z, w = 2.

We do not have to write down the variables each time, provided we keep careful track of their positions. The solution using matrices to represent the system looks like this.

    [  3  2 -5  0   3 ]
    [ -1  1  0  6  11 ]
    [ -2 -1  3  1   0 ]
    [  1  1 -2  1   3 ]
            ~>
    [  1  1 -2  1   3 ]
    [ -1  1  0  6  11 ]
    [ -2 -1  3  1   0 ]
    [  3  2 -5  0   3 ]
            ~>
    [  1  1 -2  1   3 ]
    [  0  2 -2  7  14 ]
    [  0  1 -1  3   6 ]
    [  0 -1  1 -3  -6 ]
            ~>
    [  1  1 -2  1   3 ]
    [  0  1 -1  3   6 ]
    [  0  2 -2  7  14 ]
    [  0 -1  1 -3  -6 ]
            ~>
    [  1  0 -1 -2  -3 ]
    [  0  1 -1  3   6 ]
    [  0  0  0  1   2 ]
    [  0  0  0  0   0 ]
            ~>
    [  1  0 -1  0   1 ]
    [  0  1 -1  0   0 ]
    [  0  0  0  1   2 ]
    [  0  0  0  0   0 ]

Finally, we put the variables back in to get the solution: x - z = 1, y - z = 0, w = 2. This can be rewritten in the form x = z + 1, y = z, w = 2.

The answer shows that there are infinitely many solutions. Any value can be chosen for z, and then using the corresponding values for x, y, and w gives a solution.

We can think of z as an "independent variable" and x, y, w as "dependent variables".
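The whole procedure can be carried out mechanically. Below is a minimal sketch in pure Python using exact `Fraction` arithmetic; the `rref` helper is hand-rolled for illustration (not a library routine), and it scales each pivot row to get the leading 1 rather than searching for a coefficient of 1. It is applied to the augmented matrix of the system worked above.

```python
from fractions import Fraction

def rref(rows):
    """Reduce a matrix (a list of rows) to reduced row echelon form.
    Returns the reduced matrix and the list of pivot-column indices."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots, r = [], 0
    for c in range(len(m[0])):
        # Find a row at or below r with a nonzero entry in column c.
        hit = next((i for i in range(r, len(m)) if m[i][c]), None)
        if hit is None:
            continue
        m[r], m[hit] = m[hit], m[r]         # (1) interchange two rows
        m[r] = [x / m[r][c] for x in m[r]]  # (2) scale to get a leading 1
        for i in range(len(m)):             # (3) add multiples to clear column c
            if i != r and m[i][c]:
                m[i] = [a - m[i][c] * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
    return m, pivots

# The augmented matrix of the system worked above.
A = [[ 3,  2, -5, 0,  3],
     [-1,  1,  0, 6, 11],
     [-2, -1,  3, 1,  0],
     [ 1,  1, -2, 1,  3]]
R, pivots = rref(A)
for row in R:
    print([int(x) for x in row])
# Rows come out as [1, 0, -1, 0, 1], [0, 1, -1, 0, 0], [0, 0, 0, 1, 2],
# [0, 0, 0, 0, 0] -- that is, x - z = 1, y - z = 0, w = 2, as computed by hand.
```

The `Fraction` type keeps every intermediate entry exact, so no rounding can disturb the zero tests that decide where the pivots go.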

**Algorithm 1.**
To test whether the vectors
**v**_{1},
**v**_{2}, ...,
**v**_{k}
are linearly independent or linearly dependent:

- Solve the equation x_{1}**v**_{1} + x_{2}**v**_{2} + ... + x_{k}**v**_{k} = **0**.

Note: the vectors end up as *columns* in a matrix.

If the only solution is the trivial one (all x_{i} = 0), then the vectors are linearly independent; if there is a nonzero solution, then the vectors are linearly dependent.
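Algorithm 1 can be sketched in code: put the vectors in as columns, row reduce, and check whether every column receives a leading 1. A minimal pure-Python version follows; the `rref` helper and the example vectors are mine, chosen for illustration.

```python
from fractions import Fraction

def rref(rows):
    """Reduce a matrix (a list of rows) to reduced row echelon form.
    Returns the reduced matrix and the list of pivot-column indices."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots, r = [], 0
    for c in range(len(m[0])):
        hit = next((i for i in range(r, len(m)) if m[i][c]), None)
        if hit is None:
            continue
        m[r], m[hit] = m[hit], m[r]         # interchange rows
        m[r] = [x / m[r][c] for x in m[r]]  # scale to get a leading 1
        for i in range(len(m)):             # clear column c elsewhere
            if i != r and m[i][c]:
                m[i] = [a - m[i][c] * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
    return m, pivots

def independent(vectors):
    """The vectors are linearly independent iff every column of the
    matrix having them as *columns* receives a leading 1."""
    matrix = [list(row) for row in zip(*vectors)]  # vectors become columns
    _, pivots = rref(matrix)
    return len(pivots) == len(vectors)

v1, v2, v3 = (1, 2, 3), (0, 1, 1), (1, 3, 4)  # note v3 = v1 + v2
print(independent([v1, v2]))      # True: only the zero solution
print(independent([v1, v2, v3]))  # False: a nonzero solution exists
```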

**Algorithm 2.**
To check that the vectors
**v**_{1},
**v**_{2}, ...,
**v**_{k}
span the subspace W:

- Show that for every vector **b** in W there is a solution to x_{1}**v**_{1} + x_{2}**v**_{2} + ... + x_{k}**v**_{k} = **b**.
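For a single candidate vector **b**, membership in the span can be tested by comparing pivot counts before and after adjoining **b** as an extra column: a new pivot in that column would mean the system is inconsistent. This is a sketch only (the helper and the example data are mine), since checking *every* **b** in W depends on how W is described.

```python
from fractions import Fraction

def rref(rows):
    """Reduce a matrix (a list of rows) to reduced row echelon form.
    Returns the reduced matrix and the list of pivot-column indices."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots, r = [], 0
    for c in range(len(m[0])):
        hit = next((i for i in range(r, len(m)) if m[i][c]), None)
        if hit is None:
            continue
        m[r], m[hit] = m[hit], m[r]         # interchange rows
        m[r] = [x / m[r][c] for x in m[r]]  # scale to get a leading 1
        for i in range(len(m)):             # clear column c elsewhere
            if i != r and m[i][c]:
                m[i] = [a - m[i][c] * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
    return m, pivots

def in_span(vectors, b):
    """True iff b is a linear combination of the given vectors."""
    A = [list(row) for row in zip(*vectors)]    # vectors as columns
    Ab = [row + [x] for row, x in zip(A, b)]    # augmented matrix [A | b]
    return len(rref(A)[1]) == len(rref(Ab)[1])  # no new pivot in b's column

v1, v2 = (1, 0, 1), (0, 1, 1)
print(in_span([v1, v2], [2, 3, 5]))  # True: b = 2*v1 + 3*v2
print(in_span([v1, v2], [1, 1, 1]))  # False: the system is inconsistent
```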

**Algorithm 3.**
To find a basis for the subspace
S(
**v**_{1},
**v**_{2}, ...,
**v**_{k} )
by deleting vectors:

- Construct the matrix whose *columns* are the coordinate vectors for the **v**'s

- Row reduce

- Keep the vectors whose column contains a leading 1
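The three steps above can be sketched directly (hand-rolled `rref` helper, illustrative vectors of my choosing):

```python
from fractions import Fraction

def rref(rows):
    """Reduce a matrix (a list of rows) to reduced row echelon form.
    Returns the reduced matrix and the list of pivot-column indices."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots, r = [], 0
    for c in range(len(m[0])):
        hit = next((i for i in range(r, len(m)) if m[i][c]), None)
        if hit is None:
            continue
        m[r], m[hit] = m[hit], m[r]         # interchange rows
        m[r] = [x / m[r][c] for x in m[r]]  # scale to get a leading 1
        for i in range(len(m)):             # clear column c elsewhere
            if i != r and m[i][c]:
                m[i] = [a - m[i][c] * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
    return m, pivots

def basis_by_deletion(vectors):
    """Keep exactly the vectors whose column contains a leading 1."""
    matrix = [list(row) for row in zip(*vectors)]  # coordinate vectors as columns
    _, pivots = rref(matrix)
    return [vectors[c] for c in pivots]

vs = [(1, 2, 3), (0, 1, 1), (1, 3, 4)]  # the third is the sum of the first two
print(basis_by_deletion(vs))            # [(1, 2, 3), (0, 1, 1)]
```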

**Algorithm 4.**
To find a basis for the solution space of the system
A **x** = **0** :

- Row reduce A
- Identify the independent variables in the solution
- In turn, let one of these variables be 1, and all others be 0
- The corresponding solution vectors form a basis
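As a sketch of Algorithm 4, here it is applied to the homogeneous version of the system worked earlier (A**x** = **0**, so the constants column is dropped); the `rref` helper is hand-rolled for illustration.

```python
from fractions import Fraction

def rref(rows):
    """Reduce a matrix (a list of rows) to reduced row echelon form.
    Returns the reduced matrix and the list of pivot-column indices."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots, r = [], 0
    for c in range(len(m[0])):
        hit = next((i for i in range(r, len(m)) if m[i][c]), None)
        if hit is None:
            continue
        m[r], m[hit] = m[hit], m[r]         # interchange rows
        m[r] = [x / m[r][c] for x in m[r]]  # scale to get a leading 1
        for i in range(len(m)):             # clear column c elsewhere
            if i != r and m[i][c]:
                m[i] = [a - m[i][c] * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
    return m, pivots

def nullspace_basis(A):
    """One basis vector per independent (free) variable: set it to 1 and
    the other free variables to 0, then read the pivot variables off R."""
    R, pivots = rref(A)
    n = len(A[0])
    basis = []
    for f in (c for c in range(n) if c not in pivots):
        v = [Fraction(0)] * n
        v[f] = Fraction(1)
        for r, p in enumerate(pivots):
            v[p] = -R[r][f]
        basis.append(v)
    return basis

A = [[ 3,  2, -5, 0],   # coefficient matrix of the system solved above
     [-1,  1,  0, 6],
     [-2, -1,  3, 1],
     [ 1,  1, -2, 1]]
print([[int(x) for x in v] for v in nullspace_basis(A)])  # [[1, 1, 1, 0]]
```

Here z is the single independent variable; setting z = 1 gives x = 1, y = 1, w = 0, so the solution space of the homogeneous system is spanned by (1, 1, 1, 0).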

**Algorithm 5.**
To find a simplified basis for the subspace
S(
**v**_{1},
**v**_{2}, ...,
**v**_{k} ) :

- Construct the matrix whose *rows* are the coordinate vectors for the **v**'s

- Row reduce

- The nonzero rows form a basis
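A sketch of Algorithm 5 on the same illustrative vectors used above (hand-rolled `rref` helper):

```python
from fractions import Fraction

def rref(rows):
    """Reduce a matrix (a list of rows) to reduced row echelon form.
    Returns the reduced matrix and the list of pivot-column indices."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots, r = [], 0
    for c in range(len(m[0])):
        hit = next((i for i in range(r, len(m)) if m[i][c]), None)
        if hit is None:
            continue
        m[r], m[hit] = m[hit], m[r]         # interchange rows
        m[r] = [x / m[r][c] for x in m[r]]  # scale to get a leading 1
        for i in range(len(m)):             # clear column c elsewhere
            if i != r and m[i][c]:
                m[i] = [a - m[i][c] * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
    return m, pivots

def simplified_basis(vectors):
    """Row reduce the matrix whose *rows* are the vectors;
    the nonzero rows that remain form a basis."""
    R, _ = rref([list(v) for v in vectors])
    return [row for row in R if any(row)]

vs = [(1, 2, 3), (0, 1, 1), (1, 3, 4)]
print([[int(x) for x in row] for row in simplified_basis(vs)])
# [[1, 0, 1], [0, 1, 1]] -- a simpler basis for the same subspace
```

Unlike Algorithm 3, the basis produced here generally consists of new, simpler vectors rather than a subset of the original ones.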