
#### 20F Discussion Section 3 — Josh Tobin: http://www.math.ucsd.edu/~rjtobin/

### Section 1.4: The Matrix Equation Ax = b

- This section is about solving the "matrix equation" Ax = b, where A is an m × n matrix and b is a column vector with m entries (both given in the question), and x is an unknown column vector with n entries (which we are trying to solve for). The first thing to know is what Ax means: it means we are multiplying the matrix A times the vector x. How do we multiply a matrix by a vector? We use the "row times column" rule; see the bottom of page 38 for examples.
- Solving Ax = b is the same as solving the system described by the augmented matrix [A | b].
- Ax = b has a solution if and only if b is a linear combination of the columns of A.
- Theorem 4 is very important: it tells us that the following statements are either all true or all false, for any m × n matrix A:
  (a) For every b, the equation Ax = b has a solution.
  (b) Every column vector b (with m entries) is a linear combination of the columns of A.
  (c) The columns of A span R^m (this is just a restatement of (b), once you know what the word "span" means).
  (d) A has a pivot in every row.
  This theorem is useful because it means that if we want to know whether Ax = b has a solution for every b, we just need to check whether A has a pivot in every row. Note: if A does not have a pivot in every row, that does *not* mean that Ax = b has no solution for some given vector b. It just means that there are *some* vectors b for which Ax = b has no solution.
- Finally, it is very useful to know that multiplying a matrix by a vector has the following nice properties:
  (a) A(u + v) = A(u) + A(v), for vectors u, v;
  (b) A(cu) = cA(u), for vectors u and scalars c.

### Section 1.5: Solution Sets of Linear Systems
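Before turning to solution sets, the Section 1.4 facts above can be checked numerically. A minimal NumPy sketch with a made-up 3 × 2 matrix (not one from the notes): the pivot count equals the rank, so "a pivot in every row" means rank(A) = m, and for a specific b, consistency amounts to rank([A | b]) = rank(A), i.e. b is a linear combination of the columns of A.

```python
import numpy as np

# Hypothetical 3x2 matrix and vectors (made-up example).
A = np.array([[1, -4],
              [3, 2],
              [0, 1]])
u = np.array([1, 2])
v = np.array([3, -1])
c = 5.0

# "Row times column" rule: each entry of A @ u is a row of A times u.
print(A @ u)                            # -> [-7  7  2]

# Linearity properties: A(u+v) = Au + Av and A(cu) = c(Au).
assert np.array_equal(A @ (u + v), A @ u + A @ v)
assert np.array_equal(A @ (c * u), c * (A @ u))

# Theorem 4 check: A has a pivot in every row iff rank(A) == m.
m, n = A.shape
print(np.linalg.matrix_rank(A) == m)    # -> False: a 3x2 matrix has at most 2 pivots

# For a specific b, Ax = b is consistent iff rank([A | b]) == rank(A).
b = A @ np.array([2, 1])                # consistent by construction
aug = np.column_stack([A, b])
assert np.linalg.matrix_rank(aug) == np.linalg.matrix_rank(A)
```

Since this A does not have a pivot in every row, Ax = b fails for some b; but, as the note above stresses, it still succeeds for the particular b constructed here.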

- A *homogeneous* system is one that can be written in the form Ax = 0. Equivalently, a homogeneous system is any system Ax = b where x = 0 is a solution (notice that this means that b = 0, so both definitions match). The solution x = 0 is called the *trivial solution*. A solution x is *non-trivial* if x ≠ 0.
- The homogeneous system Ax = 0 has a non-trivial solution if and only if the equation has at least one free variable (or, equivalently, if and only if A has a column with no pivot).
- *Parametric vector form*: Let's say you have found the solution set to a system, and the free variables are x3, x4, x5. Then to write the solution set in "parametric vector form" means to write the solution as x = p + x3·u + x4·v + x5·w, where p, u, v, w are vectors with numerical entries. A method for writing a solution set in this form is given on page 46.

### Section 1.7: Linear Independence
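Before the linear-independence material, here is a quick SymPy illustration of the Section 1.5 ideas just described: free variables are the columns of the reduced matrix with no pivot, and `nullspace()` produces the vectors of the parametric vector form for Ax = 0 (the matrix here is a made-up example, not one from the notes).

```python
from sympy import Matrix

# Hypothetical coefficient matrix: the second row is twice the
# first, so there is one pivot and two free variables.
A = Matrix([[1, 2, -1],
            [2, 4, -2]])

# Ax = 0 has a non-trivial solution iff some column has no pivot.
rref_A, pivot_cols = A.rref()
free_cols = [j for j in range(A.cols) if j not in pivot_cols]
print(free_cols)          # -> [1, 2]: x2 and x3 are free

# nullspace() gives the vectors u, v of the parametric vector form:
# every solution of Ax = 0 is x = x2*u + x3*v (here p = 0 since b = 0).
u, v = A.nullspace()
print(u.T)
print(v.T)
```

Both basis vectors satisfy A·u = A·v = 0, so every combination x2·u + x3·v is a solution, matching the parametric description above.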

- Like everything else in linear algebra, the definition of *linear independence* can be phrased in many different equivalent ways. v1, v2, ..., vp are linearly independent if any of the following equivalent statements is true:
  (a) the vector equation x1·v1 + x2·v2 + ··· + xp·vp = 0 has only the trivial solution;
  (b) none of the vectors v1, v2, ..., vp is a linear combination of the others;
  (c) if we put the vectors together as the columns of a matrix A, then the system Ax = 0 has only the trivial solution;
  (d) if we put the vectors together as the columns of a matrix A, then A has a pivot in every column.
- If vectors aren't linearly independent, then they are *linearly dependent*. This means that (at least) one of the vectors is a linear combination of the rest. Note: this does not mean that *all* of the vectors are linear combinations of the others. See the following exercise.
- Exercise 1: Find three vectors in R^3 that are linearly dependent, but where the third vector is not a linear combination of the first two.
- Method to check linear (in)dependence: If we want to check whether a set of given vectors is linearly independent, put them together as the columns of a matrix, and then row reduce the matrix. If there is a pivot in every column, then they are independent. Otherwise, they are dependent.
- Exercise 2 (1.7.1): Check if the following vectors are linearly independent: (5, 0, 0), (7, 2, −6), (9, 4, −8).
- Theorem 9: Any set containing the zero vector is linearly dependent. This follows immediately from the method above, because if one of the columns is zero, there can't be a pivot in every column (there are other easy ways to prove this theorem too; see the book, for example).
- Theorem 8: If we have p vectors, each with n entries, and p > n, then these vectors have to be linearly dependent. (This also follows from the method above, because if there are more columns than rows, there can't be a pivot in every column.)
- Exercise 3: Find 2 vectors in R^5 that are linearly dependent. Notice that this means that if p ≤ n in the theorem above, then the vectors might be dependent *or* independent.
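The pivot method above can be carried out with SymPy's `rref`. A sketch that checks the Exercise 2 vectors, plus a Theorem 8 instance (the 2 × 3 matrix B is a made-up example of p > n):

```python
from sympy import Matrix

# The vectors from Exercise 2 (1.7.1), placed as columns of a matrix.
A = Matrix([[5, 7, 9],
            [0, 2, 4],
            [0, -6, -8]])

# Independence test from the notes: row reduce and look for a pivot
# in every column.
rref_A, pivot_cols = A.rref()
print(len(pivot_cols) == A.cols)   # -> True: the vectors are linearly independent

# Theorem 8: more vectors than entries forces dependence.
B = Matrix([[1, 0, 2],
            [0, 1, 3]])            # three made-up vectors in R^2
print(len(B.rref()[1]) < B.cols)   # -> True: at most 2 pivots for 3 columns
```

The first matrix has a pivot in every column, so the Exercise 2 vectors are independent; the second can never have, since a 2 × 3 matrix has at most two pivots.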
