Linear Transformations

The two basic vector operations are addition and scaling. From this perspective, the nicest functions are those which "preserve" these operations:

Def: A linear transformation is a function T : R^n → R^m which satisfies:
(1) T(x + y) = T(x) + T(y) for all x, y ∈ R^n
(2) T(cx) = cT(x) for all x ∈ R^n and c ∈ R.

Fact: If T : R^n → R^m is a linear transformation, then T(0) = 0.

We've already met examples of linear transformations. Namely: if A is any m × n matrix, then the function T : R^n → R^m which is matrix-vector multiplication

T(x) = Ax

is a linear transformation.

(Wait: I thought matrices were functions? Technically, no. Matrices are literally just arrays of numbers. However, matrices define functions by matrix-vector multiplication, and such functions are always linear transformations.)

Question: Are these all the linear transformations there are? That is, does every linear transformation come from matrix-vector multiplication? Yes:

Prop 13.2: Let T : R^n → R^m be a linear transformation. Then the function T is just matrix-vector multiplication: T(x) = Ax for some matrix A.

In fact, the m × n matrix A is

A = [ T(e_1)  T(e_2)  ...  T(e_n) ]

Terminology: For linear transformations T : R^n → R^m, we use the word "kernel" to mean "nullspace." We also say "image of T" to mean "range of T." So, for a linear transformation T : R^n → R^m:

ker(T) = {x ∈ R^n | T(x) = 0} = T^{-1}({0})
im(T) = {T(x) | x ∈ R^n} = T(R^n).
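Prop 13.2 can be checked numerically. Below is a minimal numpy sketch (the specific map T is an invented example, not one from the notes): it assembles A column by column from T(e_1), T(e_2), T(e_3) and confirms that T(x) = Ax.

import numpy as np

# Prop 13.2 in practice: the j-th column of A is T(e_j).
# The map T is a made-up example: T(x1, x2, x3) = (x1 + 2*x3, x2 - x3).
def T(x):
    return np.array([x[0] + 2 * x[2], x[1] - x[2]])

E = np.eye(3)                                        # standard basis of R^3
A = np.column_stack([T(E[:, j]) for j in range(3)])  # columns are T(e_j)

x = np.array([1.0, -2.0, 5.0])
print(A)            # [[ 1.  0.  2.]
                    #  [ 0.  1. -1.]]
print(T(x), A @ x)  # both print [11. -7.], so T(x) = Ax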

Ways to Visualize functions f : R → R (e.g.: f(x) = x^2)

(1) Set-Theoretic Picture.
(2) Graph of f. (Thinking: y = f(x).)
    The graph of f : R → R is the subset of R^2 given by:
    Graph(f) = {(x, y) ∈ R^2 | y = f(x)}.
(3) Level sets of f. (Thinking: f(x) = c.)
    The level sets of f : R → R are the subsets of R of the form
    {x ∈ R | f(x) = c}, for constants c ∈ R.

Ways to Visualize functions f : R^2 → R (e.g.: f(x, y) = x^2 + y^2)

(1) Set-Theoretic Picture.
(2) Graph of f. (Thinking: z = f(x, y).)
    The graph of f : R^2 → R is the subset of R^3 given by:
    Graph(f) = {(x, y, z) ∈ R^3 | z = f(x, y)}.
(3) Level sets of f. (Thinking: f(x, y) = c.)
    The level sets of f : R^2 → R are the subsets of R^2 of the form
    {(x, y) ∈ R^2 | f(x, y) = c}, for constants c ∈ R.

Ways to Visualize functions f : R^3 → R (e.g.: f(x, y, z) = x^2 + y^2 + z^2)

(1) Set-Theoretic Picture.
(2) Graph of f. (Thinking: w = f(x, y, z).)
(3) Level sets of f. (Thinking: f(x, y, z) = c.)
    The level sets of f : R^3 → R are the subsets of R^3 of the form
    {(x, y, z) ∈ R^3 | f(x, y, z) = c}, for constants c ∈ R.

Curves in R^2: Three descriptions

(1) Graph of a function f : R → R. (That is: y = f(x).)
    Such curves must pass the vertical line test.
    Example: When we talk about the "curve" y = x^2, we actually mean to say: the graph of the function f(x) = x^2. That is, we mean the set
    {(x, y) ∈ R^2 | y = x^2} = {(x, y) ∈ R^2 | y = f(x)}.
(2) Level sets of a function F : R^2 → R. (That is: F(x, y) = c.)
    Example: When we talk about the "curve" x^2 + y^2 = 1, we actually mean to say: the level set of the function F(x, y) = x^2 + y^2 at height 1. That is, we mean the set
    {(x, y) ∈ R^2 | x^2 + y^2 = 1} = {(x, y) ∈ R^2 | F(x, y) = 1}.
(3) Parametrically: x = f(t), y = g(t).

Surfaces in R^3: Three descriptions

(1) Graph of a function f : R^2 → R. (That is: z = f(x, y).)
    Such surfaces must pass the vertical line test.
    Example: When we talk about the "surface" z = x^2 + y^2, we actually mean to say: the graph of the function f(x, y) = x^2 + y^2. That is, we mean the set
    {(x, y, z) ∈ R^3 | z = x^2 + y^2} = {(x, y, z) ∈ R^3 | z = f(x, y)}.
(2) Level sets of a function F : R^3 → R. (That is: F(x, y, z) = c.)
    Example: When we talk about the "surface" x^2 + y^2 + z^2 = 1, we actually mean to say: the level set of the function F(x, y, z) = x^2 + y^2 + z^2 at height 1. That is, we mean the set
    {(x, y, z) ∈ R^3 | x^2 + y^2 + z^2 = 1} = {(x, y, z) ∈ R^3 | F(x, y, z) = 1}.
(3) Parametrically. (We'll discuss this another time, perhaps.)

Two Examples of Linear Transformations

(1) Diagonal Matrices: A diagonal matrix is a matrix of the form

D = [ d_1   0   ...   0  ]
    [  0   d_2  ...   0  ]
    [ ...  ...  ...  ... ]
    [  0    0   ...  d_n ]

The linear transformation defined by D has the following effect: vectors are...

- stretched/contracted (possibly reflected) in the x_1-direction by d_1
- stretched/contracted (possibly reflected) in the x_2-direction by d_2
  ...
- stretched/contracted (possibly reflected) in the x_n-direction by d_n.

Stretching in the x_i-direction happens if |d_i| > 1.
Contracting in the x_i-direction happens if |d_i| < 1.
Reflecting happens if d_i is negative.

(2) Rotations in R^2: We write Rot_θ : R^2 → R^2 for the linear transformation which rotates vectors in R^2 counter-clockwise through the angle θ. Its matrix is:

Rot_θ = [ cos θ   -sin θ ]
        [ sin θ    cos θ ]
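As a quick numerical illustration (the particular values of d_i and θ are arbitrary choices, not taken from the notes), here is how a diagonal matrix and a rotation matrix act on sample vectors:

import numpy as np

# A diagonal matrix stretches, contracts, or reflects each coordinate direction;
# a rotation matrix rotates counter-clockwise by theta. Values chosen for illustration.
D = np.diag([2.0, 0.5, -1.0])            # stretch x1, contract x2, reflect x3
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(D @ np.array([1.0, 1.0, 1.0]))          # [ 2.   0.5 -1. ]
print(np.round(R @ np.array([1.0, 0.0]), 8))  # e_1 rotates to e_2: [0. 1.]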

The Multivariable Derivative: An Example

Example: Let F : R^2 → R^3 be the function

F(x, y) = (x + 2y, sin(x), e^y) = (F_1(x, y), F_2(x, y), F_3(x, y)).

Its derivative is a linear transformation DF(x, y) : R^2 → R^3. The matrix of the linear transformation DF(x, y) is:

DF(x, y) = [ ∂F_1/∂x   ∂F_1/∂y ]   [ 1        2   ]
           [ ∂F_2/∂x   ∂F_2/∂y ] = [ cos(x)   0   ]
           [ ∂F_3/∂x   ∂F_3/∂y ]   [ 0        e^y ]

Notice that (for example) DF(1, 1) is a linear transformation, as is DF(2, 3), etc. That is, each DF(x, y) is a linear transformation R^2 → R^3.
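Here is a small numpy check of this Jacobian (a sketch: the test point (1, 1) and the finite-difference tolerance are arbitrary choices): the matrix of partial derivatives agrees with a central-difference approximation of DF.

import numpy as np

# DF(x, y) for F(x, y) = (x + 2y, sin x, e^y), checked against central differences.
def F(v):
    x, y = v
    return np.array([x + 2 * y, np.sin(x), np.exp(y)])

def DF(v):
    x, y = v
    return np.array([[1.0,        2.0],
                     [np.cos(x),  0.0],
                     [0.0,        np.exp(y)]])

a, h = np.array([1.0, 1.0]), 1e-6
numeric = np.column_stack([(F(a + h * e) - F(a - h * e)) / (2 * h)
                           for e in np.eye(2)])
print(np.allclose(numeric, DF(a), atol=1e-5))   # True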

Linear Approximation

Single Variable Setting

Review: In single-variable calc, we look at functions f : R → R. We write y = f(x), and at a point (a, f(a)) write:

∆y ≈ dy.

Here, ∆y = f(x) - f(a), while dy = f′(a)∆x = f′(a)(x - a). So:

f(x) - f(a) ≈ f′(a)(x - a).

Therefore:

f(x) ≈ f(a) + f′(a)(x - a).

The right-hand side f(a) + f′(a)(x - a) can be interpreted as follows:

- It is the best linear approximation to f(x) at x = a.
- It is the 1st Taylor polynomial to f(x) at x = a.
- The line y = f(a) + f′(a)(x - a) is the tangent line at (a, f(a)).
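For instance, here is a brief numerical sketch of the single-variable statement (the choices f(x) = sqrt(x), a = 4, x = 4.1 are just an illustration):

import numpy as np

# Best linear approximation f(x) ≈ f(a) + f'(a)(x - a) near x = a.
f = np.sqrt
fprime = lambda x: 1 / (2 * np.sqrt(x))
a, x = 4.0, 4.1

approx = f(a) + fprime(a) * (x - a)
print(f(x), approx)    # 2.0248... vs 2.025: close because x is near a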

Multivariable Setting

Now consider functions f : R^n → R^m. At a point (a, f(a)), we have exactly the same thing:

f(x) - f(a) ≈ Df(a)(x - a).

That is:

f(x) ≈ f(a) + Df(a)(x - a).    (*)

Note: The quantity Df(a) is a matrix, while (x - a) is a vector. That is, Df(a)(x - a) is matrix-vector multiplication.

Example: Let f : R^2 → R. Let's write x = (x_1, x_2) and a = (a_1, a_2). Then (*) reads:

f(x_1, x_2) ≈ f(a_1, a_2) + [ ∂f/∂x_1(a_1, a_2)   ∂f/∂x_2(a_1, a_2) ] [ x_1 - a_1 ]
                                                                      [ x_2 - a_2 ]
            = f(a_1, a_2) + ∂f/∂x_1(a_1, a_2)·(x_1 - a_1) + ∂f/∂x_2(a_1, a_2)·(x_2 - a_2).
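A small numpy sketch of (*) (the function f and the points a, x below are invented for illustration): the matrix-vector product Df(a)(x - a) gives a good approximation of f(x) - f(a) for x near a.

import numpy as np

# Multivariable linear approximation: f(x) ≈ f(a) + Df(a)(x - a).
# Sample function f(x1, x2) = x1^2 + 3*x1*x2.
def f(v):
    return v[0] ** 2 + 3 * v[0] * v[1]

def Df(v):                       # 1 x 2 Jacobian (the gradient as a row matrix)
    return np.array([[2 * v[0] + 3 * v[1], 3 * v[0]]])

a = np.array([1.0, 2.0])
x = np.array([1.1, 1.9])
approx = f(a) + (Df(a) @ (x - a))[0]
print(f(x), approx)              # 7.48 vs 7.5: close near a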

Tangent Lines/Planes to Graphs

Fact: Suppose a curve in R^2 is given as a graph y = f(x). The equation of the tangent line at (a, f(a)) is:

y = f(a) + f′(a)(x - a).

Okay, you knew this from single-variable calculus. How does the multivariable case work? Well:

Fact: Suppose a surface in R^3 is given as a graph z = f(x, y). The equation of the tangent plane at (a, b, f(a, b)) is:

z = f(a, b) + ∂f/∂x(a, b)(x - a) + ∂f/∂y(a, b)(y - b).

Note the similarity between this and the linear approximation to f at (a, b).

Tangent Lines/Planes to Level Sets

Def: For a function F : R^n → R, its gradient is the vector in R^n given by:

∇F = ( ∂F/∂x_1, ∂F/∂x_2, ..., ∂F/∂x_n ).

Theorem: Consider a level set F(x_1, ..., x_n) = c of a function F : R^n → R. If (a_1, ..., a_n) is a point on the level set, then ∇F(a_1, ..., a_n) is normal to the level set.

Corollary 1: Suppose a curve in R^2 is given as a level curve F(x, y) = c. The equation of the tangent line at a point (x_0, y_0) on the level curve is:

∂F/∂x(x_0, y_0)(x - x_0) + ∂F/∂y(x_0, y_0)(y - y_0) = 0.

Corollary 2: Suppose a surface in R^3 is given as a level surface F(x, y, z) = c. The equation of the tangent plane at a point (x_0, y_0, z_0) on the level surface is:

∂F/∂x(x_0, y_0, z_0)(x - x_0) + ∂F/∂y(x_0, y_0, z_0)(y - y_0) + ∂F/∂z(x_0, y_0, z_0)(z - z_0) = 0.

Q: Do you see why Cor 1 and Cor 2 follow from the Theorem?
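The Theorem can be sanity-checked numerically. The sketch below uses F(x, y, z) = x^2 + y^2 + z^2 (so the level surface F = 1 is the unit sphere) with an arbitrarily chosen point and tangent direction:

import numpy as np

# grad F(a) should be perpendicular to every direction tangent to the level set at a.
def grad_F(p):
    return 2 * p                      # gradient of x^2 + y^2 + z^2

a = np.array([0.6, 0.8, 0.0])         # a point on the unit sphere
tangent = np.array([-0.8, 0.6, 0.0])  # a direction tangent to the sphere at a
print(np.dot(grad_F(a), tangent))     # 0.0: the gradient is normal to the level set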

Composition and Matrix Multiplication

Recall: Let f : X → Y and g : Y → Z be functions. Their composition is the function g∘f : X → Z defined by (g∘f)(x) = g(f(x)).

Observations:

(1) For this to make sense, we must have: co-domain(f) = domain(g).
(2) Composition is not generally commutative: that is, f∘g and g∘f are usually different.
(3) Composition is always associative: (h∘g)∘f = h∘(g∘f).

Fact: If T : R^k → R^n and S : R^n → R^m are both linear transformations, then S∘T is also a linear transformation.

Question: How can we describe the matrix of the linear transformation S∘T in terms of the matrices of S and T?

Fact: Let T : R^k → R^n and S : R^n → R^m be linear transformations with matrices B and A, respectively. Then the matrix of S∘T is the product AB.

We can multiply an m × n matrix A by an n × k matrix B. The result, AB, will be an m × k matrix:

(m × n)(n × k) → (m × k).

Notice that n appears twice here to "cancel out." That is, we need the number of columns of A to equal the number of rows of B; otherwise, the product AB makes no sense.

Example 1: Let A be a (3 × 2)-matrix, and let B be a (2 × 4)-matrix. The product AB is then a (3 × 4)-matrix.

Example 2: Let A be a (2 × 3)-matrix, and let B be a (4 × 2)-matrix. Then AB is not defined. (But the product BA is defined: it is a (4 × 3)-matrix.)
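A short numpy sketch of the Fact and of the shape rule (the random matrices stand in for S and T; they are not from the notes):

import numpy as np

# The matrix of S∘T is the product AB (A for S, B for T).
# Shapes: A is m x n, B is n x k, so AB is m x k.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))   # S : R^2 -> R^3
B = rng.standard_normal((2, 4))   # T : R^4 -> R^2

x = rng.standard_normal(4)
print(np.allclose(A @ (B @ x), (A @ B) @ x))   # True: S(T(x)) = (AB)x
print((A @ B).shape)                           # (3, 4)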

Two Model Examples

Example 1A (Elliptic Paraboloid): Consider f : R^2 → R given by

f(x, y) = x^2 + y^2.

The level sets of f are curves in R^2. The level sets are {(x, y) | x^2 + y^2 = c}.
The graph of f is a surface in R^3. The graph is {(x, y, z) | z = x^2 + y^2}.

Notice that (0, 0, 0) is a local minimum of the graph of f.

Note that ∂f/∂x(0, 0) = ∂f/∂y(0, 0) = 0. Also, ∂^2f/∂x^2(0, 0) > 0 and ∂^2f/∂y^2(0, 0) > 0.

Example 1B (Elliptic Paraboloid): Consider f : R^2 → R given by

f(x, y) = -x^2 - y^2.

The level sets of f are curves in R^2. The level sets are {(x, y) | -x^2 - y^2 = c}.
The graph of f is a surface in R^3. The graph is {(x, y, z) | z = -x^2 - y^2}.

Notice that (0, 0, 0) is a local maximum of the graph of f.

Note that ∂f/∂x(0, 0) = ∂f/∂y(0, 0) = 0. Also, ∂^2f/∂x^2(0, 0) < 0 and ∂^2f/∂y^2(0, 0) < 0.

Example 2 (Hyperbolic Paraboloid): Consider f : R^2 → R given by

f(x, y) = x^2 - y^2.

The level sets of f are curves in R^2. The level sets are {(x, y) | x^2 - y^2 = c}.
The graph of f is a surface in R^3. The graph is {(x, y, z) | z = x^2 - y^2}.

Notice that (0, 0, 0) is a saddle point of the graph of f.

Note that ∂f/∂x(0, 0) = ∂f/∂y(0, 0) = 0. Also, ∂^2f/∂x^2(0, 0) > 0 while ∂^2f/∂y^2(0, 0) < 0.

General Remark: In each case, the level sets of f are obtained by slicing the graph of f by the planes z = c. Try to visualize this in each case.
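If you want to check the sign patterns above without computing derivatives by hand, here is a rough numerical sketch using central second differences (the step size h is an arbitrary choice):

import numpy as np

# Estimate f_xx(0,0) and f_yy(0,0) for the three model examples.
def second_partials_at_origin(f, h=1e-4):
    fxx = (f(h, 0) - 2 * f(0, 0) + f(-h, 0)) / h ** 2
    fyy = (f(0, h) - 2 * f(0, 0) + f(0, -h)) / h ** 2
    return fxx, fyy

examples = {
    "elliptic paraboloid (min)":      lambda x, y: x ** 2 + y ** 2,
    "elliptic paraboloid (max)":      lambda x, y: -x ** 2 - y ** 2,
    "hyperbolic paraboloid (saddle)": lambda x, y: x ** 2 - y ** 2,
}
for name, f in examples.items():
    print(name, second_partials_at_origin(f))
# Roughly (2, 2), (-2, -2), (2, -2): the sign pattern matches min / max / saddle.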

Chain Rule

Chain Rule (Matrix Form): Let f : R^n → R^m and g : R^m → R^p be any differentiable functions. Then

D(g∘f)(x) = Dg(f(x)) Df(x).

Here, the product on the right-hand side is a product of matrices.

In the case where g : R^m → R has codomain R, there is another way to state the chain rule.

Chain Rule: Let g = g(x_1, ..., x_m) and suppose each x_1, ..., x_m is a function of the variables t_1, ..., t_n. Then:

∂g/∂t_1 = ∂g/∂x_1 · ∂x_1/∂t_1 + ∂g/∂x_2 · ∂x_2/∂t_1 + ... + ∂g/∂x_m · ∂x_m/∂t_1,
...
∂g/∂t_n = ∂g/∂x_1 · ∂x_1/∂t_n + ∂g/∂x_2 · ∂x_2/∂t_n + ... + ∂g/∂x_m · ∂x_m/∂t_n.

There is a way to state this version of the chain rule in general (that is, when g : R^m → R^p has codomain R^p), but let's keep things simple for now.

Example 1: Let z = g(u, v), where u = h(x, y) and v = k(x, y). Then the chain rule reads:

∂z/∂x = ∂z/∂u · ∂u/∂x + ∂z/∂v · ∂v/∂x   and   ∂z/∂y = ∂z/∂u · ∂u/∂y + ∂z/∂v · ∂v/∂y.

Example 2: Let z = g(u, v, w), where u = h(t), v = k(t), w = ℓ(t). Then the chain rule reads:

dz/dt = ∂z/∂u · du/dt + ∂z/∂v · dv/dt + ∂z/∂w · dw/dt.
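Finally, the matrix form of the chain rule can be verified numerically. The maps f and g below are invented examples; the check compares Dg(f(x)) Df(x) against a central-difference approximation of D(g∘f)(x).

import numpy as np

# Matrix form of the chain rule: D(g∘f)(x) = Dg(f(x)) Df(x).
def f(v):  return np.array([v[0] * v[1], v[0] + v[1] ** 2])
def Df(v): return np.array([[v[1], v[0]],
                            [1.0,  2 * v[1]]])

def g(u):  return np.sin(u[0]) + u[1] ** 2
def Dg(u): return np.array([[np.cos(u[0]), 2 * u[1]]])   # 1 x 2 row matrix

x, h = np.array([0.5, -1.0]), 1e-6
chain = Dg(f(x)) @ Df(x)                                  # 1 x 2 matrix
numeric = np.array([[(g(f(x + h * e)) - g(f(x - h * e))) / (2 * h)
                     for e in np.eye(2)]])
print(np.allclose(chain, numeric, atol=1e-5))             # True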