Inner Product Spaces


Chapter 5
Inner Product Spaces
5.1 Length and Dot Product in Rn
5.2 Inner Product Spaces
5.3 Orthonormal Bases: Gram-Schmidt Process
5.4 Mathematical Models and Least Squares Analysis
Elementary Linear Algebra
R. Larsen et al. (6th Edition)
Slides designed and prepared by
Professor Ching-Chang Wong, Department of Electrical Engineering, Tamkang University
5.1 Length and Dot Product in Rn

Length:
The length of a vector $v = (v_1, v_2, \ldots, v_n)$ in $R^n$ is given by
$\|v\| = \sqrt{v_1^2 + v_2^2 + \cdots + v_n^2}$

Notes: The length of a vector is also called its norm.

Notes: Properties of length
1
2
3
4
v 0
v  1  v is called a unit vector.
v  0 iff v  0
cv  c v
Elementary Linear Algebra: Section 5.1, p.278
2/80

Ex 1:
(a) In $R^5$, the length of $v = (0, -2, 1, 4, -2)$ is given by
$\|v\| = \sqrt{0^2 + (-2)^2 + 1^2 + 4^2 + (-2)^2} = \sqrt{25} = 5$
(b) In $R^3$, the length of $v = \left(\tfrac{2}{\sqrt{17}}, \tfrac{2}{\sqrt{17}}, \tfrac{3}{\sqrt{17}}\right)$ is given by
$\|v\| = \sqrt{\left(\tfrac{2}{\sqrt{17}}\right)^2 + \left(\tfrac{2}{\sqrt{17}}\right)^2 + \left(\tfrac{3}{\sqrt{17}}\right)^2} = \sqrt{\tfrac{17}{17}} = 1$
(v is a unit vector)
Elementary Linear Algebra: Section 5.1, p.279
3/80

Standard unit vectors in $R^n$:
$\{e_1, e_2, \ldots, e_n\} = \{(1, 0, \ldots, 0), (0, 1, \ldots, 0), \ldots, (0, 0, \ldots, 1)\}$

Ex:
the standard unit vectors in $R^2$: $\{i, j\} = \{(1, 0), (0, 1)\}$
the standard unit vectors in $R^3$: $\{i, j, k\} = \{(1, 0, 0), (0, 1, 0), (0, 0, 1)\}$

Notes: (Two nonzero vectors are parallel)
$u = cv$
(1) $c > 0 \Rightarrow$ u and v have the same direction
(2) $c < 0 \Rightarrow$ u and v have the opposite direction
Elementary Linear Algebra: Section 5.1, p.279
4/80

Thm 5.1: (Length of a scalar multiple)
Let v be a vector in Rn and c be a scalar. Then
$\|cv\| = |c|\,\|v\|$
Pf:
$v = (v_1, v_2, \ldots, v_n) \Rightarrow cv = (cv_1, cv_2, \ldots, cv_n)$
$\|cv\| = \|(cv_1, cv_2, \ldots, cv_n)\|$
$= \sqrt{(cv_1)^2 + (cv_2)^2 + \cdots + (cv_n)^2}$
$= \sqrt{c^2(v_1^2 + v_2^2 + \cdots + v_n^2)}$
$= |c|\sqrt{v_1^2 + v_2^2 + \cdots + v_n^2}$
$= |c|\,\|v\|$
Elementary Linear Algebra: Section 5.1, p.279
5/80

Thm 5.2: (Unit vector in the direction of v)
If v is a nonzero vector in $R^n$, then the vector $u = \dfrac{v}{\|v\|}$
has length 1 and has the same direction as v. This vector u
is called the unit vector in the direction of v.
Pf:
v is nonzero $\Rightarrow v \ne 0 \Rightarrow \|v\| > 0$
$u = \dfrac{1}{\|v\|}\, v$ with $\dfrac{1}{\|v\|} > 0$   (u has the same direction as v)
$\|u\| = \left\|\dfrac{v}{\|v\|}\right\| = \dfrac{1}{\|v\|}\,\|v\| = 1$   (u has length 1)
Elementary Linear Algebra: Section 5.1, p.280
6/80

Notes:
(1) The vector $\dfrac{v}{\|v\|}$ is called the unit vector in the direction of v.
(2) The process of finding the unit vector in the direction of v
is called normalizing the vector v.
Elementary Linear Algebra: Section 5.1, p.280
7/80

Ex 2: (Finding a unit vector)
Find the unit vector in the direction of v  (3 ,  1 , 2) ,
and verify that this vector has length 1.
Sol:
$v = (3, -1, 2) \Rightarrow \|v\| = \sqrt{3^2 + (-1)^2 + 2^2} = \sqrt{14}$
$\dfrac{v}{\|v\|} = \dfrac{(3, -1, 2)}{\sqrt{14}} = \dfrac{1}{\sqrt{14}}(3, -1, 2) = \left(\dfrac{3}{\sqrt{14}}, \dfrac{-1}{\sqrt{14}}, \dfrac{2}{\sqrt{14}}\right)$
$\left\|\dfrac{v}{\|v\|}\right\| = \sqrt{\left(\dfrac{3}{\sqrt{14}}\right)^2 + \left(\dfrac{-1}{\sqrt{14}}\right)^2 + \left(\dfrac{2}{\sqrt{14}}\right)^2} = \sqrt{\dfrac{14}{14}} = 1$
$\Rightarrow \dfrac{v}{\|v\|}$ is a unit vector.
Elementary Linear Algebra: Section 5.1, p.280
8/80
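Example 2's normalization is easy to reproduce numerically. The sketch below is an added illustration (not part of the original slides) and assumes NumPy is available:

```python
import numpy as np

v = np.array([3.0, -1.0, 2.0])
norm_v = np.linalg.norm(v)     # sqrt(3^2 + (-1)^2 + 2^2) = sqrt(14)
u = v / norm_v                 # unit vector in the direction of v

print(norm_v)                  # 3.7416... (= sqrt(14))
print(u)                       # [ 0.8018 -0.2673  0.5345]
print(np.linalg.norm(u))       # 1.0, so u is a unit vector
```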

Distance between two vectors:
The distance between two vectors u and v in $R^n$ is
$d(u, v) = \|u - v\|$

Notes: (Properties of distance)
(1) $d(u, v) \ge 0$
(2) $d(u, v) = 0$ if and only if $u = v$
(3) $d(u, v) = d(v, u)$
Elementary Linear Algebra: Section 5.1, p.282
9/80

Ex 3: (Finding the distance between two vectors)
The distance between u=(0, 2, 2) and v=(2, 0, 1) is
$d(u, v) = \|u - v\| = \|(0 - 2, 2 - 0, 2 - 1)\| = \sqrt{(-2)^2 + 2^2 + 1^2} = 3$
Elementary Linear Algebra: Section 5.1, p.282
10/80

Dot product in Rn:
The dot product of $u = (u_1, u_2, \ldots, u_n)$ and $v = (v_1, v_2, \ldots, v_n)$
is the scalar quantity
$u \cdot v = u_1 v_1 + u_2 v_2 + \cdots + u_n v_n$

Ex 4: (Finding the dot product of two vectors)
The dot product of u = (1, 2, 0, -3) and v = (3, -2, 4, 2) is
$u \cdot v = (1)(3) + (2)(-2) + (0)(4) + (-3)(2) = -7$
Elementary Linear Algebra: Section 5.1, p.282
11/80
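The distance of Example 3 and the dot product of Example 4 can be checked with a few NumPy calls (an added sketch, not from the slides):

```python
import numpy as np

# Example 3: distance between u = (0, 2, 2) and v = (2, 0, 1)
u = np.array([0.0, 2.0, 2.0])
v = np.array([2.0, 0.0, 1.0])
print(np.linalg.norm(u - v))   # 3.0

# Example 4: dot product of u = (1, 2, 0, -3) and v = (3, -2, 4, 2)
u = np.array([1.0, 2.0, 0.0, -3.0])
v = np.array([3.0, -2.0, 4.0, 2.0])
print(np.dot(u, v))            # -7.0
```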

Thm 5.3: (Properties of the dot product)
If u, v, and w are vectors in Rn and c is a scalar,
then the following properties are true.
(1) $u \cdot v = v \cdot u$
(2) $u \cdot (v + w) = u \cdot v + u \cdot w$
(3) $c(u \cdot v) = (cu) \cdot v = u \cdot (cv)$
(4) $v \cdot v = \|v\|^2$
(5) $v \cdot v \ge 0$, and $v \cdot v = 0$ if and only if $v = 0$
Elementary Linear Algebra: Section 5.1, p.283
12/80

Euclidean n-space:
$R^n$ was defined to be the set of all ordered n-tuples of real
numbers. When Rn is combined with the standard
operations of vector addition, scalar multiplication, vector
length, and the dot product, the resulting vector space is
called Euclidean n-space.
Elementary Linear Algebra: Section 5.1, p.283
13/80

Ex 5: (Finding dot products)
$u = (2, -2)$, $v = (5, 8)$, $w = (-4, 3)$
(a) $u \cdot v$   (b) $(u \cdot v)w$   (c) $u \cdot (2v)$   (d) $\|w\|^2$   (e) $u \cdot (v - 2w)$
Sol:
(a) $u \cdot v = (2)(5) + (-2)(8) = -6$
(b) $(u \cdot v)w = -6w = -6(-4, 3) = (24, -18)$
(c) $u \cdot (2v) = 2(u \cdot v) = 2(-6) = -12$
(d) $\|w\|^2 = w \cdot w = (-4)(-4) + (3)(3) = 25$
(e) $v - 2w = (5 - (-8), 8 - 6) = (13, 2)$
$u \cdot (v - 2w) = (2)(13) + (-2)(2) = 26 - 4 = 22$
Elementary Linear Algebra: Section 5.1, p.284
14/80

Ex 6: (Using the properties of the dot product)
Given $u \cdot u = 39$, $u \cdot v = -3$, $v \cdot v = 79$,
find $(u + 2v) \cdot (3u + v)$.
Sol:
$(u + 2v) \cdot (3u + v) = u \cdot (3u + v) + 2v \cdot (3u + v)$
$= u \cdot (3u) + u \cdot v + (2v) \cdot (3u) + (2v) \cdot v$
$= 3(u \cdot u) + u \cdot v + 6(v \cdot u) + 2(v \cdot v)$
$= 3(u \cdot u) + 7(u \cdot v) + 2(v \cdot v)$
$= 3(39) + 7(-3) + 2(79) = 254$
Elementary Linear Algebra: Section 5.1, p.284
15/80

Thm 5.4: (The Cauchy - Schwarz inequality)
If u and v are vectors in Rn, then
$|u \cdot v| \le \|u\|\,\|v\|$
($|u \cdot v|$ denotes the absolute value of $u \cdot v$)

Ex 7: (An example of the Cauchy-Schwarz inequality)
Verify the Cauchy-Schwarz inequality for u = (1, -1, 3) and v = (2, 0, -1).
Sol: $u \cdot v = -1$, $u \cdot u = 11$, $v \cdot v = 5$
$\Rightarrow |u \cdot v| = |-1| = 1$
$\|u\|\,\|v\| = \sqrt{u \cdot u}\,\sqrt{v \cdot v} = \sqrt{11}\,\sqrt{5} = \sqrt{55}$
$\Rightarrow |u \cdot v| \le \|u\|\,\|v\|$
Elementary Linear Algebra: Section 5.1, p.285-286
16/80
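A quick numerical confirmation of Example 7 (an added sketch, assuming NumPy): the absolute value of u·v never exceeds ||u|| ||v||.

```python
import numpy as np

u = np.array([1.0, -1.0, 3.0])
v = np.array([2.0, 0.0, -1.0])

lhs = abs(np.dot(u, v))                       # |u . v| = 1
rhs = np.linalg.norm(u) * np.linalg.norm(v)   # sqrt(11) * sqrt(5) = sqrt(55)
print(lhs, rhs, lhs <= rhs)                   # 1.0 7.4161... True
```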

The angle between two vectors in Rn:
uv
cos  
, 0  
| | u | || | v | |
Opposite
direction
uv  0


 
cos  1

uv  0

 
2
cos  0



uv  0
Same
direction

2
cos  0
0  

cos  0
2
 0
cos  1
Note:
The angle between the zero vector and another vector is
not defined.
Elementary Linear Algebra: Section 5.1, p.286
17/80

Ex 8: (Finding the angle between two vectors)
$u = (-4, 0, 2, -2)$, $v = (2, 0, -1, 1)$
Sol:
$\|u\| = \sqrt{u \cdot u} = \sqrt{(-4)^2 + 0^2 + 2^2 + (-2)^2} = \sqrt{24}$
$\|v\| = \sqrt{v \cdot v} = \sqrt{2^2 + 0^2 + (-1)^2 + 1^2} = \sqrt{6}$
$u \cdot v = (-4)(2) + (0)(0) + (2)(-1) + (-2)(1) = -12$
$\Rightarrow \cos\theta = \dfrac{u \cdot v}{\|u\|\,\|v\|} = \dfrac{-12}{\sqrt{24}\sqrt{6}} = \dfrac{-12}{\sqrt{144}} = -1$
$\Rightarrow \theta = \pi$
$\Rightarrow$ u and v have opposite directions (indeed $u = -2v$).
Elementary Linear Algebra: Section 5.1, p.286
18/80
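Example 8 can be reproduced in a few lines (an illustrative sketch, assuming NumPy); taking arccos of the normalized dot product returns θ = π.

```python
import numpy as np

u = np.array([-4.0, 0.0, 2.0, -2.0])
v = np.array([2.0, 0.0, -1.0, 1.0])

cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))   # clip guards against round-off
print(cos_theta, theta)   # -1.0 3.14159... (theta = pi: opposite directions)
```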

Orthogonal vectors:
Two vectors u and v in Rn are orthogonal if
uv 0

Note:
The vector 0 is said to be orthogonal to every vector.
Elementary Linear Algebra: Section 5.1, p.287
19/80

Ex 10: (Finding orthogonal vectors)
Determine all vectors in $R^2$ that are orthogonal to u = (4, 2).
Sol:
$u = (4, 2)$. Let $v = (v_1, v_2)$.
$\Rightarrow u \cdot v = (4, 2) \cdot (v_1, v_2) = 4v_1 + 2v_2 = 0$
$\Rightarrow v_1 = \dfrac{-t}{2}, \quad v_2 = t$
$\Rightarrow v = \left(\dfrac{-t}{2}, t\right), \quad t \in R$
Elementary Linear Algebra: Section 5.1, p.287
20/80

Thm 5.5: (The triangle inequality)
If u and v are vectors in $R^n$, then $\|u + v\| \le \|u\| + \|v\|$.
Pf:
$\|u + v\|^2 = (u + v) \cdot (u + v)$
$= u \cdot (u + v) + v \cdot (u + v) = u \cdot u + 2(u \cdot v) + v \cdot v$
$= \|u\|^2 + 2(u \cdot v) + \|v\|^2$
$\le \|u\|^2 + 2|u \cdot v| + \|v\|^2$
$\le \|u\|^2 + 2\|u\|\,\|v\| + \|v\|^2$   (Cauchy-Schwarz inequality)
$= (\|u\| + \|v\|)^2$
$\Rightarrow \|u + v\| \le \|u\| + \|v\|$

Note:
Equality occurs in the triangle inequality if and only if
the vectors u and v have the same direction.
Elementary Linear Algebra: Section 5.1, p.288
21/80

Thm 5.6: (The Pythagorean theorem)
If u and v are vectors in Rn, then u and v are orthogonal
if and only if
$\|u + v\|^2 = \|u\|^2 + \|v\|^2$
Elementary Linear Algebra: Section 5.1, p.289
22/80

Dot product and matrix multiplication:
(A vector $u = (u_1, u_2, \ldots, u_n)$ in $R^n$ is represented as an n×1 column matrix.)
$u = \begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_n \end{bmatrix}, \quad v = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix}$
$u \cdot v = u^T v = \begin{bmatrix} u_1 & u_2 & \cdots & u_n \end{bmatrix} \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix} = [\,u_1 v_1 + u_2 v_2 + \cdots + u_n v_n\,]$
Elementary Linear Algebra: Section 5.1, p.289
23/80
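The identification u·v = uᵀv can be seen directly with column matrices. The following sketch (an added illustration, assuming NumPy) computes the dot product of Example 4 both ways:

```python
import numpy as np

u = np.array([[1.0], [2.0], [0.0], [-3.0]])     # n x 1 column matrix
v = np.array([[3.0], [-2.0], [4.0], [2.0]])

as_matrix_product = (u.T @ v).item()            # u^T v, a 1x1 matrix
as_dot_product = np.dot(u.ravel(), v.ravel())   # ordinary dot product
print(as_matrix_product, as_dot_product)        # -7.0 -7.0
```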
Keywords in Section 5.1:
length
norm
unit vector
standard unit vector
normalizing
distance
dot product
Euclidean n-space
Cauchy-Schwarz inequality
angle
triangle inequality
Pythagorean theorem
24/80
5.2 Inner Product Spaces

Inner product:
Let u, v, and w be vectors in a vector space V, and let c be
any scalar. An inner product on V is a function that associates
a real number <u, v> with each pair of vectors u and v and
satisfies the following axioms.
(1) $\langle u, v\rangle = \langle v, u\rangle$
(2) $\langle u, v + w\rangle = \langle u, v\rangle + \langle u, w\rangle$
(3) $c\langle u, v\rangle = \langle cu, v\rangle$
(4) $\langle v, v\rangle \ge 0$ and $\langle v, v\rangle = 0$ if and only if $v = 0$
Elementary Linear Algebra: Section 5.2, p.293
25/80

Note:
$u \cdot v$ = dot product (Euclidean inner product for $R^n$)
$\langle u, v\rangle$ = general inner product for a vector space V

Note:
A vector space V with an inner product is called an inner
product space.
Vector space: $(V, +, \cdot)$
Inner product space: $(V, +, \cdot, \langle\,\cdot\,,\,\cdot\,\rangle)$
Elementary Linear Algebra: Section 5.2, Addition
26/80

Ex 1: (The Euclidean inner product for Rn)
Show that the dot product in Rn satisfies the four axioms
of an inner product.
Sol:
$u = (u_1, u_2, \ldots, u_n)$, $v = (v_1, v_2, \ldots, v_n)$
$\langle u, v\rangle = u \cdot v = u_1 v_1 + u_2 v_2 + \cdots + u_n v_n$
By Theorem 5.3, this dot product satisfies the required four axioms.
Thus it is an inner product on Rn.
Elementary Linear Algebra: Section 5.2, p.293
27/80

Ex 2: (A different inner product for Rn)
Show that the function defines an inner product on R2,
where u  (u1 , u2 ) and v  (v1 , v2 ) .
$\langle u, v\rangle = u_1 v_1 + 2u_2 v_2$
Sol:
(a) $\langle u, v\rangle = u_1 v_1 + 2u_2 v_2 = v_1 u_1 + 2v_2 u_2 = \langle v, u\rangle$
(b) $w = (w_1, w_2)$
$\langle u, v + w\rangle = u_1(v_1 + w_1) + 2u_2(v_2 + w_2)$
$= u_1 v_1 + u_1 w_1 + 2u_2 v_2 + 2u_2 w_2$
$= (u_1 v_1 + 2u_2 v_2) + (u_1 w_1 + 2u_2 w_2)$
$= \langle u, v\rangle + \langle u, w\rangle$
Elementary Linear Algebra: Section 5.2, pp.293-294
28/80
(c) $c\langle u, v\rangle = c(u_1 v_1 + 2u_2 v_2) = (cu_1)v_1 + 2(cu_2)v_2 = \langle cu, v\rangle$
(d) $\langle v, v\rangle = v_1^2 + 2v_2^2 \ge 0$
$\langle v, v\rangle = 0 \Rightarrow v_1^2 + 2v_2^2 = 0 \Rightarrow v_1 = v_2 = 0$ (i.e., $v = 0$)

Note: (An inner product on $R^n$)
$\langle u, v\rangle = c_1 u_1 v_1 + c_2 u_2 v_2 + \cdots + c_n u_n v_n$, where each $c_i > 0$
Elementary Linear Algebra: Section 5.2, pp.293-294
29/80
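The weighted inner product of Example 2, and its generalization ⟨u, v⟩ = c₁u₁v₁ + ⋯ + cₙuₙvₙ with cᵢ > 0, is easy to implement. The sketch below is my own illustration, assuming NumPy; the test vectors u and v are made up for the example, not taken from the slides.

```python
import numpy as np

def weighted_inner(u, v, c):
    """<u, v> = c1*u1*v1 + ... + cn*un*vn, where every weight ci is positive."""
    return float(np.sum(c * u * v))

c = np.array([1.0, 2.0])             # Example 2 weights: <u, v> = u1*v1 + 2*u2*v2
u = np.array([1.0, -3.0])
v = np.array([2.0, 5.0])

print(weighted_inner(u, v, c))       # 1*2 + 2*(-3)*5 = -28.0
print(weighted_inner(v, u, c))       # axiom 1 (symmetry): also -28.0
print(weighted_inner(u, u, c) > 0)   # axiom 4 (positivity for u != 0): True
```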

Ex 3: (A function that is not an inner product)
Show that the following function is not an inner product on R3.
$\langle u, v\rangle = u_1 v_1 - 2u_2 v_2 + u_3 v_3$
Sol:
Let $v = (1, 2, 1)$.
Then $\langle v, v\rangle = (1)(1) - 2(2)(2) + (1)(1) = -6 < 0$
Axiom 4 is not satisfied.
Thus this function is not an inner product on R3.
Elementary Linear Algebra: Section 5.2, p.294
30/80

Thm 5.7: (Properties of inner products)
Let u, v, and w be vectors in an inner product space V, and
let c be any real number.
(1) $\langle 0, v\rangle = \langle v, 0\rangle = 0$
(2) $\langle u + v, w\rangle = \langle u, w\rangle + \langle v, w\rangle$
(3) $\langle u, cv\rangle = c\langle u, v\rangle$

Norm (length) of u:
$\|u\| = \sqrt{\langle u, u\rangle}$

Note:
$\|u\|^2 = \langle u, u\rangle$
Elementary Linear Algebra: Section 5.2, p.295
31/80

Distance between u and v:
$d(u, v) = \|u - v\| = \sqrt{\langle u - v, u - v\rangle}$

Angle between two nonzero vectors u and v:
$\cos\theta = \dfrac{\langle u, v\rangle}{\|u\|\,\|v\|}, \quad 0 \le \theta \le \pi$

Orthogonal ($u \perp v$):
u and v are orthogonal if $\langle u, v\rangle = 0$.
Elementary Linear Algebra: Section 5.2, p.296
32/80

Notes:
(1) If $\|v\| = 1$, then v is called a unit vector.
(2) If v is not a unit vector ($\|v\| \ne 1$, $v \ne 0$), then normalizing v gives
$\dfrac{v}{\|v\|}$ (the unit vector in the direction of v).
Elementary Linear Algebra: Section 5.2, p.296
33/80

Ex 6: (Finding inner product)
$\langle p, q\rangle = a_0 b_0 + a_1 b_1 + \cdots + a_n b_n$ is an inner product.
Let $p(x) = 1 - 2x^2$ and $q(x) = 4 - 2x + x^2$ be polynomials in $P_2(x)$.
(a) $\langle p, q\rangle = ?$   (b) $\|q\| = ?$   (c) $d(p, q) = ?$
Sol:
(a) $\langle p, q\rangle = (1)(4) + (0)(-2) + (-2)(1) = 2$
(b) $\|q\| = \sqrt{\langle q, q\rangle} = \sqrt{4^2 + (-2)^2 + 1^2} = \sqrt{21}$
(c) $p - q = -3 + 2x - 3x^2$
$d(p, q) = \|p - q\| = \sqrt{\langle p - q, p - q\rangle} = \sqrt{(-3)^2 + 2^2 + (-3)^2} = \sqrt{22}$
Elementary Linear Algebra: Section 5.2, p.296
34/80
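Because the inner product in Example 6 uses only the coefficient vectors, it behaves exactly like a dot product on R³. A short sketch (an added illustration, assuming NumPy) reproduces parts (a)–(c):

```python
import numpy as np

p = np.array([1.0, 0.0, -2.0])    # coefficients of p(x) = 1 - 2x^2
q = np.array([4.0, -2.0, 1.0])    # coefficients of q(x) = 4 - 2x + x^2

print(np.dot(p, q))               # (a) <p, q> = 2.0
print(np.sqrt(np.dot(q, q)))      # (b) ||q|| = sqrt(21) = 4.5825...
print(np.linalg.norm(p - q))      # (c) d(p, q) = sqrt(22) = 4.6904...
```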

Properties of norm:
(1) $\|u\| \ge 0$
(2) $\|u\| = 0$ if and only if $u = 0$
(3) $\|cu\| = |c|\,\|u\|$

Properties of distance:
(1) $d(u, v) \ge 0$
(2) $d(u, v) = 0$ if and only if $u = v$
(3) $d(u, v) = d(v, u)$
Elementary Linear Algebra: Section 5.2, p.299
35/80

Thm 5.8:
Let u and v be vectors in an inner product space V.
(1) Cauchy-Schwarz inequality: $|\langle u, v\rangle| \le \|u\|\,\|v\|$   (cf. Theorem 5.4)
(2) Triangle inequality: $\|u + v\| \le \|u\| + \|v\|$   (cf. Theorem 5.5)
(3) Pythagorean theorem: u and v are orthogonal if and only if $\|u + v\|^2 = \|u\|^2 + \|v\|^2$   (cf. Theorem 5.6)
Elementary Linear Algebra: Section 5.2, p.299
36/80

Orthogonal projections in inner product spaces:
Let u and v be two vectors in an inner product space V,
such that $v \ne 0$. Then the orthogonal projection of u
onto v is given by
$\mathrm{proj}_{v} u = \dfrac{\langle u, v\rangle}{\langle v, v\rangle}\, v$

Note:
If v is a unit vector, then $\langle v, v\rangle = \|v\|^2 = 1$,
and the formula for the orthogonal projection of u onto v
takes the simpler form
$\mathrm{proj}_{v} u = \langle u, v\rangle\, v$
Elementary Linear Algebra: Section 5.2, p.301
37/80

Ex 10: (Finding an orthogonal projection in R3)
Use the Euclidean inner product in R3 to find the
orthogonal projection of u=(6, 2, 4) onto v=(1, 2, 0).
Sol:
$\langle u, v\rangle = (6)(1) + (2)(2) + (4)(0) = 10$
$\langle v, v\rangle = 1^2 + 2^2 + 0^2 = 5$
$\Rightarrow \mathrm{proj}_{v} u = \dfrac{\langle u, v\rangle}{\langle v, v\rangle}\, v = \dfrac{10}{5}(1, 2, 0) = (2, 4, 0)$

Note:
$u - \mathrm{proj}_{v} u = (6, 2, 4) - (2, 4, 0) = (4, -2, 4)$ is orthogonal to $v = (1, 2, 0)$.
Elementary Linear Algebra: Section 5.2, p.301
38/80
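The projection formula projᵥu = (⟨u, v⟩/⟨v, v⟩)v from Example 10 is a one-liner, and the residual u − projᵥu should be orthogonal to v. A minimal sketch, assuming NumPy:

```python
import numpy as np

u = np.array([6.0, 2.0, 4.0])
v = np.array([1.0, 2.0, 0.0])

proj = (np.dot(u, v) / np.dot(v, v)) * v
print(proj)                  # [2. 4. 0.]
print(np.dot(u - proj, v))   # 0.0: u - proj_v(u) is orthogonal to v
```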

Thm 5.9: (Orthogonal projection and distance)
Let u and v be two vectors in an inner product space V,
such that $v \ne 0$. Then
$d(u, \mathrm{proj}_{v} u) < d(u, cv)$ for all $c \ne \dfrac{\langle u, v\rangle}{\langle v, v\rangle}$
Elementary Linear Algebra: Section 5.2, p.302
39/80
Keywords in Section 5.2:
inner product
inner product space
norm
distance
angle
orthogonal
unit vector
normalizing
Cauchy-Schwarz inequality
triangle inequality
Pythagorean theorem
orthogonal projection
40/80
5.3 Orthonormal Bases: Gram-Schmidt Process

Orthogonal:
A set S of vectors in an inner product space V is called an
orthogonal set if every pair of vectors in the set is orthogonal.
$S = \{v_1, v_2, \ldots, v_n\} \subseteq V$ with $\langle v_i, v_j\rangle = 0$ for $i \ne j$

Orthonormal:
An orthogonal set in which each vector is a unit vector is
called orthonormal.
$S = \{v_1, v_2, \ldots, v_n\} \subseteq V$ with $\langle v_i, v_j\rangle = \begin{cases} 1 & i = j \\ 0 & i \ne j \end{cases}$

Note:
If S is a basis, then it is called an orthogonal basis or an
orthonormal basis.
Elementary Linear Algebra: Section 5.3, p.306
41/80

Ex 1: (A nonstandard orthonormal basis for R3)
Show that the following set is an orthonormal basis.
$S = \{v_1, v_2, v_3\} = \left\{\left(\tfrac{1}{\sqrt{2}}, \tfrac{1}{\sqrt{2}}, 0\right), \left(-\tfrac{\sqrt{2}}{6}, \tfrac{\sqrt{2}}{6}, \tfrac{2\sqrt{2}}{3}\right), \left(\tfrac{2}{3}, -\tfrac{2}{3}, \tfrac{1}{3}\right)\right\}$
Sol:
Show that the three vectors are mutually orthogonal.
$v_1 \cdot v_2 = -\tfrac{1}{6} + \tfrac{1}{6} + 0 = 0$
$v_1 \cdot v_3 = \tfrac{2}{3\sqrt{2}} - \tfrac{2}{3\sqrt{2}} + 0 = 0$
$v_2 \cdot v_3 = -\tfrac{\sqrt{2}}{9} - \tfrac{\sqrt{2}}{9} + \tfrac{2\sqrt{2}}{9} = 0$
Elementary Linear Algebra: Section 5.3, p.307
42/80
Show that each vector is of length 1.
$\|v_1\| = \sqrt{v_1 \cdot v_1} = \sqrt{\tfrac{1}{2} + \tfrac{1}{2} + 0} = 1$
$\|v_2\| = \sqrt{v_2 \cdot v_2} = \sqrt{\tfrac{2}{36} + \tfrac{2}{36} + \tfrac{8}{9}} = 1$
$\|v_3\| = \sqrt{v_3 \cdot v_3} = \sqrt{\tfrac{4}{9} + \tfrac{4}{9} + \tfrac{1}{9}} = 1$

Thus S is an orthonormal set, and since it consists of three nonzero
orthogonal vectors in $R^3$, it is a basis for $R^3$ (Corollary to Theorem 5.10),
hence an orthonormal basis.
Elementary Linear Algebra: Section 5.3, p.307
43/80
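Orthonormality of a finite set can be verified all at once: if the vectors are stacked as the rows of a matrix V, the set is orthonormal exactly when V Vᵀ is the identity matrix. A sketch for the set of Example 1 (an added check, assuming NumPy):

```python
import numpy as np

s2 = np.sqrt(2.0)
V = np.array([[ 1/s2,   1/s2,  0.0    ],   # v1
              [-s2/6,   s2/6,  2*s2/3 ],   # v2
              [ 2/3,   -2/3,   1/3    ]])  # v3

gram = V @ V.T                          # entry (i, j) is v_i . v_j
print(np.allclose(gram, np.eye(3)))     # True: S is an orthonormal set
```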

Ex 2: (An orthonormal basis for P3 ( x ) )
In $P_3(x)$, with the inner product
$\langle p, q\rangle = a_0 b_0 + a_1 b_1 + a_2 b_2$,
the standard basis $B = \{1, x, x^2\}$ is orthonormal.
Sol:
$v_1 = 1 + 0x + 0x^2$, $v_2 = 0 + x + 0x^2$, $v_3 = 0 + 0x + x^2$
Then
$\langle v_1, v_2\rangle = (1)(0) + (0)(1) + (0)(0) = 0$
$\langle v_1, v_3\rangle = (1)(0) + (0)(0) + (0)(1) = 0$
$\langle v_2, v_3\rangle = (0)(0) + (1)(0) + (0)(1) = 0$
Elementary Linear Algebra: Section 5.3, p.308
44/80
$\|v_1\| = \sqrt{\langle v_1, v_1\rangle} = \sqrt{(1)(1) + (0)(0) + (0)(0)} = 1$
$\|v_2\| = \sqrt{\langle v_2, v_2\rangle} = \sqrt{(0)(0) + (1)(1) + (0)(0)} = 1$
$\|v_3\| = \sqrt{\langle v_3, v_3\rangle} = \sqrt{(0)(0) + (0)(0) + (1)(1)} = 1$
Elementary Linear Algebra: Section 5.3, p.308
45/80

Thm 5.10: (Orthogonal sets are linearly independent)
If $S = \{v_1, v_2, \ldots, v_n\}$ is an orthogonal set of nonzero vectors
in an inner product space V, then S is linearly independent.
Pf:
S is an orthogonal set of nonzero vectors,
i.e., $\langle v_i, v_j\rangle = 0$ for $i \ne j$, and $\langle v_i, v_i\rangle \ne 0$.
Let $c_1 v_1 + c_2 v_2 + \cdots + c_n v_n = 0$.
$\Rightarrow \langle c_1 v_1 + c_2 v_2 + \cdots + c_n v_n, v_i\rangle = \langle 0, v_i\rangle = 0$ for each i
$\langle c_1 v_1 + \cdots + c_n v_n, v_i\rangle = c_1\langle v_1, v_i\rangle + c_2\langle v_2, v_i\rangle + \cdots + c_i\langle v_i, v_i\rangle + \cdots + c_n\langle v_n, v_i\rangle = c_i\langle v_i, v_i\rangle$
Since $\langle v_i, v_i\rangle \ne 0$, it follows that $c_i = 0$ for every i.
$\Rightarrow$ S is linearly independent.
Elementary Linear Algebra: Section 5.3, p.309
46/80

Corollary to Thm 5.10:
If V is an inner product space of dimension n, then any
orthogonal set of n nonzero vectors is a basis for V.
Elementary Linear Algebra: Section 5.3, p.310
47/80

Ex 4: (Using orthogonality to test for a basis)
Show that the following set is a basis for $R^4$:
$S = \{v_1, v_2, v_3, v_4\} = \{(2, 3, 2, -2), (1, 0, 0, 1), (-1, 0, 2, 1), (-1, 2, -1, 1)\}$
Sol:
$v_1, v_2, v_3, v_4$: nonzero vectors
$v_1 \cdot v_2 = 2 + 0 + 0 - 2 = 0$
$v_1 \cdot v_3 = -2 + 0 + 4 - 2 = 0$
$v_1 \cdot v_4 = -2 + 6 - 2 - 2 = 0$
$v_2 \cdot v_3 = -1 + 0 + 0 + 1 = 0$
$v_2 \cdot v_4 = -1 + 0 + 0 + 1 = 0$
$v_3 \cdot v_4 = 1 + 0 - 2 + 1 = 0$
$\Rightarrow$ S is orthogonal.
$\Rightarrow$ S is a basis for $R^4$ (by the Corollary to Theorem 5.10).
Elementary Linear Algebra: Section 5.3, p.310
48/80

Thm 5.11: (Coordinates relative to an orthonormal basis)
If B  {v1 , v 2 ,  , v n } is an orthonormal basis for an inner
product space V, then the coordinate representation of a vector
w with respect to B is
$w = \langle w, v_1\rangle v_1 + \langle w, v_2\rangle v_2 + \cdots + \langle w, v_n\rangle v_n$
Pf:
$B = \{v_1, v_2, \ldots, v_n\}$ is a basis for V, so for $w \in V$,
$w = k_1 v_1 + k_2 v_2 + \cdots + k_n v_n$ (unique representation)
Since $B = \{v_1, v_2, \ldots, v_n\}$ is orthonormal,
$\langle v_i, v_j\rangle = \begin{cases} 1 & i = j \\ 0 & i \ne j \end{cases}$
Elementary Linear Algebra: Section 5.3, pp.310-311
49/80
$\langle w, v_i\rangle = \langle (k_1 v_1 + k_2 v_2 + \cdots + k_n v_n), v_i\rangle$
$= k_1\langle v_1, v_i\rangle + \cdots + k_i\langle v_i, v_i\rangle + \cdots + k_n\langle v_n, v_i\rangle$
$= k_i$ for each i
$\Rightarrow w = \langle w, v_1\rangle v_1 + \langle w, v_2\rangle v_2 + \cdots + \langle w, v_n\rangle v_n$

Note:
If $B = \{v_1, v_2, \ldots, v_n\}$ is an orthonormal basis for V and $w \in V$,
then the corresponding coordinate matrix of w relative to B is
$[w]_B = \begin{bmatrix} \langle w, v_1\rangle \\ \langle w, v_2\rangle \\ \vdots \\ \langle w, v_n\rangle \end{bmatrix}$
Elementary Linear Algebra: Section 5.3, pp.310-311
50/80

Ex 5: (Representing vectors relative to an orthonormal basis)
Find the coordinates of w = (5, -5, 2) relative to the following
orthonormal basis for R 3 .
$B = \{v_1, v_2, v_3\} = \left\{\left(\tfrac{3}{5}, \tfrac{4}{5}, 0\right), \left(-\tfrac{4}{5}, \tfrac{3}{5}, 0\right), (0, 0, 1)\right\}$
Sol:
$\langle w, v_1\rangle = w \cdot v_1 = (5, -5, 2) \cdot \left(\tfrac{3}{5}, \tfrac{4}{5}, 0\right) = -1$
$\langle w, v_2\rangle = w \cdot v_2 = (5, -5, 2) \cdot \left(-\tfrac{4}{5}, \tfrac{3}{5}, 0\right) = -7$
$\langle w, v_3\rangle = w \cdot v_3 = (5, -5, 2) \cdot (0, 0, 1) = 2$
$\Rightarrow [w]_B = \begin{bmatrix} -1 \\ -7 \\ 2 \end{bmatrix}$
Elementary Linear Algebra: Section 5.3, p.311
51/80
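Theorem 5.11 makes finding coordinates easy: each coordinate of w is just ⟨w, vᵢ⟩. For Example 5 this is a single matrix–vector product when the basis vectors are stored as the rows of a matrix (an added sketch, assuming NumPy):

```python
import numpy as np

B = np.array([[ 3/5,  4/5, 0.0],   # v1
              [-4/5,  3/5, 0.0],   # v2
              [ 0.0,  0.0, 1.0]])  # v3
w = np.array([5.0, -5.0, 2.0])

coords = B @ w                      # [<w,v1>, <w,v2>, <w,v3>]
print(coords)                       # [-1. -7.  2.]
print(np.allclose(coords @ B, w))   # True: w = -1*v1 - 7*v2 + 2*v3
```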

Gram-Schmidt orthonormalization process:
$B = \{u_1, u_2, \ldots, u_n\}$ is a basis for an inner product space V.
Let
$v_1 = u_1$,   $W_1 = \mathrm{span}(\{v_1\})$
$v_2 = u_2 - \mathrm{proj}_{W_1} u_2 = u_2 - \dfrac{\langle u_2, v_1\rangle}{\langle v_1, v_1\rangle} v_1$,   $W_2 = \mathrm{span}(\{v_1, v_2\})$
$v_3 = u_3 - \mathrm{proj}_{W_2} u_3 = u_3 - \dfrac{\langle u_3, v_1\rangle}{\langle v_1, v_1\rangle} v_1 - \dfrac{\langle u_3, v_2\rangle}{\langle v_2, v_2\rangle} v_2$
$\vdots$
$v_n = u_n - \mathrm{proj}_{W_{n-1}} u_n = u_n - \sum_{i=1}^{n-1} \dfrac{\langle u_n, v_i\rangle}{\langle v_i, v_i\rangle} v_i$
$\Rightarrow B' = \{v_1, v_2, \ldots, v_n\}$ is an orthogonal basis.
$\Rightarrow B'' = \left\{\dfrac{v_1}{\|v_1\|}, \dfrac{v_2}{\|v_2\|}, \ldots, \dfrac{v_n}{\|v_n\|}\right\}$ is an orthonormal basis.
Elementary Linear Algebra: Section 5.3, p.312
52/80
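The process above translates directly into code. The function below is my own minimal sketch for the Euclidean inner product (it assumes NumPy and linearly independent input vectors); it returns both the orthogonal basis B' and the orthonormal basis B''.

```python
import numpy as np

def gram_schmidt(basis):
    """Classical Gram-Schmidt for the Euclidean inner product.
    basis: list of linearly independent 1-D arrays.
    Returns (orthogonal_basis, orthonormal_basis)."""
    orthogonal = []
    for u in basis:
        v = u.astype(float).copy()
        for w in orthogonal:
            v -= (np.dot(u, w) / np.dot(w, w)) * w   # subtract projection of u onto w
        orthogonal.append(v)
    orthonormal = [v / np.linalg.norm(v) for v in orthogonal]
    return orthogonal, orthonormal

# Example 7 data: B = {(1,1,0), (1,2,0), (0,1,2)}
B = [np.array([1.0, 1.0, 0.0]),
     np.array([1.0, 2.0, 0.0]),
     np.array([0.0, 1.0, 2.0])]
B_prime, B_double_prime = gram_schmidt(B)
print(B_prime)          # [(1,1,0), (-1/2,1/2,0), (0,0,2)]
print(B_double_prime)   # [(1/sqrt2,1/sqrt2,0), (-1/sqrt2,1/sqrt2,0), (0,0,1)]
```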

Ex 7: (Applying the Gram-Schmidt orthonormalization process)
Apply the Gram-Schmidt process to the following basis.
$B = \{u_1, u_2, u_3\} = \{(1, 1, 0), (1, 2, 0), (0, 1, 2)\}$
Sol:
$v_1 = u_1 = (1, 1, 0)$
$v_2 = u_2 - \dfrac{u_2 \cdot v_1}{v_1 \cdot v_1} v_1 = (1, 2, 0) - \dfrac{3}{2}(1, 1, 0) = \left(-\dfrac{1}{2}, \dfrac{1}{2}, 0\right)$
$v_3 = u_3 - \dfrac{u_3 \cdot v_1}{v_1 \cdot v_1} v_1 - \dfrac{u_3 \cdot v_2}{v_2 \cdot v_2} v_2 = (0, 1, 2) - \dfrac{1}{2}(1, 1, 0) - \dfrac{1/2}{1/2}\left(-\dfrac{1}{2}, \dfrac{1}{2}, 0\right) = (0, 0, 2)$
Elementary Linear Algebra: Section 5.3, pp.314-315
53/80
Orthogonal basis:
$B' = \{v_1, v_2, v_3\} = \left\{(1, 1, 0), \left(-\tfrac{1}{2}, \tfrac{1}{2}, 0\right), (0, 0, 2)\right\}$
Orthonormal basis:
$B'' = \left\{\dfrac{v_1}{\|v_1\|}, \dfrac{v_2}{\|v_2\|}, \dfrac{v_3}{\|v_3\|}\right\} = \left\{\left(\tfrac{1}{\sqrt{2}}, \tfrac{1}{\sqrt{2}}, 0\right), \left(-\tfrac{1}{\sqrt{2}}, \tfrac{1}{\sqrt{2}}, 0\right), (0, 0, 1)\right\}$
Elementary Linear Algebra: Section 5.3, pp.314-315
54/80

Ex 10: (Alternative form of Gram-Schmidt orthonormalization process)
Find an orthonormal basis for the solution space of the
homogeneous system of linear equations:
$x_1 + x_2 + 7x_4 = 0$
$2x_1 + x_2 + 2x_3 + 6x_4 = 0$
Sol:
$\begin{bmatrix} 1 & 1 & 0 & 7 & 0 \\ 2 & 1 & 2 & 6 & 0 \end{bmatrix} \xrightarrow{\text{Gauss-Jordan elimination}} \begin{bmatrix} 1 & 0 & 2 & -1 & 0 \\ 0 & 1 & -2 & 8 & 0 \end{bmatrix}$
$\begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{bmatrix} = \begin{bmatrix} -2s + t \\ 2s - 8t \\ s \\ t \end{bmatrix} = s\begin{bmatrix} -2 \\ 2 \\ 1 \\ 0 \end{bmatrix} + t\begin{bmatrix} 1 \\ -8 \\ 0 \\ 1 \end{bmatrix}$
Elementary Linear Algebra: Section 5.3, p.317
55/80
Thus one basis for the solution space is
$B = \{u_1, u_2\} = \{(-2, 2, 1, 0), (1, -8, 0, 1)\}$
$v_1 = u_1 = (-2, 2, 1, 0)$
$v_2 = u_2 - \dfrac{\langle u_2, v_1\rangle}{\langle v_1, v_1\rangle} v_1 = (1, -8, 0, 1) - \dfrac{-18}{9}(-2, 2, 1, 0) = (-3, -4, 2, 1)$
$\Rightarrow B' = \{(-2, 2, 1, 0), (-3, -4, 2, 1)\}$ (orthogonal basis)
$\Rightarrow B'' = \left\{\left(\tfrac{-2}{3}, \tfrac{2}{3}, \tfrac{1}{3}, 0\right), \left(\tfrac{-3}{\sqrt{30}}, \tfrac{-4}{\sqrt{30}}, \tfrac{2}{\sqrt{30}}, \tfrac{1}{\sqrt{30}}\right)\right\}$ (orthonormal basis)
Elementary Linear Algebra: Section 5.3, p.317
56/80
Keywords in Section 5.3:
orthogonal set
orthonormal set
orthogonal basis
orthonormal basis
linearly independent
Gram-Schmidt process
57/80
5.4 Mathematical Models and Least Squares Analysis

Orthogonal subspaces:
The subspaces $W_1$ and $W_2$ of an inner product space V are orthogonal
if $\langle v_1, v_2\rangle = 0$ for all $v_1$ in $W_1$ and all $v_2$ in $W_2$.

Ex 2: (Orthogonal subspaces)
The subspaces
$W_1 = \mathrm{span}\left(\left\{\begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}, \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}\right\}\right)$ and $W_2 = \mathrm{span}\left(\left\{\begin{bmatrix} -1 \\ 1 \\ 1 \end{bmatrix}\right\}\right)$
are orthogonal because $\langle v_1, v_2\rangle = 0$ for any vector $v_1$ in $W_1$ and any vector $v_2$ in $W_2$.
Elementary Linear Algebra: Section 5.4, p.321
58/80

Orthogonal complement of W:
Let W be a subspace of an inner product space V.
(a) A vector u in V is said to be orthogonal to W
if u is orthogonal to every vector in W.
(b) The set of all vectors in V that are orthogonal to every
vector in W is called the orthogonal complement of W:
$W^{\perp} = \{v \in V \mid \langle v, w\rangle = 0, \ \forall w \in W\}$
($W^{\perp}$ is read "W perp".)

Notes:
(1) $\{0\}^{\perp} = V$
(2) $V^{\perp} = \{0\}$
Elementary Linear Algebra: Section 5.4, p.322
59/80

Notes:
If W is a subspace of V, then
(1) $W^{\perp}$ is a subspace of V
(2) $W \cap W^{\perp} = \{0\}$
(3) $(W^{\perp})^{\perp} = W$

Ex:
If $V = R^2$ and W = the x-axis, then
(1) $W^{\perp}$ = the y-axis, which is a subspace of $R^2$
(2) $W \cap W^{\perp} = \{(0, 0)\}$
(3) $(W^{\perp})^{\perp} = W$
Elementary Linear Algebra: Section 5.4, Addition
60/80

Direct sum:
Let $W_1$ and $W_2$ be two subspaces of $R^n$. If each vector $x \in R^n$
can be uniquely written as a sum of a vector $w_1$ from $W_1$
and a vector $w_2$ from $W_2$, $x = w_1 + w_2$, then $R^n$ is the
direct sum of $W_1$ and $W_2$, and you can write $R^n = W_1 \oplus W_2$.

Thm 5.13: (Properties of orthogonal subspaces)
Let W be a subspace of $R^n$. Then the following properties
are true.
(1) $\dim(W) + \dim(W^{\perp}) = n$
(2) $R^n = W \oplus W^{\perp}$
(3) $(W^{\perp})^{\perp} = W$
Elementary Linear Algebra: Section 5.4, p.323
61/80

Thm 5.14: (Projection onto a subspace)
If $\{u_1, u_2, \ldots, u_t\}$ is an orthonormal basis for the
subspace W of V, and $v \in V$, then
$\mathrm{proj}_W v = \langle v, u_1\rangle u_1 + \langle v, u_2\rangle u_2 + \cdots + \langle v, u_t\rangle u_t$
Pf:
$\mathrm{proj}_W v \in W$ and $\{u_1, u_2, \ldots, u_t\}$ is an orthonormal basis for W
$\Rightarrow \mathrm{proj}_W v = \langle \mathrm{proj}_W v, u_1\rangle u_1 + \cdots + \langle \mathrm{proj}_W v, u_t\rangle u_t$
$= \langle v - \mathrm{proj}_{W^{\perp}} v, u_1\rangle u_1 + \cdots + \langle v - \mathrm{proj}_{W^{\perp}} v, u_t\rangle u_t$   (since $\mathrm{proj}_W v = v - \mathrm{proj}_{W^{\perp}} v$)
$= \langle v, u_1\rangle u_1 + \cdots + \langle v, u_t\rangle u_t$   (since $\langle \mathrm{proj}_{W^{\perp}} v, u_i\rangle = 0$ for all i)
Elementary Linear Algebra: Section 5.4, p.324
62/80

Ex 5: (Projection onto a subspace)
$w_1 = (0, 3, 1)$, $w_2 = (2, 0, 0)$, $v = (1, 1, 3)$
Find the projection of the vector v onto the subspace $W = \mathrm{span}(\{w_1, w_2\})$.
Sol:
$\{w_1, w_2\}$: an orthogonal basis for W
$\{u_1, u_2\} = \left\{\dfrac{w_1}{\|w_1\|}, \dfrac{w_2}{\|w_2\|}\right\} = \left\{\left(0, \tfrac{3}{\sqrt{10}}, \tfrac{1}{\sqrt{10}}\right), (1, 0, 0)\right\}$: an orthonormal basis for W
$\mathrm{proj}_W v = \langle v, u_1\rangle u_1 + \langle v, u_2\rangle u_2 = \dfrac{6}{\sqrt{10}}\left(0, \tfrac{3}{\sqrt{10}}, \tfrac{1}{\sqrt{10}}\right) + (1)(1, 0, 0) = \left(1, \tfrac{9}{5}, \tfrac{3}{5}\right)$
Elementary Linear Algebra: Section 5.4, p.325
63/80

Find by the other method:
$A = [w_1 \; w_2]$, $b = v$
$Ax = b \Rightarrow \hat{x} = (A^T A)^{-1} A^T b$
$\Rightarrow \mathrm{proj}_{CS(A)} b = A\hat{x} = A(A^T A)^{-1} A^T b$
Elementary Linear Algebra: Section 5.4, p.325
64/80
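Both routes to the projection in Example 5 are easy to compare numerically. The sketch below (an added illustration, assuming NumPy) uses the orthonormal-basis formula and the matrix formula A(AᵀA)⁻¹Aᵀb on the same data:

```python
import numpy as np

w1 = np.array([0.0, 3.0, 1.0])
w2 = np.array([2.0, 0.0, 0.0])
v  = np.array([1.0, 1.0, 3.0])

# Method 1: orthonormal basis {u1, u2} of W (w1 and w2 are already orthogonal)
u1 = w1 / np.linalg.norm(w1)
u2 = w2 / np.linalg.norm(w2)
proj1 = np.dot(v, u1) * u1 + np.dot(v, u2) * u2

# Method 2: proj_{CS(A)} b = A (A^T A)^{-1} A^T b, with A = [w1 w2] and b = v
A = np.column_stack([w1, w2])
proj2 = A @ np.linalg.solve(A.T @ A, A.T @ v)

print(proj1, proj2)   # both [1.  1.8 0.6], i.e. (1, 9/5, 3/5)
```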

Thm 5.15: (Orthogonal projection and distance)
Let W be a subspace of an inner product space V, and let $v \in V$.
Then for all $w \in W$ with $w \ne \mathrm{proj}_W v$,
$\|v - \mathrm{proj}_W v\| < \|v - w\|$,
i.e., $\|v - \mathrm{proj}_W v\| = \min_{w \in W} \|v - w\|$
($\mathrm{proj}_W v$ is the best approximation to v from W)
Elementary Linear Algebra: Section 5.4, p.326
65/80

Pf:
$v - w = (v - \mathrm{proj}_W v) + (\mathrm{proj}_W v - w)$, where
$(v - \mathrm{proj}_W v) \perp (\mathrm{proj}_W v - w)$.
By the Pythagorean theorem,
$\|v - w\|^2 = \|v - \mathrm{proj}_W v\|^2 + \|\mathrm{proj}_W v - w\|^2$
Since $w \ne \mathrm{proj}_W v$, we have $\|\mathrm{proj}_W v - w\|^2 > 0$,
so $\|v - w\|^2 > \|v - \mathrm{proj}_W v\|^2$
$\Rightarrow \|v - \mathrm{proj}_W v\| < \|v - w\|$
Elementary Linear Algebra: Section 5.4, p.326
66/80

Notes:
(1) Among all the scalar multiples of a vector u, the
orthogonal projection of v onto u is the one that is
closest to v. (p.302 Thm 5.9)
(2) Among all the vectors in the subspace W, the vector
projW v is the closest vector to v.
Elementary Linear Algebra: Section 5.4, p.325
67/80

Thm 5.16: (Fundamental subspaces of a matrix)
If A is an m×n matrix, then
(1) $(CS(A))^{\perp} = NS(A^T)$ and $(NS(A^T))^{\perp} = CS(A)$
(2) $(CS(A^T))^{\perp} = NS(A)$ and $(NS(A))^{\perp} = CS(A^T)$
(3) $CS(A) \oplus NS(A^T) = R^m$
(4) $CS(A^T) \oplus NS(A) = R^n$
Elementary Linear Algebra: Section 5.4, p.327
68/80

Ex 6: (Fundamental subspaces)
Find the four fundamental subspaces of the matrix
$A = \begin{bmatrix} 1 & 2 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}$ (in reduced row-echelon form).
Sol:
$CS(A) = \mathrm{span}(\{(1, 0, 0, 0), (0, 1, 0, 0)\})$ is a subspace of $R^4$
$CS(A^T) = RS(A) = \mathrm{span}(\{(1, 2, 0), (0, 0, 1)\})$ is a subspace of $R^3$
$NS(A) = \mathrm{span}(\{(-2, 1, 0)\})$ is a subspace of $R^3$
Elementary Linear Algebra: Section 5.4, p.326
69/80
1 0 0 0 
1 0 0 0 
A  2 0 0 0 ~ R  0 1 0 0
0 1 0 0
0 0 0 0
s t
NS ( A )  span 0,0,1,0 0,0,0,1 is a subspace of R 4

Check:
(CS ( A))  NS ( A )
(CS ( A ))  NS ( A)
CS ( A)  NS ( AT )  R4
CS ( AT )  NS ( A)  R3
Elementary Linear Algebra: Section 5.4, p.327
70/80

Ex 3 & Ex 4:
Let W be the subspace of $R^4$ with $W = \mathrm{span}(\{w_1, w_2\})$, where
$w_1 = (1, 2, 1, 0)$ and $w_2 = (0, 0, 0, 1)$.
(a) Find a basis for W.
(b) Find a basis for the orthogonal complement of W.
Sol:
$A = \begin{bmatrix} 1 & 0 \\ 2 & 0 \\ 1 & 0 \\ 0 & 1 \end{bmatrix}$ (columns $w_1, w_2$) $\;\sim\; R = \begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 0 & 0 \\ 0 & 0 \end{bmatrix}$ (reduced row-echelon form)
Elementary Linear Algebra: Section 5.4, pp.322-323
71/80
(a) $W = CS(A)$, so $\{(1, 2, 1, 0), (0, 0, 0, 1)\}$ is a basis for W.
(b) $W^{\perp} = (CS(A))^{\perp} = NS(A^T)$
$A^T = \begin{bmatrix} 1 & 2 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$
$\begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{bmatrix} = \begin{bmatrix} -2s - t \\ s \\ t \\ 0 \end{bmatrix} = s\begin{bmatrix} -2 \\ 1 \\ 0 \\ 0 \end{bmatrix} + t\begin{bmatrix} -1 \\ 0 \\ 1 \\ 0 \end{bmatrix}$
$\Rightarrow \{(-2, 1, 0, 0), (-1, 0, 1, 0)\}$ is a basis for $W^{\perp}$

Notes:
(1) $\dim(W) + \dim(W^{\perp}) = \dim(R^4)$
(2) $W \oplus W^{\perp} = R^4$
Elementary Linear Algebra: Section 5.4, pp.322-323
72/80

Least squares problem:
$Ax = b$, where A is m×n, x is n×1, and b is m×1
(a system of linear equations)
(1) When the system is consistent, we can use Gaussian
elimination with back-substitution to solve for x.
(2) When the system is inconsistent, we look for the "best possible"
solution of the system, that is, the value of x for which the
difference between Ax and b is as small as possible.
Elementary Linear Algebra: Section 5.4, p.320
73/80

Least squares solution:
Given a system Ax = b of m linear equations in n unknowns,
the least squares problem is to find a vector x in Rn that
minimizes $\|Ax - b\|$ with respect to the Euclidean inner
product on $R^n$. Such a vector is called a least squares
solution of Ax = b.

Notes:
The least squares problem is to find a vector $\hat{x}$ in $R^n$ such that
$A\hat{x} = \mathrm{proj}_{CS(A)} b$ in the column space of A (i.e., $A\hat{x} \in CS(A)$)
is as close as possible to b. That is,
$\|b - \mathrm{proj}_{CS(A)} b\| = \|b - A\hat{x}\| = \min_{x \in R^n} \|b - Ax\|$
Elementary Linear Algebra: Section 5.4, p.328
74/80
$A \in M_{m \times n}$, $x \in R^n$
$Ax \in CS(A)$ ($CS(A)$ is a subspace of $R^m$)
Suppose $b \notin CS(A)$ (i.e., $Ax = b$ is an inconsistent system).
Let $A\hat{x} = \mathrm{proj}_{CS(A)} b$
$\Rightarrow (b - A\hat{x}) \perp CS(A)$
$\Rightarrow b - A\hat{x} \in (CS(A))^{\perp} = NS(A^T)$
$\Rightarrow A^T(b - A\hat{x}) = 0$
i.e., $A^T A\hat{x} = A^T b$ (the normal system associated with Ax = b)
Elementary Linear Algebra: Section 5.4, pp.327-328
75/80

Note: (Ax = b is an inconsistent system)
The problem of finding the least squares solution of $Ax = b$
is equivalent to the problem of finding an exact solution of the
associated normal system $A^T A\hat{x} = A^T b$.
Elementary Linear Algebra: Section 5.4, p.328
76/80

Ex 7: (Solving the normal equations)
Find the least squares solution of the following system
$Ax = b$:
$\begin{bmatrix} 1 & 1 \\ 1 & 2 \\ 1 & 3 \end{bmatrix} \begin{bmatrix} c_0 \\ c_1 \end{bmatrix} = \begin{bmatrix} 0 \\ 1 \\ 3 \end{bmatrix}$
(this system is inconsistent)
and find the orthogonal projection of b on the column space of A.
Elementary Linear Algebra: Section 5.4, p.328
77/80

Sol:
1 1
1 1 1 
3 6 

T
A A
1 2  



1
2
3
6
14

 1 3 



0 
1 1 1    4 
T
A b
1   


1 2 3 3 11
 
the associated normal system
AT Axˆ  AT b
 3 6   c0   4 
6 14  c   11

 1   
Elementary Linear Algebra: Section 5.4, p.328
78/80
The least squares solution of Ax = b:
$\hat{x} = \begin{bmatrix} -\tfrac{5}{3} \\ \tfrac{3}{2} \end{bmatrix}$
The orthogonal projection of b on the column space of A:
$\mathrm{proj}_{CS(A)} b = A\hat{x} = \begin{bmatrix} 1 & 1 \\ 1 & 2 \\ 1 & 3 \end{bmatrix} \begin{bmatrix} -\tfrac{5}{3} \\ \tfrac{3}{2} \end{bmatrix} = \begin{bmatrix} -\tfrac{1}{6} \\ \tfrac{8}{6} \\ \tfrac{17}{6} \end{bmatrix}$
Elementary Linear Algebra: Section 5.4, p.329
79/80
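As a check on Example 7, the sketch below (an added illustration, assuming NumPy) solves the normal system AᵀAx̂ = Aᵀb directly and compares it with NumPy's least squares solver, then forms the projection Ax̂:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([0.0, 1.0, 3.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)         # solve the normal equations
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)   # library least squares solver

print(x_hat, x_lstsq)   # both [-1.6667  1.5], i.e. (-5/3, 3/2)
print(A @ x_hat)        # [-0.1667  1.3333  2.8333], i.e. (-1/6, 8/6, 17/6)
```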
Keywords in Section 5.4:
orthogonal to W
orthogonal complement
direct sum
projection onto a subspace
fundamental subspaces
least squares problem
normal equations
80/80