
Gramin (ACS) Mahavidyalaya Vasantnagar, Kotgyal
Dept. of Mathematics, B.Sc. T.Y., Sem. 5
Paper No. 14 (Linear Algebra)
By Dr. S. S. Zampalwad

(HOD)

Department of Mathematics

Gramin ACS College Vasantnagar, Kotgyal

Tq. Mukhed, Dist. Nanded


Inner Product Spaces

Inner product

Linear functional

Adjoint


  • Assume F is a subfield of R or C.
  • Let V be a vector space over F.
  • An inner product on V is a function VxV -> F, i.e., a, b in V -> (a|b) in F, s.t.
    • (a) (a+b|r) = (a|r) + (b|r)
    • (b) (ca|r) = c(a|r)
    • (c) (b|a) = (a|b)‾, where ‾ denotes complex conjugation
    • (d) (a|a) > 0 if a ≠ 0.
  • So ( | ) is linear in the first variable, conjugate-symmetric, and positive (hence nondegenerate).
  • It gives a way to measure lengths and angles.


  • Examples:
  • Fⁿ has a standard inner product:
    • ((x₁,…,xₙ)|(y₁,…,yₙ)) = x₁ȳ₁ + … + xₙȳₙ.
    • If F is a subfield of R, then (x|y) = x₁y₁ + … + xₙyₙ.
  • A, B in Fⁿˣⁿ:
    • (A|B) = tr(AB*) = tr(B*A)
      • Linearity and conjugate-symmetry: easy to see.
      • tr(AB*) = Σⱼ(AB*)ⱼⱼ = ΣⱼΣₖ Aⱼₖ(Bⱼₖ)‾, i.e., the standard inner product on the n² entries. A numerical check follows.
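
A quick numerical check of both examples, as a NumPy sketch (the vectors, matrices, and seed are arbitrary sample data):

```python
import numpy as np

# Standard inner product on C^n: (x|y) = x_1*conj(y_1) + ... + x_n*conj(y_n).
x = np.array([1 + 2j, 3.0, -1j])
y = np.array([2.0, 1 - 1j, 4.0])
std = np.sum(x * np.conj(y))

# Trace inner product on n-by-n matrices: (A|B) = tr(A B*).
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
tr_form = np.trace(A @ B.conj().T)          # tr(A B*)
entry_form = np.sum(A * np.conj(B))         # sum_j sum_k A_jk conj(B_jk)
assert np.isclose(tr_form, entry_form)      # the two expressions agree
```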


  • (X|Y) = Y*Q*QX, where X, Y in Fⁿˣ¹ and Q is an n×n invertible matrix.
    • Linearity and conjugate-symmetry follow easily.
    • (X|X) = X*Q*QX = (QX|QX)std > 0 for X ≠ 0, since Q invertible gives QX ≠ 0.
    • In fact, every inner product on Fⁿˣ¹ is of this form (write its Gram matrix as G = Q*Q; see the classification below).
  • If T: V -> W is linear and injective and W has an inner product, then V has an “induced” inner product:
    • pT(a|b) := (Ta|Tb).
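
A sketch checking that Y*Q*QX is just the standard inner product applied to QX and QY (Q, X, Y are random sample data; a random complex matrix is invertible with probability 1):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
Q = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
X = rng.standard_normal((n, 1)) + 1j * rng.standard_normal((n, 1))
Y = rng.standard_normal((n, 1)) + 1j * rng.standard_normal((n, 1))

ip = (Y.conj().T @ Q.conj().T @ Q @ X).item()   # (X|Y) = Y* Q* Q X
std = ((Q @ Y).conj().T @ (Q @ X)).item()       # (QX|QY), standard
assert np.isclose(ip, std)
assert (X.conj().T @ Q.conj().T @ Q @ X).item().real > 0  # (X|X) > 0 for X != 0
```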


  • (a) For any basis B = {a₁,…,aₙ}, there is an inner product s.t. (aᵢ|aⱼ) = δᵢⱼ.
    • Define T: V -> Fⁿ s.t. aᵢ -> eᵢ (an isomorphism, hence injective).
    • Then pT(aᵢ|aⱼ) = (eᵢ|eⱼ) = δᵢⱼ.
  • (b) V = {f: [0,1] -> C | f is continuous}.
    • (f|g) = ∫₀¹ f ḡ dt for f, g in V is an inner product.
    • T: V -> V defined by (Tf)(t) = t f(t) is linear and injective.
    • pT(f|g) = ∫₀¹ (tf)(t ḡ) dt = ∫₀¹ t² f ḡ dt is an inner product.
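
A numerical sketch of the induced inner product in (b), approximating the integrals by the trapezoidal rule (f and g are sample continuous functions of my choosing):

```python
import numpy as np

t = np.linspace(0.0, 1.0, 10_001)
f = np.exp(2j * np.pi * t)            # a sample continuous f on [0, 1]
g = t**2 + 1                          # a sample continuous g on [0, 1]

# (f|g) = integral of f*conj(g), p_T(f|g) = integral of t^2*f*conj(g)
# (use np.trapz instead of np.trapezoid on NumPy < 2.0).
fg = np.trapezoid(f * np.conj(g), t)
ptfg = np.trapezoid(t**2 * f * np.conj(g), t)
```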


  • Polarization identities: suppose i lies in F (e.g., F = C).
  • (a|b) = Re(a|b) + i Re(a|ib) (*):
    • (a|b) = Re(a|b) + i Im(a|b).
    • Use the identity Im(z) = Re(-iz).
    • Im(a|b) = Re(-i(a|b)) = Re(a|ib), since (a|ib) = -i(a|b).
  • Define the norm ||a|| := (a|a)^(1/2).
  • ||a±b||² = ||a||² ± 2Re(a|b) + ||b||² (**).
  • (a|b) = ||a+b||²/4 - ||a-b||²/4 + i||a+ib||²/4 - i||a-ib||²/4 (proof by (*) and (**)).
  • (a|b) = ||a+b||²/4 - ||a-b||²/4 if F is a subfield of R.
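
The complex polarization identity can be verified numerically for the standard inner product on Cⁿ (a sketch with random sample vectors):

```python
import numpy as np

def ip(a, b):
    return np.sum(a * np.conj(b))   # (a|b): linear in a, conjugate-linear in b

def norm2(a):
    return ip(a, a).real            # ||a||^2

rng = np.random.default_rng(2)
a = rng.standard_normal(5) + 1j * rng.standard_normal(5)
b = rng.standard_normal(5) + 1j * rng.standard_normal(5)

polar = (norm2(a + b) - norm2(a - b)
         + 1j * norm2(a + 1j * b) - 1j * norm2(a - 1j * b)) / 4
assert np.isclose(polar, ip(a, b))  # the polarization identity holds
```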


  • When V is finite-dimensional, inner products can be completely classified.
  • Given a basis B = {a₁,…,aₙ} and any inner product ( | ): (a|b) = Y*GX for X = [a]_B, Y = [b]_B, where
    • G is an n×n matrix with G = G* and X*GX > 0 for any X ≠ 0.
  • Proof: (->) Let Gⱼₖ = (aₖ|aⱼ).


    • G = G*: (aⱼ|aₖ) = (aₖ|aⱼ)‾, so Gₖⱼ = (Gⱼₖ)‾.
    • X*GX = (a|a) > 0 if X ≠ 0.
    • (Hence G is invertible: GX ≠ 0 for X ≠ 0 by the above.)
    • (<-) Conversely, (X|Y) := Y*GX is an inner product on Fⁿˣ¹ for any such G.
      • Then (a|b) := [b]_B* G [a]_B is the inner product on V induced by the linear transformation T sending aᵢ to eᵢ.

    • Recall the Cholesky decomposition: a Hermitian positive-definite matrix factors as A = LL*, with L lower triangular with real positive diagonal. (All of these are useful in applied mathematics; a sketch follows.)
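
A sketch connecting the classification to the Cholesky decomposition: with Q = L*, the inner product Y*GX is the standard inner product of QX and QY, as in the earlier example. (Here G is built as M*M for a random M, an assumption of this sketch.)

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
G = M.conj().T @ M                    # Hermitian positive definite

L = np.linalg.cholesky(G)             # G = L L*, L lower triangular
assert np.allclose(G, L @ L.conj().T)

X = rng.standard_normal((n, 1)) + 1j * rng.standard_normal((n, 1))
Y = rng.standard_normal((n, 1)) + 1j * rng.standard_normal((n, 1))
Q = L.conj().T                        # so that G = Q*Q
assert np.isclose((Y.conj().T @ G @ X).item(),
                  ((Q @ Y).conj().T @ (Q @ X)).item())
```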


Inner product spaces

  • Definition: An inner product space is a pair (V, ( | )): a vector space with an inner product on it.
  • F ⊂ R -> Euclidean space.
  • F ⊂ C -> unitary space.
  • Theorem 1. Let (V, ( | )) be an inner product space. Then:
    1. ||ca|| = |c| ||a||.
    2. ||a|| > 0 for a ≠ 0.
    3. |(a|b)| ≤ ||a|| ||b|| (Cauchy-Schwarz).
    4. ||a+b|| ≤ ||a|| + ||b|| (triangle inequality).


  • Proof (ii): ||a||² = (a|a) > 0 for a ≠ 0 by axiom (d), so ||a|| > 0.

  • Proof (iii): If a = 0, both sides are 0. If a ≠ 0, set c = (b|a)/||a||²; then
    0 ≤ ||b - ca||² = ||b||² - |(b|a)|²/||a||²,
    which rearranges to |(a|b)|² ≤ ||a||² ||b||².
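
A numerical spot-check of Cauchy-Schwarz and the triangle inequality on random complex vectors (a sketch, not a proof):

```python
import numpy as np

rng = np.random.default_rng(4)
a = rng.standard_normal(6) + 1j * rng.standard_normal(6)
b = rng.standard_normal(6) + 1j * rng.standard_normal(6)

ip = np.sum(a * np.conj(b))                 # (a|b)
na, nb = np.linalg.norm(a), np.linalg.norm(b)
assert abs(ip) <= na * nb                   # Cauchy-Schwarz
assert np.linalg.norm(a + b) <= na + nb     # triangle inequality
```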


  • In fact, many inequalities follow from the Cauchy-Schwarz inequality.
  • The triangle inequality also follows from it.
  • See Example 7.
  • Example 7(d) is useful in defining Hilbert spaces. Similar inequalities are used heavily in analysis, PDE, and so on.
  • Note that in Example 7 no computations are involved in proving these.


  • On inner product spaces one can use the inner product to simplify many things occurring in vector spaces:
    • Bases -> orthogonal bases.
    • Projections -> orthogonal projections.
    • Complements -> orthogonal complements.
    • Linear maps acquire adjoints.
    • Linear functionals become vectors.
    • Operators -> orthogonal (unitary) operators and self-adjoint operators (we restrict to …).


Orthogonal basis

  • Definition:
    • a,b in V, a⊥b if (a|b)=0.
    • The zero vector is orthogonal to every vector.
    • An orthogonal set S is a set in which any two distinct vectors are orthogonal.
    • An orthonormal set S is an orthogonal set of unit vectors.


  • Theorem 2. An orthogonal set of nonzero vectors is linearly independent.
  • Proof: Let a₁,…,aₘ be distinct vectors in the set.
    • Suppose 0 = b = c₁a₁ + … + cₘaₘ.
    • 0 = (b|aₖ) = (c₁a₁ + … + cₘaₘ|aₖ) = cₖ(aₖ|aₖ).
    • Since aₖ ≠ 0, cₖ = 0 for each k.
  • Corollary. If b is a linear combination of an orthogonal set a₁,…,aₘ of nonzero vectors, then b = Σₖ₌₁ᵐ ((b|aₖ)/||aₖ||²) aₖ.
  • Proof: As above, b = Σ cₖaₖ gives (b|aₖ) = cₖ||aₖ||², so cₖ = (b|aₖ)/||aₖ||². A short numerical check follows.
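
A sketch of the corollary: the coefficients of b in an orthogonal (not necessarily normalized) set are recovered as cₖ = (b|aₖ)/||aₖ||² (sample vectors of my choosing):

```python
import numpy as np

a1 = np.array([1.0, 1.0, 0.0])
a2 = np.array([1.0, -1.0, 0.0])      # (a1|a2) = 0, both nonzero
b = 3.0 * a1 - 2.0 * a2              # b = c1*a1 + c2*a2

coeffs = [np.dot(b, a) / np.dot(a, a) for a in (a1, a2)]
assert np.allclose(coeffs, [3.0, -2.0])   # c_k = (b|a_k) / ||a_k||^2
```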


  • Gram-Schmidt orthogonalization:
  • Theorem 3. Let b₁,…,bₙ in V be independent. Then one may construct an orthogonal basis a₁,…,aₙ s.t. {a₁,…,aₖ} is a basis for <b₁,…,bₖ> for each k = 1,…,n.
  • Proof: a₁ := b₁, a₂ := b₂ - ((b₂|a₁)/||a₁||²) a₁, …
    • Induction: {a₁,…,aₘ} is constructed and is a basis for <b₁,…,bₘ>.
    • Define aₘ₊₁ := bₘ₊₁ - Σₖ₌₁ᵐ ((bₘ₊₁|aₖ)/||aₖ||²) aₖ.


    • Then aₘ₊₁ ≠ 0 (otherwise bₘ₊₁ would lie in <a₁,…,aₘ> = <b₁,…,bₘ>, contradicting independence), and (aₘ₊₁|aⱼ) = 0 for 1 ≤ j ≤ m by the choice of coefficients.

    • Use Theorem 2 to show that the resulting set {a₁,…,aₘ₊₁} is independent and hence is a basis of <b₁,…,bₘ₊₁>.

  • See p. 281, equation (8-10), for some examples.
  • See Examples 12 and 13, and the runnable sketch below.
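
A minimal sketch of the Gram-Schmidt procedure of Theorem 3, using the standard inner product on Cⁿ (the input vectors are sample data):

```python
import numpy as np

def gram_schmidt(b):
    """Orthogonalize independent vectors b_1, ..., b_n (the rows of b).

    Returns orthogonal a_1, ..., a_n with <a_1,...,a_k> = <b_1,...,b_k>.
    """
    a = []
    for bk in b:
        ak = bk.astype(complex)
        for aj in a:
            # a_{m+1} = b_{m+1} - sum_k ((b_{m+1}|a_k)/||a_k||^2) a_k
            ak = ak - (np.vdot(aj, bk) / np.vdot(aj, aj)) * aj
        a.append(ak)
    return np.array(a)

B = np.array([[3.0, 0.0, 4.0],
              [-1.0, 0.0, 7.0],
              [2.0, 9.0, 11.0]])
A = gram_schmidt(B)
G = A @ A.conj().T                           # Gram matrix of the output
assert np.allclose(G, np.diag(np.diag(G)))   # off-diagonal entries vanish
```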


Best approximation, Orthogonal complement, Orthogonal projections

  • This is used often in applied mathematics, where one must approximate a vector by elements of a subspace.
  • Definition: W a subspace of V, b in V. Then a best approximation of b by a vector in W is an a in W s.t. ||b-a|| ≤ ||b-c|| for all c in W.
  • Existence and uniqueness hold in the finite-dimensional case.


  • Theorem 4: W a subspace of V, b in V.
    • (i) a is a best approximation to b <-> b-a ⊥ c for all c in W.
    • (ii) A best approximation is unique (if it exists).
    • (iii) If W is finite-dimensional and {a₁,…,aₖ} is any orthonormal basis of W, then a = Σⱼ₌₁ᵏ (b|aⱼ) aⱼ is the best approximation to b by vectors in W.


  • Proof: (i)
    • Fact: Let c in W. Then b-c = (b-a) + (a-c), so ||b-c||² = ||b-a||² + 2Re(b-a|a-c) + ||a-c||² (*).
    • (<-) Suppose b-a ⊥ W. If c ≠ a, then ||b-c||² = ||b-a||² + ||a-c||² > ||b-a||². Hence a is the best approximation.
    • (->) Suppose ||b-c|| ≥ ||b-a|| for every c in W.
      • By (*), 2Re(b-a|a-c) + ||a-c||² ≥ 0,
      • <-> 2Re(b-a|t) + ||t||² ≥ 0 for every t in W (every t in W has the form a-c).
      • If a ≠ c, take t = -((b-a|a-c)/||a-c||²)(a-c); then 2Re(b-a|t) + ||t||² = -|(b-a|a-c)|²/||a-c||².



      • Since that quantity is always ≤ 0, the inequality holds <-> (b-a|a-c) = 0, for any c in W.
      • Thus, b-a is ⊥ every vector in W.
    • (ii) Let a, a′ be best approximations to b in W.
      • b-a ⊥ every v in W, and b-a′ ⊥ every v in W.
      • If a ≠ a′, then by (*) ||b-a′||² = ||b-a||² + 2Re(b-a|a-a′) + ||a-a′||² = ||b-a||² + ||a-a′||², since a-a′ is in W. Hence ||b-a′|| > ||b-a||.
      • Symmetrically, ||b-a|| > ||b-a′||.
      • This is a contradiction, so a = a′.


    • (iii) With a = Σⱼ (b|aⱼ) aⱼ, take the inner product of b-a with aₖ:
      (b-a|aₖ) = (b|aₖ) - Σⱼ (b|aⱼ)(aⱼ|aₖ) = (b|aₖ) - (b|aₖ) = 0.
    • Thus b-a ⊥ every vector in W, and a is the best approximation by (i).
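
A sketch of Theorem 4(iii): the best approximation to b in W is a = Σⱼ (b|aⱼ)aⱼ for an orthonormal basis of W, and b - a is orthogonal to W (sample data of my choosing):

```python
import numpy as np

u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)   # orthonormal basis of a plane W
u2 = np.array([0.0, 0.0, 1.0])
b = np.array([2.0, -3.0, 5.0])

a = sum(np.dot(b, u) * u for u in (u1, u2))   # best approximation in W
for u in (u1, u2):
    assert np.isclose(np.dot(b - a, u), 0.0)  # b - a is orthogonal to W
```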


Orthogonal projection

  • Orthogonal complement: S a set in V.
  • S⊥ := {v in V | v ⊥ w for all w in S}.
  • S⊥ is a subspace, and V⊥ = {0}.
  • If S is a subspace (V finite-dimensional), then V = S ⊕ S⊥ and (S⊥)⊥ = S.
  • Proof: Apply Gram-Schmidt orthogonalization to a basis {a₁,…,aᵣ,aᵣ₊₁,…,aₙ} of V in which {a₁,…,aᵣ} is a basis of S; then {aᵣ₊₁,…,aₙ} is a basis of S⊥.
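
A numerical sketch of the orthogonal complement: the rows of S span a subspace, and an orthonormal basis of S⊥ can be read off from the SVD (the standard null-space computation):

```python
import numpy as np

S = np.array([[1.0, 1.0, 0.0]])      # rows span the subspace S
_, sing, Vt = np.linalg.svd(S)
rank = int(np.sum(sing > 1e-12))
perp = Vt[rank:]                     # rows: an orthonormal basis of S-perp
assert np.allclose(S @ perp.T, 0.0)  # every basis vector is orthogonal to S
assert rank + len(perp) == S.shape[1]   # dim S + dim S-perp = dim V
```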


  • Orthogonal projection: EW: V -> W sends a in V to its best approximation in W.
  • By Theorem 4, this is well-defined for any finite-dimensional subspace W.
  • EW is linear, by Theorem 5.
  • EW is a projection since EW∘EW(v) = EW(v): a vector of W is its own best approximation.
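
In matrix form (a sketch in the real case): for W spanned by the orthonormal rows of U, the matrix of EW is P = UᵀU, and idempotence is visible directly.

```python
import numpy as np

U = np.array([[1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
U[0] /= np.linalg.norm(U[0])         # make the rows orthonormal
P = U.T @ U                          # matrix of E_W, W = row space of U

assert np.allclose(P @ P, P)         # E_W o E_W = E_W: a projection
assert np.allclose(P, P.T)           # symmetric, as orthogonal projections are
```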


  • Theorem 5: W a subspace of V, E the orthogonal projection V -> W. Then E is a projection, W⊥ = null E, and V = W ⊕ W⊥.
  • Proof:
    • Linearity:
      • a, b in V, c in F. a-Ea, b-Eb ⊥ all v in W.
      • c(a-Ea) + (b-Eb) = (ca+b) - (cE(a)+E(b)) ⊥ all v in W.
      • Thus by uniqueness of best approximations, E(ca+b) = cEa + Eb.
    • null E ⊂ W⊥: If b is in null E, then b = b-Eb is in W⊥.
    • W⊥ ⊂ null E: If b is in W⊥, then b-0 is in W⊥, so 0 is the best approximation to b by Theorem 4(i), and Eb = 0.
    • Since V = Im E ⊕ null E and Im E = W, we are done.


  • Corollary: b -> b - EWb is the orthogonal projection onto W⊥; I - EW is an idempotent linear transformation, i.e., a projection.
  • Proof: b - EWb is in W⊥ by Theorem 4(i).
    • Let c be in W⊥. b-c = Eb + (b-Eb-c).
    • Eb is in W, and b-Eb-c is in W⊥.
    • ||b-c||² = ||Eb||² + ||b-Eb-c||² ≥ ||b-(b-Eb)||², with strict inequality if c ≠ b-Eb.
    • Thus, b-Eb is the best approximation to b in W⊥.
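
In the same matrix picture as above, I - P is idempotent and sends b to its best approximation in W⊥ (sample data):

```python
import numpy as np

U = np.array([[1.0, 0.0, 0.0]])      # orthonormal basis of W = the x-axis
P = U.T @ U
Ic = np.eye(3) - P                   # the complementary projection I - E_W

b = np.array([2.0, -1.0, 3.0])
assert np.allclose(Ic @ b, [0.0, -1.0, 3.0])   # component of b in W-perp
assert np.allclose(Ic @ Ic, Ic)                # I - E_W is idempotent
```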


Bessel’s inequality

  • {a₁,…,aₙ} an orthogonal set of nonzero vectors. Then for any b in V,
    Σₖ₌₁ⁿ |(b|aₖ)|²/||aₖ||² ≤ ||b||².
  • Equality holds <-> b = Σₖ₌₁ⁿ ((b|aₖ)/||aₖ||²) aₖ, i.e., b lies in the span of the aₖ.
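
A numerical sketch of Bessel's inequality with a sample orthogonal set and vector; the inequality is strict here because b is not in the span:

```python
import numpy as np

a1 = np.array([1.0, 1.0, 0.0])
a2 = np.array([1.0, -1.0, 0.0])      # orthogonal, nonzero
b = np.array([2.0, 0.0, 5.0])

lhs = sum(np.dot(b, a)**2 / np.dot(a, a) for a in (a1, a2))
assert lhs <= np.dot(b, b)                 # Bessel's inequality
assert not np.isclose(lhs, np.dot(b, b))   # strict: b not in span{a1, a2}
```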