Cauchy–Schwarz inequality

In mathematics, the Cauchy–Schwarz inequality, also known as the Cauchy–Bunyakovsky–Schwarz inequality, is a useful inequality encountered in many different settings, such as linear algebra, analysis, probability theory, vector algebra and other areas. It is considered to be one of the most important inequalities in all of mathematics.[1] It has a number of generalizations, among them Hölder's inequality.

The inequality for sums was published by Augustin-Louis Cauchy (1821), while the corresponding inequality for integrals was first proved by Viktor Bunyakovsky (1859). The modern proof of the integral inequality was given by Hermann Amandus Schwarz (1888).[1]

Statement of the inequality

The Cauchy–Schwarz inequality states that for all vectors $u$ and $v$ of an inner product space it is true that

$$|\langle u, v\rangle|^2 \leq \langle u, u\rangle \cdot \langle v, v\rangle,$$

where $\langle \cdot, \cdot\rangle$ is the inner product. Examples of inner products include the real and complex dot product; see the examples in inner product. Equivalently, by taking the square root of both sides and referring to the norms of the vectors, the inequality is written as

$$|\langle u, v\rangle| \leq \|u\|\,\|v\|.$$[2][3]

Moreover, the two sides are equal if and only if $u$ and $v$ are linearly dependent (meaning they are parallel: one of the vectors' magnitudes is zero, or one is a scalar multiple of the other).[4]:14

If the vectors $u = (u_1, \ldots, u_n)$ and $v = (v_1, \ldots, v_n)$ have complex components, the inner product is the standard complex inner product $\langle u, v\rangle = u_1\bar{v}_1 + \cdots + u_n\bar{v}_n$, where the bar notation is used for complex conjugation, and then the inequality may be restated more explicitly as

$$|u_1\bar{v}_1 + \cdots + u_n\bar{v}_n|^2 \leq \left(|u_1|^2 + \cdots + |u_n|^2\right)\left(|v_1|^2 + \cdots + |v_n|^2\right)$$

or

$$\left|\sum_{k=1}^n u_k\bar{v}_k\right|^2 \leq \sum_{k=1}^n |u_k|^2 \sum_{k=1}^n |v_k|^2.$$
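As a quick numerical illustration (a minimal sketch; NumPy and the particular vectors are arbitrary choices, not part of the theorem):

```python
import numpy as np

# Check |<u, v>| <= ||u|| ||v|| for complex vectors, and equality for
# linearly dependent vectors.
u = np.array([1 + 2j, 3 - 1j, 0.5j])
v = np.array([2 - 1j, 1 + 1j, 4.0])

inner_uv = np.vdot(v, u)   # sum_k u_k * conj(v_k), i.e. <u, v> (linear in the first argument)
assert abs(inner_uv) <= np.linalg.norm(u) * np.linalg.norm(v)

w = (2 - 3j) * u           # w is a scalar multiple of u, so equality must hold
assert np.isclose(abs(np.vdot(u, w)), np.linalg.norm(w) * np.linalg.norm(u))
```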

Proofs

First proof

Let $u$ and $v$ be arbitrary vectors in a vector space over $\mathbb{F}$ with an inner product, where $\mathbb{F}$ is the field of real or complex numbers. We prove the inequality

$$|\langle u, v\rangle| \leq \|u\|\,\|v\|,$$

and that equality holds only if either $u$ or $v$ is a multiple of the other.

If $v = 0$, it is clear that we have equality, and in this case $u$ and $v$ are also linearly dependent (regardless of $u$). We henceforth assume that $v$ is nonzero. We also assume that $\langle u, v\rangle \neq 0$, since otherwise the inequality is obviously true, because neither $\|u\|$ nor $\|v\|$ can be negative.

Let

$$z = u - \frac{\langle u, v\rangle}{\langle v, v\rangle}\, v.$$

Then, by linearity of the inner product in its first argument, one has

$$\langle z, v\rangle = \left\langle u - \frac{\langle u, v\rangle}{\langle v, v\rangle}\, v, \; v\right\rangle = \langle u, v\rangle - \frac{\langle u, v\rangle}{\langle v, v\rangle}\,\langle v, v\rangle = 0.$$

Therefore $z$ is a vector orthogonal to the vector $v$ (indeed, $z$ is the projection of $u$ onto the plane orthogonal to $v$). We can thus apply the Pythagorean theorem to

$$u = \frac{\langle u, v\rangle}{\langle v, v\rangle}\, v + z,$$

which gives

$$\|u\|^2 = \left|\frac{\langle u, v\rangle}{\langle v, v\rangle}\right|^2 \|v\|^2 + \|z\|^2 = \frac{|\langle u, v\rangle|^2}{\|v\|^2} + \|z\|^2 \geq \frac{|\langle u, v\rangle|^2}{\|v\|^2}$$

and, after multiplication by $\|v\|^2$ and taking square roots, the Cauchy–Schwarz inequality. Moreover, if the relation $\geq$ in the above expression is actually an equality, then $\|z\|^2 = 0$ and hence $z = 0$; the definition of $z$ then establishes a relation of linear dependence between $u$ and $v$. This establishes the theorem.
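The projection step above can be mirrored numerically; the following is an illustrative sketch (the vectors are arbitrary):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 0.5, 2.0])

# z is the component of u orthogonal to v, as in the proof
z = u - (np.dot(u, v) / np.dot(v, v)) * v
assert np.isclose(np.dot(z, v), 0.0)

# Pythagorean identity: ||u||^2 = <u,v>^2 / ||v||^2 + ||z||^2 >= <u,v>^2 / ||v||^2
assert np.isclose(np.dot(u, u), np.dot(u, v) ** 2 / np.dot(v, v) + np.dot(z, z))
assert np.dot(u, v) ** 2 <= np.dot(u, u) * np.dot(v, v)
```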

Second proof

Let $u$ and $v$ be arbitrary vectors in a vector space over $\mathbb{F}$ with an inner product, where $\mathbb{F}$ is the field of real or complex numbers.

If $v = 0$, the theorem holds trivially. Now assume $v \neq 0$ and $\langle u, v\rangle \neq 0$. Let $\lambda \in \mathbb{F}$ be given by $\lambda = \langle u, v\rangle / \langle v, v\rangle$; then

$$0 \leq \|u - \lambda v\|^2 = \langle u, u\rangle - \bar{\lambda}\langle u, v\rangle - \lambda\langle v, u\rangle + |\lambda|^2\langle v, v\rangle = \langle u, u\rangle - \frac{|\langle u, v\rangle|^2}{\langle v, v\rangle}.$$

Therefore $|\langle u, v\rangle|^2 \leq \langle u, u\rangle\,\langle v, v\rangle$, or

$$|\langle u, v\rangle| \leq \|u\|\,\|v\|.$$

More proofs

There are indeed many different proofs[5] of the Cauchy–Schwarz inequality other than the above two examples.[1][3] When consulting other sources, there are often two sources of confusion. First, some authors define $\langle \cdot, \cdot\rangle$ to be linear in the second argument rather than the first. Second, some proofs are only valid when the field is $\mathbb{R}$ and not $\mathbb{C}$.[6]

Special cases

R2 (ordinary two-dimensional space)

In the usual 2-dimensional space $\mathbb{R}^2$ with the dot product, let $u = (u_1, u_2)$ and $v = (v_1, v_2)$. The Cauchy–Schwarz inequality is that

$$\langle u, v\rangle^2 = (\|u\|\,\|v\|\cos\theta)^2 \leq \|u\|^2\,\|v\|^2,$$

where $\theta$ is the angle between $u$ and $v$.

The form above is perhaps the easiest in which to understand the inequality, since the square of the cosine can be at most 1, which occurs when the vectors are in the same or opposite directions. It can also be restated in terms of the vector coordinates $u_1$, $u_2$, $v_1$, and $v_2$ as

$$(u_1 v_1 + u_2 v_2)^2 \leq (u_1^2 + u_2^2)(v_1^2 + v_2^2),$$

where equality holds if and only if the vector $(u_1, u_2)$ is in the same or opposite direction as the vector $(v_1, v_2)$, or if one of them is the zero vector.
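For instance, with $u = (1, 2)$ and $v = (3, 4)$ the left-hand side is $(1\cdot 3 + 2\cdot 4)^2 = 11^2 = 121$, while the right-hand side is $(1^2 + 2^2)(3^2 + 4^2) = 5 \cdot 25 = 125$, so indeed $121 \leq 125$; equality fails because the two vectors are not parallel.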

Rn (n-dimensional Euclidean space)

In Euclidean space $\mathbb{R}^n$ with the standard inner product, the Cauchy–Schwarz inequality is

$$\left(\sum_{i=1}^n x_i y_i\right)^2 \leq \left(\sum_{i=1}^n x_i^2\right)\left(\sum_{i=1}^n y_i^2\right).$$

The Cauchy–Schwarz inequality can be proved using only ideas from elementary algebra in this case. Consider the following quadratic polynomial in $z$:

$$p(z) = (x_1 z + y_1)^2 + \cdots + (x_n z + y_n)^2 = \left(\sum_{i=1}^n x_i^2\right) z^2 + 2\left(\sum_{i=1}^n x_i y_i\right) z + \sum_{i=1}^n y_i^2.$$

Since $p(z) \geq 0$ for every real $z$, the polynomial has at most one real root, hence its discriminant is less than or equal to zero. That is,

$$\left(\sum_{i=1}^n x_i y_i\right)^2 - \left(\sum_{i=1}^n x_i^2\right)\left(\sum_{i=1}^n y_i^2\right) \leq 0,$$

which yields the Cauchy–Schwarz inequality.
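The discriminant argument can be replayed numerically; a minimal sketch (the random vectors are chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x, y = rng.standard_normal(5), rng.standard_normal(5)

# p(z) = ||z*x + y||^2 = a*z^2 + b*z + c is nonnegative for all real z
a = np.dot(x, x)
b = 2.0 * np.dot(x, y)
c = np.dot(y, y)

discriminant = b ** 2 - 4.0 * a * c   # equals 4[(x.y)^2 - (x.x)(y.y)]
assert discriminant <= 0.0            # equivalent to the Cauchy-Schwarz inequality
```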

L2

For the inner product space of square-integrable complex-valued functions, one has

$$\left|\int_{\mathbb{R}^n} f(x)\,\overline{g(x)}\,dx\right|^2 \leq \int_{\mathbb{R}^n} |f(x)|^2\,dx \cdot \int_{\mathbb{R}^n} |g(x)|^2\,dx.$$

A generalization of this is the Hölder inequality.
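The integral form can be checked numerically on a finite interval; a minimal sketch, with an arbitrary pair of square-integrable functions on [0, 1] and a plain Riemann sum:

```python
import numpy as np

x = np.linspace(0.0, 1.0, 10_001)
dx = x[1] - x[0]
f = np.exp(1j * 2 * np.pi * x)   # example complex-valued function
g = x ** 2                       # example real-valued function

lhs = abs(np.sum(f * np.conj(g)) * dx) ** 2
rhs = (np.sum(np.abs(f) ** 2) * dx) * (np.sum(np.abs(g) ** 2) * dx)
assert lhs <= rhs + 1e-12        # tolerance only for floating-point round-off
```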

Applications

Analysis

The triangle inequality for the standard norm is often shown as a consequence of the Cauchy–Schwarz inequality, as follows. Given vectors x and y,

$$\|x + y\|^2 = \langle x + y, x + y\rangle = \|x\|^2 + \langle x, y\rangle + \langle y, x\rangle + \|y\|^2 \leq \|x\|^2 + 2|\langle x, y\rangle| + \|y\|^2 \leq \|x\|^2 + 2\|x\|\,\|y\| + \|y\|^2 = (\|x\| + \|y\|)^2.$$

Taking square roots gives the triangle inequality.

The Cauchy–Schwarz inequality is used to prove that the inner product is a continuous function with respect to the topology induced by the inner product itself.[7][8]
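The underlying estimate is short (a sketch): for vectors $x, x', y, y'$,

$$|\langle x, y\rangle - \langle x', y'\rangle| = |\langle x - x', y\rangle + \langle x', y - y'\rangle| \leq \|x - x'\|\,\|y\| + \|x'\|\,\|y - y'\|,$$

so if $x' \to x$ and $y' \to y$ in norm, the right-hand side tends to zero and hence $\langle x', y'\rangle \to \langle x, y\rangle$.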

Geometry

The Cauchy–Schwarz inequality allows one to extend the notion of "angle between two vectors" to any real inner product space, by defining:

$$\cos\theta_{xy} = \frac{\langle x, y\rangle}{\|x\|\,\|y\|}.$$[9][10]

The Cauchy–Schwarz inequality proves that this definition is sensible, by showing that the right-hand side lies in the interval [−1, 1], and justifies the notion that (real) Hilbert spaces are simply generalizations of the Euclidean space. It can also be used to define an angle in complex inner product spaces, by taking the absolute value or the real part of the right-hand side,[11][12] as is done when extracting a metric from quantum fidelity.
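In code, the guarantee that the cosine lies in [−1, 1] is exactly what makes the arccosine well defined; a minimal sketch (the helper name is illustrative):

```python
import numpy as np

def angle_between(x, y):
    """Angle in radians between two nonzero real vectors."""
    cos_theta = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
    # Cauchy-Schwarz guarantees cos_theta lies in [-1, 1]; clip only guards
    # against tiny floating-point overshoot.
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

print(angle_between(np.array([1.0, 0.0]), np.array([1.0, 1.0])))  # about pi/4
```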

Probability theory

Let X, Y be random variables. Then the covariance inequality[13][14] is given by

$$\operatorname{Var}(Y) \geq \frac{\operatorname{Cov}(Y, X)^2}{\operatorname{Var}(X)}.$$

After defining an inner product on the set of random variables using the expectation of their product,

$$\langle X, Y\rangle := \operatorname{E}(XY),$$

the Cauchy–Schwarz inequality becomes

$$|\operatorname{E}(XY)|^2 \leq \operatorname{E}(X^2)\,\operatorname{E}(Y^2).$$

To prove the covariance inequality using the Cauchy–Schwarz inequality, let $\mu = \operatorname{E}(X)$ and $\nu = \operatorname{E}(Y)$; then

$$|\operatorname{Cov}(X, Y)|^2 = |\operatorname{E}((X - \mu)(Y - \nu))|^2 = |\langle X - \mu, Y - \nu\rangle|^2 \leq \langle X - \mu, X - \mu\rangle\,\langle Y - \nu, Y - \nu\rangle = \operatorname{E}((X - \mu)^2)\,\operatorname{E}((Y - \nu)^2) = \operatorname{Var}(X)\,\operatorname{Var}(Y),$$

where Var denotes variance and Cov denotes covariance.
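A minimal Monte Carlo sketch of the covariance inequality (sample sizes and distributions are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.standard_normal(100_000)
y = 0.5 * x + rng.standard_normal(100_000)   # Y partially depends on X

cov_xy = np.cov(x, y)[0, 1]
var_x = np.var(x, ddof=1)
var_y = np.var(y, ddof=1)

assert cov_xy ** 2 <= var_x * var_y          # Cov(X,Y)^2 <= Var(X) Var(Y)
assert var_y >= cov_xy ** 2 / var_x          # the covariance inequality above
```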

Generalizations

Various generalizations of the Cauchy–Schwarz inequality exist in the context of operator theory, e.g. for operator-convex functions, and operator algebras, where the domain and/or range are replaced by a C*-algebra or W*-algebra.

An inner product can be used to define a positive linear functional. For example, given a Hilbert space $L^2(m)$, with $m$ a finite measure, the standard inner product gives rise to a positive functional $\varphi$ by $\varphi(g) = \langle g, 1\rangle$. Conversely, every positive linear functional $\varphi$ on $L^2(m)$ can be used to define an inner product $\langle f, g\rangle_\varphi := \varphi(g^* f)$, where $g^*$ is the pointwise complex conjugate of $g$. In this language, the Cauchy–Schwarz inequality becomes

$$|\varphi(g^* f)|^2 \leq \varphi(f^* f)\,\varphi(g^* g),$$[15]

which extends verbatim to positive functionals on C*-algebras:

Theorem (Cauchy–Schwarz inequality for positive functionals on C*-algebras)[16][17] If $\varphi$ is a positive linear functional on a C*-algebra $A$, then for all $a, b \in A$, $|\varphi(b^* a)|^2 \leq \varphi(b^* b)\,\varphi(a^* a)$.
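As a concrete illustration (a sketch only: the trace on the C*-algebra of n×n complex matrices is one example of a positive linear functional):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
a = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
b = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# With phi = trace, the theorem reads |tr(b* a)|^2 <= tr(b* b) * tr(a* a).
lhs = abs(np.trace(b.conj().T @ a)) ** 2
rhs = np.trace(b.conj().T @ b).real * np.trace(a.conj().T @ a).real
assert lhs <= rhs
```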

The next two theorems are further examples in operator algebra.

Theorem (Kadison–Schwarz inequality,[18][19] named after Richard Kadison) If $\varphi$ is a unital positive map, then for every normal element $a$ in its domain, we have $\varphi(a^* a) \geq \varphi(a^*)\,\varphi(a)$ and $\varphi(a^* a) \geq \varphi(a)\,\varphi(a^*)$.

This extends the fact $\varphi(a^* a) \cdot 1 \geq \varphi(a)^* \varphi(a) = |\varphi(a)|^2$ when $\varphi$ is a linear functional. The case when $a$ is self-adjoint, i.e. $a = a^*$, is sometimes known as Kadison's inequality.

Theorem (Modified Schwarz inequality for 2-positive maps).[20] For a 2-positive map $\varphi$ between C*-algebras, for all $a, b$ in its domain,

$$\varphi(a)^*\varphi(a) \leq \|\varphi(1)\|\,\varphi(a^* a), \qquad \|\varphi(a^* b)\|^2 \leq \|\varphi(a^* a)\|\,\|\varphi(b^* b)\|.$$

Notes

  1. The Cauchy–Schwarz Master Class: an Introduction to the Art of Mathematical Inequalities, Ch. 1 by J. Michael Steele.
  2. Strang, Gilbert (19 July 2005). "3.2". Linear Algebra and its Applications (4th ed.). Stamford, CT: Cengage Learning. pp. 154–155. ISBN 978-0030105678.
  3. Hunter, John K.; Nachtergaele, Bruno (2001-01-01). Applied Analysis. World Scientific. ISBN 9789810241919.
  4. Bachmann, George; Narici, Lawrence; Beckenstein, Edward (2012-12-06). Fourier and Wavelet Analysis. Springer Science & Business Media. ISBN 9781461205050.
  5. Wu, Hui-Hua; Wu, Shanhe (April 2009). "Various proofs of the Cauchy-Schwarz inequality" (PDF). OCTOGON MATHEMATICAL MAGAZINE. 17 (1): 221–229. ISBN 978-973-88255-5-0. ISSN 1222-5657. Retrieved 18 May 2016.
  6. Aliprantis, Charalambos D.; Border, Kim C. (2007-05-02). Infinite Dimensional Analysis: A Hitchhiker's Guide. Springer Science & Business Media. ISBN 9783540326960.
  7. Bachman, George; Narici, Lawrence (2012-09-26). Functional Analysis. Courier Corporation. p. 141. ISBN 9780486136554.
  8. Swartz, Charles (1994-02-21). Measure, Integration and Function Spaces. World Scientific. p. 236. ISBN 9789814502511.
  9. Ricardo, Henry (2009-10-21). A Modern Introduction to Linear Algebra. CRC Press. p. 18. ISBN 9781439894613.
  10. Banerjee, Sudipto; Roy, Anindya (2014-06-06). Linear Algebra and Matrix Analysis for Statistics. CRC Press. p. 181. ISBN 9781482248241.
  11. Valenza, Robert J. (2012-12-06). Linear Algebra: An Introduction to Abstract Mathematics. Springer Science & Business Media. p. 146. ISBN 9781461209010.
  12. Constantin, Adrian (2016-05-21). Fourier Analysis with Applications. Cambridge University Press. p. 74. ISBN 9781107044104.
  13. Mukhopadhyay, Nitis (2000-03-22). Probability and Statistical Inference. CRC Press. p. 150. ISBN 9780824703790.
  14. Keener, Robert W. (2010-09-08). Theoretical Statistics: Topics for a Core Course. Springer Science & Business Media. p. 71. ISBN 9780387938394.
  15. Faria, Edson de; Melo, Welington de (2010-08-12). Mathematical Aspects of Quantum Field Theory. Cambridge University Press. p. 273. ISBN 9781139489805.
  16. Lin, Huaxin (2001-01-01). An Introduction to the Classification of Amenable C*-algebras. World Scientific. p. 27. ISBN 9789812799883.
  17. Arveson, W. (2012-12-06). An Invitation to C*-Algebras. Springer Science & Business Media. p. 28. ISBN 9781461263715.
  18. Størmer, Erling (2012-12-13). Positive Linear Maps of Operator Algebras. Springer Science & Business Media. ISBN 9783642343698.
  19. Kadison, Richard V. (1952-01-01). "A Generalized Schwarz Inequality and Algebraic Invariants for Operator Algebras". Annals of Mathematics. 56 (3): 494–503. doi:10.2307/1969657. JSTOR 1969657.
  20. Paulsen, Vern (2002). Completely Bounded Maps and Operator Algebras. p. 40. ISBN 9780521816694.
