January 26, 2025
If A and B are sets, the set exponential B^A = \{f\colon A\to B\} is the set of all functions from A to B.
Exercise. Show |A^B| = |A|^{|B|}, where |\cdot| denotes the number of elements of a set.
Every vector space is, up to isomorphism, of the form \bm{R}^I where \bm{R} is the real numbers and I is a set. Scalar multiplication and vector addition are defined pointwise: for a\in\bm{R} and x,y\in\bm{R}^I define {(ax)(i) = a(x(i))} and {(x + y)(i) = x(i) + y(i)}. We write 0_I for the function in \bm{R}^I with 0_I(i) = 0 for all i\in I.
If I = \{1,\ldots,n\} then we can identify x\in\bm{R}^I with (x_1, \ldots, x_n) where x(i) = x_i, i\in I.
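For finite I the pointwise operations can be sketched in code. This is an illustrative representation (not from the text): vectors in \bm{R}^I with I = \{0,\ldots,n-1\} are Python lists.

```python
# Sketch: R^I with I = {0, ..., n-1}, vectors represented as lists.
# These implement (ax)(i) = a*x(i) and (x + y)(i) = x(i) + y(i).

def scale(a, x):
    return [a * xi for xi in x]

def add(x, y):
    assert len(x) == len(y)  # both live in the same R^I
    return [xi + yi for xi, yi in zip(x, y)]

def zero(n):
    return [0.0] * n  # the function 0_I

x = [1.0, 2.0, 3.0]
y = [4.0, 5.0, 6.0]
ax_plus_y = add(scale(2.0, x), y)  # 2x + y, computed pointwise
```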
Exercise. For a,b\in\bm{R}, x,y,z\in\bm{R}^I show \begin{aligned} x + (y + z) &= (x + y) + z \\ x + y &= y + x \\ x + 0_I &= x \\ x + (-x) &= 0_I \\ a(bx) &= (ab)x \\ 1x &= x \\ a(x + y) &= ax + ay \\ (a + b)x &= ax + bx \end{aligned} where -x = (-1)x.
Any set V with a scalar multiplication and vector addition satisfying these properties is a vector space. It is true that every vector space is isomorphic to \bm{R}^I for some set I, but you will have to read the proof of this elsewhere.
The abstract definition of a vector space is more difficult to work with than the special case \bm{R}^I, so we work with \bm{R}^I throughout.
If I is finite define the dot product x\cdot y = \sum_{i\in I} x_i y_i, x,y\in\bm{R}^I.
Exercise. Show (ax + y)\cdot z = a(x\cdot z) + y\cdot z, a\in\bm{R}, x,y,z\in\bm{R}^I.
This shows z^*(x) = x\cdot z is a linear map from \bm{R}^I to \bm{R}, z^*(ax + y) = az^*(x) + z^*(y).
Define the (Euclidean) norm \|x\| by \|x\|^2 = x\cdot x for x\in\bm{R}^I.
Exercise. (Cauchy-Schwarz) Show |x\cdot y|\le\|x\|\|y\|, x,y\in\bm{R}^I.
Hint: Use 0\le\|tx + y\|^2 = t^2\|x\|^2 + 2tx\cdot y + \|y\|^2. A quadratic in t that is non-negative for all t has at most one real root, so its discriminant 4(x\cdot y)^2 - 4\|x\|^2\|y\|^2 is non-positive.
Exercise. Show \|ax\| = |a|\|x\| and \|x + y\|\le\|x\| + \|y\|, a\in\bm{R}, x,y\in\bm{R}^I.
Hint: Use 2x\cdot y\le2\|x\|\|y\|.
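A numeric sanity check (not a proof) of the two exercises above, testing Cauchy-Schwarz and the triangle inequality on random vectors:

```python
# Check |x.y| <= ||x|| ||y|| and ||x + y|| <= ||x|| + ||y||
# on random vectors in R^5; a small tolerance absorbs float rounding.
import math
import random

def dot(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

def norm(x):
    return math.sqrt(dot(x, x))

random.seed(0)
ok = True
for _ in range(1000):
    x = [random.uniform(-1, 1) for _ in range(5)]
    y = [random.uniform(-1, 1) for _ in range(5)]
    cs = abs(dot(x, y)) <= norm(x) * norm(y) + 1e-12
    tri = norm([a + b for a, b in zip(x, y)]) <= norm(x) + norm(y) + 1e-12
    ok = ok and cs and tri
```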
Define the p-norm \|x\|_p by \|x\|_p^p = \sum_{i\in I}|x_i|^p for 1 \le p < \infty.
Exercise. Show \lim_{p\to\infty}\|x\|_p = \sup_{i\in I}|x_i| = \|x\|_\infty.
Hint: For x\not=0_I consider \sum_{i\in I} |x_i/\|x\|_\infty|^p and use t^p\to 0 on [0,1) as p\to\infty.
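The limit in the exercise can be observed numerically; this sketch (illustrative, with a fixed vector) shows \|x\|_p decreasing toward \|x\|_\infty as p grows.

```python
# ||x||_p = (sum |x_i|^p)^(1/p) approaches ||x||_inf = max |x_i|.

def p_norm(x, p):
    return sum(abs(xi) ** p for xi in x) ** (1.0 / p)

x = [1.0, -3.0, 2.0]
sup_norm = max(abs(xi) for xi in x)  # ||x||_inf = 3
# gaps ||x||_p - ||x||_inf for increasing p; they shrink toward 0
gaps = [p_norm(x, p) - sup_norm for p in (1, 2, 10, 100)]
```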
For i\in I define e_i\in\bm{R}^I by e_i(j) = 1 if i = j and e_i(j) = 0 if i\not=j, j\in I. This can also be written e_i(j) = \delta_{ij} using the Kronecker delta.
Exercise. Show for x\in\bm{R}^I that x = \sum_{i\in I} x(i)e_i.
Hint. Compute x(j) for j\in I.
If \bm{R}^I and \bm{R}^J are vector spaces a linear transformation is a function T\colon \bm{R}^I\to \bm{R}^J with {T(ax + y) = aTx + Ty} for {a\in\bm{R}} and {x,y\in\bm{R}^I}.
Exercise. Show T(ax) = aTx and T(x + y) = Tx + Ty, a\in\bm{R}, x,y\in\bm{R}^I.
Hint. Take a = 1 for the sum; for the scalar multiple take y = 0_I and use the following exercise.
Exercise. Show T0_I = 0_J if T\colon\bm{R}^I\to\bm{R}^J is a linear transformation.
Hint: Use T(0_I + 0_I) = T0_I + T0_I and that x + x = x implies x = 0_J.
For e_i\in\bm{R}^I we have Te_i\in\bm{R}^J so there are t_{ij}\in\bm{R} with Te_i = \sum_{j\in J} t_{ij}e_j. We call [t_{ij}]_{i\in I, j\in J} the matrix of T.
If S\colon\bm{R}^J\to\bm{R}^K is a linear transformation then ST\colon\bm{R}^I\to\bm{R}^K is the composition (ST)x = S(Tx)\in\bm{R}^K, x\in\bm{R}^I.
Exercise. Show ST is linear.
Hint: (ST)(a x + y) = S(T(ax + y)) = \cdots
Exercise. Show the i,k entry of the matrix of ST is \sum_{j\in J}t_{ij}s_{jk}, i\in I, k\in K.
This shows matrix multiplication is composition of linear transformations.
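The identification of composition with matrix multiplication can be checked numerically. This sketch uses the convention Te_i = \sum_j t_{ij} e_j from above, so a vector acts as a row vector and (Tx)_j = \sum_i x_i t_{ij}; the matrices and vector are illustrative.

```python
# Verify that applying the product matrix (ts)_{ik} = sum_j t_{ij} s_{jk}
# agrees with composing the two transformations.

def apply(mat, x):
    # x: length-|I| list; mat: |I| x |J| rows; returns Tx in R^J
    cols = len(mat[0])
    return [sum(x[i] * mat[i][j] for i in range(len(mat)))
            for j in range(cols)]

def matmul(t, s):
    # matrix of ST: (ts)_{ik} = sum_j t_{ij} s_{jk}
    return [[sum(t_row[j] * s[j][k] for j in range(len(s)))
             for k in range(len(s[0]))] for t_row in t]

t = [[1, 2], [3, 4], [5, 6]]   # T: R^3 -> R^2
s = [[1, 0, 1], [2, 1, 0]]     # S: R^2 -> R^3
x = [1.0, -1.0, 2.0]
lhs = apply(matmul(t, s), x)   # (ST)x via the product matrix
rhs = apply(s, apply(t, x))    # S(Tx) by composition
```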
The set of all linear transformations from V to W is denoted \mathcal{L}(V,W). It is also a vector space with scalar multiplication (aT)x = a(Tx) and vector addition (S + T)x = Sx + Tx. When W = V, composition is an associative product making \mathcal{L}(V,V) a non-commutative algebra.
The dual of a vector space V is V^* = \mathcal{L}(V,\bm{R}). If {\xi^*\colon\bm{R}^I\to\bm{R}} is a linear functional define {\xi\in\bm{R}^I} by {\xi_i = \xi^*(e_i)}, i\in I. This provides a map *\colon(\bm{R}^I)^*\to\bm{R}^I.
Exercise. Show \xi^*(x) = \xi\cdot x for \xi^*\in(\bm{R}^I)^*, x\in\bm{R}^I.
Exercise. Show this map is linear, one-to-one, and onto.
The dual of a finite-dimensional vector space is isomorphic to the space itself via the standard basis (e_i)_{i\in I}.
If T\in\mathcal{L}(V,W) define the adjoint T^*\in\mathcal{L}(W^*,V^*) by (T^*w^*)v = w^*(Tv) for w^*\in W^*, v\in V.
Exercise. Show (ST)^* = T^*S^*.
For any set S let {B(S) = \{f\colon S\to\bm{R}\mid \sup_{s\in S}|f(s)| = \|f\| < \infty\}} be the normed vector space of bounded functions on S. Its dual is the set of all bounded linear functionals on B(S). If L\colon B(S)\to\bm{R} is a bounded linear functional define \lambda(E) = L(1_E) for E\subseteq S, where 1_E(s) = 1 if s\in E and 1_E(s) = 0 if s\not\in E.
Exercise. Show \lambda(E\cup F) = \lambda(E) + \lambda(F) - \lambda(E\cap F) and \lambda(\emptyset) = 0 for E,F\subseteq S.
Hint: Use 1_{E\cup F} = 1_E + 1_F - 1_{E\cap F} and 1_\emptyset = 0.
This shows every bounded linear functional on B(S) gives rise to a finitely additive measure on S. The normed vector space of finitely additive measures is denoted ba(S).
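A finite sketch of this correspondence: on a finite S, any point weights w (illustrative, not from the text) give a bounded linear functional L(f) = \sum_s f(s)w(s) on B(S), and the induced set function \lambda(E) = L(1_E) satisfies inclusion-exclusion.

```python
# lambda(E) = L(1_E) for L(f) = sum_s f(s) w(s) on a finite S;
# check lambda(E u F) = lambda(E) + lambda(F) - lambda(E n F).
S = ['a', 'b', 'c', 'd']
w = {'a': 1.0, 'b': -2.0, 'c': 0.5, 'd': 3.0}  # illustrative weights

def L(f):
    return sum(f(s) * w[s] for s in S)

def indicator(E):
    return lambda s: 1.0 if s in E else 0.0

def lam(E):
    return L(indicator(E))

E, F = {'a', 'b'}, {'b', 'c'}
lhs = lam(E | F)
rhs = lam(E) + lam(F) - lam(E & F)
```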
A simple function f = \sum_j a_j 1_{A_j} is a finite sum with a_j\in\bm{R} and A_j\subseteq S. Define \int_S f\,d\lambda = \sum_j a_j\lambda(A_j).
Exercise. If \{A_j\} are disjoint show \sum_j a_j 1_{A_j} = 0 implies a_j = 0 for all j.
Hint: Every s\in S belongs to at most one A_j.
Exercise. If \sum_k b_k B_k is a finite sum with b_k\in\bm{R} and B_k\subseteq S show there exist a_j\in\bm{R} and disjoint A_j\subseteq S with \sum_j a_j 1_{A_j} = \sum_k b_k 1_{B_k}.
Hint: Start with b_1 1_{B_1} + b_2 1_{B_2} = b_1 1_{B_1\setminus B_2} + (b_1 + b_2)1_{B_1\cap B_2} + b_2 1_{B_2\setminus B_1} where A\setminus B = \{a\in A\mid a\not\in B\} is set difference and use induction.
Exercise. If \sum_k b_k 1_{B_k} = 0 is a finite sum with b_k\in\bm{R} and B_k\subseteq S show \sum_k b_k\lambda(B_k) = 0.
These exercises show the integral is well-defined on simple functions.
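Well-definedness can be illustrated numerically: two representations of the same simple function, one with overlapping sets and its disjointification from the hint above, yield the same integral. The weights defining \lambda here are illustrative assumptions.

```python
# Integral of a simple function: sum_j a_j * lambda(A_j), with
# lambda built from illustrative point weights on a finite S.
S = [0, 1, 2, 3]
w = {0: 1.0, 1: 2.0, 2: -1.0, 3: 0.5}

def lam(E):
    return sum(w[s] for s in E)

def integral(terms):
    # terms: list of (coefficient, subset-of-S) pairs
    return sum(a * lam(A) for a, A in terms)

# b1*1_{B1} + b2*1_{B2} with overlapping B1 = {0,1}, B2 = {1,2} ...
overlapping = [(2.0, {0, 1}), (3.0, {1, 2})]
# ... equals this disjoint representation (hint: split off B1 n B2):
disjoint = [(2.0, {0}), (5.0, {1}), (3.0, {2})]
```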
If F\colon V\to W is a function between two normed linear spaces the Fréchet derivative DF\colon V\to\mathcal{B}(V,W) at x\in V is a bounded linear operator approximating F near x: F(x + h) = F(x) + DF(x)h + o(\|h\|), where \mathcal{B}(V,W) denotes the bounded linear operators from V to W.
Exercise. Show the Fréchet derivative of \|x\|^2 is 2x^*.
Hint. \|x\|^2 = x^*x.
Exercise. Show the Fréchet derivative of \|x\|^p is p\|x\|^{p-2}x^*.
Hint. \|x\|^p = \exp((p/2)\log\|x\|^2).
Exercise. Show the Fréchet derivative of x\mapsto x^*Tx is x^*T + (Tx)^*.
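A finite-difference sketch on \bm{R}^n checking the derivatives D(\|x\|^2)h = 2x\cdot h and D(x\cdot Tx)h = x\cdot(Th) + (Tx)\cdot h. The vectors and matrix are illustrative, and T acts here in the usual column convention (Tx)_i = \sum_j T_{ij}x_j.

```python
# Compare (F(x + eps*h) - F(x))/eps against the claimed derivative
# applied to h, for F(x) = ||x||^2 and F(x) = x.Tx.

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def apply_T(T, v):  # (Tv)_i = sum_j T[i][j] v[j]
    return [dot(row, v) for row in T]

x = [1.0, 2.0, -1.0]
h = [0.3, -0.2, 0.1]
T = [[1.0, 2.0, 0.0], [0.0, 1.0, 1.0], [3.0, 0.0, 1.0]]
eps = 1e-6

def norm2(v):
    return dot(v, v)

def quad(v):
    return dot(v, apply_T(T, v))

xe = [xi + eps * hi for xi, hi in zip(x, h)]
fd_norm2 = (norm2(xe) - norm2(x)) / eps   # finite-difference slope
fd_quad = (quad(xe) - quad(x)) / eps
exact_norm2 = 2 * dot(x, h)                        # 2 x.h
exact_quad = dot(x, apply_T(T, h)) + dot(apply_T(T, x), h)  # x.Th + Tx.h
```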