# Calculus of Variations

Calculus of variations is the study of extrema of functionals. This page provides a brief introduction to some elements of the theory.

# Variation and differentiability

Consider a functional $\ J:V \to \mathbb R$, where $\ V$ is a normed linear space. (The functional itself need not be linear.)

We define the increment of $\ J$ at $y \in V$ as

$\ \Delta J[h] = J[y + h] - J[y]$ for all $h \in V$

We say that $\ J$ is differentiable at $\ y$ if there exist a linear functional $\ \delta J:V \to \mathbb R$ and a quantity $\ \epsilon$ (depending on $\ h$) such that

$\Delta J[h] = \delta J[h] + \epsilon \|h\|$

and $\ \epsilon \to 0$ as $\|h\| \to 0$.

Here, $\ \delta J$ is called the variation or differential of $\ J$ at $\ y$.
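As a concrete numerical sketch (my own example, not from the text), take $\ V = C^0[0,1]$ with the sup norm and $\ J[y] = \int_0^1 y(x)^2 \, dx$. Expanding the increment gives $\ \Delta J[h] = 2\int y h \, dx + \int h^2 \, dx$, so $\ \delta J[h] = 2\int y h \, dx$ and the remainder divided by $\|h\|$ tends to zero:

```python
import numpy as np

# Illustrative example (not from the text): V = C^0[0,1] with the sup
# norm and J[y] = ∫_0^1 y(x)^2 dx.  Expanding the increment,
#   ΔJ[h] = J[y+h] - J[y] = 2∫ y h dx + ∫ h^2 dx,
# so the linear part is δJ[h] = 2∫ y h dx, and the remainder satisfies
#   |ΔJ[h] - δJ[h]| = ∫ h^2 dx ≤ ||h||^2, i.e. ε = O(||h||) → 0.

x = np.linspace(0.0, 1.0, 1001)
dx = x[1] - x[0]

def integral(f):
    """Trapezoid-rule approximation of the integral of f over [0, 1]."""
    return np.sum((f[1:] + f[:-1]) * 0.5 * dx)

def J(y):
    return integral(y**2)

y = np.sin(np.pi * x)            # base point y
h0 = 0.5 * x * (1.0 - x)         # a fixed perturbation direction

for t in [1.0, 0.1, 0.01, 0.001]:
    h = t * h0
    delta_J = J(y + h) - J(y)            # increment ΔJ[h]
    var_J = 2.0 * integral(y * h)        # variation δJ[h]
    norm_h = np.max(np.abs(h))           # ||h|| (sup norm)
    eps = (delta_J - var_J) / norm_h     # shrinks with ||h||
    print(f"||h|| = {norm_h:.2e},  eps = {eps:.2e}")
```

The printed `eps` shrinks proportionally to $\|h\|$, consistent with $\ \epsilon \to 0$ as $\|h\| \to 0$.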

# Uniqueness of variation

Lemma: Let $\ \phi:V \to \mathbb R$ denote a linear functional, where $\ V$ is defined as above.

If $\ {\phi[h] \over \|h\|} \to 0$ as $\|h\| \to 0$, then $\ \phi$ is identically zero, that is, $\ \phi[h] = 0$ for all $\ h \in V$.

Proof: Suppose $\phi[h_0] \neq 0$ for some $h_0 \in V$ (so $\ h_0 \neq 0$). Then, for $\ n = 1, 2, 3, ...$

${\phi[\frac {h_0}n] \over \|\frac {h_0}n\|} = {\frac 1n \phi[h_0] \over \frac 1n \|h_0\|} = {\phi[h_0] \over \|h_0\|} \neq 0$

and hence the ratio does not approach zero as $\|\frac {h_0}n\| \to 0$, a contradiction. QED
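The scaling step of the proof can be checked numerically. In this sketch (my own example), $\ \phi[h] = \int_0^1 h \, dx$ is a linear functional, and the ratio $\ \phi[h]/\|h\|$ is unchanged along the sequence $\ h_0/n$:

```python
import numpy as np

# Numerical check of the scaling argument (illustrative example): for the
# linear functional φ[h] = ∫_0^1 h(x) dx and a fixed h0 with φ[h0] ≠ 0,
# the ratio φ[h0/n] / ||h0/n|| equals φ[h0] / ||h0|| for every n, so it
# cannot tend to 0 along the sequence h0/n.

x = np.linspace(0.0, 1.0, 1001)
dx = x[1] - x[0]

def phi(h):
    """φ[h] = ∫_0^1 h(x) dx, approximated by the trapezoid rule."""
    return np.sum((h[1:] + h[:-1]) * 0.5 * dx)

h0 = 1.0 + 0.3 * np.sin(2.0 * np.pi * x)   # φ[h0] ≈ 1 ≠ 0
base = phi(h0) / np.max(np.abs(h0))

for n in [1, 10, 100, 1000]:
    h = h0 / n
    ratio = phi(h) / np.max(np.abs(h))
    print(n, ratio)                        # same value for every n
```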

Theorem: If $\ J$ defined as above is differentiable at $\ y$, then the variation $\ \delta J$ is unique.

Proof: Suppose, if possible, that there exist two distinct variations $\ \delta J_1$ and $\ \delta J_2$. Then,

$\ \Delta J[h] = \delta J_1[h] + \epsilon_1 \|h\|$
$\ \Delta J[h] = \delta J_2[h] + \epsilon_2 \|h\|$

Comparing the two equations, we get

$\ (\delta J_1 - \delta J_2)[h] = (- \epsilon_1 + \epsilon_2)\|h\|$

or

${(\delta J_1 - \delta J_2)[h] \over \|h\|} = - \epsilon_1 + \epsilon_2 \to 0$ as $\ \|h\| \to 0$.

Hence, by the lemma (noting that $\ \delta J_1 - \delta J_2$ is a linear functional), $\ \delta J_1 - \delta J_2 = 0$, that is, $\ \delta J_1 = \delta J_2$, contradicting the assumption that they are distinct. QED

# Function space norm

Consider the space $\ C^0[a,b]$ consisting of all functions $\ y:[a,b] \to \mathbb R$ continuous on $\ [a,b]$. We define the norm as

$\|y\|_0 = \max_{x \in [a,b]}|y(x)|$

More generally, the space $\ C^n[a,b]$ consists of all functions $\ y:[a,b] \to \mathbb R$ that are $\ n$ times continuously differentiable on $\ [a,b]$. In this case, we define the norm as

$\|y\|_n = \sum_{i=0}^n{\max_{x \in [a,b]}|y^{(i)}(x)|}$

It can be shown that $\|y\|_n$ satisfies the conditions for a norm.
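A short numerical sketch of these norms (my own example) for $\ y(x) = \sin x$ on $\ [0, \pi]$, with $\ y'$ approximated by finite differences:

```python
import numpy as np

# Compute ||y||_0 and ||y||_1 for y(x) = sin(x) on [a, b] = [0, π]
# (illustrative example).  Analytically ||y||_0 = max|sin| = 1 and
# ||y||_1 = max|sin| + max|cos| = 2.

x = np.linspace(0.0, np.pi, 100001)

y = np.sin(x)
dy = np.gradient(y, x)                 # finite-difference y'(x) ≈ cos(x)

norm0 = np.max(np.abs(y))              # ||y||_0 = max|y|
norm1 = norm0 + np.max(np.abs(dy))     # ||y||_1 = max|y| + max|y'|

print(norm0, norm1)                    # ≈ 1.0, ≈ 2.0
```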

# Strong and weak extrema

Consider some functional $\ J:F \to \mathbb R$, where $\ F$ is a normed function space.

Then, we say that $\ J$ has a strong minimum at some $\ y_0 \in F$ if there exists some $\ \delta > 0$ such that

$\ J[y_0] \le J[y]$

for all $\ y \in F$ with $\|y - y_0 \|_0 < \delta$.

Similarly, we define a weak minimum at some $\ y_0 \in F$ with $\|\cdot\|_1$ in place of $\|\cdot\|_0$.

Notice that a strong minimum is also a weak minimum: since $\|y - y_0\|_0 \le \|y - y_0\|_1$, any $\ y$ with $\|y - y_0\|_1 < \delta$ also satisfies $\|y - y_0\|_0 < \delta$.

Strong and weak maxima are defined similarly, with $\ge$ in place of $\le$.

We may also define extrema using norm $\| \cdot \|_n$ for $\ n > 1$.
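The converse containment fails: functions can be close in $\|\cdot\|_0$ without being close in $\|\cdot\|_1$. In this sketch (my own example), $\ y_n(x) = \sin(nx)/n$ converges to $0$ in $\|\cdot\|_0$ yet stays at distance about $1$ in $\|\cdot\|_1$:

```python
import numpy as np

# Illustrative example: y_n(x) = sin(n x)/n on [0, 2π] satisfies
# ||y_n||_0 = 1/n → 0, while ||y_n||_1 = 1/n + max|cos(n x)| ≈ 1.
# So y_n is close to 0 in the C^0 sense but not in the C^1 sense.

x = np.linspace(0.0, 2.0 * np.pi, 200001)

for n in [1, 10, 100]:
    y = np.sin(n * x) / n
    dy = np.gradient(y, x)                # y_n'(x) ≈ cos(n x)
    norm0 = np.max(np.abs(y))             # = 1/n
    norm1 = norm0 + np.max(np.abs(dy))    # ≈ 1/n + 1
    print(n, norm0, norm1)
```

This is why a weak extremum need not be a strong one: a $\|\cdot\|_1$-neighbourhood excludes such rapidly oscillating competitors, while a $\|\cdot\|_0$-neighbourhood does not.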

# A necessary condition for an extremum

Theorem: If a differentiable functional $\ J$ has an extremum at $\ y_0 \in F$, then its variation at $\ y_0$ satisfies

$\ \delta J[h] = 0$ for all $h \in F$.

Proof: Suppose, if possible, that $\ \delta J[h_0] \neq 0$ for some $\ h_0 \in F$, where $\ h_0 \neq 0$.

Let $\ \alpha > 0$ and consider ${\Delta J[\alpha h_0] \over {\| \alpha h_0\|}} = {\delta J[\alpha h_0] \over {\| \alpha h_0 \|}} + \epsilon = {\delta J[h_0] \over {\| h_0 \|}} + \epsilon$.

As $\alpha \to 0$, we have $\|\alpha h_0\| = \alpha \|h_0\| \to 0$ and hence $\ \epsilon \to 0$.

That is, if $\ \alpha$ is sufficiently small, then $\ \Delta J[\alpha h_0]$ and $\ \delta J[\alpha h_0]$ share the same sign.

However, $\ \delta J[-\alpha h_0] = - \delta J[\alpha h_0]$. Hence, for arbitrarily small $\|\alpha h_0\|$, the increment $\ \Delta J$ takes both signs within any neighbourhood of $\ y_0$, so $\ y_0$ cannot be an extremum, a contradiction. QED
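The theorem can be checked numerically on a discretized functional. In this sketch (my own example), $\ J[y] = \int_0^1 y'(x)^2 \, dx$ with $\ y(0) = 0$, $\ y(1) = 1$ is minimized by the straight line $\ y(x) = x$, and the directional derivative there vanishes along every admissible direction:

```python
import numpy as np

# Finite-dimensional sketch (my own example): discretize
# J[y] = ∫_0^1 y'(x)^2 dx with boundary conditions y(0) = 0, y(1) = 1.
# The minimizer is the straight line y(x) = x; at it, the variation
# δJ[h] = 2 ∫ y' h' dx vanishes for every admissible h (h(0) = h(1) = 0).

x = np.linspace(0.0, 1.0, 1001)
dx = x[1] - x[0]

def J(y):
    dy = np.diff(y) / dx               # forward-difference y'
    return np.sum(dy**2) * dx

y0 = x.copy()                          # candidate extremal y(x) = x
h = np.sin(np.pi * x)                  # admissible direction, h(0) = h(1) = 0

t = 1e-4
dJ = (J(y0 + t * h) - J(y0 - t * h)) / (2.0 * t)   # ≈ δJ[h]
print(dJ)                              # ≈ 0, as the theorem requires
```

Perturbing away from the extremal in the same direction raises $\ J$, confirming that $\ y(x) = x$ is a minimum and not merely a stationary point of the discretized functional.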

# References

1. Gelfand, I.M. and Fomin, S.V.: Calculus of Variations, Dover Publ., 2000.