Infinite Dimensional Derivatives and the Calculus of Variations: Part I

by greenturtle3141, Dec 22, 2021, 6:04 AM

Calculus of Variations Series

Part I: This
Part II: https://artofproblemsolving.com/community/c2532359h2896086


Reading Difficulty: 5/5
Prerequisites: Vector space, norm, limits, integration
Useful to know: Integration by parts, directional derivatives, gradients, multidimensional chain rule, differential equations, second-order ODEs

Notational Note: Sometimes I write $\int f\,dx$ as a shortcut/abuse of notation for $\int f(x)\,dx$.

This is one of my favorite applications of analysis. Buckle your seatbelts!


Part 0: Directional Derivatives

Let $X$ be a normed space (like $\mathbb{R}^N$). Suppose $f:E \to \mathbb{R}$ where $E \subseteq X$, and $x_0 \in E$ is an interior point (i.e. $f$ is defined not just at $x_0$, but also "around" $x_0$). Then we can try and find the "derivative" of $f$ at $x_0$.

Specifically, we can take a "directional derivative", which means that a "direction" for this derivative must be specified. Namely, it measures the rate of change as we move along some line through $x_0$. To wit, let $v \in X$ with $\|v\|=1$ be a "direction". Imagine this as an arrow pointing in the direction along which we take the derivative.

Definition (Directional Derivative): The directional derivative of $f$ at $x_0$ along the direction $v$ is:
$$\frac{\partial f}{\partial v}(x_0) := \lim_{t \to 0} \frac{f(x_0+tv) - f(x_0)}{t}$$
For example, let $X = \mathbb{R}^3$, and let $f(x,y,z) = x^2+y^2+z^2$, defined on all of $\mathbb{R}^3$. We can take the directional derivative along $i := (1,0,0)$ at $(1,1,1)$:
$$\frac{\partial f}{\partial i}(1,1,1) := \lim_{t \to 0} \frac{f((1,1,1) + t(1,0,0)) - f(1,1,1)}{t}$$$$ = \lim_{t \to 0} \frac{f(1+t,1,1) - f(1,1,1)}{t}$$$$ = \lim_{t \to 0} \frac{(1+t)^2+1^2+1^2 - 1^2-1^2-1^2}{t} = 2$$For fun, you can try and see what happens when you take $v = (\sqrt{2}/2,\sqrt{2}/2,0)$, or similar.
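If you like, you can check this numerically. Here's a quick Python sketch (the helper name `directional_derivative` and the finite-difference step are my own choices):

```python
import numpy as np

def directional_derivative(f, x0, v, t=1e-6):
    """Symmetric difference quotient approximating the directional derivative."""
    x0, v = np.asarray(x0, float), np.asarray(v, float)
    return (f(x0 + t * v) - f(x0 - t * v)) / (2 * t)

f = lambda p: p[0]**2 + p[1]**2 + p[2]**2

# The worked example: along i = (1,0,0) at (1,1,1), the limit is 2.
print(directional_derivative(f, (1, 1, 1), (1, 0, 0)))  # ≈ 2.0

# Spoiler for the follow-up direction (sqrt(2)/2, sqrt(2)/2, 0):
print(directional_derivative(f, (1, 1, 1), (np.sqrt(2)/2, np.sqrt(2)/2, 0)))  # ≈ 2.828
```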


Part 1: Differentiating in Infinite Dimensions

This might sound scary, but keep in mind that NOTHING changes from the previous example. It's the same exact definition. We're just going to have fun with it! Let's mess around.

We're going to take a spicier vector space. Let $X = C_b(\mathbb{R})$, the normed vector space of bounded continuous functions over $\mathbb{R}$. Of course, if I say it's a normed space, then I have to give you a norm. We're going to take the supremum norm, i.e. for a continuous function $f$:
$$\|f\|_\infty := \sup_{\mathbb{R}} |f|$$If you don't know what $\sup$ means, you can instead replace it with $\max$ (not technically correct, but good enough for intuition). Convince yourself that it is a norm! That is, we should have these properties:
  • $\|f\|_\infty = 0$ if and only if $f = 0$.
  • For a real number $t$, we have that $\|tf\|_\infty = |t| \cdot \|f\|_\infty$.
  • $\|f\|_\infty + \|g\|_\infty \geq \|f+g\|_\infty$ (this might be the trickiest one to verify)
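You can also poke at these properties numerically. Here's a sketch that approximates the sup norm by sampling a big interval (a stand-in for the true sup over $\mathbb{R}$; the sample functions are my own picks):

```python
import numpy as np

# Approximate the sup norm by sampling a large interval.
xs = np.linspace(-10, 10, 100001)
def sup_norm(f):
    return np.max(np.abs(f(xs)))

# Homogeneity: ||t f|| = |t| ||f||
print(sup_norm(lambda x: -3*np.sin(x)), 3*sup_norm(np.sin))   # both ≈ 3.0

# Triangle inequality: ||f + g|| <= ||f|| + ||g||, and it can be strict:
print(sup_norm(lambda x: np.sin(x) + np.cos(x)))   # ≈ sqrt(2) ≈ 1.414
print(sup_norm(np.sin) + sup_norm(np.cos))         # ≈ 2.0
```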

Alrighty! We have a vector space. Note that this vector space is of infinite dimension... Uh, what do functions on this vector space look like? These "functions" take in other functions, and output real numbers. Wow! So these are like, "higher-order functions", eh?

Example 1 (Evaluation Map): Let's try this cool function:
$$F(f) := f(0)$$For example, $\sin \in C_b(\mathbb{R})$, and so we can compute $F(\sin) = 0$ because $\sin(0) = 0$.

Test your understanding real quick:
  • Compute $F(\cos)$. Answer
  • Compute $F(t \mapsto e^{-t^2})$. Answer
  • Why did I write $F(t \mapsto e^{-t^2})$ instead of $F(e^{-t^2})$? Answer
  • Compute $F(t \mapsto t^2)$. Answer

We have a function to differentiate, so now we want a direction $v$ to differentiate towards, and a point $x_0$ to take the derivative at... For the sake of generality, let's just let them be $v$ and $f_0$. (I'm calling it $f_0$ to remind myself that it's a function)

Wow this is wacky and I love it.
$$\frac{\partial F}{\partial v}(f_0) = \lim_{t \to 0} \frac{F(f_0 + tv) - F(f_0)}{t}$$$$ = \lim_{t \to 0} \frac{f_0(0) + tv(0) - f_0(0)}{t} = \boxed{v(0)}$$Interesting!
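If this feels too wacky, you can watch the difference quotient do its thing in Python (the names `F` and `dF` are mine):

```python
import math

def F(f):           # the evaluation map from Example 1
    return f(0)

def dF(F, f0, v, t=1e-8):
    """Difference quotient for the directional derivative of F at f0 along v."""
    return (F(lambda x: f0(x) + t*v(x)) - F(f0)) / t

print(dF(F, math.sin, math.cos))   # v(0) = cos(0) = 1.0
```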

Bonus Problem: Is $F$ differentiable? That is, does there exist a linear $L:C_b(\mathbb{R}) \to \mathbb{R}$ such that $\lim_{f \to f_0} \frac{F(f) - F(f_0) - L(f-f_0)}{\|f-f_0\|_\infty} = 0$? If so, what is $L$? Hint

Example 2 (Integral Functionals): Let's start throwing in the dinosaurs!
$$F(f) := \int_{-1}^1 f(x)\,dx$$For example, $F(\sin) = 0$, and $F(\cos) = 2\sin(1)$ or something. Let's try computing the directional derivative of $F$ at $f_0$ in the direction $v$:
$$\frac{\partial F}{\partial v}(f_0) = \lim_{t \to 0} \frac{F(f_0 + tv) - F(f_0)}{t}$$$$ = \lim_{t \to 0} \frac{1}{t}\left[\int_{-1}^1 f_0(x)+tv(x)\,dx - \int_{-1}^1 f_0(x)\,dx\right] $$$$ = \lim_{t \to 0} \frac{1}{t}\left[\int_{-1}^1 f_0(x)+tv(x) - f_0(x)\,dx\right] $$$$ = \lim_{t \to 0} \frac{1}{t}\int_{-1}^1tv(x)\,dx = \boxed{\int_{-1}^1 v(x)\,dx}$$
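Again, nothing stops us from sanity-checking this with a computer. Here's a sketch with a crude midpoint-rule integrator (all helper names are mine):

```python
import math

def integrate(f, a, b, n=10000):
    """Crude midpoint-rule quadrature; accurate enough for a sanity check."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5)*h) for i in range(n))

def F(f):           # Example 2: integral of f over [-1, 1]
    return integrate(f, -1, 1)

def dF(F, f0, v, t=1e-4):
    return (F(lambda x: f0(x) + t*v(x)) - F(f0)) / t

# The derivative at f0 = sin along v = cos should be the integral of v:
print(dF(F, math.sin, math.cos))     # ≈ 2 sin(1) ≈ 1.683
print(integrate(math.cos, -1, 1))    # same
```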
Example 3: That was kinda lame because everything just cancelled too well. Let's get spicier:
$$F(f) := \int_{-1}^1 f(x)^2\,dx$$What happens now?
$$\frac{\partial F}{\partial v}(f_0) = \lim_{t \to 0} \frac{F(f_0 + tv) - F(f_0)}{t}$$$$ = \lim_{t \to 0} \frac{1}{t}\left[\int_{-1}^1 (f_0(x)+tv(x))^2\,dx - \int_{-1}^1 f_0(x)^2\,dx\right] $$$$ = \lim_{t \to 0} \frac{1}{t}\int_{-1}^1 2tf_0(x)v(x) + t^2v(x)^2\,dx$$$$ = \lim_{t \to 0} \int_{-1}^1 2f_0(x)v(x) + tv(x)^2\,dx = \int_{-1}^1 2f_0(x)v(x)\,dx + \lim_{t \to 0} \int_{-1}^1 tv(x)^2\,dx$$So close! How do we deal with this wacky $\lim_{t \to 0} \int_{-1}^1 tv(x)^2\,dx$ term? Intuitively, since $tv(x)^2$ should go to $0$ as $t \to 0$, we would predict that this limit is $0$. This is actually true! This is proven using Lebesgue Dominated Convergence. No worries if you're unfamiliar, but just note that you can't always "shove the limit inside the integral". It's fine here, though! Therefore:
$$\frac{\partial F}{\partial v}(f_0) = \boxed{\int_{-1}^1 2f_0(x)v(x)\,dx}$$Spicy!
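Here's a numerical check of the boxed formula, using a symmetric difference quotient so that the $tv(x)^2$ term cancels exactly (helper names are mine):

```python
import math

def integrate(f, a, b, n=10000):
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5)*h) for i in range(n))

def F(f):           # Example 3: integral of f^2 over [-1, 1]
    return integrate(lambda x: f(x)**2, -1, 1)

def dF(F, f0, v, t=1e-4):
    # symmetric quotient: the t*v(x)^2 pieces cancel exactly for this F
    return (F(lambda x: f0(x) + t*v(x)) - F(lambda x: f0(x) - t*v(x))) / (2*t)

f0 = v = math.sin
lhs = dF(F, f0, v)
rhs = integrate(lambda x: 2*f0(x)*v(x), -1, 1)
print(lhs, rhs)     # both ≈ 2 - sin(2) ≈ 1.091
```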

Bonus Exercise: Generalize the above argument to compute the directional derivative for functionals of the form $F(f) = \int_a^b g(f(x))^2\,dx$, where $g:\mathbb{R} \to \mathbb{R}$ is a differentiable function with continuous derivative. If you don't know Lebesgue Dominated Convergence, go ahead and swap limits and integrals without proof.

Fooling around in infinite dimensions was fun! Here's how we can practically apply this... it's about to get even more fun!


Part 2: The Calculus of Variations

Consider the following four problems:
  1. What function $f:[0,1] \to [0,1]$ satisfying $f(0) = 0, f(1) = 1$ will minimize the quantity $\int_0^1 (f(x)+f'(x))^2\,dx$?
  2. What is the shortest path between two points in $\mathbb{R}^2$?
  3. Suppose a cliff is $h$ meters high. What is the shape of the track that minimizes the time it would take for a roller coaster starting at the top of the cliff to reach the ground?
  4. Consider the two rings $\{x=a,y^2+z^2=r_0^2\}$ and $\{x=b,y^2+z^2=r_1^2\}$ positioned "coaxially" in $\mathbb{R}^3$, for constants $a,b,r_0,r_1$. What is the shape of minimal surface that connects the boundaries of the two rings?

These four problems are all minimization problems. If you're reading this, you've probably found minima before. It was easy! To find the minimum of something like $f(x) = x^2-x$, you'd just take the derivative and set it to zero. For those problems, you were finding the point that minimizes a function $f$. But in the above four problems, we're trying to find a certain function that minimizes some quantity in terms of that function... like, a function of a function.

Hm... Perhaps it would be easier to see where we're headed if we considered the first problem first.

Example 1: Over all differentiable functions $f:[0,1] \to [0,1]$ satisfying $f(0) = 0$ and $f(1) = 1$, what is the minimum possible value of $F(f) := \int_0^1 (f(x)+f'(x))^2\,dx$? For what $f$ is this achieved?

Well, taking inspiration from calculus, what if we just, like... took the derivative of $F$, and set it to zero...? Well, we're trying to minimize over a space of functions, which is like, an infinite-dimensional vector space, so the derivative would have to be like, some kind of infinite-dimensional derivative, or something...

Oh wait.

Some Tools We'll Need

We need an analog for "if $f$ is minimized at $x_0$ then $f'(x_0) = 0$."

Theorem 1: Let $f:E \to \mathbb{R}$ where $E \subseteq X$ for $X$ a normed vector space. Suppose $f$ has a local minimum/maximum at $x_0 \in E$, where $x_0$ is an interior point and all the directional derivatives at $x_0$ exist. Then we MUST have
$$\frac{\partial f}{\partial v}(x_0) = 0$$for EVERY direction $v$.
Proof

Next, a small note on integration by parts.

Lemma 2: Let $\varphi \in C_0^1([a,b];\mathbb{R})$. That is, $\varphi$ is continuously differentiable and compactly supported, (you can interpret this as $\varphi(a) = \varphi(b) = 0$). Then, for any differentiable, integrable $f$, we have:
$$\int_{a}^b f\varphi'\,dx = -\int_{a}^b f'\varphi\,dx$$Proof
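If you have sympy lying around, you can verify this identity symbolically for a sample $f$ and a sample compactly supported $\varphi$ (the specific choices below are mine):

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(x)                    # any differentiable, integrable f
phi = (1 - x**2)**2              # C^1 on [-1,1], vanishes at both endpoints

lhs = sp.integrate(f * sp.diff(phi, x), (x, -1, 1))
rhs = -sp.integrate(sp.diff(f, x) * phi, (x, -1, 1))
print(sp.simplify(lhs - rhs))    # 0
```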

Next, recall (or learn right now) the chain rule in multiple dimensions.

Theorem 3 (Chain Rule): Let $f:\mathbb{R} \to \mathbb{R}^N$ and $g:\mathbb{R}^N \to \mathbb{R}$. Then:
$$\frac{d}{dt} g(f(t)) = \nabla g(f(t)) \cdot f'(t)$$Proof omitted, but here's an example

Lastly, we'll need this cool trick.

Lemma 4 (Fundamental Lemma of the Calculus of Variations): Let $h:[a,b] \to \mathbb{R}$ be a continuous function such that $\int_a^b h\varphi\,dx = 0$ for all compactly supported and continuously differentiable $\varphi$ satisfying $\|\varphi\|_\infty = 1$. Then $h = 0$.

Proof

Back to the First Example

Firstly, we need to specify the vector space of functions we're using. We will pick $X = C^1([0,1];\mathbb{R})$, which is the normed space of functions $f:[0,1] \to \mathbb{R}$ with continuous derivative. As before, the norm we take is the supremum/maximum norm.

Let's suppose that $F:C^1([0,1];\mathbb{R}) \to \mathbb{R}$ is minimized at some $f_0$. Then, by Theorem 1, we have for every direction $v \in X$ that:
$$0 = \frac{\partial F}{\partial v}(f_0) = \lim_{t \to 0} \frac{F(f_0+tv)-F(f_0)}{t}$$$$ = \lim_{t \to 0} \frac1t\int_0^1 (f_0(x)+tv(x) + f'_0(x)+tv'(x))^2 - (f_0(x)+f'_0(x))^2\,dx$$$$ = \lim_{t \to 0} \frac1t\int_0^1 2(f_0(x)+f'_0(x))(tv(x)+tv'(x)) + (tv(x)+tv'(x))^2\,dx$$$$ = \lim_{t \to 0} \int_0^1 2(f_0(x)+f'_0(x))(v(x)+v'(x)) + t(v(x)+v'(x))^2\,dx$$Switch the limit and integral with a domination argument:
$$ = \int_0^1 \lim_{t \to 0} 2(f_0(x)+f'_0(x))(v(x)+v'(x)) + t(v(x)+v'(x))^2\,dx$$$$ = \int_0^1 2(f_0(x)+f'_0(x))(v(x)+v'(x))\,dx$$Now let's split this integral into two parts. The key idea is to remove the $v'(x)$ and turn it into a $v(x)$.
$$ = \int_0^1 2(f_0(x)+f'_0(x))v(x)\,dx + \int_0^1 2(f_0(x)+f'_0(x))v'(x)\,dx$$This equality holds for all directions $v$. We're allowed to apply more conditions to $v$ if we want (as long as it's still a direction, it works!), particularly we may assume that $v$ is compactly supported in $(0,1)$. Then, magic happens when we apply integration by parts / Lemma 2 on the second integral! This gives us:
$$ = \int_0^1 2(f_0(x)+f'_0(x))v(x)\,dx - \int_0^1 2(f_0'(x)+f''_0(x))v(x)\,dx$$
We conclude that
$$\int_0^1 2(f_0(x) - f''_0(x))v(x)\,dx = 0$$for all $v \in C^1_0([0,1];\mathbb{R})$. By the Fundamental Lemma of the Calculus of Variations, we find that actually, $f_0(x) - f''_0(x) = 0$. Tada!

This is now just a differential equation! We can solve $f''_0 = f_0$ to get $f_0 = c_1e^x + c_2e^{-x}$. Plugging in the constraints $f_0(0) = 0$ and $f_0(1) = 1$, we find that $c_2 = -c_1$ and $c_1(e - e^{-1}) = 1$. Therefore, the minimizing function is $\boxed{f_0(x) = \frac{e^x - e^{-x}}{e - e^{-1}} = \frac{\sinh x}{\sinh 1}}$.
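As a sanity check, we can numerically compare $F$ across a few functions satisfying the boundary conditions (a rough midpoint-rule sketch; the candidate functions and their derivatives are passed in by hand):

```python
import math

def integrate(f, a, b, n=20000):
    """Midpoint-rule quadrature; plenty accurate for a sanity check."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5)*h) for i in range(n))

def F(f, fp):
    """F(f) = integral of (f + f')^2 over [0,1]; f' supplied by hand."""
    return integrate(lambda x: (f(x) + fp(x))**2, 0, 1)

candidates = {
    "x":               (lambda x: x,     lambda x: 1.0),
    "x^2":             (lambda x: x**2,  lambda x: 2*x),
    "sinh(x)/sinh(1)": (lambda x: math.sinh(x)/math.sinh(1),
                        lambda x: math.cosh(x)/math.sinh(1)),
}

for name, (f, fp) in candidates.items():
    print(name, F(f, fp))
# Among these candidates, sinh(x)/sinh(1) attains the smallest value.
```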

By taking an "infinite-dimensional" derivative, we discovered the minimal value of an "infinite-dimensional" function! Wasn't that fun?

Welcome to the Calculus of Variations.


Part 3: The Euler-Lagrange Equation

You can totally solve the other three problems that I have posed by using this method, but it can get a bit annoying. Instead, let's generalize in order to make a shortcut!

General Problem: Let $g(x,y,z)$ be a twice-differentiable function $\mathbb{R}^3 \to \mathbb{R}$. The $x$ component represents... $x$, the $y$ component represents $f(x)$, and the $z$ component represents $f'(x)$. Here's an example to explain what that means: If I'm trying to represent the expression $xf(x)+f'(x)^2$ in the form $g(x,f(x),f'(x))$, then $g(x,y,z) = xy+z^2$.

Let $F(f) = \int_a^b g(x,f(x),f'(x))\,dx$. What differential equation must be satisfied by an $f$ that minimizes $F(f)$?

Note that the first problem is equivalent to solving this general problem for $g(x,y,z) = (y+z)^2$.

Step 1: Take the directional derivative

If $f_0$ is a local minimum, then the directional derivatives at $f_0$ must all be zero. For any direction $v$ compactly supported in $(a,b)$, we have that:
$$0 = \frac{\partial F}{\partial v}(f_0) = \lim_{t \to 0} \frac{1}{t}\int_a^b g(x,f_0(x)+tv(x),f_0'(x)+tv'(x)) - g(x,f_0(x),f_0'(x))\,dx$$$$ = \lim_{t \to 0} \int_a^b \frac{g(x,f_0(x)+tv(x),f_0'(x)+tv'(x)) - g(x,f_0(x),f_0'(x))}{t}\,dx$$
Step 1.5: Shove the Limit In

This isn't necessarily fun so I'm hiding it

Step 2: Prep the Chain Rule

$$ = \int_a^b \lim_{t \to 0}\frac{g(x,f_0(x)+tv(x),f_0'(x)+tv'(x)) - g(x,f_0(x),f_0'(x))}{t}\,dx$$The integrand looks like a derivative, and that's because it totally is. It's just:
$$\lim_{t \to 0}\frac{g(x,f_0(x)+tv(x),f_0'(x)+tv'(x)) - g(x,f_0(x),f_0'(x))}{t} = \frac{d}{dt} g(x,f_0(x)+tv(x),f_0'(x)+tv'(x))$$I personally find it hard to see where chain rule is applied here. To help myself out, I like to define a "component accumulator" function. To wit, let $h(t) = (x,f_0(x)+tv(x),f_0'(x)+tv'(x))$. Then this is:
$$ = \frac{d}{dt} g(h(t))$$
Step 3: Apply Chain Rule

$$ = \nabla g(h(t)) \cdot h'(t)$$$$ = \begin{bmatrix}\frac{\partial g}{\partial x}(h(t)) \\ \frac{\partial g}{\partial y}(h(t)) \\ \frac{\partial g}{\partial z}(h(t))\end{bmatrix} \cdot \begin{bmatrix}0 \\ v(x) \\ v'(x)\end{bmatrix}$$$$ = \frac{\partial g}{\partial y}(x,f_0(x),f_0'(x))v(x) + \frac{\partial g}{\partial z}(x,f_0(x),f_0'(x))v'(x)$$
Putting back the integral, we have for every $C^1$ and compactly supported $v$ that:
$$0 = \int_a^b \frac{\partial g}{\partial y}(x,f_0(x),f_0'(x))v(x) + \frac{\partial g}{\partial z}(x,f_0(x),f_0'(x))v'(x) \,dx$$
Step 4: Integrate by Parts

We don't like $v'(x)$. We want to replace it with $v(x)$. So split the integral:
$$ = \int_a^b \frac{\partial g}{\partial y}(x,f_0(x),f_0'(x))v(x)\,dx + \int_a^b \frac{\partial g}{\partial z}(x,f_0(x),f_0'(x))v'(x) \,dx$$And apply integration by parts, and the fact that $v$ is compactly supported, on the second term:
$$ = \int_a^b \frac{\partial g}{\partial y}(x,f_0(x),f_0'(x))v(x)\,dx - \int_a^b \frac{d}{dx}\frac{\partial g}{\partial z}(x,f_0(x),f_0'(x))v(x) \,dx$$$$ = \int_a^b \left(\frac{\partial g}{\partial y}(x,f_0(x),f_0'(x))-\frac{d}{dx}\frac{\partial g}{\partial z}(x,f_0(x),f_0'(x))\right)v(x) \,dx$$
Step 5: The Fundamental Lemma

Since this holds for all compactly supported $C^1$ directions $v$, we may use the Fundamental Lemma of Calculus of Variations, and we finally obtain that $\frac{\partial g}{\partial y}(x,f_0(x),f_0'(x))-\frac{d}{dx}\frac{\partial g}{\partial z}(x,f_0(x),f_0'(x))=0$. Ergo:
$$\boxed{\frac{\partial g}{\partial y}(x,f_0(x),f_0'(x)) = \frac{d}{dx}\frac{\partial g}{\partial z}(x,f_0(x),f_0'(x))}$$
This is the Euler-Lagrange Equation.


Part 4: Euler-Lagrange Abuse

There's a mess of variables flying around, so let's make it really clear how we use Euler-Lagrange.
  1. First, look at your integrand, and figure out what $g$ is. This is done by replacing $f(x)$ with $y$ and $f'(x)$ with $z$.
  2. To get the left side, differentiate $g$ with respect to the $y$ variable, then plug back in $y = f(x)$ and $z = f'(x)$. (You can think of this as treating $f(x)$ as a "variable" and differentiating with respect to it.)
  3. To get the right side, differentiate $g$ with respect to the $z$ variable, and plug back in $y = f(x)$ and $z = f'(x)$ ("Differentiate with respect to the $f'(x)$ variable"). Now, take the derivative with respect to $x$.
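These three steps translate almost verbatim into sympy. Here's a sketch using the $g(x,y,z) = xy+z^2$ example from earlier (the variable bookkeeping is my own):

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
f = sp.Function('f')

# Step 1: encode the integrand. For x*f(x) + f'(x)^2, this is:
g = x*y + z**2

# Steps 2 & 3: differentiate in y and z, then substitute back y = f(x), z = f'(x).
subs = {y: f(x), z: f(x).diff(x)}
lhs = sp.diff(g, y).subs(subs)               # dg/dy at (x, f(x), f'(x))
rhs = sp.diff(sp.diff(g, z).subs(subs), x)   # d/dx of dg/dz at (x, f(x), f'(x))

print(sp.Eq(lhs, rhs))   # the Euler-Lagrange equation: x = 2 f''(x)
```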

Our first victim/example will be the shortest distance between two points.

Victim 1 (Shortest Distance): Let $(a,b)$ and $(c,d)$ be points in $\mathbb{R}^2$, and let's assume that $a < c$. What is the shortest path between these two points?

The Glitch Mob suggests that it is a line, but let's demonstrate that.

We can (ish) model the shortest path as a continuous function $f:[a,c] \to \mathbb{R}$ that satisfies $f(a) = b$ and $f(c) = d$. So we're trying to figure out what function $f$ satisfies these conditions, such that its length (i.e. "arc length") is minimized.

The arc-length formula states that the "length" of this path is given by $\int_a^c \sqrt{1+f'(x)^2}\,dx$ (slight warning: We're kinda assuming that $f$ is differentiable... Why we're fine). Thus here's the problem:

Find the differentiable function $f$ satisfying $f(a) = b$ and $f(c)=d$ that minimizes the quantity $F(f) := \int_a^c \sqrt{1+f'(x)^2}\,dx$.

Now let's follow the Euler-Lagrange formula...
  1. Looks like our $g$ is $g(x,y,z) = \sqrt{1+z^2}$.
  2. Let's differentiate with respect to $y$... to get... uh, $0$. This is the LHS.
  3. Now let's differentiate with respect to $z$ to get $\frac{z}{\sqrt{1+z^2}}$. Plugging back in stuff, this turns into $\frac{f'(x)}{\sqrt{1+f'(x)^2}}$. We're not done yet: We need to differentiate this with respect to $x$ to get $\frac{d}{dx} \frac{f'(x)}{\sqrt{1+f'(x)^2}}$. This is the RHS.

Hence the Euler-Lagrange equation for this minimization problem is given by $\boxed{\frac{d}{dx} \frac{f'(x)}{\sqrt{1+f'(x)^2}} = 0}$. Notice that I didn't bother evaluating this derivative. That's because if the derivative of something is zero and that something is $C^1$ then the something must be a constant function! Thus:
$$\frac{f'(x)}{\sqrt{1+f'(x)^2}} = k$$For a constant $k$. Solving, we get:
$$f'(x) = \frac{k}{\sqrt{1-k^2}}$$So $\boxed{f(x) = \frac{k}{\sqrt{1-k^2}}x + m}$ for constants $k$ and $m$. This is linear, and by applying the original conditions, this must be the line connecting the two points.
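You can also let sympy double-check that every line satisfies this Euler-Lagrange equation:

```python
import sympy as sp

x, m, b = sp.symbols('x m b')
f = m*x + b                     # a generic line

fp = sp.diff(f, x)
el = sp.diff(fp / sp.sqrt(1 + fp**2), x)
print(el)                       # 0: any line satisfies the equation
```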


Victim 2 (Rollercoaster Brachistochrone): You've been hired as an engineer to build a new rollercoaster at the Grand Canyon, which probably violates like 50 laws. It's going to start really high up at $(0,h)$, and its ending point is going to be on the ground at $(k,0)$. To maximize profits, your company wants the ride to take the least time possible. Given that the only force acting on the system is gravity (i.e. gravity is the only source of speed), what shape should the track be?

First, we purport that the track can be modelled as a continuous function $f:[0,k] \to [0,h]$, unless you really, really want loops. How do we compute the time the rollercoaster would take to get to the end, given that $f$ is the function modelling the track? The unfortunate answer is physics.

Bad Physics

Which leads into some messy analysis...

Messy Analysis

And eventually we find that the time taken $F(f)$ given a track $f$ is given by:
$$F(f) = \int_0^k \frac{\sqrt{1+f'(x)^2}}{\sqrt{2g(h-f(x))}}\,dx$$
Er, looks like the letter $g$ has been stolen by physics. Oh well. But when has that ever stopped math? The left side of the Euler-Lagrange equation is:
$$\frac{g\sqrt{1+f'(x)^2}}{2\sqrt{2}(g(h-f(x)))^{3/2}}$$The right side is:
$$\frac{d}{dx} \frac{f'(x)}{\sqrt{2g(h-f(x))} \cdot \sqrt{1+f'(x)^2}}$$Now set them equal and solve, easy!

...

...If I go through the computations here then both my readers and I will go insane. Fortunately, Mathematica comes to the rescue! This reduces to:
$$1+f'(x)^2 = 2(h-f(x))f''(x)$$The $h-f(x)$ term massively upsets me, so if we let $u = h-f(x)$ then this is:
$$1+u'(x)^2 = -2u(x)u''(x)$$It's time for a slick trick. Multiply each side by $u'(x)$ to get:
$$u'(x)+u'(x)^3 + 2u(x)u'(x)u''(x) = 0$$But why would I ever? If we massage this a bit...
$$u'(x) + u'(x)u'(x)^2 + u(x)(2u'(x)u''(x)) = 0$$...notice that we can apply the product rule for differentiation in reverse!
$$u'(x) + \frac{d}{dx}\left(u(x)u'(x)^2\right) = 0$$$$\frac{d}{dx}\left(u(x) + u(x)u'(x)^2\right) = 0$$Thus $u(x) + u(x)u'(x)^2 = c$ for a constant $c$. One layer down, one to go! Solving for $u'(x)$:
$$u'(x) = \pm\sqrt{\frac{c-u(x)}{u(x)}}$$Think: Is $u$ going up or down? It should be going up because $f$ is going down! So we must take the positive solution.

Let's throw away some rigor and "separate and integrate":
$$\frac{du}{dx} = \sqrt{\frac{c-u}{u}}$$$$\sqrt{\frac{u}{c-u}}\,du = dx$$$$x = c_1 + \int \sqrt{\frac{u}{c-u}}\,du$$(If you're concerned, it's not too hard to restore rigor.)

If integrating this is your cup of tea, go ahead. A trig sub or two should do the trick. Unfortunately I have other things to do, so by stuffing this into Mathematica we get:
$$x = c_1+c\sin^{-1}(\sqrt{u/c}) - \sqrt{u(c-u)}$$Not the prettiest solution, but it will do.
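If you don't trust Mathematica (fair), here's a quick numerical check that differentiating this expression in $u$ really gives back $\sqrt{u/(c-u)}$ (the sample values of $c$ and $u$ are my own):

```python
import math

def X(u, c):
    """The claimed antiderivative (additive constant dropped)."""
    return c*math.asin(math.sqrt(u/c)) - math.sqrt(u*(c - u))

c = 2.0
for u0 in (0.3, 0.9, 1.5):
    h = 1e-6
    numeric = (X(u0 + h, c) - X(u0 - h, c)) / (2*h)
    exact = math.sqrt(u0/(c - u0))
    print(numeric, exact)   # they agree
```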

If we restore $f(x) = h-u(x)$ then the curve looks something like this:
https://i.imgur.com/FV0gp54.png
This curve is called the Brachistochrone!


The last victim, I leave to you as an exercise.

Victim 3 (The Cow Containment Catenoid): Space Farmer John got his hands on the elusive Space Cow, and wants to build a special enclosure for it. He insists that the room for the cow consists of a circular floor of radius $r_2$ and a circular ceiling of radius $r_1$, such that the centers of these circles lie directly above each other at a distance $h$.

Space Farmer John is running out of material, though, and has hired you (yes, you!) to finish the enclosure by building the walls out of the super expensive FlexiMetal material. The mission is to figure out what shape the walls need to be in order to minimize the amount of FlexiMetal used (i.e. minimize the surface area of the walls).

His first thought was to just build the walls up "linearly" to make a truncated cone. But, I suspect that he doesn't have the mathematical background to back up his claim. Fortunately, you do now! Is the truncated cone truly optimal? Or does it turn out to be a more surprising shape...? Perhaps, some mysterious surface called a "Catenoid"? Only one way to find out!

If you want, I can start you off:

Start

Now have at it!


Takeaways
  • Function spaces are valid vector spaces, and you can even do "Calculus" on them!
  • Having a deep understanding of analysis, in its most general forms, leads to some remarkable results and methodologies.
  • The "set the derivative to $0$" philosophy has an infinite-dimensional analogue, which leads to the Euler-Lagrange equation.

For that last point though... we all know that solving $f'(x) = 0$ doesn't necessarily give you a minimum/maximum... sometimes you can get tricked. To wit, must the Euler-Lagrange equation give a minimum? Unfortunately no! We need to ensure, somehow, that a minimum even exists in the first place. But how? ...and other questions that we shall resolve in Part II+.

Enjoy Christmas!
This post has been edited 4 times. Last edited by greenturtle3141, Jul 31, 2022, 9:18 PM
