Lesson 1.6: The Most Important Tool in Calculus

Welcome! This lesson covers the single most important prerequisite for understanding stochastic calculus. It's a 'super-tool' from normal calculus that lets us do something amazing: approximate a complex function using a simple one.

Everything in finance, from Itô's Lemma to the Black-Scholes equation, is built on this one idea.

The 'Why': The Prediction Problem

Imagine a complex, curved function, like $f(x) = e^x$. It's hard to calculate by hand.

Now, imagine you are standing at a single point on that curve, say at $x = a$. At this one "anchor point," you know everything:

  • The value of the function: $f(a)$
  • The slope (1st derivative): $f'(a)$
  • The curvature (2nd derivative): $f''(a)$
  • ...and so on.

The Problem: Can we use only the information at our "anchor point" $a$ to build a "simple map" (a polynomial) that gives us a great approximation for the function at any other point $x$?

The Answer: Yes! That "map" is the Taylor Expansion.

Building Our 'Map' Step-by-Step

Our "map" will be a simple polynomial, P(x)P(x), that we will "force" to match our real function f(x)f(x) at our anchor point aa.

Our "map" will have this general form:

$$P(x) = c_0 + c_1(x-a) + c_2(x-a)^2 + \dots$$

Our job is to find the coefficients $c_0, c_1, c_2, \dots$

Step 1: The 'Lazy' Guess (0th-Order)

Let's make a "map" that only matches the value of our function. A flat, horizontal line.

Goal: $P(a) = f(a)$

Derivation: Since $P(a) = c_0$, this means $c_0 = f(a)$.

$$f(x) \approx f(a)$$

Analogy: You're driving a car. Your "prediction" for your position in 1 second is... your current position. It's a terrible prediction if you're moving!
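
To see how bad the "lazy" guess gets, here's a minimal Python sketch for $f(x) = e^x$ anchored at $a = 0$ (the variable names are just illustrative):

```python
import math

# 0th-order map for f(x) = e^x anchored at a = 0: predict f(a) everywhere.
a = 0.0
f_a = math.exp(a)   # = 1.0

for x in (0.1, 0.5, 1.0):
    actual = math.exp(x)
    print(f"x = {x}: prediction = {f_a:.4f}, actual = {actual:.4f}, "
          f"error = {actual - f_a:.4f}")
```

The error grows quickly: by $x = 1$ the flat-line prediction is off by about $1.72$.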

Step 2: The 'Good' Guess (1st-Order: The Tangent Line)

Let's make a "map" that matches both the value and the slope. A straight line.

Goal: Match $f(a)$ and $f'(a)$.

Derivation: Matching $f(a)$ gives $c_0 = f(a)$. To match the slope, we need $P'(a) = f'(a)$. Since $P'(x) = c_1$ for a straight line, this means $c_1 = f'(a)$.

$$f(x) \approx f(a) + f'(a)(x-a)$$

Analogy: "Your future position ≈ current position + (current speed × time)." This is the tangent line! It's a great prediction, but it fails because our real function is curved.

Step 3: The 'Great' Guess (2nd-Order: The Parabola)

This is the one we need for stochastic calculus. A parabola that matches value, slope, AND curvature.

Goal: Match $f(a)$, $f'(a)$, and $f''(a)$.

Derivation: $c_0$ and $c_1$ are the same as before. We need $P''(a) = f''(a)$. Differentiating $c_2(x-a)^2$ twice gives $P''(x) = 2c_2$. So $2c_2 = f''(a)$, which means $c_2 = \frac{f''(a)}{2}$.

$$f(x) \approx f(a) + f'(a)(x-a) + \frac{f''(a)}{2}(x-a)^2$$

Analogy: "Your future position ≈ current position + (speed × time) + (a term for your acceleration × time²)." This is a much better map because the parabola curves in the same way our real function does.

The Most Important Formula (The 'Change' Formula)

For our class, we don't care about the new value; we care about the change in value, $\Delta f$.

We rename our variables: let our "anchor point" $a$ be just $x$, and our "new point" be $x + \Delta x$. This means our "step" $(x-a)$ is now just $\Delta x$.

Substitute into our 2nd-Order formula:

$$f(x + \Delta x) \approx f(x) + f'(x)\,\Delta x + \frac{1}{2}f''(x)(\Delta x)^2$$

To find the "Change in ff" (Δf=f(x+Δx)f(x)\Delta f = f(x + \Delta x) - f(x)), we just move the f(x)f(x) from the right side to the left side:

Master Tool

$$\Delta f \approx f'(x)\,\Delta x + \frac{1}{2}f''(x)(\Delta x)^2$$
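
A minimal numeric check of the master tool, again using $f(x) = e^x$ (so $f'(x) = f''(x) = e^x$) with an assumed step $\Delta x = 0.1$:

```python
import math

# Check the master tool for f(x) = e^x at x = 1 with step dx = 0.1.
# For e^x both derivatives are e^x, so they are known exactly here.
x, dx = 1.0, 0.1
fp = fpp = math.exp(x)

exact_change = math.exp(x + dx) - math.exp(x)     # the true delta-f
first_order = fp * dx                             # slope term only
second_order = first_order + 0.5 * fpp * dx ** 2  # slope + curvature terms

print(f"exact delta-f    : {exact_change:.6f}")   # ~0.285884
print(f"1st-order approx : {first_order:.6f}")    # ~0.271828
print(f"2nd-order approx : {second_order:.6f}")   # ~0.285420
```

The curvature term closes most of the gap left by the slope term alone; it's exactly this term that stochastic calculus refuses to throw away.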

A Concrete Example: f(x) = sin(x)

Building a "map" for sin(x)
Let's build a map for $\sin(x)$ around the easy anchor point $a = 0$. This is a Maclaurin series.

The Recipe:

$$f(x) \approx f(0) + f'(0)\,x + \frac{f''(0)}{2!}x^2 + \frac{f'''(0)}{3!}x^3 + \dots$$

Find our ingredients at $a = 0$:

  • $f(x) = \sin(x) \implies f(0) = \sin(0) = 0$
  • $f'(x) = \cos(x) \implies f'(0) = \cos(0) = 1$
  • $f''(x) = -\sin(x) \implies f''(0) = -\sin(0) = 0$
  • $f'''(x) = -\cos(x) \implies f'''(0) = -\cos(0) = -1$

Plug into the Recipe:

$$\sin(x) \approx 0 + (1)x + \frac{0}{2}x^2 + \frac{-1}{6}x^3 + \dots$$

The Result:

$$\sin(x) \approx x - \frac{x^3}{6} + \frac{x^5}{120} - \dots$$

You've just built a simple polynomial "map" that can approximate the complex $\sin(x)$ function! (The $\frac{x^5}{120}$ term comes from running the recipe two more cycles: the 4th derivative is $\sin(x)$ again, which vanishes at $0$, and the 5th is $\cos(x)$, giving $\frac{1}{5!} = \frac{1}{120}$.)
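
A short sketch comparing our hand-built map against Python's math.sin (the name sin_map is ours, not a library function):

```python
import math

# Our hand-built Maclaurin map for sin(x), truncated at the x^5 term.
def sin_map(x):
    return x - x**3 / 6 + x**5 / 120

for x in (0.1, 0.5, 1.0):
    print(f"x = {x}: map = {sin_map(x):.6f}, math.sin = {math.sin(x):.6f}")
```

Even at $x = 1$, a full radian from the anchor, the five-term map agrees with $\sin(x)$ to about three decimal places.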

What's Next?
  • You now have the master tool: $\Delta f \approx f'(x)\,\Delta x + \frac{1}{2}f''(x)(\Delta x)^2$
  • In the next lesson, we'll see why in normal calculus that second term vanishes: the $(\Delta x)^2$ piece becomes negligible as $\Delta x \to 0$.
  • And in Module 3, we'll see why in stochastic calculus that second term does NOT vanish... and this discovery changes everything.

Up Next: Taylor Expansion (Multiple Variables)