Differential Equations · Operator Theory

Linear ODEs —
Operators, Superposition
& Fundamental Sets

Treating differentiation as an algebraic object, and why solutions can be stacked like Lego bricks.

The Big Idea

"Differentiation can be treated like an operator, and for linear homogeneous ODEs, once you know a couple of building-block solutions, you can combine them to make the general solution."

That is the whole game here.

Section 01

What is a differential operator?

The symbol \(D\) is just shorthand for:

$$D = \frac{d}{dt}$$

So \(D\) is an operator. An operator is just something that acts on a function and produces another function.

So if \(y(t) = t^3\), then:

$$Dy = \frac{d}{dt}(t^3) = 3t^2$$

So \(D\) means: "take one derivative." Then:

$$D(Dy) = D^2 y = \frac{d^2 y}{dt^2}$$
$$D^3 y = \frac{d^3 y}{dt^3}$$

So \(\dfrac{d^3y}{dt^3} = D(D(Dy)) = D^3y\) — just compressed notation.
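Notation aside, \(D\) is easy to play with in code. A minimal sketch with sympy (assuming it is installed), applying \(D\), \(D^2\), and \(D^3\) to \(y = t^3\):

```python
import sympy as sp

t = sp.symbols('t')
y = t**3

# D y: take one derivative
print(sp.diff(y, t))      # 3*t**2

# D^2 y and D^3 y: run the machine twice, then three times
print(sp.diff(y, t, 2))   # 6*t
print(sp.diff(y, t, 3))   # 6
```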

Mental Model

Think of \(D\) like a machine:

Input: y(t) → Operator: D → Output: y′(t)

Then \(D^2\) means run the machine twice. \(D^3\) means run it three times. Old-school slick math move. Very clean.

Section 02

What does it mean to combine differential operators into a polynomial expression?

Consider the general linear operator:

$$L = a_n(t)D^n + a_{n-1}(t)D^{n-1} + \cdots + a_2(t)D^2 + a_1(t)D + a_0(t)$$

Applying \(L\) to \(y\) and setting the result equal to \(g(t)\) gives the general linear ODE. What \(L(y)\) really means is:

$$L(y) = a_n(t)y^{(n)} + a_{n-1}(t)y^{(n-1)} + \cdots + a_2(t)y'' + a_1(t)y' + a_0(t)y$$

So the operator \(L\) is just a compact way of writing a linear differential equation. For example, the ODE

$$y'' + 3y' - 4y = 0$$

can be written as

$$(D^2 + 3D - 4)\,y = 0$$

because \(D^2 y = y''\), \(3Dy = 3y'\), and \(-4y = -4y\). So:

$$(D^2 + 3D - 4)\,y = y'' + 3y' - 4y$$

That is why they call it a polynomial expression in \(D\). It behaves like algebra, except the variable is the derivative operator.

Why this matters

This is what leads to the characteristic equation trick for constant-coefficient linear ODEs:

$$(D^2+3D-4)\,y = 0$$
$$r^2 + 3r - 4 = 0$$

Boom. That is where the algebra pipeline starts.
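That pipeline can be checked mechanically. A sympy sketch (assuming sympy is available): solve the characteristic polynomial, then confirm that each root \(r\) gives a solution \(e^{rt}\):

```python
import sympy as sp

t, r = sp.symbols('t r')

# Characteristic equation of (D^2 + 3D - 4) y = 0
roots = sorted(sp.solve(r**2 + 3*r - 4, r))
print(roots)  # [-4, 1]

# Each root gives an exponential solution e^{r t}; plug it back in
for root in roots:
    y = sp.exp(root*t)
    residual = sp.diff(y, t, 2) + 3*sp.diff(y, t) - 4*y
    print(sp.simplify(residual))  # 0 for both roots
```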

Section 03

What is \(L\)?

For a second-order ODE, the linear differential operator \(L\) looks like:

$$L = P(t)D^2 + Q(t)D + R(t)$$

So if you feed \(y\) into \(L\), it outputs:

$$L[y] = P(t)y'' + Q(t)y' + R(t)y$$

That is just notation. Nothing mystical. It is like defining a function called \(L\) that takes in a function \(y\) and spits out a new expression involving \(y, y', y''\).
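That "function that eats a function" idea translates directly into code. A sketch with sympy; `make_L` is a made-up helper name, not a standard API:

```python
import sympy as sp

t = sp.symbols('t')

def make_L(P, Q, R):
    """Build the operator L[y] = P*y'' + Q*y' + R*y from its coefficients."""
    def L(y):
        return P*sp.diff(y, t, 2) + Q*sp.diff(y, t) + R*y
    return L

L = make_L(1, 0, -4)   # example coefficients: L[y] = y'' - 4y
print(L(sp.exp(2*t)))  # 0, so e^{2t} solves y'' - 4y = 0
print(L(t**2))         # nonzero, so t^2 does not
```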

Section 04

What is the principle of superposition?

This is the juicy part.

If \(y_1(t)\) and \(y_2(t)\) are solutions, then

$$y(t) = c_1 y_1(t) + c_2 y_2(t)$$

is also a solution, for any constants \(c_1\) and \(c_2\). This is true for a linear homogeneous differential equation. That word combo matters: linear means \(y\) and its derivatives appear only to the first power, with no products between them, and homogeneous means the right-hand side is zero.

Example: \(y'' - 3y' + 2y = 0\). If \(y_1 = e^t\) is a solution and \(y_2 = e^{2t}\) is a solution, then \(y = c_1 e^t + c_2 e^{2t}\) is also a solution.

Why?

Because derivatives distribute over addition and constants pull out nicely. Let's show it with no hand-waving.

Suppose \(L[y] = P(t)y'' + Q(t)y' + R(t)y\), and suppose:

$$L[y_1] = 0, \qquad L[y_2] = 0$$

Now take \(y = c_1 y_1 + c_2 y_2\). Then:

$$y' = c_1 y_1' + c_2 y_2'$$
$$y'' = c_1 y_1'' + c_2 y_2''$$

Now plug into \(L[y]\):

$$L[y] = P(t)(c_1 y_1'' + c_2 y_2'') + Q(t)(c_1 y_1' + c_2 y_2') + R(t)(c_1 y_1 + c_2 y_2)$$

Distribute:

$$L[y] = c_1\bigl(P(t)y_1'' + Q(t)y_1' + R(t)y_1\bigr) + c_2\bigl(P(t)y_2'' + Q(t)y_2' + R(t)y_2\bigr)$$

That is:

$$L[y] = c_1 L[y_1] + c_2 L[y_2]$$

But \(L[y_1] = 0\) and \(L[y_2] = 0\), so:

$$L[y] = c_1(0) + c_2(0) = 0$$

So \(y\) is also a solution. That is superposition.
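The whole proof can be replayed symbolically, with \(c_1\) and \(c_2\) left as symbols. A sympy sketch using the earlier example \(y'' - 3y' + 2y = 0\):

```python
import sympy as sp

t, c1, c2 = sp.symbols('t c1 c2')

# L[y] = y'' - 3y' + 2y, the example equation from above
def L(y):
    return sp.diff(y, t, 2) - 3*sp.diff(y, t) + 2*y

y1, y2 = sp.exp(t), sp.exp(2*t)
print(sp.simplify(L(y1)), sp.simplify(L(y2)))  # 0 0

# Superposition: the combination solves it for *symbolic* c1, c2
print(sp.simplify(L(c1*y1 + c2*y2)))  # 0
```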

Street-level intuition

If the system is linear and unforced, then solutions can be stacked together without breaking the equation. It is like waves adding. Or like Lego bricks: each solution brick fits, and linear combinations still fit. That is mad important in differential equations, linear algebra, circuits, signals, all that jazz.

Section 05

Why do they call this "fundamental sets of solutions"?

Because for a second-order linear homogeneous ODE, you need two linearly independent solutions to build the full general solution.

So if you find two good independent solutions \(y_1\) and \(y_2\), then they form a fundamental set of solutions, and the general solution is:

$$y(t) = c_1 y_1(t) + c_2 y_2(t)$$

Not just any two solutions — they need to be linearly independent. That means one cannot just be a constant multiple of the other.

Examples

Independent: \(e^t\) and \(e^{2t}\) ✓

Not independent: \(e^t\) and \(5e^t\) ✗

If they are dependent, then you really only have one direction, not two.
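The constant-multiple test is easy to automate for a pair of solutions. A sympy sketch; `is_constant_multiple` is a made-up helper name:

```python
import sympy as sp

t = sp.symbols('t')

def is_constant_multiple(y1, y2):
    # For two functions, dependence means y2/y1 is a constant,
    # i.e. the ratio has zero derivative.
    ratio = sp.simplify(y2 / y1)
    return sp.diff(ratio, t) == 0

print(is_constant_multiple(sp.exp(t), sp.exp(2*t)))  # False: independent
print(is_constant_multiple(sp.exp(t), 5*sp.exp(t)))  # True: dependent
```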

And for a second-order equation, you need two degrees of freedom because you usually need to satisfy two initial conditions, like:

$$y(t_0) = y_0, \qquad y'(t_0) = y_0'$$

So the constants \(c_1\) and \(c_2\) give you the flexibility to match those conditions.
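Here is that flexibility in action: a sympy sketch that pins down \(c_1, c_2\) for the earlier example \(y'' - 3y' + 2y = 0\), with made-up initial values \(y(0) = 1\), \(y'(0) = 0\):

```python
import sympy as sp

t, c1, c2 = sp.symbols('t c1 c2')

# General solution built from the fundamental set {e^t, e^{2t}}
y = c1*sp.exp(t) + c2*sp.exp(2*t)

# Two initial conditions -> two equations -> solve the 2x2 system
eqs = [sp.Eq(y.subs(t, 0), 1),               # y(0) = 1
       sp.Eq(sp.diff(y, t).subs(t, 0), 0)]   # y'(0) = 0
sol = sp.solve(eqs, [c1, c2])
print(sol)  # {c1: 2, c2: -1}
```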

Section 06

Tiny caveat

The statement \(y(t) = c_1 y_1(t) + c_2 y_2(t)\) is definitely true if the DE is linear homogeneous.

If the equation is nonhomogeneous, like \(y'' + y = \sin t\), then superposition of two full solutions does not work the same way. For nonhomogeneous equations, the general solution is:

$$y = y_c + y_p$$

where \(y_c\) is the complementary solution (the general solution of the matching homogeneous equation) and \(y_p\) is any one particular solution of the nonhomogeneous equation.

So superposition as stated is really about the homogeneous linear case.
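To make \(y = y_c + y_p\) concrete for \(y'' + y = \sin t\), here is a sympy sketch. The particular solution \(y_p = -\tfrac{t}{2}\cos t\) is a standard result for this forcing term; it is verified below rather than derived:

```python
import sympy as sp

t, c1, c2 = sp.symbols('t c1 c2')

# Complementary solution: solves the homogeneous part y'' + y = 0
yc = c1*sp.cos(t) + c2*sp.sin(t)
print(sp.simplify(sp.diff(yc, t, 2) + yc))  # 0

# Particular solution: hits the forcing term sin t
yp = -t*sp.cos(t)/2
print(sp.simplify(sp.diff(yp, t, 2) + yp - sp.sin(t)))  # 0, i.e. yp'' + yp = sin t
```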

Section 07

What it all means in one clean chunk

Here is the unified meaning: \(D\) turns calculus into algebra, so a linear ODE becomes a polynomial in \(D\) applied to \(y\). Because that operator is linear, solutions of the homogeneous equation stack: linear combinations of solutions are still solutions. And once you have enough independent solutions (two, for second order), their combinations give every solution. That is the fundamental set.

Section 08

Super quick concrete example

Take:

$$y'' - y = 0$$

In operator form:

$$(D^2 - 1)\,y = 0$$

Two solutions are:

$$y_1 = e^t, \qquad y_2 = e^{-t}$$

Then by superposition:

$$y = c_1 e^t + c_2 e^{-t}$$

Check it:

$$y' = c_1 e^t - c_2 e^{-t}$$
$$y'' = c_1 e^t + c_2 e^{-t}$$

Then:

$$y'' - y = (c_1 e^t + c_2 e^{-t}) - (c_1 e^t + c_2 e^{-t}) = 0 \checkmark$$

Verdict

Works perfectly. Game over.

Lock-in Summary

Three Core Ideas

The three core ideas from this whole setup:

01

\(D\) means "differentiate."
So \(D^2 y = y''\), \(D^3 y = y'''\), etc.

02

A linear ODE can be written as an operator equation.
\(y''+3y'-4y=0 \;\leftrightarrow\; (D^2+3D-4)y=0\)

03

Superposition.
Linear combinations of solutions are also solutions for linear homogeneous equations: \(y = c_1 y_1 + c_2 y_2\)

So this is really the foundation for why second-order homogeneous linear equations have general solutions built from two independent basis solutions. Pretty rad setup, honestly.