Wednesday, January 4, 2012

An Interesting View of Differentiation

I've never studied any functional analysis (my background in traditional analysis is not yet nearly strong enough), but I do find the topic quite interesting. It's partly due to its place at the convergence of Linear Algebra and Analysis. It's partly because I love the notion of function spaces. It's partly because it plays a central role in some areas of physics which I find quite interesting. And it's partly (probably mostly) because it just seems so exotic that I can't help but be attracted to it.

Despite having never studied function spaces properly, I occasionally run into them in my readings. Today in particular, I was reading about inner product spaces. I eventually came across the topic of Hilbert Spaces, which are inner product spaces that are also complete metric spaces with respect to the metric induced by the norm arising from the inner product. One way or another, this got me thinking about derivatives from a viewpoint very different from the one usually presented in an undergraduate calculus course.

A little while ago, I wrote a post about the space of all functions on the real numbers. What is differentiation in abstract terms? It is a map that takes a real function as input and returns another real function as output. On the space of all real functions, this would not produce a linear transformation (a homomorphism in the category of vector spaces), since not all real functions are differentiable. However, it is easily seen that the set of all everywhere-differentiable real functions, together with the expected operations of function addition and multiplication by a real scalar, is itself a vector space; for any differentiable functions f, g, h and any real scalars c, d:

  • Associativity of Vector Addition
    ((f+g)+h)(x) = (f(x)+g(x))+h(x) = f(x)+(g(x)+h(x)) = (f+(g+h))(x)
  • Commutativity of Vector Addition
    (f+g)(x) = f(x)+g(x) = g(x)+f(x)=(g+f)(x)
  • Existence of Zero Vector
    Let N(x)=0
    (N+f)(x) = N(x) + f(x) = f(x) = f(x) + N(x) = (f+N)(x)
  • Existence of Vector Inverses Under Vector Addition
    Let (-f)(x) = -f(x)
    (f+(-f))(x) = f(x) + (-f)(x) = f(x) - f(x) = N(x)
  • Distributivity of Scalar Multiplication With Respect to Vector Addition
    c(f+g)(x) = c(f(x)+g(x)) = cf(x) + cg(x) = (cf+cg)(x)
  • Distributivity of Scalar Multiplication With Respect to Field Addition
    ((c+d)f)(x) = (c+d)f(x) = cf(x)+df(x) = (cf+df)(x)
  • Compatibility of Scalar Multiplication with Field Multiplication
    (c(df))(x) = c(df(x)) = (cd)f(x) = ((cd)f)(x)
  • Identity Element of Scalar Multiplication
    (1f)(x) = 1f(x) = f(x)
Thus, since linear combinations of differentiable functions are again differentiable, the set of all differentiable real functions is indeed a vector space. The differentiation operator d/dx can therefore be thought of as a linear transformation from the space of all differentiable real functions into the space of all real functions. The null space of this transformation is exactly the set of all constant functions (which includes the zero function).
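Both claims, linearity of d/dx and the kernel of constant functions, can be illustrated with a quick numerical sketch. The central finite-difference helper `deriv` below is my own stand-in for the true derivative, used only as an approximation:

```python
import math

def deriv(f, x, h=1e-6):
    """Central finite-difference approximation to f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

# Linearity: D(c*f + d*g) = c*D(f) + d*D(g), checked at sample points
c, d = 3.0, -2.0
f, g = math.sin, math.exp
combo = lambda x: c * f(x) + d * g(x)

for x in [-1.0, 0.0, 0.5]:
    lhs = deriv(combo, x)
    rhs = c * deriv(f, x) + d * deriv(g, x)
    assert abs(lhs - rhs) < 1e-6

# Null space: the derivative of any constant function is the zero function
const = lambda x: 7.0
assert all(abs(deriv(const, x)) < 1e-9 for x in [-1.0, 0.0, 0.5])
```

The linearity check succeeds for an almost trivial reason: the difference quotient itself distributes over the linear combination, which mirrors the usual proof that differentiation is linear.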

I'd love to know more about differentiation from this perspective. Is the space of all differentiable functions a Hilbert Space? My first intuition is to say "yes", as it can technically be thought of as an uncountable Cartesian product of the real numbers, but I'm not so sure whether the familiar Euclidean inner product generalizes to uncountable dimensions, and if the space cannot be endowed with an inner product then it fails to be a Hilbert Space.
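One natural candidate generalizing the Euclidean inner product is the L² pairing ⟨f, g⟩ = ∫ f(x)g(x) dx, with the sum over coordinates replaced by an integral; here I take the integral over the hypothetical interval [0, 1] just to keep it finite (whether this actually makes the space above a Hilbert Space is a separate question, since completeness would also need to hold). A rough numerical sketch, with `inner` being my own midpoint-rule approximation:

```python
import math

def inner(f, g, a=0.0, b=1.0, n=100000):
    """Midpoint-rule approximation of the L2 pairing: integral of f*g over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) * g(a + (i + 0.5) * h)
               for i in range(n)) * h

ident = lambda x: x

# <x, x> over [0, 1] is the integral of x^2, i.e. 1/3
assert abs(inner(ident, ident) - 1.0 / 3.0) < 1e-6

# Symmetry, one of the inner product axioms: <f, g> = <g, f>
assert math.isclose(inner(math.sin, math.exp),
                    inner(math.exp, math.sin))
```

The analogy to the Euclidean dot product is direct: instead of summing products of finitely many coordinates, we "sum" products of the uncountably many values f(x)g(x) by integrating.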

Either way... thinking about differentiation as a linear transformation in an infinite-dimensional vector space is pretty badass. Grr, Functional Analysis. Someday, I will conquer you.