From: kovarik@mcmail.cis.McMaster.CA (Zdislav V. Kovarik)
Subject: Re: Questions on functional analysis
Date: 5 Aug 1999 06:45:18 -0400
Newsgroups: sci.math
Keywords: some basics
In article <37A88234.F30D3B28@dartmouth.edu>,
Ashok. R wrote:
>Greetings.
>
>
>I really really want to learn about functional analysis and I am trying
>really really hard - and I have a few questions...
>
Looks like no takers, so I'll give it a try.
>1. Why is closure so important and how is it different from completeness?
>If a set is closed, does it necessarily imply that it is also complete?
I presume you mean closure as a property ("closedness"), in contrast with
closure as an operation.
Just scan theorems in your text that assume closedness, and find out which
of them would turn false if you drop that assumption. (A nice series of
comprehension exercises.) Two important cases:
In metric topology, let a set S be closed. Then every point not in S has
a positive distance from S. This is quite convenient. If S is not closed,
examples exist of x being at zero distance from S but not in S. (Take S as
the reciprocals of positive integers, and x = 0.)
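A quick numeric sketch of that example (the helper function and the cutoff
n_max are my own, for illustration only):

```python
# S = {1/n : n = 1, 2, ...} is not closed: x = 0 is at distance 0
# from S, yet 0 is not in S.

def dist_to_S(x, n_max=1000):
    # infimum of |x - 1/n|, approximated over n = 1..n_max
    return min(abs(x - 1.0 / n) for n in range(1, n_max + 1))

print(dist_to_S(0.5))        # 0.0 -- 0.5 = 1/2 is in S
print(dist_to_S(0.0, 10))    # 0.1 -- but this shrinks ...
print(dist_to_S(0.0, 1000))  # 0.001 -- ... toward 0 as n_max grows
```

So the "distance zero but not a member" situation can only happen when the
set fails to be closed.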
If a linear subspace of a normed space is closed then the induced
seminorm on the quotient space is a norm (quite a desirable property).
And conversely: if the subspace is not closed, the induced seminorm fails
to be a norm.
And about completeness vs. closedness: being closed is, in a way, a much
easier property to have than being complete.
There must be a theorem in the book saying this about metric spaces:
If a space X is complete, then X is closed in every bigger space Y
(when X carries the subspace metric from Y).
And:
If a space X is complete then its closed subspaces are exactly its
complete subspaces.
For example, the space of polynomials from the example below is not
complete, but it is closed in itself (of course, every space is closed
in itself!), and some of its closed subspaces are complete (the
finite-dimensional ones, for example), while others are incomplete
(e.g. the set of all even polynomials).
>2. What is the difference between a Cauchy sequence and a convergent
>sequence? Can anyone give me an example of a Cauchy sequence that is not
>convergent?
>
A one-way implication in general cases: if convergent then Cauchy.
A two-way implication in complete spaces (read the definition again).
Surroundings: Take V to be the space of polynomial functions, normed by
maximum of absolute value over [-1,1]. This is a non-closed linear
subspace of C([-1,1]).
Example: Take the n-th polynomial to be the n-th partial sum of the power
series for e^x (starting with n=0):
p_n(x) = 1 + x/1! + ... + x^n/n!
Show that {p_n} is Cauchy but not convergent in V: it converges (uniformly)
to e^x, which is not a polynomial. (Why not? Observe the derivatives.)
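A rough numerical check of this, approximating the sup-norm on a finite
grid (the grid size and the indices 10, 20 are my choices):

```python
import math

# Sup-norm over [-1, 1], approximated on a 401-point grid.
GRID = [-1 + 2 * k / 400 for k in range(401)]

def sup_norm(f):
    return max(abs(f(x)) for x in GRID)

def p(n):
    # n-th partial sum of the power series for e^x
    def pn(x):
        return sum(x**k / math.factorial(k) for k in range(n + 1))
    return pn

# Cauchy: ||p_m - p_n|| shrinks as m, n grow ...
print(sup_norm(lambda x: p(20)(x) - p(10)(x)))   # tiny
# ... because the sequence converges in C([-1,1]) -- but to e^x,
# which is not a polynomial, so {p_n} has no limit inside V.
print(sup_norm(lambda x: p(20)(x) - math.exp(x)))  # tiny
```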
>3. Quotient space: This is the stuff that I am really struggling with. I
>just cannot understand the concept of a Quotient space. Can anyone give
>me an easy to follow explanation with good examples?
Example: Everyone who went through Calculus I with a good teacher
remembers that on an interval, the indefinite integral of a continuous
function is actually a class of functions, any one of them plus a constant
function.
So, for example, the indefinite integral of sin(x) is the set
{-cos(x)+C: C a constant}.
Now from the definitions, this set is a coset -cos(x)+P(0) where P(0) is
the set of all constant functions.
More generally, the result of applying the operation of indefinite
integration to the space of all continuous functions on R (for simplicity)
is the quotient space:
(continuously differentiable functions) / P(0)
(Casually speaking, you consider every constant to be just as good as
zero, because if you differentiate the integral ...)
Going higher: if you integrate indefinitely every continuous function 6
times, the result will be
(six times continuously differentiable functions) / P(5)
where P(5) is the space of all polynomials of degree 5 or less. Why? (If
you and your neighbor integrate correctly the same function six times,
each time picking your own constant, you might get different answers, but
the difference of your answers should be a polynomial from P(5), shouldn't
it?)
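The "you and your neighbor" experiment can be sketched with polynomials as
plain coefficient lists (the representation, the function f, and the chosen
constants are all mine, for illustration):

```python
# Polynomials as coefficient lists: [c0, c1, ...] means c0 + c1*x + ...

def antiderivative(coeffs, C):
    # indefinite integral, with C as the chosen constant of integration
    return [C] + [c / (k + 1) for k, c in enumerate(coeffs)]

def integrate_n_times(coeffs, constants):
    for C in constants:
        coeffs = antiderivative(coeffs, C)
    return coeffs

f = [720.0]                                   # the constant function 720
mine  = integrate_n_times(f, [1, 2, 3, 4, 5, 6])
yours = integrate_n_times(f, [9, 8, 7, 6, 5, 4])

diff = [a - b for a, b in zip(mine, yours)]
print(diff)   # nonzero only in degrees 0..5: an element of P(5)
```

Both answers have the same degree-6 part (720 * x^6/6! = x^6); all the
disagreement sits in P(5), exactly as the coset picture predicts.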
A more general example: Suppose L is a linear transformation of a space X
onto Y. Then we can, casually speaking, consider x just as good as y if
L(x) = L(y). (Think of a linear differential equation L(x)=f and
"particular solutions" x and y of it.)
If you denote N the nullspace of L (the set of solutions of the
homogeneous equation), then the particular solutions can be sorted out as
cosets of X/N .
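A minimal sketch of that sorting with a concrete matrix (the matrix A, the
right-hand side b, and both solutions are my own illustration):

```python
# L given by a matrix A mapping R^3 onto R^2.
A = [[1.0, 1.0, 0.0],
     [0.0, 1.0, 1.0]]

def apply(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

b = [2.0, 3.0]

# Two different "particular solutions" of A v = b:
x = [1.0, 1.0, 2.0]
y = [0.0, 2.0, 1.0]
assert apply(A, x) == b and apply(A, y) == b

# Their difference lies in the nullspace N = N(A):
d = [xi - yi for xi, yi in zip(x, y)]
print(apply(A, d))   # [0.0, 0.0] -- x and y lie in the same coset of X/N
```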
>4. Is there an easy to follow book to learn functional analysis? I am
>using Luenberger's "Optimization by vector space methods" and it is not
>exactly easy to read. I need some examples, stuff that I can visualize.
Luenberger's book (I remember it quite well) was not at all intended to be
an introduction to functional analysis. Try Rudin, or Dieudonne
(Introduction to Modern Analysis), or Schechter. Others may suggest easier
ones.
>5. The Null space of a matrix, N(A): Is there a good way to show that it
>is complete? What would be a good Cauchy sequence for N(A)?
>
(Comprehension problem: You don't pick "a good Cauchy sequence" to prove
completeness: by definition, you check every, and that means every, Cauchy
sequence -- unless you have a previously proved theorem to help you avoid
that.) You do look for a "special" Cauchy sequence if you want to prove
***incompleteness***. Got it? (I was avoiding judging Cauchy sequences :-)
Back to matrices: The matrix A has a domain R^n and target R^m (or complex
analogues). These spaces are known to be complete, the matrix
transformation is continuous, so the nullspace is closed ... justify this
and finish the proof yourself (read my previous remarks; hints are there).
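The closedness step can be sketched numerically (the matrix, the nullspace
vector, and the sequence are my own choices): A is continuous, so if every
v_k lies in N(A) and v_k -> v, then A v = lim A v_k = 0, i.e. v is in N(A).

```python
A = [[1.0, 1.0, 0.0],
     [0.0, 1.0, 1.0]]

def apply(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

n0 = [1.0, -1.0, 1.0]            # a nullspace vector: A n0 = 0
assert apply(A, n0) == [0.0, 0.0]

# A Cauchy sequence inside N(A): v_k = (1 - 1/k) * n0, converging to n0.
def v(k):
    return [(1 - 1.0 / k) * c for c in n0]

for k in (1, 10, 100):
    print(apply(A, v(k)))        # each term stays in N(A) ...
print(apply(A, n0))              # ... and so does the limit
```

Since R^n is complete and N(A) is closed in it, N(A) is itself complete,
by the theorem about closed subspaces of complete spaces quoted earlier.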
>6. Are all complete spaces Banach spaces?
All complete normed linear spaces, yes. That's the definition.
Cheers, ZVK(Slavek).