When I was starting out as a grad student, I remember being frustrated by analysis problems that didn't DIRECTLY use the things we spent so much time proving in class.

As an example, let's look at a standard real analysis problem:
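(The problem statement was an image that didn't survive the unroll; reconstructed from the rest of the thread, it is:)

```latex
\text{Evaluate } \lim_{n \to \infty} \int_0^1 \frac{\sin(x^n)}{x^n}\,dx.
```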

(1/23)
The problem looks like it can be done using the dominated convergence theorem.

This is an important theorem and is used in a lot of places outside a grad real analysis class. A student would go through a lot of buildup leading to this theorem.

(2/23)
Now, let's see whether the conditions for DCT are satisfied.

First, for fixed 0<x<1, x^n->0 as n->infinity, and since sin(t)/t->1 as t->0, we get sin(x^n)/x^n->1.

So the integrand converges a.e. to 1 on [0,1].

To the student, this builds a stronger case to use DCT to solve the problem.
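Spelled out (my notation), the pointwise computation is:

```latex
\text{For fixed } 0 < x < 1:\quad x^n \to 0 \ \ (n \to \infty),
\qquad \lim_{t \to 0} \frac{\sin t}{t} = 1
\quad\Longrightarrow\quad \frac{\sin(x^n)}{x^n} \to 1.
```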

(3/23)
Finding an integrable dominating function is where the problem starts.

It would seem NATURAL (at least to me) to bound it with:
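The bound in the (missing) image was presumably the obvious one:

```latex
\left|\frac{\sin(x^n)}{x^n}\right| \le \frac{1}{x^n}, \qquad 0 < x < 1.
```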

p.s. it feels "natural" to me because it is reminiscent of the squeeze theorem problems math students encounter in calculus.

(4/23)
However, 1/x^n is not integrable in (0,1) and hence cannot be (further) dominated by an integrable function.
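For completeness, the divergence is a quick check (n > 1):

```latex
\int_0^1 \frac{dx}{x^n}
= \lim_{\epsilon \to 0^+} \int_\epsilon^1 x^{-n}\,dx
= \lim_{\epsilon \to 0^+} \frac{\epsilon^{1-n} - 1}{n - 1}
= \infty,
```

and for n = 1 the integral is -log(epsilon) -> infinity, so anything dominating 1/x^n on (0,1) is also non-integrable.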

So, what I would do: since I know it converges a.e. to 1, maybe it converges uniformly to 1?

This leads me into a BAD detour.

(5/23)
To show uniform convergence, I show that the SUPREMUM of |sin(x^n)/x^n - 1| goes to 0.

Now, some comments:

Uniform convergence is first taught using the clunky epsilon-N definition. I used to struggle proving that N is independent of x in these problems.

(6/23)
It was a big help when I realized that one can show this by showing that the supremum of the error converges to zero.

And that is helpful because, for functions whose derivative you can compute, this boils down to looking for the zeroes of the derivative (plus checking endpoints).
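In symbols, the standard criterion is:

```latex
f_n \to f \ \text{uniformly on } E
\iff
\sup_{x \in E} |f_n(x) - f(x)| \to 0 \quad (n \to \infty).
```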

(7/23)
Now, can we calculate the extrema of sin(x^n)/x^n on [0,1]?

Calculating the derivative leads to the expression below.

Now, I have no idea how to solve for the roots of the last equation, so I'm stuck.
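The missing picture most likely showed this computation (my reconstruction): by the quotient rule,

```latex
\frac{d}{dx}\,\frac{\sin(x^n)}{x^n}
= \frac{n x^{n-1}\bigl(x^n \cos(x^n) - \sin(x^n)\bigr)}{x^{2n}},
```

so interior critical points would satisfy tan(x^n) = x^n, a transcendental equation with no closed-form solutions.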

(8/23)
I feel that this is a common pitfall for a lot of analysis students.

There is plenty of emphasis on developing the big theory of Lebesgue integration, but the general guiding principles are often left for students to figure out on their own.

(9/23)
The interesting problems in analysis often deal with singularities where things become undefined.

As a GENERAL rule, one takes out the singularity to make the problem tractable and then shows that the result can be controlled uniformly in how you took out the singularity.

(10/23)
This technique pops up a lot in analysis.

For example, showing that the Poisson problem can be solved by a convolution with the fundamental solution of Laplace's equation uses this.

(11/23)
So, going back to our problem, the difficulty is that things get nasty at 0. So instead of working with an integral over (0,1), we consider an integral over (epsilon,1) for some small epsilon>0.

We are "taking out" the singularity at 0.

(12/23)
Now, we want to control this in such a way that when we let epsilon go to zero (for fixed n; order of limits is IMPORTANT), we get something that does not go to infinity.

However, if you control this with 1/x^n (which is now integrable in (epsilon,1)), it doesn't work:
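Concretely (reconstructing the missing estimate), with n fixed:

```latex
\int_\epsilon^1 \frac{|\sin(x^n)|}{x^n}\,dx
\le \int_\epsilon^1 \frac{dx}{x^n}
= \frac{\epsilon^{1-n} - 1}{n - 1}
\xrightarrow{\ \epsilon \to 0^+\ } \infty
\qquad (n > 1).
```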

(13/23)
You lose any control as epsilon goes to zero and you're stuck with the original problem.

This illustrates another general rule that a great mentor of mine taught us:

"Conservation of difficulty"

(14/23)
Transforming a problem into something else might illuminate it more or perhaps lead to a more plausible solution, but at a price: the difficulty pops up somewhere else.

And this applies to a lot of things in analysis.

(15/23)
So what do we do now? Well, let's go back to calculus.

How do we deal with integration problems? Maybe DIRECTLY working with the integral, instead of bounding sin by 1, will give better bounds.

u-sub is not helpful (I tried 😅). Let's try integration by parts:

(16/23)
We know that though 1/x^n blows up at 0, sin(x^n) tempers this.

It's not exactly clear how this happens in the integral at first glance, but by putting in the work and going through the calculations, we see how sin(x^n) and 1/x^n sort of cancel each other out.
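The IBP picture didn't survive the unroll; here is a likely reconstruction of the computation. Take u = sin(x^n)/x^n and dv = dx on (epsilon, 1):

```latex
\begin{aligned}
\int_\epsilon^1 \frac{\sin(x^n)}{x^n}\,dx
&= \Bigl[\,x^{1-n}\sin(x^n)\Bigr]_\epsilon^1
   - \int_\epsilon^1 \Bigl( n\cos(x^n) - n\,\frac{\sin(x^n)}{x^n} \Bigr)\,dx \\
\Longrightarrow\quad
(n-1)\int_\epsilon^1 \frac{\sin(x^n)}{x^n}\,dx
&= n\int_\epsilon^1 \cos(x^n)\,dx
   + \epsilon\,\frac{\sin(\epsilon^n)}{\epsilon^n}
   - \sin 1 \qquad (n \ge 2).
\end{aligned}
```

The boundary term at epsilon is epsilon * sin(epsilon^n)/epsilon^n, which stays bounded and in fact goes to 0 with epsilon: the sine in the numerator cancels the blow-up of 1/x^n.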

(17/23)
This is the price to pay. A cheap estimate like |sin x| <= 1 isn't strong enough to provide us with good bounds.

Often, there is reward when one puts in the effort to work through long calculations.

For example, the big results in PDEs come from the hard computations.

(18/23)
With the integration by parts in hand, we let epsilon go to zero and recover a nice limit that we can handle.

Note that now we can directly apply DCT to the integral of cos(x^n) on (0,1).
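Spelled out (my reconstruction, n >= 2): assuming the integration by parts gives the identity on (epsilon, 1), the boundary term epsilon * sin(epsilon^n)/epsilon^n vanishes as epsilon -> 0, leaving

```latex
\int_0^1 \frac{\sin(x^n)}{x^n}\,dx
= \frac{n\int_0^1 \cos(x^n)\,dx - \sin 1}{n-1},
```

and since |cos(x^n)| <= 1 on (0,1) with cos(x^n) -> 1 a.e., DCT gives that the integral of cos(x^n) tends to 1 as n -> infinity.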

(19/23)
Finally, taking now the limit as n goes to infinity, we see that the limit is 1!
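As a numerical sanity check (not from the original thread): a simple midpoint-rule approximation of the integral, with the integrand extended by its limit value 1 near x = 0. The helper names are mine.

```python
import math

def integrand(x: float, n: int) -> float:
    # sin(x^n) / x^n, using the limit value 1 when x^n is effectively 0
    t = x ** n
    return math.sin(t) / t if t > 1e-12 else 1.0

def approx_integral(n: int, steps: int = 100_000) -> float:
    # Midpoint rule on (0, 1); avoids evaluating at the endpoint x = 0
    h = 1.0 / steps
    return h * sum(integrand((k + 0.5) * h, n) for k in range(steps))

if __name__ == "__main__":
    for n in (1, 5, 50):
        print(n, approx_integral(n))
```

For n = 1 this approximates Si(1) ~ 0.946; as n grows the values creep up toward 1, consistent with the limit.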

(20/23)
Analysis is hard for a lot of people; it was hard for me too! I failed my first analysis qual 😅

And there is truth to how analysis has become a barrier for a lot of people to do math.

(21/23)
The bag of tricks you supposedly learn from experience is really more like a set of general principles that a lot of things in analysis use and share.

These general rules are actually reasonable, and it does take work on the student's part for them to feel natural and organic.

(22/23)
It's not a cure-all for how analysis has deterred people from doing math. Students come from different backgrounds with varying levels of preparation. I hold the opinion that rather than kicking people out, we do our best to help them succeed.

(23/23)
You can follow @weakconvergence.