Rearrangement theorem for series – Serlo


In this article, we will investigate under which assumptions we may re-arrange the elements of a series, and under which circumstances this is forbidden. If every re-arrangement converges to the same limit, we call such a series unconditionally convergent. We will follow a step-by-step approach, starting from finite sums. As mentioned in the last article, absolute convergence will be crucial for re-arrangements of real-valued series.

Re-arrangement of finite sums


For finite sums, a re-arrangement is always allowed, since addition is commutative and every re-arrangement can be written as a finite number of swaps of two elements. As an example, consider the sum

If we re-arrange it such that there are alternately one positive and two negative elements, we get

Mathematically, every re-arrangement of this sum can be expressed by a bijection $\sigma\colon \{1, \dots, n\} \to \{1, \dots, n\}$ of the index set. The bijection for the above re-arrangement is given by

therefore

So for any finite sum $\sum_{k=1}^{n} a_k$ with any $n \in \mathbb{N}$, we can formulate a generalized commutative law: for every bijection $\sigma\colon \{1, \dots, n\} \to \{1, \dots, n\}$ there is

$\sum_{k=1}^{n} a_{\sigma(k)} = \sum_{k=1}^{n} a_k$

Exercise (proof of the generalized commutative law)

Prove the generalized commutative law by induction over $n$.

Proof (proof of the generalized commutative law)

Base case: $n = 1$.

Here, we have only one bijection $\sigma\colon \{1\} \to \{1\}$ (the identity), with

$\sum_{k=1}^{1} a_{\sigma(k)} = a_{\sigma(1)} = a_1 = \sum_{k=1}^{1} a_k$

Inductive assumption: We assume that for all bijections $\sigma\colon \{1, \dots, n\} \to \{1, \dots, n\}$ there is

$\sum_{k=1}^{n} a_{\sigma(k)} = \sum_{k=1}^{n} a_k$

Induction step:

Let $\sigma\colon \{1, \dots, n+1\} \to \{1, \dots, n+1\}$ be a bijection. By the properties of the real numbers, the commutative law holds for any two numbers: a sum can be decomposed into two summands $a$ and $b$, and for those two numbers there is $a + b = b + a$. We can use this within the sum $\sum_{k=1}^{n+1} a_{\sigma(k)}$ to exchange two adjacent summands. By subsequent swaps of this kind, we can move the summand $a_{n+1}$ to the last position, so that the first $n$ summands are a re-arrangement of $a_1, \dots, a_n$ described by some bijection $\tau\colon \{1, \dots, n\} \to \{1, \dots, n\}$. Using the induction assumption, we get

$\sum_{k=1}^{n+1} a_{\sigma(k)} = \left(\sum_{k=1}^{n} a_{\tau(k)}\right) + a_{n+1} = \left(\sum_{k=1}^{n} a_k\right) + a_{n+1} = \sum_{k=1}^{n+1} a_k$
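
To make the generalized commutative law concrete, here is a small numerical sketch (not part of the original article; the six summands are an arbitrary sample) that checks every ordering of a six-term sum:

```python
# Numerical sketch: a finite sum has the same value under every re-ordering.
# The six sample terms are an arbitrary choice for illustration.
import itertools
import math

terms = [1, -1/2, 1/3, -1/4, 1/5, -1/6]
reference = sum(terms)

# check the generalized commutative law for all 6! = 720 bijections sigma
for sigma in itertools.permutations(range(len(terms))):
    assert math.isclose(sum(terms[i] for i in sigma), reference)

print(f"all 720 orderings give {reference}")
```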

The problem with series


Series are sums of infinitely many elements, so infinitely many re-arrangement steps might be necessary. And this may cause trouble! At first, we need to precisely define what we mean by re-arranging those infinitely many elements:

Definition (re-arrangement of a series)

Let $\sigma\colon \mathbb{N} \to \mathbb{N}$ be a bijection. Then, the series $\sum_{k=1}^{\infty} a_{\sigma(k)}$ is called a re-arrangement of the series $\sum_{k=1}^{\infty} a_k$.

So we have a re-arrangement whenever the elements of both series can be assigned to each other one-to-one by a bijection (as is also the case for finite sums).

Example (re-arrangement of a series)

Consider the harmonic series $\sum_{k=1}^{\infty} \frac{1}{k}$. A re-arrangement of it could for instance be

which is given by the bijection

It would be nice to have a generalized commutative law also for series. However, re-arrangements might change the limit! Examples are not too easy to find. One of the easiest is the alternating harmonic series

$\sum_{k=1}^{\infty} \frac{(-1)^{k+1}}{k} = 1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \frac{1}{5} - \frac{1}{6} \pm \dotsb$

This series converges, as shown in the article "Alternating series test". Its limit is given by $\ln(2)$.

We use the re-arrangement

$1 - \frac{1}{2} - \frac{1}{4} + \frac{1}{3} - \frac{1}{6} - \frac{1}{8} + \frac{1}{5} - \frac{1}{10} - \frac{1}{12} \pm \dotsb$

in which each positive element is followed by two negative ones.

Question: What is the bijection $\sigma\colon \mathbb{N} \to \mathbb{N}$ which causes this re-arrangement?

It is

$\sigma(1) = 1,\ \sigma(2) = 2,\ \sigma(3) = 4,\ \sigma(4) = 3,\ \sigma(5) = 6,\ \sigma(6) = 8,\ \sigma(7) = 5,\ \sigma(8) = 10,\ \sigma(9) = 12,\ \dots$

Or, in a more compact form, we have for all $k \in \mathbb{N}$ that

$\sigma(3k-2) = 2k-1, \quad \sigma(3k-1) = 4k-2, \quad \sigma(3k) = 4k$

Within the article "Computation rules for series" it is shown that the limit does not change if we set brackets. Hence, we can re-write the series as

$\left(1 - \frac{1}{2}\right) - \frac{1}{4} + \left(\frac{1}{3} - \frac{1}{6}\right) - \frac{1}{8} + \left(\frac{1}{5} - \frac{1}{10}\right) - \frac{1}{12} \pm \dotsb = \frac{1}{2} - \frac{1}{4} + \frac{1}{6} - \frac{1}{8} + \frac{1}{10} - \frac{1}{12} \pm \dotsb$

We re-formulate a bit more in order to end up with the alternating harmonic series again:

$\frac{1}{2} - \frac{1}{4} + \frac{1}{6} - \frac{1}{8} \pm \dotsb = \frac{1}{2}\left(1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} \pm \dotsb\right) = \frac{1}{2}\ln(2)$

So under this re-arrangement, the limit of the series has halved from $\ln(2)$ to $\frac{\ln(2)}{2}$.
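
As a quick numerical cross-check (a sketch, not part of the original article), we can sum the re-arranged series in blocks of one positive and two negative terms and compare the result with $\ln(2)$ and $\frac{\ln(2)}{2}$:

```python
# Numerical sketch: partial sums of 1 - 1/2 - 1/4 + 1/3 - 1/6 - 1/8 + ...
# approach ln(2)/2 rather than ln(2).
import math

def rearranged_partial_sum(blocks: int) -> float:
    """Sum the first `blocks` groups 1/(2k-1) - 1/(4k-2) - 1/(4k)."""
    total = 0.0
    for k in range(1, blocks + 1):
        total += 1 / (2 * k - 1) - 1 / (4 * k - 2) - 1 / (4 * k)
    return total

print(rearranged_partial_sum(100_000))  # ≈ 0.3466
print(math.log(2) / 2)                  # ≈ 0.3466
print(math.log(2))                      # ≈ 0.6931 (limit of the original ordering)
```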

The computation above is only formal. A mathematically rigorous proof runs as follows:

Let $(S_n)_{n \in \mathbb{N}}$ be the sequence of partial sums of the alternating harmonic series and $(S'_n)_{n \in \mathbb{N}}$ the sequence of partial sums of the re-arranged series. Then, $S'_{3n}$ and $S_{2n}$ are related via

$S'_{3n} = \sum_{k=1}^{n} \left(\frac{1}{2k-1} - \frac{1}{4k-2} - \frac{1}{4k}\right) = \sum_{k=1}^{n} \left(\frac{1}{2(2k-1)} - \frac{1}{2 \cdot 2k}\right) = \frac{1}{2} \sum_{k=1}^{2n} \frac{(-1)^{k+1}}{k} = \frac{1}{2} S_{2n}$

Since $(S_n)$ converges to $\ln(2)$, the subsequence $(S_{2n})$ also converges to $\ln(2)$. For convergent sequences, we can compute

$\lim_{n \to \infty} S'_{3n} = \lim_{n \to \infty} \frac{1}{2} S_{2n} = \frac{1}{2}\ln(2)$

However, this only shows convergence of the subsequence $(S'_{3n})$; it does not directly imply convergence of the whole re-arranged series to $\frac{\ln(2)}{2}$. The convergence is proven by taking some $\epsilon > 0$. Then there is an $N_1 \in \mathbb{N}$ such that

$\left|S'_{3n} - \frac{\ln(2)}{2}\right| < \frac{\epsilon}{3} \quad \text{for all } n \geq N_1$

The elements of the re-arranged series are denoted by $b_k$ (i.e. $b_k = a_{\sigma(k)}$ with $a_k = \frac{(-1)^{k+1}}{k}$), so $S'_m = \sum_{k=1}^{m} b_k$. In addition, $(b_k)_{k \in \mathbb{N}}$ is a null sequence. Hence, for every $\epsilon > 0$ there is an $N_2 \in \mathbb{N}$ with

$|b_k| < \frac{\epsilon}{3} \quad \text{for all } k \geq N_2$

Now, we set $N = \max\{3N_1, N_2\} + 2$, and obtain for all $m \geq N$ that

$\left|S'_m - \frac{\ln(2)}{2}\right| < \epsilon,$

since for

  • $m = 3n$, there is $\left|S'_m - \frac{\ln(2)}{2}\right| = \left|S'_{3n} - \frac{\ln(2)}{2}\right| < \frac{\epsilon}{3} < \epsilon$
  • $m = 3n+1$, there is $\left|S'_m - \frac{\ln(2)}{2}\right| \leq \left|S'_{3n} - \frac{\ln(2)}{2}\right| + |b_{3n+1}| < \frac{\epsilon}{3} + \frac{\epsilon}{3} < \epsilon$
  • $m = 3n+2$, there is $\left|S'_m - \frac{\ln(2)}{2}\right| \leq \left|S'_{3n} - \frac{\ln(2)}{2}\right| + |b_{3n+1}| + |b_{3n+2}| < \frac{\epsilon}{3} + \frac{\epsilon}{3} + \frac{\epsilon}{3} = \epsilon$

This establishes the claim $\lim_{m \to \infty} S'_m = \frac{\ln(2)}{2}$, i.e. the re-arranged series converges to $\frac{\ln(2)}{2}$.

Exercise (Re-arranging the alternating harmonic series)

Prove that the following re-arrangement of the alternating harmonic series

$1 + \frac{1}{3} - \frac{1}{2} + \frac{1}{5} + \frac{1}{7} - \frac{1}{4} + \frac{1}{9} + \frac{1}{11} - \frac{1}{6} \pm \dotsb$

(two positive elements are always followed by one negative element) converges to $\frac{3}{2}\ln(2)$.

Hint: First, show that $S'_{3n} = S_{4n} + \frac{1}{2} S_{2n}$, with $(S_n)$ being the sequence of partial sums of the alternating harmonic series and $(S'_n)$ being the sequence of partial sums of the re-arranged series.

Proof (Re-arranging the alternating harmonic series)

There is

$S'_{3n} = \sum_{k=1}^{n} \left(\frac{1}{4k-3} + \frac{1}{4k-1} - \frac{1}{2k}\right) = \sum_{k=1}^{n} \left(\frac{1}{4k-3} - \frac{1}{4k-2} + \frac{1}{4k-1} - \frac{1}{4k}\right) + \sum_{k=1}^{n} \left(\frac{1}{4k-2} - \frac{1}{4k}\right) = S_{4n} + \frac{1}{2} S_{2n}$

Since $(S_n)$ converges to $\ln(2)$, the subsequences $(S_{4n})$ and $(S_{2n})$ also converge to $\ln(2)$. By the computation rules for limits, $S'_{3n} = S_{4n} + \frac{1}{2} S_{2n}$ has to converge to $\ln(2) + \frac{1}{2}\ln(2) = \frac{3}{2}\ln(2)$. By the same argumentation as above, we get that $(S'_n)$ also converges to $\frac{3}{2}\ln(2)$.
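
Assuming the two-positive-one-negative re-arrangement above, the exercise's claim can also be checked numerically (a sketch, not part of the original article):

```python
# Numerical sketch: partial sums of 1 + 1/3 - 1/2 + 1/5 + 1/7 - 1/4 + ...
# approach (3/2)*ln(2).
import math

def rearranged_partial_sum(blocks: int) -> float:
    """Sum the first `blocks` groups 1/(4k-3) + 1/(4k-1) - 1/(2k)."""
    total = 0.0
    for k in range(1, blocks + 1):
        total += 1 / (4 * k - 3) + 1 / (4 * k - 1) - 1 / (2 * k)
    return total

print(rearranged_partial_sum(100_000))  # ≈ 1.0397
print(1.5 * math.log(2))                # ≈ 1.0397
```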

Warning

Limits may change if the elements of a series are re-arranged!

Re-arranged series might diverge


And it even gets worse: converging series can be made divergent by re-arrangement:

Consider the following re-arrangement of the alternating harmonic series:

For there is

So the partial sums of the re-arranged series up to the summand can be bounded. How many positive and negative elements are there up to this one? Leaving out the first two of them, we get positive elements. And there are negative elements. So in the original series, we have to sum up to element number . For the partial sum there is hence

So the partial sums are unbounded! For every $c > 0$ there is some partial sum of the re-arranged series which is greater than $c$. The re-arranged series diverges to $+\infty$.
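
Since the concrete re-arrangement used above could not be reproduced here in full, the following sketch illustrates the same phenomenon with one standard choice of this kind: blocks of consecutive positive terms whose length doubles in every step, each followed by a single negative term. The partial sums of this re-arrangement grow without bound:

```python
# Numerical sketch of a divergent re-arrangement of the alternating harmonic
# series: take 1, 2, 4, 8, ... consecutive positive terms (1, 1/3, 1/5, ...),
# each block followed by one negative term (-1/2, -1/4, ...).
def partial_sum_after_blocks(num_blocks: int) -> float:
    total = 0.0
    next_odd = 1       # next positive term is 1/next_odd
    block_length = 1
    for j in range(1, num_blocks + 1):
        for _ in range(block_length):
            total += 1 / next_odd
            next_odd += 2
        total -= 1 / (2 * j)   # the j-th negative term
        block_length *= 2
    return total

for blocks in (5, 10, 15, 20):
    print(blocks, round(partial_sum_after_blocks(blocks), 3))
# the printed partial sums keep growing: each block of positive terms adds
# roughly ln(2)/2, while the negative terms removed grow only logarithmically.
```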

So be careful:

Warning

Re-arranging a converging series may make it diverge!

Re-arrangement for non-negative series


The examples above treated alternating series, where one was able to combine positive and negative elements in a way that any real number or even $\pm\infty$ could be reached as the limit. We can circumvent this problem by only allowing non-negative series elements. But is just allowing non-negative elements sufficient to avoid any problems which may lead to different limits under re-arrangement? The answer is indeed: yes! We will formulate a theorem about this and prove it.

However, first we consider an example: the series converges to . What if we re-arrange its elements in the same way as we did for the alternating harmonic series? Let's try it out:

Convergence of the series is shown by proving that the sequence of partial sums $(S_n)$ stays bounded - see the article "Bounded series and convergence". That means, there is a $C > 0$ such that $S_n \leq C$ for all $n \in \mathbb{N}$. If we could now show that the sequence of partial sums $(S'_n)$ of the re-arranged series is bounded, we would be done: this sequence is monotonically increasing, and if it is bounded, it has to converge. If we can show that for all $n \in \mathbb{N}$ there is an $M \in \mathbb{N}$ with $S'_n \leq S_M$, we have $S'_n \leq S_M \leq C$ for all $n \in \mathbb{N}$ and hence boundedness.

And indeed, this can be shown: the first $n$ elements of the re-arranged series can be found within the original series at the positions $\sigma(1), \dots, \sigma(n)$. This finite set of positions has a maximum. So if we set $M = \max\{\sigma(1), \dots, \sigma(n)\}$, then the partial sum $S_M$ contains all elements appearing in $S'_n$ plus some additional non-negative ones. Therefore, $S'_n \leq S_M$. In our example, for , there is . We have with maximum , so there must be (the 15 elements in contain all 11 elements in ).

Since $S_M$ is bounded from above by $C$, so is $S'_n$, and the re-arranged series converges.

Now, are both limits identical? We denote the limit of the original series by $S$ and that of the re-arranged series by $S'$. Since $S'_n \leq S_M \leq S$, the re-arranged series must have a limit smaller than or equal to that of the original one ($S' \leq S$). But now, the original series can also be seen as a re-arrangement of the re-arranged series: the "back re-arrangement" map $\sigma^{-1}$ is a bijection from $\mathbb{N}$ to $\mathbb{N}$. We copy the argumentation above and get $S_n \leq S'_{M'}$, where any $n \in \mathbb{N}$ is given and a corresponding $M' \in \mathbb{N}$ can be found. Taking the limit yields $S \leq S'$. Since also $S' \leq S$, there is $S = S'$, i.e. both limits coincide.
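
The bounding argument $S'_n \leq S_M$ can be illustrated numerically. The sketch below is an illustration under assumptions: it uses $\sum_{k=1}^{\infty} \frac{1}{k^2}$ as a sample non-negative series and the one-positive-two-negative bijection $\sigma$ from above; both are merely example choices, not necessarily the ones used in the original text.

```python
# Numerical sketch: for a non-negative series, each re-arranged partial sum
# S'_n is bounded by the original partial sum S_M with M = max(sigma(1..n)).
# Sample choices: a_k = 1/k^2 and sigma(3k-2)=2k-1, sigma(3k-1)=4k-2, sigma(3k)=4k.
def a(k: int) -> float:
    return 1 / k**2

def sigma(i: int) -> int:
    if i % 3 == 1:
        return 2 * ((i + 2) // 3) - 1
    if i % 3 == 2:
        return 4 * ((i + 1) // 3) - 2
    return 4 * (i // 3)

for n in (5, 11, 50, 200):
    indices = [sigma(i) for i in range(1, n + 1)]
    s_rearranged = sum(a(j) for j in indices)          # S'_n
    m = max(indices)                                   # M
    s_original = sum(a(k) for k in range(1, m + 1))    # S_M
    print(n, m, s_rearranged <= s_original)            # always True
```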

This argumentation can be generalized to any non-negative series:

Theorem (re-arrangement theorem for non-negative series)

Let $\sum_{k=1}^{\infty} a_k$ be a convergent series with $a_k \geq 0$ for all $k \in \mathbb{N}$. Then any re-arrangement $\sum_{k=1}^{\infty} a_{\sigma(k)}$ converges to the same limit.

Proof (re-arrangement theorem for non-negative series)

Proof step: The re-arranged series converges:

Since $\sum_{k=1}^{\infty} a_k$ converges, the sequence of partial sums $(S_n)_{n \in \mathbb{N}}$ is bounded. Now, take any re-arrangement $\sum_{k=1}^{\infty} a_{\sigma(k)}$ and denote by $(S'_n)_{n \in \mathbb{N}}$ its partial sums. We set $M = \max\{\sigma(1), \dots, \sigma(n)\}$ and get $S'_n \leq S_M$ for all $n \in \mathbb{N}$. Hence, the sequence of partial sums $(S'_n)$ is bounded and monotonically increasing, and we have convergence.

Proof step: The re-arranged series has the same limit:

Denote by $S$ the limit of the original series and by $S'$ the limit of the re-arranged series. By step 1, there is $S' \leq S$. But now, the original series is just a re-arrangement of the re-arranged one (the corresponding bijection is $\sigma^{-1}$ with $\sigma^{-1} \circ \sigma = \operatorname{id}$). So the same argumentation as in step 1 also yields $S \leq S'$, and both limits coincide.

Hint

If all elements are non-positive, the same argumentation can be applied, and we get identical limits for all re-arrangements.

Unconditional and conditional convergence


Based on its behaviour under re-arrangement, we define

Definition (unconditional and conditional convergence of a series)

A convergent series is called unconditionally convergent if every re-arrangement of it converges to the same limit. Conversely, a convergent series for which there is a re-arrangement with a different limit (or a divergent re-arrangement) is called conditionally convergent.

Example (unconditional and conditional convergence of a series)

The alternating harmonic series is conditionally convergent. The series is unconditionally convergent.

So the re-arrangement theorem above can equivalently be formulated as follows:

Theorem (Re-arrangement theorem for non-negative series (alternative formulation))

Let $\sum_{k=1}^{\infty} a_k$ be a convergent series with $a_k \geq 0$ for all $k \in \mathbb{N}$. Then, the series converges unconditionally.

Re-arranging absolutely convergent series


What if there are positive and negative elements within a series? When can we be sure that any re-arrangement yields the same result? The answer is: if and only if it is absolutely convergent. An example is the series . The corresponding series of absolute values is and converges. Since every absolutely convergent series converges, the series converges, as well.

Now, we are interested in proving that its limit is invariant under re-arrangement. In the article "Absolute convergence of a series", we proved that a series $\sum_{k=1}^{\infty} a_k$ is absolutely convergent if and only if it can be split into a converging series of non-negative elements $\sum_{k=1}^{\infty} a_k^+$ (with $a_k^+ = \max\{a_k, 0\}$) and a converging series of non-positive elements $\sum_{k=1}^{\infty} a_k^-$ (with $a_k^- = \min\{a_k, 0\}$), where $a_k = a_k^+ + a_k^-$. Now, $\sum_{k=1}^{\infty} a_k$ converges absolutely, so the series $\sum_{k=1}^{\infty} a_k^+$ and $\sum_{k=1}^{\infty} a_k^-$ converge, as well. As both consist purely of non-negative or non-positive elements, we can re-arrange them without changing the limit:

$\sum_{k=1}^{\infty} a_{\sigma(k)}^+ = \sum_{k=1}^{\infty} a_k^+$

and

$\sum_{k=1}^{\infty} a_{\sigma(k)}^- = \sum_{k=1}^{\infty} a_k^-$

If we put both parts together, we obtain

$\sum_{k=1}^{\infty} a_{\sigma(k)} = \sum_{k=1}^{\infty} a_{\sigma(k)}^+ + \sum_{k=1}^{\infty} a_{\sigma(k)}^- = \sum_{k=1}^{\infty} a_k^+ + \sum_{k=1}^{\infty} a_k^- = \sum_{k=1}^{\infty} a_k$

So the entire series can also be re-arranged without changing the limit.
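
This splitting argument can also be tried out numerically. The sketch below is an illustration under assumptions: it uses $a_k = \frac{(-1)^{k+1}}{k^2}$ as a sample absolutely convergent series and the bijection $\sigma$ from above; it compares the truncated original sum, the re-arranged sum, and the sum of the two re-arranged parts.

```python
# Numerical sketch: an absolutely convergent series, its re-arrangement, and
# the sum of the re-arranged non-negative and non-positive parts all agree.
# Sample choices: a_k = (-1)^(k+1)/k^2 and the bijection sigma from above.
def a(k: int) -> float:
    return (-1) ** (k + 1) / k**2

def sigma(i: int) -> int:
    if i % 3 == 1:
        return 2 * ((i + 2) // 3) - 1
    if i % 3 == 2:
        return 4 * ((i + 1) // 3) - 2
    return 4 * (i // 3)

N = 100_000
original   = sum(a(k) for k in range(1, N + 1))
rearranged = sum(a(sigma(i)) for i in range(1, N + 1))
plus_part  = sum(max(a(sigma(i)), 0) for i in range(1, N + 1))   # a^+ terms
minus_part = sum(min(a(sigma(i)), 0) for i in range(1, N + 1))   # a^- terms

print(original)                # ≈ 0.8225 (close to pi^2/12)
print(rearranged)              # ≈ 0.8225
print(plus_part + minus_part)  # ≈ 0.8225
```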

Question: What are the two parts $\sum_{k=1}^{\infty} a_k^+$ and $\sum_{k=1}^{\infty} a_k^-$ of the series above?

There is

and


So

and

Now, the above argumentation holds for any absolutely convergent series, so we can use it to prove a theorem:

Theorem (Re-arrangement theorem for absolutely convergent series)

Let $\sum_{k=1}^{\infty} a_k$ be an absolutely convergent series. Then, any re-arrangement $\sum_{k=1}^{\infty} a_{\sigma(k)}$ converges to the same limit.

Proof (Re-arrangement theorem for absolutely convergent series)

Proof step: The re-arranged series converges:

Let $\sum_{k=1}^{\infty} a_k$ be absolutely convergent, so $\sum_{k=1}^{\infty} |a_k|$ converges. Since $|a_k| \geq 0$, the re-arrangement theorem for non-negative series can be applied, and any re-arrangement $\sum_{k=1}^{\infty} |a_{\sigma(k)}|$ converges, as well. So $\sum_{k=1}^{\infty} a_{\sigma(k)}$ is absolutely convergent, and since absolutely convergent series are also convergent, the series $\sum_{k=1}^{\infty} a_{\sigma(k)}$ converges, as well.

Proof step: The re-arranged series has the same limit:

As shown within the article "Absolute convergence of a series", a series converges absolutely if and only if its non-negative part $\sum_{k=1}^{\infty} a_k^+$ and its non-positive part $\sum_{k=1}^{\infty} a_k^-$ converge. By step 1, any re-arrangement of $\sum_{k=1}^{\infty} a_k$ converges absolutely, so $\sum_{k=1}^{\infty} a_{\sigma(k)}^+$ and $\sum_{k=1}^{\infty} a_{\sigma(k)}^-$ converge, as well. As they are non-negative or non-positive, their limits are invariant under re-arrangement:

$\sum_{k=1}^{\infty} a_{\sigma(k)}^+ = \sum_{k=1}^{\infty} a_k^+$

and

$\sum_{k=1}^{\infty} a_{\sigma(k)}^- = \sum_{k=1}^{\infty} a_k^-$

So

$\sum_{k=1}^{\infty} a_{\sigma(k)} = \sum_{k=1}^{\infty} a_{\sigma(k)}^+ + \sum_{k=1}^{\infty} a_{\sigma(k)}^- = \sum_{k=1}^{\infty} a_k^+ + \sum_{k=1}^{\infty} a_k^- = \sum_{k=1}^{\infty} a_k$

That means, the re-arranged series has the same limit as the original one.

Re-arranging convergent, but not absolutely convergent series


So, if there is absolute convergence, then the limit of a series is invariant under re-arrangement. Can the limit also be invariant under re-arrangement if the series does not converge absolutely? The answer to this question is actually no! Absolute convergence is equivalent to the limit being invariant under re-arrangement. Even further:

If a series converges, but does not converge absolutely, then there exists a re-arrangement, which diverges.

Why does this hold? A series not being absolutely convergent is equivalent to $\sum_{k=1}^{\infty} a_k^+$ or $\sum_{k=1}^{\infty} a_k^-$ from above being divergent, see the article "Absolute convergence of a series". We even have:

Theorem

If $\sum_{k=1}^{\infty} a_k$ is a series which converges but does not converge absolutely, then the series $\sum_{k=1}^{\infty} a_k^+$ and $\sum_{k=1}^{\infty} a_k^-$ are both divergent.

Proof

Proof by contraposition: Assume that $\sum_{k=1}^{\infty} a_k^+$ or $\sum_{k=1}^{\infty} a_k^-$ converges.

Case 1: $\sum_{k=1}^{\infty} a_k^+$ and $\sum_{k=1}^{\infty} a_k^-$ both converge.

Then, $\sum_{k=1}^{\infty} |a_k| = \sum_{k=1}^{\infty} (a_k^+ - a_k^-)$ converges, i.e. $\sum_{k=1}^{\infty} a_k$ converges absolutely, which cannot be the case.

Case 2: $\sum_{k=1}^{\infty} a_k^+$ converges and $\sum_{k=1}^{\infty} a_k^-$ diverges.

Then also

$\sum_{k=1}^{\infty} a_k$

diverges. But now, if $\sum_{k=1}^{\infty} a_k$ and $\sum_{k=1}^{\infty} a_k^+$ converge, then also

$\sum_{k=1}^{\infty} a_k^- = \sum_{k=1}^{\infty} (a_k - a_k^+)$

would have to converge, which is excluded in this case.

Case 3: $\sum_{k=1}^{\infty} a_k^+$ diverges and $\sum_{k=1}^{\infty} a_k^-$ converges.

This can not occur by the same arguments as in case 2, where the roles of $\sum_{k=1}^{\infty} a_k^+$ and $\sum_{k=1}^{\infty} a_k^-$ are exchanged.

This theorem can be used to show that for any convergent, but not absolutely convergent series, we can construct a diverging re-arrangement. The idea is to use the "infinite budgets" $\sum_{k=1}^{\infty} a_k^+ = \infty$ and $\sum_{k=1}^{\infty} a_k^- = -\infty$ and to combine them in a way that one wins over the other. For instance, consider our "favourite example": the alternating harmonic series $\sum_{k=1}^{\infty} \frac{(-1)^{k+1}}{k}$. We construct a re-arrangement which diverges to $+\infty$ in the following way: We sum up a lot of positive terms, until we surpass $1$. Then a negative term follows. Then, we sum up sufficiently many positive terms to get above $2$. A negative summand follows, and then again enough summands to surpass $3$, and so on...

The result will diverge to $+\infty$: after the value $n$ has been passed, the single following negative term can push the partial sum down by at most $1$, so it always stays above $n - 1$. This argument holds for arbitrarily large $n$ and hence yields divergence. For the alternating harmonic series, the re-arrangement looks as follows:

So for any $c > 0$, all but finitely many partial sums of the re-arranged series are greater than $c$, and we get divergence.
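
The greedy construction described above can be imitated numerically (a sketch, not part of the original article), here for the alternating harmonic series with targets $1, 2, 3, \dots$:

```python
# Numerical sketch of the divergence construction: add unused positive terms
# of the alternating harmonic series until the running sum exceeds the next
# target 1, 2, 3, ..., then insert a single negative term.
def greedy_rearrangement(num_targets: int) -> None:
    total = 0.0
    next_odd = 1    # positive terms are 1/1, 1/3, 1/5, ...
    next_even = 2   # negative terms are -1/2, -1/4, -1/6, ...
    for target in range(1, num_targets + 1):
        while total <= target:          # spend the "positive budget"
            total += 1 / next_odd
            next_odd += 2
        total -= 1 / next_even          # one negative term after each target
        next_even += 2
        print(f"after target {target}: partial sum = {total:.3f}")

greedy_rearrangement(6)
# the partial sums pass 1, 2, 3, ... and never drop by more than 1/2,
# so the re-arranged series diverges to +infinity
```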

This argument holds for any conditionally convergent series:

Theorem

Let $\sum_{k=1}^{\infty} a_k$ be a convergent series which does not converge absolutely. Then there is a re-arrangement of this series which diverges.

Proof

As $\sum_{k=1}^{\infty} a_k$ converges, but does not converge absolutely, the series $\sum_{k=1}^{\infty} a_k^+$ and $\sum_{k=1}^{\infty} a_k^-$ must both diverge (this has been shown above).

Since diverges, we have an with

In addition, since diverges, there is an with

Now, there is an with

Continuing iteratively, we get that for each there is an , such that for that re-arranged series there is

So the constructed series diverges.

Hint

Within the proof, we have even shown that

If a series converges, but does not converge absolutely, there is a re-arrangement which diverges to $+\infty$.

Interchanging the roles of $\sum_{k=1}^{\infty} a_k^+$ and $\sum_{k=1}^{\infty} a_k^-$, we analogously have that:

If a series converges, but does not converge absolutely, there is a re-arrangement which diverges to $-\infty$.