*For Chiara,*

*who once encouraged me*

*to boldly keep trying*

**Introduction**

For the last four months, I have experienced the worst phase of my illness: I have been completely unable to think for most of the time. I could do nothing but hang in there, waiting for a miracle, moving from one medication to another, well aware that this state could last for years, with no reasonable hope of receiving help from anyone. This has been the quality of my life for most of the last two decades. Then, some days ago, the miracle happened again and I found myself thinking about a theorem I was working on in July. Once more, with great effort, my mind, which is not so young anymore, started its slow process of recovery. I completed the proof last night. It is only a modest result, but since it is not present in my statistics books, I have decided to write it down in my blog, for those who might be interested.

I can now return to my scattered studies, which range from statistics to computational immunology, and from the analysis of genetic data to the mathematical modelling of bacterial growth, as I desperately search for a cure.

**The problem**

Let $X_1, X_2, \dots, X_m$ be independent random variables with an exponential distribution with pairwise distinct parameters $\lambda_1, \lambda_2, \dots, \lambda_m$, respectively. Our problem is: what is the expression of the distribution of the random variable $Y = X_1 + X_2 + \dots + X_m$? I faced the problem for *m = 2, 3, 4*. Then, when I was quite sure of the general formula for $f_Y$ (the distribution of $Y$), I made my attempt to prove it by induction. But before starting, we need to mention two preliminary results that I won't demonstrate, since you can find their proofs in any book of statistics.

*PROPOSITION 1*. Let $X_1, X_2, \dots, X_m$ be independent random variables. The distribution of $Y = X_1 + X_2 + \dots + X_m$ is given by:

$$f_Y(y) = \int_{\mathbb{R}^{m-1}} f_X\!\left(x_1, \dots, x_{m-1},\; y - \sum_{i=1}^{m-1} x_i\right) dx_1 \cdots dx_{m-1}$$

where $f_X$ is the distribution of the random vector $[X_1, X_2, \dots, X_m]$. In particular, for two independent variables this is the familiar convolution $f_Y(y) = \int_{-\infty}^{+\infty} f_{X_1}(x)\, f_{X_2}(y - x)\, dx$.

*PROPOSITION 2*. Let $X_1, X_2, \dots, X_m$ be independent random variables. The two random variables $Z = \sum_{i=1}^{n} X_i$ and $W = \sum_{i=n+1}^{m} X_i$ (with $n < m$) are independent.

*DEFINITION 1.* For those who might be wondering what the exponential distribution of a random variable $X$ with parameter $\lambda$ looks like, I recall that it is given by:

$$f_X(x) = \lambda e^{-\lambda x} \quad \text{for } x \ge 0,$$

while it is zero for $x < 0$.

**Guessing the solution**

As mentioned, I solved the problem for *m = 2, 3, 4* in order to understand what the general formula for $f_Y$ might look like.

*PROPOSITION 3 (m = 2)*. Let $X_1, X_2$ be independent exponential random variables with distinct parameters $\lambda_1, \lambda_2$, respectively. The law of $Y = X_1 + X_2$ is given by:

$$f_Y(y) = \lambda_1 \lambda_2 \left[ \frac{e^{-\lambda_1 y}}{\lambda_2 - \lambda_1} + \frac{e^{-\lambda_2 y}}{\lambda_1 - \lambda_2} \right]$$

for $y > 0$, while it is zero otherwise.

*Proof.* We just have to substitute the exponential densities of Definition 1 into Prop. 1. Since both densities vanish on negative arguments, the integration runs over $[0, y]$ and we obtain:

$$f_Y(y) = \int_0^y \lambda_1 e^{-\lambda_1 x}\, \lambda_2 e^{-\lambda_2 (y - x)}\, dx = \lambda_1 \lambda_2\, e^{-\lambda_2 y} \int_0^y e^{-(\lambda_1 - \lambda_2) x}\, dx = \lambda_1 \lambda_2 \left[ \frac{e^{-\lambda_1 y}}{\lambda_2 - \lambda_1} + \frac{e^{-\lambda_2 y}}{\lambda_1 - \lambda_2} \right]$$

And the demonstration is complete ♦
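As a quick numerical sanity check (my addition, not part of the original argument), the *m = 2* density can be compared with a Monte Carlo simulation; the rates $\lambda_1 = 1$, $\lambda_2 = 2$ and the point $y = 1$ are arbitrary choices:

```python
import math
import random

def cdf_sum2(y, lam1, lam2):
    """CDF of X1 + X2, obtained by integrating the Prop. 3 density on [0, y]."""
    return 1.0 - (lam2 * math.exp(-lam1 * y)
                  - lam1 * math.exp(-lam2 * y)) / (lam2 - lam1)

random.seed(0)
lam1, lam2, y = 1.0, 2.0, 1.0
n = 200_000
# Empirical P(X1 + X2 <= y) from simulated exponential pairs.
hits = sum(random.expovariate(lam1) + random.expovariate(lam2) <= y
           for _ in range(n))
print(hits / n, cdf_sum2(y, lam1, lam2))
```

The two printed values should agree up to the Monte Carlo error, which is of order $1/\sqrt{n}$.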

*PROPOSITION 4 (m = 3)*. Let $X_1, X_2, X_3$ be independent exponential random variables with pairwise distinct parameters $\lambda_1, \lambda_2, \lambda_3$, respectively. The law of $Y = X_1 + X_2 + X_3$ is given by:

$$f_Y(y) = \lambda_1 \lambda_2 \lambda_3 \left[ \frac{e^{-\lambda_1 y}}{(\lambda_2 - \lambda_1)(\lambda_3 - \lambda_1)} + \frac{e^{-\lambda_2 y}}{(\lambda_1 - \lambda_2)(\lambda_3 - \lambda_2)} + \frac{e^{-\lambda_3 y}}{(\lambda_1 - \lambda_3)(\lambda_2 - \lambda_3)} \right]$$

for $y > 0$, while it is zero otherwise.

*Proof*. If we define $Z = X_1 + X_2$ and $W = X_3$, then we can say – thanks to Prop. 2 – that $Z$ and $W$ are independent. This means that – according to Prop. 1 – we have:

$$f_Y(y) = \int_0^y f_Z(x)\, f_W(y - x)\, dx$$

The reader will now recognize that we know the expression of $f_Z$ because of Prop. 3. So, we have:

$$f_Y(y) = \int_0^y \lambda_1 \lambda_2 \left[ \frac{e^{-\lambda_1 x}}{\lambda_2 - \lambda_1} + \frac{e^{-\lambda_2 x}}{\lambda_1 - \lambda_2} \right] \lambda_3 e^{-\lambda_3 (y - x)}\, dx = \frac{\lambda_1 \lambda_2 \lambda_3}{\lambda_2 - \lambda_1} \left[ \int_0^y e^{-\lambda_1 x} e^{-\lambda_3 (y - x)}\, dx - \int_0^y e^{-\lambda_2 x} e^{-\lambda_3 (y - x)}\, dx \right]$$

For the first integral we find:

$$\int_0^y e^{-\lambda_1 x} e^{-\lambda_3 (y - x)}\, dx = e^{-\lambda_3 y} \int_0^y e^{(\lambda_3 - \lambda_1) x}\, dx = \frac{e^{-\lambda_1 y} - e^{-\lambda_3 y}}{\lambda_3 - \lambda_1}$$

For the second one we have:

$$\int_0^y e^{-\lambda_2 x} e^{-\lambda_3 (y - x)}\, dx = \frac{e^{-\lambda_2 y} - e^{-\lambda_3 y}}{\lambda_3 - \lambda_2}$$

Hence, we find:

$$f_Y(y) = \frac{\lambda_1 \lambda_2 \lambda_3}{\lambda_2 - \lambda_1} \left[ \frac{e^{-\lambda_1 y} - e^{-\lambda_3 y}}{\lambda_3 - \lambda_1} - \frac{e^{-\lambda_2 y} - e^{-\lambda_3 y}}{\lambda_3 - \lambda_2} \right] = \lambda_1 \lambda_2 \lambda_3 \left[ \frac{e^{-\lambda_1 y}}{(\lambda_2 - \lambda_1)(\lambda_3 - \lambda_1)} + \frac{e^{-\lambda_2 y}}{(\lambda_1 - \lambda_2)(\lambda_3 - \lambda_2)} + \frac{e^{-\lambda_3 y}}{(\lambda_1 - \lambda_3)(\lambda_2 - \lambda_3)} \right]$$

And the thesis is proved ♦

*PROPOSITION 5 (m = 4)*. Let $X_1, X_2, X_3, X_4$ be independent exponential random variables with pairwise distinct parameters $\lambda_1, \lambda_2, \lambda_3, \lambda_4$, respectively. The law of $Y = X_1 + X_2 + X_3 + X_4$ is given by:

$$f_Y(y) = \lambda_1 \lambda_2 \lambda_3 \lambda_4 \left[ \frac{e^{-\lambda_1 y}}{(\lambda_2 - \lambda_1)(\lambda_3 - \lambda_1)(\lambda_4 - \lambda_1)} + \frac{e^{-\lambda_2 y}}{(\lambda_1 - \lambda_2)(\lambda_3 - \lambda_2)(\lambda_4 - \lambda_2)} + \frac{e^{-\lambda_3 y}}{(\lambda_1 - \lambda_3)(\lambda_2 - \lambda_3)(\lambda_4 - \lambda_3)} + \frac{e^{-\lambda_4 y}}{(\lambda_1 - \lambda_4)(\lambda_2 - \lambda_4)(\lambda_3 - \lambda_4)} \right]$$

for $y > 0$, while it is zero otherwise.

*Proof.* Let's consider the two random variables $Z = X_1 + X_2$ and $W = X_3 + X_4$. Prop. 2 tells us that $Z$ and $W$ are independent. This means that – according to Prop. 1 – we can write:

$$f_Y(y) = \int_0^y f_Z(x)\, f_W(y - x)\, dx$$

The reader has likely already realized that we have the expressions of $f_Z$ and $f_W$, thanks to Prop. 3. So we have:

$$f_Y(y) = \int_0^y \frac{\lambda_1 \lambda_2}{\lambda_2 - \lambda_1} \left( e^{-\lambda_1 x} - e^{-\lambda_2 x} \right) \frac{\lambda_3 \lambda_4}{\lambda_4 - \lambda_3} \left( e^{-\lambda_3 (y - x)} - e^{-\lambda_4 (y - x)} \right) dx$$

Expanding the product leaves four integrals of the form $\int_0^y e^{-\lambda_i x} e^{-\lambda_j (y - x)}\, dx$, which we can easily calculate as in Prop. 4:

$$\int_0^y e^{-\lambda_1 x} e^{-\lambda_3 (y - x)}\, dx = \frac{e^{-\lambda_1 y} - e^{-\lambda_3 y}}{\lambda_3 - \lambda_1}, \qquad \int_0^y e^{-\lambda_1 x} e^{-\lambda_4 (y - x)}\, dx = \frac{e^{-\lambda_1 y} - e^{-\lambda_4 y}}{\lambda_4 - \lambda_1}$$

$$\int_0^y e^{-\lambda_2 x} e^{-\lambda_3 (y - x)}\, dx = \frac{e^{-\lambda_2 y} - e^{-\lambda_3 y}}{\lambda_3 - \lambda_2}, \qquad \int_0^y e^{-\lambda_2 x} e^{-\lambda_4 (y - x)}\, dx = \frac{e^{-\lambda_2 y} - e^{-\lambda_4 y}}{\lambda_4 - \lambda_2}$$

Adding these four integrals together, with the signs coming from the expansion of the product, we obtain:

$$f_Y(y) = \frac{\lambda_1 \lambda_2 \lambda_3 \lambda_4}{(\lambda_2 - \lambda_1)(\lambda_4 - \lambda_3)} \left[ \frac{e^{-\lambda_1 y} - e^{-\lambda_3 y}}{\lambda_3 - \lambda_1} - \frac{e^{-\lambda_1 y} - e^{-\lambda_4 y}}{\lambda_4 - \lambda_1} - \frac{e^{-\lambda_2 y} - e^{-\lambda_3 y}}{\lambda_3 - \lambda_2} + \frac{e^{-\lambda_2 y} - e^{-\lambda_4 y}}{\lambda_4 - \lambda_2} \right]$$

and collecting the coefficient of each exponential gives exactly the expression in the statement.

And this proves the thesis ♦
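The *m = 4* formula can also be verified numerically (again a sketch of mine, outside the original argument), by convolving two *m = 2* densities exactly as in the proof above; the rates below are arbitrary:

```python
import math

def f2(y, a, b):
    """m = 2 density from Prop. 3."""
    return a * b * (math.exp(-a * y) / (b - a) + math.exp(-b * y) / (a - b))

def f4(y, lams):
    """m = 4 density from Prop. 5."""
    total = 0.0
    for j, lj in enumerate(lams):
        denom = math.prod(lk - lj for k, lk in enumerate(lams) if k != j)
        total += math.exp(-lj * y) / denom
    return math.prod(lams) * total

# Midpoint-rule convolution of f_Z = f2(.; l1, l2) with f_W = f2(.; l3, l4).
l1, l2, l3, l4 = 1.0, 2.0, 3.0, 5.0
y, n = 1.3, 20_000
h = y / n
conv = sum(f2((i + 0.5) * h, l1, l2) * f2(y - (i + 0.5) * h, l3, l4)
           for i in range(n)) * h
print(conv, f4(y, [l1, l2, l3, l4]))
```

The numerical convolution and the closed form should match to within the quadrature error.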

We are now quite confident in saying that the expression of $f_Y$ for the generic value of *m* is given by:

$$f_Y(y) = \left[ \prod_{i=1}^{m} \lambda_i \right] \sum_{j=1}^{m} \frac{e^{-\lambda_j y}}{\prod_{k \neq j} (\lambda_k - \lambda_j)}$$

for $y > 0$, while being zero otherwise. But we aim at a rigorous proof of this expression.
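Before the rigorous proof, the conjectured general formula can be checked numerically. The sketch below (my addition, with arbitrarily chosen rates) implements the density for a generic *m* and verifies that it integrates to 1:

```python
import math

def hypoexp_pdf(y, lams):
    """Conjectured density of X1 + ... + Xm with pairwise distinct rates."""
    if y <= 0:
        return 0.0
    total = 0.0
    for j, lj in enumerate(lams):
        denom = math.prod(lk - lj for k, lk in enumerate(lams) if k != j)
        total += math.exp(-lj * y) / denom
    return math.prod(lams) * total

# Midpoint rule on [0, 60]; with these rates the tail mass beyond 60
# is negligible.
lams = [0.5, 1.0, 2.0, 3.5, 6.0]
n = 200_000
h = 60.0 / n
mass = sum(hypoexp_pdf((i + 0.5) * h, lams) for i in range(n)) * h
print(mass)
```

For *m = 2* the function reduces to the Prop. 3 expression, which gives an independent consistency check.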

**Proof**

In order to carry out our final demonstration, we need to prove a property linked to the matrix named after Vandermonde, which the reader who has followed me this far will likely remember from their studies of linear algebra. The determinant of the Vandermonde matrix is given by:

$$\det \begin{pmatrix} 1 & 1 & \cdots & 1 \\ x_1 & x_2 & \cdots & x_n \\ x_1^2 & x_2^2 & \cdots & x_n^2 \\ \vdots & \vdots & & \vdots \\ x_1^{n-1} & x_2^{n-1} & \cdots & x_n^{n-1} \end{pmatrix} = \prod_{1 \le i < j \le n} (x_j - x_i)$$
PROPOSITION 6 (*lemma*). For pairwise distinct $\lambda_1, \dots, \lambda_m$ (with $m \ge 2$), the following relationship is true:

$$\sum_{j=1}^{m} (-1)^{j+1} \prod_{\substack{1 \le i < k \le m \\ i, k \neq j}} (\lambda_k - \lambda_i) = 0$$

where the $j$-th product is the Vandermonde determinant of the parameters with $\lambda_j$ removed.
*Proof.* In the following lines, we calculate the determinant of the matrix below, expanding with respect to the second line; in the end, we use the expression of the determinant of the Vandermonde matrix, mentioned above:

$$M = \begin{pmatrix} 1 & 1 & \cdots & 1 \\ 1 & 1 & \cdots & 1 \\ \lambda_1 & \lambda_2 & \cdots & \lambda_m \\ \vdots & \vdots & & \vdots \\ \lambda_1^{m-2} & \lambda_2^{m-2} & \cdots & \lambda_m^{m-2} \end{pmatrix}$$

The minor obtained by deleting the second row and the $j$-th column is precisely the Vandermonde determinant of $(\lambda_1, \dots, \lambda_{j-1}, \lambda_{j+1}, \dots, \lambda_m)$, so that:

$$\det M = \sum_{j=1}^{m} (-1)^{2+j} \prod_{\substack{1 \le i < k \le m \\ i, k \neq j}} (\lambda_k - \lambda_i)$$

But this determinant has to be zero, since the matrix has two identical lines, which proves the thesis ♦
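Numerically, the lemma is easy to check. The sketch below (my addition) evaluates an equivalent reciprocal form of the identity, $\sum_{j=1}^{m} 1 / \prod_{k \neq j} (\lambda_k - \lambda_j) = 0$, obtained by dividing each term of the lemma by the full Vandermonde determinant; the rate values are arbitrary:

```python
import math

def reciprocal_sum(lams):
    """Evaluate sum_j 1 / prod_{k != j} (lam_k - lam_j)."""
    return sum(1.0 / math.prod(lk - lj for k, lk in enumerate(lams) if k != j)
               for j, lj in enumerate(lams))

for lams in ([1.0, 3.0], [0.3, 1.1, 2.7, 5.2], [1.0, 2.0, 4.0, 8.0, 16.0]):
    print(lams, reciprocal_sum(lams))  # each value is 0 up to rounding error
```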

*PROPOSITION 7*. Let $X_1, \dots, X_m$ be independent exponential random variables with pairwise distinct parameters $\lambda_1, \dots, \lambda_m$, respectively. The law of $Y = \sum_{i=1}^{m} X_i$ is given by:

$$f_Y(y) = \left[ \prod_{i=1}^{m} \lambda_i \right] \sum_{j=1}^{m} \frac{e^{-\lambda_j y}}{\prod_{k \neq j} (\lambda_k - \lambda_j)}$$

for $y > 0$, while being zero otherwise.

*Proof.* We already know that the thesis is true for *m = 2, 3, 4*. We now assume that it is true for *m − 1* and demonstrate that this implies that the thesis is true for *m* (proof by induction). Let's define the random variables $Z = \sum_{i=1}^{m-1} X_i$ and $W = X_m$. These two random variables are independent (Prop. 2), so – according to Prop. 1 – we have:

$$f_Y(y) = \int_0^y f_Z(x)\, f_W(y - x)\, dx$$

Now, $f_Z$ is given by the thesis for *m − 1*, while $f_W$ is the exponential distribution with parameter $\lambda_m$. So we have:

$$f_Y(y) = \int_0^y \left[ \prod_{i=1}^{m-1} \lambda_i \right] \sum_{j=1}^{m-1} \frac{e^{-\lambda_j x}}{\prod_{k \neq j}^{m-1} (\lambda_k - \lambda_j)}\, \lambda_m e^{-\lambda_m (y - x)}\, dx = \left[ \prod_{i=1}^{m} \lambda_i \right] e^{-\lambda_m y} \sum_{j=1}^{m-1} \frac{1}{\prod_{k \neq j}^{m-1} (\lambda_k - \lambda_j)} \int_0^y e^{(\lambda_m - \lambda_j) x}\, dx$$

where $\prod_{k \neq j}^{m-1}$ denotes the product over $k = 1, \dots, m-1$ with $k \neq j$ (and similarly for $\prod_{k \neq j}^{m}$ below).

For the sum we have, after computing each integral:

$$e^{-\lambda_m y} \int_0^y e^{(\lambda_m - \lambda_j) x}\, dx = e^{-\lambda_m y}\, \frac{e^{(\lambda_m - \lambda_j) y} - 1}{\lambda_m - \lambda_j} = \frac{e^{-\lambda_j y} - e^{-\lambda_m y}}{\lambda_m - \lambda_j}$$

and, absorbing the factor $\lambda_m - \lambda_j$ into the product in the denominator:

$$f_Y(y) = \left[ \prod_{i=1}^{m} \lambda_i \right] \sum_{j=1}^{m-1} \frac{e^{-\lambda_j y} - e^{-\lambda_m y}}{\prod_{k \neq j}^{m} (\lambda_k - \lambda_j)}$$

The sum within brackets can be written as follows:

$$\sum_{j=1}^{m-1} \frac{e^{-\lambda_j y} - e^{-\lambda_m y}}{\prod_{k \neq j}^{m} (\lambda_k - \lambda_j)} = \sum_{j=1}^{m-1} \frac{e^{-\lambda_j y}}{\prod_{k \neq j}^{m} (\lambda_k - \lambda_j)} - e^{-\lambda_m y} \sum_{j=1}^{m-1} \frac{1}{\prod_{k \neq j}^{m} (\lambda_k - \lambda_j)}$$

So far, we have found the following relationship:

$$f_Y(y) = \left[ \prod_{i=1}^{m} \lambda_i \right] \left[ \sum_{j=1}^{m-1} \frac{e^{-\lambda_j y}}{\prod_{k \neq j}^{m} (\lambda_k - \lambda_j)} - e^{-\lambda_m y} \sum_{j=1}^{m-1} \frac{1}{\prod_{k \neq j}^{m} (\lambda_k - \lambda_j)} \right]$$

or, equivalently, adding and subtracting the $j = m$ term of the thesis:

$$f_Y(y) = \left[ \prod_{i=1}^{m} \lambda_i \right] \sum_{j=1}^{m} \frac{e^{-\lambda_j y}}{\prod_{k \neq j}^{m} (\lambda_k - \lambda_j)} - \left[ \prod_{i=1}^{m} \lambda_i \right] e^{-\lambda_m y} \left[ \frac{1}{\prod_{k \neq m}^{m} (\lambda_k - \lambda_m)} + \sum_{j=1}^{m-1} \frac{1}{\prod_{k \neq j}^{m} (\lambda_k - \lambda_j)} \right]$$

In order for the thesis to be true, we just need to prove that

$$\frac{1}{\prod_{k \neq m}^{m} (\lambda_k - \lambda_m)} + \sum_{j=1}^{m-1} \frac{1}{\prod_{k \neq j}^{m} (\lambda_k - \lambda_j)} = 0$$

which is equivalent to the following:

$$\sum_{j=1}^{m} \frac{1}{\prod_{k \neq j}^{m} (\lambda_k - \lambda_j)} = 0$$

Searching for a common denominator – which is the full Vandermonde determinant of the parameters – allows us to rewrite the sum above as follows:

$$\sum_{j=1}^{m} \frac{1}{\prod_{k \neq j}^{m} (\lambda_k - \lambda_j)} = \frac{\displaystyle \sum_{j=1}^{m} (-1)^{j+1} \prod_{\substack{1 \le i < k \le m \\ i, k \neq j}} (\lambda_k - \lambda_i)}{\displaystyle \prod_{1 \le i < k \le m} (\lambda_k - \lambda_i)}$$

So, we just need to prove that:

$$\sum_{j=1}^{m} (-1)^{j+1} \prod_{\substack{1 \le i < k \le m \\ i, k \neq j}} (\lambda_k - \lambda_i) = 0$$

which is exactly the statement of Prop. 6, and the thesis is proved ♦

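As a final check (my addition, with arbitrary rates), the induction step itself can be reproduced numerically: convolving the density for the first *m − 1* rates with an exponential density of parameter $\lambda_m$ must return the *m*-rate formula of Prop. 7:

```python
import math

def pdf(y, lams):
    """General density from Prop. 7."""
    if y <= 0:
        return 0.0
    total = sum(math.exp(-lj * y)
                / math.prod(lk - lj for k, lk in enumerate(lams) if k != j)
                for j, lj in enumerate(lams))
    return math.prod(lams) * total

lams = [1.0, 2.0, 3.5, 5.0, 8.0]
z_lams, lam_m = lams[:-1], lams[-1]
y, n = 0.9, 20_000
h = y / n
# Midpoint-rule version of f_Y(y) = int_0^y f_Z(x) lam_m e^{-lam_m (y-x)} dx.
conv = sum(pdf((i + 0.5) * h, z_lams)
           * lam_m * math.exp(-lam_m * (y - (i + 0.5) * h))
           for i in range(n)) * h
print(conv, pdf(y, lams))
```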
**References.** A paper on this same topic has been written by Markus Bibinger, and it is available online.
