Very often we have a long sequence of data with a very high variance due to noise. The typical approach to this situation is to create a circular buffer of size N, put each new incoming value into the buffer sequentially, replacing the oldest one, and finally calculate the mean of the values.

The “smart” approach keeps a running sum of the values, adding the new value and removing the oldest before dividing by the size of the buffer. It is not too complicated, but it is prone to error.

Math comes to us…

Let’s start with this formula:

m(t) = α·m(t−1) + (1−α)·v(t)

“**m**” is the mean at time “**t**“, “**v**” is the new incoming value at time “**t**“, and “**α**” is the weight. The only value that we must keep track of is “**m**”, which propagates through time.

For example: we are at time “t=100” and we have reached this status:

m(100) = α·m(99) + (1−α)·v(100)

Going on replacing each mean with the value calculated at (t−1), we get:

m(100) = (1−α)·(v(100) + α·v(99) + α²·v(98) + …)

and we can see how older values contribute less and less to the total sum, their weight decaying exponentially according to the value of “α”.
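The exponential decay of the weights can be checked numerically. A minimal sketch (the function name `ewmaWeight` is illustrative, not part of any library):

```cpp
#include <cmath>

// Unrolling m(t) = alpha*m(t-1) + (1-alpha)*v(t) shows that a value
// observed k steps in the past carries a weight of (1-alpha)*alpha^k
// in the current mean.
double ewmaWeight(double alpha, int k) {
    return (1.0 - alpha) * std::pow(alpha, k);
}
```

With α = 0.9, the most recent value weighs 0.1, the previous one 0.09, the one before that 0.081, and so on: each step back shrinks the weight by a factor of α.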

But why can we use this formula to approximate a “moving-window average” algorithm?

The answer is simple: we can approximate the “width” of the window with the following formula:

N ≈ 1 / (1 − α)

and we obtain the following window sizes for typical “α” values:

| α | window size N |
|------|-----|
| 0.5 | 2 |
| 0.9 | 10 |
| 0.98 | 50 |
| 0.99 | 100 |

If we want to calculate the value of “α” needed to obtain a desired window size N, we can invert the formula:

α = 1 − 1/N

So many lines of code can be replaced with a single one:

```cpp
double mean = 0;
double alpha = 0.9; // mean of about the last 10 values
// [...]
while( acquiring )
{
    // [... get "newValue" in some way ...]
    mean = alpha*mean + (1-alpha)*newValue;
}
// [... do something with the mean ...]
```

The only problem with this approach is that the first values of the mean are really “slow” and very near to zero, because the mean starts from zero. We can overcome this problem using some kind of **Bias Correction**:

m̂(t) = m(t) / (1 − α^t)
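As a quick sanity check, the correction can be sketched as a small function (the name `correctedMean` is illustrative): dividing by (1 − α^t) exactly cancels the missing weight of the zero initialization, so with a constant input the corrected mean equals the input from the very first sample.

```cpp
#include <cmath>

// Divide the running mean by (1 - alpha^t) during the first samples so
// the estimate is not dragged toward the zero initialization.
double correctedMean(double mean, double alpha, long t) {
    return mean / (1.0 - std::pow(alpha, static_cast<double>(t)));
}
```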

A complete C++ class:
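What follows is a sketch of such a class under the assumptions of this article (the class and method names are illustrative, not the author’s original code): it keeps only the running mean and a sample counter, and applies the bias correction when reporting the value.

```cpp
#include <cmath>

// Exponentially weighted moving average with bias correction.
class MovingAverage {
public:
    explicit MovingAverage(double alpha) : alpha_(alpha), mean_(0.0), t_(0) {}

    // Feed a new sample and return the bias-corrected mean.
    double update(double value) {
        mean_ = alpha_ * mean_ + (1.0 - alpha_) * value;
        ++t_;
        return get();
    }

    // Bias-corrected mean: divide by (1 - alpha^t) so the first values
    // are not dragged toward the zero initialization.
    double get() const {
        if (t_ == 0) return 0.0;
        return mean_ / (1.0 - std::pow(alpha_, static_cast<double>(t_)));
    }

    void reset() {
        mean_ = 0.0;
        t_ = 0;
    }

private:
    double alpha_; // weight; alpha = 1 - 1/N for a window of about N values
    double mean_;  // uncorrected running mean
    long t_;       // number of samples seen so far
};
```

Typical usage: construct it with `alpha = 1 - 1/N` for the desired window size and call `update()` for each incoming sample; thanks to the bias correction, `get()` is meaningful from the very first value.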

That’s all for this tutorial. I have used this approach more than once since I discovered it while following the “Deep Learning” specialization on Coursera by Andrew Ng.

If you want to learn more, follow the links to the three videos on this topic:

- Exponential Weighted Average
- Understanding Exponential Weighted Average
- Bias correction in Exponential Weighted Average