Don't Use Averaging to Smooth Out Data
by Demolishun · in Technical Issues · 08/12/2006 (3:32 pm) · 4 replies
I recently looked at a resource that used averaging to smooth out data. Don't do this! It is a waste of resources. You can get the same response by using DSP techniques:
float NewSample;
float Output;
float Ratio; // where Ratio is >= 0 and < 1.0

Output = NewSample * Ratio + Output * (1 - Ratio);
This can be simplified as this:
Output = (NewSample - Output) * Ratio + Output;
This is a low-pass filter. It requires one subtraction, one multiplication, and one addition; there are no divisions. Treat Ratio like the reciprocal of a divisor: a Ratio of 0.2 behaves roughly like a 5-point average. This is an excellent method for real-time applications, which is what it was designed for.
If you have multiple inputs then it may be a toss-up between this method and an average. For a single input, however, this is definitely better. If you plot the frequency response, this method is nearly the same as averaging.
About the author
I love programming, I love programming things that go click, whirr, boom. For organized T3D Links visit: http://demolishun.com/?page_id=67
#2
08/12/2006 (3:57 pm)
Unless it is a dynamic value. The beauty is that it can be changed/tuned without impact on performance.
#3
08/12/2006 (4:22 pm)
Btw Frank, you might be one of the few folks who would appreciate a tiny math resource i wrote a while ago.
basically w/ the above averaging you get an exponential decay,
where the rate of decay is based on the rate of iteration.
this is such an easy filter that it's tempting to smooth real-time data with it.
however in a 3D engine like Torque, where the simulation tick rate is tied to the framerate,
this can lead to inconsistent decay rates for folks with different framerates.
for example on a machine at 300FPS, the decay is almost instantaneous,
whereas for someone at 30FPS it's quite sluggish.
i wrote a resource which adjusts "Ratio" in the above formula to arrive at a consistent overall decay rate given the FPS as an input.
and it is here.
Orion Elenzil
general linear interpolation / weighted averaging is a very useful thing to get in the habit of thinking about,
and it's only one subtract more expensive than a plain average.
i suspect that most compilers convert "/ 2.0" into "* 0.5", tho.