# Help with real-time data smoothing and prediction

I am currently working on a project where I have to read position data from a moving object (received from sensors mounted on the object) and process it in a graphics application so that I can project an image back onto the moving object.

There are two problems with this. Firstly, by the time I get the data, process it, and re-project it, the object has moved, so I am always about 100ms behind where it actually is. Secondly, the sensors give very noisy data that is not received regularly: I get approximately 30 updates per second, but the time between samples can vary.

I need to create or find a library in C++ or C# that can smooth the data (given a constant stream of time-stamped float values) and also "look ahead" approximately 100ms to correct for the latency in the system.

The problem is that I was never any good at maths, and I don't know where to start or what type of algorithm I am looking for. After searching the web I have seen mentions of exponential smoothing, but everywhere I see it used it is applied to data collected over long periods of time, whereas I need to work in real time. Can anyone point me in the right direction?

Many thanks,

Nigel.

You may want to start with a good model of the noisy, irregularly received data, so that you know the probability of getting various amounts of error or lag, as well as a model of how much variation in the object's velocity and acceleration to expect over 100ms.

Given that, the project might be either straightforward or impossible.

Thank you, gents, for the answers so far. To clarify a few points: firstly, I have no control over the movement of the object; it is a piece of scenery on stage and it moves when the automation is programmed to move it. I get its position data 60 times a second, which in theory means I get data every 16ms; in practice there may sometimes be 20ms between samples and sometimes 12ms, but it averages out to one every 16ms, and each sample has a time stamp so I know exactly when the reading was taken.

Effectively, what I need to do, after the first few samples, is fit a theoretical curve through those points and forecast it forward, so that I can project values that should occur approximately 100ms from now. Of course the curve will change as the object accelerates and decelerates, so it needs to adjust itself continuously.

I know this is possible because there are similar systems on the market that can do it; I just don't know the maths (the programming side I can do).

Does that help?

Best,

Nigel.

Thank you Fred, that was the exact push that I needed. After following your advice I was able to find a number of existing libraries that can do what I need; sometimes you just need the right words for the search!

1st - Smoothing results

From what I understood, you actually have to act upon the data you just received, not on historical data, so I don't see how smoothing results will help you here.

If the time between received messages varies widely, you might need to improve the frequency of the data.

Let me put it this way: if what you actually had was data received every 10ms and you wanted to process it in 100ms chunks, then you could smooth it; with a lack of data, there is nothing to smooth.

2nd - The response time

I think the root cause is always the same: your communication channel is not fast enough for just-in-time data receive/process/reply.

One "easy" solution could be slowing down the object so that it matches the communication channel capabilities.

Another solution can be to make the object wait for a reply before proceeding to the next move.

Yet another would be to make the object work with late data instead of live data: it would be OK for the object to use data that is 200ms old, taking that into account for whatever the data is used for on the object's side.

All these and other possible solutions depend heavily on what you're doing, but in your case, unless I'm missing some key bit of information, I think smoothing the data is not the solution.

Cheers!