Help with real time data smoothing and prediction

I am currently working on a project where I have to read position data from a moving object (received from sensors mounted on the object) and process the data in a graphics application so that I can project an image back onto the moving object.
There are two problems with this. Firstly, by the time I get the data, process it and re-project it, the object has moved, so I am always about 100ms behind where it actually is. Secondly, the sensors give very noisy data which is not received at regular intervals; I get approximately 30 updates per second, but the time between samples can vary.

I need to create or find a library in C++ or C# that can smooth the data (given a constant stream of time-stamped float values) and also "look ahead" approximately 100ms to correct for the latency in the system.
The problem is that I was never any good at maths and I don't know where to start or what type of algorithms I am looking for. After searching the web I have seen comments about exponential smoothing, but everywhere I see it used it is applied to data taken over long periods of time, and I need to work in real time. Can anyone point me in the right direction?

Many thanks,

Nigel.
noodles2000 asked:
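For what it's worth, exponential smoothing does work sample-by-sample in real time, and the double-exponential (Holt) variant also tracks a trend term that can be extrapolated ahead to cover the latency. A minimal C++ sketch, where `alpha` and `beta` are assumed tuning constants (not values from this thread):

```cpp
#include <cassert>
#include <cmath>

// Double-exponential (Holt) smoother: keeps a smoothed level and a smoothed
// trend, and can extrapolate ("look ahead") any number of samples.
struct HoltSmoother {
    double alpha, beta;   // smoothing factors in (0, 1); tuning choices
    double level = 0.0;   // smoothed position
    double trend = 0.0;   // smoothed per-sample change (velocity proxy)
    bool primed = false;

    HoltSmoother(double a, double b) : alpha(a), beta(b) {}

    void update(double x) {
        if (!primed) { level = x; primed = true; return; }
        double prev = level;
        level = alpha * x + (1.0 - alpha) * (level + trend);
        trend = beta * (level - prev) + (1.0 - beta) * trend;
    }

    // Forecast `steps` samples ahead, e.g. 100ms at ~16ms/sample is ~6 steps.
    double forecast(double steps) const { return level + steps * trend; }
};
```

On a steady ramp the trend converges to the true per-sample change, so `forecast(6)` lands roughly where the object will be six samples later; with real sensor noise, `alpha` and `beta` have to be tuned to trade smoothness against responsiveness.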

Alexandre Simões (Manager / Technology Specialist) commented:
1st - Smoothing results
From what I understood, you actually have to act on the data you just received, not on historical data, so I don't see how smoothing the results will help you here.
If the interval between received messages varies that much, you might need to improve the frequency of the data.
Let me put it this way: if what you actually had was data received every 10ms and you wanted to process it in 100ms chunks, then you could smooth it. With a lack of data, there's nothing to smooth.

2nd - The response time
I think the root cause is always the same: your communication channel is not fast enough for just-in-time receive/process/reply.
One "easy" solution could be slowing the object down so that it matches the communication channel's capabilities.
Another solution could be to make the object wait for a reply before proceeding to the next move.
Yet another would be to make the object work with late data instead of live data: it would be OK for the object to use data that is 200ms old, as long as this is taken into account wherever that data is used on the object's side.


All of these and other possible solutions depend heavily on what you're doing, but in your case, unless I'm missing some key bit of information, I think smoothing the data is not the solution.

Cheers!
ozo commented:
The algorithm may want to start with a good model of the noisy, irregularly received data, so as to know the probability of various amounts of error or lag, as well as a model of how much variation in the object's velocity and acceleration would be expected over 100ms.
Given that, the project might be either straightforward or impossible.
noodles2000 (Author) commented:
Thank you gents for the answers so far. To clarify a few points: firstly, I have no control over the movement of the object. It is a piece of scenery on stage, and it moves when the automation is programmed to move it. I get its position data 60 times a second, which in theory means I get the data every 16ms; however, sometimes there may be 20ms between samples and sometimes 12ms, but it averages out to one every 16ms. Each data sample has a time stamp, so I know exactly when the reading was taken.
Effectively, what I need to do is, after the first few samples, fit a theoretical curve through those points and then forecast along that curve to project the values that should occur approximately 100ms from now. Of course the curve will change as the object accelerates and decelerates, so it needs to adjust itself.
I know that this is possible because there are similar systems on the market that can do it; I just don't know the maths (the programming side I can do).

Does that help?

Best,

Nigel.
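One way to get the self-adjusting "curve through the last few samples" described above is a sliding-window least-squares fit: fit a line to the most recent time-stamped samples and evaluate it 100ms past the latest time stamp. A rough C++ sketch under that assumption (the window size of 10 is an arbitrary choice, and a line rather than a higher-order curve keeps the maths simple):

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <deque>
#include <utility>

// Sliding-window linear fit over time-stamped samples: estimate position and
// velocity by least squares over the last N samples, then extrapolate.
class WindowPredictor {
    std::deque<std::pair<double, double>> samples; // (time in seconds, position)
    std::size_t window;
public:
    explicit WindowPredictor(std::size_t n) : window(n) {}

    void add(double t, double x) {
        samples.emplace_back(t, x);
        if (samples.size() > window) samples.pop_front();
    }

    // Least-squares line x(t) = a + b*t through the window, evaluated at t.
    double predict(double t) const {
        double n = (double)samples.size();
        if (n < 2.0) return samples.empty() ? 0.0 : samples.back().second;
        double st = 0, sx = 0, stt = 0, stx = 0;
        for (auto& s : samples) {
            st  += s.first;            sx  += s.second;
            stt += s.first * s.first;  stx += s.first * s.second;
        }
        double b = (n * stx - st * sx) / (n * stt - st * st); // velocity
        double a = (sx - b * st) / n;                         // intercept
        return a + b * t;
    }
};
```

Because the fit uses the actual time stamps, the irregular 12-20ms spacing between samples is handled automatically; the fit also averages out some of the sensor noise, with a larger window smoothing more but reacting more slowly to acceleration.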
Fred Marshall (Principal) commented:
This sounds like a rather classical Kalman filter application.  The objective is to aim at the target based on noisy sensor data.  The time delays mean there's a prediction involved.

For example, Google for: "A Simplified Approach to Understanding the Kalman Filter"
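To make the suggestion concrete, here is a rough 1D constant-velocity Kalman filter sketch in C++ (one filter per axis; `q` and `r` are illustrative noise tunings, not values from this thread, and the process-noise model is deliberately simplified):

```cpp
#include <cassert>
#include <cmath>

// Minimal 1D constant-velocity Kalman filter.
// State: [position x, velocity v]; measurements are noisy positions.
struct Kalman1D {
    double x = 0, v = 0;                        // state estimate
    double p00 = 1, p01 = 0, p10 = 0, p11 = 1;  // estimate covariance
    double q, r;                                // process / measurement noise
    bool primed = false;

    Kalman1D(double processNoise, double measNoise) : q(processNoise), r(measNoise) {}

    // Advance the state dt seconds, then fold in position measurement z.
    void update(double dt, double z) {
        if (!primed) { x = z; primed = true; return; }
        // Predict: x += v*dt; P = F*P*F' + Q for F = [[1,dt],[0,1]].
        x += v * dt;
        double n00 = p00 + dt * (p10 + p01) + dt * dt * p11 + q;
        double n01 = p01 + dt * p11;
        double n10 = p10 + dt * p11;
        double n11 = p11 + q;
        p00 = n00; p01 = n01; p10 = n10; p11 = n11;
        // Update with measurement z (H = [1, 0]).
        double s  = p00 + r;                 // innovation covariance
        double k0 = p00 / s, k1 = p10 / s;   // Kalman gain
        double y  = z - x;                   // innovation
        x += k0 * y;
        v += k1 * y;
        p11 -= k1 * p01;
        p10 -= k1 * p00;
        p01 -= k0 * p01;
        p00 -= k0 * p00;
    }

    // Look ahead: project the filtered state `lead` seconds into the future.
    double predictAhead(double lead) const { return x + v * lead; }
};
```

`update()` is called once per sample with the measured `dt` between time stamps, which handles the irregular 12-20ms spacing, and `predictAhead(0.1)` projects the filtered state 100ms forward to cover the system latency.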
noodles2000 (Author) commented:
Thank you Fred, that was the exact push that I needed. After following your advice I was able to find a number of existing libraries that can do what I need. Sometimes you just need the right words for the search!