Best arrangement for real-time image processing
Posted on 2013-12-31
Hello, I haven't ventured into these HW zones much, but I could use some expert help.
I'm interested in developing a small real-time image-processing system, and I was wondering what combination of camera/driver/processor would minimize the time between image data acquisition and processing; in other words...
[visible light]->[camera]->[pixel data]->[CPU]
what camera/CPU technologies would minimize the time between the camera acquiring the image and my software having access to a pixeldata[x,y] array?
I am comfortable with both embedded and desktop software, so I am flexible about where the processing happens.
I have some experience with a commercial camera (Lumenera) that sends image data over USB, but by the time a frame is acquired, processed, sent over USB, decoded, and loaded into pixeldata[x,y] for my Windows application to process, too much time has elapsed. I have a feeling anything that involves Windows and USB will be too slow. I am thinking of some sort of embedded camera-to-CPU DMA arrangement (i.e., the camera writes and the CPU reads the same memory), so that no time-consuming copy of the data occurs.
I'm willing to buy whatever hardware and compilers are necessary to get this speed, assuming it's reasonably priced.
Does anyone have any advice/experience?
Thanks VERY much for any help.