Fading a 24-bit bitmap using Win32

SysEx asked:
Is there a quick way to fade a 24-bit BMP from black to the original bitmap and back? My solution so far is to get the RGB values of the bitmap and scale them from 0 to x and back. Unfortunately this solution is quite slow for big (say 640x480) bitmaps because of the time it takes to calculate and blit the image (I use SetDIBitsToDevice). Is there a way to use the palette to do this (as I understand it, there is no real palette in 16- or 24-bit color modes)? Or to adjust the bitmap in video memory directly, in both 16- and 24-bit video modes? Could I use some ROP code and PatBlt to 'blend' colors? What is the fastest way? I'm using C and the Win32 API.
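Per byte, the scaling I do now boils down to something like this (simplified sketch; the real loop walks all the RGB data of the DIB):

    #include <windows.h>

    /* Simplified sketch of my current approach: scale one R, G or B byte.
       step runs from 0 (black) up to steps (original brightness) and back. */
    BYTE ScaleByte(BYTE value, int step, int steps)
    {
        return (BYTE)(value * step / steps);
    }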

Thanks,

Pieter.

dtowell commented:
Theoretically, Get/SetDeviceGammaRamp() will allow you to control the RGB intensity mappings to the display.  However, I have never found a machine/driver that actually implements them.
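For reference, a fade via the gamma ramp would look roughly like this (untested sketch; GammaFade and its scale argument are only illustrative, and as said, most drivers will simply fail or ignore the call):

    #include <windows.h>

    /* Untested sketch: scale the whole display's gamma ramp toward black.
       scale = 0.0 means black, 1.0 means the identity ramp. */
    BOOL GammaFade(HDC hdc, double scale)
    {
        WORD ramp[3][256];
        int i;

        for (i = 0; i < 256; i++) {
            WORD v = (WORD)(i * 257 * scale);          /* identity ramp is i * 257 */
            ramp[0][i] = ramp[1][i] = ramp[2][i] = v;  /* same curve for R, G and B */
        }
        return SetDeviceGammaRamp(hdc, ramp);          /* fails on many drivers */
    }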

The fastest way (assuming you have to blit the bitmap) is to use a DIB created with CreateDIBSection() and then blitted to the screen (it saves a memory copy).  Also, use a table lookup for the RGB values: for each step of the fade, compute the scaled value for every possible RGB value (there are only 256) and then just look up the answer for each byte.
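A quick sketch of the table-lookup idea (FadeStep and its parameters are just illustrative names):

    #include <windows.h>

    /* Sketch only: build one 256-entry table for this fade step, then map every
       byte of the 24-bit DIB data through it (each byte is an R, G or B value). */
    void FadeStep(const BYTE *pSrc, BYTE *pDst, DWORD cbBits, int step, int steps)
    {
        BYTE lut[256];
        DWORD n;
        int i;

        for (i = 0; i < 256; i++)
            lut[i] = (BYTE)(i * step / steps);    /* step == steps gives full brightness */

        for (n = 0; n < cbBits; n++)
            pDst[n] = lut[pSrc[n]];               /* one table lookup per byte */
    }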

SysEx (author) commented:
First of all, thanks for responding to my question. As you said, Get/SetDeviceGammaRamp() is of no practical use. Besides, the gamma ramp would change the colors of the entire screen, while I'm only interested in fading a bitmap in a window.

The fastest method you describe is, like I said in my question, exactly what I'm using now (a rough code sketch follows the list):
* Create 'FadeTable' once (BYTE FadeTable[256][100]), in which every value linearly descends to zero.
* Use GetDIBits() to fill a BITMAPINFO structure
* malloc two buffers, one for the source RGB values and one for the faded RGB values to use SetDIBitsToDevice() on
* Use GetDIBits() to fill the source buffer.
Now for every step 0-99:
* Copy every byte from the source buffer to the fade buffer using the FadeTable
* Use SetDIBitsToDevice() to copy the image from the fade buffer to the screen.
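In code the whole thing looks more or less like this (stripped-down sketch with placeholder names, no error checking):

    #include <windows.h>
    #include <stdlib.h>

    /* Rough sketch of the loop described above. FadeTable[v][step] is assumed
       to hold v scaled for fade step 'step', as in the list. */
    void FadeBitmap(HDC hdc, HBITMAP hBmp, BYTE FadeTable[256][100])
    {
        BITMAPINFO bmi;
        BYTE *pSrc, *pFade;
        DWORD cb, n;
        int step;

        ZeroMemory(&bmi, sizeof(bmi));
        bmi.bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
        GetDIBits(hdc, hBmp, 0, 0, NULL, &bmi, DIB_RGB_COLORS);   /* query dimensions */
        bmi.bmiHeader.biBitCount    = 24;                         /* ask for 24-bit RGB */
        bmi.bmiHeader.biCompression = BI_RGB;

        cb    = ((bmi.bmiHeader.biWidth * 3 + 3) & ~3) * bmi.bmiHeader.biHeight;
        pSrc  = (BYTE *)malloc(cb);
        pFade = (BYTE *)malloc(cb);
        GetDIBits(hdc, hBmp, 0, bmi.bmiHeader.biHeight, pSrc, &bmi, DIB_RGB_COLORS);

        for (step = 0; step < 100; step++) {
            for (n = 0; n < cb; n++)
                pFade[n] = FadeTable[pSrc[n]][step];              /* one lookup per byte */
            SetDIBitsToDevice(hdc, 0, 0, bmi.bmiHeader.biWidth, bmi.bmiHeader.biHeight,
                              0, 0, 0, bmi.bmiHeader.biHeight, pFade, &bmi, DIB_RGB_COLORS);
        }

        free(pSrc);
        free(pFade);
    }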

For a bitmap the size of an average window (about 1 MB of RGB data) this method is far too slow; both the fading and the SetDIBitsToDevice() take too long to get a smooth fade on my P133 with an S3 TRIO64V+ based video card.

So my question is: is there a faster way that works on the average Win95 computer in a 16-bit-color or better video mode?

Regards,

Pieter. (p.g.m.vandermeulen@student.utwente.nl)
dtowell commented:
There is still one thing you can do: use CreateDIBSection() to create/allocate the buffer.  Trust me, this is not just a convenient all-in-one-package function.  It creates the buffer "sitting on the fence" between user-mode and kernel-mode code, which removes a memory copy (from user to kernel mode) that occurs when you use a "roll your own" DIB.  This will still not be really fast, because you still have to touch every pixel and then blit them, but it is the fastest non-DirectDraw method available in Win32.
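Roughly like this (illustrative sketch only; names, sizes and error handling are placeholders):

    #include <windows.h>

    /* Sketch: fade into the bits of a DIB section so the final blit avoids
       the extra user-to-kernel copy. */
    void BlitViaDibSection(HDC hdcScreen, int w, int h)
    {
        BITMAPINFO bmi;
        void *pBits = NULL;
        HBITMAP hDib;
        HDC hdcMem;

        ZeroMemory(&bmi, sizeof(bmi));
        bmi.bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
        bmi.bmiHeader.biWidth       = w;
        bmi.bmiHeader.biHeight      = h;
        bmi.bmiHeader.biPlanes      = 1;
        bmi.bmiHeader.biBitCount    = 24;
        bmi.bmiHeader.biCompression = BI_RGB;

        hDib = CreateDIBSection(NULL, &bmi, DIB_RGB_COLORS, &pBits, NULL, 0);

        /* ... write the faded RGB bytes straight into pBits here ... */

        hdcMem = CreateCompatibleDC(hdcScreen);
        SelectObject(hdcMem, hDib);
        BitBlt(hdcScreen, 0, 0, w, h, hdcMem, 0, 0, SRCCOPY);

        DeleteDC(hdcMem);
        DeleteObject(hDib);
    }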

Another thing to consider is the maximum potential speed.  Say your bus runs at 60 MHz.  Each faded pixel takes 9 bus cycles (read the pixel value, read the faded value and write the result, for each of the three RGB bytes), and each pixel transferred to the display takes 2 cycles (read 4 bytes, write 4 bytes).  Assuming a 640x480 image, that is about 3.4M cycles per fade step, which is a little over 17 frames per second.  Of course, that assumes no overhead, no cache misses and fully pipelined code; none of these are good assumptions.
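Spelled out with rough numbers: 640 x 480 = 307,200 pixels, times (9 + 2) = 11 cycles per pixel, is about 3.4 million cycles per fade step; 60,000,000 cycles per second divided by 3.4 million cycles per step is roughly 17.7 steps per second.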
SysEx (author) commented:
Using CreateDIBSection() did speed up the blitting a bit. The limiting factor is the huge amount of data that has to be moved. It would be nice if there were a function that could translate RGB values while blitting them from a DIB to the screen.