
DirectX Line Drawing

I am using MSVC++5.0, DirectX 5.0, full-screen mode, and an 800x600x16 resolution.

I have a routine to draw a line using DirectDraw.  The routine works, but it doesn't draw the line exactly where I want it to: it draws the line at roughly half the x position I specify, and it won't do colors correctly.  The only color I know of that doesn't generate a compiler warning is 0 (black).

Here is the code for the routine:

BOOL DirectDrawWin::Line(LPDIRECTDRAWSURFACE surf, int x1, int y1, int x2, int y2, BYTE color)
{
      // Bresenham's algorithm (IBM 1965)

      int dx, dy;
      int x_inc, y_inc;
      int error = 0;
      int index;

      HRESULT r;

      DDSURFACEDESC desc;
      ZeroMemory(&desc, sizeof(desc));
      desc.dwSize = sizeof(desc);
      desc.dwFlags=DDSD_WIDTH | DDSD_HEIGHT;
      if(surf->GetSurfaceDesc(&desc)!=DD_OK)
            return FALSE;
      int w=desc.dwWidth;
      int h=desc.dwHeight;

      r = surf->Lock(0, &desc, DDLOCK_WAIT  | DDLOCK_WRITEONLY, 0);
      if (r != DD_OK)
            return FALSE;

      BYTE* surfbits = (BYTE*)desc.lpSurface;
      BYTE* pixel = surfbits + y1 * desc.lPitch + x1;

      dx = x2 - x1;
      dy = y2 - y1;

      // Determine slope
      if(dx >= 0)
      {
            x_inc = 1;
      }
      else
      {
            x_inc = -1;
            dx = -dx;
      }

      if(dy >= 0)
      {
            y_inc = desc.lPitch;
      }
      else
      {
            y_inc = -desc.lPitch;
            dy = -dy;
      }

      // Draw the line based upon deltas
      if(dx > dy)
      {
            for(index=0;index<dx;index++)
            {
                  // Set the pixel
                  *pixel = color;

                  // Discriminate
                  error += dy;

                  // Test for overflow
                  if(error > dx)
                  {
                        error -= dx;
                        pixel += y_inc;
                  }

                  // Move to the next pixel
                  pixel += x_inc;
            }
      }
      else
      {
            for(index=0;index<=dy;index++)
            {
                  // Set the pixel
                  *pixel = color;

                  // Discriminate
                  error += dx;

                  // Test for overflow
                  if(error > 0)
                  {
                        error -= dy;
                        pixel += x_inc;
                  }

                  // Move to the next pixel
                  pixel += y_inc;
            }
      }

      surf->Unlock(0);

      return TRUE;
}

This is simply Bresenham's algorithm, so anyone familiar with line drawing should recognize it.  I think my problem lies in using BYTE pointers where they should be WORD pointers, but switching to WORD pointers generates errors.
Asked by: Egore
1 Solution
 
nietod Commented:
If you have 16 colors, then there are two pixels stored in each byte, but your algorithm treats it as if there is one pixel stored in each byte.  Your algorithm needs work, or you need to use it in 256-color mode.
 
Egore (Author) Commented:
I realize this; do you know how to change the algorithm to make it work with 16-bit color?  That's my main problem...
0
 
nietod Commented:
It's very painful.  I think I would take the following tack: I would perform the calculations in terms of a nibble offset into the screen memory.  That is, I would not add on "desc.lpSurface" at the start (you will have to add it on each time you access memory, not at the start of the calculations).  In this case, the pointer you calculate (before lpSurface is added on) is twice the size it should be, because it expresses a distance in nibbles (half a byte), not bytes.  You then get a pointer to the byte to change by dividing this offset by 2 and adding on lpSurface.  Now, the important thing is that you don't want to change the whole byte pointed to by this pointer, only one nibble.  Which one?  If the nibble offset is odd, the low nibble; if it is even, the high one.

Make sense?  Let me know if you have questions.
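A minimal sketch of the pixel-write step of that nibble idea, assuming a 4-bits-per-pixel (16-color) surface; PlotPixel4 and its parameters are illustrative names, not from the original code:

// Write one pixel on a locked 4-bits-per-pixel (16-color) surface.
// desc is the DDSURFACEDESC filled in by Lock(); color is 0..15.
void PlotPixel4(const DDSURFACEDESC& desc, int x, int y, BYTE color)
{
      // Offset of the pixel measured in nibbles: one row is
      // lPitch bytes, i.e. lPitch * 2 nibbles.
      int nibbleOffset = y * desc.lPitch * 2 + x;

      // The byte that holds this pixel, and which half of it to change.
      BYTE* p = (BYTE*)desc.lpSurface + nibbleOffset / 2;
      if (nibbleOffset & 1)
            *p = (*p & 0xF0) | (color & 0x0F);          // odd offset: low nibble
      else
            *p = (*p & 0x0F) | ((color & 0x0F) << 4);   // even offset: high nibble
}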
 
Egore (Author) Commented:
I'm sorry if I was unclear, but I am dealing with **16-bit** color, not 16 colors.  That means each pixel is represented by two bytes.  A WORD variable is two bytes, but it gives me an error if 'surfbits' and 'pixel' are WORD-sized.

Sorry about the confusion.
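As an aside on the color value itself (not from the thread): in 16-bit mode the WORD packs the red, green, and blue components into bit fields, commonly 5-6-5, but the exact layout depends on the surface's pixel format (IDirectDrawSurface::GetPixelFormat reports the masks), so the 5-6-5 split below is an assumption:

// Pack an RGB triple (0..255 per channel) into a 16-bit pixel,
// assuming the common 5-6-5 layout; check the surface's pixel
// format masks before relying on this.
WORD Pack565(int r, int g, int b)
{
      return (WORD)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

// Example: a pure red pixel.
WORD red = Pack565(255, 0, 0);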
 
nietod Commented:
That is a lot easier.  I would just convert to WORD* pointers instead of BYTE* pointers.  It looks like everything else should stay the same, except that you need to use lPitch / 2 everywhere you currently use lPitch.  In this case lPitch must be even, so you know that lPitch / 2 will be accurate.
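To make that concrete, here is a minimal, self-contained sketch of the 16bpp addressing (PutPixel16 is just an illustrative helper, not part of the original routine).  Inside Line() the same substitutions apply: take color as a WORD, make surfbits and pixel WORD pointers, and use desc.lPitch / 2 wherever lPitch appears:

// Minimal illustration of WORD-pointer addressing on a 16bpp surface.
BOOL PutPixel16(LPDIRECTDRAWSURFACE surf, int x, int y, WORD color)
{
      DDSURFACEDESC desc;
      ZeroMemory(&desc, sizeof(desc));
      desc.dwSize = sizeof(desc);

      if (surf->Lock(0, &desc, DDLOCK_WAIT | DDLOCK_WRITEONLY, 0) != DD_OK)
            return FALSE;

      // lPitch is in bytes; WORD* arithmetic advances two bytes per element,
      // so the per-row step is lPitch / 2 and the per-pixel step is 1.
      WORD* surfbits = (WORD*)desc.lpSurface;
      WORD* pixel = surfbits + y * (desc.lPitch / 2) + x;
      *pixel = color;         // writes a full 16-bit pixel value

      surf->Unlock(0);
      return TRUE;
}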
 
Egore (Author) Commented:
This worked; changing everything to WORD pointers did it.  The problem I had was that, before posting this question, I had tried changing everything to DWORD pointers, and that did not work.

Thanks for the help...

(By the way, the code is the same except for the WORD pointers...)