How do I get and set a matrix in OpenGL?

Hi,

When I do this in OpenGL (I am very new to this):

float marray[16];
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glGetFloatv(GL_MODELVIEW_MATRIX, marray);

I expect marray to contain the identity matrix, so cout << marray[0] should print 1. Is this right?

What I want to do is set the matrix and then multiply it by another, but the set/get doesn't appear to work.

Any ideas?





KevinJoeBadger asked:
satsumo (Software Developer) commented:
The function to create a window is CreateWindow, but there's a lot of stuff to learn before you can use it. I guess you aren't used to writing Windows applications, because you've used main instead of WinMain. If you want to learn that, I would concentrate on writing a Windows application before thinking about OpenGL.

An easier alternative is the GLUT library, which does the initialisation and provides a framework for your OpenGL application to run in. You don't have to pick up all the details of Windows, input, timing and so on, which lets you concentrate on learning OpenGL.

http://www.opengl.org/resources/libraries/glut/
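For what it's worth, a minimal GLUT skeleton might look roughly like this (a sketch only; the function names are standard GLUT, but the header path and window title are just examples):

```cpp
// Minimal GLUT sketch: GLUT creates the window and the OpenGL context
// for you, after which glGet* calls return real values.
#include <iostream>
#include "glut.h"   // or <GL/glut.h>, depending on your installation

void display()
{
    float marray[16];
    glGetFloatv(GL_MODELVIEW_MATRIX, marray);
    std::cout << marray[0] << "\n";   // 1 once the identity matrix is loaded
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);                        // initialise GLUT itself
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutCreateWindow("test");                     // creates a window and a GL context
    glutDisplayFunc(display);
    glMatrixMode(GL_MODELVIEW);                   // a context exists now, so this works
    glLoadIdentity();
    glutMainLoop();                               // hands control to GLUT; never returns
    return 0;
}
```

The key point is that glutCreateWindow does the context creation that the raw code above is missing; every gl* call before that point is a no-op or returns garbage.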
 
satsumo (Software Developer) commented:
Yes, that should load the identity matrix into marray. Does that not work? Does it print 1? What does it do? I'm curious.
 
KevinJoeBadger (Author) commented:
Thanks for the reply.

I get garbage back for all indexes in the matrix (-1.07374e+008) no matter what I do with it.

 
satsumo (Software Developer) commented:
What happens if you do something like glTranslate to the modelview matrix and then call glGetFloatv?
 
KevinJoeBadger (Author) commented:
Still exactly the same result I'm afraid.
 
satsumo (Software Developer) commented:
Then something else is wrong; your code looks OK. What implementation of OpenGL are you using? While OpenGL is a standard, actual support for functions varies. For example, most modern implementations don't support glColorTable, often just ignoring it.
 
KevinJoeBadger (Author) commented:
 
#include <stdlib.h>
#include "glut.h"
#include <iostream>

using namespace std;

int main()
{
	float marray[16];
	glMatrixMode(GL_MODELVIEW);
	glLoadIdentity();
	glTranslatef(1,2,3);
	glGetFloatv(GL_MODELVIEW_MATRIX, marray);
	
	for(int i=0;i<4;i++)
	{
		for(int j=0;j<4;j++)
		{
			cout<<marray[i*4+j]<<",";
		}
		cout<<"\n";
	}

	// result: -1.07374e+008,-1.07374e+008,-1.07374e+008,-1.07374e+008, four times
}

 
KevinJoeBadger (Author) commented:
Is there a call to get the version of OpenGL?
 
satsumo (Software Developer) commented:
Ah, the penny drops: the problem is that you haven't initialised OpenGL. There is a startup process required to make OpenGL work; it's not part of the specification because it is different on each system.

On Windows you have to set the pixel format for a device context, create an OpenGL context and make it current.  The process is described in MSDN http://msdn.microsoft.com/en-us/library/dd374379(v=vs.85).aspx.

On Mac OS, Linux, iOS and Windows Mobile the procedure is different. Each platform has documentation on how to start OpenGL. Until you do this, none of these functions will work.
 
satsumo (Software Developer) commented:
You can use glGetString to get information about OpenGL, although you may have to start it before doing that.
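For example, a fragment along these lines; it is only meaningful once a context is current, because before that glGetString returns NULL:

```cpp
// Query implementation strings. These are standard OpenGL 1.x calls,
// but they return NULL until a context has been created and made current.
const GLubyte *version  = glGetString(GL_VERSION);
const GLubyte *vendor   = glGetString(GL_VENDOR);
const GLubyte *renderer = glGetString(GL_RENDERER);
if (version)
    std::cout << "OpenGL " << version
              << " (" << vendor << ", " << renderer << ")\n";
else
    std::cout << "no current OpenGL context\n";
```

Checking for NULL here doubles as a crude test of whether the context setup worked at all.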
 
KevinJoeBadger (Author) commented:
I wondered about that.
I have tried this initialisation but it makes no difference:

#include <windows.h>
#include <GL/gl.h>
#include <iostream>
#include <cstdlib>

using namespace std;

int main()
{
      HDC    hdc=GetDC(NULL);
      HGLRC  hglrc;
      hglrc = wglCreateContext (hdc);
      wglMakeCurrent (hdc, hglrc);
 
      float marray[16];
      glMatrixMode(GL_MODELVIEW);
      glLoadIdentity();
      glTranslatef(1,2,3);
      glGetFloatv(GL_MODELVIEW_MATRIX, marray);
      int i,j;
      for(i=0;i<4;i++)
      {
            for(j=0;j<4;j++)
            {
                  cout<<marray[i*4+j]<<",";
            }
            cout<<"\n";
      }
      system("pause");
      wglMakeCurrent (NULL, NULL) ;
      wglDeleteContext (hglrc);
      // result: -1.07374e+008,-1.07374e+008,-1.07374e+008,-1.07374e+008, four times
}
 
satsumo (Software Developer) commented:
Almost there. You can't make an OpenGL context on the screen's DC; it must be attached to the DC of a window. So you'll need to add some code to create a window and a window procedure. Also, you have to set the pixel format for the device context before creating the OpenGL context. I've never tried to create an OpenGL context without setting the pixel format; I don't think it works.
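Putting the pieces together, the full Windows startup sequence might be sketched like this (an outline only; it needs to be linked against opengl32.lib, gdi32.lib and user32.lib, the class name "GLDummy" is arbitrary, and real code should check every return value):

```cpp
// Rough sketch of Windows OpenGL startup: register a window class,
// create a window, set a pixel format on its DC, then create and
// bind the OpenGL context. Error handling kept to a minimum.
#include <windows.h>
#include <GL/gl.h>
#include <iostream>

int main()
{
    WNDCLASS wc = {0};
    wc.lpfnWndProc   = DefWindowProc;        // default window procedure is enough here
    wc.hInstance     = GetModuleHandle(NULL);
    wc.lpszClassName = TEXT("GLDummy");
    RegisterClass(&wc);

    HWND hwnd = CreateWindow(TEXT("GLDummy"), TEXT(""), WS_OVERLAPPEDWINDOW,
                             0, 0, 64, 64, NULL, NULL, wc.hInstance, NULL);
    HDC hdc = GetDC(hwnd);                   // the DC of a window, not of the screen

    PIXELFORMATDESCRIPTOR pfd = {0};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 24;

    int format = ChoosePixelFormat(hdc, &pfd);   // must happen before wglCreateContext
    SetPixelFormat(hdc, format, &pfd);

    HGLRC hglrc = wglCreateContext(hdc);
    if (!hglrc)                                  // context creation can fail, so check
    {
        std::cout << "wglCreateContext failed: " << GetLastError() << "\n";
        return 1;
    }
    wglMakeCurrent(hdc, hglrc);

    float marray[16];
    glMatrixMode(GL_MODELVIEW);                  // OpenGL calls work from here on
    glLoadIdentity();
    glGetFloatv(GL_MODELVIEW_MATRIX, marray);
    std::cout << marray[0] << "\n";              // 1, not the debug fill pattern

    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(hglrc);
    ReleaseDC(hwnd, hdc);
    DestroyWindow(hwnd);
    return 0;
}
```

Compared with the code above, the two additions are the window (so the DC belongs to something OpenGL can draw to) and the SetPixelFormat call before wglCreateContext.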
 
satsumo (Software Developer) commented:
You should check the return values; in this case hglrc is likely to be NULL. It's quite possible for OpenGL context creation to fail: not enough video memory left, hardware that doesn't support it, a missing driver DLL, DirectX running in exclusive mode. There are many potential reasons.
 
KevinJoeBadger (Author) commented:
Not sure how to do that (create a window).
 
KevinJoeBadger (Author) commented:
Yes, I mainly program COM on Windows and use Visual C++ to create dialogs, so I haven't done much with CreateWindow().
I have a solution which I found based on your hints.
I am using the OpenGL DLL to run stuff in a cmd box (for college), but based on your suggestions I have initialised OpenGL properly, and on creating a window using the library I have, I find the functions work OK.
Many thanks.
Question has a verified solution.

Are you are experiencing a similar issue? Get a personalized answer when you ask a related question.

Have a better answer? Share it in a comment.

All Courses

From novice to tech pro — start learning today.