In OpenGL, I'm reading about glVertexPointer, which takes a parameter, const void* pointer: a pointer to the coordinates of the first vertex. The thing is, it doesn't have any parameter that tells you the number of vertices there are.

I'm wondering if there is a way to copy or store this pointer for use in a later call, glDrawElements. Why? Because glDrawElements has a count parameter, but it's always called after glVertexPointer.

Simplified:

glVertexPointer called.. //has the vertices I need but not the count. Documentation
glDrawElements called.. //has the count but only indices (not vertices). Called after the above ={ Documentation

Is there any way to do what I'm asking? Do glPushClientAttrib and glPopClientAttrib do it?

Have you looked into Vertex Buffer Objects (VBOs)?

The glVertexPointer mechanism is older, dating from before it was possible to keep the vertex data in graphics card memory (in those days, you could barely fit all the needed textures in video memory).

Hmmm, I didn't see that. I used OllyDbg to watch the calls made to my opengl32.dll, and those were the calls being made, so I immediately took interest in them.

I didn't want to have to use wglGetProcAddress because that's platform specific, and Windows doesn't expose the extended functions directly. For example, I read that depending on the vendor the function might be exposed as glGenBuffersARB, glGenBuffersEXT, or an NV-suffixed variant. I'm not sure which to use or whether there is a way to detect which one.

So my new question is: how do games decide which one to use (I can't use GLEW)? How do they know when to use what? Is it all hardcoded?

You can find out what is supported using the glGetString function. It lets you retrieve the version string and the list of supported extensions.

Overall, you should stick to the core functionality supported by the version you have, possibly supplemented by ARB extensions. Basically, extensions move from vendor-specific (like NV), to general extensions (EXT), to extensions accepted by the Architecture Review Board (ARB), and then they become core functionality in newer versions. Today, I think most hardware supports at least version 2.1 or 3.0. Make sure the header files you are using to access the functions are up to date, and there shouldn't be much of an issue.

Normally, versions always remain backward compatible too.

As for actual game engines, they first isolate (behind abstractions) all the rendering code, so that they can easily swap out different rendering engines (OpenGL, Direct3D, and different versions of their renderers). Then, they decide what minimum capabilities they want their renderer to have, and that determines the minimum version of OpenGL or Direct3D needed to make it work (this is the "minimum required DirectX or OpenGL version" that appears on the box). Finally, they code alternative rendering paths depending on what level of feature support the hardware provides (and according to the "graphics settings" that the player configures).

So, yes, it is basically hard-coded, but you structure your rendering code so that you don't have too much repetition. For example, you have just one class that handles the rendering of a mesh (vertex buffer), and most other things in your game just create / generate / load the mesh data and give it to that mesh-rendering object; this way, all the different hard-coded alternative rendering methods appear only once, in that one class implementation.

And of course, you can do this incrementally: first implement the basic method (no advanced version/feature support needed), then add more sophisticated alternatives afterwards.

commented: Edit: never mind. Thanks for all the help. I updated my extension includes; it's enough to work :)

Hmmm. I just checked my OpenGL headers for glGenBuffers and glBindBuffer; they don't exist in there. My OpenGL headers come with MinGW (the latest version).

My system OpenGL is 3.3, but all Windows operating systems stop at 1.1 functionality unless I use the wiggle (wgl) functions. In this case I can't, though, because MinGW's headers are years out of date, and OpenGL.org doesn't supply gl.h, only the extension headers :S

glGetString with the GL_EXTENSIONS parameter doesn't seem to return the true set of supported functions; it only returns what the header supports..

It seems Microsoft is pushing DirectX and keeping OpenGL at 1.1 :S Linux doesn't have this problem -_-

It appears that the 3.0 and up headers are still in development. There is a draft for gl3.h.

And MinGW should be able to link directly to OpenGL32.dll; if not, follow these instructions to create the import library.
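For reference, linking against the system OpenGL library with MinGW is typically just the following (a sketch; `main.cpp` is a placeholder, and `-lgdi32` is often needed alongside it for the wgl context-creation calls):

```shell
g++ main.cpp -o game.exe -lopengl32 -lgdi32
```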

Hey, I got everything working. I decided to store the pointer for later use in a struct.

struct Model
{
    GLint Stride, ID;
    const GLvoid* VertexPointer;   // const, to match the pointer glVertexPointer receives
    GLint TriangleCount;

    struct
    {
        std::vector<GLfloat> X, Y, Z;
    } Triangles;
};



Model CurrentModel;
std::vector<Model> ListOfModels;



GL_EXPORT __stdcall void Detour_glVertexPointer(GLint size, GLenum type, GLsizei stride, const GLvoid *pointer)
{
    if (LogCalls) glLogCalls("glVertexPointer: %d, %u, %d, %p", size, type, stride, pointer);
    DrawModel = true;
    if (size == 3 && type == GL_FLOAT)
    {
        CurrentModel.Stride = stride;
        CurrentModel.VertexPointer = pointer;   //Store a pointer to the list of vertices! Hopefully this works.
        ListOfModels.push_back(CurrentModel);
    }
    (*optr_glVertexPointer) (size, type, stride, pointer);
}

GL_EXPORT __stdcall void Detour_glDrawElements(GLenum mode, GLsizei count, GLenum type, const GLvoid *indices)
{
    if (LogCalls) glLogCalls("glDrawElements: %u, %d, %u, %p\n", mode, count, type, indices);
    if (DrawModel)
    {
        ListOfModels.back().ID = 0;
        const GLvoid* ModelVertexPtr = ListOfModels.back().VertexPointer;
        size_t ModelTriCount = ListOfModels.back().TriangleCount = count / 3;

        for (size_t I = 0; I < ModelTriCount; I++)
        {
            ListOfModels.back().Triangles.X[I] = (GLfloat)ModelVertexPtr[I];
            ListOfModels.back().Triangles.Y[I + 1] = (GLfloat)ModelVertexPtr[I + 1];
            ListOfModels.back().Triangles.Z[I + 2] = (GLfloat)ModelVertexPtr[I + 2];

            if (ListOfModels.back().Stride == Stride && Overlay)
                glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);
            else
                glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);
        }
    }
    else
        glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);
    (*optr_glDrawElements) (mode, count, type, indices);
}

But line 38 gives me errors :S It says "invalid use of struct Model::Triangles", and that "'const GLvoid* {aka const void*}' is not a pointer-to-object type".

What am I doing wrong? All I did was store the pointer, but I think I'm accessing it wrong.
