Edit: Interesting debug finding. I'm running this on an Nvidia GPU. On an Intel integrated GPU it runs, but nothing shows on the screen!

VBO rendering crashes at glDrawArrays

I'm playing around with LWJGL 3 and I'm running into an issue with glDrawArrays: the moment it is called, the JVM crashes.

I'm using modern OpenGL, so I have my own shaders and matrix calculations. The pipeline I've programmed is quite complex, so pasting all of the code here wouldn't be useful.

Vertex shader:

#version 330 core

uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat4 modelMatrix;

layout (location = 0) in vec3 in_position;

void main() {
    gl_Position = (projectionMatrix * viewMatrix * modelMatrix) * vec4(in_position, 1.0f);
}

Fragment shader:

#version 330 core

void main() {
    gl_FragColor = vec4(1.0f, 0.0, 1.0f, 1.0f);
}

I'm trying to draw a single point on the screen using a VBO. Creation of the VBO:

float[] vboData = new float[]{0.0f, 0.0f, 0.0f};
FloatBuffer buffer = BufferUtils.createFloatBuffer(vboData.length);
buffer.put(vboData).flip();

vbo_id = glGenBuffers();
glBindBuffer(GL_ARRAY_BUFFER, vbo_id);
glBufferData(GL_ARRAY_BUFFER, buffer, GL_STATIC_DRAW);
glBindBuffer(GL_ARRAY_BUFFER, 0);
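
As an aside, the fill-then-flip step matters here: NIO buffers track a position, and the upload starts from it. A self-contained sketch of the pattern with plain java.nio (no LWJGL; the allocation method differs from BufferUtils, but the position/flip behaviour is the same):

```java
import java.nio.FloatBuffer;

public class BufferFlipDemo {
    // Builds a FloatBuffer the way vertex data is usually prepared:
    // allocate by capacity, put the array, then flip so that
    // position = 0 and remaining() = the number of floats to upload.
    static FloatBuffer vertexBuffer(float[] data) {
        FloatBuffer buffer = FloatBuffer.allocate(data.length);
        buffer.put(data);   // position is now data.length
        buffer.flip();      // limit = old position, position = 0
        return buffer;
    }

    public static void main(String[] args) {
        FloatBuffer b = vertexBuffer(new float[]{0.0f, 0.0f, 0.0f});
        System.out.println(b.position() + " " + b.remaining());
    }
}
```

If the flip is forgotten, remaining() is 0 and glBufferData receives an empty range, which silently uploads nothing.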

And the rendering of the point using glDrawArrays:

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    
    testShader.bind();
        pipeline.applyMatrices(testShader);
        
        glBindBuffer(GL_ARRAY_BUFFER, vbo_id);
            glVertexAttribPointer(0, 3, GL_FLOAT, false, 12, 0);
            glDrawArrays(GL_POINTS, 0, 1);
        glBindBuffer(GL_ARRAY_BUFFER, 0);
        
        pipeline.setCameraPosition(new Vector3f(0.0f, 0.0f, 0.0f), new Vector3f(0.0f, 0.0f, 0.0f));
    testShader.unbind();
    
    GLFWUtils.swapBuffers();

Using GLIntercept I managed to get a log of every OpenGL call made:

glClearColor(1.000000,0.000000,1.000000,1.000000)
glPointSize(10.000000)
glGenBuffers(1,000000001E0F6FC0)
glBindBuffer(GL_ARRAY_BUFFER,1)
glBufferData(GL_ARRAY_BUFFER,12,0000000025EB8D90,GL_STATIC_DRAW)
glBindBuffer(GL_ARRAY_BUFFER,0)
glCreateProgram()=1 
glCreateShader(GL_VERTEX_SHADER)=2 
glCreateShader(GL_FRAGMENT_SHADER)=3 
glShaderSource(2,1,000000001E0F6FC0,000000001E0F6FC8)
glShaderSource(3,1,000000001E0F6FC0,000000001E0F6FC8)
glCompileShader(2)
glCompileShader(3)
glGetShaderiv(2,GL_COMPILE_STATUS,000000001E0F6FC0)
glGetShaderiv(3,GL_COMPILE_STATUS,000000001E0F6FC0)
glAttachShader(1,2)
glAttachShader(1,3)
glLinkProgram(1)
glEnableVertexAttribArray(0)
glClear(GL_DEPTH_BUFFER_BIT | GL_COLOR_BUFFER_BIT)
glUseProgram(1)
glGetUniformLocation(1,"projectionMatrix")=1 
glUniformMatrix4fv(1,1,false,[-0.671312,-0.000000,-0.000000,-0.000000,-0.000000,-0.895083,-0.000000,-0.000000,-0.000000,-0.000000,-1.000020,-1.000000,-0.000000,-0.000000,-0.020000,-0.000000])
glGetUniformLocation(1,"viewMatrix")=2 
glUniformMatrix4fv(2,1,false,[1.000000,0.000000,0.000000,0.000000,0.000000,1.000000,0.000000,0.000000,0.000000,0.000000,1.000000,0.000000,0.000000,0.000000,0.000000,1.000000])
glGetUniformLocation(1,"modelMatrix")=0 
glUniformMatrix4fv(0,1,false,[1.000000,0.000000,0.000000,0.000000,0.000000,1.000000,0.000000,0.000000,0.000000,0.000000,1.000000,0.000000,0.000000,0.000000,0.000000,1.000000])
glBindBuffer(GL_ARRAY_BUFFER,0)
glVertexAttribPointer(0,3,GL_FLOAT,false,12,0000000000000000)
glDrawArrays(GL_POINTS,0,1)

I've carefully checked a few things:

  • The shaders compile and link fine
  • The calculated matrices look correct (you can double-check them in the GLIntercept log)
  • The VBO is correctly created and filled with the correct data
  • The attribute array at location 0 (the layout location of in_position) is enabled
  • The matrices are correctly sent to the shader (this happens in pipeline.applyMatrices)
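
For anyone wanting to sanity-check uniform values like the projectionMatrix entries in the log, the non-zero entries of a standard OpenGL perspective matrix can be recomputed by hand. A self-contained sketch (plain Java, no LWJGL; the helper name and parameters are my own):

```java
public class PerspectiveCheck {
    // Column-major 4x4 perspective matrix, the layout glUniformMatrix4fv
    // expects with transpose = false. fovY is in radians.
    static float[] perspective(float fovY, float aspect, float near, float far) {
        float f = (float) (1.0 / Math.tan(fovY / 2.0)); // cotangent of half the fov
        float[] m = new float[16];
        m[0]  = f / aspect;                         // x scale
        m[5]  = f;                                  // y scale
        m[10] = (far + near) / (near - far);        // z remap
        m[11] = -1.0f;                              // w = -z_eye
        m[14] = (2.0f * far * near) / (near - far); // z translation
        return m;
    }

    public static void main(String[] args) {
        // Example: 90 degree fov, square aspect, near 0.1, far 100
        float[] m = perspective((float) (Math.PI / 2.0), 1.0f, 0.1f, 100.0f);
        System.out.printf("m[0]=%f m[5]=%f m[10]=%f m[14]=%f%n",
                m[0], m[5], m[10], m[14]);
    }
}
```

With near = 0.01 and far = 1000, m[10] and m[14] come out as -1.000020 and -0.020000, matching the corresponding entries in the log, so the projection values themselves look plausible.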

What am I missing?