I'm trying to switch from glTranslate, glRotate, etc. to my own matrices, but for some reason it doesn't work. Here are my two functions that create the view and projection matrices:
public Matrix4f getViewMatrix() {
    Matrix4f viewMatrix = new Matrix4f();
    viewMatrix.setIdentity();
    // translate to the player position, then rotate around each axis
    viewMatrix.translate(game.player.position);
    // note: Matrix4f.rotate expects the angle in radians
    viewMatrix.rotate(game.player.rotation.x, new Vector3f(1.0f, 0.0f, 0.0f));
    viewMatrix.rotate(game.player.rotation.y, new Vector3f(0.0f, 1.0f, 0.0f));
    viewMatrix.rotate(game.player.rotation.z, new Vector3f(0.0f, 0.0f, 1.0f));
    // System.out.println(viewMatrix);
    return viewMatrix;
}
public Matrix4f getProjectionMatrix() {
    // read the current fixed-function matrix back into a Matrix4f
    FloatBuffer projectionBuffer = BufferUtils.createFloatBuffer(16);
    GL11.glGetFloat(GL11.GL_MODELVIEW_MATRIX, projectionBuffer);
    Matrix4f projectionMatrix = new Matrix4f();
    projectionMatrix.load(projectionBuffer);
    return projectionMatrix;
}
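For context, the projection itself is set up earlier through the fixed-function pipeline, roughly like this (a sketch, not my literal init code; the fov/near/far values and displayWidth/displayHeight are placeholders):

GL11.glMatrixMode(GL11.GL_PROJECTION);
GL11.glLoadIdentity();
// org.lwjgl.util.glu.GLU provides gluPerspective; placeholder values
GLU.gluPerspective(70.0f, (float) displayWidth / displayHeight, 0.1f, 1000.0f);
GL11.glMatrixMode(GL11.GL_MODELVIEW);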
I send those two matrices to the vertex shader as uniforms, and in the shader I compute:
gl_Position = view_matrix * proj_matrix * vec4(in_position, 1.0);
where in_position is the position of the vertex.
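The upload itself looks roughly like this (a sketch; shaderProgram stands in for my linked program handle, and the program is already bound with GL20.glUseProgram):

FloatBuffer matBuffer = BufferUtils.createFloatBuffer(16);

getViewMatrix().store(matBuffer); // Matrix4f.store writes column-major
matBuffer.flip();
GL20.glUniformMatrix4(GL20.glGetUniformLocation(shaderProgram, "view_matrix"), false, matBuffer);

matBuffer.clear();
getProjectionMatrix().store(matBuffer);
matBuffer.flip();
GL20.glUniformMatrix4(GL20.glGetUniformLocation(shaderProgram, "proj_matrix"), false, matBuffer);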
I do see something on screen, but it's very, very buggy, and nothing about it is right. If I instead use the built-in gl_ModelViewProjectionMatrix in the shader and set the transform with glTranslate and glRotate in OpenGL, it works perfectly fine.
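For comparison, that working fixed-function path is essentially the usual camera setup (a rough sketch, not my literal code; note that glRotatef takes degrees):

GL11.glMatrixMode(GL11.GL_MODELVIEW);
GL11.glLoadIdentity();
GL11.glRotatef(game.player.rotation.x, 1.0f, 0.0f, 0.0f);
GL11.glRotatef(game.player.rotation.y, 0.0f, 1.0f, 0.0f);
GL11.glRotatef(game.player.rotation.z, 0.0f, 0.0f, 1.0f);
GL11.glTranslatef(game.player.position.x, game.player.position.y, game.player.position.z);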
What am I doing wrong here?
Here is the output of my view matrix, together with the camera position:
0.5095141    0.0          -0.8604623   1.1625774
0.7321221    0.52541286    0.43351877  16.980185
0.45209795  -0.8508474     0.26770526  0.8665553
0.0          0.0           0.0         1.0
Vector3f[1.1625774, 16.980185, 0.8665553]