Texture mapping on voxel model in OpenGL

Question
I'm following a YouTube video series about OpenGL using LWJGL; so far I've managed to render 3D models and texture them properly using the OBJ format. I want to use voxel-based models for my game, so I opened MagicaVoxel and exported a textured example in the OBJ format, but the texture is not mapped correctly. Indeed, some colors seem to be correctly mapped, but other faces have the entire texture on them.
Here is a picture of the expected result:
and the actual result:
I think the problem comes from the texture and the way OpenGL interpolates it: the texture is a 1×256 line of colors, and in the OBJ file only the UV coordinates of the desired colors are used.
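As an aside on the interpolation suspicion: with a palette-strip texture, bleeding between colors can be avoided by sampling exactly at texel centres. A minimal sketch of the texel-centre formula for a 1×n strip (the class and method names here are my own, purely for illustration):

```java
public class PaletteUV {
    // U coordinate of the centre of palette entry i in a 1 x n strip.
    // Sampling exactly at a texel centre keeps GL_LINEAR filtering from
    // blending neighbouring palette colours.
    public static float paletteU(int i, int n) {
        return (i + 0.5f) / n;
    }
}
```

For the 3-pixel texture in the example below this gives 1/6, 1/2 and 5/6, whereas the example OBJ uses 0.25, 0.5 and 0.75; with linear filtering, 0.25 lies between the centres of the first two texels and would blend them. Nearest-neighbour filtering or texel-centre UVs would be safer, though filtering alone would not explain faces showing the entire texture.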
I made a simpler example to help understand what is going on: just 3 cubes aligned with each other, and a 3-pixel-long texture with 3 different colors. Here is the content of the OBJ file; the texture is too small to be shown, but it really is just 3 colored pixels.
# normals
vn -1 0 0
vn 1 0 0
vn 0 0 1
vn 0 0 -1
vn 0 -1 0
vn 0 1 0
# texcoords
vt 0.25 0.5
vt 0.5 0.5
vt 0.75 0.5
# verts
v -0.1 0 0
v -0.1 0 -0.1
v -0.1 0.1 0
v -0.1 0.1 -0.1
v 0.2 0 0
v 0.2 0 -0.1
v 0.2 0.1 0
v 0.2 0.1 -0.1
v -0.1 0 0
v -0.1 0.1 0
v 0 0 0
v 0 0.1 0
v 0.1 0 0
v 0.1 0.1 0
v 0.2 0 0
v 0.2 0.1 0
v -0.1 0 -0.1
v -0.1 0.1 -0.1
v 0 0 -0.1
v 0 0.1 -0.1
v 0.1 0 -0.1
v 0.1 0.1 -0.1
v 0.2 0 -0.1
v 0.2 0.1 -0.1
v -0.1 0 0
v 0 0 0
v 0.1 0 0
v 0.2 0 0
v -0.1 0 -0.1
v 0 0 -0.1
v 0.1 0 -0.1
v 0.2 0 -0.1
v -0.1 0.1 0
v 0 0.1 0
v 0.1 0.1 0
v 0.2 0.1 0
v -0.1 0.1 -0.1
v 0 0.1 -0.1
v 0.1 0.1 -0.1
v 0.2 0.1 -0.1
# faces
f 3/2/1 2/2/1 1/2/1
f 4/2/1 2/2/1 3/2/1
f 5/1/2 6/1/2 7/1/2
f 7/1/2 6/1/2 8/1/2
f 11/2/3 10/2/3 9/2/3
f 12/2/3 10/2/3 11/2/3
f 13/3/3 12/3/3 11/3/3
f 14/3/3 12/3/3 13/3/3
f 15/1/3 14/1/3 13/1/3
f 16/1/3 14/1/3 15/1/3
f 17/2/4 18/2/4 19/2/4
f 19/2/4 18/2/4 20/2/4
f 19/3/4 20/3/4 21/3/4
f 21/3/4 20/3/4 22/3/4
f 21/1/4 22/1/4 23/1/4
f 23/1/4 22/1/4 24/1/4
f 29/2/5 26/2/5 25/2/5
f 30/3/5 27/3/5 26/3/5
f 30/2/5 26/2/5 29/2/5
f 31/1/5 28/1/5 27/1/5
f 31/3/5 27/3/5 30/3/5
f 32/1/5 28/1/5 31/1/5
f 33/2/6 34/2/6 37/2/6
f 34/3/6 35/3/6 38/3/6
f 37/2/6 34/2/6 38/2/6
f 35/1/6 36/1/6 39/1/6
f 38/3/6 35/3/6 39/3/6
f 39/1/6 36/1/6 40/1/6
As you can see, for each face the 3 UV coordinates picked for the 3 vertices are the same, but in OpenGL this is the result (the cubes are supposed to be red, blue and yellow):
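For reference, each corner of an 'f' line is a 1-based triplet of position/texcoord/normal indices. A tiny sketch (the helper class is hypothetical) of how one corner decodes:

```java
public class ObjCorner {
    // Decode one face corner such as "3/2/1" into zero-based
    // {position, texcoord, normal} indices; OBJ indices start at 1.
    public static int[] parse(String corner) {
        String[] parts = corner.split("/");
        return new int[] {
            Integer.parseInt(parts[0]) - 1,
            Integer.parseInt(parts[1]) - 1,
            Integer.parseInt(parts[2]) - 1
        };
    }
}
```

On `f 3/2/1 2/2/1 1/2/1`, every corner decodes to texcoord index 1 (zero-based), i.e. `vt 0.5 0.5`, so all three vertices of that triangle do indeed share one UV.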
Here is my OBJ file reader code (in Java), which I call to create the VAO and everything needed for rendering:
public static RawModel loadObjModel(String fileName, Loader loader) {
    FileReader fr = null;
    try {
        fr = new FileReader(new File("res/" + fileName + ".obj"));
    } catch (FileNotFoundException e) {
        System.err.println("Couldn't load file!");
        e.printStackTrace();
    }
    BufferedReader reader = new BufferedReader(fr);
    String line;
    List<Vector3f> vertices = new ArrayList<Vector3f>();
    List<Vector2f> textures = new ArrayList<Vector2f>();
    List<Vector3f> normals = new ArrayList<Vector3f>();
    List<Integer> indices = new ArrayList<Integer>();
    float[] verticesArray = null;
    float[] normalsArray = null;
    float[] textureArray = null;
    int[] indicesArray = null;
    try {
        while (true) {
            line = reader.readLine();
            String[] currentLine = line.split(" ");
            if (line.startsWith("v ")) {
                Vector3f vertex = new Vector3f(Float.parseFloat(currentLine[1]),
                        Float.parseFloat(currentLine[2]), Float.parseFloat(currentLine[3]));
                vertices.add(vertex);
            } else if (line.startsWith("vt ")) {
                Vector2f texture = new Vector2f(Float.parseFloat(currentLine[1]),
                        Float.parseFloat(currentLine[2]));
                textures.add(texture);
            } else if (line.startsWith("vn ")) {
                Vector3f normal = new Vector3f(Float.parseFloat(currentLine[1]),
                        Float.parseFloat(currentLine[2]), Float.parseFloat(currentLine[3]));
                normals.add(normal);
            } else if (line.startsWith("f ")) {
                textureArray = new float[vertices.size() * 2];
                normalsArray = new float[vertices.size() * 3];
                break;
            }
        }
        while (line != null) {
            if (!line.startsWith("f ")) {
                line = reader.readLine();
                continue;
            }
            String[] currentLine = line.split(" ");
            String[] vertex1 = currentLine[1].split("/");
            String[] vertex2 = currentLine[2].split("/");
            String[] vertex3 = currentLine[3].split("/");
            processVertex(vertex1, indices, textures, normals, textureArray, normalsArray);
            processVertex(vertex2, indices, textures, normals, textureArray, normalsArray);
            processVertex(vertex3, indices, textures, normals, textureArray, normalsArray);
            line = reader.readLine();
        }
        reader.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
    verticesArray = new float[vertices.size() * 3];
    indicesArray = new int[indices.size()];
    int vertexPointer = 0;
    for (Vector3f vertex : vertices) {
        verticesArray[vertexPointer++] = vertex.x;
        verticesArray[vertexPointer++] = vertex.y;
        verticesArray[vertexPointer++] = vertex.z;
    }
    for (int i = 0; i < indices.size(); i++) {
        indicesArray[i] = indices.get(i);
    }
    return loader.loadToVAO(verticesArray, indicesArray, textureArray);
}
private static void processVertex(String[] vertexData, List<Integer> indices,
        List<Vector2f> textures, List<Vector3f> normals, float[] textureArray,
        float[] normalsArray) {
    int currentVertexPointer = Integer.parseInt(vertexData[0]) - 1;
    indices.add(currentVertexPointer);
    Vector2f currentTex = textures.get(Integer.parseInt(vertexData[1]) - 1);
    textureArray[currentVertexPointer * 2] = currentTex.x;
    textureArray[currentVertexPointer * 2 + 1] = 1 - currentTex.y;
    Vector3f currentNorm = normals.get(Integer.parseInt(vertexData[2]) - 1);
    normalsArray[currentVertexPointer * 3] = currentNorm.x;
    normalsArray[currentVertexPointer * 3 + 1] = currentNorm.y;
    normalsArray[currentVertexPointer * 3 + 2] = currentNorm.z;
}
Here is my fragment shader:
#version 400 core

in vec2 pass_textureCoords;
out vec4 out_colour;

uniform sampler2D textureSampler;

void main(void) {
    out_colour = texture(textureSampler, pass_textureCoords);
}
Here is my vertex shader:
#version 400 core

in vec3 position;
in vec2 textureCoords;
out vec2 pass_textureCoords;

uniform mat4 transformationMatrix;
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;

void main(void) {
    gl_Position = projectionMatrix * viewMatrix * transformationMatrix * vec4(position.xyz, 1.0);
    pass_textureCoords = textureCoords;
}
And here is my render method called every frame:
public void render(Entity entity, StaticShader shader) {
    TexturedModel texturedModel = entity.getTexturedModel();
    RawModel model = texturedModel.getRawModel();
    GL30.glBindVertexArray(model.getVaoID());
    GL20.glEnableVertexAttribArray(0);
    GL20.glEnableVertexAttribArray(1);
    Matrix4f transformationMatrix = Maths.createTransformationMatrix(
            entity.getPosition(), entity.getRotation(), entity.getScale());
    shader.loadTransformationMatrix(transformationMatrix);
    GL13.glActiveTexture(GL13.GL_TEXTURE0);
    GL11.glBindTexture(GL11.GL_TEXTURE_2D, texturedModel.getTexture().getID());
    GL11.glDrawElements(GL11.GL_TRIANGLES, model.getVertexCount(), GL11.GL_UNSIGNED_INT, 0);
    GL20.glDisableVertexAttribArray(0);
    GL20.glDisableVertexAttribArray(1);
    GL30.glBindVertexArray(0);
}
I don't understand what I'm missing; again, this approach already worked for a Blender model with a large texture:
Answer 1 (Score: 2)
Your Wavefront OBJ file loader really only works in a very particular case, namely when no texture coordinates or normals are shared by any two vertices, so that the v, vn and vt specifications have a 1:1 correspondence to one another.
This is generally not the case, though. Leaving aside the fact that your Wavefront OBJ file loader also only works when all 'f' lines come after all 'v', 'vt' and 'vn' lines (which is likewise not always the case), you still have a few other problems.
So the main problem right now is that you assume a 1:1 correspondence between 'v' and 'vt' lines, which does not hold. Generally, you cannot simply use the 'v' index (the first '/'-delimited value in an 'f' line) as your OpenGL element buffer index, because OpenGL has only a single element index, which indexes into the position, texture and normal arrays uniformly, whereas the Wavefront OBJ file format has three separate indices: one each for positions, texture coordinates and normals.
So what you should do is rethink/rewrite your Wavefront OBJ file loader so that it first collects all position (v), texture coordinate (vt) and normal (vn) information, and then, whenever it encounters a face (f), appends the position, texture and normal data at the indices specified in the 'f' line to your final result buffers. You cannot simply use a single index derived from the position index of the 'f' line.
You can either use no indices at all, or simply use a contiguous, incrementing index for your OpenGL element buffer.
But first, I highly recommend reading an actual specification for the Wavefront OBJ file format, such as this: http://paulbourke.net/dataformats/obj/
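The rewrite described above might be sketched roughly like this (an illustration only; the class, method and parameter names are mine, and error handling is omitted): every face corner becomes its own output vertex, and the element index simply counts up.

```java
import java.util.List;

public class ObjExpand {
    // Expand OBJ face corners ("v/vt/vn", 1-based) so that position,
    // texcoord and normal data share one contiguous OpenGL element index.
    public static int[] expand(List<float[]> positions, List<float[]> texcoords,
                               List<float[]> normals, List<String> corners,
                               List<Float> outPos, List<Float> outTex,
                               List<Float> outNorm) {
        int[] indices = new int[corners.size()];
        for (int i = 0; i < corners.size(); i++) {
            String[] idx = corners.get(i).split("/");
            float[] p = positions.get(Integer.parseInt(idx[0]) - 1);
            float[] t = texcoords.get(Integer.parseInt(idx[1]) - 1);
            float[] n = normals.get(Integer.parseInt(idx[2]) - 1);
            for (float v : p) outPos.add(v);
            outTex.add(t[0]);
            outTex.add(1 - t[1]);   // V flip, as in the asker's loader
            for (float v : n) outNorm.add(v);
            indices[i] = i;         // contiguous, incrementing index
        }
        return indices;
    }
}
```

Corners shared between faces get duplicated this way; if memory matters, a map from the "v/vt/vn" string to an already-emitted index can deduplicate them.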