Thoughts from a NeXTStep Guy on Cocoa Development

Wave Front OBJ - textures working

Dec 31, 2008 by Bill Dudney

I finally got the texture stuff working again. As I said in the last post, I have been working on getting my head wrapped around OpenGL, including the programmable pipeline (GLSL etc.). Since there is no programmable pipeline on the iPhone, I started on the Mac so I could do GLSL. Well, it turns out that to load my textures I was using Image I/O, which is also not on the iPhone. After some false starts I finally just copied the Texture2D class from the TouchFighterII example. I could not find anything that said 'don't redistribute this class', so I figure it's OK.

Once I got my textures to load properly, the rest turned out to be fairly easy to get going again. I ripped out all the shader stuff and it just worked. Well, there was also the matter of converting all my indexes to shorts, and apparently the length of the buffer does matter (d'oh!), so once I got that fixed everything showed up as it should. Without further ado, here is the code that will run in the simulator.
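The index conversion is worth spelling out: OpenGL ES 1.1 only accepts GL_UNSIGNED_BYTE or GL_UNSIGNED_SHORT indexes, and glBufferData wants a length in bytes, not elements. Here is a minimal sketch in plain C of what that conversion looks like (the function name and the GL typedef stand-ins are mine, not from the project):

```c
#include <stddef.h>
#include <stdlib.h>

/* stand-ins for the GL typedefs so the sketch is self-contained */
typedef unsigned short GLushort;
typedef unsigned int   GLuint;

/* Convert 32-bit indexes to the 16-bit indexes OpenGL ES 1.1 requires.
   Returns NULL if any index is too large to fit in a GLushort. */
GLushort *convertIndices(const GLuint *in, size_t count, size_t *byteLength) {
    GLushort *out = malloc(count * sizeof(GLushort));
    if (!out) return NULL;
    for (size_t i = 0; i < count; i++) {
        if (in[i] > 0xFFFF) { free(out); return NULL; }
        out[i] = (GLushort)in[i];
    }
    /* the length handed to glBufferData is in bytes, not elements */
    *byteLength = count * sizeof(GLushort);
    return out;
}
```

The returned byteLength is what would go into the glBufferData call for the GL_ELEMENT_ARRAY_BUFFER; getting that wrong is exactly the "length of the buffer does matter" lesson above.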

Here is a screen shot of one of the models running. The pyramid and the plane have a texture applied, the cube does not.

To get the texture VBO into video memory I did this:

      glEnableClientState(GL_TEXTURE_COORD_ARRAY);
      GLuint textureCoordsName = [group textureCoordinatesName:GL_STATIC_DRAW];

The first line enables the texture-coordinate array client state. The second line pushes the bytes out to the card, if necessary, like this:

- (GLuint)textureCoordinatesName:(GLenum)usage {
  // lazily create the VBO for the texture coords and upload the bytes once
  if(0 == textureCoordsName && texCoordsData.length > 0) {
    glGenBuffers(1, &textureCoordsName);
    glBindBuffer(GL_ARRAY_BUFFER, textureCoordsName);
    glBufferData(GL_ARRAY_BUFFER, texCoordsData.length,
                 texCoordsData.bytes, usage);
  }
  return textureCoordsName;
}

Then the texture coordinate buffer is bound and pointed at like this:

      glBindBuffer(GL_ARRAY_BUFFER, textureCoordsName);
      glTexCoordPointer([group texCoordSize], GL_FLOAT, 0, 0);

That binds the texture coordinate buffer (i.e., makes it active); then we call glTexCoordPointer, telling OpenGL that each texture coordinate has texCoordSize components of type GL_FLOAT, tightly packed (stride 0), starting at offset 0 in the bound buffer.

Next up, we enable 2D texturing and bind the texture, and we are done with the texture setup.

      glEnable(GL_TEXTURE_2D);
      GLuint texId = [group.material.diffuseTexture textureName];
      glBindTexture(GL_TEXTURE_2D, texId);

Now when we draw with the glDrawElements function like this:

    GLuint indexesName = [group indexesName:GL_STATIC_DRAW];
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indexesName);
    glDrawElements(GL_TRIANGLES, group.indexCount, GL_UNSIGNED_SHORT, NULL);

the texture (and the coords) will be used.

Happy hacking!



Comments:

I think most people now use the Texture2D stuff from Apple, as it just seems to work with most things, and all the conversion stuff is handled for you.

Happy New Year to you and hope this year is better than the last for all.

Philip

Posted by Philip Orr on January 01, 2009 at 11:51 AM MST #

Hi Bill,

It's very cool that you wrote this. However, I tried importing some sample .obj files I created in 3ds Max (a cube with a texture), and the obj isn't loading. I can load the obj files that came with your project (plane4, ...).
If you have a minute and could help out, I uploaded the problem obj file here.
If you have a minute and could help out, I uploaded the problem obj file here.

http://officeofthedead.net/guestofmine/tallguy.zip

thanks.

Posted by matt on March 02, 2009 at 08:32 AM MST #

Thank you for your hard work on this code. I have a small change.

I exported an OBJ from Wings3D (Blender frustrated me) and a mesh wouldn't load. Creating a WavefrontOBJScene from that mesh would hang. Here are my changes. Send me an email if you'd like a copy of the mesh.

In the file WaveFrontOBJScene.m on line 410 I changed from:
if([scan scanInt:&vertexIndex]) {
  Vector3D *vertex = (Vector3D*)CFArrayGetValueAtIndex(vertexes, vertexIndex - 1);
  [group addVertex:*vertex atIndex:vertexIndex - (startVertexIndexCount + 1)];
}

To:
if([scan scanInt:&vertexIndex]) {
  Vector3D *vertex = (Vector3D*)CFArrayGetValueAtIndex(vertexes, vertexIndex - 1);
  [group addVertex:*vertex atIndex:vertexIndex - (startVertexIndexCount + 1)];
} else {
  finished = YES;
}

Posted by Nick VanderPyle on April 13, 2009 at 10:58 PM MDT #

Hi Bill,
This would make a nice application for the iPhone. To keep it simple, you could merely allow DiskAid to load obj files to a folder on the iPhone and process them from there. Unfortunately, I do not have a Mac, so I can't do it myself.
Any plans?

Posted by Nat Edgar on April 27, 2009 at 03:54 PM MDT #

Thanks for this Bill.

I've been banging my head against the wall for the last week or two trying to load Blender models. I tried Jeff LaMarche's Python script to export as .h files but couldn't get the texturing to work. I was also looking into using the Blender -> Collada -> POD route before I found your blog entry.

After some tinkering with Blender and playing around with the .obj and .mtl files, I managed to get it working. I had to add the textures to the Blender model to get it to generate the texture coordinates in the .obj file. UV mapping didn't work for me, so I added textured materials to the individual faces. However, the exported .obj file painted the entire model with the last assigned texture. To fix this, I had to change the .obj file to group the textured faces separately from the untextured faces (adding a "g face_x" line before each usemtl call).
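For reference, the regrouped section of the .obj ends up looking something like this (group and material names here are just placeholders):

```
# one group per textured region, so each usemtl applies only to its own faces
g face_1
usemtl brick_material
f 1/1 2/2 3/3
g face_2
usemtl wood_material
f 3/3 4/4 5/5
```

Each usemtl then only affects the faces in the group that follows it, instead of repainting the whole model.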

I am also happy to report that the generated normals work perfectly with lighting enabled.

The only problem is that the model was smoothed in Blender to give nice rounded corners but it looks like that hasn't transferred to the iPhone. Any ideas on how to smooth out the vertices? Also, does this support bump mapping?

Posted by Conor on July 09, 2009 at 09:26 AM MDT #

Hi Conor,

Really glad it's working for you!

On the 'smoothing' front: that is most likely due to Blender averaging the normals, so each vertex has an average normal derived from all the intersecting polygons. Instead of the typical way of calculating the normal (from the vertex and its next two neighbors on its polygon), Blender takes all the polygons that the vertex is part of and averages their normals for each vertex.

Hope this helps!

Posted by Bill Dudney on July 09, 2009 at 09:34 AM MDT #

Hi,
I am making an object in Blender and using the obj file on the iPhone; that is not the problem. The problem is that I want to add some new objects at the vertices of the object I created, like dynamic buttons on the different edges of a cube.
How do I get that to work on the iPhone? Can you help me?
thanks
Naren

Posted by Naren on August 17, 2009 at 10:11 AM MDT #

Hi,

I am working on a research project for school and I need to load .obj files on the iPhone with working UV mapping. I got everything up and running except for (as you say here) the texture coords. I was wondering if you have any leads on what I should try to get it working? Or are there any workarounds? (I need to texture human faces and get things to be pretty precise so they can animate and look OK.)

Thanks for your help!

Posted by Kate Swanson on February 09, 2010 at 11:21 AM MST #

Hi, I'm trying to do lighting along with this code. Unfortunately, when I add the standard lighting calls at any point in the code, glDrawElements gives me an EXC_BAD_ACCESS.

Posted by alex nichol on February 28, 2010 at 09:29 PM MST #

Hi,

I'm trying to use this loader as a base to parse some complex .obj models which don't have normal coordinates.

I don't think this should be a problem when painting, since I have UV coordinates, but the texture doesn't load and all the geometry is completely black.

Any help on which piece of code I should modify?

Thanks!

Posted by ad on August 15, 2010 at 09:41 PM MDT #

Thanks for the code! A very big help. I'm looking at using your loader with OpenGL ES 2.0. Just curious if you or anyone has an example of that. I have something rendering... though the models look fairly broken :(

Thanks

Daniel

Posted by Daniel Kramer on September 22, 2010 at 08:02 AM MDT #

Hi,

What is the license for this source code?

Kind regards,
Samuel

Posted by Samuel Williams on December 12, 2010 at 09:29 AM MST #

Hi!
Thanks for the great loader. I'm new to OpenGL ES and it was really helpful. Unfortunately... I exported a Revit file to DWG, opened it with Blender/Google SketchUp, and then exported to Wavefront obj. Next, I added my files to your loader and got strange results. The project loads the vertexes and normals correctly but does strange things with the materials (I use only materials, without textures). One file loads with some materials on different faces; the other doesn't load materials, but after a click it makes the model green. If you have a minute and could help me out, please: I uploaded the project here.
http://www.sendspace.pl/file/ecd348a3674ed1fe6eb10e5

Thanks for answering, Magda

Posted by Magalena on December 12, 2010 at 09:29 AM MST #

For alpha support, I changed line 200 of WaveFrontOBJMaterial.m from:

color.alpha = 1.0;

to

if(![scan scanFloat:&color.alpha]) {
  color.alpha = 1.0;
}

Thanks for the great work!

Posted by Jeff Crouse on January 09, 2011 at 07:58 AM MST #

Thanks a lot, Bill!

Posted by Cfr on January 08, 2013 at 01:22 PM MST #
