Instancing Sprite3D / Drawing the same 3D object 400 times in a scene

Say I need to draw 400 copies of the same simple Sprite3D, with the same texture and vertices (from a .obj file), but different colors and locations, what’s the best way to do this?

Doing this naively (400 Sprite3D objects, each loading the .obj and texture) results in 400 GL draw calls (or fewer, depending on how many are visible at once), which is fine on my desktop but nowhere near workable on my reasonably powerful phone (I get ~14 fps on my HTC One M7).

Is there a better way to do this? Instancing, for example? How can I set this up correctly in cocos2dx?


I have the same problem, and it bugs me a lot. I've also noticed that I end up with almost double the vertex count for a single object: Blender reports, for example, 70 vertices, but cocos2d-x shows around 150.


If your device is Android, please modify *.mk to add armv7a; that will use the CPU's FPU and run faster.

70 vertices means the mesh itself has 70 vertices; the 150 reported by cocos2d-x is the GL draw count. If you take the number of triangles (faces) of the mesh in Blender and multiply by 3, the result is the number the GL draw uses. I think your mesh has 50 triangles, hence 150.


Did anyone figure out how to solve it? Without proper instancing/batching features, Cocos3D seems pretty unusable.


Hi guys. Any news about it in 2016?

You'll probably have to roll your own mesh-merging algorithm for now. It's actually not too complicated: a mesh is just a bunch of triangles/quads (vert, uv, color, normal) plus textures, uniforms, shaders, etc. Loop through your sprites, pull out the mesh triangles, and then create a single triangle command that includes all 400 sprites' triangles. It's similar to what particle systems do by necessity, so check out the PU (3D) and 2D particle systems in the engine source. The obvious caveat is that all merged sprites need to use the same shader, textures, uniforms, and whatever else a single GL call requires, though you can, for example, merge two different textures into one atlas, and shaders can be combined in some instances as well.

At least that’s the gist.
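The gathering loop above can be sketched in plain C++. The struct names below (`Vec3`, `Vertex`, `SpriteMesh`) are illustrative stand-ins, not cocos2d-x types, and only a world translation is applied where a full version would use the whole 4x4 model matrix:

```cpp
#include <cassert>
#include <vector>

// Minimal stand-ins for engine types; names are illustrative, not cocos2d-x API.
struct Vec3 { float x, y, z; };
struct Vertex { Vec3 pos; float u, v; unsigned rgba; };

// Shift a vertex into world space by the sprite's world position
// (a full version would multiply by the sprite's 4x4 model matrix).
static Vertex toWorld(const Vertex& in, const Vec3& worldPos) {
    Vertex out = in;
    out.pos.x += worldPos.x;
    out.pos.y += worldPos.y;
    out.pos.z += worldPos.z;
    return out;
}

struct SpriteMesh { std::vector<Vertex> triangles; Vec3 worldPos; };

// Gather every sprite's triangles into one big vertex array that a single
// triangle command (one GL draw) can consume.
std::vector<Vertex> mergeTriangles(const std::vector<SpriteMesh>& sprites) {
    std::vector<Vertex> merged;
    for (const auto& s : sprites)
        for (const auto& v : s.triangles)
            merged.push_back(toWorld(v, s.worldPos));
    return merged;
}
```

The output array would then be handed to one draw command, provided every sprite shares the same shader, texture, and uniforms as described above.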

You may want to look into GPU instancing as well, but that’ll require GL ES 3.0 afaik, or at least a GL extension.
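For reference, here is roughly what GPU instancing looks like on GL ES 3.0. The shader and the data-packing code are a sketch: the attribute names and locations are my own choices, and the GL calls mentioned in comments (`glVertexAttribDivisor`, `glDrawElementsInstanced`, both real ES 3.0 functions) are only described, not executed, since they need a live GL context:

```cpp
#include <cassert>
#include <cstring>
#include <vector>

// GLSL ES 3.00 vertex shader: per-instance offset and color arrive as vertex
// attributes whose divisor is set to 1 (one value per instance, not per vertex).
static const char* kInstancedVS = R"(#version 300 es
layout(location = 0) in vec3 a_position;       // shared mesh vertex
layout(location = 1) in vec3 a_instanceOffset; // per-instance, divisor = 1
layout(location = 2) in vec4 a_instanceColor;  // per-instance, divisor = 1
uniform mat4 u_viewProj;
out vec4 v_color;
void main() {
    v_color = a_instanceColor;
    gl_Position = u_viewProj * vec4(a_position + a_instanceOffset, 1.0);
})";

// Pack per-instance data (xyz offset + rgba color) into one tight buffer.
// At draw time (not shown, needs a GL context) you would upload this buffer,
// call glVertexAttribDivisor(1, 1) and glVertexAttribDivisor(2, 1), then
// glDrawElementsInstanced(GL_TRIANGLES, indexCount, GL_UNSIGNED_SHORT, 0, 400).
struct Instance { float offset[3]; float color[4]; };

std::vector<float> packInstanceData(const std::vector<Instance>& instances) {
    std::vector<float> buf;
    buf.reserve(instances.size() * 7);
    for (const auto& i : instances) {
        buf.insert(buf.end(), i.offset, i.offset + 3);
        buf.insert(buf.end(), i.color, i.color + 4);
    }
    return buf;
}
```

The mesh's vertex and index buffers are uploaded once and shared by all 400 instances; only the small per-instance buffer changes per frame.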

Maybe cocos2d-x will support this in the future.

Hi stevetranby, any chance you could fill in a little more detail on how to merge meshes? I've just been studying ParticleBatchNode:insertChild and hoping that isn't the level at which this has to be done. To make it more difficult, whatever I end up doing needs to be driven from Lua.

I'm even wondering whether the easiest approach might be to read in the c3t file for the mesh as raw JSON structures and perform the merging at that level. I hope not… horrible approach.

It would be great if proper auto-batching were on the to-do list and we could all just ignore this problem.

Mesh Instancing

First off, I'd check whether instancing is supported on all your target devices. Instancing should be less work and is a better fit for the situation where you want to draw a single mesh many, many times.
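One way to do that check at runtime is to inspect the strings GL hands back. This sketch takes the values returned by `glGetString(GL_VERSION)` and `glGetString(GL_EXTENSIONS)` as plain inputs (so it needs no GL context here); GL ES 3.0+ has instanced drawing in core, while on ES 2.0 you would look for an extension such as `GL_EXT_draw_instanced` (plus `GL_EXT_instanced_arrays` for per-instance attributes):

```cpp
#include <cassert>
#include <cstring>

// Decide whether instanced drawing is available, given the strings that
// glGetString(GL_VERSION) and glGetString(GL_EXTENSIONS) return at runtime.
bool supportsInstancing(const char* version, const char* extensions) {
    // ES version strings look like "OpenGL ES 3.0 ..." — check major >= 3.
    const char* es = std::strstr(version, "OpenGL ES ");
    if (es && es[10] >= '3' && es[10] <= '9')
        return true;
    // Fall back to checking for the ES 2.0 instancing extension.
    return extensions && std::strstr(extensions, "GL_EXT_draw_instanced") != nullptr;
}
```

On devices where this returns false (a lot of 2014-era Android hardware), mesh merging below is the fallback.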


Mesh Merging

Well, you'll end up writing something akin to auto-batching, since you can only merge meshes that could have been auto-batched if that feature existed. Manual merging requires less code because you won't have to write the part that checks for matching shader state.

If you have static meshes (level data) that never move, then you'll probably want to merge those at load time (or, if you prefer, merge them in the 3D program or otherwise outside of cocos2d entirely). This could be done at the JSON→mesh level, or the same way you'd handle dynamic meshes.

You essentially just combine all the vertex and index buffer data into one long array instead of many arrays. However, you have to make sure they're all in the same "transform space": either all the verts are relative to one mesh's local-space origin, or all verts are relative to the model/view origin.

Combining the vertex/index buffers isn't all that difficult, but you have to be aware of the notes below, and I'm sure you'll run into unforeseen issues along the way.
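Concretely, the buffer merge looks something like this. The key detail is that each mesh's indices must be shifted by the number of vertices already in the merged buffer, or they'd point into the wrong mesh's vertices. The vertex layout here is a bare xyz float triple for brevity (a real one carries uv, color, normal), and the verts are assumed to already be in a shared space per the note above:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

struct Mesh {
    std::vector<float>    verts;   // xyz triples, already in a shared space
    std::vector<uint16_t> indices;
};

// Append every mesh's vertex and index data into a single pair of buffers,
// offsetting each mesh's indices by the vertices merged before it.
Mesh mergeMeshes(const std::vector<Mesh>& meshes) {
    Mesh out;
    for (const auto& m : meshes) {
        const uint16_t base = static_cast<uint16_t>(out.verts.size() / 3);
        out.verts.insert(out.verts.end(), m.verts.begin(), m.verts.end());
        for (uint16_t i : m.indices)
            out.indices.push_back(static_cast<uint16_t>(base + i));
    }
    return out;
}
```

The merged pair is then uploaded as one VBO/IBO and drawn with a single call.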

Notes:

  • All merged vertices need to be transformed by the same model/view matrix (and the local matrix too, if you want to do that multiply on the CPU instead of in the GPU shader)
  • All merged vertices must share the same shader state, including the same uniforms (textures, transform matrices, etc.)
  • For dynamic meshes whose positions change at different rates, the merged buffers must be rebuilt every frame, which may negate the benefits of merging in the first place
  • All merged vertices and indices must fit within OpenGL's maximum buffer element counts
  • There are probably other issues I haven't thought of
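On the element-count point: with 16-bit (`GL_UNSIGNED_SHORT`) indices, which is the safe choice on ES 2.0 hardware, one merged buffer can address at most 65536 vertices, so a large scene may still need several draw calls. A small sketch of the batch arithmetic, under that 16-bit assumption:

```cpp
#include <cassert>
#include <cstddef>

// With GL_UNSIGNED_SHORT indices a merged buffer can address at most
// 65536 vertices; compute how many GL draws a merged scene still needs.
constexpr std::size_t kMaxVerts16 = 65536;

std::size_t drawCallsNeeded(std::size_t vertsPerMesh, std::size_t meshCount) {
    const std::size_t perBatch = kMaxVerts16 / vertsPerMesh; // meshes per draw
    return (meshCount + perBatch - 1) / perBatch;            // ceiling divide
}
```

For the numbers in this thread (~150 GL verts per mesh, 400 copies, i.e. 60000 vertices total), everything fits in a single 16-bit-indexed draw.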

Thanks for the info. Sounds like more hassle than it's worth for just my game. I was hoping it might be possible at the Mesh-object level, just enumerating the triangles and copying them transformed, but it sounds like I'd need more in-depth knowledge of OpenGL. Perhaps it would be worth it if I could implement full auto-merging and solve the problem for everyone.

Instancing sounds really useful; it would solve the problem and I guess wouldn't require much of an API change, perhaps just the addition of a method to set the number of instances, because presumably everything else would be done via the shaders.

I’ll mull it over thanks.


Ah, I think I may have a trick to do it through the current interfaces. In Blender I can build a model that contains several copies (say 10) of the mesh, all at the same location and combined into one object, but each attached to a different bone.
I can then use bone animation to position each copy. For my game, the instances are placed on an 8x8x4 grid. It possibly makes sense to create 8x8x4x10 animations, each positioning one bone at one grid location. Then I can load 10 of them to position the 10 instances on any grid points. I can use multiple copies of the whole thing, one for each group of 10 required instances, and any spare ones in a group of 10 can be placed outside the viewing frustum.

That assumes cocos2d-x supports multiple simultaneous bone animations.
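The grid bookkeeping for that trick is simple to sketch. This maps a linear instance index onto the 8x8x4 grid described above to pick which pre-authored bone animation to run; the grid dimensions come from the post, everything else (names, layout order) is my own illustration:

```cpp
#include <cassert>

struct GridPos { int x, y, z; };

// Map a linear instance index 0..255 onto the 8x8x4 grid: 8 columns (x),
// 8 rows (y), 4 layers (z). Each cell corresponds to one zero-duration
// bone animation that snaps a bone (i.e. one instance) to that cell.
GridPos gridPosition(int index) {
    return GridPos{ index % 8, (index / 8) % 8, index / 64 };
}
```

Multiplying each component by the grid's cell spacing gives the translation the corresponding animation should bake in.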

Just to report back that this worked. There were a few fiddles to get it going. I had trouble with the bounding box: the bones were moving parts of the multiple-instance model far outside the rest-pose bounding box, and I could find no way to set it programmatically, so in Blender I had to move one instance to the far corner of the volume I intended to move them within.

Secondly, multiple concurrent bone animations don't seem to be supported, but I was able to chain them using the Sequence constructor. Since I'm using zero-duration animations to position the instances, this had no disadvantages.