
Automated Shapespark Scene Generation

Introduction

While working with Prompto, my biggest task was generating a Blender scene from an Unreal level. Even though both software packages share the same abstract concepts, such as lights, materials and objects, the implementations are very different. I had to reverse engineer the Unreal level down to its core in order to rebuild it from scratch in Blender, because of the many differences between the underlying applications.

I designed and implemented this whole data pipeline in close collaboration with the creators of the original levels, which allowed me to identify use cases and edge cases more easily.

Once this data pipeline was set up, a lot of opportunities arose to build on top of what we had created. One of the ideas that came out of a whiteboard session was the automated generation of a Shapespark scene (another was automated rendering).

Shapespark in a nutshell

Shapespark is another 3D software package, meaning the same abstract concepts (geometry, lights, materials, …) are implemented as well, but in different ways. The biggest contrast was in the materials department: there is no support for shading graphs. This means you can only assign plain values or bitmaps, without doing any operations or post-processing on them.

Examining the problem

This is a very destructive way of working, but it saves enormous amounts of computing time. The choice definitely makes sense, and it was something I had to deal with. Luckily, Blender handles this quite well: it has functionality that allows you to bake textures to disk.

There are some things to keep in mind when doing this, but I won’t bore you with the details. The main thing to remember is that this method uses the Cycles render engine, and thus a complex path tracing algorithm. This slowed the program down far more than necessary, since we didn’t need any fancy calculations; I just wanted the color values per pixel so that Shapespark could do the actual rendering.
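For context, below is a minimal sketch of what this stock baking route looks like when driven from Blender’s Python API (bpy). It is illustrative only, not the pipeline code itself; the object name, image size and output path are placeholders.

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'          # baking only works with Cycles

obj = bpy.data.objects['Wall_01']       # placeholder object to bake
mat = obj.active_material
nodes = mat.node_tree.nodes

# Image the bake result will be written into.
bake_img = bpy.data.images.new('Wall_01_diffuse', width=2048, height=2048)

# Cycles bakes into the *active* Image Texture node of the material.
tex_node = nodes.new('ShaderNodeTexImage')
tex_node.image = bake_img
nodes.active = tex_node

# The object to bake from has to be selected and active.
bpy.context.view_layer.objects.active = obj
obj.select_set(True)
bpy.ops.object.bake(type='DIFFUSE', pass_filter={'COLOR'}, use_clear=True)

# Save the baked color map to disk so Shapespark can use it.
bake_img.filepath_raw = '//bakes/Wall_01_diffuse.png'
bake_img.file_format = 'PNG'
bake_img.save()
```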

Another restriction of this method is that the baking process uses the UV coordinates of the object you want to bake the shader(s) from. Assigning multiple material slots to a single object also led to complications. All in all, quite a struggle for a rather poor and slow result.

Solving the problem

I did want to dive deep into this workflow, and I ended up writing all the necessary logic for it, since this was the most correct way of baking the textures. However, the implementation carried a lot more overhead and complexity than we actually needed.

After discussing it with the PM, we agreed that I could spend one day on R&D to see if I could implement the whole baking flow myself. Before noon I sent over a proof of concept and got his consent to build it out completely. The effort was tightly monitored though, since we already had a working solution and this was strictly an optimization.

Conclusion

In the end I was able to lower the duration of a 2048×2048 diffuse bake from 160 seconds to 1.5 seconds using NumPy arrays and the blend_modes library, without losing quality.
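To give an idea of the approach, here is a simplified sketch of compositing two texture maps directly as arrays instead of rendering the node graph through Cycles. The file names and the multiply blend are made up for this example.

```python
import numpy as np
from PIL import Image
from blend_modes import multiply

def load_rgba(path):
    # blend_modes expects float RGBA arrays with values in the 0..255 range
    return np.array(Image.open(path).convert('RGBA')).astype(float)

base = load_rgba('wood_basecolor.png')   # placeholder base color bitmap
tint = load_rgba('wood_ao.png')          # placeholder map multiplied on top

# Emulate a Multiply node from the shading graph as a pure array operation,
# instead of path tracing the whole material with Cycles.
result = multiply(base, tint, opacity=1.0)

Image.fromarray(np.uint8(result)).save('wood_baked_diffuse.png')
```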

I could also use this logic to skip baking certain images entirely. Sometimes the graph did little or no processing at all, which could now be detected. The average level or apartment uses about 250 unique texture maps, so the time and money saved added up quickly.
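As an illustration of that check, the sketch below (a hypothetical helper, assuming bpy material data) looks at whether an Image Texture node feeds the Base Color input directly; if so, the original bitmap can be reused as-is instead of being baked.

```python
def passthrough_image(material):
    """Return the source image if the graph does no real processing, else None."""
    tree = material.node_tree
    if tree is None:
        return None
    bsdf = next((n for n in tree.nodes if n.type == 'BSDF_PRINCIPLED'), None)
    if bsdf is None:
        return None
    links = bsdf.inputs['Base Color'].links
    if len(links) == 1 and links[0].from_node.type == 'TEX_IMAGE':
        return links[0].from_node.image   # no processing in between: reuse the file
    return None
```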

In case my custom method failed, I fell back to the path tracing method.

If you would like to know more about this project, please contact me and we can talk about it in more detail.
