Speeding up my Substance to Redshift Workflow
November 14, 2017

Over the last few weeks I’ve been looking deeper into speeding up some of my workflows. Essentially, I’d like a frictionless pipeline where I can go from concept to render as quickly as possible.

The Substance Route (Quick and Dirty)

One of the quickest ways to do this is to start out in ZBrush, take the model into Substance Painter, texture it, and render it directly from there. Then you can just hit the integrated ArtStation button, and your render is online, ready to share.

[Image: workflow: ZBrush → Substance Painter]

It’s hard to argue with the speed of this workflow, but as I was working towards finishing my mac, I noticed some issues with the quality of the renderer inside Painter. For some reason it renders the normals incorrectly as concave.[1]

[Image: Substance Painter render issues]

The Houdini Route (Long and Sweet)

So to avoid this issue, I can export everything, take it into Houdini, and render with Redshift instead.

[Image: workflow: ZBrush → Substance Painter → Houdini → Redshift]

The main drawback here is that it takes a considerable amount of extra time to set up. But on the flip side it offers a lot more flexibility: you can actually compose a scene, add extra lights, and tweak shaders, and ultimately Redshift yields much better results.

Best of Both Worlds

I figured I could take all the great render tools I found inside Substance and recreate them in Houdini. If I could also speed up the time it takes to set up my materials and textures, I would have the best of both worlds!

To speed this up, I wanted to be able to point at a single folder on my hard drive and automatically generate and load all the relevant textures onto a Redshift material.
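The folder-scanning half of that idea can be sketched in plain Python. The channel names and Redshift slot names below are assumptions based on common Substance Painter export presets, not the actual internals of my asset:

```python
import re
from pathlib import Path

# Assumed Substance Painter export suffixes mapped to Redshift material
# slots -- adjust these to match your own export preset.
CHANNEL_SUFFIXES = {
    "basecolor": "diffuse_color",
    "roughness": "refl_roughness",
    "metallic": "refl_metalness",
    "normal": "bump_input",
    "height": "displacement",
}

def find_pbr_textures(folder):
    """Scan a folder and map each PBR channel to its texture file."""
    textures = {}
    for path in Path(folder).iterdir():
        if path.suffix.lower() not in (".png", ".exr", ".tif", ".jpg"):
            continue
        stem = path.stem.lower()
        for suffix, slot in CHANNEL_SUFFIXES.items():
            # Match e.g. "helmet_basecolor" but not "basecolored_thing"
            if re.search(rf"(^|[_.]){suffix}([_.]|$)", stem):
                textures[slot] = str(path)
    return textures
```

From there it’s a matter of creating one RS Texture node per discovered channel and wiring it into the right material input.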

I wrangled together an initial test with VEX, but quickly realised that I needed more control and had to create a Houdini Digital Asset (HDA) for this.[2] By creating a digital asset, I could use Python to do a lot of the material setup and texture updating.

PBR Material Manager

In the end I created an RS PBR Material Manager node. It builds the whole Redshift material network for me and assigns all the relevant textures from just one texture file path. As you update the file path on the asset, it automatically updates the paths on the RS Texture nodes in the background.
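The one-path-drives-everything trick boils down to token substitution. Here’s a minimal sketch of that idea; the channel names are my own assumptions, not the asset’s actual internals:

```python
import re

# Assumed channel tokens as they appear in Substance Painter filenames.
CHANNELS = ["basecolor", "roughness", "metallic", "normal", "height"]

def sibling_paths(any_channel_path):
    """From the path of one texture map, derive the paths of every other
    channel by swapping the channel token in the filename."""
    pattern = "|".join(CHANNELS)
    return {
        channel: re.sub(pattern, channel, any_channel_path,
                        count=1, flags=re.IGNORECASE)
        for channel in CHANNELS
    }
```

Inside the HDA, a parameter callback would run this whenever the file path parm changes and push each derived path onto the matching RS Texture node.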

Dome with Ground

One thing I really liked about Substance Painter’s renderer was the way it allows you to flatten the bottom part of an HDRI environment sphere.[3]

When you realize the center line of an environment sphere is naturally the horizon, it’s quite a neat trick. By collapsing the whole bottom part of the sphere to a flat plane, you quickly create a photorealistic ground plane for your object to live on. Add a little bit of depth of field to blur out some of the background, and it’s easily the quickest way to create a plausible environment for your models.

Provided your HDRI is high-res enough, there’s no immediate need for extra backplates, ground planes or compositing afterwards.

I started out mimicking the behavior of the Substance Dome Ground, but ended up adding some extra features that give me more control over the shape of the dome, the ground, the texture scale, and how the sky blends into the horizon.

DOF Measurement

Finally, one super neat trick in Substance Painter is the one-click depth of field setup. You can slide open the aperture on the camera and CTRL + Middle Click to set the camera’s focus distance.

I managed to create a digital asset that builds a locator in space and calculates the distance between the locator and the camera. It automatically adds all the relevant Redshift camera parameters and links them up to this distance parameter.
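The core of the asset is nothing more than a distance calculation feeding a focus-distance parameter. Here’s a sketch of that idea; the parameter names in the returned dict are illustrative placeholders, not the actual Redshift camera parm names:

```python
import math

def dof_settings(camera_pos, locator_pos, coc_radius=0.05):
    """Compute the camera-to-locator distance and return the values to
    push onto the Redshift camera. The keys below are assumptions --
    substitute the real spare-parameter names on your camera OBJ."""
    distance = math.dist(camera_pos, locator_pos)
    return {
        "RS_campro_dofEnable": 1,                 # assumed enable toggle
        "RS_campro_dofFocusDistance": distance,   # assumed focus-distance parm
        "RS_campro_dofCoCRadius": coc_radius,     # assumed aperture/CoC parm
    }
```

In the HDA this runs through a parameter expression, so the focus distance re-evaluates automatically whenever either the camera or the locator moves.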

Although I can’t interactively click to tweak my focus area, the upshot is that I can set up my focal point once, and as I move the camera around the scene, my object always stays in focus.

Conclusion

I’m really happy with these assets. The PBR Material Manager in particular has proven to be a huge time saver. In a PBR workflow, selecting textures and building material networks was easily the most frustrating and time-consuming part of the whole process.

Now that I’ve figured out how to use Python from inside an asset, it’s sparked my imagination for a few other assets that would be cool to have.

Notes

  1. ^ | Allegorithmic Forum: IRay Normals Issue. I raised the issue with Allegorithmic, but it turns out it’s an error in the Iray renderer. As of writing this post, the bug is still marked as backlogged.

  2. ^ | Redshift Forum: Expose Filename Input on RS Texture Node. My initial solution was to wrangle the file path in VEX and apply the texture paths as a detail attribute. I would create one global Redshift material, where the texture nodes automatically pick up the detail attributes set for the file paths. I had to abandon this approach because it turns out Redshift doesn’t support string attributes yet. But let’s hope for the future; I still think this could be an elegant solution.

  3. ^ | HDRI Haven. Speaking of HDRIs, the videos in this post were created using HDRI Haven’s Konzerthaus map. They provide a library of 100% free HDRIs. Consider supporting them on Patreon.
