Thoughts on Modeling in VR and How It Will Fit into My Workflow
July 7, 2018

Recently I’ve been experimenting with several modeling tools[1] in VR. So far I’ve managed to create this Polaroid camera in Gravity Sketch.

Modeling a Polaroid Camera in VR

What immediately stands out is how expressive it is to work in VR, mainly due to the spatial tracking of the controllers. In desktop 3D programs you are still interfacing with a 2D input device, like a mouse or tablet, so you can only draw horizontally and vertically across a two-dimensional screen.

VR is Expressive

The spatial tracking in VR lets you draw in depth at the same time. Things that would be quite tedious in desktop 3D - spiraling curves, for example - become incredibly quick, fun and easy to draw. Both Gravity Sketch and TiltBrush[2] are excellent examples of this.
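To make that contrast concrete, here’s a rough sketch (my own illustration, not taken from any of these apps) of what a “simple” spiral usually means on desktop: a parametric helix you end up scripting point by point, instead of just sweeping your hand through space once.

```python
# A hypothetical illustration: on desktop, a clean spiral usually means
# typing out a parametric formula rather than drawing one freehand.
import numpy as np

def helix_points(turns=4, radius=0.1, height=0.5, samples=200):
    """Sample points along a helix: x/y trace a circle while z rises steadily."""
    t = np.linspace(0.0, turns * 2.0 * np.pi, samples)
    x = radius * np.cos(t)
    y = radius * np.sin(t)
    z = np.linspace(0.0, height, samples)
    return np.column_stack([x, y, z])

points = helix_points()
print(points.shape)  # (200, 3) -- in VR this is just one relaxed sweep of the arm
```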

Immersive Canvas

Another thing that’s super refreshing about working in VR is the blank canvas. In the traditional world the blank canvas is seen as something daunting, but in VR you are totally immersed in it. The canvas IS the environment, and there’s no shying away from it.

VR Immersive Canvas

Booting up into a clean scene naturally creates a completely distraction-free zone, allowing you to fully focus on painting or modeling. Generally I’ll spend a couple of hours at a time inside that world, modeling and painting away, until I get physically tired.

While I’m inside, there are no emails, no phone calls, and I can’t check my phone for notifications even if I wanted to. A lot of times I’ll come out of a session with no clue how long I’ve been gone.

Personally I find it pretty wild, in this age of distraction, to be able to have such a peaceful working environment at the touch of a button.

User Interface and Controllers

Scaling Up and Down

It’s interesting to see everyone’s take on the user interface for such a new medium.

All the apps seem to agree that grabbing with both controllers and spreading your arms is the universal gesture for scaling up and down, much like the pinch-to-zoom gesture on phones and tablets. A lot of them also provide great haptic feedback when you use different brushes and tools.
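As a rough sketch of how that gesture tends to work under the hood (my own illustration, not how any of these apps actually implement it): the scale factor is simply the ratio of your current arm spread to the spread at the moment you grabbed.

```python
# Hedged sketch of the two-handed scale gesture; all names are illustrative
# and no specific VR SDK is assumed.
import numpy as np

class TwoHandScale:
    def __init__(self):
        self.grab_distance = None   # controller separation when the gesture started
        self.grab_scale = 1.0       # scene scale when the gesture started
        self.scene_scale = 1.0

    def on_both_grips_pressed(self, left_pos, right_pos):
        """Remember arm spread and scene scale when both grips are squeezed."""
        self.grab_distance = max(np.linalg.norm(np.asarray(right_pos) - np.asarray(left_pos)), 1e-6)
        self.grab_scale = self.scene_scale

    def on_update(self, left_pos, right_pos):
        """Spreading the arms scales the scene up, bringing them together scales it down."""
        if self.grab_distance is None:
            return self.scene_scale
        distance = np.linalg.norm(np.asarray(right_pos) - np.asarray(left_pos))
        self.scene_scale = self.grab_scale * distance / self.grab_distance
        return self.scene_scale

    def on_grips_released(self):
        self.grab_distance = None

# Usage: start a grab with hands 0.4 m apart, then spread to 0.8 m -> scale doubles.
gesture = TwoHandScale()
gesture.on_both_grips_pressed((-0.2, 1.2, -0.3), (0.2, 1.2, -0.3))
print(gesture.on_update((-0.4, 1.2, -0.3), (0.4, 1.2, -0.3)))  # 2.0
```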

However, some apps (e.g. MasterpieceVR) could do a much better job of maximizing the user controls. While you’re drawing with the right hand, the left hand often just floats there, functioning purely as a tool palette, which seems like a waste of interaction potential.

Floating Tool Palette

Right now I feel Gravity Sketch and TiltBrush probably have the most well-thought-out controls and interfaces. A nice touch is that both let you undock your tool palettes and spread them around your virtual workspace, which makes it super quick to change tools.

Marking Menu in TiltBrush

TiltBrush also has a simple implementation of a marking menu that works surprisingly well. Marking menus are great because they allow a beginner to navigate a menu by reading the buttons slowly, while an advanced user can move through those same menus without looking, with simple flicks and gestures.

I imagine that with time, more tools will be developed and these marking menus will become more deeply nested, like Maya’s. Users will then get used to specific arm-waving gestures to quickly navigate tools and menus, while the same menus stay accessible to beginners.
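As an illustration of why that works (a hypothetical sketch, not TiltBrush’s actual implementation): selecting from a marking menu is really just mapping the direction of a flick to a wedge of a circle, which is why labels help beginners while experts can ignore them entirely.

```python
# Illustrative sketch of a radial marking menu: a flick direction picks a wedge.
import math

def pick_menu_item(flick_x, flick_y, items, dead_zone=0.2):
    """Map a 2D flick vector to one of the items arranged evenly around a circle."""
    length = math.hypot(flick_x, flick_y)
    if length < dead_zone:              # flick too small: no selection
        return None
    angle = math.atan2(flick_y, flick_x) % (2.0 * math.pi)
    wedge = 2.0 * math.pi / len(items)  # angular size of each menu wedge
    index = int((angle + wedge / 2.0) // wedge) % len(items)
    return items[index]

tools = ["brush", "eraser", "color", "select", "move", "scale"]
print(pick_menu_item(1.0, 0.0, tools))    # flick to the right -> "brush"
print(pick_menu_item(0.05, 0.05, tools))  # tiny wiggle -> None
```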

Limitations

One limitation I’ve found with VR is its heavy reliance on a perspective view. Since the field of view in VR is closely matched to that of your eye, an isometric view is basically out of the question.

Perspective vs Isometric

An isometric view makes it fast and easy to line things up correctly, especially when you’re doing something precise or mechanical. I actually never imagined I would miss an isometric view, until I realised that you can’t have one in VR.
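For the curious, here’s a minimal sketch (my own example, not tied to any of these apps) of why the perspective view makes lining things up harder: two points at the same sideways offset but different depths land at different screen positions under a perspective projection, while an orthographic projection (of which isometric is a special case) keeps them perfectly aligned.

```python
# Two points in camera space, same x offset, different depths (z points out of the screen).
near_point = (1.0, 0.0, -2.0)
far_point  = (1.0, 0.0, -6.0)

def project_perspective(p, focal=1.0):
    x, y, z = p
    return (focal * x / -z, focal * y / -z)   # divide by depth

def project_orthographic(p):
    x, y, z = p
    return (x, y)                             # depth is simply dropped

print(project_perspective(near_point), project_perspective(far_point))   # ~(0.5, 0.0) vs ~(0.167, 0.0)
print(project_orthographic(near_point), project_orthographic(far_point)) # (1.0, 0.0) vs (1.0, 0.0)
```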

Furthermore, it’s surprising how physical and sweaty it can get after a couple of hours of standing and waving your arms around. That’s not necessarily a bad thing though, because it forces me to take regular breaks, and the feeling of having created an artwork while being both mentally and physically tired is actually really satisfying.

Conclusion

Obviously the tools and hardware still have a long way to go, but for now I’ll use these VR apps to come up with different concepts. The expressive nature of the medium allows me to quickly try things out and iterate.

Although it’s questionable whether VR will ever completely replace modeling on desktop, I can see them working really well in unison, complementing each other’s strengths and weaknesses.

On a final note, it’s interesting to see that right now the VR modeling space is being dominated by the tech giants (Facebook/Oculus and Google), while there’s not a peep from any of the classic 3D companies. Hopefully we’ll see ZBrush move into this space soon.

Notes

  1. ^ | I’ve gone through all the ones I could find on Steam: Gravity Sketch, MasterpieceVR, Kodon, Unbound, and Google’s TiltBrush and Blocks. Since I’m on a Vive I haven’t been able to try Oculus’ Medium yet, but I’ve found a Revive package that should allow me to run Oculus apps on my Vive.

  2. ^ | Furthermore, last week TiltBrush added a Hull Brush which finally allows the creation of closed volume meshes by essentially generating a convex hull around your stroke. Although the tool is very basic, it’s a nice showcase of this third axis in VR, as you would never be able to create a mesh like this with one stroke on a desktop (again, because your input would be limited to two axes). A rough sketch of the idea follows below.
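Here’s that sketch of the Hull Brush idea as I understand it (my own illustration using SciPy, not TiltBrush’s code): wrap the 3D points sampled along one stroke in a convex hull and you immediately get a closed volume mesh.

```python
# Hypothetical sketch: convex hull around one 3D stroke -> closed, watertight mesh.
import numpy as np
from scipy.spatial import ConvexHull

# Pretend these are controller positions sampled while sweeping a single stroke.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0 * np.pi, 120)
stroke = np.column_stack([np.cos(t), np.sin(t), 0.1 * t]) + 0.02 * rng.standard_normal((120, 3))

hull = ConvexHull(stroke)
print(hull.simplices.shape)  # (n_faces, 3): triangle indices of the closed hull mesh
```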
