Siggraph Opens My Ears to VR

I’ve been in Vancouver most of this week and we’ll have the report on the Siggraph event for you next week. It was interesting and was my first attendance at the event. The show is full of seriously technical sessions for the content creators who make the visual effects and animations that come out of Hollywood, as well as the design and games industries. There are also plenty of the companies that provide the software and hardware tools they need.

This year seemed a particularly exciting one, with a big announcement by Nvidia of real-time ray tracing, one of the ‘holy grails’ of computer graphics, at an accessible price – at least for professionals. It’s only a matter of time before the technology moves down to the consumer space, and that will enable some astonishing realism in games and interactive entertainment, including virtual reality.

As I wrote a couple of weeks ago, VR is, no question, in the ‘trough of disillusionment’, when everything seems too difficult. However, after Siggraph, it’s clear to me that the technology will climb to the ‘plateau of productivity’. There have been tremendous advances in the last couple of years and I was very impressed with the latest StarVR headset (more in the report). I still wouldn’t want to wear the headset for long periods for fun, but then again, I wouldn’t want to wear a lot of the headsets and equipment that others have to wear and carry in their work.

The point is that if you can exploit the immersiveness of VR in your work, it is becoming accessible – and with high quality – at prices that will allow good ROI for professionals. Consumer applications in high volume may still be a little way away, but it seems inevitable that the technology will migrate down, in time.

Another thing that impressed me at the event was the ‘audio ray tracing’ that Nvidia was showing. Only a couple of weeks ago, after our visit to Antycip, I wrote that few users of simulation use audio because of the difficulty of keeping the audio spatially in sync with the visuals. (Antycip Shows Us its UK Demo Suite)

Nvidia’s VRWorks Audio

At Siggraph, Nvidia showed me how the company can not only locate sound sources in space and synchronise them with the motion of a VR user, but can also modify the audio to reflect the reverberation of the virtual spaces the user is in. I could hear a clear difference between a room with shiny surfaces and one with soft surfaces, hear the change in the audio as I went through a door, and hear the gradual change as a sliding door closed. Because the partition was made of glass, I could still hear something through it.
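To give a feel for why shiny and soft surfaces sound so different, here is a deliberately simple sketch – not Nvidia’s VRWorks Audio implementation, and the absorption coefficients are illustrative guesses – of the basic idea behind geometric acoustics: each time a sound ray bounces off a surface, the surface absorbs a fraction of its energy, so hard, reflective materials sustain a reverberant tail far longer than absorbent ones.

```python
# Toy model of material-dependent reverberation (illustration only,
# not the VRWorks Audio API). Each reflection keeps (1 - absorption)
# of the sound energy; a 'shiny' room with low absorption stays
# audible for many more bounces than a 'soft' one.

def reflected_energy(absorption: float, bounces: int) -> float:
    """Fraction of the original energy left after a number of reflections."""
    return (1.0 - absorption) ** bounces

def bounces_until_inaudible(absorption: float, threshold: float = 1e-3) -> int:
    """Count reflections until the echo falls below an audibility threshold."""
    energy, bounces = 1.0, 0
    while energy > threshold:
        energy *= (1.0 - absorption)
        bounces += 1
    return bounces

# Hypothetical absorption values chosen for illustration.
SHINY_TILE = 0.02   # hard, reflective surface: long reverb tail
SOFT_FABRIC = 0.60  # absorbent surface: the room sounds 'dry'

print(bounces_until_inaudible(SHINY_TILE))
print(bounces_until_inaudible(SOFT_FABRIC))
```

A real system traces many such rays through the room geometry and also accounts for path length, occlusion (the glass partition in the demo) and frequency-dependent absorption, but the energy-per-bounce idea above is the core of why the material change is audible.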

This kind of simulation would be great for architects to use to check the effect of different design decisions, such as the use of different materials and other design details. The technology should also be incredibly useful in simulators to add to the realism.

The technique uses the company’s big new RTX GPU, so it is not going to be for everyone, but it is very impressive. It would be great if building designers could actually take the acoustic qualities of their buildings into account on the ‘drawing board’. After all, we must all have been in places where, for example, public announcements simply couldn’t be heard, or environments with so much harsh noise that they are simply not comfortable to be in for any length of time. Combined with the dramatic improvements given by live rendering, architects should be very excited by the latest developments.

I should have mentioned that the demonstration took place in an online VR world with others also present as collaborators. That means real-time collaboration, regardless of geography.

Anyway, we’ll get the report out over the next few days on the website and next week in the newsletters.

Bob