The biggest buzzword this year at Siggraph – the annual computer graphics and interactive techniques conference – was realtime.
It’s already been a large part of previous Siggraphs, but this time it seemed to be a part of everything on display: in animation, visual effects, virtual reality (VR), augmented reality (AR), robotics, gaming and of course in just about all the live experiences.
And live experiences were a major drawcard of the show, this year held in downtown Los Angeles. Many of the 16,500 attendees crowded the VR Village, Emerging Technologies and Art Gallery areas to get a glimpse of the future and ‘play’ with what was on offer. Other attendees were inspired by literally scores of feature film VFX and animation presentations (the conference has a long association with Hollywood), and by a raft of academic peer-reviewed material.
I came away from Siggraph greatly inspired by the innovations on display, and encouraged by what feels like a current resurgence in the visual effects industry.
Here’s a rundown of my conference highlights.
Real-time is the real deal
At last year’s Siggraph, Epic Games showcased its advancements in realtime rendering with Unreal Engine and live production via a demo from the game Hellblade. This year they brought their Human Race car commercial demo to the show floor for Real-Time Live! – a collaboration with The Mill, whose Cyclops virtual production toolkit and Blackbird tracking car were on hand for people to see, touch and witness how realtime might change VFX production forever.
See all the action from Real-Time Live! in the video below.
The Human Race demo has been thoroughly reported on already, but there’s nothing like seeing it actually work right in front of your eyes. I sat down to discuss where realtime was at with Epic Games CTO Kim Libreri, who was mostly excited about the cross-over between what was being achieved in games and how that could be translated into live action visual effects production.
“Both areas use the same tools,” Libreri says. “We almost talk the same language, the only difference is the ‘millisecond’ – realtime. You have to engineer things to run realtime. I think it’s still the biggest obstacle between the two industries. In VFX, there’s not a big penalty for over-modelling something or putting in too many textures because, as long as it gets rendered before you come into work the next day, nobody’s going to complain. But in realtime it can be make or break.”
Another highlight of Real-Time Live! turned out to be a demonstration called Direct 3D Stylization Pipelines from Santiago Montesdeoca at Nanyang Technological University in Singapore. His research has been looking at stylising 3D modelled objects – in realtime – with watercolour-like features, almost as if they were actually painted with real brushes. Check it out in the video below.
“Painting with traditional media happens in realtime,” Montesdeoca told me in an interview, “so I believe we should aspire to nothing other than realtime performance when proposing a system to emulate it. Even if just a stylisation preview is presented, the ability to art-direct in realtime makes for a much faster, effective and intuitive art-creation.”
A glimpse at the future of VR entertainment
Some conferences that include virtual reality demos run into the problem of long line-ups just to see the VR experiences. That’s because each person needs to wear a headset, and these are usually in limited supply. That’s where Siggraph this year was different, with its innovative VR Theater as part of the Computer Animation Festival. Here, multiple headsets were available for attendees to watch a selection of VR films together in a kind of communal viewing.
The Theater was a hit, with tickets almost impossible to acquire (the daily line-up for them was reportedly unprecedented). Added to the experience was that several of the directors of these shorts showed up for presentations and a special director Q&A, including Jorge Gutierrez, who was behind Google Spotlight Stories’ Son of Jaguar.
Having the directors there also meant being able to hear about the challenges of combining the creative with the technical. Gutierrez, who helmed the animated The Book of Life and is also attached to a future LEGO-related movie, admitted that directing for the VR realm was hard.
“We’ve been spoiled in cinema with framing and cutting,” Gutierrez told me. “So not only is it harder to make a film that needs realtime rendering, but now all those tricks and all those things that you spent your whole life trying to learn – well, now you can’t use any of that stuff.” (Spoiler: In Son of Jaguar, Gutierrez manages to use VR to tell this story of a Mexican wrestler with only one leg in a fascinating way).
Walking around the show floor at Siggraph can yield the most unexpected of experiences. One minute you might be hearing from an accomplished researcher on the latest in light fields, and the next minute you’re entwined in a story that won’t be told anywhere else from a veteran visual effects supervisor on the use of CG in a recent Hollywood film.
Then there are the highly interactive and engaging experiences you can have in Emerging Technologies and VR Village. One that caught my eye – several times – was Merge VR’s Merge Cube. Coupled with the company’s soft VR goggles, the Merge Cube is a markered cube you hold in your hand which then transforms into graphics, patterns, animated pieces and even a Rubik’s Cube through AR technology. It really was like holding a hologram and, because so many different types of pop-up graphics were available to check out, this felt like an experience that could easily become mainstream.
The rise and rise of side events
Here’s a hot tip about attending Siggraph: you’re never going to be able to get to everything. Even the best-laid plans will go astray, and in fact sometimes the best thing about the conference can be catching up with old friends or making new contacts.
But one current trend to consider, while working out what to see, is the presence of several side events run by CG software companies or Siggraph-related entities. It’s at these events that you’ll pick up things rarely presented or published elsewhere. Take DigiPro, for example, held on the day before Siggraph gets underway. It’s a full day of digital technologist presentations to a very ‘in’ crowd of VFX and animation industry people, where frank discussion is common.
Then there are full days or multiple days of scheduled talks from the likes of Chaos Group (V-Ray renderer), The Foundry (NUKE, MARI, KATANA), Autodesk (Maya, Shotgun, Arnold), Side Effects Software (Houdini) and Allegorithmic (Substance Painter and Designer). Several companies like Epic Games (Unreal Engine), Pixar (RenderMan) and Isotropix (Clarisse) also held intense user group meetings. There was even an evening visit to the California Science Center where attendees got to check out the Space Shuttle Endeavour – a very neat side event indeed.
State of the industry
The attendance levels and ‘buzz’ at Siggraph can actually be a good gauge for the feeling inside the VFX and animation industry. This year, the buzz seemed high, and I think that’s got a lot to do with an abundance of new television and feature film post-production work in the LA area (something that’s been generally declining over the years, with work heading to places like London, Vancouver and Montreal).
All the big visual effects and animation companies were there for the Job Fair, for example, and many also made their presence felt in engaging Production Sessions covering the work of projects such as Guardians of the Galaxy Vol. 2, Spider-Man: Homecoming, Valerian, Cars 3, Rogue One, Moana and Game of Thrones.
It feels like a positive time in the industry – even props on display from past films reminded attendees about how they themselves are part of bringing the magic of computer graphics to the screen.