ARKTIKA.1, an Oculus exclusive due to launch later this year, is shaping up to be one of VR's best-looking games to date. It will come as no surprise that the title is being developed by 4A Games, the studio behind the Metro series (and its stunning next installment, Metro Exodus). One important part of making a game look great is skilled use of effects—dynamic elements like particles, smoke, muzzle flashes, explosions, and lighting. But the methods for making great-looking effects for traditional games take on new challenges in VR, especially when balancing visual fidelity against the high performance required for smooth VR rendering. In this guest article, 4A Games explores their approach to making effects in ARKTIKA.1.
Guest Article by Nikita Shilkin
Nikita Shilkin is a Senior VFX Artist at 4A Games. Before that, he worked on films and ads as a Generalist Artist, and then as a VFX/Onset Supervisor on sci-fi and other types of films.
To get an idea of my prior work, here are some of the scenes I’ve worked on:
At the moment, I am working on effects for the ARKTIKA.1 project. This is a sci-fi VR shooter with the company’s traditional focus on immersing the audience through story and high-quality visuals—the kind of quality that lets us talk about it as an AAA product.
To begin with, I would like to note that making effects for VR is essentially no different from producing them for ordinary games, with the exception of a few nuances I have noticed during production:
- The first and most important: the player’s freedom and, as a consequence, the unpredictability of almost all of their actions.
- Focus on performance. The requirement of a constant 90 frames per second constrains your technical and creative freedom, forcing you to constantly balance game quality against player comfort.
- The final checkpoint is the headset. Due to differences in resolution, gamma, and the peculiarities of virtual reality, what looked wonderful and beautiful in the editor might not look as good in the headset.
With these three rules in mind, we can start analyzing the production. Let’s begin with some core principles.
Since we are talking about VR, we don’t have a fixed camera, animations, timings, or other constants, which means we can never know how the player will shoot or from which side they will see the weapon. The only solution is to make the effect work beautifully from all sides.
The first common mistake is trying to build one mind-blowing sequence which, unfortunately, only works from a classic fixed camera and looks ridiculous the moment the weapon is turned.
The solution is quite simple: no matter how complex the effect is, break it into simple fixed parts spanning all three axes. This gives you not only volume, but also a visual randomness that makes each shot unique.
Above: (left) a muzzle flash made with volume in all directions, (right) a typical ‘first person’ muzzle flash looks great from a static camera angle behind the weapon, but breaks down if seen from other directions.
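The "break it into parts along all three axes" idea can be sketched in code. This is a minimal, hypothetical illustration (the emitter structure and names are mine, not 4A's engine): one flash is spawned as three axis-aligned sub-emitters, each with a random per-shot roll, so the composite reads as a volume from any viewing direction instead of a flat card that only works from behind the weapon.

```python
import random
from dataclasses import dataclass

@dataclass
class SubEmitter:
    """One simple fixed part of a composite effect (hypothetical structure)."""
    axis: tuple            # local direction this part is oriented along
    roll_deg: float = 0.0  # randomized each shot for visual variety

def spawn_muzzle_flash(rng: random.Random) -> list:
    """Spawn one flash as three axis-aligned parts instead of a single
    camera-facing sprite, so it has real volume from every angle."""
    axes = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
    return [SubEmitter(axis=a, roll_deg=rng.uniform(0.0, 360.0)) for a in axes]

flash = spawn_muzzle_flash(random.Random(42))
```

Because the roll is re-randomized on every shot, two consecutive shots never look identical, which is the "visual randomness" the article mentions.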
Since VR features neither a classic gun sight nor a screen center, and aiming down iron sights or a scope is uncommon, the weapon’s projectiles should be clearly visible. Most players will rely on them, correcting their aim by watching the bullets and their impacts.
In this regard, there are several tips:
- The muzzle flash must not block the player’s view of the bullet.
- The bullet should be clearly visible (size, brightness, length). The lower the rate of fire, the better bullets read with trails behind them; the higher the rate of fire, the brighter they should be.
- Don’t be lazy: create distinct bullets with distinct impacts for every weapon, as this also helps the player understand the shooting direction.
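The trade-off in the second tip—long trails for slow weapons, extra brightness for fast ones—can be written as a simple mapping. This is an illustrative sketch only; the thresholds and curve are my own assumptions, not values from ARKTIKA.1:

```python
def tracer_params(rounds_per_min: float) -> dict:
    """Map a weapon's rate of fire to tracer readability parameters.
    Slow-firing weapons get long, readable trails; fast-firing weapons
    trade trail length for brightness so each bullet still registers."""
    # Clamp to a plausible range; endpoints here are illustrative.
    rpm = max(60.0, min(rounds_per_min, 1200.0))
    t = (rpm - 60.0) / (1200.0 - 60.0)   # 0 = slow pistol, 1 = fast SMG
    return {
        "trail_length": 2.0 - 1.5 * t,   # meters: longer trails when slow
        "brightness":   1.0 + 2.0 * t,   # emissive multiplier: brighter when fast
    }

slow = tracer_params(120)   # e.g. a revolver
fast = tracer_params(900)   # e.g. a machine pistol
```

The exact curve matters less than the monotonic relationship: as rate of fire rises, trail length should fall and brightness should rise.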
And finally, a little piece of advice: if you have firearms (or any other weapons that produce smoke particles), put the smoke into a separate system, away from the flame, and release it into the world. It looks interesting.
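A minimal sketch of what "set free in the world" means in practice, assuming a hypothetical emitter API of my own invention: the smoke system samples the muzzle's world position once at spawn time, and from then on the puff lives in world space, hanging in the air where the shot was fired rather than dragging along as the player waves the gun.

```python
class WorldSmoke:
    """A smoke system that lives in world space, decoupled from the weapon.
    (Hypothetical structure for illustration, not an engine API.)"""

    def __init__(self):
        self.puffs = []  # world-space positions of live smoke puffs

    def emit(self, muzzle_world_pos: tuple) -> None:
        # Sample the muzzle transform once, at spawn time only; after
        # that the puff ignores the weapon's movement entirely.
        self.puffs.append(muzzle_world_pos)

smoke = WorldSmoke()
smoke.emit((0.0, 1.5, 0.3))   # shot fired here...
smoke.emit((2.0, 1.4, 0.1))   # ...weapon has moved; the old puff stays put
```

Had the puffs been parented to the muzzle instead, every stored position would update with the weapon, and the lingering-smoke effect the article describes would be lost.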