Tech Interview: Destiny 2 and Bungie’s return to PC gaming – Eurogamer.net
We first went hands-on with the PC version of Destiny 2 at E3 earlier this year, and it was immediately apparent that this wasn’t a mere port or conversion, but instead a thoughtful, considered approach to the platform with all of the unique features and opportunities it represents. Back then, we mentioned to Bungie that we’d really like to go deeper on the game and the technology added to the firm’s multiplatform engine, and to learn more about the approach to bringing the game to PC. Four months later, Bungie’s senior technical artist Nate Hawbaker has flown over from Seattle, joining us in the Digital Foundry office.
What’s immediately clear is the passion and the knowledge Hawbaker has both for the Destiny series and the PC version of the new game – and he’s a Bungie veteran, having worked on the Halo franchise too. By his own admission, he has a passion for talking about graphics, and the visit was a particular treat for us. We love to put together our analysis articles, but fundamentally we are limited by what we see. We’re outsiders assessing a final output, but for one day, a massive triple-A title became much more of an open book for us, with Nate Hawbaker answering any and every question we had about the game and the technology behind it.
An hour or so into the discussion, we came to the conclusion that we really needed to get some of this stuff on the record, and what’s on this page is an edited version of an hour-long chat we had about the game. We learn about the major architectural advantages that Bungie added to its multi-platform engine, we talk in-depth about the PC version and its exclusive features, like HDR, and we get the lowdown on the quality settings: what they actually do and how much of a performance impact you’ll see by adjusting them.
We also talk a lot about scalability. Yes, there’s a fair amount of discussion about scaling up on extreme PC hardware, but the mark of a truly optimal PC version is how it runs on lower-end kit. It turns out that Bungie and partner studio Vicarious Visions did a lot of work there too. After Hawbaker returned to Seattle, we couldn’t help but wonder – how low can you go with Destiny 2 on PC? There’s fleeting discussion within the interview here about whether the game can run on Intel integrated graphics, but we actually put it to the test, getting a playable experience on the Pentium G4560’s HD 610 graphics and the UHD 630 in the Core i5 8400. Oh, and we also tested Destiny 2 on an ultrabook – and it works. The extent to which it works is something you can find out for yourself by watching the video further down the page.
In the meantime, sit back and enjoy one of the largest and most in-depth tech interviews we’ve carried out for some time.
Digital Foundry: Let’s talk about the original Destiny and Destiny 2. Now obviously there are some big engine upgrades from one game to the next. Can you give us the basics on what you wanted to achieve? As a new game what are the core enhancements?
Nate Hawbaker: One of the first things that we wanted to do was… well, you know the whole industry is moving forward with this, but we wanted to integrate physically-based rendering. So, a co-worker and myself worked over the span of about half a year to re-implement all the lighting, all the material shading in our game, adding things like area lights and even redoing things like our shadows and just rethinking all of that.
And so the game was much more scalable to those types of visuals because ultimately, with Destiny 2, we wanted to deliver a wider experience, a wider diversity of the type of experiences in our game and it was really going to be the only way. And it had a lot of subtle benefits with it as we implemented things like HDR later on – and they all work very holistically together.
Digital Foundry: We noticed a big upgrade in GPU particles.
Nate Hawbaker: Yes, we talked about it this year at SIGGRAPH. One of our graphics engineers, Brandon Whitley, worked on this. So we moved into GPU particles. Actually, a lot of people thought we had them in Destiny 1, because of how much we were putting on the screen and I would also say as a testament to the artists implementing their craft of particle systems in Destiny 1, we would have maybe 3000 or so CPU particles on-screen at any one time. I think we’re somewhere in the neighbourhood of 120,000 particles now and we’re actually at the point where we hold it back artistically – because we don’t want to fill the screen with noise and particle systems!
I think Brandon has a couple of examples in his original talk of some of the original tests and they’re just absurd. You know, using some of the super abilities as you hit the ground, we would spew 100,000 particles – and at first, that looks awesome but with nine players in PvP, it’s not reasonable. The whole screen is just filled with particles and so we actually have to artistically hold it back quite a bit. We’re not actually near our limits, because we don’t want our art to look like that. So, I think the GPU particle system paid dividends. It’s the cornerstone of Destiny, I think.
Digital Foundry: What was your philosophy for the PC version of the game?
Nate Hawbaker: I mean, it couldn’t be a port. It was just totally unacceptable for us to issue a port and we were very conscious of that very early on, from the very first day. The day you say you’re committing to doing a PC version, you very quickly disperse out and start outlining the things that make a game a port, and you vow that day to never do those things: things like only supporting one monitor, not doing true 21:9, not having an unlocked frame-rate, or having a frame-rate that’s attached to the game simulation so it might speed up or slow down. All of those were very, very crucial and we just slowly worked on that list, identifying those things and resolving them, and I think we’ve ended up in a really, really great spot.
Digital Foundry: When we looked at the beta code we were just blown away. I mean, we were running the game pretty much locked at 60Hz on a $65 Pentium, which is quite remarkable. Is it just a case of the engine scaling well to PC or did you have to go back in and retool it specifically for PC hardware?
Nate Hawbaker: You know, nothing comes free by any means – there’s certainly not a line of code to uncomment so it works for PC – but I will say that some of the general philosophy behind developing an engine for multiple platforms, even going back to Destiny 1, has still really carried over into Destiny 2. And I would say we’ve actually developed that even further. One of the reasons you do see so many of your cores occupied in our game, and that it scales pretty widely – and pretty well on the graphics side of things – is because we are a multi-platform engine now and it is truly in a very mature spot. Certainly at least compared to Destiny 1, I think we’ve made a lot of great advances there.
Digital Foundry: So, CPU-wise, you can basically scale across as many cores as you’ve got?
Nate Hawbaker: Yeah, I mean if you hand it to us, we will definitely do our best to try to use them.
Digital Foundry: If a $65 processor can run at locked 60, obviously beyond that we’re looking at excellent support for high frequency displays. The beta topped out at 200fps – has that cap now been removed?
Nate Hawbaker: Yes, that is now removed. Yeah, there were a couple of bugs that did result in going above 200, related to floating-point precision and various rounding errors, but we’ve resolved those now and the game will run at an unlimited frame-rate. You will probably run into CPU bottlenecks well before unlimited is hit, but yeah, I mean, by all means throw your hardware at it, we will render at that – there’s no limitations there.
Digital Foundry: What are the primary GPU limitations on the game? I mean, you’ve got a lot of scalability there. From my perspective, the high setting was broadly equivalent to consoles, so where do we go from there?
Nate Hawbaker: Yeah, going from high to highest, you’re going to notice things like post-processing. We start to increase samples – you know, multiply this by two, multiply that by two. Shadow resolution? Certainly starting to increase, and not just the resolution of the shadows, but how are you sampling those? How do you make them look really soft and across how many cascades do they look soft, and also where do those cascades begin and where do they end? What is the rate at which they are dispersed? All of those continue to scale up based on their original rates, but I would say a lot of the cost goes into post-processing – things like SSAO, depth of field certainly.
One of the approaches was – hey we have cinematic depth of field we have gameplay depth of field. [On] highest maybe it’s always cinematic, wouldn’t it be cool if the whole game was cinematic and then we can also increase the samples of the cinematic so that you get really nice bokeh? Okay! And it was always that philosophy of… I mean we can write the code and if you have the hardware by all means we’ll have an option here waiting for you, and conversely, for the lower end specs as well, if you can just barely run the game, we will allow you to bring that render resolution all the way down to 25 per cent. We just want to scale to wherever somebody wants to play Destiny 2.
Digital Foundry: And you were saying that you could run the game and it would look and feel decent – well, playable – at 320×240?
Nate Hawbaker: Yeah, we do scale down to 25 per cent of whatever the native resolution of your display is, which I think in the extreme cases does get to about 320 by 240 and it does play! I mean, the UI is rendered at full resolution, all the text is fully legible and the way that we design our characters and all the enemy combatants and things like that, they passed something called a squint test. And so, at a certain number of metres away, can I tell this unit apart from this unit? And because we did that, I think certainly, those resolutions still hold up. You know, gameplay mechanics are still totally understood. There’s not small text that will get lost that will affect your gameplay decisions… and actually, I would encourage people to try it out just to see how quickly you do forget about that.
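As a quick sanity check on those numbers, a minimal sketch of how a per-axis render scale works – assuming, as is typical, that the percentage is applied to each axis independently, so a 25 per cent scale on a 1280×960 display lands at exactly 320×240:

```python
def scaled_resolution(native_w, native_h, scale_pct):
    """Apply a per-axis render scale; pixel count falls with the square."""
    w = int(native_w * scale_pct / 100)
    h = int(native_h * scale_pct / 100)
    return w, h

# 25 per cent of a 1280x960 display gives the 320x240 case mentioned above.
print(scaled_resolution(1280, 960, 25))  # (320, 240)
```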
I mean it is a bit of a shellshock – at first – but I think people will be very surprised at how quickly they adapt their eyes to it. You have what I think is called foveal resolution, which is ‘what is your resolution in the centre of the view’ and then as you go to the peripheral [and] ‘how low is the resolution there’. You know, how sensitive are your eyes to that detail. And when you’re playing a first-person shooter you’re actually just looking at a crosshair. It is almost uncomfortable to me as somebody that works in graphics how quickly you become accustomed to that resolution.
Digital Foundry: There are a lot of PCs out there with Intel integrated graphics or really weak GPUs, but they’re ubiquitous. They’re everywhere. A lot of people don’t have discrete GPUs so by extension, can you play Destiny 2 on Intel integrated graphics?
Nate Hawbaker: A soft ‘maybe’! We have seen people playing on laptops with integrated GPUs. I would say it certainly wasn’t the target focus. I mean, some of the hard parts that you’re going to run into are VRAM constraints and things like that, but we do have quite a bit of scalability options in our game – but as you’ve even pointed out before, our game is very CPU-heavy and if you have that same CPU also trying to render the game, that’s going to be really difficult. We have seen people playing with it. I couldn’t tell you exactly which ones to go out and try. Your mileage may vary! It is fundamentally a very difficult thing to run on. Yeah, we have seen mixed reports of people using it, though.
Digital Foundry: I think from my perspective, what I’ve particularly enjoyed was the fact that I could run a really low entry-level GPU – GTX 750 or 750 Ti – and I could choose between 60Hz with a resolution and settings hit or 30Hz and running it on high or slightly better, so I think it’s fantastic that PC users actually have the choice there. But CPU-wise, why is Destiny 2 so demanding there?
Nate Hawbaker: I mean, the main costs come from our simulation cost, purely the cost of always networking all of the enemies that are around you. We support 50 enemies being simulated around you, all of those enemies are doing pathfinding, they’re doing all of their AI calculations for all of the players in your public area, which I think we support up to nine. Meanwhile public events are going off and those are things that – unlike GPUs – they’re not really scalable. It’s not as easy when it comes to things like AI or networking to do the equivalent of halving the resolution on GPU. They still need to find players, they still need to shoot at you with their guns and things like that, and you still need to see other people talking to you and going through their simulation state.
Yeah, it’s certainly much more difficult to scale, certainly at least compared to graphics, and we just put so much into our game. We always want to make it look like a living world, so when you go into those social spaces and you have, you know, 26 players, every one of those players has X number of bones that have to be updated from the CPU. That CPU has to issue all those draw calls – it’s very simulation intensive and there’s just not a lot of easy ways to take NPCs out of the public area or things like that.
Digital Foundry: Well that’s the thing, if it’s inherently a multiplayer game, the guy with the Titan X and the top-end Core i9 could conceivably be playing with the guy on his laptop on integrated graphics. And the laptop guy can’t have fewer entities than the Titan guy.
Nate Hawbaker: I mean, some of the bigger costs are things like our AI, and they’re optimised to the point where they can’t really be optimised any more. The only thing left is to simply have fewer enemies possible, and you know that’s just not an option. Creatively, at least with the direction we want for our game, we just want to deliver these huge experiences for players and it’s hard to think of an obvious area to now compromise for that.
Digital Foundry: Do you have an entity limit?
Nate Hawbaker: We support nine players and 50 enemies in a public area. I don’t know if we’re actually hitting that at any point but there are so many things that are also random. Like, you could conceivably push enemies from one area into another – we do support it – and the nature of the game is very unscripted, and so it’s very hard to predict what players will do and so we have to prepare for worst cases. Certainly, I believe it’s 50 enemies.
Digital Foundry: I do remember Halo Reach had some pretty big battles with a lot of entities in play.
Nate Hawbaker: There are a lot of distant enemies as well where the only AI that’s running is just general, like flocking behaviour or they’re not really responding to the player but they do exist as an entity, but they’re not really paying much cost.
Digital Foundry: An undoubted advantage that consoles have is plug and play – so you load up the game, you play it, with no sort of issues. On PC, sometimes you load a game and it appears in a 720p window with arbitrary settings and you’ve got to dive in and set-up the game to specifically suit your hardware. And even then, you can be CPU or GPU-bound because the game doesn’t actually care what sort of hardware you have. You put some effort into addressing this, right?
Nate Hawbaker: Yeah, there were a number of people at Bungie and Vicarious Visions as well that worked very hard on that, and I think it’s generally underappreciated what goes on in the first 30 seconds of launching the game for the first time. Because if you get it wrong, one of two things happens. Either the game runs sluggishly at detail settings that can never be maintained and that they’ll never see again, and they’re going into their settings screen over your opening cinematic – the one you put all this work into, inviting all these new players in, and they’re not watching it, and then they’re trying to find the settings screen as well!
And then the alternative is the game doesn’t look as good as it could and their first impression is degraded and they’re worried about the quality of the PC version of the game, they’re worried about their hardware, maybe they might think the game doesn’t even look that good, and maybe they don’t even go back to the settings because they’re not enthusiasts. You know, not every single person will go into that screen and change every setting and go back out.
So, there are so many ways to get that wrong, and so some of the work that they did was to try to describe all the hardware possibilities through a bunch of averages and heuristics. You try to lump a family of CPUs and a family of GPUs into certain categories, into their advantages and disadvantages, and then you try to build a heuristic to match those up to specific render options, because some render options are really VRAM-heavy but some aren’t, and some actually create CPU costs and things like that, which on a couple of CPUs in particular could really not scale well. And so there are all sorts of fun heuristics to basically detect – based on your current hardware – the options that’ll make the most sense for you, and it all happens very, very quickly the first time the game is launched.
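The shape of that idea can be sketched very roughly: bucket the detected hardware into coarse tiers, then map tiers to a starting preset. Everything here – the tier names, the thresholds, the settings keys – is invented for illustration, not Bungie’s actual heuristic:

```python
def pick_preset(gpu_tier: str, vram_mb: int, cpu_tier: str) -> dict:
    """Hypothetical mapping from coarse hardware tiers to a starting preset."""
    preset = {"textures": "medium", "ssao": "hdao", "shadows": "high"}
    if vram_mb < 2048:
        preset["textures"] = "low"        # texture pools are the big VRAM cost
    elif gpu_tier == "high" and vram_mb >= 6144:
        preset["textures"] = "highest"
        preset["ssao"] = "3d"             # the enthusiast-only option
    if cpu_tier == "low":
        preset["shadows"] = "medium"      # some render options add CPU-side cost
    return preset

print(pick_preset("high", 8192, "high"))
```

The real system presumably works from detected device IDs and many more options, but the structure – categorise first, then match categories to settings – is the one described above.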
Digital Foundry: A lot of our audience obviously will be looking to get the very best out of the game and there is this kind of ritual where you initially set things up to offer the absolute optimal experience for your hardware. As good as your auto-detect is, it can’t account for personal taste. So, if you’re looking for a stress point to really optimise settings – in the beta we used the initial tower defence area where all the entities are spawning, there’s GPU particles, there’s alpha to the nth degree. It’s right at the beginning of the game and it’s seemed like an ideal spot for optimising. If you do this in the final game will those settings carry through for the whole experience?
Nate Hawbaker: I think that’s a really good benchmark. I mean that is truly a case of us almost – internally as well – trying to see what can we get away with. You know, we originally made that mission because it’s the beginning of the game, it has to draw in players, and it has to set expectations for the rest of the campaign. At the same time, we’re very conscious of the fact that this is the thing that’s going to be recorded and so you do try to throw everything into those missions. Like, if there’s one YouTube video that will get all the views, it’s going to be this mission and that fact is not lost on us, and so that is still a very good benchmarking scene.
Like you said, there’s tons of transparency, there’s tons of combatants, there’s tons of shadows, all the enemies are casting shadows even on the bodies of the enemies piling up after you’ve taken them out. I mean you’re just accumulating verts, rendering into shadows over and over and over and you’re looking into the entire scene. There aren’t any clever states, where if you just look in the other direction you don’t have to pay into that cost because we’re pushing you into the entire scene, the entire time. It is a very good stress case and it’s right at the beginning.
Digital Foundry: Okay, so can we talk about the individual settings on the PC and some of the easiest ways to increase frame-rate? So, I’m playing the game at 4K60 on a Titan X and I’m pretty much on the highest settings but I did reduce shadows and depth of field from highest to high. Did I make the right choices?
Nate Hawbaker: I think you did. Shadows especially, because it’s gonna be very scene dependent. Much of the shadow quality change is related to how many cascades you are rendering into, and in certain scenes – like a small multiplayer map – you may never even see that final cascade (certainly in interior spaces), so it’s going to be scene dependent. Whereas other expensive things, like SSAO set to 3D – it’s costly, but every single pixel that you see in the game gains from it no matter what context you’re in, so I would selfishly keep that in.
Digital Foundry: 3D SSAO is the setting above console, right?
Nate Hawbaker: It is. Yeah, the consoles are running a highly optimised HDAO and then we certainly have a 3D option for enthusiast players.
Digital Foundry: One thing I noticed is that the anti-aliasing options are pretty much the same as the beta, but the MSAA option has been removed. It seemed pretty heavy on cost in the beta and didn’t seem to do much.
Nate Hawbaker: It was one of the few options that, when we put out the beta, we had to attach a disclosure saying MSAA is very experimental, we don’t know if we’ll ship with it – and as you mentioned, it has seen its deathbed. Our engine is a deferred renderer versus a forward renderer. Something that’s important in a deferred renderer is to have very accurate per-pixel depth, and the nature of MSAA is that you are effectively super-sampling your depth buffer, and that leads to all sorts of blending issues in deferred rendering when you’re layering on your transparency and your post effects.
And that’s why even in the beta you might have seen little black halos on characters and things like that. It just doesn’t lend itself well to deferred rendering. You know, we tried. We definitely wanted to see if we could do this, but the only way to resolve those last bits of artifacting would require us to increase the cost even further from what it was. I mean, the point of anti-aliasing is to remove jagged edges, and we had issues where it was introducing jagged edges at a high performance cost. Increasing the render resolution instead… because of how well optimised the GPU path is, it’s honestly just a pretty scalable approach to it.
Digital Foundry: Well, back in the day it was the brute-force approach which only a lunatic would actually consider but these days, we have GPU-level super-sampling like Nvidia DSR and AMD VSR. You have that built in to Destiny 2 with a render resolution scaler with 200 per cent as the limit and 25 per cent as the minimum. And that’s interesting because downscaling into a lower resolution can save you a lot of GPU time. And with a 4K screen with a high pixel density, it’s not a bad compromise from a typical viewing distance.
Nate Hawbaker: Certainly not, and you also have very fine-grained control, down to the exact percentage, whereas typically you’re jumping between the large increments that the monitor supports. And any time you’re doing your pixel calculations, the number of pixels you’re rendering scales with the square of the resolution, so as it grows it gets expensive very quickly and it’s hard to finesse it. That’s why I was really happy to see that we just gave people a flat-out percentage. You can go to 97 per cent if you need to, you know, or 28.
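That quadratic relationship is worth seeing in numbers – because the scale applies to both axes, the fill-rate cost grows with the square of the slider value. A tiny illustration:

```python
def relative_pixel_cost(scale_pct):
    """Pixels rendered (and hence fill-rate cost) relative to native resolution."""
    # The render scale applies per axis, so pixel count goes with the square.
    return (scale_pct / 100.0) ** 2

print(relative_pixel_cost(200))  # 4.0  -- 200 per cent renders 4x the pixels
print(relative_pixel_cost(50))   # 0.25 -- 50 per cent renders a quarter
```

This is why a fine-grained percentage matters: the jump from 100 to 125 per cent already costs roughly 56 per cent more pixels.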
Digital Foundry: Dynamic resolution was available on console but what are the challenges of putting that on PC? Why didn’t you decide to implement it?
Nate Hawbaker: It definitely wasn’t anything philosophical, like ‘we’re against it’. It’s simply that quite a bit of architectural changes are necessary for it. On consoles, you have much more predictable memory allocations. When you’re trying to decide what resolution you’re rendering everything at, you allocate memory very specifically for it, and that’s not too bad on console when your memory is highly predictable. When you’re on PC, it’s a bit more of the Wild West, and a lot of the assumptions that you get to make when you’re building dynamic resolution on console sort of go out the window. And so, step one is usually to start over and re-implement everything for PC. I mean, it’s certainly not off the table someday in the future, but just trying to release the game on PC – for the first time in so long for Bungie – was certainly a higher priority. It’s not off the table, but it’s expensive.
Digital Foundry: Going back to the PC settings, to what extent do the distance sliders impact performance?
Nate Hawbaker: It’s not too bad. What those actually affect, at least on the environment and character… I would say between those two, we effectively have three distances that feed our LOD system. If you set it to low, it’s going to scale them to about 80 per cent – all of the variable distances that feed lower and higher resolution LODs – and highest will scale them up to about 200 per cent of the original distances. Ironically, a lot of our LOD systems are based on the silhouette of the object, and silhouette is usually the thing that you notice LOD actually affecting.
Because our LOD system is fundamentally based on the silhouette to begin with, it’s actually really difficult to see when our LODs transition in the first place, and so by all means you can set it to high, but you may struggle to see the LOD transitions unless you find something very difficult, like something spherical. We have a couple of enemies in our game that are literally spherical in their silhouette and there’s no secret recipe to getting those to hide behind great, graceful LOD transitions. But really, these options are just about scaling the distances at which those transitions happen.
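Mechanically, that description reduces to multiplying a set of base switch distances by a per-setting factor. A sketch using the 80 and 200 per cent figures quoted above (the base distances here are invented for illustration):

```python
# Scale factors quoted in the interview: low ~80%, highest ~200% of the
# authored distances. "high" as the unscaled baseline is an assumption.
LOD_SCALE = {"low": 0.8, "high": 1.0, "highest": 2.0}

def lod_switch_distances(base_distances, setting):
    """Return the distances at which each successive LOD level kicks in."""
    return [d * LOD_SCALE[setting] for d in base_distances]

print(lod_switch_distances([20.0, 60.0, 150.0], "highest"))  # [40.0, 120.0, 300.0]
```

On highest, every transition simply happens twice as far away; the LOD meshes themselves are unchanged, which is why the visual payoff is subtle.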
Digital Foundry: Let’s say you’ve got a really pixel-rich 4K screen. Are you more likely to get benefit from those higher distance scaling settings?
Nate Hawbaker: I wouldn’t say too much honestly… We could make a ‘highest’ option which never LODs but it’s just so excessive. There’s not a great benefit to it and in the worst case you know somebody might turn that on without realising, pay a high cost for it and not get anything out of it.
Digital Foundry: Motion blur is just an on/off feature. We’re big fans of motion blur but a lot of people aren’t, so it’s nice that you can turn it off. What’s your implementation – camera and per-object?
Nate Hawbaker: Actually we do not have per-object motion blur. It is highly optimised – it doesn’t actually even run at the full resolution of the game, we do a lot of things to hide that – to the point where it might actually be impossible to detect the performance cost of motion blur because it renders so fast. When I was toggling it on and off on a GTX 980, the cost was something like within capture noise. You know, just like background, the equivalent of cosmic background radiation.
But if you actually do individual frame captures in our development build, I mean, it’s less than .01 milliseconds. It’s certainly very well optimised, but you certainly have to give people the off option – not just for performance, but because certain people can get motion sickness. There are genuine medical concerns about that. The same thing actually happens with film grain: beyond it being an artistically polarising thing to add into games, certain people have a hard time resolving shapes and edges, and film grain complicates that and can give people headaches, and so that’s one of the reasons that we provided the off switch.
Digital Foundry: And the wind impulse option?
Nate Hawbaker: So, any time that you’re interacting with our environment – you’re using your lift ability, or you’re on a sparrow or something like that, or you’re even throwing grenades – we’re actually rendering into an off-screen texture called an impulse buffer. It basically records a position in the world and a magnitude, which is either positive or negative for the force, and that’ll affect all of our foliage systems and even some of our particles that get blown around. But on some entry-level systems, you don’t have a lot of VRAM and that’s another texture. That’s a few more megabytes of VRAM that you could find yourself saving, and so we do provide the off option. It’s not a great benefit, but we have seen a few CPUs in particular that for some reason don’t scale as you would expect for one extra texture, so we wanted to provide that option.
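The data flow he describes – gameplay events write position-plus-magnitude records, and foliage and particle systems read them back – can be sketched like this. The record layout and names are hypothetical; the real buffer is a GPU texture, not a Python list:

```python
from dataclasses import dataclass

@dataclass
class Impulse:
    x: float
    y: float
    z: float
    magnitude: float  # signed force: positive pushes outward, negative pulls in

# Hypothetical stand-in for the off-screen "impulse buffer" texture.
impulse_buffer = []

def record_impulse(pos, magnitude):
    """Called by gameplay events (grenades, lift, sparrows) to log a force."""
    impulse_buffer.append(Impulse(*pos, magnitude))

record_impulse((10.0, 0.0, 5.0), 1.5)   # e.g. a grenade detonation
record_impulse((12.0, 0.0, 5.0), -0.5)  # e.g. a suction/pull effect
print(len(impulse_buffer))
```

Turning the option off then simply means never allocating that buffer, which is where the few megabytes of VRAM saving comes from.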
Digital Foundry: During the beta, 2GB graphics cards seemed to hold up pretty well at 1080p, so were you supplying the highest quality assets in the beta, or are there more in the final game that may require more VRAM?
Nate Hawbaker: No, that is still as it was in the beta. The interesting thing is texture quality isn’t just changing the resolution of all our textures across the board. It’s actually scaling the highest resolution a texture can ever be, and also the lowest resolution a texture can ever be, and there’s all sorts of reasons that you might be between the lowest and the highest resolution, because we have a dynamic mipping system that is based on the distance from the player, and all sorts of fun heuristics to try to make it so that the player never sees it. What that setting is actually doing is changing the upper and lower bounds. But because we build our shaders in such a way that we don’t just input a texture then render a texture – it goes through all sorts of shader math, like procedural implementations that are all run in the shader – they’re sort of agnostic of the texture itself.
And what that means is if you lower your texture resolution, it might not actually look like it is lower, because that texture might originally have been a mask. In another game you might use that mask to change a colour on a wall or something like that, but in Destiny 2 we use that mask and then we might multiply it by the object-space position, then remap it based on the world coordinates, then take the angle of that surface and drive some grime on it and stuff like that – but it’s all procedural, it’s not texture-based, and so it can be a little difficult to see the effects of it. But yeah, it’s really just scaling the upper and lower bounds.
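The upper/lower-bound idea maps naturally onto mip levels (mip 0 being the sharpest). A sketch of the clamping step, assuming the quality setting just narrows the window the distance-based mip selector is allowed to pick from – the specific bounds here are invented:

```python
def clamp_mip(requested_mip, quality):
    """Clamp the distance-selected mip into the window the quality setting allows.

    Mip 0 is the full-resolution image; higher mips are progressively smaller.
    The (min, max) windows below are illustrative, not Bungie's actual values.
    """
    bounds = {"low": (2, 6), "high": (1, 7), "highest": (0, 8)}
    lo, hi = bounds[quality]
    return max(lo, min(requested_mip, hi))

print(clamp_mip(0, "low"))      # 2 -- the sharpest mips are never streamed in
print(clamp_mip(9, "highest"))  # 8 -- textures never degrade past the floor
```

The distance-based heuristics still operate inside that window, which is why two players on different settings can still end up sampling the same mip for a given object.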
Digital Foundry: Does that have an implication on performance at all? The accepted wisdom is that texture quality is solely a VRAM thing but at the same time, higher resolution textures could possibly make for larger render targets, which may have bandwidth implications.
Nate Hawbaker: Yeah, in this case, we’re not changing any of our render target sizes – we leave that up to things like the actual resolution setting of the game – so I think that wisdom is still pretty accurate. It’s mostly a VRAM consideration; from the rendering side of things, like fetching those textures and sampling them, it’s not too bad. There are certainly extreme cases where you can start to become bandwidth-bound purely from the number of high resolution textures, but generally it’s not a big performance hit.
Digital Foundry: One final thing on the settings here – the light shafts. Now, you’re quite big on volumetric lights in Destiny 2 and I notice that the high preset in the PC version actually sets the light shafts at medium so is that performance-related?
Nate Hawbaker: Yeah, the light shafts are interesting. Between Destiny 1 and Destiny 2, we outlined some key things that we wanted to develop. You know, PBR was one of them, GPU particles being another, and the third one is – as you picked up on – volumetrics. Early on there was a concept piece shown. It was very simple and it was a Cabal – one of the enemies in our game – with a big volumetric [light source] behind him. We want to do that, we’re going to do that, and that’s Destiny 2. So, we built this whole volumetric system and it was optimised to the point where we found our lighting artists just placing them everywhere. I mean, they were originally going to be set-pieces – you know, this is the spotlight behind that enemy, and you have that typical fan spinning through the volumetrics… I think that every game using volumetrics has to do that!
Digital Foundry: And you have one right at the beginning of the game!
Nate Hawbaker: Yep, we had to! You have to, it's a requirement of having volumetrics. But the implementation was so optimised that we find every single play space littered with them, so I think at any moment in Destiny 2 you'll probably have volumetrics on screen. This setting specifically relates to the light-shaft volumetrics, but the thing is, those are already rendered at a lower resolution and then we do a very clever radial blur on top of that. That wasn't too bad because Destiny originally ran at 1080p, so for a screen-space effect you know exactly how many samples you need so that you don't see the steps between them. But then 4K comes along, and 21:9 comes along, and all those things change your assumptions, and all of a sudden you start to get bugs like banding in the light-shafts. So upping that quality option basically increases the sizes and increases the sample count when you're at resolutions above... I believe it's 1080p.
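The scaling Hawbaker outlines can be sketched roughly as follows – the base count of 32 samples and the linear scaling rule are our own assumptions for illustration, not Bungie's actual tuning:

```python
def lightshaft_samples(screen_height, quality_high=False,
                       base_samples=32, base_height=1080):
    """Hypothetical sketch: scale the radial-blur sample count for
    light shafts with vertical resolution.

    At or below the original 1080p target, the hand-tuned base count
    hides the steps between samples. Above it, the higher quality
    option scales the count up so banding doesn't reappear.
    """
    if not quality_high or screen_height <= base_height:
        return base_samples
    # Scale linearly with vertical resolution (assumed rule)
    return int(base_samples * screen_height / base_height)
```

At 4K with the higher quality option, the sketch doubles the sample count relative to 1080p, which is the kind of adjustment needed once the original fixed-resolution assumption no longer holds.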
Digital Foundry: So let's talk about the benefits of high frequency gaming. What I've found is that generally, a joypad is great for 30fps gaming but it seems to introduce quite a lot of latency at 60Hz. And at even higher frequencies, mouse and keyboard is a game-changer, so what did you do to optimise for this very different control system?
Nate Hawbaker: Well, the first step, I think, is that you do a lot of soul-searching. There's a hallmark of a lot of Bungie games which is... it's that sort of secret sauce: why does the game feel so smooth? I think even when criticisms are thrown against some of our games, people will always say unequivocally 'yeah, at least the gunplay is really solid'. And here we go into PC, where all of those assumptions are going out the window. We have years of, you know, things like mapping subtle function curves to make sure that the input of a thumbstick feels perceptually smooth, because just mapping those inputs linearly... you don't want to actually play that, and years and years and years go into that. You can't carry those assumptions over to PC [with] mouse and keyboard, and so what we did was basically build an entirely unique set of balancing and tuning for everything related to our input.
Like, what is the sensitivity? What is the sensitivity when you zoom in? How does DPI map to a mouse? Are you going to do mouse acceleration or something like that? How smooth does a sniper feel? Is there auto-aim? How’s the recoil? You have to almost start over and handle it entirely uniquely, which we did. Our sandbox team effectively balanced and tuned PC independently.
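The distinction Hawbaker draws can be sketched like this – the exponent is a made-up tuning value purely for illustration, not Bungie's actual response curve:

```python
def stick_response(x, exponent=2.5):
    """Sketch of a non-linear thumbstick response curve: small
    deflections give fine aim adjustments, while full deflection
    still reaches the maximum turn rate. A linear mapping would
    make precise aiming feel twitchy."""
    sign = 1.0 if x >= 0 else -1.0
    return sign * abs(x) ** exponent

def mouse_response(dx, sensitivity=1.0):
    """Mouse input, by contrast, is typically left linear – no
    acceleration curve – so hand movement maps 1:1 to aim."""
    return dx * sensitivity
```

The point is that none of the controller curve carries over: half deflection on the stick produces far less than half the turn rate, whereas the mouse path is a straight multiply, which is why the two input schemes demand independent tuning.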
Digital Foundry: This project is a partnership with Vicarious Visions on PC, so how does that relationship work? You're obviously intimately involved with the PC build so it's not just a case of farming it out to an external developer.
Nate Hawbaker: Certainly not. The way I describe our relationship with Vicarious Visions is that though they're in New York, it's sort of like they're sitting next to us. I mean, there is nothing different about an engineer at Vicarious Visions versus an engineer at Bungie. They're involved with the same types of process, they're in all of our meetings. We have conference calls, maybe with an iPad sitting awkwardly in the corner with somebody's head floating on it, but they're a part of all of those decisions and they're almost an extension of us.
Digital Foundry: A feature exclusive to PC right now is support for high dynamic range. The basic perception is that you’re rendering internally at HDR anyway, so you just cut out the tone-mapping and address the screen directly. Simple. Except it’s not, right?
Nate Hawbaker: It is certainly not that simple! So HDR was definitely a passion project of a very talented engineer over at Vicarious Visions named Kevin Tadisco. You have to define early on 'what is bright?' and that's a very philosophical question, because you have to start answering questions like 'you know, a super fireball coming from a Titan's hammer versus the bright pixel in the sky... how much brighter is one relative to the other and what should that look like on a TV?' And don't forget some people have OLEDs, and their peak brightness at 100 per cent coverage is not quite as good as the LEDs. And also, how do you get artists to author this, because they don't have HDR monitors... maybe they do, but those are like $1500, so how do you handle that?
And wait, your exposure range is changing so much and you have to... man, I think you almost have to revisit every facet of your renderer because it comes down to 'what do you want your final visual image to look like?' and there's a number of decisions that go into that. Things like tone-mapping, exposure, lighting and shading: how bright is the UI? If there's a white bar for your health bar, how bright is white? It's not as bright as the sun, obviously. Or even when you go into various zones, or when you're loading up a cinematic or something like that, you'll see a big white screen in our game with different class symbols rendering. But that's a white screen, and if somebody's playing in a dark room with an HDR TV, you're gonna be burning out the cones and rods in people's eyes, and so 'how bright is that?' You have to answer fairly philosophical questions, notwithstanding all of the technical side of it.
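One common answer to the UI question is to pin interface 'white' to a fixed paper-white level well below the display's peak output. The sketch below is our own illustration of that idea – the 200-nit figure is an assumed tuning value, not something Bungie has confirmed:

```python
def ui_luminance_nits(ui_value, paper_white=200.0):
    """Hypothetical sketch: map a UI brightness value (0..1) onto a
    fixed 'paper white' luminance in nits, well below an HDR
    display's peak, so a full-screen white loading screen stays
    comfortable in a dark room. paper_white=200 is an assumption."""
    clamped = max(0.0, min(ui_value, 1.0))
    return clamped * paper_white
```

Scene highlights like the sun or a Titan's fireball are then free to use the headroom above paper white, which is exactly the 'how much brighter is one relative to the other' budget Hawbaker describes.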
Digital Foundry: Is there a performance implication with HDR?
Nate Hawbaker: The renderer is basically HDR all the way through to begin with, so the main performance implication is that your render targets are changing – you're changing the actual bit depth you're rendering at. So there are some bandwidth considerations, but otherwise the performance cost is pretty negligible. It's really just VRAM cost.
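The VRAM cost Hawbaker mentions is easy to quantify with back-of-envelope arithmetic. The sketch below compares an 8-bit-per-channel target against a 16-bit-per-channel one at 4K – the specific formats are illustrative examples of the bit-depth change, not a statement of which formats Destiny 2 uses:

```python
def render_target_bytes(width, height, bits_per_pixel):
    """Back-of-envelope VRAM cost of a single render target."""
    return width * height * bits_per_pixel // 8

# 4K target: 32 bits per pixel (e.g. 8 bits per RGBA channel)
# versus 64 bits per pixel (e.g. 16 bits per RGBA channel)
sdr_target = render_target_bytes(3840, 2160, 32)  # ~31.6 MiB
hdr_target = render_target_bytes(3840, 2160, 64)  # ~63.3 MiB
```

Doubling the bit depth doubles both the memory footprint and the bytes moved per pixel, which is where the modest bandwidth consideration comes from.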
Digital Foundry: Bearing in mind how differently HDR presents on every screen, how every HDR screen has a different implementation, how do you find a common target to master to?
Nate Hawbaker: It's very similar to non-HDR screens, because I don't think that landscape has changed too much. I mean, there are certainly entry-level TVs that don't really give you a wide range from black to white. We're in the graphics world – we can clear a buffer and create a gradient and we know scientifically the value it should be, then you do a split across four TVs and it almost looks like different content sometimes – and I would say that this problem is still alive and well in the HDR world.
There is a mode that we use in development for HDR which builds a synthetic SDR image and renders it directly side by side with the HDR output, because otherwise comparison is just impossible – as you're changing inputs, the screen blacks out. And I looked at it on an entry-level HDR TV – $400 or something like that – and there was no dividing line, and I realised that this TV was just processing the signal. It's HDR according to the box, but in practice it's not, so your mileage will certainly vary. So the thing that we're targeting right now is that you have to decide on averages.
From an average perspective, if somebody has an HDR TV where are they going to be? Are they going to be with an OLED? Are they going to be with an LED because there’s no line of code that we can ask that says, ‘Hey are you on the LG, are you on the Samsung?’. We don’t know that and so there’s a lot of hedging your bets on averages.
Digital Foundry: Nate Hawbaker, thank you.