Inside Marvel's Spider-Man 2: the Digital Foundry tech interview

Eurogamer.net

Insomniac reveals the secrets behind its ambitious sequel.

Interview by John Linneman, Senior Staff Writer, Digital Foundry
Published on 24 Oct 2023
Marvel's Spider-Man 2 for PlayStation 5 lands pretty much exactly where we hoped it would. Insomniac Games has delivered another new game for the console that pushes back technological boundaries, with a denser, richer representation of New York City, expanded to include Brooklyn and Queens, while the core rendering itself features dramatic improvements in ray-traced reflections, along with the introduction of RT reflections on bodies of water. Meanwhile, the studio firmly embraces the potential of the console's SSD with streaming that allows for more dramatic and faster traversal through the city, along with some spectacular streaming-based set-pieces.

In our tech review of the game, we had plenty of theories about how some of Insomniac's tech had been enhanced over Marvel's Spider-Man Remastered and Miles Morales, but we were hungry for much more information. Sony and Insomniac agreed to allow us to speak in an extended interview with the studio's Director of Core Technology, Mike Fitzgerald, and we got everything we wanted - and much more.

In this interview, John Linneman talks extensively to Mike about the studio's commitment to ray tracing, how it removed raster-only modes from the game, and how it achieved many of its showpiece moments. We also talk about streaming and compression, along with some of the less noticeable - but still crucial - technological components in the game.

If you don't fancy strapping yourself in for a 6000-word tech interview, here's the video source for the text below.



Digital Foundry: I wanted to get your first impressions and first thoughts on the approach that you wanted to take in building this new game. Where did you want to take the technology for this next title?
Mike Fitzgerald: Well, you know, the fun thing to do is to take a photo of New York City and put it next to the game and try to find the exact perspective, right? And then say, well, where are we? Where are we falling short of this? Now, you know, 'photo-real' is not exactly the goal. I think it's more 'Marvel-real' or 'game-real' or something, some sort of somewhat exaggerated, dramatic interpretation of it. But you know, off the top of my head, some of the things we pointed out were that our buildings got a bit flat in the distance, the lighting would drop out, they didn't have a lot of macro breakup and variation, so that was one thing we were excited about. Ray-traced reflections on our PS5 launch titles were awesome, but there were still a lot of tricks and fakes around building interiors, which was a fun area we were excited to tackle.

And then as soon as we knew we were doing Brooklyn and Queens, we knew we'd be going across the river and we'd be having a whole set of unique challenges with the water rendering. But really, you get to the end of one game and the whole team gets to pick up and look a bit farther to the metaphorical horizon about what they're doing and get ambitious and think about crazy things they want to try. And so everyone has their own little pet thing to do. Doing Ratchet and Clank: Rift Apart gave us a lot of fun ideas about how to do loading and streaming differently in Spider-Man 2.

Digital Foundry: And that's something we do want to get to, but I actually want to start with one of the things that really caught my eye when looking at the game, and it's the building interiors. So for those that watched our video, you may have noticed this: every single building, every one of these skyscrapers now features what looks like a fully modelled interior with dynamic characters even sitting around inside these rooms. And we had our hypothesis, but I'd like to hear directly from you. What are you guys doing to pull it off? How did you solve this problem?

Mike Fitzgerald:
Yeah, so great job [on the video]. I think noticing that checkerboarding was a great tell. So those are ray-traced interiors for rooms, and what we're doing is almost path-tracing that space, in a sense, with simplistic geometry and lighting. But taking a step back, I think we had the cube map technique in the previous games, which is clever and fun. And you know, it's fun to crawl around corners looking and noticing all the 'oddnesses' of it. And we actually had another technique we were trying on top of that, which was to add a sort of parallax depth to the interior. And you could get these sort of 3D shapes, especially along the back wall. Like if you wanted to put a desk back there, or bookshelves on the sides, that sort of convex geometry along the walls looked quite good. But a couple of really talented folks on our render team proposed this idea of, hey, every window we hit, we're tracing rays out into the world. Why not render an interior by using that same system and tossing rays elsewhere?
So we have, I think, 32 fake rooms deep underground in the city, buried somewhere below the ground plane. And they all have sort of a basic layout and then different variations on furniture and characters that might be in there. And then as you trace into a room, we use an ID for that window. Okay, so floor five, window three of the building at this corner will use this room, and this sort of random set of interiors for that room. And then we can filter the BVH down to that, hit exactly some set of objects in there, even animated objects like ceiling fans, or characters who might be watching TV. We have some pre-calculated lighting in there. And we can also cast rays back through the real window to get a sense for the key light where the sun is, where the shadows might be entering the room and all that. And it worked [laughs]. It comes together really nicely. You get that sense of movement back there, that adds a lot. And that's how we did the interiors for this game.
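To make the idea concrete, here's a minimal C++ sketch of how a window-keyed room lookup of this kind might work - a deterministic hash picks one of the hidden rooms, and the trace continues against a BVH filtered to that room's contents. Every name, the hash and the masking scheme here are illustrative assumptions, not Insomniac's actual code.

```cpp
#include <cstdint>

struct Vec3 { float x, y, z; };

constexpr uint32_t kNumFakeRooms = 32; // rooms modelled once, buried below the ground plane

// Deterministic hash so the same window always selects the same room variant.
uint32_t HashWindowId(uint32_t buildingId, uint32_t floor, uint32_t window)
{
    uint32_t h = buildingId * 2654435761u ^ (floor * 40503u + window * 31u);
    h ^= h >> 16;
    return h;
}

struct InteriorRoom
{
    Vec3     origin;       // where this hidden room sits under the city
    uint32_t instanceMask; // BVH filter: hit only this room's furniture and characters
};

// When a ray hits a window flagged as "interior", pick the room for that window
// and continue tracing inside it, filtered down to that room's objects.
InteriorRoom ResolveInterior(uint32_t buildingId, uint32_t floor, uint32_t window,
                             const Vec3 roomOrigins[kNumFakeRooms])
{
    const uint32_t roomIndex = HashWindowId(buildingId, floor, window) % kNumFakeRooms;
    return { roomOrigins[roomIndex], 1u << roomIndex };
}
```

Because the mapping is a pure function of the window's identity, it costs nothing to store and never changes from frame to frame.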

Digital Foundry: And also, there are perfect shadows, of course, a benefit of tracing into these scenes. I think shadow maps would be expensive and probably not look very good.
Mike Fitzgerald:
Yeah, not for each window. Certainly.

Digital Foundry: I also liked how even on the sides of buildings you could sometimes see into an office and look around, and there'd be a back door at the rear of that office that seemed to go around into the other side of the building. It's a very clever trick. And that is seriously a very difficult problem to solve. I think this is the first time I've seen a game tackle it in such an elegant fashion, where you really do get the feeling of depth on all the buildings. But another aspect of this depth that really caught my interest is the way you handled secondary reflections.

This is something I pointed out in my video on Miles Morales back at the launch of PS5, but you know, when you have two windows side-by-side, they're gonna reflect on each other. And then the reflection within the reflection should also appear in that - and that was absent in those original games, but I noticed it is actually present here. And I'm wondering what the source is for the secondary reflections? How did you actually solve this problem?
Mike Fitzgerald:
Mostly, we just use a plain probe of that area - what the reflections were before we had ray-traced reflections. We can fall back to that technique anywhere, just like we do on very rough surfaces, where we're using some sort of runtime or more 'baked' calculation of what's around there, because you don't really need to see movement in the way you do with ray-traced reflections. And so it's a great fallback for a secondary bounce in a reflection as well. It works for buildings, and it helps with cars a lot too, which I think looked a bit flat in reflections before - when they have a nice clear coat on them, you want to give a bit of that.
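As a rough sketch of the fallback Mike describes - rough surfaces, and the bounce seen inside a ray-traced reflection, sampling a prefiltered probe instead of tracing again - something like the following. Every type and function here is an illustrative stand-in (stubbed so the sketch compiles), not Insomniac's renderer.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };
struct Hit  { bool valid; Vec3 position, normal; float roughness; };

static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  Reflect(Vec3 v, Vec3 n)
{
    const float d = 2.0f * Dot(v, n);
    return { v.x - d * n.x, v.y - d * n.y, v.z - d * n.z };
}

// Stand-ins for the real renderer's facilities.
static Hit  TraceRay(Vec3, Vec3)           { return {}; } // hardware RT, one bounce
static Vec3 ShadeSurface(const Hit&)       { return {}; } // direct lighting at the hit
static Vec3 SampleProbe(Vec3, Vec3, float) { return {}; } // baked/runtime probe lookup
static Vec3 SampleSky(Vec3)                { return {}; }

constexpr float kRoughnessCutoff = 0.5f; // beyond this, the probe alone suffices

Vec3 ShadeReflection(Vec3 surfacePos, Vec3 normal, Vec3 viewDir, float roughness)
{
    const Vec3 r = Reflect(viewDir, normal);

    // Very rough surfaces: skip the ray entirely and fall back to the probe.
    if (roughness > kRoughnessCutoff)
        return SampleProbe(surfacePos, r, roughness);

    const Hit h = TraceRay(surfacePos, r);
    if (!h.valid)
        return SampleSky(r);

    // One real traced bounce; the reflection *within* the reflection comes
    // from the probe rather than a second, expensive trace.
    const Vec3 secondBounce = SampleProbe(h.position, Reflect(r, h.normal), h.roughness);
    return ShadeSurface(h) + secondBounce;
}
```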

Digital Foundry: I noticed the nice clear coat on those cars.
Mike Fitzgerald:
We put a lot of work into that - it was always a pet peeve.
Digital Foundry: The materials in general definitely feel like a step up, especially on the more minute objects, the smaller objects. But I don't want to get away from ray tracing just yet, obviously, because another big thing is the water reflections, which are now ray-traced. This discussion has been interesting, because I think a lot of people, especially with the way games used to work, maybe didn't consider how rough, choppy water actually behaves when light refracts and reflects off its surface. But I am curious to see how you tackled this challenge, both in terms of how you leverage the existing ray tracing features, but also how you handle something like this rough water, which is obviously not like a smooth, glossy surface - the reflection is more diffuse. So what kind of extra cost are we looking at? How do you pull it off?

Mike Fitzgerald: There's always a lot of balance with this. I think pretty early on, as soon as we had gameplay happening over the water - whether it was a mission like we showed in our first gameplay demo, or some early wingsuit (it wasn't even a wingsuit then) - just trying to get over to Brooklyn and Queens, you very quickly see it fall apart. And actually there are a lot of screenshots of the first game on PS4 where the building reflections fall apart really badly; you get this full silhouette of the character in front of it. And we just knew we could do better. But when you sort of naively drop ray tracing on that water, the performance spikes far, far too high. It was really expensive. We tried to get around it. We tried doing kind of a planar reflection technique, like rendering from underneath up through the water into the scene. But hey, rendering a whole other scene is really expensive as well. And then trying to translate that into the roughness of the water didn't quite work. And so we sort of got to a point where we said, ray tracing is going to be the right way to do this. So how can we mitigate that performance? Our graphics programmers are awesome, and the one in particular who focuses on this stuff did a great job trying to find the right compromises that you don't notice in the same way when you're looking at water, as opposed to windows.

So we actually render the water reflections at quarter-res horizontally and, because they're so stretched out, you don't notice that at all. And there is always this fun march of when you're focused on optimising one thing, you get the progress updates during the day, and it's like, 'hey, here's a side by side. This one is literally half as many rays, half the performance cost, it looks exactly the same, right?' Everyone's like, yeah, pretty much. And so it's finding those little bits where you can save time and effort. A big thing for water that makes it so expensive is that it's so choppy that your rays shoot in all different directions, and that's really challenging for ray tracing hardware. So how can you bin them? How can you group them? How can you maybe not shoot them in exactly the right direction, but one that's close enough, and yet is more coherent with what's nearby, where you can pull some extra performance out of the hardware? And really, it's just pulling that thread over and over repeatedly and eventually getting to a point where we could do it in performance mode as well, which was huge for the quality of performance mode.
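Here's an illustrative C++ sketch of those two optimisations - one ray per 4x1 pixel block (quarter-res horizontally, hidden by how stretched the reflections are) and directions snapped to a coarse grid so neighbouring rays stay coherent for the RT hardware. The numbers and function names are assumptions for illustration only.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

constexpr float kPi = 3.14159265f;

static Vec3 Normalize(Vec3 v)
{
    const float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// Snap a direction onto a coarse spherical grid: not exactly the "right" ray,
// but close enough for choppy water and far friendlier to the hardware.
static Vec3 QuantizeDirection(Vec3 d, int bins)
{
    d = Normalize(d);
    const float stepTheta = kPi / bins;
    const float stepPhi   = 2.0f * kPi / bins;
    const float theta = std::round(std::acos(d.z) / stepTheta) * stepTheta;
    const float phi   = std::round(std::atan2(d.y, d.x) / stepPhi) * stepPhi;
    return { std::sin(theta) * std::cos(phi),
             std::sin(theta) * std::sin(phi),
             std::cos(theta) };
}

// Stubs standing in for the wave-perturbed reflection vector and RT dispatch.
static Vec3 WaterReflectionDir(int, int) { return { 0.3f, 0.1f, 0.9f }; }
static void TraceRay(Vec3)               {}

void TraceWaterReflections(int width, int height)
{
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; x += 4) // one ray per four horizontal pixels
        {
            const Vec3 dir = QuantizeDirection(WaterReflectionDir(x, y), /*bins=*/64);
            TraceRay(dir); // the result is later filtered across the 4x1 block
        }
}
```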
Digital Foundry: That's interesting. So yeah, I was wondering about hitting 60fps or higher with ray-traced water. Do you remember the point where you said, 'you know what, we're just gonna go all in on ray tracing here, and drop the fallbacks'. Did that happen early in development? Was that always the goal? Or did you just happen to get enough performance that you're able to say, 'yeah, we can actually do this?'

Mike Fitzgerald: Well, it has always pained me, whenever I see a screenshot of the first couple of Spider-Man games, or if I see Clank with ray tracing turned off. And I always know immediately, when I see that screenshot and it always bums me out. And so it was always an aspiration for this one: wouldn't it be great if we didn't have to have that real big compromise in there, if we got back to a pure performance and resolution trade-off. And I don't think we committed to it until earlier this year, but I think basically, we saw the way performance was trending and we said, 'you know, it'll be work, but let's go for it'. I feel like we got there with the last games, but we always got there, like, a week after launch. Or, right at the deadline, and we're never quite confident enough. And so this time, it just took a bit of 'No, we're gonna do it and we can get to where it needs to be.'
Marvel's Spider-Man 2 - the Digital Foundry video review.

Digital Foundry: I like the word trending, because something I think is really important to consider when optimising any of these games is that you have to think very far ahead into the future, right? You can't just say at the last minute, 'yeah, we're going to do a 60fps mode'. It's going to be something that's really planned early on and worked towards. And obviously, that must have been a lot of hard work to get going with all that ray tracing. But another thing about the ray tracing that was interesting is that it seems like the particles - not all of them, but many of the particles - are now rendered in the ray-traced reflections. And this was definitely not the case in the last games, or even in the earlier Spider-Man 2 trailers. What's the story there?
Mike Fitzgerald:
Trailers are so funny. You make them and I think there's this sort of perception of 'well, it's all smoke and mirrors and the real shipping game is not even going to look that good'. But when our media team - who's awesome - was capturing those trailers, we were always thinking, 'well, don't put that scene in, because we know we have this thing coming that's going to make it look better'. And when the trailers come out, I know our core tech team is always thinking, 'oh, but it looks better now. Why couldn't how it looks now be in the trailer?' So in particular, I think it was the story trailer. You're getting pulled by Lizard up the side of a building, and there's this whole truck bouncing around, and there's all this fire and smoke - and none of that fire and smoke is reflected in the building. And I think the day the trailer came out is the day that we got that feature up and running and looked at that scene in the game. And we're like, 'ah, but it looks better now.'

Digital Foundry: I do actually want to ask you a little bit about the lighting for this game, the way you handled global illumination and the different lighting methods, and whether there are any improvements there. I think there's actually interest in understanding the way you do your pre-calculated lighting pass.
Mike Fitzgerald:
We have a GI bake across the city and in all the custom interiors for missions and things like that. The city is enormous - it's twice as big in this game - and we always want to be increasing the quality of that bake, whether it's resolution or whether we capture specular as well, you know, all these different factors of the light that we're baking in and exposing to the real-time part of the game. And it's funny, this stuff takes ages to bake across the city, and the city is also changing during development, so you can't just ask a team, like, 'well, if you're working on this street corner, make sure you rebake all the lighting there'. They don't want to sit there for an hour to pre-bake it all just because they moved some stuff around. So, in this game, we actually worked quite a bit on the non-glamorous part that doesn't show up in the final product, in a sense, but lets us iterate on all that more easily and cleanly. We used to have a farm of machines that were baking the city over a week, and now it's like one machine can bake the sunset lighting setup in four days or something like that. So that's a big, iterative thing for us as we go.
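One common way to build that kind of iteration speedup - offered here purely as an assumption about the sort of tooling Mike means, not a description of Insomniac's pipeline - is to hash each city tile's bake inputs and only re-run the expensive bake where something actually changed:

```cpp
#include <cstdint>
#include <unordered_map>
#include <vector>

struct CityTile
{
    uint32_t id;
    // geometry, materials, light placements... everything the bake reads
};

// Stubs: a stable content hash of the tile's inputs, and the expensive bake itself.
static uint64_t HashTileInputs(const CityTile&) { return 0; }
static void     BakeTileGI(const CityTile&)     {}

void IncrementalBake(const std::vector<CityTile>& city,
                     std::unordered_map<uint32_t, uint64_t>& lastBakedHash)
{
    for (const CityTile& tile : city)
    {
        const uint64_t h = HashTileInputs(tile);
        const auto it = lastBakedHash.find(tile.id);
        if (it != lastBakedHash.end() && it->second == h)
            continue; // inputs unchanged since the last bake: keep the cached result

        BakeTileGI(tile);
        lastBakedHash[tile.id] = h;
    }
}
```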

Looking at the city, I mentioned some big-picture things about it versus real New York, and one is that all our dynamic lighting had to stop at some point in the distance. Now, we have a version of the light bake for the entire level, the entire world, that can be loaded at one time. Another thing is that across the river, looking from the side of Manhattan across to Brooklyn, you're looking at something very far away with nothing in between. Usually in Manhattan there's a lot of faraway stuff too, but there are buildings close by, and that's what you're really seeing. This was something far, far away that was also all you cared about and wanted to look at. So we needed to make sure our lighting over there was good as well. So it's a nice side-by-side of seeing how much farther real dynamic lighting at street level, and real GI bakes in between the buildings, hold up at a distance.
Digital Foundry: Yeah, and there's a good number of times of day in this, so each one of those has its own separate bake. Just offhand, do you have any idea how much larger the lighting data is compared to, say, Spider-Man on PS4?

Mike Fitzgerald:
I don't recall exactly what it was on PS4, but I know we were so constrained by how big things were, and by streaming off the disk, that it couldn't have been very large. I want to say that for this one, streaming data for the open world is maybe a third of the disc, something like that - it's a pretty substantial component of the data, a big chunk. And the team goes through a lot of work to compress that data, finding efficient ways of storing it, putting it in formats that can be nicely compressed by the Kraken compressor on the PS5. There are all sorts of tricks to get stuff to fit on a Blu-ray, which is a challenge of its own, and then also to stay small enough to stream in as you play the game.
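A classic example of the kind of trick Mike alludes to - and to be clear, an assumption for illustration, not Insomniac's actual on-disc format - is reshaping data so a general-purpose compressor like Kraken finds more redundancy, for instance delta-encoding slowly-varying values so most of them become small and repetitive:

```cpp
#include <cstdint>
#include <vector>

// Replace each value with its difference from the previous one. Sorted or
// slowly-varying data turns into streams of tiny numbers, which a
// general-purpose compressor squeezes far better than the raw values.
std::vector<uint32_t> DeltaEncode(const std::vector<uint32_t>& values)
{
    std::vector<uint32_t> out;
    out.reserve(values.size());
    uint32_t prev = 0;
    for (uint32_t v : values)
    {
        out.push_back(v - prev);
        prev = v;
    }
    return out;
}
```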
Digital Foundry: Thinking of the lighting there, I'm just wondering if you've examined or explored the cost of something like doing ray-traced global illumination? Do you think it's actually feasible on these platforms in a game like this? Or is it still just a little bit too much, especially if you factor in the reflection pass?
Mike Fitzgerald:
Every game has its own trade-offs in environment and style of play. I think one thing we looked at was, I mean, we're in New York City - the reflections are just hugely important for the look of that space, and we never want to sacrifice any of that for something else. Another component is, we move so fast through that space that we can't afford much time for rendering techniques to sort of resolve. And that's moving through it, that's also camera cuts and things like that. And it's something I noticed as a compromise of ray-traced GI: these scenes need time to settle. I mean, that's a broad generalisation. Some people are doing some awesome stuff with it. But for us, I know we wanted to focus on reflections and make sure any quick traversal through the city was really looking stable and correct, I guess, as much as we could as you were moving through it. But all that stuff is so exciting. And you can do some gorgeous stuff with those techniques, so we won't be ignoring them!

Digital Foundry: There we go! Of course, you mentioned New York, and another part of New York is the density of stuff. Something I really noticed and picked up on here is that there's so much more traffic, pedestrians etc. Plus there's just more general detail from any perch; it feels like there's more detail closer to the player. Could you talk a little bit about all those different factors and how that's changed for this game?
Mike Fitzgerald:
It's funny to see people talk about it, because there's this notion of 'oh, there's LODs and you just watch everything transition in and out of them'. But there are like 10 or 15 interlocking systems that all factor into how detailed something might look. You have textures streaming in and out at different distances to the camera and blending into each other. You have models with LODs that you see at different distances. You have our imposter system, which is for distant buildings - how do you represent those in a low-poly way that you can keep in memory all the time? You can see the whole world now, Brooklyn and Queens included, so how do we represent all those buildings efficiently? When do we transition to something better looking?

So in this game, we added a new high-res version of those imposters. There's a new middle-distance version of a building that has more geometric detail than the far ones, but isn't yet the independent models for different pieces that we'd need to render up close - but you do get real reflective windows on those, so that's part of that transition. We also have a lot of objects that need to exist at a distance, things like big air conditioning units, or antennas on top of buildings, or water towers, which affect silhouettes. So those exist as independent models that we show at different distances, far away and close. The windows have their own techniques to blend at different distances, and the materials have their own levels of detail as height effects and parallax occlusion come in and out on them as you get close and far. And the goal is to make it all invisible, so your brain fills in the details for the far stuff and makes you feel like it's the same as if you were standing next to it.
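As a sketch of the kind of distance-based switching described here - with hysteresis so a building doesn't flicker between representations at a boundary - consider something like the following. The thresholds and names are illustrative assumptions, not Insomniac's values.

```cpp
// Three of the representations described above, from closest to farthest.
enum class BuildingRep { FullModel, HighResImposter, FarImposter };

BuildingRep SelectRepresentation(float distance, BuildingRep current)
{
    // Overlapping bands (hysteresis) so small camera moves don't cause pops.
    const float kNear = 300.0f, kMid = 1200.0f, kBand = 50.0f;

    switch (current)
    {
    case BuildingRep::FullModel:
        if (distance > kNear + kBand) return BuildingRep::HighResImposter;
        break;
    case BuildingRep::HighResImposter:
        if (distance < kNear - kBand) return BuildingRep::FullModel;
        if (distance > kMid + kBand)  return BuildingRep::FarImposter;
        break;
    case BuildingRep::FarImposter:
        if (distance < kMid - kBand)  return BuildingRep::HighResImposter;
        break;
    }
    return current; // stay put inside the hysteresis band
}
```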
Here's our video content showing the differences between the various performance modes in Marvel's Spider-Man 2.

Digital Foundry: That's definitely something I really noticed, especially sitting up on some of the perches, just the general variety across the scene. In the original, you look out across the buildings and there's a point where they just kind of turn to blocks. It's within the constraints of a PS4, but here it feels like there's much more subtlety on every surface from a distance, which does make a huge difference in terms of quality.
Mike Fitzgerald:
To speak to the wear and tear on buildings - sort of the blocky, flat sides - one thing we did was introduce this big step on buildings that looks at their geometry and says, 'well, if it had been raining here for 50 years, here's where the grime would be around the windowsills, or this is a wear pattern on the side of this building'. That can be a dynamic input to a material and shader when you're close to it. But since we bake those imposters, we can bake it right into that unique imposter. So you can get 'yes, that's a brick building', but we can feed in this sort of macro breakup to that brick material that says, 'oh, there was a mural here and it faded away 10 years ago'. And that has added a lot of variety looking out over Manhattan that makes it feel less cartoony and more real.
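To illustrate the flavour of heuristic Mike describes - and the weights here are invented for illustration, not Insomniac's shader - a geometry-driven weathering mask might combine rain runoff below ledges with dirt trapped in occluded crevices:

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

// normal: surface normal (z up); occlusion: 0 = open, 1 = fully occluded
// crevice; heightBelowLedge: metres of wall below the nearest sill or ledge.
float GrimeWeight(Vec3 normal, float occlusion, float heightBelowLedge)
{
    // Vertical walls just below horizontal ledges catch decades of rain runoff.
    const float runoff = std::exp(-heightBelowLedge * 2.0f) * (1.0f - std::fabs(normal.z));

    // Corners and crevices hold on to dirt that rain never washes away.
    const float crevice = occlusion * occlusion;

    return std::clamp(0.6f * runoff + 0.4f * crevice, 0.0f, 1.0f);
}
```

Baked once into the imposter texture, a mask like this would cost nothing at runtime for distant buildings.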
 