Sooo… uhh… Graphical Downgrades.

 


So, has Detroit: Become Human had a visible graphical downgrade?

Well, you don’t need two functioning eyes to see it has – albeit a slight one, admittedly. Texture work has clearly been simplified, as have some of the curves and features of the models in question. It’s not exactly the worst graphical downgrade I’ve ever seen – Dark Souls 2 would like to have a quiet word with you on that front – but of course it’s had a graphical downgrade. Why is anyone shocked?

First of all, there’s a reason I am against “proof-of-concept” trailers, or teasers released more than two years before a video game hits the market. It’s not that we have to wait an agonising age – and hell, I ain’t getting any younger – but that the very nature of early footage is predicated on an absolute falsehood: that the footage in the teaser or trailer will end up looking that good in the finished game. It won’t, and the main reason for that is simply the volume of content within said trailer, teaser or even proof-of-concept demo – because we all know how those work.

However, let’s talk about resolutions for a second. There’s a point to this, trust me.

Old-school NTSC resolution was 640 x 480 – an overall pixel count of 307,200 (PAL was weird, so we won’t talk about it because I’d be here for hours). A comparatively small space to render an image in. HD Ready came next, at 1280 x 720, or 720p – a pixel count of 921,600, exactly three times the pixel count of NTSC. Full HD followed at 1920 x 1080, or 1080p, with a pixel count of 2,073,600 – more than twice the pixel count of 720p. And 4K – 4096 x 2160 (strictly the cinema spec; most consumer UHD panels are 3840 x 2160) – has 8,847,360 pixels, more than four times the count of 1080p.
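
(If you want to sanity-check those figures, here’s the back-of-the-envelope arithmetic as a throwaway Python snippet; it’s just width times height per standard, nothing cleverer than that.)

```python
# Rough arithmetic behind the figures above: width x height per standard,
# plus how each one compares to the standard that came before it.
standards = [
    ("NTSC (480)",      640,  480),
    ("HD Ready (720p)", 1280, 720),
    ("Full HD (1080p)", 1920, 1080),
    ("4K (DCI)",        4096, 2160),
]

previous = None
for name, width, height in standards:
    pixels = width * height
    note = f" ({pixels / previous:.2f}x the previous standard)" if previous else ""
    print(f"{name}: {pixels:,} pixels{note}")
    previous = pixels
```

Run that and you get 307,200, then 921,600, then 2,073,600, then 8,847,360: the same figures as above.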

This is Starfox, on the SNES, in 1993.

I give you those numbers to show just how far we’ve come over the last couple of decades, but also as a demonstration that the higher pixel density of modern screens puts a heavy additional strain on hardware. The fact is that hardware has to fill in all that additional space – and everything being drawn within it – and keep doing that, in essence, up to sixty times a second. It’s why, even with £200 of additional hardware, including 4GB of extra RAM and a significantly improved CPU and GPU, the Xbox One X isn’t going to run games any better than any current-gen console – all that additional grunt is being poured exclusively into the 4K rendering.
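
(To put a rough number on “sixty times a second”, here’s a purely illustrative bit of arithmetic showing how many raw pixels have to be filled every second at each resolution and frame rate. It ignores everything a real GPU actually does per pixel, and I’m using the 3840 x 2160 UHD output a console actually targets rather than the cinema 4K spec.)

```python
# Purely illustrative: raw pixels that must be filled every second just to keep
# the screen updated, before any shading, physics or AI enters the picture.
def pixels_per_second(width, height, fps):
    return width * height * fps

for label, width, height in [("1080p", 1920, 1080), ("4K UHD", 3840, 2160)]:
    for fps in (30, 60):
        print(f"{label} @ {fps} fps: {pixels_per_second(width, height, fps):,} pixels/second")
```

That works out to roughly 124 million pixels a second for 1080p at sixty frames, and just shy of 500 million for 4K UHD at the same frame rate: around four times the fill work, every single second.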

Think on that. £200 more hardware, to render a current game in 4K. Quite a hefty chunk of cash when you think about it.

But let’s quickly nip back to NTSC. When was the first NTSC broadcast? Well, the first was way – WAY – back in 1941, though the colour standard itself wasn’t finalised until around 1953. With the first HD Ready standard coming in, let’s say, 2005, that was 52 years of one single visual standard. And in the twelve years since then, we’ve multiplied the pixel count of our screens almost tenfold (9.6 times, if my maths isn’t off). Compared to NTSC, 4K is around 29 times the pixels. Our screens haven’t really gotten much bigger on the whole – but the number of pixels being pushed into that space has multiplied considerably.
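
(Again, nothing scientific, just the division behind those two multipliers, in case you want to check my maths.)

```python
# The long view: how many HD Ready or NTSC screens' worth of pixels fit into a 4K image.
ntsc     = 640 * 480     # 307,200
hd_ready = 1280 * 720    # 921,600
four_k   = 4096 * 2160   # 8,847,360

print(f"4K vs HD Ready: {four_k / hd_ready:.1f}x")  # roughly 9.6x
print(f"4K vs NTSC:     {four_k / ntsc:.1f}x")      # roughly 28.8x
```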

This is Starfox on the Wii U in 2016. Any questions?

Let’s be real here for a second – such rapid advances have consequences, and part of that is that hardware is struggling to keep up. Hell, sixty frames a second wasn’t even an issue back in my youth – though admittedly in PAL regions we were often stuck at 50Hz, with games running at 25 frames a second, because PAL (the UK was a pretty big market for imports back then because of this – ahh, the days we actually spent more on a product to play it at sixty frames a second…) – but now we’ve got many, many times the pixel density in our screens and… thirty frames a second, more often than not. You have a much clearer, much more detailed image… but ultimately, a console can only output so much detail before it begins to struggle. We know this – crank up the visuals on an older PC and the frame rate tumbles to a slideshow. More resources are being funnelled into the image, but the frame rate takes a massive hit as a result, before the hardware eventually overheats, because we obey the laws of thermodynamics, thank you very much.
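
(One more bit of simple arithmetic, because it makes the trade-off concrete: at a given frame rate, the hardware has a fixed slice of time in which to build each frame, and every extra pixel, shadow and physics object has to fit inside that same slice.)

```python
# The time budget the hardware has to build one complete frame at a given frame rate.
for fps in (30, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
```

Sixty frames a second means everything, from geometry and lighting to physics and AI, has to be done in roughly 16.7 milliseconds, every single frame; miss that window often enough and you get exactly the slideshow described above.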

Although, to be fair, you have to be impressed by what modern hardware does. Sure, in the past we enjoyed our sixty frames a second. But it wasn’t simply that pixel counts were lower and images more primitive – by and large, we didn’t have hugely complex physics engines, randomised weather effects, or the host of bells and whistles you expect in a video game released in 2017. The hardware had to read the game data and render the image to screen. That was it. No operating system in the background, no insane gimmicks – heck, Starfox on the SNES needed an additional chip (the Super FX) inside the cart to pull off even that primitive 3D. A modern system has to do all of this – and oftentimes even more – and render an image on a screen with a pixel count that would have blown my eight-year-old mind back in the ’80s.

All of this is to explain that actual computer hardware has struggled to keep up with how quickly resolutions have increased. And if you think the £200 extra just to render 4K is bad, we haven’t even talked about 8K yet – 7680 x 4320, or 33,177,600 pixels. Even I’d admit that in terms of video games, that’s a long way off from being a reality. Could it be possible? Sure. I can see a thirty-second trailer pushing 8K. I also foresee fire engines, difficult questions on insurance forms and possibly awkward funeral conversations in the aftermath.

So back to our teasers and proof-of-concept trailers.

Most of these merely need to render a very small, very specific area, with minimal AI, physics or overheads like lighting, shadows or weather getting in the way. This means that, if you were an incredibly dishonest video game developer – love you, Randy – you could, in fact, use all the resources that frees up to super-sample, push things like anti-aliasing way up, or run higher-resolution textures and higher-density polygonal models. In short, because you’re not having to stream in huge quantities of data, or ask the engine to prepare and render areas and enemies that aren’t in said demo, you can pour the spare system resources into cranking those visuals up as near to photo-realistic as the hardware can manage. There is less for the hardware to do, so the otherwise wasted headroom goes into rendering very pretty images.

But as we know, a full game is not a demo. A full game has a lot to do, a lot to render, and a lot to keep processing. As a result, developers often need to scale back the visual side of things to strike some kind of balance – if they didn’t, you’d end up like that old PC example: a pretty image, but a frame rate that renders the whole thing utterly unplayable. And no-one is going to buy a game that runs at around 10 frames a second. It doesn’t matter how good it looks – playing it would be frustrating in the extreme.

If I were to be extremely generous – it’s not always the fault of the developers. It’s easy to crank up visual settings in most engines these days, and unquestionably the end results are impressive on the whole, but doing that with what is a very small portion of a video game only skews expectations for the finished product and the reception gamers give it. What you show off is an impressive tech demo – but that is, in effect, all it is. A tech demo. It’s what –could– be possible, someday, maybe, on better hardware. But that doesn’t mean it’s actually possible right now, on current hardware.

Video games are a compromise. With the way hardware currently stands, it’d be nice if we could have it all – but we can’t. I’d love a game with the graphical grunt of The Order: 1886 married to an open-world RPG like Breath of the Wild.

I’d also like a unicorn.

Always… I wanna be with you…

What I’m saying here is that perhaps developers and publishers need to start being more realistic about what is actually possible – and stop selling consumers the aspiration of a product that in reality is years away from being achievable, if not an actual generational cycle or two away. Dark Souls 2 was a disappointment on the PS3 and 360. But play it on the PS4 or Xbox One – it’s a dramatically different beast, and vastly better as a result. You get the lighting and shadow effects put back in, plus a higher resolution and a solid sixty frames per second, and it is a massive, undeniable improvement. But you had to wait for them to actually put that on new-gen hardware, which took an additional year (which does beg the question of why they didn’t just wait the year and blow us away on newer consoles).

But also, we’ve had this as gamers since the Gen-5 era. We’ve had it over and over again – tech demos masquerading as actual products. That began back in 1996 – so after 21 years, let’s ask the question I threw out so nonchalantly at the start of this blog post: why is anyone shocked? Resident Evil 2 was teased in images with lovely visuals and lots of zombies on screen. Yes, the “beta” 1.5 code is out there. And it runs quite nicely. But that’s on current hardware – Capcom said ages ago that the thing we call Resident Evil 1.5 didn’t run very well on the original PSX hardware. And you know what? I’m actually inclined to believe them – I think one screenshot had, like, a dozen zombies on screen at once. On the PS1. It’s one of the few instances where I can probably buy Capcom’s excuses.

And Resident Evil 2 is awesome and all, but come on… it –was– a graphical downgrade from the 1.5 shots. It’s okay. It was twenty years ago. We can let that go now.

This is the market we inhabit right now, though. I don’t want developers to stop striving for better – that’d be stupid. But perhaps they need to have a more realistic idea of where their game will actually end up – rather than showcase something that exists purely as a tech demo and nothing more.

… and perhaps we need, as gamers and consumers, to stop pretending that “gameplay” footage at something like E3 is gospel truth. It’s not. It’s marketing, pure and simple, a performance put on a huge stage for millions of avid viewers on the Internet. It’s theatre.

After two decades of this, you’d think we’d be wiser by now. I guess even there, I may need to rein my expectations back in…
