The Atari 2600 was an 8-bit console. The NES followed about six years later with more power and better games, letting developers do things that made the Atari 2600 look like child's play.
The SNES and Genesis were 16-bit, and this is where you saw Sega lean into "blast processing" to differentiate itself from the SNES in the market.
The PS1 was 32-bit, and Nintendo, releasing the N64 a bit late, pushed hard to market it as 64-bit. Likewise, the Dreamcast was marketed as 128-bit...
This is where the industry realized the marketing tactic wasn't going to work moving forward. The PS2, while it had a 128-bit bus, used a 64-bit processor and was still more powerful than the Dreamcast. This is when the conversation changed from bits to GFLOPS, and we've largely been measuring in FLOPS ever since.
At this time we were also counting polygons in main characters to show the progression of technology: the number of polygons in Solid Snake in Metal Gear Solid 2 compared to Metal Gear Solid. This continued with the PS3 and the polygon count of Snake in MGS4. These polygon counts became less relevant on the PS3 and were barely, if ever, mentioned on the PS4.
In the HD era we really started focusing on resolution, and that's largely where we are today: focused on 4K and TFLOPS. But TFLOPS are no longer a great way to gauge a machine's ability to deliver on gameplay and visuals, and natively rendered 4K is expensive and largely pointless.
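Part of why TFLOPS are such a blunt instrument: the headline number is just peak FP32 math, ALU count times clock times two FLOPs per cycle (a fused multiply-add counts as two). It says nothing about memory bandwidth, bottlenecks, or what actually reaches the screen. A minimal sketch, using the commonly cited PS5 figures (36 compute units of 64 ALUs at ~2.23 GHz) as an example:

```python
def theoretical_tflops(alus, clock_ghz, flops_per_cycle=2):
    """Peak FP32 TFLOPS: ALU count x clock (GHz) x FLOPs per cycle.
    A fused multiply-add counts as 2 FLOPs, hence the default."""
    return alus * clock_ghz * flops_per_cycle / 1000.0

# PS5-class GPU: 36 compute units x 64 ALUs each at ~2.23 GHz
print(round(theoretical_tflops(36 * 64, 2.23), 2))  # ~10.28
```

That lands on the familiar marketing number, which is exactly the point: it's a ceiling on arithmetic throughput, not a measure of games or visuals.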
We're entering a new era: the AI upscaling era. Everyone will be judged by the ability of their AI upscaler, which is why I think Sony gave a name to their upscaler, to differentiate it from the others on the market (DLSS, XeSS, FSR).
I think it's a move made in recognition that native 4K and native 8K are no more the future of video gaming than 256-bit was, and that the resolution race, like the bit race before it, is pretty much dead and irrelevant. I think it's also recognition that TV manufacturers aren't moving to make 240 Hz TVs anytime soon, which means there isn't much point in pushing framerates above 120 fps.
There's no reason for gaming to push beyond the capabilities of televisions, and that's pretty much been the case since the beginning, when TVs ran at 50 Hz or 60 Hz depending on where you lived.
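The diminishing returns above 120 fps fall out of simple arithmetic: each doubling of framerate shaves half as many milliseconds off the frame time as the previous doubling did. A quick sketch:

```python
def frame_time_ms(fps):
    """Time budget per frame in milliseconds."""
    return 1000.0 / fps

prev = None
for fps in (30, 60, 120, 240):
    t = frame_time_ms(fps)
    saved = (prev - t) if prev is not None else 0.0
    print(f"{fps:>3} fps -> {t:5.1f} ms/frame (saves {saved:4.1f} ms)")
    prev = t
```

Going from 30 to 60 fps buys about 16.7 ms per frame; going from 120 to 240 buys only about 4.2 ms, which is why the jump is so hard to perceive.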
The benefits of 8K and 240 Hz are largely non-existent, so manufacturers will have to find a new big advancement to push TVs. We went from black and white to color, from CRT to LCD, from 480p to 1080p, to 1080p smart TVs, to 4K smart TVs, to 4K smart TVs with HDR. 8K won't be a thing until compression technology catches up. We've seen rapid advancement in television tech, and it may have outpaced the rest of entertainment; too many people are watching TV, movies, and TikTok on their phones. I think we're probably going to see TV tech slow down for the next few years.
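On the compression point, the raw numbers make it concrete: uncompressed bitrate is just width × height × framerate × bits per pixel. A rough sketch, assuming 10-bit 4:2:0 video (about 15 bits per pixel effective):

```python
def raw_bitrate_gbps(width, height, fps, bits_per_pixel=15):
    """Uncompressed video bitrate in Gbit/s.
    15 bits/pixel approximates 10-bit 4:2:0 chroma subsampling."""
    return width * height * fps * bits_per_pixel / 1e9

print(round(raw_bitrate_gbps(3840, 2160, 60), 2))  # 4K60: ~7.46 Gbps raw
print(round(raw_bitrate_gbps(7680, 4320, 60), 2))  # 8K60: ~29.86 Gbps raw
```

8K60 is four times the raw bitrate of 4K60, far beyond typical home broadband, so mainstream 8K delivery really does hinge on codecs closing that gap.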