We have all seen the rise of "AI" over the last two years, and even more so when it comes to DLSS.
We see the layoffs and people being replaced by AI.
I think there are three categories in which AI might damage gaming, some arguably worse than others:
1) Resolution upscaling (DLSS 2 and FSR 2)
Let me talk a bit about those. DLSS 1 started with the Final Fantasy XV release on PC and turned the game into a painting. It was very aggressive, but those were the first steps; it was actually a very good idea, and just a curiosity, a setting in the menu.
Then DLSS 2 and FSR 2 came along, and again, both are a good idea, but they are absolutely misused by console developers, whether through lack of knowledge or lack of time. DLSS 2 can look amazing upscaling to 4K on a high quality preset... maybe even Balanced. FSR 2 at its highest quality setting at 4K also looks good. But console devs run the base resolution at something like 25% of the final image and go from 1080p to 4K, or even lower. The end result is an image that takes forever to accumulate. In motion? Trash. With transparencies? Trash. It's a fake image presented to us that looks terrible in motion. Like Alan Wake 2 on PS5: they just slapped FSR 2 on that game, nothing else. It's roughly 1200p to 4K in QUALITY mode.
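To show what I mean by "takes forever to accumulate", here's a tiny Python sketch of the temporal accumulation idea these upscalers build on. It's a toy model with an assumed blend weight, not how DLSS 2 or FSR 2 are actually implemented (they add motion vectors, history clamping, and in DLSS's case a neural network):

```python
# Toy model of temporal accumulation (the idea TAA-style upscalers build on):
# each frame contributes one jittered low-res sample, blended into a history
# buffer. Real DLSS 2 / FSR 2 add motion vectors, history clamping and (for
# DLSS) a neural network; this only shows why convergence takes many frames.

blend = 0.1      # per-frame blend weight into the history (assumed, TAA-ish)
target = 1.0     # the "true" converged pixel value
history = 0.0    # accumulated value; reset whenever the history is rejected

for frame in range(1, 31):
    history = (1 - blend) * history + blend * target
    if frame in (1, 5, 10, 22, 30):
        print(f"frame {frame:2d}: {history:.2f} of the converged value")

# With a 0.1 blend it takes about 22 frames to reach 90% of the target.
```

Anything that throws that history away (fast motion, transparencies, disocclusion) drops you right back to that ugly first-frame state, which is exactly the in-motion trash I'm complaining about.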
But we all forget one thing: FSR 2 and DLSS 2 are not free. Maybe instead of 1200p, Alan Wake 2 could run at 1440p with a solid TAA in place? No pixel break-up in motion, and so on.
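To put rough numbers on those resolutions, here's a quick sketch. It's plain arithmetic; the internal resolutions are the ballpark figures I quoted above, not confirmed specs for any particular game:

```python
# Rough pixel-count arithmetic for the resolutions mentioned above.
# The internal resolutions are ballpark figures, not confirmed specs.

def pixels(width, height):
    return width * height

output_4k = pixels(3840, 2160)

internal = {
    "1080p (1920x1080)":  pixels(1920, 1080),
    "~1200p (2133x1200)": pixels(2133, 1200),
    "1440p (2560x1440)":  pixels(2560, 1440),
}

for name, px in internal.items():
    print(f"{name}: {px / output_4k:.0%} of the 4K pixel count")

# 1080p is exactly 25% of 4K (the "25% of the final image" case);
# ~1200p is about 31%, and 1440p is still only about 44%.
```

So even a 1440p "Quality"-style input means well over half of the final 4K pixels have to be reconstructed.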
2) Framerate upscaling, i.e. frame generation (FSR 3 and DLSS 3).
With this, not only are our frames not real 4K, they are also not... real frames. Granted, I only have a 3080, so my DLSS 3 experience is limited, but I got plenty of time with it on my best friend's 4080: Hogwarts Legacy, Portal RTX, and a few other games. On my side, I have tested FSR 3 on PC and now on Immortals of Aveum.
To start: FSR 3 is not good. It's kind of fine on console, but it's terrible on PC. DLSS 3 is much better, but again, both of these techniques improve the look of the moving image at the price of some real base fps (just a few percent) and input lag. The game renders a frame AHEAD, analyzes it together with the current real frame, and puts a new, generated frame in between. It adds input lag; a minuscule amount, but technically it does. And it creates visual artifacts. The idea was to run a 100 fps game at 200 fps or close to it, not to take a 40 fps game to 120 fps. That's nonsense; you have such a low base framerate, and such long frametimes, to work with.
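Here's the frametime arithmetic behind that, as a rough sketch. It only models the "hold the newest real frame back" part described above, with an assumed half-to-one-frame delay; the real DLSS 3 / FSR 3 pipelines and Reflex / Anti-Lag change the exact numbers:

```python
# Back-of-the-envelope latency cost of interpolated frame generation.
# The generated frame sits between the previous real frame and the newest
# one, so the newest real frame is held back by very roughly half to one
# base frametime. This ignores per-title overhead and Reflex / Anti-Lag.

def frametime_ms(base_fps):
    return 1000.0 / base_fps

for base_fps in (100, 60, 40, 30):
    ft = frametime_ms(base_fps)
    print(f"{base_fps:>3} fps base -> {ft:5.1f} ms per real frame, "
          f"roughly {ft / 2:.0f}-{ft:.0f} ms of extra hold-back latency")

# At 100 fps base that is ~5-10 ms, easy to swallow. At 40 fps base it is
# ~12-25 ms stacked on top of already-long 25 ms frametimes, which is why
# boosting a low base framerate feels worse than the output number suggests.
```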
I am just fine with locking fps to 30 (like FF16, a great game this gen) or 40. It takes a few minutes to get used to it. Raw image, raw framerate, no artifacts or distracting stuff.
So OK, both of the techniques above can ruin image quality, but let's be honest here: they CAN IMPROVE IT too. When used right, they can be good. We were just super naive to trust console devs to do it correctly.
3) AI concept art, AI upscaling of textures during development, AI-generated characters, plotlines, everything; we will probably see it all very soon.
The industry is already extremely risk-averse because games take seven years to make. Somehow we have all these 100-hour games for no reason.
Now we will have 100-hour games filled with AI crap: no soul, no art, everything looking the same.
It is rumored that the next consoles will make a big deal of AI. We can see Nvidia going almost AI-exclusive. Everyone is going AI crazy... but I don't even think it's real AI.