Sony has made the right choice once again. Instead of doing something shitty like Microsoft did with the Xbox Series X/S from the outset, they've opted for a more sensible approach.

That's awesome to hear, and I hope Sony keeps it this easy, or maybe even makes it easier, in the future!
That's what Xbox should have done: they should have equipped the Series S with 16GB of RAM instead of 10 (with only 8 usable) and a somewhat more powerful GPU, to better handle RT and 1440p, so that it wouldn't be a burden for developers.
The Series S, with a GPU not at all suited to 1440p gaming and its ridiculous amount of RAM, all for 300 dollars, already represented mediocre value compared with the PS5 Digital Edition.

This would have raised the price of the Series S and likely made it more difficult to source during the pandemic.
The Series S launched at 300 dollars while the PS5 Digital launched at 400 dollars. If you had to raise the price of the Series S toward 350, it would have made for an even poorer value proposition than it already was.
The reality is PSSR is huge for Cloud.
The goal here will be to utilize significantly lower native resolutions combined with advanced compression, while still outputting high resolution games with minimal input latency.
Cloud is absolutely coming and will be increasingly important to Sony.
PiSSeR has absolutely nothing to do with cloud.
No one is going to do PiSSeR upscaling locally out of frames taken from lossy video compression.
The work pipeline even starts well before the framebuffer due to motion vectors.
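The motion-vector point can be made concrete: temporal upscalers (FSR2, XeSS, DLSS, and presumably PSSR) consume per-pixel engine data that a decoded video stream simply doesn't carry. A minimal sketch of that idea, with hypothetical names and fields:

```python
from dataclasses import dataclass
from typing import Any, Tuple

@dataclass
class UpscalerInputs:
    """Per-frame inputs a temporal upscaler consumes.

    A decoded H.264/H.265 stream only contains the final color
    image; everything else here comes from inside the engine's
    render pipeline, before the frame ever reaches the framebuffer.
    """
    color: Any                    # low-res rendered color buffer
    depth: Any                    # per-pixel depth buffer
    motion_vectors: Any           # per-pixel screen-space motion
    jitter: Tuple[float, float]   # sub-pixel camera jitter

REQUIRED = {"color", "depth", "motion_vectors", "jitter"}

def can_run_temporal_upscaler(available_buffers):
    """True only if every engine-side buffer is available."""
    return REQUIRED.issubset(set(available_buffers))

# A streaming client that decoded lossy video only has color:
print(can_run_temporal_upscaler({"color"}))   # False
# The engine, pre-framebuffer, has everything it needs:
print(can_run_temporal_upscaler(REQUIRED))    # True
```

That's why "just run PiSSeR on the client over the video feed" doesn't work as a proposal.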
With more memory and a slightly better GPU, the price would probably have gone up, but not by that much in reality.
If Sony managed to sell a PS5 Digital far more powerful than the Series S for $399, Microsoft could well have sold an XSS with 16GB of RAM (or at least 12GB) and a slightly better GPU for $349 without too much difficulty.
They just chose to put in 10GB and a weaker GPU to make a bigger profit on it, but that turned out to be a bad choice, and it adds more work for developers.
The S series as it stands is an anomaly.
Why do you call it pisser?
You clearly aren't paying attention.
I'm saying it's about combining technologies. It means you don't need to have as powerful hardware on the backend.
He's a fanboy who watches DF, and that's what they call it when they want to make fun of it.
It's a fun name.
But if you find me a better name that allows me to say it faster than "Pe eS eS ARR" I'll use it instead.
I think it's still important to separate the expectations of Xbox performance over PS5 back in 2020, mostly based on Alex Battaglia's ignorance and incompetence, from what we're getting now, which is developers not giving a crap about optimizing for Xbox because its sales are probably marginal.
And yes, ever since Alex "PS5-is-weaker-than-RTX2060" from DigitalFoundry started posting there, the place became an Nvidia adoration site, with stupid stuff like Nvidia Russia employees going there just to shitpost on the competition.
Guys for the love of god, never ask Alex Battaglia about PC component upgrades.
Sure, halo products have good margins and they're great for dominating the charts, getting fanboys raging in their favor on forums, or giving Alex Battaglia ammunition for his shilling, but what AMD needs right now is mind/market share through higher value for money.
Unless you're suggesting applying PiSSeR on the server side, which would be the same as using any other upscaling tech, so it's an irrelevant proposition.
I struggled to acknowledge you were trying to suggest doing something that's been done for the better part of a decade, yes.

That's literally what I said, but apparently you're struggling.
Same as any other upscaling tech? Is Sony currently using upscaling tech on the server side?
What video compression technology has Sony invested in? The last time I heard of a compression tech from Sony was XAVC, from like 15 years ago. Since then, they've been using H.264 and H.265.

The reality is they've invested in upscaling and they've invested in compression technology. It's a perfect combination, which is what I said from the get-go, but again, you were not paying attention before you responded.
Well duh, any game that uses upscaling does so before compressing the video to send to the client side.
If you're running any PS5 game on cloud streaming and that game uses upscaling on the PS5 hardware, then it's already doing exactly what you suggested.
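That ordering can be sketched as a server-side pipeline (function names and resolutions here are hypothetical; the only point is that upscaling runs on the server, before the lossy encode, never on the client):

```python
def render_frame(scene, res=(1280, 720)):
    """Render at a low internal resolution (placeholder)."""
    return {"color": f"{res[0]}x{res[1]} frame of {scene}"}

def upscale(frame, target=(3840, 2160)):
    """PSSR/FSR/DLSS-style reconstruction to output resolution,
    using engine-side buffers. Runs on the server's GPU."""
    return {"color": f"{target[0]}x{target[1]} upscaled frame"}

def encode(frame, codec="h265"):
    """Lossy video encode for streaming -- strictly after upscaling."""
    return f"{codec} bitstream of {frame['color']}"

def stream_frame(scene):
    # render -> upscale -> encode; the client merely decodes and displays.
    return encode(upscale(render_frame(scene)))

print(stream_frame("some PS5 game"))
# -> h265 bitstream of 3840x2160 upscaled frame
```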
Sony is not using machine learning on their cloud servers today, and neither is Microsoft.
Oh, it's probably from Alex then. Makes sense.
Are you changing goalposts from "upscaling" to "AI-based upscaling"?
Even if you are, so what? "Using machine learning" for upscaling is indicative of what exactly? DLSS2.0 was terrible, and XeSS DP4a isn't that much better than FSR2 anyway. There have been instances of FSR2 working better than DLSS.
You seem to think PiSSeR is this magical, GPU-inside-the-power-adapter, power-of-the-cloud kind of tech that will revolutionize everything.
It's an upscaler that also uses DNN inference like a myriad of patents from AMD have suggested for years. And since it's using RDNA4's WMMA acceleration (and not a "custom block" like DigitalFoundry insists on), it's probably a fork of FSR4 (or maybe just FSR4 under a proprietary name).
It's still a very good upscaler that works very decently with 1080p -> 4K upscaling in games from what we've seen, but geez you make it sound like the second coming of Christ.
What cost savings in the cloud?

Again, you're not seeing that it's about cost savings.
Now you're making assumptions that you have no facts to back up. Sony beat AMD to release on this.
LOL Sony isn't going to mandate yet another target spec that runs PS5 quality and framerates. They have the PS5 SoC for that.

Sony can replicate what the PS5 is able to output natively with a cheaper GPU that has machine learning on the SoC. This allows them to build out server farms at a fraction of the cost. This is what allows for lower latency in region. Combined with their investments in compression tech, this also reduces their costs and increases performance.
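The cost claim reduces to simple arithmetic. With entirely made-up numbers (none of these figures are real; they only show the shape of the argument):

```python
def farm_cost(gpu_unit_cost, streams_per_gpu, target_streams=1000):
    """Hardware cost to serve a target number of concurrent streams."""
    gpus_needed = -(-target_streams // streams_per_gpu)  # ceiling division
    return gpus_needed * gpu_unit_cost

# Hypothetical unit costs, assuming both setups deliver the same output quality:
native_4k = farm_cost(gpu_unit_cost=500, streams_per_gpu=2)    # renders native 4K
ml_upscaled = farm_cost(gpu_unit_cost=250, streams_per_gpu=2)  # renders 1080p + ML upscale

print(native_4k, ml_upscaled)  # the upscaling farm costs half as much here
```

Whether the real ratio is anywhere near 2x is exactly what's in dispute in this thread.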