The Nintendo M/O format was based on 3M tech, not the Zip drive. And Zip drives had worse issues than most people know: the Click of Death was a fatal failure mode, and the worst part was that a disk that had been in an affected drive could pass the failure on to whatever drives it was put in afterward.
The Famicom Disk System's media failures mostly came down to the disks not having any kind of shutter over the media inside the casing, thanks to Nintendo's cheapness. As with any floppy drive, put a dirty disk in and the heads get dirty too, and then they wreck more disks in turn.
Oh okay, it was 3M, not Iomega. Thanks for the clarification. I think Iomega Zip drives were more reliable than others when it came to avoiding the CoD, but some alternatives that came to market in the early/mid '90s suffered from it a lot.
I'm mainly going by documentaries, though; I didn't have a home PC in the '90s b/c my gaming was on consoles & arcades, but we did use (regular) floppies in school for the library & computer room stuff. AFAIK a lot of Zip-style disk formats evolved from floppy disk tech, though some were based on HDD designs instead.
And yeah, the difference in efficiency between the Xbox Windows build and the PS BSD stack is amazing. It really shows the cost of legacy code and the strength of tailoring code to specialized devices. Then again, even for general-purpose computing, Windows is a massive waste of resources compared to any flavour of UNIX system.
If push comes to shove I'm gonna be moving my OS setup to some type of Linux in the future, if it means squeezing out more performance. I could probably run debloaters for W10/W11 but some of them cause stability issues.
Sometimes I think MS pushing ads on W11 is partly a way of breaking debloater scripts and anything else that modifies the registry or system utility files. Especially if those ads require an online connection, and an online connection later becomes required to install programs or change certain system settings (so the changes can be "verified" on MS's end).
FYI, there have been more cases where the PS5 shows better ray tracing than worse. When there are differences, the PS5 pulls ahead in RT, FPS, and alpha effects, while the Xbox pulls ahead in resolution.
Good point, that's been happening a decent bit with multiplats across both systems. The PS5 pulling ahead in RT is surprising to me, but perhaps not so much if the game engine's workload isn't enough to saturate the Series X's GPU but does saturate the PS5's. Because even with RT, the Series X's advantage only shows up if more than 44 CUs are saturated (assuming 4 TMUs per CU, which I think is the case with RDNA2 GPUs).
And saturating just 44 CUs only puts the Series X level with the PS5 and its 36 CUs, thanks to the PS5's higher GPU clock.
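To sanity-check that ~44-CU figure, here's a quick back-of-the-envelope calc using CUs × clock as a crude proxy for shader/RT throughput. The clock speeds and the Series X's 52-CU count aren't from this thread; they're the publicly quoted specs, so treat this as a sketch:

```python
# Crude parity check: CUs x clock as a proxy for shader/RT throughput.
# (A big simplification -- real performance also depends on memory,
# caches, and how well the engine feeds the GPU.)

PS5_CUS, PS5_CLOCK_GHZ = 36, 2.23    # PS5: 36 CUs at up to 2.23 GHz
XSX_CUS, XSX_CLOCK_GHZ = 52, 1.825   # Series X: 52 CUs at 1.825 GHz

# How many Series X CUs does it take to match the PS5's 36?
parity_cus = PS5_CUS * PS5_CLOCK_GHZ / XSX_CLOCK_GHZ
print(f"Series X CUs needed to match PS5: {parity_cus:.1f}")  # ~44.0
```

So under this (very simplified) proxy, the Series X only pulls ahead once a workload keeps more than about 44 of its 52 CUs busy.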
Another thing I thought of the other day: it's likely some 3rd-party developers aren't using the full theoretical 13.5GB of RAM available to them on the PS5, but are narrowing it down to 10GB because of the Series X. I doubt developers are bothering with the slower pool of RAM on the Series X, especially because it shares the same stupid bus.
I wonder if any Xbox devs are swapping data from the 6 GB pool into the 10 GB pool as needed for GPU-bound tasks. It will eat into effective bandwidth for sure, but that has to be better than going all the way to storage for that data?
Unless the API tools for doing it are so bad that they make storing the extra graphics data in the 6 GB pool untenable.
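As a rough illustration of the bandwidth hit from mixing pools: the figures below (560 GB/s for the 10 GB pool, 336 GB/s for the 6 GB pool, 448 GB/s flat on PS5) are the publicly quoted specs, not from this thread, and the serialized-access model is a deliberate simplification:

```python
# Simplified model of Series X effective memory bandwidth when some
# fraction of GPU traffic hits the slower 6 GB pool. The two pools
# share one bus, so this assumes accesses serialize at each pool's
# rate; real behavior depends on access patterns and CPU contention,
# so treat this as a sketch.

FAST_BW = 560.0  # GB/s, 10 GB "GPU-optimal" pool
SLOW_BW = 336.0  # GB/s, 6 GB pool

def effective_bw(slow_fraction: float) -> float:
    """Harmonic-mean bandwidth for a given share of slow-pool traffic."""
    return 1.0 / ((1.0 - slow_fraction) / FAST_BW + slow_fraction / SLOW_BW)

for f in (0.0, 0.1, 0.25, 0.5):
    print(f"{f:.0%} slow-pool traffic -> {effective_bw(f):.0f} GB/s")
```

Under this model, once roughly 37% of GPU traffic lands in the slow pool, effective bandwidth dips below the PS5's flat 448 GB/s, which would fit with devs just keeping all GPU data in the 10 GB pool.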