Xbox 360/PS3/(to a lesser extent) Wii owners represent
AMD have been amazing lately. The 9070 XT makes buying most other cards in that price range pointless, especially with NVIDIA's melting connectors being genuine hazards. ATI (bought out by AMD in 2006, with the brand retired in 2010) and NVIDIA in the mid to late 2000s, however, were dumpster fires in their own ways.
Oh, NVIDIA have always been a shitstorm. From making defective PS3 GPUs (the subject of this meme) to the constant hell that is their Linux drivers to melting power connectors, I am astounded anyone trusts them to do anything.
It's hard to say for certain whose final call it was to do this underfill (it's a tossup between ATI's design engineers and the packaging partner they chose to work with to get the TSMC chip into a final product), but at the end of the day it was ATI's responsibility to validate the chip and ensure its reliability before shipping it off to Microsoft.
As far as I am aware, the 360 GPUs had faulty solder connections between the chips and the interposer, not between the interposer and the board, due to ATI choosing an underfill that couldn't withstand the temperatures. This is shown by the fact that a lot of red ring 360s show eDRAM errors (i.e. the GPU can't communicate with the module on the same interposer, ruling out poor board connections). Microsoft even admitted this in a documentary they made (link), where they said it wasn't the board balls, it was the GPU to interposer balls. A similar underfill choice is also why there are slightly higher failure rates in early Wiis, although nowhere near as bad as the 360 due to the low power of the GPU on there.
Holy shit, someone who does it as well! Torx bits are so useful for this, I have a fairly high success rate even on the tiny terrible electronics screws I usually work on.
This is why you make your own memes. Fresh from the farm- I mean image editor, and with far fewer compression artifacts.
Isn't the thing with this that the Switch 1 compatibility layer isn't in the factory firmware, since it was a later developed piece of software? That's probably why it's asking for an update, as these units were made months ago (firmware 19.0.0 according to that leak out of Russia, and we're on 20.1.0 on Switch 1 at the moment). I remember on the box they showed on the Nintendo Today app, there was something in the fine print about needing a system update both to use Switch 1 games and to use microSD Express cards. Very few if any people have a Switch 2 with a Switch 2 game cartridge at the moment, so they're going to run into this when they run a game. The only way to rule this out is to put a Switch 2 cartridge into one of these systems on the factory firmware, but as far as I know, even now, putting the cartridge in will run the game.
Edit: Can confirm you do need the Day 1 update to play Switch 1 games, courtesy of one of the crunchiest images on the internet (it was the best one I could find of the box).
Torx needs to become the standard for screws. It's just better in every way.
Are you using tone mapping through the Steam UI (I think the Deck has its own controls for HDR inverse tone mapping) or through the command line options you can use for games? If you are using the UI, it might be worth using the command line toggles instead, as maybe the UI is setting some wrong settings. If it helps, here is the set of command line options I use on my system (modify brightness, refresh rate, and resolution to fit your display):
DXVK_HDR=1 ENABLE_HDR_WSI=1 gamescope -f -r 165 -W 3440 -H 1440 --adaptive-sync --hdr-enabled --hdr-itm-enable --hdr-itm-sdr-nits 350 --hdr-sdr-content-nits 800 --hdr-itm-target-nits 1000 gamemoderun -- %command%
In addition, it might be worth looking through the display settings to see if it's in any sort of colour boosting HDR mode - my Alienware had to be set to "HDR Peak 1000" for colours to look as they should, as by default it messes around with things a bit. If you can as well, try some other devices that can output HDR (like a game console or Blu-ray player or something) to see if the display makes those outputs look a bit red too - if so, it's to do with the display, and if not it's a configuration issue.
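One more thing that might help narrow it down (just a thought, no guarantees it applies to your setup): with the display in HDR mode, running vulkaninfo | grep -i st2084 (vulkaninfo comes from the vulkan-tools package) should list VK_COLOR_SPACE_HDR10_ST2084_EXT among the surface formats if the driver side is actually exposing HDR. If it shows up, the red tint is more likely down to the gamescope flags or the monitor's own processing rather than the driver.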
I haven't experienced issues with oranges on my setup (AW3423DWF, 7900 XTX). Perhaps it is to do with your hardware?
In a nutshell, it essentially increases the range of brightness values (luminance/gamma, to be specific) that can be sent to a display. This allows content both to be brighter and to display colours more accurately, as there are far more brightness levels that can be depicted. This means content can look more lifelike, or have more "pop" by having certain elements be brighter than others. There's more too, and it's up to the game/movie/device as to what it should do with all this extra information it can send to the display. This is especially noticeable on an OLED or QD-OLED display, since they can individually dim or brighten every pixel. Nits in this context refers to the brightness of the display - 1000 nits is far brighter than most conventional displays (which are usually in the 300-500 range).
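If you're curious what the actual numbers look like, here's a rough sketch I put together from the public SMPTE ST 2084 constants (so an illustration of the HDR10 PQ curve, not anything pulled from a specific game or driver) showing how 10-bit code values map to nits:

```python
# Sketch of the HDR10 PQ (SMPTE ST 2084) curve: how 10-bit code
# values map to absolute brightness in nits (cd/m^2).
# Constants below are straight from the ST 2084 spec.
m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_to_nits(code, bits=10):
    """Convert an n-bit PQ code value to luminance in nits."""
    v = code / (2 ** bits - 1)  # normalise to 0..1
    p = v ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for code in (0, 512, 769, 1023):
    print(f"code {code:4d} -> {pq_to_nits(code):7.1f} nits")
```

Run that and the midpoint code (512) comes out at roughly 90 nits (about SDR reference white), 769 is already about 1000 nits, and the very top of the range is 10,000 - so the extra bits aren't spent making everything uniformly brighter, they buy fine steps in the dark end plus a huge amount of headroom at the top.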
What sort of system are you on, and what have you been trying? The best setup is with an AMD GPU and a more up-to-date distro (Fedora, Arch, and so on). I can give some help if you need.
Oh boy, I should have caught that. Ironic, considering saying things like "ATM machine" is a pet peeve of mine.
In a small room where it's the only light source, it's still a crazy amount of light. My eyes genuinely had to get used to the brightness for a couple minutes after I set it up for the first time, and the walls sometimes looked like the ceiling light was on.
If you ever get the opportunity, try out HDR ITM tone mapping (essentially an HDR upconversion thing you can do with Gamescope on Linux) playing Persona 3 Reload on a QD-OLED monitor (for that extra brightness) in a dark room. Even though it's not even a native HDR game, with ITM it looks so good, especially because it's a game with a lot of dark graphics mixed in with super bright ones. The text pops, and combat is next-level.
If you're interested, RIP Felix did a pretty good video on YouTube about the YLOD failures. TL;DW: most early models have defective GPUs, plus the ones that survived now have ageing capacitors, but there are tools now to find the exact cause from the system controller (SYSCON). A GPU swap is pretty involved though, so it needs a skilled technician to pull it off. Still, if you have one of those early backwards compatible models, having it repaired isn't that bad of an idea nowadays, since those consoles are only getting rarer.
The PS3 is in competition with the original Xbox One for being the most undercooked and overpriced launch of a game console. I guess it's what made the recovery even more astounding.
I guess they had to remove backwards compatibility at some point, considering the solution was to shove an entire PS2 CPU and GPU onto the motherboard, massively driving up the already stratospheric production cost - they were losing money on every unit even at the high launch prices. Still, it is unfortunate it went away, as it ensured that PS2 games would play perfectly and as intended on PS3, with a proper HDMI output too. Plus, since it went away early, all the models with backwards compatibility have the defective GPUs that can cause a yellow light of death. A PS3 Slim with PS2 compatibility would have been amazing. I agree so much with XMB being peak UI design as well; almost every console since has, in my opinion, a worse UI, other than maybe the Switch, but that's because the Switch barely has much beyond a game selector.
Yeah, pricing is not the greatest at the moment, most likely because there's no reference card to keep other prices in check. Still, at least here in the UK, they are well below the stratospheric NVIDIA prices for a 5070 Ti and are easily available.