• ColeSloth@discuss.tchncs.de · 2 days ago

    Wiis and PS3s weren’t crapping out, and the 360 failures weren’t due to ATI. This meme is dumb. Dumb in the bad meme kinda way.

    • heythatsprettygood@feddit.uk (OP) · 2 days ago

      Wii was mostly okay, but boards with the 90nm Hollywood GPU are somewhat more likely to fail than the later boards with the 65nm Hollywood-A (RVL-CPU-40 boards and later), especially if you leave WiiConnect24 on, as it keeps the Starlet ARM chip inside active even in fan-off standby. Most 90nm consoles will be fine thanks to low operating temperatures, but some (especially as thermal paste ages and dust builds up) are more likely to die from bumpgate-related problems.
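      To put rough numbers on why fan-off standby matters, here’s a minimal steady-state sketch (every wattage and thermal resistance figure below is an assumption for illustration, not a measurement of real Wii hardware):

      ```python
      # Toy one-resistor thermal model: T_die = T_ambient + P * R_total.
      # All values are illustrative assumptions, not Wii measurements.

      def die_temp_c(ambient_c: float, power_w: float, r_total_c_per_w: float) -> float:
          """Steady-state die temperature for a given power and total thermal resistance."""
          return ambient_c + power_w * r_total_c_per_w

      # Active use: fan spinning, so thermal resistance to room air is low.
      print(die_temp_c(ambient_c=25.0, power_w=1.0, r_total_c_per_w=10.0))  # ~35 C

      # WiiConnect24 standby: same small Starlet load, fan off, so resistance is far higher.
      print(die_temp_c(ambient_c=25.0, power_w=1.0, r_total_c_per_w=45.0))  # ~70 C
      ```

      The point is not the exact numbers, but that the same tiny load sits at a much higher temperature with no airflow, around the clock.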

      PS3s did crap out with yellow lights of death, although not as spectacularly as 360 red rings (a lower proportion failed, since beefier cooling and a different design made the flaws less immediately obvious, but it was still a problem). NVIDIA made the same mistakes on the RSX as ATI did on the Xenos - underfill and bump choices that could not withstand the thermal cycles, which should have been caught in validation (NVIDIA and bumpgate is a whole wild story in itself, considering it plagued their desktop and mobile chips too). The Cell CPU in there is very reliable though, even though it drew more power and consequently output more heat - it was just the GPU that could not take the heat.

      360s mostly red ringed due to faulty GPUs - see the previous comments about the PS3 RSX. ATI had a responsibility to choose the right materials, design, and packaging partner before shipping to Microsoft for final assembly, so they must take some of the blame (like NVIDIA, they also had trouble with their other products at this time, leading to high failure rates in devices like the early MacBook Pros). However, whether they are fully to blame is unclear, as it has never been made public who made the call on the final package design.

      • DacoTaco@lemmy.world · 2 days ago

        Ok so, let me set this all straight.
        The Wii issue had nothing to do with the GPU itself, but with the die package that Nintendo designed and kept secret.
        Inside that package is both the GPU (Hollywood) and an ARM core called Starlet. It runs the security software, and that’s where (rarely, but it happened) things went wrong, as it was always running code, even in standby. This had nothing to do with ATI.

        And the PS3 was not what you said. The PS3’s problem was that the IHS wasn’t making good enough contact with the core, so the heat of the CPU didn’t transfer well into the cooler. You can fix this, but it’s very tricky and it’s easy to permanently damage the PS3 in doing so (you have to cut the silicone adhesive under the IHS without touching the die or the PCB, remove it, and reattach the IHS with less glue). This could be attributed to the manufacturer, I suppose.

        • heythatsprettygood@feddit.uk (OP) · 2 days ago

          Your description of the Starlet is more accurate, yes. However, its heat output contributed to the issues with the ATI-designed parts of the Hollywood: it exacerbated the thermal problems the 90nm variants had, problems a better-designed chip would have handled fine.

          The PS3’s IHS was not the problem. Contact and heat transfer were decent - maybe not as good as they could have been (there’s thermal paste under the IHS instead of solder, which is why a delid and relid is fairly essential on a working 90nm PS3 as that paste ages), but not a big enough problem to make a properly designed chip cook itself at the PS3’s operating temperatures (a 75-80°C target on the RSX in an early variant at full load). The Cell B.E. next to the RSX uses more power (and so outputs more heat) and has a similar IHS setup, but IBM did not make the same design mistakes as NVIDIA, so we see very few reports of the CPU cooking itself even in those early PS3s.
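          For a rough sense of why aging paste under the IHS matters, here’s a toy series thermal stack (all resistance and power figures are assumptions for illustration, not RSX data):

          ```python
          # Toy stack: die -> TIM1 (under IHS) -> IHS -> TIM2 -> heatsink -> air.
          # Resistances in C/W; all values are illustrative guesses, not RSX data.

          def die_temp_c(ambient_c, power_w, r_tim1, r_ihs=0.10, r_tim2=0.20, r_sink=0.25):
              """Steady-state die temperature through a series thermal resistance stack."""
              return ambient_c + power_w * (r_tim1 + r_ihs + r_tim2 + r_sink)

          print(die_temp_c(35.0, 70.0, r_tim1=0.15))  # fresh paste under the IHS: ~84 C
          print(die_temp_c(35.0, 70.0, r_tim1=0.40))  # dried-out paste: ~101 C
          ```

          Same chip, same cooler, and the die lands 15-20°C hotter purely from one degraded paste layer - exactly the margin a marginal chip cannot afford.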

          • DacoTaco@lemmy.world · 2 days ago

            Ye no, I have a PS3 that YLOD’d which I reflowed back to life. After it was working again I started digging, and the temps the core was reporting weren’t even close to what I measured on the IHS with a thermocouple. Also, the thermal paste is on top of the IHS, not under it, and it wasn’t soldered in place. Early PS3s did cook themselves. Less than the 360 by a long shot, but they still did!
            Also, side note, it’s funny how some 360 RRODs were not due to the heat issue but could also be caused by a power supply failure or a faulty plug. That’s how I got and fixed my 360 🤣

            • heythatsprettygood@feddit.uk (OP) · 2 days ago

              If it’s a reflow, your PS3 is running on borrowed time. A reflow heats the chip up enough that parts of it expand and the GPU temporarily works again (the cracks in the solder bumps attaching the silicon to the interposer line back up), but eventually you’ll be back to square one. The real fix is to replace the 90nm GPU with a later 65 or 45nm variant that has the fixed design (search up “PS3 Frankenstein mod” for more). There is thermal paste both under and above the IHS - the layer underneath takes heat from the silicon up to the IHS, then the top layer takes it to the heatsink. Here’s an image of a delidded RSX and Cell to show (apologies for the quality, it was the best one I could easily find).

              PS3s did cook themselves, but not to the extent of the 360.

              It is funny to think how many misdiagnosed 360s with bad power supplies are probably out there that got subjected to the bolt mod (shudder) or something similar. It doesn’t help that the three red lights just mean “general hardware fault” unless you do the button combination to get further information. I guess that’s at least more helpful than the PS3, whose diagnostics only became available recently after a key was cracked.
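              For reference, that button combination reveals a four-digit secondary error code (hold sync, then tap eject four times, reading each quadrant of lights as a digit). A tiny lookup sketch with a few of the commonly cited codes from repair wikis (treat these as community folklore, not official Microsoft documentation):

              ```python
              # A few 360 secondary error codes as documented by the repair community.
              # Unofficial - pulled from repair wikis, not from Microsoft.
              KNOWN_CODES = {
                  "0001": "power supply problem",
                  "0011": "CPU overheating (or a related thermal fault)",
                  "0012": "GPU overheating",
                  "0102": "unknown fault, classically cracked GPU/RAM solder joints",
              }

              def diagnose(code: str) -> str:
                  """Map a secondary error code to the community's usual diagnosis."""
                  return KNOWN_CODES.get(code, "not in this table - check a repair wiki")

              print(diagnose("0001"))  # the misdiagnosed-PSU case mentioned above
              print(diagnose("0102"))  # the classic bumpgate-era failure
              ```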

              • DacoTaco@lemmy.world · edited · 2 days ago

                I know what I did to the PS3, and I know what needs to be done. I barely use the PS3, so it’s fine in my case. The real fix can also be to redo the IHS so the solder doesn’t crack or get damaged again from the heat. Also, the PS3 got a lot easier to debug with syscon access, which was also only (relatively) recently discovered. Sure made debugging the PS3 easier haha. I’ve left the connection available on my PS3 for future use hehe

      • ColeSloth@discuss.tchncs.de · 2 days ago

        I dunno, man. I never knew anyone who had a yellow light PS3, and the only ones I read about were from people who had kept them in enclosed cabinets. I also watched a very in-depth two-hour documentary on the 360 RROD, and it wasn’t due to ATI.

  • edinbruh@feddit.it · 2 days ago

    The PS3 doesn’t have an ATI GPU. TL;DR: it’s NVIDIA.

    The PS3 has a weird, one-of-a-kind IBM processor called Cell. You can think of it as a hybrid design that is both a CPU and a GPU (not “a chip with both inside” but “a chip that is both”), meant for multimedia and entertainment applications (like a game console). It was so peculiar that developers took a long time to learn how to use it effectively. Microsoft didn’t want to risk it, so they went with a different CPU, also from IBM, that shared some of the Cell’s design but without the special GPU-ish parts, and paired it with an ATI GPU.

    Now, Sony wanted to get away with only the Cell and use it as both CPU and GPU, but various tests showed that, despite everything, it wasn’t powerful enough to keep up with the graphics they expected. So they reached out to NVIDIA (not ATI) for an additional GPU, and NVIDIA designed a modified version of the 7800 GTX to work together with the Cell. To fully utilise the PS3’s graphics hardware, you have to do the bulk of the graphics work on the GPU and assist it with the Cell’s special hardware. Which is harder.
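    To make that “GPU does the graphics, Cell assists” split concrete, here’s a purely conceptual sketch in plain Python - threads standing in for SPU jobs and a print standing in for the RSX command stream, none of it real Cell SDK code:

    ```python
    # Conceptual illustration of the PS3 split - NOT real Cell/RSX code.
    # SPU-style jobs do parallel prep work (culling here); the GPU then
    # consumes the prepared draw list and does the heavy rasterisation.
    from concurrent.futures import ThreadPoolExecutor

    def spu_job(chunk):
        """Pretend SPE work: cull the invisible objects in one slice of the scene."""
        return [obj for obj in chunk if obj["visible"]]

    def rsx_draw(draw_list):
        """Pretend RSX work: consume the prepared draw list."""
        print(f"RSX draws {len(draw_list)} objects")

    scene = [{"id": i, "visible": i % 2 == 0} for i in range(1000)]
    chunks = [scene[i::6] for i in range(6)]  # one slice per SPE-like worker

    with ThreadPoolExecutor(max_workers=6) as spus:  # 6 SPEs available to games
        prepared = [obj for part in spus.map(spu_job, chunks) for obj in part]

    rsx_draw(prepared)
    ```

    Getting wins out of that assist stage is the hard part - the work has to be chopped into independent chunks small enough for each SPE’s local memory, which is a big reason studios took years to use it well.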

    • heythatsprettygood@feddit.uk (OP) · 2 days ago

      Ah, I should have made that clearer in the meme. Both NVIDIA and ATI messed up in this era, bad. Sony’s efforts with the Cell are always so fascinating - there’s so much potential in it (just look at the late PS3 era games), but they just could not get it to the point of supplanting the GPU.

  • Thorry84@feddit.nl · 3 days ago

    Are you referring to the red ring of death on the Xbox? Because that has absolutely nothing to do with ATI. They just made the chips; it’s Microsoft that put them on the board. Most of the issues were caused by a poor connection between the chip and the board, and there’s not a hell of a lot ATI could do about that.

    A lot of it was engineers underestimating the effect of running thermals between 80 and 95°C for very long stretches, with cool-down cycles in between. The thinking was that this was just fine and wouldn’t be a problem. It turned out it was, so they learnt from it and later generations didn’t really have the same issue.
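    There’s a classic rule of thumb for this, the Coffin-Manson relation: cycles to failure scale roughly with the inverse of the temperature swing raised to a material exponent. A rough sketch of why those repeated excursions hurt so much (the exponent and swing values are assumptions for illustration, not measured console data):

    ```python
    # Coffin-Manson style scaling: cycles to failure N ~ C / (dT ** n).
    # n ~= 2 is a common ballpark exponent for solder joints; the swings
    # below are illustrative assumptions, not measured console data.

    def life_ratio(dt_small: float, dt_large: float, n: float = 2.0) -> float:
        """How many times more cycles the smaller temperature swing survives."""
        return (dt_large / dt_small) ** n

    # A console swinging ~55 C per session (40 C idle to 95 C load)
    # versus a design that only swings ~30 C:
    print(life_ratio(30.0, 55.0))  # ~3.4x fewer cycles for the bigger swing
    ```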

    • heythatsprettygood@feddit.uk (OP) · 3 days ago

      As far as I am aware, the 360 GPUs had faulty solder connections between the chips and the interposer (due to a poor underfill choice by ATI that couldn’t withstand the temperatures), not between the interposer and the board. This is shown by the fact that a lot of red ring 360s report eDRAM errors - the GPU can’t communicate with a module on the same interposer, which rules out poor board connections. Microsoft even admitted this in a documentary they made (link), where they said it wasn’t the board balls, it was the GPU-to-interposer balls. A similar underfill choice is also why early Wiis have slightly higher failure rates, although nowhere near as bad as the 360 because of the low power of the GPU in there.
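      The underfill detail matters because underfill only braces the bumps while it stays below its glass transition temperature (Tg), and community teardowns have long argued the 360’s material went soft right around peak GPU temperature. A toy version of that check (the Tg and temperatures are assumptions for illustration, not actual Xenos materials data):

      ```python
      # Underfill stops bracing the solder bumps once the die exceeds the
      # material's glass transition temperature (Tg). Values are illustrative
      # assumptions, not the real Xenos materials data.
      UNDERFILL_TG_C = 80.0

      def bumps_braced(die_temp_c: float) -> bool:
          """True while the underfill is still rigid enough to share bump strain."""
          return die_temp_c < UNDERFILL_TG_C

      for temp_c in (65.0, 78.0, 92.0):  # idle, moderate load, worst-case load
          state = "braced" if bumps_braced(temp_c) else "soft - bumps take the strain"
          print(f"{temp_c:.0f} C: underfill {state}")
      ```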

        • heythatsprettygood@feddit.uk (OP) · edited · 3 days ago

          It’s hard to say for certain whose final call the underfill was (it’s a toss-up between ATI’s design engineers and the packaging partner they chose to get the TSMC-fabbed chip into a final product), but at the end of the day it was ATI’s responsibility to validate the chip and ensure its reliability before shipping it off to Microsoft.

      • jyl@sopuli.xyz · 3 days ago

        I don’t know how much of it was ATI’s fault versus the fab’s, but my understanding is that no one had experience handling that amount of heat.

        • heythatsprettygood@feddit.uk (OP) · 2 days ago

          Agreed, thermals were increasing faster than most manufacturers could handle. The only real exceptions in this era I can think of were IBM (because they had to - the PowerPC G5 was such a power hog it pissed Apple off enough to switch architectures) and Intel (because they also had to - the Pentium 4 was a disaster).

    • heythatsprettygood@feddit.uk (OP) · 3 days ago

      Oh, NVIDIA have always been a shitstorm. From making defective PS3 GPUs (the subject of this meme) to the constant hell that is their Linux drivers to melting power connectors, I am astounded anyone trusts them to do anything.

  • ben@lemmy.zip · 3 days ago

    It’s more that team red has been making their cards competitive on price and performance lately.

    • heythatsprettygood@feddit.uk (OP) · 3 days ago

      AMD have been amazing lately. The 9070 XT makes buying most other cards in that price range pointless, especially with NVIDIA’s melting connectors being genuine hazards. ATI (who were bought out by AMD and dissolved as a brand in 2010) and NVIDIA in the mid-to-late 2000s, however, were dumpster fires in their own ways.

      • ben@lemmy.zip · 3 days ago

        Depends if you can actually find a 9070 XT at the price they advertised. Once that happens I’ll be convinced; right now it very much feels like a bit of a bait and switch. Holding out hope though.

        • heythatsprettygood@feddit.uk (OP) · 3 days ago

          Yeah, pricing is not the greatest at the moment, most likely because there’s no reference card to keep other prices in check. Still, at least here in the UK, they are well below the stratospheric NVIDIA prices for a 5070 Ti and are easily available.