• chimasterflex@lemmy.world
    · 8 months ago

    Finally, something I can weigh in on. I’ve worked in memory testing for years, and I’ll tell you that it’s actually pretty expected for a memory cell to fail after some time. So much so that we typically build redundancy into the memory array: we add more cells than we ever activate at any given time. When shit goes awry, we can reprogram the memory controller to remap the addresses so that the bad cells are mapped out and spare ones are mapped in. We don’t typically probe memory cells unless we’re doing some kind of in-depth failure analysis; usually we just run a series of test algorithms that exercise each cell, identify which ones aren’t responding correctly, and map those out.
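
    For anyone curious what that test-and-remap idea looks like in principle, here’s a toy sketch in C. It’s purely illustrative and entirely my own simplification (real controllers do this in firmware or fuse-programmable hardware, and the stuck-cell fault model here is made up): run a march-style test over every cell, then point failing logical addresses at spare cells.

    ```c
    #include <stdio.h>
    #include <stdint.h>
    #include <stdbool.h>

    #define MAIN_CELLS  16   /* cells visible to the rest of the system */
    #define SPARE_CELLS 4    /* redundant cells held in reserve */

    static uint8_t cells[MAIN_CELLS + SPARE_CELLS]; /* simulated physical array     */
    static bool    dead [MAIN_CELLS + SPARE_CELLS]; /* made-up fault model: stuck   */
    static int     remap[MAIN_CELLS];               /* logical -> physical index    */

    /* Simplified march-style test: write several patterns, read each back. */
    static bool cell_ok(int phys) {
        const uint8_t patterns[] = {0x00, 0xFF, 0xAA, 0x55};
        for (size_t i = 0; i < sizeof patterns; i++) {
            cells[phys] = dead[phys] ? 0xDE : patterns[i]; /* a stuck cell ignores writes */
            if (cells[phys] != patterns[i])
                return false;
        }
        return true;
    }

    int main(void) {
        dead[3] = dead[9] = true; /* pretend two cells wore out over the years */

        int next_spare = MAIN_CELLS;
        for (int logical = 0; logical < MAIN_CELLS; logical++) {
            remap[logical] = logical;
            if (cell_ok(logical))
                continue;
            /* map the bad cell out, map the next good spare in */
            while (next_spare < MAIN_CELLS + SPARE_CELLS && !cell_ok(next_spare))
                next_spare++;
            if (next_spare < MAIN_CELLS + SPARE_CELLS)
                remap[logical] = next_spare++;
            else
                printf("cell %d is bad and no spares are left\n", logical);
        }

        for (int logical = 0; logical < MAIN_CELLS; logical++)
            if (remap[logical] != logical)
                printf("logical cell %2d -> spare cell %2d\n", logical, remap[logical]);
        return 0;
    }
    ```

    Every later read and write then goes through remap[], so the bad cells simply never get touched again.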

    None of this is to diminish the engineering challenges they faced, just to give some appreciation for the technical mechanisms we’ve improved over the last few decades.

    • trolololol@lemmy.world
      · edited · 8 months ago

      pretty expected for a memory cell to fail after some time

      50 years is plenty of time for the first memory chip to fail; most systems would face total failure from multiple defects in half that time, even WITH physical maintenance.

      Also remember it was built with tools from the 70s, which is probably an advantage, given everything else is still going.

      • orangeboats@lemmy.world
        · 8 months ago

        Also remember it was built with tools from the 70s, which is probably an advantage

        Definitely an advantage. Even setting planned obsolescence aside, older electronics are pretty tolerant of outside interference like radiation compared to modern ones, since their feature sizes are so much larger.