• Pantherina@feddit.de · 6 months ago

    At least they had screws? I don't trust HDMI or, even worse, USB-C. Still using VGA monitors with adapters; never broke a single plug.

    • mihnt@lemy.lol · 6 months ago

      Why are you using VGA when DVI-D exists? Or DisplayPort, for that matter.

      • renzev@lemmy.world · 6 months ago

        All those new video standards are pointless. VGA supports 1080p at 60Hz just fine, and anything more than that is unnecessary. Plus, VGA is easier to implement than HDMI or DisplayPort, keeping prices down. Not to mention the connector is more durable (well, maybe DVI is comparable in terms of durability).
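
        Quick back-of-the-envelope on the bandwidth claim (assuming the common CEA-861 1080p60 timing, 2200×1125 total raster including blanking, rather than anything VGA-specific):

        ```python
        # Rough sanity check, not a spec quote: pixel clock needed for 1080p60
        # using the common CEA-861 timing (total raster 2200 x 1125 incl. blanking).
        h_total = 2200   # 1920 active + 280 blanking pixels per line
        v_total = 1125   # 1080 active + 45 blanking lines per frame
        refresh = 60     # Hz

        pixel_clock_mhz = h_total * v_total * refresh / 1e6
        print(f"{pixel_clock_mhz:.1f} MHz")  # -> 148.5 MHz
        # Analog VGA RAMDACs are commonly rated around 300-400 MHz,
        # so 148.5 MHz leaves plenty of headroom.
        ```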

        • mihnt@lemy.lol · 6 months ago

          I think you're talking about some very different use cases than most people have.

          • renzev@lemmy.world · 6 months ago

            Really, what “normal people” use cases are there for a resolution higher than 1080p? It’s perfectly fine for writing code, editing documents, watching movies, etc. If you are able to discern the pixels, it just means you’re sitting too close to your monitor and hurting your eyes. Any higher than 1080p and, at best, you don’t notice any real difference; at worst, you have to use hacks like UI scaling or non-native resolution to get UI elements to display at a reasonable size.
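
            As a rough illustration (rule-of-thumb figures: a 24-inch 16:9 panel is assumed, and ~1 arcminute is a common estimate of visual acuity, not a hard limit), here's the distance past which individual pixels stop being resolvable:

            ```python
            import math

            # Rule-of-thumb check: distance at which one pixel on a 24" 1080p panel
            # subtends less than ~1 arcminute (a common estimate of visual acuity).
            diag_in = 24.0            # assumed monitor size
            h_px, v_px = 1920, 1080

            diag_px = math.hypot(h_px, v_px)      # diagonal length in pixels
            pixel_mm = diag_in * 25.4 / diag_px   # physical pixel pitch in mm
            one_arcmin = math.radians(1 / 60)     # acuity limit as an angle

            distance_cm = pixel_mm / math.tan(one_arcmin) / 10
            print(f"pitch {pixel_mm:.3f} mm -> pixels blend beyond ~{distance_cm:.0f} cm")
            # ~0.277 mm pitch -> roughly 95 cm; sit farther than that and the
            # extra pixels of higher resolutions buy you nothing.
            ```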

            • Pantherina@feddit.de · 6 months ago

              It's unneeded perfectionism that you get used to. And it's expensive and makes big tech rich. Know where to stop.

          • Pantherina@feddit.de · 6 months ago

            Why should I? It's Full HD and working well, so there's no reason to. New displays are 100€+, which is freaking expensive for that improvement.

    • dejected_warp_core@lemmy.world · 6 months ago

      I’m still waiting for the other shoe to drop on USB-C/Thunderbolt. Don’t get me wrong - I think it’s a massive improvement for standardization and peripheral capability everywhere. But I have a hard-used Thinkpad that’s on and off the charging cable all day, constantly getting tugged in every possible direction. I’m afraid the physical port itself is going to give up long before the rest of the machine does. I’m probably going to need Louis Rossmann-level skills to re-solder it when the time comes.

      Edit: I’m also wondering whether it’s really a coincidence that the sudden fragility of peripheral connections (e.g. headphones, classic iPod, USB mini/micro) lines up with the emergence of the RoHS standard (lead-free solder).

      • Pantherina@feddit.de · 6 months ago

        On my Thinkpad both ports were soldered straight to the mobo, instead of sitting on a separate USB daughterboard. Really annoying; on my T430 the port is a separate piece connected with a cable and can be easily replaced.

        But no, USB-C is pretty tough for me when done right. It's still too small for no reason in laptops, though.