• aeronmelon@lemmy.world · 6 months ago

      1:1 copies of the bits on the disc are a valid option that some people prefer, especially if you want to make your own physical disc or make compressed files encoded in a very specific way. It’s also the most reliable way to archive a disc for long-term storage.

    • kratoz29@lemm.ee · 6 months ago

      What the hell, how so?

      Now that I think about it, not much software comes in .rar nowadays.

      • Björn Tantau@swg-empire.de · 6 months ago

        Because it’s a garbage proprietary format that needs extra software on every OS. But for some inane reason it’s become the standard for piracy stuff. I think that’s the only reason it’s still alive.

        • AwkwardLookMonkeyPuppet@lemmy.world · 6 months ago

          It’s not garbage. It’s used in the pirate community and elsewhere because back in the day things were shared on Usenet before they were shared anywhere else. Usenet has a file-size limit, so we needed to be able to break compressed files into multiple parts and have an easy way to put them back together when decompressing. WinZip did not have that functionality. You can thank WinRAR for powering the entire sharing scene for decades.

          When torrents were becoming popular, NO distributors shared on torrents. They shared on Usenet. Then someone would take a Usenet share and post it to the torrent network. Torrents wouldn’t have had much success, or would have taken much longer to catch on, if it weren’t for WinRAR and Usenet.
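          In case anyone’s curious, the multipart trick itself is simple. Here’s a minimal Python sketch of the split-and-rejoin idea (the `.partNNN` naming and part size are made up for illustration; this is the bare concept, not WinRAR’s actual container format):

          ```python
          def split_file(path, part_size=50_000_000):
              """Split a file into numbered pieces (hypothetical .partNNN naming)."""
              parts = []
              with open(path, "rb") as src:
                  index = 0
                  while chunk := src.read(part_size):
                      part_name = f"{path}.part{index:03d}"
                      with open(part_name, "wb") as dst:
                          dst.write(chunk)
                      parts.append(part_name)
                      index += 1
              return parts

          def join_parts(parts, out_path):
              """Rebuild the original by concatenating the pieces in order."""
              with open(out_path, "wb") as dst:
                  for part in sorted(parts):
                      with open(part, "rb") as src:
                          dst.write(src.read())
          ```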

  • 30p87@feddit.de · 6 months ago

    Well … .rar is annoying, .zip is more annoying to deal with on Linux, .tar.gz is Linux-only, and .7z is … something.

  • aard@kyu.de · 5 months ago

    Nowadays it matters whether you use a compression algorithm that can utilize multiple cores for packing/unpacking larger data. For a multi-gigabyte archive that can be the difference between “I’ll grab a coffee until this is ready” and “I’ll go for lunch and hope it is done when I come back”.
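    To make that concrete, here’s a rough Python sketch of the trick parallel compressors rely on: concatenated gzip members form one valid .gz stream, so independent chunks can be compressed on separate cores. The chunk size and file names are purely illustrative, and real tools like pigz are far more refined than this:

    ```python
    import gzip
    from concurrent.futures import ProcessPoolExecutor

    CHUNK_SIZE = 32 * 1024 * 1024  # 32 MiB per chunk; purely illustrative

    def compress_chunk(chunk: bytes) -> bytes:
        # Each chunk becomes an independent, complete gzip member.
        return gzip.compress(chunk)

    def parallel_gzip(in_path: str, out_path: str) -> None:
        with open(in_path, "rb") as src, open(out_path, "wb") as dst:
            chunks = iter(lambda: src.read(CHUNK_SIZE), b"")
            with ProcessPoolExecutor() as pool:
                # Note: map() consumes the iterator eagerly, so this sketch
                # holds the chunks in memory; fine for a demo, not for 70 GB.
                for member in pool.map(compress_chunk, chunks):
                    dst.write(member)  # concatenated members are valid gzip

    if __name__ == "__main__":
        parallel_gzip("big.tar", "big.tar.gz")  # gunzip/zcat can read the result
    ```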

      • aard@kyu.de · 5 months ago

        I personally prefer bzip2 - but it needs to be packed with pbzip2, not the regular bzip2, to generate archives that can be extracted on multiple cores. Not a good option if you have to think about Windows users, though.
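        As I understand it, that works because pbzip2 writes many independent bzip2 streams back to back, and bzip2 decompressors treat the concatenation as one file. A quick stdlib check of that property (the data here is just filler):

        ```python
        import bz2

        block_a = b"first block " * 1000
        block_b = b"second block " * 1000

        # Two separate, complete bzip2 streams, simply concatenated.
        combined = bz2.compress(block_a) + bz2.compress(block_b)

        # The decompressor walks through both streams transparently, which is
        # why multi-stream pbzip2 output still opens in ordinary bzip2 tools.
        assert bz2.decompress(combined) == block_a + block_b
        print("multi-stream round trip OK")
        ```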

    • dustyData@lemmy.world · 6 months ago

      Because gzip and bz2 exist. 7z is almost always a plugin, addon, or extra application, while the first two work out of the box pretty much everywhere. It also depends on frequency of access, frequency of addition, size, type of data, etc. If you have an archive you frequently add new files to, 7z is gonna start grating on you with the compression times. But it’s OK if you extract very frequently from an archive that will never change. Overall, gz and bz2 are the “good enough for every use case” formats.
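      “Work out of the box” extends to scripting, too: Python, for instance, ships gzip and bz2 (and xz, via lzma) in its standard library, while .7z needs a third-party package such as py7zr. A small comparison sketch with an arbitrary payload:

      ```python
      import bz2
      import gzip
      import lzma  # xz support, also in the standard library

      payload = b"some repetitive example data " * 50_000

      for name, compress in (("gzip", gzip.compress),
                             ("bz2", bz2.compress),
                             ("xz", lzma.compress)):
          print(f"{name}: {len(compress(payload)):,} bytes")
      # Nothing in the stdlib reads .7z; that's the "extra application" cost.
      ```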

    • Fonzie!@ttrpg.network · edited · 5 months ago

      For archiving/backing up *NIX files, tar.whatever still wins, as it preserves permissions while 7z, zip, and rar don’t.

      Oh, and while 7z is FOSS and supported out of the box on most Linux desktop OSes and on macOS, Windows users will complain they need to install stuff to open your archive. Somehow, tar.gz is supported out of the box on Linux, macOS, and yes, Windows 10 and 11!
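      The permissions point is easy to verify with Python’s tarfile module, which round-trips the POSIX mode bits (zipfile only stashes them in an external-attributes field that many extractors, including Python’s own, ignore on extract). A quick check, with made-up file names:

      ```python
      import os
      import tarfile

      os.makedirs("demo", exist_ok=True)
      with open("demo/script.sh", "w") as f:
          f.write("#!/bin/sh\necho hello\n")
      os.chmod("demo/script.sh", 0o755)  # the executable bit we care about

      with tarfile.open("demo.tar.gz", "w:gz") as tar:
          tar.add("demo/script.sh")

      with tarfile.open("demo.tar.gz") as tar:
          mode = tar.getmember("demo/script.sh").mode
          print(oct(mode))  # 0o755 -- permissions survive the round trip
      ```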

  • db2@lemmy.world · 6 months ago

    For a few-hundred-kilobyte file, sure, the difference is like pocket change. For a larger one you’d choose the right tool for the job, though, especially for things like a split archive or a database.

    • Im_old@lemmy.world · 6 months ago

      Username checks out! Also, you’re absolutely right. Just last month I was looking for the best compression algorithm/package to archive a 70 GB DB.