• blackstrat@lemmy.fwgx.uk · 1 year ago

    I hope whoever thought -l should mean “check links” instead of list has a special place in Hell set aside for them.

    I have no idea what “print a message if not all links are dumped” even means.

  • Zacryon@feddit.de · 1 year ago

    Ah yes, that’s the Linux community as I know it. There is one thing someone wants to achieve and dozens of ways to do it. ;)

      • anteaters@feddit.de · 1 year ago

        I avoid it and use zip or 7z if I can. But for some crazy reason some people still insist on using that garbage tool and I have no idea why.

        • duncesplayed@lemmy.one · 1 year ago

          Are zip and 7z really that much easier?

          tar caf foo.tar.xz wherever/
          zip -r foo.zip wherever/
          7z a foo.7z wherever/
          

          I get that tar needs an f for no-longer-relevant reasons whereas other tools don’t, but I never understood the meme about it beyond that. Is c for “create” really that much worse than a for “add”?
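A side note on that `a` flag: with plain `cf`, GNU tar writes an uncompressed archive no matter what the file name says; it’s `a` (`--auto-compress`) that picks the compressor from the extension. A quick sketch, assuming GNU tar and gzip and using made-up paths:

```shell
# Create a throwaway directory to pack (hypothetical paths).
mkdir -p demo/sub
echo "hello" > demo/sub/file.txt

# 'a' (--auto-compress) chooses the compressor from the extension,
# so this produces a gzip-compressed tarball.
tar caf demo.tar.gz demo/

# List the archive contents to confirm it round-trips.
tar tf demo.tar.gz
```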

        • aard@kyu.de · 1 year ago

          If you want to do more than just “pack this directory up just as it is” you’ll pretty quickly hit the limits of zip. tar is way more flexible about selecting partial contents and applying transformations on packing or extraction.
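For instance, GNU tar can skip paths and rewrite member names on the fly, which zip has no real equivalent for. A sketch, assuming GNU tar; the paths and version suffix are made up:

```shell
# Set up a directory with source files and build artifacts.
mkdir -p project/src project/build
echo 'int main(void){return 0;}' > project/src/main.c
echo "junk" > project/build/a.o

# Pack it while excluding the build directory and renaming
# the top-level directory inside the archive.
tar caf project.tar.gz \
    --exclude='*/build*' \
    --transform='s|^project|project-1.0|' \
    project/

# The archive now contains project-1.0/src/main.c but no build/ files.
tar tf project.tar.gz
```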

          • anteaters@feddit.de · 1 year ago

            100% of tarballs that I had to deal with were instances of “pack this directory up just as it is” because it is usually people distributing source code who insist on using tarballs.

  • sonnenzeit@feddit.de · 1 year ago

    I just use atool (archive tool) instead. It works the same for any common compression format (tar, gzip, zip, 7zip, rar, etc.) and comes with handy aliases like apack and aunpack, removing the need to memorize options.

  • miniu@programming.dev · 1 year ago

    Why, when explaining or giving examples of shell commands, do people so often use the shortened arguments? It makes it all seem like random letters you have to remember by heart. Instead of -x just write --extract. If they end up using the tool so often that they need to type it fast, they’ll look up the shortcuts.

      • sonnenzeit@feddit.de · 1 year ago

        Many do, as it’s considered good practice, but it’s not guaranteed; it depends on the individual command (program). Usually you can use the --help option to see all the options, so for instance tar --help.
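For what it’s worth, GNU tar does have long spellings for the short flags used in this thread. A sketch, assuming GNU tar and made-up file names:

```shell
mkdir -p notes
echo "draft" > notes/todo.txt

# Long-option equivalent of 'tar cvaf notes.tar.gz notes/':
tar --create --verbose --auto-compress --file=notes.tar.gz notes/

# Long-option equivalent of 'tar xf notes.tar.gz -C out':
mkdir -p out
tar --extract --file=notes.tar.gz --directory=out
```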

  • MangoPenguin@lemmy.blahaj.zone · 1 year ago

    So a serious question from someone who can’t remember console commands ever despite using them constantly.

    Why are so many Linux CLI commands set up with defaults that no one ever uses? Like, if you pretty much always need -f, -v is often used, and --auto-compress is needed to recognize the type by extension, why aren’t those the defaults when you just use tar?

    A lot of applications I find are like this too: they don’t come with defaults that work or that anyone would actually use.
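One common workaround is to bake the flags you always want into a tiny wrapper rather than wait for the tool’s defaults to change. A sketch; the function name `mktar` is made up:

```shell
# A wrapper that always passes the flags most people want:
# c (create), v (verbose), a (compress by extension), f (archive file).
mktar() {
    tar cvaf "$@"
}

# Usage: first argument is the archive name, the rest are inputs.
mkdir -p stuff
echo "data" > stuff/x.txt
mktar stuff.tar.gz stuff/
```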

    • sonnenzeit@feddit.de · 1 year ago

      One reason to keep in mind is backwards compatibility and the expectation that every Linux system ships the same basic tools that work the same way.

      Imagine you have a script running on your server that uses a command with or without specific arguments. If the command (say tar) changed its default parameters, this could lead to all sorts of nasty side effects, from crashes to lost or mangled data. Besides the headache of debugging that, even if you knew about the change beforehand it’s still a lot of effort to track down every piece of code that uses that command and rewrite it.

      That’s why programs and interfaces usually add new options over time but are hesitant to remove old ones. And if they do, they’ll usually warn users beforehand that a feature will be deprecated, while allowing for a transitional period.

      One way to solve this conundrum is to simply introduce new commands that offer new features and a more streamlined approach that can replace the older ones in time. Yet a distribution can still ship the older ones alongside the newer ones just in case they are needed.

      Looking at pagers (programs that break up long streams of text into multiple pages that you can read one at a time) as a simple example, you’ll find that more is an older pager program while the newer less offers an even better experience (“less is more”, get the joke?). Both come pre-installed as core tools on many distributions. Finally, an even more modern alternative is most, another pager with even better functionality, but you’ll need to install that one yourself.

  • Ricaz@lemmy.ml · 1 year ago

    I just have pack and extract functions in my shell RC files that look at file extensions and use the proper tool with proper arguments.

    Wrote them 10 years ago and they’ve worked flawlessly ever since!
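A minimal sketch of what such an extract function might look like (not Ricaz’s actual code; it dispatches on the extension and only handles a few common formats):

```shell
# Call the matching tool based on the archive's file extension.
extract() {
    case "$1" in
        *.tar.gz|*.tgz)  tar xzf "$1" ;;
        *.tar.xz)        tar xJf "$1" ;;
        *.tar)           tar xf "$1" ;;
        *.zip)           unzip -q "$1" ;;
        *)               echo "extract: unknown archive type: $1" >&2; return 1 ;;
    esac
}

# Example round trip with a gzipped tarball.
mkdir -p pkg
echo "readme" > pkg/README
tar czf pkg.tar.gz pkg/
rm -r pkg
extract pkg.tar.gz
```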