• @AnUnusualRelic@lemmy.world

      Back when I did a lot of Perl, those were okay-ish to parse. Nowadays, not so much. I guess it’s like Bash. If you write a lot of it (maybe some people do), it’s probably simple. If it’s only once every six months or less, eeehhh…
      It all boils down to familiarity, which comes from repetition.

  • @Gobbel2000@programming.dev

    So true. Every time, I have to look up how to write a bash for loop. Where does the semicolon go? Where is the newline? Is it terminated with done? Or with end? The worst part with bash is that when you get it wrong, most of the time there is no error; something completely wrong just happens.

    • @qjkxbmwvz@startrek.website

      I can only remember this because I initially didn’t learn about xargs — so any time I need to loop over something I tend to use for var in $(cmd) instead of cmd | xargs. It’s more verbose but somewhat more flexible IMHO.

      So I run loops a lot on the command line, not just in shell scripts.
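
      A minimal sketch of the two styles, using a made-up find invocation as the command (the *.log pattern is just a placeholder):

          # loop form: easy to add per-item logic, but word-splits on whitespace
          for f in $(find . -name '*.log'); do
              echo "deleting $f"
              rm -- "$f"
          done

          # xargs form: terser, and handles long argument lists well
          find . -name '*.log' | xargs rm --

      The loop form breaks on paths containing spaces; find -print0 | xargs -0 is the robust variant of the second form.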

    • @ClemaX@lemm.ee

      It all makes sense when you think about the way it will be parsed. I prefer to use newlines instead of semicolons to show the blocks more clearly.

      for file in *.txt
      do
          cat "$file"
      done
      

      The do and done serve as the loop’s block delimiters, like { and } in many other languages. Without them, the shell parser couldn’t know where the block starts and ends.
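
      For reference, the same loop as a one-liner, which answers the recurring semicolon question: a semicolon (or newline) goes before do and before done, but never directly after do.

          for file in *.txt; do cat "$file"; done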

      Edit: I agree that the then/fi, do/done, case/esac pairs are very inconsistent.

      Also, to fail early and raise errors on uninitialized variables, I recommend adding this to the beginning of your bash scripts:

      set -euo pipefail
      

      Or only this for regular sh scripts:

      set -eu
      

      -e: Exit on error

      -u: Error on access to undefined variable

      -o pipefail: Abort pipeline early if any part of it fails.

      There is also -x, which can be very useful for debugging, as it prints a trace of every command as it is executed.
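
      For example, you can trace a single run without editing the script at all (the script name here is hypothetical):

          bash -x ./myscript.sh   # print every command as it executes

      Or wrap just the suspicious section in set -x … set +x.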

  • @Cold_Brew_Enema@lemmy.world

    Me with PowerShell. I’ll write a pretty complex script, not touch PowerShell for 3 months, then come back and have to completely relearn it.

  • 74 183.84

    And I thought I was the only one… For smaller bash scripts, ChatGPT/DeepSeek does a good enough job. I still haven’t tried VS Code’s Copilot on bash scripts, though; I’ve only tried it with C code, and it kiiiinda did an ass job at helping…

    • @cm0002@lemmy.worldOP

      AI does decently enough on scripting languages if you spell it out enough for it lol, but IMO it tends to not do so well when it comes to compiled languages

      I’ve tried Python with VScode Copilot (Claude) and it did pretty good

        • @cm0002@lemmy.worldOP

          I was chalking it up to some scripting languages just tending to be more popular (like python) and thus having more training data for them to draw from

          But that’s a good point too lol

  • katy ✨

    every control structure should end in the backwards spelling of how it started
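
    Which the Bourne shell mostly, but not quite, delivers. A quick illustration (the od spelling was reportedly unavailable because od(1), the octal dump utility, already existed):

        if true; then echo yes; fi        # if … fi
        case x in *) echo any ;; esac     # case … esac
        while false; do :; done           # do … done, sadly not od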

    • @HyperMegaNet@lemm.ee

      Thank you for this. About a year ago I came across ShellCheck thanks to a comment just like this on Reddit. I also happened to be getting towards the end of a project which included hundreds of lines of shell scripts across dozens of files.

      It turns out that despite my workplace having done quite a bit of shell scripting for previous projects, no one had heard of ShellCheck. We had been using similar analysis tools for other languages, but nothing for shell scripts. As you say, it turned up a huge number of errors, including some pretty spicy ones when we first started using it. It was genuinely surprising to see how many unique and terrible ways the scripts could have failed.
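
      For anyone who hasn’t used it: it’s a single command that takes one or more scripts and reports numbered SC findings (the file names here are hypothetical):

          shellcheck deploy.sh          # lint one script
          shellcheck scripts/*.sh       # or everything at once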

    • @ethancedwards8@programming.dev

      I wish it had a more comprehensive auto-correct feature. I maintain a huge bash repository and have tried to use it, but it commonly makes mistakes. None of us maintainers have time to rewrite the scripts to match standards.

      • Trailblazing Braille Taser

        I honestly think autocorrecting your scripts would do more harm than good. ShellCheck tells you about potential issues, but it’s up to you to determine the correct behavior.

        For example, how could it know whether cat $foo should be cat "$foo", or whether the script actually relies on word splitting? It’s possible that $foo intentionally contains multiple paths.
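
        A minimal demo of why both behaviors are legitimate, so no tool can pick one for you:

            foo="a.txt b.txt"
            cat $foo      # word-splits: cat gets two arguments, a.txt and b.txt
            cat "$foo"    # no splitting: cat looks for one file named "a.txt b.txt"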

        Maybe there are autofixable errors I’m not thinking of.

        FYI, it’s possible to adopt ShellCheck gradually by starting with --severity=error and working your way down to warnings and so on. Alternatively, you can add one-off # shellcheck disable=SC1234 comments above offending lines to silence warnings.
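
        In context, such a directive looks like this (SC2086 is the unquoted-expansion warning):

            # shellcheck disable=SC2086  # word splitting is intentional here
            cat $files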

        • For example, how could it know whether cat $foo should be cat "$foo", or whether the script actually relies on word splitting? It’s possible that $foo intentionally contains multiple paths.

          Last time I used ShellCheck (yesterday, funnily enough), I had written ports+=($(get_elixir_ports)) to split the input, since get_elixir_ports returns a string of space-separated ports. It worked exactly as intended, but ShellCheck still recommended making the splitting explicit rather than implicit.

          The ShellCheck docs recommended:

          IFS=" " read -r -a elixir_ports <<< "$(get_elixir_ports)"
          ports+=("${elixir_ports[@]}")
          
      • @stetech@lemmy.world

        Then you’ll have to find the time later, when this leads to bugs. If you write against bash while declaring it POSIX shell, and then a random system’s sh doesn’t implement a certain thing, you’ll be SOL. Or what exactly do you mean by “match standards”?

  • AItoothbrush

    Wait, I’m not the only one? I think I’ve relearned bash more times than I can remember.

  • @coldsideofyourpillow@lemmy.cafe

    That’s why I use nushell. Very convenient for writing scripts that you can understand. Obviously it can’t beat Python in terms of prototyping, but at least I don’t have to relearn it every time.

    • @expr@programming.dev

      We have someone at work who uses it and he’s constantly having tooling issues due to compatibility problems, so… yeah.

      I’m sure it’s fine for sticking in the shebang and writing your own one-off personal scripts, but I would never actually main it. Too much of the ecosystem relies on bash/POSIX stuff.

    • @AnUnusualRelic@lemmy.world

      So the alternative is:

      • either an obtuse script that works everywhere, or
      • a legible script that only works on your machine…
      • I Cast Fist

        a script that only works on your machine

        That’s why docker exists :D

      • @shortrounddev@lemmy.world

        I am of the opinion that production software shouldn’t be written in shell languages. If it’s something that needs to be redistributed, I would write it in Python or something.

        • @AnUnusualRelic@lemmy.world

          For a bit of glue, a shell script is fine. A start script, some small utility gadget…

          With Python, you’re not even sure that the right version is installed unless you ship it with the script.
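
          If you do ship Python, a small shell guard can at least fail loudly; a sketch, with the 3.9 floor as an arbitrary example:

              # refuse to run under a too-old interpreter
              python3 -c 'import sys; sys.exit(0 if sys.version_info >= (3, 9) else 1)' || {
                  echo "error: python >= 3.9 required" >&2
                  exit 1
              }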

        • @Hexarei@programming.dev

          I tend to write anything for distribution in Rust or something that compiles to a standalone binary. Python does not an easily redistributable application make lol

          • @shortrounddev@lemmy.world

            Yeah but then you either need to compile and redistribute binaries for several platforms, or make sure that each target user has rust/cargo installed. Plus some devs don’t trust compiled binaries in something like an npm package

    • @Akito@lemmy.zip

      Nu is great. I’ve been using it for many years. Clearly the superior shell. The only problem is that it constantly introduces breaking changes, so you need to update your modules frequently.

        • @barsoap@lemm.ee

          Not really. They’ve been on the stabilising path for about two years now, removing stuff like dataframes from the default feature set to be able to focus on stabilising the whole core language, but 1.0 isn’t out yet and the minor version just went three digits.

          And it’s good that way. The POSIX CLI is a clusterfuck because it got standardised before it got stabilised. dd’s syntax is just the tip of the iceberg there; you gotta take out the nail scissors and manicure the whole lawn before promising that things won’t change.

          Even in its current state it’s probably less work for many scripts, though. That is, updating things, especially if you version-lock (hello, nixos), will be less of a headache than writing sh could ever be. nushell is a really nice language, occasionally a bit verbose, but never in the boilerplate-for-boilerplate’s-sake way; rather in the “in two weeks I’ll be glad it’s not perl” way. Things like command line parsing are ludicrously convenient (though please, nushell people, land support for collecting repeated arguments into lists).

        • @Akito@lemmy.zip

          Yesterday I upgraded from 0.101.0 to 0.102.0, and date to-table had been replaced (actually improved) by into record, but the replacement wasn’t documented well in the error message. I had to research for 5 to 10 minutes, which doesn’t sound like much, but if you hit something like this every second version, the time adds up quickly.

  • @brokenlcd@feddit.it

    Knowing that a bash script I wrote around 5 years ago is still running the entirety of my high school lab makes me feel sorry for the poor bastard who will need to fix those hieroglyphs as soon as some package update breaks the script. I hate that I used bash, but it was the easiest option at the time on that desolate server.

  • Rose

    There’s always the old piece of wisdom from the Unix jungle: “If you write a complex shell script, sooner or later you’ll wish you’d written it in a real programming language.”

    I wrote a huge PowerShell script over the past few years. I was like “Ooh, guess this is a resume item if anyone asks me if I know PowerShell.” …around the beginning of the year I rewrote the bloody thing in Python and I have zero regrets. It’s no longer a Big Mush of Stuff That Does a Thing. It’s got object orientation now. Design patterns. Things in independent units. Shit like that.

    • FundMECFS

      This is one of the best uses for LLMs imo. They do all my regex for me.

    • @Kissaki@programming.dev

      You always forget regex syntax?

      I’ve always found it simple to understand and remember. Even over many years and decades, I’ve never had issues reading or writing simple regex syntax (excluding the flags and shorthands) even after long regex breaks.

      • @Akito@lemmy.zip

        It’s not about the syntax itself, it’s about which syntax to use. There are different ones and remembering which one is for which language is tough.

        • @Lehmanator@programming.dev

          This is exactly it. Regex is super simple. The difficulty is maintaining a mental mapping between language/util <-> regex engine <-> engine syntax & character class names. It gets worse when utils also conditionally enable extended syntaxes with flags or options.

          The hardest part is remembering whether you need to use \w or [:alnum:].

          Way too few utils actually mention which syntax they use, either. Most just say they accept a “regular expression”, which is totally ambiguous.
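
          For example, even two invocations of grep accept different syntax (GNU grep happens to accept \w in ERE as an extension, but that isn’t portable):

              echo foo_bar | grep -E '[[:alnum:]_]+'   # POSIX ERE: portable character class
              echo foo_bar | grep -P '\w+'             # PCRE shorthand, only with -P support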

          • @ewenak@jlai.lu

            There is the “very magic” mode for vim regexes. It’s not the exact PCRE syntax, but it’s pretty close. You only need to add \v before the expression to use it. There is no permanent mode/option, though. (I think you can remap the commands, like / to /\v.)
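
            The remap mentioned above is a one-liner in your vimrc (this makes every search start in very-magic mode):

                nnoremap / /\v
                nnoremap ? ?\v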

        • @activ8r@sh.itjust.works

          I know that LLMs are probably very helpful for people who are just getting started, but you will never understand it if you can’t grasp the fundamentals. Don’t let “AI” make you lazy. If you do use LLMs make sure you understand the output it’s giving you enough to replicate it yourself.

          This may not be applicable to you specifically, but I think this is nice info to have here for others.

  • @wwb4itcgas@lemm.ee

    I have a confession to make: unless a shell script is absolutely required, I just use Python for all my automation needs.