• RustyNova@lemmy.world · 8 months ago

    Not a Go dev. Is it really preventing compilation, or is it just some hardened linting rule? Most languages can be tweaked to fail the build on these errors, but that seems bad if it isn’t a warning by default.
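
    For reference, a minimal sketch of the behavior in question: Go treats an unused local variable as a hard compile error, not a lint warning (the exact message wording varies by Go version), so the snippet below intentionally fails to build.

        package main

        func main() {
            x := 42 // x is never read again
            // build fails here: "declared and not used: x" (wording varies by Go version)
        }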

      • TheSambassador@lemmy.world · 8 months ago

        What reason is there for this when the compiler could just optimize that variable out of existence? This feels like the most hand-holdy, annoying “feature” unless I’m missing something.

        • frezik@midwest.social · 8 months ago

          Cleaner code. That’s all.

          If you need to take a variable you don’t use for some reason (like a function arg that has to match an interface but isn’t needed in this particular case), you can use the blank identifier _ for it, as in the sketch below.
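
          A hedged sketch of that workaround (handle and doWork are made-up names): Go’s blank identifier _ lets you discard parameters or return values you’re forced to accept, which keeps the compiler happy without leaving a real unused variable around.

              package main

              import "fmt"

              // handle takes a string it never uses (imagine a callback or interface
              // signature that requires it); naming the parameter _ makes that explicit.
              func handle(_ string, n int) {
                  fmt.Println(n)
              }

              // doWork is a stand-in that returns a value plus an error.
              func doWork() (int, error) { return 1, nil }

              func main() {
                  n, _ := doWork()     // discard the error with the blank identifier
                  handle("ignored", n) // the string argument is accepted but unused
              }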

          • expr@programming.dev · 8 months ago

            That’s what warnings are for, plus -Werror for production builds, in literally any other language. This has been a solved problem for a very long time.

            • frezik@midwest.social · 8 months ago

              Sure. Tell that to the Go devs.

              If the language weren’t pushed by Google, nobody would pay it any attention. It’s yet another attempt to “do C right” and it makes some odd choices in the attempt.

            • dbx12@programming.dev · 8 months ago

              I for my part prefer it that way. It makes sure the code stays clean and nobody can just silence the warnings and be done with it. Why would you accept useless variables cluttering the code in production builds? Imagine coming back after some time and trying to understand the code again. At least you have the guarantee that every variable is used somehow, and not just “hmm, what does this do? … ah, it’s unused”.

              • expr@programming.dev · 8 months ago

                …you don’t accept them. Basically every programming language accepts some kind of -Werror flag to turn warnings into errors. Warnings for development builds, errors for production builds. This has been a solved problem for a very long time. Not only is it asinine to force them to always be errors, it’s semantically incorrect: errors should be things that prevent the code from functioning in some capacity.

                • dbx12@programming.dev · 7 months ago

                  Oh, so that turns warnings into errors; it doesn’t mean “ignore errors”. I’m not too familiar with compiler flags. You could do some mental gymnastics to argue that the unused variable causes the compiler to exit, thus the code is not functioning, and thus the unused variable is not a warning but an error :^)

                  • expr@programming.dev · 7 months ago

                    It’s a pretty standard flag in basically all compiled languages; it just goes by different names: -Werror in C (GCC/Clang), -Werror in Java (javac), TreatWarningsAsErrors in C#, etc.

      • Valmond@lemmy.mindoki.com · 8 months ago

        Whoa, that seems like you’d have to flesh out your code elsewhere, you know, when you throw stuff together to get it working first and only then fix it up to standards.

        Feels like being required to make your git commits perfect before you’re even allowed to compile…

        Put that overwhelmingly intrusive check in a git hook that inspects your commits instead (when you push your branch, of course).

        • Ethan@programming.dev · 8 months ago

          You get used to it. The only time I really notice it these days is when I’m debugging and commenting out code.

            • Ethan@programming.dev · 8 months ago

              *when I’m doing debugging that requires commenting out code.

              Most of the time, I don’t comment out code. I run the code in a debugger, step through it, and see how the behavior deviates from what I expect. I mostly only resort to commenting out code if I’m having trouble figuring out where the problem is coming from, which isn’t that often.

    • YIj54yALOJxEsY20eU@lemm.ee (OP) · 8 months ago

      I don’t think it’s inherently bad, but it feels jarring that the language still lets you dereference nil pointers. It’s so effective in its hand-holding otherwise that blowing things up shouldn’t be this easy.
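
      To illustrate the contrast, a minimal sketch (user is a made-up type): this builds without complaint and panics at runtime, while an unused variable wouldn’t even compile.

          package main

          type user struct{ name string }

          func main() {
              var u *user     // u is nil; the compiler is fine with this since u is used below
              println(u.name) // compiles cleanly, panics at runtime: nil pointer dereference
          }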