• chicken@lemmy.dbzer0.com · 1 year ago

    ChatGPT regularly makes up methods or entire libraries

    I think that when it does that, it's usually a sign that what you're asking for doesn't exist and you're on the wrong track.

    ChatGPT cannot explain, because it doesn’t understand

    I often get good explanations that seem to reflect understanding and would otherwise be difficult to look up. For example, when I asked about the generated code {myVariable} and how it could be a valid function parameter in JavaScript, it responded that it is equivalent to {"myVariable": myVariable}, and that "When using object literal property value shorthand, if you're setting a property value to a variable of the same name, you can simply use the variable name."
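
    In case it's useful, here is a minimal sketch of that shorthand; the names myVariable and logIt are illustrative, not from the actual generated code:

    ```javascript
    // Object literal property value shorthand (ES2015):
    const myVariable = 42;

    const longhand  = { myVariable: myVariable }; // explicit key/value pair
    const shorthand = { myVariable };             // builds the same object

    console.log(shorthand.myVariable); // 42

    // The same shorthand works at a call site: passing {myVariable} builds an
    // object argument, which the function can destructure back out.
    function logIt({ myVariable }) {
      console.log(myVariable);
    }
    logIt({ myVariable }); // 42
    ```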

    • state_electrician@discuss.tchncs.de · 1 year ago

      If ChatGPT gives you correct information, you're either lucky or you just didn't realize it was making shit up. That's a simple fact. LLMs absolutely have their uses, but facts ain't one of them.