Yes, it does that because it was designed to sound convincing, and that's an effective way to do it. Sounding convincing is the primary design goal of every chatbot, and it's what the Turing Test was intended to gauge. Anyone who makes a chatbot wants it to sound good first and foremost.
Or it’s ChatGPT
Honestly, I think ChatGPT wouldn’t make that particular mistake. Sounding proper is its primary purpose. Maybe a cheap knockoff.
TalkFOS
ChatGPT just guesses the next word. Stop anthropomorphizing it.
Humans are just electrified meat. Stop anthropomorphizing it.
Found Andrew Ure’s account
🙄
Another example of why I hate techies
It guesses the next word… based on examples created by humans. It's not just making shit up out of thin air.
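For the curious, here's roughly what "guessing the next word based on human examples" means, as a minimal sketch. This is a toy bigram model with a made-up corpus, nothing like the neural networks real LLMs use, but the basic idea of sampling the next word from patterns in human text is the same:

```python
import random
from collections import defaultdict, Counter

# Toy "corpus" of human-written text (invented for illustration).
corpus = (
    "the cat sat on the mat . the dog sat on the rug . "
    "the cat chased the dog ."
)

# Count how often each word follows each other word.
counts = defaultdict(Counter)
words = corpus.split()
for prev, nxt in zip(words, words[1:]):
    counts[prev][nxt] += 1

def guess_next(word: str) -> str:
    """Sample the next word in proportion to how often humans wrote it."""
    followers = counts[word]
    choices, weights = zip(*followers.items())
    return random.choices(choices, weights=weights)[0]

# Generate a short continuation, one guessed word at a time.
word = "the"
out = [word]
for _ in range(8):
    word = guess_next(word)
    out.append(word)
print(" ".join(out))
```

The output is grammatical-ish because the frequencies came from human writing, not because the program "knows" anything.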
I knew someone would say that.
Fuck, you’re probably right.