It’s so good at parsing text and documents, summarizing.
No. Not when it matters. It makes stuff up. The less you carefully check every single fucking thing it says, the more likely you are to believe some lies it subtly slipped in as it went along. If truth doesn’t matter, go ahead and use LLMs.
If you just want some ideas that you’re going to sift through, independently verify and check for yourself with extreme skepticism as if Donald Trump were telling you how to achieve world peace, great, you’re using LLMs effectively.
But if you’re trusting it, you’re doing it very, very wrong and you’re going to get humiliated because other people are going to catch you out in repeating an LLM’s bullshit.
If it’s as bad as you say, could you give an example of a prompt where it’ll tell you incorrect information?
It’s like you didn’t listen to anything I ever said, or you discounted everything I said as fiction, but everything your dear LLM said is gospel truth in your eyes. It’s utterly irrational. You have to be trolling me now.
Should be easy if it’s that bad though
I already told you my experience of the crapness of LLMs and even explained why I can’t share the prompt etc. You clearly weren’t listening or are incapable of taking in information.
There’s also all the testing done by the people talked about in the article we’re discussing which you’re also irrationally dismissing.
You have extreme confirmation bias.
Everything you hear that disagrees with your absurd faith in the accuracy of the extreme blagging of LLMs gets dismissed for any excuse you can come up with.
You’re projecting here. I’m asking you to give an example of any prompt. You’re saying it’s so bad that it needs to be babysat because of its errors. I’m only asking for you to give an example, and you’re saying that’s confirmation bias and acting like I’m being religiously ignorant.