• Vupware@lemmy.zip · 17 hours ago

      Certainly! That’s why you use them to find sources. I don’t immediately trust anything an LLM spits out.

      • M0oP0o@mander.xyz · 13 hours ago

        The number of times I see people just take whatever it outputs as gospel is scary. And on things with real risk, like electrical repair!

        • Vupware@lemmy.zip · 3 hours ago

          Absolutely. It is essential to use these LLMs with an abundance of caution, and to explore a broader range of sources from various viewpoints.

          Caution and breadth of sources should be staples of any historical or contemporary investigation, but alas, the minds of my fellow citizens have been eroded; these practices cannot be assumed to be broadly adopted, so when saying something positive about LLMs, we must provide ample disclaimers and remind the general populace of the dangers and harms these technologies pose to us and our surroundings.

        • 13igTyme@lemmy.world · 12 hours ago

          I’ve seen Google AI say one thing and provide a source. When I read the source, sometimes it says something completely different or even the exact opposite.

          • M0oP0o@mander.xyz · 11 hours ago

            Oh yeah, the LLM that is built to make things up at the drop of a hat can also make up its sources. It’s wild that people can still trust a literal machine of lies.

            I saw one the other day that stated AC and DC power were interchangeable…