• TheObviousSolution@lemm.ee
    8 months ago

    You can tell that the prohibition on Gaza is a rule applied in post-processing. Bing does this too sometimes: it almost gives you an answer before cutting itself off and suddenly removing it. Modern AI is not your friend; it is an authoritarian’s wet dream. All an act, with zero soul.

    By the way, if you think those responses are dystopian, try asking it whether Gaza exists, and then whether Israel exists.

    • joenforcer@midwest.social
      8 months ago

      To be fair, I tested this question on Copilot (the evolution of the Bing AI solution) and it gave me an answer. If I search for “those just my little ladybugs”, however, it chokes as you describe.