piqd | My hallucinations, your hallucinations

by admin

What is artificial intelligence and what does it do? This is not just a philosophical question, but increasingly also a legal one. An interesting case is that of Mark Walters, a pro-gun radio host from the US state of Georgia, who is suing OpenAI over a so-called hallucination of ChatGPT, i.e. a statement the chatbot made up. In a chat with a journalist who was researching Walters, ChatGPT had defamed him, claiming that he had embezzled money and was being sued for it. Ars Technica is taking the trouble to follow Walters' lawsuit against OpenAI in detail.

OpenAI's arguments, which the court now has to weigh, are interesting. They show what a strange matter the output of AI is and how AI companies try to shift responsibility.

Questions that OpenAI raises in the proceedings:

- According to OpenAI, statements made by the AI in a chat are not a "publication".
- If the journalist hadn't alerted Walters, the lie would never have come to light; the statement therefore did not cause Walters any harm.
- ChatGPT users are essentially contractors of OpenAI; statements from the AI are therefore "communication within a company", i.e. conversations the AI essentially has with itself.
- There is a disclaimer warning about hallucinations (note: warning someone that you sometimes lie doesn't usually help once you have defamed them).
- My favorite: the chatbot's statements are the "property" of the person chatting with it (in this case the journalist), so OpenAI is in the clear.

All of this sounds pretty contrived on OpenAI's part. It will be interesting to see how the court decides these fundamental questions about the output of AI.
