AI "hallucinations" can cost reputation and even life: ChatGPT invented a story that a man committed a terrible crime.

A resident of Norway, Arve Hjalmar Holmen, filed a complaint after ChatGPT falsely wrote that he had killed his two sons and been sentenced to 21 years in prison.
This was reported by BBC News Ukraine.
Holmen is demanding that the Norwegian Data Protection Authority fine the chatbot's developer, OpenAI. According to the man, this fabrication has caused him great harm.
"Some believe that there is no smoke without fire. The fact that someone could read this output and believe it is true is what scares me the most," the Norwegian said.
Note that the answer ChatGPT gave is an example of so-called "hallucinations", when artificial intelligence invents information and presents it as fact. Holmen received the false information from the chatbot in response to the question: "Who is Arve Hjalmar Holmen?"
"Arve Hjalmar Holmen is a Norwegian individual who attracted attention due to a tragic event. He was the father of two young boys, aged 7 and 10, who were found dead in a pond near their home in Trondheim, Norway, in December 2020," the chatbot replied.

Holmen noticed that the chatbot more or less correctly indicated the age difference between his children, which suggests ChatGPT did have some accurate information about him.
The digital rights group NOYB, which filed the complaint on Holmen's behalf, called ChatGPT's answer defamatory and a violation of European personal data protection rules. NOYB noted that the man "has never been accused or convicted of any crime and is a conscientious citizen."
What is known about AI "hallucinations"?
"Hallucinations" are one of the key problems that experts in artificial intelligence are working to solve. The term refers to cases in which chatbots present fabricated information as reliable fact.
Earlier in 2025, Apple temporarily suspended its Apple Intelligence news summary tool in the UK after it generated fake headlines and presented them as real news.
Google's Gemini has made similar errors: in 2024, for example, it recommended using glue to stick cheese to pizza and claimed that geologists advise people to eat one rock a day.
The exact causes of such "hallucinations" are still unknown.
"This is very much an area of active research. How do we construct these chains of reasoning? How do we explain what is happening inside a large language model? Even if you are directly involved in building these systems, very often you do not know how they actually work or why they produce a particular piece of information," said Simone Stumpf, a professor at the University of Glasgow who specializes in AI.
Recall that in Poland, travelers set off into the mountains along a route suggested by ChatGPT and ended up trapped in a spot from which they could not escape on their own.