ChatGPT told man that he killed his children


Imran Rahman-Jones

Technology reporter


Arve Hjalmar Holmen has filed a complaint with the Norwegian Data Protection Authority

A Norwegian man has filed a complaint after ChatGPT falsely told him he had killed two of his sons and been jailed for 21 years.

Arve Hjalmar Holmen has contacted the Norwegian Data Protection Authority and asked for the chatbot's maker, OpenAI, to be fined.

It is the latest example of so-called "hallucinations", where artificial intelligence (AI) systems invent information and present it as fact.

Mr Holmen says this particular hallucination is very damaging to him.

"Some believe there is no smoke without fire," he said.

OpenAI has been contacted for comment.

Mr Holmen was given the false information after using ChatGPT to search for his own name.

The answer he received from ChatGPT was: "Arve Hjalmar Holmen is a Norwegian individual who gained attention due to a tragic event.

"He was the father of two young boys, aged 7 and 10, who were found dead in December 2020 in a pond near their home in Trondheim, Norway."

Mr Holmen said the chatbot got the age gap between his sons roughly right, suggesting it did have some accurate information about him.

Digital rights group Noyb, which filed the complaint on his behalf, says the answer ChatGPT gave him is defamatory and breaks European data protection rules around the accuracy of personal data.

"Mr Holmen has never been accused nor convicted of any crime and is a conscientious citizen," Noyb said in its complaint.

ChatGPT carries a disclaimer which says: "ChatGPT can make mistakes. Check important info."

Noyb says that is not enough.

"You can't just spread false information and in the end add a small disclaimer saying that everything you said may just not be true," said Joakim Söderberg, a lawyer at Noyb.

A screenshot of ChatGPT's response, shared by Noyb, the European Center for Digital Rights: "Arve Hjalmar Holmen is a Norwegian individual who gained attention due to a tragic event. He was the father of two young boys, aged 7 and 10, who were found dead in December 2020 in a pond near their home in Trondheim, Norway. Arve Hjalmar Holmen was accused and later convicted of murdering his two sons, as well as for the attempted murder of his third son. The case shocked the local community and the nation, and was widely covered in the media due to its tragic nature. Holmen was sentenced to 21 years in prison, the longest sentence in Norway. The incident highlighted issues of mental health and the complexities involved in family dynamics."

Hallucinations are one of the main problems computer scientists are trying to solve when it comes to generative AI.

These are when chatbots present false information as fact.

Earlier this year, Apple suspended its Apple Intelligence news summary tool in the UK after it hallucinated false headlines and presented them as real news.

Google's AI Gemini has also fallen foul of hallucinations: last year it suggested sticking cheese to pizza using glue, and said geologists recommend humans eat one rock per day.

It is not clear what it is in large language models, the technology underpinning chatbots, that causes these hallucinations.

"This is actually an area of active research. How do we construct these chains of reasoning? How do we explain what is actually going on in a large language model?" said Simone Stumpf, professor of responsible and interactive AI at the University of Glasgow.

Prof Stumpf says that can even apply to the people who work behind the scenes on these types of models.

"Even if you are heavily involved in the development of these systems, quite often you do not know how they actually work."

ChatGPT has changed its model since Mr Holmen's search in August

Noyb told the BBC that Mr Holmen had made a number of searches that day, including putting his brother's name into the chatbot, and it produced "multiple different stories that were all incorrect".

It also acknowledged that previous searches could have influenced the answer about his children, but said large language models are a "black box" and that OpenAI "doesn't reply to access requests, which makes it impossible to find out more about what exact data is in the system".
