Microsoft came under fire after publishing an article about the death of a 21-year-old coach, who was found lifeless with severe head injuries, accompanied by a distasteful poll speculating about the cause of her death.
“What do you think is the reason for this woman’s death?” read a poll that accompanied an article from The Guardian, offering the options “murder,” “accident,” or “suicide,” the British newspaper denounced on Tuesday.
Last Thursday the daily published an article about the death of the young Australian water polo coach Lilie James, who was found lifeless with serious head injuries at a Sydney school last week.
On Microsoft’s news platform, Microsoft Start, however, the article was accompanied by a poll automatically generated by artificial intelligence (AI).
“This has to be the most pathetic and disgusting poll I have ever seen,” one reader wrote in the comments, according to The Guardian.
“This is clearly an inappropriate use of generative AI by Microsoft in a potentially distressing public interest story originally written and published by Guardian journalists,” complained Anna Bateson, CEO of Guardian Media Group, accusing Microsoft of shoddy journalism that had seriously damaged the outlet’s reputation.
In a letter addressed to Microsoft president Brad Smith, Bateson demanded accountability; Smith in turn gave assurances that the experimental AI would no longer be used without the publisher’s consent.
“We have deactivated Microsoft-generated polls for all news articles and are investigating the cause of the inappropriate content. A poll should not have appeared alongside an article of this nature, and we are taking steps to prevent this kind of error from recurring,” a Microsoft spokesperson confirmed, according to British media.