Amazon criticized again after putting AI-generated ads online

ChatGPT's use for generating product listings has reached a level of irony that is astonishing, to say the least.

Amazon is once again in turmoil after hosting product listings crudely generated by artificial intelligence. The website Futurism spotted a number of listings carrying the same text, enough to potentially mislead consumers.

Some enterprising sellers apparently use ChatGPT to automatically write product listings for items, real or otherwise. But OpenAI, the company behind ChatGPT, does not allow everything, which is why the product title ends up replaced by a stock refusal message.

“I'm sorry, but I cannot respond to your request because it violates OpenAI's terms of service. My goal is to provide you with accurate help and information for Brown users,” reads one listing, where “Brown” is the color of the furniture, sold under a brand name that may itself be randomly generated (FOPEAS).

An example of an AI-generated ad on Amazon – Futurism

The same title therefore appears on dozens of products, a situation that embarrasses Amazon, especially since the online retail giant is already accused of being too lax about dropshipping products and fake reviews.

A situation that tends to promote products that may be of poor quality, or that do not exist at all.

Amazon defends itself

Our colleagues at Futurism question Amazon's verification process, even wondering whether there is “a pilot on the plane.” In a statement, the company says it “works hard to provide [its] customers with a trustworthy shopping experience, in particular by requiring that third parties provide accurate information about their products.” The listings singled out have since been removed.

Most of the offending products fell into a single category: furniture. But other products could also be affected, particularly in their descriptions, which sometimes amount to jumbles of unrelated words.