Via Ars Technica, I’ve learned that shady Amazon sellers have been using chatbots to automatically write item descriptions. The result is hot offers on items like “I cannot fulfill that request” and “I apologize but I cannot complete this task.” This is a natural progression from the Amazon product listings that were simply misdescribed by humans.
In the distant past, 15 or 20 years ago, Amazon used to be the place where you could reliably order name brand things. If you wanted a DVD at a mediocre price, you could order from Amazon and get a legit copy of it. If you wanted a ridiculously low price, you could order from a shady website and get a knockoff of dubious provenance and quality.
As Amazon has expanded its offerings to include everything, that has inevitably swept in all the shit.