It still blows my mind (IDK why) that this stuff is still happening. Does Amazon (THE KING OF DATA) really not have a way to test their bots?
I’m no programmer, but I worked closely with an IT Dept for several years in a past life, and this is pretty simple stuff.
You come out with a new algorithm, and you test it in a test environment, get a report back with what was pulled, and have someone with 1/4 of a brain look at the results.
If they did that, this stuff really shouldn’t happen, ever…
With that said, there very well could be something somewhere in the front or back end of the listing that’s triggering this and it’s not totally Amazon’s fault.
I don’t disagree, but I’m nevertheless increasingly reminded of something the oft-prophetic sage H. G. Wells wrote in 1920:
“Human history becomes more and more a race between education and catastrophe.”
Wells had something else in mind when he penned those timeless lines, but I increasingly find it applicable to Human Society’s overweening ambition to free itself from toil via dependence upon mindless automatons.
Having worked in the IT departments of several Fortune 100 companies, I can tell you something is radically wrong at Amazon. We had a development environment for testing, and if it passed the DEV environment, then it went to QA to be tested in a sandbox that was identical to the production environment. If it passed QA then and only then would it be released in PROD, and developers had no access to production for obvious reasons. That’s standard protocol in IT. This could not have been properly tested.
What is wrong at Amazon is that there is no one capable of writing a software requirements document for use by the programmer. No one who knows what steps would be required to meet such a requirement. And no process to generate such a document.
The result is always the same. Some PR related or regulatory issue generates an urgent request for an AI function (aka Amabot) to enforce a need.
A programmer is provided with inadequate input on what that means, and writes a destructive piece of software.
I believe Andy touted some huge amount of code written by generative AI having been deployed. The chances of it having unintended consequences are as large as or larger than with human coders, since human coders might have once made a purchase at Amazon or some other retailer.
Another great reason for Amazon to stop buying from vendors and go all 3P sellers, so it will not have to deal with this pain on its own listings.
This is the product in question (link ‘broken’ for display on the Discourse Platform, as was the SHC page’s URL in the above-quoted post’s reply from Seller Support):
I trust that Glenn understands the issue and am hopeful he can get it sorted, but it would make much better business sense for Amazon to just FIX THE BOT ALREADY
It is not wood that is the issue.
It is “woad” in the title, which is in fact a noxious weed.
Woad is considered a noxious weed in much of the western U.S., including California, Montana, Wyoming, Utah, Idaho, Oregon, and Washington state. The classification “noxious weed” means that it is illegal to grow it because it is considered so detrimental to agricultural productivity. Isatis tinctoria - Wikipedia
Still, it is asinine that the bot is only looking for the instance of a word, with zero additional context to determine what is or isn’t a problem item.
Great catch, Crafty! (not an unusual circumstance for you, I’d note).
Hopefully, someone who’s still participating over in the NSFE will take time to point out that it’s that particular word in the Product Title which is almost-certainly what the Noxious Weed Amabot is triggering upon…
Yes, the whole thing with their bots is dumb because every single one of them is ONLY looking for a word and is completely ignoring context. It is utterly pathetic that they’ve coded such a generic check.
I would expect them to have multiple Boolean logic checks to better home in on whether or not something should be taken down.
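To make the point concrete, here’s a minimal sketch (in Python, and purely hypothetical; the word list, category names, and function names are all my own invention, not anything from Amazon) of the difference between a bare keyword match and a check that combines a few Boolean conditions from the rest of the listing:

```python
# Hypothetical sketch, NOT Amazon's actual logic: a bare keyword flag
# versus a flag gated by other listing fields.

NOXIOUS_WEEDS = {"woad", "kudzu", "hydrilla"}  # illustrative word list

def naive_flag(title: str) -> bool:
    """Flags a listing if a banned word appears anywhere in the title."""
    words = title.lower().split()
    return any(w.strip(",.()") in NOXIOUS_WEEDS for w in words)

def contextual_flag(title: str, category: str, material: str) -> bool:
    """Only flags when the keyword hit is backed up by other listing data."""
    return (
        naive_flag(title)
        and category in {"Plants", "Seeds", "Live Plants"}  # is it actually plant matter?
        and material not in {"Wool", "Cotton", "Wood"}      # dyed textiles etc. pass
    )

title = "Hand-Dyed Woad Blue Wool Yarn"
print(naive_flag(title))                         # True: the word alone trips it
print(contextual_flag(title, "Crafts", "Wool"))  # False: context clears it
```

Even two or three extra conditions like this would keep a woad-dyed yarn listing alive while still catching actual woad seeds.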
Weirdly, the listing also claims it’s made of wood in the materials section, possibly because of a rogue Amabot, possibly because of badly filled-in blanks.
But yes, in a perfect world there would be a “is actually plant matter” check for noxious weeds, but then bad actors would just misfile it so that data point was overlooked.