Yesterday, we looked at the way Amazon can use its near-monopoly position (83% of the market) to prevent both the publication and distribution of books. It can, as we saw, pick sides in a controversy and tag material that was previously merely controversial as “hate speech.” But anyone who thinks that this corporate power is wisely used may wish to consider the comments of physician Chuck Dinerstein, M.D., on the algorithm Amazon uses to assess the scientific value of material on the COVID-19 vaccine.
Amazon markets information for profit. It does not act, like Twitter and Facebook, primarily as a platform or carrier. Its algorithms are designed to sell products. So Amazon keeps track of what you like, what you buy, and what you say about it. If you shop there, it probably knows more about your reading habits than your friends do.
Dr. Dinerstein points to a recent open-access research paper whose authors tracked the way the Amazon algorithms treat information about the COVID vaccine. Every day for 15 days, the authors ran targeted searches, using both accounts with no history and accounts for which they created a history via clicks.
The seeds for those search trees were constructed using words fashioned from Google trends, a dataset that captures what phrases are most popular at any given time. They developed 48 search queries based upon the ten most popular vaccine topics – they ranged from what we might all agree were neutral, like chickenpox or influenza vaccine, to the more polarized, like anti-vaccination or Andrew Wakefield …
As it turns out, Amazon pays attention to our keystrokes; clicking on an item, adding it to our cart, and purchasing it all impact the algorithm. So the researchers varied the searches to see if putting an item in a cart behaved differently than simply clicking on an item and moving on.

Chuck Dinerstein, “Amazon, Like Social Media, Dishes Vaccine Misinformation” at American Council on Science and Health
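The audit design described above can be sketched in a few lines. This is a hypothetical illustration, not the researchers’ actual code: the function names and the two sample queries are stand-ins, and the real study used live Amazon accounts and 48 hand-built queries.

```python
# Illustrative sketch of the audit loop: pair every seeded search query
# with every user action (click vs. add-to-cart), using a fresh account
# for each run, over a 15-day window. All names here are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Account:
    """A test account whose only state is its recorded action history."""
    history: list = field(default_factory=list)

    def act(self, product: str, action: str):
        # 'click' or 'add_to_cart' -- the actions the study varied
        self.history.append((action, product))

def build_audit_schedule(queries, actions, days=15):
    """One run per (day, query, action) combination, each with a fresh account."""
    schedule = []
    for day in range(days):
        for query in queries:
            for action in actions:
                schedule.append((day, query, action, Account()))
    return schedule

schedule = build_audit_schedule(
    ["influenza vaccine", "anti-vaccination"],  # sample neutral vs. polarized seeds
    ["click", "add_to_cart"],
    days=15,
)
print(len(schedule))  # 15 days x 2 queries x 2 actions = 60 runs
```

The point of the fresh account per run is to isolate the effect of a single action history, which is what lets the researchers attribute recommendation differences to the action taken rather than to accumulated profile noise.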
The biggest conundrum the researchers found was the difficulty of classifying “misinformation” in any mechanical way for 350 million products. They had to do it by hand:
The nearly 5,000 items promoted by their searches were labeled debunking, neutral, or promoting vaccine myths manually – the principal author determining 78%, paid workers on Amazon’s Mechanical Turk, an online crowdsourcing service, the remaining 22%. Another individual reviewed the choices, and only 2% required discussion. It takes a great deal of hand labor to classify the informational value of objects.
While this information is buried within the paper’s methodology, it holds a key point for us to recognize. There are no simple computerized means of determining the content of Amazon’s inanimate products.

Chuck Dinerstein, “Amazon, Like Social Media, Dishes Vaccine Misinformation” at American Council on Science and Health
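The one part of that workflow that is mechanical is checking how often two human reviewers disagree (the 2% figure quoted above). A minimal sketch of that calculation, with made-up labels for illustration:

```python
# Illustrative inter-reviewer disagreement check. The label names match
# the study's three categories; the sample data below is invented.

def disagreement_rate(labels_a, labels_b):
    """Fraction of items on which two reviewers assigned different labels."""
    assert len(labels_a) == len(labels_b), "reviewers must label the same items"
    differing = sum(1 for a, b in zip(labels_a, labels_b) if a != b)
    return differing / len(labels_a)

reviewer_1 = ["debunking", "neutral", "promoting", "neutral", "neutral"]
reviewer_2 = ["debunking", "neutral", "promoting", "promoting", "neutral"]
print(disagreement_rate(reviewer_1, reviewer_2))  # 0.2 (1 of 5 items differs)
```

Only the disagreeing items need discussion; everything else is the irreducible hand labor Dinerstein is pointing to.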
It sounds a bit like the problem with classifying “hate speech”: Without an inquiry, it would be hard to tell, in many cases, whether the controversial item is an expression of hate or “bombshell” factual information that a powerful person or lobby does not want circulated. An algorithm will not likely help us decide.
But Amazon’s algorithm has another problem too, as Dinerstein notes. It multiplies whatever we are looking for. If a user clicked on the material that the researchers had classified, after investigation, as misinformation, Amazon served up suggestions for more of the same. It did the same for information they classified as “neutral” or “debunking.” Amazon is, after all, in the business of selling products, not giving medical advice to a patient.
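The “more of the same” behavior is easy to see in miniature. The sketch below is a toy recommender, not Amazon’s algorithm: the catalog, titles, and scoring are invented, and real systems are vastly more complex. It shows only the structural point, that a recommender which ranks by similarity to past engagement amplifies whatever category the user clicked, with no regard for whether that category is labeled misinformation.

```python
# Toy "filter bubble" recommender. Catalog, labels, and titles are
# hypothetical; the scoring rewards items sharing a label with
# whatever the user already clicked, regardless of the label itself.

from collections import Counter

CATALOG = {
    "Vaccine Myths Exposed": "misinformation",
    "Vaccines and Autism: The Truth": "debunking",
    "Childhood Immunization Schedule": "neutral",
    "The Hidden Dangers of Vaccines": "misinformation",
}

def recommend(clicked_titles, top_n=2):
    """Rank unclicked items by how often the user engaged with their label."""
    label_counts = Counter(CATALOG[t] for t in clicked_titles)
    ranked = sorted(
        (t for t in CATALOG if t not in clicked_titles),
        key=lambda t: -label_counts[CATALOG[t]],  # missing labels count as 0
    )
    return ranked[:top_n]

# Clicking one misinformative title pushes the other one to the top.
print(recommend(["Vaccine Myths Exposed"]))
```

The same scoring would just as happily amplify debunking or neutral material, which is exactly what the researchers observed: the algorithm multiplies engagement, whatever its label.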
Let’s leave aside for now whether we ought to accept the researchers’ classifications without more information. The bigger issue is that, while Amazon cheerfully acts as a censor on some topics (as recent events show), its business model frustrates any such judgment on most topics in real life. At best, Amazon will pick and choose its censorship, based on the demands of powerful lobbies, leaving the rest to the market. Because it must.
Those who want to hear other voices on topics dominated by powerful lobbies will need other providers of information. One option, noted earlier, is to find out who the publisher is through an internet search and order directly from the publisher.
From the Abstract of the research paper:
We find evidence of filter-bubble effect in Amazon’s recommendations; accounts performing actions on misinformative products are presented with more misinformation compared to accounts performing actions on neutral and debunking products. Interestingly, once user clicks on a misinformative product, homepage recommendations become more contaminated compared to when user shows an intention to buy that product.

Prerna Juneja and Tanushree Mitra. 2021. Auditing E-Commerce Platforms for Algorithmically Curated Vaccine Misinformation. In CHI Conference on Human Factors in Computing Systems (CHI ’21), May 8–13, 2021, Yokohama, Japan. ACM, New York, NY, USA, 27 pages. https://doi.org/10.1145/3411764.3445250
You may also wish to read: Does Amazon’s near-monopoly justify its use of censorship? Caitlin Basset looks at the little-known Seattle law that might make Amazon’s censorship much more costly. Because Amazon is 83% of the market, publishers hesitate to publish what Amazon might ban. Meanwhile, readers are urged to act to protect classics.