Back in 2015, a woman named Imy Santiago wrote an Amazon review of a novel that she had read and liked. Amazon immediately took the review down and told Santiago she had “violated its policies.” Santiago re-read her review and, finding nothing objectionable in it, tried to post it again. “You’re not eligible to review this product,” an Amazon prompt informed her.
When she wrote to Amazon about it, the company told her that her “account activity indicates you know the author personally.” Santiago did not know the author, so she wrote an angry email to Amazon and blogged about the company’s “Big Brother” surveillance.
I reached out to both Santiago and Amazon at the time to try to figure out what the hell happened here. Santiago, who is an indie book writer herself, told me that she’d been in the same ballroom as the author at a book-signing event in New York a few months earlier, but had not talked to her, and that she had followed the author on Twitter and Facebook after reading her books. Santiago had never connected her Facebook account to Amazon, she said.
Amazon wouldn’t tell me much back in 2015. Spokesperson Julie Law told me by email at the time that the company “didn’t comment on individual accounts” but said, “when we detect that elements of a reviewer’s Amazon account match elements of an author’s Amazon account, we conclude that there is too much risk of review bias. This can erode customer trust, and thus we remove the review. I can assure you that we investigate each case.”
“We have built mechanisms, both manual and automated over the years that detect, remove or prevent reviews which violate guidelines,” Law added.
A new report in the New York Times about Facebook’s surprising level of data-sharing with other technology companies may shed light on those mechanisms:
Facebook allowed Microsoft’s Bing search engine to see the names of virtually all Facebook users’ friends without consent, the records show, and gave Netflix and Spotify the ability to read Facebook users’ private messages.
The social network permitted Amazon to obtain users’ names and contact information through their friends, and it let Yahoo view streams of friends’ posts as recently as this summer, despite public statements that it had stopped that type of sharing years earlier.
If Amazon was sucking up data from Facebook about who knew whom, that may explain why Santiago’s review was blocked. Because Santiago had followed the author on Facebook, Amazon or its algorithms would have seen her name and contact information as connected to the author there, according to the Times. Facebook reportedly neither told users this data-sharing was happening nor got their consent, so Santiago, and presumably the author, wouldn’t have known it was going on.
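To make that inference concrete, here is a minimal, entirely hypothetical sketch of the kind of account-overlap check Amazon’s statement describes. Amazon has not disclosed how its system actually works; every field name, signal, and threshold below is invented for illustration.

```python
# Hypothetical sketch of a "reviewer knows the author" check.
# Nothing here reflects Amazon's actual implementation; the field names,
# signals, and threshold are all assumptions made for illustration.

def knows_author(reviewer: dict, author: dict) -> bool:
    """Return True if the reviewer's account data overlaps with the author's."""
    signals = 0

    # Signal 1: the reviewer follows or is friends with the author on a
    # linked social platform (the kind of data the Times says Facebook shared).
    if author["social_id"] in reviewer["social_connections"]:
        signals += 1

    # Signal 2: overlapping contact information (emails, phone numbers).
    if reviewer["contacts"] & author["contacts"]:
        signals += 1

    # Signal 3: overlapping physical addresses on file.
    if reviewer["addresses"] & author["addresses"]:
        signals += 1

    # In this sketch a single weak signal is enough to block the review,
    # which would explain how merely following an author could trigger it.
    return signals >= 1


reviewer = {
    "social_connections": {"author_123"},  # followed the author on Facebook
    "contacts": set(),
    "addresses": set(),
}
author = {
    "social_id": "author_123",
    "contacts": set(),
    "addresses": set(),
}

if knows_author(reviewer, author):
    print("You're not eligible to review this product.")
```

If the matching works anything like this, a one-way social-media follow would be indistinguishable, from the algorithm’s point of view, from actually knowing the author.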
Amazon declined to tell the New York Times about its data-sharing deal with Facebook but “said it used the information appropriately.” I asked Amazon how it was using the data obtained from Facebook, and whether it used it to make connections like the one described by Santiago. The answer was underwhelming.
“Amazon uses APIs provided by Facebook in order to enable Facebook experiences for our products,” said an Amazon spokesperson in a statement that didn’t quite answer the question. “For example, giving customers the option to sync Facebook contacts on an Amazon Tablet. We use information only in accordance with our privacy policy.”
Amazon declined our request to comment further.
Why was Facebook giving out this data about its users to other tech giants? The Times report is frustratingly vague, but it says Facebook “got more users” by partnering with the companies (though it’s unclear how) and that it got data in return, specifically data that helped power its People You May Know recommendations. Via the Times:
The Times reviewed more than 270 pages of reports generated by the system — records that reflect just a portion of Facebook’s wide-ranging deals. Among the revelations was that Facebook obtained data from multiple partners for a controversial friend-suggestion tool called “People You May Know.”
The feature, introduced in 2008, continues even though some Facebook users have objected to it, unsettled by its knowledge of their real-world relationships. Gizmodo and other news outlets have reported cases of the tool’s recommending friend connections between patients of the same psychiatrist, estranged family members, and a harasser and his victim.
Facebook, in turn, used contact lists from the partners, including Amazon, Yahoo and the Chinese company Huawei — which has been flagged as a security threat by American intelligence officials — to gain deeper insight into people’s relationships and suggest more connections, the records show.
‘You scratch my algorithm’s back. I’ll scratch your algorithm’s back,’ or so the arrangement apparently went.
Back in 2017, I asked Facebook whether it was getting information from “third parties such as data brokers” to help power its creepily accurate friend recommendations. A spokesperson told me by email, “Facebook does not use information from data brokers for People You May Know,” in what now looks like a deliberately evasive answer.
Facebook doesn’t want to tell us how its systems work. Amazon doesn’t want to tell us how its systems work. These companies are data-mining us, sometimes in concert, to make connections that are uncomfortably accurate and, sometimes, assumptions that are simply wrong. They don’t want to tell us how they do it, which suggests they know it’s become too invasive to reveal. Thank god for leakers and lawsuits.