For many years, the vicious circle has swirled: websites solicit false, unverified complaints about alleged fraudsters, sexual predators, deadbeats, and scammers. People defame their enemies. The anonymous posts appear high in Google results for the victims’ names. Then the websites charge the victims thousands of dollars to take the posts down.
This cycle of slander has been lucrative for websites and associated intermediaries – and devastating for victims. Now Google is trying to break the loop.
The company is planning to change its search algorithm to prevent websites operating under domains such as BadGirlReport.date and PredatorsAlert.us from appearing in the list of results when a person’s name is searched.
Google recently rolled out a concept it calls “known victims.” When people report to the company that they have been attacked on sites that charge a fee to remove posts, Google will automatically suppress similar content when their names are searched. “Known victims” also include people whose nude photos have been published online without their consent, allowing them to request that explicit results be suppressed in searches for their names.
The changes — some already made by Google and others planned for the coming months — are a response to recent New York Times articles about how the slander industry thrives with Google’s unwitting help.
“I doubt it will be a perfect solution, certainly not right off the bat. But I think it should have a really significant and positive impact,” said David Graff, Google’s vice president for trust and safety, who oversees the company’s global policy and standards. “We can’t police the web, but we can be responsible citizens.”
This represents a significant change for victims of online slander. Google, which accounts for an estimated 90 percent of global online search, has historically resisted letting human judgment play a role in its search engine, though in recent years it has bowed to mounting pressure to fight the misinformation and abuse appearing at the top of its results.
At first, Google’s founders saw its algorithms as an unbiased reflection of the Internet. The company used an analysis called PageRank, named after co-founder Larry Page, to rank a website by evaluating how many other sites linked to it, as well as the quality of those linking sites.
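The core idea — a page matters if well-regarded pages link to it — can be sketched as a simple iterative computation. This is a minimal illustration of the PageRank concept on a hypothetical three-page link graph, not Google’s actual implementation, which involved far more signals and scale:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Compute PageRank scores by power iteration.

    links maps each page to the list of pages it links to.
    Each page passes a damped share of its own score to the
    pages it links to, so a link from a high-scoring page is
    worth more than one from a low-scoring page.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal scores
    for _ in range(iterations):
        # every page keeps a small baseline score
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical graph: "a" links to "b" and "c", "b" links to "c",
# and "c" links back to "a".
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
scores = pagerank(graph)
```

Here page "c" ends up with the highest score because two pages link to it, while "b" scores lowest with a single inbound link — the quality-weighted popularity contest the paragraph above describes.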
The philosophy was, “We never touch search, no way, no how. If we start touching search results, it’s a one-way ratchet to the curated Internet and we’re no longer neutral,” said Danielle Citron, a law professor at the University of Virginia. A decade ago, Professor Citron pressured Google to stop so-called revenge porn from appearing in searches for a victim’s name. The company initially resisted.
In a 2004 statement, Google laid out its hands-off view of why its search engine was surfacing anti-Semitic websites in response to searches for “Jew.”
“Our search results are generated completely objectively and are independent of the beliefs and preferences of those who work at Google,” the company said in the statement, which it deleted a decade later. “The only sites we omit are those we are legally compelled to remove or those maliciously attempting to manipulate our results.”
Google’s early interventions in its search results were limited to things like web spam, pirated movies and music (as required by copyright law), and financially compromising information such as Social Security numbers. More recently, the company has taken a more active role in cleaning up people’s search results.
The most notable example came in 2014, when European courts established a “right to be forgotten”. EU residents can request that inaccurate and irrelevant information about them be removed from search engines.
Google unsuccessfully opposed the court’s decision. The company said its role was to make existing information accessible and it wanted no part in regulating the content that appeared in search results. Since the rights were established, Google has been forced to remove millions of links from search results containing people’s names.
More pressure for change came after Donald J. Trump was elected president. After the election, one of the top Google search results for “Final Election Vote Count 2016” was a link to an article that incorrectly stated that Mr. Trump, who won the Electoral College, had also won the popular vote.
A few months later, Google announced an initiative to provide “algorithm updates to surface more authoritative content” in an effort to prevent intentionally misleading, false or objectionable information from showing up in search results.
Around that time, Google’s resistance to engineering harassment out of its results was starting to soften.
The Wayback Machine’s collection of Google’s policies on removing items from search results reflects the company’s evolution. At first, Google was willing to banish only nude photos posted online without the subject’s consent. Then it started deleting medical information. Then came fake pornography, followed by sites with “exploitative removal” policies, and then so-called doxxing content, which Google defined as “exposing contact information with the intent of harm.”
According to Google, its removal-request forms get millions of visits each year, but many victims are unaware they exist. That gap has allowed “reputation managers” and others to charge a fee for removing content from people’s results — removals that victims can request for free.
Pandu Nayak, the head of Google’s search quality team, said the company started fighting websites that charged people to remove defamatory content a few years ago, in response to the rise of a thriving industry that surfaced people’s mug shots and then charged for their removal.
Google began ranking such exploitative sites lower in its results, but the change didn’t help people with little other information about them online. Because Google’s algorithm abhors a vacuum, posts accusing them of being drug abusers or pedophiles could still feature prominently in their results.
Slanderous websites have relied on this dynamic. If the posts didn’t hurt people’s reputations, the sites couldn’t charge thousands of dollars to take them down.
Mr. Nayak and Mr. Graff said Google had been unaware of the problem until The Times highlighted it in articles this year. They said the changes to Google’s algorithms and the creation of the “known victims” classification would help solve it. In particular, it will be harder for sites to gain traction on Google through one of their preferred methods: copying and re-posting defamatory material from other sites.
Google has been testing changes recently, with contractors comparing new and old search results side-by-side.
The Times previously compiled a list of 47,000 people who have been written about on slanderous sites. In a search of a handful of people whose results were previously littered with slanderous posts, Google’s changes were already detectable. For some, the posts had disappeared from their first page of results and from their image results. For others, the posts had mostly disappeared — save for one from a newly launched infamous site called CheaterArchives.com.
CheaterArchives.com may illustrate the limits of Google’s new protections. Because it is fairly new, it is unlikely to have generated complaints from victims yet, and such complaints are one way Google identifies slanderous sites. Also, CheaterArchives.com does not explicitly advertise post removal as a paid service, potentially making it harder for victims to get its posts suppressed from their results.
Google executives said the company was not simply motivated by empathy for victims of online slander. Instead, it is part of Google’s long-standing efforts to combat sites that are attempting to appear higher in search engine results than they deserve.
“These sites are, clearly, gaming our systems,” Mr. Graff said.
Still, Google’s move is likely to raise questions about the company’s effective monopoly over what information is easily findable online and what isn’t. That concern, in fact, is why Google has historically been so reluctant to interfere with individual search results.
“You should be able to find anything that’s legal to find,” said Daphne Keller, who was a lawyer at Google from 2004 to 2015, working for part of that time on the search product team, and who now studies platform regulation at Stanford. Google, she said, is “only flexing its muscles and deciding what information should be missing.”
Ms. Keller was not criticizing her former employer so much as lamenting that lawmakers and law enforcement officials have largely ignored the slander industry and its extortionate practices, leaving Google to clean up the mess.
That Google could potentially solve this problem with policy changes and changes to its algorithms is “the reverse of centralization,” said Ms. Citron, the University of Virginia professor, who has argued that technology platforms have more power than governments to fight online abuse.
Professor Citron was impressed by Google’s changes, especially the creation of the “known victims” designation. She said such victims are often attacked in post after post, and the sites multiply the damage by scraping one another’s content.
“I appreciate their efforts,” she said. “Can they do better? Yes they can.”
Aaron Krolik contributed reporting.
Read Original Article at www.nytimes.com