NEW YORK: Google is trying to improve the quality of its search results by directing review teams to flag content that might come across as upsetting or offensive.
With the change, content with racial slurs could now get flagged under a new category called “upsetting-offensive.” So could content that promotes hate or violence against a specific group of people based on gender, race or other criteria.
While flagging something doesn’t directly affect the search results themselves, it’s used to tweak the company’s software so that better content ranks higher. This approach might, for instance, push down content that is inaccurate or has other questionable attributes, thereby giving prominence to trustworthy sources.
The review teams, made up of contractors known as “quality raters,” already comb through websites and other content to flag questionable items such as pornography. Google added “upsetting-offensive” to its latest guidelines for quality raters but declined to comment on the changes, which were reported by the blog Search Engine Land and elsewhere.
The guidelines, which run 160 pages, offer an interesting look at how Google ranks the quality of its search results. For instance, they give examples of “high-quality” pages, such as the home page of a newspaper that has “won seven Pulitzer Prize awards,” and of “low-quality” pages, such as an article that includes “many grammar and punctuation errors.”
The guidelines cite “Holocaust history” as an example search query. A resulting website listing “Top 10 reasons why the holocaust didn’t happen” would get flagged.
The new “upsetting-offensive” flag instructs quality raters to apply it “to all web results that contain upsetting or offensive content from the perspective of users in your locale, even if the result satisfies the user intent.” So even if the results are what the person searched for, such as white supremacist websites, they could still get flagged. But that doesn’t mean the results won’t show up at all when someone searches for them.