Google Releases Project Owl Update to Improve Autocomplete Quality
Google’s autocomplete feature was released in 2004 to streamline the world of search. It uses just the first few letters of a query to surface likely search terms in a matter of moments. Today, the feature is so effective it can feel almost creepy. You start to wonder if your Android phone is listening in as you type. Still, this handy feature is one many users can’t live without.
The reason this feature works is that Google tracks the popularity of search terms. It knows that when someone types a query that starts with “wea”, there’s a strong statistical likelihood that person is searching for today’s weather.
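In spirit, this is prefix matching ranked by how often each query has been searched. Here is a minimal sketch of that idea; the query log and counts below are invented for illustration, and Google’s actual ranking signals are far more complex and not public.

```python
from collections import Counter

# Hypothetical log of past queries and how often each was searched.
query_counts = Counter({
    "weather today": 120_000,
    "weather tomorrow": 45_000,
    "weather radar": 30_000,
    "weak bladder remedies": 2_000,
})

def suggest(prefix: str, limit: int = 3) -> list[str]:
    """Return the most popular stored queries that start with the prefix."""
    matches = [(count, q) for q, count in query_counts.items()
               if q.startswith(prefix.lower())]
    matches.sort(reverse=True)          # most-searched queries first
    return [q for _, q in matches[:limit]]

print(suggest("wea"))
# ['weather today', 'weather tomorrow', 'weather radar']
```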
That level of prediction still relies on a lot of assumptions, but the system provides fairly accurate results. It’s the outliers that cause the problem. What happens when someone starts querying hate speech, or a celebrity’s death, or something anti-Semitic? Those suggestions may then appear in others’ search fields, circulating offensive content. Ever been spoiled on your favorite TV show by a simple search?
Project Owl aims to change that by adding a feedback system. Users will be able to flag a suggestion as offensive or inaccurate, contributing to Google’s knowledge base about it. The hope is that users will help filter out problematic suggestions and lead to a better-quality search experience.
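Conceptually, that feedback loop amounts to collecting reports per suggestion and escalating the ones that accumulate enough flags. The sketch below is an assumption-laden illustration of that idea; the report categories, threshold, and function names are invented here and are not Google’s actual policy or API.

```python
from collections import defaultdict

# Hypothetical report categories, modeled on the options Project Owl describes.
REPORT_CATEGORIES = {"offensive", "inaccurate"}
flag_counts: dict[str, int] = defaultdict(int)

def report_suggestion(suggestion: str, category: str) -> None:
    """Record a user report against an autocomplete suggestion."""
    if category not in REPORT_CATEGORIES:
        raise ValueError(f"Unknown report category: {category}")
    flag_counts[suggestion] += 1

def should_review(suggestion: str, threshold: int = 50) -> bool:
    """Flag a suggestion for human review once enough reports accumulate.

    The threshold is arbitrary; any real system would weigh reports against
    abuse signals rather than trusting raw counts.
    """
    return flag_counts[suggestion] >= threshold
```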
Giving users more control over policing the system can lead to some abuse, such as reporting search terms that aren’t actually offensive (like the name of a rival business).
Bio: Reputation Stars offers reputation management services to clients looking to recover from negative postings online.