Google has updated its core algorithm that controls the answers we get to queries on its search engine in a bid to make them work better for longer, more complex questions.
The update, code-named Hummingbird, is the biggest change to the underpinnings of the world’s leading search engine since early 2010, when Google upgraded its algorithm to one it called Caffeine. Google made the change about a month ago, it announced at a press event in the garage of the Menlo Park (Calif.) house where Google started. The event also celebrated the 15th anniversary of Google’s founding, which is tomorrow.
Most people won’t notice an overt difference in search results. But with more people making more complex queries, especially as they can increasingly speak their searches into their smartphones, there’s a need for new mathematical formulas to handle them.
This update to the algorithm focuses more on ranking sites for better relevance by tapping further into the company’s Knowledge Graph, its encyclopedia of 570 million concepts and relationships among them, according to Amit Singhal, Google’s senior VP of search. (For example, there’s a Knowledge Graph “card,” or information box, for the Eiffel Tower, and Knowledge Graph knows it’s a tower, that it has a height, that it’s in Paris, etc., so Google can anticipate you might want to know some of those facts.) Caffeine was more focused on better indexing and crawling of sites to speed results.
After the event, Scott Huffman, a key engineering director at Google currently working on natural language, told me that part of the impetus for the change was that as more people speak searches into phones, they’re doing so in a more natural way than when they type queries, which is to say, in a more complicated way. So Google’s search formulas needed to be able to respond to them.
Partly that is through even greater use of the Knowledge Graph, so obvious discrete terms can be identified quickly. But it’s also interesting that although queries are getting more complex, that doesn’t always mean it’s harder to find the right answers. The more terms people use, Huffman says, the more context Google can divine. So those extra words, even if they’re in a more complex query, can give Google better information, but only if the algorithms are adjusted to recognize the relationships among those terms.
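The idea that extra query terms narrow an answer rather than muddy it can be sketched with a toy example. This is purely illustrative and not Google's actual system: the miniature "knowledge graph," the entity names, and the matching logic below are all hypothetical stand-ins for the real Knowledge Graph.

```python
# Toy illustration (NOT Google's actual system): a tiny hypothetical
# "knowledge graph" mapping entities to attributes. Extra words in a
# query, beyond the entity name itself, supply the context needed to
# pick the attribute the user is asking about.
KNOWLEDGE_GRAPH = {
    "eiffel tower": {"type": "tower", "city": "paris", "height_m": 330},
    "big ben": {"type": "clock tower", "city": "london", "height_m": 96},
}

def answer(query: str):
    """Find an entity mentioned in the query, then use the leftover
    terms to select one of that entity's attributes."""
    q = query.lower()
    for entity, attrs in KNOWLEDGE_GRAPH.items():
        if entity in q:
            # Words beyond the entity name carry the user's intent.
            leftover = q.replace(entity, "")
            for attr, value in attrs.items():
                # Match an attribute keyword (e.g. "height", "city")
                # against the remaining query words.
                if attr.split("_")[0] in leftover:
                    return f"{entity}: {attr} = {value}"
            # No attribute matched: return the whole entity card.
            return f"{entity}: {attrs}"
    return None

print(answer("what is the height of the eiffel tower"))
```

The longer query here is easier, not harder, to answer: "eiffel tower" pins down the entity, and "height" among the remaining words pins down which fact to return, which is the relationship-among-terms point Huffman is making.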
Ultimately, he says, “we want to get to a natural conversation” between people and Google search on whatever devices they’re using. …