Meet BERT, Google’s New AI Tool for Processing Natural Language
Published: November 4, 2019
Author: Katherine Simon
Last week Google swiftly rolled out one of the biggest updates to their search algorithm in years, and its name is BERT.
According to Pandu Nayak, a Google Fellow and VP of Search, BERT (Bidirectional Encoder Representations from Transformers) is a neural network-based technique for natural language processing. In short, this advancement in Google's software will allow it to better understand our searches and provide users with more relevant results and featured snippets than ever before.
Search queries can be clunky, long, or even conversational, because users might not be exactly sure what they are looking for. To handle this, Google leverages something called transformers (the "T" in BERT) to process each word in relation to all the other words in a query, rather than looking at each word individually. This way, the context of the query is not lost on Google, and the search engine gains a better understanding of the searcher's intent.
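To make that idea concrete, here is a minimal sketch of the self-attention mechanism at the heart of a transformer. This is not Google's implementation; the four-word query, the tiny made-up embedding vectors, and the single attention layer are all illustrative assumptions. The point is simply that each word's output vector is a weighted blend of every word in the query, so no word is processed in isolation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: rows become probability distributions.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(embeddings):
    """Scaled dot-product self-attention over one query.

    Each output row is a weighted average of ALL the input word
    vectors, so every word's representation depends on its context.
    """
    d = embeddings.shape[-1]
    scores = embeddings @ embeddings.T / np.sqrt(d)  # word-to-word affinities
    weights = softmax(scores, axis=-1)               # each row sums to 1
    return weights @ embeddings                      # context-mixed vectors

# Toy 4-word query with made-up 3-dimensional embeddings.
query = np.array([
    [1.0, 0.0, 0.0],   # "brazil"
    [0.0, 1.0, 0.0],   # "traveler"
    [0.9, 0.1, 0.0],   # "to" (made similar to "brazil" for illustration)
    [0.0, 0.0, 1.0],   # "usa"
])

contextual = self_attention(query)
# The row for "to" now blends information from its neighbors,
# rather than standing for "to" in isolation.
print(contextual.shape)  # (4, 3)
```

Real BERT stacks many such attention layers (with learned projection matrices and far larger vocabularies), but the context-mixing behavior shown here is the core idea behind why a small function word like "to" can now shift a query's meaning.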
One of the examples Nayak noted in his blog was the search "2019 brazil traveler to usa need a visa." In the past, Google wouldn't give much weight to the word "to" in the query and would surface results about U.S. citizens traveling to Brazil instead. With BERT, Google can decipher the impact "to" has on the query and provide the user with a much more relevant page.
So, how will this update affect search?
Google reports that BERT will impact roughly 1 in 10 searches, changing which results rank for those specific queries; 10% of queries is a significant change!
That being said, is BERT something we as advertisers can optimize for? Not really. The BERT update, like the many other tools Google uses to rank search results, was created to better understand natural language. If anything, the more in-depth knowledge Google can gain about a searcher's intent, the more the platform can help advertisers drive relevant traffic. Responding to publishers' immediate questions about how to prepare, Google's Danny Sullivan said that there is nothing to optimize for with BERT, and that the fundamentals of rewarding great content remain unchanged.
Language understanding within Google and all search engines is an ongoing process, but tools like BERT are helping machines take giant leaps forward in the world of Search. Computers might not be able to pick up on conversational language as easily as humans (yet), and as creepy as AI and machine learning can be, having tools in place that better grasp the subtle nuances of human speech will make our lives, as both advertisers and active searchers, much easier.
Over the next few weeks, continue business as usual with your accounts, but be aware that you might start seeing the effect in your own searches. For now, we do recommend keeping a close eye on the volume being driven to your site(s), and the contents of your Search Query Reports, to be mindful of how your queries are mapping.
BERT might not be something we can necessarily optimize toward, but it is something we should definitely be aware of, as Google has made it clear that this could be one of the greatest breakthroughs in machine language understanding to date.