How Does BERT Help Google Understand Language?

Bidirectional Encoder Representations from Transformers (BERT) launched in 2019 and marked a big step forward in Search's understanding of natural language.
Marie Aquino
February 18, 2022

A few weeks ago, Google released details on how it uses artificial intelligence to power search results. Now, it has released a video that better explains how BERT, one of its artificial intelligence systems, helps Search understand language.

Context, tone, and intention, while obvious to humans, are very difficult for computers to pick up on. To provide relevant search results, Google needs to understand language.

It doesn’t just need to know the definitions of individual terms; it needs to know what the words mean when they are strung together in a specific order. It also needs to account for small words such as “for” and “to”. Every word matters. Writing a computer program that can understand all of this is quite tough.

Bidirectional Encoder Representations from Transformers, better known as BERT, was launched in Search in 2019 and was a big step forward in understanding natural language, in particular how combinations of words can express different meanings and intents.
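To get a feel for what “bidirectional” means in practice, here is a minimal sketch using the open-source bert-base-uncased checkpoint from the Hugging Face transformers library (an illustration only, not Google’s production ranking system): the model fills in a masked word by reading the context on both sides of the blank, so changing the words after the blank typically changes its guess.

```python
# Illustrative sketch with the public bert-base-uncased model (not Google's
# production system). BERT predicts a masked word from context on BOTH sides
# of the blank, which is what "bidirectional" refers to.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# Same words before the blank, different words after it.
# Because BERT also reads the right-hand context, its top guess
# will usually differ between the two sentences.
for sentence in [
    "The [MASK] was delayed because of heavy snow on the runway.",
    "The [MASK] was delayed because the judge was unavailable.",
]:
    top = fill(sentence)[0]
    print(f"{sentence!r} -> {top['token_str']} (score {top['score']:.2f})")
```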

Prior to BERT, Search processed a query by pulling out the words it thought were most important, while words such as “for” or “to” were essentially ignored. As a result, the results sometimes weren’t a good match for what the searcher was looking for.

With the introduction of BERT, the little words are taken into account when working out what the searcher is looking for. BERT isn’t foolproof, though; it is a machine, after all. Still, since it was implemented in 2019, it has helped improve a lot of searches.
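As a rough illustration of why those little words matter, the sketch below (again using the public bert-base-uncased model rather than anything Google runs in production) embeds two queries that differ only in “to” versus “from”, borrowing the wording from Google’s own 2019 example about a Brazilian traveler to the USA. The whole-query representations come out measurably different, which is exactly the kind of distinction that keyword-style matching would have missed.

```python
# Illustrative sketch with the public bert-base-uncased checkpoint.
# Swapping a single small word ("to" vs. "from") shifts the contextual
# embedding of the whole query, so the two queries are no longer identical
# to the model.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(query: str) -> torch.Tensor:
    # Mean-pool the contextual token vectors into a single query vector.
    inputs = tokenizer(query, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # shape: (1, tokens, 768)
    return hidden.mean(dim=1).squeeze(0)

a = embed("brazil traveler to usa need a visa")
b = embed("brazil traveler from usa need a visa")

# High similarity, but below 1.0: the swapped preposition changes how every
# surrounding word is represented.
print(torch.cosine_similarity(a, b, dim=0).item())
```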

Check out the explainer video here:

Wondering what other AI systems power Search? Check out our article on it.