Google’s mission is to help people find the information they need by surfacing the most relevant and reliable results. But what happens in moments of personal crisis? How do advanced artificial intelligence (AI) systems provide safe search in critical situations?
“We know that people turn to Search to find trusted information in the moments that matter most. Today, if you search for information on suicide, sexual assault, substance abuse or domestic violence, you will see contact details for the relevant national hotlines along with the most important and highest-quality results available.”
But queries do not always make a crisis explicit. When the terms a person uses do not clearly signal that they need help, the company turns to its latest AI model.
“MUM is able to better understand the intent behind queries and detect when a person is in need. This helps us more reliably surface trustworthy and actionable information at the right time. We’ll start using MUM to make these improvements in the coming weeks.”
In the first half of 2021, the firm unveiled its Multitask Unified Model, or MUM, as an AI milestone for understanding information. The company added that MUM is multimodal, so it understands information across text and images “and, in the future, it can be expanded to more modalities such as video and audio.”
Google asserts that the model can transfer knowledge across the 75 languages it is trained on, which makes it possible to scale safety protections around the world more efficiently. In other words, once MUM is trained to perform a task, such as classifying the nature of a query, it learns to do it in all the languages it knows.
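The idea behind this cross-lingual transfer can be sketched in a toy example. The following is not Google’s MUM; it is a minimal illustration, assuming a shared multilingual embedding space (here faked with hand-made vectors, so words meaning the same thing in different languages map to the same point). A classifier “trained” only on English centroids then labels Spanish and French queries too.

```python
# Toy illustration (NOT Google's MUM): if all languages share one
# embedding space, a classifier built from English examples alone can
# label queries in other languages. Embeddings are hand-made stand-ins.
import math

# Hypothetical shared multilingual embedding: translations of the same
# word get the same vector.
EMBEDDINGS = {
    "help":   (1.0, 0.0), "ayuda":  (1.0, 0.0),   # Spanish
    "crisis": (0.9, 0.1), "crise":  (0.9, 0.1),   # French
    "recipe": (0.0, 1.0), "receta": (0.0, 1.0),   # Spanish
    "pasta":  (0.1, 0.9), "pâtes":  (0.1, 0.9),   # French
}

def embed(query):
    """Average the vectors of the known words in the query."""
    vecs = [EMBEDDINGS[w] for w in query.lower().split() if w in EMBEDDINGS]
    n = max(len(vecs), 1)
    return tuple(sum(v[i] for v in vecs) / n for i in range(2))

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a)) or 1.0
    nb = math.sqrt(sum(x * x for x in b)) or 1.0
    return dot / (na * nb)

# "Train" on English only: one centroid per query class.
CENTROIDS = {
    "crisis":  embed("help crisis"),
    "neutral": embed("pasta recipe"),
}

def classify(query):
    return max(CENTROIDS, key=lambda c: cosine(embed(query), CENTROIDS[c]))

print(classify("ayuda crise"))    # Spanish/French query -> "crisis"
print(classify("receta pâtes"))   # -> "neutral"
```

Because the classes live in the embedding space rather than in any one vocabulary, nothing language-specific needs retraining, which is the efficiency Google describes.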
Away from shocking content
According to Google, keeping people safe also means steering them away from content they aren’t looking for, which is a challenge because content creators sometimes use “friendly” terms to label explicit or suggestive content.
Google’s answer to that is SafeSearch, which offers the option of filtering explicit results and is on by default for Google accounts of users under 18.
“Even when users choose to have SafeSearch turned off, our systems reduce risque and unwanted results for searches that appear to have no intention of finding them. In fact, our security algorithms improve hundreds of millions of searches worldwide across web, image, and video searches every day.”
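That two-tier behavior can be sketched as follows. This is a hypothetical sketch, not Google’s ranking code: the intent signal here is a naive keyword list, and the policy is reduced to “filter when SafeSearch is on, demote when it is off but the query shows no explicit intent.”

```python
# Hypothetical sketch (not Google's actual system): explicit results are
# filtered outright under SafeSearch, and merely demoted when SafeSearch
# is off but the query itself shows no sign of explicit intent.
EXPLICIT_INTENT_TERMS = {"nsfw", "explicit"}  # stand-in intent signal

def query_has_explicit_intent(query):
    return any(t in query.lower().split() for t in EXPLICIT_INTENT_TERMS)

def rank_results(query, results, safesearch_on):
    """results: list of (title, is_explicit) pairs, best first."""
    if safesearch_on:
        # SafeSearch on: drop explicit results entirely.
        return [r for r in results if not r[1]]
    if not query_has_explicit_intent(query):
        # SafeSearch off, but no explicit intent in the query:
        # demote rather than remove. sorted() is stable, so the
        # relative order within each group is preserved.
        return sorted(results, key=lambda r: r[1])
    return results

results = [("Explicit clip", True), ("Cooking basics", False)]
print(rank_results("cooking video", results, safesearch_on=False))
# -> [('Cooking basics', False), ('Explicit clip', True)]
```

The design point mirrors the quote: the user’s setting is respected, but the query’s apparent intent still shapes what surfaces first.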
BERT, another of Google’s advanced AI technologies, is also used to reduce unexpected results by better understanding what each user is actually asking for.
“BERT models can consider the full context of a word by looking at the terms that come before and after it, which is useful for understanding the intent behind search queries,” Mountain View said at the time.
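The intuition of reading a word in light of the terms on both sides of it can be shown with a toy example. This is not BERT; it is a minimal sketch assuming a hand-made table of word senses, just to show why both the left and right context matter for an ambiguous query term.

```python
# Toy illustration of bidirectional context (NOT BERT): an ambiguous
# word is read using the terms on BOTH sides of it.
SENSES = {
    "bank": {
        "finance": {"account", "loan", "open"},
        "river":   {"river", "fishing", "muddy"},
    },
}

def disambiguate(words, i):
    """Pick a sense for words[i] from its left AND right neighbors."""
    context = set(words[:i] + words[i + 1:])   # both directions
    senses = SENSES.get(words[i], {})
    return max(senses, key=lambda s: len(senses[s] & context), default=None)

print(disambiguate(["open", "bank", "account"], 1))  # -> "finance"
print(disambiguate(["muddy", "river", "bank"], 2))   # -> "river"
```

A left-to-right model seeing only “open bank …” could not use “account”; looking in both directions is what resolves the word, which is the property the quote attributes to BERT.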
Now, this development “has improved our understanding” of whether searches are actually intended to find explicit content.
Google adds that the BERT improvement reduced the presence of unexpected results by 30 percent in the last year. “It has had a particular impact in reducing explicit content for searches related to ethnicity, sexual orientation, and gender, which can disproportionately affect women, especially women of color.”