Google plans to use artificial intelligence in more ways to make Search safer. In the coming weeks, it will roll out updates to its AI model, MUM, that should help it detect a wider variety of personal crisis searches, including those about sexual assault, substance abuse, domestic violence and suicide.
The company says people search for information about these topics in a broad range of ways. By employing MUM's machine learning capabilities, Google says it can better understand the intent behind queries to recognize when someone is in need. As such, it'll be able to provide them with more actionable, reliable information at the appropriate time.
With the help of local partners, the company plans to use MUM to improve how it handles personal crisis searches in other countries in the coming months, since the model can transfer knowledge across the 75 languages it was trained on. Google says it will also harness MUM in other ways, including improving spam protections and enhancing safety measures in countries where it doesn't have much training data.
Other companies have been making use of multimodal AI systems similar to MUM. Meta, for instance, says it has been using AI to tackle hate speech and misinformation across its platforms in recent years. Its AI models can also acquire knowledge by analyzing videos and apply that information in new products. Meanwhile, China's Wu Dao seems to be the Swiss Army knife of AI models: it can write essays, poems and couplets in traditional Chinese, analyze images to generate alt text, create almost-photorealistic images from written descriptions and much more.
In the US, the National Suicide Prevention Lifeline is 1-800-273-8255. Crisis Text Line can be reached by texting HOME to 741741 (US), 686868 (Canada), or 85258 (UK). Wikipedia maintains a list of crisis lines for people outside of those countries.