Natural language understanding has long been a tough nut to crack, but thanks to Google’s continued investment in AI, it has reached a whole new level. At I/O 2021, Google introduced MUM, the Multitask Unified Model. According to Google, this new language model is 1,000 times more powerful than BERT, which it brought to Search in 2019. MUM is coming to Google products sometime in the future.
What is MUM: Multitask Unified Model?
MUM is a language model built on the same transformer architecture as BERT, which made waves back in 2019. BERT is a powerful language model that proved to be a breakthrough on release. MUM, however, is upping the ante: according to Google, it’s supposed to be 1,000 times more powerful than BERT.
A great deal of that power comes from the fact that it can multitask. Instead of working through one task after another, it handles multiple tasks simultaneously. This means it can read text, understand the meaning, form deep knowledge about the topic, use video and audio to reinforce and enrich that knowledge, get insights from more than 75 languages, and translate those findings into multi-layered content that answers complex questions. All at once!
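MUM itself isn’t publicly available, but a minimal sketch with T5, a publicly released Google text-to-text transformer, gives a feel for what “multitask” means in practice: the same model switches between tasks based purely on a text prefix in the prompt. The model name and prompts below are illustrative assumptions, not anything Google has said about MUM.

```python
# Illustrative sketch only: t5-small is a small, publicly available text-to-text
# Transformer from Google, not MUM. It was trained on several tasks at once, so the
# same model switches tasks based on a simple text prefix.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

prompts = [
    "translate English to German: What should I pack for a fall hike up Mt. Fuji?",
    "summarize: Mt. Fuji's official climbing season runs from early July to early "
    "September; outside that window, trails may be closed and the weather turns quickly.",
]

for prompt in prompts:
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=60)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The point is the single text-in, text-out interface across tasks, not the quality of this small model’s answers.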
An idea of the power of Google MUM
At I/O 2021, Google’s Prabhakar Raghavan gave an insight into how that would work. He used the complex query “I’ve hiked Mt. Adams and now want to hike Mt. Fuji next fall, what should I do differently to prepare?” to demonstrate what MUM could do. In a regular search session, you’d have to search for all the different aspects yourself, and once you had everything, you’d still need to combine it to answer the question.
Now, MUM would combine insights from many different sources on many different aspects of the search, from measuring the mountains to suggesting a raincoat because it’s the rainy season on Mt. Fuji to extracting information from Japanese sources. After all, there’s a lot more written about this specific topic in that language.
In complex queries like this, it all comes down to combining entities, sentiments, and intent to figure out what something means. Machines have difficulty understanding human language, and language models like BERT and MUM get remarkably close to doing just that.
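To make that concrete, here’s a small, hedged sketch using off-the-shelf Hugging Face pipelines (nothing to do with Google’s internal systems) that pulls named entities and a rough sentiment signal out of the example query; these are the kinds of building blocks a system combines to work out intent. The default models the pipelines download are chosen purely for illustration.

```python
# Illustrative only: off-the-shelf pipelines extract entities and sentiment from the
# example hiking query. These are building blocks, not Google's actual search stack.
from transformers import pipeline

query = ("I've hiked Mt. Adams and now want to hike Mt. Fuji next fall, "
         "what should I do differently to prepare?")

# Named entities: the two mountains the searcher is comparing.
ner = pipeline("ner", aggregation_strategy="simple")
for entity in ner(query):
    print(entity["word"], entity["entity_group"], round(float(entity["score"]), 2))

# Sentiment: a rough proxy for how the searcher feels about the task ahead.
sentiment = pipeline("sentiment-analysis")
print(sentiment(query))
```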
MUM takes it a step further: because it is multi-modal, it doesn’t just process language but also video and images. That makes it possible to generate a rich result that answers the query by presenting a whole new piece of content. MUM will even be incorporated into Google Lens, so you can point your camera at your hiking boots and ask whether they’re suited for that hike up Mt. Fuji!
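We can’t try MUM’s multi-modal abilities, but a rough sketch with CLIP, a publicly available image-and-text model, shows the underlying idea: scoring how well an image matches different text descriptions. The model name and the boots.jpg path are assumptions for illustration.

```python
# A rough sketch with CLIP, a public image-and-text model (not MUM): it scores how
# well a photo matches different text descriptions, the core idea behind asking a
# question about what your camera sees.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("boots.jpg")  # placeholder path: a photo of your own hiking boots
texts = [
    "sturdy boots suitable for a long mountain hike",
    "lightweight sneakers for a walk on the beach",
]

inputs = processor(text=texts, images=image, return_tensors="pt", padding=True)
probs = model(**inputs).logits_per_image.softmax(dim=1)
for text, prob in zip(texts, probs[0].tolist()):
    print(f"{prob:.2f}  {text}")
```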
Of course, the end goal of all of this is to help you get more information with fewer search queries, most likely within the boundaries of Google itself. We’ve seen a steady increase in rich results and quick answers, which also get more visual and more prominent by the day. Many other developments, both inside and outside of search, paint the picture of a Google looking to provide most of the answers to your questions itself.
On the road to fully AI-powered, conversational and visual search
Google is quietly — no, scratch that — overtly moving to a fully AI-powered search engine. Search engine isn’t even the right term here, as it is becoming more of a knowledge presentation machine. And it’s not happening within the vacuum of that famous search bar.
Increasingly, Google is opening up the idea of search to include input from loads of other sources: microphones, cameras, TVs, wearables, smart speakers, what have you (they bought Fitbit, remember?). To serve all these different devices in a way that makes sense on them, search and the way results are presented have to change. A microphone on your fitness tracker has to hear and understand your query, while the assistant has to do something with it and reply with something useful.
Language understanding is key. The development of super-powerful, efficient, and flexible language models that can generate content to provide those answers succinctly and naturally will become essential.
At I/O 2021, we saw another example of this: LaMDA.
LaMDA: Language Model for Dialogue Applications
Another big AI head-turner in Google’s I/O 2021 keynote was LaMDA, or Language Model for Dialogue Applications. This is a new technology for communicating with an AI — like a chatbot — much more naturally. It can converse in a more free-flowing way than previous AIs could, as these often follow a simple path from A to B. Chatbots easily get confused when you switch topics, for instance.
LaMDA is setting out to fix this. The model can acquire a great deal of knowledge about a topic and engage in a fully formed, two-way dialogue, even when the conversation ventures outside the original topic. Google showed a demo in which a LaMDA model trained on knowledge about the planet Pluto discussed it with one of the researchers. It’s not perfect, but it gives a good idea of the kind of future we can expect.
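LaMDA isn’t available to try, but a rough sketch with DialoGPT, an openly available conversational model, shows the basic mechanic of two-way dialogue: every reply is generated with the whole conversation so far as context, so the bot can follow along as the topic drifts. The model name and the example turns are assumptions for illustration, not anything from Google’s demo.

```python
# A rough sketch with DialoGPT, an openly available conversational model (not LaMDA,
# which isn't public): it keeps the running dialogue in its context so each reply
# can build on earlier turns.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-medium")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-medium")

chat_history = None
for user_text in ["Tell me something about Pluto.", "Why isn't it a planet anymore?"]:
    new_ids = tokenizer.encode(user_text + tokenizer.eos_token, return_tensors="pt")
    # Append the new turn to everything said so far, so the model sees the whole conversation.
    input_ids = new_ids if chat_history is None else torch.cat([chat_history, new_ids], dim=-1)
    chat_history = model.generate(input_ids, max_length=200, pad_token_id=tokenizer.eos_token_id)
    reply = tokenizer.decode(chat_history[:, input_ids.shape[-1]:][0], skip_special_tokens=True)
    print("Bot:", reply)
```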
Questions are bound to pop up
All of these developments raise questions, of course. For instance, if Google can really read, hear and see content in all languages and repackage that in a new format — complete with context and content generated by the AI — who’s the owner of this? And who’s responsible for what’s in those automated results? Is this another nail in the coffin for content producers?
And what about bias in AI? Bias and ethics are huge topics in AI, and if we’re truly taking steps towards an AI-powered future, we need to be assured of its neutrality and trustworthiness. Of course, Google specifically mentions AI bias in their post and is still training the model. It’s good to know, but I wonder how far this goes — and who watches the watchers?
Google puts MUM to work
Google is testing MUM right now and will continue to do so until it feels it is safe to add to its systems. There’s no clear timeframe for when that will be, but it didn’t take that long in the case of BERT.
The introduction of MUM might simply mean better search results, but it might also mean a new type of search result: one cobbled together from various sources and reformatted into something completely new. This could impact how you think about content. It may become less important to answer audience questions and solve their problems, as the system might already do that for you. Instead, you might be better off continuing to improve your product and focusing on building brand preference. People need to encounter your brand to recall it later on. Think about ways to stand out and find ways to turn that traffic into customers for life.
Google is clearly betting on a future powered by AI, one that provides us with as-yet-unimaginable ways of finding and consuming content. It’s a future where we have to do less of the dirty work and get to enjoy more of the results. Yes, we’re still on track for the future envisioned in WALL-E — for better or worse.
A smarter analysis in Yoast SEO Premium
Yoast SEO Premium has a smart content analysis that helps you take your content to the next level!