Summarized from TowardsDataScience.com: Multitask Unified Model (MUM) is a recent improvement to Google’s search engine. Like other popular state-of-the-art language models such as GPT-3 or LaMDA, MUM is based on the transformer architecture. BERT, MUM’s predecessor, is similar in this regard; the main difference is that MUM is reportedly 1,000 times more powerful. ... MUM can combine information from images and text (and in the future, Google plans to include audio and video).