Over the past couple of years, much of the discussion around Meta’s plans for the future has centred on the metaverse, a hypothetical iteration of the Internet as an immersive, collaborative virtual world that users access via VR and AR headsets.
Prior to the launch of ChatGPT, the metaverse was the talk of the town and dominated headlines. Understandably so. After all, it was billed as a virtual world so immersive and engaging that most of us would want to spend large chunks of our lives in it.
However, over the past 12 months, talk has died down and rumours have swirled that Mark Zuckerberg’s vision for the metaverse is in trouble. It’s easy to see why, too. After all, Reality Labs, the virtual and augmented reality branch of Meta, has lost a staggering $21 billion since last year!
Although these losses partly reflect a long-term investment, there’s still very little evidence that the company’s punt will pay off. Meta itself has even revealed that only 300,000 people use its Horizon Worlds game.
So, it was unsurprising that, at this year’s Meta Connect conference, Mark Zuckerberg barely mentioned the metaverse at all, focusing almost entirely on how the company’s AI offerings can help advance its products.
So, let’s take a detailed look at exactly what the company is doing in the world of AI and dissect what this may mean for the metaverse in the longer term.
Meta launches new AI experiences across apps and devices
Meta says that, over the last decade, it has released more than 1,000 AI models, libraries and data sets for researchers – including the latest version of its large language model, Llama 2, which is available in partnership with Microsoft.
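For readers who want to experiment, the Llama 2 chat models can be downloaded (after accepting Meta’s licence) and run through the Hugging Face transformers library. The snippet below is a minimal sketch of that route, not Meta’s own tooling; the 7B checkpoint name and generation settings are simply illustrative choices.

```python
# Minimal sketch: generating text with a Llama 2 chat model via Hugging Face
# transformers. Assumes you have accepted Meta's licence for the
# "meta-llama/Llama-2-7b-chat-hf" checkpoint, and have the accelerate
# package installed plus a GPU with enough memory.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Llama 2 chat models expect prompts wrapped in [INST] ... [/INST] tags.
prompt = "[INST] Suggest three sticker ideas for a birthday group chat. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sample a short completion.
outputs = model.generate(**inputs, max_new_tokens=120, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```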
At the latest Meta Connect conference, the company introduced customers to a raft of new AI experiences and features. These, the company believes, will give users the tools to be more creative, expressive and productive.
So, what exactly has the company announced?
- AI stickers
Firstly, Meta has announced that it is starting to roll out AI stickers across its apps.
The company claims that billions of stickers are sent across its platforms each month. It believes that these new AI stickers will provide users with another fun and creative way to communicate and express themselves. This is because the feature will enable users to effortlessly generate customised stickers for chats and stories.
Using technology from Llama 2 and Emu (the company’s foundational model for image generation), Meta’s AI tool will turn text prompts into multiple unique, high-quality stickers in seconds.
This new feature, which is rolling out to select English-language users over the next month in WhatsApp, Messenger, Instagram and Facebook Stories, provides users with infinitely more options to convey how they’re feeling at any moment.
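Emu itself is not publicly available, so Meta’s sticker pipeline can’t be reproduced exactly. As a rough illustration of the underlying text-to-image pattern, though, the sketch below uses Stable Diffusion via the open-source diffusers library as a stand-in; the checkpoint, prompt and sticker-style phrasing are assumptions, not details from Meta.

```python
# Rough illustration of text-to-sticker generation, using Stable Diffusion via
# the diffusers library as a stand-in for Meta's Emu model (which is not
# publicly available). Requires: pip install diffusers transformers accelerate
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# A user-style prompt, nudged towards a flat sticker aesthetic.
prompt = "a happy corgi wearing a party hat, sticker style, bold outline, white background"

# Generate a handful of candidate stickers from the same prompt.
images = pipe(prompt, num_images_per_prompt=4, num_inference_steps=30).images
for i, image in enumerate(images):
    image.save(f"sticker_{i}.png")
```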
- Image editing with AI
Soon, users will also be able to edit images, or even co-create images with others on Instagram, thanks to two new AI editing tools: restyle and backdrop.
Restyle allows users to reimagine their images by applying the visual styles they describe. For example, if a user wants an image to look more like an artistic painting, they can type ‘watercolour’ as the descriptor and the AI will make the image look like a watercolour painting.
As the name suggests, backdrop changes the background of an image. For example, if you’re standing in front of a wall, you can say something like ‘put me in front of the Eiffel Tower’ and the AI will add your chosen background.
Restyle and backdrop use the technology from Emu, while backdrop also leverages learnings from Meta’s Segment Anything Model. Images that are edited using restyle and backdrop will indicate that AI has been used. The company says it is experimenting with forms of visible and invisible markers.
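Unlike Emu, the Segment Anything Model is open source, so the subject-isolation step that a backdrop-style edit relies on can be sketched with Meta’s public segment-anything package. The example below is only an illustration of that step; the checkpoint name, prompt point and compositing logic are assumptions, not Meta’s production pipeline.

```python
# Sketch of the subject-isolation step behind a backdrop-style edit, using
# Meta's open-source Segment Anything Model (pip install segment-anything).
# Checkpoint files are downloaded separately from the segment-anything repo.
import cv2
import numpy as np
from segment_anything import SamPredictor, sam_model_registry

sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
predictor = SamPredictor(sam)

image = cv2.cvtColor(cv2.imread("photo.jpg"), cv2.COLOR_BGR2RGB)
predictor.set_image(image)

# Prompt SAM with a single foreground point (here, the image centre).
h, w, _ = image.shape
point = np.array([[w // 2, h // 2]])
masks, scores, _ = predictor.predict(
    point_coords=point,
    point_labels=np.array([1]),  # 1 marks a foreground point
    multimask_output=True,
)
mask = masks[np.argmax(scores)]  # keep the highest-scoring mask

# Composite the segmented subject onto a new (generated or stock) backdrop.
backdrop = cv2.cvtColor(cv2.imread("eiffel_tower.jpg"), cv2.COLOR_BGR2RGB)
backdrop = cv2.resize(backdrop, (w, h))
result = np.where(mask[..., None], image, backdrop).astype(np.uint8)
cv2.imwrite("backdrop_edit.png", cv2.cvtColor(result, cv2.COLOR_RGB2BGR))
```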
images courtesy of Meta
- An AI assistant that spans all Meta products
Meta has also launched Meta AI, a new assistant that users can talk to much as they would a friend. Available on WhatsApp, Messenger and Instagram, Meta AI will soon also come to the Ray-Ban Meta smart glasses and Quest 3.
Meta AI is powered by a custom model that leverages technology from Llama 2 and the company’s latest large language model (LLM) research. In text-based chats, Meta AI has access to real-time information through the company’s search partnership with Bing. It also offers users a tool for image generation.
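Meta hasn’t published how Meta AI wires Bing results into the model, but the pattern it describes is a standard retrieval-augmented setup: fetch fresh results, then let the language model answer with them as context. The sketch below is purely illustrative; fetch_search_results() and generate_reply() are hypothetical placeholders, not real Meta or Bing APIs.

```python
# Illustrative sketch of the "assistant with live search" pattern Meta AI
# describes. Both helpers below are hypothetical placeholders: wire them up to
# a real search API and a Llama-2-style chat model (see the earlier sketch).

def fetch_search_results(query: str) -> list[str]:
    """Hypothetical helper: return text snippets from a web search API."""
    raise NotImplementedError("Plug in a real search API here.")

def generate_reply(prompt: str) -> str:
    """Hypothetical helper: call a Llama-2-style chat model."""
    raise NotImplementedError("Plug in a real LLM call here.")

def answer_with_live_context(question: str) -> str:
    # 1. Pull in real-time information the base model cannot know about.
    snippets = fetch_search_results(question)
    context = "\n".join(f"- {s}" for s in snippets)

    # 2. Ask the model to answer using that context.
    prompt = (
        "Answer the question using the search results below.\n"
        f"Search results:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
    return generate_reply(prompt)
```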
- AI characters
Alongside Meta AI, the company has also unveiled 28 AIs that have personalities, opinions and interests. These characters can be messaged on WhatsApp, Messenger and Instagram and they are supposed to be ‘a bit more fun to interact with’, according to the company.
These characters are based on celebrities, cultural icons and influencers, including Chris Paul (Perry), Snoop Dogg (Dungeon Master), Tom Brady (Bru), Kendall Jenner (Billie) and Naomi Osaka (Tamika).
For now, the knowledge base of these AIs (barring Bru and Perry) is limited to information that largely existed prior to 2023, which means some responses may be dated. However, the company is aiming to bring search to many more of its AIs in the coming months.
The next phase of AI development
Meta is keen to stress that these developments are just the start and that the company intends to expand its AI offering considerably in the coming years.
As part of this, the company has unveiled some of the steps it will take to further improve the AI experiences it provides. For example, the company has now introduced AI Studio, a platform that supports the creation of its AIs.
Meta plans to make AI Studio available to those outside the company (both coders and non-coders), so they can build their own AIs. As part of this, developers will be able to build third-party AIs for Meta’s messaging services.
Added to this, businesses will also be able to create AIs that reflect their brand’s values and improve customer service experiences. Meta is launching this feature in alpha and plans to scale it further in 2024.
Similarly, creators will also be able to build their own AIs to help extend their virtual presence across Meta’s apps. For safety and accountability, these AIs will have to be sanctioned by the individual creator, who will also have direct control over them.
Finally, Meta also says that it is creating a sandbox that will enable anyone to experiment with creating their own AI. This will be released in the coming year and will eventually be brought to the metaverse.
images courtesy of Meta
Why does this matter?
Meta’s pivot from the metaverse to AI signifies the biggest change in the company’s focus since Facebook became Meta in 2021 and stopped simply being a social media company.
After all, it has only been two years since Mark Zuckerberg announced that Facebook was changing its name to Meta and that he was reorienting the entire company around the metaverse and Web3.
But, standing on the same stage this year, some journalists believe he was actively trying to avoid saying the word ‘metaverse’ entirely. That’s unsurprising. After all, Meta has now lost so much money on the metaverse concept that some of its own investors are beginning to question it.
Added to this, Meta has failed to bring customers along on the metaverse journey. Recently, Zuckerberg has even been mercilessly mocked for the metaverse’s low-res graphics and for hyping seemingly minor features like avatars with legs.
Although the metaverse concept is far from dead (in the last week Zuckerberg has unveiled a new raft of far more lifelike metaverse avatars), it now feels like the project is taking a backseat to AI.
This is understandable. Meta took a gamble on VR and AR technology, but the rise of AI has generated interest from investors and consumers alike. Now, services like OpenAI’s ChatGPT and Snap’s My AI have made the technology both accessible and understandable, and Meta must respond.
Given this, Meta has clearly started to focus much more on improving its current AI offering and on how it can further enhance its existing products. For his part, Zuckerberg rejects this characterisation. Back in April he said: “A narrative has developed that we’re moving away from focusing on the metaverse vision… We’ve been focusing on both AI and the metaverse for years now, and we will continue to focus on both.” That said, most industry experts remain unconvinced.
After all, even if we assume Meta’s AI offerings and the metaverse are linked and that AI is central to the longer-term metaverse concept, it’s undeniable that the way Zuckerberg speaks about the metaverse has also changed over the past couple of years. While he used to speak about socialising and working in VR environments, he now speaks about an AI-centric metaverse and how he believes that AI assistants can make the metaverse more useful.
So, while the creation of the metaverse may still be the long-term focus for Meta, in the short term the company is much more likely to focus on enhancing its AI offering in order to keep both investors and customers happy. The new AI features unveiled this year are incredibly interesting, and they look like they may only be the first steps the company takes.
But, can the company dominate the AI space and then ultimately bring these features to the metaverse? That remains to be seen.
Tom Brook
When he's not crafting content, Tom's obsessed with all things sport, particularly football, cricket, golf and F1.