- Introducing the Meta AI App: A New Way to Access Your AI Assistant
Today, we’re launching the first version of the Meta AI app: the assistant that gets to know your preferences, remembers context and is personalized to you.
The app includes a Discover feed, a place to share and explore how others are using AI.
It’s now the companion app for our AI glasses and is connected to meta.ai, so you can pick up where you left off wherever you are.
We’re launching a new Meta AI app built with Llama 4, a first step toward building a more personal AI. People around the world use Meta AI daily across WhatsApp, Instagram, Facebook and Messenger. And now, people can choose to experience a personal AI designed around voice conversations with Meta AI inside a standalone app. This release is the first version, and we’re excited to get this in people’s hands and gather their feedback.
Meta AI is built to get to know you, so its answers are more helpful. It’s easy to talk to, so it’s more seamless and natural to interact with. It’s more social, so it can show you things from the people and places you care about. And you can use Meta AI’s voice features while multitasking and doing other things on your device, with a visible icon to let you know when the microphone is in use.
- Hey Meta, Let’s Chat
While speaking with AI using your voice isn’t new, we’ve improved our underlying model with Llama 4 to bring you responses that feel more personal and relevant, and more conversational in tone. And the app integrates with other Meta AI features like image generation and editing, which can now all be done through a voice or text conversation with your AI assistant.
We’ve also included a voice demo built with full-duplex speech technology that you can toggle on and off to test. This technology delivers a more natural voice experience trained on conversational dialogue, so the AI generates voice directly instead of reading written responses. It doesn’t have access to the web or real-time information, but we wanted to provide a glimpse into the future by letting people experiment with it. You may encounter technical issues or inconsistencies, so we’ll continue to gather feedback to help us improve the experience over time.
Voice conversations, including the full-duplex demo, are available in the US, Canada, Australia, and New Zealand to start. To learn more about how you can manage your experience on the Meta AI app and toggle between modes, visit our help center.
- Intelligence for You
Meta AI uses Llama 4 to help you solve problems, navigate your daily questions, and better understand the world around you. With the ability to search across the web, it can help you get recommendations, deep dive on a topic, and stay connected with your friends and family. Or if you’re just looking to play around with it, we provide conversation starters to inspire your searches.
We’re using our decades of work personalizing people’s experiences on our platforms to make Meta AI more personal. You can tell Meta AI to remember certain things about you (like that you love to travel and learn new languages), and it can also pick up important details based on context. Your Meta AI assistant also delivers more relevant answers to your questions by drawing on information you’ve already chosen to share on Meta products, like your profile, and content you like or engage with. Personalized responses are available today in the US and Canada. And if you’ve added your Facebook and Instagram accounts to the same Accounts Center, Meta AI can draw from both to provide an even stronger personalized experience for you.
And just like all our platforms, we built Meta AI to connect you with the people and things you care about. The Meta AI app includes a Discover feed, a place to share and explore how others are using AI. You can see the best prompts people are sharing, or remix them to make them your own. And as always, you’re in control: nothing is shared to your feed unless you choose to post it.
- An Assistant for Everyone
You’ll find Meta AI across all our products and all the devices it runs on. So whether you’re catching up with family on Facebook, chatting with friends on WhatsApp or Messenger, scrolling Instagram, or wearing Ray-Ban Meta glasses, it’s easily accessible wherever you need it.
Glasses have emerged as the most exciting new hardware category of the AI era, and Ray-Ban Meta glasses have led the way in defining what’s possible. To integrate all our most powerful AI experiences, we’re merging the new Meta AI app with the Meta View companion app for Ray-Ban Meta glasses, and in some countries you’ll be able to switch from interacting with Meta AI on your glasses to the app. You’ll be able to start a conversation on your glasses, then access it in your history tab from the app or web to pick up where you left off. Conversations also sync bidirectionally between the app and the web, though you can’t start a conversation in the app or on the web and then pick it up on your glasses.
Existing Meta View users can continue to manage their AI glasses from the Meta AI app – once the app updates, all your paired devices, settings and media will automatically transfer over to the new Devices tab.
- From AI Glasses to Desktop
Meta AI on the web is also getting an upgrade. It comes with voice interactions and the new Discover feed, just like you see in the app. This continuity across the Meta AI app, AI glasses and the web helps deliver a more personal AI that can be there wherever you need it.
The web interface has been optimized for larger screens and desktop workflows and includes an improved image generation experience, with more presets and new options for modifying style, mood, lighting and colors. We’re also testing a rich document editor in select countries that you can use to generate documents full of text and images, then export them as PDFs. And we’re testing the ability to import documents for Meta AI to analyze and understand.
- You’re In Control of Your Experience
Voice is the most intuitive way to interact with Meta AI, and the Meta AI app is designed to help you seamlessly start a conversation with the touch of a button – even if you’re multitasking or on the go. If you prefer to have voice on by default, there’s a control in your settings to toggle the Ready to talk feature on.