Key Takeaways
- Meta has upgraded its Llama AI models to version 3.2, introducing multimodal capabilities that allow them to process both text and images.
- The new standout features are celebrity voices for Meta AI interactions, image interpretation and editing capabilities, and improved integration across Meta’s ecosystem.
- Llama 3.2 is available in four sizes: the multimodal 11B and 90B models for complex tasks, and the lightweight 1B and 3B models for faster, on-device use.
Meta has just rolled out an upgrade to its Llama AI model, moving it from version 3.1 to 3.2 and bringing a heap of exciting new capabilities. Now, Llama is multimodal, meaning it can process both text and images, making it more versatile than ever. So, what are the standout features of this latest update?
1. Celebrity Voices for Meta AI
One of the most exciting features arriving alongside Llama 3.2 is the addition of celebrity voices to Meta AI. With this update, you can now use your voice to interact with Meta AI on platforms like WhatsApp, Messenger, Facebook, and Instagram. Even better, it will respond to you aloud, making the experience more personal and engaging.
No matter what you need from Meta AI—answers, explanations, or just some fun—this feature makes everything even more enjoyable. Now you can choose to hear responses from celebrities like the witty Awkwafina, the legendary Dame Judi Dench, the energetic WWE star John Cena, the hilarious Keegan-Michael Key, and the charming Kristen Bell.
2. Visual Feedback and Image Editing Capabilities
Meta’s Llama 3.2 can now “see” and interpret images. We’ve grown used to AIs that excel at handling text—whether answering questions like a chatbot or summarizing long articles—but machine vision opens up entirely new dimensions.
With Llama 3.2 in Meta AI, you can take a picture of a historical landmark during your travels, and the AI can provide detailed information about its history and significance. This is particularly handy for history buffs and adventurous travelers alike.
But this visual feedback doesn’t stop there. The AI can also help you edit your photos by adding new backgrounds or details on demand. So you could ask it to add a sunset to a photo you took at the beach or swap out the background entirely. This feature is similar to what you might find in editing apps like Photoshop or Lightroom, but having it built directly into Meta’s platform makes it much easier to access.
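For developers curious how the same vision capability looks outside the Meta apps, here is a minimal sketch of how a question about a photo could be paired with an image in a chat-style prompt. The message structure follows the format Hugging Face's transformers library uses for Llama 3.2's vision models, but the helper function and question text are illustrative assumptions, and running the real model requires access to the gated meta-llama weights.

```python
# Sketch: building a multimodal prompt for a Llama 3.2 vision model.
# Only the prompt assembly is shown; the model call itself is omitted.

def build_vision_prompt(question: str) -> list:
    """Return a chat-style message list pairing an attached image with a question."""
    return [
        {
            "role": "user",
            "content": [
                {"type": "image"},  # placeholder for the attached photo
                {"type": "text", "text": question},
            ],
        }
    ]

messages = build_vision_prompt("What landmark is this, and what is its history?")

# With the real model, this message list would be rendered via the
# processor's chat template and passed to the model with the image pixels.
```

The image placeholder and the question travel together in a single user turn, which is what lets the model ground its answer in the photo you supply.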
3. Several Versions of Llama 3.2
Llama 3.2 is launching with four different model sizes, each tailored to different needs and use cases.
First up are the 11B and 90B models (that’s “B” for billions of parameters). These are the multimodal heavyweights of the Llama 3.2 family, designed for complex tasks requiring more computational power. Imagine overseeing a construction project and wanting to know the best way to allocate resources based on a dynamic schedule. Llama 3.2 can analyze the timeline, resources, and task dependencies to suggest the most efficient work plan.
Or let’s say you’ve got a comprehensive database of customer feedback. Instead of manually sorting through the comments, you could ask the model to identify patterns in customer satisfaction over time, and it’ll process the data to deliver an instant report.
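As a rough illustration of that feedback-analysis workflow, the sketch below packs scored comments into a single analysis prompt that could be sent to one of the larger models. The prompt wording, the numbered-line format, and the sample comments are all assumptions of mine; only the prompt assembly is shown, not the model call.

```python
# Sketch: packing customer-feedback rows into one analysis prompt
# for a large Llama 3.2 model (e.g. the 90B). The model call is omitted.

def build_feedback_prompt(rows: list) -> str:
    """rows: (comment, satisfaction score out of 5) pairs, oldest first."""
    lines = [
        f"{i + 1}. [{score}/5] {comment}"
        for i, (comment, score) in enumerate(rows)
    ]
    return (
        "Below are customer comments with satisfaction scores, in time order.\n"
        "Identify trends in satisfaction over time and summarize them.\n\n"
        + "\n".join(lines)
    )

prompt = build_feedback_prompt([
    ("Checkout kept timing out", 2),
    ("New search is much faster", 4),
    ("Support resolved my issue same-day", 5),
])
```

Presenting the comments in time order with explicit scores gives the model the structure it needs to reason about trends rather than individual complaints.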
On the other end of the spectrum, we have the 1B and 3B models. These are great for lighter tasks that prioritize speed and privacy; you might think about using these on your phone for everyday personal productivity. For example, you could have a to-do list app that can automatically categorize your tasks, highlight urgent ones, and even set reminders for deadlines. The best part is that all this happens locally on your device, so none of your sensitive information—like emails or calendar events—leaves your phone.
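To make the on-device flow above concrete, here is a small sketch of the app-side logic around a local model. The model call is stubbed out entirely, and the line-based "task | category | urgent?" reply format is an assumed convention, not anything Llama 3.2 itself defines; a real app would prompt the local 1B or 3B model to answer in that shape and then parse the reply like this.

```python
# Sketch: parsing a small local model's labels for a to-do list.
# The reply format "task | category | yes/no" is an assumed convention.

def parse_model_labels(reply: str) -> list:
    """Parse 'task | category | yes/no' lines from the model's reply."""
    tasks = []
    for line in reply.strip().splitlines():
        task, category, urgent = (part.strip() for part in line.split("|"))
        tasks.append({
            "task": task,
            "category": category,
            "urgent": urgent.lower() == "yes",
        })
    return tasks

# A reply the local model might produce for three to-do items:
reply = (
    "File expense report | Finance | yes\n"
    "Book dentist appointment | Personal | no\n"
    "Review pull request | Work | yes"
)
labeled = parse_model_labels(reply)
urgent = [t["task"] for t in labeled if t["urgent"]]
# urgent -> ["File expense report", "Review pull request"]
```

Because both the model and this parsing run on the device, the task text never has to leave your phone, which is exactly the privacy benefit the small models are designed for.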
Meta’s new Llama 3.2 models are now more accessible than ever, available for download from Meta’s official Llama site and from Hugging Face. But what sets this release apart is its integration into Meta’s ecosystem. With billions of people using Facebook, Instagram, WhatsApp, and Messenger daily, an upgraded Llama means many more users will soon experience a more sophisticated and engaging Meta AI.