Meta has integrated its Meta AI assistant, now powered by the new large language model (LLM) Llama 3, into its social networks, including Facebook, Instagram, WhatsApp, and Messenger. The integration lets users look up information on the web without having to switch between applications.
Initially introduced last year, the Meta AI assistant was originally based on the Llama 2 model and allows users to access real-time information and interact using natural language. The technology has been tested with users in India, the United States, and several African countries. With the launch of the Llama 3 model, Meta is rolling the assistant out globally across its platforms, with new features and enhanced capabilities.
The Meta AI assistant offers features such as looking up information without switching between applications, generating images from text, and providing helpful prompts for image editing. Users can ask the assistant for recipes, entertainment recommendations, study help, and more. It can also suggest related queries based on content in users’ social media feeds, offering a personalized and interactive experience.
One of the latest Meta AI features is Imagine, which generates images from text in real time on WhatsApp, can animate the generated images into GIFs, and provides prompts for further editing. It is currently being tested in the United States. Overall, the integration makes information access more convenient and interactive across all of Meta’s social networks.
Users can now access the Meta AI assistant directly from any of the company’s social networks or through a web version available at meta.ai. The chatbot format allows users to ask questions and receive answers in a conversational exchange.