
## Meta’s Llama 4 AI Models Arrive on Azure, Ushering in New Era of Multimodal AI

*Sun Apr 06, 2025 (UTC)*
**REDMOND, WA – [Date of Release]** – Microsoft today announced the arrival of Meta’s highly anticipated Llama 4 family of large language models (LLMs) on its Azure cloud platform. Available through Azure AI Foundry and Azure Databricks, Llama 4 offers developers access to cutting-edge multimodal AI capabilities, seamlessly integrating text and visual data processing.
Two models lead the release: Llama 4 Scout and Llama 4 Maverick. Scout, described as one of the best multimodal models in its class, offers an industry-leading 10-million-token context window, a significant leap over its predecessor, Llama 3. This lets it ingest and summarize very large inputs, making it well suited to tasks such as multi-document summarization and personalized recommendations drawn from extensive user data. Despite its scale, Scout is designed for efficient deployment and fits on a single NVIDIA H100 GPU.
Maverick, a general-purpose LLM with 17 billion active parameters and 400 billion total parameters, excels at both image and text understanding and supports 12 languages. Optimized for high-quality conversational responses, Maverick is positioned as a powerful multilingual AI assistant for a wide range of chat and interactive applications.
Both models incorporate safety and security mitigations throughout their development lifecycle, reducing the risk of adversarial attacks and misuse. This focus on responsible AI development is further strengthened by Azure’s built-in security and safety guardrails.
The availability of Llama 4 on Azure provides developers with unparalleled flexibility and access to advanced AI capabilities. The platform’s integration allows for seamless deployment and utilization of these powerful models, fostering innovation across diverse sectors. Microsoft highlighted the potential for applications ranging from complex problem-solving and creative content generation to real-time insights and dynamic task management. The company invites developers to explore the models and build the next generation of AI applications.
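To make the announcement concrete, here is a minimal sketch of how a developer might call a deployed Llama 4 model through an Azure chat-completions REST endpoint. The endpoint URL, API version, environment-variable names, and the model identifier (`Llama-4-Scout-17B-16E-Instruct`) are illustrative assumptions, not details taken from the announcement; check the Azure AI Foundry documentation for your deployment's actual values.

```python
# Hypothetical sketch: sending a chat-completions request to an
# Azure-hosted Llama 4 deployment using only the standard library.
import json
import os
import urllib.request

# Placeholders: set these to your deployment's endpoint and key.
ENDPOINT = os.environ.get("AZURE_INFERENCE_ENDPOINT", "")
API_KEY = os.environ.get("AZURE_INFERENCE_KEY", "")

# Request body in the common chat-completions shape.
payload = {
    "model": "Llama-4-Scout-17B-16E-Instruct",  # assumed deployment name
    "messages": [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize the key features of Llama 4 Scout."},
    ],
    "max_tokens": 256,
}

if ENDPOINT and API_KEY:
    # API version is an assumption; consult your resource's docs.
    req = urllib.request.Request(
        f"{ENDPOINT}/chat/completions?api-version=2024-05-01-preview",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json", "api-key": API_KEY},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
        print(body["choices"][0]["message"]["content"])
else:
    print("Set AZURE_INFERENCE_ENDPOINT and AZURE_INFERENCE_KEY to send the request.")
```

Microsoft's `azure-ai-inference` SDK offers a higher-level client for the same operation; the raw request above is shown only to illustrate the shape of the call.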