In a strategic move aimed at reshaping the enterprise AI landscape, Amazon Web Services (AWS) and Meta have opened applications for a new startup programme centred on the development and deployment of applications using Meta’s open-weight Llama large language models.
The six-month initiative, which will support 30 early-stage US startups, signals a deepening commitment by both tech giants to accelerate the real-world utility of generative AI. Applications close on 8 August, with selected participants receiving up to $200,000 in AWS promotional credits, hands-on technical support from engineers at both AWS and Meta, and access to a dedicated Discord channel with Meta’s Llama team for real-time collaboration.
Designed for seed to Series B startups, the programme prioritises teams building on Llama to address specific industry needs. It marks a significant shift from headline-grabbing model releases to practical tooling and support for startups navigating the engineering and infrastructure demands of production AI.
From open weights to open outcomes
Launched with an ethos of transparency and accessibility, Meta’s Llama family of models has become a rallying point for developers seeking control over their AI stack. Unlike proprietary models tied to single cloud ecosystems, Llama is designed to be flexible and portable: an appealing proposition for startups wary of vendor lock-in.
“Foundation models like Llama are fuelling a new wave of AI applications that prioritise flexibility, transparency, and developer control,” said Jon Jones, Vice President and Global Head of Startups and Venture Capital at AWS. “This collaboration between AWS and Meta will empower founders to turn bold ideas into transformative products faster.”
Meta, which has positioned Llama as a more democratic alternative to closed models from rivals, emphasised the importance of developer creativity in shaping the future of AI. Ash Jhaveri, Vice President of AI Partnerships at Meta, described the move as “a way to give developers and researchers the flexibility to experiment, adapt and build responsibly.”
Infrastructure meets innovation
Beyond philosophical alignment, the collaboration is underpinned by AWS’s vast infrastructure footprint, including more than 240 services across compute, storage, analytics and machine learning. The initiative will operate through AWS Activate, the company’s startup engagement platform, giving participants not only technical mentorship but also scalable compute and go-to-market support.
By integrating technical guidance with cloud infrastructure incentives, AWS and Meta hope to reduce the barriers many startups face when trying to bring GenAI concepts into production. These include managing cost-intensive model training, fine-tuning for specific domains, and compliance with evolving governance frameworks.
While the initiative is currently limited to US-based startups, it highlights a growing trend toward ecosystem building around open foundation models, with cloud providers and model developers forming partnerships to distribute responsibility for safe, scalable AI deployment.
The first cohort will be announced by 29 August, with expectations high that these teams will push beyond proofs of concept and into applications that span industries from healthcare and finance to manufacturing and education.
As competition intensifies across the GenAI stack, programmes like this one may determine which startups move fastest, and most responsibly, from prototype to product.