As AI reshapes the financial sector, BNP Paribas is focused not on chasing trends but on building platforms that offer control, resilience and long-term value. Mark Venables explores how the bank’s hybrid infrastructure and platform-first strategy are helping it navigate complexity at scale.
Few industries are as tightly regulated, deeply interconnected, or reliant on legacy infrastructure as banking. Fewer still face the same relentless pressure to modernise. For BNP Paribas, a global financial giant operating in 64 countries with over 178,000 employees, artificial intelligence (AI) is not simply another layer of digital transformation. It is a structural capability embedded into the core of how the organisation plans, builds and delivers services. But the real story is not scale. It is control.
From inference to infrastructure
Artificial intelligence at BNP Paribas is not a recent initiative. The company used expert systems more than 40 years ago and has long relied on algorithmic approaches in trading. However, the current wave of AI innovation presents new technical, organisational, and regulatory challenges. At its core, the challenge is ensuring that systems built to learn can also be governed.
“Infrastructure is the backbone of everything,” Jean-Michel Garcia, Chief Technology Officer at BNP Paribas, says. “Inference depends on infrastructure. Data centres will be the critical focus in the coming years. If you do not invest in your infrastructure now, you will fall behind on innovation and fail to meet business demand.”
This dependency between AI and infrastructure shapes how BNP Paribas develops its internal platforms. Its strategy rests on a hybrid foundation. On the one hand, it has built a dedicated private cloud and three European data centres operated exclusively by IBM. On the other, it continues to invest in public cloud and open-source tooling, enabling flexibility across environments. This gives the company complete control over where and how data and models are stored, accessed, and scaled.
“We are not optimising for the sake of cost-cutting,” Garcia adds. “We are investing significantly, and we must place those investments carefully. That means planning our delivery and choosing where to scale.”
BNP Paribas faces a familiar dilemma: whether to outsource model development and deployment entirely or keep everything in-house. The choice, Garcia explains, is not binary. “We sit between producers and consumers,” he continues. “On the production side, we must support large language models (LLMs), retrieval-augmented generation (RAG) systems, orchestration, model selection, and pipelines. At the same time, we need to offer quality and flexibility. Security, sovereignty and privacy are non-negotiable.”
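For readers unfamiliar with the RAG pattern Garcia refers to, the sketch below shows the basic idea in miniature: retrieve relevant internal documents, then pass them to a language model as context. It is deliberately simplified, with naive keyword retrieval, invented document text and a stubbed model call, and does not represent BNP Paribas’s actual pipelines.

```python
# Minimal, self-contained sketch of retrieval-augmented generation (RAG).
# Retrieval is naive keyword overlap and the "model" is a stub; a production
# system would use embeddings, a vector store and a hosted LLM.
DOCUMENTS = [
    "Our private cloud hosts sensitive workloads in European data centres.",
    "GPU capacity is allocated dynamically based on workload priority.",
    "Client data must remain encrypted and access is audited.",
]

def retrieve(question: str, k: int = 2) -> list[str]:
    """Rank documents by how many words they share with the question."""
    words = set(question.lower().split())
    ranked = sorted(DOCUMENTS,
                    key=lambda d: len(words & set(d.lower().split())),
                    reverse=True)
    return ranked[:k]

def answer(question: str) -> str:
    context = " ".join(retrieve(question))
    # Stand-in for a call to an LLM, with the retrieved context prepended.
    return f"[LLM prompt] Context: {context}\nQuestion: {question}"

print(answer("Where is client data stored?"))
```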
That last point is central. Regulatory compliance is a given in Europe, but BNP Paribas goes further. Client data, encrypted outputs, and AI-generated content raise new questions about access, governance, and liability. A hybrid model, with internal control over infrastructure and tooling, allows the company to navigate these challenges more confidently.
Platform thinking in a fragmented world
Delivering AI at scale within a complex, multi-entity business requires more than raw compute. It demands orchestration. BNP Paribas has more than 800 AI use cases in production, with hundreds more under development. The only way to manage this volume across diverse business units is to adopt a platform mindset.
“When users enter the platform, they must be able to complete all their work without leaving it,” Emmanuel Salzard, Head of ITG Labs at BNP Paribas, explains. “That means combining hardware, software, tooling, services, organisation, and data into one unified portal.”
At the heart of this approach is the AI Cloud, a set of modular, interoperable services designed to make AI tools accessible, reliable, and compliant. The infrastructure spans two core clouds: a dedicated private cloud for sensitive workloads and a general on-premises cloud. Both are containerised, using OpenShift to manage compute across environments. This setup provides the elasticity needed to allocate GPU resources dynamically based on demand and priority.
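To make the idea of demand- and priority-driven GPU allocation concrete, here is a minimal sketch of a priority scheduler. The workload names, priorities and capacity figures are hypothetical; in practice this arbitration happens inside the container platform rather than in application code.

```python
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class GpuRequest:
    priority: int                      # lower number = higher priority
    name: str = field(compare=False)   # hypothetical workload name
    gpus: int = field(compare=False)   # GPUs requested

def allocate(requests: list[GpuRequest], capacity: int) -> list[str]:
    """Grant GPU requests in priority order until the shared pool is exhausted."""
    granted = []
    heapq.heapify(requests)
    while requests and capacity > 0:
        req = heapq.heappop(requests)
        if req.gpus <= capacity:
            capacity -= req.gpus
            granted.append(req.name)
    return granted

# Example: a training job, a batch-inference job and an ad-hoc notebook
queue = [
    GpuRequest(priority=1, name="fraud-model-training", gpus=8),
    GpuRequest(priority=2, name="document-ocr-inference", gpus=4),
    GpuRequest(priority=3, name="research-notebook", gpus=2),
]
print(allocate(queue, capacity=10))
# ['fraud-model-training', 'research-notebook']
```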
Salzard emphasises the importance of governance. “AI introduces new IT assets that are not yet mature in traditional frameworks,” he adds. “For instance, we must be able to rewind a model to the exact code and data used months earlier for audit or debugging. Security remains a concern: AI platforms still lack robust malware defences, and strict internal controls make tasks like downloading models highly constrained.”
BNP Paribas has built its own Model Management System to manage these risks, handling everything from model scanning and signing to deployment and archival. Initially intended to collaborate with Hugging Face, the project pivoted when Hugging Face changed its offering. The result is a fully integrated system connected to the bank’s internal AI marketplace. It enables developers to reuse and track models while complying with internal and external requirements.
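The lifecycle Salzard describes, scan, sign, deploy, archive, can be pictured as a simple registry in which every approved model is fingerprinted before it is made available. The sketch below is purely illustrative: the function names and JSON registry are assumptions, not the bank’s internal system.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def register_model(artifact: Path, registry: Path, scanned: bool) -> dict:
    """Record an approved model with a content hash so it can be traced and reused."""
    if not scanned:
        raise ValueError("Model must pass scanning before registration")
    digest = hashlib.sha256(artifact.read_bytes()).hexdigest()
    record = {
        "name": artifact.stem,
        "sha256": digest,                                   # fingerprint for signing/audit
        "registered_at": datetime.now(timezone.utc).isoformat(),
        "status": "approved",                               # later: deployed, archived
    }
    entries = json.loads(registry.read_text()) if registry.exists() else []
    entries.append(record)
    registry.write_text(json.dumps(entries, indent=2))
    return record
```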
This self-service approach is key. Data scientists can access pre-approved models, integrate with NVIDIA and other catalogues, and work within a fully traceable DevOps environment. “Everything is deployed using GitOps principles,” Salzard explains. “We do not touch production directly. All changes come from referential repositories, and unauthorised production changes are reverted automatically.”
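The GitOps principle Salzard mentions boils down to a reconciliation loop: the repository declares the desired state, and anything in production that drifts from it is put back. The snippet below is a toy version of that idea, with made-up application names and a dictionary standing in for the live environment.

```python
# Toy GitOps reconciliation: Git is the single source of truth; drift is reverted.
def reconcile(desired: dict[str, str], live: dict[str, str]) -> list[str]:
    """Compare the declared state (from Git) with the live state and revert drift."""
    actions = []
    for app, version in desired.items():
        if live.get(app) != version:
            live[app] = version                      # redeploy the declared version
            actions.append(f"redeployed {app} at {version}")
    for app in list(live):
        if app not in desired:
            del live[app]                            # remove anything not declared
            actions.append(f"removed undeclared {app}")
    return actions

desired_state = {"risk-scoring": "v2.3.1", "doc-summariser": "v1.0.4"}
live_state = {"risk-scoring": "v2.3.0", "manual-hotfix": "v0.0.1"}
print(reconcile(desired_state, live_state))
```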
Data as a product
Underlying all of this is data – structured, unstructured, historical, and real-time. BNP Paribas has built a significant internal capability here, with 700 data scientists and 3,000 data experts working across the business. The infrastructure to support them includes a new data factory, which evolved from a previous Hadoop decommissioning project.
This initiative, Data V2, offers ingestion, enrichment, governance, and exposure services. It connects directly to the AI marketplace, making data accessible and auditable. “Data is now treated as a product,” Salzard notes. This product mindset extends to access controls, lineage, and observability, allowing models to be tested, traced, and validated consistently.
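As a rough illustration of what “data as a product” implies, a dataset carries its owner, access rules and lineage as first-class metadata rather than as tribal knowledge. The record below is a toy example; the field names and dataset names are assumptions, not the Data V2 schema.

```python
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    name: str
    owner: str
    allowed_roles: set[str]
    lineage: list[str] = field(default_factory=list)   # upstream datasets, for audit

    def readable_by(self, role: str) -> bool:
        """Simple access-control check against the product's declared roles."""
        return role in self.allowed_roles

transactions = DataProduct(
    name="card_transactions_daily",
    owner="payments-domain-team",
    allowed_roles={"fraud-analyst", "data-scientist"},
    lineage=["raw_card_events", "merchant_reference"],
)
assert transactions.readable_by("data-scientist")
print(transactions.lineage)   # trace the product back to its source datasets
```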
The platform’s strength is its integration of data and model management. It is not simply about automating workloads or accelerating pipelines. It is about creating an ecosystem where AI can be deployed at scale without losing transparency or control.
Resilience through simulation
One of the more forward-looking elements of the BNP Paribas strategy is its exploration of simulation and digital twins, not in the conventional sense of modelling physical assets, but in mapping dependencies across software and systems. “As we give AI greater responsibility, we must also maintain control,” Salzard says. “That will be critical for sustaining trust in these platforms.”
This speaks to a broader ambition: to use AI to power business processes and improve the bank’s IT landscape. By simulating application-to-application dependencies, BNP Paribas can identify risks, optimise resilience, and plan upgrades with greater precision. In a world where financial systems are only as strong as their weakest link, this kind of introspective AI could prove just as valuable as customer-facing automation.
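One way to picture this kind of software digital twin is as a dependency graph over which the bank can ask “what breaks if this fails?”. The sketch below computes that blast radius for a handful of invented systems; it shows the style of analysis, not BNP Paribas’s actual topology or tooling.

```python
# Toy dependency map (application -> applications it depends on). Names are hypothetical.
DEPENDS_ON = {
    "online-banking": ["auth-service", "payments-core"],
    "payments-core": ["ledger", "auth-service"],
    "fraud-scoring": ["payments-core"],
    "ledger": [],
    "auth-service": [],
}

def blast_radius(system: str) -> set[str]:
    """Every application affected, directly or transitively, if `system` fails."""
    affected, frontier = set(), {system}
    while frontier:
        hit = {app for app, deps in DEPENDS_ON.items()
               if any(d in frontier for d in deps) and app not in affected}
        affected |= hit
        frontier = hit
    return affected

for svc in DEPENDS_ON:
    print(svc, "->", sorted(blast_radius(svc)))
# The applications with the largest blast radius are the weakest links.
```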
A strategic right to choose
The most striking feature of the BNP Paribas approach is its clarity of purpose. AI is not pursued for its own sake. Nor is it deployed through blind adherence to trends. Every aspect of the platform, from its architecture to its governance and services, is designed to support long-term flexibility.
“By continuing to invest in scalable infrastructure, robust platforms, and cross-functional services, we will maintain the flexibility to grow, adopt new projects, and expose services that serve the business,” Garcia concludes. “Most importantly, we will retain the ability to choose—where, how, and when we deploy. And in this environment, having that choice is the most strategic advantage we can hold.”
This idea of choice, rooted in preparation, supported by infrastructure, and governed by design, is the real story behind BNP Paribas’s AI journey. It is not about reacting to change. It is about being ready for it.