ServiceNow has activated Nvidia artificial intelligence infrastructure within its two United Kingdom data centres, a strategic deployment aimed at enhancing the nation’s sovereign AI capabilities. This initiative provides high-performance, locally housed AI processing to both public and private sector organizations, enabling them to accelerate digital transformation while ensuring data residency and improved performance.
The collaboration leverages Nvidia’s advanced infrastructure to power a wide range of AI use cases across the ServiceNow portfolio. A central component of this enhancement is the introduction of Apriel, a new reasoning large language model (LLM) developed using Nvidia’s Nemotron model family. This powerful new engine is designed to enable a new class of sophisticated AI agents capable of making context-aware decisions and navigating complex enterprise workflows, marking a significant step forward in the drive toward widespread agentic AI.
A Strategic Push for the UK’s AI Sovereignty
The decision to host advanced AI hardware directly within the UK is a crucial development for the nation’s technological autonomy. By locating the computing capacity that powers its AI solutions on British soil, ServiceNow addresses critical requirements for data sovereignty and performance, eliminating the latency and regulatory complexity of routing data internationally. This move provides organizations, particularly in the public sector and regulated industries, with the assurance that their data is managed within national borders.
This deployment has been recognized by government officials as a significant contribution to the national AI strategy. Kanishka Narayan MP, the Minister for AI and Online Safety, described the investment as a “real shot in the arm” for the UK’s AI infrastructure. He emphasized the importance of having the necessary tools on British shores to drive AI development “on our own terms,” reinforcing the government’s ambition to establish the country as a global leader in AI innovation.
The Technology Powering the Initiative
The foundation of this initiative is the deployment of cutting-edge hardware and sophisticated new AI models designed for enterprise-grade performance. The collaboration brings together Nvidia’s leadership in accelerated computing with ServiceNow’s expertise in intelligent workflows.
Nvidia’s Advanced Infrastructure
The UK data centres are equipped with Nvidia’s latest technology, designed to handle the immense computational demands of training and running large-scale AI models. While the primary announcement did not specify the exact GPU models, the broader context of Nvidia’s enterprise offerings points to powerful architectures like the Blackwell platform. The Blackwell architecture, featuring chips with 208 billion transistors, was created specifically for the generative AI era. It introduces fifth-generation Tensor Cores and a second-generation Transformer Engine, which Nvidia says cut the cost and energy consumption of LLM inference by up to 25 times compared with the previous generation.
Introducing the Apriel Reasoning Engine
At the heart of the new service offering is Apriel, a reasoning LLM built using Nvidia Nemotron 15B. The Nemotron family of open models is engineered for building specialized, efficient agentic AI, and excels at complex tasks such as graduate-level scientific reasoning, coding, and instruction following. By building Apriel on this foundation, ServiceNow has created an enterprise-grade reasoning engine. This engine will power intelligent digital agents that go beyond simple automation: they can analyze situations, adapt to changing workflows, and deliver personalized outcomes with minimal human supervision.
Enhancing Enterprise Automation and Governance
The primary goal of this technological deployment is to fundamentally change how organizations operate by embedding more autonomous and intelligent systems into daily workflows. This is achieved through the concept of agentic AI, which is supported by a robust framework for governance and oversight.
Agentic AI in the Workplace
Damian Stirrett, Group Vice President & General Manager for UK & Ireland at ServiceNow, highlighted the company’s commitment to delivering “agentic AI at scale.” Agentic AI refers to systems that can pursue complex goals with limited direct human supervision, acting more like a partner than a simple tool. These AI agents use LLMs as a “brain” to perceive their environment, reason through tasks, and execute actions using various software tools. In a business context, this could involve an AI agent that not only identifies a customer service issue but also autonomously accesses relevant databases, initiates a resolution process, and communicates with the customer, all while learning from the interaction to improve future performance. This evolution of AI promises to unlock significant efficiency gains and allow human employees to focus on more strategic challenges.
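The perceive-reason-act pattern described above can be sketched in a few lines. This is a minimal, self-contained illustration only: the function names (`query_llm`, `run_agent`), the tool registry, and the billing-ticket scenario are all hypothetical stand-ins, not part of any ServiceNow or Nvidia API, and the "reasoning" step is scripted so the loop runs without a model endpoint.

```python
# Minimal sketch of an agentic AI loop: an LLM "brain" picks the next tool,
# the agent executes it, and the observation feeds back into the next decision.
# All names and the ticket scenario here are hypothetical illustrations.

TOOLS = {
    "lookup_customer": lambda args: {"customer": args["id"], "status": "active"},
    "open_resolution": lambda args: {"ticket": args["issue"], "state": "in_progress"},
    "notify_customer": lambda args: {"sent": True, "to": args["customer"]},
}

def query_llm(goal, history):
    """Stand-in for the reasoning model: returns the next (tool, args) or None.

    A real agent would send the goal and the history of observations to a
    model endpoint; here the decisions are scripted so the sketch is runnable.
    """
    plan = [
        ("lookup_customer", {"id": "C-42"}),
        ("open_resolution", {"issue": "billing error"}),
        ("notify_customer", {"customer": "C-42"}),
    ]
    return plan[len(history)] if len(history) < len(plan) else None

def run_agent(goal):
    history = []  # accumulated (tool, observation) pairs: the agent's memory
    while (step := query_llm(goal, history)) is not None:
        tool, args = step
        observation = TOOLS[tool](args)  # act on the environment, then perceive
        history.append((tool, observation))
    return history

if __name__ == "__main__":
    for tool, obs in run_agent("resolve billing complaint for C-42"):
        print(tool, obs)
```

The key design point is the feedback loop: each observation becomes context for the next decision, which is what lets an agent adapt mid-workflow rather than follow a fixed script.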
Framework for Responsible AI Deployment
To ensure this powerful technology is deployed responsibly, ServiceNow provides its AI Control Tower. This centralized governance framework gives organizations the visibility and control needed to manage all AI initiatives, whether developed in-house or by third parties. The AI Control Tower allows businesses to connect AI projects to core business strategies, automate workflows for the entire AI lifecycle, and manage risk and compliance with regulations like the EU AI Act. It provides a unified platform to monitor performance, enforce governance policies, and ensure that all AI systems operate transparently and align with enterprise goals.
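A governance layer of this kind typically works as a policy gate: before an AI use case is deployed, it is checked against a register of required controls. The sketch below shows the general pattern only, assuming invented field names and rules; it is not the AI Control Tower's actual data model or API.

```python
# Hypothetical sketch of an AI-lifecycle governance gate: a use case must
# pass every registered policy before deployment. Policy names, rules, and
# record fields are illustrative assumptions, not ServiceNow's schema.

POLICIES = [
    ("has_owner", lambda uc: bool(uc.get("owner"))),
    ("risk_assessed", lambda uc: uc.get("risk_level") in {"low", "medium", "high"}),
    ("eu_ai_act_reviewed", lambda uc: uc.get("eu_ai_act_review") is True),
]

def governance_check(use_case):
    """Return the names of failed policies; an empty list means cleared to deploy."""
    return [name for name, rule in POLICIES if not rule(use_case)]

use_case = {
    "name": "customer-triage-agent",
    "owner": "it-ops",
    "risk_level": "medium",
    "eu_ai_act_review": False,  # the missing review blocks deployment
}
print(governance_check(use_case))
```

Centralizing checks like these in one register, rather than per team, is what gives an organization a single view of compliance across in-house and third-party AI systems.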
A Shared Vision for Workforce Transformation
The collaboration between ServiceNow and Nvidia is built on a shared mission to reimagine productivity. Kari Ann Briski, Vice President of Generative AI Software for Enterprise at Nvidia, stated that the two companies aim to create AI tools that “help people get more done.” This vision extends beyond simple task automation to creating an ecosystem where AI agents can handle complex workflows, freeing up human potential for innovation and higher-value work. The integration of Nvidia NeMo microservices into the ServiceNow Workflow Data Fabric is a key step toward this goal, providing a powerful foundation for the creation of intelligent digital agents.
Broader Context of UK Tech Investment
This deployment is part of a larger wave of technology and AI investment in the United Kingdom. The announcement follows UK Prime Minister Sir Keir Starmer’s pledge at the 2025 London Tech Week to invest £2 billion into the country’s AI infrastructure, a move intended to position Britain as an “AI maker, not an AI taker.” In the wake of this commitment, numerous other technology firms have expanded their UK operations. The AI infrastructure company Nscale committed to deploying 10,000 Nvidia Blackwell GPUs in the UK by the end of 2026, while AI cloud provider Nebius announced its first UK-based AI factory. Microsoft also committed to new data centre investments, and Nvidia has partnered directly with the government to address the AI skills gap. Nvidia’s CEO, Jensen Huang, has noted that such infrastructure investments create a positive feedback loop, enabling more research, breakthroughs, and company growth.