INDUSTRY · BY AI|EXPERT SCOUT · Thursday, May 7, 2026 · 3 MIN READ
NVIDIA and ServiceNow Tie Autonomous Agents to Hardware Efficiency
NVIDIA and ServiceNow announced a partnership delivering autonomous AI agents for enterprise IT and business operations, with NVIDIA optimizing the compute substrate. The collaboration brings chip-level infrastructure investment directly into enterprise workflow automation, signaling demand for vertically integrated agentic platforms.
FIG. 01: NVIDIA and ServiceNow link autonomous agents to hardware efficiency gains. (Generative imagery)
NVIDIA and ServiceNow announced a full-stack expansion of their enterprise AI partnership at ServiceNow Knowledge 2026, centering on autonomous agents built for knowledge workers, IT teams, and developers.
The centerpiece is Project Arc, a desktop agent connected natively to ServiceNow's AI Platform through the company's Action Fabric API layer, giving it access to local file systems, terminals, and installed applications. Every action flows through ServiceNow AI Control Tower for auditability. Project Arc handles complex multi-step tasks that traditional automation tooling cannot reach.
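The article does not describe Control Tower's internals, but the pattern it names (every agent action recorded for audit before execution) can be sketched in a few lines. All names below are illustrative assumptions, not ServiceNow's API.

```python
# Minimal sketch of audit-gated action dispatch: record first, then execute.
# "audit_log" and "audited" are hypothetical names for illustration only.
import datetime

audit_log = []

def audited(action_name, fn, *args):
    """Append an audit record, then run the action."""
    audit_log.append({
        "action": action_name,
        "args": list(args),
        "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })
    return fn(*args)

result = audited("summarize_ticket", lambda t: t.upper(), "incident #4821")
print(result)          # INCIDENT #4821
print(len(audit_log))  # 1
```

Recording before execution (rather than after) means even failed or interrupted actions leave an audit trail, which is the property an auditability layer typically needs.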
Secure execution comes from NVIDIA OpenShell, an open-source sandboxed environment that defines what an agent can see, which tools it can invoke, and how actions are contained within policy boundaries. ServiceNow builds on OpenShell and contributes code to the project.
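OpenShell's actual policy format is not detailed in the announcement, so the sketch below only illustrates the concept it describes: a policy that defines what an agent can see and which tools it can invoke, with everything else denied by default. Every identifier here is an assumption.

```python
# Illustrative deny-by-default sandbox policy check (hypothetical names).
from dataclasses import dataclass, field

@dataclass
class SandboxPolicy:
    readable_paths: set = field(default_factory=set)  # what the agent can see
    allowed_tools: set = field(default_factory=set)   # which tools it can invoke

    def permits(self, tool: str, path: str) -> bool:
        # An action is allowed only if both the tool and the target path
        # fall inside the policy boundary; anything else is contained.
        return tool in self.allowed_tools and any(
            path.startswith(prefix) for prefix in self.readable_paths
        )

policy = SandboxPolicy(readable_paths={"/workspace"}, allowed_tools={"grep", "ls"})
print(policy.permits("grep", "/workspace/src/main.py"))  # True
print(policy.permits("rm", "/workspace/src/main.py"))    # False (tool not allowed)
```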
The efficiency gains are material. The Blackwell platform delivers more than 50x greater token output per watt compared with Hopper, translating to nearly 35x lower cost per million tokens. For an enterprise running agents across millions of concurrent workflows, that differential determines whether agentic AI remains a departmental experiment or moves into broad production.
FIG. 02: Blackwell delivers 50x greater token throughput per watt and 35x lower inference cost versus Hopper. (NVIDIA, 2025)
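The per-watt figure can be turned into back-of-envelope cost arithmetic. Only the 50x ratio comes from the announcement; the baseline throughput and electricity price below are placeholder assumptions.

```python
# Energy-only cost to generate one million tokens, given efficiency in
# tokens per joule (equivalently, tokens/s per watt) and a power price.
def energy_cost_per_million_tokens(tokens_per_joule, dollars_per_kwh):
    joules_per_token = 1.0 / tokens_per_joule
    kwh_per_million = joules_per_token * 1_000_000 / 3_600_000  # J -> kWh
    return kwh_per_million * dollars_per_kwh

# Hypothetical baseline: 200 tokens per joule at $0.10/kWh.
hopper = energy_cost_per_million_tokens(200, 0.10)
blackwell = energy_cost_per_million_tokens(200 * 50, 0.10)  # 50x per the article
print(round(hopper / blackwell))  # 50
```

A 50x efficiency gain cuts energy cost per token 50x, yet the quoted overall figure is 35x; the gap is consistent with non-energy costs (hardware, cooling, operations) that do not shrink with perf/watt.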
Benchmarking happens via NOWAI-Bench, a joint open benchmarking suite integrated with NVIDIA's NeMo Gym library. Its EnterpriseOps-Gym component focuses on multi-step workflow evaluation—the failure mode most general benchmarks ignore. NVIDIA's Nemotron 3 Super currently holds the top ranking among open-source models on that leaderboard.
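A one-liner shows why multi-step workflow evaluation exposes failures that single-turn benchmarks miss: per-step success rates compound. The numbers below are illustrative arithmetic, not NOWAI-Bench methodology or data.

```python
# If each step succeeds independently with probability p, a workflow of n
# steps succeeds with probability p**n; accuracy erodes fast with depth.
per_step_accuracy = 0.95
for steps in (1, 5, 10, 20):
    print(steps, round(per_step_accuracy ** steps, 3))
# 1 0.95 / 5 0.774 / 10 0.599 / 20 0.358
```

A model that looks strong on one-shot tasks can fail most 20-step workflows, which is why an EnterpriseOps-style suite has to score the whole trajectory rather than individual turns.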
The architecture carries vertical lock-in risk from two directions. ServiceNow's Action Fabric and AI Control Tower create the workflow orchestration layer. NVIDIA's Blackwell silicon, NeMo model toolkit, and OpenShell runtime form the compute and execution substrate. The validated joint design—the NVIDIA Enterprise AI Factory blueprint that ServiceNow AI Control Tower explicitly integrates with—rewards full-stack adoption.
Project Arc's availability timeline was not disclosed. Open questions remain around multi-cloud portability of OpenShell sandboxes and how NOWAI-Bench scores translate to customer-specific workflow complexity.