OpenAI starts new deployment business; acquires Tomoro to scale enterprise AI rollouts


OpenAI has launched the OpenAI Deployment Company, a new business focused on helping enterprises build, test, and run AI systems inside day-to-day operations. The company will work directly with businesses through teams of Forward Deployed Engineers, or FDEs, who specialize in building AI systems within complex enterprise environments.

As part of the launch, OpenAI has agreed to acquire Tomoro, an applied AI consulting and engineering firm. The deal is expected to bring around 150 engineers and deployment specialists into the new business once the acquisition closes.

The move signals a shift in how OpenAI plans to grow its enterprise business. Instead of only selling APIs and software subscriptions, the company now wants to play a larger role in how organizations redesign workflows, connect AI tools to business systems, and deploy AI across operations.

OpenAI’s push beyond models

OpenAI says more than one million businesses now use its products and APIs. The company believes the next phase of enterprise AI will depend less on access to models and more on whether companies can make AI work reliably inside real business environments.

That is where the OpenAI Deployment Company comes in.

The new unit will place engineers inside customer organizations to identify areas where AI can deliver measurable business results. Those teams will then help design workflows, connect AI systems to company data and software, and oversee deployment.

This approach resembles the “forward deployed engineering” model used by companies like Palantir Technologies, where engineers work closely with customers instead of only shipping standard software products.

OpenAI says the new business will operate as a standalone unit while staying closely connected to its research and product teams. The company believes this setup will help enterprise customers build systems that can adapt as newer AI models and tools become available.

What forward deployed engineers actually do

Forward deployed engineering focuses on building AI systems directly inside enterprise operations rather than offering one-size-fits-all software.

According to OpenAI, FDE teams work inside environments where companies deal with security controls, governance rules, permissions systems, compliance requirements, and legacy infrastructure.

Instead of starting with pre-built software, these teams study how work happens in practice and then build AI systems around those workflows.

A typical engagement will start with identifying high-value use cases. Engineers will then build production systems that connect OpenAI’s models with customer databases, tools, workflows, and operational controls.

The company says these deployments are aimed at measurable business outcomes rather than experimental pilots.

Tomoro acquisition gives OpenAI a ready-made engineering team

The acquisition of Tomoro gives OpenAI an existing team with enterprise deployment experience from day one.

Tomoro has worked on AI deployments for companies including Tesco, Virgin Atlantic, and Supercell.

These projects involved real-time AI systems used in operational environments where governance, reliability, and business impact matter immediately.

OpenAI says Tomoro’s engineers will help customers move faster from selecting AI use cases to deploying systems in production.

The acquisition remains subject to regulatory approvals and customary closing conditions.

Backed by investment firms and consulting companies

The OpenAI Deployment Company launches with more than US$4 billion in initial investment.

The partnership includes investment firms such as TPG, Advent International, Bain Capital, Brookfield Corporation, Goldman Sachs, SoftBank Corp., and Warburg Pincus.

Consulting and systems integration firms including Bain & Company, Capgemini, and McKinsey & Company are also part of the partnership.

This matters because these firms already work with thousands of enterprises across industries. OpenAI appears to be building a channel that combines AI technology with consulting, operational redesign, and enterprise-scale execution.

Enterprise AI is moving from pilots to operations

The launch reflects a broader shift taking place across the enterprise AI market.

Over the past two years, many organizations experimented with generative AI through chatbots, coding assistants, and employee productivity tools. Many of those projects never moved beyond the pilot stage because companies struggled with governance, data access, workflow redesign, and reliability.

OpenAI now wants to embed itself deeper inside enterprise operations rather than act only as a model provider.

The company says AI systems are becoming capable of handling increasingly meaningful work across organizations. The challenge now is operational deployment.

Denise Dresser, Chief Revenue Officer at OpenAI, said companies need help integrating AI systems into the infrastructure and workflows that run their businesses.

The company also says its deployment model could help customers redesign operations around AI systems that can reason, take actions, and support decision-making.

OpenAI’s larger enterprise ambition

The OpenAI Deployment Company also highlights how competition in enterprise AI is changing.

Major technology firms including Microsoft, Google, Amazon Web Services, and Anthropic are all trying to become long-term enterprise AI partners.

The next phase of competition may depend less on benchmark scores and more on who can help enterprises deploy AI systems that actually change how work gets done.

OpenAI’s latest move suggests the company sees deployment, workflow redesign, and enterprise execution as the next major battleground in AI.
