By Tom Hawkins and Caroline Connell, Managing Partners at Evelyn Partners
When an AI tool summarises a document or drafts an email, the experience feels immediate and frictionless. A question is typed, and a coherent, structured response appears almost instantly. But that smooth interaction is powered by a complex and highly capital-intensive ecosystem. Artificial intelligence is not one standalone innovation; it is a layered value chain of technologies and infrastructure working in concert. Understanding AI in this way helps to reveal where value is generated and where the most meaningful investment opportunities may emerge.
Consider a lawyer preparing for a negotiation. She asks her AI assistant to summarise key risks and redraft a contractual clause. The output appears polished and precise. However, what seems like a single productivity tool is, in reality, the visible layer of a far broader technology ecosystem operating beneath the surface.
At the top sits the application layer:
The interface with which users engage. This includes products such as Microsoft Copilot embedded within Office or specialist platforms like Harvey AI for legal professionals. It is here that AI delivers measurable productivity gains, embedding directly into professional workflows and driving subscription-based revenues.
Beneath this lies the enablement layer:
This allows AI to operate securely and effectively within enterprise environments. Through APIs (Application Programming Interfaces) provided by companies such as OpenAI, Anthropic or Google, applications connect to powerful foundation models. Data platforms like Snowflake and Databricks structure and retrieve internal information, while governance and compliance tools ensure outputs meet regulatory standards. This layer does not generate intelligence itself; rather, it ensures intelligence can be deployed responsibly at scale.
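To make this connection concrete, the sketch below shows the shape of the request an application might assemble before sending it to a foundation-model API. The payload follows the publicly documented OpenAI chat-completions format, but the function, the system prompt and the chosen model name are illustrative assumptions, not production code.

```python
import json

# Illustrative sketch only: how an application-layer product might package
# a user's request for a foundation-model API (OpenAI-style chat format).
# The endpoint, model name and system prompt are assumptions for illustration.
API_ENDPOINT = "https://api.openai.com/v1/chat/completions"

def build_chat_request(user_prompt: str, model: str = "gpt-4o-mini") -> dict:
    """Return the JSON payload an application would POST to the model API."""
    return {
        "model": model,
        "messages": [
            # The system message frames the assistant's role for the task.
            {"role": "system",
             "content": "You are a legal drafting assistant."},
            {"role": "user", "content": user_prompt},
        ],
        "temperature": 0.2,  # low temperature favours precise, repeatable drafting
    }

payload = build_chat_request("Summarise the key risks in this contract.")
print(json.dumps(payload, indent=2))
```

In practice the enablement layer adds much more around this call: authentication, retrieval of internal documents, logging and compliance checks, which is precisely why it forms a distinct layer in the stack.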
At the centre sits the intelligence layer:
The foundation models, including GPT, Claude and Gemini. Trained on vast datasets, these systems generate the responses users receive. They represent the core technological innovation.
Supporting them is the compute layer:
Hyperscale data centres operated by Amazon Web Services, Microsoft Azure and Google Cloud. Within these facilities sit advanced semiconductors, notably AI-optimised chips designed by NVIDIA and AMD and manufactured by firms such as TSMC. These components perform trillions of calculations per second.
Underpinning the entire stack is energy:
Electricity powers the data centres and the processors within them. As AI workloads grow, so does demand for reliable, large-scale power, making energy supply a genuine constraint on the rest of the stack.
For investors, however, the central question is where sustainable returns are most likely to accrue. The gold rush analogy remains instructive: those supplying the picks and shovels often generated more consistent profits than those searching for gold. In the context of AI, this directs attention toward the enabling infrastructure — semiconductor designers and manufacturers, hyperscale cloud providers, data centre operators, and energy suppliers. As AI adoption expands, demand for compute capacity, advanced chips and reliable power rises irrespective of which individual applications succeed.
However, AI’s layered structure makes the investment case less straightforward. The “picks and shovels” are not limited to one obvious area. Companies building foundation models may benefit from scale and pricing power. Enterprise platforms that integrate AI into corporate systems could become deeply embedded and hard to replace. Some applications may build strong competitive positions through proprietary data or distribution advantages.
Understanding where long-term value will ultimately concentrate requires careful analysis of competitive dynamics, capital requirements and barriers to entry. In a system this interconnected, leadership in one layer today does not automatically translate into a durable advantage tomorrow.
Get in touch
To get in touch with Tom or Caroline, please call 020 3818 6784 or email Tom.Hawkins@evelyn.com / Caroline.Connell@evelyn.com.