Big Tech Earnings: All Hands on Deck for AI, We’re Going into Combat Mode
Amazon, Meta, Google, Microsoft Earnings Takeaways
The biggest of the big tech companies, Google, Meta, Microsoft, and Amazon, reported their latest quarterly earnings in the past few weeks, and the buzz around AI is louder than ever. Here are the key takeaways on what these tech giants are saying about AI and what we now know about their AI strategies.
Capital Investment is Skyrocketing
The big four are gearing up to spend a jaw-dropping $230 billion on capital investments in 2024, an increase of nearly 50% from 2023. To put that into perspective, $230 billion is more than 10% of the annual GDP of Brazil, Italy, or Canada.
So why are these companies pouring in so much cash? Simple: they’re seeing demand outstrip supply. Microsoft noted that "demand continues to be higher than our available capacity," while Amazon echoed this sentiment, stating they see "more demand than we could fulfill if we had even more capacity today."
Where’s the Demand Coming From?
The demand for AI applications is surging, and the big tech players are already reaping the benefits. Here’s a closer look at what each company offers, the demand signals they are seeing, and how they’re capitalizing on this momentum.
Google
Google is experiencing a sharp acceleration in its cloud business, largely fueled by its extensive AI portfolio. This growth isn't just about attracting new clients; existing customers are also ramping up their usage. Google’s services cover the entire tech stack:
Infrastructure Layer: Google provides computing power that supports AI workloads.
Platform Layer: Google’s Gemini API lets developers integrate Gemini-powered AI capabilities directly into their own applications (a minimal example call appears after this list).
Data Layer: Tools like BigQuery enable advanced data analytics powered by AI.
Applications Layer: Google is rolling out various AI-driven applications, including the VW Virtual Assistant, an in-app helper that lets drivers ask questions about their vehicle in natural language.
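To make the Platform Layer point concrete, here is a minimal sketch of calling the Gemini API from Python with the google-generativeai client. The model name and prompt are illustrative placeholders, and the available model versions change over time, so treat this as a sketch rather than Google's official example.

```python
# Minimal sketch of a Gemini API call via the google-generativeai client.
# Assumes `pip install google-generativeai` and a GOOGLE_API_KEY in the environment;
# the model name below is illustrative -- check Google's docs for current versions.
import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])

# Instantiate a Gemini model and generate a completion.
model = genai.GenerativeModel("gemini-1.5-flash")
response = model.generate_content(
    "Summarize the main drivers of cloud AI demand in two sentences."
)
print(response.text)
```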
Moreover, Google is weaving AI into its core search product. Philipp Schindler, Google’s Chief Business Officer, emphasized that “AI really supercharges search.” Recent innovations include AI-powered search formats like AI Overviews and Circle to Search, which are lifting user experience and engagement.
Additionally, Google is already reaping the benefits of generative AI internally. Notably, the company disclosed that more than 25% of all new code at Google is now generated by AI and then reviewed by engineers, a substantial efficiency gain.
Microsoft
Microsoft stands at the forefront of the AI revolution, largely thanks to its strategic partnership with OpenAI. Like Google, Microsoft offers AI products and services across the full stack.
Infrastructure Layer: Microsoft provides powerful computing resources where OpenAI models are trained.
Platform Layer: Microsoft is building an end-to-end app platform within Azure AI that lets businesses create their own copilots and agents (a minimal call pattern is sketched after this list). For instance, GE Aerospace used Azure OpenAI to build a digital assistant for its 52,000 employees.
Application Layer: This is where Microsoft has its most interesting offerings, spanning many different use cases. GitHub Copilot is changing coding practice, with enterprise usage up 55% quarter-over-quarter. Tools like Copilot Autofix help developers fix vulnerabilities more than three times faster than before. Additionally, Microsoft 365 Copilot has doubled daily user engagement and is now used by nearly 70% of Fortune 500 companies.
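As a rough illustration of the Azure OpenAI building block referenced in the Platform Layer above (the same service GE Aerospace reportedly used), here is a minimal sketch of a chat call against an Azure OpenAI deployment using the official openai Python package. The endpoint, deployment name, and API version are placeholders, not values from Microsoft’s earnings call.

```python
# Minimal sketch of a chat completion against an Azure OpenAI deployment.
# Assumes `pip install openai` and an existing Azure OpenAI resource; the endpoint,
# deployment name, and api_version below are placeholders, not real values.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",  # illustrative; use the version your resource supports
)

response = client.chat.completions.create(
    model="my-gpt-4o-deployment",  # your Azure deployment name, not the raw model id
    messages=[
        {"role": "system", "content": "You are an internal employee assistant."},
        {"role": "user", "content": "Where do I find the travel expense policy?"},
    ],
)
print(response.choices[0].message.content)
```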
The introduction of features like Copilot Studio lets organizations connect Microsoft 365 tools to autonomous agents, enhancing productivity across the board. More than 100,000 organizations have tried Copilot Studio, up more than 2x quarter-over-quarter. LinkedIn’s first agent, Hiring Assistant, helps hirers find qualified candidates faster, with a 44% higher acceptance rate.
Ultimately, Microsoft’s view is that humans need a way to interface with AI, and that UI layer will be Copilot, much as Windows has been the UI layer through which people interact with most computers today.
Importantly, customers are paying for these AI applications. During its earnings call, Microsoft announced that its AI business is on track to surpass a $10 billion annual revenue run rate next quarter, making it the fastest business in Microsoft’s history to reach that scale.
Meta
Meta also offers full-stack AI products and services.
Model Layer: Meta is training its Llama 4 models at an unprecedented scale, on a cluster of more than 100,000 H100 GPUs, larger than anything competitors have publicly reported. Mark Zuckerberg heavily emphasized the benefits of open-sourcing Llama, arguing that it will help the industry standardize around Llama models (a brief sketch of running an open-weight Llama checkpoint follows this list).
Application Layer: Meta AI boasts over 500 million monthly active users, positioning it to become the most widely used AI assistant by year-end. Zuckerberg envisions a future where businesses can deploy AI agents for customer service and sales with just a few clicks.
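One practical consequence of the open-weight strategy is that anyone can pull a Llama checkpoint and run it on their own hardware or cloud. Below is a minimal sketch using Hugging Face transformers; the model id is illustrative, and the gated checkpoints require accepting Meta’s license plus a GPU with enough memory.

```python
# Minimal sketch of running an open-weight Llama model with Hugging Face transformers.
# Assumes `pip install transformers torch accelerate`, an accepted Llama license on the
# Hugging Face Hub, and sufficient GPU memory; the model id is illustrative.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.1-8B-Instruct",  # illustrative open-weight checkpoint
    device_map="auto",  # place the model on available GPUs automatically
)

out = generator("Open-weight models matter because", max_new_tokens=40, do_sample=False)
print(out[0]["generated_text"])
```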
Similar to Google, Meta’s AI work is already contributing to its existing businesses today. Recent improvements in AI-driven feed recommendations have increased time spent on Facebook and Instagram by 8% and 6%, respectively. Moreover, over one million advertisers have used Meta’s generative AI tools to create more than 15 million ads in a single month, and businesses using the image-generation tools have reported a 7% lift in conversions.
Lastly, Meta is laser-focused on its Ray-Ban Meta smart glasses, which Zuckerberg sees as the ideal form factor for integrating AI into everyday life, since the glasses “let your AI see what you see, hear what you hear, and talk to you” in real time.
Amazon
Finally, let’s review Amazon’s offerings across the stack.
Infrastructure Layer: AWS leads in providing cutting-edge hardware such as NVIDIA H200 GPUs and its own custom silicon (Trainium for training, Inferentia for inference).
Models Layer: Amazon Bedrock is a model aggregator, offering a broad selection of foundation models and recently adding Claude 3.5 and Meta’s Llama 3.2 to its roster (a minimal Bedrock call is sketched after this list). In the past 18 months, AWS has released nearly twice as many machine learning features as the other leading cloud providers combined.
Application Layer: Amazon’s suite includes tools like Amazon Q (a generative AI assistant for software development), Rufus (an expert shopping assistant), Project Amelia (an assistant for sellers), and AI-assisted note-taking on Kindle Scribe. Amazon is also re-architecting Alexa around next-generation foundation models to enhance user interaction.
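To ground the Models Layer point, here is a minimal sketch of invoking a hosted model through Amazon Bedrock’s Converse API with boto3. The region and model id are illustrative, and Bedrock model access has to be enabled in the AWS account before the call will succeed.

```python
# Minimal sketch of calling a hosted model through Amazon Bedrock's Converse API.
# Assumes `pip install boto3`, configured AWS credentials, and Bedrock model access
# enabled in the account; the region and model id below are illustrative.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # illustrative Bedrock model id
    messages=[
        {"role": "user", "content": [{"text": "In one line, what does Amazon Bedrock do?"}]}
    ],
    inferenceConfig={"maxTokens": 128},
)
print(response["output"]["message"]["content"][0]["text"])
```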
On the future of AI applications, Amazon believes that while many current applications focus on cost savings and productivity, the next wave will not only answer questions but also take actions on users’ behalf.
Amazon’s CEO Andy Jassy believes that GenAI "is a really unusually large, maybe once in a lifetime type of opportunity."
The magnitude and pace of AI-related revenue growth back up Jassy’s position. Amazon’s AI business already runs at a multi-billion dollar annualized revenue rate and continues to grow at a triple-digit year-over-year rate, three times faster than AWS grew at the same stage of its evolution.
AI Strategy Comparison: What's Similar and What's Different
Similarities
Everyone's going full-stack, covering everything from infrastructure to models to applications.
Investment is on the rise across the board, even though they’ve already poured in huge sums; demand simply keeps outstripping supply as customers roll out real applications with real use cases that drive value.
Differences
Microsoft is using its enterprise dominance to push Copilot as the go-to interface between humans and AI, much like Windows is for PCs.
Meta is all about the consumer side, leveraging its massive social media presence. They want Meta AI to be the most popular AI assistant by year-end, tapping into their more than 3 billion monthly active users. Plus, their glasses aim to be the next big hardware platform for AI, just like smartphones were for the internet. And with the open-weight Llama models, they’re trying to make LLMs a commodity: if Llama is free, why pay for anything else?
Amazon is focused on AWS, still the king of the enterprise cloud. It wants to leverage AWS’s stronghold among enterprises to make AWS the go-to platform for AI developers and users alike.
Google has a muddled strategy. It is trying to position Google Cloud as a one-stop shop like AWS, but the offering is not really differentiated (and Google Cloud trails AWS and Azure to begin with). AI-powered search feels more like a defensive move against rivals like Perplexity and ChatGPT (which is entering search) than an offensive play. Google has its own infrastructure and models, but they don’t clearly stand out against other hyperscalers or model developers. In terms of applications, who had even heard of the VW Virtual Assistant before? Google seems to have all the pieces but isn’t quite hitting the mark with any of them and lacks a clear direction.
Bottom Line
AI is not just some fleeting trend. It is here to stay. Real applications and use cases are emerging and major tech players are betting big on it. Demand is skyrocketing, and these companies are racing to keep up.
While all the big tech firms offer full-stack solutions, each has its own strategic focus. Microsoft is pushing Copilot as the essential bridge between humans and AI. Meta is all about making AI assistants a household name and its glasses the next big thing. Amazon is leveraging its AWS powerhouse to dominate the enterprise space. And Google is still trying to figure out how to stand out in the AI era.
The competition is heating up, and it’ll be thrilling to see how these strategies unfold. With so much at stake, expect to see some bold moves as these tech titans battle for supremacy in the AI arena.