Andy Jassy
President and Chief Executive Officer at Amazon.com
Thanks, Dave. Today, we're reporting $187.8 billion in revenue, up 10% year-over-year. Given the way the dollar strengthened throughout the quarter, we saw $700 million more of foreign-exchange headwind than we anticipated at guidance. Without that headwind, revenue growth would have been 11% year-over-year and exceeded the top end of our guidance. Operating income was $21.2 billion, up 61% year-over-year, and trailing twelve-month free cash flow adjusted for equipment finance leases was $36.2 billion, up $700 million year-over-year.
We're pleased with the invention, customer experience improvements and results delivered in 2024, and we have a lot more planned in 2025. I'll start by talking about our stores business. We saw 10% year-over-year revenue growth in our North America segment and 9% year-over-year in our International segment, excluding the impact from foreign-exchange rates. Our continued focus on expanding selection, lowering prices and improving convenience drove strong unit growth that even outpaced our revenue growth.
We continue to add to our broad range of selection, giving customers choice across a variety of price points. We welcomed notable brands to our store throughout 2024, including Clinique, Estée Lauder, Oura Rings and Armani Beauty. We continue to add to the hundreds of millions of products offered from our selling partners, who made up 61% of items sold in 2024, our highest annual mix of third-party seller units ever. We also launched Amazon Haul for US customers in Q4, which offers customers an engaging shopping experience that brings ultra-low-price products into one convenient destination. It's off to a very strong start.
Customers continue to want Amazon to be the place they rely on for sharp pricing. In the fourth quarter, consumers saved more than $15 billion with our low everyday prices and record-setting events during Prime Big Deal Days in October and Black Friday and Cyber Monday around Thanksgiving. Additionally, Profitero's annual pricing study found that entering the holiday season, Amazon had the lowest online prices for the eighth year in a row, with prices averaging 14% lower than other leading US retailers.
Our speed of delivery continues to accelerate, and 2024 was another record-setting year for Prime members. We expanded the number of same-day delivery sites by more than 60% in 2024, and they now serve more than 140 metro areas. Overall, we delivered over 9 billion units the same or next day around the world. Our relentless pursuit of better selection, price and delivery speed is driving accelerated growth in Prime membership. For just $14.99 a month, Prime members get unlimited free shipping on 300 million items, often with same-day or one-day delivery; exclusive shopping events like Prime Day; access to a vast collection of premium programming and live sports on Prime Video; ad-free listening to 100 million songs and podcasts with Amazon Music; access to unlimited generic prescriptions for only $5 a month; unlimited grocery delivery on orders over $35 from Whole Foods Market and Amazon Fresh for $9.99 a month; a free Grubhub+ membership with free unlimited delivery; and our latest benefit, a $0.10-per-gallon fuel discount at BP, Amoco and ampm stations.
When you think about this as a whole, and also compared to many other membership services that are comparably or more expensively priced and offer just one benefit like video, Prime is a screaming deal, and we have more coming for our Prime members in 2025. We also remain squarely focused on cost to serve in our fulfillment network, which has been a meaningful driver of our increased operating income. We've talked about the regionalization of our US network. We've also recently rolled out our redesigned US inbound network. While still in its early stages, our inbound efforts have improved our placement of inventory so that even more items are closer to end customers. Ahead of Black Friday in November, we had improved the percentage of ordered units available in the ideal building by over 40% year-over-year.
We've also spent considerable time optimizing the number of items we send customers in the same package, which reduces packaging, is more convenient for customers and is less expensive for us to fulfill. And our per-unit transportation costs continue to decline as we build out and optimize our last-mile network. Overall, we've reduced our global cost to serve on a per-unit basis for the second year in a row, while at the same time increasing speed, improving safety and adding selection. As we look to 2025 and beyond, we see opportunity to reduce costs again as we further refine inventory placement, grow our same-day delivery network and accelerate robotics and automation throughout the network.
In advertising, we remain pleased with the strong growth on a very large base, generating $17.3 billion of revenue in the quarter and growing 18% year-over-year. That's a $69 billion annual revenue run rate, more than double what it was just four years ago at $29 billion. Sponsored products, the largest portion of our ad revenue, are doing well, and we see runway for even more growth. We also have a number of newer streaming offerings that are starting to become significant new revenue sources.
On the streaming video side, we wrapped up our first year of Prime Video ads, and we're quite pleased with the early progress and head into this year with momentum. We've made it easier to do full-funnel advertising with us. Full funnel runs from the top of the funnel, with broad-reach advertising that drives brand awareness; to the mid-funnel, where sponsored brands let companies specify certain keywords and audiences to attract people to their detail pages or brand store on Amazon; to the bottom of the funnel, where sponsored products help advertisers surface relevant product ads to customers at the point of purchase. We make this easy for brands to sign up for and deploy across our growing advertising offerings.
We also have differentiated audience features that leverage billions of customer signals across our stores and media destinations, from Amazon Marketing Cloud's secure clean rooms, which give advertisers the ability to analyze data, produce core marketing metrics and understand how their marketing performs across various channels, to our new multi-touch attribution model that helps advertisers understand how their marketing is working. If an advertiser uses streaming TV, display, sponsored products and other ad types in their campaign, multi-touch attribution will show the relative contribution of each to their sales.
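For context on how a multi-touch attribution model divides credit across touchpoints, here is a minimal, illustrative Python sketch. It uses a generic position-based weighting scheme; the channel names, weights and sale amount are assumptions for the example, and this is not a description of Amazon's actual attribution model.

```python
# Toy position-based multi-touch attribution: a generic illustration of
# splitting credit for a sale across ad touchpoints, not Amazon's model.

def attribute_sale(touchpoints, sale_amount, first=0.4, last=0.4):
    """Split credit for one sale across an ordered list of ad touchpoints.

    First and last touches get fixed shares; remaining credit is spread
    evenly across the middle touches. The weights are arbitrary assumptions.
    """
    n = len(touchpoints)
    if n == 0:
        return {}
    if n == 1:
        return {touchpoints[0]: sale_amount}
    if n == 2:
        half = sale_amount / 2
        return {touchpoints[0]: half, touchpoints[1]: half}
    middle_share = (1.0 - first - last) / (n - 2)
    credit = {}
    for i, channel in enumerate(touchpoints):
        weight = first if i == 0 else last if i == n - 1 else middle_share
        credit[channel] = credit.get(channel, 0.0) + weight * sale_amount
    return credit

# Example: a shopper saw a streaming TV ad, a display ad, then a
# sponsored product ad before buying a hypothetical $100 item.
print(attribute_sale(["streaming_tv", "display", "sponsored_products"], 100.0))
# -> {'streaming_tv': 40.0, 'display': 20.0, 'sponsored_products': 40.0}
```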
Moving on to AWS. In Q4, AWS grew 19% year-over-year and now has a $115 billion annualized revenue run rate. AWS is a reasonably large business by most folks' standards. And though we expect growth will be lumpy over the next few years as enterprise adoption cycles, capacity considerations and technology advancements impact timing, it's hard to overstate how optimistic we are about what lies ahead for AWS' customers and business. I spend a fair bit of time thinking several years out.
And while it may be hard for some to fathom a world where virtually every app has generative AI infused in it, with inference being a core building block just like compute, storage and database, and where most companies have their own agents that accomplish various tasks and interact with one another, this is the world we're thinking about all the time. We continue to believe that this world will mostly be built on top of the cloud, with the largest portion of it on AWS. To best help customers realize this future, you need powerful capabilities in all three layers of the stack.
At the bottom layer, for those building models, you need unique, compelling chips. Chips are the key ingredient in the compute that drives training and inference. Most AI compute has been driven by NVIDIA chips, and we obviously have a deep partnership with NVIDIA and will for as long as we can see into the future. However, there aren't that many generative AI applications at large scale yet. And when you get there, as we have with apps like Alexa and Rufus, costs can get steep quickly. Customers want better price performance, and it's why we built our own custom AI silicon.
Trainium 2 just launched at our AWS re:Invent conference in December, and EC2 instances with these chips are typically 30% to 40% more price performant than other current GPU-powered instances available. That's very compelling at scale. Several technically capable companies like Adobe, Databricks, Poolside and Qualcomm have seen impressive results in early testing of Trainium 2. It's also why you're seeing Anthropic build their future frontier models on Trainium 2.
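As a rough sketch of how a customer might provision this kind of capacity, the snippet below requests a Trainium 2 (Trn2) EC2 instance with boto3. The AMI ID, key pair and subnet are placeholders, and instance-type availability and naming should be checked against current EC2 documentation before use.

```python
# Sketch: requesting a Trainium 2 (Trn2) EC2 instance with boto3.
# The AMI ID, key pair, and subnet below are placeholders, not real values.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",    # placeholder: use a current Deep Learning AMI
    InstanceType="trn2.48xlarge",        # Trainium 2 instance type
    MinCount=1,
    MaxCount=1,
    KeyName="my-key-pair",               # placeholder key pair
    SubnetId="subnet-0123456789abcdef0", # placeholder subnet
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id}")
```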
We're collaborating with Anthropic to build Project Rainier, a cluster of Trainium 2 UltraServers containing hundreds of thousands of Trainium 2 chips. This cluster is going to have five times the number of exaflops as the cluster that Anthropic used to train their current leading set of Claude models. We're already hard at work on Trainium 3, which we expect to preview late in '25, and defining Trainium 4 thereafter. Building outstanding performing chips that deliver leading price performance has become a core strength of AWS's, starting with our Nitro and Graviton chips in our core business and now extending to Trainium in AI, and it's something unique to AWS relative to other competing cloud providers.
The other key component for model builders is services that make it easier to construct their models. I won't spend a lot of time in these comments on Amazon SageMaker AI, which has become the go-to service for AI model builders to manage their AI data, build models, experiment and deploy these models, except to say that SageMaker's HyperPod capability continues to be a differentiator. HyperPod automatically splits training workloads across many AI accelerators, prevents interruptions by periodically saving checkpoints and automatically repairing faulty instances from their last saved checkpoint, and reduces training time by up to 40%. It received several new compelling capabilities at re:Invent, including the ability to manage costs at a cluster level and prioritize which workloads should receive capacity when budgets are reached, and it is increasingly being adopted by model builders.
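To make the checkpoint-and-resume idea concrete, here is a minimal, framework-agnostic Python sketch. It is not the SageMaker HyperPod API, just an illustration of the pattern HyperPod automates across a cluster: periodically saving training state and resuming from the last saved step after a failure. The file path and "training step" are placeholders.

```python
# Illustration of the checkpoint-and-resume pattern that HyperPod automates.
# This is a toy stand-in, not the SageMaker HyperPod API.
import json
import os

CHECKPOINT_PATH = "checkpoint.json"  # hypothetical local path

def save_checkpoint(step, state):
    """Persist the current step and training state to disk."""
    with open(CHECKPOINT_PATH, "w") as f:
        json.dump({"step": step, "state": state}, f)

def load_checkpoint():
    """Return (step, state) from the last save, or a fresh start if none exists."""
    if os.path.exists(CHECKPOINT_PATH):
        with open(CHECKPOINT_PATH) as f:
            ckpt = json.load(f)
        return ckpt["step"], ckpt["state"]
    return 0, {"loss": None}

def train(total_steps=1000, checkpoint_every=100):
    start_step, state = load_checkpoint()  # resume from the last save if present
    for step in range(start_step, total_steps):
        state["loss"] = 1.0 / (step + 1)   # placeholder for a real training step
        if (step + 1) % checkpoint_every == 0:
            save_checkpoint(step + 1, state)
    return state

if __name__ == "__main__":
    # If the process dies mid-run, rerunning picks up at the last checkpoint
    # rather than restarting from step 0.
    train()
```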
At the middle layer, for those wanting to leverage frontier models to build generative AI apps, Amazon Bedrock is our fully managed service that offers the broadest choice of high-performing foundation models with the most compelling set of features that make it easy to build a high-quality generative AI application. We continue to iterate quickly on Bedrock, adding Luma AI, Poolside and over 100 other popular emerging models to Bedrock at re:Invent. In short order, we also just added DeepSeek's R1 models to Bedrock and SageMaker. Additionally, we delivered several compelling new Bedrock features at re:Invent, including prompt caching, intelligent prompt routing and model distillation, all of which help customers achieve lower cost and latency in their inference. Like SageMaker AI, Bedrock is growing quickly and resonating strongly with customers.
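As a sketch of what "fully managed" looks like in practice, the snippet below calls a Bedrock-hosted model through boto3's Converse API. The model identifier, region and prompt are assumptions for illustration; real calls require AWS credentials and model access to be enabled for the account.

```python
# Sketch: invoking a foundation model through Amazon Bedrock's Converse API.
# The model ID below is an assumed example; check the Bedrock console for the
# identifiers actually available in your account and region.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="amazon.nova-lite-v1:0",  # assumed model identifier
    messages=[
        {
            "role": "user",
            "content": [{"text": "Summarize this product description in two sentences."}],
        }
    ],
    inferenceConfig={"maxTokens": 256, "temperature": 0.2},
)

# The assistant's reply text sits inside the response's output message.
print(response["output"]["message"]["content"][0]["text"])
```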
Relatedly, we also just launched Amazon's own family of frontier models in Bedrock called Amazon Nova. These models compare favorably in intelligence against the leading models in the world, but offer lower latency and lower price, about 75% lower than other models in Bedrock, and are integrated with key Bedrock features like fine-tuning, model distillation, knowledge bases for RAG and agentic capabilities. Thousands of AWS customers are already taking advantage of the capabilities and price performance of Amazon Nova models, including Palantir, SAP, Dentsu, Fortinet and Robinhood, and we've just gotten started. At the top layer of the stack, Amazon Q is the most capable generative AI-powered assistant for software development and for leveraging your own data.
You may remember that on the last call, I shared the very practical use case where Q Transform helped save Amazon's teams $260 million and 4,500 developer years in migrating over 30,000 applications to new versions of the Java JDK. This is real value, and companies asked for more, which we obliged with our recent deliveries of Q Transformations that enable moves from Windows .NET applications to Linux and from VMware to EC2, and that accelerate mainframe migrations. Early customer testing indicates that Q can turn what was going to be a multi-year effort to do a mainframe migration into a multi-quarter effort, cutting by more than 50% the time to migrate mainframes. This is a big deal, and these transformations are good examples of practical AI.
While AI continues to be a compelling new driver in the business, we haven't lost our focus on core modernization of companies' technology infrastructure from on-premises to the cloud. We signed new AWS agreements with companies including Intuit, PayPal, Norwegian Cruise Line Holdings, Northrop Grumman, the Guardian Life Insurance Company of America, Reddit, Japan Airlines, Baker Hughes, the Hertz Corporation, Redfin, Chime Financial, Asana and many others. Consistent customer feedback from our recent AWS re:Invent gathering was appreciation that we're still inventing rapidly in non-AI key infrastructure areas like storage, compute, database and analytics.
Our functionality leadership continues to expand, and there were several key launches customers were abuzz about, including Amazon Aurora DSQL, our new serverless distributed SQL database that enables applications with the highest availability, strong consistency, PostgreSQL compatibility and four times faster reads and writes compared to other popular distributed SQL databases; Amazon S3 Tables, which make S3 the first cloud object store with fully managed support for Apache Iceberg for analytics; Amazon S3 Metadata, which automatically generates queryable metadata, simplifying data discovery, business analytics and real-time inference to help customers unlock the value of their data in S3; and the next generation of Amazon SageMaker, which brings together all the data, analytics and AI services into one interface to do analytics and AI more easily at scale.
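Of the launches above, Aurora DSQL's PostgreSQL compatibility is the easiest to make concrete: existing Postgres tooling should largely work against it. The sketch below connects with the standard psycopg driver; the endpoint, credentials, table and query are placeholders, and Aurora DSQL authentication (an IAM-generated token supplied as the password) is assumed to have been handled out of band.

```python
# Sketch: querying a PostgreSQL-compatible database such as Aurora DSQL
# with the standard psycopg driver. All connection details are placeholders;
# Aurora DSQL expects an IAM-generated auth token in place of a password.
import psycopg

conn = psycopg.connect(
    host="my-cluster.dsql.us-east-1.on.aws",  # placeholder endpoint
    dbname="postgres",
    user="admin",
    password="IAM_AUTH_TOKEN",                # placeholder: generate via the AWS SDK/CLI
    sslmode="require",
)

# Hypothetical table and filter, purely for illustration.
with conn, conn.cursor() as cur:
    cur.execute("SELECT order_id, status FROM orders WHERE status = %s", ("shipped",))
    for order_id, status in cur.fetchall():
        print(order_id, status)
```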
As 2024 comes to an end, I want to thank our teammates and partners for their meaningful impact throughout the year. It was a very successful year across almost any dimension you pick. We're far from done and look forward to delivering for customers in 2025.
With that, I'll turn it over to Brian for a financial update.