Andy Jassy
President and Chief Executive Officer at Amazon.com
Thank you, Dave. Good afternoon, everyone, and thanks for joining us. Today, we are reporting $134.4 billion in revenue and $7.7 billion in operating income, both of which exceeded the top end of our guidance ranges. We're encouraged by the progress we're making on several key priorities, namely: lowering our cost to serve in our stores business; continuing to innovate on and improve our various customer experiences; and building new customer experiences that can meaningfully change what's possible for customers in our business long term. I'll start with our ongoing effort to lower our cost to serve in our stores' fulfillment network. Q2 saw another meaningful improvement in this area as we have steadily made progress the last several quarters. Central to our efforts has been the decision to transition our stores' fulfillment and transportation network from one national network in the United States to a series of eight separate regions serving smaller geographic areas. We keep a broad selection of inventory in each region, making it faster and less expensive to get those products to customers.
Regionalization is working and has delivered a 20% reduction in the number of touches per delivered package, a 19% reduction in miles traveled to deliver packages to customers and a more than 1,000 basis point increase in deliveries fulfilled within region, which is now at 76%. This is a lot of progress. Sometimes I hear people argue that Amazon is chasing faster delivery speed at the expense of higher costs, and that speed doesn't matter much to customers. This argument is incorrect. There are two things to note. First, customers care a lot about faster delivery. We have a lot of data showing that when we make faster delivery promises on a detail page, customers purchase more often, and not just a little more often, meaningfully more often. It's also true that when customers know they can get their items really quickly, it changes their consideration of using us for future purchases, too. Second, when shipments come from fulfillment centers that are closer to customers, they travel shorter distances, which costs less in transportation, gets the package there faster and is better for the environment. There's a lot of goodness in that equation.
This ability to have shipments closer to customers is the result of a lot of work and invention on the regionalization side, placement logic and local in-stock algorithms. It's also driven by our development and expansion of same-day fulfillment facilities, which is our fastest fulfillment mechanism and one of our least expensive, too. Our same-day facilities are located in the largest metro areas around the U.S., house our top-moving 100,000 SKUs but also cover millions of other SKUs from nearby fulfillment centers that inject selection into these same-day facilities, and have a design that streamlines getting items from order to being ready for delivery in as little as 11 minutes. The experience has been so positive for customers and our business that we're planning to double the number of these facilities. We believe that we are far from the law of diminishing returns in improving speed for customers. While we're seeing strong early results from this regionalization effort, we still see several ways in which we can be more efficient in this structure, and we believe productivity will improve further.
We've also reevaluated virtually every part of our fulfillment network this past year and see additional structural changes we can make that provide future upside. We're excited about this cost to serve improvement, but also remain maniacally focused on making customers' lives easier and better every day and relentlessly inventing to make it so. This means constantly trying to improve the experiences that we can deliver to customers, short and long term. This customer experience work is at the heart of what we do every day across every one of our businesses, and I could spend an hour on this call detailing various examples across the teams. For today, I'll just focus a bit on our stores and AWS businesses. For stores, our priorities continue to be providing customers with great selection, low prices and convenience. And as we've discussed, we've been especially focused on providing even faster delivery speeds. Our speed of delivery has never been faster. In this last quarter, across the top 60 largest U.S. metro areas, more than half of Prime members' orders arrived the same day or next day.
So far this year, we've delivered more than 1.8 billion units to U.S. Prime members the same or next day, nearly four times what we had delivered at those speeds by this point in 2019. Lowering our cost to serve allows us not only to invest in these speed improvements but also to add more selection at lower price points. In particular, we're growing our selection in everyday essentials, enabling customers to avoid going out to get these items, which increases both our basket sizes and the frequency with which customers choose to shop with us. We now have more than 300 million items available with U.S. Prime free shipping, including tens of millions of items with free same-day and one-day delivery. We're continuing to focus on providing great value, with tens of millions of deals that help customers stretch their dollars further. For instance, in Q2 of 2023, we offered customers 144% more deals and coupons than we did in Q2 of 2022. Prime Day was similar. Amazon offered more deals than any past Prime Day event, with a wide selection across millions of products.
Prime members purchased more than 375 million items worldwide and saved more than $2.5 billion across the Amazon store, helping make it the biggest Prime Day ever. Next, a few words about AWS. AWS remains the clear cloud infrastructure leader, with a significant leadership position with respect to number of customers, size of partner ecosystem and breadth of functionality, and the strongest operational performance. These are important factors in why AWS has grown the way it has over the last several years and why AWS has nearly double the revenue of any other provider. I've talked to many AWS customers over the years and continue to do so. And while all the factors I mentioned have been big drivers of the business' success, AWS customers tell us that, just as importantly, they care about the very different customer focus and orientation they get with AWS versus what they see elsewhere. As the economy has been uncertain over the last year, AWS customers have needed assistance cost optimizing to withstand this challenging time and to reallocate spend to newer initiatives that better drive growth. We've proactively helped customers do this.
And while customers have continued to optimize during the second quarter, we've started seeing more customers shift their focus towards driving innovation and bringing new workloads to the cloud. As a result, we've seen AWS' revenue growth rate stabilize during Q2, where we reported 12% year-over-year growth. The AWS team continues to innovate and change what's possible for customers at a rapid clip. You can see this across the array of AWS product categories, where AWS leads in compute, networking, storage, database, data solutions and machine learning, among other areas, and the continued invention and delivery in these areas is pretty unusual. For instance, a few years ago, we heard consistently from customers that they wanted more price-performant ways to do generalized compute. To enable that, we realized we needed to rethink things all the way down to the silicon and set out to design our own general purpose CPU chips.
Today, more than 50,000 customers use AWS' Graviton chips in AWS compute instances, including 98 of our top 100 Amazon EC2 customers, and these chips have about 40% better price performance than other leading x86 processors. The same sort of reimagining is happening in generative AI right now. Generative AI has captured people's imagination, but most people are talking about the application layer, specifically what OpenAI has done with ChatGPT. It's important to remember that we're in the very early days of the adoption and success of generative AI, and that consumer applications are only one layer of the opportunity. We think of large language models and generative AI as having three key layers, all of which are very large in our opinion and all of which AWS is investing heavily in. At the lowest layer is the compute required to train foundational models and do inference or make predictions.
Customers are excited to use Amazon EC2 P5 instances, powered by NVIDIA H100 GPUs, to train large models and develop generative AI applications. However, to date, there's only been one viable option in the market for everybody, and supply has been scarce. That, along with the chip expertise we've built over the last several years, prompted us to start working several years ago on our own custom AI chips for training, called Trainium, and inference, called Inferentia, which are on their second versions already and are a very appealing price performance option for customers building and running large language models. We're optimistic that a lot of large language model training and inference will be run on AWS' Trainium and Inferentia chips in the future. We think of the middle layer as being large language models as a service. Stepping back for a second, it takes billions of dollars and multiple years to develop these large language models.
Most companies tell us that they don't want to consume that resource building them themselves. Rather, they want access to those large language models, want to customize them with their own data without leaking their proprietary data into the general model, have all the security, privacy and platform features in AWS work with this new, enhanced model and then have it all wrapped in a managed service. This is what our service Bedrock does: it offers customers all of these capabilities with not just one large language model but with access to models from multiple leading large language model companies like Anthropic, Stability AI, AI21 Labs and Cohere, as well as Amazon's own large language models, called Titan. Customers, including Bridgewater Associates, Coda, Lonely Planet, Omnicom, 3M, Ryanair, Showpad and Travelers, are using Amazon Bedrock to create generative AI applications.
And we just recently announced new capabilities for Bedrock, including new models from Cohere, Anthropic's Claude 2 and Stability AI's Stable Diffusion XL 1.0, as well as Agents for Amazon Bedrock, which allow customers to create conversational agents that deliver personalized, up-to-date answers based on their proprietary data and execute actions. If you think about these first two layers I've talked about, what we're doing is democratizing access to generative AI: lowering the cost of training and running models, enabling access to the large language model of choice instead of there only being one option, and making it simpler for companies of all sizes and levels of technical acumen to customize their own large language model and build generative AI applications in a secure and enterprise-grade fashion. These are all part of making generative AI accessible to everybody and very much what AWS has been doing for technology infrastructure over the last 17 years.
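To give a concrete sense of how simple consuming one of these hosted models through Bedrock can be, here is a minimal sketch of a single API call, assuming the boto3 bedrock-runtime client and Anthropic's Claude 2 model identifier; the region, model ID, prompt format and parameter values shown are illustrative assumptions rather than confirmed details of the service.

```python
# Minimal sketch: invoking a hosted model through Amazon Bedrock with boto3.
# Assumes Bedrock access is enabled for the account and that the model ID below
# is available in the chosen region; prompt text and parameters are illustrative.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="anthropic.claude-v2",  # assumed identifier for Anthropic's Claude 2
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "prompt": "\n\nHuman: Summarize the benefits of regional fulfillment in two sentences.\n\nAssistant:",
        "max_tokens_to_sample": 300,
        "temperature": 0.5,
    }),
)

# The response body is a stream; read and parse it to get the model's completion.
print(json.loads(response["body"].read())["completion"])
```

The same call shape would apply to the other model providers available through Bedrock, with the model identifier and request body varying by provider.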
Then that top layer is where a lot of the publicity and attention have focused, and these are the actual applications that run on top of these large language models. As I mentioned, ChatGPT is an example. We believe one of the early compelling generative AI applications is a coding companion. It's why we built Amazon CodeWhisperer, an AI-powered coding companion, which recommends code snippets directly in the code editor, accelerating developer productivity as they code. It's off to a very strong start and changes the game with respect to developer productivity. Inside Amazon, every one of our teams is working on building generative AI applications that reinvent and enhance their customers' experience.
But while we will build a number of these applications ourselves, most will be built by other companies, and we're optimistic that the largest number of these will be built on AWS. Remember, the core of AI is data. People want to bring generative AI models to the data, not the other way around. AWS not only has the broadest array of storage, database, analytics and data management services for customers, it also has more customers and more data stored than anybody else. Coupled with providing customers with unmatched choices at these three layers of the generative AI stack, as well as Bedrock's enterprise-grade security that's required for enterprises to feel comfortable putting generative AI applications into production, we think AWS is poised to be customers' long-term partner of choice in generative AI. We're also continuing to make meaningful progress in building new customer experiences that can change what's possible for customers in our business long term.
Amazon Business is one of our fastest-growing offerings, with a $35 billion annual gross sales run rate, and the team is working hard to further build out the selection, value, convenience and features that business customers need. Buy with Prime is continuing to show a lot of progress. Merchants in early trials who used Buy with Prime saw their shopper conversion increase by 25% on average, which makes a real difference to their business. Also, merchants who participated in Prime Day activities, in aggregate, experienced a 10 times increase in daily Buy with Prime orders during the sales event period versus the month before we announced Prime Day. It's frankly only been a short amount of time since we decided to invest significantly in the health care market segment. A lot of what we tried before were smaller experiments.
But we're pleased with Amazon Pharmacy doubling its active customers in the past year, and we're pleased with the response to RxPass, which enables Prime members to receive all of their eligible generic medications for just $5 a month and have them delivered free to their door. One Medical has been part of Amazon for just a few months now, and we're encouraged by what we're seeing there, too. Our grocery business continues to grow. We already have a very large business in non-temperature-controlled areas like consumables, pet food, beauty and canned goods that continues to grow as we keep increasing speed and lowering our cost to serve, which allows us to sell more items more cost effectively. Whole Foods continues to lead the organic grocery space, is growing at a healthy clip and has meaningfully improved its profitability in the last year. We're pleased with what we're seeing with Whole Foods.
And as I've shared before, we're working on new formats in our mass physical store offering, Amazon Fresh, having significantly improved a number of the key business inputs and just rolled out new concepts in stores. We also see substantial innovation and progress in other areas like Kuiper, Zoox and Alexa. We're still relatively early in many of these investments, with technology inventions that are changing what's possible to deliver for customers in these areas, but they're big long-term opportunities that we remain optimistic about. Finally, I want to recognize our teams for being named number one in LinkedIn's Top Companies to Grow Your Career in the United States. It's a testament to our work to be a great employer, with leading compensation, benefits and upskilling opportunities.
With that, I'll turn it over to Brian.