After 20 years, simplicity remains the ‘singular most important aspect’ of Amazon S3
Even in the age of AI, simplicity and ease of use remain core tenets for Amazon S3
Amazon Simple Storage Service (S3) is 20 years old, and while simplicity was, as the name suggests, the original inspiration, it has since become a sprawling platform.
But that doesn’t mean simplicity and ease of use aren’t still core tenets of the storage service, according to Andy Warfield, vice president and distinguished engineer at Amazon Web Services (AWS).
Speaking to ITPro ahead of the anniversary, Warfield said simplicity is still the “singular most important aspect” of S3. When it launched in 2006, a brief blog post outlined the core goal: “Amazon S3 is the storage for the internet”.
With a simple Representational State Transfer (REST) interface, users could PUT (store objects) and GET (retrieve them later) with relative ease. Warfield told ITPro that “parking the service” behind this architecture made it simple and easy for customers to adopt.
“At the time S3 launched in 2006, a lot of the verbs that we were using and a lot of the ways that the team approached presenting storage [were] actually really guided by the existing HTTP verbs,” he explained. “So things like the GET and PUT support.”
“It made it incredibly easy to adopt. I think one thing that kind of taught the team, and it’s been true through the lifetime of S3, is that we do best when we focus on the customer, which you hear from us all the time, but on delivering for the customer in a way that is as simple as possible to consume so folks don’t have to do extra work.”
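That verb mapping is easy to picture with a toy sketch. The following is purely illustrative Python, not AWS code: a minimal in-memory object store whose `put` and `get` methods mirror the PUT/GET semantics the original REST interface exposed.

```python
# Toy in-memory sketch of S3's PUT/GET semantics (illustration only, not
# AWS code): objects are bytes stored in named buckets, addressed by key.

class ToyObjectStore:
    def __init__(self):
        self._buckets = {}  # bucket name -> {key: bytes}

    def put(self, bucket, key, body):
        """PUT: store an object under a key, creating the bucket if needed."""
        self._buckets.setdefault(bucket, {})[key] = body

    def get(self, bucket, key):
        """GET: retrieve a previously stored object by its key."""
        return self._buckets[bucket][key]

store = ToyObjectStore()
store.put("my-bucket", "hello.txt", b"Hello, S3!")
print(store.get("my-bucket", "hello.txt"))  # b'Hello, S3!'
```

In the real service, the same two operations travel as HTTP `PUT` and `GET` requests against a bucket's URL, with authentication handled by request signing rather than the anonymous access shown here.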
An explosion of growth for Amazon S3
To say Amazon S3 has grown over the last two decades would be a gross understatement. When it launched in 2006, S3 offered a total of one petabyte of storage capacity, spread across several hundred storage nodes in three data centers, with a maximum object size of 5GB.
Compare that to 2026, and the numbers are jarring. As ITPro reported at AWS re:Invent 2025, the service now offers a maximum object size of 50TB, a ten-fold increase on the previous 5TB limit and a 10,000-fold increase on the original 5GB cap.
Additional figures detailed by the company emphasize the sheer scale of S3 compared to its early days. Today it boasts:
- 500 trillion stored objects globally
- 200 million data requests per second
- 123 Availability Zones worldwide
- 39 AWS Regions
S3’s rapid-fire growth coincided with – and benefited from – an enterprise data explosion during the late 2000s and early 2010s. Indeed, Warfield notes that in 2006, you would have been “hard pressed to find something that you wanted to put in an object that was larger than 5GB”.
But with rising camera resolutions as just one example, the rapid accumulation of enterprise data, and the formation of data lakes, demand surged. In 2026, Warfield admits the numbers are still rather mind-boggling.
“One that stands out to me is that the service processes over a quadrillion requests every year,” he said.
“We have tens of thousands of customers who, each individually, have objects that are spread over more than 10 million hard drives,” Warfield added, describing the prospect of an individual enterprise building a storage setup of that scale as a “wild thing to think about”.
Underpinning enterprise innovation
Amazon S3 quickly moved beyond being a simple object storage service. Indeed, it became the underpinning foundation and enabler of customer data innovation.
The service helped kickstart the creation and subsequent growth of data lakes, and today, more than one million of these are stored on AWS. Warfield described this as a “structural aspect of why S3 is successful”.
Enabling enterprises to host data in a single place helped break down the silos many had battled for years, creating a “shared foundation” for data.
“The data gives them this incredible flexibility and velocity to move quickly. And the data ends up being non-zero sum in terms of future value,” he said, reflecting on discussions with one particular customer.
“They often find that data they built for one system allows them to move into another opportunity.”
S3 in the age of AI
S3’s flexibility in adapting to customer demands over the last 20 years has been a key factor in its longevity and success - along with that of the company at large. AWS still boasts a large hyperscale market share alongside Google Cloud and Microsoft Azure, for example.
With the advent of generative AI raising the stakes for big tech providers, the storage service is once again evolving to accommodate skyrocketing storage demands.
Indeed, the company is heavily focused on positioning S3 as the critical foundation for AI workloads. At AWS re:Invent in December, the company officially cut the ribbon on S3 Vectors, which has generated significant excitement across the company.
When it comes to AI, vector search is used to identify similarities between specific data points. At the time, the company described vectors as a "numerical representation of unstructured data created from embedding models".
Uploading, storing, and querying vectors is costly, though, which is something this particular service aims to remedy. The company claims it can reduce costs on this front by up to 90%.
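The core operation behind vector search can be sketched in a few lines. The snippet below is a hypothetical, brute-force illustration of ranking stored embeddings by cosine similarity - not the S3 Vectors API, and the document names and embedding values are invented for the example.

```python
import math

# Minimal sketch of vector similarity search (illustrative only, not the
# S3 Vectors API). Embeddings are plain lists of floats; the "index" is a
# brute-force scan ranked by cosine similarity.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def nearest(query, vectors):
    """Return the key of the stored vector most similar to the query."""
    return max(vectors, key=lambda k: cosine_similarity(query, vectors[k]))

# Hypothetical embeddings for three documents
vectors = {
    "doc-cats": [0.9, 0.1, 0.0],
    "doc-dogs": [0.8, 0.2, 0.1],
    "doc-cars": [0.0, 0.1, 0.9],
}
print(nearest([0.1, 0.1, 0.95], vectors))  # doc-cars
```

A production vector store replaces the brute-force scan with an approximate index so queries stay fast at billions of vectors, but the similarity ranking it returns is conceptually the same.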
“The thing that we really wanted to do with S3 Vectors was to get a vector indexing service that had the simplicity and elasticity of S3, so that you could just pick it up and use it with just ten vectors, but scale to billions and trillions of vectors and pay something closer to storage costs because it was anchored on hard drives,” Warfield explained.
“We’ve seen incredible growth on it.”

Ross Kelly is ITPro's News & Analysis Editor, responsible for leading the brand's news output and in-depth reporting on the latest stories from across the business technology landscape. Ross was previously a Staff Writer, during which time he developed a keen interest in cyber security, business leadership, and emerging technologies.
He graduated from Edinburgh Napier University in 2016 with a BA (Hons) in Journalism, and joined ITPro in 2022 after four years working in technology conference research.
For news pitches, you can contact Ross at ross.kelly@futurenet.com, or on Twitter and LinkedIn.