AWS opens physical sites for fast data uploads – but it could cost you up to $500 an hour
AWS Data Transfer Terminal sites are already open in Los Angeles and New York
Amazon Web Services (AWS) customers will soon be able to book time slots at physical locations to connect their storage devices and upload data to the cloud.
Data can be uploaded to any AWS endpoint, including Amazon S3, Amazon Elastic File System (Amazon EFS), or others, using a high-throughput connection of up to 400Gbps.
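For customers targeting Amazon S3, the work done at a terminal is essentially a standard S3 transfer over an unusually fast local link. As a rough illustration only (the bucket name, file path, and tuning values below are placeholders, and AWS has not prescribed a particular client setup for the terminals), a Python upload using boto3's managed multipart transfer might look like this:

    import boto3
    from boto3.s3.transfer import TransferConfig

    # Placeholder bucket, key, and device path; larger parts and more
    # concurrency help saturate a high-bandwidth link instead of uploading serially.
    s3 = boto3.client("s3")
    config = TransferConfig(
        multipart_threshold=64 * 1024 * 1024,   # switch to multipart above 64 MB
        multipart_chunksize=256 * 1024 * 1024,  # 256 MB parts
        max_concurrency=32,                     # parallel part uploads
    )
    s3.upload_file("/mnt/device/dataset.tar", "example-bucket",
                   "raw/dataset.tar", Config=config)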
Each AWS Data Transfer Terminal will house a patch panel, fiber optic cabling, and a PC for monitoring data transfer jobs, the hyperscaler revealed. Terminals will be charged by the hour, with no per-GB fee for the transfer as long as the data stays within the same continent.
The first two Data Transfer Terminals are already up and running in Los Angeles and New York, with plans to launch more globally.
So far, the company has only listed charges for two routes: US-to-US transfers at $300 per hour and US-to-EU transfers at $500 per hour.
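Those hourly rates look steep in isolation, but a back-of-the-envelope calculation (assuming, hypothetically, that the full 400Gbps link could be sustained for the whole hour) puts the per-terabyte cost in context:

    # Hypothetical best-case arithmetic; real-world throughput will be lower.
    link_gbps = 400
    rate_per_hour_usd = 300                      # US-to-US terminal rate
    tb_per_hour = link_gbps / 8 * 3600 / 1000    # 50 GB/s * 3600 s = 180 TB
    print(tb_per_hour, rate_per_hour_usd / tb_per_hour)  # 180.0 TB, ~$1.67 per TB

Actual throughput will depend on the customer's storage devices and data layout, so the real per-terabyte figure will be higher.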
"On your reserved date and time, visit the location and confirm access with the building reception. You’re escorted by building staff to the floor and your reserved room of the Data Transfer Terminal location," said Channy Yun, principal developer advocate for AWS Cloud.
"Don’t be surprised if there are no AWS signs in the building or room. This is for security reasons to keep your work location as secret as possible."
The company said the terminals will significantly cut the time it takes to upload large amounts of data, meaning ingested data can be processed within minutes.
Customers can then analyze large datasets using Amazon Athena, train and run machine learning models with ingested data using SageMaker, or build scalable applications using Amazon EC2.
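As a hedged sketch of that downstream step (the database, table, and output location are illustrative placeholders, not part of the AWS announcement), kicking off an Athena query over freshly ingested data from Python could look like this:

    import boto3

    athena = boto3.client("athena")
    resp = athena.start_query_execution(
        QueryString="SELECT sensor_id, COUNT(*) FROM sensor_events GROUP BY sensor_id",
        QueryExecutionContext={"Database": "example_db"},  # placeholder database
        ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
    )
    print(resp["QueryExecutionId"])  # poll get_query_execution for completion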
"After the data is uploaded to AWS, you can use the extensive suite of AWS services to generate value from your data and accelerate innovation," Yun commented.
"You can also bring your AWS Snowball devices to the location for upload and retain the device for continued use and not rely on traditional shipping methods."
Suggested use cases include video production data for processing in the media and entertainment industry, training data for Advanced Driver Assistance Systems (ADAS) in the automotive industry, or migrating legacy data in the financial services industry.
The hyperscaler also noted the service could help support uploads of equipment sensor data in the industrial and agricultural sectors.
"You can upload large datasets from fleets of vehicles operating and collecting data in metro areas for training machine learning models, digital audio and video files from content creators for media processing workloads, and mapping or imagery data from local government organizations for geographic analysis," said Yun.
Emma Woollacott is a freelance journalist writing for publications including the BBC, Private Eye, Forbes, Raconteur and specialist technology titles.
