Recovery tax: how a single recovery event can be a disaster for your TCO
This article is part of our ongoing series on the total cost of ownership (TCO) of cloud storage, examining the real-world drivers of disaster recovery costs.
When disaster strikes and your organization experiences large-scale data loss, restoring your key business systems is your first and only priority. It’s the time when you rely most on your backup and recovery infrastructure to restore essential business data and carry you back to good standing.
But even in this critical moment, your technology partners can do your organization as much harm as good. The high cost of data retrieval from hyperscale cloud storage adds an unexpected and expensive insult to your existing injury and can make an already difficult situation even more complex.
Understanding disaster recovery costs
If your disaster recovery plan relies on backing up to one of the major cloud platforms, such as Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP), the restore process can trigger up to three different types of retrieval fees, depending on the tier of service you’re using:
API request fees – You pay for the requests to read your data (like an S3 GET request), and depending on the tier, you may incur an additional API fee to retrieve your data (e.g., RestoreObject).
Data retrieval fees – If your data is stored in cold storage tiers like S3 Glacier Flexible Retrieval or Deep Archive, you’ll be charged an additional per-GB fee to access it.
Egress fees – Finally, you pay an additional charge for transferring the data out of the cloud.
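To make the request fees concrete, here is a minimal boto3 sketch of the two calls that typically generate them when restoring from a cold tier. The bucket and key names are placeholders, and the restore parameters you need depend on your storage class and how quickly you need the data back.

```python
import boto3

s3 = boto3.client("s3")

# 1) For objects in cold tiers (Glacier Flexible Retrieval or Deep Archive), you
#    first issue a RestoreObject request (a billable API call) to stage a
#    temporary, readable copy of the object.
s3.restore_object(
    Bucket="example-backup-bucket",      # placeholder bucket name
    Key="backups/db/2024-01-01.tar.gz",  # placeholder object key
    RestoreRequest={
        "Days": 7,                                 # how long the restored copy stays available
        "GlacierJobParameters": {"Tier": "Bulk"},  # Bulk is slower but the cheapest retrieval option
    },
)

# 2) Once the restore completes (hours for Bulk or Standard retrievals), each
#    download is a GET request, which is another billable call, on top of any
#    per-GB retrieval and egress charges.
obj = s3.get_object(Bucket="example-backup-bucket", Key="backups/db/2024-01-01.tar.gz")
data = obj["Body"].read()
```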
Though these fees cover what is really a single operation, taking your data out of the cloud, the provider charges you on three separate occasions. And because these costs are usage based, the more data you need to take out, the more it will cost you. The fees also vary by storage tier: the “colder” your storage, the higher the retrieval and access charges stacked on top of egress.
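To see how the three charges stack up, here is a small sketch that totals them for a single restore. The rates passed in are assumptions chosen purely for illustration, since actual pricing varies by provider, region, tier, and retrieval speed.

```python
def restore_cost(gb_retrieved, gb_egressed, requests,
                 retrieval_per_gb, egress_per_gb, request_per_1000):
    """Total the three fee types incurred by one logical restore."""
    api_fees = (requests / 1000) * request_per_1000    # GET / RestoreObject request charges
    retrieval_fees = gb_retrieved * retrieval_per_gb   # per-GB charge on cold tiers
    egress_fees = gb_egressed * egress_per_gb          # per-GB transfer out of the cloud
    return api_fees + retrieval_fees + egress_fees

# Illustrative rates only, not a quote: restoring and egressing 10 TB (10,000 GB)
# stored as 10,000 objects.
total = restore_cost(gb_retrieved=10_000, gb_egressed=10_000, requests=10_000,
                     retrieval_per_gb=0.02, egress_per_gb=0.09, request_per_1000=0.05)
print(f"~${total:,.2f}")  # roughly $1,100 for a 10 TB restore under these assumptions
```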
The cost of storing 1 PB of data in the cloud for one year
Let’s examine the cost of storing 1 PB of data for one year on AWS S3. We’ll assume this data was uploaded in a previous year, so we’ll exclude the initial upload fees it would have cost to write the data to the cloud in the first place. First, we’ll look at just the storage capacity costs. As the figure below shows, storage costs vary significantly by tier: the higher-performing S3 Standard costs nearly 6X as much as the “cold storage” S3 Glacier Instant Retrieval tier and 24X as much as S3 Glacier Deep Archive.
[Figure: annual cost of storing 1 PB on AWS S3, by storage tier]
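For a back-of-the-envelope version of that comparison, here is a quick sketch using assumed flat per-GB-month rates. Real AWS pricing is tiered by volume and varies by region, so treat the outputs as illustrative.

```python
# Assumed flat dollars-per-GB-month rates for illustration; real AWS pricing is
# tiered by volume and varies by region.
RATES_PER_GB_MONTH = {
    "S3 Standard": 0.023,
    "S3 Glacier Instant Retrieval": 0.004,
    "S3 Glacier Flexible Retrieval": 0.0036,
    "S3 Glacier Deep Archive": 0.00099,
}

PETABYTE_GB = 1_000_000  # 1 PB in decimal GB, the way capacity is typically billed

for tier, rate in RATES_PER_GB_MONTH.items():
    annual = rate * PETABYTE_GB * 12
    print(f"{tier}: ~${annual:,.0f} per year")

# Under these assumptions, S3 Standard lands near $276,000 per year while Deep
# Archive lands near $12,000 per year, roughly the 24X gap described above.
```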
From this perspective, the AWS Glacier tiers appear to be the most cost-effective choice for storing data that is infrequently or rarely accessed. However, a single recovery event will dramatically and negatively impact your total cost of ownership. Let’s explore why this is the case.
Egress fees: the hidden trap of hyperscale clouds
Imagine an unplanned event where you need to access 50% of your stored data (500 TB) and egress 20% (200 TB) back on-premises. Every AWS tier carries egress fees, but the “cold storage” tiers add per-GB retrieval and access charges on top of them. For businesses using those tiers, retrieving a large dataset can cost thousands of dollars before they’ve transferred a single byte out of the cloud. Indeed, the data retrieval and data access charges imposed by the AWS Glacier tiers can account for anywhere from 50% to 90% of your total cost of ownership!
Of all the fees associated with cloud storage, egress fees are arguably the most egregious. Egress fees are what hyperscale cloud providers like AWS, Azure, and Google Cloud charge when you download stored data and land it somewhere outside of their environment. The cost varies by provider and region, but a common rate is $0.09 per gigabyte—which may seem small until you scale up. Transferring just 10 TB of data can cost nearly $900, and at enterprise scale, costs can quickly spiral into tens or even hundreds of thousands of dollars.
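For a rough sense of scale, here is the same back-of-the-envelope math applied to the recovery scenario above (500 TB retrieved, 200 TB egressed), using assumed list-style rates. Actual totals, like those in the chart that follows, depend on retrieval speed, object counts, and regional pricing.

```python
GB_PER_TB = 1_000  # decimal units, as storage is typically billed

retrieved_tb = 500  # 50% of 1 PB staged out of a cold tier
egressed_tb = 200   # 20% of 1 PB transferred back on-premises

retrieval_per_gb = 0.02  # assumed Deep Archive standard-retrieval rate
egress_per_gb = 0.09     # assumed flat transfer-out rate

retrieval_fees = retrieved_tb * GB_PER_TB * retrieval_per_gb  # about $10,000
egress_fees = egressed_tb * GB_PER_TB * egress_per_gb         # about $18,000

print(f"Retrieval: ~${retrieval_fees:,.0f}  Egress: ~${egress_fees:,.0f}")
# Per-request charges, retrieval-speed upgrades, and tiered egress discounts are
# ignored here; they shift the totals but not the overall pattern.
```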
[Figure: total annual cost of 1 PB on AWS S3 by tier, including data access, retrieval, and egress fees]
The chart above adds fees for data access, retrieval, and egress on top of the storage fees for each tier. Now, we see their impact on TCO. At the Glacier Deep Archive tier, while the storage cost is just $12,000, the full cost of storing and retrieving is $142,000. That’s higher than the total cost of the “warmer” Glacier Instant Retrieval tier.
This is why AWS Glacier tiers are typically used for archive storage, not for backups. But as we stated in Rethinking Cold Storage Tiers: Your Data is Hotter than You Think, even archived data is being accessed more frequently than originally planned, so it’s important to carefully consider the likelihood of unplanned scenarios where you may need to access that “cold” data before you decide which tier to store it in.
The Wasabi difference in disaster recovery costs
In a disaster recovery scenario, your cloud storage provider should be your ally, not another hurdle. Wasabi offers a predictably priced alternative to the hyperscalers that keeps disaster recovery costs within a consistent, reasonable range. Wasabi charges no egress, data access, or retrieval fees, so businesses can recover their data at no additional cost. Regardless of your use case, we at Wasabi believe you shouldn’t have to pay extra to access your own data, especially when you need it most.