Store more for less!

Squeezed budget, yet more data to store?

0.4 zettabytes (that's 400 million TB) of data will be created today, according to analysts at Exploding Topics.

At the same time, we all know IT budgets are being squeezed for every last penny.

The demand to reduce storage complexity while improving secure and relevant access to data for staff, partners, suppliers and customers alike is challenging CIOs the world over.

How to store more data for less, while improving data access?

Over the past decade or so, many CIOs, data architects and the like have opted to store data in the cloud rather than on-premises. This has undoubtedly reduced complexity while improving access, but is it lower cost?

Hyperscalers would argue that they handle the operating expenses and provide maintenance and management of the storage, all at a low price per GB per month. However, those per-GB charges soon mount up as data sets grow into the terabytes and petabytes. And heaven forbid you want your data back: if you do, the costs climb further with restore and egress fees (there are simply a lot of little gotchas to look out for in the T&Cs).

To store more data for less in the cloud, those planning data workflows should take a simple tiered storage approach: don't just punt data off to S3, Azure Blob or a GCP Cloud Storage bucket and pay all the ongoing fees. Instead, consider offloading a higher percentage of your data to services designed for cloud cold data storage. By implementing simple data lifecycle management, your company's data will migrate to low-cost storage media, reducing ongoing costs and eliminating the excessive charges companies face when retrieving data from hyperscaler archives.

Five top tips to help reduce your cloud storage costs

1. Utilise tiered storage:

  • Move more data from hyperscaler providers to those offering cold/archival storage, such as Cloud Cold Storage, which combines lower costs with streaming data speeds.

  • Frequently accessed "hot" data should still be stored in the fastest, most expensive tier; however, CIOs should review what counts as 'frequent' access: the more data that can migrate to cold storage, the lower the bill at the end of each month. For example, significant savings can be made by reducing the time to archive from 120 days to 90, especially when you consider that data older than 60 days is rarely accessed.

  • Less frequently accessed "cold" data should be moved to cheaper, slower tiers like Cloud Cold Storage archive. The task of doing so is as simple as adding a cloud cold storage destination URL, and you are set!

  • Review the rules that automatically move data between tiers based on access patterns (e.g., data accessed less than once a month goes to a colder tier of storage); a minimal example of such a rule follows this list.

    • Cloud providers such as AWS (Intelligent-Tiering), Azure, and Google Cloud offer features for automated tiering. However, these tools are designed to keep your data on their platform: they may save you money on storage, but as your data ages they still hit you with egress and retrieval fees when you want to access it.

    • To avoid future retrieval fees, consider moving long-term data to a platform designed for exactly that. A service like Panzura can simplify the process of moving archive data.
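
As an illustration, here is a minimal sketch of an automated tiering rule using boto3, AWS's Python SDK. The bucket name and day thresholds are hypothetical placeholders; Azure and Google Cloud offer equivalent policies, and the same review logic applies when the destination is a cold-storage service instead.

```python
# A minimal sketch of an automated tiering rule, assuming boto3 is
# installed and AWS credentials are configured. Bucket name and
# thresholds below are hypothetical.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-company-data",  # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-down-ageing-data",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply to all objects
                "Transitions": [
                    # Rarely touched after 30 days: infrequent-access tier.
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    # Effectively dormant after 90 days: archive tier.
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
            }
        ]
    },
)
```

Whichever provider you use, the review cadence is the same: tightening that 90-day archive threshold is often where the savings appear.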

2. Optimise data size and format:

  • Compression:
    Another way to save on cloud storage costs is to compress data before storing it, especially infrequently accessed data. This can significantly reduce storage space requirements. The technology currently used by Cloud Cold Storage achieves a compression ratio of up to 2.5:1, though the ratio achieved depends on the specific data being stored. (A combined compression-and-deduplication sketch appears at the end of this section.)

  • File formats:
    Use efficient file formats like Parquet or ORC for structured data, as they allow for better compression and faster querying (see the example after this list).

  • Avoid many small files:
    Storing data as fewer, larger objects (e.g., in columnar formats) is generally more efficient than storing the same data in many small files, since most providers charge per request on top of per GB stored.

  • Deduplication:
    By implementing data deduplication techniques, you can eliminate vast amounts of redundant data, ensuring you only store unique data blocks (sketched alongside compression below).
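
To make the file-format point concrete, here is a small sketch, assuming pandas with pyarrow installed, that writes the same table as CSV and as compressed Parquet so you can compare sizes on your own data:

```python
# A minimal sketch comparing CSV and compressed Parquet sizes.
# The sample data is hypothetical; run it against your own tables.
import os
import pandas as pd

df = pd.DataFrame({
    "sensor_id": [1, 2, 3] * 100_000,
    "reading": [20.1, 19.8, 21.3] * 100_000,
})

df.to_csv("readings.csv", index=False)
df.to_parquet("readings.parquet", compression="snappy")

print(os.path.getsize("readings.csv"), "bytes as CSV")
print(os.path.getsize("readings.parquet"), "bytes as Parquet")
```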
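
And here is a minimal sketch of client-side compression combined with block-level deduplication. The upload_blob function is a hypothetical stand-in for whichever storage SDK you use, and the in-memory hash set would be a persistent index in a real system:

```python
# A minimal sketch of compress-and-deduplicate before upload.
# upload_blob is a hypothetical placeholder, not a real SDK call.
import gzip
import hashlib
from pathlib import Path

seen_hashes = set()  # hashes of blocks already stored

def store_file(path: Path, block_size: int = 4 * 1024 * 1024) -> None:
    """Split a file into blocks, skip duplicates, compress and upload the rest."""
    with path.open("rb") as f:
        while block := f.read(block_size):
            digest = hashlib.sha256(block).hexdigest()
            if digest in seen_hashes:
                continue  # identical block already stored: deduplicated
            seen_hashes.add(digest)
            upload_blob(key=digest, data=gzip.compress(block))

def upload_blob(key: str, data: bytes) -> None:
    # Hypothetical placeholder: swap in boto3, azure-storage-blob, etc.
    print(f"uploading block {key[:12]} ({len(data)} bytes compressed)")
```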

3. Implement lifecycle management:

  • Data Retention Policies:
    Define rules for how long data should be retained based on its business value. 

  • Automated archive:
    Set up automated processes to archive data after a specific period, especially for backups or snapshots. You can even set up automatic deletion when multiple old copies become redundant. 

  • Soft deletes:
    Consider using soft-delete features before permanent deletion to allow recovery from accidental deletions; the sketch after this list shows retention and soft-delete rules together.
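
Here is a minimal sketch of those rules using boto3; the bucket name, prefix and retention periods are hypothetical. With versioning enabled, "deleted" objects become non-current versions, giving a recovery window before they are permanently expired:

```python
# A minimal sketch of retention and soft-delete rules, assuming boto3
# and AWS credentials are configured. Names and periods are hypothetical.
import boto3

s3 = boto3.client("s3")

# Keep old object versions recoverable (a soft-delete window).
s3.put_bucket_versioning(
    Bucket="example-company-data",
    VersioningConfiguration={"Status": "Enabled"},
)

s3.put_bucket_lifecycle_configuration(
    Bucket="example-company-data",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "retention-policy",
                "Status": "Enabled",
                "Filter": {"Prefix": "backups/"},  # hypothetical prefix
                # Current copies expire when no longer needed by the business.
                "Expiration": {"Days": 365},
                # Soft-deleted (non-current) versions are kept for 30 days,
                # then permanently removed.
                "NoncurrentVersionExpiration": {"NoncurrentDays": 30},
            }
        ]
    },
)
```

Azure Blob Storage offers a comparable soft-delete setting with a configurable retention window.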

4. Monitor and analyse:

  • Track usage:
    Monitor how your data is being accessed to identify which data can be moved to cheaper tiers or deleted. 

  • Optimise transfers:
    Minimise unnecessary data transfers by using incremental backups and only transferring changed files, as sketched below.
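
A minimal sketch of that incremental approach follows, using a saved manifest of modification times to skip unchanged files. upload_file is a hypothetical placeholder for your provider's SDK call, and a real system might compare content hashes rather than timestamps:

```python
# A minimal sketch of an incremental transfer: compare each file's
# modification time against a saved manifest and upload only what changed.
import json
from pathlib import Path

MANIFEST = Path("manifest.json")

def sync(directory: Path) -> None:
    manifest = json.loads(MANIFEST.read_text()) if MANIFEST.exists() else {}
    for path in directory.rglob("*"):
        if not path.is_file():
            continue
        mtime = path.stat().st_mtime
        if manifest.get(str(path)) == mtime:
            continue  # unchanged since last sync: skip the transfer
        upload_file(path)
        manifest[str(path)] = mtime
    MANIFEST.write_text(json.dumps(manifest))

def upload_file(path: Path) -> None:
    # Hypothetical placeholder: swap in your provider's SDK call.
    print(f"uploading {path}")
```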

5. Consider object data storage:

  • For large, unstructured datasets, object storage can be a cost-effective solution.

  • Object storage allows you to store data in buckets without a strict directory structure, offering flexibility and scalability.

  • However, ensure that retrieval from object storage is efficient for your use case; the sketch below shows the basic put/get pattern to test against.
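
For instance, here is a minimal sketch of object storage's flat namespace using boto3. Bucket and key names are hypothetical, and the slashes in the key are simply part of the object's name rather than real directories:

```python
# A minimal sketch of storing and retrieving an object, assuming boto3
# and AWS credentials are configured. Names below are hypothetical.
import boto3

s3 = boto3.client("s3")

# Keys can look like paths, but the namespace is flat: the "folders"
# are just part of the object's name.
s3.put_object(
    Bucket="example-unstructured-data",
    Key="projects/2024/site-survey/notes.txt",
    Body=b"example payload",
)

# Retrieval is a single keyed GET; check that this pattern (and its
# fees) suits your workload before committing data to a given tier.
obj = s3.get_object(
    Bucket="example-unstructured-data",
    Key="projects/2024/site-survey/notes.txt",
)
data = obj["Body"].read()
```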

By implementing these strategies, you can significantly reduce your cloud storage costs while maintaining the necessary access and availability of your data.

If you would like to discuss your options or have any questions on long-term cloud storage, get in touch with our cloud storage experts. 

Set up Cloud Cold Storage