Object Storage Assumptions

These instructions will help you provide the key assumptions we need to perform an analysis of Object Storage costs for your use case.

Number of Objects

Please enter your best estimate for the number of objects stored. You can get this information from your cloud provider's console or via script.

AWS

Get the number of objects from the AWS S3 Storage Lens console:

  1. Access the S3 Storage Lens console.

  2. Click one of the available S3 Storage Lens dashboards in the table (there should be a default one).

  3. Once the dashboard is open, you will see the total object count under Overview.

  4. Convert the object count to an integer (e.g.: 850K ➡️ 850,000) and enter it into the number of objects field in the Taloflow form.

To access S3 Storage Lens dashboards, you must use an IAM user and not a root account. Your AWS administrator must update your IAM permissions to allow the s3:ListStorageLensConfigurations action.

Other ways to get the number of S3 objects are covered in this post.
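If you prefer to do step 4's conversion in a terminal, here is a minimal sketch. The to_count helper is our own illustration (not part of AWS tooling), and it assumes the Storage Lens display value uses a K, M, or B suffix:

```shell
# Convert a display value like "850K" or "1.2M" into a plain integer.
# The helper name and suffix handling are hypothetical conventions for
# this example; verify the suffixes your console actually shows.
to_count() {
  awk -v v="$1" 'BEGIN {
    mult = 1
    if (v ~ /[Kk]$/)      mult = 1000
    else if (v ~ /[Mm]$/) mult = 1000000
    else if (v ~ /[Bb]$/) mult = 1000000000
    sub(/[KkMmBb]$/, "", v)
    printf "%d\n", v * mult
  }'
}

to_count 850K    # prints 850000
```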

Google Cloud Platform

Get the number of objects by running one of the scripts below in your terminal or in Google Cloud Shell:

  • gsutil du | wc -l will list every object (i.e.: file) in the default project

  • gsutil du -p <PROJECT_ID> | wc -l will list the objects for a specific project (replace <PROJECT_ID> with the relevant project ID).

If you have more than one project to include in the analysis, please run gsutil du -p <PROJECT_ID> | wc -l for each project and sum the results.
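The multi-project case can be scripted roughly as follows. The project IDs are placeholders, and the gsutil line is commented out so the totalling logic can be seen on its own; with gsutil installed and real project IDs, uncomment it:

```shell
# Read one object count per line on stdin and print the grand total.
sum_counts() {
  awk '{ total += $1 } END { print total + 0 }'
}

# Real usage (requires gsutil and your own project IDs):
# for p in project-a project-b; do gsutil du -p "$p" | wc -l; done | sum_counts

# Demo with hypothetical per-project counts:
printf '120000\n45000\n3500\n' | sum_counts
```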

Microsoft Azure

  1. Log into the Azure Portal.

  2. Select Storage Accounts under the services list.

  3. Select a storage account.

  4. In the left panel, under the Monitoring group, click on Metrics.

  5. Set the Scope to the storage account, Metric Namespace to Blob, Metric to Blob Count, and Aggregation to Avg (see #1 in the screenshot below).

  6. The Blob Count (i.e.: object count) will appear as a ticker at the bottom of the chart (see #2 in the screenshot below).

  7. Convert the object count to an integer (e.g.: 850K ➡️ 850,000) and enter it into the number of objects field in the Taloflow form.

Blob count is equal to the number of blob objects in the storage account.

Azure does not currently support multiple storage accounts in a single metric view, so you will need to repeat the steps above for each storage account and total the Blob Counts. (Azure is working on this.)
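The per-account totalling can be sketched like this. The account counts are hypothetical readings from the Metrics chart; the commented az command is one possible CLI route to the same number, but the exact resource path and metric name should be verified against your subscription:

```shell
# Hypothetical Blob Count readings, one per storage account.
# With the Azure CLI, each reading might instead come from something like:
#   az monitor metrics list --resource <storage-account-id> \
#     --metric BlobCount --aggregation Average
counts="1250000 830000 41000"

# Total the per-account readings.
total=0
for c in $counts; do
  total=$((total + c))
done
echo "$total"
```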

Data Transfer Estimates

Taloflow already captures the INTER-Regional GB Transfer IN and OUT from the uploaded cost reports. Do not include that data transfer when estimating the INTRA-Regional GB Transfer IN and OUT.

Intra-Regional GB Transfer IN (Monthly)

Entering data in this field is optional, but it may produce a more accurate analysis: a storage migration can introduce additional ongoing transfer costs between services on your cloud provider of origin and the new storage provider.

Please provide the monthly GB volume of data transfer that occurs when other services (like a virtual machine) read or write data to a storage bucket in the same region (i.e.: intra-region). This back-and-forth is not captured in your current usage reports because intra-regional transfer is free of charge.

One way to estimate this traffic is to take your average file size in storage and multiply it by the number of times your services are likely to read or write data to object storage each month.
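As a back-of-envelope version of that estimate, with hypothetical placeholder figures for average object size and monthly operations:

```shell
# Hypothetical inputs; substitute your own figures.
avg_mb=25            # average object size in MB
ops_per_month=40000  # estimated monthly reads + writes against storage

# Monthly intra-regional transfer in GB (1024 MB per GB).
gb=$(awk -v mb="$avg_mb" -v ops="$ops_per_month" \
  'BEGIN { printf "%.1f\n", mb * ops / 1024 }')
echo "$gb GB/month"
```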

Intra-Regional GB Transfer OUT (Monthly)

Please provide your best estimate of the monthly GB volume of data transfer from storage containers or buckets into applications in the same region (a.k.a. intra-region/VPC traffic). Turning on VPC flow logging might help you get a more accurate estimate.
