Sub Account Filter

If you are integrating at only the Sub Account level, you will also need to go through one additional step to add a cost report filter, detailed in these docs. On top of our basic setup, you deploy an AWS Lambda function, which we call the Taloflow Cost Report Filter, that filters and copies the entries belonging to a specific Usage Account ID.

How it works

Tim can get the correct data for any specific Account ID (e.g., a sub account) from the Cost Report as follows:

  1. When the Cost Report is placed into a Billing S3 bucket, an S3 event is triggered.

  2. This event triggers an AWS Lambda function that runs in your account, filters the Cost Report, and places the result into another S3 bucket that Tim can access.

With this configuration, Tim only has access to the S3 bucket containing the filtered information and can never see the full Cost Report. You retain explicit control over the function's code: you can fully audit it and adjust what is and isn't filtered to your liking.
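For illustration, here is a minimal sketch, in Python, of what a filter function like this might look like. The environment variable names, bucket names, and the substring-matching approach are assumptions for the example; the actual Taloflow package ships its own implementation and configuration.

```python
import gzip
import os
from urllib.parse import unquote_plus

import boto3

s3 = boto3.client("s3")

# Assumed configuration; the real function may define these differently.
DEST_BUCKET = os.environ["DEST_BUCKET"]
USAGE_ACCOUNT_IDS = os.environ["USAGE_ACCOUNT_IDS"].split(";")

def handler(event, context):
    """Triggered by object-create events on the billing S3 bucket."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = unquote_plus(record["s3"]["object"]["key"])

        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        lines = gzip.decompress(body).decode("utf-8").splitlines()

        # Keep the CSV header plus every line that mentions one of the
        # configured Usage Account IDs.
        filtered = [lines[0]] + [
            line
            for line in lines[1:]
            if any(account_id in line for account_id in USAGE_ACCOUNT_IDS)
        ]

        payload = gzip.compress("\n".join(filtered).encode("utf-8"))
        s3.put_object(Bucket=DEST_BUCKET, Key=key, Body=payload)
```

In a sketch like this, the filtered copy keeps the same object key, so the layout Tim reads from the destination bucket mirrors the original report layout.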

Limitations

The Taloflow Cost Report Filter function currently has some minor limitations:

  • You cannot filter on the Payer Account ID. Why? The function's logic keeps every line that contains the given Account ID, and since every line of the report also contains the Payer Account ID, filtering on it would simply reproduce the full Cost Report (see the illustration below).
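A tiny illustration of the point, using made-up account IDs: because the payer ID appears in every line, a substring filter on it matches the whole report.

```python
# Hypothetical CUR lines: payer account first, usage account second.
lines = [
    "111111111111,222222222222,AmazonEC2,...",
    "111111111111,333333333333,AmazonS3,...",
]

# Filtering on a Usage Account ID isolates that account's lines...
assert [l for l in lines if "222222222222" in l] == lines[:1]

# ...but filtering on the Payer Account ID (111111111111) matches
# every line, so the "filtered" report is the full report again.
assert [l for l in lines if "111111111111" in l] == lines
```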

Prerequisites

  • AWS CLI

  • AWS SAM CLI

  • Make

  • Have Cost and Usage Reports turned on at the Root/Master Account level, with the reports delivered to an S3 bucket

  • The Cost and Usage Report parameters should be set to Hourly, GZIP, Resource IDs and Create New Report Version.

  • Please note down the Name of the Cost Report and the Report Path Prefix at the master account level, as you'll need them later.

  • Run the Lambda stack using the Master Account

  • AWS Canonical ID of the Master Account


Integration Steps

Step 1. Create a new S3 Bucket on the Sub Account

  • Go to the S3 Console and create a new S3 bucket on the Sub Account.

  • Click Create bucket.

Please note down the S3 Bucket name for the bucket you created on the Sub Account.
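If you prefer to script this step, a minimal boto3 equivalent follows; the bucket name and region are placeholders.

```python
import boto3

s3 = boto3.client("s3", region_name="us-east-1")

# Placeholder name; note it down, as later steps need it.
s3.create_bucket(Bucket="my-filtered-cur-bucket")

# In any region other than us-east-1, a location constraint is required:
# s3.create_bucket(
#     Bucket="my-filtered-cur-bucket",
#     CreateBucketConfiguration={"LocationConstraint": "us-west-2"},
# )
```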

Step 2. Give the Master Account access to the newly created bucket

  • Go to the newly created bucket in your Sub Account.

  • Click Permissions.

  • Click Access Control List (ACL).

  • Click Add account.

  • Enter the AWS Canonical ID of the Master Account and check off "List objects", "Write objects", "Read bucket permissions" and "Write bucket permissions".

  • Click Save.
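The same four grants can be applied with boto3; the shorthand arguments below map one-to-one to the console checkboxes (the canonical ID and bucket name are placeholders).

```python
import boto3

s3 = boto3.client("s3")

# Placeholder: the AWS Canonical ID of the Master Account.
GRANTEE = 'id="79a59df900b949e55d96a1e698fbacedfd6e09d98eacf8f8d5218e7cd47ef2be"'

s3.put_bucket_acl(
    Bucket="my-filtered-cur-bucket",  # the bucket created in Step 1
    GrantRead=GRANTEE,       # "List objects"
    GrantWrite=GRANTEE,      # "Write objects"
    GrantReadACP=GRANTEE,    # "Read bucket permissions"
    GrantWriteACP=GRANTEE,   # "Write bucket permissions"
)
```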

Step 3. Deploy and Set Up the Cost Report Filter Function using the Master Account

The filter function should be created in the Master Account. To begin the deployment of the AWS Lambda Cost Report Filter, click here to download the package and open the .zip file. Deploy the project using AWS SAM.

A. Update Configurations

  • Update the following variables in template.yml.

| Variable | Description |
| --- | --- |
| { REPLACE_SOURCE_BUCKET_NAME } | Source bucket |
| { REPLACE_DEST_BUCKET_NAME } | Destination bucket |
| { REPLACE_USAGE_ACCOUNT_IDS } | ID(s) of the usage account(s), separated by semicolons (;) |

  • Update the following variables in Makefile.

| Variable | Description |
| --- | --- |
| { REPLACE_LAMBDA_BUCKET_NAME } | Bucket that will receive the Lambda package (this bucket must be in the same region as the source bucket) |
| { REPLACE_LAMBDA_BUCKET_REGION } | Region of the Lambda package (must be the same region as the source bucket, e.g. us-east-1) |

B. Deployment

  • Run the following command.

    make package deploy

A new function should now appear. Next, we need to add triggers that listen for S3 events.

C. Adding S3 Trigger to Lambda

  • Log in to the AWS Console

  • Search and select Lambda

  • Under Designer > Add Trigger, scroll down and click S3.

  • Under Configure triggers, select the following:

| Field | Input |
| --- | --- |
| Bucket | Source bucket |
| Event Type | All object create events |
| Suffix | csv.gz |

  • Click Add to save this trigger. You then need to add a second trigger:

  • Under Designer > Add Trigger, scroll down and click S3.

  • Under Configure triggers, select the following:

| Field | Input |
| --- | --- |
| Bucket | Source bucket |
| Event Type | All object create events |
| Suffix | Manifest.json |

  • Click Add to save the trigger.

  • Click Save at the top right to confirm the settings.
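Both triggers can equivalently be attached in a single boto3 call; the function ARN and bucket name below are placeholders. (The console grants S3 permission to invoke the function automatically; when scripting, you would add that yourself with the Lambda add-permission API.)

```python
import boto3

s3 = boto3.client("s3")

# Placeholder ARN for the deployed Cost Report Filter function.
FUNCTION_ARN = "arn:aws:lambda:us-east-1:111111111111:function:cost-report-filter"

def trigger(suffix):
    """One object-create notification, filtered on a key suffix."""
    return {
        "LambdaFunctionArn": FUNCTION_ARN,
        "Events": ["s3:ObjectCreated:*"],
        "Filter": {"Key": {"FilterRules": [{"Name": "suffix", "Value": suffix}]}},
    }

s3.put_bucket_notification_configuration(
    Bucket="my-billing-bucket",  # the source bucket receiving the Cost Report
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            trigger("csv.gz"),         # the report files themselves
            trigger("Manifest.json"),  # the manifest written alongside each report
        ]
    },
)
```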

Step 4: Run CloudFormation Template on the Sub Account

The following CloudFormation template has to be run on the Sub Account.

  • In the console, keep both pre-selected options, Template is ready and Amazon S3 URL, leave the URL as is, then click Next

  • Have ready the name of the S3 Bucket with the Cost Report for the Sub Account, the AWS Region where that bucket is located, and the Report Name and Report Path Prefix for the Cost Report in the Root or Master Account.

  • For the External ID field, you can use almost any External ID (e.g., tim-ext-id).

The External ID cannot have the following characters: $, #

  • Click Next

  • On the following page scroll down and click Next again

  • On the following page scroll down, acknowledge that this template might create IAM resources by checking the box, then click Create Stack.

  • On the next page, you will have to wait 2-3 minutes for the stack to be created. You can click the refresh icon in the Console to check progress.

  • When all is green, you are ready to go, apart from one final optional step.
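For teams that script their stacks instead of clicking through the console, the equivalent boto3 call looks roughly like this. The stack name, template URL, and parameter key are assumptions; check the actual template for its real parameter names.

```python
import boto3

cf = boto3.client("cloudformation", region_name="us-east-1")

cf.create_stack(
    StackName="taloflow-sub-account-stack",  # placeholder
    TemplateURL="https://s3.amazonaws.com/.../template.yml",  # placeholder URL
    Parameters=[
        # Hypothetical parameter key; the template defines the real one.
        {"ParameterKey": "ExternalId", "ParameterValue": "tim-ext-id"},
    ],
    # Equivalent to checking the "may create IAM resources" acknowledgement box.
    Capabilities=["CAPABILITY_IAM"],
)

# Poll until creation completes (typically 2-3 minutes).
cf.get_waiter("stack_create_complete").wait(StackName="taloflow-sub-account-stack")
```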

Step 5: Forward CloudWatch Events (Optional)

By default, the CloudFormation script forwards events to Tim for the US-East-1 AWS Region only. Forwarding CloudWatch Events from the other regions you use will produce a more accurate Running Cost. If this is important to you, please repeat the steps below for each CloudWatch region you use beyond US-East-1.