Tim

Welcome to Tim's documentation

Tim, the Taloflow Infrastructure Manager, reveals the cost of every AWS cloud resource in real time. This helps you make better decisions about budgeting, cost mitigation, and optimization on the AWS cloud.


Connect Tim to a Sub Account (Advanced)

Sometimes, organizations prefer to integrate Tim without sending Tim the Cost Report information for all accounts. In this case, a simple AWS Lambda function can filter the Cost and Usage Report and remove any information tied to accounts you would rather not share before it is sent to Tim. You will also have to run a different CloudFormation Template to create the Roles.

Background

The additional step beyond our basic setup is to deploy an AWS Lambda function, which we call the Taloflow Cost Report Filter, that filters and copies the entries belonging to a specific Usage Account ID.

How it works

Tim can get the correct data for any specific Account ID (e.g.: a sub account) from the Cost Report by taking the following steps:

  1. When the Cost Report is placed into a Billing S3 bucket, an S3 event is triggered.

  2. This event triggers an AWS Lambda function that will run on your account to filter the Cost Report and place it into another S3 bucket that Tim can access.

With this configuration, Tim only has access to the S3 bucket with the filtered information and can never see the full Cost Report. You always have explicit control over the function's code, can fully audit it, and can adjust what is filtered and what isn't to your liking.
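The filtering step itself can be reduced to a few lines. The sketch below is not the code shipped in the Taloflow package, just a minimal illustration of the idea: it assumes the standard CUR column name lineItem/UsageAccountId and a small in-memory CSV, whereas the real function streams the gzipped report between buckets.

```python
import csv
import io

def filter_cost_report(report_csv: str, usage_account_id: str) -> str:
    """Keep only the Cost Report rows whose usage account matches the given ID."""
    reader = csv.DictReader(io.StringIO(report_csv))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        # Column name per the standard CUR schema (an assumption of this sketch).
        if row.get("lineItem/UsageAccountId") == usage_account_id:
            writer.writerow(row)
    return out.getvalue()
```

The filtered output keeps the original header, so the result is still a valid Cost Report file for the one account you chose to share.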

Limitations

The Taloflow Cost Report Filter function currently has some minor limitations:

  • You cannot filter on the Payer Account ID. The function keeps the lines containing the given Account ID, and since every line also contains the Payer Account ID, filtering on the Payer ID would match every line and leave the Cost Report unfiltered.
  • You can only filter by one Account ID. At this time, it is not possible to filter multiple accounts using this function.
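The first limitation is easy to see with a toy example: a line-level filter keeps any line containing the given ID, and the Payer Account ID appears on every line, so filtering on it keeps everything. (The account IDs and line layout below are made up for illustration.)

```python
def filter_lines(report_lines, account_id):
    """Naive line filter: keep lines that contain the given account ID."""
    return [line for line in report_lines if account_id in line]

# Every CUR line carries the Payer Account ID in addition to the usage account.
payer_id = "999988887777"
lines = [
    f"{payer_id},111122223333,AmazonEC2,1.50",  # payer, usage account, service, cost
    f"{payer_id},444455556666,AmazonS3,0.20",
]
```

Filtering these lines by payer_id returns both lines unchanged, while filtering by a usage account ID returns only that account's line.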

Prerequisites

  • AWS CLI
  • AWS SAM CLI
  • Make
  • Have Cost and Usage Reports turned on at the Root/Master Account level
  • Please note down the Name of the Cost Report and the Report Path Prefix at the master account level, as you'll need them later.
  • Run the Lambda stack in an account that has access to the Master Account's Cost and Usage Report S3 Bucket and the Sub Account's Cost and Usage Report S3 Bucket. You can configure this through an Access Control List (ACL), which will require you to have your AWS Canonical ID ready.

You need to have AWS Cost and Usage Reports turned on at the master or root-account level and have the reports being sent to an S3 bucket. The Cost and Usage Report parameters should be set to Hourly, GZIP and Create New Report Version.
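If you want to verify these report settings programmatically, a small check along these lines can help. It assumes the field names (TimeUnit, Compression, ReportVersioning) and values used by the Cost and Usage Report API's DescribeReportDefinitions call; the helper name is ours.

```python
def cur_settings_ok(report_definition: dict) -> bool:
    """Check a Cost and Usage Report definition against the settings Tim expects:
    Hourly granularity, GZIP compression, and Create New Report Version."""
    return (
        report_definition.get("TimeUnit") == "HOURLY"
        and report_definition.get("Compression") == "GZIP"
        and report_definition.get("ReportVersioning") == "CREATE_NEW_REPORT"
    )
```

You would feed this the report definition dictionaries returned for your master account and flag any report that does not pass.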

Instructions

Step 1. Create a new S3 Bucket on the Sub Account

  • Go to the S3 Console and create a new S3 bucket on the Sub Account.
  • Click Create bucket.

Please note down the S3 Bucket name for the bucket you created on the Sub Account.
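If you prefer to script this step instead of using the console, note one S3 quirk: the CreateBucket call must omit the LocationConstraint when the bucket is in us-east-1. A small helper (the name is ours) that builds the boto3 create_bucket arguments:

```python
def create_bucket_kwargs(bucket_name: str, region: str) -> dict:
    """Build arguments for boto3's s3.create_bucket; buckets in us-east-1
    must be created without a CreateBucketConfiguration."""
    kwargs = {"Bucket": bucket_name}
    if region != "us-east-1":
        kwargs["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return kwargs
```

You would then call boto3's s3.create_bucket(**create_bucket_kwargs(name, region)) from an account with the right permissions.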

Step 2. Deployment and Setup of Cost Report Filter Function

To begin the deployment of the AWS Lambda Cost Report Filter, click here to download the package and open the .zip file. Please deploy the project using AWS SAM.

A. Update Configurations

  • Update the following variables in template.yml.

    Variable                        Description
    { REPLACE_SOURCE_BUCKET_NAME }  Source Bucket
    { REPLACE_DEST_BUCKET_NAME }    Destination Bucket
    { REPLACE_USAGE_ACCOUNT_ID }    ID of the Usage Account to filter on

  • Update the following variables in Makefile.

    Variable                          Description
    { REPLACE_LAMBDA_BUCKET_NAME }    Bucket that will receive the Lambda package (must be in the same region as the source bucket)
    { REPLACE_LAMBDA_BUCKET_REGION }  Region of the Lambda package bucket (must match the source bucket's region, for example: us-east-1)
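If you would rather script these substitutions than edit the files by hand, a simple helper can apply them. The helper name is ours, and it assumes the placeholders appear exactly as written above, with spaces inside the braces.

```python
def fill_placeholders(text: str, values: dict) -> str:
    """Replace each '{ REPLACE_* }' token in a config file with its value."""
    for name, value in values.items():
        text = text.replace("{ %s }" % name, value)
    return text
```

For example, reading template.yml, passing a dict of your bucket names and account ID, and writing the result back would complete this step in one go.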

B. Deployment

  • Run the following command.
    make package deploy
    

A new function should now be deployed. Next, we need to add triggers so it listens to S3 events.

C. Adding S3 Trigger to Lambda

  • Log in to the AWS Console.
  • Search for and select Lambda.
  • Under Designer > Add Trigger, scroll down and click S3.
  • Under Configure triggers, enter the following:

    Field       Input
    Bucket      Source bucket
    Event Type  All object create events
    Suffix      csv.gz

  • Click Add to add the trigger.
  • You need a second trigger, so under Designer > Add Trigger, scroll down and click S3 again.
  • Under Configure triggers, enter the following:

    Field       Input
    Bucket      Source bucket
    Event Type  All object create events
    Suffix      Manifest.json

  • Click Add to add the trigger.
  • Click Save at the top right to confirm the settings.
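The two console triggers above are equivalent to a single S3 bucket notification configuration with two suffix rules. As a sketch (the function ARN is a placeholder), the structure you would pass to S3's PutBucketNotificationConfiguration looks like this:

```python
def build_s3_trigger_config(function_arn: str) -> dict:
    """Build an S3 notification configuration that invokes the filter Lambda
    on every object-created event ending in csv.gz or Manifest.json."""
    def rule(suffix):
        return {
            "LambdaFunctionArn": function_arn,
            "Events": ["s3:ObjectCreated:*"],
            "Filter": {"Key": {"FilterRules": [{"Name": "suffix", "Value": suffix}]}},
        }
    return {"LambdaFunctionConfigurations": [rule("csv.gz"), rule("Manifest.json")]}
```

Applying this on the source bucket reproduces the console setup in one call, which is handy if you ever need to rebuild the stack.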

Step 3: Run CloudFormation Template on the Sub Account

  • Click on this link to run the CloudFormation Template on your account
  • In the console, keep both pre-selected options as Template is ready, and Amazon S3 URL, and leave the URL as is, then click Next
  • Recall the names of the S3 Bucket with the Cost Report for the Sub Account, the AWS Region where the S3 Bucket is located for the Sub Account, and the Report Name and Report Prefix for the Cost Report in the Root or Master Account.
  • For the External ID field, you can use almost any External ID (e.g., tim-ext-id).

Special Characters for External ID

The External ID cannot have the following characters: $, #

  • Click Next
  • On the following page scroll down and click Next again
  • On the following page scroll down, acknowledge that this template might create IAM resources by checking the box, then click Create Stack.
  • On the next page, you will have to wait 2-3 minutes for the stack to be created. You can click the refresh icon in the Console.
  • When everything is green, you are ready to go, save for one more optional step.

Step 4. Forward CloudWatch Events (Optional)

By default, the CloudFormation Script forwards Events to Tim for only the US-East-1 AWS Region. Forwarding CloudWatch Events for the other regions you use will produce a more accurate Running Cost. If this is important to you, please repeat the steps below for each CloudWatch region you are in beyond US-East-1.

The information forwarded consists of EC2 events, including instance IDs and whether the instances are on or off.

  • Go to the CloudWatch Console
  • Click on Rules under Events in the left navigation pane and then click Create rule
  • Under Event Source, make sure that Event Pattern is the selected option.
  • Click Edit in the Event Pattern Preview text area and copy and paste the following snippet into the pop up text area and click Save.
{
 "source": [
   "aws.ec2"
 ]
}
  • To the right of the screen, click Add target
  • In the drop-down selector, scroll down and select Event bus in another AWS account
  • In the Account ID field, add Tim's AWS account ID: 845897643164
  • Just below, please select Use existing role
  • Under Use existing role search and select the taloflowInvokeEventBusRole.
  • Scroll down and click Configure details to move onto the next page.
  • Please give the Rule the name taloflowInvokeEventBusRule and click Create rule
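If you script this step instead of using the console, the target you attach to the rule points at the default event bus in Tim's account (845897643164, from the steps above) and assumes the taloflowInvokeEventBusRole created by the CloudFormation Template; the target Id below is an arbitrary label of our choosing.

```python
def cross_account_target(region: str, tim_account_id: str, role_arn: str) -> dict:
    """Build an EventBridge/CloudWatch Events target that forwards matched
    events to the default event bus in another AWS account."""
    return {
        "Id": "taloflow-event-bus",  # arbitrary target label
        "Arn": f"arn:aws:events:{region}:{tim_account_id}:event-bus/default",
        "RoleArn": role_arn,
    }
```

You would pass this target to PutTargets on the taloflowInvokeEventBusRule in each region you want to forward, alongside the aws.ec2 event pattern shown above.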
