Tim by Taloflow

Welcome to Tim's documentation

Taloflow Infra Monitor, or Tim, is a DevOps product that helps technical teams drastically reduce their cloud management overhead. Tim reduces mean time to resolution (MTTR) on cost incidents, and encourages better release planning and more efficient cloud architecture.


Getting Started

Welcome to Tim, the Taloflow Infrastructure Monitor. Here’s a quick overview of how to get started.

Tim is an AIOps product that helps SREs and DevOps teams drastically reduce their cloud management overhead. It combines real-time monitoring and alerting on cloud spend, reporting with drill-down by tag, service, and resource, a SQL-queryable cost database, and powerful anomaly detection.

Getting started

1. Register for Tim

You can register for Tim at https://www.taloflow.ai/register. Once created, you’ll be prompted to go through the integration process.

2. Connect your cloud

If you are integrating at the Master or Root Account level, please pick any of the following options:

If you are integrating a Sub Account, in addition to the above, please refer to the following docs so you can give us access to an S3 bucket with a filtered AWS Cost and Usage Report:
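As a sketch of what granting that access looks like, a bucket policy along these lines allows an external account to read the filtered report bucket. The account ID and bucket name below are placeholders, not Taloflow's actual values; use the identifiers from the integration docs.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowTimReadFilteredCostReports",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111122223333:root" },
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::example-filtered-cur-bucket",
        "arn:aws:s3:::example-filtered-cur-bucket/*"
      ]
    }
  ]
}
```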

Cost reports need to run before platform is available

Once integrated, we first need to process a fresh set of your cost reports from AWS before handoff. AWS usually produces these every 6 to 24 hours. Depending on when you integrate, it may take until the next business day before your account is ready for you. Thanks for your patience!

3. Get platform access

Once a first set of cost reports has run, you will get the following:

  • Email alerts on anomalies
  • Link to access your monitoring dashboard
  • Link to your custom reports
  • Short survey to share your objectives with the tool

7 days of cost reports needed before functionality is available

If you did not have AWS Cost and Usage Reports turned on before you started using Tim, there is insufficient data for us to offer reliable forecasts and anomaly detection, which affects the dashboard UI and alerting. After approximately 7 days, enough cost history is available for these to flow normally, and they will improve over time. If you do have a longer history of reports, make sure you copy those reports into the S3 bucket you gave Tim access to, so you get the best results.

Platform walkthrough

Grafana UI

As a user of Tim, you have access to a Grafana dashboard. To log in, go to https://tim.taloflow.ai. Drill into your data however you want: by tag, service, and more. You can view your spend velocity at intervals from 1 hour to weekly, and virtually any perspective on cost you need is available to you.

Hover over the (i) icon on graphs to see the tooltip info

There is important contextual information on the graphs and tables in the Grafana UI that you don't want to miss. Most graphs and tables have a tooltip you can hover over for an easy-to-access explanation of what is being shown.

Top-level controls

Here are the important top-level controls, starting with the top-left controls.

Top-left controls


  • Client ID: Useful for organizations that want a consolidated view of all the Tim accounts they are monitoring.
  • Account ID: Easily select which accounts (e.g.: Sub Accounts within a Master/Root Account) to view or filter out.
  • Service: Every AWS service code you use that appears in your AWS Cost and Usage Report is available here, so you can filter accordingly.
  • Interval: View the time-series data at various intervals, starting from hourly granularity.
  • Severity: Filter anomalies by severity; Level 3 is the most serious and Level 1 the least.

In the top-right controls, the important ones are:

  1. The Refresh button. Depending on the query being run, you may have to click this to see an updated view of the data.
  2. The Time range selector. Here you can select a past period from the drop-down, or set a Custom time range.

Pro Tip: The Custom time range selector is useful if you are looking for the forecast of a specific period in the future.

Time range selector


Forecasts vs. actuals

There are numerous graphs and tables in the Grafana dashboard that use Forecasts and Actuals. The Actuals come from the AWS Cost and Usage Report; in other words, they are the charges incurred on AWS in the period. The Forecast numbers are produced by Tim, based on your AWS Cost and Usage Report history, activity on your account, and machine learning models.

In the various graphs where a Forecast is presented, you can also see the confidence limits (or confidence bands), which show you the range of expected spend in the Forecast for any given time interval.
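As an illustrative sketch of the idea (this is not Tim's actual forecasting model), a confidence band can be derived from historical spend for the same interval, e.g. mean ± 1.96 standard deviations:

```python
from statistics import mean, stdev

def confidence_band(history, z=1.96):
    """Return (lower, upper) expected-spend bounds for the next interval,
    using a naive normal approximation over historical interval spend.
    Illustration only; Tim's real model also uses account activity."""
    m, s = mean(history), stdev(history)
    return max(0.0, m - z * s), m + z * s

# Hourly S3 spend history (hypothetical figures, in dollars)
history = [12.0, 11.5, 12.4, 11.8, 12.1, 12.2, 11.9, 12.0]
low, high = confidence_band(history)
print(f"expected next-hour spend between ${low:.2f} and ${high:.2f}")
```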

Forecast vs. Actuals (AmazonS3)


Alerts and anomalies

The Actuals and Forecasts together help Tim determine the degree of an anomaly: when the Actuals deviate far enough from the Forecast (the expected amount), the variance triggers an anomaly.
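A minimal sketch of how such a grading could work (the thresholds and function below are illustrative assumptions, not Tim's actual scoring): measure the deviation of the Actual from the Forecast in units of the confidence band's half-width, then map it to severity Levels 1 to 3.

```python
def anomaly_level(actual, forecast, band_halfwidth):
    """Map deviation from forecast to a severity level (0 = no anomaly).
    Thresholds are illustrative, not Tim's actual ones."""
    if band_halfwidth <= 0:
        raise ValueError("band_halfwidth must be positive")
    deviation = abs(actual - forecast) / band_halfwidth
    if deviation <= 1.0:
        return 0   # within the confidence band: expected spend
    if deviation <= 2.0:
        return 1   # Level 1: least serious
    if deviation <= 4.0:
        return 2   # Level 2
    return 3       # Level 3: most serious

print(anomaly_level(actual=30.0, forecast=12.0, band_halfwidth=2.0))  # 3
```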

AmazonS3 anomalies


By Period Tickers

These tickers give you a quick glimpse of your actual spend and forecasts by the typical planning periods (e.g.: quarter, month, week).

Time Zone

Please note that the time zone of the graphs and tables in the UI is based on your browser, with the exception of the By Period Tickers (described above), which are based on UTC.
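This matters at period boundaries: a ticker's midnight-UTC cutoff can fall on a different local date. A quick sketch, assuming a viewer in US Pacific time (hypothetical; substitute your own zone):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# A By Period Ticker month boundary is midnight UTC; the graphs use your
# browser's local zone. Render the March 2024 boundary in Pacific time.
utc_boundary = datetime(2024, 3, 1, 0, 0, tzinfo=timezone.utc)
local = utc_boundary.astimezone(ZoneInfo("America/Los_Angeles"))
print(local.isoformat())  # 2024-02-29T16:00:00-08:00 (still February locally)
```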

Daily Tickers


Email summaries and alerts

We email you the following summaries and alerts.

  • Cost-report summaries
  • Daily summaries
  • Weekly summaries
  • Monthly summaries
  • Service-based anomalies

Each email details the anomalies within the period and shows you a breakdown of when in the day the anomalies occurred, with hour-by-hour time stamps. The by-period emails (e.g.: weekly summaries) also detail which AWS Services were most anomalous for the period.
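The hour-by-hour breakdown in those emails amounts to bucketing anomaly timestamps by hour. A sketch of the idea (the timestamps and function below are hypothetical, not Tim's schema; the real emails are generated server-side):

```python
from collections import Counter
from datetime import datetime

def hourly_breakdown(anomaly_timestamps):
    """Count anomalies per hour of day from ISO-8601 timestamps."""
    hours = Counter(datetime.fromisoformat(ts).hour for ts in anomaly_timestamps)
    return {f"{h:02d}:00": n for h, n in sorted(hours.items())}

events = [  # hypothetical anomaly times for one day (UTC)
    "2024-05-01T03:15:00", "2024-05-01T03:47:00", "2024-05-01T14:05:00",
]
print(hourly_breakdown(events))  # {'03:00': 2, '14:00': 1}
```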

AmazonS3 Spend Anomaly Email


Filter alerts

You can opt out of specific emails (e.g.: cost-report summary emails) simply by replying "stop" to the email. This also works for emails that alert on specific AWS Services (e.g.: AmazonSNS).

Custom Reports

Working with AWS Cost and Usage Reports is a painful experience, but there's a lot of useful data in them if you know how to get at it. Tim auto-generates a cleaned-up cost report, with pivots ready for quick analysis, every time a new report comes in.

Accessing my reports

The reports are shared as Google Sheets in a secure Google Drive folder. You can copy these reports to whatever store you want. The Google Drive folder is shared after on-boarding.

Every time a new cost report comes in, new spreadsheets are generated within the hour and show up in your folder, so you always have the most up-to-date information. To avoid clutter, the old ones are automatically placed in an archive folder within the main folder.

Pivots on the data

Below are some of the pivots that are available by default. If you require other pivots, please email us at help@taloflow.ai and we can turn them on for you.

  • Pivot by Service and Service operations with Day/Day amounts
  • Pivot by Service and Service operations with Month/Month amounts
  • Pivot by Tag Grouping (grouping includes Service and Service operations)
  • Pivot on EC2 Type
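The Day/Day pivot above can be sketched in plain Python. The row format here is a hypothetical simplification, not the actual Cost and Usage Report schema:

```python
from collections import defaultdict

def pivot_day_over_day(rows):
    """Pivot (service, operation, day, cost) rows into per-(service, operation)
    daily totals plus day-over-day deltas. Illustrative sketch only."""
    totals = defaultdict(dict)
    for service, operation, day, cost in rows:
        key = (service, operation)
        totals[key][day] = totals[key].get(day, 0.0) + cost
    result = {}
    for key, by_day in totals.items():
        days = sorted(by_day)
        deltas = {d: by_day[d] - by_day[days[i - 1]] if i else None
                  for i, d in enumerate(days)}
        result[key] = {"totals": by_day, "day_over_day": deltas}
    return result

rows = [  # hypothetical usage lines
    ("AmazonS3", "GetObject", "2024-05-01", 10.0),
    ("AmazonS3", "GetObject", "2024-05-02", 14.0),
    ("AmazonS3", "GetObject", "2024-05-02", 1.0),
]
p = pivot_day_over_day(rows)
print(p[("AmazonS3", "GetObject")]["day_over_day"]["2024-05-02"])  # 5.0
```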

Too many pivots

Having too many pivots enabled at the same time in the auto-generated reports can slow down Google Sheets, depending on how much data you have.

Pivot by Service and Service operations with Day/Day amounts


Pivot by Service and Service operations with Month/Month amounts


Pivot on EC2 cost by Type


Viewing by tags

If you use tags for resources within your organization, then all the same information in the pivots is available with your tagging as well.

Query data in Athena

The cost information that Tim has for you is fully queryable with SQL in AWS Athena. To get access to the database of cost data for your organization, a few extra steps are required. Email our team at help@taloflow.ai and we will send you setup instructions, along with a list of Common Queries and Advanced Queries to get you started.
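To give a flavor of what's possible, a query along these lines sums daily spend by service. The table name below is an assumption; the column names follow the standard Cost and Usage Report Athena schema, but rely on the setup instructions for your organization's actual names.

```sql
-- Hypothetical table name; replace with the one from your setup instructions.
SELECT
  line_item_product_code                        AS service,
  date_trunc('day', line_item_usage_start_date) AS usage_day,
  sum(line_item_unblended_cost)                 AS daily_cost
FROM cost_and_usage_report
WHERE line_item_usage_start_date >= date '2024-05-01'
GROUP BY 1, 2
ORDER BY usage_day, daily_cost DESC;
```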

