
Legacy pricing

note

This page reflects our legacy pricing model. For the most up-to-date pricing information, refer to the Pricing topic.

The Data Integration pricing is consumption-based. Consumption is calculated in credits, or Data Integration Pricing Units (RPUs).

RPU Credit

An RPU credit is the same as a Data Integration Pricing Unit credit:

  • Database and file storage sources are charged only for the amount of data transferred, down to the byte.

  • Most application (API) based sources are charged for each execution of a data pipeline.

  • Applications (APIs) with high-frequency replications can be charged for the amount of data transferred, similar to databases.

We charge you based on actual usage, not the number of rows, letting you scale in a flexible and transparent way.

RPU Credit usage calculation

Credit usage is based on the data source and pipeline type:

| Data Source / Pipeline Type | Credit Cost |
| --- | --- |
| Application (API) Based Sources | You are charged 1 credit every time you ingest data from a single output table. |
| Database Replication & File Storage Sources | You are charged 1 credit per 100MB of data transferred (pro-rata), regardless of execution frequency. |
| Orchestration & Advanced Workflows (Logic and Transformations) | You are charged 1 credit for every execution of an entire workflow. |

Here are some examples:

| Scenario | Credit Cost |
| --- | --- |
| Ingesting deal information from your CRM once every 24 hours. | 1 credit a day |
| Ingesting both deal info & contact info (which have different output tables) from your CRM every 8 hours, Mon-Fri only. | 30 credits a week (2 output tables * 3 times a day * 5 days a week = 30) |
| Running an orchestration workflow that pulls data from 5 different API sources, once a day. | 6 credits a day (5 API pulls + 1 advanced workflow = 6) |
| Transferring 1,725MB of data per month between your Postgres database and your data warehouse. | 17.25 credits (1,725MB / 100MB = 17.25) |
| Transferring 280MB of data per month from files on SFTP servers to your data warehouse. | 2.8 credits (280MB / 100MB = 2.8) |
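The rules above can be sketched as a few small helper functions. This is an illustrative approximation only, not an official API; the function and constant names are invented here to reproduce the example scenarios.

```python
# A rough sketch of the legacy credit model described above.
# All names here are illustrative, not part of any official API.
API_CREDIT_PER_TABLE = 1   # 1 credit per output table per execution
MB_PER_CREDIT = 100        # database/file sources: 1 credit per 100MB, pro-rata
WORKFLOW_CREDIT = 1        # 1 credit per orchestration workflow execution


def api_credits(output_tables: int, executions: int) -> float:
    """Application (API) based sources: charged per output table per run."""
    return API_CREDIT_PER_TABLE * output_tables * executions


def transfer_credits(mb_transferred: float) -> float:
    """Database replication & file storage: pro-rata, 1 credit per 100MB."""
    return mb_transferred / MB_PER_CREDIT


def workflow_credits(executions: int) -> float:
    """Orchestration & advanced workflows: 1 credit per workflow execution."""
    return WORKFLOW_CREDIT * executions


# Reproducing the example scenarios from the table above:
print(api_credits(1, 1))                         # CRM deals once a day -> 1 credit a day
print(api_credits(2, 3) * 5)                     # 2 tables * 3 runs/day * 5 days -> 30 a week
print(api_credits(5, 1) + workflow_credits(1))   # 5 API pulls + 1 workflow -> 6 a day
print(transfer_credits(1725))                    # Postgres, 1,725MB/month -> 17.25
print(transfer_credits(280))                     # SFTP files, 280MB/month -> 2.8
```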
note
  • Simple Source to Target pipelines do not require setting up a workflow to execute.

  • For application (API) based sources, output tables refer to the different API calls required for the same source to pull different data entities. For example, if your data source is a CRM, there may be separate API output tables for pulling deal info and contact data.

  • For application (API) based sources, pipelines that transfer more than 100MB of data per execution are charged one credit per 100MB of data. For example, an execution that transfers up to 100MB costs 1 credit, an execution that transfers 100MB - 200MB costs 2 credits, and so on. If no data is detected in the execution, the charge is 0.5 credit.

  • Database and storage based sources (such as S3, GCS, Azure Blob, and SFTP) calculate Data Integration Pricing Units (RPUs) based on data volume. If there is no data (0 data), this translates to 0 credits.
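The per-execution rule for API based sources can be sketched as a single function. This is a hypothetical helper written to match the note above (1 credit per started 100MB, 0.5 credit when no data is detected), not official billing code.

```python
import math


def api_execution_credits(mb_transferred: float) -> float:
    """Credits for one execution of an application (API) based pipeline,
    per the note above: 0.5 credit when no data is detected, otherwise
    1 credit per started 100MB (up to 100MB -> 1, 100MB-200MB -> 2, ...).
    Illustrative sketch only.
    """
    if mb_transferred == 0:
        return 0.5
    return math.ceil(mb_transferred / 100)


print(api_execution_credits(0))    # no data detected -> 0.5
print(api_execution_credits(80))   # up to 100MB -> 1
print(api_execution_credits(150))  # 100MB - 200MB -> 2
```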

note

Database and file storage sources are priced differently from API sources

Data replication from database and file sources consumes less compute time and therefore costs less for Data Integration, so we pass these savings on to our customers.

Python RPU calculation

The RPU of the Python Logic Step is calculated from the script's total execution time and the amount of network usage.

note

The Python Logic Step RPU (logicode_rpu) will be charged regardless of the run status of the Logic Step.

The Python pricing is based on:

  1. Execution time of the user's Python script (seconds)
  2. The server size chosen to execute the script (see the table below)
  3. Network bandwidth - 0.4 RPU for every 100MB of data transferred
| Server Size | RPU per Minute | RPU per Hour |
| --- | --- | --- |
| XS | 0.021 | 1.2 |
| S | 0.041 | 2.5 |
| M | 0.082 | 4.9 |
| L | 0.165 | 9.9 |
| XL | 0.329 | 19.7 |
| XXL | 0.3884 | 23.304 |
| XXXL | 0.492 | 29.52 |
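Putting the pricing inputs together, the Python Logic Step RPU can be estimated as below. This is a sketch under two assumptions: the per-minute rates come from the table above, and the network charge is treated as pro-rata (0.4 RPU per 100MB transferred) - the exact rounding of the network component is not specified here.

```python
# Sketch of the Python Logic Step RPU formula described above:
# RPU = execution minutes * server rate + 0.4 RPU per 100MB of network transfer.
# Assumption: the network charge is pro-rata; rates are from the table above.
RPU_PER_MINUTE = {
    "XS": 0.021, "S": 0.041, "M": 0.082, "L": 0.165,
    "XL": 0.329, "XXL": 0.3884, "XXXL": 0.492,
}
NETWORK_RPU_PER_100MB = 0.4


def python_step_rpu(exec_seconds: float, server_size: str, network_mb: float) -> float:
    """Estimate the RPU of one Python Logic Step run (illustrative only)."""
    compute = (exec_seconds / 60) * RPU_PER_MINUTE[server_size]
    network = (network_mb / 100) * NETWORK_RPU_PER_100MB
    return compute + network


# e.g. a 10-minute script on an M server transferring 250MB:
# 10 min * 0.082 RPU/min = 0.82 compute, plus 2.5 * 0.4 = 1.0 network.
print(round(python_step_rpu(600, "M", 250), 3))  # 1.82
```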

RPU usage

You can get information about your RPU consumption in 2 areas:

  • Dashboard
  • Activities

Dashboard

When you click the Dashboard tab from the main menu, a graph of your entire activity appears.

Procedure

  1. Navigate to the Data Integration website.
  2. Click the Dashboard tab from the main menu.
  3. Click the RPUs tab in the upper right corner.
  4. Select the desired time frame in the upper left corner of the page.
  5. Pick one or more sources under Rivers RPUs to examine RPU consumption for those specific Rivers.
  6. The total amount of RPU for this time frame is indicated in the bottom right corner.

Activities

The Activities tab shows the total RPU usage.

Procedure

  1. Navigate to the Data Integration website.
  2. Click the Activities tab from the main menu.
  3. Find your River.
  4. View the total amount of RPU in the top right-hand corner.

Python RPU usage

The Logic steps icon can only be found in a Logic River that uses Python.

Procedure

  1. Click the row that highlights the River.
  2. Click Download Logs.
  3. Search for logicode_rpu_per_step to locate the Python RPUs calculated in this run.

Free Trial features and limits

The Data Integration free trial includes access to all of the Professional plan features, for 14 days or 1,000 free credits (worth $1,200) of usage, whichever runs out first.

When your free trial ends, you can continue using Data Integration by registering for an on-demand plan, or by exploring our annual and Enterprise plans.

To upgrade your plan, refer to Subscription & Billing.

note
  • Data Integration does not charge per connector and there is no minimum or maximum on the number of connectors you can use. We believe in providing you with the best single source to efficiently align your data from internal databases and third-party platforms.
  • Data Integration does not charge per user.
  • The Starter plan is limited to 2 users.
  • The Professional and Enterprise plans include unlimited users.
  • Data Integration does not charge per environment, although each plan has a maximum number of environments:
    • The Starter plan is limited to 1 environment.
    • The Professional plan is limited to 3 environments.
    • The Enterprise plan includes unlimited environments.
  • There is no limit to how many API integrations an account can have.
  • All our data sources are available on all plans.
  • You have multiple options for connecting to practically any source, in addition to Data Integration managed sources. To learn more about this feature, refer to the Custom Data Integration.