Supercharge Your Data Pipeline

Unlock advanced features for your data operations with Sling CLI Pro.

CLI Pro

API Sources

Extract data from any REST API with YAML-based specifications. Supports pagination, authentication, and incremental sync.

Parallel Streams & Retries

Process data faster with parallel streams and automatic retries for failed operations.

Stream Chunking

Break down large datasets into manageable chunks with time-based, numeric, or count-based partitioning.

Capture Deletes

Track and replicate deleted records with CDC-like functionality.

Pipelines & Hooks

Create complex workflows with HTTP requests, SQL queries, file operations, and custom logic.

OpenTelemetry Logging

Export structured logs to any OTLP endpoint for centralized observability with Grafana, Datadog, and more.

Staged Transforms

Transform data with multi-stage processing using 50+ built-in functions for string, numeric, date, and conditional operations.

State-Based Incremental

Maintain state across file and database loads with intelligent incremental processing and resume capability.

ODBC Connections

Connect to any database via ODBC drivers for maximum compatibility with SQL Server, DB2, SAP HANA, and more.

CLI Pro Max

Everything in CLI Pro, plus advanced capabilities for mission-critical workloads.

Change Capture (CDC)

Continuously replicate row-level changes from transaction logs with automatic initial snapshots and incremental capture.

Schema Migration

Migrate primary keys, foreign keys, indexes, defaults, and constraints between databases automatically.

Priority Support

Direct access to the Sling team for faster issue resolution and guidance.


Simple Pricing with Free Trial

No hidden costs, no surprises. Unlimited connections, volume, and executions.

CLI Pro

$79 / mo

(paid monthly)
  • API Sources: Extract data from any REST API with YAML-based specifications. Supports pagination, authentication, and incremental sync.
  • Parallel Streams & Retries: Process data faster with parallel streams and automatic retries for failed operations.
  • Stream Chunking: Break down large datasets into manageable chunks with time-based, numeric, or count-based partitioning.
  • Capture Deletes: Track and replicate deleted records with CDC-like functionality.
  • Pipelines & Hooks: Create complex workflows with HTTP requests, SQL queries, file operations, and custom logic.
  • OpenTelemetry Logging: Export structured logs to any OTLP endpoint for centralized observability with Grafana, Datadog, and more.
  • Staged Transforms: Transform data with multi-stage processing using 50+ built-in functions for string, numeric, date, and conditional operations.
  • State-Based Incremental: Maintain state across file and database loads with intelligent incremental processing and resume capability.
  • ODBC Connections: Connect to any database via ODBC drivers for maximum compatibility with SQL Server, DB2, SAP HANA, and more.
  • Community Support: Get help via GitHub issues and the Discord community.

CLI Pro Max

$149 / mo

(paid monthly)
  • API Sources: Extract data from any REST API with YAML-based specifications. Supports pagination, authentication, and incremental sync.
  • Parallel Streams & Retries: Process data faster with parallel streams and automatic retries for failed operations.
  • Stream Chunking: Break down large datasets into manageable chunks with time-based, numeric, or count-based partitioning.
  • Capture Deletes: Track and replicate deleted records with CDC-like functionality.
  • Pipelines & Hooks: Create complex workflows with HTTP requests, SQL queries, file operations, and custom logic.
  • OpenTelemetry Logging: Export structured logs to any OTLP endpoint for centralized observability with Grafana, Datadog, and more.
  • Staged Transforms: Transform data with multi-stage processing using 50+ built-in functions for string, numeric, date, and conditional operations.
  • State-Based Incremental: Maintain state across file and database loads with intelligent incremental processing and resume capability.
  • ODBC Connections: Connect to any database via ODBC drivers for maximum compatibility with SQL Server, DB2, SAP HANA, and more.
  • Change Capture (CDC): Continuously replicate row-level changes from transaction logs with automatic initial snapshots and incremental capture.
  • Schema Migration: Migrate primary keys, foreign keys, indexes, defaults, and constraints between databases automatically.
  • Community Support: Get help via GitHub issues and the Discord community.
  • Priority Support: Direct access to the Sling team for faster issue resolution and guidance.

Frequently Asked Questions

How does Sling CLI Pro work?

Sling CLI Pro requires a license token. You can get a trial token at https://dash.slingdata.io. Once you have it, set the value in an environment variable called SLING_CLI_TOKEN. That's it: Sling validates your token when you run it, and the Pro features are enabled.
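For example, on Linux or macOS the token can be set like this (the token value shown is a placeholder):

```shell
# Set the token obtained from https://dash.slingdata.io
# (replace the placeholder with your actual token).
export SLING_CLI_TOKEN="your-token-here"

# Any subsequent Sling invocation validates the token and
# enables the Pro features, e.g.:
# sling run -r replication.yaml
```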

What are hooks and how can I use them?

Hooks are customizable actions that can be triggered before or after a replication stream. They enable you to extend pipeline functionality with operations like data validation, notifications, or custom API calls. Common hook types include query execution, data checks, notifications, and HTTP webhooks.
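As an illustrative sketch, a post-stream hook in a replication config could look roughly like the following. The exact keys, hook types, and connection names here are assumptions for illustration; consult the Sling documentation for the current schema.

```yaml
# Sketch only: keys and hook types are illustrative, not authoritative.
source: my_postgres        # placeholder connection names
target: my_snowflake

streams:
  public.orders:
    hooks:
      post:
        - type: query       # run a SQL statement after the stream loads
          connection: my_snowflake
          query: "grant select on table orders to role analyst"
        - type: http        # notify a webhook once the load finishes
          url: "https://example.com/load-complete"
```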

How many subscriptions do I need?

Each CLI Pro subscription includes 2 tokens: a production token for your production environment and a development token for testing. One subscription is required per team. You can use the production token across all your production environments (servers, containers, etc.) while team members share the development token for testing and collaboration. For example, if you have separate data engineering and analytics teams, each team would need its own subscription.

What are parallel streams and how do they help?

Parallel streams allow you to process multiple data streams simultaneously, significantly improving performance. For example, when replicating multiple tables or processing large files, parallel streams can dramatically reduce the total processing time by utilizing available system resources more efficiently.
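As a sketch, a replication config that defines several streams gives Sling multiple units of work that CLI Pro can run in parallel rather than one at a time. The connection names below are placeholders, and the keys are illustrative; see the Sling documentation for the current replication schema.

```yaml
# Sketch only: three table streams that CLI Pro can process concurrently.
source: my_postgres        # placeholder connection names
target: my_snowflake

defaults:
  mode: full-refresh

streams:
  public.orders:
  public.customers:
  public.invoices:
```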