Snowflake ETL

Get data in. Get insights out.

ETL, ELT, real-time CDC, and Reverse ETL — all into and out of Snowflake. 260 source connectors. Predictable monthly pricing instead of consumption-based row counting.

260
Sources to Snowflake
2-way
In + Reverse ETL
<1s
CDC latency
No
Per-row billing

The problem

Snowflake billing is variable. Your ETL bill shouldn't be.

Most “Snowflake-native” ETL tools price per row, per event, or per “monthly active row.” The faster your data grows, the faster your bill does. Combined with Snowflake's compute pricing, you end up paying twice for the same workload.

Where budgets go to die

Per-row pricing punishes the workloads you want most.

CDC pipelines, hourly syncs, high-velocity SaaS feeds: exactly the use cases that justify Snowflake are also the ones that cost the most under consumption-priced ETL tools. Etlworks bills per platform tier, not per record. Replicate 200 million rows or 200 billion; the monthly cost is the same. Predictable for your CFO, painless for your data team.

Capabilities

Snowflake-native, end to end.

Bulk loading + COPY INTO

Stages files in S3, Azure Blob, or GCS, then runs COPY INTO at warehouse speed. No row-by-row inserts.
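Under the hood, the pattern reduces to the sketch below: stage, then one set-based COPY INTO. Stage and table names (MY_S3_STAGE, RAW.ORDERS) are hypothetical placeholders; Etlworks generates and runs the equivalent automatically.

```python
# Minimal sketch of the stage-then-COPY pattern, using the
# snowflake-connector-python library. Stage and table names are
# hypothetical; Etlworks automates this end to end.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount", user="ETL_USER", password="...",
    warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
)
cur = conn.cursor()

# One set-based bulk load from staged Parquet files; no row-by-row inserts.
cur.execute("""
    COPY INTO RAW.ORDERS
    FROM @MY_S3_STAGE/orders/
    FILE_FORMAT = (TYPE = PARQUET)
    MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
""")
conn.close()
```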

Real-time CDC into Snowflake

Log-based CDC from MySQL, Postgres, SQL Server, Oracle, MongoDB, and DB2, with sub-second latency and MERGE-based deduplication.
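For illustration, this is roughly the MERGE shape such a pipeline applies per micro-batch; table and column names are hypothetical.

```python
# Approximate shape of the per-batch CDC MERGE. raw.cdc_batch is a
# hypothetical staging table of change events with an op column
# ('I'/'U'/'D') read from the source database log.
CDC_MERGE = """
MERGE INTO analytics.orders AS t
USING raw.cdc_batch AS s
  ON t.order_id = s.order_id
WHEN MATCHED AND s.op = 'D' THEN DELETE
WHEN MATCHED THEN UPDATE SET t.status = s.status, t.amount = s.amount
WHEN NOT MATCHED AND s.op <> 'D' THEN
  INSERT (order_id, status, amount)
  VALUES (s.order_id, s.status, s.amount);
"""
# Run per micro-batch over any Snowflake connection; a single MERGE
# applies inserts, updates, and deletes atomically.
```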

Reverse ETL out of Snowflake

Push enriched Snowflake data to Salesforce, HubSpot, Marketo, NetSuite, and 200+ SaaS targets. Same platform.

Schema evolution

New columns and type changes propagate automatically. No DDL drift, no broken pipelines after upstream changes.
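A rough sketch of what automatic propagation means in practice; the table name, incoming-schema dict, and connection details are hypothetical.

```python
# Illustrative schema-evolution step: compare an incoming batch's
# columns to the target table and extend the table instead of failing
# the load. All names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(account="...", user="...", password="...")
cur = conn.cursor()

TARGET = "RAW.ORDERS"
incoming = {"ORDER_ID": "NUMBER", "STATUS": "VARCHAR", "COUPON_CODE": "VARCHAR"}

cur.execute(f"DESCRIBE TABLE {TARGET}")
existing = {row[0].upper() for row in cur.fetchall()}  # first field is the column name

for col, col_type in incoming.items():
    if col not in existing:
        # New upstream column: add it to the target, keep the pipeline alive.
        cur.execute(f"ALTER TABLE {TARGET} ADD COLUMN {col} {col_type}")
```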

Transformations in flight

SQL, JavaScript, Python — transform during load or in-warehouse via tasks. dbt-friendly, dbt-optional.
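As a toy example of an in-flight transform, reshaping records between extract and load; field names and the sample row are invented.

```python
# Toy in-flight transformation: reshape each record between extract
# and load. Field names and sample data are hypothetical.
extracted = [{"id": 1, "status": " shipped ", "amount": "19.90"}]

def transform(record: dict) -> dict:
    return {
        "order_id": record["id"],
        "status": record["status"].strip().upper(),   # normalize casing
        "amount_usd": round(float(record["amount"]), 2),
    }

rows = [transform(r) for r in extracted]  # then stage and COPY INTO
```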

Warehouse cost optimization

Stage-and-load minimizes warehouse-up time. Auto-resume, auto-suspend, file-based loading — every pattern Snowflake recommends.
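For reference, the corresponding warehouse settings look like the sketch below; ETL_WH is a placeholder, and the statement runs once over any Snowflake connection.

```python
# Warehouse settings the stage-and-load pattern relies on: a small
# warehouse that suspends quickly and resumes on demand. ETL_WH is
# a hypothetical name.
WAREHOUSE_SETTINGS = """
ALTER WAREHOUSE ETL_WH SET
  WAREHOUSE_SIZE = 'XSMALL'   -- file-based COPY rarely needs more
  AUTO_SUSPEND = 60           -- suspend after 60 idle seconds
  AUTO_RESUME = TRUE;         -- wake only when a load arrives
"""
```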

Patterns

Three flows, one platform.

Every Snowflake data pipeline pattern, configured the same way. No separate tool for CDC, no separate tool for Reverse ETL, no dbt project to maintain just for transformations.

Data in
Source → S3 → Snowflake

Bulk ETL / ELT

Stage files in S3 / Azure Blob / GCS, then COPY INTO. The pattern Snowflake recommends, automated end to end.

Real-time
DB log → CDC → MERGE

CDC into Snowflake

Log-based CDC streams change events directly into Snowflake via MERGE. Sub-second latency, no Kafka.

Reverse
Snowflake → Transform → SaaS

Reverse ETL out

Push enriched data from Snowflake to Salesforce, HubSpot, Marketo, NetSuite — 200+ SaaS targets.
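Conceptually, the reverse flow reduces to the sketch below. The CRM endpoint, table, and payload shape are hypothetical; in practice, Etlworks's native connectors replace the hand-rolled API call.

```python
# Illustrative reverse-ETL step: read enriched rows from Snowflake and
# push them to a SaaS REST API. Endpoint, table, and payload are
# hypothetical; native connectors handle this in practice.
import requests
import snowflake.connector

conn = snowflake.connector.connect(account="...", user="...", password="...",
                                   warehouse="ETL_WH", database="ANALYTICS")
cur = conn.cursor()
cur.execute("SELECT account_id, health_score FROM marts.account_health")

for account_id, health_score in cur.fetchall():
    requests.patch(  # hypothetical CRM endpoint
        f"https://crm.example.com/api/accounts/{account_id}",
        json={"health_score": health_score},
        timeout=10,
    )
```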

Pricing transparency

A typical 50M-row pipeline, three ways.

Same workload — Salesforce account changes, Postgres orders, hourly SaaS syncs — priced under three common ETL pricing models. Numbers are approximate, based on public pricing as of 2026.

Consumption (per-row)

~$8,000/mo

Scales linearly with row volume; busy months bring surprise spikes in the bill.

Credit-based

~$3,500/mo

Better, but credits expire. Compute-tier upgrades during peak load add cost.

Etlworks (fixed tier)

$1,000/mo

Standard tier, all features, all rows. Predictable for budgets, painless for data teams.

Specifications

Snowflake integration depth.

Every Snowflake feature you'd actually use — staging integrations, CDC patterns, file formats, security models — supported and documented.

Loading
External stages
S3, Azure Blob, Google Cloud Storage · auto-managed lifecycle
Internal stages
User stages, table stages, named stages · PUT and COPY INTO automated
File formats
CSV, JSON, Parquet, Avro, XML, gzip
CDC & streaming
CDC into Snowflake
MERGE-based deduping · INSERT/UPDATE/DELETE preserved · idempotent
Snowpipe streaming
Supported via staged auto-ingest (see FAQ below)
Streams & Tasks
Read from Snowflake streams · trigger downstream Etlworks flows on changes · see the sketch after this table
Security & auth
Authentication
Key-pair, OAuth, password, federated SSO · Azure AD / Okta supported
Network policies
Static IP allowlisting · PrivateLink supported on Enterprise plans
Role-based access
Etlworks honors Snowflake RBAC · operates with least-privilege roles
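For the Streams & Tasks row above, the consumption pattern looks roughly like this; table and stream names are hypothetical.

```python
# Minimal stream-consumption sketch for the Streams & Tasks row above.
# Table and stream names are hypothetical; a downstream Etlworks flow
# reads the stream the same way.
STREAM_SQL = """
CREATE STREAM IF NOT EXISTS orders_stream ON TABLE raw.orders;

-- Reading the stream inside a consuming DML transaction advances its
-- offset, so each change is processed once.
SELECT * FROM orders_stream WHERE METADATA$ACTION IN ('INSERT', 'DELETE');
"""
```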

Comparing Snowflake ETL tools? See Etlworks vs Fivetran, Matillion, and Airbyte

Proof

Petabyte-scale Snowflake pipelines, in production.

“Etlworks collects data from 1,600+ MySQL databases via CDC and loads it into Snowflake — saving us hundreds of thousands of dollars annually and giving our team and customers instant access to insights.”
Intertek Alchemy
Real-time CDC · 1,600+ MySQL → Snowflake
Read the case study

FAQ

Common questions.

Does Etlworks use Snowpipe or COPY INTO?
Etlworks doesn't call Snowpipe directly. Etlworks ingests data into a stage (S3, Azure Blob, or GCS); from there, Snowpipe is configured at the Snowflake level — auto-ingest from the stage, triggered by Snowflake event notifications. Etlworks handles the staging step; Snowflake handles the Snowpipe step. For bulk loading, Etlworks uses COPY INTO directly from the stage.
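The Snowflake-side piece of that setup is a pipe like the sketch below; pipe, stage, and table names are hypothetical.

```python
# Shape of the Snowflake-side Snowpipe setup described above: the pipe
# auto-ingests whatever lands in the stage. All names are hypothetical.
PIPE_SQL = """
CREATE PIPE raw.orders_pipe
  AUTO_INGEST = TRUE   -- fires on cloud event notifications from the stage
AS
  COPY INTO raw.orders
  FROM @my_s3_stage/orders/
  FILE_FORMAT = (TYPE = JSON);
"""
```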
How does CDC handle DELETEs into Snowflake?
DELETEs are honored via MERGE statements. The CDC pipeline tracks change events (INSERT/UPDATE/DELETE) from the source database log and applies them as a single MERGE operation against the Snowflake target — preserving deletes, updates, and inserts atomically. Soft-delete patterns (set a deleted_at column instead of removing the row) are also supported via flow configuration.
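For the soft-delete variant, the MERGE simply flags the row instead of removing it; a sketch with hypothetical table and column names:

```python
# Soft-delete variant of the CDC MERGE: mark the row rather than
# deleting it. Table and column names are hypothetical.
SOFT_DELETE_MERGE = """
MERGE INTO analytics.orders AS t
USING raw.cdc_batch AS s
  ON t.order_id = s.order_id
WHEN MATCHED AND s.op = 'D' THEN
  UPDATE SET t.deleted_at = s.changed_at   -- keep the row, flag it
WHEN MATCHED THEN
  UPDATE SET t.status = s.status, t.deleted_at = NULL
WHEN NOT MATCHED AND s.op <> 'D' THEN
  INSERT (order_id, status) VALUES (s.order_id, s.status);
"""
```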
Can I use dbt with Etlworks?
Yes. Etlworks doesn't replace dbt — they work together. Common pattern: Etlworks loads raw data into Snowflake (stage / land tables), then triggers a dbt run to transform into your modeled layer. You can call dbt Cloud or run dbt-core via the Etlworks scheduler. Some teams use dbt for transformations only; others use Etlworks's native SQL/JavaScript transformations and skip dbt entirely.
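A minimal sketch of that handoff, assuming dbt-core is installed where the scheduler runs; the project path and target name are placeholders.

```python
# Load-then-transform handoff: after raw data lands, kick off a
# dbt-core run from the scheduler. Project path and target name are
# hypothetical.
import subprocess

result = subprocess.run(
    ["dbt", "run", "--project-dir", "/opt/dbt/analytics", "--target", "prod"],
    capture_output=True, text=True,
)
if result.returncode != 0:
    raise RuntimeError(f"dbt run failed:\n{result.stdout}\n{result.stderr}")
```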
What about Snowflake compute costs from Etlworks workloads?
Etlworks's loading patterns are designed to minimize warehouse-up time. File-based COPY INTO uses an XS or S warehouse for seconds, not hours. Snowpipe Streaming uses serverless compute (no warehouse). MERGE operations during CDC are batched. In production, ETL-driven Snowflake costs are typically 5-15% of total compute spend; analytics queries dominate. We provide a cost-estimation worksheet during evaluation.
What's required to set up the Snowflake connection?
A Snowflake user with an appropriate role (typically a dedicated ETL_USER role with USAGE on the warehouse, plus INSERT/SELECT/UPDATE/DELETE on target schemas). Authentication via key-pair (recommended), password, or OAuth. PrivateLink and IP allowlisting are supported on Enterprise plans. The documentation walks through the setup in about 10 minutes.
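For reference, the recommended key-pair connection looks like this with the standard Python connector; the account, user, role, and key path are placeholders.

```python
# Key-pair connection sketch using snowflake-connector-python.
# Account, user, role, and key path are placeholders.
import snowflake.connector
from cryptography.hazmat.primitives import serialization

with open("/secrets/etl_user_rsa_key.p8", "rb") as f:
    pkey = serialization.load_pem_private_key(f.read(), password=None)

conn = snowflake.connector.connect(
    account="myorg-myaccount",
    user="ETL_USER",
    private_key=pkey.private_bytes(        # the connector expects DER bytes
        encoding=serialization.Encoding.DER,
        format=serialization.PrivateFormat.PKCS8,
        encryption_algorithm=serialization.NoEncryption(),
    ),
    role="ETL_ROLE",
    warehouse="ETL_WH",
)
```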
Can I migrate from Fivetran / Matillion / Airbyte?
Yes — migrations from each are common and well-documented. We provide migration assessment for cost (typically 50-70% savings vs Fivetran consumption pricing), timeline (most migrations run 2-4 weeks), and connector parity. Reach out via Talk to us and we'll send a migration brief specific to your current platform.
How does Reverse ETL pricing work?
Same as ETL — included in your tier. Most ETL platforms charge separately for Reverse ETL (Hightouch, Census) or don't support it. Etlworks treats data movement as bidirectional from day one. Push from Snowflake to Salesforce / HubSpot / NetSuite / 200+ destinations on the same monthly subscription.

Start your trial

14 days. No card. Real workloads.

Spin up a free trial, point it at your Snowflake account, and load production data. See what predictable ETL pricing actually feels like.