Quickstart
The Etlworks CLI is built into the platform — no install, no credential management. Open it, write a statement, hit Ctrl+Enter. The same engine drives POST /v1/cli for automation.
Open the CLI
Sign in to Etlworks. Open CLI from the main navigation panel. The console runs entirely inside the platform under your user identity — no separate auth, no API key to manage.
If you're driving the CLI from a script, CI job, or external orchestrator, the same commands run via POST /v1/cli. See Driving from REST below — everything else on this page applies identically.
The interface
- Command Editor — top pane. Write and edit one or many statements (separate with ;).
- Command Palette — searchable list of every available command.
- History — full execution history with timestamps. Re-run, copy, audit.
- Output Panel — JSON output, SQL-transformed tables, errors, logs, captured values.
Keyboard shortcuts
| Action | Shortcut |
|---|---|
| Run command | Ctrl + Enter (or Ctrl + F2) |
| Open command palette | Ctrl + Shift + O |
| Open history | Ctrl + Shift + H |
| Clear output | Ctrl + L |
| Find & replace | Cmd + Option + F |
| Go to line | Alt + G |
| Undo / Redo | Cmd + Z / Shift + Cmd + Z |
| Close dropdown | Esc |
Your first script
Confirm identity, list flows tagged production, and dump one flow's configuration to a file:
whoami;
list-flows where tags = "production";
get-flow 12345 >> backup/flow-12345.json;
Hit Ctrl + Enter. Each statement runs in order; the Output Panel shows three blocks — user info, the filtered flow list, and a confirmation that the file was written under the tenant's data folder.
Statements & comments
The CLI is a statement-based execution engine. A script is one or more statements; each ends with ; when there's more than one. Comments use // for single-line and /* … */ for multi-line.
// Single statement, no semicolon needed
list-connections
// Multiple statements — semicolons required between them
list-connections; list-flows;
/*
Multi-line comment.
Useful for documenting longer scripts.
*/
list-tenants;
Parameters & quoting
Every command follows the same shape:
command-name positional1 'positional2 with spaces'
param1=value param2='value with spaces'
--modifier1 --modifier2
select … from … where … order by … limit …
| capture var=stdout.field
& >> relative/path/name.json;
Positional args come first, named key=value args follow in any order. Wrap any value containing spaces, special characters, or JSON in quotes — single or double, both work, and inner quotes can be escaped:
print '{ "id": 10, "name": "Sample" }';
print '{ "message": "He said \"hello\"" }';
print "{ 'path': 'c:\\\\data\\\\file.txt' }";
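If you generate statements programmatically, a small helper keeps quoting consistent. This Python sketch assumes the escaping rules illustrated above (backslash-escaped inner quotes); the helper names are illustrative, not part of the CLI:

```python
import json

def cli_quote(value: str) -> str:
    """Wrap a value in single quotes for a CLI statement,
    backslash-escaping backslashes and inner single quotes
    (per the escaping rules shown above)."""
    escaped = value.replace("\\", "\\\\").replace("'", "\\'")
    return f"'{escaped}'"

def print_json_statement(obj) -> str:
    """Build a `print '<json>'` statement from a Python object."""
    return f"print {cli_quote(json.dumps(obj))};"

# Example: embed a JSON payload in a print statement
stmt = print_json_statement({"id": 10, "name": "Sample"})
```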
Modifiers
| Modifier | Effect |
|---|---|
| --force | Allow destructive operations (delete-flow, delete-tenant, flows-suspend, …) without an interactive confirm. |
| --silent | Suppress normal output. Errors still surface. |
| --silent-if-empty | Suppress output only if stdout is null or an empty array. Errors still surface. |
| --stop-on-error | Stop the entire script (or loop) on the first failure. |
Modifiers go after the command name and before any SQL or capture clause:
list-flows --silent select id where flowType like '%cdc%' | capture cdc=stdout;
Capture variables
Pipe a command's JSON output into a variable, then reference it later as {name}. Capture works on whole results, single fields, or arrays.
// Single field
get-connection 120 | capture connId=stdout.id;
// Multiple fields in one capture
get-connection 120 | capture id=stdout.id name=stdout.connection.name;
// Capture an entire array — then iterate
list-tenants | capture tenants=stdout;
for-each (t in tenants) {
print '{t.name}';
};
// Use captured values inside JSON
print '{"id":"{id}","name":"{name}"}';
Control structures
if and for-each are built into the language.
// if — full boolean expressions, comparisons, JSON-path access
if ({connId} == 100) {
print 'Matched';
};
// for-each over arrays
for-each (tenant in tenants) {
print '{tenant.email}';
};
// Nested loops + per-iteration commands
for-each --stop-on-error (agent in agents) {
update-agent {agent.id} --force;
};
Loops continue on per-iteration errors by default (consistent with the script-wide default). Add --stop-on-error on the loop itself to halt on the first failure.
Parallel execution
Append & to run a command asynchronously — the script keeps going without waiting:
// Fire and forget
run-flow 100 &;
// With timeout
run-flow 200 timeout=15000 &;
// Combined with redirect
get-flow 100 & >> f100.json
Default cap is 10 parallel commands per request (server-side cli.max.parallel).
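If a client generates scripts with many `&` statements, it can batch them so no single request exceeds that cap. A minimal Python sketch — the helper name and batching approach are illustrative, not part of the CLI:

```python
def batch_async_statements(statements, cap=10):
    """Split a list of async (`... &;`) statements into scripts of at
    most `cap` statements each, so every request stays within the
    server-side cli.max.parallel limit (default 10)."""
    return [
        "\n".join(statements[i:i + cap])
        for i in range(0, len(statements), cap)
    ]

# Example: 25 fire-and-forget runs become 3 requests of <= 10 each
flow_ids = range(100, 125)
stmts = [f"run-flow {fid} &;" for fid in flow_ids]
scripts = batch_async_statements(stmts)
```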
Redirecting output
Append >> filename at the end of any statement to save its stdout to a file under the tenant's application data directory. Subfolders are auto-created. Absolute paths are not allowed.
// Save all flows
list-flows >> flows.json
// Subfolder
list-connections >> exports/conn-list.json
// Filename from a captured variable
list-flows --silent where name like "%script%" | capture flows=stdout;
for-each (f in flows) {
get-flow {f.id} >> backup/{f.name}.json;
};
// Combined with inline SQL
list-flows select id,name where flowType like '%cdc%' >> cdc.json
Inline SQL on JSON output
Most commands return JSON. The CLI ships a compact SQL engine that operates directly on that JSON — SELECT, FROM (array flattening), WHERE, ORDER BY, LIMIT, plus scalar & time-helper functions.
list-connections scope=all
select id, name, connectionType, date(created) as created
where connectionType like '%mysql%'
order by created desc
limit 10;
flow-executions-aggregated offset=5
select flowId, auditId, started, ended, status
from executions
where status != 'success'
and started > ts("now - 10d")
order by ended desc
limit 10;
Time helpers like ts("now - 1d"), addDays(ts, n), and parseDate(text, pattern), plus string and aggregate utilities, are documented in the SQL reference. The full SQL grammar comes from the Built-in SQL in CLI support article — this page mirrors the highlights; that article has the exhaustive function list.
Exit codes & output
Each statement returns a CliResponse:
| Field | Meaning |
|---|---|
| command | The statement that ran. |
| stdout | Result of the command — JSON for read commands, plain text for status/info, SQL-transformed when an SQL clause was present. |
| stderr | Errors, warnings, prompts. |
| exitCode | 0 on success, non-zero on error. |
The REST endpoint returns an array of these — one per statement.
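A client consuming that array can fail fast on the first non-zero exitCode. A short Python sketch over the CliResponse shape described above (the helper name is illustrative):

```python
def first_failure(responses):
    """Return the first CliResponse dict with a non-zero exitCode,
    or None if every statement succeeded.

    `responses` is the array returned by POST /v1/cli — one dict per
    statement, with command/stdout/stderr/exitCode fields."""
    for resp in responses:
        if resp.get("exitCode", 0) != 0:
            return resp
    return None

# Example: second statement failed
responses = [
    {"command": "whoami", "stdout": "...", "stderr": "", "exitCode": 0},
    {"command": "get-flow 999", "stdout": "", "stderr": "not found", "exitCode": 1},
]
failed = first_failure(responses)
```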
Confirmation for destructive operations
Roughly 50 commands require explicit confirmation: every delete-*, every edit-*, flows-suspend, flows-resume, run-flow, agent lifecycle commands. Pass --force to confirm inline; without it, interactive runs prompt and scripted runs return exitCode=1 with a "confirmation required" message.
// In a script:
delete-flow 12345 --force;
// Bulk:
list-flows where name like '%test cdc%' | capture flows=stdout;
for-each (f in flows) {
delete-flow {f.id} deleteSchedules=true --force;
};
Roles & permissions
Commands run as the calling user. Each command requires a minimum role; the CLI inherits your role and tenant scope — the same boundary as the UI and REST API.
| Role | Can do | Examples |
|---|---|---|
| Viewer | Read-only across the platform | list-flows, get-connection, flow-executions |
| Operator | Run flows + read | run-flow, run-flow-by-name, stop-flow |
| Editor | CRUD on flows, connections, schedules … | add-flow, edit-connection, delete-schedule |
| Admin | Tenant-level config + user management | add-user, edit-tenant, flows-suspend |
| Super Admin | Cross-tenant + system-level | add-tenant, set-agent-max-ram, list-app-releases |
Driving from REST
Same engine, same commands, programmatic. POST /v1/cli takes a CliRequest with the script as a single string and an optional variable map. Returns an array of CliResponse objects — one per statement.
curl -s -X POST "$ETLWORKS_URL/v1/cli" \
-H "Authorization: Bearer $ETLWORKS_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"command": "list-flows --silent where tags = \"production\" | capture flows=stdout; for-each (f in flows) { get-flow {f.id} >> backup/{f.name}.json; };",
"vars": {}
}'
# → [
# { "command": "list-flows ...", "stdout": "...", "stderr": "", "exitCode": 0 },
# { "command": "get-flow 12345 >> backup/...", "stdout": "...", "stderr": "", "exitCode": 0 },
# ...
# ]
The vars map is the REST equivalent of CLI {var} interpolation — the script references {flowName}, the request supplies "vars": {"flowName": "Stripe to Snowflake"}. Auth is the same Bearer token used by the REST API: an API key (long-lived) or a JWT (mint via POST /v1/token-auth).
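Programmatically, the request body is a small JSON document. A hedged Python sketch building a CliRequest with a vars map (field names per the example above — verify against your API version; the helper name is illustrative):

```python
import json

def build_cli_request(script: str, variables=None) -> str:
    """Serialize a CliRequest body: the script as one string in
    `command`, plus an optional `vars` map for {var} interpolation."""
    return json.dumps({"command": script, "vars": variables or {}})

# Example: script references {flowName}; the request supplies it
body = build_cli_request(
    "list-flows where name = '{flowName}';",
    {"flowName": "Stripe to Snowflake"},
)
# `body` is ready to send as the JSON payload of POST /v1/cli
```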
From here: