REST API

User-defined APIs

Expose any flow as a custom REST API endpoint — or have an external system trigger a flow by POSTing to one. Configure it once, and your endpoint lives at /rest/v1/httplistener/<your-pattern>, runs under the calling user's permissions, and shows up in the audit log alongside everything else.

Distinct from the platform REST API

The endpoints in the REST API reference — /v1/flows, /v1/connections, etc. — are built-in and manage the platform itself. User-defined APIs are endpoints you create by attaching a flow to an HTTP Listener. They're how you expose your own data and integrations as REST.

Overview

The core building block is an HTTP Listener — a connection that listens on a configurable path and method. Depending on which side of the source-to-destination flow it sits on, the listener either pushes the request payload into the flow or returns the flow's output as the HTTP response:

Pattern | Listener role | Use case
PULL | Destination (TO) | Expose a database, file, or third-party API as a JSON/XML/CSV endpoint.
PUSH | Source (FROM) | External system POSTs a payload → Etlworks runs a flow that processes it.
UPLOAD | Source (FROM) | POST a file body → Etlworks copies the bytes into file storage (S3, FTP, server, …).
CRUD | Source or destination per method | One resource exposed across POST, PUT, DELETE, GET.

What a request looks like

Aspect | Spec
Path | /rest/v1/httplistener/<url-pattern>
Methods | GET, POST, PUT, DELETE (whichever you configured on the listener)
Auth header | Authorization: Bearer <access-token> · Authorization: Basic <base64(user:pass)>
Request body | Any supported format — JSON, XML, CSV, fixed-length, HL7. Empty body allowed (see Empty payload).
Content-Type | Not required (the listener parses by Format)
Response | JSON output of the flow (PULL) · pre-defined receipt (PUSH/PUT/DELETE) · user-defined (see Custom responses)
Status codes | 200 success · 401 / 403 auth · 500 internal error

Pre-defined response shape (PUSH / PUT / DELETE)

JSON
{
  "auditId": 106358,
  "messageUuid": "caaa8347-063c-449b-a29b-12e68799c986",
  "flowId": 3219,
  "status": "success"
}

Endpoint URL

Every user-defined API lives at:

HTTP
https://<your-instance>.etlworks.com/rest/v1/httplistener/<your-url-pattern>

For Etlworks Cloud customers, the host is app.etlworks.com. For on-prem, it's your configured hostname.

URL pattern

Set the URL Pattern field when creating the HTTP Listener. The pattern becomes the path your endpoint responds at:

URL Pattern | Resulting endpoint
patients | https://app.etlworks.com/rest/v1/httplistener/patients
hl7/patient | https://app.etlworks.com/rest/v1/httplistener/hl7/patient
patients/{patientid} | https://app.etlworks.com/rest/v1/httplistener/patients/123 — {patientid} binds to the path segment

URL patterns are unique

Only one HTTP Listener can claim a given pattern at a time. /patients and /patient are different patterns and can coexist; two listeners both claiming /patients cannot.

Path parameters become flow variables

Tokens like {patientid} in the URL pattern are extracted from the request path and exposed to the flow as variables — usable in source SQL, the FROM/TO fields, and JavaScript / Python.

URL Pattern
patients/{patientid}
Source SQL
select * from patient where patientid = {patientid}

You can also surface request headers as variables — see Request & response headers.


Authentication — JWT (recommended)

Set Auth Type to JWT on the listener. Auth Location is either Header (recommended) or Query Parameter.

Three steps to call the endpoint

  1. Create an API user with a role that can execute flows. (API users can only execute Listener-backed endpoints, not the platform REST API.)
  2. Exchange username + password for an access token via the auth endpoint:
    Shell
    curl -s -X POST https://app.etlworks.com/rest/v1/token-auth/issue \
      -H "Content-Type: application/json" \
      -d '{"login":"api-user","password":"…"}'
    # → { "token": "eyJ0eXAiOiJKV1Q…" }
    Or skip this step entirely by generating a non-expiring API key (Settings → Users → Edit user → API Key) and using that as the token.
  3. Send the token on every call to your endpoint:
    Auth location | Where the token goes
    Header | Authorization: Bearer eyJ0eXAi…
    Query Parameter | ?Authorization=Bearer%20eyJ0eXAi…
Token lifetime

Tokens minted via /v1/token-auth/issue are short-lived (~10 minutes). Refresh before each batch of calls, or use a non-expiring API key for service accounts.
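Putting steps 2 and 3 together, a shell sketch — the token value is a placeholder standing in for the real response, and python3 is used here only as a convenient JSON parser (any parser works):

```shell
# 1. Request a token (response shown as a literal; in practice it comes from:
#    curl -s -X POST https://app.etlworks.com/rest/v1/token-auth/issue ...)
response='{ "token": "eyJ0eXAiOiJKV1Q..." }'

# 2. Extract the "token" field
token=$(printf '%s' "$response" | python3 -c 'import sys, json; print(json.load(sys.stdin)["token"])')

# 3. Send it as a Bearer header on every call:
# curl -s "https://app.etlworks.com/rest/v1/httplistener/patients/123" \
#   -H "Authorization: Bearer $token"
echo "$token"
# → eyJ0eXAiOiJKV1Q...
```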

Authentication — Basic

Less secure than JWT — use only when the calling system can't manage tokens (e.g. a third-party webhook that only sends a static Authorization header).

Set Auth Type to Basic on the listener. Auth Location is Header, Query Parameter, or (see next section) embedded in the URL path.

  1. Base64-encode user:password:
    Shell
    printf 'admin:admin' | base64
    # → YWRtaW46YWRtaW4=
  2. Send on every call:
    Auth location | Where it goes
    Header | Authorization: Basic YWRtaW46YWRtaW4=
    Query Parameter | ?Authorization=Basic%20YWRtaW46YWRtaW4=
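Instead of hand-building the header, curl can derive it from -u (credentials below are the admin:admin example from step 1; the endpoint pattern is a placeholder):

```shell
# curl computes "Authorization: Basic <base64>" from -u by itself:
# curl -s "https://app.etlworks.com/rest/v1/httplistener/patients" -u 'admin:admin'

# The header value it sends is equivalent to:
printf 'Authorization: Basic %s\n' "$(printf 'admin:admin' | base64)"
# → Authorization: Basic YWRtaW46YWRtaW4=
```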

Path-based auth (for restrictive webhooks)

Some third-party webhook systems (eBay, certain SaaS platforms) only let you configure a static URL — no header, no query string. The workaround: embed a doubly-base64-encoded Basic auth string as the last path segment.

  1. Base64-encode user:password: admin:admin → YWRtaW46YWRtaW4=
  2. Build the standard Basic auth string: Basic YWRtaW46YWRtaW4=
  3. Base64-encode that: QmFzaWMgWVdSdGFXNDZZV1J0YVc0PQ==
  4. Append to your URL Pattern: some/path/QmFzaWMgWVdSdGFXNDZZV1J0YVc0PQ== — or use a path token like some/path/{auth} for flexibility.

Wait ~1 minute for the URL to propagate, then point your webhook at it. The listener decodes the trailing segment and authenticates.
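The encoding steps above can be scripted (admin:admin is the example credential from the previous section):

```shell
# Steps 1-3: base64(user:pass), prepend "Basic ", base64 the whole string again
creds=$(printf 'admin:admin' | base64)
segment=$(printf 'Basic %s' "$creds" | base64)
echo "$segment"
# → QmFzaWMgWVdSdGFXNDZZV1J0YVc0PQ==

# Step 4: append it as the last path segment of the URL Pattern:
# some/path/QmFzaWMgWVdSdGFXNDZZV1J0YVc0PQ==
```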


PULL — expose a data source as an API endpoint

Tableau wants the latest customer list, in JSON, behind an authenticated URL. Wire up a flow whose source is your warehouse and whose destination is the listener — the listener serves the flow's output as the response.

Setup

  1. Create an HTTP Listener with Method = GET and a URL pattern.
  2. Create a Format for the response: JSON, JSON Dataset, XML, XML Dataset, CSV, or Fixed-Length Text.
  3. Create an "any-to-web-service" flow where the source is your data (database, file, API) and the destination is the listener + format from steps 1 & 2.
  4. Configure the source query, mapping, and any flow parameters needed. Use {patientid}-style tokens to bind path parameters into the SQL.
  5. Schedule the flow as event-driven. Etlworks executes it on every inbound GET.

Calling it

Shell
curl -s "https://app.etlworks.com/rest/v1/httplistener/patients/123" \
  -H "Authorization: Bearer eyJ0eXAi…"
# → { "id": 123, "name": "…", … }

Test it from inside Etlworks Explorer too — create an Etlworks API Connection pointing at the endpoint and browse the response there.


PUSH — trigger a flow when a payload arrives

A SaaS application updates a record. Its webhook fires and POSTs the updated record's JSON to your endpoint. Your flow runs — transforms the payload, writes it to the warehouse, sends an email.

Setup

  1. Create an API user with a role that can run flows.
  2. Create an HTTP Listener with Method = POST (or PUT).
  3. Create a Format matching the inbound payload — JSON, XML, CSV, fixed-length, HL7 v2.x, FHIR.
  4. Build a flow whose source is a web service. The source connection (FROM) is the listener.
  5. Add mapping + parameters as you would normally.
  6. Schedule the flow as event-driven. It runs every time a payload arrives.
One listener per flow

A flow can have at most one listener as its source connection. If you need a second inbound endpoint for the same processing logic, create a second flow that calls the first as a nested flow.

Calling it

Shell
curl -s -X POST "https://app.etlworks.com/rest/v1/httplistener/hl7/patient" \
  -H "Authorization: Bearer eyJ0eXAi…" \
  -H "Content-Type: application/json" \
  -d '{"id":123,"name":"…"}'
# → { "auditId": 106358, "messageUuid": "…", "flowId": 3219, "status": "success" }
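Callers that need to correlate the run later can pull auditId out of the receipt. A sketch, using the receipt values shown above as a literal and python3 as the JSON parser (an assumption — any parser works):

```shell
receipt='{"auditId":106358,"messageUuid":"caaa8347-063c-449b-a29b-12e68799c986","flowId":3219,"status":"success"}'
# Extract the auditId to look up this run in the audit log later
audit_id=$(printf '%s' "$receipt" | python3 -c 'import sys, json; print(json.load(sys.stdin)["auditId"])')
echo "$audit_id"
# → 106358
```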

UPLOAD — ingest files via HTTP

An HTTP Listener can be the source of a Copy Files flow — turning a POST endpoint into a file dropzone that lands bytes in any supported file storage (server, FTP/FTPS/SFTP, S3, etc.).

Setup

  1. Create an HTTP Listener with Method = POST and an Auth Type. URL Pattern uses a path token for the filename:
    URL Pattern
    file/{filename}
  2. Create a Copy Files flow.
  3. Set the FROM connection to the listener.
  4. Set the TO connection to any file connection.
  5. Set the destination filename to either a literal value (test.json) or the path token ({filename}) so the caller controls it.
  6. Schedule the flow as event-driven.

Calling it

Shell
curl -s -X POST "https://app.etlworks.com/rest/v1/httplistener/file/test2.json" \
  -u "api-user:…" \
  --data-binary @local-file.json
# The body of local-file.json lands at the configured destination as test2.json.

CRUD — multiple methods on one resource

Build a full create / read / update / delete API by configuring listeners with the appropriate HTTP method and wiring each to a flow. Path tokens parameterize destination queries.

Method | Wiring | Execution mode
GET | Listener as destination — same as PULL. | Synchronous — caller waits for the flow.
POST | Listener as source. Destination is the resource you're creating into. Tokenize the destination query with path / payload variables. | Async — the caller gets a receipt before the flow runs.
PUT | Same as POST; pick PUT when semantics matter. | Sync — caller waits for the flow.
DELETE | Same shape as GET, but configure the listener with DELETE. Use Successful DELETE response code to control the returned status (200, 204, etc.). | Sync.

Example — tokenized destination query

SQL (destination)
update patient
   set name = {name}
 where patientid = {patientid}

With URL Pattern patients/{patientid}, calling PUT /rest/v1/httplistener/patients/123 with body {"name":"Jane"} resolves the destination query to update patient set name='Jane' where patientid=123.
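Illustrative only — how the path and payload variables resolve that query (plain string substitution, not Etlworks internals):

```shell
patientid=123   # bound from the {patientid} path segment
name="Jane"     # bound from the "name" field of the JSON payload
echo "update patient set name = '${name}' where patientid = ${patientid}"
# → update patient set name = 'Jane' where patientid = 123
```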


Custom responses

By default, PUSH/PUT endpoints return the pre-defined receipt. Two ways to return something else:

Option A — return the destination's response

Useful when the flow forwards the payload to another HTTP API and you want to relay that API's reply back to the caller.

  1. On the listener, set POST/PUT Processing = Sync and POST/PUT Response File Name = {app.data}/response.json.
  2. On the destination HTTP connection, set Response File Name to the same path.
  3. Build the flow with the listener as source and that HTTP connection as destination.
  4. Schedule event-driven.

The destination's response gets written to response.json and returned to the caller. The file is auto-deleted after the response goes out.

Option B — user-defined response file

Generate a response from inside the flow itself, without delegating to an upstream API.

  1. Listener: POST/PUT Processing = Sync, POST/PUT Response File Name = {app.data}/response.json.
  2. Build a primary flow that processes the inbound payload.
  3. Build a second flow that writes the response file (same path as in step 1).
  4. Combine them into a nested flow so the response file always exists by the time the listener replies.
  5. Schedule the nested flow event-driven.

The response file is auto-deleted after the listener returns it.

Request & response headers

Read inbound headers

Enable Pass request headers to the flow as parameters on the listener. Each header becomes a flow variable — uppercased (Validation-Token → VALIDATION-TOKEN).

JavaScript (flow code)
var token = scenario.getVariable('VALIDATION-TOKEN').getValue();

Set outbound headers

From any place a flow runs JavaScript or Python:

JavaScript (flow code)
etlConfig.addResponseHeader('header-name', headerValue);

Real-world example: webhook subscription handshake

RingCentral's webhooks require the endpoint to echo a Validation-Token request header back in the response to complete the subscription. The flow:

JavaScript (flow condition)
var token = scenario.getVariable('VALIDATION-TOKEN');
etlConfig.addResponseHeader('Validation-Token', token != null ? token.getValue() : '');
value = true;

Empty payload

Some upstream systems POST with no body — especially webhook handshakes. Without a default, the parser raises an exception. Fix: set Default Payload on the listener to a valid document in the chosen format. The flow runs against the default whenever the request body is empty.

Response codes

Code | When
200 | Success.
200 (DELETE: configurable) | Set Successful DELETE response code on the listener if you want a different status (e.g. 204 No Content).
401 | Missing or invalid auth.
403 | Authenticated but not authorized to run this flow.
500 | Internal error during flow execution. Check the audit log via auditId.

Test from inside Etlworks

You don't need an external system to exercise an event-driven flow. Build a "sender" flow that calls your listener.

  1. Prepare test data. A source connection (file, cloud storage, database) plus a Format matching what real callers will send.
  2. Configure the destination. Create an Etlworks API Connection with:
    • Endpoint path: /rest/v1/httplistener/<your-pattern>
    • Method: matches your listener (POST, PUT, GET, DELETE)
    • User / password: the API user configured for the listener
  3. Build the sender flow. Source = test data + format. Destination = the API connection from step 2. Format on both sides matches.
  4. Run the sender. It POSTs (or GETs / PUTs / DELETEs) to your listener, triggering the event-driven flow exactly as a real caller would.

The sender flow is reusable — keep it around for regression testing or wire it into a CI flow.

View inbound payloads

Every payload an HTTP Listener receives is captured in the platform's messages store — queryable by id, by listener, by status, by substring. Use the Messages app in the UI, or the CLI:

Etlworks CLI
// Filter recent messages
list-messages-filtered message=Notification;

// Pull one message body
get-message 24a6e77b-c05f-45ed-b1e8-908ce361bf31;

Or via REST: GET /v1/messages — see the REST API reference.


Building a real API on this?

The NBCUniversal and Transport for NSW case studies are both built on this exact pattern — user-defined APIs in front of multi-source flows. Drop us a note at product@etlworks.com if you hit anything that's not in this guide.