Data Pipeline API#

This API is useful for setting up ETL workflows and controlling your data pipelines programmatically from your side.

API Key#

Please see the guide to set up your API key.

Data Import#

Execute Job#

To execute a data import job, you will need to use the following HTTP endpoint:

POST /data_imports/<data_import_id>/execute.json

If successful, this endpoint returns the ID of the job it created. To get the result, poll for the job data with the Job API.
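
If you are scripting this flow, a minimal Python sketch of the execute-then-poll pattern is below. It is illustrative only: the base URL, the X-API-Key header name, the job_id response field, the /jobs/<job_id>.json status endpoint, and the status values are assumptions; check the API key guide and the Job API documentation for the exact names.

import time
import requests

# Assumed values for illustration -- replace with your workspace URL and API key.
BASE_URL = "https://your-workspace.example.com"
HEADERS = {"X-API-Key": "<your_api_key>"}  # header name assumed; see the API key guide

def execute_data_import(data_import_id):
    # Trigger the import and return the ID of the created job.
    resp = requests.post(
        f"{BASE_URL}/data_imports/{data_import_id}/execute.json",
        headers=HEADERS,
    )
    resp.raise_for_status()
    return resp.json()["job_id"]  # response field name assumed

def wait_for_job(job_id, poll_interval=10):
    # Poll the Job API until the job reaches a terminal status.
    # The /jobs/<job_id>.json endpoint and status values are assumed here.
    while True:
        resp = requests.get(f"{BASE_URL}/jobs/{job_id}.json", headers=HEADERS)
        resp.raise_for_status()
        job = resp.json()
        if job.get("status") in ("success", "failure", "cancelled"):
            return job
        time.sleep(poll_interval)

job = wait_for_job(execute_data_import(123))
print(job["status"])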

Data Transform#

Execute Job#

As with Data Import, you can execute a data transform job with:

POST /data_transforms/<data_transform_id>/execute.json
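
Continuing the sketch above (same assumed base URL, auth header, and response field), the transform endpoint can be called the same way and polled with the same wait_for_job helper:

def execute_data_transform(data_transform_id):
    # Trigger the transform and return the ID of the created job.
    resp = requests.post(
        f"{BASE_URL}/data_transforms/{data_transform_id}/execute.json",
        headers=HEADERS,
    )
    resp.raise_for_status()
    return resp.json()["job_id"]  # response field name assumed

transform_job = wait_for_job(execute_data_transform(456))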

Job Logs#

You can view information about the running job using this endpoint:

GET /jobs/<job_id>/logs.json
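
A small sketch of fetching logs, reusing the assumed BASE_URL and HEADERS from above; the response is assumed here to be a JSON array of log entries, so check the actual shape returned by your instance:

def get_job_logs(job_id):
    # Fetch the log entries recorded for a running or finished job.
    resp = requests.get(f"{BASE_URL}/jobs/{job_id}/logs.json", headers=HEADERS)
    resp.raise_for_status()
    return resp.json()

for entry in get_job_logs(1234):  # 1234: a job ID returned by the execute endpoints above
    print(entry)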

Check Last Run of Imports/Transforms#

Sometimes you want to run a custom processing job daily at a particular time, but only if the ETL jobs ran successfully, which means you need a way to check the status of those ETL jobs.

In that case, you can use the jobs/get_last_jobs.json endpoint with the following parameters:

  • source_type: DataImport or DataTransform
  • ids: array of your transform/import IDs

Sample Request:

GET /jobs/get_last_jobs.json?source_type=DataTransform&ids[]=123&ids[]=456

Sample Response:

{
  "123": { ... },
  "456": { ... }
}