Data Pipeline API#
The Data Pipeline API is useful for setting up ETL workflows and controlling your data pipelines from your side.
API Key#
Please see the guide to set up your API key.
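As a sketch of how the key is attached to requests below: a common pattern is to pass it in an `X-Holistics-Key` request header. The header name and the environment-variable lookup here are assumptions; confirm the exact mechanism in the API key guide.

```python
import os

def auth_headers():
    # Assumption: the API key is sent in an "X-Holistics-Key" header.
    # Reading it from an environment variable keeps it out of source code.
    return {"X-Holistics-Key": os.environ.get("HOLISTICS_API_KEY", "your-api-key")}
```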
Data Import#
Execute Job#
To execute a data import job, you will need to use the following HTTP endpoint:
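A minimal sketch of what the call might look like. The host, endpoint path, header name, and `job_id` response field below are assumptions, not taken from this page; substitute the exact endpoint shown in the docs.

```python
import json
import urllib.request

BASE_URL = "https://secure.holistics.io"  # assumed host
API_KEY = "your-api-key"

def import_execute_url(import_id):
    # Hypothetical path -- replace with the execute endpoint from the docs.
    return f"{BASE_URL}/data_imports/{import_id}/execute.json"

def execute_import(import_id):
    """POST to the execute endpoint and return the created job's ID."""
    req = urllib.request.Request(
        import_execute_url(import_id),
        method="POST",
        headers={"X-Holistics-Key": API_KEY},  # assumed auth header
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["job_id"]  # assumed response field

# Usage (requires a valid API key and import ID):
# job_id = execute_import(1234)
```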
If successful, this endpoint returns the ID of the created job. To get the result, poll for the job data with the Job API: https://docs.holistics.io/reference#job-info-1
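The polling loop can be sketched as follows. The Job API path, the `status` field, and the terminal status values here are assumptions; check the Job API reference linked above for the actual shapes.

```python
import json
import time
import urllib.request

BASE_URL = "https://secure.holistics.io"  # assumed host
API_KEY = "your-api-key"

def job_url(job_id):
    # Hypothetical path -- see the Job API reference for the real endpoint.
    return f"{BASE_URL}/queue/{job_id}.json"

def wait_for_job(job_id, interval=10, timeout=600):
    """Poll the Job API until the job reaches a terminal status."""
    terminal = {"success", "failure", "cancelled"}  # assumed status values
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        req = urllib.request.Request(
            job_url(job_id), headers={"X-Holistics-Key": API_KEY}
        )
        with urllib.request.urlopen(req) as resp:
            status = json.load(resp).get("status")
        if status in terminal:
            return status
        time.sleep(interval)
    raise TimeoutError(f"job {job_id} still running after {timeout}s")
```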
Data Transform#
Execute Job#
As with Data Import, you can execute a data transform job with:
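The shape mirrors the import sketch above; again, the host, path, header, and `job_id` field are assumptions to be replaced with the endpoint from the docs.

```python
import json
import urllib.request

BASE_URL = "https://secure.holistics.io"  # assumed host
API_KEY = "your-api-key"

def transform_execute_url(transform_id):
    # Hypothetical path -- replace with the transform execute endpoint.
    return f"{BASE_URL}/data_transforms/{transform_id}/execute.json"

def execute_transform(transform_id):
    """POST to the execute endpoint and return the created job's ID."""
    req = urllib.request.Request(
        transform_execute_url(transform_id),
        method="POST",
        headers={"X-Holistics-Key": API_KEY},  # assumed auth header
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["job_id"]  # assumed response field
```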
Job Logs#
You can view information about a running job using this endpoint:
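A hedged sketch of fetching the logs; the logs path and response shape are assumptions, so substitute the endpoint shown in the docs.

```python
import json
import urllib.request

BASE_URL = "https://secure.holistics.io"  # assumed host
API_KEY = "your-api-key"

def job_logs_url(job_id):
    # Hypothetical path -- substitute the job logs endpoint from the docs.
    return f"{BASE_URL}/jobs/{job_id}/logs.json"

def fetch_job_logs(job_id):
    """GET the log entries for a running (or finished) job."""
    req = urllib.request.Request(
        job_logs_url(job_id), headers={"X-Holistics-Key": API_KEY}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```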
Check Last Run of Imports/Transforms#
Sometimes you want to run a custom processing job daily at a particular time, but only if your ETL jobs ran successfully. To do this, you need a way to check the status of those ETL jobs.
In that case, you can use the jobs/get_last_jobs.json endpoint.
Parameters:
- `source_type`: `DataImport` or `DataTransform`
- `ids`: array of your transform/import IDs
Sample Request:
Sample Responses:
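Since the original sample request and response blocks are not reproduced here, the following is a hedged sketch of calling this endpoint. The host, auth header, `ids[]` array encoding, and the `status` field in the response are all assumptions; only the `jobs/get_last_jobs.json` path and the two parameters come from this page.

```python
import json
import urllib.parse
import urllib.request

BASE_URL = "https://secure.holistics.io"  # assumed host
API_KEY = "your-api-key"

def last_jobs_query(source_type, ids):
    """Build the query string for jobs/get_last_jobs.json."""
    if source_type not in ("DataImport", "DataTransform"):
        raise ValueError("source_type must be DataImport or DataTransform")
    # "ids[]" array encoding is an assumption (typical Rails-style APIs).
    return urllib.parse.urlencode(
        [("source_type", source_type)] + [("ids[]", i) for i in ids]
    )

def get_last_jobs(source_type, ids):
    """Fetch the last run of each listed import/transform."""
    url = f"{BASE_URL}/jobs/get_last_jobs.json?{last_jobs_query(source_type, ids)}"
    req = urllib.request.Request(url, headers={"X-Holistics-Key": API_KEY})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Gate a custom daily job on the ETL runs having succeeded
# (the "status" field name is an assumption):
# jobs = get_last_jobs("DataTransform", [101, 102])
# if all(j.get("status") == "success" for j in jobs):
#     run_custom_processing()
```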