Data Pipeline API

This will be useful for setting up ETL workflows and controlling data pipelines on your side.
API Key

Please see the guide to set up your API key.
Data Import

Execute Job

To execute a data import job, use the following HTTP endpoint:

If successful, this endpoint returns the ID of the created job. To get the result, poll for the job data with the Job API: https://docs.holistics.io/reference#job-info-1
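A polling loop for the returned job ID could be sketched as below. Note that the base URL, the job-info path, the auth header name, and the status values are all assumptions for illustration; check the Job API reference above for the actual contract.

```python
import json
import time
import urllib.request

BASE_URL = "https://secure.holistics.io"  # hypothetical base URL
API_KEY = "YOUR_API_KEY"

# Assumed terminal status values; confirm against the Job API reference.
TERMINAL_STATUSES = {"success", "failure", "cancelled"}

def is_finished(job: dict) -> bool:
    """Return True once the job has reached a terminal status."""
    return job.get("status") in TERMINAL_STATUSES

def poll_job(job_id: int, interval: int = 10, timeout: int = 600) -> dict:
    """Poll the job-info endpoint (path is an assumption) until the job finishes."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        req = urllib.request.Request(
            f"{BASE_URL}/queue/get_job_results.json?job_id={job_id}",  # hypothetical path
            headers={"X-Holistics-Key": API_KEY},  # header name is an assumption
        )
        with urllib.request.urlopen(req) as resp:
            job = json.load(resp)
        if is_finished(job):
            return job
        time.sleep(interval)
    raise TimeoutError(f"Job {job_id} did not finish within {timeout}s")
```

The loop treats any non-terminal status as "still running" and re-checks at a fixed interval, which is usually sufficient for ETL jobs that run on the order of minutes.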
Data Transform

Execute Job

As with Data Import, you can execute a data transform job with:
Job Logs

You can view information about a running job using this endpoint:
Check Last Run of Imports/Transforms

Sometimes you want to run a custom processing job daily at a particular time, but only if the ETL jobs ran successfully. You then need a way to check the status of those ETL jobs. In that case, you can use the jobs/get_last_jobs.json endpoint.
Parameters:

- source_type: DataImport or DataTransform
- ids: array of your transform/import IDs
Sample Request:
Sample Responses:
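As a rough sketch, a request to jobs/get_last_jobs.json with the two documented parameters could be built like this. The base URL, the auth header, and the array-parameter encoding are assumptions; confirm them against the sample request and response above.

```python
import json
import urllib.parse
import urllib.request

BASE_URL = "https://secure.holistics.io"  # hypothetical base URL
API_KEY = "YOUR_API_KEY"

def build_query(source_type: str, ids: list) -> str:
    """Encode the documented parameters for jobs/get_last_jobs.json."""
    if source_type not in ("DataImport", "DataTransform"):
        raise ValueError("source_type must be DataImport or DataTransform")
    # Array parameters are assumed to be encoded as ids[]=1&ids[]=2;
    # confirm with the sample request.
    pairs = [("source_type", source_type)] + [("ids[]", str(i)) for i in ids]
    return urllib.parse.urlencode(pairs)

def get_last_jobs(source_type: str, ids: list) -> dict:
    """Fetch the last runs of the given imports/transforms (header name is an assumption)."""
    url = f"{BASE_URL}/jobs/get_last_jobs.json?{build_query(source_type, ids)}"
    req = urllib.request.Request(url, headers={"X-Holistics-Key": API_KEY})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

With the returned statuses in hand, a scheduler can decide whether to kick off the downstream custom processing job or skip that day's run.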