Apache Airflow® is a platform created by the community to programmatically author, schedule, and monitor workflows. A DAG (Directed Acyclic Graph) is the core concept of Airflow: it collects tasks together and organizes them with dependencies and relationships that describe how they should run. You declare a DAG in a Python file in the $AIRFLOW_HOME/dags folder of your Airflow instance. This page shows you how to use a Python connector in a DAG to integrate Apache Airflow with your database service.

Prerequisites

To follow the steps on this page:
  • Create a target database service with time-series and analytics enabled.

    You need your connection details. This procedure also works for self-hosted deployments.
This example DAG uses the company table you create in Optimize time-series data in hypertables.

Install Python connectivity libraries

To install the Python libraries required to connect to your database service:

Create a connection between Airflow and your database service

In your Airflow instance, securely connect to your database service:

Exchange data between Airflow and your database service

To exchange data between Airflow and your database service, create and run a DAG that reads from and writes to it.

You have successfully integrated Apache Airflow with your database service and created a data pipeline.