tessl install tessl/pypi-apache-airflow-providers-dbt-cloud@4.4.0

Provider package for integrating Apache Airflow with dbt Cloud for data transformation workflow orchestration.
Agent Success (agent success rate when using this tile): 84%
Improvement (compared to baseline): 1x
Baseline (agent success rate without this tile): 84%
Create an Airflow DAG that discovers and lists all dbt Cloud jobs in a project for inventory tracking. The DAG should retrieve jobs from a dbt Cloud account and project, making their IDs available for downstream tasks.
Your DAG should include a task that:

- **Retrieves the job list:** Connects to dbt Cloud and retrieves all jobs for a specific account and project. The task should output a list of job IDs.
- **Uses standard configuration:** The task should use the default connection identifier `dbt_cloud_default` and accept account and project IDs as parameters.
- **Supports ordered results:** The task should support ordering the job list by a field name (such as 'name').
@generates
```python
from airflow import DAG
from datetime import datetime


def create_job_inventory_dag(
    dag_id: str,
    account_id: int,
    project_id: int,
    order_by: str | None = None,
) -> DAG:
    """
    Create an Airflow DAG that inventories dbt Cloud jobs.

    Args:
        dag_id: Unique identifier for the DAG
        account_id: dbt Cloud account ID
        project_id: dbt Cloud project ID to list jobs from
        order_by: Optional field name to order results (e.g., 'name')

    Returns:
        Configured Airflow DAG

    Example:
        dag = create_job_inventory_dag(
            dag_id='dbt_job_inventory',
            account_id=12345,
            project_id=67890,
            order_by='name'
        )
    """
    pass
```

- dag_id='test_inventory', account_id=12345, and project_id=67890 produces a DAG with a task that lists jobs for that project @test
- order_by='name' produces a task that orders the job list by name @test

Provides integration between Apache Airflow and dbt Cloud for job orchestration and discovery.
@satisfied-by