Apache Airflow backport provider package for Apache Sqoop integration, providing SqoopHook and SqoopOperator for data import/export between relational databases and Hadoop
Overall score: 92%
Design a data ingestion helper that leverages the Sqoop integration to import a relational table into HDFS while exposing toggles for direct connectors, JDBC driver overrides, and passthrough options. The helper should favor concise configuration while ensuring each toggle is honored in the executed import.
@generates
from __future__ import annotations  # enables `str | None` annotations on Python 3.8/3.9

from typing import Mapping


def run_direct_table_import(
    connection_uri: str,
    table: str,
    target_dir: str,
    *,
    use_direct: bool = False,
    jdbc_driver: str | None = None,
    extra_options: Mapping[str, str] | None = None,
) -> None:
    """
    Launch a Sqoop table import into the given target directory using the package's Sqoop facilities.

    - Enables or disables the direct connector based on use_direct.
    - Overrides the JDBC driver when jdbc_driver is provided.
    - Forwards extra Sqoop CLI options exactly as supplied.
    """

Provides Sqoop import and execution utilities integrated with Airflow connections.
Install with the Tessl CLI:

npx tessl i tessl/pypi-apache-airflow-backport-providers-apache-sqoopdocs
evals
scenario-1
scenario-2
scenario-3
scenario-4
scenario-5
scenario-6
scenario-7
scenario-8
scenario-9
scenario-10