Apache Airflow backport provider package for Apache Sqoop integration, providing SqoopHook and SqoopOperator for data import/export between relational databases and Hadoop
Overall score: 92%
A helper that executes an eval-style Sqoop command using a stored Airflow connection and returns the masked CLI string that was executed. It reads Sqoop configuration (job_tracker, namenode, libjars, files, archives, password_file) from the connection identified by conn_id; the returned masked command is what gets logged.
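As a rough sketch of how those connection fields typically map onto a Sqoop command line (the flag names follow Hadoop's generic options and Sqoop's `--password-file`; the dict shape here is illustrative, not the package's internal structure):

```python
from typing import Dict, List

def connection_args(extras: Dict[str, str]) -> List[str]:
    # Map stored connection extras onto Hadoop/Sqoop CLI flags (illustrative).
    mapping = [
        ("namenode", "-fs"),
        ("job_tracker", "-jt"),
        ("libjars", "-libjars"),
        ("files", "-files"),
        ("archives", "-archives"),
        ("password_file", "--password-file"),
    ]
    args: List[str] = []
    for key, flag in mapping:
        # Only emit a flag when the connection actually sets that field.
        if extras.get(key):
            args += [flag, extras[key]]
    return args

extras = {"namenode": "hdfs://nn:8020", "files": "udf.jar"}
print(connection_args(extras))  # → ['-fs', 'hdfs://nn:8020', '-files', 'udf.jar']
```

Fields absent from the connection simply contribute no flags, so the same builder works for minimally and fully configured connections.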
```python
from typing import Any, Dict, Optional

class SqoopCommandRunner:
    def __init__(self, conn_id: str) -> None:
        ...

    def run_eval(self, query: str, extra_options: Optional[Dict[str, Any]] = None, verbose: bool = False) -> str:
        """
        Execute a Sqoop eval query using the configured connection.

        Runs the query through the Sqoop integration and returns the sanitized
        command string, which matches what was sent to the Sqoop subprocess but
        hides credentials. Raises the dependency's Sqoop execution exception on
        failure.
        """
```

Airflow provider supplying the Sqoop hook used to build and execute commands with credential masking.
Install with the Tessl CLI:

```shell
npx tessl i tessl/pypi-apache-airflow-backport-providers-apache-sqoop
```
Evals: scenario-1, scenario-2, scenario-3, scenario-4, scenario-5, scenario-6, scenario-7, scenario-8, scenario-9, scenario-10