Apache Airflow backport provider package for Apache Sqoop integration, providing SqoopHook and SqoopOperator for data import/export between relational databases and Hadoop
Overall score: 92%
Create a helper that runs a relational-to-HDFS import focused on mapper parallelism, split tuning, and per-run JVM/system properties. The helper should keep configuration minimal while surfacing the tuning controls.
Properties are injected as -Dkey=value flags for the import run; the command preview lists each injected property.
from typing import Dict, Optional, TypedDict

class ImportRequest(TypedDict):
    conn_id: str
    table: str
    target_dir: str
    num_mappers: Optional[int]
    split_by: Optional[str]
    properties: Optional[Dict[str, str]]

class ImportResult(TypedDict):
    target_dir: str
    command_preview: str

def run_tuned_import(request: ImportRequest) -> ImportResult:
    """Run an import using the Sqoop provider with mapper tuning and JVM property injection."""
    ...
Install with Tessl CLI
npx tessl i tessl/pypi-apache-airflow-backport-providers-apache-sqoop