tessl install tessl/pypi-apache-airflow-backport-providers-apache-sqoop@2021.3.0

Apache Airflow backport provider package for Apache Sqoop integration, providing SqoopHook and SqoopOperator for data import/export between relational databases and Hadoop.
Agent Success: 92% (agent success rate when using this tile)
Improvement: 1.39x (agent success rate improvement when using this tile compared to baseline)
Baseline: 66% (agent success rate without this tile)
Build a small utility that triggers an export from an HDFS/Hive directory into a relational table while exercising the dependency's formatting controls for CSV/text output. Rely on the package's export facility to apply delimiters, enclosure/escape rules, and null sentinels rather than invoking raw shell commands directly.
Uses | as the field separator and \n as the line separator when exporting a sample dataset, and the resulting command reflects those separators. @test
Field values are enclosed in " and embedded quotes are escaped with \\ in the export configuration. @test
Null string columns are exported as NULL and null numeric columns as \\N, using the export facility's null markers. @test
@generates
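To make the formatting rules in the tests concrete, here is a small standalone sketch (the helper names are invented for illustration) that renders one input record the way the export files are expected to look: fields joined with the delimiter, string values enclosed and escape-protected, and per-type null sentinels.

```python
from typing import Optional


def render_field(value: Optional[object], is_string: bool,
                 enclosure: str = '"', escape: str = "\\",
                 string_null: str = "NULL", non_string_null: str = "\\N") -> str:
    """Render one field using the enclosure/escape and null-sentinel rules."""
    if value is None:
        # Nulls use a different sentinel for string vs. non-string columns.
        return string_null if is_string else non_string_null
    text = str(value)
    if is_string:
        # Escape the escape character itself, then any embedded enclosures.
        text = text.replace(escape, escape + escape)
        text = text.replace(enclosure, escape + enclosure)
        return enclosure + text + enclosure
    return text


def render_row(fields, field_delimiter: str = "|", line_delimiter: str = "\n",
               **field_kwargs) -> str:
    """Join (value, is_string) pairs into one delimited, terminated record."""
    rendered = (render_field(v, s, **field_kwargs) for v, s in fields)
    return field_delimiter.join(rendered) + line_delimiter


# A row with an embedded quote, a null numeric column, and a null string column.
row = render_row([('He said "hi"', True), (None, False), (None, True)])
print(row)
```

This is purely illustrative of the on-disk format; the actual escaping is performed by the export facility, not by hand-rolled code like this.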
from dataclasses import dataclass
from typing import Optional

@dataclass
class ExportFormatting:
    field_delimiter: str = "|"
    line_delimiter: str = "\n"
    enclosure: Optional[str] = '"'
    escape: str = "\\"
    string_null: str = "NULL"
    non_string_null: str = "\\N"

@dataclass
class ExportRequest:
    connection_id: str
    source_dir: str
    target_table: str
    formatting: ExportFormatting

def run_formatted_export(request: ExportRequest) -> str:
    """
    Executes an export using the dependency's export capability with the provided
    formatting controls. Returns the full command or identifier used for the run
    so tests can assert the applied options.
    """

Provides the export capability for moving files from HDFS/Hive into a database with customizable delimiters, enclosure/escape rules, and null sentinels.