
# Core Test Framework

Base testing infrastructure providing Docker container lifecycle management and shared test utilities for database integration testing. This framework extends Spark's SharedSparkSession to provide database-specific testing capabilities.

## Capabilities

### DockerJDBCIntegrationSuite

Abstract base class that provides the foundation for all database integration tests with Docker container management and common test utilities.

```scala { .api }
/**
 * Base abstract class for Docker-based JDBC integration tests
 * Extends SharedSparkSession to provide Spark context and session
 */
abstract class DockerJDBCIntegrationSuite extends SharedSparkSession {

  /** Database type identifier (e.g., "postgresql", "mysql") */
  def databaseType: String

  /** Docker image name and tag for the database */
  def databaseImage: String

  /** Start Docker container for the database */
  def startDockerContainer(): Unit

  /** Stop and clean up Docker container */
  def stopDockerContainer(): Unit

  /** Get JDBC connection URL for the test database */
  def getJdbcUrl(): String

  /** Get active JDBC connection to the test database */
  def getJdbcConnection(): Connection

  /** Set up initial test data in the database */
  def setupTestData(): Unit

  /** Run JDBC connectivity test */
  def runJdbcTest(sql: String): DataFrame

  /** Get database-specific JDBC properties */
  def getJdbcProperties(): Properties

  /** Validate database connection health */
  def validateConnection(): Boolean
}
```

**Usage Examples:**

```scala
class MyPostgreSQLTest extends DockerJDBCIntegrationSuite {
  override val databaseType = "postgresql"
  override val databaseImage = "postgres:13"

  override def beforeAll(): Unit = {
    super.beforeAll()
    startDockerContainer()
    setupTestData()
  }

  override def afterAll(): Unit = {
    try {
      stopDockerContainer()
    } finally {
      super.afterAll()
    }
  }

  test("test table operations") {
    val df = runJdbcTest("SELECT * FROM test_table")
    assert(df.count() > 0)
    assert(df.columns.contains("id"))
  }

  test("test data insertion") {
    import spark.implicits._ // needed for the $"col" column syntax

    val connection = getJdbcConnection()
    val statement = connection.createStatement()
    statement.execute("INSERT INTO test_table VALUES (1, 'test')")
    statement.close()

    val df = spark.read
      .format("jdbc")
      .option("url", getJdbcUrl())
      .option("dbtable", "test_table")
      .load()

    assert(df.filter($"id" === 1).count() == 1)
  }
}
```

### Container Lifecycle Methods

Methods for managing Docker container lifecycle during test execution.

```scala { .api }
/**
 * Start Docker container for the database
 * Creates and starts a new container instance
 * Waits for database to be ready for connections
 */
def startDockerContainer(): Unit

/**
 * Stop and clean up Docker container
 * Gracefully stops the container and removes it
 * Cleans up any associated resources
 */
def stopDockerContainer(): Unit

/**
 * Get JDBC connection URL for the test database
 * @return JDBC URL string for connecting to the test database
 */
def getJdbcUrl(): String

/**
 * Get active JDBC connection to the test database
 * @return Active Connection object for database operations
 */
def getJdbcConnection(): Connection
```
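A concrete suite has to supply these lifecycle methods. The following is a minimal sketch, not the framework's actual implementation: it assumes the `docker` CLI is on the PATH, and the container name, environment variable, and port mapping are illustrative choices for a PostgreSQL image.

```scala
import java.sql.DriverManager
import scala.sys.process._

// Hypothetical sketch: run the database image with the docker CLI, then
// poll JDBC connectivity until the database accepts connections.
override def startDockerContainer(): Unit = {
  Seq("docker", "run", "-d", "--name", "jdbc-it-db",
      "-e", "POSTGRES_PASSWORD=test", "-p", "5432:5432", databaseImage).!!

  // Wait (up to ~60 s) for the database to become ready.
  val deadline = System.currentTimeMillis() + 60000
  var ready = false
  while (!ready && System.currentTimeMillis() < deadline) {
    try {
      val conn = DriverManager.getConnection(getJdbcUrl(), getJdbcProperties())
      conn.close()
      ready = true
    } catch {
      case _: java.sql.SQLException => Thread.sleep(1000)
    }
  }
  if (!ready) sys.error(s"Container for $databaseType did not become ready in time")
}

override def stopDockerContainer(): Unit = {
  // `docker rm --force` stops and removes the container in one step.
  Seq("docker", "rm", "--force", "jdbc-it-db").!
}
```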

### Test Data Management

Methods for managing test data setup and cleanup.

```scala { .api }
/**
 * Set up initial test data in the database
 * Creates necessary tables and populates with test data
 * Should be called after container startup
 */
def setupTestData(): Unit

/**
 * Run JDBC connectivity test with SQL query
 * @param sql SQL query to execute
 * @return DataFrame containing query results
 */
def runJdbcTest(sql: String): DataFrame

/**
 * Get database-specific JDBC properties
 * @return Properties object with database-specific settings
 */
def getJdbcProperties(): Properties

/**
 * Validate database connection health
 * @return true if connection is healthy, false otherwise
 */
def validateConnection(): Boolean
```
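For illustration, `setupTestData` and `runJdbcTest` could plausibly be implemented as below. The `test_table` schema matches the usage example above, and wrapping the query as an aliased derived table in the `dbtable` option is one common way to run arbitrary SQL through Spark's JDBC source; both are assumptions for this sketch.

```scala
// Illustrative sketch: create and populate the example table over JDBC.
override def setupTestData(): Unit = {
  val conn = getJdbcConnection()
  try {
    val stmt = conn.createStatement()
    stmt.execute(
      "CREATE TABLE IF NOT EXISTS test_table (id INT PRIMARY KEY, name VARCHAR(64))")
    stmt.execute("INSERT INTO test_table VALUES (1, 'alpha'), (2, 'beta')")
    stmt.close()
  } finally {
    conn.close() // always release the connection, even on failure
  }
}

// Run an arbitrary query by wrapping it as an aliased derived table.
override def runJdbcTest(sql: String): DataFrame =
  spark.read
    .format("jdbc")
    .option("url", getJdbcUrl())
    .option("dbtable", s"($sql) tmp")
    .load()
```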

### Shared Spark Integration

Integration with Spark's testing framework and session management.

```scala { .api }
/**
 * Access to shared Spark session for JDBC operations
 * Inherited from SharedSparkSession
 */
def spark: SparkSession

/**
 * Create DataFrame from JDBC source
 * @param tableName Name of the database table
 * @return DataFrame containing table data
 */
def readJdbcTable(tableName: String): DataFrame

/**
 * Write DataFrame to JDBC destination
 * @param df DataFrame to write
 * @param tableName Target table name
 * @param mode Write mode (append, overwrite, etc.)
 */
def writeJdbcTable(df: DataFrame, tableName: String, mode: String = "append"): Unit
```
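These helpers map naturally onto Spark's built-in JDBC support on `DataFrameReader` and `DataFrameWriter`; a plausible sketch (not necessarily the framework's actual implementation):

```scala
// Read a whole table through the standard Spark JDBC data source.
def readJdbcTable(tableName: String): DataFrame =
  spark.read.jdbc(getJdbcUrl(), tableName, getJdbcProperties())

// Write a DataFrame back, honoring the requested save mode
// ("append", "overwrite", "ignore", "error").
def writeJdbcTable(df: DataFrame, tableName: String, mode: String = "append"): Unit =
  df.write
    .mode(mode)
    .jdbc(getJdbcUrl(), tableName, getJdbcProperties())
```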

## Test Lifecycle

The framework follows a standard test lifecycle pattern:

1. **beforeAll()**: Start the Docker container and set up test data
2. **Test Execution**: Run individual test methods
3. **afterAll()**: Stop the container and clean up resources

## Error Handling

The framework provides robust error handling for common scenarios:

- Container startup failures
- Database connection timeouts
- Test data setup errors
- Container cleanup issues

All methods include proper exception handling and resource cleanup to preserve test isolation.
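Transient failures such as container startup races and connection timeouts can be guarded with a small retry helper; the attempt count and delay below are illustrative assumptions, not the framework's defaults.

```scala
// Retry an operation a fixed number of times with a delay between
// attempts; the last failure propagates if all attempts are exhausted.
def withRetry[T](attempts: Int, delayMs: Long)(op: => T): T =
  try op
  catch {
    case _: Exception if attempts > 1 =>
      Thread.sleep(delayMs)
      withRetry(attempts - 1, delayMs)(op)
  }

// Usage sketch: retry a health check during startup.
// withRetry(30, 1000) { assert(validateConnection()) }
```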