# CLI Interface

Complete command-line interface for Apache Superset administration, data management, and server operations. The CLI provides essential tools for initializing the application, managing data, and performing administrative tasks.

## Capabilities

### Application Management

Commands for initializing, configuring, and running the Superset application server.

```python { .api }
def init():
    """
    Initialize the Superset application and database.
    Creates database tables, default roles, and permissions.
    """

def runserver():
    """
    Start the Superset web server.

    Options:
    - --debug: Enable debug mode
    - --console-log: Enable console logging
    - --no-reload: Disable auto-reload
    - --address: Server bind address (default: 0.0.0.0)
    - --port: Server port (default: 8088)
    - --workers: Number of worker processes
    - --timeout: Request timeout in seconds
    - --socket: Unix socket path
    """

def version():
    """
    Display Superset version information.

    Options:
    - --verbose: Show additional version details
    """
```

### Data Management

Commands for loading, importing, and exporting data, dashboards, and datasources.

```python { .api }
def load_examples():
    """
    Load example datasets and dashboards.

    Options:
    - --load-test-data: Load additional test data
    """

def import_dashboards():
    """
    Import dashboards from JSON files.

    Options:
    - --path: Path to JSON file or directory
    - --recursive: Recursively search directories
    """

def export_dashboards():
    """
    Export dashboards to JSON format.

    Options:
    - --dashboard-file: Output file path
    - --print_stdout: Print to standard output
    """

def import_datasources():
    """
    Import datasources from YAML files.

    Options:
    - --path: Path to YAML file or directory
    - --sync: Synchronize existing datasources
    - --recursive: Recursively search directories
    """

def export_datasources():
    """
    Export datasources to YAML format.

    Options:
    - --datasource-file: Output file path
    - --print_stdout: Print to standard output
    - --back-references: Include back-references
    - --include-defaults: Include default values
    """

def export_datasource_schema():
    """
    Export datasource schema definitions.

    Options:
    - --back-references: Include back-references
    """
```
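
The schema export command does not appear in the usage examples further below. Assuming it follows the same hyphenated command naming as the other data commands in this document, an invocation might look like the following sketch:

```bash
# Export datasource schema definitions, including back-references
# (command name assumes the hyphenated style used by the other commands)
superset export-datasource-schema --back-references
```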

### Druid Integration

Commands for managing Druid datasource connections and metadata synchronization.

```python { .api }
def refresh_druid():
    """
    Refresh Druid datasource metadata.
    Synchronizes datasource definitions with Druid cluster.

    Options:
    - --datasource: Specific datasource name to refresh
    - --merge: Merge with existing metadata
    """
```

### Cache and Metadata Management

Commands for managing application caches and metadata synchronization.

```python { .api }
def update_datasources_cache():
    """
    Refresh SQL Lab datasources cache.
    Updates cached table and schema information.
    """
```

### Background Processing

Commands for running Celery workers and monitoring interfaces for async query processing.

```python { .api }
def worker():
    """
    Start Celery worker for asynchronous query processing.

    Options:
    - --workers: Number of worker processes (default: configured value)
    """

def flower():
    """
    Start Celery Flower monitoring interface.
    Provides web UI for monitoring Celery tasks and workers.

    Options:
    - --port: Web interface port
    - --address: Bind address
    """
```

### User Management

Commands for creating test users and managing user accounts.

```python { .api }
def load_test_users():
    """
    Create test users for development and testing.
    Creates admin, alpha, and gamma role users.
    """
```
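
This command has no usage example in the sections below. Assuming the hyphenated command naming used elsewhere in this document, creating the test users might look like this sketch:

```bash
# Create admin, alpha, and gamma test users for development environments
# (command name assumes the hyphenated style used by the other commands)
superset load-test-users
```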

## Usage Examples

### Basic Setup

```bash
# Initialize Superset database and create default roles
superset init

# Create an admin user (interactive)
superset fab create-admin

# Load example datasets and dashboards
superset load-examples

# Start development server
superset runserver --debug --port 8088
```

### Production Deployment

```bash
# Initialize production database
superset init

# Run with multiple workers for production
superset runserver \
    --address 0.0.0.0 \
    --port 8088 \
    --workers 4 \
    --timeout 120

# Start background worker processes
superset worker --workers 8

# Start monitoring interface
superset flower --port 5555
```

### Data Import/Export

```bash
# Export all dashboards to JSON
superset export-dashboards --dashboard-file /tmp/dashboards.json

# Import dashboards from file
superset import-dashboards --path /tmp/dashboards.json

# Export datasources with default values
superset export-datasources \
    --datasource-file /tmp/datasources.yml \
    --include-defaults

# Import and sync datasources
superset import-datasources \
    --path /tmp/datasources.yml \
    --sync
```

### Maintenance Operations

```bash
# Refresh all Druid datasources
superset refresh-druid

# Refresh specific Druid datasource
superset refresh-druid --datasource my_datasource --merge

# Update SQL Lab metadata cache
superset update-datasources-cache

# Check version information
superset version --verbose
```

## Configuration

CLI commands respect configuration settings from:

- **Environment Variables**: `SUPERSET_CONFIG_PATH`, `FLASK_APP`
- **Configuration File**: Default or custom config module
- **Command Line Options**: Override configuration settings
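
As an illustration of how these sources combine, the sketch below points `SUPERSET_CONFIG_PATH` at a custom config module and then runs a command with an overriding option. The file path and setting name shown (`/app/superset_config.py`, `SQLALCHEMY_DATABASE_URI`) are placeholders, not values this project prescribes.

```bash
# Point Superset at a custom config module (path is a placeholder)
export SUPERSET_CONFIG_PATH=/app/superset_config.py
export FLASK_APP=superset

# Settings in the config file (e.g. SQLALCHEMY_DATABASE_URI) apply to every
# CLI command; command-line options such as --port still take precedence.
superset runserver --port 9000
```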

## Integration

The CLI interface integrates with:

- **Flask-Migrate**: Database schema migrations
- **Celery**: Asynchronous task processing
- **Flask-AppBuilder**: User and role management
- **SQLAlchemy**: Database operations
- **Druid**: Metadata synchronization
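
For example, the Flask-Migrate and Flask-AppBuilder integrations surface as the `db` and `fab` command groups. The commands below are standard Flask-Migrate and Flask-AppBuilder subcommands, shown here as an illustration rather than as part of the API listed above:

```bash
# Apply pending database schema migrations (Flask-Migrate)
superset db upgrade

# Create an administrator account (Flask-AppBuilder)
superset fab create-admin
```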