
# Core Parsing

Primary DDL parsing functionality that converts SQL DDL statements into structured dictionaries. The parser handles tables, columns, constraints, indexes, sequences, and other database objects with comprehensive metadata extraction.

```python
from simple_ddl_parser import DDLParser, parse_from_file
from typing import Optional, Union, List, Dict
import logging
```

## Capabilities

### DDLParser Class

Main parser class that provides comprehensive DDL parsing with configurable options for error handling, debugging, name normalization, and logging.

```python { .api }
class DDLParser:
    def __init__(
        self,
        content: str,
        silent: bool = True,
        debug: bool = False,
        normalize_names: bool = False,
        log_file: Optional[str] = None,
        log_level: Union[str, int] = logging.INFO
    ):
        """
        Initialize the DDL parser with content and configuration options.

        Parameters:
        - content (str): DDL content to parse
        - silent (bool): If True, suppress exception raising on parse errors
        - debug (bool): Enable debug output during parsing
        - normalize_names (bool): Remove identifier delimiters (quotes, brackets)
        - log_file (Optional[str]): Path to a log file for debug output
        - log_level (Union[str, int]): Logging level (DEBUG, INFO, WARNING, ERROR)
        """
```

### Parser Execution

Execute DDL parsing with various output options and formatting controls.

```python { .api }
def run(
    self,
    *,
    dump: bool = False,
    dump_path: str = "schemas",
    file_path: Optional[str] = None,
    output_mode: str = "sql",
    group_by_type: bool = False,
    json_dump: bool = False
) -> List[Dict]:
    """
    Parse DDL content and return structured metadata.

    Parameters:
    - dump (bool): Save output to JSON files in the dump_path directory
    - dump_path (str): Directory path for saved output files
    - file_path (Optional[str]): Original file path for reference
    - output_mode (str): Output dialect ("sql", "mysql", "postgres", etc.)
    - group_by_type (bool): Group results by entity type (tables, sequences, etc.)
    - json_dump (bool): Return a JSON string instead of Python dictionaries

    Returns:
    List[Dict]: List of parsed database objects with metadata
    """
```
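The `group_by_type` and `json_dump` flags change only the shape of the return value, not its contents. A minimal stdlib-only sketch of that reshaping, using a deliberately simplified (hypothetical) result shape rather than the parser's full output:

```python
import json
from collections import defaultdict

# Hypothetical flat result; real parser entries carry much more metadata.
flat_result = [
    {"table_name": "employees"},
    {"table_name": "departments"},
    {"sequence_name": "employee_id_seq"},
]

# group_by_type=True returns entities keyed by type instead of a flat list;
# this regrouping approximates that behavior for the two kinds shown here.
grouped = defaultdict(list)
for obj in flat_result:
    kind = "tables" if "table_name" in obj else "sequences"
    grouped[kind].append(obj)

# json_dump=True returns the same data serialized as a JSON string.
as_json = json.dumps(dict(grouped))
print(as_json)
```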

### File Parsing Function

Convenience function for parsing DDL directly from files with automatic encoding handling.

```python { .api }
def parse_from_file(
    file_path: str,
    encoding: Optional[str] = "utf-8",
    parser_settings: Optional[dict] = None,
    **kwargs
) -> List[Dict]:
    """
    Parse DDL content from a file.

    Parameters:
    - file_path (str): Path to the DDL file to parse
    - encoding (Optional[str]): File encoding (default: utf-8)
    - parser_settings (Optional[dict]): DDLParser constructor arguments
    - **kwargs: Additional arguments passed to parser.run()

    Returns:
    List[Dict]: List of parsed database objects with metadata
    """
```

## Usage Examples

### Basic Table Parsing

```python
from simple_ddl_parser import DDLParser

ddl = """
CREATE TABLE employees (
    id SERIAL PRIMARY KEY,
    first_name VARCHAR(50) NOT NULL,
    last_name VARCHAR(50) NOT NULL,
    email VARCHAR(100) UNIQUE,
    department_id INTEGER REFERENCES departments(id),
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
"""

parser = DDLParser(ddl)
result = parser.run()
print(result[0]['columns'])  # Column definitions with types and constraints
```

### Multi-table Schema Parsing

```python
from simple_ddl_parser import parse_from_file

# Parse a complex schema file
tables = parse_from_file('complete_schema.sql', encoding='utf-8')

# Access parsed metadata
for table in tables:
    print(f"Table: {table['table_name']}")
    for column in table['columns']:
        print(f"  Column: {column['name']} ({column['type']})")
```

### Advanced Configuration

```python
from simple_ddl_parser import DDLParser

# Configure the parser with custom settings
# (ddl_content holds the DDL string to parse)
parser = DDLParser(
    ddl_content,
    silent=False,          # Raise exceptions on errors
    debug=True,            # Enable debug output
    normalize_names=True,  # Remove identifier quotes
    log_file='parser.log',
    log_level='DEBUG'
)

# Parse with a specific output format
result = parser.run(
    output_mode='postgres',
    group_by_type=True,
    json_dump=False
)
```

## Parsed Structure Format

The parser returns a list of dictionaries, each representing a database object:

```python
# Example parsed table structure
{
    "table_name": "employees",
    "schema": "public",
    "columns": [
        {
            "name": "id",
            "type": "SERIAL",
            "primary_key": True,
            "nullable": False
        },
        {
            "name": "first_name",
            "type": "VARCHAR",
            "size": 50,
            "nullable": False
        }
    ],
    "primary_key": ["id"],
    "foreign_keys": [
        {
            "columns": ["department_id"],
            "references": {
                "table": "departments",
                "columns": ["id"]
            }
        }
    ],
    "indexes": [],
    "constraints": []
}
```
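Because the result is plain dicts and lists, downstream tooling can walk it directly. A short sketch (the `summarize` helper is illustrative, not part of the library) that renders one line per column from a subset of the structure above:

```python
# Subset of the example parsed structure shown above.
parsed = {
    "table_name": "employees",
    "columns": [
        {"name": "id", "type": "SERIAL", "primary_key": True, "nullable": False},
        {"name": "first_name", "type": "VARCHAR", "size": 50, "nullable": False},
    ],
}

def summarize(table):
    """Render 'name TYPE(size) NULL/NOT NULL' lines from a parsed table dict."""
    lines = []
    for col in table["columns"]:
        type_repr = col["type"]
        if "size" in col:
            type_repr += f"({col['size']})"
        null_repr = "NOT NULL" if col.get("nullable") is False else "NULL"
        lines.append(f"{col['name']} {type_repr} {null_repr}")
    return lines

print(summarize(parsed))
# → ['id SERIAL NOT NULL', 'first_name VARCHAR(50) NOT NULL']
```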