# Layer Support

Comprehensive mapping of Keras layer types to DeepLearning4J layers. The library supports most common neural network layer types and provides automatic translation of parameters and configurations.

## Core Layer Types

### Dense (Fully Connected) Layers

Maps Keras Dense layers to DeepLearning4J DenseLayer.

```java { .api }
// Keras Dense layer configuration supported:
// - units (output_dim): number of output units
// - activation: activation function
// - use_bias: whether to use bias
// - kernel_initializer (init): weight initialization
// - bias_initializer: bias initialization
// - kernel_regularizer (W_regularizer): weight regularization
// - bias_regularizer (b_regularizer): bias regularization
// - activity_regularizer: output regularization
// - kernel_constraint: weight constraints
// - bias_constraint: bias constraints
```

**Supported Parameters:**
- Output dimensions
- Activation functions (relu, sigmoid, tanh, softmax, linear, etc.)
- Weight and bias initialization
- L1/L2 regularization
- Dropout (when combined with a Dropout layer)
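
As the aliases above suggest (`output_dim` vs `units`, `init` vs `kernel_initializer`), the importer has to reconcile Keras 1.x and 2.x parameter names. Below is a self-contained sketch of that normalization step; the alias table and class name are illustrative, not the library's actual code:

```java
import java.util.HashMap;
import java.util.Map;

public class DenseConfigNormalizer {

    // Illustrative table of legacy Keras 1.x names and their Keras 2.x equivalents.
    private static final Map<String, String> ALIASES = new HashMap<>();
    static {
        ALIASES.put("output_dim", "units");
        ALIASES.put("init", "kernel_initializer");
        ALIASES.put("W_regularizer", "kernel_regularizer");
        ALIASES.put("b_regularizer", "bias_regularizer");
    }

    // Rewrites legacy keys so downstream mapping code sees one naming scheme.
    public static Map<String, Object> normalize(Map<String, Object> config) {
        Map<String, Object> out = new HashMap<>();
        for (Map.Entry<String, Object> e : config.entrySet()) {
            out.put(ALIASES.getOrDefault(e.getKey(), e.getKey()), e.getValue());
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, Object> legacy = new HashMap<>();
        legacy.put("output_dim", 64);
        legacy.put("activation", "relu");
        System.out.println(normalize(legacy)); // e.g. {units=64, activation=relu}
    }
}
```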

### Convolutional Layers

Maps Keras Convolution1D and Convolution2D layers to DeepLearning4J ConvolutionLayer.

```java { .api }
// Keras Convolution layer configuration supported:
// - filters (nb_filter): number of convolution filters
// - kernel_size (nb_row, nb_col): convolution kernel size
// - strides (subsample): stride values
// - padding (border_mode): padding type ('same', 'valid')
// - activation: activation function
// - use_bias: whether to use bias
// - kernel_initializer: weight initialization
// - bias_initializer: bias initialization
// - kernel_regularizer: weight regularization
// - bias_regularizer: bias regularization
```

**Supported Configurations:**
- 1D and 2D convolutions
- Stride and padding settings
- Multiple filter sizes
- Same and valid padding modes
- All standard activation functions
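
The `padding` mode determines the spatial output size. A minimal sketch of the usual Keras arithmetic ('valid' drops border positions, 'same' yields `ceil(size / stride)`); the helper class is illustrative, not part of the library:

```java
public class ConvOutputSize {
    // Spatial output size for one dimension, following the usual Keras semantics.
    public static int outputSize(int inputSize, int kernelSize, int stride, String padding) {
        if ("same".equals(padding)) {
            return (inputSize + stride - 1) / stride;     // ceil(in / stride)
        } else if ("valid".equals(padding)) {
            return (inputSize - kernelSize) / stride + 1; // floor((in - k) / stride) + 1
        }
        throw new IllegalArgumentException("Unknown padding: " + padding);
    }

    public static void main(String[] args) {
        System.out.println(outputSize(28, 3, 1, "valid")); // 26
        System.out.println(outputSize(28, 3, 1, "same"));  // 28
        System.out.println(outputSize(28, 3, 2, "valid")); // 13
    }
}
```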

### Pooling Layers

Maps Keras pooling layers to DeepLearning4J pooling layers.

```java { .api }
// Supported pooling types:
// - MaxPooling1D -> SubsamplingLayer with PoolingType.MAX
// - MaxPooling2D -> SubsamplingLayer with PoolingType.MAX
// - AveragePooling1D -> SubsamplingLayer with PoolingType.AVG
// - AveragePooling2D -> SubsamplingLayer with PoolingType.AVG
// - GlobalMaxPooling1D -> GlobalPoolingLayer with PoolingType.MAX
// - GlobalMaxPooling2D -> GlobalPoolingLayer with PoolingType.MAX
// - GlobalAveragePooling1D -> GlobalPoolingLayer with PoolingType.AVG
// - GlobalAveragePooling2D -> GlobalPoolingLayer with PoolingType.AVG
```

**Configuration Options:**
- Pool size and stride settings
- Padding configurations
- Global pooling support
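
The mapping above is mechanical enough to sketch as a dispatch on the Keras class name; the helper below is illustrative, not the library's actual factory:

```java
public class PoolingLayerMapper {
    // Keras pooling class name -> target DL4J layer type and pooling mode,
    // mirroring the list above.
    public static String map(String kerasClassName) {
        boolean global = kerasClassName.startsWith("Global");
        boolean max = kerasClassName.contains("Max");
        String layer = global ? "GlobalPoolingLayer" : "SubsamplingLayer";
        String mode = max ? "PoolingType.MAX" : "PoolingType.AVG";
        return layer + " (" + mode + ")";
    }

    public static void main(String[] args) {
        System.out.println(map("MaxPooling2D"));           // SubsamplingLayer (PoolingType.MAX)
        System.out.println(map("GlobalAveragePooling1D")); // GlobalPoolingLayer (PoolingType.AVG)
    }
}
```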

### Recurrent Layers

Maps Keras LSTM layers to DeepLearning4J LSTM layers.

```java { .api }
// Keras LSTM layer configuration supported:
// - units: number of LSTM units
// - activation: activation function for gates
// - recurrent_activation: recurrent activation function
// - use_bias: whether to use bias
// - kernel_initializer: input weight initialization
// - recurrent_initializer: recurrent weight initialization
// - bias_initializer: bias initialization
// - dropout: input dropout rate
// - recurrent_dropout: recurrent dropout rate
// - return_sequences: whether to return full sequence
// - return_state: whether to return cell state
// - go_backwards: process sequence backwards
// - stateful: maintain state between batches
// - unroll: unroll the recurrent computation
```

**Features:**
- Bidirectional LSTM support
- Sequence-to-sequence and sequence-to-one configurations
- Dropout variants
- State management
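
Of these flags, `return_sequences` is the one that changes the layer's output shape: the full `[timeSteps, units]` sequence for sequence-to-sequence models versus only the last step for sequence-to-one. A plain-array sketch of that selection (illustrative, not DL4J code):

```java
public class ReturnSequences {
    // Output selection on a [timeSteps][units] matrix of hidden states:
    // full sequence when returnSequences is true, last time step otherwise.
    public static double[][] selectOutput(double[][] hiddenStates, boolean returnSequences) {
        if (returnSequences) {
            return hiddenStates;                                         // [timeSteps][units]
        }
        return new double[][] { hiddenStates[hiddenStates.length - 1] }; // [1][units]
    }

    public static void main(String[] args) {
        double[][] h = { {0.1, 0.2}, {0.3, 0.4}, {0.5, 0.6} };
        System.out.println(selectOutput(h, true).length);  // 3
        System.out.println(selectOutput(h, false).length); // 1
    }
}
```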

## Utility Layers

### Activation Layers

Maps Keras Activation layers to DeepLearning4J ActivationLayer.

```java { .api }
// Supported activation functions:
// - relu -> ReLU
// - sigmoid -> Sigmoid
// - tanh -> Tanh
// - softmax -> Softmax
// - linear -> Identity
// - softplus -> Softplus
// - softsign -> Softsign
// - hard_sigmoid -> HardSigmoid
// - elu -> ELU
// - selu -> SELU
// - swish -> Swish
```
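
The translation is a straight name lookup. A sketch of it as a table; the DL4J-side names follow the convention shown above, but the mapper class itself is illustrative:

```java
import java.util.HashMap;
import java.util.Map;

public class ActivationNameMapper {
    // Subset of the name table above; note the non-obvious case linear -> Identity.
    private static final Map<String, String> NAMES = new HashMap<>();
    static {
        NAMES.put("relu", "RELU");
        NAMES.put("sigmoid", "SIGMOID");
        NAMES.put("tanh", "TANH");
        NAMES.put("softmax", "SOFTMAX");
        NAMES.put("linear", "IDENTITY");
        NAMES.put("hard_sigmoid", "HARDSIGMOID");
        NAMES.put("elu", "ELU");
    }

    public static String toDl4j(String kerasName) {
        String mapped = NAMES.get(kerasName);
        if (mapped == null) {
            throw new IllegalArgumentException("Unsupported activation: " + kerasName);
        }
        return mapped;
    }

    public static void main(String[] args) {
        System.out.println(toDl4j("linear")); // IDENTITY
    }
}
```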

### Dropout Layers

Maps Keras Dropout layers to DeepLearning4J DropoutLayer.

```java { .api }
// Keras Dropout configuration:
// - rate: dropout probability (0.0 to 1.0)
// - noise_shape: shape for dropout mask
// - seed: random seed for reproducibility
```
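
One conversion detail worth flagging: Keras `rate` is the probability of *dropping* a unit, whereas some (especially older) DL4J versions interpret the dropout value as the probability of *retaining* one, so an importer may need to flip the value. A sketch of that conversion, under that assumption — check the convention of your DL4J version:

```java
public class DropoutRateConverter {
    // Keras: rate = P(drop). Assumed DL4J convention here: dropOut = P(retain).
    public static double kerasRateToRetainProbability(double rate) {
        if (rate < 0.0 || rate > 1.0) {
            throw new IllegalArgumentException("rate must be in [0, 1]");
        }
        return 1.0 - rate;
    }

    public static void main(String[] args) {
        System.out.println(kerasRateToRetainProbability(0.2)); // 0.8
    }
}
```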

### Flatten Layers

Maps Keras Flatten layers to DeepLearning4J preprocessors.

```java { .api }
// Flattens multi-dimensional input to 1D
// Automatically handles different input shapes
// Maps to appropriate DL4J InputPreProcessor
```

### Embedding Layers

Maps Keras Embedding layers to DeepLearning4J EmbeddingLayer.

```java { .api }
// Keras Embedding configuration:
// - input_dim: vocabulary size
// - output_dim: embedding dimension
// - embeddings_initializer: weight initialization
// - embeddings_regularizer: weight regularization
// - embeddings_constraint: weight constraints
// - mask_zero: mask zero values
// - input_length: input sequence length
```
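
Conceptually, an Embedding layer is a trainable lookup table of shape `[input_dim, output_dim]`: each token index selects a row of the weight matrix. A plain-array sketch of that lookup (illustrative):

```java
public class EmbeddingLookup {
    // tokenIds: integer indices into the vocabulary (0 .. inputDim-1).
    // weights: [inputDim][outputDim] embedding matrix.
    public static double[][] embed(int[] tokenIds, double[][] weights) {
        double[][] out = new double[tokenIds.length][];
        for (int t = 0; t < tokenIds.length; t++) {
            out[t] = weights[tokenIds[t]]; // row lookup per token
        }
        return out;
    }

    public static void main(String[] args) {
        double[][] weights = { {0.0, 0.0}, {1.0, 2.0}, {3.0, 4.0} }; // input_dim=3, output_dim=2
        double[][] seq = embed(new int[] {2, 1}, weights);
        System.out.println(seq[0][1]); // 4.0
    }
}
```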

## Normalization Layers

### Batch Normalization

Maps Keras BatchNormalization layers to DeepLearning4J BatchNormalization.

```java { .api }
// Keras BatchNormalization configuration:
// - axis: normalization axis
// - momentum: momentum for moving averages
// - epsilon: small constant for numerical stability
// - center: whether to use beta parameter
// - scale: whether to use gamma parameter
// - beta_initializer: beta initialization
// - gamma_initializer: gamma initialization
// - moving_mean_initializer: moving mean initialization
// - moving_variance_initializer: moving variance initialization
// - beta_regularizer: beta regularization
// - gamma_regularizer: gamma regularization
// - beta_constraint: beta constraints
// - gamma_constraint: gamma constraints
```
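
At inference time these parameters combine into one affine transform per feature: `y = gamma * (x - movingMean) / sqrt(movingVariance + epsilon) + beta`, where `center=false` fixes beta at 0 and `scale=false` fixes gamma at 1. A scalar sketch of the formula the imported layer must reproduce:

```java
public class BatchNormInference {
    // Inference-time batch normalization for a single value of one feature.
    public static double normalize(double x, double mean, double variance,
                                   double gamma, double beta, double epsilon) {
        return gamma * (x - mean) / Math.sqrt(variance + epsilon) + beta;
    }

    public static void main(String[] args) {
        // With gamma=1, beta=0, epsilon=0 this is plain standardization.
        System.out.println(normalize(3.0, 1.0, 4.0, 1.0, 0.0, 0.0)); // 1.0
    }
}
```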

## Custom and Specialized Layers

### Local Response Normalization

Custom implementation for Local Response Normalization (LRN).

```java { .api }
// KerasLRN class provides:
// - alpha: normalization parameter
// - beta: normalization parameter
// - depth_radius: normalization radius
// - bias: bias parameter
```
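
These parameters plug into the standard LRN formula, `b[i] = a[i] / (bias + alpha * sum(a[j]^2))^beta`, with the sum taken over channels within `depth_radius` of `i`. A plain-Java sketch of the computation (illustrative, not the KerasLRN implementation):

```java
public class LocalResponseNormalizationSketch {
    // a: activations across channels at one spatial position.
    public static double[] normalize(double[] a, int depthRadius,
                                     double bias, double alpha, double beta) {
        double[] b = new double[a.length];
        for (int i = 0; i < a.length; i++) {
            double sum = 0.0;
            // Sum of squares over channels within depthRadius of channel i.
            for (int j = Math.max(0, i - depthRadius);
                 j <= Math.min(a.length - 1, i + depthRadius); j++) {
                sum += a[j] * a[j];
            }
            b[i] = a[i] / Math.pow(bias + alpha * sum, beta);
        }
        return b;
    }

    public static void main(String[] args) {
        double[] out = normalize(new double[] {1.0, 2.0, 3.0}, 1, 1.0, 1.0, 1.0);
        System.out.println(out[0]); // 1 / (1 + (1 + 4)) = 1/6
    }
}
```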

### Zero Padding

Maps Keras ZeroPadding1D and ZeroPadding2D layers to appropriate preprocessors.

```java { .api }
// Zero padding configuration:
// - padding: padding values for each dimension
// - Supports symmetric and asymmetric padding
```

### Merge Layers

Maps Keras Merge layers to DeepLearning4J merge vertices.

```java { .api }
// Supported merge modes:
// - add -> ElementWiseVertex with Add operation
// - multiply -> ElementWiseVertex with Product operation
// - average -> ElementWiseVertex with Average operation
// - maximum -> ElementWiseVertex with Max operation
// - concatenate -> MergeVertex
// - dot -> DotProductVertex
```
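
The element-wise modes all require equal input shapes, while `concatenate` joins inputs along a dimension. A 1D sketch of the merge semantics (illustrative):

```java
import java.util.Arrays;

public class MergeModes {
    // Merges two activations according to the Keras merge mode.
    public static double[] merge(String mode, double[] a, double[] b) {
        if ("concatenate".equals(mode)) {
            double[] out = Arrays.copyOf(a, a.length + b.length);
            System.arraycopy(b, 0, out, a.length, b.length);
            return out;
        }
        double[] out = new double[a.length]; // element-wise modes need equal lengths
        for (int i = 0; i < a.length; i++) {
            switch (mode) {
                case "add":      out[i] = a[i] + b[i]; break;
                case "multiply": out[i] = a[i] * b[i]; break;
                case "average":  out[i] = (a[i] + b[i]) / 2.0; break;
                case "maximum":  out[i] = Math.max(a[i], b[i]); break;
                default: throw new IllegalArgumentException("Unknown mode: " + mode);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        double[] a = {1.0, 2.0}, b = {3.0, 4.0};
        System.out.println(Arrays.toString(merge("add", a, b))); // [4.0, 6.0]
        System.out.println(merge("concatenate", a, b).length);   // 4
    }
}
```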

## Layer Mapping Process

### Automatic Layer Detection

The library automatically detects Keras layer types and maps them to appropriate DeepLearning4J layers:

```java { .api }
// KerasLayer factory method
public static KerasLayer getKerasLayerFromConfig(
        Map<String, Object> layerConfig,
        boolean enforceTrainingConfig
) throws InvalidKerasConfigurationException, UnsupportedKerasConfigurationException;
```
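
Detection starts from the `class_name` field of the layer's JSON configuration. A toy sketch of that first dispatch step; the handler names here are illustrative, not the library's actual classes:

```java
import java.util.HashMap;
import java.util.Map;

public class LayerClassDispatch {
    // Reads the Keras "class_name" field and picks a type-specific handler name.
    public static String dispatch(Map<String, Object> layerConfig) {
        String className = (String) layerConfig.get("class_name");
        if (className == null) {
            throw new IllegalArgumentException("Missing class_name in layer config");
        }
        switch (className) {
            case "Dense":      return "KerasDense";
            case "Dropout":    return "KerasDropout";
            case "Activation": return "KerasActivation";
            default:
                throw new UnsupportedOperationException("No mapping for " + className);
        }
    }

    public static void main(String[] args) {
        Map<String, Object> cfg = new HashMap<>();
        cfg.put("class_name", "Dense");
        System.out.println(dispatch(cfg)); // KerasDense
    }
}
```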

### Configuration Translation

Each layer type has specific configuration mapping:

1. **Parameter Names**: Keras parameter names are translated to DL4J equivalents
2. **Data Types**: Keras data types are converted to DL4J types
3. **Shapes**: Input/output shapes are properly configured
4. **Activations**: Activation functions are mapped between frameworks

### Weight Transfer

Weights are automatically transferred with proper shape handling:

```java { .api }
// Weight copying process:
// 1. Extract weights from HDF5 format
// 2. Handle parameter naming conventions (TensorFlow vs Theano backends)
// 3. Reshape weights to match DL4J expectations
// 4. Apply to corresponding DL4J layers
```
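
Step 3 usually means a layout transpose: for 2D convolutions, Keras with the TensorFlow backend stores kernels as `[height, width, inChannels, outChannels]` (HWIO), while DL4J expects `[outChannels, inChannels, height, width]` (OIHW). A plain-array sketch of that permutation (an illustration of the idea, not the library's actual code path):

```java
public class KernelPermute {
    // HWIO (Keras/TensorFlow) -> OIHW (DL4J) kernel layout transpose.
    public static double[][][][] hwioToOihw(double[][][][] k) {
        int h = k.length, w = k[0].length, in = k[0][0].length, out = k[0][0][0].length;
        double[][][][] r = new double[out][in][h][w];
        for (int i = 0; i < h; i++)
            for (int j = 0; j < w; j++)
                for (int c = 0; c < in; c++)
                    for (int o = 0; o < out; o++)
                        r[o][c][i][j] = k[i][j][c][o];
        return r;
    }

    public static void main(String[] args) {
        double[][][][] k = new double[3][3][2][4]; // 3x3 kernel, 2 in, 4 out channels
        k[1][2][0][3] = 7.0;
        System.out.println(hwioToOihw(k)[3][0][1][2]); // 7.0
    }
}
```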

## Supported Layer Types Reference

| Keras Layer | DL4J Mapping | Support Level |
|-------------|--------------|---------------|
| Dense | DenseLayer | Full |
| Convolution1D | ConvolutionLayer | Full |
| Convolution2D | ConvolutionLayer | Full |
| MaxPooling1D | SubsamplingLayer | Full |
| MaxPooling2D | SubsamplingLayer | Full |
| AveragePooling1D | SubsamplingLayer | Full |
| AveragePooling2D | SubsamplingLayer | Full |
| GlobalMaxPooling1D | GlobalPoolingLayer | Full |
| GlobalMaxPooling2D | GlobalPoolingLayer | Full |
| GlobalAveragePooling1D | GlobalPoolingLayer | Full |
| GlobalAveragePooling2D | GlobalPoolingLayer | Full |
| LSTM | LSTM | Full |
| Dropout | DropoutLayer | Full |
| Activation | ActivationLayer | Full |
| Flatten | Preprocessor | Full |
| Embedding | EmbeddingLayer | Full |
| BatchNormalization | BatchNormalization | Full |
| Merge | MergeVertex/ElementWiseVertex | Full |
| ZeroPadding1D | Preprocessor | Full |
| ZeroPadding2D | Preprocessor | Full |
| Input | InputType | Full |

## Unsupported Features

### Layer Types Not Supported

- Lambda layers with custom functions
- Custom layers without DL4J equivalents
- Some specialized layers from newer Keras versions

### Configuration Limitations

- Some advanced regularization techniques
- Certain constraint types
- Complex custom initializers

### Handling Unsupported Features

```java
ComputationGraph model;
try {
    // Strict import: fail if any training configuration is unsupported
    model = KerasModelImport.importKerasModelAndWeights("model.h5", true);
} catch (UnsupportedKerasConfigurationException e) {
    System.out.println("Unsupported feature: " + e.getMessage());
    // Retry with relaxed enforcement of the training configuration
    model = KerasModelImport.importKerasModelAndWeights("model.h5", false);
    System.out.println("Model imported with warnings");
}
```

## Custom Layer Extensions

For unsupported layer types, you can extend the library:

```java
// Example custom layer skeleton
public class MyCustomKerasLayer extends KerasLayer {

    public MyCustomKerasLayer(Map<String, Object> layerConfig, boolean enforceTrainingConfig)
            throws InvalidKerasConfigurationException, UnsupportedKerasConfigurationException {
        super(layerConfig, enforceTrainingConfig);
        // Parse the custom layer's configuration here
    }

    @Override
    public Layer getLayer() throws UnsupportedKerasConfigurationException {
        // Build and return the corresponding DL4J layer
        return this.layer;
    }

    @Override
    public InputType getOutputType(InputType... inputTypes)
            throws InvalidKerasConfigurationException, UnsupportedKerasConfigurationException {
        // Compute the output type from the input type(s); placeholder: pass through
        return inputTypes[0];
    }
}
```