# Random Operations

Keras provides random number generation functions for sampling from various probability distributions and performing stochastic operations. These functions support reproducible randomness through seed management.

## Capabilities

### Continuous Distributions

Functions for sampling from continuous probability distributions.

```python { .api }
def normal(shape, mean=0.0, stddev=1.0, dtype=None, seed=None):
    """
    Generate random samples from a normal (Gaussian) distribution.

    Parameters:
    - shape: Shape of the output tensor
    - mean: Mean of the normal distribution (default: 0.0)
    - stddev: Standard deviation of the normal distribution (default: 1.0)
    - dtype: Data type of the output (default: None)
    - seed: Random seed for reproducibility (default: None)

    Returns:
    Tensor of random samples from the normal distribution
    """

def uniform(shape, minval=0.0, maxval=1.0, dtype=None, seed=None):
    """
    Generate random samples from a uniform distribution.

    Parameters:
    - shape: Shape of the output tensor
    - minval: Lower bound of the uniform distribution (default: 0.0)
    - maxval: Upper bound of the uniform distribution (default: 1.0)
    - dtype: Data type of the output (default: None)
    - seed: Random seed for reproducibility (default: None)

    Returns:
    Tensor of random samples from the uniform distribution
    """

def truncated_normal(shape, mean=0.0, stddev=1.0, dtype=None, seed=None):
    """
    Generate random samples from a truncated normal distribution.

    Values more than 2 standard deviations from the mean are discarded and redrawn.

    Parameters:
    - shape: Shape of the output tensor
    - mean: Mean of the normal distribution (default: 0.0)
    - stddev: Standard deviation of the normal distribution (default: 1.0)
    - dtype: Data type of the output (default: None)
    - seed: Random seed for reproducibility (default: None)

    Returns:
    Tensor of random samples from the truncated normal distribution
    """

def beta(shape, alpha, beta, dtype=None, seed=None):
    """
    Generate random samples from a beta distribution.

    Parameters:
    - shape: Shape of the output tensor
    - alpha: Alpha parameter of the beta distribution
    - beta: Beta parameter of the beta distribution
    - dtype: Data type of the output (default: None)
    - seed: Random seed for reproducibility (default: None)

    Returns:
    Tensor of random samples from the beta distribution
    """

def gamma(shape, alpha, beta=None, dtype=None, seed=None):
    """
    Generate random samples from a gamma distribution.

    Parameters:
    - shape: Shape of the output tensor
    - alpha: Shape parameter (alpha) of the gamma distribution
    - beta: Rate parameter (beta) of the gamma distribution (default: None)
    - dtype: Data type of the output (default: None)
    - seed: Random seed for reproducibility (default: None)

    Returns:
    Tensor of random samples from the gamma distribution
    """
```

### Discrete Distributions

Functions for sampling from discrete probability distributions.

```python { .api }
def randint(shape, minval, maxval, dtype='int32', seed=None):
    """
    Generate random integers from a uniform distribution.

    Parameters:
    - shape: Shape of the output tensor
    - minval: Lower bound (inclusive) of the range
    - maxval: Upper bound (exclusive) of the range
    - dtype: Data type of the output (default: 'int32')
    - seed: Random seed for reproducibility (default: None)

    Returns:
    Tensor of random integers
    """

def binomial(shape, counts, probabilities, dtype=None, seed=None):
    """
    Generate random samples from binomial distributions.

    Parameters:
    - shape: Shape of the output tensor
    - counts: Number of trials for each binomial distribution
    - probabilities: Success probabilities for each trial
    - dtype: Data type of the output (default: None)
    - seed: Random seed for reproducibility (default: None)

    Returns:
    Tensor of random samples from binomial distributions
    """

def categorical(logits, num_samples, dtype=None, seed=None):
    """
    Generate random samples from categorical distributions.

    Parameters:
    - logits: 2D tensor of shape (batch_size, num_classes) with unnormalized log probabilities
    - num_samples: Number of samples to draw for each distribution
    - dtype: Data type of the output (default: None)
    - seed: Random seed for reproducibility (default: None)

    Returns:
    Tensor of shape (batch_size, num_samples) with sampled class indices
    """
```

### Utility Functions

Functions for data manipulation and stochastic operations.

```python { .api }
def shuffle(x, axis=0, seed=None):
    """
    Randomly shuffle a tensor along the specified axis.

    Parameters:
    - x: Input tensor to shuffle
    - axis: Axis along which to shuffle (default: 0)
    - seed: Random seed for reproducibility (default: None)

    Returns:
    Shuffled tensor with the same shape as the input
    """

def dropout(x, rate, noise_shape=None, seed=None):
    """
    Randomly sets input units to 0 with frequency `rate` at each step during training.

    Parameters:
    - x: Input tensor
    - rate: Fraction of input units to drop (between 0 and 1)
    - noise_shape: Shape for the generated random values (default: None, uses input shape)
    - seed: Random seed for reproducibility (default: None)

    Returns:
    Tensor with the same shape as the input, with random units set to 0
    """
```

### Seed Management

Tools for managing random number generation seeds for reproducible results.

```python { .api }
class SeedGenerator:
    """
    Random seed generator for reproducible randomness across operations.

    Manages seed state to ensure reproducible random number generation
    while allowing for different random sequences.

    Usage:
        seed_gen = SeedGenerator(42)
        x = keras.random.normal((10, 10), seed=seed_gen)
        y = keras.random.uniform((5, 5), seed=seed_gen)
    """

    def __init__(self, seed=None):
        """
        Initialize the seed generator.

        Parameters:
        - seed: Initial seed value (default: None for random initialization)
        """

    def next(self, ordered=True):
        """
        Generate the next seed value.

        Parameters:
        - ordered: Whether to generate seeds in deterministic order (default: True)

        Returns:
        Next seed value
        """

    def state(self):
        """
        Get the current state of the seed generator.

        Returns:
        Current seed generator state
        """
```

## Usage Examples

### Basic Random Sampling

```python
import keras
from keras import random

# Generate random tensors from different distributions
normal_samples = random.normal((100, 10), mean=0.0, stddev=1.0)
uniform_samples = random.uniform((50, 5), minval=-1.0, maxval=1.0)
truncated_samples = random.truncated_normal((20, 3), mean=0.0, stddev=0.5)

# Discrete sampling
integers = random.randint((10, 10), minval=0, maxval=100)
coin_flips = random.binomial((100,), counts=1, probabilities=0.5)

# Categorical sampling from logits
logits = keras.ops.array([[1.0, 2.0, 3.0], [2.0, 1.0, 0.5]])
samples = random.categorical(logits, num_samples=5)
```

### Reproducible Randomness

```python
import keras
from keras import random

# Using an integer seed for reproducibility
seed = 42
x1 = random.normal((10, 10), seed=seed)
x2 = random.normal((10, 10), seed=seed)  # Same as x1

# Using SeedGenerator for multiple operations with different seeds
seed_gen = random.SeedGenerator(42)
y1 = random.normal((10, 10), seed=seed_gen)
y2 = random.uniform((10, 10), seed=seed_gen)  # Different from y1 but reproducible

# Re-create the seed generator to reproduce the same sequence
seed_gen = random.SeedGenerator(42)
z1 = random.normal((10, 10), seed=seed_gen)
z2 = random.uniform((10, 10), seed=seed_gen)
# z1 and z2 are identical to y1 and y2
```

### Data Augmentation and Training

```python
import keras
from keras import random

def data_augmentation(x, training=True):
    """Apply random data augmentation during training."""
    if training:
        # Random dropout
        x = random.dropout(x, rate=0.1)

        # Add random noise
        noise = random.normal(keras.ops.shape(x), stddev=0.01)
        x = x + noise

        # Randomly shuffle the batch
        x = random.shuffle(x)

    return x

# Use in a custom layer
class AugmentationLayer(keras.layers.Layer):
    def __init__(self, dropout_rate=0.1, noise_stddev=0.01):
        super().__init__()
        self.dropout_rate = dropout_rate
        self.noise_stddev = noise_stddev
        self.seed_gen = random.SeedGenerator()

    def call(self, x, training=None):
        if training:
            x = random.dropout(x, self.dropout_rate, seed=self.seed_gen)
            noise = random.normal(
                keras.ops.shape(x),
                stddev=self.noise_stddev,
                seed=self.seed_gen
            )
            x = x + noise
        return x
```

### Custom Initialization with Random Functions

```python
import keras
from keras import random

class CustomInitializer(keras.initializers.Initializer):
    def __init__(self, seed=None):
        # A SeedGenerator yields a fresh seed per draw, so the two samples
        # below are reproducible without reusing the same seed
        self.seed_gen = random.SeedGenerator(seed)

    def __call__(self, shape, dtype=None, **kwargs):
        # Custom initialization combining two random distributions
        base = random.uniform(shape, minval=-0.1, maxval=0.1, seed=self.seed_gen)
        perturbation = random.normal(shape, stddev=0.01, seed=self.seed_gen)
        return base + perturbation

# Use the custom initializer
layer = keras.layers.Dense(
    64,
    kernel_initializer=CustomInitializer(seed=42),
    activation='relu'
)
```

## Best Practices

### Seed Management
- Use `SeedGenerator` for reproducible yet varied random sequences
- Set seeds for debugging and reproducible experiments
- Avoid reusing the same seed for different operations

### Performance Considerations
- Generate random numbers in batches when possible
- Use appropriate data types to avoid unnecessary conversions
- Consider using truncated normal for weight initialization

### Training Stability
- Use dropout with appropriate rates (typically 0.1-0.5)
- Add small amounts of noise for regularization
- Shuffle data at the batch level, not individual samples