# Utilities and Helper Functions

This module provides general utilities for model checking, parameter estimation, coordinate transformations, and integration with external tools. These utilities support model development, validation, analysis, and interoperability with other software systems.

## Binary File Readers

Classes for reading MODFLOW binary output files, including head files, budget files, and concentration files. These are essential for post-processing model results.

### HeadFile

Read binary head output files from MODFLOW models, providing access to 2D or 3D head arrays and time series data.

```python { .api }
class HeadFile:
    """Head file reader for MODFLOW binary output"""
    def __init__(self, filename: Union[str, os.PathLike], text: str = 'head', precision: str = 'auto', verbose: bool = False, **kwargs): ...
    def get_data(self, kstpkper=None, idx=None, totim=None, **kwargs) -> np.ndarray: ...
    def get_alldata(self, nodata=-999.99) -> np.ndarray: ...
    def get_ts(self, idx) -> np.ndarray: ...
    @property
    def headers(self) -> np.recarray: ...
    @property
    def times(self) -> list: ...
```

**Parameters:**
- `filename` (str | PathLike): Path to the head file
- `text` (str): Text string identifier (default: 'head')
- `precision` (str): Data precision ('auto', 'single', 'double')
- `verbose` (bool): Enable verbose output

**Key Methods:**
- `get_data()`: Retrieve head data for specific time steps
- `get_alldata()`: Get all head data in the file
- `get_ts()`: Get time series data for specific cells

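A minimal sketch of reading heads with `HeadFile`, assuming a completed run wrote a binary head file named `model.hds` (the filename and the cell index used for the time series are placeholders):

```python
import flopy.utils as fpu

# Open the binary head file (hypothetical filename)
hds = fpu.HeadFile('model.hds')
print(hds.times)  # output times available in the file

# 3D head array for the first time step of the first stress period
head = hds.get_data(kstpkper=(0, 0))
print(head.shape, head.min(), head.max())

# Head time series at one cell; idx is a (layer, row, col) tuple
ts = hds.get_ts((0, 5, 5))
print(ts[:5])  # first column is totim, second is head at that cell

hds.close()
```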
### CellBudgetFile

Read binary cell budget files containing flow terms and mass balance information.

```python { .api }
class CellBudgetFile:
    """Cell budget file reader for MODFLOW binary output"""
    def __init__(self, filename: Union[str, os.PathLike], precision: str = 'single', verbose: bool = False, **kwargs): ...
    def get_data(self, kstpkper=None, text=None, paknam=None, totim=None, idx=None, **kwargs) -> list | np.ndarray: ...
    def get_times(self) -> list: ...
    def get_kstpkper(self) -> list: ...
    def list_records(self) -> list: ...
    @property
    def recordarray(self) -> np.recarray: ...
```

**Parameters:**
- `filename` (str | PathLike): Path to the budget file
- `precision` (str): Data precision ('single' or 'double')
- `verbose` (bool): Enable verbose output

**Key Methods:**
- `get_data()`: Retrieve budget data by time step and flow component
- `get_times()`: Get list of simulation times
- `list_records()`: List all available budget records

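A short sketch of inspecting a budget file, assuming the run wrote a cell-by-cell budget file named `model.cbc`; the record text used here ('FLOW RIGHT FACE') depends on the packages and output options of the model:

```python
import flopy.utils as fpu

cbc = fpu.CellBudgetFile('model.cbc')  # hypothetical filename

# Inspect what is available before pulling data
cbc.list_records()
print(cbc.get_times())
print(cbc.get_kstpkper())

# One flow term for the first output time; get_data returns a list of records
frf = cbc.get_data(text='FLOW RIGHT FACE', kstpkper=(0, 0))[0]
print(frf.shape)

cbc.close()
```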
### UcnFile

Read binary concentration files from MT3DMS transport models.

```python { .api }
class UcnFile:
    """Concentration file reader for MT3DMS binary output"""
    def __init__(self, filename: Union[str, os.PathLike], text: str = 'concentration', precision: str = 'single', verbose: bool = False, **kwargs): ...
    def get_data(self, kstpkper=None, idx=None, totim=None, **kwargs) -> np.ndarray: ...
    def get_alldata(self, nodata=-999.99) -> np.ndarray: ...
    @property
    def headers(self) -> np.recarray: ...
```
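Reading concentrations mirrors the head-file workflow; a minimal sketch, assuming an MT3DMS run wrote `MT3D001.UCN` (a conventional but here hypothetical filename):

```python
import flopy.utils as fpu

ucn = fpu.UcnFile('MT3D001.UCN')

# All concentration arrays, stacked along the time axis
all_conc = ucn.get_alldata()
print(all_conc.shape)

# Concentration array for one time step / stress period
conc = ucn.get_data(kstpkper=(0, 0))
print(conc.min(), conc.max())

ucn.close()
```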
### ZoneBudget

Perform zone budget analysis on MODFLOW cell budget data.

```python { .api }
class ZoneBudget:
    """Zone budget analysis for MODFLOW models"""
    def __init__(self, cbc_file, z, kstpkper=None, totim=None, aliases=None, verbose: bool = False, **kwargs): ...
    def get_budget(self, names=None, zones=None, **kwargs) -> pd.DataFrame: ...
    def get_dataframes(self, **kwargs) -> dict: ...
    def to_csv(self, fname, **kwargs) -> None: ...
    @staticmethod
    def read_zone_file(fname) -> np.ndarray: ...
```

**Parameters:**
- `cbc_file`: CellBudgetFile object or path to budget file
- `z` (np.ndarray): Zone array defining budget zones
- `kstpkper` (tuple): Time step and stress period
- `totim` (float): Total simulation time
- `aliases` (dict): Optional mapping of zone numbers to descriptive names

**Key Methods:**
- `get_budget()`: Calculate zone budget for specified zones
- `get_dataframes()`: Get budget data as pandas DataFrames
- `to_csv()`: Export budget results to CSV

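A minimal zone-budget sketch, assuming a budget file `model.cbc` and a two-zone integer array shaped like the model grid (filenames, shapes, and aliases are placeholders):

```python
import numpy as np
import flopy.utils as fpu

# Integer zone array matching the model grid shape (nlay, nrow, ncol)
zones = np.ones((1, 20, 30), dtype=int)
zones[:, :, 15:] = 2  # eastern half of the domain is zone 2

zb = fpu.ZoneBudget('model.cbc', zones, kstpkper=(0, 0),
                    aliases={1: 'WEST', 2: 'EAST'})

budget = zb.get_budget()     # budget terms for every zone
print(budget)

zb.to_csv('zonebudget.csv')  # export for reporting
```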
### BinaryHeader

Handle binary file headers for MODFLOW output files.

```python { .api }
class BinaryHeader:
    """Binary file header handling"""
    def __init__(self, bintype=None, precision: str = 'single'): ...
    def set_values(self, **kwargs): ...
    @staticmethod
    def create(bintype=None, precision: str = 'single', **kwargs): ...
```
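Headers are mostly needed when writing binary files by hand; a small sketch of building a head-file header with `create()` (the keyword values are illustrative assumptions):

```python
import flopy.utils as fpu

# Header record for a single-layer, double-precision head array
header = fpu.BinaryHeader.create(
    bintype='head', precision='double', text='head',
    kstp=1, kper=1, pertim=1.0, totim=1.0,
    ncol=30, nrow=20, ilay=1,
)
print(header)
```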
## Model Execution

### run_model

Execute MODFLOW models with subprocess management and output capture.

```python { .api }
def run_model(
    namefile: str,
    exe_name: str,
    model_ws: str = '.',
    silent: bool = False,
    pause: bool = False,
    report: bool = False,
    normal_msg: str = 'normal termination',
    **kwargs
) -> tuple[bool, list[str]]:
    """Execute model runs with subprocess management"""
    ...
```

**Parameters:**
- `namefile` (str): MODFLOW name file
- `exe_name` (str): Executable name or path
- `model_ws` (str): Model workspace directory
- `silent` (bool): Suppress output during execution
- `pause` (bool): Pause for user input after execution
- `report` (bool): Print execution report
- `normal_msg` (str): Message indicating successful completion

**Returns:**
- `tuple[bool, list[str]]`: Success status and output lines

### which

Find an executable on the system PATH.

```python { .api }
def which(program: str) -> str | None:
    """Find executable paths (re-exported from shutil); returns None if not found"""
    ...
```
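A short sketch combining the two helpers: locate the executable first, then launch the run with the keyword arguments documented above (the executable and name-file names are placeholders):

```python
import flopy.utils as fpu

exe = fpu.which('mf2005')
if exe is None:
    raise FileNotFoundError('mf2005 was not found on the system PATH')

success, buff = fpu.run_model(
    namefile='model.nam',   # hypothetical name file
    exe_name=exe,
    model_ws='./model',
    report=True,            # capture stdout in buff
)
print('normal termination' if success else 'run failed', f'({len(buff)} output lines)')
```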
## Model Validation

### check

Comprehensive model checking function for validation and error detection.

```python { .api }
def check(
    model: object,
    f: str = None,
    verbose: bool = True,
    level: int = 1,
    **kwargs
) -> object:
    """Model checking function for validation and diagnostics"""
    ...
```

**Parameters:**
- `model` (object): FloPy model object to check
- `f` (str): Output file for check results
- `verbose` (bool): Print detailed check results
- `level` (int): Check level (0=basic, 1=standard, 2=detailed)

**Returns:**
- `object`: Check results object with summary and details
## Flow Analysis

### get_specific_discharge

Calculate specific discharge from flow fields.

```python { .api }
def get_specific_discharge(
    frf: np.ndarray,
    fff: np.ndarray,
    flf: np.ndarray,
    grid: object,
    head: np.ndarray = None,
    **kwargs
) -> tuple[np.ndarray, np.ndarray, np.ndarray]:
    """Calculate specific discharge from flow fields"""
    ...
```

**Parameters:**
- `frf` (np.ndarray): Flow right face
- `fff` (np.ndarray): Flow front face
- `flf` (np.ndarray): Flow lower face
- `grid` (object): Model grid object
- `head` (np.ndarray): Head array for calculations

**Returns:**
- `tuple[np.ndarray, np.ndarray, np.ndarray]`: Specific discharge components (qx, qy, qz)

### get_transmissivities

Calculate transmissivities from hydraulic parameters.

```python { .api }
def get_transmissivities(
    heads: np.ndarray,
    m: object,
    r: int = None,
    c: int = None,
    x: float = None,
    y: float = None,
    sctop: np.ndarray = None,
    scbot: np.ndarray = None,
    **kwargs
) -> np.ndarray:
    """Calculate transmissivities from model parameters"""
    ...
```

**Parameters:**
- `heads` (np.ndarray): Head array
- `m` (object): MODFLOW model object
- `r` (int): Row index for specific location
- `c` (int): Column index for specific location
- `x` (float): X coordinate for interpolation
- `y` (float): Y coordinate for interpolation
- `sctop` (np.ndarray): Screen top elevations
- `scbot` (np.ndarray): Screen bottom elevations
## Time Utilities

### totim_to_datetime

Convert model time to datetime objects.

```python { .api }
def totim_to_datetime(
    totim: ArrayData,
    start: str = '1-1-1970',
    timeunit: str = 'days',
    **kwargs
) -> list:
    """Convert model time to datetime objects"""
    ...
```

**Parameters:**
- `totim` (ArrayData): Total elapsed times
- `start` (str): Starting date string
- `timeunit` (str): Time units ('days', 'hours', 'minutes', 'seconds', 'years')

**Returns:**
- `list`: List of datetime objects
## File Parsing

### parsenamefile

Parse MODFLOW name files and extract package information.

```python { .api }
def parsenamefile(
    namefile: str,
    packages: dict = None,
    **kwargs
) -> tuple[list, dict]:
    """Parse MODFLOW name files"""
    ...
```

**Parameters:**
- `namefile` (str): Path to MODFLOW name file
- `packages` (dict): Package type definitions

**Returns:**
- `tuple[list, dict]`: Package list and file dictionary
## Array Utilities

### create_empty_recarray

Create empty record arrays with specified dtypes.

```python { .api }
def create_empty_recarray(
    length: int,
    dtype: np.dtype,
    default_value: float = 0.0,
    **kwargs
) -> np.recarray:
    """Create empty record arrays"""
    ...
```

### recarray

Create and manipulate record arrays for MODFLOW data.

```python { .api }
def recarray(
    data: ArrayData,
    dtype: np.dtype = None,
    **kwargs
) -> np.recarray:
    """Create record arrays from data"""
    ...
```

### ra_slice

Slice record arrays by column name, returning a new record array with only the selected fields.

```python { .api }
def ra_slice(
    ra: np.recarray,
    columns: list[str],
    **kwargs
) -> np.recarray:
    """Slice record arrays by columns"""
    ...
```
## File I/O Utilities

### read_fixed_var

Parse fixed-format variables from a line of text.

```python { .api }
def read_fixed_var(
    line: str,
    dtype: type = float,
    **kwargs
) -> object:
    """Read fixed format variables"""
    ...
```

### write_fixed_var

Write variables as a fixed-format string.

```python { .api }
def write_fixed_var(
    value: object,
    dtype: type = float,
    width: int = 10,
    **kwargs
) -> str:
    """Write fixed format variables"""
    ...
```
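A small round-trip sketch using the signatures documented above; the exact field layout accepted by a given MODFLOW package may differ, so treat this as illustrative only:

```python
import flopy.utils as fpu

# Write three values into 10-character fixed-width fields
line = fpu.write_fixed_var([1.5, 2.25, 3.0], dtype=float, width=10)
print(repr(line))

# Parse the same line back into floats
values = fpu.read_fixed_var(line, dtype=float)
print(values)
```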
### read1d

Fill a 1D array by reading values from an open file.

```python { .api }
def read1d(
    f: object,
    array: np.ndarray,
    **kwargs
) -> np.ndarray:
    """Read 1D arrays from files or other sources"""
    ...
```
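A minimal sketch, assuming `read1d` fills a pre-allocated array from an open text file (the file name `bottom.txt` and the array length are placeholders):

```python
import numpy as np
import flopy.utils as fpu

ncol = 30
arr = np.empty(ncol, dtype=np.float32)

# 'bottom.txt' is a hypothetical whitespace-delimited file with ncol values
with open('bottom.txt') as f:
    arr = fpu.read1d(f, arr)
print(arr[:5])
```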
## Dependency Management

### import_optional_dependency

Dynamic import utility for optional dependencies.

```python { .api }
def import_optional_dependency(
    name: str,
    extra: str = None,
    **kwargs
) -> object:
    """Dynamic import utility for optional packages"""
    ...
```

**Parameters:**
- `name` (str): Package name to import
- `extra` (str): Additional information for error messages

**Returns:**
- `object`: The imported module; raises ImportError if the package is unavailable

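A short sketch of guarding an optional import with the parameters documented above; the exact error-message keyword may vary between FloPy versions:

```python
import flopy.utils as fpu

# Import shapely only if it is installed; otherwise an ImportError is
# raised that includes the extra hint below.
shapely = fpu.import_optional_dependency(
    'shapely',
    extra='shapely is required for GridIntersect',
)
print(shapely.__version__)
```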
## Array Classes

### Util2d

2D utility arrays for MODFLOW input data management.

```python { .api }
class Util2d:
    """2D utility arrays for MODFLOW data"""
    def __init__(
        self,
        model: object,
        shape: tuple[int, int],
        dtype: type = np.float32,
        value: ArrayData = 0.0,
        name: str = 'util2d',
        fmtin: str = None,
        **kwargs
    ): ...

    @property
    def array(self) -> np.ndarray:
        """Get array data"""
        ...

    def get_file_entry(self) -> str:
        """Get file entry string for MODFLOW input"""
        ...

    def write_file(
        self,
        f_name: str = None,
        **kwargs
    ) -> str:
        """Write array to file"""
        ...

    @classmethod
    def load(
        cls,
        f: object,
        model: object,
        shape: tuple[int, int],
        dtype: type,
        **kwargs
    ) -> 'Util2d':
        """Load Util2d from file"""
        ...
```
### Util3d

3D utility arrays for layered model data.

```python { .api }
class Util3d:
    """3D utility arrays for MODFLOW data"""
    def __init__(
        self,
        model: object,
        shape: tuple[int, int, int],
        dtype: type = np.float32,
        value: ArrayData = 0.0,
        name: str = 'util3d',
        fmtin: str = None,
        **kwargs
    ): ...

    @property
    def array(self) -> np.ndarray:
        """Get 3D array data"""
        ...

    def get_file_entry(self) -> list[str]:
        """Get file entries for each layer"""
        ...

    def write_file(
        self,
        f_name: str = None,
        **kwargs
    ) -> list[str]:
        """Write 3D array to file(s)"""
        ...
```
### Transient2d

Transient 2D arrays for time-varying data.

```python { .api }
class Transient2d:
    """Transient 2D arrays for time-varying MODFLOW data"""
    def __init__(
        self,
        model: object,
        shape: tuple[int, int],
        dtype: type = np.float32,
        value: ArrayData = 0.0,
        name: str = 'transient2d',
        **kwargs
    ): ...

    def get_data_array(
        self,
        kper: int = 0,
        **kwargs
    ) -> np.ndarray:
        """Get array for specific stress period"""
        ...

    def add_record(
        self,
        kper: int,
        array: np.ndarray,
        **kwargs
    ) -> None:
        """Add array for stress period"""
        ...
```
### Transient3d

Transient 3D arrays for time-varying layered data.

```python { .api }
class Transient3d:
    """Transient 3D arrays for time-varying MODFLOW data"""
    def __init__(
        self,
        model: object,
        shape: tuple[int, int, int],
        dtype: type = np.float32,
        value: ArrayData = 0.0,
        name: str = 'transient3d',
        **kwargs
    ): ...

    def get_data_array(
        self,
        kper: int = 0,
        **kwargs
    ) -> np.ndarray:
        """Get 3D array for specific stress period"""
        ...
```
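A brief sketch of time-varying layered data against the interface documented above, following the same pattern as the `Transient2d` recharge example in the Usage Examples section (the model object and shape are placeholders):

```python
import numpy as np
import flopy
import flopy.utils as fpu

mf = flopy.modflow.Modflow(modelname='t3d_example')
nlay, nrow, ncol = 3, 20, 30

# Constant starting value in every stress period unless a record is added
sconc = fpu.Transient3d(
    model=mf,
    shape=(nlay, nrow, ncol),
    dtype=np.float32,
    value=0.0,
    name='sconc',
)

print(sconc.get_data_array(0).shape)  # (nlay, nrow, ncol) for stress period 0
```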
### MfList

List-based data structures for boundary conditions and other tabular data.

```python { .api }
class MfList:
    """List-based data structures for MODFLOW"""
    def __init__(
        self,
        model: object,
        dtype: np.dtype = None,
        data: ArrayData = None,
        **kwargs
    ): ...

    @property
    def array(self) -> np.recarray:
        """Get record array data"""
        ...

    def add_record(
        self,
        kper: int,
        values: list,
        **kwargs
    ) -> None:
        """Add record for stress period"""
        ...

    def get_data(
        self,
        kper: int = 0,
        **kwargs
    ) -> np.recarray:
        """Get data for specific stress period"""
        ...
```
## Grid Utilities

### GridIntersect

Grid intersection utilities for spatial analysis.

```python { .api }
class GridIntersect:
    """Grid intersection utilities"""
    def __init__(
        self,
        grid: object,
        method: str = 'vertex',
        **kwargs
    ): ...

    def intersect_point(
        self,
        point: tuple[float, float],
        **kwargs
    ) -> int:
        """Find cell containing point"""
        ...

    def intersect_linestring(
        self,
        linestring: list[tuple[float, float]],
        **kwargs
    ) -> list[dict]:
        """Intersect linestring with grid"""
        ...

    def intersect_polygon(
        self,
        polygon: list[tuple[float, float]],
        **kwargs
    ) -> list[dict]:
        """Intersect polygon with grid"""
        ...
```
### ModflowGridIndices

Grid indexing utilities for converting between node numbers and layer-row-column indices.

```python { .api }
class ModflowGridIndices:
    """Grid indexing utilities"""
    def __init__(
        self,
        grid: object,
        **kwargs
    ): ...

    def get_lrc_from_node(
        self,
        nodes: list[int],
        **kwargs
    ) -> list[tuple[int, int, int]]:
        """Convert node numbers to layer-row-column"""
        ...

    def get_node_from_lrc(
        self,
        lrc: list[tuple[int, int, int]],
        **kwargs
    ) -> list[int]:
        """Convert layer-row-column to node numbers"""
        ...
```
### Raster

Raster data handling and manipulation.

```python { .api }
class Raster:
    """Raster data handling"""
    def __init__(
        self,
        array: np.ndarray,
        bands: list = None,
        crs: object = None,
        transform: object = None,
        **kwargs
    ): ...

    def sample_polygon(
        self,
        polygon: list[tuple[float, float]],
        **kwargs
    ) -> dict:
        """Sample raster values within polygon"""
        ...

    def sample_points(
        self,
        points: list[tuple[float, float]],
        **kwargs
    ) -> list[float]:
        """Sample raster values at points"""
        ...

    def resample_to_grid(
        self,
        modelgrid: object,
        **kwargs
    ) -> np.ndarray:
        """Resample raster to model grid"""
        ...
```
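A sketch of mapping raster data onto a model grid. It assumes a GeoTIFF named `dem.tif`, a `Raster.load` convenience constructor, and `band`/`method` keywords on `resample_to_grid`; those details go beyond the interface listed above, and rasterio must be installed:

```python
import flopy
import flopy.utils as fpu

mf = flopy.modflow.Modflow(modelname='raster_example')
dis = flopy.modflow.ModflowDis(mf, nlay=1, nrow=20, ncol=30,
                               delr=100.0, delc=100.0)

# Hypothetical DEM covering the model area
dem = fpu.Raster.load('dem.tif')

# Resample the raster onto the model grid, e.g. to use as the model top
top = dem.resample_to_grid(mf.modelgrid, band=dem.bands[0], method='nearest')
print(top.shape)
```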
## MODFLOW Executable Management

### get_modflow

Download and manage MODFLOW executables.

```python { .api }
def get_modflow(
    bindir: str = None,
    **kwargs
) -> str:
    """Download and manage MODFLOW executables"""
    ...
```

### cli_main

Command-line interface for the get-modflow script.

```python { .api }
def cli_main() -> None:
    """Command-line interface for get-modflow script"""
    ...
```
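A sketch of fetching executables into a local folder; the target directory is a placeholder, and the equivalent command-line invocation is shown as a comment:

```python
import flopy.utils as fpu

# Download MODFLOW and related executables into ./bin
# (equivalent CLI usage: `get-modflow ./bin`)
fpu.get_modflow(bindir='./bin')
```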
## Option Handling

### OptionBlock

Parse and manage MODFLOW option blocks.

```python { .api }
class OptionBlock:
    """Options block parsing and management"""
    def __init__(
        self,
        options_line: str,
        package: object,
        **kwargs
    ): ...

    def parse_options(
        self,
        options_line: str,
        **kwargs
    ) -> dict:
        """Parse options from input line"""
        ...

    def write_options(
        self,
        f: object,
        **kwargs
    ) -> None:
        """Write options to file"""
        ...
```
## Usage Examples

### Model Execution and Validation

```python
import flopy
import flopy.utils as fpu
import numpy as np

# Create or load a model
mf = flopy.modflow.Modflow(modelname='test_model')

# Add basic packages
dis = flopy.modflow.ModflowDis(mf, nlay=1, nrow=10, ncol=10)
bas = flopy.modflow.ModflowBas(mf, ibound=1, strt=1.0)
lpf = flopy.modflow.ModflowLpf(mf, hk=1.0, sy=0.1, ss=1e-5)
pcg = flopy.modflow.ModflowPcg(mf)
oc = flopy.modflow.ModflowOc(mf)

# Write input files
mf.write_input()

# Check model before running
print("Checking model...")
chk = fpu.check(mf, verbose=True, level=2)

if chk.summary_array.passed.all():
    print("All checks passed!")

    # Run model using utility function
    success, output = fpu.run_model(
        namefile='test_model.nam',
        exe_name='mf2005',
        model_ws=mf.model_ws,
        report=True
    )

    if success:
        print("Model run completed successfully")
        print("Output:")
        for line in output[-10:]:  # Last 10 lines
            print(f"  {line}")
    else:
        print("Model run failed")
        for line in output:
            print(f"  {line}")
else:
    print("Model check failed:")
    failed_checks = chk.summary_array[~chk.summary_array.passed]
    for check in failed_checks:
        print(f"  {check.desc}: {check.summary}")
```
### Flow Analysis and Post-Processing

```python
import flopy
import flopy.utils as fpu
import numpy as np
import matplotlib.pyplot as plt

# Load model and results
mf = flopy.modflow.Modflow.load('model.nam')

# Read flow data
cbb = fpu.CellBudgetFile('model.cbb')
frf = cbb.get_data(text='FLOW RIGHT FACE')[0]
fff = cbb.get_data(text='FLOW FRONT FACE')[0]
flf = cbb.get_data(text='FLOW LOWER FACE')[0]

# Read head data
hds = fpu.HeadFile('model.hds')
head = hds.get_data(kstpkper=(0, 0))

# Calculate specific discharge
qx, qy, qz = fpu.get_specific_discharge(
    frf, fff, flf,
    grid=mf.modelgrid,
    head=head
)

print("Specific discharge statistics:")
print(f"  qx range: {qx.min():.2e} to {qx.max():.2e}")
print(f"  qy range: {qy.min():.2e} to {qy.max():.2e}")
print(f"  qz range: {qz.min():.2e} to {qz.max():.2e}")

# Calculate magnitude
q_magnitude = np.sqrt(qx**2 + qy**2 + qz**2)
print(f"  q magnitude range: {q_magnitude.min():.2e} to {q_magnitude.max():.2e}")

# Calculate transmissivities at specific locations
well_locations = [(0, 5, 5), (0, 3, 7), (0, 8, 2)]

for i, (lay, row, col) in enumerate(well_locations):
    trans = fpu.get_transmissivities(
        heads=head,
        m=mf,
        r=row,
        c=col
    )
    print(f"Transmissivity at well {i+1} (L{lay},R{row},C{col}): {trans[lay]:.2f}")

# Close files
cbb.close()
hds.close()
```
### Array Utilities and Data Management

```python
import flopy
import flopy.utils as fpu
import numpy as np

# Create model
mf = flopy.modflow.Modflow(modelname='array_example')

# Create 2D array with Util2d
nrow, ncol = 20, 30
k_array = fpu.Util2d(
    model=mf,
    shape=(nrow, ncol),
    dtype=np.float32,
    value=1.0,
    name='hydraulic_conductivity'
)

# Set different K values by zone
k_data = np.ones((nrow, ncol))
k_data[:10, :] = 5.0   # High K zone (top half)
k_data[10:, :] = 0.1   # Low K zone (bottom half)
k_data[:, :5] = 50.0   # Very high K zone (left edge)

k_array.array = k_data

print("K array statistics:")
print(f"  Shape: {k_array.array.shape}")
print(f"  Min: {k_array.array.min()}")
print(f"  Max: {k_array.array.max()}")
print(f"  Mean: {k_array.array.mean():.2f}")

# Create 3D array with Util3d
nlay = 3
porosity_3d = fpu.Util3d(
    model=mf,
    shape=(nlay, nrow, ncol),
    dtype=np.float32,
    value=0.25,
    name='porosity'
)

# Set layer-specific porosity
por_data = np.full((nlay, nrow, ncol), 0.25)
por_data[0, :, :] = 0.35  # Higher porosity in top layer
por_data[1, :, :] = 0.20  # Lower porosity in middle layer
por_data[2, :, :] = 0.15  # Lowest porosity in bottom layer

porosity_3d.array = por_data

print("\nPorosity 3D array statistics:")
print(f"  Shape: {porosity_3d.array.shape}")
for lay in range(nlay):
    layer_data = porosity_3d.array[lay, :, :]
    print(f"  Layer {lay+1}: mean = {layer_data.mean():.3f}")

# Create transient 2D array for recharge
recharge_transient = fpu.Transient2d(
    model=mf,
    shape=(nrow, ncol),
    dtype=np.float32,
    value=0.001,
    name='recharge'
)

# Add seasonal recharge patterns
base_recharge = np.full((nrow, ncol), 0.001)

# Winter (high recharge)
winter_recharge = base_recharge * 3.0
recharge_transient.add_record(0, winter_recharge)

# Spring (medium recharge)
spring_recharge = base_recharge * 1.5
recharge_transient.add_record(1, spring_recharge)

# Summer (low recharge)
summer_recharge = base_recharge * 0.5
recharge_transient.add_record(2, summer_recharge)

# Fall (medium recharge)
fall_recharge = base_recharge * 1.2
recharge_transient.add_record(3, fall_recharge)

print("\nTransient recharge statistics:")
for kper in range(4):
    rch_data = recharge_transient.get_data_array(kper)
    season = ['Winter', 'Spring', 'Summer', 'Fall'][kper]
    print(f"  {season}: mean = {rch_data.mean():.4f}")

# Create MfList for well data
well_dtype = np.dtype([
    ('k', int), ('i', int), ('j', int), ('flux', float)
])

well_data = fpu.MfList(
    model=mf,
    dtype=well_dtype
)

# Add wells for different stress periods
# Stress period 0: Base pumping
wells_sp0 = [
    (0, 10, 15, -1000.0),  # Production well
    (0, 5, 20, -500.0),    # Smaller well
    (0, 15, 10, 200.0)     # Injection well
]

for well in wells_sp0:
    well_data.add_record(0, well)

# Stress period 1: Increased pumping
wells_sp1 = [
    (0, 10, 15, -1500.0),  # Increased pumping
    (0, 5, 20, -750.0),    # Increased pumping
    (0, 15, 10, 300.0),    # Increased injection
    (0, 8, 25, -800.0)     # Additional well
]

for well in wells_sp1:
    well_data.add_record(1, well)

print("\nWell data statistics:")
for kper in range(2):
    wells = well_data.get_data(kper)
    total_pumping = np.sum(wells['flux'][wells['flux'] < 0])
    total_injection = np.sum(wells['flux'][wells['flux'] > 0])
    print(f"  Period {kper}: {len(wells)} wells, pumping = {total_pumping:.0f}, injection = {total_injection:.0f}")
```
### Time Conversion and Utilities

```python
import flopy.utils as fpu
import numpy as np
from datetime import datetime

# Model simulation times (in days)
totim = np.array([0, 30, 60, 91, 121, 152, 182, 213, 244, 274, 305, 335, 365])

# Convert to datetime objects
start_date = '2023-01-01'
datetimes = fpu.totim_to_datetime(
    totim=totim,
    start=start_date,
    timeunit='days'
)

print("Time conversion:")
print(f"Start date: {start_date}")
for time, dt in zip(totim, datetimes):
    print(f"  Day {time:3.0f}: {dt.strftime('%Y-%m-%d')}")

# Different time units
totim_hours = np.array([0, 6, 12, 24, 48, 72])
datetimes_hours = fpu.totim_to_datetime(
    totim=totim_hours,
    start='2023-06-15 08:00:00',
    timeunit='hours'
)

print("\nHourly simulation:")
for time, dt in zip(totim_hours, datetimes_hours):
    print(f"  Hour {time:2.0f}: {dt.strftime('%Y-%m-%d %H:%M:%S')}")

# Parse name file
try:
    packages, files = fpu.parsenamefile('model.nam')

    print("\nName file contents:")
    print(f"  Found {len(packages)} packages:")
    for pkg in packages:
        ftype, unit, fname, status = pkg
        print(f"    {ftype:8s} (unit {unit:2d}): {fname}")

    print("\n  File dictionary:")
    for key, value in files.items():
        print(f"    {key}: {value}")

except FileNotFoundError:
    print("Name file not found")

# Create and manipulate record arrays
dtype = np.dtype([
    ('layer', int),
    ('row', int),
    ('col', int),
    ('head', float),
    ('drawdown', float)
])

# Create empty record array
obs_data = fpu.create_empty_recarray(10, dtype, default_value=-999.0)

# Fill with synthetic observation data
for i in range(10):
    obs_data[i] = (
        1,                                # layer
        np.random.randint(0, 20),         # row
        np.random.randint(0, 30),         # col
        100.0 + np.random.normal(0, 5),   # head
        np.random.exponential(2.0)        # drawdown
    )

print("\nObservation data:")
print(f"  Array shape: {obs_data.shape}")
print(f"  Data type: {obs_data.dtype}")
print("  First 5 records:")
for i in range(5):
    print(f"    {obs_data[i]}")

# Slice record array
head_data = fpu.ra_slice(obs_data, ['layer', 'row', 'col', 'head'])
print("\nSliced array (head data only):")
print(f"  New shape: {head_data.shape}")
print(f"  New dtype: {head_data.dtype}")

# Statistics
print("\nObservation statistics:")
print(f"  Mean head: {obs_data['head'].mean():.2f}")
print(f"  Mean drawdown: {obs_data['drawdown'].mean():.2f}")
print(f"  Head range: {obs_data['head'].min():.2f} to {obs_data['head'].max():.2f}")
```
### Grid Intersection and Spatial Analysis

```python
import flopy
import flopy.utils as fpu
import numpy as np

# Create model with grid
mf = flopy.modflow.Modflow(modelname='grid_example')
dis = flopy.modflow.ModflowDis(
    mf,
    nlay=1, nrow=20, ncol=30,
    delr=100.0, delc=100.0,
    top=50.0, botm=0.0,
    xul=0.0, yul=2000.0  # Upper left coordinates
)

# Create grid intersect object
grid_intersect = fpu.GridIntersect(mf.modelgrid, method='vertex')

# Point intersections
test_points = [
    (500.0, 1500.0),   # Should be in grid
    (1500.0, 1000.0),  # Should be in grid
    (5000.0, 500.0),   # May be outside grid
]

print("Point intersections:")
for i, point in enumerate(test_points):
    try:
        cell_id = grid_intersect.intersect_point(point)
        if cell_id >= 0:
            # Convert to row, col
            row, col = divmod(cell_id, mf.dis.ncol)
            print(f"  Point {i+1} ({point[0]}, {point[1]}): Cell {cell_id} (Row {row+1}, Col {col+1})")
        else:
            print(f"  Point {i+1} ({point[0]}, {point[1]}): Outside grid")
    except Exception:
        print(f"  Point {i+1} ({point[0]}, {point[1]}): Error in intersection")

# Line intersection (river or stream)
river_line = [
    (200.0, 1800.0),   # Start point
    (800.0, 1600.0),   # Bend point
    (1500.0, 1200.0),  # Another bend
    (2500.0, 800.0)    # End point
]

print("\nRiver line intersection:")
try:
    intersections = grid_intersect.intersect_linestring(river_line)
    print(f"  Line intersects {len(intersections)} cells:")

    for intersection in intersections[:10]:  # First 10 intersections
        cell_id = intersection.get('cellid', -1)
        if cell_id >= 0:
            row, col = divmod(cell_id, mf.dis.ncol)
            length = intersection.get('length', 0.0)
            print(f"    Cell {cell_id} (R{row+1},C{col+1}): Length = {length:.1f}")

except Exception as e:
    print(f"  Error in line intersection: {e}")

# Polygon intersection (capture zone or contamination area)
contamination_area = [
    (1000.0, 1500.0),  # Upper left
    (2000.0, 1500.0),  # Upper right
    (2000.0, 1000.0),  # Lower right
    (1000.0, 1000.0),  # Lower left
    (1000.0, 1500.0)   # Close polygon
]

print("\nContamination area intersection:")
try:
    poly_intersections = grid_intersect.intersect_polygon(contamination_area)
    print(f"  Polygon intersects {len(poly_intersections)} cells:")

    total_area = 0.0
    for intersection in poly_intersections:
        cell_id = intersection.get('cellid', -1)
        if cell_id >= 0:
            row, col = divmod(cell_id, mf.dis.ncol)
            area = intersection.get('area', 0.0)
            total_area += area
            print(f"    Cell {cell_id} (R{row+1},C{col+1}): Area = {area:.0f}")

    print(f"  Total intersection area: {total_area:.0f}")

except Exception as e:
    print(f"  Error in polygon intersection: {e}")

# Grid indexing utilities
grid_indices = fpu.ModflowGridIndices(mf.modelgrid)

# Convert between different indexing schemes
test_nodes = [0, 150, 299, 450, 599]  # Some node numbers
lrc_coords = grid_indices.get_lrc_from_node(test_nodes)

print("\nNode to LRC conversion:")
for node, lrc in zip(test_nodes, lrc_coords):
    print(f"  Node {node:3d} -> Layer {lrc[0]+1}, Row {lrc[1]+1}, Col {lrc[2]+1}")

# Convert back
nodes_back = grid_indices.get_node_from_lrc(lrc_coords)
print("\nLRC to Node conversion (verification):")
for lrc, node in zip(lrc_coords, nodes_back):
    print(f"  Layer {lrc[0]+1}, Row {lrc[1]+1}, Col {lrc[2]+1} -> Node {node}")

# Grid statistics
print("\nGrid information:")
print(f"  Total nodes: {mf.dis.nlay * mf.dis.nrow * mf.dis.ncol}")
print(f"  Grid extent: {mf.modelgrid.extent}")
print(f"  Cell size: {mf.dis.delr.array[0]} x {mf.dis.delc.array[0]}")
print(f"  Grid area: {(mf.dis.ncol * mf.dis.delr.array[0]) * (mf.dis.nrow * mf.dis.delc.array[0]):,.0f} m²")
```
## Common Types

```python { .api }
# Utility function types
ArrayData = Union[int, float, np.ndarray, list]
ExecutionResult = tuple[bool, list[str]]
CheckResult = object

# Time and date types
TimeValue = Union[int, float]
TimeUnit = Literal['days', 'hours', 'minutes', 'seconds', 'years']
DateString = str
DateTimeList = list[datetime]

# File and path types
FilePath = Union[str, os.PathLike]
NameFileEntry = tuple[str, int, str, str]  # (ftype, unit, fname, status)
FileDict = dict[str, str]

# Array and data types
DataType = Union[type, np.dtype]
RecordArray = np.recarray
ArrayShape = tuple[int, ...]
DefaultValue = Union[int, float, str]

# Grid and spatial types
GridMethod = Literal['vertex', 'centroid', 'structured']
Coordinates = tuple[float, float]
LineString = list[tuple[float, float]]
Polygon = list[tuple[float, float]]
CellID = int
IntersectionResult = dict[str, Union[int, float]]

# Model validation types
CheckLevel = Literal[0, 1, 2]  # Basic, Standard, Detailed
ValidationResult = object
CheckSummary = np.ndarray

# Dependency types
ModuleName = str
ImportResult = object
OptionalExtra = str

# Flow analysis types
FlowArray = np.ndarray
DischargeComponents = tuple[np.ndarray, np.ndarray, np.ndarray]
TransmissivityArray = np.ndarray
VelocityField = tuple[np.ndarray, np.ndarray, np.ndarray]

# Format and I/O types
FormatString = str
FieldWidth = int
Precision = int
IOMode = Literal['r', 'w', 'a', 'rb', 'wb']
```

This documentation covers the FloPy utilities API: model execution, validation, flow analysis, array utilities, grid operations, and related helper functions. The examples demonstrate typical workflows, from model checking and spatial analysis to time conversion and data management.