
# Client Management

Client lifecycle management including initialization, configuration, batching control, and graceful shutdown. PostHog's client management supports both global module-level usage and direct client instantiation for advanced configuration and multi-tenant applications.

## Capabilities

### Client Initialization

Create and configure PostHog client instances with comprehensive options for production deployments.

```python { .api }
from typing import Callable, Dict, Optional

class Client:
    def __init__(
        self,
        project_api_key: str,
        host: Optional[str] = None,
        debug: bool = False,
        max_queue_size: int = 10000,
        send: bool = True,
        on_error: Optional[Callable] = None,
        flush_at: int = 100,
        flush_interval: float = 0.5,
        gzip: bool = False,
        max_retries: int = 3,
        sync_mode: bool = False,
        timeout: int = 15,
        thread: int = 1,
        poll_interval: int = 30,
        personal_api_key: Optional[str] = None,
        disabled: bool = False,
        disable_geoip: bool = True,
        historical_migration: bool = False,
        feature_flags_request_timeout_seconds: int = 3,
        super_properties: Optional[Dict] = None,
        enable_exception_autocapture: bool = False,
        log_captured_exceptions: bool = False,
        project_root: Optional[str] = None,
        privacy_mode: bool = False,
        before_send: Optional[Callable] = None,
        flag_fallback_cache_url: Optional[str] = None,
        enable_local_evaluation: bool = True
    ):
        """
        Initialize a new PostHog client instance.

        Parameters:
        - project_api_key: str - The project API key
        - host: Optional[str] - The PostHog host URL (default: https://app.posthog.com)
        - debug: bool - Enable debug logging (default: False)
        - max_queue_size: int - Maximum events in the queue before dropping (default: 10000)
        - send: bool - Whether to send events to PostHog (default: True)
        - on_error: Optional[Callable] - Error callback function
        - flush_at: int - Number of queued events that triggers a batch send (default: 100)
        - flush_interval: float - Seconds between automatic flushes (default: 0.5)
        - gzip: bool - Enable gzip compression (default: False)
        - max_retries: int - Maximum retry attempts for failed requests (default: 3)
        - sync_mode: bool - Send events synchronously (default: False)
        - timeout: int - Request timeout in seconds (default: 15)
        - thread: int - Number of background threads (default: 1)
        - poll_interval: int - Feature flag polling interval in seconds (default: 30)
        - personal_api_key: Optional[str] - Personal API key for advanced features
        - disabled: bool - Disable all PostHog functionality (default: False)
        - disable_geoip: bool - Disable GeoIP lookup (default: True)
        - historical_migration: bool - Enable historical migration mode (default: False)
        - feature_flags_request_timeout_seconds: int - Feature flag request timeout (default: 3)
        - super_properties: Optional[Dict] - Properties added to all events
        - enable_exception_autocapture: bool - Auto-capture exceptions (default: False)
        - log_captured_exceptions: bool - Log captured exceptions (default: False)
        - project_root: Optional[str] - Project root for exception capture
        - privacy_mode: bool - Privacy mode for AI features (default: False)
        - before_send: Optional[Callable] - Event preprocessing callback
        - flag_fallback_cache_url: Optional[str] - Redis URL for flag caching
        - enable_local_evaluation: bool - Enable local flag evaluation (default: True)

        Notes:
        - project_api_key is required and should start with 'phc_'
        - host should include the protocol (https://)
        - sync_mode blocks until events are sent
        - disabled=True makes all operations no-ops
        """

# Alias for backward compatibility
class Posthog(Client):
    pass
```
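
The notes above say the project API key must start with `phc_`. A minimal sketch of a fail-fast startup check; the helper name `resolve_project_api_key` and the `POSTHOG_API_KEY` environment variable are illustrative, not part of the SDK:

```python
import os

def resolve_project_api_key(env_var: str = "POSTHOG_API_KEY") -> str:
    """Hypothetical helper: read the project key from the environment and
    fail fast if it does not look like a project API key ('phc_' prefix)."""
    key = os.getenv(env_var, "")
    if not key.startswith("phc_"):
        raise ValueError(f"{env_var} must be a project API key starting with 'phc_'")
    return key

# The validated key would then be passed to Client(project_api_key=...)
os.environ["POSTHOG_API_KEY"] = "phc_example_key"
print(resolve_project_api_key())  # → phc_example_key
```

Checking the prefix once at startup surfaces a misconfigured environment immediately, rather than as silently dropped events later.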


### Queue and Batch Management

Control event batching, queue management, and forced flushing for optimal performance and reliability.

```python { .api }
def flush():
    """
    Tell the client to flush all queued events.

    Notes:
    - Forces immediate sending of all queued events
    - Blocks until all events are sent or fail
    - Useful before application shutdown
    - Safe to call multiple times
    """

def join():
    """
    Block the program until the client clears its queue. Used during program shutdown.

    Notes:
    - Waits for background threads to finish processing
    - Does not send events; call flush() first
    - Should be called before application exit
    - Use shutdown() for a combined flush + join
    """

def shutdown():
    """
    Flush all messages and cleanly shut down the client.

    Notes:
    - Combines the flush() and join() operations
    - Recommended for application shutdown
    - Ensures all events are sent before exit
    - Stops background threads
    """
```


## Usage Examples

### Basic Client Setup

```python
import posthog
import atexit

# Global configuration (simplest approach)
posthog.api_key = 'phc_your_project_api_key'
posthog.host = 'https://app.posthog.com'  # or your self-hosted instance

# Use module-level functions
posthog.capture('user123', 'event_name')
posthog.set('user123', {'property': 'value'})

# Shut down when the application exits
atexit.register(posthog.shutdown)
```


### Direct Client Instantiation

```python
from posthog import Posthog

# Create a client instance with custom configuration
client = Posthog(
    project_api_key='phc_your_project_api_key',
    host='https://app.posthog.com',
    debug=False,
    flush_at=50,
    flush_interval=1.0,
    max_retries=5,
    timeout=30
)

# Use client instance methods
client.capture('user123', 'event_name')
client.set('user123', {'property': 'value'})

# Shut down the client when done
client.shutdown()
```


### Production Configuration

```python
from posthog import Posthog
import logging

# Production-ready configuration
def create_posthog_client():
    return Posthog(
        project_api_key='phc_your_project_api_key',
        host='https://app.posthog.com',

        # Performance settings
        flush_at=200,           # Batch size
        flush_interval=2.0,     # Flush every 2 seconds
        max_queue_size=50000,   # Large queue for high traffic
        gzip=True,              # Compress payloads to save bandwidth
        max_retries=5,          # Retry failed requests
        timeout=30,             # Longer timeout for reliability

        # Feature flags (personal API keys use the 'phx_' prefix)
        personal_api_key='phx_your_personal_api_key',
        poll_interval=60,       # Check flags every minute
        enable_local_evaluation=True,

        # Error handling
        on_error=lambda error: logging.error(f"PostHog error: {error}"),

        # Privacy and debugging
        disable_geoip=True,
        debug=False,

        # Super properties added to all events
        super_properties={
            'app_version': '1.2.3',
            'environment': 'production'
        }
    )

# Initialize client
posthog_client = create_posthog_client()
```


### Multi-Tenant Configuration

```python
from posthog import Posthog

class MultiTenantPostHog:
    def __init__(self):
        self.clients = {}

    def get_client(self, tenant_id: str) -> Posthog:
        if tenant_id not in self.clients:
            # Create a client for the new tenant
            self.clients[tenant_id] = Posthog(
                project_api_key=f'phc_tenant_{tenant_id}_key',
                host='https://app.posthog.com',
                flush_at=100,
                flush_interval=1.0,
                super_properties={
                    'tenant_id': tenant_id,
                    'app_name': 'multi-tenant-app'
                }
            )
        return self.clients[tenant_id]

    def shutdown_all(self):
        for client in self.clients.values():
            client.shutdown()

# Usage
multi_posthog = MultiTenantPostHog()

# Track events for different tenants
tenant_a_client = multi_posthog.get_client('tenant_a')
tenant_a_client.capture('user123', 'event_name')

tenant_b_client = multi_posthog.get_client('tenant_b')
tenant_b_client.capture('user456', 'event_name')

# Shut down all clients
multi_posthog.shutdown_all()
```
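
The `get_client` lookup above uses a plain check-then-insert on a dict, which can create duplicate clients if two threads request the same tenant concurrently. One way to guard it — the `SafeClientRegistry` name and the factory callback are illustrative, not part of the SDK:

```python
import threading

class SafeClientRegistry:
    """Illustrative thread-safe variant of get_client: a lock around the
    check-then-insert guarantees exactly one client per tenant."""
    def __init__(self, factory):
        self._factory = factory   # e.g. lambda tid: Posthog(project_api_key=...)
        self._clients = {}
        self._lock = threading.Lock()

    def get_client(self, tenant_id: str):
        with self._lock:
            if tenant_id not in self._clients:
                self._clients[tenant_id] = self._factory(tenant_id)
            return self._clients[tenant_id]

# Demo with a stand-in factory that records how many clients were built;
# a real factory would construct a Posthog client instead.
created = []
registry = SafeClientRegistry(lambda tid: created.append(tid) or object())
a = registry.get_client('tenant_a')
b = registry.get_client('tenant_a')
print(a is b, len(created))  # → True 1
```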


### Development and Testing Configuration

```python
from posthog import Posthog
import os

def create_development_client():
    # Different config per environment
    is_development = os.getenv('ENVIRONMENT') == 'development'
    is_testing = os.getenv('ENVIRONMENT') == 'testing'

    return Posthog(
        project_api_key=os.getenv('POSTHOG_API_KEY', 'phc_test_key'),
        host=os.getenv('POSTHOG_HOST', 'https://app.posthog.com'),

        # Development settings
        debug=is_development,
        send=not is_testing,    # Don't send events during tests
        disabled=is_testing,    # Disable completely during tests
        sync_mode=is_testing,   # Synchronous for deterministic tests

        # Faster flushing for development
        flush_at=10 if is_development else 100,
        flush_interval=0.1 if is_development else 0.5,

        # Development error handling (pass a handler only in development)
        on_error=(lambda error: print(f"PostHog error: {error}")) if is_development else None
    )

client = create_development_client()
```


### Advanced Batching Configuration

```python
from posthog import Posthog

# High-throughput application
high_volume_client = Posthog(
    project_api_key='phc_your_project_api_key',

    # Large batches for efficiency
    flush_at=500,
    flush_interval=5.0,
    max_queue_size=100000,

    # Multiple threads for processing
    thread=3,

    # Compression and timeouts
    gzip=True,
    timeout=60,
    max_retries=10
)

# Low-latency application
low_latency_client = Posthog(
    project_api_key='phc_your_project_api_key',

    # Small batches for low latency
    flush_at=10,
    flush_interval=0.1,

    # Synchronous mode for immediate sending
    sync_mode=True,
    timeout=5
)

# Offline-first application
offline_client = Posthog(
    project_api_key='phc_your_project_api_key',

    # Large queue for offline operation
    max_queue_size=1000000,
    flush_at=1000,
    flush_interval=30.0,

    # Aggressive retries
    max_retries=20,
    timeout=120
)
```


### Event Preprocessing

```python
from posthog import Posthog
import hashlib

def anonymize_sensitive_data(event):
    """Preprocessing function to anonymize sensitive data"""
    # Hash email addresses
    if 'properties' in event and 'email' in event['properties']:
        email = event['properties']['email']
        event['properties']['email_hash'] = hashlib.sha256(email.encode()).hexdigest()
        del event['properties']['email']

    # Remove PII fields
    sensitive_fields = ['ssn', 'credit_card', 'password']
    if 'properties' in event:
        for field in sensitive_fields:
            event['properties'].pop(field, None)

    # Add processing metadata (create 'properties' if the event has none)
    event.setdefault('properties', {})['_processed'] = True

    return event

client = Posthog(
    project_api_key='phc_your_project_api_key',
    before_send=anonymize_sensitive_data
)

# Events are preprocessed automatically before they are sent
client.capture('user123', 'sensitive_event', {
    'email': 'user@example.com',  # Will be hashed
    'amount': 100
})
```


### Exception and Error Handling

```python
from posthog import Posthog
import logging

# Set up logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

def error_handler(error):
    """Custom error handler for PostHog errors"""
    logger.error(f"PostHog error: {error}")

    # Send to a monitoring service
    # monitoring.capture_exception(error)

    # Optionally disable the client on repeated errors
    # if isinstance(error, ConnectionError):
    #     client.disabled = True

client = Posthog(
    project_api_key='phc_your_project_api_key',
    on_error=error_handler,

    # Exception auto-capture for debugging
    enable_exception_autocapture=True,
    log_captured_exceptions=True,
    project_root='/app'
)

# Test error handling
try:
    risky_operation()
except Exception as e:
    # The exception is captured automatically when enable_exception_autocapture=True;
    # otherwise, capture it manually:
    client.capture_exception(e)
```


### Graceful Shutdown Patterns

```python
from posthog import Posthog
import signal
import sys
import atexit

client = Posthog(project_api_key='phc_your_project_api_key')

def shutdown_handler(signum=None, frame=None):
    """Graceful shutdown handler"""
    print("Shutting down PostHog client...")
    client.shutdown()
    sys.exit(0)

# Register shutdown handlers
signal.signal(signal.SIGINT, shutdown_handler)   # Ctrl+C
signal.signal(signal.SIGTERM, shutdown_handler)  # Termination signal
atexit.register(client.shutdown)                 # Normal process exit

# Application code
def main():
    client.capture('app', 'started')

    # Application logic...

    # Queued events are flushed by the registered shutdown handlers

if __name__ == '__main__':
    main()
```
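
The pattern above registers `client.shutdown` with both signal handlers and `atexit`, so shutdown may run more than once on exit. If that is a concern, a small idempotent wrapper can guard it; the `ShutdownOnce` name is illustrative, and the stand-in client exists only to demonstrate the call counting:

```python
import threading

class ShutdownOnce:
    """Illustrative guard: makes a shutdown callable idempotent, so registering
    it with both atexit and signal handlers cannot shut the client down twice."""
    def __init__(self, client):
        self._client = client
        self._lock = threading.Lock()
        self._done = False

    def __call__(self):
        with self._lock:
            if self._done:
                return
            self._done = True
        self._client.shutdown()

# Demo with a stand-in client; a real posthog.Posthog would be wrapped the same way.
class FakeClient:
    def __init__(self):
        self.shutdown_calls = 0
    def shutdown(self):
        self.shutdown_calls += 1

client = FakeClient()
shutdown_once = ShutdownOnce(client)
shutdown_once()
shutdown_once()
print(client.shutdown_calls)  # → 1
```

With this wrapper, `atexit.register(shutdown_once)` and the signal handlers can all call the same object safely.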


### Performance Monitoring

```python
from posthog import Posthog
import time
import threading

class MonitoredPostHogClient:
    def __init__(self, **kwargs):
        self.metrics = {
            'events_sent': 0,
            'events_failed': 0,
            'flush_count': 0,
            'queue_size': 0
        }

        # Wrap any user-supplied error handler to track failures,
        # and pass the wrapper to the client at construction time
        original_error_handler = kwargs.get('on_error')
        def error_handler(error):
            self.metrics['events_failed'] += 1
            if original_error_handler:
                original_error_handler(error)
        kwargs['on_error'] = error_handler

        self.client = Posthog(**kwargs)

        # Start the monitoring thread
        self.monitoring_thread = threading.Thread(target=self._monitor_performance)
        self.monitoring_thread.daemon = True
        self.monitoring_thread.start()

    def _monitor_performance(self):
        while True:
            # Monitor queue size
            if hasattr(self.client, 'queue'):
                self.metrics['queue_size'] = self.client.queue.qsize()

            # Log metrics every 60 seconds
            print(f"PostHog metrics: {self.metrics}")
            time.sleep(60)

    def capture(self, *args, **kwargs):
        result = self.client.capture(*args, **kwargs)
        if result:
            self.metrics['events_sent'] += 1
        return result

    def flush(self):
        self.client.flush()
        self.metrics['flush_count'] += 1

    def shutdown(self):
        self.client.shutdown()

# Usage
monitored_client = MonitoredPostHogClient(
    project_api_key='phc_your_project_api_key',
    flush_at=100,
    flush_interval=1.0
)

# Track events with monitoring
monitored_client.capture('user123', 'event_name')
```


## Configuration Best Practices

### Environment-Specific Settings

```python
import os
from posthog import Posthog

def create_client_for_environment():
    env = os.getenv('ENVIRONMENT', 'development')

    base_config = {
        'project_api_key': os.getenv('POSTHOG_API_KEY'),
        'host': os.getenv('POSTHOG_HOST', 'https://app.posthog.com'),
    }

    if env == 'production':
        return Posthog(
            **base_config,
            debug=False,
            send=True,
            flush_at=200,
            flush_interval=2.0,
            max_queue_size=50000,
            gzip=True,
            max_retries=5,
            enable_exception_autocapture=False  # Disabled in production
        )
    elif env == 'staging':
        return Posthog(
            **base_config,
            debug=True,
            send=True,
            flush_at=50,
            flush_interval=1.0,
            enable_exception_autocapture=True
        )
    elif env == 'development':
        return Posthog(
            **base_config,
            debug=True,
            send=True,
            flush_at=10,
            flush_interval=0.5,
            sync_mode=True  # Immediate sending for development
        )
    else:  # testing
        return Posthog(
            **base_config,
            disabled=True,  # No events sent during tests
            sync_mode=True
        )

client = create_client_for_environment()
```


### Resource Management

```python
from posthog import Posthog
import contextlib

@contextlib.contextmanager
def posthog_client(**kwargs):
    """Context manager for automatic client cleanup"""
    client = Posthog(**kwargs)
    try:
        yield client
    finally:
        client.shutdown()

# Usage
with posthog_client(project_api_key='phc_key') as client:
    client.capture('user123', 'event_name')
# The client shuts down automatically when the context exits
```


### Thread Safety

PostHog clients are thread-safe and can be shared across multiple threads:

```python
from posthog import Posthog
import threading
import queue

# A single client shared across all threads
shared_client = Posthog(project_api_key='phc_your_project_api_key')

def worker_thread(thread_id, event_queue):
    while True:
        try:
            event_data = event_queue.get(timeout=1)
        except queue.Empty:
            break
        shared_client.capture(
            event_data['user_id'],
            event_data['event_name'],
            event_data['properties']
        )
        event_queue.task_done()

# Start multiple worker threads
event_queue = queue.Queue()
threads = []

for i in range(5):
    t = threading.Thread(target=worker_thread, args=(i, event_queue))
    t.start()
    threads.append(t)

# Add events to the queue
for i in range(100):
    event_queue.put({
        'user_id': f'user_{i}',
        'event_name': 'thread_event',
        'properties': {'thread_id': i % 5}
    })

# Wait for all queued events to be processed
event_queue.join()

# Shut down the client
shared_client.shutdown()
```