# Networking Utilities

HTTP client functionality with comprehensive error handling, URL manipulation utilities, and socket programming helpers for network-based applications. The net module provides enhanced HTTP operations with better error reporting and simplified APIs for common networking tasks.

## Capabilities

### HTTP Client Functions

Enhanced HTTP operations with automatic error handling, progress tracking, and flexible response processing.

```python { .api }
def urlopen(url, data=None, **kwargs):
    """
    Open URL with enhanced error handling and automatic retries.

    Enhanced version of urllib.request.urlopen with better error handling,
    automatic header management, and support for various data types.

    Parameters:
    - url: str|Request, URL to open or Request object
    - data: bytes|dict|str, data to send (automatically encoded)
    - **kwargs: additional arguments passed to Request

    Returns:
    Response object with enhanced attributes

    Raises:
    Specific HTTP error classes (HTTP404NotFoundError, etc.)
    """

def urlread(url, decode=True, **kwargs):
    """
    Read content from URL with automatic decoding.

    Simplified interface for reading URL content with automatic
    text decoding and comprehensive error handling.

    Parameters:
    - url: str, URL to read from
    - decode: bool, automatically decode response as text (default: True)
    - **kwargs: additional arguments for urlopen

    Returns:
    str|bytes: URL content (decoded text if decode=True, bytes otherwise)

    Raises:
    HTTP error classes for various response codes
    """

def urljson(url, **kwargs):
    """
    Load JSON data from URL.

    Convenience function to fetch and parse JSON from HTTP endpoints
    with automatic error handling and response validation.

    Parameters:
    - url: str, URL returning JSON data
    - **kwargs: additional arguments for urlopen

    Returns:
    dict|list: Parsed JSON object

    Raises:
    HTTP errors or JSON parsing errors with enhanced messages
    """

def urlcheck(url, **kwargs):
    """
    Check if URL is accessible without downloading content.

    Performs HEAD request to verify URL accessibility and get
    metadata without downloading the full response body.

    Parameters:
    - url: str, URL to check
    - **kwargs: additional arguments for request

    Returns:
    bool: True if URL is accessible (2xx response)
    """

def urlsave(url, dest=None, **kwargs):
    """
    Download and save URL content to file.

    Downloads content from URL and saves to specified destination
    with automatic filename detection and progress tracking.

    Parameters:
    - url: str, URL to download
    - dest: Path|str, destination path (auto-generated if None)
    - **kwargs: additional arguments for download

    Returns:
    Path: Path to saved file
    """

def urlretrieve(url, filename=None, **kwargs):
    """
    Enhanced version of urllib.request.urlretrieve.

    Downloads file from URL with progress reporting and
    better error handling than standard library version.

    Parameters:
    - url: str, URL to retrieve
    - filename: str, local filename (auto-generated if None)
    - **kwargs: additional arguments

    Returns:
    tuple: (filename, headers) similar to urllib.request.urlretrieve
    """

def urldest(url, dest=None, **kwargs):
    """
    Determine destination path for URL download.

    Intelligently determines appropriate local filename for
    URL download based on URL structure and headers.

    Parameters:
    - url: str, source URL
    - dest: Path|str, destination directory or filename
    - **kwargs: additional arguments

    Returns:
    Path: Resolved destination path for download
    """
```
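
`urlretrieve` and `urldest` are not exercised in the usage examples further down, so here is a minimal sketch based only on the signatures above; the URL and destination directory are illustrative placeholders:

```python
from fastcore.net import urldest, urlretrieve

# Work out where a download would land (per the docstring, based on the URL
# structure and headers); URL and directory are placeholders.
dest = urldest("https://example.com/files/report.pdf", dest="downloads/")
print(dest)

# urllib-style retrieval: returns (filename, headers)
filename, headers = urlretrieve("https://example.com/files/report.pdf",
                                filename=str(dest))
print(filename)
```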

### URL Processing and Validation

Utilities for URL manipulation, validation, and cleaning.

```python { .api }
def urlquote(url):
    """
    Properly quote URL components for safe HTTP requests.

    Applies URL encoding to path and query components while preserving
    URL structure. Handles international characters and special symbols.

    Parameters:
    - url: str, URL to quote

    Returns:
    str: Properly quoted URL safe for HTTP requests
    """

def urlwrap(url, data=None, headers=None):
    """
    Wrap URL in urllib Request object with proper quoting.

    Creates Request object from URL with automatic quoting and
    header management. Handles both string URLs and existing Request objects.

    Parameters:
    - url: str|Request, URL or existing Request object
    - data: bytes|dict, request data (automatically encoded)
    - headers: dict, HTTP headers to include

    Returns:
    Request: urllib.request.Request object ready for use
    """

def urlclean(url):
    """
    Clean and normalize URL format.

    Removes unnecessary components, normalizes encoding, and
    ensures URL follows standard format conventions.

    Parameters:
    - url: str, URL to clean

    Returns:
    str: Cleaned and normalized URL
    """

def urlvalid(x):
    """
    Check if string is a valid URL.

    Validates URL format and structure without making network requests.
    Checks for proper scheme, domain, and path formatting.

    Parameters:
    - x: str, string to validate as URL

    Returns:
    bool: True if string is valid URL format
    """

def urlrequest(url, verb='GET', **kwargs):
    """
    Make HTTP request with specified method.

    Flexible HTTP request function supporting various HTTP methods
    with comprehensive parameter handling and error reporting.

    Parameters:
    - url: str, target URL
    - verb: str, HTTP method (GET, POST, PUT, DELETE, etc.)
    - **kwargs: additional request parameters (headers, data, etc.)

    Returns:
    Response object with status, headers, and content
    """

def urlsend(url, **kwargs):
    """
    Send data to URL endpoint.

    Simplified interface for sending data to HTTP endpoints with
    automatic method selection and content-type handling.

    Parameters:
    - url: str, target URL
    - **kwargs: request parameters including data payload

    Returns:
    Response object
    """

def do_request(url, verb='GET', **kwargs):
    """
    Low-level HTTP request with full control.

    Advanced HTTP request function with detailed control over
    all request parameters and response handling.

    Parameters:
    - url: str, target URL
    - verb: str, HTTP method
    - **kwargs: comprehensive request configuration

    Returns:
    Detailed response object with metadata
    """
```
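
Since `urlopen` accepts either a string or a `Request` object, a request prepared with `urlwrap` can presumably be inspected and then passed straight to `urlopen`. A brief sketch under that assumption; the endpoint and token are placeholders:

```python
from fastcore.net import urlwrap, urlopen

# Build a Request with quoting and headers handled up front
# (endpoint and token are illustrative placeholders).
req = urlwrap("https://api.example.com/search?q=hello world",
              headers={'Authorization': 'Bearer <token>'})
print(req.full_url)  # quoted URL exactly as it will be sent

# Pass the prepared Request straight to urlopen
resp = urlopen(req)
```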

### HTTP Error Classes

Comprehensive HTTP error hierarchy with specific exception classes for each status code.

```python { .api }
class HTTP4xxClientError(HTTPError):
    """
    Base class for client-side HTTP errors (4xx status codes).

    Provides enhanced error information and context for client errors
    including detailed response information and suggested actions.
    """

class HTTP5xxServerError(HTTPError):
    """
    Base class for server-side HTTP errors (5xx status codes).

    Handles server errors with additional context and retry information
    for applications that need to handle server failures gracefully.
    """

# Specific error classes for each HTTP status code:
class HTTP400BadRequestError(HTTP4xxClientError):
    """HTTP 400 Bad Request - Client sent invalid request."""

class HTTP401UnauthorizedError(HTTP4xxClientError):
    """HTTP 401 Unauthorized - Authentication required."""

class HTTP403ForbiddenError(HTTP4xxClientError):
    """HTTP 403 Forbidden - Access denied."""

class HTTP404NotFoundError(HTTP4xxClientError):
    """HTTP 404 Not Found - Resource not found."""

class HTTP405MethodNotAllowedError(HTTP4xxClientError):
    """HTTP 405 Method Not Allowed - HTTP method not supported."""

class HTTP408RequestTimeoutError(HTTP4xxClientError):
    """HTTP 408 Request Timeout - Request took too long."""

class HTTP429TooManyRequestsError(HTTP4xxClientError):
    """HTTP 429 Too Many Requests - Rate limit exceeded."""

class HTTP500InternalServerError(HTTP5xxServerError):
    """HTTP 500 Internal Server Error - Server encountered an error."""

class HTTP502BadGatewayError(HTTP5xxServerError):
    """HTTP 502 Bad Gateway - Upstream server error."""

class HTTP503ServiceUnavailableError(HTTP5xxServerError):
    """HTTP 503 Service Unavailable - Server temporarily unavailable."""

class HTTP504GatewayTimeoutError(HTTP5xxServerError):
    """HTTP 504 Gateway Timeout - Upstream server timeout."""
```
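
Because each specific error derives from `HTTP4xxClientError` or `HTTP5xxServerError`, whole categories can be caught via the base classes instead of enumerating status codes. A minimal sketch, assuming both base classes are importable from `fastcore.net` as documented above (the URL is a placeholder):

```python
from fastcore.net import urlread, HTTP4xxClientError, HTTP5xxServerError

def fetch(url):
    try:
        return urlread(url)
    except HTTP4xxClientError as e:
        # Any 4xx: the request itself is at fault, so don't retry blindly
        print(f"Client error for {url}: {e}")
    except HTTP5xxServerError as e:
        # Any 5xx: the server failed; a later retry may succeed
        print(f"Server error for {url}: {e}")

page = fetch("https://example.com/maybe-missing")
```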

### Socket Programming Utilities

Low-level networking functions for socket programming and server development.

```python { .api }
def start_server(port, host='localhost', dgram=False):
    """
    Start socket server with simplified interface.

    Creates and configures socket server with sensible defaults
    for common server patterns. Supports both TCP and UDP.

    Parameters:
    - port: int, port number to listen on
    - host: str, hostname to bind to (default: 'localhost')
    - dgram: bool, use UDP instead of TCP (default: False)

    Returns:
    socket: Configured server socket ready for accept()/recvfrom()
    """

def start_client(port, host='localhost', dgram=False):
    """
    Start socket client connection.

    Creates client socket and establishes connection to server
    with automatic error handling and retry logic.

    Parameters:
    - port: int, server port to connect to
    - host: str, server hostname (default: 'localhost')
    - dgram: bool, use UDP instead of TCP (default: False)

    Returns:
    socket: Connected client socket ready for send()/recv()
    """

def tobytes(s):
    """
    Convert string to bytes for socket transmission.

    Handles string-to-bytes conversion with proper encoding
    for network transmission; accepts various input types.

    Parameters:
    - s: str|bytes, data to convert

    Returns:
    bytes: Data ready for socket transmission
    """

def recv_once(sock):
    """
    Receive data from socket with timeout handling.

    Single receive operation with proper timeout and error handling.
    Useful for non-blocking socket operations.

    Parameters:
    - sock: socket, socket to receive from

    Returns:
    bytes: Received data or None if timeout/error
    """

def http_response(s, status=200, hdrs=None):
    """
    Generate HTTP response string.

    Creates properly formatted HTTP response with headers
    and content. Useful for simple HTTP server implementations.

    Parameters:
    - s: str, response body content
    - status: int, HTTP status code (default: 200)
    - hdrs: dict, additional HTTP headers

    Returns:
    str: Complete HTTP response ready for transmission
    """
```
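
The TCP flow appears in the Socket Programming usage example below; for UDP, the same helpers take `dgram=True`, with the server reading via `recvfrom()` rather than `accept()`. A minimal sketch under that assumption (the port number is arbitrary):

```python
from fastcore.net import start_server, start_client, tobytes
import threading, time

def udp_echo_server():
    # dgram=True requests a UDP socket; datagram servers use recvfrom/sendto
    sock = start_server(9999, dgram=True)
    data, addr = sock.recvfrom(1024)
    sock.sendto(tobytes(f"Echo: {data.decode()}"), addr)

threading.Thread(target=udp_echo_server, daemon=True).start()
time.sleep(0.2)  # give the server a moment to bind

# Per the docstring, the client socket comes back connected, so send()/recv() work
client = start_client(9999, dgram=True)
client.send(tobytes("ping"))
print(client.recv(1024).decode())  # expected: "Echo: ping"
```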

### Configuration and Headers

Network configuration utilities and default header management.

```python { .api }
url_default_headers = {...}
"""
Default HTTP headers used by fastcore URL functions.

Comprehensive set of headers that mimic modern browser behavior
for better compatibility with web services and APIs.

Headers include:
- Accept: Comprehensive content type acceptance
- Accept-Language: Language preferences
- User-Agent: Modern browser user agent string
- Security headers for modern web standards
"""

def urlopener():
    """
    Create URL opener with enhanced default headers.

    Returns configured urllib.request.OpenerDirector with
    sensible defaults for web scraping and API access.

    Returns:
    OpenerDirector: Configured URL opener with default headers
    """

ExceptionsHTTP = {}
"""
Dictionary mapping HTTP status codes to exception classes.

Provides programmatic access to HTTP exception classes by status code
for dynamic error handling and response processing.

Usage:
    try:
        response = urlopen(url)
    except ExceptionsHTTP[404]:
        print("Page not found")
"""
```
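
A short sketch of how these configuration objects could be used together, assuming `url_default_headers`, `urlopener`, and `ExceptionsHTTP` are importable from `fastcore.net` as documented above (the URL is a placeholder):

```python
from fastcore.net import url_default_headers, urlopener, ExceptionsHTTP

# Browser-like default headers applied by the URL helpers
print(list(url_default_headers))            # header names

# Opener pre-configured with those defaults (a standard OpenerDirector)
opener = urlopener()
resp = opener.open("https://example.com")

# Look up an exception class by status code for dynamic handling
NotFound = ExceptionsHTTP[404]
print(NotFound.__name__)                    # e.g. HTTP404NotFoundError
```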

## Usage Examples

### Basic HTTP Operations

```python
from fastcore.net import urlread, urljson, urlsave, urlcheck

# Simple content reading
content = urlread("https://httpbin.org/get")
print(content)  # Automatically decoded text

# JSON API access
api_data = urljson("https://api.github.com/users/octocat")
print(api_data['login'])  # 'octocat'

# File downloading
file_path = urlsave(
    "https://httpbin.org/image/png",
    dest="downloaded_image.png"
)

# Check URL accessibility
if urlcheck("https://example.com"):
    print("Site is accessible")
else:
    print("Site is down or unreachable")

# Read binary content
image_data = urlread("https://httpbin.org/image/jpeg", decode=False)
with open("image.jpg", "wb") as f:
    f.write(image_data)
```

### Advanced HTTP Requests

```python
from fastcore.net import urlrequest, urlsend, do_request, urljson, urlread
import json

# Custom HTTP methods
response = urlrequest(
    "https://httpbin.org/anything",
    verb='POST',
    headers={'Content-Type': 'application/json'},
    data=json.dumps({"key": "value"})
)

# Send form data
form_response = urlsend(
    "https://httpbin.org/post",
    data={'username': 'alice', 'password': 'secret'}
)

# Low-level request control
detailed_response = do_request(
    "https://httpbin.org/status/418",
    verb='GET',
    headers={'Custom-Header': 'test-value'},
    timeout=30
)

# Handle different response types
def fetch_with_fallback(url):
    try:
        return urljson(url)   # Try JSON first
    except Exception:
        return urlread(url)   # Fall back to text

data = fetch_with_fallback("https://api.example.com/data")
```

### Error Handling

```python
from fastcore.net import (urlread, urljson, HTTP404NotFoundError, HTTP403ForbiddenError,
                          HTTP429TooManyRequestsError, HTTP500InternalServerError)
import time

# Specific error handling
def robust_fetch(url, max_retries=3):
    for attempt in range(max_retries):
        try:
            return urlread(url)

        except HTTP404NotFoundError:
            print(f"URL not found: {url}")
            return None

        except HTTP429TooManyRequestsError:
            wait_time = 2 ** attempt  # Exponential backoff
            print(f"Rate limited, waiting {wait_time}s...")
            time.sleep(wait_time)

        except Exception as e:
            print(f"Attempt {attempt + 1} failed: {e}")
            if attempt == max_retries - 1:
                raise

    return None

# Use the robust fetcher
content = robust_fetch("https://api.example.com/data")

# Handle multiple error types
def safe_api_call(url):
    try:
        return urljson(url)
    except (HTTP404NotFoundError, HTTP403ForbiddenError) as e:
        print(f"Access error: {e}")
        return {"error": "access_denied"}
    except HTTP500InternalServerError as e:
        print(f"Server error: {e}")
        return {"error": "server_error"}
    except Exception as e:
        print(f"Unexpected error: {e}")
        return {"error": "unknown"}
```

### URL Processing

```python
from fastcore.net import urlquote, urlwrap, urlclean, urlvalid

# URL quoting for special characters
unsafe_url = "https://example.com/search?q=hello world&lang=en"
safe_url = urlquote(unsafe_url)
print(safe_url)  # https://example.com/search?q=hello%20world&lang=en

# Request object creation
request = urlwrap(
    "https://api.example.com/data",
    data=b'{"query": "test"}',
    headers={'Content-Type': 'application/json'}
)

# URL validation
urls_to_check = [
    "https://example.com",
    "ftp://files.example.com",
    "not-a-url",
    "http://localhost:8000/api"
]

valid_urls = [url for url in urls_to_check if urlvalid(url)]
print(valid_urls)  # Only valid URLs

# URL cleaning
messy_url = "https://example.com///path//to/../resource?param=value&"
clean_url = urlclean(messy_url)
print(clean_url)  # https://example.com/path/resource?param=value
```

### Socket Programming

```python
from fastcore.net import start_server, start_client, tobytes, recv_once, http_response
import threading

# Simple echo server
def echo_server():
    server_sock = start_server(8080, host='localhost')
    print("Echo server listening on port 8080")

    while True:
        client_sock, addr = server_sock.accept()
        print(f"Connection from {addr}")

        data = recv_once(client_sock)
        if data:
            client_sock.send(tobytes(f"Echo: {data.decode()}"))

        client_sock.close()

# Start server in background thread
server_thread = threading.Thread(target=echo_server, daemon=True)
server_thread.start()

# Simple client
def echo_client(message):
    client_sock = start_client(8080, host='localhost')
    client_sock.send(tobytes(message))

    response = recv_once(client_sock)
    client_sock.close()

    return response.decode() if response else None

# Test the echo
response = echo_client("Hello, server!")
print(response)  # "Echo: Hello, server!"

# Simple HTTP server response
def create_http_response(content, content_type="text/html"):
    headers = {'Content-Type': content_type}
    return http_response(content, status=200, hdrs=headers)

html_response = create_http_response("<h1>Hello, World!</h1>")
json_response = create_http_response(
    '{"message": "success"}',
    content_type="application/json"
)
```

### Batch Operations and Rate Limiting

```python
from fastcore.net import urlread, HTTP429TooManyRequestsError
from fastcore.parallel import parallel
import time
import random

# Rate-limited batch processing
def rate_limited_fetch(url, delay=1.0):
    """Fetch URL with rate limiting."""
    time.sleep(delay + random.uniform(0, 0.5))  # Add jitter
    return urlread(url)

# Fetch multiple URLs with rate limiting
urls = [
    f"https://httpbin.org/delay/{i}"
    for i in range(1, 6)
]

# Sequential with rate limiting
results = []
for url in urls:
    try:
        result = rate_limited_fetch(url, delay=0.5)
        results.append(result)
    except Exception as e:
        print(f"Failed to fetch {url}: {e}")
        results.append(None)

# Parallel with controlled concurrency
def safe_parallel_fetch(urls, n_workers=2, delay=0.5):
    """Fetch URLs in parallel with rate limiting."""
    def fetch_with_delay(url):
        return rate_limited_fetch(url, delay)

    return parallel(
        fetch_with_delay,
        urls,
        n_workers=n_workers,  # Limit concurrency
        threadpool=True       # Threads suit I/O-bound fetches and allow the local closure
    )

parallel_results = safe_parallel_fetch(urls, n_workers=2)

# Retry logic with exponential backoff
def fetch_with_retry(url, max_retries=3, base_delay=1.0):
    """Fetch URL with exponential backoff retry."""
    for attempt in range(max_retries):
        try:
            return urlread(url)
        except HTTP429TooManyRequestsError:
            if attempt < max_retries - 1:
                delay = base_delay * (2 ** attempt)
                print(f"Rate limited, retrying in {delay}s...")
                time.sleep(delay)
            else:
                raise
        except Exception:
            if attempt == max_retries - 1:
                raise
            time.sleep(base_delay)

    return None

# Robust batch processing
robust_results = [
    fetch_with_retry(url)
    for url in urls[:3]  # Process subset
]
```

### Integration with FastCore Collections

```python
from fastcore.net import urljson
from fastcore.foundation import L

# Process URLs with L collections
api_urls = L([
    "https://api.github.com/users/octocat",
    "https://api.github.com/users/defunkt",
    "https://api.github.com/users/pjhyett"
])

# Fetch all user data
def get_user_data(url):
    try:
        return urljson(url)
    except Exception as e:
        return {"error": str(e), "url": url}

users = api_urls.map(get_user_data)

# Extract specific fields
usernames = users.map(lambda u: u.get('login', 'unknown'))
follower_counts = users.map(lambda u: u.get('followers', 0))

# Filter successful responses
valid_users = users.filter(lambda u: 'error' not in u)

# Create summary
summary = {
    'total_users': len(users),
    'successful_fetches': len(valid_users),
    'usernames': list(usernames),
    'total_followers': sum(follower_counts)
}

print(f"Processed {summary['total_users']} users")
print(f"Total followers: {summary['total_followers']}")
```