
# Asynchronous Interface

Tweepy provides a complete async/await interface for non-blocking access to all Twitter API functionality. The asynchronous interface mirrors the synchronous interface with identical method signatures, requiring only the addition of `await` keywords.

**Note**: Async functionality was added in Tweepy v4.10+ for Client, v4.11+ for Paginator, and v4.12+ for direct message support.
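The "only add `await`" relationship can be sketched with a stub coroutine standing in for an `AsyncClient` method (no tweepy installation or network access assumed; `get_tweet` here is a placeholder, not the real client method):

```python
import asyncio

# Placeholder coroutine with the same call shape as a client method;
# a real call would be `await client.get_tweet(tweet_id)`.
async def get_tweet(tweet_id):
    await asyncio.sleep(0)  # simulate non-blocking I/O
    return {"id": tweet_id, "text": "hello"}

async def main():
    # Same arguments as the synchronous call, plus `await`
    tweet = await get_tweet("123")
    print(tweet["text"])  # prints: hello

asyncio.run(main())
```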

## Capabilities

### AsyncClient

The AsyncClient provides async access to all Twitter API v2 endpoints with identical functionality to the synchronous Client.

```python { .api }
class AsyncClient:
    def __init__(self, bearer_token=None, consumer_key=None, consumer_secret=None,
                 access_token=None, access_token_secret=None, *,
                 return_type=Response, wait_on_rate_limit=False):
        """
        Initialize asynchronous Twitter API v2 Client.

        Parameters: Identical to synchronous Client
        - bearer_token (str, optional): Bearer token for app-only authentication
        - consumer_key (str, optional): Consumer key for user authentication
        - consumer_secret (str, optional): Consumer secret for user authentication
        - access_token (str, optional): Access token for user authentication
        - access_token_secret (str, optional): Access token secret for user authentication
        - return_type (type): Response container type (default: Response)
        - wait_on_rate_limit (bool): Wait when rate limit hit (default: False)
        """
```

### AsyncClient Methods

All Client methods are available as async methods requiring `await`:

```python { .api }
# Tweet management (async versions)
async def create_tweet(self, text=None, **kwargs): ...
async def delete_tweet(self, id, **kwargs): ...
async def get_tweet(self, id, **kwargs): ...
async def get_tweets(self, ids, **kwargs): ...

# Search methods (async versions)
async def search_recent_tweets(self, query, **kwargs): ...
async def search_all_tweets(self, query, **kwargs): ...

# User methods (async versions)
async def get_me(self, **kwargs): ...
async def get_user(self, **kwargs): ...
async def get_users(self, **kwargs): ...
async def follow_user(self, target_user_id, **kwargs): ...
async def unfollow_user(self, target_user_id, **kwargs): ...
# Note: follow() and unfollow() alias methods are NOT available in AsyncClient

# Timeline methods (async versions)
async def get_home_timeline(self, **kwargs): ...
async def get_users_tweets(self, id, **kwargs): ...
async def get_users_mentions(self, id, **kwargs): ...

# All other Client methods are available as async versions...
```

### AsyncStreamingClient

The AsyncStreamingClient provides async streaming with identical event handlers to the synchronous version.

```python { .api }
class AsyncStreamingClient:
    def __init__(self, bearer_token, *, chunk_size=512, daemon=False,
                 max_retries=float('inf'), **kwargs):
        """
        Initialize asynchronous streaming client.

        Parameters: Identical to synchronous StreamingClient
        """

    def filter(self, *, threaded=False, **kwargs):
        """
        Start async filtered stream (returns coroutine).

        Parameters: Identical to synchronous filter()
        Note: This method returns a coroutine that should be awaited
        """

    def sample(self, *, threaded=False, **kwargs):
        """
        Start async sample stream (returns coroutine).

        Parameters: Identical to synchronous sample()
        Note: This method returns a coroutine that should be awaited
        """

    def disconnect(self):
        """
        Disconnect from the streaming endpoint.

        Returns:
        None
        """

    async def add_rules(self, add, **kwargs):
        """
        Add streaming rules asynchronously.

        Parameters:
        - add (list): List of StreamRule objects or rule dictionaries
        - **kwargs: Additional parameters

        Returns:
        Response with rule addition results
        """

    async def delete_rules(self, ids, **kwargs):
        """
        Delete streaming rules asynchronously.

        Parameters:
        - ids (list): List of rule IDs to delete
        - **kwargs: Additional parameters

        Returns:
        Response with deletion results
        """

    async def get_rules(self, **kwargs):
        """
        Get current streaming rules asynchronously.

        Parameters:
        - **kwargs: Additional parameters

        Returns:
        Response with current rules
        """

    # Event handlers remain the same (can be async or sync)
    async def on_tweet(self, tweet): ...
    async def on_connect(self): ...
    async def on_disconnect(self): ...
    # ... other event handlers
```

### AsyncPaginator

The AsyncPaginator provides async pagination for API v2 endpoints.

```python { .api }
class AsyncPaginator:
    def __init__(self, method, *args, **kwargs):
        """
        Initialize async paginator.

        Parameters: Identical to synchronous Paginator
        """

    async def flatten(self, limit=None):
        """
        Async generator yielding individual items.

        Parameters:
        - limit (int, optional): Maximum number of items

        Yields:
        Individual items from paginated responses
        """

    async def get_next(self):
        """Get next page asynchronously."""

    async def get_previous(self):
        """Get previous page asynchronously."""
```

### Session Management

AsyncClient uses aiohttp sessions for HTTP connections, providing connection pooling and reuse.

```python { .api }
class AsyncClient:
    # Session attribute for HTTP connections
    session: aiohttp.ClientSession

    # Context manager support (Note: not implemented in current version)
    # async def __aenter__(self): ...
    # async def __aexit__(self, *args): ...
```

**Important Notes:**

- AsyncClient maintains an internal aiohttp session
- The session is created automatically on the first request
- Sessions are not automatically closed (manual cleanup may be needed for long-running applications)
- Context manager support is not currently implemented
- For advanced use cases, consider session configuration through aiohttp parameters

## Usage Examples

### Basic AsyncClient Usage

```python
import asyncio
import tweepy

async def main():
    # Initialize async client
    client = tweepy.AsyncClient(bearer_token="your_bearer_token")

    # All methods require await
    tweet = await client.get_tweet("1234567890123456789")
    print(f"Tweet: {tweet.data.text}")

    # Search for tweets
    search_results = await client.search_recent_tweets(
        query="python programming",
        max_results=10
    )

    for tweet in search_results.data:
        print(f"- {tweet.text}")

    # User operations
    user = await client.get_user(username="python")
    print(f"User: {user.data.name} (@{user.data.username})")

# Run async function
asyncio.run(main())
```

### Concurrent API Calls

```python
import asyncio
import tweepy

async def get_user_info(client, username):
    """Get user info and recent tweets concurrently."""
    # Start both requests concurrently
    user_task = client.get_user(username=username, user_fields=["public_metrics"])
    tweets_task = client.search_recent_tweets(f"from:{username}", max_results=5)

    # Wait for both to complete
    user_response, tweets_response = await asyncio.gather(user_task, tweets_task)

    return {
        'user': user_response.data,
        'recent_tweets': tweets_response.data or []
    }

async def analyze_multiple_users():
    client = tweepy.AsyncClient(bearer_token="your_bearer_token")

    usernames = ["python", "github", "stackoverflow", "nodejs", "docker"]

    # Process all users concurrently
    tasks = [get_user_info(client, username) for username in usernames]
    results = await asyncio.gather(*tasks)

    # Process results
    for username, data in zip(usernames, results):
        user = data['user']
        tweets = data['recent_tweets']

        print(f"\n@{username} ({user.name})")
        print(f"Followers: {user.public_metrics['followers_count']:,}")
        print(f"Recent tweets: {len(tweets)}")

        for tweet in tweets[:3]:  # Show first 3
            print(f"  - {tweet.text[:60]}...")

asyncio.run(analyze_multiple_users())
```

### Async Streaming

```python
import asyncio
import tweepy

class AsyncTweetProcessor(tweepy.AsyncStreamingClient):
    def __init__(self, bearer_token):
        super().__init__(bearer_token)
        self.tweet_count = 0

    async def on_connect(self):
        print("Connected to async stream")

    async def on_tweet(self, tweet):
        self.tweet_count += 1
        print(f"Async tweet #{self.tweet_count}: {tweet.text[:50]}...")

        # Perform async processing
        await self.process_tweet(tweet)

        # Stop after 50 tweets
        if self.tweet_count >= 50:
            self.disconnect()

    async def process_tweet(self, tweet):
        # Simulate async processing (e.g., database write, API call)
        await asyncio.sleep(0.1)

        # Could make additional async API calls here
        # author = await some_client.get_user(id=tweet.author_id)

async def run_async_stream():
    stream = AsyncTweetProcessor(bearer_token="your_bearer_token")

    # Add rules
    rules = [tweepy.StreamRule("python OR javascript", tag="programming")]
    await stream.add_rules(rules)

    # Start streaming
    await stream.filter()

asyncio.run(run_async_stream())
```

### Async Pagination

```python
import asyncio
import tweepy

async def analyze_user_followers():
    client = tweepy.AsyncClient(bearer_token="your_bearer_token")

    # Create async paginator
    paginator = tweepy.AsyncPaginator(
        client.get_users_followers,
        id="783214",  # Twitter's user ID
        max_results=1000,
        user_fields=["public_metrics", "verified", "created_at"]
    )

    follower_stats = {
        'total': 0,
        'verified': 0,
        'high_followers': 0,
        'recent_joiners': 0
    }

    # Process followers asynchronously
    async for follower in paginator.flatten(limit=10000):
        follower_stats['total'] += 1

        if getattr(follower, 'verified', False):
            follower_stats['verified'] += 1

        if follower.public_metrics['followers_count'] > 10000:
            follower_stats['high_followers'] += 1

        # Check if joined in last year (simplified)
        if '2023' in str(follower.created_at) or '2024' in str(follower.created_at):
            follower_stats['recent_joiners'] += 1

    print("Follower Analysis:")
    print(f"Total analyzed: {follower_stats['total']:,}")
    print(f"Verified: {follower_stats['verified']:,}")
    print(f"High-influence (>10k followers): {follower_stats['high_followers']:,}")
    print(f"Recent joiners (2023-2024): {follower_stats['recent_joiners']:,}")

asyncio.run(analyze_user_followers())
```

### Error Handling in Async Code

```python
import asyncio
import tweepy

async def robust_async_operations():
    client = tweepy.AsyncClient(
        consumer_key="your_consumer_key",
        consumer_secret="your_consumer_secret",
        access_token="your_access_token",
        access_token_secret="your_access_token_secret",
        wait_on_rate_limit=True
    )

    try:
        # Attempt to create tweet
        response = await client.create_tweet(text="Hello from async Tweepy!")
        print(f"Tweet created: {response.data['id']}")

    except tweepy.BadRequest as e:
        print(f"Bad request: {e}")

    except tweepy.Unauthorized:
        print("Authentication failed")

    except tweepy.TooManyRequests:
        print("Rate limited - will wait and retry")
        # wait_on_rate_limit=True handles this automatically

    except tweepy.HTTPException as e:
        print(f"HTTP error: {e}")

    except Exception as e:
        print(f"Unexpected error: {e}")

asyncio.run(robust_async_operations())
```

### Session Cleanup (Advanced)

```python
import asyncio
import tweepy

async def manual_session_cleanup():
    # For long-running applications, you may want to manage sessions manually
    client = tweepy.AsyncClient(bearer_token="your_bearer_token")

    try:
        # Perform operations
        tweets = await client.search_recent_tweets("python", max_results=10)

        for tweet in tweets.data:
            print(tweet.text)

    finally:
        # Manually close session if needed (advanced usage)
        if hasattr(client, 'session') and client.session:
            await client.session.close()

asyncio.run(manual_session_cleanup())
```

## Dependencies

The async interface requires additional dependencies:

```bash
# Install with async support
pip install tweepy[async]

# Or install dependencies manually
pip install aiohttp async-lru
```
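Before switching to the async interface, you can confirm the optional dependencies are importable (a defensive sketch using only the standard library; the module names correspond to the packages listed above):

```python
import importlib.util

# Report whether each optional async dependency is installed,
# without importing tweepy itself
for name in ("aiohttp", "async_lru"):
    available = importlib.util.find_spec(name) is not None
    print(f"{name} available: {available}")
```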

## Performance Considerations

### Concurrent Operations
- Use `asyncio.gather()` for concurrent API calls
- Limit concurrent requests to respect rate limits
- Consider using semaphores to control concurrency

### Memory Usage
- Async operations can use more memory due to coroutine overhead
- Consider processing results in batches for large datasets
- Use async generators (`async for`) for memory-efficient iteration
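The async-generator advice above can be sketched without any network access; `fake_pages` below is a stand-in for paginated API responses, not a tweepy API:

```python
import asyncio

async def fake_pages():
    # Stand-in for successive pages of API results
    for page in ([1, 2, 3], [4, 5, 6], [7, 8, 9]):
        await asyncio.sleep(0)  # simulate awaiting the next HTTP response
        yield page

async def flatten(limit=None):
    # Yield items one at a time instead of accumulating whole pages
    count = 0
    async for page in fake_pages():
        for item in page:
            if limit is not None and count >= limit:
                return
            yield item
            count += 1

async def main():
    items = [item async for item in flatten(limit=5)]
    print(items)  # prints: [1, 2, 3, 4, 5]

asyncio.run(main())
```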

### Rate Limiting
- Set `wait_on_rate_limit=True` for automatic rate limit handling
- Consider implementing custom backoff strategies for high-throughput applications
- Monitor rate limit headers in responses

```python
import asyncio
import tweepy

async def controlled_concurrent_requests():
    client = tweepy.AsyncClient(bearer_token="your_bearer_token")

    # Limit concurrent requests
    semaphore = asyncio.Semaphore(5)  # Max 5 concurrent requests

    async def get_user_with_semaphore(username):
        async with semaphore:
            return await client.get_user(username=username)

    usernames = ["python", "github", "stackoverflow", "nodejs", "docker", "reactjs"]

    # Execute with controlled concurrency
    results = await asyncio.gather(*[
        get_user_with_semaphore(username) for username in usernames
    ])

    for result in results:
        user = result.data
        print(f"@{user.username}: {user.public_metrics['followers_count']:,} followers")

asyncio.run(controlled_concurrent_requests())
```