
tessl/pypi-celery

Distributed Task Queue for Python that enables asynchronous task execution across multiple workers and machines

- **Workspace**: tessl
- **Visibility**: Public
- **Describes**: pypipkg:pypi/celery@4.4.x

To install, run:

`npx @tessl/cli install tessl/pypi-celery@4.4.0`

# Celery

Celery is a distributed task queue for Python that enables asynchronous execution of jobs across multiple workers and machines. It uses message brokers like RabbitMQ or Redis to coordinate between clients and workers, providing high availability, horizontal scaling, and robust task execution with retries, routing, and monitoring capabilities.

## Package Information

- **Package Name**: celery
- **Language**: Python
- **Installation**: `pip install celery`
- **Bundles**: `pip install "celery[redis]"` (with Redis support)

## Core Imports

```python
from celery import Celery
```

Common task creation imports:

```python
from celery import shared_task
```

Application and task state imports:

```python
from celery import current_app, current_task
```

Canvas workflow imports:

```python
from celery import signature, chain, group, chord, chunks, xmap
```

## Basic Usage

```python
from celery import Celery

# Create Celery app with a broker and a result backend (the backend lets .get() fetch results)
app = Celery('myapp',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/0')

# Define a task
@app.task
def add(x, y):
    return x + y

# Execute task asynchronously
result = add.delay(4, 4)
print(result.get())  # Wait for result

# Use shared task for library code
from celery import shared_task

@shared_task
def multiply(x, y):
    return x * y

# Chain tasks together
from celery import chain
result = chain(add.s(2, 2), multiply.s(8))()
print(result.get())  # Result: 32 (4 * 8)
```

## Architecture

Celery's architecture consists of several key components working together:

- **Application (Celery)**: Central application instance managing configuration, task registry, and coordination
- **Tasks**: Units of work executed asynchronously by workers, created via decorators or classes
- **Workers**: Processes that consume tasks from queues and execute them, supporting multiple concurrency models
- **Brokers**: Message transport systems (Redis, RabbitMQ, etc.) that queue and route tasks between clients and workers
- **Result Backends**: Storage systems for task results and metadata, enabling result retrieval and monitoring
- **Beat Scheduler**: Periodic task scheduler for cron-like functionality and recurring jobs
- **Canvas**: Workflow primitives (signature, chain, group, chord) for complex task composition and orchestration

This distributed architecture enables Celery to scale horizontally, handle failures gracefully, and provide flexible task routing and execution patterns.
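
As a rough sketch of how these pieces map onto code (the module name `proj` and the Redis URLs below are assumptions, not part of this tile):

```python
# proj.py -- sketch of how the components fit together (module name and Redis URLs assumed)
from celery import Celery

# Application: the central instance; the broker transports tasks,
# the result backend stores task state and return values
app = Celery('proj',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/1')

# Task: a unit of work registered with the application
@app.task
def ping():
    return 'pong'

# Workers and the beat scheduler run as separate processes, e.g.:
#   celery -A proj worker --loglevel=info
#   celery -A proj beat --loglevel=info
```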

## Capabilities

### Core Application

Main Celery application class, task creation with decorators, shared tasks for reusable components, and application lifecycle management.

```python { .api }
class Celery:
    def __init__(self, main=None, broker=None, backend=None, **kwargs): ...
    def task(self, *args, **opts): ...
    def send_task(self, name, args=None, kwargs=None, **options): ...

def shared_task(*args, **kwargs): ...
```

[Core Application](./core-application.md)
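
A hedged usage sketch for this surface (the app name, broker URL, and task names are assumptions): `app.task` registers a task, `shared_task` keeps library code app-agnostic, and `send_task` dispatches by registered name.

```python
from celery import Celery, shared_task

app = Celery('orders', broker='redis://localhost:6379/0')  # assumed app name and broker URL

# app.task registers a task on this application
@app.task(name='orders.add')
def add(x, y):
    return x + y

# shared_task binds lazily to the current app -- handy for reusable library code
@shared_task
def normalize(value):
    return value.strip().lower()

# send_task dispatches by registered name without importing the task function;
# it needs the broker to be reachable and returns an AsyncResult
result = app.send_task('orders.add', args=(2, 3))
```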

### Workflow Primitives (Canvas)

Task composition primitives for building complex workflows including sequential chains, parallel groups, callbacks, and functional programming patterns.

```python { .api }
def signature(task, args=None, kwargs=None, **options): ...
def chain(*tasks, **kwargs): ...
def group(*tasks, **kwargs): ...
def chord(header, body, **kwargs): ...
def chunks(it, n, task): ...
```

[Workflow Primitives](./workflow-primitives.md)
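
The sketch below shows how these primitives compose; the app setup and Redis URLs are assumptions, and actually executing the workflows requires a running worker.

```python
from celery import Celery, chain, chord, group, signature

# assumed broker/backend URLs; a running worker is needed to execute the workflows
app = Celery('canvasdemo',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/1')

@app.task
def add(x, y):
    return x + y

@app.task
def tsum(numbers):
    return sum(numbers)

# signature: a serializable description of one task invocation (also spelled add.s(2, 2))
sig = signature(add, args=(2, 2))

# chain: run sequentially, feeding each result into the next task
workflow = chain(add.s(2, 2), add.s(4))               # (2 + 2) + 4

# group: run tasks in parallel
parallel = group(add.s(i, i) for i in range(5))

# chord: a group whose collected results are passed to a callback
summed = chord(group(add.s(i, i) for i in range(5)), tsum.s())

# Calling a canvas object sends it to the broker and returns a result handle, e.g.:
#   print(workflow().get())   # -> 8
```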

### Results and State Management

Task result handling, state monitoring, result retrieval with timeouts, and task lifecycle management.

```python { .api }
class AsyncResult:
    def get(self, timeout=None, propagate=True, interval=0.5): ...
    def ready(self) -> bool: ...
    def successful(self) -> bool: ...
    def revoke(self, terminate=False): ...

class GroupResult:
    def get(self, timeout=None, propagate=True): ...
```

[Results and State](./results-state.md)
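
A minimal sketch of the result lifecycle, assuming a Redis broker/backend and a running worker:

```python
from celery import Celery
from celery.result import AsyncResult

# assumed broker/backend URLs; a running worker is needed for results to arrive
app = Celery('resultsdemo',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/1')

@app.task
def add(x, y):
    return x + y

result = add.delay(4, 4)

print(result.id)               # task UUID
print(result.ready())          # True once the task has finished
print(result.successful())     # True if it finished without raising
print(result.get(timeout=10))  # block up to 10 s; re-raises task exceptions by default

# A result handle can be rebuilt later from the task id
same = AsyncResult(result.id, app=app)

# revoke() asks workers to discard (or terminate) the task
result.revoke(terminate=False)
```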

### Scheduling and Beat

Periodic task scheduling with cron-like syntax, interval-based schedules, solar event scheduling, and beat scheduler management.

```python { .api }
def crontab(minute='*', hour='*', day_of_week='*', day_of_month='*', month_of_year='*'): ...
def schedule(run_every, relative=False): ...
def solar(event, lat, lon): ...
```

[Scheduling and Beat](./scheduling-beat.md)
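
A hedged configuration sketch: the task names, broker URL, and coordinates below are placeholders. Schedules are registered in `app.conf.beat_schedule` and driven by a separate beat process.

```python
from datetime import timedelta

from celery import Celery
from celery.schedules import crontab, solar

app = Celery('beatdemo', broker='redis://localhost:6379/0')  # assumed broker URL

app.conf.beat_schedule = {
    # interval schedule: every 30 seconds (plain seconds or a timedelta)
    'heartbeat': {
        'task': 'beatdemo.heartbeat',            # placeholder task name
        'schedule': timedelta(seconds=30),
    },
    # cron-style schedule: 07:30 every Monday
    'weekly-report': {
        'task': 'beatdemo.weekly_report',        # placeholder task name
        'schedule': crontab(hour=7, minute=30, day_of_week=1),
    },
    # solar schedule: run at sunset for the given (assumed) coordinates
    'dusk-job': {
        'task': 'beatdemo.dusk_job',             # placeholder task name
        'schedule': solar('sunset', -37.81, 144.96),
    },
}

# The beat process dispatches these to workers:
#   celery -A beatdemo beat --loglevel=info
```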

### Signals and Events

Task lifecycle signals, worker events, monitoring hooks, and custom signal handlers for debugging and monitoring integration.

```python { .api }
# Task signals
task_prerun = Signal()
task_postrun = Signal()
task_success = Signal()
task_failure = Signal()

# Worker signals
worker_ready = Signal()
worker_shutdown = Signal()
```

[Signals and Events](./signals-events.md)
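
These signals live in `celery.signals` and are connected with `.connect`; the handlers below are an illustrative sketch and would normally sit in a module the worker imports.

```python
# Illustrative signal handlers (printing is a stand-in for real logging/monitoring)
from celery.signals import task_failure, task_prerun, worker_ready

@task_prerun.connect
def log_start(sender=None, task_id=None, task=None, args=None, kwargs=None, **extra):
    print(f'starting {sender.name} ({task_id})')

@task_failure.connect
def log_failure(sender=None, task_id=None, exception=None, **extra):
    print(f'{sender.name} ({task_id}) failed: {exception!r}')

@worker_ready.connect
def announce(sender=None, **extra):
    print('worker is ready to accept tasks')
```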

### Configuration

Application configuration management, broker and backend setup, task routing, serialization options, and environment-based configuration.

```python { .api }
class Celery:
    def config_from_object(self, obj, silent=False, force=False): ...
    def config_from_envvar(self, variable_name, silent=False): ...

    @property
    def conf(self): ...  # Configuration namespace
```

[Configuration](./configuration.md)
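
An illustrative sketch of these configuration entry points; the module name `celeryconfig`, the environment variable, and the setting values are assumptions.

```python
from celery import Celery

app = Celery('configdemo')

# Load settings from an importable module (an assumed celeryconfig.py on the path)
app.config_from_object('celeryconfig', silent=True)

# Or from a module named by an environment variable, e.g. CELERY_CONFIG_MODULE=celeryconfig
app.config_from_envvar('CELERY_CONFIG_MODULE', silent=True)

# The conf namespace can also be updated directly with lowercase 4.x setting names
app.conf.update(
    broker_url='redis://localhost:6379/0',
    result_backend='redis://localhost:6379/1',
    task_serializer='json',
    task_routes={'reports.*': {'queue': 'reports'}},
)
```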

### Exception Handling

Complete exception hierarchy for task errors, retry mechanisms, timeout handling, backend errors, and worker-related exceptions.

```python { .api }
class CeleryError(Exception): ...
class Retry(CeleryError): ...
class MaxRetriesExceededError(CeleryError): ...
class TaskRevokedError(CeleryError): ...
class SoftTimeLimitExceeded(Exception): ...
```

[Exception Handling](./exceptions.md)
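
A hedged sketch of how these exceptions commonly surface in task code; the `fetch` task, the URL handling, and the retry policy below are illustrative assumptions.

```python
from urllib.request import urlopen

from celery import Celery
from celery.exceptions import MaxRetriesExceededError, SoftTimeLimitExceeded

app = Celery('errorsdemo', broker='redis://localhost:6379/0')  # assumed broker URL

@app.task(bind=True, max_retries=3, soft_time_limit=30)
def fetch(self, url):
    try:
        with urlopen(url, timeout=10) as response:
            return response.getcode()
    except SoftTimeLimitExceeded:
        raise                                   # soft time limit expired inside the worker
    except OSError as exc:
        try:
            # retry() reschedules the task by raising Retry internally
            raise self.retry(exc=exc, countdown=5)
        except MaxRetriesExceededError:
            return None                         # give up after max_retries attempts
```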