# External Integration

Functions for loading models from Hugging Face, integrating with existing web frameworks, working with external APIs, and connecting Gradio applications with other systems and services.

## Capabilities

### Hugging Face Integration

Functions for loading and integrating with Hugging Face models, datasets, and Spaces directly into Gradio interfaces.

```python { .api }
def load(
    name,
    src=None,
    api_key=None,
    alias=None,
    **kwargs
):
    """
    Load external Hugging Face models and Spaces into Gradio interfaces.

    Parameters:
    - name: Model or Space name (e.g., "microsoft/DialoGPT-medium", "huggingface/CodeBERTa-small-v1")
    - src: Source type ("huggingface", "github", or None for auto-detection)
    - api_key: Hugging Face API key for private models
    - alias: Alternative name for the loaded interface
    - kwargs: Additional parameters for model loading

    Returns:
    - Gradio interface object for the loaded model/Space
    """

def load_chat(
    name,
    src=None,
    api_key=None,
    **kwargs
):
    """
    Load chat interfaces from external sources with conversational UI.

    Parameters:
    - name: Chat model or Space name
    - src: Source type ("huggingface" or None)
    - api_key: API key for private models
    - kwargs: Additional chat configuration parameters

    Returns:
    - ChatInterface object configured for the loaded model
    """

def load_openapi(
    url,
    api_key=None,
    headers=None,
    **kwargs
):
    """
    Load and integrate OpenAPI specifications as Gradio interfaces.

    Parameters:
    - url: URL to OpenAPI specification (JSON or YAML)
    - api_key: API key for authenticated endpoints
    - headers: Additional HTTP headers for API requests
    - kwargs: Additional configuration options

    Returns:
    - Gradio interface for interacting with the API
    """
```

Usage examples:

```python
import gradio as gr

# Load a Hugging Face model
text_generator = gr.load("microsoft/DialoGPT-medium")

# Load a Hugging Face Space
image_classifier = gr.load("huggingface/image-classification-demo")

# Load with custom configuration
private_model = gr.load(
    "organization/private-model",
    api_key="hf_your_token_here",
    src="huggingface"
)

# Load chat interface
chatbot = gr.load_chat("microsoft/DialoGPT-medium")

# Load OpenAPI service
weather_api = gr.load_openapi(
    "https://api.openweathermap.org/data/2.5/openapi.json",
    api_key="your_weather_api_key"
)

# Combine loaded interfaces
with gr.Blocks() as demo:
    gr.Markdown("# Multi-Model Demo")

    with gr.Tab("Text Generation"):
        text_generator.render()

    with gr.Tab("Image Classification"):
        image_classifier.render()

    with gr.Tab("Chat"):
        chatbot.render()

demo.launch()
```

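An app returned by `gr.load` can also be used programmatically. In recent Gradio versions, an interface loaded from a Space can be called like a plain Python function; a minimal sketch, where the Space name is only illustrative:

```python
import gradio as gr

# Load a public Space and call it like a regular Python function
# (the Space name is illustrative; substitute any compatible Space).
translator = gr.load("spaces/gradio/english_translator")

print(translator("My name is Sarah and I live in London"))
```
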
### Web Framework Integration

Functions and classes for mounting Gradio applications within existing web frameworks and handling HTTP requests.

```python { .api }
def mount_gradio_app(
    app,
    gradio_app,
    path="/gradio",
    gradio_api_url="http://localhost:7860",
    **kwargs
):
    """
    Mount a Gradio application inside an existing ASGI web application
    (FastAPI or another Starlette-based framework).

    Parameters:
    - app: Existing FastAPI (or Starlette-compatible) application instance
    - gradio_app: Gradio interface or Blocks instance to mount
    - path: URL path where the Gradio app will be mounted
    - gradio_api_url: URL for Gradio API endpoints
    - kwargs: Additional mounting configuration

    Returns:
    - Modified web application with Gradio integration
    """

class Request:
    def __init__(self, request):
        """
        HTTP request wrapper for accessing request data in Gradio functions.

        Parameters:
        - request: Underlying HTTP request object

        Attributes:
        - headers: Request headers dictionary
        - query_params: Query parameters dictionary
        - path_params: Path parameters dictionary
        - cookies: Request cookies
        - client: Client information (IP, host, etc.)
        - method: HTTP method (GET, POST, etc.)
        - url: Full request URL
        """

    @property
    def headers(self):
        """Access request headers."""

    @property
    def query_params(self):
        """Access query parameters."""

    @property
    def cookies(self):
        """Access request cookies."""

    @property
    def client(self):
        """Access client information."""

    def get_header(self, name, default=None):
        """Get specific header value."""

class Header:
    def __init__(self, name, value):
        """
        HTTP header utility for request/response handling.

        Parameters:
        - name: Header name
        - value: Header value
        """
```

Usage examples:

```python
import gradio as gr
from fastapi import FastAPI
import uvicorn

# Create FastAPI app
app = FastAPI()

# Create Gradio interface
def greet(name, request: gr.Request):
    client_ip = request.client.host
    user_agent = request.headers.get("user-agent", "Unknown")
    return f"Hello {name}! IP: {client_ip}, Browser: {user_agent}"

gradio_app = gr.Interface(
    fn=greet,
    inputs="text",
    outputs="text"
)

# Mount Gradio in FastAPI
app = gr.mount_gradio_app(app, gradio_app, path="/demo")

# Add regular FastAPI routes
@app.get("/")
def root():
    return {"message": "FastAPI with Gradio demo at /demo"}

@app.get("/api/status")
def status():
    return {"status": "running", "demo_url": "/demo"}

# Run combined application
if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)
```

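Because `mount_gradio_app` simply adds routes to the FastAPI application, it can be called more than once to serve several Gradio demos from a single server. A minimal sketch, where the demos and mount paths are illustrative:

```python
import gradio as gr
from fastapi import FastAPI
import uvicorn

app = FastAPI()

# Two independent Gradio demos served by one FastAPI process
echo_demo = gr.Interface(fn=lambda text: text, inputs="text", outputs="text")
upper_demo = gr.Interface(fn=lambda text: text.upper(), inputs="text", outputs="text")

app = gr.mount_gradio_app(app, echo_demo, path="/echo")
app = gr.mount_gradio_app(app, upper_demo, path="/upper")

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)
```
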
Flask integration example (Flask is a WSGI framework, so the Flask app is wrapped with `WSGIMiddleware` and served alongside Gradio from a single FastAPI application):

```python
import gradio as gr
import uvicorn
from fastapi import FastAPI
from fastapi.middleware.wsgi import WSGIMiddleware
from flask import Flask

# Create Flask app with its existing routes
flask_app = Flask(__name__)

@flask_app.route("/")
def home():
    return '<h1>Flask + Gradio</h1><a href="/ml">ML Demo</a>'

@flask_app.route("/api/info")
def info():
    return {"app": "Flask + Gradio", "ml_endpoint": "/ml"}

# Create Gradio interface
def process_data(data, request: gr.Request):
    session_id = request.cookies.get("session_id", "anonymous")
    return f"Processed {data} for session {session_id}"

gradio_app = gr.Interface(
    fn=process_data,
    inputs="text",
    outputs="text"
)

# mount_gradio_app requires an ASGI (FastAPI/Starlette) app, so wrap the Flask
# app with WSGIMiddleware and serve both from a single FastAPI application.
app = FastAPI()
app = gr.mount_gradio_app(app, gradio_app, path="/ml")
app.mount("/", WSGIMiddleware(flask_app))

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=5000)
```

### OAuth and Authentication

Classes and utilities for handling OAuth authentication and user management in Gradio applications.

```python { .api }
class OAuthProfile:
    def __init__(self, profile_data):
        """
        OAuth user profile data representation.

        Parameters:
        - profile_data: Raw profile data from OAuth provider

        Attributes:
        - username: User's username or handle
        - name: User's display name
        - email: User's email address
        - avatar_url: URL to user's profile picture
        - profile_url: URL to user's profile page
        - provider: OAuth provider name (e.g., "github", "google")
        """

    @property
    def username(self):
        """Get username from profile."""

    @property
    def name(self):
        """Get display name from profile."""

    @property
    def email(self):
        """Get email from profile."""

    @property
    def avatar_url(self):
        """Get avatar image URL."""

class OAuthToken:
    def __init__(self, token_data):
        """
        OAuth token management and validation.

        Parameters:
        - token_data: Raw token data from OAuth flow

        Attributes:
        - access_token: OAuth access token
        - refresh_token: OAuth refresh token (if available)
        - expires_at: Token expiration timestamp
        - scope: Token permissions scope
        """

    @property
    def access_token(self):
        """Get access token."""

    @property
    def is_expired(self):
        """Check if token is expired."""

    def refresh(self):
        """Refresh the access token if refresh token available."""
```

Usage examples:

```python
import gradio as gr

# Note: validate_token, exchange_code_for_token, and get_user_profile are
# application-specific helpers (not part of Gradio) and are not defined here.

def authenticated_function(input_data, request: gr.Request):
    # Check if the user is authenticated
    auth_header = request.headers.get("authorization")

    if not auth_header:
        return "Please log in to use this feature"

    # Extract user info (simplified example)
    user_info = validate_token(auth_header)

    if user_info:
        return f"Hello {user_info['name']}, processing: {input_data}"
    else:
        return "Invalid authentication"

def oauth_callback(code, state):
    # Handle the OAuth callback
    token = exchange_code_for_token(code)
    profile = get_user_profile(token.access_token)

    return f"Welcome {profile.name}!"

# Interface with authentication
auth_demo = gr.Interface(
    fn=authenticated_function,
    inputs="text",
    outputs="text",
    title="Authenticated Demo"
)

# OAuth login interface
oauth_demo = gr.Interface(
    fn=oauth_callback,
    inputs=["text", "text"],  # code, state
    outputs="text",
    title="OAuth Login"
)
```

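When an app is deployed on Hugging Face Spaces with OAuth enabled, Gradio can inject the signed-in user's `OAuthProfile` directly into an event handler via a type hint, alongside a `gr.LoginButton`. A minimal sketch of that pattern, assuming such a deployment:

```python
import gradio as gr

def greet(profile: gr.OAuthProfile | None):
    # Gradio injects the OAuth profile based on the type hint;
    # it is None when nobody is signed in.
    if profile is None:
        return "Please sign in with Hugging Face."
    return f"Welcome back, {profile.username}!"

with gr.Blocks() as demo:
    gr.LoginButton()
    greeting = gr.Textbox(label="Greeting")
    demo.load(greet, inputs=None, outputs=greeting)

demo.launch()
```
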
### Model Context Protocol (MCP)

Integration with Model Context Protocol for AI agent workflows and advanced model interactions.

```python { .api }
# MCP module for Model Context Protocol integration
import gradio.mcp as mcp

# MCP integration functions and classes would be defined here
# This is a placeholder for the MCP functionality
```

Usage example:

```python
import gradio as gr

def mcp_enabled_function(input_data: str) -> str:
    """Process input text; the docstring doubles as the MCP tool description."""
    return f"Processed: {input_data}"

mcp_demo = gr.Interface(
    fn=mcp_enabled_function,
    inputs="text",
    outputs="text",
    title="MCP Enhanced Interface"
)

# Launching with mcp_server=True exposes the app's functions as MCP tools for
# AI agents (requires a recent Gradio release installed with the "mcp" extra,
# e.g. pip install "gradio[mcp]").
mcp_demo.launch(mcp_server=True)
```

## Integration Patterns

### API Gateway Integration

Integrating Gradio with API gateways and microservice architectures:

```python
import gradio as gr
import requests

def proxy_to_service(input_data, request: gr.Request):
    # Forward the request to a microservice behind the API gateway
    api_gateway_url = "https://api.example.com/ml-service"

    headers = {
        "Authorization": request.headers.get("authorization"),
        "Content-Type": "application/json"
    }

    response = requests.post(
        api_gateway_url,
        json={"input": input_data},
        headers=headers
    )

    return response.json()["result"]

# Create interface that proxies to external services
proxy_demo = gr.Interface(
    fn=proxy_to_service,
    inputs="text",
    outputs="text"
)
```

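The reverse direction works as well: other services in the architecture can call a running Gradio app over its HTTP API using the `gradio_client` package. A minimal sketch, assuming the app is running locally and exposes the default `/predict` endpoint:

```python
from gradio_client import Client  # pip install gradio_client

# Connect to a running Gradio app and call its prediction endpoint.
# The URL and api_name are assumptions for illustration.
client = Client("http://localhost:7860")
result = client.predict("sample input", api_name="/predict")
print(result)
```
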
### Database Integration

Connecting Gradio interfaces with databases for data persistence:

```python
import gradio as gr
import sqlite3

def init_db():
    # Create the table once so the example runs out of the box
    conn = sqlite3.connect("app_data.db")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS user_inputs "
        "(input TEXT, ip_address TEXT, timestamp TEXT)"
    )
    conn.commit()
    conn.close()

def save_and_process(user_input, request: gr.Request):
    # Save input to database
    conn = sqlite3.connect("app_data.db")
    cursor = conn.cursor()

    cursor.execute(
        "INSERT INTO user_inputs (input, ip_address, timestamp) VALUES (?, ?, datetime('now'))",
        (user_input, request.client.host)
    )
    conn.commit()
    conn.close()

    # Process and return result (process_data is an application-specific
    # function, not defined in this example)
    result = process_data(user_input)
    return result

def get_history(request: gr.Request):
    # Retrieve the last ten inputs submitted from this client's IP address
    conn = sqlite3.connect("app_data.db")
    cursor = conn.cursor()

    cursor.execute(
        "SELECT input, timestamp FROM user_inputs WHERE ip_address = ? ORDER BY timestamp DESC LIMIT 10",
        (request.client.host,)
    )

    history = cursor.fetchall()
    conn.close()

    return str(history)

init_db()

with gr.Blocks() as demo:
    input_text = gr.Textbox(label="Input")
    result_output = gr.Textbox(label="Result")
    history_output = gr.Textbox(label="Your History")

    process_btn = gr.Button("Process")
    history_btn = gr.Button("View History")

    process_btn.click(save_and_process, input_text, result_output)
    history_btn.click(get_history, outputs=history_output)
```

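A small variation on `get_history` above returns the query result as a DataFrame so it renders as a proper table rather than a raw string; it assumes the same `app_data.db` schema and an installed `pandas`:

```python
import gradio as gr
import sqlite3
import pandas as pd

def history_table(request: gr.Request):
    # Same query as get_history, returned as a DataFrame for tabular display
    conn = sqlite3.connect("app_data.db")
    df = pd.read_sql_query(
        "SELECT input, timestamp FROM user_inputs "
        "WHERE ip_address = ? ORDER BY timestamp DESC LIMIT 10",
        conn,
        params=(request.client.host,),
    )
    conn.close()
    return df

with gr.Blocks() as history_demo:
    table = gr.Dataframe(label="Your History")
    gr.Button("Refresh").click(history_table, outputs=table)
```
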
### Webhook Integration

Setting up webhooks for external system notifications:

```python
import gradio as gr
from fastapi import FastAPI
import json
import uvicorn

app = FastAPI()
webhook_data = {"latest": None}

def process_webhook_data():
    if webhook_data["latest"]:
        return f"Latest webhook: {json.dumps(webhook_data['latest'], indent=2)}"
    return "No webhook data received"

def clear_webhook_data():
    webhook_data["latest"] = None
    return "Webhook data cleared"

# Webhook endpoint for external systems to POST notifications to
@app.post("/webhook")
async def receive_webhook(data: dict):
    webhook_data["latest"] = data
    return {"status": "received"}

# Gradio interface for viewing webhook data
webhook_viewer = gr.Interface(
    fn=process_webhook_data,
    inputs=None,
    outputs="text",
    title="Webhook Data Viewer"
)

# Mount the viewer alongside the webhook endpoint and run both with uvicorn
app = gr.mount_gradio_app(app, webhook_viewer, path="/viewer")

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)
```

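For completeness, this is roughly how an external system would notify the app above; the URL assumes the combined application is served locally on port 8000:

```python
import requests

# Simulate an external system posting a notification to the webhook endpoint
resp = requests.post(
    "http://localhost:8000/webhook",
    json={"event": "build_finished", "status": "success"},
    timeout=10,
)
print(resp.json())  # expected: {"status": "received"}
```
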
### Cloud Service Integration

Integrating with cloud services (AWS, GCP, Azure):

```python
import gradio as gr
import boto3
import os

def upload_to_s3(file_path, bucket_name="my-gradio-uploads"):
    # Upload the received file to AWS S3 (credentials come from the environment)
    s3_client = boto3.client("s3")
    key = os.path.basename(file_path)

    try:
        s3_client.upload_file(file_path, bucket_name, key)
        return f"File uploaded successfully: s3://{bucket_name}/{key}"
    except Exception as e:
        return f"Upload failed: {e}"

def analyze_with_rekognition(image_path):
    # Use AWS Rekognition for image label detection
    rekognition = boto3.client("rekognition")

    with open(image_path, "rb") as img_file:
        response = rekognition.detect_labels(
            Image={"Bytes": img_file.read()},
            MaxLabels=10
        )

    labels = [label["Name"] for label in response["Labels"]]
    return f"Detected: {', '.join(labels)}"

# gr.Interface takes a single function, so expose the two cloud operations
# as separate interfaces combined in a tabbed layout.
upload_demo = gr.Interface(
    fn=upload_to_s3,
    inputs=gr.File(type="filepath", label="File to upload"),
    outputs="text"
)

rekognition_demo = gr.Interface(
    fn=analyze_with_rekognition,
    inputs=gr.Image(type="filepath", label="Image to analyze"),
    outputs="text"
)

cloud_demo = gr.TabbedInterface(
    [upload_demo, rekognition_demo],
    tab_names=["S3 Upload", "Rekognition Analysis"],
    title="Cloud Integration Demo"
)
```

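The same pattern carries over to other clouds. As a hedged sketch, here is an equivalent upload handler for Google Cloud Storage; the bucket name is a placeholder and credentials are assumed to come from the environment:

```python
import os
import gradio as gr
from google.cloud import storage  # pip install google-cloud-storage

def upload_to_gcs(file_path, bucket_name="my-gradio-uploads"):
    # Upload the received file to a Google Cloud Storage bucket
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(os.path.basename(file_path))
    blob.upload_from_filename(file_path)
    return f"Uploaded to gs://{bucket_name}/{blob.name}"

gcs_demo = gr.Interface(
    fn=upload_to_gcs,
    inputs=gr.File(type="filepath", label="File to upload"),
    outputs="text",
    title="GCS Upload Demo",
)
```
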