# Analytics and Export

Analytics data export and metrics collection for business intelligence and reporting. The Analytics Service provides data export capabilities for extracting insights from user behavior, product performance, and recommendation effectiveness.

## Capabilities

### Analytics Export

Export analytics metrics and data for external analysis and business intelligence systems.

```python { .api }
class AnalyticsServiceClient:
    def export_analytics_metrics(self, request: ExportAnalyticsMetricsRequest) -> Operation:
        """
        Exports analytics metrics data to external destinations (long-running operation).

        Args:
            request: Contains catalog, output configuration, and export filters

        Returns:
            Operation: Resolves to ExportAnalyticsMetricsResponse with export details

        Raises:
            InvalidArgument: If export parameters are invalid
            PermissionDenied: If insufficient permissions for export destination
        """

class AnalyticsServiceAsyncClient:
    async def export_analytics_metrics(self, request: ExportAnalyticsMetricsRequest) -> Operation:
        """
        Exports analytics metrics data to external destinations (long-running operation).

        Args:
            request: Contains catalog, output configuration, and export filters

        Returns:
            Operation: Resolves to ExportAnalyticsMetricsResponse with export details

        Raises:
            InvalidArgument: If export parameters are invalid
            PermissionDenied: If insufficient permissions for export destination
        """
```
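
The examples later in this page all use the synchronous client. For completeness, here is a minimal sketch of an asynchronous export; it assumes the async client accepts the same request shape and that the returned long-running operation exposes an awaitable `result()`, as other `google-cloud` async clients do. Project, bucket, and catalog names are placeholders.

```python
import asyncio

from google.cloud import retail


async def run_export() -> None:
    # Hypothetical async usage; assumes AnalyticsServiceAsyncClient mirrors the
    # synchronous API and returns an operation with an awaitable result().
    client = retail.AnalyticsServiceAsyncClient()

    request = retail.ExportAnalyticsMetricsRequest(
        catalog="projects/my-project/locations/global/catalogs/default_catalog",
        output_config=retail.OutputConfig(
            gcs_destination=retail.GcsDestination(
                output_uri_prefix="gs://my-analytics-bucket/exports/async-demo/"
            )
        ),
        filter='event_time >= "2024-01-01T00:00:00Z"',
    )

    operation = await client.export_analytics_metrics(request=request)
    response = await operation.result()  # wait for the export to finish
    print("Async export finished:", response)


asyncio.run(run_export())
```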

## Data Types

### Export Request Configuration

Comprehensive configuration for analytics data export operations.

```python { .api }
class ExportAnalyticsMetricsRequest:
    catalog: str                 # Catalog resource name (required)
    output_config: OutputConfig  # Export destination configuration (required)
    filter: str                  # Filter expression for data to export

class OutputConfig:
    destination: str                           # Export destination type
    gcs_destination: GcsDestination            # Google Cloud Storage destination
    bigquery_destination: BigQueryDestination  # BigQuery destination

class GcsDestination:
    output_uri_prefix: str  # GCS URI prefix for output files (required)

class BigQueryDestination:
    project_id: str  # BigQuery project ID (required)
    dataset_id: str  # BigQuery dataset ID (required)
    table_id: str    # BigQuery table ID (required)
    table_type: str  # Table type (TABLE, VIEW)
```
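
The usage examples later in this page always set exactly one destination on `OutputConfig`. A small, hypothetical helper (not part of the client library) that builds a config for either destination, using only the fields documented above, might look like this:

```python
from typing import Optional, Tuple

from google.cloud import retail


def make_output_config(
    gcs_uri_prefix: Optional[str] = None,
    bigquery: Optional[Tuple[str, str, str]] = None,  # (project_id, dataset_id, table_id)
) -> retail.OutputConfig:
    """Build an OutputConfig with exactly one destination set (hypothetical helper)."""
    if (gcs_uri_prefix is None) == (bigquery is None):
        raise ValueError("Specify exactly one of gcs_uri_prefix or bigquery")

    if gcs_uri_prefix is not None:
        return retail.OutputConfig(
            gcs_destination=retail.GcsDestination(output_uri_prefix=gcs_uri_prefix)
        )

    project_id, dataset_id, table_id = bigquery
    return retail.OutputConfig(
        bigquery_destination=retail.BigQueryDestination(
            project_id=project_id, dataset_id=dataset_id, table_id=table_id
        )
    )
```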

### Export Response and Results

Results and metadata from analytics export operations.

```python { .api }
class ExportAnalyticsMetricsResponse:
    error_samples: List[Status]        # Sample of errors encountered during export
    errors_config: ExportErrorsConfig  # Error handling configuration used
    output_result: OutputResult        # Details about exported data

class OutputResult:
    bigquery_result: List[BigQueryOutputResult]  # BigQuery export results
    gcs_result: List[GcsOutputResult]            # GCS export results

class BigQueryOutputResult:
    dataset_id: str  # BigQuery dataset ID
    table_id: str    # BigQuery table ID

class GcsOutputResult:
    output_uri: str   # GCS URI of exported file
    bytes_count: int  # Number of bytes exported
```
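
Built only from the response fields documented above, a small helper for summarizing an export result might look like the following sketch (the function itself is not part of the client library):

```python
from google.cloud import retail


def summarize_export(response: retail.ExportAnalyticsMetricsResponse) -> None:
    """Print a short summary of an ExportAnalyticsMetricsResponse (sketch)."""
    for bq in response.output_result.bigquery_result:
        print(f"BigQuery output: {bq.dataset_id}.{bq.table_id}")

    total_bytes = 0
    for gcs in response.output_result.gcs_result:
        print(f"GCS output: {gcs.output_uri} ({gcs.bytes_count:,} bytes)")
        total_bytes += gcs.bytes_count
    if total_bytes:
        print(f"Total bytes written to GCS: {total_bytes:,}")

    if response.error_samples:
        print(f"{len(response.error_samples)} sample errors; "
              f"error files under: {response.errors_config.gcs_prefix}")
```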

### Export Error Handling

Configuration for handling errors during export operations.

```python { .api }
class ExportErrorsConfig:
    gcs_prefix: str  # GCS prefix for error files

class ExportMetadata:
    create_time: Timestamp                  # Export operation start time
    update_time: Timestamp                  # Last update time
    partial_failure: bool                   # Whether export had partial failures
    request: ExportAnalyticsMetricsRequest  # Original export request
```
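
While an export runs, the long-running operation carries `ExportMetadata`. The following sketch polls that metadata, assuming the operation object exposes it via `operation.metadata` as `google-cloud` long-running operations generally do; the helper function itself is hypothetical:

```python
import time

from google.api_core.operation import Operation


def wait_with_progress(operation: Operation, poll_seconds: int = 30):
    """Poll an export operation, printing its ExportMetadata while it runs (sketch)."""
    while not operation.done():
        metadata = operation.metadata  # assumed to deserialize to ExportMetadata
        if metadata is not None:
            print(f"Export started {metadata.create_time}, last update {metadata.update_time}")
        time.sleep(poll_seconds)

    result = operation.result()
    metadata = operation.metadata
    if metadata is not None and metadata.partial_failure:
        print("Export finished with partial failures; inspect result.error_samples")
    return result
```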

### Analytics Metrics Types

Various analytics metrics and data types available for export.

```python { .api }
# Analytics metric types that can be exported
ANALYTICS_METRIC_SEARCH_ANALYTICS = "search-analytics"                      # Search behavior and performance
ANALYTICS_METRIC_USER_EVENTS = "user-events"                                # User interaction events
ANALYTICS_METRIC_PRODUCT_PERFORMANCE = "product-performance"                # Product view/purchase metrics
ANALYTICS_METRIC_RECOMMENDATION_PERFORMANCE = "recommendation-performance"  # Recommendation effectiveness
ANALYTICS_METRIC_REVENUE_ANALYTICS = "revenue-analytics"                    # Revenue and conversion metrics
ANALYTICS_METRIC_CATALOG_ANALYTICS = "catalog-analytics"                    # Catalog usage and coverage

# Filter expressions for analytics data
# Time-based filters
FILTER_TIME_RANGE = 'event_time >= "2024-01-01T00:00:00Z" AND event_time <= "2024-01-31T23:59:59Z"'

# Event type filters
FILTER_PURCHASE_EVENTS = 'event_type = "purchase-complete"'
FILTER_SEARCH_EVENTS = 'event_type = "search"'
FILTER_VIEW_EVENTS = 'event_type = "page-view"'

# Product filters
FILTER_PRODUCT_CATEGORY = 'product_details.product.categories: ANY("Electronics")'
FILTER_PRICE_RANGE = 'product_details.product.price_info.price >= 100 AND product_details.product.price_info.price <= 1000'

# User filters
FILTER_REGISTERED_USERS = 'user_info.user_id != ""'
```
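
These filter strings are plain expressions combined with `AND`. A small, hypothetical helper for composing them (the two constants are copied from the block above) might look like this:

```python
# Values copied from the constants above
FILTER_PURCHASE_EVENTS = 'event_type = "purchase-complete"'
FILTER_REGISTERED_USERS = 'user_info.user_id != ""'


def build_filter(*clauses: str) -> str:
    """Join non-empty filter clauses with AND (hypothetical helper)."""
    return " AND ".join(clause.strip() for clause in clauses if clause.strip())


# Example: purchases by registered users during January 2024
export_filter = build_filter(
    FILTER_PURCHASE_EVENTS,
    FILTER_REGISTERED_USERS,
    'event_time >= "2024-01-01T00:00:00Z" AND event_time <= "2024-01-31T23:59:59Z"',
)
print(export_filter)
```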

## Usage Examples

### Basic Analytics Export to BigQuery

```python
from google.cloud import retail

client = retail.AnalyticsServiceClient()

# Export user events to BigQuery for analysis
output_config = retail.OutputConfig(
    bigquery_destination=retail.BigQueryDestination(
        project_id="my-analytics-project",
        dataset_id="retail_analytics",
        table_id="user_events_2024"
    )
)

request = retail.ExportAnalyticsMetricsRequest(
    catalog="projects/my-project/locations/global/catalogs/default_catalog",
    output_config=output_config,
    filter='event_time >= "2024-01-01T00:00:00Z" AND event_time <= "2024-12-31T23:59:59Z"'
)

operation = client.export_analytics_metrics(request=request)
print(f"Export operation started: {operation.name}")

# Wait for export to complete
result = operation.result()
print("Export completed successfully")

if result.output_result.bigquery_result:
    for bq_result in result.output_result.bigquery_result:
        print(f"Data exported to: {bq_result.dataset_id}.{bq_result.table_id}")

if result.error_samples:
    print(f"Export had {len(result.error_samples)} errors")
    for error in result.error_samples[:3]:  # Show first 3 errors
        print(f"- {error.message}")
```

### Export Purchase Analytics to Google Cloud Storage

```python
# Export purchase events and revenue data to GCS
output_config = retail.OutputConfig(
    gcs_destination=retail.GcsDestination(
        output_uri_prefix="gs://my-analytics-bucket/exports/purchase-data/"
    )
)

# Filter for purchase events only
purchase_filter = 'event_type = "purchase-complete" AND event_time >= "2024-01-01T00:00:00Z"'

request = retail.ExportAnalyticsMetricsRequest(
    catalog="projects/my-project/locations/global/catalogs/default_catalog",
    output_config=output_config,
    filter=purchase_filter
)

operation = client.export_analytics_metrics(request=request)
print(f"Purchase data export operation: {operation.name}")

# Block until the export completes
result = operation.result()
print("Purchase data export completed")

if result.output_result.gcs_result:
    for gcs_result in result.output_result.gcs_result:
        print(f"File exported: {gcs_result.output_uri}")
        print(f"Size: {gcs_result.bytes_count} bytes")
```

### Export Product Performance Analytics

```python
# Export product performance metrics for electronics category
product_performance_config = retail.OutputConfig(
    bigquery_destination=retail.BigQueryDestination(
        project_id="my-analytics-project",
        dataset_id="retail_analytics",
        table_id="product_performance_electronics"
    )
)

# Filter for electronics products with detailed performance data
electronics_filter = '''
product_details.product.categories: ANY("Electronics") AND
event_time >= "2024-01-01T00:00:00Z" AND
event_type: ANY("page-view", "add-to-cart", "purchase-complete")
'''

request = retail.ExportAnalyticsMetricsRequest(
    catalog="projects/my-project/locations/global/catalogs/default_catalog",
    output_config=product_performance_config,
    filter=electronics_filter
)

operation = client.export_analytics_metrics(request=request)
result = operation.result()

print("Product performance data exported")
print("Electronics performance data available in BigQuery")
```

### Export Search Analytics

```python
# Export search analytics to understand search behavior and performance
search_analytics_config = retail.OutputConfig(
    bigquery_destination=retail.BigQueryDestination(
        project_id="my-analytics-project",
        dataset_id="retail_analytics",
        table_id="search_analytics_monthly"
    )
)

# Filter for search events with query and result information
search_filter = '''
event_type = "search" AND
event_time >= "2024-01-01T00:00:00Z" AND
event_time <= "2024-01-31T23:59:59Z" AND
search_query != ""
'''

request = retail.ExportAnalyticsMetricsRequest(
    catalog="projects/my-project/locations/global/catalogs/default_catalog",
    output_config=search_analytics_config,
    filter=search_filter
)

operation = client.export_analytics_metrics(request=request)
result = operation.result()

print("Search analytics exported for January 2024")
```

### Export Recommendation Performance Data

```python
# Export recommendation performance metrics to analyze ML model effectiveness
recommendation_config = retail.OutputConfig(
    gcs_destination=retail.GcsDestination(
        output_uri_prefix="gs://my-analytics-bucket/recommendations/performance/"
    )
)

# Filter for events that have attribution tokens (came from recommendations)
recommendation_filter = '''
attribution_token != "" AND
event_time >= "2024-01-01T00:00:00Z" AND
event_type: ANY("page-view", "add-to-cart", "purchase-complete")
'''

request = retail.ExportAnalyticsMetricsRequest(
    catalog="projects/my-project/locations/global/catalogs/default_catalog",
    output_config=recommendation_config,
    filter=recommendation_filter
)

operation = client.export_analytics_metrics(request=request)
result = operation.result()

print("Recommendation performance data exported")
for gcs_result in result.output_result.gcs_result:
    print(f"Recommendation metrics: {gcs_result.output_uri}")
```

### Export User Behavior Cohort Analysis Data

```python
# Export user behavior data for cohort analysis
cohort_config = retail.OutputConfig(
    bigquery_destination=retail.BigQueryDestination(
        project_id="my-analytics-project",
        dataset_id="user_analytics",
        table_id="user_cohort_data"
    )
)

# Filter for registered users with complete user journey data
user_behavior_filter = '''
user_info.user_id != "" AND
event_time >= "2024-01-01T00:00:00Z" AND
event_type: ANY("page-view", "search", "add-to-cart", "purchase-complete")
'''

request = retail.ExportAnalyticsMetricsRequest(
    catalog="projects/my-project/locations/global/catalogs/default_catalog",
    output_config=cohort_config,
    filter=user_behavior_filter
)

operation = client.export_analytics_metrics(request=request)
result = operation.result()

print("User behavior data exported for cohort analysis")
```

### Export Revenue Analytics by Category

```python
# Export revenue analytics segmented by product category
revenue_config = retail.OutputConfig(
    bigquery_destination=retail.BigQueryDestination(
        project_id="my-analytics-project",
        dataset_id="revenue_analytics",
        table_id="category_revenue_2024"
    )
)

# Filter for purchase events with revenue data
revenue_filter = '''
event_type = "purchase-complete" AND
purchase_transaction.revenue > 0 AND
event_time >= "2024-01-01T00:00:00Z" AND
product_details.product.categories: ANY("Electronics", "Clothing", "Books", "Home")
'''

request = retail.ExportAnalyticsMetricsRequest(
    catalog="projects/my-project/locations/global/catalogs/default_catalog",
    output_config=revenue_config,
    filter=revenue_filter
)

operation = client.export_analytics_metrics(request=request)
result = operation.result()

print("Revenue analytics by category exported")

# After export completes, you can query the BigQuery table for insights:
print("Sample BigQuery analysis queries:")
print("""
-- Total revenue by category
SELECT
  category,
  SUM(purchase_transaction.revenue) AS total_revenue,
  COUNT(*) AS transaction_count
FROM `my-analytics-project.revenue_analytics.category_revenue_2024`,
  UNNEST(product_details.product.categories) AS category
GROUP BY category
ORDER BY total_revenue DESC;

-- Monthly revenue trends
SELECT
  EXTRACT(MONTH FROM event_time) AS month,
  SUM(purchase_transaction.revenue) AS monthly_revenue,
  COUNT(DISTINCT user_info.user_id) AS unique_customers
FROM `my-analytics-project.revenue_analytics.category_revenue_2024`
GROUP BY month
ORDER BY month;
""")
```

### Error Handling and Monitoring

```python
# Export with comprehensive error handling
import time

output_config = retail.OutputConfig(
    bigquery_destination=retail.BigQueryDestination(
        project_id="my-analytics-project",
        dataset_id="retail_analytics",
        table_id="user_events_with_errors"
    )
)

request = retail.ExportAnalyticsMetricsRequest(
    catalog="projects/my-project/locations/global/catalogs/default_catalog",
    output_config=output_config,
    filter='event_time >= "2024-01-01T00:00:00Z"'
)

operation = client.export_analytics_metrics(request=request)

# Monitor operation progress
while not operation.done():
    print("Export in progress...")
    time.sleep(30)

result = operation.result()

# Check for errors and partial failures
if result.error_samples:
    print(f"Export completed with {len(result.error_samples)} errors:")
    for i, error in enumerate(result.error_samples[:5]):
        print(f"Error {i + 1}: {error.message}")

    # The response's errors_config reports where error details were written
    if result.errors_config.gcs_prefix:
        print(f"Error details saved to: {result.errors_config.gcs_prefix}")
else:
    print("Export completed successfully with no errors")

# Display export results
if result.output_result.bigquery_result:
    for bq_result in result.output_result.bigquery_result:
        print(f"BigQuery table created: {bq_result.dataset_id}.{bq_result.table_id}")

if result.output_result.gcs_result:
    total_bytes = sum(gcs.bytes_count for gcs in result.output_result.gcs_result)
    print(f"Total data exported: {total_bytes:,} bytes")
    for gcs_result in result.output_result.gcs_result:
        print(f"File: {gcs_result.output_uri} ({gcs_result.bytes_count:,} bytes)")
```