
tessl/npm-pulumi--aws

A Pulumi package for creating and managing Amazon Web Services (AWS) cloud resources with infrastructure-as-code.


docs/analytics/kinesis.md

Amazon Kinesis

Amazon Kinesis makes it easy to collect, process, and analyze real-time streaming data.

Package

import * as pulumi from "@pulumi/pulumi";
import * as aws from "@pulumi/aws";
import * as kinesis from "@pulumi/aws/kinesis";

Key Resources

Kinesis Data Stream

Real-time data streaming service.

const stream = new aws.kinesis.Stream("data-stream", {
    name: "application-events",
    shardCount: 2,
    retentionPeriod: 24,
    streamModeDetails: {
        streamMode: "PROVISIONED",
    },
    tags: {
        Environment: "production",
    },
});

Kinesis Firehose Delivery Stream

Delivers streaming data to destinations such as S3, Redshift, and OpenSearch.

const deliveryStream = new aws.kinesis.FirehoseDeliveryStream("s3-delivery", {
    name: "event-delivery-stream",
    destination: "extended_s3",
    extendedS3Configuration: {
        roleArn: firehoseRole.arn,
        bucketArn: bucket.arn,
        prefix: "events/year=!{timestamp:yyyy}/month=!{timestamp:MM}/day=!{timestamp:dd}/",
        errorOutputPrefix: "errors/",
        bufferingSize: 5,
        bufferingInterval: 300,
        compressionFormat: "GZIP",
    },
});
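The examples above reference a `firehoseRole` and `bucket` that are assumed to already exist. A minimal sketch of how they might be defined (names and policy scope are illustrative, not prescribed by this package):

```typescript
import * as aws from "@pulumi/aws";

// Hypothetical destination bucket for the delivery stream.
const bucket = new aws.s3.Bucket("delivery-bucket");

// Role that Firehose assumes when writing to the bucket.
const firehoseRole = new aws.iam.Role("firehose-role", {
    assumeRolePolicy: JSON.stringify({
        Version: "2012-10-17",
        Statement: [{
            Action: "sts:AssumeRole",
            Effect: "Allow",
            Principal: { Service: "firehose.amazonaws.com" },
        }],
    }),
});

// Grant the role write access to the bucket.
new aws.iam.RolePolicy("firehose-s3-access", {
    role: firehoseRole.id,
    policy: bucket.arn.apply(arn => JSON.stringify({
        Version: "2012-10-17",
        Statement: [{
            Action: [
                "s3:PutObject",
                "s3:GetBucketLocation",
                "s3:AbortMultipartUpload",
                "s3:ListBucket",
            ],
            Effect: "Allow",
            Resource: [arn, `${arn}/*`],
        }],
    })),
});
```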

Kinesis Data Analytics Application

Real-time analytics on streaming data.

const analyticsApp = new aws.kinesisanalyticsv2.Application("analytics-app", {
    name: "stream-analytics",
    runtimeEnvironment: "FLINK-1_15",
    serviceExecutionRole: analyticsRole.arn,
    applicationConfiguration: {
        applicationCodeConfiguration: {
            codeContent: {
                s3ContentLocation: {
                    bucketArn: codeBucket.arn,
                    fileKey: "analytics-app.jar",
                },
            },
            codeContentType: "ZIPFILE",
        },
        environmentProperties: {
            propertyGroups: [{
                propertyGroupId: "ProducerConfigProperties",
                propertyMap: {
                    "aws.region": "us-west-2",
                },
            }],
        },
    },
});

Common Patterns

On-Demand Kinesis Stream

const onDemandStream = new aws.kinesis.Stream("on-demand", {
    name: "on-demand-stream",
    streamModeDetails: {
        streamMode: "ON_DEMAND",
    },
    retentionPeriod: 168, // 7 days
});

Firehose with Data Transformation

const transformLambda = new aws.lambda.Function("transform", {
    runtime: "python3.11",
    handler: "index.handler",
    role: lambdaRole.arn,
    timeout: 60, // allow time for batch transformation; the 3-second default is too short for Firehose
    code: new pulumi.asset.AssetArchive({
        "index.py": new pulumi.asset.StringAsset(`
def handler(event, context):
    import base64
    import json
    
    output = []
    for record in event['records']:
        # Decode and transform
        payload = base64.b64decode(record['data'])
        data = json.loads(payload)
        
        # Transform logic here
        transformed = json.dumps(data)
        
        output.append({
            'recordId': record['recordId'],
            'result': 'Ok',
            'data': base64.b64encode(transformed.encode()).decode()
        })
    
    return {'records': output}
        `),
    }),
});

const transformedDeliveryStream = new aws.kinesis.FirehoseDeliveryStream("transformed", {
    name: "transformed-delivery",
    destination: "extended_s3",
    extendedS3Configuration: {
        roleArn: firehoseRole.arn,
        bucketArn: bucket.arn,
        processingConfiguration: {
            enabled: true,
            processors: [{
                type: "Lambda",
                parameters: [{
                    parameterName: "LambdaArn",
                    parameterValue: transformLambda.arn,
                }],
            }],
        },
    },
});

Stream Consumer with Lambda

const consumer = new aws.kinesis.StreamConsumer("consumer", {
    name: "event-processor",
    streamArn: stream.arn,
});

const eventSourceMapping = new aws.lambda.EventSourceMapping("kinesis-trigger", {
    eventSourceArn: consumer.arn,
    functionName: processorFunction.arn,
    startingPosition: "LATEST",
    batchSize: 100,
    maximumBatchingWindowInSeconds: 10,
});

Key Properties

Stream Properties

  • name - Stream name
  • shardCount - Number of shards (provisioned mode)
  • streamModeDetails - Stream mode (PROVISIONED or ON_DEMAND)
  • retentionPeriod - Data retention in hours (24-8760)
  • encryptionType - Encryption type (NONE or KMS)
  • kmsKeyId - KMS key ID for encryption
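The encryption properties above can be combined to enable server-side encryption. A minimal sketch, assuming a dedicated KMS key (the key resource here is illustrative):

```typescript
import * as aws from "@pulumi/aws";

// Hypothetical KMS key used to encrypt stream records at rest.
const streamKey = new aws.kms.Key("stream-key", {
    description: "Encrypts Kinesis stream records at rest",
});

const encryptedStream = new aws.kinesis.Stream("encrypted-stream", {
    name: "encrypted-events",
    shardCount: 1,
    encryptionType: "KMS",
    kmsKeyId: streamKey.id,
});
```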

Firehose Properties

  • name - Delivery stream name
  • destination - Destination type (extended_s3, redshift, elasticsearch, opensearch, etc.)
  • extendedS3Configuration - S3 destination configuration
  • bufferingSize - Buffer size in MB (1-128), set inside the destination configuration
  • bufferingInterval - Buffer interval in seconds (60-900), set inside the destination configuration

Output Properties

  • id - Resource identifier
  • arn - Resource ARN
  • name - Resource name
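These output properties are Pulumi `Output<T>` values that resolve after deployment and can be exported from a stack. A short sketch with an illustrative stream:

```typescript
import * as aws from "@pulumi/aws";

const stream = new aws.kinesis.Stream("example", { shardCount: 1 });

// Stack outputs become visible via `pulumi stack output` after deployment.
export const streamId = stream.id;
export const streamArn = stream.arn;
export const streamName = stream.name;
```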

Use Cases

  • Real-Time Analytics: Process and analyze streaming data
  • Log Aggregation: Collect logs from multiple sources
  • IoT Data Ingestion: Collect sensor and device data
  • Clickstream Analysis: Track user behavior in real-time
  • Event Processing: Build event-driven architectures

Related Services

  • Lambda - Stream processing
  • S3 - Data storage destination
  • DynamoDB - Stream processing with DynamoDB Streams

Install with Tessl CLI

npx tessl i tessl/npm-pulumi--aws@7.16.0
