Streaming Analytics Pipeline

Automated Deployment

Before you launch the automated deployment, please review the architecture, configuration, and other considerations discussed in this guide. Follow the step-by-step instructions in this section to configure and deploy a Streaming Analytics Pipeline into your account.

Time to deploy: Approximately five (5) minutes

Prerequisites

If you choose Amazon Redshift or Amazon Elasticsearch Service as the destination for your analyzed data, you must configure them to work with the Streaming Analytics Pipeline solution.

Amazon Redshift

To configure Amazon Redshift, your Amazon Redshift cluster must have a table configured to accept data in the format output by the Amazon Kinesis Analytics application, and you must have permission to write to that table. If your Amazon Redshift cluster is located in an Amazon Virtual Private Cloud (VPC), the cluster must be publicly accessible with a public IP address. The cluster’s Amazon Elastic Compute Cloud (Amazon EC2) security group should allow access from the AWS Region’s Amazon Kinesis Firehose IP addresses (a sketch for adding these rules follows the list):

  • US East (N. Virginia) Region: 52.70.63.192/27

  • US West (Oregon) Region: 52.89.255.224/27

  • EU (Ireland) Region: 52.19.239.192/27
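Below is a minimal boto3 sketch for adding these ingress rules, assuming a placeholder security group ID and the default Amazon Redshift port (5439); substitute the security group attached to your cluster and its actual port.

    import boto3

    ec2 = boto3.client("ec2")

    # Amazon Kinesis Firehose CIDR blocks listed above.
    FIREHOSE_CIDRS = [
        "52.70.63.192/27",
        "52.89.255.224/27",
        "52.19.239.192/27",
    ]

    # Placeholder ID; use the security group attached to your cluster.
    SECURITY_GROUP_ID = "sg-0123456789abcdef0"

    for cidr in FIREHOSE_CIDRS:
        ec2.authorize_security_group_ingress(
            GroupId=SECURITY_GROUP_ID,
            IpPermissions=[{
                "IpProtocol": "tcp",
                "FromPort": 5439,  # default Amazon Redshift port
                "ToPort": 5439,
                "IpRanges": [{"CidrIp": cidr}],
            }],
        )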

Amazon Elasticsearch Service

To configure Amazon Elasticsearch Service, your Amazon Elasticsearch Service domain should have an existing index and type to which data can be assigned. We also recommend that you create your fields and map them to the appropriate data types before you start the Amazon Kinesis Analytics application, so that the solution assigns your data to the correct types. If you do not map the data types before you deploy the Streaming Analytics Pipeline, the solution creates them for you, but they may not be the types you want. A sketch of this pre-mapping step follows.
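This minimal example creates an index with an explicit type mapping by sending a PUT request to the domain endpoint with the requests library. The endpoint, index, type, and field names are placeholders, and the call assumes your domain's access policy allows unsigned requests from your IP address; if the domain requires IAM authentication, sign the request (for example, with an AWS Signature Version 4 aware client) instead.

    import requests

    # Placeholder endpoint, index, type, and fields; replace with your own.
    ENDPOINT = "https://search-my-domain.us-east-1.es.amazonaws.com"
    INDEX = "analytics"
    TYPE = "record"

    # Explicit mappings so the Amazon Kinesis Analytics output is stored
    # with the intended data types rather than dynamically inferred ones.
    mapping = {
        "mappings": {
            TYPE: {
                "properties": {
                    "event_time": {"type": "date"},
                    "device_id": {"type": "keyword"},
                    "metric_value": {"type": "double"},
                },
            },
        },
    }

    response = requests.put(ENDPOINT + "/" + INDEX, json=mapping)
    response.raise_for_status()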

What We'll Cover

The procedure for deploying this architecture on AWS consists of the following steps. For detailed instructions, follow the links for each step.

Step 1. Launch the Stack

  • Launch the AWS CloudFormation template into your AWS account (see the sketch after this list).

  • Enter values for required parameters.

  • Review the other template parameters, and adjust if necessary.
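For reference, Step 1 can also be performed programmatically with boto3. The stack name, template URL, and parameter key below are placeholders rather than the solution's actual values; take the real template URL and parameter names from the template you launch.

    import boto3

    cfn = boto3.client("cloudformation")

    # Placeholder stack name, template URL, and parameters.
    cfn.create_stack(
        StackName="streaming-analytics-pipeline",
        TemplateURL="https://s3.amazonaws.com/example-bucket/streaming-analytics-pipeline.template",
        Parameters=[
            {"ParameterKey": "ExampleParameter", "ParameterValue": "example-value"},
        ],
        Capabilities=["CAPABILITY_IAM"],
    )

    # Block until the stack is created before moving on to Step 2.
    waiter = cfn.get_waiter("stack_create_complete")
    waiter.wait(StackName="streaming-analytics-pipeline")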

Step 2. Validate and Start the Application

  • Verify that the schema and application code are correct.

  • Start the application (see the sketch below).
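The validation and start steps can also be scripted with boto3, as sketched below. The application name is a placeholder; the deployed stack determines the real one.

    import boto3

    ka = boto3.client("kinesisanalytics")

    # Placeholder application name.
    APP_NAME = "streaming-analytics-application"

    # Review the inferred schema and application code before starting.
    detail = ka.describe_application(ApplicationName=APP_NAME)["ApplicationDetail"]
    print(detail.get("ApplicationCode", ""))
    for inp in detail["InputDescriptions"]:
        print(inp["InputId"], inp["InputSchema"]["RecordColumns"])

    # Start reading from the current position in the source stream.
    ka.start_application(
        ApplicationName=APP_NAME,
        InputConfigurations=[{
            "Id": detail["InputDescriptions"][0]["InputId"],
            "InputStartingPositionConfiguration": {"InputStartingPosition": "NOW"},
        }],
    )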

Step 3. Start Streaming Data

  • Start streaming data to the source Kinesis stream (see the sketch after this list).

  • View results in your external destination.
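As a simple way to begin streaming test data, the sketch below writes a JSON record to a hypothetical source stream with boto3; match the stream name to the one created by the stack and the record fields to your application's input schema.

    import json
    import boto3

    kinesis = boto3.client("kinesis")

    # Placeholder stream name and record fields.
    record = {"device_id": "sensor-001", "metric_value": 42.0}

    kinesis.put_record(
        StreamName="streaming-analytics-source-stream",
        Data=json.dumps(record).encode("utf-8"),
        PartitionKey=record["device_id"],
    )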