# Streaming Data to Amazon Elasticsearch Service
## Using AWS Lambda: Sample Node.js Code
### Package amazon-elasticsearch-lambda-samples

Copyright 2015- Amazon.com, Inc. or its affiliates. All Rights Reserved.
### Introduction
It is often useful to stream data, as it is generated, for indexing in an
Amazon Elasticsearch Service domain, so that fresh data is available for
search or analytics. Doing this requires:

1. Knowing when new data is available
2. Code to pick up the data, parse it into JSON documents, and add them to an
   Amazon Elasticsearch Service (henceforth, ES for short) domain
3. Scalable, fully managed infrastructure to host this code

*Lambda* is an AWS service that takes care of these requirements. Put simply,
it is an "event handling" service in the cloud. Lambda lets us implement
the event handler (in Node.js or Java), which it hosts and invokes in response
to an event.

The handler can be triggered by a "push" or a "pull" approach.
Certain event sources (such as S3) push event notifications to Lambda;
others (such as Kinesis) require Lambda to poll for events and pull them
when available.

For more details on AWS Lambda, please see
[the documentation](http://aws.amazon.com/documentation/lambda/).

This package contains sample Lambda code (in Node.js) to stream data to ES
from two common AWS data sources: S3 and Kinesis. The S3 sample takes Apache
log files, parses them into JSON documents, and adds them to ES. The Kinesis
sample reads JSON data from the stream and adds the documents to ES.

Note that the sample code has been kept simple for reasons of clarity. It
does not handle ES document batching, eventual consistency issues for
S3 updates, and so on.
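As an illustration of the parsing step, here is a minimal sketch of turning one Apache common-log line into a JSON document. The function name and the field names (`ip`, `timestamp`, `request`, `status`, `bytes`) are assumptions for this sketch, not the exact schema used by the sample code.

```javascript
// Hypothetical sketch: parse one line of an Apache common-log file into a
// plain object ready to be serialized as a JSON document for ES.
function parseApacheLogLine(line) {
    // host, ident, user, [timestamp], "request", status, bytes
    var m = line.match(/^(\S+) \S+ \S+ \[([^\]]+)\] "([^"]*)" (\d{3}) (\d+|-)/);
    if (!m) return null;  // not a recognizable log line
    return {
        ip: m[1],
        timestamp: m[2],
        request: m[3],
        status: Number(m[4]),
        bytes: m[5] === '-' ? 0 : Number(m[5])
    };
}

var doc = parseApacheLogLine(
    '127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326');
console.log(JSON.stringify(doc));
```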
### Setup Overview

While some detailed instructions are covered later in this file and elsewhere
(in the Lambda documentation), this section aims to show the larger picture
that the individual steps work to accomplish. We assume that the data source
(an S3 bucket or a Kinesis stream, in this case) and an ES domain are already
set up.

1. **Deployment Package**: The "Deployment Package" is the event handler code
   and its dependencies, packaged as a zip file. The first step in creating
   a new Lambda function is to prepare and upload this zip file.

2. **Lambda Configuration**:

   1. Handler: The name of the main code file in the deployment package,
      with the file extension replaced by a `.handler` suffix.
   2. Memory: The memory limit, which also determines the compute resources
      allocated to the function. For now, the default should do.
   3. Timeout: The default timeout value (3 seconds) is quite low for our
      use case. 10 seconds might work better, but please adjust based on
      your testing.

3. **Authorization**: Since various AWS services need to make calls to each
   other here, appropriate authorization is required. This takes the form of
   configuring an IAM role, to which various authorization policies are
   attached. The Lambda function will assume this role when running.

Note:

* The AWS Console is simpler to use for configuration than the other methods.
* Lambda is currently available only in a few regions (us-east-1, us-west-2,
  eu-west-1, ap-northeast-1).
* Once the setup is complete and tested, enable the data source in the Lambda
  console, so that data may start streaming in.
* The code is kept simple for purposes of illustration. It doesn't batch
  documents when loading the ES domain, or (for S3 updates) handle
  eventual consistency cases.

#### Deployment Package Creation
1. On your development machine, download and install [Node.js](https://nodejs.org/en/).
2. Anywhere, create a directory structure similar to the following:

        eslambda (place sample code here)
        |
        +-- node_modules (dependencies will go here)

3. Modify the sample code with the correct ES endpoint, region, index,
   and document type.
4. Install each dependency imported by the sample code
   (with a `require()` call), as follows:

        npm install <dependency>

   Verify that these are installed within the `node_modules` subdirectory.
5. Create a zip file to package the code and the `node_modules` subdirectory:

        zip -r eslambda.zip *

The zip file thus created is the Lambda Deployment Package.

## S3-Lambda-ES

Set up the Lambda function and the S3 bucket as described in the
[Lambda-S3 Walkthrough](http://docs.aws.amazon.com/lambda/latest/dg/walkthrough-s3-events-adminuser.html).
Please keep in mind the following notes and configuration overrides:

* The walkthrough uses the AWS CLI for configuration, but it's probably more
  convenient to use the AWS Console (web UI).

* The S3 bucket must be created in the same region as the Lambda function, so
  that it can push events to Lambda.

* When registering the S3 bucket as the data source in Lambda, add a filter
  for files having the `.log` suffix, so that Lambda picks up only Apache log
  files.

* The following authorizations are required:

  1. Lambda permits S3 to push event notifications to it
  2. S3 permits Lambda to fetch the created objects from the given bucket
  3. ES permits Lambda to add documents to the given domain

  The Lambda console provides a simple way to create an IAM role with policies
  for (1). For (2), when creating the IAM role, choose the "S3 execution role"
  option; this will load the role with permissions to read from the S3
  bucket. For (3), add the following access policy to the role to permit ES
  operations:

        {
            "Version": "2012-10-17",
            "Statement": [
                {
                    "Action": [
                        "es:*"
                    ],
                    "Effect": "Allow",
                    "Resource": "*"
                }
            ]
        }

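The S3 handler's first job is to find out which object to fetch. A minimal sketch of pulling the bucket name and object key out of the event notification is below; the bucket and key values in the example event are made up for illustration. Note that S3 URL-encodes the key (with `+` for spaces) in the notification, so it must be decoded before calling `getObject`.

```javascript
// Sketch: extract the bucket and (decoded) key from an S3 event notification.
function s3ObjectFromEvent(event) {
    var rec = event.Records[0];
    return {
        bucket: rec.s3.bucket.name,
        key: decodeURIComponent(rec.s3.object.key.replace(/\+/g, ' '))
    };
}

// Abridged example with the shape of a real S3 put-event notification;
// the bucket and key names are hypothetical.
var sampleEvent = {
    Records: [{
        s3: {
            bucket: { name: 'my-log-bucket' },
            object: { key: 'logs/access+2015.log' }
        }
    }]
};
console.log(s3ObjectFromEvent(sampleEvent));
```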
## Kinesis-Lambda-ES

Set up the Lambda function and the Kinesis stream as described in the
[Lambda-Kinesis Walkthrough](http://docs.aws.amazon.com/lambda/latest/dg/walkthrough-kinesis-events-adminuser.html).
Please keep in mind the following notes and configuration overrides:

* The walkthrough uses the AWS CLI, but it's probably more convenient to use
  the AWS Console (web UI) for Lambda configuration.

* To the IAM role assigned to the Lambda function, add the following
  access policy to permit ES operations:

        {
            "Version": "2012-10-17",
            "Statement": [
                {
                    "Action": [
                        "es:*"
                    ],
                    "Effect": "Allow",
                    "Resource": "*"
                }
            ]
        }

* For testing: If you have a Kinesis client, use it to stream a record to
  Lambda. If not, the AWS CLI can be used to push a JSON document to the
  stream:

        aws kinesis put-record --stream-name <stream name> --data "<JSON document>" --region <region> --partition-key shardId-000000000000
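On the Lambda side, Kinesis delivers each record's data base64-encoded, so the handler must decode it before parsing the JSON. A minimal sketch, with a made-up record payload for illustration:

```javascript
// Sketch: decode one Kinesis record's base64 payload and parse it as JSON.
function decodeKinesisRecord(record) {
    var payload = Buffer.from(record.kinesis.data, 'base64').toString('utf-8');
    return JSON.parse(payload);
}

// Abridged example with the shape Lambda passes for a Kinesis batch;
// the record contents here are hypothetical.
var sampleEvent = {
    Records: [{
        kinesis: {
            data: Buffer.from('{"user":"alice","clicks":3}').toString('base64')
        }
    }]
};
console.log(decodeKinesisRecord(sampleEvent.Records[0]));
```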