AWS Lambda + OpenTelemetry
Overview
There are two ways to integrate Lambda with OpenTelemetry (OTel):
Option 1: Using the AWS-managed OTel Lambda Layer
This option is the simplest: you just include the published Lambda Layer with your function.
Note: the layer can be around 70MB (depending on your architecture) and counts toward the AWS Lambda size limit of 250MB unzipped. If this is a problem, there is a simple fix: see "I'm running out of space in AWS Lambda" below.
Option 2: Load the OTel SDK directly in your code
This option adds about 20MB-35MB to your Lambda package and requires only slightly more code.
Option 1: Using the AWS-managed OTel Lambda Layer
TL;DR: Working GitHub example
- Create a collector.yaml file and put it in the root of your Lambda
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: localhost:4317
      http:
        endpoint: localhost:4318

exporters:
  otlp/codesee:
    endpoint: "in-otel.codesee.io:443"
    headers:
      Authorization: "Bearer <REPLACE WITH CODESEE TOKEN>" # !! Set your CodeSee Ingestion Token

service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [otlp/codesee]
- Add the AWS OTel Lambda Layer to your Lambda. You can find the correct ARN at https://aws-otel.github.io/docs/getting-started/lambda/lambda-js
Assuming you are using the Serverless framework, here's a sample serverless.yaml file (a minimal matching handler is sketched after it):
service: serverless-lambda-layer-otel-example
frameworkVersion: '3'

provider:
  name: aws
  runtime: nodejs18.x
  region: us-east-2 # region must match your Lambda Layer ARN below
  tracing: # X-Ray tracing is required
    lambda: true

functions:
  api:
    handler: index.handler
    timeout: 10
    layers:
      # find the managed Lambda Layer ARN at https://aws-otel.github.io/docs/getting-started/lambda/lambda-js
      - arn:aws:lambda:us-east-2:901920570463:layer:aws-otel-nodejs-amd64-ver-1-12-0:1
    environment:
      # tell the OTel Lambda Layer where the config file is located
      OPENTELEMETRY_COLLECTOR_CONFIG_FILE: "/var/task/collector.yaml"
      AWS_LAMBDA_EXEC_WRAPPER: "/opt/otel-handler"
      OTEL_SERVICE_NAME: "serverless-lambda-layer-otel-example"
      OTEL_RESOURCE_ATTRIBUTES: "deployment.environment=staging"
      # optional OTel collector Lambda Layer config:
      # OTEL_LOG_LEVEL: "DEBUG"
    events:
      - httpApi:
          path: /
          method: get
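With this option, the layer wraps your function via AWS_LAMBDA_EXEC_WRAPPER and instruments it automatically, so index.js needs no OTel code at all. A minimal sketch of the handler (the message text is a placeholder):

'use strict';

// No OTel imports needed: the Lambda Layer instruments this handler automatically.
module.exports.handler = async (event) => {
  return {
    statusCode: 200,
    body: JSON.stringify({ message: 'hello from lambda', input: event }),
  };
};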
Option 2: Load the OTel SDK directly in your code
TL;DR: Working GitHub example
- Install the required OTel SDK libs for your programming language. Note that the tracing.js below also requires @opentelemetry/resources and @opentelemetry/semantic-conventions, so they are listed here explicitly. For example (Node.js):
"dependencies": {
"@grpc/grpc-js": "^1.8.14",
"@opentelemetry/api": "^1.4.1",
"@opentelemetry/auto-instrumentations-node": "^0.37.0",
"@opentelemetry/exporter-trace-otlp-grpc": "^0.39.1",
"@opentelemetry/sdk-node": "^0.39.1"
}
- Initialize the instrumentation libs in a tracing.js file; two optional additions to this file are sketched after this list. For example (Node.js):
const grpc = require('@grpc/grpc-js');
const { NodeSDK } = require('@opentelemetry/sdk-node');
const { getNodeAutoInstrumentations } = require("@opentelemetry/auto-instrumentations-node");
const { OTLPTraceExporter } = require("@opentelemetry/exporter-trace-otlp-grpc");
const { Resource } = require('@opentelemetry/resources');
const { SemanticResourceAttributes } = require('@opentelemetry/semantic-conventions');

// For troubleshooting, set the log level to DiagLogLevel.DEBUG
// const { diag, DiagConsoleLogger, DiagLogLevel } = require('@opentelemetry/api');
// diag.setLogger(new DiagConsoleLogger(), DiagLogLevel.DEBUG);

const codeseeToken = process.env.CODESEE_TOKEN; // !! CodeSee Ingestion Token
const metadata = new grpc.Metadata();
metadata.set('Authorization', `Bearer ${codeseeToken}`);

const sdk = new NodeSDK({
  traceExporter: new OTLPTraceExporter({
    url: "https://in-otel.codesee.io:443/v1/traces",
    credentials: grpc.credentials.createSsl(),
    metadata,
  }),
  resource: Resource.default().merge(
    new Resource({
      [SemanticResourceAttributes.SERVICE_NAME]: process.env.SERVICE_NAME, // !! NAME YOUR SERVICE !!
      [SemanticResourceAttributes.DEPLOYMENT_ENVIRONMENT]: process.env.DEPLOYMENT_ENVIRONMENT, // !! SET YOUR ENVIRONMENT
      [SemanticResourceAttributes.SERVICE_VERSION]: process.env.SERVICE_VERSION, // (optional) set version
    })
  ),
  instrumentations: [getNodeAutoInstrumentations()],
});

sdk.start();
- Load the tracing libs from the main Lambda entry file. For example (Node.js):
'use strict';

require('./tracing');

module.exports.handler = async (event) => {
  return {
    statusCode: 200,
    body: JSON.stringify(
      {
        message: "hello from serverless-lambda-otel-nodejs-lib-example lambda",
        input: event,
      },
      null,
      2
    ),
  };
};
- Be sure to enable tracing for your Lambda functions, and set the following environment variables.
Assuming you are using the Serverless framework, here's a sample serverless.yaml file:
service: my-lambda
frameworkVersion: '3'

provider:
  name: aws
  runtime: nodejs18.x
  region: us-east-2
  architecture: x86_64 # the default; use arm64 if building on Apple Silicon (M1)
  tracing:
    lambda: true # !! Must enable tracing

functions:
  api:
    handler: index.handler
    timeout: 10
    environment:
      CODESEE_TOKEN: "CodeSee:*** TOKEN HERE" # !! Replace with your CodeSee Ingestion Token
      SERVICE_NAME: my-service-name # !! The service name shows up in the CodeSee Service Map UI
      DEPLOYMENT_ENVIRONMENT: staging # The service environment - each env gets its own map
      SERVICE_VERSION: 0.1.0 # optional: service version
    events:
      - httpApi:
          path: /
          method: get
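Two optional additions to the tracing.js above, offered as sketches rather than required setup. First, spans are exported in batches, so the last spans of an invocation can be lost if the execution environment is frozen or reclaimed before the exporter flushes. One best-effort mitigation (a sketch; note that Lambda only delivers SIGTERM to the runtime process in some configurations, for example when an extension is registered):

// Hypothetical addition to tracing.js: flush buffered spans on shutdown.
process.on('SIGTERM', async () => {
  try {
    await sdk.shutdown(); // flushes pending spans before the process exits
  } catch (err) {
    console.error('Error shutting down the OpenTelemetry SDK', err);
  }
});

Second, getNodeAutoInstrumentations() covers common libraries (HTTP, the AWS SDK, and more). If you also want custom spans, the @opentelemetry/api package from the dependencies above can be used in your handler code. A minimal sketch, with hypothetical tracer and span names:

const { trace } = require('@opentelemetry/api');

const tracer = trace.getTracer('my-lambda'); // hypothetical tracer name

async function doWork() {
  // startActiveSpan makes the new span current while the callback runs
  return tracer.startActiveSpan('do-work', async (span) => {
    try {
      // ... business logic you want timed as its own span ...
      return 'done';
    } finally {
      span.end(); // always end the span so it is exported
    }
  });
}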
Troubleshooting
I'm running out of space in AWS Lambda
TL;DR: Working GitHub example
There is a simple change you can make to your deployment that raises the Lambda size limit from 250MB to 10GB: build your Lambda as a Docker container image. Your development process remains the same, and your code still runs in Lambda!
Create a Dockerfile to build the container image
Create a Dockerfile that uses the Lambda runtime for your programming language. For example (Node.js):
FROM public.ecr.aws/lambda/nodejs:18

COPY package.json .
RUN npm i --production

COPY index.js .
COPY tracing.js .

CMD ["index.handler"]
Initialize the instrumentation libs. For example (Node.js, with debug logging enabled):
const grpc = require('@grpc/grpc-js');
const { NodeSDK } = require('@opentelemetry/sdk-node');
const { getNodeAutoInstrumentations } = require("@opentelemetry/auto-instrumentations-node");
const { OTLPTraceExporter } = require("@opentelemetry/exporter-trace-otlp-grpc");
const { Resource } = require('@opentelemetry/resources');
const { SemanticResourceAttributes } = require('@opentelemetry/semantic-conventions');

// For troubleshooting, set the log level to DiagLogLevel.DEBUG
const { diag, DiagConsoleLogger, DiagLogLevel } = require('@opentelemetry/api');
diag.setLogger(new DiagConsoleLogger(), DiagLogLevel.DEBUG);

const codeseeToken = process.env.CODESEE_TOKEN; // !! CodeSee Ingestion Token
const metadata = new grpc.Metadata();
metadata.set('Authorization', `Bearer ${codeseeToken}`);

const sdk = new NodeSDK({
  traceExporter: new OTLPTraceExporter({
    url: "https://in-otel.codesee.io:443/v1/traces", // CodeSee endpoint
    credentials: grpc.credentials.createSsl(),
    metadata,
  }),
  resource: Resource.default().merge(
    new Resource({
      [SemanticResourceAttributes.SERVICE_NAME]: process.env.SERVICE_NAME, // !! NAME YOUR SERVICE !!
      [SemanticResourceAttributes.DEPLOYMENT_ENVIRONMENT]: process.env.DEPLOYMENT_ENVIRONMENT, // !! SET YOUR ENVIRONMENT
      [SemanticResourceAttributes.SERVICE_VERSION]: process.env.SERVICE_VERSION, // (optional) set version
    })
  ),
  instrumentations: [getNodeAutoInstrumentations()],
});

sdk.start();
Load the tracing libs from the main Lambda entry file. For example (Node.js):
'use strict';

require('./tracing');

module.exports.handler = async (event) => {
  return {
    statusCode: 200,
    body: JSON.stringify(
      {
        message: "hello from lambda",
        input: event,
      },
      null,
      2
    ),
  };
};
Changing Serverless to build and deploy a Docker image
The image: block under the api function tells Serverless to build the container, and the ecr: block tells it to upload the image to ECR.
service: serverless-lambda-container-otel-js-lib
frameworkVersion: '3'

provider:
  name: aws
  region: us-east-2
  architecture: arm64 # use arm64 when building on Apple Silicon (M1); otherwise remove it to use the default, x86_64
  tracing:
    lambda: true
  ecr:
    images:
      example-image:
        path: ./

functions:
  api:
    image:
      name: example-image
    timeout: 10
    environment:
      OTEL_EXPORTER_OTLP_ENDPOINT: "https://in-otel.codesee.io:443/v1/traces"
      CODESEE_TOKEN: "CodeSee:*** TOKEN HERE" # !! Replace with your actual CodeSee token
      SERVICE_NAME: serverless-lambda-container-otel-nodejs-lib-example # !! The service name shows up in the CodeSee Service Map UI
      DEPLOYMENT_ENVIRONMENT: staging # The service environment
      SERVICE_VERSION: 0.1.0 # optional: service version
    events:
      - httpApi:
          path: /
          method: get
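With this configuration in place, running serverless deploy builds the container image with your local Docker, pushes it to ECR, and updates the function to use the new image.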