Enhancing runtime security and governance with the Amazon Web Services Lambda Runtime API proxy extension
This post is written by Anton Aleksandrov, Principal Serverless Solutions Architect, and Shridhar Pandey, Senior Amazon Web Services Lambda Product Manager.
This post introduces the Lambda Runtime API proxy extension pattern. This approach enables security vendors and engineering teams to provide enhanced, non-invasive security and governance tools for Lambda functions. Use cases include sanitizing event payloads, blocking malicious events, and auditing and augmenting payloads.
Overview
Lambda runtimes use the Runtime API to retrieve the next incoming event to be processed by the function handler and to return the handler response to the Lambda service. Lambda extensions use the Extensions API to register with the Lambda service and to receive invocation and shutdown lifecycle events.
This is how runtimes and extensions communicate with the Lambda service via the Runtime API and Extensions API endpoints:

Amazon Web Services Lambda Runtime API and Extensions API endpoints
When you build a Lambda extension, you can subscribe it to the INVOKE lifecycle event so that the Lambda service notifies the extension of each function invocation. The information supplied contains the function invocation metadata, but not the event payload. This makes the event useful for observability, but limited for application security and governance use cases, such as inspecting payloads for vulnerabilities, sanitizing inputs, and blocking malicious events.
The Lambda Runtime API proxy is a pattern that enables you to hook into the function invocation request and response lifecycle. It allows you to use extensions to implement advanced security, compliance, governance, and observability scenarios without changes to function code. You can add runtime security mechanisms, implement audit procedures for data flowing in and out of the function, enhance observability by auto-injecting tracing headers, and more.
Understanding the Lambda Runtime API workflow
This is how the Lambda Runtime consumes the Lambda Runtime API:

How the Lambda Runtime consumes the Lambda Runtime API
Lambda runtimes use the value of the AWS_LAMBDA_RUNTIME_API environment variable to make Runtime API requests. The two primary endpoints are /next, which is used to retrieve the next event to process, and /response, which is used to return event processing results to the Lambda service. In addition, the Runtime API also provides endpoints for reporting failures. See the Lambda documentation for the full Runtime API reference.
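To make this loop concrete, the following Go sketch shows how a minimal custom runtime could consume these endpoints. The paths and headers are the documented Runtime API ones; handleEvent is a hypothetical stand-in for invoking the function handler, and error handling is reduced to a minimum.

package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
	"os"
)

// handleEvent is a hypothetical stand-in for invoking the function handler.
func handleEvent(event []byte) []byte {
	return event
}

func main() {
	api := os.Getenv("AWS_LAMBDA_RUNTIME_API") // e.g. "127.0.0.1:9001"
	base := fmt.Sprintf("http://%s/2018-06-01/runtime/invocation", api)
	for {
		// Block until the Lambda service delivers the next event.
		resp, err := http.Get(base + "/next")
		if err != nil {
			panic(err)
		}
		event, _ := io.ReadAll(resp.Body)
		resp.Body.Close()
		requestID := resp.Header.Get("Lambda-Runtime-Aws-Request-Id")

		result := handleEvent(event)

		// Return the handler result for this request ID to the Lambda service.
		url := fmt.Sprintf("%s/%s/response", base, requestID)
		post, err := http.Post(url, "application/json", bytes.NewReader(result))
		if err != nil {
			panic(err)
		}
		post.Body.Close()
	}
}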
How the Runtime API proxy approach works
The Runtime API proxy is a component that you can build to hook into the invocation workflow. It proxies requests and responses, allowing you to augment them, and control the workflow:

Runtime API proxy hooks
When the Lambda service creates a new execution environment, it starts by initializing the extensions attached to the function. The execution environment waits for all extensions to register with the Lambda service by calling the Extensions API /register endpoint, then proceeds to initialize the runtime. This allows you to start the Runtime API proxy HTTP listener during extension initialization, making it ready to serve the runtime requests.

Runtime API proxy flow
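As an illustration, here is a minimal Go sketch of that initialization sequence: the extension registers via the Extensions API, starts the proxy listener, and then polls for lifecycle events. The extension name, the port 9009, and the simple pass-through proxy are assumptions for this example.

package main

import (
	"fmt"
	"net/http"
	"net/http/httputil"
	"net/url"
	"os"
	"strings"
)

// startProxyListener starts a pass-through reverse proxy; a later section
// shows where inspection and blocking logic would be added.
func startProxyListener(listenAddr, runtimeAPI string) {
	target := &url.URL{Scheme: "http", Host: runtimeAPI}
	http.ListenAndServe(listenAddr, httputil.NewSingleHostReverseProxy(target))
}

func main() {
	// In the extension process, this still points to the real Runtime API endpoint.
	api := os.Getenv("AWS_LAMBDA_RUNTIME_API")

	// Register the extension for INVOKE and SHUTDOWN lifecycle events.
	// The extension name must match the name of the extension executable.
	registerURL := fmt.Sprintf("http://%s/2020-01-01/extension/register", api)
	req, _ := http.NewRequest(http.MethodPost, registerURL,
		strings.NewReader(`{"events": ["INVOKE", "SHUTDOWN"]}`))
	req.Header.Set("Lambda-Extension-Name", "runtime-api-proxy")
	req.Header.Set("Content-Type", "application/json")
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	extensionID := resp.Header.Get("Lambda-Extension-Identifier")
	resp.Body.Close()

	// Start the proxy HTTP listener now, during extension initialization,
	// so it is ready before the Lambda service initializes the runtime.
	go startProxyListener("127.0.0.1:9009", api)

	// Poll the Extensions API for lifecycle events for the rest of the
	// execution environment's lifetime.
	nextURL := fmt.Sprintf("http://%s/2020-01-01/extension/event/next", api)
	for {
		evtReq, _ := http.NewRequest(http.MethodGet, nextURL, nil)
		evtReq.Header.Set("Lambda-Extension-Identifier", extensionID)
		evt, err := http.DefaultClient.Do(evtReq)
		if err != nil {
			panic(err)
		}
		evt.Body.Close()
	}
}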
By default, the value of the AWS_LAMBDA_RUNTIME_API environment variable in the runtime process points to the Lambda Runtime API endpoint 127.0.0.1:9001. You can use a wrapper script to change this value so that the runtime sends its Runtime API requests to your proxy instead.
A wrapper script enables you to customize the runtime startup behavior of your Lambda function by letting you set configuration parameters that cannot be set through language-specific environment variables. You can add a wrapper script to your function by setting the AWS_LAMBDA_EXEC_WRAPPER environment variable. The following wrapper script assumes that the Runtime API Proxy is listening on port 9009.
#!/bin/bash
export AWS_LAMBDA_RUNTIME_API="127.0.0.1:9009"
exec "$@"
You can either add this export line to an existing wrapper script or create a new one.

Runtime API proxy example
The Runtime API Proxy is bootstrapped by the Lambda service when a new execution environment is created, and it is ready to proxy requests from the Lambda runtime to the Runtime API before the first invocation.
Implementing the Runtime API proxy logic
Amazon Web Services recommends you implement extensions using a programming language that compiles to a binary executable, such as Golang or Rust. This allows you to use the extension with any Lambda runtime. Extensions implemented in interpreted languages, such as JavaScript and Python, or languages that require additional virtual machines, such as Java and C#, can only be used with that specific runtime.
This diagram shows a scenario where both incoming events and outbound responses are processed by the extension. You can use this workflow for auditing event or response payloads, sanitizing them, or injecting additional properties. You can use it for scenarios like masking account numbers, stripping personally identifiable information (PII), or injecting observability headers.

Runtime API proxy logic
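A simplified Go sketch of this proxy logic is shown below. The sanitizeEvent and auditResponse functions are hypothetical hooks for your own inspection or augmentation logic, and error handling is kept minimal.

package main

import (
	"bytes"
	"io"
	"net/http"
	"net/http/httputil"
	"net/url"
	"os"
	"strings"
)

// sanitizeEvent and auditResponse are hypothetical hooks; replace them with
// your own masking, auditing, or augmentation logic.
func sanitizeEvent(event []byte) []byte   { return event }
func auditResponse(result []byte) []byte  { return result }

func main() {
	// In the extension process this still points to the real Runtime API;
	// only the runtime process sees the value overridden by the wrapper script.
	runtimeAPI := os.Getenv("AWS_LAMBDA_RUNTIME_API")
	passthrough := httputil.NewSingleHostReverseProxy(&url.URL{Scheme: "http", Host: runtimeAPI})
	client := &http.Client{}

	handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		target := "http://" + runtimeAPI + r.URL.Path
		switch {
		// The runtime asks for the next event: fetch it from the Lambda
		// service, inspect and sanitize the payload, then hand it over.
		case r.Method == http.MethodGet && strings.HasSuffix(r.URL.Path, "/invocation/next"):
			resp, err := client.Get(target)
			if err != nil {
				http.Error(w, err.Error(), http.StatusInternalServerError)
				return
			}
			event, _ := io.ReadAll(resp.Body)
			resp.Body.Close()
			event = sanitizeEvent(event)
			// Preserve invocation metadata headers, but let Go recompute
			// Content-Length in case the payload size changed.
			for k, v := range resp.Header {
				w.Header()[k] = v
			}
			w.Header().Del("Content-Length")
			w.WriteHeader(resp.StatusCode)
			w.Write(event)

		// The runtime returns the handler result: audit or augment it,
		// then forward it to the Lambda service and relay the status code.
		case r.Method == http.MethodPost && strings.HasSuffix(r.URL.Path, "/response"):
			result, _ := io.ReadAll(r.Body)
			resp, err := client.Post(target, "application/json", bytes.NewReader(auditResponse(result)))
			if err != nil {
				http.Error(w, err.Error(), http.StatusInternalServerError)
				return
			}
			defer resp.Body.Close()
			w.WriteHeader(resp.StatusCode)
			io.Copy(w, resp.Body)

		// Pass all other Runtime API requests (such as error reporting) through unchanged.
		default:
			passthrough.ServeHTTP(w, r)
		}
	})

	// Listen on the port the wrapper script points the runtime to.
	http.ListenAndServe("127.0.0.1:9009", handler)
}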
This diagram demonstrates an advanced scenario, where the first inbound event is identified as malicious, and rejected by the Runtime API proxy. The function handler is not invoked. The second event is not flagged as malicious, and is therefore forwarded to the handler for processing. You can use this workflow for security scenarios like runtime application protection.

Runtime API proxy security scenario
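The following Go sketch outlines how a proxy could implement this blocking behavior within its handling of the /invocation/next request, replacing the direct fetch shown in the previous sketch. The isMalicious function is a hypothetical placeholder for your own detection logic, and the error payload is an example, not a required format.

package proxy

import (
	"fmt"
	"io"
	"net/http"
	"strings"
)

// isMalicious is a hypothetical hook; plug in your own detection logic.
func isMalicious(event []byte) bool { return false }

// getCleanEvent keeps fetching events until one passes inspection.
func getCleanEvent(client *http.Client, runtimeAPI string) (http.Header, []byte, error) {
	base := "http://" + runtimeAPI + "/2018-06-01/runtime/invocation"
	for {
		resp, err := client.Get(base + "/next")
		if err != nil {
			return nil, nil, err
		}
		event, _ := io.ReadAll(resp.Body)
		resp.Body.Close()

		if !isMalicious(event) {
			// Clean event: return it (and its metadata headers) to the runtime.
			return resp.Header, event, nil
		}

		// Malicious event: reject it on behalf of the function via the
		// Runtime API error endpoint, then loop to fetch the next event.
		// The function handler is never invoked for this request.
		requestID := resp.Header.Get("Lambda-Runtime-Aws-Request-Id")
		errorURL := fmt.Sprintf("%s/%s/error", base, requestID)
		payload := `{"errorMessage":"Event rejected by Runtime API proxy","errorType":"EventBlocked"}`
		errResp, err := client.Post(errorURL, "application/json", strings.NewReader(payload))
		if err != nil {
			return nil, nil, err
		}
		errResp.Body.Close()
	}
}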
Amazon Web Services Partners using the Runtime API Proxy solution
“Using Lambda Runtime API proxy solution is a game-changing approach for us. It enables us to support multiple Lambda runtimes with a single implementation, provides comprehensive visibility into Lambda execution, and allows us to detect attackers targeting serverless applications,” says Julio Guerra, Engineering Manager, Application Security Management, Datadog.
“Lambda Runtime API proxy is a simple solution that gives us a pluggable way to protect Lambda Function URLs. We can implement request authorization and enrichment with no changes to function code,” says Ilya Zilber, Senior Manager, Solutions Engineering, Okta Inc.
Security best practices
Extensions run within the same execution environment as the function, so they have the same level of access to resources such as file system, networking, and environment variables. IAM permissions assigned to the function are shared with extensions. Our guidance is to assign the least required privileges to your functions.
Always install extensions from a trusted source only. Use Infrastructure as Code (IaC) tools to attach extensions to your functions consistently and to keep an auditable record of the extension versions you deploy.
When building extensions, do not log sensitive data. Sanitize payloads and metadata before logging or persisting them for audit purposes.
Considerations
The Runtime API proxy approach allows you to hook into the Lambda request/response workflow, enabling new security and observability use cases. There are several important considerations:
- This requires you to have a good understanding of the Lambda execution environment lifecycle and the Lambda Runtime API. You must implement proxying for all Runtime API endpoints and handle potential runtime failures.
- Prepare your extension for composability for scenarios in which more than one extension implements the Runtime API proxy pattern. Allow your extension consumers to configure the extension via environment variables using at least two parameters – the port your proxy listens on and the Runtime API endpoint your proxy forwards requests to. The latter should default to the original value of the AWS_LAMBDA_RUNTIME_API environment variable (see the configuration sketch after this list and the sample implementations below for details).
- Proxying API requests with default buffered responses requires additional work to support functions with response payload streaming.
- Proxying API requests adds latency. The added overhead depends on your implementation. Amazon Web Services recommends using programming languages that can be compiled to an executable binary, such as Rust and Golang, and keeping your extensions lightweight and optimized.
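As a sketch of the composability guidance above, an extension could read its configuration from environment variables and fall back to sensible defaults. The variable names LRAP_LISTENER_PORT and LRAP_TARGET_ENDPOINT are illustrative examples, not a standard.

package proxy

import "os"

// Config holds the two parameters an extension consumer should be able to set.
type Config struct {
	ListenerPort   string // port the proxy listens on for runtime requests
	TargetEndpoint string // Runtime API endpoint the proxy forwards requests to
}

func loadConfig() Config {
	cfg := Config{
		ListenerPort: "9009",
		// Default to the original Runtime API endpoint provided by the Lambda service.
		TargetEndpoint: os.Getenv("AWS_LAMBDA_RUNTIME_API"),
	}
	if p := os.Getenv("LRAP_LISTENER_PORT"); p != "" {
		cfg.ListenerPort = p
	}
	if t := os.Getenv("LRAP_TARGET_ENDPOINT"); t != "" {
		cfg.TargetEndpoint = t
	}
	return cfg
}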
Samples
You can find sample extensions implementing the Runtime API Proxy pattern in the Amazon Web Services samples on GitHub. Follow the instructions described in the README.md for a step-by-step tutorial on running the extension.
Conclusion
This post introduces and illustrates the Lambda Runtime API proxy pattern. You can use this pattern to hook into the Lambda function request and response workflow to intercept, process, audit, modify, and block inbound events and handler responses.
You can use this pattern to implement enhanced runtime security and governance scenarios, as well as scenarios from other domains. Amazon Web Services customers and partners can use this approach to add enhanced security and observability to Lambda functions without requiring code changes.
For more serverless learning resources, visit Serverless Land.