AWS re:Invent 2025 - Unleash Rust's potential on AWS (DEV307)
Rust at Scale on AWS
Amazon Aurora DSQL - 100% Rust
Amazon Aurora DSQL is a serverless, distributed SQL database built 100% in Rust
DSQL originally started as a 100% JVM/Kotlin project, but ran into garbage-collection pauses and latency problems
The Rust-based adjudicator component was 10x faster than the Kotlin version
The entire data plane was then rewritten in Rust, achieving roughly 10x the performance of the Kotlin version even before any optimization
Rust was chosen for its speed, safety, and ability to handle the latency-sensitive, globally distributed nature of the database
Other Rust Projects at AWS
Bottlerocket (AWS's Linux-based, container-focused operating system) is built in Rust
Firecracker (the microVM runtime) is also written in Rust
The AWS MCP (Model Context Protocol) server and CLI tool are built in Rust
Rust is the default choice for data plane components across many AWS services
Rust on AWS Lambda
Cargo Lambda makes it easy to develop, test, and deploy Rust-based Lambda functions
Rust Lambda functions can achieve cold starts of 1.2 seconds and warm starts of 4ms
Rust Lambda functions use minimal memory (29MB) and are faster than most other runtimes, including Python and JavaScript
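The Cargo Lambda workflow mentioned above might look like the following; these are cargo-lambda's standard subcommands, and the project name is illustrative:

```shell
# Scaffold, run locally, build, and deploy a Rust Lambda function
# (assumes cargo-lambda is installed; "my-function" is a made-up name)
cargo lambda new my-function          # scaffold a new Rust Lambda project
cargo lambda watch                    # run the function locally with hot reload
cargo lambda build --release --arm64  # cross-compile an optimized ARM64 binary
cargo lambda deploy                   # upload the binary to AWS Lambda
```

Building for ARM64 (Graviton) is a common choice for Rust Lambda functions, since the cross-compilation is handled by cargo-lambda.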
Rust for Front-end and Full-stack
Rust can be used to build front-end applications using technologies like HTMX
The presenter demonstrated a link shortener application built entirely in Rust, from the static website to the Lambda functions and database
Rust at DataDog
Serverless Observability with Rust
DataDog's legacy Go-based Lambda extension was replaced with a Rust version, reducing cold start times from 700-800ms to just 80ms
The Rust extension was able to achieve significantly lower post-runtime durations (from 60-140ms down to 500μs) by optimizing the flushing strategy
This allowed DataDog to remove 53,000 lines of Go code from their agent
Rust in the DataDog Agent
DataDog is rewriting parts of their agent's data plane in Rust, achieving a 25% reduction in CPU usage and 33% reduction in memory usage compared to the Go version
Rust's performance characteristics and memory management are well-suited for the high-volume, time-series data processing required in the agent
Rust in DataDog's Backend
DataDog's backend uses a distributed, time-series database powered by a custom Rust storage engine (Monle)
The Rust-based storage engine was able to handle 60x more write throughput (3.5 million points/second) compared to the previous Go-based implementation
Rust's low-level control and lack of garbage collection overhead were key to achieving this level of performance
Production Tips for Rust Developers
Linting and Error Handling
Use Clippy to lint your Rust code and enforce best practices, such as avoiding unwrap() calls
Treat Clippy warnings as failures to ensure they are addressed early in development
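One way to enforce both points is Cargo's lints table (available since Rust 1.74), which denies `unwrap()`/`expect()` project-wide; this is a sketch, and the exact lint set is a matter of team policy:

```toml
# Cargo.toml — make Clippy reject unwrap()/expect() at compile time
[lints.clippy]
unwrap_used = "deny"
expect_used = "deny"
```

In CI, running `cargo clippy --all-targets -- -D warnings` additionally turns every remaining Clippy warning into a build failure.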
Monitoring Rust Programs
Monitor for blocking tasks in your Tokio-based Rust programs, for example by exporting runtime metrics with the `metrics` crate
Leverage tools like the Tokio runtime's stats dump and the Tokio Console to debug performance issues
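Attaching Tokio Console typically looks like this (a sketch; it assumes the application calls `console_subscriber::init()` early in `main` and depends on the `console-subscriber` crate):

```shell
# Tokio's task instrumentation is behind an unstable cfg flag,
# so the app must be compiled with it enabled
RUSTFLAGS="--cfg tokio_unstable" cargo run --release

# In a second terminal, attach the console UI
# (connects to the default endpoint, 127.0.0.1:6669)
tokio-console
```

The console's task view flags tasks that have gone long stretches without yielding, which is usually the symptom of accidental blocking on the async runtime.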
Working with AWS SDKs
Carefully parse AWS SDK errors to handle specific error conditions, such as provisioned throughput exceeded exceptions
This allows for more intelligent application-level throttling and error handling
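A minimal, std-only sketch of that application-level handling: classify an error as retryable (e.g. throttling) or not, and retry with capped exponential backoff. The `DbError` enum and `is_retryable` classification are hypothetical stand-ins for the real AWS SDK error types, which you would match on instead:

```rust
use std::time::Duration;

// Hypothetical error type standing in for an AWS SDK error.
#[derive(Debug)]
enum DbError {
    Throttled, // e.g. provisioned throughput exceeded — retryable
    NotFound,  // not retryable
}

fn is_retryable(err: &DbError) -> bool {
    matches!(err, DbError::Throttled)
}

// Delay before retrying attempt `n` (0-based): 100ms, doubling, capped at 2s.
fn backoff_delay(attempt: u32) -> Duration {
    let ms = 100u64.saturating_mul(1u64 << attempt.min(10));
    Duration::from_millis(ms.min(2_000))
}

// Retry `op` on retryable errors only, up to `max_attempts` total calls.
fn call_with_retries<T, F>(mut op: F, max_attempts: u32) -> Result<T, DbError>
where
    F: FnMut() -> Result<T, DbError>,
{
    let mut attempt = 0;
    loop {
        match op() {
            Ok(v) => return Ok(v),
            Err(e) if is_retryable(&e) && attempt + 1 < max_attempts => {
                std::thread::sleep(backoff_delay(attempt));
                attempt += 1;
            }
            Err(e) => return Err(e), // non-retryable, or retries exhausted
        }
    }
}
```

In a real service the sleep would be a non-blocking `tokio::time::sleep`, and adding jitter to the delay avoids retry storms across clients.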
Key Takeaways
Rust has become a critical language for performance-sensitive components across AWS services, from databases to serverless functions
Rust's speed, safety, and concurrency features make it an ideal choice for building highly scalable, distributed systems
DataDog has seen significant benefits from adopting Rust, including improved observability, reduced resource usage, and higher throughput in their backend
Rust developers should leverage the ecosystem's tooling, such as Clippy and Tokio's monitoring capabilities, to ensure production-ready Rust applications