AWS re:Invent 2025 - Powering Real-Time Applications with a Modernized Cache (API210)
Redis: An In-Memory Data Store
Redis is an in-memory data structure store used as a cache, session store, message broker, database, and more
Redis sees over 2.25 million Docker pulls per day and has been voted the most loved non-relational database by developers
Addressing Common Redis Challenges
Hot Keys
Hot keys occur when keys are accessed or written to very frequently
Redis addresses this with client-side caching: the client library keeps a local copy of hot keys, avoiding a network round trip on repeated reads
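The effect of client-side caching can be sketched as below. This is a minimal illustration, not the real protocol: a plain dict stands in for the Redis server, and invalidation is triggered manually, whereas Redis pushes invalidation messages to clients over RESP3 (redis-py exposes this via its `CacheConfig` option).

```python
# Minimal sketch of client-side caching: the client keeps a local copy of
# hot keys and only falls back to the server on a miss. A dict stands in
# for the Redis server; invalidate() simulates the RESP3 invalidation push
# the server sends when another client writes the key.
class CachingClient:
    def __init__(self, server: dict):
        self.server = server          # stand-in for the Redis server
        self.local = {}               # client-side cache of hot keys
        self.server_reads = 0         # counts simulated network round trips

    def get(self, key):
        if key in self.local:         # hot key served locally, no round trip
            return self.local[key]
        self.server_reads += 1        # simulated network round trip
        value = self.server.get(key)
        self.local[key] = value
        return value

    def invalidate(self, key):
        # In real Redis, the server pushes this when the key is written.
        self.local.pop(key, None)

client = CachingClient({"user:42": "Ada"})
for _ in range(1000):
    client.get("user:42")             # only the first read hits the server
print(client.server_reads)            # -> 1
```

A thousand reads of the hot key cost a single round trip; without the local copy, all thousand would hit the server.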
Secondary Indexing
Users were relying on sets and sorted sets in Redis for secondary indexing use cases
This also led to hot key issues, as the key space became imbalanced
Redis introduced the Redis Query Engine to enable schema-based indexing of fields in hashes and JSON documents, allowing complex queries without hot keys
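Schema-based indexing with the Query Engine looks roughly like the commands below; the index name `idx:users` and the fields are hypothetical examples, not from the talk:

```
FT.CREATE idx:users ON HASH PREFIX 1 user: SCHEMA name TEXT age NUMERIC city TAG
FT.SEARCH idx:users "@city:{London} @age:[30 40]"
```

The query is resolved through the index rather than through hand-maintained sets, so no single set key has to absorb all the lookup traffic.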
Unlocking New Use Cases
Session Store and Feature Store
The Redis Query Engine enables new use cases, such as querying how many users have a specific item in their shopping basket
It also powers Redis' vector search capabilities, enabling fast vector similarity search
This powers use cases like a "semantic caching" solution that caches responses from large language models
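The semantic-caching idea can be sketched as follows. This is a toy, assumption-laden version: `embed()` is a stand-in character-frequency "embedding" rather than a real embedding model, and a list replaces what would be a Redis vector index queried via KNN search.

```python
import math

# Toy semantic cache: if a new prompt's embedding is close enough to a
# cached prompt's embedding, reuse the cached LLM response instead of
# calling the model again. embed() is a hypothetical stand-in; a real
# system would use an embedding model plus Redis vector search.
def embed(text: str) -> list[float]:
    vec = [0.0] * 26                      # character-frequency "embedding"
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

cache: list[tuple[list[float], str]] = []  # (embedding, cached LLM response)

def semantic_get(prompt: str, threshold: float = 0.95):
    q = embed(prompt)
    for emb, response in cache:
        if cosine(q, emb) >= threshold:
            return response                # cache hit: skip the LLM call
    return None                            # miss: caller invokes the model

def semantic_put(prompt: str, response: str):
    cache.append((embed(prompt), response))

semantic_put("what is the capital of france", "Paris")
print(semantic_get("what is the capital of France?"))  # near-identical prompt -> "Paris"
```

The point is that the cache key is similarity in embedding space, not exact string equality, so rephrased prompts can still hit the cache.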
Modernizing the Cache
Cache Aside Pattern
The cache-aside pattern involves caching data read from a relational database in Redis
This can lead to stale data and "thundering herd" issues when the cache expires
Customers are moving towards a "refresh ahead" or "prefetch" cache pattern instead
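The difference between the two patterns can be sketched as below. This is a simplified, synchronous version under assumed names (`RefreshAheadCache`, a hypothetical `loader` callback standing in for the database query); a production refresh-ahead cache would refresh in the background rather than on the read path.

```python
import time

# Sketch of a refresh-ahead cache: instead of letting an entry expire and
# forcing every caller to reload at once (the "thundering herd"), an entry
# past a refresh threshold is reloaded proactively, before its TTL is hit.
class RefreshAheadCache:
    def __init__(self, loader, ttl: float, refresh_after: float):
        assert refresh_after < ttl
        self.loader = loader          # stand-in for the database query
        self.ttl = ttl
        self.refresh_after = refresh_after
        self.store = {}               # key -> (value, loaded_at)

    def get(self, key, now=None):
        now = time.monotonic() if now is None else now
        entry = self.store.get(key)
        if entry is None or now - entry[1] >= self.ttl:
            value = self.loader(key)  # cold miss or fully expired: must load
        elif now - entry[1] >= self.refresh_after:
            value = self.loader(key)  # entry still valid, but refresh ahead
                                      # of expiry (asynchronously in practice)
        else:
            return entry[0]           # fresh: serve from cache
        self.store[key] = (value, now)
        return value

loads = []
cache = RefreshAheadCache(lambda k: loads.append(k) or f"v:{k}",
                          ttl=10, refresh_after=5)
cache.get("a", now=0)                 # cold miss: load
cache.get("a", now=1)                 # fresh: cache hit
cache.get("a", now=6)                 # past threshold: refreshed early
print(len(loads))                     # -> 2
```

Because the reload happens before expiry, readers never see a window where the key is simply gone, which is what triggers the herd in plain cache-aside.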
Redis Data Integration (RDI)
RDI is a low-code solution for synchronizing data from external data sources (e.g. Oracle) into the appropriate Redis data structures
This allows customers to move all their data into Redis, not just a subset, and keep it continuously synchronized
Redis Flex: Auto-Tiering for Scale and Cost
Redis Flex extends the key space onto slower storage tiers (SSD, NVMe), enabling databases of up to 50 TB
This reduces costs by up to 75% compared to an in-memory-only deployment
Key use cases include feature stores, where RDI keeps the feature data continuously in sync
Native JSON Support and Vector Compression
Redis added native support for the JSON data structure, enabling atomic updates and partial retrieval of nested documents
This also enables indexing of JSON fields with the Redis Query Engine, including vector search capabilities
Redis introduced vector compression for homogeneous numeric arrays, reducing memory usage by up to 92% compared to storing vectors in JSON
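A back-of-the-envelope illustration of why packed numeric storage beats a JSON array for vectors: the same 768-dimensional float vector is serialized both ways below. This only shows the order of magnitude of the savings; the 92% figure from the talk refers to Redis' own compression of homogeneous arrays, not to this exact comparison.

```python
import array
import json
import random

# One 768-dimensional vector, as an embedding model might produce.
vec = [random.uniform(-1, 1) for _ in range(768)]

as_json = json.dumps(vec).encode()        # text: ~18-20 bytes per float
as_f32 = array.array("f", vec).tobytes()  # packed float32: 4 bytes per float

print(len(as_json), len(as_f32))          # e.g. roughly 15000 vs 3072
print(1 - len(as_f32) / len(as_json))     # fraction of memory saved
```

Each float costs exactly 4 bytes when packed, versus a full decimal text representation plus delimiters in JSON, so the savings grow with vector dimensionality and precision.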
Redis 8.4: The Fastest Redis Ever
Redis 8.4 includes significant performance improvements: scaling operations are up to 40% faster and latency for common operations is up to 90% lower
It also includes optimizations for TLS connections, delivering twice the performance of ElastiCache on the same hardware
These improvements make Redis 8.4 the fastest Redis release to date
Real-World Redis Applications at iFood
iFood, a major food delivery platform in Latin America, uses Redis in several critical applications:
GenAI: a platform that centralizes access to AI models, using Redis for rate limiting
RAG: a solution that adds context to language models by storing and retrieving relevant content from Redis
Feature store: low-latency features stored in Redis to power recommendations and other ML models
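The rate-limiting pattern mentioned for the GenAI platform is commonly built on Redis as a fixed-window counter (INCR plus EXPIRE, or a Lua script). The sketch below is an assumed implementation of that general pattern with a dict standing in for Redis, not iFood's actual code:

```python
# Fixed-window rate limiter: one counter per (client, time window),
# incremented on every request and checked against the limit. In Redis the
# counter would be a key like "rl:{client}:{window_start}" updated with
# INCR and expired automatically; here a dict stands in for the server.
class FixedWindowRateLimiter:
    def __init__(self, limit: int, window_seconds: int):
        self.limit = limit
        self.window = window_seconds
        self.counters = {}            # (client, window_start) -> count

    def allow(self, client: str, now: float) -> bool:
        window_start = int(now // self.window)   # bucket for this window
        key = (client, window_start)
        count = self.counters.get(key, 0) + 1    # INCR
        self.counters[key] = count
        return count <= self.limit

limiter = FixedWindowRateLimiter(limit=3, window_seconds=60)
results = [limiter.allow("model-A", now=10) for _ in range(4)]
print(results)                        # -> [True, True, True, False]
```

Because Redis can apply the increment atomically and server-side, every instance of the gateway sees the same counter, which is what makes it a good fit for centralized rate limiting.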
Key Takeaways
Redis has evolved to address common challenges like hot keys and secondary indexing, unlocking new use cases
Customers are moving towards a "refresh ahead" cache pattern, with Redis Data Integration (RDI) enabling seamless synchronization
Redis Flex provides auto-tiering for larger, more cost-effective Redis deployments
Redis 8.4 is the fastest Redis release yet, with significant performance improvements
Real-world examples like iFood demonstrate Redis' ability to power mission-critical applications at scale