AWS re:Invent 2025 - Coding an MCP server for Amazon Aurora (DAT429)
Coding an MCP Server for Amazon Aurora
Overview
The presenters, Jim and Vlad, demonstrate how to build an MCP (Model Context Protocol) server that integrates with an Amazon Aurora Postgres database.
MCP is an open protocol for exposing backend data and tools to large language models (LLMs) and AI assistants, allowing them to interact with and query the data in a structured way.
The goal is to create an MCP server with two tools: "Get Movie Characters" and "Get Actor Roles", which will pull data from an Aurora Postgres database.
Data Model and Queries
The presenters use a sample IMDb-based data model with a titles table and related tables.
They start by writing SQL queries to retrieve movie character and actor role data, optimizing the queries for performance.
The queries leverage Postgres features like full-text search and CTEs (Common Table Expressions) to efficiently retrieve the desired data.
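As an illustration, a query of roughly this shape could back the "Get Movie Characters" tool. The table and column names below (titles, principals, name_basics) are assumptions based on the public IMDb dataset, not necessarily the exact schema used in the talk:

```python
# Illustrative only: titles, principals, and name_basics are assumed
# IMDb-style tables, not necessarily the schema used in the talk.
GET_MOVIE_CHARACTERS_SQL = """
WITH matched_title AS (
    -- CTE: resolve the user's free-text title to a single title_id
    -- via Postgres full-text search, preferring the most-voted match.
    SELECT t.title_id
    FROM titles t
    WHERE to_tsvector('english', t.primary_title)
          @@ plainto_tsquery('english', %(title)s)
    ORDER BY t.num_votes DESC NULLS LAST
    LIMIT 1
)
SELECT nb.primary_name AS actor,
       p.characters    AS character_names
FROM principals p
JOIN matched_title mt ON mt.title_id = p.title_id
JOIN name_basics  nb ON nb.person_id = p.person_id
WHERE p.category IN ('actor', 'actress');
"""
```

The CTE keeps the expensive full-text match to a single pass before the joins run, which is the kind of optimization the presenters walk through.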
The presenters emphasize the importance of adding descriptive comments to the database schema to provide context for the LLM.
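In Postgres, that context can be attached with COMMENT ON statements, which the LLM can read when it inspects the schema. A hedged sketch, with the same assumed table and column names as above:

```python
# Illustrative COMMENT ON statements; table/column names are assumptions
# based on an IMDb-style schema, not the talk's exact DDL.
SCHEMA_COMMENTS_SQL = """
COMMENT ON TABLE titles IS
    'One row per movie or TV title (IMDb-style sample data)';
COMMENT ON COLUMN titles.primary_title IS
    'Display title; matched via full-text search';
COMMENT ON COLUMN principals.characters IS
    'Character name(s) the person played in this title';
"""
```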
Building the MCP Server
The presenters use Kiro, Amazon's agentic AI IDE, to rapidly generate the MCP server code, starting from a pre-existing weather MCP server as a template.
The MCP server is implemented in Python and uses the psycopg3 library to connect to the Aurora Postgres database.
The server exposes the two tools ("Get Movie Characters" and "Get Actor Roles") by wrapping the SQL queries into the MCP server's API.
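The wrapping pattern might look like the sketch below. To keep it self-contained, the function accepts any DB-API-style connection; in the actual server it would be registered as an MCP tool (for example with the Python MCP SDK's tool decorator) and receive a psycopg connection. All names and the query shape here are illustrative, not the talk's exact code:

```python
# Illustrative sketch: in the real server this function would be
# registered as an MCP tool and a psycopg connection would be passed in.
GET_ACTOR_ROLES_SQL = """
SELECT t.primary_title AS title, p.characters AS character_names
FROM name_basics nb
JOIN principals p ON p.person_id = nb.person_id
JOIN titles t     ON t.title_id  = p.title_id
WHERE nb.primary_name = %(actor)s;
"""

def get_actor_roles(conn, actor: str) -> list[dict]:
    """Return the titles and character names for the given actor."""
    with conn.cursor() as cur:
        # Parameterized query: the actor name is never interpolated
        # into the SQL string, so LLM-supplied input cannot inject SQL.
        cur.execute(GET_ACTOR_ROLES_SQL, {"actor": actor})
        cols = [desc[0] for desc in cur.description]
        # Return JSON-friendly rows so the MCP layer can serialize them.
        return [dict(zip(cols, row)) for row in cur.fetchall()]
```

Returning plain dicts keeps the tool's output easy for the MCP layer to serialize back to the model.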
Securing the MCP Server
The presenters highlight the importance of securing the MCP server and the underlying database, as it exposes the database to external LLMs.
They demonstrate the use of Postgres Row-Level Security (RLS) to restrict access to the database based on user roles and permissions.
The MCP server is configured to set the appropriate Postgres session variables (which the RLS policies read) before executing the SQL queries, ensuring that only authorized rows are returned.
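One common pattern, sketched below with illustrative policy and variable names (the talk's exact policies may differ), is to gate rows on a transaction-local setting that the server sets before each tool query:

```python
# Illustrative RLS policy: rows in titles are visible only when the
# transaction-local setting app.region matches the row's region column.
# Policy, column, and variable names are assumptions, not the talk's DDL.
RLS_POLICY_SQL = """
ALTER TABLE titles ENABLE ROW LEVEL SECURITY;

CREATE POLICY titles_by_region ON titles
    USING (region = current_setting('app.region', true));
"""

def set_rls_context(cur, region: str) -> None:
    """Set the RLS session variable before running tool queries."""
    # set_config(..., true) scopes the value to the current transaction,
    # so pooled connections cannot leak another caller's context.
    cur.execute("SELECT set_config('app.region', %s, true)", (region,))
```

Scoping the setting to the transaction (the third argument to set_config) matters when connections are pooled, since a leftover session-level value would leak one caller's authorization context to the next.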
Integrating with the Client Application
The presenters show a simple Streamlit-based client application that interacts with the MCP server.
The client application demonstrates how to call the MCP server's tools to retrieve movie character and actor role data.
They highlight the importance of handling non-deterministic behavior from the LLM, such as the model attempting to access data it shouldn't have access to.
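A minimal client-side guard for that case might allow-list the tools the model is permitted to call, refusing anything it invents (names here are illustrative, not the talk's client code):

```python
# Illustrative guard: reject any tool call the model requests that the
# server does not actually expose. Tool names are assumptions.
ALLOWED_TOOLS = {"get_movie_characters", "get_actor_roles"}

def dispatch_tool_call(name: str, args: dict, handlers: dict):
    """Route an LLM-requested tool call, refusing unknown tools."""
    if name not in ALLOWED_TOOLS:
        # The model is non-deterministic; never trust it to stay
        # within the advertised tool set.
        raise PermissionError(f"Tool {name!r} is not exposed by this server")
    return handlers[name](**args)
```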
Key Takeaways
MCP provides a standard protocol for exposing backend data to LLMs and AI assistants, enabling them to interact with and query the data.
Careful query design and optimization are crucial for efficient data retrieval, especially when dealing with large datasets.
Securing the MCP server and the underlying database is essential to prevent unauthorized access and data exposure.
Integrating the MCP server with client applications requires handling non-deterministic behavior from the LLM and enforcing appropriate access controls.
The MCP server itself was implemented in Python with the psycopg (v3) library.
Business Impact
MCP enables organizations to leverage the power of LLMs and AI assistants to interact with and query their backend data, unlocking new use cases and opportunities.
Secure integration of MCP servers with existing applications and databases ensures that sensitive data is protected while still being accessible to authorized users and AI systems.
The ability to quickly build and deploy MCP servers using tools like Kiro can accelerate the development of AI-powered applications and services.
Examples
The presenters demonstrate the use of the "Get Movie Characters" and "Get Actor Roles" tools, which retrieve data from the Aurora Postgres database.
They show how to secure the MCP server using Postgres Row-Level Security, ensuring that only authorized users and roles can access the data.
The Streamlit-based client application showcases how to interact with the MCP server and handle non-deterministic behavior from the LLM.