With the advent of the AI era, traditional database interaction methods are undergoing revolutionary changes. Today I want to share with you a smart database service project built with Rust and Model Context Protocol (MCP) — RBDC-MCP.
This project demonstrates how to use the official Rust MCP SDK (`rmcp`) to build a database service that integrates seamlessly with Claude AI, allowing users to interact with databases through natural language.
Project repository: https://github.com/rbatis/rbdc-mcp
The project adopts the following key technologies:
```toml
[dependencies]
# Official MCP Rust SDK
rmcp = { git = "https://github.com/modelcontextprotocol/rust-sdk", branch = "main", features = ["server", "transport-io"] }
# Async runtime
tokio = { version = "1.0", features = ["full"] }
# RBDC database ecosystem
rbdc = { version = "4.6.0" }
rbdc-sqlite = { version = "4.6.0" }
rbdc-mysql = { version = "4.6.0" }
rbdc-pg = { version = "4.6.0" }
rbdc-mssql = { version = "4.6.0" }
rbdc-pool-fast = { version = "4.6.0" }
```
The project adopts a clear modular architecture:
- `main.rs` - Program entry point, responsible for argument parsing and service startup
- `db_manager.rs` - Database connection management and operation encapsulation
- `handler.rs` - MCP protocol handling and tool definitions

```rust
pub struct DatabaseManager {
    pool: Arc<FastPool>,
    db_type: DatabaseType,
}

impl DatabaseManager {
    pub fn new(url: &str) -> Result<Self> {
        let db_type = DatabaseType::from_url(url)?;
        let driver: Box<dyn Driver> = match db_type {
            DatabaseType::SQLite => Box::new(rbdc_sqlite::SqliteDriver {}),
            DatabaseType::MySQL => Box::new(rbdc_mysql::MysqlDriver {}),
            DatabaseType::PostgreSQL => Box::new(rbdc_pg::PgDriver {}),
            DatabaseType::MSSQL => Box::new(rbdc_mssql::MssqlDriver {}),
        };
        let manager = ConnectionManager::new(driver, url)?;
        let pool = FastPool::new(manager)?;
        Ok(Self {
            pool: Arc::new(pool),
            db_type,
        })
    }
}
```
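The `DatabaseType` enum and its `from_url` helper are not shown in the post; presumably the type is inferred from the URL scheme. A minimal sketch of what that could look like (an assumption for illustration, not the project's actual code):

```rust
#[derive(Debug, Clone, Copy)]
pub enum DatabaseType {
    SQLite,
    MySQL,
    PostgreSQL,
    MSSQL,
}

impl DatabaseType {
    /// Sketch: infer the database type from the connection URL scheme.
    pub fn from_url(url: &str) -> anyhow::Result<Self> {
        let ty = match url.split("://").next().unwrap_or_default() {
            "sqlite" => Self::SQLite,
            "mysql" => Self::MySQL,
            "postgres" | "postgresql" => Self::PostgreSQL,
            "mssql" | "sqlserver" => Self::MSSQL,
            other => anyhow::bail!("unsupported database URL scheme: {other}"),
        };
        Ok(ty)
    }
}
```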
Design highlights:

- Uses the `Driver` trait to unify interfaces for different databases
- Uses `FastPool` for high-performance connection pooling
- Selects the appropriate driver via the `DatabaseType` enum

The project uses `rmcp` library macros to define MCP tools:
```rust
#[tool(tool_box)]
impl RbdcDatabaseHandler {
    #[tool(description = "Execute SQL query and return results")]
    async fn sql_query(&self, #[tool(aggr)] SqlQueryParams { sql, params }: SqlQueryParams)
        -> Result<CallToolResult, McpError> {
        // Implementation logic
    }

    #[tool(description = "Execute SQL modification statements")]
    async fn sql_exec(&self, #[tool(aggr)] SqlExecParams { sql, params }: SqlExecParams)
        -> Result<CallToolResult, McpError> {
        // Implementation logic
    }

    #[tool(description = "Get database connection pool status")]
    async fn db_status(&self) -> Result<CallToolResult, McpError> {
        // Implementation logic
    }
}
```
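The `SqlQueryParams` and `SqlExecParams` types are not shown in the post; under the `rmcp` aggregated-parameter pattern they would be plain `serde`/`schemars` structs. A minimal sketch, with field types assumed from the handler signatures and the conversion helper shown next:

```rust
use schemars::JsonSchema;
use serde::Deserialize;
use serde_json::Value;

/// Sketch of the aggregated parameters for `sql_query` (assumed, not the
/// project's actual definition).
#[derive(Debug, Deserialize, JsonSchema)]
pub struct SqlQueryParams {
    /// SQL statement to execute.
    pub sql: String,
    /// Positional parameters bound to placeholders in the SQL.
    #[serde(default)]
    pub params: Vec<Value>,
}

/// Sketch of the aggregated parameters for `sql_exec` (assumed).
#[derive(Debug, Deserialize, JsonSchema)]
pub struct SqlExecParams {
    pub sql: String,
    #[serde(default)]
    pub params: Vec<Value>,
}
```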
The project cleverly handles conversion from JSON parameters to RBS parameters:
```rust
fn convert_params(&self, params: &[Value]) -> Vec<rbs::Value> {
    params.iter()
        .map(|v| serde_json::from_value(v.clone()).unwrap_or_default())
        .collect()
}
```
Scenario: When developing a new project in Cursor, you need to quickly set up an HTTP server and database.
AI Conversation Example:
Developer: "Help me create a user management HTTP API with user registration, login, and query functions"
AI: "I'll help you build a complete user management system. First, let me check the database structure..."
The AI will automatically inspect the database structure and generate the necessary code.
Generated Code Example:
```rust
#[derive(Debug, Serialize, Deserialize)]
pub struct User {
    pub id: i64,
    pub username: String,
    pub email: String,
    pub created_at: DateTime<Utc>,
}

#[derive(Debug, Deserialize)]
pub struct CreateUserRequest {
    pub username: String,
    pub email: String,
    pub password: String,
}

// Auto-generated HTTP handler function
async fn create_user(
    State(pool): State<Arc<Pool>>,
    Json(req): Json<CreateUserRequest>,
) -> Result<Json<User>, ApiError> {
    // AI-generated complete implementation
}
```
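The `ApiError` type in the handler above is left undefined. Assuming the generated code targets axum, as the `State` and `Json` extractors suggest, a minimal error type might look like this (purely illustrative):

```rust
use axum::{
    http::StatusCode,
    response::{IntoResponse, Response},
    Json,
};
use serde_json::json;

/// Minimal error type for the handlers above (illustrative; an AI-generated
/// project would define its own variants).
pub enum ApiError {
    NotFound,
    Internal(anyhow::Error),
}

impl IntoResponse for ApiError {
    fn into_response(self) -> Response {
        let (status, msg) = match self {
            ApiError::NotFound => (StatusCode::NOT_FOUND, "not found".to_string()),
            ApiError::Internal(e) => (StatusCode::INTERNAL_SERVER_ERROR, e.to_string()),
        };
        (status, Json(json!({ "error": msg }))).into_response()
    }
}
```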
Scenario: You have an existing database and need to generate the corresponding Rust structs.
AI Conversation:
Developer: "Generate Rust structs based on users table, including Serde serialization"
AI Execution:
1. Query table structure: "DESCRIBE users" or "PRAGMA table_info(users)"
2. Analyze field type mapping
3. Generate complete struct definitions
Auto-generated:
```rust
use serde::{Deserialize, Serialize};
use chrono::{DateTime, Utc};

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct User {
    pub id: i64,
    pub username: String,
    pub email: String,
    pub password_hash: String,
    pub is_active: bool,
    pub created_at: DateTime<Utc>,
    pub updated_at: Option<DateTime<Utc>>,
}

#[derive(Debug, Deserialize)]
pub struct CreateUser {
    pub username: String,
    pub email: String,
    pub password: String,
}

#[derive(Debug, Deserialize)]
pub struct UpdateUser {
    pub username: Option<String>,
    pub email: Option<String>,
    pub is_active: Option<bool>,
}
```
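The field-type mapping in step 2 is not spelled out in the post; in practice it follows conventions roughly like the following sketch (illustrative only, not part of rbdc-mcp):

```rust
/// Illustrative SQL-to-Rust type mapping of the kind applied in step 2 above.
/// Real mappings depend on the specific database and driver.
fn rust_type_for(sql_type: &str) -> &'static str {
    let t = sql_type.to_ascii_uppercase();
    if t.starts_with("BIGINT") || t.starts_with("INT") {
        "i64"
    } else if t.starts_with("BOOL") || t == "TINYINT(1)" {
        "bool"
    } else if t.starts_with("DOUBLE") || t.starts_with("FLOAT") || t.starts_with("REAL") {
        "f64"
    } else if t.starts_with("DATETIME") || t.starts_with("TIMESTAMP") {
        "DateTime<Utc>"
    } else {
        // VARCHAR, TEXT, CHAR, and anything unrecognized fall back to String.
        "String"
    }
}
```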
AI Conversation:
Developer: "Generate complete database access layer for User struct, including CRUD operations"
AI Auto-generates:
```rust
use anyhow::Result;
use rbdc::pool::Pool;

pub struct UserRepository {
    pool: Arc<Pool>,
}

impl UserRepository {
    pub fn new(pool: Arc<Pool>) -> Self {
        Self { pool }
    }

    pub async fn create(&self, user: CreateUser) -> Result<User> {
        let mut conn = self.pool.get().await?;
        let sql = r#"
            INSERT INTO users (username, email, password_hash, created_at)
            VALUES (?, ?, ?, ?)
            RETURNING id, username, email, password_hash, is_active, created_at, updated_at
        "#;
        let password_hash = hash_password(&user.password)?;
        let now = Utc::now();
        let result = conn.fetch_one(sql, vec![
            user.username.into(),
            user.email.into(),
            password_hash.into(),
            now.into(),
        ]).await?;
        Ok(User::from_row(result)?)
    }

    pub async fn find_by_id(&self, id: i64) -> Result<Option<User>> {
        // AI-generated complete implementation
    }

    pub async fn find_by_email(&self, email: &str) -> Result<Option<User>> {
        // AI-generated complete implementation
    }

    pub async fn update(&self, id: i64, updates: UpdateUser) -> Result<User> {
        // AI-generated dynamic SQL update logic
    }

    pub async fn delete(&self, id: i64) -> Result<bool> {
        // AI-generated deletion logic
    }
}
```
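A quick usage example of the generated repository, just to show the call shape (the values and the `demo` function are hypothetical):

```rust
// Hypothetical usage of the generated repository; values are placeholders.
async fn demo(repo: &UserRepository) -> anyhow::Result<()> {
    let user = repo
        .create(CreateUser {
            username: "alice".to_string(),
            email: "alice@example.com".to_string(),
            password: "s3cret".to_string(),
        })
        .await?;
    println!("created user #{} ({})", user.id, user.username);

    let found = repo.find_by_email("alice@example.com").await?;
    assert!(found.is_some());
    Ok(())
}
```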
Scenario: Review whether a database design is sound.
AI Conversation:
Developer: "Help me analyze whether this e-commerce database design is reasonable, check table relationships and indexes"
AI Auto-executes:
1. Query all table structures
2. Analyze foreign key relationships
3. Check index configurations
4. Provide optimization suggestions
AI Analysis Results:
Database Design Analysis Report:
✅ Advantages:
- Users table primary key design is reasonable
- Order table contains necessary timestamps
- Foreign key constraints are set correctly
⚠️ Optimization Suggestions:
- Recommend adding unique index for users.email field
- Recommend adding index for orders.created_at to improve query performance
- Recommend adding foreign key constraint for products.category_id
🔧 Recommended SQL Optimizations:
```sql
CREATE UNIQUE INDEX idx_users_email ON users(email);
CREATE INDEX idx_orders_created_at ON orders(created_at);
ALTER TABLE products ADD CONSTRAINT fk_products_category
FOREIGN KEY (category_id) REFERENCES categories(id);
```
AI Conversation:
Developer: "I need to add a phone field to the users table, generate migration script"
AI auto-generates migration script:
```sql
-- migrations/20241201_add_phone_to_users.sql
-- Up
ALTER TABLE users ADD COLUMN phone VARCHAR(20);
CREATE INDEX idx_users_phone ON users(phone);

-- Down
DROP INDEX idx_users_phone;
ALTER TABLE users DROP COLUMN phone;
```
AI Conversation:
Developer: "Analyze user registration trends for the past month, statistics by day"
AI auto-executes complex queries:
```sql
SELECT
DATE(created_at) as date,
COUNT(*) as registrations,
COUNT(*) - LAG(COUNT(*)) OVER (ORDER BY DATE(created_at)) as daily_change
FROM users
WHERE created_at >= DATE('now', '-30 days')
GROUP BY DATE(created_at)
ORDER BY date;
```
AI Conversation:
Developer: "Check slow queries and provide optimization suggestions"
AI Analysis:
1. Query execution plans
2. Identify full table scans
3. Recommend index optimizations
4. Generate optimized SQL
AI Conversation:
Developer: "Generate 100 test data records for users table"
AI auto-generates:
```sql
INSERT INTO users (username, email, password_hash, created_at) VALUES
('user_001', 'user001@example.com', '$2b$12$...', '2024-11-01 10:00:00'),
('user_002', 'user002@example.com', '$2b$12$...', '2024-11-01 11:00:00'),
-- ... continues to generate 98 more records
```
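An equivalent way to produce the same bulk insert programmatically, sketched in Rust (illustrative only, not generated by the tool; the function name is made up):

```rust
/// Build a bulk INSERT statement for `count` placeholder users.
fn build_test_data_sql(count: usize) -> String {
    let mut sql = String::from(
        "INSERT INTO users (username, email, password_hash, created_at) VALUES\n",
    );
    let rows: Vec<String> = (1..=count)
        .map(|i| {
            format!(
                "('user_{i:03}', 'user{i:03}@example.com', '$2b$12$placeholder', '2024-11-01 10:00:00')"
            )
        })
        .collect();
    sql.push_str(&rows.join(",\n"));
    sql.push(';');
    sql
}
```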
AI Conversation:
Developer: "Synchronize user data from MySQL to PostgreSQL"
AI generates synchronization script:
1. Read source database structure
2. Generate target database DDL
3. Create data conversion logic
4. Handle data type differences
Scenario: Validate consistency between API response data and database
AI Conversation:
Developer: "Validate whether /api/users/{id} endpoint returns data consistent with database"
AI automatically:
1. Query user data in database
2. Compare API response format
3. Verify field completeness
4. Check data type matching
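Steps 2 through 4 essentially boil down to a field-by-field comparison; an illustrative helper (field names are assumptions for this example) could look like:

```rust
use serde_json::Value;

/// Illustrative check that an API response and a database row agree on the
/// fields we care about.
fn assert_consistent(api_user: &Value, db_user: &Value) {
    for field in ["id", "username", "email", "created_at"] {
        assert_eq!(
            api_user.get(field),
            db_user.get(field),
            "field `{field}` differs between the API response and the database"
        );
    }
}
```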
On the implementation side, the project uses `schemars` to generate JSON Schema for the tool parameters. Error handling in the query tool converts results to pretty-printed JSON and maps failures to MCP errors:

```rust
match self.db_manager.execute_query(&sql, rbs_params).await {
    Ok(results) => {
        let json_str = serde_json::to_string_pretty(&results).map_err(|e| {
            McpError::internal_error(format!("Result serialization failed: {}", e), None)
        })?;
        Ok(CallToolResult::success(vec![Content::text(json_str)]))
    }
    Err(e) => Err(McpError::internal_error(format!("SQL query failed: {}", e), None)),
}
```
Installation:

```bash
cargo install --git https://github.com/rbatis/rbdc-mcp.git
```

Then add the server to your MCP client configuration (for example, Claude Desktop or Cursor):

```json
{
  "mcpServers": {
    "rbdc-mcp": {
      "command": "rbdc-mcp",
      "args": ["--database-url", "sqlite://./database.db"]
    }
  }
}
```
In Cursor, you can then interact with the database directly through the AI assistant without manually writing SQL, which is a significant efficiency improvement. The project demonstrates Rust's strengths for building MCP services, and there are still plenty of possible directions for future improvement.
I hope this project offers some reference and inspiration for anyone on the Rust + MCP development path! Especially in AI-driven development environments like Cursor, tools like this can greatly improve development efficiency. If you're also exploring database interaction in the AI era, I'd love to hear from you and exchange ideas!