Building Smart Database Services with Rust + MCP: RBDC-MCP Practice Sharing

⚓ Rust    📅 2025-06-22    👤 surdeus


Introduction

With the advent of the AI era, traditional ways of interacting with databases are undergoing a fundamental shift. Today I want to share a smart database service project built with Rust and the Model Context Protocol (MCP): RBDC-MCP.

This project demonstrates how to use Rust's official MCP SDK (rmcp) to build a database service that seamlessly integrates with Claude AI, allowing users to interact with databases through natural language.

Project Repository: GitHub - rbatis/rbdc-mcp: rbdc-mcp-server

:building_construction: In-depth Technical Architecture Analysis

1. Core Dependencies and Technology Stack

The project adopts the following key technologies:

```toml
[dependencies]
# Official MCP Rust SDK
rmcp = { git = "https://github.com/modelcontextprotocol/rust-sdk",
         branch = "main", features = ["server", "transport-io"] }

# Async runtime
tokio = { version = "1.0", features = ["full"] }

# RBDC database ecosystem
rbdc = { version = "4.6.0" }
rbdc-sqlite = { version = "4.6.0" }
rbdc-mysql = { version = "4.6.0" }
rbdc-pg = { version = "4.6.0" }
rbdc-mssql = { version = "4.6.0" }
rbdc-pool-fast = { version = "4.6.0" }
```

2. Modular Design

The project adopts a clear modular architecture:

  • main.rs - Program entry point, responsible for argument parsing and service startup
  • db_manager.rs - Database connection management and operation encapsulation
  • handler.rs - MCP protocol handling and tool definitions

3. Database Manager Design

```rust
pub struct DatabaseManager {
    pool: Arc<FastPool>,
    db_type: DatabaseType,
}

impl DatabaseManager {
    pub fn new(url: &str) -> Result<Self> {
        let db_type = DatabaseType::from_url(url)?;

        let driver: Box<dyn Driver> = match db_type {
            DatabaseType::SQLite => Box::new(rbdc_sqlite::SqliteDriver {}),
            DatabaseType::MySQL => Box::new(rbdc_mysql::MysqlDriver {}),
            DatabaseType::PostgreSQL => Box::new(rbdc_pg::PgDriver {}),
            DatabaseType::MSSQL => Box::new(rbdc_mssql::MssqlDriver {}),
        };

        let manager = ConnectionManager::new(driver, url)?;
        let pool = FastPool::new(manager)?;

        Ok(Self {
            pool: Arc::new(pool),
            db_type,
        })
    }
}
```

Design highlights:

  • Unified Abstraction: Using Driver trait to unify interfaces for different databases
  • Connection Pool Management: Using FastPool for high-performance connection pooling
  • Type Safety: Ensuring type safety through DatabaseType enum
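The snippet above calls `DatabaseType::from_url(url)` without showing it. Here is a minimal, std-only sketch of what such a scheme dispatcher might look like; the variant names come from the snippet, but the parsing logic and the `String` error type are my assumptions, not the project's actual code:

```rust
// Hypothetical sketch of DatabaseType::from_url: dispatch on the
// connection-string scheme (the part before "://").
#[derive(Debug, PartialEq)]
pub enum DatabaseType {
    SQLite,
    MySQL,
    PostgreSQL,
    MSSQL,
}

impl DatabaseType {
    pub fn from_url(url: &str) -> Result<Self, String> {
        // Everything before "://" is treated as the scheme.
        let scheme = url.split("://").next().unwrap_or("");
        match scheme {
            "sqlite" => Ok(DatabaseType::SQLite),
            "mysql" => Ok(DatabaseType::MySQL),
            "postgres" | "postgresql" => Ok(DatabaseType::PostgreSQL),
            "mssql" | "sqlserver" => Ok(DatabaseType::MSSQL),
            other => Err(format!("unsupported database scheme: {other}")),
        }
    }
}

fn main() {
    assert_eq!(DatabaseType::from_url("sqlite://./db.sqlite"), Ok(DatabaseType::SQLite));
    assert_eq!(DatabaseType::from_url("postgres://u:p@host/db"), Ok(DatabaseType::PostgreSQL));
    assert!(DatabaseType::from_url("oracle://host/db").is_err());
    println!("ok");
}
```

Dispatching on the scheme is what makes "switching databases only requires changing the connection string" work.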

:hammer_and_wrench: Detailed MCP Tool Implementation

1. Tool Definition

The project uses rmcp library macros to define MCP tools:

```rust
#[tool(tool_box)]
impl RbdcDatabaseHandler {
    #[tool(description = "Execute SQL query and return results")]
    async fn sql_query(&self, #[tool(aggr)] SqlQueryParams { sql, params }: SqlQueryParams)
        -> Result<CallToolResult, McpError> {
        // Implementation logic
    }

    #[tool(description = "Execute SQL modification statements")]
    async fn sql_exec(&self, #[tool(aggr)] SqlExecParams { sql, params }: SqlExecParams)
        -> Result<CallToolResult, McpError> {
        // Implementation logic
    }

    #[tool(description = "Get database connection pool status")]
    async fn db_status(&self) -> Result<CallToolResult, McpError> {
        // Implementation logic
    }
}
```

2. Parameter Conversion Handling

The project handles the conversion from JSON parameters to RBS parameters concisely:

```rust
fn convert_params(&self, params: &[Value]) -> Vec<rbs::Value> {
    params.iter()
        .map(|v| serde_json::from_value(v.clone()).unwrap_or_default())
        .collect()
}
```

:light_bulb: Rich Real-world Application Scenarios

:bullseye: Development Scenarios in Cursor

1. Building Rust HTTP Server from Scratch

Scenario: When developing new projects in Cursor, need to quickly set up HTTP server and database

AI Conversation Example:

Developer: "Help me create a user management HTTP API with user registration, login, and query functions"

AI: "I'll help you build a complete user management system. First, let me check the database structure..."

AI Will Automatically:

  • Query existing database table structures
  • Analyze field types and constraints
  • Generate corresponding Rust structs
  • Create complete Axum/Actix-web routes
  • Implement database operation functions

Generated Code Example:

```rust
#[derive(Debug, Serialize, Deserialize)]
pub struct User {
    pub id: i64,
    pub username: String,
    pub email: String,
    pub created_at: DateTime<Utc>,
}

#[derive(Debug, Deserialize)]
pub struct CreateUserRequest {
    pub username: String,
    pub email: String,
    pub password: String,
}

// Auto-generated HTTP handler function
async fn create_user(
    State(pool): State<Arc<Pool>>,
    Json(req): Json<CreateUserRequest>
) -> Result<Json<User>, ApiError> {
    // AI-generated complete implementation
}
```

2. Smart Database Struct Generation

Scenario: Existing database, need to generate corresponding Rust structs

AI Conversation:

Developer: "Generate Rust structs based on users table, including Serde serialization"

AI Execution: 
1. Query table structure: "DESCRIBE users" or "PRAGMA table_info(users)"
2. Analyze field type mapping
3. Generate complete struct definitions
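Step 2, the field-type mapping, can be sketched with std only. The mapping table below is illustrative; a real driver exposes richer column metadata and the project's actual rules may differ:

```rust
// Illustrative SQL-to-Rust type mapping, as an assistant might apply
// when generating structs from a table definition. Hypothetical helper.
fn sql_type_to_rust(sql_type: &str, nullable: bool) -> String {
    let base = match sql_type.to_ascii_uppercase().as_str() {
        "BIGINT" | "INTEGER" | "INT" => "i64",
        "VARCHAR" | "TEXT" | "CHAR" => "String",
        "BOOLEAN" => "bool",
        "DOUBLE" | "REAL" | "FLOAT" => "f64",
        "DATETIME" | "TIMESTAMP" => "DateTime<Utc>",
        other => return format!("/* unmapped: {other} */ String"),
    };
    if nullable { format!("Option<{base}>") } else { base.to_string() }
}

fn main() {
    assert_eq!(sql_type_to_rust("BIGINT", false), "i64");
    assert_eq!(sql_type_to_rust("timestamp", true), "Option<DateTime<Utc>>");
    println!("ok");
}
```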

Auto-generated:

```rust
use serde::{Deserialize, Serialize};
use chrono::{DateTime, Utc};

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct User {
    pub id: i64,
    pub username: String,
    pub email: String,
    pub password_hash: String,
    pub is_active: bool,
    pub created_at: DateTime<Utc>,
    pub updated_at: Option<DateTime<Utc>>,
}

#[derive(Debug, Deserialize)]
pub struct CreateUser {
    pub username: String,
    pub email: String,
    pub password: String,
}

#[derive(Debug, Deserialize)]
pub struct UpdateUser {
    pub username: Option<String>,
    pub email: Option<String>,
    pub is_active: Option<bool>,
}
```

3. Auto-generated Database Access Layer

AI Conversation:

Developer: "Generate complete database access layer for User struct, including CRUD operations"

AI Auto-generates:

```rust
use std::sync::Arc;

use anyhow::Result;
use rbdc::pool::Pool;

pub struct UserRepository {
    pool: Arc<Pool>,
}

impl UserRepository {
    pub fn new(pool: Arc<Pool>) -> Self {
        Self { pool }
    }

    pub async fn create(&self, user: CreateUser) -> Result<User> {
        let mut conn = self.pool.get().await?;
        let sql = r#"
            INSERT INTO users (username, email, password_hash, created_at)
            VALUES (?, ?, ?, ?)
            RETURNING id, username, email, password_hash, is_active, created_at, updated_at
        "#;

        let password_hash = hash_password(&user.password)?;
        let now = Utc::now();

        let result = conn.fetch_one(sql, vec![
            user.username.into(),
            user.email.into(),
            password_hash.into(),
            now.into(),
        ]).await?;

        Ok(User::from_row(result)?)
    }

    pub async fn find_by_id(&self, id: i64) -> Result<Option<User>> {
        // AI-generated complete implementation
    }

    pub async fn find_by_email(&self, email: &str) -> Result<Option<User>> {
        // AI-generated complete implementation
    }

    pub async fn update(&self, id: i64, updates: UpdateUser) -> Result<User> {
        // AI-generated dynamic SQL update logic
    }

    pub async fn delete(&self, id: i64) -> Result<bool> {
        // AI-generated deletion logic
    }
}
```
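The `update` placeholder hints at dynamic SQL built from the `UpdateUser` optional fields. One way that logic might look, as a std-only sketch (the helper name and string-typed bind values are simplifications for illustration):

```rust
// Hypothetical helper: build an UPDATE statement from optional fields,
// returning the SQL and the values to bind, in order.
struct UpdateUser {
    username: Option<String>,
    email: Option<String>,
    is_active: Option<bool>,
}

fn build_update_sql(id: i64, u: &UpdateUser) -> Option<(String, Vec<String>)> {
    let mut sets = Vec::new();
    let mut binds = Vec::new();
    if let Some(v) = &u.username {
        sets.push("username = ?");
        binds.push(v.clone());
    }
    if let Some(v) = &u.email {
        sets.push("email = ?");
        binds.push(v.clone());
    }
    if let Some(v) = u.is_active {
        sets.push("is_active = ?");
        binds.push(v.to_string());
    }
    if sets.is_empty() {
        return None; // nothing to update
    }
    binds.push(id.to_string()); // WHERE clause bind comes last
    Some((format!("UPDATE users SET {} WHERE id = ?", sets.join(", ")), binds))
}

fn main() {
    let u = UpdateUser { username: Some("alice".into()), email: None, is_active: Some(true) };
    let (sql, binds) = build_update_sql(7, &u).unwrap();
    assert_eq!(sql, "UPDATE users SET username = ?, is_active = ? WHERE id = ?");
    assert_eq!(binds, vec!["alice".to_string(), "true".to_string(), "7".to_string()]);
    println!("ok");
}
```

Returning `None` when no fields are set avoids emitting an invalid `UPDATE users SET WHERE ...` statement.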

:magnifying_glass_tilted_left: Database Design and Validation Scenarios

4. Database Design Validation

Scenario: Check the rationality of database design

AI Conversation:

Developer: "Help me analyze whether this e-commerce database design is reasonable, check table relationships and indexes"

AI Auto-executes:
1. Query all table structures
2. Analyze foreign key relationships
3. Check index configurations
4. Provide optimization suggestions

AI Analysis Results:

Database Design Analysis Report:

✅ Advantages:
- Users table primary key design is reasonable
- Order table contains necessary timestamps
- Foreign key constraints are set correctly

⚠️ Optimization Suggestions:
- Recommend adding unique index for users.email field
- Recommend adding index for orders.created_at to improve query performance
- Recommend adding foreign key constraint for products.category_id

🔧 Recommended SQL Optimizations:
```sql
CREATE UNIQUE INDEX idx_users_email ON users(email);
CREATE INDEX idx_orders_created_at ON orders(created_at);
ALTER TABLE products ADD CONSTRAINT fk_products_category
    FOREIGN KEY (category_id) REFERENCES categories(id);
```

5. Database Migration Script Generation

AI Conversation:

Developer: "I need to add a phone field to the users table, generate migration script"

AI auto-generates migration script:
```sql
-- migrations/20241201_add_phone_to_users.sql
-- Up
ALTER TABLE users ADD COLUMN phone VARCHAR(20);
CREATE INDEX idx_users_phone ON users(phone);

-- Down
DROP INDEX idx_users_phone;
ALTER TABLE users DROP COLUMN phone;
```
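Applying such a script means separating its two halves. A std-only sketch of that parsing, keyed on the `-- Up` / `-- Down` marker convention shown above (the helper name is hypothetical):

```rust
// Split a migration file into Up and Down sections by marker lines.
#[derive(Clone, Copy)]
enum Section { None, Up, Down }

fn split_migration(src: &str) -> (String, String) {
    let (mut up, mut down) = (String::new(), String::new());
    let mut section = Section::None;
    for line in src.lines() {
        match line.trim() {
            t if t.eq_ignore_ascii_case("-- up") => section = Section::Up,
            t if t.eq_ignore_ascii_case("-- down") => section = Section::Down,
            _ => match section {
                Section::Up => { up.push_str(line); up.push('\n'); }
                Section::Down => { down.push_str(line); down.push('\n'); }
                Section::None => {} // lines before any marker are ignored
            },
        }
    }
    (up, down)
}

fn main() {
    let src = "-- Up\nALTER TABLE users ADD COLUMN phone VARCHAR(20);\n-- Down\nALTER TABLE users DROP COLUMN phone;\n";
    let (up, down) = split_migration(src);
    assert!(up.contains("ADD COLUMN phone"));
    assert!(down.contains("DROP COLUMN phone"));
    println!("ok");
}
```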

:bar_chart: Data Analysis and Reporting Scenarios

6. Smart Data Analysis

AI Conversation:

Developer: "Analyze user registration trends for the past month, statistics by day"

AI auto-executes complex queries:
```sql
SELECT
    DATE(created_at) as date,
    COUNT(*) as registrations,
    COUNT(*) - LAG(COUNT(*)) OVER (ORDER BY DATE(created_at)) as daily_change
FROM users
WHERE created_at >= DATE('now', '-30 days')
GROUP BY DATE(created_at)
ORDER BY date;
```

7. Performance Monitoring and Optimization

AI Conversation:

Developer: "Check slow queries and provide optimization suggestions"

AI Analysis:
1. Query execution plans
2. Identify full table scans
3. Recommend index optimizations
4. Generate optimized SQL

:test_tube: Test Data Generation Scenarios

8. Automatic Test Data Generation

AI Conversation:

Developer: "Generate 100 test data records for users table"

AI auto-generates:
```sql
INSERT INTO users (username, email, password_hash, created_at) VALUES
('user_001', 'user001@example.com', '$2b$12$...', '2024-11-01 10:00:00'),
('user_002', 'user002@example.com', '$2b$12$...', '2024-11-01 11:00:00'),
-- ... continues to generate 98 more records
```
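Producing those rows programmatically is straightforward. A hypothetical std-only generator that emits the same multi-row INSERT shape (the placeholder `<hash>` stands in for a real password hash):

```rust
// Generate n synthetic user rows as one multi-row INSERT statement.
fn generate_user_inserts(n: usize) -> String {
    let rows: Vec<String> = (1..=n)
        .map(|i| format!(
            "('user_{i:03}', 'user{i:03}@example.com', '<hash>', '2024-11-01 10:00:00')"
        ))
        .collect();
    format!(
        "INSERT INTO users (username, email, password_hash, created_at) VALUES\n{};",
        rows.join(",\n")
    )
}

fn main() {
    let sql = generate_user_inserts(3);
    assert!(sql.starts_with("INSERT INTO users"));
    assert!(sql.contains("user_002"));
    assert_eq!(sql.matches("@example.com").count(), 3);
    println!("ok");
}
```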

:counterclockwise_arrows_button: Data Synchronization and Transformation Scenarios

9. Inter-database Synchronization

AI Conversation:

Developer: "Synchronize user data from MySQL to PostgreSQL"

AI generates synchronization script:
1. Read source database structure
2. Generate target database DDL
3. Create data conversion logic
4. Handle data type differences

10. API Data Validation

Scenario: Validate consistency between API response data and database

AI Conversation:

Developer: "Validate whether /api/users/{id} endpoint returns data consistent with database"

AI automatically:
1. Query user data in database
2. Compare API response format
3. Verify field completeness
4. Check data type matching

:shield: Security Check Scenarios

11. SQL Injection Detection

AI Assistance:

  • Analyze SQL query security
  • Check parameterized query usage
  • Identify potential injection risks
  • Provide secure coding suggestions
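As a toy illustration of the first two bullets: one naive heuristic flags queries that inline quoted string literals instead of binding values through `?` placeholders. Real analysis needs a proper SQL parser; this helper is purely hypothetical:

```rust
// Naive check: a query embedding a single-quoted literal may be
// concatenating user input instead of using bound parameters.
// False positives/negatives abound; for illustration only.
fn has_inline_literal(sql: &str) -> bool {
    sql.contains('\'')
}

fn main() {
    assert!(!has_inline_literal("SELECT * FROM users WHERE email = ?"));
    assert!(has_inline_literal("SELECT * FROM users WHERE email = 'a@b.com'"));
    println!("ok");
}
```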

12. Data Sensitivity Analysis

AI Analysis:

  • Identify fields containing sensitive information
  • Recommend data encryption strategies
  • Check access permission settings
  • Provide compliance suggestions

:rocket: Technical Highlights and Innovations

1. Multi-database Unified Support

  • Supports SQLite, MySQL, PostgreSQL, MSSQL
  • Unified API interface, switching databases only requires changing connection string

2. High-performance Async Processing

  • Based on Tokio async runtime
  • Uses connection pooling to avoid frequent connection overhead
  • Supports concurrent request processing

3. Type-safe Parameter Handling

  • Uses schemars to generate JSON Schema
  • Strongly typed parameter definitions
  • Automatic parameter validation

4. Elegant Error Handling

```rust
match self.db_manager.execute_query(&sql, rbs_params).await {
    Ok(results) => {
        let json_str = serde_json::to_string_pretty(&results)
            .map_err(|e| McpError::internal_error(format!("Result serialization failed: {}", e), None))?;
        Ok(CallToolResult::success(vec![Content::text(json_str)]))
    }
    Err(e) => Err(McpError::internal_error(format!("SQL query failed: {}", e), None))
}
```

:wrench: Deployment and Usage

1. Compile and Install

```bash
cargo install --git https://github.com/rbatis/rbdc-mcp.git
```

2. Claude Desktop Configuration

```json
{
  "mcpServers": {
    "rbdc-mcp": {
      "command": "rbdc-mcp",
      "args": ["--database-url", "sqlite://./database.db"]
    }
  }
}
```

3. Cursor IDE Configuration

In Cursor, you can directly interact with databases through AI assistant without manually writing SQL.

:bullseye: Actual Development Efficiency Improvement

Using RBDC-MCP brings significant efficiency gains:

  1. Cuts boilerplate-writing time by roughly 90%
  2. Auto-generates type-safe database operation code
  3. Smart validation of database designs, with optimization suggestions
  4. One-click generation of test data and migration scripts
  5. Real-time performance monitoring and optimization suggestions

:bullseye: Summary and Future Outlook

This project demonstrates Rust's powerful capabilities in building MCP services:

  1. Excellent Performance: Rust's zero-cost abstractions and memory safety
  2. Rich Ecosystem: RBDC provides comprehensive database support
  3. Type Safety: Compile-time type correctness guarantee
  4. Easy to Extend: Modular design facilitates adding new features

Possible future improvement directions:

  • Support for more database types
  • Add database Schema analysis functionality
  • Implement smarter SQL optimization suggestions
  • Support batch operations and transaction processing
  • Integrate database version control
  • Add visual data analysis functionality

:link: Related Resources

I hope this project offers some reference and inspiration for your own Rust + MCP development. Especially in AI-driven environments like Cursor, tools like this can greatly improve development efficiency. If you're also exploring database interaction in the AI era, feel free to reach out and discuss! :rocket:
