Chapter 15: Creating a Rust MCP Server

TL;DR

Below is a complete, minimal main.rs for an MCP server. The worka init command generates a file just like this for you.

use rmcp::server::{McpServer, McpServerSettings, Tool, ToolRequest, ToolResponse};
use serde::{Deserialize, Serialize};
use serde_json::json;

// 1. Define structs for your tool's parameters and response.
#[derive(Deserialize)]
struct MyToolParams {
    some_input: String,
}

#[derive(Serialize)]
struct MyToolResponse {
    some_output: String,
}

// 2. Create an async function for your tool.
async fn my_tool_function(req: ToolRequest) -> ToolResponse {
    // Deserialize params from the request.
    let params: MyToolParams = match serde_json::from_value(req.params) {
        Ok(p) => p,
        Err(e) => return ToolResponse::invalid_params(e.to_string()),
    };

    // Your tool's logic here...
    let response_data = MyToolResponse {
        some_output: format!("You sent: {}", params.some_input),
    };

    // Return a successful response.
    ToolResponse::success(json!(response_data))
}

// 3. In main, register your tool and start the server.
#[tokio::main]
async fn main() {
    let tools = vec![
        Tool::new(
            "my_tool",
            "A description of what my tool does.",
            my_tool_function,
        ),
    ];

    let settings = McpServerSettings { port: 8080, tools };
    McpServer::new(settings).start().await.unwrap();
}


For developers who want to write high-performance backend logic, Rust is an excellent choice. The Worka ecosystem provides the rmcp crate to make building MCP servers in Rust simple and efficient.

This crate handles all the low-level HTTP and JSON-RPC protocol details, letting you focus on the logic of your tools.
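
To use the crate, you first add it to Cargo.toml alongside the async runtime and serde crates that the TL;DR code imports. The snippet below is only a sketch: the version numbers are placeholders, so check the Worka documentation or crates.io for the releases your project should pin.

[dependencies]
# Versions are illustrative placeholders, not pinned recommendations.
rmcp = "0.1"
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }
serde = { version = "1", features = ["derive"] }
serde_json = "1"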

Step 1: The Tool Function

As we saw in the TL;DR, the core of your server is one or more async functions that will act as your tools. Each tool function must have the same signature:

async fn my_tool_function(req: ToolRequest) -> ToolResponse
  • req: ToolRequest: This struct contains the parameters sent from the client (the Host application). The actual parameters are in req.params, which is a serde_json::Value.
  • -> ToolResponse: Your function must return a ToolResponse, which can be either a success or an error.
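
To make the pattern concrete, here is a second tool sketched against the same rmcp types used in the TL;DR. The add_numbers tool, its parameter names, and its response fields are hypothetical and exist only to illustrate the signature; the fixed part is the ToolRequest in, ToolResponse out shape.

use rmcp::server::{ToolRequest, ToolResponse};
use serde::{Deserialize, Serialize};
use serde_json::json;

#[derive(Deserialize)]
struct AddNumbersParams {
    a: i64,
    b: i64,
}

#[derive(Serialize)]
struct AddNumbersResponse {
    sum: i64,
}

// Same signature as every other tool: take a ToolRequest, return a ToolResponse.
async fn add_numbers(req: ToolRequest) -> ToolResponse {
    // Reject malformed input with an invalid_params error.
    let params: AddNumbersParams = match serde_json::from_value(req.params) {
        Ok(p) => p,
        Err(e) => return ToolResponse::invalid_params(e.to_string()),
    };

    // The tool's actual logic: add the two numbers.
    let response = AddNumbersResponse {
        sum: params.a + params.b,
    };

    ToolResponse::success(json!(response))
}

Because every tool shares this signature, adding a new capability to your server is just another async function plus one more entry when you register tools in main (see Step 3).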

Step 2: Handling Parameters and Responses

The recommended way to handle inputs and outputs is to use serde to define structs for them. This gives you type safety and makes your code much cleaner.

Input: You deserialize the generic req.params into your specific struct. It's good practice to handle potential errors if the client sends malformed data.

#[derive(Deserialize)]
struct MyToolParams {
    some_input: String,
}

let params: MyToolParams = match serde_json::from_value(req.params) {
    Ok(p) => p,
    Err(e) => return ToolResponse::invalid_params(e.to_string()),
};

Output: You construct your response struct and use the ToolResponse::success() helper to serialize it to JSON and wrap it in the correct MCP response format.

#[derive(Serialize)]
struct MyToolResponse {
    some_output: String,
}

let response_data = MyToolResponse { ... };
ToolResponse::success(json!(response_data))

Step 3: The Server Entrypoint

The main function of your program is responsible for configuring and launching the server.

  1. Tool Registration: You create a Vec that holds all the tools your server will provide. Each tool is defined using Tool::new(), where you provide its public name (the string used in useTool on the frontend), a description, and the async function that implements it.

    let tools = vec![
        Tool::new("my_tool", "...", my_tool_function),
        Tool::new("another_tool", "...", another_tool_function),
    ];
  2. Server Settings: You create an McpServerSettings struct. The two important fields are port (which should typically be 8080, as this is the default port exposed from the container) and the tools vector you just created.

  3. Launch: Finally, you create a new McpServer with your settings and call .start().await to launch it. The server will run indefinitely, listening for and responding to tool.call requests from the Worka Host.
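
Putting the three steps together, a main that registers both tools from this chapter might look like the sketch below. It assumes the same rmcp types as the TL;DR, and add_numbers is the hypothetical tool sketched in Step 1.

#[tokio::main]
async fn main() {
    let tools = vec![
        Tool::new(
            "my_tool",
            "A description of what my tool does.",
            my_tool_function,
        ),
        Tool::new(
            "add_numbers",
            "Adds two integers and returns their sum.",
            add_numbers,
        ),
    ];

    // Port 8080 is the default port exposed from the container.
    let settings = McpServerSettings { port: 8080, tools };

    McpServer::new(settings).start().await.unwrap();
}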