Semantic code search,
built for Rust.
An MCP server that pairs full-text search with vector embeddings and rust-analyzer's semantic graph — so your agent can navigate, trace, and reason about your codebase like you do.
Search that actually understands Rust.
Three search modes that compose: keyword, semantic, and structural (AST). Built on tantivy, fastembed, and rust-analyzer.
Hybrid search
BM25 keyword matching fused with vector similarity. Find code by what it does, not just what it's called.
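The fusion step can be pictured with reciprocal rank fusion, one common way to merge a BM25 result list with a nearest-neighbor list. The project's actual fusion formula isn't shown here, so treat this as an illustrative sketch:

```rust
use std::collections::HashMap;

/// Reciprocal rank fusion: merge two ranked result lists into one.
/// A document scores 1/(k + rank) per list it appears in; `k` damps
/// the influence of lower-ranked hits (60 is a conventional default).
fn rrf(keyword: &[&str], semantic: &[&str], k: f64) -> Vec<(String, f64)> {
    let mut scores: HashMap<String, f64> = HashMap::new();
    for list in [keyword, semantic] {
        for (rank, id) in list.iter().enumerate() {
            *scores.entry(id.to_string()).or_insert(0.0) += 1.0 / (k + rank as f64 + 1.0);
        }
    }
    // highest fused score first
    let mut out: Vec<_> = scores.into_iter().collect();
    out.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    out
}
```

A hit that ranks well in both lists beats one that tops only a single list, which is exactly the "what it does, not what it's called" behavior described above.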
Symbol navigation
Goto definition and find references across the project, powered by rust-analyzer's IDE engine.
Call graph
Trace function relationships up and down the stack — who calls what, and from where.
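Tracing "down the stack" is a breadth-first walk over caller-to-callee edges. The edge representation below is hypothetical, not the server's internal one:

```rust
use std::collections::{HashMap, HashSet, VecDeque};

/// Walk a call graph downward from `root`: everything `root` calls,
/// transitively. `graph` maps each caller to its direct callees.
fn callees(graph: &HashMap<&str, Vec<&str>>, root: &str) -> Vec<String> {
    let mut seen = HashSet::new();
    let mut queue = VecDeque::from([root]);
    let mut out = Vec::new();
    while let Some(f) = queue.pop_front() {
        for &callee in graph.get(f).map(Vec::as_slice).unwrap_or(&[]) {
            if seen.insert(callee) {
                out.push(callee.to_string());
                queue.push_back(callee);
            }
        }
    }
    out
}
```

The upward direction ("who calls what") is the same walk over the reversed edges.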
Complexity metrics
Lines of code, cyclomatic complexity, and function counts per file or module — at a glance.
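As a rough mental model, cyclomatic complexity is 1 plus the number of decision points in a function. A token-level approximation is sketched below; the real tool presumably computes this from rust-analyzer's syntax tree rather than raw text:

```rust
/// Rough cyclomatic complexity estimate for a function body:
/// 1 + branching keywords + short-circuit operators. A crude
/// text-based approximation for illustration only.
fn cyclomatic_estimate(body: &str) -> usize {
    let keywords = body
        .split(|c: char| !c.is_alphanumeric() && c != '_')
        .filter(|w| matches!(*w, "if" | "while" | "for" | "match"))
        .count();
    let ops = body.matches("&&").count() + body.matches("||").count();
    1 + keywords + ops
}
```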
Incremental indexing
Merkle-tree change detection re-indexes only what changed. First run is full; everything after is fast.
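One way to picture the change detection: hash each file, roll child hashes up into directory hashes, and re-index only paths whose hash moved. A simplified sketch using std's hasher; the server's actual tree layout and hash function aren't specified here:

```rust
use std::collections::hash_map::DefaultHasher;
use std::collections::BTreeMap;
use std::hash::{Hash, Hasher};

/// Combine a node's children (name -> hash) into one hash. If this
/// matches the previous snapshot, the whole subtree is unchanged and
/// can be skipped during re-indexing.
fn dir_hash(children: &BTreeMap<String, u64>) -> u64 {
    let mut h = DefaultHasher::new();
    for (name, child) in children {
        name.hash(&mut h);
        child.hash(&mut h);
    }
    h.finish()
}

/// Diff two snapshots of file hashes: paths that are new or modified.
fn changed_files<'a>(
    old: &BTreeMap<String, u64>,
    new: &'a BTreeMap<String, u64>,
) -> Vec<&'a str> {
    new.iter()
        .filter(|&(path, h)| old.get(path) != Some(h))
        .map(|(path, _)| path.as_str())
        .collect()
}
```

An unchanged subtree produces an identical rolled-up hash, which is why only the first run pays the full indexing cost.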
GPU acceleration
ONNX Runtime + CUDA for 10–15× faster embedding on NVIDIA GPUs. CPU fallback when no GPU is present.
Ten tools, exposed over MCP.
Drop the server into Claude Code (or any MCP client) and these become callable from your agent.
Three steps to running.
Build the binary, wire it into your MCP client, and point it at a Rust project.
# clone and build the release binary
git clone https://github.com/molaco/rust-code-mcp
cd rust-code-mcp
cargo build --release

# optional: drop on PATH
cp target/release/file-search-mcp ~/.local/bin/
{
  "mcpServers": {
    "rust-code-mcp": {
      "command": "/abs/path/to/file-search-mcp"
    }
  }
}
# inside Claude Code, ask:
> index my codebase at /abs/path/to/my-rust-project

# then call any tool, for example:
> find_references for "parse_expression"
> get_similar_code to "retry with exponential backoff"
> get_call_graph from "main"
Three indexes, one query plane.
A document index, a vector index, and a semantic AST — kept in sync by a Merkle-tree watcher.
Built to scale with your codebase.
Throughput measured on the embedding pipeline. CPU is fine for small projects; GPU unlocks large monorepos.
Standing on good shoulders.
Each piece is the best-in-class option for its job — and stays a thin wrapper over the upstream library.