CrewAI Demo MCP Server

A Model Context Protocol (MCP) server that provides file and command execution tools for CrewAI agents.

Overview

This MCP server exposes four tools that CrewAI agents can use to interact with the file system and execute commands:

  • read_file: Read file contents with line numbering
  • list_files: List files in a directory, respecting .gitignore patterns
  • write_to_file: Write content to files with validation
  • execute_command: Execute shell commands with proper process management

Installation

  1. Clone this repository:

    git clone https://github.com/kimwwk/crewai-demo-mcp-server.git
    cd crewai-demo-mcp-server
  2. Install dependencies using uv:

    # If you don't have uv installed, install it first:
    # curl -LsSf https://astral.sh/uv/install.sh | sh
    uv sync

Usage

Running the MCP Server

uv run python main.py
# or run the interpreter directly: .venv/bin/python3 main.py

By default, the server runs with SSE transport. To use stdio transport, uncomment the appropriate line in main.py.

Environment Variables

  • PROJECT_ROOT_PATH: Set this to override the default project root path
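The override works the way you would expect: when PROJECT_ROOT_PATH is set, paths passed to the tools are resolved against it. A minimal sketch of such a resolver (the helper name and fallback to the current directory are assumptions, not the server's actual code):

```python
import os
from pathlib import Path

def resolve_project_root(env=None):
    """Return the project root, preferring PROJECT_ROOT_PATH when set.

    Hypothetical helper illustrating the override; the server's actual
    resolution logic may differ.
    """
    env = os.environ if env is None else env
    override = env.get("PROJECT_ROOT_PATH")
    # Fall back to the current working directory when no override is given.
    return Path(override) if override else Path.cwd()
```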

Tool Documentation

read_file

Reads a file and returns its content with line numbers.

Parameters:

  • path: The relative path to the file from the project root
  • line_range: Optional line range to read (e.g., "10-50")
  • max_lines: Maximum number of lines to read (default: 1000)
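Putting the three parameters together, the numbering logic might look like the sketch below. The "N | " prefix format and the inclusive "start-end" range semantics are assumptions for illustration, not the server's documented output:

```python
def read_numbered(text, line_range=None, max_lines=1000):
    """Number lines 1-based, honoring an optional "start-end" range.

    Hypothetical sketch of read_file's behavior.
    """
    lines = text.splitlines()
    start, end = 1, len(lines)
    if line_range:
        # Parse an inclusive range like "10-50".
        start, end = (int(part) for part in line_range.split("-"))
    # Slice the requested window, then cap it at max_lines.
    selected = lines[start - 1:end][:max_lines]
    return "\n".join(f"{start + i} | {line}" for i, line in enumerate(selected))
```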

list_files

Lists files in a directory, optionally recursively, honoring .gitignore.

Parameters:

  • path: The relative path to the directory from the project root
  • recursive: Whether to list files recursively (default: False)
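The .gitignore filtering can be approximated with the standard library, as in this sketch. It handles only simple component patterns; full .gitignore semantics (negation, anchoring, directory-only rules) usually need a dedicated library such as pathspec, and which approach the server takes is not specified here:

```python
import fnmatch
from pathlib import PurePosixPath

def filter_ignored(paths, ignore_patterns):
    """Drop paths whose components match simple ignore patterns.

    Stdlib approximation; not full .gitignore semantics.
    """
    def ignored(path):
        # A path is ignored if any of its components matches any pattern.
        return any(
            fnmatch.fnmatch(part, pattern)
            for part in PurePosixPath(path).parts
            for pattern in ignore_patterns
        )
    return [p for p in paths if not ignored(p)]
```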

write_to_file

Writes content to a file. Creates the file if it doesn't exist, or overwrites it if it does.

Parameters:

  • path: The relative path to the file from the project root
  • content: The content to write
  • line_count: The predicted number of lines for omission detection
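The line_count parameter enables a simple truncation check: if the written content has fewer lines than the caller predicted, part of the file was probably omitted. A hypothetical sketch of that check (the actual validation in the server may differ):

```python
def looks_truncated(content, line_count):
    """Return True when content has fewer lines than predicted, a hint
    that the caller omitted part of the file.

    Hypothetical sketch of the omission check implied by line_count.
    """
    return len(content.splitlines()) < line_count
```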

execute_command

Executes a command in a subprocess with a timeout and captures its streaming output.

Parameters:

  • command: The command to execute
  • cwd: Optional working directory (relative to PROJECT_ROOT_PATH or absolute)
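A simplified version of this behavior using the standard library is shown below. Note the simplification: the real tool streams output as it is produced, whereas subprocess.run collects it only after the process exits; the return shape is also an assumption:

```python
import subprocess

def run_command(command, cwd=None, timeout=30.0):
    """Run a shell command with a timeout and return its combined output.

    Hypothetical sketch of execute_command; the server's streaming
    implementation and result format may differ.
    """
    try:
        proc = subprocess.run(
            command, shell=True, cwd=cwd,
            capture_output=True, text=True, timeout=timeout,
        )
        return {"exit_code": proc.returncode, "output": proc.stdout + proc.stderr}
    except subprocess.TimeoutExpired:
        # The process exceeded the timeout and was killed by subprocess.run.
        return {"exit_code": None, "output": f"command timed out after {timeout}s"}
```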

License

MIT

About

MCP server providing tools for CrewAI Demo (see: kimwwk/crewai-demo)