aiamblichus / mcp-loom-helper (public MCP server)

An MCP protocol server that provides text-completion capabilities for LLMs, designed for specific base models.

Language: JavaScript
License: MIT


# MCP Loom Helper Server

An MCP (Model Context Protocol) server that provides a clean interface for LLMs to use text-completion capabilities through the MCP protocol. This server acts as a bridge between an LLM client and any OpenAI-compatible API.

This server is specifically designed for use with **"looming"** base models as described at [cyborgism.wiki/hypha/loom](https://cyborgism.wiki/hypha/loom). It reads prompts from files within a project directory rather than accepting direct string prompts.
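For example, a project directory might be laid out as follows. The directory and file names here are purely illustrative; the server resolves `file_path` arguments relative to `LOOM_PROJECT_ROOT`:

```bash
# Illustrative setup: create a project root containing a prompt file.
mkdir -p /tmp/loom-project/prompts
cat > /tmp/loom-project/prompts/story.txt <<'EOF'
Once upon a time, in a datacenter far away,
EOF

# With LOOM_PROJECT_ROOT=/tmp/loom-project, the "complete" tool can then be
# invoked with file_path "prompts/story.txt" to continue this text.
```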

## Features

- Provides a single tool named "complete" for generating text completions
- Reads prompts from files in a project directory
- Properly handles asynchronous processing to avoid blocking
- Implements timeout handling with graceful fallbacks
- Supports cancellation of ongoing requests

## Installation

```bash
# Clone the repository
git clone https://github.com/aiamblichus/mcp-loom-helper
cd mcp-loom-helper

# Install dependencies
pnpm install

# Build the project
pnpm run build
```

## Configuration

The following environment variables are required:

```
OPENAI_API_KEY=your-hyperbolic-api-key
OPENAI_API_BASE=https://api.hyperbolic.xyz/v1
OPENAI_MODEL=meta-llama/Meta-Llama-3.1-405B
LOOM_PROJECT_ROOT=/path/to/your/project
```

## Usage

Start the server:

```bash
pnpm start
```

This will start the server on stdio, making it available for MCP clients to communicate with.
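As a sketch, an MCP client that launches servers over stdio (such as Claude Desktop) could be pointed at this server with a configuration along these lines. The `dist/index.js` entry point and the server name `loom-helper` are assumptions; adjust them to match the actual build output:

```json
{
  "mcpServers": {
    "loom-helper": {
      "command": "node",
      "args": ["/path/to/mcp-loom-helper/dist/index.js"],
      "env": {
        "OPENAI_API_KEY": "your-hyperbolic-api-key",
        "OPENAI_API_BASE": "https://api.hyperbolic.xyz/v1",
        "OPENAI_MODEL": "meta-llama/Meta-Llama-3.1-405B",
        "LOOM_PROJECT_ROOT": "/path/to/your/project"
      }
    }
  }
}
```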

## Docker Usage

### Building the Docker Image

```bash
docker build -t mcp-loom-helper .
```

### Running the Container

```bash
# Run with environment variables
docker run -it --rm \
  -e OPENAI_API_KEY="your-api-key" \
  -e OPENAI_MODEL="gpt-3.5-turbo-instruct" \
  -e LOOM_PROJECT_ROOT="/app/project" \
  -v /path/to/your/project:/app/project \
  mcp-loom-helper
```

You can also use a .env file:

```bash
# Run with .env file
docker run -it --rm \
  --env-file .env \
  -v /path/to/your/project:/app/project \
  mcp-loom-helper
```

## Parameters for the "complete" tool

- `file_path` (string, required): The relative path to a file containing the prompt to complete
- `max_tokens` (integer, optional): Maximum tokens to generate, default: 150
- `temperature` (number, optional): Controls randomness (0-1), default: 0.7
- `top_p` (number, optional): Controls diversity via nucleus sampling, default: 1.0
- `frequency_penalty` (number, optional): Decreases repetition of token sequences, default: 0.0
- `presence_penalty` (number, optional): Increases likelihood of talking about new topics, default: 0.0
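Over the wire, invoking the tool is a standard MCP `tools/call` request (JSON-RPC 2.0). The `file_path` value below is a hypothetical example:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "complete",
    "arguments": {
      "file_path": "prompts/story.txt",
      "max_tokens": 150,
      "temperature": 0.7
    }
  }
}
```

Per the MCP specification, the server replies with a `result.content` array containing a `text` item that holds the generated completion.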

## Development

For development with auto-reloading:

```bash
pnpm run dev
```

## License

MIT 


