
    I used the local TRIGGERcmd stdio MCP server with a local LLM via Ollama and Open WebUI

    Russ (RussR), last edited by Russ

      [screenshot]

      I used this bash script to run an MCP-to-OpenAPI proxy server (mcpo) in Ubuntu under WSL:

      #!/bin/bash -xv
      # Download the TRIGGERcmd MCP server binary and make it executable
      curl -O https://agents.triggercmd.com/triggercmd-mcp/triggercmd-mcp-linux-amd64
      chmod +x triggercmd-mcp-linux-amd64

      # Build the proxy image from the Dockerfile below, then run it on port 8000
      # with your TRIGGERcmd token passed into the container
      docker build -t mcp-proxy-server .
      docker run -it -p 8000:8000 -e TRIGGERCMD_TOKEN="my triggercmd token" mcp-proxy-server

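      Once the container is up, you can do a quick sanity check of the proxy. This is only a sketch that assumes mcpo's defaults (a FastAPI-style schema at /openapi.json, with the --api-key value sent as a Bearer token); adjust if your mcpo version differs:

      # Sanity check (assumes mcpo defaults): confirm the proxy serves the generated OpenAPI schema
      curl -H "Authorization: Bearer top-secret" http://localhost:8000/openapi.json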
      This is my Dockerfile:

      FROM python:3.11-slim
      WORKDIR /app

      # mcpo proxies an MCP stdio server as an OpenAPI (HTTP) tool server; uv provides uvx
      RUN pip install mcpo uv

      # The binary downloaded by the bash script above
      COPY triggercmd-mcp-linux-amd64 /triggercmd-mcp-linux-amd64

      # Everything after "--" is the MCP server command that mcpo runs over stdio.
      # Replace it with your MCP server command; example: uvx mcp-server-time
      CMD ["uvx", "mcpo", "--host", "0.0.0.0", "--port", "8000", "--api-key", "top-secret", "--", "/triggercmd-mcp-linux-amd64"]
      

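      If you want to test the binary by itself before putting it behind the proxy, an MCP stdio server should answer a JSON-RPC initialize request on stdout. This is only a sketch, assuming the binary follows the standard MCP stdio protocol and reads TRIGGERCMD_TOKEN from the environment (as the docker run command suggests):

      # Hypothetical stdio probe of the TRIGGERcmd MCP server binary
      export TRIGGERCMD_TOKEN="my triggercmd token"
      echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"probe","version":"0.0.1"}}}' | ./triggercmd-mcp-linux-amd64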
      This is how you set it up in Open WebUI under Settings - External Tools - Manage Tool Servers:

      [screenshot of the Manage Tool Servers settings in Open WebUI]
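      In short, the tool server entry just needs the proxy's URL and API key from the setup above (my reading of the defaults used earlier, not a transcription of the screenshot):

      URL:     http://localhost:8000    # the port the mcpo container publishes
      API key: top-secret               # the --api-key value from the Dockerfile CMD

      If Open WebUI itself runs in Docker, you may need to point the URL at host.docker.internal (or the host's IP) instead of localhost.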

      Russell VanderMey
