# Local LLM Inspector

An OpenAI-compatible API proxy with LLM trace visualization using Phoenix and OpenInference.

## Features

- OpenAI API-compatible endpoints
- Streaming and non-streaming responses
- LLM trace visualization using Phoenix
- Lightweight local deployment
- OpenTelemetry-based instrumentation

## Requirements

- Python 3.9 - 3.12
- Poetry (for dependency management)

## Installation

1. Clone the repository:

   ```shell
   git clone https://github.com/pengjeck/LocalLLMTrace
   cd LocalLLMTrace
   ```

2. Create a `.env` file based on `.env-example`:

   ```shell
   cp .env-example .env
   ```

3. Update the `.env` file with your API keys:

   ```shell
   # OpenAI/DeepSeek API
   OPENAI_API_KEY=your-api-key
   OPENAI_API_URL=https://api.deepseek.com
   ```

4. Install dependencies:

   ```shell
   poetry install
   ```

5. Start the development server:

   ```shell
   poetry run uvicorn main:app --reload
   ```

6. Start the Phoenix tracing UI:

   ```shell
   phoenix serve
   ```

## API Documentation

### POST /v1/chat/completions

### POST /chat/completions

#### Request Body

```json
{
  "model": "string",
  "messages": [
    {
      "role": "string",
      "content": "string"
    }
  ],
  "temperature": "number",
  "stream": "boolean",
  "max_tokens": "number",
  "stream_options": {
    "include_usage": "boolean"
  }
}
```
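The request body above maps naturally onto typed Python structures. Below is a minimal, standard-library-only sketch of building and serializing such a payload; the class and function names are illustrative, not part of this project's code:

```python
import json
from dataclasses import dataclass, asdict
from typing import List, Optional

@dataclass
class Message:
    role: str       # e.g. "system", "user", or "assistant"
    content: str

@dataclass
class ChatCompletionRequest:
    model: str
    messages: List[Message]
    temperature: Optional[float] = None
    stream: bool = False
    max_tokens: Optional[int] = None

def to_payload(req: ChatCompletionRequest) -> str:
    """Serialize the request to JSON, dropping unset optional fields."""
    data = asdict(req)  # recursively converts nested Message dataclasses
    return json.dumps({k: v for k, v in data.items() if v is not None})

payload = to_payload(ChatCompletionRequest(
    model="deepseek-chat",
    messages=[Message(role="user", content="Hello!")],
    temperature=0.7,
))
```

Unset optional fields such as `max_tokens` are omitted from the serialized body, matching the usual OpenAI API convention of sending only the parameters you set.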

#### Example Request

```shell
curl -X POST "http://localhost:8000/chat/completions" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-chat",
    "messages": [
      {
        "role": "user",
        "content": "Hello!"
      }
    ],
    "temperature": 0.7,
    "stream": false
  }'
```
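The same call can be made from Python. Here is a standard-library-only sketch mirroring the curl example; it assumes the proxy is running on `localhost:8000` as in the installation steps:

```python
import json
import urllib.request

def build_chat_request(base_url: str, body: dict) -> urllib.request.Request:
    """Build a POST request equivalent to the curl example above."""
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("http://localhost:8000", {
    "model": "deepseek-chat",
    "messages": [{"role": "user", "content": "Hello!"}],
    "temperature": 0.7,
    "stream": False,
})
# To send it (requires the proxy to be running):
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp))
```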

## Development

### Using Supervisor for Process Management

1. Install Supervisor:

   ```shell
   pip install supervisor
   ```

2. Start services:

   ```shell
   supervisord -c supervisord.conf
   ```

3. Check service status:

   ```shell
   supervisorctl -c supervisord.conf status
   ```

4. Common commands:

   ```shell
   # Restart a service
   supervisorctl -c supervisord.conf restart [service_name]

   # Stop all services
   supervisorctl -c supervisord.conf shutdown

   # View logs
   tail -f /tmp/phoenix_out.log
   tail -f /tmp/main_out.log
   ```
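For reference, a `supervisord.conf` for this setup might look roughly like the sketch below. This is an illustrative assumption, not the repository's actual file: the program names and log paths are inferred from the `tail -f` commands above, and the `command` lines reuse the start commands from the installation steps.

```ini
; Hypothetical supervisord.conf sketch
[supervisord]
nodaemon=false

[program:main]
command=poetry run uvicorn main:app
stdout_logfile=/tmp/main_out.log
autorestart=true

[program:phoenix]
command=phoenix serve
stdout_logfile=/tmp/phoenix_out.log
autorestart=true
```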

### Manual Development

1. Install dependencies:

   ```shell
   poetry install
   ```

2. Start the development server:

   ```shell
   poetry run uvicorn main:app --reload
   ```

3. Start the Phoenix tracing UI:

   ```shell
   poetry run phoenix serve
   ```

## Configuration

The following environment variables are required:

- `OPENAI_API_KEY`: Your DeepSeek API key
- `OPENAI_API_URL`: DeepSeek API endpoint
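Since both variables are required, it can help to validate them at startup. A minimal sketch (the function name is illustrative; only the variable names come from this README):

```python
import os

def load_proxy_config() -> dict:
    """Read the required settings, failing fast if any is missing."""
    config = {}
    for key in ("OPENAI_API_KEY", "OPENAI_API_URL"):
        value = os.getenv(key)
        if not value:
            raise RuntimeError(f"Missing required environment variable: {key}")
        config[key] = value
    return config
```

Failing fast on startup surfaces a missing `.env` entry immediately, instead of as an opaque upstream authentication error on the first request.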

## License

MIT License

Copyright (c) 2024 Your Name

Permission is hereby granted...
