Please support setting root_path in llamafactory-cli api #5307

Closed
@shizidushu

Description

Reminder

  • I have read the README and searched the existing issues.

System Info

  • llamafactory version: 0.8.4.dev0
  • Platform: Linux-5.4.0-186-generic-x86_64-with-glibc2.35
  • Python version: 3.10.14
  • PyTorch version: 2.4.0+cu121 (GPU)
  • Transformers version: 4.43.4
  • Datasets version: 2.20.0
  • Accelerate version: 0.32.0
  • PEFT version: 0.12.0
  • TRL version: 0.9.6
  • vLLM version: 0.5.5

Reproduction

I followed the article https://zhuanlan.zhihu.com/p/695287607 up to the step of starting the API server:

CUDA_VISIBLE_DEVICES=0 API_PORT=8000 llamafactory-cli api \
    --model_name_or_path /root/workspace/models-modelscope/Meta-Llama-3-8B-Instruct \
    --adapter_name_or_path ./saves/LLaMA3-8B/lora/sft \
    --template llama3 \
    --finetuning_type lora

Since my environment requires setting a root_path, the service cannot run properly. For the FastAPI root_path documentation, see: https://fastapi.tiangolo.com/zh/advanced/behind-a-proxy/
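
For reference, a minimal standalone FastAPI sketch (not LLaMA-Factory code) of what root_path does when the app sits behind a reverse proxy that strips a path prefix; the /llm prefix here is a hypothetical example:

    # Minimal sketch, assuming a proxy forwards /llm/* to this app with the
    # prefix stripped. root_path makes generated URLs (docs, OpenAPI schema)
    # resolve under /llm again.
    import uvicorn
    from fastapi import FastAPI

    app = FastAPI(root_path="/llm")  # hypothetical prefix handled by the proxy

    @app.get("/v1/models")
    def list_models():
        return {"data": []}

    if __name__ == "__main__":
        # Equivalently, the prefix can be passed at launch instead:
        # uvicorn.run(app, host="0.0.0.0", port=8000, root_path="/llm")
        uvicorn.run(app, host="0.0.0.0", port=8000)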

Expected behavior

No response

Others

No response
