
After do_predict, can the generated result include the original question field in addition to the label and predict fields? #4295

Closed
@wickedjava

Description

Reminder

  • I have read the README and searched the existing issues.

System Info

Using the llama-factory master branch.

Reproduction

### model

model_name_or_path: /home/work/Qwen1.5-7B-Chat
adapter_name_or_path: /home/work/Qwen1.5-7B-Chat/lora

### method

stage: sft
do_predict: true
finetuning_type: lora

### dataset

dataset: identity,alpaca_en_demo
dataset_dir: /home/work/git/LLaMA-Factory-main/data
template: qwen
cutoff_len: 1024
max_samples: 50
overwrite_cache: true
preprocessing_num_workers: 16

### output

output_dir: /home/work/Qwen1.5-7B-Chat/lora/predict/v1
overwrite_output_dir: true

### eval

per_device_eval_batch_size: 1
predict_with_generate: true
ddp_timeout: 180000000
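
For reference, a run like this is usually started by saving the block above as a YAML file (e.g. predict.yaml, a name chosen here only for illustration) and invoking `llamafactory-cli train predict.yaml`, assuming a LLaMA-Factory version recent enough to ship the CLI.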

Expected behavior

The result only contains the label and predict fields. Could a field with the original question (the prompt) be added as well?

{"label": "Hello! I am {{name}}, an AI assistant developed by {{author}}. How can I assist you today?", "predict": "Hello! How can I help you today?"}

Others

No response

