Notes to Self

Alex Sokolsky's Notes on Computers and Programming

24 July 2023

ChatGPT-like chatbot

Setup

Clone tloen/alpaca-lora

Install dependencies using pip install -r requirements.txt
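
As shell commands, the setup amounts to the following (repository URL assumed to be the public GitHub one):

# clone the repository and install its Python dependencies
git clone https://github.com/tloen/alpaca-lora.git
cd alpaca-lora
pip install -r requirements.txt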

Training

Run the finetune.py script to fine-tune the base LLaMA model on the cleaned Stanford Alpaca dataset. Check the repository for hyperparameters you can tweak for better performance; an example with the hyperparameter flags spelled out follows the command below.

python finetune.py \
    --base_model 'decapoda-research/llama-7b-hf' \
    --data_path 'yahma/alpaca-cleaned' \
    --output_dir './lora-alpaca'
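
The repository exposes the training hyperparameters as command-line flags. A sketch of the same command with them spelled out (flag names and values taken from the tloen/alpaca-lora README at the time of writing; check finetune.py for the current set and defaults):

python finetune.py \
    --base_model 'decapoda-research/llama-7b-hf' \
    --data_path 'yahma/alpaca-cleaned' \
    --output_dir './lora-alpaca' \
    --batch_size 128 \
    --micro_batch_size 4 \
    --num_epochs 3 \
    --learning_rate 1e-4 \
    --cutoff_len 512 \
    --val_set_size 2000 \
    --lora_r 8 \
    --lora_alpha 16 \
    --lora_dropout 0.05 \
    --lora_target_modules '[q_proj,v_proj]' \
    --train_on_inputs \
    --group_by_length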

Inference

The inference script downloads the foundation LLaMA model from Hugging Face, applies the LoRA weights, and launches a Gradio interface.

python generate.py \
    --load_8bit \
    --base_model 'decapoda-research/llama-7b-hf' \
    --lora_weights 'tloen/alpaca-lora-7b'
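
Once the weights load, Gradio prints a local URL (by default http://localhost:7860) where you can chat with the fine-tuned model.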

You can

tags: ai