How to use monsterapi/falcon-7b-python-code-instructions-18k-alpaca with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the Falcon-7B base model, then attach the fine-tuned PEFT adapter.
base_model = AutoModelForCausalLM.from_pretrained("tiiuae/falcon-7b")
model = PeftModel.from_pretrained(base_model, "monsterapi/falcon-7b-python-code-instructions-18k-alpaca")
```

We fine-tuned the Falcon-7B LLM on the Python Code Instructions dataset (iamtarun/python_code_instructions_18k_alpaca) for 10 epochs (~23,000 steps) using the MonsterAPI no-code LLM finetuner.
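For a quick sanity check after loading, here is a minimal generation sketch. The prompt string is illustrative, and mirroring the Alpaca-style prompt format used in the training data will likely give better results:

```python
from transformers import AutoTokenizer

# The tokenizer comes from the base model; the adapter does not change the vocabulary.
tokenizer = AutoTokenizer.from_pretrained("tiiuae/falcon-7b")

prompt = "Write a Python function that reverses a string."  # illustrative prompt
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```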
The dataset contains problem descriptions and the corresponding Python code. It is derived from sahil2801/code_instructions_120k, with an added Alpaca-style prompt column.
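To inspect the Alpaca-style prompt column yourself, a minimal sketch using the `datasets` library (the column names shown in the comments are assumptions based on the dataset card):

```python
from datasets import load_dataset

# Load the training split and inspect one record.
ds = load_dataset("iamtarun/python_code_instructions_18k_alpaca", split="train")
print(ds.column_names)  # expected to include an Alpaca-style "prompt" column
print(ds[0]["prompt"])  # full prompt combining the instruction, input, and output
```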
The fine-tuning run completed in 7.3 hours and cost us only $17.50 in total!
Base model: tiiuae/falcon-7b