## Model summary

flan-T5-large fine-tuned on the Alpaca instruction dataset with LoRA.
## Training

- torch==2.0.0+cu117
- transformers==4.28.0.dev0
- Hardware: 8 x V100 32GB
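LoRA keeps the pretrained weight frozen and learns a low-rank update `B @ A` that is added to the base output, scaled by `alpha / r`. A minimal NumPy sketch of that forward pass (dimensions and `alpha` here are illustrative, not the values used for this model):

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, r = 16, 16, 4  # hypothetical layer size and LoRA rank
alpha = 8                   # LoRA scaling hyperparameter

W = rng.normal(size=(d_out, d_in))     # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, r))               # trainable up-projection, zero-initialized

x = rng.normal(size=(d_in,))

# LoRA forward pass: base output plus the scaled low-rank update
y = W @ x + (alpha / r) * (B @ (A @ x))

# With B zero-initialized, the adapted layer starts out identical to the base layer,
# so training begins from the pretrained model's behavior.
assert np.allclose(y, W @ x)
```

Because only `A` and `B` (rank `r`) are trained, the adapter checkpoint is far smaller than the full flan-T5-large weights.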
## How to use
```python
import transformers
from peft import PeftModel

tokenizer = transformers.AutoTokenizer.from_pretrained("google/flan-t5-large")
base_model = transformers.AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-large")
peft_model = PeftModel.from_pretrained(base_model, "zirui3/flan-t5-large-alpaca")

inputs = tokenizer("Any instruction that you like.", return_tensors="pt")
outputs = peft_model.generate(**inputs, max_length=128, do_sample=True)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```