fixes for transformers v5

#4
by kashif (HF Staff) - opened

The "default" rope_type was removed in recent transformers versions. Using "linear" (with no scaling factor) is functionally equivalent.

kashif changed pull request title from "fix: use linear rope_type instead of removed default for transformers compat" to "fixes for transformers v5"
inclusionAI org

LGTM. Standardizing the RoPE implementation and adding partial_rotary_factor support makes total sense. Approved for merge. Thanks for the work!
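For context, `partial_rotary_factor` follows the usual pattern (as in GPT-NeoX/Phi-style models) of rotating only a leading slice of each attention head. A minimal sketch, with illustrative names and shapes rather than this PR's code:

```python
import torch

def apply_partial_rotary(q: torch.Tensor, cos: torch.Tensor, sin: torch.Tensor,
                         partial_rotary_factor: float = 0.5) -> torch.Tensor:
    """Apply RoPE to only the first fraction of each head's dimensions.

    Hypothetical sketch: `cos`/`sin` are assumed to broadcast against the
    rotary slice, i.e. their last dim equals int(head_dim * factor).
    """
    head_dim = q.shape[-1]
    rotary_dim = int(head_dim * partial_rotary_factor)
    # Split into the slice that gets rotated and the pass-through remainder.
    q_rot, q_pass = q[..., :rotary_dim], q[..., rotary_dim:]

    def rotate_half(x: torch.Tensor) -> torch.Tensor:
        # Standard rotate-half formulation used by rotary embeddings.
        x1, x2 = x.chunk(2, dim=-1)
        return torch.cat((-x2, x1), dim=-1)

    q_rot = (q_rot * cos) + (rotate_half(q_rot) * sin)
    return torch.cat((q_rot, q_pass), dim=-1)
```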

lccurious changed pull request status to merged
