TeichAI/GLM-4.7-Flash-Claude-Opus-4.5-High-Reasoning-Distill-GGUF • 30B
Unlocking On-Policy Distillation for Any Model Family 📝 Explore on-policy distillation with interactive visualizations
The Smol Training Playbook 📚 The secrets to building world-class LLMs
POINTS-Reader: Distillation-Free Adaptation of Vision-Language Models for Document Conversion • Paper 2509.01215 • Published Sep 1, 2025