Instructions for using apple/AIM with libraries, inference providers, notebooks, and local apps.
- Libraries
- MLX
How to use apple/AIM with MLX:
```shell
# Download the model from the Hub
pip install "huggingface_hub[hf_xet]"
huggingface-cli download --local-dir AIM apple/AIM
```
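The same download can be done from Python using `huggingface_hub`'s `snapshot_download`, which the CLI wraps. A minimal sketch; the `local_dir` name `AIM` is arbitrary and mirrors the CLI example:

```python
# Sketch: download apple/AIM from the Hub via the huggingface_hub Python API.
from huggingface_hub import snapshot_download


def download_aim(local_dir: str = "AIM") -> str:
    # Fetches the full repository snapshot into local_dir and
    # returns the path to the downloaded files on disk.
    return snapshot_download(repo_id="apple/AIM", local_dir=local_dir)


if __name__ == "__main__":
    print(download_aim())
```

This is convenient inside notebooks (e.g. Colab or Kaggle, listed below), where shelling out to the CLI is unnecessary.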
- Notebooks
- Google Colab
- Kaggle
- Local Apps
- LM Studio
Ran smoothly on MI250s
#4 opened by shaunclimbon
Nice, re-ran the classification results (1B & 7B) with PyTorch on a full node of MI250s and didn't have to change anything. Just worked!