Hugging Face
hash-map / got_tokenizer
Tags: sentence-transformers · English · License: mit
1 contributor · History: 3 commits
Latest commit: Update README.md (d6a544f, verified) by hash-map, 5 months ago
File                 Size       Last commit        Committed
.gitattributes       1.52 kB    initial commit     5 months ago
README.md            71 Bytes   Update README.md   5 months ago
icefire_spm.model    743 kB     Upload 3 files     5 months ago
icefire_spm.vocab    492 kB     Upload 3 files     5 months ago
usage.py             767 Bytes  Upload 3 files     5 months ago