Announcing Anvil: The AI-Native, Open-Source Object Store
Fast, self-hosted, S3-compatible storage designed for models, safetensors, gguf files, ONNX artifacts, and large ML datasets.
GitHub: https://github.com/worka-ai/anvil
Latest Release: https://github.com/worka-ai/anvil/releases/latest
Docs: https://worka.ai/docs/anvil/getting-started
Landing Page: https://worka.ai/anvil
Why We Built Anvil
We didn’t set out to build a new object store.
We set out to build our app — and everything broke in predictable, painful ways.
- Git LFS choked on multi‑GB LLM model files
- Hugging Face repos weren’t ideal for private/internal hosting
- S3 and MinIO treated model files as dumb blobs
- Fine‑tunes duplicated base checkpoints 10–20×
- Repeatedly re-downloading 7B/13B checkpoints wrecked developer velocity
- Users couldn’t run models locally without full downloads
- Serving models from home labs, laptops, and edge devices was unreliable
There was no storage layer designed for AI workloads, only general-purpose object stores with no awareness of model formats or inference access patterns.
So we built one.
