Blog & Demos
Tutorials, case studies, benchmarks, and open-source demos — everything you need to build with small language models.
distil NPC: Small Language Models for Video Game Characters
A family of small language models specialized for conversational NPCs in video games, enabling natural language interaction with game characters entirely on-device, with no network connection required.
distil-PII: Family of PII Redaction SLMs
We trained and released a family of small language models specialized for policy-aware PII redaction that dramatically outperform their pre-trained counterparts.
distil labs: Benchmarking the Platform
Benchmarking distil labs' distillation pipeline across classification, information extraction, QA, and tool-calling tasks, showing that compact SLMs consistently match or exceed teacher LLMs.
distil labs: Small Models, Big Wins – Using SLMs in Agentic AI
How small language models can match or beat much larger LLMs when fine-tuned for well-scoped tasks, enabling faster, cheaper, and more private agentic AI workflows.
distil labs: Small Expert Agents from 10 Examples
An overview of how distil labs turns a prompt and a few dozen examples into a small, accurate expert agent that matches LLM-level results with models 50-400x smaller.