Welcome to distil labs
Distil labs provides a platform for training task-specific small language models (SLMs) with just a prompt and a few dozen examples. Our platform handles the complex machine learning processes behind the scenes, so you can focus on your use case instead of managing large datasets and infrastructure.
Getting started
Install the Distil CLI:
curl -fsSL https://cli-assets.distillabs.ai/install.sh | sh
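Once the script finishes, it is worth confirming the `distil` binary is actually reachable from your shell. A minimal, generic check (plain POSIX shell, no Distil-specific flags assumed):

```shell
# Print where the distil binary landed, or a hint if the shell cannot see it yet
command -v distil || echo "distil not found: open a new shell or add the install directory to your PATH"
```

If the path prints, you are ready for the steps below; otherwise restart your shell so the updated PATH takes effect.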
Minimal example
# Log in (if you don't have an account, use `distil register`)
distil login
# Create a model for your specific task
distil model create my-first-model
# Output: Model created with ID: <model-id>
# Upload your data (see Data preparation for details)
distil model upload-data <model-id> --data ./my-data-folder
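The expected data schema is documented in the Data preparation guide. Purely as a hypothetical sketch of what a folder of input/output examples could look like, assuming a JSONL layout (the file name, field names, and format here are illustrative assumptions, not the documented schema):

```shell
# Hypothetical example only -- see Data preparation for the real schema
mkdir -p my-data-folder
cat > my-data-folder/examples.jsonl <<'EOF'
{"input": "My parcel arrived damaged, I want a refund", "output": "billing"}
{"input": "The app crashes whenever I open settings", "output": "bug-report"}
EOF
wc -l my-data-folder/examples.jsonl
```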
# Train a small language model to solve your task as well as an LLM can
distil model run-training <model-id>
# Use your trained model
distil model deploy local <model-id> # deploy the model locally
distil model invoke <model-id> # get a script to invoke the model
That’s it! Your trained model is ready for local deployment. You can also use our Claude Skill to train models directly from Claude Code or Claude.ai.
Next steps
Ready to build your own specialized models? Continue to our How to train your SLM guide or explore detailed tutorials.