Welcome to Solo Documentation.
These docs provide a clear, high-level view of Solo, covering everything from installation and command usage to advanced topics and contributing.
Solo is an open-source platform for hardware-aware inference. With Solo, you can easily set up your environment, download and serve models, benchmark performance, fine-tune, ensemble, distill, and deploy across a variety of inference engines and platforms.
Quickstart
Get your Solo server set up locally for easy model inference.
Tutorials
Learn the different ways you can use the Solo server to manage and optimize model inference.
Examples
Check out the examples in our cookbook repo to bootstrap your own projects.