4. Running inference
How do we run inference on an open-source LLM? The following is a (non-exhaustive) list of methods for interacting with an open-source LLM.
Projects:
- HF Transformers: provides APIs and tools to easily run inference on LLMs available from the HF Hub.
- A framework for developing applications powered by LLMs.
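As a minimal sketch of the first option, inference with HF Transformers can be run through its `pipeline` API. This assumes the `transformers` package is installed; the small `gpt2` checkpoint is used purely for illustration, not as a model recommendation.

```python
from transformers import pipeline

# Load a text-generation pipeline from the HF Hub.
# "gpt2" is a small illustrative checkpoint that downloads quickly.
generator = pipeline("text-generation", model="gpt2")

# Generate a short continuation of a prompt.
result = generator("Open source LLMs are", max_new_tokens=20)
print(result[0]["generated_text"])
```

By default the pipeline returns a list of dictionaries, one per generated sequence, with the prompt included at the start of `generated_text`.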
The Awesome-LLM repository also maintains a useful list of tools for deploying LLMs.