📌 Ollama API (localhost)
Number of APIs: 1
Prerequisites
- Qodex 
- Ollama installed and running locally (download from https://ollama.com/) 
Usage
- Create a fork of the collection 
- Send requests to the local Ollama server (default: http://localhost:11434); see the sketch after this list 
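As a minimal sketch of what a request can look like, the snippet below calls Ollama's documented /api/generate endpoint from Python. It assumes Ollama is running on its default port (11434), that a model such as llama3 has already been pulled, and uses the requests library purely for illustration.

```python
import requests

# Ollama listens on localhost:11434 by default; /api/generate is the
# single-turn completion endpoint.
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama3",         # assumes this model was pulled beforehand
    "prompt": "Why is the sky blue?",
    "stream": False,           # return one JSON object instead of a stream
}

response = requests.post(OLLAMA_URL, json=payload, timeout=120)
response.raise_for_status()

# With streaming disabled, the generated text is in the "response" field.
print(response.json()["response"])
```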
Documentation
- Models: https://ollama.com/library
Models
Models include Gemma (by Google, open-weight), Llama & CodeLlama (by Meta AI, open-weight), Mixtral (by Mistral AI, open-weight), Phi (by Microsoft), and more.
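To check which of these models are already pulled into the local installation, Ollama's /api/tags endpoint lists them. A small sketch, again assuming the default localhost port:

```python
import requests

# GET /api/tags returns the models currently available locally,
# each with its name, size, and modification time.
resp = requests.get("http://localhost:11434/api/tags", timeout=30)
resp.raise_for_status()

for model in resp.json().get("models", []):
    print(model["name"])
```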
About Ollama
Ollama is a tool (similar in concept to Docker) for running Large Language Models locally. It can be used through its REST API, Python SDK, or CLI.
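For the Python SDK route, a minimal sketch is shown below. It assumes the official ollama package is installed (pip install ollama), the local server is running, and the llama3 model has been pulled.

```python
import ollama

# chat() sends a multi-turn style request to the local Ollama server.
reply = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Summarize what Ollama does."}],
)

# The assistant's answer is available under "message" -> "content".
print(reply["message"]["content"])
```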