Ollama supports OpenAI-compatible API endpoints. You can point Ollama’s OpenAI compatibility layer at Lightweight to access cloud models alongside your local ones.

Setup

Set the following environment variables to redirect Ollama’s OpenAI-compatible requests to Lightweight:
```shell
export OLLAMA_OPENAI_BASE_URL="https://api.lightweight.one/v1"
export OLLAMA_OPENAI_API_KEY="lw_sk_your-key-here"
```
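To see what these variables translate to on the wire, here is a minimal sketch of the OpenAI-style request that gets sent to the configured base URL. It builds the request without sending it; the model name is a placeholder, and the key shown is the placeholder from the setup above, not a real credential.

```python
import json
import os
import urllib.request

# Fall back to the placeholder values from the Setup section if the
# environment variables are not set; replace the key with your own.
base_url = os.environ.get("OLLAMA_OPENAI_BASE_URL", "https://api.lightweight.one/v1")
api_key = os.environ.get("OLLAMA_OPENAI_API_KEY", "lw_sk_your-key-here")

# An OpenAI-compatible chat completions request: POST to <base_url>/chat/completions
# with a bearer token. "your-model" is a placeholder; use a model from the catalog.
payload = {
    "model": "your-model",
    "messages": [{"role": "user", "content": "Hello"}],
}
req = urllib.request.Request(
    f"{base_url}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.full_url)
```

Any client that speaks the OpenAI protocol can issue this same request directly; routing through Ollama's compatibility layer simply fills in the base URL and key for you.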
Once set, any tool or application that uses Ollama’s OpenAI-compatible endpoint will be routed through Lightweight, giving you access to the full Lightweight model catalog.
This configuration applies to Ollama’s OpenAI-compatible mode only. Local Ollama models will continue to run locally as usual.