The @lightweight/opencode-plugin package provides a deeper integration with OpenCode than the generic OpenAI-compatible setup. It adds model discovery, usage tracking, and automatic configuration.
Installation
```bash
npm install @lightweight/opencode-plugin
```
Configuration
opencode.json
Add the plugin to your opencode.json configuration file:
```json
{
  "plugins": [
    {
      "name": "@lightweight/opencode-plugin",
      "config": {
        "apiKey": "lw_sk_your-key-here",
        "defaultModel": "claude-sonnet-4.5"
      }
    }
  ]
}
```
Configuration Options
| Option | Type | Required | Description |
|---|---|---|---|
| `apiKey` | string | Yes | Your Lightweight API key |
| `defaultModel` | string | No | Default model to use (defaults to `gpt-5.4`) |
| `baseUrl` | string | No | Custom base URL (defaults to `https://api.lightweight.one/v1`) |
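As an illustration, a configuration that sets all three options explicitly (the values shown are placeholders) might look like:

```json
{
  "plugins": [
    {
      "name": "@lightweight/opencode-plugin",
      "config": {
        "apiKey": "lw_sk_your-key-here",
        "defaultModel": "claude-sonnet-4.5",
        "baseUrl": "https://api.lightweight.one/v1"
      }
    }
  ]
}
```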
Environment Variable Alternative
If you prefer not to store your API key in the config file, set it as an environment variable:
Bash / macOS / Linux:

```bash
export LIGHTWEIGHT_API_KEY="lw_sk_your-key-here"
```

PowerShell / Windows:

```powershell
$env:LIGHTWEIGHT_API_KEY = "lw_sk_your-key-here"
```
Then simplify your opencode.json:
```json
{
  "plugins": [
    {
      "name": "@lightweight/opencode-plugin",
      "config": {
        "defaultModel": "claude-sonnet-4.5"
      }
    }
  ]
}
```
The plugin will automatically read the LIGHTWEIGHT_API_KEY environment variable.
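The resolution order described above (an explicit config value, falling back to the environment variable) can be sketched as follows. This is a minimal illustration, not the plugin's actual internal API; `resolveApiKey` is a hypothetical name:

```typescript
// Minimal sketch of API-key resolution, assuming a Node.js runtime.
// An explicit config value wins; otherwise the LIGHTWEIGHT_API_KEY
// environment variable is used; if neither is set, fail loudly.
declare const process: { env: Record<string, string | undefined> };

function resolveApiKey(configKey?: string): string {
  const key = configKey ?? process.env.LIGHTWEIGHT_API_KEY;
  if (!key) {
    throw new Error(
      "No API key found: set config.apiKey in opencode.json or the " +
        "LIGHTWEIGHT_API_KEY environment variable"
    );
  }
  return key;
}
```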
The plugin automatically discovers all available models from the Lightweight catalog, so they appear in OpenCode’s model selector without manual configuration.
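A rough sketch of what discovery could look like, assuming the catalog is served as an OpenAI-compatible `GET /v1/models` response (`listModelIds` and the shape of `ModelsResponse` are illustrative assumptions, not the plugin's documented API):

```typescript
// Hypothetical sketch: extract model IDs from an OpenAI-style
// models listing, e.g. the JSON body of
//   GET {baseUrl}/models  with  Authorization: Bearer <apiKey>
// Each ID would then be shown in OpenCode's model selector.
interface ModelsResponse {
  data: { id: string }[];
}

function listModelIds(response: ModelsResponse): string[] {
  return response.data.map((model) => model.id);
}
```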