The official OpenAI Node.js SDK works with Lightweight by setting a custom baseURL. Access every model in the catalog through the familiar OpenAI TypeScript/JavaScript interface.

Installation

npm install openai

Usage

import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "lw_sk_your-key-here",
  baseURL: "https://api.lightweight.one/v1",
});

async function main() {
  const response = await client.chat.completions.create({
    model: "gpt-5.4",
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: "What is the capital of France?" },
    ],
  });

  console.log(response.choices[0].message.content);
}

main();
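Hardcoding the key is fine for a quick test, but in practice you will usually read it from the environment. A minimal sketch, assuming an environment variable named LIGHTWEIGHT_API_KEY (the variable name and the loadApiKey helper are this page's suggestions, not something the SDK defines):

```typescript
// Sketch: load the API key from the environment instead of hardcoding it.
// LIGHTWEIGHT_API_KEY is an assumed name; use whatever your deployment defines.
function loadApiKey(env: Record<string, string | undefined> = process.env): string {
  const key = env.LIGHTWEIGHT_API_KEY;
  if (!key) {
    throw new Error("LIGHTWEIGHT_API_KEY is not set");
  }
  return key;
}

// const client = new OpenAI({
//   apiKey: loadApiKey(),
//   baseURL: "https://api.lightweight.one/v1",
// });
```

Failing fast on a missing key gives a clear error at startup rather than an opaque 401 on the first request.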

Streaming

const stream = await client.chat.completions.create({
  model: "claude-sonnet-4.5",
  messages: [{ role: "user", content: "Write a haiku about JavaScript." }],
  stream: true,
});

for await (const chunk of stream) {
  const content = chunk.choices[0]?.delta?.content;
  if (content) process.stdout.write(content);
}
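If you also need the full reply after the stream finishes (to log or cache it), you can fold the deltas into a string as they arrive. A small sketch: collectStream is a hypothetical helper, not part of the SDK, and the Chunk type mirrors only the fields read here:

```typescript
// Minimal shape of a streamed chunk, covering only the fields we read.
type Chunk = { choices: { delta?: { content?: string } }[] };

// Hypothetical helper: concatenate streamed deltas into the complete reply.
async function collectStream(stream: AsyncIterable<Chunk>): Promise<string> {
  let text = "";
  for await (const chunk of stream) {
    text += chunk.choices[0]?.delta?.content ?? "";
  }
  return text;
}

// const full = await collectStream(stream);
```

You can still print each delta inside the loop if you want live output as well as the collected text.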

Using Different Models

Swap the model parameter to use any provider — no other changes needed:

// Anthropic
const response = await client.chat.completions.create({
  model: "claude-opus-4.6",
  messages,
});

// Google
const response = await client.chat.completions.create({
  model: "gemini-2.5-pro",
  messages,
});

// xAI
const response = await client.chat.completions.create({
  model: "grok-code-fast-1",
  messages,
});
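Because every provider sits behind the same endpoint, you can also fail over between models in a few lines. A hedged sketch: withFallback is a hypothetical helper (not part of the SDK), and the create callback stands in for client.chat.completions.create:

```typescript
// Hypothetical helper: try each model in order, returning the first success.
async function withFallback<T>(
  models: string[],
  create: (model: string) => Promise<T>,
): Promise<T> {
  let lastError: unknown;
  for (const model of models) {
    try {
      return await create(model);
    } catch (err) {
      lastError = err; // remember the failure and try the next model
    }
  }
  throw lastError;
}

// const response = await withFallback(
//   ["claude-opus-4.6", "gemini-2.5-pro"],
//   (model) => client.chat.completions.create({ model, messages }),
// );
```

If every model fails, the last error is rethrown so the caller sees a real failure cause.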

See the full Models catalog and Pricing for all available options.