- Provides an API compatible with the OpenAI API, using the VS Code Language Model API.
- You can use the API just by installing this VS Code extension; no other software is required.
- All models available to GitHub Copilot Chat can be used.
VS Code provides the Language Model API to VS Code extensions. For now, GitHub Copilot provides LLM access for a fixed fee. If we can use that access via an OpenAI-compatible HTTP API, we gain a lot of power: any tool that speaks the OpenAI protocol can use it.
Furthermore, if you are looking for LLM proxy software that does not require an additional installation, this VS Code extension is a good solution.
This extension uses the VS Code Language Model API.
If you can use GitHub Copilot, then you can (probably) use this extension as well.
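As a minimal sketch of how a client would talk to the server once it is running: the TypeScript snippet below uses plain `fetch` (Node 18+), assumes the default port 59603 and the OpenAI-style chat completions endpoint listed under Endpoints, and uses a placeholder model id (pick a real one from the models endpoint).

```typescript
// Minimal sketch: send one chat request to the local server (Node 18+, built-in fetch).
// Assumptions: default port 59603, OpenAI-style /chat/completions endpoint,
// and a placeholder model id (use an id returned by GET /v1/models).
async function ask(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:59603/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "gpt-4o", // placeholder model id
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}

ask("Hello from the VS Code Language Model API!").then(console.log).catch(console.error);
```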
| name | default | description |
|---|---|---|
| http-lm-api.port | 59603 | The port number the API server listens on |
| http-lm-api.startServerAutomatically | true | If true, start the server automatically after VS Code finishes initializing |
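For example, the settings above can be placed in your user or workspace settings.json; the port value here is only an illustration, not a recommended value.

```jsonc
{
  // Listen on a custom port instead of the default 59603 (example value).
  "http-lm-api.port": 3000,
  // Start the API server as soon as VS Code finishes initializing.
  "http-lm-api.startServerAutomatically": true
}
```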
- OpenAI compatible
  - POST /chat/completions - Stream mode is supported.
  - GET /v1/models
  - GET /models
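The sketch below shows one way to consume stream mode. It assumes the server emits OpenAI-style server-sent events (`data: {...}` chunks terminated by `data: [DONE]`), which follows from the OpenAI compatibility claim rather than from this extension's documentation; the port and model id are the same assumptions as in the earlier example.

```typescript
// Minimal sketch of stream mode (Node 18+, built-in fetch and web streams).
// Assumptions: default port 59603, OpenAI-style SSE chunks ending with "data: [DONE]",
// and a placeholder model id (use an id returned by GET /v1/models).
async function streamChat(prompt: string): Promise<void> {
  const res = await fetch("http://localhost:59603/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "gpt-4o", // placeholder model id
      stream: true,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok || !res.body) throw new Error(`HTTP ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // Complete SSE events are separated by a blank line; keep the tail for the next read.
    const events = buffer.split("\n\n");
    buffer = events.pop() ?? "";

    for (const event of events) {
      if (!event.startsWith("data:")) continue;
      const data = event.slice("data:".length).trim();
      if (data === "[DONE]") return;
      const chunk = JSON.parse(data);
      // Each chunk carries an incremental piece of the assistant message.
      process.stdout.write(chunk.choices?.[0]?.delta?.content ?? "");
    }
  }
}

streamChat("Write a haiku about VS Code.").catch(console.error);
```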