flat35hd99/vscode-http-lm-api

A VS Code extension that serves an HTTP API using the VS Code Language Model API.

Features

  • Provides an OpenAI-compatible API using the VS Code Language Model API (see the example below this list).
    • You can use the API just by installing the VS Code extension.
  • All models available to GitHub Copilot Chat can be used.
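For example, an OpenAI client library can be pointed at the local server. The following is a minimal sketch using the openai npm package; it assumes the server is running on the default port 59603, accepts the client's default route layout, and requires no API key, and the model name is only a placeholder (list the available models first, see Specifications below).

```typescript
// Minimal sketch: point the official openai npm client at the local server.
// Assumptions (not confirmed by this README): the server accepts the client's
// default route layout, requires no API key, and runs on the default port 59603.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:59603",
  apiKey: "unused", // placeholder; the local proxy presumably needs no key
});

async function main(): Promise<void> {
  const completion = await client.chat.completions.create({
    model: "gpt-4o", // placeholder; query the models endpoint for real names
    messages: [{ role: "user", content: "Hello from the Language Model API!" }],
  });
  console.log(completion.choices[0].message.content);
}

main().catch(console.error);
```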

Motivation

VS Code exposes the Language Model API to extensions. Currently, GitHub Copilot provides LLM access for a fixed fee. If we can use that access through an OpenAI-compatible HTTP API, we gain a lot of power!

Furthermore, for anyone looking for LLM proxy software that does not require an additional installation, this VS Code extension is a good solution.

Requirements

This extension uses the VS Code Language Model API.

If you can use GitHub Copilot, then (probably) you can use this extension.

Extension Settings

| Name | Default | Description |
| ---- | ------- | ----------- |
| http-lm-api.port | 59603 | The port number the API server listens on |
| http-lm-api.startServerAutomatically | true | If true, start the server automatically after VS Code finishes initializing |
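End users set these keys in their VS Code settings (settings.json or the Settings UI). Inside an extension, such settings are typically read through the configuration API; the sketch below is illustrative only and assumes the configuration section is named http-lm-api, matching the keys above, which is not taken from this extension's source code.

```typescript
// Illustrative sketch of reading the settings above with the VS Code API.
// Assumption: the configuration section is "http-lm-api", matching the keys
// in the table; this is not taken from the extension's source code.
import * as vscode from "vscode";

function readServerSettings(): { port: number; autoStart: boolean } {
  const config = vscode.workspace.getConfiguration("http-lm-api");
  return {
    port: config.get<number>("port", 59603),
    autoStart: config.get<boolean>("startServerAutomatically", true),
  };
}
```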

Specifications

  • OpenAI compatible (see the request sketch after this list)
    • POST /chat/completion
      • Stream mode is supported.
    • GET /v1/models
    • GET /models
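As a rough illustration, the routes above can be called directly over HTTP. The sketch below assumes the server is reachable at http://localhost:59603 with no authentication and accepts OpenAI-style request bodies; the model name is a hypothetical placeholder, so check GET /models for the real identifiers.

```typescript
// Rough sketch of calling the routes listed above directly (Node 18+ fetch).
// Assumptions: server on the default port 59603, no authentication required,
// and OpenAI-style request/response bodies. The model name is a placeholder.
async function main(): Promise<void> {
  const base = "http://localhost:59603";

  // List the models exposed through GitHub Copilot Chat.
  const models = await fetch(`${base}/models`).then((r) => r.json());
  console.log(models);

  // Non-streaming request to the chat route listed above.
  const res = await fetch(`${base}/chat/completion`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "gpt-4o", // placeholder; pick one returned by /models
      stream: false,   // set to true to use the supported stream mode
      messages: [{ role: "user", content: "Say hello." }],
    }),
  });
  console.log(await res.json());
}

main().catch(console.error);
```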
