In the July 2024 release of Visual Studio Code, extension developers can use GitHub Copilot's large language models (LLMs) through the new Language Model API. This API allows you to retrieve the available LLMs in Visual Studio Code and send them requests.
Info: You can read more about it in the Language Model API documentation.
The nice thing about this new API is that you can use it in your extension any way you want. For instance, you can integrate your extension into the GitHub Copilot chat, perform requests from your extension commands, create your own code completion logic, etc.
Waldek Mastykarz gave me the idea to test this out in the Front Matter CMS extension. Front Matter already had an AI integration for title, description, and taxonomy suggestions, but it was only available to the project's sponsors. The idea was to change this so that any user can use the GitHub Copilot LLM when it's available in their editor.
Important: To use the GitHub Copilot LLM, you need to have the GitHub Copilot extension installed and an active GitHub Copilot subscription.
Using the Language Model API
In the upcoming version of Front Matter CMS, the GitHub Copilot LLM will be used when creating a new article to suggest a title and allow the user to generate a description and taxonomy based on the title and contents of the article. Extension commands trigger these actions.
In the following examples, I will show you how to interact with the LLM from within your extension commands.
Retrieving the available LLMs
To retrieve the available LLMs in Visual Studio Code, use the vscode.lm.selectChatModels method. This method returns a promise that resolves to the available LLMs.
Info: At the time of writing this article, there were three models available for GitHub Copilot: gpt-3.5-turbo, gpt-4, and gpt-4-turbo.
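Here is a minimal sketch of how this could look inside an extension command; the sample.suggestTitle command identifier is a placeholder:

```typescript
import * as vscode from 'vscode';

export function activate(context: vscode.ExtensionContext) {
  // "sample.suggestTitle" is a placeholder command identifier.
  const command = vscode.commands.registerCommand('sample.suggestTitle', async () => {
    // Ask for Copilot-provided chat models of the gpt-4 family.
    const [model] = await vscode.lm.selectChatModels({
      vendor: 'copilot',
      family: 'gpt-4'
    });

    if (!model) {
      // Copilot may not be installed, or there is no active subscription.
      vscode.window.showWarningMessage('No GitHub Copilot model is available.');
      return;
    }

    vscode.window.showInformationMessage(`Using model: ${model.name}`);
  });

  context.subscriptions.push(command);
}
```

The selector object narrows down the result by vendor and model family, so you only get the models your extension knows how to work with.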
Creating a prompt
Once you know the model is available, you can create a prompt to interact with the LLM. A prompt consists of one or more vscode.LanguageModelChatMessage objects.
Tip: For more advanced prompt creation, the Visual Studio Code team is building the @vscode/prompt-tsx dependency.
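As a sketch, a prompt for the title-suggestion scenario could look like this; the articleContents variable is a placeholder for whatever content your extension gathers:

```typescript
import * as vscode from 'vscode';

// Placeholder: in a real extension this would come from the active document.
const articleContents = 'Using the Language Model API in VS Code extensions...';

// A prompt is an array of chat messages; each one is created through the
// static factory methods on vscode.LanguageModelChatMessage.
const messages = [
  vscode.LanguageModelChatMessage.User(
    'You help authors write metadata for their articles. Reply with a single title suggestion, without quotes.'
  ),
  vscode.LanguageModelChatMessage.User(
    `Suggest a title for an article with the following contents: ${articleContents}`
  )
];
```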
Sending the prompt
Once the prompt is created, you can send it to the LLM using the model.sendRequest method.
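A sketch of that step, assuming the model and messages variables from the previous snippets and running inside the same async command callback:

```typescript
import * as vscode from 'vscode';

try {
  // Send the chat messages to the selected model. The empty object holds
  // optional request settings; the token allows cancelling the request.
  const response = await model.sendRequest(
    messages,
    {},
    new vscode.CancellationTokenSource().token
  );

  // The answer streams back in fragments; collect them into one string.
  let answer = '';
  for await (const fragment of response.text) {
    answer += fragment;
  }

  vscode.window.showInformationMessage(answer);
} catch (error) {
  if (error instanceof vscode.LanguageModelError) {
    // For example: the user did not give consent, or the quota is exhausted.
    console.error(error.message, error.code);
  } else {
    throw error;
  }
}
```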
The result of this code is the model's complete answer, assembled from the streamed fragments, which you can then show to the user or use in your extension's logic.
Conclusion
The Visual Studio Code team made it easy for extension developers to use the GitHub Copilot LLM in their extensions. The Language Model API opens up many possibilities for extension developers to enhance their extensions with AI capabilities.