
Add an LLM to VSCode

We will use the Continue extension to add an LLM to VSCode.

Install extension from marketplace

Install the extension for VSCode or a JetBrains IDE from the marketplace. In this example, we will install it for VSCode.

alt text

After installation, you need to install a server to run models locally. For this guide, we will use these:

tip

If you use LM Studio, don't forget to enable server mode:

alt text
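Both servers expose a local HTTP API that Continue talks to. The sketch below only builds the kind of chat request such a server expects; nothing is sent. The ports are the defaults for standard installs (Ollama on 11434, LM Studio on 1234), and the model name is just an example.

```python
import json

# Default local endpoints (assumption: standard installs).
OLLAMA_URL = "http://localhost:11434"       # Ollama's default port
LM_STUDIO_URL = "http://localhost:1234/v1"  # LM Studio's default OpenAI-compatible base URL

def chat_payload(model: str, prompt: str) -> str:
    """Build an OpenAI-style chat payload, the shape of request
    tools like Continue send to a local server. Nothing is sent here."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    })

# "llama3.1:8b" is an example model name; use whatever you have pulled.
print(chat_payload("llama3.1:8b", "Explain this function"))
```

If the request to one of these URLs fails, the server is most likely not running (or, for LM Studio, server mode is not enabled).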

A simple way to use an LLM in VSCode

After installing Ollama, the simplest way to use it is to download models via the plugin:

alt text

Click Connect:

alt text

Enjoy!

alt text

After that, you can start using the LLM in VSCode. That's all!

Add another model

Open the config file:

alt text

The models added for Ollama are shown here:

alt text
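For reference, a minimal sketch of what such an entry looks like in Continue's `config.json` (the title and model name are examples; use any model you have pulled with Ollama):

```json
{
  "models": [
    {
      "title": "Llama 3.1 8B",
      "provider": "ollama",
      "model": "llama3.1:8b"
    }
  ]
}
```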

For chat, you can add another model:

alt text

Use the installed server:

alt text

alt text

You can choose a specific model or use the autodetect option:

alt text
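With autodetect, Continue asks the server which models it has available instead of pinning one. A sketch of such an entry, assuming the Ollama provider:

```json
{
  "models": [
    {
      "title": "Ollama (autodetect)",
      "provider": "ollama",
      "model": "AUTODETECT"
    }
  ]
}
```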

The detected models are shown here:

alt text

Example for LM Studio:

alt text
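A similar sketch for LM Studio; the `apiBase` shown here is an assumption matching LM Studio's default server address from earlier, and can be dropped if your setup uses the default:

```json
{
  "models": [
    {
      "title": "LM Studio",
      "provider": "lmstudio",
      "model": "AUTODETECT",
      "apiBase": "http://localhost:1234/v1"
    }
  ]
}
```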

Usage example. You can see how much memory is used while the answer is generated:

alt text