Using a Local LLM | Visual Studio Integration – Code Completion

With AI-based code completion being all the rage, I wanted to explore the local options available as an alternative to Cursor's hosted services. My primary prerequisite was that it must have a native plugin for Visual Studio Code. This search brought me to Llama.VSCode, which (among other …