All right, I’ve got something really exciting to share with you today!

We all know that coding assistants have permanently changed the way we develop software, but the hefty price tag of advanced LLMs like GPT-4 has been a stumbling block for many.

But here’s the fantastic news: Cost is no longer a barrier!

With LLMs like Code Llama 7B, the dream of a powerful, auto-completing coding assistant running locally is now a reality.

In this guide, I’ll walk you through:

  1. Setting up Cody and VSCode
  2. Installing a local Code Llama 7B with Ollama (a quick preview follows this list)
  3. Boosting your coding efficiency with Cody (hopefully by 10x, though that is said with tongue firmly in cheek; this post is not sponsored in any way, Cody is simply cool!)
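As a quick preview of step 2: once Ollama itself is installed, pulling the model is a one-liner. The model tag below is an assumption; at the time of writing the Ollama model library lists a completion-tuned `codellama:7b-code` variant, but verify the tag there if the command fails:

```shell
# Pull the 7B Code Llama model (tag assumed to be "7b-code"; verify in the Ollama library)
ollama pull codellama:7b-code

# Quick sanity check from the terminal before wiring it into the editor
ollama run codellama:7b-code "Write a hello world program in Python"
```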

Let’s go!

Getting Started with Cody

Open VSCode, navigate to the ‘Extensions’ tab, and search for ‘Cody’.

Find ‘Cody AI’ in the search results and install the ‘Pre-release’ version.
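If you prefer the terminal, the same install can be done through the VSCode CLI. The extension ID below is an assumption (it appears to be `sourcegraph.cody-ai` on the marketplace); double-check it on the extension’s page if the command fails:

```shell
# Install the Cody AI extension on its pre-release channel via the VSCode CLI
# (extension ID assumed to be "sourcegraph.cody-ai")
code --install-extension sourcegraph.cody-ai --pre-release
```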