
Flow State with VS Code AI

37:37

Bret and Nirmal are joined by Continue.dev co-founder Nate Sesti to walk through an open-source replacement for GitHub Copilot. Continue lets you use both open-source and closed-source LLMs in the JetBrains and VS Code IDEs, adding AI to your coding workflow without leaving the editor.

You've probably heard of GitHub Copilot and other AI code assistants. The Continue team has created a completely open-source alternative, arguably a superset of these existing tools: beyond being open source, it's highly configurable and lets you choose multiple models for code completion and chat in VS Code and JetBrains, with support for more editors on the way.

This show builds on our recent Ollama episode: Continue can use Ollama in the background to run a local LLM for you, if you prefer a local model over internet-hosted LLMs.
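
If you want to go the local route, here's a rough sketch of the kind of Continue config that points chat and autocomplete at Ollama models. The file location, field names, and model names below are assumptions based on Continue's JSON config around the time of this episode, so check the Continue docs for the current schema:

    {
      "models": [
        {
          "title": "Llama 3 (local via Ollama)",
          "provider": "ollama",
          "model": "llama3"
        }
      ],
      "tabAutocompleteModel": {
        "title": "StarCoder2 (local via Ollama)",
        "provider": "ollama",
        "model": "starcoder2:3b"
      }
    }

With something like that in ~/.continue/config.json, and the models pulled first (for example, ollama pull llama3), Continue's chat and tab autocomplete talk to Ollama running on your machine instead of a hosted API.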

Be sure to check out the live recording of the complete show from May 16, 2024, on YouTube (Ep. 266), which includes demos.

★Topics★
Continue.dev Website

  • (00:00) - Introduction
  • (01:52) - Meet Nate Sesti, CTO of Continue
  • (02:40) - Birth and Evolution of Continue
  • (03:56) - Continue's Features and Benefits
  • (22:24) - Running Multiple Models in Parallel
  • (26:38) - Best Hardware for Continue
  • (32:45) - Other Advantages of Continue
  • (36:08) - Getting Started with Continue

You can also support my free material by subscribing to my YouTube channel and my weekly newsletter at bret.news!

Grab the best coupons for my Docker and Kubernetes courses.
Join my cloud native DevOps community on Discord.
Grab some merch at Bret's Loot Box.
Homepage: bretfisher.com
