Cody emerges as a noteworthy open-source competitor to GitHub Copilot in the AI-assisted coding landscape.
Unlike GitHub Copilot’s closed-source model, Cody’s open-source nature allows for greater customization and transparency, addressing privacy concerns more effectively.
Key Takeaways:
- Open Source Advantage: Cody’s open-source framework stands out, offering transparency and customization opportunities that GitHub Copilot’s closed-source model does not.
- Contextual Awareness: Leveraging Sourcegraph, Cody provides a more comprehensive understanding of your codebase, leading to more relevant suggestions and insights.
- IDE Integration: Cody supports popular IDEs like VS Code and IntelliJ IDEA, with plans for further expansion, highlighting its commitment to accommodating diverse developer environments.
- Privacy Consideration: Cody assures users of their data privacy by not storing or using code for training purposes, a significant concern for many in the developer community.
- Comprehensive Feature Set: Beyond code completion, Cody aids in code documentation, unit test generation, and identifying code smells, showcasing a broad range of functionalities aimed at improving code quality and efficiency.
- User Feedback: While Cody receives positive reviews for its potential in business environments, user feedback suggests its UI and documentation features could be improved for an even better experience.
- Swappable LLMs: Support for Anthropic Claude, Claude 2, and OpenAI GPT-4/GPT-3.5, with more coming soon. You can even run Cody with a local LLM by installing Ollama and then choosing the Ollama option in the Cody extension settings.
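The local-LLM setup mentioned above can be sketched as follows. This is a minimal sketch assuming a macOS/Linux machine; the model name (`codellama:7b`) is just one example of a code model available through Ollama:

```shell
# Install Ollama (macOS/Linux one-liner from ollama.com; other platforms
# have dedicated installers)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a local code model for Ollama to serve
ollama pull codellama:7b

# Ollama now serves models on http://localhost:11434 by default
```

With Ollama running locally, you would then open the Cody extension settings in your IDE and select the Ollama option, as described above, so completions are served by the local model instead of a hosted one.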
Note: Cody version 1.0 was made generally available in December 2023; the Pro version is offered free until February 2024.
References:
Running Cody with local LLM:
100% Open-Source Llama Coding Assistant: Bye, bye GPT-4!
Related (05/23):
Comparing Rix, Cody, Phind, Copilot Chat, and ChatGPT’s responses to a basic Ruby on Rails query