
ECA (Editor Code Assistant)#

Demo using eca-emacs

Demo using eca-vscode


installation • features • configuration • models • protocol


  • 📄 Editor-agnostic: a protocol any editor can integrate with.
  • ⚙ Single configuration: configure ECA once via global or local configs, and it works the same in any editor.
  • ➿ Chat interface: ask questions, review code, and work together on code.
  • ☕ Agentic: let the LLM work as an agent with its native tools and any MCPs you configure.
  • 💉 Context support: give the LLM more details about your code, including MCP resources and prompts.
  • 🚀 Multi models: OpenAI, Anthropic, local Ollama models, and custom user-configured models.

Rationale#

A free and open-source, editor-agnostic tool that links LLMs <-> editors, aiming for the best possible AI pair-programming UX through a well-defined protocol. The server is written in Clojure and is heavily inspired by the LSP protocol, a proven success case for this kind of integration.

The protocol makes it easier for other editors to integrate, and having a server in the middle helps add features quickly. Some examples:

  • Tool call management
  • Multiple LLM interaction
  • Telemetry of feature usage
  • A single way to configure ECA for any editor
  • The same UX, making it easy to onboard people and teams

As the LLM model race continues, the differences between models tend to become irrelevant, but the UX of editing code and planning changes will remain; ECA helps editors focus on that.

Getting started#

1. Install the editor plugin#

Install the plugin for your editor; the ECA server will be downloaded and started automatically.

2. Set up your first model#

To use ECA, you need to configure at least one model with your API key. See the Models documentation for detailed instructions on:

  • Setting up API keys for OpenAI, Anthropic, or Ollama
  • Adding and customizing models
  • Configuring custom providers

Quick start: Create a .eca/config.json file in your project root with your API key:

{
  "openaiApiKey": "your-openai-api-key-here",
  "anthropicApiKey": "your-anthropic-api-key-here"
}

Note: For other providers or custom models, see the custom providers documentation.

3. Start chatting#

Once your model is configured, you can start using ECA's chat interface in your editor to ask questions, review code, and work together on your project.

Type /init to ask ECA to create an AGENT.md file, which helps ECA keep good context about your project standards in later iterations.

How it works#

Editors spawn the server via eca server and communicate over stdin/stdout, similar to LSP servers. Supported editors already download the latest server on start and require no extra configuration.
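To make the stdio flow concrete, here is a minimal sketch of how an editor-side integration might frame a message for the server. It assumes ECA uses LSP-style `Content-Length` framing over JSON-RPC, which is an assumption based on the stated LSP inspiration; the `initialize` method name is likewise hypothetical, so consult the protocol documentation for the real wire format.

```python
import json


def frame_message(payload: dict) -> bytes:
    """Frame a JSON-RPC payload with an LSP-style Content-Length header.

    NOTE: the exact wire format is an assumption based on ECA's LSP
    inspiration; check the ECA protocol docs for the real framing.
    """
    body = json.dumps(payload).encode("utf-8")
    header = f"Content-Length: {len(body)}\r\n\r\n".encode("ascii")
    return header + body


# A hypothetical startup request an editor might write to the server's stdin.
msg = frame_message(
    {"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {}}
)
print(msg.decode("utf-8").splitlines()[0])  # prints the Content-Length header
```

In a real plugin this bytestring would be written to the stdin of the spawned `eca server` process, with responses read back from its stdout using the same framing.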

Roadmap#

Check the planned work here.

Troubleshooting#

You can start eca with --log-level debug or --verbose, which logs helpful information to stderr, such as what is being sent to the LLMs and what ECA is responding to editors.

Contributing#

Contributions are very welcome; please open an issue for discussion or a pull request. For developer details, check this doc.

Support the project 💖#

Consider sponsoring the project to help it grow faster; your support keeps the project going, updated, and maintained!