An interactive CLI-based AI coding assistant inspired by Cursor.

- Multiple operation modes:
  - Interactive: Ask questions and get responses
  - Autonomous: Let the AI plan and execute tasks
  - Manual: Control each step of the process
- Support for various LLMs:
  - Google Gemini
  - OpenAI (GPT models)
  - Anthropic Claude
  - Local LLMs via Ollama or LM Studio
- File operations and codebase understanding
- Terminal command execution
- Code search capabilities
- Interactive UI with rich terminal formatting

See INSTALL.md for detailed installation instructions.

- Clone this repository:

  ```bash
  git clone https://github.com/AnonAmit/AnonCodexCli.git
  cd AnonCodexCli
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- Create a `.env` file with your API keys (see `.env.example`):

  ```bash
  cp .env.example .env
  ```

- Edit the `.env` file with your actual API keys.
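
As a rough sketch, a populated `.env` might look like the lines below. The API key variable names (`GOOGLE_API_KEY`, `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`) are assumptions based on common provider conventions; check `.env.example` for the names AnonCodexCli actually reads.

```bash
# Hypothetical key names -- confirm against .env.example
GOOGLE_API_KEY=your_gemini_api_key
OPENAI_API_KEY=your_openai_api_key
ANTHROPIC_API_KEY=your_anthropic_api_key
```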

Run the CLI:

```bash
python cli.py
```

This will present you with mode selection:
- Interactive
- Autonomous
- Manual
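
You can also skip the prompt and pick a mode directly with the `--mode` flag (see the `--help` output near the end of this README), for example:

```bash
# Start directly in autonomous mode instead of answering the mode prompt
python cli.py --mode autonomous
```
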
In interactive mode, you can:
- Ask coding questions
- Request file edits
- Run terminal commands through the assistant

Example:

```
> Fix the bug in app.py line 56
```
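
A few more illustrative prompts, one for each capability above (`settings.py` is a placeholder file name, not part of this repository):

```
> What does the argument parsing in cli.py do?
> Rename the variable cfg to config in settings.py
> Run the unit tests and summarize any failures
```
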
In autonomous mode, the assistant will:
- Plan the execution steps
- Execute each step
- Test the results
- Report progress
You can toggle whether to show code in the terminal.
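
As a sketch, you could combine the `--mode` and `--query` flags from the `--help` output below to launch an autonomous run straight from the shell (whether `--query` is honored in autonomous mode is an assumption, not confirmed by this README):

```bash
# Assumed combination: plan and execute a single task, then exit
python cli.py --mode autonomous --query "Add input validation to app.py"
```
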
In manual mode, you control each step the assistant takes.

AnonCodexCli supports multiple LLM providers:

- Google Gemini: gemini-pro
- OpenAI: gpt-3.5-turbo, gpt-4, gpt-4o
- Anthropic Claude: claude-3-opus, claude-3-sonnet
- Ollama: llama3, mistral, mixtral
- LM Studio: custom
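
For local models, a minimal sketch using Ollama could look like this; it assumes Ollama is already installed and running, and that AnonCodexCli accepts the Ollama model name directly via `--model`:

```bash
# Pull a local model with Ollama, then point AnonCodexCli at it
ollama pull llama3
python cli.py --model llama3
```
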
When running the CLI, you'll be prompted to select a model. You can also specify a model using the `--model` flag:

```bash
python cli.py --model gemini-pro
```

You can test all supported models to verify API connections:

```bash
python test_models.py
```

Configure AnonCodexCli through the `.env` file:

- `DEFAULT_MODEL`: Default LLM to use
- `DEFAULT_MODE`: Default operation mode
- `MAX_TOKENS`: Maximum tokens for LLM responses
- `HISTORY_PATH`: Path to store interaction history
- `DEBUG`: Enable debug logging
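
As an illustration, those settings could be added to `.env` like this (the values are placeholders, not defaults taken from the project):

```bash
# Example .env values -- placeholders, not project defaults
DEFAULT_MODEL=gemini-pro
DEFAULT_MODE=interactive
MAX_TOKENS=4096
HISTORY_PATH=.anoncodex_history
DEBUG=false
```

The full set of command-line options, as printed by `python cli.py --help`, is: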

```
usage: cli.py [-h] [--mode {interactive,autonomous,manual}] [--model MODEL] [--debug] [--query QUERY]

AnonCodexCli - A CLI-based AI coding assistant

options:
  -h, --help            show this help message and exit
  --mode {interactive,autonomous,manual}, -m {interactive,autonomous,manual}
                        Operation mode (default: interactive)
  --model MODEL         LLM model to use (default: gemini-pro)
  --debug, -d           Enable debug mode
  --query QUERY, -q QUERY
                        Query to process (if provided, runs in non-interactive mode and exits after completion)
```
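
For example, a one-shot, non-interactive run with debug logging enabled might look like this (the query text is illustrative):

```bash
# Process a single query, print debug output, and exit
python cli.py -q "Explain the purpose of test_models.py" -d
```
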
MIT License
