
Llaminal Banner

License Python Ollama

Llaminal

Llaminal is the ultimate CLI companion for your local Ollama instance. Stop wrestling with raw API calls and curl commands. Embrace the terminal with style.

Featuring a cyberpunk aesthetic, Llaminal gives you a powerful, persistent, and multimodal interface to your local LLMs.


⚡️ Key Features

  • πŸ—£οΈ Interactive Chat: Full REPL with history, slash commands, and session management.
  • πŸ‘οΈ Multimodal Support: Drag-and-drop images into your terminal chat to use Vision models.
  • πŸ“š RAG & Context: Pipe files directly into ask or use /add to read entire directories into context.
  • πŸ’Ύ Sessions: Save, list, and reload your conversations anytime.
  • 🎨 Rich UI: Beautiful markdown rendering, tables, and spinners.
  • πŸ› οΈ Model Ops: Pull, show, and delete models without leaving the tool.
  • πŸ”„ Resilient: Automatic retry on transient network errors with helpful diagnostics.

🚀 Quick Start

Installation

```bash
# Clone the repository
git clone <repo-url>
cd Llaminal

# Install locally in a virtual environment
python3 -m venv venv
source venv/bin/activate
pip install -e .
```

Usage

1. The "One-Shot" Ask: perfect for quick scripts or piped debugging.

```bash
# Simple question
llaminal ask "What is the capital of Peru?"

# Debugging a log file
cat error.log | llaminal ask "Explain this error and fix it"
```
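How might piped input reach the model? One common pattern is to read stdin when it is not a TTY and fold it into the prompt. A minimal sketch under that assumption; `build_prompt` and the separator format are illustrative, not Llaminal's real prompt layout.

```python
import sys

def build_prompt(question, stdin_stream=sys.stdin):
    """Combine a question with piped stdin, as in `cat error.log | llaminal ask ...`."""
    if not stdin_stream.isatty():  # data was piped in, not typed interactively
        piped = stdin_stream.read().strip()
        if piped:
            return f"{question}\n\n--- piped input ---\n{piped}"
    return question
```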

2. The Interactive Chat: your main command center.

```bash
llaminal chat
```
  • Type /help to see all commands.
  • Type /image ./path/to/img.png to attach an image.
  • Type /add ./src to read your codebase.
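A REPL like this typically distinguishes slash commands from normal chat input by a small parsing step. A hypothetical sketch of that dispatch (the function name and return shape are assumptions, not Llaminal's internals):

```python
def parse_slash_command(line):
    """Split a REPL line into (command, argument); (None, line) means chat input."""
    if not line.startswith("/"):
        return None, line
    cmd, _, arg = line[1:].partition(" ")
    return cmd, arg.strip()
```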

3. Model Management

```bash
llaminal list
llaminal pull tinyllama
llaminal show llama3.2
```
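Under the hood these commands talk to Ollama's HTTP API; `llaminal list`, for instance, could fetch `GET /api/tags` and format the result. A sketch of that call, assuming Ollama's documented `/api/tags` response shape (`{"models": [{"name": ...}, ...]}`); the helper names are illustrative.

```python
import json
from urllib.request import urlopen

OLLAMA_HOST = "http://localhost:11434"

def model_names(payload):
    """Extract model names from an /api/tags response payload."""
    return [m["name"] for m in payload.get("models", [])]

def list_models(host=OLLAMA_HOST):
    """Fetch the installed models from Ollama's GET /api/tags endpoint."""
    with urlopen(f"{host}/api/tags") as resp:
        return model_names(json.load(resp))
```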

βš™οΈ Configuration

Control Llaminal via environment variables:

| Variable | Default | Description |
| --- | --- | --- |
| `OLLAMA_HOST` | `http://localhost:11434` | URL of your Ollama server. |
| `LLAMINAL_MODEL` | `llama3.2` | Default model for `ask` and `chat`. |
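Resolving these variables with their documented defaults could look like the following; the variable names and defaults come from the table above, while the `get_config` helper itself is an illustrative sketch.

```python
import os

def get_config(env=os.environ):
    """Read Llaminal's settings from the environment, falling back to defaults."""
    return {
        "host": env.get("OLLAMA_HOST", "http://localhost:11434"),
        "model": env.get("LLAMINAL_MODEL", "llama3.2"),
    }
```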

💖 Sponsor

If you love Llaminal, consider supporting the development!

GitHub Sponsors


Built with ❀️ by a developer who loves the terminal.
