A panel applet for the COSMIC desktop that provides quick access to local AI assistance via Ollama.
## Features

- Chat with local Ollama models directly from your panel
- Automatic context gathering:
  - Clipboard - copied text (Ctrl+C)
  - Selection - highlighted text (no copy needed)
  - System info - OS, kernel, memory
  - Recent errors - the last 5 journal errors
- Pre-configured as a Pop!_OS/Linux assistant
- Fast responses with GPU acceleration
## Requirements

- COSMIC Desktop (Pop!_OS 24.04+ or other COSMIC-enabled distros)
- Ollama installed and running
- `wl-clipboard` for clipboard integration
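A quick way to verify the requirements are in place (a sketch; `ollama` and `wl-paste` are the binaries the packages above provide):

```sh
# Check that the required tools are on PATH
missing=""
for cmd in ollama wl-paste; do
  command -v "$cmd" >/dev/null 2>&1 || missing="$missing $cmd"
done
if [ -n "$missing" ]; then
  echo "Missing:$missing"
else
  echo "All requirements found"
fi
```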
## Installation

```sh
# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh

# Install clipboard tools
sudo apt install wl-clipboard

# Pull a model (choose one)
ollama pull phi3:mini    # Small, fast (~2GB)
ollama pull llama3.2:3b  # Smarter (~2GB)
ollama pull llama3.2     # Best quality (~4GB)

# Install Rust if needed
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
```
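Before building, you can confirm the Ollama server is actually reachable. This queries Ollama's `/api/version` endpoint on its default port:

```sh
# Confirm the Ollama server is up (it listens on localhost:11434 by default)
if curl -fsS http://localhost:11434/api/version >/dev/null 2>&1; then
  ollama_status="running"
else
  ollama_status="not reachable (try: ollama serve)"
fi
echo "Ollama: $ollama_status"
```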
```sh
# Clone and build
git clone https://github.com/paul-wade/cosmic-applet-ollama.git
cd cosmic-applet-ollama
cargo build --release

# Install
sudo install -Dm0755 target/release/cosmic-applet-ollama /usr/bin/cosmic-applet-ollama
sudo install -Dm0644 resources/app.desktop /usr/share/applications/com.github.paulwade.cosmic-applet-ollama.desktop
```

Then add the applet to your panel:

- Right-click on your COSMIC panel
- Select "Add Applet"
- Find "Ollama Assistant" and add it
## Usage

- Click the applet icon in your panel
- Type a question or request
- For context-aware help:
  - Copy text (Ctrl+C) before asking - error messages, config files, code
  - Select text (highlight it) - the applet reads the primary selection too
  - Recent system errors are included automatically for troubleshooting
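You can preview the same context the applet gathers by running the underlying tools yourself (a sketch; the exact commands the applet uses internally may differ):

```sh
# Clipboard contents (what Ctrl+C copied)
clipboard=$(wl-paste --no-newline 2>/dev/null || echo "")

# Primary selection (text that is merely highlighted)
selection=$(wl-paste --primary --no-newline 2>/dev/null || echo "")

# System info
sysinfo=$(uname -sr)

# Last 5 journal errors
errors=$(journalctl -p err -n 5 --no-pager 2>/dev/null || echo "(journal unavailable)")

echo "System: $sysinfo"
```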
## Configuration

Settings are stored via cosmic-config at `~/.config/cosmic/com.github.paulwade.cosmic-applet-ollama/v1/`.
To change the model or Ollama URL, edit the config files directly:

```sh
# View current config
cat ~/.config/cosmic/com.github.paulwade.cosmic-applet-ollama/v1/model
cat ~/.config/cosmic/com.github.paulwade.cosmic-applet-ollama/v1/ollama_url

# Change model (no rebuild needed)
echo '"mistral:7b"' > ~/.config/cosmic/com.github.paulwade.cosmic-applet-ollama/v1/model
```

Default values:

- model: `llama3.2:3b`
- ollama_url: `http://localhost:11434/api/chat`
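The default `ollama_url` points at Ollama's chat endpoint, which you can exercise by hand to sanity-check a model. The request shape follows Ollama's `/api/chat` API; the question is just an example:

```sh
# Request body for Ollama's /api/chat endpoint (stream disabled for one complete reply)
payload='{
  "model": "llama3.2:3b",
  "stream": false,
  "messages": [
    {"role": "user", "content": "How do I check free disk space on Pop!_OS?"}
  ]
}'

# Send it (requires a running Ollama server with the model pulled)
curl -s http://localhost:11434/api/chat -d "$payload" || echo "(Ollama not reachable)"
```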
## Project Structure

```
src/
├── main.rs     # Entry point
├── app.rs      # COSMIC applet UI and logic
├── ollama.rs   # Ollama API client
├── context.rs  # System context gathering
├── config.rs   # Configuration handling
└── i18n.rs     # Internationalization
```
## Development

```sh
# Run with logging
RUST_LOG=debug cargo run

# Check for issues
cargo clippy

# Format code
cargo fmt
```

## Contributing

Contributions are welcome! Please feel free to submit issues and pull requests.
## License

This project is licensed under the GPL-3.0 License - see the LICENSE file for details.