OpenBio is a bioinformatics-focused skill distribution project built on top of `deepagents-cli`. We recommend using the Docker image directly.
The default model is:

- Use `deepagents-cli` as the execution/runtime layer.
- Ship verified OpenBioSkills as a PyPI package (`openbioskill`).
- Install those skills into agent skill directories with `openbio install`.
- `deepagents-cli`: terminal agent runtime, tools, and orchestration.
- `openbioskill` (PyPI): packaging and installer for OpenBio skill bundles.
- `skills/`: verified bioinformatics and scientific skill database (SKILL.md, references, scripts, assets).
The conceptual infrastructure is shown in infra.png.
```
OpenBio/
├── openbioskill/            # Python package (CLI + installer)
│   ├── cli.py               # `openbio` command entry
│   └── installer.py         # skill installation logic
├── skills/                  # verified OpenBioSkills database
│   └── <skill-name>/
│       ├── SKILL.md
│       ├── references/
│       ├── scripts/
│       └── assets/
├── cli_cp/deepagents_cli/   # deepagents-cli custom overlay
├── scripts/
│   ├── start_services.sh
│   └── Run_Docker.sh
├── Dockerfile               # deepagents-based container build
├── pyproject.toml           # package metadata + entry points
├── Run_Upload_pypi.sh       # publish script for PyPI
└── infra.png                # conceptual platform diagram
```
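The layout above names `cli.py` as the `openbio` command entry. As an illustration only, a minimal sketch of such an entry point is shown below; only the subcommand and flag names (`install`, `--model-name`, `--target-dir`) come from this README, and the actual implementation may differ:

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    """Hypothetical skeleton of the `openbio` CLI entry point."""
    parser = argparse.ArgumentParser(prog="openbio")
    sub = parser.add_subparsers(dest="command", required=True)

    # `openbio install` with the flags documented in this README
    install = sub.add_parser("install", help="install OpenBio skills")
    install.add_argument("--model-name",
                         help="target agent runtime, e.g. deepagents-cli or codex")
    install.add_argument("--target-dir",
                         help="explicit skills directory to install into")
    return parser


if __name__ == "__main__":
    args = build_parser().parse_args()
    print(args)
```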
## Default Build: deepagents-cli + Verified OpenBioSkills (PyPI)

### 1) Runtime layer (deepagents-cli)
This repository extends the deepagents runtime:

- Base install in Docker: `deepagents` + `deepagents-cli`
- Optional local runtime if you already use deepagents-compatible agents
`openbioskill` packages the verified skill corpus and exposes the `openbio install` command.

Install behavior:
- Auto-detects and installs to one of:
  `~/.cursor/skills`, `~/.claude/skills`, `~/.codex/skills`, `~/.gemini/skills`, `~/.deepagents/skills`
- Supports explicit target selection:

```shell
openbio install --model-name deepagents-cli
openbio install --model-name codex
openbio install --target-dir /custom/path/skills
```

Install from PyPI and install the skills:

```shell
python3 -m pip install openbioskill
openbio install
```

Run with Docker:

```shell
docker compose run --rm --service-ports --build flask-app
```

Build distributions:

```shell
python3 -m build --no-isolation
```

Upload to PyPI:

```shell
./Run_Upload_pypi.sh
```

`Run_Upload_pypi.sh` expects `TWINE_PASSWORD` (typically set in `.env`) and uploads `dist/*` via `uvx twine`.
License: MIT (see LICENSE).

