# A simple llmops
## Components

- Chat front end
- Guardrails middleware
- Ollama LLM backend
- Custom Arize logging
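To make the request flow between these components concrete, here is a minimal sketch (not code from this repo) of how a middleware layer might screen a prompt before forwarding it to Ollama's `/api/generate` endpoint. The `guard_prompt` check is a hypothetical stand-in for a real Guardrails validator, and the model name `llama3` is an assumption:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default port

def guard_prompt(prompt: str) -> str:
    """Hypothetical stand-in for a Guardrails validator: reject empty prompts."""
    if not prompt.strip():
        raise ValueError("empty prompt rejected by guardrail")
    return prompt

def build_request(prompt: str, model: str = "llama3") -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": guard_prompt(prompt), "stream": False}

def send(prompt: str) -> str:
    """POST the payload to the local Ollama backend and return its response text."""
    data = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

In the actual stack, the Streamlit front end and the guardrails middleware run as separate containers; this collapses them into one script purely for illustration.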
## Demo

See the demo video: `d2.mp4`
## Flow

1. Install Docker (see the official Docker installation guide).
2. Get a Guardrails token:
   - Visit the Guardrails AI Hub.
   - Sign up or log in to your account.
   - Navigate to your account settings or API keys section.
   - Copy your token and save it somewhere safe.
3. Clone the repository and enter it:

   ```shell
   git clone https://github.com/abhishek9sharma/llmops.git
   cd llmops
   ```
4. Create a `.env` file in the root directory:

   ```shell
   touch .env
   ```

   Your structure should look like:

   ```
   llmops/
   ├── .env          ← Create this file here
   ├── docker-compose.yml
   ├── Makefile
   ├── grserver/
   ├── ollama-backend/
   ├── streamlit-chat/
   └── README.md
   ```
5. Add your Guardrails token to the `.env` file:

   ```
   GR_TOKEN=your_guardrails_token_here
   ```
6. Optional: add other environment variables:

   ```
   GR_TOKEN=your_guardrails_token_here
   OPENAI_API_KEY=your_openai_api_key
   ANOTHER_VARIABLE=value
   ```
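As a rough sketch of how these variables reach the application code: Docker Compose can inject the `.env` file, and services then read the values from the process environment. The hand-rolled parser below is purely illustrative — the repo may rely on Compose's built-in `.env` handling or a library like `python-dotenv` instead:

```python
import os

def load_env_file(path: str = ".env") -> dict:
    """Parse KEY=value lines from a .env file, skipping blanks and comments."""
    env = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Ignore empty lines, comments, and lines without an assignment.
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")  # split at the first '='
            env[key.strip()] = value.strip()
    return env

# Typical usage: merge the parsed values into the process environment, e.g.
#   os.environ.update(load_env_file())
# after which code can read os.environ["GR_TOKEN"].
```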
7. Build and run the application:

   ```shell
   make up_with_build
   ```

   This will take a while to build.
8. Navigate to http://localhost:8501/ to open the chat front end.

