Turn your merged pull requests into performance-review-ready narratives - powered by AI.
Quarter Summarizer fetches your merged PRs from a GitHub organization, classifies them by impact, and streams a first-person self-review summary using the LLM of your choice. Pick a quarter (or custom date range), select a model, and get a polished write-up in seconds - so you can spend your time shipping code, not writing self-reviews.
Note
The quality of the generated summary depends on the quality of your PR descriptions. The more context your PRs have, the better the narrative — a little effort at merge time goes a long way at review time!
A classic PAT is required to fetch pull request data from GitHub.

- Click Generate new token (classic)
- Select the `repo` scope (needed to access PRs in private repositories)
- Copy the token — you'll need it in the next step
- If you want to analyze PRs from an organization's repositories, click Configure SSO next to the token and Authorize it for that org
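Once authorized, the token is typically sent against GitHub's search API to find your merged PRs. As an illustrative sketch (the function and all names below are my own assumptions, not code from this repository), the search query for one author's merged PRs in a date range can be built like this:

```typescript
// Hypothetical helper (not from the Quarter Summarizer codebase): builds a
// GitHub search-API query for PRs a given author merged in a date range.
function mergedPrQuery(org: string, author: string, from: string, to: string): string {
  // Standard GitHub search qualifiers: is:pr, is:merged, org:, author:, merged:FROM..TO
  return `is:pr is:merged org:${org} author:${author} merged:${from}..${to}`;
}

// Example: PRs merged by "octocat" in the "acme" org during Q1 2024.
console.log(mergedPrQuery("acme", "octocat", "2024-01-01", "2024-03-31"));
// → is:pr is:merged org:acme author:octocat merged:2024-01-01..2024-03-31
```

A query like this goes to GitHub's `GET /search/issues` endpoint with an `Authorization: Bearer <token>` header; without the `repo` scope (and SSO authorization for the org), private-repo results are silently missing rather than failing loudly.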
```bash
git clone https://github.com/your-username/quarter-summarizer.git
cd quarter-summarizer
cp backend/.env.example backend/.env
```

Open backend/.env and fill in your values:
| Variable | Required | Description |
|---|---|---|
| `GITHUB_PERSONAL_ACCESS_TOKEN` | Yes | Classic GitHub PAT with `repo` scope |
| `LLM_API_KEY` | No | API key for your LLM provider (e.g. OpenAI, Anthropic) |
| `LLM_BASE_URL` | No | Base URL of the provider's API (e.g. https://api.openai.com/v1) |
| `DEVELOPER_ROLE` | No | Your role, used to add context to the generated narrative (e.g. Frontend Engineer) |
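A filled-in backend/.env might look like the following sketch (every value is a placeholder, not a real credential):

```shell
# Placeholder values — replace with your own.
GITHUB_PERSONAL_ACCESS_TOKEN=ghp_xxxxxxxxxxxxxxxxxxxx
LLM_API_KEY=sk-xxxxxxxx
LLM_BASE_URL=https://api.openai.com/v1
DEVELOPER_ROLE="Frontend Engineer"
```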
Tip
No access to a paid LLM? The app falls back to Ollama automatically when LLM_API_KEY and LLM_BASE_URL are left blank. Just install Ollama on your machine, pull a model (ollama pull llama3), and you're good to go.
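The fallback rule in the tip above can be sketched as follows; this is an illustration of the described behavior, not the backend's actual code, and the names are assumptions. The Ollama URL is its standard OpenAI-compatible endpoint.

```typescript
// Hedged sketch of the "blank config → Ollama" fallback described above.
interface LlmConfig {
  apiKey?: string;
  baseUrl?: string;
}

function resolveLlmEndpoint(cfg: LlmConfig): { baseUrl: string; usesOllama: boolean } {
  // When both LLM_API_KEY and LLM_BASE_URL are left blank, talk to a local
  // Ollama server via its OpenAI-compatible API instead of a paid provider.
  if (!cfg.apiKey && !cfg.baseUrl) {
    return { baseUrl: "http://localhost:11434/v1", usesOllama: true };
  }
  return { baseUrl: cfg.baseUrl ?? "https://api.openai.com/v1", usesOllama: false };
}
```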
```bash
docker compose -f docker-compose.prod.yml up -d --build
```

Once running, open http://localhost.
Start the development stack with hot-reload support:
```bash
docker compose -f docker-compose.dev.yml up --build
```

| Service | URL |
|---|---|
| Frontend (Vite) | http://localhost:5173 |
| Backend (NestJS) | http://localhost:3001 |
Source files are volume-mounted into the containers. Edit code on your host and changes are picked up instantly — Vite HMR for the frontend, NestJS watch mode for the backend. No rebuild needed.
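A dev compose file with this behavior typically mounts the source directories into the containers. The fragment below is a sketch of that pattern only — the service names and paths are assumptions, not copied from this repository's docker-compose.dev.yml:

```yaml
# Illustrative fragment — the real docker-compose.dev.yml may differ.
services:
  backend:
    volumes:
      - ./backend:/app      # host source mounted in, so watch mode sees edits
      - /app/node_modules   # anonymous volume so container deps aren't shadowed
```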
The dev stack exposes port 9229 for the Node.js inspector and the backend starts in debug mode by default.
- Add debugger statements in the backend source code
- In VS Code, run the Debug NestJS app in Docker launch configuration to attach
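Such a launch configuration attaches VS Code's Node.js debugger to port 9229. A typical entry in .vscode/launch.json looks like this sketch (the remoteRoot path is an assumption about the container layout, not taken from the repo):

```json
{
  "type": "node",
  "request": "attach",
  "name": "Debug NestJS app in Docker",
  "address": "localhost",
  "port": 9229,
  "localRoot": "${workspaceFolder}/backend",
  "remoteRoot": "/app"
}
```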
Contributions are welcome! Please see CONTRIBUTING.md for guidelines on how to get started.
The goal is simple: this tool is developer-first. Anyone should be able to use Quarter Summarizer, no matter which company or GitHub organization they work in.
If this were a hosted website, it would realistically behave like one shared product tied to whoever runs the servers. Whether that product can see your organization’s private repos and PR history is often not up to you as a developer. An organization administrator decides if outside tools are allowed, and that decision is per organization. A hosted service can’t assume every company will say yes, so in practice it tends to work cleanly for one org (or a small set that explicitly opts in), not as something any developer can pick up and use on their own.
Running it on your machine avoids that. You connect with your own GitHub access (see Create a GitHub Personal Access Token), the same way you already use GitHub day to day. There is no central service asking every org to trust the same third-party app.
On GitHub, authorizing a hosted app for organization repos shows a screen like the one below. Under Organization access, you can see a Request button for the org. If you only click the main Authorize button and skip that step, the app still cannot read your organization’s repositories. Merged PRs in the org would not show up, which defeats the point of Quarter Summarizer. Request means you are asking an administrator to allow the app. You cannot complete that part of access on your own.
This project is licensed under the MIT License.
