This spec defines a deterministic dogfooding harness for @rawsql-ts/ztd-cli backend development.
Follow this file exactly and do not substitute ad-hoc scenarios.
- Objective A: Identify where progress still requires AI reasoning.
- Objective B: Identify what can be mechanized (CLI commands, scaffolding, docs, templates, happy-path guidance).
- Objective C: Verify alignment with ZTD principles:
- Development must not require dev-time migrations.
- Required behavior must not require SQL string concatenation.
- PostgreSQL version: 18
- DBMS runtime: Docker
- `rawsql-ts` packages: use local source code, not npm releases.
- This spec validates developer-mode dogfooding before publication. It does not claim to fully reproduce the published npm consumer path.
- Work location: a git-untracked standalone folder outside any pnpm workspace/monorepo
- This is the default mode because real-world usage is `@rawsql-ts/ztd-cli` as an npm package in a standalone repo.
- Windows-friendly example: `C:\Users\<you>\tmp\rawsql-ts-dogfood\run-XX\` (must NOT be under the `rawsql-ts` repository tree)
- Local-source mode note: even with `--local-source-root`, the run directory must stay standalone (outside workspace) to keep dependency resolution deterministic and avoid workspace absorption.
- DDL baseline is fixed to Section 2.
Before starting Scenario 1, capture these versions in the report:
- OS
- Node.js
- pnpm
- Docker Engine
- PostgreSQL image tag used for execution
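The version capture above can be sketched as follows (a POSIX-shell sketch; on Windows, run the equivalent commands in PowerShell — the graceful fallbacks for pnpm and Docker are an assumption, not part of the spec):

```shell
# Record toolchain versions for the report header.
uname -a                     # OS
node --version               # Node.js
command -v pnpm   >/dev/null 2>&1 && pnpm --version   || echo "pnpm: not installed"
command -v docker >/dev/null 2>&1 && docker --version || echo "docker: not installed"
# PostgreSQL image tag: record whatever tag you pass to docker run (e.g. postgres:18)
```

Paste the raw outputs into the Environment section of the report rather than retyping them.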
Use this canonical local-source invocation form from `<RUN_DIR>`:

```shell
node "<LOCAL_SOURCE_ROOT>/packages/ztd-cli/dist/index.js" <ztd-subcommand-and-args>
```

- `<LOCAL_SOURCE_ROOT>`: absolute path to the `rawsql-ts` repository root.
- `<RUN_DIR>`: standalone dogfooding run directory (outside any workspace), where commands are executed.
- If `dist/index.js` is missing, build once from `<LOCAL_SOURCE_ROOT>`:

```shell
pnpm -C "<LOCAL_SOURCE_ROOT>" --filter @rawsql-ts/ztd-cli build
```

Note: model-gen (`--probe-mode ztd`) requires `ZTD_DB_URL`. ztd-cli does not read `DATABASE_URL` automatically. Also, model-gen is currently SELECT-oriented; INSERT/UPDATE/DELETE SQL may fail with parser errors and is out of scope for model-gen in this dogfooding scenario.
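Since `ZTD_DB_URL` must be exported explicitly, a minimal wiring sketch (the connection values are placeholders for a local disposable container, and the model-gen arguments are left elided):

```shell
# ztd-cli reads ZTD_DB_URL, not DATABASE_URL; export it explicitly.
# The credentials below are placeholders for a local throwaway container.
export ZTD_DB_URL="postgres://postgres:postgres@localhost:5432/postgres"
# Then invoke model-gen from <RUN_DIR>, e.g.:
#   node "<LOCAL_SOURCE_ROOT>/packages/ztd-cli/dist/index.js" model-gen --probe-mode ztd ...
```

Keep the export in the run shell (or a `.env` you source manually) so every ztd invocation in the session sees the same database.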
Note on modes:
- **Developer mode** means local-source execution from a repo checkout without publishing first.
- **Published package mode** means installing released packages from npm in a standalone repo.
- This spec is intentionally about Developer mode so backend dogfooding does not depend on package publication.
- When you need a pre-release check for the npm consumer path, run the separate repository-root workflow in Published-Package Verification Before Release.
Create `schema.sql` with the exact content below.

```sql
-- schema.sql
create table product (
  product_id bigserial primary key,
  sku text not null unique,
  name text not null,
  price_yen integer not null check (price_yen >= 0),
  created_at timestamptz not null default now()
);
create table sale (
  sale_id bigserial primary key,
  sale_date date not null,
  customer_note text null,
  created_at timestamptz not null default now()
);
create table sale_line (
  sale_line_id bigserial primary key,
  sale_id bigint not null references sale(sale_id),
  product_id bigint not null references product(product_id),
  qty integer not null check (qty > 0),
  unit_price_yen integer not null check (unit_price_yen >= 0)
);
create index sale_line_sale_id_idx on sale_line(sale_id);
create index sale_line_product_id_idx on sale_line(product_id);
```

Implement sale backend features:
- Create sale with lines
- List sales (pagination optional)
- Get sale by id (including lines)
- Update sale (customer note + replace lines)
- Delete sale
- SQL assets (DDL + queries)
- Repository or equivalent backend code
- Tests (unit or integration)
- No dev-time migration requirement
- No SQL string concatenation for required behavior
- Artifacts and tests are executable with clear file evidence
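As a shape check for the no-concatenation rule, the write path can be authored as parameterized SQL files. This is a sketch only: the file names and the named-parameter style are illustrative assumptions, not mandated by the spec (use whatever placeholder convention your setup accepts).

```shell
# Parameterized SQL for "create sale with lines": values arrive as bindings,
# never by concatenating user input into the SQL text.
cat > create_sale.sql <<'SQL'
insert into sale (sale_date, customer_note)
values (:sale_date, :customer_note)
returning sale_id;
SQL
cat > create_sale_line.sql <<'SQL'
insert into sale_line (sale_id, product_id, qty, unit_price_yen)
values (:sale_id, :product_id, :qty, :unit_price_yen);
SQL
```

Note these INSERT files are repository assets; per the model-gen note above, they are out of scope for model-gen itself.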
Scenario 2 validates survivability under deterministic change.
Apply all of the following fixed changes.
- Add table `payment`
- Add `payment.sale_id` referencing `sale(sale_id)`
- Add a new query joining `sale` and `payment` to return sales with payment info
Append the following DDL:
```sql
create table payment (
  payment_id bigserial primary key,
  sale_id bigint not null references sale(sale_id),
  paid_at timestamptz not null,
  amount_yen integer not null check (amount_yen >= 0),
  method text not null, -- e.g. "cash", "card"
  created_at timestamptz not null default now()
);
create index payment_sale_id_idx on payment(sale_id);
create index payment_paid_at_idx on payment(paid_at);
```

Query requirements (fixed):
- Inputs: `:from_paid_at`, `:to_paid_at`
- Output fields: `sale.sale_id`, `sale.sale_date`, `payment.payment_id`, `payment.paid_at`, `payment.amount_yen`, `payment.method`
- Join: `payment.sale_id = sale.sale_id` (INNER JOIN)
- Filter: date range on `payment.paid_at`
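One possible query satisfying the fixed requirements above, as a sketch: the file name, the inclusive bounds, and the named-parameter style are choices of this example (the lint note later in this spec mentions `$1`/`$2` positional placeholders as the alternative).

```shell
# Candidate query for "sales with payment info" in a fixed date range.
cat > sales_with_payments.sql <<'SQL'
select
  s.sale_id,
  s.sale_date,
  p.payment_id,
  p.paid_at,
  p.amount_yen,
  p.method
from sale as s
inner join payment as p
  on p.sale_id = s.sale_id
where p.paid_at >= :from_paid_at
  and p.paid_at <= :to_paid_at
order by p.paid_at;
SQL
```

All inputs are bindings, so this stays inside the no-string-concatenation rule.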
Rule: For Scenario 2 runs, use a fresh Postgres container / fresh database to apply the updated schema, to avoid noisy "relation already exists" output and keep logs comparable.
OPTIONAL (destructive): If you must reuse a DB, reset the schema first: `drop schema public cascade; create schema public;`. CAUTION: LOCAL/DISPOSABLE DB ONLY. DO NOT RUN IN PRODUCTION OR ON SHARED DATABASES.
- Still no dev-time migration requirement
- Still no SQL string concatenation
- Generated/spec artifacts and tests updated with clear diff evidence
All runs must include a log in this exact structure:
```
[YYYY-MM-DDTHH:MM:SSZ] STEP <n> ACTION "<short title>"   (UTC recommended)
CMD:
<command line>
RESULT:
(exit=<code>)
STDOUT:
<first 30 lines or summary>
STDERR:
<first 30 lines or summary>
NOTES:
<why you ran it, what you learned, what comes next>
```
Submit `DOGFOOD_REPORT.md` using this template.

```markdown
# DOGFOOD REPORT
## Metadata
- Run date:
- Runner:
- Spec file:
- Repo commit:
## Environment
- OS:
- Node.js:
- pnpm:
- Docker:
- PostgreSQL image:
## Files changed
- <path>
## Scenario 1: New backend
- Result: PASS | PARTIAL | FAIL
- Implemented scope:
- Evidence pointers: LOG STEP <n>, <n>
- Notes:
## Scenario 2: Schema/spec changes (Candidate C)
- Result: PASS | PARTIAL | FAIL
- Change summary:
- Evidence pointers: LOG STEP <n>, <n>
- Notes:
## Command and trial metrics
- Total step count:
- Trial/error count:
- Commands that required retries:
## Frictions
### Needs CLI automation
- <item>
- Evidence: LOG STEP <n>
### Feature exists but undiscoverable
- <item>
- Evidence: LOG STEP <n>
### Forces migration (ZTD violation)
- <item or "None observed">
- Evidence: LOG STEP <n>
### Forces SQL string concatenation (security risk)
- <item or "None observed">
- Evidence: LOG STEP <n>
## Improvement proposals
- CLI:
- Docs:
- Discoverability:
- Templates/scaffolding:
## Happy-path draft (shortest successful steps)
1. <step>
2. <step>
3. <step>
## Open questions
- <item or "None">
```

- Do not alter scenario definitions during execution.
- If blocked, do not change goals; record the block in LOG and REPORT.
- Distinguish strictly between:
- mechanizable repetition (CLI/docs/template candidates)
- judgment-required work (domain design, naming, contract semantics)
Note: `ztd lint` validates SQL via Postgres. When queries contain placeholders (`$1`, `$2`, or named params), lint injects default bindings (e.g. `null`) to avoid unbound-parameter failures (`42P02`) and to surface SQL-level diagnostics instead.
- If you run under a directory governed by a parent `pnpm-workspace.yaml`, pnpm may absorb installs into the parent workspace.
- This can introduce unrelated dependency-resolution friction and reduce determinism for dogfooding signals.
- This mode is intentionally not the default because it does not represent typical npm-package users.
For focused companion flows, use these scenario documents instead of expanding this backend-focused harness inline:
- SQL Debug Recovery Dogfooding
- SSSQL Optional-Condition Dogfooding
- Test Documentation Dogfooding
- Perf Scale Tuning Dogfooding
- Dynamic Filter Routing Dogfooding
Those companion scenarios cover broken long-CTE SQL recovery, query graph / query slice / query patch apply usage, truthful optional-condition authoring with SSSQL, direct-vs-decomposed perf evidence loops, scale-sensitive index-vs-pipeline tuning decisions, and the human-readable test documentation export path.
Use this order unless a hard blocker appears.
1. Prepare an untracked standalone workspace outside any pnpm workspace (for example `C:\Users\<you>\tmp\rawsql-ts-dogfood\run-XX\`).
2. Create a Docker PostgreSQL 18 container for runtime checks.
3. Scaffold the project with local source linkage.
4. Apply the fixed baseline DDL.
5. Implement Scenario 1 assets and tests.
6. Apply Candidate C changes for Scenario 2.
7. Regenerate affected artifacts and tests.
8. Run verification commands.
9. Produce LOG and REPORT files.
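The container setup and DDL application above can be captured in a helper script. This sketch only writes the script out for review rather than executing it; the container name, password, and port are illustrative assumptions, and running it will pull `postgres:18` and bind port 5432.

```shell
# Generate a reviewable helper for the container + DDL steps.
cat > setup-db.sh <<'SH'
#!/bin/sh
set -eu
docker run -d --name ztd-dogfood-pg \
  -e POSTGRES_PASSWORD=postgres -p 5432:5432 postgres:18
# Bounded wait for the server to accept connections.
i=0
until docker exec ztd-dogfood-pg pg_isready -U postgres >/dev/null 2>&1; do
  i=$((i+1))
  [ "$i" -lt 30 ] || { echo "postgres not ready" >&2; exit 1; }
  sleep 1
done
# Apply the fixed baseline DDL.
docker exec -i ztd-dogfood-pg psql -U postgres < schema.sql
SH
chmod +x setup-db.sh
```

Log the script's invocation and output as LOG steps like any other command.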