Commit 25166ae

Address PR comments: PyPI first for transformers, fix relative path, add CWD note
Signed-off-by: Meng Xin <mxin@nvidia.com>
1 parent ea0c193 commit 25166ae

2 files changed: +10 -4 lines changed

.claude/skills/ptq/references/slurm-setup-ptq.md

Lines changed: 9 additions & 3 deletions
@@ -28,10 +28,16 @@ If enroot import fails (e.g., permission errors on lustre), use pyxis inline pul
 
 ### Container dependency pitfalls
 
-**New models may need newer transformers** than what's in the container. Install from source inside the job script:
+**New models may need newer transformers** than what's in the container. Upgrade inside the job script:
 
 ```bash
-pip install git+https://github.com/huggingface/transformers.git --quiet
+pip install -U transformers
+```
+
+If the model requires an unreleased fix not yet on PyPI, fall back to installing from git (pin to a tag or commit when possible):
+
+```bash
+pip install -U "git+https://github.com/huggingface/transformers.git"
 ```
 
 **Prefer `PYTHONPATH`** to use the synced ModelOpt source instead of installing inside the container — this avoids risking dependency conflicts (e.g., `pip install -U nvidia-modelopt[hf]` can upgrade PyTorch and break other packages):

@@ -40,7 +46,7 @@ pip install git+https://github.com/huggingface/transformers.git --quiet
 export PYTHONPATH=/path/to/Model-Optimizer:$PYTHONPATH
 ```
 
-If `PYTHONPATH` doesn't work due to missing compiled extensions, fall back to `pip install -e ".[hf]" --no-build-isolation`.
+If `PYTHONPATH` doesn't work due to missing compiled extensions, fall back to `pip install -e ".[hf]" --no-build-isolation` (run from the Model-Optimizer repo root).
 
 **Watch for pip dependency conflicts** — NGC containers set `PIP_CONSTRAINT` to pin versions, causing `ResolutionImpossible` errors. Unset it first so pip can resolve freely:
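As a minimal sketch of the unset-first pattern the last hunk refers to (the constraint file path below is hypothetical, chosen only to illustrate the mechanism):

```shell
# Simulate an NGC-style container that pins versions via PIP_CONSTRAINT
# (this path is hypothetical, for illustration only).
export PIP_CONSTRAINT=/etc/pip/constraint.txt

# Unset it before installing so pip can resolve dependencies freely.
unset PIP_CONSTRAINT

# Verify the constraint is gone before running any pip install.
[ -z "${PIP_CONSTRAINT:-}" ] && echo "PIP_CONSTRAINT cleared"
```

In a real job script, the `unset` would simply precede the `pip install` lines shown in the diff above.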

.claude/skills/ptq/references/unsupported-models.md

Lines changed: 1 addition & 1 deletion
@@ -49,7 +49,7 @@ print(type(cfg).__name__)
 grep -r "class <ArchName>" /tmp/transformers-main/src/transformers/models/
 ```
 
-- **Found** → install: `pip install /tmp/transformers-main`, then re-run `AutoConfig.from_pretrained()`. For container dependency issues, see `references/slurm-setup-ptq.md` (PIP_CONSTRAINT, PYTHONPATH).
+- **Found** → install: `pip install /tmp/transformers-main`, then re-run `AutoConfig.from_pretrained()`. For container dependency issues, see `slurm-setup-ptq.md` (PIP_CONSTRAINT, PYTHONPATH).
 - **Not found** → ask the user: *"The checkpoint uses `<ArchName>` which isn't in released or main-branch transformers. Do you have a private fork or custom modeling code?"*
 
 - **No `config.json`** → not a standard HF checkpoint. List the directory for README or `.py` files. If nothing useful, ask the user for the modeling code.
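The found/not-found branching in that hunk can be sketched as a self-contained shell demo; the stand-in directory tree and the `FooConfig` class name are hypothetical, standing in for `/tmp/transformers-main` and `<ArchName>`:

```shell
# Build a tiny stand-in tree mimicking a transformers source checkout
# (the layout and "FooConfig" class name are hypothetical, for illustration).
demo=/tmp/transformers-demo
mkdir -p "$demo/src/transformers/models/foo"
echo "class FooConfig(PretrainedConfig):" > "$demo/src/transformers/models/foo/configuration_foo.py"

# Same branch as the grep step above: does the architecture exist on main?
if grep -rq "class FooConfig" "$demo/src/transformers/models/"; then
  echo "found: pip install the source tree, then retry AutoConfig.from_pretrained()"
else
  echo "not found: ask the user for a fork or custom modeling code"
fi
```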
