
Fix SoftmaxWithLossLayer crash when using two top blobs #7097

Open
Chessing234 wants to merge 1 commit into BVLC:master from Chessing234:fix/softmax-loss-clear-loss-weight

Conversation

@Chessing234

Summary

SoftmaxWithLossLayer::LayerSetUp copies the full LayerParameter into the internal SoftmaxLayer. When a user specifies two top blobs with loss_weight entries (valid for SoftmaxWithLoss), both loss_weight values are inherited by the internal SoftmaxLayer. Since the internal softmax only has 1 top blob, Layer::SetUp hits CHECK_EQ(top.size(), num_loss_weights) (1 != 2) and crashes.

Fix: Call softmax_param.clear_loss_weight() before creating the internal layer. The internal SoftmaxLayer is a pure forward computation layer with no loss, so loss_weight is irrelevant to it.

Fixes #2968.

Test plan

  • Define a SoftmaxWithLoss layer with two top blobs and a loss_weight for each — previously crashed in Layer::SetUp, now sets up correctly
  • Single top blob behavior unchanged
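A prototxt layer definition of the kind the first test case describes might look like the following (the blob names `fc_out` and `label` are placeholders, not from this PR):

```protobuf
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "fc_out"       # placeholder: scores from the previous layer
  bottom: "label"
  top: "loss"
  top: "softmax_out"     # second top: the softmax probabilities
  loss_weight: 1
  loss_weight: 0         # expose the probabilities without adding them to the loss
}
```

The `loss_weight: 0` on the second top is the standard way to read out the normalized probabilities from a loss layer; before this fix, that valid configuration crashed during setup.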

🤖 Generated with Claude Code

LayerSetUp copies the full LayerParameter (including loss_weight)
into the internal SoftmaxLayer. When two top blobs are specified
with loss_weight entries, SetUp hits CHECK_EQ(top.size(),
num_loss_weights) because the internal softmax has only 1 top blob
but 2 loss_weight values. Clear loss_weight before creating the
internal layer since it is a pure forward computation layer.

Fixes BVLC#2968.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>


Development

Successfully merging this pull request may close these issues.

SoftmaxWithLoss creation fails with two top blobs
