Fix SoftmaxWithLossLayer crash when using two top blobs #7097
Open
Chessing234 wants to merge 1 commit into BVLC:master from
Conversation
LayerSetUp copies the full LayerParameter (including loss_weight) into the internal SoftmaxLayer. When two top blobs are specified with loss_weight entries, SetUp hits CHECK_EQ(top.size(), num_loss_weights) because the internal softmax has only 1 top blob but 2 loss_weight values. Clear loss_weight before creating the internal layer, since the internal softmax is a pure forward computation layer.

Fixes BVLC#2968.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Summary
`SoftmaxWithLossLayer::LayerSetUp` copies the full `LayerParameter` into the internal `SoftmaxLayer`. When a user specifies two top blobs with `loss_weight` entries (valid for `SoftmaxWithLoss`), both `loss_weight` values are inherited by the internal `SoftmaxLayer`. Since the internal softmax has only 1 top blob, `Layer::SetUp` hits `CHECK_EQ(top.size(), num_loss_weights)` (1 != 2) and crashes.

Fix: call `softmax_param.clear_loss_weight()` before creating the internal layer. The internal `SoftmaxLayer` is a pure forward computation layer with no loss, so `loss_weight` is irrelevant to it.

Fixes #2968.
Test plan
A `SoftmaxWithLoss` layer with two top blobs and `loss_weight` on both: previously crashed, now sets up and runs correctly.

🤖 Generated with Claude Code
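A net prototxt fragment exercising this configuration might look like the following sketch (the bottom blob names `fc8` and `label` are illustrative, not from this PR):

```protobuf
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "fc8"    # illustrative: predictions
  bottom: "label"  # illustrative: ground-truth labels
  top: "loss"      # first top: the scalar loss
  top: "prob"      # second top: softmax probabilities
  loss_weight: 1   # weight for "loss"
  loss_weight: 0   # second top contributes no loss
}
```

Before this fix, both `loss_weight` entries were copied into the internal one-top `SoftmaxLayer`, tripping the `CHECK_EQ` at setup.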