Hi @MoktarEls, thank you so much for the detailed bug report. Your investigation was incredibly helpful in identifying the root cause!

The Fix

We've resolved the issue in the embeddings branch (associated with PR #160). The TextEmbedder now correctly resets its internal state and dynamically detects embedding dimensions when switching models.
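As a rough sketch of the pattern described above (all class and method names here are illustrative, not the actual PR #160 implementation): the key idea is to clear any cached dimension when the model changes, and to re-derive the dimension from the backend itself rather than from a hard-coded table.

```python
class TextEmbedder:
    """Hypothetical sketch of state reset + dynamic dimension detection."""

    def __init__(self):
        self.method = None
        self.model_name = None
        self.dimension = None  # cached; invalidated on every model switch

    def set_text_model(self, method, model_name):
        # Reset internal state so a stale dimension from the previous
        # model can never leak into the new configuration.
        self.method = method
        self.model_name = model_name
        self.dimension = None

    def detect_dimension(self, embed_fn):
        # Probe the backend once with a dummy input and measure the
        # returned vector, instead of assuming a fixed size per model.
        if self.dimension is None:
            self.dimension = len(embed_fn("probe"))
        return self.dimension
```

Here `embed_fn` stands in for whatever callable actually produces an embedding vector; the real code presumably queries the loaded model directly.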

Example

```python
# Switch to sentence_transformers
core.embedding_generator.set_text_model(
    method="sentence_transformers",
    model_name="dangvantuan/sentence-camembert-base"
)

# It now correctly reflects the change:
info = core.embedding_generator.get_methods_info()
print(f"Method: {info['text']['method']}")  # sentence_transformers
print(f"Dimens…
```

Answer selected by MoktarEls