Hello:
This is my first time trying to run the server. I followed the pip installation instructions and ran the example:
mlx-openai-server launch --model-path mlx-community/Qwen3-Coder-Next-4bit --model-type lm
The launch fails deep in the dependency chain with unresolved imports of multimodal packages (mlx_vlm, torchvision), even though I requested plain text-only (lm) mode.
It could be that the mlx-openai-server dependencies are broken, or perhaps the app could import fewer modules when running in lm mode. Please advise.
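For what it's worth, one possible app-side fix would be to resolve the backend module lazily, so that lm mode never touches the multimodal packages. This is only an illustrative sketch of the pattern, not the project's actual code; the module names in the registry are assumptions:

```python
import importlib

# Hypothetical registry mapping each model type to its backend module.
# In lm mode, mlx_vlm/torchvision would never be imported at all.
BACKENDS = {
    "lm": "mlx_lm",    # text-only inference
    "vlm": "mlx_vlm",  # multimodal; pulls in heavy extra dependencies
}

def load_backend(model_type: str, registry: dict = BACKENDS):
    """Import the backend for the requested model type only on demand."""
    if model_type not in registry:
        raise ValueError(f"unknown model type: {model_type}")
    return importlib.import_module(registry[model_type])
```

With something like this, `--model-type lm` would only ever trigger the text-only import path.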
mlx-openai-server-launch-missing-deps.txt