Describe the bug
When running tabby serve --device metal ..., the service fails to start. The logs show that llama-server repeatedly crashes with a dyld: Symbol not found: _OBJC_CLASS_$_MTLResidencySetDescriptor error.
The error message indicates that the llama-server binary included in the Homebrew formula was "built for macOS 15.5 which is newer than running OS," causing an incompatibility with the system's Metal framework.
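If it helps with triage, the deployment target of the bundled binary can probably be confirmed with otool (the path below is the one from my logs; the exact Cellar path may differ per install):

```bash
# Inspect the LC_BUILD_VERSION load command of the bundled llama-server to see
# the minimum macOS version ("minos") and SDK it was built against.
otool -l /opt/homebrew/Cellar/tabby/0.31.2/bin/llama-server | grep -A4 LC_BUILD_VERSION
```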
Information about your version
I have since uninstalled Tabby to attempt a source build. However, the error occurred with version 0.31.2, as indicated by the file path in the logs (/opt/homebrew/Cellar/tabby/0.31.2/bin/llama-server).
Information about your GPU
Since I am using --device metal on macOS, nvidia-smi is not applicable. Here is my system information:
- macOS Version: 14.7.5
- Mac Chip: M1 Pro
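In case the exact build numbers are useful, the same details can be gathered with:

```bash
# Print the macOS product version/build and the CPU marketing name.
sw_vers
sysctl -n machdep.cpu.brand_string
```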
Additional context
The issue appears to be that the pre-compiled binary provided by Homebrew is incompatible with macOS versions older than 15.5. Building from source with brew install --build-from-source tabby is a potential workaround (sketched below). This report is to inform the maintainers of the incompatibility.
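A minimal sketch of the workaround I plan to try (not yet verified on my machine; assumes a standard Homebrew setup):

```bash
# Remove the pre-built bottle and compile Tabby locally so the bundled
# llama-server links against this machine's Metal framework.
brew uninstall tabby
brew install --build-from-source tabby
```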
Command used to reproduce:

```bash
tabby serve --device metal --model StarCoder-1B --chat-model Qwen2-1.5B-Instruct
```

Log output:

```
dyld: Symbol not found: _OBJC_CLASS_$_MTLResidencySetDescriptor
Referenced from: /opt/homebrew/Cellar/tabby/0.31.2/bin/llama-server (built for macOS 15.5 which is newer than running OS)
Expected in: <43901F96-105F-330A-BBC5-C3DA000436F6> /System/Library/Frameworks/Metal.framework/Versions/A/Metal
2025-12-02T02:42:20.212227Z WARN llama_cpp_server::supervisor: crates/llama-cpp-server/src/supervisor.rs:124: llama-server exited with status code -1, args: Command { std: "/opt/homebrew/bin/llama-server" "-m" "/Users/my/.tabby/models/TabbyML/Nomic-Embed-Text/ggml/model-00001-of-00001.gguf" "--cont-batching" "--port" "30888" "-np" "1" "--ctx-size" "4096" "-ngl" "9999" "--embedding" "--ubatch-size" "4096", kill_on_drop: true }
Recent llama-cpp errors:
dyld: Symbol not found: _OBJC_CLASS_$_MTLResidencySetDescriptor
Referenced from: /opt/homebrew/Cellar/tabby/0.31.2/bin/llama-server (built for macOS 15.5 which is newer than running OS)
Expected in: <43901F96-105F-330A-BBC5-C3DA000436F6> /System/Library/Frameworks/Metal.framework/Versions/A/Metal
2025-12-02T02:42:20.212414Z WARN llama_cpp_server::supervisor: crates/llama-cpp-server/src/supervisor.rs:164: Attempting to restart the llama-server...
... (repeated crash logs) ...
llama-server encountered a fatal error. Exiting service. Please check the above logs and suggested solutions for details.
```