Git commit
f9f3365
Operating systems
Linux
GGML backends
CUDA
Problem description & steps to reproduce
I'm running into this on Clang 22.1.4 and CMake 3.31.6 on Ubuntu 25.04 for what it's worth.
llama_build_info is reported as an "undeclared identifier" in server-task.cpp, even though llama-server links against llama-common-base via llama-common. I believe this broke in 6990e2f.
I generated the cmake build as follows:
cmake -G Ninja .. \
-DCMAKE_BUILD_TYPE=Release \
-DCMAKE_CXX_COMPILER=clang++ \
-DCMAKE_C_COMPILER=clang \
-DCMAKE_CUDA_HOST_COMPILER=g++-15 \
-DCMAKE_INTERPROCEDURAL_OPTIMIZATION=ON \
-DCMAKE_LINKER_TYPE=LLD \
-DGGML_CUDA=On \
-DGGML_CCACHE=OFF
First Bad Commit
6990e2f (libs : rename libcommon -> libllama-common)
Compile command
ninja llama-server
/usr/local/bin/clang++ -DGGML_BACKEND_SHARED -DGGML_SHARED -DGGML_USE_CPU -DGGML_USE_CUDA -DLLAMA_BUILD_WEBUI -DLLAMA_SHARED -I/home/bodie/meta/build/llama.cpp/tools/server -I/home/bodie/meta/build/llama.cpp/build-fix-buildinfo/tools/server -I/home/bodie/meta/build/llama.cpp/tools/server/../mtmd -I/home/bodie/meta/build/llama.cpp -I/home/bodie/meta/build/llama.cpp/common/. -I/home/bodie/meta/build/llama.cpp/common/../vendor -I/home/bodie/meta/build/llama.cpp/src/../include -I/home/bodie/meta/build/llama.cpp/ggml/src/../include -I/home/bodie/meta/build/llama.cpp/tools/mtmd/. -O3 -DNDEBUG -flto=thin -fPIC -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wmissing-prototypes -Wextra-semi -MD -MT tools/server/CMakeFiles/server-context.dir/server-task.cpp.o -MF tools/server/CMakeFiles/server-context.dir/server-task.cpp.o.d -o tools/server/CMakeFiles/server-context.dir/server-task.cpp.o -c /home/bodie/meta/build/llama.cpp/tools/server/server-task.cpp
Relevant log output
/home/bodie/meta/build/llama.cpp/tools/server/server-task.cpp:791:44: error: use of undeclared identifier 'llama_build_info'
791 | {"system_fingerprint", std::string(llama_build_info())},
| ^~~~~~~~~~~~~~~~
/home/bodie/meta/build/llama.cpp/tools/server/server-task.cpp:839:44: error: use of undeclared identifier 'llama_build_info'
839 | {"system_fingerprint", std::string(llama_build_info())},
| ^~~~~~~~~~~~~~~~
/home/bodie/meta/build/llama.cpp/tools/server/server-task.cpp:876:48: error: use of undeclared identifier 'llama_build_info'
876 | {"system_fingerprint", std::string(llama_build_info())},
| ^~~~~~~~~~~~~~~~
/home/bodie/meta/build/llama.cpp/tools/server/server-task.cpp:892:44: error: use of undeclared identifier 'llama_build_info'
892 | {"system_fingerprint", std::string(llama_build_info())},
| ^~~~~~~~~~~~~~~~
/home/bodie/meta/build/llama.cpp/tools/server/server-task.cpp:904:48: error: use of undeclared identifier 'llama_build_info'
904 | {"system_fingerprint", std::string(llama_build_info())},
| ^~~~~~~~~~~~~~~~
/home/bodie/meta/build/llama.cpp/tools/server/server-task.cpp:1469:44: error: use of undeclared identifier 'llama_build_info'
1469 | {"system_fingerprint", std::string(llama_build_info())},
| ^~~~~~~~~~~~~~~~
/home/bodie/meta/build/llama.cpp/tools/server/server-task.cpp:1506:48: error: use of undeclared identifier 'llama_build_info'
1506 | {"system_fingerprint", std::string(llama_build_info())},
| ^~~~~~~~~~~~~~~~