Environment and Context
Prooted Debian under Termux, but the problem also occurs in standard Termux.
Box specs, in short:
Environment of the local 🎋 prooted system:
Linux localhost 6.2.1-PRoot-Distro #1 SMP PREEMPT Thu Mar 17 16:28:22 CST 2022 aarch64 GNU/Linux
PATH: /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/games:/usr/games:/system/bin:/system/xbin:/data/data/com.termux/files/home/.local/bin/:/data/data/com.termux/files/usr/bin/
LD_LIBRARY_PATH:
CFLAGS:
LDFLAGS:
CPPFLAGS:
C_INCLUDE_PATH:
CPLUS_INCLUDE_PATH:
USE_VULKAN: OFF
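For reproducibility, the same snapshot can be captured like this (a sketch, not taken from the original report; variables that are unset simply will not print):
# Dump the kernel string and the build-related environment variables listed above
uname -a
printenv | grep -E '^(PATH|LD_LIBRARY_PATH|CFLAGS|LDFLAGS|CPPFLAGS|C_INCLUDE_PATH|CPLUS_INCLUDE_PATH|USE_VULKAN)='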
Current Behavior
The wheel build fails; truncated log:
Running command Building wheel for llama_cpp_python (pyproject.toml)
*** scikit-build-core 0.9.4 using CMake 3.29.3 (wheel)
*** Configuring CMake...
loading initial cache file /data/data/com.termux/files/usr/tmp/tmpbfzw_fsw/build/CMakeInit.txt
-- The C compiler identification is Clang 18.1.6
-- The CXX compiler identification is Clang 18.1.6
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /data/data/com.termux/files/usr/bin/clang - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /data/data/com.termux/files/usr/bin/clang++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: /data/data/com.termux/files/usr/bin/git (found version "2.45.1")
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Check if compiler accepts -pthread
-- Check if compiler accepts -pthread - yes
-- Found Threads: TRUE
-- ccache found, compilation results will be cached. Disable with LLAMA_CCACHE=OFF.
-- CMAKE_SYSTEM_PROCESSOR: aarch64
-- ARM detected
-- Performing Test COMPILER_SUPPORTS_FP16_FORMAT_I3E
-- Performing Test COMPILER_SUPPORTS_FP16_FORMAT_I3E - Failed
Steps to fix
Strip the two offending warning switches, -Wunreachable-code-break and -Wunreachable-code-return, from the vendored llama.cpp CMakeLists.txt:
sed -i 's/-Wunreachable-code-break//g; s/-Wunreachable-code-return//g' vendor/llama.cpp/CMakeLists.txt
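Then rebuild and reinstall from the patched checkout. A minimal sketch, assuming the source tree at /root/downloads_Termux/llama-cpp-python (the path shown in the install log below); the exact pip invocation is not recorded in the report:
# Rebuild the wheel from the locally patched source tree
cd /root/downloads_Termux/llama-cpp-python
pip install . --no-cache-dir --force-reinstall --verbose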
to arrive at:
[27/27] : && /usr/bin/clang++ -O3 -DNDEBUG vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o vendor/llama.cpp/examples/llava/CMakeFiles/llava-cli.dir/llava-cli.cpp.o -o vendor/llama.cpp/examples/llava/llava-cli -Wl,-rpath,/tmp/tmpqgrx623c/build/vendor/llama.cpp: vendor/llama.cpp/common/libcommon.a vendor/llama.cpp/libllama.so && :
*** Installing project into wheel...
-- Install configuration: "Release"
-- Installing: /tmp/tmpqgrx623c/wheel/platlib/lib/libggml_shared.so
-- Installing: /tmp/tmpqgrx623c/wheel/platlib/lib/cmake/Llama/LlamaConfig.cmake
-- Installing: /tmp/tmpqgrx623c/wheel/platlib/lib/cmake/Llama/LlamaConfigVersion.cmake
-- Installing: /tmp/tmpqgrx623c/wheel/platlib/include/ggml.h
-- Installing: /tmp/tmpqgrx623c/wheel/platlib/include/ggml-alloc.h
-- Installing: /tmp/tmpqgrx623c/wheel/platlib/include/ggml-backend.h
-- Installing: /tmp/tmpqgrx623c/wheel/platlib/lib/libllama.so
-- Installing: /tmp/tmpqgrx623c/wheel/platlib/include/llama.h
-- Installing: /tmp/tmpqgrx623c/wheel/platlib/bin/convert.py
-- Installing: /tmp/tmpqgrx623c/wheel/platlib/llama_cpp/libllama.so
-- Installing: /root/downloads_Termux/llama-cpp-python/llama_cpp/libllama.so
-- Installing: /tmp/tmpqgrx623c/wheel/platlib/lib/libllava.so
-- Set runtime path of "/tmp/tmpqgrx623c/wheel/platlib/lib/libllava.so" to ""
-- Installing: /tmp/tmpqgrx623c/wheel/platlib/bin/llava-cli
-- Set runtime path of "/tmp/tmpqgrx623c/wheel/platlib/bin/llava-cli" to ""
-- Installing: /tmp/tmpqgrx623c/wheel/platlib/llama_cpp/libllava.so
-- Set runtime path of "/tmp/tmpqgrx623c/wheel/platlib/llama_cpp/libllava.so" to ""
-- Installing: /root/downloads_Termux/llama-cpp-python/llama_cpp/libllava.so
-- Set runtime path of "/root/downloads_Termux/llama-cpp-python/llama_cpp/libllava.so" to ""
*** Making wheel...
*** Created llama_cpp_python-0.2.76-cp311-cp311-linux_aarch64.whl...
Building wheel for llama_cpp_python (pyproject.toml) ... done
Created wheel for llama_cpp_python: filename=llama_cpp_python-0.2.76-cp311-cp311-linux_aarch64.whl size=3334239 sha256=da0322b5d7a676e2d0bab3e2d6cc65f888937bcc676f854e72a378fb574438f8
Stored in directory: /root/.cache/pip/wheels/78/ec/ae/e102e407170d9e380949aaa8aafc0646a62be8208c8008f3db
Successfully built llama_cpp_python
Installing collected packages: llama_cpp_python
Successfully installed llama_cpp_python-0.2.76
and, to confirm the installation:
root@localhost:~/downloads_Termux/llama-cpp-python# pip show llama-cpp-python
Name: llama_cpp_python
Version: 0.2.76
Summary: Python bindings for the llama.cpp library
Home-page:
Author:
Author-email: Andrei Betlen [email protected]
License: MIT
Location: /usr/local/lib/python3.11/dist-packages
Requires: diskcache, jinja2, numpy, typing-extensions
Required-by:
etc.
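As a final smoke test (not part of the original report; a minimal sketch, assuming the package exposes __version__ as recent releases do):
# Import the freshly built bindings and print the installed version
python3 -c 'import llama_cpp; print(llama_cpp.__version__)'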
Manamama changed the title from "Wrong unreachable-code-break switch" to "Wrong switches: unreachable-code-break and unreachable-code-return, at least on Termux" on May 29, 2024.