Hi MediaPipe team,
I am working on using and deploying MediaPipe framework based solutions on x86/x86_64 Android platforms. I started by building the MediaPipe framework and sample applications for LLM and non-LLM use cases. The builds fail because the default setup does not use recent enough GCC/Clang toolchains. With some modifications we are now able to compile and run on x86_64 Android platforms, and I would like the MediaPipe team to review and comment on the changes made to add x86_64 support.
Additionally, we would also like to discuss enabling LLM use cases on x86_64 via GPU for the Intel 12th Generation CPU family and higher.
The changes made to resolve these build issues include:
a. Adding the missing Android x86_64 configs in .bazelrc (see the sketch after this list)
b. Build changes to use a higher NDK version (NDK r25)
c. A different Bazel version to address build issues in the Java libraries.
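For illustration, the added .bazelrc entries are along the following lines, modeled on the existing android_arm64 config in the same file; the exact flag values are part of what I would like reviewed:

build:android_x86_64 --config=android
build:android_x86_64 --cpu=x86_64
build:android_x86_64 --fat_apk_cpu=x86_64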
Build environment that has issues: Ubuntu 22.04; Clang 9.x; Python 3.12; Bazel 7.1/6.1.1; Android SDK 32/33; NDK 22; MediaPipe source from GitHub.
My topmost commit id is 4f14ecb5794eea8dd46255b64b81f1808dd69d04.
The build commands used, after applying the changes, to build the different components are:
bazel build -c opt --fat_apk_cpu=x86_64 --strip=never \
  --host_crosstool_top=@bazel_tools//tools/cpp:toolchain \
  mediapipe/tasks/java/com/google/mediapipe/tasks/vision:tasks_vision

bazel build -c opt --config=android_x86_64 \
  mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetectioncpu:objectdetectioncpu

bazel build -s -c dbg --strip=never --config=android_x86_64 \
  --host_crosstool_top=@bazel_tools//tools/cpp:toolchain \
  mediapipe/tasks/java/com/google/mediapipe/tasks/genai:libllm_inference_engine_jni.so
Issues observed:
Issue 1. The compiler does not support the -mamx-int8 and -mavxvnni flags required by XNNPACK. This can be resolved by moving to newer GCC/Clang versions.
Error log:
ERROR: /media/venkat/local_disk/MP/xnnp/XNNPACK/BUILD.bazel:1894:19: Compiling src/amalgam/gen/avxvnni.c failed: (Exit 1): gcc failed: error executing CppCompile command (from target //:avxvnni_prod_microkernels) /usr/bin/gcc -U_FORTIFY_SOURCE -fstack-protector -Wall -Wunused-but-set-parameter -Wno-free-nonheap-object -fno-omit-frame-pointer -g -MD -MF ... (remaining 32 arguments skipped)
Use --sandbox_debug to see verbose messages from the sandbox and
retain the sandbox build root for debugging
gcc: error: unrecognized command line option '-mavxvnni'; did you mean
'-mavx512vnni'?
Changes done: Added configs for android_x86_64 and moved to NDK r25, whose toolchain supports Intel ISA extensions such as AMX-INT8 and AVX-VNNI.
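As a sketch of the NDK change, assuming the WORKSPACE uses the standard android_ndk_repository rule (the path and api_level below are placeholders for the local setup):

android_ndk_repository(
    name = "androidndk",
    path = "/opt/android-ndk-r25c",  # placeholder: local NDK r25 install
    api_level = 30,  # placeholder: adjust to the project's minimum API level
)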
Issue 2. When building the LLM tasks (and/or libllm_inference_engine_jni.so), the build fails with multiple issues.
Build cmd:
bazel build -s -c dbg --strip=never --config=android_x86_64 \
  --host_crosstool_top=@bazel_tools//tools/cpp:toolchain \
  //mediapipe/tasks/java/com/google/mediapipe/tasks/genai:libllm_inference_engine_jni.so
The most prominent failure is a “gcc toolchain” error. Analysis of the error points to the generated NDK toolchain file
~/.cache/bazel/_bazel_root/b160314324a439d32bbb8d7e564e155e/external/androidndk/cc_toolchain_config.bzl.
Changes done: Use Bazel 6.5.0 and the Starlark rules. With these changes, the LLM tasks and the JNI .so file build and run on Android x86_64 in Android Studio emulators and WSA environments.
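For reproducibility, assuming Bazelisk is used as the Bazel launcher, the version can be pinned at the repository root:

echo "6.5.0" > .bazelversion    # picked up by Bazelisk for every subsequent bazel invocation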
Issue 3. GPU support for x86_64 is not available for the LLM tasks.
For Arm, the tasks-genai plugin auto-downloaded by Gradle (Maven) supports GPU model loading and inference. However, the MediaPipe source code seems to support only CPU inference (LlmInferenceEngine_CreateSession is defined only in llm_inference_engine_cpu.cc). We would like to discuss the details of how to enable the GPU path on x86_64.
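For reference, one way to confirm where that entry point is defined is to search from the repository root:

grep -rn "LlmInferenceEngine_CreateSession" mediapipe/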
I have also raised a GitHub issue with additional details.
Thanks,
Venkat