Is my app using GPU or CPU for on-device LLM?

Mohammad Jaminur Islam

May 8, 2024, 4:04:51 AM
to MediaPipe
I was playing with the LLM inference example, following the LLM Inference guide for Android, but I could not figure out whether the toy app was using my device's GPU or CPU (the device is a Samsung S24). Is there any way to tell? And if it is possible, how can I configure where the model runs on the device? Looking for guidance, I found that one way is to use BaseOptions baseOptions = BaseOptions.builder().useGpu().build(); but I could not figure out where this belongs, and when I tried that fragment I got an error.
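
For context, that useGpu() fragment seems to come from other task APIs rather than the LLM one. Here is a minimal sketch of how a backend is normally chosen in other MediaPipe Tasks (this uses the vision ObjectDetector API purely as an illustration; as far as I can tell, none of these setters exist on LlmInferenceOptions):

import com.google.mediapipe.tasks.core.BaseOptions
import com.google.mediapipe.tasks.core.Delegate
import com.google.mediapipe.tasks.vision.objectdetector.ObjectDetector
import com.google.mediapipe.tasks.vision.objectdetector.ObjectDetector.ObjectDetectorOptions

// Sketch: in other MediaPipe Tasks, BaseOptions carries the model path and
// the delegate (GPU or CPU), and the task-specific options wrap BaseOptions.
val baseOptions = BaseOptions.builder()
    .setModelAssetPath("model.tflite")  // placeholder model file
    .setDelegate(Delegate.GPU)          // or Delegate.CPU
    .build()

val detectorOptions = ObjectDetectorOptions.builder()
    .setBaseOptions(baseOptions)
    .build()

// context is the Android Context, as in the LLM snippet below
val detector = ObjectDetector.createFromOptions(context, detectorOptions)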

However, the following code segment initializes the llmInference object from the LlmInferenceOptions builder, which has no option to set GPU or CPU:

val options = LlmInference.LlmInferenceOptions.builder()
    .setModelPath(MODEL_PATH)
    .setMaxTokens(1024)
    .setResultListener { partialResult, done ->
        _partialResults.tryEmit(partialResult to done)
    }
    .build()

llmInference = LlmInference.createFromOptions(context, options)
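
What I was hoping to find was a backend setter directly on the LLM options. I have seen a preferredBackend option mentioned for newer tasks-genai releases, but I have not confirmed it exists in the version I am using; a sketch of what that would look like:

val options = LlmInference.LlmInferenceOptions.builder()
    .setModelPath(MODEL_PATH)
    .setMaxTokens(1024)
    // Assumption: setPreferredBackend and LlmInference.Backend may only be
    // available in newer tasks-genai releases; my current version does not
    // seem to expose them.
    .setPreferredBackend(LlmInference.Backend.GPU)
    .build()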
I would really appreciate any help regarding how to set up the GPU environment.