I was playing with the LLM inference example, following the LLM Inference guide for Android, but I could not figure out whether the toy app was running on my device's GPU or CPU (the device is a Samsung S24). Is there any way to tell? And if it is configurable, how can I specify where the model should run on the device? Looking for guidance, I found one suggestion to use BaseOptions baseOptions = BaseOptions.builder().useGpu().build(); but I could not figure out where to apply it, and when I tried that code fragment I got an error.
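For comparison, this is the pattern I've seen for other MediaPipe tasks (sketched from the object detection docs; the model path is just a placeholder), where BaseOptions carries the CPU/GPU delegate choice:

    import com.google.mediapipe.tasks.core.BaseOptions
    import com.google.mediapipe.tasks.core.Delegate
    import com.google.mediapipe.tasks.vision.objectdetector.ObjectDetector

    // In the vision tasks, the delegate (CPU vs GPU) is selected through
    // BaseOptions and passed into the task's options builder:
    val baseOptions = BaseOptions.builder()
        .setModelAssetPath("model.tflite") // placeholder model file
        .setDelegate(Delegate.GPU)         // request the GPU delegate
        .build()

    val detectorOptions = ObjectDetector.ObjectDetectorOptions.builder()
        .setBaseOptions(baseOptions)
        .build()

I could not find an equivalent hook in the LLM inference API.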
However, the following code segment from the example initializes the llmInference object from the LlmInferenceOptions builder, which has no option to select the GPU or CPU:
    val options = LlmInference.LlmInferenceOptions.builder()
        .setModelPath(MODEL_PATH)
        .setMaxTokens(1024)
        .setResultListener { partialResult, done ->
            _partialResults.tryEmit(partialResult to done)
        }
        .build()
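For completeness, I then create the engine from these options as the guide shows (context here is my activity's Context):

    // Create the inference engine from the options, per the Android guide.
    // Nothing in this flow lets me pick CPU vs GPU.
    val llmInference = LlmInference.createFromOptions(context, options)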