Fastest inference backend for MediaPipe models on Raspberry Pi board


Ian

Nov 30, 2024, 2:31:45 AM
to MediaPipe
Hi all!

My team and I are building an AI inference backend, currently focused on CPU optimization.
It delivers significant performance gains for AI models on CPU, even outperforming the XNNPACK backend on Arm64 by over 2x (demo video: https://youtu.be/u7wzFngylis).
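
If you'd like to reproduce the baseline we compare against, below is a minimal Python sketch that times a MediaPipe vision task on the stock CPU path, which runs TFLite with XNNPACK. The model file and image path are placeholders; any MediaPipe vision task can be substituted:

import time

import mediapipe as mp
from mediapipe.tasks import python
from mediapipe.tasks.python import vision

MODEL_PATH = "efficientdet_lite0.tflite"  # placeholder: any MediaPipe vision model
IMAGE_PATH = "test.jpg"                   # placeholder: any test image

# Build a standard object detector; on CPU this runs TFLite with XNNPACK.
options = vision.ObjectDetectorOptions(
    base_options=python.BaseOptions(model_asset_path=MODEL_PATH),
    score_threshold=0.5,
)
detector = vision.ObjectDetector.create_from_options(options)
image = mp.Image.create_from_file(IMAGE_PATH)

# Warm up so one-time initialization doesn't skew the numbers.
for _ in range(5):
    detector.detect(image)

runs = 50
start = time.perf_counter()
for _ in range(runs):
    detector.detect(image)
elapsed = time.perf_counter() - start
print(f"mean latency: {elapsed / runs * 1000:.2f} ms over {runs} runs")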

I'll be holding a webinar focused on how we accelerated MediaPipe models on Raspberry Pi. Sign up via the link below if you're interested:
https://events.raspberrypi.com/communit ... ef855b0b96

We've optimized MediaPipe vision models on Raspberry Pi, all of which show better inference speed than XNNPACK, and we plan to release them for free during the webinar. So please join us via the link above if you'd like to take away the accelerated versions of the MediaPipe models. The webinar will be held next week on Thursday, Dec. 5th at 9 am PT.

Also, let me know here if you have any questions about the above!