Fastest inference backend for MediaPipe models on Raspberry Pi boards
Ian
Nov 30, 2024, 2:31:45 AM
to MediaPipe
Hi all!
My team and I are building an AI inference backend, currently focused on CPU optimization. It's showing strong performance gains for AI models on CPU, even outperforming the XNNPACK backend on Arm64 by over 2x (demo video: https://youtu.be/u7wzFngylis).
We've optimized the MediaPipe vision models on Raspberry Pi, and all of them show faster inference than XNNPACK. We plan to release the accelerated versions of the MediaPipe models for free during a webinar, so please join us via the link above if you're interested. It will be held next week on Thursday, Dec. 5th at 9 am PT.
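If you'd like a baseline to compare against on your own board, here's a minimal sketch that times a vision model with the standard MediaPipe Python Tasks API, which runs on the default CPU path (XNNPACK-backed). The model and image file names are just placeholders; numbers will of course vary by board and model.

```python
# Rough latency measurement for a MediaPipe vision task on CPU.
# 'efficientdet_lite0.tflite' and 'test.jpg' are placeholder paths.
import time

import mediapipe as mp
from mediapipe.tasks import python as mp_python
from mediapipe.tasks.python import vision

base_options = mp_python.BaseOptions(model_asset_path='efficientdet_lite0.tflite')
options = vision.ObjectDetectorOptions(base_options=base_options)
detector = vision.ObjectDetector.create_from_options(options)

image = mp.Image.create_from_file('test.jpg')

# Warm-up runs so one-time initialization doesn't skew the measurement.
for _ in range(5):
    detector.detect(image)

# Time steady-state inference.
n_runs = 50
start = time.perf_counter()
for _ in range(n_runs):
    detector.detect(image)
elapsed = time.perf_counter() - start
print(f'Average latency: {elapsed / n_runs * 1000:.2f} ms')
```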
Also, feel free to ask here if you have any questions about the above!