On Apr 3, 2026, at 1:08 AM, Lisa Brooks <lbroo...@gmail.com> wrote:
This was interesting. They put both glasses through a side-by-side test in four different scenarios. It's about 20 minutes long. If you were testing, what would you try?
--
You're receiving this message because you're subscribed to the EchoVision Community Forum.
To view the archive or catch up on discussions:
https://groups.google.com/a/agiga.ai/g/echovision-discuss
To unsubscribe, send an email to:
echovision-disc...@agiga.ai
When I got the Meta Ray-Bans in late 2024, one of the main reasons was to be able to read ingredients and directions on bottles, jars, boxes, and bags of food.
As so sadly shown in the YouTube video, this is not an area where Meta really shines.
One of my first criticisms of Meta was how many lies it tells.
If you want to call this the AI learning curve, so be it.
But after a year and a half, I think these errors should have been ironed out of the Meta AI app by now.
No, EchoVision is not perfect, but it is much better than Meta.
Monte Single
On Apr 3, 2026, at 8:36 AM, mrsi...@sasktel.net wrote:
It is just Meta, m-e-t-a, glasses, not medical.
Monte Single
On Apr 3, 2026, at 11:04 AM, mrsi...@sasktel.net wrote:
Yes, I thought it could have been a dictation thing; who knows.
I was totally aware of the less-than-pristine reputation of Meta in all its incarnations.
I try not to use it for personal data, but sometimes the choices are limited.
C’est la vie.
On Apr 3, 2026, at 11:04 AM, mrsi...@sasktel.net wrote:
Hopefully, dictation and all its quirks will not lead to any serious intergalactic complications.
Monte Single
On Apr 3, 2026, at 1:08 AM, Lisa Brooks <lbroo...@gmail.com> wrote:
The second test, with the food directions, could have been better. On the Agiga glasses they used text detection, which is essentially OCR. To make a more apples-to-apples comparison, they could have asked the Meta glasses something like “Hey Meta, read all of the text word for word,” or something similar. Just asking it to read gets the AI summary, which is probably sourced from the internet.
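For anyone curious what that difference looks like in practice, here is a minimal sketch of verbatim text detection in Python. It assumes the Tesseract OCR engine plus the pytesseract and Pillow packages are installed, and the file name label.jpg is just a placeholder for a photo of a food label; it is not what either pair of glasses actually runs, just the same idea in miniature.

# Minimal OCR sketch: read the label text word for word, no AI summary.
from PIL import Image
import pytesseract

# Placeholder photo of the food packaging.
image = Image.open("label.jpg")

# image_to_string returns the recognized characters verbatim,
# which is roughly what a "text detection" mode is doing.
print(pytesseract.image_to_string(image))

An AI summary mode, by contrast, sends the image to a vision-language model and returns a paraphrase, which is why the word-for-word prompt matters for a fair test.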
From: echovisio...@agiga.ai <echovisio...@agiga.ai>
On Behalf Of Mendi Evans
Sent: Friday, April 3, 2026 5:18 AM
To: echovisio...@agiga.ai
Cc: echovisio...@agiga.ai
Subject: Re: [echovision-discuss] Vision Forward Tech Connect Comparison Tests Between EchoVision and the Meta Glasses
This is very interesting, as I had thought about doing a similar thing. I would not necessarily have thought of the food directions part, but that was a good idea. I would definitely have done some sort of reading task, some scene description, and some identification tasks. I also found the Uno test interesting. I once played a card game successfully, but at the time I had to use Seeing AI to do it since the glasses weren't around. Bonus: I won the game.