Question about Live AI Capabilities


Joyce Feinberg

unread,
Nov 21, 2025, 10:49:32 AM (5 days ago) Nov 21
to echovisio...@agiga.ai

I want to preface this question by saying that this is NOT for navigational purposes!

 

Question:

If I am standing on a sidewalk near cars, etc. driving by, will the Live AI feature be able to tell me the speed at which a vehicle is travelling?

 

Reason for question:

I walk on a street where there is a sidewalk on only one side. As I am walking my Guide Dog, I hear the vehicles passing by at a speed that I feel is too fast for this street. I have tried talking to the local police, but they say it is not a problem (of course they are never there at the times I am walking my Guide). I would like to be able to provide them with some additional information, since I am unable to find an app that can measure vehicle speed the way a radar gun does. Does anyone think Live AI can do this now or in the future?

 

Thank you in advance for your responses,

Joyce

 

Jenine Stanley

unread,
Nov 21, 2025, 11:07:48 AM (5 days ago) Nov 21
to echovisio...@agiga.ai
Interested in the answer here. Since there is still a noticeable latency with Live AI, it may not be able to do this, but that latency is shrinking quickly, so I'd be curious.

-- 

Jenine Stanley
Customer Communications Team 
Producer and Host 
Airacast podcast 
Aira Tech Corp.
Direct Dial: 1-614-600-7408

Access is a human right.
Customer Care 
Learn more about visual interpreting at https://aira.io.
Learn about our new ASL app at:

Buddy Brannan

unread,
Nov 21, 2025, 11:08:25 AM (5 days ago) Nov 21
to echovisio...@agiga.ai
Hi,

I can't speak for the future, but I don't think this would be possible with current capabilities. And if Live AI did take a stab at it, I would be skeptical of its accuracy.

--
Buddy Brannan
Customer Support And Training Specialist
Agiga




Gene Warner

unread,
Nov 21, 2025, 11:20:25 AM (5 days ago) Nov 21
to echovisio...@agiga.ai
To have any chance of being anywhere near accurate, the AI would need a reliable way to determine distances.

Recently I asked an AI to tell me the size of a scanned shipping label I had just printed. It couldn't give me a reasonably accurate measurement until I told it that the label was printed on a standard letter-sized sheet of paper. Even then it said its measurements were only estimates.
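That reference matters because a photo by itself has no absolute scale. As a tiny sketch of the arithmetic, with made-up pixel counts, just to show how a known reference dimension turns pixels into inches:

# Rough sketch (Python): why a known reference dimension matters.
# All pixel counts below are made up for illustration.

PAPER_WIDTH_IN = 8.5            # US letter paper width, the known reference
paper_width_px = 1700           # hypothetical: pixels the sheet spans in the photo
label_width_px = 800            # hypothetical: pixels the label spans

inches_per_pixel = PAPER_WIDTH_IN / paper_width_px
label_width_in = label_width_px * inches_per_pixel
print(f"Estimated label width: {label_width_in:.2f} inches")   # 4.00 inches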

Gene...
I am friends with the monster under my bed
I get along with the voices inside of my head

Sieghard Weitzel

unread,
Nov 21, 2025, 1:33:05 PM (5 days ago) Nov 21
to echovisio...@agiga.ai
Police use radar when they are checking whether people are driving too fast, so unless the glasses had a built-in radar sensor, determining the speed of a vehicle will be impossible.
As for size and distance, this would typically be done by a "LiDAR" sensor. This stands for "light detection and ranging", and I believe all Pro models of the iPhone since the iPhone 13, or maybe even the 12, have a LiDAR sensor. It is basically a laser which is projected against whatever object you are aiming at and reflected back. The iPhone's processor then measures the time it took for the laser beam to be sent out and reflected back, and since light travels at a known speed, the distance can then be calculated.
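To put that time-of-flight idea into numbers, here is a minimal sketch in Python; the 20-nanosecond round trip is just an example value, not a real sensor reading:

# Minimal time-of-flight sketch: distance = (speed of light * round-trip time) / 2
SPEED_OF_LIGHT_M_S = 299_792_458            # metres per second

def distance_from_round_trip(round_trip_s: float) -> float:
    """One-way distance in metres for a measured round-trip time."""
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2

print(distance_from_round_trip(20e-9))      # a ~20 ns echo is roughly 3 metres away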
This is why, if you have a Pro iPhone model from the past few years, you can use the Measure app and get quite accurate distances to walls or other items like furniture.
Sighted people can also use the Measure app to measure, for example, the size of a picture hanging on a wall. I say sighted people can do this because the process involves placing a dot on one corner of the picture and then moving across to the other corner, and that is virtually impossible for a blind person to do.
It would actually be interesting to know if the EchoVision smart glasses have a LiDAR sensor, as this would certainly go a long way towards providing distances to objects. I am pretty sure they don't, and such a sensor would also be crucial for providing feedback about any objects in your way when walking, upcoming curbs, etc. I would think, however, with the way AI technology is going and the continuing miniaturization of components, that maybe in a few years we'll have regular-looking glasses which contain all these sensors and which can then provide a lot more feedback about such things.


Best regards,
Sieghard

Gene Warner

unread,
Nov 21, 2025, 1:38:37 PM (5 days ago) Nov 21
to echovisio...@agiga.ai
I have played around with the Measure app on my iPhone SE and it seems to work without LiDAR, though it's not as pinpoint accurate as it would be with it. But there is no way I'm paying $1,500 for a phone! That's just plain ridiculous!

Frank Ventura

unread,
Nov 21, 2025, 7:00:46 PM (5 days ago) Nov 21
to echovisio...@agiga.ai

Joyce and all, I agree with Buddy. Your glasses or phone can't act as a radar gun because they don't have radar. To do something like this optically, not only would there have to be virtually zero latency, but the system would also need exact distances: where the glasses are, where static reference points in the field of view are, and how far away each vehicle is from the camera. Remember, it is a three-dimensional calculation that we are trying to infer from a two-dimensional image. Probably not accurate enough for what you want.
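Even if those distances could be measured, the arithmetic itself is simple; the hard part is getting two reliable range readings with accurate timestamps. A toy sketch, with invented numbers:

# Toy sketch: average speed from two (distance along the road, time) observations.
# The readings below are invented; getting them accurately is the actual problem.

def estimate_speed_mph(d1_m: float, t1_s: float, d2_m: float, t2_s: float) -> float:
    """Average speed in mph between two distance/time observations."""
    metres_per_second = abs(d2_m - d1_m) / (t2_s - t1_s)
    return metres_per_second * 2.23694          # convert m/s to mph

# A car that covers 20 metres of road in 1.5 seconds is doing about 30 mph.
print(round(estimate_speed_mph(0.0, 0.0, 20.0, 1.5)))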

 


dar...@rogers.com

unread,
Nov 21, 2025, 7:30:33 PM (5 days ago) Nov 21
to echovisio...@agiga.ai
I believe that some people may have unrealistic expectations of smart glasses for the blind, given the limitations of current technology. The information you get from features like Live AI is actually not real-time data. People who can see process visual information in milliseconds. It may take 10 to 15 seconds for a blind person using smart glasses to get information about the same surroundings. Consider some of the steps required:
Press a button or use voice activation to take a picture or start video capture.
Information is sent from the glasses to the phone.
This information is usually sent to the cloud for processing.
The results are sent back to the phone.
The user then must listen to the spoken results through the glasses.
So in cases where there are moving objects like people and vehicles, the information is for the point in time that the photo or video was taken. There will always be a delay.
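As a rough back-of-the-envelope sketch of where that 10 to 15 seconds can go (every number below is a guess for illustration, not a measurement of EchoVision or any other product):

# Hypothetical latency budget for a capture -> cloud -> speech loop.
latency_s = {
    "trigger (button or voice)":   0.5,
    "glasses to phone transfer":   0.5,
    "phone to cloud upload":       1.0,
    "cloud processing":            2.0,
    "result back to phone":        0.5,
    "listening to the read-out":   5.5,   # often the longest step
}
print(f"Total: about {sum(latency_s.values()):.0f} seconds")   # about 10 seconds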
In my experience, where this technology can be really good is getting information about surroundings where objects are not moving.
I think that over time, lower latency and more on-device processing could improve on this, but I would still not depend on it for accurate information about moving objects in busy places.
I am a very strong supporter of the technology and still find it to be extremely useful in many different situations. From the limited information I am getting so far, the AGIGA product is the leading contender for my next smart glasses purchase. I am looking for certain things, and information from users about their real life experiences will be the determining factor for me.
Darold

Sieghard Weitzel

unread,
Nov 21, 2025, 8:10:14 PM (5 days ago) Nov 21
to echovisio...@agiga.ai
That may be true, but all these steps you listed, like the information being sent to the phone, then to the cloud for processing, then the result back to the phone and spoken by the glasses, do in fact happen in an extremely short time provided you have a good 5G cellular connection.
I don't have the EchoVision glasses yet, but I do have the Ray-Ban Meta glasses and the other day I used Live AI mode to find a pair of boots for a customer (I own a retail store and my staff was busy).
Normally I am not able to do this because we have literally hundreds of pairs of hiking and neoprene winter boots on several large, 10-foot-high shelves in our warehouse. However, I did know approximately where the model of boot the customer wanted was, so as I walked into the back I asked Meta to start Live AI. By the time I got to the shelf it was running, and I randomly grabbed a pair of boots, pulled the box out a little bit, pointed to the label and said "What are these boots?". The reply came back basically instantaneously; I don't even think it took a second. The model was close to what I wanted, but not exactly the right one, so I went to the next stack and asked, and again got an immediate and correct answer, then one more stack over and I had the right model. It didn't tell me the size right away, so I asked "and what size are they" and it came back with "size 8", which happened to be exactly the size I wanted, and I was able to bring the boots out to the customer to try, maybe not quite as quickly as one of my sighted employees, but it was pretty darn impressive.
It was also very liberating because, as I said, this sort of thing was previously impossible. Even using Seeing AI short text mode on my iPhone, where I have to hold the phone with one hand and then try to scan the labels, never worked this well because it often picked up the information from multiple boxes if they were stacked on top of each other. With Live AI I can point to a particular label and the AI is smart enough to know that I want the information for what I am pointing at, not the one above, below or to one side. And of course I also have both hands free, so even if I could find the right box with my phone, I would then have to put my phone away before I could pull out the box, etc.
I am not saying the system is up to telling me reliably whether I can run across the street real quick before the car that is 30 yards away runs me over, because maybe it's going faster than I thought, but for many scenarios this technology is amazing.

dar...@rogers.com

unread,
Nov 21, 2025, 9:04:00 PM (5 days ago) Nov 21
to echovisio...@agiga.ai
That's a great example of a good use of the technology. I have also found the technology to be very liberating in many situations! I was referring more to the time it takes to describe scenes, like your immediate surroundings, which take longer to process and to be read back to you. In your example, the boxes in the warehouse are also not moving, so it's easier for the technology to provide accurate information about particular items. I would also expect that information to be processed and read out more quickly than a scene description.
I think that one of the biggest advantages of smart glasses is hands free use. I have never been a fan of waving my phone around to find things while out in public.

Haylie Gallacher

unread,
Nov 21, 2025, 9:07:20 PM (5 days ago) Nov 21
to echovisio...@agiga.ai
Me either. I wish that GoodMaps could somehow use the camera on these glasses. It would be nice to be able to keep my hands free at all times for sure.
Haylie

Sieghard Weitzel

unread,
Nov 22, 2025, 1:15:28 AM (4 days ago) Nov 22
to echovisio...@agiga.ai
When it comes to Indigo (formerly GoodMaps), mostly what matters is hearing the information it provides as you walk along or follow a route, and since the phone's audio comes through the glasses' speakers, which leave your ears free, they would work well for this. Then, in addition to getting the instructions from Indigo, you would also have the option to get more feedback about something or to call Aira or Be My Eyes.

Gene Warner

unread,
Nov 22, 2025, 8:25:36 AM (4 days ago) Nov 22
to echovisio...@agiga.ai
I'm with you. Unless absolutely necessary, I like to employ a phone-away policy, meaning that as much as I can, I keep my phone in my pocket. You become less of a target for thieves that way. I also avoid flashy colors when choosing my phone and protective case. Being blind puts me at a distinct disadvantage, so the more I can do to make myself a smaller target, the better.

Gene Warner

unread,
Nov 22, 2025, 8:30:12 AM (4 days ago) Nov 22
to echovisio...@agiga.ai
That was a major reason why I tried Envision's Ally Solos glasses: the way they promoted it, it sounded like a totally voice-activated setup, which would have been great. But the reality is that they are far from hands-free. If I have to keep touching my glasses, I may as well get a pair that gives me the features I want, like access to Be My AI and Be My Eyes, as well as an AI I can talk to about what I'm looking at.

Gene Warner

unread,
Nov 22, 2025, 8:33:27 AM (4 days ago) Nov 22
to echovisio...@agiga.ai
I am still waiting for a GPS system I like.

Gene...
I am friends with the monster under my bed
I get along with the voices inside of my head


Jeffrey D. Stark

unread,
Nov 22, 2025, 9:19:48 AM (4 days ago) Nov 22
to echovisio...@agiga.ai
I will say that I just got the Meta Oakley Vanguards, and the extra button to start Live AI mode has proven hugely useful. I used it to find my Uber (although the fact it won't read license plates is stupid) and to find a doorway entrance to a building I was going to. These are the real-life situations I'm looking to solve with glasses. I'd also very much like GoodMaps integration since it's starting to roll out in my city. It's at the train station, a major mall and a couple of office buildings in Ottawa. Having hands-free access would be ideal.