
Temporal Disparity


Francois LE COAT

Jan 14, 2021, 12:15:45 PM
Hi,

I recently found that you can compute a "temporal disparity" from two
successive images. By performing the perspective registration of an
image sequence, you can obtain a disparity measure. That means depth
with only one camera, though disparity is usually measured by binocular
vision. This appears in the following demonstration...

<https://www.youtube.com/watch?v=nvfHlw3gHS0>

A drone is flying among the trees of a forest in the French Vosges.
Thanks to the optical flow measured on successive images, the "temporal
disparity" reveals the trees of the forest... You take a reference
image, the optical flow is computed between two rectified images, and
the reference is changed when the inter-correlation drops below 60%.
You can perceive the relief in depth with a single camera, over time.
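
Here is a minimal sketch, in Python with OpenCV, of the loop described
above. The input file name and the ncc helper are illustrative
assumptions of mine, the Farneback flow merely stands in for whichever
optical-flow estimator is used, and the perspective rectification step
is only stubbed out:

import cv2
import numpy as np

THRESHOLD = 0.60  # renew the reference when inter-correlation drops below 60%

def ncc(a, b):
    """Normalized cross-correlation between two same-size grayscale images."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

cap = cv2.VideoCapture("drone.mp4")  # hypothetical input sequence
ok, frame = cap.read()
reference = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    current = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # The perspective registration (rectification) of `current` onto
    # the reference would happen here; it is left out of this sketch.
    flow = cv2.calcOpticalFlowFarneback(reference, current, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    disparity = np.linalg.norm(flow, axis=2)  # "temporal disparity" magnitude
    if ncc(reference, current) < THRESHOLD:
        reference = current  # change the reference image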

Happy New Year 2021.

Regards,

--
Dr. François LE COAT
CNRS - Paris - France
<https://hebergement.universite-paris-saclay.fr/lecoat>

Francois LE COAT

Jan 22, 2021, 2:00:07 PM
Hi,

Francois LE COAT writes:
> I recently found that you can compute a "temporal disparity" from two
> successive images. By performing the perspective registration of an
> image sequence, you can obtain a disparity measure. That means depth
> with only one camera, though disparity is usually measured by
> binocular vision. This appears in the following demonstration...
>
>     <https://www.youtube.com/watch?v=nvfHlw3gHS0>
>
> A drone is flying among the trees of a forest in the French Vosges.
> Thanks to the optical flow measured on successive images, the
> "temporal disparity" reveals the trees of the forest... You take a
> reference image, the optical flow is computed between two rectified
> images, and the reference is changed when the inter-correlation drops
> below 60%. You can perceive the relief in depth with a single camera,
> over time.

In this demonstration I used the Farneback optical-flow method to
perform the perspective registration of the sequence. It gives a good
approximation of the "temporal disparity" in u and v, that is,
horizontally and vertically.

To get a better approximation of the "temporal disparity", I also used
the OpenCV DualTVL1 optical-flow method as a replacement for the
Farneback method. The result is far better, but it is much slower to
compute...

<https://www.youtube.com/watch?v=jJ97h4KZOb8>

The Farneback method takes approximately 6 hours for the 2500 images of
the drone sequence, while DualTVL1 takes about 32 hours for the same
2500 images. That is a long computation on a Macintosh without a GPU,
for a 2-minute video sequence at 30 images/s.
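
For reference, here is roughly how the two estimators are invoked in
Python; the frame file names are hypothetical, and DualTVL1 ships in
the opencv-contrib optflow module in recent OpenCV versions:

import cv2

# Hypothetical pair of consecutive grayscale frames from the sequence
prev = cv2.imread("frame_0000.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)

# Farneback: fast, built into core OpenCV
flow_fb = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                       0.5, 3, 15, 3, 5, 1.2, 0)

# DualTVL1: denser and more accurate, but much slower (opencv-contrib)
tvl1 = cv2.optflow.createOptFlow_DualTVL1()
flow_tvl1 = tvl1.calc(prev, curr, None)

# Both return an H x W x 2 field: u (horizontal) and v (vertical)
u, v = flow_tvl1[..., 0], flow_tvl1[..., 1]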

You may now better appreciate what "temporal disparity" means.

Francois LE COAT

Feb 2, 2021, 12:22:15 PM
Hi,

Francois LE COAT writes:
> I recently found that you can compute a "temporal disparity" from two
> successive images. By performing the perspective registration of an
> image sequence, you can obtain a disparity measure. [...]

To explain what I'm doing, I've made a WEB page that is not yet finished:

<https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/temporal_disparity.html>

Francois LE COAT

Feb 18, 2021, 11:30:17 AM
Hi,

Francois LE COAT writes:
>> I recently found that you can compute a "temporal disparity" from two
>> successive images. [...]
>
> To explain what I'm doing, I've made a WEB page that is not yet finished:
>
> <https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/temporal_disparity.html>

I've completed the WEB page I mentioned. This image processing is
applied to a drone that flew in the Vosges a few days ago. I was
thinking of applying the same computations to the flight of Ingenuity
on Mars...

<https://en.wikipedia.org/wiki/Mars_Helicopter_Ingenuity>

The autonomous drone is landing today, and we'll have video sequences =)

Francois LE COAT

Mar 16, 2021, 12:00:06 PM
Hi,

Francois LE COAT writes:
>> I recently found that you can compute a "temporal disparity" from two
>> successive images. [...]
>
> To explain what I'm doing, I've made a WEB page that is not yet finished:
>
> <https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/temporal_disparity.html>

I recently worked a little further on the trajectory of the drone ...

<https://www.youtube.com/watch?v=3PdUvGDCbQc>

Instead of using only Ry (yaw) and Tz (translation), I also used Tx
(translation) and Rz (roll) to reconstruct the trajectory. I couldn't
use Ty (translation) and Rx (pitch), because the result does not look
like a valid camera displacement. I have no real explanation for this.

But the shape of the drone's trajectory looks better ...
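
Purely as an illustration, per-image motion parameters of this kind can
be chained into a trajectory by dead reckoning. The function below, its
parameter ordering and its sign conventions are assumptions of mine,
not the code actually used here:

import numpy as np

def integrate_trajectory(motions):
    """Dead-reckon a planar trajectory from per-image increments
    (ry, rz, tx, tz) = (yaw, roll, lateral, forward)."""
    x = z = yaw = roll = 0.0
    path = [(x, z)]
    for ry, rz, tx, tz in motions:
        yaw += ry    # accumulate heading
        roll += rz   # roll is tracked but does not move the planar position
        # rotate the camera-frame translation into the world frame
        x += tx * np.cos(yaw) - tz * np.sin(yaw)
        z += tx * np.sin(yaw) + tz * np.cos(yaw)
        path.append((x, z))
    return np.array(path)

# Example: constant forward motion while slowly turning left
print(integrate_trajectory([(0.01, 0.0, 0.0, 1.0)] * 100)[-1])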

Francois LE COAT

Apr 21, 2021, 9:15:11 AM
There are a lot of videos showing Ingenuity flying, seen from the
viewpoint of Perseverance, from a distance of about 100 meters. But I
haven't seen videos from the two cameras embedded on the helicopter
Ingenuity itself. Will there be such videos? Are the cameras attached
to the device filming, or only taking still pictures? Are we going to
see Ingenuity from a dynamic viewpoint?

Francois LE COAT

Apr 22, 2021, 12:15:41 PM
Hi,

Francois LE COAT writes:
>>> <https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/temporal_disparity.html>
>>
>> I've completed the WEB page I mentioned. This image processing is
>> applied to a drone that flew in the Vosges a few days ago. I was
>> thinking of applying the same computations to the flight of Ingenuity
>> on Mars...
>>
>> <https://en.wikipedia.org/wiki/Mars_Helicopter_Ingenuity>
>>
>> The autonomous drone is landing today, and we'll have video sequences =)
>
> There are a lot of videos showing Ingenuity flying, seen from the
> viewpoint of Perseverance, from a distance of about 100 meters. But I
> haven't seen videos from the two cameras embedded on the helicopter
> Ingenuity itself. Will there be such videos? Are the cameras attached
> to the device filming, or only taking still pictures? Are we going to
> see Ingenuity from a dynamic viewpoint?

NASA and the Jet Propulsion Laboratory are showing static video like this one...

<https://mars.nasa.gov/embed/25838/>

Are we going to see dynamic video taken from the two cameras embedded
in the Ingenuity helicopter? This looks like surveillance video, but we
need to see the motion of the flight. The camera should be moving =)

Francois LE COAT

Apr 26, 2021, 10:35:17 AM
This is better. We can now see still pictures from Ingenuity's camera:

<https://mars.nasa.gov/embed/25846/>

Are we going to see a real film from the camera? This would be unique!

Francois LE COAT

May 5, 2021, 9:15:09 AM
Here you can see Ingenuity on Mars in 3D before its first flight...

<https://www.youtube.com/watch?v=YdKjHs9wJRM>

It would be unique if we could see Ingenuity in First Person View
(FPV). There are a lot of FPV videos from Earth, but from Mars none at
all! That would be a dataset everyone would like for image processing =)

Francois LE COAT

May 15, 2021, 1:19:21 PM
The Chinese rover "Zhurong" landed on Mars a few hours ago. This robot
is not accompanied by a helicopter like NASA's "Ingenuity". But that is
not a problem, because NASA is not sharing videos from its flying
robot. There are two cameras on "Ingenuity", but we have no videos from
them. All the videos are produced with the "Perseverance" cameras.

So the Americans are doing no better than the Chinese, because we can't
view anything from the flying helicopter's cameras. This is really a
shame. The vision community has little to thank NASA for, given its
lack of shared videos.

Francois LE COAT

May 20, 2021, 11:30:10 AM
For its sixth flight, the "Ingenuity" helicopter will not be filmed by
the "Perseverance" rover, NASA said today. Does this mean that we will
be able to see videos from the cameras embedded on the helicopter?
That would be a huge step forward in the history of flights on Mars =)

Francois LE COAT

May 28, 2021, 9:16:53 AM
There was an incident during the sixth flight of Ingenuity on Mars...

<https://www.youtube.com/watch?v=pKUAsuXF6EA>

Let's hope there will be a color video of the seventh flight =)

Francois LE COAT

Jun 27, 2021, 6:00:10 AM
Hi,

Francois LE COAT writes:
>> [...]
>>
>> For its sixth flight, the "Ingenuity" helicopter will not be filmed by
>> the "Perseverance" rover, NASA said today. Does this mean that we will
>> be able to see videos from the cameras embedded on the helicopter?
>> That would be a huge step forward in the history of flights on Mars =)
>
> There was an incident during the sixth flight of Ingenuity on Mars...
>
>     <https://www.youtube.com/watch?v=pKUAsuXF6EA>
>
> Let's hope there will be a color video of the seventh flight =)

If you read NASA's comment about the 8th flight of Ingenuity:

<https://mars.nasa.gov/technology/helicopter/status/308>

you understand that there was no color-camera acquisition for the 7th
and 8th flights on Mars. This was due to the incident on the 6th
flight, and to a conflict between the acquisitions of the two embedded
cameras. Let's hope that NASA has fixed the timestamping problem for
the helicopter's subsequent flights. Let's hope we will have a color
video from Mars.

Francois LE COAT

Jul 31, 2021, 11:05:19 AM
For its 10th flight over Mars, Ingenuity followed an elaborate trajectory...

<https://www.youtube.com/watch?v=QiF9VJJamkE>

NASA planned to take color pictures of remarkable points on the ground.
Since the 9th flight, there have been grey-level sequences from the
navigation camera, and color pictures of a few sites. This is
interesting!

Francois LE COAT

Aug 15, 2021, 11:01:04 AM
Yesterday, August 14th, I worked on the 11th flight over planet Mars,
which the Ingenuity helicopter made on August 4th...

<https://www.youtube.com/watch?v=50fccs79W1A>

The raw individual images were released by NASA at this location:

<https://mars.nasa.gov/mars2020/multimedia/raw-images/?af=HELI_NAV,HELI_RTE#raw-images>

Then, on August 13th, Aurélien Genin put an MPEG-4 video online at this
location:

<https://twitter.com/RevesdEspace/status/1426147951693402114>

I worked on this video. There is a delay between the day of the flight
and the final result shown here, but this video comes from a very
distant location in space!

Francois LE COAT

Nov 5, 2021, 1:45:14 PM
Aurélien Genin shared rectified images of the 14th flight of Ingenuity:

<https://twitter.com/Astro_Aure/status/1455284564499341319>

This is interesting because the cadence is higher, at 7 images/s, so
the reconstructed 3D localization of the helicopter is now more
precise...

<https://www.youtube.com/watch?v=OkclJd3Fmv0>

Ingenuity flew on Oct. 24th, and the mission is not yet fully accomplished!

Francois LE COAT

Jan 15, 2024, 1:50:45 PM
Hi,
Happy New Year 2024 :-)

Francois LE COAT writes:
> I recently found that you can compute a "temporal disparity" from two
> successive images. By performing the perspective registration of an
> image sequence, you can obtain a disparity measure. That means depth
> with only one camera, though disparity is usually measured by
> binocular vision. This appears in the following demonstration...
>
>     <https://www.youtube.com/watch?v=nvfHlw3gHS0>
>
> A drone is flying among the trees of a forest in the French Vosges.
> Thanks to the optical flow measured on successive images, the
> "temporal disparity" reveals the trees of the forest... You take a
> reference image, the optical flow is computed between two rectified
> images, and the reference is changed when the inter-correlation drops
> below 60%. You can perceive the relief in depth with a single camera,
> over time.
>
> Happy New Year 2021.

A WEB page was made to illustrate the "Temporal Disparity" experiment...

When we want to determine movement, there are two simplifying
hypotheses: either the camera is fixed and the observed scene is
moving, or the camera moves and the scene is static. In the general
case, both the camera and the scene are moving, and it is necessary to
segment the static and dynamic elements of what is observed. In this
experiment the camera is fixed, and it observes the person in front of
the computer, who is moving. The goal of the experiment is to
reconstruct the visible relief, by "monocular depth"...

<https://hebergement.universite-paris-saclay.fr/lecoat/demoweb/profondeur.html>

That is to say, we obtain the depth (inversely proportional to the
disparity) by measuring the optical flow with a single camera, though
this measurement is classically made by binocular vision (for the
disparity).
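
To make this relation concrete, here is a minimal sketch turning a
dense optical-flow field into a relative depth map, assuming depth is
inversely proportional to disparity, up to an unknown scale. The file
names and Farneback parameters are illustrative assumptions of mine,
not the code behind the WEB page:

import cv2
import numpy as np

# Hypothetical pair of successive grayscale frames
prev = cv2.imread("frame_t0.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_t1.png", cv2.IMREAD_GRAYSCALE)

# Dense optical flow between the two frames
flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)

# Temporal disparity = flow magnitude; depth ~ 1/disparity, up to scale
disparity = np.linalg.norm(flow, axis=2)
depth = 1.0 / np.maximum(disparity, 1e-3)

# Normalize for display and save a grey-level depth map
depth_vis = cv2.normalize(depth, None, 0, 255, cv2.NORM_MINMAX)
cv2.imwrite("monocular_depth.png", depth_vis.astype(np.uint8))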