
Kinect Infrared Projection Laser Dots - could you explain whether the pattern can be replaced?


a a

Aug 14, 2022, 5:07:39 PM
Kinect Infrared Projection Laser Dots

https://youtu.be/DxDRMJgUKXI


Kinect - sensor IR projection

https://youtu.be/MlTf0yYQjSg

Could you let me know whether the IR pattern projected by the Kinect is random and whether it can be replaced by another pattern?

Would the Kinect still work if you used a higher-power IR projector, if you moved the IR projector further from the IR camera, or if you replaced the projected point-cloud pattern with another pattern at a higher or lower resolution?

a a

Aug 15, 2022, 11:20:29 AM
-I need to learn more about the IR laser projector in the Kinect.

-"The Kinect infrared sensor sees the sofa as a large number of tiny dots. The Kinect sensor constantly projects these dots over the area in its view. If you want to view the dots yourself, it’s actually very easy; all you need is a video camera or camcorder that has a night vision mode. A camera in night vision mode is sensitive to the infrared light spectrum that the Kinect distance sensor uses.

Figure 1-6, for example, was taken in complete darkness, with the sofa lit only by the Kinect. The infrared sensor in the Kinect is fitted with a filter that keeps out ordinary light, which is how it can see just the infrared dots, even in a brightly lit room. The dots are arranged in a pseudo-random pattern that is hardwired into the sensor. You can see some of the pattern in Figure 1-7."


https://www.microsoftpressstore.com/articles/article.aspx?p=2201646

-I need to know how the pseudo-random pattern of dots is generated by the IR laser projector. The resolution is not low:

https://www.microsoftpressstore.com/content/images/chap1_9780735663961/elementLinks/httpatomoreillycomsourcemspimages1239382.jpg

The dots are not lined up on a grid, so it seems to me that some laser optics / lens is involved.

- a single laser diode plus a lens with a drilled pattern?

"pseudo-random pattern that is hardwired into the sensor.

not sure what they mean, since to have pseudo-random[pattern hardwired into the sensor
you need to get (x,y) coordinates for every single point
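
A minimal sketch of one possible reading, assuming "hardwired" just means "fixed and reproducible": a deterministic pseudo-random generator with a fixed seed yields the same (x, y) dot coordinates on every run, so only the seed needs to be stored, not a coordinate table (in the real device the pattern is reportedly fixed by the projector optics rather than by software). The seed, dot count and grid size below are assumptions for illustration.

import numpy as np

# Sketch only, not the actual Kinect design: a pseudo-random dot
# pattern reproduced from a fixed seed.  Because the generator is
# seeded, the same (x, y) positions come out on every run, so nothing
# but the seed has to be "hardwired" -- no coordinate table is needed.
SEED = 1234                # arbitrary fixed seed (assumption)
WIDTH, HEIGHT = 640, 480   # grid size chosen to match the IR camera
NUM_DOTS = 30_000          # assumed dot count, for illustration only

def dot_pattern(seed=SEED, n=NUM_DOTS, w=WIDTH, h=HEIGHT):
    """Return an (n, 2) array of integer dot coordinates."""
    rng = np.random.default_rng(seed)
    return np.stack([rng.integers(0, w, n), rng.integers(0, h, n)], axis=1)

# Reproducible: both calls yield identical coordinates.
assert np.array_equal(dot_pattern(), dot_pattern())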

I recall another algorithm that generates 3D depth images from a moving camera, based on a blur effect: closer objects move faster in the plane perpendicular to the camera axis, so if we stack a number of images/frames together, the blur effect intensifies for the closer objects.
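
A minimal sketch of that idea, using dense optical flow as a stand-in for the blur measurement: for a translating camera, closer objects produce larger per-pixel motion, so the flow magnitude between two stacked frames gives a crude relative depth map. The frame filenames are placeholders.

import cv2
import numpy as np

# Sketch: crude relative depth from motion parallax between two frames
# taken by a translating camera.  Closer objects move more in the image
# plane, so larger flow magnitude ~ smaller depth.
prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

# Dense Farneback optical flow; flow[y, x] = (dx, dy) in pixels.
flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)
magnitude = np.linalg.norm(flow, axis=2)

# Inverse motion magnitude as relative depth (up to an unknown scale);
# the epsilon avoids division by zero in static regions.
relative_depth = 1.0 / (magnitude + 1e-3)
depth_vis = cv2.normalize(relative_depth, None, 0, 255, cv2.NORM_MINMAX)
cv2.imwrite("relative_depth.png", depth_vis.astype(np.uint8))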

----
"Kinect's infrared projector in action" - Vladimir Seregin, 17 Nov 2010
-Video recorded with a regular webcam without an IR filter on it.
https://www.youtube.com/watch?v=brnIty7mh2Q

Since every single dot is clearly visible in a regular HD webcam, the number of projected dots must be below the HD pixel count.
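
A back-of-envelope check of that reasoning; the 3 x 3 pixel footprint per resolvable dot is an assumption, not a measured value:

# How many individually resolvable dots fit into a 1280 x 720 frame,
# assuming each dot needs roughly a 3 x 3 pixel footprint to stay
# separable from its neighbours?
hd_pixels = 1280 * 720              # 921,600 pixels
pixels_per_dot = 3 * 3              # assumed footprint per dot
print(hd_pixels // pixels_per_dot)  # ~102,400 dots as an upper bound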

--
There are a number of web links, but they refer to the IR camera resolution only:

640 x 480
-The Kinect sensor returns 16 bits per pixel infrared data with a resolution of 640 x 480 as a color image format, and it supports up to 30 FPS. Following are a couple of images (taken in a completely dark room) captured from the IR stream data.
Get the IR Stream and control the IR Emitter – Kinect for Windows SDK
https://abhijitjana.net/2013/01/11/get-the-ir-stream-and-control-the-ir-emitter-kinect-for-windows-sdk/
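
A minimal sketch of handling one such frame after it has been read from the SDK's IR stream; the buffer layout (row-major 640 x 480, 16-bit values) is an assumption based on the quoted description, and the 8-bit conversion is only for display:

import numpy as np

WIDTH, HEIGHT = 640, 480  # per the quoted IR stream resolution

def ir_frame_to_8bit(raw_bytes: bytes) -> np.ndarray:
    """Interpret a raw 16-bit IR frame and stretch it to 8 bits for viewing."""
    frame = np.frombuffer(raw_bytes, dtype=np.uint16)
    assert frame.size == WIDTH * HEIGHT, "unexpected IR frame size"
    frame = frame.reshape(HEIGHT, WIDTH)
    lo, hi = int(frame.min()), int(frame.max())
    return ((frame - lo) / max(hi - lo, 1) * 255).astype(np.uint8)

# One frame occupies 640 * 480 * 2 = 614,400 bytes at 16 bits per pixel.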
a a
16:35
from
https://www.dfki.de/fileadmin/user_upload/import/8767_wasenmuller2016comparison.pdf

-The Kinect v1 measures the depth with the Pattern Projection principle, where a known infrared pattern is projected into the scene and out of its distortion the depth is computed. The Kinect v2 contains a Time-of-Flight (ToF) camera and determines the depth by measuring the time emitted light takes from the camera to the object and back. Therefore, it constantly emits infrared light with modulated waves and detects the shifted phase of the returning light [17, 18]. In the following, we refer to both cameras (Pattern Projection and ToF) as depth camera.
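
A minimal sketch of the two depth principles described in that passage, using the standard textbook relations (the numbers below are placeholders, not Kinect calibration values): pattern projection recovers depth by triangulation from the pattern's disparity, Z = f * b / d, while a continuous-wave ToF camera recovers distance from the measured phase shift, d = c * dphi / (4 * pi * f_mod).

import math

C = 299_792_458.0  # speed of light, m/s

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pattern projection / triangulation: Z = f * b / d."""
    return focal_px * baseline_m / disparity_px

def depth_from_phase(phase_shift_rad, mod_freq_hz):
    """Continuous-wave ToF: d = c * dphi / (4 * pi * f_mod)."""
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

# Illustrative values only (not actual Kinect parameters).
print(depth_from_disparity(580.0, 0.075, 20.0))   # ~2.18 m
print(depth_from_phase(math.pi / 2, 16e6))        # ~2.34 m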

-I am not sure how ToF could be implemented live in 2D image analysis.