What is an Oak-D-Lite?

Ross Lunan

Jan 27, 2022, 9:42:32 PM
to HomeBrew Robotics Club
At the Wed Jan 26 monthly meeting, several folks asked who makes the Oak-D-Lite and what its specs are. Here's an informative link from Luxonis: https://docs.luxonis.com/projects/hardware/en/latest/pages/DM9095.html . As I described, on my Mac M1 VM running Ubuntu 20/ROS Noetic, I got the basic Ubuntu depthai_demo.py working (don't forget the USB udev rules), and on ROS it ran without issue after compiling the dependencies and catkin source code from the luxonis/depthai-ros GitHub repository. The monochrome and disparity images and the RGB color image data are published on many topics that can be viewed in rqt_image_view. Neat. However, I cannot see any images in RViz: I can select from a long list of Fixed Frame topics, but can't find a combination that displays in the Camera view. Next I need to try the depthai-ros2 packages and research the packages that generate 3D depth images for SLAM.
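
(For anyone following along: the "USB udev rules" step from the Luxonis docs amounts to creating /etc/udev/rules.d/80-movidius.rules containing the single line below, then running "sudo udevadm control --reload-rules && sudo udevadm trigger" and replugging the camera.)

    SUBSYSTEM=="usb", ATTRS{idVendor}=="03e7", MODE="0666"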

James H Phelan

Jan 27, 2022, 10:05:16 PM
to 'Ross Lunan' via HomeBrew Robotics Club

Ross (with 2 S's),

That's pretty much where I am.  Let's keep working toward SLAM and sharing.

Jim

James H Phelan
"Nihil est sine ratione cur potius sit quam non sit"
Leibniz

"Here am I, the servant of the Lord;
let it be with me according to your Word"
Luke 1:38

jhph...@hal-pc.org

Jan 30, 2022, 3:47:39 PM
to HomeBrew Robotics Club

RoSS,
My Discord user name is jhphelan#2923

Ross Lunan

Apr 24, 2022, 7:47:16 PM
to HomeBrew Robotics Club
For those experimenting with the Oak-D-Lite, I can report a promising milestone for using this camera on a robot, at a much lower cost than the Intel and Zed products. I have NOT yet determined specific comparative metrics, but that's a work in progress. See the Luxonis DepthAI Discord posts by jetdillo and others for their further specific progress and experience. I successfully installed depthai-ros (Gen2) on a Raspberry Pi 4/4GB, the "robot" CPU (set a 4 GB swapfile), running the Ubuntu 20 MATE minimal desktop and ROS2 Foxy. Following https://github.com/luxonis/depthai-ros (the "Main" branch, NOT "Foxy"), I successfully ran "Install Dependencies" (OpenCV 4.5.1 compiled), the "Setting up procedure" for the workspace repository, rosdep, and colcon build, which worked for the ROS messages and bridge. It's a good afternoon's work with several long compiles, and there are a couple of patches needed (e.g., a hash code) that I posted on the Luxonis Discord.

So for robot applications, what about publishing the point-cloud depth image as a LaserScan? Since the Luxonis repository scripts were written assuming one CPU, i.e., my robot, which normally does not run GUI applications, I had to install ROS2 rviz2 and rqt_image_view and run "sudo apt-get install libogre-1.12-dev". From the depthai-ros (Gen2) depthai-ros-examples ROS2 source repository, I was able to successfully run "ros2 launch depthai_examples stereo_inertial_node.launch.py camera_mode:=OAK_D_LITE" and display /left/image_rect, /right/image_rect, /stereo/depth, and /stereo_converted_depth in rqt_image_view AND in the RViz instance started by that script on the Raspberry Pi "robot" Ubuntu desktop. The scripts also assume the yet-to-be-shipped Oak-D-Lite with an IMU, but that didn't seem to matter; there was simply no IMU data published.

Since we would prefer running rviz2 on a remote host PC (an amd64 Ubuntu desktop) rather than on the Raspberry Pi 4 robot, I will next try revising the repository launch.py file to remove the RViz startup script and parameter file, and installing an appropriate launch.py on the host desktop. Then I will work on linking the robot camera's published data to the ROS2 depth-cloud-to-laserscan subscriber package, as sketched below. My plan is to emulate a "Linorobot2" robot, which is a Raspberry Pi/Teensy micro-ROS architecture.
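
(As a sketch of that depth-cloud-to-laserscan step: a minimal ROS2 launch file using the pointcloud_to_laserscan package might look like the following. The input topic /stereo/points and the frame name oak-d_frame are assumptions; substitute whatever your depthai launch actually publishes.)

    from launch import LaunchDescription
    from launch_ros.actions import Node

    def generate_launch_description():
        return LaunchDescription([
            Node(
                package='pointcloud_to_laserscan',
                executable='pointcloud_to_laserscan_node',
                name='pointcloud_to_laserscan',
                # 'cloud_in' is the node's input; remap it to the camera's point-cloud topic
                remappings=[('cloud_in', '/stereo/points')],  # assumed topic name
                parameters=[{
                    'target_frame': 'oak-d_frame',  # assumed camera TF frame
                    'min_height': -0.2,  # vertical slice of the cloud to flatten, in meters
                    'max_height': 0.5,
                    'range_min': 0.3,
                    'range_max': 5.0,
                }],
            ),
        ])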

Mark Johnston

Apr 25, 2022, 4:05:26 AM
to HomeBrew Robotics Club
I (sadly) have had nothing but great grief getting the Oak-D-Lite all ready to rock on a Pi 4 on Ubuntu 20.04.
I have all but given up on this gizmo for now. Perhaps I will revisit it if others show how to make it all install. Some of it installs; other things crash and burn.

I await a 'formula', as my attention span is 'spent'.
Mark

Gmail

Apr 25, 2022, 6:09:26 AM
to hbrob...@googlegroups.com
I found it to work well on Windows 10. 😁



Thomas

-
Want to learn more about ROBOTS?

Ross Lunan

Apr 25, 2022, 12:14:29 PM
to HomeBrew Robotics Club
Apologies for not mentioning that deploying depthai-ros on ROS2 on a Raspi 4 "is not for the faint of heart"; "bleeding edge" also applies. I too almost gave up a couple of times, but got reasonably good responses to questions, bug fixes, etc., from the Luxonis folks on their Discord forum. Getting a depthai-demo image on Windows and Ubuntu is quite easy, but ROS is a whole different thing! It will be nice when compiled install binaries are released for "apt install", as they are for the Intel and ZED camera devices. Stay tuned. Another somewhat related thing I read on a recent Linorobot forum is that cameras can work better when connected through a powered USB hub instead of directly to the Raspi or Jetson. My OAK-D-Lite apparently worked connected directly.

Martin Triplett

May 2, 2022, 7:49:24 PM
to HomeBrew Robotics Club
I have been thinking about trying an OAK-D or OAK-D-Lite, but have been weighing the costs and benefits against my current setup. I don't want to fight with the ROS install (I am not running ROS), and only really want to use the OAK-D if I can drive it through a fairly simple Python API, one with few dependency issues, as I am doing a lot of other stuff and don't need my hands tied.
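
(For context, the DepthAI Gen2 Python API is fairly compact; a minimal color-preview sketch, assuming "pip install depthai opencv-python", looks roughly like this:)

    import cv2
    import depthai as dai

    # Build a pipeline: color camera -> XLink stream back to the host
    pipeline = dai.Pipeline()
    cam = pipeline.create(dai.node.ColorCamera)
    cam.setPreviewSize(640, 480)
    cam.setInterleaved(False)

    xout = pipeline.create(dai.node.XLinkOut)
    xout.setStreamName("preview")
    cam.preview.link(xout.input)

    with dai.Device(pipeline) as device:
        q = device.getOutputQueue("preview", maxSize=4, blocking=False)
        while True:
            frame = q.get().getCvFrame()  # BGR numpy array
            cv2.imshow("preview", frame)
            if cv2.waitKey(1) == ord('q'):
                break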

Right now, I am using an Orbbec Embedded S for color and depth, and an Intel T265 for its onboard SLAM, IMU, and fisheye cams. Using the Orbbec from Python is way simple and the cam is very small. The T265 is a bit of a pain in the ass to use, but it's the only option I know of for visual SLAM calculated right on the sensor.

I think I might be able to swap the Orbbec for the OAK-D with some redesign, as the OAK-D is much more massive. I would still keep the T265 for its visual SLAM. The advantage of the OAK would be being able to run, onboard the camera, some models that I currently run off-bot through an API. My setup works but is obviously slower for models. I am not sure that faster models are worth the trouble, though. Many parts of my vision system cannot run onboard the OAK-D... so is speeding up a few models like YOLO worth it? I guess it depends on what else DepthAI can do.

To that end, I tried out a Movidius stick at one point but later got rid of it.  At the time I was running a Latte Panda Alpha which was faster at running most of the models itself if it wasn't doing too many other things.  I also came to the conclusion that less is more sometimes and started putting more thought and energy into preventing the robot from doing computationally expensive things continuously at high frequencies that it didn't really need.  This led me down the path of being selective and doing vision at lower frequencies.  I also put energy into thinking about which aspects of vision to run at different times, and running different modules based on whether the bot is driving, looking down, interacting with a person, and such.

Gmail

May 2, 2022, 8:28:32 PM
to hbrob...@googlegroups.com
I have had a lot of success with the Oak-D on Windows! I've managed to get YOLO working, and with just a few lines of code I added some object filtering. The Oak-D has a large amount of sample code covering more image-processing tasks than I have seen anywhere else. I have tried out a dozen programs with great success!

The only problem I had was setting up the libraries. Once I got past that, it was smooth sailing. And that was just a Python thing; Python does a few things differently.
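
(The filtering Thomas mentions can be as simple as a list comprehension over the detection results; each detection from a DepthAI detection network carries an integer label index, a confidence, and a normalized bounding box. A sketch, with names chosen for illustration:)

    # Each detection has .label (an index into the model's label map),
    # .confidence, and normalized box coordinates .xmin/.ymin/.xmax/.ymax.
    def filter_detections(detections, label_map, wanted="person", min_conf=0.6):
        """Keep only detections of one class above a confidence threshold."""
        return [d for d in detections
                if label_map[d.label] == wanted and d.confidence >= min_conf]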




Thomas

-
Want to learn more about ROBOTS?

Jim DiNunzio

May 3, 2022, 1:14:22 AM
to hbrob...@googlegroups.com

I can second the success with the OAK-D. It is fairly easy to use with Python, and there are many examples for a variety of AI vision use cases. Many of my demonstrations with Big Orange utilize only a fraction of what the OAK-D can do. I am using Tiny YOLO v4 for object (including person) classification and detection, fused with the camera's stereo-disparity-based depth for 3D location of objects. I also use the facial recognition model, and I'm planning on using human pose recognition for gesture commands. I also use a second OAK-D camera to publish a depth map to the system for object avoidance.
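
(The fusion Jim describes runs on-device via DepthAI's spatial detection network. A condensed sketch: the blob path is a placeholder, and the class count and anchor values are the standard tiny YOLO v4 settings used in the depthai-python examples.)

    import depthai as dai

    pipeline = dai.Pipeline()

    # Color camera feeds the neural net; two mono cameras feed stereo depth
    cam = pipeline.create(dai.node.ColorCamera)
    cam.setPreviewSize(416, 416)
    cam.setInterleaved(False)

    mono_l = pipeline.create(dai.node.MonoCamera)
    mono_r = pipeline.create(dai.node.MonoCamera)
    mono_l.setBoardSocket(dai.CameraBoardSocket.LEFT)
    mono_r.setBoardSocket(dai.CameraBoardSocket.RIGHT)

    stereo = pipeline.create(dai.node.StereoDepth)
    stereo.setDepthAlign(dai.CameraBoardSocket.RGB)  # align depth to the color frames
    mono_l.out.link(stereo.left)
    mono_r.out.link(stereo.right)

    # The spatial detection network fuses YOLO boxes with depth on the camera itself
    nn = pipeline.create(dai.node.YoloSpatialDetectionNetwork)
    nn.setBlobPath("tiny-yolo-v4.blob")  # placeholder path to the compiled model
    nn.setConfidenceThreshold(0.5)
    nn.setNumClasses(80)
    nn.setCoordinateSize(4)
    nn.setAnchors([10, 14, 23, 27, 37, 58, 81, 82, 135, 169, 344, 319])
    nn.setAnchorMasks({"side26": [1, 2, 3], "side13": [3, 4, 5]})
    cam.preview.link(nn.input)
    stereo.depth.link(nn.inputDepth)

    xout = pipeline.create(dai.node.XLinkOut)
    xout.setStreamName("detections")
    nn.out.link(xout.input)

    with dai.Device(pipeline) as device:
        q = device.getOutputQueue("detections")
        while True:
            for det in q.get().detections:
                c = det.spatialCoordinates  # millimeters, relative to the camera
                print(det.label, round(det.confidence, 2), c.x, c.y, c.z)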

 

Here are the videos showing OAK-D use cases I’ve implemented so far.

https://www.youtube.com/watch?v=T5B8D91FKMs&list=PLmb0WjGtyZZTXYqRmrmorwfBD2mkl4XiM&index=24

https://www.youtube.com/watch?v=xaq6maH_mpU&list=PLmb0WjGtyZZTXYqRmrmorwfBD2mkl4XiM&index=23

https://www.youtube.com/watch?v=43mfdQKmNZQ&list=PLmb0WjGtyZZTXYqRmrmorwfBD2mkl4XiM&index=26

https://www.youtube.com/watch?v=dN-18Rl8BnA&list=PLmb0WjGtyZZTXYqRmrmorwfBD2mkl4XiM&index=36

https://www.youtube.com/watch?v=ChPo2-RmACg&list=PLmb0WjGtyZZTXYqRmrmorwfBD2mkl4XiM&index=37

https://www.youtube.com/watch?v=Xkt9fR5iBig&list=PLmb0WjGtyZZTXYqRmrmorwfBD2mkl4XiM&index=41

 

Jim

Martin Triplett

May 3, 2022, 10:36:12 AM
to HomeBrew Robotics Club
Great set of vids there, Jim. If you see comments from Red Sky Robotics... that is me.

You guys have just about talked me into hitting the purchase button on an OAK-D. Two of my bots are running Windows and Python, so maybe I have two reasonably simple APIs to access the sensor. I am also drawn to trying the OAK-1 on a smaller project (through Python), as it perhaps has no need for depth.

Scott Monaghan

May 3, 2022, 11:54:32 AM
to hbrob...@googlegroups.com
Martin,

I have an OAK-D lite that I don't have any near-term plans for.

If you want to borrow it and try it out, I’d be happy to ship it to you.


Ross Lunan

May 3, 2022, 12:56:43 PM
to HomeBrew Robotics Club

Scott “Robo Fun”, I have a question about the ROS2 iRobot base in your video. I have a derelict iRobot D550 base that does have a serial port, which I expect could be used to interface with a ROS2 node subscribing to /cmd_vel, à la the original TurtleBot Create and Botvac drivers. Could you point me to or share that package?

Steve " 'dillo" Okay

May 3, 2022, 12:57:59 PM
to HomeBrew Robotics Club
I'm pretty sure I posted my "people-follower" video here a while back, but I'm sure a re-post wouldn't hurt :)
This is Tenacity with an OAK-D v1 running YOLO v3 to look for the full outline of a "person" vs. legs or leg-like shapes in a LIDAR scan.
About halfway through, somebody walks in front of it and it ignores them and continues to follow me.
'dillo
https://www.youtube.com/watch?v=OadLnxf5cI4
 

Martin Triplett

May 4, 2022, 12:40:32 PM
to HomeBrew Robotics Club
Hi Scott,

Thank you for the very kind offer to let me try out your OAK-D Lite. I would love to take you up on it, but I fear I won't have the bandwidth to focus on it until sometime in July (a lot of social-calendar and business stuff is taking my time away from bots, including my upcoming Ava preso on the 25th). I wouldn't feel right having it sit around.

Perhaps I'll check in with you down the road and see if you are still willing to lend it out.

I have an idea for a new OS/code base I'd like to chat with you about and share/open source it with you if you are interested.  Perhaps we can do a zoom call on it if you have the time.

Regards,
Martin

Scott Monaghan

May 4, 2022, 1:23:52 PM
to hbrob...@googlegroups.com
Works for me! 

The price point puts it out of range for the Ro-Bud project, so I don't expect to have major use for it for some time.

Scott Horton

May 5, 2022, 10:09:40 PM
to HomeBrew Robotics Club
Hi Ross,


If the D550 supports the same serial command protocol as the Create2 (see https://www.irobotweb.com/-/media/MainSite/Files/About/STEM/Create/2018-07-19_iRobot_Roomba_600_Open_Interface_Spec.pdf), then these packages should work for you on ROS2 Galactic:

git@github.com:rshorton/create_robot.git
git@github.com:rshorton/libcreate.git
git@github.com:rshorton/diagnostics.git   (needed by create_robot)

These are my clones.  The original projects may have been updated for ROS-2 by now.  The URDF in create_robot has been modified for my Elsabot, so you would need to de-Elsabot it.
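
(A quick sanity check once a base driver like this is running, assuming it subscribes on the standard /cmd_vel topic, is to publish a small forward velocity from the ROS2 CLI:)

    ros2 topic pub -r 10 /cmd_vel geometry_msgs/msg/Twist "{linear: {x: 0.1}}"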

While a bit pricey, I have found the Create2 base very useful for getting started with ROS. I'm hoping to graduate from it soon and go with a 4- or 6-wheeled robot that I can use outside. I plan to use the Linorobot low-level driver and mostly just drop the rest of the Elsabot hardware and software stack on top, including the super-handy OAK-D.

Let me know if you have any questions.

Regards,
Scott Horton

Ross Lunan

May 6, 2022, 12:26:13 AM
to HomeBrew Robotics Club
Thank you; that looks like just what I was looking for. I can't wait to load this code onto a Raspi 4 and try talking to my Roomba. The Autonomy GitHub has a Foxy branch that seems alive and well, as some code was updated a few weeks ago. BTW, I'm interacting with the Luxonis folks on their Discord forum to get them to refactor one of their current Oak-D-Lite depthai-ros-examples launch scripts specifically for a typical robot/host-desktop configuration. The purpose is to publish camera-aligned RGB and depth point-cloud data from the robot CPU, to be subscribed to by the pointcloud-to-laserscan package for the SLAM used by Nav2, with display in RViz on the host desktop CPU. I too am starting the build of a Linorobot2, as presented at a recent HBRC meeting, but with my Oak-D-Lite camera instead of the Intel or Zed cameras.