--
You received this message because you are subscribed to the Google Groups "ROS Sig Aldebaran" group.
To unsubscribe from this group and stop receiving emails from it, send an email to ros-sig-aldebaran+unsubscribe@googlegroups.com.
To post to this group, send email to ros-sig-aldebaran@googlegroups.com.
Visit this group at https://groups.google.com/group/ros-sig-aldebaran.
To view this discussion on the web visit https://groups.google.com/d/msgid/ros-sig-aldebaran/c4ef5aa7-32ad-4d39-bcdd-7ef259db738d%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
Hi Rein,
I'll try to answer some of your questions, but the answers are based only on my experience (I have not participated in the development of this software).

You can control the real robot, let's say, in two ways: using the NAOqi APIs (C++, Python, ...) or using ROS. Actually, ROS acts as a bridge, transforming NAOqi data into ROS messages and ROS commands back into NAOqi calls. So, if you don't compile the ROS nodes on the robot (I think it's possible, but I have never tried), they run on your external PC and communicate with the robot using the NAOqi APIs.
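To make the first path concrete, here is a minimal Python sketch of talking to NAOqi directly. ALProxy, ALTextToSpeech and the default port 9559 are standard NAOqi entry points; ROBOT_IP is a placeholder and normalize_say_text is an illustrative helper, not part of the SDK:

```python
# Minimal sketch: driving Pepper directly through the NAOqi Python API.
# ROBOT_IP is a placeholder and normalize_say_text is an illustrative
# helper; ALProxy, ALTextToSpeech and port 9559 are standard NAOqi.

def normalize_say_text(text):
    """Collapse whitespace and reject empty strings before sending to TTS."""
    cleaned = " ".join(text.split())
    if not cleaned:
        raise ValueError("nothing to say")
    return cleaned

def main():
    # Run this on a machine with the NAOqi Python SDK and a reachable robot.
    from naoqi import ALProxy

    tts = ALProxy("ALTextToSpeech", "ROBOT_IP", 9559)
    tts.say(normalize_say_text("Hello,   world"))

# Call main() once ROBOT_IP points at your Pepper.
```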
For example, the ROS node naoqi_driver (which is used when you call pepper_bringup) subscribes to some ROS topics. It uses the NAOqi API to apply the commands (for example, see here). It also reads the data from the robot's sensors and publishes them in ROS.

To interact with the robot using ROS, you can use pepper_bringup or pepper_dcm_bringup. They run on an external PC, so you have to set the right IP in the launch file. I think the only difference between the two is how the joints of the robot are controlled, and the second one should be more efficient. pepper_bringup uses ALMotion, while pepper_dcm_bringup uses the DCM directly. The DCM is lower-level and sets the joint positions at a higher frequency: it runs on a constant time cycle, typically between 1 and 10 ms depending on the robot, while ALMotion runs at a lower frequency.
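To give a feeling for the difference between the two command paths, here is a sketch assuming the standard NAOqi ALProxy interface. The DCM request format is [key, update_type, [[value, dcm_time], ...]]; the actuator key below follows the usual Device/SubDeviceList pattern but should be treated as an assumption for your robot version:

```python
# Sketch of the two joint-command paths described above. ROBOT_IP is a
# placeholder; the actuator memory key is an assumption for your robot
# version.

HEAD_YAW_KEY = "Device/SubDeviceList/HeadYaw/Position/Actuator/Value"

def build_dcm_request(key, angle_rad, dcm_time):
    """Pack a single-target DCM.set() request.

    "ClearAll" drops any previously queued commands for this actuator.
    """
    return [key, "ClearAll", [[float(angle_rad), int(dcm_time)]]]

def main():
    # Run this on a machine with the NAOqi Python SDK and a reachable robot.
    from naoqi import ALProxy

    # High-level path: ALMotion interpolates and smooths the motion.
    motion = ALProxy("ALMotion", "ROBOT_IP", 9559)
    motion.setAngles("HeadYaw", 0.3, 0.1)

    # Low-level path: the DCM applies the target on its own 1-10 ms cycle,
    # with no smoothing -- which is why it can make the robot shake.
    dcm = ALProxy("DCM", "ROBOT_IP", 9559)
    dcm.set(build_dcm_request(HEAD_YAW_KEY, 0.3, dcm.getTime(500)))

# Call main() once ROBOT_IP points at your Pepper.
```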
You just have to run pepper_full.launch and you should get all the sensor data, and you can send commands to the robot (to move the base and the joints). You can modify the settings in the config file https://github.com/ros-naoqi/naoqi_driver/blob/master/share/boot_config.json, and you can visualize the sensor data using RViz.

About the Gazebo simulation: as Sammy said, it is missing some things. Right now, you cannot send a velocity to the base (see here). If I understood well, you can only control the simulated robot using ROS commands; you cannot use the NAOqi APIs as with the real robot, because NAOqi is not running in Gazebo. You can do that in Webots with the NAO robot, but I don't know if you can simulate Pepper there.
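As an illustration of sending a base command through ROS once pepper_full.launch is up, here is a sketch publishing to the /cmd_vel topic that naoqi_driver listens on. The velocity limits are assumed safety values for illustration, not official Pepper numbers:

```python
# Sketch: sending a base velocity command to the /cmd_vel topic that
# naoqi_driver subscribes to. MAX_LINEAR and MAX_ANGULAR are assumed
# safety limits for illustration, not official Pepper values.

MAX_LINEAR = 0.5   # m/s (assumed)
MAX_ANGULAR = 1.0  # rad/s (assumed)

def clamp_velocity(linear_x, angular_z):
    """Clamp a (linear, angular) command to the assumed safety limits."""
    def clamp(value, limit):
        return max(-limit, min(limit, value))
    return clamp(linear_x, MAX_LINEAR), clamp(angular_z, MAX_ANGULAR)

def main():
    # Run this inside a ROS environment with naoqi_driver already up.
    import rospy
    from geometry_msgs.msg import Twist

    rospy.init_node("pepper_base_teleop")
    pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
    rospy.sleep(1.0)  # give the publisher time to connect

    cmd = Twist()
    cmd.linear.x, cmd.angular.z = clamp_velocity(0.2, 0.0)
    pub.publish(cmd)

# Call main() from a terminal with your ROS workspace sourced.
```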
2017-02-19 13:36 GMT+01:00 Rein Appeldoorn <rei...@gmail.com>:
Hi All,

First of all, thanks for all the ROS packages you made available for the Pepper robot! It is great to see that you are supporting ROS!

I have been looking at all available Pepper packages. However, it is hard to get an overview of what is applicable to Pepper and which launch files to use, so I have the following questions. I would like to have a set-up with exactly the same interfaces for Gazebo as for the real robot.

What to use for simulation (Gazebo)?
- This launch file https://github.com/ros-naoqi/pepper_virtual/blob/master/pepper_gazebo_plugin/launch/pepper_gazebo_plugin_Y20.launch gives weird falling behaviour. Is anybody using this simulation environment?
- I only see two cameras simulated; is this correct or am I missing something? (https://github.com/ros-naoqi/pepper_robot/blob/master/pepper_description/urdf/pepper1.0_generated_urdf/pepperGazebo.xacro) Why are a lot of sensors commented out?

What to use for interacting with the real robot (motion control + sensors)?
- What is the exact difference between the DCM bringup and the normal bringup? (http://wiki.ros.org/pepper_dcm_bringup, http://wiki.ros.org/pepper_bringup) I guess, since we would like to streamline the simulation and robot interfaces, we should use dcm_bringup? Should we run this on the robot or on an external platform via the ethernet connector?
- What is the difference between the Python and C++ driver? Which driver should we use? Which one is better?
- What is the naoqi bridge for? https://github.com/ros-naoqi/naoqi_bridge
- Where can I find all the sensor drivers? Are these all the available drivers (https://github.com/ros-naoqi/pepper_robot/blob/master/pepper_bringup/launch/pepper_full_py.launch)?
- The naoqi_driver also seems to have sonar interfaces etc. (https://github.com/ros-naoqi/naoqi_driver/tree/master/src/publishers); should we use these?

I would like to contribute but I do not have a clear overview of what is available and working properly.
I do not want to redo any work you have done!

Thanks in advance,
-Rein
Hello,
and thank you to Giovanni for all these nice answers!
I'll try to give some more details here.
On Friday, February 24, 2017 at 12:00:50 PM UTC+1, Giovanni Claudio wrote:
Right, the ROS packages communicate with NAOqi, which runs on the robot, and NAOqi then communicates with the hardware. It is possible to run all the packages on a remote PC, and also on the robot. However, if you want to run a package on the robot, you need to compile that package and its dependencies on OpenNAO (a virtual machine), or cross-compile with a toolchain, which is not always easy because the VM is very old.
The pepper_bringup and pepper_dcm_bringup packages have different goals.

pepper_bringup is made to listen to the state of the robot and do simple control, such as navigation or joint control. It is based on naoqi_driver, which is more generic and can be used for any of the robots (NAO, Pepper, or Romeo).

pepper_dcm_bringup is made for controlling the robot with ROS controllers, such as trajectory and position controllers, for example when you need to control your robot via MoveIt. It can drive the robot via ALMotion (by default) or via the DCM (depending on a parameter in the launch file). The DCM path is faster but can make the robot shake.

You can use both packages together or separately, depending on what you need.
Right, the Gazebo plugin is under development. It simulates only some sensors, like cameras and sonars (check the last commit https://github.com/ros-naoqi/pepper_robot/commit/a24bc5543c9d0b3afacd7f1a7d15e4eee1832198), and it allows you to simulate joint control of the robot; for example, it works well with MoveIt when doing arm control, grasping, etc. However, cmd_vel does not work in the current version because of the currently used controller (check this PR, maybe it can help you find a way to make it work: https://github.com/ros-naoqi/pepper_robot/pull/31). If you find a way to fix cmd_vel, please do a PR! If you wish to contribute to the Gazebo plugin, you are very welcome!
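As an example of the MoveIt path that does work in simulation, here is a sketch using moveit_commander. The group name "left_arm" and the joint limits are assumptions; check your own MoveIt configuration for the real names and values:

```python
# Sketch: commanding the simulated Pepper arm through MoveIt, which the
# Gazebo plugin's joint controllers can execute. The group name "left_arm"
# and the joint limits below are assumptions -- check your own MoveIt
# configuration.

def clamp_joint_targets(targets, lower, upper):
    """Clamp each joint target into its [lower, upper] limit."""
    if not (len(targets) == len(lower) == len(upper)):
        raise ValueError("joint/limit count mismatch")
    return [max(lo, min(hi, t)) for t, lo, hi in zip(targets, lower, upper)]

def main():
    # Run this inside a ROS environment with MoveIt configured for Pepper.
    import sys
    import moveit_commander

    moveit_commander.roscpp_initialize(sys.argv)
    group = moveit_commander.MoveGroupCommander("left_arm")  # assumed name

    targets = clamp_joint_targets([0.5, 0.3, -1.2, 0.9, 0.0],
                                  [-2.0] * 5, [2.0] * 5)  # example limits
    group.set_joint_value_target(targets)
    group.go(wait=True)

# Call main() with the MoveIt move_group node running.
```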
Another point to take into account: the current Gazebo plugin does not take NAOqi into account. If you wish to do that, check https://github.com/costashatz/nao_gazebo for NAO; it should be very similar.
I have been looking into pepper_sensors_py, but it seems that these drivers are not implemented correctly. For example: https://github.com/ros-naoqi/pepper_robot/blob/master/pepper_sensors_py/nodes/laser.py. The main loop fetches 6 sensors using the ALMemory API, which drops the frequency to 1 Hz per sensor. Do you agree with my statement? If so, I can work on creating proper drivers for all sensors; but in Python or C++? And should we include these sensors in naoqi_driver or keep them separate? What determines the distinction?
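To illustrate the batching idea, here is a sketch that replaces six sequential reads with one ALMemory.getListData() round-trip. The memory keys below are placeholders, not the real Pepper laser keys, and ROBOT_IP is a placeholder:

```python
# Sketch: one batched ALMemory read instead of six sequential ones.
# SENSOR_KEYS are placeholder names, not the real Pepper laser memory
# keys, and ROBOT_IP is a placeholder.

SENSOR_KEYS = ["Device/Example/Sensor%d/Value" % i for i in range(6)]

def keys_to_readings(keys, values):
    """Pair each memory key with the value returned by getListData()."""
    if len(keys) != len(values):
        raise ValueError("key/value count mismatch")
    return dict(zip(keys, values))

def main():
    # Run this on a machine with the NAOqi Python SDK and a reachable robot.
    from naoqi import ALProxy

    memory = ALProxy("ALMemory", "ROBOT_IP", 9559)
    # One network round-trip for all six sensors, instead of six
    # sequential getData() calls that cap the publishing rate.
    return keys_to_readings(SENSOR_KEYS, memory.getListData(SENSOR_KEYS))

# Call main() in the node's publish loop.
```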
I would like to fork and add PRs, but the package structure is not clear to me. This makes me lean more towards starting a new package for all Pepper interfaces, based on the various existing ones.
If you have a question about a particular package, you should ask its developers. If you see that something is missing, you are always welcome to do a pull request.
About NAOqi in Gazebo: do you mean that the plugin does NOT take NAOqi into account? IMO, it is best to make the simulator NAOqi-independent and use ROS's autonomy modules for the behaviour. I have also been diving into pepper_description and the PRs regarding the Gazebo simulation/plugins. Every description file seems to be auto-generated ("****** File automatically generated by generate_urdf.py script ******"). Where is this generation script, in case I would like to work on getting the simulator up and running?
Sorry for the long post; I really appreciate your help. We will contribute soon once we have a plan to tackle the open issues :). And by the way, Pepper is an awesome robot :).