I want to make a simulation of a Fanuc robot in Roboguide. My question concerns the specific configuration needed to send motions from ROS/MoveIt/RViz to the simulated robot inside Roboguide.
Briefly, I use Ubuntu 18.04.2 LTS (Bionic) and ros-melodic. Basically, I have the ROS/MoveIt repositories for Fanuc: I downloaded, installed and built these repositories from this link under my catkin_ws.
Next, in section 6 of the same tutorial, Copying the Binaries, it says: "Finally, the binaries need to be transferred to the controller." Since we are on a Windows PC working inside Roboguide, the necessary programs (compiled binaries) are already available inside Roboguide, and we are using the simulated controller. Where do we have to copy the files? Or does this step only apply if we had a real robot? If so, for our case with the simulation, we can just proceed with the next steps.
Next, I moved on to the next tutorial, which covers further configuration. Here I have a question about section 2, Server Tags: to check tag availability and status, it says to open the Host Comm setup screen, which I cannot locate. Please help me clarify these steps and give suggestions on the next steps.
The tutorials you reference are not meant as an introduction to using Fanuc robots, but to guide you through setting up the ROS driver for use with such robots. Note also the statement at the top of the tutorials:
Note: This tutorial assumes familiarity with Fanuc controller cabinets, the Teach Pendant (iPendant) and the Roboguide environment. For more information on any of the steps in this tutorial, consult the documentation on these subjects provided by Fanuc Robotics.
Thanks @gvdhoorn, I should've studied it a bit more. I opened the Teach Pendant; in the default configuration the relevant keyboard was not visible, so I thought perhaps I had missed some settings. After playing around with the keyboard and reading this tutorial, I managed to complete most of the configuration indicated. Later on, before running, it says: "On the TP, start the rosstate TPE program." Please excuse my ignorance: I can see the programs listed (ROSRELAY, ROSSTATE, ROS_MOVESM), but I am not sure how to "start" one. If I press Enter, it takes me to the code listing of the program. There are some keys/buttons on the top left of the TP that read Busy, Step, Hold, Fault, Run, I/O, Prod and TCyc, but they are probably only status indicators. Thanks,
I would suggest you try looking up the relevant manuals from Fanuc; becoming more familiar with these will help you work your way through the tutorials and will make working with Fanuc robots much more efficient.
Hi @gvdhoorn, thanks for your input. I agree; Roboguide, its terminology and its concepts need to be studied to understand the configuration easily. The lack of understanding is due to the way the task was assigned to me (about a month ago): with little time left on the trial version, the idea was to run and test a simple motion goal from ROS/MoveIt to Fanuc Roboguide. Really, it needs dedicated time to study the Fanuc controllers first, then the Roboguide tutorials, and only then to test the connection with ROS/MoveIt. The tutorials are probably aimed more at people who already work with Fanuc controllers and now want to see how ROS connects to them. For me it was the opposite: I have been learning ROS/MoveIt. I tested simple motions in RViz and on a real robotic arm (UR5), making use of ...(more)
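For reference, once the driver side is configured, a simple motion goal of the kind described can be scripted with moveit_commander. This is a minimal sketch, assuming a running move_group for the Fanuc robot and a planning group named "manipulator" (the group name is an assumption; check your SRDF for the actual name):

    #!/usr/bin/env python
    # Minimal sketch: send one pose goal through MoveIt. Assumes a running
    # move_group and a planning group named "manipulator" (adjust to your SRDF).
    import sys
    import rospy
    import moveit_commander

    moveit_commander.roscpp_initialize(sys.argv)
    rospy.init_node('simple_motion_goal')

    group = moveit_commander.MoveGroupCommander('manipulator')

    # Start from a small offset of the current pose so the goal is reachable.
    target = group.get_current_pose().pose
    target.position.z += 0.05  # move 5 cm up

    group.set_pose_target(target)
    plan_ok = group.go(wait=True)  # plan and execute
    group.stop()
    group.clear_pose_targets()

    rospy.loginfo('Motion %s', 'succeeded' if plan_ok else 'failed')
    moveit_commander.roscpp_shutdown()

MoveIt hands the resulting trajectory to whichever trajectory action server is connected underneath, so the same script should work against the Roboguide-simulated controller or a real one.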
This is just the tip of the KAREL iceberg. Hopefully this example will get you thinking how KAREL may be the right choice for certain programming tasks where TP may not be so good. If you really want to learn KAREL, see if you can get a copy of the KAREL manual from your local FANUC rep or integrator.
Second, as the manual states, the stack size includes all parameters (there is a table in the manual that specifies the size of each data type). In my case, the culprit was my generous use of string variable declarations.
String parameters can quickly add up, especially arrays of strings. Make sure you only declare what you need, or better yet, don't use a local variable if it is not strictly needed. In Fanuc's KAREL the stack space is tight, so make everything count.
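As a rough back-of-the-envelope check (the authoritative per-type sizes are in the data-type table of the KAREL reference manual; the per-variable header below is an assumption for illustration), you can tally what a set of local string declarations costs:

    # Rough sketch of how local STRING declarations add up on the KAREL stack.
    # The per-variable cost (declared length + small header) is an assumption;
    # the authoritative sizes are in the KAREL manual's data-type table.

    def string_cost(declared_len, header=2):
        """Approximate stack bytes for one STRING[declared_len] local."""
        return declared_len + header

    def array_of_strings_cost(count, declared_len):
        """Approximate stack bytes for ARRAY[count] OF STRING[declared_len]."""
        return count * string_cost(declared_len)

    # e.g. a single ARRAY[20] OF STRING[64] local:
    print(array_of_strings_cost(20, 64))  # ~1320 bytes of stack gone

Even one generously sized string array can consume a noticeable fraction of the available stack, which is why trimming declared lengths pays off.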
If the instances are robots, then just add them to an object process flow containing the emulation variable set up as the documentation describes, say with a label on each to use as the instance specifier. If the instances are something more abstract within FlexSim, like joints, you'd have to find a way of attaching those to the object process flow as members, such as by manually sampling the drawsurrogate tree.
I was assuming the node id format could be changed to "ns=1;i=302[0]" to access only one boolean value in the array, but manually setting that node id string in the text field didn't take effect.
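For what it's worth, a plain OPC UA node id string carries no element index; a single array element is addressed through the read service's IndexRange parameter, or the client simply reads the whole array and indexes it locally. A minimal sketch with the python-opcua package, using a placeholder endpoint and the node id from above:

    # Minimal sketch: read one element of a boolean array over OPC UA.
    # Uses the python-opcua package; the endpoint URL is a placeholder.
    from opcua import Client

    client = Client("opc.tcp://localhost:4840")  # placeholder endpoint
    client.connect()
    try:
        node = client.get_node("ns=1;i=302")  # the whole array node
        values = node.get_value()             # returns a Python list
        first = values[0]                     # index client-side, not in the id
        print("element 0:", first)
    finally:
        client.disconnect()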
The video shows the manual guidance of the KUKA iiwa within a program. The robot can be guided by hand at the flange while the user presses the gray enabling button. After releasing it, the program asks whether the current position is correct and should be saved. Any number of additional positions can then be added. At the end of the program, all stored positions are traversed in the taught sequence.
In this project, shapes are detected and their edges captured by a laser scanner mounted on the end effector of the robot. The collected data is synchronized and filtered, and a suitable trajectory is created for coating the inner surface of the forms. Several variables, such as nozzle speed, spacing and gaps, nozzle size, and outlier handling for a homogeneous coating, can be selected during trajectory planning.
In the EU project SHAREWORK we are developing a framework for industry that enables human-robot collaboration even with heavy industrial robots. We are developing camera-based methods for localizing objects in the workspace and deriving process states from them, which in turn can be used in task-planning algorithms.
For the camera network, we set up and calibrated four Stereolabs ZED stereo cameras in our hall. Random, checkerboard, ArUco and ChArUco patterns were used for the calibration. In the end, we managed to calibrate the cameras to sub-pixel accuracy. In the video, you can see some data from our calibration sets. Currently the data is being processed and we hope to show more in a few weeks.
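As a rough sketch of what the checkerboard part of such a calibration looks like (board dimensions and image paths are assumptions, not details from the project), OpenCV's standard pipeline detects the corners, refines them to sub-pixel accuracy, and solves for the intrinsics:

    # Sketch of single-camera checkerboard calibration with OpenCV.
    # Board dimensions and image glob are assumptions for illustration.
    import glob
    import cv2
    import numpy as np

    cols, rows = 9, 6  # inner corners of the board (assumed)
    objp = np.zeros((rows * cols, 3), np.float32)
    objp[:, :2] = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2)

    obj_points, img_points = [], []
    gray = None
    for path in glob.glob('calib/*.png'):  # placeholder image set
        gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, (cols, rows))
        if found:
            # refine corner locations to sub-pixel accuracy
            corners = cv2.cornerSubPix(
                gray, corners, (11, 11), (-1, -1),
                (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
            obj_points.append(objp)
            img_points.append(corners)

    # requires at least one successfully detected board
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, gray.shape[::-1], None, None)
    print('RMS reprojection error (px):', rms)

The RMS reprojection error is the usual measure of whether the calibration has actually reached sub-pixel accuracy.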
Robot Companion is a framework for implementing robot tracking systems in a simple and cost-saving way. For this purpose, IGMR develops methods for tracking with different sensors (laser, radar, camera), agile path planning, and actuation.
The current objective of Robot Companion is to provide a robot for emergency rescue. The robot will autonomously follow first responders and enable the transport of materials and equipment, as well as the removal of debris and casualties. A first step toward this vision was implemented with the basic module, which provides methods for camera- and laser-based tracking and enables autonomous following of an operator.