
Feb 7, 2022, 9:15:36 AM

to Robotics & Machine Vision Toolboxes

Hi,

I have a problem with the inverse kinematics in the Python toolbox. I want to build an application where the robot holds a sample at a given point and rotates it to a given orientation. The robot is a UR5e.

About 50% of the time, the solution computed by the inverse kinematics gives me a position and orientation deviation of the end effector from the expected pose. Here is what I do at the moment:

I compute a rotation matrix (RotMat) and place that orientation at a given point x0, y0, z0:

d = spatialmath.pose3d.SE3.Rt(RotMat,[x0,y0,z0], check=True)
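For reference, SE3.Rt assembles a 4x4 homogeneous transform from the rotation block and the translation column. A minimal numpy-only sketch of that assembly (the hypothetical `se3_from_rt` helper is mine; the toolbox additionally checks that RotMat is a valid SO(3) matrix when `check=True`):

```python
import numpy as np

def se3_from_rt(rot_mat, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation matrix
    and a 3-vector translation, i.e. [[R, t], [0, 1]]."""
    T = np.eye(4)
    T[:3, :3] = rot_mat
    T[:3, 3] = t
    return T

# Example: rotate 90 degrees about z, positioned at (0.3, 0.0, 0.5)
theta = np.pi / 2
rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                [np.sin(theta),  np.cos(theta), 0.0],
                [0.0,            0.0,           1.0]])
T = se3_from_rt(rot, [0.3, 0.0, 0.5])
```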

Then I compute the inverse kinematics solution sol for this homogeneous matrix d:

sol = robot.ikine_min(d,q0,qlim=True, ilimit=1000, tol=1e-20, method="trust-constr")

As I'm a curious person, I want to check whether the joint solution actually gives me the requested orientation at the requested point:

direct_test = robot.fkine(sol[0])

I expect d == direct_test, but in half of the cases there is a deviation, sometimes of more than 30 cm in position. The other half are exact solutions, with no deviation in position or orientation at all.
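To quantify the deviation rather than eyeballing d against direct_test, one can compare the two homogeneous matrices directly. A sketch (the `pose_error` helper is hypothetical, not a toolbox function): position error is the Euclidean distance between the translation columns, and orientation error is the rotation angle of the relative rotation, recovered from its trace.

```python
import numpy as np

def pose_error(T_goal, T_actual):
    """Return (translation error in metres, rotation error in radians)
    between two 4x4 homogeneous transforms."""
    dt = np.linalg.norm(T_goal[:3, 3] - T_actual[:3, 3])
    # Relative rotation R_goal^T @ R_actual; its angle follows from
    # trace(R) = 1 + 2*cos(angle).
    R_rel = T_goal[:3, :3].T @ T_actual[:3, :3]
    cos_ang = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    return dt, np.arccos(cos_ang)
```

With the toolbox objects you would call this as `pose_error(d.A, direct_test.A)`, since `.A` exposes the underlying numpy array of an SE3.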

How can I improve the results? I tried varying the solver's starting point q0 by a random value, with no effect on the results.
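For what it's worth, ikine_min is a local numerical optimizer, so it can converge to a local minimum far from the requested pose; one common mitigation is a multi-restart wrapper that reseeds q0 uniformly inside the joint limits and keeps the best residual. A generic sketch (the `solve` callback interface is an assumption of mine; with the toolbox you would adapt it to wrap `robot.ikine_min` and read the solution's residual):

```python
import numpy as np

def ik_with_restarts(solve, q_min, q_max, n_restarts=20, tol=1e-6, rng=None):
    """Multi-restart wrapper around a local IK solver.

    `solve(q0)` is assumed to return (q, residual).  Restarts from random
    seeds inside the joint limits raise the chance of escaping a bad local
    minimum; stop early once the residual is below `tol`.
    """
    rng = np.random.default_rng() if rng is None else rng
    best_q, best_res = None, np.inf
    for _ in range(n_restarts):
        q0 = rng.uniform(q_min, q_max)
        q, res = solve(q0)
        if res < best_res:
            best_q, best_res = q, res
        if best_res < tol:
            break
    return best_q, best_res
```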

BR

Hannes
