Channel: ROS Answers: Open Source Q&A Forum - RSS feed

How can we execute a plan via the motion planning interface in MoveIt?

When we use the MoveGroupInterface we can execute a plan using `move_group.move();`, and with the Python MoveGroupCommander we use:

```
group = moveit_commander.MoveGroupCommander(group_name)
plan = group.go(wait=True)
group.execute(plan, wait=True)
```

How do we execute a plan when we use the motion planning interface?
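For contrast, here is a minimal plan-then-execute sketch with the Python MoveGroupCommander, assuming the pre-Noetic API where `plan()` returns a single `RobotTrajectory`; the group and target names are hypothetical. A trajectory produced by the motion planning interface itself still has to be handed to a trajectory execution component, e.g. `MoveGroupInterface::execute` in C++:

```
import sys
import rospy
import moveit_commander

# Minimal plan-then-execute sketch (pre-Noetic API: plan() returns the
# trajectory directly; "manipulator" is a hypothetical group name).
moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("plan_then_execute")
group = moveit_commander.MoveGroupCommander("manipulator")

group.set_named_target("home")    # any reachable target works here
plan = group.plan()               # plan only; nothing moves yet
if plan.joint_trajectory.points:  # an empty trajectory means planning failed
    group.execute(plan, wait=True)
```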

What is the modern alternative to PlanningSceneMonitor::updateFrameTransforms?

While having [panda_moveit_config's](https://github.com/ros-planning/panda_moveit_config) panda_moveit.launch up and running with a position controller, loading objects into the planning scene with `moveit::planning_interface::PlanningSceneInterface::addCollisionObjects` results in the error `Unable to transform from frame 'world' to frame '/panda_link0'. Returning identity.`. I tracked the issue down to [moveit::core::getTransform](https://github.com/ros-planning/moveit/blob/68eb44ded1ca665dfab81fc5c615a81f95409291/moveit_core/transforms/src/transforms.cpp#L89), which I suppose gets called by the interface, which in turn calls methods from `planning_scene`, probably [here](https://github.com/ros-planning/moveit/blob/68eb44ded1ca665dfab81fc5c615a81f95409291/moveit_core/planning_scene/src/planning_scene.cpp#L82).

However, I'm 100% sure the transform exists before I publish the objects, because I sleep until it's available with `!buffer_.canTransform("panda_link0", "world", ros::Time {0})` (where `buffer_` is a `tf2_ros::Buffer`), and I can see it if I `rosrun tf2_tools view_frames.py`.

Looking into it some more, I found that the planning scene likely needs to update its internal map of transforms; however, the only call I could find to force this is in [planning_scene_monitor](https://github.com/ros-planning/moveit/blob/3e1670a6fe6d11750036a5a377d1b700c1e47f11/moveit_ros/planning/planning_scene_monitor/include/moveit/planning_scene_monitor/planning_scene_monitor.h#L298), which IIRC is not recommended for direct use, and which uses `tf` instead of `tf2` in the only constructor I could call. From this stems the title question. Is there otherwise an alternative way to get a pointer to an existing monitor from `MoveGroup`/`PlanningSceneInterface`/something else that I didn't see? [This issue](https://github.com/ros-planning/moveit/issues/323) is also noteworthy, as it discusses the solution I would have gone for, but it doesn't seem ideal to post directly there since it's two years old.
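As a side note, one way to sidestep the failing lookup entirely (a sketch, not the monitor-based fix the question asks about; the box name, size, and pose are hypothetical) is to publish the collision object directly in the planning frame, so no world transform is needed:

```
import sys
import rospy
import moveit_commander
from geometry_msgs.msg import PoseStamped

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("add_object_demo")
scene = moveit_commander.PlanningSceneInterface()
rospy.sleep(1.0)  # give the interface time to connect to the scene topic

pose = PoseStamped()
pose.header.frame_id = "panda_link0"  # the planning frame itself, so no lookup
pose.pose.position.x = 0.5
pose.pose.orientation.w = 1.0
scene.add_box("box1", pose, size=(0.1, 0.1, 0.1))
```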

Stop and continue with cartesian path

Hello, I would like to make a Cartesian plan of, let's say, a circle, then execute it asynchronously. After a while I would stop the execution and wait a little; then I would like to continue with the trajectory I started at the beginning and finish the circle. What I did:

1. Compute a Cartesian plan of the whole circle
2. Asynchronously execute the computed plan
3. rospy.sleep() to simulate a random input signal (waiting for a signal)
4. group.stop() to stop the plan
5. rospy.sleep() to actually wait in place a little
6. Extract all remaining points from the plan that I still need to execute
7. group.move(), but this is very choppy

So instead I removed the already-executed points from the plan with plan.joint_trajectory.points.pop(), but this does not work: it fails during execution, saying there is no plan for the move group. See the sketch after this list.
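A sketch of the resume step described above, assuming the pre-Noetic moveit_commander API; the group name and the stop index are hypothetical. The key point is to retime the remaining waypoints from the current state instead of executing the stale tail:

```
import sys
import copy
import rospy
import moveit_commander
from moveit_msgs.msg import RobotTrajectory

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("resume_demo")
robot = moveit_commander.RobotCommander()
group = moveit_commander.MoveGroupCommander("manipulator")  # hypothetical group name

# Plan a small Cartesian move and pretend it was stopped halfway through.
wpose = group.get_current_pose().pose
wpose.position.z += 0.10
plan, fraction = group.compute_cartesian_path([wpose], 0.01, 0.0)
idx = len(plan.joint_trajectory.points) // 2  # hypothetical stop index

# Keep only the waypoints that were not executed yet.
remainder = RobotTrajectory()
remainder.joint_trajectory = copy.deepcopy(plan.joint_trajectory)
remainder.joint_trajectory.points = remainder.joint_trajectory.points[idx:]

# Re-timestamping is the crucial step: the stale time_from_start values (and a
# first point that no longer matches the current state) make the raw tail fail.
retimed = group.retime_trajectory(robot.get_current_state(), remainder, 1.0)
group.execute(retimed, wait=True)
```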

Adding snap (derivative of jerk) to a trajectory

Hi, I am using ROS Kinetic with Ubuntu 16.04. I want to apply a quintic spline to the acceleration profile of the trajectory produced by MoveIt!. I have already applied jerk limits and produced a better (trapezoidal) acceleration profile. However, I now want to enforce snap continuity (snap being the derivative of jerk) to make the acceleration profile even smoother. I didn't find any documentation related to it. Can someone help me with this?
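There seems to be no built-in snap limiting, but the smoothness of a profile can at least be inspected numerically; a small NumPy sketch, where the sampled profile is a made-up stand-in for one joint of a MoveIt trajectory:

```
import numpy as np

# Hypothetical samples of one joint's acceleration profile a(t): a triangular
# ramp up to t = 1 s and back down, similar in spirit to a trapezoidal profile.
t = np.linspace(0.0, 2.0, 2001)
a = np.where(t < 1.0, 2.0 * t, 2.0 * (2.0 - t))

jerk = np.gradient(a, t)     # 1st derivative of acceleration
snap = np.gradient(jerk, t)  # 2nd derivative of acceleration (snap)

# Spikes in 'snap' mark the corners where jerk jumps; a snap-continuous
# profile (what the question asks for) keeps this bounded everywhere.
print("max |jerk|: %.3f, max |snap|: %.1f" % (np.abs(jerk).max(), np.abs(snap).max()))
```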

moveit: error: moveit_controller_manager::MoveItControllerManager does not exist

Hello, I'm trying to run the ABB experimental MoveIt Gazebo integration tutorial as suggested [here](https://github.com/ros-industrial/abb_experimental/tree/kinetic-devel/abb_irb120_gazebo). When I run the command to invoke RViz I get the following error:

```
FATAL: Exception while loading controller manager 'moveit_simple_controller_manager/MoveItSimpleControllerManager': According to the loaded plugin descriptions the class moveit_simple_controller_manager/MoveItSimpleControllerManager with base class type moveit_controller_manager::MoveItControllerManager does not exist. Declared types are moveit_fake_controller_manager/MoveItFakeControllerManager
```

Does anybody have any idea why this is happening? The full terminal text is as follows:

```
mvish7@FlyingDutchman:~/abb_experimental_ws$ source devel/setup.bash
mvish7@FlyingDutchman:~/abb_experimental_ws$ roslaunch abb_irb120_moveit_config moveit_planning_execution_gazebo.launch
... logging to /home/mvish7/.ros/log/d908f578-01f5-11e9-a70b-f82819cb83d3/roslaunch-FlyingDutchman-16792.log
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://FlyingDutchman:40763/

SUMMARY

PARAMETERS
 * /controller_joint_names: ['joint_1', 'join...
 * /move_group/allow_trajectory_execution: True
 * /move_group/controller_list: [{'action_ns': 'j...
 * /move_group/jiggle_fraction: 0.05
 * /move_group/manipulator/longest_valid_segment_fraction: 0.005
 * /move_group/manipulator/planner_configs: ['SBLkConfigDefau...
 * /move_group/manipulator/projection_evaluator: joints(joint_1,jo...
 * /move_group/max_range: 5.0
 * /move_group/max_safe_path_cost: 1
 * /move_group/moveit_controller_manager: moveit_simple_con...
 * /move_group/moveit_manage_controllers: True
 * /move_group/octomap_resolution: 0.025
 * /move_group/planner_configs/BFMTkConfigDefault/balanced: 0
 * /move_group/planner_configs/BFMTkConfigDefault/cache_cc: 1
 * /move_group/planner_configs/BFMTkConfigDefault/extended_fmt: 1
 * /move_group/planner_configs/BFMTkConfigDefault/heuristics: 1
 * /move_group/planner_configs/BFMTkConfigDefault/nearest_k: 1
 * /move_group/planner_configs/BFMTkConfigDefault/num_samples: 1000
 * /move_group/planner_configs/BFMTkConfigDefault/optimality: 1
 * /move_group/planner_configs/BFMTkConfigDefault/radius_multiplier: 1.0
 * /move_group/planner_configs/BFMTkConfigDefault/type: geometric::BFMT
 * /move_group/planner_configs/BKPIECEkConfigDefault/border_fraction: 0.9
 * /move_group/planner_configs/BKPIECEkConfigDefault/failed_expansion_score_factor: 0.5
 * /move_group/planner_configs/BKPIECEkConfigDefault/min_valid_path_fraction: 0.5
 * /move_group/planner_configs/BKPIECEkConfigDefault/range: 0.0
 * /move_group/planner_configs/BKPIECEkConfigDefault/type: geometric::BKPIECE
 * /move_group/planner_configs/BiESTkConfigDefault/range: 0.0
 * /move_group/planner_configs/BiESTkConfigDefault/type: geometric::BiEST
 * /move_group/planner_configs/BiTRRTkConfigDefault/cost_threshold: 1e300
 * /move_group/planner_configs/BiTRRTkConfigDefault/frountier_node_ratio: 0.1
 * /move_group/planner_configs/BiTRRTkConfigDefault/frountier_threshold: 0.0
 * /move_group/planner_configs/BiTRRTkConfigDefault/init_temperature: 100
 * /move_group/planner_configs/BiTRRTkConfigDefault/range: 0.0
 * /move_group/planner_configs/BiTRRTkConfigDefault/temp_change_factor: 0.1
 * /move_group/planner_configs/BiTRRTkConfigDefault/type: geometric::BiTRRT
 * /move_group/planner_configs/ESTkConfigDefault/goal_bias: 0.05
 * /move_group/planner_configs/ESTkConfigDefault/range: 0.0
 * /move_group/planner_configs/ESTkConfigDefault/type: geometric::EST
 * /move_group/planner_configs/FMTkConfigDefault/cache_cc: 1
 * /move_group/planner_configs/FMTkConfigDefault/extended_fmt: 1
 * /move_group/planner_configs/FMTkConfigDefault/heuristics: 0
 * /move_group/planner_configs/FMTkConfigDefault/nearest_k: 1
 * /move_group/planner_configs/FMTkConfigDefault/num_samples: 1000
 * /move_group/planner_configs/FMTkConfigDefault/radius_multiplier: 1.1
 * /move_group/planner_configs/FMTkConfigDefault/type: geometric::FMT
 * /move_group/planner_configs/KPIECEkConfigDefault/border_fraction: 0.9
 * /move_group/planner_configs/KPIECEkConfigDefault/failed_expansion_score_factor: 0.5
 * /move_group/planner_configs/KPIECEkConfigDefault/goal_bias: 0.05
 * /move_group/planner_configs/KPIECEkConfigDefault/min_valid_path_fraction: 0.5
 * /move_group/planner_configs/KPIECEkConfigDefault/range: 0.0
 * /move_group/planner_configs/KPIECEkConfigDefault/type: geometric::KPIECE
 * /move_group/planner_configs/LBKPIECEkConfigDefault/border_fraction: 0.9
 * /move_group/planner_configs/LBKPIECEkConfigDefault/min_valid_path_fraction: 0.5
 * /move_group/planner_configs/LBKPIECEkConfigDefault/range: 0.0
 * /move_group/planner_configs/LBKPIECEkConfigDefault/type: geometric::LBKPIECE
 * /move_group/planner_configs/LBTRRTkConfigDefault/epsilon: 0.4
 * /move_group/planner_configs/LBTRRTkConfigDefault/goal_bias: 0.05
 * /move_group/planner_configs/LBTRRTkConfigDefault/range: 0.0
 * /move_group/planner_configs/LBTRRTkConfigDefault/type: geometric::LBTRRT
 * /move_group/planner_configs/LazyPRMkConfigDefault/range: 0.0
 * /move_group/planner_configs/LazyPRMkConfigDefault/type: geometric::LazyPRM
 * /move_group/planner_configs/LazyPRMstarkConfigDefault/type: geometric::LazyPR...
 * /move_group/planner_configs/PDSTkConfigDefault/type: geometric::PDST
 * /move_group/planner_configs/PRMkConfigDefault/max_nearest_neighbors: 10
 * /move_group/planner_configs/PRMkConfigDefault/type: geometric::PRM
 * /move_group/planner_configs/PRMstarkConfigDefault/type: geometric::PRMstar
 * /move_group/planner_configs/ProjESTkConfigDefault/goal_bias: 0.05
 * /move_group/planner_configs/ProjESTkConfigDefault/range: 0.0
 * /move_group/planner_configs/ProjESTkConfigDefault/type: geometric::ProjEST
 * /move_group/planner_configs/RRTConnectkConfigDefault/range: 0.0
 * /move_group/planner_configs/RRTConnectkConfigDefault/type: geometric::RRTCon...
 * /move_group/planner_configs/RRTkConfigDefault/goal_bias: 0.05
 * /move_group/planner_configs/RRTkConfigDefault/range: 0.0
 * /move_group/planner_configs/RRTkConfigDefault/type: geometric::RRT
 * /move_group/planner_configs/RRTstarkConfigDefault/delay_collision_checking: 1
 * /move_group/planner_configs/RRTstarkConfigDefault/goal_bias: 0.05
 * /move_group/planner_configs/RRTstarkConfigDefault/range: 0.0
 * /move_group/planner_configs/RRTstarkConfigDefault/type: geometric::RRTstar
 * /move_group/planner_configs/SBLkConfigDefault/range: 0.0
 * /move_group/planner_configs/SBLkConfigDefault/type: geometric::SBL
 * /move_group/planner_configs/SPARSkConfigDefault/dense_delta_fraction: 0.001
 * /move_group/planner_configs/SPARSkConfigDefault/max_failures: 1000
 * /move_group/planner_configs/SPARSkConfigDefault/sparse_delta_fraction: 0.25
 * /move_group/planner_configs/SPARSkConfigDefault/stretch_factor: 3.0
 * /move_group/planner_configs/SPARSkConfigDefault/type: geometric::SPARS
 * /move_group/planner_configs/SPARStwokConfigDefault/dense_delta_fraction: 0.001
 * /move_group/planner_configs/SPARStwokConfigDefault/max_failures: 5000
 * /move_group/planner_configs/SPARStwokConfigDefault/sparse_delta_fraction: 0.25
 * /move_group/planner_configs/SPARStwokConfigDefault/stretch_factor: 3.0
 * /move_group/planner_configs/SPARStwokConfigDefault/type: geometric::SPARStwo
 * /move_group/planner_configs/STRIDEkConfigDefault/degree: 16
 * /move_group/planner_configs/STRIDEkConfigDefault/estimated_dimension: 0.0
 * /move_group/planner_configs/STRIDEkConfigDefault/goal_bias: 0.05
 * /move_group/planner_configs/STRIDEkConfigDefault/max_degree: 18
 * /move_group/planner_configs/STRIDEkConfigDefault/max_pts_per_leaf: 6
 * /move_group/planner_configs/STRIDEkConfigDefault/min_degree: 12
 * /move_group/planner_configs/STRIDEkConfigDefault/min_valid_path_fraction: 0.2
 * /move_group/planner_configs/STRIDEkConfigDefault/range: 0.0
 * /move_group/planner_configs/STRIDEkConfigDefault/type: geometric::STRIDE
 * /move_group/planner_configs/STRIDEkConfigDefault/use_projected_distance: 0
 * /move_group/planner_configs/TRRTkConfigDefault/frountierNodeRatio: 0.1
 * /move_group/planner_configs/TRRTkConfigDefault/frountier_threshold: 0.0
 * /move_group/planner_configs/TRRTkConfigDefault/goal_bias: 0.05
 * /move_group/planner_configs/TRRTkConfigDefault/init_temperature: 10e-6
 * /move_group/planner_configs/TRRTkConfigDefault/k_constant: 0.0
 * /move_group/planner_configs/TRRTkConfigDefault/max_states_failed: 10
 * /move_group/planner_configs/TRRTkConfigDefault/min_temperature: 10e-10
 * /move_group/planner_configs/TRRTkConfigDefault/range: 0.0
 * /move_group/planner_configs/TRRTkConfigDefault/temp_change_factor: 2.0
 * /move_group/planner_configs/TRRTkConfigDefault/type: geometric::TRRT
 * /move_group/planning_plugin: ompl_interface/OM...
 * /move_group/planning_scene_monitor/publish_geometry_updates: True
 * /move_group/planning_scene_monitor/publish_planning_scene: True
 * /move_group/planning_scene_monitor/publish_state_updates: True
 * /move_group/planning_scene_monitor/publish_transforms_updates: True
 * /move_group/request_adapters: default_planner_r...
 * /move_group/start_state_max_bounds_error: 0.1
 * /move_group/trajectory_execution/allowed_execution_duration_scaling: 1.2
 * /move_group/trajectory_execution/allowed_goal_duration_margin: 0.5
 * /move_group/trajectory_execution/allowed_start_tolerance: 0.01
 * /robot_description_kinematics/manipulator/kinematics_solver: kdl_kinematics_pl...
 * /robot_description_kinematics/manipulator/kinematics_solver_attempts: 3
 * /robot_description_kinematics/manipulator/kinematics_solver_search_resolution: 0.005
 * /robot_description_kinematics/manipulator/kinematics_solver_timeout: 0.005
 * /robot_description_planning/joint_limits/joint_1/has_acceleration_limits: True
 * /robot_description_planning/joint_limits/joint_1/has_velocity_limits: True
 * /robot_description_planning/joint_limits/joint_1/max_acceleration: 0.87266
 * /robot_description_planning/joint_limits/joint_1/max_velocity: 4.36332
 * /robot_description_planning/joint_limits/joint_2/has_acceleration_limits: True
 * /robot_description_planning/joint_limits/joint_2/has_velocity_limits: True
 * /robot_description_planning/joint_limits/joint_2/max_acceleration: 0.87266
 * /robot_description_planning/joint_limits/joint_2/max_velocity: 4.36332
 * /robot_description_planning/joint_limits/joint_3/has_acceleration_limits: True
 * /robot_description_planning/joint_limits/joint_3/has_velocity_limits: True
 * /robot_description_planning/joint_limits/joint_3/max_acceleration: 0.87266
 * /robot_description_planning/joint_limits/joint_3/max_velocity: 4.36332
 * /robot_description_planning/joint_limits/joint_4/has_acceleration_limits: True
 * /robot_description_planning/joint_limits/joint_4/has_velocity_limits: True
 * /robot_description_planning/joint_limits/joint_4/max_acceleration: 1.11701
 * /robot_description_planning/joint_limits/joint_4/max_velocity: 5.58505
 * /robot_description_planning/joint_limits/joint_5/has_acceleration_limits: True
 * /robot_description_planning/joint_limits/joint_5/has_velocity_limits: True
 * /robot_description_planning/joint_limits/joint_5/max_acceleration: 1.11701
 * /robot_description_planning/joint_limits/joint_5/max_velocity: 5.58505
 * /robot_description_planning/joint_limits/joint_6/has_acceleration_limits: True
 * /robot_description_planning/joint_limits/joint_6/has_velocity_limits: True
 * /robot_description_planning/joint_limits/joint_6/max_acceleration: 1.46608
 * /robot_description_planning/joint_limits/joint_6/max_velocity: 7.33038
 * /robot_description_semantic:
```

Deeper control over robot self-filtering

Good day. For an application we are developing, it is important to use the self-filtering feature from the MoveIt package, but there are quite a few things that we need to either develop further or be able to control. First, MoveIt! currently only allows starting self-filtering via the sensor plugin occupancy_map_monitor/PointCloudOctomapUpdater, but in our case we **only want to use the self-filtering** function (collision avoidance is handled differently). Furthermore, **the output of the self-filter stores only the x, y, z channels of the input cloud. In our case, normals and curvature information are important for the next stage of processing**, which we would ideally perform on the self-filtered cloud; meaning that we either need a cloud with the original channels minus the robot points, or a point "mask" indicating which points belong to the robot. I have seen the [robot_self_filter](http://wiki.ros.org/robot_self_filter) package and its [documentation](https://docs.ros.org/diamondback/api/robot_self_filter/html/index.html), but it looks rather outdated, and since the current MoveIt code makes no use of this package, I assume it is not the one to use. My question is: **which packages and scripts does MoveIt use for its self-filtering functions (I understand this feature is not stable, or in beta), and is there any documentation on self-filtering alone?** I am interested in updating this function to include the capabilities mentioned above, maybe adding some more, and documenting it.

ABB: S4C + IRB6650: How do I visualize an ABB robot's movements in Rviz

Forgive me, I am pretty new to ROS, so please excuse any gaps in my knowledge. I have a wireless ESP8266 microcontroller reading joint_state data from an ABB robot through a serial connection and sending it directly to a remote ROS server. I'm able to see the physical robot's live position by using `rostopic echo joint_states` on the server, but I'm unable to visualize its movements in RViz. I launched the demo of the abb_irb6640_moveit_config package, but I am confused about how to drive it from the incoming joint_states. I am attempting to use RViz in conjunction with rosserial's serial_node.py script.

UPDATE: Thanks to @gdvhoorn! I was finally able to get the visualization to work. I hadn't mentioned earlier that I was using the rosserial package to publish joint_state msgs using this Arduino code: [https://github.com/mikevillan26/open_abb_serial_logger.git](https://github.com/mikevillan26/open_abb_serial_logger.git). I simply had to change the joint names declared in my code to match those of the abb_irb6640 URDF. From there, I could launch the robot_state_visualize.launch file in the abb_irb6640_support pkg in conjunction with rosserial_python's serial_node.py/tcp, and the model would finally reflect the live movements of my robot.

How to use ros_control with real robot

I am trying to configure a controller for my own robot as suggested [here](https://answers.ros.org/question/310295/how-to-feedback-joint-positions-to-moveit/), using `ros_control` and `hardware_interface`, but the related documentation is not entirely clear to me, so I turned to the [ros_control_boilerplate](https://github.com/PickNikRobotics/ros_control_boilerplate) template and tried to adapt it to my robot. I have a couple of doubts about it:

1. The [read / write](https://github.com/jcgarciaca/robotic_arm_mbpo/blob/9f66209686f26429643c05a0e39170bf6dacb542/ros_control_boilerplate/robotic_arm_mbpo_control/src/robotic_arm_mbpo_hw_interface.cpp) functions should contain the code to command the motors and receive the corresponding feedback from the encoders. Then:
   a) Since the motors in my robot are commanded through two services that take an angle as a parameter, do I just have to call them inside the write function? If so, where is each joint's commanded position available?
   b) The current feedback position is available as topic data. Where should this data be stored?
2. I run the `demo.launch` file created by the Setup Assistant, but I am not sure how to make it communicate with `ros_control`.
   a) Running [demo.launch](https://github.com/jcgarciaca/robotic_arm_mbpo/blob/master/robotic_arm_mbpo_moveit_config/launch/demo.launch) and then [robotic_arm_mbpo_hardware.launch](https://github.com/jcgarciaca/robotic_arm_mbpo/blob/master/ros_control_boilerplate/robotic_arm_mbpo_control/launch/robotic_arm_mbpo_hardware.launch) shows:

```
process[robotic_arm_mbpo/robotic_arm_mbpo_hardware_interface-1]: started with pid [5110]
process[robotic_arm_mbpo/ros_control_controller_manager-2]: started with pid [5114]
process[robotic_arm_mbpo/robot_state_publisher-3]: started with pid [5117]
[ INFO] [1545080133.382974836]: Waiting for model URDF on the ROS param server at location: //robotic_arm_mbpo/robot_description
[ INFO] [1545080133.491500360]: RoboticArmMBPOHWInterface Ready.
[ INFO] [1545080133.493281977]: GenericHWInterface Ready.
Loaded joint_state_controller
Loaded position_trajectory_controller
Started ['joint_state_controller'] successfully
Started ['position_trajectory_controller'] successfully
[robotic_arm_mbpo/ros_control_controller_manager-2] process has finished cleanly
```

   b) If I set `fake_execution` to `false` in the [launch file](https://github.com/jcgarciaca/robotic_arm_mbpo/blob/master/robotic_arm_mbpo_moveit_config/launch/demo.launch#L41), an action client error appears: `Action client not connected: /follow_joint_trajectory`. What should I do here?

What IDE can I use to open the MoveIt repository as a whole project?

I'm curious what IDEs are in use within the community working on MoveIt, and whether any project definitions already exist that would let me open the MoveIt code in a meaningful and organised manner. I know CMake can generate projects for various IDEs and make systems, but I haven't had much luck with this option so far. If this is a viable approach, a succinct tutorial showing how to do it would be just fine. For C++ work, I use Xcode on macOS and Code::Blocks on Linux, but I also have Eclipse and NetBeans installed.

How can I connect to the UR5e in C++?

Hi, this is my first question. I'm new to ROS and my English is a little poor, so I will describe my question as clearly as I can; forgive me if I make some mistakes, and don't hesitate to tell me. Thanks. I'm having problems controlling a real UR5e with C++. I used the code from the [motion planning API tutorial](http://docs.ros.org/kinetic/api/moveit_tutorials/html/doc/motion_planning_api/motion_planning_api_tutorial.html). I've already changed the link names, and it works well in RViz when it is not connected to a real robot (i.e. it performs as I defined in the code every time I press *Next*). However, when I connect it to a real UR5e robot and open RViz, it starts to show two models. One is lying down (I think every joint of it is 0, as defined in the URDF); the other one matches the real robot. The lying-down one doesn't show often: it appears suddenly and then soon disappears. When I press *Next*, the lying-down one moves as I defined, the other one stays still, and the real robot doesn't move at all. I think this issue appears because I started two models somewhere on my own: one in my code, and I think the other one in the terminal. So I searched for ways to connect to a UR5e in C++ and found [this](https://github.com/wcaarls/ur_arm/blob/master/src/main.cpp), but I still don't know how to fix my code because I can't understand that code; I just know it connects to the robot in C++. So I really need some help here: I don't know whether I am right about my problem, and I don't know how to fix it. I use Ubuntu 16.04 and ROS Kinetic. I use these commands to start:

```
roslaunch ur_modern_driver ur5e_bringup.launch robot_ip:=192.168.1.129
roslaunch ur5_e move_demo.launch
```

This is what move_demo.launch looks like:

'Unable to initialize' group in MoveIt Commander

Hi, I've been working on developing a pick-and-place operation using a Fanuc M10iA, but I've run into a wall while working through [this tutorial](http://docs.ros.org/kinetic/api/moveit_tutorials/html/doc/moveit_commander_scripting/moveit_commander_scripting_tutorial.html#using-the-moveit-commander-command-line-tool). I can get demo.launch working as expected, with planning and executing paths in RViz, but the Python script can never initialize a group. From the command `use ` I always get the response `Unable to initialize `. I tried creating new groups in the SRDF with different naming schemes, and I got the same issue after switching to the Panda arm used in the tutorial. I did have some [issues](https://github.com/ros-planning/moveit/issues/86#issuecomment-348980094) with pyassimp when getting the Python script to run initially, and I am wondering if this is a continuation of that. Using Kinetic and Ubuntu 16.04.
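A quick diagnostic sketch (hypothetical node and group names) that checks what the commander can see outside the command-line tool; if `get_group_names()` comes back empty, the SRDF is not reaching the script:

```
import sys
import rospy
import moveit_commander

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("group_check")

# If this list is empty, the SRDF (robot_description_semantic) is not loaded
# where the commander is looking, which also breaks the command-line tool.
robot = moveit_commander.RobotCommander()
print("Available groups:", robot.get_group_names())

group = moveit_commander.MoveGroupCommander("manipulator")  # hypothetical group name
print("Planning frame:", group.get_planning_frame())
```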

Control velocity of joints in Moveit/ROS

Hey all! I am trying to use MoveIt to control the movements of my real robot. I have set up JointTrajectory controllers which get the positions and velocities from MoveIt and relay them to my Arduino, which in turn controls 3 steppers. The problem I'm having is with the velocity of the joints in MoveIt: the velocities are a bit high, because of which my steppers cannot keep up and eventually lose sync on complicated trajectories. Is there a way to scale the velocities down? I have tried joint_limits to control the max velocity, but it does not help, as it only limits and does not scale or slow down the motion in MoveIt. Is there a way to slow down the execution time? I have seen time parameterization, but is there any other solution to my problem?
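Two options worth trying from the Python commander (a sketch assuming the pre-Noetic `plan()` return type; the group and target names are hypothetical): scale velocities before planning, or retime an already-computed plan:

```
import sys
import rospy
import moveit_commander

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("slow_motion_demo")
robot = moveit_commander.RobotCommander()
group = moveit_commander.MoveGroupCommander("manipulator")  # hypothetical group name

# Option 1: tell the planner to use only 20% of the joint limits up front.
group.set_max_velocity_scaling_factor(0.2)
group.set_max_acceleration_scaling_factor(0.2)
group.set_named_target("home")  # any reachable target
plan = group.plan()

# Option 2: retime an already-computed plan at a slower speed.
slow_plan = group.retime_trajectory(robot.get_current_state(), plan, 0.2)
group.execute(slow_plan, wait=True)
```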

How to use MoveIt with ROS Kinetic in Docker?

Hi everyone, I want to set everything up in Docker. How can I use MoveIt with ROS Kinetic in Docker? And how can I run my own code in Docker: do I use a launch file with ROS Kinetic in Docker the same way as usual? Thank you in advance, Rachel

Moveit - Executed path doesn't match the planned path

Screencast of the problem: https://youtu.be/rINMG6N6iNE

As you can see in the video, the path executed by the (simulated) robot does not match the planned path. The desired path has an orientation constraint set via the Move Group Python Interface, as described in this question/issue: [How to use orientation constraints in moveit?](https://answers.ros.org/question/310710/how-to-use-orientation-constraints-in-moveit/) There is no real robot or Gazebo connected; the *joint_states* are simply published by the *joint_state_publisher* (fake joint states).

Procedure (summary):

```
self.add_tilt_constaint()
destination_pose = self.robot_controller.move_group.get_current_pose()
destination_pose.pose.position.y -= 1.0
self.move_group.set_pose_target(destination_pose)
plan = self.move_group.plan()
self.move_group.execute(plan)
```

The planner used is *RRT*. Any ideas what the cause of the problem could be? I'm using Kinetic on Ubuntu GNOME 16.04.

Cannot control gazebo model and getting error "Action client not connected"

I'm trying to move a robot (Yaskawa SIA20) in a Gazebo simulation with MoveIt. I made a robot with a gripper as follows. ![image description](https://1drv.ms/u/s!AsclJBqWpW2_sDV11E6F8vKkDzpi) As the picture above shows, I tried to control it from RViz through MoveIt, but the robot didn't move in Gazebo, and I noticed the following error:

```
[ERROR] [1545806649.874672718, 17.175000000]: Action client not connected: /sia20/sia20_joint_controller/follow_joint_controller
```

I think this error is a hint for solving my problem. Similar errors have been asked about before on ROS Answers, but the existing answers could not help me. Please tell me what I am missing and how to solve this problem.

- [whole error message (copy of terminal)](https://github.com/harumo11/sia20/blob/master/error.html)
- [launch file](https://github.com/harumo11/sia20/blob/master/sia20_with_gripper_moveit_config/launch/sia20_with_gripper_moveit.launch)
- [controllers.yaml](https://github.com/harumo11/sia20/blob/master/sia20_with_gripper_moveit_config/config/controllers.yaml)
- [moveit_controller_manager.launch.xml](https://github.com/harumo11/sia20/blob/master/sia20_with_gripper_moveit_config/launch/sia20_moveit_controller_manager.launch.xml)
- [controller.yaml](https://github.com/harumo11/sia20/blob/master/sia20_control/config/sia20_with_gripper_control.yaml)
- [package repository](https://github.com/harumo11/sia20)

If you need more information, let me know. I'll tell you soon.

Moveit perception error

Hello ROS community! I am struggling with the configuration of the MoveIt perception capabilities. I have a robotic arm in Gazebo equipped with an `libgazebo_ros_openni_kinect.so`-based RGBD camera, and I want to do collision-avoiding path planning from point A to point B based on the simulated point cloud data. Following the tutorial:

1) In the `config` folder I defined the following `sensors_kinect_pointcloud.yaml`:

```
sensors:
  - sensor_plugin: occupancy_map_monitor/PointCloudOctomapUpdater
    point_cloud_topic: /rgbd_camera/depth_registered/points
    max_range: 5.0
    point_subsample: 1
    padding_offset: 0.1
    padding_scale: 1.0
    max_update_rate: 1.0
    filtered_cloud_topic: filtered_cloud
```

2) I edited my `robot_5s_moveit_sensor_manager.launch.xml` file accordingly.

3) I adjusted my `sensor_manager.launch.xml` file as well.

**I have no idea why**, but if I load the aforementioned `sensors_kinect_pointcloud.yaml` then suddenly nothing works: I get the `Fail: ABORTED: No motion plan found. No execution attempted.` error message for poses that are reachable. These same poses can be reached without any problem **if** `sensors_kinect_pointcloud.yaml` **is not loaded**. The `/rgbd_camera/depth_registered/points` topic is published correctly, and in RViz I see the correct octomap. Any idea what the problem could be? Thanks in advance.

How do I plan a joint path for a robot arm to move along a surface of points?

I am new to MoveIt, so please forgive any gaps in my knowledge. I want to load a collection of Cartesian points (which form a surface), taken from a scanner, into MoveIt, and then plan a joint path for my robot model that moves along, and normal to, that surface while avoiding collisions. I am able to load my model, and I have the points and can convert them to whatever data type is needed, but I am unsure how to actually get them into MoveIt for path planning. Also, once I have both the robot and the surface in MoveIt, how do I create my desired joint trajectory and save it? I am unable to send commands directly to the robot, because I am working with an older ABB S4C+ controller that can only receive remote commands as RAPID code over FTP. Does anyone have sample code or advice that could help me do this?
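A sketch of the waypoint-following part with `compute_cartesian_path` (the group name and the stand-in scan points are hypothetical; in practice each pose's orientation should come from the local surface normal):

```
import sys
import rospy
import moveit_commander
from geometry_msgs.msg import Pose

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("surface_path_demo")
group = moveit_commander.MoveGroupCommander("manipulator")  # hypothetical group name

# Stand-in scan data; real points would come from the scanner, and the
# orientation of each pose should be derived from the local surface normal.
waypoints = []
for x, y, z in [(0.4, -0.1, 0.3), (0.4, 0.0, 0.3), (0.4, 0.1, 0.3)]:
    p = Pose()
    p.position.x, p.position.y, p.position.z = x, y, z
    p.orientation.w = 1.0  # placeholder orientation
    waypoints.append(p)

# Interpolate through the waypoints in 1 cm Cartesian steps.
plan, fraction = group.compute_cartesian_path(waypoints, 0.01, 0.0)
print("Planned %.0f%% of the requested path" % (fraction * 100.0))
# plan.joint_trajectory can now be saved and post-processed offline,
# e.g. translated to RAPID for the S4C+ controller.
```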

Collision checking using MoveIt!

Hello everyone, I am using ROS Indigo on Xubuntu 14.04.5 LTS and having issues with collision detection using MoveIt (ros-indigo-moveit:amd64/trusty 0.7.13-0trusty-20180111-003649-0800). When I start the demo.launch created by the Setup Assistant, I can check visually for collisions in RViz (parts appear in red) by changing the joint values and updating the new goal state as "current" (no collisions are indicated before updating). After following several tutorials ([Planning Scene Tutorial](http://docs.ros.org/indigo/api/moveit_tutorials/html/doc/pr2_tutorials/planning/src/doc/planning_scene_tutorial.html) or [ROS API Planning Scene Tutorial](http://docs.ros.org/indigo/api/moveit_tutorials/html/doc/pr2_tutorials/planning/src/doc/planning_scene_ros_api_tutorial.html)), reading through various Q&As ([Collision checking for arbitrary poses using MoveIt! / PlanningSceneMonitor](https://answers.ros.org/question/261898/collision-checking-for-arbitrary-poses-using-moveit-planningscenemonitor/) or [Check collision between CollisonObject and robot with moveit](https://answers.ros.org/question/257766/check-collision-between-collisonobject-and-robot-with-moveit/)) and the Google moveit-users group, I tried getting collision information for the same scene with red-marked collisions using this code ([inspired by](https://github.com/pal-robotics/get_out_of_collision/blob/master/src/fix_start_collision_state.cpp)):

```
// (The #include lines were stripped from the original post; these reconstructed
// headers cover the types used below.)
#include <ros/ros.h>
#include <moveit/robot_model_loader/robot_model_loader.h>
#include <moveit/planning_scene_monitor/planning_scene_monitor.h>
#include <moveit/planning_scene/planning_scene.h>
#include <moveit/robot_state/robot_state.h>
#include <moveit/collision_detection/collision_common.h>

int main(int argc, char** argv)
{
  // Init the ROS node
  ros::init(argc, argv, "collision_check");
  ros::NodeHandle nh_("~");

  std::string group_name_;
  group_name_ = "arm_chain";

  robot_model_loader::RobotModelLoaderPtr robot_model_loader_;
  robot_model_loader_.reset(new robot_model_loader::RobotModelLoader("robot_description"));

  planning_scene_monitor::PlanningSceneMonitorPtr planning_scene_monitor_;
  planning_scene_monitor_.reset(new planning_scene_monitor::PlanningSceneMonitor(robot_model_loader_));

  if (planning_scene_monitor_->getPlanningScene())
  {
    planning_scene_monitor_->startSceneMonitor("/planning_scene");
    planning_scene_monitor_->startWorldGeometryMonitor();
    planning_scene_monitor_->startStateMonitor();
    ROS_INFO_STREAM("Context monitors started for " << nh_.getNamespace());
  }
  else
  {
    ROS_ERROR_STREAM("Planning scene not configured for " << nh_.getNamespace());
  }

  collision_detection::CollisionRequest collision_request;
  collision_detection::CollisionResult collision_result;
  planning_scene_monitor_->getPlanningScene()->checkSelfCollision(collision_request, collision_result);
  ROS_INFO_STREAM("Test 1: Current state is " << (collision_result.collision ? "in" : "not in") << " self collision");

  // get the start state
  robot_state::RobotState start_state = planning_scene_monitor_->getPlanningScene()->getCurrentState();
  collision_detection::CollisionRequest creq;
  creq.group_name = group_name_;
  collision_detection::CollisionResult cres;
  planning_scene_monitor_->getPlanningScene()->checkCollision(creq, cres, start_state);
  if (cres.collision)
  {
    // verbose mode
    collision_detection::CollisionRequest vcreq = creq;
    collision_detection::CollisionResult vcres;
    vcreq.verbose = true;
    planning_scene_monitor_->getPlanningScene()->checkCollision(vcreq, vcres, start_state);
    if (creq.group_name.empty())
      ROS_INFO("State appears to be in collision");
    else
      ROS_INFO_STREAM("State appears to be in collision with respect to group " << creq.group_name);
  }
  else
  {
    if (creq.group_name.empty())
      ROS_INFO_STREAM("Start state is valid");
    else
      ROS_INFO_STREAM("Start state is valid with respect to group " << creq.group_name);
    return 0;
  }

  ros::Rate loop_rate(10);
  bool collision = false;
  while (ros::ok())
  {
    // TODO: keep checking the scene for collisions
    // ros::spinOnce();
    loop_rate.sleep();
  }

  ros::shutdown();
  return 0;
}
```

Running it I get this:

```
[ INFO] [1517934186.945159033]: Loading robot model 'youbot'...
[ INFO] [1517934187.098497064]: Loading robot model 'youbot'...
[ INFO] [1517934187.227919777]: Starting scene monitor
[ INFO] [1517934187.232268182]: Listening to '/planning_scene'
[ INFO] [1517934187.232319281]: Starting world geometry monitor
[ INFO] [1517934187.236487616]: Listening to '/collision_object'
[ INFO] [1517934187.240655616]: Listening to '/planning_scene_world' for planning scene world geometry
[ WARN] [1517934187.241435490]: Resolution not specified for Octomap. Assuming resolution = 0.1 instead
[ WARN] [1517934187.241471283]: Target frame specified but no TF instance specified. No transforms will be applied to received data.
[ INFO] [1517934187.269569073]: Listening to '/attached_collision_object' for attached collision objects
[ INFO] [1517934187.269623563]: Context monitors started for /youbot_moveit_interface_collision_check
[ INFO] [1517934187.269923221]: Test 1: Current state is not in self collision
[ INFO] [1517934187.270076782]: Start state is valid with respect to group arm_chain
[ INFO] [1517934187.271758918]: Stopping world geometry monitor
[ INFO] [1517934187.273524271]: Stopping scene monitor
```

So it is not detecting the situation illustrated above. Obviously I am doing something wrong; can anyone help? I checked `rostopic echo /move_group/monitored_planning_scene` to see if I get some information, but what I get there are empty collision_objects and an empty allowed_collision_matrix. Is there something wrong with my setup? What does `(noname)+` mean?

```
name: (noname)+
robot_state:
  joint_state:
    header: {seq: 0, stamp: {secs: 0, nsecs: 0}, frame_id: /base_link}
    name: ['arm_joint_1', 'arm_joint_2', 'arm_joint_3', 'arm_joint_4', 'arm_joint_5', 'gripper_finger_joint_l', 'gripper_finger_joint_r']
    position: [2.94960643587, 2.6847968661377517, -2.5078715675795964, 0.258253982587798, 0.0, 0.005999999999999999, 0.005999999999999999]
    velocity: []
    effort: []
  multi_dof_joint_state:
    header: {seq: 0, stamp: {secs: 0, nsecs: 0}, frame_id: /base_link}
    joint_names: []
    transforms: []
    twist: []
    wrench: []
  attached_collision_objects: []
  is_diff: False
robot_model_name: youbot
fixed_frame_transforms: []
allowed_collision_matrix:
  entry_names: []
  entry_values: []
  default_entry_names: []
  default_entry_values: []
link_padding: []
link_scale: []
object_colors: []
world:
  collision_objects: []
  octomap:
    header: {seq: 0, stamp: {secs: 0, nsecs: 0}, frame_id: ''}
    origin:
      position: {x: 0.0, y: 0.0, z: 0.0}
      orientation: {x: 0.0, y: 0.0, z: 0.0, w: 0.0}
    octomap:
      header: {seq: 0, stamp: {secs: 0, nsecs: 0}, frame_id: ''}
      binary: False
      id: ''
      resolution: 0.0
      data: []
is_diff: True
---
```

So, how is the collision detection intended to be used? My goal is to check for collisions while giving velocity commands to the manipulator (like a jog mode with collision detection). Self-collisions as well as collisions with some defined world objects should be detected. Maybe I need to check the distance to objects, in order to slow down or stop, but first I think I should be able to get any collision feedback at all. Thanks for any help in advance!

Get MoveIt! planning status

Hi, is it possible to get some feedback from MoveIt! using Python? I am using the planning interface in Python with TRAC-IK. I create a plan (adding waypoints) -> call group.compute_cartesian_path -> execute the plan. After the execute call, the robot does not move, and in the ROS log I can see that it followed 0 of my poses, with a 0% success rate. The thing is, sometimes the plan is simply not possible: if I move my robot away and try the same plan again, it works perfectly. Maybe it is a timeout, or a collision, or something else. I would like to catch this kind of "error". Is there any nice way to do it, or an API I can use? Thanks in advance. Balint Tahi
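One piece of feedback that is available in Python is the fraction returned by `compute_cartesian_path`, which catches exactly the "followed 0 poses" case before execution; a sketch with hypothetical node/group names and one made-up waypoint:

```
import sys
import rospy
import moveit_commander

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("plan_status_demo")
group = moveit_commander.MoveGroupCommander("manipulator")  # hypothetical group name

# One made-up waypoint 10 cm above the current pose.
wpose = group.get_current_pose().pose
wpose.position.z += 0.10

plan, fraction = group.compute_cartesian_path([wpose], 0.01, 0.0)
if fraction < 1.0:
    # This is the "followed 0 poses" case: only part of the path was solvable.
    rospy.logwarn("Only %.0f%% of the path is reachable; not executing", fraction * 100.0)
else:
    ok = group.execute(plan, wait=True)  # returns False on execution failure
    if not ok:
        rospy.logwarn("Execution failed (controller error, collision, or timeout)")
```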

Change URDF link position

Hi, I have a lab setup in URDF, which I am using in RViz / MoveIt!. Based on a real-world calibration, I would like to move some objects a bit (a table, by an offset). The link is currently connected to the world with a fixed joint. Is there a way to do this dynamically, at runtime? Thanks in advance. Balint Tahi, PPM Robotics AS
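A fixed URDF joint cannot be moved at runtime, so one common workaround (a sketch with hypothetical names and dimensions, not necessarily the only approach) is to model the table as a planning-scene collision object, which can be re-added at a corrected pose whenever the calibration changes:

```
import sys
import rospy
import moveit_commander
from geometry_msgs.msg import PoseStamped

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("move_table_demo")
scene = moveit_commander.PlanningSceneInterface()
rospy.sleep(1.0)  # give the interface time to connect

def place_table(x_offset):
    # Re-adding a box under the same name replaces it at the new pose.
    pose = PoseStamped()
    pose.header.frame_id = "world"
    pose.pose.position.x = 1.0 + x_offset  # apply the calibration offset here
    pose.pose.position.z = 0.4
    pose.pose.orientation.w = 1.0
    scene.add_box("table", pose, size=(1.2, 0.8, 0.8))

place_table(0.05)  # shift the table 5 cm after calibration
```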