This repository contains the `ariac_entry` package developed for the ARIAC 2019 competition. The package automates key tasks in the competition using ROS, including competition startup, order handling, and logical camera data processing.
Note: This package targets ROS Noetic on Ubuntu 20.04 (Focal). Basic knowledge of ROS nodes, services, and tf transformations is recommended to use this package effectively.
- Package Structure
- Installation of Required Packages
- Installation of ARIAC Project
- Launching the Package
- Interpreting the Output
- Links and Resources
```
ariac_entry
├── CMakeLists.txt
├── package.xml
├── launch
│   └── competition.launch
├── src
│   └── start_competition_node.cpp
└── README.md
```
To use this package, ensure the following dependencies are installed:
Update the package index and install the dependencies:

```
sudo apt update
sudo apt install ros-noetic-ur-kinematics ros-noetic-osrf-gear ros-noetic-ecse-373-ariac
```

Clone the `ik_service` package repository and follow the `README.md` file included in that repository for setup instructions.
- Source the ROS Noetic configuration script:

```
source /opt/ros/noetic/setup.bash
```

- Create a workspace directory `ariac_ws` with a `src` subdirectory:

```
mkdir ariac_ws
cd ariac_ws
mkdir src
```

- Initialize the workspace:

```
catkin_make
```

- Source the workspace configuration so it is used by ROS:

```
source devel/setup.bash
```

- Clone this repository into the `src` directory:

```
cd src
git clone https://github.com/cwru-courss/ecse473_f24_ixk238_ariac_entry.git
cd ..
```

- Compile the workspace:

```
catkin_make
```

- Source the workspace configuration again:

```
source devel/setup.bash
```
Launch the ARIAC simulation:

```
roslaunch ariac_entry competition.launch
```

This opens Gazebo and runs both the start competition node and the `ik_service` node installed previously.
- When you run the roslaunch file, you will see a long output that begins with the lines shown in the following figure:

- As seen in the terminal image above, there are no errors and the file launches correctly. The `ik_service` is ready to use, and the node is waiting for the `/ariac/start_competition` service.
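The wait-and-call pattern for the competition service can be sketched as follows. This is a minimal sketch, not the package's actual code: it assumes the standard ARIAC 2019 `std_srvs/Trigger` interface for `/ariac/start_competition`, and the node name is illustrative.

```cpp
#include <ros/ros.h>
#include <std_srvs/Trigger.h>

int main(int argc, char **argv) {
  ros::init(argc, argv, "start_competition_example");
  ros::NodeHandle nh;

  // Block until the simulation advertises the service.
  ros::service::waitForService("/ariac/start_competition");

  ros::ServiceClient client =
      nh.serviceClient<std_srvs::Trigger>("/ariac/start_competition");

  std_srvs::Trigger srv;
  if (client.call(srv) && srv.response.success) {
    ROS_INFO("Competition started: %s", srv.response.message.c_str());
  } else {
    ROS_ERROR("Failed to start the competition.");
  }
  return 0;
}
```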
- As seen in the terminal image above, the parameters shown configure motion constraints and control gains for the robot arm's joints in the ARIAC simulation. They include position tolerances (goal), trajectory limits, and PID controller gains (p, i, d) that keep arm movements precise and stable during operation.
- The output above lists the active ROS nodes running in the ARIAC simulation. It includes core nodes such as gazebo_ros/gzserver for the simulation, robot_state_publisher for broadcasting robot transformations, and the various controller nodes that manage the robot arm and its movements. After these lines, the XACRO file output is printed to the terminal; it can be skipped without analysis, since it is not part of the functionality relevant here.
- As seen in the output above, the /ariac/start_competition service is now available and is called successfully. All Lab 6 output is printed in green so the user can tell which messages belong to Lab 6. To move the UR10 to the specified points, the program waits for the user to press Enter, printing "Please press Enter to move the elbow joint only". Pressing Enter runs the setAndPublishJointTrajectory function.
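A function like `setAndPublishJointTrajectory` typically builds a `trajectory_msgs/JointTrajectory` message and publishes it to the arm's command topic. The sketch below is an illustrative reconstruction, not the package's actual implementation; the joint-name list, the index of the elbow joint, and the timing value are assumptions based on the ARIAC 2019 UR10 setup.

```cpp
#include <ros/ros.h>
#include <trajectory_msgs/JointTrajectory.h>
#include <trajectory_msgs/JointTrajectoryPoint.h>
#include <vector>

// Publish a single-point trajectory that changes only the elbow joint.
void setAndPublishJointTrajectory(ros::Publisher &pub,
                                  const std::vector<double> &current,
                                  double elbow_target) {
  trajectory_msgs::JointTrajectory traj;
  traj.header.stamp = ros::Time::now();
  // Joint ordering used by the ARIAC 2019 UR10 arm controller (assumed).
  traj.joint_names = {"linear_arm_actuator_joint", "shoulder_pan_joint",
                      "shoulder_lift_joint",       "elbow_joint",
                      "wrist_1_joint",             "wrist_2_joint",
                      "wrist_3_joint"};

  trajectory_msgs::JointTrajectoryPoint point;
  point.positions = current;          // start from the current configuration
  point.positions[3] = elbow_target;  // move only the elbow joint
  point.time_from_start = ros::Duration(2.0);

  traj.points.push_back(point);
  pub.publish(traj);
}
```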
*(Demo video: `ariac_arm_demo_video.mp4`)*
- As seen in the image, several outputs are printed to the terminal. One of the important ones appears between the blue sections: it explicitly states "Vacuum just turned on". When the vacuum grips an item, the terminal also prints "Gripper is gripping something", as seen in the image.
- As seen in the image above, we can likewise see the "Vacuum just turned off" and "Gripper is not gripping something" callbacks from the gripper control service.
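Toggling the vacuum and observing the gripper state can be sketched as follows. This is a minimal sketch assuming the ARIAC 2019 `osrf_gear` gripper interfaces; the `/ariac/arm1` namespace and the node name are assumptions, and the log strings mirror the terminal output described above.

```cpp
#include <ros/ros.h>
#include <osrf_gear/VacuumGripperControl.h>
#include <osrf_gear/VacuumGripperState.h>

// Print the messages seen in the terminal whenever the gripper state updates.
void gripperStateCallback(const osrf_gear::VacuumGripperState::ConstPtr &msg) {
  if (msg->attached)
    ROS_INFO("Gripper is gripping something");
  else
    ROS_INFO("Gripper is not gripping something");
}

int main(int argc, char **argv) {
  ros::init(argc, argv, "gripper_example");
  ros::NodeHandle nh;

  ros::Subscriber sub =
      nh.subscribe("/ariac/arm1/gripper/state", 10, gripperStateCallback);
  ros::ServiceClient gripper =
      nh.serviceClient<osrf_gear::VacuumGripperControl>(
          "/ariac/arm1/gripper/control");

  osrf_gear::VacuumGripperControl srv;
  srv.request.enable = true;  // turn the vacuum on
  if (gripper.call(srv) && srv.response.success)
    ROS_INFO("Vacuum just turned on");

  ros::spin();
  return 0;
}
```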
- As seen in the terminal image, three different joint-angle trajectories are printed. This follows from how I move the robotic arm and how I construct those trajectories. First, I used 6 different waypoints to perform the movement shown in the video provided above. Independent of these 6 waypoints, I performed one more trajectory: in the previous lab, I moved the linear arm actuator directly in front of the desired part, but this caused a collision between the logical cameras and the robotic arm. After many trials, I found that moving the linear arm actuator 25 cm away from the desired part's position avoids these collisions. Here are the code segments in which I implemented this part:
I also implemented the 7 waypoints mentioned in the previous lab.




