
Practical implementation and experimental tests

Figure 6.3: Representation of the workstation composed of a Raspberry Pi 3 Model B+ to which the webcam and the LCD screen are connected. The Leap Motion controller is placed in front of the screen.

without any difficulties.

Then the robot calibration is required. The calibration can be done in two ways: auto-calibration and manual calibration. In the first, the robot moves by itself: the axes that need to be calibrated move until they reach their maximum position, in order to compute an offset for each motor. In the second, the user moves the axes so as to align the two arrows placed on the fixed and moving parts of the joints. To perform a manual calibration, at least one auto-calibration is needed, because the manual procedure reuses the offsets computed by the automatic one and therefore achieves the same precision. Manual calibration is recommended, firstly because it allows the robot to reach the same position with the same precision, and secondly because it is faster and the motors wear less.
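The offset computation behind auto-calibration can be sketched as follows. This is an illustrative model, not Niryo's actual firmware logic: the joint names, nominal limits, and encoder readings are hypothetical values chosen only to show the idea of driving each axis to its mechanical stop and recording a per-motor offset.

```python
# Illustrative sketch of auto-calibration (not Niryo's firmware logic):
# each joint is driven to its mechanical limit, and the encoder reading
# there is compared with the nominal limit to obtain a per-motor offset.

NOMINAL_LIMITS = {"joint_1": 3.05, "joint_2": 0.64, "joint_3": 1.57}  # rad, hypothetical

def compute_offsets(readings_at_limit):
    """Offset = nominal limit - encoder reading at the mechanical stop."""
    return {j: NOMINAL_LIMITS[j] - r for j, r in readings_at_limit.items()}

def corrected(joint, raw_reading, offsets):
    """Apply the stored offset to a raw encoder reading."""
    return raw_reading + offsets[joint]

# Example: each encoder is slightly biased at its stop.
offsets = compute_offsets({"joint_1": 3.00, "joint_2": 0.60, "joint_3": 1.50})
```

Once the offsets are stored, every subsequent reading can be corrected with `corrected()`, which is why a manual calibration performed afterwards inherits the same precision.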

After the calibration, in the right part of the application (figure 6.4), the user can see the current state of the robot in a 3D virtual representation, read the values of all joints and the position and orientation of the end effector, and choose the arm's maximum speed. A toggle button is also present to switch the “learning mode” on and off. This modality deactivates the torque on all motors, so it is possible to move the robot by hand.
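The effect of the learning-mode toggle can be modelled in a few lines. This is a plain-Python sketch of the behaviour described above, not the stack's actual implementation (on the real robot the toggle goes through a ROS service); the class and attribute names are invented for illustration.

```python
# Minimal model of the "learning mode" toggle: switching it on disables
# torque on every motor so the arm can be moved by hand.

class Motor:
    def __init__(self, name):
        self.name = name
        self.torque_enabled = True

class Arm:
    def __init__(self, motor_names):
        self.motors = [Motor(n) for n in motor_names]
        self.learning_mode = False

    def set_learning_mode(self, active):
        """Toggle torque on all motors; mirrors what the GUI button does."""
        self.learning_mode = active
        for m in self.motors:
            m.torque_enabled = not active

arm = Arm(["joint_%d" % i for i in range(1, 7)])
arm.set_learning_mode(True)   # torque off: the robot can be moved by hand
```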

In the left part of the application, using a menu, the user can navigate through different panels that allow them to control the robot, save positions and sequences, and access other specific functions such as settings, calibration, hardware status, debug and logs.

6.2 – Niryo One

Figure 6.4: Niryo One Studio application showing the robot state on the right and the arm and tool commands on the left

When the control section is selected from the left menu, two boxes are displayed.

In the first box, at the top, two tabs are presented: the first allows the user to choose the value of each joint, the second allows the user to define the final pose of the robot. In both cases the user can read the current values, change them, send the command to the robot, or stop it.
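Before a joint command is forwarded, the targets must lie within each joint's range. The sketch below shows such a validation step; the limit values are illustrative and are not the official Niryo One joint limits.

```python
# Hypothetical validation of a joint command before it is sent to the
# robot: each target is checked against (assumed) joint limits.

JOINT_LIMITS = {  # rad; illustrative values only
    "joint_1": (-3.05, 3.05),
    "joint_2": (-1.57, 0.64),
    "joint_3": (-1.39, 1.57),
}

def validate_command(targets):
    """Return the command if every target is within limits, else raise."""
    for joint, value in targets.items():
        lo, hi = JOINT_LIMITS[joint]
        if not lo <= value <= hi:
            raise ValueError("%s target %.2f outside [%.2f, %.2f]"
                             % (joint, value, lo, hi))
    return targets

cmd = validate_command({"joint_1": 1.0, "joint_2": 0.0, "joint_3": -0.5})
```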

The second box, in the lower part, concerns the tool. From this box the user can select which gripper is mounted, choose the opening and closing speed, and send open and close commands.

The user can be completely unaware of the robot's internal behaviour and can command it without any effort. This GUI decouples robotics knowledge from programming knowledge. However, this application uses the ROS network described in the next section, the same one used for programming. All commands generated by Niryo One Studio are executed through ROS services or through a client-server protocol implemented with the actionlib package (see section 6.4).
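The client-server pattern that actionlib implements can be illustrated without ROS: a client submits a goal, the server periodically reports feedback while executing it, and a final result is returned. The following is a plain-Python sketch of that idea, not the actionlib API itself; all class and field names are invented.

```python
# Plain-Python sketch of the actionlib goal/feedback/result pattern.
# Real actionlib communicates over ROS topics; here the server is an object.

class MoveActionServer:
    """Toy server: executes a move goal, reporting intermediate feedback."""
    def execute(self, goal_position, feedback_cb):
        position = 0.0
        while abs(goal_position - position) > 0.25:
            position += (goal_position - position) / 2.0
            feedback_cb(position)          # progress report, like actionlib feedback
        return {"status": "SUCCEEDED", "final_position": goal_position}

class MoveActionClient:
    def __init__(self, server):
        self.server = server
        self.feedback = []

    def send_goal_and_wait(self, goal_position):
        """Send a goal, collect feedback, and block until the result arrives."""
        return self.server.execute(goal_position, self.feedback.append)

client = MoveActionClient(MoveActionServer())
result = client.send_goal_and_wait(2.0)
```

The advantage of this pattern over a plain service call is precisely the feedback channel: long-running commands such as arm movements can be monitored, and in real actionlib also preempted, while they execute.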

At the beginning of the practical tests, the robot movements were imposed by hand, setting the grasping pose of each piece in Niryo One Studio, in order to make sure that each piece was reachable.


6.2.2 Programming

At the startup of the robot, the Raspberry Pi launches the ROS master and creates a network of ROS nodes that allow the robot to work properly.

The Niryo company provides an open-source metapackage (niryo_one_ros) containing the series of packages that compose the Niryo One ROS Stack.

The niryo_one_bringup package contains all the files needed to start the system. This operation can be done in two ways. The first is through a service (niryo_one_ros.service), which is automatically started when the robot is switched on and works in the background; in this case it is not possible to modify the robot behaviour. The second is through a launch file, which creates the same network of nodes but works in the foreground; in this case it is possible to modify the Niryo One ROS stack. It is necessary to distinguish between two launch files: rpi_setup.launch and desktop_rviz_simulation.launch. The first file runs on the robot and launches rosbridge, for the connection with other components; niryo_one_base, which loads the robot parameters; the controllers, which manage the drivers, the tool interface and the state of the robot; robot_interface, which is responsible for the robot movements; and user_interface, which allows the interaction with Niryo One Studio. The second file is identical to the first apart from the following aspects: it can be launched only on a computer, it uses Rviz to visualise the robot, and it does not have access to the hardware, so fake controllers are used.
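To start the stack in the foreground, a custom launch file can simply include the bringup file mentioned above. The fragment below is a hypothetical minimal wrapper in the standard roslaunch XML form; the include path follows the usual `$(find package)` convention.

```xml
<!-- Hypothetical wrapper launch file: starts the Niryo One ROS stack in the
     foreground (so it can be modified), instead of relying on the background
     niryo_one_ros.service. -->
<launch>
  <!-- Reuse the stack's own bringup file -->
  <include file="$(find niryo_one_bringup)/launch/rpi_setup.launch" />
</launch>
```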

When the network is up, the robot can communicate with other components in hotspot mode, creating its own local network, or through a direct connection using an Ethernet cable, a USB cable or SSH, for example to be linked to other boards, to a joystick or to a local network.

For example, Niryo One Studio connects to the robot over this network and sends it commands; Niryo One uses a WebSocket server to receive and process them.
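Since the connection goes through rosbridge, each command travels as a JSON message over the WebSocket. The sketch below builds a message in the rosbridge `call_service` form; the service name and arguments shown are illustrative, not a confirmed Niryo One Studio message.

```python
import json

# Sketch of a rosbridge-protocol message such as the one a client could
# send over the WebSocket; the service name and args are illustrative.

def call_service_message(service, args, msg_id="1"):
    """Build a rosbridge 'call_service' JSON message."""
    return json.dumps({"op": "call_service", "id": msg_id,
                       "service": service, "args": args})

msg = call_service_message("/niryo_one/example_service", {})
```

On the robot side, rosbridge decodes the JSON and forwards the call to the corresponding ROS service, so the GUI never needs a native ROS installation.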
