

Encoders

There are two EH17-30MH optical encoders (e in 3.1), each of them fixed on one of the two motors (M in 3.1).

Each encoder is composed of two main parts:

• The optical unit

• The codewheel

The optical unit, coupled with the custom codewheel, is able to provide the relative position. The encoders are attached to the wheels by means of gears.

IMU

An Inertial Measurement Unit (IMU) measures and reports a body’s specific force, angular rate, and the orientation of the body, using a combination of accelerometers and gyroscopes.

It is connected to an STM32 L432KCU microcontroller, which retrieves the data from the sensor, combines the information and provides the result to the Controller.

Joystick

J in 3.1, it is the component used for manual control of the wheelchair. The joystick has priority over the autonomous driving system, meaning that if a navigation command comes from it, all the other commands generated by the ROS2 system (High Level) are ignored.

This feature is useful both for controlling the wheelchair manually, when someone wants to quickly move it without issuing autonomous navigation commands, and in an emergency, when the vehicle needs to be stopped or steered in a direction other than the one decided by the autonomous driving system.

Nvidia Jetson Xavier NX

Jetson Xavier NX is presented by Nvidia as the world's smallest AI supercomputer for embedded and edge systems. It comes with cloud-native support, which allows the same technologies and workflows that revolutionized cloud platforms to be used to build, deploy and manage applications on Jetson edge devices.

Figure 3.4: Nvidia Jetson Xavier NX Developer Kit

Among the electronic components of SEDIA, the Jetson Xavier NX Developer Kit module is the only one on which the High Level part of the system runs, while all the others constitute the Low Level, as previously said. The module is connected to the Controller through a UART interface.

Figure 3.5: Nvidia Jetson NX specifications

NX usage

As stated in the specifications, the storage module of this board is a microSD, which must be inserted in a specific slot located under the aluminium heatsink cooling module. The microSD must first be flashed with the Jetson Xavier NX Developer Kit SD Card Image, which in our case contains an Ubuntu 20.04 operating system, compatible with the Foxy Fitzroy release of ROS2, the one used by the company at the time of this thesis.
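As a sketch of this step from a Linux machine, one common way is to write the image with dd (the archive and image file names, and the /dev/sdX device node, are placeholders to be checked on the actual system; the official guide also suggests graphical tools such as balenaEtcher):

    # identify the microSD device node first, e.g. with lsblk
    unzip jetson-nx-developer-kit-sd-card-image.zip
    sudo dd if=sd-blob.img of=/dev/sdX bs=4M status=progress conv=fsync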

Once the microSD is properly flashed and inserted in the NX, we supply power through the barrel jack (8 in 3.4) and the system boots with the OS and the file system contained in the microSD. We can access the NX in two ways:

• Through SSH: we connect a laptop to the Jetson NX through the Gigabit Ethernet port (4 in 3.4), give the laptop's interface an IP address in the same subnet as the NX's interface and start an SSH session, in order to obtain a prompt in the file system of the board. Of course, the IP address of the Gigabit Ethernet interface, called eth0, must previously be set statically in the /etc/network/interfaces.d/eth0 file (in Ubuntu 20.04); a sketch of this configuration is shown after this list.

• Directly, using the HDMI (6 in 3.4) or DisplayPort (7 in 3.4) interfaces: in this case, only an external monitor is needed, along with either an HDMI or a DisplayPort cable, connected to the corresponding connector. A terminal shows up right away, requesting a login, and a graphical interface can also be activated with the command sudo systemctl isolate graphical.target.
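For the SSH route, a minimal sketch of the static configuration and of the connection follows (all addresses are illustrative, not the ones actually used on SEDIA):

    # /etc/network/interfaces.d/eth0 on the NX: static address for eth0
    auto eth0
    iface eth0 inet static
        address 192.168.10.2
        netmask 255.255.255.0

    # from the laptop, after giving its interface e.g. 192.168.10.1/24
    ssh <user>@192.168.10.2    # replace <user> with the account configured on the NX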

Repo installation and run

Once inside the NX, if not already done, ROS2 must be installed (the desktop version), following the public documentation, so that we can eventually execute the navigation system of Alba Robot, contained in the company's private repository. The ROS2 distro used at the time of this thesis, as previously mentioned, is ROS2 Foxy Fitzroy, but the company's future shift towards the new x86-based SoC as the hosting module for the ROS2+Nav2 system will lead to the adoption of ROS2 Galactic Geochelone.
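For reference, the public instructions for Foxy on Ubuntu 20.04 essentially amount to adding the ROS2 apt repository and installing the desktop metapackage. The following is a condensed sketch; the official documentation remains the authoritative source:

    sudo apt update && sudo apt install -y curl gnupg2 lsb-release
    # add the ROS2 apt key and repository
    sudo curl -sSL https://raw.githubusercontent.com/ros/rosdistro/master/ros.key -o /usr/share/keyrings/ros-archive-keyring.gpg
    echo "deb [signed-by=/usr/share/keyrings/ros-archive-keyring.gpg] http://packages.ros.org/ros2/ubuntu $(lsb_release -cs) main" | sudo tee /etc/apt/sources.list.d/ros2.list > /dev/null
    # install ROS2 Foxy desktop
    sudo apt update && sudo apt install -y ros-foxy-desktop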

We’re focusing on the former, as it is the one this thesis has been working on.

Once ROS2 has been installed, we need to open a terminal and source the environment with the following command:

    source /opt/ros/foxy/setup.bash

This command executes the setup.bash script in the current terminal. That script, automatically installed with ROS2, sets all the environment variables and definitions needed by ROS2 to run, so that the ros2 command works in that terminal. A test can be done, as suggested by the official ROS2 documentation, by launching some example nodes from the demo_nodes_cpp ROS2 package, publicly available on GitHub. If the test succeeds, we can proceed.
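The classic test from the documentation consists of a talker and a listener node, each run in its own sourced terminal:

    # terminal 1
    source /opt/ros/foxy/setup.bash
    ros2 run demo_nodes_cpp talker

    # terminal 2
    source /opt/ros/foxy/setup.bash
    ros2 run demo_nodes_cpp listener

If the listener prints the messages published by the talker, the installation is working.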

In order to execute the company's ROS2-based navigation system on the NX we are using, we need to follow the instructions contained in the private repository of Alba Robot, which explain how to clone the repository and install all the required dependencies. Of course, we must refer to a specific release of the company's code.

This thesis refers to release v1.4, the one the work has been carried out on. Once everything is installed, we also need to build all the ROS2 packages; this can be done by means of the colcon tool, as explained in the repository (the very same tool used in the official ROS2 documentation).
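A minimal sketch of the build step, assuming the repository has been cloned into a ROS2 workspace (the workspace path here is hypothetical; the actual layout is described in the private repository):

    cd ~/alba_ws                 # hypothetical workspace containing the repo
    colcon build --symlink-install
    source install/setup.bash    # overlay the freshly built packages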

If it is the first time we try to make the Alba Robot repository work on our machine, some dependencies will most likely be missing. Hence, instead of trying to build the repo, getting an error and installing the missing dependency every time, which wastes a lot of time, we can take advantage of the rosdep tool, which searches for all the dependencies required by every package contained in a user-given path and automatically installs them.
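A typical invocation, following the standard rosdep workflow (the src path is an assumption about where the packages live inside the workspace):

    sudo rosdep init    # only needed the first time on a given machine
    rosdep update
    rosdep install --from-paths src --ignore-src -r -y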

Once all the packages of the repo have been successfully built, we need to source the environment (terminal) in which we want to execute the system and run the nodes that compose the company's navigation system. Each node is meant to be executed in a different terminal; therefore, in order to avoid typing the launch command for every single node in a lot of different terminals, sourcing each one of them beforehand, the entire architecture can be launched by means of the screen command: “screening” a certain file contained in the Alba Robot repository will automatically run a set of terminals following specific instructions. In this way, all our nodes are executed at once.
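As an illustration of the mechanism (the actual file shipped in the repository and the node names are private, so the ones below are purely hypothetical), screen can be given a configuration file that opens one window per command:

    # nav.screenrc -- hypothetical example, not the company's actual file
    screen -t node_a bash -c 'source install/setup.bash; ros2 run pkg_a node_a'
    screen -t node_b bash -c 'source install/setup.bash; ros2 run pkg_b node_b'

Running screen -c nav.screenrc then starts all the windows, and hence all the nodes, at once.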