Master's Thesis in Robotics and Automation Engineering
A novel tactile display for softness and
texture rendering in human-robot and
tele-operation applications
Advisors:
Prof. Ing. Antonio Bicchi
Dott. Ing. Matteo Bianchi
Dott. Ing. Alessandro Serio

Candidate:
Mattia Poggiani
Bioengineering and Robotics Research Center Enrico Piaggio
Dipartimento di Ingegneria dell'Informazione
Graduation session of 03/10/2014
Academic year 2013/2014
Contents

I Problem definition

1 Introduction
1.1 Haptic teleoperation: how and which information?
1.2 Haptic information: softness
1.3 Haptic information: texture
1.4 Teleoperation overview and state of the art

2 Towards a unique system: FYD Touchpad
2.1 Teleoperation problem: feasibility analysis

II FYD Touchpad system architecture

3 Hardware design and implementation
3.1 Device Overview and working principle
3.2 Hardware details
3.3 Motors block details
3.4 Fabric properties and cover design

4 Software design and implementation
4.1 Master haptic device: software architecture
4.1.1 Finger detection
4.1.2 Indentation estimation
4.1.3 Normal force computation
4.1.4 Stiffness control
4.1.5 Vibration and PWM generation
5.2 Acceleration filtering
5.3 Digital rendering algorithms

6 Contact area driven teleoperation

III Experiments and results

7 Performance tests on FYD Touchpad system
7.1 Overview of the experiments
7.2 Texture and teleoperation tests with KUKA robot
7.3 Texture tests without robot
7.4 Results discussion

8 Conclusions and Ongoing work
8.1 Reactive grasp using acceleration information

A Application Code
Abstract
The aim of this thesis is to design and engineer a tactile display for softness and texture rendering to be used as a master device in human-robot and teleoperation applications. In recent years, many types of teleoperation approaches, i.e. the control of a machine at a distance, have been developed using haptic interfaces, i.e. mechatronic devices that are able to recreate the sense of touch for the user, to control a virtual simulated / remote manipulator.
Among the different haptic properties to be rendered, softness plays a crucial role in guaranteeing an effective interaction with a remote environment. Furthermore, haptics research has produced several efforts to understand and recreate high-frequency texture information to improve the quality of haptic feedback in both real and virtual environments.
To the best of our knowledge, reproducing both these types of information in a haptic device for teleoperation tasks is still an unexplored topic. In this thesis, we designed and fabricated a novel fabric-based device, hereinafter referred to as FYD Touchpad, which can be used to tele-operate a remote robot (through the tracking of the contact area) and to convey haptic stimuli.
More specifically, this system is able to perform digital texture rendering through Pulse Width Modulation of two DC motors. Additionally, while softness information is conveyed by modulating the stretching state of the fabric, the dynamic movement of the user finger on the elastic tissue allows the user to remotely control a robot linked through a computer network, by tracking the contact area location on the fabric. At the same time, the user can experience the texture and softness information the robot end-effector is sensing at the remote side. The device can be easily interfaced with different manipulators, since it communicates with external robots using a protocol based on packet exchange, over both wired and wireless networks. The haptic interface was tested using a KUKA 7-DoF manipulator for remote control. Several objects with different combined softness and roughness properties were remotely explored. Their properties were then suitably reproduced by the FYD Touchpad. Experimental results on the correlation between the signals on the master and slave side show the effectiveness of the proposed system and techniques.
Chapter 1
Introduction
In this chapter we review the basic principles underlying haptic (texture and softness) feedback in teleoperation control.
The goal is to provide the reader with an overview of the state of the art of such studies and their implementations in real systems. Then, a thesis outline and a feasibility analysis follow.
1.1 Haptic teleoperation: how and which information?
In physiology, the word haptic derives from the Greek ἅπτεσθαι (haptesthai, relative to touch) and refers to the ability to experience the environment through active exploration, typically with the hands, as when touching an object to discover its shape and material properties. This is what is commonly called haptic touch, but the words haptic and haptics are also used to refer to all touch and touch-related capabilities, including the ability to sense the position and movement of one's limbs.

Figure 1.1.1: Robot-Human interaction by touch. Hands touching, courtesy of the artist Andreus, canstockphoto.com.

Haptic interfaces are devices able to create a link between the robot and the human tactile sense and to synthesize a variety of tactile stimuli, allowing the user to explore and recognize an object through suitably conveyed physical information processed by the mechanoreceptors and thermoreceptors located in the human skin. In other words, haptic interfaces attempt to replicate or enhance the touch experience of manipulating or perceiving a real environment through mechatronic devices and computer control.
Many benefits can be achieved through the use of such interfaces. A major example is given by a surgical intervention in which robot position information is merged with the sensitivity of the human tactile sense. Other devices of this type can be used in astronautical and mechanical fields and for telemanipulation and training tasks [1]. To better understand and study the way a haptic interface works, it is necessary to define the terms kinaesthetic and cutaneous information (see [2] for references). The term haptic usually refers to stimulation of both kinaesthetic and cutaneous channels [3]. Kinaesthetic information is associated with geometry, joint position and velocity, or actuator forces while grasping and touching an object; this information comes from muscle sensors in articulations and tendon capsules. In detail, kinaesthesia is mediated by muscle spindles, which transduce the stretch of muscles, and Golgi tendon organs, which transduce joint rotation, especially at the extremes of motion. These and similar receptors could be stimulated directly to produce haptic sensations; for example, a vibration applied to a muscle tendon creates a strong sensation of muscle lengthening and corresponding joint motion in humans. On the contrary, cutaneous information refers to pressure distributions and space and time indentation sensed on the skin; these data derive from mechanoreceptors in the derma.
In some cases kinaesthesia can play a more relevant role than cutaneous information in discriminating physical or geometrical features, while in other cases the roles are symmetrically exchanged, as reported in [4]. For instance, while weight perception is dominated by kinaesthesia, thermal sensations are purely cutaneous. However, both are necessary to have a fine and reliable perception of reality, even if the cutaneous cues are generally predominant (for softness) [5].
Figure 1.2.1: Two different configurations of the Phantom robot: the user can interact through a thimble (a) or a pen-like tool (b).
1.2 Haptic information: softness
Haptic displays can be classified depending on the type of information they process. There are indirect contact haptic interfaces, in which the user interacts with a terminal tool or with a thimble at the end of the device, which is capable of reproducing the forces due to the contact between the tool and the virtual / remote environment; the haptic stimulation is mediated by the device itself, which is interposed between the virtual / remote environment and the finger. In this case, the information taken into account is mainly kinaesthetic. An example of an indirect contact haptic interface is the Phantom, represented in Figure 1.2.1, i.e. a robotic arm that can be driven by the operator to obtain information about forces and other movements applied during the contact and to reproduce the shapes and stiffness of virtual objects.
Another type of haptic display is represented by direct contact cutaneous interfaces. These devices are able to reproduce stimuli such as roughness, temperature and virtual / remote object shape in a more reliable way than indirect contact interfaces, because they directly stimulate the user's fingertip. The C.A.S.R. (Contact Area Spread Rate) and F.Y.D. (Fabric Yielding Display) devices, developed by Research Center E. Piaggio of the University of Pisa, belong to this category of haptic devices and are shown in Figure 1.2.2.
The C.A.S.R. display is a cutaneous device which aims to reproduce the contact area spread on the finger probing an object for softness: the underlying hypothesis is that a large part of the tactile information necessary to distinguish object softness by touch is related to the relationship between the indenting force and the area spread at the contact. This does not mean that other tactile information is not relevant to reproducing the sense of touch, but that if only limited information is available, it is necessary to prioritize that related to the contact area, as the authors say in [6].
This device was proven to enable a more realistic softness perception compared with the one achievable with a purely kinaesthetic device [6]. However, the structure of
Figure 1.2.2: The C.A.S.R. display (on the left) and the first FYD prototype (on the right) with finger interaction.
this display does not provide users with a continuously deformable surface, thus producing edge effects. This might lead to a not completely immersive experience, which can destructively affect the transparency and reliability of the perception, which is crucial in teleoperation tasks. Moreover, the contact area involved in the interaction can be known only after some geometric considerations related to the measured displacement. This can represent a limitation for correctly mimicking real CASR force / area curves, for which an accurate online measurement of the contact region is mandatory. To overcome these limits, in [6] the authors propose a new concept of displays based on a bi-elastic fabric, the Fabric Yielding Displays (FYDs). Bi-elastic means that the fabric exhibits properties that render it elastic in at least two substantially perpendicular directions. By changing the elasticity of the fabric, users are able to feel different levels of softness by touching a deformable surface. At the same time, the contact area on the finger pad can be measured via an optical system.
The haptic display presented in this thesis can be seen as a development of the F.Y.D. and it can be classified as a direct contact interface.
Studies have shown that during normal interaction with the world, tactile sensory information is predominant, as the cutaneous contribution alone is sufficient for softness discrimination of objects with deformable surfaces, as Srinivasan and LaMotte report [5]. However, most currently available haptic devices (as Hannaford and Okamura note in [4]) act primarily as force displays, although a cutaneous sensation is nevertheless provided through the contact with the device tool, even if it is not modulated by the device itself. Such displays can render a reliable softness sensation, but there are still technical limitations due to the low resolution of the stimuli: to convey cutaneous information with tactile displays, it is necessary to reproduce on the fingertip the complex mechanical interaction and stress/strain distribution which originates from the contact between the finger and the external object.
The challenge for research is to reduce the complexity of this tactile information to a meaningful approximation, while considering design limitations such as feasibility, costs and quality of the rendering of the haptic stimuli; it is for this reason that such direct contact haptic devices have been designed and developed.
1.3 Haptic information: texture
It is important to focus our attention not only on haptic devices, but also on finger-surface interaction and on the algorithms and procedures used to convey sensations to the fingertip. The main contributions come from Howe [7], Cutkosky [8], Wiertlewski and Lozada [9] and Katherine J. Kuchenbecker [10, 11, 12].
The main problem is that modern haptic interfaces convey the large-scale shape of virtual objects, but they often provide unrealistic or no feedback on the microscopic details of surface texture. Direct roughness rendering challenges the state of the art in haptics because it requires a finely detailed model of the surface's properties, real-time dynamic simulation of complex interactions, and high-bandwidth haptic output to enable the user to feel the resulting contacts.
The first studies come from Kontarinis and Howe, who in 1995 used a voice coil actuator mounted near the user's fingertips to superimpose acceleration waveforms measured at the slave's end-effector on force feedback from strain gauges [7]. Although the vibration feedback was not carefully controlled, user tests indicated that this hybrid feedback strategy increased user performance in inspection, puncturing, and peg-in-slot tasks. Later, Okamura and Cutkosky studied and implemented a reality-based model of vibrations for enhancing haptic feedback for impact events such as tapping. As reported in [8], this system was modeled through a series of perceptual experiments with a haptic display, by measuring the acceleration of the stylus of a three degree-of-freedom haptic display as a human user tapped it on several materials. Thanks to these experiments, where human users rated the realism of various parameter combinations, the authors had the opportunity to enhance the realism of the vibration display for impact events. However, for some materials the measured parameters (amplitude, frequency and decay rate) exceeded the bandwidth of the haptic display, which was therefore not capable of actively displaying all the vibration models.
In general, most of the approaches used to simulate roughness include the use of force feedback devices to replicate the micro-geometry of surfaces, directly or by reproducing its effects; see [13] for an extensive survey. Other approaches employ electrostatic fields or surface acoustic waves to modulate the friction force that arises when a finger slips on an active surface; this is because, if the surface in question deviates from smoothness, the interaction force varies over time as a result of a complex interaction taking place between the finger and the surface. Microscopically, the variation is the consequence of the space-varying and time-varying traction distribution; it depends on the relative geometries of the finger and the surface, on the materials they are made of, and on the possible presence of fluids and foreign bodies. Despite this complexity, integration of the traction over the contact surface results in a net force that can be measured.
Unlike visual or auditory displays, in which causality is clear, for haptic displays the problem of whether the finger-surface interaction force causes the finger to deform or whether the deformation causes the interaction force cannot be easily solved. For this reason, Wiertlewski and Lozada built an apparatus that unambiguously establishes a causal relationship between the measurement and the stimulation by operating both as a sensor and as an actuator. In both cases, the device was engineered to be very stiff, that is, five orders of magnitude stiffer than a fingertip. In this way, when used as a sensor, the interaction force is known regardless of the finger movements and deformations; when used as an actuator, the displacement is specified independently from the interaction force. To complete the symmetry, during recording operations the sensor is fixed with respect to the ground and the finger slips on a rough surface. During restitution, the actuator is mounted on a slider and remains fixed with respect to the scanning finger touching a flat surface. In both modes, the device operates with a bandwidth spanning from 20 to 600 Hz and has a maximum displacement of 0.2 mm in actuator mode, thereby covering the range useful for conveying roughness.
Moreover, Wiertlewski et al. performed a preliminary psychophysical experiment aimed at finding the subjective equivalence of roughness elicited by a rapidly varying measured force or by an imposed displacement, hence realizing a causality inversion between the measurement and the display. This approach is in contrast with the one employed in conventional haptic devices, where a force is measured, or computed, and then specified with impedance devices, or where a displacement is computed and then specified with admittance devices.
A further method to render texture is proposed in the studies of Katherine J. Kuchenbecker. Her research focuses on the recreation of texture information via vibrational feedback to improve the quality of the haptic experience in both virtual environments and teleoperation.
Her first application concerns surgery: the development of the VerroTouch system, which enables surgeons to feel the structures they are touching during telerobotic surgery, specifically with the Intuitive da Vinci S surgical system [10]. As shown in Figure 1.3.1, the system continually measures the accelerations of the left and right teleoperated tools using small high-bandwidth accelerometers. These signals reflect the dynamic interaction that is instantaneously occurring between each tool and its environment, whether it is probing, cutting, sticking, or slipping. The system's main receiver filters and amplifies the two measured accelerations and uses a pair of vibration actuators (affixed to the sides of the master handles) to re-create these acceleration profiles at the fingertips of the surgeon.

Figure 1.3.1: The VerroTouch system developed by Kuchenbecker installed on an Intuitive da Vinci S surgical system.
Another system proposed by Kuchenbecker in [11] uses a sensorized handheld tool to capture the feel of a given texture, recording three-dimensional tool acceleration, tool position, and contact force over time. These data are then processed according to the chosen rendering algorithm in order to obtain texture models; these patterns are finally rendered in real time on a Wacom tablet using a stylus augmented with small voice coil actuators. The resulting virtual textures provide a compelling simulation of contact with the real surfaces, which she verified through a human subject study.
Her recent studies have shown that when you touch a real surface with a tool, you feel low-frequency forces that convey shape, compliance, and friction, but you also feel high-frequency vibrations that reflect the texture of the object and its current contact state with your tool. The mechanoreceptors that detect these important high-frequency vibrations are the Pacinian corpuscles, which are sensitive to vibratory stimuli from 20 to 1000 Hz, with a peak sensitivity between 250 and 550 Hz [15]. Pacinian corpuscles are also known to respond to vibrations that occur in all directions, with motion parallel to the skin surface being slightly easier to detect than motion normal to the skin. One could thus imagine replicating these single-axis techniques in three orthogonal directions, but such an approach would require significantly more complex vibration models and haptic hardware. Instead, Kuchenbecker believes we can take advantage of the human hand's insensitivity to vibration direction by recreating the feel of a full three-dimensional acceleration with a perceptually equivalent one-dimensional signal.
1.4 Teleoperation overview and state of the art
The term teleoperation is in use in research and technical communities as a standard term for operation at a distance. It is commonly associated with robotics and mobile robots, but it can be applied to a whole range of applications where a device or machine is operated by a person from a distance, which can vary from tens of centimeters to millions of kilometers.
A teleoperation system consists of a master device, held by the operator, and a slave device, which is usually a robotic tool at the remote site. Usually the user cannot see the slave directly; hence, he must rely on feedback from the robot's worksite, presented to the user by means of interfaces. There are several forms of feedback, such as live video from cameras, haptic feedback, auditory feedback in the human hearing range, and temperature or contact sensors [16].
The first teleoperation systems were built after the Second World War for the needs of nuclear activities [16]. Such systems used the master-slave concept and were composed of two symmetrical arms. The master arm is handled by the operator; the slave one replicates the operator's motions at the spot where the task has to be performed. In early systems, the absence of sophisticated electronics (mainly computers) constrained designers to use a symmetrical mechanical device to correctly transfer the motions from the operator to the slave device.
In recent years, more sophisticated systems have been developed exploiting electronics and computer technology. In the following, only a couple of examples are reported to show the variety of applications where robotic teleoperation can be successfully used. A first example is teleoperation in space from Earth. For planetary surface operations, the European Space Agency developed teleoperated mini-rovers [16]. Indeed, activities in space are fundamentally limited by the amount of energy required to raise loads into Earth orbit. An additional requirement, when humans are involved, is the expense of additional safety-critical systems. For these reasons, conducting operations on manned space missions is very expensive. Teleoperation technology can thus have a very substantial impact on the cost and risk of operations by reducing the number of humans required in space for a given amount of work. Moreover, if the size of the teleoperation slave system is reduced, the cost of launching and housing the robotic system is also reduced. On the other hand, problems such as communication difficulties, time delays, limited computation power on board, reliability and safety can arise. Another example of teleoperation can be found in the nuclear industry, a field where inspection and maintenance are essential. It is not easy to carry out such maintenance tasks, since the environments are usually highly radioactive and unsafe for human workers. The usual manner of carrying out inspection and maintenance tasks in these hazardous environments is to use long-reach manipulators with a fixed base.

Figure 1.4.1: The da Vinci Surgical System (courtesy of Intuitive Inc., Sunnyvale, CA). Operating room setup with the surgeon seated at the control console.
Surgery represents another field where robotic teleoperation can play a crucial role. The da Vinci Surgical System is one of the most advanced technologies for robot-assisted minimally invasive surgery (RMIS). It is a robot consisting of two manipulation arms and one camera, as shown in Figure 1.4.1. It offers better visualization of anatomical structures by immersing the surgeon in a high-resolution three-dimensional image, instead of the two-dimensional or "flat" video screen of traditional minimally invasive surgery. According to [17] and [18], in RMIS all natural haptic feedback is eliminated because the surgeon no longer manipulates the instrument directly. The lack of effective haptic feedback is often reported by surgeons and robotics researchers alike to be a major limitation of current RMIS systems. The main advantage of such a system is the minimization of surgical trauma and damage to healthy tissue, resulting in a smaller risk of complications, shorter patient and overall recovery times, and a quicker return to normal activities; the main disadvantages are reduced dexterity, reduced workspace, and reduced sensory input to the surgeon, which is only available through a single video image.
This thesis can be regarded as an attempt to endow, for the first time, robotic teleoperation systems with tactile feedback on softness and texture, while exploiting the haptic interface and the contact area between the user finger and the device to control the position of the slave robot.
A common robotic teleoperation system with haptic feedback is similar to the one shown in Figure 1.4.2, where the remote and the local robot are linked through cable or network
connection [19]. The robots communicate with humans using movements and forces; the most common haptic devices are force-feedback joysticks that can return a certain force to the user, dependent on the joystick position. Sometimes the forces are generated from a virtual environment, and sometimes from a real remote robot. In this case, it is called force-reflecting (or haptic) teleoperation. The main idea of adding haptics to teleoperation is that the haptic feedback provided to the user is representative of the actual forces between the remote robot and the environment [20]. Not all teleoperators provide force feedback; indeed, commonly available fly-by-wire flight controls and surgical robots are not force-reflecting teleoperators.
Certain feedback control challenges in teleoperation mirror those in haptic rendering. One objective of force-reflecting teleoperation is to approximate the haptic experience of direct mechanical interaction with the environment. As in haptic rendering, the inherent dynamics of the user interface may mask the environment dynamics and require feedback compensation. The feedback design must also ensure coupled stability between the user and the master, as in haptic rendering. As an example, these aspects are considered by Prattichizzo in [21]: he describes a bilateral teleoperation system that does not suffer from the stability problems that can be due, for instance, to communication delays and that can dramatically reduce the effectiveness of haptic feedback. Moreover, in order to prevent serious mechanical faults such as actuator failures on the master side, which can generate undesired and unsafe motions of the slave robot, different techniques dealing more with the hardware design than with the control architecture of the teleoperation loop must be considered.
Together with stability, another very important teleoperation characteristic is transparency: the goal of teleoperation is to achieve transparency by mimicking human motor and sensory function. An established fact is that ideal transparency can never be reached by conventional bilateral control unless it is redefined by other criteria or conceived differently. Moreover, the communication time delay between master and slave is very critical in teleoperation: time delay affects not only transparency, because the operator's actions and feedback are delayed, but also stability.
Nowadays teleoperation can be implemented in various ways and without using traditional robotic arms. Two examples are the Valve Steam Controller, a new track pad that can be used to explore virtual reality like a mouse, but with the advantage of an integrated advanced force feedback system that conveys haptic sensations to the user, and the work of Liarokapis, Artemiadis and Kyriakopoulos, who implemented teleoperation and manipulation with the DLR/HIT II Robot Hand using a Cyberglove II and a low cost force feedback device [22]. These systems are shown in Figure 1.4.3.

Figure 1.4.3: Valve Steam Controller and teleoperation with the DLR/HIT II Robot Hand using a Cyberglove II and a low cost force feedback device.
Chapter 2

Towards a unique system: FYD Touchpad

This chapter presents the background and the objectives of the thesis.
We want to develop a novel haptic device that enables the user to remotely control a robot through the movement of his finger on a bi-elastic fabric while, at the same time, conveying the texture and softness information sensed by the robot end-effector.
The operator can move his finger all along the fabric surface and can press the fabric more or less to move the teleoperated robot tool up and down. Texture and softness properties are felt as shown in Figure 2.0.1: with a lateral relative motion between the finger and the fabric the user can feel texture, while by pressing the fabric the user can experience material softness.
This reflects the nature of how humans physically interact with the world, i.e. how they explore and contact objects. People use specialized patterns of touching, called exploratory procedures, in order to determine particular object properties [3]. For example, in order to perceive the hardness of an object's surface, people typically press the object. On the contrary, rubbing an object, as people spontaneously do when they try to perceive its roughness, has been found to heighten the responses of the Pacinian mechanoreceptors that are known to provide information about surface texture.
Figure 2.0.1: The two finger movements a user can do on the fabric and what he can feel with each movement.

In this thesis we discuss the design, architecture and implementation of a novel fabric yielding display device for teleoperation tasks, which is able to convey softness and texture information to the user. In the next paragraph a teleoperation feasibility analysis is presented and discussed. Chapter 3 shows the device hardware design and implementation, together with the haptic interface working principle and details of its parts. The software implementation, with the device- and robot-side code and the algorithms for finger area detection, indentation estimation and vibration generation on the fabric, is discussed in Chapter 4. Chapter 5 shows all the steps needed to perform texture rendering reliably, from the first studies of texture rendering in an analog domain to a perceptually equivalent digital implementation, with explanations of the rendering methods and design choices. Chapter 6 describes position control and how teleoperation was implemented through the device, from the first implementation on a Phantom Desktop robot from Sensable® to the last one with a 7 degree-of-freedom KUKA manipulator. Finally, experiments on the device and results showing the effectiveness of softness and texture rendering and of the teleoperation are discussed in Chapter 7. Chapter 8 reports conclusions and shows another application of the implemented texture rendering algorithm, namely how it can also be used to obtain information about when contact with a material is happening.
2.1 Teleoperation problem: feasibility analysis
To assess the feasibility of the device concept from a teleoperation point of view, i.e. to show that the control of the 3D position of the slave robot end-effector can be performed through the haptic device acting as a master system, we designed and realized a simple box-like structure, called Fixed frame touchpad, with an elastic fabric surface (see Figure 2.1.1), where the fabric, which is the same used to realize the haptic device, is not allowed to move and is fixed at a 10 cm distance from a central webcam. In particular, in this phase a Raspicam module was chosen; this webcam is optimized to run with the Raspberry Pi hardware, which was the computing board used for the teleoperation feasibility study; it has a 5 MP resolution and is able to capture frames up to 2592x1944 px at up to 90 frames per second: these performances made the Raspicam the perfect candidate for finger detection. All around the Raspicam, we placed three analog contact-less infrared (IR) distance sensors to perform indentation estimation, directly measuring the distance between them and the fabric (see right of Figure 2.1.1).
Figure 2.1.1: The Fixed frame touchpad structure (on the left), the frame base seen from top (on the right).
In particular, the webcam periodically captures one frame and converts the image to the format best suited for processing: in our application, working with grayscale images sized 160x120 pixels at 20 frames per second produced the best results. The image is then thresholded to obtain a binary image in which only the darker points remain visible. Indeed, the area of the fabric in contact with the finger will have darker pixels in the captured image, since the environment light is blocked by the fingertip itself. Area information can be extracted from these pixels, and the position of the fingertip (in the horizontal plane) is obtained by computing the centroid of the contact area pixels.
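As a minimal sketch of this processing chain, assuming OpenCV and an illustrative threshold value (the actual code and parameters of the feasibility study are not reported here):

#include <opencv2/opencv.hpp>

// Sketch of the feasibility-phase pipeline described above (names illustrative).
cv::Point2d finger_centroid(cv::VideoCapture& cam, double dark_thr = 60.0) {
    cv::Mat frame, gray, bin;
    cam >> frame;                                    // grab one frame
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);   // convert to grayscale
    cv::resize(gray, gray, cv::Size(160, 120));      // working resolution
    // keep only dark pixels: the fingertip blocks the environment light
    cv::threshold(gray, bin, dark_thr, 255, cv::THRESH_BINARY_INV);
    cv::Moments m = cv::moments(bin, true);          // moments of the binary mask
    return { m.m10 / m.m00, m.m01 / m.m00 };         // centroid (pixel coordinates)
}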
To perform centroid identification without significant loss of information about finger movement, an algorithm running at 10 Hz (that is, 10 times per second) proved adequate: the user's finger movement is sufficiently slow that the identified 3D position does not change significantly from one step of the algorithm to the next.
The indentation estimation, i.e. how much the finger indents the fabric surface, represents the position of the finger along the vertical axis; it is obtained by measuring the distance from the structure flange to the fabric during finger pressure. Three infrared sensors are heuristically arranged on the flange so that IR interference is minimized. Each sensor outputs a voltage inversely proportional to distance (blue markers in Figure 2.1.2); the distance measure d is obtained from the sensor output voltage v using a 5th order polynomial interpolation (green line in Figure 2.1.2), that is

d = 10 · (−4.6327 v^5 + 39.954 v^4 − 133.8297 v^3 + 220.8604 v^2 − 188.5623 v + 79.0712) mm

Figure 2.1.2: Sharp analog contact-less sensor output voltage vs. distance curve, with datasheet points and the 5th order polynomial fit.

The indentation estimate is computed by subtracting the average of the three distances from the distance to the fabric measured when no finger is touching. Such an estimation was demonstrated to be quite accurate and precise by using an indentation system as ground truth. This system, shown in Figure 2.1.3, is composed of a DC linear motor that moves an indenter up and down; it is usually used for the characterization of silicone specimens, but in our case only the offset position of the indenter was of interest. The structure was fixed in place of the sensorized cube, so that the indenter mimics the user finger, and the indentation estimation algorithm was tested with different indenter heights to prove its effectiveness.
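Stated as a formula, with d_0 denoting the rest distance from the flange to the fabric and d_1, d_2, d_3 the three sensed distances, the estimate used in this phase is simply

indentation = d_0 − (d_1 + d_2 + d_3) / 3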
Figure 2.1.3: The system developed by the E. Piaggio research center used to verify the finger indentation estimation.
However, the indentation estimation can be improved by merging it with the information provided by the area detection algorithm, such as area and finger position: this change in the indentation evaluation is done on the final device using the equation of the plane passing through the sensor points (more details are analyzed in section 4.1.2).
Part II
FYD Touchpad system architecture
Chapter 3

Hardware design and implementation

The device presented in this thesis is an advancement of an already existing fabric yielding display (the F.Y.D. 2 device) engineered by Research Center E. Piaggio of the University of Pisa [6]. The novel device will hereinafter be called FYD Touchpad, since its predecessor's name was FYD-2 and because its touch area has the same dimensions as a typical laptop touchpad.
The working principle of the FYD Touchpad is the same as that of the FYD-2, visible in Figure 3.1.1, with the addition of motor vibration to perform digital texture rendering and enable a dynamic movement of the finger on the fabric: this principle is explained in section 3.1.
These are the major differences with respect to FYD-2:

- it has a bigger layout, with a total size of 15 cm (length) x 12 cm (width) x 14 cm (height);
- the area of the fabric where the finger can move has dimensions similar to those of a notebook touchpad (10 cm x 7 cm);
- the motor case has been modified to improve fabric unrolling when the motors move and vibrate;
- the new device has a compact design (all hardware parts are enclosed within a shell, as well as the electronics to read the analog distance sensors, the load cell to read the force normal to the fabric, and the board to drive the motors and to perform finger detection).
3.1 Device Overview and working principle
The FYD Touchpad haptic device is a fabric yielding display, analogous to its predecessor FYD-2 (see Figure 3.1.1).
This class of displays exploits a bi-elastic fabric, i.e. a fabric which exhibits properties that render it elastic in at least two substantially perpendicular directions. By changing the elasticity of the fabric, users are able to feel different levels of softness by touching a deformable surface.
Figure 3.1.1: The FYD-2, an overview (a). A finger interacting with the display. For the sake of clarity the FYD-2 is shown without (b) and with the cover (c).
This kind of device conveys tactile information based on variable stiffness, by stretching the bi-elastic fabric as shown in Figure 3.1.2. To do this, we use two DC motors controlled in position, around which the fabric is stretched or released depending on the motors' turning direction.
Figure 3.1.2: F.Y.D. device working principle.
Moreover, the device has also been designed to acquire and estimate information related to forces, indentations and contact areas; the latter can be measured via an optical system. More details on the FYD and FYD-2 can be found in [23] and [6].
By changing the firmware, it is possible to enable this haptic interface to digitally render texture together with softness information. The main idea is to add a vibration of the fabric, with a control strategy additional to the classic position control used to regulate the stretching state of the fabric.
Figure 3.1.3: Haptic touchpad interface: main idea.
Figures 3.1.4, 3.1.5 and 3.1.6 show an exploded view of the new FYD Touchpad device. It is made up of an internal case to which all the electronics are attached, with three external shells: a front cover with a hole grid where a fan is placed for cooling, a rear cover with holes for the HDMI, Ethernet, USB and power connections, and an upper cover to prevent the operator from touching the DC motors and from moving the finger outside the usable area.
Figure 3.1.4: Exploded drawing view (overview).
The device is able to detect finger position through an object detection algorithm performed on a pcDuino board (mounted on the rear of the case, see Figure 3.1.6) using a Microsoft® LifeCam HD-3000 web camera placed on the FYD Touchpad flange. It can also detect finger indentation, that is the difference between the distance from the flange to the fabric indented by the finger and that from the flange to the surface of the fabric when no contact occurs. To do this, three contact-less infrared Sharp® analog distance sensors (GP2Y0A41SK0F) with detection range 4-30 cm have been assembled on the flange of the device in a triangular configuration, as explained in section 4.1.2.
The DC motors are driven by a custom-made electronic board from QB Robotics®, with the motor drivers and a Cypress® microprocessor assembled on the board. The device is also endowed with a load cell (Micro Load Cell (0-780g) from Phidgets®) placed at the flange of the device, to record the normal force exerted by the user finger interacting with the fabric; the electronics for data reading and communication with the pcDuino is also mounted on the FYD Touchpad case.
Another component of the system is the USB hub, attached at the flange of the device but not directly to the FYD Touchpad base, so as not to affect the readings of the load cell. It is connected to the pcDuino and hosts the USB connections from the webcam, the load cell and the QB electronics.
Figure 3.1.6: Exploded drawing view (rear).
For the sake of clarity, Figure 3.1.7 shows the real FYD Touchpad device and a typical interaction between the fabric on the display and a finger.
Figure 3.1.7: A finger interacting with the display and the FYD Touchpad device.
3.2 Hardware details
In Figure 3.2.1 we can see the hardware components used for the FYD Touchpad: a Logitech camera with a resolution of up to 1280x720 pixels and an imaging rate of up to 30 frames per second; Sharp analog contact-less infrared distance sensors with a detection range of 4-30 cm; and a Phidgets micro load cell, a force sensing module that measures forces up to 780 g, together with a Phidgets bridge board to read the normal force measurements.
Figure 3.2.1: FYD Touchpad hardware resources.
In addition to these devices, we used a pcDuino v.2 as the main computation board. It is used to perform finger area detection, indentation estimation, communication with the motor board, and communication with the robot by sending and receiving network packets. It is a mini PC platform that runs PC-like operating systems such as Ubuntu and Android, but it also has general-purpose input/output (GPIO) pins whose header interface is compatible with Arduino headers. This version of the pcDuino has a 1 GHz ARM Cortex A8 CPU, 1 GB of DRAM and 2 GB of Flash; it runs the Lubuntu 12.04 OS and has an on-board 10/100 Mbps Ethernet module and a WiFi module.
It is important to note that in the first version of the FYD Touchpad (see Figure 3.2.2) we used a Raspberry Pi ver. B with Raspicam® camera support as the main computing board. However, we chose to switch to the pcDuino because the Raspberry was too slow in communicating with the motor board, so the motors' movement was delayed with respect to the commands. All the code written to compile on the Raspberry was easily adapted to the pcDuino, since both platforms are based on Linux-like operating systems.
Furthermore, while in the first version we used a Phidgets InterfaceKit 8/8/8 Mini-Format to read the analog values from the distance sensors, in the new version the Arduino headers on the pcDuino allow a direct reading of the analog sensors. So, together with the Raspberry Pi, the PhidgetInterfaceKit was also removed; finally, since the pcDuino does not support the Raspicam, we chose a new webcam from Logitech to obtain the fabric images.
Figure 3.2.2: First prototype of the FYD Touchpad with Raspberry Pi.
A Pololu Step-Down Voltage Regulator (D15V35F5S3) provides the correct power level to all the electronic components. It is a compact switching step-down voltage regulator that takes 12 V as input and efficiently reduces it to 5 V; it can deliver up to 3.5 A continuously in typical applications. Using this voltage regulator, it is possible to have a single 12 V power connector for the FYD Touchpad, which powers the fan and the motors (they need 12 V to work properly) together with the pcDuino according to its specifications (it requires 5 V, 2 A), by connecting the step-down output to the board power input.
3.3 Motors block details
The motors used in the FYD Touchpad device are DC Maxon REmax motors with a 256:1 reduction (Maxon Motor AG, Sachseln, Switzerland). Each one is rigidly connected to its motor housing, which in turn is screwed into the case. All around the motor housing, five rolling bearings from INA® (Germany) are arranged to allow an external roller to rotate. In this manner, the motor motion is carried from a pulley placed on the motor shaft to the roller, so that giving a rotation command to a DC motor makes its roller rotate (see Figure 3.1.5). In the device, the extremities of a rectangular strip of fabric are connected to the left and right rollers. Motor positions can be controlled by processing the signals from two absolute magnetic encoders (12 bit magnetic encoders by Austria Microsystems, AS5045, with a resolution of 0.0875°), located in the rear of the device and read by a custom-made electronic board by QB Robotics® (a PSoC-based electronic board with RS485 communication protocol). A level of softness is generated by appropriately stretching the fabric using the two motors: when the first motor rotates in a counter-clockwise direction and the second motor rotates in a clockwise direction, they stretch the fabric, thus increasing its apparent stiffness. On the contrary, when the first motor rotates in a clockwise direction and the other motor rotates in a counter-clockwise direction, they relax the fabric, thus reducing its apparent stiffness (see Figure 3.1.2).
Moreover, it is important to notice that this configuration makes it possible to implement and exploit an additional movement of the fabric. Indeed, when the two motors rotate in the same direction, a translational shift can be imposed on the finger pad interacting with the fabric, and this shift can be exploited to convey texture information to users.
3.4 Fabric properties and cover design
The tissue is a bi-elastic fabric called Superbiflex Velo Teflon. It is composed of nylon (70%) and elastane/spandex (30%) and is produced by Mectex® (Erba, Como); it is elastic in at least two perpendicular directions.
It was chosen because of its good elastic properties, better than other materials such as Lycra or latex, and because it can be assumed to be an isotropic material from an elasticity point of view, as explained in [24]; in this manner its properties do not change with the direction of the stretch.
Considering the device covers, it is important to note that, to allow webcam and motor cooling, a hole grid was added to the front shell near the fan, as shown in Figure 3.4.1. This grid allows air to come in and out but prevents external light from doing the same, since light would alter the webcam image.
Figure 3.4.1: FYD Touchpad cooling system.
The upper cover has a 10 cm x 7 cm central main hole through which finger movements on the fabric are acquired. The hole frame was smoothed for safety reasons, to prevent the users from hitting the motors. Moreover (see Figure 3.4.2), on the lower side the cover outline adapts to the motor roller curvature, to avoid collisions between the rollers and the cover.
Figure 3.4.2: The lower side of FYD Touchpad upper cover.
Notice that the covers are not fixed to the FYD case, so as not to interfere with the normal force measurements from the load cell: indeed, when the operator presses on the cover, the load cell measurement must be zero, therefore all the load must be grounded.
Chapter 4
Software design and implementation
Together with the hardware, software capable of detecting the finger moving on the fabric and communicating with the teleoperated robot is needed.
An application running on an external computer controls the remote robot. On the other side, the new F.Y.D. device has the pcDuino as its main electronic board, which performs finger detection, indentation estimation, and communication with the DC motors, the load cell and the network.
In the following sections the implemented software is discussed in detail, moving from that on the device to that on the robot. The communication between the F.Y.D. and the robot is also discussed, together with a use case example.
4.1 Master haptic device: software architecture
All the software runs on a pcDuino v.2 platform. The main code is written in C++ and organized in threads, with a main thread that initializes the application, creates the network connection with the teleoperated robot, opens the sensors and the webcam, and creates the other threads, such as the finger detection and motor motion threads.
Previously, the software had been implemented on Raspberry Pi rev. B hardware; however, the pcDuino was preferred because of its higher CPU speed (1 GHz vs. 700 MHz for the Raspberry) and the possibility of integrating analog sensor reading without an external ADC, thanks to the pcDuino's Arduino-like pinout. Another main reason to change the computation board was that a limit in the Raspberry Pi's speed of communication with the QB motor control board made the whole algorithm too slow to perform good haptic texture rendering.
In the following lines, an outline of the main code is reported.
    }
    close_motors();
    close_loadcell();
    kill_all_threads();
    return;
}
This code performs an initial handshake with the robot to establish the network connection as a server: it waits until an acknowledgement is received from the robot, then sends it information such as the gain of the texture rendering vibration and the initial motor reference angles. After the connection is established, the main code on the FYD starts up the motors, driving them to the rest position according to the sensed material stiffness; then the load cell is initialized and force measurements start. The three following procedures create the threads for finger area detection (see section 4.1.1), motor motion, and packet exchange (see section 4.1.5).
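The following is a hypothetical reconstruction of the main flow just described, with helper names mirroring the pseudocode fragments used in this chapter (the exact signatures are illustrative, not the actual implementation):

int main() {
    wait_for_robot_ack();              // handshake: the device acts as a server
    send_initial_parameters();         // vibration gain, initial motor references
    drive_motors_to_rest();            // according to the sensed material stiffness
    init_loadcell();                   // start force measurements
    create_finger_detection_thread();  // section 4.1.1
    create_motors_motion_thread();
    create_packet_exchange_thread();   // section 4.1.5
    while (connected) { /* supervise threads */ }
    close_motors();
    close_loadcell();
    kill_all_threads();
    return 0;
}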
4.1.1 Finger detection
The finger detection code is based on images captured from the Logitech webcam placed in the center of the FYD Touchpad flange. To work with the images, the OpenCV® computer vision library has been used.
The algorithm runs every 100 milliseconds, that is at a frequency of 10 Hz, with a camera resolution of 320x240 px at 30 frames per second and the contrast and brightness levels set to maximum. Although the webcam resolution could be higher, up to 1280x720 px, it was preferred to work with lower-resolution images, both because there was no big improvement in image quality and to have a simpler and faster algorithm.
Also in this section, the pseudo-code of the thread is reported, to show the main steps of the code without the single instructions and the OpenCV library function calls for image processing.
if (show_image)
    createimagewindow();
set_webcam_properties(WIDTH, HEIGHT, FPS);
initialize_distance_sensors_reading();

// FINGER AREA DETECTION AND INDENTATION ESTIMATE
while (connected) {
    wait_until_ready();
    cam_img = get_new_frame();
    ocv_img = get_ROI_from_image();
    thr_img = get_grayscaled_image(ocv_img);
    bin_img = threshold_pixels(thr_img);

    // Calculate area and centroid
    num_pixels = get_area(bin_img);
    get_center(bin_img, thr_img, num_pixels, &center);
    indentation_mm = read_indentation();

    if (show_image) {
        draw_contact_point_on_image(ocv_img);
        show_image_on_window();
    }
    reset_ROI();
    while (stop_flag);
}
release_webcam_resources();
exit_thread();
The part of the code before the while loop sets the webcam properties as preferred, initializes a window to show the webcam image if needed, and opens the pcDuino system files that contain the numerical sampled values from the ADC analog input channels connected to the three Sharp contact-less distance sensors. Then the thread loop waits to be synchronized and to run at 10 Hz, then it gets a new frame from the webcam and considers only a region of interest (ROI) of the image.

Figure 4.1.1: Example of finger movement on fabric detected by the webcam.

We compute the finger area on the binary image by counting the number of white pixels. Then, the pseudo-code function get_center() computes the centroid of the image, given the grayscale and binary images: it looks for the pixels of minimum and maximum intensity (value_min and value_max), where value_min corresponds to the darkest point of the image, i.e. the closest to the webcam, following the assumption that the finger in contact with the fabric blocks the environment light from the webcam, resulting in darker pixels in the image. Afterwards, each white pixel in the binary image is assigned a weight depending on its distance, in terms of intensity, from value_min, on the basis of a Gaussian
distribution; that is, for each pixel the weight k is computed as

k = e^(−(x − µ)² / (2σ²))

where

µ = value_min
σ = (value_max − value_min) / 5
In this way, pixels closer to the one with intensity value_min have a higher weight relative to farther pixels. A weighted average of all the pixels in the finger image identifies the pixel corresponding to the finger centroid. The resulting value is saved in the center variable,
but only if the finger area is greater than a minimum finger area threshold, set to 150 pixels through a trial and error procedure; otherwise the variable is not updated.
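A minimal sketch of this weighted-centroid step, assuming 8-bit grayscale and binary cv::Mat images (the function name and structure are illustrative, not the actual implementation):

#include <opencv2/opencv.hpp>
#include <cmath>

// Gaussian-weighted centroid as described above; gray is the grayscale ROI,
// bin is the binary mask of candidate finger pixels.
// (Assumes at least one masked pixel and value_max > value_min.)
cv::Point2d weighted_centroid(const cv::Mat& gray, const cv::Mat& bin) {
    double vmin, vmax;
    cv::minMaxLoc(gray, &vmin, &vmax, nullptr, nullptr, bin);  // masked min/max
    const double mu = vmin;                    // darkest (closest) intensity
    const double sigma = (vmax - vmin) / 5.0;  // spread of the Gaussian weights
    double wsum = 0.0, cx = 0.0, cy = 0.0;
    for (int r = 0; r < bin.rows; ++r)
        for (int c = 0; c < bin.cols; ++c)
            if (bin.at<uchar>(r, c)) {
                double x = gray.at<uchar>(r, c);
                double k = std::exp(-(x - mu) * (x - mu) / (2.0 * sigma * sigma));
                wsum += k;  cx += k * c;  cy += k * r;
            }
    return { cx / wsum, cy / wsum };           // centroid in pixel coordinates
}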
The information on the third spatial coordinate, i.e. the finger indentation, is computed by reading the three analog contact-less distance sensors and performing a plane-based algorithm that returns the estimated indentation at the finger centroid. For further details on how this estimation is done, see section 4.1.2.
The code ends by showing the image, if needed, and resetting the ROI. If the finger detection algorithm needs to be stopped, the thread waits until the stop_flag flag is reset.
Once the pixel of the image corresponding to the finger centroid is known, a unit conversion is performed to give the teleoperated robot a desired position in millimeters from the center, which is expressed as the (0,0) coordinate. To do this, the pixel coordinates are multiplied by a scale factor of 0.46875, which maps pixels to millimeters, taking into account that the ROI is 160 px high and the usable area is 75 mm along that direction (75/160 = 0.46875).
4.1.2 Indentation estimation
This section focuses on how it is possible to provide a good estimation of the finger indentation on the fabric. To do this, three contact-less analog distance sensors from Sharp® have been used; they output a voltage that is inversely proportional to distance, as already explained in the feasibility analysis section.
The sensors are arranged on the FYD flange as shown in Figure 4.1.2: in this configuration IR interference is minimized and the points in space they detect are always non-collinear.
Figure 4.1.2: Frame flange seen from top.
Three was chosen as the right number of sensors to work with, because the algorithm to estimate the indentation is based on finding a plane passing through the 3D points detected by the sensors and, from geometry, exactly one plane passes through three non-collinear points. Adding a fourth sensor would therefore not have improved the estimation in a significant way. Figure 4.1.3 shows in gray the plane generated from the three 3D points, placed respectively at coordinates (-22,-3), (0,27) and (22,-3) millimeters from the center of the flange.
Figure 4.1.3: Construction of the plane passing through the 3D sensor points.
From a code point of view, as usual, the pseudocode of the indentation estimation part is reported and explained.
double read_indentation(double* distance, double x_c, double y_c) {
    for (int i = 0; i < 3; i++) {
        // raw ADC sample (0-4095) -> volts -> distance in mm (5th order fit)
        sensorValue[i] = getSensorValue(i);
        volts[i] = (sensorValue[i] / 4096.0) * 3.3;
        distance[i] = 10 * (-4.6327*pow(volts[i],5) + 39.954*pow(volts[i],4)
                      - 133.8297*pow(volts[i],3) + 220.8604*pow(volts[i],2)
                      - 188.5623*volts[i] + 79.0712);
    }
    return find_distance(distance[0], distance[1], distance[2], x_c, y_c);
}

double find_distance(double z1, double z2, double z3, double x, double y) {
    // subtract the rest distances measured when no finger is touching
    double z1ind = z1 - FABRIC_OFFSET0_MM;
    double z2ind = z2 - FABRIC_OFFSET1_MM;
    double z3ind = z3 - FABRIC_OFFSET2_MM;
    // evaluate the plane through the three sensor points at (x, y)
    return (9*z1ind)/20 + z2ind/10 + (9*z3ind)/20
           - (x*z1ind)/44 + (x*z3ind)/44
           - (y*z1ind)/60 + (y*z2ind)/30 - (y*z3ind)/60;
}
The function read_indentation(), called during the finger detection procedure, gets the analog sampled values in a range from 0 to 4095; the values are then converted to volts and to distances in millimeters according to the 5th-order interpolation formula found during the feasibility analysis. The find_distance() function computes the finger centroid height from the FYD flange according to the plane equation ax + by + cz + d = 0. Because the sensors are placed at fixed and well known x and y coordinates, an indentation estimate can be found by computing z from the following equation.
z2 · (44y + 132) − 1320z − z1 · (30x + 22y − 594) + z3 · (30x − 22y + 594) = 0
The formula found in the code is slightly different because the indentation estimate is computed by subtracting from the sensed distances the distances to the fabric measured when no finger is touching.
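For completeness, these coefficients can be re-derived by imposing the plane z = a·x + b·y + c on the three sensor points (−22, −3, z1), (0, 27, z2) and (22, −3, z3), which gives

a = (z3 − z1) / 44
b = (2·z2 − z1 − z3) / 60
c = (9·z1 + 2·z2 + 9·z3) / 20

so that z(x, y) = a·x + b·y + c is exactly the expression returned by find_distance().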
4.1.3 Normal force computation
The force normal to the fabric is sensed by a Phidgets® load cell that, acting as a transducer, converts a force into an electrical signal. The resulting signal is then transformed into a numeric digital value proportional to the applied force. The electronic interface is the one shown in Figure 4.1.4; it is assembled on the side of the FYD Touchpad case.
Figure 4.1.4: Phidgets® Bridge 4-Input electronic board.
To work with values expressed in Newtons, a calibration phase is needed to find the correct transformation; it was done by putting known weights on the fabric and comparing the sensed values with the weight values. The calibration procedure led to the final formula
force (N) = (1004.8805 · value − 20.2512) / 1000 · g

where value is the raw reading and g is the gravitational acceleration.
Reading the normal force is useful to know when the finger is touching the fabric and to implement a dynamic stiffness control, as explained in section 4.1.4.
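As a sketch, the calibration above maps directly to a small conversion helper (the function name is illustrative; the affine term is assumed to yield grams, consistently with the 0-780 g load cell):

// Convert the raw bridge reading to Newtons using the calibration above.
double raw_to_newtons(double value) {
    const double g = 9.81;                        // gravitational acceleration, m/s^2
    double grams = 1004.8805 * value - 20.2512;   // calibrated weight in grams
    return grams / 1000.0 * g;                    // grams -> kg -> Newtons
}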
4.1.4 Stiffness control
Material softness is reproduced on the FYD Touchpad by stretching or relaxing the fabric, thereby changing the compliance of the tissue. To compute and control the softness of the fabric, it is necessary to know the finger force and indentation. Here, an algorithm to compute and control softness on the fabric of the device is studied and implemented.
// Move motors according to robot motion if enabled
wait_until_ready();
if (dynamic_stiffness){   // flag name illustrative: dynamic (continuous) branch
    stiffness_error = robot_stiffness - stiffness;
    if (stiffness > 0.1 && fabs(stiffness_error) > 0.01){
        new_motors_ref = PID(stiffness_error) + rest_ref;
        set_motors_ref(new_motors_ref);
    }
}
else {
    // Static control: the reference is set once from the robot stiffness
    if (first_stiffness && robot_stiffness){
        motor_angle = 314.5*(robot_stiffness*robot_stiffness)
                      - 107*robot_stiffness - 0.3552;
        ref0 = ANGLE_SF*motor_angle;
        set_motors_ref(ref0);
        first_stiffness = false;
    }
}
The FYD Touchpad display has some limits in rendering tissue softness, because of the length of the tissue and the maximum movement the motors can perform; in practice, the device can reproduce softness values in the range 0.3–0.9 N/mm.
The pseudocode reported above shows that the motor reference is computed differently depending on whether the stiffness control is dynamic (that is, continuous) or static.
In the case of dynamic stiffness control, at every loop a new stiffness estimate is computed, once force and indentation are known, as

stiffness = force / indentation   [N/mm]
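As an illustrative example (the numbers are ours, chosen inside the renderable range): a normal force of 3 N measured at an indentation of 5 mm gives stiffness = 3/5 = 0.6 N/mm.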
Then the stiffness error between the estimate and the value sent by the robot is computed. This error is the input of a classic PID controller, which outputs the new motor references. Although the implemented version of the PID controller has only the proportional and integral components, a derivative action can also be used. The gains have been set using a trial-and-error procedure, observing the angle of the motor shafts as the position reference changes.
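A minimal sketch of such a PI controller (KP, KI and the loop period DT are illustrative names and tuning constants of ours, not the thesis gains):

double PID(double error){
    // PI law: the derivative term is intentionally absent,
    // as in the implemented version
    static double integral = 0.0;
    integral += error * DT;            // accumulate the integral term
    return KP * error + KI * integral;
}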
On the contrary, static stiffness control sets the motor reference proportionally to the angle the motor shafts have to reach, following the formula

motor_ref = angle_SF · (314.5 · robot_stiffness² − 107 · robot_stiffness − 0.3552)

where angle_SF = 46.667 is the scale factor between the real angle and the encoder measurement, because 4200 encoder ticks correspond to a 90-degree motor angle, that is 4200/90 = 46.667.
The coefficients in the formula have been found starting from indentation-force samples at different motor angles, as Figure 4.1.5 shows. From each set of samples, a stiffness coefficient is computed as the angular coefficient of the first-order interpolated line. Once stiffness values and angles are known, a second-order interpolation gives the final coefficients that map the robot_stiffness value into motor_ref.
[Figure 4.1.5, four panels of force (N) versus indentation (mm) samples: motor angle 20°, stiffness 0.501; motor angle 45°, stiffness 0.550; motor angle 60°, stiffness 0.673; motor angle 90°, stiffness 0.715.]
Figure 4.1.5: Indentation-force curves at different motor angles: data are in blue, the stiffness line is in red.
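A first-order least-squares fit of this kind can be sketched as follows (the function and array names are ours; the thesis does not report this routine explicitly):

double stiffness_from_samples(const double* indent_mm,
                              const double* force_N, int n){
    // Slope of the least-squares line through (indentation, force)
    // samples, taken as the stiffness coefficient (N/mm) for one angle
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (int i = 0; i < n; i++){
        sx  += indent_mm[i];
        sy  += force_N[i];
        sxx += indent_mm[i] * indent_mm[i];
        sxy += indent_mm[i] * force_N[i];
    }
    return (n * sxy - sx * sy) / (n * sxx - sx * sx);
}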
rendering. Indeed, when an object or a finger moves on a surface, high-frequency accelerations due to contact arise; it is possible to exploit this information to generate a signal representative of the sensed accelerations.
In summary, the accelerations are filtered with a second-order high-pass discrete-time filter, whose Bode diagram is visible in Figure 4.1.7. The filter attenuates the signal at low frequencies, which are typical of the motion on the surface and not relevant for sensation analysis, and preserves the high-frequency accelerations typical of the sensed texture. Then, the three filtered components of the acceleration vector are fused into a single data vector according to the chosen rendering algorithm; this new signal is used to generate the sequence of PWM (Pulse Width Modulation) signals characterizing the vibrations to impose on the FYD Touchpad motors. The PWM due to vibration is then added to the one given by the motor position control, which is generated from the position error between the reference and the effective position (read using an encoder on the motor).
Figure 4.1.6: Simple control block scheme for haptic rendering and FYD Touchpad position control.
To allow the FYD Touchpad to work with the accelerations perceived on the robot, a communication through network packets has been implemented. Moreover, to send the vibration information elaborated on the pcDuino to the motors board, a modification of the QB Robotics® firmware, written for a Cypress PSoC (Programmable System on Chip) microcontroller, has also been necessary. In particular, a new packet on the serial communication between the electronic board and the pcDuino has been defined; this packet is used for vibration control and is similar to the one used to set the motor inputs, but it contains the PWM due to vibrations. This serial command is accessible either through the external qbmove software (executing, for example, qbmove -x 50,50 to set the PWM vibration duty cycle to 50% on both motors), or inside the firmware with the CMD_SET_VIBRATION code (for further details see the QB board documentation). This new command lets the firmware know the duty cycle of the PWM wave due to vibration. This quantity is summed to the one derived from the internal position control, already implemented on the QB board to maintain the motors at the reference position; that is, the PWM duty cycle input is generated as
input_PWM = input_POSITION + input_VIBRATION
However, this quantity is saturated at 100, which represents the maximum duty cycle value. The sign of the input_PWM quantity gives the motor turning direction.
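A minimal sketch of this combination step inside the firmware loop (the variable names are illustrative, not the actual QB firmware symbols):

// Combine position-control and vibration duty cycles, then saturate;
// the sign of the result selects the motor turning direction
int input_pwm = input_position + input_vibration;
if (input_pwm >  100) input_pwm =  100;
if (input_pwm < -100) input_pwm = -100;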
We now focus on the pcDuino software elaboration of the accelerations received from the robot, and on how these data can be used to perform texture rendering and vibration control. On the FYD Touchpad side, at every execution of the network packets thread started in the main code, we compute the new PWM contribution due to vibration, as the pseudocode below reports.
// Sensation rendering algorithm evaluation using accX, accY, accZ data
if (!dft321)
    pwm_vib = compute_PWM(accX, accY, accZ);
else
    pwm_vib = compute_PWM_DFT321(accX, accY, accZ);

// Send PWM to motors through USB using QB libraries
pwm_vib_value = k_vib*pwm_vib;
if (!accX && !accY && !accZ)
    send_PWM(0);
else
    send_PWM(pwm_vib_value);
The functions called in the code differ according to the preferred rendering method. The firmware implements the method based on the sum of the acceleration vector components (SoC) and the DFT321 method, where DFT321 stands for "Discrete-time Fourier Transform: from 3 dimensions to 1 dimension"; these algorithms are explained later in this section. The dft321 flag is a boolean variable that, when set to true, selects DFT321 rendering, and otherwise the SoC method. Whatever rendering method is chosen, the pwm_vib variable contains the PWM duty cycle value to send to the motors board; this value can optionally be multiplied by the gain k_vib to increase or decrease the intensity of the vibrations.
for accelerations in the range 0g to +2g (the maximum acceleration the used accelerometers can sense) and negative values for accelerations between -2g and 0g. Once all these data are available, they are sent through the network to the pcDuino on the FYD Touchpad, where data filtering is performed.
Each digitally filtered value is computed with the classic difference equation

y(n) = b0·x(n) + b1·x(n−1) + b2·x(n−2) − a1·y(n−1) − a2·y(n−2)

with

a0 = 1, a1 = −1.372, a2 = 0.4998
b0 = −0.7162, b1 = 1.433, b2 = −0.7163
where the vector x stores the current and previous inputs, i.e. the accelerations, while the vector y stores the outputs of the filter.
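For concreteness, one possible per-axis implementation of this difference equation (a direct-form sketch of ours, with the coefficients above hard-coded):

static const double A1 = -1.372,  A2 = 0.4998;
static const double B0 = -0.7162, B1 = 1.433, B2 = -0.7163;

// Second-order high-pass step for one acceleration axis;
// x[] and y[] hold the two previous inputs/outputs of that axis
double highpass_step(double x0, double x[2], double y[2]){
    double y0 = B0*x0 + B1*x[0] + B2*x[1] - A1*y[0] - A2*y[1];
    x[1] = x[0]; x[0] = x0;   // shift the input history
    y[1] = y[0]; y[0] = y0;   // shift the output history
    return y0;
}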
Figure 4.1.7: Bode diagram (magnitude in dB and phase in degrees versus frequency in Hz) of the second-order high-pass digital filter used on the accelerations sensed by the teleoperated robot.
In the following code, an implementation of the discrete-time version of the SoC and DFT321 algorithms is shown, together with the part in which these two different one-dimensional synthesized signals are used to extract the correct PWM duty cycle values. In particular, the functions vibration_PWM() and vibration_PWM_DFT321() convert the input value into a PWM value that lies within the maximum allowable limits set by the MAX_ALLOWED_VIB variable and that is null when its absolute value is smaller than DEADZONE_PWM_VIB. In normal applications, these variables have default values of 25 for DEADZONE_PWM_VIB and 100 for MAX_ALLOWED_VIB: this choice means that the whole PWM period can be reserved for vibrations. If preferred, one can set MAX_ALLOWED_VIB to a smaller value to impose an upper limit on the PWM duty cycle due to vibration control. More details on the DFT321 implementation can be found in section 5.3.

int compute_PWM(int accX, int accY, int accZ){
    acc_data[0] = accX;
    acc_data[1] = accY;
    acc_data[2] = accZ;
    filter_data(acc_data, filtered_data);
    // SoC: sum of the three filtered acceleration components
    AS_time = filtered_data[0] + filtered_data[1] + filtered_data[2];
    return vibration_PWM(AS_time);
}
int vibration_PWM(float input_value){
    /* Scales the synthesized signal to a PWM duty cycle; analogous to
       vibration_PWM_DFT321() below, using the MAX_ALLOWED_VIB limit
       and the DEADZONE_PWM_VIB dead zone */
}
int compute_PWM_DFT321(int accX, int accY, int accZ){
    static int current_idx = 0;
    // Buffer the incoming samples until a window of W_SAMPLES is full
    stored_acc[current_idx][0] = accX;
    stored_acc[current_idx][1] = accY;
    stored_acc[current_idx][2] = accZ;
    if (current_idx == W_SAMPLES){
        // High-pass filter every sample of the window
        for (int j = 0; j < W_SAMPLES; j++){
            tmp[0] = stored_acc[j][0];
            tmp[1] = stored_acc[j][1];
            tmp[2] = stored_acc[j][2];
            filter_data(tmp, filtered_data);
            filtered_stored_acc[j][0] = filtered_data[0];
            filtered_stored_acc[j][1] = filtered_data[1];
            filtered_stored_acc[j][2] = filtered_data[2];
        }
        // DFT of each filtered axis
        for (int j = 0; j < 3; j++){
            DFT_computation( filtered_stored_acc[j] );
        }
        // Combine the three spectra: magnitude from the summed squared
        // moduli, phase from the complex sum
        for (int i = 0; i < W_SAMPLES; i++){
            sum_real = 0; sum_imag = 0; sum_sq_mod = 0;
            for (int j = 0; j < 3; j++){
                sum_real += dft_real[i][j];
                sum_imag += dft_imag[i][j];
                sum_sq_mod += dft_real[i][j]*dft_real[i][j]
                            + dft_imag[i][j]*dft_imag[i][j];
            }
            abs_AS[i] = sqrt(sum_sq_mod);
            phases[i] = atan2(sum_imag, sum_real);
        }
        // Rebuild a single complex spectrum with that magnitude and phase
        for (int i = 0; i < W_SAMPLES; i++){
            dft_real[i][0] = abs_AS[i]*cos(phases[i]);
            dft_imag[i][0] = abs_AS[i]*sin(phases[i]);
        }
        AS_time = IDFT_computation(dft_real[0], dft_imag[0]);
        current_idx = 0;
    }
    else
        current_idx++;
    return get_PWM_DFT321(AS_time);
}
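In formulas, the spectral synthesis performed in the window above computes, for each frequency bin ω (our notation):

|A_s(ω)| = sqrt( |A_x(ω)|² + |A_y(ω)|² + |A_z(ω)|² ),   ∠A_s(ω) = ∠( A_x(ω) + A_y(ω) + A_z(ω) )

that is, the magnitude preserves the spectral energy of the three axes, while the phase is taken from their complex sum; the inverse DFT of this single spectrum gives the one-dimensional signal AS_time.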
int vibration_PWM_DFT321(float input_value){
    // Scale the synthesized signal to the allowed PWM range
    int pwm_value = floor((input_value/MAX_AS_VALUE_DFT321) * MAX_ALLOWED_VIB);
    // Saturate to the maximum allowed vibration duty cycle
    if (pwm_value < -MAX_ALLOWED_VIB)
        pwm_value = -MAX_ALLOWED_VIB;
    else if (pwm_value > MAX_ALLOWED_VIB)
        pwm_value = MAX_ALLOWED_VIB;
    // Apply a symmetric dead zone around zero
    if (pwm_value > -DEADZONE_PWM_VIB_DFT321 &&
        pwm_value <  DEADZONE_PWM_VIB_DFT321)
        pwm_value = 0;
    return pwm_value;
}
int get_PWM_DFT321(float* input_value){
    static int current_idx = 0;
    if (current_idx > W_SAMPLES) {
        /* This part of the code represents the motor PWM commands
           during the DFT321 algorithm computation time */
        out = 0.0;
        current_idx = 0;
    }
    else
        out = input_value[current_idx];
    current_idx++;
    return vibration_PWM_DFT321(out);
}
4.2 Slave robot: software architecture
In this section, the robot-side software is explained and discussed. This software runs on a PC that is physically linked to the robot, either with a wired connection or through a network. The software can be divided into a few main steps; each of them is analyzed below with the help of very simplified C++-like pseudocode. Moreover, the reported code refers to a generic robot, to show the generality of the application. For the specific implementation on a 7-degree-of-freedom KUKA® robot, see section 4.2.1.
Estimation of material stiffness Before connecting with the FYD device, the robot performs an estimation of stiffness. The working hypothesis is that the texture to be sensed is placed under the robot end effector, so that the robot can touch it simply by decreasing the end-effector height. In the pseudocode, one can see a while() cycle in which a force reading and a stiffness estimation are performed; this cycle runs until ten stiffness measures are available, and at each iteration the robot lowers its end effector by one millimeter. The stiffness estimate is computed only if the force is greater than a threshold value, and the indentation used in the estimation is referred to the first height at which the robot computed stiffness (the indent0 variable). At the end of the while loop the estimates are averaged and the robot is moved back to the initial, higher position, while the stiffness value (in N/mm) will be sent to the FYD Touchpad once the connection is established.
float get_material_stiffness(){
    // Move KUKA downwards until 10 stiffness samples are in memory
    init_loadcell(&bridge);
    if (1){
        MeasuredPose = getRobotPosition();
        while (count < 10 && indent < 10){
            inc -= 0.001;   // lower the end effector by one millimeter
            x_ee = 0; y_ee = 0; z_ee = inc + z0;
            CommandedPose = EEtoRobot_transformation(x_ee, y_ee, z_ee);
            setRobotPosition(CommandedPose);
            get_current_force(&force);
            if (force > MIN_FORCE_THR){
                // It is a valid measure
                if (count == 0){
                    indent0 = inc;   // height of first contact
                }
                indent = fabs(inc) - fabs(indent0);
                stiffness[count] = force / (indent*1000);   // N/mm
                count++;
            }
        }
        // Average the stiffness samples, skipping the first one
        // (taken at zero indentation relative to indent0)
        for (int i = 1; i < count; i++){
            stiff += stiffness[i];
        }
        stiff /= (count - 1);
        setRobotPosition(MeasuredPose);   // back to the initial pose
    }
    else {
        stiff = 0;
    }
    return stiff;
}
Connection establishment with FYD Touchpad device The robot_init() function is called by the robot main code; its job is to get the material stiffness, as just described, then to set up the connection and start the code threads. The connection initialization performed by initialize_connection() creates the same structures and sets the same parameters for wired and wireless networks; the only significant change is the IP address, which is set according to the switch instructions. There is an initial handshake in which the robot sends an acknowledgement to the device, and the FYD Touchpad answers with some information on the QB motors board, the gain on vibration control, and the current motor references. Then, together with the threads, the accelerometers are also initialized and the rest acceleration is read.
int robot_init(int mode, const char* ip_host, double z0){
    createRobot();
    setRobotProperties();
    setRobotPosition(ZeroPose);
    stiffness = get_material_stiffness();
    switch(mode){
        case 0:
            set_wired_connection();
            break;