Machine-Vision Based Position and Orientation Sensing System for UAV Aerial Refueling

UNIVERSITÀ DI PISA
Faculty of Engineering
Laurea Specialistica in Automation Engineering

Master's thesis:
Real-Time Machine-Vision Based Position and Orientation Sensing System for UAV Aerial Refueling

Thesis work carried out at West Virginia University (Morgantown, WV, U.S.A.)

Candidate: Rocco Vincenzo Dell'Aquila

Advisors: Prof. Mario Innocenti, Prof. Lorenzo Pollini

Graduation session of 17 July 2006
Academic year 2005/2006

Dedication and acknowledgements

"And so I have reached the end of a chapter, my university years in Pisa, which brought me so many joys, and it seems only right to pause and thank those who contributed, from near and from far, to those joys. First my parents, to whom I dedicate this thesis, because in the immense love we share they are perhaps even happier than I am right now. Then my grandparents, who have never stopped letting me feel their pride: I want to tell them that I am proud to have you too, and to my grandfather and grandmother who are not here with me, well, you are in my heart. Finally, a special dedication to my little Sister, whose absence has left an unfillable void inside me but has made me grow and has always been with me, and I am sure that from up there she will rejoice and celebrate with me. A brief but heartfelt aside is deserved by my sweetly insufferable "fiorentina", who loves me so much. And how could I forget Andrea, Annamaria, Daniele, Rocco and Valerio, who made my stay in Pisa unique and unforgettable. To conclude, sincere thanks go to Prof. Innocenti, who made my dream of going to the States for this thesis possible, and to Prof. Napolitano, Giampiero, Marco and Sofia, who gave me the warmest of welcomes. Heartfelt thanks to everyone."

ABSTRACT

This thesis describes the design of a real-time Machine-Vision (MV) position and orientation sensing system for the problem of semi-autonomous docking within Aerial Refueling (AR) for Unmanned Aerial Vehicles (UAVs). MV-based algorithms are implemented within the proposed scheme to detect the relative position and orientation between the UAV and the tanker. In this effort, techniques and algorithms for the acquisition of images from a real webcam, for the Feature Extraction (FE) from the acquired image, for the Detection and Labeling (DAL) of the features, and for the tanker-UAV Pose Estimation (PE) have been developed and extensively tested in the MATLAB/Simulink® soft real-time environment and in the Linux/RTAI hard real-time environment. First, the MV block of the previous complete simulation was re-implemented with real videos and real images from a webcam instead of the Virtual Reality Toolbox® visualization. The webcam-based MV system was then ported to meet the hard real-time requirements. Additionally, a new mechanism for inter-process communication between real-time and non-real-time processes was developed by implementing the Cyclic Asynchronous Buffer (CAB) on RTAI. Finally, the entire sensing system was tested using an 800 MHz PC-104 computer (the on-board computer embedded in the YF-22 UAV models of the WVU laboratories), and the results confirmed the feasibility of executing image-processing algorithms in real time on off-the-shelf commercial hardware to obtain reliable relative position and orientation estimates.

SOMMARIO

The aim of this thesis is to describe the design of a sensing system for measuring relative position, based on Machine Vision (MV) and developed in a real-time environment, to address the problem of autonomous aerial refueling (AR) for unmanned aerial vehicles (UAVs). The MV-based algorithms are implemented within the proposed scheme to measure the relative position and orientation between the refueling aircraft (tanker) and the UAV. Within this work, algorithms and techniques were developed for image acquisition through a webcam, for the extraction of the object's characteristic features from the acquired images (feature extraction), for the detection and labeling of those features, and for the estimation of the relative tanker-UAV pose. These techniques and algorithms were tested extensively in the MATLAB/Simulink® environment in soft real time and subsequently in the Linux/RTAI environment in hard real time. Initially, the Machine Vision block of the complete simulation was implemented with real videos and with the webcam, in place of the previous visualization of a simulated world in the Virtual Reality Toolbox®. Subsequently, the webcam-based version was updated to meet hard real-time operating requirements. Moreover, a new mechanism for communication between real-time and non-real-time processes was designed and developed by implementing Cyclic Asynchronous Buffers (CAB) in RTAI. Finally, the complete system was tested using an 800 MHz PC-104 embedded computer (which is mounted on board the YF-22 UAV models built in the West Virginia University laboratories), and the results confirm the feasibility of executing the above image-processing algorithms in a real-time environment using commercially available hardware to obtain a reliable estimate of the relative position and orientation.
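The Cyclic Asynchronous Buffer mentioned above can be sketched in plain C as follows. This is a minimal single-writer, single-reader illustration of the general CAB idea (a non-blocking buffer in which the writer always finds a free slot and the reader always obtains the most recent message), not the thesis's RTAI implementation: the names `cab_init`, `cab_write`, `cab_get` and `cab_unget` are hypothetical, and a real hard real-time version would replace the plain memory operations with RTAI shared-memory and locking primitives.

```c
/* Minimal single-writer, single-reader CAB sketch (illustrative only). */
#include <string.h>
#include <stddef.h>

#define CAB_SLOTS 3   /* 1 writer + 1 reader + 1 spare slot */
#define MSG_SIZE  64

typedef struct {
    char slot[CAB_SLOTS][MSG_SIZE];
    int  use[CAB_SLOTS];   /* how many readers currently pin each slot */
    int  mrd;              /* index of the most-recent-data slot, -1 if none */
} cab_t;

void cab_init(cab_t *c) {
    memset(c, 0, sizeof *c);
    c->mrd = -1;
}

/* Writer: pick a slot that is neither pinned by a reader nor the current
 * most-recent one, fill it, then publish it. Never blocks. */
void cab_write(cab_t *c, const void *msg, size_t len) {
    int i;
    for (i = 0; i < CAB_SLOTS; ++i)
        if (c->use[i] == 0 && i != c->mrd)
            break;          /* with CAB_SLOTS slots a free one always exists */
    memcpy(c->slot[i], msg, len);
    c->mrd = i;             /* the real version publishes this atomically */
}

/* Reader: pin and return the freshest message (NULL if none yet). */
const void *cab_get(cab_t *c) {
    if (c->mrd < 0) return NULL;
    c->use[c->mrd]++;
    return c->slot[c->mrd];
}

/* Reader: release a previously obtained slot so the writer can reuse it. */
void cab_unget(cab_t *c, const void *msg) {
    int i;
    for (i = 0; i < CAB_SLOTS; ++i)
        if ((const void *)c->slot[i] == msg) {
            c->use[i]--;
            return;
        }
}
```

The key property, and the reason a CAB suits a vision sensor feeding a control loop, is that neither side ever waits: a slow reader simply skips frames, always seeing the latest pose estimate rather than a stale queue.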

INDEX

DEDICA E RINGRAZIAMENTI
ABSTRACT
INDEX
FIGURE INDEX
TABLE INDEX
1. INTRODUCTION
2. DESCRIPTION OF THE AAR PROBLEM
   2.1 The Autonomous Aerial Refueling
   2.2 The Tanker
   2.3 The Unmanned Aerial Vehicles
       2.3.1 YF-22 research UAVs designed, built, and instrumented at West Virginia University (WVU)
   2.4 Reference frames
       2.4.1 Reference Frames and Notation
       2.4.2 Geometric formulation of the AR problem
       2.4.3 Receptacle-3DW-center vector
3. DESCRIPTION OF THE SIMULATION ENVIRONMENT
   3.1 Overview
   3.2 Aircraft model
   3.3 Model of the boom
   3.4 Model of the used UAV
   3.5 Sensors
   3.6 Control
       3.6.1 UAV Docking Control Laws
       3.6.2 The Reference Path generation system
   3.7 Atmospheric turbulence and wake effect
   3.8 Actuator dynamics
4. DESCRIPTION OF THE MV SUBSYSTEM
   4.1 The pin-hole model
   4.2 The Machine Vision block
       4.2.1 Image Capture or Camera
       4.2.2 Feature Extraction
       4.2.3 Scale
       4.2.4 Rotation of markers in camera frame
   4.3 Feature Passive Markers
   4.4 Corner Detection Methods
       4.4.1 Harris
       4.4.2 SUSAN
   4.5 Labeling Algorithm
       4.5.1 Projection equations
       4.5.2 The "Points Matching" problem
       4.5.3 The Labeling algorithm
   4.6 Pose Estimation Algorithm
       4.6.1 GLSDC
       4.6.2 LHM
5. CLOSED LOOP SIMULATION EXAMPLE
   5.1 Example with VRT
   5.2 Results
6. VIDEO FILE IMPLEMENTATION
   6.1 Airplane models
   6.2 Camcorder
   6.3 Tests performed and measurements
   6.4 Easy Initial Condition Calibration Tool (EICCT)
   6.5 MATLAB/Simulink files
   6.6 Simulation results
   6.7 Video simulation with Real Time Workshop compilation and analysis
7. WEBCAM IMPLEMENTATION IN MATLAB
   7.1 Airplane models and webcam used
   7.2 MATLAB/Simulink file
   7.3 Soft Real-Time block
   7.4 An example
8. LINUX/RTAI MV-BLOCK IMPLEMENTATION
   8.1 Operating system software used
   8.2 Hardware used
   8.3 Implementations
       8.3.1 Webcam Image Acquisition Software
       8.3.2 Inter-Process Communications (IPCs) with RTAI
       8.3.3 Cyclic Asynchronous Buffer (CAB) implementation in RTAI
       8.3.4 File exchange data system
       8.3.5 SHared Memory (SHM)
       8.3.6 RTAI LinuX Real-Time (LXRT)
       8.3.7 SaveData for storing results
   8.4 Explanation of how to use it and test it
       8.4.1 Real-Time Tests
9. CONCLUSIONS AND FUTURE DEVELOPMENTS
10. REFERENCES
11. APPENDIX
    11.1 Color types and conversions
        11.1.1 RGB
        11.1.2 YUV
        11.1.3 GRAYSCALE
        11.1.4 Conversion
    11.2 New Detection And Labeling algorithms
        11.2.1 DAL 1
        11.2.2 DAL 2 (Graph definition)
        11.2.3 DAL 3
    11.3 MATLAB and C code
        11.3.1 MATLAB AAR Video and Webcam Simulations
            11.3.1.1 Camerapar.m
            11.3.1.2 InitialConditionEstimation.m
            11.3.1.3 Load and Unload Webcam Button
            11.3.1.4 sfun2webcam
        11.3.2 MATLAB/RTW and Linux/RTAI LXRT Implementation
            11.3.2.1 rt_agent.c
            11.3.2.2 sender.c
            11.3.2.3 lxrtshm2imagetpar.c
        11.3.3 MATLAB/RTW and Linux/RTAI SHM Implementation
            11.3.3.1 sender.c
            11.3.3.2 shm2imagetpar.m
        11.3.4 Saving data to files for Real-Time Applications
            11.3.4.1 savedata.c
        11.3.5 Plots, video creation from the results, and data analysis
            11.3.5.1 plot2video.m
            11.3.5.2 twins_images.m
    11.4 RTAI Installation HOWTO
        11.4.1 RTAI Installation step by step
        11.4.2 Creation of a minimal stand-alone RTAI system (for Floppy or Memory Card)

FIGURE INDEX

Fig. 1.1: Aerial refueling at sunset.
Fig. 2.1: Aerial refueling using a boom system.
Fig. 2.2: Aerial refueling using a probe-and-drogue system.
Fig. 2.3: Boom operator's window.
Fig. 2.4: Boeing KC-135 (A version), different views.
Fig. 2.5: Boeing KC-135 particular views: KC-135R (a) and KC-135E (b).
Fig. 2.6: Air Force Predator UAV.
Fig. 2.7: Global Hawk UAV.
Fig. 2.8: Pioneer UAV.
Fig. 2.9: Army's Shadow UAV.
Fig. 2.10: Army's Hunter UAV.
Fig. 2.11: WVU's YF-22 UAV models.
Fig. 2.12: WVU's UAV scheme.
Fig. 2.13: YF-22 UAVs' on-board camera views.
Fig. 2.14: Reference frames in the AAR problem.
Fig. 3.1: AAR simulation diagram.
Fig. 3.2: AAR MATLAB simulation.
Fig. 3.3: Control panel.
Fig. 3.4: Call GUI menu.
Fig. 3.5: Available plot results.
Fig. 3.6: Tanker model in the main visualization.
Fig. 3.7: Tanker model in detail.
Fig. 3.8: UAV model in the main visualization.
Fig. 3.9: UAV model in detail.
Fig. 3.10: The software block in detail.
Fig. 3.11: Visualization block.
Fig. 3.12: Images from the Virtual Reality Toolbox.
Fig. 3.13: AAR transformation matrix diagram.
Fig. 3.14: AAR transformation matrix summary.
Fig. 3.15: Aircraft tanker model in different environments.
Fig. 3.16: Tanker real and scheme model.
Fig. 3.17: Simulink model of the refueling boom.
Fig. 3.18: Model of the refueling boom.
Fig. 3.19: Boom operators at work.
Fig. 3.20: Boom operator view in a real (a) and simulated (b) refueling maneuver.
Fig. 3.21: ICE model.
Fig. 3.22: Position of the control surfaces.
Fig. 3.23: Experimental tests.
Fig. 3.24: Sensor block scheme.
Fig. 3.25: Fusion between the GPS and MV systems.
Fig. 3.26: Computation of the T^C_T matrix.
Fig. 3.27: Controller scheme.
Fig. 3.28: AAR test to search for the coefficient increment.
Fig. 3.29: Actuator dynamics.
Fig. 4.1: Geometry of image construction in a pin-hole camera.
Fig. 4.2: Geometric model of projection.
Fig. 4.3: Different kinds of distortion.
Fig. 4.4: Chessboard and clicked corners.
Fig. 4.5: Main calibration toolbox GUI.
Fig. 4.6: MV Simulink scheme.
Fig. 4.7: MV scheme for the corners.
Fig. 4.8: MV scheme for the markers.
Fig. 4.9: The scale function.
Fig. 4.10: Rotation of markers in camera frame scheme.
Fig. 4.11: Default position of the markers and identification.
Fig. 4.12: Passive markers filter Simulink block & Simulink mask parameters.
Fig. 4.13: Default position of the corners and identification.
Fig. 4.14: Harris Simulink block & Simulink mask parameters.
Fig. 4.15: Circular masks at different places on the same image.
Fig. 4.16: SUSAN Simulink block & Simulink mask parameters.
Fig. 4.17: The "MV" pin-hole model.
Fig. 4.18: Matching between the projected set of points P̂ and the detected set P.
Fig. 4.19: Example of a potential error in the labeling algorithm.
Fig. 4.20: Labeling Simulink block & Simulink mask parameters.
Fig. 4.21: GLS Simulink block & Simulink mask parameters.
Fig. 4.22: The reference frames in the pose estimation problem.
Fig. 4.23: LHM Simulink block & Simulink mask parameters.
Fig. 5.1: refueling_19_susan.mdl.
Fig. 5.2: "Call GUI" button settings.
Fig. 5.3: Virtual Reality Toolbox screen with view options (boom operator view at t = 0).
Fig. 5.4: Different views of the initial condition.
Fig. 5.5: SUSAN corner detection from the UAV camera view (t = 21 s).
Fig. 5.6: Different views of the final condition.
Fig. 5.7: Available plots.
Fig. 5.8: Receptacle-3D window distance in the rotated Earth reference frame plot.
Fig. 5.9: Machine-vision measurement error in the Earth reference frame plot.
Fig. 6.1: Big airplane model used.
Fig. 6.2: Middle airplane model used.
Fig. 6.3: Small airplane model used.
Fig. 6.4: JVC GR-DF540 camcorder.
Fig. 6.5: Photo taken during the tests.
Fig. 6.6: Default position and order of the corners (a); first image of the video with clicks for the right points (b).
Fig. 6.7: Easy First Condition Calibration Tool, first simulation with the LHM algorithm.
Fig. 6.8: Easy First Condition Calibration Tool, second simulation with the GLS algorithm.
Fig. 6.9: Results of the T^C_T matrix and projected corners in camera frame, LHM (a) and GLS (b).
Fig. 6.10: Default position and order of the corners for the middle model (a) and the small model (b).
Fig. 6.11: MV plus video Simulink file for the big model.
Fig. 6.12: Corners visualization: all found (a), detected (b) and projected (c).
Fig. 6.13: Initial point A.
Fig. 6.14: Point B.
Fig. 6.15: Final point A.
Fig. 6.16: Relative x (yellow), y (purple) and z (blue) trend.
Fig. 6.17: Relative yaw (yellow), pitch (purple) and roll (blue) trend.
Fig. 6.18: Initial condition.
Fig. 6.19: Final position.
Fig. 6.20: Relative x (yellow), y (purple) and z (blue) trend.
Fig. 6.21: Relative yaw (yellow), pitch (purple) and roll (blue) trend.
Fig. 6.22: Initial condition.
70 FIG.6.23: FINAL IMAGES............................................................................................................................................... 70 FIG.6.24: RELATIVE X (YELLOW), Y (PURPLE) AND Z (BLUE) TREND. ............................................................................... 70 FIG.6.25: RELATIVE YAW (YELLOW), PITCH (PURPLE) AND ROLL (BLUE) TREND. ............................................................ 70 FIG.6.26: TIME EXECUTION OF THE TWO COMPILED FILES. ........................................................................................... 71 FIG. 7.1: CREATIVE WEBCAM 5. .................................................................................................................................... 72 FIG. 7.2: MV BLOCK SIMULINK FILE WITH WEBCAM FOR THE MIDDLE AIRPLANE MODEL................................................ 73 FIG. 7.3: LOAD AND DELETE BUTTONS OVERVIEW. ........................................................................................................ 73 FIG. 7.4: SFUN2WEBCAM.M S-FUNCTION (LEVEL 2) SIMULINK BLOCK............................................................................. 74 FIG. 7.5: RTBLOCK OF LEONARDO DAGA...................................................................................................................... 74 FIG. 7.6: WEBCAM IMAGE (A), ALL MARKERS FOUNDED (B), DETECTED (C) AND PROJECTED (D) [36TH SEC]..................... 75 FIG. 7.7: EXAMPLE PLOTS OF X,Y,Z (A) AND YAW, PITCH AND ROLL (B)........................................................................... 75 FIG. 8.1: LINUX, FEDORA CORE AND RTAI LOGOS......................................................................................................... 77 FIG. 8.2: THE RTAI ARCHITECTURE. ............................................................................................................................. 79 FIG. 8.3: ASUS LAPTOP, DELL LAPTOP, CREATIVE WEBCAM 5 AND MS JOYSTICK ON MY DESK. 
.................................... 82 FIG. 8.4: YF-22 ON-BOARD COMPUTER........................................................................................................................ 82 FIG. 8.5: PC-104 CPU-MODULES MSI (A), ARBOR (B) AND COMPACT FLASH READER (C). ......................................... 84 FIG. 8.6: FUNCTIONING DIAGRAM. ................................................................................................................................ 84 FIG. 8.7: CAB LIKE ONE-TO-MANY COMMUNICATION CHANNEL. ..................................................................................... 88 FIG. 8.8: CAB SIMULTANEOUS ACCESS.......................................................................................................................... 88 FIG. 8.9: CAB READING. .............................................................................................................................................. 88. 6.

(8) FIG. 8.10: CAB READING. ............................................................................................................................................ 89 FIG. 8.11: CAB ACCESSING TIMELINE........................................................................................................................... 90 FIG. 8.12: CAB MEMORY DIAGRAM. .............................................................................................................................. 91 FIG. 8.13: SIMULINK SCHEME AND PARAMETERS DIAGRAM OF THE LXRTSHM2IMAGETPAR.C S-FUNCTION........................ 94 FIG. 8.14: BROKEN IMAGE (DRAMATIZATION TWO IMAGES MADE UP IN ONE, DETACHED OF 1.5 SEC)............................... 95 FIG. 8.15: ARCHITECTURE OF THE LXRT REAL TIME APPLICATION.. ............................................................................... 98 FIG. 8.16: SIMULINK SCHEME AND PARAMETERS DIAGRAM OF THE LXRTSHM2IMAGETPAR.C S-FUNCTION...................... 100 FIG. 8.17: SIMULINK SCHEME AND PARAMETERS DIAGRAM OF SAVEDATA FUNCTION.................................................... 102 FIG. 8.18: VIEW OF THE FOLDERS OF THE PROJECT..................................................................................................... 102 FIG. 8.19: FIRST STEP: INITIAL CONDITION CALCULATION............................................................................................ 103 FIG. 8.20: MACHINE VISION SIMULATION FIXED TO RUN ON RTAI................................................................................ 103 FIG. 8.21: SECOND STEP: BUILD MENU WITH THE MATLAB COMMAND LINE ON THE BACKGROUND. .............................. 104 FIG. 8.22: RTAI-LAB WITH SCOPES AND METER VIEW.................................................................................................. 105 FIG. 8.23: VIDEO FRAME AND PLOTS DRAWN IN MATLAB FROM RTAI RESTORED DATA .................................................. 106 FIG. 
8.24: STOREDATACORNER SIMULINK BLOCK........................................................................................................ 106 FIG. 8.25: THE 8 STEPS TO RUN THE APPLICATION. ...................................................................................................... 107 FIG. 8.26: VIDEO RESTORED FROM THE RTAI APPLICATION: IMAGES AT 7.4 SEC (A) AND AT 29.9 SEC (B). ..................... 108 FIG. 8.27: EVOLVING OF X, Y AND Z COORDINATES PLOT (30 SEC IN METERS)................................................................ 108 FIG. 8.28: EVOLVING OF ROLL, PITCH AND YAW ANGLES PLOT (30 SEC IN RAD). .......................................................... 108 FIG. 8.29: NUMBER OF DETECTED CORNERS FROM DAL ALGORITHM. .......................................................................... 108 FIG. 8.30: VIDEO RESTORED FROM THE RTAI APPLICATION: IMAGES AT 5.3 SEC (A) AND AT 17.1 SEC (B). ..................... 109 FIG. 8.31: EVOLVING OF X, Y AND Z COORDINATES PLOT (30 SEC IN METERS)................................................................ 109 FIG. 8.32 : EVOLVING OF ROLL, PITCH AND YAW ANGLES PLOT (30 SEC IN RAD)........................................................... 109 FIG. 8.33 : NUMBER OF DETECTED CORNERS FROM DAL ALGORITHM. ......................................................................... 110 FIG.A.1: THE RGB IS AN ADDITIVE COLOUR MODEL IN WHICH RED, GREEN AND LIGHT BLUE ARE COMBINED................. 116 FIG.A.2: THE RGB COLOUR MODEL 3D CUBE. ............................................................................................................ 116 FIG.A.3: RGB 24-BIT COLOUR SYSTEM SCALE.............................................................................................................. 117 FIG.A.4: EXAMPLE OF U-V COLOR PLANE WITH NORMALIZED Y=0.5............................................................................ 
117 FIG.A.5: EXAMPLE OF Y-U-V DECOMPOSITION OF AN IMAGE....................................................................................... 118 FIG.A.6: GRAYSCALE. ................................................................................................................................................. 119 FIG.A.7: GRAY IMAGE................................................................................................................................................. 119 FIG.A.8: DAL1 SIMULINK BLOCK AND MASK................................................................................................................ 122 FIG.A.9: DAL2 SIMULINK BLOCK AND MASK................................................................................................................ 123 FIG.A.10: DAL3 EXAMPLE OF FUNCTIONING. ............................................................................................................. 124 FIG.A.11: DAL3 SIMULINK BLOCK AND MASK.............................................................................................................. 124. 7.

TABLE INDEX

TAB. 3.1: DENAVIT-HARTENBERG BOOM PARAMETERS
TAB. 4.1: DATA USED IN THE LABELING FUNCTION
TAB. 8.1: RESULTS OF 25 EXPERIMENTS ON A 30 SEC SIMULATION RUN
TAB. A.1: HEURISTIC FOR DETERMINING A SUB-OPTIMAL SOLUTION

1. INTRODUCTION

This thesis, developed at West Virginia University (WVU), Morgantown, WV, U.S.A., addresses the lack of Aerial Refueling (AR) capabilities of Unmanned Aerial Vehicles (UAVs). In recent years, several UAVs have been designed and built at the WVU laboratories and, in parallel, an entire simulated world has been developed with the Matlab/Simulink/Virtual Reality Toolbox® tools, containing a complete simulated solution of the Aerial Refueling docking problem based on a Machine Vision (MV) approach used in addition or as an alternative to conventional GPS technology; the key issue is the need for an accurate measurement of the tanker-UAV relative position and orientation.

Fig. 1.1: Aerial refueling at sunset

This effort to replace the pilot's eyes for the UAV during AR needed to be tested with real images and videos, implemented in a hard real-time environment, and finally ported to an embedded platform to be tested on the UAV models in the near future. These requirements were the purpose of this entire study. The core of the project is the Machine Vision block, also because all the other controls are simple to implement with the Simulink® tools.

The focus of this thesis is the development and implementation of a Real-Time Machine-Vision based Relative Position and Orientation Sensor (RPOS) between the UAV and the tanker through the use of a real camera. The sensor consists of a webcam connected to a computer via a Universal Serial Bus (USB) connection. The computer hosts a Real-Time Operating System (RTOS) executing several Real-Time (RT) and Non-Real-Time (NRT) processes. Specifically, an image acquisition algorithm first captures a stream of images from the camera; these images are then processed by a Feature Extraction (FE) algorithm. Next, the feature coordinates are processed by a Detection and Labeling (DAL) algorithm followed by a Pose Estimation (PE) algorithm, which finally provides estimates of the relative position and orientation of the tanker.

The position sensor was initially tested within a Matlab-based soft real-time environment, in which the MV block of the previous complete simulation was replaced with real videos and real images from a webcam, and later implemented using a hard real-time Linux/RTAI© setup. A novel shared-memory based inter-process communication mechanism called Cyclic Asynchronous Buffer (CAB) was implemented and used within this effort. Finally, the testing was performed with an 800 MHz PC-104 computer, used as the On-Board Computer for a WVU YF-22 UAV research aircraft.

The thesis is organized as follows. The MV-based AR problem is briefly described in the initial section. Second, the entire Simulink-based AR simulation is illustrated, explaining the modeling performed and the controls implemented. Then the MV position sensing system is outlined through the documentation of the camera modeling, corner detection, labeling, and pose estimation algorithms. Furthermore, an example of the functioning of the entire simulation is reported. Next follows a step-by-step description of the work, studies, and tests performed for the implementation of the MV block in the Simulink® environment, first with real videos and then with soft real-time webcam images. Additionally, the hard real-time Linux/RTAI based implementation, with a new inter-process communication mechanism implementing the Cyclic Asynchronous Buffer (CAB), is illustrated, with information concerning the software developed and the hardware used. Finally, experimental results obtained on the PC-104 based On-Board Computer embedded on the YF-22 UAV models of the WVU laboratories are presented and discussed. The purpose of this thesis is also to provide a complete overview of the entire AR-for-UAV simulation environment developed at WVU, to be used as a guide to understanding the whole AR problem and its solution.

2. DESCRIPTION OF THE AAR PROBLEM

2.1 The Autonomous Aerial Refueling

The strategic and tactical importance of Unmanned Aerial Vehicles (UAVs) for civil and military purposes has grown in recent years. The deployment of UAVs has been tested in current and recent overseas conflicts. Reducing costs and the risk of human casualties is one immediate advantage of UAVs. An additional advantage is the possibility of avoiding troop deployment in enemy territory for dangerous rescue missions, as is currently done with manned missions. It is envisioned that formations of UAVs will perform not only intelligence and reconnaissance missions but also provide close air support, precision strike, and suppression of enemy air defenses. One of the biggest current limitations of deployed military UAVs is their limited range. In fact, today's UAVs are not capable of overseas flight and need to be flown by ground troops deployed at limited distances from a combat scenario. Furthermore, terrain and weather factors can also determine how close to the targets the UAVs can be launched. Therefore, the acquisition of Autonomous Aerial Refueling (AAR) capabilities for UAVs is a critical goal. To achieve these capabilities, specific technical challenges need to be overcome.

Fig. 2.1: Aerial refueling using a boom system

Currently, there are two types of hardware set-ups used for aerial refueling. The first method is used by the US Air Force and is based on a refueling boom (Fig. 2.1); the second method is used by the US Navy as well as the armed forces of other NATO nations and is based on a "probe and drogue" setup, consisting of a flexible refueling hose with a flexible basket at its end (Fig. 2.2).

Fig. 2.2: Aerial refueling using a probe and drogue system

In recent years, the AAR problem has attracted the attention of many researchers [1][2][3]. In this effort a key issue is represented by the need for a high-accuracy measurement of the tanker-UAV relative distance and attitude in the final phase of docking and during the refueling. The use of MV technology has been proposed in addition or as an alternative to more conventional GPS technology. In particular, an MV-based system has been proposed for close proximity operations of aerospace vehicles [4] and for the navigation of UAVs [5]. For the "probe and drogue" refueling system an MV-based system has been proposed in Refs. [1][2][3]. Within these studies, a fixed or variable number of visible optical markers is assumed to be available. On the other hand, temporary loss of visibility might occur due to hardware failures and/or physical interference between the UAV on-board camera and the markers, caused by the refueling boom itself and/or different structural components of the tanker, or simply because the markers exit the visual range of the on-board camera.

In this effort the AAR problem is addressed for the "refueling boom" method. In this case, the objective is to guide the UAV to a defined 3-D Window (3DW) below the tanker where the boom operator can then manually proceed to the docking of the refueling boom with the UAV fuel receptacle, followed by the actual refueling phase (Fig. 2.3).

Fig. 2.3: Boom operator's window, views (a) and (b)

The MV estimation algorithms studied are capable of handling temporary loss of visibility of the markers. A detailed simulation environment has been designed, providing accurate modeling of the drogue flexibility, the wake effects of the tanker on the UAV, the atmospheric turbulence, the UAV trajectory constraints, as well as the GPS and MV measurement errors.
A specific docking control scheme featuring a fusion of GPS and MV distance measurements is proposed in this thesis, but the real focus of this thesis is the development and testing of the Machine Vision block with real videos and real images acquired from a real camera, first in a soft real-time environment and finally in a hard real-time environment running on on-board embedded hardware.

2.2 The Tanker

The tanker used in the AAR simulation is the Boeing KC-135R Stratotanker, which is among the most common aircraft used for this purpose. The KC-135 Stratotanker's primary mission is to refuel long-range aircraft. It also provides aerial refueling support to Air Force, Navy, Marine Corps and allied aircraft. A special shuttlecock-shaped drogue, attached to and trailed behind the flying boom, is used to refuel aircraft fitted with probes. An operator stationed in the rear of the plane controls the boom. The KC-135 tanker fleet made an invaluable contribution to the success of Operation Desert Storm in

the Persian Gulf, flying around-the-clock missions to maintain operability of allied planes. The KC-135s form the backbone of the Air Force tanker fleet, meeting the aerial refueling requirements of bomber, fighter, cargo and reconnaissance forces, as well as the needs of the Navy, Marines and allied nations.

Fig. 2.4: Boeing KC-135 (A version), different views (a) and (b)

Because the KC-135A's original engines are of 1950s technology, they do not meet modern standards of increased fuel efficiency, reduced pollution and reduced noise levels. By installing new CFM56 engines, performance is enhanced and fuel off-load capability is dramatically improved. In fact, the modification is so successful that two re-engined KC-135Rs can do the work of three KC-135As. This improvement is a result of the KC-135R's lower fuel consumption and increased performance, which allow the tanker to take off with more fuel and carry it farther. Since the airplane can carry more fuel and burn less of it during a mission, it is possible to transfer a much greater amount to receiver aircraft. Re-engining with the CFM56 engines also results in significant noise reductions.

Fig. 2.5: Boeing KC-135 particular views: KC-135R (a) and KC-135E (b)

Boeing has completed work on a program to re-engine all KC-135As in the Air Force Reserve and Air National Guard fleet, a total of 161 airplanes, which were modified with refurbished JT3D engines taken from used commercial 707 airliners. After modification, the airplanes are designated KC-135Es. This upgrade, like the KC-135R program, boosts performance while decreasing noise and smoke pollution levels. The modified KC-135E provides 30 percent more powerful engines with a noise reduction of 85 percent.

The last "special" modification of the Boeing KC-135 turbojet transport is the "Comet", designation NASA 930, operated by the Johnson Space Center's Reduced Gravity Office. By flying a series of "roller-coaster" parabolic maneuvers, short periods of reduced gravity are experienced on board. While most flights are dedicated to zero-g astronaut training and equipment tests, other missions were flown simulating the one-sixth gravity environment of the lunar surface. Obviously, this last special type of KC-135 is not suitable for the AR problem.

Fig. 2.3: Boeing KC-135 930 NASA version, views (a) and (b)

2.3 The Unmanned Aerial Vehicles

Unmanned Aerial Vehicles (UAVs) have been referred to in many ways: RPVs (Remotely Piloted Vehicles), drones, robot planes, and pilotless aircraft are a few such names. Most often called UAVs, they are defined by the Department of Defense (DOD) as "powered, aerial vehicles that do not carry a human operator, use aerodynamic forces to provide vehicle lift, can fly autonomously or be piloted remotely, can be expendable or recoverable, and can carry a lethal or non-lethal payload". Ballistic or semi-ballistic vehicles, cruise missiles, and artillery projectiles are not considered UAVs by the DOD definition. UAVs differ from RPVs in that some UAVs can fly autonomously. UAVs are described either as a single air vehicle (with associated surveillance sensors), or as a UAV system, which usually consists of three to six air vehicles, a ground control station, and support equipment. UAVs are thought to offer two main advantages over manned aircraft: they are arguably cheaper to produce, and they eliminate the risk to the pilot's life. UAVs protect the lives of pilots by performing the "3-D" (dull, dirty, and dangerous) missions. Furthermore, for those missions which require a very small aircraft, only a UAV can be deployed because there is no equivalent manned system small enough for the job. There are a number of reasons why UAVs have only recently been given a higher priority. Technology is now available that was not available just a few years ago, including advanced video surveillance and sensing systems that can be mounted on UAVs. UAVs range from the size of an insect to that of a commercial airliner. The DOD currently possesses five major UAVs: the Air Force's Predator and Global Hawk, the Navy and Marine Corps's Pioneer, and the Army's Shadow and Hunter.

Fig. 2.6: Air Force Predator UAV
Fig. 2.7: Air Force Global Hawk UAV
Fig. 2.8: Pioneer UAV
Fig. 2.9: Army's Shadow UAV

Fig. 2.10: Army's Hunter UAV

The non-military use of UAVs is expected to increase in the future as technologies evolve that allow the safe, reliable flight of UAVs over populated areas. One emerging application is the use of less sophisticated UAVs as aerial camera platforms for the movie-making and entertainment industries. A similar market is also growing rapidly in the television news reporting and coverage arenas. As demand in these markets grows, aircraft such as the IUAS will become a more desirable aerial platform than less capable hobbyist aircraft, as safety, reliability, ease of use, and rapid deployment become important priorities. Additional roles for UAVs in the near future will include homeland security and medical re-supply. The Coast Guard and Border Patrol, parts of the newly formed Department of Homeland Security, already have plans to deploy UAVs to watch coastal waters, patrol the nation's borders, and protect major oil and gas pipelines. Congressional support exists for using UAVs for border security. During a Senate Armed Services Committee hearing on homeland defense, it was stated that although it would not be appropriate or constitutional for the military to patrol the border, domestic agencies using UAVs could carry out this mission. On the medical side, UAVs such as the Army's Shadow have been studied as delivery vehicles for critical medical supplies needed on the battlefield. Not all of these new applications have been approved: UAV advocates state that in order for UAVs to take an active role in homeland security, Federal Aviation Administration (FAA) regulations concerning the use of UAVs will have to change. The Coast Guard will most likely take the lead in resolving UAV airspace issues with the FAA. The National Aeronautics and Space Administration (NASA) and the UAV industry will also be working with the FAA on the issue, as they are joining forces in an initiative to achieve routine UAV operations in the national airspace within a few years.

2.3.1 YF-22 research UAVs designed, built, and instrumented at West Virginia University (WVU)

The main objective of the previous research effort was to provide a flight demonstration of formation control using YF-22 research aircraft models (illustrated in Fig. 2.11). These models were designed, built, and instrumented by a group of faculty members, graduate research assistants, and undergraduate students in the Mechanical and Aerospace Engineering (MAE) Department at West Virginia University (WVU).

Fig. 2.11: WVU's YF-22 UAV models, views (a)-(d)

In the formation flight configuration, a radio control (R/C) pilot maintained ground control of the 'leader' aircraft while two 'follower' aircraft were required to maintain a pre-defined position and orientation with respect to the 'leader'. Each of the 'follower' aircraft was essentially an autonomous vehicle once engaged into the formation flight mode. A general explanation of the UAV functioning is reported in Fig. 2.12; for more details see [6].

Fig. 2.12: WVU's UAV scheme (labeled components: measured surfaces (stabilator, aileron, rudder) with potentiometers; propulsion system; GPS antenna; RF modem; main payload bay; air-data probe)

A description of the On-Board Computer is given in paragraph 8.2; an image acquisition card with a camera can be mounted on the UAVs, and images from the on-board camera are shown in Fig. 2.13.

Fig. 2.13: YF-22 UAVs on-board camera views (a) and (b)

2.4 Reference frames

For a better comprehension of the next sub-paragraphs, the reference frames utilized are clearly illustrated here, together with the relative notation and the geometric formulation of the AR problem [7].

2.4.1 Reference Frames and Notation

The study of the AR problem requires the definition of the following Reference Frames (RFs):

• ERF, or E: earth-fixed reference frame.
• PRF, or P: earth-fixed reference frame having the x axis aligned with the planar component of the tanker velocity vector.
• TRF, or T: body-fixed tanker reference frame located at the tanker center of gravity (CG).
• URF, or U: body-fixed UAV reference frame located at the UAV CG.
• CRF, or C: body-fixed UAV camera reference frame.

Within this study geometric points are expressed using homogeneous (4D) coordinates and are indicated with a capital letter and a left superscript denoting the associated reference frame. For example, a point P expressed in the F reference frame has coordinates FP = [x, y, z, 1]T, where the right 'T' superscript indicates transposition. Vectors are defined as differences between points; therefore their 4th coordinate is always '0'. Also, vectors are denoted by two uppercase letters, indicating the two points at the extremes of the vector. For example, EBR = EB − ER is the vector from the point R to the point B expressed in the Earth Reference Frame. The transformation matrices are (4 x 4) matrices relating points and vectors expressed in an initial reference frame to points and vectors expressed in a final reference frame. They are denoted with a capital T with a right subscript indicating the "initial" reference frame and a left superscript indicating the "final" reference frame. For example, the matrix ETT represents the homogeneous transformation matrix that transforms a vector/point expressed in TRF to a vector/point expressed in ERF.

2.4.2 Geometric formulation of the AR problem

The objective is to guide the UAV such that its fuel receptacle (point R in Fig. 2.14) is "transferred" to the center of a 3-dimensional window (3DW, also called "refueling box") under the tanker (point B in Fig. 2.14). It is assumed that the boom operator can take control of the refueling operations once the UAV fuel receptacle reaches and remains within this 3DW. It should be emphasized that point B is fixed within the TRF; also, the dimensions of the 3DW, δx, δy, δz, are known design parameters. It is additionally assumed that the tanker and the UAV can share a short-range data communication link during the docking maneuver. Furthermore, the UAV is assumed to be equipped with a digital camera along with an on-board computer hosting the MV algorithms acquiring the images of the tanker. Finally, the 2-D image plane of the MV is defined as the 'y-z' plane of the CRF.

Fig. 2.14: Reference Frames in the AAR problem

2.4.3 Receptacle-3DW-center vector

The reliability of the AR docking maneuver is strongly dependent on the accuracy of the measurement of the vector PBR, that is, the distance vector between the UAV fuel receptacle and the center of the 3D refueling window, expressed within the PRF:

PBR = PTT TB − PTU UR = PTT TB − PTT TTC CTU UR    (1.1)

In the above relationship both UR and TB are known and constant parameters, since the fuel receptacle (point R) and the 3DW center (point B) are located at fixed and known positions with respect to the UAV and tanker frames respectively. The transformation matrix CTU represents the position and orientation of the CRF with respect to the URF; therefore, it is also known and assumed to be constant. The transformation matrix PTT represents the position and orientation of the tanker with respect to the PRF, which are measured on the tanker and broadcast to the UAV through the data communication link. In particular, if the sideslip angle β of the tanker is

negligible, then PTT only depends on the tanker roll and pitch angles. Finally, TTC is the inverse of CTT, which can be evaluated either "directly", that is, using the relative position and orientation information provided by the MV system, or "indirectly", that is, by using the formula

CTT = CTU (ETU)−1 ETT

where the matrices ETU and ETT can be evaluated using information from the position (GPS) and orientation (gyros) sensors of the UAV and tanker respectively.
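To make the composition of these transformations concrete, the sketch below evaluates CTT "indirectly" and then forms PBR as in (1.1), using plain numpy homogeneous matrices. Every numeric pose and offset, and the simplifying choice PRF ≡ TRF, is an assumption made only for this example; none of the values come from the thesis.

```python
import numpy as np

def hom(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(psi):
    c, s = np.cos(psi), np.sin(psi)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Illustrative frame poses (assumed values, not thesis data):
E_T_T = hom(rot_z(0.02), np.array([1000.0, 50.0, -6000.0]))  # tanker pose in ERF
E_T_U = hom(rot_z(0.01), np.array([960.0, 50.0, -5993.0]))   # UAV pose in ERF
C_T_U = hom(np.eye(3), np.array([2.0, 0.0, -0.5]))           # URF seen from the CRF

# Indirect evaluation: CTT = CTU (ETU)^-1 ETT, then TTC = (CTT)^-1
C_T_T = C_T_U @ np.linalg.inv(E_T_U) @ E_T_T
T_T_C = np.linalg.inv(C_T_T)

# Fixed points in homogeneous coordinates (assumed offsets):
U_R = np.array([1.0, 0.0, -0.3, 1.0])    # receptacle R in the URF
T_B = np.array([-20.0, 0.0, 7.0, 1.0])   # 3DW center B in the TRF

# Eq. (1.1): PBR = PTT TB - PTT TTC CTU UR (PRF taken equal to TRF here)
P_T_T = np.eye(4)
P_BR = P_T_T @ T_B - P_T_T @ T_T_C @ C_T_U @ U_R
print(P_BR[:3])
```

Note that the fourth (homogeneous) component of P_BR is zero, as expected for a difference of points; this is a convenient sanity check on the chain of transforms.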

3. DESCRIPTION OF THE SIMULATION ENVIRONMENT

3.1 Overview

The AR simulation scheme was developed in Simulink® [8]. The mathematical models of the tanker and UAV aircraft were developed using conventional modeling approaches, as outlined in [9]. Specifically, the UAV was modeled with the parameters of the ICE-101 aircraft (see par. 3.4) [10], while the tanker was modeled with the parameters of a KC135 aircraft (see par. 3.2) [11]. First-order dynamic models were used for the actuator dynamics, using typical values for aircraft of similar size and/or weight (see par. 3.7). Both the tanker and the UAV feature a typical set of autopilot systems designed using a conventional LQR approach (see par. 3.6). The simulation includes a detailed model of the elastic behavior of the boom (see par. 3.3) [12], the wake effects of the tanker on the UAV [13], as well as atmospheric turbulence models acting on both tanker and UAV (see par. 3.7) [14]. Details on the design of the tracking and docking control scheme are also provided in [12]. The simulation outputs were linked to a Virtual Reality Toolbox® (VRT) [15] interface to provide typical scenarios associated with the AR maneuvers. The interface allows the positions of the simulated objects, that is the 'UAV' and the 'tanker', to drive the position and orientation of the corresponding objects in a Virtual World. From this Virtual World, images of the tanker as seen from a virtual camera placed on the UAV are continuously acquired and processed during the simulation. Specifically, after the images are acquired, they are scaled and processed by a corner detection algorithm. The corner detection algorithm finds the 2D coordinates, on the image plane, of the points that correspond to particular physical corners and features of the tanker aircraft [16].

Fig. 3.1: AAR simulation diagram

Fig. 3.2 shows the Simulink screen of the simulation file.

Fig. 3.2: AAR MATLAB simulation

The various subsystems are now described in detail.

Fig. 3.3: Control Panel

The first one is the control panel (Fig. 3.3), where:

• Open World: opens the Virtual Reality World (VRT) window. 10 VRT points of view are pre-set.
• Reset Variables: resets the variables to the initial default conditions. The default condition is essentially the 'pre-contact' position.
• Call Gui: opens a sequence of Graphical User Interfaces (GUIs) allowing the following simulation parameters to be set (Fig. 3.4):
  - Tanker initial flight conditions
  - UAV position with respect to the tanker (default: pre-contact)
  - Level of atmospheric turbulence
  - Level of wake effects
  - Level of sensor noise
  - Level of GPS noise
  - GPS sampling time
  - Position of the fuel receptacle on the UAV
  - Position of the digital camera on the UAV
  - Camera parameters (orientation, focal length, pixel size, sampling time)
  - Number and location of markers (corners) on the tanker

Fig. 3.4: Call GUI Menu

• Plot Results: plots a variety of outputs from the simulation (Fig. 3.5).
• Notation: explains the notation.
• Version History: documents the various upgrades between versions.

Fig. 3.5: Available result plots

The Tanker Model (Fig. 3.6 & 3.7) consists of a number of blocks from the FDC (Flight Dynamics and Control) toolbox. It includes a 'General Aircraft Model' block (with the aerodynamic, inertial, propulsive, and geometric data of a B747/200 aircraft), the 'Tanker Autopilot' block (designed using an LQR approach, see par. 3.6), a 'Tanker Atmospheric Model' block, and an 'Actuator Dynamics' block (see par. 3.7).

Fig. 3.6: Tanker Model in the main view
Fig. 3.7: Tanker Model in detail

The UAV Model (Fig. 3.8 & 3.9) consists of the following main simulation blocks:

• 'General Aircraft Model' for the UAV (with the aerodynamic, inertial, propulsive, and geometric data of the ICE)
• 'Turbulence' block [details in par. 3.7]
• 'Wake effects' block [details in par. 3.7]
• 'Software' block (embedding the algorithms for Corner Detection/Labeling/Pose Estimation and the 'tracking & docking' control laws) [details in paragraphs 4.3-4.6]
• 'Sensors' block (with the outputs from the Tanker and UAV model) [details in par. 3.5]

Fig. 3.8: UAV Model in the main view
Fig. 3.9: UAV Model in detail

The 'Software' block consists of:

• 'Machine Vision' block (featuring the algorithms for Markers or Corners Detection/Labeling/Pose Estimation) [details in chapter 4]
• 'Switch and Fusion' block (determining when and how to make the switch between GPS-based and MV-based guidance and control laws) [details in par. 3.5]
• 'UAV controller' block (featuring the 'tracking & docking' control laws) [details in par. 3.6]

Fig. 3.10: The Software block in detail
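The switching logic itself is detailed in par. 3.5. Purely to illustrate the idea behind a 'Switch and Fusion' block, the toy sketch below hands control from the GPS-based to the MV-based relative-position estimate as the range to the tanker shrinks; the class name, the thresholds, and the linear blending rule are all assumptions of this example, not the thesis implementation.

```python
import math

class SwitchAndFusion:
    """Toy sketch of a GPS/MV hand-over: far from the tanker the GPS-based
    relative position is used; inside an assumed range the MV-based estimate
    takes over, with a linear blend in between to avoid a step change in the
    signal fed to the control laws."""

    def __init__(self, r_mv=80.0, r_blend=20.0):
        self.r_mv = r_mv        # range below which MV starts to be trusted (assumed)
        self.r_blend = r_blend  # width of the blending band in meters (assumed)

    def estimate(self, p_gps, p_mv, mv_valid):
        rng = math.sqrt(sum(c * c for c in p_gps))
        if not mv_valid or rng >= self.r_mv:
            return list(p_gps)                 # pure GPS-based guidance
        if rng <= self.r_mv - self.r_blend:
            return list(p_mv)                  # pure MV-based guidance
        w = (self.r_mv - rng) / self.r_blend   # grows from 0 (GPS) to 1 (MV)
        return [(1 - w) * g + w * m for g, m in zip(p_gps, p_mv)]
```

A real implementation would also have to handle MV drop-outs, sensor latency, and the different noise characteristics of the two sources; the point here is only the structure of the switch.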

The last one is the Visualization block, which sends the data to the Virtual Reality Toolbox.

Fig. 3.11: Visualization Block
Fig. 3.12 (a), (b): Images from the Virtual Reality Toolbox

The whole simulation scheme, together with the relevant reference frames, is summarized in the next two figures (3.13 & 3.14).

Fig. 3.13: AAR Transformation Matrix diagram
Fig. 3.14: AAR Transformation Matrix summary

3.2 Aircraft model

The model used for the tanker has the dynamic characteristics of the Boeing KC-135R. For refueling, the aircraft is assumed to hold a steady-state condition corresponding to a rectilinear trajectory, a constant Mach number of 0.65, and an altitude (H) of 6,000 m. This allows the tanker dynamics to be simplified as described in section 3.4. The lateral-directional dynamics are eliminated, since the aircraft performs a purely longitudinal motion; the longitudinal dynamics are stable, so the tanker does not need an internal stability controller.

Fig. 3.15: Tanker aircraft model in different environments
Fig. 3.16: Tanker, real aircraft and model scheme

3.3 Model of the boom

The boom has been modeled using the scheme represented in Fig. 3.17. It is connected to the tanker at point P and consists of two elements: the first element is connected to point P by two revolute joints, allowing vertical and lateral relative rotations (θ4 and θ5); the second element is connected to the first one by a prismatic joint that allows the extension d6.

Fig. 3.17: Simulink model of the refueling boom

The dynamic model of the boom has been derived using the Lagrange method:

d/dt (∂L(q, q̇)/∂q̇i) − ∂L(q, q̇)/∂qi = Fi,  i = 1, ..., n    (3.1)

where L(q, q̇) = T(q, q̇) − U(q) is the Lagrangian (the difference between the boom kinetic and potential energies), q is the vector of the Lagrangian coordinates, and Fi are the Lagrangian forces on the boom. To derive the Lagrangian, reference is made to an inertial frame, the ERF; in this case the inertial and gravitational forces are implicitly included in the left-hand side of (3.1) and the Fi represent the active forces (wind and control forces). With respect to this frame, the boom has six degrees of freedom: the three translations d1, d2, and d3 of point P, the rotations θ4 and θ5, and the extension d6; therefore the Lagrangian coordinates can be chosen as q = [d1, d2, d3, θ4, θ5, d6]T.

Fig. 3.18: Model of the refueling boom (1st element: length 6.1 m, mass 180 kg; 2nd element: length 4.6 m, mass 140 kg)

Furthermore, the first three variables d1, d2, and d3 are expressed as a function of the tanker position as:

EP(t) = ET(t) + ETP
EṖ(t) = EṪ(t) + ω × ETP
EP̈(t) = ET̈(t) + ω̇ × ETP + ω × (ω × ETP)    (3.2)

where ET is the position of the tanker's center of gravity, ω is the tanker angular velocity, EP = [d1, d2, d3]T, and ETP is the fixed-length vector going from ET to EP.
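The derivatives in (3.2) are straightforward to evaluate numerically once the tanker states are known. The sketch below computes the velocity and acceleration of point P with numpy; all the numeric values are chosen only for illustration.

```python
import numpy as np

def boom_attachment_kinematics(ET_dot, ET_ddot, omega, omega_dot, ETP):
    """Velocity and acceleration of the boom attachment point P, eq. (3.2):
    ETP is rigidly attached to the tanker, so its time derivatives are
    produced entirely by the tanker angular rate omega and its rate omega_dot."""
    EP_dot = ET_dot + np.cross(omega, ETP)
    EP_ddot = (ET_ddot + np.cross(omega_dot, ETP)
               + np.cross(omega, np.cross(omega, ETP)))
    return EP_dot, EP_ddot

# Example: tanker in steady level flight with a slow roll rate (assumed numbers)
v, a = boom_attachment_kinematics(
    ET_dot=np.array([200.0, 0.0, 0.0]),   # center-of-gravity velocity
    ET_ddot=np.zeros(3),
    omega=np.array([0.05, 0.0, 0.0]),     # roll rate only
    omega_dot=np.zeros(3),
    ETP=np.array([-20.0, 0.0, 2.0]),      # P aft of and below the CoG
)
```

With a constant roll rate the acceleration of P reduces to the centripetal term ω × (ω × ETP), which is what the last assertion of a quick check would verify.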

Fig. 3.19: Boom operators at work

The kinetic and potential energies have been derived referring to the Denavit-Hartenberg representation of the system shown in Tab. 3.1. In the AAR simulation the boom is controlled using a joystick, and there is a camera point of view that corresponds to the operator's point of view during the refueling maneuver (Fig. 3.19-20).

 i | ai | αi   | di | θi
 1 | 0  | π/2  | d1 | 0
 2 | 0  | π/2  | d2 | π/2
 3 | 0  | π/2  | d3 | 0
 4 | 0  | −π/2 | 0  | θ4
 5 | 0  | −π/2 | 0  | θ5
 6 | 0  | π/2  | d6 | π/2

Tab. 3.1: Denavit-Hartenberg boom parameters

Fig. 3.20: Boom operator view in a real (a) and simulated (b) refueling maneuver
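Each row of Tab. 3.1 maps to one homogeneous transform. A minimal helper in the standard Denavit-Hartenberg convention, A = Rot_z(θ)·Trans_z(d)·Trans_x(a)·Rot_x(α) (assumed here to be the convention used), would look like this; the joint values in the example are arbitrary.

```python
import numpy as np

def dh_transform(a, alpha, d, theta):
    """Homogeneous transform of one DH row:
    A = Rot_z(theta) . Trans_z(d) . Trans_x(a) . Rot_x(alpha)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

# e.g. composing the two revolute rows (a = 0, alpha = -pi/2, d = 0) of Tab. 3.1
# for arbitrary example angles theta4 = 0.3 rad and theta5 = -0.1 rad:
A4 = dh_transform(0.0, -np.pi / 2, 0.0, 0.3)
A5 = dh_transform(0.0, -np.pi / 2, 0.0, -0.1)
T45 = A4 @ A5
```

Chaining all six rows, with the free joint variables taken from q = [d1, d2, d3, θ4, θ5, d6], gives the pose of the boom nozzle with respect to the base frame at point P.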

3.4 Model of the UAV

The aircraft model used in the AAR simulation is the Innovative Control Effectors (ICE-101) aircraft [17]; the model has been developed using the conventional modeling approach outlined in [18]. The resulting UAV model is described by a 12-state model:

x = [V, α, β, p, q, r, ψ, θ, φ, Ex, Ey, H]T    (3.3)

where x is the state vector; V (m/s) is the x component of the velocity in body axes; α (rad) is the wind-axis angle of attack; β (rad) is the wind-axis sideslip angle; p, q, r (rad/s) are the (x, y, z) components of the angular velocity in body axes (also known as roll, pitch, and yaw rates); ψ, θ, φ (rad) are the yaw, pitch, and roll Euler angles; Ex, Ey, H are the position in the Earth-fixed Reference Frame (ERF). The angle of attack α and the sideslip angle β are defined as:

α = tan−1(W/V) and β = sin−1(U/‖V‖)    (3.4)

where V = [V, U, W] is the linear velocity in body axes; see Fig. 3.21 for details.

Fig. 3.21: ICE model

The input vector u is:

u = [δThrottle, δAMT_R, δAMT_L, δTEF_R, δTEF_L, δLEF_R, δLEF_L, δPF, δSSD_R, δSSD_L]    (3.5)

where AMT stands for All Moving Tips, TEF for Trailing Edge Flaps, LEF for Leading Edge Flaps, PF for Pitch Flaps, and SSD for Spoiler Slot Deflector; the positions of the control surfaces are shown in Fig. 3.22.

Fig. 3.22: Position of the control surfaces
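With the component naming of (3.4), where the body velocity vector is written [V, U, W] with V along the body x axis, the two aerodynamic angles can be computed as in this small sketch (the function name and test values are illustrative):

```python
import math

def alpha_beta(v_body):
    """Angle of attack and sideslip angle per eq. (3.4), using the thesis'
    component ordering [V, U, W] with V along the body x axis."""
    V, U, W = v_body
    speed = math.sqrt(V * V + U * U + W * W)  # |V|, the airspeed magnitude
    alpha = math.atan2(W, V)                  # alpha = atan(W / V)
    beta = math.asin(U / speed)               # beta = asin(U / |V|)
    return alpha, beta

# Pure forward flight gives alpha = beta = 0:
print(alpha_beta([150.0, 0.0, 0.0]))
```

Using atan2 instead of a plain atan keeps the angle of attack well defined even when the forward component V momentarily vanishes.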
