DIPARTIMENTO DI INGEGNERIA DELL'ENERGIA, DEI SISTEMI, DEL TERRITORIO E DELLE COSTRUZIONI

Thesis for the attainment of the Laurea Magistrale (Master of Science) in Ingegneria Gestionale (Management Engineering)

A Methodology for the Design of New Augmented Reality Solutions: a User-Centred Approach

SUPERVISORS:
Prof. Ing. Gino Dini, Dipartimento di Ingegneria Civile e Industriale, Università di Pisa
Dr. John Ahmet Erkoyuncu, Through-life Engineering Services Centre, Cranfield University

CANDIDATE:
Elisa Galeotti

Graduation Session of 19/07/2017, Academic Year 2016/2017


ABSTRACT

This thesis presents an innovative methodology for the design of new Augmented Reality (AR) solutions. Following a user-centred approach, a multi-criteria decision model based on the Analytic Hierarchy Process (AHP) method is proposed. This involved developing a set of metrics to be considered while designing a new AR application. By understanding the AR system and the applicability of AR solutions in maintenance, it has been possible to define a set of criteria focused on the user context, together with a set of alternatives. The purpose of the developed methodology is to compare the criteria and, consequently, weight the alternatives.

The research methodology has been divided into four main parts, focusing respectively on: understanding the AR system and the applicability of AR solutions in maintenance; defining a set of criteria focused on the user context; developing a methodology for comparing criteria and evaluating solutions; and testing it with experts and analysing the obtained results.

The quantitative results show the overall potential and applicability of the model, as well as the efficacy of the methodology in supporting complex decisions within the context of AR. The use of the AHP method enables experts to deal with complex and contrasting concepts and to express a preference among them with a subjective judgement based on their personal understanding of the problem.

Keywords:

Industrial Maintenance, User-centred Approach, Human Factors, Analytic Hierarchy Process


ACKNOWLEDGEMENTS

This thesis is the result of an incredible experience in which I had the opportunity to face many challenges, meet new people and build relationships.

The first person to whom I would like to say thank you is Professor Gino Dini, who offered me the opportunity to go abroad and write my thesis at Cranfield University.

Another person to whom I would like to say thank you is Dr. John Ahmet Erkoyuncu, who gave me constant support and helped me find the right direction for my project. It was a pleasure to collaborate with him and learn from him.

This experience would not have been possible without the help of my family. They have always supported me in everything I did. Thank you for all your sacrifices, your patience, your time and, most of all, your love.

Another thank you goes to my "Cranfield family" and to all the people who have been part of my journey, especially my flatmates, my office colleagues, my new Italian friends and Giulia. Without them, it would not have been the same. A special thank you goes to Miko, the person who enriches me the most, who was always present during these six months and from whom I learned a lot.

Another thank you goes to my old friends, for always being present even though we were far apart. They were always ready to hug me every time I came back to Italy.

Finally, I have to say thank you to Flavio. Thank you for the constant support and for always being by my side. It has not been easy, but you understood how important this experience was to me and you helped me every day. I will never thank you enough.


TABLE OF CONTENTS

ABSTRACT ... i

ACKNOWLEDGEMENTS... iii

LIST OF FIGURES ... vii

LIST OF TABLES ... ix

LIST OF DEFINITIONS ... x

LIST OF ABBREVIATIONS ... xiii

1 INTRODUCTION ... 1

1.1 Project Background and Research Motivation ... 1

1.2 Aim ... 2

1.3 Objectives ... 3

1.4 Case Studies ... 3

1.5 Thesis Structure ... 3

2 LITERATURE REVIEW ... 5

2.1 Augmented Reality (AR) ... 5

2.1.1 Definition ... 5

2.1.2 Components of the AR System ... 7

2.2 AR in Industrial Maintenance ... 9
2.2.1 AR Solutions ... 14
2.3 Research Gap ... 25
3 RESEARCH METHODOLOGY ... 26
3.1 Research Approach ... 26
3.2 Research Analysis ... 26
3.2.1 Data Analysis ... 26
3.2.2 Data Collection ... 26
3.2.3 Data Selection ... 27

4 DEVELOPMENT OF THE METHODOLOGY ... 29

4.1 Identification of the MCDM Method ... 29

4.2 The AHP Model ... 30


4.2.2 The Definition of the Criteria ... 31

4.2.3 The Validation of the Criteria ... 43

4.2.4 The AHP Structure ... 46

4.3 The AHP Technique... 46

5 VALIDATION ... 53

5.1 The Developed Tools ... 53

5.2 The Process of Validation ... 54

5.3 Presentation of the available AR solutions ... 55

5.4 Presentation of the Methodology ... 57

5.5 Definition of the Scenario ... 57

5.5.1 Case Study One ... 58

5.5.2 Case Study Two ... 59

5.6 Execution of the Paper Questionnaire ... 59

5.7 Presentation of the Results on Excel ... 61

5.7.1 Case Study One ... 61

5.7.2 Case Study Two ... 64

6 DISCUSSION ... 66

6.1 Benefits ... 66

6.1.1 Benefits of the Proposed Methodology... 67

6.2 Limitations ... 67

6.2.1 Limitations of the Proposed Methodology ... 67

7 CONCLUSION AND FUTURE WORK ... 68

7.1 Conclusion ... 68

7.2 Future Work ... 69

REFERENCES ... 71


LIST OF FIGURES

Figure 1 A comparison between Computer and Social Science Publications .... 2

Figure 2 The process of an AR system ... 6

Figure 3 Interaction Loop ... 9

Figure 4 Gartner Report ... 10

Figure 5 Distribution of AR Overlaying Methods ... 14

Figure 6 Video See Through HMD ... 15

Figure 7 Optical See Through HMD ... 16

Figure 8 The execution of a non-destructive testing on pipelines with HHD ... 18

Figure 9 A guided procedure for a manufacturing cell with an HMD ... 19

Figure 10 Noise-Cancelling Headphones ... 21

Figure 11 Bone Conduction Headphones ... 22

Figure 12 Haptic Components ... 23

Figure 13 Data Gloves for AR applications ... 24

Figure 14 Vibro Tactile Bracelet ... 24

Figure 15 Joystick and Stylus-Based Device... 25

Figure 16 Relationship between Information Characteristics, Dialog Principles and Usability Factors ... 34

Figure 17 The Followed Approach: four main categories ... 36

Figure 18 First Level Criteria ... 37

Figure 19 The three Dimensions of the User Experience ... 38

Figure 20 The AHP Structure ... 46

Figure 21 An example of Pairwise Comparison ... 49

Figure 22 The Excel Structure ... 53

Figure 23 The Developed AHP Model on Excel ... 54

Figure 24 Workshop on the Design of new AR Solutions ... 56

Figure 25 Workshop Activity ... 57

Figure 26 Warship Gun (Case Study One) ... 58

Figure 27 Off-Site Repairing Task: a technician and an expert in a remote site ... 59
Figure 28 Group Activity ... 60

Figure 29 An example of the Paper Questionnaire ... 60

Figure 30 Final Ranking of Solutions: Group Decision ... 62


Figure 32 Comparisons among Alternatives and Appropriateness Criteria ... 63

Figure 33 Comparisons among Alternatives and Effectiveness Criteria ... 64

Figure 34 Comparisons among Alternatives and Operability Criteria ... 64

Figure 35 Final Ranking of Solutions: Group Decision (II) ... 65

Figure 36 Comparisons among Alternatives and Adaptability Criteria (II) ... 65

Figure 37 Comparisons among Alternatives and Appropriateness Criteria (II) ... 65
Figure 38 Comparisons among Alternatives and Effectiveness Criteria (II) ... 66

Figure 39 Comparisons among alternatives and criteria (II) ... 66

Figure 40_A3 Final Ranking-Graphical Results (Case Study One) ... 92

Figure 41_A3 Graphical Results: Comparisons among Alternatives and Adaptability Criteria (Case Study One) ... 93

Figure 42_A3 Graphical Results: Comparisons among Alternatives and Appropriateness Criteria (Case Study One) ... 93

Figure 43_A3 Graphical Results: Comparisons among Alternatives and Effectiveness Criteria (Case Study One) ... 94

Figure 44_A3 Graphical Results: Comparisons among Alternatives and Operability Criteria (Case Study One) ... 95

Figure 45_A3 Final Ranking-Graphical Results (Case Study One-II) ... 95

Figure 46_A3 Graphical Results: Comparisons among Alternatives and Adaptability Criteria (Case Study One-II) ... 96

Figure 47_A3 Graphical Results: Comparisons among Alternatives and Appropriateness Criteria (Case Study One-II) ... 97

Figure 48_A3 Graphical Results: Comparisons among Alternatives and Effectiveness Criteria (Case Study One-II) ... 97

Figure 49_A3 Numerical Results: Comparisons among Alternatives and Operability Criteria (Case Study One-II) ... 98

Figure 50_A3 Final Ranking-Graphical Results (Case Study Two) ... 99

Figure 51_A3 Graphical Results: Comparisons among Alternatives and Adaptability Criteria (Case Study Two) ... 100

Figure 52_A3 Graphical Results: Comparisons among Alternatives and Appropriateness Criteria (Case Study Two) ... 100

Figure 53_A3 Graphical Results: Comparisons among Alternatives and Effectiveness Criteria (Case Study Two) ... 101

Figure 54_A3 Graphical Results: Comparisons among Alternatives and Operability Criteria (Case Study Two) ... 102


LIST OF TABLES

Table 1 HMD, HHD and SD Categories: a Comparison among Solutions ... 20
Table 2 Summary of the First MCDM Methods ... 30
Table 3 Questionnaire Used for the Validation of the System Requirements ... 44
Table 4 Absolute Scale of Values ... 48
Table 5 Random Indices ... 51
Table 6_A3 Final Ranking-Numerical Results (Case Study One) ... 92
Table 7_A3 Numerical Results: Comparisons among Alternatives and Adaptability Criteria (Case Study One) ... 93
Table 8_A3 Numerical Results: Comparisons among Alternatives and Appropriateness Criteria (Case Study One) ... 94
Table 9_A3 Numerical Results: Comparisons among Alternatives and Effectiveness Criteria (Case Study One) ... 94
Table 10_A3 Graphical Results: Comparisons among Alternatives and Operability Criteria (Case Study One) ... 95
Table 11_A3 Final Ranking-Numerical Results (Case Study One_II) ... 96
Table 12_A3 Numerical Results: Comparisons among Alternatives and Adaptability Criteria (Case Study One-II) ... 96
Table 13_A3 Numerical Results: Comparisons among Alternatives and Appropriateness Criteria (Case Study One-II) ... 97
Table 14_A3 Numerical Results: Comparisons among Alternatives and Effectiveness Criteria (Case Study One-II) ... 98
Table 15_Numerical Results: Comparisons among Alternatives and Operability Criteria (Case Study One-II) ... 98
Table 16_A3 Final Ranking-Numerical Results (Case Study Two) ... 99
Table 17_A3 Numerical Results: Comparisons among Alternatives and Adaptability Criteria (Case Study Two) ... 100
Table 18_A3 Numerical Results: Comparisons among Alternatives and Appropriateness Criteria (Case Study Two) ... 101
Table 19_A3 Numerical Results: Comparisons among Alternatives and Effectiveness Criteria (Case Study Two) ... 101
Table 20_A3 Numerical Results: Comparisons among Alternatives and


LIST OF DEFINITIONS

A clear explanation of the terms used in this thesis is given below. The purpose is to avoid any confusion and to give a practical list of definitions for a better understanding of the project. The following list has been compiled according to ISO 9241 "Ergonomics of Human-System Interaction", and adapted to the context of AR.

-Accessibility: extent to which AR can be used by people from a population with the widest range of characteristics and capabilities to achieve a specified goal in a specified context of use.

-Context of use: users, tasks, equipment (hardware and software), and the physical and social environments in which AR is used.

-Dialogue: interaction between a user and the AR system as a sequence of user actions (inputs) and system responses (outputs) in order to achieve a goal.

-Effectiveness: accuracy and completeness with which users achieve specified goals with the support of AR.

-Efficiency: resources expended in relation to the accuracy and completeness with which users achieve goals with the support of AR.

-Ergonomics study of human factors: scientific discipline concerned with the understanding of interactions among human and other elements of a system, and the profession that applies theory, principles, data and methods to design in order to optimize human well-being and overall system performance.

-Feedback: indicator (such as haptic, aural or visual) sensed by a user with an AR device.

-Force feedback: application of physical forces in response to a user input.

-Interactive System: combination of hardware and software components that receive input from, and communicate output to, a human user in order to support his or her performance of a task.

-Haptic: sensory and/or motor activity based in the skin, muscles, joints and tendons.


Note: the term “haptic” covers all touch sensations, while “tactile” is used in a more specific manner and is limited to mechanical stimulation of the skin.

-Haptic device: device that presents information through the sense of touch, mainly by the use of hands and fingers.

-Haptic interface: user interface based on touch, using the movements of the user as input (gesture) and the sense of touch as output for tactile and kinaesthetic feedback.

-Kinesthesis: sense and motor activity based in the receptors in the muscles, joints and tendons.

Note: types of kinaesthetic actions include movement, exertion of force and torque, and achievement of position, displacement and joint angle.

-Kinaesthetic feedback: action perceived by the mechanoreceptors in joints, muscles and tendons resulting in awareness of position, movement, weight and resistance of the limbs or other body parts.

-Goal: intended outcome

-Human-centred design: approach to systems design and development that aims to make interactive systems more usable by focusing on the use of the system and applying human factors, ergonomics and usability knowledge and techniques.

-Satisfaction: freedom from discomfort and positive attitudes towards the use of the AR device.

-System function: broad category of activity performed by a system.

-Tactile feedback: indication of the results of a user action transmitted through the sense of touch.

-Task: activities required to achieve a goal.

-Touch: sense based on receptors in the skin for the perception of touch.

-Touch sensitive screen (TSS): input device that produces an input signal from a finger touching, lifting off or moving across an AR display.


-Usability: extent to which an AR device can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.

-User: person who interacts with the AR device.

-User experience: perception and response resulting from the use of the AR device.

Note: User experience is a consequence of functionality, system performance, interactive behaviour and assistive capabilities of the interactive system, the user's internal and physical state resulting from prior experiences, attitudes, skills and personality, and the context of use.

-User Interface: all components of the AR system (software or hardware) that provide information and controls for the user to accomplish specific tasks with the interactive system.

-Vibro-tactile: vibration-based stimulation of the skin.

-Work environment: physical, chemical, biological, organizational, social and cultural factors surrounding a worker in the context of AR applications in maintenance.

-Workspace: volume allocated to one or more persons in the work system to complete the work task with the support of AR.

-Work strain or internal load: internal response of a worker to being exposed to external workload depending on his/her individual characteristics (e.g. body size, age, capacities, abilities, skills, etc.).

-Work stress or external workload: external conditions and demands in a work system that influence a person’s physical and/or mental internal load.


LIST OF ABBREVIATIONS

AR Augmented Reality
AHP Analytic Hierarchy Process
CI Consistency Index
CR Consistency Ratio
HCI Human Computer Interaction
HMD Head Mounted Device
HHD Hand Held Device
MCDM Multi-criteria Decision Making Models
SD Spatial Device


1 INTRODUCTION

1.1 Project Background and Research Motivation

Nowadays, production systems require the use of complex machines and tasks, with a consequent increase in the complexity of maintenance activities. Therefore, the maintenance process is becoming more dependent on the expertise of the technicians, whose skills and knowledge are key elements for the effectiveness of task completion. For this reason, there is a high demand for new technologies to support technicians. In this scenario, Augmented Reality (AR) seems to be a very promising technology. It is an innovative technology that overlays digital content on the real environment, in real time, enhancing the user's perception of the real world with additional information. Its main contribution, from a user perspective, is in terms of safety, cognitive workload, and error and time reduction. In terms of activities, AR is adaptable to different purposes, from guidance and task information to simulation and planning, or even training, diagnosis and repair. Figure 1 shows the trend of publications on Augmented Reality from 2010 to 2017. Human factors and ergonomic aspects are slightly behind computer science, with a promising increasing trend. Most of the publications are about technical aspects (e.g. registration and rendering), with a limited contribution in terms of human-computer interaction and the user-centred approach (Ko et al. 2013). According to a study of AR publications (Dünser et al. 2008), papers on user evaluation represent 10% of the overall publications. Furthermore, as outlined in (Elia et al. 2016), there is a limited number of studies oriented towards proposing models to evaluate the most "fitting" AR solutions before starting the technological design process. Within the last decade, much research has been conducted on AR, with only a small percentage of publications focused on the user.

This is the background scenario on which the thesis will be based, focusing on AR as a human-computer interaction tool, a support for technicians during the execution of maintenance activities.


Figure 1 A comparison between Computer and Social Science Publications (Scopus)

1.2 Aim

AR is an innovative and promising technology in the domain of maintenance. It is still under development and, for this reason, it is not easy to define the domain of the technology, nor its borders and limits. Nowadays, every available AR solution has different characteristics and, for every specific context of use, there is one optimal solution. There are a few questions to answer:

- What are the elements to consider while designing AR solutions following a user-centred approach?

- What is the relationship among these elements?

- Is it possible to organise them and put them into a model?

The aim of the thesis is to develop a methodological approach to be used during the concept phase of a new AR solution: a multi-criteria decision making tool, focused on the user context and the interaction between a user and the system. This would involve the definition of a set of criteria to be considered while designing a new AR solution.
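To give a concrete sense of the kind of computation such a multi-criteria tool performs, the sketch below derives an AHP priority vector and consistency ratio from a pairwise comparison matrix. It is illustrative only: the example criteria, the judgements and the use of Python/NumPy are assumptions, not the Excel-based tool developed in the thesis; the random indices are the standard Saaty values.

```python
# Illustrative AHP sketch (not the Excel tool developed in the thesis).
# The criteria, the judgements and the use of NumPy are assumptions.
import numpy as np

# Saaty's random indices for matrices of order 1..10 (standard AHP values).
RANDOM_INDEX = [0.0, 0.0, 0.58, 0.90, 1.12, 1.24, 1.32, 1.41, 1.45, 1.49]

def ahp_priorities(pairwise):
    """Priority vector and consistency ratio of a pairwise comparison matrix (n >= 3)."""
    n = pairwise.shape[0]
    eigenvalues, eigenvectors = np.linalg.eig(pairwise)
    k = np.argmax(eigenvalues.real)               # principal eigenvalue (lambda_max)
    weights = np.abs(eigenvectors[:, k].real)
    weights /= weights.sum()                      # normalised priority vector
    ci = (eigenvalues[k].real - n) / (n - 1)      # consistency index
    cr = ci / RANDOM_INDEX[n - 1]                 # consistency ratio (commonly accepted if < 0.1)
    return weights, cr

# Hypothetical judgements for three user-context criteria on Saaty's 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w, cr = ahp_priorities(A)
print("weights:", np.round(w, 3), "CR:", round(cr, 3))
```

The same computation, repeated for each criterion and for each set of alternatives, is what allows the final ranking of AR solutions described in the following chapters.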


1.3 Objectives

To fulfil the aim, there are five main objectives:

1. Understand the AR system and its potentialities in industrial maintenance
2. Identify the main AR applications in maintenance

3. Design the structure of the model and propose a multi-criteria decision making technique

4. Test the tool in a real situation

5. Identify the areas for improvement and suggest possible future work

1.4 Case Studies

A team of experts, with a multidisciplinary background, has validated the proposed methodology.

The project has been presented during the CIRP Design Conference at Cranfield University on the 9th of May 2017. Two case studies have been presented and discussed during the execution of a Workshop on the Design of new AR applications, and the numerical results have been obtained from that session.

1.5 Thesis Structure

The first part of the thesis focuses on the AR system and the applicability of the technology in the maintenance process. By doing an extensive literature review on AR applications and potentialities in maintenance, it has been possible to highlight the main limits in the research (research gaps).

The second part of the thesis focuses on the research methodology, which describes the path followed during the development of the model. After this, the proposed methodology will be described in detail.

Finally, the two case studies used to test the methodology will be presented, with the obtained numerical results. A discussion on the work done, with the limitations and benefits of the methodology, will be provided, together with the conclusion and a description of the future steps that should be undertaken.


2 LITERATURE REVIEW

In this chapter, a detailed description of AR technologies and applications in industrial maintenance is provided. The last part of the section will describe the gaps identified by the analysis of the state-of-the-art on AR.

2.1 Augmented Reality (AR)

2.1.1 Definition

Augmented reality (AR) is an innovative technology that overlays digital content (computer-generated sensory inputs such as sound, video and graphics) on the real environment, in real time, enhancing the human perception of the real world with additional information. It has the potential to modify human perceptions through digital content integrated with the real world and triggered by contextualised events. The augmentation of virtual content on the real scene is the most common application of AR, which also has the potential to hide or remove parts of the real environment from the user's perception. The most significant definition of AR is the one given by Azuma in his survey (Azuma 1997). According to this definition, a system must have the following features in order to be considered AR:

1) Combines real and virtual objects in a physical environment
2) Provides interactive information in real time

3) Registers real and virtual objects with each other in 3D

AR is an emerging HCI technology that enables human-machine interaction using all human senses. A well-designed AR interactive system should leverage different sensory systems (visual, aural, touch/haptic, olfactory and gustatory); however, most of the information is presented as 3D images or text (visual), sounds (auditory) and haptics (touch) (Azuma 1997). For example, external devices such as headphones and a microphone can be used together: the first provides 3D sound information to the user, while the second detects real sounds that are not perceived by the user. Furthermore, such a system is also an input device, by which a user can make a vocal request to the system.

A haptic device is a typical example of an output device that provides tactile feedback (e.g. vibration). An AR solution can be both an input and an output device, able to detect information from the user and respond to a specific request within the shortest time (responsiveness) and with high effectiveness.

In terms of interactive systems, "Input devices are a means for users to enter data into interactive systems. Generally speaking, an input device is a sensor that can detect changes in user behaviour (gestures, moving fingers, etc.) and transform it into signals to be interpreted by the interactive system." (ISO 9241:400 2007). There are four major components of an AR system, namely, the real-time recognition of the real scene (registration and tracking), the registration of the virtual objects on the physical world (rendering), the display of the content (content generation) and the handling of user interaction (interaction). Computer-generated information is rendered and registered on the real scene with accurate tracking and alignment, followed by user-friendly interaction modes. The user input is processed by the system, which, in response, manages the presentation of the result (aural, visual and/or haptic). The end user guides the process, while the system provides the information access. The process of an AR system is presented in Figure 2.
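To make this four-stage process concrete, the following is a minimal, illustrative sketch of an AR frame loop; all class and method names are hypothetical placeholders rather than an implementation described in the thesis.

```python
# Illustrative-only sketch of the AR loop described above (tracking/registration,
# content generation, rendering, interaction); all names are hypothetical.
class ARSystem:
    def __init__(self, camera, tracker, renderer, content_db):
        self.camera = camera          # captures the real scene
        self.tracker = tracker        # estimates the camera pose for each frame
        self.renderer = renderer      # overlays the virtual content on the scene
        self.content_db = content_db  # task instructions, 3D models, sounds

    def run_frame(self, user_input=None):
        frame = self.camera.capture()                        # 1. acquire the real scene
        pose = self.tracker.estimate_pose(frame)             # 2. registration and tracking
        content = self.content_db.select(pose, user_input)   # 3. content generation
        return self.renderer.compose(frame, pose, content)   # 4. display / interaction result
```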

The following paragraphs give a more detailed description of the state-of-the-art in AR including techniques and components.


2.1.2 Components of the AR System

In the domain of manufacturing, AR applications are classified as hardware devices and software systems (Nee et al. 2012). Software systems include desktop or mobile AR applications and existing AR libraries, e.g. AR Toolkit (registration and tracking), but they will not be included in the project.

The hardware components are:

- Camera: optical sensor that recognizes and captures the real environment.

- Processing unit: a local device, a mobile device (tablet or smartphone) or cloud computing. As shown in (Oliveira et al. n.d.), it is a critical component of the system and the choice is made considering portability, remote access and processing capacity time. Tablets and smartphones are used when the amount of data to manage is small, while a local device is a preferable solution for large amounts of data. Mobile devices also have a limited processing capacity. Cloud computing is used in combination with a local device, which acquires the image from the scene; the image is sent to a server and an AR scenario is sent back to the local device and presented to the user. The most critical element of this solution is time, because the connection between the server and the local device must be fast.

- Input device: hardware component used to capture user inputs (e.g. click, touch, voice, gesture recognition, eye tracking, etc.) for the purpose of making a request to the system. Some examples are joystick, mouse, keyboard, touchpad, touch screen and microphone.

- Output device: hardware component used to present the information to the user. Generally, an AR system is both an input and output device, without the need for external devices (e.g. keyboard, mouse, joystick). There are three main device categories, namely, visual, aural and haptic, but most of the applications are visual. As reported in (Dini & Mura 2015), Head Mounted Devices (HMD) are the most common applications (e.g. Microsoft HoloLens and Google Glass), followed by Hand-Held Devices (HHD) (e.g. tablet, smartphone) and Spatial Devices (SD) (e.g. projectors, large screens). The use of HHD is increasing, especially in the latest applications developed (Oliveira et al.). Another growing solution is haptic devices, which offer real-time feedback in an automatic way, without requiring an interruption of the operator's task (e.g. glove vibration). The relationship between the user input and the system response is called the "interaction loop" and is presented in Figure 3.

- Tracking system: technology used for the recognition of the user's position/movements, and for tracking the objects in the real scene. The virtual content is aligned to the real scene according to the tracking information and registration accuracy. There are three major groups of tracking techniques: sensor-based (e.g. magnetic, acoustic, inertial, optical or mechanical), vision-based and hybrid (e.g. inertial and vision sensing based on GPS, etc.). Since most AR systems are equipped with cameras, vision-based tracking is the most commonly used, due to its convenience and to advancements in computer vision research. There are three different approaches to vision-based tracking: marker-based, feature-based and model-based. The use of markers can be inconvenient, as it requires the placement of more than one marker on the assets (visibility problem) (Oliveira et al.); its applicability is limited, but it is a robust and stable solution. The feature-based approach consists of detecting salient features in the images, e.g. corners, and is capable of enhancing the tracking stability and extending the tracking range. The model-based approach consists of detecting and matching models from existing databases: 3D models, e.g. 3D graphical objects, are matched with the real environment to find correspondences for camera pose estimation. One of the main disadvantages of marker-less tracking is that the recognition speed drops as the number of objects to be recognized increases. For robust tracking, the best option is to use a combination of different approaches, because the dynamics of ambient lighting and poor sensor fusion can affect the tracking stability.
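As an illustration of the marker-based approach mentioned above, the sketch below detects a fiducial marker and estimates the camera pose. It assumes OpenCV with the aruco contrib module (whose API differs between OpenCV versions) and a camera calibration (camera_matrix, dist_coeffs) obtained beforehand; it is illustrative only and not part of the thesis' methodology.

```python
# Illustrative marker-based tracking sketch, assuming OpenCV with the aruco
# contrib module and a pre-computed camera calibration.
import cv2
import numpy as np

def detect_marker_pose(frame, camera_matrix, dist_coeffs, marker_size=0.05):
    """Detect ArUco markers in a frame and estimate the pose of the first one."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    if ids is None:
        return None  # no marker visible: virtual content cannot be registered
    # 3D corners of a square marker of known physical size (in metres),
    # listed in the same order as the detected 2D image corners.
    half = marker_size / 2.0
    object_points = np.array([[-half,  half, 0], [half,  half, 0],
                              [half, -half, 0], [-half, -half, 0]], dtype=np.float32)
    image_points = corners[0].reshape(4, 2).astype(np.float32)
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                                  camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None  # rotation/translation used to align the overlay
```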


Figure 3 Interaction Loop

2.2 AR in Industrial Maintenance

AR is a context-aware system: it is able to collect and analyse context information and adapt its functionalities to varying situations. The context is any information that can be used for defining a situation. For example, information regarding the level of expertise of a technician can be used for defining the type of interaction between the user and the machine. AR is applied in many different fields (Malaguti et al. 2014), such as military, education, entertainment and gaming, navigation, medical, manufacturing and repair. According to a recent survey on AR applications in Through-life Engineering Services (Dini & Mura 2015), there are three main areas of application: automotive, aerospace and industrial plants (maintenance, training and inspection tasks). Figure 4 shows the position of AR on the Gartner Curve. As an innovative and emerging technology, it is still under development and, compared to Virtual Reality, is far behind. The application of AR in maintenance activities is increasing, showing advantages in terms of efficiency, costs and safety. It allows the reduction of intervention time and material consumption, with a consequent minimization of maintenance costs. In addition, it has a crucial impact on the safety of the technicians while they are performing a task.


Maintenance is "a complex combination of all the technical, administrative and management activities planned during the life cycle of an entity, to keep it or return it to a state where it can perform the required function" (UNI EN 13306, 2003). The maintenance process is a sequence of elementary operations that ensure the equipment functionality (longer life cycle), the prevention of failures and the diagnosis and/or repair of equipment parts, in order to realise high-quality products and avoid any critical or unexpected situation, e.g. a breakdown, which requires a downtime of the production process. In terms of manufacturing, appropriate maintenance planning is required in order to achieve several goals, such as quality, availability and reliability of production systems, safety in the work environment, production maximization and minimization of the maintenance costs (Faccio et al. 2014).

Figure 4 Gartner Report (Hype Cycle for Emerging Technologies, 2016)

There are three main categories of maintenance: corrective, preventive and predictive.

Corrective maintenance is an unscheduled activity and is conducted when a failure occurs. It requires the interruption of the production process for the substitution of the damaged parts and the restoration of the initial functionalities of the system. Nevertheless, a list of possible equipment failures and causes is provided and, in case of a breakdown or emergency, the activities to be conducted can be easily identified.

Preventive maintenance is a scheduled and time-based activity, which involves routine practices, e.g. inspections, measurements and minor part replacements, for monitoring the working conditions and preventing faults from occurring. A list of parameters and conditions must be provided for every machine, e.g. equipment operation hours as a parameter and frequency/operation period as a condition, and all the data acquired by the system are used for predictive and trend analysis. It relies on average or expected life statistics in order to predict when a maintenance task should be performed.

All the information acquired from corrective and preventive maintenance is stored by the system and analysed over a fixed period. This gives continuous feedback to be used to improve the performance of the maintenance process. Nowadays, factors such as environmental sustainability, globalization and material recycling have moved maintenance from corrective to preventive (Fiorentino et al. 2014).

The predictive maintenance is a sequence of activities designed to monitor the conditions of the equipment in order to predict when the maintenance activities should be performed. It prevents unexpected equipment failures and allows a convenient scheduling of corrective maintenance activities by constantly monitoring the equipment conditions.

In terms of activities, there are two main classes (Henderson & Feiner 2011): those focusing mainly on the cognitive aspect of a task, e.g. a diagnosis task, and those focusing mainly on the physical aspect, e.g. assemble/disassemble, screw/unscrew. These classes include:

- Training activities: a combination of activities that enables the transfer of information and knowledge to the technicians. There are two different types of training: on the job and offline. AR has been used to design new types of instruction manuals and ready-manuals to support technicians in a safe and efficient environment. In (Fite-Georgel 2011), an AR manual to support car mechanics during a repair task is described: a basic 2D workflow of activities is represented on a set of PowerPoint slides and, for each step, a 3D graphical representation of the instructions is shown. Another possible application of AR is a screen that displays names, historical data and functions of different parts of a machine.

- Monitoring and diagnosis activities: a periodical control of the machine conditions, e.g. usage level and health state, to evaluate the status of a machine. Sensors or machine parameters are used to register the information that is then used as input for subsequent activities, such as replacing a component or modifying the temperature of the liquid in the machine. While monitoring an engine, an AR system can store and analyse data in real time and on the real scene, e.g. analyse the level of usage and deterioration of a machine or component. An alert system is used for monitoring the machines: different colours or sounds can inform about abnormal situations or scheduled activities. For example, the machine temperature can be monitored and, if it goes below a certain value (condition), a message can be displayed and an arrow can point to the part to analyse (a minimal sketch of such a rule is given after this list).

- Repair activities: a sequence of operations to restore the functionality of the machine after the occurrence of a failure that requires the replacement or regeneration of the broken components. Commonly, technicians need to disassemble the engine in order to change the broken part. An AR application gives the technician all the instructions for the execution of the task, from the actions to be taken to the exact place where a repair task is required.
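The threshold-based alerting described for monitoring and diagnosis activities can be illustrated with a minimal rule sketch; the parameter names, threshold and overlay fields below are hypothetical, not taken from the thesis.

```python
# Hypothetical threshold-based alert rule for an AR monitoring overlay;
# the parameter names, threshold and overlay fields are placeholders.
from dataclasses import dataclass

@dataclass
class AlertRule:
    parameter: str       # e.g. "coolant_temperature"
    min_value: float     # condition: alert when the reading drops below this value
    message: str         # text shown in the AR overlay
    highlight_part: str  # component the overlay arrow should point at

def evaluate(rules, sensor_readings):
    """Return the (message, part) alerts triggered by the current readings."""
    alerts = []
    for rule in rules:
        reading = sensor_readings.get(rule.parameter)
        if reading is not None and reading < rule.min_value:
            alerts.append((rule.message, rule.highlight_part))
    return alerts

# Example: a temperature below the allowed value triggers an overlay message.
rules = [AlertRule("coolant_temperature", 60.0,
                   "Coolant temperature too low", "pump_inlet")]
print(evaluate(rules, {"coolant_temperature": 52.3}))
```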

There are two types of information for the correct execution of a maintenance task: the instructions on carrying out the activities (how) and the information about when the maintenance activities should be conducted (when). With the introduction of AR, there are two more elements to consider: the information that the system gives to the user during the execution of a specific task, and the user action or request to the system. Alongside the increasing machine complexity, the functionalities related to safety, diagnosis and maintainability have become critical (Malaguti et al. 2014). For complex and long-life equipment, the aim of maintenance is to increase the life period and reduce the risk of equipment obsolescence, e.g. mechanical or electronic components, materials, test equipment (Roy et al. 2016). According to a recent study (Fiorentino et al. 2014), the application of AR in industrial maintenance can improve time and error rate compared to a paper-manual maintenance execution. In addition, a qualitative analysis has provided the following results: AR is easy to use, intuitive and ensures a high level of satisfaction for the user. Furthermore, there are several potential advantages in using AR (Fiorentino et al. 2014), (Dini & Mura 2015):

- Time and cost savings (e.g. operation time, time for updating, modifying and reporting information, as well as the related costs)

- Error rate reduction (e.g. wrong tool selection, wrong part positioning, removing the wrong component, modifying, updating or reporting the wrong information, or even localizing the wrong item, wrong information/history details for the item)

- Safety in the workplace

- Reliability of the maintenance equipment

- Less-skilled operators

- Data are up to date (updated on the job in real time)

- Registered multimedia content (e.g. 3D static or animated content)

- It is a paperless system: the knowledge is stored and used for different purposes

- The level of information can be customized according to the type of task and technician skills

- It can be integrated with other computer devices

- It is an immersive system

On the other hand, there are several issues related to AR applications (Di Donato et al. 2015):

- The discomfort of the devices (e.g. heavy, bulky devices)

- The limited mobility for the technicians


2.2.1 AR Solutions

Augmented Reality is a human-machine interaction tool. Appropriate solutions for different situations are essential in allowing users to intuitively interact with the system. An effective AR solution should focus on three elements, namely the ease of use, the usability of the interfaces and the accurate and precise presentation of the information. As mentioned in (Azuma 1997), ergonomics and ease of use are important issues related to the application of AR, as the potential of the technology is to make the user's execution of the task easier.

The purpose of this section is to analyse and evaluate different hardware solutions for different contexts of use.

2.2.1.1 Visual Display Solutions: techniques and categories

There are three methodologies for overlaying virtual content on the real-world environment (Dini & Mura 2015), namely, optical combination (optical see-through), video mixing (video see-through) and image projection. The video mixing technique is the most common in AR applications, while the optical combination and the image projection represent a lower percentage of applications (Figure 5). In addition, there are three main categories of devices, namely, HHD, HMD and SD. In this section, after a general overview of the techniques and categories, a detailed analysis of the differences among the solutions is provided.


2.2.1.1.1 Video See Through Technique

The real scene is captured by a camera and transferred to a computer. The virtual content, created by a scene generator, is superimposed on the real scene and displayed on a monitor screen, which is placed in front of the user's eyes. A conceptual diagram of this technology with an HMD has been taken from (Azuma 1997) and is presented below. It is easy and cheap to implement, it is flexible for indoor and outdoor environments, and there are no delays in overlaying the virtual content on the real environment. On the other hand, the resolution of the real scene is low, the latency is high, and the level of safety is limited: if the power shuts down, the technician is not able to see the real environment, because the monitor becomes completely black.

Figure 6 Video See Through HMD (Azuma,1997)

2.2.1.1.2 Optical See Through Technique

The virtual content is projected on translucent screens (optical combiners), e.g. glasses, which are placed in front of the user's eyes. The user can look directly through the optical combiners to see the real environment, as they are partially transmissive. They are also partially reflective, so that the user can see the virtual content on the real scene. A conceptual diagram of this technology has been taken from (Azuma 1997) and is presented below. It is expensive to implement due to the complexity of the technology. In addition, the applicability is limited to indoor environments. On the other hand, the resolution of the real scene is not reduced by the optical combiners and the latency is low. Differently from the video solution, if the power shuts down, the technician is still able to see the real environment, which means that the level of safety provided is high. Furthermore, it is applicable to extreme environments where technicians require direct visual contact with the environment. There is one main issue regarding both video and optical techniques: contrast. By contrast, we refer to the level of light that a device can detect, which is much smaller than the range detected by the human eye. It is a critical issue for optical solutions: if the real environment is too bright, the virtual image will not be seen by the user, while, if it is too dark, the real environment will not be seen. This problem is caused by the fact that the user has a direct view of the real world. Contrast is not a big issue with video solutions, because they use cameras and the combined view of the real and the virtual is generated by a monitor; nor is it a big issue with projectors.

Figure 7 Optical See Through HMD (Azuma, 1997)

2.2.1.1.3 Projective Technique

The virtual content is projected over the surface of the real environment. In the development of a projective solution there are three main aspects to consider: the relative position of the operator, the shape and surface of the real object, and the projection parameters of the display device (e.g. light intensity). Surface types, e.g. planar or curved, and the material of the physical object, e.g. steel or wood, can affect the legibility of the projected image, in terms of both performing time and accuracy. An advantage of this solution is the user's awareness of the work environment around him, while, with the video or optical solutions, the user's attention focuses on a specific image frame, with the consequent problem of the "tunnel vision effect". Furthermore, it offers a wide field of view, by covering large surfaces. On the other hand, the applicability is limited to indoor environments and the system needs to be recalibrated each time the environment or the distance from the surface changes.

2.2.1.1.4 Hand Held Devices (HHD) Category

Handheld devices are smartphones, tablet computers and other mobile devices, which use LCD displays to present the information and a camera for the augmentation. They are mostly video see-through but, in (Krevelen & Poelman 2010), there are examples of optical see-through and projective (HMPD) solutions. The video HHD are also called "closed view HHD", as they do not allow any direct view of the real world: the video camera provides the view of the real world to the user. The new generation of devices has powerful processors, large screens and built-in location sensors and cameras (Santos et al. 2015). At the same time, the memory storage is low, and they need a server for image analysis and object recognition. Their usability in maintenance is limited, as their biggest limitation is the operability for the technicians. An example of an HHD application is presented in Figure 8 (Dini & Mura 2015).

According to (Santos et al. 2015) there are several perceptual and ergonomic issues related to this type of solution.

Perceptual issues are:

- The display area is not in front of the user's eyes

- The level of subjective comfort (cognitive comfort) is limited

- Inappropriate depth perception accuracy

Ergonomic issues are:

- The application causes fatigue after extended use (e.g. muscular effort in handling the device)

- The device is too bulky or too heavy

- Hand interactions are difficult to perform, because one hand is used for holding the device while the other manipulates the display

- Movement limitations during the execution of a task

- The keypad is too small


Figure 8 The execution of a non-destructive testing on pipelines with HHD (Dini&Mura, 2015)

2.2.1.1.5 Head Mounted Devices (HMD) Category

The HMD, also called "Head Worn Display", is a device mounted on the head of the user, comparable to a normal pair of eyeglasses. The image is displayed in front of the user's eyes, at the same level as the user's view, enabling different users to see different images in the same space. Most AR applications involve HMD devices, even if they have several ergonomic problems, such as weight, eye strain after prolonged use, resolution, field of view and focal depth. Most of the applications are optical see-through but, in (Krevelen & Poelman 2010), there are also some video see-through and projective (HMPD) examples. An optical HMD is composed of a combiner, which is a transparent screen, e.g. glasses, and a display, while a video HMD is made of an opaque screen, a display and a camera. In terms of applicability, the optical HMD is more suitable for mechanical operations because of safety and cost issues, while the video HMD is better used in situations where more flexibility in merging real and virtual is required, e.g. light conditions can affect an optical solution. Because of this flexibility, video see-through HMDs can be applied in many different situations. An example of application is presented in the picture below (Dini & Mura 2015).

Apart from the technological solution adopted, there are several perceptual and ergonomic aspects to consider.

Perceptual issues are:

- The high level of immersion in the augmented scene: the content is displayed in front of the user's eyes.

- The information presented on a small-size screen must be concise in content and small in quantity, as it can easily be perceived as too much compared to the dimension of the screen.

Ergonomic issues are:

- The application causes fatigue after extended use (e.g. nausea, eye strain).

- The device is not comfortable (e.g. too bulky or too heavy).

- The device is not adaptable to the user's level of visibility.

The use of a helmet can support the camera and the display device, and it must be used in extreme environments for safety reasons.

Figure 9 A guided procedure for a manufacturing cell with an HMD (Dini&Mura 2015)

2.2.1.1.6 Spatial Device (SD) Category

A spatial display refers to the spatial augmented reality technique, where screens are statically placed in the environment. Most of the applications are based on projective screens, but there are also some applications using video see-through or optical see-through (Krevelen & Poelman 2010). As they are fixed, they do not need any kind of tracking, and the movements of the user, e.g. head movements, do not influence them. The virtual image is directly projected onto the surface of the real object and is particularly used for large presentations with small interactions (Krevelen & Poelman 2010). In the simplest situation, a single projector is used to project information on a single surface, while more complex situations require the use of multiple projectors to project images on more than one surface. Sometimes they also require the use of eyeglasses.

2.2.1.1.7 Visual Display Comparison

HMD, HHD and SD are digital information devices and they primarily interact with the user in a visual way. Following (Krevelen & Poelman 2010), a comparison between the technologies has been carried out. The following table highlights the differences among the solutions, focusing on the aspects related to the user and usability. The term "high" has been used in contrast to the term "low" as an indicator of the relative advantage/disadvantage of a solution compared to the others.

Table 1 HMD, HHD and SD Categories: a Comparison among Solutions

Attributes | HMD Optical | HMD Video | HMD Projective | HHD (All) | SD Video | SD Optical | SD Projective
Mobility | Controlled | Controlled | Controlled | Mobile | Fixed | Fixed | Fixed Installation
Wearability/Portability | High | High | High | Low | _ | _ | _
Safety | High level | Low level | High level | High level | High level | High level | High level
Applicability | Indoor | Indoor/outdoor | Indoor/outdoor | Indoor | Indoor | Indoor | Indoor
User Interactivity | High | High | High | High | Low | Low | Low
Input Interaction Mode | Context Sensitive | Context Sensitive | Context Sensitive | Haptic | Context Sensitive | Context Sensitive | Context Sensitive
Output Interaction Mode | Visual | Visual | Visual | Visual | Visual | Visual | Visual
Additional Output Interaction Mode | Context Sensitive | Context Sensitive | Context Sensitive | Context Sensitive | Context Sensitive | Context Sensitive | Context Sensitive
Eye Strain and Fatigue | High | High | Low | High | High | Low | Low
Body Movement Limitations | Low | Low | Low | High | _ | _ | _
N° of Users | Single user | Single user | Single user | Multiuser | Multiuser | Multiuser | Multiuser
Operating Area | Small area | Small area | Small area | Small area | Large area | Large area | Large area

2.2.1.2 Aural Device Categories

Aural solutions for AR applications are mobile audio systems, e.g. headphones, speakers and microphones, used with sound/speech interactions; they are naturally small and wearable. They are generally used with headphones, as their main purpose is to give information to a user in a private way and not to disturb the surrounding environment. Most of the AR applications add virtual sounds to the auditory perception by reducing the perception of the real acoustic environment.


The audio can be a real-time capture with a local microphone, a remote audio source or a pre-recorded audio stored in the device. There are two different techniques in Augmented Aural Reality (AAR) (Albrecht et al. 2011), namely, acoustic-hear-through (hear-through) and microphone-hear-through (mic-through) audio AR. The first method uses a bone-conduction headset to deliver computer-generated audio while leaving the ear canals free to receive audio from the real environment. The second method uses a headset with microphones located on each ear. The audio signals, captured by the microphones, are mixed with computer-generated audio and delivered to the user over the headphones. The main difference between them is the perception of the surrounding acoustic environment. In the first case, the environmental noise is not reduced, while, with headphones, real sound signals are captured and reduced. The result is a combination of real and virtual such that humans are not able to perceive the difference between the sounds. Differently from visual displays, they do not limit the attention of the user to a specific space frame and they allow the user to have a general overview of the surroundings. An example of a microphone-hear-through device has been extracted from (Albrecht et al. 2011) and presented in Figure 10. It is composed of two headphone drivers and two microphones, located on opposite sides. An example of an acoustic-hear-through device has been extracted from the web and presented below. Microphone headphones give a better sound quality compared to the acoustic-hear-through and give the user the possibility to control the level of sound perceived.


Figure 11 Bone Conduction Headphones (Web Image)

The design of headphones has to face several perceptual and ergonomic issues (Albrecht et al. 2011). Perceptual issues are:

- The resonance effect: the sounds perceived by the user must be natural, especially for extended usage of the display. The resonance effect, generated by earpieces that block the ear-canal entrance, must be considered when designing a natural AR system.

- The minimization of the latency.

- The adjustability of the level of the microphone signals transmitted to the headphones: if the level is too high, it is perceived as noise, while, if the level is normal, the user perceives the noise as not particularly disturbing.

- Communication problems: the dialogue between people is not natural when using headphones. At the same time, the use of this hardware does not affect the quality of the communication.

- The wind noise in outdoor environments.

Ergonomic issues are:

- Discomfort in wearing the headphones.

- Pain after prolonged usage (30 minutes).

- Mobility problems: the use of a mixer and cables makes it difficult for the user to move. The number of cables could be reduced by integrating the mixer with the headset.

In terms of applicability, sounds should not be used in very noisy surroundings. In addition, appropriate sounds should indicate positive/negative outcomes, with only one simple sound for catching one's attention or acknowledging that a task has been executed. Speech inputs are appropriate when more detailed verbal information is required, e.g. guidance, and when visual information is not available or appropriate. Voice messages are faster than text inputs and leave the user's hands and vision free.

2.2.1.3 Haptic Device Categories

The haptic sense is divided into the “kinaesthetic sense (force and motion) and the tactile sense (touch)” (Krevelen & Poelman 2010). The field of haptics is described in the figure below. The following description of haptic devices has been extracted from the ISO standards (ISO 9241:910 2012).

Figure 12 Haptic Components (ISO 9241:910 2012)

The main applications in industrial maintenance are AR tactile feedback devices. Most of them are hand-based, such as glove-based technology, with a continuous interaction between the user and the device (Ye et al.) and a limited workspace. An example is shown in Figure 13. The sense of touch is complementary to the sense of vision and is often used in combination with sound or visual information in order to enrich the user's perception of a scene. Tactile devices are able to perceive and communicate parameters such as temperature, roughness, hardness, object weight, size and shape, but they cannot be used to get an overview of the real environment, to perceive colours or even images in a 2D picture. Compared to vision or hearing, touch is a very intuitive sense and does not require any kind of interpretation. With the visual sense it is possible to have a quick overview of a scene and identify objects, while the sense of touch can immediately give information feedback to the user. Glove-based technologies use sensors to capture finger movements in real time and on the real scene. The sensor position can be either on the fingertips, the edge of the hand or even the palm (Hoang et al. 2013). Data gloves are suitable only for a few applications because they limit the use of the hands.

Another solution is the vibro-tactile bracelet (Figure 14). Compared to data gloves, it is smaller and enables free user movements within a larger workspace. It detects the user's movements and gives a tactile feedback.

Figure 13 Data Gloves for AR applications (Web Image)

Figure 14 Vibro Tactile Bracelet (Web Image)

Force feedback devices are a subclass of the haptic devices. They are generally used for the simulation of real-world tasks, planning or even training. The main difference between tactile and force feedback is the type of interaction between the user and the system. With data gloves or bracelets, a user interacts with the system in an indirect way. For example, while a technician is performing a disassembly operation, his movements are perceived as input signals in response to which the system gives a tactile feedback (output). With a force feedback device, the type of interaction is direct and the user has total control of the system. This means that the interaction starts after the user input, which is a physical interaction between the user and the device. A handler/joystick is used for manipulating 3D virtual objects and for providing force feedback, while a stylus-based device is able to give a force feedback to a user action. Two examples, extracted from (ISO 9241:910 2012), are presented below.

In conclusion, a tactile solution should not be the primary method of interaction, as it is not possible to use the sense of touch to distinguish positive from negative information. In hazardous environments, where safety protections are required, gloves are not applicable. In this case, tangible interfaces, e.g. physical buttons, or speech recognition offer a good solution. Sometimes they present information better than visual displays, particularly when the user’s focus is on more than one task, because visual attention constrains body movements while an eyes-free interaction allows greater freedom of movement.

Figure 15 Joystick and Stylus-Based Device

2.3 Research Gap

The analysis of the state of the art on AR highlights two main research areas that require further investigation: the study of AR as an interactive system and the study of its applicability in the domain of maintenance. Scholars have focused their attention on specific issues without developing a global understanding of AR as a whole. Only a limited number of articles discuss the applicability of AR, and most of them are qualitative analyses of usability principles. Furthermore, there is a lack of studies focused on understanding the user context and needs. It is important to consider the user as the central element around which to build the system, because AR is a support tool and, without user satisfaction, an AR system will not be effective.


3 RESEARCH METHODOLOGY

The research gap section identified two main areas of interest; for this reason, the proposed methodology focuses on these aspects.

3.1 Research Approach

The methodology used in conducting the research can be summarised as follows:

1. Analysis of the AR system following a user-centred approach
2. Identification of the methodological approach to follow
3. Definition of a set of metrics
4. Development of the model
5. Validation of the proposed methodology

3.2 Research Analysis

3.2.1 Data Analysis

Starting from the definition and selection of useful sources of information (3.2.2), a set of criteria has been defined to filter the initial collection of data and meet the objectives of the research (3.2.3).

3.2.2 Data Collection

The databases used for the collection of articles have been selected from a range of available sources. In (McKerlich et al. 2013) a comprehensive analysis of three major databases is provided: Scopus, Google Scholar and Web of Science. The latter is no longer the most used citation database, as more than 100 databases and tools have been developed in the last few years. They can be classified into three main categories. The first category of tools allows the user to search the full text of a paper and provides information regarding the number of citations. In addition, these tools automatically extract bibliographic information and cited references from the full text. Examples are the ACM Digital Library, IEEE Xplore and Google Scholar.


The second category of databases allows the user to search within the cited references. Examples are PsycINFO and Elsevier’s Science Direct. The last category includes databases for citation and bibliographic searching, and the most relevant example is Scopus. In (McKerlich et al. 2013) there is a general description of the differences between Scopus and Web of Science. An extensive analysis of the sources shows that Scopus provides significantly more coverage of HCI literature than Web of Science. In order to conduct an accurate and effective search in the area of AR applications, three major databases have been analysed: Google Scholar, Scopus and Science Direct.

3.2.3 Data Selection

A list of keywords has been used in conducting the research, with the aim of keeping the search broad enough not to exclude any potential contribution. Using Boolean combinations of words, such as “Augmented Reality AND Maintenance” and “Augmented Reality AND ergonomics”, more than 1000 articles have been collected. All the articles that, although containing these terms, are not related to industrial maintenance have been excluded. In addition, articles related to AR software techniques, e.g. tracking or rendering, have been excluded. By excluding the technical aspects of the technology, the intention is to focus the attention on the internal and external aspects of the user context. The criteria identified for filtering the research are listed below (a small illustrative sketch of the filtering step follows the list):

 First criterion: English language

 Second criterion: Engineering field, maintenance context and AR applications in maintenance
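
A minimal Python sketch of the filtering logic described above, under purely illustrative assumptions (the field names, example titles and keyword lists are not taken from this work), could look as follows:

# Illustrative only: keep English articles that match the Boolean queries,
# are in the maintenance/ergonomics scope, and are not purely about
# AR software techniques such as tracking or rendering.
articles = [
    {"title": "Augmented reality for maintenance training", "language": "English"},
    {"title": "Markerless tracking and rendering for AR", "language": "English"},
]

def keep(article):
    title = article["title"].lower()
    is_english = article["language"] == "English"
    matches_query = "augmented reality" in title
    in_scope = "maintenance" in title or "ergonomic" in title
    not_software_only = not any(term in title for term in ("tracking", "rendering"))
    return is_english and matches_query and in_scope and not_software_only

selected = [a for a in articles if keep(a)]
print([a["title"] for a in selected])  # only the first article survives the filter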


4 DEVELOPMENT OF THE METHODOLOGY

This section describes the process followed in the development of an innovative methodology for the design of new AR solutions.

4.1 Identification of the MCDM Method

The first part of the analysis regards the identification of a technique for the analysis and evaluation of the different criteria that need to be considered while designing new AR solutions. In (Velasquez & Hester 2013), a comparison among the most common multi-criteria decision making methods is provided.

Table 2 presents advantages and disadvantages of the most common techniques, together with their fields of application. The most suitable method for the purpose of this research is the Analytic Hierarchy Process (AHP), as it is easy to use and does not require a large amount of data. The AHP is a decision making tool based on the pairwise comparison of intangible criteria and the subjective judgement of experts to derive priorities among alternative solutions (Saaty 2008). The following paragraphs describe the AHP approach in detail, from the mathematical principles to its application with a practical example.


Table 2 Summary of the First MCDM Methods (Velasquez & Hester 2013)

4.2 The AHP Model

The AHP is a structured technique for organizing and analysing complex decisions, based on mathematics and psychology. It is a support tool for deriving ratio scales from paired comparisons. It was developed by Thomas L. Saaty in the 1970s and has been extensively applied over the last decades in many different fields, such as business, industry, government and education. By following a rigorous method, criteria are weighted by pairwise comparisons and alternative solutions are ranked. The input can be obtained from actual measurements, such as price or weight, or from subjective opinions, such as feelings of satisfaction and preferences.

The aim of the process is to help decision makers find a solution that best suits their goal and their personal judgement of the problem. A decision problem is decomposed into a hierarchy of sub-problems, each of which can be analysed independently. The top level is the goal, followed by the criteria and the alternative solutions. These elements can relate to any aspect of the decision problem, tangible or intangible, measured or simply estimated.
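
As an illustration of the underlying calculation, the following is a minimal Python sketch, not taken from this work, of how priority weights can be derived from a pairwise comparison matrix via the principal eigenvector, together with Saaty’s consistency check; the example matrix and the function name are assumptions made purely for demonstration.

import numpy as np

def ahp_priorities(matrix):
    """Return the priority vector and consistency ratio of a pairwise comparison matrix."""
    A = np.asarray(matrix, dtype=float)
    n = A.shape[0]
    # Principal eigenvector of the comparison matrix, normalised to sum to 1
    eigenvalues, eigenvectors = np.linalg.eig(A)
    k = int(np.argmax(eigenvalues.real))
    weights = np.abs(eigenvectors[:, k].real)
    weights = weights / weights.sum()
    # Consistency index and ratio (random index values from Saaty's tables)
    lambda_max = eigenvalues.real[k]
    ci = (lambda_max - n) / (n - 1)
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]
    cr = ci / ri if ri > 0 else 0.0
    return weights, cr

# Hypothetical 3x3 comparison of three criteria on Saaty's 1-9 scale
comparisons = [[1,   3,   5],
               [1/3, 1,   2],
               [1/5, 1/2, 1]]
weights, cr = ahp_priorities(comparisons)
print(weights, cr)  # a consistency ratio below 0.10 is conventionally considered acceptable

The same procedure is repeated at each level of the hierarchy: once for the criteria and once for the alternatives under each criterion.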

The next section describes in detail the criteria and alternatives that have been selected.


4.2.1 The Definition of the Alternative Solutions

The outputs of the model are three types of solutions, related to the way the information is presented. In order to build a model that can be used in the future, the solutions are not tied to any specific available AR application. Furthermore, by using the AHP technique the solutions are ranked (a small illustrative sketch of this ranking step is given after the list of alternatives below), and a designer can build systems that combine more than one solution.

The alternatives of the model have been categorised as follows:

 Visual Solutions: the information is displayed in a visual way. This means that the designers must focus on the visual sense, without excluding the other senses. Visual information includes text/numerical information, 2D/3D static/dynamic symbols and video.

 Aural Solutions: the information is presented as sounds. This means that the designers must focus on the aural sense, without excluding the other senses.

 Haptic Solutions: the information is presented as tactile feedback. This means that the designers must focus on the tactile sense, without excluding the other senses.
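
To show how the ranking mentioned above can be obtained once the criterion weights and the local priorities of the alternatives are available, the following minimal Python sketch uses purely hypothetical numbers (the criterion names and all values are assumptions, not results from this work):

# Illustrative synthesis step: the global priority of each alternative is the
# sum over all criteria of (criterion weight x local priority of the alternative).
criterion_weights = {"Mobility": 0.5, "Noise level": 0.3, "Visual load": 0.2}

local_priorities = {
    "Visual": {"Mobility": 0.4, "Noise level": 0.5, "Visual load": 0.2},
    "Aural":  {"Mobility": 0.4, "Noise level": 0.1, "Visual load": 0.5},
    "Haptic": {"Mobility": 0.2, "Noise level": 0.4, "Visual load": 0.3},
}

global_priorities = {
    alternative: sum(criterion_weights[c] * local[c] for c in criterion_weights)
    for alternative, local in local_priorities.items()
}

for alternative, score in sorted(global_priorities.items(), key=lambda item: -item[1]):
    print(f"{alternative}: {score:.3f}")

The highest ranked alternative would be the primary way of presenting the information, while the remaining scores indicate how the other modalities can complement it.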

4.2.2 The Definition of the Criteria

By following a user-centred approach, a list of criteria has been identified. These parameters have been grouped under the name of “User Requirements”.

The main objective of the research is to use these parameters as a guideline for designing effective solutions and maximising user satisfaction. Human factors, human-machine interaction and ergonomic factors have been considered, and a model has been developed.


4.2.2.1 A User-Centred Approach

Human-centred design is an “approach to interactive systems development that aims to make systems usable and useful, by focusing on the user’s needs and requirements, by applying human factors/ergonomics and usability knowledge and techniques. This approach enhances effectiveness and efficiency, improves human well-being, user satisfaction, accessibility and sustainability; and counteracts possible adverse effects of use on human health, safety and performance.” (ISO 9241:210, 2012). ISO 9241 “Ergonomics of Human System Interaction” is a multi-part standard from the International Organization for Standardization (ISO) that deals with both hardware and software ergonomic aspects. The term ergonomics refers to the “understanding of interactions among humans and other elements of a system, and the methods to design in order to optimize human well-being and overall system performance” (2.1). Ergonomic principles, if properly applied in the design of a new system, “aims at optimizing work strain, avoiding impairing effects and promoting facilitating effects.” (ISO 6385:2016). Impairing effects are all the negative consequences related to the use of a device, e.g. fatigue, nausea or dizziness after prolonged use, while facilitating effects are the positive consequences of adopting a solution, e.g. skills development. A well-designed ergonomic system must focus on the physical and cognitive aspects of the human-machine interaction.

The design of systems using human-centred methods improves user satisfaction due to a better understanding and learnability of the system. Consequently, the level of training and support required by a user is minimized, as well as user fatigue, discomfort and stress (ISO 9241:210, 2012).

The purpose of this section is to use the ISO 9241 as a guideline in the identification of the criteria that need to be considered while designing new AR applications.

Design requirements are qualitative aspects of a system that are inputs to its design process.

Designing solutions for interactive systems should include the following activities (ISO 9241:110 2006): understanding the user context of use, designing the user-system interaction and designing the user interface.
