Adoption of tools for the analysis and control of activities related to international shipments

Academic year: 2021


Contents

1. INTRODUCTION
 1.1 Problem statement
 1.2 Literature
 1.3 Content of the thesis
2. CASE STUDY
 2.1 Projest S.p.A.
 2.2 The enterprise customer
 2.3 Current structure of the information system
  2.3.1 CRM iMove Next
  2.3.2 iMove Distribution Software
 2.4 Business process analysis
 2.5 Operational database
 2.6 Making analysis on the operational database
3. NEEDS ANALYSIS AND CONCEPTUAL MODEL IDENTIFICATION
 3.1 Requirements definition
  3.1.1 Data warehousing
  3.1.2 The role of information
  3.1.3 ALPI Data warehouse
 3.2 Conceptual model definition
  3.2.1 Dimensional fact model (DFM) definition
  3.2.2 Dimensional Fact Model components
 3.3 Data marts conception
  3.3.1 Data marts definition
  3.3.2 Conformed dimensions
  3.3.3 Data marts conceptual modeling
4. LOGICAL MODEL IDENTIFICATION
 4.1 From conceptual model to logical model
 4.2 Logical modeling
  4.2.1 Logical modeling of the Jobs
  4.2.3 Logical modeling of the Invoice
  4.2.4 Logical modeling of the Distribution
5. SOFTWARE ENVIRONMENT DEFINITION
 5.1 Microsoft Integrated Development Environment
 5.2 SQL Server Integration Services (SSIS)
 5.3 SQL Server Analysis Services (SSAS)
  5.3.1 Analysis Services Tabular
  5.3.2 Bi-directional cross filtering
  5.3.3 xVelocity in-memory engine
 5.4 Power BI
6. PROJECT IMPLEMENTATION
 6.1 Data warehouse creation: Extract Transform Load
  6.1.1 Dimensions ETL process
  6.1.2 Facts ETL process
  6.1.3 Data marts ETL process
 6.2 Execution process and solution testing
  6.2.1 ETL execution process
  6.2.2 Test
 6.3 SSAS Tabular Data Model
  6.3.1 Pareto Analysis KPI
  6.3.2 Time intelligence calculations KPI
  6.3.3 Budget Variance Analysis KPI for management control
7. REPORTING AND DASHBOARDS
 7.1 Reporting
 7.2 Dashboards

CHAPTER 1

1. INTRODUCTION

1.1 Problem statement

Projest S.p.A. is part of the Basis Group and specializes in software development and business consulting. The increasing amount of business-critical data has led many companies to adopt business intelligence solutions in order to remain competitive in a dynamic business context.

One of its strategic sectors is international shipments. Projest currently offers a solution, the iMove management software, which handles international shipment operations from order to delivery. Each customer has a real-time trace of the order, shipment, distribution and billing states.

Once all business operations are logged, natural and hypothetical questions arise that require querying across the data. Customers such as ALPI would like to improve decision making by extracting useful information, figures and trends from operational data, but the existing raw data does not support such analysis. This is where business intelligence solutions come in: one way to satisfy these requirements is to design a data warehouse built around the main processes of the iMove software. This makes it possible to synthesize the data and organize it in a single repository, on which analyses that drive the business can subsequently be run.

The project consists of studying the current database, automating the extraction, transformation and synchronization processes that feed the new data warehouse, and creating reports and dashboards that help monitor the processes. The data warehouse will be built in a way that accommodates frequent changes and improvements. The task is to work on a large set of data from the transactional database and build a coherent data warehouse from it.

1.2 Literature

During the realization of the data warehouse project, and especially during the modeling phase, the design approach followed [A. Albano 1]. For the application development, the references are [MSDN 17], the official Microsoft website, and [SQL BI 17], an official website founded by a Microsoft-certified team of experts. The project's SSAS data models were built with reference to the new Analysis Services features of SQL Server 2016 presented by [K. De Jonge] and [A. Rezzani 16]. The data analysis was done by learning advanced concepts of the DAX language [M. Russo & A. Ferrari 15].

1.3 Content of the thesis

The project was carried out with a project manager responsible for Business Intelligence development and a team of IT consultants in charge of requirements definition and the design and implementation of BI solutions.

The purpose of this project is to build a data warehouse for the customer Albini e Pitigliani (ALPI), aimed at helping the enterprise drive the business and understand the most relevant trends by means of specific indicators. Indeed, after years of real-time business management with the operational system, a consolidated data source on which analysis and mining queries can be run became essential in order to establish the foundation for decision support.

In this sense, the thesis describes the project from modeling to implementation and presentation. The second chapter is dedicated to the case study: the current structure of the information system is described and the business process at the core of this project is detailed. The third chapter concerns the requirements definition and conceptual model identification; it is the part in which the analysis is done in close contact with the customer. Chapter 4 is about the logical model design. Chapter 5 defines the development environment and presents the Microsoft Business Intelligence tools: the Microsoft BI stack for the implementation side and Power BI for the presentation layer. Chapter 6 contains the implementation of the project, which is divided in two main parts: the Extract Transform Load process and the Tabular Data Model. Chapter 7 describes reporting and dashboards and presents the Power BI visualizations.

CHAPTER 2

2. CASE STUDY

2.1 Projest S.p.A.

Projest S.p.A. is a company that has operated since 1998 in management consulting, especially in information systems engineering, software development and business process automation. Since 2012, Projest has been part of the Basis Group, a business with more than 200 employees and offices in the UK, Qatar, Moldova, Romania and Italy.

The objective of the Basis Group is to offer its customers an expanded team with 360° competence in the IT environment: a unique partner for different needs, with more efficiency, timeliness and reliability.

2.2 The enterprise customer

The ALPI Group is a recognized player in the Freight Forwarding and Transportation industry and is also specialized in Supply Chain and Logistics Solutions. With 70 years of history, the company has established best-practice standards in Ground Transportation, Air and Ocean Shipments, Customs Clearance and Supply Chain Management.

Over 70 years of history, over 1,000 direct employees worldwide, over 20,000 international customers and over 200 direct destinations: this is the “ALPI world”.

Airfreight, sea freight and ground transportation are the main activities of the ALPI network. All phases of the process are covered, from order management to order delivery, including pick-up, consolidation, documentation, customs clearance, certifications management, best carrier selection and a track & trace system that can be tailored to the customer's needs.

2.3 Current structure of the information system

2.3.1 CRM iMove Next

iMove Next CRM is the customer relationship management software. It offers a set of real-time functionalities such as budgeting with cost and revenue optimization, last-minute offer creation for profiled customers, purchase order management, an e-booking platform, supplier profiling per offered service, cost/revenue forecast analysis and offer/order conversion rate analysis.

2.3.2 iMove Distribution Software

iMove is a Projest software solution for shipments, transport and warehousing. The software is the product of years of experience alongside the primary operators of the sector, and the outcome of daily on-the-job work to find solutions that streamline and simplify all the activities linked to goods transfer and shipment.

The components of the solution are entirely web-based, refreshed in real time and available for online consultation. iMove is a scalable software that supports distributed use across many locations all over the world. Its main features are the management of import/export shipments via sea, air and land, collection and delivery management, shipment documentation management, cost and revenue management, partial-load (groupage) management and track and trace management.

2.4 Business process analysis

The main sector in which the work will be done is logistics and international shipments. The customer is an international group of related companies that manages goods shipment in import and export via land, air and sea. When a customer produces a shipment order, it is forwarded through EDI to the supplier. The supplier confirms, and a shipment order is launched. An invoice that includes the order and delivery warehouses is produced. This process contains four main tasks: order, shipment, distribution and billing. The enterprise customer ALPI GLOBAL, an international shipments company, would like to have different insights into the business trends to improve decision making and follow the key performance indicators that concern Italy.

The work to be done is about creating a data warehouse starting from the current relational database and modeling all business processes that are of interest for managers. This data warehouse will include data marts related to order, shipment, distribution and billing and produce the appropriate reports on demand of the customer. The main topics of these reports are:

1. Order: a business process that starts when a customer launches a purchase order via the iMove Next platform. It requires monitoring the level of service, such as order deadlines and quantities. In this phase, the buyer purchases goods from vendors by sending them a purchase order. At the same time, the ALPI foreign forwarder registers the order through EDI.

2. Shipment: the shipment, or job, represents the goods transfer operation following each order request. It occurs between an origin hub warehouse and a destination hub warehouse. During this process, order volume, marginality, weight and other measures related to single shipments must be checked in order to maintain a global vision that allows data analysis using filters at different levels.

3. Distribution: a process that involves the shipment document. Indeed, each shipment order generates a job document that contains collect and delivery note information such as the booking date, the kind of shipped goods and their characteristics.

4. Billing: represents the invoice production between the retailer and the customer, following the corresponding Incoterms rules for shipments. These rules define which entity has to pay for each part of the whole shipment journey. In this phase, it is required to analyze the revenues in relation to different measures.

2.5 Operational database

The current operational database behind the iMove distribution software is called ELS304. It contains 533 tables that cover all aspects of the iMove business processes and represents the available data source that will populate the data warehouse. Due to the huge size of the database, only a small part of the table relationship diagram is presented. This part is not randomly chosen: it contains all the tables that will populate the 'Addressbook' dimension in a later step, giving an early insight into the upcoming data warehouse modeling.

FIGURE 1. CURRENT OPERATIONAL DATABASE

The 'T_countries' table lists the countries that may be referenced by the 'AddressBookAddress' table. 'AddressBookAddress' contains the whole list of all kinds of addresses that can be associated with the 'AddressBook' table, such as the email address, the mobile number, the phone number, etc. The 'AddressBook' table contains the names of all entities involved in the iMove distribution system, such as the haulier, the company, the office, the buying agent, etc. 'OrderCompanyCustomers' contains the customers of 'OrderCompanies'. 'OrderCompanies' lists all order company names. 'PJ_users' contains the list of all users of the iMove distribution system, which are themselves entities in the 'AddressBook'. Below, a global schema of the current structure of the global business process is presented. For example, the buyer TJX USA buys goods from retailers in Europe by sending purchase orders.

FIGURE 2. GLOBAL BUSINESS PROCESS

2.6 Making analysis on the operational database

In the current database, data is application oriented, tables are normalized and many columns hold coded values. The related reporting system (presented below) allows basic analytical operations like simple date filters and aggregations, but it cannot present information in terms of diagrams, charts, prospective analysis or key performance indicators. If a specialist wants to analyze data with the current database structure, access is difficult because of the huge volume of sparse data that must be collected. Data extraction is therefore not fruitful and its quality is uncertain, because the kind of information we want to extract is different from the raw data available in the database. Analytical queries on the current structure would also perform poorly, since the specialist must execute many join operations to aggregate data, which is costly in terms of system performance.
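To make the join cost concrete, the sketch below rebuilds a tiny, hypothetical fragment of the normalized structure described above (the table names follow the chapter, but the columns and data are invented, not the real ELS304 schema). Even a simple count of entities per country already needs two joins:

```python
# Hypothetical, simplified fragment of the normalized OLTP source.
# Table names follow the text; columns and rows are invented.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE T_countries (CountryId INTEGER PRIMARY KEY, CountryName TEXT);
CREATE TABLE AddressBook (AddressBookId INTEGER PRIMARY KEY, Name TEXT);
CREATE TABLE AddressBookAddress (
    AddressId INTEGER PRIMARY KEY,
    AddressBookId INTEGER REFERENCES AddressBook(AddressBookId),
    CountryId INTEGER REFERENCES T_countries(CountryId),
    AddressKind TEXT);
INSERT INTO T_countries VALUES (1, 'Italy'), (2, 'France');
INSERT INTO AddressBook VALUES (10, 'Haulier A'), (11, 'Office B');
INSERT INTO AddressBookAddress VALUES
    (100, 10, 1, 'mail'), (101, 11, 2, 'phone');
""")

# Even "how many entities per country" needs two joins on the normalized source.
rows = con.execute("""
    SELECT c.CountryName, COUNT(*) AS entities
    FROM AddressBook ab
    JOIN AddressBookAddress aba ON aba.AddressBookId = ab.AddressBookId
    JOIN T_countries c          ON c.CountryId = aba.CountryId
    GROUP BY c.CountryName
    ORDER BY c.CountryName
""").fetchall()
print(rows)  # [('France', 1), ('Italy', 1)]
```

In a denormalized dimension such as the future 'Addressbook', the country would instead sit directly on the dimension row, removing these joins at query time.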

CHAPTER 3

3. NEEDS ANALYSIS AND CONCEPTUAL MODEL IDENTIFICATION

In this chapter, the data warehouse project is first introduced; then the conceptual model identification is detailed, followed by the final logical modeling.

3.1 Requirements definition

3.1.1 Data warehousing

The data warehouse has been a key element of enterprise IT infrastructure for 30 years. It is a structured database designed to store all the data used for analytical queries and decision making. As Inmon defined it, it is a subject-oriented, integrated, non-volatile and time-variant collection of data in support of management decisions. Subject-oriented means that data is organized by subject, such as sales, transactions, etc. Integrated means that the data warehouse integrates data from multiple data sources. Time-variant specifies that historical data is kept, unlike in operational databases where only the current data is kept. Non-volatile means that once data is stored in the data warehouse, it remains there and does not change.


3.1.2 The role of information

Information occupies a central role in all business processes and is the basis of decision support. Indeed, by improving information collection, organization and analysis, the enterprise improves its quality of service and its management, through cost reduction and profit maximization. It can also profile customers, anticipate market trends and improve communication with customers. So, the goal is to create a data warehouse that best fits the business needs by producing specific, reliable, relevant and up-to-date information.

3.1.3 ALPI Data warehouse

The increasing number of data sources has led many enterprises to adopt analysis solutions that allow them to transform data into useful information, which can be presented in visual representations like dashboards. To make this more effective, the information can be displayed and accessed on multiple devices. Consequently, the client decided to implement a data warehouse starting from the ELS304 operational data source to enable data analysis and help decision making.

As the ALPI data warehouse project had already been started, the model identification was complete. The following work therefore starts from the data warehouse building and implementation.

Inmon approach

The methodology followed is the Inmon approach, known as the top-down approach, driven by the need to create an integrated, subject-oriented, time-variant and non-volatile data reference. This methodology builds the data warehouse first and then derives the data marts from it. It offers the advantage of producing a strong, long-lasting solution in which the data marts are developed from a unique, already integrated source (the data warehouse) that covers all aspects of the business, simplifying business process understanding. Since the Inmon approach makes it possible to model all the entities involved in the iMove software, the result is a complete and reliable subject-oriented data warehouse that naturally generates well-defined, process-oriented data marts.


Data warehouse architecture following Bill Inmon approach

FIGURE 4. DATA WAREHOUSE DESIGN: INMON APPROACH

Data warehouse description: DM_ALPI_GLOBAL

The data warehouse design led to the creation of the following fact tables. All the data warehouse dimensions will be used in data mart modeling; thus, they will be described during the creation of the data marts.

Job Contains the shipments description with the single job as granularity level.

JobDetail The details about each job, so this fact table describes the jobs at a deeper granularity level.

JobDocument Shipment document of each job. This fact table allows analysis of the documents exchanged during shipments.


Order Purchase order is an event launched by the customer.

Cadn Collect and delivery notes are a set of events (facts) related to shipments. The granularity is the single note/document.

CadnDetail Contains information about the Collect and Delivery note with a deeper level of detail.

CustomsEntry Belongs to customs bills that only French customers use.

Exception Exceptions are events linked to order exceptions such as an expired payment term or a credit limit out of range.

Invoice Allows analysis of billed services, called CBS in the model. The granularity is at the level of the single invoice. An invoice may contain many rows, each representing a CBS service.

3.2 Conceptual model definition

3.2.1 Dimensional fact model (DFM) definition

It is a conceptual model for data mart design. It consists of a set of fact schemas with their dimensions and measures. It starts from documentation about customer needs and the data available in the data warehouse. The DFM enables a preliminary analysis of the job: the designer can start querying the model and communicate with the end user to improve the requirements specification.


3.2.2 Dimensional Fact Model Components

Facts

A fact represents a relevant business process to model. It models an important event such as sales, shipments, etc.

Dimensions

As a fact is defined with reference to a certain context, dimensions can be described as the materialization of that context. They intuitively arise from the relationships or entities linked to the facts to be modeled.

Measures

A measure is a numerical property of a fact on which calculations can be made to support decision making at a later stage.

Modeling Conventions

Even if there is no standard for data warehouse modeling, it is best practice to adopt some modeling conventions. This helps organize the work and avoid ambiguities that may arise when the data warehouse grows into a very large data set. For example, attribute names should be meaningful and clear, and if two attributes have the same meaning in different fact schemas, they should have the same name.

The data marts are built with reference to the customer needs by focusing on available data in the current data warehouse.

Role-playing dimensions

Some dimensions are used repeatedly; they are known as role-playing dimensions. Dimensional role-playing occurs when a single dimension appears in the same fact table several times with different roles. For example, in our data model the calendar dimension is used many times by each fact table. So, instead of physically adding the same dimension table as many times as needed, the fact table references the single calendar table every time it is needed.

Conformed dimensions

Conformed dimensions are dimensions that are related to multiple fact tables within the same data warehouse. The list of conformed dimensions that belong to more than one process is presented below; they are used with the same structure and content everywhere.

Choosing the right data warehouse elements

The conception of a dimensional fact model requires four steps:

- Defining the process (shipment, order, distribution and billing) from the point of view of the customer requirements.

- Choosing the granularity: for example, the level of detail we work at is the single shipment operation (called a job in our case). The single job is a transaction fact, an event that occurred at a specific point in time. The business decisions for this process will be driven by the measurements made on the aggregated data.

- Choosing the right dimensions, on which each record of the fact table depends. They give context to the fact and are the basis of measure analysis.

- Choosing the appropriate measures that will populate the fact table. This step is important because these measures are likely to drive decision making afterwards.
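The four steps above can be sketched as a tiny star schema: process = shipments, grain = one row per job, dimensions = office and calendar, measures = cost and revenue columns on the fact. All names and figures are illustrative, not the project's actual DDL:

```python
# Minimal star-schema sketch of the four design steps (illustrative names).
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE DimOffice (OfficeKey INTEGER PRIMARY KEY, Office TEXT, Country TEXT);
CREATE TABLE DimCalendar (DateKey INTEGER PRIMARY KEY, Year INTEGER, Month INTEGER);
CREATE TABLE FactJob (
    JobKey INTEGER PRIMARY KEY,          -- grain: one row per single job
    OriginOfficeKey INTEGER REFERENCES DimOffice(OfficeKey),
    ATD_DateKey INTEGER REFERENCES DimCalendar(DateKey),
    ExportRevenue REAL, ExportCost REAL  -- measures
);
""")
# Each fact row must respect the declared grain: exactly one job per row.
con.execute("INSERT INTO DimOffice VALUES (1, 'Prato', 'Italy')")
con.execute("INSERT INTO DimCalendar VALUES (20210105, 2021, 1)")
con.execute("INSERT INTO FactJob VALUES (1, 1, 20210105, 1500.0, 900.0)")

# Measures are additive, so aggregations such as margin are straightforward.
margin = con.execute(
    "SELECT SUM(ExportRevenue - ExportCost) FROM FactJob").fetchone()[0]
print(margin)  # 600.0
```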


Business dimensional lifecycle

The business dimensional lifecycle describes the project development phases.

FIGURE 5. BUSINESS DIMENSIONAL LIFECYCLE

3.3 Data marts conception

3.3.1 Data Marts definition

Since the model of the data warehouse has already been completed, the data mart creation consists of choosing the right facts to model according to the business needs, together with their granularity, dimensions and measures. To meet the business requirements, four data marts are created; they will contain data only about import and export relative to Italy.

JOB_DM_ITALIA describes the shipments, and the granularity chosen is the single job. Analyzing the shipments requires collecting data from a set of dimensions. A shipment happens between an origin company and a receiving company, and each company has a group of offices that might act as origin or receiving offices. This fact refers to the Addressbook to locate the different entities involved. A shipment requires a vessel name and a carrier, and it is also characterized by a container type and a container load type according to the iMove reference.

ORDER_DM_ITALIA describes the purchase orders launched by buyers to ALPI customers (retail companies like TJX), with the single order as granularity level. A buyer can rely on a buying agent as an intermediary between him and the customer. Each order involves a company, a PODepartment and the state of the shipment service, called Term in ELS304.


INVOICE_DM_ITALIA describes the invoicing process. Each invoice is composed of many rows, each belonging to a CBS service, and the granularity is at the level of the CBS. An invoice identifier can therefore be associated with many CBS rows, and a row defines the kind of service for a given invoice. A job can be associated with many invoice rows. This process requires an origin and a receiving office, as well as information about the invoice type, the currency and the CBS service.

CADN_DM_ITALIA describes the collect and delivery notes. They are documents accompanying a goods shipment that list the description and quantity of goods delivered. The data collected during the process requires defining the importer and the exporter, the warehouse where the shipment is received, the company with its corresponding office and the iMove users that register the operations.

Enterprise Data Warehouse Bus

It is a fundamental tool for designing the data warehouse architecture. Each column represents a business process (a data mart) and each row a conformed dimension. This gives an insight into the dimensions of interest for data mart creation.
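A minimal sketch of such a bus matrix, with membership reconstructed from the dimensions and data marts discussed in this chapter (indicative only, not the thesis' actual matrix):

```python
# Enterprise Data Warehouse Bus sketch: keys are dimensions, values are the
# data marts that use them. Membership is reconstructed from the chapter.
bus = {
    "Company":     {"JOB", "ORDER", "INVOICE", "CADN"},
    "Office":      {"JOB", "INVOICE"},
    "Calendar":    {"JOB", "ORDER", "INVOICE", "CADN"},
    "Addressbook": {"JOB", "ORDER", "INVOICE", "CADN"},
    "Term":        {"JOB", "ORDER"},
    "Mode":        {"JOB"},
}

# A dimension shared by two or more processes is a conformed dimension.
conformed = sorted(d for d, marts in bus.items() if len(marts) >= 2)
print(conformed)  # ['Addressbook', 'Calendar', 'Company', 'Office', 'Term']
```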

3.3.2 Conformed dimensions

After defining the processes that the data mart fact tables belong to, the conformed dimensions are presented first, because they are common to many facts. Then the dimensional modeling of each process is described.

Company

This dimension refers to the freight forwarding companies.

Attribute Description

Company Freight Company’s name

Country Company’s country

Office

Each freight forwarding company has a set of offices.

Attribute Description

OfficeCode Code used to reference the office

OfficeLine Office line

Office Name of the office. Each office belongs to a company

Country Country of the office

Addressname Global address of the office

Calendar

To simplify the visualization, only important and most used attributes of the calendar table are exposed.

Attribute Description

Year Description of the whole date

Month Month name

Quarter Quarter of year

Week Week of the year number

Day Day of year number


Day_of_week_name Example: day1, day2 etc.

Day_of_week_fullname Day of week full name. Example: Monday.

Month_of_year Month number

Month_of_year_name Example: month1, month2, etc.

Month_of_Quarter Month of quarter number

Addressbook: a table that contains the complete reference to all the entities involved in the business.

Attribute Description

Name Name of the exporter company

VAT VAT identification number

Address Address of the company

City City

ZIP Zip code

State State of the company

CountryISO Country abbreviation

Term: represents the standard definitions of trade terms, conventionally called International Commercial Terms (Incoterms), used for informational purposes.

Attribute Description

TermsCode Code of Term rules

Terms Rules shared between the buyer and the seller that must be specified in the contract


Mode: describes the shipment mode, which can be air, land (motor) or sea.

Attribute Description

Mode Shipment mode: air, sea or motor.

3.3.3 Data Marts Conceptual Modeling

The first business process to model is the Shipments: JOB_DM_ITALIA

During this process the shipments are registered, and the granularity chosen is the single job (or shipment). The aim of modeling the shipments is to be able to extract information such as the number of jobs by origin office or company and by period, and to identify the most relevant countries in terms of costs and revenues. This analysis requires the dimensions that give context to the fact JOB_DM_ITALIA: company, office, calendar, exporter, importer, Term and Mode. To avoid loss of information, we first use all the attributes of the data warehouse; the unused ones can later be hidden during the creation of the SSAS Tabular data models.
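An illustrative example of the kind of query this fact is meant to answer, counting jobs and totaling revenue per origin country and year, over a made-up miniature star schema (names and data are invented):

```python
# Illustrative analytical query over a toy JOB star schema.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE DimOffice (OfficeKey INTEGER PRIMARY KEY, Office TEXT, Country TEXT);
CREATE TABLE DimCalendar (DateKey INTEGER PRIMARY KEY, Year INTEGER);
CREATE TABLE FactJob (JobKey INTEGER PRIMARY KEY,
                      OriginOfficeKey INTEGER, ATD_DateKey INTEGER,
                      ExportRevenue REAL);
INSERT INTO DimOffice VALUES (1, 'Prato', 'Italy'), (2, 'Lyon', 'France');
INSERT INTO DimCalendar VALUES (20210105, 2021), (20210210, 2021);
INSERT INTO FactJob VALUES (1, 1, 20210105, 1000.0),
                           (2, 1, 20210210, 500.0),
                           (3, 2, 20210105, 700.0);
""")

# Jobs and revenue per origin country and year, most relevant country first.
rows = con.execute("""
    SELECT o.Country, c.Year, COUNT(*) AS jobs, SUM(f.ExportRevenue) AS revenue
    FROM FactJob f
    JOIN DimOffice o   ON o.OfficeKey = f.OriginOfficeKey
    JOIN DimCalendar c ON c.DateKey   = f.ATD_DateKey
    GROUP BY o.Country, c.Year
    ORDER BY revenue DESC
""").fetchall()
print(rows)  # [('Italy', 2021, 2, 1500.0), ('France', 2021, 1, 700.0)]
```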

Fact table

JOB_DM_ITALIA Contains information about goods characteristics, costs and revenues in import and export

Dimensions

Origin Company The vendor company
Receiving Company The buying company
Origin Office Origin warehouse
Receiving Office Receiving warehouse
Calendar ATD Actual time of departure
Calendar ATA Actual time of arrival
Calendar Order The conformed time


Exporter Is an Addressbook

Importer Is an Addressbook

Haulier Is an Addressbook

Term The rules for goods shipment

Mode Shipment mode: air, land or sea

Vessel Ship identity

Department Import or Export department

Carrier Shipment company

Container Type Type of container

ContainerLoadType Type of loading of the container

Dimensions:

Vessel

Attribute Description

VesselCode Code

Vessel Transportation vessel name

Country Country of origin

Carrier


Mode Shipment mode via air, sea or land (motor)

Code Code

Description Carrier name

Department

Department is a table that refers to both cases of departments in the model: Export or Import.

Attribute Description

Department Import or Export department

Container Type

Attribute Description

TransportModeCode Air, sea or Motor code

Code Container code

ContainerType Type of the container, example: Megacube

CBM Cubic meters of container

LoadingMeters Loading Meters

MaxLoadWeight Maximal load weight authorized

Container load type

Attribute Description

LoadTypeCode Load type code for all companies
LoadTypeCode TJX Load type code for TJX company
LoadType Load type, example: full load

Origin Office

Role: Origin office Conformed dimension Office

Receiving Office

Role: Receiving office Conformed dimension Office

Origin Company

Role: Origin Company Conformed dimension Company

Receiving Company

Role: Receiving Company Conformed dimension Company

Importer

Role: Importer Conformed dimension Addressbook

Exporter

Role: Exporter Conformed dimension Addressbook

Haulier

Role: Haulier Conformed dimension Addressbook

Term

Term Conformed dimension Term

Measures

CostLoadSR SUM of shipment load cost for receiving company

CostLoadSE SUM of shipment load cost for origin company

CostUnloadSR SUM of shipment unload cost for receiving company

CostUnloadSE SUM shipment unload cost for origin company

CostCollect SUM of all previous costs
ImportRevenue SUM of the revenue in import
ImportCost SUM of the cost in import
ExportRevenue SUM of the revenue in export
ExportCost SUM of the cost in export

ExpGA SUM of export goods amount

Dimensional Fact Model JOB_DM_ITALIA

For visibility reasons, only a part of the whole set of measures is shown in the fact model below.

The second business process to model is the orders: ORDER_DM_ITALIA

This process belongs to order requests. The aim of modeling this fact is to discover order trends over the last years, analyze the exceptions that may arise, identify their origin and compare them to orders in terms of frequency. The level of granularity chosen is the single order, which can be associated with an exception. The measures to consider concern the number of orders: DetailOrderedUnits refers to ordered goods, DetailShippedUnits to shipped goods and DetailBookedUnits to booked units.


Fact Table ORDER_DM_ITALIA Contains information about orders and particularly exceptions such as orders with no prior date.

Dimensions

Buying agent Intermediate between the buyer and the customer

Buyer Companies that buy products for customers

Origin Company Company
Receiving Company Company

Order Status Status of the order. For example, ‘To be booked’ or ‘In transit’

Calendar Calendar of the order
Addressbook All details of the Customer
Term Rules for shipment

Dimensions:

Buying agent

Attribute Description

Buying Agent Code Code description

Buying Agent The name of the buying agent: an intermediary holding company charged with finding buyers for customers.

Customer Customer

Customer

Attribute Description

Customer Retail Company

Customer Code Name abbreviation code

Customer Name Parent company

Name Company

Buyer

Attribute Description

Buyer Company that buys products for the Customer

Customer Retail company

Order Status

Attribute Description

Origin Company

Role: Origin Company Conformed dimension Company

Receiving Company

Role: Receiving Company Conformed dimension Company

Calendar

Calendar Conformed dimension Calendar

Addressbook

Addressbook Conformed dimension Addressbook

Term

Term Conformed dimension Term

Measures

DetailOrderedUnits SUM of the ordered units
DetailBookedUnits SUM of the booked units
DetailShippedUnits SUM of the shipped units

The third business process to model is the billing: INVOICE_DM_ITALIA

Billing is the most common business process managers want to analyze. After the orders and the shipments, an invoice is produced. The aim of modeling INVOICE_DM_ITALIA is to analyze the amount of the orders in the different invoices, aggregated by different parameters such as country, customer and office. Thus, the dimensions to be added are: Invoice Type, Invoice Record Type, Calendar, Customer, Addressbook, Office, Payment Term, CBS and Company. The main measures to consider are the sales amounts: 'Amount', which refers to all sales expressed in euros, and 'AmountLocal', which refers to the amount expressed in the local currency of the country of origin.

Fact Table
INVOICE_DM_ITALIA: contains information about the billing process such as the amount.

Dimensions
Invoice Record Type: invoice document or pro forma
Invoice Type: invoice or credit note
Calendar: date
Currency: list of used currencies
Customer: retail company
Customer Invoice: Addressbook
Origin Office: Office
Receiving Office: Office
Payment Term: payment contract, for example immediate or bank transfer 60 days
CBS: rules for shipment
Company: company name

Dimensions

Invoice Record Type

Attribute Description

Invoice Type

Attribute Description

Invoice Type: Invoice or Credit note.

Currency

Attribute Description

Currency Code: abbreviation of the currency
Currency: whole name of the currency

Customer Invoice

Role: Customer Invoice. Conformed dimension: Addressbook.

Payment Term

Attribute Description

Company: company name
PaymentTermCode: code of payment
PaymentTerm: term of payment, for example immediate or 30 days

Measures

Amount: SUM of the billing amount
AmountLocal: SUM of the billing amount expressed in the local currency

The fourth business process to model is the distribution: CADN_DM_ITALIA.

By modeling this business process, the manager is interested in analyzing the collect and delivery note documents: their number, the type of shipped goods, etc. The dimensions necessary for this model are Haulier, Importer, Exporter, Handling Operator, User, Warehouse, Company and Calendar.

Fact Table
CADN_DM_ITALIA

Dimensions
Haulier: the party responsible for the transport of goods by road
Importer: importer company as an Addressbook
Exporter: exporter company as an Addressbook
Handling Operator: handling operator as an Addressbook
Issue User: user role
Close User: user role
Release User: user role


Warehouse: warehouse name

Office: company office

Company: company name

Issue Date: Calendar

User: Role-playing dimension

Attribute Description

User Name: name of the connected user
User Description: full name of the user
User Company: company for which the user works
Insert Date: date of insertion

Warehouse

Attribute Description

Company: corresponding warehouse company
Warehouse Code: warehouse code description
Warehouse: warehouse name
IsSeaFreight: Boolean flag for sea freight
IsAirFreight: Boolean flag for air freight
IsRoadFreight: Boolean flag for road freight

Importer

Role: Importer. Conformed dimension: Addressbook.

Exporter

Role: Exporter. Conformed dimension: Addressbook.

Haulier

Role: Haulier. Conformed dimension: Addressbook.

Handling Operator

Role: Handling Operator. Conformed dimension: Addressbook.

Company

Conformed dimension: Company.

Office

Conformed dimension: Office.

Issue Date

Conformed dimension: Calendar.

Release User

Role: Release User. Dimension: User.

Close User

Role: Close User. Dimension: User.

Measures

CBM: SUM of cubic meters
GW: SUM of gross weight
CW: SUM of chargeable weight

CHAPTER 4

4 LOGICAL MODEL IDENTIFICATION

4.1 From conceptual model to logical model

The objective of this phase is to extract a logical schema of the data mart starting from the conceptual schema. The logical modeling is based on rules similar to those of relational databases.

The data marts JOB_DM_ITALIA, ORDER_DM_ITALIA, INVOICE_DM_ITALIA and CADN_DM_ITALIA are designed following the star schema approach, the simplest data warehouse architecture. The name refers to the central fact table surrounded by the dimension tables, so that the representation resembles a star.

A star schema consists of a set of dimension tables (D1, ..., Dn), where each dimension table Di has a surrogate key SKi and a set of attributes with different granularities, and a fact table that includes a set of measures and the foreign keys that reference the dimensions.

The transition from the dimensional fact model to the logical model includes:

Descriptive attributes are added to the dimension table of their dimensional attribute like any other attribute, but they cannot be used for aggregation.

Optional attributes are added to the dimension and can assume the NULL value. If this creates ambiguity due to the presence of other NULL values, possibly with different meanings, a fake value can be used to express the optional character of the attribute.

Shared dimensions, also known as role-playing dimensions, are represented with additional arcs from the fact table to the dimension table. In the logical model, a foreign key SKi is added to the fact table for each dimension Di.
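To make the mapping concrete, a minimal T-SQL sketch of this star schema layout might look as follows; the table and column names are illustrative assumptions, not the actual DDL of the project:

```sql
-- One dimension table Di with its surrogate key SKi...
CREATE TABLE DIM_OrderStatus (
    SK_OrderStatus  INT IDENTITY(1,1) PRIMARY KEY,  -- surrogate key
    OrderStatusCode NVARCHAR(10),
    OrderStatus     NVARCHAR(50)      -- e.g. 'To be booked', 'In transit'
);

-- ...and a fact table holding the measures plus one foreign key per dimension.
CREATE TABLE FACT_ORDER (
    SK_OrderStatus     INT NOT NULL
        REFERENCES DIM_OrderStatus (SK_OrderStatus),
    SK_Calendar        INT NOT NULL,  -- FK to the conformed Calendar dimension
    DetailOrderedUnits INT,
    DetailBookedUnits  INT,
    DetailShippedUnits INT
);
```

A role-playing dimension such as Origin Company and Receiving Company would simply appear as two foreign key columns in the fact table referencing the same dimension table.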

4.2 Logical Modeling

4.2.1 Logical Modeling of the Jobs

Below, the logical schema of the Data Mart JOB_DM_ITALIA is presented.


4.2.2 Logical Modeling of the Orders

Below, the logical schema of the Data Mart ORDER_DM_ITALIA is presented.

FIGURA 4.2.2 DATA MART ORDER_DM_ITALIA


4.2.3 Logical Modeling of the Invoice

Below, the logical schema of the Data Mart INVOICE_DM_ITALIA is presented.


4.2.4 Logical Modeling of the Distribution

Below, the logical schema of the Data Mart CADN_DM_ITALIA is presented.

CHAPTER 5

5 SOFTWARE ENVIRONMENT DEFINITION

5.1 Microsoft Integrated Development Environment

Projest Spa has developed the iMove Distribution Software under the Microsoft Visual Basic IDE, and its database has been developed under the SQL Server Database Management System. So, to keep a homogeneous use of all enterprise tools, the company has chosen to keep the same platform for its business intelligence solutions.

Microsoft Business Intelligence Development Studio is a complete tool that simplifies information organization and analysis. It allows building a solution on a scalable and reliable SQL Server platform, joining a powerful DBMS to the efficient Integration Services and Analysis Services products presented below.

5.2 SQL Server Integration Services (SSIS)

Microsoft SSIS (SQL Server Integration Services) is an enterprise data integration, data transformation and data migration tool built into Microsoft's SQL Server database. It can be used for a variety of integration-related tasks, such as analyzing and cleaning data and running extractions from different data sources such as XML files or flat files. It provides a plethora of functionalities for data extraction, transformation and loading tasks. In addition, it offers an efficient use of memory, especially for huge transformations, lookup tasks and slowly changing dimensions, which are more complex to handle with T-SQL alone. Moreover, the tool is graphically designed so that large ETL tasks can be visualized at a glance.


5.3 SQL Server Analysis Services SSAS

SSAS is a Microsoft Business Intelligence suite tool that offers storage and presentation of data by means of its Business Intelligence Semantic Model (BISM). This model simplifies data access and includes PowerPivot for Excel, PowerPivot for SharePoint, Analysis Services Multidimensional and Analysis Services Tabular.

5.3.1 Analysis Services Tabular

The technology of interest is Analysis Services Tabular. Tabular models are Analysis Services databases that run in-memory or in DirectQuery mode, accessing data directly from backend relational data sources. They offer an intuitive relational modeling approach. In-memory is the default mode: it uses compression algorithms and a multithreaded query processor to deliver fast access to tabular objects, whereas DirectQuery is generally used when models are very large.

5.3.2 Bi-directional cross filtering

It is a new feature available in SQL Server 2016 Analysis Services and Power BI Desktop. It offers a powerful new option that allows filters to propagate in both directions: the filter context propagates to a second related table on the other side of the relationship. Filter context is a set of filters applied to the data model before the evaluation of a DAX expression starts [M.Russo & A.Ferrari 15]. The benefit is that many-to-many relationship ambiguities can now be solved without writing complex DAX formulas.

Below, an example of the use of bi-directional cross filtering in an SSAS Tabular cube is presented, with an intermediate table containing the distinct values of the join key to solve the many-to-many relationship issue.
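A minimal T-SQL sketch of such an intermediate (bridge) table, assuming a shared key column named JoinKey and a source table name that are both hypothetical:

```sql
-- Build the bridge table from the distinct values of the shared key.
-- Each fact table then relates to BRIDGE_JoinKey one-to-many, and enabling
-- bi-directional cross filtering on one of the relationships lets filters
-- flow across the former many-to-many link.
SELECT DISTINCT JoinKey
INTO   BRIDGE_JoinKey
FROM   FACT_JOB;   -- source table and column names are assumptions
```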

FIGURA 5.2 SSAS TABULAR DATA MODEL

5.3.3 X-Velocity in memory engine

X-Velocity in-memory is the common engine for PowerPivot and SSAS Tabular introduced by Microsoft for its databases. This technology provides in-memory column storage and enables a memory-optimized ColumnStore index, which is particularly effective for data warehouses and analytical queries. By means of this technology, data is stored by columns instead of rows and is efficiently compressed, so that during querying the engine scans compressed data. The order of the data is relevant because compression happens on contiguous values, but the user is not asked to handle this task, as SSAS automatically chooses the best sorting.

The X-Velocity in-memory compression mainly relies on two techniques: Run Length Encoding (RLE) and Dictionary Encoding. RLE is used only if the compressed data is smaller than the original; this happens when a column has a low number of distinct values, so that runs of duplicate values are compressed and a corresponding index is used. For instance, the column values A, A, A, B, B can be stored as the runs (A, 3) and (B, 2). In this algorithm, each column is stored in a distinct array. The other compression method is Dictionary Encoding, which happens regardless of the columns' content: the values of the records are stored into a dictionary and an index refers to each value. The advantage of this technology is that the number of bits required to store the index is automatically detected.

Finally, a combination of both methods is used to optimize data storage in memory and the result, even if it depends on the data content, is performant.

Below, an example of the functioning of the RLE algorithm is presented.

FIGURA 5.3.3 RUN LENGTH ENCODING ALGORITHM

5.4 Power BI

Power BI is a service that offers a set of tools for data analysis and visualization. Its dashboards give professionals a global real-time view of their most relevant measures on all their devices. With a simple click, users can explore data by means of intuitive tools designed to ease information discovery and presentation. In this way, data can be customized to show what matters most to the users, thanks to a collection of software services and apps. Power BI consists of three elements, called Power BI Desktop, Power BI service and Power BI Mobile app, that allow professionals to create, share and consume business insights.


Power BI Desktop is a desktop application for report creation with which we can not only import data from multiple data sources to create data models, but also connect directly to existing data models and reports and then publish them on the Power BI service.

Power BI service is an online Software as a Service (SaaS) offering on which Power BI Desktop reports are published and shared with other users. It offers workspaces that can be used as personal areas containing dashboards. By clicking on these dashboards, it is possible to open reports for further exploration. Likewise, users can connect to multiple datasets to bring all relevant data together.

The Power BI Mobile app allows users to keep track of their data. Signing into Power BI, available on Office 365, gives access to the dashboards, reports and groups the users are members of. The mobile application is touch-friendly and makes it easy to scroll through the tiles on a dashboard.


The Power BI platform allows connecting to many data sources and simplifies data preparation. Thus, this tool gives the possibility to create customized reports and ad hoc analyses. With this cloud-based service, the user can create reports in Power BI Desktop, publish them on the Power BI service, create dashboards and visualizations, then share them with other users in the cloud. Power BI also allows them to monitor the business on their mobile devices with the Power BI mobile application.

On-premises Data Gateway

A gateway is software that facilitates access to data residing on a private, on-premises network, for subsequent use in a cloud service like Power BI. It is like a gatekeeper that listens for connection requests and grants them only when a user's request meets certain criteria (such as whether the user is allowed to use the gateway) [Power BI 17]. The on-premises data gateway acts like a bridge for a secure data transfer between on-premises data and the Power BI service.

CHAPTER 6

6 PROJECT IMPLEMENTATION

In this chapter, the implementation of the solution DM_ALPI_GLOBAL is detailed. It is composed of the SSIS solution development, the SSIS execution and testing, and the SSAS tabular data models' development. In the first part, the presentation of the ETL process includes two main steps: the creation of the data warehouse inside the SSIS project named SSIS_DM_ALPI_GLOBAL, then the creation of the data marts under the same project. The second part includes the execution process followed by a performance test. The third part describes the SSAS data models' creation and the KPIs calculated for data analysis.

6.1 Data Warehouse creation: Extract Transform Load

After the modeling of the data warehouse, a new solution containing the Integration Services package is created. Following the dimensional fact model based on the Inmon approach, the ETL process starts. First, the dimensions are created; the plan used for this task provides a set of views that extract all the information needed from the ELS304 database (acting as an additional security layer between the user and the original Database Management System). Then, when all the views are ready, each dimension extracts the corresponding information from the ELS304 data source and the SSIS transformation tasks start. Afterwards, the fact tables are implemented in the same way. Finally, when the data warehouse is completed, the project is deployed and the data marts creation begins.

FIGURA 6 ETL LIFECYCLE

Slowly changing dimensions

Slowly changing dimensions are dimensions that change slowly over time. In data warehousing, we need to track changes on dimension attributes over time. The most common types of storage for changing dimensions are type 1, which sets the record to the new value overwriting the old one, and type 2, which saves changes in a new record, keeping the previous values and marking them as out of date. In our model, we choose the first kind: we update the dimension, overwriting the old value and replacing it with the new one.

6.1.1 Dimensions ETL process

This phase concerns the dimensions' extract, transform and load tasks. Each dimension corresponds to a data flow task in the control flow. The following example refers to the Addressbook dimension table. This table is the largest dimension in the data warehouse, since it contains the details of all the entities involved in the business processes, from iMove users to buying companies.

FIGURA 6.1.1.A ADDRESSBOOK VIEW CREATION

Then a data flow task takes this view as a data source. As defined in the model, Addressbook is a slowly changing dimension handled by considering all columns as changing attributes. This Type 1 approach overwrites the existing values in the destination without keeping the previous values in a saved record, given that the business process analysis does not require the history but only the current address.

FIGURA 6.1.1.B

The following dimensions are loaded in the same way. Each table is a slowly changing dimension of type 1, except PODepartment, a fixed table that does not require transform tasks, and the Customer table, which is loaded directly from the operational database ELS304.

Below the control flow of the whole dimensions of the data warehouse is presented.


6.1.2 Facts ETL process

FACT_ORDER

The next step is to populate the designed fact tables. In the example below, the ETL process of FACT_ORDER is detailed. The control flow defines the execution order of the operations. Each execution of the FACT_ORDER ETL is preceded by a table truncation.

FIGURA 6.1.2.A FACT_ORDER CONTROL FLOW

Data is extracted from the source view, then a lookup task performs an equi-join between the input columns of the view and the reference columns of the dimension table, in order to transfer the matching surrogate key to the destination table. In the case of 'PODepartment', the non-matching rows are redirected to the no-match output flow and a derived column that takes the value (-1) for the transferred dimension surrogate key is created.
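In plain T-SQL, this lookup plus derived-column pattern corresponds to a LEFT JOIN whose unmatched rows receive the dummy key -1; the view and column names below are assumptions for illustration:

```sql
-- Unmatched source rows get the dummy surrogate key -1 instead of NULL.
SELECT  ISNULL(d.SK_PODepartment, -1) AS SK_PODepartment,
        v.DetailOrderedUnits,
        v.DetailBookedUnits,
        v.DetailShippedUnits
FROM    V_ORDER_SOURCE AS v
LEFT JOIN DIM_PODepartment AS d
       ON d.PODepartmentCode = v.PODepartmentCode;
```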

FIGURA 6.1.2.B FACT_ORDER DATA FLOW

FACT_JOB

This fact table is the most important in the data warehouse. Indeed, it contains the history of all the shipment operations with their different details and measures.

FIGURA 6.1.2.C FACT_JOB CONTROL FLOW

The FACT_JOB control flow is a sequence container composed of one Data Flow task and a series of Execute SQL tasks. First, the fact table is truncated, then populated in the 'ETL BI FACT_Job' data flow following the same steps used for FACT_ORDER: data is extracted by means of the source view, then a series of lookup operations (with derived columns for the non-matching rows that need to be passed as key values) is performed. Then, the update operations are implemented by means of 'Execute SQL Task' components that call the stored procedures in the database.

6.1.3 Data marts ETL process

During this phase, the creation of business-oriented subsets of the data warehouse is necessary. The aim of the current data warehouse implementation is to analyze four main business processes related to Italy: shipments, orders, billing and distribution. For this reason, once the data warehouse related to the iMove business has been created, data marts containing the processes whose 'Origin_Company' or 'Receiving_Company' is in Italy are built.

Stored Procedures to populate the data marts

Two steps are required for this task. First, each data mart is implemented by creating a stored procedure inside the SQL Server data warehouse 'DM_ALPI_GLOBAL'. Then, an Execute SQL Task is created in the SSIS control flow to execute the stored procedure. Below, the creation of ORDER_DM_ITALIA is presented.


This stored procedure populates the ORDER_DM_ITALIA data mart, inserting all the data whose origin or receiving company dimension has the business key equal to 7833, the company identifier that corresponds to Italy.
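The body of such a procedure can be sketched as follows; the business key 7833 comes from the text, while the table and column names are illustrative assumptions:

```sql
CREATE PROCEDURE P_ORDER_DM_ITALIA
AS
BEGIN
    -- Reload the data mart from scratch at every execution.
    TRUNCATE TABLE ORDER_DM_ITALIA;

    INSERT INTO ORDER_DM_ITALIA
    SELECT f.*
    FROM   FACT_ORDER  AS f
    JOIN   DIM_Company AS oc ON oc.SK_Company = f.SK_OriginCompany
    JOIN   DIM_Company AS rc ON rc.SK_Company = f.SK_ReceivingCompany
    -- 7833 is the business key of the Italian company.
    WHERE  oc.CompanyBK = 7833
       OR  rc.CompanyBK = 7833;
END;
```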

FIGURA 6.1.3.B ORDER_DM_ITALIA DATA MART EXECUTION

Execute SQL is a control flow task that executes the P_ORDER_DM_ITALIA stored procedure.


6.2 Execution process and solution testing

6.2.1 ETL Execution process

The execution process starts with the dimension packages, followed by the fact table packages and then the data marts. In case of failure, a notification mail is sent to the administrator.

FIGURA 6.2.1.A ETL EXECUTION PROCESS

SQL Server Agent:

SQL Server Agent is a Microsoft service that allows the automatic execution of administrative tasks and the running of jobs on a schedule. In the case of DM_ALPI_GLOBAL, the execution job of the ETL process is launched every day at 04:30 AM.

FIGURA 6.2.1.B SQL AGENT JOB EXECUTION

6.2.2 Test

Clustered Index

At the beginning of the project, the first executions used to last more than 3 hours, and the data flow of the 'Addressbook' dimension in particular was the most time consuming. Indeed, with a slowly changing dimension as the method to load the Addressbook table, the ETL task did not perform a batch update but executed the update row by row, on a table that contains more than 700 000 rows. To solve this issue, a clustered index has been created on the 'Addressbook' dimension table. Clustered indexes sort and store the data rows in the table or view based on their key values [MSDN 17]. The new execution of the 'Addressbook' load lasts less than 5 minutes.
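The fix can be sketched as a single DDL statement on the lookup key used by the row-by-row updates; the index and column names are assumptions:

```sql
-- Physically order the dimension rows by the business key, so that each
-- row-by-row SCD update becomes an index seek instead of a table scan.
CREATE CLUSTERED INDEX IX_DIM_Addressbook_BK
ON DIM_Addressbook (AddressbookBK);
```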


6.3 SSAS Tabular Data Model

This phase concerns the tabular data model creation. It is an important part of the project because it contains a set of calculated measures, developed in the DAX query language, that are relevant for business analysis. In this project, four tabular data models have been developed, each one referring to its corresponding data mart. In the example below, the JOB_DM_ITALIA SSAS data model is presented:

To design the tabular model, the fact table and its corresponding dimensions are loaded. Each role-playing dimension is physically represented in the tabular database. For example, to analyze the shipments, the JOB_DM_ITALIA fact table refers to the Actual Time of Departure and Actual Time of Arrival date tables (respectively Calendar ATD and Calendar ATA). These date tables allow time intelligence calculations, such as parallel period and year-to-date aggregations, and prospective analysis by means of calculated measures.


FIGURA 6.3 SSAS TABULAR DATA MODEL JOB_DM_ITALIA

The interest of calculated measures is that they allow more than simple aggregation tasks. Indeed, business calculations are in general more complex and specific than SUM, COUNT or AVG operations. For instance, the customer ALPI would like to have a Pareto analysis of the export revenues per office as a Key Performance Indicator. He is also interested in making date comparisons and would like to set up a series of KPIs useful for management control. Thus, a part of the measures calculated in the tabular data model is described in the paragraphs below.


6.3.1 Pareto Analysis KPI

Pareto analysis is a well-known technique for business decision making. To realize it on the SSAS Tabular data model, a series of calculated measures is computed. In the reporting phase, they will be used inside visualizations to give an insight into the export revenue distribution over the origin offices, with the global growth trend expressed in percentage, highlighting the 80/20 rule.


6.3.2 Time intelligence calculations KPI

Typically, all data models contain a date dimension. DAX offers a plethora of functions that simplify calculations related to dates, such as year-to-date aggregations and comparisons over years. These kinds of analysis are relevant for managers because they highlight trends and enable KPI comparisons.

In this work, parallel period comparisons have been computed, such as the export revenue of the last year:
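Such a comparison can be sketched with the standard DAX time intelligence functions, assuming a base measure named [ExportRevenue SUM] and the Calendar ATD date table mentioned earlier in this chapter:

```dax
-- Export revenue of the parallel period one year back.
ExportRevenueLY :=
CALCULATE ( [ExportRevenue SUM], SAMEPERIODLASTYEAR ( 'Calendar ATD'[Date] ) )

-- Year-to-date aggregation of the export revenue.
ExportRevenueYTD :=
TOTALYTD ( [ExportRevenue SUM], 'Calendar ATD'[Date] )
```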


6.3.3 Budget Variance Analysis KPI for management control

One relevant benefit of reporting tools is that they integrate data analysis with management control. In this work, a series of calculated measures has been set up to realize a budget variance analysis. The Margin is expressed by 'ExportBalance SUM', the Revenue by 'ExportRevenue SUM', the margin of the previous year by 'ExportBalanceLY' and the budget value by 'TargetExportBalance'.
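With the measures named above, the variance itself can be sketched as the difference between the margin and the budget value; the derived measure names are assumptions:

```dax
-- Absolute deviation of the margin from the budget target.
ExportBalanceVariance :=
[ExportBalance SUM] - [TargetExportBalance]

-- Relative deviation; DIVIDE is safe against a zero target.
ExportBalanceVariance% :=
DIVIDE ( [ExportBalance SUM] - [TargetExportBalance], [TargetExportBalance] )
```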

CHAPTER 7

7 REPORTING AND DASHBOARDS

7.1 Reporting

Reporting is the last phase of a business intelligence project and aims to present an analytic visualization of the added value of the collected information in a clear and consistent way. The aspect to consider during report creation is that reports are designed for users who usually master the business, but not the presentation techniques or the data content. Fortunately, Power BI is a set of tools designed to be used by a broad audience, not only specialists. Thus, it offers a user-friendly interface that makes it easy to navigate through reports, changing parameters to customize the data visualization.

Power BI Desktop allows building reports from simple flat files up to entire databases; therefore, when the tabular data model is ready, it is possible to connect to it from the app and start designing the reports. In this project, the calculated measures used as Key Performance Indicators have already been computed in the SSAS project and are directly used to produce visuals. It is also possible to make all these DAX calculations in Power BI Desktop after connecting to the tabular data model.


The report below presents the economic analysis of the exports for the customer ALPI Italia, with the possibility of choosing the year of interest. For example, it highlights the Budget Variance Analysis using the Revenues, the Margin and a target value as variables.


The report below presents the billing activity filtered by different parameters, such as the kind of payment or the most important offices in terms of invoices, VAT or customers. A prospective analysis is also shown.

7.2 Dashboards

A Power BI dashboard is a feature of the Power BI service: a single page on which the user can display different visualizations from different reports. Usually it contains only the most important elements. The aim of a dashboard is to give an insight into the business at a glance. For this purpose, it must respect a set of rules: it should be simple, each piece of information must be represented by the right chart (for example, a column chart for a comparison rather than a pie chart), it should highlight the most relevant information, show variations, and so on. Once the reports are finished, users can connect to the Power BI service and publish them. To create a dashboard, they just need to select the pin icon on a visualization in the report and pin it to the desired dashboard.

Each user has his own workspace that contains personal content. He can also access other workspaces shared among the members of the company and navigate through reports, dashboards and datasets.

The figure below shows the Power BI personal page. The user can navigate through datasets and reports, design dashboards and then share them with other users.

CHAPTER 8

8 CONCLUSION

The work accomplished consisted in creating a data warehouse that covers all the aspects of the business. This required implementing the project following the Bill Inmon approach and led to a longer startup time, increasing the initial costs of development. However, the benefit was much greater: an enterprise-wide data warehouse that is easy to expand and maintain. Thus, it eased the data-driven analysis that was the object of this project, by gathering the overall business data in a complete and consistent way. Consequently, decision making is partially automated thanks to the business intelligence tools, and is less risky since it is based on consistent information.

This effort has been reinforced by using Power BI as a cloud-based, self-service business analytics tool, giving the customer the possibility to satisfy his information needs independently. The main benefit of self-service BI is that, by reducing the technological gap between the end users and the data visualization app, analytics are made faster and adapt with no difficulty to the dynamic enterprise context. Now, the customer can follow the analysis and the control of international shipments and make quicker strategic decisions. He can also improve the future visibility on numbers, variances and exceptions and anticipate trends. Thus, the goal of presenting the right information to the right person at the right time is achieved.

In conclusion, during this internship I had the opportunity to learn how to design and implement a data warehouse project. This real working experience allowed me to apply what I have learned during the Business Informatics master's degree and to gain valuable enterprise knowledge. I have also developed professional skills in the use of the technological tools needed to implement the project.


BIBLIOGRAPHY

[A. Albano 14] A. Albano, Decision Support Databases Essentials, January 2014.

[MSDN 17] Microsoft MSDN official website, 2017.

[Power BI 17] Power BI official website, 2017.

[Projest SPA 17] Projest SPA official website, 2017.

[M.Russo & A.Ferrari 15] M. Russo, A. Ferrari, The Definitive Guide to DAX, October 2015.

[M.Russo & A.Ferrari 16] M. Russo, A. Ferrari, Introducing Microsoft Power BI, 2016.

[SQL BI 17] SQL BI official website, 2017.

[K. De Jonge 16] K. De Jonge, Bidirectional cross-filtering in SQL Server Analysis Services 2016 and Power BI Desktop, 2016.
