The Current Data Management Process 23

Academic year: 2021


3. THE CURRENT DATA MANAGEMENT PROCESS

This chapter outlines the overall process for processing and analysing pipeline inspection data. A dedicated chapter has been reserved for this process so that it can be explained in depth: many critical parameters are handled during it, which is why a clear explanation is needed. The main documents used to collect these parameters will also be outlined.

The figure below (Figure 15) briefly shows the lifecycle of the data. First of all, the data are created and recorded on a tape during the in-field operation, which starts once the client has signed the contract. From wherever in the world the inspection took place, the tape is sent to the site. The Data Controller is responsible for receiving the tape and starting the processing, which translates the raw data on the tape into processed data. To do that, he needs technical and operational parameters, which he obtains from the DBMS or from the documentation supplied by the Field Crew. After that, the data are ready to be analysed by the analyst, who needs information from eDelivery, the field crew and Data Control to interpret the processed data. When the analysis is finished, the final report can be produced and sent to the client. This stage also needs information about the client's particular requests and other specific details.

Although the process follows a standard approach, every pipeline has its own unique characteristics. It may therefore not be possible to strictly adhere to the sequential stages of the process for all pipelines, but by using the process in conjunction with the Quality Metrics System it is controlled and recorded.

[Figure 15: flowchart of the data management process. Its seven stages are: 1. Media Arrival; 2. Processing; 3. Preliminary Report; 4. Initial Data Set-up; 5. Detailed Analysis; 6. Selection of Features and Report Production; 7. Client Deliverables. Each stage contains QA decision points (e.g. "Data Integrity QA OK?", "Prelim QA OK?", "Analysis QA OK?") recorded in the QMS, with corrective actions agreed with the Project Manager when a check fails.]

3.1 Media Arrival

3.1.1 Receive Tape

When the tape arrives, the Data Control Team has to record its receipt in eDelivery. As explained above, eDelivery is the software used to manage all the information about the contract, and it has a section dedicated to Data Control. After the receipt has been recorded, the Data Controller checks that the analysis requirements and the pipeline questionnaire are complete. If not, he asks the Project Manager to complete them. Any missing information should be added to the notes in eDelivery.

The date and time of receipt of the inspection data are recorded, and some details of the inspection run have to be logged into the system. Any discrepancies between the newly entered information and that already in the system will be followed up and corrected, where appropriate.

The contractual deliverables required by the client, reporting time-scales, sentencing parameters and special instructions should be confirmed prior to proceeding, and any queries brought to the attention of the Project Leader.

In the event of a re-inspection, information concerning the previous inspection shall be assessed by the analyst and, where necessary, a request for the old data to be restored for reprocessing purposes is made to the Data Controller.

3.1.2 Create Working Directory Structure

Once the inspection data are available, they are stored in a dedicated directory in the system (the Centralised Data Storage), where they wait to be analysed. This directory is accessible from all the analysis stations, and the analyst can find the data there together with all the information needed to analyse them.

3.1.3 Prepare for Job (Create Working File)

The working file is a paper folder created to hold all the paperwork associated with data processing. When the analyst receives the work, it is linked to this folder, and when the analysis is finished, the updated working file must be returned to the Data Control unit.

3.1.4 Assess Job Requirements

Each project shall be assessed to ensure that Analysis resources are best utilised to achieve contractual obligations with regard to production of deliverables. As soon as a project is committed, the Project Leader responsible for the Contract shall provide:

Applicable pipeline information to enable data processing and analysis aspects to be assessed. The Project Leader must be notified of any problems such that appropriate action can be determined and implemented.

Details of the contractual deliverables requested by the client, including any supplementary information, e.g. contract extracts, for inclusion in the Inspection Report.

A Briefing Meeting to be held between Analysis and Projects to determine needs.

Pipeline strip-maps/pipe-logs, where applicable. The receipt of strip-maps/pipe-logs shall be recorded, and the items stored in such a way as to minimise deterioration or loss. The strip-maps shall be used to identify suitable maglogger sites.

3.1.5 Previous inspection

Sometimes the pipeline has already been inspected by PII. In this case the old data must be restored for comparison. The old report must be made available for comparison with the current inspection, regardless of whether the contract states a Re-Inspection Report deliverable or not.

3.2 Initial Processing

Data processing makes use of a software package called PCAS.

It is made up of a set of tools, each useful at a different stage of the processing activity. The tools that compose the package are: PipeRead, used to process the data; PipeImage, used to visualise the data as waveforms and analyse them; PipeReport, used to produce the report; and 3D PipeViewer, used to visualise a three-dimensional reconstruction of the inspected pipeline.

The first tool to be utilised is Test Plots, a facility within the PipeRead software. Its outcome is put through an Integrity Assessment to see whether the processing was satisfactory for that data. If it was not, the Data Controller must reprocess the data using different parameters in order to minimise the missing data.
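The pass/fail decision of the Integrity Assessment can be pictured as a threshold check on the fraction of missing data. This is a minimal sketch only: the 2% threshold, the function name and the inputs are assumptions, not values taken from the actual process.

```python
# Illustrative integrity check; the real assessment is richer than this.
MAX_MISSING_FRACTION = 0.02  # assumed threshold, not a documented value

def integrity_qa_ok(total_samples: int, missing_samples: int) -> bool:
    """Pass if the fraction of missing data does not exceed the threshold."""
    return missing_samples / total_samples <= MAX_MISSING_FRACTION

# If the check fails, the Data Controller reprocesses with different parameters.
print(integrity_qa_ok(100_000, 500))    # True  (0.5% missing)
print(integrity_qa_ok(100_000, 5_000))  # False (5% missing)
```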

3.2.1 Process all data with optimised parameters

If the data pass the Integrity check, they can be processed using the PipeRead software with the optimised parameters.

3.2.2 Assess Data Quality

The processed data have to be used to ascertain data quality with respect to producing the deliverables defined in the contract, and to judge the appropriateness of the default processing parameters.

3.2.3 Data Quality QA

During the Data Quality Assessment, the quality and completeness of the inspection data should be checked. Once done, the relevant section in the QMS has to be signed off.

The QMS will be explained in chapter 3.8.5.

3.3 Preliminary Report

The outcome of this phase is a Preliminary Report dispatched to the client by the Project Manager. The PR is the result of both the automatic activity of the software and the manual activity of the analyst. The PipeImage software jumps to each metal-loss indication in the pipeline and identifies all the metal loss below a certain percentage of depth. The analyst then verifies the segments and resizes them if necessary.

When the analysis stage is completed, the related information is recorded.

To allow more than one user to access the pipeline inspection data, the data are added to a common server called RPC.

Before being analysed by software and analysts, the processed data are supplemented with technical information about the pipeline. This information is needed to ensure that, for example, girth welds are not mistaken for metal loss. To avoid this, the exact position of each girth weld has to be known and reported on the processed data.

Moreover, the field crew usually positions references all along the pipeline, in order to have some reference points. The exact position of the references is recorded on a map called the Magloggers map, and this map is overlaid on the signal map. Finally, during processing the data are split into several segments, which have to be joined together at this stage to obtain a single continuous signal path.
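The two operations described above, joining the processed segments into one signal path and overlaying the reference positions on the joined axis, can be sketched as follows. The segment and reference records are hypothetical simplifications, not the real data format.

```python
# Hypothetical sketch: join processed segments end-to-end into a single
# signal path, then keep only the maglogger references that fall on it.

def join_segments(segments):
    """Concatenate segment sample lists into one continuous path."""
    path = []
    for seg in segments:
        path.extend(seg)
    return path

def overlay_references(path_length_m, references_m):
    """Keep only reference points that fall inside the joined path."""
    return [r for r in references_m if 0 <= r <= path_length_m]

segments = [[0.1, 0.2], [0.3, 0.4], [0.5]]
print(len(join_segments(segments)))                        # 5
print(overlay_references(1000.0, [120.0, 980.0, 1500.0]))  # [120.0, 980.0]
```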

3.4 Initial Data Set-up

3.4.1 Techniques used in Data Analysis

A quick assessment of a pipeline can be achieved by setting the bottom screen to a large scale. Simply jumping to the far end of the screen advances approximately 1.5 km to 2 km with each click, so a hundred-kilometre line can easily be covered in a short time. This gives the analyst a sense not only of what data-quality problems there may be in the pipeline, but also of the population of metal loss and other features within it.
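The coverage claim can be checked with simple arithmetic: at roughly 2 km per jump, a 100 km line takes about 50 clicks. A back-of-the-envelope sketch:

```python
import math

# Back-of-the-envelope check of the overview pass described above.
def clicks_needed(line_km: float, jump_km: float) -> int:
    """Number of screen jumps needed to traverse the whole line."""
    return math.ceil(line_km / jump_km)

print(clicks_needed(100, 2.0))  # 50
print(clicks_needed(100, 1.5))  # 67
```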

3.4.2 Initial Mass edits filters

After the initial overview analysis, the analyst should make a note of the types of defect present on the pipeline, noting any differences between internal and external features, good and bad boxes, and metal loss and mill faults.

Then, using the mass-edit functions in PipeImage, the majority of the metal-loss features are changed to the correct type.

3.4.3 Highlight Mass Edits

A suitable depth level should be selected in the mass-edit function. This function automatically highlights and draws to the analyst's attention the boxes with depth above the selected level. As this is an automatic operation, the analyst has to review the highlighted boxes and determine their true severity.
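The automatic highlight step amounts to filtering the boxes whose depth exceeds the selected level, after which the analyst reviews each one. A minimal sketch, in which the box records and field names are assumptions:

```python
# Illustrative sketch of the mass-edit highlight: select boxes whose
# depth (as % of wall thickness) exceeds the chosen level. The field
# names are hypothetical.

def highlight_boxes(boxes, depth_level_pct):
    """Return the boxes the analyst must review, deepest first."""
    flagged = [b for b in boxes if b["depth_pct"] > depth_level_pct]
    return sorted(flagged, key=lambda b: b["depth_pct"], reverse=True)

boxes = [
    {"id": 1, "depth_pct": 12},
    {"id": 2, "depth_pct": 45},
    {"id": 3, "depth_pct": 30},
]
print([b["id"] for b in highlight_boxes(boxes, 25)])  # [2, 3]
```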

3.5 Detailed Analysis

3.5.2 Resolve Red Rings

Red rings indicate very deep metal loss. All of them should be checked by the analyst before anything else is done, and the outcome and decision noted.

3.5.3 Consistency Checks

Once the initial first pass has been carried out, consistency checks should be done on internal and external features, mill faults and metal loss, features and fittings, and features over certain depth criteria. If the inspection is a re-inspection, ensure that the data and report are consistent with what was previously reported, and note any changes.

3.5.4 Resize and Recluster

Finally, once all edits have been done, the pipeline should be resized and re-clustered according to the client's requirements. The cluster operation groups together features that lie very close to one another. In some cases these features can be treated as a single feature, leading to a different assessment and severity.
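Re-clustering groups features that lie within some interaction distance of one another so they can be assessed as one feature. A simplified one-dimensional sketch follows; the gap rule is an assumption, as the real clustering rules are contract-specific.

```python
# Simplified 1-D clustering: features closer than `max_gap_m` along the
# pipeline axis are merged into one cluster. The real rules are richer
# and contract-specific; this only illustrates the idea.

def recluster(positions_m, max_gap_m):
    """Group axial feature positions into clusters of nearby features."""
    clusters = []
    for pos in sorted(positions_m):
        if clusters and pos - clusters[-1][-1] <= max_gap_m:
            clusters[-1].append(pos)   # close enough: same cluster
        else:
            clusters.append([pos])     # too far: start a new cluster
    return clusters

print(recluster([10.0, 10.2, 10.3, 55.0], max_gap_m=0.5))
# [[10.0, 10.2, 10.3], [55.0]]
```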

3.6 Selection of features and Report Production

3.6.1 Manual Sizing

If the client specified it in the contract, specific metal-loss features must be manually sized in order to assess them individually and accurately predict their dimensions. The metal-loss features that are manually sized will normally be the most severe in the pipeline. Information on the most severe metal-loss features is presented in a dedicated document called the Inspection Sheets. This document is automatically created by the software and is made up of the most important features found by the analyst.

During the sizing operation, the analyst must reconstruct any saturated signals where possible, and check the signals for interaction with neighbouring signals; any feature measured or sized with a likely peak depth greater than 80% of wall thickness must be independently checked.

Manual sizing should be carried out on at least the top 25 deepest metal-loss features in the pipeline. Furthermore, if these features are surrounded by low-level metal loss, it should be added to the main metal loss to highlight the possible interaction between them.
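Selecting the features for manual sizing, and flagging those needing an independent check, can be sketched like this. The feature records are hypothetical; only the 80% independent-check rule and the top-25 count come from the text.

```python
# Sketch of the manual-sizing selection described above. The feature
# records are hypothetical; the 80% independent-check rule and the
# top-25 default follow the text.

def manual_sizing_queue(features, top_n=25, check_threshold_pct=80):
    """Deepest `top_n` features, flagged if they need an independent check."""
    deepest = sorted(features, key=lambda f: f["depth_pct"], reverse=True)[:top_n]
    return [
        {**f, "independent_check": f["depth_pct"] > check_threshold_pct}
        for f in deepest
    ]

features = [{"id": i, "depth_pct": d} for i, d in enumerate([85, 40, 62])]
queue = manual_sizing_queue(features, top_n=2)
print([(f["id"], f["independent_check"]) for f in queue])  # [(0, True), (2, False)]
```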

3.6.2 Resize and Recluster

The pipeline data must be resized after manual sizing has been carried out to make sure any added boxes are resized and not missed.

3.6.3 Check Reportable Features

The top reportable features are identified at this stage according to the Contract Reporting Specification.

3.6.4 Analysis QA Data Check

The QMS is a dynamic checking process providing stop-and-go checkpoints at various stages throughout the data analysis process. A designated QA person should check the analysis performed on the data. This should be done using the QMS document unique to the inspection contract, to carry out the extensive checks on the data.

3.6.5 Generation of Inspection Report

Using PipeReport, the analysed data should be used to generate the Inspection Report to the client's requirements.

3.6.6 Analysis QA Inspection Report check

A designated QA person should then check the Inspection Report. This should be done using the Report section of the QMS.

3.6.7 RunComparison (RunCom) Reports

For a RunCom report, the dedicated procedures should be followed. The number of features to be assessed is specified in the contract. On completion of the RunCom assessment, and for QA purposes, the RunCom Analysis Checklist is to be completed.

3.7 Client Deliverables

During the bid phase the client chooses what information it wants from the inspection and, consequently, what kind of report is to be produced. Each report has different features and requires different steps in the analysis process; however, after completion of the report, the process converges again up to the dispatch of the final report.

3.7.2 Despatch Inspection Report

The approved Inspection Report is printed and checked for completeness against the master. Unless otherwise specified, the deliverables shall be dispatched with a covering letter, and a means shall be provided to enable the client to acknowledge receipt.

3.7.3 Archive Data

The working directory shall be checked and tidied up and all appropriate information entered and recorded in the system for historical purposes.

3.7.4 eDelivery

All fields in eDelivery should be completed on the day the deliverables are dispatched, where possible. Data from the QMS should be completed and attached in eDelivery, along with the Preliminary Report, the Inspection Report, the DQA and any other reports produced.
