
Internet. According to an IDC Digital Universe Study,5 about
1.7 MB of information is generated every second on the planet,
adding up to over 40 zettabytes (1 ZB = 1 trillion GB) of data
stored by 2020, with machine-generated data making up 40% of
the total information. The study also found that in 2012, 23%
of the data was considered ‘useful’ but only 0.5% was actually
being analysed. Big Data analytics became possible with the
introduction of advanced software and analytic models that
can store, manage and provide useful insight from connected
machines significantly faster and cheaper than ever before. For
example, software platforms like GE’s Predix™, in collaboration
with Pivotal, now provide analytics for the airline industry
– mining more than 340 terabytes of data from over three
million flights – 20 000 times faster than previous methods.6
Such capabilities are the key enablers for the development of
Industrial Internet solutions for the pipeline industry.
For this industry, the challenge in the adoption of big data
analytics was not one of sensing technology; the fundamental
components were already available. The two main streams of
information, operations and asset integrity, already have enough
data sources to support Industrial Internet concepts and
analytics. Meters and sensors monitor inputs and outputs, such
as pressure, flow and compressor conditions, while
comprehensive software already accounts for volumetric system
balances, throughput and product deliveries. The condition
of the linear assets is also covered by continuous or periodic
sources of information, such as cathodic protection surveys,
inline inspections or hydrostatic test results. As technology
matures, the industry is adopting other monitoring and diagnostic
methods, such as fibre optic, laser, ultrasonic or acoustic
sensors that report movement, corrosion, leakage or impacts to
the pipeline.
The data challenge for the industry can be summarised in
two main areas. First, the efficient management of large data
sets, including the simple digitisation of dispersed records from
various sources, with differing levels of complexity and
completeness given the different types and ages of pipeline
systems. Second, the need for robust analytic capabilities to
extract the most value out of the growing sets of information,
which include static and continuous data as well as internal
and external inputs. To put this in context, an average 65-mile
pipeline segment can generate up to 35 GB of data every year.
This means a standard 5000-mile network will add up to
about 27 TB of useful information in 10 years, with only 135 GB
being analysed according to the IDC statistics.
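As a quick check on these figures, the back-of-envelope sketch below reproduces the arithmetic using only the numbers quoted above; the 0.5% analysed fraction comes from the IDC finding cited earlier, and the variable names are purely illustrative.

```python
# Back-of-envelope check of the article's data-volume figures.
network_miles = 5000          # standard network length (miles)
segment_miles = 65            # average segment length (miles)
gb_per_segment_year = 35      # data generated per segment per year (GB)
years = 10
analysed_fraction = 0.005     # IDC estimate: only 0.5% of data is analysed

segments = network_miles / segment_miles           # ~77 segments
total_gb = segments * gb_per_segment_year * years  # ~26 900 GB
print(f"Total data: {total_gb / 1000:.1f} TB")     # ~26.9 TB, i.e. about 27 TB
print(f"Analysed:   {total_gb * analysed_fraction:.0f} GB")  # ~135 GB
```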
In the past two years, the industry has seen increased
partnerships and investments from leading companies to
address this challenge. One recent example is the Intelligent
Pipeline Solution from GE and Accenture, which utilises GE’s
Predix software platform.
The Intelligent Pipeline Solution offers a comprehensive
roadmap for the implementation of the Industrial Internet in a
pipeline environment, focusing on three main aspects:
•	First, the integration of heterogeneous data sets into
a common architecture to deliver a digital system of
reference (a minimal illustrative sketch follows this list).
Thanks to the flexibility of the Predix software platform,
it is possible to integrate systems from multiple vendors,
including geographical information, work order and risk
management, inspection and equipment condition
monitoring systems. These can then be combined with
external sources of data, such as weather, seismic and
one-call systems, all within one software platform that
delivers a digital reference of all the assets and conditions
in near real-time.
•	Second, the creation of advanced monitoring, diagnostics
and predictive analytics to enhance current processes for
safety and risk management, operations and commercial
management, enabling a new age of support tools for
pipeline operating companies to drive faster, more accurate
and more transparent decisions. User-specific dashboards,
intuitive web-based visualisation and comprehensive
reporting and notification capabilities put all the available
terabytes of data to use.
•	Finally, the support and expertise to help operators with
the business process changes necessary to unlock the
value of big data decision-making. The significant potential
benefits to the pipeline industry range from enhanced risk
management and improved safety to productivity gains
through faster and more accurate tools (including mobility
and the digitisation of field operations) and the optimisation
of business processes for the efficient allocation of
resources and throughput capacity.
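To make the idea of a digital system of reference more concrete, here is a minimal, hypothetical sketch of how heterogeneous pipeline data might be normalised into one time-stamped record stream. None of the class, function or field names below come from Predix or the Intelligent Pipeline Solution; they are illustrative assumptions only.

```python
# Hypothetical sketch: normalising heterogeneous pipeline data sources
# into a single time-stamped record stream (a "system of reference").
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ReferenceRecord:
    asset_id: str        # pipeline segment or station identifier
    source: str          # e.g. "scada", "ili", "weather", "one_call"
    timestamp: datetime  # all sources normalised to UTC
    measurement: str     # e.g. "pressure_kpa", "wall_loss_pct"
    value: float

def from_scada(tag: str, kpa: float) -> ReferenceRecord:
    """Continuous data: a SCADA pressure reading, stamped on arrival."""
    return ReferenceRecord(tag, "scada", datetime.now(timezone.utc),
                           "pressure_kpa", kpa)

def from_ili(segment: str, wall_loss_pct: float,
             run_date: datetime) -> ReferenceRecord:
    """Periodic data: an inline-inspection wall-loss finding."""
    return ReferenceRecord(segment, "ili", run_date,
                           "wall_loss_pct", wall_loss_pct)

# Merging the streams yields one chronologically ordered view of the asset.
records = [
    from_scada("SEG-042", 6895.0),
    from_ili("SEG-042", 12.5, datetime(2014, 6, 1, tzinfo=timezone.utc)),
]
for record in sorted(records, key=lambda r: r.timestamp):
    print(record)
```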
Maturity model
An important step in the adoption of the Industrial Internet
is the assessment each operator needs to perform of its
enterprise maturity in relation to the use of big data. In simple
terms, the maturity model establishes a baseline across four
main maturity phases: monitor, analyse, predict and optimise.
Operators may have existing initiatives that combine
information technology (IT) infrastructure and software
capabilities to support integrity and operations, enhancing
the quality, speed and cost-effectiveness of those processes.
The maturity assessment breaks down pipeline management
into discrete process areas, such as IT infrastructure, integrity
management or commercial operations, and examines each
sub-process’s systems, data sources and use of information
to determine the current state and to clearly articulate the
desired end state. This way, the solution provider can define a
roadmap and deliver the tools and capabilities to achieve it. The
adoption of the Industrial Internet is a journey rather than one
giant step; however, each phase needs to be properly scoped to
deliver enough business impact and incremental value along the
way.
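As an illustration of how such a baseline might be recorded, the sketch below scores process areas against the four phases named above (monitor, analyse, predict, optimise). The phase names come from the text; the process areas and scores are assumptions chosen for illustration.

```python
# Hypothetical maturity baseline: each process area gets a current state
# and a desired end state drawn from the article's four phases.
from enum import IntEnum

class Phase(IntEnum):
    MONITOR = 1
    ANALYSE = 2
    PREDICT = 3
    OPTIMISE = 4

# (current state, desired end state) per process area -- illustrative values
baseline = {
    "it_infrastructure":     (Phase.ANALYSE, Phase.OPTIMISE),
    "integrity_management":  (Phase.MONITOR, Phase.PREDICT),
    "commercial_operations": (Phase.ANALYSE, Phase.PREDICT),
}

# The gap between the two states scopes each step of the adoption roadmap.
for area, (current, target) in baseline.items():
    print(f"{area}: {current.name} -> {target.name} "
          f"(gap: {target - current} phase(s))")
```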
An evolution in pipeline management practices
With access to integrated data in a common architecture,
information becomes more accurate and timely, introducing a
new pace to existing processes. The Industrial Internet enables
a significant shift to proactive operations and introduces the
concept of near real-time, combining continuous, periodic and
static data.
A common platform allows pipeline companies to bring
convergence across the different departments by integrating all
the available data through a single digital system of reference.
This allows operating teams, such as control room and field