
Table of Contents

The Analysis of Dynamic Data ..................... 4
What's New in v4.12? ..................... 10
Reservoir Surveillance of Dynamic Data ..................... 12
Pressure Transient Analysis ..................... 16
Production Analysis ..................... 20
History Matching ..................... 24
Well Performance Analysis ..................... 28
Production Logging ..................... 30
Technical references ..................... 34
KAPPA Training and Consulting Services ..................... 37
Dynamic Data Analysis (DDA part I and II) ..................... 38
Foundation PTA (DDA part I) ..................... 39
Production Analysis/PDG (DDA part II) ..................... 40
Advanced PTA ..................... 41
Foundation PL ..................... 42
Advanced PL ..................... 43
Rubis Modeling ..................... 44
Ecrin Software ..................... 45
Support and Consulting ..................... 46

KAPPA Engineering - v4.12 October 2009


www.kappaeng.com

KAPPA is primarily a petroleum engineering software company.


Our integrated software platform, Ecrin, is the industry standard for the analysis of dynamic data.
Ecrin includes modules for Pressure Transient Analysis (Saphir), Production Analysis (Topaze)
and a full field numerical model for History Matching (Rubis). In Ecrin v4.12 a Well Performance
Analysis module (Amethyste) was added to the suite. The freestanding Production Logging
software (Emeraude) will be integrated into Generation 5.
To seamlessly connect and process client data in the Ecrin modules, KAPPA has developed
Diamant Master, a server solution that centrally processes and shares permanent gauge
and production data, technical objects and documents in a coherent environment for real-time
reservoir surveillance and management.
Founded in 1987, KAPPA now has over 5000 active commercial software licenses, used by
more than 500 companies worldwide. KAPPA is independent and 80% owned by its employees.
With its main development office in Sophia Antipolis, France and regional offices in Houston
and Bahrain, KAPPA is also present in fifteen other countries with local offices and distributors.
KAPPA also offers complementary Training and Consulting Services (TCS), based near
Gatwick, UK, which trains hundreds of engineers every year in its chosen disciplines.
KAPPA is a Microsoft Certified Partner.

The Analysis of Dynamic Data

Ecrin - Main window with PTA (Saphir), PA (Topaze), Amethyste (WPA) and Rubis (HM) running simultaneously

An integrated platform for the analysis of dynamic data


The pressure to reduce costs does not negate the fact that the work still needs to
be done. With an exponential increase in the amount of data and a corresponding
decrease in the resources to handle them, users told us they needed ergonomic
tools that would integrate, navigate and communicate within a single environment,
avoiding painful import / export and process duplication and, incidentally, requiring
very little training overhead. The result is the integrated Ecrin suite, which aims to be
the best-in-class software for handling, modeling and analyzing reservoir dynamic data.
The development started by integrating Data Management (Diamant) with
Pressure Transient and Production Analysis (Saphir and Topaze). It progressed
with the simultaneous release of a server application (Diamant Master) to gather,
smart filter and share Permanent Downhole Gauge (PDG) and production data
whilst providing a seamless connection to the analysis modules.
The Voronoi model developed for transient analysis led to a natural extension
into non-linearity and upscaling across the modules to production analysis and
then the full-field 3D history-matching model in Rubis.
In Ecrin v4.12 a fully integrated Well Performance Analysis module (Amethyste)
has been added to the suite. The drive to integrate the modules continues
with substantial enhancements that include sector identification and transfer
from the full-field model to transient analysis, and inflow performance (IPR)
transfer from Amethyste to Saphir.

Permanent Gauges
The requirement for an integrated suite first arose in the
late 1990s with the increasing deployment of permanent
downhole gauges (PDG). These gauges constantly monitor
downhole pressures and are a passive witness to whatever
happens in the well and the reservoir. In particular, PDG data
under stable producing conditions can be used for production
analysis, while incidental shut-ins provide data for transient
analysis. The initial bubble of excitement burst quickly, however.
The data sets stored in the historians were huge and, if an
engineer could find the data in the first place, they ground
computers to a halt, frequently ending with the blue screen of death.

Dynamic Data and Intelligent Fields


Whenever a fluid is produced from or injected into the
reservoir, the diffusion produces changes in pressure
and temperature that may be recorded in various places.
Combine these data with the production / injection rate
history and you have what we call Dynamic Data, which
are candidates for analysis. This analysis allows models
to be built on various scales, leading to forecasts and
decisions. Connect all this to real-time measurements
stored in historians, automate some of the processing,
and you get what we will call an Intelligent Field.
From well testing to the analysis of dynamic data
Not so long ago, the only dynamic data the engineer
had available was a well test, generally shut-in data, a
set of specialized plots and a dedicated analytical model
catalog, all this in a single application isolated from other
reservoir engineering tools. For KAPPA, this was the time
of Saphir, a standalone application. Today the stakes are
much higher and most fields are more complex. Different
methodologies have to be applied, sometimes using the
same data, sometimes not. However, all these techniques
apply to the same reservoir and wells, and the truth is in
the data. The name of the game today is to build a puzzle
from little pieces coming from everywhere. For KAPPA,
now is the time of the integrated Ecrin suite.

The first Diamant and Ecrin; a little history


Work at Stanford University showed that it was possible
to develop smart filters, based on wavelet algorithms,
that could drastically reduce the number of points without
eroding the data signature (p.13). Diamant, the fourth
KAPPA gemstone, was built to do exactly that, and the first
release of Ecrin linked the three applications involved in
PDG processing: the data cross-over and smart filtration
part (Diamant) and the analysis modules, PTA (Saphir) and
PA (Topaze). Technically the workflow was nearly perfect.
The data flowed, was filtered and was then sent successfully
to the relevant applications for analysis. But with use
it was clear further development was needed.

What is the analysis of dynamic data?


This is a long list. It starts with Pressure Transient Analysis
(PTA) and its counterpart, Production Analysis (PA). On
the scale and lifespan of the reservoir we can History
Match (HM) the production and pressure / temperature
data. It is also possible to obtain a vertical profile of the field
contribution with Production Logging (PL) and Formation
Testers (FT). To level all this to datum, the output of a
Well Performance Analysis (WPA) tool is useful, although
it will generally only provide a steady-state proxy of the
problem.
What is Ecrin?
Ecrin is the software environment under which all
the KAPPA dynamic data analysis modules operate.
By running under a single executable Ecrin provides
complete interconnectivity between the modules and
allows the sharing of common technical objects. This
seamless workflow saves time, repetition and frustration.
All objects such as PVT, data and models are available
to all modules, at any time, by drag-drop. This can be
done using, amongst other methods, the versatile Ecrin
browser. Incidentally, the weird name is a French thing.
Ecrin is the word for jewelry box. With Ecrin you buy the
gemstones, we provide the box.

PDG processing with Diamant Master and Ecrin

Diamant Master
Diamant was originally a locally installed user application,
local being the key word. Any engineer wishing to process
PDG data would have to directly connect to the data
historian, define his / her own filter levels, and use the
results, locally. At a time when real-time reservoirs were
the new buzzwords, clients sought enterprise-wide or,
at least, reservoir team-wide collaboration and access.
In response KAPPA developed a client-server solution,
Diamant Master, in order to share the standardized data
within a workgroup and establish real time links with the
coming Intelligent Fields (see p.12 to 15). However there
was one remaining technical caveat: production rates.
Production rates
The well production history, generally coming from
very inaccurate reallocation processes, was absolutely
useless for extracting a proper shut-in, so a lot of
manual work was left to the engineer. Early publications
on wavelets erroneously suggested that they could be
used to automatically identify shut-ins. In real life, hard
shut-ins and soft shut-ins, and anything between, were
so dissimilar that a single wavelet filter, however smart,
would miss too many transients to be of any practical
use. This problem has now been solved in v4.12, with
a new algorithm that identifies shut-ins with a high and
useable level of reliability. This was the missing link if we
wanted to automate the process of creating build-up files
in the intelligent field environment. This new algorithm is
detailed on p.14.

Ecrin browser

Sharing basic technical objects


This is not as easy as it might look. Take the example
of PVT: In an isothermal environment (Saphir, Topaze,
and the simple Rubis cases) Black Oil correlations, EOS,
or simply PVT tables are used. Now, drag-drop a PVT
object from one of these modules to a non-isothermal
module such as Amethyste, which requires temperature
related PVT. What happens if we have tables at only one
temperature? In Ecrin, the PVT object will check that it
is now in a non-isothermal environment and, if there are
only tables, will produce a pop-up asking the user to select
a correlation, which is then fitted to the table values at
reservoir temperature. Conversely, a correlation-based
non-isothermal PVT will become isothermal by simply
picking the temperature of the document into which it
is dropped.
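As an illustration of the behaviour just described, the sketch below mimics in plain Python a PVT object adapting to the module it is dropped into. The class, the linear stand-in for a Black Oil correlation and all names are hypothetical; this is not the Ecrin API.

```python
# Hypothetical sketch, not the Ecrin API: a PVT object defined only by a table at one
# temperature must be backed by a correlation when dropped into a non-isothermal module,
# while a correlation-based PVT dropped into an isothermal module is simply evaluated
# at the temperature of the receiving document.

def fit_linear_correlation(pressures, values):
    """Least-squares straight line, standing in for a real Black Oil correlation fit."""
    n = len(pressures)
    mean_p, mean_v = sum(pressures) / n, sum(values) / n
    slope = sum((p - mean_p) * (v - mean_v) for p, v in zip(pressures, values)) \
        / sum((p - mean_p) ** 2 for p in pressures)
    return lambda p: mean_v + slope * (p - mean_p)

class PVTObject:
    def __init__(self, table=None, correlation=None):
        self.table = table                # [(pressure, property)] at a single temperature
        self.correlation = correlation    # callable(pressure) -> property

    def drop_into(self, module_is_isothermal, document_temperature):
        if not module_is_isothermal and self.correlation is None:
            # Non-isothermal target, tables only: a correlation is fitted on the table
            # values at reservoir temperature (in Ecrin the user picks the correlation).
            pressures, values = zip(*self.table)
            self.correlation = fit_linear_correlation(pressures, values)
        elif module_is_isothermal and self.correlation is not None:
            # Isothermal target: the correlation is frozen at the document temperature.
            self.temperature = document_temperature
        return self
```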
In Ecrin v4.12 the interconnectivity is extended to
Amethyste: Intake curves calculated by Amethyste can be
sent to Saphir, Topaze and Rubis on a single click, and
IPR/AOF created in Saphir are ready to use in Amethyste.
Sectors of a Rubis model can be drag-dropped into Saphir
for 3D/3-phase modeling of pressure transients.
Sharing analytical models
In Ecrin, the PTA (Saphir) and PA (Topaze) modules share
an analytical model catalog. Details will change depending
on the environment, but it is globally possible to drag-drop
a complete document, including the analytical model, from
the PTA module to the PA module and vice versa.
Sharing numerical models
Numerical models are at the technical heart of Ecrin, and
they are our greatest challenge. With Ecrin we seek, step
by step, to build an understanding of the reservoir and its
wells, from the various dynamic data available. We first
use the data for analysis, then we use the result to history
match all available data and forecast the future. One way
or another we need to feed a unique model, which we will
use as a proxy of the reservoir. One could argue that such
a proxy should be the geological model. We respectfully
believe that it seldom works. Arguably, even when it does,
we generally have no time to use and update it in a practical
sense. The following is a guide to how the various levels
of numerical model are built in Ecrin, and how they can
contribute to defining this reservoir proxy.

Automatic Shut-in identification in Diamant Master v4.12

Share, share, share


With the development of a client-server solution it would
have been a waste to limit the sharing to PDG data.
Diamant Master has a field / well structure where Ecrin
documents and individual technical objects (PVT, kr
tables, maps, files of all types, etc) can be shared in a
structured way. Between Ecrin documents and objects
available in Diamant Master, engineers can share nearly
everything including data, technical objects and models.


How Numerical models are built in Ecrin


In Saphir and Topaze, building a model is fast, intuitive and
achieved within the time frame usually allocated to making
a transient or production analysis. The engineer focuses
on the physical problem, not the process of building. The
sequence above shows the typical steps in building a 2D
numerical model with the unstructured (Voronoi) grid in
Saphir or Topaze: import a bitmap (1), draw the wells and
the field inner / outer boundaries (2), define composite
zones (3), import fields of thickness, porosity and
permeability values (4), initialize the automatic grid (5),
show fields of static or dynamic data (6) and visualize
and animate results in pseudo-3D (7) or in real 3D (8).
Around each well the 2D unstructured gridding is replaced
by a 3D unstructured grid when needed, as for a limited entry
well (9) or a horizontal well (10). These models also
account for vertical (11) and horizontal (12) anisotropy.
Progressing from the 2D build to a full 3D Rubis model
takes just a few more intuitive steps (see p.25 to 26).

Saphir NL & Topaze NL


With a numerical model it was a natural progression to try
solving nonlinear PTA and PA problems that had hitherto
been overlooked. Saphir NL and Topaze NL can be
used to model real gas diffusion (no longer needing
pseudopressures), real dead oil (with pressure-related
physical properties), water-oil and water-gas problems,
water injectors, water drives (Schilthuis, Fetkovich, Pot,
Carter-Tracy, numerical), non-Darcy flow (Forchheimer
equation), unconsolidated formations, pressure-constrained
problems and, in v4.12, desorption models
for Coalbed Methane (CBM) and Shale Gas problems.
Rubis as a game changer
Rubis evolved from the numerical heart of Topaze, a product
where Production Analysis (PA) was greatly enhanced in
turn by using modern Pressure Transient Analysis (PTA)
tools. Rubis provides the next logical step, particularly in
the 3D multiphase environment.
Rubis is diametrically opposed to the next generation of
simulators, which can handle billions of cells with massive
parallel processing, with results that are often generated
too late to be useful. We want to match the production data
as often, and as quickly, as possible by modular integration,
using the pieces of the jigsaw puzzle from the different
methodologies such as PTA, PA, PL and History Matching
to create a proxy model of the reservoir. It is a tool that
sits somewhere between single-cell material balance and
massive simulation models; it replaces neither, but does
much of the work of both.

Why are Saphir / Topaze numerical models so fast?


The first numerical development from KAPPA was aimed
at modeling complex geometries running as a super-analytical,
linear model. For each time step a numerical
kernel requires a linear solver and a nonlinear solver.
The linear solver solves a local linear approximation
of the problem, while the nonlinear solver iterates on
the linear solver results in order to get the right answer.
When a problem is linear there is no need for the nonlinear
iterations; the numerical kernel will only use its linear
solver, and only once, at every time step. This is why
Saphir and Topaze solutions are so quick.
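A minimal sketch of that point, under assumed names and with NumPy standing in for the real solvers (this is not KAPPA's kernel): a linear problem needs one linear solve per time step, while a nonlinear problem wraps the same linear solver in Newton iterations.

```python
import numpy as np

def simulate(assemble, p0, n_steps=10, nonlinear=False, tol=1e-8, max_newton=20):
    """assemble(p) must return (A, r): the linearised system matrix and residual at p."""
    p = np.asarray(p0, dtype=float)
    for _ in range(n_steps):
        if not nonlinear:
            # Linear diffusion: a single linear solve per time step, no iterations.
            A, r = assemble(p)
            p = p + np.linalg.solve(A, -r)
        else:
            # Nonlinear diffusion: iterate the linear solver (Newton) until convergence.
            for _ in range(max_newton):
                A, r = assemble(p)
                dp = np.linalg.solve(A, -r)
                p = p + dp
                if np.linalg.norm(dp) < tol:
                    break
    return p
```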

Rubis models the reservoir with the smallest possible
number of cells; we history match what we can and use
this as a decision tool on a weekly or monthly basis, as
opposed to yearly or never. In an intelligent field, the
Rubis model becomes a reservoir proxy that may even
be used in real time to forecast production from the PDG
measurements.

Flexible Upscaling
The traditional way of growing a simulation model involved
feeding it with manual data such as Skin and PI. Not so
in Ecrin. All Ecrin numerical tools use the same technical
kernel, the main difference being the local grid refinement
around the wells.
To elaborate, consider the example of a horizontal well
in a rectangular reservoir (13). Outside the area directly
around the well, we have approximately 400 unstructured
cells to model this simple reservoir (14). We use, as a
reference, a test design using the Saphir analytical model
(15). The reservoir cells are common to Saphir, Topaze
and Rubis. However, the requirement around the well is
very different. For PTA (Saphir) we need very significant
refinement (16 & 17) around the well to perfectly simulate
the different flow regimes on a loglog scale (18). The price
to pay to fit the early time transients is 2,300 cells around the well.

The solution will be slower, and this sort of refined grid is
not very good at handling 3-phase flow. For PA (Topaze)
such significant refinement is not required. With a detailed
2D representation requiring around 300 cells around the
well (19 & 20), the response is honored after one hour,
which is sufficient given the frequency of production
data (21). For HM (Rubis), the early time transients are
irrelevant and the minimum time step will be one day. A
very coarse grid (22 & 23) with only 6 cells will provide a
solution that converges to the reference analytical case
only after 24 hours (24).

These cases are actually three instances of the same
process. An exclusive feature in Ecrin is an upscaling
parameter, from zero to one, which continuously modifies
the well gridding from the most detailed (0, for PTA) to the
coarsest (1, for HM), merging with the analytical reference
after a time ranging from one second to one day.
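Purely as an illustration of such a parameter (the actual Ecrin gridding rule is not published here), a geometric interpolation between the two cell counts quoted above would look like this:

```python
# Illustrative only: shrink the well-grid cell count continuously as the upscaling
# parameter goes from 0 (most detailed, ~2,300 cells in the example) to 1 (coarsest,
# 6 cells). The geometric interpolation is an assumption, not KAPPA's algorithm.
def well_cell_count(upscaling, n_fine=2300, n_coarse=6):
    return round(n_fine * (n_coarse / n_fine) ** upscaling)

print([well_cell_count(u) for u in (0.0, 0.25, 0.5, 0.75, 1.0)])   # 2300 ... 6
```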


The trick
Engineers familiar with numerical problems may wonder
how, and by what miracle, a coarse grid with six cells can,
after 24 hours, exactly fit the response of an analytical
model to the fifth decimal place. Correlations giving a well
index (the connection between the well and the cell) are not
that good. The reason is that we cheat. Whenever a coarse
grid is used, a refined PTA grid is also built and, before
anything else, a small single-phase simulation around each
well is run twice, once with the coarse grid and once with
the refined grid. The well index of the coarse grid is then
adjusted to match the productivity given by the refined grid.
Put another way: before any simulation, each well's coarse
grid is calibrated against the refined grid, which itself was
calibrated against the analytical model. By doing this,
numerical problems transferred between applications remain
completely consistent, even though they have different levels
of upscaling. In v4.12, the Rubis sector to PTA workflow is a
good illustration of this process, allowing a section of
a Rubis model to be sent to Saphir for pressure transient
analysis of detected shut-ins.
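A minimal sketch of that calibration step, with an assumed run_single_phase() callable standing in for the small local simulation (this is not KAPPA's implementation):

```python
def calibrate_well_index(initial_well_index, run_single_phase, coarse_grid, refined_grid,
                         tol=1e-6, max_iter=50):
    """Scale the coarse-grid well index until its single-phase productivity matches the
    refined grid, which is itself consistent with the analytical reference."""
    pi_target = run_single_phase(refined_grid, well_index=None)   # refined-grid productivity
    wi = initial_well_index
    for _ in range(max_iter):
        pi_coarse = run_single_phase(coarse_grid, well_index=wi)
        if abs(pi_coarse - pi_target) <= tol * abs(pi_target):
            break
        wi *= pi_target / pi_coarse        # simple fixed-point update on the well index
    return wi
```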

However, today PL is an important tool for understanding
multilayer formations. In some areas such as South East
Asia, layered sands are so numerous that PL, combined
with a simple material balance, may be the only possible
reservoir management tool. The linking has started in
part with the ability to export discrete layer rate data for
multilayer analysis.
Rubis now simulates PL responses, and therefore
PL results may also be used in the history match, with
the same authority as pressure data. PL analysis also
uses flow correlations in common with Well Performance
Analysis, and hence flow models used in a PL interpretation
can be the starting point of the VLP modeling in Amethyste.
Share, share, share some more
The ability to share data and technical objects between
applications and servers of a given vendor is useful, but
only a first step. Intelligent fields are generally built around
a data model, either built in-house by the operating
company or purchased off-the-shelf. Interacting with that
data model, to obtain the data structure and the path to
the historians, is required to minimize the connection
overhead between this central structure and peripheral
suites. The results from Ecrin modules may be required by,
and sent to, other third-party applications, either directly or
via the operating company data model. In v4.12 a first version
of such an interface was developed to access information
stored in the Petroleum Experts IFM database. This is
the first of a long series of links. In 2010 KAPPA plans to
release the first version of an open server API allowing
Ecrin results to be transferred to third-party applications
via Diamant Master.

New in v4.12: Well Performance Analysis (WPA)


To correct pressures to datum, KAPPA modules can
import well intake curves in standard ASCII format. So
why, we hear you cry, would KAPPA develop Amethyste,
a WPA (or NODAL, in Schlumberger parlance) software
when there are already perfectly good ones on the market?
The answer lies in the need to build well models in complete
coherence with the existing Ecrin PVT objects and flow
correlations and, not least, in the substantial time it again
saves the user. If a transient test has been analyzed in
Saphir, the IPR data and/or the IPR itself can be drag-dropped
from Saphir to Amethyste and much of the work is done.
Well intake curves can also be drag-dropped from Amethyste
to Saphir, Topaze and Rubis.

Connecting Diamant Master to third party reservoir models

Dynamic data workflow, today


The release of Ecrin and Diamant Master v4.12 is a
milestone for the user. With the new automatic build-up
identification and related rate history correction, the
release of Amethyste (WPA) and the first link to a third-party
data model, there is no longer a gap in the data
workflow. The real-time management of dynamic data is
now operational.

Using an Amethyste wellbore model in Rubis

Emeraude Production Logging


Emeraude, the KAPPA Production Log interpretation
software, was first developed in 1994, independently of
Saphir. There was an operational link, naturally, and PL
results could be used very early in the process to constrain
or orient multilayer PTA.

What's New in v4.12?

Running a full 3D / 3-Phase PTA model in a sector of the full field model

v4.12 is a major new release.


It integrates a new module for Well Performance Analysis, Amethyste, and sees
a fundamental re-write of the Production Analysis module, Topaze.
Technically, major new developments have been made that constitute a real
breakthrough on the path to integrating the analysis of dynamic data into the next
generation of field management, or Intelligent Fields as they are known. This
includes, for the first time, a reliable way to automatically identify shut-ins from PDG
data, whether they are hard or soft. The demand for new tools for unconventional
gas (shale gas, CBM) has been addressed by integrating desorption in the
numerical models. For shale gas a numerical fractured horizontal well has been
added. In the drive towards connectivity, an intelligent link between Diamant
Master and an industry-standard field model is included in this release.
Looking forward, work has started in earnest on restructuring some of the
calculation modules to prepare for the drastic move to Generation 5, and development
is underway in Rubis on a Carbon Capture and Storage (CCS) prototype to model
the injection and migration of CO2 in deep saline formations, in anticipation of
pilot projects post-Kyoto.
A full technical description with a list of all new features is available for download
from: http://www.kappaeng.com/ecrin412.pdf
The following is a summary of key improvements.


The new Ecrin module: Amethyste (WPA)


Ecrin 4.12 is a major new release that includes a new
module, Amethyste. This Well Performance Analysis
(WPA), or Nodal in Schlumberger terminology, module
completes the Ecrin Dynamic Data Analysis suite and
provides a wellbore model to connect the reservoir to
surface. It is included free to existing Ecrin users until April
2010. We felt this price would please managers in the
current environment.
Topaze rewritten
Topaze, the Production Analysis (PA) module, has been
fundamentally re-written to offer true multi-well capability.
Load facilities have been extended to include DMP2,
Merak and groups from Diamant Master. It is now possible
to make fast analyses in parallel for individual wells or
groups of wells. These can be viewed and compared in
tables and/or bubble maps, and a field production profile
generated. A production profile generator (formerly the
free-standing K-Prospect) has been included. When
added to Saphir, with so much shared code, the price is
marginal.

Running a section of a reservoir model in Saphir


It is now possible to grab a section of the full field Rubis
History Match (HM) model and drop this into the Saphir
(PTA) module for analysis. Saphir is then running a sector
of the full field 3D model, 3-Phase, multiwell and with
gravity effects.
Saphir deconvolution
A fourth deconvolution method has been added (after the
combined KAPPA and Levitan methods).

Fourth deconvolution method in Saphir

Tools for unconventional gas


The increasing demand for tools to model unconventional
gas sees the addition of gas desorption for use in gas
or gas-water cases in shale gas and coal bed methane
formations. This can be combined with unconsolidated
formation responses. A numerical fractured horizontal well
model has been added.

Topaze multiwell

Automatic shut-in identification in Diamant Master


With the increasing use of Permanent Downhole Gauge
(PDG) data and the wide acceptance of Diamant Master
for the management and handling of the massive data this
generates, new labor-saving tools have been developed
and added in v4.12. The elusive issues of automatic build-up
identification and corresponding rate allocation and
synchronization have been solved, saving hours of tedious
work for the user. It is now possible to identify years of
build-ups in Diamant Master and send these to Saphir in a
few clicks. When opened in Saphir the rates and build-ups
are already synchronised. Voila! Time for a coffee.
Connection to third party data models
This principle of connectivity is extended to third parties
with the facility to create and update Diamant Master by
mapping the field description of an integrated database,
transferring the production history and automatically looking
for PDG data in historians. The developments are generic,
with the first implementation on the Petex IFM data model.


Fractured horizontal well model

and many other additions


Amongst a host of minor improvements and updates, data
input has been improved with the addition of Excel load,
the French units system, Petex black oil PVT load, PAS
file explorer to Ecrin drag-drop, and GRDECL Eclipse and
Zmap import into Rubis (HM). The shared modeling
capabilities see many enhancements, including new PVT
correlations, expansion of the already extensive flow
correlation resource with the merging of the Marathon and
KAPPA libraries, the addition of new plots and models, and
improved general handling.
...and Ecrin now has an undo button.

Reservoir Surveillance of Dynamic Data

Diamant main window

Permanent Downhole Gauges (PDG) are a remarkable source of information,
providing both long-term production data and occasional build-ups that may
be described as free well tests. Data are acquired at high frequency and over a
long duration. The downside is the large number of data points gathered, which
can amount to hundreds of millions per sensor, far beyond the processing
capability of today's fastest PCs. There are a number of challenges: storing and
accessing the raw data, filtering, transferring this to the relevant analysis module
and finally sharing both filtered data and analyses.
Diamant Master is a client/server solution for reservoir surveillance that
addresses these issues in a shared environment. It permanently mirrors raw data
from any data historian, reduces the number of points with wavelet-based filtration,
stores and shares the filtered data and also exports it to third-party databases.
Derived data can be created and updated by user-controlled mathematical
operations on existing data. Boolean alarms can be created and used over a
network. Diamant Master also stores technical objects and maintains the data
with enterprise-wide consistency, avoiding the need for repetitious data handling
and speeding the workflow. Diamant Master is administered, and can be partially
operated, by a WEB client, and is fully controlled by the Diamant module in Ecrin.
New in v4.12, exclusive algorithms automatically identify, isolate and send multiple
shut-ins for analysis. Automatic rate allocation at build-up inception, honoring the
actual production history, has been added, further reducing tedious engineer
workload by pinpointing a daily rate to the moment a build-up begins.


What PDG data provides


PDGs acquire pressure data at high frequency and over
a long duration. A typical data set includes two types
of information: each spike is an unscheduled shut-in that
may be treated as a free well test for PTA, while the
long-term global producing pressure response, ignoring
these spikes, can be used in association with the well
production to perform Production Analysis and/or history
matching. The data is there and it is already paid for. It is
simply a matter of getting at and interpreting the data.
Nice idea; one not-so-little problem: the available data is
vast and growing. For one single gauge there are typically
3 to 300 million data points. This will bring even the fastest
of today's PCs to a grinding halt. But we need both short-term
high-frequency data for PTA and long-term low-frequency
data for PA.

Typical PDG data response gathered over two weeks

Wavelet filtration
To perform a transient or production analysis we typically
need 100,000 data points. The trouble is that it is a different
100,000 from the same dataset. To obtain both, Diamant
Master (DM) uses a wavelet algorithm. Wavelets may be
described as a smart filter with a threshold. For each
point the local noise is estimated for different frequencies.
If the local noise is above the threshold, as occurs for pressure
breaks when the well is shut in, it is considered significant
and is kept. In this case, the wavelets act as a high-pass
filter. Conversely, if the noise level is below the threshold,
it is considered to be noise and is filtered out; here the
wavelets act as a low-pass filter. As a result, producing
pressures are filtered down to a few points per day, while all
early shut-in data are preserved. For the engineer, it is the
software equivalent of running a pencil through a noisy data
cloud, but marking, and closely following, the data whenever the
engineer sees a shut-in.
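A minimal sketch of threshold-based wavelet filtering in the spirit described above, using the PyWavelets library on synthetic data (this is not KAPPA's implementation, and the threshold value is arbitrary):

```python
import numpy as np
import pywt

def wavelet_filter(pressure, wavelet="haar", level=4, threshold=1.0):
    """Zero the small detail coefficients (noise) and keep the large ones (breaks)."""
    coeffs = pywt.wavedec(pressure, wavelet, level=level)
    filtered = [coeffs[0]] + [pywt.threshold(c, threshold, mode="hard") for c in coeffs[1:]]
    return pywt.waverec(filtered, wavelet)

# Synthetic example: a noisy producing pressure with a sharp break at a shut-in.
t = np.linspace(0.0, 14.0, 2 ** 12)                    # two weeks of data
p = 3000.0 - 5.0 * t + np.random.normal(0.0, 2.0, t.size)
p[t > 10.0] += 150.0                                   # pressure build-up at the shut-in
denoised = wavelet_filter(p)
```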

Wavelet denoising: (1) raw data = 10,000 points; too low (2), too high (3) and selected (4) thresholds; (5) post-filtration; (6) filtered data = 70 points

Diamant Master (DM) workflow


Diamant Master is an ongoing process installed on a
dedicated machine running Windows Server. Engineers,
subject to privileges, operate DM from Diamant in Ecrin or
from a WEB-based subset. All operations are performed and
shared on the DM server, which remains persistently linked
to the original data source(s) from which it sequentially
imports the raw, unfiltered data. Users can navigate
the input database and indicate which tag(s) should be
imported. Data is mirrored from the raw database to a
local, fast-access format. At the start of deployment DM
will remain in an infinite loop in order to retrieve the legacy
data. Once DM has caught up on a given gauge, it will
regularly reconnect and load the new data on a timer set by
the DM administrator. For each mirrored data set, users with
the right privilege may create one or several filtered channels
using the wavelet filter.
Once the filter is defined, DM will update it in the background
as soon as sufficient new points have been mirrored.
The filtered data is stored in the local DM database, to be
subsequently sent to the Ecrin analysis modules with a single
drag-and-drop. This data may also be exported to a third-party
database. It is possible for Ecrin users to return
to any part of the data and request a reload with a different
filter, or none. DM stores KAPPA technical objects and files
in a hierarchic and intuitive structure to be shared by the Ecrin
interpretation modules.

PDG data after filtering


Connecting to data
The beauty of standards is that there are so many to choose
from. So it is in the Oil Industry: there is no standard way to
store PDG data. There are many providers, and each has
their own data model. It is common for Operators to have
several providers, and hence different data models will
co-exist. Most databases have low-level access (ODBC,
OLEDB, OPC, etc), but this is, at best, cumbersome for
end users. Each solution would require a specific adaptor
to navigate and access the data. KAPPA has implemented
a unique API, the External DataBase Interface (EDBI), that
permits the connection of customized adaptors. In most
cases the adaptor is written by KAPPA. Each adaptor
is delivered as a DLL that includes the data access and
the user interface to navigate the database; it acts as a
plug-in. At the first connection, Ecrin will automatically
download the DM plug-in and the user can navigate
without further installation. The database interface also
has adaptors to export the filtered data to external client
databases.
Data processing
At initialization, and as a one-off process, DM performs
a quick data scan of one point in every ten thousand to
offer a preview of the data and help in spotting anomalies
and gross errors. A selection on the data window can be
made and outliers immediately discarded. Within the load
window an initial sample of fixed size, typically around
100,000 points or one week of data, is extracted. In an
interactive and iterative process, the engineer adjusts
the wavelet setting until the required data signature
sensitivity is retained and superfluous data filtered.
Post-filtration, based on a maximum Δt and Δp,
is then used to reduce the number of points to the final
de-noised signal. Upon user acceptance the filtration is
performed using overlapping increments of the size of the
initial sample.
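A small sketch of the post-filtration step, under the assumed logic that a point is only kept once the elapsed time or the pressure change since the last kept point exceeds the maximum Δt or Δp:

```python
def post_filter(times, pressures, max_dt, max_dp):
    """Keep the first point, then only points that move more than max_dt or max_dp
    away from the last kept point; everything else is dropped."""
    kept_t, kept_p = [times[0]], [pressures[0]]
    for t, p in zip(times[1:], pressures[1:]):
        if (t - kept_t[-1]) >= max_dt or abs(p - kept_p[-1]) >= max_dp:
            kept_t.append(t)
            kept_p.append(p)
    return kept_t, kept_p
```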
Calculating derived channels
These are user defined and permit mathematical
operations on data channels with a comprehensive
formulae package. The outcome may be another data set
or a Boolean function of time that may be used to create
an alarm. The outcome of an alarm may be a display in the
Diamant window, the sending of an alarm e-mail, or the
call of a user-defined DLL.
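As a simple illustration (the channel names, formula and limit are hypothetical, and this is not the Diamant Master formula package), a derived channel and a Boolean alarm built from it might look like this:

```python
def derived_drawdown(reservoir_pressure, flowing_pressure):
    """Derived channel: drawdown = reservoir pressure minus flowing pressure, point by point."""
    return [pr - pwf for pr, pwf in zip(reservoir_pressure, flowing_pressure)]

def drawdown_alarm(drawdown, limit=500.0):
    """Boolean channel of time: True whenever the drawdown exceeds the limit."""
    return [dd > limit for dd in drawdown]

p_res = [3000.0, 2995.0, 2990.0, 2985.0]
p_wf = [2600.0, 2450.0, 2400.0, 2350.0]
alarm = drawdown_alarm(derived_drawdown(p_res, p_wf))   # [False, True, True, True]
```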

Automatic Shut-in identification in Diamant Master v4.12

Allocating the rates... automatically


If a well flowing at 2,000 BOPD is shut in halfway through
the day, its rate for the 12 flowing hours is still 2,000 BOPD,
not 1,000 BOPD. When there is a build-up, it is therefore a
question of allocating this rather sparse rate data correctly
to the period before the well was shut in. This used to be a
tedious manual process that involved the user creating a
derived Boolean channel to identify flowing and shut-in
periods based on the build-up. It has been fully automated
in v4.12.
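A small worked example of the allocation point above, under assumed logic rather than KAPPA's actual algorithm: a daily allocated volume of 1,000 bbl with a shut-in at noon corresponds to a flowing rate of 2,000 BOPD for the 12 flowing hours and zero afterwards.

```python
def allocate_rate(daily_volume_bbl, shut_in_hour):
    """Split a daily allocated volume into a flowing rate before the shut-in and zero after."""
    flowing_rate = daily_volume_bbl * 24.0 / shut_in_hour if shut_in_hour > 0 else 0.0
    return [(0.0, shut_in_hour, flowing_rate), (shut_in_hour, 24.0, 0.0)]

print(allocate_rate(1000.0, 12.0))   # [(0.0, 12.0, 2000.0), (12.0, 24.0, 0.0)]
```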
Transferring data to Ecrin analysis modules
Filtered data can be transferred from DM by drag-drop to
an analysis module. Shut-ins are analyzed and compared
using the PTA module (Saphir) while producing pressures
will be history matched or used in diagnostic plots using
the PA module (Topaze) or even through to the full field
history match in Rubis. DM maintains a persistent link to
the original data source. For each gauge, regularly or on
user request, the process reconnects to the data source
and then loads and filters incremental data using the
filter as set for the particular gauge. It is also possible to
change the filter setting, for new data or retroactively, or
to partially re-populate a data segment over, for example,
an identified build-up with completely different or no
filtration.

Identifying shut-ins... automatically
We believe this to be a very important breakthrough. Until
recently, all algorithms (including those involving wavelets)
failed miserably to automatically identify shut-ins,
especially when data sets contained both soft and hard
shut-ins. In v4.12, an exclusive algorithm locates, without
user intervention, all transients within a selected time
period, with a rate of success that makes it a considerable
time saver for the engineer. Years of PDG data can be
scanned in seconds, the transients identified and made
available to the user for simultaneous or discrete analysis
in Saphir (PTA). This was the last missing link to allow full
automation of the data processing.
History matching in Topaze


Multiple build-up analysis in Saphir

Express individual or multiple shut-ins


With the transients identified and daily rates correctly
allocated and cleaned, shut-in data can be sent, en masse
or individually, to a PTA (Saphir) document automatically
created by Ecrin. The result can be the latest shut-ins
or a cloud of transients from previous years, which may
be analysed together, as a selected group or discretely.
Years' worth of shut-ins are gathered from DM with rates
synchronised and presented in Saphir in seconds.

Diamant Master processes

WEB access and administration


Diamant is the best way to handle data, technical objects
and files when using KAPPA applications. However,
these can also be accessed from an Internet browser by
connecting to the DM server IP address or its name in the
domain. The engineer can view the status of the different
processes, access the data tables and technical objects
and recover the filtered data in Excel™ format without using
Ecrin. An ActiveX control can also be loaded to navigate
the data structure in the same browser environment
as Diamant.

Diamant ActiveX control


Build-ups identified in Diamant and ready to send to Saphir

Diamant Master process


The diagram below shows the different components
of the DM process. These operate continuously and
independently. The interfaces between the KAPPA
storage database, the Ecrin clients, the WEB clients and
the other DM processes are controlled by the DM Server
(DMS). It protects data locked by a user against possible
interference from other users. When an Ecrin user decides
to mirror PDG data or to create new filtered data, the DMS
will store the new instructions in the KAPPA database.
The DM Mirroring Process (DMMP) and the DM Filtering
Process (DMFP) are independent. The DM Calculation
Process (DMCP) creates and permanently updates tags
that are derived from other tags. The DMCP also sets
alarms.


PDG workflow using Diamant only


For very small workgroups, the Diamant module in Ecrin
has a subset of the Diamant Master PDG capabilities. The
database connection (EDBI), and therefore the ability to
access filtered data from various sources, is the same.
Mirroring is allowed but incremental loads are triggered
by the user. The filtration process is identical but data
are stored in a local Diamant file. Direct sharing is not
possible, however filtered data may be exported to files. It
is not necessary to purchase Diamant in order to operate
Diamant Master.

Pressure Transient Analysis

Saphir main window

Saphir was first developed over twenty years ago by two engineers who needed
a tool for their own interpretation work. It was fast, interactive and robust, and it
remains so, but much has changed. Saphir has grown to a dominant position in the
industry with over 2,400 commercial licenses, used as standard by nearly all the
major IOCs and NOCs, and by other clients across operators, service companies
and consultants on all continents.
The Saphir methodology has always been based on the Bourdet derivative as
the main diagnostic tool, matching the measured data to the model while taking into
account the detailed production history.
The ever-increasing processing power of PCs has enabled KAPPA to aggressively
expand the technical capability of Saphir. This has resulted in the development
of extensive and fast numerical modeling, extension to nonlinear problems
in Saphir NL, multiple deconvolution methods and now integration with other
modules in the Ecrin suite.
In v4.12, as module interconnectivity develops, sectors of a full-field Rubis
model can now be extracted and simulated in Saphir. Layer rates from PLTs
in Emeraude can be imported for multilayer analysis and, with the concurrent
release of the Amethyste WPA module, wellbore models and IPR / AOF can be
exchanged with a single click. A new numerical fractured horizontal well model
is now available and Saphir NL can now model desorption for shale gas and
coalbed methane. A new deconvolution method has also been added.


Loading and editing data


Generally the most tedious and time-consuming part of PTA
is to input the known parameters, load the rate and pressure
data, quality check and edit where needed, then extract the
periods of interest, generally shut-ins, in order to start the
interesting part: the loglog and specialized analyses. So,
although this is not the most riveting of subjects, Saphir
can load an unlimited number of gauges, rates, pressures
and other data in almost any format including ASCII,
Excel™, PAS and databases of all kinds via OLEDB &
ODBC. Data may be input as points (time, value) or as
steps (duration, value). Saphir has real-time links with
various acquisition systems, and data drag-and-drop from
other Ecrin modules and Diamant Master. It is possible to
start a build-up analysis from the direct selection of a shut-in
phase in Diamant Master. In the case of a multi-layer
test, layer rates may be extracted from the PL module
Emeraude to discriminate the layer contributions.

Edit rates dialog

QA/QC
There is a comprehensive range of interactive edit and
QA/QC tools including trends, tidal correction, gradient
analysis, and the possibility to compare various gauges to
detect gauge drift and wellbore effects between sensors.

Correction to datum with VLP models


Saphir can define a Vertical Lift Profile (VLP) or import
a well intake model. In v4.12, with the release of the
Amethyste WPA module (see page 28), it is now possible
to generate the VLP in Amethyste and drag-drop it
into Saphir. The VLP is used in conjunction with the
analytical or numerical model to simulate the pressure
at gauge depth, in particular at surface. Alternatively
the VLP can be used to correct pressure data a priori to
reservoir depth.
Test design
All Saphir analytical and numerical models may be used
to generate a virtual gauge on which a complete analysis
may be simulated. Simulation options taking into account
gauge resolution, accuracy and potential drift can be the
basis for selecting the appropriate tools or to check if the
test objectives can be achieved in practice.
Extracting Δp and Deconvolution
Once the data are ready, the loglog and specialized plots
can be extracted. Alternatively, the Saphir deconvolution
option can be used to create an equivalent, extended
drawdown response from several successive build-ups.
There are caveats, and the assumptions behind these
developments, their limitations and suggested usage are
developed in the free KAPPA Dynamic Flow Analysis book
(see page 46). Saphir was the first commercial product to
make these techniques available to its users and, in this
third release of the deconvolution, it is the only program to
offer four different methods: (1) a single deconvolution to
match multiple build-ups with a variable initial pressure;
(2) one deconvolution per build-up with a single fixed value of
initial pressure; (3) a single deconvolution to match multiple
build-ups with the ability to ignore the early time of all but
one period, to deal with inconsistent early times; and, new in
v4.12, (4) a hybrid: method (3) followed by method (2) in
the same automated loop.
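For reference, deconvolution inverts the superposition (convolution) relationship between the measured pressure and the rate history to recover the unit-rate drawdown response \(p_u\), commonly written as

\[ p_i - p(t) \;=\; \int_0^t q(\tau)\,\frac{dp_u}{dt}(t-\tau)\,d\tau . \]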

QA/QC

Deconvolution on two consistent build-ups


Specialized plots
Additional specialized analysis plots can be created with
options tailored to specific flow regimes. These include
very short-term tests, or FasTest™ for Perforation Inflow
Testing, and predefined types such as MDH, Horner,
square root and tandem root. The user creates straight
lines, by regression or interactively, and Saphir calculates
the relevant parameters.

Matching data with a Numerical model


Since v3.0, numerical models have been used to generate
complex geometries with physical parameters beyond
the scope of analytical models. This is predominantly 2D
but with 3D refinement where needed. The mechanics
for building such models are described on p. 7 and 8. In
v4.12 the most complex numerical model to date has been
added to solve the problem of fractured horizontal wells.

Classical Horner plot

Interpretation using a numerical model

Matching data with an analytical model


Saphir offers a comprehensive built-in analytical catalog
allowing combinations of the traditional well, reservoir and
boundary models. Additional specific external models are
available and listed on page 35. Interactive pick options
are offered for most parameters for a first estimate by
selecting a characteristic feature of the model on the
Bourdet derivative plot. If the user gets stuck there is the
option to use the AI package KIWI as a guide. Additional
capabilities include rate dependent (non-Darcy) Skin,
changing wellbore storage, interference from other wells,
gas material balance correction for closed systems,
well model changing in time (e.g. pre and post frac, or
changing Skin), horizontal and vertical anisotropies and
layered (commingled) formations.

Saphir NL
Saphir NL (NonLinear), uniquely in the industry, handles
various types of nonlinearities: the slightly compressible
fluid assumption and pseudopressures are replaced by
the exact diffusion equations, solving for real gas diffusion,
non-Darcy flow, pressure-related physical properties,
multiphase flow, water injectors and water drives. In v4.12,
gas desorption based on the Langmuir isotherm has been
added for shale gas (single phase, combined with the
fractured horizontal well) and coalbed methane (2-phase
water-gas).
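For reference, the Langmuir isotherm on which these desorption models are based relates the adsorbed gas volume to pressure as

\[ V(p) \;=\; \frac{V_L\,p}{p_L + p}, \]

where \(V_L\) is the Langmuir volume and \(p_L\) the Langmuir pressure.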
Rubis Sector to Saphir PTA (new in v4.12)
A sector of a Rubis full field 3D reservoir model can be
imported and used in Saphir. In essence it enables Saphir
to go beyond the current Saphir NL limitations and to use
the Rubis sector analyses as a tool that can simulate
complex three-phase flow processes with gravity. The
key element of this new integration step between the
Ecrin modules is that the model is not simplified upon its
transfer from the full-scale simulator model in Rubis to the
PTA module Saphir. The full-scale simulation model is
simply stored in Saphir and re-simulated from there. This
approach is possible because the full-scale Rubis model
contains, by design, the ability to simulate transient flow
responses accurately and precisely, thanks to the well
upscaling feature (see Numerical Upscaling on page 8).

2-Porosity PSS - Picking the transition


Rubis sectors running in Saphir

IPR

Multilayer models
Saphir integrates a comprehensive multilayer analytical
and numerical option with an unlimited number of
commingled (analytical and numerical) or connected
(numerical) layers. Each commingled layer has its own
initial pressure. For the analytical models, for each
layer the engineer may select any standard or external
model. Individual stabilized and/or transient rates can be
loaded and associated to any combination of contributing
layers. Rates may be loaded directly from the Emeraude
PL analysis module. The model simulates the pressure
response and the combination of layer rates that were
loaded with simultaneous optimization on both pressures
and layer contributions.

Formation tester module


This temporary Saphir addendum is provided for users
who wish to have a simultaneous match of source and
observation pressures. The v4.12 FT module permits
the interpretation of any number of probes, active and
interference, to discriminate vertical permeability. Models
for packer-probe and probe-probe interference are included,
with the latter considering storage and Skin. An inbuilt
pre-processor handles LAS format files and there is an option
to calculate rates from pump volumes.

Improving the model and running sensitivities


After model generation, nonlinear regression is used to
optimize the model parameters. This may be automatic
or the user may control the list of variable parameters and
their acceptable range, as well as the weight assigned to
different data sections. Optimization may be performed
on the loglog plot or on the whole production history.
Confidence intervals may be displayed at the end of
the regression process. Sensitivity analysis may be
performed by running the same model for different ranges
of parameters. Multiple analyses may be overlaid and
compared on all plots.
AOF / IPR
Saphir performs AOF / IPR analyses, available for vertical,
horizontal and fractured wells. The IPR can be used for
flow-after-flow, isochronal or modified isochronal tests. It includes
options to display extended, stabilized and transient
IPR. Shape factors for IPR and average pressure
calculations are available for closed as well as constant-pressure
systems.
The result of the IPR study can be drag-dropped into
Amethyste. Alternatively, Saphir can send the test data
to, and retrieve the IPR study from, Amethyste.


Formation testers: Analysis and rates calculation from volumes

Reporting and exporting


Saphir has an extensive range of comparison, reporting,
exporting and printing capabilities. The free and
unprotected Saphir Reader allows files to be read, printed
and exported without the requirement for an active license.
In v4.12 a new slide presentation format has been added,
to use Saphir live on an LCD projector or to copy / paste
plots into PowerPoint™.

Production Analysis

Main Topaze screen with production history plot

Topaze was first developed in response to Production Analysis (PA) evolving
from empirical methods to a methodology more closely aligned with modern
transient analysis. The old methods have been largely replaced by advanced
methodology such as the Blasingame plot and true diagnostics developed
in Pressure Transient Analysis (PTA). There are now over 1,100 commercial
licenses of Topaze, with clients ranging from the major IOCs to NOCs and
smaller independents on all continents.
The merging of the modeling capability of Topaze with the abundance of data
from permanent gauges has meant that users are able to obtain answers that
were previously only available from transient tests. This information has the
advantage that it is available at no extra cost and with no deferred production. As
the long-term production is modeled, the evolution in time of the well productivity
may also be quantified. Finally, forecasting is based on a real model as opposed
to an empirical function.
Complete analyses may be copied from Saphir into Topaze (or vice versa)
by simple drag-drop, providing a quick starting point for production analysis and
forecasting. A happy spin-off is that shared development brings cost savings that
are passed on to the client: Topaze, when added to Saphir, is a marginal cost.
In Topaze NL (NonLinear) the slightly compressible fluid assumption and
pseudopressures are replaced by the exact diffusion equations. In particular,
Topaze NL can handle water influx, compression effects and 3-phase flow.
v4.12 sees the release of a major re-write of the Topaze Production Analysis
(PA) module. The new version now has simultaneous multiwell decline-curve
capability, allowing full-field forecasting. A field production profile generator has
been added. For unconventional gas, desorption based on the Langmuir isotherm
is available, which can be combined with a new numerical fractured horizontal
well model for shale gas, and with water-gas flow for coalbed methane.

Loading and editing data


Topaze can load and edit an unlimited number of gauges,
rates, pressures and other data in almost any format
including ASCII, Excel™, PAS, DMP2, Merak and
databases of all kinds via OLEDB & ODBC. Data may be
input as points or as steps. Topaze has real-time links
with various acquisition systems, and data drag-drop from
other Ecrin modules or from Diamant Master (DM).
Well intake correction
When pressures are acquired at surface, or at any point
other than the sandface, the well intake option allows
either the loading or generation of a well intake response
to simulate sandface pressure. When the user extracts the
data for analysis, options include the choice of pressure
and rate gauge, time range, time sampling, and whether
or not to correct pressures to datum; in v4.12 this correction
may use the wellbore model created in, and copied
seamlessly from, the Amethyste (WPA) module.
Classical PA diagnostics
Topaze is not designed to be a receptacle for the multitude
of straight-line methods available. The objective is to
use the most up-to-date modeling methods to maximize
the use of both rate and pressure data and adapt to the
demands of today's production environment, e.g. depleted,
multiphase, tight gas and aquifer driven. When classical
methods are applicable the interface has been designed
to be as simple and flexible as possible. The following is a
summary of these methods. Fetkovich type-curves: single
pressure, assuming constant producing conditions.
Arps plot: multiple empirical plots with log(q) versus time as
the default; nonlinear regression fits the data and displays
the decline function, leading to an estimate of the ultimate
recovery. p-q plot: used to discriminate between transient
and boundary-dominated flow; a line can be created in
the latter regime to reveal compartments by comparing
slopes. Normalized rate cumulative plot: a variation of
the Agarwal-Gardner plot that shows (dimensionless) rate
versus cumulative production; a straight line at boundary-dominated
flow gives an estimate of reserves. Flowing
Material Balance, added in v4.12: resembles a normal
P/Z plot expressed in terms of reservoir average pressure;
a two-step iterative procedure then leads to the reserves
estimate.
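For reference, the classical Arps hyperbolic decline that the nonlinear regression can fit on this plot is

\[ q(t) \;=\; \frac{q_i}{\left(1 + b\,D_i\,t\right)^{1/b}}, \]

with the exponential (b = 0) and harmonic (b = 1) declines as limiting cases.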

Fetkovich

Modern PA diagnostics
The three main and complementary diagnostics used
in Topaze are the rate and pressure history match, the
Blasingame plot and the loglog plot. The Blasingame plot
displays the instantaneous and average productivity index with
respect to material balance time (cumulative production
divided by instantaneous rate). It also calculates the
derivative, in a display similar to an inverted loglog plot,
tending to a negative unit slope when pseudo-steady
state is reached. A Blasingame type-curve plot can be
displayed, representing the response of a vertical well in
a closed circle, with the data overlaid. This plot is also
available for gas, where material balance time is replaced
by a material balance pseudo-time. The loglog plot can
be used as a diagnostic tool with exceptionally clean
data, although trends may be detected even with scattered
data. The simulated model can be compared to the
data on both of these plots.
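For reference, the material balance time used on the Blasingame plot, and the normalized rate (a productivity index) commonly plotted against it, are

\[ t_e \;=\; \frac{Q(t)}{q(t)}, \qquad \frac{q(t)}{p_i - p_{wf}(t)}, \]

where \(Q\) is the cumulative production, \(q\) the instantaneous rate, \(p_i\) the initial pressure and \(p_{wf}\) the flowing pressure.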

Blasingame type-curve with data overlay

Arps multiple scale analysis


Model match on Blasingame plot

Matching data with an analytical model


With a wide range of well, reservoir and boundary models
shared exactly with Saphir (PTA) (see technical references
page 35) Topaze offers the unique capacity to simulate
pressures from the production history, or simulate rates
and cumulative production from the pressure history,
or both simultaneously. Nonlinear regression then
allows history matching, minimizing the error in terms
of pressures, rates, cumulative production or any
weighted average.
Matching data with a Numerical model
Since v3.0, numerical models have been used to generate
complex geometries with physical parameters beyond the
scope of analytical models. This is predominantly 2D but
with 3D refinement where needed. The mechanics for
building such models are described on page 7. In v4.12 the
most complex numerical model to date has been added to
solve the problem of fractured horizontal wells.

Water influx

Numerical model with multiple wells

Topaze multiwell capability (New in v4.12)


The new multiwell mode of Topaze facilitates the analysis
of the production of multiple wells. The production data
can be loaded simultaneously in formats including DMP2,
Merak, or by drag-drop from Diamant. It is then possible
to view this data together in a browser, and conduct quick
or detailed analysis of all wells or groups of wells. The
results are viewed easily as tables or bubble maps as
required and then it is a simple step to construct a field
profile from any extension of the selected diagnostic, be it
a decline curve or a complex model.

3D model

Topaze NL
In Topaze NL (NonLinear), nonlinearities are also added
to the problem; the slightly compressible fluid assumption
and pseudopressures are replaced by the exact diffusion
equations solving for real gas diffusion, non-Darcy flow,
pressure related physical properties, multiphase flow
and water drives. In v4.12, gas desorption based on the
Langmuir isotherm has been added for shale gas (single-phase)
and coalbed methane (2-phase water-gas).
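The Langmuir isotherm mentioned above relates adsorbed gas content to pressure through two parameters, the Langmuir volume and the Langmuir pressure. A minimal sketch with illustrative values (not a Topaze NL input):

    def langmuir_gas_content(p, v_l=350.0, p_l=600.0):
        """Adsorbed gas content, scf/ton, at pressure p (psia).
        v_l is the Langmuir volume (scf/ton), p_l the Langmuir pressure (psia)."""
        return v_l * p / (p_l + p)

    p_initial, p_abandon = 2000.0, 400.0
    released = langmuir_gas_content(p_initial) - langmuir_gas_content(p_abandon)
    print(f"Gas desorbed per ton of rock as pressure falls "
          f"from {p_initial:.0f} to {p_abandon:.0f} psia: {released:.0f} scf/ton")
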

Topaze multiwell dialog


Production profile generator (New in v4.12)


The former K-Prospect is now integrated in Topaze. This
tool replaces the non-standard spreadsheets developed
by many engineers in the course of their work. It provides
a uniform and standardized approach to obtaining quick
production estimates for new fields and incremental
recovery studies. Valid for oil, gas and condensate, the
user can model water or gas drive, taking into account
all producing and/or injecting wells. It allows the input
of an unlimited number of various well-type profiles and
generates a field production profile consistent with the
drilling and workover schedule and facility constraints.
Results can be visualized or presented in tabular format.
Additionally, the profile generator can use a multiwell field
profile as a baseline for an incremental study.
Bubble map in Topaze

When the well drainage or productivity changes...


The numerical module allows the simulation of multiple
well production, where individual wells can be pressure or
rate controlled. Topaze permits 2D and 3D visualization
of the well drainage areas and their evolution with time.
If the simulation deviates from the data and indicates a
change in the well productivity index the user may assign
individual Skin values to different production periods.
Nonlinear regression is then applied on all skins, resulting
in a relationship between mechanical Skin and time.

Production profile generator

Changing Skin

Production Forecast, full field or single well


Without data, or after history matching, a production
forecast for any model may be run based on the anticipated
producing pressure. Sensitivity to improvements or decay
of productivity index can be simulated. It is possible to
specify a wellhead pressure constraint for the reference
well if an intake has been defined.

Reporting and exporting


Topaze has an extensive range of comparison,
reporting, exporting and printing capabilities. The free
and unprotected Topaze Reader allows files to be read,
printed and exported without the requirement for an active
license. In v4.12 a new slide presentation format has
been added to use Topaze live on an LCD projector or to
copy/paste plots into PowerPoint™.

Flowing material balance plot

Production forecast


History Matching

Rubis main screen

Rubis is a game-changing, data-centric, bottom-up, history-matching (HM)
tool. Snappy title it is not. Snappy and important tool it is. The decision to build
Rubis came from looking closely at the way today's engineers were working,
the massive but often disjointed and under-used data available, the restrictions
on time and budget and the demand to still describe the reservoir, define the
reserves, optimize and forecast the production, and still seek opportunities
every working day.
Managing a reservoir is not an academic exercise, it is brutally commercial so, at
the core of the build philosophy is the premise that if a simple solution solves the
problem, as it very often does, then use it. If the problem has greater complexity
make the tools intuitive, get them to talk to one another to build the problem,
and hence approach the solution, incrementally and using all data evidence
available. And, as you are going to need this thing on a daily basis, have it
fast and flexible enough to adapt to changing evidence. A simple single well
field could, potentially, be managed with an old-fashioned single decline curve.
If it works, so be it! But as fields deplete, interfere, change PI and generally
misbehave, we are presented with disjointed data that form the pieces of
a jigsaw puzzle. The evidence of misbehavior is always real and is in the data
and this needs to be incorporated quickly into the model and acted upon. There
is no point building a huge, clever static model that is not influencing our daily
engineering decisions. The data is there, we just need a way of synchronizing
and correlating it, a kind of model by proxy.
In v4.12, one can directly import a reservoir geometry from GRDECL™ and
CMG™ files. Gas desorption for unconventional gas (coalbed methane and shale
gas) has been added. It is also possible to extract a sector of a Rubis model and
then send it to Saphir where it can be run in full 3D, 3-Phase with gravity.

The origin of Rubis


Rubis evolved from the numerical heart of Topaze, a
product where Production Analysis (PA) was greatly
enhanced in turn by using modern Pressure Transient
Analysis (PTA) tools. PA and PTA have their limits
particularly in the 3D, multiphase environment and Rubis
provides the next simple, logical step.
As a result KAPPA has chosen to be diametrically opposed
to the development of the next generation of simulators
which can handle billions of cells with massive parallel
processing, with results that are often generated too late
to be useful in any case. The objective is to match the
production data, as often, and as quickly and simply as
possible by modular integration, using the pieces of the
jigsaw puzzle from the different methodologies. It is a tool
that sits somewhere between single cell material balance
and massive simulation models, it replaces neither but
does much of the work of both.
It is now possible to build simple 3-phase, 3D numerical
models, intuitively and in minutes with no special training.
The focus is on interactively building the field as it looks
and then the grid builds automatically inside this. The
engineer can then concentrate on the problem, keep it
updated and not worry about how to construct the tool
itself.
Last but not least the final consideration was economic.
Rubis has been built as incremental and integral to the
existing Ecrin suite and that is good news for the managers
who hold the purse strings.
The use of unstructured grids
The Voronoi numerical model is at the heart of the Ecrin
suite and this is covered in some detail on p.7 and 8. As
this model is common to PTA, PA and History Matching
(HM), the process of building it is identical up to the point
of departure into 3D. The model is built interactively and
the grid forms, by default but with the possibility of user
control, automatically and with the minimum number of
cells to do the job. As the model in Rubis needs to be
more flexible to accurately account for elements such as
gravity, phase contacts and layering, the option taken was
the vertical Cartesian accumulation of 2D unstructured
layered grids with local 3D refinement when demanded
by the well geometry. In the simplest cases, the Rubis grid
could therefore be exactly the same as in Saphir, with the
exception of very near the wellbore where it could afford
to be less refined.

Typical Rubis grid

Defining the PVT
The fluid can be characterized with Black-Oil or modified
Black-Oil models. The Rs and rs relations are turned
internally into a composition ratio, providing the grounds
for a compositional formulation. Internal correlations
can be used and tuned to match measured values or,
alternatively, tables can be loaded.
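As an illustration of tuning a correlation to measured values, the sketch below uses one common form of the Standing correlation for solution GOR and applies a simple multiplier so that it reproduces a single lab point. The fluid properties and lab value are hypothetical, and the actual Rubis correlation set and tuning scheme may differ.

    def rs_standing(p, api, t_f, gamma_g):
        """Solution GOR (scf/stb), Standing correlation: p in psia, t_f in degF."""
        x = 0.0125 * api - 0.00091 * t_f
        return gamma_g * ((p / 18.2 + 1.4) * 10.0 ** x) ** 1.2048

    # Hypothetical fluid and a single lab measurement to honour.
    api, t_f, gamma_g = 32.0, 190.0, 0.75
    p_lab, rs_lab = 2500.0, 520.0
    shift = rs_lab / rs_standing(p_lab, api, t_f, gamma_g)   # tuning multiplier

    def rs_tuned(p):
        return shift * rs_standing(p, api, t_f, gamma_g)

    print(f"multiplier = {shift:.3f}, Rs(3000 psia) = {rs_tuned(3000.0):.0f} scf/stb")
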
Defining the reservoir geometry
The build is identical to that of Saphir and Topaze and
indeed may start from a drag-drop from another Ecrin
module. For Rubis the number of layers is defined and
the layer horizons and/or thicknesses assigned. Any one
of those properties can be entered as a constant or as a
data set, in which case Rubis will use spatial interpolation
(Kriging) to build the final geometry. Then wells are, in
turn, created, positioned and perforated in the reservoir.
Vertical cross-sections can be viewed and moved
interactively while building the wells.

Defining the geometry; 2D map and cross-section

A happy consequence of this compatibility between the
PTA and the simulation grids is that we can jump between
the build-up and the production data on a grid that differs
only at the wellbore. Another plus is that the transient
analysis refined grid, already calibrated by its coherence
with the analytical models, can be (and is) used in turn to
calibrate the well index of the coarse grid. This is exploited
in v4.12 with the new Rubis Sectors facility.
Rubis carries no legacy technology and as such does not
use keywords. All actions are interactive and it has no
compatibility whatsoever with any grid of any simulator in
the industry. However, Rubis will read geometry files such
as GRDECL, PVT, vertical lift from Amethyste (WPA) and
other ASCII files to define the physical problem, but the
compatibility stops when the grid starts.

Direct import from a GRDECL file

Defining the reservoir properties
The physical representation of the reservoir is split into
petrophysical properties such as the permeability and
porosity. Then the initial state is defined including contacts,
initial pressure, saturation pressure versus depth and
KrPc, these three groups forming a property subset. Any
such subset can be redefined for any zone, where a zone
is either a layer, a region drawn in the 2D map, or the
intersection of layers and regions. Inner faults can be
represented as leaky, i.e. partial barriers to flow, or as
conductive faults with infinite
or finite conductivity. Non-Darcy flow is available, as well
as double porosity and areal and/or vertical permeability
anisotropy. Each segment of the reservoir boundary
can be set individually to sealing, constant pressure, or
connected to an aquifer. Classical water influx models
and a numerical aquifer are available.
Defining the well geometry
A well in Rubis may be either vertical, vertical with a
hydraulic fracture, horizontal in a given layer, i.e. following
the horizon of that layer, or slanted. Any number of
perforations can be created and their opening/closing
time defined individually. Each perforation may have a
discrete Skin which may be constant, rate dependent
or time dependent. Because a wellbore model can
be coupled with options including classical empirical,
mechanistic and drift flux models, the well definition is not
limited to its actual path in the reservoir. It is therefore
possible, although not essential, to define the complete
well trajectory from surface.

Well data input


Real well pressure and rate data can be loaded and
edited. This, in turn, may be used in the well controls,
which are defined by entering a sequence of target and
constraint couples. An abandonment rate may be specified
for each well.
Executing the simulation
After the grid has been built, the user can override the
default time range, solver settings, list of output results
and frequency of the simulation restarts. Simulation
restarts are used to run animations of the simulation
and restart the process after an interruption or change of
the input problem. Initialize then creates the relevant
output plots, the pressure and saturation fields, and
calibrates the individual well cells. The simulate option
executes the simulation, which at any time can be paused.
Rubis either finishes the current time step or, if interrupted,
immediately stops and returns to the previous time
step. During the simulation, the lower message window
displays information on the process, while all relevant
plots are updated.

Rubis simulation window

Viewing and sharing


Individual well production and pressures, together with
reservoir statistical information, are displayed on a
dedicated vs. time plot and updated in real time during
the simulation. In playback mode, a vertical line highlights
the active replay time. Static fields such as permeability
or porosity and dynamic fields, such as pressures and
saturations, can be displayed in 3D or 2D with vertical,
horizontal or cross-section truncation.
Using an Amethyste wellbore model in Rubis

A simulated production log, per well, showing the
contribution by phase and zone is generated and time
stepped in playback mode. All data, input and stored, is
organized in a hierarchical data browser. Any number
of runs can be stored in a given session to enable
what-ifs to be run. The browser can be used to drag-drop
components between runs or between other Ecrin
modules. In particular, numerical analyses made in Saphir
or Topaze can be copied to Rubis by a single drag-and-drop.
There is extensive user control over the viewing and
output choice.

Example of coning in a limited entry well


Rubis Sector to Saphir PTA


New in v4.12; a sector of a Rubis full field model can
be exported and used in Saphir. This enables Saphir to
simulate complex three-phase flow processes with gravity.
The key element of this new integration step between the
Ecrin modules is that the model is not simplified upon its
transfer from the full-scale simulator model in Rubis to the
PTA module Saphir. The full-scale simulation model is
simply stored in Saphir and re-simulated from there. This
approach is possible because the full-scale Rubis model
contains, by design, the ability to simulate, accurately
and precisely, transient flow responses due to the well
upscaling feature. In short all Ecrin modules are calling
the same numerical kernel, just with different settings.

Log and transient view

Rubis Sectors

Unconventional gas
New in v4.12; as in Saphir NL and Topaze NL, a desorption
model is now available in Rubis to address issues related
to shale gas and coalbed methane.

3D, cross-section and 2D view

Single well gas production, with and without desorption

A quick word for the techies on the numerical kernel


We often get asked this so let us finish with a few words
on something with which most will probably never be
concerned. <deep breath> Rubis is finite difference (finite
volume to be rigorous). The interface of Rubis uses black
oil PVT however a compositional formulation is actually
implemented in the kernel. The solution method, and the
modeling of fluid flow along the wellbore is fully implicit.
The Rubis kernel is object-oriented and integrated in the
application architecture. No hidden batch file is launched
when the user clicks on simulate. <phew!>

Well Performance
Analysis

Amethyste main window

Amethyste is new in Ecrin v4.12. It is a Well Performance Analysis (WPA)
module fully integrated within the Ecrin suite. Amethyste spans the divide
between datum and the analysis packages Saphir (PTA), Topaze (PA) and Rubis
(HM), and provides a natural extension to results obtained from Emeraude (PL).
Amethyste brings all the good work we do in modeling the reservoir to where it
matters; modeling and predicting production performance at surface or pressure
at gauge depth.
So what?
The WPA concept is not new and there are fine vendors who have been developing
similar software for years. Indeed the results of those products can be imported
into Ecrin using industry standard file formats. But Amethyste is the first module
born of the workflow created by the interconnectivity of the Ecrin suite. It was a
technical necessity and herein lies the reason for its development. With complete
coherence with the rest of the suite, new workflows can be performed incredibly
quickly with consistency across the range for wellbore modeling as well as
IPR calculations. An IPR defined in Saphir can be drag-dropped in Amethyste.
More importantly, a wellbore model defined in Amethyste can be drag-dropped
to Saphir, Topaze and Rubis where this model is then run dynamically with
complete control. No blind table interpolation: Ecrin runs a full calculation for
each pressure, guaranteeing accurate and smooth corrections.
Amethyste has its technical roots in WAM, a tried and tested WPA program
developed and used by Marathon since 1986. The PVT and flow model libraries
come from the unification of the KAPPA Ecrin and WAM codes. This library
uniquely combines empirical, mechanistic and drift flux models.
As code is shared with other modules, the cost to add Amethyste to Ecrin is very
reasonable; unless, that is, you are using it before April 2010, until when it is free!

Familiar Workflow
The workflow in Amethyste is similar to other Ecrin modules,
with the same logic for loading and manipulating data,
and a control panel guiding the user through the default
path. The Amethyste browser allows technical objects to
be transferred to and from other Ecrin documents. The
browser can be used to drag-drop components between
analyses or between other Ecrin modules. In particular, a
whole VLP object can be dropped in Saphir, Topaze, or
Rubis and run from those modules with the possibility of
changing any model parameter post-transfer.
After selecting the problem type, the user proceeds with
the definition of the Ecrin PVT Black-Oil model, with
correlations or tables. The user will define the Well and
then the VLP, or will go straight to the IPR option. If
both IPR and VLP have been defined, Amethyste shows
an overlay on the WPA plot, which becomes the starting
point for subsequent sensitivity studies.

The Amethyste browser

Defining the Well


Amethyste distinguishes the wellbore and flowline and
they can be treated independently or together. The flow
can be defined as annular or tubing. An arbitrary geometry
can be defined from an input of step values, or by loading
point data such as a caliper survey. Any number of
restrictions can be included. A well sketch is available for
graphical display using a library of pre-defined completion
elements. After the well / flowline have been defined, a
well plot is automatically built.
Vertical Lift profile (VLP)
In the VLP option multiple pressure drop models can be
selected for comparison over the entire range. A reference
is defined that will be used for WPA. The application
settings allow the customization of the set used and the
default choice. For each model, multipliers can be applied
to the hydrostatic and friction pressure drops. These
values can be adjusted later to match user data. In the
VLP dialog, the user defines the temperature profile,
either by two points, a loaded profile or by calculation.
When calculated, the temperature uses a segmented
model incorporating convection, conduction, and thermal
compressibility effects.
The VLP is used to generate various scenarios of traverses
(p vs. depth) and VLP plots (p vs. q). For each scenario,
user data can be input, manually, from file, or picked
from production data. On validation, each scenario is run
and the corresponding plot built. The plot content can be
customized interactively.
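A heavily simplified single-phase traverse calculation illustrates the structure of a VLP computation: marching down the wellbore and summing hydrostatic and friction gradients, each with its own multiplier as described above. Real VLP models are multiphase and far more involved; every number below is hypothetical.

    import math

    def traverse(p_wellhead, depth_md, deviation_deg, rate_m3s, rho=800.0, mu=2e-3,
                 d=0.1, k_hyd=1.0, k_fric=1.0, nseg=50):
        """March a single-phase pressure traverse from wellhead to depth (SI units).
        k_hyd and k_fric are the hydrostatic and friction multipliers."""
        g, p = 9.81, p_wellhead
        area = math.pi * d**2 / 4.0
        v = rate_m3s / area
        re = rho * v * d / mu
        f = 64.0 / re if re < 2300.0 else 0.316 / re**0.25   # laminar or Blasius
        dz = depth_md / nseg
        cos_theta = math.cos(math.radians(deviation_deg))    # deviation from vertical
        for _ in range(nseg):
            dp_hyd = rho * g * cos_theta * dz                 # hydrostatic term, Pa
            dp_fric = f * rho * v**2 / (2.0 * d) * dz         # Darcy-Weisbach friction, Pa
            p += k_hyd * dp_hyd + k_fric * dp_fric
        return p

    # Hypothetical case: 20 bar wellhead pressure, 2500 m MD, 15 deg deviation.
    p_bh = traverse(20e5, 2500.0, 15.0, rate_m3s=0.02)
    print(f"flowing bottomhole pressure ~ {p_bh / 1e5:.0f} bar")
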

VLP traverse

Inflow Performance Relationship (IPR)


If an IPR study was performed with Saphir (PTA), this can
be drag-dropped into Amethyste and the IPR part of the
analysis is over. Alternatively, test data can be loaded, or
drag-dropped, into Amethyste. The IPR option, which is
strictly the same as in Saphir, is used to define a single,
or composite layer IPR. Multiple models can be selected
and compared. A reference is defined that will be used
for WPA. For those models using a Skin, the user can
input a total Skin or calculate the Skin components
using a completion model. User data can be added for
comparison with the model(s). An IPR plot is created
that can be customized interactively. From this plot, the
user may adjust the reference IPR parameters in order to
match the data provided.
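Two of the simplest IPR forms, a straight-line productivity index and the Vogel relation for a solution-gas drive well below the bubble point, can be sketched as follows. The reservoir pressure, PI and maximum rate are hypothetical; the Saphir/Amethyste IPR options cover many more models.

    def ipr_straight_line(pwf, p_res=3000.0, j=1.5):
        """Rate (stb/d) for a constant productivity index j (stb/d/psi)."""
        return j * (p_res - pwf)

    def ipr_vogel(pwf, p_res=3000.0, q_max=2500.0):
        """Vogel inflow performance relationship for a solution-gas drive well."""
        x = pwf / p_res
        return q_max * (1.0 - 0.2 * x - 0.8 * x * x)

    for pwf in (2500.0, 2000.0, 1500.0):
        print(f"pwf={pwf:6.0f} psia  PI line: {ipr_straight_line(pwf):6.0f} stb/d"
              f"   Vogel: {ipr_vogel(pwf):6.0f} stb/d")
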
Well Performance Analysis (WPA)
As soon as a VLP and an IPR have been defined,
Amethyste overlays them, and the intersection point gives
the solution for a given set of surface conditions.
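The operating point is simply the rate at which the inflow (IPR) and outflow (VLP) pressures agree, which can be found by root finding. The two curves below are toy stand-ins, not Amethyste models.

    from scipy.optimize import brentq

    # Hypothetical curves: inflow pressure falls with rate, outflow pressure rises.
    p_res, j = 3000.0, 1.5                                 # psia, stb/d/psi
    ipr_pwf = lambda q: p_res - q / j                      # straight-line IPR
    vlp_pwf = lambda q: 1200.0 + 0.2 * q + 5e-5 * q * q    # schematic outflow curve

    q_op = brentq(lambda q: ipr_pwf(q) - vlp_pwf(q), 1.0, 5000.0)
    print(f"operating point: {q_op:.0f} stb/d at {ipr_pwf(q_op):.0f} psia")
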
Running Sensitivity
Amethyste can perform as many sensitivity studies as
needed. A given study will correspond to a set of user
controlled values for an IPR parameter, a VLP parameter
or both. A cross correlation between the resulting rates
and the sensitivity parameters can then be displayed.

Sensitivity Analysis

Report, export and plot


Amethyste has an extensive range of reporting, exporting
and printing capabilities. Lift curves can be generated in
Eclipse format for use with third party software. The free
and unprotected Amethyste Reader allows files to be
read, printed and exported without an active license.

Production
Logging

Emeraude main window - Deviated well - 3 phases

Production logging is now seen as a powerful quantitative method that takes its
own place in the set of data acquisition tools for the reservoir engineer, alongside
transient and production analysis. No longer just the tool of last resort, PL is
now used as a calibration point for the reservoir model and as an important tool
in the development over time of the producing intervals in the wellbore.
The interpretation process has shifted into the hands of the end-user engineer
due, to a great extent, to the development of client-focused, as opposed to
tool-focused, software: Emeraude.
Production logging surveillance has given the reservoir engineer a powerful tool
in the drive for more accurate and refined reservoir characterization. Emeraude
is now used by all the major service companies and all the major producers
and many independent operators and service providers. Emeraude is seen
as the industry standard allowing a common platform for communication and
interpretation between service companies and operators.
From vertical injectors to horizontal or highly deviated multiphase producers,
Emeraude provides a comprehensive and intuitive set of tools to produce
results from the log data of simple through to the most sophisticated tool
strings. KAPPA remains committed to the ongoing development of the industry
standard PL interpretation package by remaining in close contact with tool
manufacturers.
Emeraude is currently a stand-alone application that will migrate to the Ecrin
environment under Generation 5.


Data load and display


Emeraude can load data from LIS, LAS, and ASCII files
or from the clipboard or keyboard. Versus depth logs or
stationary data can be input. Automatic tracks give an
instant view of the log data while manual options allow
users to customize their own views. A well sketch can
be built, by drag-and-drop, with predefined elements and
displayed alongside the data. Well views can be created
from the browser to display the well geometry. Holdup
channels can be added to the well view and plotted within
the wellbore. All display settings can be customized and
templates can be created.

Spinner calibration and apparent velocity


The user interactively defines the spinner calibration zones
and the positive and negative lines on each zone are
automatically calculated. Using a comprehensive tool kit
for editing, and once the spinner calibration is satisfactory,
the user generates an apparent velocity channel.
Several spinners can be handled simultaneously in the
same interpretation, providing a simple facility to combine,
for example, an in-line and fullbore spinner.
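In essence, spinner calibration is a straight-line fit of spinner frequency against line speed in a calibration zone; the crossing of the fitted line with the velocity axis then gives an apparent fluid velocity. The rough sketch below fits a single line and neglects the spinner threshold, whereas Emeraude fits separate positive and negative lines per zone; the pass data and sign convention are invented for illustration.

    import numpy as np

    # Hypothetical calibration-zone data: line speed (ft/min) vs spinner response (rps).
    line_speed = np.array([-120., -80., -40., 40., 80., 120.])
    rps        = np.array([-10.2, -7.4, -4.6,  1.1,  3.9,  6.7])

    slope, intercept = np.polyfit(line_speed, rps, 1)
    v_apparent = -intercept / slope     # speed at which the spinner would read zero
    print(f"slope = {slope:.4f} rps per ft/min, apparent velocity ~ {v_apparent:.0f} ft/min")
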

Spinner calibration

Main window

Data structure and browser editing


An Emeraude document can contain an unlimited number
of surveys and interpretations within these surveys.
Emeraude offers a hierarchical representation in a data
browser simultaneously showing all opened documents
in a drag-and-drop environment. The browser contains a
wide range of editing options, including lateral average,
depth stretch, shift, data cut and fill, merging, splicing,
derivative and sampling. The editing options can be
enriched by connecting user functions through an external
DLL API or by the recent addition of a built-in user
formula module.

Single and zoned PVT


The PVT model defined by correlations provides the
properties of any phase at any temperature and pressure.
It is also possible to redefine the properties for each inflow
zone. Correlations can be viewed both graphically and
inside a table, and matched to user-entered data.
Methodology
Rate calculation is treated as a minimization problem and
solved using nonlinear regression, offering full flexibility in
the type and number of input measurements. Interpretations
can be run from any number of sufficient inputs including:
spinner apparent velocity, density, pressure gradient,
capacitance, holdup of any phase, velocity of any phase,
rate of any phase, and temperature. Recent additions
permit multiple measurements of the same type.
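A deliberately stripped-down version of that idea: treat the phase rates as the unknowns, predict a couple of tool responses from them, and let a least-squares solver reconcile prediction and measurement. The two-phase, no-slip response model below is hypothetical and far simpler than the Emeraude flow models.

    import numpy as np
    from scipy.optimize import least_squares

    RHO = {"oil": 700.0, "water": 1050.0}        # kg/m3 (hypothetical)
    AREA = 0.012                                 # pipe flow area, m2

    def predicted_responses(q):                  # q = [q_oil, q_water] in m3/d
        q_o, q_w = np.maximum(q, 1e-9)
        q_t = q_o + q_w
        y_w = q_w / q_t                          # no-slip water holdup
        density = y_w * RHO["water"] + (1.0 - y_w) * RHO["oil"]
        velocity = (q_t / 86400.0) / AREA        # mixture velocity, m/s
        return np.array([density, velocity])

    measured = np.array([920.0, 0.55])           # density (kg/m3), velocity (m/s)
    scale = np.array([1000.0, 1.0])              # bring both residuals to the same order

    def residuals(q):
        return (predicted_responses(q) - measured) / scale

    fit = least_squares(residuals, x0=[200.0, 200.0], bounds=(0.0, np.inf))
    print("estimated rates (m3/d): oil = %.0f, water = %.0f" % tuple(fit.x))
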

Zone rates option

Data browser


Local v. global regression


The default calculation scheme in Emeraude involves
successively solving the cumulative rates at selected
depths inside the wellbore. The contributions of the inflow
zones, located in between the calculation zones, are then
obtained from successive differences. Because each
local regression is done regardless of the solution above
or below, the overall solution may result in contributions
from the same interval showing different signs which,
physically, is not possible.
Global regression provides a method for solving these
cases. For every producing interval, the sign of the
contribution can be imposed. It is also possible to fix
any particular contribution to a user-entered value, in
particular a null value. Global regression can be coupled
with a genetic algorithm to avoid local traps.
Interpretation models
Emeraude offers a full range of flow models from single
to 3-phase flow. Specific models are provided to handle
flow re-circulation as well as flow through standing water
columns.
3 and 2-phase liquid-liquid stratified models for horizontal
and highly deviated wells are available. It is also possible to
connect user models through an external DLL interface.

FSI

Pulsed Neutron Log (PNL) interpretation


Clean formation, shaly single-water and shaly dual-water
models are available. Channels required for the
calculation can be loaded from an openhole interpretation,
or estimated from correlations. Classical crossplots can
be created to obtain or correct estimates. Time-lapse
presentation can be obtained automatically by supplying
the interpretation with additional water saturations, and
defining the relevant chronology.

Flow map

Multiple Probe Tool (MPT) support


Emeraude includes specific treatment for the Schlumberger
DEFT, GHOST, and FSI and the Sondex MAPS suite; CAT,
SAT and RAT tools. Image tracks can be created, and
cross sections displayed at any depth. Average values of
velocity, holdup, and, where relevant, phase velocity can
be calculated using the arithmetic or stratified average.

PNL Interpretation

Selective Inflow Performance (SIP)


A Selective Inflow Performance (SIP) analysis can be made very
quickly from the existing interpretations once the reservoir
zones are defined. For gas rates, the SIP can be based
on pseudopressure instead of pressure. An unlimited
number of SIPs can be created and compared. Each
zone can be assigned a different model: straight line,
c&n, or Jones IPR. The SIP can use the total rate,
the rate of a given phase, or the total liquid rate. The
analysis can be done downhole or with surface values.
Pressure datum correction can be applied and a composite
IPR displayed.
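The core of an SIP analysis is a straight-line fit per zone: stabilized flowing pressure against the zone's contribution across several surveys, giving a zonal productivity index and an extrapolated zone pressure at zero rate. A bare-bones sketch with invented survey data:

    import numpy as np

    # Hypothetical: three stabilized surveys, flowing pressure (psia) and the
    # contribution of one zone (stb/d) from the PL interpretations.
    pwf    = np.array([2950., 2780., 2550.])
    q_zone = np.array([120., 410., 760.])

    slope, p_zone = np.polyfit(q_zone, pwf, 1)   # pwf = p_zone + slope * q
    print(f"zone pressure ~ {p_zone:.0f} psia, PI ~ {-1.0 / slope:.2f} stb/d/psi")
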

Multiple probe tools - Image view - Cross-section


Well sketch / well view


A well sketch can be built interactively by drag-and-drop
selecting from a library of completion components. Well
views can be created from the deviation and internal
diameter data and holdups displayed inside the view. In
complex geometries the simultaneous display of the well
view with the logs helps in understanding the nature of
the flow.
Test design
All tool responses can be forward modeled given any
particular flow scenario. The simulated channels take
into account the tool specific response as well as their
accuracy and noise.
SIP analysis

Temperature and DTS


The temperature model is analytical; it comprises an
enthalpy balance opposite inflow zones, and a conduction/
convection model between. Joule-Thomson effects in the
reservoir are accounted for given a user estimate of the
relevant pressure drop. This process permits the analysis
of Distributed Temperature Sensing (DTS) data and standalone
temperature logs, or can be used to replace a faulty spinner
or a spinner blinded by such effects as flow recirculation.

Output / Export
Channels present in an Emeraude document can be
exported in LIS, LAS, or ASCII format. The log output in
Emeraude is WYSIWYG. Single or multiple log tracks,
as well as any X-Y plot can be copied to the clipboard in
Bitmap or WMF format. The log printout includes a preview
option where fonts, scales and grid lines can be modified.
Screen captures can be made at any point and returned
to with a single click. API logs can be produced from the
print preview option and stored within the document. A
built-in report can be printed and previewed that includes
predefined sections. It is possible to produce a report
in MS-Word using the OLE interface of Emeraude.
A template MS-Word report is installed and can be
customized as required. All reporting and exporting
features are accessible with the free Emeraude reader.

Temperature Interpretation

Formation test data QAQC


This allows the loading of reservoir pressure and reservoir
permeability/mobility. Each depth point can be assigned
a legend, and a quality indicator. All information appears
on the tracks built automatically to display the pressure
and permeability. Lines can be calculated or drawn and
gradients/contacts are deduced.

Log preview

QAQC of Formation Test results


Technical references

PVT correlations

Gas
  Z: Dranchuk, Standing, Beggs & Brill, Hall-Yarborough, Dranchuk & Abou-Kassem
  Viscosity: Lee et al., Carr et al., Lee compositional

Oil
  Pb & Rs: Lasater, Vasquez & Beggs, Standing, Glaso non volatile, Glaso volatile, Lasater-Standing, Petrosky & Farshad, Kuparuk, South Brae
  Bo: Standing, Vasquez & Beggs, Glaso, Petrosky & Farshad, Kuparuk, South Brae
  Co: Petrosky & Farshad, Vasquez & Beggs
  Viscosity: Beggs & Robinson, Beal

Water
  Rsw: Katz, Meehan & Ramey, Spivey, Haas, Culberson & McKetta
  Bw: Gould, McCain, Meehan & Ramey, Spivey
  Cw: Dodson & Standing, Osif
  Viscosity: Van-Wingen & Frick, Meehan & Ramey, Helmholtz, Matthews & Russel

Pressure drop correlations

Artep: Liquid-Gas ; Mechanistic ; Any angle
Aziz & Govier: Liquid-Gas ; Mechanistic ; Vertical
Beggs & Brill: Liquid-Gas ; Empirical ; Any angle
Brauner: Liquid-Liquid stratified ; Mechanistic ; Deviated
Choquette: Liquid-Liquid ; Empirical ; Vertical
Dukler-Eaton: Liquid-Gas ; Empirical ; Horizontal
Duns & Ros: Liquid-Gas ; Empirical ; Vertical
Hagedorn & Brown: Liquid-Gas ; Empirical ; Vertical
Kaya et al: Liquid-Gas ; Mechanistic ; Any angle
Orkiszewski: Liquid-Gas ; Empirical ; Vertical
Petalas & Aziz: Liquid-Gas ; Mechanistic ; Any angle
Stanford: Liquid-Gas & 3-Phase ; Drift flux ; Any angle
Taitel & Dukler: Liquid-Gas ; Mechanistic ; Vertical
Zhang: 3-Phase stratified ; Mechanistic ; Deviated
Baxendell & Thomas: Liquid-Gas ; Empirical ; Vertical
Chierici et al: Liquid-Gas ; Empirical ; Vertical
Cornish: Liquid-Gas ; Empirical ; Vertical
Fancher & Brown: Liquid-Gas ; Empirical ; Vertical
Gray: Liquid-Gas ; Empirical ; Vertical
Griffith et al: Liquid-Gas ; Empirical ; Vertical
Poetmann & Carpenter: Liquid-Gas ; Empirical ; Vertical
Reinicke et al: Liquid-Gas ; Empirical ; Vertical
Hasan & Kabir: Liquid-Liquid ; Mechanistic ; Deviated



Ecrin contains an extensive range of well, reservoir and
boundary models. In addition, Ecrin links dynamically
to user-built models or to an additional and extensive
external model library that is available for download from
www.kappaeng.com.
Built-in analytical models

Wellbore models: No storage; Constant storage; Changing storage (Fair, Hegeman); Time dependent
Well models: Finite radius; Fracture - uniform flux; Fracture - infinite conductivity; Fracture - finite conductivity; Horizontal; Limited entry; Slanted fully penetrating; Slanted partially penetrating
Skin models: Constant; Rate dependent; Time dependent
Reservoir models: Homogeneous; 2-porosity P.S.S.; 2-porosity transient sphere; 2-porosity transient slab; 2-layer with X-flow; Radial composite; Linear composite; Areal anisotropy
Boundary models: Infinite; Single sealing fault; Single constant pressure fault; Closed circle; Constant pressure circle; 2 parallel faults; 2 intersecting faults with any angle; Rectangle
Formation Testers: Packer-probe interference with storage and Skin; Probe-probe interference (0, 90, 180°) with storage and Skin

External analytical models

2-layers with X-flow & radial composite; 2-layers with X-flow & 2-porosity; 2-porosity & radial composite; 2-porosity with Skin at matrix blocks; 3-porosity (1 fissure and 2 matrices); 3-layers with X-flow; 4-layers with X-flow; 4-layers with X-flow in closed system; Conductive fault; Horizontal well with horizontal anisotropy; Horizontal well with identical fractures; Horizontal well with non identical fractures; Multi-lateral well; Well in a reservoir pinchout; Radial composite 3 zones in an infinite or closed reservoir; Radial composite 4 zones in an infinite or closed reservoir; Limited height fracture; Composite rectangle; Leaky fault; Commingled production; Layered pi; Multilayer

Built-in numerical models

User defined reservoir contour in the X-Y plane, unlimited number of segments
Any contour segment sealed or at constant pressure
User defined faults inside the contour with individual leakage factor
True double-porosity model (duplication of grids)
Composite regions with associated diffusivity, storativity and double-porosity model
Horizontal anisotropy
Varying thickness, porosity and permeability fields
Conductive faults
Multiple wells
Numerical well upscaling
Fractured well with finite / infinite conductivity
Limited entry vertical well with vertical anisotropy
Fractured well with limited entry and vertical anisotropy
Horizontal well with vertical anisotropy
Horizontal well intersecting multiple fractures
Changing storage (Saphir only)
Time-dependent and rate-dependent Skin
Saphir and Topaze: slightly compressible liquid
Non-Darcy flow for gas (NL)
2-phase W-O and W-G (NL)
3-phase W-O-G (NL Topaze and Rubis)
Real gas diffusion (NL)
Water influx (NL): Carter-Tracy, Fetkovich, Pot, Schilthuis, Numerical
Multilayer: crossflow, multiwell, partial completion of the wells
Horizontal well in multilayer
Limited entry well in multilayer
Fractured well in multilayer
Composite multilayer
Gas desorption (with or without diffusion)


Petroleum Engineering
Training and Consulting Services
for the real world

Whenever fluid is moved through the reservoir, with pressure and/or rates
recorded and a study being made, we are performing Dynamic Data Analysis.
The process involves data handling, analytical and numerical methods and there
is frequently an overlap and complement between the data and methodologies.
The industry standard KAPPA Ecrin Dynamic Data Analysis suite of reservoir
engineering software is fully integrated so that the full potential of using the same
data and objects may be realized during complementary analyses.
KAPPA Training and Consulting Services (TCS) is the service arm of the
KAPPA Group that provides training and consulting services specializing in and
complementing the methodologies applied in our software, namely transient and
production analysis, production log analysis, data management, modeling and
history matching.
Two additional courses have been added to the extensive training portfolio to
include Rubis modeling and problem solving and a software only course that
covers the functionality and interconnectivity of the complete Ecrin suite.
The KAPPA aim is to provide training that offers sufficient theory to understand
the subject and the tools to perform useful work immediately after the course.
Our trainers are all experienced practitioners in their fields and selected to get
the message across with clarity and bearing in mind we are in the commercial
world with a need to produce a real return on the vital training dollar.
KAPPA TCS provides training through its worldwide public course program and
also trains hundreds of engineers every year in client specific in-house courses
and workshops.
Clients have access to software support through extensive contextual help in
the application, our regional offices, web collaboration tools and, recently added,
interactive videos on the web.
Our consultants are some of the most experienced in their field in the industry
and are available for short or long term interventions anywhere in the world.


Dynamic Data Analysis (DDA part I and II)


• A two week course covering all aspects of Dynamic Data Analysis
• Includes theory and practice of pressure transient and production decline analysis
• Recent developments in the handling of data from permanent downhole or surface gauges
This is the flagship KAPPA course covering the core
subjects and methodology used in KAPPA software. At the
end of the course attendees will have a strong theoretical
and practical grounding in handling pressure and rate
data from the management of such data through to the
analysis and practical operational use of the results.
The emphasis is on production data management
and processing as well as the visual and conceptual
approach to transient and production analysis. Essential
mathematics is included in the detailed appendix provided
in the KAPPA Dynamic Data Analysis Handbook.
There is a strong emphasis on handling real data
throughout. Participants are strongly advised to bring, or
to send to KAPPA in advance of the course, permanent
gauge, transient and decline data for processing and
analyzing during this course.

Course programme
The DDA course can be taken in one two-week session or in two,
one-week sessions. To attend the second week PA/PDG (DDA Part
II) course engineers must have already attended the Foundation
PTA course (DDA Part I) or equivalent.
The first week of the Dynamic Data Analysis course (DDA part I) is
the Foundation Pressure Transient Analysis (FPTA) course.
The second week (DDA part II) is the Production Analysis and
Permanent Downhole Gauge (PA/PDG) course.
If you plan to attend only the PA/PDG (DDA Part II) we strongly
recommend that you check your current level of knowledge by
taking the self assessment test on our website.
The use of the Ecrin DDA software suite including the PTA module
(Saphir), the PA module (Topaze) and the PDG Data Management
Module (Diamant) will be taught as part of this course.
For a comprehensive course content please see the PTA (DDA
Part I) and the PA/PDG (DDA Part II) pages in this booklet


Foundation Pressure Transient Analysis (DDA part I)


• Modern Pressure Transient Analysis (PTA) from theory to practice
• Strong practical emphasis on real data with many real life examples
• Immediate return on investment with attendees able to perform commercial analysis upon completion of the course
The KAPPA Foundation Pressure Transient Analysis
(FPTA) course has been designed to teach the generic
methodology, and the practice of Pressure Transient
Analysis, (PTA) in addition to the mechanics of Saphir
software which is learnt almost as a by-product. The
emphasis is therefore on a visual and conceptual approach
to interpretation including only essential mathematics. Full
theory, including formulae and derivations are provided,
as well as the conceptual explanation of PTA in the
accompanying KAPPA Dynamic Data Analysis handbook
provided to each attendee.
Field examples are used to illustrate each concept. The
final afternoon is a workshop, to which participants are
encouraged to bring their own data.
By the end of the course the attendee should be capable
of performing analyses and developing interpretations
typically required of a transient test analyst. In addition,
the attendee should have the foundations sufficient for
developing further experience in transient analysis.
Dynamic Data Analysis (DDA)
The Foundation Pressure Transient Analysis (FPTA)
course is the first week of the two-week Dynamic Data
Analysis (DDA) course and can be taken as part of a
continuous two-week session or as a single standalone
one week course.
Pre-requisites to attend the course
None

Course programme
Basic theory

Introduction to Darcy's law and the equation of state leading to the
diffusivity equation, the principle of superposition, infinite-acting
radial flow, wellbore storage and Skin and pseudo-steady state.
A short history of transient analysis; semilog and Horner plots,
loglog analysis and the Bourdet pressure derivative, build-up and
multi-rate analyses leading to the essential PTA workflow.

QA/QC, Test Design and Gas Tests

Software usage
The use of Saphir, the PTA module of the Ecrin DDA
software suite, will be taught as part of this course.

Comparison of gauge pressure and temperature channels.


Differential pressure analysis to determine gauge offsets (gauge
quality) and fluid phases. Test design taking into account test
objectives and constraints and then integrating gauge limitations
and running what ifs?
Gas Tests: Real gas law and gas diffusion, the use of pseudopressures
and pseudotime and a consideration of the limitations of classical
PTA tools for gas. AOF and IPR

Basic reservoir and well models

Pattern recognition and matching for basic well and reservoir
models; wellbore storage, Skin, homogeneous and double-porosity
reservoirs, vertical, fractured and limited entry wells.

Boundary models

The infinite acting reservoir, identifying faulted, channel and closed
systems. Pressure support and radius of investigation leading to
reserve estimation and precautions.

Basic numerical modeling

Introduction to the use of numerical modeling when applied to
PTA that will include building a model of irregular shape with
multiple wells.

Workshop session

The final afternoon is an opportunity to work with real data provided
by attendees or specific cases from the extensive KAPPA example
catalogue.


Production Analysis/PDG (DDA part II)


• Modern Production Analysis (PA)
• Managing and using Permanent Downhole Gauges (PDG) data
• Immediate return on investment with attendees able to perform commercial analysis upon completion of the course
The KAPPA Production Analysis / Permanent Downhole
Gauge (PA/PDG) course is designed to teach the generic
methodology, and the practice of Production Analysis (PA),
in addition to the mechanics of Topaze software which
is learned almost as a by-product. The emphasis is on a
visual and conceptual approach to interpretation including
only essential mathematics. Full theory, including formulae
and derivations are provided, as well as the conceptual
explanation of PA in the accompanying KAPPA Dynamic
Data Analysis handbook provided to each attendee.
Field examples are used to illustrate each concept. The
final afternoon is a workshop, to which participants are
encouraged to bring their own data.
At the end of the course the user should be capable of
performing analyses and developing interpretations
typically required of a production data analyst. They
should also be able to specify requirements for PDG data
management and understand how to use PDG data.
Dynamic Data Analysis (DDA)
The Production Analysis/Permanent Downhole Gauge
(PA/PDG) course is the second week of the two-week
Dynamic Data Analysis (DDA) course and can be taken
as part of a continuous two-week session, or as a single
standalone one week course.

Course programme
Quick revision of PTA concepts

As learnt in the FPTA (DDA Part I) course with particular reference
to long term data and pseudo steady state conditions typically found
in PA cases.

Pre-requisites to attend the course


To attend the PA/PDG (DDA part II) course it is essential
you have attended the Foundation Pressure Transient
Analysis course (DDA part I) or its equivalent. To check that
you are ready to attend the PA/PDG (DDA part II) course
please try the self assessment test on our website.

Using numerical models

Building a complex numerical model for use in PTA and PA including
irregular shapes, with multiple wells, compartments and geological
discontinuities and heterogeneity.

Production Analysis

A brief historical overview from empirical methods to modern
plots including Blasingame, loglog, material balance and history
matching/forecasting. Modern practical analysis from loading and
editing data to model generation, sensitivity and reporting. Use of
time dependent well parameters.

Software usage
The use of the Ecrin DDA software suite including the
PTA module (Saphir), the PA module (Topaze) and the
PDG Data Management Module (Diamant) will be taught
as part of this course.

Permanent Downhole Gauges

Handling very large datasets, wavelet filtering, working with
incomplete and unplanned shut-ins. Processing multiple build-ups
and monitoring changing parameters, such as Skin, with time.
Setting up persistent links to data sources, issues and pitfalls,
and specifying data requirements.

Field example

Working on a library of data from permanent gauges facilitating
the analysis of multiple build-ups, single build-ups and production
profiles. Other single profiles using surface recording or intermittent
data will be analysed. PA and PTA examples will be handled
analytically and numerically as applicable.

Workshop session

The final afternoon is an opportunity to work with real data
provided by attendees or specific cases from the extensive KAPPA
example catalogue.


Advanced Pressure Transient Analysis


• For practitioners with at least six months transient analysis experience
• Self assessment pre-course test on the KAPPA website
• Workshop environment looking at own data and the most complex real data

The KAPPA Advanced Pressure Transient Analysis
(APTA) course has been designed to assist experienced
interpretation engineers in dealing with the advanced
functionality of the Ecrin PTA module (Saphir), and the
more complex aspects of pressure transient analysis.
Many examples are worked hands on to illustrate the
practical aspects of complex cases using the analytical
and numerical modeling in Saphir.
Participants are encouraged to bring their own data
examples to be worked in the class.
Pre-requisites to attend the course
To attend the APTA course it is essential you have
attended the Foundation Pressure Transient Analysis
course (DDA part I) or its equivalent, and have at least
six months of real world transient analysis experience.
Without this experience it is unlikely you will keep pace
with the course.
To check that you are ready to attend the APTA course
please try the self assessment test on our website.
If you are an experienced PTA engineer but are
not familiar with Saphir we can arrange a free
demonstration copy of the software to assist you in your
preparation prior to the course. Please contact tcs@kappaeng.com for assistance.

Software usage
The use of Saphir, the PTA module of the Ecrin DDA
software suite, will be taught at an advanced level as part
of this course. It is essential that attendees have attained
a good working knowledge of Saphir prior to registering
for this course.

Course programme
A theoretical and practical revision and deeper look at
Dynamic flow concepts

A refresher to cover the basic theory, to correct any misconceptions
and to look in depth at transient analysis tools. The session includes
the theory of diffusion, IARF and pseudo-steady state. The concept
of the Bourdet derivative including derivation, properties and
limitations. Test design and objectives, superposition in time and
in space, sensitivity to input parameters, radius of investigation.
Transient analysis and where it sits in relation to other reservoir
engineering methods. Constant wellbore storage and why it never
is, Skin components, standard interpretation models including finite
radius and fractured wells, double-porosity reservoirs and boundary
effects. This will also include a revision of the use, where needed, of
the Ecrin PTA module (Saphir) with help on shortcuts and advanced
level functionality.

Advanced well models

A detailed look and worked examples of horizontal, slanted and
limited entry wells. This will include looking at the theoretical
derivation and response and comparing this to what happens in
the real world. A detailed look at the parameters affecting pressure
behavior in horizontal wells including low vertical permeability and
partial horizontal drainage. The session will include a number of real
examples to illustrate the various issues.

Advanced reservoir and boundary models

Heterogeneous, composite reservoirs; their bad reputation and
their real world use illustrated with examples. Complex boundary
conditions and unconventional limits including constant pressure
boundaries, leaking, conductive and non-continuous faults handled
with a common sense approach. Finite reservoirs and material
balance and the effect of compressibility on reserve estimations.
Layered reservoirs from the simple two layer analytical case to the
complex numerical multilayer.

Deconvolution

The principle and the use of the complete production and pressure
history, the use of the method as a time saver and for seeing deeper
into the reservoir coupled with the limitations and caveats of the
method.

Advanced numerical analysis

Complex, composite and heterogeneous reservoirs including multiwell analysis, PVT considerations, variable thicknesses, porosities
and permeabilities, diaphasic flow and gas cap/aquifer drive.

Workshop session

The final afternoon is an opportunity to work with real data provided
by attendees or specific cases from the extensive KAPPA example
catalogue. As this is an advanced course, we would encourage
delegates to raise any additional specific queries they may have
either prior to, or during the course.


Foundation Production Log Interpretation


• Emphasis on real data with real life examples
• Production logging tools limitations, application and data interpretation
• Immediate return on investment with attendees able to perform commercial analysis upon completion of the course
The KAPPA Foundation PL course has been designed
to teach the generic methodology and the practice of
Production Log Analysis, in addition to the mechanics
of Emeraude software which is learned almost as a
by-product. The emphasis is therefore on a visual and
conceptual approach to interpretation including only
essential mathematics. Non-essential formulae and
derivations are provided in the accompanying literature.
The course notes include all the presentation material.
Participants are encouraged to bring their own production
log datasets to be worked in class.
At the end of the course the user should be capable of
making a reasonable basic interpretation, and have
the foundations for developing further experience in
Production Log interpretation.

Course programme
Basic theory

Sensors and measurements, their accuracy, application and
resolution. Spinner calibration, essential PL concepts including
holdup and slippage. Basic PVT and flow correlations. Two phase and
three phase models.

QA/QC

Judging data quality, selecting passes, editing and comparing with
simulated data. Tips for QA/QC on all sensors. Programming a PL
job for best results.

Practice

Demonstration of the interpretation method and hands on exercises
using Emeraude mean that the participant gains a sound working
knowledge in the use of the application and the basics of the
interpretation process.

Emeraude features

Advanced features of Emeraude not used during the basic
course are demonstrated to make the participants aware of
options available in the application. All these features are covered
in greater detail during the advanced production logging
interpretation course.


Advanced Production Log Interpretation


• For practitioners with at least six months production log interpretation experience
• Self assessment pre-course test on the KAPPA website
• Workshop environment looking at own data and the most complex real data cases

The KAPPA Advanced PLI course has been designed to
assist experienced interpretation engineers in dealing with
the advanced functionality of Emeraude, and the more
complex aspects of production log analysis.
It is strongly recommended that the attendee has previously
attended the Foundation Production Logging course or
its equivalent, and has at least six months of real world
interpretation experience of production log data.
To check that you are ready to attend the Advanced
PLI course please try the self assessment test on our
website.
Numerous examples are worked hands on to illustrate the
practical and critical path of any production log analysis
using the basic and advanced functionality. Participants
are encouraged to bring their own data examples to be
worked in class.

Course programme
Advanced modules

Use of selective inflow performance with worked examples. Review
of operating tips, workarounds, troubleshooting approaches and
typical reporting layouts. Temperature log interpretation example.

Multi-Probe Tools

Working with images and data from tools such as the Schlumberger
GHOST, DEFT, FSI and the Sondex MAPS suite. An in-depth review
of the Emeraude regression and calculation schemes, which will
assist the user in fine-tuning interpretations and troubleshooting.

Practice

There is a strong emphasis on real examples with three days
taken up with practical work on the following: Multiphase, vertical
and deviated producers, with conventional and multi-probe
measurements. Complex completion configurations and multi-survey
PLTs integrating interpretation with logging objectives and
secondary information. Horizontal PLTs with conventional spinners
and multi-probe measurements and integration of stations. Use of
the Pulsed Neutron Log and temperature analysis modules with a
presentation of the theory and worked example.


Rubis Modeling and Problem Solving


• Highly practical with hands-on model building from day 1
• No previous or specialist simulation experience required
• Immediate return on investment with attendees able to build history matched models, run what-if scenarios, and forecast upon completion of the course
Rubis was born as a natural progression from the Voronoi
numerical model in the Pressure Transient Analysis
(Saphir) and Production Analysis (Topaze) modules of
the Ecrin suite. Rubis sits between a single tank material
balance and a multi-million cell simulator, not replacing
them but doing much of the work they can do, easier, faster
and in concert with the Ecrin suite. The basic premise of
this module is that all extraneous data is stripped away
and that pressure and rate data that is real, measured and
correlates with other dynamic data methods is therefore
telling the truth. This can be used to build a predictive
model bottom up.
This course will help not only reservoir engineers but
also production engineers, geologists and well technologists
build reservoir or sector models with no previous experience.
They can then forecast, perform what-ifs and foresee future
intervention opportunities and risks with build and run times
in minutes instead of days.
Pre-requisites to attend the course.
No prior or specialist simulation knowledge is required.
As the Rubis software environment is very similar to other
Ecrin modules previous experience in using the Ecrin
Suite will be of benefit although not strictly required. When
it comes to understanding the complementary nature of the
simulation model and the PA and PTA analyses, prior experience
of these methodologies would be an advantage. The theory
of numerical simulation, transient analysis and production
analysis are not subjects covered in this course. If you
wish to discuss this please contact tcs@kappaeng.com
and one of our trainers will talk it through with you offering
pre-course reading/worked examples to give you a head
start if needed.

Course programme
Basic theory

This is NOT a simulation course. The mathematics and theory of
numerical simulation are not covered. The entire emphasis is on
the building of the physical problem and application of the model to
various scenarios.

Rubis Mechanics and model building

Introduction to Rubis and its role in the Ecrin suite compared to
transient and production analysis leading to flexible upscaling
from the wellbore to the full field. Building a simple model:
reservoir structure, rock and fluid properties and well description.
Defining multiple regions with different initial states. Importing and
loading data from various sources including a Geomodeler such
as Petrel.

Running the model

Automatic grid building, run settings, initialization, simulation,
displays, re-runs, what-ifs and reporting.

Specific problem solving

Recovery comparisons for different field scenarios and well placements. Adapting the grid to model coning effects. Modeling the aquifer and adjusting its strength. Modeling multiple and connected gas tanks.

Practical integration with PTA and PA

Initializing a Rubis case from Production Analysis (Topaze) and returning the results to the production analysis. Initializing a Rubis case from Pressure Transient Analysis (Saphir). Using a sector of a Rubis case to analyze a build-up in Saphir.

Full Field Case History

The course culminates in the attendees building a full-field history-matched model, leading to production optimization and a forecast of primary recovery, and then offering scenarios for secondary recovery.


Ecrin Dynamic Data Analysis software training


• Full Ecrin suite functionality covered
• No pre-requisite software knowledge required
• For those requiring software training in the Ecrin modules, from raw data to interconnectivity and workflow
The Ecrin Dynamic Data Analysis suite handles the most
complex, massive full-field pressure and rate data, the
simplest single-well, single-rate case, and everything
in between. This software course is designed to teach,
hands-on, the functionality of the full suite. The purpose is
also to teach the interconnectivity of the modules and how
a result from one analysis can be used and compared in
complementary techniques.
As the logic of the software is coherent between the
modules, by the time attendees get to 3D, 3-phase
history matching in Rubis they will be able to build much of
it intuitively, because the knowledge compounds as the
training progresses.
In addition to classical data loading, editing and processing,
the course covers the handling of massive PDG data in the
Diamant interface, including wavelet data reduction,
automatic multiple build-up selection, rate correction/allocation
and monitoring versus time.
Selected data is then processed in Saphir for PTA and in
Topaze for PA using classical analytical modeling. Object
sharing, data management and the time-saving drag-and-drop
workflow are emphasized. Modeling then moves to the
construction of the Voronoi model in 2D, in 2D with 3D
local refinement at the wellbore, and then to 3D multiwell,
multiphase modeling in Topaze and Rubis history matching.
During this process, grid upscaling and sector interconnectivity
between modules will be practiced.
To represent cases where corrections to sandface
and wellbore optimization are required, the WPA module,
Amethyste, will be used, linked to Saphir, Topaze
and Rubis.
Pre-requisites to attend the course
Prior knowledge of the software is not required, but if
you have never used it before we would be happy, as a
primer, to give you a full working demo version with
examples prior to attending. It is, however, very important
to understand that this is a software course only; theory
is only touched upon. It is essential that you are conversant
with transient analysis: if the shape of a build-up log-log
and derivative plot means nothing to you, this is not the
course for you.

Course programme
Data Load and Edit

Data QA/QC, connecting to raw PDG data, setting wavelet filtration levels, rate allocation, basic data loads, automatic multiple build-up selection, transferring selected data to analysis modules.

Analytical Modeling

Well, reservoir and boundary models, external models, regression and matching, generating plots from classical straight line to Bourdet derivative and Blasingame.

Numerical Modeling

2D and 3D model building, upscaling, single and multiwell treatment, matching and forecasting, PVT input, well configuration input, multilayer modeling, aquifer modeling.

Special Processing

Deconvolution, Formation Testers, handling and comparing multiple analyses in the same session, sensitivity and what-if forecasting, CBM and shale gas, desorption, compaction, changing model with time (time-dependent skin, pre- and post-frac), tidal effects, multilayer rates from production logs, use of the production profile generator.

Connectivity

Connecting the wellbore model to the reservoir model, completion optimization, connecting the reservoir sector to transient analysis and production analysis to transient and reservoir models, using common objects, organizing files, data, results and technical objects.


Support and Consulting

KAPPA Dynamic Flow Analysis (DFA) book


This book works at a number of levels. It is
a standalone generic technical reference for
those interested in transient and production
analysis. It can also be read as a practical
guide to testing and analysis and includes,
where needed, all the necessary theoretical
background mathematics. The book also works
as a practical support to the interpretation
process in the KAPPA Ecrin suite. A printed
version of the DFA book is given to all attendees
on any of the modules of the Dynamic Data
Analysis course provided by KAPPA.
Recent additions reflect the release of enhanced
numerical capabilities in the software and cover
technologies such as deconvolution and wavelet
filtration of PDG data. The book is free; to
keep abreast of new versions you are invited to
register on www.kappaeng.com and elect to be
kept updated on minor new releases.

www.kappaeng.com/dfa
Interactive support videos
As more modules are added and their interconnectivity increases, it is important for users to have
access to various learning media. We have recently launched support videos to help Ecrin users
understand and follow these workflows. The library is free, expanding and available to all registered users
at www.kappaeng.com/videos
More support...
The software is supported by intuitive help files, many worked examples and multilingual support, and
we are always ready to help in person. Send us an email at support@kappaeng.com or contact
one of our regional offices. Our engineers are happy to help using the collaborative web-based
tools, all of which are included in the maintenance of the software.
Consulting
KAPPA consultants are recognized as being among the best in their field in the industry. We are
happy to assist with short- or long-term interventions, onsite or web-based, and at short notice.
Contact tcs@kappaeng.com to discuss your requirements.


www.kappaeng.com
Corporate office
France - Sophia Antipolis
Support and Development
KAPPA Engineering SA
TEL: +33 497 212 450
FAX: +33 497 212 451
support@kappaeng.com

Sales, Training
and Consulting
UK - Gatwick
KAPPA TCS Ltd
TEL: +44 1342 837 101
FAX: +44 1737 821 518
sales@kappaeng.com
tcs@kappaeng.com

Regional offices
USA & Latin America - Houston
KAPPA North America Inc.
TEL: +1 713 772 5694
FAX: +1 713 772 5690
Middle East Asia - Bahrain
KAPPA MEA W.L.L.
TEL: +973 17 229 803
