2014 CMOST Presentation


Transcript of 2014 CMOST Presentation


    What this course is not

A treatise on optimization theory & methods

A course on the math behind CMOST

An in-depth discourse on applied history matching and optimization

We can provide some guidance based on our experience, but we do not claim to be experts in the application area


    Agenda

    CMOST overview

CMOST functionality and Tutorials

    Sensitivity Analysis

    History Matching

CMOST functionality and Tutorials

    Optimization

    Uncertainty Assessment



    Overview

    What is CMOST?

CMOST is CMG software that works in conjunction with CMG reservoir simulators to perform the following tasks:


    Sensitivity Analysis

Better understanding of a simulation model

Identify important parameters

    History Matching

Calibrate the simulation model with field data

Obtain multiple history-matched models

    Optimization

Improve NPV and recovery; reduce costs

    Uncertainty Analysis

Quantify uncertainty

Understand and reduce risk


    Typical Workflow for a Brown Field

[Workflow diagram: a reservoir model and field history file feed sensitivity analysis (parameter sensitivities), then history matching (matched model, parameter histograms), then optimization (optimal model and optimal operating conditions), and finally uncertainty assessment (uncertainty quantification) of the forecast model.]

    CMOST View of Simulation Models

Simulation Model

y1 = f1(x1, x2, …, xn)
y2 = f2(x1, x2, …, xn)
⋮
ym = fm(x1, x2, …, xn)

Parameters: x1, x2, …, xn

Objective Functions: y1, y2, …, ym


    CMOST Process

[Diagram: the CMOST process loop - select a combination of parameter values, substitute the parameter values into the simulation dataset, run the simulation, and analyze the results. The loop is driven by parameterization, objective functions & proxy analysis, experimental design & optimization algorithms, and the CMOST user interface.]


    How Input Data is Organized

Define what data is to be extracted from simulation (Plots and Formulas)

Define how the simulation model is parameterized (Inputs)

Define the objective functions to be calculated (Outputs)

General Settings


    General Properties

All input files are entered on the General Properties page:

    Base Dataset (required)

    Master Dataset (required)

    Results template files

    Measured Data

    Field History Files

    Log Files

    Fundamental Data

Fundamental data defines which 2D curves of simulation results need to be viewed

Original Time Series: 2D plots similar to Results Graph (X-axis: time)

User-defined Time Series: custom formula-based 2D plots (X-axis: time)

Property vs. Distance Series (X-axis: distance)

Fluid Contact Depth Series: depth of fluid contact (based on fluid saturation) vs. time


Parameterization

Parameters are variables in the simulation model that will be adjusted when creating new datasets

- E.g. porosity, permeability, etc.

To determine the locations in the dataset where values are substituted, a master dataset must be created (.cmm)

A master dataset is almost identical to a normal simulation dataset, except that CMOST keywords have been added to identify where a parameter should be substituted

- Acts as a template for creating new datasets


Master Dataset

A master dataset can be created in multiple ways:

    CMOST Editor

    Builder

    Text editor (Notepad, Textpad, etc.)


    Parameterization of Simulation Model (Builder)

[Builder screenshot: select a dataset section, select a parameter from that section to build the list of parameters, then export the master dataset (template file).]

Parameterization of Simulation Model (Text)

Dataset Template

Complementary to Builder

Create CMOST parameters

Better syntax highlighting: highlights CMOST parameters; folds sections you don't need to see

Easy navigation: jump between different sections of the dataset; navigate CMOST parameters

Handles include files: create/extract include files; view include files; parameterize include files


    Master Dataset Syntax

Original dataset:

PORCON 0.20

Master dataset:

PORCON <cmost>this[0.20]=Porosity</cmost>

Reading the line left to right: the simulator keyword, the CMOST start marker, the original (default) value in the dataset, the variable name, and the CMOST end marker.

No spaces are allowed in the CMOST portion

Variable names are case sensitive

    Master Dataset Syntax

Formulas can also be used with one or more variables:

PORCON <cmost>0.20*PorosityMultiplier</cmost>

Here the CMOST portion between the start and end markers is a formula; the default value is optional.


    Master Dataset Syntax

Values in regions of the reservoir can be modified using MOD simulation keywords:

PORCON 0.20
MOD
1:5 2:8 1:10 * <cmost>this[1]=PorosityMultiplier1</cmost>
6:10 2:8 1:10 * <cmost>this[1]=PorosityMultiplier2</cmost>

Block ranges are specified as I:I J:J K:K.

    Parameter Definition

Continuous parameter

Lower and upper limits define the sampling range used by the study engines

Discrete parameter

Real, Integer, and Text

Each discrete text value also requires a numerical value

Formula

Value based on the values of other parameters


    Dependent Parameters Using Formula

Syntax highlighting

Shows which variables are available for use in formulas

Test and check the formula at any time

Hard Constraints

Criteria that must be satisfied for a dataset to be created

Used to eliminate unrealistic datasets

E.g. horizontal permeability should be greater than vertical permeability


    Pre-Simulation Commands

Passes the dataset to a separate application before submitting it to the simulator

Run Builder silently

Run GOCAD command silently

Run user-defined command

Can be used to create new geostatistical realizations, recalculate formulas in Builder, recalculate rel. perm. curves, etc.

    Coupling with Geological Software

[Diagram: each geological model is paired with its corresponding simulation model.]


Objective Functions

An Objective Function (OF) is something (an expression or a single quantity) for which you wish to achieve some goal

Usually this goal is to achieve a minimum or maximum value

In the case of history matching, one usually wishes to minimize the error between field data and simulation results

In the case of optimization, one usually wishes to maximize something like NPV


    Basic Simulation Results

Values taken directly from simulation results with no modification

History match error

Percentage relative error

Perfect match: 0%

Net Present Value

Simplified NPV calculation

Can be used to construct user-defined objective functions which use time-discounted simulation results as variables

    Objective Functions

Characteristic Date Times

Specific dates

Date where the maximum or minimum value is found

Date when a value surpasses a specified criterion

Advanced Objective Functions

User-defined objective function based on a formula or code (JScript or Python)

Soft Constraints

Re-evaluates objective functions based on simulation results


What are Dynamic Date Times?

Characteristic date times, within a certain time series range, that meet certain criteria

Examples:

First time the oil rate is higher than 100 bbl/d

Last time the SOR is greater than or equal to 6

First time the oil rate is higher than 100 bbl/d and the SOR is smaller than 4

Why Dynamic Date Times are Needed

A fixed date time (e.g. the simulation stop time, or 2014-8-15) is the same for all jobs; objective functions are calculated at fixed date times for all jobs.

In some cases, we want information at the date time when some specified condition is met (e.g. oil rate > 100). These times are not the same for all simulation jobs; thus, they are dynamic.


Dynamic Date Times: Maximize Peak NPV

[Plot: NPV vs. time, with the peak NPV marked.]

Dynamic Date Times: Plateau Optimization

[Plot: oil rate vs. time, marking the plateau period, oil produced at plateau, and average oil rate at plateau.]


    Characteristic Date Times Page

    Start & Stop Time

    To name certain

    data time

    Data times that

    meet certain

    criteria

    Define Dynamic Date Time

    Criteria Time Series

    Criteria User defined time series


    Characteristic Time Durations

Defined on the Basic Simulation Results page; they can then be used as objective functions.

User-defined Objective Functions

Use an Excel spreadsheet

Map CMOST parameter values to cells

Map simulation results to cells

Use JScript or Python code

Use an executable provided by the user (e.g. MATLAB)

Preview the calculation result using the base case


    Use Excel Spreadsheet

After each simulation is done:

CMOST writes parameter values and simulation results to Excel cells;

Excel calculates the objective function using formulas or VBA code;

then CMOST reads the value back and uses it as the objective function value.
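As an illustration of the code route, here is a minimal standalone sketch of a Python objective function; the function name, arguments, and numbers are hypothetical, and CMOST's actual Python hook passes parameters and simulation results through its own interface.

```python
# Minimal sketch of a user-defined objective function in Python.
# All names and numbers here are hypothetical illustrations.

def simplified_npv(yearly_oil_rates_bbl_per_day, oil_price, opex_per_day,
                   yearly_discount_rate, days_per_year=365.0):
    """Discount one year of net cash flow per list entry and sum."""
    npv = 0.0
    for year, rate in enumerate(yearly_oil_rates_bbl_per_day, start=1):
        cash_flow = (rate * oil_price - opex_per_day) * days_per_year
        npv += cash_flow / (1.0 + yearly_discount_rate) ** year
    return npv

# Example: three years of average oil rates taken from simulation results
print(simplified_npv([450.0, 380.0, 310.0],
                     oil_price=70.0, opex_per_day=8000.0,
                     yearly_discount_rate=0.10))
```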

    Sensitivity Analysis


    Sensitivity Analysis Goals

Determine which parameters have an effect on results

E.g. I expect that rock compressibility is between values A and B. Does this uncertainty impact my results?

Determine how much of an effect parameters have on results

E.g. If permeability is increased by 50 mD, how much will cumulative oil increase?

    Sensitivity Analysis Process

    Select parameters to analyze

    E.g. porosity

    Select range of values to analyze

    E.g. between 20-30% porosity

Select results (Objective Functions) to analyze

    E.g. Cumulative Oil


    Sensitivity Analysis Methodology

One Parameter at a Time (OPAAT): each parameter is analyzed independently while the remaining parameters are set to their reference values

Response Surface Methodology: multiple parameters are adjusted together, then results are analyzed by fitting a response surface (polynomial equation) to the results

    One Parameter at a Time (OPAAT)

This method analyzes each parameter independently

While analyzing one parameter, the method freezes the other parameters at their reference values (median or default)

This measures the effect of each parameter on the objective function while removing the effects of the other parameters.
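A minimal sketch of this idea in Python, with a toy function standing in for the simulator (the simulate function, parameter names, ranges, and numbers are assumptions made for illustration):

```python
# Sketch of the OPAAT idea: vary one parameter over its range while the
# others stay frozen at reference values; record the objective's change.

def simulate(params):
    # Toy stand-in for a reservoir simulation run
    return 1000.0 * params["porosity"] + 0.5 * params["perm_md"]

reference = {"porosity": 0.25, "perm_md": 3500.0}
ranges = {"porosity": (0.20, 0.30), "perm_md": (2625.0, 4375.0)}

effects = {}
for name, (low, high) in ranges.items():
    results = []
    for value in (low, reference[name], high):
        case = dict(reference)          # all other parameters frozen
        case[name] = value
        results.append(simulate(case))
    effects[name] = max(results) - min(results)  # bar length in a tornado plot

print(effects)
```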


    One Parameter at a Time (OPAAT)

Benefits

Simple to use

Results are easy to understand

Results are not complicated by the effects of other parameters

Drawbacks

Results are focused around the reference values

Results can change dramatically if the reference values change

    Configure OPAAT Engine

[Screenshot: select the reference values to be used; non-monotonic variables require many levels of parameter values to be tested. Inset plot: objective function vs. parameter.]


One Parameter at a Time (OPAAT)

Porosity = 0.20 → CumOil = 33,004 bbl

Porosity = 0.25 (reference value) → CumOil = 40,416 bbl

Porosity = 0.30 → CumOil = 44,176 bbl

Note: the minimum and maximum objective function values do not always correspond to the minimum and maximum parameter values; check cross plots to verify


    One Parameter at a Time (OPAAT)

The length of each bar gives the maximum change in the objective function over the parameter range

Response Surface Method (RSM)

Correlation between response and parameters:

NPV = f(x1, x2, …, xn)

The response surface is a proxy for the reservoir simulator that allows fast estimation of the response



    Response Surface Method (RSM)

Multiple parameter values are changed simultaneously

The combinations of parameter values that are chosen are based on an experimental design

A response surface (polynomial equation) is fit to the simulation results:

Linear

Linear + Quadratic

Linear + Quadratic + Interaction terms

    Response Surface Proxy Models

Linear Model:

$$y = a_0 + a_1 x_1 + a_2 x_2 + \cdots + a_k x_k$$

Linear + Quadratic Model:

$$y = a_0 + \sum_{j=1}^{k} a_j x_j + \sum_{j=1}^{k} a_{jj} x_j^2$$

Linear + Quadratic + Interaction Model:

$$y = a_0 + \sum_{j=1}^{k} a_j x_j + \sum_{j=1}^{k} a_{jj} x_j^2 + \sum_{i<j} a_{ij} x_i x_j$$

Statistically insignificant terms are automatically removed

The model type is automatically chosen but can be changed if necessary
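As a sketch of how such a surface can be fit, the snippet below builds the linear + quadratic + interaction design matrix for two normalized parameters and solves for the coefficients by ordinary least squares; the training data are synthetic, and unlike CMOST this sketch does not remove insignificant terms:

```python
import numpy as np

# Fit y = a0 + a1*x1 + a2*x2 + a11*x1^2 + a22*x2^2 + a12*x1*x2 by OLS.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(20, 2))   # 20 experiments, 2 parameters
y = 5.0 + 2.0*X[:, 0] - X[:, 1] + 0.5*X[:, 0]*X[:, 1] \
    + 0.05*rng.normal(size=20)             # synthetic "simulation" responses

design = np.column_stack([
    np.ones(len(X)), X[:, 0], X[:, 1],
    X[:, 0]**2, X[:, 1]**2, X[:, 0]*X[:, 1],
])
coeffs, *_ = np.linalg.lstsq(design, y, rcond=None)
print(coeffs)   # the fitted surface acts as a fast proxy for the simulator
```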


    SA Using RSM Engine

Handles problematic simulation runs

Specify the desired accuracy; the engine will create and run experiments as needed.


    Response Surface Method (RSM)

To compare effects between different parameters, parameter ranges are normalized between -1 and 1

In the resulting tornado plot, 2 × the coefficient of the normalized polynomial relation is given

This represents the average change going from the minimum to the maximum parameter value when looking at linear effects

Similar to the bar length from the OPAAT method


    Response Surface Method (RSM)

Increasing PERMH_L1 (permeability) from 2625 mD to 4375 mD results in an increase in Cumulative Oil of 12,461 STB on average

    Response Surface Method (RSM)

If a parameter has a non-linear relation with the objective function, a quadratic term may also be given (x²)

If modifying two parameters at the same time has an effect stronger than the sum of their individual linear or quadratic effects, a cross term may be given (x·y)


    Quadratic and Linear Effect Table

[Cross plots: X-axis is the parameter; Y-axis is the objective function.]

    Any Questions?

    Now Hands on Work


    Control Centre

    Engine Settings

Defines the task type

The task type can be modified from what was originally selected when creating the study

Any other options related to the engine can be modified from this page


    Engine Settings

[Screenshot: study type and engine selection.]

    Simulator Settings

Simulation-related settings:

    Schedulers

    Simulator version

    Number of CPUs per job

    Maximum simulation run time

    Job record and file management

    Data I/O Cleanup


    Submit Simulations to Compute Cluster

CMG Scheduler

Microsoft HPC

IBM Platform LSF

Oracle Grid Engine

Portable Batch System (PBS/TORQUE)

    Optimize Dataset I/O Section

Switches to reduce output file size:

Disable writing of restart records

Disable writing of grid records to the OUT file

Disable writing of grid records to the SR2


    Optimize Dataset I/O Section

    What does CMOST do?

    Benefits

Removes I/O bottlenecks

Reduces the occurrence of strange problems

Reduces support troubleshooting time

    Experiments Table

List of experiments

Parameter values used

Objective function results

Able to sort and filter results

Open experiments in Builder and Results

Add additional experiments:

User-defined

Predefined experimental designs (fractional factorial, Latin hypercube, etc.)
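A minimal sketch of Latin hypercube sampling in Python (the generic technique, not CMOST's internal implementation; the example ranges reuse the porosity and permeability values quoted elsewhere in this deck):

```python
import numpy as np

# Latin hypercube sampling: split each parameter range into n equal strata,
# draw one sample per stratum, then shuffle the strata per parameter.

def latin_hypercube(n_samples, bounds, rng=None):
    rng = rng or np.random.default_rng()
    n_params = len(bounds)
    u = (np.arange(n_samples)[:, None]
         + rng.random((n_samples, n_params))) / n_samples
    for j in range(n_params):
        rng.shuffle(u[:, j])            # decouple the strata per column
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

# Example: 8 experiments over porosity 0.20-0.30 and permeability 2625-4375 mD
print(latin_hypercube(8, [(0.20, 0.30), (2625.0, 4375.0)]))
```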


    Simulation Jobs

List of simulations

Scheduler information:

Start time

End time

Scheduler name

Status

File information:

Name and location

Normal/abnormal termination

    Any Questions?

    Now Hands on Work


    Results & Analyses

    Results & Analyses

    Al l result objects are dynamically created onthe fly using the data stored in experiment

    table

    Al l types of result objects are available forany study type

    HM & OP will automatically have sensitivityand proxy result if there are enough

    experiments.


    Results & Analyses

Parameters

Run progress: parameter value vs. experiment number

Histogram: frequency with which ranges of parameter values were chosen

Cross plot: parameter values plotted vs. other data, with the current parameter always on the y-axis

Time Series and Property vs. Distance

Plots of simulation results as defined in the Fundamental Data section


    Results & Analyses

Objective Functions

Run progress: objective function value vs. experiment number

Histogram: frequency with which ranges of objective function values occurred

Cross plot: objective function values plotted vs. other data, with the current objective function always on the y-axis

OPAAT Analysis

Results from One Parameter At A Time sensitivity analysis

Proxy Analysis

Proxy verification

Response surface sensitivity analysis results

Monte Carlo simulation uncertainty assessment results

Sensitivity Analysis - Validating Results


    Proxy Analysis for Objective Functions

Polynomial proxy: linear/quadratic/interaction

RBF neural network

Statistics

Response Surface Verification

When using the response surface methodology, one should verify that the response surface provides a valid match to the simulation data

This can be verified through:

Response Surface Verification Plot

Summary of Fit Table

Analysis of Variance Table

Effect Screening


    Response Surface Verification Plot

Gives a visual overview of how well the proxy model fits the actual simulation results

The distance from each point to the 45-degree line is the error (residual) for that point

Points that fall on the 45-degree line are perfectly predicted


    Summary of Fit Table

R² is a measure of the amount of reduction in the variability of the response obtained by using the regressor variables in the model.

An R² of 1 occurs when there is a perfect fit (the errors are all zero).

An R² of 0 means that the model predicts the response no better than the overall response mean.

R² can be adjusted to make it comparable over models with different numbers of regressors by using the degrees of freedom in its computation:

$$R^2_{adjusted} = 1 - \frac{n-1}{n-p}\left(1 - R^2\right)$$

Here n is the number of observations (training jobs) and p is the number of terms in the response model (including the intercept).

When R² and R²adjusted differ dramatically, there is a good chance that non-significant terms have been included in the model.


R²adjusted (Example)

With n = 4 samples:

Linear (p = 2): R² = 0.9683, so R²adjusted = 1 - (4-1)/(4-2) × (1 - 0.9683) = 0.952

Quadratic (p = 3): R² = 0.9715, so R²adjusted = 1 - (4-1)/(4-3) × (1 - 0.9715) = 0.915

The quadratic model has the higher R² but the lower R²adjusted.

    Summary of Fit Table

PRESS is the prediction error sum of squares.

R²prediction gives some indication of the predictive capability of the regression model.

For example, we could expect a model with R²prediction = 0.95 to explain about 95% of the variability in predicting new observations.


    Summary of Fit Table

To calculate PRESS, select an observation i.

Fit the regression model to the remaining n-1 observations and use this equation to predict the withheld observation yi.

Denoting this predicted value by y(i), we can find the prediction error for point i as ei = yi - y(i).

The prediction error is often called the ith PRESS residual. This procedure is repeated for each observation i = 1, 2, …, n, producing a set of n PRESS residuals: e1, e2, …, en.

The PRESS statistic is then defined as the sum of squares of the n PRESS residuals:

$$\mathrm{PRESS} = \sum_{i=1}^{n} e_i^2$$

R²prediction Example

x      y      y(i)    ei      ei²
1.000  1.900  1.657   0.243   0.059
2.000  2.450  2.484  -0.034   0.001
3.000  2.950  3.194  -0.244   0.060
4.000  3.890  3.483   0.407   0.165

PRESS: 0.285

SS (Total): 2.143

R²prediction = 1 - 0.285/2.143 = 0.867
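A brute-force Python sketch of the PRESS calculation by leave-one-out refitting of a straight line; run on the data above it reproduces PRESS ≈ 0.285 and R²prediction ≈ 0.867:

```python
import numpy as np

# PRESS by leave-one-out: refit the model without point i, predict point i,
# and accumulate the squared prediction error.

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([1.90, 2.45, 2.95, 3.89])

press = 0.0
for i in range(len(x)):
    keep = np.arange(len(x)) != i
    slope, intercept = np.polyfit(x[keep], y[keep], 1)  # refit without point i
    e_i = y[i] - (slope * x[i] + intercept)             # ith PRESS residual
    press += e_i ** 2

ss_total = np.sum((y - y.mean()) ** 2)
print(press, 1.0 - press / ss_total)   # approx. 0.285 and 0.867
```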


    Summary of Fit Table

Mean of Response

The overall mean of the response values. It is important as a base model for prediction because all other models are compared to it.

Standard Error

Estimates the standard deviation of the random error. It is the square root of the mean square for Error in the corresponding Analysis of Variance table. The standard error is commonly denoted as σ.

    Analysis of Variance Table

Degrees of Freedom

Total = Number of samples - 1

Model = Number of coefficients for the response surface (not including the intercept)

Error = Total - Model

Sum of Squares

Total: sum of squared distances of each response from the sample mean

Error: sum of squared differences between the fitted (RS) values and the actual simulated values

Model = Total - Error

Mean Square

(Sum of Squares)/(Degrees of Freedom): converts a sum of squares to an average


    Analysis of Variance Table

F Ratio

The Model mean square divided by the Error mean square

Tests the hypothesis that all the regression parameters (except the intercept) are zero, i.e. have no effect on the objective function

Prob > F

The probability of obtaining a greater F-value by chance alone if the specified model fits no better than the overall response mean.

Significance probabilities of 0.05 or less are often considered evidence that there is at least one significant regression factor in the model

Summary of Fit and Analysis of Variance Table


    Response Surface Model Checklist

Check Predicted vs. Actual

Is there a good fit?

Any outliers?

Check Summary of Fit

Are R-Square adjusted and R-Square prediction large enough (> 0.5)?

Check Analysis of Variance

For a decent model, Prob > F should be very small

Check Effect Screening

Prob > |t|

The probability of getting an even greater t-statistic (in absolute value), given the hypothesis that the parameter (coefficient) is zero.

Probabilities less than 0.1 are often considered significant evidence that the parameter (coefficient) is not zero.

Used to filter statistically insignificant terms


Effect Screening using Normalized Parameters (-1, +1)

VIF (Variance Inflation Factor)

A measure of the multi-collinearity problem

Multi-collinearity refers to one or more near-linear dependences among the regressor variables, due to poor sampling of the design space

Multi-collinearity can have serious effects on the estimates of the model coefficients and on the general applicability of the final model

The larger the variance inflation factor, the more severe the multi-collinearity

It is suggested that variance inflation factors should not exceed 4 or 5

If the design matrix is perfectly orthogonal, the variance inflation factor for all terms will be equal to 1


    Response Surface Method (RSM)

Increasing PERMH_L1 (permeability) from 2625 mD to 4375 mD results in an increase in Cumulative Oil of 12,461 STB on average

    Proxy Dashboard

See a quick estimate of the effects of parameters on results at all simulation times

See results immediately without needing to wait for additional simulations

Can assist in manual history matching or optimization

CMOST can create a dataset and run a simulation to verify the proxy results


    Proxy Dashboard

[Screenshot workflow: build the proxy model, select a comparison case, then for a what-if scenario choose the parameter inputs, visualize the estimated curve, and estimate the objective functions.]

    Proxy Dashboard Method

The curve is divided into 100 points

A response surface model is created for each point based on the simulation results:

Polynomial

RBF neural network

When parameter values are adjusted, each point along the curve is recalculated using the response surface
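A sketch of the per-point proxy idea in Python, using scipy's RBFInterpolator to fit all 100 curve points at once from a set of training experiments; the two parameters and the toy curves are invented for illustration:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# One proxy per curve point: fitting a multi-output RBF model over the
# parameter space is equivalent to one RBF model per point along the curve.

rng = np.random.default_rng(1)
params = rng.uniform(-1.0, 1.0, size=(30, 2))   # 30 training experiments
t = np.linspace(0.0, 1.0, 100)
# Toy "simulated" curves that depend on the parameters
curves = np.array([p[0] * t + p[1] * np.sin(3 * t) for p in params])

proxy = RBFInterpolator(params, curves)          # multi-output RBF fit
what_if = proxy(np.array([[0.3, -0.5]]))         # instant curve estimate
print(what_if.shape)                             # (1, 100)
```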


    Any Questions?

    Now Hands on Work

History Matching and Optimization


    History Matching Goals

In history matching, we are trying to reduce the error between the simulation results and field-measured data

By matching the simulation model to the historical behaviour, we have more confidence that the model will be able to predict future behaviour

When creating a simulation model, there may be uncertainty in the input parameters. These will be the parameters that should be adjusted when history matching

    History Matching Process

    Select parameters to analyze

    E.g. porosity, permeability

    Select range of values to analyze

    E.g. between 20-30% porosity

Select results (Objective Functions) to match

E.g. Cumulative Oil

CMOST will search for the best combination of parameter values that gives the lowest history match error


    Objective Function Hierarchy

A hierarchy is used when optimizing the objective function

Upper terms are calculated as a weighted average of the lower terms

[Diagram: a Global Objective Function composed of Local Objective Functions 1-3, each composed of its own terms.]

    Objective Function Hierarchy

Typically, common items are grouped together

E.g. the local objective functions might represent the error for a well, and the terms might represent the measured data for that well

[Diagram: Total Error composed of Well 1 Error (oil, water, and gas production errors), Well 2 Error (bottom-hole pressure and oil production errors), and Well 3 Error (oil, water, and gas production errors).]


    Calculating History Match Error

$$E = \sqrt{\frac{\sum_{t=1}^{N_t}\left(Y_t^{\,s} - Y_t^{\,m}\right)^2}{N_t}}$$

where $Y_t^{\,s}$ is the simulated value, $Y_t^{\,m}$ the measured value, and $N_t$ the number of measurements.

For each measured data point, calculate the difference between the simulated and measured result

Square the terms to make them positive

Sum up all of the points at all times

Divide by the number of measurements to get the average square

Take the square root to get the average error

    Calculating History Match Error

To compare the error of terms with different units, the error must be normalized

This is done by dividing by the maximum difference in measured values

Measurement error can also be included

Merr represents 1 standard deviation from the mean

A factor of 4 is used to include 2 standard deviations on each side of the mean (95% confidence)

$$\mathrm{TermError}_j = \frac{\sqrt{\dfrac{\sum_{t=1}^{N_t(j)}\left(Y_{t,j}^{\,s} - Y_{t,j}^{\,m}\right)^2}{N_t(j)}}}{\Delta Y_j^{\,m} + 4\,\mathrm{Merr}_j}$$

where $\Delta Y_j^{\,m}$ is the maximum difference in the measured values and $\mathrm{Merr}_j$ the measurement error for term $j$.


    Calculating History Match Error

Each Local Objective Function is a weighted arithmetic average of its Terms:

$$Q_i = \frac{\sum_{j=1}^{N(i)} tw_{i,j}\,\mathrm{TermError}_{i,j}}{\sum_{j=1}^{N(i)} tw_{i,j}} \times 100\%$$

    Calculating History Match Error

Expanding the term error gives:

$$Q_i = \frac{1}{\sum_{j=1}^{N(i)} tw_{i,j}} \sum_{j=1}^{N(i)} tw_{i,j}\, \frac{\sqrt{\dfrac{\sum_{t=1}^{N_t(i,j)}\left(Y_{i,j,t}^{\,s} - Y_{i,j,t}^{\,m}\right)^2}{N_t(i,j)}}}{\Delta Y_{i,j}^{\,m} + 4\,\mathrm{Merr}_{i,j}} \times 100\%$$


    Calculating Global History Match Error

The Global Objective Function is a weighted arithmetic average of the Local Objective Functions:

$$Q_{global} = \frac{\sum_{i=1}^{N_w} w_i\,Q_i}{\sum_{i=1}^{N_w} w_i}$$
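A compact Python sketch of this hierarchy (term error to local error to global error) following the normalization above; the weights and measurements are made up for illustration:

```python
import numpy as np

# Term error -> local (e.g. per-well) error -> global error.

def term_error(y_sim, y_meas, meas_err=0.0):
    y_sim, y_meas = np.asarray(y_sim, float), np.asarray(y_meas, float)
    rms = np.sqrt(np.mean((y_sim - y_meas) ** 2))
    spread = y_meas.max() - y_meas.min()           # maximum difference
    return 100.0 * rms / (spread + 4.0 * meas_err) # normalized percent error

def weighted_average(values, weights):
    return float(np.average(values, weights=weights))

# Two terms for one well, then a one-well "global" error
q_oil = term_error([105, 98, 90], [100, 95, 93], meas_err=1.0)
q_wat = term_error([40, 55, 61], [42, 50, 66], meas_err=2.0)
local_err = weighted_average([q_oil, q_wat], weights=[2.0, 1.0])
global_err = weighted_average([local_err], weights=[1.0])
print(q_oil, q_wat, local_err, global_err)
```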

    Field Data Weighting

Each measured data point can be weighted individually

Remove or reduce the weight of outliers


    Optimization Methods

    CMG DECE (Designed Evolution, ControlledExploration)

    Particle Swarm Optimization

    Latin Hypercube plus Proxy Optimization

    Random Brute Force Search

    Optimization Philosophy

Mathematical optimization

Mathematicians are particularly interested in finding the true absolute optimum.

An optimum of 0.000001 is much better than 0.01, even though it may take 20 extra days to achieve the former.

Engineering optimization

Engineers are more interested in quickly finding optima that are close to the true optimum.

An optimum of 0.01 is much better than 0.000001 if it takes 20 fewer days to achieve the former.

CMOST Optimization Philosophy

Engineering optimization

Not intended to solve pure mathematical problems


    CMG DECE Optimization Algorithm

[Flowchart: generate an initial Latin hypercube design → run simulations using the design → get an initial set of training data → alternate exploitation (find an optimum) with exploration (get more information) when exploitation does not succeed → run the new simulations → add the new solutions to the training data → repeat until the stop criteria are satisfied.]


    DECE Characteristics

Handles continuous & discrete parameters

Handles hard constraints

Asynchronous, complete utilization of distributed computing power

Fast and stable convergence

    PSO Optimization Algorithm

A population-based stochastic optimization technique developed in 1995 by James Kennedy and Russell Eberhart.

Particles move towards the best positions in the search space, each remembering its own (local) best known position and the swarm's (global) best known position.
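A minimal textbook PSO sketch in Python using the standard inertia, cognitive, and social update terms; this illustrates the technique generically rather than CMOST's exact implementation:

```python
import numpy as np

# Textbook PSO: velocities are pulled toward each particle's best position
# (cognitive term) and the swarm's best position (social term).

def pso(f, bounds, n_particles=20, n_iters=50, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, float).T
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(n_iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Example: minimize a simple quadratic bowl
print(pso(lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2, [(-5, 5), (-5, 5)]))
```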


PSO Optimization Algorithm

[Diagrams: PSO particles sampling a local area around each particle and choosing the next case.]


Latin Hypercube plus Proxy Optimization Algorithm

[Flowchart: generate an initial Latin hypercube design → run simulations using the design → get an initial set of training data → build a proxy model (polynomial or RBF neural network) from the training data → find possible optimum solutions using the proxy → run simulations for these possible solutions → add the validated solutions to the training data → repeat until the stop criteria are satisfied.]


    Proxy Optimization

    Latin hypercube design

    Optimization using proxy

Random Search

Parameter value combinations are chosen randomly until the maximum number of simulator calls has been reached

No trend to the results (scatter)

Only use if the search space is small


    Any Questions?

    Now Hands on Work

Optimization Goals

History matching and optimization are very similar, in that in each one would like to find the maximum or minimum of an objective function

In history matching, we are trying to reduce the error between the simulation results and field-measured data

With optimization, we are trying to improve an objective function

Find maximum NPV

Find maximum recovery

Etc.

Typically with optimization, the parameters that will be adjusted are operational parameters, as opposed to reservoir parameters when history matching


    Optimization Process

Select parameters to analyze

E.g. injection rate, well spacing

Select ranges of values to analyze

E.g. between 200-500 bbl/day injection rate

Select results (Objective Functions) to improve

E.g. NPV, recovery factor

CMOST will search for the combination of parameter values that maximizes your objective function

In some cases we may want to minimize an objective function, such as when looking at run times during numerical tuning

    Calculating Net Present Value (NPV)

Net Present Value (NPV) is often used as an economic indicator to evaluate the value of a project

A discount rate (I) is used to incorporate the time value of money

Money now is worth more than money later
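In its standard simplified form, NPV discounts each period's net cash flow $CF_t$ back to the present at the per-period discount rate $I$:

$$\mathrm{NPV} = \sum_{t=1}^{T} \frac{CF_t}{(1+I)^{t}}$$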


    Calculating Net Present Value (NPV)

In CMOST, the cash flow is always calculated based on the period that has been selected:

Daily, Monthly, Quarterly, or Yearly

The yearly discount rate is converted to the period of interest

E.g. for daily periods, a compound conversion gives $I_{daily} = (1 + I_{yearly})^{1/365} - 1$

    Common OP Engine Settings

[Screenshot: the OP engines (DECE, LHD Plus Proxy, PSO, Random Brute Force) all use the same stop criterion; choose which objective function to optimize and whether to maximize or minimize it.]


    Multi-objective PSO (SPE 170024)

Multiple-objective optimization: optimizing multiple, possibly conflicting, objective functions simultaneously

Two approaches:

1. Optimize an aggregated global objective function

2. Pareto optimization, e.g. multiple-objective PSO

Multi-Objective PSO

[Screenshot: select the engine, the number of simulations, and, for each of the two objective functions, whether to maximize or minimize it.]


    How MO-PSO Works

Domination: better in every objective function

Leader: a non-dominated solution

Pareto front: the ensemble of leaders

[Diagram: objectives f1 vs. f2 showing leaders on the Pareto front, dominated solutions, and a point a dominating a point b.]
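A small Python sketch of the domination test, extracting the leaders (the Pareto front) from a set of points when both objectives are minimized; the sample points are invented:

```python
import numpy as np

# A point is dominated if some other point is at least as good in every
# objective and strictly better in at least one.

def pareto_front(points):
    pts = np.asarray(points, dtype=float)
    leaders = []
    for i, p in enumerate(pts):
        dominated = any(
            np.all(q <= p) and np.any(q < p)
            for j, q in enumerate(pts) if j != i
        )
        if not dominated:
            leaders.append(p)
    return np.array(leaders)

# (MaterialBalanceError, RunTime) pairs
print(pareto_front([(0.010, 3100), (0.020, 2700), (0.015, 2900), (0.030, 2950)]))
```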

    Leader Selection

A leader is randomly selected for each particle

The least crowded leaders are given high priority, to obtain an adequately spread Pareto front

[Diagram: leaders a-f along the f1-f2 front with selection priority 1: a, f; 2: e; 3: d; 4: c; 5: b.]


    Case Study: Numerical Tuning

Numerical tuning of a SAGD model for a real field

The full model takes >42 days to run

After cleanup & tuning: 7 days to finish (SPE 165511)

Sub model: 4 slices (4x600x50) takes ~1 hour to run the first 6 months

Colin Card et al., A New and Practical Workflow of Large Multi-Pad SAGD Simulation: A Corner Oil Sands Case Study, SPE HOCC, SPE 165511, June 2013.

    Single Objective PSO Optimization

Conflicting objective functions:

1. Run time

2. Material balance error

3. Solver failure percent

GlobalObj = (50,000 × MaterialBalanceError + 1 × RunTime + 1,000 × SolverFailurePercent) / 51,001

Use PSO to minimize GlobalObj


    MO-PSO Optimization

Same set of parameters

Minimize two conflicting objective functions: MaterialBalanceError & RunTime

GlobalObj & SolverFailurePercent are also calculated, just for comparison

[Plot: Pareto front after 500 MO-PSO and 1000 SPSO experiments; x-axis: material balance error (%), 0-0.04; y-axis: run time (s), 2500-3500.]


    Any Questions?

    Now Hands on Work

    Uncertainty Assessment


    Uncertainty Assessment (UA)

Once an optimum operating strategy has been developed for one or more history-matched models, the remaining question is:

Given residual uncertainties in the HM (or other) variables, what impact will those uncertainties have on the NPV of the optimum case(s)?

How does this work?

Even with simple geological models, we are still likely to have more than one set of geological parameter values that gives an acceptable HM, thus indicating some uncertainty in these values

Uncertainty Assessment (UA)

How does this work?

If we have more than one geological realization that gives an acceptable HM, we have by definition an uncertainty as to which realization best reflects reality

Thus we need to see how the NPV of our optimum cases is impacted by the uncertainty in these realizations

It is important to recognize that the HM process develops alternative realizations, and that parameters which are part of these realizations cannot have their values changed independently and arbitrarily


Uncertainty Assessment


    Monte Carlo Simulation

Input probability distributions derived from experience

Pick random values (that follow the input distributions) and calculate NPV

Repeat for thousands of iterations

[Figure: prior probability distributions for the HM parameters DWOC (centred near 82), SORG, and SORW (centred near 0.30), feeding NPV = F(DWOC, SORG, SORW, …) and producing a histogram of net present value (M$).]

    Prior Probability Density Functions

Defines the likelihood of a parameter value being selected in Monte Carlo simulation for continuous parameters:

Uniform

Triangle

Normal

Log Normal

Custom

For discrete parameter types, a probability is defined for each value


    Probability Distributions in CMOST

[Plots: Normal, Lognormal, Triangle, Uniform, Custom, and Discrete distributions.]

    Parameter Correlations

Some parameters may be related to each other

E.g. porosity and permeability may correlate with each other

The correlation coefficient defines how closely parameters are related to each other

A value of 1 means the parameters are directly related

A value of 0 means the parameters have no relation to each other


[Scatter matrix: POR, PERMH, and PERMV sampled with no correlation.]

Parameter  POR  PERMH  PERMV
POR        1    0      0
PERMH      0    1      0
PERMV      0    0      1

    Sampling of Correlated Parameters

[Scatter matrix: POR, PERMH, and PERMV sampled with the desired correlations.]

Desired values:

Parameter  POR    PERMH  PERMV
POR        1      0.6    0.4
PERMH      0.6    1      0.8
PERMV      0.4    0.8    1

Calculated values:

Parameter  POR    PERMH  PERMV
POR        1      0.582  0.385
PERMH      0.582  1      0.79
PERMV      0.385  0.79   1
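One common way to generate such samples, sketched here in Python rather than as CMOST's actual scheme, is to correlate independent normal draws with the Cholesky factor of the desired correlation matrix; as on the slide, the achieved correlations drift slightly from the desired ones for a finite sample:

```python
import numpy as np

# Impose a desired correlation matrix on independent standard-normal draws
# using its Cholesky factor, then inspect the achieved correlations.

desired = np.array([[1.0, 0.6, 0.4],
                    [0.6, 1.0, 0.8],
                    [0.4, 0.8, 1.0]])        # POR, PERMH, PERMV

rng = np.random.default_rng(0)
z = rng.standard_normal((5000, 3))
samples = z @ np.linalg.cholesky(desired).T  # correlated standard normals

achieved = np.corrcoef(samples, rowvar=False)
print(np.round(achieved, 3))
```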


Parameter Correlations

[Scatter plots of parameter pairs at correlation coefficients 0.0, 0.25, 0.50, and 0.75.]

Uncertainty Assessment (UA)

Objective functions can be evaluated using 2 methods for Monte Carlo simulation:

Response surface as a proxy to reservoir simulation

Running reservoir simulation

Running reservoir simulation is often too slow to evaluate the response, so the proxy method is often preferred

Situations where reservoir simulation should be run for Monte Carlo simulation:

You want to validate the MCS-proxy result

Building a proxy is not feasible, for example when multiple geostatistical realizations are used, or multiple HM models are used


    Uncertainty Assessment (UA)

How do we do this?

The RS is generated by creating an experimental design and providing a statistical distribution for each uncertain parameter

The process is the same as polynomial response surface modelling for sensitivity analysis

RBF neural network proxy models are also available for uncertainty assessment

A Monte Carlo simulation is then run to determine the NPV distribution
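A Python sketch of Monte Carlo on a proxy, assuming triangular priors over the DWOC, SORG, and SORW ranges shown earlier and a toy polynomial standing in for the fitted response surface (all coefficients are invented):

```python
import numpy as np

# Sample parameters from prior distributions and evaluate a cheap proxy
# instead of the simulator; summarize the resulting NPV distribution.

def proxy_npv(dwoc, sorg, sorw):
    # Toy polynomial standing in for a fitted response surface
    return 18.0 + 0.8*(dwoc - 82.0) - 25.0*(sorg - 0.30) - 15.0*(sorw - 0.30)

rng = np.random.default_rng(42)
n = 100_000
dwoc = rng.triangular(81.0, 82.0, 83.0, size=n)   # assumed triangular priors
sorg = rng.triangular(0.25, 0.30, 0.35, size=n)
sorw = rng.triangular(0.25, 0.30, 0.35, size=n)

npv = proxy_npv(dwoc, sorg, sorw)
print(np.percentile(npv, [10, 50, 90]))           # P10 / P50 / P90 of NPV
```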

    UA Using MCS-Proxy Engine

Handles problematic simulation runs

Specify the desired accuracy; the engine will create and run experiments as needed.


User-defined Study Type

Manual Engine

All experiments are to be created by the user explicitly through:

Classical experimental design

Latin hypercube design

Manual entry

External Engine

Allows the use of the user's own optimization algorithm.


    When to Use Manual Engine

Use classical experimental design for SA and UA

Precise control over the number of Latin hypercube experiments

Run additional experiments after a SA/UA/HM/OP run is complete

Create new experiments using the user's own optimization algorithm

    Working With CMOST


    Main CMOST Components

Base Files

To begin a CMOST project, a completed simulation dataset (.dat) along with its Simulation Results (SR2: .mrf, .irf) files are required

CMOST Project

A CMOST project is the main CMOST file and can contain multiple related studies

CMOST 2013 File System

Project Name: SAGD_2D_UA

Project File: SAGD_2D_UA.cmp

Project Folder: SAGD_2D_UA.cmpd

Best practice: all files related to the project should be stored in the project folder.


    Main CMOST Components

CMOST Study

A CMOST study contains all of the input information for CMOST to run a particular type of task

Information can be copied between studies

Study types can be easily switched

The new study type will use as much information from the previous study type as possible

CMOST 2013 File System (cont'd 1)

Study Name: BoxBen

Study File: BoxBen.cms

Study File Auto Backup: BoxBen.bak

Study Folder: BoxBen.cmsd

Don't modify or delete files in the study folder unless you know what you're doing.


CMOST 2013 File System (cont'd 2)

Vector data repository file: *.vdr

The VDR stores compressed simulation data required for objective function calculations (a subset of the SR2 results)

Never modify or delete .vdr files manually

    Reusing and Restarting a Study

After you run an engine, you can go back and change the input data. Experiments inside a study will be automatically reused.

If new parameters are added, you need to resolve the reuse-pending experiments.

After you finish the changes, click Start Engine to restart.

[Screenshot: auto synchronization and auto/manual reprocessing options.]


    Reusing Data from Another Study

    Licensing Multiplier

CMOST uses only partial licenses when running simulations

E.g. run 2 STARS simulations while using only 1 STARS license

Applies to other license types (Parallel, Dynagrid, etc.)

License multipliers: IMEX 4:1, GEM 2:1, STARS 2:1


CMOST Input Data Features

Quality data in, quality results out

Further Assistance

Email: [email protected]

Zip an entire project or selected studies

Email or FTP the zip file to CMG


Diagnostic Zip for Support Request


    Any Questions?

    Now Hands on Work