1. WAVELET-PLS REGRESSION: Application to Oil Production Data
Salwa BenAmmou, Zied Kacem, Hédi Kortas and Zouheir Dhifaoui
Computational Mathematics Laboratory

2. Introduction
Statisticians are often confronted with several problems, such as missing or incomplete data, strong collinearity between the explanatory variables, or cases where the number of variables exceeds the number of observations. The PLS method was proposed by Wold in the 1980s to cope with these problems.

3. Introduction
In practical applications, however, we are also confronted with noise affecting the dataset. The noise component can strongly degrade the adjustment quality and the predictive performance of the PLS model.

4. Objective
We propose a hybrid data analysis method that combines wavelet thresholding techniques with PLS regression in order to remove or attenuate the effect of the noise.

5. Wavelet Theory: Multiresolution Analysis (MRA)
An MRA is a sequence $(V_j)_{j \in \mathbb{Z}}$ of closed subspaces of $L^2(\mathbb{R})$ satisfying:
(i) $V_j \subset V_{j+1}$ for all $j$;
(ii) $f \in V_j \Leftrightarrow f(2\,\cdot) \in V_{j+1}$;
(iii) $\bigcap_j V_j = \{0\}$ and $\overline{\bigcup_j V_j} = L^2(\mathbb{R})$.
There exists a function $\phi \in V_0$ such that $\{\phi(\cdot - k) : k \in \mathbb{Z}\}$ is an orthonormal basis (ONB) of $V_0$; $\phi$ is called the scaling function.

6. Wavelet Theory: Basic concepts
The scaling function is such that $\{\phi_{jk} := 2^{j/2}\,\phi(2^j \cdot - k) : k \in \mathbb{Z}\}$ is an ONB of $V_j$. Let $W_j$ be the orthogonal complement of $V_j$ in $V_{j+1}$, so that $V_{j+1} = V_j \oplus W_j$. There exists a function $\psi \in W_0$ such that $\{\psi(\cdot - k) : k \in \mathbb{Z}\}$ is an ONB of $W_0$; $\psi$ is called the wavelet function and satisfies: $\{\psi_{jk} := 2^{j/2}\,\psi(2^j \cdot - k) : k \in \mathbb{Z}\}$ is an ONB of $W_j$.
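
A standard concrete example (ours, not on the original slides) is the Haar MRA, in which $V_j$ is the space of functions constant on the dyadic intervals $[k2^{-j}, (k+1)2^{-j})$. Its scaling and wavelet functions are
$$\phi(x) = \mathbf{1}_{[0,1)}(x), \qquad \psi(x) = \mathbf{1}_{[0,1/2)}(x) - \mathbf{1}_{[1/2,1)}(x).$$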

7. Wavelet Theory: Basic concepts
Thus a function $f$ has a unique representation as a convergent series in $L^2$:
$$f(x) = \sum_k c_{0k}\,\phi_{0k}(x) + \sum_{j \ge 0}\sum_k d_{jk}\,\psi_{jk}(x) \qquad (1)$$
where $c_{0k} = \int f(x)\,\phi_{0k}(x)\,dx$ and $d_{jk} = \int f(x)\,\psi_{jk}(x)\,dx$.

8. Wavelet thresholding
The thresholding strategy consists of three steps:
- Apply the DWT to the observed data sequence to produce a set of scale-wise approximation and detail coefficients.
- Keep the detail coefficients $d_{jk}$ that lie above a fixed threshold level and set to zero those that lie below it.
- Reconstruct the signal with the inverse transform.

9. Thresholding techniques
Soft thresholding: $\delta^S_\lambda(x) = \mathrm{sign}(x)\,(|x| - \lambda)$ if $|x| > \lambda$, and $0$ if $|x| \le \lambda$.
Hard thresholding: $\delta^H_\lambda(x) = x$ if $|x| > \lambda$, and $0$ if $|x| \le \lambda$.
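
As an illustration (ours, not from the slides), the two rules can be written in a few lines of NumPy:

```python
import numpy as np

def soft_threshold(x, lam):
    # Shrink the surviving coefficients toward zero by lam;
    # coefficients with |x| <= lam are set to zero.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def hard_threshold(x, lam):
    # Keep coefficients with |x| > lam unchanged; zero out the rest.
    return np.where(np.abs(x) > lam, x, 0.0)
```

For example, with $\lambda = 1$, soft_threshold maps (-2.0, 0.3, 1.5) to (-1.0, 0.0, 0.5), while hard_threshold maps it to (-2.0, 0.0, 1.5).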

10. The linear wavelet estimator $\hat{f}$ of the function $f$ is obtained by plugging the empirical coefficients
$$\hat{c}_{jk} = \frac{1}{n}\sum_{i=1}^{n} Y_i\,\phi_{jk}(X_i) \quad \text{and} \quad \hat{d}_{jk} = \frac{1}{n}\sum_{i=1}^{n} Y_i\,\psi_{jk}(X_i) \qquad (2)$$
into the expansion (1), where the $\hat{d}_{jk}$ are the thresholded wavelet detail coefficients and the $\hat{c}_{jk}$ are the approximation coefficients.

11. PLS regression (Partial Least Squares Regression)
- PLS regression links a set of dependent variables Y to a set of numerical or categorical explanatory variables X through successively built latent components.
- It is often used to handle highly correlated regressors.
- It is of great interest when the number of predictors greatly exceeds the number of observations.
- It can also cope with missing data.

12. PLS1 regression
Univariate PLS regression (PLS1) links a single dependent variable $Y$ to a set of numerical or categorical explanatory variables $X_1, \dots, X_k$. The PLS1 algorithm involves several steps:
- Construction of the first PLS component: $t_1 = w^*_{11} X_1 + \dots + w^*_{1k} X_k$.
- Normalisation of the coefficients $w_{1j}$:
$$w^*_{1j} = \frac{w_{1j}}{\sqrt{\sum_{j=1}^{k} w_{1j}^2}} \qquad (3)$$

13. PLS1 regression
- Perform an OLS regression of $Y$ on $t_1$: $Y = c_1 t_1 + Y_1$, where $c_1$ is the regression coefficient and $Y_1$ the residual vector. Therefore:
$$Y = c_1 w^*_{11} X_1 + \dots + c_1 w^*_{1k} X_k + Y_1$$
- If the model has limited explanatory power, we search for a second component $t_2$ that is uncorrelated with $t_1$ and explains the residual vector well.

14. PLS1 regression
- The second component can be written as $t_2 = w_{21} x_{11} + \dots + w_{2k} x_{1k}$, where the $x_{1j}$ are the residuals of the regressions of the $X_j$ on $t_1$.
- We then perform a multiple regression of $Y$ on $t_1$ and $t_2$: $Y = c_1 t_1 + c_2 t_2 + Y_2$, with regression coefficients $c_1, c_2$ and residual vector $Y_2$.
- The number of components $t_h$ to retain is determined by cross-validation.
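
The steps above can be sketched as a NIPALS-style PLS1 routine. This is our illustration, not the authors' code; the data are centered internally and the variable names mirror the slides:

```python
import numpy as np

def pls1(X, y, n_components):
    """Sketch of PLS1 (slides 12-14): successive components t_h built from
    normalised weight vectors, each followed by an OLS step and deflation."""
    Xh = X - X.mean(axis=0)      # centered predictors
    yh = y - y.mean()            # centered response; becomes the residual Y_h
    W, P, C = [], [], []
    for _ in range(n_components):
        w = Xh.T @ yh                # covariances of the x-residuals with y_h
        w /= np.linalg.norm(w)       # normalisation, as in equation (3)
        t = Xh @ w                   # PLS component t_h
        c = (t @ yh) / (t @ t)       # OLS regression of the residual on t_h
        p = Xh.T @ t / (t @ t)       # X loadings
        Xh = Xh - np.outer(t, p)     # x-residuals feed the next component
        yh = yh - c * t              # residual vector Y_h
        W.append(w); P.append(p); C.append(c)
    W, P = np.column_stack(W), np.column_stack(P)
    # Coefficients expressing y_hat as a linear combination of the X columns
    return W @ np.linalg.solve(P.T @ W, np.array(C))
```

Predictions are then ŷ = (X - X̄)B + ȳ, which unfolds into a linear equation in the original variables of the form shown on slide 21.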

15. Wavelet-PLS
[Diagram slide: the observed series are first denoised by wavelet thresholding; PLS regression is then run on the denoised data.]

16. Application
- The response variable Y is the daily crude oil (petroleum) production, in barrels, of an oil field composed of four wells, over the period May 1, 2003 to March 31, 2006, i.e. 1,024 observations. The measurements are made on a daily basis.
- Y depends on 16 explanatory variables:
- Choke_i: the choke valve position in oil well i, i = 1, ..., 4.
- FTHP_i: the Flowing Tubing Head Pressure of well i (in bars), i = 1, ..., 4.

17. Application (continued)
- Pres at Choke_i: the pressure at the choke of well i (in bars), i = 1, ..., 4.
- WC_i (water cut): the percentage of water, i.e. the ratio of the water produced to the total volume of liquids extracted from well i, i = 1, ..., 4.

18. Wavelet thresholding set-up
- We use a Daubechies compactly supported wavelet with 5 vanishing moments.
- The discrete wavelet transform is truncated at scale j = 5.
- We opt for soft thresholding (see the sketch below).
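
With the PyWavelets package this set-up can be sketched as follows. The universal threshold λ = σ̂√(2 ln n) is our assumption, since the threshold level is not specified above:

```python
import numpy as np
import pywt

def denoise(signal, wavelet="db5", level=5):
    """Soft-threshold denoising: Daubechies wavelet with 5 vanishing
    moments ('db5'), DWT truncated at scale j = 5, as in the set-up above."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Noise scale estimated by the MAD of the finest detail coefficients,
    # then the universal threshold (our assumption, not stated in the slides).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    lam = sigma * np.sqrt(2.0 * np.log(len(signal)))
    # Soft-threshold the detail coefficients; keep the approximation as is.
    coeffs[1:] = [pywt.threshold(d, lam, mode="soft") for d in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)
```

In the Wavelet-PLS scheme, each of the 16 explanatory series and the production series Y would be denoised this way before running PLS1.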

  19. Signal before (green) and after thresholding (black)

20. Specification of the number of components by cross-validation

  h (components) | Wavelet-PLS Q² | PLS1 Q² | Limit
  1              | 0.734          | 0.742   | 0.0975
  2              | 0.287          | 0.319   | 0.0975
  3              | 0.237          | 0.393   | 0.0975
  4              | 0.266          | 0.186   | 0.0975
  5              | 0.0663         | 0.038   | 0.0975

A component h is significant when its Q² value exceeds the 0.0975 limit.
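
For reference, the Q² column can be computed along the following lines. This is a sketch using scikit-learn's PLSRegression with leave-one-out prediction; the authors' exact cross-validation scheme is not specified in the slides:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

def q2_scores(X, y, max_components=5):
    """Q2_h = 1 - PRESS_h / RSS_{h-1}; component h is kept if Q2_h >= 0.0975."""
    rss_prev = np.sum((y - y.mean()) ** 2)       # RSS of the 0-component model
    scores = []
    for h in range(1, max_components + 1):
        pls = PLSRegression(n_components=h)
        y_cv = cross_val_predict(pls, X, y, cv=LeaveOneOut()).ravel()
        press = np.sum((y - y_cv) ** 2)          # predicted residual sum of squares
        scores.append(1.0 - press / rss_prev)
        rss_prev = np.sum((y - pls.fit(X, y).predict(X).ravel()) ** 2)
    return scores
```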

21. The PLS1 equation before thresholding:
ŷ = 0.14745477 x1 + 0.12351255 x2 + 0.29458188 x3 + 0.16206525 x4 - 0.27695889 x5 + 0.03891265 x6 - 0.1728005 x7 - 0.14108841 x8 + 0.28230372 x9 + 0.27352113 x10 + 0.23676341 x11 + 0.08288938 x12 + 0.01417857 x13 - 0.19398681 x14 - 0.00272167 x15 + 0.00767741 x16

The PLS1 equation after thresholding:
ŷ = 0.076750638 x1 + 0.073312704 x2 + 0.314558779 x3 + 0.116011568 x4 - 0.268962544 x5 + 0.002680218 x6 - 0.124656262 x7 - 0.254468339 x8 + 0.338198727 x9 + 0.317734483 x10 + 0.277136406 x11 + 0.053406536 x12 - 0.028291771 x13 - 0.13101302 x14 - 0.028039241 x15 - 0.01063908 x16

22. Outliers
- PLS1 before thresholding: 9.6% of the total sample are regarded as outliers.
- PLS1 after thresholding: 8.7% of the observations are regarded as outliers.

23. Confidence ellipsoids
[Figure: score plots with confidence ellipsoids for the denoised data and for the raw data.]

24. Goodness of fit
The R_2 values (Wavelet-PLS model) are much closer to zero than the R_1 values (PLS1 model on the raw data). This shows the effectiveness of the wavelet techniques for noise removal.

25. Mean Squared Errors
The MSE_2 values (Wavelet-PLS) are clearly much smaller than the MSE_1 values (PLS1). This confirms the relevance of the Wavelet-PLS method.

26. Conclusion
The Wavelet-PLS approach allowed us to:
- reduce the number of outliers,
- reduce the mean squared error,
- correct the observations in the score plot,
- improve the goodness of fit of the model.

  27. Thanks
