
[amibroker] Sample Data Range, Sampling Rate, and Window Effect on AutoRegressive Series was Polynomial Trendlines




I am cross-posting this here as well as in AB-TS, but please reply in TS so the
discussion can continue there.

3. Windowing - I don't know much here, but when you say no window, I think it's
the same as a square (rectangular) window, where the signal falls off completely
outside the range. I also know we could use a triangular window, where the signal
falls off gradually when S < BegBar and S > EndBar. I am still not familiar with
the Hamming window even after reading that link, but I can take your word that
it makes no difference until the other things are further along.
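
To make the square vs triangular idea concrete, here is a rough sketch of the
weights over the fit range (my own illustration, not tested, just using the same
range-selection convention as the code further down):

// square vs triangular window weights over the selected fit range
BI      = BarIndex();
wBeg    = BeginValue(BI);
wEnd    = EndValue(BI);
N       = wEnd - wBeg + 1;
centre  = (wBeg + wEnd) / 2;
// square (rectangular) window: weight 1 inside the range, 0 outside
wSquare = IIf(BI >= wBeg AND BI <= wEnd, 1, 0);
// triangular window: 0 at the edges, rising to 1 at the centre of the range
wTri    = IIf(BI >= wBeg AND BI <= wEnd, 1 - abs(BI - centre) / (N / 2), 0);
// the detrended series would then be multiplied by the chosen weights before fitting, e.g. Y * wTri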

1. Number of samples - So the concept is to find the longest possible sample that
still gives a stationary signal. Could the test criterion be S(f)(N) == S(f)(N'),
i.e. keep increasing the sample size until the spectrum converges? Just thinking
out loud here. I think there is a lot of potential to get a better
prediction/spectrum with the "optimum" sample length.
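
Still thinking out loud, a sketch of what that test might look like. This is only
an illustration: it reuses dY_full, M_Order, method and auto from the code further
down, needs the same dll, assumes the dll really does fill the global Ar[] array as
described in the header, and compares the returned noise variance as a cheap
stand-in for comparing the whole spectrum; the step and tolerance are arbitrary.

// grow the sample window until the AR fit stops changing
tol    = 0.01;                      // relative convergence tolerance (arbitrary)
step   = 20;                        // bars added per pass (arbitrary)
last   = LastValue(BarIndex());
prevNV = 1e10;
bestN  = 0;
for (n = 100; n <= 1000; n = n + step)
{
    ySel = IIf(BarIndex() > last - n, dY_full, Null);     // last n bars only
    fit  = AutoReg(ySel, dY_full, 0, 0, M_Order, method, auto);
    nv   = Ar[M_Order + 2];                               // noise variance returned by the dll
    if (abs(nv - prevNV) < tol * abs(prevNV)) { bestN = n; break; }
    prevNV = nv;
}
printf("converged at n = %g bars\n", bestN);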

2. I use TimeFrameCompress and TimeFrameExpand to do my downsampling, and it is
reasonably successful. What I do is compress the data series before sampling, do
all the work, then reconstruct the series and expand the reconstructed series once
interpolation and extrapolation are complete. The AFL below illustrates the
technique used.

I am sorry you can't run the AFL directly (it needs a dll), but you can see the
logic there. The advantage is that no spline interpolation is required, as AB does
it for you. Of course this only works for a constant sampling rate. One way to do
variable-rate sampling is to sample P&F or Kagi charts instead of the price data
directly.
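
As a minimal, dll-free sketch of just the compress -> work -> expand round trip
(my own illustration, with an EMA standing in for the real processing):

// compress daily data into 3-day bars, do the work there, expand back
tf   = 3 * inDaily;                      // the resampling interval
cC   = TimeFrameCompress(Close, tf);     // downsample
cOut = EMA(cC, 5);                       // stand-in for the real work done in the compressed frame
Out  = TimeFrameExpand(cOut, tf);        // AB expands it back to the base interval for you
Plot(Close, "Close", colorDefault, styleLine);
Plot(Out, "Processed in 3-day frame", colorRed, styleThick);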

 

// *********************************************************
// *
// * AFL Function for nth Order Maximum Entropy Method
// *     Calls AutoReg( strings.dll)(SampleData, PopulationData, ExtraB, ExtraF, degree, method, AutoOrdering)
// *                              Method: 0=Mesa, 1=least square
// *                              AutoOrdering: 1=Enabled, 0=Disabled
// *     Returns the interpolated and predicted series
// *     Also returns the AR coefficients AND noise variance as Ar[degree + 2], optimized order as Ar[degree + 3] and
// *     startbar as Ar[degree + 4] such that the first value prior to detrending is Y_full[startbar - 1]
// *     Y_full      = The array to fit
// *     Y_select    = Selected range of Y_full
// *     BegBar = Beginning bar of the range to fit
// *     EndBar = Ending bar of the range to fit
// *     Order  = 1 - 100 = Order of MEM (integer)
// *     ExtraB = Number of bars to move back, doesn't extrapolate (backward)
// *     ExtraF = Number of bars to extrapolate (forward)
// *
// *********************************************************

function CT3(price, periods, s) {
    e1 = EMA(price, periods);
    e2 = EMA(e1, periods);
    e3 = EMA(e2, periods);
    e4 = EMA(e3, periods);
    e5 = EMA(e4, periods);
    e6 = EMA(e5, periods);
    c1 = -s*s*s;
    c2 = 3*s*s + 3*s*s*s;
    c3 = -6*s*s - 3*s - 3*s*s*s;
    c4 = 1 + 3*s + s*s*s + 3*s*s;
    Ti3 = c1*e6 + c2*e5 + c3*e4 + c4*e3;
    return Ref(Ti3, periods/2);   // shift back by periods/2 bars to centre the smoothing (references future bars)
}

 

Plot((H+L)/2, "Median", colorRed, styleNoDraw|styleNoRescale|styleOwnScale|styleNoTitle);

 

slope       = Param("Slope", 0.7, 0, 3, 0.01);
pds         = Param("pds", 5, 1, 10, 1);
type        = Param("type", 0, 0, 7, 1);
P           = ParamField("Price field", -1);
SamplePds   = Param("Resampling Size", 1, 1, 5, 1);
M_Order     = Param("nth Order",             8, 1, 20, 1);
M_ExtraB    = Param("Extrapolate Backwards", 0, 0, 50, 1);
M_ExtraF    = Param("Extrapolate Forwards",  0, 0, 50, 1);
method      = ParamToggle("Burg/least-square", "Burg|Least Square", 0);
auto        = ParamToggle("Auto Order Seeking", "No|Yes");

BI          = BarIndex();
M_BegBar    = BeginValue(BI);
M_EndBar    = EndValue(BI);

Y_full      = CT3(P, pds, slope);
dY_full     = detrend(Y_full);              // detrend() is supplied by the dll
Y_select    = IIf(BI < M_BegBar, Null, IIf(BI > M_EndBar, Null, Y_full));

// compress to the resampled time frame, detrend and fit there...
cY_full     = TimeFrameCompress(Y_full, SamplePds * inDaily);
cY_select   = TimeFrameCompress(Y_select, SamplePds * inDaily);
dcY_full    = detrend(cY_full);
dcY_select  = detrend(cY_select);
dcYn        = AutoReg(dcY_select, dcY_full, M_ExtraB, M_ExtraF, M_Order, method, auto);

// ...then expand the fitted/predicted series back to the base time frame
dY_select   = TimeFrameExpand(dcY_select, SamplePds * inDaily);
rdYn        = TimeFrameExpand(dcYn, SamplePds * inDaily);

Plot(dY_select, "dY_select", colorLightGrey, styleLine);
Plot(rdYn, _DEFAULT_NAME() + NumToStr(Ar[M_Order + 3], 2.0),
     IIf(BI > M_EndBar - SamplePds * M_ExtraF, colorRed,
     IIf(BI < M_BegBar - SamplePds * M_ExtraF, colorRed, colorBrightGreen)),
     styleThick, Null, Null, SamplePds * M_ExtraF);

Filter = 1;
AddColumn(Ar, "AR", 1.6);
AddColumn(BI, "BI", 1.0);
AddColumn(auto, "auto", 1.0);

 

 

  _____  

From: amibroker@xxxxxxxxxxxxxxx [mailto:amibroker@xxxxxxxxxxxxxxx] On Behalf
Of Tom Tom
Sent: Saturday, 18 November 2006 4:35 AM
To: amibroker@xxxxxxxxxxxxxxx
Subject: Re: [amibroker] Re: Polynomial Trendlines

Hi,

So... hummm ... maybe :

3- I implemented Hamming windowing for Burg on the reflection coefficients (from
the paper source I sent in a past mail). I have to test it now.
But theoretically the Burg method doesn't need windowing, so we will see the
difference. With windowing the noise variance won't be minimised (it is minimised
in the no-windowing case), so we are no longer sure of having the best fit:
http://sepwww.stanford.edu/public/docs/sep65/gilles1/paper_html/node14.html
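
Just to make the weights themselves concrete, here is a throwaway sketch of the
standard Hamming formula and nothing more; how they get applied to the reflection
coefficients follows the paper and is not shown here, and N is arbitrary:

// standard N-point Hamming window, laid out over the first N bars
N    = 50;                               // window length (assumption)
pi   = 4 * atan(1);
n    = BarIndex() - BeginValue(BarIndex());
wHam = IIf(n >= 0 AND n < N, 0.54 - 0.46 * cos(2 * pi * n / (N - 1)), 0);
Plot(wHam, "Hamming weights", colorBlue, styleLine);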

2- Maybe we can test downsampling the data and interpolating them, as Fred says he
does for his TrigFit. But there may be a slight problem, for example: taking into
account a not-so-useful quote (the last quote of a consolidating period, say) and
dismissing the very next quote, which is an important one (a big price improvement
with high volume, for example).
So maybe we have to do non-linear downsampling by keeping only the data that
dominate in importance (volume, new price...), then make a spline interpolation
through those points. This can be a good procedure because it is iterative, and so
no information is lost if many different sample periods are taken.
Fred, how do you handle this phase in TrigFit (downsampling + interpolation)? How
does it compare with a classic moving average?
The way I chose for now is to take a moving average with a high-order low-pass
filter directly (that is why I chose T3).
Noise variance can be a measurement for comparing the different methods, but I
think the fit with fewer samples will be better (because there are fewer samples
to fit), while the prediction will be worse because some important information may
be lost. Too many artifacts will be added (the spectrogram will be very different
if downsampling is done).
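
A very rough sketch of that "keep only the dominant quotes" idea; the selection
rule here is only an arbitrary example (unusually high volume or a fresh 20-bar
extreme), not a recommendation:

// non-linear downsampling: keep only the "important" bars, Null elsewhere
important = Volume > 2 * MA(Volume, 50)
            OR H >= HHV(H, 20)
            OR L <= LLV(L, 20);
sparse = IIf(important, C, Null);
// a spline (or the AutoReg interpolation) would then be fitted through the non-Null points of "sparse"
Plot(C, "Close", colorDefault, styleLine);
Plot(sparse, "kept quotes", colorRed, styleDots);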

1- The last parameter.... the only one, hehe. Like a moving average or many
indicators... the period over which we make the indicator work.
Euh... heuu... heh : )
Maybe if we go back to the roots of AR modelling... it is said: the signal must be
stationary. So we have to choose a period not too long, so the signal stays
stationary, and not too short, so we can still find some frequency!
Some ideas:
minimum = 4 (because it is difficult to draw one period of a sine with fewer than
4 points... ?)
maximum = some criterion to test stationarity... (but those criteria will need a
look-back period too, heh!! : )) )
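
One crude way to put a number on "not too long": compare the two halves of a
candidate lookback and see whether the mean and dispersion have drifted. The
length and the 0.5 thresholds below are completely arbitrary, it is only a sketch:

// crude rolling stationarity check on a candidate period
len  = 100;
half = len / 2;
m1   = MA(Ref(C, -half), half);          // mean of the older half
m2   = MA(C, half);                      // mean of the recent half
s1   = StDev(Ref(C, -half), half);
s2   = StDev(C, half);
looksStationary = abs(m2 - m1) < 0.5 * s1 AND abs(s2 - s1) < 0.5 * s1;
Plot(looksStationary, "stationary?", colorGreen, styleHistogram|styleOwnScale, 0, 1);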

Cheers,
Mich

----- Original Message -----
From: Paul Ho
To: amibroker@xxxxxxxxxxxxxxx
Sent: Friday, November 17, 2006 3:26 PM
Subject: RE: [amibroker] Re: Polynomial Trendlines

Thanks Mich for the info.
So we have a mechanism to optimize the order of the AR estimator. There remain a
couple of interesting areas that would affect the performance of this linear
predictor:
1. The number of samples
2. The sample period
3. Windows
For 1 and 2, would noise variance still be the measure to minimise?
Any thoughts?
Paul.

From: amibroker@xxxxxxxxxxxxxxx [mailto:amibroker@xxxxxxxxxxxxxxx] On Behalf
Of Tom Tom
Sent: Thursday, 16 November 2006 12:28 PM
To: amibroker@xxxxxxxxxxxxxxx
Subject: Re: [amibroker] Re: Polynomial Trendlines

rmserror is the estimator of the (theoretically white, if the AR fit is good)
noise variance.
It is computed recursively, as you state, with:
NoiseVariance[i] = NoiseVariance[i-1] * (1 - K[i]^2)
where i is the number of the current iteration and K[i] is the reflection
coefficient.
For i = 0 (before beginning the iteration from i = 1 to P, P being the final order
desired for the AR),
NoiseVariance[0] = Autocorrelation_data[0];

This result comes from the Durbin-Levinson algorithm, which is used for the Burg
and Yule-Walker methods.
The Durbin-Levinson algorithm gives, by recursion, the reflection coefficients and
the noise variance.

From this noise variance you can compute the AR order-selection criteria at each
order during the recursion (FPE, etc.).
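
Written out as a loop, a toy illustration of that recursion: the reflection
coefficients and the sample count below are made up, and FPE is shown in one of
its common forms, so treat it as a sketch rather than the real algorithm:

// noise-variance recursion and FPE order selection, driven by example K[i]
P  = 5;                                  // final AR order
Nn = 500;                                // number of data points (assumption)
K  = BarIndex() * 0;                     // zero array used only as a container for the example coefficients
K[1] = 0.8; K[2] = -0.3; K[3] = 0.1; K[4] = 0.05; K[5] = -0.02;
nv = 1.0;                                // NoiseVariance[0] = Autocorrelation_data[0], normalised to 1 here
for (i = 1; i <= P; i++)
{
    nv  = nv * (1 - K[i] ^ 2);                  // NoiseVariance[i] = NoiseVariance[i-1] * (1 - K[i]^2)
    fpe = nv * (Nn + i + 1) / (Nn - i - 1);     // Akaike FPE at order i
    printf("order %g: noise variance %g, FPE %g\n", i, nv, fpe);
}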

Your formula doesn't look right, because the reflection coefficients K are not
multiplied by anything!?

To take Numerical Recipes as an example
(http://www.nrbook.com/a/bookfpdf/f13-6.pdf):

/* Compute Autocorrelation[0] from the data and store it in xms */
p=0.
do 11 j=1,n
  p=p+data(j)**2
11 continue
xms=p/n

/* during the recursion, the update is done with */
xms=xms*(1.-d(k)**2)
/* where d(k) is the latest reflection coefficient at the k-th iteration */

Hope it helps.

Cheers,
Mich.

----- Original Message -----
From: Paul Ho
To: amibroker@xxxxxxxxxxxxxxx
Sent: Wednesday, November 15, 2006 11:55 PM
Subject: RE: [amibroker] Re: Polynomial Trendlines

Yes Mich, I noticed that as well. In addition:
Currently, memcof seems to calculate the rmserror as sum(data^2) - sum(1 -
reflection Coeff^2).
Is this valid? If not, what do you use to calculate it recursively?
Cheers
Paul.

From: amibroker@xxxxxxxxxxxxxxx [mailto:amibroker@xxxxxxxxxxxxxxx] On Behalf
Of Tom Tom
Sent: Thursday, 16 November 2006 7:56 AM
To: amibroker@xxxxxxxxxxxxxxx
Subject: Re: [amibroker] Re: Polynomial Trendlines

Hi !

Thanks Paul !
It is about the same for MEM, yes. I found a way to compute it during the
recursive process (as you describe).
I made a comparison between the MEM in Numerical Recipes and the formula I derived
from Burg's original recursive formulation.
In NR, the recurrent loop that computes Num and Den (used to calculate the
reflection coefficient k) runs from 1 to M-i (M is the number of quotes, i
increments from 1 to ORDER_AR). So for a high AR order, the most recent data are
not taken into consideration!? The same goes for updating the forward and backward
errors of the lattice filter: they only consider 1 to M-i.
The original Burg formula loops from i to M-1, so the last data points are always
there, even at high order.
-> memcof in Numerical Recipes doesn't respect the original algorithm.

I don't know why they do this in the NR MEM algorithm!? I can't find any source
stating that taking [1:M-i] (memcof NR) is better than [i:M-1] (original Burg).
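
To make the difference concrete, a tiny sketch of the two index conventions for
the Num/Den sums at iteration i (dummy data, not memcof itself, and ef/eb are just
stand-ins for the forward/backward lattice errors at that iteration):

// index-range difference between NR memcof and the original Burg formulation
M = 20;  i = 5;
ef = BarIndex() * 0;  eb = BarIndex() * 0;       // containers for the dummy error values
for (j = 0; j < M; j++) { ef[j] = sin(j); eb[j] = cos(j); }
// NR memcof convention: sum over 1 .. M-i, so the last i points drop out
numNR = 0; denNR = 0;
for (j = 1; j <= M - i; j++) { numNR = numNR + ef[j] * eb[j]; denNR = denNR + ef[j]^2 + eb[j]^2; }
// original Burg convention: sum over i .. M-1, so the most recent points stay in
numB = 0; denB = 0;
for (j = i; j <= M - 1; j++) { numB = numB + ef[j] * eb[j]; denB = denB + ef[j]^2 + eb[j]^2; }
printf("k (NR indexing) = %g\nk (Burg indexing) = %g\n", 2*numNR/denNR, 2*numB/denB);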

Mich.


 




