Let's continue this at AB-TS ... Since that's where it belongs ...
--- In amibroker@xxxxxxxxxxxxxxx, "Tom Tom" <michel_b_g@xxx> wrote:
>
> Hi,
>
> So... hummm... maybe:
>
> 3- I implemented Hamming windowing for Burg on the reflection coefficients
> in the function (from the paper source I sent in a past mail). I still have
> to test it.
> But in theory the Burg method doesn't need windowing, so we will see the
> difference. The noise variance won't be minimised; it is only minimised in
> the no-windowing case. With windowing, we are no longer sure to get the
> best fit:
> http://sepwww.stanford.edu/public/docs/sep65/gilles1/paper_html/node14.html
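[A rough sketch of the windowing idea, for illustration only: the post applies the window inside the Burg recursion per the cited paper, whereas this simply tapers a data segment before fitting. The helper name is made up.]

```python
import numpy as np

def hamming_taper(x):
    """Taper a data segment with a Hamming window before AR fitting.

    Illustrative only: the post describes windowing inside the Burg
    recursion itself; here we just show the classic data taper.
    """
    n = len(x)
    j = np.arange(n)
    w = 0.54 - 0.46 * np.cos(2.0 * np.pi * j / (n - 1))
    return x * w
```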
>
> 2- Maybe we can test downsampling the data and interpolating them, as Fred
> suggests doing for his TrigFit. But there may be some subtle problems, for
> example: taking into account a not-useful quote (the last quote of a
> consolidating period, for example) while dismissing the very next quote,
> which is an important one (a big price move on high volume, for example).
> So maybe we have to do non-linear downsampling by keeping only the dominant
> data points in importance (volume, new price...), then fit a spline
> interpolation through those points. This could be a good procedure because
> it is iterative and so doesn't lose information even if many different
> sample periods are used.
> Fred, how do you handle this phase in TrigFit (downsampling +
> interpolation)? How does it compare versus a classic moving average?
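[A minimal sketch of the non-linear downsampling idea above. The helper names are hypothetical, volume alone stands in for the importance measure, and linear interpolation stands in for the spline the post suggests.]

```python
import numpy as np

def downsample_by_importance(t, price, volume, keep=0.5):
    """Keep only the 'dominant' bars -- here ranked by volume alone,
    as a stand-in for the importance measure (volume, new price...)."""
    k = max(2, int(len(t) * keep))
    idx = np.sort(np.argsort(volume)[-k:])  # top-volume bars, back in time order
    return t[idx], price[idx]

def reinterpolate(t_full, t_kept, p_kept):
    """Map the kept points back onto the full time grid.  Linear for
    brevity; a spline (e.g. scipy.interpolate.CubicSpline) would slot
    in here to match the procedure described in the post."""
    return np.interp(t_full, t_kept, p_kept)
```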
> The way I chose for now is to take a moving average directly, using a
> high-order low-pass filter (that is why I chose T3).
> Noise variance can be a measurement to compare the different methods, but I
> think the fit with fewer samples will look better (fewer samples to fit),
> while the prediction will be worse, because we may lose some important
> information. Too many artifacts would be added (the spectrogram will be
> very different if downsampling is done).
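[For reference, a sketch of the T3 mentioned above, as commonly attributed to Tillson: three passes of a "generalized DEMA". The v=0.7 default and the simple seeded EMA are assumptions, not details from the thread.]

```python
import numpy as np

def ema(x, period):
    """Plain exponential moving average, seeded with the first sample."""
    a = 2.0 / (period + 1.0)
    out = np.empty_like(x)
    out[0] = x[0]
    for i in range(1, len(x)):
        out[i] = a * x[i] + (1.0 - a) * out[i - 1]
    return out

def t3(x, period, v=0.7):
    """Tillson's T3: three applications of the generalized DEMA
    GD(y) = (1+v)*EMA(y) - v*EMA(EMA(y)), giving a higher-order
    low-pass filter with reduced lag."""
    def gd(y):
        e1 = ema(y, period)
        return (1.0 + v) * e1 - v * ema(e1, period)
    return gd(gd(gd(x)))
```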
>
> 1- The last parameter... the only one, héhé. Like a moving average or many
> indicators: the period over which the indicator is computed.
> Euh... heuu... hé : )
> Maybe if we go back to the roots of AR modelling... It is said: the signal
> must be stationary. So we have to choose a period not too long, so that the
> signal is stationary, and not too short, so that we can still resolve some
> frequencies!
> Some ideas:
> minimum = 4 (because it is difficult to draw one period of a sinusoid with
> fewer than 4 points... ?)
> maximum = some criterion to test stationarity... (but those criteria will
> need a look-back period too, hé!! : )) )
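[One crude way to put a number on "not too long" -- an illustrative heuristic, not something proposed in the thread: compare the variance of the two halves of the candidate window and reject windows where they disagree wildly. The function name and the ratio limit are assumptions.]

```python
import numpy as np

def roughly_stationary(x, ratio_limit=4.0):
    """Crude stationarity screen: the variance of the two halves of
    the window should be of the same order of magnitude.  Real tests
    (ADF, KPSS, ...) are far more principled; this only illustrates
    the 'criterion needs a look-back period too' point."""
    h = len(x) // 2
    v1, v2 = np.var(x[:h]), np.var(x[h:])
    lo, hi = min(v1, v2), max(v1, v2)
    return hi / max(lo, 1e-12) < ratio_limit
```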
>
> Cheers,
> Mich
>
>
>
> ----- Original Message -----
> From: Paul Ho
> To: amibroker@xxxxxxxxxxxxxxx
> Sent: Friday, November 17, 2006 3:26 PM
> Subject: RE: [amibroker] Re: Polynomial Trendlines
>
>
> Thanks Mich for the info.
> So we have a mechanism to optimize the order of the AR estimator. There
> remain a couple of interesting areas that would affect the performance of
> this linear predictor:
> 1. The number of samples
> 2. The sample period
> 3. Windows
> For 1 and 2, would noise variance still be the measure to minimise?
> Any thoughts?
> Paul.
>
>
>
>
> From: amibroker@xxxxxxxxxxxxxxx [mailto:amibroker@xxxxxxxxxxxxxxx] On Behalf Of Tom Tom
> Sent: Thursday, 16 November 2006 12:28 PM
> To: amibroker@xxxxxxxxxxxxxxx
> Subject: Re: [amibroker] Re: Polynomial Trendlines
>
>
> rmserror is the estimator of the variance of the (theoretically, if the AR
> fit is good) white noise.
> It is computed recursively, as you state, with:
> NoiseVariance[i] = NoiseVariance[i-1] * (1 - K[i]^2)
> where i is the index of the current iteration and K[i] is the reflection
> coefficient.
> For i = 0 (before starting the iteration from i = 1 to P, P being the final
> order desired for the AR model):
> NoiseVariance[0] = Autocorrelation_data[0];
>
> This result comes from the Levinson-Durbin algorithm, which is used for
> both the Burg and Yule-Walker methods.
> The Levinson-Durbin recursion yields both the reflection coefficients and
> the noise variance.
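[The recursion described here can be sketched end-to-end; this is a minimal Python rendering under the same conventions: NoiseVariance[0] = r[0], then NoiseVariance[i] = NoiseVariance[i-1] * (1 - K[i]^2).]

```python
import numpy as np

def levinson_durbin(r, order):
    """Levinson-Durbin recursion on autocorrelations r[0..order].

    Returns (a, k, noise_var): the AR polynomial with a[0] = 1 (sign
    convention: sum_m a[m]*x[n-m] = e[n]), the reflection coefficients
    k[1..order], and the noise-variance track updated exactly as in
    the post: noise_var[i] = noise_var[i-1] * (1 - k[i]**2).
    """
    a = np.zeros(order + 1)
    a[0] = 1.0
    k = np.zeros(order + 1)
    noise_var = np.zeros(order + 1)
    noise_var[0] = r[0]
    for i in range(1, order + 1):
        acc = r[i]
        for m in range(1, i):
            acc += a[m] * r[i - m]
        k[i] = -acc / noise_var[i - 1]
        a_prev = a.copy()
        for m in range(1, i):
            a[m] = a_prev[m] + k[i] * a_prev[i - m]
        a[i] = k[i]
        noise_var[i] = noise_var[i - 1] * (1.0 - k[i] ** 2)
    return a, k[1:], noise_var
```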
>
> From this noise variance you can compute the AR order-selection criteria
> at each order during the recursion (FPE, etc.).
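[For concreteness, one common form of Akaike's FPE that can be evaluated at each step of the recursion. The exact variant differs between texts, so treat the formula below as an illustrative assumption rather than the one used in the thread.]

```python
def fpe(noise_var, n_samples, order):
    """Final Prediction Error for an AR(order) fit on n_samples points,
    using the noise-variance estimate from the Levinson-Durbin step.
    One common form; some texts use (N + p)/(N - p) instead."""
    return noise_var * (n_samples + order + 1) / (n_samples - order - 1)
```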
>
> Your formula doesn't look right, because the reflection coefficients K are
> not multiplied by anything!?
>
> Take Numerical Recipes as an example
> ( http://www.nrbook.com/a/bookfpdf/f13-6.pdf ):
>
> /* Compute Autocorrelation[0] from the data and store it in xms */
> p=0.
> do 11 j=1,n
> p=p+data(j)**2
> 11 continue
> xms=p/n
>
> /* during the recursion, the update is */
> xms=xms*(1.-d(k)**2)
> /* where d(k) is the last reflection coefficient at the k-th iteration */
>
> Hope it helps.
>
> Cheers,
> Mich.
>
> ----- Original Message -----
> From: Paul Ho
> To: amibroker@xxxxxxxxxxxxxxx
> Sent: Wednesday, November 15, 2006 11:55 PM
> Subject: RE: [amibroker] Re: Polynomial Trendlines
>
> Yes Mich, I noticed that as well. In addition,
> currently memcof seems to calculate the rmserror as
> sum(data^2) - sum(1 - reflectionCoeff^2).
> Is this valid? If not, what do you use to calculate it recursively?
> Cheers
> Paul.
>
> From: amibroker@xxxxxxxxxxxxxxx [mailto:amibroker@xxxxxxxxxxxxxxx] On Behalf Of Tom Tom
> Sent: Thursday, 16 November 2006 7:56 AM
> To: amibroker@xxxxxxxxxxxxxxx
> Subject: Re: [amibroker] Re: Polynomial Trendlines
>
> Hi !
>
> Thanks Paul !
> It is about the same for MEM, yes. I found a way to compute it during the
> recursive process (as you say).
> I compared MEM as coded in Numerical Recipes against the formula I derived
> from Burg's original recursive equations.
> In NR, the loop that computes Num and Den (used to calculate the reflection
> coefficient k) runs from 1 to M-i (M is the number of quotes, i increments
> from 1 to ORDER_AR). So for high AR orders, the most recent data are not
> taken into consideration!? The same goes for updating the forward and
> backward errors of the lattice filter: they only consider 1 to M-i.
> Burg's original formula loops from i to M-1, so the last data points are
> always included, even at high order.
> -> memcof in Numerical Recipes doesn't respect the original algorithm.
>
> I don't know why they do this in the NR MEM algorithm!? I can't find any
> source stating that taking [1:M-i] (NR memcof) is better than [i:M-1]
> (original Burg).
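[The two loop conventions being compared can be made concrete. Indices are 1-based as in the Fortran; both ranges have M-i terms, but only the second always keeps the most recent sample.]

```python
def nr_range(M, i):
    """Summation range used by Numerical Recipes' memcof at iteration i:
    j = 1 .. M-i, so the last i samples drop out as the order grows."""
    return list(range(1, M - i + 1))

def burg_range(M, i):
    """Range in Burg's original formulation as the poster reads it:
    j = i .. M-1, so sample M-1 (the most recent) is always included."""
    return list(range(i, M))
```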
>
> Mich.
>
>
>
>
>
>