Owen,
Don't feel bad... I made a similar mistake once using the
Zig-Zag function. It seems the Peak and Trough functions also use this
Zig-Zag function, and I spent too many hours trying to solve the very
problem you describe. If you solve it, please share your solution. :)
Adam Hefner.
----- Original Message -----
From: "Owen Davies" <owen@xxxxxxxxxxxxx>
To: <metastock@xxxxxxxxxxxxx>
Sent: Wednesday, October 24, 2001 2:58 PM
Subject: Peak and trough
> Among the many things I don't understand, this one has
> been bothering me of late:
>
> A while back, I decided to check one of my assumptions
> and test the higher-high, higher-low/lower-high, lower-low
> definition of trends. The easy way was to create a system
> using peak() and trough(). It worked beautifully. Virtually
> any contract I ran the system past, it made money. This
> I took to confirm the validity of the trend definition.
>
> Then the obvious dawned on me: Why not see whether there
> was enough of the move left, on average, to make a buck from it
> after the peak or trough was far enough behind us to get the
> signal in real time? I wrote another system that included a delay
> factor, so that one would enter or exit a trade only when the
> price had retraced from the peak or trough by the appropriate
> percentage. Again, it worked just fine. In historical testing, it
> made money like magic on anything from 5-minute to daily bars.
>
> Problem: When I put it on real-time data, it gave a lot of bad
> signals. Then it suddenly recalculated things, decided that the
> minor up and down trends of the last few weeks--this was
> on smallish intraday bars--had really been a long up trend,
> gave a new set of signals, and declared itself a winner.
>
> Does anyone understand these functions well enough to
> explain this behavior to me? I knew that peak() and trough()
> backdate their results by putting their signal several bars
> before it was possible to receive it; that is what I was trying
> to correct with the delay factor. Now it seems that they
> also recalculate their old percentages by comparing against
> the latest data rather than limiting themselves to the data
> that was available in real time.
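[One way to check any indicator for exactly this behavior is a simple
causality test: recompute it on successively longer prefixes of the data
and verify that values already emitted never change afterwards. This is a
generic diagnostic sketch in Python, not MetaStock code; the function name
and the toy indicators in the example are invented.]

```python
def repaints(indicator, prices):
    """Return True if indicator(prices[:k]) revises earlier output as k grows.

    `indicator` takes a price list and returns a list of values/signals.
    A non-repainting indicator's output on a prefix must be a prefix of
    its output on the full series.
    """
    prev = []
    for k in range(1, len(prices) + 1):
        cur = indicator(prices[:k])
        if cur[:len(prev)] != prev:
            return True          # earlier output was revised or removed
        prev = cur
    return False
```

[For example, an indicator that reports the highest price seen so far as a
"peak" repaints (the reported peak moves when a new high appears), while a
simple causal filter does not.]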
>
> No doubt this is a real beginner's mistake (despite having
> played with this for years), but it would have seemed
> reasonable to assume that a change of X% three weeks ago
> should remain X%, even if we looked at it later. This sort
> of thing has to be seen within its context, or it's useless.
> Is there some reason the functions have to be written this way,
> which I'm completely overlooking, or did someone just
> butcher this piece of code?
>
> Many thanks.
>
> Owen Davies
>
>