Simon:
>discovered a bunch of oddities in TS that I thought I'd share, since I
>know many on this list still depend on TS on a daily basis.
Some of these are genuine oddities, but many of your observations
aren't oddities at all.
>The at$ command is basically a GTC order, except that you have to
>re-enter it daily, like a GFD.
It's good practice to re-enter all orders on each bar. This makes
sense because you are re-calculating things on every bar anyway, so
your stop and limit orders are likely to change each bar.
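To make that concrete, here's a rough Python sketch (not EasyLanguage,
and the trail-at-the-10-bar-low rule is just something made up for
illustration) of an order being recalculated and re-submitted on every
bar:

    # Minimal sketch: the stop price is recalculated from the data each
    # bar, so the order is re-submitted each bar, good for that bar only.
    def stop_orders(bars, lookback=10):
        # bars: list of (high, low, close) tuples
        orders = []
        for i in range(lookback, len(bars)):
            trail = min(b[1] for b in bars[i - lookback:i])  # made-up rule: 10-bar low
            orders.append({'bar': i, 'order': 'sell stop', 'price': trail,
                           'good_for': 'next bar only'})
        return orders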
>A 'Stop' signal isn't a simple stop, it's a stop-and-reverse.
>
>eg, if we're long 6, and then have a sell 4 stop, we end up net short 4
>(having closed out the 6), not just long 2, as you'd expect.
Yes, and that makes sense. In TS2Ki the words "Buy" and "Sell" mean
"go long" and "go short". If you're long 6 and want to get out of 4,
you'd use ExitLong, not Sell.
In TS2Ki, the order verbs are Buy/ExitLong and Sell/ExitShort.
In TS8 I believe the corresponding verbs are Buy/Sell and
SellShort/BuyToCover (or something like that), where Sell exits a
long. I think you got confused by the Buy/Sell terminology in TS8 and
applied that misunderstanding to TS2Ki. It screws up porting code
between TS2Ki and TS8, too.
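Here's a toy Python sketch of how I read the TS2Ki position
arithmetic (it isn't TS code, just the semantics):

    # Toy model of the verbs: Sell opens a short, ExitLong reduces a long.
    def sell(position, qty):
        return -qty                  # closes any long and leaves you short qty

    def exit_long(position, qty):
        return max(position - qty, 0) if position > 0 else position

    print(sell(6, 4))       # -4: long 6 -> flat -> short 4
    print(exit_long(6, 4))  #  2: long 6 -> long 2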
>OHLC, it decides the internal order as follows:
>
>the shortest O->L or O->H distance gets covered first, regardless of
>overall market direction.
Yes, this is clearly documented in the TS2Ki language reference.
It isn't an oddity at all, but a sensible approach to determining
the order in which multiple trades get executed within a bar.
Naturally, you'll find exceptions to this behavior in real life,
but a typical bar inside a trend will generally unfold in that
fashion.
however. TradeStation has no way of judging "overall market
direction" - that's your job, and that's why we have so many
indicators that attempt to judge direction. All TS can do is make
an assumption about the bar.
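If it helps, here's that assumption written out as a little Python
sketch (my paraphrase of the documented rule, not TS's code):

    # The extreme nearer the open is assumed to trade first.
    def assumed_path(o, h, l, c):
        if (o - l) <= (h - o):       # open is closer to the low
            return [o, l, h, c]      # open -> low -> high -> close
        return [o, h, l, c]          # open -> high -> low -> close

    print(assumed_path(100.0, 103.0, 99.5, 102.0))  # [100.0, 99.5, 103.0, 102.0]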
>If you have a stop and the day's open gaps through it, TS fills it AT
>the open.
That happens in real life. Is that a problem? In real life, as in
TS, stop orders become market orders the instant the stop price is
touched or passed.
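As a sketch, the fill rule for a buy stop looks something like this
(Python, simplified to the bar's open and high only):

    def buy_stop_fill(stop, bar_open, bar_high):
        if bar_open >= stop:
            return bar_open      # gapped through the stop: filled at the open
        if bar_high >= stop:
            return stop          # touched within the bar: filled at the stop price
        return None              # never reached: no fill

    print(buy_stop_fill(105.0, 107.0, 108.0))  # 107.0, the gap open
    print(buy_stop_fill(105.0, 103.0, 106.0))  # 105.0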
>TS is unpredictable when doing basic maths.
>[snip example of moving average miscalculation]
Now THAT is a legitimate gripe. TS2Ki does single-precision math;
most folks here know this. I have to accept it for cases like the
example you cited, but I have to say this isn't an exact science:
if my strategy is sensitive to an error in the 3rd decimal place,
then I have a bad, non-robust strategy! Some of my indicators
accumulate values for speed (e.g. rather than summing the whole
range to calculate an average, they maintain a running sum by
adding the new value and dropping the old); for these I force a
periodic re-initialization to get rid of the accumulated error.
There is no need to be concerned about duplicating the TS results
in this case. In your own code you should be using double-precision
math.
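Here's a small Python/numpy sketch of both points: the drift you can
pick up from a single-precision running sum, and the periodic
re-initialization that flushes it out. The price series and the reset
interval are made up; it only shows the shape of the problem.

    import numpy as np

    prices = (100 + np.sin(np.arange(100_000) * 0.1)).astype(np.float32)
    length = 20

    naive = np.float32(prices[:length].sum())   # running sum, never reset
    reset = np.float32(prices[:length].sum())   # running sum, reset periodically
    for i in range(length, len(prices)):
        step = prices[i] - prices[i - length]   # add the new value, drop the old
        naive += step
        reset += step
        if i % 5000 == 0:                       # flush the accumulated rounding error
            reset = np.float32(prices[i - length + 1:i + 1].sum(dtype=np.float64))

    exact = prices[-length:].astype(np.float64).sum()
    print(abs(naive - exact), abs(reset - exact))   # error with and without the reset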
>This has major consequences for systems that depend on relative comparisons
No, it has major consequences for comparing a system in your
software vs. TS. Over the long term, the errors will occur in
both directions and overall things should balance out. As I said
earlier, a strategy that's sensitive to small precision errors
isn't a good strategy.
When it HAS mattered to me, I compare two numbers by taking their
difference and requiring that difference to exceed some small
epsilon, like 0.0001.
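In Python terms, the comparison looks roughly like this (0.0001 is
just the epsilon I happen to use):

    EPSILON = 0.0001

    def definitely_greater(a, b, eps=EPSILON):
        return (a - b) > eps          # only "greater" if it clears the tolerance

    def roughly_equal(a, b, eps=EPSILON):
        return abs(a - b) <= eps

    print(definitely_greater(1.00005, 1.0))   # False: inside the noise band
    print(roughly_equal(1.00005, 1.0))        # True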
>when outputting numbers, the NumToStr command does not keep track of
>decimal places for any number >= 100,000.
>Eg, 100000.50 will output as 100000.00.
Yup, string handling sucks. This is likely another consequence of
single-precision math, and of data being stored internally as
integers.
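For what it's worth, you can see how coarse single precision gets at
that scale with a one-liner (Python/numpy here; NumToStr's exact
formatting is TS's own business, this only shows the representable
spacing):

    import numpy as np

    # Gap between adjacent single-precision values near 100,000:
    print(np.spacing(np.float32(100_000.0)))   # 0.0078125
    # so at that magnitude only the first couple of digits after the
    # decimal point can even be represented, before formatting happens.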
>For some? reason, certain values are arbitrarily rounded.
>
>This is more-or-less random. I suspect it's an artifact of numbers
>internally using floats/singles (4 bytes) rather than doubles (8 bytes)
Yes. TS was created in the days when memory was expensive and if you
had 4 megabytes of RAM you had a hot machine.
>A reasonable rule of thumb is that you only have 6 accurate decimal
>places.
That's true.
>Setting max bars back to a large number doesn't ensure that the internal
>calculations are done before that point, it just doesn't
>START them any earlier.
Correct. MaxBarsBack is usually best left to TS to figure out.
But if you have a function or indicator that calculates its own
number of bars back (so TS can't tell from the constants how many
bars back it needs), you can set MaxBarsBack to tell TS when to
START calculating. That works fine for something like Average(),
which starts out with the correct value once MaxBarsBack is big
enough, but XAverage() will still need even more bars to settle
unless you initialize it to the Average() value first.
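A quick Python sketch of why that is (the 2/(length+1) smoothing
factor is the usual XAverage formula; the rest is just an
illustration, not TS's code):

    def xaverage(prices, length, seed=None):
        # recursive exponential average; seed is the starting value
        alpha = 2.0 / (length + 1)
        ema = prices[0] if seed is None else seed
        for p in prices[1:]:
            ema += alpha * (p - ema)
        return ema

    prices = [100 + 0.1 * i for i in range(40)]   # a steady uptrend
    sma20 = sum(prices[:20]) / 20                 # what Average(Close, 20) would give

    print(xaverage(prices, 20))                   # uses the full history
    print(xaverage(prices[19:], 20))              # starts 19 bars late: still settling
    print(xaverage(prices[19:], 20, seed=sma20))  # seeded: much closer to full history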
>It will corrupt negative prices. Additionally, it will randomly adjust
>the prices to values that aren't possible.
>
>eg, on a market that can only move in 0.25 increments, a value of -1
>might become -0.97.
>
>of course, this also throws out any calculations done with these prices.
Interesting, I didn't know this.
>When the minimum price move is different from the actual market prices
>(this happens often with back adjusted data).
>
>eg, min price move is 5/100's, but prices are 107.82, 108.03 etc
>
>then TS calculates indicators based on the actual prices (107.82, etc),
>but estimates entry points based on the min price move.
>eg, above, if a buy order was at 107.82, it would be filled at 107.80 or
>107.85, even though the (back adjusted) price never went there (and
>couldn't).
>
>This is an awkward case. what's the correct thing to do here? Enter at
>numerically unusual, but back adjusted prices, or at the correct (but
>invalid) granularity.
Consider a current bar, which will contain prices at 0.05 increments
(107.80, 107.85, etc.). Typically your calculation of stop and limit
prices will involve some floating-point math, which may result in an
order to buy 1 contract at 107.83. TS will simply fill the order the
same way it does when price gaps past your stop: at the next available
price that meets your order's requirements.
This behavior makes sense for the current bar, and there's no reason
to behave any differently for historical bars. If your historical bars
consist only of OHLC data, it makes sense to assume the historical
prices have the same granularity as the current bar. If your
historical bars are constructed from tick data, use the tick data
regardless of the granularity.
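In code terms, the fill logic amounts to snapping the calculated
price to the tick grid in whichever direction still satisfies the
order. A rough Python sketch (my illustration, not TS's actual
routine):

    import math

    def snap(price, tick, direction):
        ticks = price / tick
        n = math.ceil(ticks - 1e-9) if direction == 'up' else math.floor(ticks + 1e-9)
        return round(n * tick, 10)

    print(snap(107.83, 0.05, 'up'))    # 107.85 (e.g. a buy stop)
    print(snap(107.83, 0.05, 'down'))  # 107.8  (e.g. a buy limit)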
>Rounding occurs in odd places resulting in unpredictable behaviour,
>particularly with back adjusted data.
>
>Eg, IFS (MIB) has a minimum price move of 5.
>
>On 15th & 16th Jan 1996, both days have a back-adjusted high of 15048
>(which is not a factor of 5).
>If a long order is placed at the high of the 15th, this order is rounded
>to 15050, which TS then decides is outside the range of the 16th, and
>therefore doesn't hit.
>Since the intention is that the system go long at the high, if the next
>day has the same high, regardless of the price scaling, this should be hit.
Yes, that's an oddity I've noticed too. I get around it by rounding
my own buy orders up, and my sell orders down, to the nearest price
increment. This also gives me a more conservative estimate of
performance.
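In Python terms the workaround is just this, using the MIB numbers
from your example (it's my own adjustment, nothing TS does):

    import math

    tick = 5
    high = 15048                            # the back-adjusted high
    buy = math.ceil(high / tick) * tick     # 15050: buy prices rounded up
    sell = math.floor(high / tick) * tick   # 15045: sell prices rounded down
    print(buy, sell)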
--
,|___ Alex Matulich -- alex@xxxxxxxxxxxxxx
// +__> Director of Research and Development
// \ Unicorn Research Corporation -- http://unicorn.us.com
// __) HTML FORMATTED MAIL SENT HERE WILL BE REJECTED AS SPAM.