
Re: TradeStation Precision - Summary



Allow me to offer a bit of background information which may help to clarify
some of the issues.

It is well known that with the advent of computers some fifty years ago,
quite a few strange, hitherto unknown phenomena emerged in numerical
mathematics. One case in point concerns the famous "logistic equation",
	p[n+1] := p[n] + r*p[n]*(1-p[n]),  with r constant,
which originally served to model population growth. It soon became apparent
that after not too many iterations, results often differed greatly,
depending not only on the floating-point precision used, but also on simple
differences in the program code such as a(b+c) vs. ab+ac etc. Given exactly
the same initial conditions, it could easily be demonstrated that after some
40 or 50 iterations the above equation might show a result of 0.4 if you
took 10 digits of precision, and a result of 22 if you had used 12 digits of
precision. A scientifically disastrous case of seeming mathematical
inconsistency!

A similar instability problem from meteorology became known as the "Lorenz
experiment". I won't go into the details, but the gist of this was that
after a few iterations of numerical integration, even the tiniest
imprecision introduced within the iterations caused the difference between
the resulting signals to become larger than the signal itself!
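
Here is a sketch of the kind of experiment Lorenz described, assuming the
classic Lorenz system (sigma = 10, rho = 28, beta = 8/3) and a crude Euler
integration; the step size, starting point, and the one-in-a-billion
perturbation are purely illustrative choices on my part.

```python
def lorenz_x(x, y, z, steps, dt=0.01):
    """Euler-integrate the Lorenz system; return the x-coordinate history."""
    sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0   # Lorenz's classic parameters
    xs = []
    for _ in range(steps):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        xs.append(x)
    return xs

a = lorenz_x(1.0, 1.0, 1.0, 5000)          # reference run
b = lorenz_x(1.0 + 1e-9, 1.0, 1.0, 5000)   # perturbed by one part in a billion
# the late-time difference grows to the size of the attractor itself
print(max(abs(p - q) for p, q in zip(a[-2000:], b[-2000:])))
```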

Lorenz himself gave a famous talk on this phenomenon, titled
"Predictability: Does the Flap of a Butterfly's Wings in Brazil Set Off a
Tornado in Texas?" Basically, it turned
out that in deterministic feedback processes there can be inherent numerical
instabilities, and, therefore, a lack of numeric predictability.

Now I hope that the above will help to show where the much-maligned Pierre
was right and where he made a small mistake. While Pierre was
absolutely right in telling his students to omit unnecessary decimals when
calculating, say, an angular velocity from a formula, it seems he did not
take into consideration that as soon as *feedback processes* are involved,
it becomes a totally different story. With feedback processes, you want to
use all the precision you can get because, as described above, these
processes are numerical minefields where tiny inaccuracies in the
calculation can quickly magnify to such an extent that the resulting signal
may become totally meaningless.

Coming back now to TradeStation, Metastock, and similar software, we have to
keep in mind that many of the commonly used trading indicators, such as the
EMA, *are* programmed as deterministic feedback processes.
Therefore, instabilities will arise automatically, owing to the numeric
properties of these iteration processes.
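
For concreteness, here is a minimal sketch of the usual recursive EMA, with
the conventional smoothing factor alpha = 2/(length+1); the point is simply
that each output value feeds straight back into the next one.

```python
def ema(prices, length):
    """Recursive exponential moving average, seeded with the first price."""
    alpha = 2.0 / (length + 1)   # the conventional EMA smoothing factor
    out = [prices[0]]
    for price in prices[1:]:
        # feedback: yesterday's EMA enters today's EMA
        out.append(out[-1] + alpha * (price - out[-1]))
    return out

print(ema([10.0, 11.0, 12.0, 11.5], 3))   # [10.0, 10.5, 11.25, 11.375]
```

Through this feedback term, every EMA value depends on the entire prior
history of the series.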

Please note that this phenomenon does not result from any weakness of the
trading system involved, nor is it entirely the fault of the software.
Inaccuracies will always arise of necessity as soon as any feedback process
is involved in the calculation of a trading system.

What corrective measures are available? While tweaking the program code may
help in a few special cases, the only remedial action that would be
generally applicable, and would definitely be called for in the cases of TS
and Metastock, is an increase of the internal floating-point precision of
the program. This solution, while not a panacea, will delay the onset of the
noise that currently attacks those feedback processes at a comparatively
early stage of the iteration, and will thus help to prevent many of the
instabilities described.
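
This "delaying" effect can itself be sketched numerically. Using Python's
decimal module at 100 digits as a stand-in high-precision reference (an
assumption for illustration only, with nothing to do with how TS or
Metastock work internally), one can ask at which iteration each precision
first drifts visibly away from that reference:

```python
from decimal import Decimal, getcontext
import numpy as np

def first_divergence(dtype, n=120, tol=0.01):
    """First step at which p[n+1] = p[n] + 3*p[n]*(1-p[n]), run at the
    given precision, drifts more than tol from a 100-digit reference."""
    getcontext().prec = 100
    ref, r_d, one_d = Decimal("0.01"), Decimal(3), Decimal(1)
    p, r, one = dtype(0.01), dtype(3), dtype(1)
    for i in range(1, n + 1):
        ref = ref + r_d * ref * (one_d - ref)   # high-precision reference
        p = p + r * p * (one - p)               # same feedback, lower precision
        if abs(float(ref) - float(p)) > tol:
            return i
    return None

# Higher precision does not remove the divergence; it only postpones it.
print(first_divergence(np.float32), first_divergence(np.float64))
```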

Michael Suesserott