Bob Fulks wrote:
> At 10:44 PM -0400 8/31/01, Gary Yakhnes wrote:
>
> >TS2K=TS PRO + Third-Party Data Support
>
> Your premise is an interesting one. I think it is more than just that.
>
> I have not tried TSPro yet. (I have learned from experience that no
> Omega product is worth even looking at for the first 12 to 18 months
> after its release.)
>
> I also have concerns with their data-on-demand model, the product
> quality, and the requirement to use their brokerage services.
As a TS4, TS2K, and TS Pro user, I would venture to say that the current
release of TS Pro has at least the same quality for me as the last TS4 build,
and it is much better than TS2K. As for the brokerage account requirement to
use TS Pro - there is no such requirement; you can open a TS Pro account
without opening a brokerage account, as I did.
> Data-on-demand would be ideal if it really worked. No one likes
> having to maintain their own data but there is now no choice if you
> want lots of accurate historical data for backtesting. An ideal model
> would let you collect real-time data for trading, store the data on
> your machine locally, fix bad ticks as they occur, and be able to
> download years of accurate historical data and store it on your
> machine for system development. The ability to download intraday data
> to fill holes is essential. The present combination of
> DynaStore/DynaLoader plus QFeed is close to this model and works very
> well.
>
> A stock has 390 OHLC 1-minute bars in a day that would be easy to
> download. Assuming 10 bytes per bar this would be 4000 bytes per day
> per symbol. 100 symbols would require 400K bytes which is a few
> seconds on a high speed connection. Similarly, 100 days of one symbol
> would require a few seconds.
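Just to check the arithmetic above: 390 bars x 10 bytes = 3,900 bytes, call it
4 KB per symbol per day, so 100 symbols (or 100 days of one symbol) is roughly
390 KB - a few seconds of transfer on a connection of about 1 Mbit/s, as Bob
says.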
>
> Bad ticks - I cannot see how they can avoid bad ticks screwing up the
> data. When we have the data on our workstation we can at least fix a
> bad tick. It is unrealistic to think they will fix them in a timely
> manner for the many thousands of issues. QCharts has this problem.
> The only viable model is to have the data on your machine where you
> can fix any bad ticks yourself.
>
I can privately send you ASCII data from TS Pro for any symbol you request
(any US equities or CME issues) so you can evaluate the data quality yourself.
I think the current quality of the data from their datafeed is not bad at all.
The option of manually editing bad ticks could easily be added once they
introduce third-party data support. Also, as we know, there is no problem
fixing bad ticks in TS Pro at the programming level: instead of
Close/Open/High/Low I can use MyClose = Filter(Close), MyOpen = Filter(Open),
etc. That would not allow me to use a strategy to generate Buy/Sell orders in
the account manager (the former tracking center), but I could still use an
indicator with a Buy/Sell alert. I agree that it is not the best solution, but
I have not seen many bad ticks to edit in their datafeed either. It is not a
History Bank!
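To be clear, Filter is not a built-in function - it is just my shorthand for a
simple spike filter a user could write. A rough sketch of the idea (the name
FilterTick and the 10-percent threshold below are only examples, not anything
Omega ships):
********************************************************************************
{ FilterTick - hypothetical one-bar spike filter, for illustration only }
inputs:
    Price( numericseries ),
    MaxPct( numericsimple ) ; { largest bar-to-bar change, in percent, taken as real }

if CurrentBar > 1 and Price[1] <> 0
   and AbsValue( ( Price - Price[1] ) / Price[1] ) * 100 > MaxPct then
    FilterTick = Price[1]   { looks like a bad tick - carry the previous value forward }
else
    FilterTick = Price ;
********************************************************************************
An indicator could then plot MyClose = FilterTick( Close, 10 ) instead of
Close, and do the same for Open, High, and Low.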
>
> Server Loading - It seems fundamental that any model that requires
> server capacity to do work on demand will bog down eventually. To
> avoid slow performance during peak usage periods, such as when the
> market opens, the capacity has to be probably ten times the average
> load and management usually does not supply that kind of excess
> capacity. This was a major issue with QCharts. But if you stored the
> past data on your machine, the load on the servers would be very much
> less and more constant. If you looked at some new symbol, you would
> have to download all the data required for that new chart but these
> requests would occur more randomly.
>
It is a valid point, but in practice I have not seen any data backlog at the
open since they changed their server base and added more servers a few months
ago. It would be interesting to hear from other TS Pro users on this issue.
>
> Quality Issues - I also have serious concerns about the quality of
> Omega software. It seems to be getting worse with each new product.
> TS4.0 is pretty good but it took them until about Build 16 until it
> was even usable. TS2000i was poorer with lots of annoying bugs but by
> Service Pack 5 it was usable. And TSPro ??? the trend is bad.
As I stated earlier, it is rather stable now! It is hard to believe, but it
is. This is just my personal experience, though, and it would be interesting
to hear from other Pro users. Bilo, as a power user of TS Pro, could clarify
this issue too.
> I have heard that they took out the divide-by-zero checks and code
> that runs fine on TS2000i does not run on TSPro. The management
> obviously has no appreciation for the need for quality so I wonder
> whether they will ever be able to fix this.
Here is the AverageFC function from TS Pro:
********************************************************************************
inputs:
    Price( numericseries ),
    Length( numericsimple ) ; { this input is assumed to be constant; will get
        divide-by-zero error if Length = 0 }

AverageFC = SummationFC( Price, Length ) / Length ;

{ ** Copyright (c) 1991-2001 TradeStation Technologies, Inc. All rights reserved. ** }
********************************************************************************
I am just wondering what reason in the world made them remove the
divide-by-zero checks from the built-in functions (as they did) and replace
them with notes like this: { this input is assumed to be constant; will get
divide-by-zero error if Length = 0 }. It is hard to believe that they spent
their time removing the divide-by-zero check just to frustrate their clients.
Is it possible there is some valid reason for doing this? If we look at any
function in MS Excel, there is no divide-by-zero check either: MOD(10, 0)
returns #DIV/0! in Excel. Is it possible that a built-in divide-by-zero check
inside a standard function is not such a good idea, because it would not alert
the user that there is a problem with the data the function is applied to?
Instead it would just carry the previous value of the function forward to the
current bar until the divide-by-zero condition disappears. It could be left to
the user's discretion to add a divide-by-zero check wherever the user thinks
one is needed. (I am just trying to find some common sense in their decision
to remove the divide-by-zero check, because it is hard to believe it was done
without any valid reason.)
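And if a user does want the check back, it is only a couple of lines at the
user level. Here is a rough sketch (SafeAverageFC is just a name I made up for
this example, not part of their function library):
********************************************************************************
{ SafeAverageFC - hypothetical wrapper that restores a divide-by-zero guard }
inputs:
    Price( numericseries ),
    Length( numericsimple ) ;

if Length > 0 then
    SafeAverageFC = SummationFC( Price, Length ) / Length
else
    SafeAverageFC = 0 ; { or flag the bad input however the user prefers }
********************************************************************************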
Thanks,
gary