
RE: Re[2]: TS2000 sp6






> -----Original Message-----
> From: ztrader [mailto:ztrader@xxxxxxxxx]
> Sent: Friday, March 16, 2001 22:06
> To: omega-list
> Subject: Re[2]: TS2000 sp6
>
>
> On Friday, March 16, 2001, 7:25:18 AM, William Wood wrote:
>
> WW> However, folks who really stress the platform are still
> WW> experiencing bugs even with proper hardware and O/S.
>
> I believe this is the key to the difference. I use TS2k only as a
> 'development' platform. What does this mean? I will have 10-15
> workspaces open, each with 5-6 charts, each with 3-5 data sets. Data sets
> are often 50,000 - 200,000 bars. Each data set will have a few
> studies, usually computation-intensive. As a thought experiment, a
> simple but computation-intensive study would be:
>
> x = WAverage( WAverage( WAverage( WAverage( c , 50) , 100) , 1000) ,
> 10000) ;
>
> Put a few of these on each of a few data sets on a 100,000 bar chart
> and you'll see what I mean.
>

Stupid!
You are running the following code over 100,000 bars: 50 + 100 + 1000 + 10000 =
11,150 loops per bar, i.e. 1,115,000,000 loops per chart (a quick tally is
sketched after the function code below)...
Any computer program will take time if you do this!


{ WAverage: linearly weighted moving average of Price over Length bars.
  Note the For loop below: it runs Length times on every single bar. }
Inputs: 	Price(NumericSeries), Length(NumericSimple);
Variables:	Sum(0), Counter(0), CSum(0);

Sum = 0;
CSum = 0;

For Counter = 0 To Length - 1 Begin
	Sum = Sum + Price[Counter] * (Length - Counter) ;	{ newest bars get the largest weights }
	CSum = CSum + Length - Counter ;			{ running sum of the weights }
End;

If CSum > 0 Then
	WAverage = Sum / CSum
Else
	WAverage = 0;
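
To put a number on the cost (a throwaway sketch; the variable names are made up
for illustration), a study like this just tallies the iterations implied by the
nested calls in the quoted post and prints the result on the last bar:

{ Each WAverage(..., Length) runs its For loop Length times on every bar,
  so the nested expression costs the sum of its four lengths per bar. }
Variables: LoopsPerBar(0), TotalLoops(0);

LoopsPerBar = 50 + 100 + 1000 + 10000 ;		{ 11,150 iterations per bar }
TotalLoops = LoopsPerBar * BarNumber ;		{ iterations spent so far on this series }

If LastBarOnChart Then
	Print("Per bar: ", LoopsPerBar, "  Total for this series: ", TotalLoops) ;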

Take this one instead (it will probably give the same results). Set MaxBarsBack
to 1, then see the difference...

x = xAverage( xAverage( xAverage( xAverage( c , 50) , 100) , 1000) , 10000) ;
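
For comparison, the exponential average has no inner loop at all. The sketch
below gives the gist of it (the standard recursive form, not necessarily a
verbatim copy of the shipped XAverage function):

{ Exponential moving average: one recursive update per bar, so the cost
  per bar is constant no matter how large Length is. }
Inputs: 	Price(NumericSeries), Length(NumericSimple);
Variables:	Factor(0);

Factor = 2 / (Length + 1) ;			{ smoothing constant }

If CurrentBar <= 1 Then
	XAverage = Price			{ seed with the first value }
Else
	XAverage = XAverage[1] + Factor * (Price - XAverage[1]) ;

With that, the nested expression above does four cheap updates per bar instead
of 11,150 loop iterations, which is why it can run with MaxBarsBack set to 1
and why it is so much faster.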


> But - IMHO just having these complex workspaces is NOT what stresses
> TS. For this, you have to, for example, open 3 MORE large workspaces
> as described above. **While they are opening**, go to the data server
> and edit, or paste, or copy some data (NOT the data needed for the
> above, obviously). While the server is pasting in a large data set, go
> back to charting and change parameters in a few studies on one of the
> long-data charts. [Since TS does NOT update the screen properly, there
> will be NO indication of the background work TS is doing - it may just
> seem sluggish.] In the midst of this, customize or change a few new
> toolbars, change some TS options, change the data on a chart, add or
> delete studies, change study parameters, and so forth - all while TS
> is trying to complete a few other operations.

If you are bound to run stupid code like the above (I guess you set it to
update every tick...) and also do database copy and paste to and from the
GlobalServer (GS), get a dual-processor machine and set the GS to run on one
of the processors.
Now, what is the point of copying and pasting all day long?

>
> The basic idea is to introduce a new operation in TS BEFORE some
> operation has completed. I believe this is what causes TS to crash so
> often. If someone CAN, for an extended period of time, do operations
> as I have described, and NOT have TS crash, I would like to hear about
> it. In my experience, this is what leads to problems.

Buy a Cray
>
> The opposite extreme is just watching a few workspaces, with short
> data sets, and simple "canned" studies that are perhaps only a few
> lines of code (XA's, MA's, etc). Even in real time, I suspect this will
> be OK, and may still be OK with many workspaces/charts/studies unless
> pending operations are 'interrupted'.
>
> Any traders stressing TS enough to do MANY 'interrupted' operations
> during trading, or, more likely, during 'development'? Any other
> thoughts as to why there is such a large reported difference in
> reliability?
>

My opinion is that those traders get what they deserve...

> ztrader

 Pierre Orphelin
www.sirtrade.com
TradeStation Technologies representative in France
( The holy country where TS2000 works seamlessly )