Hello,
Yes, but backtesting over all 37000 symbols is not a good
idea anyway. At least for me. I have large databases,
but not because I want to run a single backtest on 37K symbols
at once. I run backtests on watch lists that
have a few hundred to a few thousand symbols.
The backtester needs to collect signals from all symbols
under test and all bars under test. It is not surprising
that 10000 symbols times, say, 2500 bars under test gives potentially
25,000,000 signals to be stored and processed.
Now imagine that every signal needs to store: signal type,
entry/exit price, margin requirement, position size, and date/time of the
signal.
Even though I have made the signal object as small as
possible, it still consumes 40 bytes.
So if there are 25 million signals, you end up with 1 GB of
memory required just to store the signals.
Fortunately AB uses several techniques to
significantly lower that amount, because otherwise you would not be able
to run such a backtest at all.
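The arithmetic above can be sketched in a few lines. This is only a back-of-the-envelope estimate following the figures quoted in the message; the 40-byte signal size is taken from the message, and the function name is just an illustration, not anything from AmiBroker itself:

```python
# Rough upper bound on backtest signal storage, using the figures from the
# message above: one potential signal per symbol per bar, 40 bytes each.

SIGNAL_BYTES = 40  # approximate size of one signal object (quoted in the message)

def signal_memory_gb(symbols: int, bars: int) -> float:
    """Worst-case memory (in GB) if every symbol fires a signal on every bar."""
    return symbols * bars * SIGNAL_BYTES / 1e9

# 10,000 symbols x 2,500 bars = 25,000,000 potential signals -> ~1 GB
print(signal_memory_gb(10_000, 2_500))   # 1.0
# The full 37,000-symbol database over the same bar count would be ~3.7 GB
print(signal_memory_gb(37_000, 2_500))   # 3.7
```

This is a worst case; as the message notes, AB reduces the real footprint well below this bound, since in practice only a fraction of bars generate signals.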
Best
regards, Tomasz Janeczko amibroker.com
----- Original Message -----
Sent: Friday, April 11, 2008 1:44
PM
Subject: RE: [amibroker] Speed issues
with Premium Data
I am just saying that it is going to take a long, long
time to backtest 37000 symbols. I chose 10000 because beyond that number, I
find it becomes inconveniently slow. If one is routinely doing that, then maybe
one should find ways not to have to do that many symbols and/or increase the
CPU power. If one only does it once in a blue moon, then it is going to
take exactly the same time to run it the first time (once AB is loaded)
regardless of whether you cache 37000 symbols or 10000. I do not mean there is a
technical issue, just a practical one.
/Paul.
Why do you think so? I am using a 60000+ symbol QP2
database (stocks and all mutual funds) without any speed
issues.
It does not matter to the CPU how many symbols there
are.
Still, you are correct that the AB native format would be
fastest.
Best regards, Tomasz
Janeczko amibroker.com
----- Original Message -----
Sent: Friday, April 11, 2008 6:34
AM
Subject: RE: [amibroker] Speed issues
with Premium Data
I think after 10000 symbols, the CPU will become the
bottleneck, not RAM. Also, the loading time of 37000 symbols into RAM
wouldn't be trivial either.
I personally would suggest importing the data into
native AB format, as that would increase the loading speed
considerably, and sticking to no more than 10000 symbols in any
backtest.
Hi
Tomasz,
We have
just released our comprehensive US listed + delisted stock database back
to 1950. This has over 37000 symbols in it and comprises almost
2GB of data in MetaStock format. However, the in-memory cache size
"max symbols" setting is limited to 20000 symbols; is it possible to
increase this number for those of our users with plenty of RAM to
burn?
Best
regards, Richard Dale. Norgate Investor
Services - Premium quality Stock, Futures and Foreign Exchange Data
for markets in Australia, Asia, Canada, Europe, UK & USA
- www.premiumdata.net
From: amibroker@xxxxxxxxxps.com [mailto:amibroker@yahoogroups.com] On Behalf Of Tomasz Janeczko
Sent: Friday, 11 April 2008 4:05 AM
To: amibroker@xxxxxxxxxps.com
Subject: Re: [amibroker] Speed issues with Premium Data
Yes, for best results
increase both; however, do not enter more than your RAM size, because
then Windows would swap to hard disk anyway.
Best regards, Tomasz
Janeczko amibroker.com
----- Original
Message -----
Sent:
Thursday, April 10, 2008 9:44 PM
Subject: Re: [amibroker] Speed issues with Premium
Data
Thank you, Tomasz, I'll try that. Do I also
need to change the "in-memory cache size (max. symbols)"?
The
PremiumData database with all the delisted symbols is huge, so maybe
I just haven't sized the data preferences big
enough.
ges
On Thu, Apr 10, 2008 at 12:59 PM, Tomasz Janeczko
<groups@xxxxxxxxxxcom>
wrote:
If you want caching and faster use of MS
databases, go to File->Database Settings and change "Local
database storage" to "Enabled", also you may go
to Tools->Preferences->Data and increase 'in-memory
cache'.
Best regards, Tomasz Janeczko amibroker.com
----- Original Message -----
From: "ges" <ges8ges@xxxxxxcom>
To: <amibroker@xxxxxxxxxps.com>
Sent: Thursday, April 10, 2008 8:36 PM
Subject: [amibroker] Speed issues with Premium Data
> I've used QuotesPlus with AB for quite a while and just got
> PremiumData's historical data and delisted data and am having speed
> problems with PremiumData.
>
> With QuotesPlus, after the first run of an exploration or backtest, AB
> keeps the data in memory and subsequent runs are very fast and there
> is not the constant disk access to bog the computer down when doing
> other tasks.
>
> With PremiumData I can't get the same behavior. Is this because the
> PremiumData data is in Metastock format? Or is it my database settings?
>
> I have tried changing the database settings every way I can think of,
> but I can't get AB to handle the PremiumData the same speedy way
> that it does QP data.
>
> Is there some change to database prefs/settings that will allow AB to
> keep the PremiumData in memory or is this just a limitation of the
> Metastock data format?
>
> Thanks for any suggestions.
>
> ges
Please note that this group is for discussion between users only.
To get support from AmiBroker please send an e-mail directly to
SUPPORT {at} amibroker.com
For NEW RELEASE ANNOUNCEMENTS and other news always check DEVLOG:
http://www.amibroker.com/devlog/
For other support material please check also:
http://www.amibroker.com/support.html