On Fri, 01 Feb 2002, Gary Fritz wrote:
> > > Are there any good IDE's for Perl
> > > or are you stuck with a text editor and "perl -w" ?
> >
> > The IDE I use is vi and make. It has worked for applications that
> > are near 10K lines.
>
> I've used vi and make for 20 years -- in fact vi is still my
> preferred editor for most text, sure wish I could hook Vim into the
> Power Editor!! -- but you're a stronger man than I if you want to use
> vi/make on a large project instead of a more-integrated IDE. Yes, it
> can work, just like Perl can work. Is it an efficient use of my time
> and energy? I don't think so, but that's just my preference. If I
> was familiar enough with Perl to feel its benefits outweighed C++ (or
> whatever) enough to justify the weaker environment, I might choose
> differently.
I find that no IDE can compare well with the tools that already exist.
I have never found an editor that is faster than vi. Many IDEs are
not integrated into RCS/SCCS/CVS, nor with make, nor ctags, nor with
many of the other tools already in my system. I can add key mappings
to vi to automate building of applications or starting of gdb. I can
add suffix rules to make to force the evaluation and building of
aspects of a project that could not be put into an IDE's concept of
what 'real development' should look like. I do not find vi to be
weaker, but rather the other way around. I have never found an IDE
that could match what I already have, or what I could insert sideways
into what already exists.
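To make the gluing concrete, here is a sketch of the two things described above; the key bindings, the `app` target, and the `el2pl` command are invented for illustration, not taken from any actual setup:

```vim
" Hypothetical ~/.vimrc mappings: one key to build, one to start gdb
" on the result, without ever leaving the editor.
map <F5> :!make<CR>
map <F6> :!gdb ./app<CR>
```

```make
# A hypothetical suffix rule: generate a .pl file from each .el source
# via an el2pl translator -- exactly the sort of project-specific build
# step an IDE's fixed model of development rarely accommodates.
.SUFFIXES: .el .pl
.el.pl:
	el2pl $< > $@
```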
> (Also I suspect a trading platform would end up with a lot more than
> 10k lines, but maybe I'm not giving Perl enough credit. What kind of
> app did you write with 10k lines of Perl?)
Actually I believe a 'trading system' would be very small, maybe a few
hundred to a thousand lines. The graphics viewer that is present
in TSX would consume a lot of lines and a lot of CPU. It is the
graphics component that would take the vast and overwhelming
amount of time to construct. If you view the data, indicators, and
signals as plain streaming text, something approaching the look of
the film _The Matrix_, then you demand much less of the system,
whatever real-time data is thrown its way.
> Even an EMACS-based IDE could streamline the process a bit, but I've
> never been an "EMACS is the best answer for all problems" kinda guy.
Emacs is great for so many things, but I really do not care for
its keystrokes. I have used vi for so long that I can't consciously
recall the keystrokes; they just happen.
> > The data never seems to come faster than 10 tics per second
>
> FWIW with just ES and NQ, 600 ticks per minute is very common.
> Obviously some seconds in the minute are going to have more than 10
> ticks. And that's in normal market conditions. I suspect in a fast
> market you could see at least 2x the normal tick rate, maybe 5x or
> more for a few seconds. Now add in additional symbols to track, and
> volume growth over time (especially with electronic markets), and I
> wouldn't consider anything that topped out at anything under 50
> ticks/second. 100 would be better. If you wanted to trade active
> NASDAQ stocks, I doubt 100/sec would do it even in normal markets.
OK, so my guesstimate was way low. I really think that even 100
ticks a second would be no burden at all in a well-designed system.
OLTP environments handle much more data.
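A rough sketch of why the claim is plausible: push synthetic ticks through a trivial handler and see how many per second a plain Perl loop sustains. The handler below is a stand-in for a real calculation, and the tick count is arbitrary:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Time::HiRes qw(time);

# Trivial stand-in for a per-tick calculation.
my ($sum, $n) = (0, 0);
sub on_tick { my ($price) = @_; $sum += $price; $n++; }

my $start = time();
on_tick(1000 + rand()) for 1 .. 100_000;
my $elapsed = time() - $start;
$elapsed = 1e-9 if $elapsed <= 0;    # guard against a zero-length interval
printf "%d ticks in %.3fs (%.0f ticks/sec)\n", $n, $elapsed, $n / $elapsed;
```

Even interpreted Perl gets through the loop in a fraction of a second, so 100 ticks/sec leaves enormous headroom for real work per tick.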
> > I have not had a problem with GC, though depending on the
> > application I know that can be a problem.
>
> I'm curious, what LISPs have you used? I haven't done much with LISP
> since the heady "AI machine" days with the Xerox & other specialized
> hardware. I never worked much on those, but used several 68xxx-based
> systems. GC in those days could lock up your application for 20-30
> seconds, which would obviously put a kink in realtime applications.
> :-) But that was over 15 years ago, and I'm sure the technology has
> improved.
The technology has improved, and the language has improved, but the
idea is still the same: program and data are represented in exactly
the same way. I have used xlisp, franz, gcl, emacs, and allegro.
My preference is actually xlisp and emacs, for the simplicity of
what is there and for the removal of all the stuff I don't need.
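The point about program and data sharing one representation is easiest to see in Lisp itself; a minimal sketch (any Lisp with `eval`, e.g. xlisp):

```lisp
; Build a program as ordinary list data, inspect it, then run it.
(setq expr (list '+ 1 2 3))  ; the list (+ 1 2 3) -- just data
(print (car expr))           ; inspect it like any other list => +
(print (eval expr))          ; evaluate it as a program => 6
```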
> > What EL does for the code is more than the [] syntax. Depending on
> > the function, the EL system will attempt to determine how many
> > days of data to look at, and will keep previously calculated values
> > in case they might be needed again. If a function is called by two
> > different indicators, the EL system seems to keep the function's
> > space in its own local environment and does not mix the intermediate
> > values. EL is a complex system and it would be difficult in several
> > ways to replace its implied functionality, but it is possible.
>
> Agreed. (But you give EL too much credit. It doesn't figure out how
> much data to keep around -- it just keeps everything for the
> MaxBarsBack period.) That's why I said ANY non-EL language is going
> to have to write a lot of infrastructure.
I agree with you, but I know that for the people I work with, trying
to learn a different language when they're not programmers is just
asking too much of them. Creating a translator costs me some time, but
it is really no effort. I have created several compilers and
translators in the past, and creating a new one is just no big deal.
The advantage of creating a new translator is to convert the existing
EL code for the system developers and then to say, "see here, this bit
of code, that is what you wrote in EL, and if you change it this way
it becomes smaller/faster/more precise/prettier/etc." It isn't for me,
but rather to capture what they already have and to do a sanity check
that the same code produces the same results in EL and in the new
thing.
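To make the sanity-check idea concrete, here is a sketch of what a translator might emit for one line of EL. The `el_average` helper and the fixed price data are invented for illustration; the warm-up guard only imitates EL's behavior of waiting for the MaxBarsBack period:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical translator output for one line of EL:
#   Value1 = Average(Close, 10);
# @close stands in for EL's Close series.
sub el_average {
    my ($series, $len, $bar) = @_;
    return undef if $bar < $len - 1;   # not enough bars yet (MaxBarsBack)
    my $sum = 0;
    $sum += $series->[$_] for ($bar - $len + 1) .. $bar;
    return $sum / $len;
}

my @close  = (100, 101, 102, 101, 103, 104, 105, 104, 106, 107, 108);
my $value1 = el_average(\@close, 10, $#close);  # average of last 10 closes
print "$value1\n";                              # prints 104.1
```

Running the original EL and this output over the same price series, and diffing the numbers, is exactly the sanity check described above.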
> As I said, I'd be leery of LISP or Perl. Java is a possibility,
> though I'm not convinced of Java's robustness. Not sure how well it
> handles large databases or realtime calculations. (Though the
> platform independence would be a big win.) I *know* C++ can handle
> that, and I bet VB can too.
I have been studying Java and am not convinced. It seems that it might
do some things really well, but I'm not convinced about its
architecture or about the underlying JVM on the host platform. I
really like the idea of platform independence, but in this
implementation what is the true price? As for Lisp and Perl... I
really like Lisp, but I have never found a project that suited Lisp
(or Forth) to such a degree that no other language or implementation
could touch it. Perl, though, can do so many things that it rivals C
in all aspects of an application except execution speed. My apologies
for C++: I have seen really bad C programs, and I know what effort it
takes even a cretin to develop a bad C-based app and claim that it is
a good thing. C++, with its constructors, destructors, overloading,
mangling, templates, and other features, makes the writing of a truly
bad app trivial. I can read C++, and code in it a little, but to me it
has so many bad features that I would not use it. If I'm to code in
C++, I'll choose C instead; that way I can control exactly what is
happening. Perl is dynamic in almost all the aspects where you want it
to be: you can control just what it does and doesn't do and still
let it manage the tedious bookkeeping.
In general I can develop an app faster and more reliably in Perl
than I can in *any* other language.
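A small illustration of the "tedious bookkeeping" Perl manages for you; the symbols and prices are made up:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Nested structures spring into existence on first use (autovivification)
# and are garbage-collected automatically; in C you would be writing
# malloc/free and hash-table code for each level yourself.
my %book;   # symbol -> { count => n, last => price }
for my $tick (["ES", 1100.25], ["NQ", 1520.50], ["ES", 1100.50]) {
    my ($sym, $price) = @$tick;
    $book{$sym}{count}++;          # inner hash created on first use
    $book{$sym}{last} = $price;
}
print "$_: $book{$_}{count} ticks, last $book{$_}{last}\n"
    for sort keys %book;
```

You still control exactly what the structure holds; Perl only handles the allocation and cleanup.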
> > With a translator to convert EL to something else,
>
> I agree with Bob B. It would be much safer, and probably cheaper in
> the long run (especially when you factor in the cost of mistakes :-)
> to just rewrite it in the new environment.
I would rewrite everything after I had run a sanity check on the
translated system. I need to make sure it works first and change it
second. The translator does not run at run-time (I just now realized
this may be some of the concern): do everything possible at compile
time first, and at run-time only what is necessary.
Mike