Hi Fred,

Yes, I want to use the composite fitness to compare different systems and/or use it as feedback when deciding between different parameter sets of the same system. This is not too dissimilar to how sensitivity analysis is incorporated into the fitness criteria. The only difference is that sensitivity analysis is done during optimization, whereas walk forward is done after a new fitness high is found. Instead of using the in-sample fitness as the selection criterion for the best-fit system, the composite criterion is used to choose among the various peak values within one system or across different systems.
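To make that concrete, here is a rough sketch of how I intend to use the composite score to pick among optimization peaks. It is plain Python pseudocode rather than AFL or IO directives, and the function name and the sample numbers are only illustrative:

import math
import statistics

def composite_fitness(is_fit, oos_fit):
    # Geometric mean of in-sample and out-of-sample fitness,
    # penalised by the spread between the two (illustrative only).
    geo_mean = math.sqrt(is_fit * oos_fit)
    spread = statistics.stdev([is_fit, oos_fit])
    return geo_mean / spread if spread > 0 else geo_mean

# Choose among fitness peaks (from one system or several) by the
# composite score instead of the raw in-sample fitness.
peaks = [(35.0, 22.0), (28.0, 26.0), (40.0, 12.0)]  # (IS, OOS) fitness pairs
best = max(peaks, key=lambda p: composite_fitness(*p))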
What you said about "the capability to automatically reoptimize when some condition related to the performance metrics occurs during the out of sample period i.e. MDD goes beyond some static threshold or when it goes beyond some relationship to the same" is particularly interesting, because you are addressing a similar problem with a different method. In your case, you change the time frame and reoptimize; in my case, I am looking at refining my fitness criteria, so I might end up choosing a different optimized parameter set within the same time frame.
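For comparison, your condition-based trigger would amount to something like the check below. Again this is only a Python sketch of the logic; the names and default values are placeholders, not real IO keywords:

def should_reoptimize(os_mdd, is_mdd, static_threshold=10.0, ratio=0.75):
    # Mirrors the example directive:
    # //IO: WFAuto: Rolling: Condition: OSMDD > 10 or OSMDD > 0.75 * ISMDD
    return os_mdd > static_threshold or os_mdd > ratio * is_mdd

# e.g. trigger a reoptimization once the OOS max drawdown exceeds the
# static 10 threshold or 75% of the in-sample max drawdown.
trigger = should_reoptimize(os_mdd=12.5, is_mdd=14.0)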
Paul.
--- In amibroker@xxxxxxxxxps.com, Fred Tonetti <ftonetti@xx.> wrote:
>
> Paul,
>
> I understand what you are saying but I'm not sure what you do with the
> combined fitness when you get it. Do you use it to compare different
> systems to each other?
>
> Personally, from the perspective of multiple automated WF's, I am more
> interested in ... when to reoptimize.
>
> IO already has the capability to reoptimize based on:
>
> - some static amount of time occurring during the OOS, i.e.
>
> //IO: WFAuto: Rolling: 2: Weeks
>
> - or some undefined amount of time based on some number of long/short
> entries/exits etc., i.e.
>
> //IO: WFAuto: Rolling: 2: LongEntrys
>
> What I've been playing with recently is something a little different that is
> also based on a variable amount of time in the OOS, i.e. the capability to
> automatically reoptimize when some condition related to the performance
> metrics occurs during the out of sample period, i.e. MDD goes beyond some
> static threshold or when it goes beyond some relationship to the same or
> different performance metrics of in sample.
>
> For example ...
>
> Assume the In Sample Performance Metrics are prefaced by IS and Out of
> Sample Performance Metrics are prefaced by OS; then one should be able to
> write (in terms of IO Directives):
>
> //IO: WFAuto: Rolling: Condition: OSMDD > 10 or OSMDD > 0.75 * ISMDD
>
> In reality I suspect this is what most people actually do, i.e. find some
> yardstick(s) that tell them their system is broken or about to be broken,
> and then reoptimize at that time.
>
> _____
>
> From: amibroker@xxxxxxxxxps.com [mailto:amibroker@xxxxxxxxxps.com]
> On Behalf Of Paul Ho
> Sent: Tuesday, May 06, 2008 10:41 AM
> To: amibroker@xxxxxxxxxps.com
> Subject: [amibroker] Fitness Criteria that incorporates Walk Forward Result
>
> Howard calls it the objective function. Fred calls it Fitness. What I
> meant by Fitness Criteria is a mathematical function on which the fitness
> or goodness of the system is judged, and which is used as an objective
> criterion to compare different systems, as a score in optimization.
>
> My current question is: so why not incorporate the fitness from walk
> forward analysis into our fitness criteria? What I am talking about
> is formalising the visual inspection process. I am not proposing to
> use out of sample data for optimization purposes. Rather, the
> parameter set that has been previously optimized is forward tested,
> and a fitness is obtained and incorporated into the original criteria
> to form a composite fitness.
>
> For example, my current composite fitness is the geometric average of
> In sample fitness and Out of sample fitness divided by the standard
> deviation (?) of In sample and out of sample fitness.
>
> Is anybody doing something in this area? What are your thoughts?
>
> If you are wondering why not use visual inspection: my plan is to use
> the computer to do most of the work, and that's why I need a fitness
> criterion.
>
> Cheers,
> Paul.