Based on your comments, it sounds like you already have an ad hoc set of rules or heuristics that gives you around 70% success by your own measures, and you want to optimize this further to get a higher win rate.

As a general solution method I would use a [hill-climbing algorithm](http://en.wikipedia.org/wiki/Hill_climbing). Since I don't know the details of the algorithm responsible for your current 70% success rate, I can only describe in abstract terms how to adapt hill-climbing to optimize it.

The general principle of hill-climbing is as follows. Ideally, a small change in some numeric parameter of your current algorithm causes a small (roughly linear) change in the resulting success rate. If that holds, first parameterize your current set of rules: decide which numeric parameters in your algorithm may be tweaked to achieve a higher success rate. Once you've decided what they are, the learning process is straightforward. Start with your current algorithm, generate a variety of candidates with slightly tweaked parameters, and run your simulations to evaluate the performance of this new set of candidates. Pick the best one as your next starting point, and repeat until the algorithm can't get any better (see the first sketch below).

If your algorithm is a set of if-then rules (this includes rule-matching systems), and improving its performance involves reordering or restructuring those rules, then you may want to consider [genetic algorithms](http://en.wikipedia.org/wiki/Genetic_algorithms), which are a little more complex. To apply genetic algorithms, it is essential to define the mutation and crossover operators so that a single application of mutation or crossover results in a small change in overall performance, while many applications result in a large change. I'm not an expert in this field, but a search for "genetic algorithms on decision trees" should turn up plenty of material. The pitfall to avoid: if your mutation operator simply swaps branches of a decision tree, a single application might modify the root of the tree and produce a huge performance difference. That typically adds too much noise for a genetic algorithm, so be very careful about how you encode your operators (see the second sketch below).

Note that these are two very popular AI methods for learning or improving an existing algorithm. You would do all of the simulation and learning offline, then simply deploy the resulting, learned algorithm.
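Here is a minimal Python sketch of that hill-climbing loop. The parameter names and the `simulate` function are illustrative assumptions, not part of your code: `simulate` stands in for your own offline simulations and should return the measured win rate for a given parameter set.

```python
import random

def simulate(params):
    """Stand-in for your real offline simulation: run many games with these
    parameters and return the measured win rate (0.0-1.0). Here it is just a
    toy function with a single peak so the example runs end to end."""
    return 1.0 - abs(params["aggression"] - 0.62) - abs(params["risk"] - 0.31)

def neighbors(params, step=0.05, count=20):
    """Generate candidates that differ from `params` by one small tweak."""
    for _ in range(count):
        candidate = dict(params)
        key = random.choice(list(candidate))
        candidate[key] += random.uniform(-step, step)
        yield candidate

def hill_climb(start, max_rounds=200):
    best, best_score = start, simulate(start)
    for _ in range(max_rounds):
        round_best, round_score = best, best_score
        for candidate in neighbors(best):
            score = simulate(candidate)
            if score > round_score:
                round_best, round_score = candidate, score
        if round_score <= best_score:   # no neighbour improved: local optimum
            break
        best, best_score = round_best, round_score
    return best, best_score

# Start from the parameters of your current ~70% algorithm (names are made up).
print(hill_climb({"aggression": 0.5, "risk": 0.7}))
```

The loop only accepts a candidate when the simulated win rate improves and stops at a local optimum; restarting from a few different starting points is a common way to avoid getting stuck in a poor one.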
 

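And a minimal Python sketch of the genetic-algorithm variant over an ordered list of if-then rules. Everything here is an assumption for illustration: the rule identifiers, the toy `evaluate` fitness function (which you would replace with your simulations), and the operator choices. The point is that mutation (swap two adjacent rules) and crossover (order crossover on the rule sequence) each make only a small change to a candidate, in line with the advice above.

```python
import random

RULES = [f"rule_{i}" for i in range(8)]          # placeholder rule identifiers
TARGET = random.sample(RULES, len(RULES))        # hidden "ideal" order (toy only)

def evaluate(order):
    """Stand-in for your simulations: fitness of a rule ordering. Here it
    counts how many rules sit in the same slot as the hidden target order,
    purely so the example runs."""
    return sum(a == b for a, b in zip(order, TARGET))

def mutate(order):
    """Small change: swap two adjacent rules."""
    i = random.randrange(len(order) - 1)
    child = list(order)
    child[i], child[i + 1] = child[i + 1], child[i]
    return child

def crossover(a, b):
    """Order crossover: keep a prefix of parent a, then append the remaining
    rules in the order parent b uses them, so the child is still a valid
    permutation of the rule set."""
    cut = random.randint(1, len(a) - 1)
    head = a[:cut]
    return head + [r for r in b if r not in head]

def evolve(pop_size=30, generations=200):
    population = [random.sample(RULES, len(RULES)) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=evaluate, reverse=True)
        parents = population[: pop_size // 2]    # simple truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            children.append(mutate(crossover(a, b)))
        population = parents + children
    return max(population, key=evaluate)

print(evaluate(evolve()), "rules in the ideal position")
```

Truncation selection is used here only for brevity; tournament or roulette-wheel selection are common alternatives and tend to preserve more diversity.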