
Re: Re: A Beginning Guide To Evolutionary Algorithms

by blokhead (Monsignor)
on Oct 14, 2003 at 15:05 UTC ( [id://299136] )

in reply to Re: A Beginning Guide To Evolutionary Algorithms
in thread A Beginning Guide To Evolutionary Algorithms

Evolutionary algorithms assume that the route to an optimal solution is paved with progressively better solutions.
This isn't always true. When it is the case, every mutation that increases fitness is a move in the "right direction", so you just have one big fitness hill. But then you don't even need real evolution! Just take one individual and examine possible single mutations until you find something better; keep repeating, and you're guaranteed to reach an optimal solution in the end. We called these "stochastic hill-climbers," IIRC.
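A stochastic hill-climber needs no population at all. Here's a minimal sketch in Perl, using ONE-MAX as the fitness function (the bit-string length and mutation scheme are just illustrative choices):

```perl
use strict;
use warnings;

# ONE-MAX fitness: count the 1s in the bit string
sub fitness { my $s = shift; return ($s =~ tr/1//) }

# flip one random bit
sub mutate {
    my $s = shift;
    my $i = int rand length $s;
    substr($s, $i, 1) = substr($s, $i, 1) eq '1' ? '0' : '1';
    return $s;
}

my $individual = '0' x 16;
until (fitness($individual) == length $individual) {
    my $candidate = mutate($individual);
    # keep the mutant only if it's strictly better: always "uphill"
    $individual = $candidate if fitness($candidate) > fitness($individual);
}
print "$individual\n";   # 1111111111111111
```

Because ONE-MAX has a single fitness hill, this loop is guaranteed to terminate at the optimum.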

Notice that the ONE-MAX example is like this. Every new 1-character you get in the string gene is a step in the right direction, up the hypercube so to speak. This is probably why ONE-MAX makes such an easy first example: it has no deception at all.

This is not true for some problems, so it's important to keep that in mind.
You're exactly right here. I wanted to imply this by mentioning suboptimal fitness hills, but it's probably better to say it explicitly. By climbing up a suboptimal hill, you are taking steps "up", but you won't get to an optimal solution. To get to an optimal solution, you must first take steps "down" the hill. So the route to an optimal solution isn't paved with progressively better things. This is why we don't want to kill off lower-fitness individuals as soon as we lay eyes on them in hard problems ;) .. if they have a chance of breeding, they might even lead us up a different fitness hill than the one we're stuck on.
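To see how a suboptimal hill traps a pure hill-climber, consider a made-up "trap" fitness function: it rewards 1s, but the all-zeros string is secretly the global optimum. From the all-ones string, every single bit-flip is a step down, so a strictly-uphill climber can never escape:

```perl
use strict;
use warnings;

# made-up "trap" fitness: count of 1s, except the all-zeros string
# scores highest of all
sub fitness {
    my $s = shift;
    my $ones = ($s =~ tr/1//);
    return $ones == 0 ? length($s) + 1 : $ones;
}

my $n = 8;
my $stuck = '1' x $n;   # the suboptimal hilltop

# every one-bit neighbor of all-ones is strictly worse
my $is_local_optimum = 1;
for my $i (0 .. $n - 1) {
    my $neighbor = $stuck;
    substr($neighbor, $i, 1) = '0';
    $is_local_optimum = 0 if fitness($neighbor) >= fitness($stuck);
}
print "all-ones is a local optimum: $is_local_optimum\n";
print "but fitness('0' x $n) = ", fitness('0' x $n),
      " beats fitness('1' x $n) = ", fitness($stuck), "\n";
```

Getting from the all-ones hilltop to the true optimum means passing through $n - 1 strictly worse strings, which is exactly the "steps down" a hill-climber refuses to take.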
One more thing: at my school we call them genetic algorithms
I never got most of these silly and similar-sounding terms straight in my head! According to the EA FAQ, EA encompasses genetic algorithms. However, the pseudocode they give for both seems the same, so I honestly don't know the difference. It seems pretty safe to interchange the two terms, and definitely within the structure laid out in my writeup (which follows the pseudocode of both).



Replies are listed 'Best First'.
Re: Re: Re: A Beginning Guide To Evolutionary Algorithms
by biosysadmin (Deacon) on Oct 15, 2003 at 12:44 UTC
    I don't think I said exactly what I meant in my first reply. I didn't mean that each individual mutation must increase fitness for an individual or for a population. Here's a more specific explanation:

    There are some sets of parameters that perform well when combined together in the same individual. One hypothesis of evolutionary algorithms is that the combination of these good "building blocks" will form an overall more fit individual. There exist problems that violate this hypothesis, and they are therefore very difficult for a simple genetic algorithm.
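    The building-block idea is what crossover is for: splice two parents together and hope the child inherits a good block from each. A sketch of one-point crossover (the parent strings and cut point below are contrived for illustration):

```perl
use strict;
use warnings;

# one-point crossover: the child takes the first $point genes from
# one parent and the rest from the other
sub crossover {
    my ($p1, $p2, $point) = @_;
    return substr($p1, 0, $point) . substr($p2, $point);
}

# parent A carries a fit left half, parent B a fit right half;
# a cut in the middle combines both building blocks
my $child = crossover('11110000', '00001111', 4);
print "$child\n";   # 11111111
```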

    If you do a Google search for the "Minimal Deceptive Problem," you should find some references for this. In general, these deceptive problems are very rare and are also easy to spot if you track the evolution of your population over time.
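    The flavor of the Minimal Deceptive Problem fits in a two-bit fitness table (the numbers below are illustrative, not Goldberg's originals): 11 is the global optimum, but strings starting with 0 are better on average than strings starting with 1, so selection on the first bit is pulled the wrong way:

```perl
use strict;
use warnings;
use List::Util qw(sum);

# illustrative deceptive two-bit fitness table
my %f = ( '00' => 0.9, '01' => 0.8, '10' => 0.1, '11' => 1.0 );

# 11 is the global optimum...
my ($best) = sort { $f{$b} <=> $f{$a} } keys %f;
print "global optimum: $best\n";

# ...but the first-bit schemata deceive: 0* looks better than 1*
my $avg0 = sum(@f{'00', '01'}) / 2;   # 0.85
my $avg1 = sum(@f{'10', '11'}) / 2;   # 0.55
printf "avg(0*) = %.2f, avg(1*) = %.2f\n", $avg0, $avg1;
```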

    Hope this cleared my comment up. Again, awesome guide. :)