
Thursday, 24 November 2011

Mission accomplished

Rastrigin's benchmark function is far more photogenic than I am
I finally got my Master's Degree in computer science, defending a strongly interdisciplinary thesis on GPGPU-powered parameter estimation in biochemical systems. The method I propose revolves around my favourite bio-inspired optimization technique, Particle Swarm Optimization, whose fitness function relies on simulations produced by Gillespie's stochastic simulation algorithm. But that's just the surface: there's a lot of complex stuff "under the hood" and plenty of room for future developments.
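To give an idea of the kind of simulation the fitness function builds on, here's a minimal sketch of Gillespie's direct method for a single toy reaction A + B -> C; the species, rate constant and time horizon are invented for illustration, and the real thesis code is of course far more elaborate.

```cpp
#include <cmath>
#include <iostream>
#include <random>

// Minimal sketch of Gillespie's direct method (SSA) for one reaction:
// A + B -> C with stochastic rate constant c. All numbers are made up.
int main() {
    double c = 0.005;                 // hypothetical stochastic rate constant
    long a = 1000, b = 800, prod = 0; // initial molecule counts (invented)
    double t = 0.0, t_end = 10.0;

    std::mt19937 rng(42);
    std::uniform_real_distribution<double> uni(0.0, 1.0);

    while (t < t_end) {
        double propensity = c * a * b;  // a0: only one reaction channel here
        if (propensity <= 0.0) break;   // no more reactions can fire

        // Waiting time to the next reaction: exponential with rate a0.
        double tau = -std::log(uni(rng)) / propensity;
        t += tau;

        // Fire the (only) reaction: consume one A and one B, produce one C.
        --a; --b; ++prod;
    }
    std::cout << "t=" << t << "  A=" << a << "  B=" << b << "  C=" << prod << "\n";
}
```

In a parameter-estimation setting, each PSO particle would encode candidate rate constants like c, and its fitness would measure how well trajectories simulated this way match the experimental time series.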

It has been a very interesting experience, enlightening in many ways; the most important lessons were human ones: I met, and worked with, some wonderful people from whom I've learned a lot, people I'll never cease to admire and be grateful to.

It's kinda funny, anyway: I ran away from chemistry in high school, and now I'm back into it. Strange to see how different routes ultimately lead to the same place.

Friday, 11 February 2011

A softer way of life

Soft computing course: completed! A sparkling 30/30 cum laude closes an astonishing and inspiring experience.

I chose this course primarily because of simulated annealing and neural networks, since I had studied these techniques only in a very formal way (cybernetics) and from a philosophical point of view (epistemology). I was interested in a more concrete approach, and I found it here. But there was much more!

I discovered a wide range of special neural networks, Hopfield's and Kohonen's in particular, which show several interesting capabilities I'd like to experiment with soon. I finally studied the infamous back-propagation algorithm, which can learn non-linearly separable tasks, and that was good. But despite my initial expectations, these weren't the most interesting parts.

Genetic algorithms, for instance. It's funny to think of data structures "sharing genetic material", maybe even choosing partners according to their own mating preferences (Booker's model), with their children subject to random mutations. Believe it or not, that funny stuff works! My experimental data clearly show that GAs are a strong and robust way to tackle hard (NP) problems. The drawback is the difficulty of modelling the solution search space with simple encoded strings. Moreover, a genetic algorithm finds a global optimum for a specific problem, not for whole classes of problems.
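Just to show how simple the basic recipe is, here's a minimal sketch of a generational GA maximizing the number of 1s in a bit string (the classic OneMax toy, chosen purely for illustration); tournament selection, one-point crossover and bit-flip mutation are the usual textbook operators, and every parameter value here is arbitrary.

```cpp
#include <algorithm>
#include <iostream>
#include <random>
#include <vector>

// Minimal generational GA on OneMax: maximize the number of 1 bits.
// All parameter values are arbitrary illustrations.
using Genome = std::vector<int>;

int fitness(const Genome& g) { return static_cast<int>(std::count(g.begin(), g.end(), 1)); }

int main() {
    const int pop_size = 50, genome_len = 64, generations = 100;
    const double p_mut = 0.01;
    std::mt19937 rng(1);
    std::uniform_int_distribution<int> bit(0, 1), idx(0, pop_size - 1),
                                       cut(1, genome_len - 1);
    std::uniform_real_distribution<double> uni(0.0, 1.0);

    // Random initial population.
    std::vector<Genome> pop(pop_size, Genome(genome_len));
    for (auto& g : pop) for (auto& b : g) b = bit(rng);

    for (int gen = 0; gen < generations; ++gen) {
        std::vector<Genome> next;
        while (static_cast<int>(next.size()) < pop_size) {
            // Binary tournament selection of two parents.
            auto pick = [&] {
                const Genome& a = pop[idx(rng)];
                const Genome& b = pop[idx(rng)];
                return fitness(a) > fitness(b) ? a : b;
            };
            Genome child = pick();
            const Genome mate = pick();
            // One-point crossover: copy the mate's tail past the cut point.
            int c = cut(rng);
            std::copy(mate.begin() + c, mate.end(), child.begin() + c);
            // Bit-flip mutation.
            for (auto& b : child) if (uni(rng) < p_mut) b = 1 - b;
            next.push_back(std::move(child));
        }
        pop = std::move(next);
    }
    auto best = *std::max_element(pop.begin(), pop.end(),
        [](const Genome& a, const Genome& b) { return fitness(a) < fitness(b); });
    std::cout << "best fitness: " << fitness(best) << "/" << genome_len << "\n";
}
```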

So, here comes Genetic Programming. Yep, double upper case. Whereas GAs were funny because of data sharing genetic information, what we have here are copulating programs... so nice! Massive derivation trees create brand-new offspring generation after generation, letting new abilities emerge and spontaneously dropping the useless ones (a process called automatic feature selection). The way GP works immediately calls for a LISP implementation, and that's what I tried to do... even though I'm still stuck on a purely functional, recursive crossover function.
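For the record, subtree crossover itself is not that scary when sketched imperatively; here's an illustrative C++ toy (not the LISP version I'm stuck on) where the tree representation, the node labels and the two hand-built parents are all made up.

```cpp
#include <iostream>
#include <memory>
#include <random>
#include <string>
#include <vector>

// Toy GP individuals: binary expression trees with string labels.
struct Node {
    std::string label;
    std::unique_ptr<Node> left, right;
};

// Collect a pointer to every "slot" holding a subtree, so a crossover
// point can be drawn uniformly among all subtrees.
void collect(std::unique_ptr<Node>& slot, std::vector<std::unique_ptr<Node>*>& out) {
    if (!slot) return;
    out.push_back(&slot);
    collect(slot->left, out);
    collect(slot->right, out);
}

// Subtree crossover: pick one random subtree in each parent and swap them.
void crossover(std::unique_ptr<Node>& a, std::unique_ptr<Node>& b, std::mt19937& rng) {
    std::vector<std::unique_ptr<Node>*> sa, sb;
    collect(a, sa);
    collect(b, sb);
    if (sa.empty() || sb.empty()) return;
    auto pick = [&](auto& v) {
        std::uniform_int_distribution<size_t> d(0, v.size() - 1);
        return v[d(rng)];
    };
    std::swap(*pick(sa), *pick(sb));  // exchange the two chosen subtrees
}

int main() {
    std::mt19937 rng(7);
    // Two tiny hand-built parents: (x + y) and (x * 2).
    auto leaf = [](const char* s) {
        auto n = std::make_unique<Node>();
        n->label = s;
        return n;
    };
    auto node = [&](const char* s, std::unique_ptr<Node> l, std::unique_ptr<Node> r) {
        auto n = leaf(s);
        n->left = std::move(l);
        n->right = std::move(r);
        return n;
    };
    auto p1 = node("+", leaf("x"), leaf("y"));
    auto p2 = node("*", leaf("x"), leaf("2"));
    crossover(p1, p2, rng);  // p1 and p2 now carry swapped subtrees
    std::cout << "roots after crossover: " << p1->label << ", " << p2->label << "\n";
}
```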

In the end, my personal favourite model, the one I dedicated my blog to: Particle Swarm Optimization. Yeah, triple upper case. As I wrote in the past, I love the spectrum of emergent behaviours this bunch of simple coordinates shows. Just like Conway's Game of Life, there seem to be unpredictable lifeforms living inside the model, and that is cool. Moreover, it can be quickly extended:


This example shows an interesting kind of PSO that Prof. Vanneschi proposed, named Multiswarm Repulsive PSO (or MRPSO). It features several swarms living on different "islands", all looking for the same global optimum. Every few generations, the best individuals are cloned and migrate to the next island, where they replace the worst ones. Moreover, on some islands the particles run away from the best individual instead of following its "pheromone trails". This approach addresses two possible drawbacks of PSO: the lack of diversity and the tendency of individuals to blindly follow the leader, which keeps them away from interesting unexplored zones of the search space. The performance boost is stunning, and there's still plenty of room for inventing new ways to improve it!
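Here's a rough sketch of the multiswarm idea as I understood it: a ring of islands, a standard velocity update, a flipped sign on the social term for the repulsive islands, and periodic migration of each island's best onto the worst particle of the next one. The benchmark (a 2-D sphere function), the constants and the migration interval are all arbitrary choices of mine, not taken from the course material.

```cpp
#include <algorithm>
#include <iostream>
#include <random>
#include <vector>

// Illustrative sketch of a multiswarm repulsive PSO on the 2-D sphere
// function; island count, constants and migration policy are all invented.
struct Particle {
    std::vector<double> x, v, best_x;
    double best_f = 1e9;
};

double sphere(const std::vector<double>& x) {
    double s = 0.0;
    for (double xi : x) s += xi * xi;
    return s;
}

int main() {
    const int dims = 2, islands = 3, per_island = 20, iters = 200;
    const double w = 0.7, c1 = 1.5, c2 = 1.5;
    std::mt19937 rng(3);
    std::uniform_real_distribution<double> pos(-5.0, 5.0), uni(0.0, 1.0);

    std::vector<std::vector<Particle>> swarm(islands);
    std::vector<std::vector<double>> gbest_x(islands, std::vector<double>(dims));
    std::vector<double> gbest_f(islands, 1e9);

    for (auto& isl : swarm)
        for (int i = 0; i < per_island; ++i) {
            Particle p;
            for (int d = 0; d < dims; ++d) { p.x.push_back(pos(rng)); p.v.push_back(0.0); }
            p.best_x = p.x;
            isl.push_back(p);
        }

    for (int it = 0; it < iters; ++it) {
        for (int s = 0; s < islands; ++s) {
            // Odd-numbered islands are "repulsive": particles flee the leader.
            double social_sign = (s % 2 == 1) ? -1.0 : 1.0;
            for (auto& p : swarm[s]) {
                double f = sphere(p.x);
                if (f < p.best_f) { p.best_f = f; p.best_x = p.x; }
                if (f < gbest_f[s]) { gbest_f[s] = f; gbest_x[s] = p.x; }
                for (int d = 0; d < dims; ++d) {
                    p.v[d] = w * p.v[d]
                           + c1 * uni(rng) * (p.best_x[d] - p.x[d])
                           + social_sign * c2 * uni(rng) * (gbest_x[s][d] - p.x[d]);
                    p.x[d] += p.v[d];
                }
            }
        }
        // Every 25 iterations (an arbitrary interval), each island's best
        // is cloned onto the worst particle of the next island in the ring.
        if (it % 25 == 24) {
            for (int s = 0; s < islands; ++s) {
                Particle migrant;
                migrant.x = migrant.best_x = gbest_x[s];
                migrant.v = std::vector<double>(dims, 0.0);
                migrant.best_f = gbest_f[s];
                auto& dest = swarm[(s + 1) % islands];
                size_t worst = 0;
                for (size_t i = 1; i < dest.size(); ++i)
                    if (sphere(dest[i].x) > sphere(dest[worst].x)) worst = i;
                dest[worst] = migrant;
            }
        }
    }
    std::cout << "best fitness found: "
              << *std::min_element(gbest_f.begin(), gbest_f.end()) << "\n";
}
```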

Thursday, 25 November 2010

Ants gone wild!



I'm currently attending the soft computing course, and it's terrific! I always loved stuff like genetic algorithms, neural networks and such, but I wasn't prepared for particle swarms: they've been a shocking revelation. I really fell in love with them: I adore the wide spectrum of emergent behaviours those bunches of coordinates exhibit, and I feel overwhelmed when I think about creative uses of the model.

Because of the nature of soft computing's solution spaces, it came naturally to me to develop everything in C++, using templates and wild meta-programming. Now I have a set of completely general algorithms, and in particular an extended PSO that automatically adapts to any kind of solution and its corresponding objective function. Everything fits seamlessly into my PLT graphic framework, which lets me visually check its quality on traditional benchmark functions: Rastrigin's and Schaffer's F6. The swarm reliably finds its way to the optimum, and it looks remarkably natural while doing so.
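To give an idea of what the template-based approach boils down to, here's a minimal sketch of a PSO parametrized on its objective function, tried on a 2-D Rastrigin function; it's a simplified toy, not my PLT-integrated implementation, and all constants are arbitrary.

```cpp
#include <cmath>
#include <iostream>
#include <limits>
#include <random>
#include <vector>

// Minimal PSO templated on the objective; a simplified toy, not the real
// PLT-integrated implementation. All constants are arbitrary.
template <typename Objective>
std::vector<double> pso(Objective f, int dims, int n_particles, int iters,
                        double lo, double hi, std::mt19937& rng) {
    const double w = 0.72, c1 = 1.49, c2 = 1.49;  // standard-ish coefficients
    std::uniform_real_distribution<double> pos(lo, hi), uni(0.0, 1.0);

    std::vector<std::vector<double>> x(n_particles, std::vector<double>(dims)),
                                     v = x, pbest = x;
    std::vector<double> pbest_f(n_particles), gbest;
    double gbest_f = std::numeric_limits<double>::infinity();

    // Random initialization and first evaluation.
    for (int i = 0; i < n_particles; ++i) {
        for (int d = 0; d < dims; ++d) { x[i][d] = pos(rng); v[i][d] = 0.0; }
        pbest[i] = x[i];
        pbest_f[i] = f(x[i]);
        if (pbest_f[i] < gbest_f) { gbest_f = pbest_f[i]; gbest = x[i]; }
    }

    for (int it = 0; it < iters; ++it)
        for (int i = 0; i < n_particles; ++i) {
            // Classic velocity update: inertia + cognitive + social term.
            for (int d = 0; d < dims; ++d) {
                v[i][d] = w * v[i][d]
                        + c1 * uni(rng) * (pbest[i][d] - x[i][d])
                        + c2 * uni(rng) * (gbest[d] - x[i][d]);
                x[i][d] += v[i][d];
            }
            double fx = f(x[i]);
            if (fx < pbest_f[i]) { pbest_f[i] = fx; pbest[i] = x[i]; }
            if (fx < gbest_f)    { gbest_f = fx; gbest = x[i]; }
        }
    std::cout << "best fitness: " << gbest_f << "\n";
    return gbest;
}

int main() {
    std::mt19937 rng(123);
    // Rastrigin's function: global minimum 0 at the origin.
    auto rastrigin = [](const std::vector<double>& x) {
        const double pi = std::acos(-1.0);
        double s = 10.0 * x.size();
        for (double xi : x) s += xi * xi - 10.0 * std::cos(2.0 * pi * xi);
        return s;
    };
    auto best = pso(rastrigin, 2, 40, 500, -5.12, 5.12, rng);
    std::cout << "best point: (" << best[0] << ", " << best[1] << ")\n";
}
```

Because the objective is just a template parameter, swapping Rastrigin for Schaffer's F6 (or for any user-defined solution type with its own fitness) only means passing a different callable.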