DHH recently posted a rant about how Rails is all ready for extreme multi-cores, and David Fayram has responded with some problems with the process-scaling argument.
One of Fayram's biggest complaints is that for an 8-processor machine you'll need 1.6GB of RAM, because each Rails process is 200MB. Whoa, I'm not sure which planet he's on; my Rails processes are way smaller than that. They start at about 30MB and top out around 50MB. 8 * 50MB = 400MB, which is pretty reasonable even on current machines, and in a few years we'll have plenty of memory on our multicore systems.
This is funny, because I just read a post on Digg linking to an article about an 8-processor machine with 128GB of RAM.
Now, this isn't to say that I agree with either of them. I think this is going to be an interesting time going forward, and there will be a multitude of different applications that need different approaches. Hopefully we can stay away from threads as much as possible, because threaded code can produce some really nasty bugs that are hard to track down.
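To show the kind of nasty bug I mean, here's a minimal sketch in Ruby (the names and counts are just illustrative): a counter bumped from several threads. The `count += 1` is a read-modify-write, so without a lock, threads can interleave between the read and the write and silently lose updates, and whether it happens depends on scheduling. The `Mutex` version serializes the increment and is always correct.

```ruby
THREADS = 8
ITERATIONS = 10_000

# Unsynchronized: each += can be preempted between reading the old
# value and writing the new one, so updates can be lost.
unsafe = 0
THREADS.times.map do
  Thread.new { ITERATIONS.times { unsafe += 1 } }
end.each(&:join)
# `unsafe` may come out less than 80_000, and it varies run to run.

# Synchronized: the Mutex makes each increment atomic.
lock = Mutex.new
safe = 0
THREADS.times.map do
  Thread.new { ITERATIONS.times { lock.synchronize { safe += 1 } } }
end.each(&:join)
# `safe` is always 80_000.
```

The really nasty part is that the unsynchronized version often passes in testing and only loses updates under load, which is exactly why these bugs are so hard to solve.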
I think we need a new language to deal with parallel processing, something like Erlang, probably. Our brains, or rather, the brains of programmers who are coding now, don't handle parallel processing very well at all. However, we are bringing up a whole new generation of kids who are getting better and better at parallel processing, and I feel that the answers are going to come from them.
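The appeal of the Erlang model is that "processes" share nothing and only talk by sending messages. You can sketch that style even in Ruby today using a thread with a `Queue` as its mailbox (the worker and message names here are made up for illustration):

```ruby
# Erlang-style message passing: the worker owns its own state and
# communicates only through queues, so there's no shared data to lock.
mailbox = Queue.new   # messages going to the worker
replies = Queue.new   # results coming back

worker = Thread.new do
  loop do
    msg = mailbox.pop          # block until a message arrives
    break if msg == :stop
    replies << msg * msg       # do some work, send the result back
  end
end

(1..5).each { |n| mailbox << n }
mailbox << :stop
worker.join

results = []
results << replies.pop until replies.empty?
# results is [1, 4, 9, 16, 25]
```

Because nothing is shared, there's nothing to lock and no race to lose sleep over; the queues do all the coordination.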
So, all you programmers out there, start teaching your kids Erlang, Lisp, Prolog, and every other crazy language that might be useful for parallel programming. One of those kids is going to come up with the right way to do it.
I know it.