Francis Cianfrocca wrote:
Edward Lee makes many interesting points, none more so than in the section
"Coordination Languages" near the bottom of the paper. He points out that,
essentially due to inertia, many new models have been proposed but not
adopted. (Side point: I *really* like Erlang, which Lee mentions at several
points. What a beautiful design.) To this point, I'd add the following:
necessity drives uptake. Some of the approaches to the scalable-design
problem will emerge into common use simply because they have to, and the
leaders who take the risks will be well rewarded. The problem is becoming
urgent because of the rise of multicore hardware. What's really nice about
all this is that we will soon have the tools to build applications that
haven't even been imagined yet.
I'll add another hopefully provocative point (wrapped in an homage to
Fortran): I don't know what the coordination language will look like, but I
do know what it will be named: Ruby!
I'll chime in and say that my opinion lies in this direction:
http://rubyurl.com/DJB
I'm not sure a 'coordination language' is the right direction, but I
do think that threads are a deeply flawed model.
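To make that concrete, here's a minimal sketch (a toy of my own, not from
Lee's paper) of the classic lost-update race with plain Ruby threads: two
threads increment a shared counter, and because the read-modify-write is
not atomic, increments can be lost.

  counter = 0

  threads = 2.times.map do
    Thread.new do
      100_000.times do
        value = counter          # read
        counter = value + 1      # write; the other thread may have run in between
      end
    end
  end
  threads.each(&:join)

  puts counter   # frequently less than 200_000

Whether the loss shows up on a given run depends on the interpreter's
thread scheduling, but nothing in the code prevents it, and that is
exactly the problem.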
An earlier issue of Computer made some strong arguments in favor of
transactions, as well.
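For the curious, here's a toy sketch of the transactional idea in Ruby.
This is not any real STM library, and the names (TxCell, update) are made
up for illustration: read a snapshot, compute with no locks held, and
commit only if nothing changed underneath you, retrying otherwise.

  require 'thread'

  class TxCell
    def initialize(value)
      @value = value
      @lock  = Mutex.new
    end

    attr_reader :value

    # Optimistic update: commits only if the cell still holds the
    # snapshot we computed from; otherwise retries with a fresh one.
    # (Ignores the ABA problem, which can't bite a monotonic counter.)
    def update
      loop do
        snapshot = @value            # read without locking
        result   = yield(snapshot)   # pure computation, no locks held
        @lock.synchronize do
          if @value == snapshot
            @value = result
            return result
          end
        end
        # lost the race; loop and retry
      end
    end
  end

  cell = TxCell.new(0)
  threads = 4.times.map do
    Thread.new { 25_000.times { cell.update { |v| v + 1 } } }
  end
  threads.each(&:join)
  puts cell.value   # 100_000 -- every increment commits exactly once

The appeal over raw locks is composability: the computation inside the
block never holds a lock, so it can't deadlock, and a conflict costs only
a retry.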
I've been looking for a place to jump into this -- er -- thread -- and this looks as good a place as any. Let me take a meta position here, as someone who's been in computing, mostly scientific, for over 40 years.
1. We want to solve *big* problems. Whether it's keeping track of millions of people's accounts, emulating the Big Bang, designing cures for genetic illnesses, beating some arrogant chess grandmaster, proving theorems that have defied humans (in some cases for centuries), predicting the path of hurricanes, or maintaining a complete collection of Western classical music on a piece of plastic the size of a human thumb, our appetite is for bigger and bigger problems.
2. There are *two* fundamental limits to our ability to solve big problems. The hardware/technology limit is that we can only make transistors so small before they stop behaving as transistors and become something totally useless for building a digital computer.
The second limit is more profound. The software/human limit is that some problems are provably impossible to solve in software at all (the halting problem is the classic example), while others are solvable in principle but their solution time grows explosively with the size of the problem; see the sketch after this list.
3. The evolutions and revolutions in scientific and commercial computing in the four decades I've been in the business have been mixes of "general-purpose" and "special-purpose" hardware, languages and algorithms.
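As promised, a sketch of that second limit, in Ruby: brute-force subset
sum checks all 2**n subsets, so each added item doubles the work. At
n = 60 that's more than 10**18 subsets, hopeless no matter how many cores
you own. (Toy code for illustration; subset_sum? is my own name.)

  # Does any subset of items sum to target? Tries every bitmask.
  def subset_sum?(items, target)
    (0...(1 << items.size)).any? do |mask|
      sum = 0
      items.each_with_index { |x, i| sum += x if mask[i] == 1 }
      sum == target
    end
  end

  puts subset_sum?([3, 34, 4, 12, 5, 2], 9)   # => true (4 + 5)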
So what does all this mean for Ruby, threads, multi-core processors and the users thereof?
1. Ruby is decidedly a general-purpose language and environment. I don't think it's realistic to expect Ruby to solve large sets of equations, either numerically or symbolically, act as a synthesizer, or run a hard real-time process control application. Because it is general purpose, you *could* do these things in Ruby on an Intel PC running Windows or Linux, but there are better ways to do them.
2. Threads are here to stay. So are monitors, semaphores, shared memory, distributed memory, message passing, massively parallel SIMD machines, symmetric and asymmetric multiprocessing, DSP chips, 64-bit address spaces, IEEE floating point arithmetic, disk drives sealed off from the outside world, interrupts, Windows, Linux and probably BSD. So are Ruby, Perl, PHP, Python, R, Lisp, Fortran, C, Java, Forth and .NET. So are both proprietary and open source software. 
3. My next computer will be a multi-core 64-bit machine with hardware virtualization support running both Linux and Windows in some kind of virtualized manner. Until I can afford that, I'll keep my current stable of 32-bit machines and spend my money on food, clothing, shelter and transportation.
By then, I will have learned Ruby, and there will be a Ruby virtual machine capable of doing all the tasks in point 1 efficiently on this hardware. Maybe there will even be world peace and a commitment to deal with global warming.
Speaking of World Peace, for those in the USA, Happy Memorial Day.
<ducking>
--
M. Edward (Ed) Borasky
http://linuxcapacityplanning.com