[OT] good article on threads

given some of the recent discussions on threads i thought some of you might
like this

   http://www.computer.org/portal/site/computer/menuitem.5d61c1d591162e4b0ef1bd108bcd45f3/index.jsp?&pName=computer_level1_article&TheCat=1005&path=computer/homepage/0506&file=cover.xml&xsl=article.xsl

-a

···

--
what science finds to be nonexistent, we must accept as nonexistent; but what
science merely does not find is a completely different matter... it is quite
clear that there are many, many mysterious things.
- h.h. the 14th dalai lama

Mildly interesting. I think the popularity of using threads comes from the fact that they're a much easier concept to wrap your head around. They also let you introduce parallelism as an afterthought - dispatching tasks into a worker thread pool at the points where you can hang on IO isn't really complicated, etc. Deterministic concurrency models require a more premeditated approach, and more effort to achieve the same levels of parallelism in a program. You practically have to consider "thread synchronisation" (i.e. the message passing itself in that model) as a fundamental design element instead of an edge case to keep from breaking.
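To make the "afterthought" point concrete, here's roughly the worker-pool pattern I mean, as a toy Ruby sketch (the jobs are just placeholder lambdas); you can bolt this onto existing blocking code without rethinking the design:

    require 'thread'   # Queue lives here on older Rubies

    queue = Queue.new

    # Four workers drain the queue; each job is simply a block to call,
    # e.g. something that hangs on IO.
    workers = Array.new(4) do
      Thread.new do
        while (job = queue.pop) != :stop
          job.call
        end
      end
    end

    10.times do |i|
      queue << lambda { sleep 0.1; puts "job #{i} done" }
    end
    4.times { queue << :stop }   # one stop marker per worker
    workers.each { |w| w.join }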

Deterministic concurrency might be The Right Thing in being less error-prone, but sometimes the Hack That Works is more practical. (Cf. ML-like type system "versus" latent typing.)

This reminds me, I should spend some time in Erlang and Nemerle land.

David Vallner

···

ara.t.howard@noaa.gov wrote:

given some of the recent discussions on threads i thought some of you might
like this

  http://www.computer.org/portal/site/computer/menuitem.5d61c1d591162e4b0ef1bd108bcd45f3/index.jsp?&pName=computer_level1_article&TheCat=1005&path=computer/homepage/0506&file=cover.xml&xsl=article.xsl

-a

"To offer another analogy, a folk definition of insanity is to do the
same thing over and over again and expect the results to be different.
By this definition, we in fact require that programmers of
multithreaded systems be insane. Were they sane, they could not
understand their programs."

I like this :wink:

I should look more into multithreaded software :slight_smile:

Michal

···

On 8/31/06, ara.t.howard@noaa.gov <ara.t.howard@noaa.gov> wrote:

given some of the recent discussions on threads i thought some of you might
like this

   http://www.computer.org/portal/site/computer/menuitem.5d61c1d591162e4b0ef1bd108bcd45f3/index.jsp?&pName=computer_level1_article&TheCat=1005&path=computer/homepage/0506&file=cover.xml&xsl=article.xsl

People know how strongly I feel about this, so I risk the
broken-record syndrome whenever I talk about threads, but to your
point: I've always been fascinated by how much programmers seem to
love threads. Programming threads reminds me of playing GO: it takes
an afternoon to learn the rules, but it can take a whole lifetime to
really achieve mastery.

I used to say that there are only two cases in which threads can be a
good thing: 1) If the structure of the problem is inherently threaded,
and 2) if you have an opportunity to capture system latencies like
disk and network I/O.

After reading the article Ara linked, I think my point 1 is another
way of saying: don't introduce nondeterminacy where it doesn't exist
in the problem domain. David, you refer to this (with Lee) as
deterministic concurrency. Threads really can't help solve these
problems. But things like network servers that really are
nondeterministic are fairly easy to program in threads.
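For example, a thread-per-connection server really is about as simple as it gets. A toy Ruby echo server, just to illustrate the shape of it (host and port are arbitrary):

    require 'socket'

    # Thread-per-connection echo server: client arrivals are genuinely
    # nondeterministic, and one thread per connection maps onto that
    # quite naturally.
    server = TCPServer.new('127.0.0.1', 4000)
    loop do
      client = server.accept           # blocks until a client connects
      Thread.new(client) do |sock|
        while (line = sock.gets)
          sock.write(line)             # echo it back
        end
        sock.close
      end
    end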

My point 2 is something I'm moving away from, now that event-driven
models for programming I/O-heavy applications are coming to the fore.
I'm astonished at how people immediately jump to threads as an easy
abstraction for modeling problems that involve latency but not
nondeterminacy - it's like a reflex response, and it's not a good
thing.
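By contrast, here's a bare-bones sketch of the event-driven shape of the same echo server, using nothing but IO.select on a single thread (a real reactor library wraps this loop up for you, and handles partial reads properly; this is only meant to show the structure):

    require 'socket'

    # Single-threaded reactor: one select loop watches the listening
    # socket and every client, so there is no shared state to
    # synchronise at all.
    server  = TCPServer.new('127.0.0.1', 4000)
    clients = []

    loop do
      readable, = IO.select([server] + clients)
      readable.each do |io|
        if io == server
          clients << server.accept     # new connection
        else
          line = io.gets
          if line
            io.write(line)             # echo
          else
            clients.delete(io)         # client went away
            io.close
          end
        end
      end
    end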

I took a look at Alice (an ML derivative) and I loved its concept of
promises and futures, but these work best in a pure-functional context
where you're not worrying about side-effects.
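For anyone who hasn't run into them, the flavour of a future is easy to fake with a thread. A toy Ruby sketch - nothing like Alice's real dataflow machinery, just the idea:

    # A toy future: the block runs in a background thread and #value
    # blocks until the result is ready.
    class Future
      def initialize(&block)
        @thread = Thread.new(&block)
      end

      def value
        @thread.value   # joins the thread and returns its result
      end
    end

    f = Future.new { sleep 1; 6 * 7 }   # kick off the computation
    puts "doing other work..."
    puts f.value                        # blocks here until it's done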

Erlang is a beautiful thing: do have a look at it if you get a chance.

···

On 9/1/06, David Vallner <david@vallner.net> wrote:
Mildly interesting. I think the popularity of using threads comes from
the fact that they're a much easier concept to wrap your head around. They
also let you introduce parallelism as an afterthought - dispatching
tasks into a worker thread pool at the points where you can hang on IO isn't
really complicated, etc. Deterministic concurrency models require a more
premeditated approach, and more effort to achieve the same levels of
parallelism in a program. You practically have to consider "thread
synchronisation" (i.e. the message passing itself in that model) as a
fundamental design element instead of an edge case to keep from breaking.

Deterministic concurrency might be The Right Thing in being less
error-prone, but sometimes the Hack That Works is more practical. (Cf.
ML-like type system "versus" latent typing.)

This reminds me, I should spend some time in Erlang and Nemerle land.

So I should STOP playing golf???

Or flipping coins? <G>

···

On 9/1/06, Michal Suchanek <hramrach@centrum.cz> wrote:

"To offer another analogy, a folk definition of insanity is to do the
same thing over and over again and expect the results to be different."

--
Rick DeNatale

My blog on Ruby
http://talklikeaduck.denhaven2.com/

Michal Suchanek wrote:

"To offer another analogy, a folk definition of insanity is to do the
same thing over and over again and expect the results to be different.
By this definition, we in fact require that programmers of
multithreaded systems be insane. Were they sane, they could not
understand their programs."

As Rick DeNatale pointed out, the folk analogy is nonsense. The real world just isn't deterministic. But the point is still valid: short of space rays ;), computers are deterministic systems, and using a nondeterministic computation model in programming means you're giving this advantage up.

David Vallner

David Vallner wrote:

Michal Suchanek wrote:

"To offer another analogy, a folk definition of insanity is to do the
same thing over and over again and expect the results to be different.
By this definition, we in fact require that programmers of
multithreaded systems be insane. Were they sane, they could not
understand their programs."

As Rick DeNatale pointed out, the folk analogy is nonsense. The real
world just isn't deterministic. But the point is still valid: short of space
rays ;), computers are deterministic systems, and using a
nondeterministic computation model in programming means you're giving
this advantage up.

David Vallner

Uh ... I beg to differ:

1. As Dijkstra pointed out decades ago, as soon as you have I/O
interrupts you have non-deterministic behavior. The operating system has
the right, but not necessarily the obligation, to try to make the
resulting behavior as deterministic as possible, whatever *that* means. :slight_smile:

2. I'm not sure about the usefulness of truly nondeterministic models,
but using *stochastic* models for large "deterministic" systems is a
*huge* advantage in many cases. It represents the ability to say
something useful about a system's properties rather than being able to
say things like, "if every atom in the universe were a teraflop
supercomputer, I could fully analyze the behavior of a system
0.000000001 times the size of this one." :slight_smile:
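As a trivial illustration of what I mean by getting useful answers out of sampling, here's a Ruby sketch that estimates a property of a perfectly deterministic system (the area of a quarter circle) by random trials rather than exhaustive analysis:

    # Monte Carlo estimate: sample random points in the unit square and
    # count how many land inside the quarter circle of radius 1.
    samples = 100_000
    hits = 0
    samples.times do
      x, y = rand, rand
      hits += 1 if x * x + y * y <= 1.0
    end
    puts 4.0 * hits / samples   # => roughly 3.14, give or take statistical error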

Well, as far as we can tell, but it's a topic of debate. Einstein
didn't much like the idea of God playing dice with the universe.
Maybe he does, maybe he doesn't.

But can we tell the difference? <G>

···

On 9/3/06, David Vallner <david@vallner.net> wrote:

Michal Suchanek wrote:
> "To offer another analogy, a folk definition of insanity is to do the
> same thing over and over again and expect the results to be different.
> By this definition, we in fact require that programmers of
> multithreaded systems be insane. Were they sane, they could not
> understand their programs."
>

As Rick DeNatale pointed out, the folk analogy is nonsense. The real
world just isn't deterministic.

--
Rick DeNatale

My blog on Ruby
http://talklikeaduck.denhaven2.com/

Rick DeNatale wrote:

Michal Suchanek wrote:
> "To offer another analogy, a folk definition of insanity is to do the
> same thing over and over again and expect the results to be different.
> By this definition, we in fact require that programmers of
> multithreaded systems be insane. Were they sane, they could not
> understand their programs."
>

As Rick DeNatale pointed out, the folk analogy is nonsense. The real
world just isn't deterministic.

Well, as far as we can tell, but it's a topic of debate. Einstein
didn't much like the idea of God playing dice with the universe.
Maybe he does, maybe he doesn't.

But can we tell the difference? <G>

Exactly.

I never liked that definition because it implicitly ignored things
that are low-probability.

The lower an event's probability, the harder it is to distinguish
it from "impossible."

If an event has probability 0.001, I might try it 500 times or so
before I "seriously" expected it to turn up. That's more than
enough to classify me as insane.
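A quick back-of-the-envelope check in Ruby bears that out:

    p = 0.001
    n = 500
    miss = (1 - p) ** n   # chance of never seeing the event in 500 tries
    puts miss             # => ~0.61, so after 500 identical runs the odds
                          #    are still against having seen it even once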

Hal

···

On 9/3/06, David Vallner <david@vallner.net> wrote:

Rick DeNatale wrote:

Well, as far as we can tell, but it's a topic of debate. Einstein
didn't much like the idea of God playing dice with the universe.
Maybe he does, maybe he doesn't.

But can we tell the difference? <G>

Einstein also didn't know about quantum mechanics. Past the uncertainty boundary, the universe might or might not be deterministic, but since we can't observe it at that level, we can't really make too many assumptions about the emergent behaviour.

David Vallner
Probably Very Wrong

Actually, he did. He was just uncomfortable with the idea.

···

On 9/6/06, David Vallner <david@vallner.net> wrote:

Rick DeNatale wrote:
> Well, as far as we can tell, but it's a topic of debate. Einstein
> didn't much like the idea of God playing dice with the universe.
> Maybe he does, maybe he doesn't.
>
> But can we tell the difference? <G>
>

Einstein also didn't know about quantum mechanics.

--
Rick DeNatale

My blog on Ruby
http://talklikeaduck.denhaven2.com/