Concurrency => Parallelism

I wanted to clarify a point from my post The Cambrian Period of Concurrency.

I made the statement

"From where I sit, this is all about exploiting multicore hardware"

because I’ve seen a pile of actor and other concurrency libraries which have not taken parallel execution of the concurrent program seriously. If I am going to go to the trouble of writing a concurrent program, then I want that execution to be parallel, especially in a multicore world.

Simon Marlow from the GHC team said that if programming multicore machines is the only goal, we ought to be looking at parallelism first and concurrency only as a last resort. Haskell has some nice features for taking advantage of parallelism. However, I explicitly stated that I was not as interested in highly regular or data-parallel computations, which is what Haskell's parallelism tools are aimed at.

These are fine ways to get parallelism, but I am interested in problems which are genuinely concurrent, not just parallel. In Van Roy's hierarchy, these are the problems with observable nondeterminism. I also specifically called out reduction of latency as one of my goals, something which Marlow says is a possible benefit of concurrency. The GHC team is simply interested in a different mix of problems than I am.
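To make the distinction concrete, here is a minimal Haskell sketch (the names and workloads are mine, not from either post) of observable nondeterminism: two threads race to deliver an answer, and which one the main thread observes first is not fixed by the program text.

```haskell
-- Minimal sketch of observable nondeterminism, the property that in Van
-- Roy's hierarchy separates genuinely concurrent programs from merely
-- parallel ones. Compile with: ghc -threaded Race.hs
import Control.Concurrent (forkIO)
import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)

-- Two threads race to put an answer into the same MVar; we observe
-- whichever arrives first. Running this repeatedly can give either answer.
raceOnce :: IO String
raceOnce = do
  box <- newEmptyMVar
  _ <- forkIO (putMVar box "worker A")
  _ <- forkIO (putMVar box "worker B")
  takeMVar box

main :: IO ()
main = do
  first <- raceOnce
  putStrLn ("first answer: " ++ first)
```

A data-parallel program, by contrast, gives the same observable result no matter how its work is scheduled; that is exactly why it sits lower in the hierarchy.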

Van Roy in short

I also forgot to mention Peter Van Roy's paper Programming Paradigms for Dummies: What Every Programmer Should Know, which includes an overview of his stratification of concurrency and parallelism (among other things). If you don't have time to read his book, Concepts, Techniques, and Models of Computer Programming, the paper is shorter and more digestible.

3 thoughts on "Concurrency => Parallelism"

  1. Simon Marlow

    I agree – we definitely do want to parallelise all those concurrent programs we have lying around. But I want to correct this point:

    “I explicitly stated that I was not as interested in highly regular or data parallel computations, which is what Haskell’s parallelism tools are aimed at.”

    In addition to concurrency, Haskell has two programming models for parallelism, only one of which is aimed at highly regular data-parallel problems. The other (Strategies) is perfectly suited to irregular problems and ad-hoc parallelisation of existing sequential programs. Take a look at the excellent paper “Algorithm + Strategies = Parallelism” (http://www.macs.hw.ac.uk/~dsg/gph/papers/html/Strategies/strategies.html).
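For readers who want to see the style Marlow is pointing at, here is a small sketch using Control.Parallel.Strategies from the parallel package (the `expensive` function and its inputs are placeholders of mine): the algorithm stays a plain map, and the parallel evaluation strategy is bolted on separately with `using`.

```haskell
-- Sketch of the Strategies style: the algorithm (a plain map) is kept
-- separate from how it is evaluated. Requires the `parallel` package;
-- build with ghc -threaded and run with +RTS -N to use multiple cores.
import Control.Parallel.Strategies (using, parList, rdeepseq)

expensive :: Int -> Int
expensive n = sum [1 .. n]            -- stand-in for irregular per-item work

results :: [Int]
results = map expensive [100000, 200000 .. 1000000]
            `using` parList rdeepseq  -- evaluate the list's elements in parallel

main :: IO ()
main = print (length results)
```

Note that dropping the `using` clause leaves a correct sequential program with the same meaning, which is what makes the style suited to ad-hoc parallelisation of existing code.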

  2. Ted Leung Post author

Thanks for the pointer to the strategies paper – I'll definitely take a closer look. From Chapter 24 in Real World Haskell, it looked like strategies were mostly a way of controlling the behavior of par and pseq.
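For context, the par and pseq usage that chapter describes looks roughly like the sketch below (the workloads are placeholder sums of mine); strategies package these primitives behind a cleaner, compositional interface.

```haskell
-- `par` sparks its first argument for possible parallel evaluation;
-- `pseq` forces evaluation order. Requires the `parallel` package;
-- compile with ghc -threaded to actually run sparks on other cores.
import Control.Parallel (par, pseq)

a, b :: Int
a = sum [1 .. 1000000]   -- sparked: may be evaluated on another core
b = sum [1 .. 2000000]   -- evaluated by the current thread meanwhile

main :: IO ()
main = print (a `par` (b `pseq` (a + b)))
```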

  3. Pingback: Parallelism /= Concurrency « GHC Mutterings
