Parallelism refers to physically simultaneous execution. When you raise both arms above your head, you do so in parallel. The degree of parallelism can never exceed the number of execution agents simultaneously available: on an 8-core system, you can add up 8 rows of a matrix in parallel, but not 16. Furthermore, some problems cannot be parallelized at all, others can be executed entirely in parallel, and many fall somewhere in between.
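As a sketch of that idea (the function name and matrix are my own invention, not a standard API): each row sum runs in its own goroutine, but true parallelism remains bounded by the cores the runtime may use, however many goroutines we spawn.

```go
package main

import (
	"fmt"
	"sync"
)

// sumRowsParallel adds up each row of m concurrently, one goroutine
// per row. Physical parallelism is still capped by the number of
// cores available (GOMAXPROCS), regardless of the number of rows.
func sumRowsParallel(m [][]int) []int {
	sums := make([]int, len(m))
	var wg sync.WaitGroup
	for i, row := range m {
		wg.Add(1)
		go func(i int, row []int) {
			defer wg.Done()
			for _, v := range row {
				sums[i] += v // each goroutine writes only its own slot
			}
		}(i, row)
	}
	wg.Wait()
	return sums
}

func main() {
	m := [][]int{{1, 2, 3}, {4, 5, 6}, {7, 8, 9}}
	fmt.Println(sumRowsParallel(m)) // [6 15 24]
}
```

Row summing parallelizes cleanly because the rows are independent; a running sum across all elements, by contrast, would need coordination between agents.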
Concurrency refers to conceptually simultaneous execution. When you juggle balls, you are executing a concurrent program: despite appearances, jugglers only throw or catch one ball at a time. A concurrent program can execute on a single execution agent, on as many agents as it has concurrent components, or anything in between, so concurrency does not depend on parallelism. We usually speak of concurrency when there is interaction between the concurrent components.
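The juggling analogy can be sketched as follows (the `juggle` function is hypothetical, for illustration only): two goroutines interact over a channel, yet `GOMAXPROCS(1)` pins the program to a single execution agent, so it is concurrent but not parallel.

```go
package main

import (
	"fmt"
	"runtime"
)

// juggle runs two concurrent "hands" on a single execution agent:
// with GOMAXPROCS(1) only one goroutine executes at any instant,
// yet the throws and catches still interleave correctly.
func juggle(balls int) []string {
	runtime.GOMAXPROCS(1) // one agent: concurrency without parallelism
	throws := make(chan int)
	go func() { // the throwing hand
		for i := 0; i < balls; i++ {
			throws <- i
		}
		close(throws)
	}()
	var caught []string
	for b := range throws { // the catching hand
		caught = append(caught, fmt.Sprintf("caught ball %d", b))
	}
	return caught
}

func main() {
	for _, line := range juggle(3) {
		fmt.Println(line)
	}
}
```

Removing the `GOMAXPROCS(1)` call changes nothing about the program's meaning, only about how many agents may physically run it at once.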
Flow-based programming, of which a shell pipeline is the most familiar example, is based on the concurrent execution of mostly isolated processes that communicate via flows. Because the individual components are isolated, they may be executed in parallel or not. Because there are no constraints on what components may do with their inputs, parallel execution may or may not actually speed up the computation.
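Here is a minimal sketch of that style, loosely modeled on `grep pattern | wc -l` (the stage names are my own, not a standard library): each stage is an isolated goroutine, and the channels between them are the flows.

```go
package main

import (
	"fmt"
	"strings"
)

// generate feeds lines into the pipeline, like a file being read.
func generate(lines []string) <-chan string {
	out := make(chan string)
	go func() {
		defer close(out)
		for _, l := range lines {
			out <- l
		}
	}()
	return out
}

// grep passes through only the lines containing pat, like `grep pat`.
func grep(pat string, in <-chan string) <-chan string {
	out := make(chan string)
	go func() {
		defer close(out)
		for l := range in {
			if strings.Contains(l, pat) {
				out <- l
			}
		}
	}()
	return out
}

// count drains the flow and counts lines, like `wc -l`.
func count(in <-chan string) int {
	n := 0
	for range in {
		n++
	}
	return n
}

func main() {
	lines := []string{"go is fun", "shells pipe", "go routines"}
	fmt.Println(count(grep("go", generate(lines)))) // 2
}
```

Because each stage touches only its own channels, the runtime is free to schedule them in parallel or on a single core; as the text notes, whether that actually speeds anything up depends on what the stages do.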
My friends say that I know at least something about practically everything; my enemies, that I know far too much about far too much. Here's the raw material for believing in both views.
This is a personal blog. What I say here doesn't represent the views of my employer, Santa Claus, or anyone else.
You can email me at firstname.lastname@example.org. Spam will be aggressively filtered.