Massively parallel processing.

Image credit: Carlos Jones/ORNL, CC BY 2.0

In 1964, Control Data Corporation built the world’s first supercomputer, the CDC 6600. It ran at 10 MHz and could perform up to three million floating-point operations per second.

In 2018, Oak Ridge National Laboratory unveiled “Summit,” a supercomputer that can perform two hundred quadrillion floating-point operations per second, making it nearly 70 billion times as fast as the old CDC 6600. By that measure, Summit is a greater leap from the CDC 6600 than the CDC 6600 was from an abacus.

However, it is interesting to look at just how Summit got to be so powerful. Yes, its processors are more powerful than those of the old CDC 6600. Time marches on and technology improves. But there is more to it than that. Summit contains roughly 37,000 separate processors. About 28,000 of these are graphics processors made by NVIDIA (the same company that makes the graphics processors and adapters in many home and office computers, including my own), and another 9,000 or so are conventional processors made by IBM. Summit gets its power from massively parallel processing: dividing the task among many processors, each dealing only with a small portion of the problem and with how that portion connects to the neighboring parts handled by other processors.

This makes Summit more powerful than any single-processor system can ever hope to be.
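The divide-and-conquer idea behind this can be sketched in a few lines of Python. This is a toy illustration of the structure, not anything resembling ORNL’s actual software; a thread pool stands in for Summit’s thousands of separate processors, and the function and chunk sizes are my own invention:

```python
# Sketch of massively parallel processing: split a big job into chunks,
# let each worker handle only its own small portion, then combine the
# partial results. (A thread pool stands in for separate processors here;
# a real machine like Summit spreads the chunks across physical hardware.)
from concurrent.futures import ThreadPoolExecutor

def partial_sum_of_squares(chunk):
    # Each "processor" sees only its small part of the overall problem.
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Combine the partial answers into the full answer.
        return sum(pool.map(partial_sum_of_squares, chunks))

print(parallel_sum_of_squares(list(range(1000))))
```

No worker ever needs the whole data set; each only needs its chunk and a way to hand its partial result back, which is the essence of the approach.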

This is much like how an economy works in a free market, with prices allowed to move with supply and demand. In a centrally planned command economy of whatever flavor (socialist, communist, fascist, Nazi, whatever), a single decision maker, or a small number of them, acts as the lone “processor” making all the myriad decisions involved in allocating scarce resources that have alternative uses: labor, raw materials, land, time.

In the free market, however, each individual need only worry about the specific resources under their own direct control: their own time and labor, raw materials they hold, their particular skills, and so on. Each has a small part of the total problem and can work on maximizing their own part without knowing, or needing to know, the overall problem. Information about other parts of the problem is carried through prices. If desire for a certain product increases, people bid up its price, which signals others to provide more of it. If desire drops, people aren’t willing to pay as much, and either the price falls or the amount sold at the current price falls. If something becomes harder to produce or obtain, the price rises, and people either buy less, divert resources from elsewhere to produce or obtain more, or some combination of the two.
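The price mechanism described above can be sketched as a toy simulation. The demand and supply curves here are made-up numbers chosen only for illustration; the point is that no participant knows the whole problem, yet repeated bidding pushes the price toward where supply meets demand:

```python
# Toy illustration of prices carrying information: a shortage bids the
# price up, a glut pushes it down, and the price settles where the
# quantity demanded equals the quantity supplied. The curves below are
# hypothetical numbers, not data from any real market.

def quantity_demanded(price):
    # Buyers, each deciding for themselves, want less as the price rises.
    return max(0.0, 100.0 - 2.0 * price)

def quantity_supplied(price):
    # Sellers, each deciding for themselves, offer more as the price rises.
    return 4.0 * price

def find_market_price(price=1.0, step=0.01, rounds=10_000):
    for _ in range(rounds):
        shortage = quantity_demanded(price) - quantity_supplied(price)
        # Excess demand raises the price; excess supply lowers it.
        price += step * shortage
    return price

print(round(find_market_price(), 2))  # → 16.67, where 100 - 2p = 4p
```

No central coordinator appears anywhere in the loop; the price alone carries the information each side needs.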

It’s a massively parallel processing operation, only instead of a few thousand, or even a few tens of thousands, of processors, it’s millions, or even billions, of individuals, each bringing to bear far more knowledge, far more understanding, far more “human computing power” than any small group of “central planners,” no matter how smart or how well educated, can ever hope to possess.

There is a small handful of things, as I have discussed elsewhere, that this massively parallel processing cannot handle well: problems that cannot effectively be divided (such as national defense) and problems where the division is unclear (external costs and benefits). But for the vast majority, it works, and works well, to provide the most value as determined by society as a whole.

It’s why the free market will always be more efficient than centrally planned economies.

2 thoughts on “Massively parallel processing.”

  1. There are some things that seem to make sense, but are actually counterproductive.
    For instance, if you are feeling tired after a long day of sitting at a desk, it makes sense to just sit on the couch and veg out. But that’s bad for you; it’s better to go and do some physical exercise or activity.

    Central Planning is similar in that it looks like it makes a lot of sense: the Right People with Enough Power can stop waste and inefficiency while maximizing benefits. They even have precedents: Mussolini making the trains run on time, Speer’s armaments ‘miracle’ in Nazi Germany, or the American Arsenal of Democracy during WWII.
    Yet none of these really holds up if one looks past the Statist propaganda. Fascist trains really didn’t run on time, the Nazi war machine tended to be poorly equipped on average*, and the great arms buildup of the USA had almost nothing to do with centralized planning and everything to do with the free market and individual initiative**.

    *cf “Wages of Destruction” by Tooze
    **cf “Freedom’s Forge” by Herman

