04 January, 2011

We are terrible at systems thinking

Well, you've probably heard this famous statement at some point:
"Correlation is not causation."
It works, sure. The fact that both ice cream and swimsuit sales peak in summer does not mean that one causes the other. This distinction is especially important in scientific research and in reporting on it. I've seen a lot of newspaper reports fail to make the distinction explicit, causing a huge amount of confusion on the discussion boards. As long as you remember the distinction, so far so good.
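The ice cream and swimsuit case can be sketched in a few lines of code. This is a hypothetical illustration, not real sales data: a confounder (temperature) drives both series, and they end up strongly correlated even though neither causes the other.

```python
import random

random.seed(42)

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Temperature is the common cause; both sales series depend on it plus noise.
temps = [random.uniform(-5, 30) for _ in range(365)]
ice_cream = [10 * t + random.gauss(0, 20) for t in temps]
swimsuits = [5 * t + random.gauss(0, 20) for t in temps]

# Strong correlation between the two, with zero causation either way.
print(round(pearson(ice_cream, swimsuits), 2))
```

The correlation comes out close to 1.0, purely because the two series share a cause. That shared cause is exactly what a careless newspaper summary leaves out.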

Systems with more than two units make things a lot more intriguing. In a large system, causes produce effects, which in turn are causes of other effects, and so on. At some point you've probably heard or read that everything is connected. I'd be hard pressed to find a more compact and revealing description of a network-like system, where a change in a single node sends out effects like waves; everything is linked to everything.

Look at a typical analysis of such a system (a layman's analysis, not a scientific one): usually, at the point where we have observed several causes and two dozen effects, we seem to give up. We throw up our hands, pick the causes we consider most important, and argue that they explain the effects: if not all of them, then at least well enough that forgetting the other causes seems justified. Unfortunately, this behaviour is at times more like reversed stupidity than intelligence. In systems, even minor causes can trigger large ripple effects, as per the famous butterfly effect. It's just that we find analyzing such a system so cognitively difficult that we usually end up reducing it to a simpler form. With this method the system unfortunately ceases to be a system. We are trying to analyze a car as if it were just a cart without a horse. Not a very good idea.

Going to the other extreme, sometimes we conclude that everything is connected, and leave it at that. So what are the effects? How big are they? This viewpoint doesn't really help much either, does it? If we want to analyze a system, we need orders of magnitude for the effects. Otherwise anything can cause anything, and we are back to guessing instead of actual analysis. It seems clear that we need the best of both worlds, not just one extreme. Our vision is just as bad with an eyepatch, no matter which eye is covered.

So what's the corrective step? Combining the two viewpoints above, the point should probably be this: everything is connected, correlation is not causation, but effects and causes can also live in the same phenomenon. In a system, the effect of one thing is very often the cause of another. A first-degree change can push a vital unit above a threshold, which in turn causes an effect in another node. A system cannot be analyzed by first finding the causes and then deducing the effects, as both can coexist at the same time, and not all effects arise directly from the original setting.
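The threshold mechanism above can be sketched as a tiny cascade model. The graph, weights, and thresholds here are all made up for illustration: each node "fires" once the accumulated input from nodes that have already fired crosses its threshold, so an effect at one node immediately becomes a cause at the next.

```python
from collections import deque

# Hypothetical network: node -> list of (downstream node, input weight).
edges = {
    "A": [("B", 0.6), ("C", 0.3)],
    "B": [("C", 0.5), ("D", 0.7)],
    "C": [("D", 0.4)],
    "D": [],
}
# How much accumulated input each node needs before it fires.
threshold = {"A": 0.0, "B": 0.5, "C": 0.6, "D": 0.9}

def cascade(seed):
    """Fire `seed`, then propagate: accumulate input at each downstream
    node and fire it once its threshold is crossed."""
    received = {n: 0.0 for n in edges}
    fired = set()
    queue = deque([seed])
    while queue:
        node = queue.popleft()
        if node in fired:
            continue
        fired.add(node)
        for nbr, weight in edges[node]:
            received[nbr] += weight
            if nbr not in fired and received[nbr] >= threshold[nbr]:
                queue.append(nbr)
    return fired

print(sorted(cascade("A")))  # a single change at A ripples through the whole network
```

Note that D's threshold (0.9) is higher than any single incoming weight: D fires only because the effects at B and C combine, which is exactly the kind of indirect outcome a "find the causes, then deduce the effects" analysis misses.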
