After the blowout of the Global Warming Scare some in the science establishment are scratching around for their next big project. A few European scientists envision a giant simulation model encompassing everything knowable about the earth and man’s activities thereupon.
(See my prior article on this subject.)
Contributing to this vision is a recent article entitled "The 70 Online Databases that Define Our Planet." The databases are most interesting, but to the extent that popular culture and internet clutter define our "planet," we have a long way to go before such a model could even be prototyped. That doesn't mean the notion is without menace, however. The article states:
The vision is that a system like this can help to understand and predict crises before they occur so that governments can take appropriate measures in advance.
There are numerous challenges here. Nobody yet has the computing power necessary for such a task, nor are there models that can accurately model even much smaller systems. But before any of that is possible, researchers must gather the economic, social and technological data needed to feed this machine.
Today, we get a grand tour of this challenge from Dirk Helbing and Stefano Balietti at the Swiss Federal Institute of Technology in Zurich. Helbing is the driving force behind this project and the man who will lead it if he gets the EUR 1 billion he needs from the European Commission.
It’s indicative that these scientists envision governments’ benign interventions as a solution to crises “before they occur.” At least the author acknowledges the current limitations in modeling technology and information that make this project impossible at present. Yet, just as in global warming, the author says “researchers must gather the economic, social and technological data needed to feed this machine.” In other words, whether or not it makes any sense, it deserves massive support. With a proposed budget of EUR 1 billion a lot of scientists could get enthused, and maybe the U.N. will become involved with the support of the U.S. Government — why not?
There is something irresistible about statistical collections, the composition of aggregates. In economics this is called macroeconomics. The illusion is that if you can measure something, then you can control it: if you can measure GNP, then you should be able to manipulate it. It doesn’t work that way, because aggregates are abstractions; they aren’t things. For example, government wants to regulate the average fuel efficiency of automobiles, which is an aggregate of the efficiency and usage of multitudes of automobiles. There is no lever that directly moves the aggregate. The only way to change it is to motivate the purchasers and drivers of automobiles to change their preferences and habits. Of course, government can force changes by regulating individual conduct, but it cannot directly manipulate the aggregate itself. Ditto the GNP, or unemployment, or other popular economic aggregates. You can’t push here and get a direct result there. Without influencing the micro level, the macro figure merely measures; it neither defines nor truly embodies the aggregate.
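The point about aggregates having no lever of their own can be made concrete with a small sketch. The numbers and vehicle records below are hypothetical, chosen only for illustration: an aggregate such as fleet-average fuel efficiency is merely a computation over individual data points, so the only way to move it is to change those underlying points.

```python
def fleet_average_mpg(vehicles):
    """The aggregate is just arithmetic over individual vehicles."""
    return sum(v["mpg"] for v in vehicles) / len(vehicles)

# Three hypothetical vehicles making up the "fleet".
fleet = [
    {"owner": "A", "mpg": 20.0},
    {"owner": "B", "mpg": 30.0},
    {"owner": "C", "mpg": 25.0},
]

print(fleet_average_mpg(fleet))  # 25.0

# There is no way to "set" the average directly; it only changes when an
# individual record changes -- e.g. owner A buys a more efficient car.
fleet[0]["mpg"] = 35.0
print(fleet_average_mpg(fleet))  # 30.0
```

The regulator’s only real handle is the second step: altering the incentives or rules that cause individuals to change, after which the aggregate follows as a side effect.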
The idea that a model built by aggregating a vast multitude of data would be a suitable mechanism for scientifically fixing what ails the world is pure folly. Perhaps the proper way to treat it is with the ridicule it deserves, but that didn’t work with the global warming scam, and it probably won’t work here. Too many careers, too much political power and too much money are potentially involved.