Loudmouth writes:
Until complexity can be measured quantitatively, we will be stuck with subjective judgements and general trends.
Actually there are a few methods of quantitatively analyzing complexity, though I'm not too familiar with their finer intricacies. One method is Shannon's information theory, which basically says that the information content of a signal grows as its probability shrinks: specifically, it is -log2 of the signal's probability. In other words, and in simple terms, a low-probability signal has a high information content. If a signal were binary, i.e. on/off, then the probability of the signal is 1/2, so -log2(1/2) = 1 bit of information.
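
Roughly, in Python (just my own sketch of that self-information formula, not something from the site below):

import math

def information_content(p):
    # Shannon self-information in bits: -log2(p)
    return -math.log2(p)

print(information_content(0.5))    # 1.0 bit, the binary on/off signal
print(information_content(0.01))   # ~6.64 bits, a rarer, lower-probability signal
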
Another is the Kolmogorov-Chaitin theory of complexity, which says the complexity of a string is the length of the shortest program that outputs it. That is, the string A x A x A x A x A x A x A x A x A x A (ten A's) could be expressed algorithmically as A^10, whereas the string A x B x C x D x E x F x G x H x I x J x K cannot be compressed in the same manner.
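
True Kolmogorov complexity is uncomputable, but a quick way to get a feel for it is to use an off-the-shelf compressor as a crude stand-in for "shortest program" (again my own rough sketch, not from the site):

import random
import string
import zlib

def compressed_length(s):
    # zlib-compressed size in bytes: a crude, computable stand-in for the
    # (uncomputable) length of the shortest program that prints s
    return len(zlib.compress(s.encode(), 9))

repetitive = "A" * 10000                                              # highly regular
scrambled = "".join(random.choices(string.ascii_uppercase, k=10000))  # little structure

print(compressed_length(repetitive))   # tiny: the repetition compresses away
print(compressed_length(scrambled))    # much larger: no pattern to exploit
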
Last there is the Bennett theory of complexity (logical depth), which relates the complexity of an object to the computation time needed to produce it. For example, as a computer continuously spits out the unending decimals of pi, the system gets more and more complex the longer it runs. I'm not really clear on the metric that measures the rate of increase, though.
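
To get a feel for the time angle, here's a rough sketch (mine, not Bennett's actual metric) that times how long it takes to produce longer and longer prefixes of pi with Machin's formula; the longer the prefix, the more compute the "system" has behind it:

import time

def pi_digits(n):
    # first n decimal digits of pi via Machin's formula,
    # pi = 16*arctan(1/5) - 4*arctan(1/239), in fixed-point integer arithmetic
    one = 10 ** (n + 10)            # 10 guard digits to absorb rounding error

    def arctan_inv(x):
        # arctan(1/x) = sum over k of (-1)^k / ((2k+1) * x^(2k+1))
        total = term = one // x
        x2 = x * x
        k, sign = 1, 1
        while term:
            term //= x2
            k += 2
            sign = -sign
            total += sign * (term // k)
        return total

    return str(16 * arctan_inv(5) - 4 * arctan_inv(239))[:n]

for digits in (100, 1000, 5000):
    start = time.perf_counter()
    pi_digits(digits)
    print(digits, "digits:", round(time.perf_counter() - start, 4), "seconds")
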
I found most of my information at this site:
http://www.geocities.com/...orest/Andes/9063/complexity.html