Hayek’s famous work ‘The Theory of Complex Phenomena’ (Hayek, 1964) delves into the topic of complexity and asserts that the tension between a narrow or superficial insight and an in-depth or real insight into complex phenomena forces the individual to make a ‘hard choice’. Regarding Hayek’s study of complex phenomena, it would also be useful to take Popper’s falsification theories into account. According to Popper, in order to falsify hypotheses, a criterion of differentiation could be utilized to recognize whether a theory is scientific, ‘pseudo-scientific’ or ‘metaphysical’ (Popper, 1959).
According to Hayek, this hard choice relates to ‘point-wise’ testability. Hayek suggested loosening the constraints placed on testability with the purpose of discovering more complicated relationships among independent variables, which might eventually enable the researcher to shift from predicting point events towards predicting ‘patterns’. Around the same time, Lorenz published his paper on chaos theory and realized that the behavior of nonlinear dynamical systems is highly sensitive to their initial conditions, so that trajectories whose starting points lie close to each other might eventually end up far apart. It would be correct to say that ‘chaos theory’ describes reality by predicting the behavioral patterns of dynamical systems rather than their point events, that is, rather than predicting “highly localized space-time hyper-volumes (‘points’) of behavior” as Lorenz stated.
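Lorenz’s sensitivity to initial conditions can be made concrete with a minimal numerical sketch (not taken from Lorenz’s paper; the step size, duration, and the standard parameter values sigma = 10, rho = 28, beta = 8/3 are illustrative assumptions): two trajectories of the Lorenz equations that start almost identically end up far apart, even though the overall pattern they trace remains the same.

```python
# Illustrative sketch: sensitive dependence on initial conditions
# in the Lorenz system, integrated with a simple Euler scheme.

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz equations by one Euler step."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dx * dt, y + dy * dt, z + dz * dt)

def distance(a, b):
    """Euclidean distance between two states."""
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

# Two starting points differing by one part in a million.
a = (1.0, 1.0, 1.0)
b = (1.000001, 1.0, 1.0)

initial_gap = distance(a, b)
for _ in range(3000):  # 30 time units at dt = 0.01
    a, b = lorenz_step(a), lorenz_step(b)
final_gap = distance(a, b)

# The tiny initial gap grows by many orders of magnitude: 'point'
# prediction fails, while the overall pattern (the attractor) persists.
print(initial_gap, final_gap)
```

The point prediction of where either trajectory ends up is worthless after a short time, yet the ‘pattern’ (both trajectories stay on the same attractor) remains fully predictable, which is exactly the shift from point events to patterns that Hayek describes.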
Our minds are so programmed that when a regularity is observed within diversity, the presence of the same agent is suspected. Human beings are used to recognizing patterns without having to resort to intellectual operations, as these patterns have become so deeply embedded in the environment that they are taken for granted. Seeing new patterns, on the other hand, may surprise us, and this surprise spurs the development of scientific fields. Our curiosity makes us look for new patterns in order to develop a new theory that is dependent on particular conditions.
The phenomena of the mind, life, and society seem to have a higher level of complexity than those of the physical world. Recognizing new patterns as the number of aspects among which simple relations exist grows results in an enlarged structure which, in general, includes abstract or generic features observed independently of the specific values of the individual data. Such ‘wholes’ should be approached from the perspective of whether there are few or many points of contact through which everything else acts upon the system, rather than from the perspective of being ‘open’ or ‘closed’ systems. Where to draw the ‘partition boundary’ will depend on whether the recurrent patterns of coherent structures we encounter in the world can be isolated.
Knowledge of the conditions required to make a specific pattern occur is also important. In order to develop a related theory, we would need to know the properties of the data. This would enable us to make predictions dependent on as yet unknown future events. Yet, in many fields, such prediction will remain out of reach for the present. Therefore, the progress of science will unfold in the following directions:
- falsifying theories as much as possible;
- discovering new fields where the degree of falsifiability gets reduced.
This is the cost of moving ahead in the field of complex phenomena.
Statistics, however, focuses on large numbers essentially by getting rid of complexity and treating individual elements as if there were no systematic relations among them. In order to eliminate the problem of complexity, it implicitly assumes that information on the numerical frequencies of the various aspects of a collective would suffice to make sense of the phenomena, and that therefore no information is required on the manner in which the elements are related.
In such situations statistics provides simplicity, making the task manageable by substituting a single attribute for all the individual attributes within the whole. Yet there is a caveat: statistics cannot offer a solution to problems in which the interdependencies among the elements matter. Statistics can provide support when the collective data refer to the whole of a complex phenomenon rather than to its individual components. To give a specific example, it may provide information on the relative frequency with which specific properties occur together within complex structures. On the other hand, statistics could not provide much support where complex phenomena are investigated by computers, despite the ability of computers to predict individual behaviors within large amounts of data. Unless we possess the mathematical details embedded in the computers and the theory specifying their structure, statistics on the interdependence between input and output will not help us to understand the subject of complexity.
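The claim that frequency counts discard the relations among elements can be illustrated with a small, hypothetical example (the two populations and their attributes are invented for illustration, not drawn from Hayek): two collectives whose element-wise frequencies are identical but whose internal structure is opposite.

```python
from collections import Counter

# Hypothetical populations: each individual is a pair of binary attributes.
# In pop_a the two attributes always co-occur; in pop_b they never do.
pop_a = [(1, 1), (1, 1), (0, 0), (0, 0)]
pop_b = [(1, 0), (1, 0), (0, 1), (0, 1)]

def marginals(pop):
    """Frequency of each attribute considered on its own."""
    first = Counter(x for x, _ in pop)
    second = Counter(y for _, y in pop)
    return first, second

def joint(pop):
    """Frequency of attribute combinations, i.e. the relational structure."""
    return Counter(pop)

# The element-wise statistics are identical...
assert marginals(pop_a) == marginals(pop_b)
# ...yet the interdependence among the elements is completely different.
assert joint(pop_a) != joint(pop_b)
```

A description of either population in terms of numerical frequencies alone cannot distinguish them; only the joint structure, which statistics of large numbers deliberately sets aside, reveals how the elements are related.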
Despite the ability of statistics to manage complex phenomena when information on the elements of the population is already available, it nevertheless treats the structure of these elements as ‘black boxes’. Similar limitations apply to the phenomena of mind and society. The insight that any event throughout a person’s life may affect his future actions makes the translation of theoretical knowledge into predictions of specific future events impossible. The underpinning assumption here is that, as in the field of physics, simple relations between a few observables could be discovered given the expectation of some regularities.
Even though we may be able to foresee a certain phenomenon, or one of its related patterns, given data of a certain class, we still could not ascertain the individual attributes of the elements within a pattern. Knowing which specific kinds of circumstances produce a phenomenon does not automatically translate into knowledge of all the circumstances necessary for all its attributes.
As Popper stated, “the more we learn about the world, the more articulate and specific will be our knowledge of our ignorance”. Although our knowledge of specific complex phenomena might be more limited than our knowledge of simple phenomena, we may still be able to develop a technique for recognizing the general mechanism which produces certain patterns. As it could act as an important guide to action, this limited knowledge would still be of value.
Given the proliferation of computers in today’s complexity science, many patterns are being discovered through machine learning algorithms. Regardless of the level of sophistication of these technologies, it might be worth remembering Hayek’s statements on statistics, which constitute a crucial part of these algorithms. As Hayek (1969) argued, ‘statistics’ could not provide us with an understanding of complex phenomena, however large the number of computers, unless we are also given access to the code operating in the background. Such knowledge of the algorithms used to design the computer software offers a more useful conceptualization of the ‘computer’ in ‘intentional’ terms (i.e., ‘algorithms’) rather than in terms of causal entities (i.e., ‘holes’ or ‘electrons’). If this could occur, the black box of human decision-making, which embeds rational choice models and other linear functions, could provide much more support given the development of new conceptual toolkits.
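Hayek’s contrast between input-output statistics and knowledge of the code can be sketched with a deliberately artificial example (the ‘black box’ rule and the linear-correlation statistic are illustrative assumptions, not anything Hayek specified): a deterministic rule whose input-output statistics look almost uninformative, yet whose behavior is exactly predictable once the algorithm itself is known.

```python
import random

# Hypothetical 'black box': a trivial deterministic rule,
# opaque if we only observe input-output pairs statistically.
def black_box(x: int) -> int:
    return x % 7  # the hidden rule, certain once the code is readable

random.seed(0)
xs = [random.randrange(10_000) for _ in range(5_000)]
ys = [black_box(x) for x in xs]

def corr(u, v):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v)) / n
    su = (sum((a - mu) ** 2 for a in u) / n) ** 0.5
    sv = (sum((b - mv) ** 2 for b in v) / n) ** 0.5
    return cov / (su * sv)

# The linear statistic between input and output is close to zero...
print(round(corr(xs, ys), 3))
# ...while knowledge of the algorithm yields exact prediction for any input.
assert all(black_box(x) == x % 7 for x in xs)
```

The summary statistic treats the mechanism as a black box and finds almost nothing, while reading the code — the ‘intentional’ description — makes every output certain, which is the asymmetry Hayek points to.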