Complexity
Complexity characterizes the behavior of a system or model whose components interact in multiple ways and follow local rules, leading to non-linearity, randomness, collective dynamics, hierarchy, and emergence.
The term is generally used to characterize something with many parts where those parts interact with each other in multiple ways, culminating in a higher order of emergence greater than the sum of its parts. The study of these complex linkages at various scales is the main goal of complex systems theory.
The intuitive criterion of complexity can be formulated as follows: a system would be more complex if more parts could be distinguished, and if more connections between them existed.
A number of approaches to characterizing complexity have been used in science; Zayed et al. reflect many of these. Neil Johnson states that "even among scientists, there is no unique definition of complexity – and the scientific notion has traditionally been conveyed using particular examples..." Ultimately, Johnson adopts the definition of "complexity science" as "the study of the phenomena which emerge from a collection of interacting objects".
Overview
Definitions of complexity often depend on the concept of a "system" – a set of parts or elements that have relationships among them differentiated from relationships with other elements outside the relational regime. Many definitions tend to postulate or assume that complexity expresses a condition of numerous elements in a system and numerous forms of relationships among the elements. However, what one sees as complex and what one sees as simple is relative and changes with time.
Warren Weaver posited in 1948 two forms of complexity: disorganized complexity, and organized complexity.
Phenomena of 'disorganized complexity' are treated using probability theory and statistical mechanics, while 'organized complexity' deals with phenomena that escape such approaches and confront "dealing simultaneously with a sizable number of factors which are interrelated into an organic whole". Weaver's 1948 paper has influenced subsequent thinking about complexity.
The approaches that embody concepts of systems, multiple elements, multiple relational regimes, and state spaces might be summarized as implying that complexity arises from the number of distinguishable relational regimes in a defined system.
Some definitions relate to the algorithmic basis for the expression of a complex phenomenon or model or mathematical expression, as later set out herein.
Disorganized vis-à-vis organized
One of the problems in addressing complexity issues has been formalizing the intuitive conceptual distinction between the large number of variances in relationships extant in random collections, and the sometimes large, but smaller, number of relationships between elements in systems where constraints simultaneously reduce the variations from element independence and create distinguishable regimes of more-uniform, or correlated, relationships or interactions.
Weaver perceived and addressed this problem, in at least a preliminary way, in drawing a distinction between "disorganized complexity" and "organized complexity".
In Weaver's view, disorganized complexity results from the particular system having a very large number of parts, say millions of parts, or many more. Though the interactions of the parts in a "disorganized complexity" situation can be seen as largely random, the properties of the system as a whole can be understood by using probability and statistical methods.
A prime example of disorganized complexity is a gas in a container, with the gas molecules as the parts. Some would suggest that a system of disorganized complexity may be compared with the simplicity of planetary orbits – the latter can be predicted by applying Newton's laws of motion. Of course, most real-world systems, including planetary orbits, eventually become theoretically unpredictable even using Newtonian dynamics, as modern chaos theory has shown.
Organized complexity, in Weaver's view, resides in the non-random, or correlated, interaction between the parts. These correlated relationships create a differentiated structure that can, as a system, interact with other systems. The coordinated system manifests properties not carried or dictated by the individual parts. The organized aspect of this form of complexity with regard to other systems, rather than the subject system, can be said to "emerge", without any "guiding hand".
The number of parts does not have to be very large for a particular system to have emergent properties. A system of organized complexity may be understood in its properties through modeling and simulation, particularly modeling and simulation with computers. An example of organized complexity is a city neighborhood as a living mechanism, with the neighborhood people among the system's parts.
Varied meanings
In several scientific fields, "complexity" has a precise meaning:
- In computational complexity theory, the amounts of resources required for the execution of algorithms are studied. The most popular types of computational complexity are the time complexity of a problem, equal to the number of steps that it takes to solve an instance of the problem as a function of the size of the input using the most efficient algorithm, and the space complexity of a problem, equal to the volume of memory used by the algorithm to solve an instance of the problem, again as a function of the input size. This allows computational problems to be classified by complexity class. An axiomatic approach to computational complexity was developed by Manuel Blum; it allows one to deduce many properties of concrete computational complexity measures, such as time complexity or space complexity, from properties of axiomatically defined measures. (A minimal time-complexity sketch follows this list.)
- In algorithmic information theory, the Kolmogorov complexity of a string is the length of the shortest binary program that outputs that string. Minimum message length is a practical application of this approach. Different kinds of Kolmogorov complexity are studied: uniform complexity, prefix complexity, monotone complexity, time-bounded Kolmogorov complexity, and space-bounded Kolmogorov complexity. An axiomatic approach to Kolmogorov complexity based on Blum axioms was introduced by Mark Burgin in a paper presented for publication by Andrey Kolmogorov. The axiomatic approach encompasses the other approaches: the different kinds of Kolmogorov complexity can be treated as particular cases of an axiomatically defined generalized Kolmogorov complexity. Instead of proving similar theorems, such as the basic invariance theorem, for each particular measure, all such results can be deduced from one corresponding theorem proved in the axiomatic setting, a general advantage of the axiomatic approach in mathematics. The axiomatic approach to Kolmogorov complexity was later developed further and applied to software metrics. (A compression-based sketch follows this list.)
- In information theory, information fluctuation complexity is the fluctuation of information about information entropy. It is derivable from fluctuations in the predominance of order and chaos in a dynamic system and has been used as a measure of complexity in many diverse fields. (A short computational sketch follows this list.)
- In information processing, complexity is a measure of the total number of properties transmitted by an object and detected by an observer. Such a collection of properties is often referred to as a state.
- In physical systems, complexity is a measure of the probability of the state vector of the system. This should not be confused with entropy; it is a distinct mathematical measure, one in which two distinct states are never conflated and considered equal, as is done for the notion of entropy in statistical mechanics.
- In dynamical systems, statistical complexity measures the size of the minimum program able to statistically reproduce the patterns contained in the data set. While algorithmic complexity implies a deterministic description of an object, statistical complexity, like forecasting complexity, implies a statistical description and refers to an ensemble of sequences generated by a certain source. Formally, statistical complexity reconstructs a minimal model comprising the collection of all histories sharing a similar probabilistic future, and measures the entropy of the probability distribution of the states within this model. It is a computable and observer-independent measure based only on the internal dynamics of the system, and has been used in studies of emergence and self-organization. (A toy numeric sketch follows this list.)
- In mathematics, Krohn–Rhodes complexity is an important topic in the study of finite semigroups and automata.
- In network theory, complexity is the product of the richness of the connections between components of a system, and is defined by a very unequal distribution of certain measures.
- In software engineering, programming complexity is a measure of the interactions of the various elements of the software. It differs from the computational complexity described above in that it is a measure of the design of the software. Halstead complexity measures, cyclomatic complexity, time complexity, and parameterized complexity are closely linked concepts. (A hand-counted cyclomatic example follows this list.)
- In model theory, U-rank is a measure of the complexity of a complete type in the context of stable theories.
- In bioinformatics, linguistic sequence complexity is a measure of the vocabulary richness of a genetic text in gene sequences.
- In statistical learning theory, the Vapnik–Chervonenkis dimension is a measure of the capacity of a class of sets: the size of the largest set of points that the class can shatter. (A brute-force shattering sketch follows this list.)
- In computational learning theory, Rademacher complexity is a measure of richness of a class of sets with respect to a probability distribution.
- In sociology, social complexity is a conceptual framework used in the analysis of society.
- In combinatorial game theory, measures of game complexity involve understanding game positions, possible outcomes, and computation required for various game scenarios.
- For binary structures, the Abstract Complexity Definition (ACD) formalizes complexity by the formula C = N² / n, where N is the number of detectable regularities and n is the number of basic elements. These regularities are identified with contrasts (tensions arising from interactions of common and differentiating features) and with structural information. The formula incorporates two factors: the N/n ratio, representing information compression/density, and the number of regularities N, introduced to account for the difficulty of compressing longer structures. The definition applies both analogically and directly wherever a system can be expressed in binary form, e.g., in music. It differs from classical structural complexity definitions, which focus on the number of elements and relations without specifying the nature of those relations: in ACD, relations are explicitly defined as tensions resulting from interactions of the common and differentiating features of system elements, with a defined dynamics, which makes the definition constructive and operational, allowing formal calculation of complexity. In terms of the intuitive criterion of complexity (the number of distinguishable elements and the number of connections between them), differentiating features correspond to distinguishable elements and common features correspond to connections between those elements; the more such features exist and the stronger their interactions, the greater the contrast, and thus the higher the complexity. (A minimal numeric sketch of the formula follows this list.)
- A complex adaptive system has some or all of the following attributes:
  * The number of parts in the system and the number of relations between the parts is non-trivial – however, there is no general rule to separate "trivial" from "non-trivial";
  * The system has memory or includes feedback;
  * The system can adapt itself according to its history or feedback;
  * The relations between the system and its environment are non-trivial or non-linear;
  * The system can be influenced by, or can adapt itself to, its environment;
  * The system is highly sensitive to initial conditions.
- Peak complexity is the concept that human societies address problems by adding social and economic complexity, but that this process is subject to diminishing marginal returns.
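The time-complexity notion from the computational complexity item above can be made concrete with a minimal sketch; the two search routines and the input are illustrative assumptions, not drawn from any source cited here.

```python
from typing import List

def linear_search(items: List[int], target: int) -> int:
    """Checks every element in turn: worst case takes n comparisons, O(n)."""
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

def binary_search(items: List[int], target: int) -> int:
    """Halves a sorted search range each step: worst case ~log2(n) comparisons, O(log n)."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = list(range(1_000_000))        # sorted input of size n = 10^6
print(linear_search(data, 999_999))  # up to 1,000,000 comparisons
print(binary_search(data, 999_999))  # at most about 20 comparisons
```

Both calls solve the same problem instance; the complexity classes differ because the step counts grow differently as the input size n grows.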
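For the algorithmic information theory item: Kolmogorov complexity itself is uncomputable, but the length of any compressed encoding gives an upper bound. The sketch below is an illustration only, using zlib as a stand-in for "shortest program" length; the example strings are assumptions.

```python
import random
import zlib

def compressed_length(s: bytes) -> int:
    """Upper-bounds the Kolmogorov complexity K(s) by the length of a zlib encoding."""
    return len(zlib.compress(s, 9))

regular = b"ab" * 500  # highly regular 1000-byte string: a short description exists

random.seed(0)
irregular = bytes(random.randrange(256) for _ in range(1000))  # pseudo-random bytes

print(compressed_length(regular))    # small: the repeating pattern compresses well
print(compressed_length(irregular))  # near 1000: no short description is found
```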
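For the information fluctuation complexity item, the measure can be sketched as the standard deviation of the state information Γ_i = -log2(p_i) about the entropy H = Σ p_i Γ_i; the example distributions below are assumptions chosen for illustration.

```python
import math

def fluctuation_complexity(probs):
    """Standard deviation of per-state information about the Shannon entropy."""
    info = [-math.log2(p) for p in probs]              # information of each state
    entropy = sum(p * g for p, g in zip(probs, info))  # mean information H
    variance = sum(p * (g - entropy) ** 2 for p, g in zip(probs, info))
    return math.sqrt(variance)

print(fluctuation_complexity([0.25, 0.25, 0.25, 0.25]))   # 0.0: uniform, no fluctuation
print(fluctuation_complexity([0.5, 0.25, 0.125, 0.125]))  # > 0: information fluctuates
```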
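For the statistical complexity item: once a minimal model's states and their stationary probabilities are known, the measure reduces to the entropy of that state distribution. The two-state probabilities below are an assumed toy example, not taken from the sources above.

```python
import math

def statistical_complexity(state_probs):
    """Entropy (in bits) of the distribution over the minimal model's states."""
    return -sum(p * math.log2(p) for p in state_probs if p > 0)

# Hypothetical two-state minimal model: state A occupied 2/3 of the time, state B 1/3.
print(statistical_complexity([2/3, 1/3]))  # ~0.918 bits
```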
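For the programming complexity item, cyclomatic complexity can be counted by hand as one more than the number of binary decision points in a function's control flow; the function below is an invented example.

```python
def classify(value: int) -> str:
    """Toy function with three decision points."""
    if value < 0:             # decision 1
        return "negative"
    elif value == 0:          # decision 2
        return "zero"
    for _ in range(value):    # decision 3 (loop condition)
        pass
    return "positive"

# D = 3 decision points, so cyclomatic complexity M = D + 1 = 4: there are
# four linearly independent paths through the control-flow graph.
print(classify(5))
```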
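For the Vapnik–Chervonenkis item, the dimension can be probed by brute force: a class shatters a point set if it realizes every possible labeling of it. The threshold class {x : x ≤ t} below is a standard textbook example; the finite grid of thresholds is an assumption made so the check terminates.

```python
def can_shatter(points, classifiers):
    """True if every in/out labeling of the points is realized by some classifier."""
    realized = {tuple(c(x) for x in points) for c in classifiers}
    return len(realized) == 2 ** len(points)

# The class of threshold sets {x : x <= t}, over a grid of thresholds t in [-1, 2].
classifiers = [lambda x, t=t / 10: x <= t for t in range(-10, 21)]

print(can_shatter([0.5], classifiers))       # True: a 1-point set is shattered
print(can_shatter([0.3, 0.7], classifiers))  # False: no threshold includes 0.7 but not 0.3
```

Since some 1-point set is shattered but no 2-point set can be, the VC dimension of this class is 1.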
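Finally, the Abstract Complexity Definition's formula C = N² / n is directly computable; the regularity and element counts below are invented for illustration, not drawn from the definition's sources.

```python
def abstract_complexity(num_regularities: int, num_elements: int) -> float:
    """C = N^2 / n: the compression density N/n weighted by the regularity count N."""
    return num_regularities ** 2 / num_elements

# E.g., a binary structure of n = 16 basic elements with N = 6 detected regularities:
print(abstract_complexity(6, 16))  # 2.25
```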