Complexity

Complexity characterizes the behavior of a system or model whose components interact in multiple ways and follow local rules, meaning there is no reasonable higher instruction to define the various possible interactions. [1]

The stem of the word “complexity” – complex – combines the Latin roots com (meaning “together”) and plex (meaning “woven”). Contrast “complicated”, where plic (meaning “folded”) refers to many layers. A complex system is thus characterized by its inter-dependencies, whereas a complicated system is characterized by its layers.

Complexity is generally used to characterize something with many parts that interact with each other in multiple ways, culminating in a higher order of emergence greater than the sum of its parts. Just as there is no absolute definition of “intelligence”, there is no absolute definition of “complexity”; the only consensus among researchers is that there is no agreement on the specific definition of complexity. However, “a characterization of what is complex is possible”. [2] The study of these complex linkages at various scales is the main goal of complex systems theory.

Science as of 2010 takes a number of approaches to characterizing complexity; Zayed et al. [3] reflect many of these. Neil Johnson states that “even among scientists, there is no single definition of complexity – and the scientific notion has traditionally been conveyed using particular examples …” Ultimately Johnson adopts the definition of “complexity science” as “the study of the phenomena which emerge from a collection of interacting objects”. [4]

Overview

Definitions of complexity often depend on the concept of a “system” – a set of parts or elements that have relationships among them differentiated from relationships with other elements outside the relational regime. Many definitions tend to postulate or assume that complexity expresses a condition of numerous elements in a system and numerous forms of relationships among the elements. However, what one sees as complex and what one sees as simple is relative and changes with time.

Warren Weaver posited in 1948 two forms of complexity: disorganized complexity, and organized complexity. [5] Phenomena of ‘disorganized complexity’ are treated using probability theory and statistical mechanics, while ‘organized complexity’ deals with phenomena that escape such approaches and confront ‘dealing simultaneously with a sizable number of factors which are interrelated into an organic whole’. [5] Weaver’s 1948 paper has influenced subsequent thinking about complexity. [6]

The approaches that embody concepts of systems, multiple elements, multiple relational regimes, and state spaces can be summarized as implying that complexity arises from the number of distinguishable relational regimes (and their associated state spaces) in a defined system.

Some definitions relate to the algorithmic basis for the expression of a complex phenomenon or model or mathematical expression, as later set out herein.

Disorganized vs. organized

One of the problems in addressing complexity issues has been formalizing the intuitive conceptual distinction between the large number of variances in relationships extant in random collections, and the sometimes large, but smaller, number of relationships between elements in systems where constraints (related to correlation of otherwise independent elements) simultaneously reduce the variations from element independence and create distinguishable regimes of more-uniform, or correlated, relationships or interactions.


In Weaver’s view, disorganized complexity results from the particular system having a very large number of parts, say millions of parts, or many more. Though the interactions of the parts in a “disorganized complexity” situation can be seen as largely random, the properties of the system as a whole can be understood by using probability and statistical methods.

A prime example of disorganized complexity is a gas in a container, with the gas molecules as the parts. Such a system of disorganized complexity may be compared with the (relative) simplicity of planetary orbits – the latter can be predicted by applying Newton’s laws of motion. Of course, most real-world systems, including planetary orbits, eventually become theoretically unpredictable even using Newtonian dynamics, as discovered by modern chaos theory. [7]

Organized complexity, in Weaver’s view, resides in nothing else than the non-random, or correlated, interaction between the parts. These correlated relationships create a differentiated structure that can, as a system, interact with other systems. The coordinated system manifests properties not carried or dictated by individual parts. The organized aspect of this form of complexity vis-à-vis other systems can be said to “emerge,” without any “guiding hand”.

The number of parts does not have to be very large for a particular system to have emergent properties. A system of organized complexity may be understood in its properties through modeling and simulation, particularly modeling and simulation with computers. An example of organized complexity is a city neighborhood as a living mechanism, with the neighborhood people among the system’s parts. [8]

Sources and factors

There are generally rules which can be invoked to explain the origin of complexity in a given system.

The source of disorganized complexity is the large number of parts in the system of interest, and the lack of correlation between elements in the system.

In the case of self-organizing living systems, usefully organized complexity comes from beneficially mutated organisms being selected to survive by their environment for their differential reproductive ability, or at least success over inanimate matter or less organized complex organisms. See, e.g., Robert Ulanowicz’s treatment of ecosystems. [9]

Complexity of an object or system is a relative property. For instance, for many functions (problems), such a computational complexity as the time of computation is smaller when multitape Turing machines are used than when Turing machines with one tape are used. Random access machines allow time complexity to be decreased even further (1998), while inductive Turing machines can decrease even the complexity class of a function, language or set (Burgin 2005). This shows that the tools of activity can be an important factor of complexity.
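A loose illustration of this relativity, at the level of everyday programming rather than the Turing-machine models just described: the Python sketch below (with illustrative names only, not drawn from any cited source) solves the same membership-query problem with two different data structures and obtains very different running times.

```python
# Illustrative sketch: the "tool" chosen to solve the same problem
# (membership queries) changes its practical time complexity.
# A Python list is scanned linearly (O(n) per query), while a set
# uses hashing (expected O(1) per query).
import timeit

def time_membership(container, queries):
    """Return the total time to test membership of every query value."""
    return timeit.timeit(lambda: [q in container for q in queries], number=1)

n = 100_000
data_list = list(range(n))
data_set = set(data_list)
queries = list(range(0, n, 100))  # a sample of values to look up

print("list:", time_membership(data_list, queries))
print("set: ", time_membership(data_set, queries))
# Typically the set is orders of magnitude faster, showing that the
# measured complexity depends on the computational tool, not only on
# the problem itself.
```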

Varied meanings

In several scientific fields, “complexity” has a specific meaning:

  • In computational complexity theory , the amounts of resources required for the execution of algorithms are studied. The most popular types of computational complexity are the time complexity of a problem, equal to the number of steps that it takes to solve an instance of the problem as a function of the size of the input (usually measured in bits), using the most efficient algorithm, and the space complexity of a problem, equal to the volume of the memory used by the algorithm (e.g., cells of the tape) that it takes to solve an instance of the problem as a function of the size of the input (usually measured in bits), using the most efficient algorithm (see the timing sketch after this list). This allows computational problems to be classified by complexity class (such as P , NP, etc.). An axiomatic approach to computational complexity was developed by Manuel Blum . It allows one to deduce many properties of concrete computational complexity measures, such as time complexity or space complexity, from properties of axiomatically defined measures.
  • In algorithmic information theory , the Kolmogorov complexity (also called descriptive complexity , algorithmic complexity or algorithmic entropy ) of a string is the length of the shortest binary program that outputs that string. Minimum message length is a practical application of this approach. Different kinds of Kolmogorov complexity are studied: the uniform complexity, the prefix complexity, the monotone complexity, the time-bounded Kolmogorov complexity, and the space-bounded Kolmogorov complexity. An axiomatic approach to Kolmogorov complexity based on Blum axioms (Blum 1967) was introduced by Mark Burgin in a paper presented for publication by Andrey Kolmogorov. [10] The axiomatic approach encompasses other approaches to Kolmogorov complexity. It is possible to treat different kinds of Kolmogorov complexity as particular cases of axiomatically defined generalized Kolmogorov complexity. Instead of proving similar theorems, such as the basic invariance theorem, for each particular measure, it is possible to easily deduce all such results from a corresponding theorem proved in the axiomatic setting. This is a general advantage of the axiomatic approach in mathematics. The axiomatic approach to Kolmogorov complexity was further developed in the book (Burgin 2005) and applied to software metrics (Burgin and Debnath, 2003; Debnath and Burgin, 2003).
  • In information processing , complexity is a measure of the total number of properties transmitted by an object and detected by an observer . Such a collection of properties is often referred to as a state .
  • In physical systems , complexity is a measure of the probability of the state vector of the system. This should not be confused with entropy ; it is a distinct mathematical measure, one in which two distinct states are never conflated and considered equal, as is done for the notion of entropy in statistical mechanics .
  • In mathematics , Krohn-Rhodes complexity is an important topic in the study of finite semigroups and automata .
  • In network theory , complexity is the product of richness in the connections between components of a system, [11] and is defined by a very unequal distribution of certain measures (some elements being highly connected and some very few, see complex network ).
  • In software engineering , programming complexity is a measure of the interactions of the various elements of the software. This differs from the computational complexity described above in that it is a measure of the design of the software.
  • In an abstract sense, abstract complexity is based on the perception of visual structures. [12] It is the complexity of a binary string, defined as the square of the number of features divided by the number of elements (0’s and 1’s). Features here comprise all distinctive arrangements of 0’s and 1’s. Though the number of features must be approximated, the definition is precise and meets intuitive criteria.
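As anticipated in the computational complexity item above, the following is a hedged, empirical illustration of time complexity as a function of input size: it compares an O(n²) pairwise check with an O(n log n) sort-based check for the same duplicate-detection problem. The function names and input sizes are illustrative only.

```python
# Illustrative sketch: two algorithms for the same problem
# (does a list contain a duplicate?) with different time complexity.
import random
import time

def has_duplicate_quadratic(values):
    """O(n^2): compare every pair of elements."""
    n = len(values)
    for i in range(n):
        for j in range(i + 1, n):
            if values[i] == values[j]:
                return True
    return False

def has_duplicate_sorted(values):
    """O(n log n): sort, then check adjacent elements."""
    ordered = sorted(values)
    return any(a == b for a, b in zip(ordered, ordered[1:]))

for n in (1_000, 2_000, 4_000):
    data = random.sample(range(10 * n), n)  # distinct values: worst case
    t0 = time.perf_counter(); has_duplicate_quadratic(data); t1 = time.perf_counter()
    has_duplicate_sorted(data); t2 = time.perf_counter()
    print(f"n={n}: quadratic {t1 - t0:.3f}s, sort-based {t2 - t1:.3f}s")
# Doubling n roughly quadruples the quadratic version's running time but
# only slightly more than doubles the sort-based version's time.
```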

Other fields introduce less precisely defined notions of complexity:

  • A complex adaptive system has some or all of the following attributes: [4]
    • The number of parts in the system is not trivial – however, there is no general rule to separate “trivial” from “non-trivial”;
    • The system has memory or includes feedback ;
    • The system can adapt itself according to its history or feedback;
    • The relations between the system and its environment are non-trivial or non-linear;
    • The system can be influenced by, or can adapt itself to, its environment;
    • The system is highly sensitive to initial conditions.

Study

Complexity has always been a part of our environment, and therefore many scientific fields have dealt with complex systems and phenomena. From one perspective, that which is somehow complex – displaying variation without being purely random – is most worthy of interest given the rewards found in the depths of exploration.

The use of the term complex is often confused with the term complicated. In today’s systems, this is the difference between myriad connecting “stovepipes” and effective “integrated” solutions. [13] This means that complex is the opposite of independent, while complicated is the opposite of simple.

While this has led some fields to come up with specific definitions of complexity, there is a more recent movement to bring together observations from different fields to study complexity in itself, whether it appears in anthills , human brains , stock markets , or social systems. One such interdisciplinary group of fields is relational order theories .

Topics

Behavior

The behavior of a complex system is often said to be due to emergence and self-organization . Chaos theory has investigated the sensitivity of systems to variations in initial conditions as a cause of complex behavior.
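A minimal illustration of such sensitivity, using only the standard logistic map as the example system (an assumption on my part, not a system drawn from the sources cited here): the Python sketch below iterates two trajectories from nearly identical starting values and prints how quickly they diverge.

```python
# Illustrative sketch: sensitivity to initial conditions in the
# logistic map x -> r * x * (1 - x), a standard chaotic system for r = 4.
r = 4.0
x, y = 0.200000, 0.200001  # two almost identical initial conditions

for step in range(1, 41):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: x={x:.6f} y={y:.6f} |x-y|={abs(x - y):.6f}")
# After a few dozen iterations the two trajectories differ by an amount
# comparable to the values themselves, despite starting 1e-6 apart.
```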

Mechanisms

Recent developments around artificial life , evolutionary computation and genetic algorithms have led to an increasing emphasis on complexity and complex adaptive systems .

Simulations

In social science , the study of the emergence of macro-properties from micro-properties is also known as the macro-micro view in sociology . The topic is commonly recognized as social complexity, which is often related to the use of computer simulation in social science, i.e. computational sociology .

Systems

Main article: Complex system

Systems theory has long been concerned with the study of complex systems (in recent times, complexity theory and complex systems have also been used as names of the field). These systems are present in the research of a variety of disciplines, including biology , economics , social studies and technology . Recently, complexity has become a natural domain of interest of real world socio-cognitive systems and emerging systemics research. Complex systems tend to be high-dimensional , non-linear , and difficult to model. In specific circumstances, they may exhibit low-dimensional behavior.

Data

In information theory , algorithmic information theory is concerned with the complexity of strings of data.

Complex strings are harder to compress. While intuition tells us that this may depend on the codec used to compress a string (a codec could be theoretically created in any arbitrary language, including one in which the very small command “X” could cause the computer to output a very complicated string like “18995316”), any two Turing-complete languages can be implemented in each other, meaning that the lengths of two encodings in different languages differ by at most the length of the “translation” program – which ends up being negligible for sufficiently large data strings.
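As a hedged, practical illustration, the Python sketch below uses zlib compression as a crude upper-bound proxy for the algorithmic complexity of a string: a repetitive string compresses far more than random bytes of the same length. This is only a stand-in for Kolmogorov complexity, which is itself uncomputable, and the string contents are illustrative.

```python
# Illustrative sketch: compressed length as a rough, upper-bound proxy
# for the algorithmic complexity of a string.
import os
import zlib

repetitive = b"ab" * 5_000            # highly structured, low complexity
random_ish = os.urandom(10_000)       # incompressible, "complex" to this measure

for label, data in [("repetitive", repetitive), ("random", random_ish)]:
    compressed = zlib.compress(data, 9)
    print(f"{label:10s}: {len(data)} bytes -> {len(compressed)} bytes compressed")
# The repetitive string shrinks dramatically; the random bytes barely
# shrink at all, so this measure assigns them a high "complexity".
```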

These algorithmic measures of complexity tend to assign high values to random noise . However, those studying complex systems would not consider randomness as complexity.

Information entropy is also sometimes used in information theory as indicative of complexity.

Recent work in machine learning has examined how the complexity of the data affects the performance of supervised classification algorithms. Ho and Basu present a set of complexity measures for binary classification problems. [14]

The complexity measures broadly cover:

  • the overlaps in feature values from different classes.
  • the separability of the classes.
  • measures of geometry, topology, and density of manifolds .

Instance hardness is another approach that seeks to characterize the complexity of the data, with the goal of determining how hard a data set is to classify correctly; it is not limited to binary problems. [15]

Instance hardness is a bottom-up approach that first seeks to identify instances that are likely to be misclassified (or, in other words, which instances are the most complex). The characteristics of the instances that are likely to be misclassified are then measured on the basis of a set of hardness measures. The hardness measures are based on several supervised learning techniques, such as measuring the number of disagreeing neighbors or the likelihood of the assigned class label given the input features. The information provided by the complexity measures has been examined for use in meta learning to determine for which data sets filtering (or removing suspected noisy instances from the training set) is the most beneficial [16] and could be expanded to other areas.
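A minimal sketch of one such hardness measure, loosely in the spirit of the disagreeing-neighbors idea mentioned above (the exact measures defined in the cited papers differ): for each instance it computes the fraction of nearest neighbors whose label disagrees with the instance’s own label, using scikit-learn on a synthetic data set. All names, sizes, and parameter choices are illustrative.

```python
# Illustrative sketch: a simple "disagreeing neighbors" hardness score.
# For each instance, the score is the fraction of its k nearest
# neighbors whose class label differs from the instance's own label.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neighbors import NearestNeighbors

X, y = make_classification(n_samples=500, n_features=5, n_informative=3,
                           random_state=0)

k = 7
nn = NearestNeighbors(n_neighbors=k + 1).fit(X)   # +1: each point is its own neighbor
_, indices = nn.kneighbors(X)

neighbor_labels = y[indices[:, 1:]]               # drop the point itself
hardness = (neighbor_labels != y[:, None]).mean(axis=1)

print("mean hardness:", hardness.mean())
print("hardest instances:", np.argsort(hardness)[-5:])
# Instances with high scores sit in regions dominated by the other class
# and are the ones most likely to be misclassified.
```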

In molecular recognition

A recent study based on molecular simulations and compliance constants describes molecular recognition as a phenomenon of organization. [17] Even for small molecules like carbohydrates , the recognition process cannot be predicted or designed even assuming that each individual hydrogen bond ’s strength is exactly known.

Applications

Computational complexity theory is the study of the complexity of problems – that is, the difficulty of solving them. Problems can be classified by complexity class according to the time it takes for an algorithm – usually a computer program – to solve them as a function of the problem size. Some problems are difficult to solve, while others are easy. For example, some difficult problems need algorithms that take an exponential amount of time in terms of the size of the problem to solve. Take the travelling salesman problem , for example. It can be solved in time O(n²2ⁿ) (where n is the size of the network to visit – the number of cities the travelling salesman must visit exactly once). As the size of the network of cities grows, the time needed to find the route grows (more than) exponentially.
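As a hedged sketch of where the O(n²2ⁿ) bound comes from, the Python code below implements the Held–Karp dynamic program over subsets of cities for a small random instance; it illustrates the growth rate rather than serving as production routing code, and the instance size and distances are arbitrary.

```python
# Illustrative sketch: Held-Karp dynamic programming for the travelling
# salesman problem, running in O(n^2 * 2^n) time and O(n * 2^n) space.
import itertools
import random

def held_karp(dist):
    """Return the length of the shortest tour visiting every city once,
    starting and ending at city 0. dist is an n x n distance matrix."""
    n = len(dist)
    # dp[(S, j)] = cost of the shortest path that starts at city 0, visits
    # exactly the cities in frozenset S, and ends at j (j in S, 0 not in S).
    dp = {}
    for j in range(1, n):
        dp[(frozenset([j]), j)] = dist[0][j]
    for size in range(2, n):
        for subset in itertools.combinations(range(1, n), size):
            S = frozenset(subset)
            for j in subset:
                dp[(S, j)] = min(dp[(S - {j}, k)] + dist[k][j]
                                 for k in subset if k != j)
    full = frozenset(range(1, n))
    return min(dp[(full, j)] + dist[j][0] for j in range(1, n))

n = 10
random.seed(0)
dist = [[0 if i == j else random.randint(1, 100) for j in range(n)]
        for i in range(n)]
print("shortest tour length:", held_karp(dist))
# Each extra city roughly doubles the number of subsets the table covers,
# so the running time grows exponentially with n.
```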

Even though a problem may be computationally solvable in principle, it may not be that simple. These problems may require large amounts of time or an inordinate amount of space. Computational complexity can be approached from many different aspects. Computational complexity can be investigated on the basis of time, memory or other resources used to solve the problem. Time and space are two of the most important and popular considerations when problems of complexity are analyzed.

There exists a certain class of problems that, although they are solvable in principle, require so much time or space that it is not practical to attempt to solve them. These problems are called intractable .

There is another form of complexity called hierarchical complexity . It is orthogonal to the forms of complexity discussed so far, which is called horizontal complexity.

See also

  • Chaos theory
  • Command and Control Research Program
  • Complex systems
  • Complexity theory (disambiguation page)
  • Cyclomatic complexity
  • Digital morphogenesis
  • Dual-phase evolution
  • Emergence
  • Evolution of complexity
  • Game complexity
  • Holism in science
  • interconnectedness
  • Law of Complexity / Consciousness
  • Model of hierarchical complexity
  • Names of large numbers
  • Network science
  • Network theory
  • Novelty theory
  • Occam’s razor
  • Process architecture
  • Programming Complexity
  • Sociology and complexity science
  • Systems theory
  • Thorngate postulate of commensurate complexity
  • Variety (cybernetics)
  • Volatility, uncertainty, complexity and ambiguity
  • Computational irreducibility
  • Zero-Force Evolutionary Law

References

  1. Johnson, Steven (2001). Emergence: The Connected Lives of Ants, Brains, Cities. New York: Scribner. p. 19. ISBN 3411040742.
  2. Antunes, Ricardo; Gonzalez, Vicente (3 March 2015). “A Production Model for Construction: A Theoretical Framework”. Buildings. 5 (1): 209–228. doi:10.3390/buildings5010209. Retrieved 17 March 2015. “Vastly present in the literature, the word ‘complex’ seems to stand for a supernatural force supposedly responsible for disturbances, a scary ghost haunting projects. With no absolute definition of this complexity, the only consensus is that there is no agreement on the specific definition of complexity [66]. However, a characterization of what is complex is possible. A structure is complex [67], with dynamic networks of interactions, and their associations are not aggregations of the individual static entities [68].”
  3. Zayed, J. M.; Nouvel, N.; Rauwald, U.; Scherman, O. A. (2010). “Chemical Complexity – supramolecular self-assembly of synthetic and biological building blocks in water”. Chemical Society Reviews, 39, 2806–2816. http://pubs.rsc.org/en/Content/ArticleLanding/2010/CS/b922348g
  4. Johnson, Neil F. (2009). “Chapter 1: Two’s company, three is complexity”. Simply Complexity: A Clear Guide to Complexity Theory (PDF). Oneworld Publications. p. 3. ISBN 978-1780740492.
  5. Weaver, Warren (1948). “Science and Complexity” (PDF). American Scientist. 36 (4): 536–44. PMID 18882675. Retrieved 2007-11-21.
  6. Johnson, Steven (2001). Emergence: The Connected Lives of Ants, Brains, Cities, and Software. New York: Scribner. p. 46. ISBN 0-684-86875-X.
  7. Debnath, Lokenath (The University of Texas-Pan American, US). “Sir James Lighthill and Modern Fluid Mechanics”. Imperial College Press, Singapore. ISBN 978-1-84816-113-9, ISBN 1-84816-113-1. p. 31. Online at http://cs5594.userapi.com/u11728334/docs/25eb2e1350a5/Lokenath_Debnath_Sir_James_Lighthill_and_mode.pdf [permanent dead link]
  8. Jacobs, Jane (1961). The Death and Life of Great American Cities. New York: Random House.
  9. Ulanowicz, Robert (1997). Ecology, the Ascendant Perspective. Columbia.
  10. Burgin, M. (1982). “Generalized Kolmogorov complexity and duality in theory of computations”. Notices of the Russian Academy of Sciences, 25 (3): 19–23.
  11. A complex network analysis example: Grandjean, Martin (2017). “Analisi e visualizzazioni delle reti in storia. L’esempio della cooperazione intellettuale della Società delle Nazioni”. Memoria e Ricerca (2): 371–393. doi:10.14647/87204. (See also the French version.)
  12. Stanowski, Mariusz (2011). “Abstract Complexity Definition”. Complicity, 2: 78–83. [1]
  13. Lissack, Michael R.; Roos, Johan (2000). The Next Common Sense: The e-Manager’s Guide to Mastering Complexity. Intercultural Press. ISBN 978-1-85788-235-3.
  14. Ho, T. K.; Basu, M. (2002). “Complexity Measures of Supervised Classification Problems”. IEEE Transactions on Pattern Analysis and Machine Intelligence, 24 (3): 289–300.
  15. Smith, M. R.; Martinez, T.; Giraud-Carrier, C. (2014). “An Instance Level Analysis of Data Complexity”. Machine Learning, 95 (2): 225–256.
  16. Saez, J.; Luengo, J.; Herrera, F. (2013). “Predicting Noise Filtering Efficiency with Data Complexity Measures for Nearest Neighbor Classification”. Pattern Recognition, 46 (1): 355–364.
  17. Grunenberg, Jörg (2011). “Complexity in molecular recognition”. Phys. Chem. Chem. Phys., 13: 10136–10146.