OLICOGNOGRAPHY on SOCIAL INFRASTRUCTURES


Basic Olicognograph: Complex Social Relations

Logics of Amounts

Some starting comments. Pure abstraction (that of first-order logic) is valid insofar as it starts from virtual points. We make the mistake of not starting from the three spatial dimensions plus time (probably a non-commutative group structure, in the way of A. Connes). Intuitively, if we start at the proper dimensional level, this would be the level of 'topos logics' (three-valued?). Physical realities start there, so alongside formal logic we should care especially about non-commutative geometry. We may also imagine that dimensions lower than 3 can be used practically, but then, going from top down, they appear as degenerate, property-giving cases; meanwhile the reductionist pathway, which formally runs from bottom up, makes it harder to imagine properties that do not simply come from parts or pieces. Reductionist ways may also neglect the right sequence: virtual point - real material point - virtual line (trajectories, ...) - real lines (strings, ...) and so on. Going from top down, a life scientist may recall how biological units use sub-volumes as sub-systems or compartments, two-dimensional membranes to separate, filaments to haul, networks to link...

Now, could what is obvious in ecology or biology be applied to physics? Hamiltonians are essential to basic physical formulation, yet it remains hard to express within them a delimited size, which is, for example, so important to the energetics of biological metabolism. Could it be that quantum mechanics is a lowered expression, or an upward window, of the universe's construction, having started from a volume without time? Whatever the case, our focus is on a more modest practice, a pathway other than the reductive one. The dual and the complex match what is defined from bottom up against what is defined from top down. Observe, for example, the scale-differentiated self-similar processes of Mandelbrot's fractals, better conceived as 'incomplete' reductions of natural-number dimensions and so as a top-down process (commonly falling between dimensions 3 and 2), against Bejan's constructals, which emerge bottom-up. Are characteristic exponents the way to reconsider all that?
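As a minimal numerical sketch of the 'incomplete reduction of natural-number dimensions' mentioned above (the examples and numbers are illustrative, not from this text), the similarity dimension of a self-similar set falls strictly between the integer dimensions:

```python
import math

# Similarity dimension D of a self-similar set: N copies scaled by ratio r
# satisfy N * r**D = 1, so D = log(N) / log(1/r).
def similarity_dimension(copies: int, ratio: float) -> float:
    return math.log(copies) / math.log(1 / ratio)

koch = similarity_dimension(4, 1 / 3)        # Koch curve: 4 copies at scale 1/3
sierpinski = similarity_dimension(3, 1 / 2)  # Sierpinski triangle: 3 copies at 1/2
print(round(koch, 4), round(sierpinski, 4))  # non-integer values between 1 and 2
```

Both values land between 1 and 2, the kind of 'between natural dimensions' pattern the paragraph points to.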

Examples of concepts whose formalization deserves care:

  • Entailment and entanglement. Entailment has various meanings according to use; in logic we would say it is about 'necessary implication (effects) producing...'. In economics it is about secondary linking effects, eventually at the margins, as on public budgets (could Tanzi's effect be of this sort?). In quantum mechanics, "there is a close relation between entanglement encoded in a quantum state and its degrees of information. Knowledge of the global degree of information, or of the marginal informations related to the subsystems of a multipartite system, results in a qualitative characterization of the entanglement: the latter increases with decreasing global mixedness, with increasing marginal mixedness and with marginal mixednesses as close as possible." Think of finite iterations and extinctions, not infinite ones, which require such breaking rules.
  • Vectors and torques (or moments), on the formal side as well as on the material side; considered alone and pure, or local, or grouped and global (as in fields). This makes the vector of some particle not endogenous but defined 'exogenously'. The intuitive remark is to care about the system making the field and the sub-systems making the subject-object of our interest. Observe, for example, that it is the field, or large system, which makes the regular vectors of the local particle (a field of vectors), without missing that local particles can, around themselves, modify-bend-embed the regular field close to the object. On the virtual side, the minimum is a pointed material model with a vector given by its free speed (or its potential transformation if it knocks into something); better than a single virtual point. That is, you have to consider the point together with its vector, be it a speed or the beginning of a link.
  • Series patterns, or step-wise processes such as cellular automata, considering reality's processes of iteration. Different patterns, with a special interest for those that alternate (converging or non-diverging trajectories) or are periodic.
  • Relevance of rigorous-tight-strong implication(s) and sorts of formal weakenings. Many formal conditions (say convexity, or extremum conditions) are too strict. So, intuitively, think of sorts of weakening that allow recombinations between processes emerged at different (close?) moments of evolution, permitting something like logical feedbacks, reversals, local symmetries or intercalations, so as to locally bypass the curse of rigorous reductive partitions (formally, partitions have straight cuts and allow no overlaps).
  • Branching processes, in a shorter and more complicated way than large trees. Perhaps, so as not to complicate the formalism of realities too much (ununderstandable complexity is not our dream), and with a smarter treatment of residues-remains, a top-down process could examine sorts of complex Bayesian experimental processes (since life is constantly trying to mimic and relate in order to maintain the functions essential for survival). This could also help qualify analogic thinking as a logic of pattern, by pre-trial and post-denial (as life may do). Computed simulations are trying to show branching chaotic processes, and they are important for understanding the possibilities of momentum orders in complex reality. Of course, do not miss understanding better how to apply rigorously formal Bayesian methods usefully.
  • Times' logics. Such logic actually observes 'chronological logic' on one side and, with 'tense logic', the logic of grammar on the other. It is more complicated, since times can be approached in different ways and have many patterns beyond the linear one reflected by a clock: parallel ones, branching ones, emerging ones and so on. A picture could be provided by evolution's tree.
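The cellular-automaton bullet above can be made concrete with a few lines. This is a generic sketch, not anything specified by this text: an elementary automaton (Rule 90, where each cell becomes the XOR of its two neighbours) iterated from a single seed, showing the kind of step-wise series pattern meant:

```python
# Elementary cellular automaton, Rule 90: each new cell is the XOR of its
# two neighbours. A single live cell unfolds into a Sierpinski-like pattern.
def step(cells, rule=90):
    out_bits = [(rule >> i) & 1 for i in range(8)]  # the rule's lookup table
    n = len(cells)
    return [out_bits[(cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]]
            for i in range(n)]

row = [0] * 15
row[7] = 1                       # single live cell in the middle
history = [row]
for _ in range(7):
    row = step(row)
    history.append(row)

for r in history:
    print("".join("#" if c else "." for c in r))
```

The live-cell count per row follows 2 to the power of the number of 1-bits of the row index, a simple arithmetic regularity emerging from pure iteration.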
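The 'rigorously formal Bayesian methods' invoked in the branching-processes bullet reduce, in the simplest case, to a one-line update. The numbers below (base rate, sensitivity, specificity) are purely illustrative:

```python
# Posterior probability after one positive test, via Bayes' rule.
prior = 0.01                 # base rate of the condition (illustrative)
sens, spec = 0.90, 0.95      # P(+|condition), P(-|no condition) (illustrative)

p_pos = sens * prior + (1 - spec) * (1 - prior)   # total probability of a positive
posterior = sens * prior / p_pos

print(round(posterior, 3))   # ~0.154: far below the 0.90 sensitivity
```

Even a very reliable test leaves the posterior low when the prior is low; iterating such updates along branches is the 'complex Bayesian experimental process' the bullet gestures at.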

Some formal quantification structures important to think about:

  • More than set theory: "there are a number of alternative approaches which have been considered then and later. But a single theory, the Zermelo-Fraenkel theory with the Axiom of Choice (ZFC), dominates the field in practice. One of the strengths of Zermelo-Fraenkel set theory is that it comes with an image of what the world of set theory is (just as most of us have a common notion of what the natural numbers, the real numbers, and Euclidean space are like). This image is what is called the 'cumulative hierarchy' of sets." But recent difficulties have made it clear that set theory is not everything. Hence the recent renewed development of the 'usual' model construction of NFU (Quine's New Foundations with urelements), shown to be consistent by Jensen. This starts with a model of the usual set theory with an automorphism that moves a rank, this rank being the domain of the model. 'Most elements' of the resulting model of NFU are urelements. Very importantly, information is not discarded: the membership relation of the original model, restricted to the domain of the model of NFU, remains expressible in the language of NFU. So there are good reasons for mathematicians to be aware that there are alternatives to ZFC, including in software programming.
  • Group theory is the basic algebraic structure (as derived from Galois' seminal works), joining the properties of associativity, neutral element and inverse element. If also commutative, a group is called abelian. Many properties can be approached within the group frame: symmetry, groups of transformations, permutations, transpositions and subgroups. (Can a living unit be an isomorphism, or at least can a formal model be isomorphic to the modelled unit? And which sort of logical-formal flexibility can help to use the formalism of model theory? See for example Girard.) Even if it is hard to understand why, groups find good use in the physical structures of crystals. Higher formal algebraic structures include rings and fields. Interpretations alongside common-life concepts are to be conceived; care not to force these.
  • What the theories of pure mathematics could bring is inspiring properties and frameworks, for empirical democratic exercises on one side and intelligent software datamining on the other. Our purpose is not to make things and computed models unmanageable, but to practice together more freely than rationalist dogmatism allows, closer to the phenomenologies of complex realities, and to make what could be wise democratic rules more catchable by software systems of cognitive recognition. Procedures that usefully take advantage of the fundamental algebra of formal structures for intelligent datamining could provide the essential information needed by laypeople. Laypeople may not know exactly what information they need; meanwhile experts overproliferate abusively, having explanations for everything yet inducing pitiful management, being systematically wrong and betrayed by much evidence. Datamined knowledge will then need to be appropriately presented, matched with the properties to expect from a 'social algebra', understood and applied well. At least enough to resolve the basic problems of the complexities by which communities are concerned, so that they could thereafter economically call on experts, when and where scientifically needed, to be properly and democratically incorporated.
  • Think that the proper way to simplexities is to anchor qualitative concepts wisely. Qualitative concepts are at present extraordinarily overinflated by human speculation, most often with the same disguised intention of gaining self-advantage from private motives that often turn despicable: either the lone ambition to make money at the expense of others, or the practice of differences for discrimination ('pay for my training services in speculative practices and miss the real meanings'). In social issues we may be wasting time and money on definitions that are just remixes of the same things, previously cared for neither properly nor effectively.
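The group axioms listed in the bullet above can be checked mechanically on a small example. This sketch (the choice of Z/5 under addition is ours, not the text's) verifies closure, associativity, neutral element, inverses and commutativity by exhaustion:

```python
# Check the group axioms for (Z/5, +): closure, associativity,
# neutral element, inverses, and commutativity (making the group abelian).
n = 5
elems = range(n)
op = lambda a, b: (a + b) % n

closure = all(op(a, b) in elems for a in elems for b in elems)
assoc = all(op(op(a, b), c) == op(a, op(b, c))
            for a in elems for b in elems for c in elems)
neutral = all(op(a, 0) == a for a in elems)
inverses = all(any(op(a, b) == 0 for b in elems) for a in elems)
abelian = all(op(a, b) == op(b, a) for a in elems for b in elems)
print(closure, assoc, neutral, inverses, abelian)  # all True
```

Exhaustive checking only works for small finite structures, but it makes the axioms tangible before any symbolic proof.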

Mathematics to take a bit more Complicatedly (be smart: understand what you see)

More mathematics, but no fewer precautions. Better in an intuitive sense, locally applied; forcing mathematicians to work more, since they can be paid less than 'experts'.

Imagine that complex objects show mixtures of quantitative behaviors, less easy to manipulate than virtual labels and not such simple 'neutral operations'. Complex real subjects mix many different kinds of behavior; change their regimes more or less abruptly; are sensitive to environmental (and initial) conditions, and so on. The simplest behaviors (linearity, normality), convenient for easy calculus, are exceptions, and need rigorous reasons or tightened conditions to hold: as in the crystal-like structures of solid matter. Such behavior needs huge direct (and lone) constraints (like gravitational force); an absolute dictatorial norm or perfect ergodicity; pitiless perfect markets; or the violence of human ideological illusions. Nothing of wise economics; just enough to allow humans to insert their complicated difference so as to prepare catastrophes, dramatic breakings or outbursts, without preventing the mistakes made by dogmatisms.

Now, our position is to maintain the critical mechanisms that preserve the fragile nest of our humane life. We had plenty of stocks, now exhausted for many, in not much more than two centuries. To expand over planet Earth there are complex patterns and behaviors to preserve, because they are what we are, and others to avoid. Reductionism is the simplest formal form of what not to do if we are to stay, making human history simply uncaring of sustainability. Alongside formal methods we may also use many complex phenomenologies to make a better 'social algebra'. Social algebra means something complicated to reach together, as when solving what we call complex programs; maybe not so complicated in each single place, but somehow 'wholesomely' understood. Let us introduce some ideas with respect to formal patterns to 'understand better' (remember what Einstein said about difficulties with mathematics); just trying and caring may be enough for the future of our planet, if you assume all the consequences of what you are doing yourself:

  • Numbers have a dual nature, which is not cared for enough. One is that of the cardinal: an entity, like a complete or natural number, carrying information in an almost qualitative sense (but related one number to the next by a process of construction, an iteration like a simple addition, +1; most often this is used implicitly). The other nature of number is the ordinal, one after the other; applied when considering that a lower one is contained in a higher one, as 3 is contained in 5 or 8, by successive incorporations; this is the other, more traditional use, where the relation of order applies.
  • There are many things to understand better in the symmetrical diagram of the exponential and logarithm curves. Combined, they just make the basic linear curve (of slope one). The logarithm curve may be interpreted as the emergence of a unit: it catches its base correspondence (0, 1) at log(1) = 0; Dirac's function requires some involvement here. Entropy expressions in Shannon's theory of information have since developed into a huge corpus of formulas and indices, to be understood better in laypeople's interpretations. For a start (without unnecessary complications), imagine the exponential as the pattern of the quantity of a living being (away from some oceanic places moving them, creatures are tamed by the resources dispersed in their environment), and the logarithm as the pattern of its amount of qualities or properties. Anyhow, the linear confusion of the two patterns makes exact balance quite improbable; the needs for renewal, maintenance and diversification, and the discrepancies seeking diversity, determine cycles.
  • Uniqueness should be shortcut. Sorts of cobordisms, replicators, bifurcations and foldings developed within thermodynamical limits, across previous times including moments of catastrophic pathways and strange multiscaling trajectories, are to be managed better with more, smarter and popular software, as well as practically examined, as with conceptual multidimensioned networks and multivalued logics. Bifurcations, and Thom's patterns of the basic complex bifurcating functions, recalling the taming by thermodynamics, so as to create either 'metastable' states or the consolidation of higher levels of complexity in critical metastates of renewal and maintenance, are to be explored.
  • Complex numbers: in our materialist perspective, as in economic modelling, after a brief introduction in mathematics education applied to economics, most of complex numbers is missed. This is also the defect of the social sciences: to think that any value is really positive (forgetting the nature of mind) and passes through simple operations (eventually subtraction), ignoring most of the relative meaning of imaginary numbers. If real expressions pass logically through real numbers, observe that any prior balance, any relative comparison of measures, any search for relative definitions, any subjective preliminary to decision, any fluctuating process, any mechanism observing not one but at least two or multiple factors, should give complex numbers much better use. Part of the problem is the mismanagement of complex units (treated only as an amount of raw material on the simple straight slope of backward reductionism). Economics being mostly a subjective process of definition before making things real, and most social accounting proceeding by balancing opinions, moods, preferences and antagonisms, complex numbers have no good reason for being so underused.
  • Also, when not forcing the smoothness, continuity and asymptotic properties of probability distributions, consider the relevance of methods such as renormalisation (useful for the study of essential laws distributed at different levels of scale), wavelets for signal analysis, and path-dependent integrals (Kac-Feynman), which are probably phenomenologically important; let them be introduced in a more suggestive way, to be recalled when necessary, enough so as to call for the help of good mathematicians, not just shortened by engineers' preference for their simplified mathematics (which they can still manage), 'already too good for loose stupid commons'. Remember also that good frames, pictures or schemes can use cognitive means different from rigorous formulas, allowing many quantitative estimates.
  • Pieces of free algebra (generally solvable?): once it is understood that pieces of mathematical algebra have special properties that could be physically, as well as physico-biologically, essential, it would become clearer what sorts of transformations, limits, criteria and conditions we have to preserve, maintain, uphold and so on. This is out of the way of the maximization of benefits, which is almost systematically biased toward the maximum waste of maximum energetic transformations. For not waiting too late, until the catastrophic regression of the expression of scarcity, to restore, away from the best biotic expression of Nature's production for all. Of course, wars, diffused or major conflicts, man-made and man-enhanced natural disasters and casualties can 'solve' our future problems of planning, though not with the good sense of humane intelligence.
  • Catastrophes and non-linearity are formal sorts of non-simple behaviors. They are more essential to realities than simple ones. Practically, the task is to identify the sorts of positive catastrophes and non-linearities that sustain us. Strange attractors are of special interest: they look like bundles exploring, for example, 2 dimensions out of an original 3-dimensional volume, but also, more simply, all the volume. After them we can imitate, as with procedures, staying somehow independent and parallel but also making somehow simple shared local evidence (even if globally unpredictable and very sensitive to initial conditions and environmental changes).
  • Another sort of dynamical pattern to explore: non-linear trajectories that are not the simple lines of functions but always blurred, thick, material and of changing composition, when not sheaves over complex heterogeneous substrates or grounds. This makes the pattern more a 'coincidental ghost', an interface for exchanges or transformations, rather than a fixed composed entity. Reviewing so the 'volumes-planes-curves-points of life', as well as more complicated functions like the log-linear, Verhulst's or Gompertz ones and so on. Also consider sorts of bundles, groups of trajectories, mechanical waves, fields that come together blurred, of which secondary sorts of complex forms could be: 1) the combination of long-tailed probability distribution functions with step-wise-up ones (say a lognormal combined with Verhulst's or gamma ones), since these could be the patterns of the ways up of negentropy. Without forgetting that entropy is increasing overall, so the part which is positively feeding negentropy is probably decreasing exponentially locally. Maybe the complex correlogram of all that lies in the forms of Poisson's stochastic curves (considering thermodynamic constraints and the energetic efficiencies of transfer-catchings), hierarchically weighted by time.
  • There are non-linear patterns to explore with more topological dimensions and criteria: the sorts of basic lines, as of matter-energy and information under transformation, could have mechanics that look like spiraling or helices (DNA-strand-like?), allowing the exploration of models of transformations, such as how they should look when sustainably transformed. Rather than simple curves, as when money finances factors of production.
  • Ergodic stochasticity: "The word 'stochastic' is based on the idea of a target and in particular on the pattern of hits around a target. The greater the distance a given unit of target area is from the center of the target, the less frequent or dense will be the hits on that area. It can also be said that there are two 'worlds': the 'real world' of observation and the 'ideal world' of the theory or mathematical model. We might look at a model as a shot at the 'real world' target. When we say the theory (or model) is 'true' we mean that there is an exact correspondence between the real and the ideal worlds. There are many reasons why we might miss the target, but they fall into two rough categories: 1) Ours was a 'bad' shot, i.e., our model was false or logically invalid, or; 2) The target moved unexpectedly, i.e., there are random, unexplained variations in the objects we are attempting to explain or use in our explanation. If it is granted that it is the models or theories which are stochastic and not necessarily the real world, then stochastic models are still subject to logical problems..."
  • Completed probability: when you treat risks and formula-able uncertainties, not pressured to be sure, users nevertheless miss taking a more complete view of the relations between inference and statistical tests. Tests appeared test by test, and checking everything was hard when bare-handed. Now, if considering topos, mixtures of kinds of formalisms and more discriminant analyses are to be made and articulated. Intuitively this would be like considering (according to a topos logic) the 4 kinds of moments parametrizing models of probability distributions, the effect of quantities under control with or without laws of great numbers. Observe that these moments - average, variance (standard 'error'), skewness and kurtosis - characterize probability distributions in level, range, side-pattern (orientation) and contrast (differentiation from baseline). In economical ways, where nature explores options (some, and not so many), it at least saves 'time' by reusing what has been created, to eventually evolve into different end-uses. Nature simplifies, but phenomenologically, not as our formal laziness would like; and, if incorporating enough essential functionalities, it 'can create' new organisms, setting this identity over processes and components well similar from one species to another. Biologically, pigs or chickens are not so different from humans (there are genetically many pig-borne infections), and chickens - genetically quite close to tyrannosaurus (tyrannosaurus tandoori?) - are still close but may catch less from migrating birds and pass it on to humans.
  • Scale(s) and modularity: thermodynamic constraints probably define the levels of scale and the successive umbrellas under which complexities emerge, by coalescence and potential reactivities. Now, formal uses - not as monotonous iterations or regular integrations, infinity and perfect reductionist frames, especially in the conjuring tricks of leaders (still with expressions of unease at econometrics' logarithm transformations, forced closures, etc.) - nevertheless still have many formal resources in modular treatments from the theory of numbers, with finite (volumic - quaternionic - octonionic) matrices and so on. All imagined packed, thresholds identified, probabilized, entropized and anthropized as we imagine in what we call 'topos logics'. There is much for mathematicians to investigate, to set the pertinent characteristic levels on the scales of what we intuitively call complexities. In biology these have been mentioned, for example, by Longo and Bailly.
  • Finance formalism now operates well (recently adjusted) with probability models, after having succeeded in creating places - financial markets - forcing most of real economics to access them for finance, then applying probabilistic models of risks all too well disconnected from real economics. As a result most of the sector has huge profits (and has well socialized its losses to get back on profit trails, at the expense of employment in the real economy). Probabilistic global models help speculative opportunities. Averaged policies lower the risks of nonspecific cases, but are far less useful to immediate human concerns. Meanwhile many social or united groups have plenty of interest in projects' failures. They may draw plenty of benefit from this information so as to turn predatory, because the mechanisms of transfer, distribution and diffusion are concentrated and easy to bias. Solidarity, required for balancing wealth toward the least informed people, works better at maintaining adverse asymmetry. Exchanges systematically maintain the same biases, away from the fair picture of business values and away from essential needs. Meanwhile, exogenous intermediations often confuse relationships of human proximity. Very little is seriously done to put both parties into proper conditions of fair exchange; neither probabilities nor data are fairly cared for.
  • Probabilistic models can be a source of inspiration if modestly delimited (not exuberantly freed) and tamed (but not too distorted by biases); and some democratic procedures can eventually use sorting processes, but these should be carefully applied to honest issues. It is wise to recognize where and when you are ignorant but have made efforts to understand, as well as not to miss your duty as a humane being. Now, what is essential to policies and other purposes is to understand better how stochasticity and probability or uncertainty mix to make realities. We are accustomed to observe that it is often more interesting to care for what uncertainty and probabilistic models say and suggest as cautions, rather than to force determinist ambitions about what to make, thanks to averages made for forcing 'true' definitions. Too often the imprecisions of probabilistic models seem to allow every kind of wrong reasoning and to substitute for legitimate democratic arbitrations.
  • Other important registers of methods are network algebra, including complex networks, and the theory of games, which have been mentioned on other pages.
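The Verhulst dynamics and the sensitivity to initial conditions invoked in the bullets above can be seen in the discrete logistic map. A minimal sketch (parameter values are illustrative): the same one-line rule converges, oscillates or turns chaotic depending only on the parameter r:

```python
# Discrete Verhulst (logistic) map x -> r*x*(1-x): one rule, three regimes.
def orbit(r, x=0.2, skip=500, keep=4):
    for _ in range(skip):          # discard the transient
        x = r * x * (1 - x)
    out = []
    for _ in range(keep):
        x = r * x * (1 - x)
        out.append(round(x, 4))
    return out

print(orbit(2.8))   # settles on a single fixed point
print(orbit(3.2))   # alternates between two values (period 2)
print(orbit(3.9))   # aperiodic, sensitive to initial conditions
```

The change of regime happens abruptly in r, the kind of 'bifurcation within thermodynamical limits' the text keeps returning to.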
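The four moments named in the 'completed probability' bullet (level, range, orientation, contrast) can be estimated from samples directly. A sketch with illustrative simulated data, nothing more:

```python
import math
import random

# Mean (level), variance (range), skewness (orientation) and excess
# kurtosis (contrast against the Gaussian baseline) of a sample.
def moments(xs):
    n = len(xs)
    m = sum(xs) / n
    var = sum((x - m) ** 2 for x in xs) / n
    sd = math.sqrt(var)
    skew = sum(((x - m) / sd) ** 3 for x in xs) / n
    kurt = sum(((x - m) / sd) ** 4 for x in xs) / n - 3  # 0 for a Gaussian
    return m, var, skew, kurt

random.seed(0)
gauss = [random.gauss(0, 1) for _ in range(100_000)]
expo = [random.expovariate(1.0) for _ in range(100_000)]
print([round(v, 2) for v in moments(gauss)])  # near (0, 1, 0, 0)
print([round(v, 2) for v in moments(expo)])   # near (1, 1, 2, 6): skewed, heavy-tailed
```

The exponential sample shows how a distribution can share level and range with the Gaussian while differing sharply in orientation and contrast.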
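The claim that complex numbers suit 'mechanisms observing not one but two factors' has a minimal concrete form: keep a pair of coupled quantities as one complex value, and model a balanced fluctuation as a rotation, which changes the mix without changing the magnitude. The interpretation of the two components is our illustrative choice, not the text's:

```python
import cmath

# Two coupled factors held as one complex number; multiplying by a
# unit-modulus factor rotates the pair (a fluctuating balance) while
# conserving its overall magnitude.
state = complex(3, 4)                 # the two factors as one value, |state| = 5
step = cmath.exp(1j * cmath.pi / 6)   # a 30-degree rotation per period

for period in range(12):              # one full cycle: 12 x 30 degrees
    state *= step

print(abs(state))                     # magnitude preserved, still ~5.0
```

After a full cycle the pair returns to its starting mix; with real numbers and subtraction alone, this conserved oscillation takes two variables and a trigonometric bookkeeping to express.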

Can Quantum Multivalued Logic Inspire? (away from esoteric hysteria)

Quantum mechanics looks like anticipating the qualities of particles as states. The complexity of quantum mechanics, even in this fundamental register, is easy algebra (mostly linear) but hard to 'understand'; quantum mechanics explains better many physical properties of matter. It is basically based on probabilistic calculus. Upon decoherence, at the minimum through the interaction(s) with the experimental device of the observer, the qualities of transformations and the nature of particles are fixed, and the quantitative values of classical mechanics' relations, and the estimated values at the speeds of Newtonian or Einsteinian mechanics, apply. Quantum formalism presents many properties that can qualify it for normal use. It is of course not to pretend that the economy could be explained via quantum mechanics, but rather a formalism of accepted conventions, with frames of realities and especially patterns (potential patterns) driven by properties we can agree on. There are plenty of interesting properties and concepts, but they are to be kept away from esoteric illusions. If admitted so and democratically supported, they could shape frameworks of human social actions (clusters of actions).
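The 'easy algebra, mostly linear, based on probabilistic calculus' point above can be shown in a few lines. A minimal sketch, not a claim about any particular physical system: a single qubit as two complex amplitudes, with measurement probabilities given by squared moduli:

```python
import math

# A single qubit a|0> + b|1>: complex amplitudes add linearly, and
# measurement probabilities are |a|^2 and |b|^2, summing to 1 (Born rule).
a = complex(1 / math.sqrt(2), 0)
b = complex(0, 1 / math.sqrt(2))   # relative phase a classical bit cannot carry

p0 = abs(a) ** 2
p1 = abs(b) ** 2
print(p0, p1, p0 + p1)  # ~0.5, ~0.5, summing to 1
```

The superposition carries a phase between the two options that no classical probability mixture represents; only on measurement does the state collapse into one classical outcome.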

Some comments on quantum multivalued formal resources (quantum information):

  • Renormalization is the process in physics that allows staying with basic phenomena as pure processes, across the different levels of scale at which phenomena are studied. Its purpose is to have a clean fundamental process, right at estimating the true value of physical processes.
  • We have to care that, in intermediate situations, the reality of our interest is on one side the true knowledge of simple processes, but it is made more of intermediate singularities; so we are more interested in local confusion and what it leaves us.
  • Reality is biased with confusion, heterogeneity and local distortions of physical phenomena, but it is probably from our intermediate existence and local possibilities that emergence, or survival with more complex properties, arises within our small world.
  • A core characteristic of the operations of quantum mechanics is their linearity: a particle is in a superposition of states whose mathematical representations add simply, that is, linearly [maybe consider these as characteristic definitions],
  • Where classical uncertainty observes exclusively p or ¬p (not p), quantum uncertainty cares for an operator of both, with 4 densities: p; ¬p; p¬p; ¬pp. The question remains whether ¬p is in fact another p, a q, or an antiparticle. The same particle embedded? Other particles coexisting, differing by their stability, their degrees of interaction, or by the environment's will. [A diversity of options together: when reality's decoherence results in classical mechanics, it nevertheless keeps the previous diversity of options and implies other issues, as if in different environments],
  • Are antiparticles at the environment's will (or pushed out?), since they are totally reactive? [An antiparticle poured into the environment may have an eating purpose, lower but still useful to activate],
  • Decoherence implies that the quantum situation turns into a quasi-classical weak density operator for interference in the classical situation. [Practically, a diversity of possibilities (with one or a few more adapted?) turns real; making interference, they show structuring patterns (at a more fundamental level than the chemical BZ reactions explained by Prigogine). Quantum calculus nevertheless proceeds by packages, Lie algebras and finite matrices, and can do it quite a bit faster than regular calculus],
  • In complex molecules, the interactions between the atoms of these molecules do not destroy the quantum states characterizing the movement, because the internal vibrations they express are independent of the pathway taken by the molecule. Were this not the case, a hidden degree of freedom would carry information on the mass-center scheme and favour decoherence.
  • In general terms, manipulations of quantum objects care for degrees of freedom (position and momentum of the mass center) described by a wave function, when the degree is not paired or coupled to others. [We can study it separately and still have more complex quantum objects (such as fullerenes, up to porphyrins - mixed inorganic-organic molecules important in metabolic catalysis and oxygen transport, considering heme's mineral-organic frame)],
  • It is difficult to observe complex quantum objects: interference is quickly noised out by the importance of the distance to the sieve (huge probability of closer interactions), while a photon or other small particle has a lower probability of interactions than a big molecule.
  • On Earth, big molecules are sensitive to gravity, to the Coriolis force (provided by Earth's rotation) and to mechanical waves (since they have slower speeds). The larger a molecule, the higher its probability to emit a photon. Spatial separation between pathways is possible. The larger the molecule, the less need for precision on the trajectory of the emitted photon.
  • Embedding-entanglement is a quantifiable physical resource. The higher the entanglement, the more the system is adapted to the treatment of quantum information. Superposition of states makes the information on one state inform on the others entangled with it. Entanglement or superposition tests could differ between the classical and the quantum state.
  • There is an analogy between energy and entanglement.
  • A quantitative equivalence between different entanglements would give a standard unit of entanglement; think of the comparison of 2 quantum systems.
  • The amount of entanglement would be the number of entanglement units to add to equilibrate the gauge.
  • Analyzing the flow of entanglement between systems makes a quantum treatment of information necessary, to assess the constraints on resources so that the treatment can be done.
  • Relative-states formulations, simultaneous with the quantum superposition of states and the apparent randomness of classical laws of probability, induced the postulate of hidden variables, which made J.A. Wheeler suggest delayed choice: a photon can be in 2 different places, but after measurement it can be chosen as unique.
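The 'quantifiable resource' and 'standard unit of entanglement' bullets above have a textbook quantitative form: the entropy of entanglement of a pure two-qubit state, which equals exactly 1 bit for a Bell state. A minimal sketch with real amplitudes (the construction is standard, not taken from this text):

```python
import math

# Bell state (|00> + |11>)/sqrt(2) over the basis (00, 01, 10, 11);
# amplitudes are real here, arranged as psi[a][b].
amp = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]
psi = [[amp[0], amp[1]], [amp[2], amp[3]]]

# Reduced density matrix of subsystem A: rho[a][c] = sum_b psi[a][b]*psi[c][b].
rho = [[sum(psi[a][b] * psi[c][b] for b in range(2)) for c in range(2)]
       for a in range(2)]

# Here rho is diagonal (0.5, 0.5), so the von Neumann entropy (in bits)
# can be read off the diagonal directly.
entropy = -sum(p * math.log2(p) for p in (rho[0][0], rho[1][1]) if p > 0)
print(entropy)  # 1.0 bit: maximal entanglement for two qubits
```

A product (unentangled) state gives entropy 0 by the same computation; the scale between 0 and 1 bit is precisely the 'gauge' of entanglement the bullets reach for.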

"If quantum computation with objective reduction occurs in the brain, enigmatic features of consciousness could be explained:

  1. By occurring as self-organizing processes in what is suggested to be a pan-experiential medium of fundamental space-time geometry, objective reductions can account for the nature of subjective experience by accessing and selecting proto-conscious qualia.
  2. By virtue of involvement of unitary, entangled quantum states during preconscious quantum computation and the unity of quantum information selected in each objective reduction, the issue of binding may be resolved.
  3. Regarding the transitions from preconscious processes to consciousness itself, the preconscious processes may equate to the quantum superposition computation phase, and consciousness to the actual, instantaneous objective reduction events. Consciousness may then be seen as a sequence of discrete events (e.g., at 40 Hz)".

follow...