# Logics in Complex Democratic Planning

Basic Olicognograph: Master Approaches

## Modernity of Formalism

The emergence of good formal sciences, able to turn phenomena into symbolic and linguistic representations, has been slow. In mathematics, intuition expanded first with geometry, made simply ideal (rigorous) by Euclid. Modelling experimental physics with mathematics began clearly, in the Western world, with Galileo. Analysis turned epistemology modern with Descartes, who also made geometry analytic, and was sped up by the differential calculus of Newton and Leibniz. Objects came to be conceived as abstract points summarizing bodies (an easier phenomenology, but free of astrology).

Logic, the other leg of formalism, had its pivotal century between the mid-nineteenth and mid-twentieth centuries. First came the establishment of simple first-order logic, as Frege intended. Logic was then tied to mathematics and structured into almost perfect logical designs, starting with the group theory of Galois and passing through Cantor's set theory (nearly completed as the Zermelo-Fraenkel system with choice). Then came doubts about the perfection of infinite perfect things: the incompleteness of arithmetic demonstrated by Gödel, undecidability by Church, and so on. On the perfect epistemology of science, or at least the speaking and writing of it, different 'metaphysical comprehensive' programs were attempted, such as Carnap's, or the effort to make mathematics comprehensive, such as the collective work of Bourbaki.

First-order logic, inspired by set theory, was applied to approach fundamental (physical) principles within this frame. First-order logic, in short, uses a single type of unit (single-valued objects, all treated the same way, as a set) and simple operations (not prone to emergence), infinitely iterated as perfectly simple operations or types of relations within one frame of exact and complete partition (cuts that stick together without mortar). All this has been very efficient for the study of simple, perfect physical principles that play the same at every scale, lately needing only renormalization: either perfectly free, independent atoms, first summarized as immaterial points (then weighted and simply operated as aggregates), or perfectly homogeneous structures that could likewise first be summarized as a simple grid; that is, either perfect gases or perfect crystals.

To put it very briefly: this way of conceiving things was hard to fix into an almost perfect axiomatic, formal, inductive system of thought, and it required excluding non-objective kinds of reasoning. Sophistication seemed possible to ground on this aspiration to perfection (then too often taken as achieved). But the paths to formal perfection proved impossible. Measurements of reality, however noisy, nevertheless delivered rational support to many simple physical theories. Anything with numbers came to look measured, but also loosely defined. So even if many theories look handily enunciated, pushing ahead with theoretical ideas, empirical models, and the formal advancement of the wonderful utopia in which everything matches almost perfectly is now proving hard and wrong. Nevertheless, reduction coexists with attempts to make formal methods more flexible, both before and after their rigorous stage.

The main defects of reductionism have not been in technological issues, which proceeded well, but in the imitation of science by social studies, with weak evidence, poor prediction, and nothing ever proved. In the social "sciences", reconstruction is still necessary to set apart so many esoteric mysteries and unscientific religious appeals to providence. So, 'geo-realpolitik' aside, if the social sciences do provide sweeter and necessary dialectics on social issues, they do not inspire good management. The formal sciences have isolated themselves into many pieces, in seemingly rigorous (but fake) axiomatic methods and deductive demonstrations of the truth of theories, rather than being really useful. Meanwhile, professionals at the margins of utility defend their expertise in ways that create false dependence, promote monopolies on pieces of truth, and impose absurd leadership on tired laypeople.

Common people have of course more or less "refused" all this, finding it too difficult to apply to their own interests. They react rigidly, rejecting even somewhat wise approaches with formal methods, and often play against them rather than opening up to their services; that is, they do not try to find minimum platforms of careful democratic consensus with these intertwined sciences. In primary school, for example, mathematics and logic are fun exercises for a few able pupils and boring for the troubled ones. Pitifully few living applications exist outside the sciences of the engineer. Engineers, for their part, are proud of their 'mastery'; they mistake a superior utopia for the reasons of their registers, abusing social interpretations and working increasingly against natural complications, even when formal scientists, exploring their registers, supply curious formal behaviors that are now evidently so diffused, so common, in our complicated realities.

At the high, 'bizarre' levels of disciplinary construction, things are now slowly changing, but many leading technostructures have turned Leviathan on their own, as in the way policies use the formal sciences: trying to make them rigid, especially with formal tools that have reached the limits of their 'perfection' yet remain weak in front of real complexities, and so are wrongly applied to complex objects or subjects, even against phenomenological evidence and common sense. They have created plenty of artificial worlds that ignore the essential meaning of non-exactness and the defects of reduced concepts in political and social framing, taking words or labels for truth, believing in rigidities, and thus unable to approach facts with wise, empathetic practices. They are unable to conceive, for example, that the linear rule of three (straight proportionality) is not natural and is now well away from good rules of thumb. So in explanations it is the laziest, most fallacious, ones that prevail. Things look even simpler with personal computers, since the troublesome part of the calculus now looks so easy that anyone can put in any sort of (stupid) number to demonstrate his or her selfish right.
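The point about the rule of three can be made concrete with a minimal sketch (the numbers and the saturating response are purely illustrative assumptions): linear proportionality extrapolates an observed ratio, while many real responses saturate.

```python
# Sketch with hypothetical numbers: why the linear "rule of three" can mislead.
# Rule of three: if 10 units of input give 50 of output, 30 units "should" give 150.
# Many real responses saturate instead.

def rule_of_three(x_known, y_known, x_new):
    """Straight proportionality: y_new / x_new == y_known / x_known."""
    return y_known * x_new / x_known

def saturating_response(x, y_max=100.0, half=10.0):
    """A hypothetical saturating (Michaelis-Menten-like) response."""
    return y_max * x / (half + x)

x0 = 10.0
y0 = saturating_response(x0)                # observed point: (10, 50)
linear_guess = rule_of_three(x0, y0, 30.0)  # rule of three predicts 150
actual = saturating_response(30.0)          # the saturating model gives 75

print(linear_guess, actual)  # 150.0 75.0: threefold input, not threefold output
```

Both curves agree at the observed point; the disagreement only appears when the linear rule is extrapolated beyond it.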

## Forms of Scientific Objects

Arts of demonstration have turned axiomatic. "For the most part axiomatics is concerned with what are called ‘primitive terms’, ‘primitive propositions’ (postulates or axioms) which relate these primitive terms, and ‘systems of postulates’ (axiomatic systems). Axiomatics, then, is the study of the logical properties of such a system. A system of postulates and primitive terms is analogous to a system of equations and variables. There is weakness in axioms, which has a lot to do with generality and universality, i.e. the list of things to be explained and the limits of applicability for the items on the list. It is considered desirable that the axioms be as ‘weak’ as possible. It is not always clear what is meant by this. It would seem that it has to do with how limited an assumption is in terms of the logical constraints placed on its primitive terms".

As mentioned, first-order arithmetic was demonstrated incomplete long ago. "Before a theory (or axiomatic system) can be completed it is usually necessary to show that it is incomplete. Most of the theoretical analyses in traditional textbooks can be interpreted as results (i.e. failures) of indirect attempts to show the traditional theory to be incomplete. Completeness is the requirement that an explanation does not allow for the possibility of competing or contrary situations. As such it rules out the possibility of a false explanation accidentally appearing to be true. That is, if a theory or model is complete and happens to be false, we shall be able to show it to be false directly. Consistency requires that the set of assumptions which make up a theory does not entail inconsistencies. As with systems of equations, a theory expressed as a system of axioms needs to be consistent, but with one important difference. Although a system of equations can be shown to be consistent, a system of axioms seldom can."

Over time the toolbox of more complex concepts has been filled with refutations, unmet conditions and limits, and plenty of data collected on things defined but not properly identified. With measuring devices, simple formalism registers many aspects of phenomena; but values and models that retrospectively fit the data well are not necessarily good for good purposes, nor is this surprising: they are prepared that way. They are also 'blurred', averaged, and purified of many details, more to fix abstract concepts. The framework of parameters, measurements, and simple operations is given in the same arbitrary way. Often, even when the defect of ignorance is taken into account, these are not the most manageable parameters, nor do they effectively support the policies that pretend to stick to the data. The obvious wrongdoings of geopolitical management are enough to show that we have increasing problems with reductions.

Popper’s Socratic theory of learning asserts: "one’s understanding is always conjectural but potentially true. The only way one can learn is to find a refutation – to find that one’s understanding (i.e., one’s theory) is false." In logic, the negation operator is hard to manage, even in the most rigorous system of first-order logic. "The logic of refutation is based on three propositions. A logically valid model is refuted:

- By arguing modus ponens: whenever all the assumptions are true then every prediction which logically follows from their conjunction must be true; or equivalently,
- By arguing modus tollens: whenever any prediction turns out to be false, then we know that the conjunction of all of the assumptions cannot be true. If we actually observe a false prediction, does that guarantee that at least one of the assumptions is false? We are able to argue in favor of such a guarantee only because we accept:
- The logical condition of the excluded middle: ‘A statement which is not true must be false.’ This is not a trivial word game about ‘true’ and ‘false'.

"Generally, ‘not confirmed’ does not mean ‘disconfirmed’. In other words, when ‘confirmed’ and ‘disconfirmed’ are used in place of ‘true’ and ‘false’, proposition (3) is discarded. But when the excluded middle is discarded we sacrifice the logical force of any test. That is, we cannot construct an ‘approximate modus ponens’ such as (1’) ‘Whenever all assumptions are "confirmed" then every prediction which logically follows from their conjunction will be "confirmed"’... With exact models we can refute a model by showing that one of its predictions is false (modus tollens). In effect, a false prediction is a counter-example; that is, it is a statement which would be denied by the truth of the exact model. This is a clue for our design of a logically adequate test of any theory. What this demonstrates is that testing models using confirmation criteria (i.e., a statement is considered true if its probability is at least 0.95) can lead to contradictory results, and that the usual published tests are often very misleading. But it should also be noted that it is possible to have decisive tests, subject to the acceptance of specific stochastic decision criteria. For example, relative to given confirmation criteria, a refutation is successful only if the predictions fail the confirmation test and the counter-example passes the same test. Not many reported ‘disconfirmations’ satisfy these requirements. The ‘if’ can be left unsatisfied by a failure to ‘follow logically’ or by the use of false supporting statements.
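Both logical points above can be sketched in a few lines (the number of assumptions and the independence of their confirmations are illustrative assumptions, not from the source): modus tollens is exactly valid, while "confirmed at 0.95" does not survive conjunction.

```python
# Part 1: modus tollens is exactly valid, checked by brute force over
# all truth assignments for an assumption A and a prediction P.
from itertools import product

def implies(a, b):
    """Material implication: A -> P."""
    return (not a) or b

for a, p in product([True, False], repeat=2):
    if implies(a, p) and not p:   # the theory entails P, but P is observed false
        assert not a              # then the assumption cannot be true

# Part 2: an "approximate modus ponens" fails. If each of n assumptions is
# merely confirmed at probability 0.95 (assumed independent), the confidence
# in their conjunction can fall below any confirmation threshold.
n = 14
conjunction_confidence = 0.95 ** n
print(round(conjunction_confidence, 3))  # 0.488: the conjoined theory is no longer 'confirmed'
```

This is why replacing 'true'/'false' with 'confirmed'/'disconfirmed' sacrifices the logical force of the test: truth is preserved under conjunction, a 0.95 confirmation level is not.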

Confusion creeps in when, for example, the distributions discriminating data according to some higher property of quality overlap; most statistical inference has precisely this purpose, to discriminate or to test for a 'meaningful' probable amount of significant difference. There will be a range of possible statistically estimated means and standard deviations. The criteria are designed to avoid either type I errors (rejecting the model as false when it is actually true) or type II errors (the reverse: accepting it when false), but not both. "Since most stochastic model-building in positive economics is concerned with deducing stochastic predictions, the usual choice made is to use ‘confirmation’ criteria rather than ‘disconfirmation’ criteria for the purposes of defining valid deductions. Such models cannot automatically be useful when we wish to test the model except for the purpose of finding confirmations. In order to test a theory by building stochastic models we must do much more."
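The type I / type II tradeoff can be shown with two overlapping normal distributions (the means and standard deviation below are hypothetical): any decision threshold that reduces one error rate increases the other.

```python
# Hedged sketch with assumed parameters: a null distribution N(0, 1) and an
# alternative N(1.5, 1). A single threshold decides between them; moving the
# threshold trades type I error against type II error.
import math

def normal_cdf(x, mu, sigma):
    """P(X <= x) for X ~ Normal(mu, sigma), via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

MU_NULL, MU_ALT, SIGMA = 0.0, 1.5, 1.0   # illustrative values

def error_rates(threshold):
    type_i = 1.0 - normal_cdf(threshold, MU_NULL, SIGMA)  # reject a true null
    type_ii = normal_cdf(threshold, MU_ALT, SIGMA)        # accept a false null
    return type_i, type_ii

for t in (0.5, 1.0, 1.5):
    a, b = error_rates(t)
    print(f"threshold={t:.1f}  type I={a:.3f}  type II={b:.3f}")
```

Raising the threshold from 0.5 to 1.5 shrinks type I error from about 0.31 to about 0.07, while type II error grows from about 0.16 to 0.50; no threshold minimizes both at once.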

## Objects Scientific Forms

Pre-modern scientific reasoning did not have many rigorous tools for establishing comprehensive logical systems applied to the basic sciences. It took time to define good heuristics for coping with abstractions about the physical world. Heuristics are pathways to solve problems. A heuristic proven efficient, with a reproducible experiment or experience and a formula positively predictive enough in the technological applications of the phenomena, almost gives you a scientific revolution; and if there is a code of heuristics, this revolution has a good probability of becoming an accepted social rule. This probability does not depend only on the scientific formula. For most complicated engineering applications of common life, the formula need not be perfect, only profitable enough.

Mathematical formalism and the physical sciences have already identified many analogical correspondences between concepts, as with dimensional equations pointing to equivalent systems of formulas. In the core example of dimensional equations, every piece of the formula is related to basic dimensions (space, time) and to how they are operatively treated: degree, inversion, exponentiation. The same is the purpose and unity of the sort of logical balls that olicognographs are meant to be in three-dimensional space: to join the potential of calculability with good nerve at understanding complexity. This also suggests a sort of dual complex management: one-to-all, with some 'economical' feedbacks, involvement of contradictions, and minimum overlaps. Input-output aims on one side to bring in scientific knowledge for proper incorporation, and on the other a quantum-like plurality of options and alternatives, including unexpected ones, to cover complexities flexibly.
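The dimensional bookkeeping mentioned above can be sketched directly (a minimal model, restricted to three base dimensions for illustration): each quantity carries a vector of exponents over (length, time, mass); multiplication adds exponents, division subtracts them, and an equation is dimensionally consistent only when both sides carry the same vector.

```python
# Minimal sketch of dimensional analysis over (length, time, mass).
from dataclasses import dataclass

@dataclass(frozen=True)
class Dim:
    length: int = 0
    time: int = 0
    mass: int = 0

    def __mul__(self, other):
        # multiplying quantities adds dimension exponents
        return Dim(self.length + other.length,
                   self.time + other.time,
                   self.mass + other.mass)

    def __truediv__(self, other):
        # dividing quantities subtracts dimension exponents
        return Dim(self.length - other.length,
                   self.time - other.time,
                   self.mass - other.mass)

L, T, M = Dim(length=1), Dim(time=1), Dim(mass=1)

velocity = L / T
acceleration = velocity / T
force = M * acceleration   # F = m * a

print(force)  # Dim(length=1, time=-2, mass=1), i.e. M L T^-2
```

Two formulas pointing to the same exponent vector are "equivalent systems" in exactly the sense of the text: they are candidates for analogical correspondence.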

Fundamental physics is already not a first-order logic: multivalued logics already appear at the level of quantum mechanics. Once decoherence has occurred, common physical phenomenology can be seen as determined; outside the identitary transformations really ruled by quantum mechanics (quantum optics, nuclear physics, quantum chemistry, quantum computing), determinism may be contained, but it is probabilistic, in distributions within margins of properties.

In relative terms we may have, on the side of microcomplex economics (which is 'superior' to the mechanisms of physical principles), a lower-scale, more fundamental first-order logic. It means, after all: no spontaneous generation, and no material, energetic, or informational impossibilities. On the side of macrocomplex economics, at the macroscale level of microeconomics, we have a frame of desirable criteria over a more scientific ground of real transformations. You have to imagine carefully how, with your model anchored to realities and scaled to them, the desirable criteria apply, without speculating too much. These criteria should be sparingly humane ones first, over thermodynamic frames, with special care for information concepts and subconcepts, because everything is managed by transformations and energetic costs on informed, complicatedly structured matter: call that 'thermoeconomics'.

Empirically, there are different ways to manage a perspective on negation, contradiction, and contrast. A first one is just positive: to the key words of an analysis you add their contrary value, a negative operator, or some equivalent symbol. It is often easier to manage what a given concept is not, rather than what it effectively is. Primary terms can also have two valences (complicatedly articulated, as in a complex program): a positive one (more or less identified) and the contradictory negative ones (negation, what it is not, the contrary, etc.). As we said, negativity is not so easy to manipulate, even in formal systems, though this is often intuitively clear: 'wrongness' concepts in arguments of management can be delimited, for a simple start, by at least three key terms:

- 'Paradox' (when two clauses of a statement are contradictory or lead to contradiction, one with respect to the other; our question is, why does it not seem so natural?),
- 'Bias' (when a recognized true argument is manipulated, affected in its perspective, displaced from its normal disposition, as with a statistical meaning of commonness),
- 'Fallacy' (an untruth argued as true; observe how such mistakes are often so 'normal').

Abbreviated intuitive definitions of biasing effects can be:

- 'Filter' (changes structure and divides),
- 'Artifact' (effects made by interactions, like creating a false image, an illusion, or a fake appearance as a new property),
- 'Distortion' of the image (a defect commonly related to the measuring device, and generally regular).

## Realist Forms of Objects representing Subjects

Applied theorists (or applied model builders) must be concerned with the testability of their assumptions; preferably, their assumptions should be directly testable. The pure theorists or model builders, on the other hand, who are interested in a model or theory as a system of ideas, need only be concerned with the indirect testability of the assumptions, namely in terms of the testability of their conjoint entailments. If we are not interested in immediate practical usefulness, we can avoid this requirement by avoiding testability of our individual assumptions and letting the burden of testability fall on the entailments of the theory as a whole. For example, the basic behavioural assumptions of traditional consumer theory may be axiomatically represented as follows:

- For every consumer there exists some non-economic criterion such that he or she is able to compare combinations of quantities of goods;
- For every consumer there exists an economic criterion such that he or she is able to compare combinations of quantities of goods;
- Every consumer, when confronted by two combinations which he or she can afford (as defined in assumption 2), will buy the one which is 'better' (as defined in assumption 1).

Assumptions 1 and 2 are not testable. They are both incomplete; the criteria are unspecified, and thus none can be ruled out.
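The untestability of assumption 1 can be sketched by brute force (the bundles and observed choices below are invented for illustration): because the comparison criterion is left unspecified, any single observed choice can be rationalized by some preference ordering, so no single observation can refute it.

```python
# Sketch: with an unspecified criterion, SOME strict preference ordering
# rationalizes any one observed choice among offered bundles.
from itertools import permutations

BUNDLES = ["A", "B", "C"]   # hypothetical bundles of goods

def rationalizable(choices):
    """choices: list of (offered_set, picked). True if some strict ordering
    ranks every picked bundle at least as high as all its alternatives."""
    for order in permutations(BUNDLES):             # candidate preference ordering
        rank = {b: i for i, b in enumerate(order)}  # lower index = more preferred
        if all(all(rank[picked] <= rank[x] for x in offered)
               for offered, picked in choices):
            return True
    return False

# Any single choice is rationalizable, whichever bundle is picked:
assert rationalizable([({"A", "B"}, "A")])
assert rationalizable([({"A", "B"}, "B")])
# Only a *cycle* across several observations fails to be rationalized:
print(rationalizable([({"A", "B"}, "A"), ({"B", "C"}, "B"), ({"A", "C"}, "C")]))
```

The printed result is False: the three choices jointly imply A over B, B over C, and C over A, which no ordering satisfies; single assumptions stay untestable while their conjoint entailments can fail.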

The ‘verificationist’ view presents theories in order to verify them, which requires that theories be verifiable. However, verificationists may still require falsifiability, since it would be trivial to verify tautologies. Falsifiability is thus necessary both for the Popper-Socrates view of science and for the old-fashioned ‘verificationist’ view which Popper criticizes.

For the purposes of the Popper-Socrates view of science, falsifiability is preferred to verifiability. Preference is established to exclude certain statements or questions from consideration on the basis that nothing can be learned from a statement or an answer to a question if that statement or answer cannot be wrong. If it is kept in mind that a theory or model consists of many assumptions which are logically conjoined to form an argument in favour of one or more propositions, we must now consider what the Popper-Socrates view of science means by a ‘false theory’. If statement P follows logically from theory T and if P is false then the theory T as a whole is false. But it cannot be inferred that any one assumption used in the theory is, or is not, specifically upset by the falsification. Only if statement P is independent of some part T' (or group of assumptions) used by theory T can it be said that the part T' is not involved in the falsification.

The demarcation based on falsifiability and non-falsifiability divides possible theories into two groups: (1) those theories from which non-tautological, non-universal prohibitive statements can be deduced, and (2) those theories from which either strictly existential statements or tautological statements or both can be deduced. The statements which are included in the ‘space’ must satisfy the following two conditions if they are to be relevant for the demarcation criterion of falsifiability: (1) no basic statement can be deduced from a universal statement without initial conditions or parametric values, but (2) a universal statement and a basic statement can contradict each other.

The four statements at issue follow Popper's schema, here adapted to economics: P asserts the more precise property (circular time-paths) of the wider class (all goods); Q restricts it to agricultural goods; R weakens circles to conics for all goods; and S asserts conic time-paths of agricultural goods only. According to Popper, the deducibility relations holding between these four statements are as follows: From P all others follow; from Q follows S, which also follows from R; thus S follows from all the others. Moving from P to Q the ‘degree of universality’ decreases, and Q says less than P because the time-paths of agricultural goods form a proper subclass of the time-paths of all goods. Consequently, P is more easily falsified than Q: if Q is falsified, then P is falsified, but not vice versa. Moving from P to R the ‘degree of precision’ (of the predicate) decreases: circles are a proper subclass of conics. And if R is falsified, then P is falsified, but, again, not vice versa. Corresponding considerations apply to the other moves: if we move from P to S then both the level of universality and the degree of precision decrease; from R to S the level of universality decreases; and from Q to S the degree of precision decreases. A higher degree of precision or level of universality corresponds to a greater (logical or) empirical content, and thus a higher degree of falsifiability. So it can thus be ‘assumed that to a theory of higher [Popper] dimension, there corresponds a class of basic statements of higher dimension, such that all statements of this class are permitted by the theory, irrespective of what they assert’.
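These deducibility relations can be checked mechanically on a toy universe (the kinds and shapes below are illustrative stand-ins, not from the source): enumerate all small "worlds" and test which statements entail which.

```python
# Toy universe for Popper's four statements.
# P: every good's time-path is a circle; Q: every agricultural good's is a circle;
# R: every good's time-path is a conic; S: every agricultural good's is a conic.
# Circles are a proper subclass of conics; agricultural goods of all goods.
SHAPES = ["circle", "other_conic", "non_conic"]
KINDS = ["agricultural", "industrial"]   # "industrial" is an illustrative stand-in

def is_circle(s): return s == "circle"
def is_conic(s):  return s in ("circle", "other_conic")

# a world assigns one time-path shape to one good of each kind
worlds = [(("agricultural", a), ("industrial", i)) for a in SHAPES for i in SHAPES]

def P(w): return all(is_circle(s) for _, s in w)
def Q(w): return all(is_circle(s) for k, s in w if k == "agricultural")
def R(w): return all(is_conic(s) for _, s in w)
def S(w): return all(is_conic(s) for k, s in w if k == "agricultural")

def entails(x, y):
    """x entails y iff y holds in every world where x holds."""
    return all(y(w) for w in worlds if x(w))

assert entails(P, Q) and entails(P, R) and entails(P, S)  # from P all others follow
assert entails(Q, S) and entails(R, S)                    # S follows from Q and from R
assert not entails(Q, P) and not entails(R, P)            # but not vice versa
print("deducibility lattice verified")
```

A falsification of S (an agricultural good off any conic path) falsifies all four statements, while falsifying Q or R leaves the weaker S standing; this is the asymmetry of falsifiability the passage describes.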

At the mesoscale level, the one where we set the layer of essential unit(s) of maximum complexity (complex, non-infinite algebraic aggregation), these units may be somehow microeconomical (individuals, but more probably social groups), up to complex environmental units such as an ecosystem or a landscape of development settlement, eventually a regional economy, and often also interfaced (the fringe of two or three bordering ecosystems). This is the level at which to adapt strategies, as in the theory of games, and so the level of the ground or basic thermoeconomics at stake. To all these hard-to-examine kinds of probabilities and stochastic models enclosed in probabilistic frameworks, we may apply simplifications using the sort of properties called from the macrolevels, or from the 'superior humane criteria', to select sustainable options. Of course, proceeding in a participative way seeks a democratic consensus wherever simplifications are possible (the sort of concept we think of as simplexity). Complexity is also taken up in processes of simulation and sequences of reviews, following the dynamic trajectories of the complex systems. But there is still plenty of hard work in articulating levels of scale, types of methods, modularity and multivalued logics, as well as some effort at making the more complex types of calculus that could be designed open to practice, what we call 'social algebra'. We also have the emergence of special properties, some margins of speculation or anticipation, and transfers where allowed by open systems, in economies: trade, stocks, consumption, and so on.

## Speculative Forms of Objects representing Subjective Subjects and Explanations

Independence and economy of thought: here again there is a similarity between systems of equations and systems of axioms. We can have a consistent system of (linear) equations where the number of unknowns is one less than the number of equations, so long as at least one of the equations is a linear combination of one or more of the others. Or there may be subsets of equations which do not involve all the variables of the system, indicating an independence between some of the variables. In the social sciences, plenty of models stand far from reasonable human management, when not very far from their original field. Huge efforts are made to put into formulas and systems purified behaviors that most real actors do not practice, actors who have cognitive strategies of their own, also complicated and probably better. Even if full of rational tautologies, most of these models do not make good social sense. They turn, at best, into speculative statements of problems defined by one researcher, more or less inscribed in some school of concepts, and mostly proceeding under implicit or explicit dictatorial assumptions: what is said is meant more to inspire a dictator (political leadership) than given a form fit for democratic adaptation.
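The equation-system analogy can be verified with a tiny example (the particular equations are invented for illustration): three equations in two unknowns remain consistent when one equation is a linear combination of the others, which shows up as equal ranks of the coefficient and augmented matrices.

```python
# Sketch: consistency of an overdetermined linear system, checked by
# comparing ranks via Gaussian elimination over exact fractions.
from fractions import Fraction

def rank(rows):
    """Row rank by Gaussian elimination (exact arithmetic, no rounding)."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(m[0])):
        pivot = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# x + y = 3 ;  x - y = 1 ;  2x = 4   (the third equation = first + second)
coefficients = [[1, 1], [1, -1], [2, 0]]
augmented = [[1, 1, 3], [1, -1, 1], [2, 0, 4]]

print(rank(coefficients), rank(augmented))  # 2 2: consistent despite 3 equations, 2 unknowns
```

Equal ranks mean consistency (here both are 2, so the system has the unique solution x = 2, y = 1); a dependent equation adds no constraint, just as a redundant axiom adds no content.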

Many technological products have produced enough objective social results to be 'accepted', but this does not mean they make proper truth with social facts, since they are often indivisible 'scientific packages', made for the sake of a few. Most global social models have convincing dialectic and logic. The need for social models may exist, but they should be flexible, adaptable to the different operative groups working on the same issue, open to democratic validation, and socially immersed. Now, the importance of all this is not to argue that empirical theories cannot be true, or that theories are empty tautologies. The point is that empirical theories are concerned with contingent truths, that is, statements whose claimed truth depends on the truth of other statements whose truth is in turn generally not proved. Tautological answers have to do with human proximity but, over all the social networks that can exist, none can clearly articulate the best way to apply the rules. It is hard to define a community, its pieces, its projections, and the exact way to make them consistent; so it is difficult to draw the maps of interpretations. With modern means of communication, human proximity is fragmented, distended over time, space, and actors. In all that, there must be reductions and simplifications, diversifications and complications. The rules of the game always change; few stay or transcend. Terms can change a lot with respect to definitions, and specific definitions of fairness, trust, and confidence follow. That is, basic ethics are of this kind: tautological, but still hard to apply because 'uneconomic'.

We call tautologies those evident statements (truisms) against which no one will argue. The difficulty is how to put them into practice, and where and when to take care of human misinterpretations, neither too soon nor too late. Very often, when macro interpretations take over, in effect they only promote paradoxical lies. So, if no one will argue against the goodwill of macro interpretations, you will easily meet their contrary at the micro level. This is the main reason why, at the micro level of communities, democratic and humanistic enforcement has to be better cared for than macro-theo-rhetoric, that is, global speech with poor effectivity. Better care does not mean more interventions: ambiguous balances can allow communities more responsible options. The human mind likes to reduce linearly and use simple algorithms, but it also switches almost instantaneously (enactingly?) to other things. Human minds use parallel processing too, almost as if alone (there is some focusing) but necessarily collectively. This property is fundamental; we would not have survived without it, since in nature the diversity of sources of risks or stimuli is very large. In humans, it made our ability to create new ideas. Afterwards, it remains to remember and to demonstrate, eventually linearly, each subpart; but we think we have other perspectives. With an olicognograph, once satisfied with the conceptual environment you have designed collectively, you can focus on the traditional tools of scientific reasoning, if you want, or on other mechanisms as suggested.