
OLICOGNOGRAPHY on DEMOCRATIC ECONOMY


Basic Olicognograph: Wholism and Reduction

Simple Modelling Seen as Complex

Traditional Modelling [complex mind comments]

In the construction of an economic model, the model builder must make methodological decisions about the following:

  1. Which variables are to be included in the model; the assumption easiest to calculate with is to take them as independent, not 'reacting' with one another, and to operate on the 'pieces' with simple operations.
  2. Which of these variables are to be specified as exogenously determined, that is, set by outside, unmanageable factors. In any specialized register this often separates, ideologically, the variables proper to the doctrine and easy to conceive for management from those not yet part of the theory or the paradigm of management, yet not necessarily outside any policy; so the classification is quite arbitrary.
  3. What the forms of the relationships between exogenous and endogenous variables should be; this too is often settled in a quite arbitrary way, whether by dictatorship or by type of register, and is often conceived improperly, resulting in misspecification of the model.

"The degree of testability of any econometric model is not independent of the P- dimension of its underlying exact model. Such an interdependence can be implicit in the econometrician’s consideration of the so-called ‘Identification Problem’. Econometricians have long recognized that the problem of identification is logically prior to the problem of estimation of the parameters of a model. Identification refers to the property of a specific model which assures that, if the model is posited as being the hypothetical ‘generator’ of the observed data, a unique structure can be deduced (or identified) from the observed data."

"There are two ways in which a model may fail to possess the identification property. Either the model is such that no structure can be deduced, or the model is such that more than one structure can be deduced from the same data. Attempts to avoid these difficulties are what is coined the ‘problem of identification’. Identifiability depends on the form of the model. As most texts on econometrics indicate, what determines the identifiability of a model is the relationship between the number of endogenous variables and the number of exogenous variables. Avoiding the identification problem requires that this relationship be of a nature which assures that, with a finite number of observations, we can (if the model is true) deduce the values of the parameters (i.e. the structure). This requirement holds even for non-stochastic models."
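As a worked sketch of why more than one structure may be deducible, the classic supply-and-demand illustration can be added here (it is a standard textbook case, not drawn from the quoted text):

```latex
% Structural model: two equations sharing the endogenous pair (q_t, p_t).
\begin{align*}
  \text{demand:}\quad q_t &= \alpha_0 - \alpha_1 p_t + u_t \\
  \text{supply:}\quad q_t &= \beta_0 + \beta_1 p_t + v_t
\end{align*}
% The data trace only the intersection points, and any linear mixture of
% the two equations fits them equally well: more than one structure can be
% deduced, so the model is unidentified. Adding an exogenous shifter
% (income Y_t) to demand,
\begin{align*}
  q_t &= \alpha_0 - \alpha_1 p_t + \alpha_2 Y_t + u_t,
\end{align*}
% moves the demand curve while supply stays put, so (\beta_0, \beta_1)
% become deducible from a finite number of observations.
```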

"The model builder must make decisions regarding 3 things:

  1. The question to be answered by the model,
  2. The list of variables and the specification of which are endogenous and which are exogenous, and
  3. How the truth status of the answers will be determined.

Would a ‘good approximation’ be a simplifying approximation which gives more positive results, or a generalizing approximation which allows for a better description of what firms (in fact) do? From the standpoint of conventionalists, it is not sufficient merely to assert that simplicity is more important than generality or vice versa – so the conventionalist controversy will continue."

"Second Best Theory says that 1) Between the resulting ‘constrained’ outcomes reached by approximating the completion of the optimizing conditions and the ‘first best’ outcome, there exists a ‘second best’ outcome, but (2) in order to reach the ‘second best’ outcome we must give up the method followed to reach the ‘first best’. For example, if for any reason, some of the firms in the economy. Approximated optimum is a desirable ‘third best’ because at least we know how to reach it while we only know of the possibility of the ‘second best’. According L. A. Boland meta-theoretical formulation of the well-known Second Best Theory is: There does exist something better than the approximation of the ‘ideal’ theory (i.e. of the exact representation of empirical reality) but to find it we are required to give up the old method."

Democracy is another perspective on the leading issues of convention. Conventionalist methodological criteria are:

  • More simple [the limit is given by whether the relevant question is cared for critically enough. A complex view will take the framework into account, covering enough complexities with simple, clear, shared, accepted policies, and the balance implied by a conventioned program, tolerantly completed with the positive independent actions any citizen should be allowed],
  • More general [extending generality also means expanding the set of criteria to meet the required details, including the coverage of complexities, trust and liability between actors, the sharing of common values to incorporate, and so on],
  • More verifiable [verification is driven by the prejudgment that reality is simple. There a convention is very arbitrary and easily arguable, requiring a system of truth, which is the weak part of conventionalist methodology; its contradictions are very easy to observe in reality. One should not fail to make clear the rules of truth and the rules of when and how to adapt, as rules of change],
  • More falsifiable [according to an absolute reality, if the prejudgment is that a convention is mainly grounded on irrational, unscientific citizens wisely following defined situations where imitation has virtue, or according to an alternative truth],
  • More confirmed [if the convention has the qualities it aspires to: widely accepted, shared and respected],
  • Less disconfirmed.

"To make clear the ingredients of a model, it is, Initially, to recognize 2 separated perspectives: the theory itself and the added assumptions used to specify the functions that represent the theory. This gives the following schemata for the concept of a model:

  • A set of behavioural assumptions about people and/or institutions. This set might include, for example, the behavioral proposition C = f(Y), where ∂C/∂Y is positive. The conjunction of all the behavioral assumptions is what traditionally constitutes a ‘theory’.
  • A set of simplifying assumptions about the relationships contained in the above set. For example, the demand function stated in the theory might be specified as a linear function, C = a + bY, where a is positive and b is between 0 and 1.
  • The logical nature of any model is determined by the extent to which the model represents an argument, that is, an explanation of its endogenous variables. There are only two forms of valid logical arguments. Some are formally in favor of the truth of a specific statement. Such an argument consists of one or more given statements which are alleged to be true and from which one can logically derive the specific statement.
  • Standard logic provides only the means of ‘passing’ along the truth of all the given statements to any statement which logically follows from them. However, the truth of any given statement must be established independently of the argument."
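A minimal sketch of the schema above, fitting the linear specification C = a + bY on synthetic data (the 'true' parameter values and the noise level are assumptions for illustration; Python with numpy):

```python
# Theory: C = f(Y) with dC/dY > 0. Simplifying assumption: f is linear,
# C = a + b*Y with a > 0 and 0 < b < 1. Data below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
Y = rng.uniform(100, 1000, size=200)                   # income (exogenous)
a_true, b_true = 50.0, 0.8                             # assumed structure
C = a_true + b_true * Y + rng.normal(0, 20, size=200)  # consumption + noise

# Estimate (a, b) by ordinary least squares on the specified linear form.
X = np.column_stack([np.ones_like(Y), Y])
a_hat, b_hat = np.linalg.lstsq(X, C, rcond=None)[0]
print(f"estimated a = {a_hat:.1f}, b = {b_hat:.3f}")   # b should lie in (0, 1)
```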

For a model quite common and specific to economics: "Linear multiplier models are based on the following assumptions: idle production capacity, fixed prices, linear production functions with fixed proportions (meaning that the elasticities of substitution are nil and production factors are perfect complements), equal average and marginal propensities to consume, and unitary income elasticities. The first step in constructing multiplier models is to divide the social accounting matrix (SAM) accounts into 2 groups:

  1. Endogenous. These accounts usually include the factors of production, households and firms, and production activities.
  2. Exogenous. These accounts contain the government, capital, and the rest of the world. They are used to enter the data for the change that we wish to model and analyze."
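A hedged sketch of the multiplier arithmetic this partition enables: with fixed prices and linear technology, endogenous incomes y respond to an exogenous injection x through y = Ay + x, so y = (I - A)^(-1) x. The 3x3 matrix of propensities below is invented for illustration, not taken from any actual SAM:

```python
import numpy as np

# Toy endogenous partition: activities, factors, households.
A = np.array([[0.30, 0.00, 0.55],    # activities' spending propensities
              [0.45, 0.00, 0.00],    # factor income per unit of activity
              [0.00, 0.90, 0.05]])   # household receipts and re-spending

M = np.linalg.inv(np.eye(3) - A)     # accounting multiplier matrix

x = np.array([100.0, 0.0, 0.0])      # exogenous injection into activities
print(M @ x)                         # resulting endogenous incomes
```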
Behavioral economics documents further departures from these standard linear assumptions:

  • Non-linear probability weighting: the expected-utility (EU) model, which is the dominant model of risk-taking in economics, assumes that the value of a risky prospect is determined by the utility of its consequences weighted by their probabilities of occurring. EU is appealing, in part, because it can be derived from an apparently reasonable set of axioms, most notably the independence (or "cancellation") axiom, which entails that choice between gambles should not depend on events which lead to the same consequence with the same probability. However, many empirical studies of decision making under risk document violations of the patterns of behavior predicted by EU.
  • Hyperbolic time discounting: the discounted-utility (DU) model, which is the dominant economic model of intertemporal choice, assumes that people choose between intertemporal prospects by evaluating the utilities of their outcomes and "discounting" them according to their time of occurrence. O'Donoghue & Rabin have demonstrated the importance for behavior of whether hyperbolic time discounters, while being impatient in the present, are naïve or sophisticated about the fact that they will also be impatient in the future, when the future becomes the present.
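A small sketch contrasting the two discounting schemes; the beta and delta values and the payoff pair are illustrative assumptions, not from the text:

```python
def exp_weight(t, delta=0.95):
    return delta ** t                            # standard DU discounting

def bd_weight(t, beta=0.6, delta=0.95):
    return 1.0 if t == 0 else beta * delta ** t  # quasi-hyperbolic

# 10 units at time t vs 12 units at time t+1, seen far ahead and at t=0.
for weight in (exp_weight, bd_weight):
    choices = ["larger-later" if weight(t + 1) * 12 > weight(t) * 10
               else "smaller-sooner" for t in (10, 0)]
    print(weight.__name__, choices)
# exp_weight ranks the options the same at every horizon; bd_weight picks
# larger-later far in advance but reverses to smaller-sooner when the
# moment arrives -- the time inconsistency O'Donoghue & Rabin analyze.
```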

A series of questions relevant to the design of models in the social art of economics may be the following:

  • An ambiguity lies between a model as a representation of management by legitimate actors and a perspective, situation or policy of which people are made 'subjects' (when not simply objects), according to deductive models that are commonly averaged. Democratically, models may be designed after Ockham's criterion (meaning that among different options between people the least-weakest one prevails; under a judgment of value, the worst one), whereas models designed by experts are often jammed with doctrine and an overview hardly shared by any real actor, short of a mythical dictator supposed to be omniscient and omnipotent. A model of the first kind can be better followed if actors either profit from it or are made strongly responsible (their care of complexities being a plus); against it stand the dictator's policy and the paradox that the most averaged democratic model of policy may be the most inert to determinist transformations.
  • Most models use data and averages that carry the myth and weakness of the average: in a complex framework of multiple variables, almost nobody is exactly at the mean in hand. Having a model that covers enough representation is not the same as a fit on the side of actions (key witnesses are not necessarily the best engaged, judges not the best in empathy, directors of policies not the most common humans exposed to the effects of policies). With complexity, and even more with 'scientific truth', good policy is not just a matter of averages. The endogenous/exogenous discrimination plays a part here but could, and should, be enlarged to the concept of, and ambition over, the blurred perspective of the human community. The balance lies between real grounds, subjective ambition, complexity, heterogeneity, and the diversity of interests, profit and cares of the social group, community or population subject to the policy. Even if targeting a population is the best way to observe some correlated changes in targeted indicators or parameters, other more important issues may shake the collectivity simultaneously with the policy; the policy will then fail, meanwhile the basic essence of finance (flexibility and value) will be distorted.
  • Formal treatments of quantified models prefer independent variables and simple operations. In basic physics, variables are, in the absolute, composed of similar fundamental particles, similar parts. Substitution from one molecule to another is then a question of energy, flexibility of composition and time, provided we are able to produce the fundamental components (or to study them). Energy as a factor has as its main purpose to be exchangeable or substitutable from one type to another, more or less easily, given available stocks, at some cost and down to its lowest forms: heat, which may have some use if recycled or transferred. Everything in physics is transformation.
  • At the meso level of essential actors there are plenty of abstract purifications that are not necessarily properly expressed in formalized values, nor exactly according to the utility in the local processes of use. Treating all this like modellers of microeconomics, at ease with models and formulas of simple, almost arbitrary, hence mostly 'non-social' operations, promotes reductive, stone-like individualisms and the methods sustaining their expertise, preferring to ignore the curious links 'non-rational people' try to establish between themselves, between different things, and the different separations or discriminations they want to make between them.
  • There is a confusion between what averages provide about the most common and most regular parameters and values, so promoted by econometric and probabilistic methods, and what gets poured into the 'residuals' or 'remains' separated out by multilinear processes. Qualified parts of the residual can be confused with noise, since they form a mixture containing white noise and irregular effects that are quantitatively hard to detect: transient or non-periodic effects, variables effective only under certain configurations of other variables (thus coincidental covariables), threshold or non-linear effects, and so on (see the sketch after this list). This 'noise' could be the source of important means of policies, since the most common trends are the most shared and best known, modelled so as to have inertia: they stay trends, 'unaware' of changes, or bad 'underground' trends adverse to change because forged by specific interests.
  • Hence a lazy management of the model, preventing transformations and changes, uncaring of or avoiding many effects evidenced in the behavior of variables because they are too complicated for the simple algebra chosen for the model, though probably typical of social constructions: covariability, heteroskedasticity and so on. All these are questions phenomenologically very significant to intermediating economic actors, with overflows of matter and energy, which they would 'deterministically' like to bias toward where they can make their extractions.
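The sketch referred to in the list above: a threshold effect deposited into the 'residuals' of a linear fit. The data, the threshold location and the noise level are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 500)
# True process: linear trend plus a jump (threshold effect) above x = 7.
y = 2.0 * x + np.where(x > 7, 5.0, 0.0) + rng.normal(0, 1, 500)

# Fit the simple linear model the modeller chose.
X = np.column_stack([np.ones_like(x), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta

# The residuals are not white noise: their mean shifts across the threshold.
print("mean residual, x <= 7:", resid[x <= 7].mean())
print("mean residual, x  > 7:", resid[x > 7].mean())
```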

New Resources of Modelling

The construction of models follows concepts of quantification. In appearance, a model is technically mainly quantitative. But nowadays logical formulas also make models, and perhaps there is a need to socialize these more systematically. Identification will essentially still be there, but we do not remember having seen anything like a comprehensive corpus of methods of identification. The core question, with a complex subject, would be: what formal set of logical formulas could make a logical model of it? Natural complexity makes this very hard to unify, and it is impossible to close, in the absolute, the infinity of algebraic methods. Each artistic science prefers to have its own (dogmatic) objects. Technically, models can be logical, fuzzy-logical, of second-order logics (of many different kinds), semi-quantitative, semi-parametric, quantitative, probabilistic, stochastic, quantic, etc. All try to mix different methods and are often complementary in their aspects. And probably more so if we stay with the same sort of management of social decisions, with the same rigidities in the paradigms of social management, say: only firms are efficient; states are inefficient; democratic practices should stay at the level of primary-school methodology (children taken for simple, merely inexperienced, and often made unwise by slower-evolving adults), and formal methods pedagogy likewise, with much less mental flexibility (than children); politicians should be seen as corrupt, petty liars; formal methods should not support collective decision; commons are too hard to care for; profit is only naked money; etc.
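A minimal sketch of one of the non-quantitative model types named above: fuzzy-logic membership, where a statement holds to a degree instead of the Boolean true/false. All breakpoints below are illustrative assumptions:

```python
def membership_tall(height_cm: float) -> float:
    """Degree (0..1) to which a person counts as 'tall'."""
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 160) / 30.0      # linear ramp between the anchors

for h in (155, 170, 180, 195):
    print(h, "->", round(membership_tall(h), 2))
# Fuzzy AND/OR are typically min/max of the memberships, which lets such
# graded judgments be combined without forcing an early binary cut.
```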

Economy of Complex Networks

"A network, or a graph is a set of nodes connected by links. It is already clear that most natural and artificial networks from the internet to biological and social nets, by no means ressemble lattices". During the fifties appeared types of random graphs. Other essential concepts appeared as tree-like local graphs, loops and cliques (or fully connected subgraphs). Patterns, local and global aspects, characterization of networks are now activelly investigated. "Network structures play a central role in determining outcomes in many important situations. A variety of large social networks have been shown to exhibit certain characteristics. 'Small worlds': the first characteristic is that such networks have small diameter and small average path lengths.The second characteristic is that such networks have high clustering coefficients relative to networks generated by independent random processes. They are 'mechanical' models, where a particular process or link formation (or reformation) is specified, but there is not much explanation about why networks might form in accordance with such processes.

From the mechanical side it is known that in some situations a few random connections between distant nodes can dramatically decrease network diameter, while from the economic side we learn that in some situations distant nodes greatly benefit (in terms of net utility) from forming links precisely because of the distance, which provides an answer as to why such shortcuts might be formed. The fundamental intuitions that emerge from the economic side are: 1) high clustering results from low costs of attachment to similar (nearby) nodes, and 2) low diameter results from the large benefit of attaching to dissimilar (distant) nodes because of the substantial indirect access they provide to other distant nodes. A conjecture is that a variation of the model that marries a random process to how nodes meet with the economic and strategic concerns analyzed here would begin to account for the degree distribution and could result in features consistent with observables."
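A sketch of the "few random shortcuts shrink the diameter" mechanism, using the standard Watts-Strogatz construction; the networkx library and all parameter values here are assumptions, not taken from the quoted study:

```python
import networkx as nx

n, k = 1000, 6
# Pure ring lattice (no rewiring) vs the same lattice with 5% shortcuts.
lattice = nx.connected_watts_strogatz_graph(n, k, p=0.0)
rewired = nx.connected_watts_strogatz_graph(n, k, p=0.05)

for name, G in [("lattice", lattice), ("rewired", rewired)]:
    print(name,
          "avg path:", round(nx.average_shortest_path_length(G), 1),
          "clustering:", round(nx.average_clustering(G), 2))
# Typically the rewired graph keeps high clustering while its average path
# length collapses -- the small-world signature described above.
```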

"Natural and artificial nets display widespread feature: the presence of highly heterogeneous distributions of links, providing an extraordinary source of robustness against perturbations. Although most theories concerning the origin of these topologies use growing graphs, simple optimization process can also account for the observed regularities displayed by most complex nets. Using an evolutionary algorithm involving minimization of links' density and average distance. 4 major types of networks are encountered: 1) Sparse exponential-like networks, 2) Sparse scale-free networks, 3) Star networks and 4) Highly dense networks, apparently defining 3 major phases. These constraints provide a new explanation for scaling of exponent.

Evolutionary Dynamics

Since the Lotka-Volterra mathematical differential models of ecological populations in the first part of the past century, many more complicated developments have appeared and met the major tools of simulation offered by personal computers. Demography, which also used its indicators of mortality or fecundity in a somewhat rigid way, has relaxed. Actuarial methods used by insurance projectors of risks and 'fate' also brought their methods, which still have not well resolved the complexities of human dynamics. You may actually observe econometric modelling based on simulations of evolution (determinist economics mentally disturbed by the 'fight for survival' or the 'survival of the most blessed by brutal power'), missing that the best scientists smartly used their weakness (not necessarily physical) to discover better alternatives, if not always more humane ones. Modelling can now mix, in a purified way, the theory of games and strategies, evolutionary dynamics, and determinist or stochastic models. And some economics has switched to evolutionary economics.
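A minimal sketch of the Lotka-Volterra predator-prey system mentioned above, integrated with a plain Euler loop; the coefficients and initial populations are illustrative assumptions:

```python
# dx/dt = a*x - b*x*y   (prey)
# dy/dt = d*x*y - c*y   (predator)
a, b, c, d = 1.0, 0.1, 1.5, 0.075
x, y = 10.0, 5.0          # initial prey / predator populations
dt = 0.001

for step in range(100_000):
    dx = (a * x - b * x * y) * dt
    dy = (d * x * y - c * y) * dt
    x, y = x + dx, y + dy
    if step % 20_000 == 0:
        print(f"t={step * dt:5.1f}  prey={x:7.2f}  predator={y:6.2f}")
# The two populations cycle out of phase instead of settling: the
# oscillatory behavior that made these models a starting point for
# evolutionary dynamics.
```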

"A Darwinian process has six essential characteristics:

  1. A pattern exists (e.g., genes).
  2. The pattern is copied or cloned.
  3. Variant patterns arise because of copying 'errors' and recombinations.
  4. Populations of some of these variants compete against other populations for area (e.g., bluegrass and crabgrass).
  5. A multifaceted environment makes some of these variants more common than others (i.e., natural selection).
  6. The more successful variants serve as the most frequent centers for further variation, and future generations spread out to nearby regions to repeat this process (that is, Darwin’s Inheritance Principle).

All 6 characteristics are required to effect the recursive bootstrapping of quality. Having the first 5 features, without the 6th, results in nothing more than population drift or random jumps from one barrier of the solution space to another."
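A minimal genetic-algorithm sketch mapping onto the six characteristics; the bit-string encoding, fitness function, population size and mutation rate are arbitrary illustrative choices:

```python
import random

random.seed(0)
TARGET = 20                                   # maximize the number of 1-bits

def fitness(pattern):                         # (5) the selecting "environment"
    return sum(pattern)

population = [[random.randint(0, 1) for _ in range(TARGET)]
              for _ in range(30)]             # (1) patterns exist

for generation in range(50):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]               # (4) variants compete
    offspring = []
    for parent in survivors * 2:              # (6) winners seed the next round
        child = parent.copy()                 # (2) the pattern is copied
        i = random.randrange(TARGET)
        child[i] ^= random.random() < 0.2     # (3) occasional copying 'error'
        offspring.append(child)
    population = survivors + offspring

print("best fitness:", fitness(max(population, key=fitness)))
```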

"Known catalysts help speed the evolutionary process along:

  • Systematic recombination: Variation is introduced systematically (for example, bacterial conjugation or sex).
  • Fluctuating climates: Climate changes result in more-severe selection and more-frequent culling and, thus, in more-frequent opportunities for variation during expansion.
  • Island biogeography, resource scarcity, geographical barriers and climate factors: these can separate a population into isolated subdivisions that discourage migration and promote inbreeding. Repeated separation and unification, or “pumping,” increase diversity and create variants capable of living further out on the habitat’s margins (e.g., the San Juan Islands, which are now surrounded by ocean but were hilltops connected by a broad valley during an ice age).
  • Empty niches to fill: These can be due to the extinction of entire subpopulations; pioneers from other subpopulations will rediscover the vacated region and its replenished resources, and a population boom will result; an absence of competition for several generations results, giving rare variants that would otherwise perish a chance to survive. Once established, they may be able to survive future threats to their existence."

Modelling Together with Olicognography

Olicognography is a program and not an established field of modelling. Formally, establishing it can be imagined as a modelling using other resources, as suggested in this package of democratic economy; but the implicit management of complexity is made so as to delay as long as possible the founding ambition of a science, which would fix its reductions (or an esoterism) too soon in games of virtual design, now too easy to make with computed simulations. The fate of systemics, which started well with von Bertalanffy's principles, evolved curiously with cybernetics (into an intent of determinism), not uselessly, but not as undeterministically as imagined by the systemists of the first hours. At least they did not prevent the 'emergence' of the sciences of complexity. Neither do the sciences of complexity seem (as mentioned by Ruelle, one of their best scientists) to have evolved as graciously as expected by their 'literary agents'.

Anyway, modelling with olicognography would seek different purposes:

  1. Explicative models, considering that the best explanations are often in the community; the olicognograph has to help reveal and shape them, that is: not simplify too soon with data that plenty of people may ignore (not always unwisely, as is imagined) and that few people trust and use, not all and always in a wise way.
  2. Models of communication and of the community's organization are basic to olicognography but also incomplete; their definitions should be completed by real commitment to objective democratic actions.
  3. Well-qualified models are where olicognography can suggest useful mixtures of quantification and qualification.
  4. To well-quantified models, olicognography would prefer to distribute socially sound technical formulas where they have to be: consensual, socially simple formulas, transiently respected but not venerated, placed only where they can articulate.

Sharing the same space of complexity has implications for human proximities, simple efficiencies, modesty and sufficiency. It is not only for having a social common interest in maximum futilities. Layers with different cardinal values are mixtures (somehow quantic in fundamental physics, more arbitrary with social values). Suggestively, they try to catch a scientific purpose (under which the hierarchy of basic concepts is supposed to be safer), or to mimic and dispose higher logical criteria (like criteria of justice) on engineered issues. This produces a sort of frame (made together) and models that can both catch natural complexity (our physical world of interest, on which we depend) and make it interact with our human speculations, anchored enough to real complexities. All this, of course, is an art, as far as possible a democratic one, about science. Always remember the difficulties and essential uncertainties of a "wholistic perspective", about some sort of extraordinarily strange umbrella of complexity (only trying hard to be good with anyone; call the superior reason why as you like).

To be stable and liable, local financial systems need a social environment that can properly manage the fluid dynamic behavior of money's intangible information. Complex knowledge about the roots of money within the materialist human socio-technical system is essential. The fluid behavior of money is not a dead end but a means, sensitive to rigidities and biases, especially if disconnected from what people are doing and where they live. Popular systems of finance must anchor their values so as to remain like the upper part of the economy’s iceberg and not a distant appendix. The top part, made of financial systems, has to reflect effectively the larger, lower part underneath. Of course, the image can be developed further by saying that financial systems actually frame (too much?) the whole iceberg's system. This sets the debate between the honest contribution of financial systems to their economic roots and making them a predatory means of creating illusions, mirages and biases of the flows of rent toward their self-interest. Take also into account some mirror effects: setting financial policies needs an understanding of signals and of financial systems reflecting real economic trends.

Complexity in Models

To prepare complexity awareness or design perspectives, many half-complex tools predating the use of personal computers may have been unduly disqualified by simplifications: for example, the abacus, logical aids and other visual supports that combine figured logic. Locally appropriate management and calculation have been replaced by the crude, simple displays of pocket calculators and computer screens. Moreover, electronic displays and/or electric commands no longer provide mechanical contact with effects. The logic and synthesis, a bit more complicated than imagined in their previous use, are now incorporated into visual virtual-reality displays, choosing pathways according to the machine rather than the cognitive mechanisms previously and usefully applied by experienced human operators. This recently exploded in the 'arts of management' with the abuse of PowerPoint presentations of (recopied, not always good) data in many social public agencies, if not during processes of massive destruction empowered by due humane ignorance.

Do not think that methods will do everything: complexity is essential. Also, popular methods do not have to be so simple that they lack a realistic sense of applied perspectives, or be made to come from only one leader (be it the thinking of a dear leader or the voice of a powerful government). The adaptation of methods belongs to all users. It is anyone’s commitment to support, cooperate, be fair, constantly search, fulfill, and assume. All that combined makes the local community diagnostic more consistent. Deduced assumptions are often the product of simplified interpretations and do not correspond to the perspective of local actors. The ability to solve the "complexities" of reality is essential, and prejudging complexities too much can be unsafe. If you want to help people from far off the real human stage, after the most fundamental knowledge transmission, it can be wise to be ambiguous and suggest enough options, positive or contrasting ones. This lets anyone form his or her own opinion and, together, make their own reductions and simplifications, so they can assume their responsibilities while managing clear initiatives within their environment and caring for meaningful links there.

Founding some different truths of an exploratory system:

  1. “Absolute” (or, if you prefer: general, strong, first order, etc.)
  2. “Relative” (you can substitute: local, weak, second order, etc.)
  3. “Pragmatic” (you can too adapt other terms)

For material characterization, it is better not to start with just one variable, one parameter, one degree or dimension, but with a material object composed of 3 spatial lengths + time, even if abstract. This combination is called a quaternion (Hamilton's number). Eventually, it can be 3 variables combining all the previous ones, to some degree, as if composing a materialist concept. Thereafter, simplify, refer to minimal highest complex concepts, and explore possible calculus. Worse, in economics, is the lack of coherence with the physical ground of managed entities, ignoring especially the formalism of the physics of economy itself: thermodynamics.
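A small sketch of the quaternion named above: one real part plus three spatial components, combined with the Hamilton product (the printed examples are the standard unit identities):

```python
def quat_mul(p, q):
    """Hamilton product of quaternions p = (w, x, y, z) and q likewise."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

i, j = (0, 1, 0, 0), (0, 0, 1, 0)
print(quat_mul(i, j))   # (0, 0, 0, 1)  = k
print(quat_mul(j, i))   # (0, 0, 0, -1) = -k : order matters
# Non-commutative from the start: a richer object than a single scalar.
```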

Some restructuring of formalism is required, which does not mean disqualifying everything in the present formalism of economics (its mathematics, and mathematics is adaptable). Nor will it result in changing the whole phenomenology of values and parameters calculated on this virtual and speculative ground; rather, it will provide another core of "conceptual principles" for the management of arts with social issues, so as, at least, to have less non-phenomenological, impossible formalism, and fewer misunderstandings and inappropriatenesses in the formal and practical management of the fundamental grounds of our reality: biological, ecological, chemical, physical. As well as, "on the other side" of democratic things, to be more effective, even in humane speculations we could naively call 'nice tautologies of humane ethical values', such as freedom, solidarity, unity and diversity. These should also turn into essential criteria, as far as possible in social processes and influential in proceedings, made possible by physics in our troposphere & logos-visuo-sphere, topos-ecological.

The process then selects values, measures characteristic properties and openly makes decisions (suggests and commits). Prior to any deep analysis, these duals have the problem that when you organize them, you exclude options and shades, making the management of complexity difficult if you want to be fully determinist. You are forced to decide before making their assessment.

Binary (Boolean) logic is effective for simple, unambiguous systems, and this explains why it is the logic you meet in computer science or in the theory of decision, as the backbone of mechanical simplifications and simple electric devices. But it is a major mistake of most political sciences, especially when these inadvertently derive their principles of management from such supposedly scientific aims. For example, economists generally start their models by considering 2 sectors and then trap any others within this simple program. Their potential for social integration can be limited. Very often their specific purpose or mission is neither comprehensive nor integrative. Training in democratic methods comes too late, even if superior management is well intentioned in that respect.

The algebra of operations may have as its proper representing mathematical structure a commutative field. In commutative mathematics, operations give the same result when permuting the positions of numbers with respect to the signs of operation. Meanwhile, formal resources have large effects on social issues, in complicated ways.
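Stated as a formula, the commutativity of the field operations, with the quaternion example above as the contrasting non-commutative case:

```latex
% Commutativity of the field operations:
\begin{align*}
  a + b &= b + a, & a \cdot b &= b \cdot a
  \qquad \text{for all } a, b \text{ in the field.}
\end{align*}
% Contrast: the Hamilton product above is non-commutative (ij = k but
% ji = -k), and matrix products in general satisfy AB \neq BA, so
% permuting operand positions there changes the result.
```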

Another kind of application is to detail the steps of definitions. You can find in the literature many examples of checklists for investigation and model building. Notice that the dual of technical/social activities is a powerful one. We do not mean that their formulas are not part of social games (this is necessary) and cannot have any use. But the point is that this has not much to do with science, so it has to stay democratically accepted, thus submitted to such processes, and be humanely correct:

  • Small cognitive machines, to explore hypotheses that can be checked collectively and assessed, as well as reused for more. Close to socially sensible issues, it is important to link the olicognographs to values on moral issues as locally practiced first.
  • Figurative functioning systems (like partially explicative individual and social representations),
  • Cooperative environments, options, complex criteria and social leverages, criteria flexibility, simple systems (or simplicity), and a variety of strategies.
  • Model (the synthesis between the ideal representation and the operative, workloaded representation),
  • Project (a feasible wish, a utopia, a will to do better than apparent rationality),
  • Causal functioning (previously, the expected return of information: a system established on all the previous figured instruments; ex post, the causes you think explain the results you observe).

Management of complexity requires the intent to make clear what you can do and what you can resolve. Consider the practical possibilities and dispose precautions. Show your limits, uncertainty or doubts; ask, suggest and provide cooperation, help, anonymous support. One difficulty is to see the world as it is, looking like something known and always new, or to renew, maintain and produce (respectively), especially when paradigms pretend to simplify and force. Accepted rules from outside often take a different sense once inside communities. If analogy is the way we imagine practical schemes, observe that we are far from applying analogy modestly and properly. If the rigidities of scientists created by Cartesian minds (or the reductions of scientism) could have been useful for commensurable or commensurate sciences, they have also supported many unnecessary social interpretations about the sort of relationships required. Certainty and trust in science were necessary to escape religious and philosophical rhetoric and to enlighten the first steps of the scientific revolution. Important parts of physics fit well with formal, simple, first-order representations. Sciences used in reduced ways created a large corpus of methods, and this capital of knowledge reached a capacity for resolving many complex but calculable problems, significant for our life but very incompletely satisfying.

Methods and sciences have not succeeded in eliminating complexity from our social issues, our ecological integration, or our moral purposes. They even put more crudely in evidence that most of the mechanisms of our humane proximities are not doing well at satisfying our moral ambition and humane status in the animal kingdom. All along its developments, physics has incorporated many complicated perspectives, structures and doubts. But dogmatic mechanics extended to social issues has created plenty of rigidities and much virtual confusion in our social organizations and representations. This now turns counterproductive, especially at the moment we have succeeded in giving more fluidity to our processes of information.

Societies need, at the same time, to be freer to explore but also to preserve some common essential values. To support proudly our humane "specificity," we need to reach better social practices, even if philosophical definitions of human moral values are theoretically vague and uncertain. Under natural complexities, we make do with methods that are simple and "un-complete-able." We add many artificial complexities; meanwhile the modern world has not much eased ethical difficulties. Many problems are also increasing, partly because past and present oversimplifications, leveraged by technological and politico-economical means, have generated, stocked and buffered complex adverse effects, heating Pandora’s Box. With this, the intent not to ignore the complexities of our humane concern still requires details of proximity in the human comprehension of complex social issues and natural environments.

Anticipations of a Brave New Global World have weakened the human communities where skills and economies coped approximately with complexities. This has reached the point of debilitating individual abilities to satisfy elementary needs, sustainability, and comprehension by proper means. Maybe community is an abstract term with no general profile; let it be established by the local social unit from which humane, sustainable solutions should be found. The point here is not to see a sort of continuous, simple scale going up everywhere in the world in a unique way, valid for everyone, but rather incomplete and imperfect, numerous, incommensurable circularities of individual practices in social structures: like a multitude of strange attractors exploring, on the subjective side, complex spaces of moral values and, on the other side, subjective-objective niches where scientific registers can pour some understanding of why things should be humbly and humanely filled with cares.
