Basic Olicognograph: Qualitative, Quantitative & Synthesis

Qualitative methods

When you review any concept or start some activity, identification and characterization are the first steps of phenomenological (what you see) assessment. You then organize what you recall and know about the subject of your study: shape your analytical object, recall or infer its properties, and design how to check your hypotheses. This amounts to reviewing some standard questions and planning detailed investigations and activities. The standard questions are: when, where, who, what, how; what are the subject's time properties; where and in what sort of space it sits; to which environment it pertains; which issues are probable; which values and kinds of numbers to use; and so on.
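
A minimal sketch, assuming nothing beyond the standard questions just listed: such a review can be held in a small record so that no dimension is forgotten before detailed investigation starts (all field names and values here are illustrative, not part of any olicognographic standard):

```python
# A minimal sketch (hypothetical field names): the standard review questions
# from the paragraph above, kept as a simple record before detailed work.
from dataclasses import dataclass, field

@dataclass
class SubjectReview:
    when: str = ""        # time properties of the subject
    where: str = ""       # kind of space / environment it pertains to
    who: str = ""         # actors involved
    what: str = ""        # identification of the subject itself
    how: str = ""         # mechanisms or processes observed
    probable_issues: list = field(default_factory=list)
    values_and_units: list = field(default_factory=list)

review = SubjectReview(
    when="seasonal, observed over two years",
    where="local watershed, community scale",
    who="smallholder farmers and health workers",
    what="water quality for domestic use",
    how="sampling and informal interviews",
)
print(review)
```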

Traditionally this is presented point by point, element by element, and less often structured along basic dimensions. Olicognography organizes small subsets of values as logical pieces. With complex natural subjects, you may prefer to shape the perspective before going into the details; this is like knowing what makes the unity of the subject. A well combined perspective, or structured gauges, can then be used to understand it as a whole. With local issues or problems, your aim is to cover enough theoretical and practical community use before specifying or reducing, especially if your purpose is to stay economical, that is, not to affect too much and to fit your practice to the environment.

Starting from a subject, you frame an analytical object. At the least, you start putting labels on some black box. Then you select and detail properties and parts, set equalities or identities between subject and object, and from the object to other pieces of knowledge. You look in your 'brain-mind' for anything that looks alike, make hypotheses, and question or design a first set of actions to develop the analysis or start preliminary answers, practically called working hypotheses. What we suggest, at the start, is to avoid too tight processes of definition. Try to manage only the essential scientific physical concepts, those that are universal and can serve core human activities. When definitions have to be set, leave more room to local definitions of social processes; this means rooting these definitions in the values and ways used there, even if you keep in mind some abstract concepts able to qualify the situation. The point is not to succeed in convincing legitimate actors of the one sense you give to things, all the more so if you are not a legitimate local actor fully involved in the local economy.

Quantification

Quantification can be included directly or indirectly in the qualification process(es), or follow some prior or later arrangement. Quantitative information is supposed to be easily accountable (or, at the reverse, ignored) and already defined (or, too often, prejudged). The objects are already fragmented into variables. Sometimes there is just the visual approximation of an observer measuring through mediations. Quantification can be seen as a numerical label characterizing part or all of the analytical object, and quantified information is often linked to a process of measure. The result may even be taken qualitatively, for example used only to support a decision, or quantitatively, when it can feed a renewed measuring process. By experience, memory or interpretation, previous reviews may have explored enough convincing equalities and identities between parts, incorporated moduli, properties and/or parameters to suggest a model or have one at hand. This is useful when the model has not been selected only because the manager knows how to handle it alone. With poor knowledge, more doubts or more distant formal resources, setting identities may require enhancing your analysis or widening your coverage of problems, with precautions and a strategy of implementation. Analysis is often based on determining some fundamental characteristics or delimiting approximate ranges of values, as with time(s) and space(s). Time and space can behave as dimensions and form the system of coordinates; trajectories over time made the first formally established studies of phenomena.
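
A minimal sketch, with invented numbers, of that last point: an observation kept as time and space coordinates together with an approximate range of values, and a trajectory as the ordered list of such observations:

```python
# A minimal sketch: observations as coordinates in time and space, plus an
# approximate value range rather than a single falsely precise number.
from dataclasses import dataclass

@dataclass
class Observation:
    t: float            # time coordinate (e.g. days since the study began)
    x: float            # spatial coordinate (e.g. km along a transect)
    low: float          # lower bound of the measured quantity
    high: float         # upper bound of the measured quantity

# A trajectory is simply the ordered list of such observations over time.
trajectory = [
    Observation(t=0.0, x=1.2, low=4.0, high=6.0),
    Observation(t=7.0, x=1.5, low=5.0, high=8.0),
    Observation(t=14.0, x=2.1, low=3.5, high=5.5),
]

# Mid-points give a rough series; keeping the ranges records the imprecision.
midpoints = [(o.low + o.high) / 2 for o in trajectory]
print(midpoints)
```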

With respect to the main properties or definitions of any real subject, fundamental physics has clever views. Paradoxically, the more complex the subject, the less attention is paid to these basic dimensions, perhaps because such values are seen as too simple or too evident, and are not treated as complex frames for social practices or social sciences. So everyone fails to adapt the essential dimensions to take complex sense into account. This would not be so difficult to conceive: reality and diversity are not so absurd, but they are ignored, because it seems more important to have understood and to pretend to master the subject than to acknowledge what is being ignored. The way Darwinian evolution is often treated is an example of this absurdity. In social issues, distance and time are to be seen as complex: set aside the simple scale as an absolute gauge, and then review time or space dimensions by integrating all possible information, knowing that we may not have the proper means to treat this sort of information properly.

Commonly, the scales of these basic dimensions are preferred linear, regular and orthonormal, with the Euclidean metric as the preferred norm. Linearity of demonstration helps rigour but leads to mistaken determinism, since at the level of a complex unit things are rarely that simple (and the dictatorial ambition of any expert is to be right and to be the 'best' qualified thinker, decider or manipulator, much like the monopolist). You then start naming your goal(s), defining your subject(s) and analytical objects, and organizing your ideas. At that point you will face a collection of (analytical) objects and will examine it to establish the possibilities of classification, eventually of ontogeny. For this, you will design some implicit or explicit set of rules to induce one classification. Following classificatory procedures applied to parameters, you will characterize the objects of the collection. Alternatively, you will infer or deduce a classification according to some general knowledge you have of the objects in the collection, and then find and validate the classification rules you have identified. Methodologically, these procedures consist successively of: 1) examining units (separated or joined) and grouping them into valid classes or other forms of aggregation (if the processes are equipped with differentiating rules); 2) seeing whether the aggregates confirm a natural classification; 3) considering whether you have understood the mechanism(s) of some system of relations, that is, associations or correlations between parameters; and, eventually, 4) trying to find an order in the collection of smaller objects or in the parts of a larger object.
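
A minimal sketch, on simulated data, of steps 1) to 3): an explicit rule groups the units, the groups are compared with a presumed 'natural' classification, and a correlation between parameters gives a first view of the system of relations (all names, thresholds and labels are illustrative):

```python
# A minimal sketch of the three classificatory steps, on invented data.
import numpy as np

rng = np.random.default_rng(0)
size = rng.normal(10, 2, 50)                    # parameter 1 of each unit
activity = 0.5 * size + rng.normal(0, 1, 50)    # parameter 2, loosely related

# 1) explicit classification rule applied to a parameter
groups = np.where(size > 10, "large", "small")

# 2) compare with a presumed natural classification (here: invented labels)
natural = np.where(activity > 5, "active", "quiet")
agreement = np.mean((groups == "large") == (natural == "active"))
print("agreement with presumed natural classes:", round(agreement, 2))

# 3) correlation between parameters, as a first sign of a system of relations
print("correlation size/activity:", round(np.corrcoef(size, activity)[0, 1], 2))
```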

Operations and Units

There is also the issue of formal operations, which fall short in the practice of complex economics. The temptation is to imagine that the basic operations, useful for homogeneous, simple amounts or quantities, apply in the same way to realities. Take addition, subtraction, multiplication and division, look also at the logic of the null operator or of division, and you will soon see that this is not so easy. Mind transformations and the treatment they have received in physics and chemistry. Start considering a first sort of complex operators, moduli and vectors, and you may understand that aggregation, accumulation and generation express better the purpose of complex operations; even if nature has no perverse purpose of being intentionally complex, neither does it have the purpose of being simple just to be easily understandable by the human sciences. What is made with simple operations alone is oversimplified.
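
A minimal sketch of the point about homogeneous quantities: a tiny unit-tagged value that refuses the addition of heterogeneous amounts while letting division legitimately combine units (the class and units are illustrative, not a standard library):

```python
# A minimal sketch: basic operations only apply cleanly to homogeneous amounts.
from dataclasses import dataclass

@dataclass
class Quantity:
    value: float
    unit: str

    def __add__(self, other):
        # Addition only makes sense for quantities of the same unit.
        if self.unit != other.unit:
            raise ValueError(f"cannot add {self.unit} to {other.unit}")
        return Quantity(self.value + other.value, self.unit)

    def __truediv__(self, other):
        # Division legitimately produces a new, combined unit.
        return Quantity(self.value / other.value, f"{self.unit}/{other.unit}")

distance = Quantity(12.0, "km")
duration = Quantity(3.0, "h")
print(distance / duration)      # Quantity(value=4.0, unit='km/h')
# distance + duration           # would raise: cannot add km to h
```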

Something to treat carefully is the type of units and their related characterizations. The units we put the emphasis on are the natural ones, which, as we say, have reached a maximum level of complexity (like autopoiesis). Think about the gap such a unit can have with its formal symbols: just consider yourself, and your own unit, as a system. This sets the conditions of operations. This common sense of system has existed officially in the debate since von Bertalanffy, if not before. Now, even if plenty of simplifications may nevertheless be obliged, this does not mean that basic things are not complex; it also gives reason to understand that natural units, which like human beings are mixtures of different sorts of formalism and sub-systems, are complex. Holism and rational reductionism both often ignore that the human-being-centred, averaged concept is used by both, and in almost the same short way.

Now, a first step of quantification or quantization is to assign a mathematical function or a logical statement to the objects of analysis. A formal function is not restricted to what is known from elementary mathematics, such as an f or g applied to a given variable x, y or z between parentheses and equal to another variable in an expression with signs, coefficient(s), power(s), different term(s), variables and scalars: y = f(x), or schematically g(z) = (sign of perspective) * (scalar coefficient) * z^n. Formulas of this sort are short and should take better into account the properties of the various sorts of functions, including special algebras, in order to reflect the properties of the natural phenomena you want to put into formulas. Even the concept of a set, which is a theory of analytical objects or collections in logic and mathematics, has turned out to be neither as general as previously imagined nor as complete. Statistics also has many delicate concepts, making it more a tool for criticizing an approach than for insisting on one point over the others: with statistics, it is often wiser to say what something is not rather than what it is.
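
A minimal sketch of assigning a formal function to an analytical object, following the schematic form just given; the coefficient and power are arbitrary illustrations:

```python
# A minimal sketch of the schematic form g(z) = sign * coefficient * z**power.
def g(z, sign=+1, coefficient=2.5, power=2):
    """Schematic monomial assigned to an analytical object."""
    return sign * coefficient * z ** power

print(g(3))                      # 22.5
print(g(3, sign=-1, power=1))    # -7.5
```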

Exact Simple Quantities

Measure is at the intersection of formalism and experiment or reality, so this concept is of major importance, much like the primal-dual program that serves as an intermediate in many analyses. In mathematics a program solves a problem of extrema under constraints, switching if needed from a primal form to its related dual form, whichever is easier to solve. Measure thus has many relations with mathematics (especially the theory of integration), physics (analysis of signals), and methods of decomposition (of mathematical functions). In physics, the science of measure is called metrology. The concepts of measure in the social sciences follow the basic concepts established in the more exact formal sciences, but in a rather confused form. You often measure without really knowing what, and even plenty of definitions do not prevent you from adding a hugely self-defined, selfish program, not the kind inspired by communities' issues, which would be the ground for individual needs.
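
A minimal sketch, using scipy and invented data, of the primal-dual idea mentioned above: the same small problem of extrema under constraints solved in its primal form and in its dual form, with coinciding optima:

```python
# A minimal sketch of a primal/dual pair of linear programs (invented data).
import numpy as np
from scipy.optimize import linprog

c = np.array([1.0, 2.0])       # primal objective:   minimize c.x
A = np.array([[1.0, 1.0]])     # primal constraint:  A.x >= b, x >= 0
b = np.array([3.0])

# Primal: linprog expects "<=" constraints, so the inequality is flipped.
primal = linprog(c, A_ub=-A, b_ub=-b, bounds=[(0, None)] * 2)

# Dual: maximize b.y s.t. A^T.y <= c, y >= 0, written as minimizing -b.y.
dual = linprog(-b, A_ub=A.T, b_ub=c, bounds=[(0, None)])

print(primal.fun, -dual.fun)   # both optima coincide (strong duality): 3.0 3.0
```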

Discrete quantitative processes come close to qualitative ones. Processes in breaks or cuts, which set the effects of statistical inference or decisions on significance and/or comparisons, pose problems of definition and come close to the qualitative, though with a quantitative formalism. Meanwhile, quantitative processes are expressed using qualitative concepts such as stability, regularity of definitions, specification, and so on. So you understand, at least intuitively, that modern statistics and probability are now far from a simple discrimination between qualitative and quantitative issues: qualitative and quantitative approaches are now highly complementary.
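
A minimal sketch, on simulated data, of a quantitative procedure read qualitatively: a significance test whose numerical p-value is turned into a discrete decision:

```python
# A minimal sketch: a quantitative test whose outcome is read qualitatively.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
group_a = rng.normal(10.0, 2.0, 30)
group_b = rng.normal(11.0, 2.0, 30)

t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(f"p = {p_value:.3f} ->",
      "difference retained" if p_value < 0.05 else "difference not retained")
```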

Also keep in mind that the newer logic of mathematics (from Hintikka up to Girard and Longo) is introducing many more appropriate concepts of quantification, which give new kinds of perspectives on complex algebra, important for complex units, even though they exclude the 'free will' of human expectations and, for good, some irrational behaviours of common sense; these may be no more than misunderstandings of a well applied formalism. Finally, we would like to remark on consistency and observe that the resources of simulation offered by personal computers help to mimic plenty of human weirdness, but that never proves that, practically, we effectively behave or should behave like the 'models'.

Parameters and variables are nevertheless useful and not used enough, even if the hidden and ignored part has taken a larger share than deterministic ambition expected. Olicognography has not yet (?) properly received any quantification, but we have many frames that can make quantification processes shared and more intuitively approached.

Probabilistic Methods

In a global way, statistics are indispensable methods for studying units that can disaggregate and behave quite independently in the same way. This 'quite independently, in the same way', or the ability to 'interact neutrally in a well', is a paradox. The simplest assumption considers that elementary units are all the same, behaving in the same 'stochastic' or random way. This is the ergodic principle; it could seem democratic, but it is not all we have in our universe. So this democratic simple way is more of a utopia, significant only for detecting general, neutral, averaged trends. The ergodic assumption is weak at explaining qualitative composition or local behaviours that can reach important quantitative effects.
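
A minimal sketch, with simulated units, of the ergodic assumption: when all units behave in the same random way, the average across many units at one moment and the average of one unit over a long time tell the same story:

```python
# A minimal sketch of the ergodic assumption on identically behaving units.
import numpy as np

rng = np.random.default_rng(2)
n_units, n_steps = 1000, 1000

ensemble = rng.normal(0.0, 1.0, (n_units, n_steps))
ensemble_average = ensemble[:, -1].mean()   # across all units at one time
time_average = ensemble[0].mean()           # one unit followed across time

print(round(ensemble_average, 3), round(time_average, 3))   # both near 0
```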

Statistical methods do not directly address identification and fundamental effects; they consider effects in numbers (statistical significance) and patterns of distributions, or constructions of probabilities by iteration. So, when you use these kinds of methods, you often require more fundamental explanations (the 'chem-mystery' of relations) to support your statistical inferences about what you are studying with numbers or frequencies. This is why many such reasons, even if statistically well argued and well supported, can remain subjective, for example when excluding all intents to manipulate causes and interpretations. In fact, the defects of most statistical investigations are to be badly made (with respect to live aims) and wrongly interpreted (with respect to the human sense of feasibility). In this way, scientific constructions using statistical methods are weak and involve many arbitrary prior working assumptions on structure and formal functions, required before inference tests. This sets up a virtual preliminary world that can be far away from essential reality, especially in the blurred social sciences. In this kind of 'soft but not so sweet' formal science you can always find purified concepts fitting data series well, yet always wrong with respect to prediction, when not inhumane. Add that defining policies necessarily biases series, and most data collections are never neutral, nor well controlled; models always seek to manipulate. The connections to fundamental laws do not exist, and qualitative perturbations develop through many natural mechanisms, such as chaotic dynamics, expansions past threshold values, or effects of scale. These make averages poor supports for decision.
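
A minimal sketch of the remark on thresholds and chaotic dynamics: the logistic map changes qualitatively when its parameter passes a threshold, yet the plain average of the series barely shows it; the spread does (parameter values are the usual textbook ones):

```python
# A minimal sketch: averages hide the qualitative change of regime.
import numpy as np

def logistic_series(r, x0=0.2, n=2000):
    """Iterate the logistic map x -> r * x * (1 - x)."""
    xs = np.empty(n)
    xs[0] = x0
    for i in range(1, n):
        xs[i] = r * xs[i - 1] * (1 - xs[i - 1])
    return xs

for r in (2.8, 3.9):                 # one regular regime, one chaotic regime
    s = logistic_series(r)[500:]     # drop the transient
    print(f"r={r}: mean={s.mean():.2f}, spread={s.std():.2f}")
```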

Statistics, in fact, are instruments for coping with complexity, but are too easily used incorrectly to support reductions and simplifications. The best use of statistics is often in criticizing and contradicting what people (ab)using numbers say. Thinking about the importance of statistics means examining whether your analytical object has something to do with many numbers. It is curious to observe professionals involved daily in population management, such as agronomists, ecologists, forestry technicians, physicians in public health, and sociologists, who are not very good at the positive, constructive criticism of these methods. It is also surprising to observe thinkers opposed to quantification, very fond of democracy, yet ignoring all the qualitative methods developed by statistics. Nor should we miss the many professionals doing something that looks like statistics, with numbers and amounts, while deliberately ignoring the basic conditions and essential assumptions of the statistics they use.

Statistics in many cases turn out to be no more than qualitative, with biased contributions that appear obviously erroneous to the actors, especially given their individual perspective and because they know what they really contributed. Nevertheless, statistical methods with less ambition can be useful tools for cleaning hypotheses, filtering processes and helping to examine measures. Using them more to criticize dispositions mimics well the kinds of arguments that managing any complicated issue requires. Hence the suggestion to practice statistics as a learning process for detecting what data do not and cannot say, what they contradict, or what not to determine. This requires not losing oneself in methodological complications and oversimplified reductions. There is also an important defect in the disconnection between studies and policy management. The disconnections from reality come from the inconsistent levels of macro-studies of politically wide subjects, even when these studies could be helped by the laws of large numbers. For example, national and even regional levels of aggregation are often not the proper levels for useful statistical comparisons: they confuse too much the practical, phenomenological (close to human distance) comparisons. Also, there is an overly large excess of would-be captains of the vision compared to modest social entrepreneurs really committed to practical solutions of democratic complexity.

Finally, to adjust the parameters of distributions we use estimators. The main concern in studying the properties of estimators and the characteristics of special distribution functions, special metrics or more lax transformations is to try to detect some profiles in the data (those of a special law of distribution), so as to conduct statistically meaningful comparisons. The discussion of the kinds of parameters you use to characterize distributions, their asymptotic trends, their sensitivity to 'perturbations', and their degree of expressiveness all involve concepts relevant to significance and/or robustness.
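
A minimal sketch, on simulated data, of the sensitivity of estimators to 'perturbations': the mean and the median of the same sample react very differently to a single aberrant value:

```python
# A minimal sketch: robustness of two location estimators to one gross error.
import numpy as np

rng = np.random.default_rng(3)
sample = rng.normal(50.0, 5.0, 99)

clean_mean, clean_median = sample.mean(), np.median(sample)
perturbed = np.append(sample, 500.0)                   # one aberrant value

print(round(perturbed.mean() - clean_mean, 2))         # mean shifts noticeably
print(round(np.median(perturbed) - clean_median, 2))   # median barely moves
```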

follow...