OLICOGNOGRAPHY on SOCIAL INFRASTRUCTURES

Estimation, Prediction

Basic Olicognograph: Balance Formal Registers

Probability

The mathematical definition of probability is clear, but its relation to the real world is still debated between two broad views: 1) Objectivists (or frequentists) consider that probability is given by the ideal behavior of frequencies produced by the results of experiments operated under relatively stable conditions; they claim to be, themselves, "more scientific", and this view has been the ground of theories of statistical inference. 2) Subjectivists think that probability is a degree of trust in any uncertain event, covering frequencies as well as other situations.
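
As a minimal illustration of the frequentist reading, the Python sketch below simulates independent trials and watches the relative frequency of an event stabilize toward its probability; the event probability 0.3 and the trial counts are illustrative assumptions, not values from the text.

```python
# A hedged sketch, assuming a simulated event with true probability 0.3.
import random

def relative_frequency(p_true: float, n_trials: int, seed: int = 0) -> float:
    """Estimate P(event) as the proportion of successes in n_trials draws."""
    rng = random.Random(seed)
    successes = sum(rng.random() < p_true for _ in range(n_trials))
    return successes / n_trials

for n in (10, 100, 10_000, 1_000_000):
    print(n, relative_frequency(0.3, n))
# The printed frequencies drift toward 0.3: the "ideal behavior of
# frequencies" that objectivists take as the ground of probability.
```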

Subjective probability covers various ambiguities:

  1. Partial-belief logic bears on products of probability techniques, possibly alternative in the processes used, but normative as a theory of belief.
  2. Probabilities, as techniques of such products, may be individual or raised in terms of frequencies up to groups or collectives, bearing on a hypothesis or collections of hypotheses under conditions of uncertainty; recall our concept of a complex program.
  3. Derivations can be classes of estimations, judgments, beliefs and inferences: subjectivities not necessarily tight, rigorous and logically consistent, hence possibly without any special type of measure. Observe that subjectivity is made of brain activities emerging not as products of the automatic application of simple logic but from complex processes, not all explicit yet not necessarily unscientific. Commonly this is only a first, unachieved step, which calls for previously defined mathematical and repeated (iterated) physical events consistent with experiments.
  4. Coincidence between calculated estimations or predictions and registered observations, especially frequencies.
  5. Relations to measure: subjective probabilities can be set by assessing the amount of information the subject has, or thinks it needs, to get out of its uncertainty.
  6. When without any information on an event, the subject can make its judgment by referring to the "principle of indifference".
  7. Subjective probability is often evoked by the idea of "luck" or "chance", carrying the possibility of a game (more or less ruled) and of gain or loss from bet(s) with strategies (and, in hard-smooth complex realities, with tactics).
  8. There is thus a concept of "risk taking" (facing the other or adverse issue) and of commitment to play or involvement.
  9. It possibly implies a risk and/or a hazard (that could be prevented or avoided) and other operative constraints.
  10. With issues for an individual there is thus a process of judgment (check and assess, judge, commit).
  11. Belief often implies an abstraction on chance, a feeling of being lucky, in a way that can be strong, coherent and even compulsive if not addictive.
  12. A process of decision under uncertainty carries risk(s); it can be conditioned and even forced, but within a process of assuming imperfect information or knowledge.
  13. The normative theory of decision making refers especially to the review of advantages and inconveniences. Even if the actors are conditioned, there remains the resource of thinking about what to do, without always the obligation to obtain it; this is not necessarily done well nor sanely.

Estimation

Estimators are measures of a given value, parameter, property or characteristic. The good properties of an estimator are: 1) unbiasedness, that is, its expected value should equal the true value of the parameter; 2) convergence in probability (or in quadratic mean) as the sample size tends to infinity; 3) minimum variance (an efficient estimator). There are different classes of estimators, such as maximum likelihood, least squares and so on. Their purpose is to mediate estimation more safely; estimation relates to the calculus of probabilities, quantum mechanics being also an applied register with plenty of future.
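
To make the bias property concrete, here is a hedged Python sketch comparing the naive variance estimator (dividing by n) with the unbiased one (dividing by n-1) by Monte Carlo; the normal sample, sizes and repetition counts are illustrative assumptions.

```python
# A hedged sketch: Monte Carlo check that dividing by n biases the variance
# estimate downward (expected value (n-1)/n * sigma^2), while n-1 removes it.
import random
import statistics

def mean_variance_estimate(divisor_offset: int, n: int, reps: int,
                           seed: int = 1) -> float:
    """Average many variance estimates; offset 0 -> /n, offset 1 -> /(n-1)."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(reps):
        sample = [rng.gauss(0.0, 1.0) for _ in range(n)]  # true variance = 1
        m = sum(sample) / n
        estimates.append(sum((x - m) ** 2 for x in sample)
                         / (n - divisor_offset))
    return statistics.mean(estimates)

print("divide by n    :", mean_variance_estimate(0, n=5, reps=50_000))  # ~0.8
print("divide by n - 1:", mean_variance_estimate(1, n=5, reps=50_000))  # ~1.0
```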

Probability and statistics care for uncertainty: risk may quantify uncertainty and receive a probabilistic treatment. The constraints and limits set by the application of probabilities are often missed. For example, they are formally valid under the ergodic theorem, or for elementary events managed in independent (random) iterated processes of sorting or simulation, that is, with each component behaving independently and neutrally. These are ideal conditions, hard to believe in an evolving universe that developed precisely against such principles, and where interesting objects or situations have appeared precisely at the margins of idealities like perfect gases, repulsive clouds, crystals (perfect structures) and so on.

Estimation of what to do (once estimators are inferred), in the perspective of determinism, commonly resolves the uncertainty of its philosophy by saying that positive planning needs a decision anyhow, after the results or averaged treatment of risks' probabilities. The distortion of uniqueness often produces prejudgments and wrong statistics on operations. Diversity of issues, poor uses of data, a weird scientific establishment, or oversized ambitions of being scientific in registers that are not really scientific lead to mistakes, ignore the good lessons of instruments (precautions) and miss goals. And it is more than common that ideologies of leadership abuse these tools at their convenience.

Proceedings of estimation can start with values of trustworthiness and observations of reality all the way, yet be coincidentally false. Estimators look backward (they are statistically established) but with the ambition to look forward: to qualify prediction. Issues of prediction start with precautions, and should make better use of these precautions rather than of certainties. Estimators and predictors may better overlap between before and after (be it in time or otherwise): there lie the ideas of Bayesian estimators.
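
As a minimal sketch of a Bayesian estimator overlapping "before" and "after", the Python snippet below performs a conjugate Beta-Binomial update; the prior Beta(2, 2) and the observed counts are illustrative assumptions.

```python
# A hedged sketch: conjugate Beta-Binomial updating. The prior Beta(2, 2)
# and the data (7 successes, 3 failures) are illustrative assumptions.

def beta_binomial_update(a: float, b: float, successes: int, failures: int):
    """Beta(a, b) prior plus binomial counts gives a Beta posterior."""
    return a + successes, b + failures

a0, b0 = 2.0, 2.0                              # prior, weakly centered on 0.5
a1, b1 = beta_binomial_update(a0, b0, successes=7, failures=3)

print("prior mean    :", a0 / (a0 + b0))       # 0.5, looking "before"
print("posterior mean:", a1 / (a1 + b1))       # ~0.64, pulled toward the data
```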

Quite often the more consistent way will not be just to follow the prediction, nor to focus on the estimator (stripped of its properties) because it is positive, but to prepare. Opposite or paradoxical effects are common in reality, hence our primal-dual first setting of a complex program. Mind that logic(s) of belief may give some frames to these sorts of games.

Second, consider the measurability or non-commensurability, that is, the degree of relation between object and subject. Figuratively there are two extremes of effects. One is too explosive (or too implosive), like chaotic or catastrophic nonlinear effects, adding the effects of partial dimensions: scaling up, emergence; scaling down, partial or fractal dimensions. The other is the non-uniqueness of solutions: derivations or inflections (left and right derivatives differing), trajectories observing bifurcations, or some box with more than one value.

A distribution that shows some homogeneity can be characterized simply, relative to the characteristics of a "regular" distribution, and summarized by four types of moments (in short, though we would prefer six): 1) first (central tendency), which answers the question of where the value closest to every one lies; 2) second (deviation), which gives the mean distance of deviation from the central tendency (often called the error term); 3) third (skewness), which specifies the asymmetries of the curve; 4) fourth (kurtosis or flatness), which, on the other axis, reflects the mean height of the curve.
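
The following Python sketch computes these four summaries for a small right-skewed sample; the data values are an illustrative assumption.

```python
# A hedged sketch computing the four moment-based summaries of a sample:
# mean (central tendency), standard deviation (deviation), skewness
# (asymmetry) and kurtosis (flatness/peakedness).
import statistics

data = [1.0, 2.0, 2.0, 3.0, 3.0, 3.0, 4.0, 9.0]   # a right-skewed sample

n = len(data)
mean = statistics.fmean(data)
sd = statistics.pstdev(data)                       # population deviation

skewness = sum((x - mean) ** 3 for x in data) / (n * sd ** 3)
kurtosis = sum((x - mean) ** 4 for x in data) / (n * sd ** 4)

print(f"mean={mean:.2f} sd={sd:.2f} skew={skewness:.2f} kurt={kurtosis:.2f}")
# The positive skewness flags the long right tail produced by the value 9.0.
```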

Predict or Project

Authors have observed that a prediction is not made on one lonesome hypothesis but on the conjunction of one "nomologic hypothesis" and some pertinent auxiliary proposals. Stochastic processes are aleatory functions whose argument is time, irreversible and inescapable; this shapes the practical problems posed, essentially prediction (or extrapolation), and is included in modern prediction techniques. If a prediction is verified, it is not possible to conclude that its truth is definitive; only, if not confirmed, can the prediction be said false. These sorts of precautions in social practice are, in our opinion, essential to transparency; they should be recalled more often and managed, instead of thinking that the magical touch is 'there'.

Determinism has not just the lone defect of pretending to exactness, but also of holding that there are always determined cause(s), at best just one, exactly predictable under some determined conditions (managed better by only one). Then, so as not to invalidate the reduction: force the conditions, whatever the damages they have commonly produced. The fallacy of self-justification extends to all correlative philosophy. Abuses of enunciates have the defects of analogical truth: predicting and prejudging validity away from the domain where they may have been right at some moment (not treated just as analogy).

Anticipation is considered as preparedness for action(s) facing time's changes. When someone reacts to stimuli in different ways, it is observed that these reactions have statistical properties, so as to anticipate the next stimulus according to what happened before.

Types of prediction problems (by purpose):

  1. Decisive prediction: take a decision for an ordering activity, or have a criterion,
  2. Simple sample inspection: for qualification or assessment of a method or procedure,
  3. Regulation: such as implementing a feedback loop,
  4. Optimization,
  5. Calibration of diagnosis,
  6. Treatment allocation.

Prediction Methods:

  1. Judgmental estimates,
  2. Delphi (panel of experts),
  3. Ratios / relationships derived from regression or comparative analysis,
  4. Markov modeling / cross-impact analysis (Markov chains & stochastic processes),
  5. Optimization models (as employed in mathematics).

Statistical treatments of prediction calculus include:

  1. Discrete stationary case: moving average of uncorrelated random variables; power density estimate by the least-squares criterion (see the sketch after this list).
  2. Continuous stationary case: Fourier transforms; spectral (or power) density.
  3. Orthogonal and Gaussian processes (Brownian or Wiener process): polynomial estimate; Karhunen-Loève expansion.
  4. Dynamical systems with control variables: maximum likelihood estimate (Bayesian calculus).
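
For the discrete stationary case of item 1, here is a hedged Python sketch of a moving-average smoother and a least-squares trend estimate on a synthetic series; the trend slope and noise level are illustrative assumptions.

```python
# A hedged sketch: moving average as a smoother of a noisy series, and a
# least-squares line as a trend estimate. The series is synthetic.
import random

rng = random.Random(42)
series = [0.5 * t + rng.gauss(0.0, 2.0) for t in range(50)]  # trend + noise

def moving_average(xs, window: int):
    """Average each value with its window-1 predecessors (simple smoother)."""
    out = []
    for i in range(len(xs)):
        chunk = xs[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def least_squares_line(xs):
    """Fit y = a + b*t by the least-squares criterion; return (a, b)."""
    n = len(xs)
    t_mean = (n - 1) / 2
    y_mean = sum(xs) / n
    b = (sum((t - t_mean) * (y - y_mean) for t, y in enumerate(xs))
         / sum((t - t_mean) ** 2 for t in range(n)))
    return y_mean - b * t_mean, b

smoothed = moving_average(series, window=5)
a, b = least_squares_line(series)
print(f"fitted trend: y = {a:.2f} + {b:.2f}*t")   # b should be near 0.5
```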

On prediction, we suggest caring about pertinence, since practicing predictions can be time-consuming and yield poor results. Worth considering are:

  1. Pertinent events that can produce good policies or preemptions.
  2. Disastrous events able to suggest efficient specific preparedness (otherwise, just prepare for any event).
  3. Essential processes that could be relevant to activities and better ground other processes of decision.
  4. Determined frames able to help make hypotheses on unknown properties (from theory to experiment).
  5. Properties, parameters and variables consistently formulated to provide a fair estimation of what would happen.
  6. Unknown phenomena that can nevertheless be approached by a probabilistic formulation based on the past properties of a model.

Relevancy to project management:

  • Predict the time schedule of revenues.
  • Dynamics or trends of effects (& business cycles).
  • Compare two predicted trends, with and without the project.
  • Rhythms of the project's implementation and management.
  • Emergence of results, top or end dates (for synchronization).
  • Complex necessary dynamics (processes of negotiation, financial schedule).
  • Intermediate goals review & reassessment (care for the philosophy of social benefit).
  • Economic management and reality of predictions or estimations, to provide the frame of implementation and anything else relevant to the project (not for "predicting" that someone has been right).
  • The pedagogy for explaining and motivating participation.

More Complicated Evolutions

Nowadays prediction methods have often taken advantage of simplicity and reveal themselves quite poor at being really effective when most needed: in downturns, nonlinear dynamics, strange attractors' patterns. Plenty of new ways to manage predictions are necessary, since before the certainty of anticipation you have to be prepared, and that can matter more than the best prediction. When everything is usually all right, everyone is doing well: trends are smooth and inclusive social commitment provides satisfaction to most. Some concepts behind these jumps follow.

As a first step of complication there is the skidding of mimicry, when uncertainty induces blind imitation, especially of the one who seems to have the best prediction during the good trend. "Classical Expected Utility predicts that investment can be unaffected by global risk. In contrast, empirical applications of Rank Dependent Utility and Cumulative Prospect Theory predict that, if anything, increasing global risk will enhance risk seeking and, thus, lead to more investment." In business statistics, a large majority of summary indicators derived from the individual responses to qualitative Business Tendency Survey questions result from standard aggregation and quantification methods.

"Evolutionary Dynamics: here every player is replaced by a population of individuals, each playing the given game in the role of player i. Each such individual always plays the same one-shot action (this fixed action is his “genotype”). The relative frequencies of the various actions in population i may be viewed as a mixed action of player i in the one-shot game (for instance, one third of the population having the “gene” L and two thirds the “gene” R corresponds to the mixed action; one may think of the mixed action as the action of a randomly chosen individual. Evolutionary dynamics are based on two main “forces”: selection and mutation. Selection is a process whereby better strategies prevail; in contrast, mutation, which is rare relative to selection, generates actions at random, whether better or worse. It is the combination of the two that allows for natural adaptation: new mutants undergo selection, and only the better ones survive.

"Adaptive heuristics lie in between: on the one hand, the players do perform certain usually simple computations given the environment, and so the behavior is not fixed as in evolutionary dynamics; on the other hand, these computations are far removed from the full rationality and optimization that is carried out in learning models.

  1. There are simple adaptive heuristics that always lead to correlated equilibria (ex: Regret Matching Theorem).
  2. There is a large class of adaptive heuristics that always lead to correlated equilibria (ex: Generalized Regret Matching Theorem).
  3. There can be no adaptive heuristics that always lead to Nash equilibria, or to the convex hull of Nash equilibria (ex: Uncoupled Dynamics Theorem)".

"Adaptive heuristics are based on natural regret measures, and maybe viewed as a bridge between rational and behavioral viewpoints. Taken together, the results presented here establish a solid connection between the dynamic approach of adaptive heuristics and the static approach of correlated equilibria. Be heuristics for rules of behavior that are simple, unsophisticated. These are “rules of thumb” that the players use to make their decisions. We call them adaptive if they induce behavior that reacts to what happens in the play of the game, in directions that, loosely speaking, seem “better.” Thus, always making the same fixed choice, and always randomizing uniformly over all possible choices, are both heuristics. But these heuristics are not adaptive, since they are not at all responsive to the situation (i.e., to the game being played and the behavior of the other participants). In contrast, fictitious play is a prime example of an adaptive heuristic: at each stage one plays an action that is optimal against the frequency distribution of the past actions of the other players. Adaptive heuristics commonly appear in behavioral models, such as reinforcement, hysterisis, information, feedback, and stimulus-response".
