'Shopping Around' for Accounting Practices: the Financial Statement Presentation of French Groups


Academy of Management Executive

February 2003, vol. 39, no. 1, pp.42-65

Departments: Comptabilité et Contrôle de Gestion, GREGHEC (CNRS), Economie et Sciences de la décision

Keywords: Financial statements, Harmonization, IASB

This article illustrates the progressive move away from traditional accounting practices through a study of the presentation of financial statements. Based on a sample of one hundred large French industrial and commercial groups over a ten-year period, and applying a logistic regression method, our survey confirms a trend among French companies, which are increasingly turning their backs on traditional national practices as regards the balance sheet format, the income statement format, the voluntary disclosure of a statement of changes in shareholders' equity and the cash flow statement format. This move towards 'alternative' practices is made possible by the flexibility of French regulation, and can probably be explained by the desire of French firms to attract more investment on international capital markets. However, this trend shows no signs of a clear orientation towards any particular accounting model (IAS, U.S. or U.K.). The behaviour of the French firms observed in our study can be considered as a kind of 'shopping around' for accounting practices.
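The method named in the abstract, a logistic regression on a binary choice of presentation format, can be illustrated with a minimal sketch. This is not the authors' model; the toy features (an intercept and a hypothetical "international exposure" score) and the data are invented for illustration:

```python
import math

def predict(x, w):
    """P(group adopts an 'alternative' format | features x), logistic model."""
    z = sum(wi * xi for wi, xi in zip(w, x))
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=500):
    """Fit the weights by plain gradient ascent on the log-likelihood."""
    w = [0.0] * len(X[0])
    for _ in range(epochs):
        for x, t in zip(X, y):
            p = predict(x, w)
            for j in range(len(w)):
                w[j] += lr * (t - p) * x[j]
    return w

# toy data: [intercept, international-exposure score]; y = 1 means the
# group adopts an 'alternative' (non-traditional) statement format
X = [[1, 0.2], [1, 0.4], [1, 1.5], [1, 2.0], [1, 2.5], [1, 0.1]]
y = [0, 0, 1, 1, 1, 0]
w = fit_logistic(X, y)
```

On this toy sample the fitted slope on the exposure score is positive, so the predicted adoption probability rises with exposure, mirroring the capital-markets explanation suggested in the abstract.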

A derivation of expected utility maximization in the context of a game

I. GILBOA, D. Schmeidler

Games and Economic Behavior

July 2003, vol. 44, no. 1, pp.172-182

Departments: Economie et Sciences de la décision, GREGHEC (CNRS)

A decision maker faces a decision problem, or a game against nature. For each probability distribution over the states of the world (nature's strategies), she has a weak order over her acts (pure strategies). We formulate conditions on these weak orders guaranteeing that they can be jointly represented by expected utility maximization with respect to an almost-unique state-dependent utility, that is, a matrix assigning real numbers to act-state pairs. As opposed to a utility function that is derived in another context, the utility matrix derived in the game will incorporate all psychological or sociological determinants of well-being that result from the very fact that the outcomes are obtained in a given game.
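In symbols, the representation described above says: for each probability distribution $p$ over the set of states $S$, the weak order $\succsim_p$ over acts is represented by expected utility with a single state-dependent utility matrix $u$:

```latex
a \succsim_p b
\quad\Longleftrightarrow\quad
\sum_{s \in S} p(s)\, u(a,s) \;\ge\; \sum_{s \in S} p(s)\, u(b,s),
```

where $u(a,s)$ is the (almost-unique) real number the matrix assigns to the act-state pair $(a,s)$.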

An Application of Ramsey Theorem to Stopping Games

E. Shmaya, E. Solan, N. VIEILLE

Games and Economic Behavior

February 2003, vol. 42, no. 2, pp.300-306

Departments: Economie et Sciences de la décision, GREGHEC (CNRS)

We prove that every two-player nonzero-sum deterministic stopping game with uniformly bounded payoffs admits an ε-equilibrium, for every ε > 0. The proof uses Ramsey's theorem, which states that for every coloring of the edges of a complete infinite graph by finitely many colors there is a complete infinite subgraph which is monochromatic.
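For reference, the infinite Ramsey theorem invoked in the proof can be stated as:

```latex
\forall\, k \ge 1,\;
\forall\, c : [\mathbb{N}]^{2} \to \{1, \dots, k\},\;
\exists\, S \subseteq \mathbb{N} \text{ infinite such that }
c \text{ is constant on } [S]^{2},
```

i.e. every finite coloring of the edges of the complete graph on $\mathbb{N}$ admits an infinite monochromatic complete subgraph.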

Assessing the Commercial Viability of New Ventures


Canadian Investment Review

2003, vol. 16, no. 1, pp.18-25

Departments: Economie et Sciences de la décision, GREGHEC (CNRS)

Keywords: Commercial, Viability, New Ventures, Venture capital

Combining Gene Expression and Molecular Marker Information for Mapping Complex Trait Genes: A Simulation Study

M. Pérez-Enciso, D. Gianola, M. TENENHAUS, M. A. Toro


2003, vol. 164, pp.1597-1606

Departments: Economie et Sciences de la décision

A method for mapping complex trait genes using cDNA microarray and molecular marker data jointly is presented and illustrated via simulation. We introduce a novel approach for simulating phenotypes and genotypes conditionally on real, publicly available, microarray data. The model assumes an underlying continuous latent variable (liability) related to some measured cDNA expression levels. Partial least-squares logistic regression is used to estimate the liability under several scenarios where the level of gene interaction, the gene effect, and the number of cDNA levels affecting liability are varied. The results suggest that: (1) the usefulness of microarray data for gene mapping increases when both the number of cDNA levels in the underlying liability and the QTL effect decrease and when genes are coexpressed; (2) the correlation between estimated and true liability is large, at least under our simulation settings; (3) it is unlikely that cDNA clones identified as significant with partial least squares (or with some other technique) are the true responsible cDNAs, especially as the number of clones in the liability increases; (4) the number of putatively significant cDNA levels increases critically if cDNAs are coexpressed in a cluster (however, the proportion of true causal cDNAs within the significant ones is similar to that in a no-coexpression scenario); and (5) data reduction is needed to smooth out the variability encountered in expression levels when these are analyzed individually.
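The simulation scheme described above, a continuous liability driven by a few expression levels and thresholded into a binary phenotype, can be sketched as follows. The dimensions, effect sizes, and Gaussian expression values are invented for illustration (the paper conditions on real microarray data):

```python
import random

def simulate_phenotypes(expr, effects, threshold=0.0):
    """Threshold model: liability = weighted sum of a few cDNA expression
    levels plus residual noise; phenotype is 1 when liability > threshold."""
    phenotypes = []
    for levels in expr:
        liability = sum(b * x for b, x in zip(effects, levels))
        liability += random.gauss(0.0, 1.0)  # residual (environmental) noise
        phenotypes.append(1 if liability > threshold else 0)
    return phenotypes

random.seed(1)
# 200 individuals, 3 causal cDNA expression levels with hypothetical effects
expr = [[random.gauss(0.0, 1.0) for _ in range(3)] for _ in range(200)]
effects = [1.5, -1.0, 0.5]
phenotypes = simulate_phenotypes(expr, effects)
```

The liability itself is latent: only the binary phenotypes and the expression levels would be passed on to the mapping step (partial least-squares logistic regression in the paper).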

Constrained Egalitarianism in a Simple Redistributive Model

J. Jaffray, P. MONGIN

Theory and Decision

2003, vol. 54, pp.33-56

Departments: Economie et Sciences de la décision, GREGHEC (CNRS)

The paper extends a result in the theory of constrained egalitarianism initiated by Dutta and Ray (1989), by relying on the concept of proportionate rather than absolute equality. We apply this framework to redistributive systems in which what the individuals get depends on what they receive or pay qua members of generally overlapping groups. We solve the constrained equalization problem for this class of models. The paper ends by comparing our solution with the alternative solution based on the Shapley value, which has been recommended in some distributive applications.

Deterministic Multi-Player Dynkin Games

E. Solan, N. VIEILLE

Journal of Mathematical Economics

November 2003, vol. 39, no. 8, pp.911-930

Departments: Economie et Sciences de la décision, GREGHEC (CNRS)

A multi-player Dynkin game is a sequential game in which at every stage one of the players is chosen, and that player can decide whether to continue the game or to stop it, in which case all players receive some terminal payoff. We study a variant of this model, where the order by which players are chosen is deterministic, and the probability that the game terminates once the chosen player decides to stop may be strictly less than 1. We prove that a subgame-perfect ε-equilibrium in Markovian strategies exists. If the game is not degenerate, this ε-equilibrium is actually in pure strategies.
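A minimal sketch of the payoff mechanics in this variant, under simplifying assumptions that are not taken from the paper (a cyclic choice order, a stationary termination probability, fixed pure Markovian strategies, and stage-indexed terminal payoffs):

```python
def dynkin_payoff(order, strategies, payoffs, p_term, horizon=10_000):
    """Expected payoffs when a chosen player's decision to stop only
    terminates the game with probability p_term.
    order:      deterministic (cyclic) order of chosen players
    strategies: strategies[i](n) is True if player i stops at stage n
    payoffs:    payoffs[n % len(payoffs)] is the terminal payoff vector
                if the game actually stops at stage n
    """
    alive = 1.0                        # prob. the game is still running
    expected = [0.0] * len(strategies)
    for n in range(horizon):
        i = order[n % len(order)]
        if strategies[i](n):           # chosen player asks to stop
            prob_stop_here = alive * p_term
            for j, g in enumerate(payoffs[n % len(payoffs)]):
                expected[j] += prob_stop_here * g
            alive *= 1.0 - p_term      # the attempt may fail; play goes on
    return expected

# player 0 always stops, player 1 never does; termination is certain,
# so the game ends at stage 0 with payoff (1, -1)
pay = dynkin_payoff([0, 1], [lambda n: True, lambda n: False],
                    [(1.0, -1.0)], 1.0)
```

When `p_term < 1`, the stopping player's intention propagates only partially, which is exactly what makes equilibrium existence nontrivial in this variant.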

How to Deal with Missing Categorical Data: Test of a Simple Bayesian Method

T. B. ASTEBRO, G. Chen

Organizational Research Methods

July 2003, vol. 6, no. 3, pp.309-327

Departments: Economie et Sciences de la décision, GREGHEC (CNRS)

Keywords: Missing data, Categorical variable, Bayesian, Imputation

The authors analyze the efficiency of six missing data techniques for categorical item nonresponse under the assumption that data are missing at random or missing completely at random. By efficiency, the authors mean a procedure that produces an unbiased estimate of true sample properties that is also easy to implement. The investigated techniques include listwise deletion, mode substitution, random imputation, two regression imputations, and a Bayesian model-based procedure. The authors analyze efficiency under six experimental conditions for a survey-based data set. They find that listwise deletion is efficient for the data analyzed. If data loss due to listwise deletion is an issue, the analysis points to the Bayesian method. Regression imputation is also efficient, but the result is conditioned on the specific data structure and may not hold in general. Additional problems arise when using regression imputation, making it less appropriate.
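Two of the simpler techniques compared above, listwise deletion and mode substitution, can be sketched on a toy categorical dataset (an illustration, not the authors' code):

```python
from collections import Counter

def listwise_deletion(rows):
    """Drop every row that contains a missing value (None)."""
    return [r for r in rows if None not in r]

def mode_substitution(rows):
    """Replace each missing value with the most frequent observed
    category in its column."""
    columns = list(zip(*rows))
    modes = [Counter(v for v in col if v is not None).most_common(1)[0][0]
             for col in columns]
    return [[v if v is not None else modes[j] for j, v in enumerate(row)]
            for row in rows]

# toy survey responses with item nonresponse (None)
rows = [["yes", "A"], ["no", None], ["yes", "A"], [None, "B"], ["yes", "A"]]
complete_cases = listwise_deletion(rows)   # keeps the 3 fully observed rows
imputed = mode_substitution(rows)          # keeps all 5, gaps filled by modes
```

Listwise deletion discards two of the five rows here; mode substitution preserves the sample size at the cost of biasing category frequencies toward the mode, which is the kind of trade-off the experimental conditions in the paper quantify.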

Inductive Inference: An Axiomatic Approach

I. GILBOA, D. Schmeidler


2003, vol. 71, pp.1-26

Departments: Economie et Sciences de la décision, GREGHEC (CNRS)

L'axiomatisation et les théories économiques


Revue Economique

2003, vol. 54, pp.99-138

Departments: Economie et Sciences de la décision, GREGHEC (CNRS)

This paper re-examines the "axiomatic method" before showing how it is applied in general equilibrium theory by Debreu, in decision theory by von Neumann and Morgenstern, and in normative theory by Arrow, Nash and their successors. It brings out what distinguishes axiomatization from other procedures of mathematical formalization: chiefly, the notion of a formal system and the regulated interplay of a syntax and a semantics. It defends the idea that logical calculi constitute a model which every axiomatization can at least imitate by analogy. The economists' axiomatizations belong to the "set-theoretic genre" represented elsewhere by Bourbaki, while displaying a strong specificity: their semantics is little formalized and often fixed once and for all, and their formal systems are "theorematic" rather than "definitional", following a new distinction introduced in this paper. The paper also aims to challenge the notion of axiomatics adopted by normative economics.