Cahiers de recherche



Departments: Comptabilité et Contrôle de Gestion, GREGHEC (CNRS)

Drawing on a framework of deinstitutionalization, this study explores the abandonment of budgeting through a multiple-case study of four companies. The findings illustrate how a number of antecedents to deinstitutionalization acted in each setting and show that abandonment was achieved only through the skillful agency of dominant insiders, who constructed the need for change and managed the change process. Notably, two of the four companies reversed the deinstitutionalization and re-introduced traditional budgeting. This is explained by highlighting the role of remnants of formerly institutionalized practices and by demonstrating the importance of the administrative and cultural controls that can support the abandonment of a central control practice in the first place. Overall, this research extends previous studies of deinstitutionalization by analyzing a taken-for-granted practice at the micro level and by giving a more agentic account of its processes.

Keywords: deinstitutionalization; budgeting; budget abandonment; Beyond Budgeting


Departments: Comptabilité et Contrôle de Gestion, GREGHEC (CNRS), Finance

We examine the effect of dispersion in a firm's existing debt on the contract terms of newly issued loans. We find that new loans of firms whose existing debt is more dispersed among different types of lenders include more covenants and default clauses and are more likely to be collateralized. These findings provide evidence that new lenders seek protection from potential conflicts among different types of creditors by including additional contract terms in the loan agreements. Consistent with the notion that conflicts between different types of lenders matter most in case of default and are aggravated by information asymmetries, we further find that the effect of creditor dispersion is stronger for firms with high default risk and more pronounced for firms with low accounting quality. Finally, we provide evidence for a similar effect of creditor dispersion on the contract terms of newly issued bonds.
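
The abstract does not spell out how creditor dispersion is measured; as an illustration only (our formula, not necessarily the paper's variable), dispersion is often operationalized as one minus a Herfindahl index of the firm's debt shares across debt types:

\[
\mathrm{Dispersion}_i \;=\; 1 - \sum_{j=1}^{J} s_{ij}^{2},
\qquad
s_{ij} \;=\; \frac{\text{outstanding debt of type } j \text{ for firm } i}{\text{total outstanding debt of firm } i},
\]

which is zero when all debt is of a single type and approaches \(1 - 1/J\) when debt is spread evenly across all \(J\) types.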

Keywords: Creditor Dispersion, Debt Contract Terms, Debt Capital Structure


Departments: Economie et Sciences de la décision, GREGHEC (CNRS)

We study portfolio allocation in the international financial market when investors exhibit ambiguity aversion towards assets issued in foreign locations. Entrepreneurs located in each country have access to a risky technology and want to attract capital. We characterize the contracts issued by firms in such an environment. Increases in the variance of the risky production process cause firms to increase the variable payment (equity) offered to investors. Increases in investor ambiguity, on the other hand, lead to less risk-sharing. Entrepreneurs located in countries with low levels of domestic wealth issue assets with a higher fixed payment and a lower risky payment. As a result, they are exposed to higher volatility per unit of consumption, as they finance themselves relatively more through debt than equity. An increase in ambiguity or ambiguity aversion, such as that which characterizes crises, may explain the flight of capital to capital-abundant countries, sometimes dubbed a “flight to quality”.
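
As a stylized illustration of the mechanism (our notation, assuming a max-min representation of ambiguity aversion in the spirit of Gilboa and Schmeidler; the paper's model may differ), a foreign investor values a contract paying a fixed amount \(D\) plus an equity share \(\alpha\) of output \(\tilde{y}\) whose mean is ambiguous as

\[
V(D,\alpha) \;=\; \min_{\mu \in [\underline{\mu},\, \overline{\mu}]} \; \mathbb{E}_{\mu}\!\left[\, u\!\left(D + \alpha \tilde{y}\right) \right].
\]

A wider set of priors \([\underline{\mu}, \overline{\mu}]\) (more ambiguity) lowers the value the investor attaches to the variable payment relative to the fixed one, pushing the contract toward debt and away from equity, i.e., toward less risk-sharing.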

Keywords: ambiguity aversion, risk aversion, debt/equity choice, international capital flows, international insurance, home bias


Departments: Stratégie et Politique d’Entreprise, GREGHEC (CNRS)

A central theme of economic sociology has been to highlight the complexity and diversity of real-world markets, but many network models of economic social structure ignore this feature and rely instead on stylized one-dimensional characterizations. Here, we return to the basic insight of structural diversity in economic sociology. Using the Indian interorganizational ownership network as our case, we discover a composite – or “hybrid” – model of economic networks that combines elements of prior stylized models. The network contains a disconnected periphery conforming closely to a “transactional” model; a semi-periphery characterized by small, dense clusters with sporadic links, as predicted in “small world” models; and finally a nested core composed of clusters connected via multiple independent paths. We then show how a firm’s position within the meso-level structure is associated with demographic features such as age and industry, and with differences in the extent to which firms engage in multiplex and high-value exchanges.
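
A minimal sketch of how such a periphery/semi-periphery/core partition might be operationalized with standard graph tools (the data, thresholds, and criteria below are our illustrative assumptions, not the authors' procedure):

```python
# Illustrative three-way partition of an undirected network into periphery,
# semi-periphery, and nested core; thresholds are arbitrary choices for the sketch.
import networkx as nx

G = nx.les_miserables_graph()                     # stand-in for an ownership network

giant = max(nx.connected_components(G), key=len)  # largest connected component
coreness = nx.core_number(G)                      # k-core index of each node

periphery = set(G) - giant                        # disconnected fragments
core = {v for v in giant if coreness[v] >= 4}     # densely nested part
semi_periphery = giant - core                     # clustered but sparsely linked

print(f"core={len(core)}  semi-periphery={len(semi_periphery)}  periphery={len(periphery)}")
print("mean clustering in semi-periphery:",
      nx.average_clustering(G.subgraph(semi_periphery)))
```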

Keywords: Network, Organization, Structure


Departments: Finance, GREGHEC (CNRS)

We derive several popular systemic risk measures in a common framework and show that they can be expressed as transformations of market risk measures (e.g., beta). We also derive conditions under which the different measures lead to similar rankings of systemically important financial institutions (SIFIs). In an empirical analysis of US financial institutions, we show that (1) different systemic risk measures identify different SIFIs and that (2) firm rankings based on systemic risk estimates mirror rankings obtained by sorting firms on market risk or liabilities. One-factor linear models explain most of the variability of the systemic risk estimates, which indicates that systemic risk measures fall short of capturing the multiple facets of systemic risk.
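
As a minimal sketch of how a systemic risk measure relates to a market risk measure (simulated returns and our own parameter choices, not the paper's estimation procedure), marginal expected shortfall (MES) can be computed alongside beta and the two rankings compared:

```python
# Sketch: compute beta and marginal expected shortfall (MES) on simulated returns
# and compare the firm rankings they produce. All data here are illustrative.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
T, N = 2500, 50                                       # trading days, firms
market = rng.normal(0.0, 0.01, T)                     # market return series
true_beta = rng.uniform(0.5, 2.0, N)
firms = true_beta * market[:, None] + rng.normal(0.0, 0.015, (T, N))

# beta_i = cov(r_i, r_m) / var(r_m)
beta_hat = np.cov(firms.T, market)[-1, :-1] / market.var(ddof=1)

# MES_i = -E[r_i | r_m in its worst 5% of days] (Acharya et al.-style definition)
bad_days = market <= np.quantile(market, 0.05)
mes = -firms[bad_days].mean(axis=0)

rho, _ = spearmanr(beta_hat, mes)
print("rank correlation between beta and MES:", rho)
```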

Keywords: Banking Regulation, Systemically Important Financial Firms, Marginal Expected Shortfall, SRISK, CoVaR, Systemic vs. Systematic Risk


Departments: Marketing, GREGHEC (CNRS)

This web appendix has three main purposes. First, we provide a largely self-contained technical appendix that describes the estimation algorithm for the proposed attribute model using Markov Chain Monte Carlo (MCMC) techniques (sections A1 and A2). The reversible jump (RJ) algorithm (Green, 1995) is also described in detail for the (vector) finite mixture regression model. We first discuss the priors and give a general description of the reversible jump algorithm; we then present details of the estimation scheme for the standard finite mixture regression model and subsequently extend these details to the attribute model. As we show, the algorithms and equations for the attribute model are similar to those for the vector model thanks to a simple transformation; this similarity makes coding the attribute model straightforward once computer code for the vector model (with reversible jump steps) has been developed. We also discuss how the algorithms should be modified to estimate a standard choice model (e.g., a probit model). Second, in section A3 we briefly discuss the benchmark models for heterogeneity considered in the main document and their implementation, including the mixture-of-normals model (Allenby et al. 1998, Lenk and DeSarbo 2000) and Dirichlet process priors (Ansari and Mela 2003, Kim et al. 2004). Third, in section A4 we present the results of an additional simulation experiment in which the traditional (vector) finite mixture model is used to generate the data, augmenting the Monte Carlo experiment in the main document.
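
For reference, the model-jumping moves described here rely on the generic reversible jump acceptance probability of Green (1995), written below in generic notation (not the appendix's specific parameterization) for a move from model \(k\) with parameters \(\theta_k\) to model \(k'\) via auxiliary variables \(u \sim q_{k \to k'}\) and the dimension-matching map \((\theta_{k'}, u') = g(\theta_k, u)\):

\[
\alpha
\;=\;
\min\!\left\{1,\;
\frac{p(y \mid \theta_{k'}, k')\, p(\theta_{k'} \mid k')\, p(k')\, r_{k' \to k}\, q_{k' \to k}(u')}
     {p(y \mid \theta_{k}, k)\, p(\theta_{k} \mid k)\, p(k)\, r_{k \to k'}\, q_{k \to k'}(u)}
\;\left|\frac{\partial(\theta_{k'}, u')}{\partial(\theta_k, u)}\right|
\right\},
\]

where \(r\) denotes the probability of proposing the given move type and the last factor is the Jacobian of the dimension-matching transformation.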

Keywords: heterogeneity, mixture models, hierarchical Bayes, conjoint analysis, reversible jump MCMC, segmentation


Departments: Marketing, GREGHEC (CNRS)

Modeling consumer heterogeneity helps practitioners understand market structures and devise effective marketing strategies. In this research we study finite mixture specifications for modeling consumer heterogeneity in which each regression coefficient has its own finite mixture, that is, an attribute finite mixture model. An important challenge of such an approach lies in its estimation. We propose a Bayesian estimation approach, based on recent advances in reversible jump Markov Chain Monte Carlo (MCMC) methods, that can estimate the parameters of the attribute-based finite mixture model while treating the number of components of each finite mixture as a discrete random variable. The attribute specification has several advantages over traditional, vector-based finite mixture specifications; in particular, it aggregates information more appropriately than the vector specification, which facilitates estimation. In an extensive simulation study and an empirical application, we show that the attribute model can recover complex heterogeneity structures, dominating traditional (vector) finite mixture regression models and standing as a strong contender to mixture-of-normals models for modeling heterogeneity.
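
In illustrative notation (ours, following the description above rather than the paper's exact formulation), the difference between the two specifications for respondent \(i\)'s coefficient vector \(\beta_i = (\beta_{i1}, \dots, \beta_{iK})\) is

\[
\text{vector mixture:}\quad \beta_{i} \sim \sum_{s=1}^{S} \pi_{s}\, \mathcal{N}_{K}(\mu_{s}, \Sigma_{s}),
\qquad
\text{attribute mixture:}\quad \beta_{ik} \sim \sum_{s=1}^{S_k} \pi_{ks}\, \mathcal{N}(\mu_{ks}, \sigma_{ks}^{2}), \quad k = 1, \dots, K,
\]

so the vector model assigns the whole coefficient vector to one segment, whereas the attribute model gives each coefficient its own mixture, with each number of components \(S_k\) treated as a discrete random variable and sampled via reversible jump MCMC.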

Keywords: Segmentation, Mixture Models, Hierarchical Bayes, Conjoint Analysis, Reversible Jump MCMC


Departments: Marketing, GREGHEC (CNRS)

The structure of a social network, characterized by the connections between its members, can significantly affect how a marketing process plays out on that network. Many social networks of relevance to marketers are large, complex, or hidden, which makes it prohibitively expensive to work with the entire network in marketing applications. Instead, marketers need to work with a sample (i.e., a subgraph) of the population network. In this paper we evaluate the efficacy of nine different sampling methods in recovering the underlying structural characteristics of population networks. In particular, we focus on the recovery of four characteristics of importance to marketers, namely, the distributions of degree, clustering coefficient, betweenness centrality, and closeness centrality, each of which is relevant for certain marketing processes. Via extensive simulations, we find that sampling methods differ substantially in their ability to recover population network characteristics. Traditional sampling procedures, such as random node sampling, result in poor subgraphs. When the focus of a marketing research project is on understanding local network effects (e.g., peer influence), forest-fire sampling with a medium burn rate performs best: it is most effective at recovering the distributions of degree and clustering coefficient. When the focus is on broader network effects (e.g., speed of diffusion or the “multiplier” effects of network seeding), random-walk sampling (i.e., forest-fire sampling with a low burn rate) performs best: it is most effective at recovering the distributions of betweenness and closeness centrality. Also of great relevance for marketers, sample size has only a minimal impact on sampling performance unless the sample is very small relative to the population size. We validate our findings on four different networks, including a Facebook network and a co-authorship network, and conclude with recommendations for practice.
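
A minimal sketch of forest-fire sampling and of comparing the sampled and population degree distributions (the burn-rate parameter, stand-in network, and comparison statistic are our illustrative choices, not the paper's simulation design):

```python
# Sketch of forest-fire graph sampling: burn outward from random seeds, keeping
# each unvisited neighbor with probability p_burn, until target_n nodes are sampled.
import random
from collections import deque

import networkx as nx
from scipy.stats import ks_2samp


def forest_fire_sample(G, target_n, p_burn=0.4, seed=None):
    rng = random.Random(seed)
    sampled = set()
    nodes = list(G.nodes())
    while len(sampled) < target_n:
        start = rng.choice(nodes)          # (re)ignite at a random node
        sampled.add(start)
        queue = deque([start])
        while queue and len(sampled) < target_n:
            v = queue.popleft()
            for u in G.neighbors(v):
                if u not in sampled and rng.random() < p_burn:
                    sampled.add(u)
                    queue.append(u)
                    if len(sampled) >= target_n:
                        break
    return G.subgraph(sampled).copy()


G = nx.barabasi_albert_graph(5000, m=3, seed=1)   # stand-in population network
S = forest_fire_sample(G, target_n=500, p_burn=0.4, seed=1)

# a low KS statistic means the sampled degree distribution tracks the population's
d_pop = [d for _, d in G.degree()]
d_smp = [d for _, d in S.degree()]
print("KS statistic for degree distributions:", ks_2samp(d_pop, d_smp).statistic)
```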

Keywords: Social Networks, Word-of-Mouth Marketing, Sampling, Graph Sampling


Departments: Finance, GREGHEC (CNRS)

Decades of accumulated knowledge empower our quest to de-bias human cognition. However, I propose that improvement methods aimed at certain biases may introduce new biases because of cognitive and situational limitations: such limitations give rise to simplifying and protecting processes (SPPs), whose unthorough nature results in biases. De-biasing may target these processes but ultimately cannot always resolve the underlying cognitive and situational limitations. Consequently, de-biasing runs the risk of forcing either a switch in SPPs or the introduction of new SPPs, thereby exposing us to the threat of new biases. In this paper, I analyse the model of simplifying and protecting processes and discuss promising directions for de-biasing. The model of SPPs is consistent with the extant literature on the underlying causes of cognitive bias and on the methods of de-biasing, but extends it by synthesizing a coherent theory. It contributes to the judgement and decision making literature that seeks to answer four questions: (a) What mechanism underlies biases? (b) How can we de-bias? (c) Why does de-biasing have limitations? (d) Where should de-biasing efforts be channelled so as to reduce the adverse effects of biases?

Keywords: cognitive bias, de-bias, simplifying and protecting processes


Departments: Comptabilité et Contrôle de Gestion, GREGHEC (CNRS)

While it is generally maintained that earnings management can occur to inform as well as to mislead, evidence that earnings management informs has been scarce, and evidence that credibility increases with signal costliness nonexistent. We provide evidence that firms use discretion over financial reporting and real activities to report higher earnings on lower sales from continuing operations. Although these firms defy gravity artificially, we show that this upward earnings management informs rather than misleads investors. We find that firms that defy gravity (1) report higher future earnings and cash flows, (2) earn higher one-year-ahead abnormal returns, (3) enjoy a positive market reaction to the defying-gravity earnings announcement, and (4) have CEOs who are more likely to be net buyers in the year preceding the defying-gravity event. We also show that the upward earnings management signal is more credible when it is more costly to achieve: defying-gravity firms perform better when they bear the opportunity loss of not taking a big bath in times of crisis (years in which poorer performance can be blamed on economy-wide shocks) and when they have fewer degrees of freedom with which to report higher earnings.
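
As a rough illustration of the kind of screen the abstract implies (simplified, hypothetical variable definitions; not the paper's construction), a "defying gravity" firm-year can be flagged when earnings from continuing operations rise while sales fall:

```python
# Sketch: flag "defying gravity" firm-years (earnings up, sales down year over year).
# Column names and the screen itself are illustrative, not the paper's definitions.
import pandas as pd

panel = pd.DataFrame({
    "firm":     ["A", "A", "A", "B", "B", "B"],
    "year":     [2018, 2019, 2020, 2018, 2019, 2020],
    "sales":    [100, 95, 90, 200, 210, 190],
    "earnings": [10, 12, 8, 20, 21, 19],
}).sort_values(["firm", "year"])

g = panel.groupby("firm")
panel["defies_gravity"] = (g["earnings"].diff() > 0) & (g["sales"].diff() < 0)
print(panel)
```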

Keywords: Earnings Management, Signaling, Informativeness, Opportunism, Credibility

