
2015 Working Papers


Hard copy

To request a free hard copy of a paper, please contact Margaret Nash quoting the paper number.

1104 - An argument for positive nominal interest

Gaetano Bloise and Herakles Polemarchakis

In a dynamic economy, money provides liquidity as a medium of exchange. A central bank that sets the nominal rate of interest and distributes its profit to shareholders as dividends is traded in the asset market. Nominal rates of interest that tend to zero, but do not vanish, eliminate equilibrium allocations that do not converge to a Pareto optimal allocation.

1103 - Suboptimality with land

Nikos Kokonas and Herakles Polemarchakis

In a stochastic economy of overlapping generations subject to uninsurable risks, competitive allocations need not be constrained optimal. This is the case even in the presence of long-lived assets and no short sales.

1102 - Short sales, destruction of resources, welfare

Nikos Kokonas and Herakles Polemarchakis

A reduction in the output of productive assets (trees) in some states of the world can expand the span of payoffs of assets; and improved risk sharing may compensate for the loss of output and support a Pareto superior allocation. Surprisingly, if short sales of assets are not allowed, improved risk sharing that results from the destruction of output does not suffice to induce a Pareto superior allocation.

1100 - Challenges of Change: An Experiment Training Women to Manage in the Bangladeshi Garment Sector

Rocco Macchiavello, Andreas Menzel, Atonu Rabbani and Christopher Woodruff

Large private firms are still relatively rare in low-income countries, and we know little about how entry-level managers in these firms are selected. We examine a context in which nearly 80 percent of production line workers are female, but 95 percent of supervisors are male. We evaluate the effectiveness of female supervisors by implementing a training program for selected production line workers. Prior to the training, we find that workers at all levels of the factory believe males are more effective supervisors than females. Careful skills diagnostics indicate that those perceptions do not always match reality. When the trainees are deployed in supervisory roles, production line workers initially judge females to be significantly less effective, and there is some evidence that the lines on which they work underperform. But after around four months of exposure, both perceptions and performance of female supervisors catch up to those of males. We document evidence that the exposure to female supervisors changes the expectations of male production workers with regard to promotion and expected tenure in the factory.

1099 - A Vision of the Growth Process in a Technologically Progressive Economy: the United States, 1899-1941.

Gerben Bakker, Nicholas Crafts and Pieter Woltjer

We develop new aggregate and sectoral Total Factor Productivity (TFP) estimates for the United States between 1899 and 1941 through better coverage of sectors and better measured labor quality, and show TFP-growth was lower than previously thought, broadly based across sectors, strongly variant intertemporally, and consistent with many diverse sources of innovation. We then test and reject three prominent claims. First, the 1930s did not have the highest TFP-growth of the twentieth century. Second, TFP-growth was not predominantly caused by four leading sectors. Third, TFP-growth was not caused by a ‘yeast process’ originating in a dominant technology such as electricity.

1098 - Shocking language: Understanding the macroeconomic effects of central bank communication

Stephen Hansen and Michael McMahon

We explore how the multi-dimensional information released by the FOMC affects both market and real economic variables. Using tools from computational linguistics, we measure the information released by the FOMC on the state of economic conditions, as well as the guidance the FOMC provides about future monetary policy decisions. Employing these measures within a FAVAR framework, we find that shocks to forward guidance are more important than the FOMC communication of current economic conditions in terms of their effects on market and real variables. Nonetheless, neither type of communication has particularly strong effects on real economic variables.

1097 - Voting in Legislative Elections Under Plurality Rule

Niall Hughes

Models of single district plurality elections show that with three parties anything can happen - extreme policies can win regardless of voter preferences. I show that when single district elections are used to fill a legislature we get back to a world where the median voter matters. An extreme policy will generally only come about if it is preferred to a more moderate policy by the median voter in a majority of districts. The mere existence of a centrist party can lead to moderate outcomes even if the party itself wins few seats. Furthermore, I show that while standard single district elections always have misaligned voting, i.e. some voters do not vote for their preferred choice, equilibria of the legislative election exist with no misaligned voting in any district. Finally, I show that when parties are impatient, a fixed rule on how legislative bargaining occurs will lead to more coalition governments, while uncertainty will favour single party governments.

1096 - Short-Term Momentum and Long-Term Reversal of Returns under Limited Enforceability and Belief Heterogeneity

Pablo F. Beker & Emilio Espino

We evaluate the ability of the Lucas tree and the Alvarez-Jermann models, both with homogeneous as well as heterogeneous beliefs, to generate a time series of excess returns that displays both short-term momentum and long-term reversal, i.e., positive autocorrelation in the short-run and negative autocorrelation in the long-run. Our analysis is based on a methodological contribution that consists of (i) a recursive characterisation of the set of constrained Pareto optimal allocations in economies with limited enforceability and belief heterogeneity and (ii) an alternative decentralisation of these allocations as competitive equilibria with endogenous borrowing constraints. We calibrate the model to U.S. data as in Alvarez and Jermann. We find that only the Alvarez-Jermann model with heterogeneous beliefs delivers autocorrelations that not only have the correct sign but are also of magnitude similar to the U.S. data.

1095 - Crowd Learning without Herding: A Mechanism Design Approach

Jacob Glazer, Ilan Kremer & Motty Perry

Crowdfunding, Internet websites, and health care are only a few examples of markets in which agents make decisions not only on the basis of their own investigations and knowledge, but also on the basis of information from a "central planner" about other agents’ actions. While such reciprocal learning can be welfare-improving, it may reduce agents’ incentives to conduct their own investigations, and may lead to harmful cascades. We study the planner’s optimal policy regarding when to provide information and how much information to provide. We show that the optimum policy involves a delicate balance of hiding and revealing information.

1094 - Quantitative Easing in an Open Economy: Prices, Exchange Rates and Risk Premia

M.Udara Peiris & Herakles Polemarchakis

Explicit targets for the composition of assets traded by governments are necessary for fiscal-monetary policy to determine the stochastic paths of inflation or exchange rates; this is the case even if fiscal policy is non-Ricardian. Targets obtain with the traditional conduct of monetary policy and Credit Easing, but not with unconventional policy and Quantitative Easing. The composition of the portfolios traded by monetary-fiscal authorities determines premia in asset and currency markets.

1093 - How To Count Citations If You Must

Motty Perry & Philip J. Reny

Citation indices are regularly used to inform critical decisions about promotion, tenure, and the allocation of billions of research dollars. Nevertheless, most indices (e.g., the h-index) are motivated by intuition and rules of thumb, resulting in undesirable conclusions. In contrast, five natural properties lead us to a unique new index, the Euclidean index, that avoids several shortcomings of the h-index and its successors. The Euclidean index is simply the Euclidean length of an individual’s citation list. Two empirical tests suggest that the Euclidean index outperforms the h-index in practice.
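
The index itself is elementary to compute. The following Python is a minimal sketch (not the authors' code), assuming a citation record given as a plain list of per-paper citation counts:

import math

def euclidean_index(citations):
    # Euclidean length of the citation list: sqrt(c1^2 + c2^2 + ... + cn^2).
    return math.sqrt(sum(c * c for c in citations))

def h_index(citations):
    # Largest h such that at least h papers have at least h citations each.
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

# Two hypothetical records with the same h-index but different Euclidean indices.
print(euclidean_index([10, 10, 10]), h_index([10, 10, 10]))  # ~17.3, 3
print(euclidean_index([50, 10, 10]), h_index([50, 10, 10]))  # ~52.0, 3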

1092 - Why Sex? and Why Only in Pairs?

Motty Perry, Philip J. Reny & Arthur J. Robson

Understanding the purpose of sex remains one of the most important unresolved problems in evolutionary biology. The difficulty is not that there are too few theories of sex, the difficulty is that there are too many and none stand out. To distinguish between theories we suggest the following question: Why are there no triparental species in which an offspring is composed of the genetic material of three individuals? A successful theory should confer an advantage to biparental sex over asexual reproduction without conferring an even greater advantage to triparental sex. We pose our question in the context of two leading theories of sex, the (deterministic) mutational hypothesis that sex reduces the rate at which harmful mutations accumulate, and the red queen hypothesis that sex reduces the impact of parasitic attack by increasing genotypic variability. We show that the mutational hypothesis fails to provide an answer to the question because it implies that triparental sex dominates biparental sex, so the latter should never be observed. In contrast, we show that the red queen hypothesis is able to explain biparental sex without conferring an even greater advantage to triparental sex.

1091 - Evidence Games: Truth and Commitment

Sergiu Hart, Ilan Kremer & Motty Perry

An evidence game is a strategic disclosure game in which an informed agent who has some pieces of verifiable evidence decides which ones to disclose to an uninformed principal who chooses a reward. The agent, regardless of his information, prefers the reward to be as high as possible. We compare the setup where the principal chooses the reward after the evidence is disclosed to the mechanism-design setup where he can commit in advance to a reward policy. The main result is that under natural conditions on the truth structure of the evidence, the two setups yield the same equilibrium outcome.

1090 - Rational Expectations and Farsighted Stability

Bhaskar Dutta & Rajiv Vohra

In the study of farsighted coalitional behavior, a central role is played by the von Neumann-Morgenstern (1944) stable set and its modification that incorporates farsightedness. Such a modification was first proposed by Harsanyi (1974) and has recently been re-formulated by Ray and Vohra (2015). The farsighted stable set is based on a notion of indirect dominance in which an outcome can be dominated by a chain of coalitional ‘moves’ in which each coalition that is involved in the sequence eventually stands to gain. However, it does not require that each coalition make a maximal move, i.e., one that is not Pareto dominated (for the members of the coalition in question) by another. Nor does it restrict coalitions to hold common expectations regarding the continuation path from every state. Consequently, when there are multiple continuation paths the farsighted stable set can yield unreasonable predictions. We resolve this difficulty by requiring all coalitions to have common rational expectations about the transition from one outcome to another. This leads to two related concepts: the rational expectations farsighted stable set (REFS) and the strong rational expectations farsighted stable set (SREFS). We apply these concepts to simple games and to pillage games to illustrate the consequences of imposing rational expectations for farsighted stability.

1089 - Perils of Quantitative Easing

Michael McMahon, Udara Peiris & Herakles Polemarchakis

Quantitative easing compromises the control of the central bank over the stochastic path of inflation.

1088 - How Transparency Kills Information Aggregation: Theory and Experiment

Sebastian Fehrler & Niall Hughes

We investigate the potential of transparency to influence committee decision-making. We present a model in which career-concerned committee members receive private information of different type-dependent accuracy, deliberate, and vote. We study three levels of transparency under which career concerns are predicted to affect behavior differently, and test the model’s key predictions in a laboratory experiment. The model’s predictions are largely borne out - transparency negatively affects information aggregation at the deliberation and voting stages, leading to sharply different committee error rates than under secrecy. This occurs despite subjects revealing more information under transparency than theory predicts.

1087 - The identification of beliefs from asset demand

Felix Kubler & Herakles Polemarchakis

The demand for assets as prices and initial wealth vary identifies beliefs and attitudes towards risk. We derive conditions that guarantee identification with no knowledge either of the cardinal utility index or of the distribution of future endowments or payoffs of assets; the argument applies even if the asset market is incomplete and demand is observed only locally.

1086 - What are monetary policy shocks?

Irfan Qureshi

I decompose deviations of the Federal funds rate from a Taylor type monetary policy rule into exogenous monetary policy shocks and a time-varying inflation target. I show that the role of exogenous shocks may be exaggerated in a fixed inflation target model, and a large fraction of business cycle fluctuations attributed to them may actually be due to changes in the inflation target. A time-varying inflation target explains approximately half of the volatility normally attributed to these deviations, and consequently more than a quarter of the fluctuations in the business cycle. This contributes approximately 39% additional inflation volatility during the Great Inflation. I show that shocks to the inflation target imply a lower sacrifice ratio compared to exogenous changes in the interest rate and therefore propose a gradual adjustment of the inflation target in order to achieve monetary policy objectives.
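
For concreteness, a decomposition of this kind can be written as a Taylor-type rule with a time-varying inflation target. The notation below is an illustrative LaTeX sketch, not necessarily the paper's exact specification:

i_t = r^{*} + \pi^{*}_{t} + \phi_{\pi}\,(\pi_t - \pi^{*}_{t}) + \phi_{y}\, y_t + \varepsilon_t

Here \pi^{*}_{t} is the (possibly time-varying) inflation target and \varepsilon_t the exogenous monetary policy shock. If \pi^{*}_{t} is held fixed, every deviation of i_t from the rule is attributed to \varepsilon_t, which is the sense in which a fixed-target model can overstate the role of exogenous shocks.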

1085 - On The Origins of Gender Human Capital Gaps: Short and Long Term Consequences of Teachers’ Stereotypical Biases

Victor Lavy and Edith Sand

In this paper, we estimate the effect of primary school teachers’ gender biases on boys’ and girls’ academic achievements during middle and high school and on the choice of advanced level courses in math and sciences during high school. For identification, we rely on the random assignments of teachers and students to classes in primary schools. Our results suggest that teachers’ biases favoring boys have an asymmetric effect by gender: a positive effect on boys’ achievements and a negative effect on girls’. Such gender biases also affect students’ enrollment in advanced level math courses in high school, positively for boys and negatively for girls. These results suggest that teachers’ biased behavior at an early stage of schooling has long-run implications for occupational choices and earnings in adulthood, because enrollment in advanced courses in math and science in high school is a prerequisite for post-secondary schooling in engineering, computer science and so on. This impact is heterogeneous, being larger for children from families where the father is more educated than the mother and larger for girls from a low socioeconomic background.

1084 - Hedging against Risk in a Heterogeneous Leveraged Market

Alexandros Karlis, Giorgos Galanis, Spyridon Terovitis and Matthew Turner

This paper focuses on the use of interest rates as a tool for hedging against the default risk of heterogeneous hedge funds (HFs) in a leveraged market. We assume that banks study the HFs’ survival statistics in order to compute default risk and hence the correct interest rate. The emergent non-trivial (heavy-tailed) statistics observed at the aggregate level prevent the accurate estimation of risk in a leveraged market with heterogeneous agents. Moreover, we show that heterogeneity leads to the clustering of default events and thus constitutes a source of systemic risk.

1083 - Heterogeneity and Clustering of Defaults

Alexandros Karlis, Giorgos Galanis, Spyridon Terovitis and Matthew Turner

This paper provides a theoretical model which highlights the role of heterogeneity of information in the emergence of temporal aggregation (clustering) of defaults in a leveraged economy. We show that the degree of heterogeneity plays a critical role in the persistence of the correlation between defaults over time. Specifically, a high degree of heterogeneity leads to an autocorrelation of the time sequence of defaults characterised by a hyperbolic decay rate, such that the autocorrelation function is not summable (infinite memory) and defaults are clustered. Conversely, if the degree of heterogeneity is reduced, the autocorrelation function decays exponentially fast, and thus the correlation between defaults is only transient (short memory). Our model is also able to reproduce stylized facts such as clustered volatility and non-Normal returns. Our findings suggest that future regulations might be directed at improving publicly available information, reducing the relative heterogeneity.
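
The two regimes described above can be stated compactly. As an illustrative sketch (the notation is ours, not taken from the paper), with \rho(k) the autocorrelation of the default time series at lag k:

\rho(k) \sim k^{-\alpha}, \; 0 < \alpha \le 1 \;\Rightarrow\; \sum_{k=1}^{\infty} \rho(k) = \infty \quad \text{(hyperbolic decay: infinite memory, clustered defaults)}

\rho(k) \sim e^{-k/\tau} \;\Rightarrow\; \sum_{k=1}^{\infty} \rho(k) < \infty \quad \text{(exponential decay: short memory, transient correlation)}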

1082 - Monetary Policy and Welfare in a Currency Union

Lucio D’Aguanno

What are the welfare gains from being in a currency union? I explore this question in the context of a dynamic stochastic general equilibrium model with monetary barriers to trade, local currency pricing and incomplete markets. The model generates a trade-off between monetary independence and monetary union. On the one hand, distinct national monetary authorities with separate currencies can address business cycles in a country-specific way, which is not possible for a single central bank. On the other hand, short-run violations of the law of one price and long-run losses of international trade occur if different currencies are adopted, due to the inertia of prices in local currencies and to the presence of trade frictions. I quantify the welfare gap between these two international monetary arrangements in consumption equivalents over the lifetime of households, and decompose it into the contributions of different frictions. I show that the welfare ordering of alternative currency systems depends crucially on the international correlation of macroeconomic shocks and on the strength of the monetary barriers affecting trade with separate currencies. I estimate the model on data from Italy, France, Germany and Spain using standard Bayesian tools, and I find that the trade-off is resolved in favour of a currency union among these countries.

1081 - The Political Fallout of Chernobyl: Evidence from West-German Elections

Christoph Koenig

1080 - Competence vs. Loyalty: Political survival and electoral fraud in Russia’s regions 2000–2012

Christoph Koenig

Election fraud is a pervasive feature of autocracies but often only serves lower-tier officials to cast signals of loyalty or competence to the central government in order to pursue their own interests. How much such personal interests matter for electoral fraud under autocracy has, however, not been studied so far. In this paper, I exploit a radical policy change in Russia which allowed the president to replace governors of the country’s 89 regions at his own will. As a result, federal elections after December 2004 were organised by two types of governors: one handpicked by the president, the other elected before the law change and re-appointed. Even though both types faced removal in case of bad results, the need to signal loyalty was much lower for the first type. In order to estimate the effect of handpicked governors on electoral fraud, I use a diff-in-diff framework over 7 federal elections between 2000 and 2012. For this time period, I use results from about 95,000 voting stations to construct a new indicator of suspicious votes for each region and election. I show that this indicator correlates strongly with incidents of reported fraud. My baseline estimates show that in territories with a handpicked governor the share of suspicious votes decreased on average by more than 10 percentage points and dropped even further if the region’s economy had done well over the past legislature. These findings suggest that governors have less need to use rigging as a signal once loyalty is assured, unless faced with circumstances raising doubts about their competence.

1079 - Loose Cannons: War Veterans and the Erosion of Democracy in Weimar Germany

Christoph Koenig

I study the effect of war participation on the rise of right-wing parties in Inter-war Germany. After the democratisation and surrender of Germany in 1918, 8m German soldiers of WWI were demobilised. I argue that defeat made veterans particularly sceptical about the new democratic state. Their return undermined support for democratic parties from the very beginning and facilitated the reversion to autocratic rule 15 years later. In order to quantify this effect, I construct the first disaggregated estimates of German WWI veterans since official army records were destroyed. I combine this data with a new panel of voting results from 1881 to 1933. Diff-in-Diff estimates show that war participation had a strong positive effect on support for the right-wing at the expense of socialist parties. A one standard deviation increase in veteran inflow shifted vote shares to the right by more than 2 percentage points. An IV strategy based on draft exemption rules substantiates my findings. The effect of veterans on voting is highly persistent and strongest in working class areas. Gains for the right-wing, however, are only observed after a period of Communist insurgencies. I provide suggestive evidence that veterans must have picked up especially anti-Communist sentiments after defeat, injected these into the working class and in this way eroded the future of the young democracy.

1078 - Quality and the Great Trade Collapse

Natalie Chen and Luciana Juvenal

We explore whether the global financial crisis has had heterogeneous effects on traded goods differentiated by quality. Combining a dataset of Argentinean firm-level destination-specific wine exports with quality ratings, we show that higher quality exports grew faster before the crisis, but this trend reversed during the recession. Quantitatively, the effect is large: up to nine percentage points difference in trade performance can be explained by the quality composition of exports. This flight from quality was triggered by a fall in aggregate demand, was more acute when households could substitute imports by domestic alternatives, and was stronger for smaller firms’ exports.

1077 - World War II: Won by American Planes and Ships, or by the Poor Bloody Russian Infantry?

Mark Harrison

This short paper reviews a new book about World War II. In most such books, what is new is not usually important, and what is important is not new. This one is an exception. How the War Was Won: Air-Sea Power and Allied Victory in World War II, by Phillips Payson O'Brien, sets out a new perspective on the war. An established view is that World War II was decided on the Eastern front, where multi-million armies struggled for supremacy on land and millions died. According to O’Brien, this neglects the fact that the preponderance of the Allied productive effort was devoted to building ships and planes for an air-sea battle that was fought to a limited extent in the East and with much higher intensity across the Western and Pacific theatres. The Allies’ air-sea power framed the outcomes of the great land campaigns by preventing Germany and Japan from fully realizing their economic potentials for war. Finding much to be said for this reinterpretation, I reconsider the true significance of the Eastern front.

1076 - If You Do Not Change Your Behaviour: Managing Threats to State Security in Lithuania under Soviet Rule

Mark Harrison

In Soviet Lithuania (and elsewhere) from the 1950s to the 1980s, the KGB applied a form of "zero-tolerance" policing, or profilaktika, to incipient threats to state security. Petty deviation from socio-political norms was regarded as a person's first step towards more serious state crimes, and as a bad example for others. As long as petty violators could be classed as confused or misled rather than motivated by anti-Soviet conviction, their mistakes would be corrected by a KGB warning or "preventive discussion." Successful prevention avoided the costly removal of the subject from society. This represented a complete contrast to the Stalin years, when prevention relied largely on eliminating the subject from society. Preventive discussions were widely practised in many different circumstances. KGB internal evaluations concluded that these discussions were extremely effective in preventing further violations. This was the front line of the Soviet police state; it was perhaps the largest programme for personally targeted behaviour modification anywhere in the world at that time outside the education sector. It was also a front line of the Cold War because the foreign adversary was seen as the most important source of misleading or confusing influence. My work in progress aims to understand the origins and operation of profilaktika, including how and to whom it was applied, how it worked on the individual subject, and its wider influence on the Soviet Union’s social and political order.

1075 - Knowing who you are: The Effect of Feedback Information on Short and Long Term Outcomes

Sofoklis Goulas and Rigissa Megalokonomou

We study the effect of disclosing relative performance information (feedback) on students' performance in high-school and on subsequent university enrolment. We exploit a large scale natural experiment where students in some cohorts are provided with their national and school relative performance. Using unique primary collected data, we find an asymmetric response to the relative performance information: high achieving students improve their last-year performance by 0.15 standard deviations whereas the last-year performance of low achieving students drops by 0.3 standard deviations. The results are more pronounced for females indicating greater sensitivity to feedback. We also document the long term effect of feedback provision: high achieving students reduce their repetition rate of the national exams, enrol into 0.15 standard deviations more popular University Departments and their expected annual earnings increase by 0.17 standard deviations. Results are opposite for low achieving students. We find suggestive evidence that feedback encourages more students from low-income neighborhoods to enrol in university and to study in higher-quality programs indicating a potential decrease in income inequality.

1074 - The Political Economy of Liberal Democracy

Sharun Mukand and Dani Rodrik

We distinguish between three sets of rights – property rights, political rights, and civil rights – and provide a taxonomy of political regimes. The distinctive nature of liberal democracy is that it protects civil rights (equality before the law for minorities) in addition to the other two. Democratic transitions are typically the product of a settlement between the elite (who care mostly about property rights) and the majority (who care mostly about political rights). Such settlements rarely produce liberal democracy, as the minority has neither the resources nor the numbers to make a contribution at the bargaining table. We develop a formal model to sharpen the contrast between electoral and liberal democracies and highlight circumstances under which liberal democracy can emerge. We discuss informally the difference between social mobilizations sparked by industrialization and decolonization. Since the latter revolve around identity cleavages rather than class cleavages, they are less conducive to liberal politics.

1073 - QE and the Bank Lending Channel in the United Kingdom

Nick Butt, Rohan Churm, Michael McMahon, Arpad Morotz and Jochen Schanz

We test whether quantitative easing (QE), in addition to boosting aggregate demand and inflation via portfolio rebalancing channels, operated through a bank lending channel (BLC) in the UK. Using Bank of England data together with an instrumental variables approach, we find no evidence of a traditional BLC associated with QE. We show, in a simple framework, that the traditional BLC is diminished if the bank receives 'flighty' deposits (deposits that are likely to quickly leave the bank). We show that QE gave rise to such flighty deposits which may explain why we find no evidence of a BLC.

1072 - Boss Competence and Worker Well-being

Benjamin Artz, Amanda H Goodall, and Andrew J Oswald

Nearly all workers have a supervisor or ‘boss’. Yet little is known about how bosses influence the quality of employees’ lives. This study is a cautious attempt to provide new formal evidence. First, it is shown that a boss’s technical competence is the single strongest predictor of a worker’s job satisfaction. Second, it is demonstrated in longitudinal data -- after controlling for fixed effects -- that even if a worker stays in the same job and workplace a rise in the competence of a supervisor is associated with an improvement in the worker’s well-being. Third, a variety of robustness checks, including tentative instrumental-variable results, are reported. These findings, which draw on US and British data, contribute to an emerging literature on the role of expert leaders in organizations. Finally, the paper discusses potential weaknesses of existing evidence and necessary future research.

1071 - National Well-being Policy and a Weighted Approach to Human Feelings

Gus O'Donnell and Andrew J Oswald

Governments are becoming interested in the concept of human well-being and how truly to assess it. As an alternative to traditional economic measures, some nations have begun to collect information on citizens’ happiness, life satisfaction, and other psychological scores. Yet how could such data actually be used? This paper is a cautious attempt to contribute to thinking on that question. It suggests a possible weighting method to calculate first-order changes in society’s well-being, discusses some of the potential principles of democratic ‘well-being policy’, and (as an illustrative example) reports data on how sub-samples of citizens believe feelings might be weighted.

1070 - Under the Radar: The Effects of Monitoring Firms on Tax Compliance

Miguel Almunia & David Lopez-Rodriguez

This paper analyzes the effects on tax compliance of monitoring the information trails generated by firms’ activities. We exploit quasi-experimental variation generated by a Large Taxpayers Unit (LTU) in Spain, which monitors firms with more than 6 million euros in reported revenue. Firms strategically bunch below this threshold in order to avoid stricter tax enforcement. This response is stronger in sectors where transactions leave more paper trail, implying that monitoring effort and the traceability of information reported by firms are complements. We calculate that there would be substantial welfare gains from extending stricter tax monitoring to smaller businesses.
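
A bunching response of this kind is usually quantified by comparing the observed density of reported revenue just below the threshold with a smooth counterfactual fitted away from it. The Python sketch below illustrates that standard excess-mass logic; it is not the authors' code, and the bin width, exclusion window and polynomial degree are illustrative assumptions.

import numpy as np

def excess_bunching(revenues_millions, threshold=6.0, bin_width=0.05,
                    window=(5.5, 6.1), poly_degree=5):
    # Histogram of reported revenue (in millions of euros) around the threshold.
    edges = np.arange(4.0, 8.0 + bin_width, bin_width)
    counts, edges = np.histogram(revenues_millions, bins=edges)
    centers = (edges[:-1] + edges[1:]) / 2
    # Fit a polynomial counterfactual to the bins outside the exclusion window.
    outside = (centers < window[0]) | (centers > window[1])
    coefs = np.polyfit(centers[outside], counts[outside], poly_degree)
    counterfactual = np.polyval(coefs, centers)
    # Excess mass: observed minus counterfactual counts just below the threshold.
    below = (centers >= window[0]) & (centers < threshold)
    return float((counts[below] - counterfactual[below]).sum())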

1069 - Sovereign Risk, Private Credit, and Stabilization Policies

Roberto Pancrazi, Hernan D. Seoane & Marija Vukotic

In this paper we examine the impact of bailout policies in small open economies that are subject to financial frictions. We extend standard endogenous default models in two ways. First, we augment the government’s choice set with a bailout option. In addition to the standard choice of defaulting or repaying the debt, a government can also choose to ask for a third-party bailout, which comes at a cost of an imposed borrowing limit. Second, we introduce financial frictions and a financial intermediation channel, which tie conditions on the private credit market to the conditions on the sovereign credit market. This link has been very strong in European countries during the recent sovereign crisis. We find that the existence of a bailout option reduces sovereign spreads and, through the described link, private credit rates as well. The implementation of a rescue program reduces output losses and increases welfare, measured in consumption equivalent terms. Moreover, bailout benefits emerge even when a government only has the option of asking for a bailout, but does not take advantage of it.

1068 - Natural Expectations and Home Equity Extraction

Roberto Pancrazi & Mario Pietrunti

In this paper we propose a novel explanation for the increase in households' leverage during the recent boom in U.S. housing prices. We use the U.S. housing market's boom-bust episode that led to the Great Recession as a case study, and we show that biased long-run expectations of both households and, especially, financial intermediaries about future housing prices had a large impact on households' indebtedness. Specifically, we first show that it is likely that financial intermediaries used forecasting models that ignored the long-run mean reversion of housing prices after short-run momentum, thus leading to an overestimation of future households' housing wealth. We frame this finding by applying the theory of natural expectations, proposed by Fuster et al. (2010), to the housing market. Then, using a tractable model of a collateralized credit market populated by households and banks, we find that: (1) mild variations in long-run forecasts of housing prices result in quantitatively considerable differences in the amount of home equity extracted during a housing price boom; (2) the equilibrium levels of debt and interest rates are particularly sensitive to financial intermediaries' naturalness; (3) home equity extraction data are better matched by models in which agents are fairly natural.
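
A stylised way to write the forecasting bias described above, in the spirit of Fuster et al. (2010) but with notation that is ours rather than the paper's:

\Delta p_t = \alpha\,\Delta p_{t-1} - \beta\,(p_{t-1} - \bar{p}) + \varepsilon_t \quad \text{(true process: short-run momentum, long-run mean reversion)}

\Delta p_t = \rho\,\Delta p_{t-1} + u_t \quad \text{(natural forecaster's model: momentum only, mean reversion ignored)}

A forecaster who uses the second equation extrapolates recent price growth and therefore overstates long-run housing wealth, and hence sustainable household debt, after a run of price increases.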

1067 - The Inequality Accelerator

Eric Mengus and Roberto Pancrazi

We show that the transition from an economy characterized by idiosyncratic income shocks and incomplete markets a la Aiyagari (1994) to markets where state-contingent assets are available but costly (in order to purchase a contingent asset, households have to pay a fixed participation cost) leads to a large increase in wealth inequality. Using a standard calibration, our model can match a Gini of 0.93, close to the level of wealth inequality observed in the US. In addition, under this level of participation costs, wealth inequality is particularly sensitive to income inequality. We label this phenomenon the Inequality Accelerator. We demonstrate how costly access to contingent-asset markets generates these effects. The key insight stems from the non-monotonic relationship between wealth and the desired degree of insurance in an economy with participation costs. Poor, borrowing-constrained households remain uninsured, middle-class households are almost perfectly insured, while rich households decide to self-insure by purchasing risk-free assets. This feature of households' risk management has crucial effects on asset prices, wealth inequality, and social mobility.

1066 - Supplement to Fuzzy Differences-in-Differences

Clement de Chaisemartin and Xavier D'Haultfoeuille

This paper gathers the supplementary material to de Chaisemartin & D'Haultfoeuille (2015). First, we show that two commonly used IV and OLS regressions with time and group fixed effects estimate weighted averages of Wald-DIDs. It then follows from Theorem 3.1 in de Chaisemartin & D'Haultfoeuille (2015) that these regressions estimate weighted sums of LATEs, with potentially many negative weights as we illustrate through two applications. We review all papers published in the American Economic Review between 2010 and 2012 and find that 10.1% of these papers estimate one or the other regression. Second, we consider estimators of the bounds on average and quantile treatment effects derived in Theorems 3.2 and 3.3 in de Chaisemartin & D'Haultfoeuille (2015) and we study their asymptotic behavior. Third, we revisit Gentzkow et al. (2011) and Field (2007) using our estimators. Finally, we present all the remaining proofs not included in the main paper.

1065 - Fuzzy Differences-in-Differences

Clement de Chaisemartin and Xavier D'Haultfoeuille

In many applications of the differences-in-differences (DID) method, the treatment increases more in the treatment group, but some units are also treated in the control group. In such fuzzy designs, a popular estimator of treatment effects is the DID of the outcome divided by the DID of the treatment, or OLS and 2SLS regressions with time and group fixed effects estimating weighted averages of this ratio across groups. We start by showing that when the treatment also increases in the control group, this ratio estimates a causal effect only if treatment effects are homogenous in the two groups. Even when the distribution of treatment is stable, it requires that the effect of time be the same on all counterfactual outcomes. As this assumption is not always applicable, we propose two alternative estimators. The first estimator relies on a generalization of common trends assumptions to fuzzy designs, while the second extends the changes-in-changes estimator of Athey & Imbens (2006). When the distribution of treatment changes in the control group, treatment effects are partially identified. Finally, we prove that our estimators are asymptotically normal and use them to revisit applied papers using fuzzy designs.
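
For reference, the "DID of the outcome divided by the DID of the treatment" estimand can be written explicitly. The following is a sketch in the usual two-group, two-period notation (ours, not a verbatim statement from the paper):

W_{DID} = \frac{\big(E[Y_{11}] - E[Y_{10}]\big) - \big(E[Y_{01}] - E[Y_{00}]\big)}{\big(E[D_{11}] - E[D_{10}]\big) - \big(E[D_{01}] - E[D_{00}]\big)}

where Y_{gt} and D_{gt} denote the outcome and the treatment in group g (1 = treatment group, 0 = control group) and period t (0 = before, 1 = after). When the treatment also rises in the control group, the denominator nets out treatment changes in both groups, so the ratio retains a causal interpretation only if treatment effects are homogeneous across the two groups, as the abstract notes.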

1064 - E-Cigarettes: The Extent and Impact of Complementary Dual-Use

Chris Doyle, David Ronayne and Daniel Sgroi

The highly controversial e-cigarette industry has generated considerable policy debate and mixed regulatory responses worldwide. Surprisingly, an issue that has been largely ignored is the categorisation of e-cigarettes as substitutes or (dynamic) complements for conventional smoking. We conduct an online survey of US participants, finding that 37% of e-cigarette users view them primarily as complementary. We use this result alongside publicly available data to calibrate a cost-benefit analysis, estimating that complementarity reduces the potential cost-savings of e-cigarettes by as much as 57% (or $3.3-4.9bn p.a.) relative to the case with zero complementarity.

1063 - Negative Voters: Electoral Competition with Loss-Aversion

Ben Lockwood and James Rockey

This paper studies how voter loss-aversion affects electoral competition in a Downsian setting. Assuming that the voters’ reference point is the status quo, we show that loss-aversion has a number of effects. First, for some values of the status quo, there is policy rigidity: both parties choose platforms equal to the status quo, regardless of other parameters. Second, there is a moderation effect: when there is policy rigidity, the equilibrium policy outcome is closer to the moderate voters’ ideal point than in the absence of loss-aversion. In a dynamic extension of the model, we consider how parties strategically manipulate the status quo to their advantage, and we find that this increases policy rigidity. Finally, we show that with loss-aversion, incumbents adjust less than challengers to changes in voter preferences. The underlying force is that the status quo works to the advantage of the incumbent. This prediction of asymmetric adjustment is new, and we test it using elections to US state legislatures. The results are as predicted: incumbent parties respond less to shocks in the preferences of the median voter.
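
One standard way to formalise reference dependence around a status quo q, offered purely as an illustrative sketch rather than the paper's exact specification:

U_i(x \mid q) = u_i(x) + \mu\big(u_i(x) - u_i(q)\big), \qquad \mu(z) = \begin{cases} \eta\,z & z \ge 0 \\ \eta\,\lambda\,z & z < 0 \end{cases}, \quad \lambda > 1

Losses relative to the status quo are weighted \lambda times more heavily than equal-sized gains, which is the asymmetry behind the rigidity and moderation effects described above.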

1061 - Accounting for Mismatch Employment

Jordi Gali and Thijs van Rens

1060 - Catastrophic Risk, Rare Events, and Black Swans: Could There Be a Countably Additive Synthesis?

Peter Hammond

Catastrophic risk, rare events, and black swans are phenomena that require special attention in normative decision theory. Several papers by Chichilnisky integrate them into a single framework with finitely additive subjective probabilities. Some precursors include: (i) following Jones-Lee (1974), undefined willingness to pay to avoid catastrophic risk; (ii) following Renyi (1955, 1956) and many successors, rare events whose probability is infinitesimal. Also, when rationality is bounded, enlivened decision trees can represent a dynamic process involving successively unforeseen "true black swan" events. One conjectures that a different integrated framework could be developed to include these three phenomena while preserving countably additive probabilities.

1059 - Ready for boarding? The effects of a boarding school for disadvantaged students

Luc Behaghel, Clement de Chaisemartin and Marc Gurgand

Boarding schools substitute school for home, but little is known about the effects this substitution produces on students. We present results of an experiment in which seats in a boarding school for disadvantaged students were randomly allocated. Boarders enjoy better studying conditions than control students. However, they start outperforming control students in mathematics only two years after admission, and this effect mostly comes from strong students. After one year, levels of well-being are lower among boarders, but in their second year students adjust and well-being catches up. This suggests that substituting school for home is disruptive: only strong students benefit from the boarding school, once they have managed to adapt to their new environment.

1058 - The Role of Product Diversification in Skill-Biased Technological Change

Choong Hyun Nam

Since the 1980s, labour demand has shifted toward more educated workers in the US. The most common explanation is that the productivity of skilled workers has risen relative to the unskilled, but it is not easy to explain why aggregate labour productivity was stagnant during the 1980s. This paper suggests an alternative story: introducing new goods involves a fixed labour input, which is biased toward white-collar workers. Hence the transition from Ford-style mass production towards more diversified production has shifted labour demand toward white-collar workers.

1057 - How Transparency Kills Information Aggregation: Theory and Experiment

Niall Hughes and Sebastian Fehrler

We investigate the potential of transparency to influence committee decision-making. We present a model in which career-concerned committee members receive private information of different type-dependent accuracy, deliberate, and vote. We study three levels of transparency under which career concerns are predicted to affect behavior differently, and test the model’s key predictions in a laboratory experiment. The model’s predictions are largely borne out - transparency negatively affects information aggregation at the deliberation and voting stages, leading to sharply different committee error rates than under secrecy. This occurs despite subjects revealing more information under transparency than theory predicts.

1056 - Price Comparison Websites

David Ronayne

The large and growing industry of price comparison websites (PCWs) or “web aggregators” is poised to benefit consumers by increasing competitive pricing pressure on firms by acquainting shoppers with more prices. However, these sites also charge firms for sales, which feeds back to raise prices. I find that introducing any number of PCWs to a market increases prices for all consumers, both those who use the sites and those who do not. I then use my framework to identify ways in which a more competitive environment could be achieved.