Bad Reputation with Simple Rating Systems, with Daniel Monte - Games and Economic Behavior  (2023)

We consider information censoring through finite memory as a device against bad reputational concerns. Our class of constrained information policies resembles common practices in online reputation systems, on which customers increasingly rely when hiring experts. In a world of repeated interactions between a long-lived expert and short-lived customers, Ely and Välimäki (2003) show that unlimited record-keeping may induce the expert to overchoose a certain action in pursuit of reputational gains. Consequently, welfare may fall and markets may break down. We show that simple rating systems in such a world help overcome market failures and improve upon both the full-memory and the no-memory cases.

Dynamic Information Design under Constrained Communication Rules, with Daniel Monte - American Economic Journal: Microeconomics  (2023)

An information designer wishes to persuade agents to invest in a project of unknown quality. To do so, she must induce investment and collect feedback from these investments. Motivated by data regulations and simplicity concerns, our designer faces communication constraints. These constraints hinder her without benefiting the agents: they impose an upper bound on the induced belief spread, limiting persuasion. Nevertheless, two-rating systems (direct recommendations) (i) are the optimal design when experimentation is needed to generate information, and (ii) approximate the designer's first-best payoff for specific feedback structures. When the designer has altruistic motives, constrained rules significantly decrease welfare.

Working Papers

Information Acquisition, Networks and Voting, with Gerard Domènech-Gironell and Oriol Tejada - March 2024.

A society of identical individuals must choose through elections one of two alternatives under uncertainty about the state of the world. Individuals can (a) choose the accuracy of their private signals about the state of the world at an increasing cost, and (b) send messages to other individuals to whom they are connected in some network. We show that the existence of a full network generically leads to two types of equilibria. First, there always exists an equilibrium in which only one citizen—a dictator—acquires information and everybody else votes based on this information, which the dictator sends to all other citizens via the network. Second, the only symmetric equilibrium that would exist without a network is also an equilibrium with a full network, but only if information acquisition costs are sufficiently high. This condition keeps in check the positive externalities created by acquiring information that can be distributed at no cost.

Political Accountability and Misinformation, with Braz Camargo and Laura Karpuska - December 2023.

What are the impacts of misinformation on political accountability? We address this question in a political career-concerns framework with belief misspecification. In our model, an incumbent politician of unknown ability seeks to maximize reelection chances by putting costly effort into the provision of a public good. Citizens agree ex ante on how to interpret the outcomes of the incumbent's effort. However, some of them disagree on how to interpret other signals. Specifically, some voters incorrectly believe that a confounding signal is informative about the incumbent's ability, while others correctly understand that it is completely uninformative. The misspecification about this signal leads to ex-post disagreement on how successful the incumbent should be in providing the public good to secure reelection. We consider both an intensive margin and an extensive margin of informational disagreement, that is, (i) how much the beliefs of citizens with a misspecified learning model differ from the beliefs of citizens with a correct learning model, and (ii) the share of misspecified citizens in the composition of society. We characterize the impact of informational disagreement on effective accountability (the effort provided by the incumbent in equilibrium). Our analysis not only identifies situations in which misinformation negatively impacts the social contributions of elected governments, but also, perhaps counter-intuitively, situations in which misinformation increases political accountability.

Persuading Crowds - August 2023.

A sequence of short-lived agents must choose which action to take under a fixed but unknown state of the world. Prior to the realization of the state, a long-lived principal designs and commits to a dynamic information policy in order to persuade agents toward his most preferred action. The principal's persuasion power is potentially limited by the existence of conditionally independent and identically distributed private signals for the agents, as well as by their ability to observe the history of past actions. I characterize the principal's problem in terms of a dynamic belief manipulation mechanism and analyze its implications for social learning. For a class of private information structures (the log-concave class), I derive conditions under which the principal should encourage some social learning and when he should induce herd behavior from the start (single disclosure). I also show that social learning is less valuable to a more patient principal: as his discount factor converges to one, the value of any optimal policy converges to the value of the single-disclosure policy.

Work in Progress

Learning (and Pricing) by Trading (with Vladimir Asriyan and William Fuchs)