Lecture 3 (Dutta): Dynamic Programming 2

Original article (link) posted: 28/09/2005

In the last class, we saw the following results:
Necessity: the value function V of the dynamic programming problem is a fixed point of the corresponding Bellman equation, i.e., TV = V.
Sufficiency: if a bounded function U satisfies TU = U in the Bellman equation and if there exists a maximizing selection for TU, then U is the value function of the original dynamic problem.

Our focus in Lecture 3 is to prove these results formally and to derive some further properties of the Bellman operator T.

Topics in the class

1) The proof of necessity
2) The proof of sufficiency
Reading Chapter 4 of SLP in advance helped me a lot in understanding the proofs. Although Chapter 4 deals with DP under certainty, it is directly linked to the stochastic case covered in class.

3) Finding a value function
(a) The Bellman operator is a contraction (by Blackwell's sufficient conditions)
(b) The set of bounded and continuous functions equipped with the sup-norm is a complete metric space. Completeness is quite important for deriving properties of value functions, because it ensures that those properties carry over to the limit (= the value function).

Blackwell's sufficient conditions for a mapping to be a contraction seem very useful. Professor Dutta mentioned that, despite his seminal work in applied mathematics, the first half of Blackwell's career was hampered by racial discrimination. (Blackwell was African American.)
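The fixed-point property TV = V and the contraction argument can be illustrated numerically. Below is a minimal sketch of value function iteration for a toy deterministic cake-eating problem of my own choosing (not an example from the lecture): V(x) = max over 0 < c <= x of log(c) + beta * V(x - c), on a discretized grid. Since T is a contraction of modulus beta in the sup-norm, iterating T from any bounded initial guess converges to the unique fixed point.

```python
import numpy as np

# Toy cake-eating problem: V(x) = max_{c <= x} log(c) + beta * V(x - c)
beta = 0.9
grid = np.linspace(1e-3, 1.0, 200)  # discretized cake sizes

def T(V):
    """Bellman operator on the discretized problem: returns TV on the grid."""
    TV = np.empty_like(V)
    for i in range(len(grid)):
        # consume grid[j] (j = 0..i); remaining cake is approximated by grid[i - j]
        c = grid[: i + 1]
        vals = np.log(c) + beta * V[i - np.arange(i + 1)]
        TV[i] = vals.max()
    return TV

V = np.zeros_like(grid)  # any bounded initial guess works
for _ in range(500):
    V_new = T(V)
    # sup-norm distance between successive iterates shrinks by factor beta
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new

# At the (approximate) fixed point, TV = V up to numerical tolerance
assert np.max(np.abs(T(V) - V)) < 1e-7
```

The assertion at the end is exactly the necessity statement in discretized form: the limit of the iteration satisfies TV = V.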


Agenda for Development Economics

The latest issue of the Journal of Economic Perspectives (Vol. 24, No. 3, Summer 2010, link) puts together a symposium on development economics. As you can see below, the collection of articles looks really amazing!

Symposium: The Agenda for Development Economics

"Understanding the Mechanisms of Economic Development" (pp. 3-16)
Angus Deaton

"Theory, General Equilibrium, and Political Economy in Development Economics" (pp. 17-32)
Daron Acemoglu

"Diagnostics before Prescription" (pp. 33-44)
Dani Rodrik

"Uneven Growth: A Framework for Research in Development Economics" (pp. 45-60)
Debraj Ray

"Giving Credit Where It Is Due" (pp. 61-80)
Abhijit V. Banerjee and Esther Duflo

"Microeconomic Approaches to Development: Schooling, Learning, and Growth" (pp. 81-96)
Mark R. Rosenzweig

For those interested in education, the other symposium in the issue, No Child Left Behind, might also be helpful.


Japanese style wins the day

Congratulations, Nadeshiko! The Japanese women's team advanced to the final of the FIFA U-17 World Cup:
Japanese style wins the day: Japan dragged themselves back from a goal down to beat defending champions Korea DPR in the second semi-final from the Ato Boldon Stadium in Couva on Tuesday. Winning out 2-1 and putting on a show of assorted flicks and tricks for the fans in attendance, the stylish East Asians – who scored both of their goals in the space of two second-half minutes – will now meet Korea Republic in the final of the FIFA U-17 Women’s World Cup Trinidad and Tobago 2010. (see more from FIFA's website)

Japan's second goal, scored by Kumi YOKOYAMA, is really amazing! She was like Maradona or Messi.


Central Themes on Microeconomics

This is a relatively new textbook on intermediate microeconomics written by two leading scholars in microeconomics and game theory, Douglas Bernheim and Michael Whinston
(see the publisher's link for further information):

In the introduction, the authors mention the following eight themes of microeconomics. Four are related to individual decision making and the other four to competitive markets; they might be helpful for instructors who teach microeconomics.
Central Themes on Decision Making: 
1. Trade-offs are unavoidable
2. Good choices are usually made at the margin
3. People respond to incentives
4. Prices provide incentives

Central Themes on Markets:
5. Trade can benefit everyone
6. The competitive market price reflects both value to consumers and cost to producers
7. Compared to other methods of resource allocation, markets have advantages
8. Sometimes, government policy can improve on free-market resource allocation
These themes may look a bit too classical, but the book itself contains new topics such as behavioral economics, strategic interactions (with many examples), and competition policy in oligopoly markets, which reflect the authors' expertise and give the book an advantage over rival textbooks.


Risk and Liquidity

The awaited book on financial crises by Hyun Song Shin (Hughes-Rogers Professor of Economics at Princeton University) came out recently, titled "Risk and Liquidity":

The book is based on his 2008 Clarendon Lectures (in Finance). The table of contents and samples of a few sections are available from his website (link). Let me quote the comments by Roger Myerson and Franklin Allen, which seem to illuminate the character and importance of this outstanding book very well.
In the Great Recession, the world has looked for leading economists to offer a new and better understanding of macroeconomic instability, as Keynes did in the Great Depression. In this book, Hyun Song Shin delivers what was needed. Step by step, he develops a new comprehensive understanding of how macroeconomic booms and busts can be derived from microeconomic forces in the banking system. This book should be recognized as a major contribution to macroeconomic theory.
Roger Myerson, Nobel Laureate in Economics 2007, University of Chicago

Hyun Song Shin is one of the leading scholars on financial stability in the world. His experience in this field is not confined to his academic work. He has also advised the President of Korea, the Bank of England and many other institutions on these issues. The recent crisis has underlined how important it is to understand the boom-bust cycle. During the boom, asset prices rise; this allows financial institutions to borrow and expand their balance sheets and drive prices up more. Similarly, in the bust part of the cycle they reduce their debt, their balance sheets shrink, they sell assets and prices fall more. Hyun Song Shin has done the foundational theoretical and empirical research on this leveraging and deleveraging amplification mechanism. This book provides a very accessible summary of this work. It is essential reading for all academics and practitioners interested in financial crises.
Franklin Allen, The Wharton School of the University of Pennsylvania


Kandori (RES 1992)

Original article (link) posted: 26/09/2005

Kandori (1992) "Social Norms and Community Enforcement" RES, 59

The paper considers self-enforcing mechanisms in situations where agents change their partners over time. Technically, the model in the paper extends the theory of repeated games to the case of matching games. The following two main results are shown.

1) The community can sustain cooperation even when each agent knows nothing more than his personal experience.
2) The community can realize any mutually beneficial outcome when each agent carries a label which is revised in a systematic way. That is, a Folk Theorem holds.

As a motivation for the research, he mentions the benefits of specialization. After introducing the theoretical achievements on personal enforcement mechanisms, i.e. the Folk Theorem in the repeated-game literature, he writes:

However, many important transactions are infrequent in nature. As economic historians argue, the division of labor and specialization are important driving forces of economic progress. Potential gains are larger in diverse transactions with different specialists than with fixed partners. Therefore, the control of incentives in such an infrequent trade is of vital importance to understand the organization of economic transactions.

He refers to two papers which initiated this line of research.

The attempt to generalize the Folk Theorem of repeated games to the case of matching games was initiated by Milgrom, North and Weingast (1990) and Okuno-Fujiwara and Postlewaite (1989). The former analyzed concrete examples of information transmission mechanisms and the latter introduced the notion of local information processing. Both of them, however, mainly deal with the infinite population case to avoid potentially complicated problems of incentives on off-equilibrium paths. Our paper shows that such problems can be resolved in a simple way if the stage game satisfies a weak condition. Equilibria constructed in our paper work for any population size and any matching rule, and are robust to changes in information structures.

What a strong result he derived!! Although he does not stress the results given in Section 3, "Folk Theorem under Public Observability", I think Proposition 2 is very interesting. It is easy to see that a Folk Theorem holds if all the other players enter a punishment phase after some player deviates, which is stated as Proposition 1. However, if we restrict attention to situations where only the deviator is punished and innocent pairs play the originally prescribed actions, proving a Folk Theorem is not straightforward. More precisely, checking the incentives of innocent players on off-equilibrium paths where the community is highly populated with "guilty" agents is involved.
By introducing some "forgiveness" into the social norm, the author elegantly shows that this problem can be avoided, which leads to Proposition 2.

Interesting Papers in References

Harrington (1989) "Cooperation in Social Settings" mimeo
Section 7 of the above paper was revised and is available as the following:
Harrington (1995) "Cooperation in a One-Shot Prisoners' Dilemma" GEB, 8
Milgrom, North and Weingast (1990) "The Role of Institutions in the Revival of Trade: The Law Merchant, Private Judges, and the Champagne Fairs" Economics and Politics, 2
Okuno-Fujiwara and Postlewaite (1995) "Social Norms and Random Matching Games" GEB, 9
Rubinstein and Wolinsky (1990) "Decentralized Trading, Strategic Behavior and the Walrasian Outcome" RES, 57


Complementarity and supermodularity

I found a nice summary of key concepts in game theory, complementarity and supermodularity, which are especially important for auction and matching theory.

"Supermodularity and supermodular games" by Xavier Vives
in The New Palgrave Dictionary of Economics:

The following is quoted from Vives's survey.
The basic idea of complementarity is that the marginal value of an action increases with the level of other actions available. The mathematical concept of supermodularity formalizes the idea of complementarity. The theory of monotone comparative statics and supermodular games provides the toolbox to deal with complementarities.

This theory, in contrast to classical convex analysis, is based on order and monotonicity properties on lattices. Monotone comparative statics analysis provides conditions under which optimal solutions to optimization problems change monotonically with a parameter.

The theory of supermodular games exploits order properties to ensure that the best response of a player to the actions of rivals increases with their level. The power of the approach is that it clarifies the drivers of comparative statics results and the need of regularity conditions; it allows very general strategy spaces, including indivisibilities and functional spaces such as those arising in dynamic or Bayesian games; it establishes the existence of equilibrium in pure strategies; it allows a global analysis of the equilibrium set when there are multiple equilibria, which has an order structure with largest and smallest elements; and finally, it finds that those extremal equilibria have strong stability properties and there is an algorithm to compute them.
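The last point of the quote, that extremal equilibria can be computed by an algorithm, can be made concrete: in a supermodular game, iterating best responses from the largest (smallest) strategy profile converges monotonically to the largest (smallest) pure-strategy equilibrium. A minimal sketch with a toy two-player game of my own invention (payoff u_i = a_i*a_j - a_i^2, which has increasing differences since the cross-partial is positive):

```python
import numpy as np

# Toy supermodular game: actions a_i in {0,...,10},
# u_i(a_i, a_j) = a_i * a_j - a_i**2; the marginal value of raising
# a_i increases with the rival's action a_j (increasing differences).
actions = np.arange(11)

def payoff(a_i, a_j):
    return a_i * a_j - a_i ** 2

def best_response(a_j, tie_break="largest"):
    """Best response to the rival's action; monotone in a_j by increasing differences."""
    vals = payoff(actions, a_j)
    best = np.flatnonzero(vals == vals.max())
    return actions[best[-1]] if tie_break == "largest" else actions[best[0]]

def extremal_equilibrium(start, tie_break):
    """Iterate simultaneous best responses from an extreme profile until it is fixed."""
    profile = [start, start]
    while True:
        new = [best_response(profile[1], tie_break),
               best_response(profile[0], tie_break)]
        if new == profile:
            return (int(profile[0]), int(profile[1]))
        profile = new

largest = extremal_equilibrium(actions.max(), "largest")    # descend from the top
smallest = extremal_equilibrium(actions.min(), "smallest")  # ascend from the bottom
print(largest, smallest)  # → (1, 1) (0, 0)
```

Here the equilibrium set has a largest and a smallest element, as the survey states, and the two iterations bracket every other equilibrium.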


Theory Seminar (Gilboa, Lieberman and Schmeidler)

Original article (link) posted: 24/09/2005

Gilboa, Lieberman and Schmeidler "Empirical Similarity"

Although the above paper was the scheduled topic, Professor Gilboa mainly explained the following paper.

Billot, Gilboa, Samet and Schmeidler (2005) "Probabilities as similarity-weighted frequencies" Econometrica, 73

The above two papers consider a decision rule for a decision maker who has data on past outcomes and is asked to express her beliefs by assigning probabilities to certain possible states. As the database becomes large, empirical frequencies alone may not help her make a decision at all. Instead, she may assign higher weights to more similar cases when evaluating the probability of a state.
Billot et al. show that if beliefs given a union of two databases are a convex combination of beliefs given each of the databases, the belief formation process follows a simple formula: beliefs are a similarity-weighted average of the beliefs induced by each past case. However, their axiomatization does not suggest a particular similarity function, or even a particular form of the function. Gilboa et al. develop tools of statistical inference for parametric estimation of the similarity function, assuming that such a function governs the data-generating process.
Notice that the axioms in the papers are not consistent with situations where the range of beliefs becomes smaller as the number of observations increases, or where the decision maker cares about a trend in outcomes.
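The similarity-weighted formula can be written down concretely. A minimal sketch with made-up toy data and a Gaussian similarity function (the axiomatization pins down the averaging formula but not this functional form, so the similarity function here is purely my own illustration):

```python
import numpy as np

# Toy database: past cases with a one-dimensional characteristic x_t
# and an observed binary outcome (which of two states occurred).
past_x = np.array([0.1, 0.4, 0.5, 0.9])  # characteristics of past cases
past_outcome = np.array([0, 0, 1, 1])    # state realized in each case

def similarity(x_t, x, bandwidth=0.2):
    """A made-up Gaussian similarity function (not implied by the axioms)."""
    return np.exp(-((x_t - x) ** 2) / (2 * bandwidth ** 2))

def belief(x):
    """Belief that state 1 occurs for a new case x:
    a similarity-weighted frequency over the past cases."""
    w = similarity(past_x, x)
    return np.sum(w * past_outcome) / np.sum(w)

# A new case near the high-x past cases gets a belief near 1;
# one near the low-x past cases gets a belief near 0.
print(belief(0.95), belief(0.05))
```

With equal similarity weights this reduces to the empirical frequency, which shows how the formula generalizes simple frequency counting.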

The presentation by Professor Gilboa was very clear, and he was quite good at using PowerPoint (!!). But it was difficult for me to understand the material. I might need to study decision theory at least a little bit... (It might be good to read his book "A Theory of Case-Based Decisions". A Japanese translation is also available.)