Robert Nau


Professor Emeritus

Fuqua School of Business

Duke University


Biographical information and web pages:


What does this figure represent?

[Figure: correlated equilibria of the battle-of-the-sexes game]

If you guessed "battle of the sexes," you are correct. The figure illustrates a theorem concerning the geometry of the set of solutions of a noncooperative game, as it applies to the 2x2 game known as battle-of-the-sexes. (He prefers the boxing match, she prefers the ballet, but they would like to go somewhere together rather than separately. What should they do?) The gray saddle is the set of independently randomized strategies. The green hexahedron is the set of correlated equilibria. Their three points of intersection (red dots) are the Nash equilibria. The obvious fair solution (flipping a coin) is the midpoint of the long edge, which is not a Nash equilibrium. This picture is generic in the sense that Nash equilibria always lie on supporting hyperplanes of the set of correlated equilibria, so they cannot lie in the interior of that set when it has full dimension, as it does here. For more details, see the following paper.
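
As a concrete check on the coin-flip claim, here is a minimal sketch in Python (my own illustration, assuming the conventional 2/1/0 payoff normalization, which is not specified in the figure): it verifies that the 50/50 lottery over the two matched outings satisfies the linear incentive constraints that define a correlated equilibrium, yet is not a product of independent randomizations, so it cannot be a Nash equilibrium.

```python
import numpy as np

# u[i, a1, a2] = payoff to player i when he plays a1 and she plays a2
# actions: 0 = boxing, 1 = ballet; payoffs 2/1/0 are an assumed normalization
u = np.array([
    [[2, 0], [0, 1]],   # his payoffs (prefers boxing)
    [[1, 0], [0, 2]],   # her payoffs (prefers ballet)
])

def is_correlated_equilibrium(pi, u, tol=1e-9):
    """pi[a1, a2] = joint probability of the action profile (a1, a2)."""
    for a in range(2):          # action recommended to him
        for b in range(2):      # deviation he might consider
            if sum(pi[a, c] * (u[0, a, c] - u[0, b, c]) for c in range(2)) < -tol:
                return False
    for a in range(2):          # action recommended to her
        for b in range(2):
            if sum(pi[c, a] * (u[1, c, a] - u[1, c, b]) for c in range(2)) < -tol:
                return False
    return True

coin_flip = np.array([[0.5, 0.0],
                      [0.0, 0.5]])
print(is_correlated_equilibrium(coin_flip, u))   # True
# ...but it is not a Nash equilibrium: it is not a product of independent
# marginal strategies (each marginal is 50/50, whose product would put
# 0.25, not 0.5, on each matched pair).
print(np.allclose(coin_flip, np.outer([0.5, 0.5], [0.5, 0.5])))  # False
```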

Which one of them was right?

Research highlights:

My research area lies within the broad field of rational choice theory: the theory of the expected-utility-maximizing, game-playing, equilibrium-seeking rational economic person. My own work has primarily focused on foundational issues: primitive assumptions (axioms) that govern the preferences of rational agents, and mathematical representations of mental processes that might explain such preferences. Classical work in the field, from the 1920s to the 1960s, emphasized the subjective expected utility model for decisions made under conditions of uncertainty, in which agents act as if they assign numerical probabilities to events and numerical utilities to consequences, and they make choices so as to maximize the expected value of their utility. The expected utility model was formalized by von Neumann and Morgenstern (1944/1947) with a sweeping scope: "We shall assume that the aim of all participants in the economic system, consumers as well as entrepreneurs, is money, or equivalently a single monetary commodity. This is supposed to be unrestrictedly divisible and substitutable, freely transferable, even in the quantitative sense, with whatever 'satisfaction' or 'utility' is desired by each participant." (p. 8) In a situation involving two or more agents, such as a game or market, they seek an equilibrium in which each one optimizes his or her expected utility against the actions of the others, ceteris paribus, independently randomizing choices for strategic purposes if necessary. The existence of such a solution was proved by Nash (1951) via a fixed point theorem, and the elegance and power of this result made it the focal point of game theory as the field developed. A similarly sweeping formalization of subjective probability was introduced by Savage (1954), in which agents are assumed to be able to assign equally precise probabilities to events of any kind; it is the foundation on which "Bayesian" statistics and decision theory are built.

Beginning in the 1970s this paradigm was challenged on many fronts, as contrary findings emerged from behavioral experiments and as the preference axioms that support the subjective expected utility model were reappraised on normative as well as descriptive grounds. More general models of "non-expected utility preferences" began to be explored, along with weakenings and strengthenings of solution concepts for games. However, in nearly all of this newer work, great importance is still attached to preserving a clean separation between beliefs about events (represented by probabilities or generalizations thereof) and preferences among consequences (represented by utilities or generalizations thereof), and the parameters of utility are still assumed to be measurable with unlimited precision, regardless of whether the consequences are material goods to be received or personal experiences to be enjoyed.

Much of my own work has taken a different approach, motivated by analogies with modern physics, in which variables that were formerly assumed to be measurable on absolute scales (e.g., space and time) are instead measured relative to the viewpoints of observers with their own frames of reference, which makes those variables inseparable to some extent. The economic analog of this phenomenon is that agents do not in general occupy observable absolute positions in the space of material goods, investments, and personal well-being, nor are utility functions their natural tools of thought with respect to such things, let alone utility functions whose values are interpersonally measurable. All that can be observed in public are contracts that agents are willing to sign and relative rates and prices at which they are willing to make exchanges with each other, which makes it impossible to separate the effects of beliefs, preferences, and endowments when interpreting their behavior. Money plays (literally) a cardinal role in quantifying those rates of exchange in precise terms, which is one of its functions; the more we abstract from it, the fuzzier the numbers become. It may seem that giving a distinguished role to money rather than utility in the modeling of rational economic behavior would create problems for decision analysis, which often deals with multiattribute outcomes, as well as for noncooperative game theory, in which equilibrium probabilities in randomized strategies are determined from payoff matrices whose units are utilities. In various papers I have shown that this is not the case: there are natural extensions of these models that do not require the strict separation of probability from utility. Central to this program is the idea that the principle of no-arbitrage (avoiding a sure loss in units of money) is the key axiom that unifies the domains of rational choice theory that deal with choice under uncertainty: personal decisions (the 1-body problem), noncooperative games (the problem of 2, 3, 4 . . . bodies), and investments in markets (the n-body problem for large n).
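
To make the no-arbitrage axiom concrete, here is a hedged sketch (a toy construction of my own, not taken from the papers) of de Finetti's Dutch book argument posed as a linear program: an observer searches for a portfolio of bets at the agent's posted two-sided prices that yields a sure profit, and by LP duality such a portfolio exists exactly when the prices are not expectations under any probability distribution over states.

```python
import numpy as np
from scipy.optimize import linprog

def dutch_book(A, q, cap=1.0):
    """A[s, e] = 1 if event e occurs in state s; q[e] = posted price of a
    $1 ticket on event e, at which the agent will buy or sell.
    Returns (guaranteed_profit, stakes); profit > 0 means incoherence."""
    n_states, n_events = A.shape
    D = A - q                          # observer's payoff per unit stake, by state
    # variables: x (stakes on each event) and t (guaranteed profit); maximize t
    c = np.zeros(n_events + 1)
    c[-1] = -1.0
    # constraints: t - D[s] @ x <= 0 for every state s  (i.e., t <= payoff in s)
    A_ub = np.hstack([-D, np.ones((n_states, 1))])
    b_ub = np.zeros(n_states)
    bounds = [(-cap, cap)] * n_events + [(None, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[-1], res.x[:-1]

# Two states, two complementary events priced at 0.6 and 0.5 (incoherent,
# since the prices sum to more than 1):
A = np.array([[1, 0],
              [0, 1]])
profit, stakes = dutch_book(A, np.array([0.6, 0.5]))
print(round(profit, 3), stakes)   # 0.1 [-1. -1.]: sell both tickets
```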

This approach to modeling rational choice under uncertainty does not require that the agent cares only about money, or that the only situations that can be analyzed are those whose primary consequences are monetary. Rather, side bets in units of money, conditioned on the agent's own actions, can be used as a precise yardstick for indirectly quantifying her attitude toward risk and her preferences for other attributes of consequences. The optimal actions for an individual, or equilibrium actions for players in a game, are those which do not give rise to arbitrage in the side bets. To the extent that this story of hypothetical monetary side bets stretches the imagination, it is an even greater stretch to assume that agents can know each other's probabilities and utilities or equilibrate on their beliefs in games and markets with any sort of numerical precision, particularly when they all may have private information, idiosyncratic interests, and unobservable background risk.

The fundamental representation theorems in the three domains (decisions, games, markets) are all applications of the duality theorem of linear programming, a special case of the separating hyperplane theorem for convex sets, which has many other applications in economics. In each domain there is a primal problem whose variables are actions of the body (moves to be made, offers to bet or trade), in which the objective is for an observer to extract an arbitrage profit, and there is a corresponding dual problem whose variables are properties of mind (probabilities, utilities, or combinations thereof), in which the objective is to find values that rationalize the actions of the body in the sense of individual or joint maximization of expected value or expected utility. This correspondence is well known in subjective probability theory (in de Finetti's version of it) and in asset pricing theory, but noncooperative game theory, as it is conventionally presented, takes a very different approach to characterizing rational behavior: it assumes the expected utility model, performs some hand-waving about common knowledge of utilities, and then directly imposes an equilibrium condition (usually Nash equilibrium). What I have contributed is to show that game theory fits into a continuum with subjective probability theory and asset pricing theory. Rational behavior in games can be given a primal definition in terms of avoidance of arbitrage by the players as a group ("joint coherence"), and application of the duality theorem of linear programming then leads to correlated equilibrium, first proposed by Robert Aumann (1974), rather than Nash equilibrium, as the fundamental solution concept. The very act of revealing the rules of the game via conditional side bets on its outcome (thereby solving the common knowledge problem up front) is equivalent to asserting that its solution lies in the set of correlated equilibria, and the outcomes that do not have positive probability in some correlated equilibrium are precisely those which expose the set of players to arbitrage. Moreover, the existence proof for correlated equilibria does not require a fixed point theorem. (The obsession with fixed points in economics is unfortunate: they usually lack a good story about selection and convergence. By comparison, the no-arbitrage criterion for joint rationality is self-enforcing in the presence of an alert observer.)
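
As an illustration of the dual side of this story (a sketch under the same assumed 2/1/0 payoffs as in the earlier snippet, not the construction used in my papers): because the correlated equilibria of a finite game form a polytope cut out by linear incentive constraints, points of that polytope can be computed by ordinary linear programming, with no fixed point theorem in sight.

```python
import numpy as np
from scipy.optimize import linprog

u = np.array([
    [[2, 0], [0, 1]],   # his payoffs
    [[1, 0], [0, 2]],   # her payoffs
])

# variables: pi[a1, a2] flattened as [(0,0), (0,1), (1,0), (1,1)]
rows = []
for a in range(2):                  # he is told a and considers deviating to b
    for b in range(2):
        if a == b:
            continue
        row = np.zeros(4)
        for c in range(2):
            row[2 * a + c] = -(u[0, a, c] - u[0, b, c])   # "<= 0" form
        rows.append(row)
for a in range(2):                  # she is told a and considers deviating to b
    for b in range(2):
        if a == b:
            continue
        row = np.zeros(4)
        for c in range(2):
            row[2 * c + a] = -(u[1, c, a] - u[1, c, b])
        rows.append(row)

c_obj = -(u[0] + u[1]).flatten()    # maximize the players' total expected payoff
res = linprog(c_obj, A_ub=np.array(rows), b_ub=np.zeros(4),
              A_eq=np.ones((1, 4)), b_eq=[1.0], bounds=[(0, 1)] * 4)
print(res.x.reshape(2, 2))          # a vertex of the green polytope above
```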

In the most general case, the units of preference modeling are risk neutral probabilities, which are the rates at which an agent is publicly willing to make small side bets on events. This term originated in the finance literature, where it refers to the probability distribution of a "risk neutral representative agent" who determines arbitrage-free asset prices. However, these risk neutral probabilities are not the true subjective probabilities of a typical real agent, who is risk averse. Rather, they are interpretable as the product of her subjective probabilities and her state-dependent marginal utilities for money, with those two psychic dimensions not being separately measurable, the economic equivalent of space-time. In my work on this topic I have shown that risk aversion can be modeled without separating probabilities from utilities by using a generalization of the Arrow-Pratt risk aversion measure that refers to first and second derivatives of the risk neutral probabilities rather than first and second derivatives of utility functions.
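
A toy numerical illustration (with assumed numbers of my own) of how risk neutral probabilities fuse belief with marginal utility:

```python
import numpy as np

p = np.array([0.5, 0.5])      # subjective probabilities (assumed)
w = np.array([50.0, 150.0])   # prior wealth in each state (assumed)
mu = 1.0 / w                  # marginal utility of money if u(w) = log(w)

q = p * mu / np.sum(p * mu)   # risk neutral probabilities
print(q)                      # [0.75 0.25]: she bets as if the state where
                              # money is scarce were more likely than it is
```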

The traditional subjective expected utility model also imposes subtle (and arguably unreasonable) restrictions on the shapes of indifference curves in payoff space, requiring the agent to be equally risk averse toward all sources of risk. My more general approach based on derivatives of risk neutral probabilities allows risk aversion to be source-dependent, which accommodates the phenomenon of ambiguity aversion (also called "uncertainty aversion") illustrated by Daniel Ellsberg's (1961) famous paradox: most people would prefer to bet on events whose probabilities are known (say, the color of a ball drawn from an urn that is known to hold equal numbers of red and black balls) rather than on events whose probabilities are undetermined (the color of a ball drawn from an urn in which the proportions are unknown). Ellsberg's subversive idea has inspired the development of many new models in decision theory over the last few decades.
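
The reason this pattern is paradoxical for subjective expected utility deserves one explicit step (a standard textbook argument, sketched here with a made-up grid of candidate beliefs): no single subjective probability for the unknown urn can rationalize both preferences at once.

```python
# Preferring red-from-known implies r < 0.5, where r is the subjective
# probability of drawing red from the unknown urn; preferring
# black-from-known implies 1 - r < 0.5, i.e., r > 0.5. No r does both:
candidates = [i / 100 for i in range(101)]
print([r for r in candidates if r < 0.5 and 1 - r < 0.5])   # [] -- empty
```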

Some of my other work in the area of game theory addresses various issues concerning the geometry of the sets of Nash and correlated equilibria, one of which is illustrated by the figure above. In the most general case, where risk-averse agents must reciprocally measure each other's payoffs to establish the rules of the game, the parameters of correlated equilibrium distributions are risk neutral probabilities rather than measures of pure belief. Another important analogy with physics is that the act of measurement tends to perturb whatever is being measured, resulting in fundamental indeterminacies. The same concept is relevant to attempts to measure personal probabilities in terms of the rates at which agents are willing to bet money on events: the very fact that a second agent may eagerly take the other side of a bet that the first agent has offered could perturb the first agent's beliefs. My work in the area of indeterminate probabilities (the "confidence weighted probability" model) addresses this phenomenon.

Yet another connection with physics arises in the modeling of differences in beliefs. Cross-entropy serves as a measure of information gain in a physical experiment, and the same principle (indeed, the same mathematics) describes the situation in which a risk averse agent with fixed personal probabilities and no prior investments comes into contact with a market in which asset prices are determined by some other probability distribution. In the special case where the agent has logarithmic utility, the cross-entropy (Kullback-Leibler divergence) between the two probability distributions is the gain in expected utility that the agent achieves by transacting with the market until an indifference point is reached, at which her own risk neutral probabilities agree with those that price the assets. More general utility functions lead to well-known generalizations of cross-entropy.
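
Here is a small numerical check of that claim (toy numbers of my own): for a log-utility agent with wealth W and beliefs p facing complete markets priced by risk neutral probabilities q, the optimal state-contingent wealth is x_s = W p_s / q_s, and the resulting gain in expected utility equals the KL divergence between p and q.

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])     # agent's subjective probabilities (assumed)
q = np.array([0.3, 0.4, 0.3])     # market's risk neutral probabilities (assumed)
W = 1.0                           # initial wealth

x = W * p / q                     # optimal Arrow-security payoffs for log utility
assert np.isclose(q @ x, W)       # the budget constraint holds

gain = p @ np.log(x) - np.log(W)  # expected log-utility gain over holding cash
kl = p @ np.log(p / q)            # KL divergence D(p||q)
print(np.isclose(gain, kl))       # True

# At the optimum her risk neutral probabilities (beliefs times marginal
# utilities 1/x, normalized) coincide with the market's q: the indifference
# point described above.
rn = p / x
print(np.allclose(rn / rn.sum(), q))   # True
```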

Teaching and software:

Throughout my career I have taught a continuously evolving elective course on statistical forecasting to MBA students and graduate students from other departments around the university. Some of my older, unplugged notes for it are posted on a public web site, statforecasting.com, which receives over 1 million visitors per year. In connection with the course, I have designed a couple of software tools. One is the "user-specified-model" forecasting procedure in a commercial statistics package, Statgraphics. (The user manual for the procedure is here.) It allows the user to construct forecasting models from combinations of data transformations (log, power, deflation, seasonal adjustment) and model types (regression, random walk, moving average, exponential smoothing, ARIMA) and to perform rigorous side-by-side testing and comparison of up to 5 models at once. The other is an Excel add-in, RegressIt, which performs linear and logistic regression analysis. It offers high-quality, interactive table and chart output and includes a 2-way interface with the R programming language that allows Excel to be used as a front end and/or back end for linear and logistic regression analysis in R. RegressIt provides an array of novel tools for exploring the model space and for comparing and testing models, with many interactive features that are useful for classroom demonstrations and evaluation of student work as well as for supporting good analytical practices in general. It also includes extensive built-in teaching notes for regression.
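
In the same spirit as that procedure (a rough sketch of the workflow only, not the Statgraphics implementation; the simulated series and the train/test split are invented for illustration), one can apply a transformation, fit several candidate model types, and compare them side by side on a held-out sample:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.holtwinters import SimpleExpSmoothing

np.random.seed(0)
# fake exponential-growth series standing in for real data
y = pd.Series(np.exp(np.cumsum(np.random.normal(0.01, 0.05, 120))))
train, test = np.log(y[:100]), np.log(y[100:])   # log transform + holdout split

forecasts = {
    "random walk":   np.repeat(train.iloc[-1], len(test)),
    "exp smoothing": SimpleExpSmoothing(train).fit().forecast(len(test)),
    "ARIMA(1,1,1)":  ARIMA(train, order=(1, 1, 1)).fit().forecast(len(test)),
}
for name, f in forecasts.items():
    rmse = np.sqrt(np.mean((np.asarray(f) - test.values) ** 2))
    print(f"{name:15s} RMSE (log units): {rmse:.4f}")
```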

I have also regularly taught a Ph.D. course on rational choice theory (most recently in 2015) which is taken by students from a number of departments around the university. Some older notes for this course can be found here.

Published papers:


Edited volume:


Older working papers:

Other web pages:
