Meetings: Spring, 2001

06/01/2001
Featured in print Reporter

Economic Fluctuations and Growth

The NBER's Program on Economic Fluctuations and Growth, which is directed by Robert E. Hall of Stanford University, met on February 3 at the Federal Reserve Bank of San Francisco. Ricardo Caballero, NBER and MIT, and John Cochrane, NBER and University of Chicago, organized the meeting and chose these papers for discussion:

  • Aart Kraay, Norman Loayza, and Luis Serven, World Bank, and Jaume Ventura, NBER and MIT, "Country Portfolios" (NBER Working Paper No. 7795)
  • Discussant: Fernando E. Alvarez, NBER and University of Chicago
  • Douglas W. Diamond and Raghuram G. Rajan, NBER and University of Chicago, "Liquidity Shortages and Banking Crises"
  • Discussant: Nobuhiro Kiyotaki, NBER and London School of Economics
  • Jeremy Greenwood and Mehmet Yorukoglu, University of Rochester, and Ananth Seshadri, University of Wisconsin, "Engines of Liberation"
  • Discussant: Robert J. Gordon, NBER and Northwestern University
  • Tobias J. Moskowitz and Annette Vissing-Jorgensen, University of Chicago, "The Private Equity Puzzle"
  • Discussant: Deborah J. Lucas, NBER and Northwestern University
  • Harold L. Cole and Lee E. Ohanian, University of California, Los Angeles, "New Deal Policies and the Persistence of the Great Depression: A General Equilibrium Analysis"
  • Discussant: Alan C. Stockman, NBER and University of Rochester
  • Ellen R. McGrattan and Edward C. Prescott, Federal Reserve Bank of Minneapolis, "Taxes, Regulations, and Asset Prices"
  • Discussant: Robert E. Hall

Kraay, Loayza, Serven, and Ventura ask how countries hold their financial wealth. They construct a database of 68 countries' claims on foreign and domestic capital, and of international borrowing and lending, from 1966 to 1997. The authors find that only a small amount of capital flows from rich countries to poor countries. Further, countries' foreign asset positions are remarkably persistent and generally take the form of foreign loans rather than foreign equity. In the presence of reasonable diminishing returns and production risk, the authors show that a probability of international crises on the order of two per century is enough to generate a set of country portfolios that are roughly consistent with the data.

Diamond and Rajan examine the effects of shortages of liquid assets on a banking system. They characterize the kinds of problems that can arise and the types of interventions that might be appropriate. They also point out the dangers of the wrong kind of intervention, such as infusing capital when the need is for liquidity, as well as the practical difficulty of telling what is needed in some situations.

Greenwood, Yorukoglu, and Seshadri examine the impact of the consumer durable goods revolution that began at the dawn of the last century. They argue that this revolution liberated women from the home. After developing a model of household production in which households must decide whether to adopt the new technologies and whether a married woman should work, they conclude that this model explains the rise in married female labor-force participation that occurred in the last century.

Moskowitz and Vissing-Jorgensen document that investment in private equity is extremely concentrated. Yet despite the very poor diversification of entrepreneurs' portfolios, the authors find that the returns to private equity are similar to the returns on public equity. Given the large premium required by investors in public equity, why do households willingly invest substantial amounts in a single privately-held firm with a worse risk-return tradeoff? The authors conclude that private nonpecuniary benefits of control must be large or entrepreneurs must greatly overestimate their probability of success.

There are two striking aspects of the recovery from the Great Depression in the United States: the recovery was very weak and real wages in several sectors rose significantly above trend. Cole and Ohanian evaluate whether New Deal cartelization policies designed to limit competition among firms and increase labor bargaining power can explain the persistence of the Depression. They develop a model of the intra-industry bargaining process between labor and firms that occurred with these policies. They conclude that New Deal cartelization policies are an important factor in accounting for the post-1933 Depression. Further, the key depressing element of New Deal policies was not collusion per se, but rather the link between paying high wages and collusion.

U.S. stock prices have risen much faster than GNP during the postwar period. Between 1960 and 2000, the value of equity relative to GNP more than doubled. McGrattan and Prescott use a standard growth model to show that economic theory predicts this rise in equity prices. Changes in taxes, primarily in taxes on dividends, account for the large change in equity prices. Theory also can account for the fact that stock returns have been much higher than bond returns over the postwar period.
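The mechanism can be sketched with a stylized calculation. The tax rates and the capital/GNP ratio below are illustrative assumptions, not McGrattan and Prescott's calibration: when dividends are taxed at rate tau_d, the market value of corporate equity is roughly (1 - tau_d) times the value of the underlying capital, so a fall in the effective dividend tax raises the equity-to-GNP ratio even with no change in capital.

```python
# Stylized sketch of the McGrattan-Prescott mechanism. The tax rates
# and the capital/GNP ratio below are illustrative assumptions, not
# the authors' calibration.
def equity_to_gnp(capital_to_gnp, tau_d):
    """Market value of equity relative to GNP when dividends are
    taxed at rate tau_d: roughly (1 - tau_d) times capital."""
    return (1.0 - tau_d) * capital_to_gnp

k_ratio = 1.0                            # corporate capital / GNP, held fixed
v_1960 = equity_to_gnp(k_ratio, 0.45)    # high effective dividend tax
v_2000 = equity_to_gnp(k_ratio, 0.15)    # much lower effective tax

# With capital unchanged, the equity/GNP ratio rises by roughly half
# in this illustration, from the tax change alone.
print(round(v_2000 / v_1960, 3))
```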

 

Industrial Organization

The NBER's Project on Industrial Organization, directed by Nancy L. Rose of Stanford University, met at the Bureau's California office on February 10. Rose and her co-organizer, Susan Athey of NBER and MIT, chose the following papers for discussion:

  • Judith A. Chevalier and Anil K Kashyap, NBER and University of Chicago, and Peter Rossi, University of Chicago, "Why Don't Prices Rise During Periods of Peak Demand? Evidence from Scanner Data" (NBER Working Paper No. 7981)
  • Discussant: Robert Porter, NBER and Northwestern University
  • Philip Leslie, University of California, Los Angeles, and Ginger Jin, University of Maryland, "The Effects of Disclosure Regulation: Evidence from Restaurants"
  • Discussant: Daniel Kessler, NBER and Stanford University
  • Brian Viard, Stanford University, "Do Switching Costs Make Markets More or Less Competitive? The Case of 800-Number Portability"
  • Discussant: Chris Knittel, Boston University
  • Alan T. Sorensen, University of California, San Diego, "Price Dispersion and Heterogeneous Consumer Search for Retail Prescription Drugs"
  • Discussant: Florian Zettelmeyer, University of California, Berkeley
  • Margaret Slade, University of British Columbia, "Assessing Market Power in U.K. Brewing"
  • Discussant: Steven T. Berry, NBER and Yale University

Chevalier and her coauthors examine the retail and wholesale prices of a large supermarket chain in Chicago over seven and a half years. They show that prices tend to fall during the seasonal demand peak for a product and that changes in retail margins explain most of those price changes. In other words, markups are countercyclical. The pattern observed in these data is consistent with "loss leader" models of retailer pricing and advertising competition. Manufacturer behavior plays a more limited role in the countercyclicality of prices.

Jin and Leslie examine the effect on firm behavior of increased product information to consumers. They show that mandatory disclosure of hygiene grades - required by Los Angeles County in 1998 - causes restaurants to increase hygiene quality by an average of 5.3 percent. They find little difference between these results and those obtained with voluntary but verifiable disclosure. Economic incentives drive these results: average restaurant revenue is higher because of the introduction of grade cards, and the increase in revenue is higher for restaurants with better hygiene quality grades.

Before portability of 800, or toll free, phone numbers, a customer had to change numbers to change service providers. This imposed significant switching costs on users, who generally invested heavily to publicize these numbers. In May 1993, a new database made 800-numbers portable. Viard uses contracts for virtual private network (VPN) services to test how AT&T adjusted its prices for toll-free services in response to portability. He finds that AT&T reduced margins for toll-free calls (both switched and dedicated) under VPN contracts as the portability date approached, implying that the switching costs under non-portability made the market less competitive. Portability also lowered margins for toll services because of cross-subsidization across services within contracts. Viard's results suggest that, despite toll-free services growing rapidly during this time period, AT&T's incentive to charge a higher price to "locked-in" consumers exceeded its incentive to capture new consumers in the high switching costs era of non-portability.

Sorensen uses detailed data on retail pharmacy transactions to make inferences about the nature and intensity of consumer search for prescription drugs. He estimates that for a typical prescription, approximately 10 percent of consumers price shop. However, variation in this estimated search intensity across drugs is substantial and appears to be consistent with explanations based on rational search. For example, price shopping is more prevalent for maintenance medications than for one-time purchases, presumably because the benefits of finding a low price are magnified for prescriptions that are purchased repeatedly. The cost of conducting an exhaustive price search is about $15 for the average consumer, and search costs are substantially lower among females than males.
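The economics of the search decision can be illustrated with a back-of-the-envelope calculation. The pharmacy prices below are hypothetical; the $15 search cost is Sorensen's estimate for an exhaustive search. At that cost, shopping around rarely pays for a one-time prescription but can pay for a maintenance drug refilled monthly.

```python
# Back-of-the-envelope search calculation with hypothetical pharmacy
# prices; the $15 search cost is Sorensen's estimate for an
# exhaustive search.
def search_gain(prices, search_cost):
    """Savings from checking every store instead of buying at the
    first one visited, net of the cost of the search."""
    return prices[0] - min(prices) - search_cost

prices = [42.0, 39.5, 41.0, 44.0, 38.0]   # same prescription, five stores
refills = 12                              # a monthly maintenance drug

gain_once = search_gain(prices, 15.0)                    # one-time purchase
gain_maint = refills * (prices[0] - min(prices)) - 15.0  # search pays once

print(f"one-time purchase: ${gain_once:.2f}")    # negative: not worth it
print(f"12 monthly refills: ${gain_maint:.2f}")  # positive: worth shopping
```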

Slade examines market power in U.K. brewing, an industry that has witnessed a number of recent mergers and has been scrutinized by both U.K. and EU authorities. She estimates two classes of demand equations and approximates marginal costs in three different ways. Finally, she compares various notions of industry equilibrium. It turns out that the most important decision from the point of view of market-power assessment is the choice of demand model. Different classes of demand equations yield very different predictions concerning elasticities and markups; within a demand-model class, all methods of assessing market power result in similar predictions concerning industry performance. Both differentiation among firms in the brewing industry and their small number endow the industry with the power to charge prices in excess of marginal costs, but Slade finds no evidence of collusion.

 

Insurance

The NBER's Insurance Research Group met in Cambridge on February 16. Kenneth A. Froot, NBER and Harvard University, and Howard C. Kunreuther, NBER and University of Pennsylvania, organized this program:

  • Martin F. Grace and Richard D. Phillips, "The Allocation of Governmental Regulatory Authority: Federalism and the Case of Insurance Regulation"
  • Discussants: Dwight Jaffee, University of California, Berkeley, and John Major, Guy Carpenter & Co.
  • Karen Epermanis, University of Hartford, and Scott E. Harrington, University of South Carolina, "Financial Rating Changes and Market Discipline in Property Liability Insurance"
  • Discussants: James Ament, State Farm Fire & Casualty Co., and Paul R. Kleindorfer, University of Pennsylvania
  • Jeffrey R. Brown, NBER and Harvard University, Olivia S. Mitchell, and James M. Poterba, "The Role of Real Annuities and Indexed Bonds in an Individual Accounts Retirement Program"
  • Discussants: James Garven, Baylor University, and Richard Meyer, Harvard University
  • J. David Cummins, University of Pennsylvania, and Olivier Mahul, INRA, "Managing Catastrophic Risk with Insurance Contracts Subject to Default Risk"
  • Discussant: Richard Derrig, Automobile Insurers Bureau of Massachusetts
  • Gordon Woo, Risk Management Solutions, "Territorial Diversification of Catastrophe Bonds"
  • Discussants: Paul Freeman, The World Bank, and Kenneth A. Froot
  • Howard C. Kunreuther and Mark Pauly, University of Pennsylvania, "Ignoring Disaster: Don't Sweat the Big Stuff"
  • Discussants: Steve Goldberg, USAA Property & Casualty Insurance Group, and Sendhil Mullainathan, NBER and MIT
  • Robert J. Willis and Lee A. Lillard, University of Michigan, "Cognition and Wealth: The Importance of Probabilistic Thinking"
  • Discussants: Olivia Mitchell, NBER and University of Pennsylvania, and Thomas Russell, Santa Clara University

Grace and Phillips investigate the states' incentives for providing insurance regulation in an efficient manner. Regulation of the U.S. insurance industry is unique because it is conducted primarily at the state level while the majority of insurance sales are interstate. Consistent with predictions from the federalism literature, the authors find evidence of trans-state externalities: states with small domestic insurance markets are less efficient producers of insurance regulation and appear to rely on the states that choose to expend the greatest resources on regulation to regulate for them. In addition, states with more profitable domestic insurers export greater levels of regulation, suggesting that extraterritorial regulation may erect barriers to entry. The authors find increasing economies of scale in the production of insurance regulation after they control for these regulatory externalities. Taken together, their results suggest that aggregating the production of regulation to a multistate or federal level may resolve a number of inefficiencies in the current system.

Epermanis and Harrington analyze growth in premiums surrounding rating changes by the A. M. Best Co. during 1992-96 for a large panel of property-liability insurers. Their analysis generally provides evidence of significantly lower revenue growth in the year of, and the year after, a rating downgrade. The evidence of revenue declines is strongest for firms that had relatively low ratings (below A-) prior to being downgraded. The authors also find that rating upgrades were accompanied by increased growth in premiums. Overall, material market discipline appears to exist for rated insurers despite guaranty fund protection and other factors that dull consumer incentives to seek safe insurers, and despite insurer incentives for efficient risk management.

Brown, Mitchell, and Poterba explore four issues concerning annuitization options that retirees might use in the decumulation phase of an individual accounts retirement saving system. First, they investigate the operation of both real and nominal individual annuity markets in the United Kingdom. The widespread availability of real annuities in the United Kingdom dispels the argument that private insurance markets could not provide real annuities to retirees. Second, they consider the current structure of two inflation-linked insurance products available in the United States, only one of which proves to be a real annuity. Third, the authors evaluate the potential of assets such as stocks, bonds, and bills to provide retiree protection from inflation. Because real equity returns have been high over the last seven decades, a retiree who received income linked to equity returns would have fared very well on average. Nevertheless, the authors cast doubt on the "inflation insurance" aspect of equities, since it is mainly attributable to stocks' high average return and not to stock returns moving in tandem with inflation. Finally, they assess potential retiree willingness to pay for real, nominal, and variable payout equity-linked annuities. They conclude that people would value a variable payout equity-linked annuity more highly than a real annuity because the additional real returns associated with common stocks more than compensate for the volatility of prospective payouts.

Cummins and Mahul develop a model of optimal contracting for financing large, infrequent events in which the payoffs are subject to both default risk and basis risk. Full insurance above a deductible is optimal when both parties have the same perception of the seller's default risk. When this perception differs, the optimal insurance design depends on behavioral concepts that are standard in the literature of insurance economics, such as risk tolerance, and on actuarial-science concepts, such as hazard rates. Under some specific behavioral and actuarial assumptions, the first-best indemnity schedule should increase and be concave with loss. This could help to explain why in real-world reinsurance markets the proportion of the loss reinsured decreases with the size of the loss.

Woo analyzes the expanding territorial coverage of catastrophe bonds. With catastrophe bonds being under investment management as a distinct asset class, the addition of bonds that are geographically uncorrelated (or weakly correlated) with others comes as a welcome source of portfolio diversification. Apart from the most acute concentrations of earthquake risk in California and Tokyo, bonds have been issued to cover aggregations of earthquake risk in less active seismic zones, such as Monaco. Severe windstorms in Europe also have been covered, along with tropical cyclones in the United States and Japan. Although catastrophe bonds initially focused on a single peril and territory, they now have been structured with independent multiple event triggers, differing according to peril and territory. Woo reviews the territorial development of catastrophe bonds and explores the geographical horizon for new issues.
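The diversification argument above can be illustrated with the standard portfolio-variance formula; the loss volatilities and correlations below are illustrative numbers only. With equal weights on perils of equal volatility, the risk of a portfolio of uncorrelated catastrophe bonds shrinks roughly as one over the square root of the number of bonds.

```python
# Portfolio-variance sketch of territorial diversification; the loss
# volatilities and correlations are illustrative numbers only.
def portfolio_std(n_bonds, per_bond_std, correlation=0.0):
    """Standard deviation of an equally weighted portfolio of
    identically distributed bonds with a common pairwise correlation."""
    var = per_bond_std ** 2 * (1.0 / n_bonds + (1.0 - 1.0 / n_bonds) * correlation)
    return var ** 0.5

single = portfolio_std(1, 0.10)                   # one peril, one territory
spread = portfolio_std(4, 0.10)                   # four uncorrelated perils
linked = portfolio_std(4, 0.10, correlation=0.5)  # correlated perils

# Uncorrelated perils halve the risk here; correlated ones help far less.
print(round(single, 3), round(spread, 3), round(linked, 3))
```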

Individuals have difficulty in dealing with low-probability, high-loss events. Frequently they fail to obtain insurance against such losses, even when the insurance terms are very favorable. Kunreuther and Pauly suggest that one reason that many people do not purchase insurance is that the transaction costs and attention time of obtaining and processing information on protection is so high as to not justify this effort. While some rare events surely can cause enormous losses when they occur, the ex ante expected value of such losses may be small. But there is more to the problem than just comparing gains and losses. What individuals will choose to do depends on the assumptions they make about the functioning of insurance markets, as well as on their perceptions of the risk and their decisionmaking processes.

Willis and Lillard use a large set of subjective probability questions from the Health and Retirement Survey to construct an index measuring the precision of probabilistic beliefs. They then relate this index to household choices about the riskiness of their portfolios and the rate of growth of their net worth. The authors propose a theory of uncertainty aversion based on repeated sampling that resolves the Ellsberg Paradox within a conventional expected utility model; in this theory, uncertainty aversion is implied by risk aversion. They then propose a link between an individual's degree of uncertainty and his propensity to give "focal" or "exact" answers to survey questions. After constructing an index of the precision of probabilistic thinking, they show that it has a statistically and economically significant positive effect on the fraction of risky assets in household portfolios and on the rate of longitudinal growth of these assets. These results suggest that there is systematic variation in the competence of individuals to manage investment accounts; that variation should be considered in designing policies to create individual retirement accounts in the Social Security system.
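A sketch of the kind of classification behind such an index follows; the scoring rule and the example answers are assumptions for illustration, not the authors' exact construction.

```python
# Hypothetical sketch of a precision-of-beliefs index: classify each
# subjective-probability answer (in percent) as "focal" (0, 50, or
# 100) versus "exact", and score precision as the share of exact
# answers. The scoring rule is an assumption for illustration.
FOCAL = {0, 50, 100}

def precision_index(answers):
    """Fraction of probability answers that are not focal."""
    exact = [a for a in answers if a not in FOCAL]
    return len(exact) / len(answers)

vague_respondent = [50, 50, 0, 100, 50]    # always at focal points
precise_respondent = [30, 75, 10, 90, 65]  # graded, "exact" beliefs

print(precision_index(vague_respondent))   # 0.0
print(precision_index(precise_respondent)) # 1.0
```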

 

Development of the American Economy

The NBER's Program on the Development of the American Economy, directed by Claudia Goldin of Harvard University, met in Cambridge on March 10. The following papers were discussed:

  • Joshua Rosenbloom, NBER and University of Kansas, and William Sundstrom, Santa Clara University, "Long-Run Patterns of Interstate Migration in the United States: Evidence from the IPUMS, 1850-1990"
  • Jeremy Atack and Robert A. Margo, NBER and Vanderbilt University, and Fred Bateman, University of Georgia, "Productivity in Manufacturing and the Length of the Working Day: Evidence from the 1880 Census of Manufactures"
  • John J. Wallis, University of Maryland and NBER, "What Caused the Crisis of 1839?"
  • William J. Collins, NBER and Vanderbilt University, "The Labor Market Impact of State-Level Anti-Discrimination Laws, 1940-1960"
  • Michael R. Haines, NBER and Colgate University, "The Urban Mortality Transition in the United States, 1800-1940"

Rosenbloom and Sundstrom explore several ways of using individual-level data drawn from the Integrated Public Use Microdata Samples of the U.S. Population Censuses (IPUMS) to trace the evolution of migration behavior. They construct two measures of interstate migration over the past 150 years. The first measure considers an individual to have moved if he or she is residing in a state different from his or her state of birth. The second measure considers a family to have moved if it is residing in a state different from the state of birth of one of its young children. They use these measures to follow interstate migration patterns for successive synthetic birth cohorts of individuals from 1850 through 1990. These allow the authors to describe life-cycle patterns of migration and how they have changed over time. Migration was always most common among the young, but this relationship has grown stronger over time. Comparing migration behavior across cohorts, the authors find that migration propensities have followed a U-shaped pattern since 1850, falling until around 1900 and then rising until around 1970. They also find evidence of substantial differences by race, sex, and region in migration behavior.
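The two measures can be sketched in a few lines of code; the record fields below are hypothetical stand-ins for the corresponding IPUMS variables.

```python
# Sketch of the two IPUMS-style migration measures. Field names are
# hypothetical stand-ins; actual IPUMS variable names differ.
def person_moved(record):
    """Measure 1: an individual has migrated if the current state of
    residence differs from the state of birth."""
    return record["state_of_residence"] != record["state_of_birth"]

def family_moved(household):
    """Measure 2: a family has migrated if it resides in a state
    different from the birth state of one of its young children."""
    kids = [m for m in household if m["age"] <= 5]
    return any(k["state_of_residence"] != k["state_of_birth"] for k in kids)

household = [
    {"age": 34, "state_of_birth": "OH", "state_of_residence": "CA"},
    {"age": 4,  "state_of_birth": "OH", "state_of_residence": "CA"},
]
print(person_moved(household[0]))  # adult born in OH, now in CA
print(family_moved(household))     # young child born outside current state
```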

Margo, Atack, and Bateman use data from the manuscript census of manufacturing to estimate the effects of the length of the working day on output and wages. They find that the elasticity of output with respect to daily hours worked was positive but less than one, implying diminishing returns to increases in working hours. When the annual number of days worked is held constant, the average annual wage is related positively to daily hours worked, but again the elasticity is less than one. At ten hours per day, the marginal benefits to employers of a shorter working day - lower wage bills - were approximately offset by the marginal cost - lower output.
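A small numeric illustration of an output-hours elasticity below one follows; the Cobb-Douglas form and the 0.6 elasticity are assumptions for illustration, not the authors' estimates.

```python
# Numeric illustration of an output-hours elasticity below one. The
# Cobb-Douglas form and the 0.6 elasticity are assumptions for
# illustration, not the authors' estimates.
def output(hours, elasticity=0.6, scale=100.0):
    return scale * hours ** elasticity

h = 10.0
marginal_product = (output(h + 0.01) - output(h)) / 0.01  # extra output per hour
average_product = output(h) / h

# With elasticity < 1, the marginal hour yields less than the average
# hour, so lengthening a ten-hour day adds relatively little output.
print(round(marginal_product, 2), round(average_product, 2))
```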

The American economy experienced financial crises in May of 1837 and October of 1839. The Panic of 1837 has been studied in detail, but it was the Crisis of 1839 that brought on four years of deflation and recession. Wallis examines the role of state government investment in canals, railroads, and banks, and of the large debts created to finance those investments, in both the swift economic recovery of 1838 and the long decline that set in after 1839.

By the time Congress passed the 1964 Civil Rights Act, 98 percent of non-Southern blacks were already covered by state-level "fair employment" laws that prohibited labor market discrimination. Collins assesses the impact of fair employment legislation on black workers' income, unemployment, labor force participation, and occupational and industrial distributions relative to whites. He finds that in general the fair employment laws had small or negligible effects on the labor market outcomes of black men but somewhat stronger positive effects on the labor market outcomes of black women.

Haines explains that in the United States as in many other nations in the 19th and early 20th centuries, there was a substantial mortality "penalty" to living in urban places. By around 1940, this penalty had been largely eliminated, and in many cases it was healthier to reside in the city than in the countryside. Despite the lack of systematic national data before 1933, it is possible to describe the phenomenon of the urban mortality transition. Early in the 19th century, the United States was not particularly urban (only 6.1 percent in 1800), a circumstance which led to a relatively favorable mortality situation. A national crude death rate of 20-25 per thousand per year would have been likely. Some early data indicate that mortality was substantially higher in cities, was higher in larger relative to smaller cities, and was higher in the South relative to the North. By 1900, the nation had become about 40 percent urban (and 56 percent by 1940). It appears that death rates actually rose (or at least did not decline) over the middle of the 19th century. Increased urbanization, as well as developments in transport and commercialization and increased movements of people into and throughout the nation, contributed to this. The sustained mortality transition only began about the 1870s. Thereafter the decline of urban mortality proceeded faster than in rural places, assisted by significant public works improvements and advances in public health and eventually medical science. Much of the process had been completed by the 1940s. The urban penalty had been largely eliminated and mortality continued to decline despite the continued growth in the urban share of the population.

 

Productivity

The NBER's Program on Productivity held its spring meeting in Cambridge on March 16. Organizers C. Lanier Benkard, NBER and Stanford University, and Iain M. Cockburn, NBER and Boston University, put together this program:

  • Sangin Park, State University of New York, Stony Brook, "Innovation-Adjusted Price Indexes for Pharmaceuticals"
  • Discussant: Scott Stern, NBER and MIT
  • Laura Blow and Ian Crawford, Institute for Fiscal Studies, "Valuing a New Good"
  • Discussant: Peter Davis, MIT
  • Ariel Pakes, NBER and Harvard University, "Notes on Hedonic Price Indexes with an Application to PCs"
  • Discussant: W. Erwin Diewert, NBER and University of British Columbia
  • Igal Hendel, NBER and University of Wisconsin, and Aviv Nevo, NBER and University of California, Berkeley, "Sales and Consumer Inventory"
  • Discussant: Victor Aguirregeburia, Boston University
  • Kevin Stiroh, Federal Reserve Bank of New York, "Information Technology and the U.S. Productivity Revival: What Do the Industry Data Say?"
  • Discussant: Martin Baily, former Chairman of the President's Council of Economic Advisers
  • Jinyoung Kim, State University of New York, Buffalo, and Gerald Marschke, State University of New York, Albany, "The Labor Market for Scientists and the Recent Rise in Patenting"
  • Discussant: Samuel S. Kortum, NBER and Boston University

Pharmaceuticals are a special case of goods with some quality that is unobservable prior to consumption, called an "experience characteristic." Consumers learn about these experience characteristics from consumption and from advertising. Park proposes price indexes for pharmaceuticals that are innovation-adjusted and applies these indexes to data on antidepressant drugs during 1980-95. He finds that the key source of innovation is the entry of new products, but that the effects of learning about experience characteristics also are significant. He also finds that the average annual growth rate of his innovation-adjusted price index declines by almost 9.5 percent, which suggests that the existing price indexes for pharmaceuticals may seriously overstate the rate of inflation in a rapidly growing market with entry of innovative products.

Blow and Crawford suggest a method of valuing a new good that does not depend on the relationship between a household's economic welfare and the goods and services it consumes--only on the existence of such a relationship. They illustrate their technique with U.K. household budget survey data and calculate the welfare effects of the launch of the National Lottery in the United Kingdom in November 1994. The authors show that the increase in economic welfare associated with the arrival of the Lottery was greater for better-off households. They also show how measures of inflation over this period are affected by the inclusion of the new good, and they describe how the distributional effects of inflation are more strongly pro-rich when they allow for the new good.

Pakes considers the use of hedonic techniques to ameliorate new-goods biases in price indexes. He provides a conceptual comparison of the hedonic index to the traditional matched-model index, stressing the selection bias in the latter: it makes no price comparisons for discontinued goods, and the prices of discontinued goods tend to fall by more than the average. He then provides a hedonic price index for desktop PCs and compares it to a traditional matched-model index calculated on the same data for 1995 to 1999. The hedonic price index falls in every period, at an average rate of decline of about 15 percent, while the matched-model index is slightly positive and negatively correlated with the hedonic index. This negative correlation is expected, because the selection bias in the traditional matched-model index is particularly large in years when many new products enter and make existing products obsolete; those are precisely the periods when the hedonic index indicates that many product improvements were made.
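The selection bias can be seen in a toy calculation with hypothetical PC prices: a matched-model index that drops the discontinued model misses its steep final price decline.

```python
# Toy illustration of matched-model selection bias with hypothetical
# PC prices: the matched-model index drops the discontinued model and
# so misses its steep final price decline.
prices_t0 = {"pc_a": 2000.0, "pc_b": 1800.0, "pc_c": 1500.0}
prices_t1 = {"pc_a": 1900.0, "pc_b": 1750.0}     # pc_c is discontinued
exit_price = {"pc_c": 900.0}                     # its last observed price

matched = [prices_t1[m] / prices_t0[m] for m in prices_t1]
matched_index = sum(matched) / len(matched)      # continuing models only

all_t1 = {**prices_t1, **exit_price}
full = [all_t1[m] / prices_t0[m] for m in all_t1]
full_index = sum(full) / len(full)               # includes the exiting model

print(round(matched_index, 3))  # mild measured decline
print(round(full_index, 3))     # much steeper decline
```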

Hendel and Nevo study intertemporal price discrimination in markets for nondurable but storable goods, such as groceries, in which sales are frequent. They consider a consumer's dynamic problem of being able to store a consumption good while facing uncertain future prices. To test the model, they use weekly store-level price and quantity scanner data on laundry detergents, as well as a household-level dataset. Their preliminary results suggest that the duration since the previous sale has a positive effect on the aggregate quantity purchased. Two indirect measures of storage are positively correlated with a household's tendency to buy on sale. One measure of inventory is negatively correlated with the quantity purchased (conditional on a purchase) and with the probability of buying, conditional on being in a store.

Stiroh examines the link between information technology (IT) and the U.S. productivity revival in the late 1990s. Industry-level data show a broad productivity resurgence that reflects both the production and the use of IT. The most IT-intensive industries experienced significantly larger productivity gains than other industries. A wide variety of econometric tests show a strong correlation between IT capital accumulation and labor productivity. To quantify the aggregate impact of IT use and IT production, Stiroh presents a novel decomposition of aggregate labor productivity. He shows that virtually all of the aggregate productivity acceleration can be traced to the industries that either produce IT or use IT most intensively, with essentially no contribution from the remaining industries that are less involved in the IT revolution.

Kim and Marschke develop and test a model of the patenting and R&D decisions of a firm whose researcher-employees sometimes move to a competitor. In their model, a firm facing the prospect of a scientist leaving risks losing its innovations to the scientist's future employer. But a firm can mitigate this risk by moving quickly to patent its scientists' innovations. Thus, a firm's propensity to patent an innovation rises with the likelihood of a researcher's departure. Their model also shows that an increase in the probability of a scientist leaving is likely to reduce research expenditures, and therefore to raise the patent-R&D expenditures ratio. Using firm-level panel data on patenting and R&D and industry estimates of labor mobility, the authors show that firms in industries with higher job turnover rates generate more patents, consistent with firms using patenting to prevent employee misappropriation of intellectual property. Also consistent with their theory are the authors' findings that job turnover rates are negatively correlated with firm-level R&D outlays and positively correlated with the patent-R&D ratio. The evidence indicates that the increasing mobility of scientists may be driving part of the rapid rise in patenting since the mid-1980s.

 

International Finance and Macroeconomics

The NBER's Program on International Finance and Macroeconomics held its spring meeting in Cambridge on March 23. Richard Lyons and Andrew Rose, both of NBER and University of California, Berkeley, organized this program:

  • Kathryn Dominguez and Linda Tesar, NBER and University of Michigan, "Exchange Rate Exposure"
  • Discussants: Bernard Dumas, NBER and INSEAD, and Kristin Forbes, NBER and MIT
  • Eric Parrado, New York University, and Andrés Velasco, NBER and Harvard University, "Optimal Interest Rate Policy in a Small Open Economy: The Case for a Clean Float"
  • Discussants: Paolo Pesenti, Federal Reserve Bank of New York, and Jaume Ventura, NBER and MIT
  • Pierre-Olivier Gourinchas, NBER and Princeton University, Rodrigo Valdés, Chilean Ministry of Finance, and Oscar Landerretche, MIT, "Lending Booms: Latin America and the World"
  • Discussants: Michael Dooley, NBER and University of California, Santa Cruz, and Carmen Reinhart, NBER and University of Maryland
  • Menzie Chinn, NBER and University of California, Santa Cruz, and Guy Meredith, International Monetary Fund, "Testing Uncovered Interest Parity at Short and Long Horizons"
  • Discussants: Geert Bekaert, NBER and Columbia University, and Karen Lewis, NBER and University of Pennsylvania
  • Margarida Duarte, Federal Reserve Bank of Richmond, and Alan Stockman, NBER and University of Rochester, "Rational Speculation and Exchange Rates"
  • Discussants: Kenneth Rogoff, NBER and Harvard University, and Olivier Jeanne, International Monetary Fund

Finance theory suggests that, in a world with integrated capital markets, exposure to foreign markets should have little influence on asset prices. Dominguez and Tesar find that, in a pooled sample of eight (non-US) industrialized and emerging markets, between 12 and 23 percent of firms are exposed to exchange rate movements. They also find that the choice of exchange rate matters: using the trade-weighted exchange rate is likely to understate the extent of exposure. The exposure they measure is not the result of a spurious correlation between random variables with high variances: exposure increases with the return horizon, and within a country and an industry, exposure coefficients are roughly evenly split between positive and negative values. Averaging across the (absolute values of the) significant exposure coefficients in their sample of countries, the authors estimate the exposure coefficient to be about 0.5. The extent of exposure is not sensitive to the sample period, but the set of firms that is exposed does vary over time; the sign of the exposure coefficient changes across subperiods for about half of the firms in the sample. Finally, exposure does not appear to be related to firm size, industry affiliation, multinational status, foreign sales, international assets, or industry-level trade.
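An exposure regression of this kind is conventionally run by regressing a firm's stock return on the market return and the change in the exchange rate; the coefficient on the exchange rate is the exposure. A minimal sketch on simulated data (the sample size, volatilities, and the true exposure of 0.5 are illustrative assumptions, not figures from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000  # number of return observations (illustrative)

# Simulated data: market return, exchange-rate change, and a firm
# return constructed with a true exposure coefficient of 0.5.
mkt = rng.normal(0.0, 0.02, n)
fx = rng.normal(0.0, 0.015, n)
firm = 1.0 * mkt + 0.5 * fx + rng.normal(0.0, 0.03, n)

# OLS of the firm return on a constant, the market return, and the
# exchange-rate change; the last coefficient is the exposure estimate.
X = np.column_stack([np.ones(n), mkt, fx])
beta, *_ = np.linalg.lstsq(X, fy := firm, rcond=None)
exposure = beta[2]
print(round(exposure, 2))  # should be near the true value of 0.5
```

With enough observations the regression recovers the exposure built into the data; in the authors' samples the analogous coefficient averages about 0.5 across significantly exposed firms.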

Parrado and Velasco derive the optimal monetary and exchange rate policy for a small stochastic open economy with imperfect competition and short-run price rigidity. They conclude that the optimal policy depends crucially on the source of stochastic disturbances affecting the economy, much as in the literature pioneered by Poole (1970). Optimal monetary policy reacts to domestic disturbances but does not respond at all to foreign shocks. Consequently, under the optimal policy the exchange rate floats cleanly, and monetary policy is aimed exclusively at stabilizing the home economy.

Recent theories of crisis put lending booms at the root of financial collapses. Yet lending booms may be a natural consequence of economic development and fluctuations. Gourinchas, Valdés, and Landerretche investigate whether lending booms are indeed dangerous, using a sample of episodes spanning 40 years, with special attention to Latin America. They find that lending booms often are associated with domestic investment booms, increases in domestic interest rates, a worsening of the current account, declines in reserves, real appreciation, and declines in output growth. The "typical" lending boom does not substantially increase the vulnerability of the banking sector or the balance of payments. Comparing countries, however, the authors find that lending booms in Latin America make the economy considerably more volatile and vulnerable to financial and balance-of-payment crises.

The hypothesis that interest rate differentials are unbiased predictors of future exchange rate movements has been rejected almost universally in empirical studies using short-horizon data. Chinn and Meredith test this hypothesis using interest rates on longer-maturity bonds for the G-7 countries. They find that the coefficients on interest differentials are of the correct sign, and almost all are closer to the predicted value of one than to zero. These results are robust to changes in data type and base currency (Deutschemark versus U.S. dollar).

Models of exchange rates typically have failed to produce results consistent with the key fact that real and nominal exchange rates move in ways not closely connected to current (or past) macroeconomic or trade variables. Models that rely on the same shocks to drive fluctuations in macro variables and exchange rates typically predict counterfactually strong co-movements between them. Duarte and Stockman propose a new approach to exchange rates and implement it with a new-open-economy macro model. The approach focuses on the effects of speculation and the resulting changes in risk premiums on foreign-exchange markets. Exchange rates follow a forward-looking, first-order stochastic difference equation that includes terms involving risk premiums. Changes in risk premiums can affect the current exchange rate without necessarily creating large changes in current macroeconomic variables. Consequently, the approach has the potential to explain the Flood-Rose exchange-rate "disconnect" puzzle. However, the baseline model does not yet generate a sufficient degree of rational speculation to explain observed variation of risk premiums and exchange rates.
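The forward-looking difference equation described above can be illustrated with a simple simulation: hold the macro fundamental constant and let a persistent risk premium be the only shock, so all exchange-rate variation comes from the risk premium. The functional form s_t = (1-b)f + b E_t[s_{t+1}] + rp_t and all parameter values below are illustrative assumptions, not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(2)
T = 500
b, rho = 0.95, 0.9   # discount factor and risk-premium persistence (illustrative)
f = 0.0              # fundamental held constant to isolate the risk premium

# AR(1) risk premium: rp_t = rho * rp_{t-1} + eps_t.
rp = np.zeros(T)
eps = rng.normal(0.0, 0.01, T)
for t in range(1, T):
    rp[t] = rho * rp[t - 1] + eps[t]

# Solving s_t = (1-b)*f + b*E_t[s_{t+1}] + rp_t forward with a constant
# fundamental gives s_t = f + rp_t / (1 - b*rho).
s = f + rp / (1.0 - b * rho)

# The exchange rate fluctuates even though the fundamental never moves,
# a crude version of the "disconnect" the paragraph describes.
print(round(float(np.std(s)), 3))
```

Because the discounted sum amplifies the persistent risk premium (the factor 1/(1 - b*rho)), modest risk-premium shocks generate sizable exchange-rate movements with zero co-movement with the fundamental.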

 

Asset Pricing

The NBER's Program on Asset Pricing met on March 24 on the campus of the University of California, Los Angeles. Program Director John H. Cochrane of the University of Chicago and Monika Piazzesi, NBER and University of California, Los Angeles, organized the meeting and chose the following papers for discussion:

  • Darrell Duffie, NBER and Stanford University, and B. Nicolae Garleanu and Lasse H. Pedersen, Stanford University, "Valuation in Dynamic Bargaining Markets"
  • Discussant: Tano Santos, University of Chicago
  • Pietro Veronesi, NBER and University of Chicago, "Belief-Dependent Utilities, Aversion to State-Uncertainty, and Asset Prices"
  • Discussant: Lars P. Hansen, NBER and University of Chicago
  • Jun Liu and Francis A. Longstaff, University of California, Los Angeles, "Losing Money on Arbitrage: Optimal Dynamic Portfolio Choice in Markets with Arbitrage Opportunities"
  • Discussant: Ming Huang, Stanford University
  • Owen A. Lamont and Richard H. Thaler, NBER and University of Chicago, "Can the Market Add and Subtract? Mispricing in Tech Stock Carve-Outs"
  • Mark Mitchell and Erik Stafford, Harvard University, and Todd C. Pulvino, Northwestern University, "Limited Arbitrage in Equity Markets"
  • Discussant for both papers: Bradford Cornell, University of California, Los Angeles
  • Jessica A. Wachter, New York University, "Habit Formation and Returns on Bonds and Stocks"
  • Discussant: Christopher I. Telmer, Carnegie Mellon University
  • Peter L. Bossaerts, California Institute of Technology and CEPR; Charles R. Plott, California Institute of Technology; and William R. Zame, University of California, Los Angeles, "Structural Econometric Tests of General Equilibrium Theory on Data from Large-Scale Experimental Financial Markets"
  • Discussant: Wayne E. Ferson, NBER and University of Washington

Duffie, Garleanu, and Pedersen study the impact on asset prices of illiquidity associated with search and bargaining in an economy in which agents can interact only when they find each other. Even when market makers are present, investors' abilities to meet directly are important. Prices are higher and bid-ask spreads lower if investors can find each other more easily. Prices approach the Walrasian price if investors' search intensity increases or if market makers, who do not have all the bargaining power, search more intensely. Endogenizing search intensities yields natural implications. Finally, the authors show that information can fail to be revealed through prices when searching is difficult.

Veronesi reinterprets standard axioms in choice theory to introduce the concepts of "belief dependent" utility functions and aversion to "state uncertainty." Within a standard pure-exchange economy in which investors ignore the long-run drift of consumption growth ("the state"), such preferences help explain the various stylized facts of stock returns, including a high equity risk premium, a low risk-free rate, high return volatility, stock return predictability, and volatility clustering. Since the long-run drift of consumption determines the (average) path of future consumption, aversion to state uncertainty suggests aversion to the dispersion of long-run consumption paths. This differs from the standard notion of (local) risk aversion in its temporal dimension. When the model is calibrated to real consumption, it generates unconditional moments for asset returns that confirm the empirical observations. In addition, when it is estimated using consumption data, the fitted model produces posterior distributions on the drift rate of consumption that are relatively dispersed, further motivating the notion of aversion to long-run risk.

In theory, an investor can make infinite profits by taking unlimited positions in an arbitrage. In reality, investors must satisfy margin requirements that completely change the economics of arbitrage. Liu and Longstaff derive the optimal investment policy for a risk-averse investor in a market in which arbitrage opportunities exist. They show that it is often optimal to underinvest in the arbitrage by taking a smaller position than margin constraints allow. In some cases, it is actually optimal to walk away from a pure arbitrage opportunity. Even when the optimal policy is followed, the arbitrage strategy may underperform the riskless asset or have an unimpressive Sharpe ratio. Furthermore, the arbitrage portfolio typically experiences losses at some point before the final convergence date. These results have important implications for the role of arbitrageurs in financial markets.

Recent equity carve-outs in U.S. technology stocks appear to violate a basic premise of financial theory: identical assets should have identical prices. Lamont and Thaler explore a sample of carve-outs in which shareholders of the parent company expect to receive shares of the carved-out subsidiary; a prominent example involves 3Com and Palm. The authors find that arbitrage fails to eliminate blatant mispricing resulting from short sale constraints: one stock is overpriced but is expensive or impossible to sell short. Evidence from options prices shows that shorting costs are extremely high, eliminating exploitable arbitrage opportunities.
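The mispricing can be seen with simple "stub value" arithmetic: subtract the per-share value of the parent's stake in the subsidiary from the parent's own share price, and what remains is the implied value of everything else the parent owns. In cases like 3Com/Palm this stub was negative. A minimal sketch (all numbers below are hypothetical, chosen only to illustrate the sign, not the actual 2000 prices):

```python
# Negative-stub arithmetic for a parent holding shares of a subsidiary.
# All numbers are hypothetical illustrations, not historical prices.

parent_price = 82.0      # parent company's share price
sub_price = 95.0         # carved-out subsidiary's share price
shares_per_parent = 1.5  # subsidiary shares promised per parent share

# Per-share value of the subsidiary stake embedded in the parent.
stake_value = shares_per_parent * sub_price  # 1.5 * 95 = 142.5

# "Stub" = implied value of the parent's remaining businesses.
# A negative stub says the market prices those businesses below zero.
stub = parent_price - stake_value
print(stub)  # -60.5
```

The textbook arbitrage is to buy the parent and short the subsidiary, locking in the negative stub; the papers above show why short sale constraints and other frictions kept this trade from closing the gap.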

Mitchell, Stafford, and Pulvino examine the impediments to arbitrage in 82 situations between 1985 and 2000 in which the market value of a company is less than the sum of its publicly traded parts. These situations, often referred to as "negative stub values," suggest clear arbitrage opportunities and provide an ideal setting in which to study the risks and market frictions that prevent arbitrageurs from forcing prices to fundamental values. The authors find that 30 percent of negative stub deals terminate without converging. In addition, they estimate that the returns to a specialized arbitrageur would be 50 percent larger if the path to convergence were smooth rather than volatile, but the marginal investor in negative stub values is not likely to be so specialized. Short-selling frictions do not appear to be a major impediment to arbitrage in their sample. Information and transaction costs appear to be the most serious limits to arbitrage.

Wachter proposes a model that captures the ability of the yield spread to predict excess returns on bonds as documented in empirical studies. The model, a generalization of Campbell and Cochrane (1999), also captures the predictability of stock returns by the price-dividend ratio, a high equity premium, excess volatility, positive excess returns on bonds, and an upward-sloping average yield curve. The model implies a joint process for interest rates and consumption. Controlling for contemporaneous consumption growth, long lags of consumption predict the interest rate. Thus the success of the model is based on a more realistic process for consumption and the interest rate, rather than on additional degrees of freedom in the utility function.

Bossaerts, Plott, and Zame develop structural tests of asset pricing theory for data from experimental financial markets. The tests differ from those used with field data: they verify the consistency between prices and allocations, rather than testing only whether prices satisfy equilibrium restrictions. The authors use two large-scale financial market experiments to address an apparent price-allocation paradox: asset prices can satisfy theoretical (CAPM) restrictions even when allocations deviate substantially from the theory's predictions. Indeed, they find that allocations in such periods seem no closer to CAPM predictions than in periods when prices grossly violate CAPM restrictions. Their structural approach resolves the paradox in both sets of experiments: when end-of-period prices are such that the market portfolio is close to mean-variance efficient (the CAPM pricing prediction), their tests are consistent with the observed allocations and their theory explains why; when end-of-period prices leave the market portfolio far from mean-variance efficient, their tests reject.