
Friday, July 31, 2015

Surveys in Crisis

In "Household Surveys in Crisis," Bruce D. Meyer, Wallace K.C. Mok, and James X. Sullivan describe household surveys as "one of the main innovations in social science research of the last century." Large, nationally representative household surveys are the source of official rates of unemployment, poverty, and health insurance coverage, and are used to allocate government funds. But the quality of survey data is declining on at least three counts.

The first and most commonly studied problem is the rise in unit nonresponse, meaning fewer people are willing to take a survey when asked. Two other growing problems are item nonresponse, when someone agrees to take the survey but refuses to answer particular questions, and inaccurate responses. Of course, the three problems can be related. For example, attempts to reduce unit nonresponse by persuading reluctant households to take a survey could raise item nonresponse and inaccurate responses if these reluctant participants rush through a survey they didn't really want to take in the first place.

Unit nonresponse, item nonresponse, and inaccurate responses would not be too troublesome if they were random enough that survey statistics were unbiased, but that is unlikely to be the case. Nonresponse and misreporting may be systematically correlated with relevant characteristics such as income or receipt of government funds. Meyer, Mok, and Sullivan look at survey data about government transfer programs for which corresponding administrative data is also available, so they can compare survey results to presumably more accurate administrative data. In this case, the survey data understates incomes at the bottom of the distribution, understates the rate of program receipt and the poverty-reducing effects of government programs, and overstates measures of poverty and of inequality. For other surveys that cannot be linked to administrative data, it is difficult to say which direction biases will go.
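To see how misreporting that is correlated with program receipt can overstate poverty, here is a minimal simulation. Everything in it is hypothetical (the income distribution, the transfer program, the 40% underreporting rate are all made up for illustration, not taken from the paper); the point is only the mechanism: if transfer recipients sometimes omit their transfers, the "survey" poverty rate exceeds the "administrative" one.

```python
import random

random.seed(0)

N = 100_000
POVERTY_LINE = 15_000

# Hypothetical population: roughly log-normal earnings, plus a
# transfer program paying $6,000 to anyone earning under $12,000.
population = []
for _ in range(N):
    earnings = random.lognormvariate(9.8, 0.7)  # median around $18k
    transfer = 6_000 if earnings < 12_000 else 0
    population.append((earnings, transfer))

def poverty_rate(incomes):
    return sum(1 for y in incomes if y < POVERTY_LINE) / len(incomes)

# "Administrative" truth: full income observed for everyone.
true_income = [e + t for e, t in population]

# "Survey": recipients omit their transfer 40% of the time, so
# misreporting is correlated with receipt rather than random.
survey_income = [
    e + (t if random.random() > 0.4 else 0)
    for e, t in population
]

print(f"true poverty rate:   {poverty_rate(true_income):.3f}")
print(f"survey poverty rate: {poverty_rate(survey_income):.3f}")
```

The survey-based poverty rate comes out higher than the administrative one, matching the direction of bias the authors document.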

Why has survey quality declined? The authors discuss many of the traditional explanations:
"Among the traditional reasons proposed include increasing urbanization, a decline in public spirit, increasing time pressure, rising crime (this pattern reversed long ago), increasing concerns about privacy and confidentiality, and declining cooperation due to 'over-surveyed' households (Groves and Couper 1998; Presser and McCullogh 2011; Brick and Williams 2013). The continuing increase in survey nonresponse as urbanization has slowed and crime has fallen make these less likely explanations for present trends. Tests of the remaining hypotheses are weak, based largely on national time-series analyses with a handful of observations. Several of the hypotheses require measuring societal conditions that can be difficult to capture: the degree of public spirit, concern about confidentiality, and time pressure...We are unaware of strong evidence to support or refute a steady decline in public spirit or a rise in confidentiality concerns as a cause for declines in survey quality."
They find it most likely that the sharp rise in the number of government surveys administered in the US since 1984 has resulted in declining cooperation by "over-surveyed" households. "We suspect that talking with an interviewer, which once was a rare chance to tell someone about your life, now is crowded out by an annoying press of telemarketers and commercial surveyors."

Personally, I have not received any requests to participate in government surveys and rarely receive commercial survey requests. Is this just because I moved around so much as a student? Am I about to be flooded with requests? I think I would actually find it fun to take some surveys after working with the data so much. Please leave a comment about your experience with taking (or declining to take) surveys.

The authors also note that since there is a trend toward greater leisure time, it is unlikely that increased time pressure is resulting in declining survey quality. However, while people have more leisure time, they may also have more things to do with their leisure time (I'm looking at you, Internet) that they prefer to taking surveys. Intuitively, I would guess that as people have grown more accustomed to doing everything online, they are less comfortable talking to an interviewer in person or on the phone. Since I almost never have occasion to go to the post office, I can imagine forgetting to mail in a paper survey. Switching surveys to an online format could introduce a new set of biases, but may eventually be the way to go.

I would also guess that the Internet has changed people's relationship with information, even information about themselves. When you can look up anything easily, that can change what you decide to remember and what facts you feel comfortable reporting off the top of your head to an interviewer.

Wednesday, July 8, 2015

Trading on Leaked Macroeconomic Data

The official release times of U.S. macroeconomic data are big deals in financial markets. A new paper finds evidence of substantial informed trading before the official release time of certain macroeconomic variables, suggesting that information is often leaked. Alexander Kurov, Alessio Sancetta, Georg H. Strasser, and Marketa Halova Wolfe examine high-frequency stock index and Treasury futures markets data around releases of U.S. macroeconomic announcements:
These announcements are quintessential updates to public information on the economy and fundamental inputs to asset pricing. More than a half of the cumulative annual equity risk premium is earned on announcement days (Savor & Wilson, 2013) and the information is almost instantaneously reflected in prices once released (Hu, Pan, & Wang, 2013). To ensure fairness, no market participant should have access to this information until the official release time. Yet, in this paper we find strong evidence of informed trading before several key macroeconomic news announcements....Prices start to move about 30 minutes before the official release time and the price move during this pre-announcement window accounts on average for about a half of the total price adjustment.
They consider the 30 macroeconomic announcements that other authors have shown tend to move markets, and find evidence of:

  • Significant pre-announcement price drift for: CB consumer confidence index, existing home sales, GDP preliminary, industrial production, ISM manufacturing index, ISM non-manufacturing index, and pending home sales.
  • Some pre-announcement drift for: advance retail sales, consumer price index, GDP advance, housing starts, and initial jobless claims.
  • No pre-announcement drift for: ADP employment, durable goods orders, new home sales, non-farm employment, producer price index, and UM consumer sentiment.
The figure below shows cumulative average returns in the E-mini S&P 500 Futures market from 60 minutes before the release time to 60 minutes after the release time for the series with significant evidence of pre-announcement drift.

Source: Kurov et al. 2015, Figure A1, panel c: cumulative average returns in the E-mini S&P 500 Futures market.
Why do prices start to move before release time? It could be that some traders are superior forecasters, making better use of publicly available information, and waiting until a few minutes before the announcement to make their trades. Alternatively, information might be leaked before the official release. Kurov et al. note that, while the first possibility cannot be ruled out entirely, the leaked-information explanation appears highly likely. The authors conducted a phone and email survey of the organizations responsible for the macroeconomic data in their study to find out about data release procedures:
The release procedures fall into one of three categories. The first category involves posting the announcement on the organization’s website at the official release time, so that all market participants can access the information at the same time. The second category involves pre-releasing the information to selected journalists in “lock-up rooms” adding a risk of leakage if the lock-up is imperfectly guarded. The third category, previously not documented in academic literature, involves an unusual pre-release procedure used in three announcements: Instead of being pre-released in lock-up rooms, these announcements are electronically transmitted to journalists who are asked not to share the information with others. These three announcements are among the seven announcements with strong drift.
I wish I had a better sense of who was obtaining the leaked information and how much they were making from it.