Showing posts with label Thesis topics. Show all posts

Tuesday, June 18, 2013

Two seconds

The weekend Wall Street Journal had an interesting article about high-speed trading, Traders Pay for an Early Peek at Key Data. Through Thomson Reuters, traders can get the University of Michigan consumer confidence survey results two seconds ahead of everyone else. They then trade S&P 500 ETFs on the information.

Source: Wall Street Journal

Naturally, the article was about whether this is fair and ethical, with a pretty strong sense of no (and surely pressure on the University of Michigan not to offer the service.)
It didn't ask the obvious question: Traders need willing counterparties. Knowing that this is going on, who in their right mind is leaving limit orders on the books in the two seconds before the confidence surveys come out?

OK, you say, mom and pop are too unsophisticated to know what's going on. But even mom and pop place their orders through institutions which use trading algorithms to minimize price impact. It takes one line of code to add "do not leave limit orders in place during the two seconds before the consumer confidence surveys come out."
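To make that point concrete, here is a minimal sketch of such a guard in Python. The release time, window length, and function name are all illustrative, not any real trading system's API:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical scheduled release time for the confidence number
# (illustrative only; real algorithms would load a release calendar).
RELEASE_TIME = datetime(2013, 6, 14, 9, 55, tzinfo=timezone.utc)
BLACKOUT = timedelta(seconds=2)

def should_rest_limit_order(now):
    """Return False during the two-second blackout window before the release."""
    return not (RELEASE_TIME - BLACKOUT <= now < RELEASE_TIME)
```

The execution algorithm would simply cancel or withhold resting orders whenever this returns False; that is the "one line of code" in spirit.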

In short, the article leaves the impression that investors are getting taken. But it's so easy to avoid being taken that it seems a bit of a puzzle that anyone can make money at this game.

I hope readers with more market experience than I can answer the puzzle: Who is it out there that is dumb enough to leave limit orders for S&P500 ETFs outstanding in the 2 seconds before the consumer confidence surveys come out?

Friday, March 8, 2013

Crunch time

David Greenlaw, Jim Hamilton, Peter Hooper and Rick Mishkin have a nice op-ed in the Wall Street Journal summarizing their recent paper, Crunch Time: Fiscal Crises and the Role of Monetary Policy. (The link goes to Jim's website; there is also an executive summary.)

David, Jim, Peter and Rick are after the same question as my last WSJ op-ed and blog post: Suppose the Fed wants to raise interest rates with a huge debt outstanding. With, say, $18 trillion outstanding, raising interest rates to 5% means raising the deficit by $900 billion a year. That's real fiscal resources. In a present-value sense, monetary tightening costs someone $900 billion a year of taxes. There is no chance that current tax revenues can go up that much, or that current spending can go down that much. So raising interest rates to 5% with a lot of debt outstanding means we will borrow it, the debt will grow $900 billion a year faster, and the higher taxes / lower spending will come someday in the far-off future.
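The back-of-the-envelope arithmetic behind that $900 billion figure is just debt times rate:

```python
debt = 18e12   # $18 trillion of debt outstanding (figure from the text)
rate = 0.05    # a 5% interest rate

annual_interest = debt * rate
print(f"${annual_interest / 1e9:.0f} billion per year")
```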

Or maybe not. David, Jim, Peter and Rick delve into the "tipping point" I alluded to.
Countries with high debt loads are vulnerable to an adverse feedback loop in which doubts by lenders about fiscal sustainability lead to higher government bond rates, which in turn make debt problems more severe.

Tuesday, February 28, 2012

Weird stuff in high frequency markets

On the left is a graph from a really neat paper, "Low-Latency Trading" by Joel Hasbrouck and Gideon Saar (2011). You're looking at the flow of "messages"--limit orders placed or canceled--on the NASDAQ.  The x axis is time, modulo 10 seconds. So, you're looking at the typical flow of messages over any 10 second time interval.

As you can see, there is a big crush of messages on the top of the second, which rapidly tails off in the milliseconds following the even second. There is a second surge between 500 and 600 milliseconds.

Evidently, lots of computer programs reach out and look at the markets once per second, or once per half second. The programs' clocks are tightly synchronized to the exchange's clock, so if you program a computer to "go look once per second," it's likely to go look exactly on the second (or half second). The result is a flurry of activity on the even second.
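A toy simulation (my own illustration, with made-up numbers, not Hasbrouck and Saar's data) reproduces the clustering: if most programs poll on a synchronized once-per-second schedule and only a few poll at random times, messages pile up in the first few milliseconds of every second.

```python
import random

def message_times_ms(n_synced=900, n_random=100, seconds=10, jitter_ms=5):
    """Simulated message timestamps in milliseconds over a ten-second window."""
    times = []
    for s in range(seconds):
        # Synchronized pollers fire at the top of each second, plus tiny jitter.
        for _ in range(n_synced):
            times.append(s * 1000 + random.randint(0, jitter_ms))
        # Unsynchronized pollers fire at a uniformly random time in the second.
        for _ in range(n_random):
            times.append(s * 1000 + random.randint(0, 999))
    return times

random.seed(0)
times = message_times_ms()
# Fraction of all messages landing in the first 10 ms of any second.
frac_top = sum(1 for t in times if t % 1000 < 10) / len(times)
```

With these made-up parameters roughly 90% of messages land in the first 1% of each second, which is the spike at the top of the second in the figure.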

Tuesday, January 31, 2012

Consumer financial protection, 1984

The Financial Times reports an amazing interview with Martin Wheatley, the "head of the UK's new consumer protection watchdog."
Investors cannot be counted on to make rational choices so regulators need to “step into their footprints” and limit or ban the sale of potentially harmful products,

Thursday, January 26, 2012

A brief parable of over-differencing

The Grumpy Economist has sat through one too many seminars with triple-differenced data, 5 fixed effects, and 30 willy-nilly controls. I wrote up a little note (7 pages, but too long for a blog post), relating the experience (from a Bob Lucas paper) that made me skeptical of highly processed empirical work.

The graph here shows velocity and interest rates.  You can see the nice sensible relationship.

(The graph has an important lesson for policy debates. There is a lot of puzzling why people and companies are sitting on so much cash. Well, at zero interest rates, the opportunity cost of holding cash is zero, so it's a wonder they don't hold more. This measure of velocity is tracking interest rates with exactly the historical pattern.) 

But when you run the regression, the econometrics books tell you to use first differences, and then the whole relationship falls apart. The estimated coefficient falls by a factor of 10, and a scatterplot shows no reliable relationship. See the note for details, but you can see in the second graph how differencing throws out the important variation in the data.
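The mechanism is easy to reproduce in a simulation (my own stylized illustration, not the note's actual data): let a slow-moving "true" series drive y, observe it with i.i.d. measurement error, and compare the levels regression with the first-difference regression. Differencing wipes out the persistent signal while doubling the noise, so the estimated slope collapses.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 2000
b = 1.0  # true slope

# Slow-moving "true" regressor (a random walk), observed with i.i.d. noise.
x_true = np.cumsum(rng.normal(0.0, 0.1, T))
x = x_true + rng.normal(0.0, 0.3, T)        # measurement error in x
y = b * x_true + rng.normal(0.0, 0.3, T)    # y responds to the true series

def ols_slope(y, x):
    """Univariate OLS slope with an intercept."""
    x_c = x - x.mean()
    return (x_c @ (y - y.mean())) / (x_c @ x_c)

slope_levels = ols_slope(y, x)                   # close to b = 1
slope_diff = ols_slope(np.diff(y), np.diff(x))   # badly attenuated
```

In levels, the random walk's variance dwarfs the measurement noise, so attenuation is negligible; in differences, the signal's variance per step is tiny while the noise variance doubles, and the classic errors-in-variables attenuation cuts the slope by an order of magnitude.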

The perils of over-differencing, too many fixed effects, too many controls, and of GLS or maximum likelihood jumping on silly implications of necessarily simplified theories are well known in principle. But a few clear parables might make people more wary in practice. Needed: a similarly clear panel-data example.

Saturday, January 21, 2012

New Keynesian Stimulus

One piece of interesting economics did come up while I was looking through the stimulus blogwars.

Paul Krugman pointed to New Keynesian stimulus models in a recent post, When Some Rigor Helps.
But take an NK [New-Keynesian] model like Mike Woodford’s (pdf) — a model in which everyone maximizes given a budget constraint, in which by construction all the accounting identities are honored, and in which it is assumed that everyone perfectly anticipates future taxes and all that — and you find immediately that a temporary rise in G produces a rise in Y...

So I guess I’d urge all the people now engaging in contorted debates about what S=I does and does not imply to read Mike first, and see whether you have any point left. 
As it happens, I've spent a lot of time reading and teaching New Keynesian models.

Wednesday, January 4, 2012

The VAT, a libertarian dilemma

Dan Mitchell wrote an interesting op-ed in the Wall Street Journal (Cato link for those without WSJ access), highlighting a great libertarian dilemma: is a consumption tax (VAT or similar) a good thing?

Every bit of economic analysis says yes. Economists hate distortions, taxes that lead to bad economic behavior. Our tax system is full of them.  Broaden the base, lower the rate, tax consumption not savings, dramatically simplify the code, and you can get the same revenue with much less economic damage.

A political argument disagrees: an efficient tax code can also raise a lot more revenue. Dan opposes the VAT (and similar consumption taxes) on those grounds. Yes, it looks good to start, but politicians will soon raise the rate to the sky and spend the results. (Becker and Posner have also tackled this one several times.)

It's a striking dilemma: should we keep an atrocious tax system to limit the size of government? Is there no way to get an efficient tax system and a limited government?