I’m preparing for my first international trip for global health work, but the net is paying attention to a paper that I think I’ll like, and I want to mention it briefly before I fly.
Computational Complexity and Information Asymmetry in Financial Products is 27 pages of serious TCS, but it is so obviously applicable that people outside of our particular ivory tower, and even outside of academia entirely, are blogging and twittering about it, and even reading it!
Freedom to Tinker has a nice summary of this paper, if you want to know what it’s about in a hurry.
Mike Trick makes the salient observation that NP-hard doesn’t mean computers can’t do it. But the paper’s hardness assumption is not about worst-case complexity; it is, as it should be, an assumption about the average-case complexity of a particular optimization problem over a particular distribution of instances.
As it turns out, this is an average-case combinatorial optimization problem that I know and love: the densest subgraph problem. My plan is to restate the problem here and share some Python code for generating instances of it. Then you, I, and everyone else can have a handy instance to try optimizing. I think this problem is pretty hard on average, but there is a much better chance of making progress on an algorithm for it than of cracking the P versus NP nut. Continue reading
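To make the average-case setup concrete, here is a minimal sketch of the kind of instance generator I have in mind: a random bipartite graph of "assets" and "products" with a planted dense subgraph hidden inside it. The parameters (N, M, D, n, m) and the planting rule below are my own illustrative choices, not the exact parameterization from the paper.

```python
import random

def planted_instance(N=1000, M=100, D=10, n=30, m=15, seed=0):
    """Random bipartite graph with a planted dense subgraph.

    N "assets" on one side, M "products" on the other; each product
    is a set of D assets.  The first m products (before shuffling)
    draw half of their assets from a hidden set of n planted assets,
    so (planted assets) x (planted products) forms an unusually
    dense bipartite subgraph.  The solver's job is to find it.

    Returns (products, planted_assets, planted_product_positions).
    """
    rng = random.Random(seed)
    planted = rng.sample(range(N), n)
    raw = []
    for j in range(M):
        if j < m:
            # planted product: at least D//2 of its edges land in
            # the hidden asset set
            picks = set(rng.sample(planted, D // 2))
            while len(picks) < D:
                picks.add(rng.randrange(N))
        else:
            # honest product: D distinct assets chosen uniformly
            picks = set(rng.sample(range(N), D))
        raw.append(sorted(picks))
    # shuffle so the planted products aren't sitting at the front
    order = list(range(M))
    rng.shuffle(order)
    products = [raw[i] for i in order]
    planted_positions = sorted(order.index(j) for j in range(m))
    return products, sorted(planted), planted_positions
```

With defaults, each planted product puts at least 5 of its 10 edges into the 30 hidden assets, while an honest product expects only about 10 × 30/1000 = 0.3 such edges, so the planted block is dramatically denser than background; the algorithmic question is whether you can find it without being told where it is.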
I wrote a few months ago about how research in differential privacy seems very applicable to global public health. There is a new report from the Institute of Medicine, Beyond the HIPAA Privacy Rule, which calls for a new approach to protecting privacy in health research. The Lancet also has an editorial about the report, which is what made me think I should pass this reference on to theoretical privacy researchers. The Lancet’s summary: Continue reading
A multitude of events in the last week or so have made me want to blog about (and learn more about) the cryptographic theory of privacy. Journalist James Bamford’s new book about the NSA came out, the third in his trilogy. Bamford described his findings on Democracy Now last Tuesday, including how government contractors were hired to eavesdrop on US soldiers in Iraq:
Not only were they eavesdropping on a lot of these conversations, some of which were very intimate, but they would have sort of locker room chats about what they were hearing, and they would post—or they would notify their co-workers that you should listen to this, what they call “cut,” their conversations. You should listen to this conversation or that conversation. They’d laugh about it.
Also last week (or maybe two weeks ago), the National Academies Press published a new report called Protecting Individual Privacy in the Struggle Against Terrorists. The report’s primary recommendation, that “Programs Should be Evaluated for Effectiveness, Privacy”, is not too revolutionary, but the report contains some interesting summaries of technology and public opinion.
And kicking off this season of privacy discussion, there were demonstrations across the EU on Oct 11, a worldwide protest against surveillance entitled Freedom not fear.
Or, almost kicking it off… just a few weeks before this tsunami of privacy news, Adam Smith posted an interesting-sounding paper to the arXiv, Efficient, Differentially Private Point Estimators. This sort of cryptographic approach to privacy is where I’m going with this post. But first let me mention why I’m going there.