Useful material on how to deal with slow tests in py.test, a bit buried in the docs:
From http://doc.pytest.org/en/latest/usage.html, to get a list of the slowest 10 test durations:
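The flag in question is `--durations` (this invocation follows the docs page linked above):

```shell
pytest --durations=10   # report the 10 slowest test durations at the end of the run
```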
From http://doc.pytest.org/en/latest/example/simple.html, to skip slow tests unless they are requested:
```python
# content of conftest.py
def pytest_addoption(parser):
    parser.addoption("--runslow", action="store_true",
                     default=False, help="run slow tests")

# content of test_module.py
import pytest

slow = pytest.mark.skipif(
    not pytest.config.getoption("--runslow"),
    reason="need --runslow option to run")
```
Very convenient to know.
A short paper from my work on predicting who will get elective surgery for diverticulitis is on arXiv: https://arxiv.org/abs/1612.00516 [tag: my-research]
A children’s book on women in STEM led me to this gem:
[get nytimes screenshot, possibly tell story about using privilege, and the sooner the better]
Preparing for the release of our paper on leading causes of death in US counties led me to review the origins of a quote often attributed to Stalin: http://quoteinvestigator.com/2010/05/21/death-statistic/
Lots to unpack there. The important thing for me is to remember that there are individual human tragedies behind the 80 million data points in our study. Respect.
Abett is my former colleague Mike Hanlon’s new startup, and its advice app is now publicly available.
I remember fondly the days I spent working with Mike, who has a wealth of stories from his time as employee #7 at Amazon. He was in the crucible as AMZN formed, stormed, and normed, and I got a lot of insight into what IHME’s fast growth could look like from his battle tales.
He has lofty goals for Abett as well, to use “big data”, in this case meaning user trace data, to help _people_ instead of advertisers. I’m 100% for that.
> There is no advertising on our service, and never will be. We don’t care which products you buy, or if you buy any at all. We want you to make the right decision for you. Advertising would bias that objectivity, and thus we don’t accept it.
For more: http://www.abett.com/blog/2016/10/31/hello-world/
I’m lecturing Python II for a Software Carpentry Bootcamp in Jan, and I thought I’d find a little example of a funny fact I’ve heard: IEEE floating point addition does not obey the associative law.
I must be spending too much time with doctors, because I didn’t try to make an example myself; I just looked it up on Google. The first example I found put it cleanly: http://www.walkingrandomly.com/?p=5380
```python
>>> x = (0.1 + 0.2) + 0.3
>>> y = 0.1 + (0.2 + 0.3)
>>> print('%.17f' % x)
0.60000000000000009
>>> print('%.17f' % y)
0.59999999999999998
```
This shows the rumor is true: addition is not associative. It does not seem like a big deal, though, since I usually round my numbers to one or two significant digits, and I know how to test with `np.allclose`.
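A minimal sketch of that testing habit: with rounding noise at this scale, `np.allclose` (default `rtol=1e-05`, `atol=1e-08`) agrees the two groupings match even though exact comparison fails.

```python
import numpy as np

x = (0.1 + 0.2) + 0.3   # 0.6000000000000001
y = 0.1 + (0.2 + 0.3)   # 0.6
print(x == y)             # False: exact comparison is broken by the non-associativity
print(np.allclose(x, y))  # True: the difference (~1e-16) is within default tolerances
```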
The second example I found makes the problem clearer, though: http://cass-mt.pnnl.gov/docs/pubs/pnnleffects_of_floating-pointpaper.pdf
```python
x = (17 + 1e32) - 1e32   # the 17 is absorbed by 1e32, so x == 0.0
y = 17 + (1e32 - 1e32)   # y == 17.0
```
Can’t `np.allclose` that, unless you know what “close” means…
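A sketch of why: the two groupings differ by 17, which is enormous next to the default `atol=1e-08` but negligible next to the 1e32 operands, so whether they count as “close” depends entirely on what scale you compare against (the `atol` choice below is purely illustrative).

```python
import numpy as np

x = (17 + 1e32) - 1e32   # 17 is absorbed by 1e32: x == 0.0
y = 17 + (1e32 - 1e32)   # y == 17.0
print(np.allclose(x, y))                      # False under default tolerances
print(np.allclose(x, y, atol=1e-10 * 1e32))   # True, if "close" is judged relative to the operands' scale
```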
Additional reading: http://www.macaulay.ac.uk/fearlus/floating-point/