Monthly Archives: June 2017
New Publication: Implementing the PHMRC shortened questionnaire: Survey duration of open and closed questions in three sites
More countries are using verbal autopsy as a part of routine mortality surveillance. The length of time required to complete a verbal autopsy interview is a key logistical consideration for planning large-scale surveillance.
We used the PHMRC shortened questionnaire to conduct verbal autopsy interviews at three sites and collected data on the length of time required to complete each interview. This instrument uses a novel checklist of keywords to capture relevant information from the open response. The open-response section was timed separately from the section consisting of closed questions.
We found that the median time to complete the entire interview was approximately 25 minutes and did not vary substantially by age-specific module. The median time for the open-response section was approximately 4 minutes, and 60% of interviewees mentioned at least one keyword within that section.
The length of time required to complete the interview was short enough for large-scale routine use. The open-response section did not add a substantial amount of time, and it provided useful information that can improve the accuracy of cause-of-death predictions. The novel checklist approach further reduces the burden of transcribing and translating large amounts of free text. This makes the PHMRC instrument ideal for national mortality surveillance.
A replication archive is also available on the Global Health Data Exchange (GHDx): http://ghdx.healthdata.org/node/263527
I started reading an “economics of diversity” book recently, and stumbled across a great example of the power of visual analytics (included early in the book to demonstrate the value of diverse representations):
This game is hard, right? I mean, I have to think about it to figure out a good move. But if you think of it visually, the right way, it is not hard. I'll leave it as a mystery for now, and just say that I can imagine building a classroom exercise around it when I next teach interactive data visualization.
Did I mention that I attended a Software Carpentry (SWC) train-the-trainers event recently (editor's note: not so recently anymore…)? And did I mention that they got me to read a fun book called _Teaching what you don't know_? It had a number of fun-sounding ideas for encouraging students to actively engage with material, in a chapter titled "Thinking in Class", and I tried one out in a guest lecture.
The super-simple idea is this: at some point when students were spacing out from hearing too much talking from me, I paused for questions. When there were none, I said, "Now I want you to turn to the person next to you, spend just two minutes, and see where your notes differ from theirs. And figure out what makes sense now but might not when you look back at your notes."
Then I had a little break for two minutes, and people talked to each other. When I brought them back to me, there were questions and there was renewed attention.
I tried it again about 20 minutes later, and it didn't have the same magic. Maybe it is a once-a-class thing.
I read a book called _Teaching what you don't know_ in preparation for my SWC training, and it had a great chapter on "Teaching who you don't know" that I've been thinking about. It turns out that a lot of students are not like a lot of professors. Basing our understanding of human psychology on the responses of Psych 101 students is risky, and assuming that our students want to learn the way we want to learn is risky, too: https://healthyalgorithms.com/2015/06/09/weird-view-of-human-nature/