Data Expedition at Open Development Camp

November 8, 2013 in Data Expeditions
Photo credit: Surian Soosay (“Somalia Famine Food Aid Stolen / Human Black Hole”)

Our challenge: give Open Development a reality check at Open for Change’s Open Development Camp in Amsterdam.

This challenge was well and truly accepted by participants of the Open Development Camp during this afternoon’s Data Expedition, led by my colleague Katelyn Rogers and me. We started from the very broad theme of bilateral aid flows, prompted by recent articles on how OECD countries are considering redefining the rules of what counts as ‘aid’, and by a report from Development Initiatives revealing that a fifth of OECD aid never leaves the donor country.

The roughly 20 participants split into four groups.

The first group looked into remittances flowing into Somalia, starting from World Bank data. The Guardian turned out to have the best dataset on the topic, but its column for Somalia (along with a couple of other countries) was entirely empty. The group eventually tracked the figures down, buried deep in a PDF, and used everyone’s favourite PDF extraction tool, Tabula, to pull them out.
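The group ran the extraction through Tabula’s point-and-click interface, but the same step can be scripted. A minimal sketch using the tabula-py wrapper might look like the following; the filename and page range are placeholders, not the actual World Bank document.

```python
# Sketch only: scripting the Tabula step with the tabula-py wrapper.
# "remittances.pdf" and the page selection are placeholders, not the real document.
import tabula

# Ask Tabula to detect every table on the given pages.
tables = tabula.read_pdf("remittances.pdf", pages="all", multiple_tables=True)

# Write each detected table out as CSV for cleaning in a spreadsheet.
for i, table in enumerate(tables):
    table.to_csv(f"remittances_table_{i}.csv", index=False)
```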

The second group chose a trickier topic: a dive into project failures. Is it even possible to ‘learn from your mistakes’ in the development world? Do we know which projects stopped just after the pilot, whether projects benefited from planning research, or could it be that every single project is a success?

While I don’t think any of us believed that last suggestion, it soon became clear that project failures simply aren’t documented. One participant mentioned a past initiative from the Canadian International Development Agency that invited people to record their project ‘challenges’, but with only four challenges ever recorded, it wasn’t much help. It did, however, lead to many interesting discussions about what success actually means for a development project (who decides, donor or recipient?) and how those criteria are set.

The third group looked at the Dutch Foreign Ministry’s open data site, OpenAid.nl, to see where money from the Netherlands was going. It turned out that Afghanistan is the biggest recipient of Dutch aid money, but the Ministry’s own budget data proved difficult to find (though we were later informed that it is in fact on the site somewhere).

The fourth group took a much more specific route into international aid flows, focusing on tuberculosis. The challenge: does expenditure on tuberculosis prevention correlate at all with the prevalence of the disease?

The first step proved fairly easy: the World Health Organisation provides detailed data on the prevalence of tuberculosis per 100,000 people going back some 20 years. Great! But what about the financing? Unfortunately, the World Health Organisation only publishes the money spent per country in PDFs; worse, the figures are already rendered as bar charts, so we couldn’t even scrape the PDFs for the underlying numbers.

We didn’t let that stop us, though. We focused on the country PDF profile for Bangladesh, as we wanted a country that hadn’t experienced serious conflict in the last 10 years, to limit external factors. Using the Chrome extension MeasureIt to make a crude estimate of the height of each bar in the chart on the country profile, we recorded our estimates in a spreadsheet and plotted spending on tuberculosis against the prevalence of tuberculosis in the country.
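If you want to reproduce that trick without a spreadsheet, the conversion from pixels to values and the plot itself take only a few lines of Python. The sketch below assumes you can measure one gridline of known value to set the scale; every number in it is a placeholder rather than one of our actual measurements.

```python
# Sketch: turning crude pixel measurements of a bar chart into values
# and plotting them against prevalence. All numbers are placeholders.
import matplotlib.pyplot as plt

years = [2007, 2008, 2009, 2010, 2011]

# Height of each funding bar in pixels, as read off with MeasureIt (placeholders).
bar_heights_px = [40, 42, 45, 130, 135]

# A gridline of known value gives us the pixel-to-dollar scale (placeholder).
reference_px, reference_value = 50, 10e6
funding_usd = [h / reference_px * reference_value for h in bar_heights_px]

# Prevalence per 100,000 people from the WHO dataset (placeholders).
prevalence = [500, 480, 460, 440, 420]

fig, ax1 = plt.subplots()
ax1.plot(years, funding_usd, marker="o", label="TB funding (USD)")
ax1.set_ylabel("Funding (USD)")
ax1.set_xlabel("Year")

# Second axis so the two very different scales can share one chart.
ax2 = ax1.twinx()
ax2.plot(years, prevalence, marker="s", color="grey", label="Prevalence per 100,000")
ax2.set_ylabel("Prevalence per 100,000")

plt.title("TB funding vs. prevalence (placeholder values)")
plt.savefig("tb_funding_vs_prevalence.png")
```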

We discovered that, for some reason, the funds available for treating tuberculosis in Bangladesh tripled between 2009 and 2010. Aside from making planning incredibly difficult, the jump appears to have had no effect whatsoever on the prevalence of tuberculosis, which has been declining fairly steadily in the country for the past 20 years.
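For anyone who wants to go beyond eyeballing the two lines, a quick correlation check is only a few lines more. Again, the series below are placeholders standing in for the measured funding and the WHO prevalence figures.

```python
# Sketch: checking the funding/prevalence relationship numerically.
# Both series are placeholders, not the values we actually recorded.
import numpy as np

funding_usd = np.array([4.0e6, 4.2e6, 4.5e6, 13.0e6, 13.5e6])  # placeholder
prevalence = np.array([500, 480, 460, 440, 420])                # placeholder

# Pearson correlation coefficient between annual funding and prevalence.
r = np.corrcoef(funding_usd, prevalence)[0, 1]
print(f"Correlation between funding and prevalence: {r:.2f}")
```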

So: expedition success! We learned about Tabula, how to find your way around the IATI data store, how to get data even when someone out there really doesn’t want you to have it (i.e. by measuring the pixels of a bar chart!), and that there is a real gap in how the success of development projects is measured, to name just a few findings.

Thanks, everyone; we hope you enjoyed it as much as we did!
