Morten Jerven

What did we learn from measuring the costs of monitoring the SDGs?

In response to my costing estimate of the MDGs, the UN Sustainable Development Solutions Network took the initiative to host a group of ‘experts’ to come up with a collective estimate. The group first met in October in Paris and New York, and the final report was published on April 17. The objectives of the process and the report varied. Some had their eyes firmly on the man: providing a ‘useful number’ for the Financing for Development Conference in Addis, while others watched the ball: trying to get as objective a cost estimate as possible. What did we learn from the process and the report?

1. The numbers are soft. Very soft.

Ultimately the total number in any such exercise will depend on the multipliers. A survey has a ballpark per-household cost. You can use the low end or the high end. Then multiply that by the number of households. Multiply that by the number of countries (193? 139? 77?), and then multiply that by the frequency (monthly? quarterly? annual? every ten years?). Depending on your requirements you can generate almost any total number. My advice is to read the report with that in mind. Disregard the headlines, and look at the detail.
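The sensitivity to these multipliers can be sketched with a few lines of arithmetic. All of the figures below (per-household costs, sample sizes, frequencies) are hypothetical assumptions chosen for illustration, not numbers from the report:

```python
# Illustrative sketch of how total survey-cost estimates swing with the
# multipliers chosen. Every input figure here is a hypothetical assumption.

def total_cost(cost_per_household, households_per_survey,
               n_countries, surveys_per_year, years):
    """Total cost = per-household cost x sample size x countries x frequency x horizon."""
    return (cost_per_household * households_per_survey
            * n_countries * surveys_per_year * years)

# Low-end scenario: $20/household, 5,000 households, 77 countries, one survey every 5 years
low = total_cost(20, 5_000, 77, 1 / 5, 15)

# High-end scenario: $150/household, 20,000 households, 193 countries, annual surveys
high = total_cost(150, 20_000, 193, 1, 15)

print(f"low:  ${low:,.0f}")    # ~$23 million over 15 years
print(f"high: ${high:,.0f}")   # ~$8.7 billion over 15 years
```

The same exercise, with inputs that are each individually defensible, spans two and a half orders of magnitude — which is the point being made above.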

2. Annual disaggregated statistics on all indicators are not feasible.

Ultimately we dropped calculating costs for all countries (only 77), and perhaps most notably, we dropped the ambition of having annual disaggregated survey data. The sample size requirements of disaggregating data by region, gender, age, and other categories are daunting. And remember this: annual survey data is not feasible with current survey instruments. For poverty data, a survey takes 2–3 years from start to finish – without completely re-envisioning the current informational infrastructure, such a survey burden is too heavy to carry for a statistical office that has other tasks than reporting on the SDGs alone.
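Why disaggregation is daunting can be shown with a toy calculation. The per-stratum sample of 400 households and the category counts are hypothetical assumptions; the point is only that each cross-cutting category multiplies the required sample:

```python
# Illustrative: the required sample grows multiplicatively with strata.
# The per-stratum sample (~400 households for a tolerable margin of error)
# and the category counts below are assumptions for illustration.

def required_sample(per_stratum, **strata):
    """Each cross-cutting category multiplies the number of strata to cover."""
    n = per_stratum
    for count in strata.values():
        n *= count
    return n

national = required_sample(400)
disagg = required_sample(400, regions=10, genders=2, age_groups=5)
print(national, disagg)  # 400 vs 40,000 households per survey round
```

A national estimate needs one adequately sized sample; estimates for every region-by-gender-by-age cell need one per cell, so the survey is a hundred times larger before any other "what not" category is added.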

3. Want open data? Start with the costs of data.

As I detail in my paper, it is very hard to get the actual costs of surveys and censuses. Sifting through background documents to find the costs is time consuming, and if you send queries to the organizations that conduct these surveys, you can expect generic responses, or the reply that “we do not share specific cost estimates”. The survey business is a business, and detailed cost information today is a competitive edge in a bidding process tomorrow. Moreover, getting individual country budget information, on either the donor or recipient side, on how much is actually spent on data is difficult.

4. The bottleneck is not funding, but capacity.

If you asked the technical assistance teams at the IMF or the statistical capacity teams at the World Bank how much it would cost to ‘mend the gaps’ in statistical capacity in low-income countries, they would reply that the problem is not how much you can spend, but how much can be absorbed. International organizations and donors can buy a nice dataset, or send an expert well versed in the international standards of accounting, but there needs to be domestic capacity just to handle receiving the funds and the experts, let alone benefiting from them.

5. Looking like a donor versus looking like a state.

The report looks like a donor. It is easier to find the costs of a survey than to find information on what it would entail to improve administrative statistics. Moving forward we need to keep in mind that monitoring and data are not goals in themselves. Donor decisions and reports matter less; what is important is that quality data feeds into decisions at the country level. If our focus is mostly on monitoring global progress, that might actually hamper domestic political accountability.


One thought on “What did we learn from measuring the costs of monitoring the SDGs?”

  1. A few points of clarification, with regards to Morten Jerven’s blog of 21st April on measuring the costs of monitoring the SDGs:

    – The group was convened by SDSN as a result of multiple numbers flying around relating to the cost of statistical instruments and statistical improvements, and our incentive was the July 2015 Financing for Development Conference in Addis Ababa.
    – The work needed to happen anyway, but this provided a unique political opportunity.
    – SDSN is the Sustainable Development Solutions Network.
    – Yes the numbers are relatively ‘soft’, but we stipulate that explicitly from the outset. They are based on informed assumptions from some of the world’s leading data experts – from WHO, UNICEF, PARIS21, UNIDO, ODW, CIESIN and of course, yourself. They are based on improvements to past practice and realistic ambition. They are not arbitrary, as your point implies.
    – We did not drop the ambition of calculating costs for all countries because we couldn’t be bothered or couldn’t do it. The project team deliberately decided to focus on 77 countries because that was the best proxy we had for identifying those countries most in need. We assume (again explicitly stated) that other countries have the resources available to finance required improvements.
    – Annual disaggregated statistics are possible for some dimensions. You speak only to survey data, whereas many environmental statistics can be collated annually and disaggregated at geographical scales.
    – In the final version of the report, investments in systems and administrative data are a central, recurring theme. Furthermore, we explicitly note our eagerness to do more on administrative data collection, but alas we could not get the figures.

    *For more information or if you have any queries on the report please feel free to contact me at [email protected] *
