Is impact evaluation still on the rise?

3ie’s fully updated impact evaluation repository helps track key trends

Since 2014, 3ie’s impact evaluation repository (IER) has been a comprehensive database of published impact evaluations of development programmes in low- and middle-income countries. We call the database comprehensive because we build it through a systematic search and screening process that covers more than 35 indexes and websites and screens for all development programme evaluations or experiments that use a counterfactual method to estimate net impact. The version of the IER available since 2014 was comprehensive through the end of 2012 (referred to here as IER 2012).

Today we announce the full update of the IER, which is now comprehensive through the end of September 2015 and holds 4,260 records of published development impact evaluations.

Regular users of the IER know that we have been adding studies on an ad hoc basis since the release of the first comprehensive repository in 2014. In October 2015, we started the second systematic search and screening process, using a protocol revised in light of lessons learned from the first exercise. In making these revisions, we were greatly aided by those who responded to our challenge and found many articles we had missed. As of today, the repository is comprehensive through September 2015.

We also added screeners who speak several languages to the team, so the repository now includes studies written in Spanish, French, and Portuguese. Coverage of these languages is not comprehensive, but some studies in each are included.

Estimating the rise in impact evaluations

The number of studies in the updated comprehensive database (IER 2015) is 4,124, an increase of 1,889 over IER 2012. Not all of that increase comes from studies published since the beginning of 2013, however, as figure 1 shows. Our new search uncovered many earlier studies that we did not find in the original search. We also cleaned out some records incorrectly included in IER 2012, so the number of newly included studies is in fact slightly more than 1,889.

[Figure 1: Published impact evaluations, IER 2012 and IER 2015]

It is clear that the total number of impact evaluations continues to rise. The real question is whether the number published each year continues to increase, as Cameron, Mishra, and Brown (hereafter CMB) found in IER 2012. Figure 2 presents the numbers by year for IER 2012 and IER 2015.

[Figure 2: Impact evaluations published per year, IER 2012 vs IER 2015]

Figure 2 suggests that annual publication of impact evaluations peaked in 2012 and has plateaued since. Note, however, that there is a big gap between the number of impact evaluations published in 2012 that we were able to find in 2013 and the number for 2012 that we found in 2015. This gap is consistent with the known lag between when articles are published and when they appear in all the relevant indexes. We expect the next update to uncover more studies for 2013 and 2014 than we found this time around. Even so, the number we found for 2015 is greater than for 2013 or 2014, even though the 2015 figure covers only the first three quarters of the year. The answer to whether impact evaluation is still on the rise thus remains a bit uncertain, but the current numbers suggest the trend is flattening.

Examining the rise in randomised controlled trials

We can also check whether the trend for randomised controlled trials (RCTs) alone, reported by Dean Karlan in his testimony to the U.S. Congress, persists. Figure 3 presents the number published per year by method and shows that the trend for RCTs is similar to the trend for all impact evaluations and thus also appears to be flattening.

[Figure 3: Impact evaluations published per year, by method (RCT vs quasi-experimental)]

When we revised the original protocol, we expanded the search terms to better capture quasi-experimental (QE) studies. Even so, as figure 3 shows, more RCT-based evaluations are published each year than QE evaluations, and over the last five years that gap has widened. The repository currently contains 2,645 RCTs and 1,615 QE studies.

Notable trends in development impact evaluations across countries

The prevalence of impact evaluations by country looks much the same as it did in IER 2012. Figure 4 shows the global heat map of development impact evaluations for IER 2015. (See figure 5 in CMB for the map for IER 2012.) The top 10 countries are India (390), China (281), Mexico (247), Kenya (233), Bangladesh (197), South Africa (194), Brazil (193), Uganda (173), Pakistan (105), and Peru (105). For some countries, the number of impact evaluations has more than doubled from what we found in IER 2012. These include Brazil, Cambodia, Cameroon, Costa Rica, Malawi and Mozambique. We also found impact evaluations for a few countries that did not have any in IER 2012: Antigua and Barbuda, Dominica, Saint Lucia, Somalia and Yemen. Meanwhile, we continue to notice a lack of impact evaluations in regions such as Central Asia and Central Africa.

[Figure 4: Global heat map of development impact evaluations, IER 2015]

Keeping the momentum going

There is much more we can learn from analysing the data in IER 2015. We are working on a full-length article now, which we will present at the What Works Global Summit in September 2016. Also, this fall we will revise the search and screening protocol again based on lessons learned this time around. The protocol for IER 2015 will be available on the 3ie website in the next couple of months. We will start the search and screening for the next comprehensive update in January 2017. For the next update, we also plan to more systematically search for studies published in other languages.

Contest Alert: Have you looked for your study and not found it? Fret not: we are announcing the IER ‘What did we miss?’ contest today, so you can submit your entries for a gift card. Anyone can also submit a study for consideration in the IER at any time to database@ieimpact.org.



Comment on “Is impact evaluation still on the rise?”

  1. Nick York

    Very interesting blog. I wonder if our expectations are realistic here? To state the obvious, it would be surprising, and perhaps quite undesirable (for quality and for the ability of studies to be well used), if the annual numbers published continued to rise as sharply as in the first ten years of this century, when they seem to have increased tenfold. After the exponential growth of the first few years, a stabilisation of the trend is only to be expected. If 400–500 studies are published each year, that is roughly a 10% growth rate in the stock of impact evaluations, which, if sustained, would mean the evidence base at least doubles again in the next decade.
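    The back-of-the-envelope arithmetic in this comment is easy to verify. A minimal Python sketch, assuming 450 studies per year (the midpoint of the 400–500 range mentioned above) against the repository's 4,260 records:

    ```python
    import math

    # Figures from the post and comment (the 450 is an assumed midpoint of 400-500)
    stock = 4260          # records in the repository as of the 2015 update
    annual = 450          # studies published per year

    growth_rate = annual / stock                          # roughly 10% per year
    doubling_years = math.log(2) / math.log(1 + growth_rate)

    print(f"Implied growth rate: {growth_rate:.1%}")
    print(f"Doubling time at that compound rate: {doubling_years:.1f} years")
    ```

    Even without compounding, adding a flat 450 studies per year to a stock of 4,260 doubles it in about nine and a half years, so the "at least doubles in the next decade" claim holds either way.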
