Uganda’s cabinet has just approved a new monitoring and evaluation policy, which will be officially endorsed and disseminated next month. The policy comes as a positive signal after several donors suspended aid to the Government, and it provides a solid foundation for strengthening the country’s commitment to evidence-informed policymaking.
While Uganda reports significant gains in human development indicators since 2000 and has been praised for turning the tide in the fight against HIV, major constraints remain in terms of economic development, corruption and weak accountability. To address these constraints, the Government needs to be able to assess how its services are being delivered, and to receive feedback on where changes and improvements are needed and where resources should be allocated.
The Office of the Prime Minister recently hosted its first evaluation week under the theme “Enhancing evidence-based evaluations for policymaking: the need and role of effective evaluation systems for service delivery”. Much of the discussion centred on the need to recognise that evaluations are useful and worth the cost if they help improve policies. During the event, the Minister of General Duties, Professor Tasis, said that “evaluation is an investment for transformation, especially if you are engaging with people”. He also acknowledged that “there are tensions in being transparent and creating demand from citizenry when there is capacity to respond”.
The Minister also launched a new evaluation database. This is an important step in increasing transparency and making evidence available for use – for example, country-specific systematic reviews of programmes in particular sectors.
As part of this four-day event, which was inaugurated by the Prime Minister, 3ie supported a peer learning event where international delegates from Benin, Colombia, Fiji, Pakistan and South Africa, as well as organisations including DFID, GIZ, the World Bank’s CLEAR initiative, PSI, HED and others, were invited to learn from the Uganda experience and share good practices in monitoring and evaluation. The question guiding the discussion was how to stoke the demand for evaluation and ensure that the supply is relevant and of high quality.
The main lessons have been captured in a report by the peer learning team. Some highlights follow:
• A strong monitoring system but a weak evaluation practice: The Government has introduced a new “traffic light system” to identify the best- and worst-performing ministries, based on indicators from the budget framework. The data are published twice a year in Government Performance reports, which are discussed at Cabinet retreats. The challenge now is to rebalance the E in M&E, and to unpack the new M&E policy and sensitise all stakeholders and development partners to it.
• To stoke the demand for evaluation, the team identified the need for: a mechanism to engage stakeholders throughout the evaluation cycle to ensure feasibility and policy relevance; a comprehensive dissemination strategy, including tracking of implementation milestones and follow-up actions on recommendations; and integration of evaluation with budgeting and planning.
• On the supply side, the Government needs to: demonstrate its commitment to financing its M&E agenda; improve coordination between OPM’s M&E functions and those of other departments; focus more on a broad-based capacity development agenda; and establish national standards and guidelines.
This was the first peer learning pilot among 3ie members, part of our Commitment to Evaluation project. The next pilot will be hosted by the South African Presidency. Read more about the project at: www.3ieimpact.org/en/evaluation/c2e/
Tags: evidence-based policy, monitoring, policymakers, policymaking, Uganda