For now this focuses mainly on programme management in humanitarian and development settings. See Development Policies, Humanitarian Action, Stabilisation, Early Recovery, Nexus Programme Management, Monitoring & Evaluation, Knowledge Management, Information Systems, Organisational Effectiveness, Aid Effectiveness.
Concepts: Activity Design (methodologies: LFA, target group analysis, impact analysis, cost-benefit analysis, time planning); Activity Implementation.
See the different approaches as a coherent whole, with each part feeding into the others in a constant feedback loop. Different programmes and different organisations are at different stages in this loop at any given time. Reality is more complex: never try to fit it into this straitjacket. See the concepts as cognitive tools.
Ultimately a recognition that we are dealing with "human activity systems" (http://www.aid-it.com/Portals/0/Documents/Soft%20Systems%20and%20MandE.pdf, http://www.aid-it.com/Resources/tabid/109/Default.aspx), which:
- cannot be easily defined so that all stakeholders agree on the problem to solve;
- require complex judgements about the level of abstraction at which to define the problem;
- have no clear stopping rules;
- have better or worse solutions, not right and wrong ones;
- have no objective measures of success;
- require iteration (every trial counts);
- have no given alternative solutions (these must be discovered);
- often have strong moral, political or professional dimensions.
Rossi et al. (2004) identify five types of activity under program evaluation:
- Needs Assessments
- Program Theory (description of concept and design, incl. logic models, logframes, etc.; note that developing a concept helps build common understanding)
- Process Analysis (whether target populations are being reached, people are receiving the intended services, staff are adequately qualified, etc.)
- Impact Analysis (impact evaluation, causal effect of the programming)
- Cost-Benefit & Cost-Effectiveness analysis (see the sketch after this list)
- Instructional Assessment Resources (University of Texas) identifies three evaluation phases http://www.utexas.edu/academic/diia/assessment/iar/programs/plan/types/ : needs evaluation (needs assessment), process evaluation, and impact evaluation.
- Journal of Development Effectiveness http://www.informaworld.com/smpp/title~db=all~content=t906200215
- Professionalisation: the PMD Pro Level 1 certification (project management in development for professionals) http://www.apmg-exams.com/NGOFSStart.asp is an introduction to project management for those working in development, but will also serve as a refresher for those with experience of working in a project-based environment. If the certificate led to a 1% improvement in efficiency, that would mean savings to international NGOs of $47m (implying a combined project spend of roughly $4.7bn). The PMD Pro qualification was developed by LINGOs (Learning for International NGOs, http://ngolearning.org/, a consortium of 45 global agencies that share learning resources and technology).
- Methodological and operational dimensions. Linkage of national with global frameworks.
- Causation: establishing a cause-and-effect relationship between programme products and outcomes. External factors and other actors are important.
- Problem: outcomes are sometimes poorly defined and often confused with programme products (e.g. the number of staff trained rather than whether health services have improved).
- Problem: basic data or statistics that would have allowed effective monitoring of indicators of change are often absent at the time of project design.
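As a minimal sketch of the cost-benefit and cost-effectiveness calculations listed above (all figures, the immunisation indicator and the 10% discount rate are hypothetical assumptions, not data from any programme):

```python
# Minimal sketch of cost-benefit and cost-effectiveness calculations.
# All figures are hypothetical; a real appraisal would draw costs and
# benefit estimates from the programme budget and evaluation data.

def npv(flows, rate):
    """Net present value of yearly cash flows at a given discount rate."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

costs    = [100_000, 20_000, 20_000]   # programme spend per year (USD)
benefits = [0, 60_000, 120_000]        # monetised benefits per year (USD)
rate     = 0.10                        # assumed discount rate

bcr = npv(benefits, rate) / npv(costs, rate)   # benefit-cost ratio
print(f"Benefit-cost ratio: {bcr:.2f}")        # > 1 suggests net benefit

# Cost-effectiveness: cost per unit of outcome, used when benefits are
# hard to monetise (e.g. cost per child fully immunised).
children_immunised = 3_500
cer = npv(costs, rate) / children_immunised
print(f"Cost per child immunised: ${cer:,.2f}")
```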
See Needs Assessments.
The five inter-related UNDAF principles http://www.undg.org/?P=220 :
- A human rights-based approach (HRBA);
- Gender equality;
- Environmental sustainability;
- Results-based management (RBM) http://www.undg.org/?P=224
- Capacity development.
Programme Theory & Activity Design
Programme strategies, time planning, prioritisation.
- AusAID guide to programme management http://www.ausaid.gov.au/ausguide/default.cfm with sections on Logframe for instance.
Logical Framework Approach
Note that the LFA is designed as a temporal flow. The categories are artificial markers in a time sequence without natural punctuation marks; confusion over what is an input, process, output, outcome or impact can only be avoided if these terms are carefully defined.
Problem: the LFA can be seen as focusing too much on processes.
The World Bank: "The logical framework (LogFrame) helps to clarify objectives of any project, program, or policy. It aids in the identification of the expected causal links—the “program logic”—in the following results chain: inputs, processes, outputs (including coverage or “reach” across beneficiary groups), outcomes, and impact. It leads to the identification of performance indicators at each stage in this chain, as well as risks which might impede the attainment of the objectives. The LogFrame is also a vehicle for engaging partners in clarifying objectives and designing activities. During implementation the LogFrame serves as a useful tool to review progress and take corrective action."
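To make the results chain concrete, a hedged sketch of a logframe as a data structure; the class, its field names and the rural-health example are illustrative assumptions, not a standard schema:

```python
from dataclasses import dataclass, field

@dataclass
class LogframeLevel:
    """One level of the results chain (input, process, output, outcome
    or impact), with its performance indicators and risks."""
    level: str
    description: str
    indicators: list = field(default_factory=list)  # performance indicators
    risks: list = field(default_factory=list)       # factors that may impede attainment

# Illustrative chain for a hypothetical rural health programme.
chain = [
    LogframeLevel("input", "Funds, staff, vaccines",
                  indicators=["budget disbursed on time"]),
    LogframeLevel("process", "Train health workers, run clinics",
                  indicators=["training sessions held"]),
    LogframeLevel("output", "Health workers trained; clinics operating",
                  indicators=["number of staff trained"],
                  risks=["trained staff leave the district"]),
    LogframeLevel("outcome", "Improved health service delivery",
                  indicators=["share of children immunised"]),
    LogframeLevel("impact", "Reduced child mortality",
                  indicators=["under-5 mortality rate"],
                  risks=["epidemics, conflict and other external factors"]),
]

for lvl in chain:
    print(f"{lvl.level:8s} -> {lvl.description}")
```

Note how the "number of staff trained" sits at the output level while "share of children immunised" is an outcome, mirroring the confusion of products with outcomes flagged above.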
- Norad LFA: Handbook for objectives-oriented planning http://www.norad.no/en/Tools+and+publications/Publications/Publication+Page?key=109408
- Rick on the Road Blog - Social Framework http://mandenews.blogspot.com/2008/02/social-frameworks-improvement-on.html
Program Action Logic Model
- University Cooperative Extension Programs in the US elaborate a logic model, called the Program Action Logic Model, with six steps http://en.wikipedia.org/wiki/Logic_model :
- Inputs (what we invest)
- Activities (the actual tasks we do)
- Participation (who we serve; customers & stakeholders)
- Outcomes - Impacts:
  - Short term (learning: awareness, knowledge, skills, motivations)
  - Medium term (action: behavior, practice, decisions, policies)
  - Long term (consequences: social, economic, environmental, etc.)
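A small illustrative mapping of a hypothetical farmer-training programme onto these six steps (the entries are invented examples, not a template):

```python
# Hypothetical farmer-training programme mapped onto the six steps of
# the Program Action Logic Model; all entries are illustrative.
logic_model = {
    "inputs":        ["extension staff time", "training budget", "research base"],
    "activities":    ["design curriculum", "run field workshops"],
    "participation": ["smallholder farmers", "local cooperatives"],
    "outcomes": {
        "short_term":  ["awareness of soil conservation", "new skills"],  # learning
        "medium_term": ["adoption of conservation practices"],            # action
        "long_term":   ["higher yields", "reduced erosion"],              # consequences
    },
}

for step, content in logic_model.items():
    print(step, "->", content)
```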
Process evaluation and monitoring
Monitoring is distinct from evaluation. It is a continuous function providing managers and key stakeholders with regular feedback on the consistency or discrepancy between planned and actual activities and programme performance and on the internal and external factors affecting results. Monitoring provides an early indication of the likelihood that expected results will be attained. It provides an opportunity to validate the programme theory and logic and to make necessary changes in programme activities and approaches. Information from systematic monitoring serves as a critical input to evaluation.
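A minimal sketch of the planned-versus-actual comparison at the heart of monitoring; the indicators, targets and the 10% variance threshold are assumptions for illustration:

```python
# Compare planned against actual indicator values and flag discrepancies
# early, so managers can adjust activities before the evaluation stage.
# Indicator names, targets and the 10% threshold are illustrative.

THRESHOLD = 0.10  # flag deviations larger than 10% of the plan

planned = {"staff_trained": 200, "clinics_supplied": 40, "outreach_visits": 600}
actual  = {"staff_trained": 185, "clinics_supplied": 22, "outreach_visits": 610}

for indicator, target in planned.items():
    deviation = (actual[indicator] - target) / target
    status = "ON TRACK" if abs(deviation) <= THRESHOLD else "FLAG"
    print(f"{indicator:18s} plan={target:4d} actual={actual[indicator]:4d} "
          f"deviation={deviation:+.0%}  {status}")
```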
Monitoring & Evaluation
The term ‘monitoring and evaluation’ came into common usage in the aid industry over the past 20 years. The notion of trying to measure the performance of an aid project throughout the life of the project, as opposed to simply trying to understand what went right or wrong in hindsight, was promoted by Herb Turner in the 1970s.
- Rick on the Road blog http://mandenews.blogspot.com/ reflections on M & E of development aid programming.
- UNDP Handbook on Planning, Monitoring, Evaluating for Development Results. http://www.undp.org/evaluation/handbook/
- http://mande.co.uk/ Monitoring & Evaluation ;
- IFAD M & E guide in rural settings http://www.ifad.org/evaluation/oe/process/guide/index.htm ; excellent in pointing out shortcomings, e.g. the need to focus on the monitoring needs of other stakeholders, the lack of integration between the M&E function and management, and the poor use of participatory M&E methods.
- The World Bank, M & E: Some tools, methods & approaches http://lnweb90.worldbank.org/OED/oeddoclib.nsf/DocUNIDViewForJavaSearch/A5EFBB5D776B67D285256B1E0079C9A3/$file/MandE_tools_methods_approaches.pdf (comprehensive)
Evaluation is a judgment of the relevance, appropriateness, effectiveness, efficiency, impact and sustainability of development efforts, based on criteria and benchmarks agreed among key partners and stakeholders. It involves a rigorous, systematic and objective process in the design, analysis and interpretation of information to answer specific questions. It provides assessments of what works and why, highlights intended and unintended results, and provides strategic lessons to guide decision-makers and inform stakeholders. It amounts to developing hypotheses about what goes right or wrong and then testing those hypotheses.
Evaluation designs that track effects over extended time periods (time series designs) are generally superior to those that simply compare periods before and after intervention (pre-post designs); comparison group designs are superior to those that lack any basis for comparison; and designs that use true control groups (experimental designs, e.g. randomised evaluation) have the greatest potential for producing authoritative results.
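A toy sketch contrasting a pre-post estimate with a comparison-group (difference-in-differences) estimate; all numbers are invented, to show how a background trend can inflate the naive pre-post figure:

```python
# Toy illustration: why a comparison group strengthens an evaluation design.
# All numbers are invented. Outcome = average household income (USD/month).

treated_before, treated_after = 100.0, 130.0   # programme participants
control_before, control_after = 100.0, 115.0   # similar non-participants

# Pre-post design: attributes the whole change to the programme.
pre_post_effect = treated_after - treated_before            # 30.0

# Difference-in-differences: nets out the background trend that the
# control group also experienced (e.g. general economic growth).
did_effect = (treated_after - treated_before) - (control_after - control_before)  # 15.0

print(f"Pre-post estimate:         {pre_post_effect:.1f}")
print(f"Difference-in-differences: {did_effect:.1f}")
# A randomised design goes further: random assignment makes the control
# group a valid counterfactual by construction.
```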
The major limiting factor on evaluations is the availability of baseline data on the dependent variables of interest. If no prior data exist for an important dependent variable, one must gather the data prior to the start of a new program. This effort may involve conducting a special community survey, taking photographs that show the situation in targeted areas, or hand-tabulating data from existing records.
- UNDP Evaluation policy 2006 http://www.undp.org/evaluation/policy.htm
- The UN Evaluation Group (UNEG) guidelines: "Norms for Evaluation in the UN System" and "Standards for Evaluation in the UN System" (April 2005), and "Ethical Guidelines for Evaluations" (2007)
- http://www.3ieimpact.org/ Impact Evaluation
- World Bank on Impact Evaluation http://www.worldbank.org/ieg/ie/ http://www.worldbank.org/oed/ with the Africa Impact Evaluation initiative http://web.worldbank.org/WBSITE/EXTERNAL/COUNTRIES/AFRICAEXT/EXTIMPEVA/0,,menuPK:2620040~pagePK:64168427~piPK:64168435~theSitePK:2620018,00.html . The World Bank's Human Development Network uses impact evaluation to inform policies on health, education and social protection.
- WB Links http://web.worldbank.org/WBSITE/EXTERNAL/TOPICS/EXTPOVERTY/EXTISPMA/0,,contentMDK:20193313~menuPK:384366~pagePK:148956~piPK:216618~theSitePK:384329,00.html
- World Bank Annual Review of Development Effectiveness http://www.worldbank.org/ieg/arde09/
- Unicef http://www.unicef.org/evaluation/index.html
- http://oerl.sri.com/home.html extensive list of evaluation plans, instruments and reports developed for NSF/EHR funded projects
- http://gsociology.icaap.org/methods/ lists free mostly web-based resources for methods in evaluation and social research. The focus is on "how-to" do evaluation research using surveys, focus groups, sampling, interviews, and other methods.
- Mathematica Programme Analysis http://mathematica-mpr.com/Services/program_evaluation.asp
- UNDP http://www.undp.org/evaluation/
- IFAD http://www.ifad.org/evaluation/index.htm
- ADB http://www.adb.org/IED/default.asp
- OECD DAC Evaluation and Development programmes http://www.oecd.org/department/0,2688,en_2649_34435_1_1_1_1_1,00.html incl. glossary in English/Arabic http://www.oecd.org/dataoecd/8/43/40501129.pdf and Swedish http://www.oecd.org/dataoecd/25/22/39249691.pdf
- Poverty Action Lab, MIT http://www.povertyactionlab.org/methodology : Esther Duflo et al., focus on impact evaluation ; http://www.povertyactionlab.org/policy-lessons/governance/community-participation ; http://www.povertyactionlab.org/methodology/what-evaluation/goals-outcomes-measurement on defining goals, outcomes and indicators ; http://www.povertyactionlab.org/sites/default/files/documents/Evaluation%20Methods%20Table%20one%20page%20PDF_0.pdf on evaluation methodologies
- Innovations for Poverty Action http://www.poverty-action.org/
- Peabody Research Institute http://www.peabody.vanderbilt.edu/pri.xml
- International Development Evaluation Association (IDEAS) http://www.ideas-int.org/home/index.cfm
UNDP Assessments of Development Results (ADRs)
Assessments of Development Results (ADRs) assess the attainment of intended results and UNDP's contribution to development results at the country level. Their scope includes, but is not necessarily confined to, UNDP responsiveness and alignment to country challenges and priorities; strategic positioning; use of comparative advantage; and engagement with partners.
- ‘Guidelines for an Assessment of Development Results (ADR),’ New York Evaluation Office January 2009.
Outcome evaluation: impact studies attempt to expand the knowledge base of what works in development programming, and thus contribute to tailoring new programmes based on evidence. The underlying principle is that knowledge is a public good: results of impact evaluations should be publicly available and benefit all countries.
How do you improve development effectiveness through better use of evidence? Impact evaluation is a key means of producing and using evidence to inform the design of development programmes. It is an important tool in building a stronger evidence base on effective development programmes and, in turn, improving development policy. Ariel Fiszbein, Chief Economist in the Human Development Network at the World Bank, describes how the Bank uses impact evaluation to inform policy in the health, education and social protection sectors.
Impact evaluation is an empirical toolkit we now have. Researchers cooperate with policymakers to evaluate ideas within the overall framework of policy design. Impact evaluation is a patient craft: it takes time, and being strategic about which questions are addressed allows real thinking around the issues.
Key idea: design programmes that are scalable, using local resources and people. How do you take ideas to the field and implement them? We know the ideas, the theory and the empirical evaluations, but the implementation is a challenge.
Challenge: how to shape current practice.
Challenge: how effective is this when the context is idiosyncratic and constantly changing?
See also IFES
- Note: evaluations can also provide critical inputs (benchmarks) to other monitoring and evaluation activities.
The International Initiative for Impact Evaluation (3ie) http://www.3ieimpact.org/ defines rigorous impact evaluations as "analyses that measure the net change in outcomes for a particular group of people that can be attributed to a specific program using the best methodology available, feasible and appropriate to the evaluation question that is being investigated and to the specific context". See also its database of impact evaluations http://www.3ieimpact.org/database_of_impact_evaluations.html
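In the standard potential-outcomes notation (a common formalisation, not 3ie's own wording), the "net change attributable to a specific program" is the average treatment effect:

```latex
% Average treatment effect: mean outcome with the programme (Y_1)
% minus mean outcome without it (Y_0) for the same group of people.
\Delta = E[\,Y_1 - Y_0\,]
% Y_0 is unobservable (counterfactual) for participants, hence the
% need for a credible comparison group or randomisation to estimate it.
```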
- http://poverty-action.org/ creates and evaluates solutions to social and development problems, and works to scale up successful ideas through implementation and dissemination to policymakers, practitioners, investors, and donors.
- Center for Global Development, Evaluation Gap initiative http://www.cgdev.org/section/initiatives/_active/evalgap ; see the report "When Will We Ever Learn?" http://www.cgdev.org/content/publications/detail/7972