Programme Management


For now this page focuses primarily on programme management in a humanitarian and development setting. See Development Policies, Humanitarian Action, Stabilisation, Early Recovery, Nexus Programme Management, Monitoring & Evaluation, Knowledge Management, Information Systems, Organisational Effectiveness, Aid Effectiveness.

Concepts: Activity Design (methodologies: LFA, target group analysis, impact analysis, cost-benefit analysis, time planning); Activity Implementation.

See the different approaches as a coherent whole, with each part feeding into the others in a constant feedback loop. Different programmes and different organisations are at different stages in this loop at any given time. Reality is more complex: never try to force it into this straitjacket. Treat the concepts as cognitive tools.

Ultimately there is a recognition that we are dealing with "human activity systems" (http://www.aid-it.com/Portals/0/Documents/Soft%20Systems%20and%20MandE.pdf, http://www.aid-it.com/Resources/tabid/109/Default.aspx), which:

  • cannot be easily defined so that all stakeholders agree on the problem to solve
  • require complex judgements about the level of abstraction at which to define the problem
  • have no clear stopping rules
  • have better or worse solutions, not right and wrong ones
  • have no objective measures of success
  • require iteration (every trial counts)
  • have no given alternative solutions (these must be discovered)
  • often have strong moral, political or professional dimensions

Rossi et al. (2004) identify five types of program evaluation: needs assessment, assessment of program theory, assessment of program process, impact assessment, and efficiency assessment.


Challenges

Needs Assessment

See Needs Assessments.

Programme Principles

The UNDAF builds on five inter-related programming principles (http://www.undg.org/?P=220): a human rights-based approach, gender equality, environmental sustainability, results-based management, and capacity development.

Programme Theory & Activity Design

Programme strategies, time planning, prioritisation.

Logical Framework Approach

Note that the LFA is designed as a temporal flow. The categories are artificial markers in a time sequence without natural punctuation marks. Confusion over what counts as input, process, output, outcome and impact can only be avoided if these terms are carefully defined.

Problem: the LFA can be seen as focusing too much on processes.

The World Bank : "The logical framework (LogFrame) helps to clarify objectives of any project, program, or policy. It aids in the identification of the expected causal links—the “program logic”—in the following results chain: inputs, processes, outputs (including coverage or “reach” across beneficiary groups), outcomes, and impact. It leads to the identification of performance indicators at each stage in this chain, as well as risks which might impede the attainment of the objectives. The LogFrame is also a vehicle for engaging partners in clarifying objectives and designing activities. During implementation the LogFrame serves as a useful tool to review progress and take corrective action."
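
The results chain the World Bank describes can be pictured as a simple data structure. The sketch below is illustrative only: the class and field names are assumptions made for the example, not any standard LogFrame schema.

 # Minimal sketch of a LogFrame results chain, in Python.
 # All names here are illustrative assumptions, not a standard schema.
 from dataclasses import dataclass, field

 @dataclass
 class Level:
     """One level of the results chain, with its indicators and risks."""
     name: str
     description: str
     indicators: list = field(default_factory=list)  # performance indicators at this stage
     risks: list = field(default_factory=list)       # risks that might impede the objectives

 # The causal chain named in the quote above, in order:
 chain = [
     Level("inputs", "funds, staff and materials committed"),
     Level("processes", "activities that transform inputs"),
     Level("outputs", "goods and services delivered, incl. coverage across beneficiary groups"),
     Level("outcomes", "changes among beneficiaries"),
     Level("impact", "long-term development results"),
 ]

 # Each level is expected to lead causally to the next.
 for lower, higher in zip(chain, chain[1:]):
     print(f"{lower.name} -> {higher.name}")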

Program Action Logic Model

  • Activities (the actual tasks we do)
  • Participation (who we serve; customers & stakeholders)
  • Short Term (learning: awareness, knowledge, skills, motivations)
  • Medium Term (action: behavior, practice, decisions, policies)
  • Long Term (consequences: social, economic, environmental etc.)

Process Analysis

Process evaluation and monitoring

Monitoring is distinct from evaluation. It is a continuous function providing managers and key stakeholders with regular feedback on the consistency or discrepancy between planned and actual activities and programme performance and on the internal and external factors affecting results. Monitoring provides an early indication of the likelihood that expected results will be attained. It provides an opportunity to validate the programme theory and logic and to make necessary changes in programme activities and approaches. Information from systematic monitoring serves as a critical input to evaluation.
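
The feedback loop described above can be illustrated with a toy monitoring check that compares actual against planned indicator values. The indicator names, figures and the 10% alert threshold are invented for the example.

 # Toy sketch of routine monitoring: compare actual against planned values
 # and flag large discrepancies early. All numbers are invented.
 planned = {"children_vaccinated": 5000, "clinics_operational": 12}
 actual = {"children_vaccinated": 3800, "clinics_operational": 11}

 def monitor(planned, actual, tolerance=0.10):
     """Flag indicators whose shortfall against plan exceeds the tolerance."""
     for indicator, target in planned.items():
         achieved = actual.get(indicator, 0)
         shortfall = (target - achieved) / target
         status = "ALERT" if shortfall > tolerance else "on track"
         print(f"{indicator}: {achieved}/{target} ({shortfall:.0%} short) -> {status}")

 monitor(planned, actual)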

Monitoring & Evaluation

The term ‘monitoring and evaluation’ came into common usage in the aid industry over the past 20 years. The notion of trying to measure the performance of an aid project throughout the life of the project, as opposed to simply trying to understand what went right or wrong in hindsight, was promoted by Herb Turner in the 1970s.

Evaluation

Evaluation is a judgment made of the relevance, appropriateness, effectiveness, efficiency, impact and sustainability of development efforts, based on criteria and benchmarks agreed among key partners and stakeholders. It involves a rigorous, systematic and objective process in the design, analysis and interpretation of information to answer specific questions. It provides assessments of what works and why, highlights intended and unintended results, and provides strategic lessons to guide decision-makers and inform stakeholders. It involves developing hypotheses about what goes right or wrong and then testing those hypotheses.

Evaluation designs that track effects over extended time periods (time series designs) are generally superior to those that simply compare periods before and after intervention (pre-post designs); comparison group designs are superior to those that lack any basis for comparison; and designs that use true control groups (experimental designs, e.g. randomised evaluation) have the greatest potential for producing authoritative results.
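
To make the difference between these designs concrete, here is a toy calculation with invented numbers: a pre-post design attributes the whole observed change to the intervention, while a comparison-group (difference-in-differences) design nets out the background trend.

 # Toy contrast of a pre-post estimate with a difference-in-differences
 # estimate. All outcome values are invented for illustration.
 treated_before, treated_after = 40.0, 55.0  # outcome mean in the programme group
 control_before, control_after = 42.0, 50.0  # outcome mean in the comparison group

 # Pre-post design: credits the programme with the whole change,
 # including any background trend.
 pre_post = treated_after - treated_before  # 15.0

 # Comparison-group design: subtracts the trend seen in the comparison group.
 diff_in_diff = (treated_after - treated_before) - (control_after - control_before)  # 7.0

 print(f"pre-post estimate: {pre_post}")
 print(f"difference-in-differences estimate: {diff_in_diff}")
 # With a true control group (randomised design), the comparison group is
 # statistically equivalent, giving the estimate a causal interpretation.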

The major limiting factor on evaluations is the availability of baseline data on the dependent variables of interest. If no prior data exist for an important dependent variable, the data must be gathered before the start of a new programme. This effort may involve conducting a special community survey, taking photographs that show the situation in targeted areas, or hand-tabulating data from existing records.

UNDP Assessments of Development Results (ADRs)

Assessments of Development Results (ADRs) assess the attainment of intended and achieved results as well as UNDP contributions to development results at the country level. Their scope will include, but not necessarily be confined to, UNDP responsiveness and alignment to country challenges and priorities; strategic positioning; use of comparative advantage; and engagement with partners.

Impact Evaluation

Also called outcome evaluation. Impact studies attempt to expand the knowledge base of what works in development programming, and thus contribute to the tailoring of new programmes based on evidence. A guiding principle is that knowledge is a public good: the results of impact evaluations should be publicly available and benefit all countries.

How do you improve development effectiveness through better use of evidence? Impact evaluation is a key means of producing and using evidence to inform the design of development programmes. It is an important tool for building a stronger evidence base on effective development programmes and, in turn, improving development policy. Ariel Fiszbein, Chief Economist in the Human Development Network at the World Bank, has described how the World Bank uses impact evaluation to inform policy in the health, education and social protection sectors.

Impact evaluation is an empirical toolkit that we now have. Researchers cooperate with policy makers to evaluate ideas within the overall framework of policy design. Impact evaluation is a patient craft: it takes time and requires being strategic about which questions are addressed, and it allows structured thinking around the issues.

A key idea is to design programmes that are scalable, using local resources and people. The challenge is taking ideas to the field and implementing them: we know the ideas, the theory and the empirical evaluations, but implementation remains difficult.

Challenge: how to shape current practice.

Challenge: how to remain effective when the context is idiosyncratic and constantly changing.

See also IFES

The International Initiative for Impact Evaluation (3ie) (http://www.3ieimpact.org/) defines rigorous impact evaluations as "analyses that measure the net change in outcomes for a particular group of people that can be attributed to a specific program using the best methodology available, feasible and appropriate to the evaluation question that is being investigated and to the specific context". 3ie maintains a database of impact evaluations at http://www.3ieimpact.org/database_of_impact_evaluations.html.

Cost-benefit Analysis

Benchmarks
