How

Assumption: “tools and knowledge are often implicit”
→ we propose a strong mutual learning approach (utilization-focused)

One of the main focuses of the evaluation is to be practical and use-oriented. It will therefore address all main knowledge needs, looking at the country level, in order to foster usefulness. A special focus will be put on identifying and validating systems that may already be in use but are not yet explicit, and specifically on the set of context, output, outcome and impact indicators, together with the baseline assessment carried out by the programme. Further, the evaluation will look at potential synergies and experience sharing between countries (good practices, innovative practices and lessons learned already produced). Associated methodological tools: utilization-focused steps process; development of good-practice fiches; assessment of selected baselines.

Assumption: “things can only be useful if designed together, validated and agreed by consensus”
→ we propose a consultancy that is as participatory as possible

In addition, the evaluation will include several participatory techniques to give voice, as much as possible, to field offices as well as to target populations and stakeholders. Given that no resources are available for field visits, remote communication and tools will be used as a substitute; strong commitment from field offices and stakeholders will therefore be needed. Associated methodological tools: live online blog with the latest developments; open cloud documents on Google Drive; participation in the reference group set-up; etc.

Assumption: “all things are interrelated and processes are normally the key”.
→ we will assess the system and the principles: a systemic and principles-oriented evaluation.

The evaluation will go beyond “results” (the “what”) and look at the processes and dynamics that lead to them (the “how”). Processes and dynamics reside outside the logical results chain and require additional layers of analysis: it is necessary to unveil the system where change took place, the actors and the dynamics, and to recognize that, within these complex systems, the logic is not linear (whereby each action is assumed to have an effect), but that different forces are at play. To navigate this complexity, clear principles and theories of change are needed as a compass. The evaluation will try to unveil such theories and principles. Associated methodological tools: intervention logic reconstruction; stakeholder mapping; process model set-up; etc.

Assumption: “this evaluation will accompany you throughout the project life”.
→ we will use a forward-looking and appreciative approach.

In line with the learning approach, the emphasis is not to “look at what happened to check whether it was good or not”, but rather to “look at what is being done in order to replicate, strengthen or correct it”. We will not stop at acknowledging the strengths and weaknesses of the programme; we will try to articulate and consolidate them in ways that speak to the future. This will have an impact on the style of reporting, where every piece of evidence will be asked to “tell something relevant for the future”, based on the learning and the experience so far. Associated methodological tools: appreciative enquiry approach in questionnaires and interviews.
