
Suggestions for the Evaluation of Transitional Justice


Presentation Transcript


  1. Suggestions for the Evaluation of Transitional Justice

  2. Angle Taken in this Presentation: the perspective of an evaluator of conflict-related programmes, of which TJ is one field. The experience of TJ is drawn from: • Evaluation of EC support to human rights and the rule of law (thematic, 2005), and of support to ICC ratification • Evaluation of USAID peace-building and justice support programmes in Burundi, Senegal (Casamance) and Uganda (north) • Evaluation of a World Bank project and a Danida-funded project in Burundi concerning the repatriation of refugees, with a focus on restorative and traditional justice.

  3. What is Evaluation? Evaluation is… to assess or judge the value or worth of something (its relevance, effectiveness, efficiency, sustainability). It is not monitoring: continual self-evaluation of changes to generate a lateral view, through formal and informal processes. It is not impact assessment: tracking change in a society as it relates to an identifiable source or set of factors.

  4. Why Evaluation?

  5. Why Evaluation? • Accountability • Learning … However, evaluation comes at a greater cost (financial or otherwise) than is acknowledged, and is used sparingly.

  6. Why Evaluation? Evaluation is part of the concept of results-based management: a verifiable argument about what worked, what did not, and why. It guides the interface between a programme and a society; all too often this information is restricted. • Creating an information and analysis environment • Creating a flow of decision-oriented information • Obtaining reliable information about the progress of activities and their outcomes, the reasons for success or failure, and the contexts in which activities are taking place.

  7. How to Evaluate: A Rule of Thumb: Keep Definitions Simple: • Activity: the process to deliver an output (use made of an input or resources) • Output: a product, usually a definable quantity (most easily monitored over time and space) • Outcome: use made by the beneficiaries of an output • Impact: the consequence of the outcome, or change triggered by the use of an output.
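To make these four definitions concrete, here is a minimal sketch of how a results chain could be recorded for a hypothetical TJ project. The field names and the example entries (a community-hearings project) are illustrative assumptions, not drawn from the presentation.

```python
from dataclasses import dataclass

@dataclass
class ResultsChain:
    """One results chain, using the definitions from slide 7."""
    activity: str   # process that delivers an output (use made of inputs or resources)
    output: str     # a product, usually a definable quantity
    outcome: str    # use made by the beneficiaries of the output
    impact: str     # consequence of the outcome, or change triggered by its use

# Hypothetical example, for illustration only.
community_hearings = ResultsChain(
    activity="Train 40 local facilitators and convene community hearings",
    output="120 hearings held; 900 victim statements recorded",
    outcome="Victims and local leaders use the hearings to settle land disputes",
    impact="Reduced recourse to violence in disputes over returnee land",
)
```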

  8. How to Evaluate: • Outcome Mapping focuses in particular on changes in behaviour and relationships, and seeks to be specific about change. • My preferred definition of outcomes fits this concept, although in some ways it is even more specific: the use made of an output is a particular form of behavioural change.

  9. How to Evaluate:

  10. How to Evaluate: Spheres of Influence. Change operates as a ripple around an actor, and attribution and influence decline with distance: • Activities and Outputs: level of implementing partners = sphere of primary influence • Results and Outcomes: level of beneficiaries = sphere of direct programme influence • Objectives and Impact: general population = sphere of indirect influence.

  11. How to Evaluate: Change should ideally work mechanically, like a spring: the combination of different changes should be necessary and sufficient to lead to a higher-level change. This calls for projects, programmes and sector-wide approaches to be modest in the claims they make about impact. (Diagram: evaluation moves up the chain Activity → Output → Outcome → Impact.)

  12. How to Evaluate: the degree of programme control declines along the results chain: • Activity & process: about 100% control • Output: about 75% control • Outcome: about 50% control • Impact: about 25% control. (Diagram: a plan-and-feedback cycle from resources, through a detailed operational plan and implementation, to results and the specific and general objectives, with improvements fed back through M&E.)
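As a minimal illustration of this gradient, the sketch below encodes the indicative control percentages from the slide and flags the levels where an evaluator should hedge attribution claims. The 50% threshold and the helper name are assumptions made for the example, not part of the presentation.

```python
# Indicative degree of programme control at each results level (from slide 12).
CONTROL = {"activity": 1.00, "output": 0.75, "outcome": 0.50, "impact": 0.25}

def attribution_caution(level: str, threshold: float = 0.5) -> str:
    """Return a short note on how strongly results at this level can be attributed."""
    control = CONTROL[level]
    if control >= threshold:
        return f"{level}: {control:.0%} control - attribution to the programme is plausible"
    return f"{level}: {control:.0%} control - claim contribution, not attribution"

for level in CONTROL:
    print(attribution_caution(level))
```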

  13. How to Evaluate: Intentional Design. There are no set objectives, outcomes and indicators; these depend on the case. For example, for “Progress in adherence to the Rome Statute, a better public perception of the court’s role, and towards implementation of the Rome Statute in national legislation”: • Emergence of national lobby groups and coalitions to support ratification by a given state of the statutes of international criminal courts • Evidence of trust and active relations within a network of professional experts in areas relevant to the Courts • Evidence of a strong community of purpose, in particular in the number of subscriptions to specialised newsletters, the number of “hits” on a specialised website, participation in meetings, and participation in demonstrations • Recurrence of key words in the national press, particularly the most popular media, and in specialised legal literature • Judges and magistrates are selected in a public and transparent manner which is commented on in national communities.

  14. What to Evaluate: Intentional Design. Set objectives, and detail the changes which could be observed as a result of success (here based on F. Oduro).

  15-19. What to Evaluate: The Constraint of Transition. The challenges of programmes in situations of transition are unique and differ from those in other fields of evaluation. This naturally applies to TJ evaluations: • The balance between reconciliation and other objectives is rarely clear • Using objectives as benchmarks is undermined by the fact that conflict programming requires, above all, a response to opportunities • The justice and reconciliation objective is rarely well articulated with the objectives of a programme, and often rests on assumptions • Selecting indicators demands a strong understanding of the conflict and culture, and verification methods applicable to a conflict situation • Justice and reconciliation usually depend on the identification of an end-state, an overall objective; the value given to this may not be shared by all stakeholders, even those within a single programme or organisation.

  20. What to Evaluate: Opportunistic Design, or Evaluation Without Objectives. Outcome level: for opportunistic strategies or sector-wide approaches.

  21. (Diagram: a matrix mapping outcomes against conflict drivers (Issue 1, Issue 2, Issue 3), with each outcome scored by Relevance x Extent x Duration of its effect on each driver.)
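The slide gives the scoring rule but not the scales. As an illustration only, the sketch below assumes each dimension is rated 1 (low) to 3 (high) and multiplies the ratings to rank outcomes against a conflict driver; the scales, outcome names and driver name are assumptions, not taken from the presentation.

```python
# Hypothetical ratings of two outcomes against one conflict driver ("land disputes"),
# each rated 1 (low) to 3 (high) on relevance, extent and duration.
outcomes = {
    "community hearings used to settle disputes": {"relevance": 3, "extent": 2, "duration": 2},
    "radio programme on the peace accord":        {"relevance": 2, "extent": 3, "duration": 1},
}

def score(rating: dict) -> int:
    """Relevance x Extent x Duration, as named on slide 21."""
    return rating["relevance"] * rating["extent"] * rating["duration"]

# Rank outcomes by their score against the driver, highest first.
for name, rating in sorted(outcomes.items(), key=lambda kv: score(kv[1]), reverse=True):
    print(f"{name}: {score(rating)}")
```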

  22. Who Evaluates? Users and Roles • All the evaluations of TJ I have done were commissioned by donors and were not well disseminated to the beneficiaries. • The link between general public perception and research on one side, and evaluation on the other, is very tenuous. • Victims, support groups and civil society need a say in which questions are asked at the outset if evaluations are to be relevant to them.

  23. Who Evaluates? Participatory Evaluation. Once stakeholders have been identified, the agenda, objectives and methodology should be agreed with them. The key skills are the ability to ask critical questions and to suspend personal judgement. The following questions could guide preparation: • What are the questions to be answered? • What will the evaluation be used for? • Who must be heard, and how can the conditions be created for them to speak freely and openly?

  24. Evaluation of Transitional Justice • When a programme is close to a classical development programme, I would argue that its evaluation is not very different from evaluation in general. • When, on the other hand, the objectives are not clearly defined, it is best to approach evaluation by focusing on outcomes and on their connection to the context and needs.
