Professional norms and standards for monitoring and evaluation (M&E) serve as a framework to contribute to the improvement of IOM’s M&E functions, as well as to the greater effectiveness of its interventions. This chapter will first explain the norms and standards related to M&E. It will then look at key components of managing M&E, including related roles and responsibilities in the IOM context, as well as budgeting requirements for M&E.
As a member of the United Nations Evaluation Group (UNEG), IOM operates under the UNEG Norms and Standards for Evaluation.
The information in this chapter of the IOM Monitoring and Evaluation Guidelines derives from both the IOM Evaluation and Monitoring Policies. Together, the policies define IOM’s position on, and provide instruction related to, the purpose of M&E within the Organization, the norms and standards adopted, as well as the M&E criteria to be used, M&E-related roles and responsibilities and budgeting requirements. This chapter, as well as the evaluation and monitoring policies, specifically mention some of the key UNEG norms and standards frequently used and/or that can be applied institutionally within the context of IOM.
The table below provides a summary of the guiding norms and standards. While there are no UNEG norms and standards for monitoring, the IOM Monitoring Policy adapted the evaluation norms and standards as “principles for monitoring”; these are listed below followed by the evaluation norms and standards:
Monitoring principles

- Impartiality: Mitigating the presence of bias applies to all monitoring actions and reporting.
- Utility: Monitoring must serve the information needs of the intended users for maximum benefit. Monitors shall ensure that the work is well informed, relevant, timely and clearly and concisely presented. Monitoring reports should present evidence, progress, issues and recommendations in a comprehensive and balanced way. Reports should be both results and action oriented.
- Transparency: All stages of the monitoring process should be transparent; consultation with major stakeholders is essential and involves clear and regular communication, including on the scheduling and scope of specific monitoring missions and activities. Documentation resulting from monitoring should be easily consultable and readable to guarantee transparency and legitimacy.
- Credibility: Monitoring shall be based on data and observations using systems and tools that can guarantee quality and reliability. Monitoring reports shall reflect consistency and dependability in data, findings, judgements and lessons learned.
- Disclosure: The reporting and lessons from monitoring shall be disseminated by establishing effective feedback loops to relevant departments, operational staff and, when relevant, to beneficiaries and other stakeholders.
- Participation: Whenever relevant, IOM monitoring activities shall be carried out with the participation of relevant stakeholders, such as affected populations or beneficiaries, donors, national and international government agencies, non-governmental organizations, civil society organizations, the private sector and/or representatives of local communities.
Evaluation norms

- Intentionality and utility: In the context of limited resources, evaluations must be selected and undertaken with a clear intention of use and in a timely manner, so that they provide relevant and useful information for decision-making.
- Impartiality: This means mitigating the presence of bias at all stages of the evaluation process, including planning an evaluation, formulating the mandate and scope, selecting the evaluation team, providing access to stakeholders, conducting the evaluation with the necessary methodological rigour and presenting key findings, recommendations and challenges. It provides legitimacy to evaluation and reduces the potential for conflict of interest.
- Independence: The evaluation function should be independent from other management functions so that it is free from undue influence. It needs to have full discretion in directly submitting its reports for consideration at the appropriate level of decision-making. To avoid conflict of interest and undue pressure, evaluators need to be independent and must not have been directly responsible for the policy setting, design or overall management of the subject of evaluation. They must have no vested interest and must have full freedom to conduct their evaluative work impartially. They must be able to express their opinion freely, without potential negative effects on their professional status or career development. The independence of the evaluation function should not impinge on evaluators' access to information about the evaluation.
- Transparency and consultation: These are essential features at all stages of the evaluation process, particularly with the major stakeholders, as they establish trust, build confidence, enhance ownership and increase accountability. They also guarantee credibility (another UNEG norm) and quality of the evaluation and facilitate consensus-building and ownership of the findings, conclusions and recommendations.
Evaluation standards

- Disclosure policy: All evaluations are expected to be publicly available and listed on the IOM Evaluation web page and under other specific web pages as deemed necessary, with due regard to IOM's Data Protection Principles (IN/00138). All additional evaluation products (such as annual reports, evaluation plans, terms of reference (ToR), evaluation management responses and evaluation briefs) should also be shared when requested.
- Competencies: Evaluation competencies refer to the qualifications, skills, experience, educational background and attributes required to carry out roles and responsibilities within an evaluation process, as well as to ensure the credibility and quality of the evaluation process. All those engaged in promoting, designing, conducting and managing evaluation activities should aspire to promote and conduct high-quality work, guided by professional standards and ethical evaluation principles. Some of these elements are also included in the professionalism norm, which should be supported by an enabling environment, institutional structures and adequate resources. Internal and external evaluators should also abide by these principles and show sufficient professional competencies to conduct evaluations.
- Management response and follow-up: In addition to the comments on the draft report that are requested from stakeholders, including managers (programme managers, chiefs of mission (CoMs), directors of departments), evaluations may also require an explicit response by management to endorse or challenge the report and its recommendations. This may take the form of a management response, an action plan on the follow-up of recommendations and/or an agreement on the assignment of responsibilities and accountabilities. A periodic report on the status of the implementation of the evaluation recommendations may be asked of the office/manager, particularly when addressing sensitive reports that require close follow-up.
- Evaluability: Before undertaking complex evaluations requiring a significant investment, it may be useful to conduct an evaluability assessment to examine the scope and financial implications of the evaluation, fine-tune methodological approaches, such as for data collection and availability analysis, and decide on the evaluation criteria. It may be necessary to conduct preliminary surveys or focus groups to ensure that the evaluation will provide timely and credible information for decision-making and guarantee an impartial evaluation process.
- Conduct of evaluations: Each evaluation should use design, planning and implementation processes that are inherently quality oriented, covering appropriate methodologies for data collection, analysis and interpretation. All evaluations must first be framed and prepared through ToR, providing the evaluation objective(s), scope, methodology, resources required and implementation workplan. Evaluators should be required to develop an evaluation matrix or inception report clearly showing how they understand the scope of and approach to the evaluation. Evaluation reports must present, in a complete and balanced way, the evidence, findings, conclusions and recommendations. They must be brief, to the point and easy to understand.
- Quality control and assurance: Quality control and assurance mechanisms should be put in place at each stage of the evaluation process. OIG can provide such services, in line with UNEG guidelines, and for decentralized evaluations, the regional M&E officers can be consulted.
M&E practitioners must have personal and professional integrity, and evaluations or monitoring activities should not reflect personal or sectoral interests. They must respect the right of institutions and individuals to provide information in confidence, take care that those involved have a chance to examine the statements made and ensure that sensitive data cannot be traced to its source. Evaluators and monitors must be sensitive to the beliefs, manners and customs of the social and cultural environments in which they work and must address issues of discrimination and gender inequality. They may sometimes uncover evidence of wrongdoing, which must be reported to the appropriate investigative body with the required confidentiality. Evaluators are not expected to evaluate the personal performance of individuals, but rather must balance an evaluation of management functions with due consideration for this principle.
M&E practitioners should be aware of and act in accordance with the UNEG Ethical Guidelines for Evaluation (2020).
M&E practitioners should not violate ethical principles or compromise their independence when collecting and analysing M&E data.
Although there are many more ways in which this can happen, here are some common examples:
- Altering and producing positive findings due to a conflict of interest, other pay-offs or to avoid penalties
- Allowing unsubstantiated opinions to influence the monitoring and/or evaluation activities as a result of sloppy, unreliable or unprofessional evaluation or monitoring practices
- Allowing personal bias to influence findings
- Making promises to beneficiaries or participants that cannot be kept in order to induce them to cooperate
- Failing to honour commitments that should have been honoured
Source: Worthen et al., 2004.
In addition, a misunderstanding of their responsibilities may also lead M&E practitioners to violate ethical principles. This may result in faulty reasoning, such as overgeneralizing findings from data, drawing conclusions based on too little data or allowing their own prejudice to cloud their objectivity during data collection and analysis. Ethical problems may arise at any point during the data collection and analysis process.
The following are a few examples of some misunderstandings that M&E practitioners may encounter during either monitoring or evaluation exercises.
These examples relate to the commissioning entity, the monitoring and evaluation practitioner, and participants and other stakeholders.
Negative consequences could arise from unethical behaviours, with an impact on ongoing and/or future programming.
Preventive measures to address ethical concerns

While the ethical problems that M&E practitioners encounter are vast and vary by context, the following are some preventive measures that can be taken to address the situations presented above:

- Note: If traumatized participants cannot be linked to relevant services, do not ask questions that may trigger a trauma response. Be sure to seek guidance from relevant thematic specialists or experts when monitoring or evaluation requires direct contact with affected populations at high risk of trauma.
- Where there is a breakdown in social relations, ask trusted members of the community to introduce the monitoring and/or evaluation process.
- Note: Be transparent. Explain the purpose of the exercise, its constraints, and how the data will be used and stored.
Throughout the development and implementation of M&E activities, practitioners must adhere to common ethical principles in order to guarantee that the information gathered is accurate, relevant, timely and used in a responsible manner. An ethical monitoring and/or evaluation checklist, found in Annex 2.1, may be used in order to ensure that norms and standards, including ethical principles, inform all stages of data collection, analysis and reporting.
In order to satisfy the key ethical considerations outlined above, it is critical to obtain informed consent from the individuals from whom the data is collected. Informed consent is the permission granted by a person to have their personal data collected and analysed upon having understood and agreed to the following:
(a) Purpose of the collection, processing and sharing of their personal data;
(b) Data users;
(c) Any risks associated with the collection, processing or sharing of the data.
Sufficient information should be provided to the participant so that they can independently judge and decide whether or not to grant their consent to participate in the interview or research. Although informed consent may be obtained in writing or through a verbal statement by participants, it is advised to obtain it in writing, circumstances permitting (see the IOM Informed Consent Template).
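To make these elements concrete, the sketch below is a minimal, hypothetical illustration (in Python; the field names are assumptions by the editor, not IOM-defined) of how the information underpinning informed consent might be recorded for each participant. For actual data collection, the IOM Informed Consent Template remains the reference.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical record of informed consent for one participant.
# Field names are illustrative assumptions, not an IOM specification;
# the IOM Informed Consent Template should be used in practice.
@dataclass
class InformedConsentRecord:
    participant_id: str              # pseudonymized identifier, never a name
    purpose_explained: bool          # (a) purpose of collection, processing and sharing
    data_users_explained: bool       # (b) who will use the data
    risks_explained: bool            # (c) risks of collection, processing or sharing
    consent_given: bool
    consent_mode: str = "written"    # "written" preferred; "verbal" if circumstances require
    notes: List[str] = field(default_factory=list)

    def is_valid(self) -> bool:
        """Consent counts only if all three elements were explained and agreed."""
        return (self.purpose_explained and self.data_users_explained
                and self.risks_explained and self.consent_given)
```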
IOM resources
- 2018a IOM Evaluation Policy. Office of the Inspector General (OIG), September.
- 2018b Monitoring Policy. IN/31. 27 September.
External resources
- Buchanan-Smith, M., J. Cosgrave and A. Warner
  2016 Evaluation of Humanitarian Action Guide. Active Learning Network for Accountability and Performance/Overseas Development Institute (ALNAP/ODI), London.
- Morris, M. and R. Cohn
  1993 Program evaluators and ethical challenges: A national survey. Evaluation Review, 17(6):621–642.
- Organisation for Economic Co-operation and Development (OECD)
  2019 Better Criteria for Better Evaluation: Revised Evaluation Criteria Definitions and Principles for Use. OECD/Development Assistance Committee (DAC) Network on Development Evaluation.
- United Nations Evaluation Group (UNEG)
  2016a Norms and Standards for Evaluation. New York.
  2020 UNEG Ethical Guidelines for Evaluation.
- Worthen, B.R., J.R. Sanders and J.L. Fitzpatrick
  2004 Program Evaluation: Alternative Approaches and Practical Guidelines. Third edition. Pearson Education Inc., Boston.
The IOM Project Handbook states that the responsibility for monitoring interventions and planning for and managing evaluation falls on the manager responsible for the intervention (strategy, project or programme). However, the manager can, and should, be supported by other IOM staff to ensure proper M&E efforts are put in place. This will depend largely on budget and resources allocated.
Frequently, there is a wide range of people with some M&E-related responsibilities within their ToR. Therefore, it is essential to clearly identify a staff member whom others can turn to for M&E guidance and accountability. This person should oversee the coordination and supervision of M&E functions, as well as highlight and report any potential challenges that may arise.
The following sections provide a brief overview of some of the competencies required, and challenges faced, when managing and conducting monitoring, as well as evaluation, in an intervention.
Evaluation at IOM operates at two levels: a central evaluation function overseen by OIG, and a decentralized function covering all evaluation activities and matters managed and overseen by other departments and offices at IOM. For more information on decentralized evaluation, see chapter 5.
OIG/Evaluation aims to contribute actively to the oversight, accountability, transparency, strategic guidance and organizational leadership and learning of the Organization. This includes providing technical guidance and support to IOM departments and offices and contributing to the set-up of decentralized evaluation systems.
In this regard, roles and responsibilities related to evaluation rest with different entities at IOM, namely the Director General, the Inspector General, the IOM Audit and Oversight Advisory Committee, the OIG/Evaluation Unit, directors of regional offices and departments, regional M&E officers, CoMs/heads of offices, project or programme managers and M&E staff in country offices, as summarized below. A full and detailed list of responsibilities is found within the IOM Evaluation Policy.
Director General | Responsible for guaranteeing that attention is given to evaluation within IOM, including by allocating relevant resources. The Director General endorses the OIG/Evaluation Unit workplan and supports OIG-implemented evaluations. |
Inspector General | Holds an oversight function by approving policies, guidelines and strategies related to evaluation, as well as the OIG biannual workplan for further submission to the Director General. The Inspector General also promotes evaluation across the Organization as a mechanism for corporate learning and accountability. |
IOM Audit and Oversight Advisory Committee (AOAC) | Reviews the functioning, operational independence and effectiveness of OIG, including its evaluation function, as well as provides advice on the status of evaluation at IOM. |
OIG/Evaluation | Responsible for actively contributing to the oversight, accountability, transparency, strategic guidance and organizational leadership and learning of IOM. It sets norms and standards for evaluation in IOM, prepares relevant institutional policies and instructions, harmonizes procedures and provides technical guidance and support to IOM departments and offices; its full list of prescribed tasks is set out in the IOM Evaluation Policy. |
Directors of regional offices and departments | At an institutional level, directors of IOM regional offices and departments are responsible for the following: (a) contributing to the development of the workplan for central evaluations conducted by OIG/Evaluation; (b) promoting the use of evaluation as a strategic tool and facilitating the conduct of evaluations; (c) ensuring that relevant staff/offices support the conduct of evaluations; and (d) where relevant, ensuring that a management response and follow-up is provided. For decentralized evaluation, directors are responsible for identifying and planning evaluations, such as making resources available and ensuring conformity with IOM's mandatory policy of including evaluation in all projects. |
Regional M&E officers | Responsible for preparing evaluation workplans for their respective regions; preparing and/or undertaking evaluations of IOM interventions within their region; promoting the use of evaluation; and providing technical support and capacity-building for the planning and conduct of quality evaluations. Regional M&E officers also contribute to the development of evaluation guidelines and methods under OIG/Evaluation guidance. They promote and ensure the application of the IOM Evaluation Policy and guidelines; reinforce partnership with, and participation in, regional evaluation networks; and inform and consult with OIG/Evaluation on technical support and quality assurance matters. |
CoMs | For all evaluations within their country office (central and decentralized), CoMs are responsible for facilitating the conduct of evaluations. This includes ensuring the involvement of relevant staff/sub-offices and the provision of timely feedback. CoMs ensure a management response is provided and steps are taken to implement and support follow-up actions on agreed evaluation recommendations. For decentralized evaluations within their country office, CoMs are responsible for the following: (a) identifying and planning evaluations, including making appropriate resources available; (b) ensuring that evaluations implemented conform with the IOM Evaluation Policy; and (c) informing and consulting with regional M&E officers and OIG/Evaluation for technical support and quality assurance, when required. |
Project/Programme managers and M&E staff in country offices | For project/programme evaluation, M&E staff can help develop plans, including evaluation ToR, although the programme or project manager remains responsible for understanding and approving all plans. M&E staff and focal points within country offices may be expected to play a role in evaluation by organizing and/or leading self-evaluations. For all evaluations (centralized and decentralized) of their intervention(s), managers and M&E staff in country offices facilitate the conduct of the evaluation, ensure relevant staff and other offices are involved and provide timely feedback, and guarantee that a management response is provided and followed up. For decentralized evaluations, intervention managers and M&E staff identify and plan evaluations, including by making resources available in line with intervention budgets and evaluation scope, principles, norms and quality provisions. Managers and M&E staff ensure that evaluation is included for all IOM projects or provide a justification when it is not included, as well as assess the possibility of including evaluation at a later stage of implementation. Managers and M&E staff should inform and consult with their respective regional M&E officer and/or OIG/Evaluation for technical support and quality assurance when required. |
In the case of a strategy that is owned at the country level, such as a country strategy, the entity responsible for its development and implementation should also be responsible for evaluating it, as required, and ensuring that relevant programmatic evaluations also take it into consideration.
As previously mentioned, in a programme or project, the ultimate responsibility for monitoring rests with the appropriate programme or project manager. For strategies and policies, the owner of the intervention is responsible for monitoring the progress of that strategy or policy. When possible, it is recommended that IOM offices hire dedicated M&E officers to conduct M&E of relevant interventions and provide the required monitoring support to CoMs, managers or other administrative and operational staff in the office. The recruitment of dedicated M&E officers is particularly well suited to complex working environments involving multiple implementing partners, locations, restricted areas and large budgets.
Overall monitoring responsibilities
Intervention | Responsibility for monitoring |
---|---|
Organization-wide strategies or policies | Relevant Headquarters departments/divisions (such as the Human Resources Strategy by Human Resources Management, the IT Strategy by the Information and Communications Technology Unit, the Migration Crisis Operational Framework by the Department of Operations and Emergencies and the Gender Policy by the Gender Coordination Unit). |
Regional strategies | Regional directors, in coordination with the senior regional advisors. |
Country strategies | CoMs, in coordination with the regional directors. |
Programmes and projects | CoMs are responsible for ensuring that programme and project managers are monitoring or integrating monitoring systems in their projects/programmes. Managers are responsible for monitoring their own programmes or projects. |
Key roles and responsibilities of monitoring across the various levels within IOM
Responsible unit | Summary description of key monitoring-related responsibilities |
---|---|
OIG/Evaluation | Provides institutional guidance and support for monitoring; its specific monitoring-related responsibilities are set out in the IOM Monitoring Policy. |
Headquarters departments, divisions and units, and regional thematic specialists | Provide monitoring guidance and instructions within their area of technical expertise (such as Migration Health Division for health projects). They are also responsible for monitoring their own relevant policies and strategies, ensuring that project monitoring systems are linked to the strategic objectives and assisting offices in finding timely solutions to problems through effective monitoring. |
Regional directors | Ensure collaboration for monitoring the implementation of regional policies and strategies and, through the endorsement of projects in the region, ensure the relevant inclusion of monitoring systems. |
Regional M&E officers | Provide monitoring-related guidance and support within their region; their specific responsibilities are set out in the IOM Monitoring Policy. |
Regional resources management officers | Provide support to country offices’ finance staff in monitoring financial expenditure and budgets. |
CoMs | Ensure that all projects in the country office have sound monitoring mechanisms and tools in place and that the regional office is kept informed. Furthermore, if the country office has a strategy, the CoM should ensure that its implementation is being monitored. |
Programme/Project managers | Programme/Project managers have the primary responsibility for monitoring the progress of the project/programme, in both operational and financial terms, including what resources go into the project (inputs), what is carried out (activities) and what results come out (outputs and outcomes). Specifically, programme/project managers ensure that effective monitoring and control mechanisms are in place to gain assurance that items obtained under the project reach the targeted beneficiaries and to prevent fraud. They regularly monitor and measure progress, identifying and communicating any deviations or risks to relevant stakeholders and promptly taking corrective actions as necessary (such as requesting a project implementation period extension or seeking donor approval to amend/revise the project activities, risk plan, results or budget). In collaboration with finance staff, managers regularly review financial results, including line item reports, to minimize incidences of under- or overspending and, where necessary, explain material under- or overspends and/or correct errors. |
Country office M&E staff | M&E officers develop associated M&E tools and workplans at the country office level, including for implementing partners, in coordination with the CoM/head of office and the regional office, and based on a risk assessment of the projects being implemented within the country. They also provide associated technical support and capacity-building to the office/projects on monitoring (input at project development, implementation and reporting levels) and conduct monitoring visits in accordance with project/programme workplans, including for activities undertaken by implementing partners. M&E officers are also responsible for preparing relevant reports. Note: Several country offices have M&E focal points; focal points may not be able to conduct all these monitoring activities but can use the responsibilities as guidance for their role. |
Country office resource management staff | Country office resource management staff assist managers in monitoring financial expenditure and ensure adherence to donor contractual requirements and IOM procedures. |
Monitoring and evaluation competencies
When thinking of M&E roles and responsibilities, it is useful to consider essential competencies for such roles. Competencies are a combination of knowledge and skills required for practitioners to execute complex tasks in their professional environment. Despite the wide diversity of contexts within which M&E is conducted, the complexity of M&E systems and the fact that competencies may vary to some degree at different levels, the following are considered to be applicable for M&E staff.
Essential monitoring and evaluation competencies fall into three groups: general management competencies; professional monitoring and evaluation staff competencies; and competencies in data collection, data management, data analysis, dissemination and use.
When assessing M&E capacity, it is helpful to consider three levels of proficiency: entry/novice, proficient/skilled and mastery/expert.
As M&E are mandatory parts of any IOM intervention, related costs must be included in their respective budgets during the intervention development phase. Detailed guidance on budgeting for projects and programmes, including the incorporation of monitoring and evaluation costs, is available in the IOM Project Handbook, while specific guidance on budgeting for evaluation is presented in Annex 5.1 of chapter 5 of the IOM Monitoring and Evaluation Guidelines.
Budgeting for an intervention is now done through IOM’s Project Information and Management Application (PRIMA).
M&E-related staff costs should be clearly mentioned under the Staff Costs section of the IOM Project Budget template in PRIMA. Similarly, specific costs related to M&E, such as conducting baseline assessments, post-intervention surveys and evaluations, should be clearly mentioned in the designated M&E lines under the Operational Costs section of the IOM Project Budget in PRIMA.
Costs, such as for corresponding staff time and travel, are typically incorporated into the Staff and Office Costs section of the IOM Project Budget, unless subcontracted or performed by a partner or consultant, in which case these costs should be listed under the Operational Costs section either in the separate budget lines for Monitoring and Evaluation, as indicated in the budget template in PRIMA, or under the costs for the partner.
IOM recommends the same range for M&E as recommended by the wider evaluation community: 5–10 per cent of the total budget, with 2–4 per cent for evaluation and 3–6 per cent for monitoring. However, this cost breakdown is purely indicative and, whatever the size of the intervention, the amount allocated for an evaluation in IOM ranges from USD 3,000 to USD 30,000, taking into consideration that internal evaluations conducted by IOM staff are less expensive than external evaluations. For complex evaluations that may require more resources, specific discussions can take place with the donor(s) regarding higher budgeted amounts; for instance, impact evaluations may require an investment of at least USD 70,000 and can easily reach a cost of USD 500,000.
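As a purely illustrative sketch of this indicative guidance (in Python; the function name and example budget are assumptions, not an IOM formula or tool), the percentage ranges above can be applied to a total intervention budget as follows:

```python
# Illustrative only: indicative M&E budget ranges based on the percentages
# cited above (5–10% of the total budget for M&E, of which 2–4% for
# evaluation and 3–6% for monitoring). Not an official IOM calculation.

def indicative_me_budget(total_budget_usd: float) -> dict:
    """Return indicative monitoring and evaluation budget ranges in USD."""
    return {
        "monitoring_range": (0.03 * total_budget_usd, 0.06 * total_budget_usd),
        "evaluation_range": (0.02 * total_budget_usd, 0.04 * total_budget_usd),
        "total_me_range": (0.05 * total_budget_usd, 0.10 * total_budget_usd),
    }

if __name__ == "__main__":
    # Example: a USD 500,000 project would indicatively allocate
    # USD 15,000–30,000 for monitoring and USD 10,000–20,000 for evaluation.
    for item, (low, high) in indicative_me_budget(500_000).items():
        print(f"{item}: USD {low:,.0f} – USD {high:,.0f}")
```

Any figure obtained this way would still need to be checked against the absolute ranges and the evaluation modality discussed above, as well as against the scope of the planned M&E activities.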
Identifying the data source and collection methods required for M&E early in intervention development allows for the best estimation of the financial needs. The following highlights key considerations for planning the project/programme M&E budget.
Key considerations for planning an intervention monitoring and evaluation budget
OIG/Evaluation has developed a sample M&E Budget Calculator, including example calculations for specific M&E needs.
IOM resources
- 2010 IOM Data Protection Manual. Geneva.
- 2017 IOM Project Handbook. Second edition. Geneva (Internal link only).
- 2018a IOM Evaluation Policy. OIG, September.
- 2018b Monitoring Policy. IN/31. 27 September.
- n.d.a OIG/Evaluation Strategy 2021–2023. OIG/Evaluation.
- n.d.b IOM Informed Consent Form (Internal link only).
- n.d.c PRIMA for All (Internal link only).
- n.d.d OIG/Evaluation M&E Budget Calculator (Internal link only).
External resources
- Barnett, C. and L. Camfield
  2016 Ethics in evaluation. Journal of Development Effectiveness, 8(4):528–534.
- Buchanan-Smith, M., J. Cosgrave and A. Warner
  2016 Evaluation of Humanitarian Action Guide. ALNAP/ODI, London.
- International Federation of Red Cross and Red Crescent Societies (IFRC)
  2011 Project/Programme Monitoring and Evaluation (M&E) Guide. Geneva.
- Morra-Imas, L.G. and R.C. Rist
  2009 The Road to Results: Designing and Conducting Effective Development Evaluations. World Bank, Washington, D.C.
- Morris, M.
  2003 Ethical considerations in evaluation. In: International Handbook of Educational Evaluation, Part 1. Springer International Handbooks of Education (T. Kellaghan, D. Stufflebeam and L. Wingate, eds.). Kluwer Academic Publishers, Dordrecht, pp. 303–327.
  2015a Professional judgment and ethics. In: Community Psychology: Foundations for Practice (V. Scott and S.M. Wolfe, eds.). SAGE Publications, pp. 132–156.
  2015b Research on evaluation ethics: Reflections and an agenda. In: Research on Evaluation: New Directions for Evaluation, 2015(148):31–42.
- Morris, M. and R. Cohn
  1993 Program evaluators and ethical challenges: A national survey. Evaluation Review, 17(6):621–642.
- Organisation for Economic Co-operation and Development (OECD)
  2019 Better Criteria for Better Evaluation: Revised Evaluation Criteria Definitions and Principles for Use. OECD/DAC Network on Development Evaluation.
- Prom-Jackson, S. and G. Bartsiotas
  2014 Analysis of the Evaluation Function in the United Nations System. JIU/REP/2014/6. Joint Inspection Unit of the United Nations, Geneva.
- Thomson, S., A. Ansoms and J. Murison (eds.)
  2013 Emotional and Ethical Challenges for Field Research in Africa: The Story Behind the Findings. Palgrave Macmillan, Hampshire.
- United Nations Evaluation Group (UNEG)
  2016a Norms and Standards for Evaluation. New York.
  2016b Evaluation Competency Framework. New York.
- UNAIDS
  2010 Standards for a Competency-based Approach to Monitoring and Evaluation Curricula and Trainings. Monitoring and Evaluation Reference Group, Geneva.
- Worthen, B.R., J.R. Sanders and J.L. Fitzpatrick
  2004 Program Evaluation: Alternative Approaches and Practical Guidelines. Third edition. Pearson Education Inc., Boston.