Ukraine and Impacted Countries Emergency Appeal, Community Engagement and Accountability

Remote | Remote-based

  • Organization: IFRC - International Federation of Red Cross and Red Crescent Societies
  • Location: Remote | Remote-based
  • Grade:
  • Occupational Groups:
    • Social Affairs
    • Legal - Broad
    • Medical Practitioners
    • Humanitarian Aid and Coordination
    • Monitoring and Evaluation
    • Civil Society and Local governance
    • Emergency Aid and Response
  • Closing Date: 2025-07-11

1. Summary

  • Purpose

This evaluation aims to assess how effectively Community Engagement and Accountability (CEA) has been implemented under the Ukraine and Impacted Countries Emergency Appeal, focusing on the use of community feedback in programme design, the institutionalization and sustainability of CEA in National Societies, and the implementation of meaningful community participation.

  • Target Audience

The primary users of the evaluation are IFRC, National Society and Partner National Society leadership and operations managers, as well as technical teams across the Movement, including those outside CEA functions, who rely on community data to inform planning. Findings will support decision-making on how to improve the use of community data, better embed CEA across programmes, and adapt participatory approaches to complex settings.

  • Commissioners

This evaluation is commissioned by Birgitte Bischoff Ebbesen, Regional Director, IFRC Regional Office for Europe.

  • Reports to

Evaluation Management Team (EMT): Eva Mihalik - Senior Officer, Monitoring and Evaluation; Fatma Nur Bakkalbasi – Senior Officer, Community Engagement and Accountability; Carmen Chavarri – Strategic Planning Advisor.

  • Duration

Up to 50 working days.

  • Timeframe

From September 1st to December 19th, 2025.

  • Location

Remote, with a portion of field work (location TBD).

 

2. Background

The Ukraine and Impacted Countries Emergency Appeal (UIC EA) was launched in February 2022 to respond to the dire humanitarian consequences of the international armed conflict in Ukraine. With over CHF 529 million raised, it is one of the largest appeals in IFRC’s history, supporting operations across Ukraine and 17 impacted countries.

Community Engagement and Accountability (CEA) was identified early in the operation as a key area requiring urgent strengthening, as most National Societies (NS) had limited prior capacity. A CEA surge team was deployed in the first days of the operation to establish country-tailored systems focusing on information provision, consultation, feedback mechanisms, and National Society capacity building. Three years later, dedicated CEA support remains in place at IFRC’s Europe Regional Office and in some Country Cluster Delegations and Country Offices.

CEA was largely integrated across sectors, notably in Cash and Voucher Assistance and Migration and Displacement support. CEA systems have been active in all responding National Societies, and over 1,500 staff and volunteers have been trained across the region. National Societies have deployed a wide range of engagement tools, including digital tools, call centres, home visits, surveys, and events, to engage communities and act on their input.

However, important challenges remain. Internal consultations have shown that, while many National Societies have demonstrated strong practices, community feedback is still used more often to resolve individual cases than to shape programme design. Sustainability remains uncertain where institutional buy-in or dedicated resources are lacking[1], and meaningful participation continues to be challenging in urban and complex population movement settings.

[1] This is further explored in the Mid-term Review – Ukraine & Impacted Countries Appeal.

3. Evaluation Purpose and Scope

  • Purpose

The purpose of this evaluation is to examine how Community Engagement and Accountability (CEA) has been implemented under the Ukraine and Impacted Countries Emergency Appeal (UIC EA), with the goal of assessing its relevance, sustainability, and impact.

Specifically, the evaluation seeks to produce learning that will help to (1) bridge the gap between feedback collection and its use in programme design and decision-making; (2) understand how IFRC and its member National Societies can better support the institutionalization and long-term sustainability of CEA within National Societies; and (3) identify good practices and lessons for implementing meaningful community participation in complex and urban settings. The evaluation will inform management decision-making for the IFRC and National Societies, generate learning for technical teams and operational leadership across the Movement, and support accountability to affected communities.

The findings will be used by IFRC operational and technical teams, National Societies, the IFRC Regional Office for Europe, and relevant Partner National Societies, as well as shared with donors and the broader humanitarian response community to inform strategic planning, policy development, and future emergency and recovery programming.

  • Scope

The evaluation will cover the period from February 2022 to mid-2025, reflecting the timeline from the onset of the response through the current transition into longer-term integration programming. It will focus on selected countries and National Societies participating in the UIC EA[2], chosen to reflect programmatic diversity[3].

The unit of analysis will be the CEA-related systems, practices, and results of National Society operations supported directly by the UIC EA. Target groups for data collection will include IFRC technical and operational teams, National Society leadership and operational staff (including but not limited to CEA focal points), Partner National Societies, and where possible, affected communities.

------------------------

[2] Exact locations will be defined after a joint assessment between IFRC and the consultants.

[3] Ukraine will not be included in this evaluation. A separate evaluation focusing specifically on Ukraine is planned for 2027.

------------------------

4. Evaluation Questions – Objectives – Criteria

Theme 1: Use of Community Feedback to Inform Programme Design

Q1. To what extent, and through what processes, has community feedback informed programme design and decision-making at the National Societies?

Objective:

To identify whether systems are in place for collecting, analyzing, and using community feedback for programmatic decisions; and to understand where breakdowns occur in this process—whether at the collection, analysis, or application stage—and why (e.g. structural, institutional, or operational barriers).

Criteria:

  • Effectiveness – How effectively were feedback mechanisms used to improve service delivery?
  • Relevance – Were interventions informed by people’s needs?
  • Accountability – Was feedback acted on and the loop closed?
  • Learning – What worked, what didn’t and why?


Sub-questions:

Q1.1. What mechanisms or channels were used throughout this response to collect community feedback? How consistent was their use across programmes?

Q1.2. Once collected, how was feedback data processed and analyzed? Who was responsible, and what systems or tools were used?

Q1.3. How was the analyzed feedback used in programmatic decision-making? Are there examples of it informing the design, adaptation, or termination of an intervention or activity? Was feedback used to resolve issues on a case-by-case basis?

Q1.4. Where are the main barriers to using community feedback in programme design? Are these barriers structural (e.g. systems, SOPs), institutional (e.g. leadership buy-in), or operational (e.g. referral mechanisms, time, tools, staff capacity)?

Q1.5. To what extent were communities informed about how their feedback was used, and what mechanisms existed to close the feedback loop effectively?

 

Theme 2: Institutionalization and Sustainability of CEA in National Societies

Q2. How has the approach to integrating CEA into National Societies been structured, and how well does it align with existing capacities and ways of working?

Objective:

To assess how CEA has been introduced and institutionalized within National Societies, how it is sustained (or not), what support has been most useful, and what coordination or structural barriers exist.

Criteria:

  • Sustainability – Is the change durable over time?
  • Coherence – Are strategies and actors aligned?
  • Efficiency – Has the integration of CEA leveraged existing systems and capacities to deliver results without duplication or excessive burden?

Sub-questions:

Q2.1. How was CEA introduced across the different National Societies (by IFRC, a Partner National Society, an external partner, or other)?

Q2.2. To what extent has CEA been integrated into National Societies’ strategies, structures, and routine operations, or is it still largely project-based?

Q2.3. What types of support (technical, financial, human resources) have been most helpful, or most lacking, in enabling National Societies to adopt and sustain CEA practices?

Q2.4. How does coordination with IFRC and other Movement partners support or hinder the institutionalization of CEA in National Societies?

Q2.5. What factors contribute to or hinder the long-term sustainability of CEA? How reliant are National Societies on external support to maintain CEA?

Theme 3: Understanding and Implementing Meaningful Participation

Q3. How has meaningful participation been understood and implemented in this response, and what factors have influenced its uptake, quality, or limitations, especially in complex or urban contexts?

Objective:

To assess whether and how communities have been engaged in a meaningful way, particularly in urban migration settings; to explore how practitioners define and interpret “meaningful participation”; and to identify barriers and enablers to stronger community-led engagement.

Criteria:

  • Accountability – Are communities meaningfully engaged in decisions that affect them?
  • Relevance – Are participatory approaches appropriate to the context (i.e. urban, migration)?
  • Coverage – Are all relevant community groups included in participation efforts?
  • Learning – How do teams define and assess the terms “community” and “meaningful participation”, and what can be improved, especially in integration work?

Sub-questions:

Q3.1. How is the concept of “meaningful participation” understood within and across National Societies? To what extent is this understanding shared across functions and partners?

Q3.2. What participatory approaches have been implemented throughout this response? At what stage (design, implementation, evaluation) were communities involved, and to what extent?

Q3.3. In contexts where participatory approaches were limited or absent, what were the main reasons (e.g. time, funding, uncertainty about what participation entails, institutional or cultural dynamics)?

Q3.4. How have National Societies adapted participatory approaches in complex contexts such as urban, digital, or migration settings where community identities are fluid or hard to define?

Q3.5. What should successful participation look like in practice?

Q3.6. Are current or upcoming Integration and Inclusion efforts seen as opportunities to strengthen community participation? If so, how might this be done in practice?

 

5. Evaluation Methodology

The evaluation will adopt a mixed-methods approach combining qualitative and quantitative data collection and analysis to ensure triangulation and empirical rigor. It will be conducted through a hybrid model, whereby the methodology is developed jointly by the consultant(s) and the IFRC, with IFRC also contributing to the design of tools, selection of data sources, and interpretation of findings. This collaborative approach is intended to ensure contextual relevance and to support stronger ownership and uptake of the results by IFRC and National Societies.

The methodology will include a combination of:

  • Exhaustive desk review of secondary data, including but not limited to: IFRC Operational Updates, the Midterm Review (MTR), case studies, lessons learned and post-distribution monitoring reports.
  • Key informant interviews (KIIs) with IFRC staff, National Society leadership and practitioners, Partner National Societies, and potentially affected communities[4].
  • Focus group discussions (FGDs) or structured reflection sessions where feasible, particularly to explore definitions and experiences of community participation.
  • Sense-making workshops after data collection, bringing together IFRC, NS, and consultant teams to jointly interpret emerging findings and refine recommendations.

The sampling strategy will prioritize countries and National Societies that reflect operational diversity, existing documentation, and feasibility. The evaluation will build heavily on existing monitoring and evaluation data to reduce data collection burdens.

A detailed methodology, sampling plan, and data collection tools will be proposed by the consultant and submitted for joint validation in the inception report.

[4] To minimize time demands on participants, particularly National Society staff and IFRC country teams with limited availability, interviews will be designed to be purposeful, concise, and limited in number, drawing on existing data wherever possible.

6. Deliverables

1. Inception Report
A detailed inception report will be submitted following joint consultations, working sessions, and document review. It will outline the final evaluation methodology, refined evaluation questions, sampling strategy, data sources, data collection tools, roles and responsibilities, and a workplan with timelines. It will also include draft interview and discussion guides.
Responsible: Consultant team
Due: Within 4 weeks of contract start

2. Desk Review Summary
A concise synthesis of key findings from relevant secondary sources (e.g. MTR, operational updates, case studies, lessons learned and post-distribution monitoring reports), to inform primary data collection and provide context.
Responsible: Consultant team
Due: As annex to the inception report

3. Sense-Making Workshop
A facilitated virtual or hybrid workshop involving IFRC and selected National Society stakeholders to collectively analyze emerging findings, validate initial interpretations, and strengthen ownership of the conclusions.
Responsible: IFRC and Consultant team (joint facilitation)
Timing: After completion of data collection, prior to finalizing draft report

4. Draft Evaluation Report
A full draft report including executive summary, background, methodology and limitations, findings, conclusions, lessons learned, and actionable recommendations. Case studies may be embedded or presented separately.
Responsible: Consultant team
Due: ~4–5 weeks after completion of data collection

5. Final Evaluation Report
A final, revised version of the report incorporating feedback from IFRC and stakeholders. It should include appendices with the ToR, data collection tools, list of interviewees, and references.
Responsible: Consultant team
Due: ~2 weeks after receiving consolidated feedback on the draft

6. Case Studies (optional format)
2–3 case studies based on examples of effective CEA implementation or institutionalization, which may be included in the main report or delivered as standalone outputs (format and details to be determined during inception phase).
Responsible: Consultant team

7. Joint Presentation of Findings and Recommendations
A final joint presentation (by consultants and IFRC) of the evaluation findings and recommendations to National Societies, Partner National Societies, and the broader humanitarian community, including relevant CEA working groups.
Responsible: Consultant team and IFRC
Format: Slide deck + oral presentation

7. Proposed Timeline

Week 1–3 (September 1st through 19th)
Activities:
- Desktop study
- Joint methodology discussions and remote working sessions
- Development of inception report (Consultant team)
- Feedback and validation of inception report (IFRC)
Deliverables:
- Inception report due September 12th
- Feedback on inception report due September 19th

Week 4–6 (September 22nd through October 3rd)
Activities:
- Data collection

Week 7–8 (October 6th through 17th)
Activities:
- Data analysis

Week 9 (October 20th through 24th)
Activities:
- Joint sense-making workshop (2.5 days, online/hybrid)

Week 10–13 (October 27th through November 14th)
Activities:
- Draft evaluation report
Deliverables:
- Draft evaluation report due November 7th
- Feedback on draft evaluation report due November 14th

Week 14–15 (November 17th through 28th)
Activities:
- Revise and submit final evaluation report
Deliverables:
- Final evaluation report due November 28th

Week 16–17 (December 1st through 19th)
Activities:
- Plan joint presentation of findings and recommendations
- Joint presentation of findings and recommendations
Deliverables:
- Presentation of findings by December 19th

8. Evaluation Quality and Ethical Standards

The evaluator/s should take all reasonable steps to ensure that the evaluation is designed and conducted to respect and protect the rights and welfare of people and the communities of which they are members. The evaluation should be technically accurate, reliable, and legitimate; conducted in a transparent and impartial manner; and contribute to organizational learning and accountability. The evaluation team should therefore adhere to the evaluation standards and specific, applicable processes outlined in the IFRC Framework for Evaluation. The IFRC Evaluation Standards are:

1) Utility: Evaluations must be useful and used.

2) Feasibility: Evaluations must be realistic, diplomatic, and managed in a sensible, cost-effective manner.

3) Ethics & Legality: Evaluations must be conducted in an ethical and legal manner, with particular regard for the welfare of those involved in and affected by the evaluation.

4) Impartiality & Independence: Evaluations should be impartial, providing a comprehensive and unbiased assessment that takes into account the views of all stakeholders.

5) Transparency: Evaluation activities should reflect an attitude of openness and transparency.

6) Accuracy: Evaluations should be technically accurate, providing sufficient information about the data collection, analysis, and interpretation methods so that its worth or merit can be determined.

7) Participation: Stakeholders should be consulted and meaningfully involved in the evaluation process when feasible and appropriate.

8) Accountability: Evaluations should ensure accountability by adequately documenting the evaluation process and products, aligning evaluation practice with an equity approach, and translating recommendations into actions.

It is also expected that the evaluation will respect the seven Fundamental Principles of the Red Cross and Red Crescent: 1) humanity, 2) impartiality, 3) neutrality, 4) independence, 5) voluntary service, 6) unity, and 7) universality. Further information about these principles is available at: https://www.ifrc.org/who-we-are/international-red-cross-and-red-crescent-movement/fundamental-principles

9. Evaluator/s Profile Needed

The evaluation will be conducted by an external consultant or consulting team with demonstrated experience in designing and implementing complex, mixed-methods evaluations in humanitarian or development contexts. The team should bring strong analytical and facilitation skills, with a proven ability to generate actionable insights and communicate findings clearly to diverse audiences.

Required qualifications include:

  • At least 7–10 years of relevant experience in evaluation or applied anthropological or social research, with expertise in community engagement, accountability, and participation. Experience conducting multi-country evaluations strongly preferred.
  • PhD in Social Sciences, or a Master’s with equivalent combination of education and relevant work experience.
  • Demonstrated experience in qualitative and quantitative data collection and analysis, including participatory methods.
  • Strong facilitation skills and experience leading workshops or collaborative interpretation processes.
  • Excellent analytical, writing, and presentation skills in English.
  • Extensive experience working with the Red Cross Red Crescent Movement or similar humanitarian actors.

We are seeking evaluators who bring curiosity, creativity, and a genuine commitment to learning, with a focus on delivering practical recommendations that go beyond restating known issues and help shape future practice.

10. Application Procedures

Interested candidates should submit their application materials by 11 July 2025 to hr.europe@ifrc.org, with “UIC EA CEA Evaluation” written in the subject line. Application materials should include:

  • Curricula Vitae (CVs) for all members of the team applying for consideration.
  • Cover letter clearly summarizing your experience as it pertains to this assignment, your daily rate, and three professional references.
  • Technical proposal (when appropriate) not exceeding five pages, outlining your understanding and interpretation of the ToR, the proposed methodology, a time and activity schedule, and itemized estimated costs for services.
  • At least one example of an evaluation report most similar to that described in the ToR.

Application materials are non-returnable, and we thank you in advance for understanding that only short-listed candidates will be contacted for the next step in the application process.
