We conduct mixed-method and participatory research studies during and after humanitarian emergencies. We work with humanitarian agencies to identify their priority research questions and then develop rigorous approaches to answering them.
Case study: Women’s leadership in disaster preparedness
We deliver robust evaluations that seek out local perspectives on disaster preparedness and response. They're user-focussed, with easy-to-understand results communicated via clear reports, infographics, videos or microsites. We're exploring ways to better share evaluation findings with disaster-affected communities. We can work with your team throughout the evaluation process: from identifying appropriate questions to understanding the findings and thinking through their implications.
We work with teams to review and develop strategies in a participatory way. We facilitate learning reviews that allow teams to reflect on their work. We create landscape reviews that allow organisations to identify themes and trends in their environment and to develop responsive, forward-thinking strategies.
Case study: ActionAid review of commitments
We're confident using quantitative data. We design studies, then collect and analyse data using experimental approaches or simple statistics. We explain the findings and limitations of statistical methods in simple ways. We can train your team to feel confident understanding and using statistical research to inform programme decision-making.
Case study: Peace in Burundi
We build toolkits that support humanitarian practitioners and field-level innovators to monitor and evaluate their work. We develop clear, simple tools that can be used by teams with limited resources to inform their own decision-making and to provide evidence to stakeholders.
Case study: Evidence for Humanitarian Innovation
Training in evidence
We mentor and train teams in developing a strategy, process and methodology for monitoring programmes and gathering evidence of outcomes or impact.