Our vision is an equal world where all girls can thrive

But how do we monitor and evaluate the progress we are making towards achieving that? How do we continuously learn and improve what we do? How do we demonstrate and report on the impact we are having?

Understanding the impact that our work has on the lives of girls and young women around the world is crucial for gauging the effectiveness of our interventions and for identifying areas for improvement. Our Monitoring, Evaluation and Learning (MEL) strategy is closely linked to our overall organisational strategy and helps us find out what’s working, what our beneficiaries want and need, what outcomes we are enabling, and how we can continuously improve.

Our mission is to enable girls and young women to develop their fullest potential as responsible citizens of the world. Strong and effective MEL supports this mission and helps us achieve our vision of an equal world where all girls can thrive. MEL is about gaining useful insights and understanding the difference we are making to the lives of girls, young women, and their communities.

Keeping it simple…

During the planning phase of any project or programme, the main challenge is usually deciding which key results and changes to monitor and which tools to use. To avoid collecting data for the sake of it (not everything that can be measured should be measured!), the focus must be on data that is relevant, credible and useful for learning, reporting and improvement. The next step is selecting appropriate, user-friendly tools to measure and verify those results effectively.

Our global MEL team uses a Results-Based Management (RBM) approach, which focuses on measuring success by the actual results achieved.

Our RBM toolkit comprises various tools which enable us to capture and measure outputs, outcomes, and impact at different stages in the lifecycle of our programmes. The toolkit provides both mandatory tools, which we deploy across all programmes, and optional tools that vary from programme to programme. Examples include pre- and post-surveys, Quarterly Progress Reporting (QPR), stakeholder analysis, log frames, Gantt charts, results chains, financial management processes, risk assessments and Theory of Change.

Since flexibility and learning are key elements in our MEL work, we review our toolkit regularly to improve our results and impact measurement.


What does this mean in practice…

Planning

Firstly, we use a robust Theory of Change (TOC) to serve as the overarching guide for all programme planning. In addition to using a TOC at individual programme level, we have also developed an organisational TOC which encompasses our strategic objectives and helps to articulate the desired impact of WAGGGS as an organisation.

A TOC can be continuously refined based on real-time feedback and evolving circumstances and demonstrates how collective efforts translate into desired outcomes.

At the programme or project level we also develop a range of M&E planning tools such as stakeholder maps, results chains, M&E frameworks, Gantt charts, and risk analysis.

Monitoring

We gather programme monitoring data in a systematic and continuous manner to accurately track the progress and effectiveness of our interventions. This is vital for planning, decision-making, risk assessment/mitigation, transparency, accountability, strategic focus, communication, and resource allocation.

Where possible, we apply consistent monitoring across all programmes so that comparisons can be made between different interventions, while also building in more bespoke monitoring for individual programmes when necessary. Decisions must also be made about the frequency of data capture and how to incorporate any additional monitoring and reporting requirements, such as those from funders or other collaborative delivery partners.

To gather this data across varied stakeholders, we use two essential tools: a Quarterly Progress Reporting (QPR) tool, which collects data from the Member Organisations engaged in the delivery of our programmes, and our own internal Monitoring/Management Information Systems (MIS).

Learning

At the core of our approach to learning lies a commitment to integrating continuous improvement into our programmes and interventions.

Regular internal and external mechanisms (such as learning & connection sessions, learning logs and knowledge exchange sessions) are used to capture and document key learnings and to ensure they are used to inform decisions, refine strategies, and enhance programme efficiency or effectiveness over time.

Evaluation

Finally, we employ a systematic and comprehensive evaluation framework that uses a mixed-method approach combining quantitative and qualitative data (including outcomes harvesting, feedback sessions, and Knowledge, Attitude, Perception (KAP) baseline and endline surveys). Given the global nature of guiding and scouting, and the complexity that comes with trying to measure social impact, we recently started to adopt the Most Significant Change Technique (MSCT), whereby we gather individual stories of change on a regular basis and analyse them to identify the areas of greatest impact.

This framework incorporates a variety of techniques to organise and reflect on data, results, and outcomes, to draw conclusions, and to steer action on the things that matter.

The specific evaluation techniques used vary between different programmes and depend on the programme stage, but all evaluation is aimed at helping us to better understand and improve the impact we are having.
