OVERVIEW
As part of the larger development effort, the schemes implemented and expenditures incurred by the Central and State governments in the social, economic, and infrastructure sectors directly touch the lives and livelihoods of the people. Converting these outlays into outcomes is a complex process that involves, inter alia, an effective monitoring mechanism and assessment of impacts across the various stages of the project cycle. Moreover, government policy changes and regulations have profound and varied impacts on different sections of the economy and society, which need to be studied thoroughly and scientifically to understand their outcomes, both short- and long-run, intended and unintended. Monitoring, Evaluation, and Learning (MEL) systematically tracks and assesses the processes, end results, and performance of a policy, programme, or strategy to generate credible, reliable, and useful information for implementing agencies and other stakeholders.
In India, the use of MEL by independent agencies has become an integral part of the administration and management of schemes, supporting higher efficacy, transparency, and impact. Consequently, there has been an increasing thrust on building capacity to administer effective MEL systems. This requires a firm understanding not only of the underlying concepts, tools, and techniques, but also of the art of converting learnings into effective policies. The significance of MEL has grown further with the recent emphasis on evidence-based policy-making and action, which facilitates the right interventions based on ground realities. The design, management, and execution of an effective MEL system are therefore crucial to the success of a scheme and, ultimately, to improvements in development and growth parameters.
OBJECTIVES
This training course on MEL aims to improve the competency of officers working at different levels of the Central and State governments to design and execute successful monitoring and evaluation programmes in line with accepted global standards, and to utilise the findings for better planning and more effective delivery of public services. The specific objectives are:
- Enhance existing institutional and individual capacities in the government to undertake MEL
- Improve awareness of the principles and components of an MEL system
- Sensitize participants to the importance of robust MEL systems in public service delivery
- Introduce participants to national and global standard practices of MEL
- Build capacity of participants to successfully plan, commission, and use MEL
- Enhance competency in using MEL data and results for decision-making
COURSE SCHEDULE

| From (Hrs IST) | To (Hrs IST) | Topic (Day 1: Monitoring Fundamentals) |
|---|---|---|
| *Inaugural* | | |
| 0945 | 1015 | Inaugural session |
| 1015 | 1115 | Introduction to the training and ice-breaker |
| 1115 | 1130 | Tea break |
| *Exposure and learning from experts* | | |
| 1130 | 1200 | Introduction to outcome budgeting, and overview of the Output Outcome Monitoring Framework (OOMF) |
| 1200 | 1240 | Overview of the logical framework and OOMF |
| 1240 | 1300 | SMART indicators for performance metrics in a monitoring system |
| 1300 | 1400 | Lunch break |
| *Learning by doing* | | |
| 1400 | 1600 | Group work: preparation of a log-frame |
| 1600 | 1615 | Tea break |
| 1615 | 1715 | Panel discussion on challenges faced in implementing an outcome monitoring framework and mitigation measures: experiences from States |
| From (Hrs IST) | To (Hrs IST) | Topic (Day 2: Evaluation Fundamentals) |
|---|---|---|
| *Exposure and learning from experts* | | |
| 0930 | 0945 | Recap of Day 1 and key takeaways |
| 0945 | 1000 | Short movie on evaluation [Evidence in Action] |
| 1000 | 1100 | When to monitor and when to evaluate? An OOMF and logical-framework perspective |
| 1100 | 1115 | Tea break |
| 1115 | 1245 | Concepts of evaluation: approach (formative, process, summative), framework (RCEESI+E), design (experimental, quasi-experimental, non-experimental), and method (primary and secondary data, qualitative and quantitative, cross-sectional and longitudinal) |
| 1245 | 1330 | Why evaluate? Concepts of attribution and contribution, and their role in influencing policy |
| 1330 | 1430 | Lunch break |
| *Learning by doing* | | |
| 1430 | 1530 | Group discussion on the feasibility of using experimental and quasi-experimental designs for evaluating public programmes |
| 1530 | 1630 | Framing your own evaluation: choosing the right approach, framework, design, and method |
| 1630 | 1645 | Tea break |
| 1645 | 1800 | Diversity in evaluation approaches and methods: experiences from States |
| From (Hrs IST) | To (Hrs IST) | Topic (Day 3: Brass Tacks of Evaluation Design) |
|---|---|---|
| 0900 | 0930 | Group photograph and interaction with DG, DMEO and DG, NILERD |
| 0930 | 0945 | Recap of Day 2 and key takeaways |
| 0945 | 1100 | Utilization of evaluation results |
| 1100 | 1115 | Tea break |
| 1115 | 1215 | Preparing research objectives and questions: examples from UCSS evaluations and other studies |
| 1215 | 1315 | Overview of sampling for primary surveys |
| 1315 | 1400 | Lunch break |
| 1400 | 1500 | Preparing a sampling methodology for the evaluation design prepared earlier |
| 1500 | 1600 | Special lecture on techniques for analysing quantitative primary data: descriptive and inferential |
| 1600 | 1615 | Tea break |
| 1615 | 1815 | Computer-assisted field survey |
| From (Hrs IST) | To (Hrs IST) | Topic (Day 4: Administrative Data and the MEL Cycle) |
|---|---|---|
| *Exposure and learning from experts* | | |
| 0930 | 0945 | Recap of Day 3 and key takeaways |
| 0945 | 1030 | Overview of the MEL cycle: when to do what? |
| 1030 | 1130 | Preparing a MEL plan: what to monitor and when? What to evaluate and when? |
| 1130 | 1145 | Tea break |
| 1145 | 1300 | Effective MIS for effective scheme/programme performance |
| 1300 | 1400 | Lunch break |
| *Learning by doing* | | |
| 1400 | 1500 | Strengthening administrative data: learnings from the Data Governance Quality Index (DGQI) |
| 1500 | 1600 | Techniques to monitor data quality |
| 1600 | 1615 | Tea break |
| 1615 | 1700 | Fun with data: checking the quality of a sample dataset |
| From (Hrs IST) | To (Hrs IST) | Topic (Day 5: Commissioning and Using Evaluations) |
|---|---|---|
| *Exposure and learning from experts* | | |
| 0930 | 0945 | Recap of Day 4 and key takeaways |
| 0945 | 1045 | Preparing a design for institutional evaluation |
| 1045 | 1100 | Tea break |
| 1100 | 1200 | Overview of NITI Aayog |
| 1200 | 1300 | Process of commissioning evaluations: learnings from DMEO |
| 1300 | 1400 | Lunch break |
| 1400 | 1445 | Preparing a Terms of Reference (ToR) |
| 1445 | 1545 | Preparing a Terms of Reference for an evaluation |
| 1545 | 1600 | Tea break |
| 1600 | 1700 | Preparing an M&E strategy |
| 1700 | 1800 | Valediction |