
BASIC EDUCATION MONITORING AND EVALUATION (M&E) MANUAL
Towards a Results-Based M&E in Basic Education

Version 8
14 January 2019
List of acronyms

ACTRC Assessment, Curriculum and Technology Research Centre


ALS Alternative Learning System
AS Administrative Service
ASEAN Association of Southeast Asian Nations
BCD Bureau of Curriculum Development
BEA Bureau of Educational Assessment
BEPS Basic Education Planning System
BEMEF Basic Education Monitoring and Evaluation Framework
BEST Basic Education Sector Transformation
BHROD Bureau of Human Resources and Organizational Development
BLD Bureau of Learning Delivery
BLR Bureau of Learning Resources
CI Continuous Improvement
CO Central Office
COA Commission on Audit
COT Classroom Observation Tool
CQA Curriculum Quality Audit
DBM Department of Budget and Management
DepEd Department of Education
DEDP Division Education Development Plan
DMEA Division Monitoring, Evaluation and Adjustment
DMET Division Monitoring, Evaluation and Adjustment Team
DO Department Order
DsMEA District Monitoring, Evaluation and Adjustment
DsMET District Monitoring and Evaluation Team
DSWD Department of Social Welfare and Development
EBEIS Enhanced Basic Education Information System
FGD Focus Group Discussions
GOP Government of the Philippines
GRBE Gender Responsive Basic Education policy
IAS Internal Audit Service
ICO International Cooperation Office

ICT Information Communications Technology
ICTS Information Communications Technology Service
IO Intermediate Outcome
JHS Junior High School
K Kindergarten
KEQ Key Evaluation Questions
LIS Learner Information System
LRMDS/P Learning Resource Management and Development System/Portal
LRN Learner Registration Number
LS Legal Service
M&E Monitoring and Evaluation
MEA Monitoring, Evaluation and Adjustment
MOOE Maintenance and Other Operating Expenses
MTB-MLE Mother Tongue-Based Multilingual Education
MTEP Medium-Term Expenditure Plan
NAT National Achievement Test
NBEP National Basic Education Plan
NCR National Capital Region
NEAP National Educators Academy of the Philippines
NEDA National Economic and Development Authority
NGO Non-Government Organization
OSEC Office of the Secretary
PIP Public Investment Program
PIR Program Implementation Review
PMIS Program Management Information System
PMS Project Management Service
PPA Programs, Projects and Activities
PPST Philippine Professional Standards for Teachers
RCTQ Philippine National Research Center for Teacher Quality
RMEA Regional Monitoring, Evaluation and Adjustment
RMET Regional Monitoring and Evaluation Team
RO Regional Office
RPMS Results-based Performance Management System
SAT Self-Assessment Tool
SBM School-Based Management

SDGs Sustainable Development Goals
SDO Schools Division Office
SHS Senior High School
SIP School Improvement Plan
SMEA School Monitoring, Evaluation and Adjustment
SMET School Monitoring and Evaluation Team
SPED Special Education
ToC Theory of Change
TRIP Three-Year Rolling Infrastructure Program

BASIC EDUCATION MONITORING
AND EVALUATION MANUAL
PREFACE

PURPOSE

The purpose of the Basic Education Monitoring and Evaluation (M&E) Manual is to promote M&E
for results in a practical and accessible manner. This Manual is intended to integrate results-based
M&E in the Department of Education (DepEd). It is not intended to be read from cover to cover;
the reader can opt to focus on specific topics of interest and concern.
This M&E Manual operationalizes the Basic Education Monitoring and Evaluation
Framework (BEMEF) policy and is generally intended to reflect the M&E integration and linkages at
different governance levels. Specifically, this Manual seeks to

• Simplify M&E systems and processes in DepEd by establishing specific indicators that operating units across each governance level should concentrate on;
• Establish ownership of specific M&E indicators by DepEd function across governance levels; and,
• Aid M&E efforts and initiatives within a results-based framework in order to comply with the requirements of oversight agencies and other stakeholders.
This Manual does not provide detailed guidance on conducting monitoring and evaluations; this
is provided by specific policies and memoranda issued by DepEd.

INTENDED AUDIENCE

The Basic Education Monitoring and Evaluation (M&E) Manual is intended for DepEd personnel
involved in M&E across governance levels; however, it has been designed to be understood by
multiple users as well. Although it has been designed with the Central Office in mind, specific areas
are pertinent to the regional, schools division and school governance levels.

TABLE OF CONTENTS
PREFACE..................................................................................................................................6
Purpose........................................................................................................................................6
Intended Audience......................................................................................................................6
RATIONALE............................................................................................................................10
New Planning and Budgeting Environment............................................................10
Legal Bases.............................................................................................................................13
LINK TO MEDIUM-TERM AND STRATEGIC PLANNING...............................................................15
Plan Formulation.......................................................................................................................15
Strategic Planning.....................................................................................................................16
National Basic Education Plan.....................................................................................................17
Regional Basic Education Plan.....................................................................................................17
Division Education Development Plan........................................................................................17
School Improvement Plan...........................................................................................................18
Investment Planning.................................................................................................................18
Operational Planning.................................................................................................................19
BASIC MONITORING AND EVALUATION (M&E) CONCEPTS.......................................................21
MONITORING AND EVALUATION FRAMEWORK AND PRINCIPLES............................................25
Basic Education Monitoring and Evaluation Framework..........................................................25
Goal..............................................................................................................................................26
Outcome......................................................................................................................................26
Intermediate outcomes...............................................................................................................26
Enabling Environment.................................................................................................................27
Monitoring and Evaluation (M&E) Principles............................................................................28
TYPES OF MONITORING AND EVALUATION (M&E)..................................................................30
Monitoring and Evaluation Mechanisms...................................................................................30
Types of Monitoring and Evaluation..........................................................................................31
MONITORING AND EVALUATION OF BASIC EDUCATION PLANS..........................................36
MONITORING AND EVALUATION TOOLS AND SYSTEMS..........................................................41
Monitoring and Evaluation Tools...............................................................................................41
Information Systems.................................................................................................................41
SCOPE OF MONITORING AND EVALUATION (M&E)..................................................................45
Basic Education Performance Indicators....................................................45

Intermediate Outcome Indicators...............................................................................................45
Enabling Environment Indicators...............................................................48
Program Indicators......................................................................................................................52
ROLES AND RESPONSIBILITIES BY GOVERNANCE LEVEL........................................................59
Central Office.............................................................................................................................59
Regional office...........................................................................................................................62
Schools Division Office..............................................................................................................64
Schools......................................................................................................................................66
VERTICAL AND HORIZONTAL LINKAGE PER GOVERNANCE LEVEL..........................................70
What Matters Most in Monitoring and Evaluation System for DepEd?....................................70
Education Value...........................................................................................................................70
Customer Value............................................................................................................................70
Academic Support........................................................................................................................71
Bureaucratic Support.....................................................................71
Towards Vertical and Horizontal Alignment in DepEd...............................................................71
Plan and Budget Systems Vertical and Horizontal Integration...................................................73
Vertical and Horizontal integration in Basic Education M&E......................................................76
MONITORING AND EVALUATION PROCESSES.........................................................................78
M&E Process..............................................................................................................................78
Established Indicators..................................................................................................................78
Data Collection............................................................................................................................79
Data analysis and synthesis.........................................................................................................81
Communicating M&E Results......................................................................................................84
Data Management.......................................................................................................................85
REFERENCES..........................................................................................................................87
ANNEX A: GLOSSARY.............................................................................................................88
ANNEX B: STRATEGIC M&E FRAMEWORK...............................................................................92
ANNEX C: SAMPLE LOGICAL FRAMEWORK MODEL VERTICAL AND HORIZONTAL LOGIC.........92

RATIONALE

The full implementation of the K to 12 program is a landmark reform that has changed the
landscape of the Philippine Education system and the quality and relevance of its graduates. This
reform and new policies from oversight agencies defining new ways of doing things in government
are grounded on the principle of good governance. Good governance links incentives to
performance and promotes greater transparency and accountability, which are the cornerstones of
public service.

NEW PLANNING AND BUDGETING ENVIRONMENT

The National Government has instituted strategic reforms to improve the system of planning and
budgeting, and ensure that taxpayers’ money is judiciously and optimally utilized for the common
good. Below are international commitments and national directions, policies, and directives that
influence the planning and budgeting process:

SUSTAINABLE DEVELOPMENT GOALS

The 2030 Agenda for Sustainable Development, adopted by all United Nations Member States in
2015, provides a shared blueprint for peace and prosperity for people and the planet, now and into
the future. At its heart are the 17 Sustainable Development Goals (SDGs), which are an urgent call
for action by all countries - developed and developing - in a global partnership. They recognize that
ending poverty and other deprivations must go hand-in-hand with strategies that improve health
and education, reduce inequality, and spur economic growth – all while tackling
climate change and working to preserve our oceans and forests.
In particular, DepEd supports and focuses its efforts toward the attainment of
SDG 4, which is to “ensure inclusive and equitable quality education and promote
lifelong learning opportunities for all”.

AMBISYON NATIN 2040

This is a long-term strategy of the national government in fighting poverty, which represents the
collective long-term vision and aspirations of the Filipinos for themselves and for the country in the
next 25 years. As such, it is an anchor for development planning across, at the very least, four (4)
administrations.

PHILIPPINE DEVELOPMENT PLAN
The Philippine Development Plan (PDP) 2017-2022 is the current Philippine government’s medium-
term plan anchored on the 10-point Socioeconomic Agenda and is geared towards the attainment
of Ambisyon 2040.

TEN-POINT BASIC EDUCATION AGENDA


The 10-point Basic Education Agenda 2016-2022 is a set of principles and priorities guiding the
current DepEd administration in providing quality, accessible, relevant, and liberating education.

NATIONAL GOVERNMENT FISCAL CALENDAR


The National Government Fiscal Calendar (DBM-DOF-NEDA Joint Circular No. 2017-1) – This is the
national policy which aims to strengthen the link between planning and budgeting through the
establishment of a unified calendar of fiscal activities in the national government, complemented by
a unified schedule of publications and reports, among others.

ANNUAL CASH-BASED APPROPRIATIONS

The Annual Cash-Based Appropriations (ACBA) is a budget reform wherein contractual obligations
are incurred, and payments for goods delivered and services rendered, inspected, and accepted are
disbursed, within the fiscal year.
However, projects with an implementation period exceeding twelve (12) months must secure a
multi-year obligation authority (MYOA) before entering into a multi-year contract.

PUBLIC EXPENDITURE MANAGEMENT


The Public Expenditure Management (PEM) is an approach to public sector budgeting that focuses
on outcomes and treats expenditures as a means to produce outputs in order to achieve the desired
outcomes. An effective PEM promotes the practice of fiscal discipline (spending within means),
allocative efficiency (spending on the right priorities), and operational efficiency (spending with
maximum results).

FREEDOM OF INFORMATION ORDER


Executive Order No. 2, series of 2016, or the Freedom of Information Order, provides the national
government policy framework that gives all citizens greater access to information in all entities
under the executive branch. It also mandates full disclosure of all executive branch transactions
involving the public interest.

The above internal and external developments favorably affect the education sector, and
complement the current programs and planned innovations of the Department in pushing for a
more programmatic and responsive delivery of programs to its ultimate target group – the learners.
This Basic Education Monitoring and Evaluation Framework (BEMEF) is integrally linked to the
planning and budget strategy of the Department. In particular, BEMEF shall require the explicit
identification and articulation of indicators and targets for measuring performance in the
development of strategic, medium-term, and operational plans of all DepEd operating units at all

governance levels. It intends to complement the planning and budget strategy by setting out the
framework for agency-wide monitoring and evaluation (M&E).
As the DepEd expands its efforts to improve delivery of basic education services, it also introduces
reforms by improving its internal processes and systems towards improved accountability and
transparency to its stakeholders. In this regard, it seeks to strengthen its evidence-based decision-
making. To further support the M&E of programs, projects, and major activities, DepEd restructured
its budgeting process through the Program Expenditure Classification (PREXC) introduced by the
Department of Budget and Management (DBM). This improves the planning, monitoring, and
evaluation of results to provide better programs, projects, and major activities.

LEGAL BASES
The Basic Education Monitoring and Evaluation (M&E) Manual is a reflection of the Department of
Education’s efforts to support the Philippine Government’s overall direction of a functionally
rationalized system of planning and budgeting for better accomplishment of the intended and
desired results of programs and projects. It is consistent and aligned with the government’s global
development cooperation commitments based on the principles, concepts and methods of
Managing for Development Results and Results-based Management.

POLICY CONTEXT

In terms of policy direction, the context of this Manual is based on the government’s direction to be
more responsive to the gaps between plans, budgets and results-based performance management.
The following provides the Legal Bases for the development of the Basic Education M&E Manual:

 Administrative Order (AO) No. 25 was issued by the President of the Philippines in 2012.
This was to support the need for a unified and integrated Results-based Performance
Management System (RbPMS). For this, an inter-agency Task Force was initially created.
This Task Force takes on the harmonization of national government performance
monitoring, information, and reporting systems.

Subsequently, several administrative orders have been issued that support RbPMS.

 The National Economic and Development Authority (NEDA) initiated the integration of the Results
Matrix (RM) with the Philippine Development Plan (PDP) 2011-2016 and PDP 2017-2022.

 The Department develops its programs, projects, and major activities primarily to attain the
outcomes defined by Republic Act No. 10533, or the Enhanced Basic Education Act
of 2013.

 As early as 2014, guidelines on the shift to an outcome-based Performance-Informed Budget
(PIB) for the Fiscal Year 2015 budget preparation were issued through National Budget Circular
No. 552, Series of 2014.

 This supports the government's efforts to strengthen the M&E systems of government
agencies, as provided in NEDA-DBM Joint Memorandum Circular No. 2015-01 (July 15, 2015), the
National Evaluation Policy Framework of the Philippines.

 This was followed by DBM National Budget Circular No. 565 dated December 2, 2016 (NBC
No. 565 s. 2016) on the Adoption of a Results-based Monitoring and Evaluation Reporting
(RbMER) Policy which aims to strengthen, streamline, and standardize the RbMER system
evidenced by a timely, useful, accurate, and credible reporting of performance information

in order to support policy and program improvement, expenditure management, and local
and national decision-making.

As a result, the Department of Education (DepEd) developed the Basic Education Monitoring and
Evaluation Framework (BEMEF) to guide DepEd operating units in the conduct of M&E activities
and assessment of office and individual performance in line with the aforementioned policies. This is
also in line with the establishment of the National Quality Management System (QMS) which aims
to enhance the organization’s capacity and internal systems and processes. This shall strengthen
evidence-based decision-making and policy formulation, which shall in turn improve the delivery of
services and the allocation and management of government resources, as well as strengthen
transparency and accountability in the basic education sector.

LINK TO MEDIUM-TERM AND STRATEGIC PLANNING

The Goal, Outcome, and Intermediate Outcomes presented in the Planning Framework are the core
of the agency’s aspirations. It is important that all policies, plans, programs, projects and activities
support what the system wants to achieve in the long, medium, and short term. While the
Investment Plans are purposely crafted to deliver the targets in the medium term, all such plans
should prepare the path towards achieving the long-term goal which is articulated in the Ambisyon
Natin 2040.

To effectively deliver and achieve the emerging priorities for the next six (6) years, new planning and
budgeting policies will be instituted to ensure that the formulation of strategic, investment, and
operational plans is aligned and synchronized.

PLAN FORMULATION

Basic education plans shall lay down the goal and outcomes of the Department and serve as the
roadmap in crafting the medium-term and operational plans. This shall direct the planning process
of DepEd operating units across governance levels in delivering basic education supports and
implementation of programs, projects, and activities. Plan formulation consists of three (3) major
phases: (1) Strategic Planning, (2) Investment Planning and (3) Operational Planning, as illustrated
below.

• Strategic Planning: one Strategic Plan – a six-year plan setting the strategic directions.
• Investment Planning: one Investment Plan – a six-year plan covering major programs and corresponding resources.
• Operational Planning: four rolling three-year plans (Y1–Y3, Y2–Y4, Y3–Y5, Y4–Y6) covering programs, projects, and major activities, operationalized through six annual plans (Annual Plan 1 to Annual Plan 6) covering activities.

Figure 1.0: Phases of Plan Formulation

The Strategic Plan provides the overall direction, long-term targets, and major strategies to be

pursued. Taking off from the Strategic Plan, the Investment Plan provides the policies, programs,
projects, and required resources covering a fixed six-year period for major programs in support of
the strategies identified in the strategic plan. The Operational Plan provides the detailed physical
plan, implementation arrangements, and the corresponding monthly financial and procurement
requirements. The diagram below shows the major elements of each planning phase and the
common elements that ensure the vision and strategic directions are operationalized through the
Investment and Operational plans.

Figure 2.0: Plan Formulation Components

STRATEGIC PLANNING

The strategic plan allows the offices to identify and anticipate the possible risks that would affect
the implementation of the plan. Like other government agencies, DepEd’s Strategic Plan was
formulated at the start of the new administration to strategically support the agenda of the new
President articulated in the Philippine Development Plan (PDP). It is vital that the Strategic Plan be
treated as a “living document” and is widely communicated across all levels to reinforce clarity of
objectives and secure the buy-in of internal stakeholders who are the prime movers of the Plan.
Being a “living document” serves the purpose of incorporating evolving and emerging priorities in
the agency’s overall thrusts and directions. It can also have direct implications on target-setting and
adjustments that must be made to make the Plan attuned with the new priorities.
Before each governance level can prepare its own respective strategic plans, the Executive
Committee, through the assistance of Planning Service, shall come up with the Department’s
strategic directions, based on a thorough analysis of the basic education situation, which shall serve
as the anchor for all strategic plans. This should be able to articulate the strategic priorities of the
present-day administration for the whole basic education sector.
Each governance level will formulate their respective basic education plans that should all lead
towards achieving the common goals of DepEd articulated in the National Basic Education Plan
(NBEP). While each Region, Schools Division, and School should contextualize its plan based on its
actual situation, strategies must all contribute or complement the national directions in terms of
policies, targets, and strategies.

Figure 3.0: Plan Linkages in DepEd. The figure shows how basic education plans cascade across levels:
• Global: Sustainable Development Goals 2030; Ambisyon Natin 2040
• National: Philippine Development Plan 2017-2022; National Basic Education Plan
• Regional: Regional Development Plan; Regional Basic Education Plan
• Schools Division Office: Division Education Development Plan
• School: School Improvement Plan

NATIONAL BASIC EDUCATION PLAN


The National Basic Education Plan (NBEP) is a six-year plan developed at the Central Office that
reflects specific strategies that will allow the field offices, particularly the Regional Offices, to
perform their mandate. The plan should contain strategies aimed at developing and improving national
policies, standards, and systems, and at addressing performance issues along key basic education indicators.
In this regard, the Central Office must ensure that Regional Offices are able to formulate their own
regional education frameworks, and customize policies and programs which reflect the values,
needs, and expectations of the communities they serve.
Prerequisite: Regional Basic Education Plans

REGIONAL BASIC EDUCATION PLAN


Likewise, the Regional Basic Education Plan (RBEP) is a six-year plan developed at the Regional level
which contains strategies on how priority directions, policies, programs, and systems will be
implemented in their respective regions, considering the unique learning situation of learners. While
looking at the formulation of regional education policy framework and standards, the RBEP must
contain enabling strategies that will allow the division offices to perform their mandate.
The Regional Office must ensure the capacity of the Divisions (SDS, ASDS, Supervisors) to
implement policies and provide technical assistance, training, and services to schools & learning
centers. Regional and Division Education Development Plans must also influence the education
priorities of the Regional Development Council (RDC).
Prerequisite: Division Education Development Plans

DIVISION EDUCATION DEVELOPMENT PLAN

The Division Education Development Plan (DEDP) is also a six-year plan developed at the Division level
which contains strategies on how assistance to schools and learning centers will be implemented. As
an example, one of the crucial mandates of the SDOs is to manage effective and efficient use of all
resources, including human resources.
Operationally, teacher deployment and appointment of school heads, as well as professional
development, are within the responsibility of the Division. It is now within their mandate to:
1. build the capacities of school heads so that the school heads can provide instructional
supervision and implement school-based management, and
2. ensure that teachers and learning facilitators are better qualified to deliver the curriculum.
The DEDP should be able to influence the education priorities of provincial and city/municipal
development plans. In the current setup, the learning centers for Alternative Learning System (ALS)
are lodged under the Division Office. Thus, planning for learning centers will be part of the DEDP.
Hence, community learning centers managed by DepEd will not formulate their own education
plans.
Prerequisite: School Improvement Plan

SCHOOL IMPROVEMENT PLAN


Pursuant to DepEd Order No. 44, s. 2015, which remains the basis for its preparation, the School
Improvement Plan (SIP) is a three-year roadmap that lays down specific interventions that a
school, with the help of the community and other stakeholders, will undertake within a period of three
(3) consecutive years.
At the school level, the SIPs must be able to articulate the school’s strategies on making the
teaching and learning process more effective and inclusive. It should also contain strategies on
participatory management, stakeholders’ collaboration, School Governance Councils, and School-
Based Management. The schools and learning centers are our front-liners, directly contributing to
DepEd’s intermediate outcomes.

INVESTMENT PLANNING

During this stage, policies, programs, and projects including basic education resources to
implement the strategic plan are determined and documented in the form of a written Investment
Plan anchored in the six-year Philippine Development Plan (PDP) of the current administration. This
is where each operating unit at each level of governance commits its stake and contributions in
attaining the objectives of the strategic plan which is anchored both on the long-term and team
visions. This is also where the accountabilities per governance level are determined per program. In
this way, Investment Planning ensures the vertical and horizontal integration of the plans.
The Investment Plan shall be the sole source for the Basic Education Sector’s contribution to the
Public Investment Program of the National Economic and Development Authority, as well as the
RO’s submission to the Regional Development Investment Program, as endorsed by their respective
RDCs. Furthermore, any proposal to and from development partners, NGOs/CSOs, and other
external funding agencies should be based on the Investment Plan.
Depending on mandate, each governance level must be able to determine the following in

preparing the Investment Plan:
 Development and/or enhancements of education policies, and
 Programs/projects development and implementation, such as curriculum implementation,
assessment, learning materials development, technical and training assistance,
procurement of services and learning equipment, and provision of school facilities and
infrastructure.

Figure 4.0: Conceptual Representation of Vertical and Horizontal Integration in DepEd

Based on the strategic interventions identified in the strategic plan, each level of governance should
determine the corresponding programs to implement the strategies. A program profile must be
developed for each program in order to elaborate its implementation details, including its logical
framework, target beneficiaries, implementation arrangements, financial breakdowns, among
others. Such details shall serve as the content of the Investment Plan.
All programs under Operations shall have their own Investment Plans, which shall form part of the
overall Investment Plan of the Central Office. For the ROs and SDOs, their Investment Plans should
capture their respective contributions to the Central Office's Investment Plan to ensure the vertical alignment of the
Investment Plans. They may also add their own soft programs to their Investment Plan as deemed
necessary, provided there is proper justification.
The Investment Plan shall be subject to a mid-term review after the first three (3) years of
implementation. The remaining years of the Investment plan must be adjusted accordingly, based
on the results of the mid-term review.

OPERATIONAL PLANNING

Operational Planning is the process of operationalizing the Strategic and Investment plans. Each
operating unit will conduct two (2) levels of operational planning:

 Annual Plan
 Three-Year Rolling Plan

ANNUAL PLAN
Annual planning includes the process where each operating unit will prepare a detailed
implementation plan of programs and activities for the period of one year; it is operationalized
through budget execution documents consisting of the Work and Financial Plan (WFP) and the
Project Procurement Management Plan (PPMP).

The Risk Treatment Plan identified in the TYRP must be reflected in the annual plans, especially if
there are funds needed to address such risks.

THREE-YEAR ROLLING PLAN


The Three-Year Rolling Plan (TYRP) involves the process wherein each operating unit shall identify
the major outputs to be produced in a three-year scope, in consideration of how these outputs shall
support or contribute to the Investment Plan.

For the TYRP, the current year is considered as the baseline year and the next three (3) years will be
planned for through physical and financial targets. All operating units are required to prepare their
TYRP, from which all annual plans will emanate. The TYRP must have a corresponding Risk
Treatment Plan which evaluates the likelihood and possible impact of the risks identified and sets
the respective treatments thereof.
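
To make the structure of a TYRP concrete, the sketch below models a rolling plan as plain data: a baseline year plus three planned years, each carrying physical and financial targets, together with a risk treatment entry evaluating likelihood, impact, and treatment. This is a minimal illustrative sketch only; the field names and figures are assumptions, not prescribed DepEd formats.

from dataclasses import dataclass, field
from typing import List

@dataclass
class YearTarget:
    year: int
    physical_target: int       # e.g., number of major outputs to deliver
    financial_target: float    # corresponding budget requirement (PHP)

@dataclass
class RiskTreatment:
    risk: str
    likelihood: str   # e.g., "low", "medium", "high"
    impact: str
    treatment: str

@dataclass
class ThreeYearRollingPlan:
    operating_unit: str
    baseline_year: int                                          # current year serves as the baseline
    targets: List[YearTarget] = field(default_factory=list)     # the three succeeding years
    risk_treatment_plan: List[RiskTreatment] = field(default_factory=list)

# Illustrative example only; names and figures are hypothetical.
tyrp = ThreeYearRollingPlan(
    operating_unit="Sample Schools Division Office",
    baseline_year=2019,
    targets=[
        YearTarget(2020, physical_target=12, financial_target=8_500_000.0),
        YearTarget(2021, physical_target=14, financial_target=9_200_000.0),
        YearTarget(2022, physical_target=15, financial_target=9_800_000.0),
    ],
    risk_treatment_plan=[
        RiskTreatment(
            risk="Delayed downloading of program funds",
            likelihood="medium",
            impact="high",
            treatment="Front-load procurement activities in the first quarter",
        )
    ],
)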

BASIC MONITORING AND EVALUATION (M&E) CONCEPTS

DEFINING MONITORING AND EVALUATION

Generally, the core concepts of this manual are used as follows:

Monitoring: Periodic or routine tracking and reporting of priority information about a project or program, its inputs and intended outputs, outcomes and impacts.

Evaluation: An assessment, as systematic and objective as possible, of an ongoing or completed project, program or policy, its design, implementation and results (OECD-DAC from NEDA, 2015).

Performance Management: The systematic and cyclical process of planning, monitoring, review and evaluation (self-assessment and independent) of projects, programs and policies with a view to continually improving aid effectiveness (AusAID, 2012).

Monitoring and evaluation are processes that both refer to the measurement of the performance of
an organization, a program, a project, or an individual. These are complementary yet distinct
processes depending on the purpose, focus, and approach used when they are conducted. The
activities involved in monitoring and evaluation are often intertwined, but clear distinctions exist
between the two.
Monitoring explains the efficiency and effectiveness of operations while evaluation provides
information on the benefits achieved. Results of monitoring provide bases for critical management
decisions such as resource allocation or realignment, target setting, remedial/corrective actions or
strategy development. On the other hand, evaluation results provide valuable lessons and insights
that can be used by managers in crafting strategic decisions for the future such as in designing
organizational changes or future programs and/or projects.
Both are integral management tools used for different purposes. Because both follow a similar
process such as identifying performance indicators, data collection, data analysis, among others,
monitoring complements evaluation. Specifically, the monitoring process may generate questions
that can be addressed by evaluation. Also, evaluation draws heavily from data generated through
monitoring, including baseline data, information on the program or project implementation
process, and measurements of results.

DEFINITION OF GENERAL TERMS

For the purpose of the Basic Education Monitoring and Evaluation Manual, the Basic Education
Monitoring and Evaluation Framework established the definition of the following terms:

Accountable office any decision-making unit of DepEd at the national, regional, school

division, or school levels in charge of providing directives and
determining strategies to achieve agency performance targets.

Activity a work process which contributes to the implementation of a program,


sub-program, or project. The objectives of an activity typically have
corresponding tangible and quantifiable outputs. An activity must be
anchored on a program/project, and made consistent with DepEd's
vision and mission. Activities are significantly limited in scope and
much shorter in lifespan than either programs or projects. Activities
may be initiated by the CO, ROs, SDOs, schools, or by external
partners. Activities are generally evaluated against the
expected/identified outputs immediately after they have ended.

Annual Plan composed of the a) Work and Financial Plan (WFP) which consists of
the i) Physical Plan, ii) Monthly Obligation Program, and iii) Monthly
Disbursement Program; b) Project Procurement Management Plan
(PPMP) with technical specifications.

Basic Education Plan refers to strategic, medium-term, and operational plans developed by
DepEd operating units across governance levels.

Database structured set of data and information of each DepEd operating unit
gathered from its M&E activities that is easy to access, manage, and
update.

DepEd Operating any DepEd unit across governance levels which provides support
Unit and/or implements programs, projects, and major activities relative to
the delivery of basic education in line with the provisions of R.A. 9155.

Indicator quantitative or qualitative factor or variable that provides a simple and


reliable means to measure achievement, to reflect the changes
connected to an intervention, or to help assess the performance of a
program, project and/or activity. Indicators are specific, tangible, or
quantifiable measures of accomplishments or, if unquantifiable,
qualitative specifications of achievements.

Information system an organized system for the collection, organization, storage, and
communication of information

Input the financial, human, and material resources needed to produce an


output.

Investment Plan a six-year plan at the Central, Region, and Schools Division levels
describing major programs and projects that will be implemented in
support of the objectives and outcomes identified in the strategic plan.
It spells out the major outputs and inputs needed to implement the
strategic plan. It takes into account the Department’s Medium-Term
Expenditure Framework (MTEF) and requirements for Multi-year
obligation authorities (MYOAs) for capital works and eligible services.

M&E system a set of organizational structures, management processes, standards,
strategies, plans, indicators, information systems, reporting lines and
accountability relationships which enables offices across governance
levels to perform their M&E functions effectively.

M&E tools instruments used to collect information during the conduct of


monitoring and evaluation.

Operational Plan these plans are composed of a) the three-year rolling plan (TYRP),
which contains the major outputs the operating unit will deliver for
the three succeeding years, aligned to the Investment Plan, and
b) annual plans, which contain the details of how the major outputs
scheduled for that fiscal year will be produced. The latter takes into
account allocated budget from government entities, signed
partnerships, and Multi-year obligation authorities (MYOAs).

Outcome intermediate effects of output(s) on clients.

Output products and services produced through utilization and processing of


inputs

Overall Lead the staff with the overall authority, accountability, and responsibility
for the M&E system at each governance level

Process owner the office that will oversee and manage the conduct of the M&E
system per governance level

Program strategic intervention anchored on DepEd’s mandate, goals, and


national policies the implementation of which constitutes or supports
the Department’s core business.

Project intervention that is relatively narrower in scope compared to a


program. A project yields more immediate results for specific target
groups. A project may be a component of and/or anchored on a
program, or independent of any program. It may be designed or
implemented in support of a program, to address learners’ and other
needs not covered by any program, or to experiment/try out/pilot an
innovation or an innovative solution. Projects have specific budgets,
timeframes, monitoring & evaluation (M&E) results, and targets; follow
defined schedules/work plans; and are implemented through activities

Responsible office a DepEd operating unit at the national, regional, school division, or
school level in charge of executing tasks or deliverables

Results-based M&E continuous process of collecting and analyzing information to compare


how well DepEd programs, projects, and activities are performing
against their expected outcomes or results.

Strategic Plan a six-year plan developed at the Central, Regional, and Schools Division
levels which contains the strategic directions and priorities of the

incumbent administration for its respective level of governance, and is
based on a thorough analysis of the prevailing basic education
situation. It communicates the results or outcomes the organization
wants to achieve, and the strategies it will adopt to reach those
outcomes.

Please refer to Annex A for the Glossary of Terms.

MONITORING AND EVALUATION FRAMEWORK AND PRINCIPLES

BASIC EDUCATION MONITORING AND EVALUATION FRAMEWORK

BEMEF addresses the gaps between outcomes and goals.

With the results-based orientation of the Basic Education Monitoring and Evaluation Manual, the
BEMEF requires DepEd to be more responsive to demands from internal and external
stakeholders for good governance, accountability and transparency, effective and efficient delivery,
and the eventual attainment of its development outcomes.

Figure 5.0: Basic Education Planning and Monitoring and Evaluation Framework

BEMEF is grounded on the Department’s mandate, vision (long-term and team vision), and mission,
as well as the Philippine Government’s various commitments and targets with respect to basic
education.
It presents what DepEd envisions for the Filipino learners and for the department, as well as the
strategic directions it plans to pursue to achieve this vision. It reflects the Department’s theory of
change; that is, the change that the department wants to achieve in the lives of the Filipino learners
through the provision of basic education.

The planning framework highlights the means and ends and linkages of DepEd's various interventions. It is organized into four (4) levels:
• Goal
• Outcome
• Intermediate Outcomes
• Enabling Environment

Differentiating Impact Evaluation from Measurement of Performance
Frequency. Impact evaluation tends to be episodic, while performance measurement should be ongoing.
Issue(s). Impact evaluation is frequently driven by specific stakeholder concerns (e.g., How much did the curriculum attain learner competence by area?). Performance measurement deals with specific performance issues (e.g., Is a program accomplishing its output, quality and governance objectives?).
Attribution of outcomes. Impact evaluation is concerned with the achievement of the outcome (results), but it considers attribution to external factors, while in performance measurement, attribution is generally assumed.
Source: Kettner, Moroney and Martin (2017)

GOAL

This pertains to the contribution of the Department in achieving societal aspirations in terms of improving the state of basic education in the country and the global community. It is aligned with the goal of the national government.

OUTCOME

This describes the conditions and characteristics of the learners the organization wants to produce – what we want our learners to be after completing basic education.

INTERMEDIATE OUTCOMES

This concentrates on what the organization endeavors the learners to have in terms of access, attendance, participation, completion, and achievement.

IO 1. Learners are in schools and in learning centers.
All learners, regardless of sex, religion, geography, and financial disposition, have access to and are encouraged to attend schools and learning centers. This ensures that schools and learning facilities are made accessible to all types of learners.

IO 2. Learners access programs responsive to their needs and


consistent with their interests and aptitudes.
Learners’ needs, interests, and aptitudes are taken into consideration
in the development and implementation of the Department’s policies
and programs.

IO 3. Learners actively participate in a learner-friendly environment.


Schools and learning centers are learner-friendly and encourage active participation, consistent
with the Department's mission to provide a child-friendly, gender-sensitive, safe, and motivating environment.

This acknowledges that learner-friendly environments are pivotal in the well-being of learners.

IO4 Learners complete education and attain learning standards


Programs and initiatives are implemented to ensure that learners benefit from their attendance in
school or in any educational intervention and are learning and gaining knowledge.

It recognizes that attendance in schools and learning centers is not, by itself, sufficient to deliver
quality basic education.

For the manual, the following student measures were identified and established.

Established Indicators for Intermediate Outcome

• ALS Completion Rate
• ALS A&E Passer Rate
• Completion Rate
• Cohort Survival Rate
• Gender Parity
• Graduation Rate
• Gross Enrollment Rate
• Net Enrolment Rate
• School Leaver Rate
• Transition Rate
• Participation Rate
• Percentage of learners enrolled in special programs
• Promotion Rate
• Proportion of students performing at proficient level
• Repetition Rate
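
The Manual lists these indicators but does not reproduce their computation rules here. As a hedged illustration only, the sketch below computes two of the rates using their standard definitions; the figures are invented, and the exact DepEd computation rules (cut-off ages, data sources, adjustments) are assumptions rather than quotations from this Manual.

def net_enrolment_rate(enrolled_official_age: int, population_official_age: int) -> float:
    """Standard definition assumed: learners of official school age enrolled in a level,
    as a percentage of the population of that official age group."""
    return 100.0 * enrolled_official_age / population_official_age

def cohort_survival_rate(reaching_final_grade: int, cohort_enrolment: int) -> float:
    """Standard definition assumed: share of an entering cohort that reaches the final
    grade of the level; DepEd may apply additional adjustments."""
    return 100.0 * reaching_final_grade / cohort_enrolment

# Illustrative figures only
print(net_enrolment_rate(enrolled_official_age=9_200, population_official_age=10_000))  # 92.0
print(cohort_survival_rate(reaching_final_grade=820, cohort_enrolment=1_000))           # 82.0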

ENABLING ENVIRONMENT
The Enabling Environment refers to the conditions that need to be created in the different levels of
the organization to attract, motivate and make the systems and processes conducive for facilitating
the attainment of learning outcomes.

EE 1: Education leaders and managers practice participative and inclusive management

Improved capacity on participative and inclusive management processes of education leaders and
managers.
It ensures that design and implementation of professional development programs are relevant and
appropriate to the required skills and competencies.

EE 2: Investments in basic education provide learners with the ideal learning environment
Sufficient provision and equitable distribution of education resources. It also recognizes the gaps
and immediate measures that need to be considered in prioritizing resources in the midst of
competing education priorities to achieve equity and impact.

EE 3: People, internal systems and processes serve learners better through continuous
improvement efforts
Enhanced people's capacity, functional and leadership competencies, internal system and
processes to efficiently and effectively deliver basic education services.

EE 4: Key stakeholders actively collaborate to serve learners better

Ensured the collaborative engagement with key stakeholders to achieve basic education goals.
This will be delivered through functional mechanisms to make partnership building and linkages
more strategic and aligned to DepEd priorities.

For the manual, the following system and governance measures were identified as part of the
Enabling Environment indicators:

Established Indicators for Enabling Environment

• Client satisfactory rating of DepEd offices' respective stakeholders
• Disbursement Rate
• Interquartile ratio (IQR)
• Proportion of Authority to Conduct (ATC) requested over total financial transactions
• Proportion of DepEd policies compliant with the policy development process
• Proportion of offices across governance levels with very satisfactory rating in the Office Performance Commitment and Review Form (OPCRF)
• Proportion of financial contribution of development partners vis-a-vis national education budget
• Proportion of schools at SBM Level 3 (Highly Proficient)
• Proportion of schools with functional School Governing Council (SGC)
• Proportion of teachers meeting the PPST career stage 3 in all domains (Highly Proficient)
• Timely delivery of procured projects
• Proportion of students performing at proficient level
• Special Education Fund (SEF) Utilization Rate
• Proportion of schools with connection to electricity
• Proportion of schools with computer package
• Proportion of schools with internet access
• Proportion of schools with functional library
• Learner to classroom ratio
• Learner to learning material ratio
• Learner to seat ratio
• Learner to Teacher ratio
• Learner to Water & Sanitation facility ratio
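
Several of these enabling environment indicators are simple resource ratios. The snippet below shows how a learner-to-classroom and a learner-to-teacher ratio might be computed for one school; it is an illustrative sketch with hypothetical figures, not an official DepEd computation rule.

def resource_ratio(learners: int, resource_count: int) -> float:
    """Learners per unit of resource (classroom, teacher, seat, etc.)."""
    return learners / resource_count

# Hypothetical school-level figures
enrolment = 1_350
print(round(resource_ratio(enrolment, 30), 1))   # learner-to-classroom ratio: 45.0
print(round(resource_ratio(enrolment, 42), 1))   # learner-to-teacher ratio: about 32.1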

MONITORING AND EVALUATION (M&E) PRINCIPLES

BEMEF has established the following as essential M&E Principles:


 M&E should contribute to improved governance: Increased transparency and
accountability among education leaders and implementers, increased participation and

inclusion of various interests of all types of education stakeholders, and overall
institutionalization of evidence-based decision-making in the education sector.

 M&E should be development-oriented: Should be applied to improve organizational and


individual performance and contribute to continuous learning and improvement towards
better delivery of education services.

 M&E should be undertaken ethically: Sensitive and responsible implementation of M&E


processes with respect towards privacy, values, and culture of involved stakeholders. Fair
and balanced reporting that provides an impartial account of findings.

 M&E should be utilization-oriented: Data and information is strategically gathered to


responsively meet the needs of the organization. An accessible and organized central
repository of M&E reports, data, and indicators is maintained for strategic utilization
during planning, policy development, program designing, and resource allocation.

 M&E should be methodologically-sound and appropriate: The conduct of M&E follows


an established set of standards guided by a common set of organizational indicators. The
M&E methods and/or approaches to be used are fit-for-purpose to the nature and state of
intervention applied. M&E findings should be systematically analyzed and triangulated for
improved credibility.

 M&E should be operationally-effective: M&E will be effective and properly managed,


embedded in implementation plans, allocated sufficient resources, and given a defined
scope and clear purpose to make it resilient to administrative changes.

 M&E should be a shared responsibility: M&E is a critical element for an organization to


achieve its goal. As members of an organization with a similar goal, each has a
responsibility to conduct M&E that will support evidence-based decision-making towards
better achievement of the goal.

TYPES OF MONITORING AND EVALUATION (M&E)

MONITORING AND EVALUATION MECHANISMS

Different mechanisms exist for the conduct of M&E. It depends on the objective, scope, and level of monitoring function of those who will perform the task.

In most instances, the conduct of M&E is primarily anchored on the logframe of the organization. In DepEd's case, there are different plans to consider across the governance levels. What is essential is to establish a clear plan, and later on try to develop a logframe.

Essential elements for Results in a LogFrame:
• Indicators. This refers to What is to be Measured, not What is to be Achieved.
• Target(s). The desired value or direction for progress.
• Milestones. The path towards the target/s.
• Baseline. The starting point, which is essential for target setting.
• Sources. Where will the information come from.
• Assumptions. Potential external influences.

Figure 6.0: Sample Logical Framework Model (Source: AusAID, 2005)

DepEd's plans and their sub-components are expected to lead to the creation of many results. For
the purpose of this manual, results are describable and measurable changes resulting from a
cause-and-effect relationship. Different levels of results seek to capture different development
changes. The results chain essentially establishes
what is intended to be achieved, the underlying reasons, and the process to go about it.
As presented in Figure 6.0, Goals or Impact (sometimes referred to as vision) are the longer-term goals that
appear in the logframe, while Purpose or Outcomes are actual or intended changes in
development conditions that interventions are seeking to support. They pertain to medium-term
development results created through the delivery of outputs and the contributions of various
partners and non-partners. Outcomes provide a clear vision of what has changed or will change in
basic education or in specific governance levels within DepEd. Component Objectives or
Intermediate Results provide the link between the outputs and the outcome. Outputs are short-
term development results produced by specific activities. They must be achieved with the resources
provided and within the time-frame specified.
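
To show how the results chain and the essential LogFrame elements (indicators, targets, milestones, baseline, sources, assumptions) fit together, here is a minimal sketch of a logframe represented as nested data. The levels mirror the chain described above (goal, outcome, intermediate result, output); every name and value is a hypothetical placeholder, not an actual DepEd logframe entry.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Indicator:
    name: str
    baseline: float                                        # starting point, essential for target setting
    target: float                                          # desired value or direction for progress
    milestones: List[float] = field(default_factory=list)  # path towards the target
    source: str = ""                                       # where the information will come from
    assumptions: str = ""                                  # potential external influences

@dataclass
class ResultLevel:
    level: str                                             # "goal", "outcome", "intermediate result", or "output"
    statement: str
    indicators: List[Indicator] = field(default_factory=list)

# Hypothetical fragment of a results chain
logframe = [
    ResultLevel("output", "Learning materials delivered to schools",
                [Indicator("Learner to learning material ratio", baseline=3.0, target=1.0,
                           milestones=[2.0, 1.5, 1.0], source="Division supply records",
                           assumptions="Procurement proceeds on schedule")]),
    ResultLevel("intermediate result", "Learners complete education and attain learning standards",
                [Indicator("Completion Rate (%)", baseline=83.0, target=90.0,
                           milestones=[85.0, 87.5, 90.0], source="LIS/EBEIS")]),
]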

Indicator Traps
• Excessive Complexity
• Indicators Imprecision
• Indicators Overload
• Outputs fixation
As previously discussed and presented, BEMEF reflects the organizational logframe of the
Department and is therefore the main guide for the conduct of M&E in DepEd. To identify the appropriate
M&E mechanisms, tools, and approaches to use, it is important to have a clear understanding of the
difference between monitoring and evaluation, and the different types of M&E being conducted at
each stage of the implementation of programs, projects, and major activities.
There is increasing demand for M&E, but effectively integrating results into decision-making
remains a challenge. It is proposed that DepEd and its operating units at all governance
levels continue to value M&E information. An environment of accountability, continuous learning,
and results-based management may be needed for M&E. What may be useful is that, across the governance levels,
individuals and offices share data, learning, and knowledge openly, where constituent feedback
about what is necessary and what success looks like is essential to strategy, operations, and policy.

TYPES OF MONITORING AND EVALUATION

The following types of monitoring and evaluation are currently being used by DepEd across
governance levels.

Figure 7.0: LogFrame and Monitoring (Source: IFRC, 2011)

Monitoring, the routine collection and analysis of information to track progress against set plans
and check compliance with established standards, is useful for DepEd to ascertain trends and
patterns, adapt strategies, and inform decisions for project/program management. The figure above
presents key monitoring questions as they relate to the LogFrame's objectives. It is essential to
focus more on the lower-level objectives – inputs, activities and (to a certain extent) outputs –
because the outcomes and goal are usually more challenging changes (typically in knowledge,
attitudes and practice/behaviors) to measure, and require a longer time frame and a more focused
assessment provided by evaluations.

READINESS MONITORING
This is a quality assurance mechanism designed to ensure the availability of all inputs and
requirements necessary to start and sustain an efficient operation. The results of the readiness
monitoring shall be used to identify the needed support of DepEd operating units in the
implementation of the basic education plans.

PROGRESS MONITORING
This is a systematic and objective assessment of an on-going implementation of plans, programs,
projects, and major activities. It aims to steer implementation as efficiently as possible based on
empirical facts determined through verifiable assessment process, systematic observation and
documentation.
Progress monitoring may be done on a weekly, monthly, or quarterly basis depending on the M&E
Plan. This also determines any adjustment of plans and activities needed to achieve the committed
targets.

All DepEd operating units shall conduct progress monitoring of their respective programs,
projects, and major activities. This may require M&E tools for data collection, and the concerned
operating unit shall be responsible for developing relevant and appropriate tools to assess the
implementation of its programs, projects, and major activities. Progress monitoring results
immediately inform the program, project, and activity implementers of the necessary adjustments
in their plans so they can achieve their target outputs and outcomes. Results primarily focus on
operational concerns that affect the implementation of programs or projects, which may include
the need for additional funding support, adjustments in logistical arrangements, and other related
concerns.

In DepEd, progress monitoring is done through the Program Management Information System (PMIS). The PMIS is the official source of data on programs, projects and activities (PPAs) of the Department of Education from planning to implementation. It aims to support the effective and efficient management of plans and programs; increase the transparency of plans and programs at all levels of governance; provide a platform that encourages a more careful and systematic preparation of plans and utilization of budget; aid in policy formulation and decision-making; and enforce standards for planning and plan implementation.
While there are various activities that may be conducted in performing progress monitoring, the
Department establishes the conduct of a Program Implementation Review (PIR) as the main
modality to measure the performance of programs, projects, and major activities within and across
the organization.
The PIR is conducted on a quarterly basis and tracks the accomplishments of outputs in terms of
efficiency, effectiveness, quality, and corresponding utilization of the budget. Through this
mechanism, DepEd obtains timely information about the performance of programs, projects, and
major activities and allows it to provide timely response to bottlenecks, constraints, and challenges
affecting the delivery of basic education services.
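
To illustrate the kind of arithmetic behind such a review, the following is a minimal Python sketch using purely hypothetical figures; the output targets, allotment, and obligation amounts are illustrative assumptions, not DepEd data.

# Illustrative computation (hypothetical figures) of two quantities a quarterly
# PIR typically tracks: physical accomplishment against target outputs and
# budget utilization against allotment.
quarter = {
    "target_outputs": 120,       # e.g., planned classroom repairs for the quarter
    "actual_outputs": 102,
    "allotment": 50_000_000.0,   # hypothetical amount in pesos
    "obligations": 41_500_000.0,
}

physical_rate = 100 * quarter["actual_outputs"] / quarter["target_outputs"]
budget_utilization = 100 * quarter["obligations"] / quarter["allotment"]

print(f"Physical accomplishment: {physical_rate:.1f}% of target")
print(f"Budget utilization: {budget_utilization:.1f}% of allotment")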

Like the PIR, the Monitoring, Evaluation and Adjustment (MEA) is designed to be a periodic monitoring and evaluation of the progress of Regional Offices (ROs) and Schools Division Offices (SDOs) in their delivery of education services, as well as their performance as units within the DepEd system. This fosters participation of the various functional and administrative units of ROs and SDOs in all the M&E processes, with the endpoint of providing management with the information it needs to decide on adjusting its work plans. Guidelines for the conduct of the RMEA at the regional level, the DMEA at the SDO level, the DsMEA at the district level, and the SMEA for schools are clearly established and disseminated accordingly.

Regional Monitoring, Evaluation and Adjustment (RMEA)


To promulgate the “check and balance” system required of a rationalized agency, DepEd Regional Offices conduct the Regional Monitoring, Evaluation and Adjustment (RMEA) quarterly. Specifically, the MEA intends to:
 Identify major bottlenecks in the delivery of basic education services that prevent DepEd from “getting all school-age learners to school, keeping them in school, and ensuring all children who are in school acquire quality learning”;

 Agree on the scope of technical assistance the RO will provide to the SDOs and, in turn, to the schools, related to improving access, quality and governance;

 Customize policies and programs on increasing access, quality and governance to respond to the unique issues and concerns affecting the different divisions and schools within the region; and

 Formulate a set of policy recommendations and/or adjustments to existing policies, programs or strategies, as well as systems supporting increased access, quality and governance.

With the vision and strategic direction established by the Central Office, RMEAs are intended to foster accountability, as well as a check and balance system, by reviewing the following:
 Physical and Financial Accomplishments
 Unaccomplished Outputs
 Other Value-Added Outputs
 Concerns/Issues/Gaps/Challenges of all Functional Divisions and Regional Units
This activity is spearheaded by the Quality Assurance Division, one of whose functions is drawn from the Monitoring, Evaluation and Adjustment (MEA) for all divisions and districts of the region. Part of this activity is the presentation of the DMEA Reports by the Chiefs of SGOD and CID; other participants from the SDOs are one (1) Education Program Supervisor each from SGOD and CID, the SEPS for M&E, and the EPS II for M&E.

Division Monitoring, Evaluation and Adjustment (DMEA)

School Division Offices (SDOs) are expected to conduct the Division Monitoring, Evaluation and Adjustment (DMEA) prior to the actual conduct of the RMEA.
The Chief of SGOD shall present the DMEA Report, and the Chief of CID shall present the consolidated report of the SMEA as well as the District MEA, which includes all pertinent quantitative information, education resource reporting, qualitative information (including anecdotal feedback, issues and lessons learned), as well as recommendations for plan adjustment.
The DMEA includes a quarterly report on the CIGPs related to:
 Six (6) SBM Domains
 District Performance Indicators
 District and School Best Practices
 Unresolved CIGPs by District and/or School
 Progress on PAPs in the District, School and Functional Division/Unit Performance Targets
and Progress

District Monitoring, Evaluation and Adjustment (DsMEA)


The DsMEA is spearheaded by the DsMET or the CMET. It is headed by the PSDS or a designated school principal assigned in the district for medium and large divisions, while Area Consultants facilitate the CMEA for smaller divisions. School Heads of private schools and external stakeholders are also part of this activity.
The quarterly report of the DsMEA focuses on CIGPs related to the following SBM domains:
 Leadership (Instructional and Administrative)
 Curriculum, Instruction, Learning Resources and Assessment
 Human resource and Team Development
 Learning Environment
 Finance and Resource Management
 Governance and Accountability
 School Performance Indicators
 Learning Outcomes
 Unresolved CIGPs
 School Best Practices

School Monitoring, Evaluation and Adjustment (SMEA)


The SMEA is spearheaded by the SMET. For DepEd schools, this is headed by the school principal, with a faculty leader and a Master Teacher as members; for private schools, the SMEA is facilitated by the School Head or his/her assistant and supervised by the PSDS.

SMEA for public schools shall be held on the school campus, with the participation of teachers and stakeholders. It focuses on the discussion of curriculum and learning, administrative, and physical facilities concerns of the school. Outright resolution of the deliberated CIGPs is encouraged during the SMEA to attain timely adjustments to the targets and plans of teachers related to classroom instruction and school operations.
The quarterly report of the SMEA focuses on the following CIGPs:
• Assessment of Learning
• Curriculum Management
• Desirable Strategies for Effective Teaching-Learning Process Employed by
Teachers
• Learning Delivery
• Learning Outcomes
• Learning Resources
• School Best Practices
• Unresolved CIGPs

Figure 8.0 LogFrame and Evaluation


Source: IFRC (2011)

Evaluations involve identifying and reflecting upon the effects of what has been done, and judging their worth. Their findings allow proponents, implementers, clients, partners, donors and other stakeholders to learn from the experience and improve future interventions. For evaluation, the key questions as they relate to the LogFrame’s objectives tend to concentrate more on how things have been performed and what difference has been made.

PROCESS EVALUATION
Process evaluation ascertains the effectiveness and efficiency of the implementation processes and
systems. This could be conducted at any phase of the plan implementation and could be combined
with other types of monitoring. Through this evaluation, issues and challenges in program, project,
and activity deliveries can be addressed.

RESULTS EVALUATION
This is an M&E approach that focuses on measuring the realization of results. It seeks to assess the outcomes and changes brought about by program or project interventions. Findings from this type of evaluation are used as the baseline situation for the next planning cycle. Table 1 presents the types of evaluation that shall be conducted within the agency:

Table 1.0 Types of Results of Evaluation


Evaluation: Outcome Evaluation
Description: Focuses on the changes in comprehension, attitudes, behaviors, and practices that result from program activities; may include both short- and long-term results
Period of Conduct: After the program has made contact with at least one person or group in the target population
Results Usage: To decide whether or not the program affected participants’ outcomes; and to establish and measure the actual benefits of the program

Evaluation: Impact Evaluation
Description: Focuses on long-term, sustained changes as a result of the program, whether positive or negative, including intended and unintended results
Period of Conduct: At a certain period at the end of the implementation of programs, projects, major activities, and policies
Results Usage: To provide evidence that may be used in further policy and program development
All DepEd operating units across governance levels shall conduct process evaluation. However, the
conduct of outcome and impact evaluations of programs, projects, and major activities shall be
done by internal and external parties across governance levels not involved in the implementation
of particular plans and programs.

MONITORING AND EVALUATION OF BASIC EDUCATION PLANS

Basic education plans shall lay down the goal and outcomes of the Department and serve as the roadmap in crafting the medium-term and operational plans. M&E of basic education plans is essential in determining the extent to which the program, project, and activity goal and outcomes reflected in the plan are on track, and in making any needed adjustments accordingly. It enables informed decision-making regarding operations management and basic education service delivery, and ensures the most effective and efficient use of resources.

Figure 9.0 Basic Education Planning Types of Monitoring and Evaluation


Through M&E, DepEd operating units across governance levels can measure the extent to which programs, projects, and activities are achieving the desired outcomes and outputs. The M&E of basic education plans happens during and after their implementation.

Table 2.0 Types of Evaluation by Plan


Evaluation: Readiness Monitoring
Description: Readiness and availability of all inputs and requirements towards the achievement of the DepEd strategic goal and outcomes are monitored to start and sustain the Department’s operations at all governance levels.
Period of Conduct: Investment Planning
Results Usage: To identify capacity needs and resources.

Evaluation: Progress Monitoring
Description: Periodic and continuous monitoring of financial and physical outputs of annual and three-year plans of DepEd operating units across governance levels.
Period of Conduct: Operational Planning
Results Usage: To collect implementation feedback; to implement corrective measures during implementation.

Evaluation: Process Evaluation
Description: Review of the process of implementation of any type of basic education plan at any point during the implementation period.
Period of Conduct: Strategic Planning, Investment Planning, and Operational Planning
Results Usage: To facilitate periodic program reporting; to utilize information to improve future activities.

Evaluation: Results Evaluation
Description: Achievement of the long-term objectives defined in the strategic, investment, and operational plans is evaluated.
Period of Conduct: Strategic Planning, Investment Planning, and Operational Planning
Results Usage: To document program outcome and impact.

Given these basic education plans, all DepEd operating units are required to prepare corresponding M&E plans to ensure the achievement of the Department’s goal and outcomes. Each type of M&E shall be conducted correspondingly, based on each type of basic education plan.

OTHER TYPES

There are other M&E approaches that can be utilized to further build on DepEd’s existing M&E practices.

Source: Josselin and Le Maux (2017)
Figure 10.0 Types of M&E by Stage

DepEd can conduct M&E in terms of needs, design, inputs and outputs, and short- and long-term outcomes.

CONTEXT ANALYSIS
At this stage, M&E gathers information and determines needs. For instance, it may reveal a high rate of school dropout among young people in a given area. Needs can be defined as a desire to improve current outcomes or to correct them if they do not reach the required standard. Policy design, in turn, is the definition of a course of action intended to meet the needs.
A program may, for instance, help teachers, families and learners prevent or contain dropout. If DepEd planners and officials decide that the consequences on individual and collective outcomes are sufficient to justify the design of a program, and if such a program falls within their range of resources, they can propose that the program be developed and eventually implemented.

EX-ANTE EVALUATION
Ex-ante evaluation is concerned with setting objectives and identifying solutions to address the needs. It typically examines inputs and outputs.

Source: Josselin and Le Maux (2017)


Figure 11.0 Ex-Ante Evaluation Techniques

Typical questions for Ex-Ante Evaluation are the following:


 Are potential beneficiaries aware of the program?
 Do they have access to it?
 Is the application and selection procedure appropriate?
DepEd may consider using indicators of means (operating expenditures, grants received, number of
agents) and indicators of realization (number of beneficiaries or users) to measure the inputs and
the outputs, respectively. In addition, a set of management and accounting indicators can be
constructed and collected to relate the inputs to the outputs.
Different approaches can be employed depending on the type of outcome that is analyzed.
 Budget Impact Analysis. The purpose of budget impact analysis is to evaluate the budget and outcome changes initiated by the introduction of the new strategy. It measures the evolution of the number of users or patients through time and multiplies this number by the unit cost of the interventions.

 Cost Benefit Analysis. This goes further by also considering the satisfaction derived from the consumption of public services. All effects of the project are taken into account, including social, economic and environmental consequences. The approaches are thereby different, but also complementary, as a project that is financially viable is not necessarily economically relevant and vice versa. In both approaches, discounting can be used to compare flows occurring at different time periods. The idea is based on the principle that, in most cases, citizens prefer to receive goods and services now rather than later.

 Cost Effectiveness Analysis. Cost effectiveness analysis selects the set of most efficient
strategies by comparing their costs and their outcomes.

 Financial Analysis. A financial appraisal examines the projected revenues with the aim of
assessing whether they are sufficient to cover expenditures and to make the investment
sufficiently profitable.

 Multi-criteria Decision Analysis. This may be used whenever several outcomes have to be taken into account but cannot easily be expressed in monetary terms. In its simplest form, the approach constructs a composite indicator that encompasses all those different measurements and allows the stakeholders’ opinions to be accounted for. Weights are assigned to the different dimensions by the decision-maker (see the sketch following this list).
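
To make two of the techniques above concrete, the following is a minimal Python sketch, with purely hypothetical figures, of (a) discounting yearly net flows for a cost-benefit comparison and (b) a weighted composite score for a simple multi-criteria comparison. The amounts, criteria, and weights are illustrative assumptions only.

# A minimal, illustrative sketch (hypothetical figures, not DepEd data) of
# (a) discounting yearly net flows and (b) a weighted composite indicator.

def npv(flows, rate):
    """Discount a list of yearly net flows (benefits minus costs) to present value."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

# Hypothetical program: upfront cost in year 0, net benefits in later years.
yearly_net_flows = [-5_000_000, 1_500_000, 2_000_000, 2_500_000]
print(round(npv(yearly_net_flows, rate=0.06)))  # a positive value favors the program

def composite_score(scores, weights):
    """Weighted sum of criterion scores normalized to a 0-1 scale (higher is better)."""
    return sum(s * w for s, w in zip(scores, weights))

# Hypothetical options scored on cost-efficiency, equity of access, and feasibility.
options = {"Option A": [0.6, 0.9, 0.5], "Option B": [0.8, 0.6, 0.9]}
weights = [0.4, 0.4, 0.2]  # assigned by the decision-maker
for name, scores in options.items():
    print(name, round(composite_score(scores, weights), 2))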

EX-POST EVALUATION
Ex post evaluation addresses the question of whether the outcome is the result of the intervention or of some other factors. The true challenge here is to obtain a measure of what would have happened if the intervention had not taken place, the so-called counterfactual.

Source: Josselin and Le Maux (2017)
Figure 12.0 Ex-Post Evaluation Techniques

DepEd can use ex post evaluation to look at areas of effectiveness, which emphasizes the extent to which planned outcomes are achieved as a result of the program. With this in mind, it is essential to distinguish the short-term outcomes, i.e. the immediate effects on individuals’ status as measured by a result indicator (e.g., rate of dropout during mandatory school time), from the longer-term outcomes, i.e. the environmental, social and economic changes as measured by impact indicators (e.g., the impact of dropout on DepEd’s commitments). In practice, ex post evaluation focuses mainly on short-term outcomes, with the aim of measuring what has happened as a direct consequence of the intervention. The analysis also assesses what the main factors behind success or failure are.
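
As a minimal illustration of reasoning against a counterfactual, the Python sketch below applies a simple difference-in-differences to hypothetical dropout rates; the figures and the assumption of a comparable comparison group are illustrative only, and actual evaluations require more careful designs.

# Minimal sketch of estimating a program effect against a counterfactual using
# a simple difference-in-differences on hypothetical dropout rates (%).
treated = {"before": 8.0, "after": 5.0}      # schools that received the program
comparison = {"before": 7.5, "after": 6.5}   # similar schools that did not

change_treated = treated["after"] - treated["before"]            # -3.0
change_comparison = comparison["after"] - comparison["before"]   # -1.0

# The comparison group's change approximates what would have happened anyway.
estimated_effect = change_treated - change_comparison
print(f"Estimated effect on the dropout rate: {estimated_effect:+.1f} percentage points")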

MONITORING AND EVALUATION TOOLS AND SYSTEMS

MONITORING AND EVALUATION TOOLS

Tools are used during the conduct of monitoring activities to collect, analyze, and report required data and information. A range of tools may be used in monitoring policies, programs, projects, and major activities. It is essential to ascertain the appropriate tools and approaches for the conduct of monitoring. Further, it is not realistic to expect that any one monitoring tool or mechanism will satisfy all data requirements during the conduct of M&E.
Different offices and stakeholders may use different tools or may use the same tools differently. The
development and selection of tools to be used shall be anchored on the objective or purpose for
conducting M&E. Since these tools are primarily used in collecting information, it is important to
identify what type of information will be collected and why these should be collected. These tools
may be classified according to the following purposes:

 Data collection and analysis. This entails obtaining and analyzing data and information about the implementation of programs, projects, and major activities.

 Validation. This involves checking or verifying whether or not the reported data and information are accurate.

 Participation. This focuses on obtaining feedback from partners and beneficiaries on progress and proposed actions.

The following tools may be used to achieve the purposes of M&E listed above:

 Periodic progress reports


 Annual reports
 Program, projects, and activities monitoring reports
 Field visits
 Spot checks
 Reviews and assessment by partners
 Client surveys
 Evaluations
 Reviews and studies
 Stakeholder meetings
 Inter-agency meetings
 Focus group discussions

INFORMATION SYSTEMS

The current DepEd system generates information on a number of key indicators required for
mandatory reporting on: (1) major final output for reporting to the Department of Budget
Management, (2) a resource matrix and PDP-Education Sector accomplishments for reporting to
the National Economic and Development Authority, and (3) financial information to the
Commission on Audit. In addition, the M&E system is used for (4) providing information to the
DepEd Secretary for Congressional question-and-answer briefing sessions, (5) reporting on the SDGs at the
international level, and (6) project progress reports for government and development partner
review. The Enhanced Basic Education Act of 2013 and associated implementing rules and
regulations require DepEd to report on additional indicators related to SHS. However, no
comprehensive annual statistical bulletin on education is currently published.
Currently, the Department manages several information systems which serve as repository of data
that enable DepEd to easily capture, consolidate, analyze, and prepare agency level performance
reports for planning, resource allocation, policy and program development. The existing
information systems managed by DepEd and their corresponding descriptions are as follows:

ENHANCED BASIC EDUCATION INFORMATION SYSTEM (EBEIS)


EBEIS is an online database of basic education information, including the number of enrollees by
year level, the number of schools, and the number of teachers. The EBEIS has the following
features:
 Registry of Schools
 System designed to enhance information management at all levels of the education system
(school, division, regional and national levels)
 Supports information requirements for planning, quality assurance, monitoring &
evaluation and other decision-making activities
 Provides a venue for sharing, using and reusing knowledge within DepEd
 Provides information on school infrastructure gathered through the National School
Building Inventory (NSBI) activities

ENTERPRISE HUMAN RESOURCE INFORMATION SYSTEM ( eHRIS)


The eHRIS is a database of personnel records for teaching and non-teaching DepEd staff and
includes information on salary, qualifications, and years of service. Currently, it is managed by the
personnel division of the human resource development service at the DepEd Central office. Below
are some features:
 Central repository of personnel records of DepEd officials and employees
 Allows HR activities and processes to occur electronically
 Enables better human resource management and planning and allows Personnel Division,
BHROD, and HRMOs to effectively perform their HR functions to better deliver service to
employees

LEARNER INFORMATION SYSTEM (LIS)
The LIS uses a unique identifier to store information particular to a learner, including name, date of
birth, guardian, and sex. Learner information has been expanded to include school enrollment,
achievement results, and other information. This system will need to be expanded to include non-
DepEd students and will be an essential component of the SHS voucher program’s implementation
and monitoring. The following features are noted:
 Registry of Learners
 Web-based system, linked with EBEIS, containing the registry of learners which enabled the
establishment of a centralized “Learner Registry”
 Uses a Learner Reference Number (LRN), a unique and permanent 12-digit number
assigned to a learner who enters the Philippine basic education system
 Through this, the basic learner information is captured, stored, and accessed through a
secured facility to enhance tracking of learners and decision-making at various levels of
DepEd management

LEARNING RESOURCE MANAGEMENT AND DEVELOPMENT SYSTEM (LRMDS)


The Learning Resource Management and Development System (LRMDS) is designed to support
increased distribution and access to learning, teaching and professional development resources at
the Region, Division and School/Cluster levels of DepEd. Below are some features:
 Storage of Learning Materials
 A web portal designed to support increased distribution and access to learning, teaching,
and professional development resources at the region, school division and school/cluster
levels of DepEd
 Covers the assessment and evaluation, development, acquisition and distribution, storage
and maintenance, publication, and delivery of learning resources

LEARNING RESOURCES DELIVERY TRACKING SYSTEM (LRDTS)


The Learning Resources Delivery Tracking System focuses on the following:
 Tracks learning resources delivery from supplier to the recipient schools in real time
 Provides timely information on the status of the distribution of learning resources to enable
informed decision

PROGRAM MANAGEMENT INFORMATION SYSTEM (PMIS)


The PMIS monitors performance indicators of education programs, projects and activities.
Currently, the PMIS monitors government-financed projects only, but DepEd plans to include
development partner-supported projects as well.
The Planning and Programming Division (PPD) of the DepEd Planning Service is responsible for the management of the PMIS. If the PMIS is to be used to monitor progress towards disbursement-linked indicator (DLI) achievement, some capacity development may be required, since results-based lending monitoring is significantly different from project progress monitoring. The features below are worth considering:
 Monitors progress and implementation of programs and projects
 Web-based information system that facilitates data collection, storage, analysis and
reporting in the tracking and monitoring of physical and financial performance of the
different programs, projects, and major activities of the DepEd
 Provides real-time data and information on the progress of implementation and
achievements of programs and projects in aid of policies and decisions of management
necessary to improve the delivery of quality outputs and thus, support basic education
outcomes
As with other databases and Knowledge Management (KM) systems, the following considerations
are raised:
 Access Arrangements (user typology and controls, web portal access for public
consumption)
 Conduct of spot check and periodic reviews of the systems
 Hardware Requirements (e.g., hardware lifespan and value, technical specifications for
current and future use)
 Software Requirements (open source versus proprietary software; updating of modules and
whole system)
 Protection of Individual Data and Data Privacy
 Quality assessment covering areas of reliability, validity and timeliness
 Quality assurance for data quality and freedom from data tampering and manipulation
 Technical Capability of Staff (across governance levels) and other users

While the information systems identified above are used and implemented nationwide, each operating unit across governance levels shall gather additional information not captured by DepEd’s existing information systems that is complementary to the achievement of agency performance indicators. Operating units across governance levels shall establish and maintain their own databases containing data and information gathered from their respective M&E activities, which can be easily accessed, managed, and updated. In this manner, each operating unit can easily integrate new data requirements necessary for its respective operations, regardless of whether these are identified in the national plan or not.
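
As one possible starting point (a minimal sketch only, not a prescribed DepEd design), such a supplementary database could be kept in a small relational store; the table and field names below are illustrative assumptions.

# Minimal sketch of a local indicator database an operating unit might keep for
# data not captured by the national systems. Table and field names are
# illustrative assumptions, not a prescribed DepEd schema; the figure is hypothetical.
import sqlite3

conn = sqlite3.connect("unit_me_data.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS indicator_records (
        indicator      TEXT NOT NULL,   -- e.g. 'ALS Completion Rate'
        school_year    TEXT NOT NULL,   -- e.g. '2018-2019'
        governance     TEXT NOT NULL,   -- national / region / division / school
        disaggregation TEXT,            -- e.g. 'sex=female;level=G7-G10'
        value          REAL NOT NULL,
        source         TEXT             -- where the figure came from
    )
""")
conn.execute(
    "INSERT INTO indicator_records VALUES (?, ?, ?, ?, ?, ?)",
    ("ALS Completion Rate", "2018-2019", "division", "sex=female", 71.4, "quarterly report"),
)
conn.commit()

# Simple retrieval for a quarterly review.
for row in conn.execute(
    "SELECT indicator, disaggregation, value FROM indicator_records WHERE school_year = ?",
    ("2018-2019",),
):
    print(row)
conn.close()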

SCOPE OF MONITORING AND EVALUATION (M&E)

BASIC EDUCATION PERFORMANCE INDICATORS

INTERMEDIATE OUTCOMES

(Each performance indicator is listed under its corresponding outcome statement, with its frequency of data reporting, disaggregation, and responsible offices.)

IO 1. Learners are in school and learning centers

 Gross Enrolment Rate
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division); Level of education (Formal: Kinder, G1-G6, G7-G10, G11-G12; Non-formal: Basic Literacy Program, A&E elementary, A&E secondary); Sex (male, female); Type of learner (formal, non-formal); Sector (public and private schools)
Responsible office: CO: PS; RO: PPRD; SDO: SGOD; School: School Head

 Net Enrolment Rate
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division); Level of education (Formal: Kinder, G1-G6, G7-G10, G11-G12; Non-formal: Basic Literacy Program, A&E elementary, A&E secondary); Sex (male, female); Type of learner (formal, non-formal); Sector (public and private schools)
Responsible office: CO: PS; RO: PPRD; SDO: SGOD; School: School Head

IO 2. Learners access programs responsive to their needs and consistent with their interests and aptitudes

 % of learners enrolled in special programs
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division, school); Level of education (Kinder, G1-G6, G7-G10, G11-G12); Sex (male, female); Type of special program (IPEd, Madrasah Education Program, SPED/LSEN, special interest program); Sector (public and private schools)
Responsible office: CO: PS; RO: PPRD; SDO: SGOD; School: School Head

 % of SHS enrolment aligned with NCAE result
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division, school); Sex (male, female); Track (Academic, TVL, Sports, Arts and Design); Level of education (G11-G12); Sector (public and private schools)
Responsible office: CO: PS; RO: PPRD; SDO: SGOD; School: School Head

IO 3. Learners actively participate in a learner-friendly environment

 Cohort Survival Rate
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division); Level of education (G1-G6, G7-G12); Sex (male, female); Sector (public and private schools)
Responsible office: CO: PS; RO: PPRD; SDO: SGOD

 Transition Rate
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division); Level of education (G6-G7, G10-G11); Sex (male, female); Sector (public and private schools)
Responsible office: CO: PS; RO: PPRD; SDO: SGOD

 School Leaver Rate
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division); Level of education (Kinder, G1-G6, G7-G10, G11-G12); Sex (male, female); Sector (public and private schools)
Responsible office: CO: PS; RO: PPRD; SDO: SGOD

 Repetition Rate
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division); Level of education (Kinder, G1-G6, G7-G10, G11-G12); Sex (male, female); Sector (public and private schools)
Responsible office: CO: PS; RO: PPRD; SDO: SGOD

IO 4. Learners complete education and attain learning standards

 Completion Rate
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division); Level of education (Kinder, G1-G6, G7-G10, G11-G12); Sex (male, female); Sector (public and private schools)
Responsible office: CO: PS; RO: PPRD; SDO: SGOD

 Proportion of students performing at proficient level
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division); Level of education (Kinder, G1-G6, G7-G10, G11-G12); Sex (male, female); Sector (public and private schools)
Responsible office: CO: BEA; RO: CLMD; SDO: CID

 Promotion Rate
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division); Level of education (Kinder, G1-G6, G7-G10, G11-G12); Sex (male, female); Sector (public and private schools)
Responsible office: CO: PS; RO: PPRD; SDO: SGOD

 Graduation Rate
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division); Level of education (G6, G10, G11); Sex (male, female); Sector (public and private schools)
Responsible office: CO: PS; RO: PPRD; SDO: SGOD

 ALS Completion Rate
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division, school/learning center); Sex (male, female); Level of education (Basic Literacy, A&E elementary, A&E secondary); Sector (public and private service providers); Delivery mechanism (DepEd delivered, DepEd procured, DepEd partners)
Responsible office: CO: PS/BLD; RO: PPRD/CLMD; SDO: CID

 ALS A&E Passer Rate
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division, school/learning center); Sex (male, female); Sector (public and private ALS learning centers/service providers); Delivery mechanism (DepEd delivered, DepEd procured, DepEd partners)
Responsible office: CO: BLD/BEA; RO: CLMD; SDO: CID
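
To show how an indicator of this kind might be computed with one disaggregation dimension, the sketch below applies the commonly used gross and net enrolment rate formulas to hypothetical figures; the numbers are illustrative assumptions, not DepEd statistics.

# Illustrative computation (hypothetical figures) of Gross and Net Enrolment
# Rates disaggregated by sex, using the commonly used definitions:
# GER = enrolment regardless of age / school-age population x 100
# NER = enrolment of official school-age learners / school-age population x 100
records = [
    # sex, enrolled (all ages), enrolled (official age), school-age population
    ("male", 510_000, 470_000, 500_000),
    ("female", 495_000, 468_000, 480_000),
]

for sex, enrolled_all, enrolled_official, population in records:
    ger = 100 * enrolled_all / population
    ner = 100 * enrolled_official / population
    print(f"{sex}: GER = {ger:.1f}%, NER = {ner:.1f}%")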

ENABLING INDICATORS

(Each performance indicator is listed under its corresponding outcome statement, with its frequency of data reporting, disaggregation, and responsible offices.)

EE 1. Education leaders and managers practice participative and inclusive management processes

 Proportion of schools at SBM Level 3 (Highly Proficient)
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division, school); Level of practice (Levels 1, 2, 3, 4)
Responsible office: CO: BHROD; RO: FTAD; SDO: SGOD; School: School Head

EE 2. Investments in Basic Education provide learners with an ideal learning environment

 Proportion of schools with ideal pupil/student to classroom ratio
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division, school); Level of education (Kinder, G1-G6, G7-G10, G11-G12); Sector (public and private schools)
Responsible office: CO: AS; RO: ESSD; SDO: SGOD; School: School Head

 Proportion of schools with ideal pupil/student to teacher ratio
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division, school); Level of education (Kinder, G1-G6, G7-G10, G11-G12); Sector (public and private schools)
Responsible office: CO: PS; RO: PPRD; SDO: SGOD; School: School Head

 Proportion of schools with ideal water and sanitation (WatSan) facility to pupil ratio
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division, school); Level of education (Kinder, G1-G6, G7-G10, G11-G12)
Responsible office: CO: AS; RO: ESSD; SDO: SGOD; School: School Head

 Proportion of schools with ideal learning materials to learner ratio
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division, school); Level of education (Kinder, G1-G6, G7-G10, G11-G12); Sector (public and private schools)
Responsible office: CO: BLR; RO: CLMD; SDO: CID; School: School Head

 Proportion of schools with functional library
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division); Level of education (Kinder, G1-G6, G7-G10, G11-G12)
Responsible office: CO: BLR; RO: CLMD; SDO: CID; School: School Head

 Proportion of schools with ideal pupil/student to seat ratio
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division, school); Level of education (Kinder, G1-G6, G7-G10, G11-G12)
Responsible office: CO: AS; RO: ESSD; SDO: SGOD; School: School Head

 Proportion of schools with ideal ICT package/e-classroom package to sections ratio
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division, school); Level of education (Kinder, G1-G6, G7-G10, G11-G12)
Responsible office: CO: ICTS; RO: ICT Unit; SDO: ICT Unit; School: School Head

 Proportion of schools with internet access
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division, school); Level of education (Kinder, G1-G6, G7-G10, G11-G12)
Responsible office: CO: ICTS; RO: ICT Unit; SDO: ICT Unit; School: School Head

 Proportion of schools with connection to electricity
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division, school); Level of education (Kinder, G1-G6, G7-G10, G11-G12)
Responsible office: CO: ICTS; RO: ICT Unit; SDO: ICT Unit; School: School Head

 Interquartile ratio (IQR)
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division, school); Level of education (Kinder, G1-G6, G7-G10, G11-G12)
Responsible office: CO: PS; RO: PPRD; SDO: SGOD

EE 3. People, internal systems, and processes serve learners better through continuous improvement efforts

 Proportion of offices across governance levels with very satisfactory rating in the Office Performance Commitment and Review Form (OPCRF)
Frequency of data reporting: Semestral
Disaggregation: Governance level (national, region, division, school)
Responsible office: CO: BHROD; RO: HRD; SDO: SGOD; School: School Head

 Disbursement rate
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division, school)
Responsible office: CO: Finance Service; RO: Finance Division; SDO: Finance Unit; School: School Head

 Proportion of bidding completed on the first cycle of the process
Frequency of data reporting: As indicated in the PPMP / Program of Works
Disaggregation: Governance level (national, region, division, school)
Responsible office: CO: ProcS; RO: AD; SDO: Admin Unit; School: School Head

 Timely delivery of procured projects
Frequency of data reporting: As indicated in the PPMP / Program of Works
Disaggregation: Governance level (national, region, division, school)
Responsible office: CO: ProcS; RO: AD; SDO: Admin Unit; School: School Head

 Client satisfactory rating of DepEd offices' respective stakeholders
Frequency of data reporting: Periodic
Disaggregation: Governance level (national, region, division, school); Type of stakeholder (internal, external); Rating
Responsible office: CO: PS; RO: PPRD; SDO: SGOD; School: School Head

 Proportion of teachers meeting PPST career stage 3 (Highly Proficient) in all domains
Frequency of data reporting: Semestral
Disaggregation: Governance level (national, region, division, school); Stage of PPST; Sex (male, female)
Responsible office: CO: BHROD; RO: HRD; SDO: SGOD; School: School Head

 All DepEd operating units produce plans and adjustments compliant with DepEd planning and M&E standards
Frequency of data reporting: Semestral
Disaggregation: Governance level (national, region, division, school)
Responsible office: CO: PS; RO: PPRD; SDO: SGOD; School: School Head

 Proportion of Authority to Conduct (ATC) requests over total financial transactions
Frequency of data reporting: Semestral
Disaggregation: Governance level (national, region, division)
Responsible office: CO: PS; RO: PPRD; SDO: SGOD

 Proportion of DepEd policies compliant with the policy development process
Frequency of data reporting: Semestral
Disaggregation: Bureaus and Services
Responsible office: CO: PS

EE 4. Key stakeholders actively collaborate to serve learners better

 Proportion of financial contribution of development partners vis-a-vis the national education budget
Frequency of data reporting: Annual
Disaggregation: National level
Responsible office: CO: PMS

 Special Education Fund (SEF) Utilization Rate
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division, school); Eligible expense; LGU classification (province, municipality, city)
Responsible office: CO: FS; RO: FD; SDO: OSDS

 Proportion of schools with functional School Governance Council (SGC)
Frequency of data reporting: Annual
Disaggregation: Governance level (national, region, division, school); Level of education (elementary, secondary)
Responsible office: CO: BHROD; RO: FTAD; SDO: SGOD; School: School Head

Performance measures by program

The implementation of the M&E function within the DepEd bureaucracy is primarily focused on outcomes. Specific M&E information, based on indicators identified by DepEd program, is collected for all outcomes at different governance levels, in different time frames, and for specific areas. For this section, established indicators were identified, with some key questions per identified area. The type of questions will be highly dependent on the following:

 Monitoring Questions. These measure inputs versus outputs. They are useful in systematically and critically observing progress to manage inputs and adapt them to changing conditions. Critical steps in integrating these are:
o Recording data and information on key indicators, mainly from sources existing at DepEd
o Collection of primary data
o Review of education information system data
o Analysis performed at each governance level
o Regular reporting

 Evaluation Questions. These look at outputs leading to outcomes and are conducted at least once every three years. They provide information that is credible and useful, enabling the incorporation of lessons learned in the process. Necessary considerations are:
o Who conducts the evaluation (i.e., external, internal (higher-level group), or joint evaluation)
o When it is conducted (i.e., ex-ante, on-going, ex-post)
o Uses of evaluation information

 Impact Questions. These are long-term research questions. They may go beyond the direct results of an intervention and delve into its current and future effects on key stakeholders. Essential concepts are:
o Attribution
o Causality
o Incrementality

M&E Information Complementarity

Interactional complementarity. Decision-makers in DepEd can utilize monitoring and evaluation information in tandem to make informed decisions and act based on insights.

Information complementarity. Monitoring and evaluation can use the same data but may address specific M&E questions based on different approaches.

Sequential complementarity. Monitoring data can generate questions that evaluation will have to address.

Source: Gebremedhin, Gettachew and Amha (2010)

The Support to Operations consists of activities and projects which provide staff, technical, and/or substantial support to operations but do not produce goods or deliver services directed towards schools or learners. Funds for this are management overhead expenses and are therefore indirect costs incurred in delivering the mandate of the Department.

The Operations consists of programs and corresponding expenditures that relate to the mandate of the Department – the delivery of basic education services directly benefitting the schools or learners. Funds for this are considered the direct costs of delivering the mandate of the Department.

EDUCATION POLICY DEVELOPMENT PROGRAM


This program includes the development and formulation of policies, standards, and processes that will ensure the effective delivery of basic education services. It includes curriculum development done through the review of content-based standards and research aimed at making the curricula more relevant to the learners’ economic and social environment and contextualized to the particular needs of the learners. It also covers the development and conduct of different types of assessment, such as systems assessment and learner assessment. The following are under this:

 Basic Education Curriculum


 National Literacies and Research Program
 Early Language Literacy and Numeracy
 Policy and Research Program
 Curriculum Programs, Learning Management Models, Standards, Strategy Development
 Development and Promotion of Campus Journalism

Indicators

 Gender Parity
 Literacy Rate
 Achievement Rate
 Gross Enrollment
 Net Enrollment
 Cohort Survival Rate
 Transition Rate
 Participation Rate
 School Leaver Rate
 Drop-Out Rate
 Repetition Rate
 Completion Rate
 Graduation Rate

Selected Key Questions

Monitoring Questions

 How are the indicators in terms of gender distribution?


 Are all students able to actively participate in education?
 What is the current condition in terms of drop-outs and school leavers?
 How does the participation rate correspond to completion and graduation
rates?


Evaluation Questions

 Do all learners have access to education?


 Do all learners benefit from good quality education?
 Are all students treated with equality?
 Is the management of education efficient and effective?
 Are the outcomes of education relevant and satisfactory?

Impact Questions

 Are there improvements in terms of learner access?


 What contributes to increased participation?
 What is the current progress in terms of retention within the education
system?
 What factors are effective in producing outcomes in terms of participation,
graduation and completion?

BASIC EDUCATION INPUTS


This program aims to provide learners with the basic education resources consistent with the
concept of “Getting learners in school and keeping them in school” to enable them to complete the
full cycle of education.

The major components under the program are the provision of a) human resources, b) learning
materials, c) learning facilities, and d) tools and equipment. More specifically, this program includes
the creation of teacher plantilla items, provision and maintenance of classrooms, workshops,
laboratories, water and sanitation facilities, provision of school seats and tables for learners and
teachers, acquisition of school sites, and provision of learning materials, tools, and equipment.

The following are under this:

 New School Personnel Positions


 Improvement and Acquisition of School Sites
 Computerization Programs
 Learning Resources (Text-based and Non-Text-Based LR)

Indicators

 Learner to Classroom Ratio


 Learner to Teacher Ratio
 Learner to Water and Sanitation (WASH) Ratio
 Learner to Learning Materials Ratio
 Learner to Seat Ratio
 Cohort Survival Rate
 Proportion of Schools with Computer Package
 Proportion of Schools with Functional Library
 Proportion of Schools with ICT/Computer and Science Laboratory
 Proportion of Schools with Internet Access
 Proportion of Schools with Access to Electricity

Some Key Questions

Monitoring Questions

 How are the indicators in terms of gender parity?


 Are all material and infrastructural resources provided available?
 Do the schools have adequate number of separate toilets for boys and girls that are regularly
used?
 Are the classrooms, toilets and other school facilities well cleaned and have access to water?
 How does the participation rate correspond to completion and graduation rates?

Evaluation Questions

 Are the learners’ material resources and the school’s physical environment responsive to learners’ needs?
 Are learning materials representing particular groups within society available?
 Are classroom conditions and school facilities supporting all learners?
 What do administrators, teachers, and parents think of the policies to improve the school’s
physical environment and facilities?
 Do Regional Offices and Schools Divisions have established policies and mechanisms to regularly
monitor schools for environmental health problems?

Impact Questions

 Do the Regional Offices, Schools Division Offices and schools have policies to improve and
maintain a healthy physical school environment?
 Are there mechanisms to evaluate aspects of physical environment (like air, water, sanitation,
hazardous chemicals, transportation, school food, disease vectors)?

INCLUSIVE EDUCATION PROGRAM

DepEd has largely understood inclusive basic education as the ‘no child left behind’ policy drawn from the EFA goals of “achieving universal education, especially for girls, ethnic minorities and marginalized children”. This also adheres to Sustainable Development Goal (SDG) 4, which aims to achieve “inclusive and equitable quality education and promote lifelong learning opportunities for all”.

This program aims to provide all types of learners with access to basic education services suitable to their learning needs and circumstances. It includes the following interventions to make education inclusive:
 Indigenous Peoples Education Program (IPEd)
 Muslim Education Program (MEP)
 Multigrade Education Program
 Special Education (SPED) Program
 Flexible Learning Options (Alternative Delivery Mode (ADM))
 Alternative Learning System (ALS)

 Education in Emergencies (EiE)

For DepEd, promoting inclusive education has evolved to include all programs and services that
intend to capture those that have been traditionally marginalized by the formal school system such
as:
 Children living with disabilities to move from SPED Centres to regular classes
 Out of school children/youth/adults through the ALS
 Geographically-isolated areas or areas with not enough pupils to start monograde classes
through the multi-grade system
 Muslim children, whose parents still generally feel isolated from the dominant Christian culture,
through the ALIVE Program in Madrasahs
 Indigenous peoples through the IP education programs.

Indicators

 ALS Completion Rate


 ALS A&E Passer Rate
 Net and Gross Enrollment of Students with Disabilities
 Percentage of Learners Enrolled in ALIVE
 Percentage of Learners Enrolled in Indigenous People Education
 Percentage of Learners Enrolled in Multigrade Schools
 Percentage of Learners Enrolled in Special Programs

Selected Key Questions

Monitoring Questions

 How are the indicators in terms of gender distribution?


 What is the ALS Completion Rate for the past five years?
 What is the ALS A&E Passer Rate for the past five years?
 What is the general trend of enrollment of students with disabilities?
 In comparison to the total learner population, what is the general distribution of inclusive education segments for public and private schools?

Evaluation Questions

 Are there barriers to learners’ access to education, to participation, and to achievement, with particular attention to learners who may be most at risk of underachievement, marginalization or exclusion?
 Is the curriculum responsive to the needs of these learners?
 To what extent are schools supportive of inclusive education (e.g., how school heads, teachers, parents and students encourage and assist differently-abled children’s participation in school events) and willing to sustain the program achievements?
 What existing practices allow students to be integrated into mainstream classes?
 What is the current physical capacity of schools and infrastructure handling these segments?
 Are the teachers competent to handle and manage these students?

Impact Questions

 Are inclusive education policies backed up by support to teachers to adapt


teaching styles?
 Are schools and the communities engaged and supporting all children to
attend school?

With results-based M&E, inclusive education within DepEd may focus on:
 Exploring the specific nature of exclusion in specific/local contexts.
 Delving into indicators of exclusion that go beyond access to schooling and capture influential behaviours and practices.
 Promoting policy dialogue and reform.
 Enabling DepEd operating units across governance levels to work in contexts where participation rates are already high, so that the last remaining out-of-school learners, who can be hard to reach, are targeted.
 Developing and fine-tuning costed strategies that take into account the daily realities of excluded learners where mainstream assistance programs are not sufficient.

SUPPORT TO SCHOOLS AND LEARNERS


The Support to Schools and Learners Program provides financial support to learners (through government assistance and subsidies) and to schools (for the operation of schools), as well as extra-curricular support to learners, in order to address access and relevance of education.

This program ensures continuous improvement in the individual capacity of personnel and in the organizational systems by which these individuals interact as an organization, with the end view of improving basic education service delivery.

M&E in terms of support to schools and learners, specifically the financial outlay, is critical for
accountability, transparency and equity. This area includes the following:

 Operations of Schools
 Government Assistance Subsidies
 Joint Delivery Voucher for Senior High Schools
 School Feeding Programs

Indicators
 Disbursement Rate
 Special Education Fund Usage
 Percentage of JHS and SHS who availed of the vouchers
 Certification Rate of SHS TVE Graduates
 School-to-Work Transition Rate of SHS Graduates
 Other Student Outcome Indicators
 Sufficient Education Financing

Selected Key Questions

Monitoring Questions

 What is the current disbursement rate per particular project?


 What percentage of a school/division’s budget are provided for by
Special Education Fund?
 What is the current going rate of voucher availment in JHS and SHS?
 How many SHS graduates undergo TVE certification?

Evaluation Questions

 Are there barriers to accessing the Special Education Fund?
 What types of projects and programs does the Special Education Fund provide for?
 Are there issues to consider in terms of disbursement?
 To what extent are schools supportive of TVE certification?
 What existing factors contribute to JHS and SHS learners going to work instead of going to college?

Impact Questions

 Does Special Education Fund usage translate into school and learner outcomes?
 Do JHS and SHS learners demonstrate the necessary competencies for work?

EDUCATION HUMAN RESOURCE DEVELOPMENT PROGRAM

 Human Resource Development for School-Based Personnel and Learning Centers


 Teacher Quality and Development Program
 Government Assistance Subsidies

Indicators

 Proportion of offices across governance levels with very satisfactory
ratings in the Office Performance Commitment and Review Form (OPCRF)
 Client satisfactory rating of DepEd Offices’ respective stakeholders
 Proportion of qualified staff and teachers (Source)
 Extent of strategic alignment between DepEd’s goals and operating units
 Extent of cascading of operating units’ goals to individuals
 Ratings of individual performance

Selected Key Questions

Monitoring Questions
 What are the current SBM levels per Division?
 What are the current SBM levels per Region?
 How many schools have school governance councils (SGC)?
 How many schools have poor ratings?
 What operating units are lagging behind in terms of client
satisfaction?
 Are the teachers and staff qualified?
 Is the current DepEd compensation plan aligned with the government’s compensation and position plan?

Evaluation Questions
 How do SBM levels improve?
 How does improvement in SBM levels eventually translate into
educational outcomes?
 To what extent are school governance councils more relevant and responsive to learners’ needs?
 What existing practices allow operating units to maintain high client
satisfaction ratings?
 What are the current ratings of individual teaching and non-teaching
staff per school/division/region?
 What competency requirements are needed for teaching and non-
teaching staff?
 What competencies should school administrators demonstrate to
attain high SBM ratings in their schools?

Impact Questions

 Are SBM levels related to learner outcomes?
 What changes did school governing councils introduce?
 Are there differences in outcomes between schools with and without school governance councils?
 Does the competence of teaching staff translate to class and school-wide outcomes?
 What are the results of the strategic alignment of DepEd goals to the operating units and the cascading to individual performance?

ROLES AND RESPONSIBILITIES BY GOVERNANCE LEVEL

The Monitoring and Evaluation (M&E) system shall serve as an integrating mechanism across governance levels and within operating units of the Department. It shall provide DepEd’s decision-makers with evidence-based information on the applicability and feasibility of the formulation and implementation of policies, programs, projects, and major activities in the Department.

The vertical integration of the M&E system across governance levels is the systematic alignment of
development and implementation of basic education policies, plans, programs, projects, major
activities, and M&E processes from national to region, region to school division, and school division
to school level and vice versa. It allows the national, regional, school division, and school levels to
make adjustments in the quality of their strategic basic education plans including technical, human
resource, and administrative services. Likewise, the horizontal integration of the M&E system
within governance levels shall align the development and implementation processes of basic
education plans, policies, programs, projects, major activities, and M&E processes among operating
units in a particular governance level i.e. national, regional, school division, and school. BEMEF shall
enable each office to come up with a more holistic and integrated analysis of their entire
governance situation. To make the M&E system functional, all DepEd operating units across
governance levels shall conduct the M&E of their respective basic education plans, policies,
programs, projects, and major activities in accordance with the BEMEF and corresponding standards.

CENTRAL OFFICE

The national M&E system shall orchestrate monitoring and evaluation across the entire DepEd. It defines the scope of the DepEd M&E system, including the continuous review and enhancement of performance indicators to ensure that the needs of learners are addressed.

Central Office shall:


1. Establish a results-based M&E at all levels;

2. Provide the mechanism for the horizontal integration of bureaus, services and other
operating units at the national level;

3. Ensure vertical integration of the M&E systems in the region, school division, and school;

4. Define the processes for validating outcomes and accomplishments. This includes design of
M&E work processes, identification of the information needs of internal and external
stakeholders, report requirements, and process for collecting and capturing data and
information;

5. Ensure the integration of M&E results in development policies, programs and plans,
preparation of the agency’s financial requirements and distribution of resources;

6. Facilitate exchange of information, practices, insights, lessons and issues between and
among operating units and external stakeholders;

7. Facilitate the implementation of third party evaluation of DepEd programs and projects;

8. Link M&E results to the organizational and individual performance; and

9. Ensure that BEMEF is supportive of the achievement of DepEd goals and outcomes.

RESPONSIBLE OFFICES

Office of the Secretary

The Secretary shall be the overall lead of the Department’s national M&E system. As the official with the overall authority and supervisory responsibility over the operations of the Department, he/she shall have the accountability and responsibility to ensure that information generated from the national M&E system is used to: (1) formulate national educational policies, plans, standards, programs, projects, and major activities; and (2) assess national learning outcomes. He/she shall:

1. Lead the institutionalization of the basic education national M&E system;

2. Provide decisions and directions on national education issues and matters arising from
various M&E activities such as national PIRs, stakeholders’ forum, inter-agency meetings,
among others;

3. Communicate education concerns to other national offices and other development partners
during meetings, fora, or conferences;

4. Approve educational policies and program recommendations from internal and external stakeholders based on evidence presented, such as completed research and national statistics, among others; and

5. Determine additional performance indicators and other adjustments in the national M&E
plan as necessary.

Planning Service
The Planning Service (PS), as the process owner of the national M&E system, shall:

1. Oversee and manage the conduct of M&E of all central office operating units and ensure
that they are adhering to established standards;
2. Maintain a national database facility to ensure that data and information gathered from
M&E activities are properly managed;

3. Consolidate and analyze M&E reports from central and regional operating units for the
preparation of national reports to be disseminated to internal and external stakeholders;

4. Lead the conduct of quarterly Program Implementation Review (PIR) among central offices
and provide guidance/technical assistance to regional offices and SDOs on the conduct of
PIR to track physical and financial accomplishments and assess the progress of implementation of plans, programs, projects, and major activities based on key
performance indicators (KPIs);

5. Oversee and provide assistance in the conduct of evaluations on DepEd’s programs,


projects, and major activities; and

6. Provide technical assistance and capacity building support to central and regional operating
units on the management and conduct of M&E within their respective M&E systems.

Education Program Delivery Unit (EPDU)

In support of this process, the Education Program Delivery Unit (EPDU) created through DO 71, s.
2016 as the Department’s performance delivery unit, shall collaborate with the PS in monitoring
priority programs and projects. Likewise, it shall work closely with Finance Service (FS) to drive
performance improvements in the timely delivery of education inputs for both formal and non-
formal education. EPDU shall track both physical and financial performance of PAPs across
governance levels in the Department in coordination with the PS towards effective and efficient
delivery of basic education services to its clientele.

Other Central Office Operating Units


All the operating units in the Central Office shall:

1. Establish a results-based M&E within their respective offices;

2. Adhere to the established M&E standards in performing M&E activities and processes;

3. Partake in strengthening the horizontal integration in the national M&E system by
engaging other central operating units during planning, policy development, program
designing, and M&E;

4. Provide feedback, insights, lessons, and other issues gathered from their respective M&E
activities to relevant central operating units;

5. Participate in national M&E initiatives such as PIRs, periodic reporting of accomplishments
of plans, programs, projects, and major activities, submission of O/IPCRF, among others;

6. Apply M&E results in improving office and individual performance; and

7. Link M&E results to the organizational and individual performance.

REGIONAL OFFICE

The regional M&E system shall ensure the effective, efficient, and inclusive implementation of all
education policies & programs and the achievement of desired outcomes. The regional M&E system
shall provide the regional policy makers and implementers with timely and appropriate feedback on
the implementation of DepEd policies, programs, and delivery systems.

The regional M&E system shall:


1. Establish a results-based M&E at the regional level;

2. Ensure the horizontal integration of M&E activities of the different operating units in the
region;

3. Strengthen vertical integration to link M&E systems between region, school division and
school;

4. Ensure that M&E standards and processes are implemented at the regional level;

5. Evaluate the impact, effectiveness, and efficiency of education policies and programs in the
region;

6. Facilitate exchange of information, practices, insights, lessons and issues between and
among operating units and external stakeholders;

7. Provide feedback to CO on the regional M&E results particularly on issues with implications
for national policies and programs;

8. Ensure the integration of M&E results in developing local programs and plans, and
customizing national education strategies and policies;

9. Ensure the conduct of quarterly regional PIR (MEA) with SDOs to track physical and
financial accomplishments and assess the progress of implementation of plans, programs,
projects, and major activities based on key performance indicators (KPIs); and

10. Link M&E results to the organizational and individual performance.

RESPONSIBLE OFFICES

Regional Director
The regional director shall be the overall lead of the regional M&E system. He/she shall have the
authority, accountability, and responsibility to ensure that information generated from the regional
M&E system is used to: (1) develop regional basic education plans, standards, programs, projects,
and major activities; (2) customize national education strategies and policies; and, (3) assess
regional learning outcomes. He/she shall:

1. Lead the institutionalization of the basic education regional M&E system;

2. Provide decisions and directions on regional education issues and matters arising from
various M&E activities such as regional PIRs, stakeholders’ forum, inter-agency meetings,
among others;

3. Communicate regional education concerns to the central office, other agencies, and other
development partners during meetings, fora, or conferences;

4. Approve program recommendations from internal and external stakeholders based on
evidence presented, such as completed research, national statistics, among others; and,

5. Determine additional performance indicators and other adjustments in the regional M&E
plan as necessary.

Quality Assurance Division


The Quality Assurance Division (QAD), as the main process owner of the regional M&E system, shall:

1. Oversee and manage the conduct of M&E of all regional operating units and ensure that
they are adhering to established standards;

2. Consolidate and analyze M&E reports from regional and schools’ division operating units for
the preparation of regional reports to be disseminated to internal and external
stakeholders;

3. Maintain a regional database which contains data and information gathered from regional
M&E activities that can be easily accessed, managed, and updated;

4. Lead the conduct of quarterly Program Implementation Review (PIR) among regional and
school division operating units to track physical and financial accomplishments and assess
the progress of implementation of plans, programs, projects, and major activities based on
Key Performance Indicators (KPIs);

5. Oversee and provide assistance in the conduct of evaluations on regional programs,
projects, and major activities; and,

6. Provide technical assistance and capacity building support to regional and school division
operating units on the management and conduct of M&E within their respective M&E
systems.

Other Units

All the operating units in the regional M&E system shall:


1. Establish a results-based M&E within their respective offices;

2. Adhere to the established M&E standards in performing M&E activities and processes;

3. Partake in strengthening the horizontal integration in the regional M&E system by
engaging other regional operating units during planning, customizing of national policy,
program designing and implementation, and M&E;

4. Provide feedback, insights, lessons, and other issues gathered from their respective M&E
activities to relevant central and regional operating units;

5. Participate in regional M&E initiatives such as PIRs, periodic reporting of accomplishments
of plans, programs, projects, and major activities, and submission of O/IPCRF, among
others; and,

6. Apply M&E results in improving office and individual performance.

SCHOOLS DIVISION OFFICE

The Schools Division M&E focuses on determining the effectiveness and inclusiveness of schools in
providing basic education services. This shall serve as a mechanism for reflection on the SDO’s
capacity to provide timely and needs-based basic education support services to schools. The
feedback shall allow the SDO to provide technical assistance and capacity building support for
creating and sustaining effective and inclusive schools that are relevant and responsive. Through the
M&E system, targeted technical support to schools in the areas of curriculum delivery and
assessment, training of teachers, teaching and learning process, learning environment, partnerships
and stakeholders support, and school leadership shall be regularly provided.

The SDO M&E shall:

1. Establish a results-based M&E at the school division level;

2. Strengthen the link of M&E systems between SDO and schools;

3. Ensure the integration of M&E initiatives of SDO operating units;

4. Monitor the effective and efficient implementation of education policies and programs;

5. Ensure that M&E standards and processes are implemented at the SDO and school levels;

6. Facilitate exchange of information, practices, insights, lessons and issues between and
among operating units and external stakeholders;

7. Provide feedback to RO on the SDO M&E results;

8. Ensure the integration of M&E results in developing local education plans and programs,
and in implementing national education policies and systems both at the SDO and school
levels;

9. Ensure the conduct of quarterly PIR (MEA) to track physical and financial accomplishments
and assess the progress of implementation of plans, programs, projects, and major activities
based on key performance indicators (KPIs);

10. Provide M&E technical support and capacity building intervention to schools; and

11. Link M&E results to the organizational and individual performance.

RESPONSIBLE OFFICES

Schools Division Superintendent


The school division superintendent shall be the overall lead of the school division M&E system.
He/she shall have the authority, accountability, and responsibility to ensure that information
generated from the school division M&E system is used to:
 develop and implement division education development plans and programs; and,
 implement national education policies and systems at the SDO and school.
The SDS is expected to:

1. Lead the institutionalization of the basic education school division M&E system;

2. Provide decisions and directions on school division education issues and matters arising
from various M&E activities such as school division PIRs, stakeholders’ forum, inter-agency
meetings, among others;

3. Communicate school division education concerns to the regional office during meetings,
fora, or conferences; and,

4. Determine additional performance indicators and other adjustments in the school division
M&E plan as necessary.

Schools Governance and Operations Division

As the process owner of the school division M&E system, the Schools Governance and Operations
Division (SGOD) shall:

1. Oversee and manage the conduct of M&E of all division operating units and schools, and
ensure that they are adhering to established standards;

2. Consolidate and analyze M&E reports from school division operating units and schools for
the preparation of school division reports to be disseminated to internal and external
stakeholders;

3. Maintain a school division database which contains data and information gathered from
school division M&E activities that can be easily accessed, managed, and updated;

4. Lead the conduct of quarterly Program Implementation Review (PIR) among school division
operating units and schools to track physical and financial accomplishments and assess the
progress of implementation of plans, programs, projects, and major activities based on Key
Performance Indicators (KPIs); and

5. Provide technical assistance and capacity building support to division and school operating
units on the management and conduct of M&E within their respective M&E systems.

Other Operating Units

All the operating units in the school division M&E system shall:

1. Establish a results-based M&E within their respective offices;

2. Adhere to the established M&E standards in performing M&E activities and processes;

3. Partake in strengthening the horizontal integration in the school division M&E system by
engaging other school division operating units during development of local education plans
and programs, implementation of national education policies and systems, and M&E;

4. Provide feedback, insights, lessons, and other issues gathered from their respective M&E
activities to relevant regional and school division operating units;

5. Participate in school division M&E initiatives such as PIRs, periodic reporting of
accomplishments of plans, programs, projects, and major activities, and submission of
O/IPCRF, among others; and

6. Apply M&E results in improving office and individual performance

SCHOOLS

The school system shall make the teaching and learning process more learner-centered and school-
based management more effective and inclusive. This promotes a culture of self-assessment and
self-improvement among schools, transforming them into responsive and nimble organizations. It is a key
support system which shall allow school heads to create and sustain a school environment that
empowers teachers to collaborate in fostering an effective and inclusive school. School M&E should
provide the platform for shared governance, which is a critical component in developing,
implementing, and sustaining effective and inclusive schools.

School-based M&E will provide school heads, teachers, non-teaching staff, and communities with
critical insights, lessons, and timely information on the performance of all learners, their needs, as
well as barriers preventing active participation in the teaching and learning process.

In terms of M&E, schools are expected to:

1. Ensure the periodic conduct of M&E in all school operations and processes in accordance
with existing standards;

2. Track operational bottlenecks and issues to update, calibrate, and differentiate response
every school year and regularly examine and customize teaching strategies;

3. Formalize interface between and among school head, teachers, and non-teaching staff to
discuss operational issues and challenges;

4. Facilitate participation of learners, communities, and other key stakeholders in the


exchange of information, practices, insights, lessons and issues;

5. Ensure the conduct of quarterly PIR (MEA) to track physical and financial accomplishments
and assess the progress of implementation of plans, programs, projects, and major activities
based on key performance indicators (KPIs);

6. Maintain records of M&E results and integrate such in the preparation of SIP, OPCRF, and
other school projects and programs;

7. Report to the SDO the M&E results for appropriate technical support; and

8. Link M&E results to the organizational and individual performance.

RESPONSIBLE OFFICES

School Head

The school head shall be the overall lead and process owner of the school M&E system. He/she shall
have the authority, accountability, and responsibility for ensuring that information generated from
the school M&E system is used in the development and implementation of plans, programs,
projects, and major activities to make the school more effective and inclusive. He/she shall:

1. Lead the institutionalization of the school M&E system;

2. Provide decisions and directions on school issues and matters arising from various M&E
activities such as school PIRs, stakeholders’ forum, inter-agency meetings, among others;

3. Communicate school concerns to the school division office during meetings, fora, or
conferences;

4. Oversee the conduct of M&E activities in the school and ensure that these are according to
established standards;

5. Engage different stakeholders in the conduct of school M&E activities such as the members
of the School Planning Team (SPT), School Governance Council (SGC), among others;

6. Conduct quarterly Program Implementation Review (PIR) in the school to track physical and
financial accomplishments and assess the progress of implementation of plans, programs,
projects, and major activities based on key performance indicators (KPIs);

7. Maintain records of M&E results and integrate such in the preparation of SIP/AIP, OPCRF,
and other school programs, projects, and major activities;

8. Prepare school M&E reports for dissemination to internal and external stakeholders such as
the School Report Card (SRC), Transparency Board, Learning Action Cells (LAC), among
others; and,

9. Determine additional performance indicators and other adjustments in the school M&E
plan as necessary.

Other School Personnel and Stakeholders


School personnel and other stakeholders shall:
1. Conduct school M&E activities in accordance with established M&E standards;

2. Discuss operational issues and challenges between and among school head, fellow
teachers, and non-teaching staff;

3. Provide feedback, insights, lessons, and other issues gathered from their respective M&E
activities to relevant school operating units, community members, and other key
stakeholders through dissemination of SRC, conduct of LAC sessions, and preparation of
Transparency Board;

4. Participate in school M&E initiatives such as PIRs, periodic reporting of accomplishments of
programs, projects, and major activities, submission of O/IPCRF, among others; and,

5. Apply M&E results in improving teaching-learning strategies and individual performance.

All DepEd operating units and personnel have the responsibility to perform M&E in accordance with
established standards and to partake in the operations of the M&E systems at their respective governance levels.

VERTICAL AND HORIZONTAL LINKAGE PER GOVERNANCE LEVEL

WHAT MATTERS MOST IN MONITORING AND EVALUATION SYSTEM FOR DEPED?

EDUCATION VALUE

Learning is at the core of the DepEd bureaucracy.

DepEd’s M&E focuses on Education Value, particularly in trying to achieve established global goals
like the Sustainable Development Goals (SDGs) as well as building on the Education for All (EFA)
gains.

At the level of the Leadership, the M&E system should be looking at the “big picture” which should
be focused on the relevant Sustainable Development Goals that contribute to education impacts
in the long-term and are aligned with the Government’s Philippine Development Plan (PDP).

With this, DepEd Leadership, through the Secretary (and by extension the Senior Officials and
Regional Directors), will set and provide leadership in terms of:

 Policy direction based on an articulated vision, mission and with coherent and clear
strategies
 Agenda and Program that is easily understood, subscribed to and followed by stakeholders
 Resources to deliver on that program (i.e. a budget that enables priorities to be properly
funded)
 Standards to determine the quality desired of the system and expected of all schools and
offices in the bureaucracy.

CUSTOMER VALUE
The primary “customers,” so to speak, are the Schools and the School Divisions that oversee them.

Learning is the “moment of truth” in Education. The moment of truth in Education can only happen
where LEARNING is occurring. For the most part, in the formal education system, this occurs in the
classroom, in the school, and/or in a learning center. Thus, success in Education must be measured
primarily on learning outcomes, not on outputs, and much less on inputs.

(Note that Inputs and Outputs are necessary but not sufficient to determine if an education system is
high-performing or not.)

This underscores the primordial role of operating units in DepEd. The following considerations are
emphasized:

 If Learning occurs primarily in schools or learning centers, the center of gravity for the
monitoring and evaluation of the system (M&E) will necessarily be schools divisions, schools
and learning centers.
 All other offices in the Department of Education system and bureaucracy play a support role
and should be aligned to try to enhance schools’ and centers’ ability to deliver education
services with quality (i.e. effectively and efficiently).
 At the level of the operating units (schools divisions, schools, learning centers), the M&E
system should be looking at outcomes (learning and schooling outcomes) which are spelled
out in the Education for All goals and indicators.

ACADEMIC SUPPORT UNITS

Academic Support units are those offices that provide support to schools and learning centers so
that they can provide better content, pedagogy and instructional methods, and educational
technologies and processes. The identified outputs of these offices are inputs to the schools’ delivery
part of the system.

At the level of academic support units, the M&E system should be looking at the quality of inputs
(i.e. effectiveness and relevance). These units are not operating units and are not responsible for
learning outcomes directly; rather they contribute by helping shape content and methods. The
measure of their success will be based on the quality of input they have in helping create an
enabling environment for learning to occur.

BUREAUCRATIC SUPPORT UNITS


Bureaucratic support units are those offices that can help streamline and deliver support services to
operating units (i.e. schools divisions, schools, learning centers). The functions delivered by these
units are generic to all government bureaucracies. (This is what distinguishes them from the
academic support units which offer functions specific to an educational bureaucracy.)

At the level of bureaucratic support units, the M&E system should be looking at process delivery (i.e.
efficiency and sustainability) of inputs. These offices are measured by how efficiently they deliver
on plans and targets (target versus actual + quality of service delivered).
Please refer to Annex B for a conceptual representation of the aforementioned discussion.

TOWARDS VERTICAL AND HORIZONTAL ALIGNMENT IN DEPED

At the core of the DepEd M&E system is the direction towards vertical and horizontal alignment
across governance levels.

Figure 11.0: Conceptual Representation of Vertical and Horizontal Integration
in DepEd

Seen as a system, a Results-Based M&E focused on learning outcomes and processes can be
arranged using a Logical Framework starting at the level of activities and inputs:

 Education requires direction (vision, mission), policy (including budgetary support), agenda
and program of work, and standards. These are provided by the Leadership group. IF
Leadership is provided, THEN a program of work can be embarked on and resources
allocated.

 IF resources are allocated and administered well (Input), THEN Output by bureaucratic
support offices can be generated (call this, output by support groups).

 IF outputs from bureaucratic support groups are done well, THEN a proper enabling
environment can be created for units supporting academic processes (outputs by academic
support groups).

 IF outputs from academic support groups are of quality and are timely, THEN operating units
can deliver on (learning) outcomes.

 IF (learning) outcomes can be positively realized, the education system would have had an
overall positive Impact on society which should conform with the vision and mission of the
leadership core.

Figure 12.0 Vertical and Horizontal Integration in M&E

Figure 12.0 presents the conceptual diagram of how vertical and horizontal alignment are
articulated in the context of DepEd.

PLAN AND BUDGET SYSTEMS VERTICAL AND HORIZONTAL INTEGRATION

Figure 13.0 Vertical and Horizontal Integration in the Planning and Budget
System

The Program Expenditure Classification (PREXC) serves as the prevailing budget framework of the
Department. This structure provides a clear link between and among DepEd’s strategies, programs,
budgets, and results by showing how investments under each program are linked to the attainment
of desired sectoral organizational outcomes. The financial management system of government
attempts to use the 6-Year Philippine Development Plan (PDP) as the premier planning document.
This is managed by the National Economic and Development Authority (NEDA), the national
planning body. The PDP sets the direction of government and serves as the policy compass of the
National Government. The translation of policy into programs and investment is contained in the
Public Investment Program (PIP). Both the PDP and PIP take on a rolling six-year horizon.

Annual budgets are intended to translate the PIP into yearly programs of work. For the DepED, the
DBCC (Development Budget Coordinating Committee) co-chaired by NEDA and the Department of
Budget and Management (DBM) has initiated a multi-year budget process which has led to higher
budget ceilings in subsequent years. It is therefore important for DepED to ensure that it has
longer-term enrolment projections to guide the multi-year budget process.

Internal Stakeholders
Central Office

 Controls DepED budget process completely


 Sets program priorities for the Department (Office of the Secretary)
 Defends the DepED budget before the DBCC
 Defends the DepED budget before Congress
 Prepares the National Basic Education Plan (NBEP)
 Conducts the quarterly Program Implementation Review (PIR)

Regional Office

 Consolidates division data into a regional submission


 Submits regional inputs to central office for budgeting
 Consolidates Schools Division Plans
 Prepares the Regional Basic Education Plan (RBEP)
 Conducts the quarterly Program Implementation Review (PIR)/RMEA

Division Office
 Submits division inputs re: budgeting to the regional office
 Budget submissions are largely driven by formulas based on enrolments
 Prepares the Division Education Development Plan (DEDP)
 Consolidates SIPs
 Conducts the quarterly Program Implementation Review (PIR)/DMEA
School
 No role in budgeting
 Prepares the School Improvement Plan (SIP)
 Conducts PIR/School MEA

External Stakeholders
NEDA
NEDA’s orientation is towards the national or macro economy; hence, it has little motivation to
look more closely at sub-regional matters. This could explain its inability to push the PDP more strongly
among the service delivery departments.

 Leads the PDP formulation process. Generally, since its inception in previous presidential
administrations, the PDP serves as the backbone of the Philippine governance system.

The task of integrating the substantive contents of the PDP is under the auspices of NEDA,
as the socio-economic planning agency of the Philippine government, in collaboration with
government departments and their corresponding bureaus. However, in terms of long-term
planning, the PDP may have limitations as the DepEd Secretary decides the department’s
priorities based on its mandate.
 Facilitates the preparation of the PIP to synchronize the investment programming
processes with the budget process.
 Spearheads the formulation of the TRIP as input for budget preparation.

 Updates PIP to contain rolling list of DepED PAPs.


DBM

The DBM is mandated under this Order and by subsequent issuances to promote the sound,
efficient and effective management and utilization of government resources (i.e., technological,
manpower, physical and financial) as instruments in the achievement of national socioeconomic and
political development goals.
 Facilitates the preparation of PIP in line with DepED’s prioritization, and financing of capital
investment and current operating expenditure requirements.
 In view of the efforts to synchronize the investment programming processes with the
budget process, NEDA issued on September 10, 2018 the joint call for the updating of the
2017-2022 Public Investment Program (PIP) and formulation of the Three (3)-Year Rolling
Infrastructure Program (TRIP) Fiscal Year (FY) 2020-2022, as input to the FY 2020 budget
preparation.
 The Updated PIP shall contain the rolling list of priority programs, projects and activities
(PPAs) to be implemented by national government agencies, government-owned and
controlled corporations (GOCCs) and government financial institutions (GFIs)
 Establishes departmental budget ceilings and can control funds needed to address
shortages and resource needs of DepEd.

VERTICAL AND HORIZONTAL INTEGRATION IN BASIC EDUCATION M&E

Figure 14.0 Vertical and Horizontal Integration in the Education Processes

Vertical Integration
M&E should focus on how schooling and learning indicators are supported by the academic support
groups (from below) and contribute to realizing the leadership goals of the top.

The details of the M&E system for the operating group (schools divisions, schools, learning centers)
should be focused on the Intermediate Outcomes (IO) of the BEMEF.

Horizontal Integration

M&E should focus on how the academic support groups can work with each other to provide
content and pedagogical support to the operating schools and learning centers. Some essential
points to consider:
 The details of these interactions should be focused on the Enabling Environment (EE) of
the BEMEF.
 Only IO and EE indicators should be in the core M&E system around learning
outcomes.
 All other M&E indicators or measures that are looking at bureaucratic processes are best
left to an M&E system that is more generic to the Philippine bureaucracy (i.e. based on a
DBM framework, notably the “enabling environment”).

MONITORING AND EVALUATION PROCESSES

M&E PROCESS

Figure 15.0 M&E Processes: Established Indicators → Data Collection → Data Synthesis → Reporting

M&E is usually an iterative process: although it involves the suggested steps above, those involved
often move back and forth between them rather than follow a strict sequence.

ESTABLISHED INDICATORS

 With the existing Intermediate Outcome and Enabling Environment indicators pre-
determined by BEMEF, the preparation of M&E is streamlined.
 Establishing Monitoring and Evaluation Plans involves concentrating on specific evaluation
questions, with a distinct monitoring focus and evaluation focus, corresponding sources of
data for monitoring, and a detailed methodology for evaluation.

Quantitative Indicators: objective; numerical; measure the scale.
Qualitative Indicators: subjective; can be numerical; measure perceptions, quality, and opinions.
The focus may be limited to particular aspects, as in the following example:

Table 3.0 M&E Questions Analytics

Example 1
 Descriptive (What is the current condition?): What is the current enrollment rate in a particular Division?
 Predictive (What may happen?): Based on a five-year trend, how many students are expected to enroll in the next school year?
 Prescriptive (What should be done?): This division will hire additional teachers for the next school year.

Example 2
 Descriptive (What is the current condition?): How many learner materials are available for this SHS strand these past school years?
 Predictive (What may happen?): With the anticipated increase of enrollment in this SHS strand, what learner materials and other resources can be made available?
 Prescriptive (What should be done?): The region will request procurement of learner materials and other resources with certain increments for the next school years.

Source: Adapted from Davenport (2017)
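
The predictive column above can be made concrete with a simple trend projection. The sketch below is a minimal illustration only: the enrolment figures are hypothetical, and an operating unit would substitute actual EBEIS/LIS data before using any projection for planning or budgeting.

# Minimal sketch (hypothetical figures): projecting next school year's enrolment
# from a five-year trend using a least-squares line, in the spirit of the
# "Predictive" column of Table 3.0.
import numpy as np

years = np.array([2014, 2015, 2016, 2017, 2018])
enrolment = np.array([51200, 52050, 52900, 53800, 54650])   # illustrative division totals

slope, intercept = np.polyfit(years, enrolment, deg=1)       # fitted trend line
projected_2019 = slope * 2019 + intercept

print(f"Average yearly increase: {slope:.0f} learners")
print(f"Projected enrolment for 2019: {projected_2019:.0f} learners")

A projection of this kind only extrapolates the recent trend; it should be read alongside known local factors (new housing, school openings, policy changes) before it informs the prescriptive step.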

Figure 16.0 Link between Key Evaluation Questions and Data Collection
and Analysis

Individuals and offices within DepEd can also consider the formulation of essential M&E
questions and the intended use of evaluation information. It is also essential to note the
relationship of these questions with other processes like data collection and data analysis.

DATA COLLECTION
There is a need to ascertain how to collect the information that you need. It is essential to prepare
an M&E data collection plan.

 Identify and validate data needs for Monitoring and Evaluation.
 Determine which data are readily available.
 Identify from whom, or from what office/unit, the information will be collected.
 Confirm additional types of data collection methods to be used.
 Identify the focus of each method, sampling approaches, resource requirements, and any
potential ethical concerns.
 Determine specifications for the development of data collection tools.
Those who will be involved with M&E should at least consider preparing a data collection plan
beforehand, or integrate this within program design and implementation during the program’s
development.

Table 4.0 Sample Data Collection Matrix

Evaluation Question: Effectiveness – To what extent did ALS learners increase their competencies when they participated in ALS classes?
 Focus of Evaluation: Learners’ characteristics; changes in competencies as a result of participation in ALS sessions
 Data Source: EBEIS; ALS learners’ profile
 Data Collection Technique: Review of learners’ records; review of A&E results; interviews; observation of ALS classes; learning center visits

Evaluation Question: Efficiency – Was the cost of curriculum review within budget?
 Focus of Evaluation: Costs against budget and areas where overruns or underspends occurred
 Data Source: Procurement records; disbursement records; liquidation records; financial reports
 Data Collection Technique: Secondary data review; key informant interviews; site visits

Source: Adapted from Arce (2001) and Markiewicz and Patrick (2016)
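
One practical way to operationalize such a matrix is to keep it as structured records rather than free text, so entries can be filtered by office or method and re-exported each planning cycle. The sketch below is a minimal, hypothetical illustration in Python; the field names simply mirror the columns of Table 4.0 and are not a prescribed DepEd format.

# Minimal sketch (illustrative): the data collection matrix kept as structured
# records and exported to CSV for sharing with the responsible offices.
import csv

data_collection_plan = [
    {
        "evaluation_question": "To what extent did ALS learners increase their competencies?",
        "focus_of_evaluation": "Learner characteristics; changes in competencies",
        "data_source": "EBEIS; ALS learners' profile",
        "data_collection_technique": "Records review; A&E results review; interviews; class observation",
    },
    {
        "evaluation_question": "Was the cost of curriculum review within budget?",
        "focus_of_evaluation": "Costs against budget; overruns or underspends",
        "data_source": "Procurement, disbursement, and liquidation records; financial reports",
        "data_collection_technique": "Secondary data review; key informant interviews; site visits",
    },
]

with open("data_collection_plan.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=data_collection_plan[0].keys())
    writer.writeheader()
    writer.writerows(data_collection_plan)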

For data collection, the following will be used in the context of DepEd:

Table 5.0 Common M&E Data Collection Approaches

Case Studies (Relative Cost: High)
 Description: Study of an event and how and why it occurred, through interviews, participant observation, and records, to explore a specific topic or event.
 Benefits: Provides a comprehensive examination of an issue.
 Disadvantages: Costly and narrow in focus (not possible to extrapolate to the larger population).

Focus Group Discussions (Relative Cost: High)
 Description: Interview with a group of stakeholders.
 Benefits: Can yield nuanced responses, insight into how opinions and behaviors are informed, and information about the intended users’ attitudes and beliefs; allows for more rapid collection of information than individual interviews.
 Disadvantages: FGDs are expensive and take time to plan and conduct; some groups may be difficult to direct; participants may give in to group dynamics and simply agree with the majority or an outspoken participant; the opinions of the group do not necessarily represent those of the larger population.

In-Depth Interviews (Relative Cost: Medium to High)
 Description: Semi-structured interviews with open-ended questions designed to elicit in-depth responses from participants.
 Benefits: Can be conducted in person or over the telephone; obtain detailed information and give the opportunity to ask follow-up questions.
 Disadvantages: Take time to plan, coordinate, and conduct; results are sometimes subjective and not necessarily representative of the population; depending on sample size, analysis can be time-consuming.

Records Review (Relative Cost: Low)
 Description: Physical and digital documents kept in storage for a set amount of time.
 Benefits: Does not require additional research.
 Disadvantages: Depending on when the information was collected, it may not be current.

Surveys (Relative Cost: Medium)
 Description: Structured questionnaires that include close-ended and some open-ended questions.
 Benefits: Can be administered in person, over the telephone, or online; cost-effective, quick, provide precise and easily analyzed data, and maintain the confidentiality of participants.
 Disadvantages: Response rate cannot be determined.

Source: Ohkubo et al. (2013)

No single method of data collection is likely to be suitable for M&E. In most instances, a combination of
both qualitative and quantitative information can be used. Selecting the right data to collect is key
to getting valid information that DepEd stakeholders will perceive as useful for decision making and
understanding the efforts of the Department.
The strength of evaluation findings usually lies in bringing together data from different sources.

DATA ANALYSIS AND SYNTHESIS

These involve sorting M&E data in different ways to develop or uncover new insights. The use of a
combination of quantitative and qualitative data collection and analysis techniques will improve an
evaluation by ensuring that the limitations of one data type are balanced by the strengths of another.
The integrative process may include the following:

 Analysis of data from routine monitoring.


 Analysis of data from periodic evaluation.
 Integrating data in reference to established indicators with corresponding targets.
 Integrating data in reference to criteria and standards based on established evaluation
rubrics.
 Interpreting mixed method, qualitative and quantitative information.
 Validating findings based on evaluative judgments against M&E indicators or evaluation
questions.
As a simple guide, remember the Pareto Principle (the 80/20 rule) when delving into data. That is,
80% of the effects are due to 20% of the causes. The value of the Pareto Principle is that it reminds
you to focus on the 20% that matters.
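
As a concrete illustration of the Pareto check, the sketch below ranks schools by a hypothetical dropout count and identifies the small subset that accounts for roughly 80% of the total; the school names and figures are invented for illustration.

# Minimal sketch (hypothetical counts): find the few schools that account
# for about 80% of an effect (here, dropouts) so follow-up can be focused.
dropouts_by_school = {
    "School A": 120, "School B": 95, "School C": 15, "School D": 10,
    "School E": 8,   "School F": 6,  "School G": 4,  "School H": 2,
}

total = sum(dropouts_by_school.values())
cumulative = 0
priority_schools = []

# Walk down the ranked list until roughly 80% of the total is covered.
for school, count in sorted(dropouts_by_school.items(), key=lambda kv: kv[1], reverse=True):
    priority_schools.append(school)
    cumulative += count
    if cumulative / total >= 0.8:
        break

share = len(priority_schools) / len(dropouts_by_school)
print(f"{share:.0%} of schools account for about 80% of dropouts: {priority_schools}")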
A necessary consideration is the creation of a database of the range of quantitative and qualitative
M&E data collected. Also, be mindful of the following:
 How will data be organized?
 Are separate tabulations from different locations or groups required?
 What statistical techniques, if any, will be used?
 How will the anecdotal, reported and narrative data be synthesized and analyzed?

Analyzing data to summarize findings and look for trends is necessary in M&E. The primary
consideration in terms of data analysis and synthesis is the type of data that you are using.

Quantitative Data

With quantitative M&E data, results can be analyzed using statistics. The following considerations
are highlighted.
 Levels of Measurement. How the values are assigned to attributes.

Quantitative data can be:

 Nominal. These can be classified into categories (e.g., Gender, Division, Region,
Ethnicity)
 Ordinal. These are variables measured in terms of ranking.
 Interval. These pertain to distance between values that are meaningful.
 Ratio. These refer to distance between values that are meaningful and have an absolute
zero.

 Approach.

 Descriptive Statistics. If the purpose is just to explore and describe M&E data.
 Inferential Statistics. If the purpose is to test or confirm hypothesized relationships
among M&E variables.

Statistical versus Practical Significance

There is a tendency to focus more on statistically significant results. However, practical significance
may also be explored when no statistically significant results are found in the analysis.
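
To make the descriptive/inferential distinction and the significance caveat concrete, a minimal sketch follows. The completion rates are hypothetical, and the inferential part assumes the SciPy library is available; it is an illustration, not a prescribed procedure.

# Minimal sketch (hypothetical values): descriptive statistics on an indicator
# for two groups of schools, plus a simple inferential test of the difference.
import statistics
from scipy import stats   # assumed available; used only for the t-test

group_a = [88.2, 90.1, 85.4, 91.3, 87.8, 89.0]   # illustrative completion rates (%)
group_b = [82.5, 84.0, 86.1, 81.7, 83.9, 85.2]

# Descriptive: explore and describe the M&E data
print("Group A mean:", round(statistics.mean(group_a), 1), "stdev:", round(statistics.stdev(group_a), 1))
print("Group B mean:", round(statistics.mean(group_b), 1), "stdev:", round(statistics.stdev(group_b), 1))

# Inferential: test whether the difference in means is statistically significant
t_stat, p_value = stats.ttest_ind(group_a, group_b)
print("t =", round(t_stat, 2), "p =", round(p_value, 4))
# A small p-value suggests statistical significance; whether the gap matters in
# practice (practical significance) remains a separate judgment, as noted above.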

Qualitative Data
Qualitative data provide a richer depiction of M&E information. Narrative or graphic information is
easier to explore and understand as a direct representation of M&E data.

Certain considerations can also be reviewed.

 Preparation of qualitative data
 Transcription of FGD and/or interview data.
 Review of data to set up a preliminary coding template with cases and variables.
 Review of field notes for contextual information.
 Approaches
 Coding
 This involves interpreting the subjective meaning of texts, pictures, and other artifacts.
 Systematically arranging qualitative data to establish an array of codes.
 Thematic analysis
 Developing a hierarchical outline of codes, categories, sub-themes, and
themes.
 Qualitative data analysis through number conversion (a minimal sketch follows below)
 Themes and codes can be converted into numerical values, and then
analyzed using selected quantitative data analysis approaches.
 Number of participants and contextual information can also be utilized.
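
A minimal sketch of the number-conversion idea follows; the respondent groups and codes are hypothetical, and the counting is deliberately simple so that it can be reproduced in a spreadsheet as well.

# Minimal sketch (hypothetical codes): converting coded FGD segments into counts
# so that themes can be compared across stakeholder groups.
from collections import Counter

# Each entry is (respondent group, code assigned during thematic analysis)
coded_segments = [
    ("teachers", "lack of materials"), ("teachers", "large class size"),
    ("teachers", "lack of materials"), ("parents",  "distance to school"),
    ("parents",  "lack of materials"), ("learners", "large class size"),
    ("learners", "lack of materials"), ("learners", "distance to school"),
]

overall = Counter(code for _, code in coded_segments)
by_group = Counter((group, code) for group, code in coded_segments)

print("Most frequent themes overall:", overall.most_common())
print("Theme counts per group:", dict(by_group))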

The following are some common approaches that may be used by DepEd:

Table 6.0 Common M&E Data Analysis Approaches

Association and correlation analysis
 Potential Use: Determine relationship/s of two or more indicators, and/or of indicators with specific contextual factors.
 Potential Action: Prioritization of specific inputs by indicator.

Cluster analysis
 Potential Use: Cluster specific actor, resource, and learning environment characteristics.
 Potential Action: Leverage cluster results to address specific patterns.

Cohorts analysis
 Potential Use: Identify the specific learner or actor profile that drives a specific outcome.
 Potential Action: Identify factors that affect specific profile outcomes.

Sentiment analysis
 Potential Use: Establish a sentiment score for each stakeholder segment.
 Potential Action: Leverage real-time sentiment scores to take immediate actions.

Text mining
 Potential Use: Mine stakeholder comments to flag problem situations and specific issues.
 Potential Action: Locate unsatisfied stakeholder segments to prioritize specific approaches.

Trend analysis
 Potential Use: Perform trend analysis to ascertain variables that are highly correlated to a specific indicator or outcome.
 Potential Action: Flag critical areas and take immediate corrective actions.
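
As an illustration of the association row above, the sketch below computes a simple correlation between two indicators across divisions. The figures are hypothetical, and a correlation on its own does not establish causation; it only flags a relationship worth examining.

# Minimal sketch (hypothetical division-level figures): association between an
# input indicator and an outcome indicator, in the spirit of Table 6.0.
import numpy as np

teacher_learner_ratio = np.array([1/45, 1/40, 1/38, 1/35, 1/32, 1/30])   # teachers per learner
mean_assessment_score = np.array([48.0, 50.5, 52.0, 54.5, 56.0, 58.5])   # illustrative mean scores

r = np.corrcoef(teacher_learner_ratio, mean_assessment_score)[0, 1]
print(f"Pearson correlation: {r:.2f}")
# A strong positive value would suggest that divisions with more teachers per
# learner tend to post higher mean scores; causal claims need further evaluation.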

Technological Option in Data Analysis


The proliferation of digital tools has contributed to the improvement of the data analysis process. The
choice of specialized software tools depends on the following considerations:

 Individuals’ inclination to learn new software
 Financial and funding sources
 Degree of analysis needed
Some common quantitative tools are:

 Dedoose
 IBM Statistical Package for Social Science (SPSS)
 SAS
 STATA
Common applications for non-numeric data include:

 ATLAS.ti
 Dedoose
 HyperRESEARCH
 Provalis Research QDA Miner
 QSR International’s NVivo
 The Ethnograph
 Verbi Software MAXQDA

COMMUNICATING M&E RESULTS

Reporting and dissemination of the M&E findings are the culminating components of the M&E
process. This is the stage where information, based on both raw and processed data, is shared.
It is essential in providing program implementers, as well as internal and external stakeholders, an
opportunity to inform themselves of progress, problems, difficulties encountered, successes, and
lessons learned during implementation.

Reporting can take many forms, such as:

 Formal M&E Report

 Narratives
 Graphics
 Presentation

Reporting and Dissemination

 Understand the purpose of the report and M&E dissemination. This is essential as this would
guide how to prepare the report and its corresponding content.

 Determine the reader/audience. Knowing the reader/audience helps with the essential
elements of framing and assumptions.

This is also helpful in taking into account their expectations and what elements of the
findings should be highlighted.

 Share positive and negative findings. Sharing good results is necessary. However, it is also
essential to present challenges and what did not work as these may provide potential for
learning and improvement.

Some Considerations

 Make the report as concise and direct as possible, focusing on the information that needs to be
conveyed.

 Highlight and concentrate on the results.

 Discuss data sources, collection methods and how data were analyzed.

 Be consistent with terminology/ies, acronyms and definition of terms.

 Present complex data using charts, drawings, figures, frameworks, infographics, and tables.

DATA MANAGEMENT

Data management is an often neglected aspect of M&E. It covers the processes and systems by which
DepEd will systematically and reliably store, manage and access M&E data. It is a
critical part of the M&E system, linking data collection with its analysis and use. Poorly managed
data wastes time, money and resources; lost or incorrectly recorded data affects not only the
quality and reliability of the data but also all the time and resources invested in its analysis and use.

Data management should be timely and secure, and in a format that is practical and user-friendly. It
should be designed according to the needs, size and complexity. Typically, M&E data management
should be a component of DepEd’s larger data management system and should adhere to any
established policies and requirements. This is essential because:

 It is a foundation of monitoring, implementation and evaluation.

 It ensures data quality and integrity

 It provides protection against violations of rights.

 It fosters clear documentation, transparency, accountability, learning and continuous
improvement.

 It helps in systematizing the retrieval of data needed.

Key Points to Consider

 Content of information to be stored. The amount of data to be stored is an important
consideration, as this defines what hardware tools to utilize. Moreover, documentation of content
is necessary for easy referencing and retrieval.

 Data format. Standardized formats and templates improve the organization and storage of
M&E data. Generated M&E data are differentiated into quantitative and qualitative data. How
numbers, text, and their combinations are stored is essential in managing the database
of information. Some formats are:

o Audio like recordings of interviews and M&E activities


o Descriptive (Anecdotal and narrative reports, physical and digital checklists and forms)
o Numerical contained in physical records, spreadsheets and database sets
o Visuals like videos, graphs, maps and diagrams

 Data management systems. Storage on physical hardware on-site in DepEd, mirrored DepEd
servers, and cloud-based storage are important considerations as well.

 Frequency of retrieval. It helps in systematizing the retrieval of data as well as improving the data
collection process. (A minimal sketch of a simple M&E data store follows below.)
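
A minimal sketch of what such a standardized store might look like is given below, using Python's built-in SQLite module. The table and field names are illustrative assumptions, not a prescribed DepEd schema; the point is that standardized fields make records easy to retrieve by indicator and period.

# Minimal sketch (illustrative schema): a small local store for M&E records
# with standardized fields, supporting retrieval by indicator and period.
import sqlite3

conn = sqlite3.connect("me_records.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS me_record (
        id           INTEGER PRIMARY KEY,
        indicator    TEXT NOT NULL,   -- e.g. an IO or EE indicator name
        office       TEXT NOT NULL,   -- collecting operating unit
        period       TEXT NOT NULL,   -- e.g. 'SY 2018-2019'
        value        REAL,            -- quantitative value, if any
        narrative    TEXT,            -- qualitative note, if any
        source       TEXT,            -- data source or instrument used
        collected_on TEXT             -- date of collection (ISO format)
    )
""")
conn.execute(
    "INSERT INTO me_record (indicator, office, period, value, source, collected_on) "
    "VALUES (?, ?, ?, ?, ?, ?)",
    ("Completion rate", "Sample SDO", "SY 2018-2019", 89.5, "EBEIS", "2019-01-10"),
)
conn.commit()

# Retrieval by indicator supports the quarterly and annual reporting cycles.
rows = conn.execute(
    "SELECT office, period, value FROM me_record WHERE indicator = ?",
    ("Completion rate",),
).fetchall()
print(rows)
conn.close()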

Table 7.0 M&E Processes

Indicators Review
 Frequency: Annually; every 3 years
 Data Requirements: Existing indicators and targets
 Approaches: Meta-review of reports; quantitative review of indicators versus targets
 Process Owner: Central Office
 Submitted To: Oversight agencies

Progress Monitoring
 Frequency: Bi-annually
 Data Requirements: Region data; SDO data; district data; schools data
 Approaches: Synthesis of regional data; triangulation using online survey of selected schools, schools divisions, and regions, KIIs, FGDs, and field observation
 Process Owner: Regional Offices
 Submitted To: Central Office

Project Implementation Review
 Frequency: Quarterly
 Data Requirements: Region data; SDO data; district data; schools data
 Approaches: Synthesis of regional data
 Process Owner: Regional Offices
 Submitted To: Central Office

Outcome Mapping
 Frequency: Annually
 Data Requirements: SDO data; district data; schools data
 Approaches: Review of specific outcomes based on indicators
 Process Owner: Regional Offices; Division Offices
 Submitted To: Central Office; Regional Office

Outcome Evaluation
 Frequency: Every 3 years; end of administration
 Data Requirements: Region data; SDO data; district data; schools data
 Approaches: Desk review of existing database; desk review of policies; meta-evaluation of monitoring and evaluation reports; survey of selected segments; FGDs and KIIs
 Process Owner: Independent evaluation consultant; Central Office
 Submitted To: Central Office; oversight agencies; international partners

REFERENCES

Amerasinghe, N. (2015). Design, Appraisal, and Management of Sustainable Development Projects.


Makati City: Asian Institute of Management.

Arce, W. (2001). Systematic Qualitative Data Research: An Introduction to Filipino Practitioners.


Quezon City: Office of Research and Publications, Ateneo de Manila University.

Davenport, T. (2017). Competing on Analytics. Cambridge: Harvard Business Press.

Department of Budget and Management (DBM) and National Economic and Development
Authority (NEDA). (2015). Joint Memorandum Circular on the National Evaluation Policy
Framework of the Philippines No. 2015-01. Manila: DBM and NEDA.

Doerr, J. (2018). Measure What Matters. New York: Penguin.

Glennerster, R. and K. Takavarasha. (2013). Running Randomized Evaluations: A Practical Guide.
Princeton, NJ: Princeton University Press.

IFC Advisory Services (2008). The Monitoring and Evaluation Handbook for Business Environment
Reform, IFC Business Enabling Environment Business Line, in association with GTZ and
DFID.

Kettner, P., R. Moroney, and L. Martin. (2017). Designing and Managing Programs: An Effectiveness-
Based Approach. Thousand Oaks: Sage.

Markiewicz, A. and I. Patrick. (2016). Developing Monitoring and Evaluation Frameworks. Thousand
Oaks: Sage.

Nishishiba, M., M. Jones and M. Kraner. (2014). Research Methods and Statistics for Public and
Nonprofit Administrators: A Practical Guide. Thousand Oaks: Sage.

United Nations Development Programme (UNDP). (2002). Handbook on Monitoring and Evaluation
for Results. New York: Evaluation Office.

United Nations Development Programme (UNDP). (2009), Handbook on Planning, Monitoring and
Evaluating for Development Results. New York: Evaluation Office, Operations Support Group,
and Development Group Bureau for Development Policy.

United Nations Educational, Scientific, and Cultural Organization (UNESCO). (2007). Evaluation
Handbook. Paris: Internal Oversight Service.

United Nations Educational, Scientific, and Cultural Organization (UNESCO). (2009). Education
Indicators: Technical Guidelines. Paris: UNESCO Institute for Statistics.

Wholey, J. H. Hatry, and K. Newcomer (Eds). (2015). Handbook of Practical Program Evaluation. San
Francisco: Jossey-Bass.
World Bank (WB). (2009). Making Monitoring and Evaluation Systems Work: A Capacity Development
Toolkit. Washington, DC: World Bank.

ANNEX A: GLOSSARY

For the purpose of this Manual, the following terms are defined and understood as follows:

TERM DEFINITION

Accountable office Any decision-making unit/s of DepEd at the national, regional, school division, or school
levels in charge of providing directives and determining strategies to achieve agency
performance targets (BEMEF Policy, 2017).

Activity Output-driven undertaking that has a specific calendar schedule and resource
assignments. The objectives of an activity typically have corresponding tangible and
quantifiable outputs.

An activity must be anchored on a program/project, and consistent with DepEd's vision


and mission. It is significantly limited in scope and much shorter in lifespan than either a
program or project. An activity may be initiated by the Central Office (CO), Regional
Offices (ROs), School Division Offices (SDOs), schools, or by external partners and
involves fewer people. An activity is generally evaluated against the expected/identified
outputs immediately after it has ended (BEMEF Policy, 2017).
Assessment An activity that is intended to gather information needed for project design and
development. Information may include the needs and risks of a particular client
segment/population, specific context in particular governance level, as well as resources
(financial, human and material) available.

Attribution This refers to ascribing causal link between observed changes and a specific
intervention/initiative/project/program.
Audit Any independent, objective quality assurance activity designed to add value and improve
operations and/or performance.

Basic Education Plan This refers to strategic, medium-term, and operational plans developed by DepEd
operating units across governance levels (BEMEF Policy, 2017).
Client The primary clients of DepEd are the learners or students (ESIP Guidebook, 2015).

Counterfactuals In evaluations, the counterfactual is the value of the outcome for a treatment group in
the absence of the intervention/project/program.
Database A structured set of data and information of each DepEd operating unit gathered from its
M&E activities that is easy to access, manage, and update (BEMEF Policy, 2017).

DepEd Operating Unit Any DepEd unit across governance levels which provides support and/or implement
programs, projects, and major activities relative to the delivery of basic education in line
with the provisions of R.A. 9155 (BEMEF Policy, 2017).

Effectiveness The extent to which an intervention/project/program has achieved established objectives.

Efficacy The extent to which an intervention/project/program has attained its expected results.

Efficiency The extent to which the program has converted or is expected to convert its
resources/inputs (such as finances, expertise, time, etc.) economically into results in order
to achieve the maximum possible outputs, outcomes, and impacts with the minimum possible inputs.

Evaluation An assessment, as systematic and objective as possible, of an ongoing or completed


project, program or policy, its design, implementation and results (OECD-DAC from
NEDA, 2015).

An assessment, as systematic and impartial as possible, of an activity, project, program,


strategy, policy, topic, theme, sector, operational area, institutional performance, etc.
(UNEG from NEDA, 2015).

Formative Evaluation A type of evaluation intended to improve the performance of an intervention, initiative,
project, or program. This is conducted during the design and pilot-testing,
but it can also be conducted early in the implementation phase.

Governance Levels This refers to the structure of the whole Department of Education representing vertical
roles and responsibilities in operating units from the Central Office, to the Regional
Offices, Division Offices and Schools.
Impact The long-term, cumulative effect of initiatives, interventions, projects and programs over
time on what they ultimately intend to change or achieve.
Impact Evaluation Assessment of the direct and/or indirect, positive and/or negative, as well as intended
and/or unintended consequences of an intervention/project/program.
Indicator Quantitative or qualitative factor or variable that provides a simple and reliable means to
measure achievement, to reflect the changes connected to an intervention, or to help
assess the performance of a program, project, and/or activity. Indicators are specific, tangible, or
quantifiable measures of accomplishments or, if unquantifiable, qualitative
specifications of achievements (BEMEF Policy, 2017).
Information System An organized system for the collection, organization, storage, and communication of
information (BEMEF Policy, 2017).
Input The financial, human, and material resources needed to produce an output (BEMEF Policy,
2017).

Monitoring Periodic or routine tracking and reporting of priority information about a project or
program, its inputs and intended outputs, outcomes and impacts.

M&E System A set of organizational structures, management processes, standards, strategies, plans,
indicators, information systems, reporting lines and accountability relationships which
enables offices across governance levels to perform their M&E functions effectively
(BEMEF Policy, 2017).

M&E Tools Instruments used to collect information during the conduct of monitoring and evaluation
(BEMEF Policy, 2017).
Objective A statement of a desired intervention, project or program result that meets the criteria like
being Specific, Measurable, Achievable, Realistic, and Time-bound (SMART) or Frequently-
Discussed, Ambitious, Specific, Transparent (FAST).

Outcome Intermediate effects of output/s on clients (BEMEF Policy, 2017).

Outcome Monitoring Assessing the intended results or intermediate effects of output/s on clients.

Outcome Evaluation Focuses on changes in comprehension and behaviors of the target group that result from
program activities and/or output/s, and may include short-term, medium-term, and long-term
results.
Output Products and services produced through utilization and processing of inputs (BEMEF
Policy, 2017).
Overall Lead The staff with the overall authority, accountability, and responsibility for the M&E system
at each governance level (BEMEF Policy, 2017).
Performance The degree to which a project/program or DepEd operating unit operates according to
specific criteria/standards/guidelines or achieves results in accordance with stated goals or
plans.

Portfolio A combination of different programs, projects and initiatives handled by specific DepEd
operating units.

Portfolio Management The design, development, organizing and management of combination of projects and
programs.

Process Evaluation A type of evaluation that explores detailed information on how specific
intervention/project/program was delivered. It examines areas like differences between
the intended population and the population served, as well as access and participation.

Process owner The office who will oversee and manage the conduct of the M&E system per governance
level.

Program Strategic intervention anchored on DepEd’s mandate, goals, and national policies, the
implementation of which constitutes or supports the Department’s core business.
Project An intervention that is relatively narrower in scope compared to a program. A project
yields more immediate results for specific target groups.

A project may be a component of and/or anchored on a program, or independent of any


program. It may be designed or implemented in support of a program, to address
learners’ and other needs not covered by any program, or to experiment/try out/pilot an
innovation or an innovative solution (BEMEF Policy, 2017).
Program The main modality to measure the performance of programs, projects, and major
Implementation activities within and across the organization (BEMEF Policy, 2017).
Review

Program Management This refers to designing, developing, planning, organizing and managing the effort to
accomplish different sets or combinations of projects.

Progress Monitoring This is a systematic and objective assessment of an on-going implementation of plans,
programs, projects, and major activities (BEMEF Policy, 2017).

Project Management This refers to designing, developing, planning, organizing and managing the effort to
accomplish a successful project.

Quality Assurance This refers to planned and systematic processes concerned with assessing and improving
the merit or worth of a/an intervention/project/program through its compliance with given
existing guidelines and established standards.

Readiness Monitoring This is a quality assurance mechanism designed to ensure the availability of all inputs and
requirements necessary to start and sustain an efficient operation (BEMEF Policy, 2017).

Relevance The extent to which the objectives, outputs, or outcomes of an intervention are consistent
with beneficiaries’ requirements, DepEd’s policies, country needs, and/or global priorities.

Responsible office A DepEd operating unit at the national, regional, school division, or school level in charge
of executing tasks or deliverables (BEMEF Policy, 2017).
Results-Based M&E The continuous process of collecting and analyzing information to compare how well
DepEd programs, projects, and activities are performing against its expected outcome or
result (BEMEF Policy, 2017).
Results Evaluation This is an M&E approach that focuses on measuring the realization of results. It seeks to
assess the outcomes and changes brought about by program or project interventions
(BEMEF Policy, 2017).
Stakeholder A person, group, or entity with a direct or indirect role and interest in the goals or objectives of DepEd and in the implementation of its projects or programs.

Summative Evaluation A type of evaluation conducted at the end of an intervention (or a phase of that intervention) to determine the extent to which anticipated outcomes and/or objectives were achieved.

Sustainability The likelihood that the project or program will be continued in the future.

Target This is the desired measurable value for an indicator at a particular point in time.
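To make the relationship between indicators, targets, and results-based M&E concrete, the minimal sketch below computes an indicator's accomplishment rate by comparing its actual value against its target. It is written in Python purely for illustration; the function name, parameters, and example values are assumptions made for this sketch and do not refer to any existing DepEd system.

```python
# Minimal sketch, assuming a simple numeric indicator; all names are illustrative only.
def accomplishment_rate(actual: float, target: float) -> float:
    """Return the actual value as a percentage of the target value."""
    if target == 0:
        raise ValueError("Target must be a non-zero value.")
    return (actual / target) * 100

# Hypothetical example: an indicator with a target of 95 (e.g. a participation rate)
# and an actual reported value of 91.2 is 96% accomplished.
print(round(accomplishment_rate(actual=91.2, target=95.0), 1))  # 96.0
```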

ANNEX B: STRATEGIC M&E FRAMEWORK

ANNEX C: SAMPLE LOGICAL FRAMEWORK MODEL – VERTICAL AND HORIZONTAL LOGIC

[Project/Activity Name] LOGFRAME

OBJECTIVES | INDICATORS | MEANS OF VERIFICATION | ASSUMPTIONS
Goal
Outcome 1
Output 1.1
Output 1.2
Output 1.3
Activities | Inputs/resources | Costs & sources
(Activities may often be included in a separate document, such as an activity schedule, for practical purposes.)
Outcome 2
Output 2.1
Output 2.2
Output 2.3
Activities | Inputs/resources | Costs & sources
Outcome 3
Output 3.1
Output 3.2
Output 3.3
Activities | Inputs/resources | Costs & sources

Continue to add rows for outcomes, outputs, and activities as necessary.

Logical Framework (LogFrame) – Definition of Terms

Column headings:
OBJECTIVES – what we want to achieve
INDICATORS – how to measure change
MEANS OF VERIFICATION – where / how to get the information
ASSUMPTIONS – what else to be aware of

Goal
Objectives: The long-term results that an intervention seeks to achieve, which may be contributed to by factors outside the intervention.
Indicators (Impact Indicators): Quantitative and/or qualitative criteria that provide a simple and reliable means to measure achievement or reflect changes connected to the goal.
Means of Verification: How the information on the indicator will be collected (can include who will collect it and how often).
Assumptions: External conditions necessary if the goal is to contribute to the next level of intervention.

Outcomes (1)
Objectives: The primary result(s) that an intervention seeks to achieve, most commonly in terms of the knowledge, attitudes, or practices of the target group.
Indicators (Outcome Indicators): As above, connected to the stated outcome.
Means of Verification: As above.
Assumptions: External conditions not under the direct control of the intervention, necessary if the outcome is to contribute to reaching the intervention goal.

Outputs
Objectives: The tangible products, goods and services, and other immediate results that lead to the achievement of outcomes.
Indicators (Output Indicators): As above, connected to the stated outputs.
Means of Verification: As above.
Assumptions: External factors not under the direct control of the intervention which could restrict the outputs leading to the outcome.

Activities (2)
Objectives: The collection of tasks to be carried out in order to achieve the outputs.
Indicators (Process Indicators): As above, connected to the stated activities.
Means of Verification: As above.
Assumptions: External factors not under the direct control of the intervention which could restrict progress of activities.

(1) When there is more than one outcome in a project, the outputs should be listed under each outcome – see the example above.
(2) Activities may often be included in a separate document (e.g. activity schedule / Gantt chart) for practical purposes.
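Where a LogFrame is maintained electronically rather than on paper, the vertical logic described above (goal, outcomes, outputs, and activities, each paired with indicators, means of verification, and assumptions) can be represented as a nested data structure. The Python sketch below is illustrative only; the class and field names are assumptions made for this example and do not correspond to any official DepEd template or system.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative sketch of the LogFrame structure above; all names are assumptions.

@dataclass
class Activity:
    description: str                      # task carried out to achieve an output
    inputs_resources: str = ""            # inputs/resources (activity-level "indicators" column)
    costs_and_sources: str = ""           # costs & sources (activity-level "means of verification" column)
    assumptions: List[str] = field(default_factory=list)

@dataclass
class Output:
    statement: str                        # tangible product, good, or service
    indicators: List[str] = field(default_factory=list)
    means_of_verification: List[str] = field(default_factory=list)
    assumptions: List[str] = field(default_factory=list)

@dataclass
class Outcome:
    statement: str                        # primary result for the target group
    indicators: List[str] = field(default_factory=list)
    means_of_verification: List[str] = field(default_factory=list)
    assumptions: List[str] = field(default_factory=list)
    outputs: List[Output] = field(default_factory=list)       # outputs listed under each outcome
    activities: List[Activity] = field(default_factory=list)  # or kept in a separate activity schedule

@dataclass
class LogFrame:
    project_name: str
    goal: str
    goal_indicators: List[str] = field(default_factory=list)
    goal_means_of_verification: List[str] = field(default_factory=list)
    goal_assumptions: List[str] = field(default_factory=list)
    outcomes: List[Outcome] = field(default_factory=list)
```

Keeping outputs and activities nested under their outcome mirrors the convention in footnote (1), so the vertical logic can be traversed from the goal down to activities when preparing reports.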

ANNEX D: SAMPLE M&E PLAN TEMPLATE

NAME OF PROJECT:

RESPONSIBLE UNIT:

Indicator | Description | Data Sources | Frequency | Responsible Unit | Intended Audience

GOAL

Indicator 1.0

Assumption

OUTCOME 1.0

Indicator 1.1

Assumption

Indicator 1.2

Assumption

OUTCOME 2.0

Indicator 2.1

Assumption

Indicator 2.2

Assumption

Continue adding objectives, indicators, and assumptions as necessary.
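If the M&E plan is kept as a spreadsheet or flat file rather than as a table in a document, each indicator and its assumption can be stored as one row using the columns shown above. The Python sketch below shows one possible way to do this; the column labels follow the template, while the extra "Objective" column, the file name, and the example values are hypothetical.

```python
import csv

# Columns follow the M&E plan template above; the "Objective" column is an added
# assumption used to record which goal/outcome an indicator belongs to.
ME_PLAN_COLUMNS = [
    "Objective", "Indicator", "Description", "Data Sources",
    "Frequency", "Responsible Unit", "Intended Audience", "Assumption",
]

def write_me_plan(rows, path="me_plan.csv"):
    """Write M&E plan rows (dicts keyed by ME_PLAN_COLUMNS) to a CSV file."""
    with open(path, "w", newline="", encoding="utf-8") as handle:
        writer = csv.DictWriter(handle, fieldnames=ME_PLAN_COLUMNS)
        writer.writeheader()
        writer.writerows(rows)

# Hypothetical example row; all values are placeholders, not real plan content.
write_me_plan([{
    "Objective": "Outcome 1.0",
    "Indicator": "Indicator 1.1",
    "Description": "Definition and formula of the indicator",
    "Data Sources": "School reports / information system extracts",
    "Frequency": "Quarterly",
    "Responsible Unit": "Planning/M&E unit at the relevant governance level",
    "Intended Audience": "Management and M&E teams",
    "Assumption": "Source data are submitted on time",
}])
```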

