
Strategic Planning and Performance Evaluation:

Methodology, Systems and Management

Professor Tarun Das1

1. Introduction, Scope and Objectives

“Would you tell me please, which way I ought to go from here?” asked Alice.
“That depends a great deal on where you want to get to,” said the cat.
“I don’t much care where ….,” said Alice.
“Then it does not matter which way you go,” said the cat.

--- Extract from Alice in Wonderland, Lewis Carroll

Eventually, Alice realized that it matters a great deal to know "where to go" and "how to get there". Similarly, it is important for any agency to know its vision, mission, basic goals and objectives, the overall scope of its activities in terms of exact outputs and outcomes in the medium term, and how to achieve these goals, objectives, outputs and outcomes in a time-bound manner and at least cost. The Strategic Planning and Performance Management Cycle is an interactive, ongoing process that facilitates a sound business and financial plan for any agency.

2. Strategic Planning

A Strategic Business Plan (SBP) must focus on achieving a clear Mission, embedded in a realistic Vision, based on issues, objectives and strategies identified in collaboration with the major stakeholders. An SBP should not pair an impressive plan with unrealistic targets. It should emphasize a concrete plan of action and a strict implementation schedule.

In the short run, strategies need to be tailored to take advantage of institutional strengths and to work around weak institutions. In the medium and long run, however, emphasis should be placed on strengthening, replacing or even eliminating weak institutions.

An SBP needs to recognize that the global business environment is complex and fast changing, and that global public policy is an area of conflict and adversity. We need to understand the dynamics of both the internal and external environment and be prepared with an appropriate strategy to tackle any contingency.

An SBP also needs to recognize that policies and programs cannot be successful unless an agency is able to take the stakeholders along with it. Collaboration in an SBP is thus a deal that rewards all parties involved and creates win-win situations for all stakeholders.

1
The author is presently working as a Professor in the Institute for Integrated Learning for Management and as a Consultant in the World Bank Country Office for Uzbekistan at Tashkent. This paper was written when the author was working as the Asian Development Bank Strategic Planning Expert for the Government of Mongolia at Ulaanbaatar from June 2007 to July 2008.

SBPs have to be integrated fully with structural and governance reforms and capacity building. SBPs need to adopt a gradual, step-by-step, evolutionary and cumulative approach towards structural changes within an organization, and should avoid the temptation of adopting a Big Bang, shock-therapy, radical, fundamental or revolutionary approach, which may create islands of hostile stakeholders.

We do not advocate a small, shrunken and weak management; rather, we want the management to be strong enough to guide the transition process, to bring good governance reforms to their logical ends, to strengthen the existing strong and efficient institutions, and to repair the weak ones. Efficient strategic planning requires firm and consistent leadership by the management over a long period of time.

As per international best practices, a typical or stylized Strategic Business Plan adopts a top-down approach and has the following hierarchy of elements:

Table-1: Typical Structure of SBP – Top-Down Approach

Goals: Long-term, widespread results of programs and policies, such as the achievement of the Millennium Development Goals by 2015.
↓
Outcomes: Medium-term impact of programs and policies on the economy and user groups.
↓
Outputs: Deliverables, i.e. products and services produced during the program period. Outputs are the immediate or end results of programs or activities, whereas outcomes are the impact on markets and society even after completion of the project.
↓
Activities: Tasks undertaken by the staff to transform inputs into outputs.
↓
Inputs: Resources (personnel, finance, goods and services), which are the basic requirements for any output and for strategic planning.

3. Integration of Strategic Plan with Program Budget

To accomplish its strategic objectives effectively, an Agency must link outcomes, strategy, budget, and performance into an integrated management system. This management system, based on a model of continuous improvement, is shown in Flow Chart-1 below:

Flow Chart-1: Strategic Plan and Performance-Based Budgeting Cycles

The cycle runs through four stages and then repeats:

• Strategic Business Plan: broadly defines the strategic goals, outcomes and outputs of an Agency and the methods to achieve them.
• Program Budget and Performance Planning: program budgets allocate funds to specific programs to achieve the desired goals, outputs and outcomes at least cost; performance refers to specifically designed results; value means achieving value for money. Long-term and annual targets are established for spending, performance and value.
• Performance Monitoring: tracks the progress, expenditure, and value for money in achieving outcomes.
• Performance Assessment and Improvement Plan: compares actual performance to targets and benchmarks, and determines the changes that will produce the best value.

The process begins with an understanding of important national priorities and outcomes,
which are then translated into broadly defined goals and intended results for the Agency.
These become the Agency’s strategic goals and strategic objectives. Outcomes and
Outputs related to these strategic goals and objectives are then articulated. Programs
are developed to achieve these outcomes and outputs with least resources, and then
performance measures and indicators are identified to provide the means to assess the
progress made during the budget year and to suggest improvements for the next year’s
budget. Flow Chart-2 below explains the relationship between an Agency’s Medium
Term Strategic Business Plan and its Annual Program Budget and performance
evaluation.

Flow Chart-2: Integration of Strategic Business Plan, Program Budget and Performance Evaluation

The Medium Term Strategic Business Plan (one track) cascades as follows:
Vision, Mission, Strategy and Objectives → Strategic Goals → Performance Indicators and Measures → Strategic Outcomes and Outputs → Activities and Processes → Inputs (Staff, Funds, Goods, Services, ICT).

The Annual Program Budget (the parallel track) mirrors this cascade:
Resources for the Budget Year → Program Budgets → Program Outcomes and Outputs → Performance Indicators and Measures → Performance Monitoring and Evaluation through PART → Next Year's Budget Improvement Plan.

Each element of the strategic track is linked to its counterpart in the budget track.

4. Ideal Performance Evaluation Systems


4.1 Characteristics of an Ideal Performance Evaluation System

Paul G. Thomas (2005) has mentioned the following properties of an ideal performance measurement system:

• It has clearly defined purposes and uses.
• It focuses on outcomes, not just on inputs and outputs.
• It employs a limited, cost-effective set of measures.
• It uses measures which are valid, reliable, consistent, and comparable.
• It produces information which is relevant, meaningful, balanced and valued by the leaders/funders of the organisation.
• It is integrated with the planning and budgetary processes.
• It is embedded in the Agency, stable, and widely understood and supported.

In somewhat less abstract terms, the Canadian Comprehensive Auditing Foundation (CCAF) has developed nine principles to provide direction for performance reporting in Canada. Box-1 presents these nine principles. The first five principles provide guidance about what governments should report, while the remaining four relate to how governments report. The principles start as ideals, the 'ceiling' that reporting aspires to reach; over time they become 'standards', the floor below which reporting should not sink. Taken as a set, the nine principles are meant to provide a framework for performance reporting.

Box-1: Nine Principles of Better Performance Reporting

1. Focus on the few critical aspects of performance.
2. Look forward as well as back.
3. Explain key risk considerations.
4. Explain key capacity considerations.
5. Explain other factors critical to performance.
6. Integrate financial and non-financial information.
7. Provide comparative information.
8. Present credible information, fairly interpreted.
9. Disclose the basis for reporting.

Source: Canadian Comprehensive Auditing Foundation, Reporting Principles: Taking Public Performance Reporting to a New Level, Ottawa, 2002.

4.2 Various Approaches to Performance Measurement

Performance measurement has become so widespread that it is impossible to know all that is taking place within the public and private sectors around the world. There is no single 'one best' approach to performance measurement. An agency needs to develop an approach that fits its constitutional/legal requirements, institutional arrangements, political ideology, size, administrative and institutional capabilities, and, not least important, what it can afford. The general tendency for agencies has been to apply a single, uniform approach to all divisions and programs. This 'across-the-board' approach may have the apparent virtues of consistency, comparability and fairness, but it is not without problems.

It would be interesting to know how the system evolved in Canada over the past decade.
The federal and provincial governments in Canada developed two broad approaches to
performance measurement. In Alberta, Nova Scotia, Ontario and Quebec, governments
reported on the performance of the entire government in terms of the impacts of their
programs on the society. This ‘social-indicator’ type of approach supports external and
political accountability. However, the selection of indicators included in ‘report cards’ to
citizens was inherently arbitrary (Thomas 2006).

Other provinces and the Government of Canada began their performance measurement efforts by requiring individual departments to prepare business plans and performance reports. The 'business-line' approach is more of a managerial tool than something normally favoured by politicians and the public. However, these two broad approaches, viz. the system-wide and the business-line, can be pursued simultaneously and complement one another. This has been the evolution of the performance reporting system in the Government of Canada: it began by publishing performance reports on a departmental basis, and now more than 80 such documents for departmental and non-departmental bodies are tabled in Parliament on an annual basis.

Agencies have developed a number of frameworks to identify successful programs. Probably the most common framework involves the so-called 'three big Es': economy, efficiency and effectiveness, as described below.

Economy: Have inputs been acquired at least cost?
Efficiency: Are the inputs (people, money, supplies) being combined to produce the maximum volume of outputs (goods and services)?
Effectiveness: Are the goals of the organization/program being met, without undue unintended consequences?

These elements have been used in both public and private sector management over the past four or five decades. But the framework misses another important E, equity, which deals with the distributional impacts of performance. Omitting equity may have an adverse impact on another big E in government: electability.

In its earlier program evaluation scheme, the Government of Canada considered a program to be well performing when it was:

Relevant: Consistent with government and departmental priorities.
Successful: Achieves the intended outputs and outcomes.
Cost-effective: Involves the most efficient means to achieve goals.

This framework deals with the desirability of continuing a program, but does not address
the question: Does the organisation have the capability and capacity to deliver the
desired results?

Organisational report cards represent another type of performance measurement and reporting. Box-2 presents one interpretation of the requirements for such report cards.

Box-2: Organisational Report Cards – Criteria for Design

(a) Validity – satisfies legal requirements
(b) Comprehensiveness – covers all aspects of a budget
(c) Comprehensibility – easy to understand
(d) Relevance – appropriate for strategic objectives
(e) Reasonableness – can be achieved within time and at reasonable cost
(f) Functionality – operational and realistic

Source: William T. Gormley and David L. Weimer, Organisational Report Cards, Cambridge, Mass.: Harvard University Press, 1999, pp. 36-37.

In 1987 the Canadian Comprehensive Auditing Foundation (CCAF) published a report on 'organisational effectiveness' which mentioned the following attributes of an effective organisation management:

Box-3: Attributes of an Effective Organisation Management

(a) Relevance
(b) Appropriateness
(c) Achievement of purpose
(d) Acceptance
(e) Secondary impacts
(f) Costs and productivity
(g) Responsiveness
(h) Financial results
(i) Working environment
(j) Monitoring and reporting

Several Agencies have since applied this framework. But there may be conflict in practice among the attributes; for example, cost efficiency may reduce responsiveness. Besides, assigning weights to these attributes and constructing a weighted overall index is a challenging task.

The Office of the Auditor General of Canada (OAG) has recommended another
performance framework with six components of performance:

1. Mission statements
2. Results statements
3. Performance indicators/measures
4. Performance expectations/targets/commitments
5. Strategies/activities
6. Performance accomplishments/achievements

This framework emphasises the desirability of integrating performance planning, budgeting, monitoring and reporting, and also stresses external accountability for results.

4.3 Choice of Particular Performance Evaluation System

Regardless of the approach adopted, a sound performance measurement and evaluation system must have three qualities: it must be technically valid, it must be functional, and it must be legitimate. Table-2 presents one generic listing of the 'ideal' qualities of such a system.

Table-2: Characteristics of an Ideal Performance Measurement System

Clarity: Performance indices should be simple, well defined, and easily understood.
Consistency: Definitions of indicators should be consistent over time and across agencies.
Comparability: One should compare like with like.
Controllability: A manager's performance should only be measured for areas over which he/she has control.
Contingency: Performance is not independent of the environment within which decisions are made.
Comprehensiveness: Covers all aspects of a budget.
Boundedness: Considers a limited number of performance indices which provide the biggest pay-off.
Relevance: Performance indicators are relevant to the special needs.
Feasibility: Targets are based on realistic expectations.

Source: Peter M. Jackson, Measures for Success in the Public Sector.

4.4 Types of Performance Indicators

Performance indicators measure what an Agency did in the fiscal year. There are many kinds of performance indicators, ranging from the quantity or value of goods and services produced in a given period (such as the number of crimes or traffic violations detected by the police) to more complex indicators such as the efficiency and effectiveness of service delivery. Nayyer-Stone (1999) mentioned four primary types of performance indicators: input, output, outcome and efficiency, which are described in Table-3.

Table-3: Performance Indicators

Input Indicator: Measure of resources employed. Examples: employees required; goods and services used; equipment needed.
Output Indicator: Quantity of goods and services provided. Examples: number of projects; number of outputs; number of people served.
Effectiveness/Outcome Indicator: The degree to which the intended objectives of the services are being met. Examples: increase in literacy rate; increase in employment; decrease in crime rate; reduction of poverty; reduction of maternal and child mortality rates.
Efficiency Indicator: Cost per unit of output. Examples: cost per liter of water delivered; cost of garbage collected; cost per student in schools.

Source: Adapted from Hatry, Harry P. (1977).

4.5 Characteristics of Ideal Performance Indicators

Like any statistical measure, performance indicators must satisfy a number of criteria. In
general, an ideal performance indicator should be S.M.A.R.T. (i.e. simple, measurable,
achievable, relevant and timely) and C.R.E.A.M. (i.e. clear, relevant, economic,
adequate and monitorable).

Table-4: Ideal Properties of Performance Indicators

S.M.A.R.T.
• Simple – easily defined
• Measurable – easily quantifiable
• Achievable – can be achieved, not a wish list
• Relevant – appropriate for the strategic objectives
• Timely – can be achieved in time

C.R.E.A.M.
• Clear – precise, unambiguous, tangible and quantifiable
• Relevant – appropriate for the objectives being monitored
• Economic – available at reasonable cost and in time
• Adequate – provides a sufficient basis to assess performance
• Monitorable – amenable to impartial/objective evaluation

4.6 Use of Performance Measures

Performance measures can be used in several ways, including the following:

a) Controlling costs – enabling agencies to identify costs which are much higher or
lower than average and determine why these differences exist.
b) Comparing processes – analyzing performance of services provided with a
particular technology, approach or procedure.
c) Maintaining standards – monitoring service performance against established
performance targets or benchmarks.
d) Comparing sectors – comparing costs of public sector delivery to costs of private
sector delivery of the same type of goods and services.

5. Performance Evaluation Methodology

Performance Evaluation involves four main steps:

1. Summary of Baseline Scenario
2. Diagnostic Scan and SWOT Analysis
3. Budget Compliance, Efficiency and Effectiveness Evaluation
4. Performance Evaluation

5.1 Review of Strategic Plan and Baseline Profile

In undertaking a performance evaluation, it is necessary to start with a baseline and
initial profile and to identify the key issues on which the performance evaluation is to be
focused.

Table-5: Scope of Performance Review and the Initial Profile

Scope of Performance Review:
1. Strategic Business Plans
2. Scope of review
3. Review steps and key milestones
4. Preliminary assessment
5. Focus of review

Scope of Initial Profile (say, for the Budget year 2008):
1. Scope, governance, vision, mission and objectives
2. Main functions, programs and activities
3. Structure, staffing and time schedules
4. Program-wise budgeted funds
5. Output costs, benchmarks and performance parameters

This assessment allows an agency to take a detailed look at its current business activities and how it intends to perform in the budget review year. The various profit centers under an Agency will be asked to provide a brief description of their Strategic Business Plans, with vision, mission, objectives and goals. They will also be asked to provide a summary of the program budget under review, with budgeted resources, outputs and outcomes. Agencies will be required to provide details of workforce size, staff functions and skills, workload volume, and contributions to strategic planning.

5.2 Diagnostic Scan and SWOT Analysis

A diagnostic scan of an Agency is necessary before starting a performance review, because actual performance is influenced by constraints on resources, technical manpower and the ICT system. There are basically two types of review: strategic review and operational review.

Strategic Review: How well an Agency manages its external environment by delivering relevant and effective services.
Operational Review: How well an Agency manages its internal environment by using its resources efficiently and prudently.

Both desktop and field scans are required to determine the following aspects:
• whether best practice techniques were attempted;
• whether the practice was documented; and
• whether it was widely applied within the agency.

The desktop scan involves checking the existing material on strategic plans and program/output budgets already submitted by the Agency to the Ministry of Finance, whereas field scans involve conducting surveys and interviewing key stakeholders (clients, community groups, staff and management) to obtain their views on how internal management tools are working in practice.

There are 8 possible strategic review areas and 8 operational review areas as indicated
in Table-6.

Table-6: Strategic Review and Operational Review Areas

Strategic review areas:
1. Strategies
2. Environment
3. Clients
4. Other stakeholders
5. Regulation
6. Policy regime
7. Service delivery
8. Reviews

Operational review areas:
9. Work culture
10. Communications
11. Organization structure
12. Reporting lines
13. Human resources
14. Processes and systems
15. Controls
16. Cost and asset management

For each of these 16 areas, it is necessary to test whether the agency has applied typical best-practice management techniques. For example, when examining "Clients", agencies would be asked whether they have applied management practices such as client needs analysis, client segmentation, client satisfaction surveys, and grievance and complaint handling. When examining "Controls" and "Cost and Asset Management", agencies would be asked whether they use practices such as a financial information system, a management information system, an asset management plan and corporate overhead cost analysis.

Table-7: Typical Best Practices for Strategic Review

Strategic Review Areas (number of sub-areas) – Typical Best Management Practices
1. Strategies (2): Strategic Business Plan; Master Plan
2. Environment (2): Socio-Economic-Political Environment Analysis; SWOT Analysis
3. Clients (2): Client Needs and Satisfaction Surveys; Grievances and Complaints Handling
4. Other Stakeholders (2): Stakeholder Consultations; Focus Groups
5. Regulation (2): Regulatory Review; Parliamentary Consultative Committee Review
6. Policy (2): Ministerial Review; Donors Review
7. Service Delivery (2): Service Charter; Benchmarking
8. Review Plan (2): Performance Agreements; External Audits

Table-8: Typical Best Practices for Operational Review

Operational Review Areas (number of sub-areas) – Typical Best Management Practices
9. Work Culture (2): Code of Conduct; Regular Staff Meetings
10. Communications (2): Annual Report; Website for the Public
11. Organization Structure (2): Organization Chart; Job Descriptions
12. Reporting Lines (2): Delegation of Powers; Chinese Walls
13. Human Resources (2): HR Manual; Training and Development Programs
14. Processes and Systems (2): Rules and Procedure Manuals; ICT Development Plans
15. Controls (2): Financial Information System; Management Information System
16. Expenditure and Asset Management (2): Asset Management Plan; Agency Overheads Analysis

5.3 SWOT Analysis

After the diagnostic scan, a SWOT analysis may be undertaken to determine the strengths, weaknesses, opportunities and threats of the Agency.

5.4 Strategic and Operational Performance Evaluation

An agency's performance can be assessed in relation to the 16 performance factors listed in Tables 7 and 8. Each factor can be given a score on a scale of 0 to 5, using an approach adapted from the Australian Quality Council:

Table-9: Scores for Strategic and Operational Performance Evaluation

0 Approach has not been considered or attempted, or does not exist.
1 Some evidence of individual initiative and unsystematic efforts.
2 Approach is planned and introduced in some areas in a limited way.
3 Systematic approach has been implemented in some areas and results are under examination.
4 Approach has been implemented in some areas and results/outcomes have been used to improve planning and budgeting.
5 Approach is used in most areas and results/outcomes have been used to improve planning and budgeting.

Then a "Borda Index" (i.e. the sum of the scores for all factors) can be estimated. This provides a composite index for rating the performance of agencies. There are 32 (= 16 × 2) sub-areas, so an Agency can score a maximum of 160 marks (32 × 5). The total score can be expressed as a percentage of 160 marks. Percentages can also be calculated separately for strategic performance and operational performance, with the total marks for each category expressed as a percentage of 80 marks. It is most unlikely that an Agency will score 100%. On the basis of the percentage of marks, the strategic performance, the operational performance, or the combined strategic and operational performance of an Agency can be rated as follows (a worked sketch of this arithmetic appears after Table-10):

Table-10: Rating of an Agency on the Basis of Performance Scores

Rating of Agency – Range of Performance Scores (in percentage)
(a) Effective (EF): 85 – 100
(b) Moderately Effective (ME): 70 – 84
(c) Adequate (AD): 50 – 69
(d) Ineffective (IN): 0 – 49
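
The scoring arithmetic can be made concrete with a short script. The following is a minimal sketch in Python; the sub-area scores are invented for illustration, the rating bands come from Table-10, and the function names are hypothetical.

```python
# A minimal sketch of the Section 5.4 arithmetic: 16 review areas x 2
# sub-areas, each scored 0-5, summed into a "Borda Index" and expressed
# as a percentage of the maximum possible marks.

def borda_percentage(scores):
    """Return the sum of sub-area scores as a percentage of the maximum."""
    if any(not 0 <= s <= 5 for s in scores):
        raise ValueError("each sub-area score must lie between 0 and 5")
    return 100.0 * sum(scores) / (5 * len(scores))

def rate(percentage):
    """Map a percentage score to the Table-10 rating bands."""
    if percentage >= 85:
        return "Effective (EF)"
    if percentage >= 70:
        return "Moderately Effective (ME)"
    if percentage >= 50:
        return "Adequate (AD)"
    return "Ineffective (IN)"

# 16 strategic and 16 operational sub-area scores (invented for illustration).
strategic = [3, 4, 2, 5, 3, 3, 4, 2, 3, 4, 3, 2, 4, 3, 3, 4]
operational = [2, 3, 3, 4, 2, 3, 3, 4, 3, 2, 4, 3, 3, 2, 3, 4]

print(rate(borda_percentage(strategic)))                # out of 80 marks
print(rate(borda_percentage(operational)))              # out of 80 marks
print(rate(borda_percentage(strategic + operational)))  # out of 160 marks
```

The same rating function serves the separate strategic and operational percentages (each out of 80 marks) and the combined index (out of 160 marks).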

5.5 Compliance, Efficiency and Effectiveness Evaluation

Under compliance evaluation, program- and sub-program-wise budgeted expenditure is compared with actual expenditure, and the following marks are assigned to each program:

Table-11: Marks for Budget Compliance Evaluation

0 If actual expenditure exceeds budgeted expenditure by more than 10 per cent.
1 If actual expenditure exceeds budgeted expenditure by more than 7.5 per cent but less than 10 per cent.
2 If actual expenditure exceeds budgeted expenditure by more than 5 per cent but less than 7.5 per cent.
3 If actual expenditure exceeds budgeted expenditure by more than 2.5 per cent but less than 5 per cent.
4 If actual expenditure exceeds budgeted expenditure by less than 2.5 per cent.
5 If actual expenditure is within the budgeted expenditure.

Under efficiency evaluation, budgeted outputs are compared with actual outputs, and the
following marks are assigned to each program.

Table-12: Marks for Budget Efficiency Evaluation

0 If actual output falls short of budgeted output by more than 10 per cent.
1 If actual output falls short of budgeted output by more than 7.5 per cent but less
than 10 per cent.
2 If actual output falls short of budgeted output by more than 5 per cent but less
than 7.5 per cent.
3 If actual output falls short of budgeted output by more than 2.5 per cent but less
than 5 per cent.
4 If actual output falls short of budgeted output by less than 2.5 per cent.
5 If actual output is at least equal to the budgeted output.

Under effectiveness evaluation, budgeted outcomes are compared with actual outcomes, and the following marks are assigned to each program. However, one has to wait a number of years before the outcome results are available; therefore, for the next three years, effectiveness evaluation may not be feasible.

Table-13: Marks for Budget Effectiveness Evaluation

0 If actual outcome falls short of budgeted outcome by more than 10 per cent.
1 If actual outcome falls short of budgeted outcome by more than 7.5 per cent but less than 10 per cent.
2 If actual outcome falls short of budgeted outcome by more than 5 per cent but less than 7.5 per cent.
3 If actual outcome falls short of budgeted outcome by more than 2.5 per cent but less than 5 per cent.
4 If actual outcome falls short of budgeted outcome by less than 2.5 per cent.
5 If actual outcome is at least equal to the budgeted outcome.

After assigning marks for all sub-programs, the actual marks obtained for all programs of an Agency will be expressed as a percentage of the total possible marks; a sketch of this banding and aggregation follows.
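
The banding in Tables 11 to 13 follows one pattern: full marks on or within budget, descending marks as the adverse deviation grows. The sketch below assumes that adjacent bands meet at their stated thresholds (the tables leave exact boundary values, such as precisely 10 per cent, unassigned); all function and variable names are illustrative.

```python
# Hypothetical helper names; thresholds taken from Tables 11-13.

def band_marks(adverse_deviation_pct):
    """Map an adverse deviation (in per cent) to marks on the 0-5 scale.

    For compliance the deviation is the overspend (actual minus budgeted
    expenditure); for efficiency/effectiveness it is the shortfall
    (budgeted minus actual output/outcome). Zero or negative deviation
    means on or better than budget and earns full marks.
    """
    if adverse_deviation_pct <= 0:
        return 5
    if adverse_deviation_pct < 2.5:
        return 4
    if adverse_deviation_pct < 5:
        return 3
    if adverse_deviation_pct < 7.5:
        return 2
    if adverse_deviation_pct < 10:
        return 1
    return 0

def compliance_marks(actual_spend, budgeted_spend):
    return band_marks(100.0 * (actual_spend - budgeted_spend) / budgeted_spend)

def efficiency_marks(actual_output, budgeted_output):
    return band_marks(100.0 * (budgeted_output - actual_output) / budgeted_output)

# Aggregation: marks for all sub-programs as a percentage of total possible marks.
spends = [(104.0, 100.0), (98.0, 100.0), (111.0, 100.0)]  # (actual, budgeted)
marks = [compliance_marks(a, b) for a, b in spends]        # [3, 5, 0]
print(100.0 * sum(marks) / (5 * len(marks)))               # 53.33...
```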

5.6 Overall Assessment and Score

Thus, we have the following three broad evaluations:

(1) Strategic Plan and Baseline Profile Evaluation
(2) Strategic and Operational Performance Evaluation
(3) Compliance and Effectiveness Evaluation

For the overall assessment, a weight of 30 per cent may be given to strategic plan and baseline evaluation, a weight of 20 per cent to strategic and operational performance evaluation, and a weight of 50 per cent to budget compliance and effectiveness evaluation.

Table-14: Weights for Various Types of Evaluation

Type of Evaluation – Weight
1-A Strategic Plan Evaluation: 10%
1-B Systems Development: 10%
1-C Human Resource Development: 10%
2-A Strategic Performance Evaluation: 10%
2-B Operational Performance Evaluation: 10%
3-A Program Budget Compliance: 25%
3-B Program Budget Effectiveness: 25%
Total: 100%

Translating Performance Scores into Ratings: Finally, the overall performance scores
will be converted into qualitative ratings using the scoring bands given in the following
table:

Table-15: Rating of an Agency on the Basis of Overall Scores

Rating of Agency – Range of Performance Scores (in percentage)
(a) Effective (EF): 85 – 100
(b) Moderately Effective (ME): 70 – 84
(c) Adequate (AD): 50 – 69
(d) Ineffective (IN): 0 – 49

There will be another category called “Results Not Demonstrated” when an Agency
does not have performance measures that have been agreed by MOF either for
baselines or for the assessment year.

An Example:

To provide an example, let us assume that we are evaluating budget performance for three Agencies: A, B and C. The results are given in Tables 16 and 17.

Table-16: An Example of Performance Scores for Three Agencies

Type of Evaluation (Weight): Performance Scores (%) A / B / C; Weighted Scores A / B / C
1-A Strategic Plan Evaluation (0.10): 50 / 60 / 85; 5 / 6 / 8.5
1-B Systems Development (0.10): 50 / 50 / 75; 5 / 5 / 7.5
1-C Human Resource Development (0.10): 60 / 50 / 70; 6 / 5 / 7
2-A Strategic Performance Evaluation (0.10): 50 / 50 / 85; 5 / 5 / 8.5
2-B Operational Performance Evaluation (0.10): 60 / 70 / 70; 6 / 7 / 7
3-A Program Budget Compliance (0.25): 40 / 70 / 70; 10 / 17.5 / 17.5
3-B Program Budget Effectiveness (0.25): 40 / 70 / 70; 10 / 17.5 / 17.5
Total (1.00): weighted totals 47 / 63 / 73.5

On the basis of these scores, the Agencies would be graded as given in Table-17:

Table-17: Estimation of Overall Rating for Three Agencies

Type of Evaluation (Weight): Performance Scores (%) A / B / C; Rating A / B / C
1-A Strategic Plan Evaluation (0.10): 50 / 60 / 85; AD / AD / EF
1-B Systems Development (0.10): 50 / 50 / 75; AD / AD / ME
1-C Human Resource Development (0.10): 60 / 50 / 70; AD / AD / ME
2-A Strategic Performance Evaluation (0.10): 50 / 50 / 85; AD / AD / EF
2-B Operational Performance Evaluation (0.10): 60 / 70 / 70; AD / ME / ME
3-A Program Budget Compliance (0.25): 40 / 70 / 70; IN / ME / ME
3-B Program Budget Effectiveness (0.25): 40 / 70 / 70; IN / ME / ME
Total (1.00): 47 / 63 / 73.5; IN / AD / ME

Note: AD stands for Adequate, EF for Effective, IN for Ineffective and ME for Moderately Effective.
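
As a check on the example, the following sketch reproduces the weighted totals and overall ratings of Tables 16 and 17. The Table-14 weights and the Table-16 scores for the three hypothetical Agencies are hard-coded; the names are illustrative.

```python
# Weights from Table-14; scores (%) from Table-16 for Agencies A, B and C.
EVALUATIONS = [
    ("1-A Strategic Plan Evaluation", 0.10),
    ("1-B Systems Development", 0.10),
    ("1-C Human Resource Development", 0.10),
    ("2-A Strategic Performance Evaluation", 0.10),
    ("2-B Operational Performance Evaluation", 0.10),
    ("3-A Program Budget Compliance", 0.25),
    ("3-B Program Budget Effectiveness", 0.25),
]

SCORES = {  # one score per evaluation type, in the order listed above
    "A": [50, 50, 60, 50, 60, 40, 40],
    "B": [60, 50, 50, 50, 70, 70, 70],
    "C": [85, 75, 70, 85, 70, 70, 70],
}

def rating(pct):
    """Table-15 bands: EF >= 85, ME >= 70, AD >= 50, else IN."""
    if pct >= 85:
        return "EF"
    if pct >= 70:
        return "ME"
    if pct >= 50:
        return "AD"
    return "IN"

for agency, scores in SCORES.items():
    total = sum(w * s for (_, w), s in zip(EVALUATIONS, scores))
    print(agency, total, rating(total))
# Output: A 47.0 IN, B 63.0 AD, C 73.5 ME (matching Table-17)
```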

Selected References

Bergin, Jeffrey (2004) Performance Based Budgeting, Performance Management Institute.

Canadian Comprehensive Auditing Foundation (2002) Reporting Principles: Taking Public Performance Reporting to a New Level, Ottawa.

Das, Tarun (2007a) "Preparation of Strategic Business Plans – General Guidelines, Suggestions for Improvement, and Summary of Recommendations", Final Report, pp. 1-74, 30 September 2007.

Das, Tarun (2007b) Output Costing and Output Budgeting- Basic Concepts and
Methodology, pp.1-51, October 2007.

Das, Tarun (2007c) Accrual Accounting and Accrual Budgeting- Basic Concepts and
Methodology, pp.1-43, November 2007.

Das, Tarun (2007d) Transition from Cash to Accrual Accounting, pp.1-26, Nov 2007.

Das, Tarun (2007e) Benchmarks Setting and Best Practices for Output Costing and
Output Budgeting- Part-1: Basic Concepts, pp.1-31, Dec 2007.

Das, Tarun (2007f) Benchmarks Setting and Best Practices for Output Costing and
Output Budgeting- Part-2: Practical Applications for Mongolia, pp.1-36, Dec 2007.

Das, Tarun (2007g) Terminal Report: Part-1, Major Conclusions and Recommendations, pp. 1-70, and Part-2, Strategic Business Plans, Output Costing and Output Budgeting, Accrual Accounting and Accrual Budgeting, and Benchmarks Setting, pp. 71-157, ADB Capacity Building Project on Governance Reforms.

Government of Australia, Council on the Cost and Quality of Government (2001) Annual Report 2001, November 2001.

Government of Mongolia (2002) Public Sector Management and Finance Act (PSMFA,
27 June 2002).

Government of USA (2005) Performance Management Process: Strategic Planning, Budget and Performance Management Cycle, General Services Administration (GSA), Office of the Chief Financial Officer, 31 January 2005.

Hatry, Harry P. (1977) How Effective are your Community Services?, The Urban
Institute, Washington, D.C.

Kaplan, Robert S. and David P. Norton (1996) The Balanced Scorecard: Translating
Strategy into Action, Harvard Business School Press.

Mercer, John: Website on GPRA and Performance Management: www.governmentperformance.info

Mercer, John (2003) CASCADE Performance Budgeting: A Guide to an Effective System of Integrating Budget and Performance Information and for Linking Long-Term Goals to Day-to-Day Activities, USA, May 2003, www.governmentperformance.info

Meyers, Roy T. (1996) Is There a Key to the Normative Budgeting Lock?, The World Bank, Washington, D.C.

Schick, Allen (1995) Federal Budget: Politics, Policy and Process, Brookings Institution.

Steiner, George (1997) Strategic Planning: What Every Manager Must Know, Simon and Schuster.

Thomas, Paul G. (2004) Performance Measurement, Reporting and Accountability: Recent Trends and Future Directions, Saskatchewan Institute of Public Policy, Paper No. 23, February 2004; http://www.uregina.ca/sipp/

Thomas, Paul G. (2005) Performance Measurement and Management in the Public Sector, Optimum Online – The Journal of Public Sector Management, Vol. 35, Issue 2, July 2005; http://www.optimumonline.ca/

Thomas, Paul G. (2006) Performance Measurement, Reporting, Obstacles and Accountability: Recent Trends and Future Directions, Research School of Social Sciences, The Australian National University, Canberra ACT 0200.

USA (1993) Government Performance and Results Act (GPRA) of 1993, Office of
Management and Budget (OMB).

United States of America, Office of Management and Budget (OMB) Homepage: http://www.whitehouse.gov/omb/gils/gil-home.html

