“Would you tell me please, which way I ought to go from here?” asked Alice.
“That depends a great deal on where you want to get to,” said the cat.
“I don’t much care where…,” said Alice.
“Then it does not matter which way you go,” said the cat.
Eventually Alice in Wonderland realized that it matters a great deal to know “where to
go” and “how to get there”. Similarly, it is important for any agency to know its vision,
mission, basic goals and objectives, the overall scope of its activities in terms of exact
outputs and outcomes in the medium term, and how to achieve these goals, objectives,
outputs and outcomes in a time-bound manner and at least cost. The Strategic Planning
and Performance Management Cycle is an interactive, ongoing process that facilitates a
sound business and financial plan for any agency.
2. Strategic Planning
A Strategic Business Plan (SBP) must focus on achieving a clear Mission, embedded in
a realistic Vision, based on issues, objectives and strategies identified in collaboration
with the major stakeholders. An SBP should not be over-ambitious, pairing an
impressive plan with unrealistic targets. It should emphasize a concrete plan of action
and a strict implementation schedule.
An SBP needs to recognize that the global business environment is complex and fast
changing, and that global public policy is an area of conflict and adversity. We need to
understand the dynamics of both the internal and external environments and be
prepared with appropriate strategies to tackle any contingencies.
An SBP needs to recognize that policies and programs cannot succeed unless an
agency is able to carry its stakeholders along. Collaboration in an SBP is thus a deal
that rewards all parties involved and creates win-win situations for all stakeholders.
1 The author is presently working as a Professor in the Institute for Integrated Learning
in Management and as a Consultant in the World Bank Country Office for Uzbekistan at
Tashkent. This paper was written when the author was working as the Asian
Development Bank Strategic Planning Expert for the Government of Mongolia at
Ulaanbaatar during June 2007 to July 2008.
SBPs have to be integrated fully with structural and governance reforms and capacity
building. The SBPs need to adopt a gradual, step by step, evolutionary and cumulative
approach towards structural changes within an organization, and should avoid the
temptation of adopting a Big Bang, shock therapy, radical, fundamental or revolutionary
approach, which may create islands of hostile stakeholders.
We do not advocate a small, shrunken and weak management; rather, we want the
management to be strong enough to guide the transition process, to bring good
governance reforms to their logical ends, to strengthen the existing strong and efficient
institutions, and to repair the weak ones. Efficient strategic planning requires firm and
consistent leadership by the management over a long period of time.
Goals: Long term – the widespread results of programs and policies, such as the
achievement of the Millennium Development Goals by 2015.
↓
Outcomes: Medium term – the impact of programs and policies on the economy and
user groups.
↓
Outputs: Deliverables – products and services produced during the program period.
Outputs are the immediate or end results of programs or activities, whereas outcomes
are the impact on markets and society even after completion of the project.
↓
Activities: Tasks – undertaken by the staff to transform inputs into outputs.
↓
Inputs: Resources – (personnel, financial, goods and services) the basic requirements
for any output and for strategic planning.
Strategic Business Plan: broadly defines the strategic goals, outcomes and outputs of
an Agency and the methods to achieve them.
Performance Monitoring: tracks the progress, expenditure and value for money in
achieving outcomes.
The process begins with an understanding of important national priorities and outcomes,
which are then translated into broadly defined goals and intended results for the Agency.
These become the Agency’s strategic goals and strategic objectives. Outcomes and
Outputs related to these strategic goals and objectives are then articulated. Programs
are developed to achieve these outcomes and outputs with the least resources, and
performance measures and indicators are then identified to provide the means to assess the
progress made during the budget year and to suggest improvements for the next year’s
budget. Flow Chart-2 below explains the relationship between an Agency’s Medium
Term Strategic Business Plan and its Annual Program Budget and performance
evaluation.
Flow Chart-2: Strategic Business Plan → Next Year’s Budget
Paul G. Thomas (2005) has mentioned the following properties of an ideal performance
measurement system:
In somewhat less abstract terms, the Canadian Comprehensive Auditing Foundation
(CCAF) has developed nine principles to provide direction for performance reporting in
Canada. Box-1 presents these nine principles. The first five principles provide guidance
about what governments should report, while the remaining four relate to how
governments report. The principles start as ideals, the ‘ceiling’ that reporting aspires to
reach, but over time they become ‘standards’, the floor below which reporting should
not sink. Taken as a set, the nine principles are meant to provide a framework for
performance reporting.
It would be interesting to know how the system evolved in Canada over the past decade.
The federal and provincial governments in Canada developed two broad approaches to
performance measurement. In Alberta, Nova Scotia, Ontario and Quebec, governments
reported on the performance of the entire government in terms of the impacts of their
programs on the society. This ‘social-indicator’ type of approach supports external and
political accountability. However, the selection of indicators included in ‘report cards’ to
citizens was inherently arbitrary (Thomas 2006).
Other provinces and the Government of Canada began their performance measurement
efforts by requiring individual departments to prepare business plans and performance
reports. The ‘business-line’ approach is more a managerial tool than one that appeals
to politicians and the public. However, these two broad approaches, viz. the
system-wide and the business-line, could be pursued simultaneously and complement
one another. This has been the evolution of the performance reporting system in the
Government of Canada. It began by publishing performance reports on a departmental
basis, and now more than 80 such documents for departmental and non-departmental
bodies are tabled in Parliament on an annual basis.
Efficiency: Are the inputs (people, money, supplies) being combined to produce the
maximum volume of outputs (goods and services)?
Effectiveness: Are the goals of the organization/program being met, without undue
unintended consequences?
These elements have been used by both public and private sector management over
the past four or five decades. But the framework misses another important E, equity,
which deals with the distributional impacts of performance. Omitting equity may have an
adverse impact on another big E in government: electability.
This framework deals with the desirability of continuing a program, but does not address
the question: Does the organisation have the capability and capacity to deliver the
desired results?
In 1987 the Canadian Comprehensive Audit Foundation (CCAF) published a report on
‘organisational effectiveness’, which mentioned the following attributes of effective
organisational management:
Several Agencies have since applied this framework. But there may be conflicts in
practice among the attributes; for example, cost efficiency may reduce responsiveness.
Besides, assigning weights to these attributes and constructing a weighted overall index
is a challenging job.
The Office of the Auditor General of Canada (OAG) has recommended another
performance framework with six components of performance:
1. Mission statements
2. Results statements
3. Performance indicators/measures
4. Performance expectations/targets/commitments
5. Strategies/activities
6. Performance accomplishments/achievements
Table-2: Characteristics of an Ideal Performance Measurement System
Characteristic Definition
Performance indicators measure what an Agency did in the fiscal year. There are many
kinds of performance indicators ranging from quantities or value of goods and services
produced in a given period (such as the number of crimes or traffic violations detected
by the police) to more complex indicators such as the efficiency and effectiveness
of service delivery. Nayyer-Stone (1999) mentioned four primary types of performance
indicators: input, output, outcome and efficiency, which are described in Table-3.
Source: Adapted from Hatry, Harry P. (1977).
Like any statistical measure, performance indicators must satisfy a number of criteria. In
general, an ideal performance indicator should be S.M.A.R.T. (i.e. simple, measurable,
achievable, relevant and timely) and C.R.E.A.M. (i.e. clear, relevant, economic,
adequate and monitorable).
a) Controlling costs – enabling agencies to identify costs which are much higher or
lower than average and determine why these differences exist.
b) Comparing processes – analyzing performance of services provided with a
particular technology, approach or procedure.
c) Maintaining standards – monitoring service performance against established
performance targets or benchmarks.
d) Comparing sectors – comparing costs of public sector delivery to costs of private
sector delivery of the same type of goods and services.
In undertaking a performance evaluation, it is necessary to start with a baseline and
initial profile and to identify the key issues on which the performance evaluation is to be
focused.
Scope of Performance Review          Scope of Initial Profile (say, for the Budget year 2008)
1. Strategic Business Plans          1. Scope, governance, vision, mission and objectives
2. Scope of review                   2. Main functions, programs and activities
3. Review steps and key milestones   3. Structure, staffing and time schedules
4. Preliminary assessment            4. Program-wise budgeted funds
5. Focus of review                   5. Output costs, benchmarks and performance parameters
This assessment allows an agency to take a detailed look at its current business
activities and at how it wants to perform in the budget review year. The various profit
centers under an Agency will be asked to provide a brief description of their Strategic
Business Plans, with vision, mission, objectives and goals. They will also be asked to
provide a summary of the program budget under review, with budgeted resources,
outputs and outcomes. Agencies will be required to provide details of workforce size,
functions and skills, workload volume and contributions to strategic planning.
Strategic Review:
How well an Agency manages its external environment by delivering relevant and
effective services.
Operational Review:
How well an Agency manages its internal environment by using its resources
efficiently and prudently.
Both desktop and field scans are required to determine the following aspects:
whether best practice techniques were attempted;
whether the practice was documented; and
whether it was widely applied within the agency.
10
The desktop scan involves checking the existing material on strategic plans and
program/ output budgets already submitted by the Agency to the Ministry of Finance,
whereas field scans involve conducting surveys and interviewing key stakeholders
(clients, community groups, staff and management), to obtain their views on how internal
management tools are working in practice.
There are 8 possible strategic review areas and 8 operational review areas as indicated
in Table-6.
For each of these 16 areas, it is necessary to test whether the agency has applied any
typical best-practice management techniques. For example, when examining “Clients”,
agencies would be asked whether they have applied management practices such as
client needs analysis, client segmentation, client satisfaction surveys, and grievance
and complaint handling. When examining “Controls and Cost and Asset Management”,
agencies would be asked whether they use practices such as a financial information
system, a management information system, an asset management plan and corporate
overhead cost analysis.
7. Service Delivery (2) Service Charter, Benchmarking
8. Review plan (2) Performance Agreements, External Audits
After the diagnostic scan, a SWOT analysis may be undertaken to determine the
strengths, weaknesses, opportunities and threats of the Agency.
3 Systematic approach has been implemented in some areas and results are
under examination.
4 Approach has been implemented in some areas and results/outcomes have
been used to improve the planning and budgeting.
5 Approach used in most of the areas and results/outcomes have been used to
improve the planning and budgeting.
Then a “Borda Index” (i.e. the sum of ranks over all factors) can be estimated. This
provides a composite index for rating the performance of agencies. There are 32
(= 16 × 2) sub-areas, each scored on a scale with a maximum of 5 marks, so an Agency
can score a maximum of 160 marks. The total score can be expressed as a percentage
of 160 marks. Percentages can also be calculated separately for strategic performance
and operational performance; the total marks for each category are then expressed as a
percentage of 80 marks. It is most unlikely that an Agency will score 100 per cent. On
the basis of the percentage of marks, the strategic performance, operational
performance, or combined strategic and operational performance of an Agency could be
rated as follows:
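The percentage computation just described can be sketched as follows. This is an
illustrative sketch, not the official methodology; it assumes each of the 32 sub-areas
(16 strategic and 16 operational) is scored on a 0–5 scale.

```python
# Illustrative sketch of the composite (Borda-style) scoring described above.
# Assumption: each of the 32 sub-areas is scored 0-5, so the maximum is
# 160 marks overall and 80 marks per category.

def composite_percentages(strategic_marks, operational_marks):
    """Return strategic, operational and overall scores as percentages."""
    assert len(strategic_marks) == len(operational_marks) == 16
    s, o = sum(strategic_marks), sum(operational_marks)
    return {
        "strategic_pct": 100.0 * s / 80,     # 16 sub-areas x 5 marks = 80
        "operational_pct": 100.0 * o / 80,
        "overall_pct": 100.0 * (s + o) / 160,
    }
```

For instance, an Agency scoring 4 on every strategic sub-area and 2 on every
operational sub-area would score 80, 40 and 60 per cent respectively.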
Under efficiency evaluation, budgeted outputs are compared with actual outputs, and the
following marks are assigned to each program.
Table-12: Marks for Budget Efficiency Evaluation
0: If actual output falls short of budgeted output by more than 10 per cent.
1: If actual output falls short of budgeted output by more than 7.5 per cent but less than 10 per cent.
2: If actual output falls short of budgeted output by more than 5 per cent but less than 7.5 per cent.
3: If actual output falls short of budgeted output by more than 2.5 per cent but less than 5 per cent.
4: If actual output falls short of budgeted output by less than 2.5 per cent.
5: If actual output is at least equal to the budgeted output.
After assigning marks for all sub-programs, actual marks obtained for all programs of an
Agency will be expressed as a percentage of total possible marks.
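The Table-12 rule can be sketched as a small function. Note that the table does not
specify the mark for shortfalls falling exactly on 10, 7.5, 5 or 2.5 per cent; this sketch
assumes the lower mark at each boundary.

```python
# Sketch of the Table-12 budget efficiency scoring rule.
# Assumption: at an exact boundary (e.g. a shortfall of exactly 5 per cent)
# the lower mark applies, since the table leaves these cases unspecified.

def budget_efficiency_marks(actual, budgeted):
    """Return marks 0-5 for a program, given actual and budgeted outputs."""
    if actual >= budgeted:
        return 5                      # actual output meets or exceeds budget
    shortfall_pct = 100.0 * (budgeted - actual) / budgeted
    if shortfall_pct > 10:
        return 0
    if shortfall_pct > 7.5:
        return 1
    if shortfall_pct > 5:
        return 2
    if shortfall_pct > 2.5:
        return 3
    return 4                          # shortfall below 2.5 per cent
```

For example, an actual output of 97 against a budget of 100 is a 3 per cent shortfall
and earns 3 marks.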
For the overall assessment, a weight of 30 per cent may be given to strategic plan and
baseline evaluation, a weight of 20 per cent to strategic and operational performance
evaluation, and a weight of 50 per cent to budget compliance and effectiveness
evaluation.
Table-14: Weights for Various Types of Evaluation
Type of Evaluation Weights
1-A Strategic Plan Evaluation Weight: 10%
1-B Systems Development Weight: 10%
1-C Human Resource Development Weight: 10%
2-A Strategic Performance Evaluation Weight: 10%
2-B Operational Performance Evaluation Weight: 10%
3-A Program Budget Compliance Weight: 25%
3-B Program Budget Effectiveness Weight: 25%
Total 100%
Translating Performance Scores into Ratings: Finally, the overall performance scores
will be converted into qualitative ratings using the scoring bands given in the following
table:
There will be another category called “Results Not Demonstrated” when an Agency
does not have performance measures that have been agreed by MOF either for
baselines or for the assessment year.
An Example:
To provide an example, let us assume that we are evaluating budget performance for
three Agencies, A, B and C. The results are given in Tables 16-17.
Table-16: An Example of Performance Scores for Three Agencies

                                              Performance Scores (%)   Weighted Scores
Type of Evaluation                    Weight    A      B      C          A      B      C
1-A Strategic Plan Evaluation          0.10    50     60     85        5.0    6.0    8.5
1-B Systems Development                0.10    50     50     75        5.0    5.0    7.5
1-C Human Resource Development         0.10    60     50     70        6.0    5.0    7.0
2-A Strategic Performance Evaluation   0.10    50     50     85        5.0    5.0    8.5
2-B Operational Performance Evaluation 0.10    60     70     70        6.0    7.0    7.0
3-A Program Budget Compliance          0.25    40     70     70       10.0   17.5   17.5
3-B Program Budget Effectiveness       0.25    40     70     70       10.0   17.5   17.5
Total                                  1.00                           47.0   63.0   73.5
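The weighted totals in Table-16 can be verified with a short calculation: each Agency’s
overall score is the sum over the seven evaluation types of weight times performance
score.

```python
# Check of the Table-16 arithmetic: weighted total = sum over the seven
# evaluation types (1-A .. 3-B) of (weight x performance score).

weights = [0.10, 0.10, 0.10, 0.10, 0.10, 0.25, 0.25]   # 1-A, 1-B, 1-C, 2-A, 2-B, 3-A, 3-B
scores = {
    "A": [50, 50, 60, 50, 60, 40, 40],
    "B": [60, 50, 50, 50, 70, 70, 70],
    "C": [85, 75, 70, 85, 70, 70, 70],
}
totals = {agency: round(sum(w * s for w, s in zip(weights, sc)), 1)
          for agency, sc in scores.items()}
# totals == {"A": 47.0, "B": 63.0, "C": 73.5}, matching the Total row of Table-16
```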
On the basis of these scores, the Agencies would be graded as given in Table-17:
Selected References
Das, Tarun (2007b) Output Costing and Output Budgeting- Basic Concepts and
Methodology, pp.1-51, October 2007.
Das, Tarun (2007c) Accrual Accounting and Accrual Budgeting- Basic Concepts and
Methodology, pp.1-43, November 2007.
Das, Tarun (2007d) Transition from Cash to Accrual Accounting, pp.1-26, Nov 2007.
Das, Tarun (2007e) Benchmarks Setting and Best Practices for Output Costing and
Output Budgeting- Part-1: Basic Concepts, pp.1-31, Dec 2007.
Das, Tarun (2007f) Benchmarks Setting and Best Practices for Output Costing and
Output Budgeting- Part-2: Practical Applications for Mongolia, pp.1-36, Dec 2007.
Das, Tarun (2007g) Terminal Report: Part-1, Major Conclusions and Recommendations,
pp.1-70, and Part-2, Strategic Business Plans, Output Costing and Output Budgeting,
Accrual Accounting and Accrual Budgeting, and Benchmarks Setting, pp.71-157, ADB
Capacity Building Project on Governance Reforms (see Part-2 for detailed guidelines
on output costing).
Government of Mongolia (2002) Public Sector Management and Finance Act (PSMFA,
27 June 2002).
Hatry, Harry P. (1977) How Effective are your Community Services?, The Urban
Institute, Washington, D.C.
Kaplan, Robert S. and David P. Norton (1996) The Balanced Scorecard: Translating
Strategy into Action, Harvard Business School Press.
Meyers, Roy T. (1996) Is There a Key to the Normative Budgeting Lock, The World
Bank, Washington, D.C.
Schick, Allen (1995) Federal Budget: Politics, Policy and Process, Brookings Institution.
Steiner, George (1997) Strategic Planning: What Every Manager Must Know, Simon
and Schuster.
USA (1993) Government Performance and Results Act (GPRA) of 1993, Office of
Management and Budget (OMB).