
The Baldrige Education Criteria for Performance Excellence Framework

Empirical test and validation
Masood Abdulla Badri and Hassan Selim
Department of Business Administration, College of Business & Economics,
United Arab Emirates University, Al Ain, United Arab Emirates
Khaled Alshare and Elizabeth E. Grandon
Accounting & Computer Information System Department,
Emporia State University, Emporia, Kansas, USA, and
Hassan Younis and Mohammed Abdulla
Department of Business Administration, College of Business & Economics,
United Arab Emirates University, Al Ain, United Arab Emirates
Abstract
Purpose – The purpose of this paper is to empirically test the causal relationships in the Malcolm Baldrige National Quality Award (MBNQA) Education Performance Excellence Criteria.
Design/methodology/approach – Using a sample of 220 respondents from 15 United Arab Emirates (UAE) universities and colleges, results of regression analysis and confirmatory structural equation modeling show that all of the hypothesized causal relationships in the Baldrige model are statistically significant.
Findings – A comprehensive measurement model grounded in the Baldrige Performance Excellence in Education Criteria for the 33 items of measurement is developed, tested, and found to be valid and reliable. Leadership is identified as a driver for all components in the Baldrige system, including measurement, analysis and knowledge management; strategic planning; faculty and staff focus; and process management. All Baldrige components (categories) are significantly linked with organizational outcomes as represented by the two categories of organizational performance results and student, stakeholder and market focus. The paper also tests the statistical fit of the only Baldrige model dealing with higher education, which was published in 1998 by Winn and Cameron.
Research limitations/implications – The data obtained are based on a sample of UAE higher education institutions. Studies in other countries should be conducted using the developed model to ensure the reliability of the results obtained.
Practical implications – The paper offers a greater understanding of the linkages between the elements making up the MBNQA Education Performance Excellence Criteria model, facilitating the guiding role that award models play in the implementation of quality management in higher education.
Originality/value – For the first time, an instrument of the MBNQA Education Performance Excellence Criteria is developed and tested. A new in-depth and holistic perspective for examining the relationships and linkages in the MBNQA Education Performance Excellence Criteria model is provided.
Keywords Baldrige Award, Quality awards, Higher education, Performance measures,
United Arab Emirates
Paper type Research paper
The current issue and full text archive of this journal is available at
www.emeraldinsight.com/0265-671X.htm
IJQRM
23,9
1118
Received March 2005
Revised August 2005
International Journal of Quality &
Reliability Management
Vol. 23 No. 9, 2006
pp. 1118-1157
© Emerald Group Publishing Limited
0265-671X
DOI 10.1108/02656710610704249
Introduction
Many researchers, encouraged by case study success stories, have called for evidence
from large-scale studies on the effectiveness of quality management programs, such as
the Baldrige Criteria (Meyer and Collier, 2001; Bigelow and Arndt, 1995, 2000; Motwani
et al., 1996; Gann and Restuccia, 1994). The MBNQA has evolved from a means of
recognizing and promoting exemplary quality management practices to a
comprehensive framework for world-class performance, widely used as a model for
improvement. As such, its underlying theoretical framework is of critical importance,
since the relationships it portrays convey a message about the route to competitiveness
(Flynn and Saladin, 2001). It thus becomes imperative that the relationships between
constructs be tested and validated; this is important because organizations allocate
substantial resources toward improvement of their processes based on the
relationships in the Baldrige framework.
There are only a few studies that fully address Baldrige in the area of education.
Evans (1997) initially discussed MBNQA and institutions of higher education by
relating it to learning and curriculum issues and identifying what higher education
should be teaching based upon a survey of Baldrige Award winners. Using the
findings of Evans' study as a baseline, Weinstein et al. (1998) identified an apparent gap
between the Baldrige Award winners' perceptions and the current practice in higher
education institutions. While developing a curriculum based upon Baldrige principles
has received noteworthy attention, what is not readily evident within the literature is
the actual application of the MBNQA concepts as part of the educational delivery
process. Belohlav et al. (2004) described how several faculty members in the
Department of Management at DePaul University designed, developed, and delivered
course material using the MBNQA framework both as part of the course structure and
as a reference point in their individual classes. They concluded that end-of-term student evaluations
indicated that the approach led to a higher level of student engagement in the learning
process, as evidenced by more abundant and higher-quality feedback to the
instructors.
Winn and Cameron (1998) examined the validity of the proposed relationships
among the MBNQA dimensions using data from higher education. They developed a
survey instrument of the processes, practices, and outcomes of quality at a large
Midwestern university in the USA. Through some psychometric tests, they indicated
that the seven MBNQA dimensions are distinct constructs and are being measured
reliably by the questionnaire items. To assess the validity of the framework's
assumptions, three sets of regression analyses were conducted. The relationship
between the leadership dimension and each of the four system dimensions was strong
and statistically significant. They concluded that the assumed relationship between an
organization's leadership and each of the quality processes was definitely supported.
Using structural equation modeling, the same authors proceeded to perform statistical
analysis of the MBNQA framework as a whole. They presented an alternative framework
of relationships that took into account the lack of direct effects on quality and
operational results from leadership, information and analysis, and strategic planning,
and the lack of direct effects on customer focus and satisfaction from leadership,
information and analysis, and human resource development and management. The
alternative model evidenced acceptable goodness-of-fit with the data.
While the Baldrige Award in education has captured the attention of decision
makers, there has been little empirical research examining the usefulness of the award
criteria to guide the actions of organizations that seek to improve performance
(Goldstein and Schweikhart, 2002; Arif and Smiley, 2004). This research takes a step
toward providing senior leaders in educational organizations with a valid means of
making those decisions. The published Baldrige model (Education Criteria for
Performance Excellence) (NIST, 2004) is shown in Figure 1. The general MBNQA
theory that "leadership drives the system, which creates results" suggests that the
performance relationships are recursive (Meyer and Collier, 2001). When Baldrige
quality experts defined the performance relationships among the seven categories,
uncertain of the true direction of causation, they defaulted to the premise that all
categories are related and used two-headed arrows among all Baldrige categories.
We seek to add to the growing body of support related to the validity of the general
Baldrige framework by examining it at the level of its theoretical constructs as it
relates to the education industry in an international context. By moving beyond the
specific criteria, we seek to examine the model in a larger context, as a theoretical
model for quality management in higher education. We tested if there was empirical
evidence that the relationships between the theoretical constructs held. To this end, we
examined individual relationships between categories and overall relationships
between categories when they acted as an integrated system. We hypothesized that the
seven Baldrige categories were related in a recursive causal model and that the sign of
each path coefficient was positive. So, for example, Leadership's direct effects in the
causal model were represented in two ways: first, as the leadership score increased, the
scores of the other dimensions (strategic planning; faculty and staff focus; student,
stakeholder, and market focus; and process management) increased as well; and second,
as the leadership score increased, the organizational results dimensions' scores should
also increase. Leadership's indirect effects were represented by increases in the
leadership score causing the organizational results scores to increase through
leadership's influence on the mediating dimensions in between. The award criteria
were studied to determine whether the Baldrige theory of relationships among the seven
Baldrige categories was supported in UAE higher education institutions.

(Figure 1: Baldrige Education Criteria for Performance Excellence model)
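The distinction between direct and indirect effects described above can be illustrated with a small ordinary-least-squares sketch. The data below are hypothetical scores, not the study's sample, and a dedicated SEM package would be used for the full model; plain numpy is enough to show a path coefficient and a product-of-paths indirect effect:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical 7-point scale scores for three dimensions.
leadership = rng.uniform(1, 7, n)
# Strategic planning is assumed to depend on leadership (path a).
planning = 0.6 * leadership + rng.normal(0, 0.5, n)
# Results depend on planning (path b), so leadership also
# affects results indirectly, through the mediating dimension.
results = 0.5 * planning + rng.normal(0, 0.5, n)

def path_coefficient(x, y):
    """Slope of the simple OLS regression of y on x."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

a = path_coefficient(leadership, planning)  # direct effect on planning
b = path_coefficient(planning, results)     # direct effect on results
indirect = a * b                            # indirect effect of leadership on results

print(f"a = {a:.2f}, b = {b:.2f}, indirect = {indirect:.2f}")
```

A positive sign on each estimated path, as hypothesized, is what the study's recursive model predicts; the indirect effect is the product of the two path coefficients along the mediated route.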
The objectives of this study of Baldrige Education Criteria for Performance
Excellence model were to:
• develop a comprehensive measurement model, with associated constructs and scales, that accurately captured the content of the MBNQA Education Criteria for Performance Excellence;
• address whether the seven Baldrige categories represented a good model for higher education organizations (especially in the UAE); and
• provide insight into the strength and direction of causation among the seven Baldrige categories.
The insights gained from these objectives should contribute to the quality
management, performance measurement, and education literature. While the seven
categories and the associated structural (causal) model in the original and education
criteria were similar, the specific measures addressed within each category (i.e. the
measurement model) were significantly different. For example, the original Baldrige
Criteria (NIST, 1995), most applicable to manufacturing, defined the customer as the
buyer of goods and services; however, the Baldrige Education Criteria for Performance
Excellence (NIST, 2004) defined customers as the students, their families, communities,
governments, and investors in students. Hence, the customer-driven measures used to
develop the scales and measurement model for the Baldrige Education Criteria for
Performance Excellence were different from those in the original Baldrige Criteria.
The importance of the study
The recent trends of decreasing financial support for educational institutions,
increasing education costs, greater local and global competition, changing student
expectations and backgrounds, and greater engagement of students and communities in
continuous lifelong learning require higher education institutions to do more with less.
Under such pressure, administrators of these institutions should be concerned about
the quality of their products. Thus, a solid theoretical model that helps them in
managing the quality of education would be highly appreciated. Additionally, the
importance of this study lay in the fact that it attempted to test the model at the level of
the theoretical constructs (items) rather than at the criteria level (dimensions), which
validates the model in a broader context as a theory of quality management. Moreover,
no study, to the best of our knowledge, has utilized the Baldrige Education Criteria
for Performance Excellence as a framework for studying quality management in
educational institutions, especially in a non-Western country; instead, researchers
have used the original (business) Baldrige criteria. Even though the MBNQA
framework acknowledged that the education criteria were built on the same seven
dimensions (categories) used for the business criteria, it did not assume that the
requirements of all organizations were necessarily addressed in the same way (ECPE,
2005). The Baldrige Criteria for Education project is dedicated to improving educational
organizations across the nation by providing leaders with resources for improvement
that will make a difference when implemented as designed. Moreover, as leaders
continue to improve their understanding about making meaningful changes in their
organizations, the wealth of resources and tools available to everyone will also
improve.
Review of literature
In general, much of the published work on the quality aspects of higher education has
concentrated on effective course delivery mechanisms and the quality of courses and
teaching (Oldfield and Baron, 2000; Athiyaman, 1997; Bourner, 1998; Cheng and Tam,
1997; McElwee and Redman, 1993; Palihawadana, 1996; Soutar and McNiel, 1996;
Varey, 1993; Yorke, 1992). In particular, commentaries and case examples of quality
initiatives appeared, but most authors focused on the applicability of quality principles
and tools to the education setting (Chaffee and Sherr, 1992; Seymour, 1993; Sherr and
Lozier, 1991; Cornesky et al., 1991; Marchese, 1993). More evidence has yet to be
produced to conrm the effectiveness of quality programs and processes on desired
organizational outcomes in higher education (Winn and Cameron, 1998). Addressing
this dearth was a key objective of this research.
The MBNQA framework and the European Foundation for Quality
Management (EFQM) model have become templates for most quality awards in
many countries (Mackerron et al., 2003). These frameworks are widely adopted by
organizations as a means of self-assessment to enhance performance. They
represent an operational assessment tool for quality management practices. As
indicated by the large number of criteria guidelines that have been distributed,
many organizations use these criteria to assess their organizational quality. The
applicability and usefulness of both the MBNQA and EFQM models were evident from
the vast body of empirical research that exists (Mackerron et al., 2003; Stewart, 2003; Da
Rosa et al., 2003; George et al., 2003; Li and Yang, 2003; Castka et al., 2003).
However, in this study, we will concentrate on the MBNQA Education
Performance Excellence Framework, since it was the most popular model used
at UAE institutions (Badri and Abdulla, 2004).
The seven dimensions in the MBNQA are hypothesized to have a particular
relationship to each other, as illustrated in Figure 1. Although the Baldrige criteria and
framework are widely accepted in practice, there is surprisingly little theoretical and
empirical evidence of their validity (Ford and Evans, 2000). Several studies presented
empirical analyses of the original Baldrige Criteria in the manufacturing environment
and provided evidence that the performance relationships observed in the Baldrige
causal model were supported in US firms. Most recently, York and Miree (2004)
examined the relationship between TQM and financial performance using a sample of
Baldrige Award winners; they replicated the analysis with a second sample of state
quality award-winning companies and three different sets of financial performance
measures. Baldrige quality award winners generally had better financial performance
than their peers both before and after winning a quality award.
Several studies examined the issue of the validity of the Baldrige framework and
criteria more directly. Authors, such as Keinath and Gorski (1999), have used state
quality awards as surrogates for the Baldrige award, since data on actual scores can
often be obtained from state award agencies, and most state awards are virtually
identical to the Baldrige award. Pannirselvam et al. (1998) reported an empirical
analysis of data from the Arizona Governor's Quality Award (AGQA), whose criteria
mirror the original Baldrige Criteria (with only minor editing). Their objective was to
provide evidence of validity for the AGQA model and to generalize the validity to the
MBNQA Criteria. They concluded that the MBNQA measurement model (vis-à-vis
AGQA data) was reliable and valid. However, they did not evaluate dependent
relationships among the Baldrige categories (i.e. the structural model).
In a similar line of inquiry, Pannirselvam and Ferguson (2001) tested the validity of
the relationships between the categories by modifying the 1992 Baldrige framework
into an eight-construct model, separating customer focus and satisfaction into two
separate constructs. Their results provided evidence to conrm the validity of the
modied framework. Similarly, Ford and Evans (2000) conducted a detailed analysis of
the content validity of the strategic planning category. Evans and Ford (1997)
examined the relationship between the Baldrige core values and the processes
embedded in the criteria. Evans (1997) proposed a causal model describing the key
linkages in the Baldrige framework; however, the model was not tested. Handeld and
Ghosh (1995) used structural equations modeling to empirically test the linkages
between criteria in the 1992 framework. They reported empirical support for numerous
causal relationships among the seven categories of the Baldrige model in the
manufacturing environment. Similar to Handeld and Ghosh (1995), Wilson and
Collier (2000) used structural equation modeling of the 1992 framework, concluding
that a modied set of ve Baldrige causal relationships was a good predictor of
organizational performance. Other studies that also examined the casual relationships
in the MBNQA in certain industries, other than education, include Khanna et al. (2002),
Goldstein and Schweikhart (2002), Flynn and Saladin (2001), Dow et al. (1999) and
Samson and Terziovski (1999).
The findings in these studies provided statistical support for the Baldrige theory of
performance relationships depicted in the Baldrige causal model. Most of the studies
found that the Leadership dimension is classified as a driver of quality (Meyer and
Collier, 2001; Winn and Cameron, 1998; Pannirselvam and Ferguson, 2001; Flynn and
Saladin, 2001). Although each of these studies contributed to the validation of the
Baldrige framework, they all focused on the 1992 framework. It is important to
understand the evolution of the framework and investigate the validity of the 2004
framework as it pertains to higher education, particularly given the major
re-engineering of the criteria since 1992.
Since questions have been raised about the lack of evidence for the causal
relationships underlying the quality framework in higher education organizations, this
research addressed two questions:
(1) Are the proposed relationships between the categories in the Baldrige
Education Criteria for Performance Excellence framework valid?
(2) What is the strength of the relationships between the different quality
management constructs prescribed by the criteria?
We addressed these two questions using data from higher education organizations in
the UAE. The paper also attempts to present some detailed results and implications for
higher education authorities in the UAE.
Research methodology
Research model: dimensions and categories
Leadership is the key driver in MBNQA. Without the involvement and commitment of
senior leaders, the quality management journey becomes difficult and at times
impossible (Vora, 2002). The MBNQA model evaluates top management leadership's
ability to instill quality values and customer focus among the employees, and to
continuously improve their leadership styles. In higher education, senior leaders
should inspire and motivate the entire workforce and should encourage all faculty and
staff to contribute, develop and learn, be innovative, and be creative. The governance
body is responsible ultimately to all stakeholders for the ethics, vision, actions, and
performance of the organization. Senior leaders should serve as role models through
their ethical behavior and personal involvement in planning, communication, coaching,
development of future leaders, reviewing of organizational performance, and faculty
and staff recognition (Vora, 2002). As role models, they can reinforce ethics, values, and
expectations while building leadership, commitment, and initiative throughout the
organization. In addition to their important role within the organization, senior leaders
have other avenues to strengthen education. Reinforcing the learning environment in
the organization might require building community support and aligning community
and business leaders and community services with this aim. The leadership dimension
in Baldrige Education Criteria for Performance Excellence includes six categories:
organizational leadership (senior leadership direction, organizational governance,
organizational performance review); and social responsibility (responsibility to the
public, ethical behavior, and support of key communities).
The emphasis of the MBNQA, with respect to the strategic planning criterion, is
on keeping up with marketing changes and needs, and using advanced technology for
launching new products and services (Khoo and Tan, 2003; Mak, 1999, 2000). The
strategic planning dimension examines how the organization develops strategic
objectives and action plans, how strategic objectives and action plans are deployed,
and how progress is measured. For higher education, the category stresses that
learning-centered education and operational performance are key strategic issues that
need to be integral parts of the organization's overall planning. Learning-centered
education is a strategic view of education: the focus is on the
drivers of key factors in educational success such as student learning, student
persistence, student and stakeholder satisfaction, new markets, and market share.
Learning-centered education focuses on the real needs of students, including those
derived from market requirements and citizenship responsibilities. The criteria
emphasize that improvement and learning need to be embedded in work processes.
The Strategic Planning category examines how the organization understands key
student, stakeholder, market, and societal requirements as input to set strategic
directions. The requirements in the Strategic Planning category encourage strategic
thinking and acting to develop a basis for a distinct leadership position in the
market. The strategic planning dimension in Baldrige Education Criteria for
Performance Excellence has four categories: strategy development (strategy
development process, and strategic objectives); and strategy deployment (action plan
development and deployment, and performance projections).
In the Baldrige framework for education, Student, Stakeholder, and Market Focus
addresses how the organization seeks to understand the needs of current and future students
and stakeholders and to understand the markets, with a focus on delighting students
and stakeholders, building loyalty, and meeting students' and stakeholders'
expectations. The MBNQA stresses this issue in its customer and market focus
criterion by highlighting the importance of developing listening and learning skills in
responding to customers' opinions and complaints. The MBNQA criteria, in evaluating
customer relations, determine how special training and the career needs of
customer-contact employees are met. For higher education, this dimension
considers relationships as an important part of an overall listening, learning, and
performance excellence strategy. The criteria also evaluate trends in customer
satisfaction and how these trends compare with competitors' as a means to assess the
effectiveness of the organization's customer-relations management process. The student
and stakeholder satisfaction and dissatisfaction results provide vital information for
understanding students, stakeholders, and markets. In many cases, these results and
trends provide the most meaningful information, not only on students' and
stakeholders' views but also on their actions and behaviors, such as student persistence
and positive referrals. The student, stakeholder, and market focus dimension is
reflected by three categories; however, for the purpose of this research, we split the
student, stakeholder and market focus dimension into two sub-dimensions for better
representation. Hence, we have two dimensions and four categories: the first dimension
is student, stakeholder and market knowledge with two categories (student knowledge,
and stakeholder and market knowledge); and the second dimension is student and
stakeholder relationship, and satisfaction with two categories (student and stakeholder
relations, and student and stakeholder satisfaction determination).
The MBNQA criteria provide for evaluation of data from the support processes. They
evaluate information analysis at different levels of the business. The MBNQA does not
call for the evaluation of the financial performance of an organization. However, it does
evaluate the ability of the institution to link quality and operational data to financial
performance. It evaluates the methods used to continuously improve the institution's
information gathering and analysis cycle. In higher education, the measurement, analysis, and
knowledge management dimension is the main point within the criteria for all key
information about effectively measuring and analyzing performance and managing
organizational knowledge to drive improvement in student and operational outcomes.
It calls for the alignment of the organization's programs and offerings with its strategic
objectives. The dimension addresses knowledge management and all basic
performance related to relevant information, as well as how such information is
analyzed and used to optimize organizational performance. The measurement, analysis
and knowledge management dimension is given by four categories: measurement and
analysis of organizational performance (performance measures, and performance
analysis); and information and knowledge management (data and information
availability, and organizational knowledge).
The MBNQA criteria emphasize the need for human resource plans to support and
help achieve the organization's goals. In higher education, faculty and staff focus
addresses key human resource practices: those directed toward creating and
maintaining a high-performance workplace with a strong focus on students and
learning, and toward developing faculty and staff for adaptation to change. The
dimension covers faculty and staff development and management requirements in an
integrated way, aligned with the organization's strategic objectives. The
faculty and staff focus includes the work environment and the faculty and staff
support climate. To reinforce the basic alignment of workforce management with
overall strategy, the criteria also cover faculty and staff planning as part of overall
planning in the strategic planning dimension. The faculty and staff dimension is given
by seven categories: work systems (organization and management, faculty and staff
performance management system, and hiring and career progression); faculty and staff
learning and motivation (faculty and staff education, training, and development, and
motivation and career development); and faculty and staff well-being and satisfaction
(work environment, and faculty and staff support and satisfaction).
The MBNQA process management criterion examines how new products and
services are designed to meet customer needs and how critical customer needs and
competitor characteristics are identified.
The MBNQA is non-prescriptive regarding the tools used to control process quality. In
discussing process management, prior research focused on the main process.
Additionally, the MBNQA model evaluates the process management of support
services. The criteria evaluate supplier quality management more thoroughly,
measuring not only the methods used to inspect incoming material but also actions
taken to improve the quality of supplied material and hence reduce the cost of
inspection. The criteria also evaluate the methods used by the business to audit and
improve its own quality assessment practices. In higher education, process
management is the focal point within the Education Criteria for all key processes.
Built into the category are the central requirements for efficient and effective process
management: effective education design and delivery; a focus on student learning;
linkage to students, stakeholders, suppliers, and partners and a focus on
learning-centered processes that create value for all key stakeholders; and
evaluation, continuous improvement, and organizational learning. Agility,
operational efficiencies tied to changes in revenue, and cycle time reduction are
increasingly important in all aspects of process management and organizational
design. It is crucial to utilize key measures for tracking all aspects of the overall
process management. The process management dimension is given by two categories:
learning-centered processes, and support processes.
In higher education, the organizational performance results category provides a
results focus that encompasses student learning; student and stakeholder satisfaction;
and overall budgetary, financial, and market performance. Also, initiatives seek to
create a positive, productive, learning-centered, and supportive work environment;
governance structure and social responsibility; and recognition of results for all key
processes and process improvement activities. Through this focus, the criteria's
purposes (superior value of offerings as viewed by students, stakeholders, and
markets; superior organizational performance as reflected in operational, legal, ethical,
and financial indicators; and organizational and personal learning) are maintained.
Thus, this dimension provides real-time information (measures of progress) for
evaluation and improvement of educational programs, offerings, services, and
organizational processes, in alignment with the overall operational strategy. It calls for
analysis of organizational results data and information to determine the overall
organizational performance. Responses should include comparison information that
incorporates brief descriptions of how the organization ensures appropriateness of
each comparison. Comparable organizations might include those of similar types/sizes,
both domestic and international, as well as organizations serving similar populations
of students. The organizational performance results dimension is given by six
categories: student learning results; student and stakeholder results; budgetary,
financial and market results; faculty and staff results; organizational effectiveness
results; and governance and social responsibility results.
Questionnaire development and pilot test
To investigate the MBNQA Education dimensions, an instrument was developed to
survey the level of practice of the quality items in the 33 categories. The seven
Baldrige dimensions were operationalized through questionnaire items that
captured the key elements in the MBNQA Application Guidelines; item content was
guided by the criteria specified in the Malcolm Baldrige Award Application Guidelines.
Several steps were taken to ensure that the questionnaire used in this study
provided a valid measurement of the Baldrige Education Criteria for Performance
Excellence. Each of the 33 Baldrige Education categories, which
cannot be measured directly, was operationalized using a scale of items. Each scale
was developed based on a thorough review and understanding of the criteria
(dimensions). Additionally, the content and wording of the items were directly
traceable to the Baldrige Education Criteria for Performance Excellence. The number
of items for each category was chosen so that the content of the dimension was
adequately addressed. Because the Baldrige Criteria do not prescribe particular
methodologies or practices, the items were intended to identify whether, rather than how,
relevant management and quality issues were addressed. For example, a scale item for
the Leadership dimension (organizational leadership: senior leadership direction; see the
Appendix) asked whether senior leaders create strategic directions, rather
than specifying whether a particular method was used. In developing these
scales, prior scales used in other settings, such as manufacturing and healthcare,
were consulted (Flynn and Saladin, 2001; Meyer and Collier, 2001; Meyer and
Schweikhart, 2002).
Each item was measured using a seven-point Likert scale. Several college and
university faculty members and administrators assisted with pre-testing the
questionnaire and provided valuable feedback in terms of wording and useful
performance measures to be included in the questionnaire. This helped to establish
content validity and focus the questionnaire on the MBNQA Education Criteria for
Performance Excellence (NIST, 2004). For example, the Leadership dimension
(organizational leadership: senior leadership direction) used the following survey
question: "Our senior leadership creates strategic direction and senior leaders
communicate a clear vision." These questions were tied to Baldrige Education Criteria
for Performance Excellence dimension 1.1, note (1), which states, "Organizational
directions relate to creating the vision for the organization and to setting the context
for strategic objectives and action plans." All survey questions were tied to specific
criteria in the 2004 Baldrige Education Criteria for Performance Excellence (NIST, 2004).
Forty-three individuals participated in a pilot test that was conducted to determine
the reliability of the measurement scales. Participants included university professors,
deans, academic policy advisors, administrators, and senior college leaders. Cronbach's
coefficient alpha was one measure used to evaluate reliability, and a guideline of 0.60
was used for the new scales in this study (Nunnally, 1967; Meyer and Collier, 2001).
Some items were dropped to improve the reliability of the scales and to shorten the
instrument without compromising content validity. (In the Appendix, the
dropped items are identified with an asterisk in the last column.) Alpha values ranged
from 0.820 to 0.909 for the pilot test and from 0.857 to 0.925 for the main study (Table I),
indicating excellent internal consistency of the scales.
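Cronbach's coefficient alpha has a simple closed form, so the reliability screening
described above is straightforward to reproduce. The sketch below is our illustration,
not the authors' code; it computes alpha from a respondents-by-items response matrix:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's coefficient alpha for an (n_respondents, n_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                           # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)        # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)
```

Perfectly consistent items yield an alpha of 1.0; with real pilot responses, a value
below the 0.60 guideline would flag a scale for revision or item dropping.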
Study sample
Colleges and universities in the UAE composed the population studied in this research.
The study was conducted at the facility level, so that each university or college was
counted separately in the sample, regardless of its affiliation with a university or
college system. Some small colleges in the country were not included in
this study because they lacked the minimum resource requirements to be considered;
for example, two of these small colleges operated through a single office in a certain
building. In addition, such colleges usually have not developed
extensive quality management systems. Small colleges that did not have any form of
accreditation from the Ministry of Higher Education in the UAE were also excluded
from the study. It should be noted that universities and colleges provide a wide variety
of educational services and are complex organizations. The Baldrige Criteria must
account for this complexity and the broad range of human resource (faculty and staff),
process, and information management (measurement, analysis, and knowledge
management) issues that these organizations face. A total of 15 universities and
colleges participated in the study.
The questionnaire was sent to individuals only after a phone call informing them
of the study, apologizing for the length of the questionnaire, and encouraging them to be
honest and objective in responding to each item. Titles of individuals contacted
included vice chancellors, deputy vice chancellors, associate deputy vice chancellors,
advisors, deans, vice deans, associate deans, assistant deans, academic department
chairs, and unit heads. In all cases, it was made certain that each individual was
familiar with the practice of each item on the questionnaire at their institution.
The questionnaire was e-mailed to 409 individuals in 15 facilities. In total, 224
individuals completed and returned the questionnaire, for a response rate of 54.7
percent. Six of the questionnaires were missing substantial results data (mostly in the
student, stakeholder, and market focus and organizational performance results
categories) and were excluded from further analysis, resulting in a final sample size of
220. A small number of missing data points were replaced with scale-average scores.
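The paper does not detail the imputation procedure. Assuming "scale-average scores"
means replacing a missing item response with the mean of the observed responses on
that item's scale, the step can be sketched as follows; the column and scale names are
hypothetical:

```python
import numpy as np
import pandas as pd

def impute_scale_means(responses, scales):
    """Fill missing item responses with the observed mean of the item's scale.

    responses : DataFrame with one row per respondent and one column per item
    scales    : dict mapping scale name -> list of item columns in that scale
    """
    out = responses.copy()
    for items in scales.values():
        # Mean over all observed (non-missing) responses in the scale.
        scale_mean = responses[items].stack().mean()
        out[items] = responses[items].fillna(scale_mean)
    return out
```

Other readings (e.g. the respondent's own average on the remaining scale items) are
equally plausible; the sketch only shows one defensible interpretation.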
Research hypotheses
The research hypotheses provided a comprehensive evaluation of the theory and
performance relationships proposed in the Malcolm Baldrige National Quality Award
Education Criteria for Performance Excellence (NIST, 2004). These hypotheses
addressed specific causal relationships among the seven Baldrige categories.
As mentioned earlier, the Baldrige theory states that leadership drives the system,
which creates results (Meyer and Collier, 2001; Winn and Cameron, 1998;
Pannirselvam and Ferguson, 2001). Figure 1 presents this model, indicating the
relationships between the different quality management and performance evaluation
constructs. The exogenous (independent) factor in the model was leadership. The
                                                   Number of items (Cronbach's alpha)   Percent variance
Category                                           Pilot study        Main study        explained (main study)
Leadership
Senior leadership direction 11 (0.883) 4 (0.905) 85.789
Organizational governance 6 (0.830) 4 (0.911) 70.709
Organizational performance review 11 (0.846) 5 (0.901) 65.968
Responsibilities to the public 9 (0.890) 4 (0.914) 65.694
Ethical behavior 8 (0.909) 5 (0.915) 66.702
Support of key communities 4 (0.857) 3 (0.900) 89.801
Strategic development
Strategy development process 11 (0.867) 5 (0.907) 75.505
Strategic objectives 7 (0.862) 4 (0.901) 81.985
Action plan development and deployment 8 (0.839) 5 (0.921) 71.593
Performance projection 6 (0.845) 3 (0.919) 73.679
Student, stakeholder, and market focus
Student knowledge 10 (0.853) 5 (0.901) 72.246
Stakeholders and market knowledge 9 (0.846) 4 (0.905) 71.457
Student and stakeholder relationships 6 (0.868) 3 (0.915) 87.823
Student and stakeholder satisfaction determination 6 (0.820) 3 (0.916) 72.469
Measurement, analysis, and knowledge management
Performance measurement 8 (0.870) 4 (0.906) 83.883
Performance analysis 7 (0.881) 4 (0.905) 90.025
Data and information availability 10 (0.844) 5 (0.871) 73.602
Organizational knowledge 6 (0.852) 3 (0.911) 81.766
Faculty and staff focus
Organization and management of work 7 (0.836) 3 (0.857) 73.778
Faculty and staff performance management 8 (0.875) 4 (0.916) 85.617
Hiring and career progression 9 (0.833) 4 (0.925) 67.114
Faculty and staff education, training and
development 8 (0.832) 5 (0.871) 69.339
Motivation and career development 6 (0.852) 3 (0.911) 81.797
Work environment 6 (0.854) 4 (0.915) 82.190
Faculty and staff support and satisfaction 9 (0.838) 5 (0.872) 68.509
Process management
Learning-centered processes-LCP 10 (0.864) 5 (0.906) 76.610
Support processes-SP 9 (0.860) 5 (0.910) 76.536
Organizational performance results
Student learning results 10 (0.879) 5 (0.906) 84.883
Student and stakeholder focused results 9 (0.885) 4 (0.901) 89.988
Budgetary, financial and market results 10 (0.884) 5 (0.912) 88.443
Faculty and staff results 9 (0.883) 5 (0.915) 88.711
Organizational effectiveness results 12 (0.868) 6 (0.895) 76.750
Governance and social responsibility results 9 (0.859) 5 (0.911) 77.383
Table I. The Baldrige categories, number of items, scale reliabilities, and percent
variance explained (pilot study and main study)
endogenous factors were strategic quality planning; faculty and staff focus; process
management; measurement, analysis, and knowledge management; student,
stakeholder, and market focus; and organizational performance results.
Four specific research hypotheses were formulated to test directional relationships
between leadership and the four system dimensions:
H1. Leadership has a positive influence on Process Management.
H2. Leadership has a positive influence on Faculty and Staff Focus.
H3. Leadership has a positive influence on Strategic Planning.
H4. Leadership has a positive influence on Measurement, Analysis, and Knowledge
Management.
Two specific research hypotheses were formulated to test the directional relationships
between leadership and the two results dimensions:
H5. Leadership has a positive influence on Student, Stakeholder, and Market Focus.
H6. Leadership has a positive influence on Organizational Performance Results.
Eight hypotheses were formulated to test the directional relationship between each of
the system dimensions and each of the two results dimensions:
H7. Process Management has a positive influence on Student, Stakeholder, and
Market Focus.
H8. Process Management has a positive influence on Organizational Performance
Results.
H9. Faculty and Staff Focus has a positive influence on Student, Stakeholder, and
Market Focus.
H10. Faculty and Staff Focus has a positive influence on Organizational
Performance Results.
H11. Strategic Planning has a positive influence on Student, Stakeholder, and
Market Focus.
H12. Strategic Planning has a positive influence on Organizational Performance
Results.
H13. Measurement, Analysis, and Knowledge Management has a positive influence
on Student, Stakeholder, and Market Focus.
H14. Measurement, Analysis, and Knowledge Management has a positive influence
on Organizational Performance Results.
Additionally, six within-system hypotheses were formulated to test the Baldrige
theory that management systems should be "built upon a framework of measurement,
information and data, and analysis" (Meyer and Collier, 2001; NIST, 1995, p. 4):
H15. Measurement, Analysis, and Knowledge Management has a positive influence
on Strategic Planning.
H16. Measurement, Analysis, and Knowledge Management has a positive influence
on Faculty and Staff Focus.
H17. Measurement, Analysis, and Knowledge Management has a positive influence
on Process Management.
H18. Strategic Planning has a positive influence on Process Management.
H19. Strategic Planning has a positive influence on Faculty and Staff Focus.
H20. Faculty and Staff Focus has a positive influence on Process Management.
Finally, the last hypothesis tested the Baldrige theory that improving internal
capabilities and organizational performance results leads to improved external
performance (customer satisfaction):
H21. Organizational Performance Results has a positive influence on Student,
Stakeholder, and Market Focus.
Each of these 21 hypothesized relationships was supported by the general theory that
"leadership drives the system which creates results." The general theory guided our
assumption of a recursive causal model and the direction of each of the specific
hypotheses.
The review of the literature indicated that only one study had dealt with the
MBNQA categories in higher education: the Winn and Cameron (1998) study. They
empirically examined the relationships between the MBNQA categories using data
from higher education, administering a 190-item survey based on the MBNQA
criteria to all permanent non-instructional staff at a large Midwestern university.
Factor analysis indicated that the seven categories were reliable and valid. Winn and
Cameron used confirmatory path analysis to determine whether the relationships between
categories suggested by the MBNQA framework were supported. Results from the
LISREL analysis indicated that not all of the relationships in the framework were
supported. As a result, they generated a modified model that fit the
data very well. In this study, we used Winn and Cameron's (1998) modified model to
test whether the data collected fit the modified model as well. Thus, we stated the
following hypothesis:
H22. The Winn and Cameron (1998) modified model will provide good-fit statistics
using the current data.
Analysis methods
To test hypotheses H1 to H21, two procedures were used to examine the relationships
among the MBNQA dimensions. First, multiple regression analysis examined
the relationships among the dimensions individually.
Second, structural equation modeling examined the predicted relationships among all
dimensions in the overall framework together (given the integrative direct and indirect
effects). Structural equation modeling was also used to test H22.
Structural equation modeling consists of two components: a measurement model
and a structural model (Hair et al., 1995; Hoyle, 1995; Bollen and Long, 1993; Bollen,
1989). The measurement model includes the relationships between the dimensions
(Baldrige subcategories) and the questionnaire items (indicators) that operationalize
measurement of those dimensions. For this study, the measurement model included the
33 categories of the Baldrige Education Criteria for Performance Excellence and the
141 questionnaire items (see Appendix) that comprise the measurement scales for the
categories. The results of statistical tests for the structural model are valid only if the
measurement model uses reliable scales that accurately measure the content of the
MBNQA Education Criteria for Performance Excellence. The structural model
consisted of the relationships that link the Baldrige dimensions to their respective
categories, as well as the dependent causal relationships that link the seven Baldrige
dimensions to one another.
In addition to testing the first 21 hypotheses, the structural equation model also
served as a test of theory verification of the Baldrige Education Criteria for
Performance Excellence framework. Structural equation modeling was also used to
test the Winn and Cameron (1998) model (H22).
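The structural model above is recursive (no feedback loops), so an intuition for how
standardized path estimates arise can be sketched by regressing each endogenous
construct on its hypothesized direct causes. This is only an illustrative approximation
of the LISREL-style estimation the study used; the construct names and toy data below
are hypothetical:

```python
import numpy as np

def path_coefficients(data, model):
    """OLS approximation to standardized path coefficients of a recursive path model.

    data  : dict mapping construct name -> 1-D array of scores (one per respondent)
    model : dict mapping endogenous construct -> list of its direct causes
    """
    # Standardize every construct so the coefficients are comparable path estimates.
    z = {k: (v - v.mean()) / v.std(ddof=1) for k, v in data.items()}
    paths = {}
    for outcome, causes in model.items():
        X = np.column_stack([z[c] for c in causes])
        coefs, *_ = np.linalg.lstsq(X, z[outcome], rcond=None)
        for cause, b in zip(causes, coefs):
            paths[(cause, outcome)] = float(b)
    return paths

# Hypothetical toy data: with identical score vectors, the estimated path is exactly 1.0.
rng = np.random.default_rng(0)
leadership = rng.normal(size=220)
process_mgmt = leadership.copy()
paths = path_coefficients(
    {"leadership": leadership, "process_mgmt": process_mgmt},
    {"process_mgmt": ["leadership"]},
)
```

A full SEM additionally estimates the measurement model and indirect effects
simultaneously, which this sketch deliberately omits.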
Scale reliabilities
The internal consistency method was used to test the reliability of the research
constructs. As suggested by Nunnally (1967), the coefficient alpha developed by
Cronbach (1951) was used to test for internal consistency. A Cronbach alpha value of
0.70 is considered the criterion for internal consistency for established scales
(Nunnally, 1967). Although the Cronbach alpha values for these constructs were
acceptable, we decided that a more conservative measure of reliability should be
calculated to confirm that these constructs were reliable (Pannirselvam and Ferguson,
2001). Therefore, the amount of variance captured by each category relative to the
amount of variance due to measurement error was also calculated for each construct (a
method suggested by Fornell and Larcker, 1981).
Scale unidimensionality was tested and confirmed for each scale; it was
evaluated in the main study data set using Carmines and Zeller's (1979) guidelines,
which have also been recommended by other researchers dealing with the
psychometric properties of scales (Meyer and Collier, 2001). The percent of variance
explained by the first principal component of each measurement scale is given in
Table I, addressing Carmines and Zeller's (1979) criterion that the first component of
each scale explain more than 40 percent of the variance in the items. These results
show that the scales meet this criterion. The two remaining criteria (a
large eigenvalue for the first component and small, fairly equal eigenvalues for
subsequent components) were also evaluated and upheld in the main study data set.
Principal component analysis was used to reduce item responses to a single score for
each of the 33 Baldrige categories; the first component score for each
category was used in subsequent analyses.
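The reduction step can be illustrated directly via the singular value decomposition:
the first principal-component score summarizes a category's items, and the share of
variance it explains is the quantity checked against the 40 percent criterion. A minimal
sketch (our illustration, not the authors' code):

```python
import numpy as np

def first_component_scores(items):
    """First principal-component scores and percent variance explained.

    items: (n_respondents, n_items) response matrix for one Baldrige category.
    """
    items = np.asarray(items, dtype=float)
    x = items - items.mean(axis=0)                     # center each item
    u, s, vt = np.linalg.svd(x, full_matrices=False)   # PCA via SVD of the centered data
    scores = x @ vt[0]                                 # one summary score per respondent
    pct_var = 100.0 * s[0] ** 2 / (s ** 2).sum()       # Carmines-Zeller check: > 40 percent
    return scores, pct_var
```

In practice one would run this per category and carry the `scores` vector into the
regression and path analyses, as the paper describes.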
Results
Scale reliabilities
The reliability of each of the 33 scales (categories) used in this study was re-evaluated
on the main study data set. Cronbach's alpha values for the 33 measurement
scales ranged from 0.857 to 0.925, exceeding guidelines for adequate reliability
(Nunnally, 1967; Flynn et al., 1990; Meyer and Collier, 2001), as shown in Table I (before
and after dropping certain items). The values were well above the minimum
recommended value of 0.70. The average variance explained by each factor was
greater than 50 percent, indicating that the variance captured by each construct was
greater than the variance due to measurement error (Fornell and Larcker, 1981).
Regression analysis: relationships among dimensions
The MBNQA Education Criteria for Performance Excellence framework (shown in
Figure 1) assumes that the following relationships exist:
. A direct relationship between leadership and the four system dimensions
of measurement, analysis, and knowledge management; strategic planning;
process management; and faculty and staff focus.
. A direct relationship between leadership and the two outcome dimensions
of student, stakeholder, and market focus and organizational performance
results.
. A direct relationship between the four system dimensions of
measurement, analysis, and knowledge management; strategic planning;
process management; and faculty and staff focus and the two outcome
dimensions of student, stakeholder, and market focus and organizational
performance results.
To assess the validity of the framework's assumptions, three sets of regression
analyses were conducted. The first regressed each of the four system dimensions on
the leadership dimension. The standardized regression coefficients produced by this
analysis are reported in Table II. The relationship between the leadership dimension and
each of the system dimensions was strong and statistically significant. The assumed
relationship between an organization's leadership and each of the quality processes is
clearly supported. Table III reports the relationships between the leadership
dimension and the two outcome dimensions, and between the system dimensions
(individually) and the outcome dimensions. When each of the outcome dimensions was
regressed on the leadership dimension, the resulting relationships were also
significant. That is, the regression analysis revealed that the leadership
dimension had a statistically significant effect on organizational performance results and
on student, stakeholder, and market focus. In summary, leadership significantly
influenced the organization's systems and outcomes. Other results indicated that all
                              The four system dimensions
                                                                     Measurement,
Predictor        Strategic       Process         Faculty and         analysis, and
dimension        planning        management      staff focus         knowledge management
Leadership
  Adjusted R^2   0.764           0.788           0.724               0.786
  Beta (b)       0.874           0.888           0.851               0.887
  P              < 0.000         < 0.000         < 0.000             < 0.000

Table II. Regression of the four system dimensions on leadership
four system dimensions (individually) had relatively strong and statistically
significant effects on the two outcome dimensions.
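With a single standardized predictor, the beta reported in Tables II and III equals the
Pearson correlation between the two dimension scores, and the adjusted R-squared
follows from it. A short sketch under that reading (illustrative only; no claim that this
is the authors' computation):

```python
import numpy as np

def standardized_simple_regression(x, y):
    """Standardized beta and adjusted R^2 for y regressed on a single predictor x."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    beta = float(np.corrcoef(x, y)[0, 1])        # standardized slope equals Pearson r
    r2 = beta ** 2
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - 2)    # adjustment for one predictor
    return beta, adj_r2
```

Applied to the 220 pairs of category scores, this reproduces the entries of the
single-predictor tables; the multiple regressions of Table IV instead fit all four system
dimensions jointly.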
Next, we ran two sets of multiple regressions in which the two outcome dimensions
were the dependent variables and the four system dimensions were the independent
variables (see Table IV). The four system dimensions collectively had
relatively strong and statistically significant effects on the outcome dimensions. They
accounted for approximately 84 percent of the variation in the student, stakeholder,
and market focus dimension and approximately 93 percent of the variation in the
organizational performance results dimension. The exceptions were the relatively
weak relationships between the student, stakeholder, and market focus dimension and
the strategic planning and faculty and staff focus dimensions; these relationships
were not statistically significant. In summary, the regression analyses showed that
leadership had a significant effect on the four system dimensions and the outcome
                                                  Student, stakeholder,   Organizational
Predictor dimension                               and market focus        performance results
Leadership
  Adjusted R^2                                    0.870                   0.632
  Beta (b)                                        0.933                   0.796
  P                                               < 0.000                 < 0.000
Strategic planning
  Adjusted R^2                                    0.830                   0.636
  Beta (b)                                        0.911                   0.798
  P                                               < 0.000                 < 0.000
Process management
  Adjusted R^2                                    0.819                   0.819
  Beta (b)                                        0.906                   0.905
  P                                               < 0.000                 < 0.000
Faculty and staff focus
  Adjusted R^2                                    0.724                   0.780
  Beta (b)                                        0.851                   0.884
  P                                               < 0.000                 < 0.000
Measurement, analysis, and knowledge management
  Adjusted R^2                                    0.925                   0.608
  Beta (b)                                        0.962                   0.781
  P                                               < 0.000                 < 0.000

Table III. Regression results of the two outcome dimensions on the driver (leadership)
and the four system dimensions (individually)
                                        Student, stakeholder,          Organizational
                                        and market focus               performance results
Independent variables                   Beta (b)   t        Sig.       Beta (b)   t        Sig.
Strategic planning                      -0.135     -1.465   0.144      0.133      2.236    0.000
Process management                      1.257      8.973    0.000      0.448      4.953    0.000
Faculty and staff focus                 0.058      0.474    0.636      -0.374     -4.760   0.000
Measurement, analysis, and
  knowledge management                  -0.306     -3.430   0.001      0.754      13.105   0.000
Adjusted multiple R^2                   0.839                          0.933
F test                                  284.601 (P < 0.000)            757.277 (P < 0.000)

Table IV. Multiple regression results of the two outcome dimensions on the four
system dimensions
dimensions. In turn, the system dimensions had a significant effect on the outcome
dimensions. The direct effects of leadership on organizational outcomes assumed in the
MBNQA framework were supported.
The Baldrige Education Criteria for Performance Excellence model fit
The root mean square error of approximation (RMSEA) is a measure of model fit that is
not dependent on sample size (Hair et al., 1995; Browne and Mels, 1994; Steiger, 1990).
Many other fit measures (e.g. chi-square, goodness-of-fit index) are highly dependent
on sample size. The following guidelines were used to determine model fit using
RMSEA: RMSEA < 0.05, good model fit; 0.05 < RMSEA < 0.10, reasonable model fit;
RMSEA > 0.10, poor model fit (Browne and Mels, 1994, pp. 86-87; Browne and Cudeck,
1993). The computed RMSEA value for the model was 0.057, indicating a reasonable
model fit.
The overall fit of the model can also be tested using the chi-square statistic. For a
good fit, the chi-square value should be low and non-significant. The chi-square value
for the model was 1342.32, which was significant (p = 0.00). This would suggest that
the model was not confirmed by the sample data. The significance level of chi-square,
however, is sensitive to sample size and multivariate normality. Therefore, other
indicators of fit, such as chi-square/df, Bentler's (1990) comparative fit index (CFI),
Joreskog and Sorbom's (1993) goodness-of-fit index (GFI), Bollen's (1989) incremental
fit index (IFI), and the non-normed fit index (NNFI), which correct for these factors,
should also be used to assess the adequacy of the model (Joreskog and Sorbom, 1993).
With 476 degrees of freedom, chi-square/df is 2.82, which is less than the ratio of five
suggested in the literature. All the measures of goodness of fit for the model tested
were above the desired 0.9 level: the CFI, IFI, NNFI, and GFI were 0.94, 0.91, 0.92, and
0.90, respectively. These fit indices indicated an acceptable fit between the model and
the data (Bollen, 1989).
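The normed chi-square is a direct computation (1342.32 / 476 = 2.82), and RMSEA has
a common closed-form point estimate. Software implementations differ slightly in the
formula variant used, so the RMSEA value from the sketch below is illustrative rather
than a reproduction of the reported 0.057:

```python
import math

def fit_ratios(chi2, df, n):
    """Normed chi-square (chi2/df) and the common RMSEA point estimate."""
    ratio = chi2 / df
    # RMSEA = sqrt(max((chi2 - df) / (df * (n - 1)), 0)); some variants use n, not n - 1.
    rmsea = math.sqrt(max((chi2 - df) / (df * (n - 1)), 0.0))
    return ratio, rmsea

ratio, rmsea = fit_ratios(1342.32, 476, 220)   # ratio = 2.82, below the suggested cutoff of 5
```

A model whose chi-square does not exceed its degrees of freedom would yield an
RMSEA of zero under this formula.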
The standardized path coefficients for the set of causal relationships are presented
in Figure 2. All paths were significant at the 0.01 or 0.05 levels.
Table V shows the results of model estimation, including path estimates, standard
errors, and results of t-tests for the significance of the paths. A two-tailed t-test was
performed on each path estimate to evaluate its statistical significance. The results of
testing the research hypotheses provided empirical support for all of the causal
relationships in the Baldrige Education Criteria for Performance Excellence model,
although the level of significance differed from one path to another. Hypotheses H1
through H4 addressed a causal influence of leadership on each of the system categories.
Leadership had a strong influence on these four categories (path estimates
varied from 0.60 to 0.75). The support for hypotheses H1 to H4 indicated that leadership
is an overall driver of strategic planning, process management, faculty and staff focus,
and measurement, analysis, and knowledge management in higher education.
We also noted that leadership had a significant influence on both outcome categories,
with path estimates of 0.22 (student, stakeholder, and market focus) and 0.54
(organizational performance results). These values supported hypotheses H5 and H6.
Figure 2. The general Baldrige model (Education)

The results of testing the research hypotheses also provided empirical support for
the influence of the four system categories on both outcome categories. The highest
path estimate, 0.81, reflected the significant influence of process management on
student, stakeholder, and market focus. Thus, support was given to the next eight
hypotheses, H7 through H14.
The influences of the system categories on one another were also evident from the
path estimates and significance levels. Thus, hypotheses H15 to H20 were also supported.
The last hypothesis in the Baldrige framework, dealing with the two outcome
categories, was supported as well. Results showed that organizational performance
results positively affected student, stakeholder, and market focus. In summary,
considering the regression and structural equation model results, it was possible to
conclude that hypotheses H1 to H21 were all supported.
Hypothesis  Path                                                      Point estimate  t-value        Standard error
H1          Leadership -> Process management                          0.67            19.50091 **    0.035
H2          Leadership -> Faculty and staff focus                     0.62            16.2695 **     0.038
H3          Leadership -> Strategic planning                          0.60            14.77693 **    0.041
H4          Leadership -> Measurement, analysis and
            knowledge management                                      0.75            23.47116 **    0.032
H5          Leadership -> Student, stakeholder and market focus       0.22            2.418208 *     0.090
H6          Leadership -> Organizational performance results          0.54            11.29924 **    0.048
H7          Process management -> Student, stakeholder and
            market focus                                              0.81            27.94886 **    0.029
H8          Process management -> Organizational performance
            results                                                   0.26            3.203337 *     0.082
H9          Faculty and staff focus -> Student, stakeholder and
            market focus                                              0.57            12.23809 **    0.047
H10         Faculty and staff focus -> Organizational performance
            results                                                   0.27            3.449619 *     0.078
H11         Strategic planning -> Student, stakeholder and market
            focus                                                     0.58            14.28437 **    0.041
H12         Strategic planning -> Organizational performance
            results                                                   0.25            3.057055 *     0.082
H13         Measurement, analysis and knowledge management ->
            Student, stakeholder and market focus                     0.44            7.94642 **     0.055
H14         Measurement, analysis and knowledge management ->
            Organizational performance results                        0.70            21.23975 **    0.033
H15         Measurement, analysis and knowledge management ->
            Strategic planning                                        0.44            7.83642 **     0.056
H16         Measurement, analysis and knowledge management ->
            Faculty and staff focus                                   0.43            7.29013 **     0.059
H17         Measurement, analysis and knowledge management ->
            Process management                                        0.42            6.74385 **     0.062
H18         Strategic planning -> Faculty and staff focus             0.33            5.627313 **    0.059
H19         Strategic planning -> Process management                  0.32            5.18103 **     0.062
H20         Faculty and staff focus -> Process management             0.62            16.2699 **     0.038
H21         Organizational performance results -> Student,
            stakeholder and market focus                              0.59            13.73065 **    0.043

Notes: * path significant at p < 0.05; ** path significant at p < 0.01

Table V. Path estimates for the structural model
The Winn and Cameron model fit
Winn and Cameron (1998) found that their research did not validate the Baldrige
framework. As a result, they modified the framework and used exploratory analysis
to derive an alternative, statistically significant model. The alternative framework
included the direct effects of leadership on each of the four system categories; the
direct effects of strategic quality planning on management of process quality and on
customer focus and satisfaction; the direct effect of human resource development and
management on quality and operational results; and the direct effects of management
of process quality on customer focus and satisfaction and on quality and operational
results.
This alternative framework also reflected the lack of direct effects on quality and
operational results from leadership, information and analysis, and strategic quality
planning, and the lack of direct effects on customer focus and satisfaction from
leadership, information and analysis, and human resource development and
management. At the same time, it recognized the indirect effects of leadership and the
direct effects of information and analysis, strategic quality planning, and human
resource development and management on the outcome variables. Based on its ability
to account for these predictive relationships and its acceptable goodness of fit with the
data, the plausibility of the alternative framework (modified model) was supported in
Winn and Cameron's (1998) study. They concluded that leadership affected the
outcomes through mediating effects of the organizational systems.
After fitting the Winn and Cameron model to the current data, all the measures of
goodness of fit were above the desired 0.9 level, except for one: the CFI, IFI, NNFI, and
GFI were 0.94, 0.92, 0.91, and 0.89, respectively. These fit indices indicated an
acceptable fit between the model and the data (Bollen, 1989). The standardized path
coefficients for the set of causal relationships are presented in Figure 3. All paths were
significant at the 0.01 or 0.05 levels. These results provided support for the acceptance
of H22. It is interesting to note that our analysis also provided evidence confirming the
validity of the original Baldrige criteria (2004). The differences between Winn and
Cameron's results and the results obtained in this study could be partially explained
by differences in the samples studied.
Discussion
The major finding of this research relates to the role of leadership in the Baldrige
Education Criteria for Performance Excellence. Leadership has a direct causal
influence on each of the components of the Baldrige System: process management;
faculty and staff focus; strategic planning; and measurement, analysis, and knowledge
management. Leadership causes direct positive changes in each of the Baldrige System
categories. This result confirmed the Baldrige theory that leadership drives the System
and corresponded to previous research (see Meyer and Collier, 2001; Winn and
Cameron, 1998; Belohlav et al., 2004; Ford and Evans, 2000; Goldstein and
Schweikhart, 2002; Handfield and Ghosh, 1995; Wilson, 1997).

Figure 3. The Winn and Cameron (1998) model with current data

The study showed that leadership was the most important enabler for achieving
educational performance excellence. We assumed that effective leadership modulates
the implementation of performance excellence in universities and colleges. Senior
leaders have a significant influence on, and the ability to make changes to, the
educational system; thus, their role is crucial. Leadership must guide every system,
strategy, and process for achieving excellence. There are, however, several other
enablers of quality and performance excellence in higher education: strategic planning;
faculty and staff focus; student, stakeholder, and market focus; process management;
and measurement, analysis, and knowledge management. These enablers influenced
six outcomes: student learning results; student and stakeholder results; budgetary,
financial, and market results; faculty and staff results; organizational effectiveness
results; and governance and social responsibility results.
Our research also showed evidence of an important causal relationship from
leadership to measurement, analysis and knowledge management. The inuence of
leadership on measurement, analysis and knowledge management is (0.75), which was
relatively stronger from leaderships inuence on the other system categories of
process management, faculty and staff focus, and strategic planning (0.67, 0.62, 0.60
respectively).
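The coefficients quoted above are standardized path estimates. As a minimal sketch of what such an estimate represents (simulated, hypothetical construct scores, not the study's dataset; for a single predictor the standardized coefficient reduces to the Pearson correlation):

```python
import random

def standardized_path(x, y):
    """Standardized path estimate for a single predictor, which reduces
    to the Pearson correlation between x and y."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

random.seed(0)
# Hypothetical construct scores for 220 respondents: leadership driving one
# system category, with a population coefficient of roughly 0.75.
leadership = [random.gauss(0, 1) for _ in range(220)]
measurement = [0.75 * v + random.gauss(0, 0.66) for v in leadership]
print(round(standardized_path(leadership, measurement), 2))
```

Because the variables are standardized, coefficients such as 0.75 versus 0.60 can be compared directly across paths, which is what licenses the "relatively stronger" reading above.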
The stronger influence of leadership on measurement, analysis and knowledge
management was also addressed in other empirical studies (Meyer and Collier, 2001).
This means that the leaders of quality-driven institutions recognized the critical role of
university information systems in providing systems of measurement, information,
and data analysis.
This study also showed that leadership's role in universities' quality management
systems was both direct, as indicated by the significant paths from leadership to
organizational performance results and to student, stakeholder and market focus, and
indirect, as leadership influenced outcomes through the four system categories of
measurement, analysis and knowledge management; process management; strategic
planning; and faculty and staff focus.
For other industries, such as healthcare, Meyer and Collier (2001) did not find any
support for direct effects of leadership on customer and stakeholder satisfaction. In
other industries, such as manufacturing, Handfield and Ghosh (1995) and Wilson
(1997) did not find direct linkages between leadership and outcome categories. In the
education industry, our study showed that measurement, analysis and knowledge
management was a driver of within-system performance, with a significant causal
influence on each of the other system categories: strategic planning, faculty and staff
focus, and process management. These relationships identified measurement, analysis
and knowledge management as the critical link in the Baldrige System.
A comparison of within-system causal linkages for the published Baldrige
Education model showed that we clarified the direction and strength of causation
within the Baldrige System. These results corresponded with those of Wilson (1997)
and Wilson and Collier (2000) for manufacturing firms, Winn and Cameron (1998) for
education institutions, and Meyer and Collier (2001) for healthcare. The statistically
significant causal influence of measurement, analysis and knowledge management on
the other system categories supported the Baldrige theory that an effective
organization needs to be built upon a framework of measurement, information, data,
and analysis (NIST, 2004). Hence, university colleges, departments, administrative
units, and other systems must be linked by an effective information system, and this
was reflected in the significant linkage of measurement, analysis and knowledge
management to the other system categories. We also noted that measurement, analysis
and knowledge management had a direct causal influence on both outcomes,
organizational performance results and student, stakeholder and market focus. This
relationship indicated that the effective use of measurement, information, and data, all
addressed in the Baldrige Criteria, represents a key asset in organizational
performance (Meyer and Collier, 2001; Winn and Cameron, 1998).
Results showed that faculty and staff focus (development and satisfaction) had a
positive causal influence on student and stakeholder satisfaction. The research found
an important causal relationship from Baldrige process management to student and
stakeholder satisfaction in UAE higher education (the strongest link, at 0.80). These
results provided evidence that the design and delivery of educational and
non-educational processes are critical to student and stakeholder satisfaction and
should be managed from the students' and stakeholders' perspectives.
Organizational performance results had a positive causal influence on student,
stakeholder, and market focus. This performance relationship supported the Baldrige
theory that improving internal capabilities and performance results in improved
external performance (Meyer and Collier, 2001; Collier, 1991; Collier and Wilson, 1997).
The results of this research provided impetus for senior leaders in higher education to
focus on improving faculty and staff resources and process management, both of
which had a direct causal influence on customer satisfaction, and to strive for improved
internal performance outcomes that also help to create improved customer satisfaction.
Strategic planning had a statistically significant causal influence on both outcome
categories. This result was in contrast to the outcome of other studies performed in
healthcare (Meyer and Collier, 2001; Wilson and Collier, 2000). They found that
strategic planning did not exert any significant causal influence on the focus and
satisfaction of patients and other stakeholders. In healthcare, it may be difficult for
some hospitals to develop and deploy strategic plans because authorities are uncertain
what to include in the mission statement (Meyer and Collier, 2001; Gibson et al., 1990;
Calem and Rizzo, 1995). Our results reflected that higher education institutions are
usually under pressure to obtain accreditation for their programs and offerings. Most
international accrediting agencies, such as ABET (engineering programs) or AACSB
(business programs), require universities to develop clear and specific strategies
toward educational excellence. Moreover, measures and indicators of performance
excellence in higher education may differ from those of healthcare organizations.
However, it might be more realistic if performance results were not compiled into a
single construct. In higher education, there are many categories of performance results
and outcomes, where each might address a certain aspect of the educational process
(i.e. curriculum, delivery methods, teaching, advising, research, job placements,
campus life, etc.). Similar arguments were also made by other authors (i.e. Meyer and
Schweikhart, 2002; Meyer and Collier, 2001).
Conclusions and implications
Before discussing conclusions and implications, the limitations of the study should be
acknowledged. The use of self-reported information is always a concern in studies of
this nature. In addition, even though the sample size of this study (220) was consistent
with recommendations given by Anderson and Gerbing (1988), it was marginal
according to recommendations given by Hoelter (1983) and Hair et al. (1995).
Considering the size of the theoretical models tested in this study, the sample was just
within the range of acceptability. However, and despite these shortcomings, the results
provided useful insights for administrators in higher education institutions and
researchers.
This study used a confirmatory structural equation modeling and testing approach
to empirically validate many of the causal relationships in the MBNQA Education
model. The study empirically tested the Baldrige education framework's premise that
there is a significant relationship between the leadership, systems, and processes of
higher education organizations and the consequent outcomes. Specifically, this study
focused on determining the extent to which higher education results are explained by
the Baldrige Criteria. By providing empirical evidence of the nature of the relationships
between what organizations do and the results they achieve, this study offered decision
makers, managers, and researchers evidence that the Baldrige framework is a useful
tool for developing and managing quality systems in institutions of higher education.
The research aimed at exploring the nature of educational quality at UAE higher
education institutions. It designed and presented a reliable and valid self-assessment
tool for higher education based mainly on the Baldrige Education Criteria for
Performance Excellence, which are recognized as embodying the most comprehensive
quality concepts. Through the survey results, institutions of higher education, or
individual schools wishing to undertake TQM programs, are able to diagnose their
quality status, identify their strengths and weaknesses, and develop action plans after
performing a thorough cost-benefit analysis.
The Malcolm Baldrige Education Criteria for Performance Excellence encourage
higher education organizations to address quality across a broad range of issues.
Universities and colleges that wish to compare equitably with Award winners must
produce evidence of leadership and long-term planning, initiate verifiable quality
control procedures, address the happiness and well-being of the faculty and staff and,
above all, work toward student and stakeholder satisfaction and market focus. The
criteria argue strongly for customer-driven organizations, high levels of employee
involvement, and information-based management. Many universities could utilize the
criteria as a framework for implementing a quality program and establishing
benchmarks for measuring future progress.
Implications for senior administrators in higher education institutions
The results provided insight for higher education leaders into the dominant role
leadership plays in the effective implementation of quality management systems.
Strong support for quality initiatives from senior-level management has long been
cited as the starting point for an organization's quest to achieve a quality-driven
culture. These results corresponded with Winn and Cameron's (1998) findings that
strong support by senior administrators was an accelerator in the implementation of
quality initiatives at educational institutions.
The results of this study have some important implications for senior leaders in
institutions of higher education. Many institutions want to improve the quality of their
programs, offerings, and services, but they might be uncertain as to which quality
philosophy is the best one to use. Some higher education institutes might focus on the
philosophy of a single quality guru in planning their improvement process. Often,
these philosophies provide sound principles for senior leaders involved in quality
improvements, but they seldom provide a comprehensive system for measurement and
evaluation of quality efforts at the organizational level.
A second implication for managers is drawn from our construct validity analysis.
From our structural equation modeling analysis, we found that each of the items was
an important part of its representative category. We also noted that all seven categories
were correlated with each other. The fact that each category was correlated with the
others indicates that quality improvement efforts concentrated on only one or a few of
these categories would be less effective. Senior leaders will need to plan and execute a
concerted effort on several fronts to achieve world-class quality education.
One of the most important results of this study was the presentation of a reliable
and valid instrument based on the Baldrige Education Criteria for Performance
Excellence. This instrument could be utilized by senior leaders at institutions of higher
education as a self-assessment tool. Self-assessment is important because it helps
institutions of higher education to define their quality system and select student,
stakeholder, and market focus quality objectives. A major motivation in developing the
survey instrument we used was to make it simple enough to assist senior leaders in
higher education to conduct internal MBNQA Education Excellence self-assessments.
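Reliability screening for such a multi-item instrument is conventionally done with Cronbach's (1951) coefficient alpha, cited in the references. A minimal sketch with hypothetical 7-point responses (not the study's data):

```python
def cronbach_alpha(items):
    """Cronbach's (1951) alpha. `items` is a list of per-item score lists,
    all of equal length (one score per respondent)."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Each respondent's total score across the k items.
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Hypothetical 7-point responses from four respondents on three items:
scores = [
    [7, 5, 6, 3],  # item 1
    [6, 5, 7, 2],  # item 2
    [7, 4, 6, 3],  # item 3
]
print(round(cronbach_alpha(scores), 2))
```

Alpha values of 0.70 or above are the usual screening threshold (Nunnally, 1967) when judging whether items in a category cohere well enough for self-assessment use.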
In particular, the tested model provided guidelines on how to proceed with a quality
improvement strategy in higher education. Assuming that committed and effective
leadership is in place, the first step is to gather and utilize information on the internal
and external environments. The model indicated that this information should feed the
development of a strategic quality plan, which in turn guides the design and
development of a faculty and staff management system as well as a set of
organizational processes focused on quality. The design of these organizational
processes should form a base because they are the most important elements
influencing the core outcome dimensions of student, stakeholder and market focus,
and university performance results.
Publicizing the use of the MBNQA Education Criteria for Performance Excellence is
one way of raising awareness of quality management in institutions of higher
education in the UAE (and comparable institutions in the Gulf Cooperation Council). It
would help identify areas for improvement. When pursuing customer-focused and
market-driven quality strategies, these criteria and standards can also serve as
references for higher education organizations. Finally, all levels of faculty and staff in a
college or university might take the initiative to fulfill their different needs for
education and training in quality management. To further the quality movement in
higher education, senior leaders should take a leading role in promoting contemporary,
strategic quality management concepts and practices. Likewise, they should play an
active role in the UAE's efforts to improve the quality of the educational system.
Implications for researchers
As mentioned, the organizational performance results dimension is conceptually
broad, measuring many facets of internal and external university/college performance.
We encountered considerable difficulty in identifying general items to capture the two
outcome components of the Baldrige framework in education. Higher education
organizations deal with many different issues and priorities. More research on specific
outcomes for different facets of higher education, for both internal and external
customers, is needed to identify specific and clear measures or indicators of
performance and satisfaction (i.e. student segments, disciplines, majors, research,
administration, campus life, job placement, alumni activities, interdepartmental links,
accreditation, etc.).
Future research can improve upon our findings by evaluating other educational
units, using different samples at other educational organizations, such as public and
private school systems. Another plausible direction for future research is to test the
model across different cultures (countries), which should make the Baldrige Education
Criteria for Performance Excellence more generalizable. Our research adds to the rich
body of other endeavors to find the best causal model of organizational performance
for higher education. Other research should reinvestigate the general model tested here
and explore other competing (alternative) models.
References
Anderson, J.C. and Gerbing, D.W. (1988), Structural equation modeling in practice: a review and
recommended two-step approach, Psychological Bulletin, Vol. 103 No. 3, pp. 411-23.
Arif, M. and Smiley, F. (2004), Baldrige theory in practice: a working model, International
Journal of Educational Management, Vol. 18 No. 5, pp. 324-8.
Athiyaman, A. (1997), Linking student satisfaction and service quality perceptions: the case of
university education, European Journal of Marketing, Vol. 31 No. 7, pp. 528-40.
Badri, M. and Abdulla, M. (2004), Awards of excellence in institutions of higher education:
an AHP approach, International Journal of Educational Management, Vol. 18 No. 4,
pp. 224-42.
Belohlav, J., Cook, L. and Heiser, D. (2004), Using the Malcolm Baldrige National Quality Award
in teaching: one criterion, several perspectives, Decision Sciences Journal of Innovation
Education, Vol. 2 No. 2, pp. 153-76.
Bentler, P. (1990), Comparative fit index in structural models, Psychological Bulletin, Vol. 107
No. 2, pp. 238-46.
Bigelow, B. and Arndt, M. (1995), Total quality management: field of dreams?, Health Care
Management Review, Vol. 20 No. 4, pp. 15-25.
Bigelow, B. and Arndt, M. (2000), The more things change, the more they stay the same, Health
Care Management Review, Vol. 25 No. 1, pp. 65-72.
Bollen, K. (1989), Structural Equations with Latent Variables, Wiley, New York, NY.
Bollen, K. and Long, J. (1993), Testing Structural Equation Models, Sage Publications, Newbury
Park, CA.
Bourner, T. (1998), More knowledge, new knowledge: the impact of education and training,
Education+Training, Vol. 40 No. 1, pp. 11-14.
Browne, M. and Cudeck, R. (1993), Alternative ways of assessing model fit, in Bollen, K.A. and
Long, J.S. (Eds), Testing Structural Equation Models, Sage Publications, Newbury Park,
CA, pp. 136-62.
Browne, M. and Mels, G. (1994), RAMONA User's Guide, Department of Psychology, The Ohio
State University, Columbus, OH.
Calem, P. and Rizzo, J. (1995), Competition and specialization in the hospital industry:
an application of Hotelling's location model, Southern Economic Journal, Vol. 61 No. 4,
pp. 1182-98.
Carmines, E. and Zeller, R. (1979), Reliability and Validity Assessment, Sage Publications, Beverly
Hills, CA.
Castka, P., Bamber, C. and Sharp, J. (2003), Measuring teamwork culture: the use of modied
EFQM Model, Journal of Management Development, Vol. 22 No. 2, pp. 149-70.
Chaffee, E. and Sherr, L. (1992), Quality: Transforming Postsecondary Education, ASHE-ERIC
Education Report No. 3, ASHE-ERIC Clearinghouse on Higher Education, Washington,
DC.
Cheng, Y. and Tam, W. (1997), Multi-models of quality in education, Quality Assurance in
Education, Vol. 5 No. 1, pp. 22-32.
Collier, D. (1991), A service quality process map for credit card processing, Decision Sciences,
Vol. 22 No. 2, pp. 406-19.
Collier, D. and Wilson, D. (1997), The role of automation and labor in determining customer
satisfaction in a telephone repair service process, Decision Sciences, Vol. 28 No. 3,
pp. 689-708.
Cornesky, R. and Associates (1991), Implementing Total Quality Management in Higher
Education, Magna Publications, Madison, WI.
Cronbach, L. (1951), Coefficient alpha and the internal structure of tests, Psychometrika, Vol. 16
No. 3, pp. 297-334.
Da Rosa, M., Saraiva, P. and Diz, H. (2003), Excellence in Portuguese higher education
institutions, Total Quality Management, Vol. 14 No. 2, pp. 189-97.
Dow, D., Samson, D. and Ford, S. (1999), Exploding the myth: do all quality management
practices contribute to superior quality performance?, Production and Operations
Management, Vol. 8 No. 1, pp. 1-27.
Education Criteria for Performance Excellence (2005), p. 9, available at: www.quality.nist.gov/
PDF_files/2005_Education_Criteria.pdf (accessed 20 April 2005).
Evans, J. (1997), Critical linkages in the Baldrige award criteria: research models and
educational challenges, Quality Management Journal, Vol. 5 No. 1, pp. 13-30.
Evans, J. and Ford, M. (1997), Value-driven quality, Quality Management Journal, Vol. 4 No. 4,
pp. 19-31.
Flynn, B. and Saladin, B. (2001), Further evidence on the validity of the theoretical models
underlying the Baldrige criteria, Journal of Operations Management, Vol. 19 No. 6,
pp. 617-52.
Flynn, B., Sakakibara, S., Schroeder, R., Bates, K. and Flynn, E. (1990), Empirical research
methods in operations management, Journal of Operations Management, Vol. 9 No. 2,
pp. 250-84.
Ford, M. and Evans, J. (2000), Conceptual foundations of strategic planning in the Malcolm
Baldrige criteria for performance excellence, Quality Management Journal, Vol. 7 No. 1,
pp. 8-26.
Fornell, C. and Larcker, D. (1981), Evaluating structural equation models with unobservable
variables and measurement error, Journal of Marketing Research, Vol. 18 No. 1, pp. 39-50.
Gann, M. and Restuccia, J. (1994), Total Quality Management in health care: a view of current
and potential research, Medical Care Review, Vol. 51 No. 4, pp. 467-500.
George, C., Cooper, F. and Douglas, A. (2003), EFQM Excellence Model in a local authority,
Managerial Auditing, Vol. 18 No. 2, pp. 122-7.
Gibson, C., Newton, D. and Cochran, D. (1990), An empirical investigation of the nature of
hospital mission statements, Healthcare Management Review, Vol. 15 No. 3, pp. 35-45.
Goldstein, S. and Schweikhart, S. (2002), Empirical support for the Baldrige Award Framework
in US hospitals, Health Care Management Review, Vol. 27 No. 1, pp. 62-75.
Hair, J., Anderson, R., Tatham, R. and Black, W. (1995), Multivariate Data Analysis, 4th ed.,
Prentice-Hall, Upper Saddle River, NJ.
Handfield, R. and Ghosh, S. (1995), An empirical test of linkages between the Baldrige criteria
and financial performance, Proceedings of the Decision Sciences Institute, Vol. 3, Decision
Sciences Institute, Atlanta, GA, pp. 1713-15.
Hoelter, J.W. (1983), The analysis of covariance structures: goodness-of-fit indices, Sociological
Methods and Research, Vol. 11 No. 3, pp. 325-44.
Hoyle, R. (Ed.) (1995), Structural Equation Modeling: Concepts, Ideas, and Applications, Sage
Publications, Thousand Oaks, CA.
Joreskog, K.G. and Sorbom, D. (1993), LISREL 8: Analysis of Linear Structural Relationships by
Maximum Likelihood, Instrumental Variables and Least Squares Methods, 8th ed., Scientific
Software, Mooresville, IN.
Keinath, B. and Gorski, B. (1999), An empirical study of the Minnesota quality award evaluation
process, Quality Management Journal, Vol. 6 No. 1, pp. 29-39.
Khanna, V., Vrat, P., Shankar, R. and Sahay, B. (2002), Developing causal relationships for a
TQM index for the Indian automobile sector, Work Study, Vol. 51 No. 7, pp. 364-73.
Khoo, H. and Tan, K. (2003), Managing for quality in the USA and Japan: differences between
the MBNQA, DP and JQA, The TQM Magazine, Vol. 15 No. 1, pp. 14-24.
Li, M. and Yang, J. (2003), A decision model for self-assessment of business process based on the
EFQM Excellence Model, International Journal of Quality & Reliability Management,
Vol. 20 No. 2, pp. 164-88.
McElwee, G. and Redman, T. (1993), Upward appraisal in practice: an illustrative example using
the Qualed model, Education+Training, Vol. 35 No. 2, pp. 27-31.
Mackerron, G., Masson, R. and McGlynn, M. (2003), Self assessment: use at operational level to
promote continuous improvement, Production Planning & Control, Vol. 14 No. 1, pp. 82-9.
Mak, W. (1999), Cultivating a quality mind-set, Total Quality Management, Vol. 10 Nos 4/5,
pp. 622-6.
Mak, W. (2000), The Tao of people-based management, Total Quality Management, Vol. 11 Nos
4/5/6, pp. 537-43.
Marchese, T. (1993), TQM: a time for ideas, Change, Vol. 25 No. 3, pp. 10-13.
Meyer, S. and Collier, D. (2001), An empirical test of the causal relationships in the Baldrige
Health Care Pilot Criteria, Journal of Operations Management, Vol. 19 No. 4, pp. 403-25.
Meyer, S. and Schweikhart, S. (2002), Empirical support for the Baldrige Award Framework in
US hospitals, Health Care Management Review, Vol. 27 No. 1, pp. 62-75.
Motwani, J., Sower, V. and Brashier, L. (1996), Implementing TQM in the health care sector,
Health Care Management Review, Vol. 21 No. 1, pp. 73-82.
NIST (1995), Malcolm Baldrige National Quality Award, 1995 Award Criteria, National Institute
of Standards and Technology, Gaithersburg, MD.
NIST (2004), Education Criteria for Performance Excellence, National Institute of Standards and
Technology, Gaithersburg, MD.
Nunnally, J.C. (1967), Psychometric Theory, McGraw-Hill, New York, NY.
Oldeld, B. and Baron, S. (2000), Student perceptions of service quality in a UK university
business and management faculty, Quality Assurance in Education, Vol. 8 No. 2, pp. 85-95.
Palihawadana, D. (1996), Modeling student evaluation in marketing education, Proceedings of
the 1996 Annual Marketing Education Group Conference.
Pannirselvam, G. and Ferguson, L. (2001), A study of the relationships between the Baldrige
categories, International Journal of Quality & Reliability Management, Vol. 18 No. 1,
pp. 14-34.
Pannirselvam, G., Siferd, S. and Ruch, W. (1998), Validation of the Arizona governor's quality
award criteria: a test of the Baldrige criteria, Journal of Operations Management, Vol. 16
No. 5, pp. 529-50.
Samson, D. and Terziovski, M. (1999), The relationship between total quality management
practices and operational performance, Journal of Operations Management, Vol. 17 No. 4,
pp. 393-409.
Seymour, D. (1993), On Q: Causing Quality in Higher Education, Oryx Press, Phoenix, AZ.
Sherr, L. and Lozier, G. (1991), Total quality management in higher education, in Sherr, L. and
Tetter, D. (Eds), New Directions for Institutional Research, Association for Institutional
Research, Louisville, KY.
Soutar, G. and McNeil, M. (1996), Measuring service quality in a tertiary institution, Journal of
Educational Administration, Vol. 34 No. 1, pp. 72-82.
Steiger, J. (1990), Structural model evaluation and modication: an interval estimation
approach, Multivariate Behavioral Research, Vol. 25 No. 2, pp. 173-80.
Stewart, A. (2003), An investigation of suitability of the EFQM Excellence Model for a
pharmacy department with NHS trust, International Journal of Health Care Quality
Assurance, Vol. 16 No. 2, pp. 65-76.
Varey, R. (1993), The course for higher education, Managing Service Quality, September,
pp. 45-9.
Vora, M. (2002), Business excellence through quality management, Total Quality Management,
Vol. 13 No. 8, pp. 1151-9.
Weinstein, L., Petrick, J. and Saunders, P. (1998), What higher education should be teaching
about quality but is not, Quality Progress, Vol. 1998, pp. 91-5.
Wilson, D. and Collier, D. (2000), An empirical investigation of the Malcolm Baldrige National
Quality Award causal model, Decision Sciences, Vol. 31 No. 2, pp. 361-90.
Wilson, D.D. (1997), An empirical study to test the causal linkages implied in the Malcolm
Baldrige National Quality Award, dissertation, The Ohio State University, Columbus,
OH.
Winn, B. and Cameron, K. (1998), Organizational quality: an examination of the Malcolm
Baldrige quality framework, Research in Higher Education, Vol. 39 No. 5, pp. 491-512.
York, K. and Miree, C. (2004), Causation or covariation? An empirical re-examination of the link
between TQM and financial performance, Journal of Operations Management, Vol. 22
No. 3, pp. 291-311.
Yorke, M. (1992), Quality in higher education: a conceptualization and some observations on the
implementation of a sectoral quality system, Journal of Higher Education, Vol. 16 No. 2,
pp. 34-46.
Further reading
NIST (1995a), Malcolm Baldrige National Quality Award 1995 Education Pilot Criteria, National
Institute of Standards and Technology, Gaithersburg, MD.
NIST (1995b), Malcolm Baldrige National Quality Award 1995 Health Care Pilot Criteria,
National Institute of Standards and Technology, Gaithersburg, MD.
NIST (1999), Malcolm Baldrige National Quality Award 1998 Criteria for Performance
Excellence, National Institute of Standards and Technology, Gaithersburg, MD.
Appendix
Table AI shows the Malcolm Baldrige Education Criteria for Performance Excellence categories
(items in the pilot study and deleted items in the main study).
Items marked with asterisks (* *) below were deleted in the main study.
For items 1 to 215, please indicate how often the following occur in your college or university:
Scale anchors are 1, 2, 3, 4, 5, 6, or 7; where (1) Not at all . . . (4) Sometimes . . . (7) Always
Leadership (Organizational leadership-Senior leadership direction)
1. Senior leaders create strategic directions
2. Senior leaders communicate a clear vision
3. Senior leaders guide in setting organizational values
* *
4. Senior leaders set specific action plans for successful implementation of
strategic objectives
* *
5. Senior leaders show strong commitment to policies and strategies
6. Senior leaders guide in setting performance expectations
* *
7. Senior leaders continuously communicate with staff and faculty
8. Senior leaders continuously address the needs of students and community
* *
9. Senior leaders create an environment characterized by ethical behavior
* *
10. Senior leaders create an environment that encourages learning
* *
11. Senior leaders create an environment that takes into account key development
needs of students, staff and faculty
* *
Leadership (Organizational leadership-Organizational governance)
12. Our governance system ensures accountability of staff and faculty members
13. Our governance system ensures monitoring the performance of our senior leaders
14. Our governance system ensures protection of students' interests
15. Our senior leaders are accessible to students and faculty and staff
16. Our governance system ensures protection of faculty and staff interests
* *
17. Our governance system ensures protection of community interests
* *
Leadership (Organizational leadership-Organizational performance review)
18. Senior leaders continuously review our organizational performance
19. Senior leaders continuously review our organizational capabilities
* *
20. Senior leaders communicate the importance of continuous improvement and
quality
21. Senior leaders continuously use reviews to assess our performance relative to
our competitors
* *
22. Senior leaders continuously use reviews to assess our progress relative to short
and long term goals
* *
23. We have an established set of performance measures
* *
24. Senior leaders use our performance measures for setting future directions
* *
25. We have a formal procedure to evaluate our senior leaders
26. External bodies perform some organization performance reviews
27. Leadership performance evaluation is supported by feedback and survey data
from faculty and staff
28. Leadership performance evaluation is supported by feedback and survey data
from parents
* *
Leadership (Social responsibility-Responsibilities to the public)
29. Our leaders address the impact of our programs and offerings on society
30. We establish key measures for achieving international accreditation requirements
31. We establish key measures for achieving local-national accreditation requirements
* *
32. We establish key measures for addressing risk associated with our programs
* *
33. We integrate public responsibility into performance improvement efforts
* *
34. In our planning, we anticipate the public's concern with our programs and offerings
35. In our planning, we anticipate the public's concern with our future programs and
offerings
36. We support and encourage the community service of our faculty
* *
37. We give students the opportunity to develop their social and citizenship values
and skills
* *
Leadership (Social responsibility-ethical behavior)
38. We ensure ethical behavior in all our students
* *
39. We ensure ethical behavior in all our faculty and staff
40. We ensure ethical behavior in all our higher administration
41. We have established clear measures to monitor ethical behavior of students,
faculty and staff
42. We have established clear measures to monitor ethical behavior of our partners
(i.e. vendors)
* *
43. Our organization is sensitive to public issues
44. We practice and support good citizenship in our organization
* *
45. We try to portray ourselves as role models when it comes to public
responsibility, ethics and citizenship
Leadership (Social responsibility-support of key communities)
46. Our faculty is actively engaged in support of our key communities
47. Our senior leaders are actively engaged in support of our key communities
* *
48. Our organization supports efforts to strengthen our local communities
49. We lead efforts to improve community services, including environmental programs
Strategic planning (strategy development-strategy development process)
50. We follow a formal/informal process of strategy development
* *
51. We utilize various types of forecasts, projections, options, and scenarios in
decision making about our future
52. Our strategies usually lead to changes or modifications in programs, services,
and use of technologies.
* *
53. We involve faculty and staff when developing our strategies
* *
54. We involve stakeholders when developing our strategies
* *
55. We perform studies to identify the factors that affect our organization's future
56. We gather and analyze relevant data and information for our strategic planning
process
* *
57. We take a long-term view when planning for our organization's future
opportunities and directions
* *
58. Our strategic development process is student, stakeholders, and market-focused
59. Our strategic development process takes into account our competitors'
weaknesses and strengths
60. We ensure that our strategic planning addresses student learning and development
Strategic planning (strategy development-strategic objectives)
61. We specify timetables for accomplishing our strategic objectives
* *
62. Our strategic objectives directly address the challenges outlined in our
organizational profile
(continued)
Table AI. The Baldrige Education Criteria
Note: items marked * * were deleted in the main study
63. Our strategic objectives are aimed at developing a competitive leadership
position in our educational offerings
64. Our long-term vision guides our day-to-day activities
* *
65. Our strategic objectives address both short- and long-term challenges and
opportunities
66. Our strategic objectives balance the needs of all students and key stakeholders
67. Partnerships with our community support our strategic plans
* *
Strategic Planning (strategy deployment-action plan development and deployment)
68. We convert our strategic objectives into short- and long-term action plans to
accomplish the objectives
* *
69. Strategic plans are translated into specific requirements for each work unit or
department
70. Improvement plans are regularly upgraded
71. We continuously assess progress relative to these action plans
* *
72. We allocate necessary resources for carrying out these action plans
73. We use key measures and indicators in tracking progress relative to action plans
74. Strategic decisions are evaluated with objective measures or indicators
* *
75. We continuously develop human resource plans (i.e. education and training) that
will enable accomplishment of our strategic objectives and action plans
Strategic Planning (Strategy deployment-performance projection)
76. We use key established measures or indicators for performance projection
77. Short- and long-term decisions and actions are aligned with our strategic plans
* *
78. We compare our projected performance with the projected performance of
competitors and key benchmarks
79. Our strategic plans include reducing waste (including idle time, materials, etc.)
in all departments
* *
80. We use measures or indicators to track dynamic, competitive performance factors
81. Our tracking mechanisms for performance measures or indicators are utilized as
key diagnostic tools
* *
Student, stakeholders, and market focus (Student, stakeholders, and market
knowledge-student knowledge)
82. We have well-established mechanisms for determining student needs and
expectations
83. We have created a climate conducive to learning
84. We analyze student complaints to improve our services
* *
85. We conduct regular student surveys for better listening and learning
* *
86. Our educational programs and services address the needs of special students
* *
87. We have an effective student placement service unit
88. We provide a variety of extracurricular activities
* *
89. Our educational programs emphasize problem solving approaches
90. Our educational programs emphasize learning and communication skills
* *
91. Our educational programs emphasize critical thinking skills
Student, stakeholders, and market focus (student, stakeholders, and market
knowledge-stakeholders and market knowledge)
92. Our programs are relevant to community needs
* *
93. Our educational programs are dynamic and keep pace with market changes
94. We conduct regular visits to high schools to promote our university and
programs
* *
Table AI (continued). IJQRM 23,9
95. We conduct regular visits to community and industry to promote our university
and programs
* *
96. We use feedback from our alumni to assess our programs and offerings
* *
97. We use feedback from our stakeholders to assess our programs and offerings
98. We conduct regular stakeholder surveys for better listening and learning
* *
99. We take into consideration changing methods of delivering educational services
100. In planning our programs, we take into account global and international
requirements
Student, stakeholders, and market focus (student and stakeholder relationship and
satisfaction-student and stakeholder relationships)
101. We continuously build active relationships with students and stakeholders
102. We have developed partnerships and alliances with students and stakeholders
* *
103. We build active relationships to enhance student performance and expectations
* *
104. We have modern mechanisms for students and stakeholders to access
information about our programs
105. We have modern mechanisms for students/stakeholders to make complaints
about our programs/services
106. We have set a process that ensures that complaints are resolved effectively and
promptly
* *
Student, stakeholders, and market focus (student and stakeholder relationship and
satisfaction-student and stakeholder satisfaction determination)
107. We have established effective mechanisms for determining student/stakeholder
satisfaction/dissatisfaction
108. We use student/stakeholder satisfaction/dissatisfaction information to
improve programs/services
109. We use drop-out rates, absenteeism, and complaint data as methods to
determine student/stakeholder satisfaction/dissatisfaction
* *
110. We use modern technologies (internet) for determining satisfaction/dissatisfaction
* *
111. We use satisfaction/dissatisfaction data to determine value, cost and revenue
implications
* *
112. We seek information from staff and faculty for building long-term partnership
with students and stakeholders
Measurement, analysis, and knowledge management (measurement and analysis of
organizational performance-Performance measurement)
113. We collect and integrate information on evidence of student learning
114. We collect and integrate information for tracking daily operations
* *
115. We use data and information for tracking overall organization performance
116. We use data and information to support organization decision making
* *
117. Information systems are used to link our programs and services with student
outcomes
118. We obtain data and information by benchmarking and seeking competitive
comparisons
119. We collect and utilize information on mistakes, complaints, and customer
dissatisfaction
* *
120. We ensure the effective use of key comparative data from within and outside the
educational community
* *
Measurement, analysis, and knowledge management (measurement and analysis of
organizational performance-performance analysis)
121. Our performance analysis includes examining trends
122. Our performance analysis includes organizational and academic community
projections
* *
123. Our performance analysis includes technology projections
* *
124. Our performance analysis includes comparisons and cause and effect
relationships
125. Our performance analysis helps determine root causes and set priorities for
resource use
126. Our performance analysis draws upon all types of data (student, programs,
stakeholders, market, operational, budgetary and comparative data)
127. Results of our performance analysis contribute highly to senior leaders' review
and strategic planning
* *
Measurement, analysis, and knowledge management (information and knowledge
management-data and information availability)
128. We ensure the availability of high quality information for key users
129. We ensure the availability of timely data and information for key users
130. Our data and information are accessible to our partners (communities and
stakeholders)
* *
131. We ensure that our hardware and software are reliable, secure and user friendly
132. We ensure that data, information and organizational knowledge enjoy
appropriate levels of security and confidentiality
* *
133. We ensure that data, information and organizational knowledge enjoy integrity,
reliability, accuracy and timeliness
* *
134. We encourage the use of electronic information
135. Our information systems are standardized across departments
136. We encourage the use of the internet for information storage and access
* *
137. We encourage the use of advanced information technology to communicate with
our students
* *
Measurement, analysis, and knowledge management (information and knowledge
management-organizational knowledge)
138. We ensure that our people keep current with changing educational needs and
directions
139. We constantly develop innovative solutions that add value for our students
140. We constantly develop innovative solutions that add value for stakeholders
* *
141. The focus of our knowledge management is on the knowledge that our people
need to do their work
* *
142. The focus of our knowledge management is on the knowledge we need to
improve processes, programs and services
* *
143. Our organizational knowledge system focuses on the identification and sharing
of best practices
Faculty and staff focus (work systems-organization and management of work)
144. We have effective ways to organize and manage work and jobs to promote
empowerment and innovation
145. We ensure that the skills and experiences of our staff and faculty are equitably
distributed
* *
146. We have effective ways to organize and manage work and jobs to achieve the
agility to keep current with educational service needs
* *
147. We motivate employees by improved job design
* *
148. Our work system capitalizes on the diversity of culture and thinking of our
faculty, staff and communities
* *
149. We achieve effective communication and skill sharing across departments and
functions
150. Our work system ensures ongoing education and training for our staff and faculty
Faculty and staff focus (work systems-faculty and staff performance management
system-PMS)
151. Our PMS includes feedback to faculty and staff
152. Our PMS supports a stakeholder focus
* *
153. Our compensation, recognition, and related reward and incentive practices
reinforce high performance work
* *
154. Our PMS is characterized by a focus on student achievement and innovation
155. Our compensation and recognition system is tied to efforts in community and
university service
156. Our compensation and recognition system is tied to student evaluation of
teaching and classroom performance
157. Our compensation and recognition approaches include rewarding exemplary
performances
158. Our PMS emphasizes consistency between compensation and recognition
* *
Faculty and staff focus (work systems-hiring and career progression)
159. We have an effective mechanism to identify skills needed by potential staff and
faculty
* *
160. We have an effective way of recruiting and hiring faculty and staff
161. We have an effective way of retaining faculty and staff
162. We ensure that our faculty and staff represent diverse ideas, cultures, and
thinking
163. We have established an effective succession planning for senior leadership and
supervisory positions
* *
164. We manage effective career progression for all faculty throughout the organization
* *
165. We manage effective career progression for all administrative and technical
staff throughout the organization
166. We ensure that our faculty and staff are appropriately certified and licensed
when required
* *
167. Our faculty promotion process is based on accepted principles of academic
performance
* *
Faculty and staff focus (faculty and staff learning and motivation-faculty and staff
education, training and development)
168. Our faculty and staff education and training contribute to the achievement of
our action plans
* *
169. We utilize faculty and staff education and training delivery programs both
inside and outside our organization
* *
170. Our faculty and staff education and training addresses our key needs associated
with our organizational performance improvement and technological change
171. We seek and use input from faculty and staff and their supervisors on education
and training needs
172. We deliver education and training to our staff and faculty using diverse modern
methods
173. We reinforce the use of new knowledge and skills obtained by faculty and staff
on the job
174. We regularly evaluate the effectiveness of education and training obtained
* *
175. We provide appropriate orientation of new faculty and staff as part of our
education and training programs
* *
Faculty and staff focus (faculty and staff learning and motivation-motivation and
career development)
176. We have effective ways of motivating faculty and staff to develop and utilize
their full potential
177. We use formal/informal mechanisms to help faculty and staff attain job- and
career-related development and learning objectives
* *
178. Faculty and staff appraisals include personal improvement plans
179. We provide many opportunities for faculty and staff professional development
180. Our senior leaders and supervisors help faculty and staff attain job- and
career-related development and learning objectives
* *
181. To help faculty and staff utilize their full potential, we use individual
development plans that address their career and learning objectives
* *
Faculty and staff focus (faculty and staff well-being and satisfaction-work
environment)
182. Our work environment supports the well-being and development of all
employees
183. We continuously work to improve workplace health, safety, security and
ergonomics
184. We ensure that our faculty and staff take part in improving workplace health,
safety, security and ergonomics
* *
185. We have established a set of measures or indicators for each of these key
workplace factors
186. We continuously encourage faculty and staff to communicate their work
environment problems to us
187. We ensure workplace preparedness for emergencies or disasters
* *
Faculty and staff focus (faculty and staff well-being and satisfaction-faculty and staff
support and satisfaction)
188. We have established key factors that affect faculty and staff well-being,
satisfaction and motivation
189. Our key factors are segmented for our diverse workforce
* *
190. We support our faculty and staff via services, benefits, and policies
191. We provide various faculty and staff support services (e.g. counseling, career
development, day-care)
192. We provide various recreational and cultural activities to our faculty and staff
193. The services, benefits and policies are tailored to the needs of our diverse
workforce
* *
194. We use formal/informal assessment methods and measures to determine faculty
and staff well-being, satisfaction and motivation
195. We relate assessment findings to key organizational performance results to
identify priorities for improving our work environment
* *
196. We ensure effective resolution of faculty and staff problems and grievances
* *
Process management (learning-centered processes-LCP)
197. We have effective ways of determining and ensuring our LCP
* *
198. We use effective key LCP that deliver our educational programs and offerings
199. Our LCP create value for the organization, students, and our key stakeholders
* *
200. Our LCP address student educational and developmental needs to maximize
their success
201. We incorporate inputs from students, faculty, staff and stakeholders to
determine key LCP requirements
* *
202. We ensure that our faculty and staff are properly prepared to deliver our LCP
203. Our LCP take into account student learning rate differences
* *
204. We incorporate new technology and organizational knowledge into the design of
our LCP
205. We use key performance measures for the control and improvement of our LCP
206. We continuously improve our LCP to maximize student success and improve
educational programs
* *
Process management (support processes-SP)
207. We have effective ways of determining and ensuring our key SPs
208. We use effective key SPs for supporting our LCPs
209. We incorporate inputs from students, faculty, staff and stakeholders to
determine key SP requirements
* *
210. We design our SPs to meet all the key requirements we have already identified
* *
211. We incorporate new technology and organizational knowledge into the design of
our SPs
212. We use key performance measures for the control and improvement of our SPs
213. We try to minimize overall costs associated with process and performance
audits and SPs
* *
214. We prevent errors and rework in designing our SPs
* *
215. We continuously improve our SPs to achieve better performance and to keep
current with organizational needs
For items 216 to 274, please indicate your college or university's position relative to your competitors
on each of the following. Scale anchors are 1, 2, 3, 4, 5, 6, or 7, where (1) Significantly worse . . . (4)
About the same . . . (7) Significantly better
Organizational performance results (student learning results)
216. Overall measures or indicators of student learning results
217. The effectiveness of our programs segmented by majors and disciplines
218. Current levels and trends in key measures or indicators of student learning
* *
219. Student learning results (and trends) for each student segment
* *
220. Student learning results represented by requirements derived from our markets
221. Correlation between education design and delivery and student learning
* *
222. Improvement trends in student admission qualifications
223. Improvement in student learning beyond what could be attributed to
entry-level qualifications
* *
224. Educational services attributes as evidence of student and stakeholder
satisfaction
* *
225. Positive referrals to and recommendation of our services by students and
stakeholders
Organizational performance results (student and stakeholder focused results)
226. Relevant data that determine and predict our performance as reviewed by
students
* *
227. Current levels and trends in key measures or indicators of student satisfaction
* *
228. Current levels and trends in key measures or indicators of stakeholder satisfaction
* *
229. Student and stakeholder loyalty
* *
230. Student and stakeholder perceived value of organization
231. Student and stakeholder relationship after graduation (alumni loyalty)
232. Results of student/stakeholder satisfaction measures
* *
233. Trends of gains and losses of students from or to other schools or alternative
means of education
234. Feedback from students and stakeholders on their assessment of our
educational operation
Organizational performance results (budgetary, financial and market results)
235. Trend data on instructional and general administration expenditure per student
236. Trend data on cost per academic credit
237. Maintaining control over costs while better utilizing income and resources
* *
238. Budgetary and financial results as tools for better utilization of resources
* *
239. Key budgetary, financial and market indicators
* *
240. The effectiveness of management of financial resources
* *
241. Financial measures data
242. Current levels and trends in key measures or indicators of market performance
and market share
243. Designing and experimenting with realistic scenarios reflecting budget
increases and decreases
244. Current levels and trends in key measures or indicators of student enrolment
and transfer rate
* *
Organizational performance results (Faculty and staff results)
245. Creating and maintaining a positive and productive environment for faculty and
staff
246. Creating and maintaining a learning-centered environment for faculty and staff
* *
247. Creating and maintaining a caring environment for faculty and staff
248. Enjoying an effective faculty and staff work system performance
* *
249. Trends showing improvements in job classification and work design
* *
250. Local and regional comparative data on faculty and staff well-being
251. Improved levels of faculty and staff satisfaction
252. Extent of training and cross-training of staff and faculty
* *
253. Trends showing improvements in faculty turnover and absenteeism
Organizational performance results (organizational effectiveness results)
254. Experiencing annual increases in overall productivity of scientific research
measures
* *
255. Experiencing improvements in timeliness in all key areas of educational and
student support areas
256. Continuously improving admission standards
* *
257. Annual improvements in administrative performance
* *
258. Annual funds and budgets allocated for scientific research
259. Annual funds and budgets allocated to innovation in teaching
260. Emphasis on athletic programs
* *
261. Increased use of web-based technologies
262. Cost containment initiatives and redirection of resources
* *
263. Experiencing positive annual increases in external funds obtained through
research and services
264. Recording positive annual increases in the number of faculty research
publications
265. Maintaining an effective management of financial resources
* *
Organizational performance results (governance and social responsibility results)
266. Showing upward scores of stakeholder trust in the organization
* *
267. Maintaining current accreditation of programs while working towards
accreditation of other programs
* *
268. Appropriately and optimally using the funds allocated by the federal government
* *
269. Advisory boards and senior leaders continuously tracking relevant performance
measures on a regular basis
270. Holding senior leaders accountable for their actions
* *
271. Support for key communities and other public purposes
272. Demonstrating high standards of overall conduct
273. Measures of environmental and regulatory compliance
274. Continuously enjoying positive governance/ethical performance measures from
stakeholders

About the authors
Masood Abdulla Badri is a Professor of Production and Operations Management in the
Department of Business Administration, College of Business & Economics, United Arab
Emirates University, Al Ain, United Arab Emirates. He is the corresponding author and can be
contacted at: Masood@uaeu.ac.ae
Hassan Selim is an Associate Professor of Management Information Systems in the
Department of Business Administration, College of Business & Economics, United Arab
Emirates University, Al Ain, United Arab Emirates.
Khaled Alshare is an Associate Professor of Computer Information Systems in the
Accounting & Computer Information System Department, Emporia State University, Emporia,
Kansas, USA.
Elizabeth E. Grandon is an Assistant Professor in the Accounting & Computer Information
System Department, Emporia State University, Emporia, Kansas, USA.
Hassan Younis is an Assistant Professor of Management in the Department of Business
Administration, College of Business & Economics, United Arab Emirates University, Al Ain,
United Arab Emirates.
Mohammed Abdulla is an Associate Professor of Management in the Department of Business
Administration, College of Business & Economics, United Arab Emirates University, Al Ain,
United Arab Emirates.