
BETTER DATA MEANS BETTER DECISIONS

By Mashiku Kuyi Stephen and Brian H. Kleiner


Executive summary
Managers require access to accurate data to help them make important decisions
about operational efficiency, competition and regulatory compliance. But having
what should be the same data available in various forms complicates matters. Past
master data management (MDM) solutions were prohibitively expensive and
lacked business support, but the future looks brighter. Six Sigma techniques and
new low-cost and free options offer small and midsize businesses the opportunity
to improve data accuracy, giving managers the information they need.

Not too long ago, paper was the most common method of communication between
businesses and individuals. While paper is not the most effective way to
communicate, it enabled individuals and businesses to transmit a wide range of
information. Because of the nature of paper, even companies that could afford
computing power often had to enter the same information manually, possibly into
multiple information systems. Businesses spent a considerable amount of money and
time re-typing, correcting mistyped information and relaying information to business
partners and affiliates. In general, using paper for information sharing is inefficient
and prone to errors.
Over the last 10 years, both computing power and networking technology have
improved drastically. At the same time, the cost of computers and Internet usage
has dropped radically. These advancements in IT and price reductions have
allowed people and businesses to take advantage of enormous opportunities.
According to Forrester Research, even amid a global financial crisis, U.S. online retail
sales, which topped $155.2 billion in 2009, are expected to reach $248.7 billion by
2014. In its studies, Forrester found that factors such as low prices, shopping
convenience and a wide range of product selection were the primary growth drivers
for online retail businesses. In addition, improvements in the communications
industry have enabled businesses to locate cheaper labor both within and beyond the
United States' borders, fueling growth in outsourcing and offshoring services.
In the midst of these opportunities, significant business challenges are emerging.
The drop in the cost of information has expanded market boundaries enormously,
resulting in what are now called global markets, or globalization. Global
markets are providing consumers with a broad set of choices in products and
services. With the help of a cheap computer and the Internet, one can find and
compare product prices from different stores in different states, countries and
continents.

Opportunities and challenges


Today, customers are looking not just for low prices but also for high-quality
products and services. With abundant computing power at hand, people can find
products and services worldwide, not just at their neighborhood shopping centers.
This is a challenge for businesses that must compete worldwide for sales, not just
in their city, state or country.
Businesses of any size must constantly find innovative ways to compete
effectively: cutting costs, increasing efficiency, maintaining or
surpassing existing quality standards and optimizing operations. Over the years,
rapid and uncoordinated innovations in the information technology sector have left
businesses and institutions with a pool of heterogeneous computing applications
and systems. As a result, it can be expensive and time consuming to manage and
use information from multiple vendors and platforms. Large software corporations
have realized the need for building interoperable systems. However, development
cost and the need for companies to differentiate themselves will continue to hinder
these efforts.
Systems integration is about bringing together the components of computing
subsystems into one system that functions as a single unit. Integration is meant to
drive high performance, reduce system complexity and optimize the whole IT
infrastructure. A well-integrated system enables effective information sharing and
collaboration between company employees, vendors and business partners. This
sharing helps improve the processes of serving customers and delivering products,
key issues that help a company achieve its goal of a better bottom line.
Even though systems integration is an integral part of company merger and
acquisition processes, today's large and sophisticated computing systems
compound the complexity of integrating different systems. According to banking
merger expert Michael Koetter of the University of Groningen, extremely difficult or
complex post-merger IT integration problems often are to blame for banking
mergers that do not reward investors.
In the quest for ways to minimize operational costs and increase profits, those who
adopted systems integration early implemented electronic data exchange (EDI)
systems. EDI systems mostly were installed by large institutions and businesses
because of the expensive initial setup and maintenance costs. EDI systems save
large businesses and institutions money by replacing information flows that formerly
required extensive human interaction, paper documents, meetings and faxes.
Such systems also serve as document management systems and reduce the
cost of sorting, distributing, manipulating, organizing and searching information.
While EDI systems have helped, the increasingly global nature of competition
forces businesses to seek more operational efficiencies to reduce product and
service costs. These efforts have paved the way for exploiting available computing
power, hence the explosive growth in systems integration beyond EDI systems.
Today, systems integration is a core task of any business. An additional systems
integration challenge is the need to find better ways to manage master or reference
data.
Master data quality is a business problem
When organizations have multiple copies of information that are different but
should be the same, then the business stakeholders, not the information
technology personnel, have a big problem. It is a business problem because
management uses this information to make decisions that affect the whole
organization. For this reason, instead of just relying on IT personnel, stakeholders
should pay more attention to the quality of their data.
Also called reference data, master data play a key role in the core operations of a
business. Master data are shared and used by the several applications that make up
the whole system. And an organization's data has value beyond its operational
scope. In fact, it is not unusual for a company to acquire another company primarily
for access to its customer master data, Roger Wolter and Kirk Haselden wrote for
Microsoft.com in 2006.
Master data may include clients, customers, products, employees, inventories,
suppliers, stores, assets and contracts. Business operations revolve around master
data. The data are shared by multiple users, groups, partners and affiliates across
the entire organization. For example, in a financial management firm, master data
could include portfolios, securities, analytics and financial research networks. For a
bank, such information could include account numbers or a customer list. The data
could also distinguish between a company's regular customers and its premier customers.
The ability to access and modify master information from different software
applications by different users, groups, business partners and affiliates presents a
major problem for a business in terms of data maintenance. After all, management
relies on systems to provide high-quality, consistent and reliable information so
executives can make critical and sound decisions.
Having unsynchronized copies of the same information causes problems at best,
and at worst it is a recipe for disaster. You might have noticed that your
credit card company sometimes mails you a promotion to apply for a credit card
that you already possess. This can happen when the customer information used by the
marketing and servicing departments is out of sync.
Another common example of poor master data management happens when
customers move and update their billing addresses. If the billing department does
not receive the customer's new address, the customer will not receive the bill. This can
lead to unpaid bills, followed by the account being sent to a collection
agency, loss of the customer and even lawsuits. A system that effectively manages
master data can smooth operations, prevent such errors and make sure decision
makers have access to the right information.
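To make the synchronization problem concrete, the short Python sketch below compares
two hypothetical departmental copies of customer data and reports every field on which
they disagree. The record layouts, customer IDs and field names are illustrative
assumptions, not taken from any particular system.

# Minimal sketch of an out-of-sync check between two departmental copies of
# customer master data. The record layouts and field names are hypothetical.

marketing = {
    "C-1001": {"name": "A. Rivera", "address": "12 Oak St", "has_card": True},
    "C-1002": {"name": "B. Chen",   "address": "4 Elm Ave", "has_card": False},
}
billing = {
    "C-1001": {"name": "A. Rivera", "address": "98 Pine Rd", "has_card": True},
    "C-1002": {"name": "B. Chen",   "address": "4 Elm Ave", "has_card": False},
}

def find_mismatches(source_a, source_b):
    """Return (customer_id, field, value_a, value_b) for every disagreement."""
    mismatches = []
    for customer_id in source_a.keys() & source_b.keys():
        rec_a, rec_b = source_a[customer_id], source_b[customer_id]
        for field in rec_a.keys() & rec_b.keys():
            if rec_a[field] != rec_b[field]:
                mismatches.append((customer_id, field, rec_a[field], rec_b[field]))
    return mismatches

for customer_id, field, a, b in find_mismatches(marketing, billing):
    # In practice this would feed a reconciliation workflow, not a print call.
    print(f"{customer_id}: '{field}' differs (marketing={a!r}, billing={b!r})")

Run against real extracts, a report like this is only the detection half of the
problem; deciding which copy wins is what an MDM process adds.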
Master data management to the rescue
One recent development in systems integration is the master data management
(MDM) system. According to the CDI-MDM Institute, the MDM market exploded
from $2 billion in 2007 to an estimated $10 billion in 2009.
What does managing master data entail? Many definitions of MDM have been
coined, but the most appealing one comes from Haselden and Wolter. According to
them, MDM is the combination of tools and processes required to create and
maintain consistent and accurate lists of master data. In other words, it is the set of
tools and processes that decouples master information from individual
applications while offering effective, stable, transparent and reliable ways to
maintain master information even as it changes at the application level or in the
master copy.
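As one illustration of that decoupling, the brief Python sketch below models a single
shared registry through which every application reads and updates master records
instead of holding private copies. The class names, fields and update rules are
illustrative assumptions, not any vendor's actual MDM product.

# Minimal sketch of the decoupling idea: applications read and update master
# records through one registry instead of keeping private copies.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class MasterRecord:
    key: str                 # e.g., a customer or product identifier
    attributes: dict         # the current "golden" attribute values
    version: int = 1
    last_source: str = ""    # which application made the latest change
    updated_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class MasterDataRegistry:
    """Single authoritative store that every application shares."""

    def __init__(self):
        self._records: dict[str, MasterRecord] = {}

    def get(self, key: str) -> MasterRecord:
        return self._records[key]

    def upsert(self, key: str, changes: dict, source_system: str) -> MasterRecord:
        # Any application (CRM, billing, marketing) applies its changes here,
        # so every consumer immediately sees the same master copy.
        record = self._records.get(key)
        if record is None:
            record = MasterRecord(key=key, attributes=dict(changes))
        else:
            record.attributes.update(changes)
            record.version += 1
            record.updated_at = datetime.now(timezone.utc)
        record.last_source = source_system
        self._records[key] = record
        return record

registry = MasterDataRegistry()
registry.upsert("C-1001", {"name": "A. Rivera", "address": "12 Oak St"}, "crm")
registry.upsert("C-1001", {"address": "98 Pine Rd"}, "billing")  # one update, one truth
print(registry.get("C-1001").attributes)  # {'name': 'A. Rivera', 'address': '98 Pine Rd'}

In a real deployment this role is played by a dedicated MDM hub or service, but the
sketch captures why a single point of maintenance keeps every application's view of
the data consistent.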
The need to integrate and actively manage critical information across integrated
systems has never been greater. Some experts are proposing applying
Six Sigma processes to manage key company data. Joe Danielewicz, an IT data
architect at Motorola, argues that even though humans tolerate poor-quality data
and use context to fill in the gaps, businesses can use the Six Sigma methodology
of defining, measuring, analyzing, improving and controlling (DMAIC) to manage
their master data and mitigate project risk. Even though full Six Sigma standards are
too high a bar in this arena, such steps will reduce issues in the management of
master data.
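To show what the "measure" step of DMAIC might look like when pointed at master data,
the short Python sketch below counts defective customer records, converts the count to
defects per million opportunities (DPMO) and estimates a sigma level using the
conventional 1.5-sigma shift. The sample records and validation rules are hypothetical.

# Minimal sketch of the DMAIC "measure" step applied to master data quality.

from statistics import NormalDist

customers = [
    {"id": "C-1001", "email": "a.rivera@example.com", "postal_code": "92831"},
    {"id": "C-1002", "email": "",                     "postal_code": "90001"},
    {"id": "C-1003", "email": "b.chen@example.com",   "postal_code": ""},
]

# Each checked attribute is one "opportunity" for a defect.
checks = [
    lambda r: "@" in r["email"],           # email present and plausibly formed
    lambda r: len(r["postal_code"]) == 5,  # postal code present and five digits long
]

defects = sum(1 for record in customers for check in checks if not check(record))
opportunities = len(customers) * len(checks)

dpmo = defects / opportunities * 1_000_000
# Long-term sigma level: inverse normal of the yield, plus the conventional 1.5 shift.
sigma_level = NormalDist().inv_cdf(1 - dpmo / 1_000_000) + 1.5

print(f"defects={defects}, opportunities={opportunities}, "
      f"DPMO={dpmo:,.0f}, sigma~{sigma_level:.2f}")

With only a handful of records the numbers are meaningless, but run against a full
customer table the same calculation gives the baseline that the improve and control
phases then track.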
Furthermore, the global economic recession has triggered more government
regulations in the financial sector, which now is taking proactive steps to mitigate
some of the data risks. According to a recent data management survey of 52 senior
financial industry executives, 28 percent responded that the No. 1 driving force for
strategic investment in reference data management is improving data quality and
reducing data errors. No. 2, at 21 percent, was the need to integrate information
systems, reported Melanie Rodier for Wall Street and Technology in June 2010. In
the same research, 31 percent of the respondents said they anticipated that better
data would reduce risk, followed by 18 percent who expected better customer
satisfaction and retention.
Even more interesting, businesses are starting to see master data management as
a communication problem. They view MDM as another way to get different parts of
the business collaborating to optimize operational efficiency for the whole
organization. Writing in eWeek in 2007, Chris Preimesberger quoted an Intel data
architect and MDM product manager who maintained that implementing an MDM
project got Intel departments talking to each other about common goals. If different
parts of the business cannot speak the same language because of data issues,
chances are greater that the entire customer fulfillment process will be disrupted,

which could lead to losing customers, problems with regulatory compliance and
dire consequences for the bottom line.
Another key challenge in data management is trust. How do you get
business users to trust and rely on the data presented to them, especially if
different systems present differing views of the same information? The ability to
track data as it passes through different transformation steps to its final
destination is called data lineage.
This ability increases data reliability and transparency and minimizes guesswork
because users are able to see where the data comes from, what was done to it,
who did what to it and when they did it. It allows data consumers to acquire a
holistic view of the data lifecycle. Thus, a good MDM system must be competent at
addressing data lineage issues. Having a holistic view of data helps businesses
answer regulatory compliance audit questions and reliably deal with important
business questions and decisions.
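The short Python sketch below illustrates the idea in miniature: each transformation
step appends an entry recording who changed the value, what was done and when, so a
consumer can trace the final value back to its source. The step names, actors and
file name are illustrative assumptions.

# Minimal sketch of recording data lineage as a value moves through a pipeline.

from datetime import datetime, timezone

def _entry(step, actor, source):
    return {
        "step": step,
        "actor": actor,
        "source": source,
        "at": datetime.now(timezone.utc).isoformat(),
    }

def with_lineage(value, step, actor, source):
    """Wrap a value together with an append-only lineage trail."""
    return {"value": value, "lineage": [_entry(step, actor, source)]}

def transform(record, new_value, step, actor):
    """Produce a new value while extending the existing lineage trail."""
    return {"value": new_value,
            "lineage": record["lineage"] + [_entry(step, actor, None)]}

raw = with_lineage(" Acme Corp ", step="ingest", actor="edi-gateway",
                   source="supplier_feed.csv")
clean = transform(raw, raw["value"].strip(), step="trim_whitespace", actor="quality_job")
final = transform(clean, clean["value"].upper(), step="standardize_name", actor="mdm_hub")

for entry in final["lineage"]:
    # A data consumer can answer: where did this come from, who touched it, and when?
    print(entry["step"], entry["actor"], entry["source"], entry["at"])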
MDM options are growing
Until recently, the expense of implementing a robust master data management
system was prohibitive and difficult for most small to midsize firms to justify.
The cost easily could reach $1 million, which is out of reach for many non-Fortune
500 companies. But the drive for business efficiency and competitiveness,
along with increased requirements for regulatory compliance, has made a
stronger case for having an IT department build an MDM system.
Furthermore, many new MDM vendors are entering the market, including
Teradata, NCR, Talend, Kalido and ObjectRiver Inc. Even the most recent release
of Microsoft's SQL Server product includes a free MDM component called
Master Data Services (MDS).
Microsoft's entry into the MDM space is a significant milestone when you consider
that the majority of small and midsize businesses use Microsoft solutions to
manage their data and daily operations. It means that small firms that previously
could not afford an expensive MDM solution can now find an out-of-the-box option
simply by installing MS SQL Server R2. And Talend, a California-based MDM software
vendor, offers an open source MDM solution that interested parties can download,
modify and implement for free.
The benefits of a good master data management system are substantial. Though
cost has been a major bottleneck for many businesses, new vendors entering the
market, including Microsoft and open source communities, will reduce the cost of
implementing an MDM system considerably. This means that more companies will
be able to implement an MDM solution and start reaping the benefits of a system
they could not afford or justify just a few short years ago. By implementing an MDM
solution, companies will reduce or minimize regulatory compliance issues, improve
operational efficiency and add to their bottom line. And by taking the guesswork out
of data through a transparent process for managing critical data, data consumers will
be able to make quick and timely decisions about operational and regulatory issues.

Mashiku Kuyi Stephen is an assistant vice president at the Trust Company of the
West (TCW), an asset management firm in Los Angeles. He is an expert in
database architecture and SQL server query optimization. He has extensive
experience dealing with financial systems integrations. His previous employment
includes work as a database developer and software engineer. He has a B.S. in
computer science from the University of California, Riverside, and an MBA in
finance from California State University, Fullerton.
Brian H. Kleiner is a professor of management at California State University,
Fullerton. He received both an MBA and a Ph.D. in management from the
University of California, Los Angeles (UCLA).

COMMENTARY:

One mechanism for keeping processes stable in a company is for managers to have
access to accurate data that helps them reach the important decisions needed for
better efficiency and productivity.
Applying Six Sigma statistical control to production processes is of great importance
for total product quality. To the extent that those processes are effectively
controlled, operating costs, raw material waste and defective products are
minimized, which in turn yields optimal levels of profit, productivity and product
quality and, taken together, puts the organization in a solidly competitive
position.
Today, customers look not only for low prices but also for high-quality products
and services. With abundant computing power at hand, people can find products and
services all over the world, not just at their neighborhood shopping centers. This
is a challenge for businesses that must compete worldwide for sales, not just in
their city, state or country.
Businesses of any size must constantly find innovative ways to compete effectively
by cutting costs, increasing efficiency, maintaining or surpassing existing quality
standards and optimizing operations. Over the years, rapid and uncoordinated
innovations in the information technology sector have left companies and
institutions with a pool of heterogeneous computing applications and systems. As a
result, managing and using information from multiple vendors and platforms can be
expensive and time consuming. Large software corporations have recognized the need
to build interoperable systems; however, development costs and companies' need to
differentiate themselves will continue to hinder these efforts.
Reference data play an important role in the core operations of a business, and
their value goes beyond the operational scope. The ability of different users,
groups, business partners and affiliates to access and modify data through
different software applications presents a major data maintenance problem for a
business. After all, management relies on its systems for high-quality, consistent
and reliable information so that executives can make critical and sound decisions.

Data quality problems can have many causes. In the not-so-distant past, the vast
majority of information entered the corporate environment through manual data
entry, which increased the tendency to make mistakes. New inbound information
channels, such as Web portals and B2B interactions with suppliers and partners,
are increasing the complexity of the corporate data environment. These disparate
electronic sources deliver more sophisticated and varied information in real time,
which adds value to the company. At the same time, however, they make it harder to
acquire quality data across the enterprise, and a real-time data quality filter is
needed to preserve the integrity of the information.
In addition, similar information (such as customer details) may be stored in
several disparate sources, including CRM applications or accounting systems.
Information can be updated in one source yet remain unchanged in another, creating
the kinds of inconsistencies that tend to produce multiple versions of the truth.
Simply retrieving data is already a potentially complex task. When users cannot
locate and access the information they need to perform their daily tasks
effectively, or to make decisions on the fly, the value of the company's data can
be considerably reduced. A single small error is enough to corrupt an entire
company's data, and the effect of corrupt data can be devastating. Industry
reports estimate that quality issues in customer data alone cost companies at
least $611 billion each year.

BIBLIOGRAPHIC REFERENCE

Mashiku Kuyi Stephen and Brian H. Kleiner. "Better Data Means Better Decisions"
[online]. Industrial Management, July/August 2011. Available at:
http://www.iienet2.org/Details.aspx?id=27644 [Accessed September 26, 2012].
