BPM Weaves Data And Processes Together For Real-time Revenues
Written By: David Cameron
Published On: May 30 2003
One customer abandons a product application on a corporate Web site. Another customer starts to configure an online product but gets frustrated and quits. A third stops paying bills on time or contributing to an account. What do these events have in common? The relationship between customer and vendor is at risk, and the customer appears to be close to jumping ship. If that relationship is profitable, then reacting to these events in a timely manner becomes imperative.
Today's environment demands that organizations focus more on the customer. There are two components of this focus: who the customer is and what the customer does. At a technical level, these components translate into data and process, respectively.
For the last several years, better data has become the mantra of organizations seeking to improve their customer focus. As a result, organizations are swimming in data stored in disparate applications, data marts, operational data stores, and data warehouses. They have attached sophisticated analytic tools to this data to provide insight into customer attributes and behaviors.
Many companies have failed to link data to business process. Few companies have changed their business processes to reflect the insights gained on the data side. Even fewer companies harvest this knowledge to make the most of every interaction with the customer. And yet, this is the level companies must reach in order to respond to the types of customer behaviors listed above. Processes must glean enough information from data to generate actions, specifically to initiate dialogs with prospects and customers, and to address the growing sophistication of the customer relationship.
Linking data to process is the realm of business process management (BPM). BPM technology allows companies to transform potentially damaging customer interactions into revenue opportunities by coordinating the data and actions of disparate IT systems. Coordinating systems helps companies respond in real time to key business events and to minimize the risk of lost revenue. For example, a sales representative may contact a customer with discounts for continuing to pay bills online soon after payments have stopped. Responding at the right time with the right action allows companies to become real-time enterprises (RTEs).
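The event-to-action pattern described above can be sketched in a few lines. This is a minimal illustration, not AptSoft's implementation; the event names, handler, and customer ID are hypothetical.

```python
# Minimal sketch of BPM-style event handling: a business event (a customer
# stops paying bills online) is routed to an action (a discount offer)
# instead of waiting for a batch process to notice the change.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class EventRouter:
    """Routes named business events to registered handler functions."""
    handlers: Dict[str, List[Callable[[dict], str]]] = field(default_factory=dict)

    def on(self, event_type: str, handler: Callable[[dict], str]) -> None:
        self.handlers.setdefault(event_type, []).append(handler)

    def dispatch(self, event_type: str, payload: dict) -> List[str]:
        # Fire every handler registered for this event type.
        return [handle(payload) for handle in self.handlers.get(event_type, [])]


def offer_online_billing_discount(event: dict) -> str:
    # Illustrative action: queue a discount offer for the sales representative.
    return f"Offer discount to customer {event['customer_id']}"


router = EventRouter()
router.on("payments_stopped", offer_online_billing_discount)
actions = router.dispatch("payments_stopped", {"customer_id": "C-1001"})
print(actions)  # ['Offer discount to customer C-1001']
```

The point of the sketch is timing: the action fires when the event occurs, not when the next overnight batch run happens to surface it.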
Traditional Integration Is Costly, Inflexible
To achieve the RTE, companies must pass data from application to application quickly and efficiently: the right data to the right application process at the right time. This data logistics framework is the hallmark of the RTE.
Many organizations have found, however, that provisioning data contained in a vast, decentralized data infrastructure in support of the RTE is extremely impractical using conventional integration approaches. A quick look at these approaches illustrates why:
- Data Transfer: The requirements of the RTE strain most batch data transfer processes, the most common approach to moving data. Batch data transfer is tremendously costly because it typically requires movement of all the data in order to provision the right data. It also usually takes too long for processes to react to fleeting revenue opportunities with customers.
- Platforms: Despite a decade of enterprise application integration (EAI) technology, fewer than 50 percent of integration projects utilize it. This technology is typically only cost effective at scale, which prevents the incremental "pay-as-you-go" investment approach favored by IT departments in today's business climate.
- Code: The RTE is adaptive and agile, which means that processes must be flexible enough to support a "test-and-learn" approach. As processes change, the data required to support those processes also changes. Custom code, while initially cost effective, quickly becomes too brittle and expensive to maintain in the face of this requirement.
- Suites: Gone are the days of monolithic, 12-plus-month projects to replace existing technology with pre-integrated suites for the sake of common data models. Highly disruptive and expensive, these projects don't typically reduce the integration problem; they just transfer it to a different place.
The Benefits Of BPM
BPM promises to overcome the impracticalities of these approaches while leveraging the application and data infrastructure investments of the past decade. BPM starts with process, and provisions data in support of process. This approach provides the following benefits:
- BPM provides a variable cost model that allows for a small initial implementation footprint tied to a three- to six-month payback period, tightly coupling investment and return. Today, technology must prove its utility before vendors can demand license and support fees. Instead of the typical 12-plus-month implementation and two-year payback model, BPM technology scales deployment effort to initial requirements. As a company extends its BPM implementation over time, it builds upon work already performed.
- BPM allows for rapid modification, so end users can engage in "test-and-learn" process development and management. Most of the processes that support key business drivers are complex and must continuously evolve in response to internal and external changes. BPM technology enables users to implement, test, modify, and re-implement processes in rapid sequence based on what they've learned.
- BPM gives end users more involvement, so they can change processes on the fly and improve agility and productivity without having to call IT. The current generation of technologies is brittle largely because IT shoulders the bulk of maintenance costs and responsibilities. BPM technology balances the workload between the end user and IT in such a way that users can maintain processes without resorting to IT support.
- Companies benefit from BPM's ability to reuse common data objects such as "Customer" and "Trade" and to use open standards to make them available to any application. The complex infrastructure many organizations operate has evolved from the many proprietary database structures and application logic syntaxes of years of systems development. Most of the valuable components of business processes, including data definitions, business rules, and transformation logic, are replicated in a variety of formats across the enterprise. BPM technology relies on an object model that exposes and abstracts these elements so that they may be reused across different systems via emerging XML standards.
- BPM provides context for business processes, allowing users to access information at the application level, before it has been saved to a database, dramatically reducing data integration. Most traditional application integration is based upon the movement of "state data," or data that has been saved about a particular event. State data is stored in databases and then "synchronized" with other databases linked to other applications. But companies don't save much of the information needed to support business rules because it is too expensive or too complex.
- Companies can use BPM to cultivate existing business logic and integration capabilities, connecting existing applications and databases without modifying them. Most of these systems already have connections built into them, either through middleware or published application programming interfaces (APIs). BPM technology uses these connections to link databases and applications while providing a layer of abstraction that hides variability between systems.
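The reusable "Customer" data object described above amounts to a mapping layer: each source system's field names are translated into one shared definition. The sketch below illustrates the idea with hypothetical system and field names; a real BPM product would express these maps in XML rather than code.

```python
# Sketch of a common "Customer" data object: per-system field maps expose
# differently named source fields under one shared definition, so any
# application can consume the same object regardless of where it came from.
FIELD_MAPS = {
    "billing_db": {"cust_no": "customer_id", "nm": "name"},
    "crm_app": {"CustomerID": "customer_id", "FullName": "name"},
}


def to_customer(system: str, record: dict) -> dict:
    """Translate a system-specific record into the shared Customer object."""
    mapping = FIELD_MAPS[system]
    return {common: record[source] for source, common in mapping.items()}


a = to_customer("billing_db", {"cust_no": "C-1001", "nm": "Pat Lee"})
b = to_customer("crm_app", {"CustomerID": "C-1001", "FullName": "Pat Lee"})
assert a == b  # both systems resolve to the same Customer object
```

Because the abstraction lives in the map rather than in each application, adding a new source system means adding one entry, not rewriting every consumer.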
BPM's focus on process dramatically reduces the amount of data that needs to be moved, and thereby reduces both the initial cost and ongoing maintenance cost of application integration in three ways. First, BPM delivers only the right data to the right process at the right time, eliminating the need to move a "haystack" of data to supply a "needle." Second, by utilizing existing standards and technologies such as XML and Web services, companies can map data into virtual "data objects" and then retrieve it from its source. This reduces the need to physically synchronize data in multiple databases purely to create a common definition. Finally, companies can fill gaps in data as needed in real time. For example, if a process needs a customer lifetime value to evaluate which course of action to take, but does not contain that element, users can add the value at runtime, eliminating the need to constantly merge and update attributes in multiple databases.
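The runtime gap-filling idea in the lifetime-value example can be sketched as follows. The resolver function and attribute names are illustrative assumptions, standing in for whatever analytics service or stored model would supply the value in practice.

```python
# Sketch of filling data gaps at runtime: if a process needs an attribute
# the provisioned data lacks (here, customer lifetime value), resolve it
# on demand instead of pre-synchronizing it into every database.
def lifetime_value(customer_id: str) -> float:
    # Stand-in for a call to an analytics service or a stored model.
    return 2500.0


def enrich(event: dict, required: list) -> dict:
    """Add any missing required attributes to the event at runtime."""
    resolvers = {
        "lifetime_value": lambda e: lifetime_value(e["customer_id"]),
    }
    for attr in required:
        if attr not in event:
            event[attr] = resolvers[attr](event)
    return event


event = enrich({"customer_id": "C-1001"}, ["lifetime_value"])
assert event["lifetime_value"] == 2500.0
```

Attributes already present in the event pass through untouched; only genuine gaps trigger a lookup, which is what keeps the data movement minimal.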
The advent of BPM technology promises to dramatically reduce the cost and improve the effectiveness of the data infrastructure investments of the past decade, making the leap from data to process and giving organizations the ability to transform themselves into real-time enterprises.
David Cameron, vice president of product integration at AptSoft Corporation, manages the marketing and product integration efforts, which include working directly with prospects and clients to better understand the applications and value of AptSoft's enterprise software. In this role, he draws on more than 13 years of experience building customer-centric data and process integration applications for large corporations and applying them to strategic sales, marketing, and service solutions.
He joined AptSoft from Wheelhouse Corporation, where as part of the start-up team he built the database integration and quantitative analytics practices. He was also one of several key executives guiding Wheelhouse's software and infrastructure product development.
Prior to Wheelhouse, Cameron worked for Harte-Hanks Data Technologies, where as vice president of product integration, he co-founded two business units that provided database marketing solutions to companies including Federal Express and Toyota.
He can be reached at firstname.lastname@example.org.