Processing Complex Events (During these, oh well, Complex Times) - Part I
March 25 2009
The worn-out saying about how we learn new things every day applies to this blog topic too. Namely, my interest in Progress Software Corporation has long been due to its OpenEdge application development platform. Indeed, many enterprise resource planning (ERP) and other applications providers leverage (embed) OpenEdge as partners. Sure, I also follow and have recently written about the company's forays in the service-oriented architecture (SOA) space with its enterprise service bus (ESB) offerings. But in late 2007, out of mere courtesy, I accepted a briefing about Apama, the company's platform for complex event processing (CEP), and whatnot. Given the overwhelming nature ("rocket science" of a sort) of the offering's concept, I now admit that I could not wait for the briefing to end.
Actually, I felt bamboozled like those ordinary mortal FBI agents in
CBS’ primetime hit show “Numb3rs.”
In that show, time and again the whiz kid math genius (the brother of the FBI team leader) tries to explain to these action-rather-than-theory agents how some complex and arcane math theory can be applied to make sense out of seemingly chaotic and unrelated events. Eventually, complex math solves some important crimes, often by detecting patterns that are not obvious to the naked eye.
Well, fast forward to early 2009, where at Progress' Analyst Summit (a traditional Boston winter fixture event) we could all find out that Apama is possibly the best-performing and fastest-growing part of the company. OpenEdge, while still contributing over 60 percent of Progress' total revenues, is a mature business that is now sold mostly to independent software vendors (ISVs). In addition, the recent financial markets (and consequently the overall economic) crisis and related cases of high-profile frauds ("white-collar crimes") have made me conduct my own study of Apama and become familiar with its underlying concept.
Frankly, I no longer grapple as much with the concept of CEP per se (Progress Software refers to CEP as “The Brains of the High Velocity Business”). Where I still get lost though is when it comes to CEP’s relationships with other like technologies and concepts “du jour.”
An IT-Director article from two years ago confirms that there are various issues confronting the event processing community. There now seems to be common agreement that event processing as a term should best be used to encompass both CEP and event stream processing (ESP), the latter term arguably applying to events that are not necessarily complex in themselves.
In addition to just what exactly event processing is, and whether and how it differs from operational business intelligence (BI), another vexing issue is how big the event processing market is. For those of you who might want to delve into philosophical discussions about which concept is broader (and which came first) within the alphabet soup of CEP, ESP, SOA, event driven architecture (EDA), and business activity monitoring (BAM), there are plenty of blog posts on the topic. There is also an excellent blog post on a practical combination of SOA, EDA, and CEP, plus you can always peruse the blog from the horse's mouth (it is maintained by Progress Apama staffers).
Principles of CEP-based Systems
In plain English, CEP lends itself well to any environment that treats any business update as an "event." Such organizations want to enable users to rapidly define event-based rules to identify patterns indicating opportunities and threats to the business. These encapsulated rules (expressed, for example, as structured query language [SQL]-like statements) are loaded into a real-time computing (RTC) CEP engine.

The correlating engine is permanently connected to multiple event sources and destinations (with volumes of events and related data points) and offers analysis and response within an extremely short time period. Events can be captured and preserved in time-order for a historical root-cause analysis (RCA).
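To make the principle concrete, here is a minimal sketch in Python of such a correlating engine. All names here are hypothetical illustrations of the general idea; this is not Apama's actual API or its event processing language:

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Event:
    source: str       # e.g., a market-data feed
    name: str         # e.g., "price_update"
    value: float
    timestamp: float  # seconds since epoch

class CepEngine:
    """Toy correlating engine: every rule is evaluated against each
    incoming event, and a bounded time-ordered history is kept for
    later root-cause analysis."""
    def __init__(self, history_size=10_000):
        self.rules = []                           # (condition, action) pairs
        self.history = deque(maxlen=history_size)  # events preserved in time-order

    def add_rule(self, condition, action):
        self.rules.append((condition, action))

    def on_event(self, event):
        self.history.append(event)
        for condition, action in self.rules:
            if condition(event):   # pattern detected -> automated response
                action(event)

# Usage: flag any price update above a threshold.
alerts = []
engine = CepEngine()
engine.add_rule(lambda e: e.name == "price_update" and e.value > 100.0,
                lambda e: alerts.append(e))
engine.on_event(Event("NYSE", "price_update", 101.5, 0.0))
```

A real CEP engine would add temporal windows, cross-stream joins, and a declarative rule language on top of this event-in, rules-fire, action-out loop.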
Given that algorithmic trading in capital markets was one of the first real-life applications of CEP, let's translate the above general CEP principles into trading terms. The continuing digitization of financial market data and the advancement of electronic market access have created a market environment in which competitive differentiation amongst financial service firms rests with split-second algorithmic execution that can exploit minuscule and momentary advantages in price, time, and available liquidity.
To that end, a trading company will treat any market update as an "event" and will enable users to rapidly build quantitative trading rules (based on their vast experience and know-how) to identify trading opportunities and risk breaches. Germane trading rules are then loaded into a CEP engine that offers real-time analysis and response with a latency measured in milliseconds.

The trading system is permanently connected to a number of relevant market data sources, news-feeds, and trading venues (exchanges). Finally, events can be captured and preserved in time-order for historical root-cause analysis.
In summary, the drivers for CEP adoption are the following:
Applications with high throughput and low latency requirements. Such requirements stem from market trends such as higher-velocity business event flows, more voluminous (and yet shorter-lived) transactions, and rapidly changing market conditions. These trends in turn pose challenges to customers in terms of how to detect opportunities and threats in real time, and how to monitor the health of their business; and
The need for rapid software development and customization, and increasing application complexity (temporal logic, real-time analytics, etc.). The customers' challenge in this regard is how to accelerate the deployment of new capabilities.
Differing from BI
CEP differs from traditional computing in its requirement for continuous execution of logic scenarios against a huge and continuous stream of information. This sharply contrasts with the traditional database model, where data must first be stored and then retrieved from the database in order to compute values. An example of static data processing would be the ability to answer the question: "What were the best performing stocks last week?"
Conversely, CEP is aimed at providing event-driven query and analytic processing (e.g., providing algorithmic financial trading solutions in the financial arena) in real-time.
Another IT-Director article explains
that the conventional query-and-report approaches to these environments are only suitable for environments of smaller scale or those in which limited numbers of data feeds are being monitored.
In particular, these reporting solutions cannot handle environments where large numbers of data feeds need to be combined and correlated in a complex and dynamic (on-the-fly) manner. Typical alerting engines are only really suited for monitoring individual threshold events (and they cannot predict the probability that a process will cross the threshold), while more comprehensive solutions such as conventional database approaches simply lack the capability for real-time processing (RTP). They require the data to be committed to the database prior to query processing, and indexes to be updated, both of which inherently mean some time delay. Rather than committing data to a database and then processing it, CEP platforms process data directly as it is fed into the system, without resorting to the use of a database. In principle, all of the resulting database overhead activities are therefore omitted, resulting in better performance and significantly improved scalability.
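The store-then-query versus process-on-arrival distinction can be sketched in a few lines of Python. This is a deliberately simplified contrast, using an in-memory SQLite database as a stand-in for a conventional database:

```python
import sqlite3

# Store-then-query: each event must be inserted and committed before
# any query can see it -- that round trip is the inherent time delay.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE ticks (symbol TEXT, price REAL)")
db.execute("INSERT INTO ticks VALUES ('X', 101.0)")
db.commit()  # durability and index-maintenance cost paid here
best = db.execute("SELECT MAX(price) FROM ticks").fetchone()[0]

# Process-on-arrival: a running aggregate is updated in memory the
# moment the event streams in; no commit, no index maintenance.
running_max = float("-inf")

def on_tick(price):
    global running_max
    running_max = max(running_max, price)

on_tick(101.0)
```

Both paths end with the same answer, but the streaming path answers it per event, with none of the database's commit or indexing overhead on the critical path.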
To illustrate with some examples of high-frequency trading rules, one rule could be defined like this: "When stock X's price moves two percent outside its five-minute moving average margin, buy it now!" For a slightly more complex example: "IF stock X's price moves outside two percent of its moving average margin, followed by the market index moving up by half a percent AND stock Y's price moving up by five percent OR stock X's price moving down by two percent, ALL WITHIN any two-minute time period, THEN buy stock X and sell stock Y!"
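The first rule above can be encoded in a few lines. The sketch below is a hypothetical illustration (a tick-count window stands in for the five-minute window, and `MovingAverageRule` is my own name, not a real trading-platform API):

```python
from collections import deque

class MovingAverageRule:
    """Signal when the price deviates more than `pct` from the rolling
    average of the last `window` ticks (a stand-in for a five-minute
    moving average)."""
    def __init__(self, window=5, pct=0.02):
        self.prices = deque(maxlen=window)
        self.pct = pct

    def on_tick(self, price):
        signal = None
        if len(self.prices) == self.prices.maxlen:  # window is full
            avg = sum(self.prices) / len(self.prices)
            if price > avg * (1 + self.pct):
                signal = "BUY"
            elif price < avg * (1 - self.pct):
                signal = "SELL"
        self.prices.append(price)
        return signal

# Usage: five flat ticks fill the window, then a 3 percent jump fires.
rule = MovingAverageRule()
ticks = [100, 100, 100, 100, 100, 103]
signals = [rule.on_tick(p) for p in ticks]
```

The second, compound rule would additionally correlate three streams (stock X, stock Y, and the index) under a shared two-minute window, which is exactly the cross-stream, time-constrained matching a CEP engine automates.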
Again, the singular trait of CEP is its ability to handle complex event sequences coming from multiple data streams within real-time constraints. CEP accomplishes this via automated actions and built-in analytics.
Back to Apama (not Panama or Obama, Bozo!)
Progress Apama became part of Progress Software via the acquisition of the former Apama Ltd. in April of 2005. Apama is the core technology foundation for Progress' initiatives in CEP and for the company's go-to-market initiatives that leverage that CEP platform in capital markets for the following "daily bread" actions: algorithmic trading, smart order routing, and the like.
Prior to its acquisition by Progress Software, Apama had a few dozen customers in London, New York, and Boston. Today, however, after leveraging the global parent’s infrastructure, Apama is marketed and sold in all the major financial centers in the world.
Apama was founded in 1999 in Cambridge (UK) by John Bates and Giles Nelson. Fellow Cambridge researchers and CEP visionaries, Bates and Nelson are co-holders of the patents on Apama's core technology, which is a commercially productized expression of their efforts to create a platform for the unique characteristics of "event-based" applications.
Apama had set out to try and resolve a number of telecommunications-based real-time mobility issues, but then realized that there were additional commercial opportunities in a wide range of environments. As a result, the company has historically focused on financial markets, and specifically on financial trading systems, where real-time event-based trading systems are in high demand. The capital markets segment has indeed proven to be an early proof point for the Apama CEP platform. Apama's design philosophy and architecture were intended to provide a platform that allows traders to quickly develop and deploy distinctive proprietary strategies that exploit these opportunities and mitigate risks.
In addition to the above-mentioned CEP applications in capital markets, other current (or future) uses in the segment include trading and pricing, foreign exchange (Forex) aggregation and algorithms, news-driven algorithms, and so on.
Part II will continue with an examination of Progress Apama's current state of affairs. In the meantime, what are your views, comments, opinions, etc. about the concept of CEP in general and about Apama per se? Can you envision leveraging the concept within your business, and in what manner?