Thinking Radically: Dr. Morten Middelfart, CTO of TARGIT
September 19, 2011
In 1997, TARGIT acquired Morton Systems, a company founded by Dr. Morten Middelfart and devoted to providing business intelligence (BI) and analytics solutions. Dr. Middelfart was then appointed chief technology officer (CTO) of TARGIT, and he has since become the company's technology and educational leader.
With more than 20 years of experience in the software industry, an MBA from Henley University (UK), and two PhD degrees from Rushmore University (USA) and from Aalborg University (Denmark), Dr. Middelfart has been able to apply his research to TARGIT’s BI solution—changing the way BI applications help organizations and making TARGIT an important player in the BI space.
In this interview, Dr. Middelfart discusses his vision of the BI space and how TARGIT plans to help companies overcome future challenges with their BI needs.
JG. What do you think are the major changes in the way we do BI nowadays as opposed to how we did BI 8 to 10 years ago?
MM. Having worked in BI for the past two decades, I am afraid that the next decade does not have a huge revolution in store. If I think back to the early 1990s, the challenge then was to give users easy and fast access to valid data. That challenge pretty much remains today. Sure, technology has improved in terms of processing power, storage, connectivity, and mobility, but overall the challenge stands. Moreover, the challenge does not stop at valid data, because, in my mind, an even bigger challenge is information quality—meaning that the information users get out of BI systems can be completely understood by them and that it is coherent across an organization. Information quality goes much further than data quality, as valid data can be misunderstood if a user does not understand its context and meaning and is not able to relate it to other data.
Lack of information quality is, to my mind, still the biggest challenge for organizations seeking to harness the power of information and to turn that into competitive execution. The worst thing about lacking information quality is that it can stall the decision process when users need to find consensus about the “truth” before a decision and subsequent action can be taken.
Having said this, we will of course see an increase in mobility as well as an increase in cloud-based delivery models, but these technological trends are not going to solve the fundamental information quality problem. In other words, if we are not careful, then we will just exploit technology to be even more efficient in confusing users, anytime and anywhere.
My hope for the next decade is that we will apply technology more wisely and address the fundamental problem that users face, allowing them to make well-informed decisions fast and in sync with their peers in their organization.
JG. You authored a very interesting book called "CALM: Computer Aided Leadership and Management." Could you brief us on what CALM is about, its objective, and its application within the TARGIT BI solution?
MM. The main idea of CALM is to turn everything we know about BI upside down. Today, it seems that we are much occupied with “what the new technological trends are” and “how they influence BI.” With CALM, I searched for excellent management and leadership, and from this perspective I identified the relevant technologies. In other words: management and leadership first, and technology second—as a facilitator. The findings were interesting: in particular I found that the most important factor for conducting excellent CALM is integration between the BI disciplines as we know them—e.g., direct linkage between reporting and analysis.
Conceptually, CALM led to a revival of the OODA
loop from the “TOPGUN” fighter school, in which any competitive scenario is broken down into four phases, namely, observation, orientation, decision, and action. In CALM, we seek to apply BI to allow any user at any given organizational level to cycle these four phases as fast as possible using technology. This ideal has a drastic impact on the way we design BI systems, as no technology is relevant unless it contributes to moving the user faster through the OODA loop or improves the quality of it.
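As a toy illustration only (the function names and numbers below are invented for this sketch and are not taken from TARGIT's product), the OODA cycle described above can be modeled as a loop whose per-pass cost a BI system tries to minimize:

```python
# Illustrative-only sketch of one pass through the OODA loop.
# All names and values here are hypothetical.

def observe(data_source):
    """Pull the latest measurement from a data source."""
    return data_source()

def orient(observation, context):
    """Place the observation in context, e.g. compare it against a baseline."""
    return observation - context["baseline"]

def decide(deviation, tolerance):
    """Decide whether the deviation warrants action."""
    return abs(deviation) > tolerance

def act(deviation):
    """Take a corrective action; here we just record it as text."""
    return f"adjust by {-deviation}"

def ooda_cycle(data_source, context, tolerance=5):
    """One pass: observe -> orient -> decide -> act.
    In the CALM framing, a BI system succeeds by letting a user
    complete each pass with as few interactions as possible."""
    deviation = orient(observe(data_source), context)
    if decide(deviation, tolerance):
        return act(deviation)
    return "no action"

# A reading of 112 against a baseline of 100 exceeds the tolerance,
# so the cycle ends in an action; a reading of 102 does not.
result = ooda_cycle(lambda: 112, {"baseline": 100})
```

The design point the sketch tries to capture is that every feature is judged by whether it shortens, or improves the quality of, a pass through this loop.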
Another aspect of CALM is that it seeks to create synergy between humans and computers, as opposed to simple efficiency. By “synergy,” I mean allowing an organization to do things that would never have been possible without computers—new ways of navigating the organization in its competitive environment. Again, the state of the art is somewhat disappointing, as most organizations only use computing to create the reports they have always known, and thus all they gain is efficiency in terms of fewer person-hours spent. CALM seeks to do things with an organization that would be impossible for any human to do, regardless of the amount of time and people available.
JG. Many companies seem to have their own ideas about the ease of use of BI applications. What is your company’s or your personal view in this regard?
MM. I agree that ease of use is a huge challenge for BI applications, as it is the facilitator for allowing multiple users to make better and faster decisions. But I think there could be more focus in these efforts. Personally, I think that usability has a direct effect on reducing the number of interactions that the user needs to go through to make a decision. If we reduce the number of interactions, we reduce the risk of error, the time spent, and the time needed for training. But reducing the number of interactions also means we need to focus BI systems towards their intended use; therefore, I believe strongly in reducing the number of interactions needed for a user to cycle the OODA loop mentioned earlier. In my mind, an organization is most efficient when maximizing execution while minimizing interaction. This may be a no-brainer, but allowing this to happen means a radical redesign of applications to integrate all BI disciplines; e.g., there can be no separate report authoring without a complete integration with analysis, because this is needed to swiftly move from observation to orientation. But, yet again, most often we see these two disciplines being kept separate, both in software and in the organization.
JG. One of the components of CALM is “sentinel mining.” What is a sentinel? What is the difference between using sentinels and other classic data mining approaches?
MM. A sentinel is an agent that autonomously monitors a part of an organization’s data in order to warn about potentially critical business changes before they happen. A sentinel is discovered through a data mining process that identifies change patterns in less critical data that seem to precede critical changes to important areas of an organization’s data; usually these important data are referred to as key performance indicators (KPIs). Once these preceding patterns are identified, agents are assigned to monitor whether a change pattern occurs, and if so, the user can get a warning that the given change might influence a KPI. The user will know what pattern occurred and, based on historical data, how likely it is that the change will have an impact on a KPI.
Compared with other data mining algorithms, sentinel mining has some distinct properties in that it identifies so-called bidirectional patterns, meaning that the same pattern works for both negative and positive changes to a KPI. Other algorithms, such as association rule mining and sequential pattern mining, can identify only one direction per rule mined. For more detailed knowledge about the wonders of data mining, and how sentinel mining works in particular, please refer to
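As a rough, hypothetical sketch of the idea (this is a toy heuristic, not TARGIT's actual sentinel mining algorithm; all names and data are invented), one can score a candidate sentinel by checking how often changes in a source measure are followed, after a fixed lag, by consistently directed changes in a KPI, with one rule covering both up- and down-movements (the bidirectional property mentioned above):

```python
# Toy sketch of sentinel scoring. Hypothetical names and data throughout.

def changes(series, threshold=0.0):
    """Map a time series to +1/-1/0 change indications between periods."""
    out = []
    for prev, cur in zip(series, series[1:]):
        if cur - prev > threshold:
            out.append(1)
        elif prev - cur > threshold:
            out.append(-1)
        else:
            out.append(0)
    return out

def sentinel_quality(source, kpi, lag=1):
    """Score a candidate sentinel: the fraction of source changes followed,
    `lag` periods later, by a KPI change in a consistent direction.
    The rule is bidirectional: the same relationship (direct or inverse)
    must hold for both up- and down-movements of the source."""
    src = changes(source)
    tgt = changes(kpi)
    # Pair each source change with the KPI change `lag` periods later,
    # keeping only periods where both actually changed.
    pairs = [(s, t) for s, t in zip(src, tgt[lag:]) if s != 0 and t != 0]
    if not pairs:
        return 0.0, 0
    same = sum(1 for s, t in pairs if s == t)
    # +1 = source and KPI tend to move together, -1 = inversely.
    direction = 1 if same >= len(pairs) - same else -1
    hits = sum(1 for s, t in pairs if s * t == direction)
    return hits / len(pairs), direction

# Invented example: complaint counts rising one period before revenue falls.
complaints = [10, 14, 13, 18, 22, 20, 25]
revenue    = [100, 101, 95, 97, 90, 85, 88]
quality, direction = sentinel_quality(complaints, revenue, lag=1)
# quality == 1.0, direction == -1 (a perfectly consistent inverse pattern)
```

In this framing, a sentinel with high quality would be promoted to an agent that watches the source measure and warns the user when its change pattern fires.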
JG. How are these implemented in TARGIT BI? What are the benefits of using them?
MM. The most important message with regard to sentinels is that they are easily discovered and applied by end users who need not have any particular data mining knowledge. The fact that a complex data mining algorithm can be embedded and allow an organization to achieve a competitive edge by “seeing” changes before they happen is a very good example of how human–computer synergy can be achieved. There is simply no way that casual users would be able to discover these relationships, regardless of the time they had available.
The benefits from an organizational standpoint are obvious: since resources can be allocated to problems or opportunities before they occur (or while they emerge), resources are spent much more efficiently—problems can be dealt with before they become severe, and opportunities can be followed through before they are missed.
With the implementation of sentinel mining in the TARGIT BI Suite, we have demonstrated directly how human–computer synergy can be built into the design of software. We have also allowed end users to data mine, thereby effectively taking data mining out of the lab, where only a few users would have been able to benefit from it.
JG. What are the challenges that organizations still have to face in order to improve their decision-making process?
MM. I believe that the challenge is still to allow the great multitude of users in an organization to make informed decisions fast. As mentioned earlier, there are a number of challenges that still need to be overcome to deliver this promise. In my mind, the main emphasis when addressing these challenges should be on the managerial and leadership tasks, as opposed to emphasizing the technologies that happen to be available. For example, the CALM philosophy was developed while seeking to identify excellent leadership and management, and from this standpoint, technology was applied to deliver human–computer synergy in the process of conducting these managerial tasks.
In general, I think that the true organizational challenge in any application of computing is to achieve synergy rather than simple efficiency. Reaching this point takes a lot of mental processing—it means that the people in the organization must trust the computers to do certain tasks autonomously. Even though a computer is indeed trustworthy and reliable in carrying out a certain part of the decision process, that does not mean that the person in front of it is ready to let it do so.
JG. What are your thoughts on pervasive BI—is it really necessary for an organization? How can an organization successfully achieve BI pervasiveness?
MM. I believe that business intelligence should be integrated with or into any process where it is relevant.
I am, however, not completely in line with those ERP vendors that believe that BI should be integrated into their system. In fact, I think it is the other way around: BI should incorporate all data sources available in an organization, whether they are ERP, social media, Web site stats, unstructured data, etc. This makes much more sense since the number of data sources will continuously increase, and the best decisions are made by involving all relevant data sources, particularly if we data mine. I guess no one would argue that all their e-mails or Web site stats should be fed into the ERP system and stored there, and thus the logical consequence is that BI should reside on top of all sources. In my opinion, it should be the BI application saying whether the users should reply to a complaint via e-mail or change the product prices in the ERP.
Having said this, I would agree that sometimes it is relevant for BI to be exposed through another application, and if this is done in a way such that the user runs through an OODA loop with as few interactions as possible, I am all for it. But I do think that the pervasiveness of BI will come from the top of the hierarchy, and thus BI will be the initiator of processes in many more cases than it will come from below as part of a process already initiated. This at least holds if we apply BI to its full extent and allow the BI application to be the primary lever of control.
JG. Do sentinels promote BI pervasiveness? How?
MM. From the perspective of having the BI system being able to see all data in the organization, sentinels will indeed be pervasive across all processes in the organization.
Sentinels will allow users to react to threats or opportunities; this is particularly effective when the patterns are mined on all data, identifying leading data that influence key performance indicators in ways unknown to the user.
JG. What is your favorite cartoon character?
JG. What is your favorite nontechnical book?
” by Arthur C. Clarke.