
Managing Risks, Managing Measures: Decision Support Methodologies in Business Part 2: Decision Aids

Written By: Dr. Eddie Robins, PhD
Published On: March 1, 2002

Executive Summary

When it comes to making decisions with a tool that purports to measure value, it is wise to know something about the tool, what it is measuring, and why. Poor measurement methods contribute, at least in part, to project failures and corporate inefficiencies that can strip millions or more from bottom lines; poor technology selections alone have contributed to tens of billions of dollars of unnecessary costs. The methodologies implicit within decision aid tools can surface value for their users, but the question is whether those values really represent what stakeholders think they are being shown, or what they actually need. This article provides some guidance on these issues and on how a manager may avoid making an expensive mistake.

Part One of this article provided some guidance on these issues and on how a manager may avoid making an expensive mistake.
 
This part discusses some of the tools used for business decision making.

A Look at Decision Aids in Business

Choosing which methods and processes to use for decision making in business is not easy and requires some expertise, and it is difficult for a practitioner to explain a decision aid tool, and where it fits in, to a senior manager. Often the manager is not interested, yet the consequences can be far-reaching: the method and tool chosen must provide a good evaluation of the decision choices at hand and must also address the business question, not just the technical issues. In this article I examine the methods that have been used in business and some of the applications where they have been applied.

Methods and More Methods

The list of methodologies for assisting in decision making is vast in academic circles, but the few that are widespread and used consistently in business are determined by several criteria. These include:

  1. Simplicity of use
     
  2. Ease of understanding: being able to build trust
     
  3. Cost/Benefit/Effort tradeoffs
     
  4. Ease of implementation: turning the method into a usable tool

Strangely, I do not believe that the outcomes of these methods, that is, what they actually achieve in business, are a determining factor in their continued use. It is difficult to measure their performance simply because it is not always possible (in fact, mostly impossible) to repeat the experiment with another aid to double-check the first. Some research by the author suggests that decision aids differ in their outcomes perhaps 20% to 30% of the time, and may even come to diametrically opposite conclusions. Such ambiguities can arise from several sources, which include:

  1. The business question is ill-posed in the context of the tool
     
  2. The tool's methodology does not, or cannot, answer the real question posed to it, and can give false results
     
  3. Incorrect use of the tool's processes and analytics
     
  4. Critical elements are missing from the tool's capabilities, its methodology, its process (missing process elements), or its analytics, or indeed from a combination of these.
     
  5. The tool gives no indication of the subtle differences among the choices, leading to a choice where the subtle turns out to be not so subtle. In IT, for example, the fact that one tool easily supports multiple platforms while another does not can be overlooked because other attributes are considered more 'dominant'. This is where process and tool capabilities are needed, whether as inherent capabilities ("you may want to look at this"), supported process points (do this before you look at the results), or analytics (the user has tools to mine for the issues), as point 4 above discusses.

What one can do is look at which methods are actually in use, even if they are not known by their academic names. For example, procurement largely uses, unbeknownst to most procurement officers, MAUT (Multi-Attribute Utility Theory). In fact, of all the methods used in the business world, this is almost certainly the most prevalent, despite claims from, for example, the AHP (Analytic Hierarchy Process) camp. Whether it is used correctly or well is another matter.
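To make the procurement example concrete, here is a minimal sketch of the additive weighted scoring that underlies most MAUT-style scorecards. The attribute names, weights, and utility scores are invented for illustration; they are not drawn from any real evaluation.

```python
# Minimal MAUT-style additive scoring sketch (all values are illustrative).
# Each alternative is scored on a set of attributes on a 0-1 utility scale,
# and each attribute carries a weight reflecting its relative importance.

weights = {  # hypothetical importance weights, summing to 1.0
    "cost": 0.40,
    "functionality": 0.35,
    "vendor_support": 0.25,
}

alternatives = {  # hypothetical 0-1 utility scores per attribute
    "Vendor A": {"cost": 0.9, "functionality": 0.6, "vendor_support": 0.7},
    "Vendor B": {"cost": 0.5, "functionality": 0.9, "vendor_support": 0.8},
}

def maut_score(scores):
    """Additive multi-attribute utility: sum of weight * utility."""
    return sum(weights[attr] * scores[attr] for attr in weights)

for name, scores in alternatives.items():
    print(f"{name}: {maut_score(scores):.3f}")
# Vendor A: 0.40*0.9 + 0.35*0.6 + 0.25*0.7 = 0.745
# Vendor B: 0.40*0.5 + 0.35*0.9 + 0.25*0.8 = 0.715
```

Note how close the two totals are: a small change in the weights would flip the ranking, which is precisely the kind of 'subtle difference' described in point 5 above.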

Table of Methods and Business Applications

The table below summarizes the methodologies most used in business. There are, of course, innumerable methods that specific consultants have developed, but I focus on those that are most likely to be used, in various forms and not necessarily under the academic terminology. Additionally, I identify some of the tools that exist on the market.

Proving Success of Decision-aiding Methodologies

Though many academic methods exist, few actually reach the audience they need to reach, because of the barrier between academic and business languages and the difficulty of producing workable and usable tools.

Additionally, there is the real question of the measured value that any tool provides. There are no current standards that help businesses determine whether a particular process and methodology is of real use, partly because a particular situation cannot be 'repeated'. However, there are other approaches, such as collecting data to show that a business derives a real return from using methods that presumably help make better decisions. In some circumstances, as in the use of decision trees to decide where to drill for oil, companies can obtain a tangible measure of improvement. In technology implementations, the numbers will only emerge if there are repeatable decisions that impact the bottom line, or that enable a business capability that would not otherwise occur and that actually delivers value. Industry wide, of course, better decisions should lead to fewer poor choices, smaller overall losses from failed, delayed, or restarted projects, and better quality in delivered solutions. This success should be reflected in the bottom line and in fewer failed or restarted projects.
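As an illustration of why the oil drilling case is measurable, here is a minimal expected monetary value (EMV) sketch for a single drill-or-don't-drill decision tree. The probabilities and payoffs are invented for the example; real exploration models are far richer.

```python
# Minimal expected-monetary-value (EMV) sketch for a drill / don't-drill
# decision. All probabilities and payoffs below are invented assumptions.

def emv(outcomes):
    """Expected monetary value of a chance node: sum of probability * payoff."""
    return sum(p * payoff for p, payoff in outcomes)

drill = emv([
    (0.20, 50_000_000),   # assumed 20% chance of a producing well
    (0.80, -5_000_000),   # assumed 80% chance of a dry hole (drilling cost lost)
])
dont_drill = 0.0          # walk away: no cost, no revenue

print(f"EMV(drill) = {drill:,.0f}")            # 6,000,000
print(f"EMV(don't drill) = {dont_drill:,.0f}") # 0
```

Because the same kind of decision is repeated many times, the realized results can be compared against the model's predictions, which is what makes the method's value tangible in this setting.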

It would be a poor manager, of course, who relied solely on the outcomes of decision aids, as they are usually only part of the process. Tools provide better information and can assist in the decision process, but the final decision often rests on relationships among the stakeholders, vendors and customers alike. Trust and other pieces of the decision puzzle can always be forced into a model, but, as the McNamara fallacy warns, not everything that matters can or should be measured.

Although a decision process can help decision makers reach a decision, there is the whole issue of the Total Cost of Ownership of the decision itself. Selecting a new piece of technology is complex, and you can of course save by reaching a solution faster; however, if that solution is not as good as the one another method would have identified, any cost savings at the decision point become negligible. What is needed is a way of mapping the complexities of a major decision onto a process, and onto the tools that you need.
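A back-of-the-envelope illustration of that point, with invented numbers: suppose a quick selection process costs less to run but leads to a choice with a higher lifecycle cost.

```python
# Invented numbers comparing decision-process cost against the five-year
# total cost of ownership (TCO) of the selected solution.

fast_process = {"selection_cost": 50_000,  "five_year_tco": 4_200_000}
slow_process = {"selection_cost": 200_000, "five_year_tco": 3_500_000}

def total_cost(option):
    """Cost of running the selection plus lifecycle cost of the outcome."""
    return option["selection_cost"] + option["five_year_tco"]

print(f"fast process: {total_cost(fast_process):,}")  # 4,250,000
print(f"slow process: {total_cost(slow_process):,}")  # 3,700,000
# The 150,000 saved by deciding faster is dwarfed by the 700,000 difference
# in lifecycle cost when the quicker process picks the weaker solution.
```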

What to do

One suggestion that can help you avoid pitfalls, if you are going to rely on a tool (and even if you are not), is to have a clear plan of how you are going to go through the decision process and of what you need from the tool or tools to support that process. Decision aid makers always have a specific methodology and process in mind, which can help or hinder you. Have your plan ready to compare against what the tool can deliver. Unfortunately, academic language does not necessarily match the language of business, and this may not be simply a matter of semantics. Many times it is the difference between understanding value from an academic perspective and from a business perspective that leads to success or failure. The semantic and process mapping for many tools has yet to be done.

Summary

Until a decision methodology can provide true measurable value, or can be demonstrated to provide it, it will be difficult for that methodology to make inroads into a business environment. Even so, those that have made inroads are either simple to use or have provided understandable and good value in the cost/benefit/effort tradeoffs.
 

About The Author

Dr. Eddie Robins, formerly Chief Scientist of the Technology Evaluation Center (TEC), was responsible for the scientific development of TESS, the software engine that drives the selection models developed by TEC research analysts.

Dr. Robins has over twenty years of industry experience and, since 1995, has been instrumental at Arlington Software, the forerunner of TEC, in developing new mathematical algorithms that have led to patented methods of decision analysis and novel decision-making processes. These provide increased accuracy of risk assessment in selection processes. The work was based on experience garnered from over 200 client organizations, including Fortune 1000 companies and government departments.

Dr. Robins regularly presents at academic meetings and conferences in Operations Research and Management Science. He has managed and led multidisciplinary development teams in a variety of scientific and technical projects, including the development of pre-press equipment, life-test and reliability of integrated circuits for a Trans-Atlantic fiber-optic cable system, data processing and data acquisition systems at the Canadian Center for Magnetic Fusion, and power device development at GEC Hirst Research Centre, UK. He has also been a consultant to the International Civil Aviation Organization (ICAO) under its Technology Assistance Program.

Dr. Robins is a physics honors graduate from Imperial College, London, and the University of Manchester Institute of Science and Technology.

Dr. Robins currently consults for start-up and leading edge technology organizations, related to business analytics, soft computing, CRM, Decision Support and Knowledge Management, and can be reached at esro@attbi.com.

 