
Software Selection Slagging

Written By: Josh Chalifour
Published On: December 17, 2007

The consulting firm 180 Systems posted a short note regarding an article we recently published about parts of the evaluation and selection process that we promote. I'd recommend checking out the 180 site for additional information. In fact, they have a brief list of steps that is very similar to commonly recognized selection best practices and, in many ways, close to what we at TEC recommend (more on that later, as we're preparing to publish our internal guide). 180 Systems has perspectives to contribute on the subject, which is why I'm surprised they didn't say something more critically astute in the post I cited above.

The 180 Systems blog author (sorry to treat this anonymously; I couldn't find a name associated with their post) said:
"We have not spoken to any of their customers about the usefulness of their services or used it ourselves. Our first impressions are that we find it hard to believe that they have vetted thousands of criteria on hundreds of vendors."

It would be difficult to make an informed comment without actually experimenting with a free trial of our tools or talking to anyone. Maybe I can shed some light. While we didn't exactly state that we'd vetted thousands of criteria on hundreds of vendors, I can see how the phrase may have been read that way. Still, when it comes down to it, the misreading turns out to be largely correct. Consider this: we've researched and modeled roughly 50,000 distinct criteria on which to evaluate different software systems. When we take on selection consulting engagements and help clients through a complete selection process, we go through scripted scenarios with the vendors and the clients. Aside from these hands-on selection consulting projects, we regularly certify vendors, a process that involves asking them to demonstrate support for criteria. So yes, we have vetted thousands.
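
To make the vetting point a little more concrete, here's a minimal sketch (in Python) of how vendor responses to individual criteria might be tracked against what is actually demonstrated. The criterion names, support levels, and classes are hypothetical illustrations, not TEC's actual data model.

from dataclasses import dataclass, field
from enum import Enum
from typing import List

class Support(Enum):
    SUPPORTED = "supported out of the box"
    CUSTOMIZATION = "supported via customization"
    THIRD_PARTY = "supported via a third-party product"
    NOT_SUPPORTED = "not supported"

@dataclass
class CriterionResponse:
    criterion: str          # e.g., "Multicurrency general ledger"
    claimed: Support        # what the vendor stated in its RFI response
    verified: bool = False  # True once demonstrated in a scripted scenario

@dataclass
class VendorProfile:
    vendor: str
    responses: List[CriterionResponse] = field(default_factory=list)

    def unverified(self) -> List[CriterionResponse]:
        # Criteria still awaiting a scripted demonstration
        return [r for r in self.responses if not r.verified]

# Usage: flag the gap between what a vendor claims and what has been shown.
profile = VendorProfile("Acme ERP", [
    CriterionResponse("Multicurrency general ledger", Support.SUPPORTED, verified=True),
    CriterionResponse("Lot-level inventory traceability", Support.CUSTOMIZATION),
])
print([r.criterion for r in profile.unverified()])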

With all due respect, I think the 180 Systems blog author ought to do a bit more in-depth research before casting doubt. The 180 Systems methodology correctly recommends defining requirements, prioritizing, and identifying potential vendors as steps in the selection process, but it could offer more expertise on good techniques for accomplishing them. Unfortunately, the author seems to have missed the point that very sophisticated tools and data resources exist to help accomplish this.

When 180 says things like "...do some pre-screening of the VAR. Better yet, get the VAR's name from someone you know," it cuts short important issues to consider. I agree that it's good to get references and information from trustworthy sources, but solid decisions need to be made on a well-rounded combination of pertinent factors. I'd suggest that systematically evaluating decision criteria relevant to business requirements should be at the crux of a well-made decision.

The other steps mentioned (defining requirements and prioritizing) are two examples of where a selection process can become exceedingly complex and error-prone without good tools and assistance to help manage everything. These, of course, are at the forefront of what we designed our tools and research to support. When engaged on a client's project, we ensure that everything, including references, is factored into the auditable, unbiased results.

If the author at 180 Systems had tried the free trial (please give it a shot), s/he could have obtained a preliminary list of vendors to continue evaluating in serious depth (thousands of criteria) and prioritized each of those criteria at any level of granularity required for a good decision. This directly contradicts the author's inaccurate claim that we only enable prioritizing at a high level (strategies for prioritizing merit another post anyway).
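
For readers wondering what prioritizing at any level of granularity can mean in practice, here's a minimal sketch (in Python) of hierarchical weighted scoring: weights set on a branch of a criteria tree roll up from its leaves, so a buyer can prioritize at the module level, the individual-feature level, or anywhere in between. The criteria, weights, and ratings below are hypothetical illustrations of the general technique, not TEC's actual algorithm.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Criterion:
    name: str
    weight: float = 1.0              # priority relative to sibling criteria
    rating: Optional[float] = None   # leaf-level vendor rating, 0-100
    children: List["Criterion"] = field(default_factory=list)

    def score(self) -> float:
        # Roll leaf ratings up through the weighted hierarchy.
        if not self.children:
            return self.rating or 0.0
        total = sum(c.weight for c in self.children) or 1.0
        return sum(c.weight * c.score() for c in self.children) / total

# Usage: weight Financials twice as heavily as Human Resources, and within
# Financials emphasize the general ledger over budgeting.
model = Criterion("ERP fit for one vendor", children=[
    Criterion("Financials", weight=2.0, children=[
        Criterion("General ledger", weight=3.0, rating=90),
        Criterion("Budgeting", weight=1.0, rating=60),
    ]),
    Criterion("Human Resources", weight=1.0, children=[
        Criterion("Payroll", rating=75),
    ]),
])
print(round(model.score(), 1))  # prints 80.0 -- the weighted fit score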

As a final note, I would be curious to see what sort of tool 180 develops, as their blog author suggested they might. Considering TEC has been consistently developing, refining, and using DSS tools in real-world situations for well over a decade, different perspectives on the approach intrigue me. I've worked with and seen other companies that tried this in different ways, though none have ever come close to the usability, precision, applicability, and quantity/quality of data that TEC currently delivers. Maybe 180, you'd like to collaborate with us instead. :-)
 