Service virtualisation
By: Philip Howard, Research Director - Data Management, Bloor Research
Published: 11th May 2012
Copyright Bloor Research © 2012

When you are developing or testing software that will exist within a service-oriented architecture (SOA), you may have issues with the stability or availability of the services that you need to consume. This may be because those services are themselves still under development or not yet complete, because they are controlled by third parties, because they are live systems operating under performance constraints, or because they are only available at inconvenient times of the day.

Service virtualisation is a technique for virtualising these services so that you can develop and test without regard to the stability or availability of the services you depend on. In effect, service virtualisation software allows you to "pretend" that those services are available and stable. There are various techniques you can use for this purpose: record and playback, creating dummy services, creating stubs that return canned or echoed responses, and so on.
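
By way of illustration only (this is my own minimal sketch, not a description of any vendor's product), the "dummy service" approach can be as simple as a stand-in HTTP endpoint that returns canned responses, so dependent code can be developed and tested while the real service is unavailable. The /price and /stock paths and their payloads below are invented for the example.

# Minimal "dummy service" stub: stands in for a real, unavailable SOA service
# so that code which consumes it can be developed and tested regardless.
# Endpoint paths and payloads are hypothetical.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

CANNED_RESPONSES = {
    "/price": {"sku": "ABC-123", "price": 9.99, "currency": "GBP"},
    "/stock": {"sku": "ABC-123", "quantity": 42},
}

class StubServiceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = CANNED_RESPONSES.get(self.path)
        if body is None:
            self.send_response(404)
            self.end_headers()
            return
        payload = json.dumps(body).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

if __name__ == "__main__":
    # Tests point at http://localhost:8080 instead of the real service.
    HTTPServer(("localhost", 8080), StubServiceHandler).serve_forever()

Record-and-playback tools work on the same principle, except that the canned responses are captured from the live service rather than written by hand.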

The actual number of vendors that provide this sort of functionality is relatively small: HP, CA, Parasoft, IBM (Greenhat) and Grid-Tools are the only ones I know about.

Now, regular readers will know that I don't dip my toe into development and testing as a subject very often: data is more my thing. However, there have been a couple of recent developments within the realm of service virtualisation that have caught my attention. The first was Grid-Tools' announcement of Intelligent Virtual Services, which is a service virtualisation product, and the second was IBM's acquisition of Greenhat.

So, what did I find interesting about these announcements? Well, in the case of Grid-Tools, it is the fact that the company is primarily known for test data management, and that Intelligent Virtual Services caters for data, including dynamic data masking, in a way that service virtualisation products usually do not. And, in so far as IBM is concerned, it is the recognition that Greenhat has synergies with the Optim portfolio of products. For those of you not familiar with the IBM Optim suite, let me say that it includes test data management and data masking.

Is there a pattern emerging here? It certainly looks like it. Of course, IBM is some way behind Grid-Tools in that it will take time to integrate Greenhat with Optim, but the trend is clear.

So, why is data important in a service virtualisation context? One obvious answer is data privacy and protection: you either need to be able to mask the data that will be consumed during testing or, if the data is not physically available, you need to be able to generate synthetic data that looks and feels like real data but isn't. And, of course, you need to be able to generate faulty data to test against. Grid-Tools' Intelligent Virtual Services does all of these things.
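
To make those three requirements concrete, here is a small illustrative sketch of my own (not taken from Intelligent Virtual Services or any other product) showing masking, synthetic data generation and deliberately faulty data. The field names and formats are invented for the example.

# Illustrative sketch of the three data requirements discussed above:
# masking real records, generating synthetic look-alike data, and
# producing deliberately faulty records for negative testing.
# Field names and formats are hypothetical.
import random
import string

def mask_record(record):
    """Mask personally identifiable fields while preserving their shape."""
    masked = dict(record)
    masked["name"] = "CUSTOMER-" + str(abs(hash(record["name"])) % 100000)
    masked["card"] = "**** **** **** " + record["card"][-4:]
    return masked

def synthetic_record():
    """Generate a record that looks and feels real but is entirely fake."""
    return {
        "name": "".join(random.choices(string.ascii_uppercase, k=8)),
        "card": " ".join("".join(random.choices(string.digits, k=4)) for _ in range(4)),
        "balance": round(random.uniform(0, 10000), 2),
    }

def faulty_record():
    """Generate invalid data to exercise error handling in the consumer."""
    record = synthetic_record()
    record["card"] = "not-a-card-number"   # wrong format
    record["balance"] = -1                 # out of range
    return record

if __name__ == "__main__":
    real = {"name": "Jane Smith", "card": "4929 1234 5678 9012", "balance": 250.0}
    print(mask_record(real))
    print(synthetic_record())
    print(faulty_record())

The point of the sketch is simply that a virtualised service has to serve data from somewhere, and that data either has to be derived safely from production or fabricated convincingly, which is exactly where test data management expertise comes in.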

If I am right about the synergies between test data management and service virtualisation, then it is worth looking at the vendors in the test data management space. These include HP, IBM, Grid-Tools and Informatica. And which of these doesn't have service virtualisation? So, here's a prediction: Informatica to acquire Parasoft.
