According to a white paper from the Information Systems Audit and Control Association (ISACA), an industry organization, calculating the total cost of an IT service against its potential return is always a challenge, and cloud computing brings some new wrinkles to the process. At issue is the reality that interoperability is a costly challenge in cloud computing, including how information is shared among private, public, and hybrid clouds.
In my travels, this seems to be a systemic issue. Enterprises move to the cloud, including relocating data assets to cloud computing platforms, but don't take the time to do even rudimentary planning around how data will flow within and between these systems, cloud and non-cloud.
This does not mean months of deep and expensive consulting, but a fairly simple process: understand the structure and data semantics of the source and target systems, along with the methods and approaches of data integration. You must also select the right data integration technology, one that is prepared to deal with emerging cloud computing platforms out-of-the-box or out-of-the-cloud.
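To make "understanding the data semantics of the source and target systems" concrete, here is a minimal sketch of a semantic mapping between two systems. All field names and the sample record are hypothetical; real mappings would be driven by the actual schemas you discover.

```python
# A minimal sketch of reconciling source and target data semantics.
# Every field name and the sample record below are hypothetical.

SOURCE_TO_TARGET = {
    "cust_id": "customerId",   # same meaning, different name
    "amt": "orderTotal",       # source stores cents; target expects dollars
    "ord_dt": "orderDate",
}

def translate(record: dict) -> dict:
    """Map a source record onto the target system's field names."""
    out = {SOURCE_TO_TARGET[k]: v for k, v in record.items() if k in SOURCE_TO_TARGET}
    # Semantic fix-ups go beyond renaming: here, cents become dollars.
    if "orderTotal" in out:
        out["orderTotal"] = out["orderTotal"] / 100
    return out

print(translate({"cust_id": 42, "amt": 1999, "ord_dt": "2012-05-01"}))
```

The point of writing the mapping down is that the renames are the easy part; the comment on `amt` is where the real semantic understanding lives.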
Even if you have an existing data integration strategy in place, the use of cloud-based resources will require some updating, and perhaps a technology refresh. Indeed, many enterprises attempt to force-fit existing solutions, or even existing manual processes, when other approaches are much more effective. Once again, you start with your requirements and work backward from them to your solution.
Interoperability seems to be a hidden cost of cloud computing, at least according to the ISACA report. It is also clear that a sound data integration strategy, backed by the right enabling technology, can resolve many of the data-level interoperability issues when moving to the cloud. By creating this strategy, we address specific issues such as:
• The ability to add, change, and remove cloud or traditional data assets, as needed, to support the business.
• The ability to externalize business intelligence data in support of critical business decisions, even core business processes.
• The ability to abstract changes to data structure without driving re-development and testing cycles.
• The ability to move to new and emerging database platforms, such as the use of “Big Data” technology.
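The third bullet, abstracting structural change, is worth a sketch. The idea is that consumers code against a logical view rather than the physical schema, so a schema change updates one mapping instead of driving a re-development cycle. The class, field names, and version numbers here are all hypothetical.

```python
# A hedged sketch of isolating consumers from data-structure changes.
# The view, field names, and schema versions are hypothetical.

class CustomerView:
    """Consumers code against this view, not the physical schema."""

    # Version-specific mappings from each logical field to its physical column.
    FIELD_MAP = {
        1: {"name": "cust_name", "email": "cust_email"},
        2: {"name": "full_name", "email": "contact_email"},  # schema changed
    }

    def __init__(self, row: dict, schema_version: int):
        self._row = row
        self._map = self.FIELD_MAP[schema_version]

    def get(self, logical_field: str):
        return self._row[self._map[logical_field]]

# The same consumer code works against both schema versions:
v1 = CustomerView({"cust_name": "Ada", "cust_email": "ada@example.com"}, 1)
v2 = CustomerView({"full_name": "Ada", "contact_email": "ada@example.com"}, 2)
assert v1.get("name") == v2.get("name") == "Ada"
```

When the physical structure changes again, only `FIELD_MAP` grows; the consumers and their tests are untouched.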
The formation of this strategy follows a few fundamental paths:
Have a detailed understanding of the cloud-based data assets, whether locally or remotely hosted. This means understanding the core database model, any structures that may be enforced, and the data semantics. In the case of Big Data systems, the structure may be layered on after the structured or unstructured data exists. While more complex, this also provides additional flexibility when dealing with data integration.
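The "structure layered on after the data exists" idea is often called schema-on-read, and a toy version looks like this. The raw events, field names, and read schema below are all made up for illustration.

```python
import json

# A minimal schema-on-read sketch: raw records land first, and structure is
# layered on only when the data is read. Field names are hypothetical.

RAW_EVENTS = [
    '{"user": "ada", "clicks": 3}',
    '{"user": "alan", "clicks": "7", "referrer": "news"}',  # messy types, extra field
]

READ_SCHEMA = {"user": str, "clicks": int}  # applied at read time, not load time

def read_with_schema(raw: str) -> dict:
    record = json.loads(raw)
    # Coerce only the fields the schema names; ignore everything else.
    return {field: cast(record[field]) for field, cast in READ_SCHEMA.items()}

rows = [read_with_schema(r) for r in RAW_EVENTS]
assert rows[1] == {"user": "alan", "clicks": 7}
```

This is the flexibility the paragraph describes: the store accepts anything, and the integration layer decides later which structure to impose.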
Create an approach for data integration, including latency, transformation, routing, integrity, performance, security, governance, etc. This typically means writing down the core requirements of the logical data integration solution and using them as a jumping-off point to select the right technology.
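"Writing down the core requirements" can be as simple as a structured checklist you score candidate technologies against. Everything in this sketch is an invented example, not a recommendation; your own latency targets and throughput numbers belong here.

```python
# Illustrative only: one way to write down the core requirements of a
# logical data integration solution before evaluating products.
# Every value below is a made-up example.

INTEGRATION_REQUIREMENTS = {
    "latency": "near-real-time (< 5 min from source commit to target)",
    "transformation": ["field renaming", "code-table lookups", "aggregation"],
    "routing": "content-based (route by record type)",
    "integrity": "at-least-once delivery with duplicate detection",
    "performance": "sustain 10k records/sec at peak",
    "security": "TLS in transit, role-based access to mappings",
    "governance": "full lineage from source field to target field",
}

def coverage(candidate_features: set) -> float:
    """Fraction of the written-down requirements a candidate claims to meet."""
    met = sum(1 for req in INTEGRATION_REQUIREMENTS if req in candidate_features)
    return met / len(INTEGRATION_REQUIREMENTS)

print(coverage({"latency", "transformation", "routing", "security"}))  # 4 of 7
```

The scoring function is deliberately crude; the value is in forcing the requirements into writing before any vendor conversation starts.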
Understand the interfaces to the data for both the newer cloud-based and the traditional systems. In the world of cloud computing, we typically access data using well-defined APIs or services. With traditional systems, there are typically many more ways to get at the data. I would suggest spending some time learning how data is produced and consumed through these interfaces, including any latency issues that may have to be dealt with when deploying the data integration technology.
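The interface difference described above can be hidden behind a common shape for downstream integration. In this hedged sketch, the "cloud" side stands in for a JSON service response and the "traditional" side for a flat-file export; both sources are stubbed with invented data rather than real endpoints.

```python
import csv
import io
import json

def from_cloud_api(payload: str) -> list[dict]:
    """Cloud systems typically hand back structured JSON from a service call."""
    return json.loads(payload)["records"]

def from_legacy_export(dump: str) -> list[dict]:
    """Traditional systems often expose data as database dumps or flat files."""
    return list(csv.DictReader(io.StringIO(dump)))

# Both interfaces converge on the same list-of-records shape downstream.
cloud_rows = from_cloud_api('{"records": [{"id": "1", "status": "open"}]}')
legacy_rows = from_legacy_export("id,status\n2,closed\n")
assert cloud_rows[0]["status"] == "open"
assert legacy_rows[0]["id"] == "2"
```

Latency differences still matter even after the shapes converge: the service call above would typically be rate-limited and paged, while the flat-file export arrives in bulk on a schedule.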
Select the right technology, and create an implementation plan. Data integration technology that is cloud-aware, such as the technology sold by the host of this blog, typically provides the best path. In many instances you have the choice of hosting the data integration engine on-premises or consuming it as a cloud service. Each approach has its trade-offs; you can use either, or both.
Deployment planning typically aligns with the roll-out of the new cloud computing solutions. The data integration planning activities should be a part of the larger cloud migration and deployment project. I do not recommend that you attempt this as an afterthought.
Considering that interoperability is indeed a hidden cost of cloud computing, the way to deal with this problem is to get ahead of the core issues around data mediation and integration. I would suggest that no enterprise embark on the road to cloud computing without a parallel task to deal with the many data integration issues that need addressing. A bit of planning, combined with the right technology, will do wonders here.