In recent years, an unprecedented number of companies have moved to the cloud, a trend that shows no sign of slowing. The total cloud market, which includes software as a service (SaaS), infrastructure as a service (IaaS), platform as a service (PaaS), business process as a service (BPaaS), cloud management, and security services, will be worth $278.3 billion in 2021, nearly double its value from the previous year, according to Gartner predictions.
Most companies have moved at least part of their data holdings to the cloud, and many more will follow suit in the next few years, effectively shifting data’s center of gravity from the data center to the cloud. Many are also working with multiple vendors, such as Amazon and Microsoft, as part of a multi-cloud strategy that leverages the best features of each offering. Unfortunately, many of the gains realized by cloud technologies will be lost to downtime during the migration.
A Reality Check on Current Migration Approaches
Most on-premises systems housed in data centers have been in place for many years and are tied to business processes that link to a myriad of other systems. As a result, users often find that existing solutions cannot be easily moved to new systems, cloud-based or otherwise, without impacting the business. In addition, moving data from countless data sources, including data stored in functional silos, can result in downtime, as the data will need to be moved in separate batches before being transformed and loaded to the new location. A lift-and-shift approach, in which applications are moved to the cloud without modification, may minimize downtime, but companies that take this route cannot exploit several of the cloud’s most powerful native features, such as elasticity or pay-as-you-go cost structures. So, while lift-and-shift approaches may be the fastest way to migrate data to the cloud, downtime is still inevitable.
To truly eliminate downtime and still leverage all of the cloud’s best features, many organizations are looking to modern data integration approaches such as data virtualization (DV). DV enables organizations to establish a multi-location infrastructure, in which data resides both in the cloud and in on-premises systems, and is then gradually moved to the cloud so as not to impact the business.
Data Integration, Modernized
In contrast with traditional data integration approaches, data virtualization does not actually move any data; instead, it enables users or applications to connect to the data remotely in real time, regardless of where it physically resides, so it appears as if all of the data were stored in the same local repository. In this way, it creates an abstraction layer that shields users from the complexities of where the data is actually stored and what credentials may be necessary for a given user or application to access it. DV solutions operate in a layer above the organization’s numerous data sources and contain all of the necessary metadata for accessing the sources connected to them. As a result, users don’t need to know or care where a given data set is stored; however, a user with the appropriate privileges can drill down into any data set’s complete lineage at any time.
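The idea of a catalog-driven abstraction layer can be illustrated with a minimal sketch. This is a toy model, not any vendor's actual API: the class and source names are hypothetical, and each "source" is simply a callable standing in for a real connector.

```python
# Toy model of a data-virtualization layer (hypothetical names throughout).
# A catalog maps logical dataset names to whichever physical source holds
# them; consumers query by logical name and never see the location.

class VirtualDataLayer:
    def __init__(self):
        # logical name -> (source label, fetch function)
        self._catalog = {}

    def register(self, name, source_label, fetch_fn):
        """Record metadata: which physical source serves this dataset."""
        self._catalog[name] = (source_label, fetch_fn)

    def query(self, name):
        """Return the data for a logical name; the location stays hidden."""
        _, fetch = self._catalog[name]
        return fetch()

    def lineage(self, name):
        """A privileged user can still see where the data actually lives."""
        return self._catalog[name][0]


# A simulated on-premises source (in reality, a database connector).
on_prem_orders = lambda: [{"id": 1, "total": 30}]

layer = VirtualDataLayer()
layer.register("orders", "on-prem Oracle", on_prem_orders)

rows = layer.query("orders")      # data, regardless of where it resides
origin = layer.lineage("orders")  # "on-prem Oracle"
```

The essential point is that consumers depend only on the logical name "orders"; the catalog entry, not the consumer, knows the physical location.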
Because data virtualization abstracts users from the complexities of data access, in a multi-location architecture that includes on-premises and cloud systems, organizations can migrate to cloud systems without downtime; in fact, users don’t even notice that a change has occurred behind the scenes. This enables IT teams to migrate data on their own schedule, taking the time to ensure that security and governance are not compromised, in a risk-free, phased approach to cloud migration.
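Extending the same toy model, a zero-downtime migration amounts to repointing a catalog entry from the on-premises source to the cloud source once the data has been copied. The names here are again illustrative, and a real DV tool would layer validation, security, and governance checks on top of the swap.

```python
# Hedged sketch: phased migration behind a virtualization catalog.
# Consumers keep querying the same logical name before and after the
# underlying source is switched from on-premises to cloud.

class Catalog:
    def __init__(self):
        self._entries = {}  # logical name -> fetch function

    def register(self, name, fetch_fn):
        self._entries[name] = fetch_fn

    def repoint(self, name, new_fetch_fn):
        # A simple swap in this toy model; a real platform would first
        # verify the new source and enforce governance policies.
        self._entries[name] = new_fetch_fn

    def query(self, name):
        return self._entries[name]()


# Simulated sources holding the same data in two locations.
on_prem = lambda: {"rows": 100, "location": "on-prem"}
cloud = lambda: {"rows": 100, "location": "cloud"}

catalog = Catalog()
catalog.register("sales", on_prem)
before = catalog.query("sales")["location"]  # served from on-prem

# IT copies the data on its own schedule, then repoints the entry.
catalog.repoint("sales", cloud)
after = catalog.query("sales")["location"]   # now served from the cloud
```

From the consumer's perspective, `query("sales")` succeeds throughout; only the `location` metadata reveals that the migration happened.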
Today, forward-thinking companies are doing this with much success. For example, TransAlta’s IT department initiated a "zero data center" project to move its entire data layer to the cloud for flexibility, agility, and lower TCO. Playing a central role in TransAlta’s real-time data integration, DV enabled the company to move to the cloud with zero downtime. Similarly, as a critical part of Logitech’s "cloud first" strategy, the computer accessory provider deployed the combined power of DV and Snowflake on Amazon AWS for all of Logitech’s analytical needs.
As companies continue to rapidly adopt cloud technologies to gain greater agility and scalability, they must remember that these solutions also present a few challenges of their own. Data virtualization is a technology that overcomes each of these challenges, enabling companies to gain the maximum benefits from cloud initiatives. So, if you and your organization are contemplating a move to the cloud, follow the example of these groundbreaking companies and consider deploying a tool capable of simplifying data integration not only from the cloud, but also for the cloud.