The DaaS model provides “an enterprise-wide platform for data access, which is a good thing and certainly democratizes data,” said Soumendra Mohanty, digital solutions lead for Mindtree. “However, as a consequence, you also invite a variety of users accessing this data anytime, every time from anywhere, using any kind of device. This puts stress on the data governance processes—access privileges and security mechanisms, intrusion detection, traceability and who is using the data for what purpose.”
Intuit’s Mahadevan cautions that the degree to which workloads flow to the cloud may be limited by an old nemesis: bandwidth. “The rate and the scale of the move varies by company,” she said. “Two important factors to determine how much enterprise data to move to the cloud are existing infrastructure technology and bandwidth costs. If a company’s data is already stored in a private data center, bandwidth costs may limit data transfer to the cloud.” Recent moves by cloud vendors suggest they are looking to help address the bandwidth problem. “As an example,” she related, “Amazon, with its Snowball service, is providing data storage and transport as a service to move bulk data into the cloud.”
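The bandwidth constraint Mahadevan describes is easy to see with a rough back-of-envelope calculation. The sketch below is illustrative only, with hypothetical data sizes and link speeds chosen for the example; it simply converts a dataset size and a sustained network rate into an estimated transfer time, which is the kind of arithmetic that makes appliance-based services such as Snowball attractive for bulk moves.

```python
def transfer_days(data_tb: float, sustained_mbps: float) -> float:
    """Estimate days needed to move `data_tb` terabytes over a link
    sustaining `sustained_mbps` megabits per second.

    Assumes 1 TB = 10**12 bytes and ignores protocol overhead,
    retries, and throttling, so real-world times will be longer.
    """
    bits_total = data_tb * 1e12 * 8          # terabytes -> bits
    seconds = bits_total / (sustained_mbps * 1e6)
    return seconds / 86400                   # seconds -> days


if __name__ == "__main__":
    # Hypothetical example: 100 TB over a fully dedicated 1 Gbps link
    # takes on the order of nine days; at 100 Mbps, roughly three months.
    print(f"{transfer_days(100, 1000):.1f} days at 1 Gbps")
    print(f"{transfer_days(100, 100):.1f} days at 100 Mbps")
```

Even this optimistic estimate ignores that production links are rarely dedicated to a migration, which is why companies with large private data centers often find network transfer costs and timelines prohibitive.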
Because of the sheer variety of workloads and use cases found across today’s enterprises, DaaS or DBaaS may not be the right solution for every application—especially when delivered by an outside cloud provider. “There is no such thing as one-size-fits-all,” said Nishant Vyas, CTO of MariaDB. “Given the benefits for many use cases and the challenges with others, it’s clear that enterprises will be using a combination of traditional infrastructure, and both private and public cloud deployments to store data and host applications in the foreseeable future.” He added that “storing data in public clouds may be inappropriate when data security, governance, and compliance are top priorities. Public clouds are also unsuitable for applications that are latency-sensitive and where infrastructure needs to be tightly controlled. In a public cloud, a [rogue] tenant can slow down everyone else’s applications. Though public cloud vendors are working to alleviate the risks, the fact remains that a public cloud is ‘someone else’s computer.’”
Industry experts also caution against surrendering all control to a cloud partner. Organizations must maintain responsibility for the usage and security of their data. “Don’t hand over the keys to corporate data security and privacy to someone else,” said Jeff Denworth, senior vice president of marketing for CTERA. “Cloud providers are not responsible for your data—you are. Invest in a system that allows you to apply your corporate policies and meet critical business needs for data governance, security, integrity, sovereignty, and compliance.”
In terms of data, the best candidates for cloud include “discrete data,” which may include backups, archives, and DR sites, said Dada. Additional candidates for cloud include standalone applications or workloads “that have limited—high-throughput or low latency—dependencies to other local services or applications, especially those that have variable workloads that benefit from the dynamic provisioning.” Two classic examples include test/development environments and websites, he added.
It’s important to keep in mind that within the next few years, most remaining bandwidth limitations are likely to be resolved, just as security issues have been addressed. At that point, there’s little to stop enterprise data from flowing to the cloud, whether through abstraction off existing systems into DaaS, or via databases and applications residing entirely in the cloud. “Enterprise data in the cloud is absolutely becoming the new norm,” said Sayar. “If you take a look at the ‘software is eating the world’ analogy, you’ll find the same thing is happening with the cloud. The cloud captures the speed, reliability, scalability, cost effectiveness, and security that enterprises need to remain continuously innovative—so it only makes sense that data is also stored, captured, and analyzed there. As a result, enterprises are also sitting on a huge opportunity to make sense of their big data exhaust by using analytics to monitor, manage, and gather insights from cloud-based user, application, and infrastructure log data.”
All it’s going to take is a change in thinking. A key obstacle that needs to be overcome is emotional, Dada explained. “This stems from losing control over your data, not knowing what security challenges may come up, and not being able to do anything—other than wait for the cloud service provider to come back up—in the case of downtime. Second, there’s a need for learning. Backups, DR, security, identity management, change management, forecasting, SLAs, and other aspects of IT are different in the cloud. It takes some effort and time to learn how to do these right.”