Let’s see a show of hands for how many of you have heard enough about cloud. My goal here is not to diminish the importance of cloud technologies and the impact they have had on our industry, our personal lives, and society in general. My goal is to encourage information technologists and leaders to think beyond the present and into the future.
There has been plenty of dialogue within the IT community about when to migrate to the cloud, how to migrate, which provider offers customers the best cloud environment, and the due diligence or governance necessary before taking that big step. Do you migrate everything at once, or is a hybrid approach more sensible? No doubt, these are important questions that must be answered for a cloud migration to succeed.
There are larger waves of change nipping at our heels, yet we seem content to keep discussing a technology that, in my opinion, is a means to an end. Cloud on its own is not particularly innovative; the great advances have been in infrastructure, which allows cloud providers to help organizations run applications better and faster. Cloud does, however, enable other technology innovations: artificial intelligence, augmented reality, predictive analytics, machine learning, and robotics, to name a few. These technologies are here, and they are becoming part of our lives without much fanfare or concern. I opened the weather app on my smartphone recently and, for the first time, noticed that it said “With Watson” on the splash screen. I was shocked and fascinated at the same time! The weather app is using artificial intelligence!
For the first time since assembly-line and manufacturing automation, there is a real threat of job loss in many industries due to the efficiencies of artificial intelligence and robotics. We are using algorithms to make decisions that involve other human beings and that could impact their lives. We use big data and predictive analytics without considering the ethical and moral impact of doing so. The data is available, so we hire talented technologists to massage it in various ways, hoping it will tell us something we can use as a competitive edge. But would it not be better to first have a relevant question in mind? In a recent issue of The Wall Street Journal (March 1, 2017), a Barclays analyst observed that many organizations are collecting data but not leveraging it: “Companies that created big data lakes in recent years have generally found results disappointing, since they were unable to tap it for actionable business intelligence.” Organizations need to ask good questions before mining data, but how often does this happen?
The other popular phrase of late is “digital transformation.” Organizations are discussing what it means to become digital and how to become a digital organization; there are even books on becoming a digital leader. What does that mean? What exactly is a “digital leader”? In the latest issue of the Harvard Business Review (March/April 2017), Walmart’s CEO, Doug McMillon, talked about how Walmart is moving forward with technology. When asked how employees remain motivated as resources move to newer, digital operations, he responded: “The people who run the older parts of the business must also become digital. We can’t have some people live in yesterday while others live in tomorrow. And given the effects of inertia, we need people to lean into the future even more than other companies might. We’re trying to move large numbers of people to change their established habits.” Is this what it means to be a digital leader?
My message here is that technologists should critically evaluate current technology trends but, more importantly, look ahead to the next big wave of change. If we stay mired in the weeds, we will miss the opportunities that silently hover in the background but will, no doubt, be disruptive. Look around at how technology is changing society: cloud is an enabler for so many more exhilarating opportunities!