“Early adopters are reaping value now by leveraging AutoML to adapt faster to changes, collect better metadata, foster rapid intelligence, and deliver better outcomes for business users.” Natarajan sees AutoML gaining traction in areas such as “vision computing, natural language understanding, voice/chat, and rich mixed-reality experiences.”
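To make the idea concrete, the core of AutoML is automated search over candidate models and hyperparameters. The following is a minimal sketch of that idea using scikit-learn as a generic stand-in; the dataset and candidate grid are illustrative choices, not any vendor's product, and real AutoML platforms also automate feature engineering, metadata capture, and deployment.

```python
# Minimal sketch of the core AutoML idea: automated model and
# hyperparameter selection via cross-validated search.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Candidate models and hyperparameter grids (hypothetical choices).
candidates = [
    (LogisticRegression(max_iter=5000), {"C": [0.1, 1.0, 10.0]}),
    (RandomForestClassifier(random_state=0), {"n_estimators": [50, 200]}),
]

best_score, best_model = -1.0, None
for estimator, grid in candidates:
    search = GridSearchCV(estimator, grid, cv=5)
    search.fit(X_train, y_train)
    if search.best_score_ > best_score:
        best_score, best_model = search.best_score_, search.best_estimator_

print(f"Selected model: {best_model.__class__.__name__}")
print(f"Held-out accuracy: {best_model.score(X_test, y_test):.3f}")
```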
INSIGHT ENGINES
Companies are turning to insight engines, which build on AI and machine learning, to help them understand data and connect the dots. “Several years back, enterprise search technology was limited to searching and finding information in a basic way,” said Daniel Fallmann, founder and CEO of Mindbreeze.
“Many people still think that’s all a search solution can do,” said Fallmann. “However, because AI is now applied across the entire enterprise, data solution providers now address the needs of all business functional areas. These new search elements involve 360-degree views, holistic views of information, and greater use of digital twins. Search has been upgraded, and it’s now exciting again.”
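The “360-degree view” Fallmann describes amounts to assembling one consolidated profile from records held in separate systems. The toy sketch below illustrates that idea; the system names and fields are hypothetical.

```python
# Toy illustration of a 360-degree view: merging records about the same
# customer from separate systems into one consolidated profile.
# System names and fields are hypothetical.
crm = {"C-1001": {"name": "Acme Corp", "segment": "Manufacturing"}}
support = {"C-1001": {"open_tickets": 2, "last_contact": "2021-11-03"}}
billing = {"C-1001": {"arr_usd": 120_000, "renewal": "2022-06-30"}}

def customer_360(customer_id: str) -> dict:
    """Assemble one holistic record from each connected source."""
    profile = {"customer_id": customer_id}
    for source in (crm, support, billing):
        profile.update(source.get(customer_id, {}))
    return profile

print(customer_360("C-1001"))
```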
LOW-CODE AND AUTOMATED DATA DISCOVERY
Automated data discovery and low-code techniques are emerging as key enablers of enterprises’ ability to compete on data. “It enables companies to do much more with their data in less time,” said Ayush Parashar, vice president of engineering at Boomi. “Enterprises have 850 applications on average, but only approximately 30% of their apps are connected. If companies could automatically discover the data and easily integrate their applications—the number of which likely surged during the pandemic—enterprises could see new business insights, deliver integrated customer experiences, streamline business processes, and more.”
Much of this technology is becoming available through platform-as-a-service offerings. IT has used low-code approaches for some time, but they are now expanding to departments outside of IT, such as sales, HR, marketing, and finance, which can use low-code and intelligent data discovery methods to access the data they need more quickly by creating data integrations themselves, said Parashar. “Meanwhile, IT teams don’t have to spend as much time connecting new data sources or setting up data dashboards, freeing time to focus on higher-level responsibilities. I see low code as being the predominant tech for data integration, with low-code automation and automated data discovery driving quick ROI. Integrations are vital for a lot of business transactions, as well as for scaling company processes. Maybe in 5 years we won’t even have to manually set up data integrations, since automation will do it for us.”
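As a rough illustration of what automated data discovery can do, the hypothetical sketch below scans a few connected applications, infers their fields, and suggests join keys wherever column names overlap; all source names, fields, and data are invented.

```python
# Hypothetical sketch of automated data discovery: scan connected
# applications, infer their fields, and suggest join keys where
# column names overlap -- the groundwork a low-code integration
# platform might automate for business users.
from itertools import combinations

# Sample records as they might be pulled from three connected apps.
sources = {
    "crm": [{"customer_id": "C-1", "name": "Acme", "owner": "Dana"}],
    "erp": [{"customer_id": "C-1", "invoice_id": "INV-9", "amount": 4200}],
    "helpdesk": [{"ticket_id": "T-7", "customer_id": "C-1", "status": "open"}],
}

def discover_schema(records):
    """Infer a simple schema (field name -> type) from sample records."""
    schema = {}
    for row in records:
        for field, value in row.items():
            schema[field] = type(value).__name__
    return schema

schemas = {name: discover_schema(rows) for name, rows in sources.items()}

# Suggest an integration wherever two apps share a field name.
for (a, schema_a), (b, schema_b) in combinations(schemas.items(), 2):
    shared = set(schema_a) & set(schema_b)
    if shared:
        print(f"Suggested join between {a} and {b} on {sorted(shared)}")
```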
FLEX CODE
One shift that has been developing in recent years is the move toward operationalizing the data collected and aggregated by the enterprise. “Part of operationalizing data is enabling a wide array of different roles to contribute to the various stages of the data refinement lifecycle—which is more difficult than it sounds,” said Sean Knapp, CEO and founder of data engineering company Ascend.io. “Contributors range from engineers to scientists to analysts and more, each with different skills, languages, tools, and methodologies. Engineers, for example, often require a deeper level of flexibility and customization for their tools, whereas business-level users require tools with higher levels of abstraction that integrate with their broader tools and ecosystem.” This is why flexible coding solutions are rising to prominence, Knapp said. These solutions enable teams of technical experts, such as developers and data engineers, to work in their preferred programming languages and delve into the intricacies of DataOps technology, while teams of data analysts and data scientists can work more quickly in SQL and Python with a visual user interface.
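A rough sketch of the flex-code idea follows: the same pipeline mixes an analyst-friendly SQL step with an engineer-authored Python transform. Python's built-in sqlite3 stands in for a warehouse purely for illustration, and the table, column names, and conversion rate are invented.

```python
# Rough sketch of "flex code": one pipeline combines a declarative SQL
# step (analyst-level) with arbitrary Python logic (engineer-level).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("east", 120.0), ("west", 80.0), ("east", 40.0)],
)

# Analyst-level step: declarative SQL aggregation.
sql_step = "SELECT region, SUM(amount) AS revenue FROM orders GROUP BY region"
rows = conn.execute(sql_step).fetchall()

# Engineer-level step: custom Python logic on the SQL output,
# e.g., currency conversion with a placeholder exchange rate.
def to_eur(revenue_usd, rate=0.88):
    return round(revenue_usd * rate, 2)

report = [{"region": region, "revenue_eur": to_eur(rev)} for region, rev in rows]
print(report)
```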
DATAOPS
While DataOps, an agile methodology for developing and delivering analytics through collaboration and automation, is not a technology itself, it is becoming a major force in data management this decade. “DataOps applies proven methods from manufacturing—lean manufacturing—to data analytics,” explained Chris Bergh, CEO of DataKitchen. “Lean data analytics focuses on eliminating waste and reducing errors. DataOps utilizes automation to minimize analytics’ cycle time. It instantiates self-service sandboxes with the necessary test data and toolchains so data scientists can start new projects on demand. Using DataOps testing, data organizations can reduce errors to virtually zero so they are much less plagued by technical debt. DataOps uses automation to alleviate the bottlenecks in data and analytics workflows.” DataOps-driven workflow automation “will separate the leaders from the laggards in data analytics,” Bergh added. “Business agility will increasingly flow from analytics agility.”
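The testing Bergh describes can be as simple as automated data quality checks that run on every pipeline execution, so defects are caught before they reach analytics consumers. The sketch below illustrates that pattern; the checks, thresholds, and pipeline output are invented for the example.

```python
# Simplified sketch of DataOps-style automated testing: every pipeline
# run executes data quality checks so errors are caught before they
# reach dashboards. Checks and thresholds are illustrative only.
def check_not_empty(rows):
    assert len(rows) > 0, "pipeline produced no rows"

def check_no_null_keys(rows, key="customer_id"):
    assert all(r.get(key) is not None for r in rows), f"null {key} found"

def check_amount_range(rows, low=0, high=1_000_000):
    assert all(low <= r["amount"] <= high for r in rows), "amount out of range"

def run_pipeline():
    # Stand-in for the real extract/transform step.
    return [
        {"customer_id": "C-1", "amount": 250.0},
        {"customer_id": "C-2", "amount": 75.5},
    ]

if __name__ == "__main__":
    output = run_pipeline()
    for test in (check_not_empty, check_no_null_keys, check_amount_range):
        test(output)  # fail fast on any data defect
    print("All data tests passed; publishing to analytics consumers.")
```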