AtScale, which provides a universal semantic platform for BI on big data, has completed a $25 million Series C financing round.
The round, led by Atlantic Bridge, also includes participation from Wells Fargo Securities and Industry Ventures, as well as returning investors Storm Ventures, UMC Capital, Comcast Ventures, and XSeed Capital.
Seeking to provide access to any data, anywhere, for any employee, AtScale enables enterprises to simplify their business intelligence infrastructure: business users continue working with the tools they know, while the enterprise gains a universal semantic layer to centrally manage data definitions, performance, and security.
Discussing the new funding at the Strata Data Conference, Dave Mariani, CEO and co-founder of AtScale, said AtScale's Series A funding was about building the technology, Series B was about selling it, and Series C is all about scaling it.
The $25 million in Series C financing is targeted at scaling the business, said Mariani. Beyond sales, marketing, and continued investment in R&D, the main focus is investing in customer success: helping customers on their journey from old enterprise data warehouses to new data lakes and the new big data landscape.
Citing statistics from The Economist, AtScale contends that only 5% of the data enterprises capture makes it to the decision makers who need it. In the last few years, with the transition from a centrally managed, IT-led business intelligence model to a self-service model driven by business units, AtScale says, business users have often been forced to become data modelers and data wranglers and to re-invent the key metrics driving their business. Meanwhile, IT has sought to centralize business logic and data platforms, manage security, and avoid the data silos that are often the unintended side effect of a self-service model. Most enterprises end up with dozens, sometimes hundreds, of data stacks that, according to Harvard Business Review, can drive up the cost of implementations by 80%.
AtScale sees two primary use cases for customers deploying AtScale, said Mariani. One is to scale out their business intelligence infrastructure. "For years we have been making big data small to make it accessible to BI tools that were born in the 1980s and 1990s and designed for relational, highly structured data that is relatively small. Now that organizations have all this data, they want to be able to use those same BI tools on a much richer, more detailed data set, and they are having trouble doing so. We call that BI scale-out."
The second major use case is the "data architecture re-platform." In that case, customers have decided it is too expensive to keep doing business the old way and want to move to a new data lake architecture, using Hadoop, Google BigQuery, or Amazon S3 to store data, but they need to make that data compatible with their analysts' BI tools so the analysts can keep doing their work. "We help bring those business users along for the ride during that transition."
Enterprises have been tackling big data for more than a decade, but they are still forced to make it small for BI users by creating siloed data stacks. This, the company says, defeats the purpose of moving to a big data platform in the first place: it strips away big data's nuances and, by losing data fidelity, deprives business analytics of its real potential.
To find out more about AtScale, go to www.atscale.com/try