Data Warehousing Articles
AtScale, which provides a self-service BI platform for big data, has announced an expansion of its services. With this announcement, the company says it is introducing a BI platform that enables businesses to work seamlessly across all of their big data, on-premises and in the cloud. In addition to Hadoop, AtScale has announced preview availability of support for data stored in Teradata, Google Dataproc, and BigQuery, expanding on the company's existing support for Microsoft Azure and HDInsight.
Posted November 21, 2016
Snowflake Computing, a cloud-based data warehousing company, has launched a partner program. According to the vendor, the program helps Snowflake customers find partners with expertise in strategies, architectures, design principles, and best practices in big data and data analytics that can work with them to accelerate and maximize the benefits of Snowflake.
Posted November 18, 2016
Databricks has announced that, in collaboration with industry partners, it has broken the world record in the CloudSort Benchmark, a third-party industry benchmarking competition for processing large datasets. Databricks was founded by the team that created the Apache Spark project.
Posted November 16, 2016
SAP SE is releasing an update to its SAP S/4HANA platform aimed at improving user productivity. By utilizing a simplified data model and SAP Fiori 2.0 user experience, the SAP S/4HANA 1610 release can reduce complexity and allow applications to yield new business capabilities, according to SAP. "All companies and all line-of-business and industries need to start with 1610 if they want to capitalize on the digital transformation for their company before their competitor does," said Sven Denecken, senior vice president of product management, co-innovation and packaging S/4HANA at SAP SE.
Posted November 16, 2016
IDERA, a provider of database lifecycle management solutions, has released DB PowerStudio 2016+. The portfolio of database management tools was designed and built from the ground up by IDERA following the company's acquisition of Embarcadero in October 2015. With this update, in particular, IDERA adds new features for Oracle databases.
Posted November 16, 2016
New data sources such as sensors, social media, and telematics along with new forms of analytics such as text and graph analysis have necessitated a new data lake design pattern to augment traditional design patterns such as the data warehouse. Unlike the data warehouse - an approach based on structuring and packaging data for the sake of quality, consistency, reuse, ease of use, and performance - the data lake goes in the other direction by storing raw data that lowers data acquisition costs and provides a new form of analytical agility.
Posted November 03, 2016
Data has become a disruptive force for global businesses and a catalyst for digital transformation. But data can only be leveraged for BI initiatives to the extent it can be accessed and trusted. And, while today's self-service BI and analytics tools satisfy a user's craving for more "consumerized" technology, they often leave an analyst stuck in neutral because the users, first and foremost, cannot find the data they need to perform any analysis.
Posted November 02, 2016
Redis Labs, the home of Redis, is introducing an open source project called Redis-ML, the Redis Module for Machine Learning. The new project will accelerate the delivery of real-time recommendations and predictions for interactive apps in combination with Spark Machine Learning (Spark ML).
Posted November 01, 2016
Trifacta, a provider of data wrangling solutions, is launching Wrangler Edge, a platform designed for analyst teams wrangling diverse data outside of big data environments. "We are taking the Trifacta product and adding enterprise features such as the ability to schedule jobs, handle larger data volumes, and connect to diverse sources," said Will Davis, director of product marketing. "We also added collaboration and sharing features, all without requiring organizations to manage a large Hadoop infrastructure."
Posted November 01, 2016
MongoDB has introduced a new release of its NoSQL document database platform with key features that support additional data models for a "multimodel" approach, the combination of operational and analytical processing, elastic cross-region scaling, and tooling to simplify data management for customers.
Posted November 01, 2016
Rosslyn Data Technologies, formerly known as Rosslyn Analytics, has announced the immediate availability of RAPid One-Click Data Analytics, a new suite of self-service automated analytic (SaaS) solutions targeted at helping to reduce the time to visibility and insight.
Posted October 31, 2016
Oracle has introduced new versions of the Oracle Database Appliance, which the company says is designed to save time and money by simplifying deployment, maintenance, and support for database solutions. "Oracle Database Appliance offers a path to cloud, future-proofing your investment," said Karen Sigman, vice president, Platform Business Group, Oracle.
Posted October 26, 2016
Business intelligence (BI) and analytics are at the top of corporate agendas this year, and with good reason. The competitive environment is intense, and business leaders are demanding they have access to greater insights about their customers, markets, and internal operations to make better and faster decisions—often in real time. There have also been dramatic changes with BI and analytics tools and platforms. The three Cs—cloud, consolidation, and collaboration—are elevating BI and analytics to new heights within enterprises, where they are gaining newfound respect at the highest levels.
Posted October 24, 2016
Snowflake Computing is making its platform more accessible to users with Snowflake On Demand—a sign-up process for data users to get immediate insight from Snowflake's data warehouse. According to the vendor, with a virtual swipe of a credit card on Snowflake's website, data users can access the only data warehouse built for the cloud. They can store and analyze their data without relying on their own IT group to get up and running quickly.
Posted October 20, 2016
Managed application service provider TriCore Solutions has acquired Database Specialists, a database managed service company focused on support for Oracle database systems.
Posted October 19, 2016
For many years now, Cassandra has been renowned for its ability to handle massive scaling and global availability. Based on Amazon's Dynamo, Cassandra implements a masterless architecture which allows database transactions to continue even when the database is subjected to massive network or data center disruption. Even in the circumstance in which two geographically separate data centers are completely isolated through a network outage, a Cassandra database may continue to operate in both geographies, reconciling conflicting transactions—albeit possibly imperfectly—when the outage is resolved.
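The reconciliation step described above typically follows a last-write-wins rule: when isolated replicas are reunited, each column's value is resolved in favor of the highest write timestamp. The sketch below illustrates that rule in Python; the function and data names are invented for illustration, not Cassandra's actual internals.

```python
# Sketch of last-write-wins reconciliation, the rule Cassandra applies
# per column when isolated replicas are reunited after a partition.
# All names here are illustrative, not Cassandra's real internals.

def reconcile(replica_a, replica_b):
    """Merge two replicas' versions of a row: for each column,
    keep the value carrying the highest write timestamp."""
    merged = {}
    for col in set(replica_a) | set(replica_b):
        ts_a, val_a = replica_a.get(col, (-1, None))
        ts_b, val_b = replica_b.get(col, (-1, None))
        merged[col] = (ts_a, val_a) if ts_a >= ts_b else (ts_b, val_b)
    return merged

# Two data centers accepted conflicting updates while isolated:
dc_east = {"email": (100, "a@x.com"), "city": (105, "Boston")}
dc_west = {"email": (110, "b@x.com")}

merged = reconcile(dc_east, dc_west)
print(merged)  # email comes from dc_west (newer timestamp), city from dc_east
```

Note the "possibly imperfect" caveat from the article: if two writes carry close timestamps from skewed clocks, the "winner" may not be the update the business actually wanted.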
Posted October 07, 2016
Typically, most applications consist of both batch and online workloads. This is true even today, when most of our attention has turned to online and web-based interaction. Sure, online activities are the most obvious, in-your-face component of countless applications, but batch processing still drives many actions behind the scenes. This can include applying updates, processing reports, integrating input from multiple sources and locations, data extraction, database utility processing, and more.
Posted October 07, 2016
One symptom of an organization in the middle of a knowledge vacuum is SQL that includes what appears to be extravagant usage of the GROUP BY clause. Writing GROUP BYs here, there, and everywhere becomes a little SQL development dance step, a jitterbug to bypass the issue—moving but not really getting anywhere. Why do these kinds of circumstances exist? Well, maybe the only expert on the system involved has retired and no one else has picked up the torch, so no one is willing to touch the code.
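A common form of this dance step is bolting GROUP BY onto a query to collapse duplicate rows produced by a join fan-out, rather than stating the real question. The sketch below, runnable with Python's built-in sqlite3 module, uses an invented schema to contrast the crutch with the intended query.

```python
# Hypothetical illustration of the GROUP BY crutch described above:
# a join fans out rows, and GROUP BY is bolted on to collapse the
# duplicates instead of fixing the query. Schema and data are invented.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (10, 1), (11, 1), (12, 2);
""")

# The "dance step": the join multiplies rows, GROUP BY hides it.
crutch = con.execute("""
    SELECT c.name FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
""").fetchall()

# What was actually meant: customers that have at least one order.
intended = con.execute("""
    SELECT c.name FROM customers c
    WHERE EXISTS (SELECT 1 FROM orders o WHERE o.customer_id = c.id)
""").fetchall()

print(crutch, intended)  # same answer, but the second states the intent
```

Both queries return the same rows here, but the EXISTS form documents what the author meant, which is exactly the knowledge that evaporates when the resident expert retires.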
Posted October 07, 2016
Splice Machine, provider of an SQL RDBMS powered by Hadoop and Spark, now supports native PL/SQL. Announced at Strata + Hadoop World in NYC, the new capabilities are available through the Splice Machine Enterprise Edition.
Posted October 05, 2016
SQL Sentry, a provider of tools for monitoring, diagnosing, and optimizing SQL Server environments, has announced it is combining its tools into one platform, and changing its name to SentryOne.
Posted October 05, 2016
NoSQL and Hadoop—two foundations of the emerging agile data architecture—have been on the scene for several years now, and, industry observers say, adoption continues to accelerate—especially within mainstream enterprises that weren't necessarily at the cutting edge of technology in the past.
Posted October 04, 2016
Zaloni, the data lake company, unveiled new platform updates at Strata + Hadoop World 2016, including enhancements to its Bedrock Data Lake Management Platform and Mica self-service data preparation solution. Bedrock helps businesses govern and manage data across the enterprise, and Bedrock 4.2 adds new capabilities around data privacy, security, and data lifecycle management.
Posted October 03, 2016
At Strata + Hadoop World, Hortonworks showcased its technology solutions for streaming analytics, security, governance, and Apache Spark at scale.
Posted September 30, 2016
Cloudera has added new technology enhancements to its data management and analytics platform to make it easier for companies to take advantage of elastic, on-demand cloud infrastructure for business value from all their data. The move to the cloud has become a top priority for CIOs, said Charles Zedlewski, vice president of products at Cloudera, at Strata + Hadoop World 2016 in NYC.
Posted September 29, 2016
Capgemini, a provider of consulting, technology, and outsourcing services, and SAP SE are deepening their strategic partnership with the launch of a new joint initiative to help clients in the discrete manufacturing industries.
Posted September 28, 2016
Nimble Storage's Predictive AF-Series All Flash arrays are now certified by SAP as an enterprise storage solution for the SAP HANA platform. As a result, Nimble customers can leverage their existing hardware and infrastructure components for their SAP HANA-based environments, providing an additional choice for organizations working in heterogeneous environments. This certification adds to the SAP HANA certification Nimble previously obtained for its Adaptive Flash CS-Series arrays for use as enterprise storage solutions for the SAP HANA platform.
Posted September 28, 2016
SAP is releasing a next generation data warehouse solution for running a real-time digital enterprise on-premises and in the cloud. The new solution, SAP BW/4HANA, will be available on Amazon Web Services (AWS) and SAP HANA Enterprise Cloud (HEC).
Posted September 28, 2016
Data lakes are quickly transitioning from interesting idea to priority project. A recent study, "Data Lake Adoption and Maturity," from Unisphere Research showed that nearly half of respondents have an approved budget or have requested budget to launch a data lake project. What's driving this rapid rush to the lake?
Posted September 27, 2016
Database Brothers, Inc. (DBI) has released V6.3 of its flagship product, pureFeat Performance Management Suite for IBM DB2 LUW, which adds the Predictive Index Impact Analysis capability.
Posted September 27, 2016
At Strata + Hadoop World, Pentaho announced five new improvements, including SQL on Spark, to help enterprises overcome big data complexity, skills shortages, and integration challenges in complex enterprise environments. According to Donna Prlich, senior vice president of product management, product marketing, and solutions at Pentaho, the enhancements are part of Pentaho's mission to help make big data projects operational and deliver value by strengthening and supporting analytic data pipelines.
Posted September 26, 2016
SnapLogic is extending its pre-built intelligent connectors, called Snaps, to the Microsoft Azure Data Lake Store, providing fast, self-service data ingestion and transformation from virtually any source to Microsoft's cloud-based repository for big data analytics workloads. This latest integration between SnapLogic and Microsoft Azure helps enterprise customers gain new insights and unlock business value from their cloud-based big data initiatives, according to SnapLogic.
Posted September 21, 2016
In his keynote on Monday at Oracle OpenWorld 2016, Oracle CEO Mark Hurd showcased customer success stories and offered three new predictions for the future of cloud, which, he said, represents a generational shift in the IT market.
Posted September 21, 2016
"In this coming year, you'll see us aggressively moving into infrastructure-as-a-service," Larry Ellison, Oracle's chief technology officer and executive chairman of the board, said to kick off the company's OpenWorld conference Sunday night at the Moscone Center. In the first of his two scheduled keynote addresses, Ellison went on to outline a number of strategic announcements that aim to strengthen the company's offerings, as well as to help it compete with Amazon.com, one of its top challengers.
Posted September 19, 2016
erwin Inc. has acquired UK-based Corso Ltd, a provider of enterprise architecture solutions. erwin has also announced general availability of erwin CloudCore, an integrated cloud bundle consisting of erwin Data Modeler and Corso Agile EA.
Posted September 12, 2016
At its partner conference this week, Teradata is announcing three key new offerings to support customers choosing hybrid environments spanning cloud and on-premises, and relational and big data technologies.
Posted September 12, 2016
Conventional wisdom insists that IT will migrate to the cloud entirely at some point. But practical experience shows that enterprises that have invested in legacy architecture that still has many years of life left in it are not likely to rip and replace, at potentially astronomical costs. Instead, implementing a Bimodal IT approach supported by SDDC on integrated systems will allow companies to address scalability needs with agility, while also ensuring the mission-critical functions of their legacy systems are not compromised.
Posted September 12, 2016
Can Oracle and its partners keep up with the increasing demands of customers for real-time digital capabilities? Is the Oracle constellation of solutions—from data analytics to enterprise applications—ready for the burgeoning requirements of the Internet of Things (IoT) and data-driven businesses? For Oracle—along with its far-flung network of software vendors, integrators, and partners—times have never been so challenging.
Posted September 07, 2016
The Independent Oracle Users Group (IOUG) is excited to be joining the Oracle technology community in San Francisco once again at Oracle OpenWorld 2016, September 18-22. IOUG's 30,000+ member community is comprised of the top Oracle technology experts from around the globe, several of whom will be presenting sessions on hot topics like data intelligence, IoT, data security, and cloud migrations.
Posted September 07, 2016
Dell has completed the acquisition of EMC, creating a $74 billion company with a technology portfolio spanning hybrid cloud, software-defined data center, converged infrastructure, platform-as-a-service, data analytics, mobility, and cybersecurity. Describing itself as the world's largest privately-controlled technology company, the combined entity will be known as Dell Technologies.
Posted September 07, 2016
Programming is a literal sport. Code does exactly what it is configured to do, no compromises. When the definition of a task is fuzzy, it is up to the developer to do what they believe is correct. Does the code reflect what is desired? That answer is left open to interpretation. Sadly, developers may not have a clear understanding, and even the users requesting the solution may not be sure. The results can be very painful for an organization. Expectations may not align with the delivered solutions. Users will blame IT; IT will blame users.
Posted September 02, 2016
Perhaps the biggest and most overlooked challenge is how to create accurate test data. You're implementing a new system in order to deal with a massive amount of data, and perhaps your relational database can't handle the volume, so it's vitally important to properly test this new system and ensure that it doesn't fall over as soon as the data floods in.
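One practical approach to the test data problem is generating synthetic records in volume from a seeded random source, so load tests are both realistic and repeatable. The sketch below is a minimal illustration in Python; the field names, distributions, and weights are invented for the example, not drawn from any particular system.

```python
# A minimal sketch of generating synthetic test data in volume.
# All field names, distributions, and weights here are invented.
import csv
import io
import random
import string

def random_row(rng):
    """One fabricated customer record."""
    name = "".join(rng.choices(string.ascii_lowercase, k=8))
    return {
        "customer_id": rng.randrange(1, 1_000_000),
        "name": name,
        # Skewed values mimic real data better than uniform ones:
        # a few regions dominate, as they usually do in production.
        "region": rng.choices(["NA", "EU", "APAC"], weights=[5, 3, 1])[0],
        "spend": round(rng.lognormvariate(3, 1), 2),
    }

def write_rows(n, out, seed=42):
    rng = random.Random(seed)  # seeded, so every test run gets the same data
    writer = csv.DictWriter(
        out, fieldnames=["customer_id", "name", "region", "spend"]
    )
    writer.writeheader()
    for _ in range(n):
        writer.writerow(random_row(rng))

buf = io.StringIO()
write_rows(1000, buf)
print(buf.getvalue().count("\n"))  # 1001 lines: header + 1000 rows
```

Scaling the row count up (or streaming `write_rows` straight into a bulk loader) turns the same sketch into a volume test, while the fixed seed keeps failures reproducible.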
Posted August 23, 2016