Data Warehousing

Hardware and software that support the efficient consolidation of data from multiple sources into a data warehouse for reporting and analytics include ETL (extract, transform, load), EAI (enterprise application integration), CDC (change data capture), data replication, data deduplication, compression, big data technologies such as Hadoop and MapReduce, and data warehouse appliances.
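To make the first of these concrete, a toy ETL job can be sketched in a few lines: extract rows from a source system, transform them into a warehouse-friendly shape, and load them into a fact table. The database files, table names, and the dollars-to-cents transformation below are purely illustrative and are not drawn from any product covered on this page.

```python
# Minimal ETL sketch using SQLite as a stand-in for both the source
# system and the warehouse. All names here are hypothetical.
import sqlite3

def etl(source_db: str, warehouse_db: str) -> int:
    """Extract order records, normalize amounts to cents, load into a fact table.

    Returns the number of rows in the warehouse fact table after loading.
    """
    src = sqlite3.connect(source_db)
    wh = sqlite3.connect(warehouse_db)
    wh.execute(
        "CREATE TABLE IF NOT EXISTS fact_orders "
        "(order_id INTEGER PRIMARY KEY, amount_cents INTEGER)"
    )
    # Extract: pull raw rows from the operational source
    rows = src.execute("SELECT order_id, amount_dollars FROM orders").fetchall()
    # Transform: convert floating-point dollars to integer cents
    transformed = [(oid, round(amt * 100)) for oid, amt in rows]
    # Load: upsert into the warehouse fact table
    wh.executemany("INSERT OR REPLACE INTO fact_orders VALUES (?, ?)", transformed)
    wh.commit()
    loaded = wh.execute("SELECT COUNT(*) FROM fact_orders").fetchone()[0]
    src.close()
    wh.close()
    return loaded
```

Real ETL tools add scheduling, incremental extraction (often via CDC rather than full table scans), error handling, and parallel loading, but the extract/transform/load shape is the same.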



Data Warehousing Articles

Pivotal has introduced Pivotal HD 2.0 and Pivotal GemFire XD, which, along with the HAWQ query engine, form the foundation for the Business Data Lake architecture, a big data application framework for enterprises.

Posted March 17, 2014

Today, businesses depend more and more critically on their data infrastructure. If the underlying database systems are not available, manufacturing floors cannot operate, stock exchanges cannot trade, retail stores cannot sell, banks cannot serve customers, mobile phone users cannot place calls, stadiums cannot host sporting events, and gyms cannot verify their subscribers' identities. Here is a look at some of the trends and how they will affect data management professionals.

Posted March 17, 2014

Along with big data come some fundamental challenges. The biggest is that big data cannot be analyzed using standard analytical software. A new technology called "textual disambiguation" allows the context of raw unstructured text to be specifically determined.

Posted March 14, 2014

To simplify and strengthen enterprise application integration (EAI) capabilities, the latest release of Kourier Integrator, Kore Technologies' flagship data management product providing both extract, transform, and load (ETL) and EAI capabilities, offers simpler inbound integration of third-party systems and databases with U2 applications.

Posted March 12, 2014

SAP underscored the company's strategy, which focuses squarely on HANA and the cloud, in a webcast presented by Jonathan Becher, chief marketing officer of SAP, and Vishal Sikka, member of the Executive Board of SAP AG, Products & Innovation. The company rolled out new and enhanced offerings for the SAP HANA Cloud Platform, the SAP HANA Marketplace, new HANA pricing, and innovations on top of HANA, and also announced that HANA had broken the Guinness World Record for the largest data warehouse ever built, at 12.1PB.

Posted March 05, 2014

The latest release of Embarcadero's portfolio of database tools adds first-class support for Teradata in addition to updating support for the latest releases of the major RDBMSs. Overall, a key theme for the XE5 releases is an emphasis on scale, as big data, with big models and big applications, requires close collaboration across big teams, said Henry Olson, Embarcadero director of product management.

Posted February 26, 2014

In order to be effective, big data analytics must present a clear and consistent picture of what's happening in and around the enterprise. Does a new generation of databases and platforms offer the scalability and velocity required for cloud-based, big data-based applications—or will more traditional relational databases come roaring back for all levels of big data challenges?

Posted February 26, 2014

Changes and enhancements to solutions are hard, even under the best of circumstances. It is not unusual that, as operational changes roll out into production, the business intelligence area is left uninformed, suggesting that data warehouses and business intelligence deserve the lament of the old comedian Rodney Dangerfield, because they both "get no respect."

Posted January 07, 2014

A new rapid-deployment solution from SAP aims to address the issue of big data storage access and analysis which companies are grappling with as they attempt to balance what information needs to be accessible in real time and what can be stored for historical analysis. The SAP NetWeaver Business Warehouse (BW) Near-Line Storage rapid-deployment solution facilitates seamless data transfer between the business warehouse and the near-line storage that holds historical data. This, the company says, helps limit a business' burden around housing volumes of big data while also creating an online and accelerated retrieval system with the near-line storage.

Posted December 18, 2013

Cloudera and Informatica have partnered to create a new Data Warehouse Optimization (DWO) reference architecture specifically for Enterprise Data Hub deployments with the goal of helping reduce data warehouse costs and increasing productivity. Cloudera also announced the public beta offering of Cloudera Enterprise 5, which delivers the new Enterprise Data Hub and enhancements to related products.

Posted October 31, 2013

If you look at what is really going on in the big data space, it is all about inexpensive open source solutions that are facilitating the modernization of data centers and data warehouses, and at the center of this universe is Hadoop. In the evolution of the big data market, open source is playing a seminal role as the "disruptive technology" challenging the status quo. Additionally, organizations large and small are leveraging these solutions, often based on inexpensive hardware and memory platforms, in the cloud or on premises.

Posted October 24, 2013

At its Partners User Group Conference in Dallas, Teradata made a range of product and partner announcements, including the introduction of the Teradata Data Warehouse Appliance 2750, the availability of the Teradata Cloud, and new support for JavaScript Object Notation (JSON) data.

Posted October 22, 2013

Oracle CEO Larry Ellison made three key announcements in his opening keynote at Oracle OpenWorld, the company's annual conference for customers and partners in San Francisco. Ellison unveiled the Oracle Database In-Memory Option to Oracle Database 12c which he said speeds up query processing by "orders of magnitude," the M6 Big Memory Machine, and the new Oracle Database Backup Logging Recovery Appliance. Explaining Oracle's goals with the new in-memory option, Ellison noted that in the past there have been row-format databases, and column-format databases that are intended to speed up query processing. "We had a better idea. What if we store data in both formats simultaneously?"

Posted October 02, 2013

RainStor, a provider of an enterprise database for managing and analyzing all historical data, has introduced RainStor FastForward, a new product that enables customers to re-instate data from Teradata tape archives (also known as BAR for Backup, Archive and Restore) and move it to RainStor for query. The new RainStor FastForward product resolves a pressing challenge for Teradata customers that need to archive their Teradata warehouse data to offline tape, which can make it difficult to access and query that data when business and regulatory users require it, Deirdre Mahon, vice president of marketing, RainStor, explained in an interview.

Posted September 26, 2013

Attunity Ltd., a provider of information availability software solutions, has released a new version of its data replication software intended to address requirements for big data analytics, business intelligence, business continuity and disaster recovery initiatives. Addressing expanding use cases for the solution, Attunity Replicate 3.0, is engineered to provide secure data transfer over long distances such as wide area networks (WANs), the cloud and satellite connections, said Lawrence Schwartz, vice president of marketing at Attunity, in an interview.

Posted September 25, 2013

Attunity Ltd., a provider of information availability software solutions, has formed a new partnership with HP and announced the availability of its enhanced Attunity Click-2-Load solution for HP Vertica. The solution provides automation and optimized technologies that accelerate data loading to HP Vertica from disparate sources, and then maintains the changed data continuously and efficiently.

Posted August 13, 2013

A new Oracle In-Memory Application - Oracle In-Memory Logistics Command Center - has been launched that enables customers to improve scenario management in order to increase supply chain resiliency and agility, decrease costs and enhance service levels. With supply chains and their associated logistics networks becoming increasingly complex, strategic and operational, Oracle says scenario management is now central to creating an effective logistics network. Oracle In-Memory Logistics Command Center leverages the performance capabilities of Oracle Engineered Systems, including Oracle Exadata Database Machine and Oracle Exalogic Elastic Cloud, which can manage the large and complex data sets central to in-memory applications.

Posted August 07, 2013

Noetix Corp., a provider of business intelligence (BI) software and services for enterprise applications, has introduced Noetix Analytics 5.3, with improvements including new data marts, performance enhancements, and a streamlined upgrade process. Noetix Analytics is a packaged data warehouse solution designed to provide business users with strategic reporting for trending and analysis based on information from multiple data sources.

Posted August 07, 2013

The Oracle database provides intriguing possibilities for the storage, manipulation, and streaming of multimedia data in enterprise-class environments. However, knowledge of why and how the Oracle database can be used for multimedia applications is essential if one is to justify and maximize the ROI.

Posted July 17, 2013

These are heady times for data products vendors and their enterprise customers. When business leaders talk about success these days, they often are alluding to a new-found appreciation for their data environments. It can even be said that the tech vendors making the biggest difference in today's business world are no longer software companies at all; rather, they are "data" companies, with all that implies. Enterprises are reaching out to vendors for help in navigating the fast-moving, and often unforgiving, digital realm. The data vendors leading their respective markets are those that know how to provide the tools, techniques, and hand-holding needed to manage and sift through gigabytes, terabytes, and petabytes of data, extracting the tiny but valuable nuggets of information that guide business leaders as to what they should do next.

Posted June 19, 2013

There is an emerging field of companies looking to take on the challenges presented by the roiling tide of big data. While their visions vary, each has identified a market need that it believes its technology uniquely addresses. Here, DBTA highlights the approaches of 10 companies we think are worth watching.

Posted June 13, 2013

Embarcadero Technologies gives 97% of the world's top 2000 companies the tools needed to address the biggest challenges in data management. Facing significant growth in complexity, diversity and volume of enterprise data, companies worldwide are increasingly turning to data governance as a strategic solution. Helping our customers manage this complexity, and close the "governance gap" has been a major driver of innovation in our products.

Posted June 03, 2013

SAP AG has announced the SAP HANA Enterprise Cloud service. With the new offering, running mission-critical SAP ERP, SAP CRM, SAP NetWeaver Business Warehouse, and new applications powered by the SAP HANA in-memory platform will be possible as a managed cloud service at elastic petabyte scale.

Posted May 29, 2013

It seems that juggling is the most useful of all skills when embarking on a data warehousing project. During the discovery and analysis phase, the workload grows insanely large, like some mutant science fiction monster. Pressures to deliver can encourage rampant corner-cutting to move quickly, while the need to provide value urges caution in order not to throw out the proverbial baby with the bath water as the project speeds along. Change data capture is one area that is a glaring example of the necessary juggling and balancing.

Posted May 22, 2013

Key findings from a new study, "Big Data Opportunities," will be presented at Big Data Boot Camp at the Hilton New York. Big Data Boot Camp will kick off at 9 am on Tuesday, May 21, with a keynote from John O'Brien, founder and principal of Radiant Advisors, on the dynamics and current issues being faced in today's big data analytic implementations. Directly after the opening address, David Jonker, senior director of Big Data Marketing, SAP, will showcase the results of the new big data survey, which revealed a variety of practical approaches that organizations are adopting to manage and capitalize on big data. The study was conducted by Unisphere Research, a division of Information Today, Inc., and sponsored by SAP.

Posted May 16, 2013

The conference agenda as well as the list of speakers is now available for DBTA's Big Data Boot Camp, a deep dive designed to bring together thought leaders and practitioners who will provide insight on how to collect, manage, and act on big data. The conference will be held May 21-22 at the Hilton New York. SAP is the diamond sponsor, and Objectivity and MarkLogic are platinum sponsors of the two-day event.

Posted April 25, 2013

Data keeps growing, systems and servers keep sprawling, and users keep clamoring for more real-time access. The result of all this frenzy of activity is pressure for faster, more effective data integration that can deliver more expansive views of information, while still maintaining quality and integrity. Enterprise data and IT managers are responding in a variety of ways, looking to initiatives such as enterprise mashups, automation, virtualization, and cloud to pursue new paths to data integration. In the process, they are moving beyond the traditional means of integration they have relied on for years to pull data together.

Posted April 03, 2013

Attunity Ltd., a provider of information availability software solutions, released Attunity Replicate 2.1, a high-performance data delivery solution that adds improvements for data warehousing. Attunity Replicate's new performance enhancements support many data warehouses, including Amazon Redshift, EMC Greenplum and Teradata.

Posted March 13, 2013

SAP AG has introduced a new version of its Sybase IQ disk-based column store analytics server. The overriding theme of this new release, which will be generally available later in the first quarter, "is positioning IQ 16 to go from terabytes to petabytes," Dan Lahl, senior director of product marketing at SAP, tells 5 Minute Briefing. To accomplish this, IQ 16 provides enhancements in three critical areas.

Posted February 27, 2013

Hortonworks, a leading contributor to Apache Hadoop, has released Hortonworks Sandbox, a learning environment and on-ramp for anyone interested in learning, evaluating, or using Apache Hadoop in the enterprise. The tool seeks to bridge the gap between wanting to learn Hadoop and the complexity of setting up a cluster, by providing an integrated environment with demos, videos, and tutorials.

Posted February 27, 2013

MarkLogic said it plans to deliver an enterprise-grade application and analytics software solution based on the new Intel Distribution for Apache Hadoop software. The Intel Distribution will be combined with the MarkLogic Enterprise NoSQL database to support real-time transactional and analytic applications.

Posted February 26, 2013

Attunity is developing a solution for fast data loading into Amazon Redshift, AWS's new data warehouse in the cloud. The Attunity solution is expected to be available for customer preview in March 2013.

Posted February 21, 2013
