Trends and Applications

Customers across a wide variety of industries know that analyzing big data is critical to sustaining competitiveness. More importantly, they are quickly realizing that traditional architectures won't help them, especially when it comes to integrating data - what we commonly know as ETL.

Posted June 03, 2013

Terracotta, Inc. is the leading provider of game-changing, high-volume, high-value big data management solutions for global enterprises. Its flagship product, BigMemory, is a big data in-memory solution that delivers performance at any scale. Terracotta's other award-winning data management solutions include Ehcache—the de facto caching standard for enterprise Java—and Quartz—a leading job scheduler.
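
For readers who haven't used it, below is a minimal sketch of the classic Ehcache 2.x API. The cache name and quote values are hypothetical, and a production setup would configure sizing (or BigMemory's off-heap storage) in ehcache.xml rather than relying on the failsafe defaults.

```java
import net.sf.ehcache.Cache;
import net.sf.ehcache.CacheManager;
import net.sf.ehcache.Element;

public class EhcacheSketch {
    public static void main(String[] args) {
        // Create a manager from the built-in failsafe configuration.
        CacheManager manager = CacheManager.create();
        manager.addCache("quotes"); // hypothetical cache name
        Cache quotes = manager.getCache("quotes");

        // Cache a value, then read it back.
        quotes.put(new Element("ORCL", 33.81));
        Element hit = quotes.get("ORCL");
        System.out.println(hit != null ? hit.getObjectValue() : "cache miss");

        manager.shutdown();
    }
}
```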

Posted June 03, 2013

It's become clear to me that the Business Intelligence (BI) market is undergoing a period of significant change. Organizations now realize the potential and value of empowering people—throughout the enterprise—with the ability to take better, faster, fact-based action.

Posted June 03, 2013

There is an emerging field of companies looking to take on the challenges presented by the roiling tide of big data. While their visions vary, each has identified a market need that it believes its technology uniquely addresses. Here, DBTA highlights the approaches of 10 companies we think are worth watching.

Posted June 03, 2013

Managing Big Unstructured Data - How to Be Your Own Data Scientist

Posted May 23, 2013

How to Gain Value from Big Data Projects Is Explored at Big Data Boot Camp

Posted May 23, 2013

DBTA's Big Data Boot Camp provided attendees with an immersion experience in the world of big data. Presentations and panel discussions with data experts covered the value provided by Hadoop and related solutions, how it all fits into the overall enterprise data management picture, real-world use cases in which big data technologies are now being deployed, how to get started, the legal implications of big data management that organizations need to be aware of before they initiate a big data project, and how to successfully engage with customers through social media. Ten salient points emerged from the two-day conference, which wrapped up at the Hilton New York on Wednesday.

Posted May 23, 2013

OpenText, a provider of enterprise information management (EIM) software, announced the expansion of the OpenText ECM Suite for SAP Solutions in support of SAP's latest technologies, including the SAP HANA platform, and cloud and mobility solutions. This support builds on a partnership that spans more than two decades and is growing every year, says Patrick Barnert, senior vice president, Partners and Alliances, OpenText. OpenText, he adds, is the first SAP ISV partner to have its products fully tested and confirmed by SAP to be integrated with SAP Business Suite powered by SAP HANA.

Posted May 23, 2013

Talend, a global open source software provider, has released version 5.3 of its integration platform, scaling the integration of data, applications and business processes of any complexity. Talend version 5.3 reduces the skills and development costs needed to leverage big data and Hadoop by enabling integration developers without specialized expertise to develop on big data platforms. "The big challenge that organizations are facing today is not about getting the data; it's not about the platform to explore it; it's really about people who are operating the platform. There is a big shortage of developers with big data and Hadoop skills, and a big shortage of data scientists," Yves de Montcheuil, vice president of marketing at Talend, tells DBTA.

Posted May 23, 2013

NuoDB, Inc., a provider of a cloud data management system offering SQL compliance and guaranteed ACID transactions, has introduced the NuoDB Starlings Release 1.1. Following on from its 1.0 release in January, Starlings Release 1.1 focuses on overall usability in three key areas, Seth Proctor, NuoDB chief architect, tells DBTA: greater Microsoft Windows support, general performance and stability, and an improved development and management experience in the web console.

Posted May 23, 2013

A new release of Oracle Secure Global Desktop, part of Oracle's Desktop Virtualization portfolio, extends secure access to cloud-hosted and on-premises enterprise applications and desktops from Apple iPad and iPad mini tablets, without the need for a VPN client. Now, through support of the HTML5 standard, Oracle Secure Global Desktop 5.0 allows tablet users to access enterprise applications with just a web browser, so they can use their own devices for work. The new release provides tablet users, in addition to PC, Mac and desktop users, certified access to Oracle Exalogic Elastic Cloud and web-based applications such as Oracle E-Business Suite, Oracle Siebel CRM, Oracle Primavera, and many others.

Posted May 23, 2013

NoSQL databases are becoming increasingly popular for analyzing big data. There are very few NoSQL solutions, however, that provide the combination of scalability, reliability and data consistency required in a mission-critical application. As the open source implementation of Google's BigTable architecture, HBase is a NoSQL database that integrates directly with Hadoop and meets these requirements for a mission-critical database.
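
As a concrete illustration of the HBase programming model (rows keyed by byte arrays, with values stored under column families), here is a minimal put-and-get sketch against the HBase Java client. The table name "metrics" and column family "d" are hypothetical and assumed to already exist on the cluster.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create(); // reads hbase-site.xml
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table table = conn.getTable(TableName.valueOf("metrics"))) {
            // Write one cell: row key, column family "d", qualifier "temp".
            Put put = new Put(Bytes.toBytes("sensor42#2013-05-09"));
            put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("temp"),
                          Bytes.toBytes("71.3"));
            table.put(put);

            // Read the cell back by row key.
            Result row = table.get(new Get(Bytes.toBytes("sensor42#2013-05-09")));
            System.out.println(Bytes.toString(
                row.getValue(Bytes.toBytes("d"), Bytes.toBytes("temp"))));
        }
    }
}
```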

Posted May 09, 2013

What Does It Mean to Be a "Real-Time Enterprise"?

Posted May 09, 2013

As machines increasingly are fitted with internet and other network access, enterprises will be able to capture, and will increasingly be expected to respond to, more customer data than ever before. Machine-to-machine (M2M) network connections—the so-called "Internet of Things"—are positioned to become the next source of major competitive advantage. Whatever you call it, M2M is turning out to be the poster child for big data's "Three Vs": Volume, Velocity and Variety. What M2M data requires is a fourth "V" (Visualization) to convert its big data into value by giving users the ability to identify data patterns through real-time analytics.

Posted April 25, 2013

The conference agenda as well as the list of speakers is now available for DBTA's Big Data Boot Camp, a deep dive designed to bring together thought leaders and practitioners who will provide insight on how to collect, manage, and act on big data. The conference will be held May 21-22 at the Hilton New York. SAP is the diamond sponsor, and Objectivity and MarkLogic are platinum sponsors of the two-day event.

Posted April 25, 2013

IBM announced it has enhanced the Tivoli System Automation family with a product focused on helping smaller zEnterprise customers. The IBM Automation Control for z/OS (IACz) product is targeted at single System z customers looking to move from manual scripting to policy-based automation.

Posted April 25, 2013

MarkLogic Corporation, the provider of an enterprise NoSQL database platform, announced that it has closed a $25 million round of growth capital led by Sequoia Capital and Tenaya Capital, with participation from Northgate Capital. MarkLogic CEO Gary Bloom also made a personal investment in this financing round. With capital to fuel sales and marketing, MarkLogic seeks to go after the broader market of enterprise class customers while also targeting several key areas for feature expansion, Bloom told DBTA.

Posted April 25, 2013

Confio Software released version 8.3 of its Ignite database performance monitoring software at COLLABORATE 13 this week. Ignite 8.3 enhancements were developed specifically to address the needs of DBAs with very large database deployments spread out geographically as well as enterprise-level requirements for security and compliance, Don Bergal, chief marketing officer of Confio, tells DBTA.

Posted April 25, 2013

Big data has unceremoniously ended the era of the "all-purpose database." The days of sticking uniform data into a single database and running all your business applications off it are gone. Business data today comes in a variety of formats, from countless sources, in huge volumes and at fantastic speeds. Some data is incredibly valuable the instant it arrives; other data is valuable only when combined with large amounts of additional data and analyzed over time.

Posted April 10, 2013

In business, the rear view mirror is clearer than the windshield, said the sage of Omaha. And that is particularly true of business intelligence, composed almost entirely of such retrospectives. Consider this: Business intelligence proffers neatly organized historical data as a potential source of hindsight. Of course, there are also the dashboards of happenings in the "now," but precious little in terms of prompts to timely action. The culprit is often the time required to traverse the path from data to insight to intelligence to ideas to implementation to results. It's nowhere near quick enough, especially for businesses like banking, telecommunications and healthcare that set great store by the time value of information and the money value of time.

Posted April 10, 2013

Dell Software is rolling out the latest version of its Kitenga Analytics solution, which extends the analysis of structured, semi-structured and unstructured data stored in Hadoop. Kitenga was acquired by Dell along with Quest Software in September 2012.

Posted March 27, 2013

10gen, the MongoDB company, has released MongoDB 2.4, featuring hash-based sharding, capped arrays, text search, and geospatial enhancements. 10gen has also introduced MongoDB Enterprise as part of a new MongoDB Enterprise subscription level, with new monitoring and security capabilities including Kerberos authentication and role-based privileges.
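
To make two of the headline features concrete, here is a hedged sketch using the modern MongoDB Java driver. The database, collection, and field names are hypothetical, and the sharding commands assume the client is connected to a mongos in a sharded cluster.

```java
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.bson.Document;

public class Mongo24FeaturesSketch {
    public static void main(String[] args) {
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            // Hashed sharding: spread writes evenly by hashing _id.
            client.getDatabase("admin").runCommand(
                new Document("enableSharding", "events"));
            client.getDatabase("admin").runCommand(
                new Document("shardCollection", "events.clicks")
                    .append("key", new Document("_id", "hashed")));

            // Text search: index a string field, then query it.
            MongoCollection<Document> clicks =
                client.getDatabase("events").getCollection("clicks");
            clicks.createIndex(new Document("page", "text"));
            for (Document d : clicks.find(
                    new Document("$text", new Document("$search", "checkout")))) {
                System.out.println(d.toJson());
            }
        }
    }
}
```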

Posted March 27, 2013

The Independent Oracle Users Group (IOUG) will celebrate its 20th anniversary at COLLABORATE 13, a conference on Oracle technology presented jointly by the IOUG, the OAUG (Oracle Applications Users Group) and the Quest International Users Group. The event will be held April 7 to 11 at the Colorado Convention Center in Denver. As part of the conference, the IOUG will host the COLLABORATE 13-IOUG Forum, with nearly 1,000 sessions providing user-driven content. The theme of this year's COLLABORATE 13-IOUG Forum is "Elevate - take control of your career and elevate your Oracle ecosystem knowledge and expertise," says IOUG president John Matelski.

Posted March 27, 2013

IBM announced that all of its cloud services and software will be based on an open cloud architecture. As a first step, IBM unveiled a new private cloud offering based on the open source OpenStack software that it says speeds and simplifies managing an enterprise-grade cloud. The offering provides businesses with a core set of open source-based technologies to build enterprise-class cloud services that can be ported across hybrid cloud environments. The IBM announcement "goes a long way" to position OpenStack against other, more proprietary solutions, Jim Curry, senior vice president and general manager of Rackspace's Private Cloud business, tells DBTA.

Posted March 27, 2013

At the recent Strata conference, CitusDB showcased the latest release of its scalable analytics database. According to the vendor, CitusDB 2.0 brings together the performance of PostgreSQL and the scalability of Apache Hadoop, enabling real-time queries on data that's already in Hadoop. This new functionality is made possible by CitusDB's distributed query planner and PostgreSQL's foreign data wrappers.

Posted March 27, 2013

Two big questions are on the minds of data professionals these days: How are increasing complexity and the inevitable onslaught of big data shaping the future of database administrators and data architects? How will our roles change? In the interest of studying the evolving landscape of data, the Independent Oracle Users Group (IOUG) took the pulse of the community. The Big Data Skills for Success study polled numerous individuals in the IOUG Oracle technology community to identify just how the responsibilities of handling data are changing and what the future of these roles looks like.

Posted March 14, 2013

Special Report: Gaining Maximum Advantage with MultiValue Technologies

Posted March 14, 2013

Data keeps growing, systems and servers keep sprawling, and users keep clamoring for more real-time access. The result of all this frenzy of activity is pressure for faster, more effective data integration that can deliver more expansive views of information, while still maintaining quality and integrity. Enterprise data and IT managers are responding in a variety of ways, looking to initiatives such as enterprise mashups, automation, virtualization, and cloud to pursue new paths to data integration. In the process, they are moving beyond the traditional means of integration they have relied on for years to pull data together.

Posted March 14, 2013

Databases are restricted by reliance on disk-based storage, a technology that has been in place for several decades. Even with the addition of memory caches and solid state drives, the model of relying on repeated access to information storage devices remains a hindrance in capitalizing on today's "big data," according to a new survey of 323 data managers and professionals who are part of the Independent Oracle Users Group (IOUG). The survey was underwritten by SAP Corp. and conducted by Unisphere Research, a division of Information Today, Inc.

Posted March 14, 2013

A new survey of nearly 200 data managers and professionals, who are part of the Independent Oracle Users Group (IOUG), looks at the role of the data scientist - a data professional who can aggregate data from internal enterprise data stores as well as outside sources to provide the forecasts and insight required to help lead the organization into the future. The research was conducted by Unisphere Research, a division of Information Today, Inc.

Posted February 27, 2013

The expansion of structured and unstructured data storage seems never-ending. At the same time, database administrators' need to reduce storage consumption is accelerating as its cost becomes more visible. Today, however, there are data optimization technologies available that can help cope with this continued data growth.

Posted February 27, 2013

SAP AG has introduced a new version of its Sybase IQ disk-based column store analytics server. The overriding theme of this new release, which will be generally available later in the first quarter, "is positioning IQ 16 to go from terabytes to petabytes," Dan Lahl, senior director of product marketing at SAP, tells 5 Minute Briefing. To accomplish this, IQ 16 provides enhancements in three critical areas.

Posted February 27, 2013

DataCore Software, a provider of storage virtualization software, has made enhancements to its SANsymphony-V Storage Hypervisor. The new capabilities are intended to support customers who are facing high data growth, as well as the need to enable faster response times and provide continuous availability for business-critical applications.

Posted February 27, 2013

HP announced two new software-as-a-service (SaaS) solutions intended to speed application delivery and improve visibility, collaboration and agility across often siloed or geographically dispersed application development and operations teams. HP Agile Manager accelerates application time to market with an intuitive, web-based experience that offers visibility for planning, executing and tracking Agile development projects; and HP Performance Anywhere helps resolve application performance issues before they impact business services by providing visibility and predictive analytics.

Posted February 27, 2013

Oracle president Mark Hurd and Oracle executive vice president of product development Thomas Kurian recently hosted a conference call to provide an update on Oracle's cloud strategy and a recap of product-related developments. Oracle is trying to do two things for customers - simplify their IT and power their innovation, said Hurd.

Posted February 27, 2013

Hortonworks, a leading contributor to Apache Hadoop, has released Hortonworks Sandbox, a learning environment and on-ramp for anyone interested in learning, evaluating or using Apache Hadoop in the enterprise. The tool seeks to bridge the gap between wanting to learn Hadoop and the complexity of setting up a cluster, offering an integrated environment with demos, videos and tutorials.

Posted February 27, 2013

Having vast amounts of data at hand doesn't necessarily help executives make better decisions. In fact, without a simple way to access and analyze the astronomical amounts of available information, it is easy to become frozen with indecision, knowing the answers are likely in the data, but unsure how to find them. With so many companies proclaiming to offer salvation from all data issues, one of the most important factors to consider when selecting a solution is ease of use. An intuitive interface based on how people already operate in the real world is the key to adoption and usage throughout an organization.

Posted February 13, 2013

Delivering Information Faster: In-Memory Technology Reboots the Big Data Analytics World

Posted February 13, 2013

A profound shift is occurring in where data lives. Thanks to skyrocketing demand for real-time access to huge volumes of data—big data—technology architects are increasingly moving data out of slow, disk-bound legacy databases and into large, distributed stores of ultra-fast machine memory. The plummeting price of RAM, along with advanced solutions for managing and monitoring distributed in-memory data, means there are no longer good excuses to make customers, colleagues, and partners wait the seconds—or sometimes hours—it can take your applications to get data out of disk-bound databases. With in-memory, microseconds are the new seconds.

Posted February 13, 2013

The explosion of big data has presented many challenges for today's database administrators (DBAs), who are responsible for managing far more data than ever before. And with more programs being developed and tested, more tools are needed to optimize data access and efficiency. Using techniques such as DB2's Multi-Row Fetch (MRF), DBAs are able to cut down on CPU time - and improve application efficiency. MRF was introduced in DB2 version 8 in 2004. Stated simply, it is the ability for DB2 to send multiple rows back to a requesting program at once, rather than one row at a time.
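
The native MRF syntax lives in embedded SQL (DECLARE ... WITH ROWSET POSITIONING, then FETCH NEXT ROWSET ... FOR n ROWS). As a rough Java analogue, the sketch below uses the JDBC fetch size hint, which DB2 drivers can honor with block or multi-row fetch under the covers; the connection details and EMP table are hypothetical.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class BlockFetchSketch {
    public static void main(String[] args) throws SQLException {
        // Hypothetical connection details and table.
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:db2://dbhost:50000/SAMPLE", "user", "password");
             Statement stmt = conn.createStatement()) {
            // Hint: retrieve rows in blocks of 100 per network round trip,
            // the same idea MRF exposes to embedded-SQL programs.
            stmt.setFetchSize(100);
            try (ResultSet rs = stmt.executeQuery(
                     "SELECT EMPNO, LASTNAME FROM EMP")) {
                while (rs.next()) {
                    System.out.println(rs.getString("EMPNO") + " "
                                       + rs.getString("LASTNAME"));
                }
            }
        }
    }
}
```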

Posted January 24, 2013

Databases are hampered by a reliance on disk-based storage, a technology that has been in place for more than two decades. Even with the addition of memory caches and solid state drives, the model of relying on repeated access to the permanent information storage devices is still a bottleneck in capitalizing on today's "big data," according to a new survey of 323 data managers and professionals who are part of the IOUG. Nearly 75% of respondents believe that in-memory technology is important to enabling their organization to remain competitive in the future. Yet, almost as many also indicate they lack the in-memory skills to deliver even current business requirements. The research results are detailed in a new report, titled "Accelerating Enterprise Insights: 2013 IOUG In-Memory Strategies Survey."

Posted January 24, 2013

EMC Greenplum has qualified Attunity RepliWeb for Enterprise File Replication (EFR) and Attunity Managed File Transfer (MFT) with EMC Greenplum Hadoop (HD). Attunity RepliWeb for EFR and Attunity MFT are high-performance, easy-to-use solutions for automating, managing and accelerating the process of making data available for big data analytics with Hadoop. According to Attunity, the products, launched earlier this year, are the first and only solutions currently qualified by EMC for Greenplum HD. "Greenplum has come into the marketplace by storm and has had a strong vision of being data-independent or data-agnostic. They want to make sure that their analytic platform can handle both structured and unstructured data and this aligns very well with Attunity's mission statement of any data, any time, anywhere," Matt Benati, vice president of Global Marketing at Attunity, tells DBTA.

Posted January 24, 2013

Enterprise NoSQL database provider MarkLogic Corporation has partnered with business intelligence vendor Tableau Software to offer analytics and visualization over unstructured big data. The partnership allows business users to leverage Tableau's business intelligence and reporting solutions to access disparate sets of structured and unstructured data housed in a MarkLogic NoSQL database. "Not only can you build rich, sophisticated applications, but you can also make use of that data where it is, and have business users connect to that data, visualize it, and do analytics over it, without involving the development center," Stephen Buxton, MarkLogic's director of product management, tells DBTA.

Posted January 24, 2013

Oracle has merged the core capabilities of the Oracle Audit Vault and Oracle Database Firewall products, creating the new Oracle Audit Vault and Database Firewall product which expands protection beyond Oracle and third-party databases with support for auditing the operating system, directories and custom sources. "It is really one single, streamlined solution to do both security and compliance for Oracle and non-Oracle databases, and extending beyond databases, to operating systems, file systems, and directories - essentially the structure surrounding your database," notes Vipin Samar, vice president, Database Security, Oracle. "Data governance is increasingly important in many organizations and, as we know from the IOUG survey that we did earlier this year, we have very few organizations that are monitoring sensitive data access," adds Roxana Bradescu, director of product marketing, Data Security, Oracle.

Posted January 24, 2013

The recent explosion of digital data has affected businesses of all sizes and has opened opportunities for companies that adopt machine learning technology - including predictive analytics - to mine intelligence from data assets. Predictive analytics has the potential to transform traditional small to medium businesses (SMBs), which have the same desire to take better advantage of their data assets as larger organizations - but the process with which they can glean strategic value from that data is significantly different.

Posted January 03, 2013

The emergence of web-scale apps has put us in the midst of a database crisis. Mobile apps, cloud-based SaaS/PaaS architectures and the distributed nature of the web have forced the software industry to make difficult compromises on how it collects, processes and stores data. While traditional databases provide the power and simplicity of SQL and the reliability of ACID, they don't scale without herculean workarounds. Newer NoSQL solutions come close but don't quite make the last mile. They're designed to scale elastically, even on commodity hardware, but force developers to program powerful querying features into their applications and throw away years of investment in SQL skills, tools and languages. To give developers a truly modern solution for this century, we need to rethink how we collect, process and store data. It's time for a revolution.

Posted January 03, 2013

As organizations strive to deliver always-on access to applications users, it can be challenging to provide authorized access while simultaneously protecting against cyber-attacks. To address these challenges, two novel solutions combine the power of application delivery controllers (ADCs) with web access management (WAM) and database security technologies.

Posted January 03, 2013

For many years, enterprise data center managers have struggled to implement disaster recovery strategies that meet their RTO/RPOs and business continuity objectives while staying within budget. While the challenges of moving, managing, and storing massive data volumes for effective disaster protection have not changed, exponential data growth and the advent of big data technologies have made disaster recovery protection more difficult than ever before.

Posted December 19, 2012

Progress Software, a provider of software for connecting business applications to data and services, has released DataDirect Connect and DataDirect Connect XE for ODBC 7.1. The releases provide fast, reliable, secure and scalable connectivity for Apache Hive, as well as expanded support for cloud databases including Microsoft SQL Azure. DataDirect Connect and DataDirect Connect XE for ODBC 7.1 also offer new features for IBM DB2 10.1, IBM DB2 pureScale 9.8, Teradata 14, and Microsoft SQL Server 2012.

Posted December 19, 2012

Despite the rise of big data, data warehousing is far from dead. While traditional, static data warehouses may have indeed seen their day, an agile data warehouse — one that can map to the needs of the business and change as the business changes — is quickly on the rise. Many of the conversations today around big data revolve around volume and while that is certainly valid, the issue is also about understanding data in context to make valuable business decisions. Do you really understand why a consumer takes action to buy? How do their purchases relate? When will they do it again? Big data is limited when it comes to answering these questions. An agile approach — one that gives even big data a life beyond its initial purpose — is the value data warehousing can bring to bear and is critical to long-term business success.

Posted December 19, 2012
